From solipsis at pitrou.net Tue Feb 1 05:04:48 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 01 Feb 2011 05:04:48 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88284): sum=0 Message-ID: py3k results for svn r88284 (hg cset fad86c3fe4d3) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflog_4TJpc', '-x'] From python-checkins at python.org Tue Feb 1 22:21:22 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 01 Feb 2011 22:21:22 +0100 Subject: [Python-checkins] devguide: Add a note about how to handle modules that seem to have no coverage for global Message-ID: brett.cannon pushed e5053ce6c9e8 to devguide: http://hg.python.org/devguide/rev/e5053ce6c9e8 changeset: 224:e5053ce6c9e8 user: Brett Cannon date: Sun Jan 30 18:01:10 2011 -0800 summary: Add a note about how to handle modules that seem to have no coverage for global statements thanks to being imported before coverage tools can begin tracing. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -9,7 +9,7 @@ before the practice came into effect. This has left chunks of the stdlib untested which is not a desirable situation to be in. -A good, easy way to become acquianted with Python's code and to help out is to +A good, easy way to become acquainted with Python's code and to help out is to help increase the test coverage for Python's stdlib. Ideally we would like to have 100% coverage, but any increase is a good one. Do realize, though, that getting 100% coverage is not always possible. There could be platform-specific @@ -38,7 +38,7 @@ have an accurate, up-to-date notion of what modules need the most work. Do make sure, though, that for any module you do decide to work on that you run -coverage for just that module. This wll make sure you know how good the +coverage for just that module. This will make sure you know how good the explicit coverage of the module is from its own set of tests instead of from implicit testing by other code that happens to use the module. @@ -55,6 +55,16 @@ robust in the face of coverage measuring/having a trace function set; see http://bugs.python.org/issue10992 for a patch +It should be noted that a quirk of running coverage over Python's own stdlib is +that certain modules are imported as part of interpreter startup. Those modules +required by Python itself will not be viewed as executed by the coverage tools +and thus look like they have very poor coverage (e.g., the :py:mod:`stats` +module). In these instances the module will appear to not have any coverage of +global statements but will have proper coverage of local statements (e.g., +function definitions will be not be traced, but the function bodies will). +Calculating the coverage of modules in this situation will simply require +manually looking at what local statements were not executed. + Using coverage.py ----------------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 1 22:21:24 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 01 Feb 2011 22:21:24 +0100 Subject: [Python-checkins] devguide: Fix the coverage.py statement for generating HTML reports. 
Message-ID: brett.cannon pushed b0bba5777e1e to devguide: http://hg.python.org/devguide/rev/b0bba5777e1e changeset: 225:b0bba5777e1e tag: tip user: Brett Cannon date: Mon Jan 31 21:08:53 2011 -0800 summary: Fix the coverage.py statement for generating HTML reports. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -120,12 +120,13 @@ But one of the strengths of coverage.py is its HTML-based reports which lets you visually see what lines of code were not tested:: - ./python -m coverage -d coverage_html html -i --omit="*/test/*,*/tests/*" + ./python -m coverage html -i --omit="*/test/*,*/tests/*" -This will generate an HTML report which ignores any errors that may arise and -ignores test modules. You can then open the ``coverage_html/index.html`` file +This will generate an HTML report in a directory named ``htmlcov`` which +ignores any errors that may arise and +ignores test modules. You can then open the ``htmlcov/index.html`` file in a web browser to view the coverage results along with pages that visibly -show what lines of code were not executed. +show what lines of code were or were not executed. .. _branch_coverage: -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 1 22:31:22 2011 From: python-checkins at python.org (eric.smith) Date: Tue, 1 Feb 2011 22:31:22 +0100 (CET) Subject: [Python-checkins] r88299 - python/branches/py3k/Doc/library/concurrent.futures.rst Message-ID: <20110201213122.D940CEE9AA@mail.python.org> Author: eric.smith Date: Tue Feb 1 22:31:22 2011 New Revision: 88299 Log: Wording fix. Modified: python/branches/py3k/Doc/library/concurrent.futures.rst Modified: python/branches/py3k/Doc/library/concurrent.futures.rst ============================================================================== --- python/branches/py3k/Doc/library/concurrent.futures.rst (original) +++ python/branches/py3k/Doc/library/concurrent.futures.rst Tue Feb 1 22:31:22 2011 @@ -221,7 +221,7 @@ .. method:: cancel() Attempt to cancel the call. If the call is currently being executed and - cannot be cancelled and the method will return ``False``, otherwise the + cannot be cancelled then the method will return ``False``, otherwise the call will be cancelled and the method will return ``True``. .. method:: cancelled() From python-checkins at python.org Tue Feb 1 22:34:25 2011 From: python-checkins at python.org (eric.smith) Date: Tue, 1 Feb 2011 22:34:25 +0100 (CET) Subject: [Python-checkins] r88300 - python/branches/release31-maint Message-ID: <20110201213425.A30C4EE9A7@mail.python.org> Author: eric.smith Date: Tue Feb 1 22:34:25 2011 New Revision: 88300 Log: Blocked revisions 88299 via svnmerge ........ r88299 | eric.smith | 2011-02-01 16:31:22 -0500 (Tue, 01 Feb 2011) | 1 line Wording fix. ........ 
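The r88299 wording fix above describes Future.cancel(): it returns False when the call is already executing and cannot be cancelled, and True when a still-pending call is cancelled. A minimal sketch of that behaviour (not from any checkin; names are illustrative):

    import time
    from concurrent.futures import ThreadPoolExecutor

    def slow():
        time.sleep(1)

    with ThreadPoolExecutor(max_workers=1) as pool:
        running = pool.submit(slow)   # the single worker starts this one immediately
        queued = pool.submit(slow)    # this one waits behind it in the queue
        time.sleep(0.1)               # give the first call time to start executing
        print(running.cancel())       # False: already running, cannot be cancelled
        print(queued.cancel())        # True: still pending, so it is cancelled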
Modified: python/branches/release31-maint/ (props changed) From python-checkins at python.org Wed Feb 2 00:43:46 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 2 Feb 2011 00:43:46 +0100 (CET) Subject: [Python-checkins] r88301 - python/branches/py3k/PC Message-ID: <20110201234346.5D817EE9BF@mail.python.org> Author: antoine.pitrou Date: Wed Feb 2 00:43:46 2011 New Revision: 88301 Log: Add a svn:ignore for PC/python3dll.obj Modified: python/branches/py3k/PC/ (props changed) From python-checkins at python.org Wed Feb 2 00:54:43 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 2 Feb 2011 00:54:43 +0100 (CET) Subject: [Python-checkins] r88302 - python/branches/py3k/Lib/decimal.py Message-ID: <20110201235443.D00C7EE98D@mail.python.org> Author: raymond.hettinger Date: Wed Feb 2 00:54:43 2011 New Revision: 88302 Log: Get command-line doctest of Lib/decimal.py to work again. If tested as '__main__' instead 'decimal', the tracebacks would abbreviate 'decimal.Inexact' as 'Inexact', breaking the doctests. (Reviewed by Antoine.) Modified: python/branches/py3k/Lib/decimal.py Modified: python/branches/py3k/Lib/decimal.py ============================================================================== --- python/branches/py3k/Lib/decimal.py (original) +++ python/branches/py3k/Lib/decimal.py Wed Feb 2 00:54:43 2011 @@ -6245,5 +6245,5 @@ if __name__ == '__main__': - import doctest, sys - doctest.testmod(sys.modules[__name__]) + import doctest, decimal + doctest.testmod(decimal) From solipsis at pitrou.net Wed Feb 2 05:05:38 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 02 Feb 2011 05:05:38 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88302): sum=0 Message-ID: py3k results for svn r88302 (hg cset 150a1c63a581) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogzV_eAW', '-x'] From python-checkins at python.org Wed Feb 2 09:37:11 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 2 Feb 2011 09:37:11 +0100 (CET) Subject: [Python-checkins] r88318 - python/branches/release27-maint/Lib/ConfigParser.py Message-ID: <20110202083711.D14C0EEA2F@mail.python.org> Author: raymond.hettinger Date: Wed Feb 2 09:37:11 2011 New Revision: 88318 Log: Issue 11089: Fix performance bug in ConfigParser that impaired its usability for large config files. The ConfigParser.get() method should have been O(1) but had O(n) dict copies and updates on every call. This was exacerbated by using OrderedDicts which do not copy or update nearly as fast as regular dicts which are coded in C. Modified: python/branches/release27-maint/Lib/ConfigParser.py Modified: python/branches/release27-maint/Lib/ConfigParser.py ============================================================================== --- python/branches/release27-maint/Lib/ConfigParser.py (original) +++ python/branches/release27-maint/Lib/ConfigParser.py Wed Feb 2 09:37:11 2011 @@ -545,6 +545,38 @@ if isinstance(val, list): options[name] = '\n'.join(val) +import UserDict as _UserDict + +class _Chainmap(_UserDict.DictMixin): + """Combine multiple mappings for successive lookups. 
+ + For example, to emulate Python's normal lookup sequence: + + import __builtin__ + pylookup = _Chainmap(locals(), globals(), vars(__builtin__)) + """ + + def __init__(self, *maps): + self._maps = maps + + def __getitem__(self, key): + for mapping in self._maps: + try: + return mapping[key] + except KeyError: + pass + raise KeyError(key) + + def keys(self): + result = [] + seen = set() + for mapping in self_maps: + for key in mapping: + if key not in seen: + result.append(key) + seen.add(key) + return result + class ConfigParser(RawConfigParser): def get(self, section, option, raw=False, vars=None): @@ -559,16 +591,18 @@ The section DEFAULT is special. """ - d = self._defaults.copy() + sectiondict = {} try: - d.update(self._sections[section]) + sectiondict = self._sections[section] except KeyError: if section != DEFAULTSECT: raise NoSectionError(section) # Update with the entry specific variables + vardict = {} if vars: for key, value in vars.items(): - d[self.optionxform(key)] = value + vardict[self.optionxform(key)] = value + d = _Chainmap(vardict, sectiondict, self._defaults) option = self.optionxform(option) try: value = d[option] From python-checkins at python.org Wed Feb 2 17:58:43 2011 From: python-checkins at python.org (eric.araujo) Date: Wed, 2 Feb 2011 17:58:43 +0100 (CET) Subject: [Python-checkins] r88319 - python/branches/release31-maint/Lib/test/test_zipfile.py Message-ID: <20110202165843.9D084EE9A3@mail.python.org> Author: eric.araujo Date: Wed Feb 2 17:58:43 2011 New Revision: 88319 Log: Fix typo: BadZipFile exists in 3.2+ only, not older versions. Modified: python/branches/release31-maint/Lib/test/test_zipfile.py Modified: python/branches/release31-maint/Lib/test/test_zipfile.py ============================================================================== --- python/branches/release31-maint/Lib/test/test_zipfile.py (original) +++ python/branches/release31-maint/Lib/test/test_zipfile.py Wed Feb 2 17:58:43 2011 @@ -909,7 +909,7 @@ zipf.close() try: zipf = zipfile.ZipFile(TESTFN, mode="r") - except zipfile.BadZipFile: + except zipfile.BadZipfile: self.fail("Unable to create empty ZIP file in 'w' mode") zipf = zipfile.ZipFile(TESTFN, mode="a") From python-checkins at python.org Wed Feb 2 18:03:38 2011 From: python-checkins at python.org (eric.araujo) Date: Wed, 2 Feb 2011 18:03:38 +0100 (CET) Subject: [Python-checkins] r88320 - python/branches/release27-maint/Lib/test/test_zipfile.py Message-ID: <20110202170338.DA4FDEE9AD@mail.python.org> Author: eric.araujo Date: Wed Feb 2 18:03:38 2011 New Revision: 88320 Log: Fix typo: BadZipFile exists in 3.2+ only, not older versions. 
Modified: python/branches/release27-maint/Lib/test/test_zipfile.py Modified: python/branches/release27-maint/Lib/test/test_zipfile.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_zipfile.py (original) +++ python/branches/release27-maint/Lib/test/test_zipfile.py Wed Feb 2 18:03:38 2011 @@ -956,7 +956,7 @@ zipf.close() try: zipf = zipfile.ZipFile(TESTFN, mode="r") - except zipfile.BadZipFile: + except zipfile.BadZipfile: self.fail("Unable to create empty ZIP file in 'w' mode") zipf = zipfile.ZipFile(TESTFN, mode="a") From python-checkins at python.org Wed Feb 2 21:44:48 2011 From: python-checkins at python.org (eric.araujo) Date: Wed, 2 Feb 2011 21:44:48 +0100 (CET) Subject: [Python-checkins] r88321 - python/branches/release31-maint Message-ID: <20110202204448.604E8EEA11@mail.python.org> Author: eric.araujo Date: Wed Feb 2 21:44:48 2011 New Revision: 88321 Log: Blocked revisions 84189,84532,84537,84540,84549,84551,84602,85226,85786,85891,86247-86248,86253,86275,86326,86486,86513,86617,86630,86661,86668,86758,86993,87056,87304,87446,87461 via svnmerge ........ r84189 | eric.araujo | 2010-08-19 00:35:23 +0200 (jeu., 19 août 2010) | 2 lines Fix typo ........ r84532 | eric.araujo | 2010-09-05 19:32:25 +0200 (dim., 05 sept. 2010) | 11 lines Fix typos and wording in what’s new 3.2. - The entry about shutil.copytree is just a revert of r84524 which looks like an unfinished edition. - The use of gender-neutral language (s/his/their/) removes the implicit assumption that programmer == male (change agreed by Antoine). - Other changes should be uncontroversial fixes. I haven’t rewrapped under 80 lines to keep the diffs readable; I’ll rewrap later. ........ r84537 | eric.araujo | 2010-09-05 20:43:07 +0200 (dim., 05 sept. 2010) | 2 lines Make naming consistent ........ r84540 | eric.araujo | 2010-09-05 20:59:49 +0200 (dim., 05 sept. 2010) | 2 lines Fix accidental suppression in r84537 ........ r84549 | eric.araujo | 2010-09-06 03:27:06 +0200 (lun., 06 sept. 2010) | 1 line Update ........ r84551 | eric.araujo | 2010-09-06 03:31:11 +0200 (lun., 06 sept. 2010) | 2 lines Revert accidental commit, apologies for the noise ........ r84602 | eric.araujo | 2010-09-07 23:35:35 +0200 (mar., 07 sept. 2010) | 2 lines Fix typo in whatsnew (#9793) ........ r85226 | eric.araujo | 2010-10-05 02:04:20 +0200 (mar., 05 oct. 2010) | 2 lines Fix news entry formatting nits ........ r85786 | eric.araujo | 2010-10-22 01:02:07 +0200 (ven., 22 oct. 2010) | 8 lines Apply fix from r85784 on py3k too. Fixes bug #10126 for Python 3.2 by using $RUNSHARED to find the directory to the shared library. test_distutils now passes when Python was built with --enable-shared (Barry didn’t have the error but I did). ........ r85891 | eric.araujo | 2010-10-28 15:49:17 +0200 (jeu., 28 oct. 2010) | 2 lines Fix typo from r85874 ........ r86247 | eric.araujo | 2010-11-06 05:59:27 +0100 (sam., 06 nov. 2010) | 2 lines Fix typo from r86170. ........ r86248 | eric.araujo | 2010-11-06 07:00:54 +0100 (sam., 06 nov. 2010) | 6 lines Remove traces of Mac OS 9 support, again (#9508). This was done in r80805 (#7908) and erroneously brought back by the distutils revert. This commit removes more code than the original, which was uncomplete. There is no NEWS entry, like in r80805. ........ r86253 | eric.araujo | 2010-11-06 08:03:07 +0100 (sam., 06 nov. 2010) | 4 lines Tweak example to make clear the argument is a boolean, not any integer. With Raymond’s approval. ........ 
r86275 | eric.araujo | 2010-11-06 17:06:37 +0100 (sam., 06 nov. 2010) | 2 lines Remove traces of setuptools (#10341) ........ r86326 | eric.araujo | 2010-11-08 18:13:03 +0100 (lun., 08 nov. 2010) | 2 lines Move a news entry to the right section (+ light reformatting) ........ r86486 | eric.araujo | 2010-11-16 20:13:50 +0100 (mar., 16 nov. 2010) | 4 lines Provide links to Python source where the code is short, readable and informative adjunct to the docs. Forward-port of Raymond's r86225 and r86245 using the new source reST role added in #10334. ........ r86513 | eric.araujo | 2010-11-18 15:22:08 +0100 (jeu., 18 nov. 2010) | 2 lines Remove spurious space that was breaking Vim’s reST highlighting. ........ r86617 | eric.araujo | 2010-11-20 22:53:02 +0100 (sam., 20 nov. 2010) | 2 lines Fix typos and style in compileall. ........ r86630 | eric.araujo | 2010-11-21 03:19:09 +0100 (dim., 21 nov. 2010) | 2 lines Try to get more useful output from failing buildbot ........ r86661 | eric.araujo | 2010-11-22 02:11:49 +0100 (lun., 22 nov. 2010) | 2 lines Fix typo ........ r86668 | eric.araujo | 2010-11-22 03:42:43 +0100 (lun., 22 nov. 2010) | 2 lines Fix one compileall test (#10453). Patch by Michele Orrù. ........ r86758 | eric.araujo | 2010-11-26 01:39:59 +0100 (ven., 26 nov. 2010) | 2 lines #10453 follow-up: Fix test_quiet on Windows, thanks to Stephan Krah. ........ r86993 | eric.araujo | 2010-12-03 20:41:00 +0100 (ven., 03 déc. 2010) | 7 lines Allow translators to reorder placeholders in localizable messages from argparse (#10528). There is no unit test; I checked with xgettext that no more warnings were emitted. Steven approved the change. ........ r87056 | eric.araujo | 2010-12-04 18:31:49 +0100 (sam., 04 déc. 2010) | 2 lines Use proper plural forms in argparse (#4391) ........ r87304 | eric.araujo | 2010-12-16 04:13:05 +0100 (jeu., 16 déc. 2010) | 2 lines Fix one versionchanged ........ r87446 | eric.araujo | 2010-12-23 19:44:31 +0100 (jeu., 23 déc. 2010) | 2 lines Nits: use a real boolean, make one docstring more similar to the other ones ........ r87461 | eric.araujo | 2010-12-24 00:18:41 +0100 (ven., 24 déc. 2010) | 2 lines Fix syntax typo ........ Modified: python/branches/release31-maint/ (props changed) From python-checkins at python.org Wed Feb 2 22:12:39 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 2 Feb 2011 22:12:39 +0100 (CET) Subject: [Python-checkins] r88322 - python/branches/py3k/Doc/library/collections.rst Message-ID: <20110202211239.C8D86EEA21@mail.python.org> Author: raymond.hettinger Date: Wed Feb 2 22:12:39 2011 New Revision: 88322 Log: Punctuation typo. Modified: python/branches/py3k/Doc/library/collections.rst Modified: python/branches/py3k/Doc/library/collections.rst ============================================================================== --- python/branches/py3k/Doc/library/collections.rst (original) +++ python/branches/py3k/Doc/library/collections.rst Wed Feb 2 22:12:39 2011 @@ -975,7 +975,7 @@ :class:`Sized` ``__len__`` :class:`Callable` ``__call__`` -:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``. ``__iter__``, ``__reversed__``. +:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``. 
``__iter__``, ``__reversed__``, :class:`Iterable`, ``index``, and ``count`` :class:`Container` From python-checkins at python.org Wed Feb 2 22:35:49 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 2 Feb 2011 22:35:49 +0100 (CET) Subject: [Python-checkins] r88323 - in python/branches/release31-maint: Lib/configparser.py Misc/NEWS Message-ID: <20110202213549.1CE8BEEA55@mail.python.org> Author: raymond.hettinger Date: Wed Feb 2 22:35:48 2011 New Revision: 88323 Log: Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. Modified: python/branches/release31-maint/Lib/configparser.py python/branches/release31-maint/Misc/NEWS Modified: python/branches/release31-maint/Lib/configparser.py ============================================================================== --- python/branches/release31-maint/Lib/configparser.py (original) +++ python/branches/release31-maint/Lib/configparser.py Wed Feb 2 22:35:48 2011 @@ -88,7 +88,7 @@ """ try: - from collections import OrderedDict as _default_dict + from collections import Mapping, OrderedDict as _default_dict except ImportError: # fallback for setup.py which hasn't yet built _collections _default_dict = dict @@ -515,6 +515,38 @@ if e: raise e +class _Chainmap(Mapping): + """Combine multiple mappings for successive lookups. + + For example, to emulate Python's normal lookup sequence: + + import __builtin__ + pylookup = _Chainmap(locals(), globals(), vars(__builtin__)) + """ + + def __init__(self, *maps): + self.maps = maps + + def __getitem__(self, key): + for mapping in self.maps: + try: + return mapping[key] + except KeyError: + pass + raise KeyError(key) + + def __iter__(self): + seen = set() + for mapping in self.maps: + s = set(mapping) - seen + for elem in s: + yield elem + seen.update(s) + + def __len__(self): + s = set() + s.update(*self.maps) + return len(s) class ConfigParser(RawConfigParser): @@ -530,16 +562,18 @@ The section DEFAULT is special. """ - d = self._defaults.copy() + sectiondict = {} try: - d.update(self._sections[section]) + sectiondict = self._sections[section] except KeyError: if section != DEFAULTSECT: raise NoSectionError(section) # Update with the entry specific variables + vardict = {} if vars: for key, value in vars.items(): - d[self.optionxform(key)] = value + vardict[self.optionxform(key)] = value + d = _Chainmap(vardict, sectiondict, self._defaults) option = self.optionxform(option) try: value = d[option] Modified: python/branches/release31-maint/Misc/NEWS ============================================================================== --- python/branches/release31-maint/Misc/NEWS (original) +++ python/branches/release31-maint/Misc/NEWS Wed Feb 2 22:35:48 2011 @@ -37,6 +37,9 @@ Library ------- +- Issue #11089: Fix performance issue limiting the use of ConfigParser() + with large config files. + - Issue #8275: Fix passing of callback arguments with ctypes under Win64. Patch by Stan Mihai. 
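The _Chainmap helper introduced by r88318 and r88323 above removes the per-call dict copies from ConfigParser.get() by probing each mapping in turn: the vars passed by the caller, then the section, then the defaults. A minimal sketch of that lookup order (not code from the patch; the option names are made up):

    # The first mapping that contains the key wins; nothing is copied or merged.
    def chained_lookup(key, *maps):
        for mapping in maps:
            if key in mapping:
                return mapping[key]
        raise KeyError(key)

    vardict = {'timeout': '10'}
    sectiondict = {'timeout': '30', 'host': 'example.org'}
    defaults = {'timeout': '60', 'retries': '3'}

    print(chained_lookup('timeout', vardict, sectiondict, defaults))  # '10', from vars
    print(chained_lookup('retries', vardict, sectiondict, defaults))  # '3', from DEFAULT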
From python-checkins at python.org Wed Feb 2 22:38:37 2011 From: python-checkins at python.org (eric.araujo) Date: Wed, 2 Feb 2011 22:38:37 +0100 (CET) Subject: [Python-checkins] r88324 - in python/branches/release31-maint: Doc/library/trace.rst Lib/distutils/tests/__init__.py Lib/distutils/tests/test_archive_util.py Lib/distutils/tests/test_bdist.py Lib/distutils/tests/test_bdist_dumb.py Lib/distutils/tests/test_bdist_rpm.py Lib/distutils/tests/test_bdist_wininst.py Lib/distutils/tests/test_build_clib.py Lib/distutils/tests/test_build_ext.py Lib/distutils/tests/test_build_py.py Lib/distutils/tests/test_build_scripts.py Lib/distutils/tests/test_check.py Lib/distutils/tests/test_clean.py Lib/distutils/tests/test_cmd.py Lib/distutils/tests/test_config.py Lib/distutils/tests/test_config_cmd.py Lib/distutils/tests/test_core.py Lib/distutils/tests/test_cygwinccompiler.py Lib/distutils/tests/test_dir_util.py Lib/distutils/tests/test_dist.py Lib/distutils/tests/test_extension.py Lib/distutils/tests/test_file_util.py Lib/distutils/tests/test_filelist.py Lib/distutils/tests/test_install.py Lib/distutils/tests/test_install_data.py Lib/distutils/tests/test_install_headers.py Lib/distutils/tests/test_install_lib.py Lib/distutils/tests/test_install_scripts.py Lib/distutils/tests/test_log.py Lib/distutils/tests/test_msvc9compiler.py Lib/distutils/tests/test_register.py Lib/distutils/tests/test_sdist.py Lib/distutils/tests/test_spawn.py Lib/distutils/tests/test_text_file.py Lib/distutils/tests/test_unixccompiler.py Lib/distutils/tests/test_upload.py Lib/distutils/tests/test_util.py Lib/distutils/tests/test_version.py Lib/distutils/tests/test_versionpredicate.py Lib/test/test_gettext.py Lib/test/test_tuple.py Misc/NEWS Message-ID: <20110202213837.8B081EEA58@mail.python.org> Author: eric.araujo Date: Wed Feb 2 22:38:37 2011 New Revision: 88324 Log: Merged revisions 86236,86240,86332,86340,87271,87273,87447 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k The missing NEWS entries correspond to changes that were made before 3.1.3, but I think it’s not usual to edit entries of released versions, so I put them at the top. ........ r86236 | eric.araujo | 2010-11-06 03:44:43 +0100 (sam., 06 nov. 2010) | 2 lines Make sure each test can be run standalone (./python Lib/distutils/tests/x.py) ........ r86240 | eric.araujo | 2010-11-06 05:11:59 +0100 (sam., 06 nov. 2010) | 2 lines Prevent ResourceWarnings in test_gettext ........ r86332 | eric.araujo | 2010-11-08 19:15:17 +0100 (lun., 08 nov. 2010) | 4 lines Add missing NEWS entry for a fix committed by Senthil. All recent modifications to distutils should now be covered in NEWS. ........ r86340 | eric.araujo | 2010-11-08 22:48:23 +0100 (lun., 08 nov. 2010) | 2 lines This was actually fixed for the previous alpha. ........ r87271 | eric.araujo | 2010-12-15 20:09:58 +0100 (mer., 15 déc. 2010) | 2 lines Improve trace documentation (#9264). Patch by Eli Bendersky. ........ r87273 | eric.araujo | 2010-12-15 20:30:15 +0100 (mer., 15 déc. 2010) | 2 lines Use nested method directives, rewrap long lines, fix whitespace. ........ r87447 | eric.araujo | 2010-12-23 20:13:05 +0100 (jeu., 23 déc. 2010) | 2 lines Fix typo in superclass method name ........ 
Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/library/trace.rst python/branches/release31-maint/Lib/distutils/tests/__init__.py python/branches/release31-maint/Lib/distutils/tests/test_archive_util.py python/branches/release31-maint/Lib/distutils/tests/test_bdist.py python/branches/release31-maint/Lib/distutils/tests/test_bdist_dumb.py python/branches/release31-maint/Lib/distutils/tests/test_bdist_rpm.py python/branches/release31-maint/Lib/distutils/tests/test_bdist_wininst.py python/branches/release31-maint/Lib/distutils/tests/test_build_clib.py python/branches/release31-maint/Lib/distutils/tests/test_build_ext.py python/branches/release31-maint/Lib/distutils/tests/test_build_py.py python/branches/release31-maint/Lib/distutils/tests/test_build_scripts.py python/branches/release31-maint/Lib/distutils/tests/test_check.py python/branches/release31-maint/Lib/distutils/tests/test_clean.py python/branches/release31-maint/Lib/distutils/tests/test_cmd.py python/branches/release31-maint/Lib/distutils/tests/test_config.py python/branches/release31-maint/Lib/distutils/tests/test_config_cmd.py python/branches/release31-maint/Lib/distutils/tests/test_core.py python/branches/release31-maint/Lib/distutils/tests/test_cygwinccompiler.py python/branches/release31-maint/Lib/distutils/tests/test_dir_util.py python/branches/release31-maint/Lib/distutils/tests/test_dist.py python/branches/release31-maint/Lib/distutils/tests/test_extension.py python/branches/release31-maint/Lib/distutils/tests/test_file_util.py python/branches/release31-maint/Lib/distutils/tests/test_filelist.py python/branches/release31-maint/Lib/distutils/tests/test_install.py python/branches/release31-maint/Lib/distutils/tests/test_install_data.py python/branches/release31-maint/Lib/distutils/tests/test_install_headers.py python/branches/release31-maint/Lib/distutils/tests/test_install_lib.py python/branches/release31-maint/Lib/distutils/tests/test_install_scripts.py python/branches/release31-maint/Lib/distutils/tests/test_log.py python/branches/release31-maint/Lib/distutils/tests/test_msvc9compiler.py python/branches/release31-maint/Lib/distutils/tests/test_register.py python/branches/release31-maint/Lib/distutils/tests/test_sdist.py python/branches/release31-maint/Lib/distutils/tests/test_spawn.py python/branches/release31-maint/Lib/distutils/tests/test_text_file.py python/branches/release31-maint/Lib/distutils/tests/test_unixccompiler.py python/branches/release31-maint/Lib/distutils/tests/test_upload.py python/branches/release31-maint/Lib/distutils/tests/test_util.py python/branches/release31-maint/Lib/distutils/tests/test_version.py python/branches/release31-maint/Lib/distutils/tests/test_versionpredicate.py python/branches/release31-maint/Lib/test/test_gettext.py python/branches/release31-maint/Lib/test/test_tuple.py python/branches/release31-maint/Misc/NEWS Modified: python/branches/release31-maint/Doc/library/trace.rst ============================================================================== --- python/branches/release31-maint/Doc/library/trace.rst (original) +++ python/branches/release31-maint/Doc/library/trace.rst Wed Feb 2 22:38:37 2011 @@ -13,103 +13,178 @@ .. _trace-cli: -Command Line Usage +Command-Line Usage ------------------ The :mod:`trace` module can be invoked from the command line. It can be as simple as :: - python -m trace --count somefile.py ... + python -m trace --count -C . somefile.py ... 
-The above will generate annotated listings of all Python modules imported during -the execution of :file:`somefile.py`. +The above will execute :file:`somefile.py` and generate annotated listings of +all Python modules imported during the execution into the current directory. -The following command-line arguments are supported: +.. program:: trace + +.. cmdoption:: --help + + Display usage and exit. + +.. cmdoption:: --version + + Display the version of the module and exit. + +Main options +^^^^^^^^^^^^ + +At least one of the following options must be specified when invoking +:mod:`trace`. The :option:`--listfuncs <-l>` option is mutually exclusive with +the :option:`--trace <-t>` and :option:`--counts <-c>` options . When +:option:`--listfuncs <-l>` is provided, neither :option:`--counts <-c>` nor +:option:`--trace <-t>` are accepted, and vice versa. + +.. program:: trace + +.. cmdoption:: -c, --count + + Produce a set of annotated listing files upon program completion that shows + how many times each statement was executed. See also + :option:`--coverdir <-C>`, :option:`--file <-f>` and + :option:`--no-report <-R>` below. + +.. cmdoption:: -t, --trace -:option:`--trace`, :option:`-t` Display lines as they are executed. -:option:`--count`, :option:`-c` - Produce a set of annotated listing files upon program completion that shows how - many times each statement was executed. +.. cmdoption:: -l, --listfuncs + + Display the functions executed by running the program. + +.. cmdoption:: -r, --report -:option:`--report`, :option:`-r` Produce an annotated list from an earlier program run that used the - :option:`--count` and :option:`--file` arguments. + :option:`--count <-c>` and :option:`--file <-f>` option. This does not + execute any code. -:option:`--no-report`, :option:`-R` - Do not generate annotated listings. This is useful if you intend to make - several runs with :option:`--count` then produce a single set of annotated - listings at the end. +.. cmdoption:: -T, --trackcalls + + Display the calling relationships exposed by running the program. + +Modifiers +^^^^^^^^^ + +.. program:: trace + +.. cmdoption:: -f, --file= -:option:`--listfuncs`, :option:`-l` - List the functions executed by running the program. + Name of a file to accumulate counts over several tracing runs. Should be + used with the :option:`--count <-c>` option. -:option:`--trackcalls`, :option:`-T` - Generate calling relationships exposed by running the program. +.. cmdoption:: -C, --coverdir= -:option:`--file`, :option:`-f` - Name a file containing (or to contain) counts. + Directory where the report files go. The coverage report for + ``package.module`` is written to file :file:`{dir}/{package}/{module}.cover`. -:option:`--coverdir`, :option:`-C` - Name a directory in which to save annotated listing files. +.. cmdoption:: -m, --missing -:option:`--missing`, :option:`-m` When generating annotated listings, mark lines which were not executed with - '``>>>>>>``'. + ``>>>>>>``. -:option:`--summary`, :option:`-s` - When using :option:`--count` or :option:`--report`, write a brief summary to - stdout for each file processed. - -:option:`--ignore-module` - Accepts comma separated list of module names. Ignore each of the named - module and its submodules (if it is a package). May be given - multiple times. - -:option:`--ignore-dir` - Ignore all modules and packages in the named directory and subdirectories - (multiple directories can be joined by os.pathsep). May be given multiple - times. +.. 
cmdoption:: -s, --summary + When using :option:`--count <-c>` or :option:`--report <-r>`, write a brief + summary to stdout for each file processed. + +.. cmdoption:: -R, --no-report + + Do not generate annotated listings. This is useful if you intend to make + several runs with :option:`--count <-c>`, and then produce a single set of + annotated listings at the end. + +.. cmdoption:: -g, --timing + + Prefix each line with the time since the program started. Only used while + tracing. + +Filters +^^^^^^^ + +These options may be repeated multiple times. + +.. program:: trace + +.. cmdoption:: --ignore-module= + + Ignore each of the given module names and its submodules (if it is a + package). The argument can be a list of names separated by a comma. + +.. cmdoption:: --ignore-dir= + + Ignore all modules and packages in the named directory and subdirectories. + The argument can be a list of directories separated by :data:`os.pathsep`. .. _trace-api: -Programming Interface ---------------------- +Programmatic Interface +---------------------- + +.. class:: Trace(count=1, trace=1, countfuncs=0, countcallers=0, ignoremods=(),\ + ignoredirs=(), infile=None, outfile=None, timing=False) + + Create an object to trace execution of a single statement or expression. All + parameters are optional. *count* enables counting of line numbers. *trace* + enables line execution tracing. *countfuncs* enables listing of the + functions called during the run. *countcallers* enables call relationship + tracking. *ignoremods* is a list of modules or packages to ignore. + *ignoredirs* is a list of directories whose modules or packages should be + ignored. *infile* is the name of the file from which to read stored count + information. *outfile* is the name of the file in which to write updated + count information. *timing* enables a timestamp relative to when tracing was + started to be displayed. + + .. method:: run(cmd) + + Execute the command and gather statistics from the execution with + the current tracing parameters. *cmd* must be a string or code object, + suitable for passing into :func:`exec`. + .. method:: runctx(cmd, globals=None, locals=None) -.. class:: Trace(count=1, trace=1, countfuncs=0, countcallers=0, ignoremods=(), ignoredirs=(), infile=None, outfile=None, timing=False) + Execute the command and gather statistics from the execution with the + current tracing parameters, in the defined global and local + environments. If not defined, *globals* and *locals* default to empty + dictionaries. - Create an object to trace execution of a single statement or expression. All - parameters are optional. *count* enables counting of line numbers. *trace* - enables line execution tracing. *countfuncs* enables listing of the functions - called during the run. *countcallers* enables call relationship tracking. - *ignoremods* is a list of modules or packages to ignore. *ignoredirs* is a list - of directories whose modules or packages should be ignored. *infile* is the - file from which to read stored count information. *outfile* is a file in which - to write updated count information. *timing* enables a timestamp relative - to when tracing was started to be displayed. + .. method:: runfunc(func, *args, **kwds) + Call *func* with the given arguments under control of the :class:`Trace` + object with the current tracing parameters. -.. method:: Trace.run(cmd) + .. method:: results() - Run *cmd* under control of the Trace object with the current tracing parameters. 
+ Return a :class:`CoverageResults` object that contains the cumulative + results of all previous calls to ``run``, ``runctx`` and ``runfunc`` + for the given :class:`Trace` instance. Does not reset the accumulated + trace results. +.. class:: CoverageResults -.. method:: Trace.runctx(cmd, globals=None, locals=None) + A container for coverage results, created by :meth:`Trace.results`. Should + not be created directly by the user. - Run *cmd* under control of the Trace object with the current tracing parameters - in the defined global and local environments. If not defined, *globals* and - *locals* default to empty dictionaries. + .. method:: update(other) + Merge in data from another :class:`CoverageResults` object. -.. method:: Trace.runfunc(func, *args, **kwds) + .. method:: write_results(show_missing=True, summary=False, coverdir=None) - Call *func* with the given arguments under control of the :class:`Trace` object - with the current tracing parameters. + Write coverage results. Set *show_missing* to show lines that had no + hits. Set *summary* to include in the output the coverage summary per + module. *coverdir* specifies the directory into which the coverage + result files will be output. If ``None``, the results for each source + file are placed in its directory. -This is a simple example showing the use of this module:: +A simple example demonstrating the use of the programmatic interface:: import sys import trace Modified: python/branches/release31-maint/Lib/distutils/tests/__init__.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/__init__.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/__init__.py Wed Feb 2 22:38:37 2011 @@ -15,9 +15,10 @@ import os import sys import unittest +from test.support import run_unittest -here = os.path.dirname(__file__) +here = os.path.dirname(__file__) or os.curdir def test_suite(): @@ -32,4 +33,4 @@ if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_archive_util.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_archive_util.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_archive_util.py Wed Feb 2 22:38:37 2011 @@ -12,7 +12,7 @@ ARCHIVE_FORMATS) from distutils.spawn import find_executable, spawn from distutils.tests import support -from test.support import check_warnings +from test.support import check_warnings, run_unittest try: import zipfile @@ -211,4 +211,4 @@ return unittest.makeSuite(ArchiveUtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_bdist.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_bdist.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_bdist.py Wed Feb 2 22:38:37 2011 @@ -4,6 +4,7 @@ import os import tempfile import shutil +from test.support import run_unittest from distutils.core import Distribution from distutils.command.bdist import bdist @@ -40,4 +41,4 @@ return unittest.makeSuite(BuildTestCase) if __name__ == '__main__': - test_support.run_unittest(test_suite()) + run_unittest(test_suite()) Modified: 
python/branches/release31-maint/Lib/distutils/tests/test_bdist_dumb.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_bdist_dumb.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_bdist_dumb.py Wed Feb 2 22:38:37 2011 @@ -3,6 +3,7 @@ import unittest import sys import os +from test.support import run_unittest from distutils.core import Distribution from distutils.command.bdist_dumb import bdist_dumb @@ -77,4 +78,4 @@ return unittest.makeSuite(BuildDumbTestCase) if __name__ == '__main__': - test_support.run_unittest(test_suite()) + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_bdist_rpm.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_bdist_rpm.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_bdist_rpm.py Wed Feb 2 22:38:37 2011 @@ -5,6 +5,7 @@ import os import tempfile import shutil +from test.support import run_unittest from distutils.core import Distribution from distutils.command.bdist_rpm import bdist_rpm @@ -122,4 +123,4 @@ return unittest.makeSuite(BuildRpmTestCase) if __name__ == '__main__': - test_support.run_unittest(test_suite()) + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_bdist_wininst.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_bdist_wininst.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_bdist_wininst.py Wed Feb 2 22:38:37 2011 @@ -1,5 +1,6 @@ """Tests for distutils.command.bdist_wininst.""" import unittest +from test.support import run_unittest from distutils.command.bdist_wininst import bdist_wininst from distutils.tests import support @@ -27,4 +28,4 @@ return unittest.makeSuite(BuildWinInstTestCase) if __name__ == '__main__': - test_support.run_unittest(test_suite()) + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_build_clib.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_build_clib.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_build_clib.py Wed Feb 2 22:38:37 2011 @@ -3,6 +3,8 @@ import os import sys +from test.support import run_unittest + from distutils.command.build_clib import build_clib from distutils.errors import DistutilsSetupError from distutils.tests import support @@ -141,4 +143,4 @@ return unittest.makeSuite(BuildCLibTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_build_ext.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_build_ext.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_build_ext.py Wed Feb 2 22:38:37 2011 @@ -15,6 +15,7 @@ import unittest from test import support +from test.support import run_unittest # http://bugs.python.org/issue4373 # Don't load the xx module more than once. 
Modified: python/branches/release31-maint/Lib/distutils/tests/test_build_py.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_build_py.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_build_py.py Wed Feb 2 22:38:37 2011 @@ -10,6 +10,7 @@ from distutils.errors import DistutilsFileError from distutils.tests import support +from test.support import run_unittest class BuildPyTestCase(support.TempdirManager, @@ -114,4 +115,4 @@ return unittest.makeSuite(BuildPyTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_build_scripts.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_build_scripts.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_build_scripts.py Wed Feb 2 22:38:37 2011 @@ -8,6 +8,7 @@ from distutils import sysconfig from distutils.tests import support +from test.support import run_unittest class BuildScriptsTestCase(support.TempdirManager, @@ -108,4 +109,4 @@ return unittest.makeSuite(BuildScriptsTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_check.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_check.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_check.py Wed Feb 2 22:38:37 2011 @@ -1,5 +1,6 @@ """Tests for distutils.command.check.""" import unittest +from test.support import run_unittest from distutils.command.check import check, HAS_DOCUTILS from distutils.tests import support @@ -95,4 +96,4 @@ return unittest.makeSuite(CheckTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_clean.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_clean.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_clean.py Wed Feb 2 22:38:37 2011 @@ -6,6 +6,7 @@ from distutils.command.clean import clean from distutils.tests import support +from test.support import run_unittest class cleanTestCase(support.TempdirManager, support.LoggingSilencer, @@ -47,4 +48,4 @@ return unittest.makeSuite(cleanTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_cmd.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_cmd.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_cmd.py Wed Feb 2 22:38:37 2011 @@ -1,7 +1,7 @@ """Tests for distutils.cmd.""" import unittest import os -from test.support import captured_stdout +from test.support import captured_stdout, run_unittest from distutils.cmd import Command from distutils.dist import Distribution @@ -99,7 +99,7 @@ def test_ensure_dirname(self): cmd = self.cmd - cmd.option1 = os.path.dirname(__file__) + cmd.option1 = os.path.dirname(__file__) or os.curdir cmd.ensure_dirname('option1') cmd.option2 = 'xxx' 
self.assertRaises(DistutilsOptionError, cmd.ensure_dirname, 'option2') @@ -124,4 +124,4 @@ return unittest.makeSuite(CommandTestCase) if __name__ == '__main__': - test_support.run_unittest(test_suite()) + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_config.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_config.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_config.py Wed Feb 2 22:38:37 2011 @@ -10,6 +10,7 @@ from distutils.log import WARN from distutils.tests import support +from test.support import run_unittest PYPIRC = """\ [distutils] @@ -116,4 +117,4 @@ return unittest.makeSuite(PyPIRCCommandTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_config_cmd.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_config_cmd.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_config_cmd.py Wed Feb 2 22:38:37 2011 @@ -2,6 +2,7 @@ import unittest import os import sys +from test.support import run_unittest from distutils.command.config import dump_file, config from distutils.tests import support @@ -86,4 +87,4 @@ return unittest.makeSuite(ConfigTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_core.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_core.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_core.py Wed Feb 2 22:38:37 2011 @@ -6,7 +6,7 @@ import shutil import sys import test.support -from test.support import captured_stdout +from test.support import captured_stdout, run_unittest import unittest from distutils.tests import support @@ -105,4 +105,4 @@ return unittest.makeSuite(CoreTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_cygwinccompiler.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_cygwinccompiler.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_cygwinccompiler.py Wed Feb 2 22:38:37 2011 @@ -4,6 +4,7 @@ import os from io import BytesIO import subprocess +from test.support import run_unittest from distutils import cygwinccompiler from distutils.cygwinccompiler import (CygwinCCompiler, check_config_h, @@ -151,4 +152,4 @@ return unittest.makeSuite(CygwinCCompilerTestCase) if __name__ == '__main__': - test_support.run_unittest(test_suite()) + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_dir_util.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_dir_util.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_dir_util.py Wed Feb 2 22:38:37 2011 @@ -10,6 +10,7 @@ from distutils import log from distutils.tests import support +from test.support import run_unittest class DirUtilTestCase(support.TempdirManager, unittest.TestCase): @@ -112,4 +113,4 @@ 
return unittest.makeSuite(DirUtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_dist.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_dist.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_dist.py Wed Feb 2 22:38:37 2011 @@ -10,7 +10,7 @@ from distutils.dist import Distribution, fix_help_options from distutils.cmd import Command -from test.support import TESTFN, captured_stdout +from test.support import TESTFN, captured_stdout, run_unittest from distutils.tests import support @@ -326,4 +326,4 @@ return suite if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_extension.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_extension.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_extension.py Wed Feb 2 22:38:37 2011 @@ -3,7 +3,7 @@ import os import warnings -from test.support import check_warnings +from test.support import check_warnings, run_unittest from distutils.extension import read_setup_file, Extension class ExtensionTestCase(unittest.TestCase): @@ -66,4 +66,4 @@ return unittest.makeSuite(ExtensionTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_file_util.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_file_util.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_file_util.py Wed Feb 2 22:38:37 2011 @@ -6,6 +6,7 @@ from distutils.file_util import move_file from distutils import log from distutils.tests import support +from test.support import run_unittest class FileUtilTestCase(support.TempdirManager, unittest.TestCase): @@ -62,4 +63,4 @@ return unittest.makeSuite(FileUtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_filelist.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_filelist.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_filelist.py Wed Feb 2 22:38:37 2011 @@ -2,7 +2,7 @@ import unittest from distutils.filelist import glob_to_re, FileList -from test.support import captured_stdout +from test.support import captured_stdout, run_unittest from distutils import debug class FileListTestCase(unittest.TestCase): @@ -39,4 +39,4 @@ return unittest.makeSuite(FileListTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_install.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_install.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_install.py Wed Feb 2 22:38:37 2011 @@ -6,7 +6,7 @@ import unittest import site -from test.support import captured_stdout +from test.support import captured_stdout, 
run_unittest from distutils.command.install import install from distutils.command import install as install_module @@ -203,4 +203,4 @@ return unittest.makeSuite(InstallTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_install_data.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_install_data.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_install_data.py Wed Feb 2 22:38:37 2011 @@ -6,6 +6,7 @@ from distutils.command.install_data import install_data from distutils.tests import support +from test.support import run_unittest class InstallDataTestCase(support.TempdirManager, support.LoggingSilencer, @@ -73,4 +74,4 @@ return unittest.makeSuite(InstallDataTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_install_headers.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_install_headers.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_install_headers.py Wed Feb 2 22:38:37 2011 @@ -6,6 +6,7 @@ from distutils.command.install_headers import install_headers from distutils.tests import support +from test.support import run_unittest class InstallHeadersTestCase(support.TempdirManager, support.LoggingSilencer, @@ -37,4 +38,4 @@ return unittest.makeSuite(InstallHeadersTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_install_lib.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_install_lib.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_install_lib.py Wed Feb 2 22:38:37 2011 @@ -7,6 +7,7 @@ from distutils.extension import Extension from distutils.tests import support from distutils.errors import DistutilsOptionError +from test.support import run_unittest class InstallLibTestCase(support.TempdirManager, support.LoggingSilencer, @@ -98,4 +99,4 @@ return unittest.makeSuite(InstallLibTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_install_scripts.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_install_scripts.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_install_scripts.py Wed Feb 2 22:38:37 2011 @@ -7,6 +7,7 @@ from distutils.core import Distribution from distutils.tests import support +from test.support import run_unittest class InstallScriptsTestCase(support.TempdirManager, @@ -78,4 +79,4 @@ return unittest.makeSuite(InstallScriptsTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_log.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_log.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_log.py Wed Feb 
2 22:38:37 2011 @@ -3,6 +3,7 @@ import sys import unittest from tempfile import NamedTemporaryFile +from test.support import run_unittest from distutils import log @@ -33,4 +34,4 @@ return unittest.makeSuite(TestLog) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_msvc9compiler.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_msvc9compiler.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_msvc9compiler.py Wed Feb 2 22:38:37 2011 @@ -5,6 +5,7 @@ from distutils.errors import DistutilsPlatformError from distutils.tests import support +from test.support import run_unittest _MANIFEST = """\ @@ -137,4 +138,4 @@ return unittest.makeSuite(msvc9compilerTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_register.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_register.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_register.py Wed Feb 2 22:38:37 2011 @@ -6,7 +6,7 @@ import urllib import warnings -from test.support import check_warnings +from test.support import check_warnings, run_unittest from distutils.command import register as register_module from distutils.command.register import register @@ -259,4 +259,4 @@ return unittest.makeSuite(RegisterTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_sdist.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_sdist.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_sdist.py Wed Feb 2 22:38:37 2011 @@ -8,11 +8,9 @@ import tempfile import warnings -from test.support import check_warnings -from test.support import captured_stdout +from test.support import captured_stdout, check_warnings, run_unittest -from distutils.command.sdist import sdist -from distutils.command.sdist import show_formats +from distutils.command.sdist import sdist, show_formats from distutils.core import Distribution from distutils.tests.test_config import PyPIRCCommandTestCase from distutils.errors import DistutilsExecError, DistutilsOptionError @@ -356,4 +354,4 @@ return unittest.makeSuite(SDistTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_spawn.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_spawn.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_spawn.py Wed Feb 2 22:38:37 2011 @@ -2,7 +2,7 @@ import unittest import os import time -from test.support import captured_stdout +from test.support import captured_stdout, run_unittest from distutils.spawn import _nt_quote_args from distutils.spawn import spawn, find_executable @@ -55,4 +55,4 @@ return unittest.makeSuite(SpawnTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: 
python/branches/release31-maint/Lib/distutils/tests/test_text_file.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_text_file.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_text_file.py Wed Feb 2 22:38:37 2011 @@ -3,6 +3,7 @@ import unittest from distutils.text_file import TextFile from distutils.tests import support +from test.support import run_unittest TEST_DATA = """# test file @@ -103,4 +104,4 @@ return unittest.makeSuite(TextFileTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_unixccompiler.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_unixccompiler.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_unixccompiler.py Wed Feb 2 22:38:37 2011 @@ -1,6 +1,7 @@ """Tests for distutils.unixccompiler.""" import sys import unittest +from test.support import run_unittest from distutils import sysconfig from distutils.unixccompiler import UnixCCompiler @@ -118,4 +119,4 @@ return unittest.makeSuite(UnixCCompilerTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_upload.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_upload.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_upload.py Wed Feb 2 22:38:37 2011 @@ -1,13 +1,12 @@ """Tests for distutils.command.upload.""" -import sys import os import unittest import http.client as httpclient +from test.support import run_unittest from distutils.command.upload import upload from distutils.core import Distribution -from distutils.tests import support from distutils.tests.test_config import PYPIRC, PyPIRCCommandTestCase PYPIRC_LONG_PASSWORD = """\ @@ -136,4 +135,4 @@ return unittest.makeSuite(uploadTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_util.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_util.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_util.py Wed Feb 2 22:38:37 2011 @@ -3,6 +3,7 @@ import sys import unittest from copy import copy +from test.support import run_unittest from distutils.errors import DistutilsPlatformError, DistutilsByteCompileError from distutils.util import (get_platform, convert_path, change_root, @@ -274,4 +275,4 @@ return unittest.makeSuite(UtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_version.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_version.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_version.py Wed Feb 2 22:38:37 2011 @@ -2,6 +2,7 @@ import unittest from distutils.version import LooseVersion from distutils.version import StrictVersion +from test.support import run_unittest class VersionTestCase(unittest.TestCase): @@ 
-67,4 +68,4 @@ return unittest.makeSuite(VersionTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/distutils/tests/test_versionpredicate.py ============================================================================== --- python/branches/release31-maint/Lib/distutils/tests/test_versionpredicate.py (original) +++ python/branches/release31-maint/Lib/distutils/tests/test_versionpredicate.py Wed Feb 2 22:38:37 2011 @@ -4,6 +4,10 @@ import distutils.versionpredicate import doctest +from test.support import run_unittest def test_suite(): return doctest.DocTestSuite(distutils.versionpredicate) + +if __name__ == '__main__': + run_unittest(test_suite()) Modified: python/branches/release31-maint/Lib/test/test_gettext.py ============================================================================== --- python/branches/release31-maint/Lib/test/test_gettext.py (original) +++ python/branches/release31-maint/Lib/test/test_gettext.py Wed Feb 2 22:38:37 2011 @@ -64,15 +64,12 @@ def setUp(self): if not os.path.isdir(LOCALEDIR): os.makedirs(LOCALEDIR) - fp = open(MOFILE, 'wb') - fp.write(base64.decodebytes(GNU_MO_DATA)) - fp.close() - fp = open(UMOFILE, 'wb') - fp.write(base64.decodebytes(UMO_DATA)) - fp.close() - fp = open(MMOFILE, 'wb') - fp.write(base64.decodebytes(MMO_DATA)) - fp.close() + with open(MOFILE, 'wb') as fp: + fp.write(base64.decodebytes(GNU_MO_DATA)) + with open(UMOFILE, 'wb') as fp: + fp.write(base64.decodebytes(UMO_DATA)) + with open(MMOFILE, 'wb') as fp: + fp.write(base64.decodebytes(MMO_DATA)) self.env = support.EnvironmentVarGuard() self.env['LANGUAGE'] = 'xx' gettext._translations.clear() @@ -135,9 +132,8 @@ def test_the_alternative_interface(self): eq = self.assertEqual # test the alternative interface - fp = open(self.mofile, 'rb') - t = gettext.GNUTranslations(fp) - fp.close() + with open(self.mofile, 'rb') as fp: + t = gettext.GNUTranslations(fp) # Install the translation object t.install() eq(_('nudge nudge'), 'wink wink') @@ -227,9 +223,8 @@ def test_plural_forms2(self): eq = self.assertEqual - fp = open(self.mofile, 'rb') - t = gettext.GNUTranslations(fp) - fp.close() + with open(self.mofile, 'rb') as fp: + t = gettext.GNUTranslations(fp) x = t.ngettext('There is %s file', 'There are %s files', 1) eq(x, 'Hay %s fichero') x = t.ngettext('There is %s file', 'There are %s files', 2) @@ -299,11 +294,8 @@ class UnicodeTranslationsTest(GettextBaseTest): def setUp(self): GettextBaseTest.setUp(self) - fp = open(UMOFILE, 'rb') - try: + with open(UMOFILE, 'rb') as fp: self.t = gettext.GNUTranslations(fp) - finally: - fp.close() self._ = self.t.gettext def test_unicode_msgid(self): @@ -319,15 +311,12 @@ class WeirdMetadataTest(GettextBaseTest): def setUp(self): GettextBaseTest.setUp(self) - fp = open(MMOFILE, 'rb') - try: + with open(MMOFILE, 'rb') as fp: try: self.t = gettext.GNUTranslations(fp) except: self.tearDown() raise - finally: - fp.close() def test_weird_metadata(self): info = self.t.info() Modified: python/branches/release31-maint/Lib/test/test_tuple.py ============================================================================== --- python/branches/release31-maint/Lib/test/test_tuple.py (original) +++ python/branches/release31-maint/Lib/test/test_tuple.py Wed Feb 2 22:38:37 2011 @@ -6,7 +6,7 @@ type2test = tuple def test_constructors(self): - super().test_len() + super().test_constructors() # calling built-in types without argument must return empty self.assertEqual(tuple(), 
()) t0_3 = (0, 1, 2, 3) Modified: python/branches/release31-maint/Misc/NEWS ============================================================================== --- python/branches/release31-maint/Misc/NEWS (original) +++ python/branches/release31-maint/Misc/NEWS Wed Feb 2 22:38:37 2011 @@ -37,6 +37,10 @@ Library ------- +- Issue #2236: distutils' mkpath ignored the mode parameter. + +- Fix typo in one sdist option (medata-check). + - Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. From python-checkins at python.org Wed Feb 2 22:41:26 2011 From: python-checkins at python.org (eric.araujo) Date: Wed, 2 Feb 2011 22:41:26 +0100 (CET) Subject: [Python-checkins] r88325 - python/branches/release31-maint Message-ID: <20110202214126.14531EEA55@mail.python.org> Author: eric.araujo Date: Wed Feb 2 22:41:25 2011 New Revision: 88325 Log: Blocked revisions 86252 via svnmerge ........ r86252 | eric.araujo | 2010-11-06 07:33:03 +0100 (sam., 06 nov. 2010) | 2 lines Fix wrapper/wrapped typo (with Raymond?s blessing) ........ Modified: python/branches/release31-maint/ (props changed) From python-checkins at python.org Wed Feb 2 23:17:35 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 2 Feb 2011 23:17:35 +0100 (CET) Subject: [Python-checkins] r88326 - python/branches/release31-maint/Lib/configparser.py Message-ID: <20110202221735.08E63EEA4B@mail.python.org> Author: raymond.hettinger Date: Wed Feb 2 23:17:34 2011 New Revision: 88326 Log: collections.Mapping is not available for setup.py. Remove the dependency the old-fashioned way (copy and paste). Modified: python/branches/release31-maint/Lib/configparser.py Modified: python/branches/release31-maint/Lib/configparser.py ============================================================================== --- python/branches/release31-maint/Lib/configparser.py (original) +++ python/branches/release31-maint/Lib/configparser.py Wed Feb 2 23:17:34 2011 @@ -88,7 +88,7 @@ """ try: - from collections import Mapping, OrderedDict as _default_dict + from collections import OrderedDict as _default_dict except ImportError: # fallback for setup.py which hasn't yet built _collections _default_dict = dict @@ -515,7 +515,7 @@ if e: raise e -class _Chainmap(Mapping): +class _Chainmap: """Combine multiple mappings for successive lookups. 
For example, to emulate Python's normal lookup sequence: @@ -548,6 +548,36 @@ s.update(*self.maps) return len(s) + def get(self, key, default=None): + try: + return self[key] + except KeyError: + return default + + def __contains__(self, key): + try: + self[key] + except KeyError: + return False + else: + return True + + def keys(self): + return list(self) + + def items(self): + return [(k, self[k]) for k in self] + + def values(self): + return [self[k] for k in self] + + def __eq__(self, other): + return dict(self.items()) == dict(other.items()) + + def __ne__(self, other): + return not (self == other) + + class ConfigParser(RawConfigParser): def get(self, section, option, raw=False, vars=None): From ncoghlan at gmail.com Thu Feb 3 00:07:31 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 3 Feb 2011 09:07:31 +1000 Subject: [Python-checkins] r88324 - in python/branches/release31-maint: Doc/library/trace.rst Lib/distutils/tests/__init__.py Lib/distutils/tests/test_archive_util.py Lib/distutils/tests/test_bdist.py Lib/distutils/tests/test_bdist_dumb.py Lib/distutils/t Message-ID: On Thu, Feb 3, 2011 at 7:38 AM, eric.araujo wrote: > Author: eric.araujo > Date: Wed Feb ?2 22:38:37 2011 > New Revision: 88324 > > Log: > Merged revisions 86236,86240,86332,86340,87271,87273,87447 via svnmerge from > svn+ssh://pythondev at svn.python.org/python/branches/py3k > > The missing NEWS entries correspond to changes that were made before 3.1.3, but > I think it?s not usual to edit entries of released versions, so I put them at > the top. The only reason it isn't usual is because the change normally goes in at roughly the same time as the NEWS update, so it is very rare to have a change in a release without the corresponding NEWS entry. If NEWS entries get missed for the release, better to add them in the right place afterwards (it's easy enough to tell which entries were originally missing by comparing the NEWS file from source control with the one from the release). Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? Brisbane, Australia From python-checkins at python.org Thu Feb 3 00:30:04 2011 From: python-checkins at python.org (eric.araujo) Date: Thu, 3 Feb 2011 00:30:04 +0100 (CET) Subject: [Python-checkins] r88327 - python/branches/release31-maint/Misc/NEWS Message-ID: <20110202233004.D0FECEE9A7@mail.python.org> Author: eric.araujo Date: Thu Feb 3 00:30:04 2011 New Revision: 88327 Log: Move entries to the right place Modified: python/branches/release31-maint/Misc/NEWS Modified: python/branches/release31-maint/Misc/NEWS ============================================================================== --- python/branches/release31-maint/Misc/NEWS (original) +++ python/branches/release31-maint/Misc/NEWS Thu Feb 3 00:30:04 2011 @@ -37,10 +37,6 @@ Library ------- -- Issue #2236: distutils' mkpath ignored the mode parameter. - -- Fix typo in one sdist option (medata-check). - - Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. @@ -212,6 +208,10 @@ Library ------- +- Issue #2236: distutils' mkpath ignored the mode parameter. + +- Fix typo in one sdist option (medata-check). + - Issue #10323: itertools.islice() now consumes the minimum number of inputs before stopping. Formerly, the final state of the underlying iterator was undefined. 
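A side note on r88326 above: dropping the collections.Mapping base class means configparser._Chainmap has to spell out the usual mapping conveniences by hand, which is what the added methods do. The sketch below is a simplified, hypothetical stand-in (not the actual stdlib class) that illustrates why the copy-and-paste approach still gives the expected lookup behaviour::

    class SimpleChainmap:
        """Simplified, hypothetical stand-in for configparser._Chainmap."""

        def __init__(self, *maps):
            self.maps = maps

        def __getitem__(self, key):
            # The first mapping that knows the key wins.
            for mapping in self.maps:
                try:
                    return mapping[key]
                except KeyError:
                    pass
            raise KeyError(key)

        # Conveniences that collections.Mapping would otherwise provide,
        # written out by hand so the module stays importable for setup.py.
        def get(self, key, default=None):
            try:
                return self[key]
            except KeyError:
                return default

        def __contains__(self, key):
            try:
                self[key]
            except KeyError:
                return False
            else:
                return True

    overrides = {'verbose': 'yes'}
    defaults = {'verbose': 'no', 'delay': '1'}
    pylookup = SimpleChainmap(overrides, defaults)
    assert pylookup['verbose'] == 'yes'         # earlier mappings take precedence
    assert pylookup.get('missing', '-') == '-'
    assert 'delay' in pylookup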
From python-checkins at python.org Thu Feb 3 01:12:19 2011 From: python-checkins at python.org (eric.araujo) Date: Thu, 3 Feb 2011 01:12:19 +0100 (CET) Subject: [Python-checkins] r88328 - in python/branches/release27-maint: Doc/library/trace.rst Lib/distutils/tests/__init__.py Lib/distutils/tests/test_archive_util.py Lib/distutils/tests/test_bdist_msi.py Lib/distutils/tests/test_build.py Lib/distutils/tests/test_build_clib.py Lib/distutils/tests/test_build_py.py Lib/distutils/tests/test_build_scripts.py Lib/distutils/tests/test_check.py Lib/distutils/tests/test_clean.py Lib/distutils/tests/test_cmd.py Lib/distutils/tests/test_config.py Lib/distutils/tests/test_config_cmd.py Lib/distutils/tests/test_core.py Lib/distutils/tests/test_dep_util.py Lib/distutils/tests/test_dir_util.py Lib/distutils/tests/test_dist.py Lib/distutils/tests/test_file_util.py Lib/distutils/tests/test_filelist.py Lib/distutils/tests/test_install.py Lib/distutils/tests/test_install_data.py Lib/distutils/tests/test_install_headers.py Lib/distutils/tests/test_install_lib.py Lib/distutils/tests/test_install_scripts.py Lib/distutils/tests/test_msvc9compiler.py Lib/distutils/tests/test_register.py Lib/distutils/tests/test_sdist.py Lib/distutils/tests/test_spawn.py Lib/distutils/tests/test_text_file.py Lib/distutils/tests/test_unixccompiler.py Lib/distutils/tests/test_upload.py Lib/distutils/tests/test_util.py Lib/distutils/tests/test_version.py Lib/distutils/tests/test_versionpredicate.py Lib/test/test_gettext.py Lib/test/test_tuple.py Misc/NEWS Message-ID: <20110203001219.4EC93EE998@mail.python.org> Author: eric.araujo Date: Thu Feb 3 01:12:18 2011 New Revision: 88328 Log: Merged revisions 86236,86240,86332,86340,87271,87273,87447 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k To comply with the 2.x doc style, the methods in trace.rst use brackets around optional arguments. The rest is a mostly straight merge, modulo support changed to test_support and use of the old super call style in test_tuple. ........ r86236 | eric.araujo | 2010-11-06 03:44:43 +0100 (sam., 06 nov. 2010) | 2 lines Make sure each test can be run standalone (./python Lib/distutils/tests/x.py) ........ r86240 | eric.araujo | 2010-11-06 05:11:59 +0100 (sam., 06 nov. 2010) | 2 lines Prevent ResourceWarnings in test_gettext ........ r86332 | eric.araujo | 2010-11-08 19:15:17 +0100 (lun., 08 nov. 2010) | 4 lines Add missing NEWS entry for a fix committed by Senthil. All recent modifications to distutils should now be covered in NEWS. ........ r86340 | eric.araujo | 2010-11-08 22:48:23 +0100 (lun., 08 nov. 2010) | 2 lines This was actually fixed for the previous alpha. ........ r87271 | eric.araujo | 2010-12-15 20:09:58 +0100 (mer., 15 d?c. 2010) | 2 lines Improve trace documentation (#9264). Patch by Eli Bendersky. ........ r87273 | eric.araujo | 2010-12-15 20:30:15 +0100 (mer., 15 d?c. 2010) | 2 lines Use nested method directives, rewrap long lines, fix whitespace. ........ r87447 | eric.araujo | 2010-12-23 20:13:05 +0100 (jeu., 23 d?c. 2010) | 2 lines Fix typo in superclass method name ........ 
Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Doc/library/trace.rst python/branches/release27-maint/Lib/distutils/tests/__init__.py python/branches/release27-maint/Lib/distutils/tests/test_archive_util.py python/branches/release27-maint/Lib/distutils/tests/test_bdist_msi.py python/branches/release27-maint/Lib/distutils/tests/test_build.py python/branches/release27-maint/Lib/distutils/tests/test_build_clib.py python/branches/release27-maint/Lib/distutils/tests/test_build_py.py python/branches/release27-maint/Lib/distutils/tests/test_build_scripts.py python/branches/release27-maint/Lib/distutils/tests/test_check.py python/branches/release27-maint/Lib/distutils/tests/test_clean.py python/branches/release27-maint/Lib/distutils/tests/test_cmd.py python/branches/release27-maint/Lib/distutils/tests/test_config.py python/branches/release27-maint/Lib/distutils/tests/test_config_cmd.py python/branches/release27-maint/Lib/distutils/tests/test_core.py python/branches/release27-maint/Lib/distutils/tests/test_dep_util.py python/branches/release27-maint/Lib/distutils/tests/test_dir_util.py python/branches/release27-maint/Lib/distutils/tests/test_dist.py python/branches/release27-maint/Lib/distutils/tests/test_file_util.py python/branches/release27-maint/Lib/distutils/tests/test_filelist.py python/branches/release27-maint/Lib/distutils/tests/test_install.py python/branches/release27-maint/Lib/distutils/tests/test_install_data.py python/branches/release27-maint/Lib/distutils/tests/test_install_headers.py python/branches/release27-maint/Lib/distutils/tests/test_install_lib.py python/branches/release27-maint/Lib/distutils/tests/test_install_scripts.py python/branches/release27-maint/Lib/distutils/tests/test_msvc9compiler.py python/branches/release27-maint/Lib/distutils/tests/test_register.py python/branches/release27-maint/Lib/distutils/tests/test_sdist.py python/branches/release27-maint/Lib/distutils/tests/test_spawn.py python/branches/release27-maint/Lib/distutils/tests/test_text_file.py python/branches/release27-maint/Lib/distutils/tests/test_unixccompiler.py python/branches/release27-maint/Lib/distutils/tests/test_upload.py python/branches/release27-maint/Lib/distutils/tests/test_util.py python/branches/release27-maint/Lib/distutils/tests/test_version.py python/branches/release27-maint/Lib/distutils/tests/test_versionpredicate.py python/branches/release27-maint/Lib/test/test_gettext.py python/branches/release27-maint/Lib/test/test_tuple.py python/branches/release27-maint/Misc/NEWS Modified: python/branches/release27-maint/Doc/library/trace.rst ============================================================================== --- python/branches/release27-maint/Doc/library/trace.rst (original) +++ python/branches/release27-maint/Doc/library/trace.rst Thu Feb 3 01:12:18 2011 @@ -18,103 +18,177 @@ .. _trace-cli: -Command Line Usage +Command-Line Usage ------------------ The :mod:`trace` module can be invoked from the command line. It can be as simple as :: - python -m trace --count somefile.py ... + python -m trace --count -C . somefile.py ... -The above will generate annotated listings of all Python modules imported during -the execution of :file:`somefile.py`. +The above will execute :file:`somefile.py` and generate annotated listings of +all Python modules imported during the execution into the current directory. -The following command-line arguments are supported: +.. program:: trace + +.. cmdoption:: --help + + Display usage and exit. + +.. 
cmdoption:: --version + + Display the version of the module and exit. + +Main options +^^^^^^^^^^^^ + +At least one of the following options must be specified when invoking +:mod:`trace`. The :option:`--listfuncs <-l>` option is mutually exclusive with +the :option:`--trace <-t>` and :option:`--counts <-c>` options . When +:option:`--listfuncs <-l>` is provided, neither :option:`--counts <-c>` nor +:option:`--trace <-t>` are accepted, and vice versa. + +.. program:: trace + +.. cmdoption:: -c, --count + + Produce a set of annotated listing files upon program completion that shows + how many times each statement was executed. See also + :option:`--coverdir <-C>`, :option:`--file <-f>` and + :option:`--no-report <-R>` below. + +.. cmdoption:: -t, --trace -:option:`--trace`, :option:`-t` Display lines as they are executed. -:option:`--count`, :option:`-c` - Produce a set of annotated listing files upon program completion that shows how - many times each statement was executed. +.. cmdoption:: -l, --listfuncs + + Display the functions executed by running the program. + +.. cmdoption:: -r, --report -:option:`--report`, :option:`-r` Produce an annotated list from an earlier program run that used the - :option:`--count` and :option:`--file` arguments. + :option:`--count <-c>` and :option:`--file <-f>` option. This does not + execute any code. -:option:`--no-report`, :option:`-R` - Do not generate annotated listings. This is useful if you intend to make - several runs with :option:`--count` then produce a single set of annotated - listings at the end. +.. cmdoption:: -T, --trackcalls + + Display the calling relationships exposed by running the program. -:option:`--listfuncs`, :option:`-l` - List the functions executed by running the program. +Modifiers +^^^^^^^^^ -:option:`--trackcalls`, :option:`-T` - Generate calling relationships exposed by running the program. +.. program:: trace -:option:`--file`, :option:`-f` - Name a file containing (or to contain) counts. +.. cmdoption:: -f, --file= -:option:`--coverdir`, :option:`-C` - Name a directory in which to save annotated listing files. + Name of a file to accumulate counts over several tracing runs. Should be + used with the :option:`--count <-c>` option. + +.. cmdoption:: -C, --coverdir= + + Directory where the report files go. The coverage report for + ``package.module`` is written to file :file:`{dir}/{package}/{module}.cover`. + +.. cmdoption:: -m, --missing -:option:`--missing`, :option:`-m` When generating annotated listings, mark lines which were not executed with - '``>>>>>>``'. + ``>>>>>>``. -:option:`--summary`, :option:`-s` - When using :option:`--count` or :option:`--report`, write a brief summary to - stdout for each file processed. - -:option:`--ignore-module` - Accepts comma separated list of module names. Ignore each of the named - module and its submodules (if it is a package). May be given - multiple times. - -:option:`--ignore-dir` - Ignore all modules and packages in the named directory and subdirectories - (multiple directories can be joined by os.pathsep). May be given multiple - times. +.. cmdoption:: -s, --summary + When using :option:`--count <-c>` or :option:`--report <-r>`, write a brief + summary to stdout for each file processed. -.. _trace-api: +.. cmdoption:: -R, --no-report -Programming Interface ---------------------- + Do not generate annotated listings. This is useful if you intend to make + several runs with :option:`--count <-c>`, and then produce a single set of + annotated listings at the end. + +.. 
cmdoption:: -g, --timing + + Prefix each line with the time since the program started. Only used while + tracing. + +Filters +^^^^^^^ + +These options may be repeated multiple times. + +.. program:: trace + +.. cmdoption:: --ignore-module= + + Ignore each of the given module names and its submodules (if it is a + package). The argument can be a list of names separated by a comma. + +.. cmdoption:: --ignore-dir= + Ignore all modules and packages in the named directory and subdirectories. + The argument can be a list of directories separated by :data:`os.pathsep`. + +.. _trace-api: + +Programmatic Interface +---------------------- .. class:: Trace([count=1[, trace=1[, countfuncs=0[, countcallers=0[, ignoremods=()[, ignoredirs=()[, infile=None[, outfile=None[, timing=False]]]]]]]]]) - Create an object to trace execution of a single statement or expression. All - parameters are optional. *count* enables counting of line numbers. *trace* - enables line execution tracing. *countfuncs* enables listing of the functions - called during the run. *countcallers* enables call relationship tracking. - *ignoremods* is a list of modules or packages to ignore. *ignoredirs* is a list - of directories whose modules or packages should be ignored. *infile* is the - file from which to read stored count information. *outfile* is a file in which - to write updated count information. *timing* enables a timestamp relative - to when tracing was started to be displayed. + Create an object to trace execution of a single statement or expression. All + parameters are optional. *count* enables counting of line numbers. *trace* + enables line execution tracing. *countfuncs* enables listing of the + functions called during the run. *countcallers* enables call relationship + tracking. *ignoremods* is a list of modules or packages to ignore. + *ignoredirs* is a list of directories whose modules or packages should be + ignored. *infile* is the name of the file from which to read stored count + information. *outfile* is the name of the file in which to write updated + count information. *timing* enables a timestamp relative to when tracing was + started to be displayed. + + .. method:: run(cmd) + + Execute the command and gather statistics from the execution with + the current tracing parameters. *cmd* must be a string or code object, + suitable for passing into :func:`exec`. + + .. method:: runctx(cmd[, globals=None[, locals=None]]) + + Execute the command and gather statistics from the execution with the + current tracing parameters, in the defined global and local + environments. If not defined, *globals* and *locals* default to empty + dictionaries. + + .. method:: runfunc(func, *args, **kwds) + Call *func* with the given arguments under control of the :class:`Trace` + object with the current tracing parameters. -.. method:: Trace.run(cmd) + .. method:: results() - Run *cmd* under control of the Trace object with the current tracing parameters. + Return a :class:`CoverageResults` object that contains the cumulative + results of all previous calls to ``run``, ``runctx`` and ``runfunc`` + for the given :class:`Trace` instance. Does not reset the accumulated + trace results. +.. class:: CoverageResults -.. method:: Trace.runctx(cmd[, globals=None[, locals=None]]) + A container for coverage results, created by :meth:`Trace.results`. Should + not be created directly by the user. - Run *cmd* under control of the Trace object with the current tracing parameters - in the defined global and local environments. 
If not defined, *globals* and - *locals* default to empty dictionaries. + .. method:: update(other) + Merge in data from another :class:`CoverageResults` object. -.. method:: Trace.runfunc(func, *args, **kwds) + .. method:: write_results([show_missing=True[, summary=False[, coverdir=None]]]) - Call *func* with the given arguments under control of the :class:`Trace` object - with the current tracing parameters. + Write coverage results. Set *show_missing* to show lines that had no + hits. Set *summary* to include in the output the coverage summary per + module. *coverdir* specifies the directory into which the coverage + result files will be output. If ``None``, the results for each source + file are placed in its directory. -This is a simple example showing the use of this module:: +A simple example demonstrating the use of the programmatic interface:: import sys import trace Modified: python/branches/release27-maint/Lib/distutils/tests/__init__.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/__init__.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/__init__.py Thu Feb 3 01:12:18 2011 @@ -15,9 +15,10 @@ import os import sys import unittest +from test.test_support import run_unittest -here = os.path.dirname(__file__) +here = os.path.dirname(__file__) or os.curdir def test_suite(): @@ -32,4 +33,4 @@ if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_archive_util.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_archive_util.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_archive_util.py Thu Feb 3 01:12:18 2011 @@ -12,7 +12,7 @@ ARCHIVE_FORMATS) from distutils.spawn import find_executable, spawn from distutils.tests import support -from test.test_support import check_warnings +from test.test_support import check_warnings, run_unittest try: import grp @@ -281,4 +281,4 @@ return unittest.makeSuite(ArchiveUtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_bdist_msi.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_bdist_msi.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_bdist_msi.py Thu Feb 3 01:12:18 2011 @@ -11,7 +11,7 @@ support.LoggingSilencer, unittest.TestCase): - def test_minial(self): + def test_minimal(self): # minimal test XXX need more tests from distutils.command.bdist_msi import bdist_msi pkg_pth, dist = self.create_dist() Modified: python/branches/release27-maint/Lib/distutils/tests/test_build.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_build.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_build.py Thu Feb 3 01:12:18 2011 @@ -2,6 +2,7 @@ import unittest import os import sys +from test.test_support import run_unittest from distutils.command.build import build from distutils.tests import support @@ -51,4 +52,4 @@ return unittest.makeSuite(BuildTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: 
python/branches/release27-maint/Lib/distutils/tests/test_build_clib.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_build_clib.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_build_clib.py Thu Feb 3 01:12:18 2011 @@ -3,6 +3,8 @@ import os import sys +from test.test_support import run_unittest + from distutils.command.build_clib import build_clib from distutils.errors import DistutilsSetupError from distutils.tests import support @@ -140,4 +142,4 @@ return unittest.makeSuite(BuildCLibTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_build_py.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_build_py.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_build_py.py Thu Feb 3 01:12:18 2011 @@ -10,6 +10,7 @@ from distutils.errors import DistutilsFileError from distutils.tests import support +from test.test_support import run_unittest class BuildPyTestCase(support.TempdirManager, @@ -123,4 +124,4 @@ return unittest.makeSuite(BuildPyTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_build_scripts.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_build_scripts.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_build_scripts.py Thu Feb 3 01:12:18 2011 @@ -8,6 +8,7 @@ import sysconfig from distutils.tests import support +from test.test_support import run_unittest class BuildScriptsTestCase(support.TempdirManager, @@ -108,4 +109,4 @@ return unittest.makeSuite(BuildScriptsTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_check.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_check.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_check.py Thu Feb 3 01:12:18 2011 @@ -1,5 +1,6 @@ """Tests for distutils.command.check.""" import unittest +from test.test_support import run_unittest from distutils.command.check import check, HAS_DOCUTILS from distutils.tests import support @@ -95,4 +96,4 @@ return unittest.makeSuite(CheckTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_clean.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_clean.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_clean.py Thu Feb 3 01:12:18 2011 @@ -6,6 +6,7 @@ from distutils.command.clean import clean from distutils.tests import support +from test.test_support import run_unittest class cleanTestCase(support.TempdirManager, support.LoggingSilencer, @@ -47,4 +48,4 @@ return unittest.makeSuite(cleanTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: 
python/branches/release27-maint/Lib/distutils/tests/test_cmd.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_cmd.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_cmd.py Thu Feb 3 01:12:18 2011 @@ -99,7 +99,7 @@ def test_ensure_dirname(self): cmd = self.cmd - cmd.option1 = os.path.dirname(__file__) + cmd.option1 = os.path.dirname(__file__) or os.curdir cmd.ensure_dirname('option1') cmd.option2 = 'xxx' self.assertRaises(DistutilsOptionError, cmd.ensure_dirname, 'option2') Modified: python/branches/release27-maint/Lib/distutils/tests/test_config.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_config.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_config.py Thu Feb 3 01:12:18 2011 @@ -11,6 +11,7 @@ from distutils.log import WARN from distutils.tests import support +from test.test_support import run_unittest PYPIRC = """\ [distutils] @@ -119,4 +120,4 @@ return unittest.makeSuite(PyPIRCCommandTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_config_cmd.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_config_cmd.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_config_cmd.py Thu Feb 3 01:12:18 2011 @@ -2,6 +2,7 @@ import unittest import os import sys +from test.test_support import run_unittest from distutils.command.config import dump_file, config from distutils.tests import support @@ -86,4 +87,4 @@ return unittest.makeSuite(ConfigTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_core.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_core.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_core.py Thu Feb 3 01:12:18 2011 @@ -6,7 +6,7 @@ import shutil import sys import test.test_support -from test.test_support import captured_stdout +from test.test_support import captured_stdout, run_unittest import unittest from distutils.tests import support @@ -105,4 +105,4 @@ return unittest.makeSuite(CoreTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_dep_util.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_dep_util.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_dep_util.py Thu Feb 3 01:12:18 2011 @@ -6,6 +6,7 @@ from distutils.dep_util import newer, newer_pairwise, newer_group from distutils.errors import DistutilsFileError from distutils.tests import support +from test.test_support import run_unittest class DepUtilTestCase(support.TempdirManager, unittest.TestCase): @@ -77,4 +78,4 @@ return unittest.makeSuite(DepUtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_dir_util.py 
============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_dir_util.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_dir_util.py Thu Feb 3 01:12:18 2011 @@ -10,6 +10,7 @@ from distutils import log from distutils.tests import support +from test.test_support import run_unittest class DirUtilTestCase(support.TempdirManager, unittest.TestCase): @@ -112,4 +113,4 @@ return unittest.makeSuite(DirUtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_dist.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_dist.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_dist.py Thu Feb 3 01:12:18 2011 @@ -11,7 +11,7 @@ from distutils.dist import Distribution, fix_help_options, DistributionMetadata from distutils.cmd import Command import distutils.dist -from test.test_support import TESTFN, captured_stdout +from test.test_support import TESTFN, captured_stdout, run_unittest from distutils.tests import support class test_dist(Command): @@ -433,4 +433,4 @@ return suite if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_file_util.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_file_util.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_file_util.py Thu Feb 3 01:12:18 2011 @@ -6,6 +6,7 @@ from distutils.file_util import move_file, write_file, copy_file from distutils import log from distutils.tests import support +from test.test_support import run_unittest class FileUtilTestCase(support.TempdirManager, unittest.TestCase): @@ -77,4 +78,4 @@ return unittest.makeSuite(FileUtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_filelist.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_filelist.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_filelist.py Thu Feb 3 01:12:18 2011 @@ -1,7 +1,7 @@ """Tests for distutils.filelist.""" from os.path import join import unittest -from test.test_support import captured_stdout +from test.test_support import captured_stdout, run_unittest from distutils.filelist import glob_to_re, FileList from distutils import debug @@ -82,4 +82,4 @@ return unittest.makeSuite(FileListTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_install.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_install.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_install.py Thu Feb 3 01:12:18 2011 @@ -3,6 +3,8 @@ import os import unittest +from test.test_support import run_unittest + from distutils.command.install import install from distutils.core import Distribution @@ -52,4 +54,4 @@ return unittest.makeSuite(InstallTestCase) if __name__ == "__main__": - 
unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_install_data.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_install_data.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_install_data.py Thu Feb 3 01:12:18 2011 @@ -6,6 +6,7 @@ from distutils.command.install_data import install_data from distutils.tests import support +from test.test_support import run_unittest class InstallDataTestCase(support.TempdirManager, support.LoggingSilencer, @@ -73,4 +74,4 @@ return unittest.makeSuite(InstallDataTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_install_headers.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_install_headers.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_install_headers.py Thu Feb 3 01:12:18 2011 @@ -6,6 +6,7 @@ from distutils.command.install_headers import install_headers from distutils.tests import support +from test.test_support import run_unittest class InstallHeadersTestCase(support.TempdirManager, support.LoggingSilencer, @@ -37,4 +38,4 @@ return unittest.makeSuite(InstallHeadersTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_install_lib.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_install_lib.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_install_lib.py Thu Feb 3 01:12:18 2011 @@ -7,6 +7,7 @@ from distutils.extension import Extension from distutils.tests import support from distutils.errors import DistutilsOptionError +from test.test_support import run_unittest class InstallLibTestCase(support.TempdirManager, support.LoggingSilencer, @@ -103,4 +104,4 @@ return unittest.makeSuite(InstallLibTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_install_scripts.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_install_scripts.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_install_scripts.py Thu Feb 3 01:12:18 2011 @@ -7,6 +7,7 @@ from distutils.core import Distribution from distutils.tests import support +from test.test_support import run_unittest class InstallScriptsTestCase(support.TempdirManager, @@ -78,4 +79,4 @@ return unittest.makeSuite(InstallScriptsTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_msvc9compiler.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_msvc9compiler.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_msvc9compiler.py Thu Feb 3 01:12:18 2011 @@ -5,6 +5,7 @@ from distutils.errors import DistutilsPlatformError from distutils.tests import support +from test.test_support import 
run_unittest _MANIFEST = """\ @@ -137,4 +138,4 @@ return unittest.makeSuite(msvc9compilerTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_register.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_register.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_register.py Thu Feb 3 01:12:18 2011 @@ -7,7 +7,7 @@ import urllib2 import warnings -from test.test_support import check_warnings +from test.test_support import check_warnings, run_unittest from distutils.command import register as register_module from distutils.command.register import register @@ -258,4 +258,4 @@ return unittest.makeSuite(RegisterTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_sdist.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_sdist.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_sdist.py Thu Feb 3 01:12:18 2011 @@ -24,11 +24,9 @@ import tempfile import warnings -from test.test_support import check_warnings -from test.test_support import captured_stdout +from test.test_support import captured_stdout, check_warnings, run_unittest -from distutils.command.sdist import sdist -from distutils.command.sdist import show_formats +from distutils.command.sdist import sdist, show_formats from distutils.core import Distribution from distutils.tests.test_config import PyPIRCCommandTestCase from distutils.errors import DistutilsExecError, DistutilsOptionError @@ -426,4 +424,4 @@ return unittest.makeSuite(SDistTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_spawn.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_spawn.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_spawn.py Thu Feb 3 01:12:18 2011 @@ -2,7 +2,7 @@ import unittest import os import time -from test.test_support import captured_stdout +from test.test_support import captured_stdout, run_unittest from distutils.spawn import _nt_quote_args from distutils.spawn import spawn, find_executable @@ -57,4 +57,4 @@ return unittest.makeSuite(SpawnTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_text_file.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_text_file.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_text_file.py Thu Feb 3 01:12:18 2011 @@ -3,6 +3,7 @@ import unittest from distutils.text_file import TextFile from distutils.tests import support +from test.test_support import run_unittest TEST_DATA = """# test file @@ -103,4 +104,4 @@ return unittest.makeSuite(TextFileTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_unixccompiler.py 
============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_unixccompiler.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_unixccompiler.py Thu Feb 3 01:12:18 2011 @@ -1,6 +1,7 @@ """Tests for distutils.unixccompiler.""" import sys import unittest +from test.test_support import run_unittest from distutils import sysconfig from distutils.unixccompiler import UnixCCompiler @@ -126,4 +127,4 @@ return unittest.makeSuite(UnixCCompilerTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_upload.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_upload.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_upload.py Thu Feb 3 01:12:18 2011 @@ -1,14 +1,13 @@ -"""Tests for distutils.command.upload.""" # -*- encoding: utf8 -*- -import sys +"""Tests for distutils.command.upload.""" import os import unittest +from test.test_support import run_unittest from distutils.command import upload as upload_mod from distutils.command.upload import upload from distutils.core import Distribution -from distutils.tests import support from distutils.tests.test_config import PYPIRC, PyPIRCCommandTestCase PYPIRC_LONG_PASSWORD = """\ @@ -129,4 +128,4 @@ return unittest.makeSuite(uploadTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_util.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_util.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_util.py Thu Feb 3 01:12:18 2011 @@ -1,6 +1,7 @@ """Tests for distutils.util.""" import sys import unittest +from test.test_support import run_unittest from distutils.errors import DistutilsPlatformError, DistutilsByteCompileError from distutils.util import byte_compile @@ -21,4 +22,4 @@ return unittest.makeSuite(UtilTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_version.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_version.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_version.py Thu Feb 3 01:12:18 2011 @@ -2,6 +2,7 @@ import unittest from distutils.version import LooseVersion from distutils.version import StrictVersion +from test.test_support import run_unittest class VersionTestCase(unittest.TestCase): @@ -67,4 +68,4 @@ return unittest.makeSuite(VersionTestCase) if __name__ == "__main__": - unittest.main(defaultTest="test_suite") + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/distutils/tests/test_versionpredicate.py ============================================================================== --- python/branches/release27-maint/Lib/distutils/tests/test_versionpredicate.py (original) +++ python/branches/release27-maint/Lib/distutils/tests/test_versionpredicate.py Thu Feb 3 01:12:18 2011 @@ -4,6 +4,10 @@ import distutils.versionpredicate import doctest +from test.test_support import run_unittest def test_suite(): return 
doctest.DocTestSuite(distutils.versionpredicate) + +if __name__ == '__main__': + run_unittest(test_suite()) Modified: python/branches/release27-maint/Lib/test/test_gettext.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_gettext.py (original) +++ python/branches/release27-maint/Lib/test/test_gettext.py Thu Feb 3 01:12:18 2011 @@ -64,15 +64,13 @@ def setUp(self): if not os.path.isdir(LOCALEDIR): os.makedirs(LOCALEDIR) - fp = open(MOFILE, 'wb') - fp.write(base64.decodestring(GNU_MO_DATA)) - fp.close() - fp = open(UMOFILE, 'wb') - fp.write(base64.decodestring(UMO_DATA)) - fp.close() - fp = open(MMOFILE, 'wb') - fp.write(base64.decodestring(MMO_DATA)) - fp.close() + with open(MOFILE, 'wb') as fp: + fp.write(base64.decodestring(GNU_MO_DATA)) + with open(UMOFILE, 'wb') as fp: + fp.write(base64.decodestring(UMO_DATA)) + with open(MMOFILE, 'wb') as fp: + fp.write(base64.decodestring(MMO_DATA)) + self.env = test_support.EnvironmentVarGuard() self.env['LANGUAGE'] = 'xx' gettext._translations.clear() @@ -135,9 +133,8 @@ def test_the_alternative_interface(self): eq = self.assertEqual # test the alternative interface - fp = open(self.mofile, 'rb') - t = gettext.GNUTranslations(fp) - fp.close() + with open(self.mofile, 'rb') as fp: + t = gettext.GNUTranslations(fp) # Install the translation object t.install() eq(_('nudge nudge'), 'wink wink') @@ -227,9 +224,8 @@ def test_plural_forms2(self): eq = self.assertEqual - fp = open(self.mofile, 'rb') - t = gettext.GNUTranslations(fp) - fp.close() + with open(self.mofile, 'rb') as fp: + t = gettext.GNUTranslations(fp) x = t.ngettext('There is %s file', 'There are %s files', 1) eq(x, 'Hay %s fichero') x = t.ngettext('There is %s file', 'There are %s files', 2) @@ -299,11 +295,8 @@ class UnicodeTranslationsTest(GettextBaseTest): def setUp(self): GettextBaseTest.setUp(self) - fp = open(UMOFILE, 'rb') - try: + with open(UMOFILE, 'rb') as fp: self.t = gettext.GNUTranslations(fp) - finally: - fp.close() self._ = self.t.ugettext def test_unicode_msgid(self): @@ -319,15 +312,12 @@ class WeirdMetadataTest(GettextBaseTest): def setUp(self): GettextBaseTest.setUp(self) - fp = open(MMOFILE, 'rb') - try: + with open(MMOFILE, 'rb') as fp: try: self.t = gettext.GNUTranslations(fp) except: self.tearDown() raise - finally: - fp.close() def test_weird_metadata(self): info = self.t.info() Modified: python/branches/release27-maint/Lib/test/test_tuple.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_tuple.py (original) +++ python/branches/release27-maint/Lib/test/test_tuple.py Thu Feb 3 01:12:18 2011 @@ -6,7 +6,7 @@ type2test = tuple def test_constructors(self): - super(TupleTest, self).test_len() + super(TupleTest, self).test_constructors() # calling built-in types without argument must return empty self.assertEqual(tuple(), ()) t0_3 = (0, 1, 2, 3) Modified: python/branches/release27-maint/Misc/NEWS ============================================================================== --- python/branches/release27-maint/Misc/NEWS (original) +++ python/branches/release27-maint/Misc/NEWS Thu Feb 3 01:12:18 2011 @@ -213,6 +213,10 @@ Library ------- +- Issue #2236: distutils' mkpath ignored the mode parameter. + +- Fix typo in one sdist option (medata-check). + - Issue #10323: itertools.islice() now consumes the minimum number of inputs before stopping. Formerly, the final state of the underlying iterator was undefined. 
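Most of r88324/r88328 above converts the distutils test modules to one shared boilerplate so that each file can be run standalone. Schematically, the pattern the diffs converge on looks like the following sketch; the module and class names are placeholders, and on the 3.x branches the import is from test.support rather than test.test_support::

    """Tests for distutils.command.example (hypothetical module name)."""
    import unittest

    from distutils.tests import support
    from test.test_support import run_unittest  # test.support on the 3.x branches

    class ExampleTestCase(support.TempdirManager,
                          support.LoggingSilencer,
                          unittest.TestCase):

        def test_something(self):
            self.assertTrue(True)

    def test_suite():
        return unittest.makeSuite(ExampleTestCase)

    if __name__ == "__main__":
        # Replaces the old unittest.main(defaultTest="test_suite") idiom.
        run_unittest(test_suite())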
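The rewritten trace.rst in the same commit also documents the programmatic interface (Trace, run/runctx/runfunc, results, CoverageResults.write_results). A short usage sketch of that documented API, assuming a callable main() as the code under trace::

    import sys
    import trace

    def main():
        print(sum(range(10)))

    # Count executed lines, but skip everything under the standard library.
    tracer = trace.Trace(count=1, trace=0,
                         ignoredirs=[sys.prefix, sys.exec_prefix])
    tracer.runfunc(main)

    # Write per-module .cover files into the current directory;
    # lines that never ran are marked with '>>>>>>'.
    results = tracer.results()
    results.write_results(show_missing=True, summary=True, coverdir='.')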
From python-checkins at python.org Thu Feb 3 01:33:30 2011 From: python-checkins at python.org (brett.cannon) Date: Thu, 03 Feb 2011 01:33:30 +0100 Subject: [Python-checkins] devguide: Fix a module name. Message-ID: brett.cannon pushed dfbf819afa55 to devguide: http://hg.python.org/devguide/rev/dfbf819afa55 changeset: 226:dfbf819afa55 tag: tip user: Brett Cannon date: Wed Feb 02 16:33:25 2011 -0800 summary: Fix a module name. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -58,7 +58,7 @@ It should be noted that a quirk of running coverage over Python's own stdlib is that certain modules are imported as part of interpreter startup. Those modules required by Python itself will not be viewed as executed by the coverage tools -and thus look like they have very poor coverage (e.g., the :py:mod:`stats` +and thus look like they have very poor coverage (e.g., the :py:mod:`stat` module). In these instances the module will appear to not have any coverage of global statements but will have proper coverage of local statements (e.g., function definitions will be not be traced, but the function bodies will). -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Thu Feb 3 01:37:36 2011 From: python-checkins at python.org (brett.cannon) Date: Thu, 03 Feb 2011 01:37:36 +0100 Subject: [Python-checkins] devguide: Minor tweaks Message-ID: brett.cannon pushed 2f8e1917796d to devguide: http://hg.python.org/devguide/rev/2f8e1917796d changeset: 227:2f8e1917796d tag: tip user: Brett Cannon date: Wed Feb 02 16:37:26 2011 -0800 summary: Minor tweaks files: devcycle.rst diff --git a/devcycle.rst b/devcycle.rst --- a/devcycle.rst +++ b/devcycle.rst @@ -38,7 +38,7 @@ The branch currently being maintained for bug fixes. The branch under maintenance is the last minor version of Python to be released -as Final_. This means that the latest release of Python was 3.1.2, then the +as Final_. This means if the latest release of Python was 3.1.2, then the branch representing Python 3.1 is in maintenance mode. The only changes allowed to occur in a maintenance branch without debate are bug @@ -51,7 +51,7 @@ made. For example, this means that Python 2.6 stayed in maintenance mode until Python 2.7.0 was released, at which point 2.7 went into maintenance mode and 2.6 went into Security_ mode. As new minor releases occur on a (roughly) 18 -month schedule, a branch stays in mainteance mode for the same amount of time. +month schedule, a branch stays in maintenance mode for the same amount of time. A micro release of a maintenance branch is made about every six months. Typically when a new minor release is made one more release of the new-old @@ -64,9 +64,10 @@ The only changes made to a branch that is being maintained for security purposes are somewhat obviously those related to security, e.g., privilege -escalation. Crashers and other behaviorial issues are **not** considered a +escalation, and issues that lead to crashes. Other behavioral issues are +**not** considered a security risk and thus not backported to a branch being maintained for -security. Any releases made for a branch under security maintenance is +security. Any release made from a branch under security maintenance is source-only and done only when actual security patches have been applied to the branch. 
-- Repository URL: http://hg.python.org/devguide From solipsis at pitrou.net Thu Feb 3 05:04:51 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 03 Feb 2011 05:04:51 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88322): sum=0 Message-ID: py3k results for svn r88322 (hg cset fc18a886ebae) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogmUQkp6', '-x'] From python-checkins at python.org Thu Feb 3 08:08:26 2011 From: python-checkins at python.org (georg.brandl) Date: Thu, 3 Feb 2011 08:08:26 +0100 (CET) Subject: [Python-checkins] r88329 - python/branches/py3k/Doc/library/collections.rst Message-ID: <20110203070826.2C70AEEA26@mail.python.org> Author: georg.brandl Date: Thu Feb 3 08:08:25 2011 New Revision: 88329 Log: Punctuation typos. Modified: python/branches/py3k/Doc/library/collections.rst Modified: python/branches/py3k/Doc/library/collections.rst ============================================================================== --- python/branches/py3k/Doc/library/collections.rst (original) +++ python/branches/py3k/Doc/library/collections.rst Thu Feb 3 08:08:25 2011 @@ -975,7 +975,7 @@ :class:`Sized` ``__len__`` :class:`Callable` ``__call__`` -:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``. ``__iter__``, ``__reversed__``, +:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``, ``__iter__``, ``__reversed__``, :class:`Iterable`, ``index``, and ``count`` :class:`Container` @@ -984,7 +984,7 @@ and ``insert`` ``remove``, and ``__iadd__`` :class:`Set` :class:`Sized`, ``__le__``, ``__lt__``, ``__eq__``, ``__ne__``, - :class:`Iterable`, ``__gt__``, ``__ge__``, ``__and__``, ``__or__`` + :class:`Iterable`, ``__gt__``, ``__ge__``, ``__and__``, ``__or__``, :class:`Container` ``__sub__``, ``__xor__``, and ``isdisjoint`` :class:`MutableSet` :class:`Set` ``add`` and Inherited Set methods and From python-checkins at python.org Thu Feb 3 08:46:41 2011 From: python-checkins at python.org (georg.brandl) Date: Thu, 3 Feb 2011 08:46:41 +0100 (CET) Subject: [Python-checkins] r88330 - python/branches/py3k/Lib/urllib/request.py Message-ID: <20110203074641.A32D7EE98D@mail.python.org> Author: georg.brandl Date: Thu Feb 3 08:46:41 2011 New Revision: 88330 Log: Remove lots of spaces within exception message. Modified: python/branches/py3k/Lib/urllib/request.py Modified: python/branches/py3k/Lib/urllib/request.py ============================================================================== --- python/branches/py3k/Lib/urllib/request.py (original) +++ python/branches/py3k/Lib/urllib/request.py Thu Feb 3 08:46:41 2011 @@ -1057,8 +1057,8 @@ mv = memoryview(data) except TypeError: if isinstance(data, collections.Iterable): - raise ValueError("Content-Length should be specified \ - for iterable data of type %r %r" % (type(data), + raise ValueError("Content-Length should be specified " + "for iterable data of type %r %r" % (type(data), data)) else: request.add_unredirected_header( From python-checkins at python.org Thu Feb 3 23:01:55 2011 From: python-checkins at python.org (brett.cannon) Date: Thu, 3 Feb 2011 23:01:55 +0100 (CET) Subject: [Python-checkins] r88331 - in python/branches/py3k/Doc/howto: index.rst pyporting.rst Message-ID: <20110203220155.419E3EEA35@mail.python.org> Author: brett.cannon Date: Thu Feb 3 23:01:54 2011 New Revision: 88331 Log: Add a HOWTO on how to port from Python 2 to Python 3. 
Added: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/index.rst Modified: python/branches/py3k/Doc/howto/index.rst ============================================================================== --- python/branches/py3k/Doc/howto/index.rst (original) +++ python/branches/py3k/Doc/howto/index.rst Thu Feb 3 23:01:54 2011 @@ -14,6 +14,7 @@ :maxdepth: 1 advocacy.rst + pyporting.rst cporting.rst curses.rst descriptor.rst Added: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- (empty file) +++ python/branches/py3k/Doc/howto/pyporting.rst Thu Feb 3 23:01:54 2011 @@ -0,0 +1,581 @@ +.. _pyporting-howto: + +********************************* +Porting Python 2 Code to Python 3 +********************************* + +:author: Brett Cannon + +.. topic:: Abstract + + With Python 3 being the future of Python while Python 2 is still in active + use, it is good to have your project available for both major releases of + Python. This guide is meant to help you choose which strategy works best + for your project to support both Python 2 & 3 along with how to execute + that strategy. + + If you are looking to port an extension module instead of pure Python code, + please see http://docs.python.org/py3k/howto/cporting.html . + + +Choosing a Strategy +=================== +When a project makes the decision that it's time to support both Python 2 & 3, +a decision needs to be made as to how to go about accomplishing that goal. +Which strategy goes with will depend on how large the project's existing +codebase is and how much divergence you want from your Python 2 codebase from +your Python 3 one (e.g., starting a new version with Python 3). + +If your project is brand-new or does not have a large codebase, then you may +want to consider writing/porting :ref:`all of your code for Python 3 +and use 3to2 ` to port your code for Python 2. + +If your project has a pre-existing Python 2 codebase and you would like Python +3 support to start off a new branch or version of your project, then you will +most likely want to :ref:`port using 2to3 `. This will allow you port +your Python 2 code to Python 3 in a semi-automated fashion and begin to +maintain it separately from your Python 2 code. This approach can also work if +your codebase is small and/or simple enough for the translation to occur +quickly. + +Finally, if you want to maintain Python 2 and Python 3 versions of your project +simultaneously and with no differences, then you can write :ref:`Python 2/3 +source-compatible code `. While the code is not quite as +idiomatic as it would be written just for Python 3 or automating the port from +Python 2, it does makes it easier to continue to do rapid development +regardless of what major version of Python you are developing against at the +time. + +Regardless of which approach you choose, porting is probably not as hard or +time-consuming as you might initially think. You can also tackle the problem +piece-meal as a good portion of porting is simply updating your code to follow +current best practices in a Python 2/3 compatible way. + + +Universal Bits of Advice +------------------------ +Regardless of what strategy you pick, there are a few things you should +consider. + +One is make sure you have a robust test suite. You need to make sure everything +continues to work, just like when you support a new minor version of Python. 
+This means making sure your test suite is thorough and is ported properly +between Python 2 & 3. You will also most likely want to use something like tox_ +to automate testing between both a Python 2 and Python 3 VM. + +Two, once your project has Python 3 support, make sure to add the proper +classifier on the Cheeseshop_ (PyPI_). To have your project listed as Python 3 +compatible it must have the +`Python 3 classifier `_ +(from +http://techspot.zzzeek.org/2011/01/24/zzzeek-s-guide-to-python-3-porting/):: + + setup( + name='Your Library', + version='1.0', + classifiers=[ + # make sure to use :: Python *and* :: Python :: 3 so + # that pypi can list the package on the python 3 page + 'Programming Language :: Python', + 'Programming Language :: Python :: 3' + ], + packages=['yourlibrary'], + # make sure to add custom_fixers to the MANIFEST.in + include_package_data=True, + # ... + ) + + +Doing so will cause your project to show up in the +`Python 3 packages list +`_. You will know +you set the classifier properly as visiting your project page on the Cheeseshop +will show a Python 3 logo in the upper-left corner of the page. + +Three, the six_ project provides a library which helps iron out differences +between Python 2 & 3. If you find there is a sticky point that is a continual +point of contention in your translation or maintenance of code, consider using +a source-compatible solution relying on six. If you have to create your own +Python 2/3 compatible solution, you can use ``sys.version_info[0] >= 3`` as a +guard. + +Four, read all the approaches. Just because some bit of advice applies to one +approach more than another doesn't mean that some advice doesn't apply to other +strategies. + +Five, drop support for older Python versions if possible. While not a +requirement, `Python 2.5`_) introduced a lot of useful syntax and libraries +which have become idiomatic in Python 3. `Python 2.6`_ introduced future +statements which makes compatibility much easier if you are going from Python 2 +to 3. +`Python 2.7`_ continues the trend in the stdlib. So choose the newest version +of Python for which you believe you believe can be your minimum support version +and work from there. + + +.. _tox: http://codespeak.net/tox/ +.. _Cheeseshop: +.. _PyPI: http://pypi.python.org/ +.. _six: http://packages.python.org/six +.. _Python 2.7: http://www.python.org/2.7.x +.. _Python 2.6: http://www.python.org/2.6.x +.. _Python 2.5: http://www.python.org/2.5.x +.. _Python 2.4: http://www.python.org/2.4.x + + +.. _use_3to2: + +Python 3 and 3to2 +================= +If you are starting a new project or your codebase is small enough, you may +want to consider writing your code for Python 3 and backporting to Python 2 +using 3to2_. Thanks to Python 3 being more strict about things than Python 2 +(e.g., bytes vs. strings), the source translation can be easier and more +straightforward than from Python 2 to 3. Plus it gives you more direct +experience developing in Python 3 which, since it is the future of Python, is a +good thing long-term. + +A drawback of this approach is that 3to2 is a third-party project. This means +that the Python core developers (and thus this guide) can make no promises +about how well 3to2 works at any time. There is nothing to suggest, though, +that 3to2 is not a high-quality project. + + +.. _3to2: https://bitbucket.org/amentajo/lib3to2/overview + + +.. 
_use_2to3: + +Python 2 and 2to3 +================= +Included with Python since 2.6, 2to3_ tool (and :mod:`lib2to3` module) helps +with porting Python 2 to Python 3 by performing various source translations. +This is a perfect solution for projects which wish to branch their Python 3 +code from their Python 2 codebase and maintain them as independent codebases. +You can even begin preparing to use this approach today by writing +future-compatible Python code which works cleanly in Python 2 in conjunction +with 2to3; all steps outlined below will work with Python 2 code up to the +point when the actual use of 2to3 occurs. + +Use of 2to3 as an on-demand translation step at install time is also possible, +preventing the need to maintain a separate Python 3 codebase, but this approach +does come with some drawbacks. While users will only have to pay the +translation cost once at installation, you as a developer will need to pay the +cost regularly during development. If your codebase is sufficiently large +enough then the translation step ends up acting like a compilation step, +robbing you of the rapid development process you are used to with Python. +Obviously the time required to translate a project will vary, so do an +experimental translation just to see how long it takes to evaluate whether you +prefer this approach compared to using :ref:`use_same_source` or simply keeping +a separate Python 3 codebase. + +Below are the typical steps taken by a project which uses a 2to3-based approach +to supporting Python 2 & 3. + + +Support Python 2.7 +------------------ +As a first step, make sure that your project is compatible with `Python 2.7`_. +This is just good to do as Python 2.7 is the last release of Python 2 and thus +will be used for a rather long time. It also allows for use of the ``-3`` flag +to Python to help discover places in your code which 2to3 cannot handle but are +known to cause issues. + +Try to Support Python 2.6 and Newer Only +---------------------------------------- +While not possible for all projects, if you can support `Python 2.6`_ and newer +**only**, your life will be much easier. Various future statements, stdlib +additions, etc. exist only in Python 2.6 and later which greatly assist in +porting to Python 3. But if you project must keep support for `Python 2.5`_ (or +even `Python 2.4`_) then it is still possible to port to Python 3. + +Below are the benefits you gain if you only have to support Python 2.6 and +newer. Some of these options are personal choice while others are +**strongly** recommended (the ones that are more for personal choice are +labeled as such). If you continue to support older versions of Python then you +at least need to watch out for situations that these solutions fix. + + +``from __future__ import division`` +''''''''''''''''''''''''''''''''''' +While the exact same outcome can be had by using the ``-Qnew`` argument to +Python, using this future statement lifts the requirement that your users use +the flag to get the expected behavior of division in Python 3 (e.g., ``1/2 == +0.5; 1//2 == 0``). + + +``from __future__ import absolute_imports`` +''''''''''''''''''''''''''''''''''''''''''' +Implicit relative imports (e.g., importing ``spam.bacon`` from within +``spam.eggs`` with the statement ``import bacon``) does not work in Python 3. +This future statement moves away from that and allows the use of explicit +relative imports (e.g., ``from . import bacon``). 
+ + +``from __future__ import print_function`` +''''''''''''''''''''''''''''''''''''''''' +This is a personal choice. 2to3 handles the translation from the print +statement to the print function rather well so this is an optional step. This +future statement does help, though, with getting used to typing +``print('Hello, World')`` instead of ``print 'Hello, World'``. + + +``from __future__ import unicode_literals`` +''''''''''''''''''''''''''''''''''''''''''' +Another personal choice. You can always mark what you want to be a (unicode) +string with a ``u`` prefix to get the same effect. But regardless of whether +you use this future statement or not, you **must** make sure you know exactly +which Python 2 strings you want to be bytes, and which are to be strings. This +means you should, **at minimum** mark all strings that are meant to be text +strings with a ``u`` prefix if you do not use this future statement. + + +Bytes literals +'''''''''''''' +This is a **very** important one. The ability to prefix Python 2 strings that +are meant to contain bytes with a ``b`` prefix help to very clearly delineate +what is and is not a Python 3 string. When you run 2to3 on code, all Python 2 +strings become Python 3 strings **unless** they are prefixed with ``b``. + +There are some differences between byte literals in Python 2 and those in +Python 3 thanks to the bytes type just being an alias to ``str`` in Python 2. +Probably the biggest "gotcha" is that indexing results in different values. In +Python 2, the value of ``b'py'[1]`` is ``'y'``, while in Python 3 it's ``121``. +You can avoid this disparity by always slicing at the size of a single element: +``b'py'[1:2]`` is ``'y'`` in Python 2 and ``b'y'`` in Python 3 (i.e., close +enough). + +You cannot concatenate bytes and strings in Python 3. But since in Python +2 has bytes aliased to ``str``, it will succeed: ``b'a' + u'b'`` works in +Python 2, but ``b'a' + 'b'`` in Python 3 is a :exc:`TypeError`. A similar issue +also comes about when doing comparisons between bytes and strings. + + +:mod:`io` Module +'''''''''''''''' +The built-in ``open()`` function in Python 2 always returns a Python 2 string, +not a unicode string. This is problematic as Python 3's :func:`open` returns a +string if a file is not opened as binary and bytes if it is. + +To help with compatibility, use :func:`io.open` instead of the built-in +``open()``. Since :func:`io.open` is essentially the same function in both +Python 2 and Python 3 it will help iron out any issues that might arise. + + +Handle Common "Gotchas" +----------------------- +There are a few things that just consistently come up as sticking points for +people which 2to3 cannot handle automatically or can easily be done in Python 2 +to help modernize your code. + + +Subclass ``object`` +''''''''''''''''''' +New-style classes have been around since Python 2.2. You need to make sure you +are subclassing from ``object`` to avoid odd edge cases involving method +resolution order, etc. This continues to be totally valid in Python 3 (although +unneeded as all classes implicitly inherit from ``object``). + + +Deal With the Bytes/String Dichotomy +'''''''''''''''''''''''''''''''''''' +One of the biggest issues people have when porting code to Python 3 is handling +the bytes/string dichotomy. Because Python 2 allowed the ``str`` type to hold +textual data, people have over the years been rather loose in their delineation +of what ``str`` instances held text compared to bytes. 
In Python 3 you cannot +be so care-free anymore and need to properly handle the difference. The key +handling this issue to to make sure that **every** string literal in your +Python 2 code is either syntactically of functionally marked as either bytes or +text data. After this is done you then need to make sure your APIs are designed +to either handle a specific type or made to be properly polymorphic. + + +Mark Up Python 2 String Literals +******************************** + +First thing you must do is designate every single string literal in Python 2 +as either textual or bytes data. If you are only supporting Python 2.6 or +newer, this can be accomplished by marking bytes literals with a ``b`` prefix +and then designating textual data with a ``u`` prefix or using the +``unicode_literals`` future statement. + +If your project supports versions of Python pre-dating 2.6, then you should use +the six_ project and its ``b()`` function to denote bytes literals. For text +literals you can either use six's ``u()`` function or use a ``u`` prefix. + + +Decide what APIs Will Accept +**************************** +In Python 2 it was very easy to accidentally create an API that accepted both +bytes and textual data. But in Python 3, thanks to the more strict handling of +disparate types, this loose usage of bytes and text together tends to fail. + +Take the dict ``{b'a': 'bytes', u'a': 'text'}`` in Python 2.6. It creates the +dict ``{u'a': 'text'}`` since ``b'a' == u'a'``. But in Python 3 the equivalent +dict creates ``{b'a': 'bytes', 'a': 'text'}``, i.e., no lost data. Similar +issues can crop up when transitioning Python 2 code to Python 3. + +This means you need to choose what an API is going to accept and create and +consistently stick to that API in both Python 2 and 3. + + +``__str__()``/``__unicode__()`` +''''''''''''''''''''''''''''''' +In Python 2, objects can specify both a string and unicode representation of +themselves. In Python 3, though, there is only a string representation. This +becomes an issue as people can inadvertantly do things in their ``__str__()`` +methods which have unpredictable results (e.g., infinite recursion if you +happen to use the ``unicode(self).encode('utf8')`` idiom as the body of your +``__str__()`` method). + +There are two ways to solve this issue. One is to use a custom 2to3 fixer. The +blog post at http://lucumr.pocoo.org/2011/1/22/forwards-compatible-python/ +specifies how to do this. That will allow 2to3 to change all instances of ``def +__unicode(self): ...`` to ``def __str__(self): ...``. This does require you +define your ``__str__()`` method in Python 2 before your ``__unicode__()`` +method. + +The other option is to use a mixin class. 
This allows you to only define a +``__unicode__()`` method for your class and let the mixin derive +``__str__()`` for you (code from +http://lucumr.pocoo.org/2011/1/22/forwards-compatible-python/):: + + import sys + + class UnicodeMixin(object): + + """Mixin class to handle defining the proper __str__/__unicode__ + methods in Python 2 or 3.""" + + if sys.version_info[0] >= 3: # Python 3 + def __str__(self): + return self.__unicode__() + else: # Python 2 + def __str__(self): + return self.__unicode__().encode('utf8') + + + class Spam(UnicodeMixin): + + def __unicode__(self): + return u'spam-spam-bacon-spam' # 2to3 will remove the 'u' prefix + + +Specify when opening a file as binary +''''''''''''''''''''''''''''''''''''' +Unless you have been working on Windows, there is a chance you have not always +bothered to add the ``b`` mode when opening a file (e.g., `` + + +Use :func:``codecs.open()`` +''''''''''''''''''''''''''' +If you are not able to limit your Python 2 compatibility to 2.6 or newer (and +thus get to use :func:`io.open`), then you should make sure you use +:func:`codecs.open` over the built-in ``open()`` function. This will make sure +that you get back unicode strings in Python 2 when reading in text and an +instance of ``str`` when dealing with bytes. + + +Don't Index on Exceptions +''''''''''''''''''''''''' +In Python 2, the following worked:: + + >>> exc = Exception(1, 2, 3) + >>> exc.args[1] + 2 + >>> exc[1] # Python 2 only! + 2 + +But in Python 3, indexing directly off of an exception is an error. You need to +make sure to only index on :attr:`BaseException.args` attribute which is a +sequence containing all arguments passed to the :meth:`__init__` method. + +Even better is to use documented attributes the exception provides. + + +Don't use ``__getslice__`` & Friends +'''''''''''''''''''''''''''''''''''' +Been deprecated for a while, but Python 3 finally drops support for +``__getslice__()``, etc. Move completely over to :meth:`__getitem__` and +friends. + + +Stop Using :mod:`doctest` +''''''''''''''''''''''''' +While 2to3 tries to port doctests properly, it's a rather tough thing to do. It +is probably best to simply convert your critical doctests to :mod:`unittest`. + + +Eliminate ``-3`` Warnings +------------------------- +When you run your application's test suite, run it using the ``-3`` flag passed +to Python. This will cause various warnings to be raised during execution about +things that 2to3 cannot handle automatically (e.g., modules that have been +removed). Try to eliminate those warnings to make your code even more portable +to Python 3. + + +Run 2to3 +-------- +Once you have made your Python 2 code future-compatible with Python 3, it's +time to use 2to3_ to actually port your code. + + +Manually +'''''''' +To manually convert source code using 2to3_, you use the ``2to3`` script that +is installed with Python 2.6 and later.:: + + 2to3 + +This will cause 2to3 to write out a diff with all of the fixers applied for the +converted source code. If you would like 2to3 to go ahead and apply the changes +you can pass it the ``-w`` flag:: + + 2to3 -w + +There are other flags available to control exactly which fixers are applied, +etc. + + +During Installation +''''''''''''''''''' +When a user installs your project for Python 3, you can have either +:mod:`distutils` or Distribute_ run 2to3_ on your behalf. 
+For distutils, use the following idiom:: + + try: # Python 3 + from distutils.command.build_py import build_py_2to3 as build_py + except ImportError: # Python 2 + from distutils.command.build_py import build_py + + setup(cmdclass = {'build_py':build_py}, + # ... + ) + +For Distribute:: + + setup(use_2to3=True, + # ... + ) + +This will allow you to not have to distribute a separate Python 3 version of +your project. It does require, though, that when you perform development that +you at least build your project and use the built Python 3 source for testing. + + +Verify & Test +------------- +At this point you should (hopefully) have your project converted in such a way +that it works in Python 3. Verify it by running your unit tests and making sure +nothing has gone awry. If you miss something then figure out how to fix it in +Python 3, backport to your Python 2 code, and run your code through 2to3 again +to verify the fix transforms properly. + + +.. _2to3: http://docs.python.org/py3k/library/2to3.html +.. _Distribute: http://packages.python.org/distribute/ + + +.. _use_same_source: + +Python 2/3 Compatible Source +============================ +While it may seem counter-intuitive, you can write Python code which is +source-compatible between Python 2 & 3. It does lead to code that is not +entirely idiomatic Python (e.g., having to extract the currently raised +exception from ``sys.exc_info()[1]``), but it can be run under Python 2 +**and** Python 3 without using 2to3_ as a translation step. This allows you to +continue to have a rapid development process regardless of whether you are +developing under Python 2 or Python 3. Whether this approach or using +:ref:`use_2to3` works best for you will be a per-project decision. + +To get a complete idea of what issues you will need to deal with, see the +`What's New in Python 3.0`_. Others have reorganized the data in other formats +such as http://docs.pythonsprints.com/python3_porting/py-porting.html . + +The following are some steps to take to try to support both Python 2 & 3 from +the same source code. + + +.. _What's New in Python 3.0: http://docs.python.org/release/3.0/whatsnew/3.0.html + + +Follow The Steps for Using 2to3_ (sans 2to3) +-------------------------------------------- +All of the steps outlined in how to +:ref:`port Python 2 code with 2to3 ` apply +to creating a Python 2/3 codebase. This includes trying only support Python 2.6 +or newer (the :mod:`__future__` statements work in Python 3 without issue), +eliminating warnings that are triggered by ``-3``, etc. + +Essentially you should cover all of the steps short of running 2to3 itself. + + +Use six_ +-------- +The six_ project contains many things to help you write portable Python code. +You should make sure to read its documentation from beginning to end and use +any and all features it provides. That way you will minimize any mistakes you +might make in writing cross-version code. + + +Capturing the Currently Raised Exception +---------------------------------------- +One change between Python 2 and 3 that will require changing how you code is +accessing the currently raised exception. 
In Python 2 the syntax to access the +current exception is:: + + try: + raise Exception() + except Exception, exc: + # Current exception is 'exc' + pass + +This syntax changed in Python 3 to:: + + try: + raise Exception() + except Exception as exc: + # Current exception is 'exc' + pass + +Because of this syntax change you must change to capturing the current +exception to:: + + try: + raise Exception() + except Exception: + import sys + exc = sys.exc_info()[1] + # Current exception is 'exc' + pass + +You can get more information about the raised exception from +:func:`sys.exc_info` than simply the current exception instance, but you most +likely don't need it. One very key point to understand, though, is **do not +save the traceback to a variable without deleting it**! Because tracebacks +contain references to the current executing frame you will inadvertently create +a circular reference, prevent everything in the frame from being garbage +collected. This can be a massive memory leak if you are not careful. Simply +index into the returned value from :func:`sys.version_info` instead of +assigning the tuple it returns to a variable. + + +Other Resources +=============== +The authors of the following blogs posts and wiki pages deserve special thanks +for making public their tips for porting Python 2 code to Python 3 (and thus +helping provide information for this document): + +* http://docs.pythonsprints.com/python3_porting/py-porting.html +* http://techspot.zzzeek.org/2011/01/24/zzzeek-s-guide-to-python-3-porting/ +* http://dabeaz.blogspot.com/2011/01/porting-py65-and-my-superboard-to.html +* http://lucumr.pocoo.org/2011/1/22/forwards-compatible-python/ +* http://lucumr.pocoo.org/2010/2/11/porting-to-python-3-a-guide/ +* http://wiki.python.org/moin/PortingPythonToPy3k + +If you feel there is something missing from this document that should be added, +please email the python-porting_ mailing list. + +.. _python-porting: http://mail.python.org/mailman/listinfo/python-porting From python-checkins at python.org Thu Feb 3 23:14:58 2011 From: python-checkins at python.org (brett.cannon) Date: Thu, 3 Feb 2011 23:14:58 +0100 (CET) Subject: [Python-checkins] r88332 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110203221458.BDDCAEE98B@mail.python.org> Author: brett.cannon Date: Thu Feb 3 23:14:58 2011 New Revision: 88332 Log: use 3-space indents. Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Thu Feb 3 23:14:58 2011 @@ -8,14 +8,14 @@ .. topic:: Abstract - With Python 3 being the future of Python while Python 2 is still in active - use, it is good to have your project available for both major releases of - Python. This guide is meant to help you choose which strategy works best - for your project to support both Python 2 & 3 along with how to execute - that strategy. + With Python 3 being the future of Python while Python 2 is still in active + use, it is good to have your project available for both major releases of + Python. This guide is meant to help you choose which strategy works best + for your project to support both Python 2 & 3 along with how to execute + that strategy. - If you are looking to port an extension module instead of pure Python code, - please see http://docs.python.org/py3k/howto/cporting.html . 
+ If you are looking to port an extension module instead of pure Python code, + please see http://docs.python.org/py3k/howto/cporting.html . Choosing a Strategy @@ -70,20 +70,20 @@ (from http://techspot.zzzeek.org/2011/01/24/zzzeek-s-guide-to-python-3-porting/):: - setup( - name='Your Library', - version='1.0', - classifiers=[ - # make sure to use :: Python *and* :: Python :: 3 so - # that pypi can list the package on the python 3 page - 'Programming Language :: Python', - 'Programming Language :: Python :: 3' - ], - packages=['yourlibrary'], - # make sure to add custom_fixers to the MANIFEST.in - include_package_data=True, - # ... - ) + setup( + name='Your Library', + version='1.0', + classifiers=[ + # make sure to use :: Python *and* :: Python :: 3 so + # that pypi can list the package on the python 3 page + 'Programming Language :: Python', + 'Programming Language :: Python :: 3' + ], + packages=['yourlibrary'], + # make sure to add custom_fixers to the MANIFEST.in + include_package_data=True, + # ... + ) Doing so will cause your project to show up in the @@ -340,25 +340,25 @@ ``__str__()`` for you (code from http://lucumr.pocoo.org/2011/1/22/forwards-compatible-python/):: - import sys + import sys - class UnicodeMixin(object): + class UnicodeMixin(object): - """Mixin class to handle defining the proper __str__/__unicode__ - methods in Python 2 or 3.""" + """Mixin class to handle defining the proper __str__/__unicode__ + methods in Python 2 or 3.""" - if sys.version_info[0] >= 3: # Python 3 - def __str__(self): - return self.__unicode__() - else: # Python 2 - def __str__(self): - return self.__unicode__().encode('utf8') + if sys.version_info[0] >= 3: # Python 3 + def __str__(self): + return self.__unicode__() + else: # Python 2 + def __str__(self): + return self.__unicode__().encode('utf8') - class Spam(UnicodeMixin): + class Spam(UnicodeMixin): - def __unicode__(self): - return u'spam-spam-bacon-spam' # 2to3 will remove the 'u' prefix + def __unicode__(self): + return u'spam-spam-bacon-spam' # 2to3 will remove the 'u' prefix Specify when opening a file as binary @@ -380,11 +380,11 @@ ''''''''''''''''''''''''' In Python 2, the following worked:: - >>> exc = Exception(1, 2, 3) - >>> exc.args[1] - 2 - >>> exc[1] # Python 2 only! - 2 + >>> exc = Exception(1, 2, 3) + >>> exc.args[1] + 2 + >>> exc[1] # Python 2 only! + 2 But in Python 3, indexing directly off of an exception is an error. You need to make sure to only index on :attr:`BaseException.args` attribute which is a @@ -426,13 +426,13 @@ To manually convert source code using 2to3_, you use the ``2to3`` script that is installed with Python 2.6 and later.:: - 2to3 + 2to3 This will cause 2to3 to write out a diff with all of the fixers applied for the converted source code. If you would like 2to3 to go ahead and apply the changes you can pass it the ``-w`` flag:: - 2to3 -w + 2to3 -w There are other flags available to control exactly which fixers are applied, etc. @@ -444,20 +444,20 @@ :mod:`distutils` or Distribute_ run 2to3_ on your behalf. For distutils, use the following idiom:: - try: # Python 3 - from distutils.command.build_py import build_py_2to3 as build_py - except ImportError: # Python 2 - from distutils.command.build_py import build_py - - setup(cmdclass = {'build_py':build_py}, - # ... - ) - -For Distribute:: - - setup(use_2to3=True, - # ... 
- ) + try: # Python 3 + from distutils.command.build_py import build_py_2to3 as build_py + except ImportError: # Python 2 + from distutils.command.build_py import build_py + + setup(cmdclass = {'build_py':build_py}, + # ... + ) + + For Distribute:: + + setup(use_2to3=True, + # ... + ) This will allow you to not have to distribute a separate Python 3 version of your project. It does require, though, that when you perform development that @@ -526,30 +526,30 @@ accessing the currently raised exception. In Python 2 the syntax to access the current exception is:: - try: - raise Exception() - except Exception, exc: - # Current exception is 'exc' - pass + try: + raise Exception() + except Exception, exc: + # Current exception is 'exc' + pass This syntax changed in Python 3 to:: - try: - raise Exception() - except Exception as exc: - # Current exception is 'exc' - pass + try: + raise Exception() + except Exception as exc: + # Current exception is 'exc' + pass Because of this syntax change you must change to capturing the current exception to:: - try: - raise Exception() - except Exception: - import sys - exc = sys.exc_info()[1] - # Current exception is 'exc' - pass + try: + raise Exception() + except Exception: + import sys + exc = sys.exc_info()[1] + # Current exception is 'exc' + pass You can get more information about the raised exception from :func:`sys.exc_info` than simply the current exception instance, but you most From ncoghlan at gmail.com Fri Feb 4 00:05:03 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 4 Feb 2011 09:05:03 +1000 Subject: [Python-checkins] r88331 - in python/branches/py3k/Doc/howto: index.rst pyporting.rst In-Reply-To: <20110203220155.419E3EEA35@mail.python.org> References: <20110203220155.419E3EEA35@mail.python.org> Message-ID: On Fri, Feb 4, 2011 at 8:01 AM, brett.cannon wrote: > +Capturing the Currently Raised Exception > +---------------------------------------- > +One change between Python 2 and 3 that will require changing how you code is > +accessing the currently raised exception. ?In Python 2 the syntax to access the > +current exception is:: > + > + ? ?try: > + ? ? ? ?raise Exception() > + ? ?except Exception, exc: > + ? ? ? ?# Current exception is 'exc' > + ? ? ? ?pass > + > +This syntax changed in Python 3 to:: > + > + ? ?try: > + ? ? ? ?raise Exception() > + ? ?except Exception as exc: > + ? ? ? ?# Current exception is 'exc' > + ? ? ? ?pass Note that you can write it the Python 3 way in 2.6+ as well (this was new syntax, so there weren't any backwards compatibility issues with adding it). You only need to do the sys.exc_info dance if you need to support 2.5 or earlier. Other notes: - explicit relative imports work in 2.6+ without needing a future import - absolute imports are the default in 2.7 Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? Brisbane, Australia From ncoghlan at gmail.com Fri Feb 4 00:10:07 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 4 Feb 2011 09:10:07 +1000 Subject: [Python-checkins] r88331 - in python/branches/py3k/Doc/howto: index.rst pyporting.rst In-Reply-To: <20110203220155.419E3EEA35@mail.python.org> References: <20110203220155.419E3EEA35@mail.python.org> Message-ID: On Fri, Feb 4, 2011 at 8:01 AM, brett.cannon wrote: > +Stop Using :mod:`doctest` > +''''''''''''''''''''''''' > +While 2to3 tries to port doctests properly, it's a rather tough thing to do. It > +is probably best to simply convert your critical doctests to :mod:`unittest`. This advice strikes me as being *way* too strong. 
Perhaps something like: Consider limiting use of :mod:`doctest` =============================== While 2to3 tries to port doctests properly, it's a rather tough thing to do. If your test suite is heavily doctest dependent, then you may end up spending a lot of time manually fixing doctests. The two major avenues for dealing with this are to either port doctest based tests over to the unittest module (making them significantly easier for 2to3 to handle) or else to follow the guidelines below for writing 2/3 compatible source code in all doctests (making it so they should run unmodified on both Python versions). Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? Brisbane, Australia From solipsis at pitrou.net Fri Feb 4 05:05:18 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 04 Feb 2011 05:05:18 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88332): sum=-55 Message-ID: py3k results for svn r88332 (hg cset 33774ca03c96) -------------------------------------------------- test_pyexpat leaked [0, 0, -56] references, sum=-56 test_timeout leaked [0, 0, 1] references, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogYZNdEl', '-x'] From python-checkins at python.org Fri Feb 4 12:36:11 2011 From: python-checkins at python.org (eli.bendersky) Date: Fri, 04 Feb 2011 12:36:11 +0100 Subject: [Python-checkins] devguide: typo fix Message-ID: eli.bendersky pushed 817fb3089b83 to devguide: http://hg.python.org/devguide/rev/817fb3089b83 changeset: 228:817fb3089b83 tag: tip user: eli.bendersky date: Fri Feb 04 13:35:58 2011 +0200 summary: typo fix files: devrole.rst diff --git a/devrole.rst b/devrole.rst --- a/devrole.rst +++ b/devrole.rst @@ -18,7 +18,7 @@ Gaining the Developer role will allow you to set any value on any issue in the tracker, releasing you from the burden of having to ask others to set values on an issue for you in order to properly triage something. This will not only help -speed up and simplify your work in helping out, but also help lesson the +speed up and simplify your work in helping out, but also help lessen the workload for everyone by gaining your help. .. _issue tracker: http://bugs.python.org -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Fri Feb 4 12:47:11 2011 From: python-checkins at python.org (eli.bendersky) Date: Fri, 04 Feb 2011 12:47:11 +0100 Subject: [Python-checkins] devguide: typo fix Message-ID: eli.bendersky pushed 67c2ba711a3f to devguide: http://hg.python.org/devguide/rev/67c2ba711a3f changeset: 229:67c2ba711a3f tag: tip user: eli.bendersky date: Fri Feb 04 13:47:06 2011 +0200 summary: typo fix files: triaging.rst diff --git a/triaging.rst b/triaging.rst --- a/triaging.rst +++ b/triaging.rst @@ -4,7 +4,7 @@ ================= When you have the Developer role on the `issue tracker`_ you are able to triage -issues directly without any assistence. +issues directly without any assistance. 
Fields ------ -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Fri Feb 4 13:10:38 2011 From: python-checkins at python.org (eli.bendersky) Date: Fri, 04 Feb 2011 13:10:38 +0100 Subject: [Python-checkins] devguide: fix a couple of typos Message-ID: eli.bendersky pushed 253dee176d55 to devguide: http://hg.python.org/devguide/rev/253dee176d55 changeset: 230:253dee176d55 tag: tip user: eli.bendersky date: Fri Feb 04 14:10:31 2011 +0200 summary: fix a couple of typos files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -34,7 +34,7 @@ As a core developer you will occasionally want to commit a patch created by someone else. When doing so you will want to make sure of some things. -First, make sure the patch in a good state. Both :ref:`patch` and +First, make sure the patch is in a good state. Both :ref:`patch` and :ref:`helptriage` explain what is to be expected of a patch. Typically patches that get cleared by triagers are good to go except maybe lacking ``Misc/ACKS`` and ``Misc/NEWS`` @@ -43,7 +43,7 @@ Second, make sure the patch does not break backwards-compatibility without a good reason. This means :ref:`running the test suite ` to make sure everything still passes. It also means that if semantics do change there must -be a good reason for the the breakage of code the change will cause (and it +be a good reason for the breakage of code the change will cause (and it **will** break someone's code). If you are unsure if the breakage is worth it, ask on python-dev. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Fri Feb 4 20:09:02 2011 From: python-checkins at python.org (martin.v.loewis) Date: Fri, 4 Feb 2011 20:09:02 +0100 (CET) Subject: [Python-checkins] r88333 - in python/branches/py3k: Misc/NEWS PC/python3.def PC/python32stub.def Message-ID: <20110204190902.D6F2FEEA85@mail.python.org> Author: martin.v.loewis Date: Fri Feb 4 20:09:02 2011 New Revision: 88333 Log: Issue #11118: Fix bogus export of None in python3.dll. Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/PC/python3.def python/branches/py3k/PC/python32stub.def Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Fri Feb 4 20:09:02 2011 @@ -10,6 +10,8 @@ Core and Builtins ----------------- +- Issue #11118: Fix bogus export of None in python3.dll. 
+ Library ------- Modified: python/branches/py3k/PC/python3.def ============================================================================== --- python/branches/py3k/PC/python3.def (original) +++ python/branches/py3k/PC/python3.def Fri Feb 4 20:09:02 2011 @@ -681,7 +681,7 @@ _Py_Dealloc=python32._Py_Dealloc _Py_EllipsisObject=python32._Py_EllipsisObject DATA _Py_FalseStruct=python32._Py_FalseStruct DATA - _Py_NoneStruct=python32.Py_GetCopyright + _Py_NoneStruct=python32._Py_NoneStruct DATA _Py_NotImplementedStruct=python32._Py_NotImplementedStruct DATA _Py_SwappedOp=python32._Py_SwappedOp DATA _Py_TrueStruct=python32._Py_TrueStruct DATA Modified: python/branches/py3k/PC/python32stub.def ============================================================================== --- python/branches/py3k/PC/python32stub.def (original) +++ python/branches/py3k/PC/python32stub.def Fri Feb 4 20:09:02 2011 @@ -10,13 +10,6 @@ PyBaseObject_Type PyBool_FromLong PyBool_Type -PyBuffer_FillContiguousStrides -PyBuffer_FillInfo -PyBuffer_FromContiguous -PyBuffer_GetPointer -PyBuffer_IsContiguous -PyBuffer_Release -PyBuffer_ToContiguous PyByteArrayIter_Type PyByteArray_AsString PyByteArray_Concat @@ -317,7 +310,6 @@ PyMem_Malloc PyMem_Realloc PyMemberDescr_Type -PyMemoryView_FromBuffer PyMemoryView_FromObject PyMemoryView_GetContiguous PyMemoryView_Type @@ -399,7 +391,6 @@ PyObject_CallObject PyObject_CheckReadBuffer PyObject_ClearWeakRefs -PyObject_CopyData PyObject_DelItem PyObject_DelItemString PyObject_Dir @@ -412,7 +403,6 @@ PyObject_GenericSetAttr PyObject_GetAttr PyObject_GetAttrString -PyObject_GetBuffer PyObject_GetItem PyObject_GetIter PyObject_HasAttr @@ -691,7 +681,7 @@ _Py_Dealloc _Py_EllipsisObject _Py_FalseStruct -Py_GetCopyright +_Py_NoneStruct _Py_NotImplementedStruct _Py_SwappedOp _Py_TrueStruct From python-checkins at python.org Fri Feb 4 21:11:11 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 4 Feb 2011 21:11:11 +0100 (CET) Subject: [Python-checkins] r88334 - python/branches/py3k/Doc/library/io.rst Message-ID: <20110204201111.AB111EEAA8@mail.python.org> Author: antoine.pitrou Date: Fri Feb 4 21:11:11 2011 New Revision: 88334 Log: Mention that seek and tell over a TextIOWrapper can be very slow. Modified: python/branches/py3k/Doc/library/io.rst Modified: python/branches/py3k/Doc/library/io.rst ============================================================================== --- python/branches/py3k/Doc/library/io.rst (original) +++ python/branches/py3k/Doc/library/io.rst Fri Feb 4 21:11:11 2011 @@ -814,6 +814,8 @@ binary I/O over the same storage, because it implies conversions from unicode to binary data using a character codec. This can become noticeable if you handle huge amounts of text data (for example very large log files). +Also, :meth:`TextIOWrapper.tell` and :meth:`TextIOWrapper.seek` are both +quite slow due to the reconstruction algorithm used. :class:`StringIO`, however, is a native in-memory unicode container and will exhibit similar speed to :class:`BytesIO`. From python-checkins at python.org Fri Feb 4 21:17:40 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 4 Feb 2011 21:17:40 +0100 (CET) Subject: [Python-checkins] r88335 - in python/branches/release31-maint: Doc/library/io.rst Message-ID: <20110204201740.E400CEE9D9@mail.python.org> Author: antoine.pitrou Date: Fri Feb 4 21:17:40 2011 New Revision: 88335 Log: Merged revisions 88334 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ 
r88334 | antoine.pitrou | 2011-02-04 21:11:11 +0100 (ven., 04 févr. 2011) | 3 lines Mention that seek and tell over a TextIOWrapper can be very slow. ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/library/io.rst Modified: python/branches/release31-maint/Doc/library/io.rst ============================================================================== --- python/branches/release31-maint/Doc/library/io.rst (original) +++ python/branches/release31-maint/Doc/library/io.rst Fri Feb 4 21:17:40 2011 @@ -796,6 +796,8 @@ binary I/O over the same storage, because it implies conversions from unicode to binary data using a character codec. This can become noticeable if you handle huge amounts of text data (for example very large log files). +Also, :meth:`TextIOWrapper.tell` and :meth:`TextIOWrapper.seek` are both +quite slow due to the reconstruction algorithm used. :class:`StringIO`, however, is a native in-memory unicode container and will exhibit similar speed to :class:`BytesIO`. From python-checkins at python.org Fri Feb 4 21:17:54 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 4 Feb 2011 21:17:54 +0100 (CET) Subject: [Python-checkins] r88336 - in python/branches/release27-maint: Doc/library/io.rst Message-ID: <20110204201754.1F626EEA7F@mail.python.org> Author: antoine.pitrou Date: Fri Feb 4 21:17:53 2011 New Revision: 88336 Log: Merged revisions 88334 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88334 | antoine.pitrou | 2011-02-04 21:11:11 +0100 (ven., 04 févr. 2011) | 3 lines Mention that seek and tell over a TextIOWrapper can be very slow. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Doc/library/io.rst Modified: python/branches/release27-maint/Doc/library/io.rst ============================================================================== --- python/branches/release27-maint/Doc/library/io.rst (original) +++ python/branches/release27-maint/Doc/library/io.rst Fri Feb 4 21:17:53 2011 @@ -812,6 +812,8 @@ binary I/O over the same storage, because it implies conversions from unicode to binary data using a character codec. This can become noticeable if you handle huge amounts of text data (for example very large log files). +Also, :meth:`TextIOWrapper.tell` and :meth:`TextIOWrapper.seek` are both +quite slow due to the reconstruction algorithm used. :class:`StringIO`, however, is a native in-memory unicode container and will exhibit similar speed to :class:`BytesIO`. From python-checkins at python.org Fri Feb 4 21:24:03 2011 From: python-checkins at python.org (brett.cannon) Date: Fri, 4 Feb 2011 21:24:03 +0100 (CET) Subject: [Python-checkins] r88337 - in python/branches/py3k: Misc/NEWS Modules/_sqlite/module.c Message-ID: <20110204202403.1CC69EE9D9@mail.python.org> Author: brett.cannon Date: Fri Feb 4 21:24:02 2011 New Revision: 88337 Log: There was a possibility that the initialization of _sqlite, when it failed, would lead to a decref of a NULL. Fixes issue #11110. Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/_sqlite/module.c Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Fri Feb 4 21:24:02 2011 @@ -15,6 +15,8 @@ Library ------- +- Issue #11110: Fix a potential decref of a NULL in sqlite3.
+ - Issue #8275: Fix passing of callback arguments with ctypes under Win64. Patch by Stan Mihai. Modified: python/branches/py3k/Modules/_sqlite/module.c ============================================================================== --- python/branches/py3k/Modules/_sqlite/module.c (original) +++ python/branches/py3k/Modules/_sqlite/module.c Fri Feb 4 21:24:02 2011 @@ -329,7 +329,7 @@ (pysqlite_statement_setup_types() < 0) || (pysqlite_prepare_protocol_setup_types() < 0) ) { - Py_DECREF(module); + Py_XDECREF(module); return NULL; } From python-checkins at python.org Fri Feb 4 21:29:59 2011 From: python-checkins at python.org (brett.cannon) Date: Fri, 4 Feb 2011 21:29:59 +0100 (CET) Subject: [Python-checkins] r88338 - python/branches/release27-maint Message-ID: <20110204202959.9D742EEA9F@mail.python.org> Author: brett.cannon Date: Fri Feb 4 21:29:59 2011 New Revision: 88338 Log: Blocked revisions 88337 via svnmerge ........ r88337 | brett.cannon | 2011-02-04 12:24:02 -0800 (Fri, 04 Feb 2011) | 5 lines There was a possibility that the initialization of _sqlite, when it failed, would lead to a decref of a NULL. Fixes issue #11110. ........ Modified: python/branches/release27-maint/ (props changed) From python-checkins at python.org Fri Feb 4 21:30:30 2011 From: python-checkins at python.org (brett.cannon) Date: Fri, 4 Feb 2011 21:30:30 +0100 (CET) Subject: [Python-checkins] r88339 - in python/branches/release31-maint: Misc/NEWS Modules/_sqlite/module.c Message-ID: <20110204203030.82B85EEA9F@mail.python.org> Author: brett.cannon Date: Fri Feb 4 21:30:30 2011 New Revision: 88339 Log: Merged revisions 88337 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88337 | brett.cannon | 2011-02-04 12:24:02 -0800 (Fri, 04 Feb 2011) | 5 lines There was a possibility that the initialization of _sqlite, when it failed, would lead to a decref of a NULL. Fixes issue #11110. ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Misc/NEWS python/branches/release31-maint/Modules/_sqlite/module.c Modified: python/branches/release31-maint/Misc/NEWS ============================================================================== --- python/branches/release31-maint/Misc/NEWS (original) +++ python/branches/release31-maint/Misc/NEWS Fri Feb 4 21:30:30 2011 @@ -37,6 +37,8 @@ Library ------- +- Issue #11110: Fix _sqlite to not deref a NULL when module creation fails. + - Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. Modified: python/branches/release31-maint/Modules/_sqlite/module.c ============================================================================== --- python/branches/release31-maint/Modules/_sqlite/module.c (original) +++ python/branches/release31-maint/Modules/_sqlite/module.c Fri Feb 4 21:30:30 2011 @@ -329,7 +329,7 @@ (pysqlite_statement_setup_types() < 0) || (pysqlite_prepare_protocol_setup_types() < 0) ) { - Py_DECREF(module); + Py_XDECREF(module); return NULL; } From python-checkins at python.org Fri Feb 4 21:39:11 2011 From: python-checkins at python.org (brett.cannon) Date: Fri, 04 Feb 2011 21:39:11 +0100 Subject: [Python-checkins] devguide: Clarifications for issues found by Eli Bendersky. Message-ID: brett.cannon pushed fb67c5864898 to devguide: http://hg.python.org/devguide/rev/fb67c5864898 changeset: 231:fb67c5864898 tag: tip user: Brett Cannon date: Fri Feb 04 12:39:06 2011 -0800 summary: Clarifications for issues found by Eli Bendersky. 
files: triaging.rst diff --git a/triaging.rst b/triaging.rst --- a/triaging.rst +++ b/triaging.rst @@ -46,7 +46,7 @@ Components '''''''''' What part of Python is affected by the issue. This is a multi-select field. -Be aware what component is chosen may cause the issue to be auto-assigned. +Be aware that what component is chosen may cause the issue to be auto-assigned. Versions '''''''' @@ -135,7 +135,8 @@ The issue is blocked until someone (often times the :abbr:`OP (original poster)`) provides some critical info; the issue is automatically closed after a set amount of time if no reply comes in. - Useful for when someone reports a bug that lacks enough issue to reproduce + Useful for when someone reports a bug that lacks enough information to be + reproduced and thus should be closed if the lacking info is never provided. and thus the issue is worthless without the needed info being provided. * closed The issue has been resolved (somehow). -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Fri Feb 4 22:50:51 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 04 Feb 2011 22:50:51 +0100 Subject: [Python-checkins] devguide: Add hyperlinks Message-ID: antoine.pitrou pushed 25021ea224ce to devguide: http://hg.python.org/devguide/rev/25021ea224ce changeset: 232:25021ea224ce user: Antoine Pitrou date: Fri Feb 04 22:01:03 2011 +0100 summary: Add hyperlinks files: devcycle.rst setup.rst diff --git a/devcycle.rst b/devcycle.rst --- a/devcycle.rst +++ b/devcycle.rst @@ -18,9 +18,14 @@ Python has branches at the granularity of minor versions. Micro and release-level versions are represented using tags in the VCS. + +.. _indevbranch: + In-Development -------------- -The current branch under active development. + +The current branch under active development. It can be :ref:`checked out +` from http://svn.python.org/projects/python/branches/py3k. The in-development branch is where new functionality and semantic changes occur. Currently this branch is known as the "py3k" branch. The next minor @@ -35,6 +40,7 @@ Maintenance ----------- + The branch currently being maintained for bug fixes. The branch under maintenance is the last minor version of Python to be released diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -21,6 +21,8 @@ `_ graphical client. +.. _checkout: + Checking out the code ---------------------- @@ -33,7 +35,7 @@ To get a read-only checkout of CPython's source, you need to checkout the source code. To get a read-only checkout of -the in-development branch of Python, run:: +the :ref:`in-development ` branch of Python, run:: svn co http://svn.python.org/projects/python/branches/py3k -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Fri Feb 4 22:50:53 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 04 Feb 2011 22:50:53 +0100 Subject: [Python-checkins] devguide: Try to improve wording and explanations in the dev cycle Message-ID: antoine.pitrou pushed c93b73417764 to devguide: http://hg.python.org/devguide/rev/c93b73417764 changeset: 233:c93b73417764 tag: tip user: Antoine Pitrou date: Fri Feb 04 22:50:39 2011 +0100 summary: Try to improve wording and explanations in the dev cycle files: communication.rst devcycle.rst setup.rst diff --git a/communication.rst b/communication.rst --- a/communication.rst +++ b/communication.rst @@ -6,6 +6,8 @@ Python's development is communicated through a myriad of ways, mostly through mailing lists, but also other forms. +.. 
_mailinglists: + Mailing Lists ------------- diff --git a/devcycle.rst b/devcycle.rst --- a/devcycle.rst +++ b/devcycle.rst @@ -6,122 +6,150 @@ The responsibilities of a core developer shift based on what kind of branch of Python a developer is working on and what stage the branch is in. -To clarify terminology, Python uses a ``major.minor.micro.releaselevel`` -nomenclature for versions. So for Python 3.1.2 final, that is a major version -of 3, a minor version of 1, a micro version of 2, and a release level (or -stage) of "final". +To clarify terminology, Python uses a ``major.minor.micro`` nomenclature +for production-ready releases. So for Python 3.1.2 final, that is a *major +version* of 3, a *minor version* of 1, and a *micro version* of 2. + +* new *major versions* are exceptional; they only come when strongly + incompatible changes are deemed necessary, and are planned very long + in advance; + +* new *minor versions* are feature releases; they get released roughly + every 18 months, from the current :ref:`in-development ` + branch; + +* new *micro versions* are bugfix releases; they get released roughly + every 6 months, although they can come more often if necessary; they are + prepared in :ref:`maintenance ` branches. + +We also publish non-final versions which get an additional qualifier: +:ref:`alpha`, :ref:`beta`, :ref:`release candidate `. These versions +are aimed at testing by advanced users, not production use. Branches '''''''' -Python has branches at the granularity of minor versions. Micro and -release-level versions are represented using tags in the VCS. +Different branches are used at a time to represent different *minor versions* +in which development is made. All development should be done **first** in the +:ref:`in-development ` branch, and selectively backported +to other branches when necessary. .. _indevbranch: -In-Development --------------- +In-development (main) branch +---------------------------- -The current branch under active development. It can be :ref:`checked out -` from http://svn.python.org/projects/python/branches/py3k. +The branch for the next minor version; it is under active development for +all kinds of changes: new features, semantic changes, performance improvements, +bug fixes. It can be :ref:`checked out ` from +http://svn.python.org/projects/python/branches/py3k. -The in-development branch is where new functionality and semantic changes -occur. Currently this branch is known as the "py3k" branch. The next minor -release of Python will come from this branch (major releases are once a decade -and so have no specific rules on how they are started). All changes land in this -branch and then trickle down to other branches. +Once a :ref:`final` release is made from the in-development branch (say, 3.2), a +new :ref:`maintenance branch ` (e.g. ``release32-maint``) +is created to host all bug fixing activity for further micro versions +(3.2.1, 3.2.2, etc.). -Once a Final_ release is made from the in-development branch, a branch is made -to represent the minor version of Python and it goes into maintenance mode. -Typically a minor version of Python is under development for about 18 months. +.. _maintbranch: -Maintenance ------------ +Maintenance branches +-------------------- -The branch currently being maintained for bug fixes. +A branch currently being maintained for bug fixes. There are currently +two of them in activity: one for Python 3.x and one for Python 2.x. 
At +some point in the future, Python 2.x will be closed for bug fixes and there +will be only one maintenance branch left. -The branch under maintenance is the last minor version of Python to be released -as Final_. This means if the latest release of Python was 3.1.2, then the -branch representing Python 3.1 is in maintenance mode. +The only changes allowed to occur in a maintenance branch without debate are +bug fixes. Also, a general rule for maintenance branches is that compatibility +must not be broken at any point between sibling minor releases (3.1.1, 3.1.2, +etc.). For both rules, only rare exceptions are accepted and **must** be +discussed first. -The only changes allowed to occur in a maintenance branch without debate are bug -fixes. -Semantic changes **must** be carefully considered as code out in the world will -have already been developed that will rely on the released semantics. Changes -related to semantics should be discussed on python-dev before being made. +When a new maintenance branch is created (after a new *minor version* is +released), the old maintenance branch on that major version (e.g. 3.1.x +after 3.2 gets released) goes into :ref:`security mode `. -A branch stays in maintenance mode as long as a new minor release has not been -made. For example, this means that Python 2.6 stayed in maintenance mode until -Python 2.7.0 was released, at which point 2.7 went into maintenance mode and -2.6 went into Security_ mode. As new minor releases occur on a (roughly) 18 -month schedule, a branch stays in maintenance mode for the same amount of time. -A micro release of a maintenance branch is made about every six months. -Typically when a new minor release is made one more release of the new-old -version of Python is made. +.. _secbranch: +Security branches +----------------- -Security --------- -A branch less than five years old but no longer in maintenance mode. +A branch less than 5 years old but no longer in maintenance mode. -The only changes made to a branch that is being maintained for security -purposes are somewhat obviously those related to security, e.g., privilege -escalation, and issues that lead to crashes. Other behavioral issues are -**not** considered a -security risk and thus not backported to a branch being maintained for -security. Any release made from a branch under security maintenance is -source-only and done only when actual security patches have been applied to the -branch. +The only changes made to a security branch are those fixing issues exploitable +by attackers such as crashes, privilege escalation and, optionally, other +issues such as denial of service attacks. Other behavioral issues are +**not** considered a security risk and thus not backported to a security branch. +Any release made from a security branch is source-only and done only when +actual security patches have been applied to the branch. +.. _stages: + Stages '''''' -Based on what stage the in-development version of Python is in, the -responsibilities of a core developer change in regards to commits to the VCS. +Based on what stage the :ref:`in-development ` version of Python +is in, the responsibilities of a core developer change in regards to commits +to the VCS. Pre-alpha --------- -This is the stage a branch is in from the last final release until the first -alpha (a1). There are no special restrictions placed on commits beyond those -imposed by the type of branch being worked on (e.g., in-development vs. -maintenance). 
+The branch is in this stage when no official release has been done since +the latest final release. There are no special restrictions placed on +commits, although the usual advice applies (getting patches reviewed, avoiding +breaking the buildbots). + +.. _alpha: Alpha ----- -Alphas typically serve as a reminder to core developers that they need to start -getting in changes that change semantics or add something to Python as such -things should not be added during a Beta_. Otherwise no new restrictions are in -place while in alpha. +Alpha releases typically serve as a reminder to core developers that they +need to start getting in changes that change semantics or add something to +Python as such things should not be added during a Beta_. Otherwise no new +restrictions are in place while in alpha. + +.. _beta: Beta ---- -A branch in beta means that no new additions to Python are accepted. Bugfixes -and the like are still fine. Being in beta can be viewed much like being in RC_ -but without the extra overhead of needing commit reviews. +After a first beta release is published, no new features are accepted. Only +bug fixes can now be committed. This is when core developers should concentrate +on the task of fixing regressions and other new issues filed by users who have +downloaded the alpha and beta releases. -.. _RC: +Being in beta can be viewed much like being in RC_ but without the extra overhead +of needing commit reviews. + +.. _rc: Release Candidate (RC) ---------------------- + A branch preparing for an RC release can only have bugfixes applied that have -been reviewed by other core developers. That reviewer should make a post to the -issue related to the change and be mentioned in the commit message. +been reviewed by other core developers. Generally, these issues must be +severe enough (e.g. crashes) that they deserve fixing before the final release. +All other issues should be deferred to the next development cycle, since stability +is the strongest concern at this point. You **cannot** skip the peer review during an RC, no matter how small! Even if it is a simple copy-and-paste change, **everything** requires peer review from a core developer. +.. _final: Final ----- + When a final release is being cut, only the release manager (RM) can make -changes to the branch. +changes to the branch. After the final release is published, the full +:ref:`development cycle ` starts again for the next minor version. 
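The ``major.minor.micro`` (plus release level) nomenclature described in the devcycle.rst
hunk above maps directly onto ``sys.version_info``; as a minimal illustrative sketch
(not part of the committed patch)::

    import sys

    # sys.version_info carries the components described above, e.g.
    # (3, 1, 2, 'final', 0) for Python 3.1.2 final.
    major, minor, micro, releaselevel, serial = sys.version_info
    print("major=%d minor=%d micro=%d releaselevel=%s serial=%d"
          % (major, minor, micro, releaselevel, serial))

Running this under a release candidate would report ``releaselevel='candidate'``,
matching the stages listed later in the patch.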
+ diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -40,8 +40,8 @@ svn co http://svn.python.org/projects/python/branches/py3k If you want a read-only checkout of an already-released version of Python, -i.e., a version in maintenance mode, run something like the following which -gets you a checkout for Python 3.1:: +i.e., a version in :ref:`maintenance mode `, run something +like the following which gets you a checkout for Python 3.1:: svn co http://svn.python.org/projects/python/branches/release31-maint -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Fri Feb 4 22:55:23 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 04 Feb 2011 22:55:23 +0100 Subject: [Python-checkins] devguide: Shift the developer log into the resources section Message-ID: antoine.pitrou pushed 2c711f0028a7 to devguide: http://hg.python.org/devguide/rev/2c711f0028a7 changeset: 234:2c711f0028a7 tag: tip user: Antoine Pitrou date: Fri Feb 04 22:55:21 2011 +0100 summary: Shift the developer log into the resources section files: index.rst diff --git a/index.rst b/index.rst --- a/index.rst +++ b/index.rst @@ -63,7 +63,6 @@ * :ref:`languishing` * :ref:`communication` * :ref:`coredev` - * :ref:`developers` * :ref:`committing` * :ref:`devcycle` @@ -103,6 +102,7 @@ * :ref:`faq` * PEPs_ (Python Enhancement Proposals) * `python.org maintenance`_ +* :ref:`developers` .. _buildbots: http://python.org/dev/buildbot/ -- Repository URL: http://hg.python.org/devguide From solipsis at pitrou.net Sat Feb 5 05:08:53 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 05 Feb 2011 05:08:53 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88337): sum=-56 Message-ID: py3k results for svn r88337 (hg cset 343e5269accf) -------------------------------------------------- test_pyexpat leaked [0, 0, -56] references, sum=-56 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogZnm0ZP', '-x'] From python-checkins at python.org Sat Feb 5 11:43:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 11:43:57 +0100 (CET) Subject: [Python-checkins] r88340 - python/branches/py3k/Doc/faq/gui.rst Message-ID: <20110205104357.454A8EEA23@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 11:43:57 2011 New Revision: 88340 Log: Update info in the GUI FAQ Modified: python/branches/py3k/Doc/faq/gui.rst Modified: python/branches/py3k/Doc/faq/gui.rst ============================================================================== --- python/branches/py3k/Doc/faq/gui.rst (original) +++ python/branches/py3k/Doc/faq/gui.rst Sat Feb 5 11:43:57 2011 @@ -15,7 +15,9 @@ What platform-independent GUI toolkits exist for Python? ======================================================== -Depending on what platform(s) you are aiming at, there are several. +Depending on what platform(s) you are aiming at, there are several. Some +of them haven't been ported to Python 3 yet. At least `Tkinter`_ and `Qt`_ +are known to be Python 3-compatible. .. XXX check links @@ -23,10 +25,12 @@ ------- Standard builds of Python include an object-oriented interface to the Tcl/Tk -widget set, called Tkinter. This is probably the easiest to install and use. -For more info about Tk, including pointers to the source, see the Tcl/Tk home -page at http://www.tcl.tk. Tcl/Tk is fully portable to the MacOS, Windows, and -Unix platforms. +widget set, called :ref:`tkinter `. 
This is probably the easiest to +install (since it comes included with most +`binary distributions `_ of Python) and use. +For more info about Tk, including pointers to the source, see the +`Tcl/Tk home page `_. Tcl/Tk is fully portable to the +MacOS, Windows, and Unix platforms. wxWidgets --------- @@ -51,13 +55,15 @@ Qt --- -There are bindings available for the Qt toolkit (`PyQt -`_) and for KDE (`PyKDE `__). If -you're writing open source software, you don't need to pay for PyQt, but if you -want to write proprietary applications, you must buy a PyQt license from -`Riverbank Computing `_ and (up to Qt 4.4; -Qt 4.5 upwards is licensed under the LGPL license) a Qt license from `Trolltech -`_. +There are bindings available for the Qt toolkit (using either `PyQt +`_ or `PySide +`_) and for KDE (`PyKDE `__). +PyQt is currently more mature than PySide, but you must buy a PyQt license from +`Riverbank Computing `_ +if you want to write proprietary applications. PySide is free for all applications. + +Qt 4.5 upwards is licensed under the LGPL license; also, commercial licenses +are available from `Nokia `_. Gtk+ ---- From python-checkins at python.org Sat Feb 5 11:57:17 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 11:57:17 +0100 (CET) Subject: [Python-checkins] r88341 - in python/branches/py3k/Doc/faq: design.rst extending.rst programming.rst Message-ID: <20110205105717.48044EE9BF@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 11:57:17 2011 New Revision: 88341 Log: Mention Cython and remove obsolete alternatives Modified: python/branches/py3k/Doc/faq/design.rst python/branches/py3k/Doc/faq/extending.rst python/branches/py3k/Doc/faq/programming.rst Modified: python/branches/py3k/Doc/faq/design.rst ============================================================================== --- python/branches/py3k/Doc/faq/design.rst (original) +++ python/branches/py3k/Doc/faq/design.rst Sat Feb 5 11:57:17 2011 @@ -418,11 +418,9 @@ .. XXX check which of these projects are still alive There are also several programs which make it easier to intermingle Python and C -code in various ways to increase performance. See, for example, `Psyco -`_, `Pyrex -`_, `PyInline -`_, `Py2Cmod -`_, and `Weave +code in various ways to increase performance. See, for example, `Cython +`_, `Pyrex +`_ and `Weave `_. Modified: python/branches/py3k/Doc/faq/extending.rst ============================================================================== --- python/branches/py3k/Doc/faq/extending.rst (original) +++ python/branches/py3k/Doc/faq/extending.rst Sat Feb 5 11:57:17 2011 @@ -45,10 +45,11 @@ very little effort, as long as you're running on a machine with an x86-compatible processor. -`Pyrex `_ is a compiler -that accepts a slightly modified form of Python and generates the corresponding -C code. Pyrex makes it possible to write an extension without having to learn -Python's C API. +`Cython `_ and its relative `Pyrex +`_ are compilers +that accept a slightly modified form of Python and generate the corresponding +C code. Cython and Pyrex make it possible to write an extension without having +to learn Python's C API. 
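As a concrete sketch of the Cython/Pyrex workflow mentioned above (not part of the
quoted patch; the module name ``fib.pyx`` and this ``setup.py`` are hypothetical, and
Cython is assumed to be installed), the build glue itself is ordinary Python::

    # setup.py: compiles the hypothetical fib.pyx into an extension module
    from distutils.core import setup
    from distutils.extension import Extension
    from Cython.Distutils import build_ext

    setup(
        name="fib",
        # build_ext from Cython translates .pyx files to C and compiles them,
        # so no hand-written C API code is needed.
        cmdclass={"build_ext": build_ext},
        ext_modules=[Extension("fib", ["fib.pyx"])],
    )

After ``python setup.py build_ext --inplace``, the resulting module can be imported
like any pure Python module.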
If you need to interface to some C or C++ library for which no Python extension currently exists, you can try wrapping the library's data types and functions Modified: python/branches/py3k/Doc/faq/programming.rst ============================================================================== --- python/branches/py3k/Doc/faq/programming.rst (original) +++ python/branches/py3k/Doc/faq/programming.rst Sat Feb 5 11:57:17 2011 @@ -127,9 +127,9 @@ .. XXX seems to have overlap with other questions! -`Pyrex `_ can compile a -slightly modified version of Python code into a C extension, and can be used on -many different platforms. +`Cython `_ and `Pyrex `_ +can compile a slightly modified version of Python code into a C extension, and +can be used on many different platforms. `Psyco `_ is a just-in-time compiler that translates Python code into x86 assembly language. If you can use it, Psyco can From python-checkins at python.org Sat Feb 5 12:04:01 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 12:04:01 +0100 (CET) Subject: [Python-checkins] r88342 - python/branches/py3k/Doc/faq/installed.rst Message-ID: <20110205110401.9D39EEEA91@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 12:04:01 2011 New Revision: 88342 Log: Update test of "why is Python installed" FAQ. Modified: python/branches/py3k/Doc/faq/installed.rst Modified: python/branches/py3k/Doc/faq/installed.rst ============================================================================== --- python/branches/py3k/Doc/faq/installed.rst (original) +++ python/branches/py3k/Doc/faq/installed.rst Sat Feb 5 12:04:01 2011 @@ -24,14 +24,14 @@ it; you'll have to figure out who's been using the machine and might have installed it. * A third-party application installed on the machine might have been written in - Python and included a Python installation. For a home computer, the most - common such application is `PySol `_, a - solitaire game that includes over 1000 different games and variations. + Python and included a Python installation. There are many such applications, + from GUI programs to network servers and administrative scripts. * Some Windows machines also have Python installed. At this writing we're aware of computers from Hewlett-Packard and Compaq that include Python. Apparently some of HP/Compaq's administrative tools are written in Python. -* All Apple computers running Mac OS X have Python installed; it's included in - the base installation. +* Many Unix-compatible operating systems, such as Mac OS X and some Linux + distributions, have Python installed by default; it's included in the base + installation. Can I delete Python? From python-checkins at python.org Sat Feb 5 12:18:35 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 12:18:35 +0100 (CET) Subject: [Python-checkins] r88343 - python/branches/py3k/Doc/faq/library.rst Message-ID: <20110205111835.0D9CFEE9BF@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 12:18:34 2011 New Revision: 88343 Log: Mention concurrent.futures and update answers about the GIL. Modified: python/branches/py3k/Doc/faq/library.rst Modified: python/branches/py3k/Doc/faq/library.rst ============================================================================== --- python/branches/py3k/Doc/faq/library.rst (original) +++ python/branches/py3k/Doc/faq/library.rst Sat Feb 5 12:18:34 2011 @@ -285,11 +285,15 @@ How do I parcel out work among a bunch of worker threads? 
---------------------------------------------------------

-Use the :mod:`queue` module to create a queue containing a list of jobs. The
-:class:`~queue.Queue` class maintains a list of objects with ``.put(obj)`` to
-add an item to the queue and ``.get()`` to return an item. The class will take
-care of the locking necessary to ensure that each job is handed out exactly
-once.
+The easiest way is to use the new :mod:`concurrent.futures` module,
+especially the :class:`~concurrent.futures.ThreadPoolExecutor` class.
+
+Or, if you want fine control over the dispatching algorithm, you can write
+your own logic manually. Use the :mod:`queue` module to create a queue
+containing a list of jobs. The :class:`~queue.Queue` class maintains a
+list of objects with ``.put(obj)`` to add an item to the queue and ``.get()``
+to return an item. The class will take care of the locking necessary to
+ensure that each job is handed out exactly once.

 Here's a trivial example::

@@ -352,7 +356,7 @@
 What kinds of global value mutation are thread-safe?
 ----------------------------------------------------

-A global interpreter lock (GIL) is used internally to ensure that only one
+A :term:`global interpreter lock` (GIL) is used internally to ensure that only one
 thread runs in the Python VM at a time. In general, Python offers to switch
 among threads only between bytecode instructions; how frequently it switches
 can be set via :func:`sys.setswitchinterval`. Each bytecode instruction and
@@ -395,32 +399,34 @@
 Can't we get rid of the Global Interpreter Lock?
 ------------------------------------------------

-.. XXX mention multiprocessing
 .. XXX link to dbeazley's talk about GIL?

-The Global Interpreter Lock (GIL) is often seen as a hindrance to Python's
+The :term:`global interpreter lock` (GIL) is often seen as a hindrance to Python's
 deployment on high-end multiprocessor server machines, because a multi-threaded
 Python program effectively only uses one CPU, due to the insistence that
 (almost) all Python code can only run while the GIL is held.

 Back in the days of Python 1.5, Greg Stein actually implemented a comprehensive
 patch set (the "free threading" patches) that removed the GIL and replaced it
-with fine-grained locking. Unfortunately, even on Windows (where locks are very
-efficient) this ran ordinary Python code about twice as slow as the interpreter
-using the GIL. On Linux the performance loss was even worse because pthread
-locks aren't as efficient.
-
-Since then, the idea of getting rid of the GIL has occasionally come up but
-nobody has found a way to deal with the expected slowdown, and users who don't
-use threads would not be happy if their code ran at half at the speed. Greg's
-free threading patch set has not been kept up-to-date for later Python versions.
+with fine-grained locking. Adam Olsen recently did a similar experiment
+in his `python-safethread `_
+project. Unfortunately, both experiments exhibited a sharp drop in single-thread
+performance (at least 30% slower), due to the amount of fine-grained locking
+necessary to compensate for the removal of the GIL.

 This doesn't mean that you can't make good use of Python on multi-CPU machines!
 You just have to be creative with dividing the work up between multiple
-*processes* rather than multiple *threads*. Judicious use of C extensions will
-also help; if you use a C extension to perform a time-consuming task, the
-extension can release the GIL while the thread of execution is in the C code and
-allow other threads to get some work done.
+*processes* rather than multiple *threads*. The +:class:`~concurrent.futures.ProcessPoolExecutor` class in the new +:mod:`concurrent.futures` module provides an easy way of doing so; the +:mod:`multiprocessing` module provides a lower-level API in case you want +more control over dispatching of tasks. + +Judicious use of C extensions will also help; if you use a C extension to +perform a time-consuming task, the extension can release the GIL while the +thread of execution is in the C code and allow other threads to get some work +done. Some standard library modules such as :mod:`zlib` and :mod:`hashlib` +already do this. It has been suggested that the GIL should be a per-interpreter-state lock rather than truly global; interpreters then wouldn't be able to share objects. From python-checkins at python.org Sat Feb 5 12:24:15 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 12:24:15 +0100 (CET) Subject: [Python-checkins] r88344 - python/branches/py3k/Doc/faq/library.rst Message-ID: <20110205112415.894A7EE9BF@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 12:24:15 2011 New Revision: 88344 Log: Mention asyncore and Twisted in the library FAQ. Modified: python/branches/py3k/Doc/faq/library.rst Modified: python/branches/py3k/Doc/faq/library.rst ============================================================================== --- python/branches/py3k/Doc/faq/library.rst (original) +++ python/branches/py3k/Doc/faq/library.rst Sat Feb 5 12:24:15 2011 @@ -757,7 +757,8 @@ How do I avoid blocking in the connect() method of a socket? ------------------------------------------------------------ -The select module is commonly used to help with asynchronous I/O on sockets. +The :mod:`select` module is commonly used to help with asynchronous I/O on +sockets. To prevent the TCP connect from blocking, you can set the socket to non-blocking mode. Then when you do the ``connect()``, you will either connect immediately @@ -771,6 +772,12 @@ -- ``0`` or ``errno.EISCONN`` indicate that you're connected -- or you can pass this socket to select to check if it's writable. +.. note:: + The :mod:`asyncore` module presents a framework-like approach to the problem + of writing non-blocking networking code. + The third-party `Twisted `_ library is + a popular and feature-rich alternative. + Databases ========= From python-checkins at python.org Sat Feb 5 12:40:05 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 12:40:05 +0100 (CET) Subject: [Python-checkins] r88345 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110205114005.E0C75EE9D4@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 12:40:05 2011 New Revision: 88345 Log: Mention -b and -bb Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 12:40:05 2011 @@ -319,6 +319,37 @@ consistently stick to that API in both Python 2 and 3. +Bytes / unicode comparison +************************** + +In Python 3, mixing bytes and unicode is forbidden in most situations; it +will raise a :class:`TypeError` where Python 2 would have attempted an implicit +coercion between types. 
However, there is one case where it doesn't and +it can be very misleading:: + + >>> b"" == "" + False + +This is because comparison for equality is required by the language to always +succeed (and return ``False`` for incompatible types). However, this also +means that code incorrectly ported to Python 3 can display buggy behaviour +if such comparisons are silently executed. To detect such situations, +Python 3 has a ``-b`` flag that will display a warning:: + + $ python3 -b + >>> b"" == "" + __main__:1: BytesWarning: Comparison between bytes and string + False + +To turn the warning into an exception, use the ``-bb`` flag instead:: + + $ python3 -bb + >>> b"" == "" + Traceback (most recent call last): + File "", line 1, in + BytesWarning: Comparison between bytes and string + + ``__str__()``/``__unicode__()`` ''''''''''''''''''''''''''''''' In Python 2, objects can specify both a string and unicode representation of From python-checkins at python.org Sat Feb 5 12:53:39 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 12:53:39 +0100 (CET) Subject: [Python-checkins] r88346 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110205115339.82BAFEE9A4@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 12:53:39 2011 New Revision: 88346 Log: Fix entries pertaining to file I/O Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 12:53:39 2011 @@ -251,17 +251,6 @@ also comes about when doing comparisons between bytes and strings. -:mod:`io` Module -'''''''''''''''' -The built-in ``open()`` function in Python 2 always returns a Python 2 string, -not a unicode string. This is problematic as Python 3's :func:`open` returns a -string if a file is not opened as binary and bytes if it is. - -To help with compatibility, use :func:`io.open` instead of the built-in -``open()``. Since :func:`io.open` is essentially the same function in both -Python 2 and Python 3 it will help iron out any issues that might arise. - - Handle Common "Gotchas" ----------------------- There are a few things that just consistently come up as sticking points for @@ -269,6 +258,34 @@ to help modernize your code. +Specify when opening a file as binary +''''''''''''''''''''''''''''''''''''' + +Unless you have been working on Windows, there is a chance you have not always +bothered to add the ``b`` mode when opening a binary file (e.g., ``rb`` for +binary reading). Under Python 3, binary files and text files are clearly +distinct and mutually incompatible; see the :mod:`io` module for details. +Therefore, you **must** make a decision of whether a file will be used for +binary access (allowing to read and/or write bytes data) or text access +(allowing to read and/or write unicode data). + +Text files +'''''''''' + +Text files created using ``open()`` under Python 2 return byte strings, +while under Python 3 they return unicode strings. Depending on your porting +strategy, this can be an issue. + +If you want text files to return unicode strings in Python 2, you have two +possibilities: + +* Under Python 2.6 and higher, use :func:`io.open`. Since :func:`io.open` + is essentially the same function in both Python 2 and Python 3, it will + help iron out any issues that might arise. 
+ +* If pre-2.6 compatibility is needed, then you should use :func:`codecs.open` + instead. This will make sure that you get back unicode strings in Python 2. + Subclass ``object`` ''''''''''''''''''' New-style classes have been around since Python 2.2. You need to make sure you @@ -392,23 +409,9 @@ return u'spam-spam-bacon-spam' # 2to3 will remove the 'u' prefix -Specify when opening a file as binary -''''''''''''''''''''''''''''''''''''' -Unless you have been working on Windows, there is a chance you have not always -bothered to add the ``b`` mode when opening a file (e.g., `` - - -Use :func:``codecs.open()`` -''''''''''''''''''''''''''' -If you are not able to limit your Python 2 compatibility to 2.6 or newer (and -thus get to use :func:`io.open`), then you should make sure you use -:func:`codecs.open` over the built-in ``open()`` function. This will make sure -that you get back unicode strings in Python 2 when reading in text and an -instance of ``str`` when dealing with bytes. - - Don't Index on Exceptions ''''''''''''''''''''''''' + In Python 2, the following worked:: >>> exc = Exception(1, 2, 3) @@ -423,9 +426,9 @@ Even better is to use documented attributes the exception provides. - Don't use ``__getslice__`` & Friends '''''''''''''''''''''''''''''''''''' + Been deprecated for a while, but Python 3 finally drops support for ``__getslice__()``, etc. Move completely over to :meth:`__getitem__` and friends. From python-checkins at python.org Sat Feb 5 13:01:08 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 5 Feb 2011 13:01:08 +0100 (CET) Subject: [Python-checkins] r88347 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110205120108.1B31BEE994@mail.python.org> Author: antoine.pitrou Date: Sat Feb 5 13:01:07 2011 New Revision: 88347 Log: Soften the wording about tracebacks. Reference cycles *don't* prevent garbage collection! (fortunately) Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 13:01:07 2011 @@ -587,14 +587,19 @@ You can get more information about the raised exception from :func:`sys.exc_info` than simply the current exception instance, but you most -likely don't need it. One very key point to understand, though, is **do not -save the traceback to a variable without deleting it**! Because tracebacks -contain references to the current executing frame you will inadvertently create -a circular reference, prevent everything in the frame from being garbage -collected. This can be a massive memory leak if you are not careful. Simply -index into the returned value from :func:`sys.version_info` instead of -assigning the tuple it returns to a variable. +likely don't need it. +.. note:: + In Python 3, the traceback is attached to the exception instance + through the **__traceback__** attribute. If the instance is saved in + a local variable that persists outside of the ``except`` block, the + traceback will create a reference cycle with the current frame and its + dictionary of local variables. This will delay reclaiming dead + resources until the next cyclic :term:`garbage collection` pass. + + In Python 2, this problem only occurs if you save the traceback itself + (e.g. the third element of the tuple returned by :func:`sys.exc_info`) + in a variable. 
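To make the note above concrete, a small hypothetical Python 3 sketch (the function
names are made up) showing how a stored exception keeps its ``__traceback__`` (and
therefore a chain of frames) alive, and how clearing the reference releases them
immediately::

    def parse_record(line):
        return int(line)              # may raise ValueError

    def parse_all(lines):
        errors = []
        for line in lines:
            try:
                parse_record(line)
            except ValueError as exc:
                # Keeping exc alive also keeps exc.__traceback__ and the
                # frames it references alive, forming a reference cycle.
                errors.append(exc)
        return errors

    errs = parse_all(["1", "two", "3"])
    print(len(errs))                  # 1
    # Dropping the stored exceptions breaks the cycle right away instead of
    # waiting for the next cyclic garbage collection pass.
    del errs[:]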
Other Resources
===============

From python-checkins at python.org Sat Feb 5 13:13:38 2011
From: python-checkins at python.org (antoine.pitrou)
Date: Sat, 5 Feb 2011 13:13:38 +0100 (CET)
Subject: [Python-checkins] r88348 - python/branches/py3k/Doc/howto/pyporting.rst
Message-ID: <20110205121338.ECC72EE9F6@mail.python.org>

Author: antoine.pitrou
Date: Sat Feb 5 13:13:38 2011
New Revision: 88348

Log:
Everybody hates this one :) (bytes indexing)

Modified:
   python/branches/py3k/Doc/howto/pyporting.rst

Modified: python/branches/py3k/Doc/howto/pyporting.rst
==============================================================================
--- python/branches/py3k/Doc/howto/pyporting.rst (original)
+++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 13:13:38 2011
@@ -367,6 +367,37 @@

+Indexing bytes objects
+''''''''''''''''''''''
+
+Another potentially surprising change is the indexing behaviour of bytes
+objects in Python 3::
+
+   >>> b"xyz"[0]
+   120
+
+Indeed, Python 3 bytes objects (as well as :class:`bytearray` objects)
+are sequences of integers. But code converted from Python 2 will often
+assume that indexing a bytestring produces another bytestring, not an
+integer. To reconcile both behaviours, use slicing::
+
+   >>> b"xyz"[0:1]
+   b'x'
+   >>> n = 1
+   >>> b"xyz"[n:n+1]
+   b'y'
+
+The only remaining gotcha is that an out-of-bounds slice returns an empty
+bytes object instead of raising ``IndexError``::
+
+   >>> b"xyz"[3]
+   Traceback (most recent call last):
+     File "", line 1, in
+   IndexError: index out of range
+   >>> b"xyz"[3:4]
+   b''
+
+
 ``__str__()``/``__unicode__()``
 '''''''''''''''''''''''''''''''
 In Python 2, objects can specify both a string and unicode representation of

From python-checkins at python.org Sat Feb 5 17:03:12 2011
From: python-checkins at python.org (eric.araujo)
Date: Sat, 5 Feb 2011 17:03:12 +0100 (CET)
Subject: [Python-checkins] r88349 - python/branches/py3k/Doc/howto/pyporting.rst
Message-ID: <20110205160312.A416BEE9D5@mail.python.org>

Author: eric.araujo
Date: Sat Feb 5 17:03:12 2011
New Revision: 88349

Log:
Use an internal reference instead of hard-coded URI.

Modified:
   python/branches/py3k/Doc/howto/pyporting.rst

Modified: python/branches/py3k/Doc/howto/pyporting.rst
==============================================================================
--- python/branches/py3k/Doc/howto/pyporting.rst (original)
+++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 17:03:12 2011
@@ -15,7 +15,7 @@
    that strategy.

    If you are looking to port an extension module instead of pure Python code,
-   please see http://docs.python.org/py3k/howto/cporting.html .
+   please see :ref:`cporting-howto`.


 Choosing a Strategy

From python-checkins at python.org Sat Feb 5 21:26:52 2011
From: python-checkins at python.org (martin.v.loewis)
Date: Sat, 5 Feb 2011 21:26:52 +0100 (CET)
Subject: [Python-checkins] r88350 - in python/branches/py3k: Misc/NEWS configure configure.in
Message-ID: <20110205202652.946BEEE988@mail.python.org>

Author: martin.v.loewis
Date: Sat Feb 5 21:26:52 2011
New Revision: 88350

Log:
Issue #11121: Fix building with --enable-shared.
Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/configure python/branches/py3k/configure.in Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 5 21:26:52 2011 @@ -20,6 +20,11 @@ - Issue #8275: Fix passing of callback arguments with ctypes under Win64. Patch by Stan Mihai. +Build +----- + +- Issue #11121: Fix building with --enable-shared. + What's New in Python 3.2 Release Candidate 2? ============================================= Modified: python/branches/py3k/configure ============================================================================== --- python/branches/py3k/configure (original) +++ python/branches/py3k/configure Sat Feb 5 21:26:52 2011 @@ -1,14 +1,14 @@ #! /bin/sh -# From configure.in Revision: 87646 . +# From configure.in Revision: 87698 . # Guess values for system-dependent variables and create Makefiles. -# Generated by GNU Autoconf 2.65 for python 3.2. +# Generated by GNU Autoconf 2.68 for python 3.2. # # Report bugs to . # # # Copyright (C) 1992, 1993, 1994, 1995, 1996, 1998, 1999, 2000, 2001, -# 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009 Free Software Foundation, -# Inc. +# 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010 Free Software +# Foundation, Inc. # # # This configure script is free software; the Free Software Foundation @@ -92,6 +92,7 @@ IFS=" "" $as_nl" # Find who we are. Look in the path if we contain no directory separator. +as_myself= case $0 in #(( *[\\/]* ) as_myself=$0 ;; *) as_save_IFS=$IFS; IFS=$PATH_SEPARATOR @@ -217,11 +218,18 @@ # We cannot yet assume a decent shell, so we have to provide a # neutralization value for shells without unset; and this also # works around shells that cannot unset nonexistent variables. + # Preserve -v and -x to the replacement shell. BASH_ENV=/dev/null ENV=/dev/null (unset BASH_ENV) >/dev/null 2>&1 && unset BASH_ENV ENV export CONFIG_SHELL - exec "$CONFIG_SHELL" "$as_myself" ${1+"$@"} + case $- in # (((( + *v*x* | *x*v* ) as_opts=-vx ;; + *v* ) as_opts=-v ;; + *x* ) as_opts=-x ;; + * ) as_opts= ;; + esac + exec "$CONFIG_SHELL" $as_opts "$as_myself" ${1+"$@"} fi if test x$as_have_required = xno; then : @@ -320,7 +328,7 @@ test -d "$as_dir" && break done test -z "$as_dirs" || eval "mkdir $as_dirs" - } || test -d "$as_dir" || as_fn_error "cannot create directory $as_dir" + } || test -d "$as_dir" || as_fn_error $? "cannot create directory $as_dir" } # as_fn_mkdir_p @@ -360,19 +368,19 @@ fi # as_fn_arith -# as_fn_error ERROR [LINENO LOG_FD] -# --------------------------------- +# as_fn_error STATUS ERROR [LINENO LOG_FD] +# ---------------------------------------- # Output "`basename $0`: error: ERROR" to stderr. If LINENO and LOG_FD are # provided, also output the error to LOG_FD, referencing LINENO. Then exit the -# script with status $?, using 1 if that was 0. +# script with STATUS, using 1 if that was 0. 
as_fn_error () { - as_status=$?; test $as_status -eq 0 && as_status=1 - if test "$3"; then - as_lineno=${as_lineno-"$2"} as_lineno_stack=as_lineno_stack=$as_lineno_stack - $as_echo "$as_me:${as_lineno-$LINENO}: error: $1" >&$3 + as_status=$1; test $as_status -eq 0 && as_status=1 + if test "$4"; then + as_lineno=${as_lineno-"$3"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + $as_echo "$as_me:${as_lineno-$LINENO}: error: $2" >&$4 fi - $as_echo "$as_me: error: $1" >&2 + $as_echo "$as_me: error: $2" >&2 as_fn_exit $as_status } # as_fn_error @@ -534,7 +542,7 @@ exec 6>&1 # Name of the host. -# hostname on some systems (SVR3.2, Linux) returns a bogus exit status, +# hostname on some systems (SVR3.2, old GNU/Linux) returns a bogus exit status, # so uname gets run too. ac_hostname=`(hostname || uname -n) 2>/dev/null | sed 1q` @@ -829,8 +837,9 @@ fi case $ac_option in - *=*) ac_optarg=`expr "X$ac_option" : '[^=]*=\(.*\)'` ;; - *) ac_optarg=yes ;; + *=?*) ac_optarg=`expr "X$ac_option" : '[^=]*=\(.*\)'` ;; + *=) ac_optarg= ;; + *) ac_optarg=yes ;; esac # Accept the important Cygnus configure options, so we can diagnose typos. @@ -875,7 +884,7 @@ ac_useropt=`expr "x$ac_option" : 'x-*disable-\(.*\)'` # Reject names that are not valid shell variable names. expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && - as_fn_error "invalid feature name: $ac_useropt" + as_fn_error $? "invalid feature name: $ac_useropt" ac_useropt_orig=$ac_useropt ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` case $ac_user_opts in @@ -901,7 +910,7 @@ ac_useropt=`expr "x$ac_option" : 'x-*enable-\([^=]*\)'` # Reject names that are not valid shell variable names. expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && - as_fn_error "invalid feature name: $ac_useropt" + as_fn_error $? "invalid feature name: $ac_useropt" ac_useropt_orig=$ac_useropt ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` case $ac_user_opts in @@ -1105,7 +1114,7 @@ ac_useropt=`expr "x$ac_option" : 'x-*with-\([^=]*\)'` # Reject names that are not valid shell variable names. expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && - as_fn_error "invalid package name: $ac_useropt" + as_fn_error $? "invalid package name: $ac_useropt" ac_useropt_orig=$ac_useropt ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` case $ac_user_opts in @@ -1121,7 +1130,7 @@ ac_useropt=`expr "x$ac_option" : 'x-*without-\(.*\)'` # Reject names that are not valid shell variable names. expr "x$ac_useropt" : ".*[^-+._$as_cr_alnum]" >/dev/null && - as_fn_error "invalid package name: $ac_useropt" + as_fn_error $? "invalid package name: $ac_useropt" ac_useropt_orig=$ac_useropt ac_useropt=`$as_echo "$ac_useropt" | sed 's/[-+.]/_/g'` case $ac_user_opts in @@ -1151,8 +1160,8 @@ | --x-librar=* | --x-libra=* | --x-libr=* | --x-lib=* | --x-li=* | --x-l=*) x_libraries=$ac_optarg ;; - -*) as_fn_error "unrecognized option: \`$ac_option' -Try \`$0 --help' for more information." + -*) as_fn_error $? "unrecognized option: \`$ac_option' +Try \`$0 --help' for more information" ;; *=*) @@ -1160,7 +1169,7 @@ # Reject names that are not valid shell variable names. case $ac_envvar in #( '' | [0-9]* | *[!_$as_cr_alnum]* ) - as_fn_error "invalid variable name: \`$ac_envvar'" ;; + as_fn_error $? 
"invalid variable name: \`$ac_envvar'" ;; esac eval $ac_envvar=\$ac_optarg export $ac_envvar ;; @@ -1170,7 +1179,7 @@ $as_echo "$as_me: WARNING: you should use --build, --host, --target" >&2 expr "x$ac_option" : ".*[^-._$as_cr_alnum]" >/dev/null && $as_echo "$as_me: WARNING: invalid host type: $ac_option" >&2 - : ${build_alias=$ac_option} ${host_alias=$ac_option} ${target_alias=$ac_option} + : "${build_alias=$ac_option} ${host_alias=$ac_option} ${target_alias=$ac_option}" ;; esac @@ -1178,13 +1187,13 @@ if test -n "$ac_prev"; then ac_option=--`echo $ac_prev | sed 's/_/-/g'` - as_fn_error "missing argument to $ac_option" + as_fn_error $? "missing argument to $ac_option" fi if test -n "$ac_unrecognized_opts"; then case $enable_option_checking in no) ;; - fatal) as_fn_error "unrecognized options: $ac_unrecognized_opts" ;; + fatal) as_fn_error $? "unrecognized options: $ac_unrecognized_opts" ;; *) $as_echo "$as_me: WARNING: unrecognized options: $ac_unrecognized_opts" >&2 ;; esac fi @@ -1207,7 +1216,7 @@ [\\/$]* | ?:[\\/]* ) continue;; NONE | '' ) case $ac_var in *prefix ) continue;; esac;; esac - as_fn_error "expected an absolute directory name for --$ac_var: $ac_val" + as_fn_error $? "expected an absolute directory name for --$ac_var: $ac_val" done # There might be people who depend on the old broken behavior: `$host' @@ -1221,8 +1230,8 @@ if test "x$host_alias" != x; then if test "x$build_alias" = x; then cross_compiling=maybe - $as_echo "$as_me: WARNING: If you wanted to set the --build type, don't use --host. - If a cross compiler is detected then cross compile mode will be used." >&2 + $as_echo "$as_me: WARNING: if you wanted to set the --build type, don't use --host. + If a cross compiler is detected then cross compile mode will be used" >&2 elif test "x$build_alias" != "x$host_alias"; then cross_compiling=yes fi @@ -1237,9 +1246,9 @@ ac_pwd=`pwd` && test -n "$ac_pwd" && ac_ls_di=`ls -di .` && ac_pwd_ls_di=`cd "$ac_pwd" && ls -di .` || - as_fn_error "working directory cannot be determined" + as_fn_error $? "working directory cannot be determined" test "X$ac_ls_di" = "X$ac_pwd_ls_di" || - as_fn_error "pwd does not report name of working directory" + as_fn_error $? "pwd does not report name of working directory" # Find the source files, if location was not specified. @@ -1278,11 +1287,11 @@ fi if test ! -r "$srcdir/$ac_unique_file"; then test "$ac_srcdir_defaulted" = yes && srcdir="$ac_confdir or .." - as_fn_error "cannot find sources ($ac_unique_file) in $srcdir" + as_fn_error $? "cannot find sources ($ac_unique_file) in $srcdir" fi ac_msg="sources are in $srcdir, but \`cd $srcdir' does not work" ac_abs_confdir=`( - cd "$srcdir" && test -r "./$ac_unique_file" || as_fn_error "$ac_msg" + cd "$srcdir" && test -r "./$ac_unique_file" || as_fn_error $? "$ac_msg" pwd)` # When building in place, set srcdir=. if test "$ac_abs_confdir" = "$ac_pwd"; then @@ -1322,7 +1331,7 @@ --help=short display options specific to this package --help=recursive display the short help of all the included packages -V, --version display version information and exit - -q, --quiet, --silent do not print \`checking...' messages + -q, --quiet, --silent do not print \`checking ...' 
messages --cache-file=FILE cache test results in FILE [disabled] -C, --config-cache alias for \`--cache-file=config.cache' -n, --no-create do not create output files @@ -1508,9 +1517,9 @@ if $ac_init_version; then cat <<\_ACEOF python configure 3.2 -generated by GNU Autoconf 2.65 +generated by GNU Autoconf 2.68 -Copyright (C) 2009 Free Software Foundation, Inc. +Copyright (C) 2010 Free Software Foundation, Inc. This configure script is free software; the Free Software Foundation gives unlimited permission to copy, distribute and modify it. _ACEOF @@ -1554,7 +1563,7 @@ ac_retval=1 fi - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno as_fn_set_status $ac_retval } # ac_fn_c_try_compile @@ -1580,7 +1589,7 @@ mv -f conftest.er1 conftest.err fi $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } >/dev/null && { + test $ac_status = 0; } > conftest.i && { test -z "$ac_c_preproc_warn_flag$ac_c_werror_flag" || test ! -s conftest.err }; then : @@ -1591,7 +1600,7 @@ ac_retval=1 fi - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno as_fn_set_status $ac_retval } # ac_fn_c_try_cpp @@ -1604,10 +1613,10 @@ ac_fn_c_check_header_mongrel () { as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack - if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : + if eval \${$3+:} false; then : { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 $as_echo_n "checking for $2... " >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 fi eval ac_res=\$$3 @@ -1643,7 +1652,7 @@ else ac_header_preproc=no fi -rm -f conftest.err conftest.$ac_ext +rm -f conftest.err conftest.i conftest.$ac_ext { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_header_preproc" >&5 $as_echo "$ac_header_preproc" >&6; } @@ -1666,17 +1675,15 @@ $as_echo "$as_me: WARNING: $2: section \"Present But Cannot Be Compiled\"" >&2;} { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $2: proceeding with the compiler's result" >&5 $as_echo "$as_me: WARNING: $2: proceeding with the compiler's result" >&2;} -( cat <<\_ASBOX -## -------------------------------------- ## +( $as_echo "## -------------------------------------- ## ## Report this to http://bugs.python.org/ ## -## -------------------------------------- ## -_ASBOX +## -------------------------------------- ##" ) | sed "s/^/$as_me: WARNING: /" >&2 ;; esac { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 $as_echo_n "checking for $2... 
" >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 else eval "$3=\$ac_header_compiler" @@ -1685,7 +1692,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } fi - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_check_header_mongrel @@ -1726,7 +1733,7 @@ ac_retval=$ac_status fi rm -rf conftest.dSYM conftest_ipa8_conftest.oo - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno as_fn_set_status $ac_retval } # ac_fn_c_try_run @@ -1740,7 +1747,7 @@ as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 $as_echo_n "checking for $2... " >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -1758,7 +1765,7 @@ eval ac_res=\$$3 { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_check_header_compile @@ -1803,7 +1810,7 @@ # interfere with the next link command; also delete a directory that is # left behind by Apple's compiler. We do this before executing the actions. rm -rf conftest.dSYM conftest_ipa8_conftest.oo - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno as_fn_set_status $ac_retval } # ac_fn_c_try_link @@ -1817,7 +1824,7 @@ as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 $as_echo_n "checking for $2... " >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 else eval "$3=no" @@ -1858,7 +1865,7 @@ eval ac_res=\$$3 { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_check_type @@ -1871,7 +1878,7 @@ as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack { $as_echo "$as_me:${as_lineno-$LINENO}: checking for uint$2_t" >&5 $as_echo_n "checking for uint$2_t... 
" >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 else eval "$3=no" @@ -1901,8 +1908,7 @@ esac fi rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext - eval as_val=\$$3 - if test "x$as_val" = x""no; then : + if eval test \"x\$"$3"\" = x"no"; then : else break @@ -1912,7 +1918,7 @@ eval ac_res=\$$3 { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_find_uintX_t @@ -1925,7 +1931,7 @@ as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack { $as_echo "$as_me:${as_lineno-$LINENO}: checking for int$2_t" >&5 $as_echo_n "checking for int$2_t... " >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 else eval "$3=no" @@ -1936,11 +1942,11 @@ cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ $ac_includes_default + enum { N = $2 / 2 - 1 }; int main () { -static int test_array [1 - 2 * !(enum { N = $2 / 2 - 1 }; - 0 < ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1))]; +static int test_array [1 - 2 * !(0 < ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1))]; test_array [0] = 0 ; @@ -1951,11 +1957,11 @@ cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ $ac_includes_default + enum { N = $2 / 2 - 1 }; int main () { -static int test_array [1 - 2 * !(enum { N = $2 / 2 - 1 }; - ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1) +static int test_array [1 - 2 * !(($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1) < ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 2))]; test_array [0] = 0 @@ -1976,8 +1982,7 @@ rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext fi rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext - eval as_val=\$$3 - if test "x$as_val" = x""no; then : + if eval test \"x\$"$3"\" = x"no"; then : else break @@ -1987,7 +1992,7 @@ eval ac_res=\$$3 { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_find_intX_t @@ -2164,7 +2169,7 @@ rm -f conftest.val fi - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno as_fn_set_status $ac_retval } # ac_fn_c_compute_int @@ -2177,7 +2182,7 @@ as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2" >&5 $as_echo_n "checking for $2... " >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -2232,7 +2237,7 @@ eval ac_res=\$$3 { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_check_func @@ -2245,7 +2250,7 @@ as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $2.$3" >&5 $as_echo_n "checking for $2.$3... 
" >&6; } -if { as_var=$4; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$4+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -2289,19 +2294,22 @@ eval ac_res=\$$4 { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_check_member -# ac_fn_c_check_decl LINENO SYMBOL VAR -# ------------------------------------ -# Tests whether SYMBOL is declared, setting cache variable VAR accordingly. +# ac_fn_c_check_decl LINENO SYMBOL VAR INCLUDES +# --------------------------------------------- +# Tests whether SYMBOL is declared in INCLUDES, setting cache variable VAR +# accordingly. ac_fn_c_check_decl () { as_lineno=${as_lineno-"$1"} as_lineno_stack=as_lineno_stack=$as_lineno_stack - { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $2 is declared" >&5 -$as_echo_n "checking whether $2 is declared... " >&6; } -if { as_var=$3; eval "test \"\${$as_var+set}\" = set"; }; then : + as_decl_name=`echo $2|sed 's/ *(.*//'` + as_decl_use=`echo $2|sed -e 's/(/((/' -e 's/)/) 0&/' -e 's/,/) 0& (/g'` + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $as_decl_name is declared" >&5 +$as_echo_n "checking whether $as_decl_name is declared... " >&6; } +if eval \${$3+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -2310,8 +2318,12 @@ int main () { -#ifndef $2 - (void) $2; +#ifndef $as_decl_name +#ifdef __cplusplus + (void) $as_decl_use; +#else + (void) $as_decl_name; +#endif #endif ; @@ -2328,7 +2340,7 @@ eval ac_res=\$$3 { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } - eval $as_lineno_stack; test "x$as_lineno_stack" = x && { as_lineno=; unset as_lineno;} + eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno } # ac_fn_c_check_decl cat >config.log <<_ACEOF @@ -2336,7 +2348,7 @@ running configure, to aid debugging if configure makes a mistake. It was created by python $as_me 3.2, which was -generated by GNU Autoconf 2.65. Invocation command line was +generated by GNU Autoconf 2.68. Invocation command line was $ $0 $@ @@ -2446,11 +2458,9 @@ { echo - cat <<\_ASBOX -## ---------------- ## + $as_echo "## ---------------- ## ## Cache variables. ## -## ---------------- ## -_ASBOX +## ---------------- ##" echo # The following way of writing the cache mishandles newlines in values, ( @@ -2484,11 +2494,9 @@ ) echo - cat <<\_ASBOX -## ----------------- ## + $as_echo "## ----------------- ## ## Output variables. ## -## ----------------- ## -_ASBOX +## ----------------- ##" echo for ac_var in $ac_subst_vars do @@ -2501,11 +2509,9 @@ echo if test -n "$ac_subst_files"; then - cat <<\_ASBOX -## ------------------- ## + $as_echo "## ------------------- ## ## File substitutions. ## -## ------------------- ## -_ASBOX +## ------------------- ##" echo for ac_var in $ac_subst_files do @@ -2519,11 +2525,9 @@ fi if test -s confdefs.h; then - cat <<\_ASBOX -## ----------- ## + $as_echo "## ----------- ## ## confdefs.h. ## -## ----------- ## -_ASBOX +## ----------- ##" echo cat confdefs.h echo @@ -2578,7 +2582,12 @@ ac_site_file1=NONE ac_site_file2=NONE if test -n "$CONFIG_SITE"; then - ac_site_file1=$CONFIG_SITE + # We do not want a PATH search for config.site. 
+ case $CONFIG_SITE in #(( + -*) ac_site_file1=./$CONFIG_SITE;; + */*) ac_site_file1=$CONFIG_SITE;; + *) ac_site_file1=./$CONFIG_SITE;; + esac elif test "x$prefix" != xNONE; then ac_site_file1=$prefix/share/config.site ac_site_file2=$prefix/etc/config.site @@ -2593,7 +2602,11 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: loading site script $ac_site_file" >&5 $as_echo "$as_me: loading site script $ac_site_file" >&6;} sed 's/^/| /' "$ac_site_file" >&5 - . "$ac_site_file" + . "$ac_site_file" \ + || { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 +$as_echo "$as_me: error: in \`$ac_pwd':" >&2;} +as_fn_error $? "failed to load site script $ac_site_file +See \`config.log' for more details" "$LINENO" 5; } fi done @@ -2669,7 +2682,7 @@ $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} { $as_echo "$as_me:${as_lineno-$LINENO}: error: changes in the environment can compromise the build" >&5 $as_echo "$as_me: error: changes in the environment can compromise the build" >&2;} - as_fn_error "run \`make distclean' and/or \`rm $cache_file' and start over" "$LINENO" 5 + as_fn_error $? "run \`make distclean' and/or \`rm $cache_file' and start over" "$LINENO" 5 fi ## -------------------- ## ## Main body of script. ## @@ -2770,7 +2783,7 @@ UNIVERSALSDK=$enableval if test ! -d "${UNIVERSALSDK}" then - as_fn_error "--enable-universalsdk specifies non-existing SDK: ${UNIVERSALSDK}" "$LINENO" 5 + as_fn_error $? "--enable-universalsdk specifies non-existing SDK: ${UNIVERSALSDK}" "$LINENO" 5 fi ;; esac @@ -3162,7 +3175,7 @@ # If the user switches compilers, we can't believe the cache if test ! -z "$ac_cv_prog_CC" -a ! -z "$CC" -a "$CC" != "$ac_cv_prog_CC" then - as_fn_error "cached CC is different -- throw away $cache_file + as_fn_error $? "cached CC is different -- throw away $cache_file (it is also a good idea to do 'make clean' before compiling)" "$LINENO" 5 fi @@ -3182,7 +3195,7 @@ set dummy ${ac_tool_prefix}gcc; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_CC+set}" = set; then : +if ${ac_cv_prog_CC+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$CC"; then @@ -3222,7 +3235,7 @@ set dummy gcc; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_ac_ct_CC+set}" = set; then : +if ${ac_cv_prog_ac_ct_CC+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$ac_ct_CC"; then @@ -3275,7 +3288,7 @@ set dummy ${ac_tool_prefix}cc; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_CC+set}" = set; then : +if ${ac_cv_prog_CC+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$CC"; then @@ -3315,7 +3328,7 @@ set dummy cc; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_CC+set}" = set; then : +if ${ac_cv_prog_CC+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$CC"; then @@ -3374,7 +3387,7 @@ set dummy $ac_tool_prefix$ac_prog; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... 
" >&6; } -if test "${ac_cv_prog_CC+set}" = set; then : +if ${ac_cv_prog_CC+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$CC"; then @@ -3418,7 +3431,7 @@ set dummy $ac_prog; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_ac_ct_CC+set}" = set; then : +if ${ac_cv_prog_ac_ct_CC+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$ac_ct_CC"; then @@ -3472,8 +3485,8 @@ test -z "$CC" && { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -as_fn_error "no acceptable C compiler found in \$PATH -See \`config.log' for more details." "$LINENO" 5; } +as_fn_error $? "no acceptable C compiler found in \$PATH +See \`config.log' for more details" "$LINENO" 5; } # Provide some information about the compiler. $as_echo "$as_me:${as_lineno-$LINENO}: checking for C compiler version" >&5 @@ -3587,9 +3600,8 @@ { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "C compiler cannot create executables -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "C compiler cannot create executables +See \`config.log' for more details" "$LINENO" 5; } else { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 $as_echo "yes" >&6; } @@ -3631,8 +3643,8 @@ else { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -as_fn_error "cannot compute suffix of executables: cannot compile and link -See \`config.log' for more details." "$LINENO" 5; } +as_fn_error $? "cannot compute suffix of executables: cannot compile and link +See \`config.log' for more details" "$LINENO" 5; } fi rm -f conftest conftest$ac_cv_exeext { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_exeext" >&5 @@ -3689,9 +3701,9 @@ else { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -as_fn_error "cannot run C compiled programs. +as_fn_error $? "cannot run C compiled programs. If you meant to cross compile, use \`--host'. -See \`config.log' for more details." "$LINENO" 5; } +See \`config.log' for more details" "$LINENO" 5; } fi fi fi @@ -3702,7 +3714,7 @@ ac_clean_files=$ac_clean_files_save { $as_echo "$as_me:${as_lineno-$LINENO}: checking for suffix of object files" >&5 $as_echo_n "checking for suffix of object files... " >&6; } -if test "${ac_cv_objext+set}" = set; then : +if ${ac_cv_objext+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -3742,8 +3754,8 @@ { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -as_fn_error "cannot compute suffix of object files: cannot compile -See \`config.log' for more details." "$LINENO" 5; } +as_fn_error $? "cannot compute suffix of object files: cannot compile +See \`config.log' for more details" "$LINENO" 5; } fi rm -f conftest.$ac_cv_objext conftest.$ac_ext fi @@ -3753,7 +3765,7 @@ ac_objext=$OBJEXT { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether we are using the GNU C compiler" >&5 $as_echo_n "checking whether we are using the GNU C compiler... 
" >&6; } -if test "${ac_cv_c_compiler_gnu+set}" = set; then : +if ${ac_cv_c_compiler_gnu+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -3790,7 +3802,7 @@ ac_save_CFLAGS=$CFLAGS { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC accepts -g" >&5 $as_echo_n "checking whether $CC accepts -g... " >&6; } -if test "${ac_cv_prog_cc_g+set}" = set; then : +if ${ac_cv_prog_cc_g+:} false; then : $as_echo_n "(cached) " >&6 else ac_save_c_werror_flag=$ac_c_werror_flag @@ -3868,7 +3880,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $CC option to accept ISO C89" >&5 $as_echo_n "checking for $CC option to accept ISO C89... " >&6; } -if test "${ac_cv_prog_cc_c89+set}" = set; then : +if ${ac_cv_prog_cc_c89+:} false; then : $as_echo_n "(cached) " >&6 else ac_cv_prog_cc_c89=no @@ -4003,7 +4015,7 @@ set dummy g++; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_path_CXX+set}" = set; then : +if ${ac_cv_path_CXX+:} false; then : $as_echo_n "(cached) " >&6 else case $CXX in @@ -4044,7 +4056,7 @@ set dummy c++; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_path_CXX+set}" = set; then : +if ${ac_cv_path_CXX+:} false; then : $as_echo_n "(cached) " >&6 else case $CXX in @@ -4095,7 +4107,7 @@ set dummy $ac_prog; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_CXX+set}" = set; then : +if ${ac_cv_prog_CXX+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$CXX"; then @@ -4166,7 +4178,7 @@ CPP= fi if test -z "$CPP"; then - if test "${ac_cv_prog_CPP+set}" = set; then : + if ${ac_cv_prog_CPP+:} false; then : $as_echo_n "(cached) " >&6 else # Double quotes because CPP needs to be expanded @@ -4196,7 +4208,7 @@ # Broken: fails on valid input. continue fi -rm -f conftest.err conftest.$ac_ext +rm -f conftest.err conftest.i conftest.$ac_ext # OK, works on sane cases. Now check whether nonexistent headers # can be detected and how. @@ -4212,11 +4224,11 @@ ac_preproc_ok=: break fi -rm -f conftest.err conftest.$ac_ext +rm -f conftest.err conftest.i conftest.$ac_ext done # Because of `break', _AC_PREPROC_IFELSE's cleaning code was skipped. -rm -f conftest.err conftest.$ac_ext +rm -f conftest.i conftest.err conftest.$ac_ext if $ac_preproc_ok; then : break fi @@ -4255,7 +4267,7 @@ # Broken: fails on valid input. continue fi -rm -f conftest.err conftest.$ac_ext +rm -f conftest.err conftest.i conftest.$ac_ext # OK, works on sane cases. Now check whether nonexistent headers # can be detected and how. @@ -4271,18 +4283,18 @@ ac_preproc_ok=: break fi -rm -f conftest.err conftest.$ac_ext +rm -f conftest.err conftest.i conftest.$ac_ext done # Because of `break', _AC_PREPROC_IFELSE's cleaning code was skipped. -rm -f conftest.err conftest.$ac_ext +rm -f conftest.i conftest.err conftest.$ac_ext if $ac_preproc_ok; then : else { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -as_fn_error "C preprocessor \"$CPP\" fails sanity check -See \`config.log' for more details." "$LINENO" 5; } +as_fn_error $? 
"C preprocessor \"$CPP\" fails sanity check +See \`config.log' for more details" "$LINENO" 5; } fi ac_ext=c @@ -4294,7 +4306,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for grep that handles long lines and -e" >&5 $as_echo_n "checking for grep that handles long lines and -e... " >&6; } -if test "${ac_cv_path_GREP+set}" = set; then : +if ${ac_cv_path_GREP+:} false; then : $as_echo_n "(cached) " >&6 else if test -z "$GREP"; then @@ -4343,7 +4355,7 @@ done IFS=$as_save_IFS if test -z "$ac_cv_path_GREP"; then - as_fn_error "no acceptable grep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin" "$LINENO" 5 + as_fn_error $? "no acceptable grep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin" "$LINENO" 5 fi else ac_cv_path_GREP=$GREP @@ -4357,7 +4369,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for egrep" >&5 $as_echo_n "checking for egrep... " >&6; } -if test "${ac_cv_path_EGREP+set}" = set; then : +if ${ac_cv_path_EGREP+:} false; then : $as_echo_n "(cached) " >&6 else if echo a | $GREP -E '(a|b)' >/dev/null 2>&1 @@ -4409,7 +4421,7 @@ done IFS=$as_save_IFS if test -z "$ac_cv_path_EGREP"; then - as_fn_error "no acceptable egrep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin" "$LINENO" 5 + as_fn_error $? "no acceptable egrep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin" "$LINENO" 5 fi else ac_cv_path_EGREP=$EGREP @@ -4424,7 +4436,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for ANSI C header files" >&5 $as_echo_n "checking for ANSI C header files... " >&6; } -if test "${ac_cv_header_stdc+set}" = set; then : +if ${ac_cv_header_stdc+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -4541,8 +4553,7 @@ as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh` ac_fn_c_check_header_compile "$LINENO" "$ac_header" "$as_ac_Header" "$ac_includes_default " -eval as_val=\$$as_ac_Header - if test "x$as_val" = x""yes; then : +if eval test \"x\$"$as_ac_Header"\" = x"yes"; then : cat >>confdefs.h <<_ACEOF #define `$as_echo "HAVE_$ac_header" | $as_tr_cpp` 1 _ACEOF @@ -4554,7 +4565,7 @@ ac_fn_c_check_header_mongrel "$LINENO" "minix/config.h" "ac_cv_header_minix_config_h" "$ac_includes_default" -if test "x$ac_cv_header_minix_config_h" = x""yes; then : +if test "x$ac_cv_header_minix_config_h" = xyes; then : MINIX=yes else MINIX= @@ -4576,7 +4587,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether it is safe to define __EXTENSIONS__" >&5 $as_echo_n "checking whether it is safe to define __EXTENSIONS__... " >&6; } -if test "${ac_cv_safe_to_define___extensions__+set}" = set; then : +if ${ac_cv_safe_to_define___extensions__+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -4769,7 +4780,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for inline" >&5 $as_echo_n "checking for inline... " >&6; } -if test "${ac_cv_c_inline+set}" = set; then : +if ${ac_cv_c_inline+:} false; then : $as_echo_n "(cached) " >&6 else ac_cv_c_inline=no @@ -4905,7 +4916,7 @@ BLDLIBRARY='-Wl,-R,$(LIBDIR) -L. 
-lpython$(LDVERSION)' RUNSHARED=LD_LIBRARY_PATH=`pwd`:${LD_LIBRARY_PATH} INSTSONAME="$LDLIBRARY".$SOVERSION - if test $with_pydebug == no + if test "$with_pydebug" != yes then PY3LIBRARY=libpython3.so fi @@ -4920,8 +4931,7 @@ ;; esac INSTSONAME="$LDLIBRARY".$SOVERSION - PY3LIBRARY=libpython3.so - if test $with_pydebug == no + if test "$with_pydebug" != yes then PY3LIBRARY=libpython3.so fi @@ -4971,7 +4981,7 @@ set dummy ${ac_tool_prefix}ranlib; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_RANLIB+set}" = set; then : +if ${ac_cv_prog_RANLIB+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$RANLIB"; then @@ -5011,7 +5021,7 @@ set dummy ranlib; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_ac_ct_RANLIB+set}" = set; then : +if ${ac_cv_prog_ac_ct_RANLIB+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$ac_ct_RANLIB"; then @@ -5065,7 +5075,7 @@ set dummy $ac_prog; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_AR+set}" = set; then : +if ${ac_cv_prog_AR+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$AR"; then @@ -5115,7 +5125,7 @@ set dummy svnversion; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_SVNVERSION+set}" = set; then : +if ${ac_cv_prog_SVNVERSION+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$SVNVERSION"; then @@ -5166,16 +5176,22 @@ esac ac_aux_dir= for ac_dir in "$srcdir" "$srcdir/.." "$srcdir/../.."; do - for ac_t in install-sh install.sh shtool; do - if test -f "$ac_dir/$ac_t"; then - ac_aux_dir=$ac_dir - ac_install_sh="$ac_aux_dir/$ac_t -c" - break 2 - fi - done + if test -f "$ac_dir/install-sh"; then + ac_aux_dir=$ac_dir + ac_install_sh="$ac_aux_dir/install-sh -c" + break + elif test -f "$ac_dir/install.sh"; then + ac_aux_dir=$ac_dir + ac_install_sh="$ac_aux_dir/install.sh -c" + break + elif test -f "$ac_dir/shtool"; then + ac_aux_dir=$ac_dir + ac_install_sh="$ac_aux_dir/shtool install -c" + break + fi done if test -z "$ac_aux_dir"; then - as_fn_error "cannot find install-sh, install.sh, or shtool in \"$srcdir\" \"$srcdir/..\" \"$srcdir/../..\"" "$LINENO" 5 + as_fn_error $? "cannot find install-sh, install.sh, or shtool in \"$srcdir\" \"$srcdir/..\" \"$srcdir/../..\"" "$LINENO" 5 fi # These three variables are undocumented and unsupported, @@ -5204,7 +5220,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for a BSD-compatible install" >&5 $as_echo_n "checking for a BSD-compatible install... " >&6; } if test -z "$INSTALL"; then -if test "${ac_cv_path_install+set}" = set; then : +if ${ac_cv_path_install+:} false; then : $as_echo_n "(cached) " >&6 else as_save_IFS=$IFS; IFS=$PATH_SEPARATOR @@ -5390,7 +5406,7 @@ ac_save_cc="$CC" CC="$CC -fno-strict-aliasing" save_CFLAGS="$CFLAGS" - if test "${ac_cv_no_strict_aliasing+set}" = set; then : + if ${ac_cv_no_strict_aliasing+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -5513,7 +5529,7 @@ ARCH_RUN_32BIT="/usr/bin/arch -i386 -ppc" else - as_fn_error "proper usage is --with-universal-arch=32-bit|64-bit|all|intel|3-way" "$LINENO" 5 + as_fn_error $? 
"proper usage is --with-universal-arch=32-bit|64-bit|all|intel|3-way" "$LINENO" 5 fi @@ -5646,7 +5662,7 @@ # options before we can check whether -Kpthread improves anything. { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether pthreads are available without options" >&5 $as_echo_n "checking whether pthreads are available without options... " >&6; } -if test "${ac_cv_pthread_is_default+set}" = set; then : +if ${ac_cv_pthread_is_default+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -5699,7 +5715,7 @@ # function available. { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC accepts -Kpthread" >&5 $as_echo_n "checking whether $CC accepts -Kpthread... " >&6; } -if test "${ac_cv_kpthread+set}" = set; then : +if ${ac_cv_kpthread+:} false; then : $as_echo_n "(cached) " >&6 else ac_save_cc="$CC" @@ -5748,7 +5764,7 @@ # function available. { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC accepts -Kthread" >&5 $as_echo_n "checking whether $CC accepts -Kthread... " >&6; } -if test "${ac_cv_kthread+set}" = set; then : +if ${ac_cv_kthread+:} false; then : $as_echo_n "(cached) " >&6 else ac_save_cc="$CC" @@ -5797,7 +5813,7 @@ # function available. { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC accepts -pthread" >&5 $as_echo_n "checking whether $CC accepts -pthread... " >&6; } -if test "${ac_cv_thread+set}" = set; then : +if ${ac_cv_thread+:} false; then : $as_echo_n "(cached) " >&6 else ac_save_cc="$CC" @@ -5882,7 +5898,7 @@ # checks for header files { $as_echo "$as_me:${as_lineno-$LINENO}: checking for ANSI C header files" >&5 $as_echo_n "checking for ANSI C header files... " >&6; } -if test "${ac_cv_header_stdc+set}" = set; then : +if ${ac_cv_header_stdc+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -6007,8 +6023,7 @@ do : as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh` ac_fn_c_check_header_mongrel "$LINENO" "$ac_header" "$as_ac_Header" "$ac_includes_default" -eval as_val=\$$as_ac_Header - if test "x$as_val" = x""yes; then : +if eval test \"x\$"$as_ac_Header"\" = x"yes"; then : cat >>confdefs.h <<_ACEOF #define `$as_echo "HAVE_$ac_header" | $as_tr_cpp` 1 _ACEOF @@ -6022,7 +6037,7 @@ as_ac_Header=`$as_echo "ac_cv_header_dirent_$ac_hdr" | $as_tr_sh` { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_hdr that defines DIR" >&5 $as_echo_n "checking for $ac_hdr that defines DIR... " >&6; } -if { as_var=$as_ac_Header; eval "test \"\${$as_var+set}\" = set"; }; then : +if eval \${$as_ac_Header+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -6049,8 +6064,7 @@ eval ac_res=\$$as_ac_Header { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 $as_echo "$ac_res" >&6; } -eval as_val=\$$as_ac_Header - if test "x$as_val" = x""yes; then : +if eval test \"x\$"$as_ac_Header"\" = x"yes"; then : cat >>confdefs.h <<_ACEOF #define `$as_echo "HAVE_$ac_hdr" | $as_tr_cpp` 1 _ACEOF @@ -6063,7 +6077,7 @@ if test $ac_header_dirent = dirent.h; then { $as_echo "$as_me:${as_lineno-$LINENO}: checking for library containing opendir" >&5 $as_echo_n "checking for library containing opendir... 
" >&6; } -if test "${ac_cv_search_opendir+set}" = set; then : +if ${ac_cv_search_opendir+:} false; then : $as_echo_n "(cached) " >&6 else ac_func_search_save_LIBS=$LIBS @@ -6097,11 +6111,11 @@ fi rm -f core conftest.err conftest.$ac_objext \ conftest$ac_exeext - if test "${ac_cv_search_opendir+set}" = set; then : + if ${ac_cv_search_opendir+:} false; then : break fi done -if test "${ac_cv_search_opendir+set}" = set; then : +if ${ac_cv_search_opendir+:} false; then : else ac_cv_search_opendir=no @@ -6120,7 +6134,7 @@ else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for library containing opendir" >&5 $as_echo_n "checking for library containing opendir... " >&6; } -if test "${ac_cv_search_opendir+set}" = set; then : +if ${ac_cv_search_opendir+:} false; then : $as_echo_n "(cached) " >&6 else ac_func_search_save_LIBS=$LIBS @@ -6154,11 +6168,11 @@ fi rm -f core conftest.err conftest.$ac_objext \ conftest$ac_exeext - if test "${ac_cv_search_opendir+set}" = set; then : + if ${ac_cv_search_opendir+:} false; then : break fi done -if test "${ac_cv_search_opendir+set}" = set; then : +if ${ac_cv_search_opendir+:} false; then : else ac_cv_search_opendir=no @@ -6178,7 +6192,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether sys/types.h defines makedev" >&5 $as_echo_n "checking whether sys/types.h defines makedev... " >&6; } -if test "${ac_cv_header_sys_types_h_makedev+set}" = set; then : +if ${ac_cv_header_sys_types_h_makedev+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -6206,7 +6220,7 @@ if test $ac_cv_header_sys_types_h_makedev = no; then ac_fn_c_check_header_mongrel "$LINENO" "sys/mkdev.h" "ac_cv_header_sys_mkdev_h" "$ac_includes_default" -if test "x$ac_cv_header_sys_mkdev_h" = x""yes; then : +if test "x$ac_cv_header_sys_mkdev_h" = xyes; then : $as_echo "#define MAJOR_IN_MKDEV 1" >>confdefs.h @@ -6216,7 +6230,7 @@ if test $ac_cv_header_sys_mkdev_h = no; then ac_fn_c_check_header_mongrel "$LINENO" "sys/sysmacros.h" "ac_cv_header_sys_sysmacros_h" "$ac_includes_default" -if test "x$ac_cv_header_sys_sysmacros_h" = x""yes; then : +if test "x$ac_cv_header_sys_sysmacros_h" = xyes; then : $as_echo "#define MAJOR_IN_SYSMACROS 1" >>confdefs.h @@ -6236,7 +6250,7 @@ #endif " -if test "x$ac_cv_header_term_h" = x""yes; then : +if test "x$ac_cv_header_term_h" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_TERM_H 1 _ACEOF @@ -6258,7 +6272,7 @@ #endif " -if test "x$ac_cv_header_linux_netlink_h" = x""yes; then : +if test "x$ac_cv_header_linux_netlink_h" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_LINUX_NETLINK_H 1 _ACEOF @@ -6417,7 +6431,7 @@ # Type availability checks ac_fn_c_check_type "$LINENO" "mode_t" "ac_cv_type_mode_t" "$ac_includes_default" -if test "x$ac_cv_type_mode_t" = x""yes; then : +if test "x$ac_cv_type_mode_t" = xyes; then : else @@ -6428,7 +6442,7 @@ fi ac_fn_c_check_type "$LINENO" "off_t" "ac_cv_type_off_t" "$ac_includes_default" -if test "x$ac_cv_type_off_t" = x""yes; then : +if test "x$ac_cv_type_off_t" = xyes; then : else @@ -6439,7 +6453,7 @@ fi ac_fn_c_check_type "$LINENO" "pid_t" "ac_cv_type_pid_t" "$ac_includes_default" -if test "x$ac_cv_type_pid_t" = x""yes; then : +if test "x$ac_cv_type_pid_t" = xyes; then : else @@ -6455,7 +6469,7 @@ _ACEOF ac_fn_c_check_type "$LINENO" "size_t" "ac_cv_type_size_t" "$ac_includes_default" -if test "x$ac_cv_type_size_t" = x""yes; then : +if test "x$ac_cv_type_size_t" = xyes; then : else @@ -6467,7 +6481,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for uid_t in 
sys/types.h" >&5 $as_echo_n "checking for uid_t in sys/types.h... " >&6; } -if test "${ac_cv_type_uid_t+set}" = set; then : +if ${ac_cv_type_uid_t+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -6546,7 +6560,7 @@ esac ac_fn_c_check_type "$LINENO" "ssize_t" "ac_cv_type_ssize_t" "$ac_includes_default" -if test "x$ac_cv_type_ssize_t" = x""yes; then : +if test "x$ac_cv_type_ssize_t" = xyes; then : $as_echo "#define HAVE_SSIZE_T 1" >>confdefs.h @@ -6561,7 +6575,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of int" >&5 $as_echo_n "checking size of int... " >&6; } -if test "${ac_cv_sizeof_int+set}" = set; then : +if ${ac_cv_sizeof_int+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (int))" "ac_cv_sizeof_int" "$ac_includes_default"; then : @@ -6570,9 +6584,8 @@ if test "$ac_cv_type_int" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (int) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (int) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_int=0 fi @@ -6595,7 +6608,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of long" >&5 $as_echo_n "checking size of long... " >&6; } -if test "${ac_cv_sizeof_long+set}" = set; then : +if ${ac_cv_sizeof_long+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (long))" "ac_cv_sizeof_long" "$ac_includes_default"; then : @@ -6604,9 +6617,8 @@ if test "$ac_cv_type_long" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (long) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (long) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_long=0 fi @@ -6629,7 +6641,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of void *" >&5 $as_echo_n "checking size of void *... " >&6; } -if test "${ac_cv_sizeof_void_p+set}" = set; then : +if ${ac_cv_sizeof_void_p+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (void *))" "ac_cv_sizeof_void_p" "$ac_includes_default"; then : @@ -6638,9 +6650,8 @@ if test "$ac_cv_type_void_p" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (void *) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (void *) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_void_p=0 fi @@ -6663,7 +6674,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of short" >&5 $as_echo_n "checking size of short... 
" >&6; } -if test "${ac_cv_sizeof_short+set}" = set; then : +if ${ac_cv_sizeof_short+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (short))" "ac_cv_sizeof_short" "$ac_includes_default"; then : @@ -6672,9 +6683,8 @@ if test "$ac_cv_type_short" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (short) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (short) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_short=0 fi @@ -6697,7 +6707,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of float" >&5 $as_echo_n "checking size of float... " >&6; } -if test "${ac_cv_sizeof_float+set}" = set; then : +if ${ac_cv_sizeof_float+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (float))" "ac_cv_sizeof_float" "$ac_includes_default"; then : @@ -6706,9 +6716,8 @@ if test "$ac_cv_type_float" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (float) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (float) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_float=0 fi @@ -6731,7 +6740,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of double" >&5 $as_echo_n "checking size of double... " >&6; } -if test "${ac_cv_sizeof_double+set}" = set; then : +if ${ac_cv_sizeof_double+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (double))" "ac_cv_sizeof_double" "$ac_includes_default"; then : @@ -6740,9 +6749,8 @@ if test "$ac_cv_type_double" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (double) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (double) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_double=0 fi @@ -6765,7 +6773,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of fpos_t" >&5 $as_echo_n "checking size of fpos_t... " >&6; } -if test "${ac_cv_sizeof_fpos_t+set}" = set; then : +if ${ac_cv_sizeof_fpos_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (fpos_t))" "ac_cv_sizeof_fpos_t" "$ac_includes_default"; then : @@ -6774,9 +6782,8 @@ if test "$ac_cv_type_fpos_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (fpos_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (fpos_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_fpos_t=0 fi @@ -6799,7 +6806,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of size_t" >&5 $as_echo_n "checking size of size_t... 
" >&6; } -if test "${ac_cv_sizeof_size_t+set}" = set; then : +if ${ac_cv_sizeof_size_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (size_t))" "ac_cv_sizeof_size_t" "$ac_includes_default"; then : @@ -6808,9 +6815,8 @@ if test "$ac_cv_type_size_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (size_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (size_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_size_t=0 fi @@ -6833,7 +6839,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of pid_t" >&5 $as_echo_n "checking size of pid_t... " >&6; } -if test "${ac_cv_sizeof_pid_t+set}" = set; then : +if ${ac_cv_sizeof_pid_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (pid_t))" "ac_cv_sizeof_pid_t" "$ac_includes_default"; then : @@ -6842,9 +6848,8 @@ if test "$ac_cv_type_pid_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (pid_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (pid_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_pid_t=0 fi @@ -6894,7 +6899,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of long long" >&5 $as_echo_n "checking size of long long... " >&6; } -if test "${ac_cv_sizeof_long_long+set}" = set; then : +if ${ac_cv_sizeof_long_long+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (long long))" "ac_cv_sizeof_long_long" "$ac_includes_default"; then : @@ -6903,9 +6908,8 @@ if test "$ac_cv_type_long_long" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (long long) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (long long) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_long_long=0 fi @@ -6956,7 +6960,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of long double" >&5 $as_echo_n "checking size of long double... " >&6; } -if test "${ac_cv_sizeof_long_double+set}" = set; then : +if ${ac_cv_sizeof_long_double+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (long double))" "ac_cv_sizeof_long_double" "$ac_includes_default"; then : @@ -6965,9 +6969,8 @@ if test "$ac_cv_type_long_double" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (long double) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (long double) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_long_double=0 fi @@ -7019,7 +7022,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of _Bool" >&5 $as_echo_n "checking size of _Bool... 
" >&6; } -if test "${ac_cv_sizeof__Bool+set}" = set; then : +if ${ac_cv_sizeof__Bool+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (_Bool))" "ac_cv_sizeof__Bool" "$ac_includes_default"; then : @@ -7028,9 +7031,8 @@ if test "$ac_cv_type__Bool" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (_Bool) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (_Bool) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof__Bool=0 fi @@ -7056,7 +7058,7 @@ #include #endif " -if test "x$ac_cv_type_uintptr_t" = x""yes; then : +if test "x$ac_cv_type_uintptr_t" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_UINTPTR_T 1 @@ -7068,7 +7070,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of uintptr_t" >&5 $as_echo_n "checking size of uintptr_t... " >&6; } -if test "${ac_cv_sizeof_uintptr_t+set}" = set; then : +if ${ac_cv_sizeof_uintptr_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (uintptr_t))" "ac_cv_sizeof_uintptr_t" "$ac_includes_default"; then : @@ -7077,9 +7079,8 @@ if test "$ac_cv_type_uintptr_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (uintptr_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (uintptr_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_uintptr_t=0 fi @@ -7105,7 +7106,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of off_t" >&5 $as_echo_n "checking size of off_t... " >&6; } -if test "${ac_cv_sizeof_off_t+set}" = set; then : +if ${ac_cv_sizeof_off_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (off_t))" "ac_cv_sizeof_off_t" " @@ -7119,9 +7120,8 @@ if test "$ac_cv_type_off_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (off_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (off_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_off_t=0 fi @@ -7165,7 +7165,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of time_t" >&5 $as_echo_n "checking size of time_t... " >&6; } -if test "${ac_cv_sizeof_time_t+set}" = set; then : +if ${ac_cv_sizeof_time_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (time_t))" "ac_cv_sizeof_time_t" " @@ -7182,9 +7182,8 @@ if test "$ac_cv_type_time_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (time_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (time_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_time_t=0 fi @@ -7241,7 +7240,7 @@ # This bug is HP SR number 8606223364. 
{ $as_echo "$as_me:${as_lineno-$LINENO}: checking size of pthread_t" >&5 $as_echo_n "checking size of pthread_t... " >&6; } -if test "${ac_cv_sizeof_pthread_t+set}" = set; then : +if ${ac_cv_sizeof_pthread_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (pthread_t))" "ac_cv_sizeof_pthread_t" " @@ -7255,9 +7254,8 @@ if test "$ac_cv_type_pthread_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (pthread_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (pthread_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_pthread_t=0 fi @@ -7344,7 +7342,7 @@ MACOSX_DEFAULT_ARCH="ppc" ;; *) - as_fn_error "Unexpected output of 'arch' on OSX" "$LINENO" 5 + as_fn_error $? "Unexpected output of 'arch' on OSX" "$LINENO" 5 ;; esac else @@ -7356,7 +7354,7 @@ MACOSX_DEFAULT_ARCH="ppc64" ;; *) - as_fn_error "Unexpected output of 'arch' on OSX" "$LINENO" 5 + as_fn_error $? "Unexpected output of 'arch' on OSX" "$LINENO" 5 ;; esac @@ -7382,7 +7380,7 @@ $as_echo "yes" >&6; } if test $enable_shared = "yes" then - as_fn_error "Specifying both --enable-shared and --enable-framework is not supported, use only --enable-framework instead" "$LINENO" 5 + as_fn_error $? "Specifying both --enable-shared and --enable-framework is not supported, use only --enable-framework instead" "$LINENO" 5 fi else { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 @@ -7689,7 +7687,7 @@ # checks for libraries { $as_echo "$as_me:${as_lineno-$LINENO}: checking for dlopen in -ldl" >&5 $as_echo_n "checking for dlopen in -ldl... " >&6; } -if test "${ac_cv_lib_dl_dlopen+set}" = set; then : +if ${ac_cv_lib_dl_dlopen+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -7723,7 +7721,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dl_dlopen" >&5 $as_echo "$ac_cv_lib_dl_dlopen" >&6; } -if test "x$ac_cv_lib_dl_dlopen" = x""yes; then : +if test "x$ac_cv_lib_dl_dlopen" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_LIBDL 1 _ACEOF @@ -7734,7 +7732,7 @@ # Dynamic linking for SunOS/Solaris and SYSV { $as_echo "$as_me:${as_lineno-$LINENO}: checking for shl_load in -ldld" >&5 $as_echo_n "checking for shl_load in -ldld... " >&6; } -if test "${ac_cv_lib_dld_shl_load+set}" = set; then : +if ${ac_cv_lib_dld_shl_load+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -7768,7 +7766,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dld_shl_load" >&5 $as_echo "$ac_cv_lib_dld_shl_load" >&6; } -if test "x$ac_cv_lib_dld_shl_load" = x""yes; then : +if test "x$ac_cv_lib_dld_shl_load" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_LIBDLD 1 _ACEOF @@ -7782,7 +7780,7 @@ if test "$with_threads" = "yes" -o -z "$with_threads"; then { $as_echo "$as_me:${as_lineno-$LINENO}: checking for library containing sem_init" >&5 $as_echo_n "checking for library containing sem_init... 
" >&6; } -if test "${ac_cv_search_sem_init+set}" = set; then : +if ${ac_cv_search_sem_init+:} false; then : $as_echo_n "(cached) " >&6 else ac_func_search_save_LIBS=$LIBS @@ -7816,11 +7814,11 @@ fi rm -f core conftest.err conftest.$ac_objext \ conftest$ac_exeext - if test "${ac_cv_search_sem_init+set}" = set; then : + if ${ac_cv_search_sem_init+:} false; then : break fi done -if test "${ac_cv_search_sem_init+set}" = set; then : +if ${ac_cv_search_sem_init+:} false; then : else ac_cv_search_sem_init=no @@ -7843,7 +7841,7 @@ # check if we need libintl for locale functions { $as_echo "$as_me:${as_lineno-$LINENO}: checking for textdomain in -lintl" >&5 $as_echo_n "checking for textdomain in -lintl... " >&6; } -if test "${ac_cv_lib_intl_textdomain+set}" = set; then : +if ${ac_cv_lib_intl_textdomain+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -7877,7 +7875,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_intl_textdomain" >&5 $as_echo "$ac_cv_lib_intl_textdomain" >&6; } -if test "x$ac_cv_lib_intl_textdomain" = x""yes; then : +if test "x$ac_cv_lib_intl_textdomain" = xyes; then : $as_echo "#define WITH_LIBINTL 1" >>confdefs.h @@ -7924,7 +7922,7 @@ # Most SVR4 platforms (e.g. Solaris) need -lsocket and -lnsl. { $as_echo "$as_me:${as_lineno-$LINENO}: checking for t_open in -lnsl" >&5 $as_echo_n "checking for t_open in -lnsl... " >&6; } -if test "${ac_cv_lib_nsl_t_open+set}" = set; then : +if ${ac_cv_lib_nsl_t_open+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -7958,13 +7956,13 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_nsl_t_open" >&5 $as_echo "$ac_cv_lib_nsl_t_open" >&6; } -if test "x$ac_cv_lib_nsl_t_open" = x""yes; then : +if test "x$ac_cv_lib_nsl_t_open" = xyes; then : LIBS="-lnsl $LIBS" fi # SVR4 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for socket in -lsocket" >&5 $as_echo_n "checking for socket in -lsocket... " >&6; } -if test "${ac_cv_lib_socket_socket+set}" = set; then : +if ${ac_cv_lib_socket_socket+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -7998,7 +7996,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_socket_socket" >&5 $as_echo "$ac_cv_lib_socket_socket" >&6; } -if test "x$ac_cv_lib_socket_socket" = x""yes; then : +if test "x$ac_cv_lib_socket_socket" = xyes; then : LIBS="-lsocket $LIBS" fi # SVR4 sockets @@ -8024,7 +8022,7 @@ set dummy ${ac_tool_prefix}pkg-config; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_path_PKG_CONFIG+set}" = set; then : +if ${ac_cv_path_PKG_CONFIG+:} false; then : $as_echo_n "(cached) " >&6 else case $PKG_CONFIG in @@ -8067,7 +8065,7 @@ set dummy pkg-config; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_path_ac_pt_PKG_CONFIG+set}" = set; then : +if ${ac_cv_path_ac_pt_PKG_CONFIG+:} false; then : $as_echo_n "(cached) " >&6 else case $ac_pt_PKG_CONFIG in @@ -8178,12 +8176,12 @@ withval=$with_dbmliborder; if test x$with_dbmliborder = xyes then -as_fn_error "proper usage is --with-dbmliborder=db1:db2:..." "$LINENO" 5 +as_fn_error $? "proper usage is --with-dbmliborder=db1:db2:..." "$LINENO" 5 else for db in `echo $with_dbmliborder | sed 's/:/ /g'`; do if test x$db != xndbm && test x$db != xgdbm && test x$db != xbdb then - as_fn_error "proper usage is --with-dbmliborder=db1:db2:..." 
"$LINENO" 5 + as_fn_error $? "proper usage is --with-dbmliborder=db1:db2:..." "$LINENO" 5 fi done fi @@ -8349,7 +8347,7 @@ $as_echo "#define _REENTRANT 1" >>confdefs.h ac_fn_c_check_header_mongrel "$LINENO" "cthreads.h" "ac_cv_header_cthreads_h" "$ac_includes_default" -if test "x$ac_cv_header_cthreads_h" = x""yes; then : +if test "x$ac_cv_header_cthreads_h" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h $as_echo "#define C_THREADS 1" >>confdefs.h @@ -8362,7 +8360,7 @@ else ac_fn_c_check_header_mongrel "$LINENO" "mach/cthreads.h" "ac_cv_header_mach_cthreads_h" "$ac_includes_default" -if test "x$ac_cv_header_mach_cthreads_h" = x""yes; then : +if test "x$ac_cv_header_mach_cthreads_h" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h $as_echo "#define C_THREADS 1" >>confdefs.h @@ -8406,7 +8404,7 @@ LIBS=$_libs ac_fn_c_check_func "$LINENO" "pthread_detach" "ac_cv_func_pthread_detach" -if test "x$ac_cv_func_pthread_detach" = x""yes; then : +if test "x$ac_cv_func_pthread_detach" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h posix_threads=yes @@ -8415,7 +8413,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for pthread_create in -lpthreads" >&5 $as_echo_n "checking for pthread_create in -lpthreads... " >&6; } -if test "${ac_cv_lib_pthreads_pthread_create+set}" = set; then : +if ${ac_cv_lib_pthreads_pthread_create+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -8449,7 +8447,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_pthreads_pthread_create" >&5 $as_echo "$ac_cv_lib_pthreads_pthread_create" >&6; } -if test "x$ac_cv_lib_pthreads_pthread_create" = x""yes; then : +if test "x$ac_cv_lib_pthreads_pthread_create" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h posix_threads=yes @@ -8459,7 +8457,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for pthread_create in -lc_r" >&5 $as_echo_n "checking for pthread_create in -lc_r... " >&6; } -if test "${ac_cv_lib_c_r_pthread_create+set}" = set; then : +if ${ac_cv_lib_c_r_pthread_create+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -8493,7 +8491,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_c_r_pthread_create" >&5 $as_echo "$ac_cv_lib_c_r_pthread_create" >&6; } -if test "x$ac_cv_lib_c_r_pthread_create" = x""yes; then : +if test "x$ac_cv_lib_c_r_pthread_create" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h posix_threads=yes @@ -8503,7 +8501,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for __pthread_create_system in -lpthread" >&5 $as_echo_n "checking for __pthread_create_system in -lpthread... " >&6; } -if test "${ac_cv_lib_pthread___pthread_create_system+set}" = set; then : +if ${ac_cv_lib_pthread___pthread_create_system+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -8537,7 +8535,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_pthread___pthread_create_system" >&5 $as_echo "$ac_cv_lib_pthread___pthread_create_system" >&6; } -if test "x$ac_cv_lib_pthread___pthread_create_system" = x""yes; then : +if test "x$ac_cv_lib_pthread___pthread_create_system" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h posix_threads=yes @@ -8547,7 +8545,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for pthread_create in -lcma" >&5 $as_echo_n "checking for pthread_create in -lcma... 
" >&6; } -if test "${ac_cv_lib_cma_pthread_create+set}" = set; then : +if ${ac_cv_lib_cma_pthread_create+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -8581,7 +8579,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_cma_pthread_create" >&5 $as_echo "$ac_cv_lib_cma_pthread_create" >&6; } -if test "x$ac_cv_lib_cma_pthread_create" = x""yes; then : +if test "x$ac_cv_lib_cma_pthread_create" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h posix_threads=yes @@ -8613,7 +8611,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for usconfig in -lmpc" >&5 $as_echo_n "checking for usconfig in -lmpc... " >&6; } -if test "${ac_cv_lib_mpc_usconfig+set}" = set; then : +if ${ac_cv_lib_mpc_usconfig+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -8647,7 +8645,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_mpc_usconfig" >&5 $as_echo "$ac_cv_lib_mpc_usconfig" >&6; } -if test "x$ac_cv_lib_mpc_usconfig" = x""yes; then : +if test "x$ac_cv_lib_mpc_usconfig" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h LIBS="$LIBS -lmpc" @@ -8659,7 +8657,7 @@ if test "$posix_threads" != "yes"; then { $as_echo "$as_me:${as_lineno-$LINENO}: checking for thr_create in -lthread" >&5 $as_echo_n "checking for thr_create in -lthread... " >&6; } -if test "${ac_cv_lib_thread_thr_create+set}" = set; then : +if ${ac_cv_lib_thread_thr_create+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -8693,7 +8691,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_thread_thr_create" >&5 $as_echo "$ac_cv_lib_thread_thr_create" >&6; } -if test "x$ac_cv_lib_thread_thr_create" = x""yes; then : +if test "x$ac_cv_lib_thread_thr_create" = xyes; then : $as_echo "#define WITH_THREAD 1" >>confdefs.h LIBS="$LIBS -lthread" @@ -8742,7 +8740,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking if PTHREAD_SCOPE_SYSTEM is supported" >&5 $as_echo_n "checking if PTHREAD_SCOPE_SYSTEM is supported... " >&6; } - if test "${ac_cv_pthread_system_supported+set}" = set; then : + if ${ac_cv_pthread_system_supported+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -8785,7 +8783,7 @@ for ac_func in pthread_sigmask do : ac_fn_c_check_func "$LINENO" "pthread_sigmask" "ac_cv_func_pthread_sigmask" -if test "x$ac_cv_func_pthread_sigmask" = x""yes; then : +if test "x$ac_cv_func_pthread_sigmask" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_PTHREAD_SIGMASK 1 _ACEOF @@ -9177,12 +9175,12 @@ $as_echo "$with_valgrind" >&6; } if test "$with_valgrind" != no; then ac_fn_c_check_header_mongrel "$LINENO" "valgrind/valgrind.h" "ac_cv_header_valgrind_valgrind_h" "$ac_includes_default" -if test "x$ac_cv_header_valgrind_valgrind_h" = x""yes; then : +if test "x$ac_cv_header_valgrind_valgrind_h" = xyes; then : $as_echo "#define WITH_VALGRIND 1" >>confdefs.h else - as_fn_error "Valgrind support requested but headers not available" "$LINENO" 5 + as_fn_error $? 
"Valgrind support requested but headers not available" "$LINENO" 5 fi @@ -9199,7 +9197,7 @@ for ac_func in dlopen do : ac_fn_c_check_func "$LINENO" "dlopen" "ac_cv_func_dlopen" -if test "x$ac_cv_func_dlopen" = x""yes; then : +if test "x$ac_cv_func_dlopen" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_DLOPEN 1 _ACEOF @@ -9276,8 +9274,7 @@ do : as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh` ac_fn_c_check_func "$LINENO" "$ac_func" "$as_ac_var" -eval as_val=\$$as_ac_var - if test "x$as_val" = x""yes; then : +if eval test \"x\$"$as_ac_var"\" = x"yes"; then : cat >>confdefs.h <<_ACEOF #define `$as_echo "HAVE_$ac_func" | $as_tr_cpp` 1 _ACEOF @@ -9527,7 +9524,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for flock declaration" >&5 $as_echo_n "checking for flock declaration... " >&6; } -if test "${ac_cv_flock_decl+set}" = set; then : +if ${ac_cv_flock_decl+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -9557,7 +9554,7 @@ for ac_func in flock do : ac_fn_c_check_func "$LINENO" "flock" "ac_cv_func_flock" -if test "x$ac_cv_func_flock" = x""yes; then : +if test "x$ac_cv_func_flock" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_FLOCK 1 _ACEOF @@ -9565,7 +9562,7 @@ else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for flock in -lbsd" >&5 $as_echo_n "checking for flock in -lbsd... " >&6; } -if test "${ac_cv_lib_bsd_flock+set}" = set; then : +if ${ac_cv_lib_bsd_flock+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -9599,7 +9596,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_bsd_flock" >&5 $as_echo "$ac_cv_lib_bsd_flock" >&6; } -if test "x$ac_cv_lib_bsd_flock" = x""yes; then : +if test "x$ac_cv_lib_bsd_flock" = xyes; then : $as_echo "#define HAVE_FLOCK 1" >>confdefs.h @@ -9648,7 +9645,7 @@ set dummy $ac_prog; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } -if test "${ac_cv_prog_TRUE+set}" = set; then : +if ${ac_cv_prog_TRUE+:} false; then : $as_echo_n "(cached) " >&6 else if test -n "$TRUE"; then @@ -9688,7 +9685,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for inet_aton in -lc" >&5 $as_echo_n "checking for inet_aton in -lc... " >&6; } -if test "${ac_cv_lib_c_inet_aton+set}" = set; then : +if ${ac_cv_lib_c_inet_aton+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -9722,12 +9719,12 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_c_inet_aton" >&5 $as_echo "$ac_cv_lib_c_inet_aton" >&6; } -if test "x$ac_cv_lib_c_inet_aton" = x""yes; then : +if test "x$ac_cv_lib_c_inet_aton" = xyes; then : $ac_cv_prog_TRUE else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for inet_aton in -lresolv" >&5 $as_echo_n "checking for inet_aton in -lresolv... " >&6; } -if test "${ac_cv_lib_resolv_inet_aton+set}" = set; then : +if ${ac_cv_lib_resolv_inet_aton+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -9761,7 +9758,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_resolv_inet_aton" >&5 $as_echo "$ac_cv_lib_resolv_inet_aton" >&6; } -if test "x$ac_cv_lib_resolv_inet_aton" = x""yes; then : +if test "x$ac_cv_lib_resolv_inet_aton" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_LIBRESOLV 1 _ACEOF @@ -9778,7 +9775,7 @@ # exit Python { $as_echo "$as_me:${as_lineno-$LINENO}: checking for chflags" >&5 $as_echo_n "checking for chflags... 
" >&6; } -if test "${ac_cv_have_chflags+set}" = set; then : +if ${ac_cv_have_chflags+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -9812,7 +9809,7 @@ $as_echo "$ac_cv_have_chflags" >&6; } if test "$ac_cv_have_chflags" = cross ; then ac_fn_c_check_func "$LINENO" "chflags" "ac_cv_func_chflags" -if test "x$ac_cv_func_chflags" = x""yes; then : +if test "x$ac_cv_func_chflags" = xyes; then : ac_cv_have_chflags="yes" else ac_cv_have_chflags="no" @@ -9827,7 +9824,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for lchflags" >&5 $as_echo_n "checking for lchflags... " >&6; } -if test "${ac_cv_have_lchflags+set}" = set; then : +if ${ac_cv_have_lchflags+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -9861,7 +9858,7 @@ $as_echo "$ac_cv_have_lchflags" >&6; } if test "$ac_cv_have_lchflags" = cross ; then ac_fn_c_check_func "$LINENO" "lchflags" "ac_cv_func_lchflags" -if test "x$ac_cv_func_lchflags" = x""yes; then : +if test "x$ac_cv_func_lchflags" = xyes; then : ac_cv_have_lchflags="yes" else ac_cv_have_lchflags="no" @@ -9885,7 +9882,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for inflateCopy in -lz" >&5 $as_echo_n "checking for inflateCopy in -lz... " >&6; } -if test "${ac_cv_lib_z_inflateCopy+set}" = set; then : +if ${ac_cv_lib_z_inflateCopy+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -9919,7 +9916,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_z_inflateCopy" >&5 $as_echo "$ac_cv_lib_z_inflateCopy" >&6; } -if test "x$ac_cv_lib_z_inflateCopy" = x""yes; then : +if test "x$ac_cv_lib_z_inflateCopy" = xyes; then : $as_echo "#define HAVE_ZLIB_COPY 1" >>confdefs.h @@ -10062,7 +10059,7 @@ for ac_func in openpty do : ac_fn_c_check_func "$LINENO" "openpty" "ac_cv_func_openpty" -if test "x$ac_cv_func_openpty" = x""yes; then : +if test "x$ac_cv_func_openpty" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_OPENPTY 1 _ACEOF @@ -10070,7 +10067,7 @@ else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for openpty in -lutil" >&5 $as_echo_n "checking for openpty in -lutil... " >&6; } -if test "${ac_cv_lib_util_openpty+set}" = set; then : +if ${ac_cv_lib_util_openpty+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -10104,13 +10101,13 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_util_openpty" >&5 $as_echo "$ac_cv_lib_util_openpty" >&6; } -if test "x$ac_cv_lib_util_openpty" = x""yes; then : +if test "x$ac_cv_lib_util_openpty" = xyes; then : $as_echo "#define HAVE_OPENPTY 1" >>confdefs.h LIBS="$LIBS -lutil" else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for openpty in -lbsd" >&5 $as_echo_n "checking for openpty in -lbsd... 
" >&6; } -if test "${ac_cv_lib_bsd_openpty+set}" = set; then : +if ${ac_cv_lib_bsd_openpty+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -10144,7 +10141,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_bsd_openpty" >&5 $as_echo "$ac_cv_lib_bsd_openpty" >&6; } -if test "x$ac_cv_lib_bsd_openpty" = x""yes; then : +if test "x$ac_cv_lib_bsd_openpty" = xyes; then : $as_echo "#define HAVE_OPENPTY 1" >>confdefs.h LIBS="$LIBS -lbsd" fi @@ -10159,7 +10156,7 @@ for ac_func in forkpty do : ac_fn_c_check_func "$LINENO" "forkpty" "ac_cv_func_forkpty" -if test "x$ac_cv_func_forkpty" = x""yes; then : +if test "x$ac_cv_func_forkpty" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_FORKPTY 1 _ACEOF @@ -10167,7 +10164,7 @@ else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for forkpty in -lutil" >&5 $as_echo_n "checking for forkpty in -lutil... " >&6; } -if test "${ac_cv_lib_util_forkpty+set}" = set; then : +if ${ac_cv_lib_util_forkpty+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -10201,13 +10198,13 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_util_forkpty" >&5 $as_echo "$ac_cv_lib_util_forkpty" >&6; } -if test "x$ac_cv_lib_util_forkpty" = x""yes; then : +if test "x$ac_cv_lib_util_forkpty" = xyes; then : $as_echo "#define HAVE_FORKPTY 1" >>confdefs.h LIBS="$LIBS -lutil" else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for forkpty in -lbsd" >&5 $as_echo_n "checking for forkpty in -lbsd... " >&6; } -if test "${ac_cv_lib_bsd_forkpty+set}" = set; then : +if ${ac_cv_lib_bsd_forkpty+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -10241,7 +10238,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_bsd_forkpty" >&5 $as_echo "$ac_cv_lib_bsd_forkpty" >&6; } -if test "x$ac_cv_lib_bsd_forkpty" = x""yes; then : +if test "x$ac_cv_lib_bsd_forkpty" = xyes; then : $as_echo "#define HAVE_FORKPTY 1" >>confdefs.h LIBS="$LIBS -lbsd" fi @@ -10258,7 +10255,7 @@ for ac_func in memmove do : ac_fn_c_check_func "$LINENO" "memmove" "ac_cv_func_memmove" -if test "x$ac_cv_func_memmove" = x""yes; then : +if test "x$ac_cv_func_memmove" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_MEMMOVE 1 _ACEOF @@ -10272,8 +10269,7 @@ do : as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh` ac_fn_c_check_func "$LINENO" "$ac_func" "$as_ac_var" -eval as_val=\$$as_ac_var - if test "x$as_val" = x""yes; then : +if eval test \"x\$"$as_ac_var"\" = x"yes"; then : cat >>confdefs.h <<_ACEOF #define `$as_echo "HAVE_$ac_func" | $as_tr_cpp` 1 _ACEOF @@ -10282,31 +10278,50 @@ done -for ac_func in dup2 getcwd strdup -do : - as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh` -ac_fn_c_check_func "$LINENO" "$ac_func" "$as_ac_var" -eval as_val=\$$as_ac_var - if test "x$as_val" = x""yes; then : - cat >>confdefs.h <<_ACEOF -#define `$as_echo "HAVE_$ac_func" | $as_tr_cpp` 1 -_ACEOF +ac_fn_c_check_func "$LINENO" "dup2" "ac_cv_func_dup2" +if test "x$ac_cv_func_dup2" = xyes; then : + $as_echo "#define HAVE_DUP2 1" >>confdefs.h else case " $LIBOBJS " in - *" $ac_func.$ac_objext "* ) ;; - *) LIBOBJS="$LIBOBJS $ac_func.$ac_objext" + *" dup2.$ac_objext "* ) ;; + *) LIBOBJS="$LIBOBJS dup2.$ac_objext" + ;; +esac + +fi + +ac_fn_c_check_func "$LINENO" "getcwd" "ac_cv_func_getcwd" +if test "x$ac_cv_func_getcwd" = xyes; then : + $as_echo "#define HAVE_GETCWD 1" >>confdefs.h + +else + case " $LIBOBJS " in + *" getcwd.$ac_objext "* ) ;; + *) LIBOBJS="$LIBOBJS getcwd.$ac_objext" + ;; +esac + +fi + 
+ac_fn_c_check_func "$LINENO" "strdup" "ac_cv_func_strdup" +if test "x$ac_cv_func_strdup" = xyes; then : + $as_echo "#define HAVE_STRDUP 1" >>confdefs.h + +else + case " $LIBOBJS " in + *" strdup.$ac_objext "* ) ;; + *) LIBOBJS="$LIBOBJS strdup.$ac_objext" ;; esac fi -done for ac_func in getpgrp do : ac_fn_c_check_func "$LINENO" "getpgrp" "ac_cv_func_getpgrp" -if test "x$ac_cv_func_getpgrp" = x""yes; then : +if test "x$ac_cv_func_getpgrp" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_GETPGRP 1 _ACEOF @@ -10334,7 +10349,7 @@ for ac_func in setpgrp do : ac_fn_c_check_func "$LINENO" "setpgrp" "ac_cv_func_setpgrp" -if test "x$ac_cv_func_setpgrp" = x""yes; then : +if test "x$ac_cv_func_setpgrp" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_SETPGRP 1 _ACEOF @@ -10362,7 +10377,7 @@ for ac_func in gettimeofday do : ac_fn_c_check_func "$LINENO" "gettimeofday" "ac_cv_func_gettimeofday" -if test "x$ac_cv_func_gettimeofday" = x""yes; then : +if test "x$ac_cv_func_gettimeofday" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_GETTIMEOFDAY 1 _ACEOF @@ -10464,7 +10479,7 @@ then { $as_echo "$as_me:${as_lineno-$LINENO}: checking getaddrinfo bug" >&5 $as_echo_n "checking getaddrinfo bug... " >&6; } - if test "${ac_cv_buggy_getaddrinfo+set}" = set; then : + if ${ac_cv_buggy_getaddrinfo+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -10593,7 +10608,7 @@ for ac_func in getnameinfo do : ac_fn_c_check_func "$LINENO" "getnameinfo" "ac_cv_func_getnameinfo" -if test "x$ac_cv_func_getnameinfo" = x""yes; then : +if test "x$ac_cv_func_getnameinfo" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_GETNAMEINFO 1 _ACEOF @@ -10605,7 +10620,7 @@ # checks for structures { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether time.h and sys/time.h may both be included" >&5 $as_echo_n "checking whether time.h and sys/time.h may both be included... " >&6; } -if test "${ac_cv_header_time+set}" = set; then : +if ${ac_cv_header_time+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -10640,7 +10655,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether struct tm is in sys/time.h or time.h" >&5 $as_echo_n "checking whether struct tm is in sys/time.h or time.h... " >&6; } -if test "${ac_cv_struct_tm+set}" = set; then : +if ${ac_cv_struct_tm+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -10677,7 +10692,7 @@ #include <$ac_cv_struct_tm> " -if test "x$ac_cv_member_struct_tm_tm_zone" = x""yes; then : +if test "x$ac_cv_member_struct_tm_tm_zone" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_TM_TM_ZONE 1 @@ -10693,7 +10708,7 @@ else ac_fn_c_check_decl "$LINENO" "tzname" "ac_cv_have_decl_tzname" "#include " -if test "x$ac_cv_have_decl_tzname" = x""yes; then : +if test "x$ac_cv_have_decl_tzname" = xyes; then : ac_have_decl=1 else ac_have_decl=0 @@ -10705,7 +10720,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for tzname" >&5 $as_echo_n "checking for tzname... 
" >&6; } -if test "${ac_cv_var_tzname+set}" = set; then : +if ${ac_cv_var_tzname+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -10741,7 +10756,7 @@ fi ac_fn_c_check_member "$LINENO" "struct stat" "st_rdev" "ac_cv_member_struct_stat_st_rdev" "$ac_includes_default" -if test "x$ac_cv_member_struct_stat_st_rdev" = x""yes; then : +if test "x$ac_cv_member_struct_stat_st_rdev" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_STAT_ST_RDEV 1 @@ -10751,7 +10766,7 @@ fi ac_fn_c_check_member "$LINENO" "struct stat" "st_blksize" "ac_cv_member_struct_stat_st_blksize" "$ac_includes_default" -if test "x$ac_cv_member_struct_stat_st_blksize" = x""yes; then : +if test "x$ac_cv_member_struct_stat_st_blksize" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_STAT_ST_BLKSIZE 1 @@ -10761,7 +10776,7 @@ fi ac_fn_c_check_member "$LINENO" "struct stat" "st_flags" "ac_cv_member_struct_stat_st_flags" "$ac_includes_default" -if test "x$ac_cv_member_struct_stat_st_flags" = x""yes; then : +if test "x$ac_cv_member_struct_stat_st_flags" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_STAT_ST_FLAGS 1 @@ -10771,7 +10786,7 @@ fi ac_fn_c_check_member "$LINENO" "struct stat" "st_gen" "ac_cv_member_struct_stat_st_gen" "$ac_includes_default" -if test "x$ac_cv_member_struct_stat_st_gen" = x""yes; then : +if test "x$ac_cv_member_struct_stat_st_gen" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_STAT_ST_GEN 1 @@ -10781,7 +10796,7 @@ fi ac_fn_c_check_member "$LINENO" "struct stat" "st_birthtime" "ac_cv_member_struct_stat_st_birthtime" "$ac_includes_default" -if test "x$ac_cv_member_struct_stat_st_birthtime" = x""yes; then : +if test "x$ac_cv_member_struct_stat_st_birthtime" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_STAT_ST_BIRTHTIME 1 @@ -10791,7 +10806,7 @@ fi ac_fn_c_check_member "$LINENO" "struct stat" "st_blocks" "ac_cv_member_struct_stat_st_blocks" "$ac_includes_default" -if test "x$ac_cv_member_struct_stat_st_blocks" = x""yes; then : +if test "x$ac_cv_member_struct_stat_st_blocks" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_STAT_ST_BLOCKS 1 @@ -10813,7 +10828,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for time.h that defines altzone" >&5 $as_echo_n "checking for time.h that defines altzone... " >&6; } -if test "${ac_cv_header_time_altzone+set}" = set; then : +if ${ac_cv_header_time_altzone+:} false; then : $as_echo_n "(cached) " >&6 else @@ -10877,7 +10892,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for addrinfo" >&5 $as_echo_n "checking for addrinfo... " >&6; } -if test "${ac_cv_struct_addrinfo+set}" = set; then : +if ${ac_cv_struct_addrinfo+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -10909,7 +10924,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for sockaddr_storage" >&5 $as_echo_n "checking for sockaddr_storage... " >&6; } -if test "${ac_cv_struct_sockaddr_storage+set}" = set; then : +if ${ac_cv_struct_sockaddr_storage+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -10945,7 +10960,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether char is unsigned" >&5 $as_echo_n "checking whether char is unsigned... 
" >&6; } -if test "${ac_cv_c_char_unsigned+set}" = set; then : +if ${ac_cv_c_char_unsigned+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -10977,7 +10992,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for an ANSI C-conforming const" >&5 $as_echo_n "checking for an ANSI C-conforming const... " >&6; } -if test "${ac_cv_c_const+set}" = set; then : +if ${ac_cv_c_const+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -11265,7 +11280,7 @@ ac_fn_c_check_func "$LINENO" "gethostbyname_r" "ac_cv_func_gethostbyname_r" -if test "x$ac_cv_func_gethostbyname_r" = x""yes; then : +if test "x$ac_cv_func_gethostbyname_r" = xyes; then : $as_echo "#define HAVE_GETHOSTBYNAME_R 1" >>confdefs.h @@ -11396,7 +11411,7 @@ for ac_func in gethostbyname do : ac_fn_c_check_func "$LINENO" "gethostbyname" "ac_cv_func_gethostbyname" -if test "x$ac_cv_func_gethostbyname" = x""yes; then : +if test "x$ac_cv_func_gethostbyname" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_GETHOSTBYNAME 1 _ACEOF @@ -11418,12 +11433,12 @@ # Linux requires this for correct f.p. operations ac_fn_c_check_func "$LINENO" "__fpu_control" "ac_cv_func___fpu_control" -if test "x$ac_cv_func___fpu_control" = x""yes; then : +if test "x$ac_cv_func___fpu_control" = xyes; then : else { $as_echo "$as_me:${as_lineno-$LINENO}: checking for __fpu_control in -lieee" >&5 $as_echo_n "checking for __fpu_control in -lieee... " >&6; } -if test "${ac_cv_lib_ieee___fpu_control+set}" = set; then : +if ${ac_cv_lib_ieee___fpu_control+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -11457,7 +11472,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_ieee___fpu_control" >&5 $as_echo "$ac_cv_lib_ieee___fpu_control" >&6; } -if test "x$ac_cv_lib_ieee___fpu_control" = x""yes; then : +if test "x$ac_cv_lib_ieee___fpu_control" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_LIBIEEE 1 _ACEOF @@ -11513,7 +11528,7 @@ then LIBM=$withval { $as_echo "$as_me:${as_lineno-$LINENO}: result: set LIBM=\"$withval\"" >&5 $as_echo "set LIBM=\"$withval\"" >&6; } -else as_fn_error "proper usage is --with-libm=STRING" "$LINENO" 5 +else as_fn_error $? "proper usage is --with-libm=STRING" "$LINENO" 5 fi else { $as_echo "$as_me:${as_lineno-$LINENO}: result: default LIBM=\"$LIBM\"" >&5 @@ -11537,7 +11552,7 @@ then LIBC=$withval { $as_echo "$as_me:${as_lineno-$LINENO}: result: set LIBC=\"$withval\"" >&5 $as_echo "set LIBC=\"$withval\"" >&6; } -else as_fn_error "proper usage is --with-libc=STRING" "$LINENO" 5 +else as_fn_error $? "proper usage is --with-libc=STRING" "$LINENO" 5 fi else { $as_echo "$as_me:${as_lineno-$LINENO}: result: default LIBC=\"$LIBC\"" >&5 @@ -11551,7 +11566,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C doubles are little-endian IEEE 754 binary64" >&5 $as_echo_n "checking whether C doubles are little-endian IEEE 754 binary64... " >&6; } -if test "${ac_cv_little_endian_double+set}" = set; then : +if ${ac_cv_little_endian_double+:} false; then : $as_echo_n "(cached) " >&6 else @@ -11593,7 +11608,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C doubles are big-endian IEEE 754 binary64" >&5 $as_echo_n "checking whether C doubles are big-endian IEEE 754 binary64... " >&6; } -if test "${ac_cv_big_endian_double+set}" = set; then : +if ${ac_cv_big_endian_double+:} false; then : $as_echo_n "(cached) " >&6 else @@ -11639,7 +11654,7 @@ # conversions work. 
{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether C doubles are ARM mixed-endian IEEE 754 binary64" >&5 $as_echo_n "checking whether C doubles are ARM mixed-endian IEEE 754 binary64... " >&6; } -if test "${ac_cv_mixed_endian_double+set}" = set; then : +if ${ac_cv_mixed_endian_double+:} false; then : $as_echo_n "(cached) " >&6 else @@ -11787,8 +11802,7 @@ do : as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh` ac_fn_c_check_func "$LINENO" "$ac_func" "$as_ac_var" -eval as_val=\$$as_ac_var - if test "x$as_val" = x""yes; then : +if eval test \"x\$"$as_ac_var"\" = x"yes"; then : cat >>confdefs.h <<_ACEOF #define `$as_echo "HAVE_$ac_func" | $as_tr_cpp` 1 _ACEOF @@ -11800,8 +11814,7 @@ do : as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh` ac_fn_c_check_func "$LINENO" "$ac_func" "$as_ac_var" -eval as_val=\$$as_ac_var - if test "x$as_val" = x""yes; then : +if eval test \"x\$"$as_ac_var"\" = x"yes"; then : cat >>confdefs.h <<_ACEOF #define `$as_echo "HAVE_$ac_func" | $as_tr_cpp` 1 _ACEOF @@ -11811,7 +11824,7 @@ ac_fn_c_check_decl "$LINENO" "isinf" "ac_cv_have_decl_isinf" "#include " -if test "x$ac_cv_have_decl_isinf" = x""yes; then : +if test "x$ac_cv_have_decl_isinf" = xyes; then : ac_have_decl=1 else ac_have_decl=0 @@ -11822,7 +11835,7 @@ _ACEOF ac_fn_c_check_decl "$LINENO" "isnan" "ac_cv_have_decl_isnan" "#include " -if test "x$ac_cv_have_decl_isnan" = x""yes; then : +if test "x$ac_cv_have_decl_isnan" = xyes; then : ac_have_decl=1 else ac_have_decl=0 @@ -11833,7 +11846,7 @@ _ACEOF ac_fn_c_check_decl "$LINENO" "isfinite" "ac_cv_have_decl_isfinite" "#include " -if test "x$ac_cv_have_decl_isfinite" = x""yes; then : +if test "x$ac_cv_have_decl_isfinite" = xyes; then : ac_have_decl=1 else ac_have_decl=0 @@ -11848,7 +11861,7 @@ # -0. on some architectures. { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether tanh preserves the sign of zero" >&5 $as_echo_n "checking whether tanh preserves the sign of zero... " >&6; } -if test "${ac_cv_tanh_preserves_zero_sign+set}" = set; then : +if ${ac_cv_tanh_preserves_zero_sign+:} false; then : $as_echo_n "(cached) " >&6 else @@ -11896,7 +11909,7 @@ # -0. See issue #9920. { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether log1p drops the sign of negative zero" >&5 $as_echo_n "checking whether log1p drops the sign of negative zero... " >&6; } - if test "${ac_cv_log1p_drops_zero_sign+set}" = set; then : + if ${ac_cv_log1p_drops_zero_sign+:} false; then : $as_echo_n "(cached) " >&6 else @@ -11948,7 +11961,7 @@ # sem_open results in a 'Signal 12' error. { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether POSIX semaphores are enabled" >&5 $as_echo_n "checking whether POSIX semaphores are enabled... " >&6; } -if test "${ac_cv_posix_semaphores_enabled+set}" = set; then : +if ${ac_cv_posix_semaphores_enabled+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -11999,7 +12012,7 @@ # Multiprocessing check for broken sem_getvalue { $as_echo "$as_me:${as_lineno-$LINENO}: checking for broken sem_getvalue" >&5 $as_echo_n "checking for broken sem_getvalue... " >&6; } -if test "${ac_cv_broken_sem_getvalue+set}" = set; then : +if ${ac_cv_broken_sem_getvalue+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -12064,7 +12077,7 @@ 15|30) ;; *) - as_fn_error "bad value $enable_big_digits for --enable-big-digits; value should be 15 or 30" "$LINENO" 5 ;; + as_fn_error $? 
"bad value $enable_big_digits for --enable-big-digits; value should be 15 or 30" "$LINENO" 5 ;; esac { $as_echo "$as_me:${as_lineno-$LINENO}: result: $enable_big_digits" >&5 $as_echo "$enable_big_digits" >&6; } @@ -12082,7 +12095,7 @@ # check for wchar.h ac_fn_c_check_header_mongrel "$LINENO" "wchar.h" "ac_cv_header_wchar_h" "$ac_includes_default" -if test "x$ac_cv_header_wchar_h" = x""yes; then : +if test "x$ac_cv_header_wchar_h" = xyes; then : $as_echo "#define HAVE_WCHAR_H 1" >>confdefs.h @@ -12105,7 +12118,7 @@ # This bug is HP SR number 8606223364. { $as_echo "$as_me:${as_lineno-$LINENO}: checking size of wchar_t" >&5 $as_echo_n "checking size of wchar_t... " >&6; } -if test "${ac_cv_sizeof_wchar_t+set}" = set; then : +if ${ac_cv_sizeof_wchar_t+:} false; then : $as_echo_n "(cached) " >&6 else if ac_fn_c_compute_int "$LINENO" "(long int) (sizeof (wchar_t))" "ac_cv_sizeof_wchar_t" "#include @@ -12115,9 +12128,8 @@ if test "$ac_cv_type_wchar_t" = yes; then { { $as_echo "$as_me:${as_lineno-$LINENO}: error: in \`$ac_pwd':" >&5 $as_echo "$as_me: error: in \`$ac_pwd':" >&2;} -{ as_fn_set_status 77 -as_fn_error "cannot compute sizeof (wchar_t) -See \`config.log' for more details." "$LINENO" 5; }; } +as_fn_error 77 "cannot compute sizeof (wchar_t) +See \`config.log' for more details" "$LINENO" 5; } else ac_cv_sizeof_wchar_t=0 fi @@ -12172,7 +12184,7 @@ # check whether wchar_t is signed or not { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether wchar_t is signed" >&5 $as_echo_n "checking whether wchar_t is signed... " >&6; } - if test "${ac_cv_wchar_t_signed+set}" = set; then : + if ${ac_cv_wchar_t_signed+:} false; then : $as_echo_n "(cached) " >&6 else @@ -12268,7 +12280,7 @@ # check for endianness { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether byte ordering is bigendian" >&5 $as_echo_n "checking whether byte ordering is bigendian... " >&6; } -if test "${ac_cv_c_bigendian+set}" = set; then : +if ${ac_cv_c_bigendian+:} false; then : $as_echo_n "(cached) " >&6 else ac_cv_c_bigendian=unknown @@ -12486,7 +12498,7 @@ ;; #( *) - as_fn_error "unknown endianness + as_fn_error $? "unknown endianness presetting ac_cv_c_bigendian=no (or yes) will help" "$LINENO" 5 ;; esac @@ -12559,7 +12571,7 @@ # or fills with zeros (like the Cray J90, according to Tim Peters). { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether right shift extends the sign bit" >&5 $as_echo_n "checking whether right shift extends the sign bit... " >&6; } -if test "${ac_cv_rshift_extends_sign+set}" = set; then : +if ${ac_cv_rshift_extends_sign+:} false; then : $as_echo_n "(cached) " >&6 else @@ -12598,7 +12610,7 @@ # check for getc_unlocked and related locking functions { $as_echo "$as_me:${as_lineno-$LINENO}: checking for getc_unlocked() and friends" >&5 $as_echo_n "checking for getc_unlocked() and friends... " >&6; } -if test "${ac_cv_have_getc_unlocked+set}" = set; then : +if ${ac_cv_have_getc_unlocked+:} false; then : $as_echo_n "(cached) " >&6 else @@ -12696,7 +12708,7 @@ # check for readline 2.1 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for rl_callback_handler_install in -lreadline" >&5 $as_echo_n "checking for rl_callback_handler_install in -lreadline... 
" >&6; } -if test "${ac_cv_lib_readline_rl_callback_handler_install+set}" = set; then : +if ${ac_cv_lib_readline_rl_callback_handler_install+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -12730,7 +12742,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_readline_rl_callback_handler_install" >&5 $as_echo "$ac_cv_lib_readline_rl_callback_handler_install" >&6; } -if test "x$ac_cv_lib_readline_rl_callback_handler_install" = x""yes; then : +if test "x$ac_cv_lib_readline_rl_callback_handler_install" = xyes; then : $as_echo "#define HAVE_RL_CALLBACK 1" >>confdefs.h @@ -12748,7 +12760,7 @@ have_readline=no fi -rm -f conftest.err conftest.$ac_ext +rm -f conftest.err conftest.i conftest.$ac_ext if test $have_readline = yes then cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -12782,7 +12794,7 @@ # check for readline 4.0 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for rl_pre_input_hook in -lreadline" >&5 $as_echo_n "checking for rl_pre_input_hook in -lreadline... " >&6; } -if test "${ac_cv_lib_readline_rl_pre_input_hook+set}" = set; then : +if ${ac_cv_lib_readline_rl_pre_input_hook+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -12816,7 +12828,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_readline_rl_pre_input_hook" >&5 $as_echo "$ac_cv_lib_readline_rl_pre_input_hook" >&6; } -if test "x$ac_cv_lib_readline_rl_pre_input_hook" = x""yes; then : +if test "x$ac_cv_lib_readline_rl_pre_input_hook" = xyes; then : $as_echo "#define HAVE_RL_PRE_INPUT_HOOK 1" >>confdefs.h @@ -12826,7 +12838,7 @@ # also in 4.0 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for rl_completion_display_matches_hook in -lreadline" >&5 $as_echo_n "checking for rl_completion_display_matches_hook in -lreadline... " >&6; } -if test "${ac_cv_lib_readline_rl_completion_display_matches_hook+set}" = set; then : +if ${ac_cv_lib_readline_rl_completion_display_matches_hook+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -12860,7 +12872,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_readline_rl_completion_display_matches_hook" >&5 $as_echo "$ac_cv_lib_readline_rl_completion_display_matches_hook" >&6; } -if test "x$ac_cv_lib_readline_rl_completion_display_matches_hook" = x""yes; then : +if test "x$ac_cv_lib_readline_rl_completion_display_matches_hook" = xyes; then : $as_echo "#define HAVE_RL_COMPLETION_DISPLAY_MATCHES_HOOK 1" >>confdefs.h @@ -12870,7 +12882,7 @@ # check for readline 4.2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for rl_completion_matches in -lreadline" >&5 $as_echo_n "checking for rl_completion_matches in -lreadline... 
" >&6; } -if test "${ac_cv_lib_readline_rl_completion_matches+set}" = set; then : +if ${ac_cv_lib_readline_rl_completion_matches+:} false; then : $as_echo_n "(cached) " >&6 else ac_check_lib_save_LIBS=$LIBS @@ -12904,7 +12916,7 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_readline_rl_completion_matches" >&5 $as_echo "$ac_cv_lib_readline_rl_completion_matches" >&6; } -if test "x$ac_cv_lib_readline_rl_completion_matches" = x""yes; then : +if test "x$ac_cv_lib_readline_rl_completion_matches" = xyes; then : $as_echo "#define HAVE_RL_COMPLETION_MATCHES 1" >>confdefs.h @@ -12922,7 +12934,7 @@ have_readline=no fi -rm -f conftest.err conftest.$ac_ext +rm -f conftest.err conftest.i conftest.$ac_ext if test $have_readline = yes then cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -12945,7 +12957,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for broken nice()" >&5 $as_echo_n "checking for broken nice()... " >&6; } -if test "${ac_cv_broken_nice+set}" = set; then : +if ${ac_cv_broken_nice+:} false; then : $as_echo_n "(cached) " >&6 else @@ -12986,7 +12998,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for broken poll()" >&5 $as_echo_n "checking for broken poll()... " >&6; } -if test "${ac_cv_broken_poll+set}" = set; then : +if ${ac_cv_broken_poll+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -13041,7 +13053,7 @@ #include <$ac_cv_struct_tm> " -if test "x$ac_cv_member_struct_tm_tm_zone" = x""yes; then : +if test "x$ac_cv_member_struct_tm_tm_zone" = xyes; then : cat >>confdefs.h <<_ACEOF #define HAVE_STRUCT_TM_TM_ZONE 1 @@ -13057,7 +13069,7 @@ else ac_fn_c_check_decl "$LINENO" "tzname" "ac_cv_have_decl_tzname" "#include " -if test "x$ac_cv_have_decl_tzname" = x""yes; then : +if test "x$ac_cv_have_decl_tzname" = xyes; then : ac_have_decl=1 else ac_have_decl=0 @@ -13069,7 +13081,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for tzname" >&5 $as_echo_n "checking for tzname... " >&6; } -if test "${ac_cv_var_tzname+set}" = set; then : +if ${ac_cv_var_tzname+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -13108,7 +13120,7 @@ # check tzset(3) exists and works like we expect it to { $as_echo "$as_me:${as_lineno-$LINENO}: checking for working tzset()" >&5 $as_echo_n "checking for working tzset()... " >&6; } -if test "${ac_cv_working_tzset+set}" = set; then : +if ${ac_cv_working_tzset+:} false; then : $as_echo_n "(cached) " >&6 else @@ -13205,7 +13217,7 @@ # Look for subsecond timestamps in struct stat { $as_echo "$as_me:${as_lineno-$LINENO}: checking for tv_nsec in struct stat" >&5 $as_echo_n "checking for tv_nsec in struct stat... " >&6; } -if test "${ac_cv_stat_tv_nsec+set}" = set; then : +if ${ac_cv_stat_tv_nsec+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -13242,7 +13254,7 @@ # Look for BSD style subsecond timestamps in struct stat { $as_echo "$as_me:${as_lineno-$LINENO}: checking for tv_nsec2 in struct stat" >&5 $as_echo_n "checking for tv_nsec2 in struct stat... " >&6; } -if test "${ac_cv_stat_tv_nsec2+set}" = set; then : +if ${ac_cv_stat_tv_nsec2+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -13279,7 +13291,7 @@ # On HP/UX 11.0, mvwdelch is a block with a return statement { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether mvwdelch is an expression" >&5 $as_echo_n "checking whether mvwdelch is an expression... 
" >&6; } -if test "${ac_cv_mvwdelch_is_expression+set}" = set; then : +if ${ac_cv_mvwdelch_is_expression+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -13316,7 +13328,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether WINDOW has _flags" >&5 $as_echo_n "checking whether WINDOW has _flags... " >&6; } -if test "${ac_cv_window_has_flags+set}" = set; then : +if ${ac_cv_window_has_flags+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -13464,7 +13476,7 @@ then { $as_echo "$as_me:${as_lineno-$LINENO}: checking for %lld and %llu printf() format support" >&5 $as_echo_n "checking for %lld and %llu printf() format support... " >&6; } - if test "${ac_cv_have_long_long_format+set}" = set; then : + if ${ac_cv_have_long_long_format+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -13534,7 +13546,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for %zd printf() format support" >&5 $as_echo_n "checking for %zd printf() format support... " >&6; } -if test "${ac_cv_have_size_t_format+set}" = set; then : +if ${ac_cv_have_size_t_format+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -13607,7 +13619,7 @@ #endif " -if test "x$ac_cv_type_socklen_t" = x""yes; then : +if test "x$ac_cv_type_socklen_t" = xyes; then : else @@ -13618,7 +13630,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking for broken mbstowcs" >&5 $as_echo_n "checking for broken mbstowcs... " >&6; } -if test "${ac_cv_broken_mbstowcs+set}" = set; then : +if ${ac_cv_broken_mbstowcs+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -13658,7 +13670,7 @@ { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether $CC supports computed gotos" >&5 $as_echo_n "checking whether $CC supports computed gotos... " >&6; } -if test "${ac_cv_computed_gotos+set}" = set; then : +if ${ac_cv_computed_gotos+:} false; then : $as_echo_n "(cached) " >&6 else if test "$cross_compiling" = yes; then : @@ -13738,11 +13750,11 @@ case $ac_sys_system in - OSF*) as_fn_error "OSF* systems are deprecated unless somebody volunteers. Check http://bugs.python.org/issue8606" "$LINENO" 5 ;; + OSF*) as_fn_error $? "OSF* systems are deprecated unless somebody volunteers. Check http://bugs.python.org/issue8606" "$LINENO" 5 ;; esac ac_fn_c_check_func "$LINENO" "pipe2" "ac_cv_func_pipe2" -if test "x$ac_cv_func_pipe2" = x""yes; then : +if test "x$ac_cv_func_pipe2" = xyes; then : $as_echo "#define HAVE_PIPE2 1" >>confdefs.h @@ -13837,10 +13849,21 @@ :end' >>confcache if diff "$cache_file" confcache >/dev/null 2>&1; then :; else if test -w "$cache_file"; then - test "x$cache_file" != "x/dev/null" && + if test "x$cache_file" != "x/dev/null"; then { $as_echo "$as_me:${as_lineno-$LINENO}: updating cache $cache_file" >&5 $as_echo "$as_me: updating cache $cache_file" >&6;} - cat confcache >$cache_file + if test ! -f "$cache_file" || test -h "$cache_file"; then + cat confcache >"$cache_file" + else + case $cache_file in #( + */* | ?:*) + mv -f confcache "$cache_file"$$ && + mv -f "$cache_file"$$ "$cache_file" ;; #( + *) + mv -f confcache "$cache_file" ;; + esac + fi + fi else { $as_echo "$as_me:${as_lineno-$LINENO}: not updating unwritable cache $cache_file" >&5 $as_echo "$as_me: not updating unwritable cache $cache_file" >&6;} @@ -13856,6 +13879,7 @@ ac_libobjs= ac_ltlibobjs= +U= for ac_i in : $LIBOBJS; do test "x$ac_i" = x: && continue # 1. 
Remove the extension, and $U if already installed. ac_script='s/\$U\././;s/\.o$//;s/\.obj$//' @@ -13872,7 +13896,7 @@ -: ${CONFIG_STATUS=./config.status} +: "${CONFIG_STATUS=./config.status}" ac_write_fail=0 ac_clean_files_save=$ac_clean_files ac_clean_files="$ac_clean_files $CONFIG_STATUS" @@ -13973,6 +13997,7 @@ IFS=" "" $as_nl" # Find who we are. Look in the path if we contain no directory separator. +as_myself= case $0 in #(( *[\\/]* ) as_myself=$0 ;; *) as_save_IFS=$IFS; IFS=$PATH_SEPARATOR @@ -14018,19 +14043,19 @@ (unset CDPATH) >/dev/null 2>&1 && unset CDPATH -# as_fn_error ERROR [LINENO LOG_FD] -# --------------------------------- +# as_fn_error STATUS ERROR [LINENO LOG_FD] +# ---------------------------------------- # Output "`basename $0`: error: ERROR" to stderr. If LINENO and LOG_FD are # provided, also output the error to LOG_FD, referencing LINENO. Then exit the -# script with status $?, using 1 if that was 0. +# script with STATUS, using 1 if that was 0. as_fn_error () { - as_status=$?; test $as_status -eq 0 && as_status=1 - if test "$3"; then - as_lineno=${as_lineno-"$2"} as_lineno_stack=as_lineno_stack=$as_lineno_stack - $as_echo "$as_me:${as_lineno-$LINENO}: error: $1" >&$3 + as_status=$1; test $as_status -eq 0 && as_status=1 + if test "$4"; then + as_lineno=${as_lineno-"$3"} as_lineno_stack=as_lineno_stack=$as_lineno_stack + $as_echo "$as_me:${as_lineno-$LINENO}: error: $2" >&$4 fi - $as_echo "$as_me: error: $1" >&2 + $as_echo "$as_me: error: $2" >&2 as_fn_exit $as_status } # as_fn_error @@ -14226,7 +14251,7 @@ test -d "$as_dir" && break done test -z "$as_dirs" || eval "mkdir $as_dirs" - } || test -d "$as_dir" || as_fn_error "cannot create directory $as_dir" + } || test -d "$as_dir" || as_fn_error $? "cannot create directory $as_dir" } # as_fn_mkdir_p @@ -14280,7 +14305,7 @@ # values after options handling. ac_log=" This file was extended by python $as_me 3.2, which was -generated by GNU Autoconf 2.65. Invocation command line was +generated by GNU Autoconf 2.68. Invocation command line was CONFIG_FILES = $CONFIG_FILES CONFIG_HEADERS = $CONFIG_HEADERS @@ -14304,8 +14329,8 @@ cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 # Files that config.status was made for. -config_files="`echo $ac_config_files`" -config_headers="`echo $ac_config_headers`" +config_files="$ac_config_files" +config_headers="$ac_config_headers" _ACEOF @@ -14342,10 +14367,10 @@ ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" ac_cs_version="\\ python config.status 3.2 -configured by $0, generated by GNU Autoconf 2.65, +configured by $0, generated by GNU Autoconf 2.68, with options \\"\$ac_cs_config\\" -Copyright (C) 2009 Free Software Foundation, Inc. +Copyright (C) 2010 Free Software Foundation, Inc. This config.status script is free software; the Free Software Foundation gives unlimited permission to copy, distribute and modify it." @@ -14361,11 +14386,16 @@ while test $# != 0 do case $1 in - --*=*) + --*=?*) ac_option=`expr "X$1" : 'X\([^=]*\)='` ac_optarg=`expr "X$1" : 'X[^=]*=\(.*\)'` ac_shift=: ;; + --*=) + ac_option=`expr "X$1" : 'X\([^=]*\)='` + ac_optarg= + ac_shift=: + ;; *) ac_option=$1 ac_optarg=$2 @@ -14387,6 +14417,7 @@ $ac_shift case $ac_optarg in *\'*) ac_optarg=`$as_echo "$ac_optarg" | sed "s/'/'\\\\\\\\''/g"` ;; + '') as_fn_error $? 
"missing file argument" ;; esac as_fn_append CONFIG_FILES " '$ac_optarg'" ac_need_defaults=false;; @@ -14399,7 +14430,7 @@ ac_need_defaults=false;; --he | --h) # Conflict between --help and --header - as_fn_error "ambiguous option: \`$1' + as_fn_error $? "ambiguous option: \`$1' Try \`$0 --help' for more information.";; --help | --hel | -h ) $as_echo "$ac_cs_usage"; exit ;; @@ -14408,7 +14439,7 @@ ac_cs_silent=: ;; # This is an error. - -*) as_fn_error "unrecognized option: \`$1' + -*) as_fn_error $? "unrecognized option: \`$1' Try \`$0 --help' for more information." ;; *) as_fn_append ac_config_targets " $1" @@ -14467,7 +14498,7 @@ "Misc/python.pc") CONFIG_FILES="$CONFIG_FILES Misc/python.pc" ;; "Modules/ld_so_aix") CONFIG_FILES="$CONFIG_FILES Modules/ld_so_aix" ;; - *) as_fn_error "invalid argument: \`$ac_config_target'" "$LINENO" 5;; + *) as_fn_error $? "invalid argument: \`$ac_config_target'" "$LINENO" 5;; esac done @@ -14489,9 +14520,10 @@ # after its creation but before its name has been assigned to `$tmp'. $debug || { - tmp= + tmp= ac_tmp= trap 'exit_status=$? - { test -z "$tmp" || test ! -d "$tmp" || rm -fr "$tmp"; } && exit $exit_status + : "${ac_tmp:=$tmp}" + { test ! -d "$ac_tmp" || rm -fr "$ac_tmp"; } && exit $exit_status ' 0 trap 'as_fn_exit 1' 1 2 13 15 } @@ -14499,12 +14531,13 @@ { tmp=`(umask 077 && mktemp -d "./confXXXXXX") 2>/dev/null` && - test -n "$tmp" && test -d "$tmp" + test -d "$tmp" } || { tmp=./conf$$-$RANDOM (umask 077 && mkdir "$tmp") -} || as_fn_error "cannot create a temporary directory in ." "$LINENO" 5 +} || as_fn_error $? "cannot create a temporary directory in ." "$LINENO" 5 +ac_tmp=$tmp # Set up the scripts for CONFIG_FILES section. # No need to generate them if there are no CONFIG_FILES. @@ -14521,12 +14554,12 @@ fi ac_cs_awk_cr=`$AWK 'BEGIN { print "a\rb" }' /dev/null` if test "$ac_cs_awk_cr" = "a${ac_cr}b"; then - ac_cs_awk_cr='\r' + ac_cs_awk_cr='\\r' else ac_cs_awk_cr=$ac_cr fi -echo 'BEGIN {' >"$tmp/subs1.awk" && +echo 'BEGIN {' >"$ac_tmp/subs1.awk" && _ACEOF @@ -14535,18 +14568,18 @@ echo "$ac_subst_vars" | sed 's/.*/&!$&$ac_delim/' && echo "_ACEOF" } >conf$$subs.sh || - as_fn_error "could not make $CONFIG_STATUS" "$LINENO" 5 -ac_delim_num=`echo "$ac_subst_vars" | grep -c '$'` + as_fn_error $? "could not make $CONFIG_STATUS" "$LINENO" 5 +ac_delim_num=`echo "$ac_subst_vars" | grep -c '^'` ac_delim='%!_!# ' for ac_last_try in false false false false false :; do . ./conf$$subs.sh || - as_fn_error "could not make $CONFIG_STATUS" "$LINENO" 5 + as_fn_error $? "could not make $CONFIG_STATUS" "$LINENO" 5 ac_delim_n=`sed -n "s/.*$ac_delim\$/X/p" conf$$subs.awk | grep -c X` if test $ac_delim_n = $ac_delim_num; then break elif $ac_last_try; then - as_fn_error "could not make $CONFIG_STATUS" "$LINENO" 5 + as_fn_error $? "could not make $CONFIG_STATUS" "$LINENO" 5 else ac_delim="$ac_delim!$ac_delim _$ac_delim!! 
" fi @@ -14554,7 +14587,7 @@ rm -f conf$$subs.sh cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 -cat >>"\$tmp/subs1.awk" <<\\_ACAWK && +cat >>"\$ac_tmp/subs1.awk" <<\\_ACAWK && _ACEOF sed -n ' h @@ -14602,7 +14635,7 @@ rm -f conf$$subs.awk cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 _ACAWK -cat >>"\$tmp/subs1.awk" <<_ACAWK && +cat >>"\$ac_tmp/subs1.awk" <<_ACAWK && for (key in S) S_is_set[key] = 1 FS = "" @@ -14634,21 +14667,29 @@ sed "s/$ac_cr\$//; s/$ac_cr/$ac_cs_awk_cr/g" else cat -fi < "$tmp/subs1.awk" > "$tmp/subs.awk" \ - || as_fn_error "could not setup config files machinery" "$LINENO" 5 +fi < "$ac_tmp/subs1.awk" > "$ac_tmp/subs.awk" \ + || as_fn_error $? "could not setup config files machinery" "$LINENO" 5 _ACEOF -# VPATH may cause trouble with some makes, so we remove $(srcdir), -# ${srcdir} and @srcdir@ from VPATH if srcdir is ".", strip leading and +# VPATH may cause trouble with some makes, so we remove sole $(srcdir), +# ${srcdir} and @srcdir@ entries from VPATH if srcdir is ".", strip leading and # trailing colons and then remove the whole line if VPATH becomes empty # (actually we leave an empty line to preserve line numbers). if test "x$srcdir" = x.; then - ac_vpsub='/^[ ]*VPATH[ ]*=/{ -s/:*\$(srcdir):*/:/ -s/:*\${srcdir}:*/:/ -s/:*@srcdir@:*/:/ -s/^\([^=]*=[ ]*\):*/\1/ + ac_vpsub='/^[ ]*VPATH[ ]*=[ ]*/{ +h +s/// +s/^/:/ +s/[ ]*$/:/ +s/:\$(srcdir):/:/g +s/:\${srcdir}:/:/g +s/:@srcdir@:/:/g +s/^:*// s/:*$// +x +s/\(=[ ]*\).*/\1/ +G +s/\n// s/^[^=]*=[ ]*$// }' fi @@ -14660,7 +14701,7 @@ # No need to generate them if there are no CONFIG_HEADERS. # This happens for instance with `./config.status Makefile'. if test -n "$CONFIG_HEADERS"; then -cat >"$tmp/defines.awk" <<\_ACAWK || +cat >"$ac_tmp/defines.awk" <<\_ACAWK || BEGIN { _ACEOF @@ -14672,11 +14713,11 @@ # handling of long lines. ac_delim='%!_!# ' for ac_last_try in false false :; do - ac_t=`sed -n "/$ac_delim/p" confdefs.h` - if test -z "$ac_t"; then + ac_tt=`sed -n "/$ac_delim/p" confdefs.h` + if test -z "$ac_tt"; then break elif $ac_last_try; then - as_fn_error "could not make $CONFIG_HEADERS" "$LINENO" 5 + as_fn_error $? "could not make $CONFIG_HEADERS" "$LINENO" 5 else ac_delim="$ac_delim!$ac_delim _$ac_delim!! " fi @@ -14761,7 +14802,7 @@ _ACAWK _ACEOF cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 - as_fn_error "could not setup config headers machinery" "$LINENO" 5 + as_fn_error $? "could not setup config headers machinery" "$LINENO" 5 fi # test -n "$CONFIG_HEADERS" @@ -14774,7 +14815,7 @@ esac case $ac_mode$ac_tag in :[FHL]*:*);; - :L* | :C*:*) as_fn_error "invalid tag \`$ac_tag'" "$LINENO" 5;; + :L* | :C*:*) as_fn_error $? "invalid tag \`$ac_tag'" "$LINENO" 5;; :[FH]-) ac_tag=-:-;; :[FH]*) ac_tag=$ac_tag:$ac_tag.in;; esac @@ -14793,7 +14834,7 @@ for ac_f do case $ac_f in - -) ac_f="$tmp/stdin";; + -) ac_f="$ac_tmp/stdin";; *) # Look for the file first in the build tree, then in the source tree # (if the path is not absolute). The absolute path cannot be DOS-style, # because $ac_f cannot contain `:'. 
@@ -14802,7 +14843,7 @@ [\\/$]*) false;; *) test -f "$srcdir/$ac_f" && ac_f="$srcdir/$ac_f";; esac || - as_fn_error "cannot find input file: \`$ac_f'" "$LINENO" 5;; + as_fn_error 1 "cannot find input file: \`$ac_f'" "$LINENO" 5;; esac case $ac_f in *\'*) ac_f=`$as_echo "$ac_f" | sed "s/'/'\\\\\\\\''/g"`;; esac as_fn_append ac_file_inputs " '$ac_f'" @@ -14828,8 +14869,8 @@ esac case $ac_tag in - *:-:* | *:-) cat >"$tmp/stdin" \ - || as_fn_error "could not create $ac_file" "$LINENO" 5 ;; + *:-:* | *:-) cat >"$ac_tmp/stdin" \ + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 ;; esac ;; esac @@ -14959,23 +15000,24 @@ s&@INSTALL@&$ac_INSTALL&;t t $ac_datarootdir_hack " -eval sed \"\$ac_sed_extra\" "$ac_file_inputs" | $AWK -f "$tmp/subs.awk" >$tmp/out \ - || as_fn_error "could not create $ac_file" "$LINENO" 5 +eval sed \"\$ac_sed_extra\" "$ac_file_inputs" | $AWK -f "$ac_tmp/subs.awk" \ + >$ac_tmp/out || as_fn_error $? "could not create $ac_file" "$LINENO" 5 test -z "$ac_datarootdir_hack$ac_datarootdir_seen" && - { ac_out=`sed -n '/\${datarootdir}/p' "$tmp/out"`; test -n "$ac_out"; } && - { ac_out=`sed -n '/^[ ]*datarootdir[ ]*:*=/p' "$tmp/out"`; test -z "$ac_out"; } && + { ac_out=`sed -n '/\${datarootdir}/p' "$ac_tmp/out"`; test -n "$ac_out"; } && + { ac_out=`sed -n '/^[ ]*datarootdir[ ]*:*=/p' \ + "$ac_tmp/out"`; test -z "$ac_out"; } && { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $ac_file contains a reference to the variable \`datarootdir' -which seems to be undefined. Please make sure it is defined." >&5 +which seems to be undefined. Please make sure it is defined" >&5 $as_echo "$as_me: WARNING: $ac_file contains a reference to the variable \`datarootdir' -which seems to be undefined. Please make sure it is defined." >&2;} +which seems to be undefined. Please make sure it is defined" >&2;} - rm -f "$tmp/stdin" + rm -f "$ac_tmp/stdin" case $ac_file in - -) cat "$tmp/out" && rm -f "$tmp/out";; - *) rm -f "$ac_file" && mv "$tmp/out" "$ac_file";; + -) cat "$ac_tmp/out" && rm -f "$ac_tmp/out";; + *) rm -f "$ac_file" && mv "$ac_tmp/out" "$ac_file";; esac \ - || as_fn_error "could not create $ac_file" "$LINENO" 5 + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 ;; :H) # @@ -14984,21 +15026,21 @@ if test x"$ac_file" != x-; then { $as_echo "/* $configure_input */" \ - && eval '$AWK -f "$tmp/defines.awk"' "$ac_file_inputs" - } >"$tmp/config.h" \ - || as_fn_error "could not create $ac_file" "$LINENO" 5 - if diff "$ac_file" "$tmp/config.h" >/dev/null 2>&1; then + && eval '$AWK -f "$ac_tmp/defines.awk"' "$ac_file_inputs" + } >"$ac_tmp/config.h" \ + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 + if diff "$ac_file" "$ac_tmp/config.h" >/dev/null 2>&1; then { $as_echo "$as_me:${as_lineno-$LINENO}: $ac_file is unchanged" >&5 $as_echo "$as_me: $ac_file is unchanged" >&6;} else rm -f "$ac_file" - mv "$tmp/config.h" "$ac_file" \ - || as_fn_error "could not create $ac_file" "$LINENO" 5 + mv "$ac_tmp/config.h" "$ac_file" \ + || as_fn_error $? "could not create $ac_file" "$LINENO" 5 fi else $as_echo "/* $configure_input */" \ - && eval '$AWK -f "$tmp/defines.awk"' "$ac_file_inputs" \ - || as_fn_error "could not create -" "$LINENO" 5 + && eval '$AWK -f "$ac_tmp/defines.awk"' "$ac_file_inputs" \ + || as_fn_error $? "could not create -" "$LINENO" 5 fi ;; @@ -15018,7 +15060,7 @@ ac_clean_files=$ac_clean_files_save test $ac_write_fail = 0 || - as_fn_error "write failure creating $CONFIG_STATUS" "$LINENO" 5 + as_fn_error $? 
"write failure creating $CONFIG_STATUS" "$LINENO" 5 # configure is writing to config.log, and then calls config.status. @@ -15039,7 +15081,7 @@ exec 5>>config.log # Use ||, not &&, to avoid exiting from the if with $? = 1, which # would make configure fail if this is the last instruction. - $ac_cs_success || as_fn_exit $? + $ac_cs_success || as_fn_exit 1 fi if test -n "$ac_unrecognized_opts" && test "$enable_option_checking" != no; then { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: unrecognized options: $ac_unrecognized_opts" >&5 Modified: python/branches/py3k/configure.in ============================================================================== --- python/branches/py3k/configure.in (original) +++ python/branches/py3k/configure.in Sat Feb 5 21:26:52 2011 @@ -738,7 +738,7 @@ BLDLIBRARY='-Wl,-R,$(LIBDIR) -L. -lpython$(LDVERSION)' RUNSHARED=LD_LIBRARY_PATH=`pwd`:${LD_LIBRARY_PATH} INSTSONAME="$LDLIBRARY".$SOVERSION - if test $with_pydebug == no + if test "$with_pydebug" != yes then PY3LIBRARY=libpython3.so fi @@ -753,8 +753,7 @@ ;; esac INSTSONAME="$LDLIBRARY".$SOVERSION - PY3LIBRARY=libpython3.so - if test $with_pydebug == no + if test "$with_pydebug" != yes then PY3LIBRARY=libpython3.so fi From python-checkins at python.org Sat Feb 5 21:35:30 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 5 Feb 2011 21:35:30 +0100 (CET) Subject: [Python-checkins] r88351 - in python/branches/py3k: Doc/c-api/type.rst Include/object.h Misc/NEWS Modules/xxlimited.c Objects/typeobject.c PC/python3.def PC/python32stub.def Message-ID: <20110205203530.32840EE994@mail.python.org> Author: martin.v.loewis Date: Sat Feb 5 21:35:29 2011 New Revision: 88351 Log: Issue #11067: Add PyType_GetFlags, to support PyUnicode_Check in the limited ABI Modified: python/branches/py3k/Doc/c-api/type.rst python/branches/py3k/Include/object.h python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/xxlimited.c python/branches/py3k/Objects/typeobject.c python/branches/py3k/PC/python3.def python/branches/py3k/PC/python32stub.def Modified: python/branches/py3k/Doc/c-api/type.rst ============================================================================== --- python/branches/py3k/Doc/c-api/type.rst (original) +++ python/branches/py3k/Doc/c-api/type.rst Sat Feb 5 21:35:29 2011 @@ -35,6 +35,14 @@ Clear the internal lookup cache. Return the current version tag. +.. c:function:: long PyType_GetFlags(PyTypeObject* type) + + Return the :attr:`tp_flags` member of *type*. This function is primarily + meant for use with `Py_LIMITED_API`; the individual flag bits are + guaranteed to be stable across Python releases, but access to + :attr:`tp_flags` itself is not part of the limited API. + + .. versionadded:: 3.2 .. 
c:function:: void PyType_Modified(PyTypeObject *type) Modified: python/branches/py3k/Include/object.h ============================================================================== --- python/branches/py3k/Include/object.h (original) +++ python/branches/py3k/Include/object.h Sat Feb 5 21:35:29 2011 @@ -437,6 +437,8 @@ PyAPI_DATA(PyTypeObject) PyBaseObject_Type; /* built-in 'object' */ PyAPI_DATA(PyTypeObject) PySuper_Type; /* built-in 'super' */ +PyAPI_FUNC(long) PyType_GetFlags(PyTypeObject*); + #define PyType_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_TYPE_SUBCLASS) #define PyType_CheckExact(op) (Py_TYPE(op) == &PyType_Type) @@ -589,7 +591,11 @@ Py_TPFLAGS_HAVE_VERSION_TAG | \ 0) +#ifdef Py_LIMITED_API +#define PyType_HasFeature(t,f) ((PyType_GetFlags(t) & (f)) != 0) +#else #define PyType_HasFeature(t,f) (((t)->tp_flags & (f)) != 0) +#endif #define PyType_FastSubclass(t,f) PyType_HasFeature(t,f) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 5 21:35:29 2011 @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #11067: Add PyType_GetFlags, to support PyUnicode_Check + in the limited ABI. + - Issue #11118: Fix bogus export of None in python3.dll. Library Modified: python/branches/py3k/Modules/xxlimited.c ============================================================================== --- python/branches/py3k/Modules/xxlimited.c (original) +++ python/branches/py3k/Modules/xxlimited.c Sat Feb 5 21:35:29 2011 @@ -50,8 +50,14 @@ static PyObject * Xxo_demo(XxoObject *self, PyObject *args) { - if (!PyArg_ParseTuple(args, ":demo")) + PyObject *o = NULL; + if (!PyArg_ParseTuple(args, "|O:demo", &o)) return NULL; + /* Test availability of fast type checks */ + if (o != NULL && PyUnicode_Check(o)) { + Py_INCREF(o); + return o; + } Py_INCREF(Py_None); return Py_None; } Modified: python/branches/py3k/Objects/typeobject.c ============================================================================== --- python/branches/py3k/Objects/typeobject.c (original) +++ python/branches/py3k/Objects/typeobject.c Sat Feb 5 21:35:29 2011 @@ -1904,6 +1904,12 @@ return res; } +long +PyType_GetFlags(PyTypeObject *type) +{ + return type->tp_flags; +} + static PyObject * type_new(PyTypeObject *metatype, PyObject *args, PyObject *kwds) { Modified: python/branches/py3k/PC/python3.def ============================================================================== --- python/branches/py3k/PC/python3.def (original) +++ python/branches/py3k/PC/python3.def Sat Feb 5 21:35:29 2011 @@ -513,6 +513,7 @@ PyType_FromSpec=python32.PyType_FromSpec PyType_GenericAlloc=python32.PyType_GenericAlloc PyType_GenericNew=python32.PyType_GenericNew + PyType_GetFlags=python32.PyType_GetFlags PyType_IsSubtype=python32.PyType_IsSubtype PyType_Modified=python32.PyType_Modified PyType_Ready=python32.PyType_Ready Modified: python/branches/py3k/PC/python32stub.def ============================================================================== --- python/branches/py3k/PC/python32stub.def (original) +++ python/branches/py3k/PC/python32stub.def Sat Feb 5 21:35:29 2011 @@ -513,6 +513,7 @@ PyType_FromSpec PyType_GenericAlloc PyType_GenericNew +PyType_GetFlags PyType_IsSubtype PyType_Modified PyType_Ready From python-checkins at python.org Sat Feb 5 22:47:25 2011 From: python-checkins at python.org (gregory.p.smith) Date: Sat, 5 Feb 2011 22:47:25 +0100 (CET) Subject: 
[Python-checkins] r88352 - python/branches/py3k/Doc/library/subprocess.rst Message-ID: <20110205214725.9C783EE9EB@mail.python.org> Author: gregory.p.smith Date: Sat Feb 5 22:47:25 2011 New Revision: 88352 Log: issue7678 - Properly document how to replace a shell pipeline so that SIGPIPE happens when the end exits before the beginning. Modified: python/branches/py3k/Doc/library/subprocess.rst Modified: python/branches/py3k/Doc/library/subprocess.rst ============================================================================== --- python/branches/py3k/Doc/library/subprocess.rst (original) +++ python/branches/py3k/Doc/library/subprocess.rst Sat Feb 5 22:47:25 2011 @@ -519,8 +519,11 @@ ==> p1 = Popen(["dmesg"], stdout=PIPE) p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE) + p1.stdout.close() # Allow p1 to receive a SIGPIPE if p2 exits. output = p2.communicate()[0] +The p1.stdout.close() call after starting the p2 is important in order for p1 +to receive a SIGPIPE if p2 exits before p1. Replacing :func:`os.system` ^^^^^^^^^^^^^^^^^^^^^^^^^^^ From python-checkins at python.org Sat Feb 5 22:49:56 2011 From: python-checkins at python.org (gregory.p.smith) Date: Sat, 5 Feb 2011 22:49:56 +0100 (CET) Subject: [Python-checkins] r88353 - in python/branches/release27-maint: Doc/library/subprocess.rst Message-ID: <20110205214956.CCA0AEE994@mail.python.org> Author: gregory.p.smith Date: Sat Feb 5 22:49:56 2011 New Revision: 88353 Log: Merged revisions 88352 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88352 | gregory.p.smith | 2011-02-05 13:47:25 -0800 (Sat, 05 Feb 2011) | 3 lines issue7678 - Properly document how to replace a shell pipeline so that SIGPIPE happens when the end exits before the beginning. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Doc/library/subprocess.rst Modified: python/branches/release27-maint/Doc/library/subprocess.rst ============================================================================== --- python/branches/release27-maint/Doc/library/subprocess.rst (original) +++ python/branches/release27-maint/Doc/library/subprocess.rst Sat Feb 5 22:49:56 2011 @@ -450,8 +450,11 @@ ==> p1 = Popen(["dmesg"], stdout=PIPE) p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE) + p1.stdout.close() # Allow p1 to receive a SIGPIPE if p2 exits. output = p2.communicate()[0] +The p1.stdout.close() call after starting the p2 is important in order for p1 +to receive a SIGPIPE if p2 exits before p1. Replacing :func:`os.system` ^^^^^^^^^^^^^^^^^^^^^^^^^^^ From python-checkins at python.org Sat Feb 5 22:51:27 2011 From: python-checkins at python.org (gregory.p.smith) Date: Sat, 5 Feb 2011 22:51:27 +0100 (CET) Subject: [Python-checkins] r88354 - in python/branches/release31-maint: Doc/library/subprocess.rst Message-ID: <20110205215127.5667EEE994@mail.python.org> Author: gregory.p.smith Date: Sat Feb 5 22:51:27 2011 New Revision: 88354 Log: Merged revisions 88352 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88352 | gregory.p.smith | 2011-02-05 13:47:25 -0800 (Sat, 05 Feb 2011) | 3 lines issue7678 - Properly document how to replace a shell pipeline so that SIGPIPE happens when the end exits before the beginning. ........ 
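(For reference, the complete pattern that r88352 and these merges document looks roughly like the following self-contained sketch; it assumes a POSIX system where ``dmesg`` and ``grep`` are available, matching the example used in subprocess.rst, and is illustrative rather than part of the committed text:

    from subprocess import Popen, PIPE

    p1 = Popen(["dmesg"], stdout=PIPE)
    p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
    # Close the parent's copy of p1.stdout so that p2 holds the only
    # remaining read end of the pipe; if p2 exits before p1, p1 then
    # receives SIGPIPE instead of blocking on a pipe nobody reads.
    p1.stdout.close()
    output = p2.communicate()[0]
    print(output)

Without the close() call the parent process keeps a read end of the pipe open, so p1 never receives SIGPIPE even after p2 has exited.)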
Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/library/subprocess.rst Modified: python/branches/release31-maint/Doc/library/subprocess.rst ============================================================================== --- python/branches/release31-maint/Doc/library/subprocess.rst (original) +++ python/branches/release31-maint/Doc/library/subprocess.rst Sat Feb 5 22:51:27 2011 @@ -465,8 +465,11 @@ ==> p1 = Popen(["dmesg"], stdout=PIPE) p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE) + p1.stdout.close() # Allow p1 to receive a SIGPIPE if p2 exits. output = p2.communicate()[0] +The p1.stdout.close() call after starting the p2 is important in order for p1 +to receive a SIGPIPE if p2 exits before p1. Replacing :func:`os.system` ^^^^^^^^^^^^^^^^^^^^^^^^^^^ From python-checkins at python.org Sat Feb 5 23:05:05 2011 From: python-checkins at python.org (brett.cannon) Date: Sat, 5 Feb 2011 23:05:05 +0100 (CET) Subject: [Python-checkins] r88355 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110205220505.BD11DEEAB8@mail.python.org> Author: brett.cannon Date: Sat Feb 5 23:05:05 2011 New Revision: 88355 Log: Re-arrange and clarify some __future__ import statements. The absolute_import statement got moved to a new Python 2.5 and Newer section as it was available since then. The division statement got moved to Common Gotchas since it has been around for so long that any modern Python program can use it. Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 23:05:05 2011 @@ -121,6 +121,8 @@ .. _Python 2.6: http://www.python.org/2.6.x .. _Python 2.5: http://www.python.org/2.5.x .. _Python 2.4: http://www.python.org/2.4.x +.. _Python 2.3: http://www.python.org/2.3.x +.. _Python 2.2: http://www.python.org/2.2.x .. _use_3to2: @@ -181,8 +183,8 @@ to Python to help discover places in your code which 2to3 cannot handle but are known to cause issues. -Try to Support Python 2.6 and Newer Only ----------------------------------------- +Try to Support `Python 2.6`_ and Newer Only +------------------------------------------- While not possible for all projects, if you can support `Python 2.6`_ and newer **only**, your life will be much easier. Various future statements, stdlib additions, etc. exist only in Python 2.6 and later which greatly assist in @@ -196,22 +198,6 @@ at least need to watch out for situations that these solutions fix. -``from __future__ import division`` -''''''''''''''''''''''''''''''''''' -While the exact same outcome can be had by using the ``-Qnew`` argument to -Python, using this future statement lifts the requirement that your users use -the flag to get the expected behavior of division in Python 3 (e.g., ``1/2 == -0.5; 1//2 == 0``). - - -``from __future__ import absolute_imports`` -''''''''''''''''''''''''''''''''''''''''''' -Implicit relative imports (e.g., importing ``spam.bacon`` from within -``spam.eggs`` with the statement ``import bacon``) does not work in Python 3. -This future statement moves away from that and allows the use of explicit -relative imports (e.g., ``from . import bacon``). - - ``from __future__ import print_function`` ''''''''''''''''''''''''''''''''''''''''' This is a personal choice. 
2to3 handles the translation from the print @@ -251,6 +237,29 @@ also comes about when doing comparisons between bytes and strings. +Supporting `Python 2.5`_ and Newer Only +--------------------------------------- +If you are supporting `Python 2.5`_ and newer there are still some features of +Python that you can utilize. + + +``from __future__ import absolute_imports`` +''''''''''''''''''''''''''''''''''''''''''' +Implicit relative imports (e.g., importing ``spam.bacon`` from within +``spam.eggs`` with the statement ``import bacon``) does not work in Python 3. +This future statement moves away from that and allows the use of explicit +relative imports (e.g., ``from . import bacon``). + +In `Python 2.5`_ you must use +the __future__ statement to get to use explicit relative imports and prevent +implicit ones. In `Python 2.6`_ explicit relative imports are available without +the statement, but you still want the __future__ statement to prevent implicit +relative imports. In `Python 2.7`_ the __future__ statement is not needed. In +other words, unless you are only supporting Python 2.7 or a version earlier +than Python 2.5, use the __future__ statement. + + + Handle Common "Gotchas" ----------------------- There are a few things that just consistently come up as sticking points for @@ -258,6 +267,15 @@ to help modernize your code. +``from __future__ import division`` +''''''''''''''''''''''''''''''''''' +While the exact same outcome can be had by using the ``-Qnew`` argument to +Python, using this future statement lifts the requirement that your users use +the flag to get the expected behavior of division in Python 3 +(e.g., ``1/2 == 0.5; 1//2 == 0``). + + + Specify when opening a file as binary ''''''''''''''''''''''''''''''''''''' @@ -288,8 +306,8 @@ Subclass ``object`` ''''''''''''''''''' -New-style classes have been around since Python 2.2. You need to make sure you -are subclassing from ``object`` to avoid odd edge cases involving method +New-style classes have been around since `Python 2.2`_. You need to make sure +you are subclassing from ``object`` to avoid odd edge cases involving method resolution order, etc. This continues to be totally valid in Python 3 (although unneeded as all classes implicitly inherit from ``object``). @@ -336,7 +354,7 @@ consistently stick to that API in both Python 2 and 3. -Bytes / unicode comparison +Bytes / Unicode Comparison ************************** In Python 3, mixing bytes and unicode is forbidden in most situations; it @@ -587,9 +605,10 @@ Capturing the Currently Raised Exception ---------------------------------------- -One change between Python 2 and 3 that will require changing how you code is -accessing the currently raised exception. In Python 2 the syntax to access the -current exception is:: +One change between Python 2 and 3 that will require changing how you code (if +you support `Python 2.5`_ and earlier) is +accessing the currently raised exception. In Python 2.5 and earlier the syntax +to access the current exception is:: try: raise Exception() @@ -597,12 +616,14 @@ # Current exception is 'exc' pass -This syntax changed in Python 3 to:: +This syntax changed in Python 3 (and backported to `Python 2.6`_ and later) +to:: try: raise Exception() except Exception as exc: # Current exception is 'exc' + # In Python 3, 'exc' is restricted to the block; Python 2.6 will "leak" pass Because of this syntax change you must change to capturing the current @@ -622,7 +643,7 @@ .. 
note:: In Python 3, the traceback is attached to the exception instance - through the **__traceback__** attribute. If the instance is saved in + through the ``__traceback__`` attribute. If the instance is saved in a local variable that persists outside of the ``except`` block, the traceback will create a reference cycle with the current frame and its dictionary of local variables. This will delay reclaiming dead From nnorwitz at gmail.com Sat Feb 5 22:32:49 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Sat, 5 Feb 2011 16:32:49 -0500 Subject: [Python-checkins] Python Regression Test Failures opt (1) Message-ID: <20110205213249.GA7302@kbk-i386-bb.dyndns.org> 346 tests OK. 1 test failed: test_urllib2 37 tests skipped: test_aepack test_al test_applesingle test_bsddb185 test_bsddb3 test_cd test_cl test_curses test_epoll test_gdb test_gl test_imgfile test_ioctl test_kqueue test_linuxaudiodev test_macos test_macostools test_multiprocessing test_ossaudiodev test_pep277 test_py3kwarn test_scriptpackages test_smtpnet test_socketserver test_startfile test_sunaudiodev test_tcl test_timeout test_tk test_ttk_guionly test_ttk_textonly test_unicode_file test_urllib2net test_urllibnet test_winreg test_winsound test_zipfile64 7 skips unexpected on linux2: test_epoll test_gdb test_ioctl test_multiprocessing test_tk test_ttk_guionly test_ttk_textonly == CPython 2.7 (trunk:88351M, Feb 5 2011, 16:01:41) [GCC 3.3.4 20040623 (Gentoo Linux 3.3.4-r1, ssp-3.3.2-2, pie-8.7.6)] == Linux-2.6.9-gentoo-r1-i686-AMD_Athlon-tm-_XP_3000+-with-gentoo-1.4.16 little-endian == /tmp/test_python_3153 test_grammar test_opcodes test_dict test_builtin test_exceptions test_types test_unittest test_doctest test_doctest2 test_MimeWriter test_SimpleHTTPServer test_StringIO test___all__ test___future__ test__locale test_abc test_abstract_numbers test_aepack test_aepack skipped -- No module named aetypes test_aifc test_al test_al skipped -- No module named al test_anydbm test_applesingle test_applesingle skipped -- No module named MacOS test_argparse test_array test_ascii_formatd test_ast test_asynchat test_asyncore test_atexit test_audioop test_augassign test_base64 test_bastion test_bigaddrspace test_bigmem test_binascii test_binhex test_binop test_bisect test_bool test_bsddb test_bsddb185 test_bsddb185 skipped -- No module named bsddb185 test_bsddb3 test_bsddb3 skipped -- Use of the `bsddb' resource not enabled test_buffer test_bufio test_bytes test_bz2 test_calendar test_call test_capi test_cd test_cd skipped -- No module named cd test_cfgparser test_cgi test_charmapcodec test_cl test_cl skipped -- No module named cl test_class test_cmath test_cmd test_cmd_line test_cmd_line_script test_code test_codeccallbacks test_codecencodings_cn test_codecencodings_hk test_codecencodings_jp test_codecencodings_kr test_codecencodings_tw test_codecmaps_cn test_codecmaps_hk test_codecmaps_jp test_codecmaps_kr test_codecmaps_tw test_codecs test_codeop test_coding test_coercion test_collections test_colorsys test_commands test_compare test_compile test_compileall test_compiler test_complex test_complex_args test_contains test_contextlib test_cookie test_cookielib test_copy test_copy_reg test_cpickle test_cprofile test_crypt test_csv test_ctypes test_curses test_curses skipped -- Use of the `curses' resource not enabled test_datetime test_dbm test_decimal test_decorators test_defaultdict test_deque test_descr test_descrtut test_dictcomps test_dictviews test_difflib test_dircache test_dis test_distutils [19646 refs] [19646 refs] [19646 refs] 
[19646 refs] [19643 refs] test_dl test_docxmlrpc test_dumbdbm test_dummy_thread test_dummy_threading test_email test_email_codecs test_email_renamed test_enumerate test_eof test_epoll test_epoll skipped -- kernel doesn't support epoll() test_errno test_exception_variations test_extcall test_fcntl test_file test_file2k test_filecmp test_fileinput test_fileio test_float test_fnmatch test_fork1 test_format test_fpformat test_fractions test_frozen test_ftplib test_funcattrs test_functools test_future test_future3 test_future4 test_future5 test_future_builtins test_gc test_gdb test_gdb skipped -- gdb versions before 7.0 didn't support python embedding Saw: GNU gdb 6.2.1 Copyright 2004 Free Software Foundation, Inc. GDB is free software, covered by the GNU General Public License, and you are welcome to change it and/or distribute copies of it under certain conditions. Type "show copying" to see the conditions. There is absolutely no warranty for GDB. Type "show warranty" for details. This GDB was configured as "i386-pc-linux-gnu". test_gdbm test_generators test_genericpath test_genexps test_getargs test_getargs2 test_getopt test_gettext test_gl test_gl skipped -- No module named gl test_glob test_global test_grp test_gzip test_hash test_hashlib test_heapq test_hmac test_hotshot test_htmllib test_htmlparser test_httplib test_httpservers [15648 refs] [15648 refs] [15648 refs] [25315 refs] test_imageop test_imaplib test_imgfile test_imgfile skipped -- No module named imgfile test_imp test_import test_importhooks test_importlib test_index test_inspect test_int test_int_literal test_io test_ioctl test_ioctl skipped -- Unable to open /dev/tty test_isinstance test_iter test_iterlen test_itertools test_json test_kqueue test_kqueue skipped -- test works only on BSD test_largefile test_lib2to3 test_linecache test_linuxaudiodev test_linuxaudiodev skipped -- Use of the `audio' resource not enabled test_list test_locale test_logging test_long test_long_future test_longexp test_macos test_macos skipped -- No module named MacOS test_macostools test_macostools skipped -- No module named MacOS test_macpath test_mailbox test_marshal test_math test_md5 test_memoryio test_memoryview test_mhlib test_mimetools test_mimetypes test_minidom test_mmap test_module test_modulefinder test_multibytecodec test_multibytecodec_support test_multifile test_multiprocessing test_multiprocessing skipped -- This platform lacks a functioning sem_open implementation, therefore, the required synchronization primitives needed will not function, see issue 3770. 
test_mutants test_mutex test_netrc test_new test_nis test_normalization test_ntpath test_old_mailbox test_openpty test_operator test_optparse test_os [15648 refs] [15648 refs] test_ossaudiodev test_ossaudiodev skipped -- Use of the `audio' resource not enabled test_parser Expecting 's_push: parser stack overflow' in next line s_push: parser stack overflow test_pdb test_peepholer test_pep247 test_pep263 test_pep277 test_pep277 skipped -- only NT+ and systems with Unicode-friendly filesystem encoding test_pep292 test_pep352 test_pickle test_pickletools test_pipes test_pkg test_pkgimport test_pkgutil test_platform [17044 refs] [17044 refs] test_plistlib test_poll test_popen [15653 refs] [15653 refs] [15653 refs] test_popen2 test_poplib test_posix test_posixpath test_pow test_pprint test_print test_profile test_profilehooks test_property test_pstats test_pty test_pwd test_py3kwarn test_py3kwarn skipped -- test.test_py3kwarn must be run with the -3 flag test_pyclbr test_pydoc [20862 refs] [20862 refs] [20862 refs] [20862 refs] [20862 refs] [20861 refs] [20861 refs] test_pyexpat test_queue test_quopri [18479 refs] [18479 refs] test_random test_re test_readline test_repr test_resource test_rfc822 test_richcmp test_robotparser test_runpy test_sax test_scope test_scriptpackages test_scriptpackages skipped -- No module named aetools test_select test_set test_setcomps test_sets test_sgmllib test_sha test_shelve test_shlex test_shutil test_signal test_site [15648 refs] [15648 refs] [15648 refs] [15648 refs] test_slice test_smtplib test_smtpnet test_smtpnet skipped -- Use of the `network' resource not enabled test_socket test_socketserver test_socketserver skipped -- Use of the `network' resource not enabled test_softspace test_sort test_sqlite test_ssl test_startfile test_startfile skipped -- module os has no attribute startfile test_str test_strftime test_string test_stringprep test_strop test_strptime test_strtod test_struct test_structmembers test_structseq test_subprocess [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] . [15648 refs] [15648 refs] this bit of output is from a test of stdout in a different process ... [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] . [15648 refs] [15648 refs] this bit of output is from a test of stdout in a different process ... 
[15648 refs] [15648 refs] [15863 refs] test_sunaudiodev test_sunaudiodev skipped -- No module named sunaudiodev test_sundry test_symtable test_syntax test_sys [15648 refs] [15648 refs] [15648 refs] [15877 refs] [15671 refs] test_sysconfig [15648 refs] [15648 refs] test_tarfile test_tcl test_tcl skipped -- No module named _tkinter test_telnetlib test_tempfile [15648 refs] test_textwrap test_thread test_threaded_import test_threadedtempfile test_threading [18948 refs] [20223 refs] [20035 refs] [20035 refs] [20035 refs] [20035 refs] test_threading_local test_threadsignals test_time test_timeout test_timeout skipped -- Use of the `network' resource not enabled test_tk test_tk skipped -- No module named _tkinter test_tokenize test_trace test_traceback test_transformer test_ttk_guionly test_ttk_guionly skipped -- No module named _tkinter test_ttk_textonly test_ttk_textonly skipped -- No module named _tkinter test_tuple test_typechecks test_ucn test_unary test_undocumented_details test_unicode test_unicode_file test_unicode_file skipped -- No Unicode filesystem semantics on this platform. test_unicodedata test_univnewlines test_univnewlines2k test_unpack test_urllib test_urllib2 test test_urllib2 failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_urllib2.py", line 711, in test_file h.file_open, Request(url)) File "/tmp/python-test/local/lib/python2.7/unittest/case.py", line 456, in assertRaises callableObj(*args, **kwargs) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1269, in file_open return self.open_local_file(req) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1301, in open_local_file (not port and socket.gethostbyname(host) in self.get_names()): gaierror: [Errno -2] Name or service not known Re-running test 'test_urllib2' in verbose mode Trying: mgr = urllib2.HTTPPasswordMgr() Expecting nothing ok Trying: add = mgr.add_password Expecting nothing ok Trying: add("Some Realm", "http://example.com/", "joe", "password") Expecting nothing ok Trying: add("Some Realm", "http://example.com/ni", "ni", "ni") Expecting nothing ok Trying: add("c", "http://example.com/foo", "foo", "ni") Expecting nothing ok Trying: add("c", "http://example.com/bar", "bar", "nini") Expecting nothing ok Trying: add("b", "http://example.com/", "first", "blah") Expecting nothing ok Trying: add("b", "http://example.com/", "second", "spam") Expecting nothing ok Trying: add("a", "http://example.com", "1", "a") Expecting nothing ok Trying: add("Some Realm", "http://c.example.com:3128", "3", "c") Expecting nothing ok Trying: add("Some Realm", "d.example.com", "4", "d") Expecting nothing ok Trying: add("Some Realm", "e.example.com:3128", "5", "e") Expecting nothing ok Trying: mgr.find_user_password("Some Realm", "example.com") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com/") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com/spam") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com/spam/spam") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("c", "http://example.com/foo") Expecting: ('foo', 'ni') ok Trying: mgr.find_user_password("c", "http://example.com/bar") Expecting: ('bar', 'nini') ok Trying: mgr.find_user_password("b", "http://example.com/") Expecting: ('second', 'spam') ok Trying: 
mgr.find_user_password("a", "http://example.com/") Expecting: ('1', 'a') ok Trying: mgr.find_user_password("a", "http://a.example.com/") Expecting: (None, None) ok Trying: mgr.find_user_password("Some Realm", "c.example.com") Expecting: (None, None) ok Trying: mgr.find_user_password("Some Realm", "c.example.com:3128") Expecting: ('3', 'c') ok Trying: mgr.find_user_password("Some Realm", "http://c.example.com:3128") Expecting: ('3', 'c') ok Trying: mgr.find_user_password("Some Realm", "d.example.com") Expecting: ('4', 'd') ok Trying: mgr.find_user_password("Some Realm", "e.example.com:3128") Expecting: ('5', 'e') ok Trying: mgr = urllib2.HTTPPasswordMgr() Expecting nothing ok Trying: add = mgr.add_password Expecting nothing ok Trying: add("f", "http://g.example.com:80", "10", "j") Expecting nothing ok Trying: add("g", "http://h.example.com", "11", "k") Expecting nothing ok Trying: add("h", "i.example.com:80", "12", "l") Expecting nothing ok Trying: add("i", "j.example.com", "13", "m") Expecting nothing ok Trying: mgr.find_user_password("f", "g.example.com:100") Expecting: (None, None) ok Trying: mgr.find_user_password("f", "g.example.com:80") Expecting: ('10', 'j') ok Trying: mgr.find_user_password("f", "g.example.com") Expecting: (None, None) ok Trying: mgr.find_user_password("f", "http://g.example.com:100") Expecting: (None, None) ok Trying: mgr.find_user_password("f", "http://g.example.com:80") Expecting: ('10', 'j') ok Trying: mgr.find_user_password("f", "http://g.example.com") Expecting: ('10', 'j') ok Trying: mgr.find_user_password("g", "h.example.com") Expecting: ('11', 'k') ok Trying: mgr.find_user_password("g", "h.example.com:80") Expecting: ('11', 'k') ok Trying: mgr.find_user_password("g", "http://h.example.com:80") Expecting: ('11', 'k') ok Trying: mgr.find_user_password("h", "i.example.com") Expecting: (None, None) ok Trying: mgr.find_user_password("h", "i.example.com:80") Expecting: ('12', 'l') ok Trying: mgr.find_user_password("h", "http://i.example.com:80") Expecting: ('12', 'l') ok Trying: mgr.find_user_password("i", "j.example.com") Expecting: ('13', 'm') ok Trying: mgr.find_user_password("i", "j.example.com:80") Expecting: (None, None) ok Trying: mgr.find_user_password("i", "http://j.example.com") Expecting: ('13', 'm') ok Trying: mgr.find_user_password("i", "http://j.example.com:80") Expecting: (None, None) ok Trying: url = "http://example.com" Expecting nothing ok Trying: Request(url, headers={"Spam-eggs": "blah"}).headers["Spam-eggs"] Expecting: 'blah' ok Trying: Request(url, headers={"spam-EggS": "blah"}).headers["Spam-eggs"] Expecting: 'blah' ok Trying: url = "http://example.com" Expecting nothing ok Trying: r = Request(url, headers={"Spam-eggs": "blah"}) Expecting nothing ok Trying: r.has_header("Spam-eggs") Expecting: True ok Trying: r.header_items() Expecting: [('Spam-eggs', 'blah')] ok Trying: r.add_header("Foo-Bar", "baz") Expecting nothing ok Trying: items = r.header_items() Expecting nothing ok Trying: items.sort() Expecting nothing ok Trying: items Expecting: [('Foo-bar', 'baz'), ('Spam-eggs', 'blah')] ok Trying: r.has_header("Not-there") Expecting: False ok Trying: print r.get_header("Not-there") Expecting: None ok Trying: r.get_header("Not-there", "default") Expecting: 'default' ok 93 items had no tests: test.test_urllib2 test.test_urllib2.FakeMethod test.test_urllib2.FakeMethod.__call__ test.test_urllib2.FakeMethod.__init__ test.test_urllib2.HandlerTests test.test_urllib2.HandlerTests._test_basic_auth 
test.test_urllib2.HandlerTests.test_basic_and_digest_auth_handlers test.test_urllib2.HandlerTests.test_basic_auth test.test_urllib2.HandlerTests.test_basic_auth_with_single_quoted_realm test.test_urllib2.HandlerTests.test_cookie_redirect test.test_urllib2.HandlerTests.test_cookies test.test_urllib2.HandlerTests.test_errors test.test_urllib2.HandlerTests.test_file test.test_urllib2.HandlerTests.test_ftp test.test_urllib2.HandlerTests.test_http test.test_urllib2.HandlerTests.test_http_doubleslash test.test_urllib2.HandlerTests.test_proxy test.test_urllib2.HandlerTests.test_proxy_basic_auth test.test_urllib2.HandlerTests.test_proxy_https test.test_urllib2.HandlerTests.test_proxy_https_proxy_authorization test.test_urllib2.HandlerTests.test_proxy_no_proxy test.test_urllib2.HandlerTests.test_redirect test.test_urllib2.MiscTests test.test_urllib2.MiscTests.opener_has_handler test.test_urllib2.MiscTests.test_build_opener test.test_urllib2.MockCookieJar test.test_urllib2.MockCookieJar.add_cookie_header test.test_urllib2.MockCookieJar.extract_cookies test.test_urllib2.MockFile test.test_urllib2.MockFile.close test.test_urllib2.MockFile.read test.test_urllib2.MockFile.readline test.test_urllib2.MockHTTPClass test.test_urllib2.MockHTTPClass.__call__ test.test_urllib2.MockHTTPClass.__init__ test.test_urllib2.MockHTTPClass.getresponse test.test_urllib2.MockHTTPClass.request test.test_urllib2.MockHTTPClass.set_debuglevel test.test_urllib2.MockHTTPClass.set_tunnel test.test_urllib2.MockHTTPHandler test.test_urllib2.MockHTTPHandler.__init__ test.test_urllib2.MockHTTPHandler.http_open test.test_urllib2.MockHTTPHandler.reset test.test_urllib2.MockHTTPResponse test.test_urllib2.MockHTTPResponse.__init__ test.test_urllib2.MockHTTPResponse.read test.test_urllib2.MockHTTPSHandler test.test_urllib2.MockHTTPSHandler.__init__ test.test_urllib2.MockHTTPSHandler.https_open test.test_urllib2.MockHandler test.test_urllib2.MockHandler.__init__ test.test_urllib2.MockHandler.__lt__ test.test_urllib2.MockHandler._define_methods test.test_urllib2.MockHandler.add_parent test.test_urllib2.MockHandler.close test.test_urllib2.MockHandler.handle test.test_urllib2.MockHeaders test.test_urllib2.MockHeaders.getheaders test.test_urllib2.MockOpener test.test_urllib2.MockOpener.error test.test_urllib2.MockOpener.open test.test_urllib2.MockPasswordManager test.test_urllib2.MockPasswordManager.add_password test.test_urllib2.MockPasswordManager.find_user_password test.test_urllib2.MockResponse test.test_urllib2.MockResponse.__init__ test.test_urllib2.MockResponse.geturl test.test_urllib2.MockResponse.info test.test_urllib2.OpenerDirectorTests test.test_urllib2.OpenerDirectorTests.test_add_non_handler test.test_urllib2.OpenerDirectorTests.test_badly_named_methods test.test_urllib2.OpenerDirectorTests.test_handled test.test_urllib2.OpenerDirectorTests.test_handler_order test.test_urllib2.OpenerDirectorTests.test_http_error test.test_urllib2.OpenerDirectorTests.test_processors test.test_urllib2.OpenerDirectorTests.test_raise test.test_urllib2.RequestTests test.test_urllib2.RequestTests.setUp test.test_urllib2.RequestTests.test_add_data test.test_urllib2.RequestTests.test_get_full_url test.test_urllib2.RequestTests.test_get_host test.test_urllib2.RequestTests.test_get_host_unquote test.test_urllib2.RequestTests.test_get_type test.test_urllib2.RequestTests.test_method test.test_urllib2.RequestTests.test_proxy test.test_urllib2.RequestTests.test_selector test.test_urllib2.TrivialTests test.test_urllib2.TrivialTests.test_parse_http_list 
test.test_urllib2.TrivialTests.test_trivial test.test_urllib2.add_ordered_mock_handlers test.test_urllib2.build_test_opener test.test_urllib2.sanepathname2url test.test_urllib2.test_main 4 items passed all tests: 27 tests in test.test_urllib2.test_password_manager 22 tests in test.test_urllib2.test_password_manager_default_port 3 tests in test.test_urllib2.test_request_headers_dict 11 tests in test.test_urllib2.test_request_headers_methods 63 tests in 97 items. 63 passed and 0 failed. Test passed. doctest (test.test_urllib2) ... 63 tests with zero failures Trying: _parse_proxy('file:/ftp.example.com/') Expecting: Traceback (most recent call last): ValueError: proxy URL with no authority: 'file:/ftp.example.com/' ok Trying: _parse_proxy('proxy.example.com') Expecting: (None, None, None, 'proxy.example.com') ok Trying: _parse_proxy('proxy.example.com:3128') Expecting: (None, None, None, 'proxy.example.com:3128') ok Trying: _parse_proxy('joe:password at proxy.example.com') Expecting: (None, 'joe', 'password', 'proxy.example.com') ok Trying: _parse_proxy('joe:password at proxy.example.com:3128') Expecting: (None, 'joe', 'password', 'proxy.example.com:3128') ok Trying: _parse_proxy('http://proxy.example.com/') Expecting: ('http', None, None, 'proxy.example.com') ok Trying: _parse_proxy('http://proxy.example.com:3128/') Expecting: ('http', None, None, 'proxy.example.com:3128') ok Trying: _parse_proxy('http://joe:password at proxy.example.com/') Expecting: ('http', 'joe', 'password', 'proxy.example.com') ok Trying: _parse_proxy('http://joe:password at proxy.example.com:3128') Expecting: ('http', 'joe', 'password', 'proxy.example.com:3128') ok Trying: _parse_proxy('ftp://joe:password at proxy.example.com/rubbish:3128') Expecting: ('ftp', 'joe', 'password', 'proxy.example.com') ok Trying: _parse_proxy('http://joe:password at proxy.example.com') Expecting: ('http', 'joe', 'password', 'proxy.example.com') ok 113 items had no tests: urllib2 urllib2.AbstractBasicAuthHandler urllib2.AbstractBasicAuthHandler.__init__ urllib2.AbstractBasicAuthHandler.http_error_auth_reqed urllib2.AbstractBasicAuthHandler.retry_http_basic_auth urllib2.AbstractDigestAuthHandler urllib2.AbstractDigestAuthHandler.__init__ urllib2.AbstractDigestAuthHandler.get_algorithm_impls urllib2.AbstractDigestAuthHandler.get_authorization urllib2.AbstractDigestAuthHandler.get_cnonce urllib2.AbstractDigestAuthHandler.get_entity_digest urllib2.AbstractDigestAuthHandler.http_error_auth_reqed urllib2.AbstractDigestAuthHandler.reset_retry_count urllib2.AbstractDigestAuthHandler.retry_http_digest_auth urllib2.AbstractHTTPHandler urllib2.AbstractHTTPHandler.__init__ urllib2.AbstractHTTPHandler.do_open urllib2.AbstractHTTPHandler.do_request_ urllib2.AbstractHTTPHandler.set_http_debuglevel urllib2.BaseHandler urllib2.BaseHandler.__lt__ urllib2.BaseHandler.add_parent urllib2.BaseHandler.close urllib2.CacheFTPHandler urllib2.CacheFTPHandler.__init__ urllib2.CacheFTPHandler.check_cache urllib2.CacheFTPHandler.connect_ftp urllib2.CacheFTPHandler.setMaxConns urllib2.CacheFTPHandler.setTimeout urllib2.FTPHandler urllib2.FTPHandler.connect_ftp urllib2.FTPHandler.ftp_open urllib2.FileHandler urllib2.FileHandler.file_open urllib2.FileHandler.get_names urllib2.FileHandler.open_local_file urllib2.HTTPBasicAuthHandler urllib2.HTTPBasicAuthHandler.http_error_401 urllib2.HTTPCookieProcessor urllib2.HTTPCookieProcessor.__init__ urllib2.HTTPCookieProcessor.http_request urllib2.HTTPCookieProcessor.https_response urllib2.HTTPDefaultErrorHandler 
urllib2.HTTPDefaultErrorHandler.http_error_default urllib2.HTTPDigestAuthHandler urllib2.HTTPDigestAuthHandler.http_error_401 urllib2.HTTPError urllib2.HTTPError.__init__ urllib2.HTTPError.__str__ urllib2.HTTPErrorProcessor urllib2.HTTPErrorProcessor.https_response urllib2.HTTPHandler urllib2.HTTPHandler.http_open urllib2.HTTPPasswordMgr urllib2.HTTPPasswordMgr.__init__ urllib2.HTTPPasswordMgr.add_password urllib2.HTTPPasswordMgr.find_user_password urllib2.HTTPPasswordMgr.is_suburi urllib2.HTTPPasswordMgr.reduce_uri urllib2.HTTPPasswordMgrWithDefaultRealm urllib2.HTTPPasswordMgrWithDefaultRealm.find_user_password urllib2.HTTPRedirectHandler urllib2.HTTPRedirectHandler.http_error_307 urllib2.HTTPRedirectHandler.redirect_request urllib2.HTTPSHandler urllib2.HTTPSHandler.https_open urllib2.OpenerDirector urllib2.OpenerDirector.__init__ urllib2.OpenerDirector._call_chain urllib2.OpenerDirector._open urllib2.OpenerDirector.add_handler urllib2.OpenerDirector.close urllib2.OpenerDirector.error urllib2.OpenerDirector.open urllib2.ProxyBasicAuthHandler urllib2.ProxyBasicAuthHandler.http_error_407 urllib2.ProxyDigestAuthHandler urllib2.ProxyDigestAuthHandler.http_error_407 urllib2.ProxyHandler urllib2.ProxyHandler.__init__ urllib2.ProxyHandler.proxy_open urllib2.Request urllib2.Request.__getattr__ urllib2.Request.__init__ urllib2.Request.add_data urllib2.Request.add_header urllib2.Request.add_unredirected_header urllib2.Request.get_data urllib2.Request.get_full_url urllib2.Request.get_header urllib2.Request.get_host urllib2.Request.get_method urllib2.Request.get_origin_req_host urllib2.Request.get_selector urllib2.Request.get_type urllib2.Request.has_data urllib2.Request.has_header urllib2.Request.has_proxy urllib2.Request.header_items urllib2.Request.is_unverifiable urllib2.Request.set_proxy urllib2.URLError urllib2.URLError.__init__ urllib2.URLError.__str__ urllib2.UnknownHandler urllib2.UnknownHandler.unknown_open urllib2.build_opener urllib2.install_opener urllib2.parse_http_list urllib2.parse_keqv_list urllib2.randombytes urllib2.request_host urllib2.urlopen 1 items passed all tests: 11 tests in urllib2._parse_proxy 11 tests in 114 items. 11 passed and 0 failed. Test passed. doctest (urllib2) ... 11 tests with zero failures test_parse_http_list (test.test_urllib2.TrivialTests) ... ok test_trivial (test.test_urllib2.TrivialTests) ... ok test_add_non_handler (test.test_urllib2.OpenerDirectorTests) ... ok test_badly_named_methods (test.test_urllib2.OpenerDirectorTests) ... ok test_handled (test.test_urllib2.OpenerDirectorTests) ... ok test_handler_order (test.test_urllib2.OpenerDirectorTests) ... ok test_http_error (test.test_urllib2.OpenerDirectorTests) ... ok test_processors (test.test_urllib2.OpenerDirectorTests) ... ok test_raise (test.test_urllib2.OpenerDirectorTests) ... ok test_basic_and_digest_auth_handlers (test.test_urllib2.HandlerTests) ... ok test_basic_auth (test.test_urllib2.HandlerTests) ... ok test_basic_auth_with_single_quoted_realm (test.test_urllib2.HandlerTests) ... ok test_cookie_redirect (test.test_urllib2.HandlerTests) ... ok test_cookies (test.test_urllib2.HandlerTests) ... ok test_errors (test.test_urllib2.HandlerTests) ... ok test_file (test.test_urllib2.HandlerTests) ... ERROR test_ftp (test.test_urllib2.HandlerTests) ... ok test_http (test.test_urllib2.HandlerTests) ... ok test_http_doubleslash (test.test_urllib2.HandlerTests) ... ok test_proxy (test.test_urllib2.HandlerTests) ... ok test_proxy_basic_auth (test.test_urllib2.HandlerTests) ... 
ok test_proxy_https (test.test_urllib2.HandlerTests) ... ok test_proxy_https_proxy_authorization (test.test_urllib2.HandlerTests) ... ok test_proxy_no_proxy (test.test_urllib2.HandlerTests) ... ok test_redirect (test.test_urllib2.HandlerTests) ... ok test_build_opener (test.test_urllib2.MiscTests) ... ok test_add_data (test.test_urllib2.RequestTests) ... ok test_get_full_url (test.test_urllib2.RequestTests) ... ok test_get_host (test.test_urllib2.RequestTests) ... ok test_get_host_unquote (test.test_urllib2.RequestTests) ... ok test_get_type (test.test_urllib2.RequestTests) ... ok test_method (test.test_urllib2.RequestTests) ... ok test_proxy (test.test_urllib2.RequestTests) ... ok test_selector (test.test_urllib2.RequestTests) ... ok ====================================================================== ERROR: test_file (test.test_urllib2.HandlerTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_urllib2.py", line 711, in test_file h.file_open, Request(url)) File "/tmp/python-test/local/lib/python2.7/unittest/case.py", line 456, in assertRaises callableObj(*args, **kwargs) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1269, in file_open return self.open_local_file(req) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1301, in open_local_file (not port and socket.gethostbyname(host) in self.get_names()): gaierror: [Errno -2] Name or service not known ---------------------------------------------------------------------- Ran 34 tests in 19.868s FAILED (errors=1) test test_urllib2 failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_urllib2.py", line 711, in test_file h.file_open, Request(url)) File "/tmp/python-test/local/lib/python2.7/unittest/case.py", line 456, in assertRaises callableObj(*args, **kwargs) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1269, in file_open return self.open_local_file(req) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1301, in open_local_file (not port and socket.gethostbyname(host) in self.get_names()): gaierror: [Errno -2] Name or service not known test_urllib2_localnet test_urllib2net test_urllib2net skipped -- Use of the `network' resource not enabled test_urllibnet test_urllibnet skipped -- Use of the `network' resource not enabled test_urlparse test_userdict test_userlist test_userstring test_uu test_uuid test_wait3 test_wait4 test_warnings [15679 refs] [15679 refs] [15672 refs] [15679 refs] [15679 refs] [15672 refs] test_wave test_weakref test_weakset test_whichdb test_winreg test_winreg skipped -- No module named _winreg test_winsound test_winsound skipped -- No module named winsound test_with test_wsgiref test_xdrlib test_xml_etree test_xml_etree_c test_xmllib test_xmlrpc test_xpickle test_xpickle -- skipping backwards compat tests. Use 'regrtest.py -u xpickle' to run them. test_xrange test_zipfile test_zipfile64 test_zipfile64 skipped -- test requires loads of disk-space bytes and a long time to run test_zipimport test_zipimport_support test_zlib 346 tests OK. 
1 test failed: test_urllib2 37 tests skipped: test_aepack test_al test_applesingle test_bsddb185 test_bsddb3 test_cd test_cl test_curses test_epoll test_gdb test_gl test_imgfile test_ioctl test_kqueue test_linuxaudiodev test_macos test_macostools test_multiprocessing test_ossaudiodev test_pep277 test_py3kwarn test_scriptpackages test_smtpnet test_socketserver test_startfile test_sunaudiodev test_tcl test_timeout test_tk test_ttk_guionly test_ttk_textonly test_unicode_file test_urllib2net test_urllibnet test_winreg test_winsound test_zipfile64 7 skips unexpected on linux2: test_epoll test_gdb test_ioctl test_multiprocessing test_tk test_ttk_guionly test_ttk_textonly [965808 refs] From nnorwitz at gmail.com Sat Feb 5 22:18:25 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Sat, 5 Feb 2011 16:18:25 -0500 Subject: [Python-checkins] Python Regression Test Failures basics (1) Message-ID: <20110205211825.GA3148@kbk-i386-bb.dyndns.org> 346 tests OK. 1 test failed: test_urllib2 37 tests skipped: test_aepack test_al test_applesingle test_bsddb185 test_bsddb3 test_cd test_cl test_curses test_epoll test_gdb test_gl test_imgfile test_ioctl test_kqueue test_linuxaudiodev test_macos test_macostools test_multiprocessing test_ossaudiodev test_pep277 test_py3kwarn test_scriptpackages test_smtpnet test_socketserver test_startfile test_sunaudiodev test_tcl test_timeout test_tk test_ttk_guionly test_ttk_textonly test_unicode_file test_urllib2net test_urllibnet test_winreg test_winsound test_zipfile64 7 skips unexpected on linux2: test_epoll test_gdb test_ioctl test_multiprocessing test_tk test_ttk_guionly test_ttk_textonly == CPython 2.7 (trunk:88351M, Feb 5 2011, 16:01:41) [GCC 3.3.4 20040623 (Gentoo Linux 3.3.4-r1, ssp-3.3.2-2, pie-8.7.6)] == Linux-2.6.9-gentoo-r1-i686-AMD_Athlon-tm-_XP_3000+-with-gentoo-1.4.16 little-endian == /tmp/test_python_31479 test_grammar test_opcodes test_dict test_builtin test_exceptions test_types test_unittest test_doctest test_doctest2 test_MimeWriter test_SimpleHTTPServer test_StringIO test___all__ test___future__ test__locale test_abc test_abstract_numbers test_aepack test_aepack skipped -- No module named aetypes test_aifc test_al test_al skipped -- No module named al test_anydbm test_applesingle test_applesingle skipped -- No module named MacOS test_argparse test_array test_ascii_formatd test_ast test_asynchat test_asyncore test_atexit test_audioop test_augassign test_base64 test_bastion test_bigaddrspace test_bigmem test_binascii test_binhex test_binop test_bisect test_bool test_bsddb test_bsddb185 test_bsddb185 skipped -- No module named bsddb185 test_bsddb3 test_bsddb3 skipped -- Use of the `bsddb' resource not enabled test_buffer test_bufio test_bytes test_bz2 test_calendar test_call test_capi test_cd test_cd skipped -- No module named cd test_cfgparser test_cgi test_charmapcodec test_cl test_cl skipped -- No module named cl test_class test_cmath test_cmd test_cmd_line test_cmd_line_script test_code test_codeccallbacks test_codecencodings_cn test_codecencodings_hk test_codecencodings_jp test_codecencodings_kr test_codecencodings_tw test_codecmaps_cn fetching http://source.icu-project.org/repos/icu/data/trunk/charset/data/xml/gb-18030-2000.xml ... fetching http://people.freebsd.org/~perky/i18n/EUC-CN.TXT ... fetching http://www.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/WINDOWS/CP936.TXT ... test_codecmaps_hk fetching http://people.freebsd.org/~perky/i18n/BIG5HKSCS-2004.TXT ... 
test_codecmaps_jp fetching http://www.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/WINDOWS/CP932.TXT ... fetching http://people.freebsd.org/~perky/i18n/EUC-JISX0213.TXT ... fetching http://people.freebsd.org/~perky/i18n/EUC-JP.TXT ... fetching http://www.unicode.org/Public/MAPPINGS/OBSOLETE/EASTASIA/JIS/SHIFTJIS.TXT ... fetching http://people.freebsd.org/~perky/i18n/SHIFT_JISX0213.TXT ... test_codecmaps_kr fetching http://www.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/WINDOWS/CP949.TXT ... fetching http://people.freebsd.org/~perky/i18n/EUC-KR.TXT ... fetching http://www.unicode.org/Public/MAPPINGS/OBSOLETE/EASTASIA/KSC/JOHAB.TXT ... test_codecmaps_tw fetching http://www.unicode.org/Public/MAPPINGS/OBSOLETE/EASTASIA/OTHER/BIG5.TXT ... fetching http://www.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/WINDOWS/CP950.TXT ... test_codecs test_codeop test_coding test_coercion test_collections test_colorsys test_commands test_compare test_compile test_compileall test_compiler test_complex test_complex_args test_contains test_contextlib test_cookie test_cookielib test_copy test_copy_reg test_cpickle test_cprofile test_crypt test_csv test_ctypes test_curses test_curses skipped -- Use of the `curses' resource not enabled test_datetime test_dbm test_decimal test_decorators test_defaultdict test_deque test_descr test_descrtut test_dictcomps test_dictviews test_difflib test_dircache test_dis test_distutils [19643 refs] test_dl test_docxmlrpc test_dumbdbm test_dummy_thread test_dummy_threading test_email test_email_codecs test_email_renamed test_enumerate test_eof test_epoll test_epoll skipped -- kernel doesn't support epoll() test_errno test_exception_variations test_extcall test_fcntl test_file test_file2k test_filecmp test_fileinput test_fileio test_float test_fnmatch test_fork1 test_format test_fpformat test_fractions test_frozen test_ftplib test_funcattrs test_functools test_future test_future3 test_future4 test_future5 test_future_builtins test_gc test_gdb test_gdb skipped -- gdb versions before 7.0 didn't support python embedding Saw: GNU gdb 6.2.1 Copyright 2004 Free Software Foundation, Inc. GDB is free software, covered by the GNU General Public License, and you are welcome to change it and/or distribute copies of it under certain conditions. Type "show copying" to see the conditions. There is absolutely no warranty for GDB. Type "show warranty" for details. This GDB was configured as "i386-pc-linux-gnu". 
test_gdbm test_generators test_genericpath test_genexps test_getargs test_getargs2 test_getopt test_gettext test_gl test_gl skipped -- No module named gl test_glob test_global test_grp test_gzip test_hash test_hashlib test_heapq test_hmac test_hotshot test_htmllib test_htmlparser test_httplib test_httpservers [15648 refs] [15648 refs] [15648 refs] [25315 refs] test_imageop test_imaplib test_imgfile test_imgfile skipped -- No module named imgfile test_imp test_import test_importhooks test_importlib test_index test_inspect test_int test_int_literal test_io test_ioctl test_ioctl skipped -- Unable to open /dev/tty test_isinstance test_iter test_iterlen test_itertools test_json test_kqueue test_kqueue skipped -- test works only on BSD test_largefile test_lib2to3 test_linecache test_linuxaudiodev test_linuxaudiodev skipped -- Use of the `audio' resource not enabled test_list test_locale test_logging test_long test_long_future test_longexp test_macos test_macos skipped -- No module named MacOS test_macostools test_macostools skipped -- No module named MacOS test_macpath test_mailbox test_marshal test_math test_md5 test_memoryio test_memoryview test_mhlib test_mimetools test_mimetypes test_minidom test_mmap test_module test_modulefinder test_multibytecodec test_multibytecodec_support test_multifile test_multiprocessing test_multiprocessing skipped -- This platform lacks a functioning sem_open implementation, therefore, the required synchronization primitives needed will not function, see issue 3770. test_mutants test_mutex test_netrc test_new test_nis test_normalization fetching http://www.unicode.org/Public/5.2.0/ucd/NormalizationTest.txt ... test_ntpath test_old_mailbox test_openpty test_operator test_optparse test_os [15648 refs] [15648 refs] test_ossaudiodev test_ossaudiodev skipped -- Use of the `audio' resource not enabled test_parser Expecting 's_push: parser stack overflow' in next line s_push: parser stack overflow test_pdb test_peepholer test_pep247 test_pep263 test_pep277 test_pep277 skipped -- only NT+ and systems with Unicode-friendly filesystem encoding test_pep292 test_pep352 test_pickle test_pickletools test_pipes test_pkg test_pkgimport test_pkgutil test_platform [17044 refs] [17044 refs] test_plistlib test_poll test_popen [15653 refs] [15653 refs] [15653 refs] test_popen2 test_poplib test_posix test_posixpath test_pow test_pprint test_print test_profile test_profilehooks test_property test_pstats test_pty test_pwd test_py3kwarn test_py3kwarn skipped -- test.test_py3kwarn must be run with the -3 flag test_pyclbr test_pydoc [20862 refs] [20862 refs] [20862 refs] [20862 refs] [20862 refs] [20861 refs] [20861 refs] test_pyexpat test_queue test_quopri [18479 refs] [18479 refs] test_random test_re test_readline test_repr test_resource test_rfc822 test_richcmp test_robotparser test_runpy test_sax test_scope test_scriptpackages test_scriptpackages skipped -- No module named aetools test_select test_set test_setcomps test_sets test_sgmllib test_sha test_shelve test_shlex test_shutil test_signal test_site [15648 refs] [15648 refs] [15648 refs] [15648 refs] test_slice test_smtplib test_smtpnet test_smtpnet skipped -- Use of the `network' resource not enabled test_socket test_socketserver test_socketserver skipped -- Use of the `network' resource not enabled test_softspace test_sort test_sqlite test_ssl test_startfile test_startfile skipped -- module os has no attribute startfile test_str test_strftime test_string test_stringprep test_strop test_strptime test_strtod test_struct 
test_structmembers test_structseq test_subprocess [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] . [15648 refs] [15648 refs] this bit of output is from a test of stdout in a different process ... [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] . [15648 refs] [15648 refs] this bit of output is from a test of stdout in a different process ... [15648 refs] [15648 refs] [15863 refs] test_sunaudiodev test_sunaudiodev skipped -- No module named sunaudiodev test_sundry test_symtable test_syntax test_sys [15648 refs] [15648 refs] [15648 refs] [15877 refs] [15671 refs] test_sysconfig [15648 refs] [15648 refs] test_tarfile test_tcl test_tcl skipped -- No module named _tkinter test_telnetlib test_tempfile [15648 refs] test_textwrap test_thread test_threaded_import test_threadedtempfile test_threading [18948 refs] [21123 refs] [20035 refs] [20035 refs] [20035 refs] [20035 refs] test_threading_local test_threadsignals test_time test_timeout test_timeout skipped -- Use of the `network' resource not enabled test_tk test_tk skipped -- No module named _tkinter test_tokenize test_trace test_traceback test_transformer test_ttk_guionly test_ttk_guionly skipped -- No module named _tkinter test_ttk_textonly test_ttk_textonly skipped -- No module named _tkinter test_tuple test_typechecks test_ucn test_unary test_undocumented_details test_unicode test_unicode_file test_unicode_file skipped -- No Unicode filesystem semantics on this platform. 
test_unicodedata test_univnewlines test_univnewlines2k test_unpack test_urllib test_urllib2 test test_urllib2 failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_urllib2.py", line 711, in test_file h.file_open, Request(url)) File "/tmp/python-test/local/lib/python2.7/unittest/case.py", line 456, in assertRaises callableObj(*args, **kwargs) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1269, in file_open return self.open_local_file(req) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1301, in open_local_file (not port and socket.gethostbyname(host) in self.get_names()): gaierror: [Errno -2] Name or service not known Re-running test 'test_urllib2' in verbose mode Trying: mgr = urllib2.HTTPPasswordMgr() Expecting nothing ok Trying: add = mgr.add_password Expecting nothing ok Trying: add("Some Realm", "http://example.com/", "joe", "password") Expecting nothing ok Trying: add("Some Realm", "http://example.com/ni", "ni", "ni") Expecting nothing ok Trying: add("c", "http://example.com/foo", "foo", "ni") Expecting nothing ok Trying: add("c", "http://example.com/bar", "bar", "nini") Expecting nothing ok Trying: add("b", "http://example.com/", "first", "blah") Expecting nothing ok Trying: add("b", "http://example.com/", "second", "spam") Expecting nothing ok Trying: add("a", "http://example.com", "1", "a") Expecting nothing ok Trying: add("Some Realm", "http://c.example.com:3128", "3", "c") Expecting nothing ok Trying: add("Some Realm", "d.example.com", "4", "d") Expecting nothing ok Trying: add("Some Realm", "e.example.com:3128", "5", "e") Expecting nothing ok Trying: mgr.find_user_password("Some Realm", "example.com") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com/") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com/spam") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("Some Realm", "http://example.com/spam/spam") Expecting: ('joe', 'password') ok Trying: mgr.find_user_password("c", "http://example.com/foo") Expecting: ('foo', 'ni') ok Trying: mgr.find_user_password("c", "http://example.com/bar") Expecting: ('bar', 'nini') ok Trying: mgr.find_user_password("b", "http://example.com/") Expecting: ('second', 'spam') ok Trying: mgr.find_user_password("a", "http://example.com/") Expecting: ('1', 'a') ok Trying: mgr.find_user_password("a", "http://a.example.com/") Expecting: (None, None) ok Trying: mgr.find_user_password("Some Realm", "c.example.com") Expecting: (None, None) ok Trying: mgr.find_user_password("Some Realm", "c.example.com:3128") Expecting: ('3', 'c') ok Trying: mgr.find_user_password("Some Realm", "http://c.example.com:3128") Expecting: ('3', 'c') ok Trying: mgr.find_user_password("Some Realm", "d.example.com") Expecting: ('4', 'd') ok Trying: mgr.find_user_password("Some Realm", "e.example.com:3128") Expecting: ('5', 'e') ok Trying: mgr = urllib2.HTTPPasswordMgr() Expecting nothing ok Trying: add = mgr.add_password Expecting nothing ok Trying: add("f", "http://g.example.com:80", "10", "j") Expecting nothing ok Trying: add("g", "http://h.example.com", "11", "k") Expecting nothing ok Trying: add("h", "i.example.com:80", "12", "l") Expecting nothing ok Trying: add("i", "j.example.com", "13", "m") Expecting nothing ok Trying: mgr.find_user_password("f", "g.example.com:100") Expecting: (None, None) ok 
Trying: mgr.find_user_password("f", "g.example.com:80") Expecting: ('10', 'j') ok Trying: mgr.find_user_password("f", "g.example.com") Expecting: (None, None) ok Trying: mgr.find_user_password("f", "http://g.example.com:100") Expecting: (None, None) ok Trying: mgr.find_user_password("f", "http://g.example.com:80") Expecting: ('10', 'j') ok Trying: mgr.find_user_password("f", "http://g.example.com") Expecting: ('10', 'j') ok Trying: mgr.find_user_password("g", "h.example.com") Expecting: ('11', 'k') ok Trying: mgr.find_user_password("g", "h.example.com:80") Expecting: ('11', 'k') ok Trying: mgr.find_user_password("g", "http://h.example.com:80") Expecting: ('11', 'k') ok Trying: mgr.find_user_password("h", "i.example.com") Expecting: (None, None) ok Trying: mgr.find_user_password("h", "i.example.com:80") Expecting: ('12', 'l') ok Trying: mgr.find_user_password("h", "http://i.example.com:80") Expecting: ('12', 'l') ok Trying: mgr.find_user_password("i", "j.example.com") Expecting: ('13', 'm') ok Trying: mgr.find_user_password("i", "j.example.com:80") Expecting: (None, None) ok Trying: mgr.find_user_password("i", "http://j.example.com") Expecting: ('13', 'm') ok Trying: mgr.find_user_password("i", "http://j.example.com:80") Expecting: (None, None) ok Trying: url = "http://example.com" Expecting nothing ok Trying: Request(url, headers={"Spam-eggs": "blah"}).headers["Spam-eggs"] Expecting: 'blah' ok Trying: Request(url, headers={"spam-EggS": "blah"}).headers["Spam-eggs"] Expecting: 'blah' ok Trying: url = "http://example.com" Expecting nothing ok Trying: r = Request(url, headers={"Spam-eggs": "blah"}) Expecting nothing ok Trying: r.has_header("Spam-eggs") Expecting: True ok Trying: r.header_items() Expecting: [('Spam-eggs', 'blah')] ok Trying: r.add_header("Foo-Bar", "baz") Expecting nothing ok Trying: items = r.header_items() Expecting nothing ok Trying: items.sort() Expecting nothing ok Trying: items Expecting: [('Foo-bar', 'baz'), ('Spam-eggs', 'blah')] ok Trying: r.has_header("Not-there") Expecting: False ok Trying: print r.get_header("Not-there") Expecting: None ok Trying: r.get_header("Not-there", "default") Expecting: 'default' ok 93 items had no tests: test.test_urllib2 test.test_urllib2.FakeMethod test.test_urllib2.FakeMethod.__call__ test.test_urllib2.FakeMethod.__init__ test.test_urllib2.HandlerTests test.test_urllib2.HandlerTests._test_basic_auth test.test_urllib2.HandlerTests.test_basic_and_digest_auth_handlers test.test_urllib2.HandlerTests.test_basic_auth test.test_urllib2.HandlerTests.test_basic_auth_with_single_quoted_realm test.test_urllib2.HandlerTests.test_cookie_redirect test.test_urllib2.HandlerTests.test_cookies test.test_urllib2.HandlerTests.test_errors test.test_urllib2.HandlerTests.test_file test.test_urllib2.HandlerTests.test_ftp test.test_urllib2.HandlerTests.test_http test.test_urllib2.HandlerTests.test_http_doubleslash test.test_urllib2.HandlerTests.test_proxy test.test_urllib2.HandlerTests.test_proxy_basic_auth test.test_urllib2.HandlerTests.test_proxy_https test.test_urllib2.HandlerTests.test_proxy_https_proxy_authorization test.test_urllib2.HandlerTests.test_proxy_no_proxy test.test_urllib2.HandlerTests.test_redirect test.test_urllib2.MiscTests test.test_urllib2.MiscTests.opener_has_handler test.test_urllib2.MiscTests.test_build_opener test.test_urllib2.MockCookieJar test.test_urllib2.MockCookieJar.add_cookie_header test.test_urllib2.MockCookieJar.extract_cookies test.test_urllib2.MockFile test.test_urllib2.MockFile.close test.test_urllib2.MockFile.read 
test.test_urllib2.MockFile.readline test.test_urllib2.MockHTTPClass test.test_urllib2.MockHTTPClass.__call__ test.test_urllib2.MockHTTPClass.__init__ test.test_urllib2.MockHTTPClass.getresponse test.test_urllib2.MockHTTPClass.request test.test_urllib2.MockHTTPClass.set_debuglevel test.test_urllib2.MockHTTPClass.set_tunnel test.test_urllib2.MockHTTPHandler test.test_urllib2.MockHTTPHandler.__init__ test.test_urllib2.MockHTTPHandler.http_open test.test_urllib2.MockHTTPHandler.reset test.test_urllib2.MockHTTPResponse test.test_urllib2.MockHTTPResponse.__init__ test.test_urllib2.MockHTTPResponse.read test.test_urllib2.MockHTTPSHandler test.test_urllib2.MockHTTPSHandler.__init__ test.test_urllib2.MockHTTPSHandler.https_open test.test_urllib2.MockHandler test.test_urllib2.MockHandler.__init__ test.test_urllib2.MockHandler.__lt__ test.test_urllib2.MockHandler._define_methods test.test_urllib2.MockHandler.add_parent test.test_urllib2.MockHandler.close test.test_urllib2.MockHandler.handle test.test_urllib2.MockHeaders test.test_urllib2.MockHeaders.getheaders test.test_urllib2.MockOpener test.test_urllib2.MockOpener.error test.test_urllib2.MockOpener.open test.test_urllib2.MockPasswordManager test.test_urllib2.MockPasswordManager.add_password test.test_urllib2.MockPasswordManager.find_user_password test.test_urllib2.MockResponse test.test_urllib2.MockResponse.__init__ test.test_urllib2.MockResponse.geturl test.test_urllib2.MockResponse.info test.test_urllib2.OpenerDirectorTests test.test_urllib2.OpenerDirectorTests.test_add_non_handler test.test_urllib2.OpenerDirectorTests.test_badly_named_methods test.test_urllib2.OpenerDirectorTests.test_handled test.test_urllib2.OpenerDirectorTests.test_handler_order test.test_urllib2.OpenerDirectorTests.test_http_error test.test_urllib2.OpenerDirectorTests.test_processors test.test_urllib2.OpenerDirectorTests.test_raise test.test_urllib2.RequestTests test.test_urllib2.RequestTests.setUp test.test_urllib2.RequestTests.test_add_data test.test_urllib2.RequestTests.test_get_full_url test.test_urllib2.RequestTests.test_get_host test.test_urllib2.RequestTests.test_get_host_unquote test.test_urllib2.RequestTests.test_get_type test.test_urllib2.RequestTests.test_method test.test_urllib2.RequestTests.test_proxy test.test_urllib2.RequestTests.test_selector test.test_urllib2.TrivialTests test.test_urllib2.TrivialTests.test_parse_http_list test.test_urllib2.TrivialTests.test_trivial test.test_urllib2.add_ordered_mock_handlers test.test_urllib2.build_test_opener test.test_urllib2.sanepathname2url test.test_urllib2.test_main 4 items passed all tests: 27 tests in test.test_urllib2.test_password_manager 22 tests in test.test_urllib2.test_password_manager_default_port 3 tests in test.test_urllib2.test_request_headers_dict 11 tests in test.test_urllib2.test_request_headers_methods 63 tests in 97 items. 63 passed and 0 failed. Test passed. doctest (test.test_urllib2) ... 
63 tests with zero failures Trying: _parse_proxy('file:/ftp.example.com/') Expecting: Traceback (most recent call last): ValueError: proxy URL with no authority: 'file:/ftp.example.com/' ok Trying: _parse_proxy('proxy.example.com') Expecting: (None, None, None, 'proxy.example.com') ok Trying: _parse_proxy('proxy.example.com:3128') Expecting: (None, None, None, 'proxy.example.com:3128') ok Trying: _parse_proxy('joe:password at proxy.example.com') Expecting: (None, 'joe', 'password', 'proxy.example.com') ok Trying: _parse_proxy('joe:password at proxy.example.com:3128') Expecting: (None, 'joe', 'password', 'proxy.example.com:3128') ok Trying: _parse_proxy('http://proxy.example.com/') Expecting: ('http', None, None, 'proxy.example.com') ok Trying: _parse_proxy('http://proxy.example.com:3128/') Expecting: ('http', None, None, 'proxy.example.com:3128') ok Trying: _parse_proxy('http://joe:password at proxy.example.com/') Expecting: ('http', 'joe', 'password', 'proxy.example.com') ok Trying: _parse_proxy('http://joe:password at proxy.example.com:3128') Expecting: ('http', 'joe', 'password', 'proxy.example.com:3128') ok Trying: _parse_proxy('ftp://joe:password at proxy.example.com/rubbish:3128') Expecting: ('ftp', 'joe', 'password', 'proxy.example.com') ok Trying: _parse_proxy('http://joe:password at proxy.example.com') Expecting: ('http', 'joe', 'password', 'proxy.example.com') ok 113 items had no tests: urllib2 urllib2.AbstractBasicAuthHandler urllib2.AbstractBasicAuthHandler.__init__ urllib2.AbstractBasicAuthHandler.http_error_auth_reqed urllib2.AbstractBasicAuthHandler.retry_http_basic_auth urllib2.AbstractDigestAuthHandler urllib2.AbstractDigestAuthHandler.__init__ urllib2.AbstractDigestAuthHandler.get_algorithm_impls urllib2.AbstractDigestAuthHandler.get_authorization urllib2.AbstractDigestAuthHandler.get_cnonce urllib2.AbstractDigestAuthHandler.get_entity_digest urllib2.AbstractDigestAuthHandler.http_error_auth_reqed urllib2.AbstractDigestAuthHandler.reset_retry_count urllib2.AbstractDigestAuthHandler.retry_http_digest_auth urllib2.AbstractHTTPHandler urllib2.AbstractHTTPHandler.__init__ urllib2.AbstractHTTPHandler.do_open urllib2.AbstractHTTPHandler.do_request_ urllib2.AbstractHTTPHandler.set_http_debuglevel urllib2.BaseHandler urllib2.BaseHandler.__lt__ urllib2.BaseHandler.add_parent urllib2.BaseHandler.close urllib2.CacheFTPHandler urllib2.CacheFTPHandler.__init__ urllib2.CacheFTPHandler.check_cache urllib2.CacheFTPHandler.connect_ftp urllib2.CacheFTPHandler.setMaxConns urllib2.CacheFTPHandler.setTimeout urllib2.FTPHandler urllib2.FTPHandler.connect_ftp urllib2.FTPHandler.ftp_open urllib2.FileHandler urllib2.FileHandler.file_open urllib2.FileHandler.get_names urllib2.FileHandler.open_local_file urllib2.HTTPBasicAuthHandler urllib2.HTTPBasicAuthHandler.http_error_401 urllib2.HTTPCookieProcessor urllib2.HTTPCookieProcessor.__init__ urllib2.HTTPCookieProcessor.http_request urllib2.HTTPCookieProcessor.https_response urllib2.HTTPDefaultErrorHandler urllib2.HTTPDefaultErrorHandler.http_error_default urllib2.HTTPDigestAuthHandler urllib2.HTTPDigestAuthHandler.http_error_401 urllib2.HTTPError urllib2.HTTPError.__init__ urllib2.HTTPError.__str__ urllib2.HTTPErrorProcessor urllib2.HTTPErrorProcessor.https_response urllib2.HTTPHandler urllib2.HTTPHandler.http_open urllib2.HTTPPasswordMgr urllib2.HTTPPasswordMgr.__init__ urllib2.HTTPPasswordMgr.add_password urllib2.HTTPPasswordMgr.find_user_password urllib2.HTTPPasswordMgr.is_suburi urllib2.HTTPPasswordMgr.reduce_uri 
urllib2.HTTPPasswordMgrWithDefaultRealm urllib2.HTTPPasswordMgrWithDefaultRealm.find_user_password urllib2.HTTPRedirectHandler urllib2.HTTPRedirectHandler.http_error_307 urllib2.HTTPRedirectHandler.redirect_request urllib2.HTTPSHandler urllib2.HTTPSHandler.https_open urllib2.OpenerDirector urllib2.OpenerDirector.__init__ urllib2.OpenerDirector._call_chain urllib2.OpenerDirector._open urllib2.OpenerDirector.add_handler urllib2.OpenerDirector.close urllib2.OpenerDirector.error urllib2.OpenerDirector.open urllib2.ProxyBasicAuthHandler urllib2.ProxyBasicAuthHandler.http_error_407 urllib2.ProxyDigestAuthHandler urllib2.ProxyDigestAuthHandler.http_error_407 urllib2.ProxyHandler urllib2.ProxyHandler.__init__ urllib2.ProxyHandler.proxy_open urllib2.Request urllib2.Request.__getattr__ urllib2.Request.__init__ urllib2.Request.add_data urllib2.Request.add_header urllib2.Request.add_unredirected_header urllib2.Request.get_data urllib2.Request.get_full_url urllib2.Request.get_header urllib2.Request.get_host urllib2.Request.get_method urllib2.Request.get_origin_req_host urllib2.Request.get_selector urllib2.Request.get_type urllib2.Request.has_data urllib2.Request.has_header urllib2.Request.has_proxy urllib2.Request.header_items urllib2.Request.is_unverifiable urllib2.Request.set_proxy urllib2.URLError urllib2.URLError.__init__ urllib2.URLError.__str__ urllib2.UnknownHandler urllib2.UnknownHandler.unknown_open urllib2.build_opener urllib2.install_opener urllib2.parse_http_list urllib2.parse_keqv_list urllib2.randombytes urllib2.request_host urllib2.urlopen 1 items passed all tests: 11 tests in urllib2._parse_proxy 11 tests in 114 items. 11 passed and 0 failed. Test passed. doctest (urllib2) ... 11 tests with zero failures test_parse_http_list (test.test_urllib2.TrivialTests) ... ok test_trivial (test.test_urllib2.TrivialTests) ... ok test_add_non_handler (test.test_urllib2.OpenerDirectorTests) ... ok test_badly_named_methods (test.test_urllib2.OpenerDirectorTests) ... ok test_handled (test.test_urllib2.OpenerDirectorTests) ... ok test_handler_order (test.test_urllib2.OpenerDirectorTests) ... ok test_http_error (test.test_urllib2.OpenerDirectorTests) ... ok test_processors (test.test_urllib2.OpenerDirectorTests) ... ok test_raise (test.test_urllib2.OpenerDirectorTests) ... ok test_basic_and_digest_auth_handlers (test.test_urllib2.HandlerTests) ... ok test_basic_auth (test.test_urllib2.HandlerTests) ... ok test_basic_auth_with_single_quoted_realm (test.test_urllib2.HandlerTests) ... ok test_cookie_redirect (test.test_urllib2.HandlerTests) ... ok test_cookies (test.test_urllib2.HandlerTests) ... ok test_errors (test.test_urllib2.HandlerTests) ... ok test_file (test.test_urllib2.HandlerTests) ... ERROR test_ftp (test.test_urllib2.HandlerTests) ... ok test_http (test.test_urllib2.HandlerTests) ... ok test_http_doubleslash (test.test_urllib2.HandlerTests) ... ok test_proxy (test.test_urllib2.HandlerTests) ... ok test_proxy_basic_auth (test.test_urllib2.HandlerTests) ... ok test_proxy_https (test.test_urllib2.HandlerTests) ... ok test_proxy_https_proxy_authorization (test.test_urllib2.HandlerTests) ... ok test_proxy_no_proxy (test.test_urllib2.HandlerTests) ... ok test_redirect (test.test_urllib2.HandlerTests) ... ok test_build_opener (test.test_urllib2.MiscTests) ... ok test_add_data (test.test_urllib2.RequestTests) ... ok test_get_full_url (test.test_urllib2.RequestTests) ... ok test_get_host (test.test_urllib2.RequestTests) ... ok test_get_host_unquote (test.test_urllib2.RequestTests) ... 
ok test_get_type (test.test_urllib2.RequestTests) ... ok test_method (test.test_urllib2.RequestTests) ... ok test_proxy (test.test_urllib2.RequestTests) ... ok test_selector (test.test_urllib2.RequestTests) ... ok ====================================================================== ERROR: test_file (test.test_urllib2.HandlerTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_urllib2.py", line 711, in test_file h.file_open, Request(url)) File "/tmp/python-test/local/lib/python2.7/unittest/case.py", line 456, in assertRaises callableObj(*args, **kwargs) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1269, in file_open return self.open_local_file(req) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1301, in open_local_file (not port and socket.gethostbyname(host) in self.get_names()): gaierror: [Errno -2] Name or service not known ---------------------------------------------------------------------- Ran 34 tests in 19.658s FAILED (errors=1) test test_urllib2 failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_urllib2.py", line 711, in test_file h.file_open, Request(url)) File "/tmp/python-test/local/lib/python2.7/unittest/case.py", line 456, in assertRaises callableObj(*args, **kwargs) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1269, in file_open return self.open_local_file(req) File "/tmp/python-test/local/lib/python2.7/urllib2.py", line 1301, in open_local_file (not port and socket.gethostbyname(host) in self.get_names()): gaierror: [Errno -2] Name or service not known test_urllib2_localnet test_urllib2net test_urllib2net skipped -- Use of the `network' resource not enabled test_urllibnet test_urllibnet skipped -- Use of the `network' resource not enabled test_urlparse test_userdict test_userlist test_userstring test_uu test_uuid test_wait3 test_wait4 test_warnings [15679 refs] [15679 refs] [15672 refs] [15679 refs] [15679 refs] [15672 refs] test_wave test_weakref test_weakset test_whichdb test_winreg test_winreg skipped -- No module named _winreg test_winsound test_winsound skipped -- No module named winsound test_with test_wsgiref test_xdrlib test_xml_etree test_xml_etree_c test_xmllib test_xmlrpc test_xpickle test_xpickle -- skipping backwards compat tests. Use 'regrtest.py -u xpickle' to run them. test_xrange test_zipfile test_zipfile64 test_zipfile64 skipped -- test requires loads of disk-space bytes and a long time to run test_zipimport test_zipimport_support test_zlib 346 tests OK. 
1 test failed: test_urllib2 37 tests skipped: test_aepack test_al test_applesingle test_bsddb185 test_bsddb3 test_cd test_cl test_curses test_epoll test_gdb test_gl test_imgfile test_ioctl test_kqueue test_linuxaudiodev test_macos test_macostools test_multiprocessing test_ossaudiodev test_pep277 test_py3kwarn test_scriptpackages test_smtpnet test_socketserver test_startfile test_sunaudiodev test_tcl test_timeout test_tk test_ttk_guionly test_ttk_textonly test_unicode_file test_urllib2net test_urllibnet test_winreg test_winsound test_zipfile64 7 skips unexpected on linux2: test_epoll test_gdb test_ioctl test_multiprocessing test_tk test_ttk_guionly test_ttk_textonly [966926 refs] From python-checkins at python.org Sat Feb 5 23:16:40 2011 From: python-checkins at python.org (brett.cannon) Date: Sat, 5 Feb 2011 23:16:40 +0100 (CET) Subject: [Python-checkins] r88356 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110205221640.B14EBEEAF3@mail.python.org> Author: brett.cannon Date: Sat Feb 5 23:16:40 2011 New Revision: 88356 Log: Soften wording on doctest. Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 23:16:40 2011 @@ -483,10 +483,14 @@ friends. -Stop Using :mod:`doctest` -''''''''''''''''''''''''' -While 2to3 tries to port doctests properly, it's a rather tough thing to do. It -is probably best to simply convert your critical doctests to :mod:`unittest`. +Updating doctests +''''''''''''''''' +2to3_ will attempt to generate fixes for doctests that it comes across. It's +not perfect, though. If you wrote a monolithic set of doctests (e.g., a single +docstring containing all of your doctests), you should at least consider +breaking the doctests up into smaller pieces to make it more manageable to fix. +Otherwise it might very well be worth your time and effort to port your tests +to :mod:`unittest`. Eliminate ``-3`` Warnings From python-checkins at python.org Sat Feb 5 23:22:47 2011 From: python-checkins at python.org (brett.cannon) Date: Sat, 5 Feb 2011 23:22:47 +0100 (CET) Subject: [Python-checkins] r88357 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110205222247.ABD1FEE9BF@mail.python.org> Author: brett.cannon Date: Sat Feb 5 23:22:47 2011 New Revision: 88357 Log: Mention that people going the source compatibility route should run 2to3 to find pain points. Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Sat Feb 5 23:22:47 2011 @@ -596,7 +596,9 @@ or newer (the :mod:`__future__` statements work in Python 3 without issue), eliminating warnings that are triggered by ``-3``, etc. -Essentially you should cover all of the steps short of running 2to3 itself. +You should even consider running 2to3_ over your code (without committing the +changes). This will let you know where potential pain points are within your +code so that you can fix them properly before they become an issue. 
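The advice above can be tried out without touching the working copy: by default 2to3 only prints a unified diff of the fixes it would apply, and it also has a doctests-only mode. A minimal sketch, assuming a package directory named ``mypackage`` (the name is only a placeholder)::

    2to3 mypackage/        # print the proposed fixes as a diff; no files are changed
    2to3 -d mypackage/     # restrict the fixers to doctests embedded in docstrings
    2to3 -w mypackage/     # actually rewrite the files (keeps .bak backups unless -n is used)

Reading the diff from the first command is usually enough to spot the constructs that will need attention before they become real porting problems.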
Use six_ From python-checkins at python.org Sun Feb 6 00:54:10 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 06 Feb 2011 00:54:10 +0100 Subject: [Python-checkins] devguide: Either move or mark where updates are needed from svn to hg. Message-ID: brett.cannon pushed 391f973aa971 to devguide: http://hg.python.org/devguide/rev/391f973aa971 changeset: 235:391f973aa971 branch: hg_transition tag: tip user: Brett Cannon date: Sat Feb 05 15:53:56 2011 -0800 summary: Either move or mark where updates are needed from svn to hg. files: index.rst diff --git a/index.rst b/index.rst --- a/index.rst +++ b/index.rst @@ -89,9 +89,9 @@ * `Firefox search engine plug-in`_ * Buildbots_ * Source code - * `Browse online `_ - * `Daily snapshot `_ - * `Daily OS X installer `_ + * `Browse online `_ + * `Snapshot of py3k `_ + * `XXX Daily OS X installer `_ * Tool support * :ref:`emacs` * :ref:`gdb` @@ -107,7 +107,7 @@ .. _buildbots: http://python.org/dev/buildbot/ .. _Firefox search engine plug-in: http://www.python.org/dev/searchplugin/ -.. _Misc directory: http://svn.python.org/view/python/branches/py3k/Misc/ +.. _Misc directory: http://hg.python.org/cpython/file/tip/Misc .. _PEPs: http://www.python.org/dev/peps .. _PEP 7: http://www.python.org/dev/peps/pep-0007 .. _PEP 8: http://www.python.org/dev/peps/pep-0008 -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 02:10:15 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 06 Feb 2011 02:10:15 +0100 Subject: [Python-checkins] devguide: Tweak some URLs to start pointing towards hg. Message-ID: brett.cannon pushed 93128df24b2b to devguide: http://hg.python.org/devguide/rev/93128df24b2b changeset: 236:93128df24b2b branch: hg_transition user: Brett Cannon date: Sat Feb 05 16:03:57 2011 -0800 summary: Tweak some URLs to start pointing towards hg. files: setup.rst diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -14,11 +14,11 @@ Version Control Setup --------------------- -CPython is developed using `Subversion (commonly abbreviated SVN) -`_. +CPython is developed using `Mercurial (commonly abbreviated 'hg') +`_. It is easily available for common Unix systems by way of the standard package -manager; under Windows, you might want to use the `TortoiseSVN -`_ graphical client. +manager; under Windows, you might want to use the `TortoiseHg +`_ graphical client. .. _checkout: @@ -37,13 +37,18 @@ code. To get a read-only checkout of the :ref:`in-development ` branch of Python, run:: - svn co http://svn.python.org/projects/python/branches/py3k + hg clone http://hg.python.org/cpython#py3k + +.. warning:: + XXX assuming 'default' branch is unused. Regardless, probably wrong URL as + PEP 385 suggests py3k will be its own repo compared to Python 2 which is not + the case as of 2011-02-05. If this is wrong, also change all other hg URLs. 
If you want a read-only checkout of an already-released version of Python, i.e., a version in :ref:`maintenance mode `, run something like the following which gets you a checkout for Python 3.1:: - svn co http://svn.python.org/projects/python/branches/release31-maint + hg clone http://hg.python.org/cpython#release-31maint To check out a version of Python other than 3.1, simply change the number in the above URL to the major/minor version (e.g., ``release27-maint`` for Python -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 02:10:15 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 06 Feb 2011 02:10:15 +0100 Subject: [Python-checkins] devguide: Basic instructions on how to generate a patch with hg for non-committers. Message-ID: brett.cannon pushed 5a41c26a6906 to devguide: http://hg.python.org/devguide/rev/5a41c26a6906 changeset: 237:5a41c26a6906 branch: hg_transition tag: tip user: Brett Cannon date: Sat Feb 05 17:10:09 2011 -0800 summary: Basic instructions on how to generate a patch with hg for non-committers. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -7,6 +7,14 @@ Creating -------- +Tool Usage +'''''''''' + +Because Python uses hg as its version control system, **anyone** can make +commits locally to their repository. This means that you should make as many +commits to your code checkout as you want in order for you to work effectively. + + Preparation ''''''''''' @@ -67,24 +75,24 @@ To create your patch, you should generate a unified diff from your checkout's top-level directory:: - svn diff > patch.diff + hg outgoing --path > patch.diff If your work needs some new files to be added to the source tree, remember -to ``svn add`` them before generating the patch:: +to ``hg add`` them before generating the patch:: - svn add Lib/newfile.py - svn diff > patch.diff + hg add Lib/newfile.py + hg outgoing --patch > patch.diff To apply a patch generated this way, do:: - patch -p0 < patch.diff - -If the developer is using something other than svn to manage their code (e.g., -Mercurial), you might have to use ``-p1`` instead of ``-p0``. + patch -p1 < patch.diff To undo a patch, you can revert **all** changes made in your checkout:: - svn revert -R . + hg revert --all + +This will leave backups of the files with your changes still intact. To skip +that step, you can use the ``--no-backup`` flag. .. note:: The ``patch`` program is not available by default under Windows. You can find it `here `_, -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 03:31:31 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 06 Feb 2011 03:31:31 +0100 Subject: [Python-checkins] devguide: Tweak a svn URL for hg. Message-ID: brett.cannon pushed ce0793e3f97c to devguide: http://hg.python.org/devguide/rev/ce0793e3f97c changeset: 238:ce0793e3f97c branch: hg_transition user: Brett Cannon date: Sat Feb 05 17:15:58 2011 -0800 summary: Tweak a svn URL for hg. files: devcycle.rst diff --git a/devcycle.rst b/devcycle.rst --- a/devcycle.rst +++ b/devcycle.rst @@ -44,7 +44,7 @@ The branch for the next minor version; it is under active development for all kinds of changes: new features, semantic changes, performance improvements, bug fixes. It can be :ref:`checked out ` from -http://svn.python.org/projects/python/branches/py3k. +http://hg.python.org/cpython#py3k. 
Once a :ref:`final` release is made from the in-development branch (say, 3.2), a new :ref:`maintenance branch ` (e.g. ``release32-maint``) -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 03:31:31 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 06 Feb 2011 03:31:31 +0100 Subject: [Python-checkins] devguide: Old way of verifying commit privs working does not work on hg.python.org. Also Message-ID: brett.cannon pushed 07ff1215b0f6 to devguide: http://hg.python.org/devguide/rev/07ff1215b0f6 changeset: 239:07ff1215b0f6 branch: hg_transition tag: tip user: Brett Cannon date: Sat Feb 05 18:31:21 2011 -0800 summary: Old way of verifying commit privs working does not work on hg.python.org. Also update a URL. files: coredev.rst diff --git a/coredev.rst b/coredev.rst --- a/coredev.rst +++ b/coredev.rst @@ -68,7 +68,12 @@ You can verify your commit access by looking at http://www.python.org/dev/committers which lists all core developers by -username. You can also execute the follow command and look for the word +username. + +.. warning:: + XXX the technique below does not work with hg at hg.python.org + +You can also execute the follow command and look for the word "success" in the output:: ssh pythondev at svn.python.org @@ -112,7 +117,7 @@ For the development branch, you can check out the development branch with:: - svn co svn+ssh://pythondev at svn.python.org/python/branches/py3k + hg clone hg at hg.python.org/cpython#py3k Make the appropriate changes to the URL to checkout maintenance branches by removing ``py3k`` and replacing it with the name of the branch you want. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 03:44:37 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 06 Feb 2011 03:44:37 +0100 Subject: [Python-checkins] devguide: Drop svn.python.org as a source for python-checkins. Message-ID: brett.cannon pushed 1a2837321fbb to devguide: http://hg.python.org/devguide/rev/1a2837321fbb changeset: 240:1a2837321fbb branch: hg_transition tag: tip user: Brett Cannon date: Sat Feb 05 18:44:30 2011 -0800 summary: Drop svn.python.org as a source for python-checkins. files: communication.rst diff --git a/communication.rst b/communication.rst --- a/communication.rst +++ b/communication.rst @@ -31,7 +31,7 @@ it will get redirected here. Python-checkins_ sends out an email for every commit to Python's various -repositories (both svn.python.org and hg.python.org). All core developers +repositories from http://hg.python.org. All core developers subscribe to this list and are known to reply to these emails to make comments about various issues they catch in the commit. Replies get redirected to python-dev. 
-- Repository URL: http://hg.python.org/devguide From solipsis at pitrou.net Sun Feb 6 05:04:56 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 06 Feb 2011 05:04:56 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88357): sum=0 Message-ID: py3k results for svn r88357 (hg cset 64faf0a056e1) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflog15lIDH', '-x'] From python-checkins at python.org Sun Feb 6 07:11:29 2011 From: python-checkins at python.org (raymond.hettinger) Date: Sun, 6 Feb 2011 07:11:29 +0100 (CET) Subject: [Python-checkins] r88358 - python/branches/py3k/Doc/howto/sorting.rst Message-ID: <20110206061129.E58DCEE986@mail.python.org> Author: raymond.hettinger Date: Sun Feb 6 07:11:29 2011 New Revision: 88358 Log: Small markup and wording tweaks for the sorting-howto. Modified: python/branches/py3k/Doc/howto/sorting.rst Modified: python/branches/py3k/Doc/howto/sorting.rst ============================================================================== --- python/branches/py3k/Doc/howto/sorting.rst (original) +++ python/branches/py3k/Doc/howto/sorting.rst Sun Feb 6 07:11:29 2011 @@ -23,7 +23,7 @@ >>> sorted([5, 2, 3, 1, 4]) [1, 2, 3, 4, 5] -You can also use the :meth:`list.sort` method of a list. It modifies the list +You can also use the :meth:`list.sort` method. It modifies the list in-place (and returns *None* to avoid confusion). Usually it's less convenient than :func:`sorted` - but if you don't need the original list, it's slightly more efficient. @@ -87,9 +87,9 @@ ========================= The key-function patterns shown above are very common, so Python provides -convenience functions to make accessor functions easier and faster. The operator -module has :func:`operator.itemgetter`, :func:`operator.attrgetter`, and -an :func:`operator.methodcaller` function. +convenience functions to make accessor functions easier and faster. The +:mod:`operator` module has :func:`~operator.itemgetter`, +:func:`~operator.attrgetter`, and an :func:`~operator.methodcaller` function. Using those functions, the above examples become simpler and faster: @@ -248,7 +248,7 @@ [5, 4, 3, 2, 1] In Python 3.2, the :func:`functools.cmp_to_key` function was added to the -functools module in the standard library. +:mod:`functools` module in the standard library. Odd and Ends ============ @@ -256,7 +256,7 @@ * For locale aware sorting, use :func:`locale.strxfrm` for a key function or :func:`locale.strcoll` for a comparison function. -* The *reverse* parameter still maintains sort stability (i.e. records with +* The *reverse* parameter still maintains sort stability (so that records with equal keys retain the original order). 
Interestingly, that effect can be simulated without the parameter by using the builtin :func:`reversed` function twice: From python-checkins at python.org Sun Feb 6 07:26:06 2011 From: python-checkins at python.org (nick.coghlan) Date: Sun, 06 Feb 2011 07:26:06 +0100 Subject: [Python-checkins] devguide: Start updating the developer tools FAQ Message-ID: nick.coghlan pushed c50fd7423eab to devguide: http://hg.python.org/devguide/rev/c50fd7423eab changeset: 241:c50fd7423eab branch: hg_transition user: Nick Coghlan date: Sun Feb 06 16:06:57 2011 +1000 summary: Start updating the developer tools FAQ files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -6,60 +6,71 @@ Version Control ================================== -Where can I learn about the version control system used, Subversion (svn)? +Where can I learn about the version control system used, Mercurial (hg)? ------------------------------------------------------------------------------- -`Subversion`_'s (also known as ``svn``) official web site is at -http://subversion.apache.org/ . A book on Subversion published by -`O'Reilly Media`_, `Version Control with Subversion`_, is available +`Mercurial`_'s (also known as ``hg``) official web site is at +http://mercurial.selenic.com/. A book on Subversion published by +`O'Reilly Media`_, `Mercurial: The Definitive Guide`_, is available for free online. -With Subversion installed, you can run the help tool that comes with -Subversion to get help:: +With Mercurial installed, you can run the help tool that comes with +Mercurial to get help:: - svn help + hg help -The man page for ``svn`` is rather scant and not very helpful. +The man page for ``hg`` provides a quick refresher on the details of +various commands, but doesn't provide any guidance on overall +workflow. -.. _Subversion: http://subversion.apache.org/ +.. _Mercurial: http://mercurial.selenic.com/ .. _O'Reilly Media: http://www.oreilly.com/ -.. _Version Control with Subversion: http://svnbook.red-bean.com/ +.. _Mercurial: The Definitive Guide: http://hgbook.red-bean.com/ -What do I need to use Subversion? +What do I need to use Mercurial? ------------------------------------------------------------------------------- -.. _download Subversion: http://subversion.apache.org/packages.html +.. _download Mercurial: http://mercurial.selenic.com/downloads/ UNIX ''''''''''''''''''' -First, you need to `download Subversion`_. Most UNIX-based operating systems -have binary packages available. Also, most packaging systems also -have Subversion available. +First, you need to `download Mercurial`_. Most UNIX-based operating systems +have binary packages available. Most package management systems also +have native Mercurial packages available. If you have checkin rights, you need OpenSSH_. This is needed to verify -your identity when performing commits. +your identity when performing commits. As with Mercurial, binary packages +are typically available either online or through the platform's package +management system. .. _OpenSSH: http://www.openssh.org/ Windows ''''''''''''''''''' -You have several options on Windows. One is to `download Subversion`_ itself +XXX: The following instructions need verification. They're based on +the old SVN instructions plus the info at +http://mercurial.selenic.com/wiki/AccessingSshRepositoriesFromWindows +and https://bitbucket.org/tortoisehg/stable/wiki/ssh + +You have several options on Windows. One is to `download Mercurial`_ itself which will give you a command-line version. 
Another option is to `download -TortoiseSVN`_ which integrates with Windows Explorer. +TortoiseHg`_ which integrates with Windows Explorer. Note that this FAQ only +covers the command line client in detail - refer to the TortoiseHg +documentation for assistance with that tool. If you have checkin rights, you will also need an SSH client. `Download PuTTY and friends`_ (PuTTYgen, Pageant, and Plink) for this. All other questions in this FAQ will assume you are using these tools. -Once you have both Subversion and PuTTY installed you must tell Subversion +Once you have both Mercurial and PuTTY installed you must tell Subversion where to find an SSH client. Do this by editing -``%APPDATA%\Subversion\config`` to have the following -section:: +``%APPDATA%\Mercurial.ini`` to add the following entry (use the existing +``[ui]`` section if one is already present):: - [tunnels] + [ui] ssh="c:/path/to/putty/plink.exe" -T Change the path to be the proper one for your system. The ``-T`` @@ -70,28 +81,55 @@ you need to create a profile in PuTTY. Go to Session:Saved Sessions and create a new profile named -``svn.python.org``. In Session:Host Name, enter ``svn.python.org``. In +``hg.python.org``. In Session:Host Name, enter ``hg.python.org``. In SSH/Auth:Private key file select your private key. In Connection:Auto-login -username enter ``pythondev``. +username enter ``hg``. +XXX: Does the following comment still apply to TortoiseHg? With this set up, paths are slightly different than most other settings in that the username is not required. Do take notice of this when choosing to check out a project! -.. _download TortoiseSVN: http://tortoisesvn.net/downloads +.. _download TortoiseHg: http://tortoisehg.bitbucket.org/download/index.html .. _PuTTY: http://www.chiark.greenend.org.uk/~sgtatham/putty/ .. _download PuTTY and friends: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html +How do I link my local repository to a particular remote repository? +------------------------------------------------------------------------------- + +In ``.hg/hgrc`` file for the relevant local repository, add the following section:: + + [paths] + default = ssh://hg at hg.python.org/devguide + +This example is for a local repository that mirrors the ``devguide`` repository +on ``hg.python.org``. The same approach works for other remote repositories. + +How do I create a nickname for a remote repository? +------------------------------------------------------------------------------- + +In your global ``.hgrc`` file add a section similar to the following:: + + [paths] + dg = ssh://hg at hg.python.org/devguide + +This example creates a ``dg`` alias for the ``devguide`` repository +on ``hg.python.org``. This allows "dg" to be entered instead of the +full URL for commands such as ``hg pull`. + How do I update my working copy to be in sync with the repository? ------------------------------------------------------------------------------- Run:: - svn update + hg pull + hg update -from the directory you wish to update. The directory and all its -subdirectories will be updated. +from the directory you wish to update. The first command retrieves any +changes from the specified remote repository and merges them into the local +repository. The second commands updates the current directory and all its +subdirectories from the local repository. How do I add a file or directory to the repository? 
-- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 07:26:07 2011 From: python-checkins at python.org (nick.coghlan) Date: Sun, 06 Feb 2011 07:26:07 +0100 Subject: [Python-checkins] devguide: Describe the Rdiff extension for remote diffs Message-ID: nick.coghlan pushed dfcbecf10237 to devguide: http://hg.python.org/devguide/rev/dfcbecf10237 changeset: 242:dfcbecf10237 branch: hg_transition tag: tip user: Nick Coghlan date: Sun Feb 06 16:25:40 2011 +1000 summary: Describe the Rdiff extension for remote diffs files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -106,7 +106,8 @@ This example is for a local repository that mirrors the ``devguide`` repository on ``hg.python.org``. The same approach works for other remote repositories. -How do I create a nickname for a remote repository? + +How do I create a shorthand alias for a remote repository? ------------------------------------------------------------------------------- In your global ``.hgrc`` file add a section similar to the following:: @@ -118,7 +119,32 @@ on ``hg.python.org``. This allows "dg" to be entered instead of the full URL for commands such as ``hg pull`. -How do I update my working copy to be in sync with the repository? +Anywhere that ```` is used in the commands in this +FAQ, ``hg`` should accept an alias in place of a complete remote URL. + + +How do I compare my working copy to a remote repository? +------------------------------------------------------------------------------- + +First, retrieve and enable the `Rdiff extension` for Mercurial. + +The following command will then show the effect of updating the working copy +to match that remote repository:: + + hg diff `` + +To instead see the effects of applying the changes in the current working copy +to that remote repository, simply reverse the diff:: + + hg diff --reverse + +By executing them in a pristine working copy, these same commands work to +diff the local repository against a remote repository. + +.. _Rdiff extension: http://mercurial.selenic.com/wiki/RdiffExtension + + +How do I update my working copy to be in sync with a remote repository? ------------------------------------------------------------------------------- Run:: -- Repository URL: http://hg.python.org/devguide From ncoghlan at gmail.com Sun Feb 6 14:59:33 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 6 Feb 2011 23:59:33 +1000 Subject: [Python-checkins] devguide: Describe the Rdiff extension for remote diffs In-Reply-To: References: Message-ID: On Sun, Feb 6, 2011 at 4:26 PM, nick.coghlan wrote: > +How do I compare my working copy to a remote repository? > +------------------------------------------------------------------------------- To save anyone else pointing this out, I'm now aware that "hg incoming" and "hg outgoing" are the actual commands I want. Still, that kind of mistake is why I want to keep the dev FAQ around - to help people that don't know enough to avoid the misleading answers a web search will sometimes give back. Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? 
Brisbane, Australia From python-checkins at python.org Sun Feb 6 19:08:34 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:08:34 +0100 Subject: [Python-checkins] devguide: Fix markup Message-ID: antoine.pitrou pushed 2c7f907d9cee to devguide: http://hg.python.org/devguide/rev/2c7f907d9cee changeset: 243:2c7f907d9cee branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 18:54:37 2011 +0100 summary: Fix markup files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -25,7 +25,7 @@ .. _Mercurial: http://mercurial.selenic.com/ .. _O'Reilly Media: http://www.oreilly.com/ -.. _Mercurial: The Definitive Guide: http://hgbook.red-bean.com/ +.. _Mercurial\: The Definitive Guide: http://hgbook.red-bean.com/ What do I need to use Mercurial? @@ -117,7 +117,7 @@ This example creates a ``dg`` alias for the ``devguide`` repository on ``hg.python.org``. This allows "dg" to be entered instead of the -full URL for commands such as ``hg pull`. +full URL for commands such as ``hg pull``. Anywhere that ```` is used in the commands in this FAQ, ``hg`` should accept an alias in place of a complete remote URL. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 19:08:35 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:08:35 +0100 Subject: [Python-checkins] devguide: Give an example of using an alias Message-ID: antoine.pitrou pushed f2b40f412f1a to devguide: http://hg.python.org/devguide/rev/f2b40f412f1a changeset: 244:f2b40f412f1a branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 18:56:28 2011 +0100 summary: Give an example of using an alias files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -117,7 +117,8 @@ This example creates a ``dg`` alias for the ``devguide`` repository on ``hg.python.org``. This allows "dg" to be entered instead of the -full URL for commands such as ``hg pull``. +full URL for commands taking a repository argument (e.g. ``hg pull dg`` or +``hg outgoing dg``). Anywhere that ```` is used in the commands in this FAQ, ``hg`` should accept an alias in place of a complete remote URL. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 19:08:35 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:08:35 +0100 Subject: [Python-checkins] devguide: Mention pull -u Message-ID: antoine.pitrou pushed 1637258849f6 to devguide: http://hg.python.org/devguide/rev/1637258849f6 changeset: 245:1637258849f6 branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 19:06:41 2011 +0100 summary: Mention pull -u files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -158,6 +158,10 @@ repository. The second commands updates the current directory and all its subdirectories from the local repository. +You can combine the two one commands in one by using:: + + hg pull -u + How do I add a file or directory to the repository? 
------------------------------------------------------------------------------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 19:08:36 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:08:36 +0100 Subject: [Python-checkins] devguide: s/working copy/local repository/ Message-ID: antoine.pitrou pushed 082dbcff3137 to devguide: http://hg.python.org/devguide/rev/082dbcff3137 changeset: 246:082dbcff3137 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 19:08:31 2011 +0100 summary: s/working copy/local repository/ files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -145,7 +145,7 @@ .. _Rdiff extension: http://mercurial.selenic.com/wiki/RdiffExtension -How do I update my working copy to be in sync with a remote repository? +How do I update my local repository to be in sync with a remote repository? ------------------------------------------------------------------------------- Run:: -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 19:27:27 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:27:27 +0100 Subject: [Python-checkins] devguide: Translate some FAQ entries for hg Message-ID: antoine.pitrou pushed a3cb4170c035 to devguide: http://hg.python.org/devguide/rev/a3cb4170c035 changeset: 247:a3cb4170c035 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 19:27:24 2011 +0100 summary: Translate some FAQ entries for hg files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -168,38 +168,54 @@ Simply specify the path to the file or directory to add and run:: - svn add PATH + hg add PATH -Subversion will skip any directories it already knows about. But if -you want new files that exist in any directories specified in ``PATH``, specify -``--force`` and Subversion will check *all* directories for new files. +If ``PATH`` is a directory, Mercurial will recursively add any files in that +directory and its descendents. -You will then need to run ``svn commit`` (as discussed in -`How do I commit a change to a file?`_) to commit the file to the repository. +If you want Mercurial to figure out by itself which files should be added +and/or removed, just run:: + hg addremove + +**Be careful** though, as it might add some files that are not desired in +the repository (such as build products, cache files, or other data). + +You will then need to run ``hg commit`` (as discussed :ref:`below `) +to commit the file(s) to your local repository. + + +.. _hg-commit: How do I commit a change to a file? ------------------------------------------------------------------------------- -To have any changes to a file (which include adding a new file or deleting an -existing one), you use the command:: +To have any changes to a file (which include adding a new file or deleting +an existing one), you use the command:: - svn commit [PATH] + hg commit [PATH] -Although ``[PATH]`` is optional, if PATH is omitted all changes -in your local copy will be committed to the repository. -**DO NOT USE THIS!!!** You should specify the specific files -to be committed unless you are *absolutely* positive that -*all outstanding modifications* are meant to go in this commit. +``[PATH]`` is optional: if it is omitted, all changes in your working copy +will be committed to the local repository. 
When you commit, be sure that all +changes are desired; especially, when making commits that you intend to +push to public repositories, you should **not** commit together unrelated +changes. To abort a commit that you are in the middle of, leave the message empty (i.e., close the text editor without adding any text for the -message). Subversion will confirm if you want to abort the commit. +message). Mercurial will then abort the commit operation so that you can +try again later. -If you do not like the default text editor Subversion uses for -entering commmit messages, you may specify a different editor -in your Subversion config file with the -``editor-cmd`` option in the ``[helpers]`` section. +Once a change is committed to your local repository, it is still only visible +by you. This means you are free to experiment with as many local commits +you feel like. + +.. note:: + If you do not like the default text editor Mercurial uses for + entering commmit messages, you may specify a different editor, + either by changing the ``EDITOR`` environment variable or by setting + a Mercurial-specific editor in your global ``.hgrc`` with the ``editor`` + option in the ``[ui]`` section. How do I delete a file or directory in the repository? @@ -207,10 +223,11 @@ Specify the path to be removed with:: - svn delete PATH + hg rm PATH -Any modified files or files that are not checked in will not be deleted -in the working copy on your machine. +This will remove the file or the directory from your working copy; you will +have to :ref:`commit your changes ` for the removal to be recorded +in your local repository. What files are modified locally in my working copy? -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 19:28:09 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:28:09 +0100 Subject: [Python-checkins] devguide: Fix typo Message-ID: antoine.pitrou pushed 267a55dcfe0f to devguide: http://hg.python.org/devguide/rev/267a55dcfe0f changeset: 248:267a55dcfe0f branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 19:28:04 2011 +0100 summary: Fix typo files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -158,7 +158,7 @@ repository. The second commands updates the current directory and all its subdirectories from the local repository. -You can combine the two one commands in one by using:: +You can combine the two commands in one by using:: hg pull -u -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 19:37:30 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:37:30 +0100 Subject: [Python-checkins] devguide: More hg changes. I replaced the rdiff reference with hg in and hg out. Message-ID: antoine.pitrou pushed 08c641674c5c to devguide: http://hg.python.org/devguide/rev/08c641674c5c changeset: 249:08c641674c5c branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 19:37:26 2011 +0100 summary: More hg changes. I replaced the rdiff reference with hg in and hg out. files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -106,6 +106,9 @@ This example is for a local repository that mirrors the ``devguide`` repository on ``hg.python.org``. The same approach works for other remote repositories. +Anywhere that ```` is used in the commands in this +FAQ, ``hg`` will use the default remote repository if you omit the parameter. + How do I create a shorthand alias for a remote repository? 
------------------------------------------------------------------------------- @@ -124,25 +127,31 @@ FAQ, ``hg`` should accept an alias in place of a complete remote URL. -How do I compare my working copy to a remote repository? +How do I compare my local repository to a remote repository? ------------------------------------------------------------------------------- -First, retrieve and enable the `Rdiff extension` for Mercurial. +To display the list of changes that are in your local repository, but not +in the remote, use:: -The following command will then show the effect of updating the working copy -to match that remote repository:: + hg outgoing - hg diff `` +This is the list of changes that will be sent if you call +``hg push ``. -To instead see the effects of applying the changes in the current working copy -to that remote repository, simply reverse the diff:: +Conversely, for the list of changes that are in the remote repository but +not in the local, use:: - hg diff --reverse + hg incoming -By executing them in a pristine working copy, these same commands work to -diff the local repository against a remote repository. +This is the list of changes that will be retrieved if you call +``hg pull ``. -.. _Rdiff extension: http://mercurial.selenic.com/wiki/RdiffExtension +.. note:: + In most daily use, you will work against the default remote repository, + and therefore simply type ``hg outgoing`` and ``hg incoming``. + + In this case, you can also get a synthetic summary using + ``hg summary --remote``. How do I update my local repository to be in sync with a remote repository? @@ -160,7 +169,7 @@ You can combine the two commands in one by using:: - hg pull -u + hg pull -u How do I add a file or directory to the repository? -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 19:42:52 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 19:42:52 +0100 Subject: [Python-checkins] devguide: Remove SVN-specific stuff. Message-ID: antoine.pitrou pushed 77a133c7e859 to devguide: http://hg.python.org/devguide/rev/77a133c7e859 changeset: 250:77a133c7e859 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 19:42:49 2011 +0100 summary: Remove SVN-specific stuff. (note: hg does support subrepos, but I don't think we have any use for them currently) files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -260,14 +260,6 @@ = =========================== -How do I find out what Subversion properties are set for a file or directory? -------------------------------------------------------------------------------- - -:: - - svn proplist PATH - - How do I revert a file I have modified back to the version in the respository? ------------------------------------------------------------------------------- @@ -315,17 +307,6 @@ should be used to get a listing of all files modified in that revision. -How can I edit the log message of a committed revision? -------------------------------------------------------------------------------- - -Use:: - - svn propedit -r --revprop svn:log - -Replace ```` with the revision number of the commit whose log message -you wish to change. - - How do I get a diff between the repository and my working copy for a file? ------------------------------------------------------------------------------- @@ -431,29 +412,6 @@ using svnmerge.py. -How do I include an external svn repository (external definition) in the repository? 
------------------------------------------------------------------------------------- - -Before attempting to include an external svn repository into Python's -repository, it is important to realize that you can only include directories, -not individual files. - -To include a directory of an external definition (external svn repository) as a -directory you need to edit the ``svn:externals`` property on the root of the -repository you are working with using the format of:: - - local_directory remote_repositories_http_address - -For instance, to include Python's sandbox repository in the 'sandbox' directory -of your repository, run ``svn propedit svn:externals .`` while in the root of -your repository and enter:: - - sandbox http://svn.python.org/projects/sandbox/trunk/ - -in your text editor. The next time you run ``svn update`` it will pull in the -external definition. - - How can I create a directory in the sandbox? ------------------------------------------------------------------------------ -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:06:56 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:06:56 +0100 Subject: [Python-checkins] devguide: More hg changes Message-ID: antoine.pitrou pushed a75cbf2b304d to devguide: http://hg.python.org/devguide/rev/a75cbf2b304d changeset: 251:a75cbf2b304d branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 19:54:24 2011 +0100 summary: More hg changes files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -206,9 +206,9 @@ ``[PATH]`` is optional: if it is omitted, all changes in your working copy will be committed to the local repository. When you commit, be sure that all -changes are desired; especially, when making commits that you intend to -push to public repositories, you should **not** commit together unrelated -changes. +changes are desired by :ref:`reviewing them first `; +especially, when making commits that you intend to push to public repositories, +you should **not** commit together unrelated changes. To abort a commit that you are in the middle of, leave the message empty (i.e., close the text editor without adding any text for the @@ -239,25 +239,35 @@ in your local repository. -What files are modified locally in my working copy? +.. _hg-status: + +What files are modified in my working copy? ------------------------------------------------------------------------------- Running:: - svn status [PATH] + hg status -will list any differences between your working copy and the repository. Some +will list any pending changes in the working copy. These changes will get +commited to the local repository if you issue an ``hg commit`` without +specifying any path. + +Some key indicators that can appear in the first column of output are: -= =========================== -A Scheduled to be added + = =========================== + A Scheduled to be added -D Scheduled to be deleted + R Scheduled to be removed -M Modified locally + M Modified locally -? Not under version control -= =========================== + ? Not under version control + = =========================== + +If you want a line-by-line listing of the differences, use:: + + hg diff How do I revert a file I have modified back to the version in the respository? @@ -281,13 +291,13 @@ You want:: - svn blame PATH + hg annotate PATH -This will output to stdout every line of the file along with what revision -number last touched that line and who committed that revision. 
Since it is -printed to stdout, you probably want to pipe the output to a pager:: +This will output to stdout every line of the file along with which revision +last modified that line. When you have the revision number, it is then +easy to display it in detail using:: - svn blame PATH | less + hg log -vp -r How can I see a list of log messages for a file or specific revision? -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:06:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:06:57 +0100 Subject: [Python-checkins] devguide: Add a blurb to explain WC and repo Message-ID: antoine.pitrou pushed 237c7c95fd42 to devguide: http://hg.python.org/devguide/rev/237c7c95fd42 changeset: 252:237c7c95fd42 branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 20:04:31 2011 +0100 summary: Add a blurb to explain WC and repo files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -95,10 +95,36 @@ .. _download PuTTY and friends: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html +What's a working copy? What's a repository? +------------------------------------------- + +Mercurial is a "distributed" version control system. This means that each +participant, even casual contributors, download a complete copy (called a +*clone*, since it is obtained by calling ``hg clone``) of the central +repository which can be treated as a stand-alone repository for all purposes. +That copy is called in the FAQ the *local repository*, to differentiate +with any *remote repository* you might also interact with. + +But you don't modify files directly in the local repository; Mercurial doesn't +allow for it. You modify files in what's called the *working copy* associated +with your local repository: you also run compilations and tests there. +Once you are satisfied with your changes, you can :ref:`commit them `; +committing records the changes as a new *revision* in the *local repository*. + +Changes in your *local repository* don't get automatically shared with the +rest of the world. Mercurial ensures that you have to do so explicitly +(this allows you to experiment quite freely with multiple branches of +development, all on your private computer). The main commands for doing +so are ``hg pull`` and ``hg push``. + + How do I link my local repository to a particular remote repository? ------------------------------------------------------------------------------- -In ``.hg/hgrc`` file for the relevant local repository, add the following section:: +Your local repository is linked by default to the remote repository it +was *cloned* from. If you created it from scratch, however, it is not linked +to any remote repository. 
In ``.hg/hgrc`` file for the local repository, add +or modify the following section:: [paths] default = ssh://hg at hg.python.org/devguide -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:06:58 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:06:58 +0100 Subject: [Python-checkins] devguide: Clarify that hg out only displays committed changes Message-ID: antoine.pitrou pushed 870a74c7def5 to devguide: http://hg.python.org/devguide/rev/870a74c7def5 changeset: 253:870a74c7def5 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 20:06:49 2011 +0100 summary: Clarify that hg out only displays committed changes files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -162,7 +162,8 @@ hg outgoing This is the list of changes that will be sent if you call -``hg push ``. +``hg push ``. It does **not** include any :ref:`uncommitted +changes ` in your working copy! Conversely, for the list of changes that are in the remote repository but not in the local, use:: -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:30:00 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:30:00 +0100 Subject: [Python-checkins] devguide: Convert more FAQ entries to hg Message-ID: antoine.pitrou pushed a7601e01038b to devguide: http://hg.python.org/devguide/rev/a7601e01038b changeset: 254:a7601e01038b branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 20:12:45 2011 +0100 summary: Convert more FAQ entries to hg files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -302,15 +302,15 @@ Running:: - svn revert PATH + hg revert PATH -will change ``PATH`` to match the version in the repository, throwing away any +will revert ``PATH`` to its version in the repository, throwing away any changes you made locally. If you run:: - svn revert -R . + hg revert -a -from the root of your local repository it will recursively restore everything -to match up with the main server. +from the root of your working copy it will recursively restore everything +to match up with the repository. How do I find out who edited or what revision changed a line last? @@ -322,26 +322,29 @@ This will output to stdout every line of the file along with which revision last modified that line. When you have the revision number, it is then -easy to display it in detail using:: +easy to :ref:`display it in detail `. - hg log -vp -r +.. _hg-log: How can I see a list of log messages for a file or specific revision? --------------------------------------------------------------------- -To see the log messages for a specific file, run:: +To see the history of changes for a specific file, run:: - svn log PATH + hg log -v [PATH] -That will list all messages that pertain to the file specified in ``PATH``. +That will list all messages of revisions which modified the file specified +in ``PATH``. If ``PATH`` is omitted, all revisions are listed. -If you want to view the log message for a specific revision, run:: +If you want to display line-by-line differences for each revision as well, +add the ``-p`` option:: - svn log --verbose -r REV + hg log -vp [PATH] -With ``REV`` substituted with the revision number. The ``--verbose`` flag -should be used to get a listing of all files modified in that revision. 
+If you want to view the differences for a specific revision, run:: + + hg log -vp -r How do I get a diff between the repository and my working copy for a file? -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:30:01 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:30:01 +0100 Subject: [Python-checkins] devguide: Remove the "svn diff" entry: "hg diff" is treated together with "hg status" Message-ID: antoine.pitrou pushed 2c558efa9366 to devguide: http://hg.python.org/devguide/rev/2c558efa9366 changeset: 255:2c558efa9366 branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 20:13:40 2011 +0100 summary: Remove the "svn diff" entry: "hg diff" is treated together with "hg status" files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -347,26 +347,6 @@ hg log -vp -r -How do I get a diff between the repository and my working copy for a file? -------------------------------------------------------------------------------- - -The diff between your working copy and what is in the repository can be had -with:: - - svn diff PATH - -This will work off the current revision in the repository. To diff your -working copy with a specific revision, do:: - - svn diff -r REV PATH - -Finally, to generate a diff between two specific revisions, use:: - - svn diff -r REV1:REV2 PATH - -Notice the ``:`` between ``REV1`` and ``REV2``. - - How do I undo the changes made in a recent committal? ------------------------------------------------------------------------------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:30:01 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:30:01 +0100 Subject: [Python-checkins] devguide: hg backout Message-ID: antoine.pitrou pushed f00ff9019472 to devguide: http://hg.python.org/devguide/rev/f00ff9019472 changeset: 256:f00ff9019472 branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 20:27:41 2011 +0100 summary: hg backout files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -347,23 +347,24 @@ hg log -vp -r -How do I undo the changes made in a recent committal? +How do I undo the changes made in a recent commit? ------------------------------------------------------------------------------- -Assuming your bad revision is ``NEW`` and ``OLD`` is the equivalent of ``NEW -- 1``, then run:: +First, this should not happen if you take the habit of :ref:`reviewing changes +` before committing them. - svn merge -r NEW:OLD PATH +In any case, run:: -This will revert *all* files back to their state in revision ``OLD``. -The reason that ``OLD`` is just ``NEW - 1`` is you do not want files to be -accidentally reverted to a state older than your changes, just to the point -prior. + hg backout -Note: PATH here refers to the top of the checked out repository, -not the full pathname to a file. PATH can refer to a different -branch when merging from the head, but it must still be the top -and not an individual file or subdirectory. +This will commit a *new* revision reverting the exact changes made in +````. However, if other changes have been made since then, +you will have to merge them with that new revision. For that, run:: + + hg merge + hg commit + +.. XXX: "hg backout --merge" doesn't seem to work How do I update to a specific release tag? 
-- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:30:02 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:30:02 +0100 Subject: [Python-checkins] devguide: Remove "svn switch" entry Message-ID: antoine.pitrou pushed 7e1c407c0aa1 to devguide: http://hg.python.org/devguide/rev/7e1c407c0aa1 changeset: 257:7e1c407c0aa1 branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 20:28:04 2011 +0100 summary: Remove "svn switch" entry files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -390,36 +390,6 @@ svn update -r 39619 -Why should I use ``svn switch``? -------------------------------------------------------------------------------- - -If you picture each file/directory in Subversion as uniquely identified -by a 2-space coordinate system [URL, revision] (given a checkout, you can -use "svn info" to get its coordinates), then we can say that "svn up -r N" -(for some revision number N) keeps the url unchanged and changes the -revision to whatever number you specified. In other words, you get the -state of the working copy URL at the time revision N was created. For -instance, if you execute it with revision 39619 within the trunk working -copy, you will get the trunk at the moment 2.4.2 was released. - -On the other hand, "svn switch" moves the URL: it basically "moves" your -checkout from [old_URL, revision] to [new_URL, HEAD], downloading the -minimal set of diffs to do so. If the new_URL is a tag URL -(e.g. .../tags/r242), it means any revision is good, since nobody is going -to commit into that directory (it will stay unchanged forever). So -[/tags/r242, HEAD] is the same as any other [/tags/r242, revision] (assuming -of course that /tags/r242 was already created at the time the revision was -created). - -If you want to create a sandbox corresponding to a particular release tag, -use svn switch to switch to [/tags/some_tag, HEAD] if you don't plan on -doing modifications. On the other hand if you want to make modifications to -a particular release branch, use svn switch to change to -[/branches/some_branch, HEAD]. - -(Written by Giovanni Bajo on python-dev.) - - How do I create a branch? ------------------------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:30:03 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:30:03 +0100 Subject: [Python-checkins] devguide: Update FAQ entry about tags Message-ID: antoine.pitrou pushed dd431b68732e to devguide: http://hg.python.org/devguide/rev/dd431b68732e changeset: 258:dd431b68732e branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 20:29:30 2011 +0100 summary: Update FAQ entry about tags files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -372,22 +372,11 @@ Run:: - svn list svn+ssh://pythondev at svn.python.org/python/tags + hg tags -or visit:: +to get a list of tags. To update your working copy to a specific tag, use:: - http://svn.python.org/view/python/tags/ - -to get a list of tags. To switch your current sandbox to a specific tag, -run:: - - svn switch svn+ssh://pythondev at svn.python.org/python/tags/r242 - -To just update to the revision corresponding to that tag without changing -the metadata for the repository, note the revision number corresponding to -the tag of interest and update to it, e.g.:: - - svn update -r 39619 + hg update How do I create a branch? 
-- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:30:03 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:30:03 +0100 Subject: [Python-checkins] devguide: Remove the "how do I create a branch" question. It is too SVN-centric. Message-ID: antoine.pitrou pushed 70cdabddd60b to devguide: http://hg.python.org/devguide/rev/70cdabddd60b changeset: 259:70cdabddd60b branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 20:29:57 2011 +0100 summary: Remove the "how do I create a branch" question. It is too SVN-centric. files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -379,19 +379,6 @@ hg update -How do I create a branch? -------------------------- - -The best way is to do a server-side copy by specifying the URL for the source -of the branch, and the eventual destination URL for the new branch:: - - svn copy SRC_URL DEST_URL - -You can then checkout your branch as normal. You will want to prepare your -branch for future merging from the source branch so as to keep them in sync -using svnmerge.py. - - How can I create a directory in the sandbox? ------------------------------------------------------------------------------ -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 20:32:55 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 20:32:55 +0100 Subject: [Python-checkins] devguide: Update wording for hg Message-ID: antoine.pitrou pushed 754fc2536b09 to devguide: http://hg.python.org/devguide/rev/754fc2536b09 changeset: 260:754fc2536b09 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 20:32:44 2011 +0100 summary: Update wording for hg files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -450,11 +450,16 @@ .. _Pageant: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html -Can I make check-ins from machines other than the one I generated the keys on? + +Can I make commits from machines other than the one I generated the keys on? ------------------------------------------------------------------------------ -Yes, all you need is to make sure that the machine you want to check -in code from has both the public and private keys in the standard +You can :ref:`make commits ` from any machine, since they will be +recorded in your *local repository*. + +However, to push these changes to the remote server, you will need proper +credentials. All you need is to make sure that the machine you want to +push changes from has both the public and private keys in the standard place that ssh will look for them (i.e. ~/.ssh on Unix machines). Please note that although the key file ending in .pub contains your user name and machine name in it, that information is not used by the -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 21:08:57 2011 From: python-checkins at python.org (raymond.hettinger) Date: Sun, 6 Feb 2011 21:08:57 +0100 (CET) Subject: [Python-checkins] r88359 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110206200857.A3046EE98D@mail.python.org> Author: raymond.hettinger Date: Sun Feb 6 21:08:57 2011 New Revision: 88359 Log: Issue #11071: Various improvements to whatsnew. 
Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Sun Feb 6 21:08:57 2011 @@ -513,6 +513,7 @@ caused confusion and is no longer needed now that the shortest possible :func:`repr` is displayed by default: + >>> import math >>> repr(math.pi) '3.141592653589793' >>> str(math.pi) @@ -633,11 +634,10 @@ (See :issue:`10518`.) * Python's import mechanism can now load modules installed in directories with - non-ASCII characters in the path name: + non-ASCII characters in the path name. This solved an aggravating problem + with home directories for users with non-ASCII characters in their usernames. - >>> import m??se.bites - - (Required extensive work by Victor Stinner in :issue:`9425`.) + (Required extensive work by Victor Stinner in :issue:`9425`.) New, Improved, and Deprecated Modules @@ -646,14 +646,15 @@ Python's standard library has undergone significant maintenance efforts and quality improvements. -The biggest news for Python 3.2 is that the :mod:`email` package and -:mod:`nntplib` modules now work correctly with the bytes/text model in Python 3. -For the first time, there is correct handling of inputs with mixed encodings. +The biggest news for Python 3.2 is that the :mod:`email` package, :mod:`mailbox` +module, and :mod:`nntplib` modules now work correctly with the bytes/text model +in Python 3. For the first time, there is correct handling of message with +mixed encodings. Throughout the standard library, there has been more careful attention to encodings and text versus bytes issues. In particular, interactions with the -operating system are now better able to pass non-ASCII data using the Windows -MBCS encoding, locale-aware encodings, or UTF-8. +operating system are now better able to exchange non-ASCII data using the +Windows MBCS encoding, locale-aware encodings, or UTF-8. Another significant win is the addition of substantially better support for *SSL* connections and security certificates. @@ -822,6 +823,7 @@ * The :mod:`itertools` module has a new :func:`~itertools.accumulate` function modeled on APL's *scan* operator and Numpy's *accumulate* function: + >>> from itertools import accumulate >>> list(accumulate(8, 2, 50)) [8, 10, 60] @@ -911,6 +913,8 @@ Example of using barriers:: + from threading import Barrier, Thread + def get_votes(site): ballots = conduct_election(site) all_polls_closed.wait() # do not count until all polls are closed @@ -964,7 +968,7 @@ offset and timezone name. This makes it easier to create timezone-aware datetime objects:: - >>> import datetime + >>> from datetime import datetime, timezone >>> datetime.now(timezone.utc) datetime.datetime(2010, 12, 8, 21, 4, 2, 923754, tzinfo=datetime.timezone.utc) @@ -1069,12 +1073,12 @@ requires a particular :func:`classmethod` or :func:`staticmethod` to be implemented:: - class Temperature(metaclass=ABCMeta): + class Temperature(metaclass=abc.ABCMeta): @abc.abstractclassmethod - def from_fahrenheit(self, t): + def from_fahrenheit(cls, t): ... @abc.abstractclassmethod - def from_celsius(self, t): + def from_celsius(cls, t): ... (Patch submitted by Daniel Urban; :issue:`5867`.) 
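One small caveat on the :mod:`itertools` entry above: :func:`~itertools.accumulate` takes a
single iterable argument, so the values need to be wrapped in a list or tuple. A quick sanity
check in a Python 3.2 interpreter looks roughly like this::

    >>> from itertools import accumulate
    >>> list(accumulate([8, 2, 50]))        # running totals of the input iterable
    [8, 10, 60]
    >>> list(accumulate([1, 2, 3, 4, 5]))   # cumulative sums, one per element
    [1, 3, 6, 10, 15]

(In 3.2 only summing is supported; the optional function argument arrives in a later release.)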
@@ -1104,8 +1108,8 @@ >>> change_location(buffer, 1, b'warehouse ') >>> change_location(buffer, 0, b'showroom ') >>> print(byte_stream.getvalue()) - b'G3805 showroom Main chassis ' -> - b'X7899 warehouse Reserve cog ' -> + b'G3805 showroom Main chassis ' + b'X7899 warehouse Reserve cog ' b'L6988 receiving Primary sprocket' (Contributed by Antoine Pitrou in :issue:`5506`.) @@ -1425,7 +1429,7 @@ :: - >>> from ast import literal_request + >>> from ast import literal_eval >>> request = "{'req': 3, 'func': 'pow', 'args': (2, 0.5)}" >>> literal_eval(request) @@ -1491,7 +1495,8 @@ >>> import shutil, pprint >>> os.chdir('mydata') # change to the source directory - >>> f = make_archive('/var/backup/mydata', 'zip') # archive the current directory + >>> f = shutil.make_archive('/var/backup/mydata', + 'zip') # archive the current directory >>> f # show the name of archive '/var/backup/mydata.zip' >>> os.chdir('tmp') # change to an unpacking @@ -1505,8 +1510,8 @@ >>> shutil.register_archive_format( # register a new archive format name = 'xz', - function = 'xz.compress', - extra_args = [('level', 8)], + function = xz.compress, # callable archiving function + extra_args = [('level', 8)], # arguments to the function description = 'xz compression' ) @@ -1879,6 +1884,32 @@ 1: seq 2: i +In addition, the :func:`~dis.dis` function now accepts string arguments +so that the common idiom ``dis(compile(s, '', 'eval'))`` can be shortened +to ``dis(compile(s))``:: + + >>> dis('3*x+1 if x%2==1 else x//2') + 1 0 LOAD_NAME 0 (x) + 3 LOAD_CONST 0 (2) + 6 BINARY_MODULO + 7 LOAD_CONST 1 (1) + 10 COMPARE_OP 2 (==) + 13 POP_JUMP_IF_FALSE 28 + 16 LOAD_CONST 2 (3) + 19 LOAD_NAME 0 (x) + 22 BINARY_MULTIPLY + 23 LOAD_CONST 1 (1) + 26 BINARY_ADD + 27 RETURN_VALUE + >> 28 LOAD_NAME 0 (x) + 31 LOAD_CONST 0 (2) + 34 BINARY_FLOOR_DIVIDE + 35 RETURN_VALUE + +Taken together, these improvements make it easier to explore how CPython is +implemented and to see for yourself what the language syntax does +under-the-hood. + (Contributed by Nick Coghlan in :issue:`9147`.) dbm From python-checkins at python.org Sun Feb 6 22:00:38 2011 From: python-checkins at python.org (raymond.hettinger) Date: Sun, 6 Feb 2011 22:00:38 +0100 (CET) Subject: [Python-checkins] r88360 - python/branches/py3k/Doc/library/unittest.rst Message-ID: <20110206210038.325D2EE988@mail.python.org> Author: raymond.hettinger Date: Sun Feb 6 22:00:38 2011 New Revision: 88360 Log: Fix awkwardly rendered sentence. Modified: python/branches/py3k/Doc/library/unittest.rst Modified: python/branches/py3k/Doc/library/unittest.rst ============================================================================== --- python/branches/py3k/Doc/library/unittest.rst (original) +++ python/branches/py3k/Doc/library/unittest.rst Sun Feb 6 22:00:38 2011 @@ -352,9 +352,9 @@ The basic building blocks of unit testing are :dfn:`test cases` --- single scenarios that must be set up and checked for correctness. In :mod:`unittest`, -test cases are represented by instances of :mod:`unittest`'s :class:`TestCase` -class. To make your own test cases you must write subclasses of -:class:`TestCase`, or use :class:`FunctionTestCase`. +test cases are represented by :class:`unittest.TestCase` instances. +To make your own test cases you must write subclasses of +:class:`TestCase` or use :class:`FunctionTestCase`. 
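A minimal sketch of the subclassing approach described just above (an editor's illustration, not part of the patch; the class and test names are invented)::

    import unittest

    class StringMethodsTest(unittest.TestCase):
        def test_upper(self):
            # a single test method; setUp/tearDown are optional
            self.assertEqual('spam'.upper(), 'SPAM')

    if __name__ == '__main__':
        unittest.main()
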
An instance of a :class:`TestCase`\ -derived class is an object that can completely run a single test method, together with optional set-up and tidy-up From python-checkins at python.org Sun Feb 6 22:43:02 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 22:43:02 +0100 Subject: [Python-checkins] devguide: Update Windows instructions for hg Message-ID: antoine.pitrou pushed 9258ce1ba9c0 to devguide: http://hg.python.org/devguide/rev/9258ce1ba9c0 changeset: 261:9258ce1ba9c0 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 22:42:59 2011 +0100 summary: Update Windows instructions for hg files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -40,59 +40,39 @@ have binary packages available. Most package management systems also have native Mercurial packages available. -If you have checkin rights, you need OpenSSH_. This is needed to verify +If you have push rights, you need OpenSSH_. This is needed to verify your identity when performing commits. As with Mercurial, binary packages are typically available either online or through the platform's package management system. .. _OpenSSH: http://www.openssh.org/ + Windows ''''''''''''''''''' -XXX: The following instructions need verification. They're based on -the old SVN instructions plus the info at -http://mercurial.selenic.com/wiki/AccessingSshRepositoriesFromWindows -and https://bitbucket.org/tortoisehg/stable/wiki/ssh +The recommended option on Windows is to `download TortoiseHg`_ which +integrates with Windows Explorer and also bundles the command line client +(meaning you can type ``hg`` in a DOS box). Note that most +entries in this FAQ only cover the command line client in detail - refer +to the TortoiseHg documentation for assistance with its graphical interface. -You have several options on Windows. One is to `download Mercurial`_ itself -which will give you a command-line version. Another option is to `download -TortoiseHg`_ which integrates with Windows Explorer. Note that this FAQ only -covers the command line client in detail - refer to the TortoiseHg -documentation for assistance with that tool. +If you have push rights, you need to configure Mercurial to work with +your SSH keys. For that, open your Mercurial configuration file +(you can do so by opening the TortoiseHg configuration dialog and then +clicking *"Edit File"*). In the ``[ui]`` section, add the following line:: -If you have checkin rights, you will also need an SSH client. -`Download PuTTY and friends`_ (PuTTYgen, Pageant, and Plink) for this. All -other questions in this FAQ will assume you are using these tools. + ssh = TortoisePlink.exe -ssh -2 -i C:\path\to\yourkey.ppk -Once you have both Mercurial and PuTTY installed you must tell Subversion -where to find an SSH client. Do this by editing -``%APPDATA%\Mercurial.ini`` to add the following entry (use the existing -``[ui]`` section if one is already present):: +where ``C:\path\to\yourkey.ppk`` should be replaced with the actual path +to your SSH private key. - [ui] - ssh="c:/path/to/putty/plink.exe" -T +.. note:: + If your private key is in OpenSSH format, you must first convert it to + PuTTY format by loading it into `PuTTYgen`_. -Change the path to be the proper one for your system. The ``-T`` -option prevents a pseudo-terminal from being created. - -You can use Pageant to prevent from having to type in your password for your -SSH 2 key constantly. If you prefer not to have another program running, -you need to create a profile in PuTTY. 
- -Go to Session:Saved Sessions and create a new profile named -``hg.python.org``. In Session:Host Name, enter ``hg.python.org``. In -SSH/Auth:Private key file select your private key. In Connection:Auto-login -username enter ``hg``. - -XXX: Does the following comment still apply to TortoiseHg? -With this set up, paths are slightly different than most other settings in that -the username is not required. Do take notice of this when choosing to check -out a project! .. _download TortoiseHg: http://tortoisehg.bitbucket.org/download/index.html -.. _PuTTY: http://www.chiark.greenend.org.uk/~sgtatham/putty/ -.. _download PuTTY and friends: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html What's a working copy? What's a repository? @@ -423,8 +403,8 @@ key to a file. Copy the section with the public key (using Alt-P) to a file; that file now has your public key. +.. _PuTTYgen: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html -.. _PuTTYgen: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html Is there a way to prevent from having to enter my password for my SSH 2 public key constantly? ------------------------------------------------------------------------------------------------ @@ -439,18 +419,23 @@ .. _KeyChain: http://www.gentoo.org/proj/en/keychain/ + +.. _pageant: + Windows ''''''''''''''''''' -Running Pageant_ will prevent you from having to type your password constantly. +The Pageant program is bundled with TortoiseHg. You can find it in its +installation directory (usually ``C:\Program Files (x86)\TortoiseHg\``); +you can also `download it separately +`_. + +Running Pageant will prevent you from having to type your password constantly. If you add a shortcut to Pageant to your Autostart group and edit the shortcut so that the command line includes an argument to your private key then Pageant will load the key every time you log in. -.. _Pageant: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html - - Can I make commits from machines other than the one I generated the keys on? ------------------------------------------------------------------------------ -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 22:55:27 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 22:55:27 +0100 Subject: [Python-checkins] devguide: Wording fixes Message-ID: antoine.pitrou pushed 59f424c61ee1 to devguide: http://hg.python.org/devguide/rev/59f424c61ee1 changeset: 262:59f424c61ee1 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 22:55:23 2011 +0100 summary: Wording fixes files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -197,8 +197,8 @@ **Be careful** though, as it might add some files that are not desired in the repository (such as build products, cache files, or other data). -You will then need to run ``hg commit`` (as discussed :ref:`below `) -to commit the file(s) to your local repository. +You will then need to run ``hg commit`` (as discussed below) to commit +the file(s) to your local repository. .. _hg-commit: @@ -206,7 +206,7 @@ How do I commit a change to a file? 
------------------------------------------------------------------------------- -To have any changes to a file (which include adding a new file or deleting +To commit any changes to a file (which includes adding a new file or deleting an existing one), you use the command:: hg commit [PATH] @@ -214,7 +214,7 @@ ``[PATH]`` is optional: if it is omitted, all changes in your working copy will be committed to the local repository. When you commit, be sure that all changes are desired by :ref:`reviewing them first `; -especially, when making commits that you intend to push to public repositories, +also, when making commits that you intend to push to public repositories, you should **not** commit together unrelated changes. To abort a commit that you are in the middle of, leave the message -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 23:10:47 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 23:10:47 +0100 Subject: [Python-checkins] devguide: Use new anchor Message-ID: antoine.pitrou pushed c2ac89bc9dfa to devguide: http://hg.python.org/devguide/rev/c2ac89bc9dfa changeset: 263:c2ac89bc9dfa branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 23:10:44 2011 +0100 summary: Use new anchor files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -302,7 +302,7 @@ This will output to stdout every line of the file along with which revision last modified that line. When you have the revision number, it is then -easy to :ref:`display it in detail `. +easy to :ref:`display it in detail `. .. _hg-log: @@ -322,6 +322,8 @@ hg log -vp [PATH] +.. _hg-log-rev: + If you want to view the differences for a specific revision, run:: hg log -vp -r -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 23:42:46 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 23:42:46 +0100 Subject: [Python-checkins] devguide: Remove entry about svn mkdir Message-ID: antoine.pitrou pushed c27a16180595 to devguide: http://hg.python.org/devguide/rev/c27a16180595 changeset: 264:c27a16180595 branch: hg_transition user: Antoine Pitrou date: Sun Feb 06 23:38:48 2011 +0100 summary: Remove entry about svn mkdir files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -361,23 +361,6 @@ hg update -How can I create a directory in the sandbox? ------------------------------------------------------------------------------- - -Assuming you have commit privileges and you do not already have a complete -checkout of the sandbox itself, the easiest way is to use svn's ``mkdir`` -command:: - - svn mkdir svn+ssh://pythondev at svn.python.org/sandbox/trunk/ - -That command will create the new directory on the server. To gain access to -the new directory you then checkout it out (substitute ``mkdir`` in the command -above with ``checkout``). - -If you already have a complete checkout of the sandbox then you can just use -``svn mkdir`` on a local directory name and check in the new directory itself. 
- - SSH ======= -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 6 23:42:47 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 06 Feb 2011 23:42:47 +0100 Subject: [Python-checkins] devguide: Replace "Subversion" with "Mercurial" Message-ID: antoine.pitrou pushed 8ec99aa1327d to devguide: http://hg.python.org/devguide/rev/8ec99aa1327d changeset: 265:8ec99aa1327d branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 06 23:40:48 2011 +0100 summary: Replace "Subversion" with "Mercurial" files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -10,7 +10,7 @@ ------------------------------------------------------------------------------- `Mercurial`_'s (also known as ``hg``) official web site is at -http://mercurial.selenic.com/. A book on Subversion published by +http://mercurial.selenic.com/. A book on Mercurial published by `O'Reilly Media`_, `Mercurial: The Definitive Guide`_, is available for free online. -- Repository URL: http://hg.python.org/devguide From ncoghlan at gmail.com Mon Feb 7 00:28:22 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 7 Feb 2011 09:28:22 +1000 Subject: [Python-checkins] r88359 - python/branches/py3k/Doc/whatsnew/3.2.rst In-Reply-To: <20110206200857.A3046EE98D@mail.python.org> References: <20110206200857.A3046EE98D@mail.python.org> Message-ID: On Mon, Feb 7, 2011 at 6:08 AM, raymond.hettinger wrote: > +In addition, the :func:`~dis.dis` function now accepts string arguments > +so that the common idiom ``dis(compile(s, '', 'eval'))`` can be shortened > +to ``dis(compile(s))``:: That should be ``dis(s)``. Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? Brisbane, Australia From nnorwitz at gmail.com Sun Feb 6 22:00:10 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Sun, 6 Feb 2011 16:00:10 -0500 Subject: [Python-checkins] Python Regression Test Failures doc (1) Message-ID: <20110206210010.GA11694@kbk-i386-bb.dyndns.org> rm -rf build/* rm -rf tools/sphinx rm -rf tools/pygments rm -rf tools/jinja2 rm -rf tools/docutils Checking out Sphinx... svn: PROPFIND request failed on '/projects/external/Sphinx-0.6.5/sphinx' svn: PROPFIND of '/projects/external/Sphinx-0.6.5/sphinx': could not connect to server (http://svn.python.org) make: *** [checkout] Error 1 From python-checkins at python.org Mon Feb 7 05:00:24 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 7 Feb 2011 05:00:24 +0100 (CET) Subject: [Python-checkins] r88361 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110207040024.3A641EE98B@mail.python.org> Author: raymond.hettinger Date: Mon Feb 7 05:00:24 2011 New Revision: 88361 Log: Typo. Doh! 
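For reference, the shortened idiom being documented below can be exercised directly on a 3.2 interpreter (an editor's sketch; the expression is arbitrary)::

    import dis
    # dis() accepts a source string directly and prints the
    # disassembly of the compiled expression.
    dis.dis('x % 2 == 1')
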
Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Mon Feb 7 05:00:24 2011 @@ -1886,7 +1886,7 @@ In addition, the :func:`~dis.dis` function now accepts string arguments so that the common idiom ``dis(compile(s, '', 'eval'))`` can be shortened -to ``dis(compile(s))``:: +to ``dis(s)``:: >>> dis('3*x+1 if x%2==1 else x//2') 1 0 LOAD_NAME 0 (x) From solipsis at pitrou.net Mon Feb 7 05:04:47 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 07 Feb 2011 05:04:47 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88360): sum=-322 Message-ID: py3k results for svn r88360 (hg cset 101f4a776711) -------------------------------------------------- test_pyexpat leaked [0, -56, -267] references, sum=-323 test_timeout leaked [0, 1, 0] references, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogLMVJE2', '-x'] From python-checkins at python.org Mon Feb 7 05:19:57 2011 From: python-checkins at python.org (eli.bendersky) Date: Mon, 7 Feb 2011 05:19:57 +0100 (CET) Subject: [Python-checkins] r88362 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110207041957.4D472EE981@mail.python.org> Author: eli.bendersky Date: Mon Feb 7 05:19:57 2011 New Revision: 88362 Log: Fix some typos and grammar [commit during RC with Brett's approval] Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Mon Feb 7 05:19:57 2011 @@ -22,7 +22,7 @@ =================== When a project makes the decision that it's time to support both Python 2 & 3, a decision needs to be made as to how to go about accomplishing that goal. -Which strategy goes with will depend on how large the project's existing +The chosen strategy will depend on how large the project's existing codebase is and how much divergence you want from your Python 2 codebase from your Python 3 one (e.g., starting a new version with Python 3). @@ -32,8 +32,8 @@ If your project has a pre-existing Python 2 codebase and you would like Python 3 support to start off a new branch or version of your project, then you will -most likely want to :ref:`port using 2to3 `. This will allow you port -your Python 2 code to Python 3 in a semi-automated fashion and begin to +most likely want to :ref:`port using 2to3 `. This will allow you to +port your Python 2 code to Python 3 in a semi-automated fashion and begin to maintain it separately from your Python 2 code. This approach can also work if your codebase is small and/or simple enough for the translation to occur quickly. @@ -103,13 +103,12 @@ approach more than another doesn't mean that some advice doesn't apply to other strategies. -Five, drop support for older Python versions if possible. While not a -requirement, `Python 2.5`_) introduced a lot of useful syntax and libraries -which have become idiomatic in Python 3. `Python 2.6`_ introduced future -statements which makes compatibility much easier if you are going from Python 2 -to 3. +Five, drop support for older Python versions if possible. 
`Python 2.5`_ +introduced a lot of useful syntax and libraries which have become idiomatic +in Python 3. `Python 2.6`_ introduced future statements which makes +compatibility much easier if you are going from Python 2 to 3. `Python 2.7`_ continues the trend in the stdlib. So choose the newest version -of Python for which you believe you believe can be your minimum support version +of Python which you believe can be your minimum support version and work from there. From python-checkins at python.org Mon Feb 7 05:44:19 2011 From: python-checkins at python.org (eli.bendersky) Date: Mon, 7 Feb 2011 05:44:19 +0100 (CET) Subject: [Python-checkins] r88363 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110207044419.B617DEE981@mail.python.org> Author: eli.bendersky Date: Mon Feb 7 05:44:19 2011 New Revision: 88363 Log: Fix some typos and grammar Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Mon Feb 7 05:44:19 2011 @@ -149,14 +149,14 @@ Python 2 and 2to3 ================= -Included with Python since 2.6, 2to3_ tool (and :mod:`lib2to3` module) helps -with porting Python 2 to Python 3 by performing various source translations. -This is a perfect solution for projects which wish to branch their Python 3 -code from their Python 2 codebase and maintain them as independent codebases. -You can even begin preparing to use this approach today by writing -future-compatible Python code which works cleanly in Python 2 in conjunction -with 2to3; all steps outlined below will work with Python 2 code up to the -point when the actual use of 2to3 occurs. +Included with Python since 2.6, the 2to3_ tool (and :mod:`lib2to3` module) +helps with porting Python 2 to Python 3 by performing various source +translations. This is a perfect solution for projects which wish to branch +their Python 3 code from their Python 2 codebase and maintain them as +independent codebases. You can even begin preparing to use this approach +today by writing future-compatible Python code which works cleanly in +Python 2 in conjunction with 2to3; all steps outlined below will work +with Python 2 code up to the point when the actual use of 2to3 occurs. Use of 2to3 as an on-demand translation step at install time is also possible, preventing the need to maintain a separate Python 3 codebase, but this approach @@ -468,11 +468,11 @@ >>> exc[1] # Python 2 only! 2 -But in Python 3, indexing directly off of an exception is an error. You need to -make sure to only index on :attr:`BaseException.args` attribute which is a +But in Python 3, indexing directly on an exception is an error. You need to +make sure to only index on the :attr:`BaseException.args` attribute which is a sequence containing all arguments passed to the :meth:`__init__` method. -Even better is to use documented attributes the exception provides. +Even better is to use the documented attributes the exception provides. 
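The portable pattern recommended above looks roughly like this (an editor's sketch, not part of the patch; the error values are arbitrary)::

    try:
        raise OSError(13, 'permission denied')
    except OSError as exc:          # works on Python 2.6+ and 3.x
        code = exc.args[0]          # index args, not the exception itself
        code = exc.errno            # better still: the documented attribute
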
Don't use ``__getslice__`` & Friends '''''''''''''''''''''''''''''''''''' From python-checkins at python.org Mon Feb 7 13:10:46 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 7 Feb 2011 13:10:46 +0100 (CET) Subject: [Python-checkins] r88364 - python/branches/py3k/Doc/library/string.rst Message-ID: <20110207121046.89BC1EE98F@mail.python.org> Author: georg.brandl Date: Mon Feb 7 13:10:46 2011 New Revision: 88364 Log: #11138: fix order of fill and align specifiers. Modified: python/branches/py3k/Doc/library/string.rst Modified: python/branches/py3k/Doc/library/string.rst ============================================================================== --- python/branches/py3k/Doc/library/string.rst (original) +++ python/branches/py3k/Doc/library/string.rst Mon Feb 7 13:10:46 2011 @@ -595,7 +595,7 @@ Nesting arguments and more complex examples:: >>> for align, text in zip('<^>', ['left', 'center', 'right']): - ... '{0:{align}{fill}16}'.format(text, fill=align, align=align) + ... '{0:{fill}{align}16}'.format(text, fill=align, align=align) ... 'left<<<<<<<<<<<<' '^^^^^center^^^^^' From python-checkins at python.org Mon Feb 7 13:13:59 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 7 Feb 2011 13:13:59 +0100 (CET) Subject: [Python-checkins] r88365 - in python/branches/py3k/Doc: ACKS.txt library/string.rst Message-ID: <20110207121359.0F98DEE984@mail.python.org> Author: georg.brandl Date: Mon Feb 7 13:13:58 2011 New Revision: 88365 Log: #8691: document that right alignment is default for numbers. Modified: python/branches/py3k/Doc/ACKS.txt python/branches/py3k/Doc/library/string.rst Modified: python/branches/py3k/Doc/ACKS.txt ============================================================================== --- python/branches/py3k/Doc/ACKS.txt (original) +++ python/branches/py3k/Doc/ACKS.txt Mon Feb 7 13:13:58 2011 @@ -130,6 +130,7 @@ * Andrew MacIntyre * Vladimir Marangozov * Vincent Marchetti + * Westley Mart?nez * Laura Matson * Daniel May * Rebecca McCreary Modified: python/branches/py3k/Doc/library/string.rst ============================================================================== --- python/branches/py3k/Doc/library/string.rst (original) +++ python/branches/py3k/Doc/library/string.rst Mon Feb 7 13:13:58 2011 @@ -310,10 +310,10 @@ | Option | Meaning | +=========+==========================================================+ | ``'<'`` | Forces the field to be left-aligned within the available | - | | space (this is the default). | + | | space (this is the default for most objects). | +---------+----------------------------------------------------------+ | ``'>'`` | Forces the field to be right-aligned within the | - | | available space. | + | | available space (this is the default for numbers). | +---------+----------------------------------------------------------+ | ``'='`` | Forces the padding to be placed after the sign (if any) | | | but before the digits. This is used for printing fields | From python-checkins at python.org Mon Feb 7 13:36:54 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 7 Feb 2011 13:36:54 +0100 (CET) Subject: [Python-checkins] r88366 - in python/branches/py3k: Lib/compileall.py Lib/test/test_compileall.py Misc/ACKS Misc/NEWS Message-ID: <20110207123654.B06F3EE982@mail.python.org> Author: georg.brandl Date: Mon Feb 7 13:36:54 2011 New Revision: 88366 Log: #11132: pass optimize parameter to recursive call in compileall.compile_dir(). 
Modified: python/branches/py3k/Lib/compileall.py python/branches/py3k/Lib/test/test_compileall.py python/branches/py3k/Misc/ACKS python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/compileall.py ============================================================================== --- python/branches/py3k/Lib/compileall.py (original) +++ python/branches/py3k/Lib/compileall.py Mon Feb 7 13:36:54 2011 @@ -58,7 +58,7 @@ elif (maxlevels > 0 and name != os.curdir and name != os.pardir and os.path.isdir(fullname) and not os.path.islink(fullname)): if not compile_dir(fullname, maxlevels - 1, dfile, force, rx, - quiet, legacy): + quiet, legacy, optimize): success = 0 return success Modified: python/branches/py3k/Lib/test/test_compileall.py ============================================================================== --- python/branches/py3k/Lib/test/test_compileall.py (original) +++ python/branches/py3k/Lib/test/test_compileall.py Mon Feb 7 13:36:54 2011 @@ -24,6 +24,10 @@ self.source_path2 = os.path.join(self.directory, '_test2.py') self.bc_path2 = imp.cache_from_source(self.source_path2) shutil.copyfile(self.source_path, self.source_path2) + self.subdirectory = os.path.join(self.directory, '_subdir') + os.mkdir(self.subdirectory) + self.source_path3 = os.path.join(self.subdirectory, '_test3.py') + shutil.copyfile(self.source_path, self.source_path3) def tearDown(self): shutil.rmtree(self.directory) @@ -96,6 +100,12 @@ cached = imp.cache_from_source(self.source_path, debug_override=not optimize) self.assertTrue(os.path.isfile(cached)) + cached2 = imp.cache_from_source(self.source_path2, + debug_override=not optimize) + self.assertTrue(os.path.isfile(cached2)) + cached3 = imp.cache_from_source(self.source_path3, + debug_override=not optimize) + self.assertTrue(os.path.isfile(cached3)) class EncodingTest(unittest.TestCase): Modified: python/branches/py3k/Misc/ACKS ============================================================================== --- python/branches/py3k/Misc/ACKS (original) +++ python/branches/py3k/Misc/ACKS Mon Feb 7 13:36:54 2011 @@ -843,6 +843,7 @@ Robin Thomas Jeremy Thurgood Eric Tiedemann +July Tikhonov Tracy Tims Oren Tirosh Jason Tishler Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 7 13:36:54 2011 @@ -18,6 +18,9 @@ Library ------- +- Issue #11132: Fix passing of "optimize" parameter when recursing + in compileall.compile_dir(). + - Issue #11110: Fix a potential decref of a NULL in sqlite3. - Issue #8275: Fix passing of callback arguments with ctypes under Win64. From python-checkins at python.org Mon Feb 7 13:37:35 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 7 Feb 2011 13:37:35 +0100 (CET) Subject: [Python-checkins] r88366 - svn:log Message-ID: <20110207123735.9EF2AEE982@mail.python.org> Author: georg.brandl Revision: 88366 Property Name: svn:log Action: modified Property diff: --- old property value +++ new property value @@ -1 +1 @@ -#11132: pass optimize parameter to recursive call in compileall.compile_dir(). \ No newline at end of file +#11132: pass optimize parameter to recursive call in compileall.compile_dir(). Reviewed by Eric A. 
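For context on r88366 above: a minimal sketch of the call the fix affects (an editor's illustration; the directory name is arbitrary)::

    import compileall
    # Before the fix, only the top-level directory honoured optimize=1;
    # the recursive calls byte-compiled subdirectories at the default level.
    compileall.compile_dir('Lib/', maxlevels=10, quiet=True, optimize=1)
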
From python-checkins at python.org Mon Feb 7 13:51:05 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 7 Feb 2011 13:51:05 +0100 (CET) Subject: [Python-checkins] r88367 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110207125105.73CC2EE982@mail.python.org> Author: raymond.hettinger Date: Mon Feb 7 13:51:05 2011 New Revision: 88367 Log: Add link to table for Tcl/Tk issues on the Mac. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Mon Feb 7 13:51:05 2011 @@ -2511,11 +2511,12 @@ There were a number of other small changes to the C-API. See the :source:`Misc/NEWS` file for a complete list. -Also, there were a number of updates to the OS X build, see +Also, there were a number of updates to the Mac OS X build, see :source:`Mac/BuildScript/README.txt` for details. For users running a 32/64-bit -build, there is a known problem with the default Tcl/Tk on OS X 10.6. +build, there is a known problem with the default Tcl/Tk on Mac OS X 10.6. Accordingly, we recommend installing an updated alternative such as -`ActiveState Tcl/Tk 8.5 `_ . +`ActiveState Tcl/Tk 8.5.9 `_\. +See http://www.python.org/download/mac/tcltk/ for additional details. Porting to Python 3.2 ===================== From python-checkins at python.org Mon Feb 7 14:43:07 2011 From: python-checkins at python.org (nick.coghlan) Date: Mon, 7 Feb 2011 14:43:07 +0100 (CET) Subject: [Python-checkins] r88368 - in python/branches/py3k: Lib/test/test_zipimport_support.py Misc/NEWS Message-ID: <20110207134307.93AACEE98D@mail.python.org> Author: nick.coghlan Date: Mon Feb 7 14:43:07 2011 New Revision: 88368 Log: Issue 10971: Make test_zipimport_support once again compatible with refleak hunting Modified: python/branches/py3k/Lib/test/test_zipimport_support.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/test/test_zipimport_support.py ============================================================================== --- python/branches/py3k/Lib/test/test_zipimport_support.py (original) +++ python/branches/py3k/Lib/test/test_zipimport_support.py Mon Feb 7 14:43:07 2011 @@ -93,7 +93,10 @@ os.remove(init_name) sys.path.insert(0, zip_name) import zip_pkg - self.assertEqual(inspect.getsource(zip_pkg.foo), test_src) + try: + self.assertEqual(inspect.getsource(zip_pkg.foo), test_src) + finally: + del sys.modules["zip_pkg"] def test_doctest_issue4197(self): # To avoid having to keep two copies of the doctest module's @@ -128,53 +131,56 @@ os.remove(script_name) sys.path.insert(0, zip_name) import test_zipped_doctest - # Some of the doc tests depend on the colocated text files - # which aren't available to the zipped version (the doctest - # module currently requires real filenames for non-embedded - # tests). So we're forced to be selective about which tests - # to run. - # doctest could really use some APIs which take a text - # string or a file object instead of a filename... 
- known_good_tests = [ - test_zipped_doctest.SampleClass, - test_zipped_doctest.SampleClass.NestedClass, - test_zipped_doctest.SampleClass.NestedClass.__init__, - test_zipped_doctest.SampleClass.__init__, - test_zipped_doctest.SampleClass.a_classmethod, - test_zipped_doctest.SampleClass.a_property, - test_zipped_doctest.SampleClass.a_staticmethod, - test_zipped_doctest.SampleClass.double, - test_zipped_doctest.SampleClass.get, - test_zipped_doctest.SampleNewStyleClass, - test_zipped_doctest.SampleNewStyleClass.__init__, - test_zipped_doctest.SampleNewStyleClass.double, - test_zipped_doctest.SampleNewStyleClass.get, - test_zipped_doctest.sample_func, - test_zipped_doctest.test_DocTest, - test_zipped_doctest.test_DocTestParser, - test_zipped_doctest.test_DocTestRunner.basics, - test_zipped_doctest.test_DocTestRunner.exceptions, - test_zipped_doctest.test_DocTestRunner.option_directives, - test_zipped_doctest.test_DocTestRunner.optionflags, - test_zipped_doctest.test_DocTestRunner.verbose_flag, - test_zipped_doctest.test_Example, - test_zipped_doctest.test_debug, - test_zipped_doctest.test_pdb_set_trace, - test_zipped_doctest.test_pdb_set_trace_nested, - test_zipped_doctest.test_testsource, - test_zipped_doctest.test_trailing_space_in_test, - test_zipped_doctest.test_DocTestSuite, - test_zipped_doctest.test_DocTestFinder, - ] - # These remaining tests are the ones which need access - # to the data files, so we don't run them - fail_due_to_missing_data_files = [ - test_zipped_doctest.test_DocFileSuite, - test_zipped_doctest.test_testfile, - test_zipped_doctest.test_unittest_reportflags, - ] - for obj in known_good_tests: - _run_object_doctest(obj, test_zipped_doctest) + try: + # Some of the doc tests depend on the colocated text files + # which aren't available to the zipped version (the doctest + # module currently requires real filenames for non-embedded + # tests). So we're forced to be selective about which tests + # to run. + # doctest could really use some APIs which take a text + # string or a file object instead of a filename... 
+ known_good_tests = [ + test_zipped_doctest.SampleClass, + test_zipped_doctest.SampleClass.NestedClass, + test_zipped_doctest.SampleClass.NestedClass.__init__, + test_zipped_doctest.SampleClass.__init__, + test_zipped_doctest.SampleClass.a_classmethod, + test_zipped_doctest.SampleClass.a_property, + test_zipped_doctest.SampleClass.a_staticmethod, + test_zipped_doctest.SampleClass.double, + test_zipped_doctest.SampleClass.get, + test_zipped_doctest.SampleNewStyleClass, + test_zipped_doctest.SampleNewStyleClass.__init__, + test_zipped_doctest.SampleNewStyleClass.double, + test_zipped_doctest.SampleNewStyleClass.get, + test_zipped_doctest.sample_func, + test_zipped_doctest.test_DocTest, + test_zipped_doctest.test_DocTestParser, + test_zipped_doctest.test_DocTestRunner.basics, + test_zipped_doctest.test_DocTestRunner.exceptions, + test_zipped_doctest.test_DocTestRunner.option_directives, + test_zipped_doctest.test_DocTestRunner.optionflags, + test_zipped_doctest.test_DocTestRunner.verbose_flag, + test_zipped_doctest.test_Example, + test_zipped_doctest.test_debug, + test_zipped_doctest.test_pdb_set_trace, + test_zipped_doctest.test_pdb_set_trace_nested, + test_zipped_doctest.test_testsource, + test_zipped_doctest.test_trailing_space_in_test, + test_zipped_doctest.test_DocTestSuite, + test_zipped_doctest.test_DocTestFinder, + ] + # These remaining tests are the ones which need access + # to the data files, so we don't run them + fail_due_to_missing_data_files = [ + test_zipped_doctest.test_DocFileSuite, + test_zipped_doctest.test_testfile, + test_zipped_doctest.test_unittest_reportflags, + ] + for obj in known_good_tests: + _run_object_doctest(obj, test_zipped_doctest) + finally: + del sys.modules["test_zipped_doctest"] def test_doctest_main_issue4197(self): test_src = textwrap.dedent("""\ Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 7 14:43:07 2011 @@ -31,6 +31,12 @@ - Issue #11121: Fix building with --enable-shared. +Tests +----- + +- Issue #10971:test_zipimport_support is once again compatible with the + refleak hunter feature of test.regrtest. + What's New in Python 3.2 Release Candidate 2? ============================================= From python-checkins at python.org Mon Feb 7 15:53:14 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 15:53:14 +0100 Subject: [Python-checkins] devguide: Fix instructions for hg backout Message-ID: antoine.pitrou pushed d9ed9a095c86 to devguide: http://hg.python.org/devguide/rev/d9ed9a095c86 changeset: 266:d9ed9a095c86 branch: hg_transition tag: tip user: Antoine Pitrou date: Mon Feb 07 15:53:11 2011 +0100 summary: Fix instructions for hg backout files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -339,14 +339,13 @@ hg backout -This will commit a *new* revision reverting the exact changes made in -````. However, if other changes have been made since then, -you will have to merge them with that new revision. For that, run:: +This will modify your working copy so that all changes in ```` +(including added or deleted files) are undone. You then need to :ref:`commit +` these changes so that the backout gets permanently recorded. - hg merge - hg commit - -.. XXX: "hg backout --merge" doesn't seem to work +.. note:: + These instructions are for Mercurial 1.7 and higher. ``hg backout`` has + a slightly different behaviour in versions before 1.7. 
How do I update to a specific release tag? -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 15:56:20 2011 From: python-checkins at python.org (nick.coghlan) Date: Mon, 7 Feb 2011 15:56:20 +0100 (CET) Subject: [Python-checkins] r88368 - svn:log Message-ID: <20110207145620.971E7EE9DD@mail.python.org> Author: nick.coghlan Revision: 88368 Property Name: svn:log Action: modified Property diff: --- old property value +++ new property value @@ -1 +1 @@ -Issue 10971: Make test_zipimport_support once again compatible with refleak hunting \ No newline at end of file +Issue 10971: Make test_zipimport_support once again compatible with refleak hunting (reviewed by Georg Brandl) From python-checkins at python.org Mon Feb 7 16:30:45 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 7 Feb 2011 16:30:45 +0100 (CET) Subject: [Python-checkins] r88369 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110207153045.CAF65EE98B@mail.python.org> Author: georg.brandl Date: Mon Feb 7 16:30:45 2011 New Revision: 88369 Log: Consistent heading spacing, and fix two typos. Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Mon Feb 7 16:30:45 2011 @@ -20,6 +20,7 @@ Choosing a Strategy =================== + When a project makes the decision that it's time to support both Python 2 & 3, a decision needs to be made as to how to go about accomplishing that goal. The chosen strategy will depend on how large the project's existing @@ -54,6 +55,7 @@ Universal Bits of Advice ------------------------ + Regardless of what strategy you pick, there are a few things you should consider. @@ -128,6 +130,7 @@ Python 3 and 3to2 ================= + If you are starting a new project or your codebase is small enough, you may want to consider writing your code for Python 3 and backporting to Python 2 using 3to2_. Thanks to Python 3 being more strict about things than Python 2 @@ -149,6 +152,7 @@ Python 2 and 2to3 ================= + Included with Python since 2.6, the 2to3_ tool (and :mod:`lib2to3` module) helps with porting Python 2 to Python 3 by performing various source translations. This is a perfect solution for projects which wish to branch @@ -176,6 +180,7 @@ Support Python 2.7 ------------------ + As a first step, make sure that your project is compatible with `Python 2.7`_. This is just good to do as Python 2.7 is the last release of Python 2 and thus will be used for a rather long time. It also allows for use of the ``-3`` flag @@ -184,6 +189,7 @@ Try to Support `Python 2.6`_ and Newer Only ------------------------------------------- + While not possible for all projects, if you can support `Python 2.6`_ and newer **only**, your life will be much easier. Various future statements, stdlib additions, etc. exist only in Python 2.6 and later which greatly assist in @@ -199,6 +205,7 @@ ``from __future__ import print_function`` ''''''''''''''''''''''''''''''''''''''''' + This is a personal choice. 2to3 handles the translation from the print statement to the print function rather well so this is an optional step. This future statement does help, though, with getting used to typing @@ -207,6 +214,7 @@ ``from __future__ import unicode_literals`` ''''''''''''''''''''''''''''''''''''''''''' + Another personal choice. 
You can always mark what you want to be a (unicode) string with a ``u`` prefix to get the same effect. But regardless of whether you use this future statement or not, you **must** make sure you know exactly @@ -217,6 +225,7 @@ Bytes literals '''''''''''''' + This is a **very** important one. The ability to prefix Python 2 strings that are meant to contain bytes with a ``b`` prefix help to very clearly delineate what is and is not a Python 3 string. When you run 2to3 on code, all Python 2 @@ -238,12 +247,14 @@ Supporting `Python 2.5`_ and Newer Only --------------------------------------- + If you are supporting `Python 2.5`_ and newer there are still some features of Python that you can utilize. ``from __future__ import absolute_imports`` ''''''''''''''''''''''''''''''''''''''''''' + Implicit relative imports (e.g., importing ``spam.bacon`` from within ``spam.eggs`` with the statement ``import bacon``) does not work in Python 3. This future statement moves away from that and allows the use of explicit @@ -261,6 +272,7 @@ Handle Common "Gotchas" ----------------------- + There are a few things that just consistently come up as sticking points for people which 2to3 cannot handle automatically or can easily be done in Python 2 to help modernize your code. @@ -268,6 +280,7 @@ ``from __future__ import division`` ''''''''''''''''''''''''''''''''''' + While the exact same outcome can be had by using the ``-Qnew`` argument to Python, using this future statement lifts the requirement that your users use the flag to get the expected behavior of division in Python 3 @@ -305,6 +318,7 @@ Subclass ``object`` ''''''''''''''''''' + New-style classes have been around since `Python 2.2`_. You need to make sure you are subclassing from ``object`` to avoid odd edge cases involving method resolution order, etc. This continues to be totally valid in Python 3 (although @@ -313,6 +327,7 @@ Deal With the Bytes/String Dichotomy '''''''''''''''''''''''''''''''''''' + One of the biggest issues people have when porting code to Python 3 is handling the bytes/string dichotomy. Because Python 2 allowed the ``str`` type to hold textual data, people have over the years been rather loose in their delineation @@ -340,6 +355,7 @@ Decide what APIs Will Accept **************************** + In Python 2 it was very easy to accidentally create an API that accepted both bytes and textual data. But in Python 3, thanks to the more strict handling of disparate types, this loose usage of bytes and text together tends to fail. @@ -417,9 +433,10 @@ ``__str__()``/``__unicode__()`` ''''''''''''''''''''''''''''''' + In Python 2, objects can specify both a string and unicode representation of themselves. In Python 3, though, there is only a string representation. This -becomes an issue as people can inadvertantly do things in their ``__str__()`` +becomes an issue as people can inadvertently do things in their ``__str__()`` methods which have unpredictable results (e.g., infinite recursion if you happen to use the ``unicode(self).encode('utf8')`` idiom as the body of your ``__str__()`` method). @@ -484,6 +501,7 @@ Updating doctests ''''''''''''''''' + 2to3_ will attempt to generate fixes for doctests that it comes across. It's not perfect, though. If you wrote a monolithic set of doctests (e.g., a single docstring containing all of your doctests), you should at least consider @@ -494,6 +512,7 @@ Eliminate ``-3`` Warnings ------------------------- + When you run your application's test suite, run it using the ``-3`` flag passed to Python. 
This will cause various warnings to be raised during execution about things that 2to3 cannot handle automatically (e.g., modules that have been @@ -503,12 +522,14 @@ Run 2to3 -------- + Once you have made your Python 2 code future-compatible with Python 3, it's time to use 2to3_ to actually port your code. Manually '''''''' + To manually convert source code using 2to3_, you use the ``2to3`` script that is installed with Python 2.6 and later.:: @@ -526,6 +547,7 @@ During Installation ''''''''''''''''''' + When a user installs your project for Python 3, you can have either :mod:`distutils` or Distribute_ run 2to3_ on your behalf. For distutils, use the following idiom:: @@ -552,6 +574,7 @@ Verify & Test ------------- + At this point you should (hopefully) have your project converted in such a way that it works in Python 3. Verify it by running your unit tests and making sure nothing has gone awry. If you miss something then figure out how to fix it in @@ -567,6 +590,7 @@ Python 2/3 Compatible Source ============================ + While it may seem counter-intuitive, you can write Python code which is source-compatible between Python 2 & 3. It does lead to code that is not entirely idiomatic Python (e.g., having to extract the currently raised @@ -589,6 +613,7 @@ Follow The Steps for Using 2to3_ (sans 2to3) -------------------------------------------- + All of the steps outlined in how to :ref:`port Python 2 code with 2to3 ` apply to creating a Python 2/3 codebase. This includes trying only support Python 2.6 @@ -602,6 +627,7 @@ Use six_ -------- + The six_ project contains many things to help you write portable Python code. You should make sure to read its documentation from beginning to end and use any and all features it provides. That way you will minimize any mistakes you @@ -610,6 +636,7 @@ Capturing the Currently Raised Exception ---------------------------------------- + One change between Python 2 and 3 that will require changing how you code (if you support `Python 2.5`_ and earlier) is accessing the currently raised exception. In Python 2.5 and earlier the syntax @@ -658,9 +685,11 @@ (e.g. the third element of the tuple returned by :func:`sys.exc_info`) in a variable. + Other Resources =============== -The authors of the following blogs posts and wiki pages deserve special thanks + +The authors of the following blog posts and wiki pages deserve special thanks for making public their tips for porting Python 2 code to Python 3 (and thus helping provide information for this document): From python-checkins at python.org Mon Feb 7 16:44:27 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 7 Feb 2011 16:44:27 +0100 (CET) Subject: [Python-checkins] r88370 - python/branches/py3k/Doc/howto/logging-cookbook.rst Message-ID: <20110207154427.816F4EE98D@mail.python.org> Author: georg.brandl Date: Mon Feb 7 16:44:27 2011 New Revision: 88370 Log: Spelling fixes. Modified: python/branches/py3k/Doc/howto/logging-cookbook.rst Modified: python/branches/py3k/Doc/howto/logging-cookbook.rst ============================================================================== --- python/branches/py3k/Doc/howto/logging-cookbook.rst (original) +++ python/branches/py3k/Doc/howto/logging-cookbook.rst Mon Feb 7 16:44:27 2011 @@ -697,7 +697,7 @@ a separate listener process listens for events sent by other processes and logs them according to its own logging configuration. 
Although the example only demonstrates one way of doing it (for example, you may want to use a listener -thread rather than a separate listener process - the implementation would be +thread rather than a separate listener process -- the implementation would be analogous) it does allow for completely different logging configurations for the listener and the other processes in your application, and can be used as the basis for code meeting your own specific requirements:: @@ -719,7 +719,7 @@ # # In practice, you can configure the listener however you want, but note that in this # simple example, the listener does not apply level or filter logic to received records. - # In practice, you would probably want to do ths logic in the worker processes, to avoid + # In practice, you would probably want to do this logic in the worker processes, to avoid # sending events which would be filtered out between processes. # # The size of the rotated files is made small so you can see the results easily. @@ -918,7 +918,7 @@ Sometimes you want to let a log file grow to a certain size, then open a new file and log to that. You may want to keep a certain number of these files, and when that many files have been created, rotate the files so that the number of -files and the size of the files both remin bounded. For this usage pattern, the +files and the size of the files both remain bounded. For this usage pattern, the logging package provides a :class:`RotatingFileHandler`:: import glob From python-checkins at python.org Mon Feb 7 16:58:12 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 7 Feb 2011 16:58:12 +0100 (CET) Subject: [Python-checkins] r88371 - python/branches/py3k/Doc/library/imaplib.rst Message-ID: <20110207155812.0F8C5EE9EE@mail.python.org> Author: antoine.pitrou Date: Mon Feb 7 16:58:11 2011 New Revision: 88371 Log: Clarify that IMAP4() implicitly calls open(), and that logout() implicitly calls shutdown(). Modified: python/branches/py3k/Doc/library/imaplib.rst Modified: python/branches/py3k/Doc/library/imaplib.rst ============================================================================== --- python/branches/py3k/Doc/library/imaplib.rst (original) +++ python/branches/py3k/Doc/library/imaplib.rst Mon Feb 7 16:58:11 2011 @@ -298,9 +298,10 @@ .. method:: IMAP4.open(host, port) - Opens socket to *port* at *host*. The connection objects established by this + Opens socket to *port* at *host*. This method is implicitly called by + the :class:`IMAP4` constructor. The connection objects established by this method will be used in the ``read``, ``readline``, ``send``, and ``shutdown`` - methods. You may override this method. + methods. You may override this method. .. method:: IMAP4.partial(message_num, message_part, start, length) @@ -390,7 +391,8 @@ .. method:: IMAP4.shutdown() - Close connection established in ``open``. You may override this method. + Close connection established in ``open``. This method is implicitly + called by :meth:`IMAP4.logout`. You may override this method. .. 
method:: IMAP4.socket() From python-checkins at python.org Mon Feb 7 17:03:24 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 7 Feb 2011 17:03:24 +0100 (CET) Subject: [Python-checkins] r88372 - in python/branches/release31-maint: Doc/library/imaplib.rst Message-ID: <20110207160324.42C7FEE981@mail.python.org> Author: antoine.pitrou Date: Mon Feb 7 17:03:24 2011 New Revision: 88372 Log: Merged revisions 88371 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88371 | antoine.pitrou | 2011-02-07 16:58:11 +0100 (lun., 07 f?vr. 2011) | 3 lines Clarify that IMAP4() implicitly calls open(), and that logout() implicitly calls shutdown(). ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/library/imaplib.rst Modified: python/branches/release31-maint/Doc/library/imaplib.rst ============================================================================== --- python/branches/release31-maint/Doc/library/imaplib.rst (original) +++ python/branches/release31-maint/Doc/library/imaplib.rst Mon Feb 7 17:03:24 2011 @@ -288,9 +288,10 @@ .. method:: IMAP4.open(host, port) - Opens socket to *port* at *host*. The connection objects established by this + Opens socket to *port* at *host*. This method is implicitly called by + the :class:`IMAP4` constructor. The connection objects established by this method will be used in the ``read``, ``readline``, ``send``, and ``shutdown`` - methods. You may override this method. + methods. You may override this method. .. method:: IMAP4.partial(message_num, message_part, start, length) @@ -380,7 +381,8 @@ .. method:: IMAP4.shutdown() - Close connection established in ``open``. You may override this method. + Close connection established in ``open``. This method is implicitly + called by :meth:`IMAP4.logout`. You may override this method. .. method:: IMAP4.socket() From python-checkins at python.org Mon Feb 7 17:03:48 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 7 Feb 2011 17:03:48 +0100 (CET) Subject: [Python-checkins] r88373 - in python/branches/release27-maint: Doc/library/imaplib.rst Message-ID: <20110207160348.0029CEE98D@mail.python.org> Author: antoine.pitrou Date: Mon Feb 7 17:03:47 2011 New Revision: 88373 Log: Merged revisions 88371 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88371 | antoine.pitrou | 2011-02-07 16:58:11 +0100 (lun., 07 f?vr. 2011) | 3 lines Clarify that IMAP4() implicitly calls open(), and that logout() implicitly calls shutdown(). ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Doc/library/imaplib.rst Modified: python/branches/release27-maint/Doc/library/imaplib.rst ============================================================================== --- python/branches/release27-maint/Doc/library/imaplib.rst (original) +++ python/branches/release27-maint/Doc/library/imaplib.rst Mon Feb 7 17:03:47 2011 @@ -307,9 +307,10 @@ .. method:: IMAP4.open(host, port) - Opens socket to *port* at *host*. The connection objects established by this + Opens socket to *port* at *host*. This method is implicitly called by + the :class:`IMAP4` constructor. The connection objects established by this method will be used in the ``read``, ``readline``, ``send``, and ``shutdown`` - methods. You may override this method. + methods. You may override this method. .. method:: IMAP4.partial(message_num, message_part, start, length) @@ -405,7 +406,8 @@ .. 
method:: IMAP4.shutdown() - Close connection established in ``open``. You may override this method. + Close connection established in ``open``. This method is implicitly + called by :meth:`IMAP4.logout`. You may override this method. .. method:: IMAP4.socket() From python-checkins at python.org Mon Feb 7 17:44:19 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 7 Feb 2011 17:44:19 +0100 (CET) Subject: [Python-checkins] r88374 - in python/branches/py3k: Mac/BuildScript/resources/ReadMe.txt Mac/Extras.ReadMe.txt Mac/Makefile.in Misc/NEWS Message-ID: <20110207164419.BAC17EE984@mail.python.org> Author: ned.deily Date: Mon Feb 7 17:44:19 2011 New Revision: 88374 Log: Issue #11079: The /Applications/Python x.x folder created by the Mac OS X installers no longer includes an Extras directory. The Tools directory is now installed in the framework under share/doc. Removed: python/branches/py3k/Mac/Extras.ReadMe.txt Modified: python/branches/py3k/Mac/BuildScript/resources/ReadMe.txt python/branches/py3k/Mac/Makefile.in python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Mac/BuildScript/resources/ReadMe.txt ============================================================================== --- python/branches/py3k/Mac/BuildScript/resources/ReadMe.txt (original) +++ python/branches/py3k/Mac/BuildScript/resources/ReadMe.txt Mon Feb 7 17:44:19 2011 @@ -24,7 +24,7 @@ ******************* The installer puts applications, an "Update Shell Profile" command, -and an Extras folder containing demo programs and tools into the +and a link to the optionally installed Python Documentation into the "Python $VERSION" subfolder of the system Applications folder, and puts the underlying machinery into the folder $PYTHONFRAMEWORKINSTALLDIR. It can Deleted: python/branches/py3k/Mac/Extras.ReadMe.txt ============================================================================== --- python/branches/py3k/Mac/Extras.ReadMe.txt Mon Feb 7 17:44:19 2011 +++ (empty file) @@ -1,5 +0,0 @@ -This folder contains examples of Python usage and useful scripts and tools. - -You should be aware that these are not Macintosh-specific but are shared -among Python on all platforms, so there are some that only run on Windows -or Unix or another platform, and/or make little sense on a Macintosh. 
Modified: python/branches/py3k/Mac/Makefile.in ============================================================================== --- python/branches/py3k/Mac/Makefile.in (original) +++ python/branches/py3k/Mac/Makefile.in Mon Feb 7 17:44:19 2011 @@ -177,11 +177,11 @@ $(INSTALLED_PYTHONAPP): install_Python -installextras: $(srcdir)/Extras.ReadMe.txt $(srcdir)/Extras.install.py - $(INSTALL) -d "$(DESTDIR)$(PYTHONAPPSDIR)/Extras" - $(INSTALL) $(srcdir)/Extras.ReadMe.txt "$(DESTDIR)$(PYTHONAPPSDIR)/Extras/ReadMe.txt" +installextras: $(srcdir)/Extras.install.py + $(INSTALL) -d "$(DESTDIR)$(prefix)/share/doc/python$(VERSION)/examples" $(RUNSHARED) $(BUILDPYTHON) $(srcdir)/Extras.install.py $(srcdir)/../Tools \ - "$(DESTDIR)$(PYTHONAPPSDIR)/Extras/Tools" + "$(DESTDIR)$(prefix)/share/doc/python$(VERSION)/examples/Tools" ; \ + chmod -R ugo+rX,go-w "$(DESTDIR)$(prefix)/share/doc/python$(VERSION)/examples/Tools" checkapplepython: $(srcdir)/Tools/fixapplepython23.py Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 7 17:44:19 2011 @@ -29,6 +29,11 @@ Build ----- +- Issue #11079: The /Applications/Python x.x folder created by the Mac + OS X installers now includes a link to the installed documentation + and no longer includes an Extras directory. The Tools directory is + now installed in the framework under share/doc. + - Issue #11121: Fix building with --enable-shared. Tests From python-checkins at python.org Mon Feb 7 17:48:28 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 7 Feb 2011 17:48:28 +0100 (CET) Subject: [Python-checkins] r88375 - python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation Message-ID: <20110207164828.9D760EE984@mail.python.org> Author: ned.deily Date: Mon Feb 7 17:48:28 2011 New Revision: 88375 Log: - Issue #11079: The /Applications/Python x.x folder created by the Mac OS X installers now includes a link to the installed documentation plus another in the framework share/doc directory. Modified: python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation Modified: python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation ============================================================================== --- python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation (original) +++ python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation Mon Feb 7 17:48:28 2011 @@ -1,11 +1,32 @@ #!/bin/sh PYVER="@PYVER@" +FWK="/Library/Frameworks/Python.framework/Versions/${PYVER}" +FWK_DOCDIR_SUBPATH="Resources/English.lproj/Documentation" +FWK_DOCDIR="${FWK}/${FWK_DOCDIR_SUBPATH}" +APPDIR="/Applications/Python ${PYVER}" +DEV_DOCDIR="/Developer/Documentation" +SHARE_DIR="${FWK}/share" +SHARE_DOCDIR="${SHARE_DIR}/doc/python${PYVER}" +SHARE_DOCDIR_TO_FWK="../../.." -if [ -d /Developer/Documentation ]; then - if [ ! -d /Developer/Documentation/Python ]; then - mkdir -p /Developer/Documentation/Python - fi +# make link in /Developer/Documentation/ for Xcode users +if [ -d "${DEV_DOCDIR}" ]; then + if [ ! 
-d "${DEV_DOCDIR}/Python" ]; then + mkdir -p "${DEV_DOCDIR}/Python" + fi + ln -fhs "${FWK_DOCDIR}" "${DEV_DOCDIR}/Python/Reference Documentation ${PYVER}" +fi + +# make link in /Applications/Python m.n/ for Finder users +if [ -d "${APPDIR}" ]; then + ln -fhs "${FWK_DOCDIR}/index.html" "${APPDIR}/Python Documentation.html" +fi - ln -fhs /Library/Frameworks/Python.framework/Versions/${PYVER}/Resources/English.lproj/Documentation "/Developer/Documentation/Python/Reference Documentation @PYVER@" +# make share/doc link in framework for command line users +if [ -d "${SHARE_DIR}" ]; then + mkdir -p "${SHARE_DOCDIR}" + # make relative link to html doc directory + ln -s "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" fi + From python-checkins at python.org Mon Feb 7 17:52:25 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 7 Feb 2011 17:52:25 +0100 (CET) Subject: [Python-checkins] r88376 - in python/branches/release27-maint: Mac/BuildScript/scripts/postflight.documentation Misc/NEWS Message-ID: <20110207165225.8E0F7EE9FB@mail.python.org> Author: ned.deily Date: Mon Feb 7 17:52:25 2011 New Revision: 88376 Log: Issue #11079: The /Applications/Python x.x folder created by the Mac OS X installers now includes a link to the installed documentation. Modified: python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation python/branches/release27-maint/Misc/NEWS Modified: python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation ============================================================================== --- python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation (original) +++ python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation Mon Feb 7 17:52:25 2011 @@ -1,11 +1,32 @@ #!/bin/sh PYVER="@PYVER@" +FWK="/Library/Frameworks/Python.framework/Versions/${PYVER}" +FWK_DOCDIR_SUBPATH="Resources/English.lproj/Documentation" +FWK_DOCDIR="${FWK}/${FWK_DOCDIR_SUBPATH}" +APPDIR="/Applications/Python ${PYVER}" +DEV_DOCDIR="/Developer/Documentation" +SHARE_DIR="${FWK}/share" +SHARE_DOCDIR="${SHARE_DIR}/doc/python${PYVER}" +SHARE_DOCDIR_TO_FWK="../../.." -if [ -d /Developer/Documentation ]; then - if [ ! -d /Developer/Documentation/Python ]; then - mkdir -p /Developer/Documentation/Python - fi +# make link in /Developer/Documentation/ for Xcode users +if [ -d "${DEV_DOCDIR}" ]; then + if [ ! 
-d "${DEV_DOCDIR}/Python" ]; then + mkdir -p "${DEV_DOCDIR}/Python" + fi + ln -fhs "${FWK_DOCDIR}" "${DEV_DOCDIR}/Python/Reference Documentation ${PYVER}" +fi + +# make link in /Applications/Python m.n/ for Finder users +if [ -d "${APPDIR}" ]; then + ln -fhs "${FWK_DOCDIR}/index.html" "${APPDIR}/Python Documentation.html" +fi - ln -fhs /Library/Frameworks/Python.framework/Versions/${PYVER}/Resources/English.lproj/Documentation "/Developer/Documentation/Python/Reference Documentation @PYVER@" +# make share/doc link in framework for command line users +if [ -d "${SHARE_DIR}" ]; then + mkdir -p "${SHARE_DOCDIR}" + # make relative link to html doc directory + ln -s "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" fi + Modified: python/branches/release27-maint/Misc/NEWS ============================================================================== --- python/branches/release27-maint/Misc/NEWS (original) +++ python/branches/release27-maint/Misc/NEWS Mon Feb 7 17:52:25 2011 @@ -167,6 +167,9 @@ Build ----- +- Issue #11079: The /Applications/Python x.x folder created by the Mac + OS X installers now includes a link to the installed documentation. + - Issue #11054: Allow Mac OS X installer builds to again work on 10.5 with the system-provided Python. From python-checkins at python.org Mon Feb 7 21:26:52 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 21:26:52 +0100 Subject: [Python-checkins] devguide: Add myself to nntplib Message-ID: antoine.pitrou pushed 215d08f352d3 to devguide: http://hg.python.org/devguide/rev/215d08f352d3 changeset: 267:215d08f352d3 parent: 234:2c711f0028a7 user: Antoine Pitrou date: Mon Feb 07 21:23:20 2011 +0100 summary: Add myself to nntplib files: experts.rst diff --git a/experts.rst b/experts.rst --- a/experts.rst +++ b/experts.rst @@ -150,7 +150,7 @@ multiprocessing jnoller netrc nis -nntplib +nntplib pitrou numbers operator optparse aronacher -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 21:26:55 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 21:26:55 +0100 Subject: [Python-checkins] devguide: Mention the freenode Web client. Message-ID: antoine.pitrou pushed 28cbae489d83 to devguide: http://hg.python.org/devguide/rev/28cbae489d83 changeset: 268:28cbae489d83 user: Antoine Pitrou date: Mon Feb 07 21:25:58 2011 +0100 summary: Mention the freenode Web client. files: communication.rst diff --git a/communication.rst b/communication.rst --- a/communication.rst +++ b/communication.rst @@ -66,7 +66,9 @@ Some core developers enjoy spending time on IRC discussing various issues regarding Python's development in the ``#python-dev`` channel on ``irc.freenode.net``. This is not a place to ask for help with Python, but to -discuss issues related to Python's own development. +discuss issues related to Python's own development. You can use freenode's +`Web interface `_ if you don't have an IRC +client. 
Blogs -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 21:26:55 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 21:26:55 +0100 Subject: [Python-checkins] devguide: Merge with default Message-ID: antoine.pitrou pushed 17b7fd622c97 to devguide: http://hg.python.org/devguide/rev/17b7fd622c97 changeset: 269:17b7fd622c97 branch: hg_transition tag: tip parent: 266:d9ed9a095c86 parent: 268:28cbae489d83 user: Antoine Pitrou date: Mon Feb 07 21:26:32 2011 +0100 summary: Merge with default files: communication.rst diff --git a/communication.rst b/communication.rst --- a/communication.rst +++ b/communication.rst @@ -66,7 +66,9 @@ Some core developers enjoy spending time on IRC discussing various issues regarding Python's development in the ``#python-dev`` channel on ``irc.freenode.net``. This is not a place to ask for help with Python, but to -discuss issues related to Python's own development. +discuss issues related to Python's own development. You can use freenode's +`Web interface `_ if you don't have an IRC +client. Blogs diff --git a/experts.rst b/experts.rst --- a/experts.rst +++ b/experts.rst @@ -150,7 +150,7 @@ multiprocessing jnoller netrc nis -nntplib +nntplib pitrou numbers operator optparse aronacher -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 22:13:16 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 22:13:16 +0100 Subject: [Python-checkins] devguide: Present an idiomatic hg workflow (rather than "all in the working copy") Message-ID: antoine.pitrou pushed 570a452dc4d9 to devguide: http://hg.python.org/devguide/rev/570a452dc4d9 changeset: 270:570a452dc4d9 branch: hg_transition user: Antoine Pitrou date: Mon Feb 07 22:01:21 2011 +0100 summary: Present an idiomatic hg workflow (rather than "all in the working copy") files: faq.rst patch.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -179,6 +179,8 @@ hg pull -u +.. _hg-local-workflow: + How do I add a file or directory to the repository? ------------------------------------------------------------------------------- diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -10,10 +10,23 @@ Tool Usage '''''''''' -Because Python uses hg as its version control system, **anyone** can make +Because Python uses Mercurial as its version control system, **anyone** can make commits locally to their repository. This means that you should make as many commits to your code checkout as you want in order for you to work effectively. +.. _named-branch-workflow: + +Also, Mercurial allows for various workflows according to each person's or +project's preference. We present here a very simple solution based on `named +branches `_. Before you +start modifying things in your working copy, type:: + + hg branch mywork + +where ``mywork`` is a descriptive name for what you are going to work on. +Then all your local commits will be recorded on that branch, which is an +effective way of distinguishing them from other (upstream) commits. + Preparation ''''''''''' @@ -70,30 +83,28 @@ make patchcheck This will check and/or fix various common things people forget to do for -patches. +patches, such as adding any new files needing for the patch to work. -To create your patch, you should generate a unified diff from your checkout's -top-level directory:: +The following instructions assume you are using the :ref:`named branch approach +` suggested earlier. 
To create your patch, first check +that all your local changes have been committed, then type the following:: - hg outgoing --path > patch.diff - -If your work needs some new files to be added to the source tree, remember -to ``hg add`` them before generating the patch:: - - hg add Lib/newfile.py - hg outgoing --patch > patch.diff + hg diff -r default > mywork.patch To apply a patch generated this way, do:: - patch -p1 < patch.diff + patch -p1 < mywork.patch -To undo a patch, you can revert **all** changes made in your checkout:: +To undo a patch applied in your working copy, simply can revert **all** changes:: hg revert --all This will leave backups of the files with your changes still intact. To skip that step, you can use the ``--no-backup`` flag. +Please refer to the :ref:`FAQ ` for :ref:`more information +` on how to manage your local changes. + .. note:: The ``patch`` program is not available by default under Windows. You can find it `here `_, courtesy of the `GnuWin32 `_ project. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 22:13:17 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 22:13:17 +0100 Subject: [Python-checkins] devguide: Move the prerequisite of running the test suite in the "preparation" Message-ID: antoine.pitrou pushed a90309bc93fb to devguide: http://hg.python.org/devguide/rev/a90309bc93fb changeset: 271:a90309bc93fb parent: 268:28cbae489d83 user: Antoine Pitrou date: Mon Feb 07 22:09:30 2011 +0100 summary: Move the prerequisite of running the test suite in the "preparation" checklist. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -32,9 +32,14 @@ Third, make sure you have proper tests to verify your patch works as expected. Patches will not be accepted without the proper tests! -Fourth, proper documentation additions/changes should be included. +Fourth, make sure the entire test suite :ref:`runs ` **without +failure** because of your changes. It is not sufficient to only run whichever +test seems impacted by your changes, because there might be interferences +unknown to you between your changes and some other part of the interpreter. -Fifth, if you are not already in the ``Misc/ACKS`` file then add your name. If +Fifth, proper documentation additions/changes should be included. + +Sixth, if you are not already in the ``Misc/ACKS`` file then add your name. If you have taken the time to diagnose a problem, invent a solution, code it up, and submit a patch you deserve to be recognized as having contributed to Python. This also means you need to fill out a `contributor form`_ which @@ -51,12 +56,6 @@ Generation '''''''''' -Before creating your patch, you should make sure that the entire test suite -:ref:`runs ` without failure because of your changes. It is not -sufficient to only run whichever test seems impacted by your changes, because -there might be interferences unknown to you between your changes and some -other part of the interpreter. 
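Purely as a sketch of that requirement, running the entire test suite from an in-place build can look like the following; the regrtest invocation mirrors the refleak runs quoted elsewhere in this digest, and the exact options may differ per branch::

    # Rebuild so the interpreter picks up your changes, then run every test
    # with all optional resources enabled.
    make
    ./python -m test.regrtest -uall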
- To perform a quick sanity check on your patch, you can run:: make patchcheck -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 22:13:18 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 22:13:18 +0100 Subject: [Python-checkins] devguide: Merge Message-ID: antoine.pitrou pushed ec3d7e79f9e2 to devguide: http://hg.python.org/devguide/rev/ec3d7e79f9e2 changeset: 272:ec3d7e79f9e2 branch: hg_transition parent: 270:570a452dc4d9 parent: 271:a90309bc93fb user: Antoine Pitrou date: Mon Feb 07 22:11:50 2011 +0100 summary: Merge files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -53,9 +53,14 @@ Third, make sure you have proper tests to verify your patch works as expected. Patches will not be accepted without the proper tests! -Fourth, proper documentation additions/changes should be included. +Fourth, make sure the entire test suite :ref:`runs ` **without +failure** because of your changes. It is not sufficient to only run whichever +test seems impacted by your changes, because there might be interferences +unknown to you between your changes and some other part of the interpreter. -Fifth, if you are not already in the ``Misc/ACKS`` file then add your name. If +Fifth, proper documentation additions/changes should be included. + +Sixth, if you are not already in the ``Misc/ACKS`` file then add your name. If you have taken the time to diagnose a problem, invent a solution, code it up, and submit a patch you deserve to be recognized as having contributed to Python. This also means you need to fill out a `contributor form`_ which @@ -72,12 +77,6 @@ Generation '''''''''' -Before creating your patch, you should make sure that the entire test suite -:ref:`runs ` without failure because of your changes. It is not -sufficient to only run whichever test seems impacted by your changes, because -there might be interferences unknown to you between your changes and some -other part of the interpreter. - To perform a quick sanity check on your patch, you can run:: make patchcheck -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 22:13:18 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 22:13:18 +0100 Subject: [Python-checkins] devguide: Add a link to "Documenting Python" Message-ID: antoine.pitrou pushed f5307c79683b to devguide: http://hg.python.org/devguide/rev/f5307c79683b changeset: 273:f5307c79683b parent: 271:a90309bc93fb user: Antoine Pitrou date: Mon Feb 07 22:12:50 2011 +0100 summary: Add a link to "Documenting Python" files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -37,7 +37,8 @@ test seems impacted by your changes, because there might be interferences unknown to you between your changes and some other part of the interpreter. -Fifth, proper documentation additions/changes should be included. +Fifth, proper `documentation `_ +additions/changes should be included. Sixth, if you are not already in the ``Misc/ACKS`` file then add your name. 
If you have taken the time to diagnose a problem, invent a solution, code it up, -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 7 22:13:18 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 07 Feb 2011 22:13:18 +0100 Subject: [Python-checkins] devguide: Merge Message-ID: antoine.pitrou pushed 1d86cf7cf506 to devguide: http://hg.python.org/devguide/rev/1d86cf7cf506 changeset: 274:1d86cf7cf506 branch: hg_transition tag: tip parent: 272:ec3d7e79f9e2 parent: 273:f5307c79683b user: Antoine Pitrou date: Mon Feb 07 22:13:13 2011 +0100 summary: Merge files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -58,7 +58,8 @@ test seems impacted by your changes, because there might be interferences unknown to you between your changes and some other part of the interpreter. -Fifth, proper documentation additions/changes should be included. +Fifth, proper `documentation `_ +additions/changes should be included. Sixth, if you are not already in the ``Misc/ACKS`` file then add your name. If you have taken the time to diagnose a problem, invent a solution, code it up, -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 8 00:10:33 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 8 Feb 2011 00:10:33 +0100 (CET) Subject: [Python-checkins] r88377 - in python/branches/py3k/Doc: ACKS.txt library/shelve.rst Message-ID: <20110207231033.6DF64EE9CC@mail.python.org> Author: antoine.pitrou Date: Tue Feb 8 00:10:33 2011 New Revision: 88377 Log: Issue #11141: Fix the shelve documentation to use a list, not a range object. Patch by SilentGhost. Modified: python/branches/py3k/Doc/ACKS.txt python/branches/py3k/Doc/library/shelve.rst Modified: python/branches/py3k/Doc/ACKS.txt ============================================================================== --- python/branches/py3k/Doc/ACKS.txt (original) +++ python/branches/py3k/Doc/ACKS.txt Tue Feb 8 00:10:33 2011 @@ -182,6 +182,7 @@ * Joakim Sernbrant * Justin Sheehy * Charlie Shepherd + * SilentGhost * Michael Simcich * Ionel Simionescu * Michael Sloan Modified: python/branches/py3k/Doc/library/shelve.rst ============================================================================== --- python/branches/py3k/Doc/library/shelve.rst (original) +++ python/branches/py3k/Doc/library/shelve.rst Tue Feb 8 00:10:33 2011 @@ -169,8 +169,8 @@ klist = list(d.keys()) # a list of all existing keys (slow!) # as d was opened WITHOUT writeback=True, beware: - d['xx'] = range(4) # this works as expected, but... - d['xx'].append(5) # *this doesn't!* -- d['xx'] is STILL range(4)! + d['xx'] = [0, 1, 2] # this works as expected, but... + d['xx'].append(3) # *this doesn't!* -- d['xx'] is STILL [0, 1, 2]! # having opened d without writeback=True, you need to code carefully: temp = d['xx'] # extracts the copy From python-checkins at python.org Tue Feb 8 00:18:52 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 8 Feb 2011 00:18:52 +0100 (CET) Subject: [Python-checkins] r88378 - in python/branches/release31-maint: Doc/ACKS.txt Doc/library/shelve.rst Message-ID: <20110207231852.E5B94EEA36@mail.python.org> Author: antoine.pitrou Date: Tue Feb 8 00:18:52 2011 New Revision: 88378 Log: Merged revisions 88377 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88377 | antoine.pitrou | 2011-02-08 00:10:33 +0100 (mar., 08 f?vr. 
2011) | 4 lines Issue #11141: Fix the shelve documentation to use a list, not a range object. Patch by SilentGhost. ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/ACKS.txt python/branches/release31-maint/Doc/library/shelve.rst Modified: python/branches/release31-maint/Doc/ACKS.txt ============================================================================== --- python/branches/release31-maint/Doc/ACKS.txt (original) +++ python/branches/release31-maint/Doc/ACKS.txt Tue Feb 8 00:18:52 2011 @@ -179,6 +179,7 @@ * Joakim Sernbrant * Justin Sheehy * Charlie Shepherd + * SilentGhost * Michael Simcich * Ionel Simionescu * Michael Sloan Modified: python/branches/release31-maint/Doc/library/shelve.rst ============================================================================== --- python/branches/release31-maint/Doc/library/shelve.rst (original) +++ python/branches/release31-maint/Doc/library/shelve.rst Tue Feb 8 00:18:52 2011 @@ -158,8 +158,8 @@ klist = list(d.keys()) # a list of all existing keys (slow!) # as d was opened WITHOUT writeback=True, beware: - d['xx'] = range(4) # this works as expected, but... - d['xx'].append(5) # *this doesn't!* -- d['xx'] is STILL range(4)! + d['xx'] = [0, 1, 2] # this works as expected, but... + d['xx'].append(3) # *this doesn't!* -- d['xx'] is STILL [0, 1, 2]! # having opened d without writeback=True, you need to code carefully: temp = d['xx'] # extracts the copy From nnorwitz at gmail.com Tue Feb 8 00:48:42 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Mon, 7 Feb 2011 18:48:42 -0500 Subject: [Python-checkins] Python Regression Test Failures refleak (1) Message-ID: <20110207234842.GA27331@kbk-i386-bb.dyndns.org> More important issues: ---------------------- test_bz2 leaked [0, 0, 80] references, sum=80 Less important issues: ---------------------- From solipsis at pitrou.net Tue Feb 8 05:04:53 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 08 Feb 2011 05:04:53 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88377): sum=0 Message-ID: py3k results for svn r88377 (hg cset d35dbf596638) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogILAGRc', '-x'] From python-checkins at python.org Tue Feb 8 20:24:10 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 08 Feb 2011 20:24:10 +0100 Subject: [Python-checkins] devguide: Two things: (1) explain how to switch between branches so people don't Message-ID: brett.cannon pushed f20f47637065 to devguide: http://hg.python.org/devguide/rev/f20f47637065 changeset: 275:f20f47637065 branch: hg_transition tag: tip user: Brett Cannon date: Tue Feb 08 11:24:02 2011 -0800 summary: Two things: (1) explain how to switch between branches so people don't end up branching off of their own branches constantly, and (2) mention that using named branches for everything is more for non-committers, else we would end up with a ton of named branches in the main repo. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -10,16 +10,12 @@ Tool Usage '''''''''' -Because Python uses Mercurial as its version control system, **anyone** can make -commits locally to their repository. This means that you should make as many -commits to your code checkout as you want in order for you to work effectively. - .. 
_named-branch-workflow: -Also, Mercurial allows for various workflows according to each person's or +Mercurial allows for various workflows according to each person's or project's preference. We present here a very simple solution based on `named -branches `_. Before you -start modifying things in your working copy, type:: +branches `_ for +non-committers. Before you start modifying things in your working copy, type:: hg branch mywork @@ -27,6 +23,20 @@ Then all your local commits will be recorded on that branch, which is an effective way of distinguishing them from other (upstream) commits. +Make sure to do your branching from the ``default`` branch (type ``hg branch`` +to see which branch is active). To switch between branches, e.g., switch to the +``default`` branch, do:: + + hg update default + +When you are done with a branch, you can mark it as closed by doing the +following while the branch you wish to close is active:: + + hg commit --close-branch + +This deletes nothing, but it stops the branch from being listed as active in +your checkout. + Preparation ''''''''''' -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 8 21:12:24 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 08 Feb 2011 21:12:24 +0100 Subject: [Python-checkins] devguide: Try to explain how to port changes. Message-ID: brett.cannon pushed 5c1e3ac98774 to devguide: http://hg.python.org/devguide/rev/5c1e3ac98774 changeset: 276:5c1e3ac98774 branch: hg_transition tag: tip user: Brett Cannon date: Tue Feb 08 12:12:17 2011 -0800 summary: Try to explain how to port changes. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -48,40 +48,63 @@ ask on python-dev. -Backporting ------------ +Forward-Porting +--------------- + If the patch is a bugfix and it does not break -backwards-compatibility *at all*, then backport it to the branch(es) in -maintenance mode. The easiest way to do this is to apply the patch in the -development branch, commit, and then use svnmerge.py_ to backport the patch. +backwards-compatibility *at all*, then it should be applied to the oldest +branch applicable and forward-ported until it reaches the in-development branch +of Python. A forward-port instead of a back-port is preferred as it allows the +:abbr:`DAG (directed acyclic graph)` used by hg to work with the movement of +the patch through the codebase instead of against it. -For example, let us assume you just made commit 42 in the development branch -and you want to backport it to the ``release31-maint`` branch. You would change -your working directory to the maintenance branch and run the command:: - svnmerge.py merge -r 42 +Porting Within a Major Version +'''''''''''''''''''''''''''''' +Assume that Python 3.2 is the current in-development version of Python and that +you have a patch that should also be applied to Python 3.1. To properly port +the patch to both versions of Python, you should first apply the patch to +Python 3.1:: -This will try to apply the patch to the current branch and generate a commit -message. You will need to revert ``Misc/NEWS`` and do a new entry (the file -changes too much between releases to ever have a merge succeed). 
To do a -reversion, you can either undo the changes:: + hg update release-31maint + patch -p1 < patch.diff + hg commit - svn revert Misc/NEWS +With the patch now committed (notice that pushing to hg.python.org is not +needed yet), you want to merge the patch up into Python 3.2:: -or you can manually fix the issue and tell svn the problem is resolved:: + hg update py3k + hg merge release-31maint + # Fix any conflicts; probably Misc/NEWS at least + hg commit + hg push - svn resolved Misc/NEWS +This will get the patch working in Python 3.2 and push **both** the Python 3.1 +and Python 3.2 updates to hg.python.org. If someone has forgotten to merge +their changes from previous patches applied to Python 3.1 then they too will be +merged (hopefully this will not be the case). -Once your checkout is ready to be committed, do:: +If you want to do the equivalent of blocking a patch in Python 3.2 that was +applied to Python 3.1, simply merge the change but revert the changes before +committing:: - svn ci -F svnmerge-commit-message.txt + hg merge release-31maint + hg revert -a + hg commit + hg push -This will commit the backport along with using the commit message created by -``svnmerge.py`` for you. +This will cause hg's DAG to note that the changes were merged while not +committing any change in the actual code. -If it turns out you do not have the time to do a backport, then at least leave -the proper issue open on the tracker with a note specifying that the change -should be backported so someone else can do it. +Porting Between Major Versions +'''''''''''''''''''''''''''''' +To move a patch between, e.g., Python 3.2 and 2.7, use the `transplant +extension`_. Assuming you committed in Python 2.7 first, to pull changeset +#12345 into Python 3.2, do:: + hg transplant -s -m 12345 + # XXX any other steps required, or is it the quivalent of merged and committed? + hg push -.. _svnmerge.py: http://svn.apache.org/repos/asf/subversion/trunk/contrib/client-side/svnmerge/svnmerge.py + +.. _transplant extension: http://mercurial.selenic.com/wiki/TransplantExtension -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 8 22:38:39 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 08 Feb 2011 22:38:39 +0100 Subject: [Python-checkins] devguide: Denote the changes in forward-porting changes between feature clones and Message-ID: brett.cannon pushed d621fcb29f04 to devguide: http://hg.python.org/devguide/rev/d621fcb29f04 changeset: 277:d621fcb29f04 branch: hg_transition tag: tip user: Brett Cannon date: Tue Feb 08 13:38:03 2011 -0800 summary: Denote the changes in forward-porting changes between feature clones and working in a single clone. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -71,10 +71,19 @@ hg commit With the patch now committed (notice that pushing to hg.python.org is not -needed yet), you want to merge the patch up into Python 3.2:: +needed yet), you want to merge the patch up into Python 3.2. Assuming you are +doing all of your work in a single clone:: hg update py3k hg merge release-31maint + +If you are using feature clones, then do:: + + hg pull + hg merge + +Now that the changes have been pulled into the proper branch/clone, do:: + # Fix any conflicts; probably Misc/NEWS at least hg commit hg push @@ -85,8 +94,8 @@ merged (hopefully this will not be the case). 
If you want to do the equivalent of blocking a patch in Python 3.2 that was -applied to Python 3.1, simply merge the change but revert the changes before -committing:: +applied to Python 3.1, simply pull/merge the change but revert the changes +before committing:: hg merge release-31maint hg revert -a -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 8 22:40:17 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 08 Feb 2011 22:40:17 +0100 Subject: [Python-checkins] devguide: Tweak in patch block/revert. Message-ID: brett.cannon pushed 84c750aec888 to devguide: http://hg.python.org/devguide/rev/84c750aec888 changeset: 278:84c750aec888 branch: hg_transition tag: tip user: Brett Cannon date: Tue Feb 08 13:40:09 2011 -0800 summary: Tweak in patch block/revert. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -97,7 +97,7 @@ applied to Python 3.1, simply pull/merge the change but revert the changes before committing:: - hg merge release-31maint + # After pull/merge hg revert -a hg commit hg push -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 8 23:18:54 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 08 Feb 2011 23:18:54 +0100 Subject: [Python-checkins] devguide: Try to explain the two most common approaches to hg workflow: feature Message-ID: brett.cannon pushed 639ba3557445 to devguide: http://hg.python.org/devguide/rev/639ba3557445 changeset: 279:639ba3557445 branch: hg_transition tag: tip user: Brett Cannon date: Tue Feb 08 14:18:47 2011 -0800 summary: Try to explain the two most common approaches to hg workflow: feature clones and mq. Also simplify the porting of changes by only discussing a single clone approach. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -28,6 +28,31 @@ contributed to the resolution, it is good practice to credit them. +Common Hg Workflows +------------------- + +While non-committers can use named branches without issue, as a core developer +you should limit their use to only those branches to be used to collaborate +between other core developers. This is because named branches do persist in the +revision history. + +Instead, for personal development that does not need to be shared prior to +uploading a patch, other approaches should be considered. Two common ones are +feature clones and :abbr:`mq (Mercurial Queues)`. + +Feature clones assumes you prefer to work with various directories containing +separate clones. You can then create other local clones for each feature you +wish to work on. From there you can push changes back through your local clones +until they finally get pushed to the remote repository. + +With mq_, you work within a single clone, managing your changes with a queue of +patches. This allows you to easily group related changes while still having +them all applied at once. + + +.. 
_mq: http://mercurial.selenic.com/wiki/MqExtension + + Handling Other's Code --------------------- @@ -76,14 +101,6 @@ hg update py3k hg merge release-31maint - -If you are using feature clones, then do:: - - hg pull - hg merge - -Now that the changes have been pulled into the proper branch/clone, do:: - # Fix any conflicts; probably Misc/NEWS at least hg commit hg push -- Repository URL: http://hg.python.org/devguide From tjreedy at udel.edu Tue Feb 8 23:27:01 2011 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 08 Feb 2011 17:27:01 -0500 Subject: [Python-checkins] devguide: Try to explain the two most common approaches to hg workflow: feature In-Reply-To: References: Message-ID: <4D51C335.10306@udel.edu> > +While non-committers can use named branches without issue, as a core developer > +you should limit their use to only those branches to be used to collaborate either /their/your/ or /as a core developer you/core developers/ I prefer latter as parallel to 'non-committers'. From python-checkins at python.org Wed Feb 9 00:28:20 2011 From: python-checkins at python.org (martin.v.loewis) Date: Wed, 9 Feb 2011 00:28:20 +0100 (CET) Subject: [Python-checkins] r88379 - tracker/roundup-src/roundup/backends/rdbms_common.py Message-ID: <20110208232820.7B36CEE984@mail.python.org> Author: martin.v.loewis Date: Wed Feb 9 00:28:20 2011 New Revision: 88379 Log: Issue #1907: Properly reject large issue IDs. Modified: tracker/roundup-src/roundup/backends/rdbms_common.py Modified: tracker/roundup-src/roundup/backends/rdbms_common.py ============================================================================== --- tracker/roundup-src/roundup/backends/rdbms_common.py (original) +++ tracker/roundup-src/roundup/backends/rdbms_common.py Wed Feb 9 00:28:20 2011 @@ -1104,6 +1104,9 @@ def hasnode(self, classname, nodeid): """ Determine if the database has a given node. """ + if int(nodeid) >= 2**31: + # value out of range + return 0 # If this node is in the cache, then we do not need to go to # the database. (We don't consider this an LRU hit, though.) if self.cache.has_key((classname, nodeid)): From python-checkins at python.org Wed Feb 9 01:53:38 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 01:53:38 +0100 Subject: [Python-checkins] devguide: grammatical fix Message-ID: brett.cannon pushed a6a3b9a80fda to devguide: http://hg.python.org/devguide/rev/a6a3b9a80fda changeset: 280:a6a3b9a80fda branch: hg_transition tag: tip user: Brett Cannon date: Tue Feb 08 16:52:55 2011 -0800 summary: grammatical fix files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -31,9 +31,9 @@ Common Hg Workflows ------------------- -While non-committers can use named branches without issue, as a core developer -you should limit their use to only those branches to be used to collaborate -between other core developers. This is because named branches do persist in the +While non-committers can use named branches without issue, as a committer +you should limit your use to only those branches to be used to collaborate +between other committers. This is because named branches do persist in the revision history. 
Instead, for personal development that does not need to be shared prior to -- Repository URL: http://hg.python.org/devguide From brett at python.org Wed Feb 9 01:53:43 2011 From: brett at python.org (Brett Cannon) Date: Tue, 8 Feb 2011 16:53:43 -0800 Subject: [Python-checkins] devguide: Try to explain the two most common approaches to hg workflow: feature In-Reply-To: <4D51C335.10306@udel.edu> References: <4D51C335.10306@udel.edu> Message-ID: fixed On Tue, Feb 8, 2011 at 14:27, Terry Reedy wrote: > >> +While non-committers can use named branches without issue, as a core >> developer >> +you should limit their use to only those branches to be used to >> collaborate > > either /their/your/ > or /as a core developer you/core developers/ > I prefer latter as parallel to 'non-committers'. > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > http://mail.python.org/mailman/listinfo/python-checkins > From python-checkins at python.org Wed Feb 9 02:43:26 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 02:43:26 +0100 Subject: [Python-checkins] devguide: Suggest rebasing to compress local commits into a single remote commit. Message-ID: brett.cannon pushed 2d519c5e1780 to devguide: http://hg.python.org/devguide/rev/2d519c5e1780 changeset: 281:2d519c5e1780 branch: hg_transition tag: tip user: Brett Cannon date: Tue Feb 08 17:43:20 2011 -0800 summary: Suggest rebasing to compress local commits into a single remote commit. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -28,8 +28,8 @@ contributed to the resolution, it is good practice to credit them. -Common Hg Workflows -------------------- +Working with Hg_ +---------------- While non-committers can use named branches without issue, as a committer you should limit your use to only those branches to be used to collaborate @@ -49,7 +49,12 @@ patches. This allows you to easily group related changes while still having them all applied at once. +Regardless of which approach to use, consider using the ``hg rebase`` command +before you push. This way you create a single commit in the history instead of +recording, e.g., five separate commits all related to the same fix. + +.. _hg: http://www.hg-scm.org/ .. _mq: http://mercurial.selenic.com/wiki/MqExtension -- Repository URL: http://hg.python.org/devguide From solipsis at pitrou.net Wed Feb 9 05:05:27 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 09 Feb 2011 05:05:27 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88377): sum=0 Message-ID: py3k results for svn r88377 (hg cset d35dbf596638) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogLBjuXn', '-x'] From python-checkins at python.org Wed Feb 9 07:50:15 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 9 Feb 2011 07:50:15 +0100 (CET) Subject: [Python-checkins] r88380 - peps/trunk/pep-0008.txt Message-ID: <20110209065015.0578AEE9CC@mail.python.org> Author: raymond.hettinger Date: Wed Feb 9 07:50:14 2011 New Revision: 88380 Log: Add note about how Control-L gets displayed. Fix whitespace. 
Modified: peps/trunk/pep-0008.txt Modified: peps/trunk/pep-0008.txt ============================================================================== --- peps/trunk/pep-0008.txt (original) +++ peps/trunk/pep-0008.txt Wed Feb 9 07:50:14 2011 @@ -86,9 +86,9 @@ The preferred way of wrapping long lines is by using Python's implied line continuation inside parentheses, brackets and braces. Long lines can be broken over multiple lines by wrapping expressions in parentheses. These - should be used in preference to using a backslash for line continuation. + should be used in preference to using a backslash for line continuation. Make sure to indent the continued line appropriately. The preferred place - to break around a binary operator is *after* the operator, not before it. + to break around a binary operator is *after* the operator, not before it. Some examples: class Rectangle(Blob): @@ -120,7 +120,9 @@ Python accepts the control-L (i.e. ^L) form feed character as whitespace; Many tools treat these characters as page separators, so you may use them - to separate pages of related sections of your file. + to separate pages of related sections of your file. Note, some editors + and web-based code viewers may not recognize control-L as a form feed + and will show another glyph in its place. Encodings (PEP 263) @@ -697,14 +699,14 @@ try: import platform_specific_module except ImportError: - platform_specific_module = None + platform_specific_module = None A bare 'except:' clause will catch SystemExit and KeyboardInterrupt exceptions, making it harder to interrupt a program with Control-C, and can disguise other problems. If you want to catch all exceptions that signal program errors, use 'except Exception:'. - A good rule of thumb is to limit use of bare 'except' clauses to two + A good rule of thumb is to limit use of bare 'except' clauses to two cases: 1) If the exception handler will be printing out or logging From python-checkins at python.org Wed Feb 9 19:08:28 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 09 Feb 2011 19:08:28 +0100 Subject: [Python-checkins] devguide: Use Mercurial's full name Message-ID: antoine.pitrou pushed 957687c483c7 to devguide: http://hg.python.org/devguide/rev/957687c483c7 changeset: 282:957687c483c7 branch: hg_transition user: Antoine Pitrou date: Wed Feb 09 18:36:02 2011 +0100 summary: Use Mercurial's full name files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -28,8 +28,8 @@ contributed to the resolution, it is good practice to credit them. -Working with Hg_ ----------------- +Working with Mercurial_ +----------------------- While non-committers can use named branches without issue, as a committer you should limit your use to only those branches to be used to collaborate @@ -54,7 +54,7 @@ recording, e.g., five separate commits all related to the same fix. -.. _hg: http://www.hg-scm.org/ +.. _Mercurial: http://www.hg-scm.org/ .. 
_mq: http://mercurial.selenic.com/wiki/MqExtension -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 19:08:29 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 09 Feb 2011 19:08:29 +0100 Subject: [Python-checkins] devguide: Try to present the constraints more clearly Message-ID: antoine.pitrou pushed 58b029cfd230 to devguide: http://hg.python.org/devguide/rev/58b029cfd230 changeset: 283:58b029cfd230 branch: hg_transition tag: tip user: Antoine Pitrou date: Wed Feb 09 19:08:24 2011 +0100 summary: Try to present the constraints more clearly files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -31,31 +31,30 @@ Working with Mercurial_ ----------------------- -While non-committers can use named branches without issue, as a committer -you should limit your use to only those branches to be used to collaborate -between other committers. This is because named branches do persist in the -revision history. +As a committer, the ability to push changes to the official Python +repositories means you have to be more careful with your workflow: -Instead, for personal development that does not need to be shared prior to -uploading a patch, other approaches should be considered. Two common ones are -feature clones and :abbr:`mq (Mercurial Queues)`. +* You should not push new named branches to the main repository. You can + still use them in clones that you use for development of patches; you can + also push these branches to a **separate** public repository that will be + dedicated to maintenance of the work before the work gets integrated in the + main repository. -Feature clones assumes you prefer to work with various directories containing -separate clones. You can then create other local clones for each feature you -wish to work on. From there you can push changes back through your local clones -until they finally get pushed to the remote repository. +* You should collapse changesets of a single feature or bugfix before pushing + the result to the main repository. The reason is that we don't want the + history to be full of intermediate commits recording the private history + of the person working on a patch. To automate such collapsing, you can + enable the rebase_ extension, and use the ``--collapse`` option to + ``hg rebase``. -With mq_, you work within a single clone, managing your changes with a queue of -patches. This allows you to easily group related changes while still having -them all applied at once. - -Regardless of which approach to use, consider using the ``hg rebase`` command -before you push. This way you create a single commit in the history instead of -recording, e.g., five separate commits all related to the same fix. +Because of these constraints, it can be practical to use other approaches +such as mq_ (Mercurial Queues), in order to maintain patches in a single +local repository and to push them seamlessly when they are ready. .. _Mercurial: http://www.hg-scm.org/ .. _mq: http://mercurial.selenic.com/wiki/MqExtension +.. 
_rebase: http://mercurial.selenic.com/wiki/RebaseExtension Handling Other's Code -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 19:16:33 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 9 Feb 2011 19:16:33 +0100 (CET) Subject: [Python-checkins] r88381 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110209181633.10E87EEA91@mail.python.org> Author: raymond.hettinger Date: Wed Feb 9 19:16:32 2011 New Revision: 88381 Log: Typo. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Wed Feb 9 19:16:32 2011 @@ -648,7 +648,7 @@ The biggest news for Python 3.2 is that the :mod:`email` package, :mod:`mailbox` module, and :mod:`nntplib` modules now work correctly with the bytes/text model -in Python 3. For the first time, there is correct handling of message with +in Python 3. For the first time, there is correct handling of messages with mixed encodings. Throughout the standard library, there has been more careful attention to From python-checkins at python.org Wed Feb 9 20:01:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 09 Feb 2011 20:01:57 +0100 Subject: [Python-checkins] devguide: Fix wording: it is not automation. Also, there's a probably a problem Message-ID: antoine.pitrou pushed 8693162abe8b to devguide: http://hg.python.org/devguide/rev/8693162abe8b changeset: 284:8693162abe8b branch: hg_transition tag: tip user: Antoine Pitrou date: Wed Feb 09 20:00:50 2011 +0100 summary: Fix wording: it is not automation. Also, there's a probably a problem if there's nothing to rebase, since collapsing then won't be done; so using rebase is not a strong suggestion. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -43,9 +43,8 @@ * You should collapse changesets of a single feature or bugfix before pushing the result to the main repository. The reason is that we don't want the history to be full of intermediate commits recording the private history - of the person working on a patch. To automate such collapsing, you can - enable the rebase_ extension, and use the ``--collapse`` option to - ``hg rebase``. + of the person working on a patch. If you are using the rebase_ extension, + consider adding the ``--collapse`` option to ``hg rebase``. 
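For anyone who has not used the extension, a minimal sketch of that collapsing step; the revision number below is hypothetical, and the extension ships with Mercurial but has to be enabled explicitly::

    # In ~/.hgrc (or Mercurial.ini) -- enable the bundled rebase extension.
    [extensions]
    rebase =

    # Collapse a run of local changesets into a single one before pushing;
    # 1234 stands for the first of your local (not yet pushed) revisions.
    hg rebase --source 1234 --dest default --collapse
    hg push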
Because of these constraints, it can be practical to use other approaches such as mq_ (Mercurial Queues), in order to maintain patches in a single -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 20:21:00 2011 From: python-checkins at python.org (terry.reedy) Date: Wed, 9 Feb 2011 20:21:00 +0100 (CET) Subject: [Python-checkins] r88382 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110209192100.807B6EEAB8@mail.python.org> Author: terry.reedy Date: Wed Feb 9 20:21:00 2011 New Revision: 88382 Log: Add 'pysqlite' before version 2.6.0 Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Wed Feb 9 20:21:00 2011 @@ -1520,7 +1520,7 @@ sqlite3 ------- -The :mod:`sqlite3` module was updated to version 2.6.0. It has two new capabilities. +The :mod:`sqlite3` module was updated to pysqlite version 2.6.0. It has two new capabilities. * The :attr:`sqlite3.Connection.in_transit` attribute is true if there is an active transaction for uncommitted changes. From python-checkins at python.org Wed Feb 9 21:17:51 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 21:17:51 +0100 Subject: [Python-checkins] devguide: More clarifications: use the term 'working copy' and mention 'hg update'. Message-ID: brett.cannon pushed 473eda2e3a43 to devguide: http://hg.python.org/devguide/rev/473eda2e3a43 changeset: 285:473eda2e3a43 branch: hg_transition tag: tip user: Brett Cannon date: Wed Feb 09 12:17:41 2011 -0800 summary: More clarifications: use the term 'working copy' and mention 'hg update'. files: setup.rst diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -3,10 +3,10 @@ Getting Set Up ============== -These instructions cover how to get a source checkout and a compiled version of -the CPython interpreter (CPython is the version of Python available from -http://www.python.org/). It also gives an overview of the directory -structure of a CPython checkout. +These instructions cover how to get a working copy of the source code and a +compiled version of the CPython interpreter (CPython is the version of Python +available from http://www.python.org/). It also gives an overview of the +directory structure of the CPython source code. .. contents:: @@ -26,37 +26,34 @@ Checking out the code ---------------------- -One should always work from a checkout of the CPython source code. While it may +One should always work from a working copy of the CPython source code. +While it may be tempting to work from the downloaded copy you already have installed on your machine, it is very likely that you will be working from out-of-date code as the Python core developers are constantly updating and fixing things in their :abbr:`VCS`. It also means you will have better tool support through the VCS as it will provide a diff tool, etc. -To get a read-only checkout of CPython's source, you need to checkout the source -code. To get a read-only checkout of +To get a read-only checkout of CPython's source, you need a working copy the +source code. To get a read-only checkout of the :ref:`in-development ` branch of Python, run:: - hg clone http://hg.python.org/cpython#py3k - -.. warning:: - XXX assuming 'default' branch is unused. 
Regardless, probably wrong URL as - PEP 385 suggests py3k will be its own repo compared to Python 2 which is not - the case as of 2011-02-05. If this is wrong, also change all other hg URLs. + hg clone http://hg.python.org/cpython If you want a read-only checkout of an already-released version of Python, i.e., a version in :ref:`maintenance mode `, run something -like the following which gets you a checkout for Python 3.1:: +like the following which gets you a working copy of Python 3.1:: hg clone http://hg.python.org/cpython#release-31maint To check out a version of Python other than 3.1, simply change the number in the above URL to the major/minor version (e.g., ``release27-maint`` for Python -2.7). +2.7). You may also update a working copy to other branches of CPython by using +``hg update``. -Do note that CPython will notice that it is being run from a source checkout. -This means that it if you edit Python's source code in your checkout the -changes will be picked up by the interpreter for immediate testing. +Do note that CPython will notice that it is being run from a working copy. +This means that it if you edit CPython's source code in your working copy the +changes will be picked up by the interpreter for immediate use and testing. Compiling (for debugging) @@ -130,9 +127,9 @@ that can be run in-place; ``./python`` on most machines (and what is used in all examples), ``./python.exe`` on OS X (when on a case-insensitive filesystem, which is the default). There is absolutely no need to install your built copy -of Python! The interpreter will realize it is being run directly out of a -checkout and thus use the files found in the checkout. If you are worried you -might accidentally install your checkout build, you can add +of Python! The interpreter will realize where it is being run from +and thus use the files found in the working copy. If you are worried +you might accidentally install your working copy build, you can add ``--prefix=/dev/null`` to the configuration step. .. _issue tracker: http://bugs.python.org -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 23:19:23 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 23:19:23 +0100 Subject: [Python-checkins] devguide: Move to using mq for basic usage. Message-ID: brett.cannon pushed 73e11f64a704 to devguide: http://hg.python.org/devguide/rev/73e11f64a704 changeset: 286:73e11f64a704 branch: hg_transition tag: tip user: Brett Cannon date: Wed Feb 09 14:19:16 2011 -0800 summary: Move to using mq for basic usage. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -10,32 +10,55 @@ Tool Usage '''''''''' -.. _named-branch-workflow: +.. _mq-workflow: Mercurial allows for various workflows according to each person's or -project's preference. We present here a very simple solution based on `named -branches `_ for -non-committers. Before you start modifying things in your working copy, type:: +project's preference. We present here a very simple solution based on mq_ +(Mercurial Queue) non-core developers. - hg branch mywork +If you have not done so previously, make sure that the extension has been +turned on in your ``.hgrc`` or ``Mercurial.ini`` file:: + + [extensions] + mq = + +You can verify this is working properly by running ``hg help mq``. + + +Before you start modifying things in your working copy, type:: + + hg qnew mywork where ``mywork`` is a descriptive name for what you are going to work on. 
-Then all your local commits will be recorded on that branch, which is an -effective way of distinguishing them from other (upstream) commits. +This will create a patch in your patch queue. Whenever you have reached a point +that you want to save what you have done, run:: -Make sure to do your branching from the ``default`` branch (type ``hg branch`` -to see which branch is active). To switch between branches, e.g., switch to the -``default`` branch, do:: + hg qrefresh - hg update default +This will update the patch to contain all of the changes you have made up to +this point. If you have any you have added or removed, use ``hg add`` or ``hg +remove``, respectively, before running ``hg qrefresh``. -When you are done with a branch, you can mark it as closed by doing the -following while the branch you wish to close is active:: +When you are done with your work, you can create a patch to upload to the +`issue tracker`_ with:: - hg commit --close-branch + hg qdiff > patch.diff -This deletes nothing, but it stops the branch from being listed as active in -your checkout. +When you are done with your changes, you can delete them with:: + + hg qdelete mywork + +For more advanced usage of mq, read the `mq chapter +`_ +of `Mercurial: The Definitive Guide `_. + +You can obviously use other workflows if you choose, just please make sure that +the generated patch is *flat* instead of containing a diff for each changeset +(e.g., if you have two or more changesets to a single file it should show up as +a single change diff against the file instead of two separate ones). + +.. _issue tracker: http://bugs.python.org +.. _mq: http://mercurial.selenic.com/wiki/MqExtension Preparation @@ -93,34 +116,30 @@ make patchcheck This will check and/or fix various common things people forget to do for -patches, such as adding any new files needing for the patch to work. +patches, such as adding any new files needing for the patch to work (do not +that not all checks apply to non-core developers). -The following instructions assume you are using the :ref:`named branch approach -` suggested earlier. To create your patch, first check +The following instructions assume you are using the :ref:`mq approach +` suggested earlier. To create your patch, first check that all your local changes have been committed, then type the following:: - hg diff -r default > mywork.patch + hg qdiff > mywork.patch To apply a patch generated this way, do:: - patch -p1 < mywork.patch + hg qimport mywork.patch -To undo a patch applied in your working copy, simply can revert **all** changes:: +This will create a patch in your queue with a name that matches the filename. +You can use the ``-n`` argument to specify a different name. - hg revert --all +To undo a patch imported into your working copy, simply delete the patch from +your patch queue:: -This will leave backups of the files with your changes still intact. To skip -that step, you can use the ``--no-backup`` flag. + hg qdelete mywork.patch Please refer to the :ref:`FAQ ` for :ref:`more information ` on how to manage your local changes. -.. note:: The ``patch`` program is not available by default under Windows. - You can find it `here `_, - courtesy of the `GnuWin32 `_ project. - Also, you may find it necessary to add the "``--binary``" option when trying - to apply Unix-generated patches under Windows. 
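The mq round-trip described in this changeset can also be driven from a small Python helper, which may make the sequence of commands easier to see at a glance. This is only a sketch under a few assumptions: the mq extension is already enabled, the working copy is the current directory, and the patch name ``mywork`` and output file ``mywork.patch`` are placeholders::

    import subprocess

    def hg(*args):
        # Run an hg command in the current working copy; raise if it fails.
        subprocess.check_call(('hg',) + args)

    hg('qnew', 'mywork')        # start a new patch in the queue
    # ... edit files in the working copy here ...
    hg('qrefresh')              # fold the edits into the patch

    # Export a flat diff suitable for the issue tracker.
    with open('mywork.patch', 'wb') as f:
        f.write(subprocess.check_output(['hg', 'qdiff']))

    hg('qpop')                  # un-apply the patch when finished
    hg('qdelete', 'mywork')     # and drop it from the queue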
- Submitting ---------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 23:43:46 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 23:43:46 +0100 Subject: [Python-checkins] devguide: Use the term 'core developer' instead of 'committer'. Message-ID: brett.cannon pushed dc3cf1e7a331 to devguide: http://hg.python.org/devguide/rev/dc3cf1e7a331 changeset: 287:dc3cf1e7a331 branch: hg_transition user: Brett Cannon date: Wed Feb 09 14:31:43 2011 -0800 summary: Use the term 'core developer' instead of 'committer'. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -31,7 +31,7 @@ Working with Mercurial_ ----------------------- -As a committer, the ability to push changes to the official Python +As a core developer, the ability to push changes to the official Python repositories means you have to be more careful with your workflow: * You should not push new named branches to the main repository. You can -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 23:43:47 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 23:43:47 +0100 Subject: [Python-checkins] devguide: Mention that using mq is totally optional and a more svn-like approach is Message-ID: brett.cannon pushed 41be4c428ce6 to devguide: http://hg.python.org/devguide/rev/41be4c428ce6 changeset: 288:41be4c428ce6 branch: hg_transition tag: tip user: Brett Cannon date: Wed Feb 09 14:43:41 2011 -0800 summary: Mention that using mq is totally optional and a more svn-like approach is feasible. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -14,7 +14,11 @@ Mercurial allows for various workflows according to each person's or project's preference. We present here a very simple solution based on mq_ -(Mercurial Queue) non-core developers. +(Mercurial Queues) non-core developers. You are welcome to use any approach you +like (including a svn-like approach of simply never saving any changes you make +to your working copy and using ``hg diff`` to create a patch). Usage of mq is +merely a suggestion; it's a balance between being able to do everything needed +while allowing for more powerful usage if desired in the future. If you have not done so previously, make sure that the extension has been turned on in your ``.hgrc`` or ``Mercurial.ini`` file:: -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 23:46:56 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 23:46:56 +0100 Subject: [Python-checkins] devguide: Note that porting instructions might need to be updated if non-core developers Message-ID: brett.cannon pushed 9853f57300f4 to devguide: http://hg.python.org/devguide/rev/9853f57300f4 changeset: 289:9853f57300f4 branch: hg_transition tag: tip user: Brett Cannon date: Wed Feb 09 14:46:50 2011 -0800 summary: Note that porting instructions might need to be updated if non-core developers don't go the 'patch' route. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -89,6 +89,11 @@ Porting Within a Major Version '''''''''''''''''''''''''''''' + +.. note:: + XXX Update to using hg qimport if that ends up being the way non-core + developers are told to go. 
+ Assume that Python 3.2 is the current in-development version of Python and that you have a patch that should also be applied to Python 3.1. To properly port the patch to both versions of Python, you should first apply the patch to -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 23:49:58 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 23:49:58 +0100 Subject: [Python-checkins] devguide: clarification Message-ID: brett.cannon pushed e890ff3d124a to devguide: http://hg.python.org/devguide/rev/e890ff3d124a changeset: 290:e890ff3d124a branch: hg_transition tag: tip user: Brett Cannon date: Wed Feb 09 14:49:52 2011 -0800 summary: clarification files: setup.rst diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -28,7 +28,7 @@ One should always work from a working copy of the CPython source code. While it may -be tempting to work from the downloaded copy you already have installed on your +be tempting to work from the copy of Python you already have installed on your machine, it is very likely that you will be working from out-of-date code as the Python core developers are constantly updating and fixing things in their :abbr:`VCS`. It also means you will have better tool -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 9 23:55:14 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 9 Feb 2011 23:55:14 +0100 (CET) Subject: [Python-checkins] r88383 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110209225514.0BB01EEA29@mail.python.org> Author: brett.cannon Date: Wed Feb 9 23:55:13 2011 New Revision: 88383 Log: Tweak wording about equality comparison. Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Wed Feb 9 23:55:13 2011 @@ -380,7 +380,7 @@ >>> b"" == "" False -This is because comparison for equality is required by the language to always +This is because an equality comparison is required by the language to always succeed (and return ``False`` for incompatible types). However, this also means that code incorrectly ported to Python 3 can display buggy behaviour if such comparisons are silently executed. To detect such situations, From python-checkins at python.org Wed Feb 9 23:58:22 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 09 Feb 2011 23:58:22 +0100 Subject: [Python-checkins] devguide: Fix a silly statement. Message-ID: brett.cannon pushed 7101df1bd817 to devguide: http://hg.python.org/devguide/rev/7101df1bd817 changeset: 291:7101df1bd817 branch: hg_transition tag: tip user: Brett Cannon date: Wed Feb 09 14:58:17 2011 -0800 summary: Fix a silly statement. files: setup.rst diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -34,8 +34,7 @@ :abbr:`VCS`. It also means you will have better tool support through the VCS as it will provide a diff tool, etc. -To get a read-only checkout of CPython's source, you need a working copy the -source code. 
To get a read-only checkout of +To get a read-only checkout of the :ref:`in-development ` branch of Python, run:: hg clone http://hg.python.org/cpython -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Thu Feb 10 00:02:26 2011 From: python-checkins at python.org (brett.cannon) Date: Thu, 10 Feb 2011 00:02:26 +0100 Subject: [Python-checkins] devguide: Mention using 'patch' for svn patches. Message-ID: brett.cannon pushed 46e28043b931 to devguide: http://hg.python.org/devguide/rev/46e28043b931 changeset: 292:46e28043b931 branch: hg_transition tag: tip user: Brett Cannon date: Wed Feb 09 15:02:20 2011 -0800 summary: Mention using 'patch' for svn patches. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -136,14 +136,27 @@ This will create a patch in your queue with a name that matches the filename. You can use the ``-n`` argument to specify a different name. +If a patch was not created by hg (i.e., a patch created by svn and thus lacking +any ``a``/``b`` directory prefixes in the patch), use:: + + patch -p0 < mywork.patch + To undo a patch imported into your working copy, simply delete the patch from -your patch queue:: +your patch queue. You do need to make sure it is not applied (``hg qtop`` will +tell you that while ``hg qpop`` will un-apply the top-most patch):: - hg qdelete mywork.patch + hg qdelete mywork.patch Please refer to the :ref:`FAQ ` for :ref:`more information ` on how to manage your local changes. +.. note:: The ``patch`` program is not available by default under Windows. + You can find it `here `_, + courtesy of the `GnuWin32 `_ project. + Also, you may find it necessary to add the "``--binary``" option when trying + to apply Unix-generated patches under Windows. + + Submitting ---------- -- Repository URL: http://hg.python.org/devguide From solipsis at pitrou.net Thu Feb 10 05:04:46 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 10 Feb 2011 05:04:46 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88383): sum=0 Message-ID: py3k results for svn r88383 (hg cset 5705ee2284c2) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogEABvhH', '-x'] From tjreedy at udel.edu Thu Feb 10 08:04:52 2011 From: tjreedy at udel.edu (Terry Reedy) Date: Thu, 10 Feb 2011 02:04:52 -0500 Subject: [Python-checkins] devguide: More clarifications: use the term 'working copy' and mention 'hg update'. In-Reply-To: References: Message-ID: <4D538E14.9090906@udel.edu> > +To get a read-only checkout of CPython's source, you need a working copy the > +source code. To get a read-only checkout of copy *of* the source code From python-checkins at python.org Thu Feb 10 09:09:36 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 10 Feb 2011 09:09:36 +0100 (CET) Subject: [Python-checkins] r88384 - in python/branches/py3k/Doc/library: __future__.rst _dummy_thread.rst xml.dom.minidom.rst xml.dom.pulldom.rst xml.etree.elementtree.rst xmlrpc.client.rst xmlrpc.server.rst Message-ID: <20110210080936.6047DEEB29@mail.python.org> Author: raymond.hettinger Date: Thu Feb 10 09:09:36 2011 New Revision: 88384 Log: Add missing source links. 
Modified: python/branches/py3k/Doc/library/__future__.rst python/branches/py3k/Doc/library/_dummy_thread.rst python/branches/py3k/Doc/library/xml.dom.minidom.rst python/branches/py3k/Doc/library/xml.dom.pulldom.rst python/branches/py3k/Doc/library/xml.etree.elementtree.rst python/branches/py3k/Doc/library/xmlrpc.client.rst python/branches/py3k/Doc/library/xmlrpc.server.rst Modified: python/branches/py3k/Doc/library/__future__.rst ============================================================================== --- python/branches/py3k/Doc/library/__future__.rst (original) +++ python/branches/py3k/Doc/library/__future__.rst Thu Feb 10 09:09:36 2011 @@ -4,6 +4,9 @@ .. module:: __future__ :synopsis: Future statement definitions +**Source code:** :source:`Lib/__future__.py` + +-------------- :mod:`__future__` is a real module, and serves three purposes: Modified: python/branches/py3k/Doc/library/_dummy_thread.rst ============================================================================== --- python/branches/py3k/Doc/library/_dummy_thread.rst (original) +++ python/branches/py3k/Doc/library/_dummy_thread.rst Thu Feb 10 09:09:36 2011 @@ -4,6 +4,9 @@ .. module:: _dummy_thread :synopsis: Drop-in replacement for the _thread module. +**Source code:** :source:`Lib/_dummy_thread.py` + +-------------- This module provides a duplicate interface to the :mod:`_thread` module. It is meant to be imported when the :mod:`_thread` module is not provided on a Modified: python/branches/py3k/Doc/library/xml.dom.minidom.rst ============================================================================== --- python/branches/py3k/Doc/library/xml.dom.minidom.rst (original) +++ python/branches/py3k/Doc/library/xml.dom.minidom.rst Thu Feb 10 09:09:36 2011 @@ -7,6 +7,9 @@ .. sectionauthor:: Paul Prescod .. sectionauthor:: Martin v. L?wis +**Source code:** :source:`Lib/xml/dom/minidom.py` + +-------------- :mod:`xml.dom.minidom` is a light-weight implementation of the Document Object Model interface. It is intended to be simpler than the full DOM and also Modified: python/branches/py3k/Doc/library/xml.dom.pulldom.rst ============================================================================== --- python/branches/py3k/Doc/library/xml.dom.pulldom.rst (original) +++ python/branches/py3k/Doc/library/xml.dom.pulldom.rst Thu Feb 10 09:09:36 2011 @@ -5,6 +5,9 @@ :synopsis: Support for building partial DOM trees from SAX events. .. moduleauthor:: Paul Prescod +**Source code:** :source:`Lib/xml/dom/pulldom.py` + +-------------- :mod:`xml.dom.pulldom` allows building only selected portions of a Document Object Model representation of a document from SAX events. Modified: python/branches/py3k/Doc/library/xml.etree.elementtree.rst ============================================================================== --- python/branches/py3k/Doc/library/xml.etree.elementtree.rst (original) +++ python/branches/py3k/Doc/library/xml.etree.elementtree.rst Thu Feb 10 09:09:36 2011 @@ -5,6 +5,9 @@ :synopsis: Implementation of the ElementTree API. .. moduleauthor:: Fredrik Lundh +**Source code:** :source:`Lib/xml/etree/ElementTree.py` + +-------------- The :class:`Element` type is a flexible container object, designed to store hierarchical data structures in memory. 
The type can be described as a cross Modified: python/branches/py3k/Doc/library/xmlrpc.client.rst ============================================================================== --- python/branches/py3k/Doc/library/xmlrpc.client.rst (original) +++ python/branches/py3k/Doc/library/xmlrpc.client.rst Thu Feb 10 09:09:36 2011 @@ -10,6 +10,10 @@ .. XXX Not everything is documented yet. It might be good to describe Marshaller, Unmarshaller, getparser, dumps, loads, and Transport. +**Source code:** :source:`Lib/xmlrpc/client.py` + +-------------- + XML-RPC is a Remote Procedure Call method that uses XML passed via HTTP as a transport. With it, a client can call methods with parameters on a remote server (the server is named by a URI) and get back structured data. This module Modified: python/branches/py3k/Doc/library/xmlrpc.server.rst ============================================================================== --- python/branches/py3k/Doc/library/xmlrpc.server.rst (original) +++ python/branches/py3k/Doc/library/xmlrpc.server.rst Thu Feb 10 09:09:36 2011 @@ -6,6 +6,9 @@ .. moduleauthor:: Brian Quinlan .. sectionauthor:: Fred L. Drake, Jr. +**Source code:** :source:`Lib/xmlrpc/server.py` + +-------------- The :mod:`xmlrpc.server` module provides a basic server framework for XML-RPC servers written in Python. Servers can either be free standing, using From python-checkins at python.org Thu Feb 10 10:20:26 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 10 Feb 2011 10:20:26 +0100 (CET) Subject: [Python-checkins] r88385 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110210092026.60330EEAD7@mail.python.org> Author: raymond.hettinger Date: Thu Feb 10 10:20:26 2011 New Revision: 88385 Log: Add an entry for logging. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Thu Feb 10 10:20:26 2011 @@ -1140,6 +1140,44 @@ (Contributed by Raymond Hettinger in :issue:`9826` and :issue:`9840`.) +logging +------- + +In addition to dictionary based configuration described above, the +:mod:`logging` package has many other improvements. + +The logging documentation has been augmented by a :ref:`basic tutorial +`\, an :ref:`advanced tutorial +`\, and a :ref:`cookbook ` of +logging recipes. These documents are the fastest way to learn about logging. + +The :func:`logging.basicConfig` set-up function gained a *style* argument to +support three different types of string formatting. It defaults to "%" for +traditional %-formatting, can be set to "{" for the new :meth:`str.format` style, or +can be set to "$" for the shell-style formatting provided by +:class:`string.Template`. The following three configurations are equivalent:: + + >>> from logging import basicConfig + >>> basicConfig(style='%', format="%(name)s -> %(levelname)s: %(message)s") + >>> basicConfig(style='{', format="{name} -> {levelname} {message}") + >>> basicConfig(style='$', format="$name -> $levelname: $message") + +If no configuration is set-up before a logging event occurs, there is now a +default configuration using a :class:`~logging.StreamHandler` directed to +:attr:`sys.stderr` for events of ``WARNING`` level or higher. 
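A short runnable sketch may help tie the *style* argument and the new default handler together. The format string, logger name and messages below are arbitrary choices, and note that *style* only affects how the final record is rendered; the arguments of the individual logging calls still use traditional %-style substitution::

    import logging

    # '{'-style output template, as in the second basicConfig example above.
    logging.basicConfig(style='{',
                        format='{name} -> {levelname}: {message}',
                        level=logging.INFO)
    log = logging.getLogger('example')

    # The call itself still uses %-style argument substitution.
    log.info('processed %d records', 42)

    # Filters may now be plain callables returning True or False.
    log.addFilter(lambda record: not record.getMessage().startswith('noise'))
    log.info('noise: dropped by the filter above')
    log.info('still emitted')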
Formerly, an +event occurring before a configuration was set-up would either raise an +exception or silently drop the event depending on the value of +:attr:`logging.raiseExceptions`. The new default handler is stored in +:attr:`logging.lastResort`. + +The use of filters has been simplified. Instead of creating a +:class:`~logging.Filter` object, the predicate can be any Python callable that +returns *True* or *False*. + +There were a number of other improvements that add flexibility and simplify +configuration. See the module documentation for a full listing of changes in +Python 3.2. + csv --- @@ -2127,8 +2165,7 @@ (All changes contributed by ?ukasz Langa.) -.. XXX show a difflib example -.. XXX add entry for logging changes other than the dict config pep +.. XXX consider showing a difflib example urllib.parse ------------ From python-checkins at python.org Thu Feb 10 10:43:04 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 10 Feb 2011 10:43:04 +0100 (CET) Subject: [Python-checkins] r88386 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110210094304.F0FC1EEB06@mail.python.org> Author: raymond.hettinger Date: Thu Feb 10 10:43:04 2011 New Revision: 88386 Log: Fix nits. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Thu Feb 10 10:43:04 2011 @@ -1033,7 +1033,7 @@ [True, True, False, False] The :func:`~math.expm1` function computes ``e**x-1`` for small values of *x* -without incuring the loss of precision that usually accompanies the subtraction +without incurring the loss of precision that usually accompanies the subtraction of nearly equal quantities: >>> expm1(0.013671875) # more accurate way to compute e**x-1 for a small x @@ -1143,7 +1143,7 @@ logging ------- -In addition to dictionary based configuration described above, the +In addition to dictionary-based configuration described above, the :mod:`logging` package has many other improvements. The logging documentation has been augmented by a :ref:`basic tutorial @@ -1525,7 +1525,7 @@ The principal functions are :func:`~shutil.make_archive` and :func:`~shutil.unpack_archive`. By default, both operate on the current directory (which can be set by :func:`os.chdir`) and on any sub-directories. -The archive filename needs to specified with a full pathname. The archiving +The archive filename needs to be specified with a full pathname. The archiving step is non-destructive (the original files are left unchanged). :: @@ -1696,9 +1696,9 @@ :meth:`~http.client.HTTPConnection.set_tunnel` method that sets the host and port for HTTP Connect tunneling. -To match the behaviour of :mod:`http.server`, the HTTP client library now also +To match the behavior of :mod:`http.server`, the HTTP client library now also encodes headers with ISO-8859-1 (Latin-1) encoding. It was already doing that -for incoming headers, so now the behaviour is consistent for both incoming and +for incoming headers, so now the behavior is consistent for both incoming and outgoing traffic. (See work by Armin Ronacher in :issue:`10980`.) unittest @@ -1752,8 +1752,8 @@ diagnostics when a test fails. When possible, the failure is recorded along with a diff of the output. This is especially helpful for analyzing log files of failed test runs. 
However, since diffs can sometime be voluminous, there is - a new :attr:`~unittest.TestCase.maxDiff` attribute which sets maximum length of - diffs. + a new :attr:`~unittest.TestCase.maxDiff` attribute that sets maximum length of + diffs displayed. * In addition, the method names in the module have undergone a number of clean-ups. @@ -2002,7 +2002,7 @@ --------- The new :mod:`sysconfig` module makes it straightforward to discover -installation paths and configuration variables which vary across platforms and +installation paths and configuration variables that vary across platforms and installations. The module offers access simple access functions for platform and version From ncoghlan at gmail.com Thu Feb 10 14:00:08 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 10 Feb 2011 23:00:08 +1000 Subject: [Python-checkins] r88385 - python/branches/py3k/Doc/whatsnew/3.2.rst In-Reply-To: <20110210092026.60330EEAD7@mail.python.org> References: <20110210092026.60330EEAD7@mail.python.org> Message-ID: On Thu, Feb 10, 2011 at 7:20 PM, raymond.hettinger wrote: > +The :func:`logging.basicConfig` set-up function gained a *style* argument to > +support three different types of string formatting. ?It defaults to "%" for > +traditional %-formatting, can be set to "{" for the new :meth:`str.format` style, or > +can be set to "$" for the shell-style formatting provided by > +:class:`string.Template`. ?The following three configurations are equivalent:: > + > + ? ?>>> from logging import basicConfig > + ? ?>>> basicConfig(style='%', format="%(name)s -> %(levelname)s: %(message)s") > + ? ?>>> basicConfig(style='{', format="{name} -> {levelname} {message}") > + ? ?>>> basicConfig(style='$', format="$name -> $levelname: $message") It may be worth noting here that: 1. the "style" parameter also exists for logging.Formatter objects 2. it only applies to the output formatting for output, individual logging calls are still constrained to using %-formatting by backwards compatibility issues (Especially point 2 - I briefly forgot that distinction myself, and I helped review this feature when Vinay added it) Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? Brisbane, Australia From python-checkins at python.org Thu Feb 10 19:42:36 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Thu, 10 Feb 2011 19:42:36 +0100 (CET) Subject: [Python-checkins] r88387 - python/branches/py3k/Lib/asyncore.py Message-ID: <20110210184236.F1834EEB4E@mail.python.org> Author: giampaolo.rodola Date: Thu Feb 10 19:42:36 2011 New Revision: 88387 Log: get rid of asyncore.dispatcher's debug attribute, which is no longer used (assuming it ever was). Modified: python/branches/py3k/Lib/asyncore.py Modified: python/branches/py3k/Lib/asyncore.py ============================================================================== --- python/branches/py3k/Lib/asyncore.py (original) +++ python/branches/py3k/Lib/asyncore.py Thu Feb 10 19:42:36 2011 @@ -218,7 +218,6 @@ class dispatcher: - debug = False connected = False accepting = False closing = False @@ -544,8 +543,6 @@ return (not self.connected) or len(self.out_buffer) def send(self, data): - if self.debug: - self.log_info('sending %s' % repr(data)) self.out_buffer = self.out_buffer + data self.initiate_send() From python-checkins at python.org Thu Feb 10 20:43:44 2011 From: python-checkins at python.org (brett.cannon) Date: Thu, 10 Feb 2011 20:43:44 +0100 Subject: [Python-checkins] devguide: Update the setup instuctions to be more hg-y and less svn-y. 
Message-ID: brett.cannon pushed 49c866425446 to devguide: http://hg.python.org/devguide/rev/49c866425446 changeset: 293:49c866425446 branch: hg_transition tag: tip user: Brett Cannon date: Thu Feb 10 11:43:38 2011 -0800 summary: Update the setup instuctions to be more hg-y and less svn-y. files: setup.rst diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -23,8 +23,8 @@ .. _checkout: -Checking out the code ----------------------- +Getting the Source Code +----------------------- One should always work from a working copy of the CPython source code. While it may @@ -34,21 +34,21 @@ :abbr:`VCS`. It also means you will have better tool support through the VCS as it will provide a diff tool, etc. -To get a read-only checkout of -the :ref:`in-development ` branch of Python, run:: +To get a working copy of the :ref:`in-development ` branch of +CPython (core developers use a different URL as outlined in :ref:`coredev`), +run:: hg clone http://hg.python.org/cpython -If you want a read-only checkout of an already-released version of Python, -i.e., a version in :ref:`maintenance mode `, run something -like the following which gets you a working copy of Python 3.1:: +If you want a working copy of an already-released version of Python, +i.e., a version in :ref:`maintenance mode `, you can update your +working copy. For instance, to update your working copy to Python 3.1, do:: - hg clone http://hg.python.org/cpython#release-31maint + hg update release-31maint -To check out a version of Python other than 3.1, simply change the number in -the above URL to the major/minor version (e.g., ``release27-maint`` for Python -2.7). You may also update a working copy to other branches of CPython by using -``hg update``. +To get a version of Python other than 3.1, simply change the number in +the above example to the major/minor version (e.g., ``release27-maint`` for +Python 2.7). You will need to re-compile CPython when you do an update. Do note that CPython will notice that it is being run from a working copy. This means that it if you edit CPython's source code in your working copy the -- Repository URL: http://hg.python.org/devguide From ncoghlan at gmail.com Fri Feb 11 00:25:19 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 11 Feb 2011 09:25:19 +1000 Subject: [Python-checkins] r88387 - python/branches/py3k/Lib/asyncore.py In-Reply-To: <20110210184236.F1834EEB4E@mail.python.org> References: <20110210184236.F1834EEB4E@mail.python.org> Message-ID: On Fri, Feb 11, 2011 at 4:42 AM, giampaolo.rodola wrote: > Author: giampaolo.rodola > Date: Thu Feb 10 19:42:36 2011 > New Revision: 88387 > > Log: > get rid of asyncore.dispatcher's debug attribute, which is no longer used (assuming it ever was). Reviewer? NEWS entry? RM approval? Tracker issue? Removing a public attribute seems like an odd change to be making at this stage of a release. What if there is third party code that uses that attribute to enable additional debugging info in asyncore? Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? Brisbane, Australia From python-checkins at python.org Fri Feb 11 01:03:03 2011 From: python-checkins at python.org (raymond.hettinger) Date: Fri, 11 Feb 2011 01:03:03 +0100 (CET) Subject: [Python-checkins] r88388 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110211000303.2CB02EEB99@mail.python.org> Author: raymond.hettinger Date: Fri Feb 11 01:03:03 2011 New Revision: 88388 Log: Insert missing section heading (noticed by Victor Stinner). 
Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Fri Feb 11 01:03:03 2011 @@ -1810,21 +1810,24 @@ poplib ------ -* :class:`~poplib.POP3_SSL` class now accepts a *context* parameter, which is a - :class:`ssl.SSLContext` object allowing bundling SSL configuration options, - certificates and private keys into a single (potentially long-lived) - structure. - - (Contributed by Giampaolo Rodol?; :issue:`8807`.) - -* :class:`asyncore.dispatcher` now provides a - :meth:`~asyncore.dispatcher.handle_accepted()` method - returning a `(sock, addr)` pair which is called when a connection has actually - been established with a new remote endpoint. This is supposed to be used as a - replacement for old :meth:`~asyncore.dispatcher.handle_accept()` and avoids - the user to call :meth:`~asyncore.dispatcher.accept()` directly. +:class:`~poplib.POP3_SSL` class now accepts a *context* parameter, which is a +:class:`ssl.SSLContext` object allowing bundling SSL configuration options, +certificates and private keys into a single (potentially long-lived) +structure. - (Contributed by Giampaolo Rodol?; :issue:`6706`.) +(Contributed by Giampaolo Rodol?; :issue:`8807`.) + +asyncore +-------- + +:class:`asyncore.dispatcher` now provides a +:meth:`~asyncore.dispatcher.handle_accepted()` method +returning a `(sock, addr)` pair which is called when a connection has actually +been established with a new remote endpoint. This is supposed to be used as a +replacement for old :meth:`~asyncore.dispatcher.handle_accept()` and avoids +the user to call :meth:`~asyncore.dispatcher.accept()` directly. + +Contributed by Giampaolo Rodol?; :issue:`6706`.) tempfile -------- From python-checkins at python.org Fri Feb 11 01:08:38 2011 From: python-checkins at python.org (raymond.hettinger) Date: Fri, 11 Feb 2011 01:08:38 +0100 (CET) Subject: [Python-checkins] r88389 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110211000838.4E015EEBC7@mail.python.org> Author: raymond.hettinger Date: Fri Feb 11 01:08:38 2011 New Revision: 88389 Log: Missing paren. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Fri Feb 11 01:08:38 2011 @@ -1827,7 +1827,7 @@ replacement for old :meth:`~asyncore.dispatcher.handle_accept()` and avoids the user to call :meth:`~asyncore.dispatcher.accept()` directly. -Contributed by Giampaolo Rodol?; :issue:`6706`.) +(Contributed by Giampaolo Rodol?; :issue:`6706`.) tempfile -------- From python-checkins at python.org Fri Feb 11 03:05:13 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 03:05:13 +0100 (CET) Subject: [Python-checkins] r88390 - python/branches/release31-maint Message-ID: <20110211020513.53C11EEB23@mail.python.org> Author: r.david.murray Date: Fri Feb 11 03:05:13 2011 New Revision: 88390 Log: Blocked revisions 73911,78780,80757,86577,87228,88197,88199,88203,88252 via svnmerge ................ r73911 | r.david.murray | 2009-07-09 15:51:32 -0400 (Thu, 09 Jul 2009) | 9 lines Unblocked revisions 73907 via svnmerge ........ 
r73907 | r.david.murray | 2009-07-09 12:17:30 -0400 (Thu, 09 Jul 2009) | 4 lines Temporarily ignore rmtree errors in test_getcwd_long_pathnames to see if the test gives useful failure info on Solaris buildbot. ........ ................ r78780 | r.david.murray | 2010-03-07 21:17:03 -0500 (Sun, 07 Mar 2010) | 20 lines bdecode was already gone in email 5. This merge adds the test from the trunk patch, and removes the last trace of bdecode, which was a commented out call in message.py. Merged revisions 78778 via svnmerge from svn+ssh://pythondev at svn.python.org/python/trunk ........ r78778 | r.david.murray | 2010-03-07 21:04:06 -0500 (Sun, 07 Mar 2010) | 9 lines Issue #7143: get_payload used to strip any trailing newline from a base64 transfer-encoded payload *after* decoding it; it no longer does. email had a special method in utils, _bdecode, specifically to do this, so it must have served a purpose at some point, yet it is clearly wrong per RFC. Fixed with Barry's approval, but no backport. Email package minor version number is bumped, now version 4.0.1. Patch by Joaquin Cuenca Abela. ........ ................ r80757 | r.david.murray | 2010-05-04 12:17:50 -0400 (Tue, 04 May 2010) | 12 lines Recorded merge of revisions 80458 via svnmerge from svn+ssh://pythondev at svn.python.org/python/trunk Sean merged this in r84059. ........ r80458 | sean.reifschneider | 2010-04-25 02:31:23 -0400 (Sun, 25 Apr 2010) | 3 lines Fixing obscure syslog corner-case when sys.argv = None, syslog() would call openlog() for every logged message. ........ ................ r86577 | r.david.murray | 2010-11-20 11:33:30 -0500 (Sat, 20 Nov 2010) | 4 lines #1574217: only swallow AttributeErrors in isinstance, not everything. Patch and tests by Brian Harring, with improvements by Ralf Schmitt. ................ r87228 | r.david.murray | 2010-12-13 21:25:43 -0500 (Mon, 13 Dec 2010) | 2 lines Turn on regrtest -W (rerun immediately) option for Windows, too. ................ r88197 | victor.stinner | 2011-01-25 19:39:19 -0500 (Tue, 25 Jan 2011) | 1 line Fix BytesGenerator._handle_text() if the message has no payload (None) ................ r88199 | r.david.murray | 2011-01-25 21:31:37 -0500 (Tue, 25 Jan 2011) | 2 lines Revert r88197. I'll refix correctly once there is a test. ................ r88203 | r.david.murray | 2011-01-26 16:21:32 -0500 (Wed, 26 Jan 2011) | 4 lines #11019: Make BytesGenerator handle Message with None body. Bug discovery and initial patch by Victor Stinner. ................ r88252 | r.david.murray | 2011-01-30 01:21:28 -0500 (Sun, 30 Jan 2011) | 16 lines #9124: mailbox now accepts binary input and uses binary internally Although this patch contains API changes and is rather weighty for an RC phase, the mailbox module was essentially unusable without the patch since it would produce UnicodeErrors when handling non-ascii input at arbitrary and somewhat mysterious places, and any non-trivial amount of email processing will encounter messages with non-ascii bytes. The release manager approved the patch application. The changes allow binary input, and reject non-ASCII string input early with a useful message instead of failing mysteriously later. Binary is used internally for reading and writing the mailbox files. StringIO and Text file input are deprecated. Initial patch by Victor Stinner, validated and expanded by R. David Murray. ................ 
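The r88252 entry above is easier to picture with a tiny sketch of the binary-input behaviour it describes. This assumes an interpreter that actually carries the change (the development branch, not the 3.1 branch it is being blocked from), and the mailbox file name is a throwaway example::

    import mailbox

    box = mailbox.mbox('example.mbox')      # created on demand for this sketch
    try:
        # Binary input: the non-ASCII byte in the body is stored verbatim
        # instead of triggering a UnicodeError somewhere down the line.
        box.add(b'From: author@example.com\n'
                b'Subject: latin-1 payload\n\n'
                b'caf\xe9\n')
        box.flush()
    finally:
        box.close()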
Modified: python/branches/release31-maint/ (props changed) From python-checkins at python.org Fri Feb 11 03:27:10 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 03:27:10 +0100 (CET) Subject: [Python-checkins] r88391 - python/branches/release27-maint Message-ID: <20110211022710.AA62EEEB48@mail.python.org> Author: r.david.murray Date: Fri Feb 11 03:27:10 2011 New Revision: 88391 Log: Blocked revisions 86577,88203,88252 via svnmerge ........ r86577 | r.david.murray | 2010-11-20 11:33:30 -0500 (Sat, 20 Nov 2010) | 4 lines #1574217: only swallow AttributeErrors in isinstance, not everything. Patch and tests by Brian Harring, with improvements by Ralf Schmitt. ........ r88203 | r.david.murray | 2011-01-26 16:21:32 -0500 (Wed, 26 Jan 2011) | 4 lines #11019: Make BytesGenerator handle Message with None body. Bug discovery and initial patch by Victor Stinner. ........ r88252 | r.david.murray | 2011-01-30 01:21:28 -0500 (Sun, 30 Jan 2011) | 16 lines #9124: mailbox now accepts binary input and uses binary internally Although this patch contains API changes and is rather weighty for an RC phase, the mailbox module was essentially unusable without the patch since it would produce UnicodeErrors when handling non-ascii input at arbitrary and somewhat mysterious places, and any non-trivial amount of email processing will encounter messages with non-ascii bytes. The release manager approved the patch application. The changes allow binary input, and reject non-ASCII string input early with a useful message instead of failing mysteriously later. Binary is used internally for reading and writing the mailbox files. StringIO and Text file input are deprecated. Initial patch by Victor Stinner, validated and expanded by R. David Murray. ........ Modified: python/branches/release27-maint/ (props changed) From python-checkins at python.org Fri Feb 11 03:27:46 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 03:27:46 +0100 (CET) Subject: [Python-checkins] r88392 - python/branches/release31-maint Message-ID: <20110211022746.5C256EEB48@mail.python.org> Author: r.david.murray Date: Fri Feb 11 03:27:46 2011 New Revision: 88392 Log: Blocked revisions 80334,87216,87372,87516 via svnmerge ........ r80334 | r.david.murray | 2010-04-21 21:49:37 -0400 (Wed, 21 Apr 2010) | 2 lines Fix verb tense in skip message. ........ r87216 | r.david.murray | 2010-12-13 17:50:30 -0500 (Mon, 13 Dec 2010) | 2 lines #10698: fix typo in example. ........ r87372 | r.david.murray | 2010-12-18 11:39:06 -0500 (Sat, 18 Dec 2010) | 2 lines #10728: the default for printing help is sys.stdout, not stderr. ........ r87516 | r.david.murray | 2010-12-27 15:09:32 -0500 (Mon, 27 Dec 2010) | 5 lines #7056: runtest and runtest_inner don't use testdir, so drop it from their sigs I've only tested regular runs and -j runs. If I've broken anything else I'm sure I'll hear about it sooner or later. ........ 
Modified: python/branches/release31-maint/ (props changed) From python-checkins at python.org Fri Feb 11 04:13:19 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 04:13:19 +0100 (CET) Subject: [Python-checkins] r88393 - in python/branches/release31-maint: Doc/ACKS.txt Doc/library/compileall.rst Doc/library/email.header.rst Doc/library/http.client.rst Doc/library/os.rst Doc/tutorial/interpreter.rst Lib/email/header.py Lib/email/test/test_email.py Lib/test/script_helper.py Misc/NEWS Modules/timemodule.c Message-ID: <20110211031319.A83F5EEB23@mail.python.org> Author: r.david.murray Date: Fri Feb 11 04:13:19 2011 New Revision: 88393 Log: Merged revisions 87136,87221,87256,87337-87338,87571,87839,88164 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r87136 | r.david.murray | 2010-12-08 17:53:00 -0500 (Wed, 08 Dec 2010) | 6 lines Have script_helper._assert_python strip refcount strings from stderr. This makes the output of the function and those that depend on it independent of whether or not they are being run under a debug build. ........ r87221 | r.david.murray | 2010-12-13 19:55:46 -0500 (Mon, 13 Dec 2010) | 4 lines #10699: fix docstring for tzset: it does not take a parameter Thanks to Garrett Cooper for the fix. ........ r87256 | r.david.murray | 2010-12-14 21:19:14 -0500 (Tue, 14 Dec 2010) | 2 lines #10705: document what the values of debuglevel are and mean. ........ r87337 | r.david.murray | 2010-12-17 11:11:40 -0500 (Fri, 17 Dec 2010) | 2 lines #10559: provide instructions for accessing sys.argv when first mentioned. ........ r87338 | r.david.murray | 2010-12-17 11:29:07 -0500 (Fri, 17 Dec 2010) | 2 lines #10454: clarify the compileall docs and help messages. [compileall.py changes not backported.] ........ r87571 | r.david.murray | 2010-12-29 14:06:48 -0500 (Wed, 29 Dec 2010) | 2 lines Fix same typo in docs. ........ r87839 | r.david.murray | 2011-01-07 16:57:25 -0500 (Fri, 07 Jan 2011) | 9 lines Fix formatting of values with embedded newlines when rfc2047 encoding Before this patch if a value being encoded had an embedded newline, the line following the newline would have no leading whitespace, and the whitespace it did have was encoded into the word. Now the existing whitespace gets turned into a blank, the way it does in other header reformatting, and the _continuation_ws gets added at the beginning of the encoded line. ........ r88164 | r.david.murray | 2011-01-24 14:34:58 -0500 (Mon, 24 Jan 2011) | 12 lines #10960: fix 'stat' links, link to lstat from stat, general tidy of stat doc. Original patch by Michal Nowikowski, with some additions and wording fixes by me. I changed the wording from 'Performs a stat system call' to 'Performs the equivalent of a stat system call', since on Windows there are no stat/lstat system calls involved. I also extended Michal's breakout of the attributes into a list to the other paragraphs, and rearranged the order of the paragraphs in the 'stat' docs to make it flow better and put it in what I think is a more logical/useful order. ........ 
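The r87839 item in the merge log above is clearer with a concrete header in front of you. The sketch below encodes a value that already contains newlines; with the fix, each continuation line of the rfc2047-encoded result keeps its leading folding whitespace. The header name and wording are arbitrary, loosely modelled on the test added alongside the fix::

    from email.header import Header

    value = ('A pretend non-ascii header value\n'
             '\twith an embedded, folded continuation line')
    h = Header(value, charset='utf-8', header_name='Subject')

    # Each encoded-word line after the first now starts with the
    # continuation whitespace rather than at column zero.
    print(h.encode())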
Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/ACKS.txt python/branches/release31-maint/Doc/library/compileall.rst python/branches/release31-maint/Doc/library/email.header.rst python/branches/release31-maint/Doc/library/http.client.rst python/branches/release31-maint/Doc/library/os.rst python/branches/release31-maint/Doc/tutorial/interpreter.rst python/branches/release31-maint/Lib/email/header.py python/branches/release31-maint/Lib/email/test/test_email.py python/branches/release31-maint/Lib/test/script_helper.py python/branches/release31-maint/Misc/NEWS python/branches/release31-maint/Modules/timemodule.c Modified: python/branches/release31-maint/Doc/ACKS.txt ============================================================================== --- python/branches/release31-maint/Doc/ACKS.txt (original) +++ python/branches/release31-maint/Doc/ACKS.txt Fri Feb 11 04:13:19 2011 @@ -139,6 +139,7 @@ * Ross Moore * Sjoerd Mullender * Dale Nagata + * Michal Nowikowski * Ng Pheng Siong * Koray Oner * Tomas Oppelstrup Modified: python/branches/release31-maint/Doc/library/compileall.rst ============================================================================== --- python/branches/release31-maint/Doc/library/compileall.rst (original) +++ python/branches/release31-maint/Doc/library/compileall.rst Fri Feb 11 04:13:19 2011 @@ -6,9 +6,10 @@ This module provides some utility functions to support installing Python -libraries. These functions compile Python source files in a directory tree, -allowing users without permission to write to the libraries to take advantage of -cached byte-code files. +libraries. These functions compile Python source files in a directory tree. +This module can be used to create the cached byte-code files at library +installation time, which makes them available for use even by users who don't +have write permission to the library directories. Command-line use @@ -27,7 +28,8 @@ .. cmdoption:: -l - Do not recurse. + Do not recurse into subdirectories, only compile source code files directly + contained in the named or implied directories. .. cmdoption:: -f @@ -35,15 +37,20 @@ .. cmdoption:: -q - Do not print the list of files compiled. + Do not print the list of files compiled, print only error messages. .. cmdoption:: -d destdir - Purported directory name for error messages. + Directory prepended to the path to each file being compiled. This will + appear in compilation time tracebacks, and is also compiled in to the + byte-code file, where it will be used in tracebacks and other messages in + cases where the source file does not exist at the time the byte-code file is + executed. .. cmdoption:: -x regex - Skip files with a full path that matches given regular expression. + regex is used to search the full path to each file considered for + compilation, and if the regex produces a match, the file is skipped. Public functions @@ -52,24 +59,34 @@ .. function:: compile_dir(dir, maxlevels=10, ddir=None, force=False, rx=None, quiet=False) Recursively descend the directory tree named by *dir*, compiling all :file:`.py` - files along the way. The *maxlevels* parameter is used to limit the depth of - the recursion; it defaults to ``10``. If *ddir* is given, it is used as the - base path from which the filenames used in error messages will be generated. + files along the way. + + The *maxlevels* parameter is used to limit the depth of the recursion; it + defaults to ``10``. 
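A one-call sketch of :func:`compile_dir`, using only the parameters in the signature shown above, may make the prose easier to follow; the directory name ``Lib`` and the exclusion pattern are purely illustrative::

    import compileall
    import re

    # Recompile a tree even if timestamps are current, keep the output quiet,
    # and skip any path whose full name matches the regular expression.
    compileall.compile_dir('Lib', maxlevels=10, force=True, quiet=True,
                           rx=re.compile(r'[/\\]tests?[/\\]'))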
+ + If *ddir* is given, it is prepended to the path to each file being compiled + for use in compilation time tracebacks, and is also compiled in to the + byte-code file, where it will be used in tracebacks and other messages in + cases where the source file does not exist at the time the byte-code file is + executed. + If *force* is true, modules are re-compiled even if the timestamps are up to date. - If *rx* is given, it specifies a regular expression of file names to exclude - from the search; that expression is searched for in the full path. + If *rx* is given, its search method is called on the complete path to each + file considered for compilation, and if it returns a true value, the file + is skipped. - If *quiet* is true, nothing is printed to the standard output in normal - operation. + If *quiet* is true, nothing is printed to the standard output unless errors + occur. .. function:: compile_path(skip_curdir=True, maxlevels=0, force=False) Byte-compile all the :file:`.py` files found along ``sys.path``. If - *skip_curdir* is true (the default), the current directory is not included in - the search. The *maxlevels* and *force* parameters default to ``0`` and are - passed to the :func:`compile_dir` function. + *skip_curdir* is true (the default), the current directory is not included + in the search. All other parameters are passed to the :func:`compile_dir` + function. Note that unlike the other compile functions, ``maxlevels`` + defaults to ``0``. To force a recompile of all the :file:`.py` files in the :file:`Lib/` subdirectory and all its subdirectories:: Modified: python/branches/release31-maint/Doc/library/email.header.rst ============================================================================== --- python/branches/release31-maint/Doc/library/email.header.rst (original) +++ python/branches/release31-maint/Doc/library/email.header.rst Fri Feb 11 04:13:19 2011 @@ -63,7 +63,7 @@ character set is used both as *s*'s initial charset and as the default for subsequent :meth:`append` calls. - The maximum line length can be specified explicit via *maxlinelen*. For + The maximum line length can be specified explicitly via *maxlinelen*. For splitting the first line to a shorter value (to account for the field header which isn't included in *s*, e.g. :mailheader:`Subject`) pass in the name of the field in *header_name*. The default *maxlinelen* is 76, and the default value Modified: python/branches/release31-maint/Doc/library/http.client.rst ============================================================================== --- python/branches/release31-maint/Doc/library/http.client.rst (original) +++ python/branches/release31-maint/Doc/library/http.client.rst Fri Feb 11 04:13:19 2011 @@ -381,8 +381,10 @@ .. method:: HTTPConnection.set_debuglevel(level) - Set the debugging level (the amount of debugging output printed). The default - debug level is ``0``, meaning no debugging output is printed. + Set the debugging level. The default debug level is ``0``, meaning no + debugging output is printed. Any value greater than ``0`` will cause all + currently defined debug output to be printed to stdout. The ``debuglevel`` + is passed to any new :class:`HTTPResponse` objects that are created. .. 
versionadded:: 3.1 Modified: python/branches/release31-maint/Doc/library/os.rst ============================================================================== --- python/branches/release31-maint/Doc/library/os.rst (original) +++ python/branches/release31-maint/Doc/library/os.rst Fri Feb 11 04:13:19 2011 @@ -534,7 +534,7 @@ .. function:: fstat(fd) - Return status for file descriptor *fd*, like :func:`stat`. + Return status for file descriptor *fd*, like :func:`~os.stat`. Availability: Unix, Windows. @@ -952,9 +952,10 @@ .. function:: lstat(path) - Like :func:`stat`, but do not follow symbolic links. This is an alias for - :func:`stat` on platforms that do not support symbolic links, such as - Windows. + Perform the equivalent of an :c:func:`lstat` system call on the given path. + Similar to :func:`~os.stat`, but does not follow symbolic links. On + platforms that do not support symbolic links, this is an alias for + :func:`~os.stat`. .. function:: mkfifo(path[, mode]) @@ -1138,56 +1139,73 @@ .. function:: stat(path) - Perform a :cfunc:`stat` system call on the given path. The return value is an - object whose attributes correspond to the members of the :ctype:`stat` - structure, namely: :attr:`st_mode` (protection bits), :attr:`st_ino` (inode - number), :attr:`st_dev` (device), :attr:`st_nlink` (number of hard links), - :attr:`st_uid` (user id of owner), :attr:`st_gid` (group id of owner), - :attr:`st_size` (size of file, in bytes), :attr:`st_atime` (time of most recent - access), :attr:`st_mtime` (time of most recent content modification), - :attr:`st_ctime` (platform dependent; time of most recent metadata change on - Unix, or the time of creation on Windows):: + Perform the equivalent of a :c:func:`stat` system call on the given path. + (This function follows symlinks; to stat a symlink use :func:`lstat`.) - >>> import os - >>> statinfo = os.stat('somefile.txt') - >>> statinfo - (33188, 422511, 769, 1, 1032, 100, 926, 1105022698,1105022732, 1105022732) - >>> statinfo.st_size - 926 - >>> + The return value is an object whose attributes correspond to the members + of the :c:type:`stat` structure, namely: + * :attr:`st_mode` - protection bits, + * :attr:`st_ino` - inode number, + * :attr:`st_dev` - device, + * :attr:`st_nlink` - number of hard links, + * :attr:`st_uid` - user id of owner, + * :attr:`st_gid` - group id of owner, + * :attr:`st_size` - size of file, in bytes, + * :attr:`st_atime` - time of most recent access, + * :attr:`st_mtime` - time of most recent content modification, + * :attr:`st_ctime` - platform dependent; time of most recent metadata change on + Unix, or the time of creation on Windows) On some Unix systems (such as Linux), the following attributes may also be - available: :attr:`st_blocks` (number of blocks allocated for file), - :attr:`st_blksize` (filesystem blocksize), :attr:`st_rdev` (type of device if an - inode device). :attr:`st_flags` (user defined flags for file). + available: + + * :attr:`st_blocks` - number of blocks allocated for file + * :attr:`st_blksize` - filesystem blocksize + * :attr:`st_rdev` - type of device if an inode device + * :attr:`st_flags` - user defined flags for file On other Unix systems (such as FreeBSD), the following attributes may be - available (but may be only filled out if root tries to use them): :attr:`st_gen` - (file generation number), :attr:`st_birthtime` (time of file creation). 
+ available (but may be only filled out if root tries to use them): + + * :attr:`st_gen` - file generation number + * :attr:`st_birthtime` - time of file creation On Mac OS systems, the following attributes may also be available: - :attr:`st_rsize`, :attr:`st_creator`, :attr:`st_type`. - .. index:: module: stat + * :attr:`st_rsize` + * :attr:`st_creator` + * :attr:`st_type` + + .. note:: + + The exact meaning and resolution of the :attr:`st_atime`, :attr:`st_mtime`, and + :attr:`st_ctime` members depends on the operating system and the file system. + For example, on Windows systems using the FAT or FAT32 file systems, + :attr:`st_mtime` has 2-second resolution, and :attr:`st_atime` has only 1-day + resolution. See your operating system documentation for details. - For backward compatibility, the return value of :func:`stat` is also accessible + For backward compatibility, the return value of :func:`~os.stat` is also accessible as a tuple of at least 10 integers giving the most important (and portable) members of the :ctype:`stat` structure, in the order :attr:`st_mode`, :attr:`st_ino`, :attr:`st_dev`, :attr:`st_nlink`, :attr:`st_uid`, :attr:`st_gid`, :attr:`st_size`, :attr:`st_atime`, :attr:`st_mtime`, :attr:`st_ctime`. More items may be added at the end by some implementations. + + .. index:: module: stat + The standard module :mod:`stat` defines functions and constants that are useful for extracting information from a :ctype:`stat` structure. (On Windows, some items are filled with dummy values.) - .. note:: + Example:: - The exact meaning and resolution of the :attr:`st_atime`, :attr:`st_mtime`, and - :attr:`st_ctime` members depends on the operating system and the file system. - For example, on Windows systems using the FAT or FAT32 file systems, - :attr:`st_mtime` has 2-second resolution, and :attr:`st_atime` has only 1-day - resolution. See your operating system documentation for details. + >>> import os + >>> statinfo = os.stat('somefile.txt') + >>> statinfo + (33188, 422511, 769, 1, 1032, 100, 926, 1105022698,1105022732, 1105022732) + >>> statinfo.st_size + 926 Availability: Unix, Windows. @@ -1195,7 +1213,7 @@ .. function:: stat_float_times([newvalue]) Determine whether :class:`stat_result` represents time stamps as float objects. - If *newvalue* is ``True``, future calls to :func:`stat` return floats, if it is + If *newvalue* is ``True``, future calls to :func:`~os.stat` return floats, if it is ``False``, future calls return ints. If *newvalue* is omitted, return the current setting. @@ -1255,8 +1273,8 @@ respectively. Whether a directory can be given for *path* depends on whether the operating system implements directories as files (for example, Windows does not). Note that the exact times you set here may not be returned by a - subsequent :func:`stat` call, depending on the resolution with which your - operating system records access and modification times; see :func:`stat`. + subsequent :func:`~os.stat` call, depending on the resolution with which your + operating system records access and modification times; see :func:`~os.stat`. Availability: Unix, Windows. 
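Since the revised :func:`~os.stat` text above is mostly a catalogue of attribute names, a brief sketch of both access styles may help; the file name is a placeholder, as in the documentation's own example::

    import os
    import stat

    info = os.stat('somefile.txt')          # placeholder path

    # Named attributes from the list above.
    print(info.st_size, info.st_nlink, info.st_mtime)

    # Tuple-style access is still available for backward compatibility.
    print(info[stat.ST_SIZE] == info.st_size)

    # The stat module helps decode st_mode.
    print(stat.S_ISDIR(info.st_mode), oct(stat.S_IMODE(info.st_mode)))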
Modified: python/branches/release31-maint/Doc/tutorial/interpreter.rst ============================================================================== --- python/branches/release31-maint/Doc/tutorial/interpreter.rst (original) +++ python/branches/release31-maint/Doc/tutorial/interpreter.rst Fri Feb 11 04:13:19 2011 @@ -78,8 +78,9 @@ ---------------- When known to the interpreter, the script name and additional arguments -thereafter are passed to the script in the variable ``sys.argv``, which is a -list of strings. Its length is at least one; when no script and no arguments +thereafter are turned into a list of strings and assigned to the ``argv`` +variable in the ``sys`` module. You can access this list by executing ``import +sys``. The length of the list is at least one; when no script and no arguments are given, ``sys.argv[0]`` is an empty string. When the script name is given as ``'-'`` (meaning standard input), ``sys.argv[0]`` is set to ``'-'``. When :option:`-c` *command* is used, ``sys.argv[0]`` is set to ``'-c'``. When Modified: python/branches/release31-maint/Lib/email/header.py ============================================================================== --- python/branches/release31-maint/Lib/email/header.py (original) +++ python/branches/release31-maint/Lib/email/header.py Fri Feb 11 04:13:19 2011 @@ -304,10 +304,15 @@ self._continuation_ws, splitchars) for string, charset in self._chunks: lines = string.splitlines() - for line in lines: + formatter.feed(lines[0], charset) + for line in lines[1:]: + formatter.newline() + if charset.header_encoding is not None: + formatter.feed(self._continuation_ws, USASCII) + line = ' ' + line.lstrip() formatter.feed(line, charset) - if len(lines) > 1: - formatter.newline() + if len(lines) > 1: + formatter.newline() formatter.add_transition() value = str(formatter) if _embeded_header.search(value): Modified: python/branches/release31-maint/Lib/email/test/test_email.py ============================================================================== --- python/branches/release31-maint/Lib/email/test/test_email.py (original) +++ python/branches/release31-maint/Lib/email/test/test_email.py Fri Feb 11 04:13:19 2011 @@ -9,6 +9,7 @@ import difflib import unittest import warnings +import textwrap from io import StringIO from itertools import chain @@ -959,6 +960,19 @@ """) + def test_long_rfc2047_header_with_embedded_fws(self): + h = Header(textwrap.dedent("""\ + We're going to pretend this header is in a non-ascii character set + \tto see if line wrapping with encoded words and embedded + folding white space works"""), + charset='utf-8', + header_name='Test') + self.assertEqual(h.encode()+'\n', textwrap.dedent("""\ + =?utf-8?q?We=27re_going_to_pretend_this_header_is_in_a_non-ascii_chara?= + =?utf-8?q?cter_set?= + =?utf-8?q?_to_see_if_line_wrapping_with_encoded_words_and_embedded?= + =?utf-8?q?_folding_white_space_works?=""")+'\n') + # Test mangling of "From " lines in the body of a message Modified: python/branches/release31-maint/Lib/test/script_helper.py ============================================================================== --- python/branches/release31-maint/Lib/test/script_helper.py (original) +++ python/branches/release31-maint/Lib/test/script_helper.py Fri Feb 11 04:13:19 2011 @@ -3,6 +3,7 @@ import sys import os +import re import os.path import tempfile import subprocess @@ -11,6 +12,7 @@ import shutil import zipfile +from test.support import strip_python_stderr # Executing the interpreter in a subprocess def 
_assert_python(expected_success, *args, **env_vars): @@ -32,6 +34,7 @@ p.stdout.close() p.stderr.close() rc = p.returncode + err = strip_python_stderr(err) if (rc and expected_success) or (not rc and not expected_success): raise AssertionError( "Process return code is %d, " Modified: python/branches/release31-maint/Misc/NEWS ============================================================================== --- python/branches/release31-maint/Misc/NEWS (original) +++ python/branches/release31-maint/Misc/NEWS Fri Feb 11 04:13:19 2011 @@ -37,6 +37,11 @@ Library ------- +- email.header.Header was incorrectly encoding folding white space when + rfc2047-encoding header values with embedded newlines, leaving them + without folding whitespace. It now uses the continuation_ws, as it + does for continuation lines that it creates itself. + - Issue #11110: Fix _sqlite to not deref a NULL when module creation fails. - Issue #11089: Fix performance issue limiting the use of ConfigParser() Modified: python/branches/release31-maint/Modules/timemodule.c ============================================================================== --- python/branches/release31-maint/Modules/timemodule.c (original) +++ python/branches/release31-maint/Modules/timemodule.c Fri Feb 11 04:13:19 2011 @@ -742,7 +742,7 @@ } PyDoc_STRVAR(tzset_doc, -"tzset(zone)\n\ +"tzset()\n\ \n\ Initialize, or reinitialize, the local timezone to the value stored in\n\ os.environ['TZ']. The TZ environment variable should be specified in\n\ From solipsis at pitrou.net Fri Feb 11 05:06:15 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 11 Feb 2011 05:06:15 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88389): sum=0 Message-ID: py3k results for svn r88389 (hg cset 9752b22b5129) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogmis-Mn', '-x'] From g.brandl at gmx.net Fri Feb 11 08:38:43 2011 From: g.brandl at gmx.net (Georg Brandl) Date: Fri, 11 Feb 2011 08:38:43 +0100 Subject: [Python-checkins] r88387 - python/branches/py3k/Lib/asyncore.py In-Reply-To: References: <20110210184236.F1834EEB4E@mail.python.org> Message-ID: Am 11.02.2011 00:25, schrieb Nick Coghlan: > On Fri, Feb 11, 2011 at 4:42 AM, giampaolo.rodola > wrote: >> Author: giampaolo.rodola >> Date: Thu Feb 10 19:42:36 2011 >> New Revision: 88387 >> >> Log: >> get rid of asyncore.dispatcher's debug attribute, which is no longer used (assuming it ever was). > > Reviewer? NEWS entry? RM approval? Tracker issue? > > Removing a public attribute seems like an odd change to be making at > this stage of a release. What if there is third party code that uses > that attribute to enable additional debugging info in asyncore? Exactly: please revert this and wait until after 3.2 is released. Georg From python-checkins at python.org Fri Feb 11 12:25:47 2011 From: python-checkins at python.org (senthil.kumaran) Date: Fri, 11 Feb 2011 12:25:47 +0100 (CET) Subject: [Python-checkins] r88394 - in python/branches/py3k: Doc/library/urllib.parse.rst Doc/library/urllib.request.rst Lib/test/test_urllib2.py Lib/urllib/request.py Message-ID: <20110211112547.702A2EC21@mail.python.org> Author: senthil.kumaran Date: Fri Feb 11 12:25:47 2011 New Revision: 88394 Log: Fixed issue11082 - Reject str for POST data with a TypeError. Document the need to explicitly encode to bytes when using urlencode. 
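As a quick illustration of the behaviour this change enforces: :func:`urllib.parse.urlencode` still returns a ``str``, and that string must now be encoded to bytes before being passed to :func:`urlopen` as POST data, otherwise a :exc:`TypeError` is raised (the URL below is only a placeholder)::

    import urllib.parse
    import urllib.request

    params = urllib.parse.urlencode({'spam': 1, 'eggs': 2, 'bacon': 0})
    data = params.encode('utf-8')       # bytes, as now required for POST data

    # Passing the un-encoded str as data would raise TypeError after r88394.
    f = urllib.request.urlopen('http://www.example.com/cgi-bin/query', data)
    print(f.read().decode('utf-8'))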
Modified: python/branches/py3k/Doc/library/urllib.parse.rst python/branches/py3k/Doc/library/urllib.request.rst python/branches/py3k/Lib/test/test_urllib2.py python/branches/py3k/Lib/urllib/request.py Modified: python/branches/py3k/Doc/library/urllib.parse.rst ============================================================================== --- python/branches/py3k/Doc/library/urllib.parse.rst (original) +++ python/branches/py3k/Doc/library/urllib.parse.rst Fri Feb 11 12:25:47 2011 @@ -140,6 +140,7 @@ Use the :func:`urllib.parse.urlencode` function to convert such dictionaries into query strings. + .. versionchanged:: 3.2 Add *encoding* and *errors* parameters. @@ -506,9 +507,10 @@ .. function:: urlencode(query, doseq=False, safe='', encoding=None, errors=None) Convert a mapping object or a sequence of two-element tuples, which may - either be a :class:`str` or a :class:`bytes`, to a "percent-encoded" string, - suitable to pass to :func:`urlopen` above as the optional *data* argument. - This is useful to pass a dictionary of form fields to a ``POST`` request. + either be a :class:`str` or a :class:`bytes`, to a "percent-encoded" + string. The resultant string must be converted to bytes using the + user-specified encoding before it is sent to :func:`urlopen` as the optional + *data* argument. The resulting string is a series of ``key=value`` pairs separated by ``'&'`` characters, where both *key* and *value* are quoted using :func:`quote_plus` above. When a sequence of two-element tuples is used as the *query* @@ -525,6 +527,9 @@ To reverse this encoding process, :func:`parse_qs` and :func:`parse_qsl` are provided in this module to parse query strings into Python data structures. + Refer to :ref:`urllib examples ` to find out how urlencode + method can be used for generating query string for a URL or data for POST. + .. versionchanged:: 3.2 Query parameter supports bytes and string objects. Modified: python/branches/py3k/Doc/library/urllib.request.rst ============================================================================== --- python/branches/py3k/Doc/library/urllib.request.rst (original) +++ python/branches/py3k/Doc/library/urllib.request.rst Fri Feb 11 12:25:47 2011 @@ -967,7 +967,7 @@ >>> import urllib.request >>> req = urllib.request.Request(url='https://localhost/cgi-bin/test.cgi', - ... data='This data is passed to stdin of the CGI') + ... data=b'This data is passed to stdin of the CGI') >>> f = urllib.request.urlopen(req) >>> print(f.read().decode('utf-8')) Got Data: "This data is passed to stdin of the CGI" @@ -1043,11 +1043,13 @@ >>> f = urllib.request.urlopen("http://www.musi-cal.com/cgi-bin/query?%s" % params) >>> print(f.read().decode('utf-8')) -The following example uses the ``POST`` method instead:: +The following example uses the ``POST`` method instead. 
Note that params output +from urlencode is encoded to bytes before it is sent to urlopen as data:: >>> import urllib.request >>> import urllib.parse >>> params = urllib.parse.urlencode({'spam': 1, 'eggs': 2, 'bacon': 0}) + >>> params = params.encode('utf-8') >>> f = urllib.request.urlopen("http://www.musi-cal.com/cgi-bin/query", params) >>> print(f.read().decode('utf-8')) Modified: python/branches/py3k/Lib/test/test_urllib2.py ============================================================================== --- python/branches/py3k/Lib/test/test_urllib2.py (original) +++ python/branches/py3k/Lib/test/test_urllib2.py Fri Feb 11 12:25:47 2011 @@ -794,6 +794,10 @@ http.raise_on_endheaders = True self.assertRaises(urllib.error.URLError, h.do_open, http, req) + # Check for TypeError on POST data which is str. + req = Request("http://example.com/","badpost") + self.assertRaises(TypeError, h.do_request_, req) + # check adding of standard headers o.addheaders = [("Spam", "eggs")] for data in b"", None: # POST, GET @@ -837,10 +841,11 @@ else: newreq = h.do_request_(req) - # A file object + # A file object. + # Test only Content-Length attribute of request. - file_obj = io.StringIO() - file_obj.write("Something\nSomething\nSomething\n") + file_obj = io.BytesIO() + file_obj.write(b"Something\nSomething\nSomething\n") for headers in {}, {"Content-Length": 30}: req = Request("http://example.com/", file_obj, headers) @@ -863,7 +868,6 @@ newreq = h.do_request_(req) self.assertEqual(int(newreq.get_header('Content-length')),16) - def test_http_doubleslash(self): # Checks the presence of any unnecessary double slash in url does not # break anything. Previously, a double slash directly after the host Modified: python/branches/py3k/Lib/urllib/request.py ============================================================================== --- python/branches/py3k/Lib/urllib/request.py (original) +++ python/branches/py3k/Lib/urllib/request.py Fri Feb 11 12:25:47 2011 @@ -1048,6 +1048,9 @@ if request.data is not None: # POST data = request.data + if isinstance(data, str): + raise TypeError("POST data should be bytes" + " or an iterable of bytes. It cannot be str.") if not request.has_header('Content-type'): request.add_unredirected_header( 'Content-type', From python-checkins at python.org Fri Feb 11 14:04:18 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Fri, 11 Feb 2011 14:04:18 +0100 (CET) Subject: [Python-checkins] r88395 - python/branches/py3k/Lib/asyncore.py Message-ID: <20110211130418.5D6F4EE9CE@mail.python.org> Author: giampaolo.rodola Date: Fri Feb 11 14:04:18 2011 New Revision: 88395 Log: asyncore: introduce a new 'closed' attribute to make sure that dispatcher gets closed only once. In different occasions close() might be called more than once, causing problems with already disconnected sockets/dispatchers. 
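The situation the log message describes is a ``close()`` that can be reached more than once for the same dispatcher (for example once from ``handle_close()`` and again from explicit cleanup). The guard added by the patch below boils down to this pattern (a minimal sketch only; the real change is in the diff that follows)::

    class dispatcher:
        closed = False
        connected = False
        accepting = False

        def close(self):
            if not self.closed:          # teardown runs only on the first call
                self.closed = True
                self.connected = False
                self.accepting = False
                # asyncore's real close() also calls del_channel() and
                # socket.close() at this point

    d = dispatcher()
    d.close()
    d.close()                            # second call is now a harmless no-op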
Modified: python/branches/py3k/Lib/asyncore.py Modified: python/branches/py3k/Lib/asyncore.py ============================================================================== --- python/branches/py3k/Lib/asyncore.py (original) +++ python/branches/py3k/Lib/asyncore.py Fri Feb 11 14:04:18 2011 @@ -220,7 +220,7 @@ connected = False accepting = False - closing = False + closed = False addr = None ignore_log_types = frozenset(['warning']) @@ -393,14 +393,16 @@ raise def close(self): - self.connected = False - self.accepting = False - self.del_channel() - try: - self.socket.close() - except socket.error as why: - if why.args[0] not in (ENOTCONN, EBADF): - raise + if not self.closed: + self.closed = True + self.connected = False + self.accepting = False + self.del_channel() + try: + self.socket.close() + except socket.error as why: + if why.args[0] not in (ENOTCONN, EBADF): + raise # cheap inheritance, used to pass all other attribute # references to the underlying socket object. From ncoghlan at gmail.com Fri Feb 11 14:25:10 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 11 Feb 2011 23:25:10 +1000 Subject: [Python-checkins] r88395 - python/branches/py3k/Lib/asyncore.py In-Reply-To: <20110211130418.5D6F4EE9CE@mail.python.org> References: <20110211130418.5D6F4EE9CE@mail.python.org> Message-ID: On Fri, Feb 11, 2011 at 11:04 PM, giampaolo.rodola wrote: > Author: giampaolo.rodola > Date: Fri Feb 11 14:04:18 2011 > New Revision: 88395 > > Log: > asyncore: introduce a new 'closed' attribute to make sure that dispatcher gets closed only once. > In different occasions close() might be called more than once, causing problems with already disconnected sockets/dispatchers. Giampaolo, This checkin and the previous one are not appropriate for the release candidate phase of the 3.2 release. At the very least, they need to identify the second core dev that reviewed them, as well as a reference to the tracker issue where the RM approved them for inclusion. Regards, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? 
Brisbane, Australia From python-checkins at python.org Fri Feb 11 15:01:46 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Fri, 11 Feb 2011 15:01:46 +0100 (CET) Subject: [Python-checkins] r88396 - python/branches/py3k/Lib/asyncore.py Message-ID: <20110211140146.4EDCFEE9A2@mail.python.org> Author: giampaolo.rodola Date: Fri Feb 11 15:01:46 2011 New Revision: 88396 Log: reverting r88395 and r88387 as per http://mail.python.org/pipermail/python-dev/2011-February/108005.html Modified: python/branches/py3k/Lib/asyncore.py Modified: python/branches/py3k/Lib/asyncore.py ============================================================================== --- python/branches/py3k/Lib/asyncore.py (original) +++ python/branches/py3k/Lib/asyncore.py Fri Feb 11 15:01:46 2011 @@ -218,9 +218,10 @@ class dispatcher: + debug = False connected = False accepting = False - closed = False + closing = False addr = None ignore_log_types = frozenset(['warning']) @@ -393,16 +394,14 @@ raise def close(self): - if not self.closed: - self.closed = True - self.connected = False - self.accepting = False - self.del_channel() - try: - self.socket.close() - except socket.error as why: - if why.args[0] not in (ENOTCONN, EBADF): - raise + self.connected = False + self.accepting = False + self.del_channel() + try: + self.socket.close() + except socket.error as why: + if why.args[0] not in (ENOTCONN, EBADF): + raise # cheap inheritance, used to pass all other attribute # references to the underlying socket object. @@ -545,6 +544,8 @@ return (not self.connected) or len(self.out_buffer) def send(self, data): + if self.debug: + self.log_info('sending %s' % repr(data)) self.out_buffer = self.out_buffer + data self.initiate_send() From python-checkins at python.org Fri Feb 11 18:25:54 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 18:25:54 +0100 (CET) Subject: [Python-checkins] r88397 - in python/branches/release27-maint: Doc/ACKS.txt Doc/library/argparse.rst Doc/library/compileall.rst Doc/library/doctest.rst Doc/library/email.header.rst Doc/library/os.rst Doc/tutorial/interpreter.rst Lib/test/regrtest.py Lib/test/script_helper.py Lib/test/test_argparse.py Modules/timemodule.c Tools/buildbot/test.bat Message-ID: <20110211172554.7C094EE9D0@mail.python.org> Author: r.david.murray Date: Fri Feb 11 18:25:54 2011 New Revision: 88397 Log: Merged revisions 86542,87136,87216,87221,87228,87256,87337-87338,87372,87516,87571,88164 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r86542 | r.david.murray | 2010-11-19 22:48:58 -0500 (Fri, 19 Nov 2010) | 2 lines Make test class name unique so that both test classes run. ........ r87136 | r.david.murray | 2010-12-08 17:53:00 -0500 (Wed, 08 Dec 2010) | 6 lines Have script_helper._assert_python strip refcount strings from stderr. This makes the output of the function and those that depend on it independent of whether or not they are being run under a debug build. ........ r87216 | r.david.murray | 2010-12-13 17:50:30 -0500 (Mon, 13 Dec 2010) | 2 lines #10698: fix typo in example. ........ r87221 | r.david.murray | 2010-12-13 19:55:46 -0500 (Mon, 13 Dec 2010) | 4 lines #10699: fix docstring for tzset: it does not take a parameter Thanks to Garrett Cooper for the fix. ........ r87228 | r.david.murray | 2010-12-13 21:25:43 -0500 (Mon, 13 Dec 2010) | 2 lines Turn on regrtest -W (rerun immediately) option for Windows, too. ........ 
r87256 | r.david.murray | 2010-12-14 21:19:14 -0500 (Tue, 14 Dec 2010) | 2 lines #10705: document what the values of debuglevel are and mean. ........ r87337 | r.david.murray | 2010-12-17 11:11:40 -0500 (Fri, 17 Dec 2010) | 2 lines #10559: provide instructions for accessing sys.argv when first mentioned. ........ r87338 | r.david.murray | 2010-12-17 11:29:07 -0500 (Fri, 17 Dec 2010) | 2 lines #10454: clarify the compileall docs and help messages. [changes to compileall.py were not backported, only the doc changes] ........ r87372 | r.david.murray | 2010-12-18 11:39:06 -0500 (Sat, 18 Dec 2010) | 2 lines #10728: the default for printing help is sys.stdout, not stderr. ........ r87516 | r.david.murray | 2010-12-27 15:09:32 -0500 (Mon, 27 Dec 2010) | 5 lines #7056: runtest and runtest_inner don't use testdir, so drop it from their sigs I've only tested regular runs and -j runs. If I've broken anything else I'm sure I'll hear about it sooner or later. ........ r87571 | r.david.murray | 2010-12-29 14:06:48 -0500 (Wed, 29 Dec 2010) | 2 lines Fix same typo in docs. ........ r88164 | r.david.murray | 2011-01-24 14:34:58 -0500 (Mon, 24 Jan 2011) | 12 lines #10960: fix 'stat' links, link to lstat from stat, general tidy of stat doc. Original patch by Michal Nowikowski, with some additions and wording fixes by me. I changed the wording from 'Performs a stat system call' to 'Performs the equivalent of a stat system call', since on Windows there are no stat/lstat system calls involved. I also extended Michal's breakout of the attributes into a list to the other paragraphs, and rearranged the order of the paragraphs in the 'stat' docs to make it flow better and put it in what I think is a more logical/useful order. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Doc/ACKS.txt python/branches/release27-maint/Doc/library/argparse.rst python/branches/release27-maint/Doc/library/compileall.rst python/branches/release27-maint/Doc/library/doctest.rst python/branches/release27-maint/Doc/library/email.header.rst python/branches/release27-maint/Doc/library/os.rst python/branches/release27-maint/Doc/tutorial/interpreter.rst python/branches/release27-maint/Lib/test/regrtest.py python/branches/release27-maint/Lib/test/script_helper.py python/branches/release27-maint/Lib/test/test_argparse.py python/branches/release27-maint/Modules/timemodule.c python/branches/release27-maint/Tools/buildbot/test.bat Modified: python/branches/release27-maint/Doc/ACKS.txt ============================================================================== --- python/branches/release27-maint/Doc/ACKS.txt (original) +++ python/branches/release27-maint/Doc/ACKS.txt Fri Feb 11 18:25:54 2011 @@ -137,6 +137,7 @@ * Ross Moore * Sjoerd Mullender * Dale Nagata + * Michal Nowikowski * Ng Pheng Siong * Koray Oner * Tomas Oppelstrup Modified: python/branches/release27-maint/Doc/library/argparse.rst ============================================================================== --- python/branches/release27-maint/Doc/library/argparse.rst (original) +++ python/branches/release27-maint/Doc/library/argparse.rst Fri Feb 11 18:25:54 2011 @@ -1654,14 +1654,14 @@ .. method:: ArgumentParser.print_usage(file=None) Print a brief description of how the :class:`ArgumentParser` should be - invoked on the command line. If *file* is ``None``, :data:`sys.stderr` is + invoked on the command line. If *file* is ``None``, :data:`sys.stdout` is assumed. .. 
method:: ArgumentParser.print_help(file=None) Print a help message, including the program usage and information about the arguments registered with the :class:`ArgumentParser`. If *file* is - ``None``, :data:`sys.stderr` is assumed. + ``None``, :data:`sys.stdout` is assumed. There are also variants of these methods that simply return a string instead of printing it: Modified: python/branches/release27-maint/Doc/library/compileall.rst ============================================================================== --- python/branches/release27-maint/Doc/library/compileall.rst (original) +++ python/branches/release27-maint/Doc/library/compileall.rst Fri Feb 11 18:25:54 2011 @@ -6,9 +6,10 @@ This module provides some utility functions to support installing Python -libraries. These functions compile Python source files in a directory tree, -allowing users without permission to write to the libraries to take advantage of -cached byte-code files. +libraries. These functions compile Python source files in a directory tree. +This module can be used to create the cached byte-code files at library +installation time, which makes them available for use even by users who don't +have write permission to the library directories. Command-line use @@ -27,7 +28,8 @@ .. cmdoption:: -l - Do not recurse. + Do not recurse into subdirectories, only compile source code files directly + contained in the named or implied directories. .. cmdoption:: -f @@ -35,19 +37,26 @@ .. cmdoption:: -q - Do not print the list of files compiled. + Do not print the list of files compiled, print only error messages. .. cmdoption:: -d destdir - Purported directory name for error messages. + Directory prepended to the path to each file being compiled. This will + appear in compilation time tracebacks, and is also compiled in to the + byte-code file, where it will be used in tracebacks and other messages in + cases where the source file does not exist at the time the byte-code file is + executed. .. cmdoption:: -x regex - Skip files with a full path that matches given regular expression. + regex is used to search the full path to each file considered for + compilation, and if the regex produces a match, the file is skipped. .. cmdoption:: -i list - Expand list with its content (file and directory names). + Read the file ``list`` and add each line that it contains to the list of + files and directories to compile. If ``list`` is ``-``, read lines from + ``stdin``. .. versionchanged:: 2.7 Added the ``-i`` option. @@ -59,31 +68,44 @@ .. function:: compile_dir(dir[, maxlevels[, ddir[, force[, rx[, quiet]]]]]) Recursively descend the directory tree named by *dir*, compiling all :file:`.py` - files along the way. The *maxlevels* parameter is used to limit the depth of - the recursion; it defaults to ``10``. If *ddir* is given, it is used as the - base path from which the filenames used in error messages will be generated. + files along the way. + + The *maxlevels* parameter is used to limit the depth of the recursion; it + defaults to ``10``. + + If *ddir* is given, it is prepended to the path to each file being compiled + for use in compilation time tracebacks, and is also compiled in to the + byte-code file, where it will be used in tracebacks and other messages in + cases where the source file does not exist at the time the byte-code file is + executed. + If *force* is true, modules are re-compiled even if the timestamps are up to date. 
- If *rx* is given, it specifies a regular expression of file names to exclude - from the search; that expression is searched for in the full path. + If *rx* is given, its search method is called on the complete path to each + file considered for compilation, and if it returns a true value, the file + is skipped. - If *quiet* is true, nothing is printed to the standard output in normal - operation. + If *quiet* is true, nothing is printed to the standard output unless errors + occur. .. function:: compile_file(fullname[, ddir[, force[, rx[, quiet]]]]) - Compile the file with path *fullname*. If *ddir* is given, it is used as the - base path from which the filename used in error messages will be generated. - If *force* is true, modules are re-compiled even if the timestamp is up to - date. + Compile the file with path *fullname*. - If *rx* is given, it specifies a regular expression which, if matched, will - prevent compilation; that expression is searched for in the full path. + If *ddir* is given, it is prepended to the path to the file being compiled + for use in compilation time tracebacks, and is also compiled in to the + byte-code file, where it will be used in tracebacks and other messages in + cases where the source file does not exist at the time the byte-code file is + executed. + + If *ra* is given, its search method is passed the full path name to the + file being compiled, and if it returns a true value, the file is not + compiled and ``True`` is returned. - If *quiet* is true, nothing is printed to the standard output in normal - operation. + If *quiet* is true, nothing is printed to the standard output unless errors + occur. .. versionadded:: 2.7 @@ -91,9 +113,10 @@ .. function:: compile_path([skip_curdir[, maxlevels[, force]]]) Byte-compile all the :file:`.py` files found along ``sys.path``. If - *skip_curdir* is true (the default), the current directory is not included in - the search. The *maxlevels* and *force* parameters default to ``0`` and are - passed to the :func:`compile_dir` function. + *skip_curdir* is true (the default), the current directory is not included + in the search. All other parameters are passed to the :func:`compile_dir` + function. Note that unlike the other compile functions, ``maxlevels`` + defaults to ``0``. To force a recompile of all the :file:`.py` files in the :file:`Lib/` subdirectory and all its subdirectories:: Modified: python/branches/release27-maint/Doc/library/doctest.rst ============================================================================== --- python/branches/release27-maint/Doc/library/doctest.rst (original) +++ python/branches/release27-maint/Doc/library/doctest.rst Fri Feb 11 18:25:54 2011 @@ -982,7 +982,7 @@ def load_tests(loader, tests, ignore): tests.addTests(doctest.DocTestSuite(my_module_with_doctests)) - return test + return tests There are two main functions for creating :class:`unittest.TestSuite` instances from text files and modules with doctests: Modified: python/branches/release27-maint/Doc/library/email.header.rst ============================================================================== --- python/branches/release27-maint/Doc/library/email.header.rst (original) +++ python/branches/release27-maint/Doc/library/email.header.rst Fri Feb 11 18:25:54 2011 @@ -65,7 +65,7 @@ character set is used both as *s*'s initial charset and as the default for subsequent :meth:`append` calls. - The maximum line length can be specified explicit via *maxlinelen*. 
For + The maximum line length can be specified explicitly via *maxlinelen*. For splitting the first line to a shorter value (to account for the field header which isn't included in *s*, e.g. :mailheader:`Subject`) pass in the name of the field in *header_name*. The default *maxlinelen* is 76, and the default value Modified: python/branches/release27-maint/Doc/library/os.rst ============================================================================== --- python/branches/release27-maint/Doc/library/os.rst (original) +++ python/branches/release27-maint/Doc/library/os.rst Fri Feb 11 18:25:54 2011 @@ -674,7 +674,7 @@ .. function:: fstat(fd) - Return status for file descriptor *fd*, like :func:`stat`. + Return status for file descriptor *fd*, like :func:`~os.stat`. Availability: Unix, Windows. @@ -1114,9 +1114,10 @@ .. function:: lstat(path) - Like :func:`stat`, but do not follow symbolic links. This is an alias for - :func:`stat` on platforms that do not support symbolic links, such as - Windows. + Perform the equivalent of an :c:func:`lstat` system call on the given path. + Similar to :func:`~os.stat`, but does not follow symbolic links. On + platforms that do not support symbolic links, this is an alias for + :func:`~os.stat`. .. function:: mkfifo(path[, mode]) @@ -1314,23 +1315,23 @@ .. function:: stat(path) - Perform a :cfunc:`stat` system call on the given path. The return value is an - object whose attributes correspond to the members of the :ctype:`stat` - structure, namely: :attr:`st_mode` (protection bits), :attr:`st_ino` (inode - number), :attr:`st_dev` (device), :attr:`st_nlink` (number of hard links), - :attr:`st_uid` (user id of owner), :attr:`st_gid` (group id of owner), - :attr:`st_size` (size of file, in bytes), :attr:`st_atime` (time of most recent - access), :attr:`st_mtime` (time of most recent content modification), - :attr:`st_ctime` (platform dependent; time of most recent metadata change on - Unix, or the time of creation on Windows):: + Perform the equivalent of a :c:func:`stat` system call on the given path. + (This function follows symlinks; to stat a symlink use :func:`lstat`.) - >>> import os - >>> statinfo = os.stat('somefile.txt') - >>> statinfo - (33188, 422511L, 769L, 1, 1032, 100, 926L, 1105022698,1105022732, 1105022732) - >>> statinfo.st_size - 926L - >>> + The return value is an object whose attributes correspond to the members + of the :c:type:`stat` structure, namely: + + * :attr:`st_mode` - protection bits, + * :attr:`st_ino` - inode number, + * :attr:`st_dev` - device, + * :attr:`st_nlink` - number of hard links, + * :attr:`st_uid` - user id of owner, + * :attr:`st_gid` - group id of owner, + * :attr:`st_size` - size of file, in bytes, + * :attr:`st_atime` - time of most recent access, + * :attr:`st_mtime` - time of most recent content modification, + * :attr:`st_ctime` - platform dependent; time of most recent metadata change on + Unix, or the time of creation on Windows) .. versionchanged:: 2.3 If :func:`stat_float_times` returns ``True``, the time values are floats, measuring @@ -1339,39 +1340,60 @@ discussion. On some Unix systems (such as Linux), the following attributes may also be - available: :attr:`st_blocks` (number of blocks allocated for file), - :attr:`st_blksize` (filesystem blocksize), :attr:`st_rdev` (type of device if an - inode device). :attr:`st_flags` (user defined flags for file). 
+ available: + + * :attr:`st_blocks` - number of blocks allocated for file + * :attr:`st_blksize` - filesystem blocksize + * :attr:`st_rdev` - type of device if an inode device + * :attr:`st_flags` - user defined flags for file On other Unix systems (such as FreeBSD), the following attributes may be - available (but may be only filled out if root tries to use them): :attr:`st_gen` - (file generation number), :attr:`st_birthtime` (time of file creation). + available (but may be only filled out if root tries to use them): + + * :attr:`st_gen` - file generation number + * :attr:`st_birthtime` - time of file creation On Mac OS systems, the following attributes may also be available: - :attr:`st_rsize`, :attr:`st_creator`, :attr:`st_type`. - On RISCOS systems, the following attributes are also available: :attr:`st_ftype` - (file type), :attr:`st_attrs` (attributes), :attr:`st_obtype` (object type). + * :attr:`st_rsize` + * :attr:`st_creator` + * :attr:`st_type` + + On RISCOS systems, the following attributes are also available: + + * :attr:`st_ftype` (file type) + * :attr:`st_attrs` (attributes) + * :attr:`st_obtype` (object type). - .. index:: module: stat + .. note:: + + The exact meaning and resolution of the :attr:`st_atime`, :attr:`st_mtime`, and + :attr:`st_ctime` members depends on the operating system and the file system. + For example, on Windows systems using the FAT or FAT32 file systems, + :attr:`st_mtime` has 2-second resolution, and :attr:`st_atime` has only 1-day + resolution. See your operating system documentation for details. - For backward compatibility, the return value of :func:`stat` is also accessible + For backward compatibility, the return value of :func:`~os.stat` is also accessible as a tuple of at least 10 integers giving the most important (and portable) members of the :ctype:`stat` structure, in the order :attr:`st_mode`, :attr:`st_ino`, :attr:`st_dev`, :attr:`st_nlink`, :attr:`st_uid`, :attr:`st_gid`, :attr:`st_size`, :attr:`st_atime`, :attr:`st_mtime`, :attr:`st_ctime`. More items may be added at the end by some implementations. + + .. index:: module: stat + The standard module :mod:`stat` defines functions and constants that are useful for extracting information from a :ctype:`stat` structure. (On Windows, some items are filled with dummy values.) - .. note:: + Example:: - The exact meaning and resolution of the :attr:`st_atime`, :attr:`st_mtime`, and - :attr:`st_ctime` members depends on the operating system and the file system. - For example, on Windows systems using the FAT or FAT32 file systems, - :attr:`st_mtime` has 2-second resolution, and :attr:`st_atime` has only 1-day - resolution. See your operating system documentation for details. + >>> import os + >>> statinfo = os.stat('somefile.txt') + >>> statinfo + (33188, 422511, 769, 1, 1032, 100, 926, 1105022698,1105022732, 1105022732) + >>> statinfo.st_size + 926 Availability: Unix, Windows. @@ -1385,7 +1407,7 @@ .. function:: stat_float_times([newvalue]) Determine whether :class:`stat_result` represents time stamps as float objects. - If *newvalue* is ``True``, future calls to :func:`stat` return floats, if it is + If *newvalue* is ``True``, future calls to :func:`~os.stat` return floats, if it is ``False``, future calls return ints. If *newvalue* is omitted, return the current setting. @@ -1505,8 +1527,8 @@ respectively. Whether a directory can be given for *path* depends on whether the operating system implements directories as files (for example, Windows does not). 
Note that the exact times you set here may not be returned by a - subsequent :func:`stat` call, depending on the resolution with which your - operating system records access and modification times; see :func:`stat`. + subsequent :func:`~os.stat` call, depending on the resolution with which your + operating system records access and modification times; see :func:`~os.stat`. .. versionchanged:: 2.0 Added support for ``None`` for *times*. Modified: python/branches/release27-maint/Doc/tutorial/interpreter.rst ============================================================================== --- python/branches/release27-maint/Doc/tutorial/interpreter.rst (original) +++ python/branches/release27-maint/Doc/tutorial/interpreter.rst Fri Feb 11 18:25:54 2011 @@ -78,8 +78,9 @@ ---------------- When known to the interpreter, the script name and additional arguments -thereafter are passed to the script in the variable ``sys.argv``, which is a -list of strings. Its length is at least one; when no script and no arguments +thereafter are turned into a list of strings and assigned to the ``argv`` +variable in the ``sys`` module. You can access this list by executing ``import +sys``. The length of the list is at least one; when no script and no arguments are given, ``sys.argv[0]`` is an empty string. When the script name is given as ``'-'`` (meaning standard input), ``sys.argv[0]`` is set to ``'-'``. When :option:`-c` *command* is used, ``sys.argv[0]`` is set to ``'-c'``. When Modified: python/branches/release27-maint/Lib/test/regrtest.py ============================================================================== --- python/branches/release27-maint/Lib/test/regrtest.py (original) +++ python/branches/release27-maint/Lib/test/regrtest.py Fri Feb 11 18:25:54 2011 @@ -490,7 +490,7 @@ def tests_and_args(): for test in tests: args_tuple = ( - (test, verbose, quiet, testdir), + (test, verbose, quiet), dict(huntrleaks=huntrleaks, use_resources=use_resources) ) yield (test, args_tuple) @@ -557,16 +557,15 @@ if trace: # If we're tracing code coverage, then we don't exit with status # if on a false return value from main. - tracer.runctx('runtest(test, verbose, quiet, testdir)', + tracer.runctx('runtest(test, verbose, quiet)', globals=globals(), locals=vars()) else: try: - result = runtest(test, verbose, quiet, - testdir, huntrleaks) + result = runtest(test, verbose, quiet, huntrleaks) accumulate_result(test, result) if verbose3 and result[0] == FAILED: print "Re-running test %r in verbose mode" % test - runtest(test, True, quiet, testdir, huntrleaks) + runtest(test, True, quiet, huntrleaks) except KeyboardInterrupt: interrupted = True break @@ -636,8 +635,7 @@ sys.stdout.flush() try: test_support.verbose = True - ok = runtest(test, True, quiet, testdir, - huntrleaks) + ok = runtest(test, True, quiet, huntrleaks) except KeyboardInterrupt: # print a newline separate from the ^C print @@ -693,14 +691,13 @@ return stdtests + sorted(tests) def runtest(test, verbose, quiet, - testdir=None, huntrleaks=False, use_resources=None): + huntrleaks=False, use_resources=None): """Run a single test. 
test -- the name of the test verbose -- if true, print more messages quiet -- if true, don't print 'skipped' messages (probably redundant) test_times -- a list of (time, test_name) pairs - testdir -- test directory huntrleaks -- run multiple times to test for leaks; requires a debug build; a triple corresponding to -R's three arguments Returns one of the test result constants: @@ -716,8 +713,7 @@ if use_resources is not None: test_support.use_resources = use_resources try: - return runtest_inner(test, verbose, quiet, - testdir, huntrleaks) + return runtest_inner(test, verbose, quiet, huntrleaks) finally: cleanup_test_droppings(test, verbose) @@ -850,10 +846,8 @@ return False -def runtest_inner(test, verbose, quiet, - testdir=None, huntrleaks=False): +def runtest_inner(test, verbose, quiet, huntrleaks=False): test_support.unload(test) - testdir = findtestdir(testdir) if verbose: capture_stdout = None else: Modified: python/branches/release27-maint/Lib/test/script_helper.py ============================================================================== --- python/branches/release27-maint/Lib/test/script_helper.py (original) +++ python/branches/release27-maint/Lib/test/script_helper.py Fri Feb 11 18:25:54 2011 @@ -3,6 +3,7 @@ import sys import os +import re import os.path import tempfile import subprocess @@ -11,6 +12,8 @@ import shutil import zipfile +from test.test_support import strip_python_stderr + # Executing the interpreter in a subprocess def _assert_python(expected_success, *args, **env_vars): cmd_line = [sys.executable] @@ -31,6 +34,7 @@ p.stdout.close() p.stderr.close() rc = p.returncode + err = strip_python_stderr(err) if (rc and expected_success) or (not rc and not expected_success): raise AssertionError( "Process return code is %d, " Modified: python/branches/release27-maint/Lib/test/test_argparse.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_argparse.py (original) +++ python/branches/release27-maint/Lib/test/test_argparse.py Fri Feb 11 18:25:54 2011 @@ -4317,7 +4317,7 @@ # ArgumentTypeError tests # ======================= -class TestArgumentError(TestCase): +class TestArgumentTypeError(TestCase): def test_argument_type_error(self): Modified: python/branches/release27-maint/Modules/timemodule.c ============================================================================== --- python/branches/release27-maint/Modules/timemodule.c (original) +++ python/branches/release27-maint/Modules/timemodule.c Fri Feb 11 18:25:54 2011 @@ -671,7 +671,7 @@ } PyDoc_STRVAR(tzset_doc, -"tzset(zone)\n\ +"tzset()\n\ \n\ Initialize, or reinitialize, the local timezone to the value stored in\n\ os.environ['TZ']. The TZ environment variable should be specified in\n\ Modified: python/branches/release27-maint/Tools/buildbot/test.bat ============================================================================== --- python/branches/release27-maint/Tools/buildbot/test.bat (original) +++ python/branches/release27-maint/Tools/buildbot/test.bat Fri Feb 11 18:25:54 2011 @@ -1,3 +1,3 @@ @rem Used by the buildbot "test" step. 
cd PCbuild -call rt.bat -d -q -uall -rw +call rt.bat -d -q -uall -rwW From python-checkins at python.org Fri Feb 11 18:29:23 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 18:29:23 +0100 (CET) Subject: [Python-checkins] r88398 - python/branches/release27-maint Message-ID: <20110211172923.863F5EE983@mail.python.org> Author: r.david.murray Date: Fri Feb 11 18:29:23 2011 New Revision: 88398 Log: Blocked revisions 85078,87839 via svnmerge 87839 turns out to have been a py3k regression; 2.x doesn't have the bug. ........ r85078 | r.david.murray | 2010-09-28 18:25:18 -0400 (Tue, 28 Sep 2010) | 2 lines #9628: fix runtests.sh -x option so more than one test can be excluded. ........ r87839 | r.david.murray | 2011-01-07 16:57:25 -0500 (Fri, 07 Jan 2011) | 9 lines Fix formatting of values with embedded newlines when rfc2047 encoding Before this patch if a value being encoded had an embedded newline, the line following the newline would have no leading whitespace, and the whitespace it did have was encoded into the word. Now the existing whitespace gets turned into a blank, the way it does in other header reformatting, and the _continuation_ws gets added at the beginning of the encoded line. ........ Modified: python/branches/release27-maint/ (props changed) From python-checkins at python.org Fri Feb 11 21:44:40 2011 From: python-checkins at python.org (martin.v.loewis) Date: Fri, 11 Feb 2011 21:44:40 +0100 (CET) Subject: [Python-checkins] r88399 - python/branches/py3k/Tools/msi/uuids.py Message-ID: <20110211204440.62A17EE99E@mail.python.org> Author: martin.v.loewis Date: Fri Feb 11 21:44:40 2011 New Revision: 88399 Log: Add uuid for 3.2rc3. Modified: python/branches/py3k/Tools/msi/uuids.py Modified: python/branches/py3k/Tools/msi/uuids.py ============================================================================== --- python/branches/py3k/Tools/msi/uuids.py (original) +++ python/branches/py3k/Tools/msi/uuids.py Fri Feb 11 21:44:40 2011 @@ -85,5 +85,6 @@ '3.2.112' :'{0e350c98-8d73-4993-b686-cfe87160046e}', # 3.2b2 '3.2.121' :'{2094968d-7583-47f6-a7fd-22304532e09f}', # 3.2rc1 '3.2.122' :'{4f3edfa6-cf70-469a-825f-e1206aa7f412}', # 3.2rc2 + '3.2.123' :'{90c673d7-8cfd-4969-9816-f7d70bad87f3}', # 3.2rc3 '3.2.150' :'{b2042d5e-986d-44ec-aee3-afe4108ccc93}', # 3.2.0 } From python-checkins at python.org Fri Feb 11 21:47:49 2011 From: python-checkins at python.org (martin.v.loewis) Date: Fri, 11 Feb 2011 21:47:49 +0100 (CET) Subject: [Python-checkins] r88400 - in python/branches/py3k: Include/object.h Misc/NEWS Modules/xxlimited.c Tools/scripts/abitype.py Message-ID: <20110211204749.AD77CEE987@mail.python.org> Author: martin.v.loewis Date: Fri Feb 11 21:47:49 2011 New Revision: 88400 Log: Issue #11135: Remove redundant doc field from PyType_Spec. Reviewed by Georg Brandl. 
Modified: python/branches/py3k/Include/object.h python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/xxlimited.c python/branches/py3k/Tools/scripts/abitype.py Modified: python/branches/py3k/Include/object.h ============================================================================== --- python/branches/py3k/Include/object.h (original) +++ python/branches/py3k/Include/object.h Fri Feb 11 21:47:49 2011 @@ -396,7 +396,6 @@ typedef struct{ const char* name; - const char* doc; int basicsize; int itemsize; int flags; Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Fri Feb 11 21:47:49 2011 @@ -10,6 +10,8 @@ Core and Builtins ----------------- +- Issue #11135: Remove redundant doc field from PyType_Spec. + - Issue #11067: Add PyType_GetFlags, to support PyUnicode_Check in the limited ABI. Modified: python/branches/py3k/Modules/xxlimited.c ============================================================================== --- python/branches/py3k/Modules/xxlimited.c (original) +++ python/branches/py3k/Modules/xxlimited.c Fri Feb 11 21:47:49 2011 @@ -101,6 +101,7 @@ } static PyType_Slot Xxo_Type_slots[] = { + {Py_tp_doc, "The Xxo type"}, {Py_tp_dealloc, Xxo_dealloc}, {Py_tp_getattro, Xxo_getattro}, {Py_tp_setattr, Xxo_setattr}, @@ -109,8 +110,7 @@ }; static PyType_Spec Xxo_Type_spec = { - "xxmodule.Xxo", - NULL, + "xxlimited.Xxo", sizeof(XxoObject), 0, Py_TPFLAGS_DEFAULT, @@ -178,7 +178,6 @@ "xxlimited.Str", 0, 0, - 0, Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, Str_Type_slots }; @@ -201,7 +200,6 @@ static PyType_Spec Null_Type_spec = { "xxlimited.Null", - NULL, /* doc */ 0, /* basicsize */ 0, /* itemsize */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, @@ -230,7 +228,7 @@ static struct PyModuleDef xxmodule = { PyModuleDef_HEAD_INIT, - "xx", + "xxlimited", module_doc, -1, xx_methods, @@ -264,7 +262,7 @@ /* Add some symbolic constants to the module */ if (ErrorObject == NULL) { - ErrorObject = PyErr_NewException("xx.error", NULL, NULL); + ErrorObject = PyErr_NewException("xxlimited.error", NULL, NULL); if (ErrorObject == NULL) goto fail; } Modified: python/branches/py3k/Tools/scripts/abitype.py ============================================================================== --- python/branches/py3k/Tools/scripts/abitype.py (original) +++ python/branches/py3k/Tools/scripts/abitype.py Fri Feb 11 21:47:49 2011 @@ -162,7 +162,7 @@ res = [] res.append('static PyType_Slot %s_slots[] = {' % name) # defaults for spec - spec = { 'tp_doc':'NULL', 'tp_itemsize':'0' } + spec = { 'tp_itemsize':'0' } for i, val in enumerate(fields): if val.endswith('0'): continue @@ -174,7 +174,6 @@ res.append('};') res.append('static PyType_Spec %s_spec = {' % name) res.append(' %s,' % spec['tp_name']) - res.append(' %s,' % spec['tp_doc']) res.append(' %s,' % spec['tp_basicsize']) res.append(' %s,' % spec['tp_itemsize']) res.append(' %s,' % spec['tp_flags']) From python-checkins at python.org Fri Feb 11 21:50:24 2011 From: python-checkins at python.org (martin.v.loewis) Date: Fri, 11 Feb 2011 21:50:24 +0100 (CET) Subject: [Python-checkins] r88401 - in python/branches/py3k: Include/typeslots.h Misc/NEWS Objects/typeslots.inc Message-ID: <20110211205024.A904EEE987@mail.python.org> Author: martin.v.loewis Date: Fri Feb 11 21:50:24 2011 New Revision: 88401 Log: Issue #11134: Add missing fields to typeslots.h. Reviewed by Georg Brandl. 
Modified: python/branches/py3k/Include/typeslots.h python/branches/py3k/Misc/NEWS python/branches/py3k/Objects/typeslots.inc Modified: python/branches/py3k/Include/typeslots.h ============================================================================== --- python/branches/py3k/Include/typeslots.h (original) +++ python/branches/py3k/Include/typeslots.h Fri Feb 11 21:50:24 2011 @@ -71,3 +71,6 @@ #define Py_tp_setattro 69 #define Py_tp_str 70 #define Py_tp_traverse 71 +#define Py_tp_members 72 +#define Py_tp_getset 73 +#define Py_tp_free 74 Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Fri Feb 11 21:50:24 2011 @@ -10,6 +10,8 @@ Core and Builtins ----------------- +- Issue #11134: Add missing fields to typeslots.h. + - Issue #11135: Remove redundant doc field from PyType_Spec. - Issue #11067: Add PyType_GetFlags, to support PyUnicode_Check Modified: python/branches/py3k/Objects/typeslots.inc ============================================================================== --- python/branches/py3k/Objects/typeslots.inc (original) +++ python/branches/py3k/Objects/typeslots.inc Fri Feb 11 21:50:24 2011 @@ -70,3 +70,6 @@ offsetof(PyHeapTypeObject, ht_type.tp_setattro), offsetof(PyHeapTypeObject, ht_type.tp_str), offsetof(PyHeapTypeObject, ht_type.tp_traverse), +offsetof(PyHeapTypeObject, ht_type.tp_members), +offsetof(PyHeapTypeObject, ht_type.tp_getset), +offsetof(PyHeapTypeObject, ht_type.tp_free), From python-checkins at python.org Fri Feb 11 23:37:17 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 23:37:17 +0100 (CET) Subject: [Python-checkins] r88402 - python/branches/py3k/Doc/library/compileall.rst Message-ID: <20110211223717.08F69EE98E@mail.python.org> Author: r.david.murray Date: Fri Feb 11 23:37:16 2011 New Revision: 88402 Log: Fix argument name typo in compileall docs. Modified: python/branches/py3k/Doc/library/compileall.rst Modified: python/branches/py3k/Doc/library/compileall.rst ============================================================================== --- python/branches/py3k/Doc/library/compileall.rst (original) +++ python/branches/py3k/Doc/library/compileall.rst Fri Feb 11 23:37:16 2011 @@ -119,7 +119,7 @@ cases where the source file does not exist at the time the byte-code file is executed. - If *ra* is given, its search method is passed the full path name to the + If *rx* is given, its search method is passed the full path name to the file being compiled, and if it returns a true value, the file is not compiled and ``True`` is returned. From python-checkins at python.org Fri Feb 11 23:47:17 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 23:47:17 +0100 (CET) Subject: [Python-checkins] r88403 - in python/branches/py3k: Lib/mailbox.py Lib/test/test_mailbox.py Misc/NEWS Message-ID: <20110211224717.971D7EE989@mail.python.org> Author: r.david.murray Date: Fri Feb 11 23:47:17 2011 New Revision: 88403 Log: #11116: roll back on error during add so mailbox isn't left corrupted. 
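The user-visible effect of the fix can be sketched like this: an ``add()`` that fails part-way (here because the message carries a header that cannot be serialized as ASCII) no longer leaves a partially written message behind (``'example.mbox'`` is a placeholder path, mirroring the new test below)::

    import email
    import mailbox

    box = mailbox.mbox('example.mbox')      # placeholder path
    msg = email.message_from_string('From: Al\u00e9so\n\nbody')

    try:
        box.add(msg)        # raises; the partial write is rolled back
    except ValueError:
        pass

    box.flush()
    print(len(box))         # 0 -- the mailbox is still empty
    box.close()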
Modified: python/branches/py3k/Lib/mailbox.py python/branches/py3k/Lib/test/test_mailbox.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/mailbox.py ============================================================================== --- python/branches/py3k/Lib/mailbox.py (original) +++ python/branches/py3k/Lib/mailbox.py Fri Feb 11 23:47:17 2011 @@ -277,8 +277,11 @@ tmp_file = self._create_tmp() try: self._dump_message(message, tmp_file) - finally: - _sync_close(tmp_file) + except BaseException: + tmp_file.close() + os.remove(tmp_file.name) + raise + _sync_close(tmp_file) if isinstance(message, MaildirMessage): subdir = message.get_subdir() suffix = self.colon + message.get_info() @@ -724,9 +727,14 @@ def _append_message(self, message): """Append message to mailbox and return (start, stop) offsets.""" self._file.seek(0, 2) - self._pre_message_hook(self._file) - offsets = self._install_message(message) - self._post_message_hook(self._file) + before = self._file.tell() + try: + self._pre_message_hook(self._file) + offsets = self._install_message(message) + self._post_message_hook(self._file) + except BaseException: + self._file.truncate(before) + raise self._file.flush() self._file_length = self._file.tell() # Record current length of mailbox return offsets @@ -906,7 +914,11 @@ if self._locked: _lock_file(f) try: - self._dump_message(message, f) + try: + self._dump_message(message, f) + except BaseException: + os.remove(new_path) + raise if isinstance(message, MHMessage): self._dump_sequences(message, new_key) finally: Modified: python/branches/py3k/Lib/test/test_mailbox.py ============================================================================== --- python/branches/py3k/Lib/test/test_mailbox.py (original) +++ python/branches/py3k/Lib/test/test_mailbox.py Fri Feb 11 23:47:17 2011 @@ -107,9 +107,22 @@ 'Subject: =?unknown-8bit?b?RmFsaW5hcHThciBo4Xpob3pzeuFsbO104XNz' 'YWwuIE3hciByZW5kZWx06Ww/?=\n\n') - def test_add_nonascii_header_raises(self): + def test_add_nonascii_string_header_raises(self): with self.assertRaisesRegex(ValueError, "ASCII-only"): self._box.add(self._nonascii_msg) + self._box.flush() + self.assertEqual(len(self._box), 0) + self.assertMailboxEmpty() + + def test_add_that_raises_leaves_mailbox_empty(self): + # XXX This test will start failing when Message learns to handle + # non-ASCII string headers, and a different internal failure will + # need to be found or manufactured. + with self.assertRaises(ValueError): + self._box.add(email.message_from_string("From: Alph?so")) + self.assertEqual(len(self._box), 0) + self._box.close() + self.assertMailboxEmpty() _non_latin_bin_msg = textwrap.dedent("""\ From: foo at bar.com @@ -174,6 +187,9 @@ with self.assertWarns(DeprecationWarning): with self.assertRaisesRegex(ValueError, "ASCII-only"): self._box.add(io.StringIO(self._nonascii_msg)) + self.assertEqual(len(self._box), 0) + self._box.close() + self.assertMailboxEmpty() def test_remove(self): # Remove messages using remove() @@ -571,6 +587,9 @@ if os.name in ('nt', 'os2') or sys.platform == 'cygwin': self._box.colon = '!' 
+ def assertMailboxEmpty(self): + self.assertEqual(os.listdir(os.path.join(self._path, 'tmp')), []) + def test_add_MM(self): # Add a MaildirMessage instance msg = mailbox.MaildirMessage(self._template % 0) @@ -890,6 +909,10 @@ for lock_remnant in glob.glob(self._path + '.*'): support.unlink(lock_remnant) + def assertMailboxEmpty(self): + with open(self._path) as f: + self.assertEqual(f.readlines(), []) + def test_add_from_string(self): # Add a string starting with 'From ' to the mailbox key = self._box.add('From foo at bar blah\nFrom: foo\n\n0') @@ -1012,6 +1035,9 @@ _factory = lambda self, path, factory=None: mailbox.MH(path, factory) + def assertMailboxEmpty(self): + self.assertEqual(os.listdir(self._path), ['.mh_sequences']) + def test_list_folders(self): # List folders self._box.add_folder('one') @@ -1144,6 +1170,10 @@ _factory = lambda self, path, factory=None: mailbox.Babyl(path, factory) + def assertMailboxEmpty(self): + with open(self._path) as f: + self.assertEqual(f.readlines(), []) + def tearDown(self): super().tearDown() self._box.close() Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Fri Feb 11 23:47:17 2011 @@ -22,6 +22,9 @@ Library ------- +- Issue #11116: any error during addition of a message to a mailbox now causes + a rollback, instead of leaving the mailbox partially modified. + - Issue #11132: Fix passing of "optimize" parameter when recursing in compileall.compile_dir(). From python-checkins at python.org Fri Feb 11 23:51:47 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 23:51:47 +0100 (CET) Subject: [Python-checkins] r88404 - python/branches/release31-maint Message-ID: <20110211225147.82349EE98E@mail.python.org> Author: r.david.murray Date: Fri Feb 11 23:51:47 2011 New Revision: 88404 Log: Blocked revisions 88402-88403 via svnmerge ........ r88402 | r.david.murray | 2011-02-11 17:37:16 -0500 (Fri, 11 Feb 2011) | 2 lines Fix argument name typo in compileall docs. ........ r88403 | r.david.murray | 2011-02-11 17:47:17 -0500 (Fri, 11 Feb 2011) | 3 lines #11116: roll back on error during add so mailbox isn't left corrupted. ........ Modified: python/branches/release31-maint/ (props changed) From python-checkins at python.org Fri Feb 11 23:54:37 2011 From: python-checkins at python.org (r.david.murray) Date: Fri, 11 Feb 2011 23:54:37 +0100 (CET) Subject: [Python-checkins] r88405 - in python/branches/release27-maint: Doc/library/compileall.rst Message-ID: <20110211225437.B326CEE989@mail.python.org> Author: r.david.murray Date: Fri Feb 11 23:54:34 2011 New Revision: 88405 Log: Merged revisions 88402 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88402 | r.david.murray | 2011-02-11 17:37:16 -0500 (Fri, 11 Feb 2011) | 2 lines Fix argument name typo in compileall docs. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Doc/library/compileall.rst Modified: python/branches/release27-maint/Doc/library/compileall.rst ============================================================================== --- python/branches/release27-maint/Doc/library/compileall.rst (original) +++ python/branches/release27-maint/Doc/library/compileall.rst Fri Feb 11 23:54:34 2011 @@ -100,7 +100,7 @@ cases where the source file does not exist at the time the byte-code file is executed. 
- If *ra* is given, its search method is passed the full path name to the + If *rx* is given, its search method is passed the full path name to the file being compiled, and if it returns a true value, the file is not compiled and ``True`` is returned. From python-checkins at python.org Sat Feb 12 00:03:13 2011 From: python-checkins at python.org (r.david.murray) Date: Sat, 12 Feb 2011 00:03:13 +0100 (CET) Subject: [Python-checkins] r88406 - in python/branches/release27-maint: Lib/mailbox.py Misc/NEWS Message-ID: <20110211230313.437ADEE989@mail.python.org> Author: r.david.murray Date: Sat Feb 12 00:03:13 2011 New Revision: 88406 Log: Merged revisions 88403 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k Test not backported since they depend on 3.x quirks. Not easy to rewrite them for 2.7. ........ r88403 | r.david.murray | 2011-02-11 17:47:17 -0500 (Fri, 11 Feb 2011) | 3 lines #11116: roll back on error during add so mailbox isn't left corrupted. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Lib/mailbox.py python/branches/release27-maint/Misc/NEWS Modified: python/branches/release27-maint/Lib/mailbox.py ============================================================================== --- python/branches/release27-maint/Lib/mailbox.py (original) +++ python/branches/release27-maint/Lib/mailbox.py Sat Feb 12 00:03:13 2011 @@ -253,8 +253,11 @@ tmp_file = self._create_tmp() try: self._dump_message(message, tmp_file) - finally: - _sync_close(tmp_file) + except BaseException: + tmp_file.close() + os.remove(tmp_file.name) + raise + _sync_close(tmp_file) if isinstance(message, MaildirMessage): subdir = message.get_subdir() suffix = self.colon + message.get_info() @@ -700,9 +703,14 @@ def _append_message(self, message): """Append message to mailbox and return (start, stop) offsets.""" self._file.seek(0, 2) - self._pre_message_hook(self._file) - offsets = self._install_message(message) - self._post_message_hook(self._file) + before = self._file.tell() + try: + self._pre_message_hook(self._file) + offsets = self._install_message(message) + self._post_message_hook(self._file) + except BaseException: + self._file.truncate(before) + raise self._file.flush() self._file_length = self._file.tell() # Record current length of mailbox return offsets @@ -872,7 +880,11 @@ if self._locked: _lock_file(f) try: - self._dump_message(message, f) + try: + self._dump_message(message, f) + except BaseException: + os.remove(new_path) + raise if isinstance(message, MHMessage): self._dump_sequences(message, new_key) finally: Modified: python/branches/release27-maint/Misc/NEWS ============================================================================== --- python/branches/release27-maint/Misc/NEWS (original) +++ python/branches/release27-maint/Misc/NEWS Sat Feb 12 00:03:13 2011 @@ -37,6 +37,9 @@ Library ------- +- Issue #11116: any error during addition of a message to a mailbox now causes + a rollback, instead of leaving the mailbox partially modified. + - Issue #8275: Fix passing of callback arguments with ctypes under Win64. Patch by Stan Mihai. 
From python-checkins at python.org Sat Feb 12 01:03:31 2011 From: python-checkins at python.org (r.david.murray) Date: Sat, 12 Feb 2011 01:03:31 +0100 (CET) Subject: [Python-checkins] r88407 - python/branches/py3k/Lib/mailbox.py Message-ID: <20110212000331.98E1CEE9B6@mail.python.org> Author: r.david.murray Date: Sat Feb 12 01:03:31 2011 New Revision: 88407 Log: Fix #11116 fix on Windows (close file before removing in MH code) Modified: python/branches/py3k/Lib/mailbox.py Modified: python/branches/py3k/Lib/mailbox.py ============================================================================== --- python/branches/py3k/Lib/mailbox.py (original) +++ python/branches/py3k/Lib/mailbox.py Sat Feb 12 01:03:31 2011 @@ -910,6 +910,7 @@ new_key = max(keys) + 1 new_path = os.path.join(self._path, str(new_key)) f = _create_carefully(new_path) + closed = False try: if self._locked: _lock_file(f) @@ -917,6 +918,11 @@ try: self._dump_message(message, f) except BaseException: + # Unlock and close so it can be deleted on Windows + if self._locked: + _unlock_file(f) + _sync_close(f) + closed = True os.remove(new_path) raise if isinstance(message, MHMessage): @@ -925,7 +931,8 @@ if self._locked: _unlock_file(f) finally: - _sync_close(f) + if not closed: + _sync_close(f) return new_key def remove(self, key): From python-checkins at python.org Sat Feb 12 01:59:34 2011 From: python-checkins at python.org (brett.cannon) Date: Sat, 12 Feb 2011 01:59:34 +0100 Subject: [Python-checkins] devguide: Fill in some gaps as found at the doc guide at python.org/dev/ (so the latter Message-ID: brett.cannon pushed d407e6d82090 to devguide: http://hg.python.org/devguide/rev/d407e6d82090 changeset: 294:d407e6d82090 tag: tip parent: 273:f5307c79683b user: Brett Cannon date: Fri Feb 11 16:59:22 2011 -0800 summary: Fill in some gaps as found at the doc guide at python.org/dev/ (so the latter can simply redirect to here). files: docquality.rst diff --git a/docquality.rst b/docquality.rst --- a/docquality.rst +++ b/docquality.rst @@ -14,6 +14,24 @@ documentation (which allows you to see how your changes will look along with validating that your new markup is correct). +The current in-development version of the documentation is available at +http://docs.python.org/dev/. It is re-generated from source once a day from the +``Doc/tools/dailybuild.py`` script as found in Python's source tree. + +If you would like a technical documentation style guide, the `Apple +Publications Style Guide +`_ +is recommended. + +If you care to get more involved with documentation, you may also consider +subscribing to the +`docs at python.org mailing list `_. +Documentation issues reported on the `issue tracker`_ are sent here as well as +some bug reports being directly emailed to the mailing list. There is also the +`docs-sig at python.org mailing list +`_ which discusses the +documentation toolchain, projects, standards, etc. + .. _Documenting Python: http://docs.python.org/py3k/documenting/index.html -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 02:19:19 2011 From: python-checkins at python.org (brett.cannon) Date: Sat, 12 Feb 2011 02:19:19 +0100 Subject: [Python-checkins] devguide: Drop the version number; docs are at final and now a living document. 
Message-ID: brett.cannon pushed db65614f08f8 to devguide: http://hg.python.org/devguide/rev/db65614f08f8 changeset: 295:db65614f08f8 tag: tip user: Brett Cannon date: Fri Feb 11 17:19:13 2011 -0800 summary: Drop the version number; docs are at final and now a living document. files: conf.py diff --git a/conf.py b/conf.py --- a/conf.py +++ b/conf.py @@ -50,9 +50,9 @@ # built documents. # # The short X.Y version. -version = '1.0' +version = '' # The full version, including alpha/beta/rc tags. -release = '1.0b1' +release = '' # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 02:19:55 2011 From: python-checkins at python.org (brett.cannon) Date: Sat, 12 Feb 2011 02:19:55 +0100 Subject: [Python-checkins] devguide: Merge from default Message-ID: brett.cannon pushed 204d86e42be6 to devguide: http://hg.python.org/devguide/rev/204d86e42be6 changeset: 296:204d86e42be6 branch: hg_transition tag: tip parent: 293:49c866425446 parent: 295:db65614f08f8 user: Brett Cannon date: Fri Feb 11 17:19:49 2011 -0800 summary: Merge from default files: diff --git a/conf.py b/conf.py --- a/conf.py +++ b/conf.py @@ -50,9 +50,9 @@ # built documents. # # The short X.Y version. -version = '1.0' +version = '' # The full version, including alpha/beta/rc tags. -release = '1.0b1' +release = '' # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. diff --git a/docquality.rst b/docquality.rst --- a/docquality.rst +++ b/docquality.rst @@ -14,6 +14,24 @@ documentation (which allows you to see how your changes will look along with validating that your new markup is correct). +The current in-development version of the documentation is available at +http://docs.python.org/dev/. It is re-generated from source once a day from the +``Doc/tools/dailybuild.py`` script as found in Python's source tree. + +If you would like a technical documentation style guide, the `Apple +Publications Style Guide +`_ +is recommended. + +If you care to get more involved with documentation, you may also consider +subscribing to the +`docs at python.org mailing list `_. +Documentation issues reported on the `issue tracker`_ are sent here as well as +some bug reports being directly emailed to the mailing list. There is also the +`docs-sig at python.org mailing list +`_ which discusses the +documentation toolchain, projects, standards, etc. + .. _Documenting Python: http://docs.python.org/py3k/documenting/index.html -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 03:03:56 2011 From: python-checkins at python.org (r.david.murray) Date: Sat, 12 Feb 2011 03:03:56 +0100 (CET) Subject: [Python-checkins] r88408 - in python/branches/release27-maint: Lib/mailbox.py Message-ID: <20110212020356.32825EE983@mail.python.org> Author: r.david.murray Date: Sat Feb 12 03:03:56 2011 New Revision: 88408 Log: Merged revisions 88407 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88407 | r.david.murray | 2011-02-11 19:03:31 -0500 (Fri, 11 Feb 2011) | 2 lines Fix #11116 fix on Windows (close file before removing in MH code) ........ 
Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Lib/mailbox.py Modified: python/branches/release27-maint/Lib/mailbox.py ============================================================================== --- python/branches/release27-maint/Lib/mailbox.py (original) +++ python/branches/release27-maint/Lib/mailbox.py Sat Feb 12 03:03:56 2011 @@ -876,6 +876,7 @@ new_key = max(keys) + 1 new_path = os.path.join(self._path, str(new_key)) f = _create_carefully(new_path) + closed = False try: if self._locked: _lock_file(f) @@ -883,6 +884,11 @@ try: self._dump_message(message, f) except BaseException: + # Unlock and close so it can be deleted on Windows + if self._locked: + _unlock_file(f) + _sync_close(f) + closed = True os.remove(new_path) raise if isinstance(message, MHMessage): @@ -891,7 +897,8 @@ if self._locked: _unlock_file(f) finally: - _sync_close(f) + if not closed: + _sync_close(f) return new_key def remove(self, key): From solipsis at pitrou.net Sat Feb 12 05:10:06 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 12 Feb 2011 05:10:06 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88407): sum=-56 Message-ID: py3k results for svn r88407 (hg cset 9b1daa80168d) -------------------------------------------------- test_pyexpat leaked [0, 0, -56] references, sum=-56 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogvQ8rpW', '-x'] From python-checkins at python.org Sat Feb 12 08:32:04 2011 From: python-checkins at python.org (georg.brandl) Date: Sat, 12 Feb 2011 08:32:04 +0100 (CET) Subject: [Python-checkins] r88409 - python/branches/release31-maint/Doc/library/os.rst Message-ID: <20110212073204.731CCEE9FA@mail.python.org> Author: georg.brandl Date: Sat Feb 12 08:32:02 2011 New Revision: 88409 Log: Fix markup problems. Modified: python/branches/release31-maint/Doc/library/os.rst Modified: python/branches/release31-maint/Doc/library/os.rst ============================================================================== --- python/branches/release31-maint/Doc/library/os.rst (original) +++ python/branches/release31-maint/Doc/library/os.rst Sat Feb 12 08:32:02 2011 @@ -952,7 +952,7 @@ .. function:: lstat(path) - Perform the equivalent of an :c:func:`lstat` system call on the given path. + Perform the equivalent of an :cfunc:`lstat` system call on the given path. Similar to :func:`~os.stat`, but does not follow symbolic links. On platforms that do not support symbolic links, this is an alias for :func:`~os.stat`. @@ -1139,11 +1139,11 @@ .. function:: stat(path) - Perform the equivalent of a :c:func:`stat` system call on the given path. + Perform the equivalent of a :cfunc:`stat` system call on the given path. (This function follows symlinks; to stat a symlink use :func:`lstat`.) The return value is an object whose attributes correspond to the members - of the :c:type:`stat` structure, namely: + of the :ctype:`stat` structure, namely: * :attr:`st_mode` - protection bits, * :attr:`st_ino` - inode number, From python-checkins at python.org Sat Feb 12 08:32:17 2011 From: python-checkins at python.org (georg.brandl) Date: Sat, 12 Feb 2011 08:32:17 +0100 (CET) Subject: [Python-checkins] r88410 - python/branches/release27-maint/Doc/library/os.rst Message-ID: <20110212073217.2CC1FEEA0F@mail.python.org> Author: georg.brandl Date: Sat Feb 12 08:32:17 2011 New Revision: 88410 Log: Fix markup problems. 
Modified: python/branches/release27-maint/Doc/library/os.rst Modified: python/branches/release27-maint/Doc/library/os.rst ============================================================================== --- python/branches/release27-maint/Doc/library/os.rst (original) +++ python/branches/release27-maint/Doc/library/os.rst Sat Feb 12 08:32:17 2011 @@ -1114,7 +1114,7 @@ .. function:: lstat(path) - Perform the equivalent of an :c:func:`lstat` system call on the given path. + Perform the equivalent of an :cfunc:`lstat` system call on the given path. Similar to :func:`~os.stat`, but does not follow symbolic links. On platforms that do not support symbolic links, this is an alias for :func:`~os.stat`. @@ -1315,11 +1315,11 @@ .. function:: stat(path) - Perform the equivalent of a :c:func:`stat` system call on the given path. + Perform the equivalent of a :cfunc:`stat` system call on the given path. (This function follows symlinks; to stat a symlink use :func:`lstat`.) The return value is an object whose attributes correspond to the members - of the :c:type:`stat` structure, namely: + of the :ctype:`stat` structure, namely: * :attr:`st_mode` - protection bits, * :attr:`st_ino` - inode number, From python-checkins at python.org Sat Feb 12 08:49:19 2011 From: python-checkins at python.org (georg.brandl) Date: Sat, 12 Feb 2011 08:49:19 +0100 Subject: [Python-checkins] devguide: Add some punctuation. Message-ID: georg.brandl pushed b560997b365d to devguide: http://hg.python.org/devguide/rev/b560997b365d changeset: 297:b560997b365d tag: tip parent: 295:db65614f08f8 user: Georg Brandl date: Sat Feb 12 08:57:28 2011 +0100 summary: Add some punctuation. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -49,11 +49,11 @@ .. warning:: Running the entire test suite under coverage (using either technique listed below) currently fails as some tests are resetting the trace function; - see http://bugs.python.org/issue10990 for a patch + see http://bugs.python.org/issue10990 for a patch. There are also various tests that simply fail as they have not been made robust in the face of coverage measuring/having a trace function set; - see http://bugs.python.org/issue10992 for a patch + see http://bugs.python.org/issue10992 for a patch. It should be noted that a quirk of running coverage over Python's own stdlib is that certain modules are imported as part of interpreter startup. Those modules @@ -170,7 +170,7 @@ If you are running coverage over the entire test suite, make sure to add ``-x test_importlib test_runpy test_trace`` to exclude those tests as they trigger exceptions during coverage; see - http://bugs.python.org/issue10541 and http://bugs.python.org/issue10991 + http://bugs.python.org/issue10541 and http://bugs.python.org/issue10991. 
Once the tests are done you will find the directory you specified contains files for each executed module along with which lines were executed how many -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 21:50:56 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 21:50:56 +0100 Subject: [Python-checkins] devguide: Merge from default Message-ID: antoine.pitrou pushed 4a3ea9e0b132 to devguide: http://hg.python.org/devguide/rev/4a3ea9e0b132 changeset: 298:4a3ea9e0b132 branch: hg_transition tag: tip parent: 296:204d86e42be6 parent: 297:b560997b365d user: Antoine Pitrou date: Sat Feb 12 21:50:53 2011 +0100 summary: Merge from default files: diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -49,11 +49,11 @@ .. warning:: Running the entire test suite under coverage (using either technique listed below) currently fails as some tests are resetting the trace function; - see http://bugs.python.org/issue10990 for a patch + see http://bugs.python.org/issue10990 for a patch. There are also various tests that simply fail as they have not been made robust in the face of coverage measuring/having a trace function set; - see http://bugs.python.org/issue10992 for a patch + see http://bugs.python.org/issue10992 for a patch. It should be noted that a quirk of running coverage over Python's own stdlib is that certain modules are imported as part of interpreter startup. Those modules @@ -170,7 +170,7 @@ If you are running coverage over the entire test suite, make sure to add ``-x test_importlib test_runpy test_trace`` to exclude those tests as they trigger exceptions during coverage; see - http://bugs.python.org/issue10541 and http://bugs.python.org/issue10991 + http://bugs.python.org/issue10541 and http://bugs.python.org/issue10991. Once the tests are done you will find the directory you specified contains files for each executed module along with which lines were executed how many -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 22:34:39 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 22:34:39 +0100 Subject: [Python-checkins] devguide: Comment out the "make patchcheck" advice, since it doesn't work for a Message-ID: antoine.pitrou pushed f22bac464e11 to devguide: http://hg.python.org/devguide/rev/f22bac464e11 changeset: 299:f22bac464e11 branch: hg_transition user: Antoine Pitrou date: Sat Feb 12 21:59:56 2011 +0100 summary: Comment out the "make patchcheck" advice, since it doesn't work for a non-SVN workflow. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -115,13 +115,15 @@ Generation '''''''''' -To perform a quick sanity check on your patch, you can run:: +.. XXX [commented out] make patchcheck doesn't work with non-SVN workflow - make patchcheck + To perform a quick sanity check on your patch, you can run:: -This will check and/or fix various common things people forget to do for -patches, such as adding any new files needing for the patch to work (do not -that not all checks apply to non-core developers). + make patchcheck + + This will check and/or fix various common things people forget to do for + patches, such as adding any new files needing for the patch to work (do not + that not all checks apply to non-core developers). The following instructions assume you are using the :ref:`mq approach ` suggested earlier. 
To create your patch, first check -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 22:34:39 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 22:34:39 +0100 Subject: [Python-checkins] devguide: Tweak wording and subsections Message-ID: antoine.pitrou pushed 608b362afbc8 to devguide: http://hg.python.org/devguide/rev/608b362afbc8 changeset: 300:608b362afbc8 branch: hg_transition user: Antoine Pitrou date: Sat Feb 12 22:30:40 2011 +0100 summary: Tweak wording and subsections files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -14,21 +14,20 @@ Mercurial allows for various workflows according to each person's or project's preference. We present here a very simple solution based on mq_ -(Mercurial Queues) non-core developers. You are welcome to use any approach you -like (including a svn-like approach of simply never saving any changes you make -to your working copy and using ``hg diff`` to create a patch). Usage of mq is -merely a suggestion; it's a balance between being able to do everything needed +(*Mercurial Queues*). You are welcome to use any approach you like (including +a svn-like approach of simply never saving any changes you make to your working +copy and using ``hg diff`` to create a patch). Usage of mq_ is merely a +suggestion; it's a balance between being able to do everything needed while allowing for more powerful usage if desired in the future. -If you have not done so previously, make sure that the extension has been -turned on in your ``.hgrc`` or ``Mercurial.ini`` file:: +First make sure that the extension has been turned on in your ``.hgrc`` or +``Mercurial.ini`` file:: [extensions] mq = You can verify this is working properly by running ``hg help mq``. - Before you start modifying things in your working copy, type:: hg qnew mywork @@ -40,26 +39,24 @@ hg qrefresh This will update the patch to contain all of the changes you have made up to -this point. If you have any you have added or removed, use ``hg add`` or ``hg +this point. If you have added or removed any file, use ``hg add`` or ``hg remove``, respectively, before running ``hg qrefresh``. -When you are done with your work, you can create a patch to upload to the -`issue tracker`_ with:: +Later on, we will explain :ref:`how to generate a patch `. - hg qdiff > patch.diff +If you want to delete your changes irrevocably (either because they were +committed, or they ended up uninteresting), use:: -When you are done with your changes, you can delete them with:: - + hg qpop mywork hg qdelete mywork -For more advanced usage of mq, read the `mq chapter -`_ -of `Mercurial: The Definitive Guide `_. +.. seealso:: + For more advanced usage of mq, read the `mq chapter + `_ + of `Mercurial: The Definitive Guide `_. -You can obviously use other workflows if you choose, just please make sure that -the generated patch is *flat* instead of containing a diff for each changeset -(e.g., if you have two or more changesets to a single file it should show up as -a single change diff against the file instead of two separate ones). + Also, regardless of your workflow, refer to the :ref:`FAQ ` for + more information on using Mercurial. .. _issue tracker: http://bugs.python.org .. _mq: http://mercurial.selenic.com/wiki/MqExtension @@ -112,6 +109,8 @@ .. _Python Software Foundation: http://www.python.org/psf/ +.. 
_patch-generation: + Generation '''''''''' @@ -125,15 +124,23 @@ patches, such as adding any new files needing for the patch to work (do not that not all checks apply to non-core developers). -The following instructions assume you are using the :ref:`mq approach -` suggested earlier. To create your patch, first check -that all your local changes have been committed, then type the following:: +Assume you are using the :ref:`mq approach ` suggested earlier, +first check that all your local changes have been recorded (using +``hg qrefresh``), then type the following:: hg qdiff > mywork.patch -To apply a patch generated this way, do:: +If you are using another approach, you probably need to find out the right +invocation of ``hg diff`` for your purposes. Just please make sure that you +generate a **single, condensed** patch rather than a series of several changesets. - hg qimport mywork.patch + +Importing +''''''''' + +To apply a patch generated in the way above by someone else, do:: + + hg qimport mywork.patch This will create a patch in your queue with a name that matches the filename. You can use the ``-n`` argument to specify a different name. @@ -149,9 +156,6 @@ hg qdelete mywork.patch -Please refer to the :ref:`FAQ ` for :ref:`more information -` on how to manage your local changes. - .. note:: The ``patch`` program is not available by default under Windows. You can find it `here `_, courtesy of the `GnuWin32 `_ project. @@ -159,7 +163,6 @@ to apply Unix-generated patches under Windows. - Submitting ---------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 22:34:40 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 22:34:40 +0100 Subject: [Python-checkins] devguide: Re-add anchor. Message-ID: antoine.pitrou pushed 8128df0931a0 to devguide: http://hg.python.org/devguide/rev/8128df0931a0 changeset: 301:8128df0931a0 branch: hg_transition tag: tip user: Antoine Pitrou date: Sat Feb 12 22:34:33 2011 +0100 summary: Re-add anchor. files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -56,7 +56,7 @@ of `Mercurial: The Definitive Guide `_. Also, regardless of your workflow, refer to the :ref:`FAQ ` for - more information on using Mercurial. + :ref:`more information on using Mercurial `. .. _issue tracker: http://bugs.python.org .. _mq: http://mercurial.selenic.com/wiki/MqExtension -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 23:02:37 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 23:02:37 +0100 Subject: [Python-checkins] devguide: Move the section about applying patches to the FAQ (it's not really part Message-ID: antoine.pitrou pushed eb33bbf74c1b to devguide: http://hg.python.org/devguide/rev/eb33bbf74c1b changeset: 302:eb33bbf74c1b branch: hg_transition user: Antoine Pitrou date: Sat Feb 12 22:53:26 2011 +0100 summary: Move the section about applying patches to the FAQ (it's not really part of the patch creation lifecycle). files: faq.rst patch.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -181,6 +181,39 @@ .. _hg-local-workflow: +How do I apply a patch? +------------------------------------------------------------------------------- + +If you want to try out or review a patch generated using Mercurial, do:: + + patch -p1 < somework.patch + +This will apply the changes in your working copy without committing them. 
+If the patch was not created by hg (i.e., a patch created by SVN and thus lacking +any ``a``/``b`` directory prefixes in the patch), use ``-p0`` instead of ``-p1``. + +If you want to work on the patch using mq_ (Mercurial Queues), type instead:: + + hg qimport somework.patch + +This will create a patch in your queue with a name that matches the filename. +You can use the ``-n`` argument to specify a different name. + +To undo a patch imported into your working copy, simply delete the patch from +your patch queue. You do need to make sure it is not applied (``hg qtop`` will +tell you that while ``hg qpop`` will un-apply the top-most patch):: + + hg qdelete mywork.patch + +.. note:: The ``patch`` program is not available by default under Windows. + You can find it `here `_, + courtesy of the `GnuWin32 `_ project. + Also, you may find it necessary to add the "``--binary``" option when trying + to apply Unix-generated patches under Windows. + +.. _mq: http://mercurial.selenic.com/wiki/MqExtension + + How do I add a file or directory to the repository? ------------------------------------------------------------------------------- diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -135,34 +135,6 @@ generate a **single, condensed** patch rather than a series of several changesets. -Importing -''''''''' - -To apply a patch generated in the way above by someone else, do:: - - hg qimport mywork.patch - -This will create a patch in your queue with a name that matches the filename. -You can use the ``-n`` argument to specify a different name. - -If a patch was not created by hg (i.e., a patch created by svn and thus lacking -any ``a``/``b`` directory prefixes in the patch), use:: - - patch -p0 < mywork.patch - -To undo a patch imported into your working copy, simply delete the patch from -your patch queue. You do need to make sure it is not applied (``hg qtop`` will -tell you that while ``hg qpop`` will un-apply the top-most patch):: - - hg qdelete mywork.patch - -.. note:: The ``patch`` program is not available by default under Windows. - You can find it `here `_, - courtesy of the `GnuWin32 `_ project. - Also, you may find it necessary to add the "``--binary``" option when trying - to apply Unix-generated patches under Windows. - - Submitting ---------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 23:02:37 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 23:02:37 +0100 Subject: [Python-checkins] devguide: Mention that the FAQ can be read at any point. Message-ID: antoine.pitrou pushed bfa29c938ed9 to devguide: http://hg.python.org/devguide/rev/bfa29c938ed9 changeset: 303:bfa29c938ed9 branch: hg_transition tag: tip user: Antoine Pitrou date: Sat Feb 12 23:02:33 2011 +0100 summary: Mention that the FAQ can be read at any point. This isn't greatly integrated to the preceding paragraph, so improvements welcome! files: index.rst diff --git a/index.rst b/index.rst --- a/index.rst +++ b/index.rst @@ -43,6 +43,8 @@ at once, but please do not skip around within the documentation as everything is written assuming preceding documentation has been read. +You can, *however*, read the :ref:`FAQ ` at any point! 
+ * :ref:`setup` * Coding style guides * `PEP 7`_ (Style Guide for C Code) -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 12 23:44:24 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 23:44:24 +0100 Subject: [Python-checkins] hooks: Add the branch name to the notification emails' subject line Message-ID: antoine.pitrou pushed b54a59dd8f69 to hooks: http://hg.python.org/hooks/rev/b54a59dd8f69 changeset: 17:b54a59dd8f69 tag: tip user: Antoine Pitrou date: Sat Feb 12 23:40:24 2011 +0100 summary: Add the branch name to the notification emails' subject line files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -50,13 +50,19 @@ print 'no email address configured' return False + branch = ctx.branch() + if branch == 'default': + branch_insert = '' + else: + branch_insert = ' (%s)' % branch + desc = ctx.description().splitlines()[0] if len(desc) > 80: desc = desc[:80] if ' ' in desc: desc = desc.rsplit(' ', 1)[0] - subj = '%s: %s' % (path, desc) + subj = '%s%s: %s' % (path, branch_insert, desc) send(subj, FROM % user, to, '\n'.join(body)) print 'notified %s of incoming changeset %s' % (to, ctx) return False -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Sat Feb 12 23:57:20 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 12 Feb 2011 23:57:20 +0100 Subject: [Python-checkins] devguide: Replace the part about manually ssh'ing python.org with a suggestion Message-ID: antoine.pitrou pushed 0207e2830b8b to devguide: http://hg.python.org/devguide/rev/0207e2830b8b changeset: 304:0207e2830b8b branch: hg_transition tag: tip user: Antoine Pitrou date: Sat Feb 12 23:57:17 2011 +0100 summary: Replace the part about manually ssh'ing python.org with a suggestion to use the test repository. files: coredev.rst diff --git a/coredev.rst b/coredev.rst --- a/coredev.rst +++ b/coredev.rst @@ -68,19 +68,10 @@ You can verify your commit access by looking at http://www.python.org/dev/committers which lists all core developers by -username. +username. If you want to practice, there is a test repository where you can +freely commit and push any changes you like:: -.. warning:: - XXX the technique below does not work with hg at hg.python.org - -You can also execute the follow command and look for the word -"success" in the output:: - - ssh pythondev at svn.python.org - -For Windows users using Pageant:: - - c:\path\to\putty\plink.exe pythondev at svn.python.org + hg clone ssh://hg at hg.python.org/test/ hgtest An entry in the ``Misc/developers.txt`` file should also be entered for you. Typically the person who sponsored your application to become a core developer -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 13 00:01:55 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 13 Feb 2011 00:01:55 +0100 Subject: [Python-checkins] devguide: Add a link to the test repo's Web interface Message-ID: antoine.pitrou pushed a7df1a869e4a to devguide: http://hg.python.org/devguide/rev/a7df1a869e4a changeset: 305:a7df1a869e4a branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 13 00:01:52 2011 +0100 summary: Add a link to the test repo's Web interface files: coredev.rst diff --git a/coredev.rst b/coredev.rst --- a/coredev.rst +++ b/coredev.rst @@ -68,8 +68,9 @@ You can verify your commit access by looking at http://www.python.org/dev/committers which lists all core developers by -username. 
If you want to practice, there is a test repository where you can -freely commit and push any changes you like:: +username. If you want to practice, there is a `test repository +`_ where you can freely commit and push any +changes you like:: hg clone ssh://hg at hg.python.org/test/ hgtest -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 13 01:47:25 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 13 Feb 2011 01:47:25 +0100 Subject: [Python-checkins] devguide: Clean up the porting guide. Insist that the test suite is run when porting. Message-ID: antoine.pitrou pushed 5cf9b7dadebc to devguide: http://hg.python.org/devguide/rev/5cf9b7dadebc changeset: 306:5cf9b7dadebc branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 13 01:47:23 2011 +0100 summary: Clean up the porting guide. Insist that the test suite is run when porting. Try to explain differences with svnmerge. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -86,6 +86,9 @@ :abbr:`DAG (directed acyclic graph)` used by hg to work with the movement of the patch through the codebase instead of against it. +Even when porting an already committed patch, you should **still** check the +test suite runs successfully before committing the patch to another branch. + Porting Within a Major Version '''''''''''''''''''''''''''''' @@ -101,37 +104,44 @@ hg update release-31maint patch -p1 < patch.diff + # Compile; run the test suite hg commit -With the patch now committed (notice that pushing to hg.python.org is not -needed yet), you want to merge the patch up into Python 3.2. Assuming you are -doing all of your work in a single clone:: +With the patch now committed, you want to merge the patch up into Python 3.2. +This should be done *before* pushing your changes to hg.python.org, so that +the branches are in sync on the public repository. Assuming you are doing +all of your work in a single clone, do:: hg update py3k hg merge release-31maint - # Fix any conflicts; probably Misc/NEWS at least + # Fix any conflicts; compile; run the test suite hg commit + +.. note:: + *If the patch shouldn't be ported* from Python 3.1 to Python 3.2, you must + also make it explicit: merge the changes but revert them before committing:: + + hg update py3k + hg merge release-31maint + hg revert -a + hg commit + + This is necessary so that the merge gets recorded; otherwise, somebody + else will have to make a decision about your patch when they try to merge. + +When you have finished your porting work (you can port several patches one +after another in your local repository), you can push **all** outstanding +changesets to hg.python.org:: + hg push -This will get the patch working in Python 3.2 and push **both** the Python 3.1 -and Python 3.2 updates to hg.python.org. If someone has forgotten to merge -their changes from previous patches applied to Python 3.1 then they too will be -merged (hopefully this will not be the case). +This will push changes in both the Python 3.1 and Python 3.2 branches to +hg.python.org. -If you want to do the equivalent of blocking a patch in Python 3.2 that was -applied to Python 3.1, simply pull/merge the change but revert the changes -before committing:: - - # After pull/merge - hg revert -a - hg commit - hg push - -This will cause hg's DAG to note that the changes were merged while not -committing any change in the actual code. 
Porting Between Major Versions '''''''''''''''''''''''''''''' + To move a patch between, e.g., Python 3.2 and 2.7, use the `transplant extension`_. Assuming you committed in Python 2.7 first, to pull changeset #12345 into Python 3.2, do:: @@ -141,4 +151,31 @@ hg push +Differences with ``svnmerge`` +''''''''''''''''''''''''''''' + +If you are coming from SVN, you might be surprised by how Mercurial works. +Despite its name, ``svnmerge`` is different from ``hg merge``: while ``svnmerge`` +allows to cherrypick individual revisions, ``hg merge`` can only merge whole +lines of development in the repository's :abbr:`DAG (directed acyclic graph)`. +Therefore, ``hg merge`` might force you to review outstanding changesets that +haven't been merged by someone else yet. + +The way to avoid such situations is for everyone to make sure that they have +merged their commits to the ``default`` branch. Just type:: + + $ hg branches + default 3051:a7df1a869e4a + release31-maint 3012:b560997b365d (inactive) + +and check that all branches except ``default`` are marked *inactive*. This +means there is no pending changeset to merge from these branches. + + .. _transplant extension: http://mercurial.selenic.com/wiki/TransplantExtension + + +.. seealso:: + `Merging work + `_, + in `Mercurial: The Definitive Guide `_. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 13 02:10:55 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 13 Feb 2011 02:10:55 +0100 Subject: [Python-checkins] devguide: Add an XXX that transplant commits automatically (which we would like to Message-ID: antoine.pitrou pushed 2a3371e026e6 to devguide: http://hg.python.org/devguide/rev/2a3371e026e6 changeset: 307:2a3371e026e6 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 13 02:10:31 2011 +0100 summary: Add an XXX that transplant commits automatically (which we would like to avoid) files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -142,12 +142,17 @@ Porting Between Major Versions '''''''''''''''''''''''''''''' -To move a patch between, e.g., Python 3.2 and 2.7, use the `transplant +.. warning:: XXX transplant always commits automatically. This breaks the + "run the test suite before committing" rule. We could advocate + "hg transplant --mq" but combining transplant and mq is another level + of complexity. + +To move a patch between, e.g., Python 3.1 and 2.7, use the `transplant extension`_. Assuming you committed in Python 2.7 first, to pull changeset -#12345 into Python 3.2, do:: +``a7df1a869e4a`` into Python 3.1, do:: - hg transplant -s -m 12345 - # XXX any other steps required, or is it the quivalent of merged and committed? 
+ hg transplant -s a7df1a869e4a + # Compile; run the test suite hg push -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 13 02:31:33 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 13 Feb 2011 02:31:33 +0100 Subject: [Python-checkins] devguide: Fix the XXX ("transplant --mq" is not what I thought it was) Message-ID: antoine.pitrou pushed bb37f09287b9 to devguide: http://hg.python.org/devguide/rev/bb37f09287b9 changeset: 308:bb37f09287b9 branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 13 02:31:29 2011 +0100 summary: Fix the XXX ("transplant --mq" is not what I thought it was) files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -143,9 +143,9 @@ '''''''''''''''''''''''''''''' .. warning:: XXX transplant always commits automatically. This breaks the - "run the test suite before committing" rule. We could advocate - "hg transplant --mq" but combining transplant and mq is another level - of complexity. + "run the test suite before committing" rule. We could advocate using + "hg qimport -r tip -P" afterwards but that would add another level of + complexity. To move a patch between, e.g., Python 3.1 and 2.7, use the `transplant extension`_. Assuming you committed in Python 2.7 first, to pull changeset -- Repository URL: http://hg.python.org/devguide From solipsis at pitrou.net Sun Feb 13 05:05:44 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 13 Feb 2011 05:05:44 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88407): sum=-56 Message-ID: py3k results for svn r88407 (hg cset 9b1daa80168d) -------------------------------------------------- test_pyexpat leaked [0, 0, -56] references, sum=-56 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogh-FGN4', '-x'] From python-checkins at python.org Sun Feb 13 10:54:13 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 13 Feb 2011 10:54:13 +0100 (CET) Subject: [Python-checkins] r88411 - python/branches/py3k/Lib/pydoc_data/topics.py Message-ID: <20110213095413.1A85CF979@mail.python.org> Author: georg.brandl Date: Sun Feb 13 10:54:12 2011 New Revision: 88411 Log: Update pydoc topics. Modified: python/branches/py3k/Lib/pydoc_data/topics.py Modified: python/branches/py3k/Lib/pydoc_data/topics.py ============================================================================== --- python/branches/py3k/Lib/pydoc_data/topics.py (original) +++ python/branches/py3k/Lib/pydoc_data/topics.py Sun Feb 13 10:54:12 2011 @@ -1,4 +1,4 @@ -# Autogenerated by Sphinx on Sun Jan 30 14:58:50 2011 +# Autogenerated by Sphinx on Sun Feb 13 11:01:51 2011 topics = {'assert': '\nThe ``assert`` statement\n************************\n\nAssert statements are a convenient way to insert debugging assertions\ninto a program:\n\n assert_stmt ::= "assert" expression ["," expression]\n\nThe simple form, ``assert expression``, is equivalent to\n\n if __debug__:\n if not expression: raise AssertionError\n\nThe extended form, ``assert expression1, expression2``, is equivalent\nto\n\n if __debug__:\n if not expression1: raise AssertionError(expression2)\n\nThese equivalences assume that ``__debug__`` and ``AssertionError``\nrefer to the built-in variables with those names. In the current\nimplementation, the built-in variable ``__debug__`` is ``True`` under\nnormal circumstances, ``False`` when optimization is requested\n(command line option -O). 
The current code generator emits no code\nfor an assert statement when optimization is requested at compile\ntime. Note that it is unnecessary to include the source code for the\nexpression that failed in the error message; it will be displayed as\npart of the stack trace.\n\nAssignments to ``__debug__`` are illegal. The value for the built-in\nvariable is determined when the interpreter starts.\n', 'assignment': '\nAssignment statements\n*********************\n\nAssignment statements are used to (re)bind names to values and to\nmodify attributes or items of mutable objects:\n\n assignment_stmt ::= (target_list "=")+ (expression_list | yield_expression)\n target_list ::= target ("," target)* [","]\n target ::= identifier\n | "(" target_list ")"\n | "[" target_list "]"\n | attributeref\n | subscription\n | slicing\n | "*" target\n\n(See section *Primaries* for the syntax definitions for the last three\nsymbols.)\n\nAn assignment statement evaluates the expression list (remember that\nthis can be a single expression or a comma-separated list, the latter\nyielding a tuple) and assigns the single resulting object to each of\nthe target lists, from left to right.\n\nAssignment is defined recursively depending on the form of the target\n(list). When a target is part of a mutable object (an attribute\nreference, subscription or slicing), the mutable object must\nultimately perform the assignment and decide about its validity, and\nmay raise an exception if the assignment is unacceptable. The rules\nobserved by various types and the exceptions raised are given with the\ndefinition of the object types (see section *The standard type\nhierarchy*).\n\nAssignment of an object to a target list, optionally enclosed in\nparentheses or square brackets, is recursively defined as follows.\n\n* If the target list is a single target: The object is assigned to\n that target.\n\n* If the target list is a comma-separated list of targets: The object\n must be an iterable with the same number of items as there are\n targets in the target list, and the items are assigned, from left to\n right, to the corresponding targets. (This rule is relaxed as of\n Python 1.5; in earlier versions, the object had to be a tuple.\n Since strings are sequences, an assignment like ``a, b = "xy"`` is\n now legal as long as the string has the right length.)\n\n * If the target list contains one target prefixed with an asterisk,\n called a "starred" target: The object must be a sequence with at\n least as many items as there are targets in the target list, minus\n one. The first items of the sequence are assigned, from left to\n right, to the targets before the starred target. The final items\n of the sequence are assigned to the targets after the starred\n target. 
A list of the remaining items in the sequence is then\n assigned to the starred target (the list can be empty).\n\n * Else: The object must be a sequence with the same number of items\n as there are targets in the target list, and the items are\n assigned, from left to right, to the corresponding targets.\n\nAssignment of an object to a single target is recursively defined as\nfollows.\n\n* If the target is an identifier (name):\n\n * If the name does not occur in a ``global`` or ``nonlocal``\n statement in the current code block: the name is bound to the\n object in the current local namespace.\n\n * Otherwise: the name is bound to the object in the global namespace\n or the outer namespace determined by ``nonlocal``, respectively.\n\n The name is rebound if it was already bound. This may cause the\n reference count for the object previously bound to the name to reach\n zero, causing the object to be deallocated and its destructor (if it\n has one) to be called.\n\n* If the target is a target list enclosed in parentheses or in square\n brackets: The object must be an iterable with the same number of\n items as there are targets in the target list, and its items are\n assigned, from left to right, to the corresponding targets.\n\n* If the target is an attribute reference: The primary expression in\n the reference is evaluated. It should yield an object with\n assignable attributes; if this is not the case, ``TypeError`` is\n raised. That object is then asked to assign the assigned object to\n the given attribute; if it cannot perform the assignment, it raises\n an exception (usually but not necessarily ``AttributeError``).\n\n Note: If the object is a class instance and the attribute reference\n occurs on both sides of the assignment operator, the RHS expression,\n ``a.x`` can access either an instance attribute or (if no instance\n attribute exists) a class attribute. The LHS target ``a.x`` is\n always set as an instance attribute, creating it if necessary.\n Thus, the two occurrences of ``a.x`` do not necessarily refer to the\n same attribute: if the RHS expression refers to a class attribute,\n the LHS creates a new instance attribute as the target of the\n assignment:\n\n class Cls:\n x = 3 # class variable\n inst = Cls()\n inst.x = inst.x + 1 # writes inst.x as 4 leaving Cls.x as 3\n\n This description does not necessarily apply to descriptor\n attributes, such as properties created with ``property()``.\n\n* If the target is a subscription: The primary expression in the\n reference is evaluated. It should yield either a mutable sequence\n object (such as a list) or a mapping object (such as a dictionary).\n Next, the subscript expression is evaluated.\n\n If the primary is a mutable sequence object (such as a list), the\n subscript must yield an integer. If it is negative, the sequence\'s\n length is added to it. The resulting value must be a nonnegative\n integer less than the sequence\'s length, and the sequence is asked\n to assign the assigned object to its item with that index. If the\n index is out of range, ``IndexError`` is raised (assignment to a\n subscripted sequence cannot add new items to a list).\n\n If the primary is a mapping object (such as a dictionary), the\n subscript must have a type compatible with the mapping\'s key type,\n and the mapping is then asked to create a key/datum pair which maps\n the subscript to the assigned object. 
This can either replace an\n existing key/value pair with the same key value, or insert a new\n key/value pair (if no key with the same value existed).\n\n For user-defined objects, the ``__setitem__()`` method is called\n with appropriate arguments.\n\n* If the target is a slicing: The primary expression in the reference\n is evaluated. It should yield a mutable sequence object (such as a\n list). The assigned object should be a sequence object of the same\n type. Next, the lower and upper bound expressions are evaluated,\n insofar they are present; defaults are zero and the sequence\'s\n length. The bounds should evaluate to integers. If either bound is\n negative, the sequence\'s length is added to it. The resulting\n bounds are clipped to lie between zero and the sequence\'s length,\n inclusive. Finally, the sequence object is asked to replace the\n slice with the items of the assigned sequence. The length of the\n slice may be different from the length of the assigned sequence,\n thus changing the length of the target sequence, if the object\n allows it.\n\n**CPython implementation detail:** In the current implementation, the\nsyntax for targets is taken to be the same as for expressions, and\ninvalid syntax is rejected during the code generation phase, causing\nless detailed error messages.\n\nWARNING: Although the definition of assignment implies that overlaps\nbetween the left-hand side and the right-hand side are \'safe\' (for\nexample ``a, b = b, a`` swaps two variables), overlaps *within* the\ncollection of assigned-to variables are not safe! For instance, the\nfollowing program prints ``[0, 2]``:\n\n x = [0, 1]\n i = 0\n i, x[i] = 1, 2\n print(x)\n\nSee also:\n\n **PEP 3132** - Extended Iterable Unpacking\n The specification for the ``*target`` feature.\n\n\nAugmented assignment statements\n===============================\n\nAugmented assignment is the combination, in a single statement, of a\nbinary operation and an assignment statement:\n\n augmented_assignment_stmt ::= augtarget augop (expression_list | yield_expression)\n augtarget ::= identifier | attributeref | subscription | slicing\n augop ::= "+=" | "-=" | "*=" | "/=" | "//=" | "%=" | "**="\n | ">>=" | "<<=" | "&=" | "^=" | "|="\n\n(See section *Primaries* for the syntax definitions for the last three\nsymbols.)\n\nAn augmented assignment evaluates the target (which, unlike normal\nassignment statements, cannot be an unpacking) and the expression\nlist, performs the binary operation specific to the type of assignment\non the two operands, and assigns the result to the original target.\nThe target is only evaluated once.\n\nAn augmented assignment expression like ``x += 1`` can be rewritten as\n``x = x + 1`` to achieve a similar, but not exactly equal effect. In\nthe augmented version, ``x`` is only evaluated once. Also, when\npossible, the actual operation is performed *in-place*, meaning that\nrather than creating a new object and assigning that to the target,\nthe old object is modified instead.\n\nWith the exception of assigning to tuples and multiple targets in a\nsingle statement, the assignment done by augmented assignment\nstatements is handled the same way as normal assignments. 
Similarly,\nwith the exception of the possible *in-place* behavior, the binary\noperation performed by augmented assignment is the same as the normal\nbinary operations.\n\nFor targets which are attribute references, the same *caveat about\nclass and instance attributes* applies as for regular assignments.\n', 'atom-identifiers': '\nIdentifiers (Names)\n*******************\n\nAn identifier occurring as an atom is a name. See section\n*Identifiers and keywords* for lexical definition and section *Naming\nand binding* for documentation of naming and binding.\n\nWhen the name is bound to an object, evaluation of the atom yields\nthat object. When a name is not bound, an attempt to evaluate it\nraises a ``NameError`` exception.\n\n**Private name mangling:** When an identifier that textually occurs in\na class definition begins with two or more underscore characters and\ndoes not end in two or more underscores, it is considered a *private\nname* of that class. Private names are transformed to a longer form\nbefore code is generated for them. The transformation inserts the\nclass name in front of the name, with leading underscores removed, and\na single underscore inserted in front of the class name. For example,\nthe identifier ``__spam`` occurring in a class named ``Ham`` will be\ntransformed to ``_Ham__spam``. This transformation is independent of\nthe syntactical context in which the identifier is used. If the\ntransformed name is extremely long (longer than 255 characters),\nimplementation defined truncation may happen. If the class name\nconsists only of underscores, no transformation is done.\n', @@ -33,7 +33,7 @@ 'exprlists': '\nExpression lists\n****************\n\n expression_list ::= expression ( "," expression )* [","]\n\nAn expression list containing at least one comma yields a tuple. The\nlength of the tuple is the number of expressions in the list. The\nexpressions are evaluated from left to right.\n\nThe trailing comma is required only to create a single tuple (a.k.a. a\n*singleton*); it is optional in all other cases. A single expression\nwithout a trailing comma doesn\'t create a tuple, but rather yields the\nvalue of that expression. (To create an empty tuple, use an empty pair\nof parentheses: ``()``.)\n', 'floating': '\nFloating point literals\n***********************\n\nFloating point literals are described by the following lexical\ndefinitions:\n\n floatnumber ::= pointfloat | exponentfloat\n pointfloat ::= [intpart] fraction | intpart "."\n exponentfloat ::= (intpart | pointfloat) exponent\n intpart ::= digit+\n fraction ::= "." digit+\n exponent ::= ("e" | "E") ["+" | "-"] digit+\n\nNote that the integer and exponent parts are always interpreted using\nradix 10. For example, ``077e010`` is legal, and denotes the same\nnumber as ``77e10``. The allowed range of floating point literals is\nimplementation-dependent. Some examples of floating point literals:\n\n 3.14 10. .001 1e100 3.14e-10 0e0\n\nNote that numeric literals do not include a sign; a phrase like ``-1``\nis actually an expression composed of the unary operator ``-`` and the\nliteral ``1``.\n', 'for': '\nThe ``for`` statement\n*********************\n\nThe ``for`` statement is used to iterate over the elements of a\nsequence (such as a string, tuple or list) or other iterable object:\n\n for_stmt ::= "for" target_list "in" expression_list ":" suite\n ["else" ":" suite]\n\nThe expression list is evaluated once; it should yield an iterable\nobject. An iterator is created for the result of the\n``expression_list``. 
The suite is then executed once for each item\nprovided by the iterator, in the order of ascending indices. Each\nitem in turn is assigned to the target list using the standard rules\nfor assignments (see *Assignment statements*), and then the suite is\nexecuted. When the items are exhausted (which is immediately when the\nsequence is empty or an iterator raises a ``StopIteration``\nexception), the suite in the ``else`` clause, if present, is executed,\nand the loop terminates.\n\nA ``break`` statement executed in the first suite terminates the loop\nwithout executing the ``else`` clause\'s suite. A ``continue``\nstatement executed in the first suite skips the rest of the suite and\ncontinues with the next item, or with the ``else`` clause if there was\nno next item.\n\nThe suite may assign to the variable(s) in the target list; this does\nnot affect the next item assigned to it.\n\nNames in the target list are not deleted when the loop is finished,\nbut if the sequence is empty, it will not have been assigned to at all\nby the loop. Hint: the built-in function ``range()`` returns an\niterator of integers suitable to emulate the effect of Pascal\'s ``for\ni := a to b do``; e.g., ``list(range(3))`` returns the list ``[0, 1,\n2]``.\n\nNote: There is a subtlety when the sequence is being modified by the loop\n (this can only occur for mutable sequences, i.e. lists). An\n internal counter is used to keep track of which item is used next,\n and this is incremented on each iteration. When this counter has\n reached the length of the sequence the loop terminates. This means\n that if the suite deletes the current (or a previous) item from the\n sequence, the next item will be skipped (since it gets the index of\n the current item which has already been treated). Likewise, if the\n suite inserts an item in the sequence before the current item, the\n current item will be treated again the next time through the loop.\n This can lead to nasty bugs that can be avoided by making a\n temporary copy using a slice of the whole sequence, e.g.,\n\n for x in a[:]:\n if x < 0: a.remove(x)\n', - 'formatstrings': '\nFormat String Syntax\n********************\n\nThe ``str.format()`` method and the ``Formatter`` class share the same\nsyntax for format strings (although in the case of ``Formatter``,\nsubclasses can define their own format string syntax).\n\nFormat strings contain "replacement fields" surrounded by curly braces\n``{}``. Anything that is not contained in braces is considered literal\ntext, which is copied unchanged to the output. If you need to include\na brace character in the literal text, it can be escaped by doubling:\n``{{`` and ``}}``.\n\nThe grammar for a replacement field is as follows:\n\n replacement_field ::= "{" [field_name] ["!" conversion] [":" format_spec] "}"\n field_name ::= arg_name ("." attribute_name | "[" element_index "]")*\n arg_name ::= [identifier | integer]\n attribute_name ::= identifier\n element_index ::= integer | index_string\n index_string ::= +\n conversion ::= "r" | "s" | "a"\n format_spec ::= \n\nIn less formal terms, the replacement field can start with a\n*field_name* that specifies the object whose value is to be formatted\nand inserted into the output instead of the replacement field. The\n*field_name* is optionally followed by a *conversion* field, which is\npreceded by an exclamation point ``\'!\'``, and a *format_spec*, which\nis preceded by a colon ``\':\'``. 
These specify a non-default format\nfor the replacement value.\n\nSee also the *Format Specification Mini-Language* section.\n\nThe *field_name* itself begins with an *arg_name* that is either\neither a number or a keyword. If it\'s a number, it refers to a\npositional argument, and if it\'s a keyword, it refers to a named\nkeyword argument. If the numerical arg_names in a format string are\n0, 1, 2, ... in sequence, they can all be omitted (not just some) and\nthe numbers 0, 1, 2, ... will be automatically inserted in that order.\nThe *arg_name* can be followed by any number of index or attribute\nexpressions. An expression of the form ``\'.name\'`` selects the named\nattribute using ``getattr()``, while an expression of the form\n``\'[index]\'`` does an index lookup using ``__getitem__()``.\n\nChanged in version 3.1: The positional argument specifiers can be\nomitted, so ``\'{} {}\'`` is equivalent to ``\'{0} {1}\'``.\n\nSome simple format string examples:\n\n "First, thou shalt count to {0}" # References first positional argument\n "Bring me a {}" # Implicitly references the first positional argument\n "From {} to {}" # Same as "From {0} to {1}"\n "My quest is {name}" # References keyword argument \'name\'\n "Weight in tons {0.weight}" # \'weight\' attribute of first positional arg\n "Units destroyed: {players[0]}" # First element of keyword argument \'players\'.\n\nThe *conversion* field causes a type coercion before formatting.\nNormally, the job of formatting a value is done by the\n``__format__()`` method of the value itself. However, in some cases\nit is desirable to force a type to be formatted as a string,\noverriding its own definition of formatting. By converting the value\nto a string before calling ``__format__()``, the normal formatting\nlogic is bypassed.\n\nThree conversion flags are currently supported: ``\'!s\'`` which calls\n``str()`` on the value, ``\'!r\'`` which calls ``repr()`` and ``\'!a\'``\nwhich calls ``ascii()``.\n\nSome examples:\n\n "Harold\'s a clever {0!s}" # Calls str() on the argument first\n "Bring out the holy {name!r}" # Calls repr() on the argument first\n "More {!a}" # Calls ascii() on the argument first\n\nThe *format_spec* field contains a specification of how the value\nshould be presented, including such details as field width, alignment,\npadding, decimal precision and so on. Each value type can define its\nown "formatting mini-language" or interpretation of the *format_spec*.\n\nMost built-in types support a common formatting mini-language, which\nis described in the next section.\n\nA *format_spec* field can also include nested replacement fields\nwithin it. These nested replacement fields can contain only a field\nname; conversion flags and format specifications are not allowed. The\nreplacement fields within the format_spec are substituted before the\n*format_spec* string is interpreted. This allows the formatting of a\nvalue to be dynamically specified.\n\nSee the *Format examples* section for some examples.\n\n\nFormat Specification Mini-Language\n==================================\n\n"Format specifications" are used within replacement fields contained\nwithin a format string to define how individual values are presented\n(see *Format String Syntax*). They can also be passed directly to the\nbuilt-in ``format()`` function. 
Each formattable type may define how\nthe format specification is to be interpreted.\n\nMost built-in types implement the following options for format\nspecifications, although some of the formatting options are only\nsupported by the numeric types.\n\nA general convention is that an empty format string (``""``) produces\nthe same result as if you had called ``str()`` on the value. A non-\nempty format string typically modifies the result.\n\nThe general form of a *standard format specifier* is:\n\n format_spec ::= [[fill]align][sign][#][0][width][,][.precision][type]\n fill ::= \n align ::= "<" | ">" | "=" | "^"\n sign ::= "+" | "-" | " "\n width ::= integer\n precision ::= integer\n type ::= "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%"\n\nThe *fill* character can be any character other than \'{\' or \'}\'. The\npresence of a fill character is signaled by the character following\nit, which must be one of the alignment options. If the second\ncharacter of *format_spec* is not a valid alignment option, then it is\nassumed that both the fill character and the alignment option are\nabsent.\n\nThe meaning of the various alignment options is as follows:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'<\'`` | Forces the field to be left-aligned within the available |\n | | space (this is the default). |\n +-----------+------------------------------------------------------------+\n | ``\'>\'`` | Forces the field to be right-aligned within the available |\n | | space. |\n +-----------+------------------------------------------------------------+\n | ``\'=\'`` | Forces the padding to be placed after the sign (if any) |\n | | but before the digits. This is used for printing fields |\n | | in the form \'+000000120\'. This alignment option is only |\n | | valid for numeric types. |\n +-----------+------------------------------------------------------------+\n | ``\'^\'`` | Forces the field to be centered within the available |\n | | space. |\n +-----------+------------------------------------------------------------+\n\nNote that unless a minimum field width is defined, the field width\nwill always be the same size as the data to fill it, so that the\nalignment option has no meaning in this case.\n\nThe *sign* option is only valid for number types, and can be one of\nthe following:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'+\'`` | indicates that a sign should be used for both positive as |\n | | well as negative numbers. |\n +-----------+------------------------------------------------------------+\n | ``\'-\'`` | indicates that a sign should be used only for negative |\n | | numbers (this is the default behavior). |\n +-----------+------------------------------------------------------------+\n | space | indicates that a leading space should be used on positive |\n | | numbers, and a minus sign on negative numbers. |\n +-----------+------------------------------------------------------------+\n\nThe ``\'#\'`` option causes the "alternate form" to be used for the\nconversion. The alternate form is defined differently for different\ntypes. This option is only valid for integer, float, complex and\nDecimal types. 
For integers, when binary, octal, or hexadecimal output\nis used, this option adds the prefix respective ``\'0b\'``, ``\'0o\'``, or\n``\'0x\'`` to the output value. For floats, complex and Decimal the\nalternate form causes the result of the conversion to always contain a\ndecimal-point character, even if no digits follow it. Normally, a\ndecimal-point character appears in the result of these conversions\nonly if a digit follows it. In addition, for ``\'g\'`` and ``\'G\'``\nconversions, trailing zeros are not removed from the result.\n\nThe ``\',\'`` option signals the use of a comma for a thousands\nseparator. For a locale aware separator, use the ``\'n\'`` integer\npresentation type instead.\n\nChanged in version 3.1: Added the ``\',\'`` option (see also **PEP\n378**).\n\n*width* is a decimal integer defining the minimum field width. If not\nspecified, then the field width will be determined by the content.\n\nIf the *width* field is preceded by a zero (``\'0\'``) character, this\nenables zero-padding. This is equivalent to an *alignment* type of\n``\'=\'`` and a *fill* character of ``\'0\'``.\n\nThe *precision* is a decimal number indicating how many digits should\nbe displayed after the decimal point for a floating point value\nformatted with ``\'f\'`` and ``\'F\'``, or before and after the decimal\npoint for a floating point value formatted with ``\'g\'`` or ``\'G\'``.\nFor non-number types the field indicates the maximum field size - in\nother words, how many characters will be used from the field content.\nThe *precision* is not allowed for integer values.\n\nFinally, the *type* determines how the data should be presented.\n\nThe available string presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'s\'`` | String format. This is the default type for strings and |\n | | may be omitted. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'s\'``. |\n +-----------+------------------------------------------------------------+\n\nThe available integer presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'b\'`` | Binary format. Outputs the number in base 2. |\n +-----------+------------------------------------------------------------+\n | ``\'c\'`` | Character. Converts the integer to the corresponding |\n | | unicode character before printing. |\n +-----------+------------------------------------------------------------+\n | ``\'d\'`` | Decimal Integer. Outputs the number in base 10. |\n +-----------+------------------------------------------------------------+\n | ``\'o\'`` | Octal format. Outputs the number in base 8. |\n +-----------+------------------------------------------------------------+\n | ``\'x\'`` | Hex format. Outputs the number in base 16, using lower- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'X\'`` | Hex format. Outputs the number in base 16, using upper- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. 
This is the same as ``\'d\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'d\'``. |\n +-----------+------------------------------------------------------------+\n\nIn addition to the above presentation types, integers can be formatted\nwith the floating point presentation types listed below (except\n``\'n\'`` and None). When doing so, ``float()`` is used to convert the\ninteger to a floating point number before formatting.\n\nThe available presentation types for floating point and decimal values\nare:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'e\'`` | Exponent notation. Prints the number in scientific |\n | | notation using the letter \'e\' to indicate the exponent. |\n +-----------+------------------------------------------------------------+\n | ``\'E\'`` | Exponent notation. Same as ``\'e\'`` except it uses an upper |\n | | case \'E\' as the separator character. |\n +-----------+------------------------------------------------------------+\n | ``\'f\'`` | Fixed point. Displays the number as a fixed-point number. |\n +-----------+------------------------------------------------------------+\n | ``\'F\'`` | Fixed point. Same as ``\'f\'``, but converts ``nan`` to |\n | | ``NAN`` and ``inf`` to ``INF``. |\n +-----------+------------------------------------------------------------+\n | ``\'g\'`` | General format. For a given precision ``p >= 1``, this |\n | | rounds the number to ``p`` significant digits and then |\n | | formats the result in either fixed-point format or in |\n | | scientific notation, depending on its magnitude. The |\n | | precise rules are as follows: suppose that the result |\n | | formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1`` would have exponent ``exp``. Then if ``-4 <= exp |\n | | < p``, the number is formatted with presentation type |\n | | ``\'f\'`` and precision ``p-1-exp``. Otherwise, the number |\n | | is formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1``. In both cases insignificant trailing zeros are |\n | | removed from the significand, and the decimal point is |\n | | also removed if there are no remaining digits following |\n | | it. Positive and negative infinity, positive and negative |\n | | zero, and nans, are formatted as ``inf``, ``-inf``, ``0``, |\n | | ``-0`` and ``nan`` respectively, regardless of the |\n | | precision. A precision of ``0`` is treated as equivalent |\n | | to a precision of ``1``. |\n +-----------+------------------------------------------------------------+\n | ``\'G\'`` | General format. Same as ``\'g\'`` except switches to ``\'E\'`` |\n | | if the number gets too large. The representations of |\n | | infinity and NaN are uppercased, too. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. This is the same as ``\'g\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | ``\'%\'`` | Percentage. Multiplies the number by 100 and displays in |\n | | fixed (``\'f\'``) format, followed by a percent sign. 
|\n +-----------+------------------------------------------------------------+\n | None | Similar to ``\'g\'``, except with at least one digit past |\n | | the decimal point and a default precision of 12. This is |\n | | intended to match ``str()``, except you can add the other |\n | | format modifiers. |\n +-----------+------------------------------------------------------------+\n\n\nFormat examples\n===============\n\nThis section contains examples of the new format syntax and comparison\nwith the old ``%``-formatting.\n\nIn most of the cases the syntax is similar to the old\n``%``-formatting, with the addition of the ``{}`` and with ``:`` used\ninstead of ``%``. For example, ``\'%03.2f\'`` can be translated to\n``\'{:03.2f}\'``.\n\nThe new format syntax also supports new and different options, shown\nin the follow examples.\n\nAccessing arguments by position:\n\n >>> \'{0}, {1}, {2}\'.format(\'a\', \'b\', \'c\')\n \'a, b, c\'\n >>> \'{}, {}, {}\'.format(\'a\', \'b\', \'c\') # 3.1+ only\n \'a, b, c\'\n >>> \'{2}, {1}, {0}\'.format(\'a\', \'b\', \'c\')\n \'c, b, a\'\n >>> \'{2}, {1}, {0}\'.format(*\'abc\') # unpacking argument sequence\n \'c, b, a\'\n >>> \'{0}{1}{0}\'.format(\'abra\', \'cad\') # arguments\' indices can be repeated\n \'abracadabra\'\n\nAccessing arguments by name:\n\n >>> \'Coordinates: {latitude}, {longitude}\'.format(latitude=\'37.24N\', longitude=\'-115.81W\')\n \'Coordinates: 37.24N, -115.81W\'\n >>> coord = {\'latitude\': \'37.24N\', \'longitude\': \'-115.81W\'}\n >>> \'Coordinates: {latitude}, {longitude}\'.format(**coord)\n \'Coordinates: 37.24N, -115.81W\'\n\nAccessing arguments\' attributes:\n\n >>> c = 3-5j\n >>> (\'The complex number {0} is formed from the real part {0.real} \'\n ... \'and the imaginary part {0.imag}.\').format(c)\n \'The complex number (3-5j) is formed from the real part 3.0 and the imaginary part -5.0.\'\n >>> class Point:\n ... def __init__(self, x, y):\n ... self.x, self.y = x, y\n ... def __str__(self):\n ... 
return \'Point({self.x}, {self.y})\'.format(self=self)\n ...\n >>> str(Point(4, 2))\n \'Point(4, 2)\'\n\nAccessing arguments\' items:\n\n >>> coord = (3, 5)\n >>> \'X: {0[0]}; Y: {0[1]}\'.format(coord)\n \'X: 3; Y: 5\'\n\nReplacing ``%s`` and ``%r``:\n\n >>> "repr() shows quotes: {!r}; str() doesn\'t: {!s}".format(\'test1\', \'test2\')\n "repr() shows quotes: \'test1\'; str() doesn\'t: test2"\n\nAligning the text and specifying a width:\n\n >>> \'{:<30}\'.format(\'left aligned\')\n \'left aligned \'\n >>> \'{:>30}\'.format(\'right aligned\')\n \' right aligned\'\n >>> \'{:^30}\'.format(\'centered\')\n \' centered \'\n >>> \'{:*^30}\'.format(\'centered\') # use \'*\' as a fill char\n \'***********centered***********\'\n\nReplacing ``%+f``, ``%-f``, and ``% f`` and specifying a sign:\n\n >>> \'{:+f}; {:+f}\'.format(3.14, -3.14) # show it always\n \'+3.140000; -3.140000\'\n >>> \'{: f}; {: f}\'.format(3.14, -3.14) # show a space for positive numbers\n \' 3.140000; -3.140000\'\n >>> \'{:-f}; {:-f}\'.format(3.14, -3.14) # show only the minus -- same as \'{:f}; {:f}\'\n \'3.140000; -3.140000\'\n\nReplacing ``%x`` and ``%o`` and converting the value to different\nbases:\n\n >>> # format also supports binary numbers\n >>> "int: {0:d}; hex: {0:x}; oct: {0:o}; bin: {0:b}".format(42)\n \'int: 42; hex: 2a; oct: 52; bin: 101010\'\n >>> # with 0x, 0o, or 0b as prefix:\n >>> "int: {0:d}; hex: {0:#x}; oct: {0:#o}; bin: {0:#b}".format(42)\n \'int: 42; hex: 0x2a; oct: 0o52; bin: 0b101010\'\n\nUsing the comma as a thousands separator:\n\n >>> \'{:,}\'.format(1234567890)\n \'1,234,567,890\'\n\nExpressing a percentage:\n\n >>> points = 19\n >>> total = 22\n >>> \'Correct answers: {:.2%}.\'.format(points/total)\n \'Correct answers: 86.36%\'\n\nUsing type-specific formatting:\n\n >>> import datetime\n >>> d = datetime.datetime(2010, 7, 4, 12, 15, 58)\n >>> \'{:%Y-%m-%d %H:%M:%S}\'.format(d)\n \'2010-07-04 12:15:58\'\n\nNesting arguments and more complex examples:\n\n >>> for align, text in zip(\'<^>\', [\'left\', \'center\', \'right\']):\n ... \'{0:{align}{fill}16}\'.format(text, fill=align, align=align)\n ...\n \'left<<<<<<<<<<<<\'\n \'^^^^^center^^^^^\'\n \'>>>>>>>>>>>right\'\n >>>\n >>> octets = [192, 168, 0, 1]\n >>> \'{:02X}{:02X}{:02X}{:02X}\'.format(*octets)\n \'C0A80001\'\n >>> int(_, 16)\n 3232235521\n >>>\n >>> width = 5\n >>> for num in range(5,12):\n ... for base in \'dXob\':\n ... print(\'{0:{width}{base}}\'.format(num, base=base, width=width), end=\' \')\n ... print()\n ...\n 5 5 5 101\n 6 6 6 110\n 7 7 7 111\n 8 8 10 1000\n 9 9 11 1001\n 10 A 12 1010\n 11 B 13 1011\n', + 'formatstrings': '\nFormat String Syntax\n********************\n\nThe ``str.format()`` method and the ``Formatter`` class share the same\nsyntax for format strings (although in the case of ``Formatter``,\nsubclasses can define their own format string syntax).\n\nFormat strings contain "replacement fields" surrounded by curly braces\n``{}``. Anything that is not contained in braces is considered literal\ntext, which is copied unchanged to the output. If you need to include\na brace character in the literal text, it can be escaped by doubling:\n``{{`` and ``}}``.\n\nThe grammar for a replacement field is as follows:\n\n replacement_field ::= "{" [field_name] ["!" conversion] [":" format_spec] "}"\n field_name ::= arg_name ("." 
attribute_name | "[" element_index "]")*\n arg_name ::= [identifier | integer]\n attribute_name ::= identifier\n element_index ::= integer | index_string\n index_string ::= +\n conversion ::= "r" | "s" | "a"\n format_spec ::= \n\nIn less formal terms, the replacement field can start with a\n*field_name* that specifies the object whose value is to be formatted\nand inserted into the output instead of the replacement field. The\n*field_name* is optionally followed by a *conversion* field, which is\npreceded by an exclamation point ``\'!\'``, and a *format_spec*, which\nis preceded by a colon ``\':\'``. These specify a non-default format\nfor the replacement value.\n\nSee also the *Format Specification Mini-Language* section.\n\nThe *field_name* itself begins with an *arg_name* that is either\neither a number or a keyword. If it\'s a number, it refers to a\npositional argument, and if it\'s a keyword, it refers to a named\nkeyword argument. If the numerical arg_names in a format string are\n0, 1, 2, ... in sequence, they can all be omitted (not just some) and\nthe numbers 0, 1, 2, ... will be automatically inserted in that order.\nThe *arg_name* can be followed by any number of index or attribute\nexpressions. An expression of the form ``\'.name\'`` selects the named\nattribute using ``getattr()``, while an expression of the form\n``\'[index]\'`` does an index lookup using ``__getitem__()``.\n\nChanged in version 3.1: The positional argument specifiers can be\nomitted, so ``\'{} {}\'`` is equivalent to ``\'{0} {1}\'``.\n\nSome simple format string examples:\n\n "First, thou shalt count to {0}" # References first positional argument\n "Bring me a {}" # Implicitly references the first positional argument\n "From {} to {}" # Same as "From {0} to {1}"\n "My quest is {name}" # References keyword argument \'name\'\n "Weight in tons {0.weight}" # \'weight\' attribute of first positional arg\n "Units destroyed: {players[0]}" # First element of keyword argument \'players\'.\n\nThe *conversion* field causes a type coercion before formatting.\nNormally, the job of formatting a value is done by the\n``__format__()`` method of the value itself. However, in some cases\nit is desirable to force a type to be formatted as a string,\noverriding its own definition of formatting. By converting the value\nto a string before calling ``__format__()``, the normal formatting\nlogic is bypassed.\n\nThree conversion flags are currently supported: ``\'!s\'`` which calls\n``str()`` on the value, ``\'!r\'`` which calls ``repr()`` and ``\'!a\'``\nwhich calls ``ascii()``.\n\nSome examples:\n\n "Harold\'s a clever {0!s}" # Calls str() on the argument first\n "Bring out the holy {name!r}" # Calls repr() on the argument first\n "More {!a}" # Calls ascii() on the argument first\n\nThe *format_spec* field contains a specification of how the value\nshould be presented, including such details as field width, alignment,\npadding, decimal precision and so on. Each value type can define its\nown "formatting mini-language" or interpretation of the *format_spec*.\n\nMost built-in types support a common formatting mini-language, which\nis described in the next section.\n\nA *format_spec* field can also include nested replacement fields\nwithin it. These nested replacement fields can contain only a field\nname; conversion flags and format specifications are not allowed. The\nreplacement fields within the format_spec are substituted before the\n*format_spec* string is interpreted. 
This allows the formatting of a\nvalue to be dynamically specified.\n\nSee the *Format examples* section for some examples.\n\n\nFormat Specification Mini-Language\n==================================\n\n"Format specifications" are used within replacement fields contained\nwithin a format string to define how individual values are presented\n(see *Format String Syntax*). They can also be passed directly to the\nbuilt-in ``format()`` function. Each formattable type may define how\nthe format specification is to be interpreted.\n\nMost built-in types implement the following options for format\nspecifications, although some of the formatting options are only\nsupported by the numeric types.\n\nA general convention is that an empty format string (``""``) produces\nthe same result as if you had called ``str()`` on the value. A non-\nempty format string typically modifies the result.\n\nThe general form of a *standard format specifier* is:\n\n format_spec ::= [[fill]align][sign][#][0][width][,][.precision][type]\n fill ::= \n align ::= "<" | ">" | "=" | "^"\n sign ::= "+" | "-" | " "\n width ::= integer\n precision ::= integer\n type ::= "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%"\n\nThe *fill* character can be any character other than \'{\' or \'}\'. The\npresence of a fill character is signaled by the character following\nit, which must be one of the alignment options. If the second\ncharacter of *format_spec* is not a valid alignment option, then it is\nassumed that both the fill character and the alignment option are\nabsent.\n\nThe meaning of the various alignment options is as follows:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'<\'`` | Forces the field to be left-aligned within the available |\n | | space (this is the default for most objects). |\n +-----------+------------------------------------------------------------+\n | ``\'>\'`` | Forces the field to be right-aligned within the available |\n | | space (this is the default for numbers). |\n +-----------+------------------------------------------------------------+\n | ``\'=\'`` | Forces the padding to be placed after the sign (if any) |\n | | but before the digits. This is used for printing fields |\n | | in the form \'+000000120\'. This alignment option is only |\n | | valid for numeric types. |\n +-----------+------------------------------------------------------------+\n | ``\'^\'`` | Forces the field to be centered within the available |\n | | space. |\n +-----------+------------------------------------------------------------+\n\nNote that unless a minimum field width is defined, the field width\nwill always be the same size as the data to fill it, so that the\nalignment option has no meaning in this case.\n\nThe *sign* option is only valid for number types, and can be one of\nthe following:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'+\'`` | indicates that a sign should be used for both positive as |\n | | well as negative numbers. |\n +-----------+------------------------------------------------------------+\n | ``\'-\'`` | indicates that a sign should be used only for negative |\n | | numbers (this is the default behavior). 
|\n +-----------+------------------------------------------------------------+\n | space | indicates that a leading space should be used on positive |\n | | numbers, and a minus sign on negative numbers. |\n +-----------+------------------------------------------------------------+\n\nThe ``\'#\'`` option causes the "alternate form" to be used for the\nconversion. The alternate form is defined differently for different\ntypes. This option is only valid for integer, float, complex and\nDecimal types. For integers, when binary, octal, or hexadecimal output\nis used, this option adds the prefix respective ``\'0b\'``, ``\'0o\'``, or\n``\'0x\'`` to the output value. For floats, complex and Decimal the\nalternate form causes the result of the conversion to always contain a\ndecimal-point character, even if no digits follow it. Normally, a\ndecimal-point character appears in the result of these conversions\nonly if a digit follows it. In addition, for ``\'g\'`` and ``\'G\'``\nconversions, trailing zeros are not removed from the result.\n\nThe ``\',\'`` option signals the use of a comma for a thousands\nseparator. For a locale aware separator, use the ``\'n\'`` integer\npresentation type instead.\n\nChanged in version 3.1: Added the ``\',\'`` option (see also **PEP\n378**).\n\n*width* is a decimal integer defining the minimum field width. If not\nspecified, then the field width will be determined by the content.\n\nIf the *width* field is preceded by a zero (``\'0\'``) character, this\nenables zero-padding. This is equivalent to an *alignment* type of\n``\'=\'`` and a *fill* character of ``\'0\'``.\n\nThe *precision* is a decimal number indicating how many digits should\nbe displayed after the decimal point for a floating point value\nformatted with ``\'f\'`` and ``\'F\'``, or before and after the decimal\npoint for a floating point value formatted with ``\'g\'`` or ``\'G\'``.\nFor non-number types the field indicates the maximum field size - in\nother words, how many characters will be used from the field content.\nThe *precision* is not allowed for integer values.\n\nFinally, the *type* determines how the data should be presented.\n\nThe available string presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'s\'`` | String format. This is the default type for strings and |\n | | may be omitted. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'s\'``. |\n +-----------+------------------------------------------------------------+\n\nThe available integer presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'b\'`` | Binary format. Outputs the number in base 2. |\n +-----------+------------------------------------------------------------+\n | ``\'c\'`` | Character. Converts the integer to the corresponding |\n | | unicode character before printing. |\n +-----------+------------------------------------------------------------+\n | ``\'d\'`` | Decimal Integer. Outputs the number in base 10. |\n +-----------+------------------------------------------------------------+\n | ``\'o\'`` | Octal format. Outputs the number in base 8. |\n +-----------+------------------------------------------------------------+\n | ``\'x\'`` | Hex format. 
Outputs the number in base 16, using lower- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'X\'`` | Hex format. Outputs the number in base 16, using upper- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. This is the same as ``\'d\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'d\'``. |\n +-----------+------------------------------------------------------------+\n\nIn addition to the above presentation types, integers can be formatted\nwith the floating point presentation types listed below (except\n``\'n\'`` and None). When doing so, ``float()`` is used to convert the\ninteger to a floating point number before formatting.\n\nThe available presentation types for floating point and decimal values\nare:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'e\'`` | Exponent notation. Prints the number in scientific |\n | | notation using the letter \'e\' to indicate the exponent. |\n +-----------+------------------------------------------------------------+\n | ``\'E\'`` | Exponent notation. Same as ``\'e\'`` except it uses an upper |\n | | case \'E\' as the separator character. |\n +-----------+------------------------------------------------------------+\n | ``\'f\'`` | Fixed point. Displays the number as a fixed-point number. |\n +-----------+------------------------------------------------------------+\n | ``\'F\'`` | Fixed point. Same as ``\'f\'``, but converts ``nan`` to |\n | | ``NAN`` and ``inf`` to ``INF``. |\n +-----------+------------------------------------------------------------+\n | ``\'g\'`` | General format. For a given precision ``p >= 1``, this |\n | | rounds the number to ``p`` significant digits and then |\n | | formats the result in either fixed-point format or in |\n | | scientific notation, depending on its magnitude. The |\n | | precise rules are as follows: suppose that the result |\n | | formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1`` would have exponent ``exp``. Then if ``-4 <= exp |\n | | < p``, the number is formatted with presentation type |\n | | ``\'f\'`` and precision ``p-1-exp``. Otherwise, the number |\n | | is formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1``. In both cases insignificant trailing zeros are |\n | | removed from the significand, and the decimal point is |\n | | also removed if there are no remaining digits following |\n | | it. Positive and negative infinity, positive and negative |\n | | zero, and nans, are formatted as ``inf``, ``-inf``, ``0``, |\n | | ``-0`` and ``nan`` respectively, regardless of the |\n | | precision. A precision of ``0`` is treated as equivalent |\n | | to a precision of ``1``. |\n +-----------+------------------------------------------------------------+\n | ``\'G\'`` | General format. Same as ``\'g\'`` except switches to ``\'E\'`` |\n | | if the number gets too large. The representations of |\n | | infinity and NaN are uppercased, too. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. 
This is the same as ``\'g\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | ``\'%\'`` | Percentage. Multiplies the number by 100 and displays in |\n | | fixed (``\'f\'``) format, followed by a percent sign. |\n +-----------+------------------------------------------------------------+\n | None | Similar to ``\'g\'``, except with at least one digit past |\n | | the decimal point and a default precision of 12. This is |\n | | intended to match ``str()``, except you can add the other |\n | | format modifiers. |\n +-----------+------------------------------------------------------------+\n\n\nFormat examples\n===============\n\nThis section contains examples of the new format syntax and comparison\nwith the old ``%``-formatting.\n\nIn most of the cases the syntax is similar to the old\n``%``-formatting, with the addition of the ``{}`` and with ``:`` used\ninstead of ``%``. For example, ``\'%03.2f\'`` can be translated to\n``\'{:03.2f}\'``.\n\nThe new format syntax also supports new and different options, shown\nin the follow examples.\n\nAccessing arguments by position:\n\n >>> \'{0}, {1}, {2}\'.format(\'a\', \'b\', \'c\')\n \'a, b, c\'\n >>> \'{}, {}, {}\'.format(\'a\', \'b\', \'c\') # 3.1+ only\n \'a, b, c\'\n >>> \'{2}, {1}, {0}\'.format(\'a\', \'b\', \'c\')\n \'c, b, a\'\n >>> \'{2}, {1}, {0}\'.format(*\'abc\') # unpacking argument sequence\n \'c, b, a\'\n >>> \'{0}{1}{0}\'.format(\'abra\', \'cad\') # arguments\' indices can be repeated\n \'abracadabra\'\n\nAccessing arguments by name:\n\n >>> \'Coordinates: {latitude}, {longitude}\'.format(latitude=\'37.24N\', longitude=\'-115.81W\')\n \'Coordinates: 37.24N, -115.81W\'\n >>> coord = {\'latitude\': \'37.24N\', \'longitude\': \'-115.81W\'}\n >>> \'Coordinates: {latitude}, {longitude}\'.format(**coord)\n \'Coordinates: 37.24N, -115.81W\'\n\nAccessing arguments\' attributes:\n\n >>> c = 3-5j\n >>> (\'The complex number {0} is formed from the real part {0.real} \'\n ... \'and the imaginary part {0.imag}.\').format(c)\n \'The complex number (3-5j) is formed from the real part 3.0 and the imaginary part -5.0.\'\n >>> class Point:\n ... def __init__(self, x, y):\n ... self.x, self.y = x, y\n ... def __str__(self):\n ... 
return \'Point({self.x}, {self.y})\'.format(self=self)\n ...\n >>> str(Point(4, 2))\n \'Point(4, 2)\'\n\nAccessing arguments\' items:\n\n >>> coord = (3, 5)\n >>> \'X: {0[0]}; Y: {0[1]}\'.format(coord)\n \'X: 3; Y: 5\'\n\nReplacing ``%s`` and ``%r``:\n\n >>> "repr() shows quotes: {!r}; str() doesn\'t: {!s}".format(\'test1\', \'test2\')\n "repr() shows quotes: \'test1\'; str() doesn\'t: test2"\n\nAligning the text and specifying a width:\n\n >>> \'{:<30}\'.format(\'left aligned\')\n \'left aligned \'\n >>> \'{:>30}\'.format(\'right aligned\')\n \' right aligned\'\n >>> \'{:^30}\'.format(\'centered\')\n \' centered \'\n >>> \'{:*^30}\'.format(\'centered\') # use \'*\' as a fill char\n \'***********centered***********\'\n\nReplacing ``%+f``, ``%-f``, and ``% f`` and specifying a sign:\n\n >>> \'{:+f}; {:+f}\'.format(3.14, -3.14) # show it always\n \'+3.140000; -3.140000\'\n >>> \'{: f}; {: f}\'.format(3.14, -3.14) # show a space for positive numbers\n \' 3.140000; -3.140000\'\n >>> \'{:-f}; {:-f}\'.format(3.14, -3.14) # show only the minus -- same as \'{:f}; {:f}\'\n \'3.140000; -3.140000\'\n\nReplacing ``%x`` and ``%o`` and converting the value to different\nbases:\n\n >>> # format also supports binary numbers\n >>> "int: {0:d}; hex: {0:x}; oct: {0:o}; bin: {0:b}".format(42)\n \'int: 42; hex: 2a; oct: 52; bin: 101010\'\n >>> # with 0x, 0o, or 0b as prefix:\n >>> "int: {0:d}; hex: {0:#x}; oct: {0:#o}; bin: {0:#b}".format(42)\n \'int: 42; hex: 0x2a; oct: 0o52; bin: 0b101010\'\n\nUsing the comma as a thousands separator:\n\n >>> \'{:,}\'.format(1234567890)\n \'1,234,567,890\'\n\nExpressing a percentage:\n\n >>> points = 19\n >>> total = 22\n >>> \'Correct answers: {:.2%}.\'.format(points/total)\n \'Correct answers: 86.36%\'\n\nUsing type-specific formatting:\n\n >>> import datetime\n >>> d = datetime.datetime(2010, 7, 4, 12, 15, 58)\n >>> \'{:%Y-%m-%d %H:%M:%S}\'.format(d)\n \'2010-07-04 12:15:58\'\n\nNesting arguments and more complex examples:\n\n >>> for align, text in zip(\'<^>\', [\'left\', \'center\', \'right\']):\n ... \'{0:{fill}{align}16}\'.format(text, fill=align, align=align)\n ...\n \'left<<<<<<<<<<<<\'\n \'^^^^^center^^^^^\'\n \'>>>>>>>>>>>right\'\n >>>\n >>> octets = [192, 168, 0, 1]\n >>> \'{:02X}{:02X}{:02X}{:02X}\'.format(*octets)\n \'C0A80001\'\n >>> int(_, 16)\n 3232235521\n >>>\n >>> width = 5\n >>> for num in range(5,12):\n ... for base in \'dXob\':\n ... print(\'{0:{width}{base}}\'.format(num, base=base, width=width), end=\' \')\n ... print()\n ...\n 5 5 5 101\n 6 6 6 110\n 7 7 7 111\n 8 8 10 1000\n 9 9 11 1001\n 10 A 12 1010\n 11 B 13 1011\n', 'function': '\nFunction definitions\n********************\n\nA function definition defines a user-defined function object (see\nsection *The standard type hierarchy*):\n\n funcdef ::= [decorators] "def" funcname "(" [parameter_list] ")" ["->" expression] ":" suite\n decorators ::= decorator+\n decorator ::= "@" dotted_name ["(" [argument_list [","]] ")"] NEWLINE\n dotted_name ::= identifier ("." identifier)*\n parameter_list ::= (defparameter ",")*\n ( "*" [parameter] ("," defparameter)*\n [, "**" parameter]\n | "**" parameter\n | defparameter [","] )\n parameter ::= identifier [":" expression]\n defparameter ::= parameter ["=" expression]\n funcname ::= identifier\n\nA function definition is an executable statement. Its execution binds\nthe function name in the current local namespace to a function object\n(a wrapper around the executable code for the function). 
This\nfunction object contains a reference to the current global namespace\nas the global namespace to be used when the function is called.\n\nThe function definition does not execute the function body; this gets\nexecuted only when the function is called. [3]\n\nA function definition may be wrapped by one or more *decorator*\nexpressions. Decorator expressions are evaluated when the function is\ndefined, in the scope that contains the function definition. The\nresult must be a callable, which is invoked with the function object\nas the only argument. The returned value is bound to the function name\ninstead of the function object. Multiple decorators are applied in\nnested fashion. For example, the following code\n\n @f1(arg)\n @f2\n def func(): pass\n\nis equivalent to\n\n def func(): pass\n func = f1(arg)(f2(func))\n\nWhen one or more parameters have the form *parameter* ``=``\n*expression*, the function is said to have "default parameter values."\nFor a parameter with a default value, the corresponding argument may\nbe omitted from a call, in which case the parameter\'s default value is\nsubstituted. If a parameter has a default value, all following\nparameters up until the "``*``" must also have a default value ---\nthis is a syntactic restriction that is not expressed by the grammar.\n\n**Default parameter values are evaluated when the function definition\nis executed.** This means that the expression is evaluated once, when\nthe function is defined, and that that same "pre-computed" value is\nused for each call. This is especially important to understand when a\ndefault parameter is a mutable object, such as a list or a dictionary:\nif the function modifies the object (e.g. by appending an item to a\nlist), the default value is in effect modified. This is generally not\nwhat was intended. A way around this is to use ``None`` as the\ndefault, and explicitly test for it in the body of the function, e.g.:\n\n def whats_on_the_telly(penguin=None):\n if penguin is None:\n penguin = []\n penguin.append("property of the zoo")\n return penguin\n\nFunction call semantics are described in more detail in section\n*Calls*. A function call always assigns values to all parameters\nmentioned in the parameter list, either from position arguments, from\nkeyword arguments, or from default values. If the form\n"``*identifier``" is present, it is initialized to a tuple receiving\nany excess positional parameters, defaulting to the empty tuple. If\nthe form "``**identifier``" is present, it is initialized to a new\ndictionary receiving any excess keyword arguments, defaulting to a new\nempty dictionary. Parameters after "``*``" or "``*identifier``" are\nkeyword-only parameters and may only be passed used keyword arguments.\n\nParameters may have annotations of the form "``: expression``"\nfollowing the parameter name. Any parameter may have an annotation\neven those of the form ``*identifier`` or ``**identifier``. Functions\nmay have "return" annotation of the form "``-> expression``" after the\nparameter list. These annotations can be any valid Python expression\nand are evaluated when the function definition is executed.\nAnnotations may be evaluated in a different order than they appear in\nthe source code. The presence of annotations does not change the\nsemantics of a function. 
The annotation values are available as\nvalues of a dictionary keyed by the parameters\' names in the\n``__annotations__`` attribute of the function object.\n\nIt is also possible to create anonymous functions (functions not bound\nto a name), for immediate use in expressions. This uses lambda forms,\ndescribed in section *Lambdas*. Note that the lambda form is merely a\nshorthand for a simplified function definition; a function defined in\na "``def``" statement can be passed around or assigned to another name\njust like a function defined by a lambda form. The "``def``" form is\nactually more powerful since it allows the execution of multiple\nstatements and annotations.\n\n**Programmer\'s note:** Functions are first-class objects. A "``def``"\nform executed inside a function definition defines a local function\nthat can be returned or passed around. Free variables used in the\nnested function can access the local variables of the function\ncontaining the def. See section *Naming and binding* for details.\n', 'global': '\nThe ``global`` statement\n************************\n\n global_stmt ::= "global" identifier ("," identifier)*\n\nThe ``global`` statement is a declaration which holds for the entire\ncurrent code block. It means that the listed identifiers are to be\ninterpreted as globals. It would be impossible to assign to a global\nvariable without ``global``, although free variables may refer to\nglobals without being declared global.\n\nNames listed in a ``global`` statement must not be used in the same\ncode block textually preceding that ``global`` statement.\n\nNames listed in a ``global`` statement must not be defined as formal\nparameters or in a ``for`` loop control target, ``class`` definition,\nfunction definition, or ``import`` statement.\n\n**CPython implementation detail:** The current implementation does not\nenforce the latter two restrictions, but programs should not abuse\nthis freedom, as future implementations may enforce them or silently\nchange the meaning of the program.\n\n**Programmer\'s note:** the ``global`` is a directive to the parser.\nIt applies only to code parsed at the same time as the ``global``\nstatement. In particular, a ``global`` statement contained in a string\nor code object supplied to the built-in ``exec()`` function does not\naffect the code block *containing* the function call, and code\ncontained in such a string is unaffected by ``global`` statements in\nthe code containing the function call. The same applies to the\n``eval()`` and ``compile()`` functions.\n', 'id-classes': '\nReserved classes of identifiers\n*******************************\n\nCertain classes of identifiers (besides keywords) have special\nmeanings. These classes are identified by the patterns of leading and\ntrailing underscore characters:\n\n``_*``\n Not imported by ``from module import *``. The special identifier\n ``_`` is used in the interactive interpreter to store the result of\n the last evaluation; it is stored in the ``builtins`` module. When\n not in interactive mode, ``_`` has no special meaning and is not\n defined. See section *The import statement*.\n\n Note: The name ``_`` is often used in conjunction with\n internationalization; refer to the documentation for the\n ``gettext`` module for more information on this convention.\n\n``__*__``\n System-defined names. These names are defined by the interpreter\n and its implementation (including the standard library). Current\n system names are discussed in the *Special method names* section\n and elsewhere. 
More will likely be defined in future versions of\n Python. *Any* use of ``__*__`` names, in any context, that does\n not follow explicitly documented use, is subject to breakage\n without warning.\n\n``__*``\n Class-private names. Names in this category, when used within the\n context of a class definition, are re-written to use a mangled form\n to help avoid name clashes between "private" attributes of base and\n derived classes. See section *Identifiers (Names)*.\n', From python-checkins at python.org Sun Feb 13 10:59:39 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 13 Feb 2011 10:59:39 +0100 (CET) Subject: [Python-checkins] r88412 - in python/branches/py3k/Doc: howto/pyporting.rst tools/sphinxext/susp-ignored.csv Message-ID: <20110213095939.48136FA84@mail.python.org> Author: georg.brandl Date: Sun Feb 13 10:59:39 2011 New Revision: 88412 Log: Fix markup error and update suspicious ignores. Modified: python/branches/py3k/Doc/howto/pyporting.rst python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Sun Feb 13 10:59:39 2011 @@ -557,11 +557,11 @@ except ImportError: # Python 2 from distutils.command.build_py import build_py - setup(cmdclass = {'build_py':build_py}, + setup(cmdclass = {'build_py': build_py}, # ... ) - For Distribute:: +For Distribute:: setup(use_2to3=True, # ... Modified: python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv ============================================================================== --- python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv (original) +++ python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv Sun Feb 13 10:59:39 2011 @@ -383,3 +383,20 @@ whatsnew/3.2,2102,:affe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," whatsnew/3.2,2102,:deaf,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," whatsnew/3.2,2102,:feed,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +howto/pyporting,75,::,# make sure to use :: Python *and* :: Python :: 3 so +howto/pyporting,75,::,"'Programming Language :: Python'," +howto/pyporting,75,::,'Programming Language :: Python :: 3' +whatsnew/3.2,1419,:gz,">>> with tarfile.open(name='myarchive.tar.gz', mode='w:gz') as tf:" +whatsnew/3.2,2135,:directory,${buildout:directory}/downloads/dist +whatsnew/3.2,2135,:location,zope9-location = ${zope9:location} +whatsnew/3.2,2135,:prefix,zope-conf = ${custom:prefix}/etc/zope.conf +whatsnew/3.2,2178,:beef,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,2178,:cafe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,2178,:affe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,2178,:deaf,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,2178,:feed,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,2178,:beef,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,2178,:cafe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,2178,:affe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,2178,:deaf,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,2178,:feed,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," 
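The pyporting.rst hunk above shows only a fragment of the ``setup()`` call that routes 2to3 through a custom ``build_py`` command. A minimal, self-contained sketch of that single-source pattern (the project name and package below are made up for illustration; ``build_py_2to3`` is the stock distutils command class under Python 3)::

    # Sketch of the 2to3-at-build-time pattern discussed in the pyporting howto.
    # Under Python 3, build_py_2to3 converts the sources while building;
    # under Python 2, the ordinary build_py command is used unchanged.
    try:
        from distutils.command.build_py import build_py_2to3 as build_py  # Python 3
    except ImportError:
        from distutils.command.build_py import build_py                   # Python 2

    from distutils.core import setup

    setup(name='example',                     # hypothetical project metadata
          version='0.1',
          packages=['example'],
          cmdclass={'build_py': build_py})

With Distribute the same effect is requested declaratively via ``setup(use_2to3=True, ...)``, as the corrected paragraph notes.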
From python-checkins at python.org Sun Feb 13 11:00:57 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 13 Feb 2011 11:00:57 +0100 (CET) Subject: [Python-checkins] r88413 - in python/branches/py3k: Include/patchlevel.h Lib/distutils/__init__.py Lib/idlelib/idlever.py Misc/NEWS Misc/RPM/python-3.2.spec README Message-ID: <20110213100057.A74F4FA84@mail.python.org> Author: georg.brandl Date: Sun Feb 13 11:00:57 2011 New Revision: 88413 Log: Bump for 3.2rc3. Modified: python/branches/py3k/Include/patchlevel.h python/branches/py3k/Lib/distutils/__init__.py python/branches/py3k/Lib/idlelib/idlever.py python/branches/py3k/Misc/NEWS python/branches/py3k/Misc/RPM/python-3.2.spec python/branches/py3k/README Modified: python/branches/py3k/Include/patchlevel.h ============================================================================== --- python/branches/py3k/Include/patchlevel.h (original) +++ python/branches/py3k/Include/patchlevel.h Sun Feb 13 11:00:57 2011 @@ -20,10 +20,10 @@ #define PY_MINOR_VERSION 2 #define PY_MICRO_VERSION 0 #define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 2 +#define PY_RELEASE_SERIAL 3 /* Version as a string */ -#define PY_VERSION "3.2rc2+" +#define PY_VERSION "3.2rc3" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository) */ Modified: python/branches/py3k/Lib/distutils/__init__.py ============================================================================== --- python/branches/py3k/Lib/distutils/__init__.py (original) +++ python/branches/py3k/Lib/distutils/__init__.py Sun Feb 13 11:00:57 2011 @@ -15,5 +15,5 @@ # Updated automatically by the Python release process. # #--start constants-- -__version__ = "3.2rc2" +__version__ = "3.2rc3" #--end constants-- Modified: python/branches/py3k/Lib/idlelib/idlever.py ============================================================================== --- python/branches/py3k/Lib/idlelib/idlever.py (original) +++ python/branches/py3k/Lib/idlelib/idlever.py Sun Feb 13 11:00:57 2011 @@ -1 +1 @@ -IDLE_VERSION = "3.2rc2" +IDLE_VERSION = "3.2rc3" Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sun Feb 13 11:00:57 2011 @@ -2,10 +2,10 @@ Python News +++++++++++ -What's New in Python 3.2? -========================= +What's New in Python 3.2 Release Candidate 3? +============================================= -*Release date: XX-Feb-2011* +*Release date: 13-Feb-2011* Core and Builtins ----------------- @@ -46,7 +46,7 @@ Tests ----- -- Issue #10971:test_zipimport_support is once again compatible with the +- Issue #10971: test_zipimport_support is once again compatible with the refleak hunter feature of test.regrtest. 
Modified: python/branches/py3k/Misc/RPM/python-3.2.spec ============================================================================== --- python/branches/py3k/Misc/RPM/python-3.2.spec (original) +++ python/branches/py3k/Misc/RPM/python-3.2.spec Sun Feb 13 11:00:57 2011 @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 3.2rc2 +%define version 3.2rc3 %define libvers 3.2 #--end constants-- %define release 1pydotorg Modified: python/branches/py3k/README ============================================================================== --- python/branches/py3k/README (original) +++ python/branches/py3k/README Sun Feb 13 11:00:57 2011 @@ -1,4 +1,4 @@ -This is Python version 3.2, release candidate 2 +This is Python version 3.2, release candidate 3 =============================================== Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 From python-checkins at python.org Sun Feb 13 11:03:21 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 13 Feb 2011 11:03:21 +0100 (CET) Subject: [Python-checkins] r88414 - python/tags/r32rc3 Message-ID: <20110213100321.E089DF979@mail.python.org> Author: georg.brandl Date: Sun Feb 13 11:03:21 2011 New Revision: 88414 Log: Tag 3.2rc3. Added: python/tags/r32rc3/ - copied from r88413, /python/branches/py3k/ From python-checkins at python.org Sun Feb 13 11:39:22 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 13 Feb 2011 11:39:22 +0100 (CET) Subject: [Python-checkins] r88415 - in peps/trunk: pep-0101.txt pep-0392.txt Message-ID: <20110213103922.07B14FE2A@mail.python.org> Author: georg.brandl Date: Sun Feb 13 11:39:21 2011 New Revision: 88415 Log: Update schedule. Modified: peps/trunk/pep-0101.txt peps/trunk/pep-0392.txt Modified: peps/trunk/pep-0101.txt ============================================================================== --- peps/trunk/pep-0101.txt (original) +++ peps/trunk/pep-0101.txt Sun Feb 13 11:39:21 2011 @@ -234,7 +234,7 @@ though, so keep reading until you hit the next STOP. ___ Forward the commit message that created the tag to python-committers and - ask that the experts build the binaries. Currently, this is only the WE. + ask that the experts build the binaries. ___ XXX The WE builds the Windows helpfile, using (in Doc/) either Modified: peps/trunk/pep-0392.txt ============================================================================== --- peps/trunk/pep-0392.txt (original) +++ peps/trunk/pep-0392.txt Sun Feb 13 11:39:21 2011 @@ -45,7 +45,8 @@ - 3.2 beta 2: December 18, 2010 - 3.2 candidate 1: January 15, 2011 - 3.2 candidate 2: January 29, 2011 -- 3.2 final: February 12, 2011 +- 3.2 candidate 3: February 12, 2011 +- 3.2 final: February 19, 2011 .. don't forget to update final date above as well From python-checkins at python.org Sun Feb 13 20:05:24 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 13 Feb 2011 20:05:24 +0100 (CET) Subject: [Python-checkins] r88416 - sandbox/trunk/pep362/pep362.py Message-ID: <20110213190524.5D280C9C1@mail.python.org> Author: brett.cannon Date: Sun Feb 13 20:05:24 2011 New Revision: 88416 Log: Minor whitespace cleanup. Modified: sandbox/trunk/pep362/pep362.py Modified: sandbox/trunk/pep362/pep362.py ============================================================================== --- sandbox/trunk/pep362/pep362.py (original) +++ sandbox/trunk/pep362/pep362.py Sun Feb 13 20:05:24 2011 @@ -13,6 +13,7 @@ """Represent a parameter in a function signature. 
Each parameter has the following attributes: + * name The name of the parameter. * position @@ -20,8 +21,9 @@ variable position argument. * keyword_only True if the parameter is keyword-only. - + And the following optoinal attributes: + * default_value The default value for the parameter, if one exists. * annotation @@ -199,10 +201,10 @@ def bind(self, *args, **kwargs): """Return a dictionary mapping function arguments to their parameter variables, if possible. - + Multiple arguments for the same parameter using keyword arguments cannot be detected. - + """ bindings = {} if self.var_args: From python-checkins at python.org Sun Feb 13 20:06:01 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 13 Feb 2011 20:06:01 +0100 (CET) Subject: [Python-checkins] r88417 - sandbox/trunk/pep362/setup.py Message-ID: <20110213190601.2FCCCDCCF@mail.python.org> Author: brett.cannon Date: Sun Feb 13 20:06:01 2011 New Revision: 88417 Log: Go ahead and list Python 3.2 support. Modified: sandbox/trunk/pep362/setup.py Modified: sandbox/trunk/pep362/setup.py ============================================================================== --- sandbox/trunk/pep362/setup.py (original) +++ sandbox/trunk/pep362/setup.py Sun Feb 13 20:06:01 2011 @@ -20,5 +20,6 @@ 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.0', 'Programming Language :: Python :: 3.1', + 'Programming Language :: Python :: 3.2', ] ) From solipsis at pitrou.net Mon Feb 14 05:26:04 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 14 Feb 2011 05:26:04 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88413): sum=1 Message-ID: py3k results for svn r88413 (hg cset de251dc6c5ef) -------------------------------------------------- test_timeout leaked [1, 0, 0] references, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflog8o2FuJ', '-x'] From python-checkins at python.org Mon Feb 14 07:01:57 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 14 Feb 2011 07:01:57 +0100 (CET) Subject: [Python-checkins] r88418 - python/branches/release31-maint/Doc/library/itertools.rst Message-ID: <20110214060157.0DE8BEE98E@mail.python.org> Author: raymond.hettinger Date: Mon Feb 14 07:01:56 2011 New Revision: 88418 Log: Fix example for count(). Modified: python/branches/release31-maint/Doc/library/itertools.rst Modified: python/branches/release31-maint/Doc/library/itertools.rst ============================================================================== --- python/branches/release31-maint/Doc/library/itertools.rst (original) +++ python/branches/release31-maint/Doc/library/itertools.rst Mon Feb 14 07:01:56 2011 @@ -229,7 +229,7 @@ def count(start=0, step=1): # count(10) --> 10 11 12 13 14 ... - # count(2.5, 0.5) -> 3.5 3.0 4.5 ... + # count(2.5, 0.5) -> 2.5 3.0 3.5 ... n = start while True: yield n From python-checkins at python.org Mon Feb 14 07:03:41 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 14 Feb 2011 07:03:41 +0100 (CET) Subject: [Python-checkins] r88419 - python/branches/release27-maint/Doc/library/itertools.rst Message-ID: <20110214060341.9F4B0EE993@mail.python.org> Author: raymond.hettinger Date: Mon Feb 14 07:03:41 2011 New Revision: 88419 Log: Fix example for itertools.count(). 
Modified: python/branches/release27-maint/Doc/library/itertools.rst Modified: python/branches/release27-maint/Doc/library/itertools.rst ============================================================================== --- python/branches/release27-maint/Doc/library/itertools.rst (original) +++ python/branches/release27-maint/Doc/library/itertools.rst Mon Feb 14 07:03:41 2011 @@ -237,7 +237,7 @@ def count(start=0, step=1): # count(10) --> 10 11 12 13 14 ... - # count(2.5, 0.5) -> 3.5 3.0 4.5 ... + # count(2.5, 0.5) -> 2.5 3.0 3.5 ... n = start while True: yield n From python-checkins at python.org Mon Feb 14 07:35:00 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 14 Feb 2011 07:35:00 +0100 (CET) Subject: [Python-checkins] r88420 - in python/branches/py3k: Include/patchlevel.h Misc/NEWS Message-ID: <20110214063500.436AEEE9F8@mail.python.org> Author: georg.brandl Date: Mon Feb 14 07:35:00 2011 New Revision: 88420 Log: Post-release updates. Modified: python/branches/py3k/Include/patchlevel.h python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Include/patchlevel.h ============================================================================== --- python/branches/py3k/Include/patchlevel.h (original) +++ python/branches/py3k/Include/patchlevel.h Mon Feb 14 07:35:00 2011 @@ -23,7 +23,7 @@ #define PY_RELEASE_SERIAL 3 /* Version as a string */ -#define PY_VERSION "3.2rc3" +#define PY_VERSION "3.2rc3+" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository) */ Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 14 07:35:00 2011 @@ -2,6 +2,18 @@ Python News +++++++++++ +What's New in Python 3.2? +========================= + +*Release date: 20-Feb-2011* + +Core and Builtins +----------------- + +Library +------- + + What's New in Python 3.2 Release Candidate 3? ============================================= From python-checkins at python.org Mon Feb 14 18:33:49 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 14 Feb 2011 18:33:49 +0100 Subject: [Python-checkins] devguide (hg_transition): Refine the FAQ entry about hg qimport Message-ID: antoine.pitrou pushed d1402dfa33ac to devguide: http://hg.python.org/devguide/rev/d1402dfa33ac changeset: 309:d1402dfa33ac branch: hg_transition tag: tip user: Antoine Pitrou date: Mon Feb 14 18:33:46 2011 +0100 summary: Refine the FAQ entry about hg qimport files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -192,24 +192,27 @@ If the patch was not created by hg (i.e., a patch created by SVN and thus lacking any ``a``/``b`` directory prefixes in the patch), use ``-p0`` instead of ``-p1``. +.. note:: The ``patch`` program is not available by default under Windows. + You can find it `here `_, + courtesy of the `GnuWin32 `_ project. + Also, you may find it necessary to add the "``--binary``" option when trying + to apply Unix-generated patches under Windows. + + If you want to work on the patch using mq_ (Mercurial Queues), type instead:: hg qimport somework.patch This will create a patch in your queue with a name that matches the filename. -You can use the ``-n`` argument to specify a different name. +You can use the ``-n`` argument to specify a different name. To have the +patch applied to the working copy, type:: -To undo a patch imported into your working copy, simply delete the patch from -your patch queue. 
You do need to make sure it is not applied (``hg qtop`` will -tell you that while ``hg qpop`` will un-apply the top-most patch):: + hg qpush - hg qdelete mywork.patch +Finally, to delete the patch, first un-apply it if necessary using ``hg qpop``, +then do:: -.. note:: The ``patch`` program is not available by default under Windows. - You can find it `here `_, - courtesy of the `GnuWin32 `_ project. - Also, you may find it necessary to add the "``--binary``" option when trying - to apply Unix-generated patches under Windows. + hg qdelete somework.patch .. _mq: http://mercurial.selenic.com/wiki/MqExtension -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 14 19:18:49 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 14 Feb 2011 19:18:49 +0100 (CET) Subject: [Python-checkins] r88421 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110214181849.B7A62EE986@mail.python.org> Author: raymond.hettinger Date: Mon Feb 14 19:18:49 2011 New Revision: 88421 Log: Fix accumulate() example. (Reported by David Murray.) Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Mon Feb 14 19:18:49 2011 @@ -824,7 +824,7 @@ modeled on APL's *scan* operator and Numpy's *accumulate* function: >>> from itertools import accumulate - >>> list(accumulate(8, 2, 50)) + >>> list(accumulate([8, 2, 50])) [8, 10, 60] >>> prob_dist = [0.1, 0.4, 0.2, 0.3] From python-checkins at python.org Mon Feb 14 21:04:00 2011 From: python-checkins at python.org (barry.warsaw) Date: Mon, 14 Feb 2011 21:04:00 +0100 (CET) Subject: [Python-checkins] r88422 - in python/branches/release27-maint: Lib/sysconfig.py Misc/NEWS Message-ID: <20110214200400.B0364EE986@mail.python.org> Author: barry.warsaw Date: Mon Feb 14 21:04:00 2011 New Revision: 88422 Log: - Issue #11171: Fix detection of config/Makefile when --prefix != --exec-prefix, which caused Python to not start. Modified: python/branches/release27-maint/Lib/sysconfig.py python/branches/release27-maint/Misc/NEWS Modified: python/branches/release27-maint/Lib/sysconfig.py ============================================================================== --- python/branches/release27-maint/Lib/sysconfig.py (original) +++ python/branches/release27-maint/Lib/sysconfig.py Mon Feb 14 21:04:00 2011 @@ -271,7 +271,7 @@ def _get_makefile_filename(): if _PYTHON_BUILD: return os.path.join(_PROJECT_BASE, "Makefile") - return os.path.join(get_path('stdlib'), "config", "Makefile") + return os.path.join(get_path('platstdlib'), "config", "Makefile") def _init_posix(vars): Modified: python/branches/release27-maint/Misc/NEWS ============================================================================== --- python/branches/release27-maint/Misc/NEWS (original) +++ python/branches/release27-maint/Misc/NEWS Mon Feb 14 21:04:00 2011 @@ -37,6 +37,9 @@ Library ------- +- Issue #11171: Fix detection of config/Makefile when --prefix != + --exec-prefix, which caused Python to not start. + - Issue #11116: any error during addition of a message to a mailbox now causes a rollback, instead of leaving the mailbox partially modified. 
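A short aside on the sysconfig fix above: ``stdlib`` resolves against --prefix while ``platstdlib`` resolves against --exec-prefix, and the generated ``config/Makefile`` is installed under the latter, which is why the patch switches paths. A minimal sketch, assuming a standard 2.7 installation (the exact paths depend on configure options):

    # Locate the Makefile the way the patched _get_makefile_filename() does,
    # via the platform-specific library directory.
    import os
    import sysconfig

    print(sysconfig.get_path('stdlib'))      # e.g. <prefix>/lib/python2.7
    print(sysconfig.get_path('platstdlib'))  # e.g. <exec_prefix>/lib/python2.7
    print(os.path.join(sysconfig.get_path('platstdlib'), 'config', 'Makefile'))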
From nnorwitz at gmail.com Mon Feb 14 22:00:46 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Mon, 14 Feb 2011 16:00:46 -0500 Subject: [Python-checkins] Python Regression Test Failures doc (1) Message-ID: <20110214210046.GA14940@kbk-i386-bb.dyndns.org> rm -rf build/* rm -rf tools/sphinx rm -rf tools/pygments rm -rf tools/jinja2 rm -rf tools/docutils Checking out Sphinx... svn: PROPFIND request failed on '/projects/external/Sphinx-0.6.5/sphinx' svn: PROPFIND of '/projects/external/Sphinx-0.6.5/sphinx': Could not resolve hostname `svn.python.org': Host not found (http://svn.python.org) make: *** [checkout] Error 1 From solipsis at pitrou.net Tue Feb 15 05:05:16 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 15 Feb 2011 05:05:16 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88421): sum=1 Message-ID: py3k results for svn r88421 (hg cset 6b822f0d4ac9) -------------------------------------------------- test_timeout leaked [0, 0, 1] references, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogkdkRFH', '-x'] From python-checkins at python.org Tue Feb 15 13:41:17 2011 From: python-checkins at python.org (georg.brandl) Date: Tue, 15 Feb 2011 13:41:17 +0100 (CET) Subject: [Python-checkins] r88423 - python/branches/py3k/Doc/library/logging.handlers.rst Message-ID: <20110215124117.F1547FD0E@mail.python.org> Author: georg.brandl Date: Tue Feb 15 13:41:17 2011 New Revision: 88423 Log: Apply logging SocketHandler doc update by Vinay. Modified: python/branches/py3k/Doc/library/logging.handlers.rst Modified: python/branches/py3k/Doc/library/logging.handlers.rst ============================================================================== --- python/branches/py3k/Doc/library/logging.handlers.rst (original) +++ python/branches/py3k/Doc/library/logging.handlers.rst Tue Feb 15 13:41:17 2011 @@ -331,6 +331,27 @@ Send a pickled string *packet* to the socket. This function allows for partial sends which can happen when the network is busy. + .. method:: createSocket() + + Tries to create a socket; on failure, uses an exponential back-off + algorithm. On intial failure, the handler will drop the message it was + trying to send. When subsequent messages are handled by the same + instance, it will not try connecting until some time has passed. The + default parameters are such that the initial delay is one second, and if + after that delay the connection still can't be made, the handler will + double the delay each time up to a maximum of 30 seconds. + + This behaviour is controlled by the following handler attributes: + + * ``retryStart`` (initial delay, defaulting to 1.0 seconds). + * ``retryFactor`` (multiplier, defaulting to 2.0). + * ``retryMax`` (maximum delay, defaulting to 30.0 seconds). + + This means that if the remote listener starts up *after* the handler has + been used, you could lose messages (since the handler won't even attempt + a connection until the delay has elapsed, but just silently drop messages + during the delay period). +^ .. _datagram-handler: From python-checkins at python.org Tue Feb 15 13:44:43 2011 From: python-checkins at python.org (georg.brandl) Date: Tue, 15 Feb 2011 13:44:43 +0100 (CET) Subject: [Python-checkins] r88424 - python/branches/py3k/Doc/library/logging.handlers.rst Message-ID: <20110215124443.3E5C9F93E@mail.python.org> Author: georg.brandl Date: Tue Feb 15 13:44:43 2011 New Revision: 88424 Log: Remove editing slip. 
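Before the r88424 diff continues below, a rough sketch of the retry schedule that the new createSocket() documentation in r88423 describes, using the documented defaults. This mirrors the documented behaviour only, not the actual logging.handlers source:

    # Exponential back-off as documented: start at retryStart, multiply by
    # retryFactor after each failed connect, cap the delay at retryMax.
    retryStart, retryFactor, retryMax = 1.0, 2.0, 30.0

    delay, schedule = retryStart, []
    for _ in range(8):
        schedule.append(delay)
        delay = min(delay * retryFactor, retryMax)

    print(schedule)   # [1.0, 2.0, 4.0, 8.0, 16.0, 30.0, 30.0, 30.0]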
Modified: python/branches/py3k/Doc/library/logging.handlers.rst Modified: python/branches/py3k/Doc/library/logging.handlers.rst ============================================================================== --- python/branches/py3k/Doc/library/logging.handlers.rst (original) +++ python/branches/py3k/Doc/library/logging.handlers.rst Tue Feb 15 13:44:43 2011 @@ -326,11 +326,13 @@ them on the receiving end, or alternatively you can disable unpickling of global objects on the receiving end. + .. method:: send(packet) Send a pickled string *packet* to the socket. This function allows for partial sends which can happen when the network is busy. + .. method:: createSocket() Tries to create a socket; on failure, uses an exponential back-off @@ -351,7 +353,7 @@ been used, you could lose messages (since the handler won't even attempt a connection until the delay has elapsed, but just silently drop messages during the delay period). -^ + .. _datagram-handler: From python-checkins at python.org Tue Feb 15 16:41:00 2011 From: python-checkins at python.org (alexander.belopolsky) Date: Tue, 15 Feb 2011 16:41:00 +0100 (CET) Subject: [Python-checkins] r88425 - in python/branches/release31-maint: Lib/test/test_time.py Modules/timemodule.c Message-ID: <20110215154100.2AD59EE981@mail.python.org> Author: alexander.belopolsky Date: Tue Feb 15 16:40:59 2011 New Revision: 88425 Log: Merged revisions 87919 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r87919 | alexander.belopolsky | 2011-01-10 20:21:25 -0500 (Mon, 10 Jan 2011) | 4 lines Issue #1726687: time.mktime() will now correctly compute value one second before epoch. Original patch by Peter Wang, reported by Martin Blais. ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Lib/test/test_time.py python/branches/release31-maint/Modules/timemodule.c Modified: python/branches/release31-maint/Lib/test/test_time.py ============================================================================== --- python/branches/release31-maint/Lib/test/test_time.py (original) +++ python/branches/release31-maint/Lib/test/test_time.py Tue Feb 15 16:40:59 2011 @@ -233,6 +233,15 @@ t1 = time.mktime(lt1) self.assertTrue(0 <= (t1-t0) < 0.2) + def test_mktime(self): + # Issue #1726687 + for t in (-2, -1, 0, 1): + try: + tt = time.localtime(t) + except (OverflowError, ValueError): + pass + self.assertEqual(time.mktime(tt), t) + class TestLocale(unittest.TestCase): def setUp(self): self.oldloc = locale.setlocale(locale.LC_ALL) Modified: python/branches/release31-maint/Modules/timemodule.c ============================================================================== --- python/branches/release31-maint/Modules/timemodule.c (original) +++ python/branches/release31-maint/Modules/timemodule.c Tue Feb 15 16:40:59 2011 @@ -703,8 +703,11 @@ time_t tt; if (!gettmarg(tup, &buf)) return NULL; + buf.tm_wday = -1; /* sentinel; original value ignored */ tt = mktime(&buf); - if (tt == (time_t)(-1)) { + /* Return value of -1 does not necessarily mean an error, but tm_wday + * cannot remain set to -1 if mktime succedded. 
*/ + if (tt == (time_t)(-1) && buf.tm_wday == -1) { PyErr_SetString(PyExc_OverflowError, "mktime argument out of range"); return NULL; From python-checkins at python.org Tue Feb 15 16:44:51 2011 From: python-checkins at python.org (georg.brandl) Date: Tue, 15 Feb 2011 16:44:51 +0100 (CET) Subject: [Python-checkins] r88426 - in python/branches/py3k: Makefile.pre.in Misc/NEWS Modules/ld_so_aix.in configure configure.in Message-ID: <20110215154451.530EFEE981@mail.python.org> Author: georg.brandl Date: Tue Feb 15 16:44:51 2011 New Revision: 88426 Log: #941346: Fix broken shared library build on AIX. Patch by Sebastien Sable, review by Antoine Pitrou. Modified: python/branches/py3k/Makefile.pre.in python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/ld_so_aix.in python/branches/py3k/configure python/branches/py3k/configure.in Modified: python/branches/py3k/Makefile.pre.in ============================================================================== --- python/branches/py3k/Makefile.pre.in (original) +++ python/branches/py3k/Makefile.pre.in Tue Feb 15 16:44:51 2011 @@ -1256,7 +1256,7 @@ done -rm -f core Makefile Makefile.pre config.status \ Modules/Setup Modules/Setup.local Modules/Setup.config \ - Modules/ld_so_aix Misc/python.pc + Modules/ld_so_aix Modules/python.exp Misc/python.pc -rm -f python*-gdb.py -rm -f pybuilddir.txt find $(srcdir) '(' -name '*.fdc' -o -name '*~' \ Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Tue Feb 15 16:44:51 2011 @@ -13,6 +13,11 @@ Library ------- +Build +----- + +- Issue #941346: Fix broken shared library build on AIX. + What's New in Python 3.2 Release Candidate 3? ============================================= Modified: python/branches/py3k/Modules/ld_so_aix.in ============================================================================== --- python/branches/py3k/Modules/ld_so_aix.in (original) +++ python/branches/py3k/Modules/ld_so_aix.in Tue Feb 15 16:44:51 2011 @@ -131,7 +131,7 @@ shift done -if test "$objfile" = "libpython at VERSION@.so"; then +if test "$objfile" = "libpython at VERSION@@ABIFLAGS at .so"; then ldsocoremode="true" fi Modified: python/branches/py3k/configure ============================================================================== --- python/branches/py3k/configure (original) +++ python/branches/py3k/configure Tue Feb 15 16:44:51 2011 @@ -1,5 +1,5 @@ #! /bin/sh -# From configure.in Revision: 87698 . +# From configure.in Revision: 88350 . # Guess values for system-dependent variables and create Makefiles. # Generated by GNU Autoconf 2.68 for python 3.2. 
# @@ -7426,7 +7426,7 @@ then case $ac_sys_system/$ac_sys_release in AIX*) - BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:Modules/python.exp -L\$(srcdir)" + BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:\$(srcdir)/Modules/python.exp" LDSHARED="\$(BINLIBDEST)/config/ld_so_aix \$(CC) -bI:\$(BINLIBDEST)/config/python.exp" ;; IRIX/5*) LDSHARED="ld -shared";; Modified: python/branches/py3k/configure.in ============================================================================== --- python/branches/py3k/configure.in (original) +++ python/branches/py3k/configure.in Tue Feb 15 16:44:51 2011 @@ -1642,7 +1642,7 @@ then case $ac_sys_system/$ac_sys_release in AIX*) - BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:Modules/python.exp -L\$(srcdir)" + BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:\$(srcdir)/Modules/python.exp" LDSHARED="\$(BINLIBDEST)/config/ld_so_aix \$(CC) -bI:\$(BINLIBDEST)/config/python.exp" ;; IRIX/5*) LDSHARED="ld -shared";; From python-checkins at python.org Tue Feb 15 16:51:17 2011 From: python-checkins at python.org (alexander.belopolsky) Date: Tue, 15 Feb 2011 16:51:17 +0100 (CET) Subject: [Python-checkins] r88427 - in python/branches/release27-maint: Lib/test/test_time.py Modules/timemodule.c Message-ID: <20110215155117.55C7CF924@mail.python.org> Author: alexander.belopolsky Date: Tue Feb 15 16:51:17 2011 New Revision: 88427 Log: Merged revisions 87919 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r87919 | alexander.belopolsky | 2011-01-10 20:21:25 -0500 (Mon, 10 Jan 2011) | 4 lines Issue #1726687: time.mktime() will now correctly compute value one second before epoch. Original patch by Peter Wang, reported by Martin Blais. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Lib/test/test_time.py python/branches/release27-maint/Modules/timemodule.c Modified: python/branches/release27-maint/Lib/test/test_time.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_time.py (original) +++ python/branches/release27-maint/Lib/test/test_time.py Tue Feb 15 16:51:17 2011 @@ -223,6 +223,16 @@ t1 = time.mktime(lt1) self.assertTrue(0 <= (t1-t0) < 0.2) + def test_mktime(self): + # Issue #1726687 + for t in (-2, -1, 0, 1): + try: + tt = time.localtime(t) + except (OverflowError, ValueError): + pass + self.assertEqual(time.mktime(tt), t) + + def test_main(): test_support.run_unittest(TimeTestCase) Modified: python/branches/release27-maint/Modules/timemodule.c ============================================================================== --- python/branches/release27-maint/Modules/timemodule.c (original) +++ python/branches/release27-maint/Modules/timemodule.c Tue Feb 15 16:51:17 2011 @@ -632,8 +632,11 @@ time_t tt; if (!gettmarg(tup, &buf)) return NULL; + buf.tm_wday = -1; /* sentinel; original value ignored */ tt = mktime(&buf); - if (tt == (time_t)(-1)) { + /* Return value of -1 does not necessarily mean an error, but tm_wday + * cannot remain set to -1 if mktime succedded. 
*/ + if (tt == (time_t)(-1) && buf.tm_wday == -1) { PyErr_SetString(PyExc_OverflowError, "mktime argument out of range"); return NULL; From python-checkins at python.org Tue Feb 15 16:58:04 2011 From: python-checkins at python.org (alexander.belopolsky) Date: Tue, 15 Feb 2011 16:58:04 +0100 (CET) Subject: [Python-checkins] r88428 - in python/branches/release31-maint: Lib/test/test_time.py Message-ID: <20110215155804.BB99BEE983@mail.python.org> Author: alexander.belopolsky Date: Tue Feb 15 16:58:04 2011 New Revision: 88428 Log: Merged revisions 87921 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r87921 | alexander.belopolsky | 2011-01-10 21:22:16 -0500 (Mon, 10 Jan 2011) | 1 line This should fix mktime test on Windows ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Lib/test/test_time.py Modified: python/branches/release31-maint/Lib/test/test_time.py ============================================================================== --- python/branches/release31-maint/Lib/test/test_time.py (original) +++ python/branches/release31-maint/Lib/test/test_time.py Tue Feb 15 16:58:04 2011 @@ -240,7 +240,8 @@ tt = time.localtime(t) except (OverflowError, ValueError): pass - self.assertEqual(time.mktime(tt), t) + else: + self.assertEqual(time.mktime(tt), t) class TestLocale(unittest.TestCase): def setUp(self): From python-checkins at python.org Tue Feb 15 17:01:11 2011 From: python-checkins at python.org (alexander.belopolsky) Date: Tue, 15 Feb 2011 17:01:11 +0100 (CET) Subject: [Python-checkins] r88429 - in python/branches/release27-maint: Lib/test/test_time.py Message-ID: <20110215160111.5F29DEE983@mail.python.org> Author: alexander.belopolsky Date: Tue Feb 15 17:01:11 2011 New Revision: 88429 Log: Merged revisions 87921 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r87921 | alexander.belopolsky | 2011-01-10 21:22:16 -0500 (Mon, 10 Jan 2011) | 1 line This should fix mktime test on Windows ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Lib/test/test_time.py Modified: python/branches/release27-maint/Lib/test/test_time.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_time.py (original) +++ python/branches/release27-maint/Lib/test/test_time.py Tue Feb 15 17:01:11 2011 @@ -230,7 +230,8 @@ tt = time.localtime(t) except (OverflowError, ValueError): pass - self.assertEqual(time.mktime(tt), t) + else: + self.assertEqual(time.mktime(tt), t) def test_main(): From python-checkins at python.org Tue Feb 15 20:48:59 2011 From: python-checkins at python.org (georg.brandl) Date: Tue, 15 Feb 2011 20:48:59 +0100 (CET) Subject: [Python-checkins] r88430 - in python/branches/py3k: Python/dynload_aix.c configure configure.in Message-ID: <20110215194859.8C6A7FDA2@mail.python.org> Author: georg.brandl Date: Tue Feb 15 20:48:59 2011 New Revision: 88430 Log: #730467: Another small AIX fix. 
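Before the r88430 AIX diff continues below, a brief aside on the time.mktime() merges above (r88425 through r88429): the behaviour the new test_mktime cases exercise can be sketched as follows. Illustrative only; platforms that cannot represent pre-epoch times skip those values, which is exactly what the try/except/else added in r88428/r88429 handles:

    # Round-trip timestamps around the epoch through localtime()/mktime();
    # before the fix, mktime() wrongly treated the legitimate value -1 as an
    # out-of-range error.
    import time

    for t in (-2, -1, 0, 1):
        try:
            tt = time.localtime(t)
        except (OverflowError, ValueError):
            print(t, 'not representable on this platform')
        else:
            print(t, time.mktime(tt) == t)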
Modified: python/branches/py3k/Python/dynload_aix.c python/branches/py3k/configure python/branches/py3k/configure.in Modified: python/branches/py3k/Python/dynload_aix.c ============================================================================== --- python/branches/py3k/Python/dynload_aix.c (original) +++ python/branches/py3k/Python/dynload_aix.c Tue Feb 15 20:48:59 2011 @@ -12,7 +12,7 @@ #ifdef AIX_GENUINE_CPLUSPLUS -#include "/usr/lpp/xlC/include/load.h" +#include #define aix_load loadAndInit #else #define aix_load load Modified: python/branches/py3k/configure ============================================================================== --- python/branches/py3k/configure (original) +++ python/branches/py3k/configure Tue Feb 15 20:48:59 2011 @@ -1,5 +1,5 @@ #! /bin/sh -# From configure.in Revision: 88350 . +# From configure.in Revision: 88426 . # Guess values for system-dependent variables and create Makefiles. # Generated by GNU Autoconf 2.68 for python 3.2. # @@ -7890,7 +7890,7 @@ cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ - #include "/usr/lpp/xlC/include/load.h" + #include int main () { Modified: python/branches/py3k/configure.in ============================================================================== --- python/branches/py3k/configure.in (original) +++ python/branches/py3k/configure.in Tue Feb 15 20:48:59 2011 @@ -1912,7 +1912,7 @@ case "$ac_sys_system" in AIX*) AC_MSG_CHECKING(for genuine AIX C++ extensions support) AC_LINK_IFELSE([ - AC_LANG_PROGRAM([[#include "/usr/lpp/xlC/include/load.h"]], + AC_LANG_PROGRAM([[#include ]], [[loadAndInit("", 0, "")]]) ],[ AC_DEFINE(AIX_GENUINE_CPLUSPLUS, 1, From nnorwitz at gmail.com Wed Feb 16 01:24:36 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Tue, 15 Feb 2011 19:24:36 -0500 Subject: [Python-checkins] Python Regression Test Failures all (3) Message-ID: <20110216002436.GA4783@kbk-i386-bb.dyndns.org> 350 tests OK. 
3 tests failed: test_robotparser test_smtpnet test_ssl 28 tests skipped: test_aepack test_al test_applesingle test_bsddb185 test_cd test_cl test_epoll test_gdb test_gl test_imgfile test_ioctl test_kqueue test_macos test_macostools test_multiprocessing test_pep277 test_py3kwarn test_scriptpackages test_startfile test_sunaudiodev test_tcl test_tk test_ttk_guionly test_ttk_textonly test_unicode_file test_winreg test_winsound test_zipfile64 7 skips unexpected on linux2: test_epoll test_gdb test_ioctl test_multiprocessing test_tk test_ttk_guionly test_ttk_textonly == CPython 2.7 (trunk:88430M, Feb 15 2011, 16:01:36) [GCC 3.3.4 20040623 (Gentoo Linux 3.3.4-r1, ssp-3.3.2-2, pie-8.7.6)] == Linux-2.6.9-gentoo-r1-i686-AMD_Athlon-tm-_XP_3000+-with-gentoo-1.4.16 little-endian == /tmp/test_python_29777 test_grammar test_opcodes test_dict test_builtin test_exceptions test_types test_unittest test_doctest test_doctest2 test_MimeWriter test_SimpleHTTPServer test_StringIO test___all__ test___future__ test__locale test_abc test_abstract_numbers test_aepack test_aepack skipped -- No module named aetypes test_aifc test_al test_al skipped -- No module named al test_anydbm test_applesingle test_applesingle skipped -- No module named MacOS test_argparse test_array test_ascii_formatd test_ast test_asynchat test_asyncore test_atexit test_audioop test_augassign test_base64 test_bastion test_bigaddrspace test_bigmem test_binascii test_binhex test_binop test_bisect test_bool test_bsddb test_bsddb185 test_bsddb185 skipped -- No module named bsddb185 test_bsddb3 Sleepycat Software: Berkeley DB 4.1.25: (December 19, 2002) Test path prefix: /tmp/z-test_bsddb3-29777 test_buffer test_bufio test_bytes test_bz2 test_calendar test_call test_capi test_cd test_cd skipped -- No module named cd test_cfgparser test_cgi test_charmapcodec test_cl test_cl skipped -- No module named cl test_class test_cmath test_cmd test_cmd_line test_cmd_line_script test_code test_codeccallbacks test_codecencodings_cn test_codecencodings_hk test_codecencodings_jp test_codecencodings_kr test_codecencodings_tw test_codecmaps_cn test_codecmaps_hk test_codecmaps_jp test_codecmaps_kr test_codecmaps_tw test_codecs test_codeop test_coding test_coercion test_collections test_colorsys test_commands test_compare test_compile test_compileall test_compiler testCompileLibrary still working, be patient... test_complex test_complex_args test_contains test_contextlib test_cookie test_cookielib test_copy test_copy_reg test_cpickle test_cprofile test_crypt test_csv test_ctypes test_datetime test_dbm test_decimal test_decorators test_defaultdict test_deque test_descr test_descrtut test_dictcomps test_dictviews test_difflib test_dircache test_dis test_distutils [19643 refs] test_dl test_docxmlrpc test_dumbdbm test_dummy_thread test_dummy_threading test_email test_email_codecs test_email_renamed test_enumerate test_eof test_epoll test_epoll skipped -- kernel doesn't support epoll() test_errno test_exception_variations test_extcall test_fcntl test_file test_file2k test_filecmp test_fileinput test_fileio test_float test_fnmatch test_fork1 test_format test_fpformat test_fractions test_frozen test_ftplib test_funcattrs test_functools test_future test_future3 test_future4 test_future5 test_future_builtins test_gc test_gdb test_gdb skipped -- gdb versions before 7.0 didn't support python embedding Saw: GNU gdb 6.2.1 Copyright 2004 Free Software Foundation, Inc. 
GDB is free software, covered by the GNU General Public License, and you are welcome to change it and/or distribute copies of it under certain conditions. Type "show copying" to see the conditions. There is absolutely no warranty for GDB. Type "show warranty" for details. This GDB was configured as "i386-pc-linux-gnu". test_gdbm test_generators test_genericpath test_genexps test_getargs test_getargs2 test_getopt test_gettext test_gl test_gl skipped -- No module named gl test_glob test_global test_grp test_gzip test_hash test_hashlib test_heapq test_hmac test_hotshot test_htmllib test_htmlparser test_httplib test_httpservers [15648 refs] [15648 refs] [15648 refs] [25315 refs] test_imageop test_imaplib test_imgfile test_imgfile skipped -- No module named imgfile test_imp test_import test_importhooks test_importlib test_index test_inspect test_int test_int_literal test_io test_ioctl test_ioctl skipped -- Unable to open /dev/tty test_isinstance test_iter test_iterlen test_itertools test_json test_kqueue test_kqueue skipped -- test works only on BSD test_largefile test_lib2to3 test_linecache test_list test_locale test_logging test_long test_long_future test_longexp test_macos test_macos skipped -- No module named MacOS test_macostools test_macostools skipped -- No module named MacOS test_macpath test_mailbox test_marshal test_math test_md5 test_memoryio test_memoryview test_mhlib test_mimetools test_mimetypes test_minidom test_mmap test_module test_modulefinder test_multibytecodec test_multibytecodec_support test_multifile test_multiprocessing test_multiprocessing skipped -- This platform lacks a functioning sem_open implementation, therefore, the required synchronization primitives needed will not function, see issue 3770. test_mutants test_mutex test_netrc test_new test_nis test_normalization test_ntpath test_old_mailbox test_openpty test_operator test_optparse test_os [15648 refs] [15648 refs] test_parser Expecting 's_push: parser stack overflow' in next line s_push: parser stack overflow test_pdb test_peepholer test_pep247 test_pep263 test_pep277 test_pep277 skipped -- only NT+ and systems with Unicode-friendly filesystem encoding test_pep292 test_pep352 test_pickle test_pickletools test_pipes test_pkg test_pkgimport test_pkgutil test_platform [17044 refs] [17044 refs] test_plistlib test_poll test_popen [15653 refs] [15653 refs] [15653 refs] test_popen2 test_poplib test_posix test_posixpath test_pow test_pprint test_print test_profile test_profilehooks test_property test_pstats test_pty test_pwd test_py3kwarn test_py3kwarn skipped -- test.test_py3kwarn must be run with the -3 flag test_pyclbr test_pydoc [20862 refs] [20862 refs] [20862 refs] [20862 refs] [20862 refs] [20861 refs] [20861 refs] test_pyexpat test_queue test_quopri [18479 refs] [18479 refs] test_random test_re test_readline test_repr test_resource test_rfc822 test_richcmp test_robotparser test test_robotparser failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_robotparser.py", line 224, in testPythonOrg parser.read() File "/tmp/python-test/local/lib/python2.7/robotparser.py", line 57, in read f = opener.open(self.url) File "/tmp/python-test/local/lib/python2.7/urllib.py", line 205, in open return getattr(self, name)(url) File "/tmp/python-test/local/lib/python2.7/urllib.py", line 342, in open_http h.endheaders(data) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 940, in endheaders self._send_output(message_body) File "/tmp/python-test/local/lib/python2.7/httplib.py", 
line 803, in _send_output self.send(msg) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 755, in send self.connect() File "/tmp/python-test/local/lib/python2.7/httplib.py", line 736, in connect self.timeout, self.source_address) File "/tmp/python-test/local/lib/python2.7/socket.py", line 551, in create_connection for res in getaddrinfo(host, port, 0, SOCK_STREAM): IOError: [Errno socket error] [Errno -3] Temporary failure in name resolution Re-running test 'test_robotparser' in verbose mode RobotTest(1, good, /) ... ok RobotTest(1, good, /test.html) ... ok RobotTest(1, bad, /cyberworld/map/index.html) ... ok RobotTest(1, bad, /tmp/xxx) ... ok RobotTest(1, bad, /foo.html) ... ok RobotTest(2, good, /) ... ok RobotTest(2, good, /test.html) ... ok RobotTest(2, good, ('cybermapper', '/cyberworld/map/index.html')) ... ok RobotTest(2, bad, /cyberworld/map/index.html) ... ok RobotTest(3, bad, /cyberworld/map/index.html) ... ok RobotTest(3, bad, /) ... ok RobotTest(3, bad, /tmp/) ... ok RobotTest(4, bad, /tmp) ... ok RobotTest(4, bad, /tmp.html) ... ok RobotTest(4, bad, /tmp/a.html) ... ok RobotTest(4, bad, /a%3cd.html) ... ok RobotTest(4, bad, /a%3Cd.html) ... ok RobotTest(4, bad, /a%2fb.html) ... ok RobotTest(4, bad, /~joe/index.html) ... ok RobotTest(5, bad, /tmp) ... ok RobotTest(5, bad, /tmp.html) ... ok RobotTest(5, bad, /tmp/a.html) ... ok RobotTest(5, bad, /a%3cd.html) ... ok RobotTest(5, bad, /a%3Cd.html) ... ok RobotTest(5, bad, /a%2fb.html) ... ok RobotTest(5, bad, /~joe/index.html) ... ok RobotTest(6, good, /tmp) ... ok RobotTest(6, bad, /tmp/) ... ok RobotTest(6, bad, /tmp/a.html) ... ok RobotTest(6, bad, /a%3cd.html) ... ok RobotTest(6, bad, /a%3Cd.html) ... ok RobotTest(6, bad, /a/b.html) ... ok RobotTest(6, bad, /%7Ejoe/index.html) ... ok RobotTest(7, good, /foo.html) ... ok RobotTest(8, good, /folder1/myfile.html) ... ok RobotTest(8, bad, /folder1/anotherfile.html) ... ok RobotTest(9, bad, /something.jpg) ... ok RobotTest(10, bad, /something.jpg) ... ok RobotTest(11, bad, /something.jpg) ... ok RobotTest(12, good, /something.jpg) ... ok RobotTest(13, good, /folder1/myfile.html) ... ok RobotTest(13, bad, /folder1/anotherfile.html) ... ok ---------------------------------------------------------------------- Ran 42 tests in 0.012s OK testPasswordProtectedSite (test.test_robotparser.NetworkTestCase) ... skipped 'http://mueblesmoraleda.com is unavailable' testPythonOrg (test.test_robotparser.NetworkTestCase) ... 
ERROR ====================================================================== ERROR: testPythonOrg (test.test_robotparser.NetworkTestCase) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_robotparser.py", line 224, in testPythonOrg parser.read() File "/tmp/python-test/local/lib/python2.7/robotparser.py", line 57, in read f = opener.open(self.url) File "/tmp/python-test/local/lib/python2.7/urllib.py", line 205, in open return getattr(self, name)(url) File "/tmp/python-test/local/lib/python2.7/urllib.py", line 342, in open_http h.endheaders(data) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 940, in endheaders self._send_output(message_body) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 803, in _send_output self.send(msg) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 755, in send self.connect() File "/tmp/python-test/local/lib/python2.7/httplib.py", line 736, in connect self.timeout, self.source_address) File "/tmp/python-test/local/lib/python2.7/socket.py", line 551, in create_connection for res in getaddrinfo(host, port, 0, SOCK_STREAM): IOError: [Errno socket error] [Errno -3] Temporary failure in name resolution ---------------------------------------------------------------------- Ran 2 tests in 3.143s FAILED (errors=1, skipped=1) test test_robotparser failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_robotparser.py", line 224, in testPythonOrg parser.read() File "/tmp/python-test/local/lib/python2.7/robotparser.py", line 57, in read f = opener.open(self.url) File "/tmp/python-test/local/lib/python2.7/urllib.py", line 205, in open return getattr(self, name)(url) File "/tmp/python-test/local/lib/python2.7/urllib.py", line 342, in open_http h.endheaders(data) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 940, in endheaders self._send_output(message_body) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 803, in _send_output self.send(msg) File "/tmp/python-test/local/lib/python2.7/httplib.py", line 755, in send self.connect() File "/tmp/python-test/local/lib/python2.7/httplib.py", line 736, in connect self.timeout, self.source_address) File "/tmp/python-test/local/lib/python2.7/socket.py", line 551, in create_connection for res in getaddrinfo(host, port, 0, SOCK_STREAM): IOError: [Errno socket error] [Errno -3] Temporary failure in name resolution test_runpy test_sax test_scope test_scriptpackages test_scriptpackages skipped -- No module named aetools test_select test_set test_setcomps test_sets test_sgmllib test_sha test_shelve test_shlex test_shutil test_signal test_site [15648 refs] [15648 refs] [15648 refs] [15648 refs] test_slice test_smtplib test_smtpnet test test_smtpnet failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_smtpnet.py", line 15, in test_connect server = smtplib.SMTP_SSL(self.testServer, self.remotePort) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 752, in __init__ SMTP.__init__(self, host, port, local_hostname, timeout) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 239, in __init__ (code, msg) = self.connect(host, port) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 295, in connect self.sock = self._get_socket(host, port, self.timeout) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 757, in _get_socket new_socket = 
socket.create_connection((host, port), timeout) File "/tmp/python-test/local/lib/python2.7/socket.py", line 551, in create_connection for res in getaddrinfo(host, port, 0, SOCK_STREAM): gaierror: [Errno -3] Temporary failure in name resolution Re-running test 'test_smtpnet' in verbose mode test_connect (test.test_smtpnet.SmtpSSLTest) ... ERROR ====================================================================== ERROR: test_connect (test.test_smtpnet.SmtpSSLTest) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_smtpnet.py", line 15, in test_connect server = smtplib.SMTP_SSL(self.testServer, self.remotePort) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 752, in __init__ SMTP.__init__(self, host, port, local_hostname, timeout) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 239, in __init__ (code, msg) = self.connect(host, port) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 295, in connect self.sock = self._get_socket(host, port, self.timeout) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 757, in _get_socket new_socket = socket.create_connection((host, port), timeout) File "/tmp/python-test/local/lib/python2.7/socket.py", line 551, in create_connection for res in getaddrinfo(host, port, 0, SOCK_STREAM): gaierror: [Errno -3] Temporary failure in name resolution ---------------------------------------------------------------------- Ran 1 test in 1.339s FAILED (errors=1) test test_smtpnet failed -- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_smtpnet.py", line 15, in test_connect server = smtplib.SMTP_SSL(self.testServer, self.remotePort) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 752, in __init__ SMTP.__init__(self, host, port, local_hostname, timeout) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 239, in __init__ (code, msg) = self.connect(host, port) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 295, in connect self.sock = self._get_socket(host, port, self.timeout) File "/tmp/python-test/local/lib/python2.7/smtplib.py", line 757, in _get_socket new_socket = socket.create_connection((host, port), timeout) File "/tmp/python-test/local/lib/python2.7/socket.py", line 551, in create_connection for res in getaddrinfo(host, port, 0, SOCK_STREAM): gaierror: [Errno -3] Temporary failure in name resolution test_socket test_socketserver test_softspace test_sort test_sqlite test_ssl test test_ssl failed -- multiple errors occurred; run in verbose mode for details Re-running test 'test_ssl' in verbose mode test_DER_to_PEM (test.test_ssl.BasicTests) ... ok test_ciphers (test.test_ssl.BasicTests) ... ERROR test_constants (test.test_ssl.BasicTests) ... ok test_openssl_version (test.test_ssl.BasicTests) ... ok test_parse_cert (test.test_ssl.BasicTests) ... {'notAfter': 'Feb 16 16:54:50 2013 GMT', 'subject': ((('countryName', u'US'),), (('stateOrProvinceName', u'Delaware'),), (('localityName', u'Wilmington'),), (('organizationName', u'Python Software Foundation'),), (('organizationalUnitName', u'SSL'),), (('commonName', u'somemachine.python.org'),))} ok test_random (test.test_ssl.BasicTests) ... RAND_status is 1 (sufficient randomness) ok test_refcycle (test.test_ssl.BasicTests) ... ok test_sslwrap_simple (test.test_ssl.BasicTests) ... ok test_algorithms (test.test_ssl.NetworkedTests) ... 
skipped "SHA256 not available on 'OpenSSL 0.9.7d 17 Mar 2004'" test_connect (test.test_ssl.NetworkedTests) ... ERROR test_get_server_certificate (test.test_ssl.NetworkedTests) ... ERROR test_makefile_close (test.test_ssl.NetworkedTests) ... ERROR test_non_blocking_handshake (test.test_ssl.NetworkedTests) ... ERROR test_asyncore_server (test.test_ssl.ThreadedTests) Check the example asyncore integration. ... server: new connection from 127.0.0.1:51182 client: sending 'TEST MESSAGE of mixed case\n'... client: read 'test message of mixed case\n' client: closing connection. server: closed connection ok test_echo (test.test_ssl.ThreadedTests) Basic test of an SSL client connecting to a server ... server: new connection from ('127.0.0.1', 51184) client: sending 'FOO\n'... server: connection cipher is now ('AES256-SHA', 'TLSv1/SSLv3', 256) server: read 'FOO\n' (encrypted), sending back 'foo\n' (encrypted)... client: read 'foo\n' client: sending bytearray(b'FOO\n')... server: read 'FOO\n' (encrypted), sending back 'foo\n' (encrypted)... client: read 'foo\n' client: sending ... server: read 'FOO\n' (encrypted), sending back 'foo\n' (encrypted)... client: read 'foo\n' client: closing connection. server: client closed connection ok test_empty_cert (test.test_ssl.ThreadedTests) Connecting with an empty cert file ... SSLError is _ssl.c:347: error:140B0009:SSL routines:SSL_CTX_use_PrivateKey_file:PEM lib ok test_getpeercert (test.test_ssl.ThreadedTests) ... {'notAfter': 'Feb 16 16:54:50 2013 GMT', 'subject': ((('countryName', u'US'),), (('stateOrProvinceName', u'Delaware'),), (('localityName', u'Wilmington'),), (('organizationName', u'Python Software Foundation'),), (('organizationalUnitName', u'SSL'),), (('commonName', u'somemachine.python.org'),))} Connection cipher is ('AES256-SHA', 'TLSv1/SSLv3', 256). ok test_handshake_timeout (test.test_ssl.ThreadedTests) ... ok test_malformed_cert (test.test_ssl.ThreadedTests) Connecting with a badly formatted certificate (syntax error) ... SSLError is _ssl.c:361: error:140DC009:SSL routines:SSL_CTX_use_certificate_chain_file:PEM lib ok test_malformed_key (test.test_ssl.ThreadedTests) Connecting with a badly formatted key (syntax error) ... SSLError is _ssl.c:347: error:140B0009:SSL routines:SSL_CTX_use_PrivateKey_file:PEM lib ok test_nonexisting_cert (test.test_ssl.ThreadedTests) Connecting with a non-existing cert file ... SSLError is _ssl.c:499: error:14094418:SSL routines:SSL3_READ_BYTES:tlsv1 alert unknown ca ok test_protocol_sslv2 (test.test_ssl.ThreadedTests) Connecting to an SSLv2 server with various client options ... SSLv2->SSLv2 CERT_NONE SSLv2->SSLv2 CERT_OPTIONAL SSLv2->SSLv2 CERT_REQUIRED SSLv23->SSLv2 CERT_NONE {SSLv3->SSLv2} CERT_NONE {TLSv1->SSLv2} CERT_NONE ok test_protocol_sslv23 (test.test_ssl.ThreadedTests) Connecting to an SSLv23 server with various client options ... SSLv2->SSLv23 CERT_NONE SSLv3->SSLv23 CERT_NONE SSLv23->SSLv23 CERT_NONE TLSv1->SSLv23 CERT_NONE SSLv3->SSLv23 CERT_OPTIONAL SSLv23->SSLv23 CERT_OPTIONAL TLSv1->SSLv23 CERT_OPTIONAL SSLv3->SSLv23 CERT_REQUIRED SSLv23->SSLv23 CERT_REQUIRED TLSv1->SSLv23 CERT_REQUIRED ok test_protocol_sslv3 (test.test_ssl.ThreadedTests) Connecting to an SSLv3 server with various client options ... SSLv3->SSLv3 CERT_NONE SSLv3->SSLv3 CERT_OPTIONAL SSLv3->SSLv3 CERT_REQUIRED {SSLv2->SSLv3} CERT_NONE {SSLv23->SSLv3} CERT_NONE {TLSv1->SSLv3} CERT_NONE ok test_protocol_tlsv1 (test.test_ssl.ThreadedTests) Connecting to a TLSv1 server with various client options ... 
TLSv1->TLSv1 CERT_NONE TLSv1->TLSv1 CERT_OPTIONAL TLSv1->TLSv1 CERT_REQUIRED {SSLv2->TLSv1} CERT_NONE {SSLv3->TLSv1} CERT_NONE {SSLv23->TLSv1} CERT_NONE ok test_recv_send (test.test_ssl.ThreadedTests) Test recv(), send() and friends. ... server: new connection from ('127.0.0.1', 51255) server: connection cipher is now ('AES256-SHA', 'TLSv1/SSLv3', 256) ok test_rude_shutdown (test.test_ssl.ThreadedTests) A brutal shutdown of an SSL server should raise an IOError ... ok test_socketserver (test.test_ssl.ThreadedTests) Using a SocketServer to create and manage SSL connections. ... server (('127.0.0.1', 51258):51258 ('AES256-SHA', 'TLSv1/SSLv3', 256)): [15/Feb/2011 19:11:25] "GET /keycert.pem HTTP/1.0" 200 - client: read 1872 bytes from remote server '>' ok test_starttls (test.test_ssl.ThreadedTests) Switching from clear text to encrypted and back again. ... client: sending 'msg 1'... server: new connection from ('127.0.0.1', 51261) server: read 'msg 1' (unencrypted), sending back 'msg 1' (unencrypted)... client: read 'msg 1' from server client: sending 'MSG 2'... server: read 'MSG 2' (unencrypted), sending back 'msg 2' (unencrypted)... client: read 'msg 2' from server client: sending 'STARTTLS'... server: read STARTTLS from client, sending OK... client: read 'OK\n' from server, starting TLS... client: sending 'MSG 3'... server: read 'MSG 3' (encrypted), sending back 'msg 3' (encrypted)... client: read 'msg 3' from server client: sending 'msg 4'... server: read 'msg 4' (encrypted), sending back 'msg 4' (encrypted)... client: read 'msg 4' from server client: sending 'ENDTLS'... server: read ENDTLS from client, sending OK... client: read 'OK\n' from server, ending TLS... client: sending 'msg 5'... server: connection is now unencrypted... server: read 'msg 5' (unencrypted), sending back 'msg 5' (unencrypted)... client: read 'msg 5' from server client: sending 'msg 6'... server: read 'msg 6' (unencrypted), sending back 'msg 6' (unencrypted)... client: read 'msg 6' from server client: closing connection. server: client closed connection ok test_wrapped_accept (test.test_ssl.ThreadedTests) Check the accept() method on SSL sockets. ... server: wrapped server socket as client: sending 'FOO\n'... server: new connection from ('127.0.0.1', 51263) client cert is {'notAfter': 'Feb 16 16:54:50 2013 GMT', 'subject': ((('countryName', u'US'),), (('stateOrProvinceName', u'Delaware'),), (('localityName', u'Wilmington'),), (('organizationName', u'Python Software Foundation'),), (('organizationalUnitName', u'SSL'),), (('commonName', u'somemachine.python.org'),))} cert binary is 683 bytes server: connection cipher is now ('AES256-SHA', 'TLSv1/SSLv3', 256) server: read 'FOO\n' (encrypted), sending back 'foo\n' (encrypted)... client: read 'foo\n' client: sending bytearray(b'FOO\n')... server: read 'FOO\n' (encrypted), sending back 'foo\n' (encrypted)... client: read 'foo\n' client: sending ... server: read 'FOO\n' (encrypted), sending back 'foo\n' (encrypted)... client: read 'foo\n' client: closing connection. 
server: client closed connection ok ====================================================================== ERROR: test_ciphers (test.test_ssl.BasicTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_ssl.py", line 133, in test_ciphers s.connect(remote) File "/tmp/python-test/local/lib/python2.7/ssl.py", line 292, in connect socket.connect(self, addr) File "/tmp/python-test/local/lib/python2.7/socket.py", line 222, in meth return getattr(self._sock,name)(*args) gaierror: [Errno -2] Name or service not known ====================================================================== ERROR: test_connect (test.test_ssl.NetworkedTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_ssl.py", line 160, in test_connect s.connect(("svn.python.org", 443)) File "/tmp/python-test/local/lib/python2.7/ssl.py", line 292, in connect socket.connect(self, addr) File "/tmp/python-test/local/lib/python2.7/socket.py", line 222, in meth return getattr(self._sock,name)(*args) gaierror: [Errno -2] Name or service not known ====================================================================== ERROR: test_get_server_certificate (test.test_ssl.NetworkedTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_ssl.py", line 229, in test_get_server_certificate pem = ssl.get_server_certificate(("svn.python.org", 443)) File "/tmp/python-test/local/lib/python2.7/ssl.py", line 403, in get_server_certificate s.connect(addr) File "/tmp/python-test/local/lib/python2.7/ssl.py", line 292, in connect socket.connect(self, addr) File "/tmp/python-test/local/lib/python2.7/socket.py", line 222, in meth return getattr(self._sock,name)(*args) gaierror: [Errno -2] Name or service not known ====================================================================== ERROR: test_makefile_close (test.test_ssl.NetworkedTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_ssl.py", line 191, in test_makefile_close ss.connect(("svn.python.org", 443)) File "/tmp/python-test/local/lib/python2.7/ssl.py", line 292, in connect socket.connect(self, addr) File "/tmp/python-test/local/lib/python2.7/socket.py", line 222, in meth return getattr(self._sock,name)(*args) gaierror: [Errno -2] Name or service not known ====================================================================== ERROR: test_non_blocking_handshake (test.test_ssl.NetworkedTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/tmp/python-test/local/lib/python2.7/test/test_ssl.py", line 206, in test_non_blocking_handshake s.connect(("svn.python.org", 443)) File "/tmp/python-test/local/lib/python2.7/socket.py", line 222, in meth return getattr(self._sock,name)(*args) gaierror: [Errno -2] Name or service not known ---------------------------------------------------------------------- Ran 30 tests in 6.884s FAILED (errors=5, skipped=1) test test_ssl failed -- multiple errors occurred test_startfile test_startfile skipped -- module os has no attribute startfile test_str test_strftime test_string test_stringprep test_strop test_strptime test_strtod test_struct test_structmembers test_structseq 
test_subprocess [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] . [15648 refs] [15648 refs] this bit of output is from a test of stdout in a different process ... [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15863 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] [15648 refs] . [15648 refs] [15648 refs] this bit of output is from a test of stdout in a different process ... [15648 refs] [15648 refs] [15863 refs] test_sunaudiodev test_sunaudiodev skipped -- No module named sunaudiodev test_sundry test_symtable test_syntax test_sys [15648 refs] [15648 refs] [15648 refs] [15877 refs] [15671 refs] test_sysconfig [15648 refs] [15648 refs] test_tarfile test_tcl test_tcl skipped -- No module named _tkinter test_telnetlib test_tempfile [15648 refs] test_textwrap test_thread test_threaded_import test_threadedtempfile test_threading [18948 refs] [21123 refs] [20035 refs] [20035 refs] [20035 refs] [20035 refs] test_threading_local test_threadsignals test_time test_timeout test_tk test_tk skipped -- No module named _tkinter test_tokenize test_trace test_traceback test_transformer test_ttk_guionly test_ttk_guionly skipped -- No module named _tkinter test_ttk_textonly test_ttk_textonly skipped -- No module named _tkinter test_tuple test_typechecks test_ucn test_unary test_undocumented_details test_unicode test_unicode_file test_unicode_file skipped -- No Unicode filesystem semantics on this platform. test_unicodedata test_univnewlines test_univnewlines2k test_unpack test_urllib test_urllib2 test_urllib2_localnet test_urllib2net test_urllibnet test_urlparse test_userdict test_userlist test_userstring test_uu test_uuid test_wait3 test_wait4 test_warnings [15679 refs] [15679 refs] [15672 refs] [15679 refs] [15679 refs] [15672 refs] test_wave test_weakref test_weakset test_whichdb test_winreg test_winreg skipped -- No module named _winreg test_winsound test_winsound skipped -- No module named winsound test_with test_wsgiref test_xdrlib test_xml_etree test_xml_etree_c test_xmllib test_xmlrpc test_xpickle sh: line 1: python2.4: command not found sh: line 1: python2.6: command not found test_xrange test_zipfile test_zipfile64 test_zipfile64 skipped -- test requires loads of disk-space bytes and a long time to run test_zipimport test_zipimport_support test_zlib 350 tests OK. 
3 tests failed: test_robotparser test_smtpnet test_ssl 28 tests skipped: test_aepack test_al test_applesingle test_bsddb185 test_cd test_cl test_epoll test_gdb test_gl test_imgfile test_ioctl test_kqueue test_macos test_macostools test_multiprocessing test_pep277 test_py3kwarn test_scriptpackages test_startfile test_sunaudiodev test_tcl test_tk test_ttk_guionly test_ttk_textonly test_unicode_file test_winreg test_winsound test_zipfile64 7 skips unexpected on linux2: test_epoll test_gdb test_ioctl test_multiprocessing test_tk test_ttk_guionly test_ttk_textonly [985097 refs] From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: Fixing typo: Removing debugging line Message-ID: tarek.ziade pushed 1b78bac6dd43 to distutils2: http://hg.python.org/distutils2/rev/1b78bac6dd43 changeset: 986:1b78bac6dd43 user: Andre Espaze date: Sun Jan 30 18:04:54 2011 +0100 summary: Fixing typo: Removing debugging line files: distutils2/util.py diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -805,7 +805,6 @@ cmd = ' '.join(cmd) if verbose: logger.info(cmd) - logger.info(env) if dry_run: return exit_status = sub_call(cmd, shell=True, env=env) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: Parsing extension sections with interpret in setup.cfg Message-ID: tarek.ziade pushed 8813f2ecd580 to distutils2: http://hg.python.org/distutils2/rev/8813f2ecd580 changeset: 987:8813f2ecd580 user: Andre Espaze date: Sun Jan 30 18:06:00 2011 +0100 summary: Parsing extension sections with interpret in setup.cfg files: distutils2/config.py distutils2/tests/test_config.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -14,13 +14,22 @@ from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command +from distutils2.markers import interpret def _pop_values(values_dct, key): """Remove values from the dictionary and convert them as a list""" vals_str = values_dct.pop(key, '') + if not vals_str: + return + fields = [] + for field in vals_str.split(os.linesep): + tmp_vals = field.split('--') + if (len(tmp_vals) == 2) and (not interpret(tmp_vals[1])): + continue + fields.append(tmp_vals[0]) # Get bash options like `gcc -print-file-name=libgcc.a` - vals = split(vals_str) + vals = split(' '.join(fields)) if vals: return vals diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -105,6 +105,8 @@ sources = c_src/speed_coconuts.c extra_link_args = "`gcc -print-file-name=libgcc.a`" -shared define_macros = HAVE_CAIRO HAVE_GTK2 +libraries = gecodeint gecodekernel -- sys.platform != 'win32' + GecodeInt GecodeKernel -- sys.platform == 'win32' [extension=fast_taunt] name = three.fast_taunt @@ -113,6 +115,8 @@ include_dirs = /usr/include/gecode /usr/include/blitz extra_compile_args = -fPIC -O2 + -DGECODE_VERSION=$(./gecode_version) -- sys.platform != 'win32' + /DGECODE_VERSION='win32' -- sys.platform == 'win32' language = cxx """ @@ -284,6 +288,10 @@ ext = ext_modules.get('one.speed_coconuts') 
self.assertEqual(ext.sources, ['c_src/speed_coconuts.c']) self.assertEqual(ext.define_macros, ['HAVE_CAIRO', 'HAVE_GTK2']) + libs = ['gecodeint', 'gecodekernel'] + if sys.platform == 'win32': + libs = ['GecodeInt', 'GecodeKernel'] + self.assertEqual(ext.libraries, libs) self.assertEqual(ext.extra_link_args, ['`gcc -print-file-name=libgcc.a`', '-shared']) @@ -292,7 +300,12 @@ ['cxx_src/utils_taunt.cxx', 'cxx_src/python_module.cxx']) self.assertEqual(ext.include_dirs, ['/usr/include/gecode', '/usr/include/blitz']) - self.assertEqual(ext.extra_compile_args, ['-fPIC', '-O2']) + cargs = ['-fPIC', '-O2'] + if sys.platform == 'win32': + cargs.append("/DGECODE_VERSION='win32'") + else: + cargs.append('-DGECODE_VERSION=$(./gecode_version)') + self.assertEqual(ext.extra_compile_args, cargs) self.assertEqual(ext.language, 'cxx') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: Documentation concerning the ways of contributing to distutils2. Message-ID: tarek.ziade pushed b512b1d2767a to distutils2: http://hg.python.org/distutils2/rev/b512b1d2767a changeset: 988:b512b1d2767a parent: 985:22028f4d78bc user: Julien Miotte date: Mon Jan 31 15:20:12 2011 +0100 summary: Documentation concerning the ways of contributing to distutils2. Currently contains "Reporting issues." files: docs/source/contributing.rst diff --git a/docs/source/contributing.rst b/docs/source/contributing.rst new file mode 100644 --- /dev/null +++ b/docs/source/contributing.rst @@ -0,0 +1,52 @@ +========================== +Contributing to Distutils2 +========================== + +---------------- +Reporting Issues +---------------- + +When using, testing, or developing distutils2, you may encounter issues. Please refer to the following sections to learn how these issues should be reported. + +Please keep in mind that this guide is intended to ease the triage and fixing processes by giving the developers as much information as possible. It should not be viewed as mandatory, only recommended. + +Issues regarding distutils2 commands +==================================== + +- Go to http://bugs.python.org/ (you'll need a Python Bugs account), then "Issues" > "Create ticket". +- **Title**: write a short summary of the issue. + * You may prefix the issue title with [d2_component], where d2_component can be: installer, sdist, setup.cfg, ... This will ease the triage process. + +- **Components**: choose "Distutils2" +- **Version**: choose "3rd party" +- **Comment**: use the following template for versions and reproduction conditions: + * If some of the fields presented don't apply to the issue, feel free to pick only the ones you need. + +:: + + Operating System: + Version of Python: + Version of Distutils2: + + How to reproduce: + + What happens: + + What should happen: + +- Filling in the fields: + * **How to reproduce**: indicate a test case that reproduces the issue. + * **What happens**: describe the error; paste tracebacks if you have any. + * **What should happen**: indicate what you think should be the result of the test case (the wanted behaviour). + * **Versions**: + - If you're using a release of distutils2, you may want to test the latest version of the project (the code under development).
 + - If the issue is present in the latest version, please indicate the tip commit of the version tested.
+ - Be careful to indicate the remote reference (12 characters, for instance c3cf81fc64db), not the local reference (rXXX). + +- If it is relevant, please attach any file that will help reproduce the issue, or logs that help understand the problem (setup.cfg, strace outputs, ...). + +Issues regarding PyPI display of the distutils2 projects +======================================================== + +- Please send a bug report to the catalog-sig at python.org mailing list. +- You can include your setup.cfg, and a link to your project page. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: Refactoring the way we write the distutils setup.py. Message-ID: tarek.ziade pushed c0ca4120182c to distutils2: http://hg.python.org/distutils2/rev/c0ca4120182c changeset: 989:c0ca4120182c parent: 976:69240ad43f64 user: Julien Miotte date: Mon Jan 31 17:32:43 2011 +0100 summary: Refactoring the way we write the distutils setup.py. Using the: handle = open() try: ... finally: handle.close() pattern, as advised by Eric Araujo. Also, instead of several calls to handle.write(), use a multi-line string. files: distutils2/util.py diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -1156,12 +1156,18 @@ raise DistutilsFileError("A pre existing setup.py file exists") handle = open("setup.py", "w") - handle.write("# Distutils script using distutils2 setup.cfg to call the\n") - handle.write("# distutils.core.setup() with the right args.\n\n\n") - handle.write("import os\n") - handle.write("from distutils.core import setup\n") - handle.write("from ConfigParser import RawConfigParser\n\n") - handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) - handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") - handle.write("setup(**kwargs)") - handle.close() + try: + handle.write( + "# Distutils script using distutils2 setup.cfg to call the\n" + "# distutils.core.setup() with the right args.\n\n" + "import os\n" + "from distutils.core import setup\n" + "from ConfigParser import RawConfigParser\n\n" + "" + getsource(generate_distutils_kwargs_from_setup_cfg) + "\n\n" + "kwargs = generate_distutils_kwargs_from_setup_cfg()\n" + "setup(**kwargs)\n" + ) + finally: + handle.close() + +generate_distutils_setup_py() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: Removing useless print message in the distutils->d2 setup.py script. Message-ID: tarek.ziade pushed 16a91e49164c to distutils2: http://hg.python.org/distutils2/rev/16a91e49164c changeset: 990:16a91e49164c user: Julien Miotte date: Mon Jan 31 17:33:28 2011 +0100 summary: Removing useless print message in the distutils->d2 setup.py script.
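Both changesets above (989 and 990) touch the helper in distutils2/util.py that writes a plain-distutils setup.py shim from setup.cfg. The sketch below only illustrates the pattern adopted in changeset 989 (write the generated file as one multi-line string and close the handle in a ``finally`` block); the helper name and the shim body here are hypothetical, not the actual distutils2 output::

    # Illustrative sketch, not the distutils2 implementation. The real helper
    # embeds the source of generate_distutils_kwargs_from_setup_cfg() and ends
    # the generated file with setup(**kwargs).
    def write_setup_shim(path="setup.py"):
        handle = open(path, "w")
        try:
            # One multi-line string instead of many handle.write() calls.
            handle.write(
                "# Generated shim: calls distutils.core.setup() with\n"
                "# arguments read from setup.cfg.\n\n"
                "from distutils.core import setup\n\n"
                "setup(name='example', version='0.1')\n"
            )
        finally:
            # The handle is closed even if write() raises.
            handle.close()

    # Usage (creates ./setup.py in the current directory):
    # write_setup_shim()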
files: distutils2/util.py diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -1128,7 +1128,6 @@ # There is no such option in the setup.cfg if arg == "long_description": filename = has_get_option(config, section, "description_file") - print "We have a filename", filename if filename: in_cfg_value = open(filename).read() else: -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: metadata.check checks for missing author Message-ID: tarek.ziade pushed 6c4e92fdb95e to distutils2: http://hg.python.org/distutils2/rev/6c4e92fdb95e changeset: 993:6c4e92fdb95e parent: 915:374f93ab103c user: Godefroid Chapelle date: Sat Jan 29 17:36:34 2011 +0100 summary: metadata.check checks for missing author add tests accordingly files: distutils2/command/check.py distutils2/metadata.py distutils2/tests/test_metadata.py diff --git a/distutils2/command/check.py b/distutils2/command/check.py --- a/distutils2/command/check.py +++ b/distutils2/command/check.py @@ -52,8 +52,7 @@ def check_metadata(self): """Ensures that all required elements of metadata are supplied. - name, version, URL, (author and author_email) or - (maintainer and maintainer_email)). + name, version, URL, author Warns if any are missing. """ diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -466,7 +466,7 @@ msg = "missing required metadata: %s" % ', '.join(missing) raise MetadataMissingError(msg) - for attr in ('Home-page',): + for attr in ('Home-page', 'Author'): if attr not in self: missing.append(attr) diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -220,17 +220,75 @@ [('one', 'http://ok')]) self.assertEqual(metadata.version, '1.2') - def test_check(self): + def test_check_version(self): + metadata = DistributionMetadata() + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Version']) + + def test_check_version_strict(self): + metadata = DistributionMetadata() + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + from distutils2.errors import MetadataMissingError + self.assertRaises(MetadataMissingError, metadata.check, strict=True) + + def test_check_name(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Name']) + + def test_check_name_strict(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + from distutils2.errors import MetadataMissingError + self.assertRaises(MetadataMissingError, metadata.check, strict=True) + + def test_check_author(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + 
metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Author']) + + def test_check_homepage(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Name'] = 'vimpdb' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Home-page']) + + def test_check_predicates(self): metadata = DistributionMetadata() metadata['Version'] = 'rr' + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' metadata['Requires-dist'] = ['Foo (a)'] + metadata['Obsoletes-dist'] = ['Foo (a)'] + metadata['Provides-dist'] = ['Foo (a)'] if metadata.docutils_support: missing, warnings = metadata.check() - self.assertEqual(len(warnings), 2) + self.assertEqual(len(warnings), 4) metadata.docutils_support = False missing, warnings = metadata.check() - self.assertEqual(missing, ['Name', 'Home-page']) - self.assertEqual(len(warnings), 2) + self.assertEqual(len(warnings), 4) def test_best_choice(self): metadata = DistributionMetadata() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: merging Julien Miotte work Message-ID: tarek.ziade pushed feda55d419cb to distutils2: http://hg.python.org/distutils2/rev/feda55d419cb changeset: 991:feda55d419cb parent: 987:8813f2ecd580 parent: 988:b512b1d2767a user: Alexis Metaireau date: Tue Feb 01 01:37:52 2011 +0000 summary: merging Julien Miotte work files: diff --git a/docs/source/contributing.rst b/docs/source/contributing.rst new file mode 100644 --- /dev/null +++ b/docs/source/contributing.rst @@ -0,0 +1,52 @@ +========================== +Contributing to Distutils2 +========================== + +---------------- +Reporting Issues +---------------- + +When using, testing, developping distutils2, you may encounter issues. Please report to the following sections to know how these issues should be reported. + +Please keep in mind that this guide is intended to ease the triage and fixing processes by giving the maximum information to the developers. It should not be viewed as mandatory, only advised ;). + +Issues regarding distutils2 commands +==================================== + +- Go to http://bugs.python.org/ (you'll need a Python Bugs account), then "Issues" > "Create ticket". +- **Title**: write in a short summary of the issue. + * You may prefix the issue title with [d2_component], where d2_component can be : installer, sdist, setup.cfg, ... This will ease up the triage process. + +- **Components**: choose "Distutils2" +- **Version**: choose "3rd party" +- **Comment**: use the following template for versions, reproduction conditions: + * If some of the fields presented don't apply to the issue, feel free to pick only the ones you need. + +:: + + Operating System: + Version of Python: + Version of Distutils2: + + How to reproduce: + + What happens: + + What should happen: + +- Filling in the fields: + * **How to reproduce**: indicate some test case to reproduce the issue. + * **What happens**: describe what is the error, paste tracebacks if you have any. + * **What should happen**: indicate what you think should be the result of the test case (wanted behaviour). 
+ * **Versions**: + - If you're using a release of distutils2, you may want to test the latest version of the project (under developpment code). + - If the issue is present in the latest version, please indicate the tip commit of the version tested. + - Be careful to indicate the remote reference (12 characters, for instance c3cf81fc64db), not the local reference (rXXX). + +- If it is relevant, please join any file that will help reproducing the issue or logs to understand the problem (setup.cfg, strace ouptups, ...). + +Issues regarding PyPI display of the distutils2 projects +======================================================== + +- Please send a bug report to the catalog-sig at python.org mailing list. +- You can include your setup.cfg, and a link to your project page. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: merge Message-ID: tarek.ziade pushed a832bc472995 to distutils2: http://hg.python.org/distutils2/rev/a832bc472995 changeset: 992:a832bc472995 parent: 991:feda55d419cb parent: 990:16a91e49164c user: Alexis Metaireau date: Tue Feb 01 01:38:08 2011 +0000 summary: merge files: distutils2/util.py diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -1080,7 +1080,6 @@ # There is no such option in the setup.cfg if arg == "long_description": filename = has_get_option(config, section, "description_file") - print "We have a filename", filename if filename: in_cfg_value = open(filename).read() else: @@ -1108,12 +1107,18 @@ raise DistutilsFileError("A pre existing setup.py file exists") handle = open("setup.py", "w") - handle.write("# Distutils script using distutils2 setup.cfg to call the\n") - handle.write("# distutils.core.setup() with the right args.\n\n\n") - handle.write("import os\n") - handle.write("from distutils.core import setup\n") - handle.write("from ConfigParser import RawConfigParser\n\n") - handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) - handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") - handle.write("setup(**kwargs)") - handle.close() + try: + handle.write( + "# Distutils script using distutils2 setup.cfg to call the\n" + "# distutils.core.setup() with the right args.\n\n" + "import os\n" + "from distutils.core import setup\n" + "from ConfigParser import RawConfigParser\n\n" + "" + getsource(generate_distutils_kwargs_from_setup_cfg) + "\n\n" + "kwargs = generate_distutils_kwargs_from_setup_cfg()\n" + "setup(**kwargs)\n" + ) + finally: + handle.close() + +generate_distutils_setup_py() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: metadata warnings were not shown to the user Message-ID: tarek.ziade pushed 7bbd7533bc87 to distutils2: http://hg.python.org/distutils2/rev/7bbd7533bc87 changeset: 995:7bbd7533bc87 parent: 993:6c4e92fdb95e user: Godefroid Chapelle date: Sun Jan 30 12:28:40 2011 +0100 summary: metadata warnings were not shown to the user files: distutils2/command/check.py distutils2/tests/test_command_check.py diff --git a/distutils2/command/check.py b/distutils2/command/check.py --- a/distutils2/command/check.py +++ b/distutils2/command/check.py @@ -56,9 +56,11 @@ Warns if any are 
missing. """ - missing, __ = self.distribution.metadata.check(strict=True) + missing, warnings = self.distribution.metadata.check(strict=True) if missing != []: self.warn("missing required metadata: %s" % ', '.join(missing)) + for warning in warnings: + self.warn(warning) def check_restructuredtext(self): """Checks if the long string fields are reST-compliant.""" diff --git a/distutils2/tests/test_command_check.py b/distutils2/tests/test_command_check.py --- a/distutils2/tests/test_command_check.py +++ b/distutils2/tests/test_command_check.py @@ -48,6 +48,43 @@ cmd = self._run(metadata, strict=1) self.assertEqual(len(cmd._warnings), 0) + def test_check_metadata_1_2(self): + # let's run the command with no metadata at all + # by default, check is checking the metadata + # should have some warnings + cmd = self._run() + self.assertTrue(len(cmd._warnings) > 0) + + # now let's add the required fields + # and run it again, to make sure we don't get + # any warning anymore + # let's use requires_python as a marker to enforce + # Metadata-Version 1.2 + metadata = {'home_page': 'xxx', 'author': 'xxx', + 'author_email': 'xxx', + 'name': 'xxx', 'version': 'xxx', + 'requires_python': '2.4', + } + cmd = self._run(metadata) + self.assertEqual(len(cmd._warnings), 1) + + # now with the strict mode, we should + # get an error if there are missing metadata + self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, {'name':'xxx', 'version':'xxx'}, **{'strict': 1}) + + # complain about version format + self.assertRaises(DistutilsSetupError, self._run, metadata, **{'strict': 1}) + + # now with correct version format + metadata = {'home_page': 'xxx', 'author': 'xxx', + 'author_email': 'xxx', + 'name': 'xxx', 'version': '1.2', + 'requires_python': '2.4', + } + cmd = self._run(metadata, strict=1) + self.assertEqual(len(cmd._warnings), 0) + @unittest.skipUnless(_HAS_DOCUTILS, "requires docutils") def test_check_restructuredtext(self): # let's see if it detects broken rest in long_description @@ -79,7 +116,7 @@ cmd = check(dist) cmd.check_hooks_resolvable() self.assertEqual(len(cmd._warnings), 1) - + def test_suite(): return unittest.makeSuite(CheckTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: pep8 format Message-ID: tarek.ziade pushed a73c351afd57 to distutils2: http://hg.python.org/distutils2/rev/a73c351afd57 changeset: 997:a73c351afd57 user: Godefroid Chapelle date: Sun Jan 30 12:38:30 2011 +0100 summary: pep8 format files: distutils2/tests/test_command_check.py diff --git a/distutils2/tests/test_command_check.py b/distutils2/tests/test_command_check.py --- a/distutils2/tests/test_command_check.py +++ b/distutils2/tests/test_command_check.py @@ -6,13 +6,14 @@ from distutils2.errors import DistutilsSetupError from distutils2.errors import MetadataMissingError + class CheckTestCase(support.LoggingCatcher, support.TempdirManager, unittest.TestCase): def _run(self, metadata=None, **options): if metadata is None: - metadata = {'name':'xxx', 'version':'xxx'} + metadata = {'name': 'xxx', 'version': 'xxx'} pkg_info, dist = self.create_dist(**metadata) cmd = check(dist) cmd.initialize_options() @@ -34,7 +35,7 @@ # any warning anymore metadata = {'home_page': 'xxx', 'author': 'xxx', 'author_email': 'xxx', - 'name': 'xxx', 'version': 'xxx' + 'name': 'xxx', 
'version': 'xxx', } cmd = self._run(metadata) self.assertEqual(len(cmd._warnings), 0) @@ -42,7 +43,8 @@ # now with the strict mode, we should # get an error if there are missing metadata self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1}) - self.assertRaises(DistutilsSetupError, self._run, {'name':'xxx', 'version':'xxx'}, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, + {'name': 'xxx', 'version': 'xxx'}, **{'strict': 1}) # and of course, no error when all metadata fields are present cmd = self._run(metadata, strict=1) @@ -71,10 +73,12 @@ # now with the strict mode, we should # get an error if there are missing metadata self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1}) - self.assertRaises(DistutilsSetupError, self._run, {'name':'xxx', 'version':'xxx'}, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, + {'name': 'xxx', 'version': 'xxx'}, **{'strict': 1}) # complain about version format - self.assertRaises(DistutilsSetupError, self._run, metadata, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, metadata, + **{'strict': 1}) # now with correct version format metadata = {'home_page': 'xxx', 'author': 'xxx', @@ -102,7 +106,7 @@ def test_check_all(self): self.assertRaises(DistutilsSetupError, self._run, - {'name':'xxx', 'version':'xxx'}, **{'strict': 1, + {'name': 'xxx', 'version': 'xxx'}, **{'strict': 1, 'all': 1}) self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1, -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: merge changes by sprint Message-ID: tarek.ziade pushed f314fe720c70 to distutils2: http://hg.python.org/distutils2/rev/f314fe720c70 changeset: 994:f314fe720c70 parent: 993:6c4e92fdb95e parent: 943:3bb80edf7d81 user: Godefroid Chapelle date: Sat Jan 29 17:50:13 2011 +0100 summary: merge changes by sprint files: distutils2/metadata.py distutils2/tests/test_metadata.py diff --git a/distutils2/_backport/__init__.py b/distutils2/_backport/__init__.py --- a/distutils2/_backport/__init__.py +++ b/distutils2/_backport/__init__.py @@ -1,8 +1,2 @@ """Things that will land in the Python 3.3 std lib but which we must drag along with us for now to support 2.x.""" - -def any(seq): - for elem in seq: - if elem: - return True - return False diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -1,24 +1,19 @@ """Utilities to support packages.""" -# NOTE: This module must remain compatible with Python 2.3, as it is shared -# by setuptools for distribution with Python 2.3 and up. 
- import os import sys import imp -import os.path +import re +import warnings from csv import reader as csv_reader from types import ModuleType from distutils2.errors import DistutilsError from distutils2.metadata import DistributionMetadata from distutils2.version import suggest_normalized_version, VersionPredicate -import zipimport try: import cStringIO as StringIO except ImportError: import StringIO -import re -import warnings __all__ = [ @@ -28,10 +23,14 @@ 'Distribution', 'EggInfoDistribution', 'distinfo_dirname', 'get_distributions', 'get_distribution', 'get_file_users', 'provides_distribution', 'obsoletes_distribution', - 'enable_cache', 'disable_cache', 'clear_cache' + 'enable_cache', 'disable_cache', 'clear_cache', ] +########################## +# PEP 302 Implementation # +########################## + def read_code(stream): # This helper is needed in order for the :pep:`302` emulation to # correctly handle compiled files @@ -41,7 +40,7 @@ if magic != imp.get_magic(): return None - stream.read(4) # Skip timestamp + stream.read(4) # Skip timestamp return marshal.load(stream) @@ -173,7 +172,6 @@ #@simplegeneric def iter_importer_modules(importer, prefix=''): - "" if not hasattr(importer, 'iter_modules'): return [] return importer.iter_modules(prefix) @@ -331,9 +329,9 @@ def get_filename(self, fullname=None): fullname = self._fix_name(fullname) mod_type = self.etc[2] - if self.etc[2] == imp.PKG_DIRECTORY: + if mod_type == imp.PKG_DIRECTORY: return self._get_delegate().get_filename() - elif self.etc[2] in (imp.PY_SOURCE, imp.PY_COMPILED, imp.C_EXTENSION): + elif mod_type in (imp.PY_SOURCE, imp.PY_COMPILED, imp.C_EXTENSION): return self.filename return None @@ -432,7 +430,8 @@ import mechanism will find the latter. Items of the following types can be affected by this discrepancy: - ``imp.C_EXTENSION, imp.PY_SOURCE, imp.PY_COMPILED, imp.PKG_DIRECTORY`` + :data:`imp.C_EXTENSION`, :data:`imp.PY_SOURCE`, :data:`imp.PY_COMPILED`, + :data:`imp.PKG_DIRECTORY` """ if fullname.startswith('.'): raise ImportError("Relative module names not supported") @@ -534,13 +533,13 @@ # frozen package. Return the path unchanged in that case. return path - pname = os.path.join(*name.split('.')) # Reconstitute as relative path + pname = os.path.join(*name.split('.')) # Reconstitute as relative path # Just in case os.extsep != '.' sname = os.extsep.join(name.split('.')) sname_pkg = sname + os.extsep + "pkg" init_py = "__init__" + os.extsep + "py" - path = path[:] # Start with a copy of the existing path + path = path[:] # Start with a copy of the existing path for dir in sys.path: if not isinstance(dir, basestring) or not os.path.isdir(dir): @@ -565,7 +564,7 @@ line = line.rstrip('\n') if not line or line.startswith('#'): continue - path.append(line) # Don't check for existence! + path.append(line) # Don't check for existence! 
f.close() return path @@ -609,6 +608,7 @@ resource_name = os.path.join(*parts) return loader.get_data(resource_name) + ########################## # PEP 376 Implementation # ########################## @@ -616,12 +616,12 @@ DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED',) # Cache -_cache_name = {} # maps names to Distribution instances -_cache_name_egg = {} # maps names to EggInfoDistribution instances -_cache_path = {} # maps paths to Distribution instances -_cache_path_egg = {} # maps paths to EggInfoDistribution instances -_cache_generated = False # indicates if .dist-info distributions are cached -_cache_generated_egg = False # indicates if .dist-info and .egg are cached +_cache_name = {} # maps names to Distribution instances +_cache_name_egg = {} # maps names to EggInfoDistribution instances +_cache_path = {} # maps paths to Distribution instances +_cache_path_egg = {} # maps paths to EggInfoDistribution instances +_cache_generated = False # indicates if .dist-info distributions are cached +_cache_generated_egg = False # indicates if .dist-info and .egg are cached _cache_enabled = True @@ -636,6 +636,7 @@ _cache_enabled = True + def disable_cache(): """ Disables the internal cache. @@ -647,9 +648,10 @@ _cache_enabled = False + def clear_cache(): """ Clears the internal cache. """ - global _cache_name, _cache_name_egg, cache_path, _cache_path_egg, \ + global _cache_name, _cache_name_egg, _cache_path, _cache_path_egg, \ _cache_generated, _cache_generated_egg _cache_name = {} @@ -660,14 +662,14 @@ _cache_generated_egg = False -def _yield_distributions(include_dist, include_egg): +def _yield_distributions(include_dist, include_egg, paths=sys.path): """ Yield .dist-info and .egg(-info) distributions, based on the arguments :parameter include_dist: yield .dist-info distributions :parameter include_egg: yield .egg(-info) distributions """ - for path in sys.path: + for path in paths: realpath = os.path.realpath(path) if not os.path.isdir(realpath): continue @@ -679,7 +681,7 @@ dir.endswith('.egg')): yield EggInfoDistribution(dist_path) -def _generate_cache(use_egg_info=False): +def _generate_cache(use_egg_info=False, paths=sys.path): global _cache_generated, _cache_generated_egg if _cache_generated_egg or (_cache_generated and not use_egg_info): @@ -688,7 +690,7 @@ gen_dist = not _cache_generated gen_egg = use_egg_info - for dist in _yield_distributions(gen_dist, gen_egg): + for dist in _yield_distributions(gen_dist, gen_egg, paths): if isinstance(dist, Distribution): _cache_path[dist.path] = dist if not dist.name in _cache_name: @@ -872,7 +874,8 @@ if isinstance(strs, basestring): for s in strs.splitlines(): s = s.strip() - if s and not s.startswith('#'): # skip blank lines/comments + # skip blank lines/comments + if s and not s.startswith('#'): yield s else: for ss in strs: @@ -890,6 +893,7 @@ except IOError: requires = None else: + # FIXME handle the case where zipfile is not available zipf = zipimport.zipimporter(path) fileobj = StringIO.StringIO(zipf.get_data('EGG-INFO/PKG-INFO')) self.metadata = DistributionMetadata(fileobj=fileobj) @@ -952,7 +956,7 @@ version = match.group('first') if match.group('rest'): version += match.group('rest') - version = version.replace(' ', '') # trim spaces + version = version.replace(' ', '') # trim spaces if version is None: reqs.append(name) else: @@ -982,12 +986,6 @@ __hash__ = object.__hash__ -def _normalize_dist_name(name): - """Returns a normalized name from the given *name*. 
- :rtype: string""" - return name.replace('-', '_') - - def distinfo_dirname(name, version): """ The *name* and *version* parameters are converted into their @@ -1007,7 +1005,7 @@ :returns: directory name :rtype: string""" file_extension = '.dist-info' - name = _normalize_dist_name(name) + name = name.replace('-', '_') normalized_version = suggest_normalized_version(version) # Because this is a lookup procedure, something will be returned even if # it is a version that cannot be normalized @@ -1017,7 +1015,7 @@ return '-'.join([name, normalized_version]) + file_extension -def get_distributions(use_egg_info=False): +def get_distributions(use_egg_info=False, paths=sys.path): """ Provides an iterator that looks for ``.dist-info`` directories in ``sys.path`` and returns :class:`Distribution` instances for each one of @@ -1028,7 +1026,7 @@ instances """ if not _cache_enabled: - for dist in _yield_distributions(True, use_egg_info): + for dist in _yield_distributions(True, use_egg_info, paths): yield dist else: _generate_cache(use_egg_info) @@ -1041,7 +1039,7 @@ yield dist -def get_distribution(name, use_egg_info=False): +def get_distribution(name, use_egg_info=False, paths=sys.path): """ Scans all elements in ``sys.path`` and looks for all directories ending with ``.dist-info``. Returns a :class:`Distribution` @@ -1059,7 +1057,7 @@ :rtype: :class:`Distribution` or :class:`EggInfoDistribution` or None """ if not _cache_enabled: - for dist in _yield_distributions(True, use_egg_info): + for dist in _yield_distributions(True, use_egg_info, paths): if dist.name == name: return dist else: @@ -1148,7 +1146,7 @@ raise DistutilsError(('Distribution %s has invalid ' + 'provides field: %s') \ % (dist.name, p)) - p_ver = p_ver[1:-1] # trim off the parenthesis + p_ver = p_ver[1:-1] # trim off the parenthesis if p_name == name and predicate.match(p_ver): yield dist break diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -1,4 +1,4 @@ -"""Utility functions for copying files and directory trees. +"""Utility functions for copying and archiving files and directory trees. XXX The functions here don't copy the resource fork or other metadata on Mac. @@ -9,7 +9,13 @@ import stat from os.path import abspath import fnmatch -from warnings import warn +import errno + +try: + import bz2 + _BZ2_SUPPORTED = True +except ImportError: + _BZ2_SUPPORTED = False try: from pwd import getpwnam @@ -21,9 +27,12 @@ except ImportError: getgrnam = None -__all__ = ["copyfileobj","copyfile","copymode","copystat","copy","copy2", - "copytree","move","rmtree","Error", "SpecialFileError", - "ExecError","make_archive"] +__all__ = ["copyfileobj", "copyfile", "copymode", "copystat", "copy", "copy2", + "copytree", "move", "rmtree", "Error", "SpecialFileError", + "ExecError", "make_archive", "get_archive_formats", + "register_archive_format", "unregister_archive_format", + "get_unpack_formats", "register_unpack_format", + "unregister_unpack_format", "unpack_archive"] class Error(EnvironmentError): pass @@ -35,6 +44,14 @@ class ExecError(EnvironmentError): """Raised when a command could not be executed""" +class ReadError(EnvironmentError): + """Raised when an archive cannot be read""" + +class RegistryError(Exception): + """Raised when a registery operation with the archiving + and unpacking registeries fails""" + + try: WindowsError except NameError: @@ -50,7 +67,7 @@ def _samefile(src, dst): # Macintosh, Unix. 
- if hasattr(os.path,'samefile'): + if hasattr(os.path, 'samefile'): try: return os.path.samefile(src, dst) except OSError: @@ -63,10 +80,8 @@ def copyfile(src, dst): """Copy data from src to dst""" if _samefile(src, dst): - raise Error, "`%s` and `%s` are the same file" % (src, dst) + raise Error("`%s` and `%s` are the same file" % (src, dst)) - fsrc = None - fdst = None for fn in [src, dst]: try: st = os.stat(fn) @@ -77,15 +92,16 @@ # XXX What about other special files? (sockets, devices...) if stat.S_ISFIFO(st.st_mode): raise SpecialFileError("`%s` is a named pipe" % fn) + + fsrc = open(src, 'rb') try: - fsrc = open(src, 'rb') fdst = open(dst, 'wb') - copyfileobj(fsrc, fdst) + try: + copyfileobj(fsrc, fdst) + finally: + fdst.close() finally: - if fdst: - fdst.close() - if fsrc: - fsrc.close() + fsrc.close() def copymode(src, dst): """Copy mode bits from src to dst""" @@ -103,8 +119,12 @@ if hasattr(os, 'chmod'): os.chmod(dst, mode) if hasattr(os, 'chflags') and hasattr(st, 'st_flags'): - os.chflags(dst, st.st_flags) - + try: + os.chflags(dst, st.st_flags) + except OSError, why: + if (not hasattr(errno, 'EOPNOTSUPP') or + why.errno != errno.EOPNOTSUPP): + raise def copy(src, dst): """Copy data and mode bits ("cp src dst"). @@ -140,8 +160,9 @@ return set(ignored_names) return _ignore_patterns -def copytree(src, dst, symlinks=False, ignore=None): - """Recursively copy a directory tree using copy2(). +def copytree(src, dst, symlinks=False, ignore=None, copy_function=copy2, + ignore_dangling_symlinks=False): + """Recursively copy a directory tree. The destination directory must not already exist. If exception(s) occur, an Error is raised with a list of reasons. @@ -149,7 +170,13 @@ If the optional symlinks flag is true, symbolic links in the source tree result in symbolic links in the destination tree; if it is false, the contents of the files pointed to by symbolic - links are copied. + links are copied. If the file pointed by the symlink doesn't + exist, an exception will be added in the list of errors raised in + an Error exception at the end of the copy process. + + You can set the optional ignore_dangling_symlinks flag to true if you + want to silence this exception. Notice that this has no effect on + platforms that don't support os.symlink. The optional ignore argument is a callable. If given, it is called with the `src` parameter, which is the directory @@ -163,7 +190,10 @@ list of names relative to the `src` directory that should not be copied. - XXX Consider this example code rather than the ultimate tool. + The optional copy_function argument is a callable that will be used + to copy each file. It will be called with the source path and the + destination path as arguments. By default, copy2() is used, but any + function that supports the same signature (like copy()) can be used. """ names = os.listdir(src) @@ -182,14 +212,21 @@ srcname = os.path.join(src, name) dstname = os.path.join(dst, name) try: - if symlinks and os.path.islink(srcname): + if os.path.islink(srcname): linkto = os.readlink(srcname) - os.symlink(linkto, dstname) + if symlinks: + os.symlink(linkto, dstname) + else: + # ignore dangling symlink if the flag is on + if not os.path.exists(linkto) and ignore_dangling_symlinks: + continue + # otherwise let the copy occurs. 
copy2 will raise an error + copy_function(srcname, dstname) elif os.path.isdir(srcname): - copytree(srcname, dstname, symlinks, ignore) + copytree(srcname, dstname, symlinks, ignore, copy_function) else: # Will raise a SpecialFileError for unsupported file types - copy2(srcname, dstname) + copy_function(srcname, dstname) # catch the Error from the recursive copytree so that we can # continue with other files except Error, err: @@ -205,7 +242,7 @@ else: errors.extend((src, dst, str(why))) if errors: - raise Error, errors + raise Error(errors) def rmtree(path, ignore_errors=False, onerror=None): """Recursively delete a directory tree. @@ -235,7 +272,7 @@ names = [] try: names = os.listdir(path) - except os.error, err: + except os.error: onerror(os.listdir, path, sys.exc_info()) for name in names: fullname = os.path.join(path, name) @@ -248,7 +285,7 @@ else: try: os.remove(fullname) - except os.error, err: + except os.error: onerror(os.remove, fullname, sys.exc_info()) try: os.rmdir(path) @@ -282,13 +319,13 @@ if os.path.isdir(dst): real_dst = os.path.join(dst, _basename(src)) if os.path.exists(real_dst): - raise Error, "Destination path '%s' already exists" % real_dst + raise Error("Destination path '%s' already exists" % real_dst) try: os.rename(src, real_dst) except OSError: if os.path.isdir(src): if _destinsrc(src, dst): - raise Error, "Cannot move a directory '%s' into itself '%s'." % (src, dst) + raise Error("Cannot move a directory '%s' into itself '%s'." % (src, dst)) copytree(src, real_dst, symlinks=True) rmtree(src) else: @@ -333,40 +370,41 @@ """Create a (possibly compressed) tar file from all the files under 'base_dir'. - 'compress' must be "gzip" (the default), "compress", "bzip2", or None. - (compress will be deprecated in Python 3.2) + 'compress' must be "gzip" (the default), "bzip2", or None. 'owner' and 'group' can be used to define an owner and a group for the archive that is being built. If not provided, the current owner and group will be used. The output tar file will be named 'base_dir' + ".tar", possibly plus - the appropriate compression extension (".gz", ".bz2" or ".Z"). + the appropriate compression extension (".gz", or ".bz2"). Returns the output filename. 
""" - tar_compression = {'gzip': 'gz', 'bzip2': 'bz2', None: '', 'compress': ''} - compress_ext = {'gzip': '.gz', 'bzip2': '.bz2', 'compress': '.Z'} + tar_compression = {'gzip': 'gz', None: ''} + compress_ext = {'gzip': '.gz'} + + if _BZ2_SUPPORTED: + tar_compression['bzip2'] = 'bz2' + compress_ext['bzip2'] = '.bz2' # flags for compression program, each element of list will be an argument if compress is not None and compress not in compress_ext: - raise ValueError, \ - ("bad value for 'compress': must be None, 'gzip', 'bzip2' " - "or 'compress'") + raise ValueError("bad value for 'compress', or compression format not " + "supported: %s" % compress) - archive_name = base_name + '.tar' - if compress != 'compress': - archive_name += compress_ext.get(compress, '') + archive_name = base_name + '.tar' + compress_ext.get(compress, '') + archive_dir = os.path.dirname(archive_name) - archive_dir = os.path.dirname(archive_name) if not os.path.exists(archive_dir): if logger is not None: - logger.info("creating %s" % archive_dir) + logger.info("creating %s", archive_dir) if not dry_run: os.makedirs(archive_dir) - # creating the tarball + # XXX late import because of circular dependency between shutil and + # tarfile :( from distutils2._backport import tarfile if logger is not None: @@ -391,23 +429,9 @@ finally: tar.close() - # compression using `compress` - # XXX this block will be removed in Python 3.2 - if compress == 'compress': - warn("'compress' will be deprecated.", PendingDeprecationWarning) - # the option varies depending on the platform - compressed_name = archive_name + compress_ext[compress] - if sys.platform == 'win32': - cmd = [compress, archive_name, compressed_name] - else: - cmd = [compress, '-f', archive_name] - from distutils2.spawn import spawn - spawn(cmd, dry_run=dry_run) - return compressed_name - return archive_name -def _call_external_zip(directory, verbose=False): +def _call_external_zip(base_dir, zip_filename, verbose=False, dry_run=False): # XXX see if we want to keep an external call here if verbose: zipoptions = "-r" @@ -420,8 +444,7 @@ except DistutilsExecError: # XXX really should distinguish between "couldn't find # external 'zip' command" and "zip failed". - raise ExecError, \ - ("unable to create zip file '%s': " + raise ExecError("unable to create zip file '%s': " "could neither import the 'zipfile' module nor " "find a standalone zip utility") % zip_filename @@ -451,7 +474,7 @@ zipfile = None if zipfile is None: - _call_external_zip(base_dir, verbose) + _call_external_zip(base_dir, zip_filename, verbose, dry_run) else: if logger is not None: logger.info("creating '%s' and adding '%s' to it", @@ -475,12 +498,14 @@ _ARCHIVE_FORMATS = { 'gztar': (_make_tarball, [('compress', 'gzip')], "gzip'ed tar-file"), 'bztar': (_make_tarball, [('compress', 'bzip2')], "bzip2'ed tar-file"), - 'ztar': (_make_tarball, [('compress', 'compress')], - "compressed tar file"), 'tar': (_make_tarball, [('compress', None)], "uncompressed tar file"), - 'zip': (_make_zipfile, [],"ZIP file") + 'zip': (_make_zipfile, [], "ZIP file"), } +if _BZ2_SUPPORTED: + _ARCHIVE_FORMATS['bztar'] = (_make_tarball, [('compress', 'bzip2')], + "bzip2'ed tar-file") + def get_archive_formats(): """Returns a list of supported formats for archiving and unarchiving. 
@@ -507,7 +532,7 @@ if not isinstance(extra_args, (tuple, list)): raise TypeError('extra_args needs to be a sequence') for element in extra_args: - if not isinstance(element, (tuple, list)) or len(element) !=2 : + if not isinstance(element, (tuple, list)) or len(element) !=2: raise TypeError('extra_args elements are : (arg_name, value)') _ARCHIVE_FORMATS[name] = (function, extra_args, description) @@ -520,7 +545,7 @@ """Create an archive file (eg. zip or tar). 'base_name' is the name of the file to create, minus any format-specific - extension; 'format' is the archive format: one of "zip", "tar", "ztar", + extension; 'format' is the archive format: one of "zip", "tar", "bztar" or "gztar". 'root_dir' is a directory that will be the root directory of the @@ -549,7 +574,7 @@ try: format_info = _ARCHIVE_FORMATS[format] except KeyError: - raise ValueError, "unknown archive format '%s'" % format + raise ValueError("unknown archive format '%s'" % format) func = format_info[0] for arg, val in format_info[1]: @@ -568,3 +593,169 @@ os.chdir(save_cwd) return filename + + +def get_unpack_formats(): + """Returns a list of supported formats for unpacking. + + Each element of the returned sequence is a tuple + (name, extensions, description) + """ + formats = [(name, info[0], info[3]) for name, info in + _UNPACK_FORMATS.iteritems()] + formats.sort() + return formats + +def _check_unpack_options(extensions, function, extra_args): + """Checks what gets registered as an unpacker.""" + # first make sure no other unpacker is registered for this extension + existing_extensions = {} + for name, info in _UNPACK_FORMATS.iteritems(): + for ext in info[0]: + existing_extensions[ext] = name + + for extension in extensions: + if extension in existing_extensions: + msg = '%s is already registered for "%s"' + raise RegistryError(msg % (extension, + existing_extensions[extension])) + + if not callable(function): + raise TypeError('The registered function must be a callable') + + +def register_unpack_format(name, extensions, function, extra_args=None, + description=''): + """Registers an unpack format. + + `name` is the name of the format. `extensions` is a list of extensions + corresponding to the format. + + `function` is the callable that will be + used to unpack archives. The callable will receive archives to unpack. + If it's unable to handle an archive, it needs to raise a ReadError + exception. + + If provided, `extra_args` is a sequence of + (name, value) tuples that will be passed as arguments to the callable. + description can be provided to describe the format, and will be returned + by the get_unpack_formats() function. 
+ """ + if extra_args is None: + extra_args = [] + _check_unpack_options(extensions, function, extra_args) + _UNPACK_FORMATS[name] = extensions, function, extra_args, description + +def unregister_unpack_format(name): + """Removes the pack format from the registery.""" + del _UNPACK_FORMATS[name] + +def _ensure_directory(path): + """Ensure that the parent directory of `path` exists""" + dirname = os.path.dirname(path) + if not os.path.isdir(dirname): + os.makedirs(dirname) + +def _unpack_zipfile(filename, extract_dir): + """Unpack zip `filename` to `extract_dir` + """ + try: + import zipfile + except ImportError: + raise ReadError('zlib not supported, cannot unpack this archive.') + + if not zipfile.is_zipfile(filename): + raise ReadError("%s is not a zip file" % filename) + + zip = zipfile.ZipFile(filename) + try: + for info in zip.infolist(): + name = info.filename + + # don't extract absolute paths or ones with .. in them + if name.startswith('/') or '..' in name: + continue + + target = os.path.join(extract_dir, *name.split('/')) + if not target: + continue + + _ensure_directory(target) + if not name.endswith('/'): + # file + data = zip.read(info.filename) + f = open(target, 'wb') + try: + f.write(data) + finally: + f.close() + del data + finally: + zip.close() + +def _unpack_tarfile(filename, extract_dir): + """Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir` + """ + from distutils2._backport import tarfile + try: + tarobj = tarfile.open(filename) + except tarfile.TarError: + raise ReadError( + "%s is not a compressed or uncompressed tar file" % filename) + try: + tarobj.extractall(extract_dir) + finally: + tarobj.close() + +_UNPACK_FORMATS = { + 'gztar': (['.tar.gz', '.tgz'], _unpack_tarfile, [], "gzip'ed tar-file"), + 'tar': (['.tar'], _unpack_tarfile, [], "uncompressed tar file"), + 'zip': (['.zip'], _unpack_zipfile, [], "ZIP file") + } + +if _BZ2_SUPPORTED: + _UNPACK_FORMATS['bztar'] = (['.bz2'], _unpack_tarfile, [], + "bzip2'ed tar-file") + +def _find_unpack_format(filename): + for name, info in _UNPACK_FORMATS.iteritems(): + for extension in info[0]: + if filename.endswith(extension): + return name + return None + +def unpack_archive(filename, extract_dir=None, format=None): + """Unpack an archive. + + `filename` is the name of the archive. + + `extract_dir` is the name of the target directory, where the archive + is unpacked. If not provided, the current working directory is used. + + `format` is the archive format: one of "zip", "tar", or "gztar". Or any + other registered format. If not provided, unpack_archive will use the + filename extension and see if an unpacker was registered for that + extension. + + In case none is found, a ValueError is raised. 
+ """ + if extract_dir is None: + extract_dir = os.getcwd() + + if format is not None: + try: + format_info = _UNPACK_FORMATS[format] + except KeyError: + raise ValueError("Unknown unpack format '{0}'".format(format)) + + func = format_info[0] + func(filename, extract_dir, **dict(format_info[1])) + else: + # we need to look at the registered unpackers supported extensions + format = _find_unpack_format(filename) + if format is None: + raise ReadError("Unknown archive format '{0}'".format(filename)) + + func = _UNPACK_FORMATS[format][1] + kwargs = dict(_UNPACK_FORMATS[format][2]) + raise ValueError('Unknown archive format: %s' % filename) diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -12,10 +12,15 @@ except ImportError: from distutils2._backport.hashlib import md5 -from test.test_support import TESTFN +from distutils2.errors import DistutilsError +from distutils2.metadata import DistributionMetadata +from distutils2.tests import unittest, run_unittest, support -from distutils2.tests import unittest, run_unittest, support from distutils2._backport import pkgutil +from distutils2._backport.pkgutil import ( + Distribution, EggInfoDistribution, get_distribution, get_distributions, + provides_distribution, obsoletes_distribution, get_file_users, + distinfo_dirname, _yield_distributions) try: from os.path import relpath @@ -108,6 +113,12 @@ self.assertEqual(res1, RESOURCE_DATA) res2 = pkgutil.get_data(pkg, 'sub/res.txt') self.assertEqual(res2, RESOURCE_DATA) + + names = [] + for loader, name, ispkg in pkgutil.iter_modules([zip_file]): + names.append(name) + self.assertEqual(names, ['test_getdata_zipfile']) + del sys.path[0] del sys.modules[pkg] @@ -205,7 +216,7 @@ record_writer.writerow(record_pieces( os.path.join(distinfo_dir, file))) record_writer.writerow([relpath(record_file, sys.prefix)]) - del record_writer # causes the RECORD file to close + del record_writer # causes the RECORD file to close record_reader = csv.reader(open(record_file, 'rb')) record_data = [] for row in record_reader: @@ -225,9 +236,6 @@ def test_instantiation(self): # Test the Distribution class's instantiation provides us with usable # attributes. - # Import the Distribution class - from distutils2._backport.pkgutil import distinfo_dirname, Distribution - here = os.path.abspath(os.path.dirname(__file__)) name = 'choxie' version = '2.0.0.9' @@ -236,7 +244,6 @@ dist = Distribution(dist_path) self.assertEqual(dist.name, name) - from distutils2.metadata import DistributionMetadata self.assertTrue(isinstance(dist.metadata, DistributionMetadata)) self.assertEqual(dist.metadata['version'], version) self.assertTrue(isinstance(dist.requested, type(bool()))) @@ -244,7 +251,6 @@ def test_installed_files(self): # Test the iteration of installed files. # Test the distribution's installed files - from distutils2._backport.pkgutil import Distribution for distinfo_dir in self.distinfo_dirs: dist = Distribution(distinfo_dir) for path, md5_, size in dist.get_installed_files(): @@ -267,14 +273,12 @@ false_path = relpath(os.path.join(*false_path), sys.prefix) # Test if the distribution uses the file in question - from distutils2._backport.pkgutil import Distribution dist = Distribution(distinfo_dir) self.assertTrue(dist.uses(true_path)) self.assertFalse(dist.uses(false_path)) def test_get_distinfo_file(self): # Test the retrieval of dist-info file objects. 
- from distutils2._backport.pkgutil import Distribution distinfo_name = 'choxie-2.0.0.9' other_distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, @@ -295,7 +299,6 @@ # Is it the correct file? self.assertEqual(value.name, os.path.join(distinfo_dir, distfile)) - from distutils2.errors import DistutilsError # Test an absolute path that is part of another distributions dist-info other_distinfo_file = os.path.join(self.fake_dists_path, other_distinfo_name + '.dist-info', 'REQUESTED') @@ -307,7 +310,6 @@ def test_get_distinfo_files(self): # Test for the iteration of RECORD path entries. - from distutils2._backport.pkgutil import Distribution distinfo_name = 'towel_stuff-0.1' distinfo_dir = os.path.join(self.fake_dists_path, distinfo_name + '.dist-info') @@ -345,7 +347,7 @@ # Given a name and a version, we expect the distinfo_dirname function # to return a standard distribution information directory name. - items = [# (name, version, standard_dirname) + items = [ # (name, version, standard_dirname) # Test for a very simple single word name and decimal # version number ('docutils', '0.5', 'docutils-0.5.dist-info'), @@ -356,9 +358,6 @@ ('python-ldap', '2.5 a---5', 'python_ldap-2.5 a---5.dist-info'), ] - # Import the function in question - from distutils2._backport.pkgutil import distinfo_dirname - # Loop through the items to validate the results for name, version, standard_dirname in items: dirname = distinfo_dirname(name, version) @@ -371,11 +370,6 @@ ('towel-stuff', '0.1')] found_dists = [] - # Import the function in question - from distutils2._backport.pkgutil import get_distributions, \ - Distribution, \ - EggInfoDistribution - # Verify the fake dists have been found. dists = [dist for dist in get_distributions()] for dist in dists: @@ -416,12 +410,7 @@ def test_get_distribution(self): # Test for looking up a distribution by name. # Test the lookup of the towel-stuff distribution - name = 'towel-stuff' # Note: This is different from the directory name - - # Import the function in question - from distutils2._backport.pkgutil import get_distribution, \ - Distribution, \ - EggInfoDistribution + name = 'towel-stuff' # Note: This is different from the directory name # Lookup the distribution dist = get_distribution(name) @@ -461,7 +450,6 @@ def test_get_file_users(self): # Test the iteration of distributions that use a file. 
- from distutils2._backport.pkgutil import get_file_users, Distribution name = 'towel_stuff-0.1' path = os.path.join(self.fake_dists_path, name, 'towel_stuff', '__init__.py') @@ -471,9 +459,6 @@ def test_provides(self): # Test for looking up distributions by what they provide - from distutils2._backport.pkgutil import provides_distribution - from distutils2.errors import DistutilsError - checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) l = [dist.name for dist in provides_distribution('truffles')] @@ -522,12 +507,10 @@ use_egg_info=True)] checkLists(l, ['strawberry']) - l = [dist.name for dist in provides_distribution('strawberry', '>0.6', use_egg_info=True)] checkLists(l, []) - l = [dist.name for dist in provides_distribution('banana', '0.4', use_egg_info=True)] checkLists(l, ['banana']) @@ -536,16 +519,12 @@ use_egg_info=True)] checkLists(l, ['banana']) - l = [dist.name for dist in provides_distribution('banana', '!=0.4', use_egg_info=True)] checkLists(l, []) def test_obsoletes(self): # Test looking for distributions based on what they obsolete - from distutils2._backport.pkgutil import obsoletes_distribution - from distutils2.errors import DistutilsError - checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) l = [dist.name for dist in obsoletes_distribution('truffles', '1.0')] @@ -555,7 +534,6 @@ use_egg_info=True)] checkLists(l, ['cheese', 'bacon']) - l = [dist.name for dist in obsoletes_distribution('truffles', '0.8')] checkLists(l, ['choxie']) @@ -575,7 +553,6 @@ def test_yield_distribution(self): # tests the internal function _yield_distributions - from distutils2._backport.pkgutil import _yield_distributions checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), diff --git a/distutils2/_backport/tests/test_shutil.py b/distutils2/_backport/tests/test_shutil.py new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/test_shutil.py @@ -0,0 +1,945 @@ +import os +import sys +import tempfile +import stat +import tarfile +from os.path import splitdrive +from StringIO import StringIO + +from distutils.spawn import find_executable, spawn +from distutils2._backport import shutil +from distutils2._backport.shutil import ( + _make_tarball, _make_zipfile, make_archive, unpack_archive, + register_archive_format, unregister_archive_format, get_archive_formats, + register_unpack_format, unregister_unpack_format, get_unpack_formats, + Error, RegistryError) + +from distutils2.tests import unittest, support, TESTFN + +try: + import bz2 + BZ2_SUPPORTED = True +except ImportError: + BZ2_SUPPORTED = False + +TESTFN2 = TESTFN + "2" + +try: + import grp + import pwd + UID_GID_SUPPORT = True +except ImportError: + UID_GID_SUPPORT = False + +try: + import zlib +except ImportError: + zlib = None + +try: + import zipfile + ZIP_SUPPORT = True +except ImportError: + ZIP_SUPPORT = find_executable('zip') + +class TestShutil(unittest.TestCase): + + def setUp(self): + super(TestShutil, self).setUp() + self.tempdirs = [] + + def tearDown(self): + super(TestShutil, self).tearDown() + while self.tempdirs: + d = self.tempdirs.pop() + shutil.rmtree(d, os.name in ('nt', 'cygwin')) + + def write_file(self, path, content='xxx'): + """Writes a file in the given path. + + + path can be a string or a sequence. 
+ """ + if isinstance(path, (list, tuple)): + path = os.path.join(*path) + f = open(path, 'w') + try: + f.write(content) + finally: + f.close() + + def mkdtemp(self): + """Create a temporary directory that will be cleaned up. + + Returns the path of the directory. + """ + d = tempfile.mkdtemp() + self.tempdirs.append(d) + return d + + def test_rmtree_errors(self): + # filename is guaranteed not to exist + filename = tempfile.mktemp() + self.assertRaises(OSError, shutil.rmtree, filename) + + # See bug #1071513 for why we don't run this on cygwin + # and bug #1076467 for why we don't run this as root. + if (hasattr(os, 'chmod') and sys.platform[:6] != 'cygwin' + and not (hasattr(os, 'geteuid') and os.geteuid() == 0)): + def test_on_error(self): + self.errorState = 0 + os.mkdir(TESTFN) + self.childpath = os.path.join(TESTFN, 'a') + f = open(self.childpath, 'w') + f.close() + old_dir_mode = os.stat(TESTFN).st_mode + old_child_mode = os.stat(self.childpath).st_mode + # Make unwritable. + os.chmod(self.childpath, stat.S_IREAD) + os.chmod(TESTFN, stat.S_IREAD) + + shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) + # Test whether onerror has actually been called. + self.assertEqual(self.errorState, 2, + "Expected call to onerror function did not happen.") + + # Make writable again. + os.chmod(TESTFN, old_dir_mode) + os.chmod(self.childpath, old_child_mode) + + # Clean up. + shutil.rmtree(TESTFN) + + def check_args_to_onerror(self, func, arg, exc): + # test_rmtree_errors deliberately runs rmtree + # on a directory that is chmod 400, which will fail. + # This function is run when shutil.rmtree fails. + # 99.9% of the time it initially fails to remove + # a file in the directory, so the first time through + # func is os.remove. + # However, some Linux machines running ZFS on + # FUSE experienced a failure earlier in the process + # at os.listdir. The first failure may legally + # be either. + if self.errorState == 0: + if func is os.remove: + self.assertEqual(arg, self.childpath) + else: + self.assertIs(func, os.listdir, + "func must be either os.remove or os.listdir") + self.assertEqual(arg, TESTFN) + self.assertTrue(issubclass(exc[0], OSError)) + self.errorState = 1 + else: + self.assertEqual(func, os.rmdir) + self.assertEqual(arg, TESTFN) + self.assertTrue(issubclass(exc[0], OSError)) + self.errorState = 2 + + def test_rmtree_dont_delete_file(self): + # When called on a file instead of a directory, don't delete it. 
+ handle, path = tempfile.mkstemp() + os.fdopen(handle).close() + self.assertRaises(OSError, shutil.rmtree, path) + os.remove(path) + + def _write_data(self, path, data): + f = open(path, "w") + f.write(data) + f.close() + + def test_copytree_simple(self): + + def read_data(path): + f = open(path) + data = f.read() + f.close() + return data + + src_dir = tempfile.mkdtemp() + dst_dir = os.path.join(tempfile.mkdtemp(), 'destination') + self._write_data(os.path.join(src_dir, 'test.txt'), '123') + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + + try: + shutil.copytree(src_dir, dst_dir) + self.assertTrue(os.path.isfile(os.path.join(dst_dir, 'test.txt'))) + self.assertTrue(os.path.isdir(os.path.join(dst_dir, 'test_dir'))) + self.assertTrue(os.path.isfile(os.path.join(dst_dir, 'test_dir', + 'test.txt'))) + actual = read_data(os.path.join(dst_dir, 'test.txt')) + self.assertEqual(actual, '123') + actual = read_data(os.path.join(dst_dir, 'test_dir', 'test.txt')) + self.assertEqual(actual, '456') + finally: + for path in ( + os.path.join(src_dir, 'test.txt'), + os.path.join(dst_dir, 'test.txt'), + os.path.join(src_dir, 'test_dir', 'test.txt'), + os.path.join(dst_dir, 'test_dir', 'test.txt'), + ): + if os.path.exists(path): + os.remove(path) + for path in (src_dir, + os.path.dirname(dst_dir) + ): + if os.path.exists(path): + shutil.rmtree(path) + + def test_copytree_with_exclude(self): + + def read_data(path): + f = open(path) + data = f.read() + f.close() + return data + + # creating data + join = os.path.join + exists = os.path.exists + src_dir = tempfile.mkdtemp() + try: + dst_dir = join(tempfile.mkdtemp(), 'destination') + self._write_data(join(src_dir, 'test.txt'), '123') + self._write_data(join(src_dir, 'test.tmp'), '123') + os.mkdir(join(src_dir, 'test_dir')) + self._write_data(join(src_dir, 'test_dir', 'test.txt'), '456') + os.mkdir(join(src_dir, 'test_dir2')) + self._write_data(join(src_dir, 'test_dir2', 'test.txt'), '456') + os.mkdir(join(src_dir, 'test_dir2', 'subdir')) + os.mkdir(join(src_dir, 'test_dir2', 'subdir2')) + self._write_data(join(src_dir, 'test_dir2', 'subdir', 'test.txt'), + '456') + self._write_data(join(src_dir, 'test_dir2', 'subdir2', 'test.py'), + '456') + + + # testing glob-like patterns + try: + patterns = shutil.ignore_patterns('*.tmp', 'test_dir2') + shutil.copytree(src_dir, dst_dir, ignore=patterns) + # checking the result: some elements should not be copied + self.assertTrue(exists(join(dst_dir, 'test.txt'))) + self.assertTrue(not exists(join(dst_dir, 'test.tmp'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2'))) + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + try: + patterns = shutil.ignore_patterns('*.tmp', 'subdir*') + shutil.copytree(src_dir, dst_dir, ignore=patterns) + # checking the result: some elements should not be copied + self.assertTrue(not exists(join(dst_dir, 'test.tmp'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir2'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir'))) + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + + # testing callable-style + try: + def _filter(src, names): + res = [] + for name in names: + path = os.path.join(src, name) + + if (os.path.isdir(path) and + path.split()[-1] == 'subdir'): + res.append(name) + elif os.path.splitext(path)[-1] in ('.py'): + res.append(name) + return res + + shutil.copytree(src_dir, dst_dir, ignore=_filter) + + # checking the result: some elements 
should not be copied + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir2', + 'test.py'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir'))) + + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + finally: + shutil.rmtree(src_dir) + shutil.rmtree(os.path.dirname(dst_dir)) + + @support.skip_unless_symlink + def test_dont_copy_file_onto_link_to_itself(self): + # bug 851123. + os.mkdir(TESTFN) + src = os.path.join(TESTFN, 'cheese') + dst = os.path.join(TESTFN, 'shop') + try: + f = open(src, 'w') + f.write('cheddar') + f.close() + + if hasattr(os, "link"): + os.link(src, dst) + self.assertRaises(shutil.Error, shutil.copyfile, src, dst) + f = open(src, 'r') + try: + self.assertEqual(f.read(), 'cheddar') + finally: + f.close() + os.remove(dst) + + # Using `src` here would mean we end up with a symlink pointing + # to TESTFN/TESTFN/cheese, while it should point at + # TESTFN/cheese. + os.symlink('cheese', dst) + self.assertRaises(shutil.Error, shutil.copyfile, src, dst) + f = open(src, 'r') + try: + self.assertEqual(f.read(), 'cheddar') + finally: + f.close() + os.remove(dst) + finally: + try: + shutil.rmtree(TESTFN) + except OSError: + pass + + @support.skip_unless_symlink + def test_rmtree_on_symlink(self): + # bug 1669. + os.mkdir(TESTFN) + try: + src = os.path.join(TESTFN, 'cheese') + dst = os.path.join(TESTFN, 'shop') + os.mkdir(src) + os.symlink(src, dst) + self.assertRaises(OSError, shutil.rmtree, dst) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + if hasattr(os, "mkfifo"): + # Issue #3002: copyfile and copytree block indefinitely on named pipes + def test_copyfile_named_pipe(self): + os.mkfifo(TESTFN) + try: + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, TESTFN, TESTFN2) + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, __file__, TESTFN) + finally: + os.remove(TESTFN) + + @unittest.skipUnless(hasattr(os, 'mkfifo'), 'requires os.mkfifo') + def test_copytree_named_pipe(self): + os.mkdir(TESTFN) + try: + subdir = os.path.join(TESTFN, "subdir") + os.mkdir(subdir) + pipe = os.path.join(subdir, "mypipe") + os.mkfifo(pipe) + try: + shutil.copytree(TESTFN, TESTFN2) + except shutil.Error, e: + errors = e.args[0] + self.assertEqual(len(errors), 1) + src, dst, error_msg = errors[0] + self.assertEqual("`%s` is a named pipe" % pipe, error_msg) + else: + self.fail("shutil.Error should have been raised") + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + shutil.rmtree(TESTFN2, ignore_errors=True) + + def test_copytree_special_func(self): + + src_dir = self.mkdtemp() + dst_dir = os.path.join(self.mkdtemp(), 'destination') + self._write_data(os.path.join(src_dir, 'test.txt'), '123') + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + + copied = [] + def _copy(src, dst): + copied.append((src, dst)) + + shutil.copytree(src_dir, dst_dir, copy_function=_copy) + self.assertEquals(len(copied), 2) + + @support.skip_unless_symlink + def test_copytree_dangling_symlinks(self): + + # a dangling symlink raises an error at the end + src_dir = self.mkdtemp() + dst_dir = os.path.join(self.mkdtemp(), 'destination') + os.symlink('IDONTEXIST', os.path.join(src_dir, 'test.txt')) + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + self.assertRaises(Error, shutil.copytree, src_dir, dst_dir) + + # a dangling symlink is ignored with the proper flag + dst_dir = os.path.join(self.mkdtemp(), 
'destination2') + shutil.copytree(src_dir, dst_dir, ignore_dangling_symlinks=True) + self.assertNotIn('test.txt', os.listdir(dst_dir)) + + # a dangling symlink is copied if symlinks=True + dst_dir = os.path.join(self.mkdtemp(), 'destination3') + shutil.copytree(src_dir, dst_dir, symlinks=True) + self.assertIn('test.txt', os.listdir(dst_dir)) + + @unittest.skipUnless(zlib, "requires zlib") + def test_make_tarball(self): + # creating something to tar + tmpdir = self.mkdtemp() + self.write_file([tmpdir, 'file1'], 'xxx') + self.write_file([tmpdir, 'file2'], 'xxx') + os.mkdir(os.path.join(tmpdir, 'sub')) + self.write_file([tmpdir, 'sub', 'file3'], 'xxx') + + tmpdir2 = self.mkdtemp() + unittest.skipUnless(splitdrive(tmpdir)[0] == splitdrive(tmpdir2)[0], + "source and target should be on same drive") + + base_name = os.path.join(tmpdir2, 'archive') + + # working with relative paths to avoid tar warnings + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(splitdrive(base_name)[1], '.') + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + tarball = base_name + '.tar.gz' + self.assertTrue(os.path.exists(tarball)) + + # trying an uncompressed one + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(splitdrive(base_name)[1], '.', compress=None) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + def _tarinfo(self, path): + tar = tarfile.open(path) + try: + names = tar.getnames() + names.sort() + return tuple(names) + finally: + tar.close() + + def _create_files(self): + # creating something to tar + tmpdir = self.mkdtemp() + dist = os.path.join(tmpdir, 'dist') + os.mkdir(dist) + self.write_file([dist, 'file1'], 'xxx') + self.write_file([dist, 'file2'], 'xxx') + os.mkdir(os.path.join(dist, 'sub')) + self.write_file([dist, 'sub', 'file3'], 'xxx') + os.mkdir(os.path.join(dist, 'sub2')) + tmpdir2 = self.mkdtemp() + base_name = os.path.join(tmpdir2, 'archive') + return tmpdir, tmpdir2, base_name + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(find_executable('tar') and find_executable('gzip'), + 'Need the tar command to run') + def test_tarfile_vs_tar(self): + tmpdir, tmpdir2, base_name = self._create_files() + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist') + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + tarball = base_name + '.tar.gz' + self.assertTrue(os.path.exists(tarball)) + + # now create another tarball using `tar` + tarball2 = os.path.join(tmpdir, 'archive2.tar.gz') + tar_cmd = ['tar', '-cf', 'archive2.tar', 'dist'] + gzip_cmd = ['gzip', '-f9', 'archive2.tar'] + old_dir = os.getcwd() + old_stdout = sys.stdout + os.chdir(tmpdir) + sys.stdout = StringIO() + + try: + spawn(tar_cmd) + spawn(gzip_cmd) + finally: + os.chdir(old_dir) + sys.stdout = old_stdout + + self.assertTrue(os.path.exists(tarball2)) + # let's compare both tarballs + self.assertEquals(self._tarinfo(tarball), self._tarinfo(tarball2)) + + # trying an uncompressed one + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist', compress=None) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + # now for a dry_run + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist', compress=None, 
dry_run=True) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(ZIP_SUPPORT, 'Need zip support to run') + def test_make_zipfile(self): + # creating something to tar + tmpdir = self.mkdtemp() + self.write_file([tmpdir, 'file1'], 'xxx') + self.write_file([tmpdir, 'file2'], 'xxx') + + tmpdir2 = self.mkdtemp() + base_name = os.path.join(tmpdir2, 'archive') + _make_zipfile(base_name, tmpdir) + + # check if the compressed tarball was created + tarball = base_name + '.zip' + self.assertTrue(os.path.exists(tarball)) + + + def test_make_archive(self): + tmpdir = self.mkdtemp() + base_name = os.path.join(tmpdir, 'archive') + self.assertRaises(ValueError, make_archive, base_name, 'xxx') + + @unittest.skipUnless(zlib, "Requires zlib") + def test_make_archive_owner_group(self): + # testing make_archive with owner and group, with various combinations + # this works even if there's not gid/uid support + if UID_GID_SUPPORT: + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + else: + group = owner = 'root' + + base_dir, root_dir, base_name = self._create_files() + base_name = os.path.join(self.mkdtemp() , 'archive') + res = make_archive(base_name, 'zip', root_dir, base_dir, owner=owner, + group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'zip', root_dir, base_dir) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner=owner, group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner='kjhkjhkjg', group='oihohoh') + self.assertTrue(os.path.exists(res)) + + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(UID_GID_SUPPORT, "Requires grp and pwd support") + def test_tarfile_root_owner(self): + tmpdir, tmpdir2, base_name = self._create_files() + old_dir = os.getcwd() + os.chdir(tmpdir) + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + try: + archive_name = _make_tarball(base_name, 'dist', compress=None, + owner=owner, group=group) + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + self.assertTrue(os.path.exists(archive_name)) + + # now checks the rights + archive = tarfile.open(archive_name) + try: + for member in archive.getmembers(): + self.assertEquals(member.uid, 0) + self.assertEquals(member.gid, 0) + finally: + archive.close() + + def test_make_archive_cwd(self): + current_dir = os.getcwd() + def _breaks(*args, **kw): + raise RuntimeError() + + register_archive_format('xxx', _breaks, [], 'xxx file') + try: + try: + make_archive('xxx', 'xxx', root_dir=self.mkdtemp()) + except Exception: + pass + self.assertEquals(os.getcwd(), current_dir) + finally: + unregister_archive_format('xxx') + + def test_register_archive_format(self): + + self.assertRaises(TypeError, register_archive_format, 'xxx', 1) + self.assertRaises(TypeError, register_archive_format, 'xxx', lambda: x, + 1) + self.assertRaises(TypeError, register_archive_format, 'xxx', lambda: x, + [(1, 2), (1, 2, 3)]) + + register_archive_format('xxx', lambda: x, [(1, 2)], 'xxx file') + formats = [name for name, params in get_archive_formats()] + self.assertIn('xxx', formats) + + unregister_archive_format('xxx') + formats = [name for name, params in get_archive_formats()] + self.assertNotIn('xxx', formats) + + def _compare_dirs(self, dir1, dir2): + # check that dir1 and dir2 are equivalent, + # return the diff + diff = 
[] + for root, dirs, files in os.walk(dir1): + for file_ in files: + path = os.path.join(root, file_) + target_path = os.path.join(dir2, os.path.split(path)[-1]) + if not os.path.exists(target_path): + diff.append(file_) + return diff + + @unittest.skipUnless(zlib, "Requires zlib") + def test_unpack_archive(self): + formats = ['tar', 'gztar', 'zip'] + if BZ2_SUPPORTED: + formats.append('bztar') + + for format in formats: + tmpdir = self.mkdtemp() + base_dir, root_dir, base_name = self._create_files() + tmpdir2 = self.mkdtemp() + filename = make_archive(base_name, format, root_dir, base_dir) + + # let's try to unpack it now + unpack_archive(filename, tmpdir2) + diff = self._compare_dirs(tmpdir, tmpdir2) + self.assertEquals(diff, []) + + def test_unpack_registery(self): + + formats = get_unpack_formats() + + def _boo(filename, extract_dir, extra): + self.assertEquals(extra, 1) + self.assertEquals(filename, 'stuff.boo') + self.assertEquals(extract_dir, 'xx') + + register_unpack_format('Boo', ['.boo', '.b2'], _boo, [('extra', 1)]) + unpack_archive('stuff.boo', 'xx') + + # trying to register a .boo unpacker again + self.assertRaises(RegistryError, register_unpack_format, 'Boo2', + ['.boo'], _boo) + + # should work now + unregister_unpack_format('Boo') + register_unpack_format('Boo2', ['.boo'], _boo) + self.assertIn(('Boo2', ['.boo'], ''), get_unpack_formats()) + self.assertNotIn(('Boo', ['.boo'], ''), get_unpack_formats()) + + # let's leave a clean state + unregister_unpack_format('Boo2') + self.assertEquals(get_unpack_formats(), formats) + + +class TestMove(unittest.TestCase): + + def setUp(self): + filename = "foo" + self.src_dir = tempfile.mkdtemp() + self.dst_dir = tempfile.mkdtemp() + self.src_file = os.path.join(self.src_dir, filename) + self.dst_file = os.path.join(self.dst_dir, filename) + # Try to create a dir in the current directory, hoping that it is + # not located on the same filesystem as the system tmp dir. + try: + self.dir_other_fs = tempfile.mkdtemp( + dir=os.path.dirname(__file__)) + self.file_other_fs = os.path.join(self.dir_other_fs, + filename) + except OSError: + self.dir_other_fs = None + f = open(self.src_file, "wb") + try: + f.write("spam") + finally: + f.close() + + def tearDown(self): + for d in (self.src_dir, self.dst_dir, self.dir_other_fs): + try: + if d: + shutil.rmtree(d) + except: + pass + + def _check_move_file(self, src, dst, real_dst): + f = open(src, "rb") + try: + contents = f.read() + finally: + f.close() + + shutil.move(src, dst) + f = open(real_dst, "rb") + try: + self.assertEqual(contents, f.read()) + finally: + f.close() + + self.assertFalse(os.path.exists(src)) + + def _check_move_dir(self, src, dst, real_dst): + contents = sorted(os.listdir(src)) + shutil.move(src, dst) + self.assertEqual(contents, sorted(os.listdir(real_dst))) + self.assertFalse(os.path.exists(src)) + + def test_move_file(self): + # Move a file to another location on the same filesystem. + self._check_move_file(self.src_file, self.dst_file, self.dst_file) + + def test_move_file_to_dir(self): + # Move a file inside an existing dir on the same filesystem. + self._check_move_file(self.src_file, self.dst_dir, self.dst_file) + + def test_move_file_other_fs(self): + # Move a file to an existing dir on another filesystem. + if not self.dir_other_fs: + # skip + return + self._check_move_file(self.src_file, self.file_other_fs, + self.file_other_fs) + + def test_move_file_to_dir_other_fs(self): + # Move a file to another location on another filesystem. 
+ if not self.dir_other_fs: + # skip + return + self._check_move_file(self.src_file, self.dir_other_fs, + self.file_other_fs) + + def test_move_dir(self): + # Move a dir to another location on the same filesystem. + dst_dir = tempfile.mktemp() + try: + self._check_move_dir(self.src_dir, dst_dir, dst_dir) + finally: + try: + shutil.rmtree(dst_dir) + except: + pass + + def test_move_dir_other_fs(self): + # Move a dir to another location on another filesystem. + if not self.dir_other_fs: + # skip + return + dst_dir = tempfile.mktemp(dir=self.dir_other_fs) + try: + self._check_move_dir(self.src_dir, dst_dir, dst_dir) + finally: + try: + shutil.rmtree(dst_dir) + except: + pass + + def test_move_dir_to_dir(self): + # Move a dir inside an existing dir on the same filesystem. + self._check_move_dir(self.src_dir, self.dst_dir, + os.path.join(self.dst_dir, os.path.basename(self.src_dir))) + + def test_move_dir_to_dir_other_fs(self): + # Move a dir inside an existing dir on another filesystem. + if not self.dir_other_fs: + # skip + return + self._check_move_dir(self.src_dir, self.dir_other_fs, + os.path.join(self.dir_other_fs, os.path.basename(self.src_dir))) + + def test_existing_file_inside_dest_dir(self): + # A file with the same name inside the destination dir already exists. + f = open(self.dst_file, "wb") + try: + pass + finally: + f.close() + self.assertRaises(shutil.Error, shutil.move, self.src_file, self.dst_dir) + + def test_dont_move_dir_in_itself(self): + # Moving a dir inside itself raises an Error. + dst = os.path.join(self.src_dir, "bar") + self.assertRaises(shutil.Error, shutil.move, self.src_dir, dst) + + def test_destinsrc_false_negative(self): + os.mkdir(TESTFN) + try: + for src, dst in [('srcdir', 'srcdir/dest')]: + src = os.path.join(TESTFN, src) + dst = os.path.join(TESTFN, dst) + self.assertTrue(shutil._destinsrc(src, dst), + msg='_destinsrc() wrongly concluded that ' + 'dst (%s) is not in src (%s)' % (dst, src)) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + def test_destinsrc_false_positive(self): + os.mkdir(TESTFN) + try: + for src, dst in [('srcdir', 'src/dest'), ('srcdir', 'srcdir.new')]: + src = os.path.join(TESTFN, src) + dst = os.path.join(TESTFN, dst) + self.assertFalse(shutil._destinsrc(src, dst), + msg='_destinsrc() wrongly concluded that ' + 'dst (%s) is in src (%s)' % (dst, src)) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + +class TestCopyFile(unittest.TestCase): + + _delete = False + + class Faux(object): + _entered = False + _exited_with = None + _raised = False + + def __init__(self, raise_in_exit=False, suppress_at_exit=True): + self._raise_in_exit = raise_in_exit + self._suppress_at_exit = suppress_at_exit + + def read(self, *args): + return '' + + def __enter__(self): + self._entered = True + + def __exit__(self, exc_type, exc_val, exc_tb): + self._exited_with = exc_type, exc_val, exc_tb + if self._raise_in_exit: + self._raised = True + raise IOError("Cannot close") + return self._suppress_at_exit + + def tearDown(self): + if self._delete: + del shutil.open + + def _set_shutil_open(self, func): + shutil.open = func + self._delete = True + + def test_w_source_open_fails(self): + def _open(filename, mode='r'): + if filename == 'srcfile': + raise IOError('Cannot open "srcfile"') + assert 0 # shouldn't reach here. 
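(Aside, not part of the committed patch.) The TestMove cases above pin down the behaviour the backported shutil.move has to keep: a move into an existing directory lands inside it, cross-filesystem moves fall back to copy-and-delete, and moving a directory into itself raises shutil.Error. A short illustration against the stdlib shutil, using throwaway temp paths:

    import os
    import shutil
    import tempfile

    src = tempfile.mkdtemp()
    dst = tempfile.mkdtemp()
    open(os.path.join(src, 'foo'), 'w').close()

    shutil.move(os.path.join(src, 'foo'), dst)      # ends up as dst/foo
    print(os.listdir(dst))                          # ['foo']

    try:
        shutil.move(src, os.path.join(src, 'sub'))  # dir into itself
    except shutil.Error:
        print('refused, as test_dont_move_dir_in_itself expects')

    shutil.rmtree(src)
    shutil.rmtree(dst)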
+ + self._set_shutil_open(_open) + + self.assertRaises(IOError, shutil.copyfile, 'srcfile', 'destfile') + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_dest_open_fails(self): + + srcfile = self.Faux() + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + raise IOError('Cannot open "destfile"') + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + shutil.copyfile('srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(srcfile._exited_with[0] is IOError) + self.assertEqual(srcfile._exited_with[1].args, + ('Cannot open "destfile"',)) + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_dest_close_fails(self): + + srcfile = self.Faux() + destfile = self.Faux(True) + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + return destfile + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + shutil.copyfile('srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(destfile._entered) + self.assertTrue(destfile._raised) + self.assertTrue(srcfile._exited_with[0] is IOError) + self.assertEqual(srcfile._exited_with[1].args, + ('Cannot close',)) + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_source_close_fails(self): + + srcfile = self.Faux(True) + destfile = self.Faux() + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + return destfile + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + self.assertRaises(IOError, + shutil.copyfile, 'srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(destfile._entered) + self.assertFalse(destfile._raised) + self.assertTrue(srcfile._exited_with[0] is None) + self.assertTrue(srcfile._raised) + + +def test_suite(): + suite = unittest.TestSuite() + load = unittest.defaultTestLoader.loadTestsFromTestCase + suite.addTest(load(TestCopyFile)) + suite.addTest(load(TestMove)) + suite.addTest(load(TestShutil)) + return suite + + +if __name__ == '__main__': + unittest.main(defaultTest='test_suite') diff --git a/distutils2/_backport/tests/test_sysconfig.py b/distutils2/_backport/tests/test_sysconfig.py --- a/distutils2/_backport/tests/test_sysconfig.py +++ b/distutils2/_backport/tests/test_sysconfig.py @@ -4,7 +4,7 @@ import sys import subprocess import shutil -from copy import copy, deepcopy +from copy import copy from ConfigParser import RawConfigParser from StringIO import StringIO @@ -15,13 +15,9 @@ get_scheme_names, _main, _SCHEMES) from distutils2.tests import unittest -from distutils2.tests.support import EnvironGuard +from distutils2.tests.support import EnvironGuard, skip_unless_symlink from test.test_support import TESTFN, unlink -try: - from test.test_support import skip_unless_symlink -except ImportError: - skip_unless_symlink = unittest.skip( - 'requires test.test_support.skip_unless_symlink') + class TestSysConfig(EnvironGuard, unittest.TestCase): diff --git a/distutils2/command/build_py.py b/distutils2/command/build_py.py --- a/distutils2/command/build_py.py +++ b/distutils2/command/build_py.py @@ -8,7 +8,6 @@ import logging from glob import glob -import distutils2 from distutils2.command.cmd import Command from distutils2.errors import DistutilsOptionError, DistutilsFileError from distutils2.util import convert_path @@ -66,10 +65,9 @@ self.packages = self.distribution.packages 
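(Aside, not part of the committed patch.) The TestCopyFile cases above rely on a small monkey-patching trick: copyfile looks up open as a module global, so binding shutil.open shadows the builtin for the duration of the test, and tearDown's del shutil.open restores the normal lookup. A hedged sketch of the same trick against the stdlib shutil (fake_open and the file names are made up):

    import shutil

    calls = []

    def fake_open(filename, mode='r'):
        calls.append((filename, mode))
        raise IOError('forced failure for the test')

    shutil.open = fake_open      # shadow the builtin open used by copyfile
    try:
        try:
            shutil.copyfile('srcfile', 'destfile')
        except IOError:
            pass
    finally:
        del shutil.open          # fall back to the real builtin again

    print(calls)                 # the patched open saw 'srcfile' first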
self.py_modules = self.distribution.py_modules self.package_data = self.distribution.package_data - self.package_dir = {} - if self.distribution.package_dir: - for name, path in self.distribution.package_dir.iteritems(): - self.package_dir[name] = convert_path(path) + self.package_dir = None + if self.distribution.package_dir is not None: + self.package_dir = convert_path(self.distribution.package_dir) self.data_files = self.get_data_files() # Ick, copied straight from install_lib.py (fancy_getopt needs a @@ -164,11 +162,13 @@ Helper function for `run()`. """ + # FIXME add tests for this method for package, src_dir, build_dir, filenames in self.data_files: for filename in filenames: target = os.path.join(build_dir, filename) + srcfile = os.path.join(src_dir, filename) self.mkpath(os.path.dirname(target)) - outf, copied = self.copy_file(os.path.join(src_dir, filename), + outf, copied = self.copy_file(srcfile, target, preserve_mode=False) if copied and srcfile in self.distribution.convert_2to3.doctests: self._doctests_2to3.append(outf) @@ -179,41 +179,14 @@ """Return the directory, relative to the top of the source distribution, where package 'package' should be found (at least according to the 'package_dir' option, if any).""" + path = package.split('.') + if self.package_dir is not None: + path.insert(0, self.package_dir) - path = package.split('.') + if len(path) > 0: + return os.path.join(*path) - if not self.package_dir: - if path: - return os.path.join(*path) - else: - return '' - else: - tail = [] - while path: - try: - pdir = self.package_dir['.'.join(path)] - except KeyError: - tail.insert(0, path[-1]) - del path[-1] - else: - tail.insert(0, pdir) - return os.path.join(*tail) - else: - # Oops, got all the way through 'path' without finding a - # match in package_dir. If package_dir defines a directory - # for the root (nameless) package, then fallback on it; - # otherwise, we might as well have not consulted - # package_dir at all, as we just use the directory implied - # by 'tail' (which should be the same as the original value - # of 'path' at this point). - pdir = self.package_dir.get('') - if pdir is not None: - tail.insert(0, pdir) - - if tail: - return os.path.join(*tail) - else: - return '' + return '' def check_package(self, package, package_dir): """Helper function for `find_package_modules()` and `find_modules()'. 
diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -165,7 +165,10 @@ header = "command options for '%s':" % self.get_command_name() self.announce(indent + header, level=logging.INFO) indent = indent + " " + negative_opt = getattr(self, 'negative_opt', ()) for (option, _, _) in self.user_options: + if option in negative_opt: + continue option = option.replace('-', '_') if option[-1] == "=": option = option[:-1] diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py --- a/distutils2/command/sdist.py +++ b/distutils2/command/sdist.py @@ -18,7 +18,8 @@ from distutils2.command import get_command_names from distutils2.command.cmd import Command from distutils2.errors import (DistutilsPlatformError, DistutilsOptionError, - DistutilsTemplateError, DistutilsModuleError) + DistutilsTemplateError, DistutilsModuleError, + DistutilsFileError) from distutils2.manifest import Manifest from distutils2 import logger from distutils2.util import convert_path, resolve_name @@ -291,6 +292,12 @@ logger.warn("no files to distribute -- empty manifest?") else: logger.info(msg) + + for file in self.distribution.metadata.requires_files: + if file not in files: + msg = "'%s' must be included explicitly in 'extra_files'" % file + raise DistutilsFileError(msg) + for file in files: if not os.path.isfile(file): logger.warn("'%s' not a regular file -- skipping" % file) diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -7,7 +7,8 @@ from ConfigParser import RawConfigParser from distutils2 import logger -from distutils2.util import check_environ, resolve_name +from distutils2.errors import DistutilsOptionError +from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command @@ -76,10 +77,9 @@ return value def _multiline(self, value): - if '\n' in value: - value = [v for v in - [v.strip() for v in value.split('\n')] - if v != ''] + value = [v for v in + [v.strip() for v in value.split('\n')] + if v != ''] return value def _read_setup_cfg(self, parser): @@ -100,7 +100,9 @@ if 'metadata' in content: for key, value in content['metadata'].iteritems(): key = key.replace('_', '-') - value = self._multiline(value) + if metadata.is_multi_field(key): + value = self._multiline(value) + if key == 'project-url': value = [(label.strip(), url.strip()) for label, url in @@ -112,30 +114,45 @@ "mutually exclusive") raise DistutilsOptionError(msg) - f = open(value) # will raise if file not found - try: - value = f.read() - finally: - f.close() + if isinstance(value, list): + filenames = value + else: + filenames = value.split() + + # concatenate each files + value = '' + for filename in filenames: + f = open(filename) # will raise if file not found + try: + value += f.read().strip() + '\n' + finally: + f.close() + # add filename as a required file + if filename not in metadata.requires_files: + metadata.requires_files.append(filename) + value = value.strip() key = 'description' if metadata.is_metadata_field(key): metadata[key] = self._convert_metadata(key, value) + if 'files' in content: - files = dict([(key, self._multiline(value)) + def _convert(key, value): + if key not in ('packages_root',): + value = self._multiline(value) + return value + + files = dict([(key, _convert(key, value)) for key, value in content['files'].iteritems()]) self.dist.packages = [] - self.dist.package_dir = {} + 
self.dist.package_dir = pkg_dir = files.get('packages_root') packages = files.get('packages', []) if isinstance(packages, str): packages = [packages] for package in packages: - if ':' in package: - dir_, package = package.split(':') - self.dist.package_dir[package] = dir_ self.dist.packages.append(package) self.dist.py_modules = files.get('modules', []) diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -17,19 +17,19 @@ import urllib import urlparse import zipfile - try: import hashlib except ImportError: from distutils2._backport import hashlib +from distutils2._backport.shutil import unpack_archive from distutils2.errors import IrrationalVersionError from distutils2.index.errors import (HashDoesNotMatch, UnsupportedHashName, CantParseArchiveName) from distutils2.version import (suggest_normalized_version, NormalizedVersion, get_version_predicate) from distutils2.metadata import DistributionMetadata -from distutils2.util import untar_file, unzip_file, splitext +from distutils2.util import splitext __all__ = ['ReleaseInfo', 'DistInfo', 'ReleasesList', 'get_infos_from_url'] @@ -206,6 +206,7 @@ __hash__ = object.__hash__ + class DistInfo(IndexReference): """Represents a distribution retrieved from an index (sdist, bdist, ...) """ @@ -313,17 +314,8 @@ filename = self.download() content_type = mimetypes.guess_type(filename)[0] + self._unpacked_dir = unpack_archive(filename) - if (content_type == 'application/zip' - or filename.endswith('.zip') - or filename.endswith('.pybundle') - or zipfile.is_zipfile(filename)): - unzip_file(filename, path, flatten=not filename.endswith('.pybundle')) - elif (content_type == 'application/x-gzip' - or tarfile.is_tarfile(filename) - or splitext(filename)[1].lower() in ('.tar', '.tar.gz', '.tar.bz2', '.tgz', '.tbz')): - untar_file(filename, path) - self._unpacked_dir = path return self._unpacked_dir def _check_md5(self, filename): diff --git a/distutils2/index/xmlrpc.py b/distutils2/index/xmlrpc.py --- a/distutils2/index/xmlrpc.py +++ b/distutils2/index/xmlrpc.py @@ -127,10 +127,17 @@ return release def get_metadata(self, project_name, version): - """Retreive project metadatas. + """Retrieve project metadata. Return a ReleaseInfo object, with metadata informations filled in. """ + # to be case-insensitive, get the informations from the XMLRPC API + projects = [d['name'] for d in + self.proxy.search({'name': project_name}) + if d['name'].lower() == project_name] + if len(projects) > 0: + project_name = projects[0] + metadata = self.proxy.release_data(project_name, version) project = self._get_project(project_name) if version not in project.get_versions(): diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -1,14 +1,16 @@ from tempfile import mkdtemp -import logging import shutil import os import errno import itertools +from distutils2 import logger from distutils2._backport.pkgutil import get_distributions +from distutils2._backport.sysconfig import get_config_var from distutils2.depgraph import generate_graph from distutils2.index import wrapper from distutils2.index.errors import ProjectNotFound, ReleaseNotFound +from distutils2.version import get_version_predicate """Provides installations scripts. 
@@ -53,7 +55,63 @@ else: raise e os.rename(old, new) - yield(old, new) + yield (old, new) + + +def _run_d1_install(archive_dir, path): + # backward compat: using setuptools or plain-distutils + cmd = '%s setup.py install --root=%s --record=%s' + setup_py = os.path.join(archive_dir, 'setup.py') + if 'setuptools' in open(setup_py).read(): + cmd += ' --single-version-externally-managed' + + # how to place this file in the egg-info dir + # for non-distutils2 projects ? + record_file = os.path.join(archive_dir, 'RECORD') + os.system(cmd % (sys.executable, path, record_file)) + if not os.path.exists(record_file): + raise ValueError('Failed to install.') + return open(record_file).read().split('\n') + + +def _run_d2_install(archive_dir, path): + # using our own install command + raise NotImplementedError() + + +def _install_dist(dist, path): + """Install a distribution into a path. + + This: + + * unpack the distribution + * copy the files in "path" + * determine if the distribution is distutils2 or distutils1. + """ + where = dist.unpack(archive) + + # get into the dir + archive_dir = None + for item in os.listdir(where): + fullpath = os.path.join(where, item) + if os.path.isdir(fullpath): + archive_dir = fullpath + break + + if archive_dir is None: + raise ValueError('Cannot locate the unpacked archive') + + # install + old_dir = os.getcwd() + os.chdir(archive_dir) + try: + # distutils2 or distutils1 ? + if 'setup.py' in os.listdir(archive_dir): + return _run_d1_install(archive_dir, path) + else: + return _run_d2_install(archive_dir, path) + finally: + os.chdir(old_dir) def install_dists(dists, path=None): @@ -65,19 +123,23 @@ Return a list of installed files. :param dists: distributions to install - :param path: base path to install distribution on + :param path: base path to install distribution in """ if not path: path = mkdtemp() installed_dists, installed_files = [], [] for d in dists: + logger.info('Installing %s %s' % (d.name, d.version)) try: - installed_files.extend(d.install(path)) + installed_files.extend(_install_dist(d, path)) installed_dists.append(d) except Exception, e : + logger.info('Failed. %s' % str(e)) + + # reverting for d in installed_dists: - d.uninstall() + uninstall(d) raise e return installed_files @@ -123,16 +185,26 @@ try: if install: installed_files = install_dists(install, install_path) # install to tmp first - for files in temp_files.itervalues(): - for old, new in files: - os.remove(new) - except Exception,e: + except: # if an error occurs, put back the files in the good place. - for files in temp_files.itervalues(): + for files in temp_files.values(): for old, new in files: shutil.move(new, old) + # now re-raising + raise + + # we can remove them for good + for files in temp_files.values(): + for old, new in files: + os.remove(new) + + +def _get_setuptools_deps(release): + # NotImplementedError + pass + def get_infos(requirements, index=None, installed=None, prefer_final=True): """Return the informations on what's going to be installed and upgraded. @@ -154,44 +226,74 @@ Conflict contains all the conflicting distributions, if there is a conflict. """ + if not installed: + logger.info('Reading installed distributions') + installed = get_distributions(use_egg_info=True) + + infos = {'install': [], 'remove': [], 'conflict': []} + # Is a compatible version of the project is already installed ? 
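(Aside, not part of the committed patch.) install_dists and install_from_infos above share one transactional shape: remember every change that succeeded, and on any exception undo those changes before re-raising. Stripped of the distutils2 specifics, the idiom is roughly the following; run_all, fake_install and the toy data are illustrative, not distutils2 APIs:

    def run_all(changes, apply_change, undo_change):
        # Apply each change; on failure undo the ones already applied
        # and let the exception propagate, as install_dists does.
        done = []
        try:
            for change in changes:
                apply_change(change)
                done.append(change)
        except Exception:
            for change in reversed(done):
                undo_change(change)
            raise
        return done

    applied = []

    def fake_install(name):
        if name == 'bad':
            raise RuntimeError('boom')
        applied.append(name)

    try:
        run_all(['a', 'b', 'bad'], fake_install, applied.remove)
    except RuntimeError:
        print(applied)   # [] -- 'a' and 'b' were rolled back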
+ predicate = get_version_predicate(requirements) + found = False + installed = list(installed) + + # check that the project isnt already installed + for installed_project in installed: + # is it a compatible project ? + if predicate.name.lower() != installed_project.name.lower(): + continue + found = True + logger.info('Found %s %s' % (installed_project.name, + installed_project.version)) + + # if we already have something installed, check it matches the + # requirements + if predicate.match(installed_project.version): + return infos + break + + if not found: + logger.info('Project not installed.') if not index: index = wrapper.ClientWrapper() - if not installed: - installed = get_distributions(use_egg_info=True) - # Get all the releases that match the requirements try: releases = index.get_releases(requirements) except (ReleaseNotFound, ProjectNotFound), e: raise InstallationException('Release not found: "%s"' % requirements) - + # Pick up a release, and try to get the dependency tree release = releases.get_last(requirements, prefer_final=prefer_final) - # Iter since we found something without conflicts + if release is None: + logger.info('Could not find a matching project') + return infos + + # this works for Metadata 1.2 metadata = release.fetch_metadata() - # Get the distributions already_installed on the system - # and add the one we want to install + # for earlier, we need to build setuptools deps if any + if 'requires_dist' not in metadata: + deps = _get_setuptools_deps(release) + else: + deps = metadata['requires_dist'] distributions = itertools.chain(installed, [release]) depgraph = generate_graph(distributions) # Store all the already_installed packages in a list, in case of rollback. - infos = {'install': [], 'remove': [], 'conflict': []} - # Get what the missing deps are - for dists in depgraph.missing.itervalues(): - if dists: - logging.info("missing dependencies found, installing them") - # we have missing deps - for dist in dists: - _update_infos(infos, get_infos(dist, index, installed)) + dists = depgraph.missing[release] + if dists: + logger.info("missing dependencies found, retrieving metadata") + # we have missing deps + for dist in dists: + _update_infos(infos, get_infos(dist, index, installed)) # Fill in the infos existing = [d for d in installed if d.name == release.name] + if existing: infos['remove'].append(existing[0]) infos['conflict'].extend(depgraph.reverse_list[existing[0]]) @@ -203,16 +305,46 @@ """extends the lists contained in the `info` dict with those contained in the `new_info` one """ - for key, value in infos.iteritems(): + for key, value in infos.items(): if key in new_infos: infos[key].extend(new_infos[key]) +def remove(project_name): + """Removes a single project from the installation""" + pass + + + + def main(**attrs): if 'script_args' not in attrs: import sys attrs['requirements'] = sys.argv[1] get_infos(**attrs) + +def install(project): + logger.info('Getting information about "%s".' % project) + try: + info = get_infos(project) + except InstallationException: + logger.info('Cound not find "%s".' 
% project) + return + + if info['install'] == []: + logger.info('Nothing to install.') + return + + install_path = get_config_var('base') + try: + install_from_infos(info['install'], info['remove'], info['conflict'], + install_path=install_path) + + except InstallationConflict, e: + projects = ['%s %s' % (p.name, p.version) for p in e.args[0]] + logger.info('"%s" conflicts with "%s"' % (project, ','.join(projects))) + + if __name__ == '__main__': main() diff --git a/distutils2/markers.py b/distutils2/markers.py new file mode 100644 --- /dev/null +++ b/distutils2/markers.py @@ -0,0 +1,194 @@ +""" Micro-language for PEP 345 environment markers +""" +import sys +import platform +import os +from tokenize import tokenize, NAME, OP, STRING, ENDMARKER +from StringIO import StringIO + +__all__ = ['interpret'] + + +# allowed operators +_OPERATORS = {'==': lambda x, y: x == y, + '!=': lambda x, y: x != y, + '>': lambda x, y: x > y, + '>=': lambda x, y: x >= y, + '<': lambda x, y: x < y, + '<=': lambda x, y: x <= y, + 'in': lambda x, y: x in y, + 'not in': lambda x, y: x not in y} + + +def _operate(operation, x, y): + return _OPERATORS[operation](x, y) + + +# restricted set of variables +_VARS = {'sys.platform': sys.platform, + 'python_version': sys.version[:3], + 'python_full_version': sys.version.split(' ', 1)[0], + 'os.name': os.name, + 'platform.version': platform.version(), + 'platform.machine': platform.machine()} + + +class _Operation(object): + + def __init__(self, execution_context=None): + self.left = None + self.op = None + self.right = None + if execution_context is None: + execution_context = {} + self.execution_context = execution_context + + def _get_var(self, name): + if name in self.execution_context: + return self.execution_context[name] + return _VARS[name] + + def __repr__(self): + return '%s %s %s' % (self.left, self.op, self.right) + + def _is_string(self, value): + if value is None or len(value) < 2: + return False + for delimiter in '"\'': + if value[0] == value[-1] == delimiter: + return True + return False + + def _is_name(self, value): + return value in _VARS + + def _convert(self, value): + if value in _VARS: + return self._get_var(value) + return value.strip('"\'') + + def _check_name(self, value): + if value not in _VARS: + raise NameError(value) + + def _nonsense_op(self): + msg = 'This operation is not supported : "%s"' % self + raise SyntaxError(msg) + + def __call__(self): + # make sure we do something useful + if self._is_string(self.left): + if self._is_string(self.right): + self._nonsense_op() + self._check_name(self.right) + else: + if not self._is_string(self.right): + self._nonsense_op() + self._check_name(self.left) + + if self.op not in _OPERATORS: + raise TypeError('Operator not supported "%s"' % self.op) + + left = self._convert(self.left) + right = self._convert(self.right) + return _operate(self.op, left, right) + + +class _OR(object): + def __init__(self, left, right=None): + self.left = left + self.right = right + + def filled(self): + return self.right is not None + + def __repr__(self): + return 'OR(%r, %r)' % (self.left, self.right) + + def __call__(self): + return self.left() or self.right() + + +class _AND(object): + def __init__(self, left, right=None): + self.left = left + self.right = right + + def filled(self): + return self.right is not None + + def __repr__(self): + return 'AND(%r, %r)' % (self.left, self.right) + + def __call__(self): + return self.left() and self.right() + + +class _CHAIN(object): + + def __init__(self, 
execution_context=None): + self.ops = [] + self.op_starting = True + self.execution_context = execution_context + + def eat(self, toktype, tokval, rowcol, line, logical_line): + if toktype not in (NAME, OP, STRING, ENDMARKER): + raise SyntaxError('Type not supported "%s"' % tokval) + + if self.op_starting: + op = _Operation(self.execution_context) + if len(self.ops) > 0: + last = self.ops[-1] + if isinstance(last, (_OR, _AND)) and not last.filled(): + last.right = op + else: + self.ops.append(op) + else: + self.ops.append(op) + self.op_starting = False + else: + op = self.ops[-1] + + if (toktype == ENDMARKER or + (toktype == NAME and tokval in ('and', 'or'))): + if toktype == NAME and tokval == 'and': + self.ops.append(_AND(self.ops.pop())) + elif toktype == NAME and tokval == 'or': + self.ops.append(_OR(self.ops.pop())) + self.op_starting = True + return + + if isinstance(op, (_OR, _AND)) and op.right is not None: + op = op.right + + if ((toktype in (NAME, STRING) and tokval not in ('in', 'not')) + or (toktype == OP and tokval == '.')): + if op.op is None: + if op.left is None: + op.left = tokval + else: + op.left += tokval + else: + if op.right is None: + op.right = tokval + else: + op.right += tokval + elif toktype == OP or tokval in ('in', 'not'): + if tokval == 'in' and op.op == 'not': + op.op = 'not in' + else: + op.op = tokval + + def result(self): + for op in self.ops: + if not op(): + return False + return True + + +def interpret(marker, execution_context=None): + """Interpret a marker and return a result depending on environment.""" + marker = marker.strip() + operations = _CHAIN(execution_context) + tokenize(StringIO(marker).readline, operations.eat) + return operations.result() diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -5,13 +5,12 @@ import os import sys -import platform import re from StringIO import StringIO from email import message_from_file -from tokenize import tokenize, NAME, OP, STRING, ENDMARKER from distutils2 import logger +from distutils2.markers import interpret from distutils2.version import (is_valid_predicate, is_valid_version, is_valid_versions) from distutils2.errors import (MetadataMissingError, @@ -78,13 +77,13 @@ 'Obsoletes-Dist', 'Requires-External', 'Maintainer', 'Maintainer-email', 'Project-URL') +_345_REQUIRED = ('Name', 'Version') + _ALL_FIELDS = set() _ALL_FIELDS.update(_241_FIELDS) _ALL_FIELDS.update(_314_FIELDS) _ALL_FIELDS.update(_345_FIELDS) -_345_REQUIRED = ('Name', 'Version') - def _version2fieldlist(version): if version == '1.0': return _241_FIELDS @@ -174,14 +173,19 @@ _LISTFIELDS = ('Platform', 'Classifier', 'Obsoletes', 'Requires', 'Provides', 'Obsoletes-Dist', 'Provides-Dist', 'Requires-Dist', 'Requires-External', - 'Project-URL') + 'Project-URL', 'Supported-Platform') _LISTTUPLEFIELDS = ('Project-URL',) _ELEMENTSFIELD = ('Keywords',) _UNICODEFIELDS = ('Author', 'Maintainer', 'Summary', 'Description') -_MISSING = object() +class NoDefault(object): + """Marker object used for clean representation""" + def __repr__(self): + return '' + +_MISSING = NoDefault() class DistributionMetadata(object): """The metadata of a release. 
@@ -202,6 +206,7 @@ self._fields = {} self.display_warnings = display_warnings self.version = None + self.requires_files = [] self.docutils_support = _HAS_DOCUTILS self.platform_dependent = platform_dependent self.execution_context = execution_context @@ -285,7 +290,7 @@ if not self.platform_dependent or ';' not in value: return True, value value, marker = value.split(';') - return _interpret(marker, self.execution_context), value + return interpret(marker, self.execution_context), value def _remove_line_prefix(self, value): return _LINE_PREFIX.sub('\n', value) @@ -294,13 +299,20 @@ # Public API # def get_fullname(self): + """Return the distribution name with version""" return '%s-%s' % (self['Name'], self['Version']) def is_metadata_field(self, name): + """return True if name is a valid metadata key""" name = self._convert_name(name) return name in _ALL_FIELDS + def is_multi_field(self, name): + name = self._convert_name(name) + return name in _LISTFIELDS + def read(self, filepath): + """Read the metadata values from a file path.""" self.read_file(open(filepath)) def read_file(self, fileob): @@ -454,7 +466,8 @@ return value def check(self, strict=False): - """Check if the metadata is compliant.""" + """Check if the metadata is compliant. If strict is False then raise if + no Name or Version are provided""" # XXX should check the versions (if the file was loaded) missing, warnings = [], [] @@ -494,198 +507,13 @@ return missing, warnings def keys(self): + """Dict like api""" return _version2fieldlist(self.version) def values(self): + """Dict like api""" return [self[key] for key in self.keys()] def items(self): + """Dict like api""" return [(key, self[key]) for key in self.keys()] - - -# -# micro-language for PEP 345 environment markers -# - -# allowed operators -_OPERATORS = {'==': lambda x, y: x == y, - '!=': lambda x, y: x != y, - '>': lambda x, y: x > y, - '>=': lambda x, y: x >= y, - '<': lambda x, y: x < y, - '<=': lambda x, y: x <= y, - 'in': lambda x, y: x in y, - 'not in': lambda x, y: x not in y} - - -def _operate(operation, x, y): - return _OPERATORS[operation](x, y) - -# restricted set of variables -_VARS = {'sys.platform': sys.platform, - 'python_version': sys.version[:3], - 'python_full_version': sys.version.split(' ', 1)[0], - 'os.name': os.name, - 'platform.version': platform.version(), - 'platform.machine': platform.machine()} - - -class _Operation(object): - - def __init__(self, execution_context=None): - self.left = None - self.op = None - self.right = None - if execution_context is None: - execution_context = {} - self.execution_context = execution_context - - def _get_var(self, name): - if name in self.execution_context: - return self.execution_context[name] - return _VARS[name] - - def __repr__(self): - return '%s %s %s' % (self.left, self.op, self.right) - - def _is_string(self, value): - if value is None or len(value) < 2: - return False - for delimiter in '"\'': - if value[0] == value[-1] == delimiter: - return True - return False - - def _is_name(self, value): - return value in _VARS - - def _convert(self, value): - if value in _VARS: - return self._get_var(value) - return value.strip('"\'') - - def _check_name(self, value): - if value not in _VARS: - raise NameError(value) - - def _nonsense_op(self): - msg = 'This operation is not supported : "%s"' % self - raise SyntaxError(msg) - - def __call__(self): - # make sure we do something useful - if self._is_string(self.left): - if self._is_string(self.right): - self._nonsense_op() - self._check_name(self.right) - 
else: - if not self._is_string(self.right): - self._nonsense_op() - self._check_name(self.left) - - if self.op not in _OPERATORS: - raise TypeError('Operator not supported "%s"' % self.op) - - left = self._convert(self.left) - right = self._convert(self.right) - return _operate(self.op, left, right) - - -class _OR(object): - def __init__(self, left, right=None): - self.left = left - self.right = right - - def filled(self): - return self.right is not None - - def __repr__(self): - return 'OR(%r, %r)' % (self.left, self.right) - - def __call__(self): - return self.left() or self.right() - - -class _AND(object): - def __init__(self, left, right=None): - self.left = left - self.right = right - - def filled(self): - return self.right is not None - - def __repr__(self): - return 'AND(%r, %r)' % (self.left, self.right) - - def __call__(self): - return self.left() and self.right() - - -class _CHAIN(object): - - def __init__(self, execution_context=None): - self.ops = [] - self.op_starting = True - self.execution_context = execution_context - - def eat(self, toktype, tokval, rowcol, line, logical_line): - if toktype not in (NAME, OP, STRING, ENDMARKER): - raise SyntaxError('Type not supported "%s"' % tokval) - - if self.op_starting: - op = _Operation(self.execution_context) - if len(self.ops) > 0: - last = self.ops[-1] - if isinstance(last, (_OR, _AND)) and not last.filled(): - last.right = op - else: - self.ops.append(op) - else: - self.ops.append(op) - self.op_starting = False - else: - op = self.ops[-1] - - if (toktype == ENDMARKER or - (toktype == NAME and tokval in ('and', 'or'))): - if toktype == NAME and tokval == 'and': - self.ops.append(_AND(self.ops.pop())) - elif toktype == NAME and tokval == 'or': - self.ops.append(_OR(self.ops.pop())) - self.op_starting = True - return - - if isinstance(op, (_OR, _AND)) and op.right is not None: - op = op.right - - if ((toktype in (NAME, STRING) and tokval not in ('in', 'not')) - or (toktype == OP and tokval == '.')): - if op.op is None: - if op.left is None: - op.left = tokval - else: - op.left += tokval - else: - if op.right is None: - op.right = tokval - else: - op.right += tokval - elif toktype == OP or tokval in ('in', 'not'): - if tokval == 'in' and op.op == 'not': - op.op = 'not in' - else: - op.op = tokval - - def result(self): - for op in self.ops: - if not op(): - return False - return True - - -def _interpret(marker, execution_context=None): - """Interpret a marker and return a result depending on environment.""" - marker = marker.strip() - operations = _CHAIN(execution_context) - tokenize(StringIO(marker).readline, operations.eat) - return operations.result() diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -1,7 +1,9 @@ import os import sys from optparse import OptionParser +import logging +from distutils2 import logger from distutils2.util import grok_environment_error from distutils2.errors import (DistutilsSetupError, DistutilsArgError, DistutilsError, CCompilerError) @@ -9,6 +11,7 @@ from distutils2 import __version__ from distutils2._backport.pkgutil import get_distributions, get_distribution from distutils2.depgraph import generate_graph +from distutils2.install import install # This is a barebones help message generated displayed when the user # runs the setup script with no arguments at all. 
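(Aside, not part of the committed patch.) The environment-marker evaluator removed from metadata.py in the hunk above now lives in the new distutils2/markers.py as interpret(marker, execution_context=None). A quick sketch of the PEP 345 micro-language it evaluates, assuming distutils2 is importable as laid out in this patch:

    from distutils2.markers import interpret

    # evaluated against the running interpreter
    print(interpret("python_version >= '2.4' and os.name == 'posix'"))

    # execution_context lets the caller override the variables
    print(interpret("sys.platform == 'win32'",
                    execution_context={'sys.platform': 'win32'}))  # True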
More useful help @@ -109,13 +112,23 @@ except (DistutilsError, CCompilerError), msg: + raise raise SystemExit, "error: " + str(msg) return dist +def _set_logger(): + logger.setLevel(logging.INFO) + sth = logging.StreamHandler(sys.stderr) + sth.setLevel(logging.INFO) + logger.addHandler(sth) + logger.propagate = 0 + + def main(): """Main entry point for Distutils2""" + _set_logger() parser = OptionParser() parser.disable_interspersed_args() parser.usage = '%prog [options] cmd1 cmd2 ..' @@ -124,6 +137,10 @@ action="store_true", dest="version", default=False, help="Prints out the version of Distutils2 and exits.") + parser.add_option("-m", "--metadata", + action="append", dest="metadata", default=[], + help="List METADATA metadata or 'all' for all metadatas.") + parser.add_option("-s", "--search", action="store", dest="search", default=None, help="Search for installed distributions.") @@ -136,11 +153,44 @@ action="store_true", dest="fgraph", default=False, help="Display the full graph for installed distributions.") + parser.add_option("-i", "--install", + action="store", dest="install", + help="Install a project.") + + parser.add_option("-r", "--remove", + action="store", dest="remove", + help="Remove a project.") + options, args = parser.parse_args() if options.version: print('Distutils2 %s' % __version__) # sys.exit(0) + if len(options.metadata): + from distutils2.dist import Distribution + dist = Distribution() + dist.parse_config_files() + metadata = dist.metadata + + if 'all' in options.metadata: + keys = metadata.keys() + else: + keys = options.metadata + if len(keys) == 1: + print metadata[keys[0]] + sys.exit(0) + + for key in keys: + if key in metadata: + print(metadata._convert_name(key)+':') + value = metadata[key] + if isinstance(value, list): + for v in value: + print(' '+v) + else: + print(' '+value.replace('\n', '\n ')) + sys.exit(0) + if options.search is not None: search = options.search.lower() for dist in get_distributions(use_egg_info=True): @@ -169,6 +219,10 @@ print(graph) sys.exit(0) + if options.install is not None: + install(options.install) + sys.exit(0) + if len(args) == 0: parser.print_help() sys.exit(0) diff --git a/distutils2/tests/pypi_server.py b/distutils2/tests/pypi_server.py --- a/distutils2/tests/pypi_server.py +++ b/distutils2/tests/pypi_server.py @@ -375,6 +375,7 @@ def __init__(self, dists=[]): self._dists = dists + self._search_result = [] def add_distributions(self, dists): for dist in dists: diff --git a/distutils2/tests/support.py b/distutils2/tests/support.py --- a/distutils2/tests/support.py +++ b/distutils2/tests/support.py @@ -17,10 +17,11 @@ super(SomeTestCase, self).setUp() ... # other setup code -Read each class' docstring to see its purpose and usage. +Also provided is a DummyCommand class, useful to mock commands in the +tests of another command that needs them, a create_distribution function +and a skip_unless_symlink decorator. -Also provided is a DummyCommand class, useful to mock commands in the -tests of another command that needs them (see docstring). +Each class or function has a docstring to explain its purpose and usage. 
""" import os @@ -35,7 +36,8 @@ from distutils2.tests import unittest __all__ = ['LoggingCatcher', 'WarningsCatcher', 'TempdirManager', - 'EnvironGuard', 'DummyCommand', 'unittest'] + 'EnvironGuard', 'DummyCommand', 'unittest', 'create_distribution', + 'skip_unless_symlink'] class LoggingCatcher(object): @@ -135,7 +137,7 @@ finally: f.close() - def create_dist(self, pkg_name='foo', **kw): + def create_dist(self, **kw): """Create a stub distribution object and files. This function creates a Distribution instance (use keyword arguments @@ -143,17 +145,19 @@ (currently an empty directory). It returns the path to the directory and the Distribution instance. - You can use TempdirManager.write_file to write any file in that + You can use self.write_file to write any file in that directory, e.g. setup scripts or Python modules. """ # Late import so that third parties can import support without # loading a ton of distutils2 modules in memory. from distutils2.dist import Distribution + if 'name' not in kw: + kw['name'] = 'foo' tmp_dir = self.mkdtemp() - pkg_dir = os.path.join(tmp_dir, pkg_name) - os.mkdir(pkg_dir) + project_dir = os.path.join(tmp_dir, kw['name']) + os.mkdir(project_dir) dist = Distribution(attrs=kw) - return pkg_dir, dist + return project_dir, dist class EnvironGuard(object): @@ -211,3 +215,9 @@ d.parse_command_line() return d + +try: + from test.test_support import skip_unless_symlink +except ImportError: + skip_unless_symlink = unittest.skip( + 'requires test.test_support.skip_unless_symlink') diff --git a/distutils2/tests/test_command_build_ext.py b/distutils2/tests/test_command_build_ext.py --- a/distutils2/tests/test_command_build_ext.py +++ b/distutils2/tests/test_command_build_ext.py @@ -289,7 +289,7 @@ # inplace = 0, cmd.package = 'bar' build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {'': 'bar'} + build_py.package_dir = 'bar' path = cmd.get_ext_fullpath('foo') # checking that the last directory is the build_dir path = os.path.split(path)[0] @@ -318,7 +318,7 @@ dist = Distribution() cmd = build_ext(dist) cmd.inplace = 1 - cmd.distribution.package_dir = {'': 'src'} + cmd.distribution.package_dir = 'src' cmd.distribution.packages = ['lxml', 'lxml.html'] curdir = os.getcwd() wanted = os.path.join(curdir, 'src', 'lxml', 'etree' + ext) @@ -334,7 +334,7 @@ # building twisted.runner.portmap not inplace build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {} + build_py.package_dir = None cmd.distribution.packages = ['twisted', 'twisted.runner.portmap'] path = cmd.get_ext_fullpath('twisted.runner.portmap') wanted = os.path.join(curdir, 'tmpdir', 'twisted', 'runner', diff --git a/distutils2/tests/test_command_build_py.py b/distutils2/tests/test_command_build_py.py --- a/distutils2/tests/test_command_build_py.py +++ b/distutils2/tests/test_command_build_py.py @@ -17,12 +17,14 @@ def test_package_data(self): sources = self.mkdtemp() - f = open(os.path.join(sources, "__init__.py"), "w") + pkg_dir = os.path.join(sources, 'pkg') + os.mkdir(pkg_dir) + f = open(os.path.join(pkg_dir, "__init__.py"), "w") try: f.write("# Pretend this is a package.") finally: f.close() - f = open(os.path.join(sources, "README.txt"), "w") + f = open(os.path.join(pkg_dir, "README.txt"), "w") try: f.write("Info about this package") finally: @@ -31,8 +33,9 @@ destination = self.mkdtemp() dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": sources}}) + "package_dir": sources}) # script_name need not exist, it just need to be initialized + dist.script_name 
= os.path.join(sources, "setup.py") dist.command_obj["build"] = support.DummyCommand( force=0, @@ -42,7 +45,7 @@ use_2to3=False) dist.packages = ["pkg"] dist.package_data = {"pkg": ["README.txt"]} - dist.package_dir = {"pkg": sources} + dist.package_dir = sources cmd = build_py(dist) cmd.compile = 1 @@ -68,19 +71,20 @@ # create the distribution files. sources = self.mkdtemp() - open(os.path.join(sources, "__init__.py"), "w").close() - - testdir = os.path.join(sources, "doc") + pkg = os.path.join(sources, 'pkg') + os.mkdir(pkg) + open(os.path.join(pkg, "__init__.py"), "w").close() + testdir = os.path.join(pkg, "doc") os.mkdir(testdir) open(os.path.join(testdir, "testfile"), "w").close() os.chdir(sources) old_stdout = sys.stdout - sys.stdout = StringIO.StringIO() + #sys.stdout = StringIO.StringIO() try: dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": ""}, + "package_dir": sources, "package_data": {"pkg": ["doc/*"]}}) # script_name need not exist, it just need to be initialized dist.script_name = os.path.join(sources, "setup.py") @@ -89,7 +93,7 @@ try: dist.run_commands() - except DistutilsFileError: + except DistutilsFileError, e: self.fail("failed package_data test when package_dir is ''") finally: # Restore state. diff --git a/distutils2/tests/test_command_install_dist.py b/distutils2/tests/test_command_install_dist.py --- a/distutils2/tests/test_command_install_dist.py +++ b/distutils2/tests/test_command_install_dist.py @@ -180,8 +180,8 @@ cmd.user = 'user' self.assertRaises(DistutilsOptionError, cmd.finalize_options) - def test_record(self): - + def test_old_record(self): + # test pre-PEP 376 --record option (outside dist-info dir) install_dir = self.mkdtemp() pkgdir, dist = self.create_dist() @@ -189,11 +189,11 @@ cmd = install_dist(dist) dist.command_obj['install_dist'] = cmd cmd.root = install_dir - cmd.record = os.path.join(pkgdir, 'RECORD') + cmd.record = os.path.join(pkgdir, 'filelist') cmd.ensure_finalized() cmd.run() - # let's check the RECORD file was created with four + # let's check the record file was created with four # lines, one for each .dist-info entry: METADATA, # INSTALLER, REQUSTED, RECORD f = open(cmd.record) diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -5,6 +5,8 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest +from distutils2.command.sdist import sdist +from distutils2.errors import DistutilsFileError SETUP_CFG = """ @@ -16,7 +18,7 @@ maintainer = ??ric Araujo maintainer_email = merwok at netwok.org summary = A sample project demonstrating distutils2 packaging -description-file = README +description-file = %(description-file)s keywords = distutils2, packaging, sample project classifier = @@ -47,9 +49,11 @@ Fork in progress, http://bitbucket.org/Merwok/sample-distutils2-project [files] +packages_root = src + packages = one - src:two - src2:three + two + three modules = haven @@ -66,6 +70,8 @@ config = cfg/data.cfg /etc/init.d = init-script +extra_files = %(extra-files)s + # Replaces MANIFEST.in sdist_extra = include THANKS HACKING @@ -130,22 +136,33 @@ self.addCleanup(setattr, sys, 'stderr', sys.stderr) self.addCleanup(os.chdir, os.getcwd()) - def test_config(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) - self.write_file('README', 'yeah') + def write_setup(self, kwargs=None): + opts = {'description-file': 'README', 'extra-files':''} + if 
kwargs: + opts.update(kwargs) + self.write_file('setup.cfg', SETUP_CFG % opts) - # try to load the metadata now + + def run_setup(self, *args): + # run setup with args sys.stdout = StringIO() - sys.argv[:] = ['setup.py', '--version'] + sys.argv[:] = [''] + list(args) old_sys = sys.argv[:] - try: from distutils2.run import commands_main dist = commands_main() finally: sys.argv[:] = old_sys + return dist + + def test_config(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup() + self.write_file('README', 'yeah') + + # try to load the metadata now + dist = self.run_setup('--version') # sanity check self.assertEqual(sys.stdout.getvalue(), '0.6.4.dev1' + os.linesep) @@ -184,7 +201,6 @@ 'http://bitbucket.org/Merwok/sample-distutils2-project')] self.assertEqual(dist.metadata['Project-Url'], urls) - self.assertEqual(dist.packages, ['one', 'two', 'three']) self.assertEqual(dist.py_modules, ['haven']) self.assertEqual(dist.package_data, {'cheese': 'data/templates/*'}) @@ -192,7 +208,8 @@ [('bitmaps ', ['bm/b1.gif', 'bm/b2.gif']), ('config ', ['cfg/data.cfg']), ('/etc/init.d ', ['init-script'])]) - self.assertEqual(dist.package_dir['two'], 'src') + + self.assertEqual(dist.package_dir, 'src') # Make sure we get the foo command loaded. We use a string comparison # instead of assertIsInstance because the class is not the same when @@ -213,10 +230,94 @@ d = new_compiler(compiler='d') self.assertEqual(d.description, 'D Compiler') + + def test_multiple_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog2') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_multiline_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README\n CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata['description'], 'yeah\nchangelog') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_metadata_requires_description_files_missing(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--version') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + def test_metadata_requires_description_files(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2', + 'extra-files':'\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in 
('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--description') + self.assertIn('yeah\nyeah\n', sys.stdout.getvalue()) + + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + self.write_setup({'description-file': 'README\n README2', + 'extra-files': '\n README2\n README'}) + dist = self.run_setup('--description') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + cmd.make_distribution() + self.assertIn('README\nREADME2\n', open('MANIFEST').read()) + def test_sub_commands(self): tempdir = self.mkdtemp() os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) + self.write_setup() self.write_file('README', 'yeah') self.write_file('haven.py', '#') self.write_file('script1.py', '#') @@ -224,20 +325,15 @@ self.write_file(os.path.join('scripts', 'find-coconuts'), '#') os.mkdir('bin') self.write_file(os.path.join('bin', 'taunt'), '#') + os.mkdir('src') - for pkg in ('one', 'src', 'src2'): + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) os.mkdir(pkg) self.write_file(os.path.join(pkg, '__init__.py'), '#') # try to run the install command to see if foo is called - sys.stdout = sys.stderr = StringIO() - sys.argv[:] = ['', 'install_dist'] - old_sys = sys.argv[:] - try: - from distutils2.run import main - dist = main() - finally: - sys.argv[:] = old_sys + dist = self.run_setup('install_dist') self.assertEqual(dist.foo_was_here, 1) diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -29,27 +29,16 @@ class ToInstallDist(object): """Distribution that will be installed""" - def __init__(self, raise_error=False, files=False): - self._raise_error = raise_error + def __init__(self, files=False): self._files = files - self.install_called = False - self.install_called_with = {} self.uninstall_called = False self._real_files = [] + self.name = "fake" + self.version = "fake" if files: for f in range(0,3): self._real_files.append(mkstemp()) - def install(self, *args): - self.install_called = True - self.install_called_with = args - if self._raise_error: - raise Exception('Oops !') - return ['/path/to/foo', '/path/to/bar'] - - def uninstall(self, **args): - self.uninstall_called = True - def get_installed_files(self, **args): if self._files: return [f[1] for f in self._real_files] @@ -58,7 +47,49 @@ return self.get_installed_files() +class MagicMock(object): + def __init__(self, return_value=None, raise_exception=False): + self.called = False + self._times_called = 0 + self._called_with = [] + self._return_value = return_value + self._raise = raise_exception + + def __call__(self, *args, **kwargs): + self.called = True + self._times_called = self._times_called + 1 + self._called_with.append((args, kwargs)) + iterable = hasattr(self._raise, '__iter__') + if self._raise: + if ((not iterable and self._raise) + or self._raise[self._times_called - 1]): + raise Exception + return self._return_value + + def called_with(self, *args, **kwargs): + return (args, kwargs) in self._called_with + + +def patch(parent, to_patch): + """monkey match a module""" + def wrapper(func): + print func + print dir(func) + old_func = getattr(parent, to_patch) + def wrapped(*args, **kwargs): + parent.__dict__[to_patch] = MagicMock() + try: + out = func(*args, **kwargs) + finally: + setattr(parent, 
to_patch, old_func) + return out + return wrapped + return wrapper + + def get_installed_dists(dists): + """Return a list of fake installed dists. + The list is name, version, deps""" objects = [] for (name, version, deps) in dists: objects.append(InstalledDist(name, version, deps)) @@ -69,6 +100,12 @@ def _get_client(self, server, *args, **kwargs): return Client(server.full_address, *args, **kwargs) + def _patch_run_install(self): + """Patch run install""" + + def _unpatch_run_install(self): + """Unpatch run install for d2 and d1""" + def _get_results(self, output): """return a list of results""" installed = [(o.name, '%s' % o.version) for o in output['install']] @@ -150,6 +187,8 @@ # Tests that conflicts are detected client = self._get_client(server) archive_path = '%s/distribution.tar.gz' % server.full_address + + # choxie depends on towel-stuff, which depends on bacon. server.xmlrpc.set_distributions([ {'name':'choxie', 'version': '2.0.0.9', @@ -164,7 +203,9 @@ 'requires_dist': [], 'url': archive_path}, ]) - already_installed = [('bacon', '0.1', []), + + # name, version, deps. + already_installed = [('bacon', '0.1', []), ('chicken', '1.1', ['bacon (0.1)'])] output = install.get_infos("choxie", index=client, installed= get_installed_dists(already_installed)) @@ -221,23 +262,39 @@ # if one of the distribution installation fails, call uninstall on all # installed distributions. - d1 = ToInstallDist() - d2 = ToInstallDist(raise_error=True) - self.assertRaises(Exception, install.install_dists, [d1, d2]) - for dist in (d1, d2): - self.assertTrue(dist.install_called) - self.assertTrue(d1.uninstall_called) - self.assertFalse(d2.uninstall_called) + old_install_dist = install._install_dist + old_uninstall = getattr(install, 'uninstall', None) + + install._install_dist = MagicMock(return_value=[], + raise_exception=(False, True)) + install.uninstall = MagicMock() + try: + d1 = ToInstallDist() + d2 = ToInstallDist() + path = self.mkdtemp() + self.assertRaises(Exception, install.install_dists, [d1, d2], path) + self.assertTrue(install._install_dist.called_with(d1, path)) + self.assertTrue(install.uninstall.called) + finally: + install._install_dist = old_install_dist + install.uninstall = old_uninstall + def test_install_dists_success(self): - # test that the install method is called on each of the distributions. - d1 = ToInstallDist() - d2 = ToInstallDist() - install.install_dists([d1, d2]) - for dist in (d1, d2): - self.assertTrue(dist.install_called) - self.assertFalse(d1.uninstall_called) - self.assertFalse(d2.uninstall_called) + old_install_dist = install._install_dist + install._install_dist = MagicMock(return_value=[]) + try: + # test that the install method is called on each of the distributions. + d1 = ToInstallDist() + d2 = ToInstallDist() + + # should call install + path = self.mkdtemp() + install.install_dists([d1, d2], path) + for dist in (d1, d2): + self.assertTrue(install._install_dist.called_with(dist, path)) + finally: + install._install_dist = old_install_dist def test_install_from_infos_conflict(self): # assert conflicts raise an exception @@ -262,29 +319,46 @@ install.install_dists = old_install_dists def test_install_from_infos_remove_rollback(self): - # assert that if an error occurs, the removed files are restored. 
- remove = [] - for i in range(0,2): - remove.append(ToInstallDist(files=True, raise_error=True)) - to_install = [ToInstallDist(raise_error=True), - ToInstallDist()] + old_install_dist = install._install_dist + old_uninstall = getattr(install, 'uninstall', None) - install.install_from_infos(remove=remove, install=to_install) - # assert that the files are in the same place - # assert that the files have been removed - for dist in remove: - for f in dist.get_installed_files(): - self.assertTrue(os.path.exists(f)) + install._install_dist = MagicMock(return_value=[], + raise_exception=(False, True)) + install.uninstall = MagicMock() + try: + # assert that if an error occurs, the removed files are restored. + remove = [] + for i in range(0,2): + remove.append(ToInstallDist(files=True)) + to_install = [ToInstallDist(), ToInstallDist()] + + self.assertRaises(Exception, install.install_from_infos, + remove=remove, install=to_install) + # assert that the files are in the same place + # assert that the files have been removed + for dist in remove: + for f in dist.get_installed_files(): + self.assertTrue(os.path.exists(f)) + finally: + install.install_dist = old_install_dist + install.uninstall = old_uninstall + def test_install_from_infos_install_succes(self): - # assert that the distribution can be installed - install_path = "my_install_path" - to_install = [ToInstallDist(), ToInstallDist()] + old_install_dist = install._install_dist + install._install_dist = MagicMock([]) + try: + # assert that the distribution can be installed + install_path = "my_install_path" + to_install = [ToInstallDist(), ToInstallDist()] - install.install_from_infos(install=to_install, - install_path=install_path) - for dist in to_install: - self.assertEqual(dist.install_called_with, (install_path,)) + install.install_from_infos(install=to_install, + install_path=install_path) + for dist in to_install: + install._install_dist.called_with(install_path) + finally: + install._install_dist = old_install_dist + def test_suite(): suite = unittest.TestSuite() diff --git a/distutils2/tests/test_markers.py b/distutils2/tests/test_markers.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_markers.py @@ -0,0 +1,69 @@ +"""Tests for distutils.metadata.""" +import os +import sys +import platform +from StringIO import StringIO + +from distutils2.markers import interpret +from distutils2.tests import run_unittest, unittest +from distutils2.tests.support import LoggingCatcher, WarningsCatcher + + +class MarkersTestCase(LoggingCatcher, WarningsCatcher, + unittest.TestCase): + + def test_interpret(self): + sys_platform = sys.platform + version = sys.version.split()[0] + os_name = os.name + platform_version = platform.version() + platform_machine = platform.machine() + + self.assertTrue(interpret("sys.platform == '%s'" % sys_platform)) + self.assertTrue(interpret( + "sys.platform == '%s' or python_version == '2.4'" % sys_platform)) + self.assertTrue(interpret( + "sys.platform == '%s' and python_full_version == '%s'" % + (sys_platform, version))) + self.assertTrue(interpret("'%s' == sys.platform" % sys_platform)) + self.assertTrue(interpret('os.name == "%s"' % os_name)) + self.assertTrue(interpret( + 'platform.version == "%s" and platform.machine == "%s"' % + (platform_version, platform_machine))) + + # stuff that need to raise a syntax error + ops = ('os.name == os.name', 'os.name == 2', "'2' == '2'", + 'okpjonon', '', 'os.name ==', 'python_version == 2.4') + for op in ops: + self.assertRaises(SyntaxError, interpret, op) + + # 
combined operations + OP = 'os.name == "%s"' % os_name + AND = ' and ' + OR = ' or ' + self.assertTrue(interpret(OP + AND + OP)) + self.assertTrue(interpret(OP + AND + OP + AND + OP)) + self.assertTrue(interpret(OP + OR + OP)) + self.assertTrue(interpret(OP + OR + OP + OR + OP)) + + # other operators + self.assertTrue(interpret("os.name != 'buuuu'")) + self.assertTrue(interpret("python_version > '1.0'")) + self.assertTrue(interpret("python_version < '5.0'")) + self.assertTrue(interpret("python_version <= '5.0'")) + self.assertTrue(interpret("python_version >= '1.0'")) + self.assertTrue(interpret("'%s' in os.name" % os_name)) + self.assertTrue(interpret("'buuuu' not in os.name")) + self.assertTrue(interpret( + "'buuuu' not in os.name and '%s' in os.name" % os_name)) + + # execution context + self.assertTrue(interpret('python_version == "0.1"', + {'python_version': '0.1'})) + + +def test_suite(): + return unittest.makeSuite(MarkersTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -1,10 +1,10 @@ -"""Tests for distutils.command.bdist.""" +"""Tests for distutils.metadata.""" import os import sys import platform from StringIO import StringIO -from distutils2.metadata import (DistributionMetadata, _interpret, +from distutils2.metadata import (DistributionMetadata, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -46,55 +46,6 @@ self.assertRaises(TypeError, DistributionMetadata, PKG_INFO, mapping=m, fileobj=fp) - def test_interpret(self): - sys_platform = sys.platform - version = sys.version.split()[0] - os_name = os.name - platform_version = platform.version() - platform_machine = platform.machine() - - self.assertTrue(_interpret("sys.platform == '%s'" % sys_platform)) - self.assertTrue(_interpret( - "sys.platform == '%s' or python_version == '2.4'" % sys_platform)) - self.assertTrue(_interpret( - "sys.platform == '%s' and python_full_version == '%s'" % - (sys_platform, version))) - self.assertTrue(_interpret("'%s' == sys.platform" % sys_platform)) - self.assertTrue(_interpret('os.name == "%s"' % os_name)) - self.assertTrue(_interpret( - 'platform.version == "%s" and platform.machine == "%s"' % - (platform_version, platform_machine))) - - # stuff that need to raise a syntax error - ops = ('os.name == os.name', 'os.name == 2', "'2' == '2'", - 'okpjonon', '', 'os.name ==', 'python_version == 2.4') - for op in ops: - self.assertRaises(SyntaxError, _interpret, op) - - # combined operations - OP = 'os.name == "%s"' % os_name - AND = ' and ' - OR = ' or ' - self.assertTrue(_interpret(OP + AND + OP)) - self.assertTrue(_interpret(OP + AND + OP + AND + OP)) - self.assertTrue(_interpret(OP + OR + OP)) - self.assertTrue(_interpret(OP + OR + OP + OR + OP)) - - # other operators - self.assertTrue(_interpret("os.name != 'buuuu'")) - self.assertTrue(_interpret("python_version > '1.0'")) - self.assertTrue(_interpret("python_version < '5.0'")) - self.assertTrue(_interpret("python_version <= '5.0'")) - self.assertTrue(_interpret("python_version >= '1.0'")) - self.assertTrue(_interpret("'%s' in os.name" % os_name)) - self.assertTrue(_interpret("'buuuu' not in os.name")) - self.assertTrue(_interpret( - "'buuuu' not in os.name and '%s' in os.name" % os_name)) - - # execution context - self.assertTrue(_interpret('python_version == 
"0.1"', - {'python_version': '0.1'})) - def test_metadata_read_write(self): PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') metadata = DistributionMetadata(PKG_INFO) diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -15,6 +15,7 @@ from copy import copy from fnmatch import fnmatchcase from ConfigParser import RawConfigParser +from inspect import getsource from distutils2.errors import (DistutilsPlatformError, DistutilsFileError, DistutilsByteCompileError, DistutilsExecError) @@ -674,83 +675,6 @@ return base, ext -def unzip_file(filename, location, flatten=True): - """Unzip the file (zip file located at filename) to the destination - location""" - if not os.path.exists(location): - os.makedirs(location) - zipfp = open(filename, 'rb') - try: - zip = zipfile.ZipFile(zipfp) - leading = has_leading_dir(zip.namelist()) and flatten - for name in zip.namelist(): - data = zip.read(name) - fn = name - if leading: - fn = split_leading_dir(name)[1] - fn = os.path.join(location, fn) - dir = os.path.dirname(fn) - if not os.path.exists(dir): - os.makedirs(dir) - if fn.endswith('/') or fn.endswith('\\'): - # A directory - if not os.path.exists(fn): - os.makedirs(fn) - else: - fp = open(fn, 'wb') - try: - fp.write(data) - finally: - fp.close() - finally: - zipfp.close() - - -def untar_file(filename, location): - """Untar the file (tar file located at filename) to the destination - location - """ - if not os.path.exists(location): - os.makedirs(location) - if filename.lower().endswith('.gz') or filename.lower().endswith('.tgz'): - mode = 'r:gz' - elif (filename.lower().endswith('.bz2') - or filename.lower().endswith('.tbz')): - mode = 'r:bz2' - elif filename.lower().endswith('.tar'): - mode = 'r' - else: - mode = 'r:*' - tar = tarfile.open(filename, mode) - try: - leading = has_leading_dir([member.name for member in tar.getmembers()]) - for member in tar.getmembers(): - fn = member.name - if leading: - fn = split_leading_dir(fn)[1] - path = os.path.join(location, fn) - if member.isdir(): - if not os.path.exists(path): - os.makedirs(path) - else: - try: - fp = tar.extractfile(member) - except (KeyError, AttributeError): - # Some corrupt tar files seem to produce this - # (specifically bad symlinks) - continue - if not os.path.exists(os.path.dirname(path)): - os.makedirs(os.path.dirname(path)) - destfp = open(path, 'wb') - try: - shutil.copyfileobj(fp, destfp) - finally: - destfp.close() - fp.close() - finally: - tar.close() - - def has_leading_dir(paths): """Returns true if all the paths have the same leading path name (i.e., everything is in one subdirectory in an archive)""" @@ -1127,3 +1051,117 @@ """ Issues a call to util.run_2to3. """ return run_2to3(files, doctests_only, self.fixer_names, self.options, self.explicit) + + +def generate_distutils_kwargs_from_setup_cfg(file='setup.cfg'): + """ Distutils2 to distutils1 compatibility util. + + This method uses an existing setup.cfg to generate a dictionnary of + keywords that can be used by distutils.core.setup(kwargs**). + + :param file: + The setup.cfg path. + :raises DistutilsFileError: + When the setup.cfg file is not found. + + """ + # We need to declare the following constants here so that it's easier to + # generate the setup.py afterwards, using inspect.getsource. 
+ D1_D2_SETUP_ARGS = { + # D1 name : (D2_section, D2_name) + "name" : ("metadata",), + "version" : ("metadata",), + "author" : ("metadata",), + "author_email" : ("metadata",), + "maintainer" : ("metadata",), + "maintainer_email" : ("metadata",), + "url" : ("metadata", "home_page"), + "description" : ("metadata", "summary"), + "long_description" : ("metadata", "description"), + "download-url" : ("metadata",), + "classifiers" : ("metadata", "classifier"), + "platforms" : ("metadata", "platform"), # Needs testing + "license" : ("metadata",), + "requires" : ("metadata", "requires_dist"), + "provides" : ("metadata", "provides_dist"), # Needs testing + "obsoletes" : ("metadata", "obsoletes_dist"), # Needs testing + + "packages" : ("files",), + "scripts" : ("files",), + "py_modules" : ("files", "modules"), # Needs testing + } + + MULTI_FIELDS = ("classifiers", + "requires", + "platforms", + "packages", + "scripts") + + def has_get_option(config, section, option): + if config.has_option(section, option): + return config.get(section, option) + elif config.has_option(section, option.replace('_', '-')): + return config.get(section, option.replace('_', '-')) + else: + return False + + # The method source code really starts here. + config = RawConfigParser() + if not os.path.exists(file): + raise DistutilsFileError("file '%s' does not exist" % + os.path.abspath(file)) + config.read(file) + + kwargs = {} + for arg in D1_D2_SETUP_ARGS: + if len(D1_D2_SETUP_ARGS[arg]) == 2: + # The distutils field name is different than distutils2's. + section, option = D1_D2_SETUP_ARGS[arg] + + elif len(D1_D2_SETUP_ARGS[arg]) == 1: + # The distutils field name is the same thant distutils2's. + section = D1_D2_SETUP_ARGS[arg][0] + option = arg + + in_cfg_value = has_get_option(config, section, option) + if not in_cfg_value: + # There is no such option in the setup.cfg + if arg == "long_description": + filename = has_get_option(config, section, "description_file") + print "We have a filename", filename + if filename: + in_cfg_value = open(filename).read() + else: + continue + + if arg in MULTI_FIELDS: + # Special behaviour when we have a multi line option + if "\n" in in_cfg_value: + in_cfg_value = in_cfg_value.strip().split('\n') + else: + in_cfg_value = list((in_cfg_value,)) + + kwargs[arg] = in_cfg_value + + return kwargs + + +def generate_distutils_setup_py(): + """ Generate a distutils compatible setup.py using an existing setup.cfg. + + :raises DistutilsFileError: + When a setup.py already exists. + """ + if os.path.exists("setup.py"): + raise DistutilsFileError("A pre existing setup.py file exists") + + handle = open("setup.py", "w") + handle.write("# Distutils script using distutils2 setup.cfg to call the\n") + handle.write("# distutils.core.setup() with the right args.\n\n\n") + handle.write("import os\n") + handle.write("from distutils.core import setup\n") + handle.write("from ConfigParser import RawConfigParser\n\n") + handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) + handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") + handle.write("setup(**kwargs)") + handle.close() diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -1055,6 +1055,13 @@ Create a file called *filename* and write *contents* (a sequence of strings without line terminators) to it. 
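The two helpers added to ``distutils2/util.py`` above bridge distutils2 configuration back to the original distutils: one turns a ``setup.cfg`` into a keyword dictionary for ``distutils.core.setup()``, the other writes a self-contained ``setup.py`` embedding that logic. A minimal sketch of invoking them explicitly, assuming the current directory holds a distutils2-style ``setup.cfg`` (the printed fields are only illustrative)::

    # Sketch only: both helpers live in distutils2.util as added above.
    from distutils2.util import (generate_distutils_kwargs_from_setup_cfg,
                                 generate_distutils_setup_py)

    # Keyword dict suitable for distutils.core.setup(**kwargs), built from
    # the [metadata] and [files] sections of setup.cfg.
    kwargs = generate_distutils_kwargs_from_setup_cfg('setup.cfg')
    print kwargs.get('name'), kwargs.get('version')

    # Writes a distutils1-compatible setup.py next to setup.cfg; raises
    # DistutilsFileError if a setup.py is already present.
    generate_distutils_setup_py()
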
+:mod:`distutils2.metadata` --- Metadata handling +================================================================ + +.. module:: distutils2.metadata + +.. autoclass:: distutils2.metadata.DistributionMetadata + :members: :mod:`distutils2.util` --- Miscellaneous other utility functions ================================================================ diff --git a/docs/source/distutils/examples.rst b/docs/source/distutils/examples.rst --- a/docs/source/distutils/examples.rst +++ b/docs/source/distutils/examples.rst @@ -301,7 +301,7 @@ :class:`distutils2.dist.DistributionMetadata` class and its :func:`read_pkg_file` method:: - >>> from distutils2.dist import DistributionMetadata + >>> from distutils2.metadata import DistributionMetadata >>> metadata = DistributionMetadata() >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) >>> metadata.name diff --git a/docs/source/library/distutils2.metadata.rst b/docs/source/library/distutils2.metadata.rst --- a/docs/source/library/distutils2.metadata.rst +++ b/docs/source/library/distutils2.metadata.rst @@ -2,7 +2,9 @@ Metadata ======== -Distutils2 provides a :class:`DistributionMetadata` class that can read and +.. module:: distutils2.metadata + +Distutils2 provides a :class:`~distutils2.metadata.DistributionMetadata` class that can read and write metadata files. This class is compatible with all metadata versions: * 1.0: :PEP:`241` @@ -17,7 +19,7 @@ Reading metadata ================ -The :class:`DistributionMetadata` class can be instantiated with the path of +The :class:`~distutils2.metadata.DistributionMetadata` class can be instantiated with the path of the metadata file, and provides a dict-like interface to the values:: >>> from distutils2.metadata import DistributionMetadata @@ -33,7 +35,7 @@ The fields that supports environment markers can be automatically ignored if the object is instantiated using the ``platform_dependent`` option. -:class:`DistributionMetadata` will interpret in the case the markers and will +:class:`~distutils2.metadata.DistributionMetadata` will interpret in the case the markers and will automatically remove the fields that are not compliant with the running environment. Here's an example under Mac OS X. The win32 dependency we saw earlier is ignored:: diff --git a/docs/source/library/distutils2.tests.pypi_server.rst b/docs/source/library/distutils2.tests.pypi_server.rst --- a/docs/source/library/distutils2.tests.pypi_server.rst +++ b/docs/source/library/distutils2.tests.pypi_server.rst @@ -77,6 +77,7 @@ @use_pypi_server() def test_somthing(self, server): # your tests goes here + ... The decorator will instantiate the server for you, and run and stop it just before and after your method call. You also can pass the server initializer, @@ -85,4 +86,4 @@ class SampleTestCase(TestCase): @use_pypi_server("test_case_name") def test_something(self, server): - # something + ... diff --git a/docs/source/library/pkgutil.rst b/docs/source/library/pkgutil.rst --- a/docs/source/library/pkgutil.rst +++ b/docs/source/library/pkgutil.rst @@ -4,77 +4,204 @@ .. module:: pkgutil :synopsis: Utilities to support packages. -.. TODO Follow the reST conventions used in the stdlib +This module provides utilities to manipulate packages: support for the +Importer protocol defined in :PEP:`302` and implementation of the API +described in :PEP:`376` to work with the database of installed Python +distributions. 
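Since the examples above now import ``DistributionMetadata`` from ``distutils2.metadata``, a short sketch of the dict-like access it provides may help; the ``PKG-INFO`` path is illustrative and stands for any metadata file written to one of the supported metadata versions::

    # Sketch only: instantiate from the path of an existing metadata file.
    from distutils2.metadata import DistributionMetadata

    metadata = DistributionMetadata('PKG-INFO')
    # Fields are exposed through a dict-like interface.
    print metadata['name'], metadata['version']

    # check() reports missing required fields; reST validation of the
    # description only runs when explicitly requested (and needs docutils).
    missing, warnings = metadata.check(restructuredtext=True)
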
-This module provides functions to manipulate packages, as well as -the necessary functions to provide support for the "Importer Protocol" as -described in :PEP:`302` and for working with the database of installed Python -distributions which is specified in :PEP:`376`. In addition to the functions -required in :PEP:`376`, back support for older ``.egg`` and ``.egg-info`` -distributions is provided as well. These distributions are represented by the -class :class:`~distutils2._backport.pkgutil.EggInfoDistribution` and most -functions provide an extra argument ``use_egg_info`` which indicates if -they should consider these old styled distributions. This document details -first the functions and classes available and then presents several use cases. - +Import system utilities +----------------------- .. function:: extend_path(path, name) - Extend the search path for the modules which comprise a package. Intended use is - to place the following code in a package's :file:`__init__.py`:: + Extend the search path for the modules which comprise a package. Intended + use is to place the following code in a package's :file:`__init__.py`:: from pkgutil import extend_path __path__ = extend_path(__path__, __name__) - This will add to the package's ``__path__`` all subdirectories of directories on - ``sys.path`` named after the package. This is useful if one wants to distribute - different parts of a single logical package as multiple directories. + This will add to the package's ``__path__`` all subdirectories of directories + on :data:`sys.path` named after the package. This is useful if one wants to + distribute different parts of a single logical package as multiple + directories. - It also looks for :file:`\*.pkg` files beginning where ``*`` matches the *name* - argument. This feature is similar to :file:`\*.pth` files (see the :mod:`site` - module for more information), except that it doesn't special-case lines starting - with ``import``. A :file:`\*.pkg` file is trusted at face value: apart from - checking for duplicates, all entries found in a :file:`\*.pkg` file are added to - the path, regardless of whether they exist on the filesystem. (This is a - feature.) + It also looks for :file:`\*.pkg` files beginning where ``*`` matches the + *name* argument. This feature is similar to :file:`\*.pth` files (see the + :mod:`site` module for more information), except that it doesn't special-case + lines starting with ``import``. A :file:`\*.pkg` file is trusted at face + value: apart from checking for duplicates, all entries found in a + :file:`\*.pkg` file are added to the path, regardless of whether they exist + on the filesystem. (This is a feature.) If the input path is not a list (as is the case for frozen packages) it is returned unchanged. The input path is not modified; an extended copy is returned. Items are only appended to the copy at the end. - It is assumed that ``sys.path`` is a sequence. Items of ``sys.path`` that are - not strings referring to existing directories are ignored. Unicode items on - ``sys.path`` that cause errors when used as filenames may cause this function - to raise an exception (in line with :func:`os.path.isdir` behavior). + It is assumed that :data:`sys.path` is a sequence. Items of :data:`sys.path` + that are not strings referring to existing directories are ignored. Unicode + items on :data:`sys.path` that cause errors when used as filenames may cause + this function to raise an exception (in line with :func:`os.path.isdir` + behavior). + + +.. 
class:: ImpImporter(dirname=None) + + :pep:`302` Importer that wraps Python's "classic" import algorithm. + + If *dirname* is a string, a :pep:`302` importer is created that searches that + directory. If *dirname* is ``None``, a :pep:`302` importer is created that + searches the current :data:`sys.path`, plus any modules that are frozen or + built-in. + + Note that :class:`ImpImporter` does not currently support being used by + placement on :data:`sys.meta_path`. + + +.. class:: ImpLoader(fullname, file, filename, etc) + + :pep:`302` Loader that wraps Python's "classic" import algorithm. + + +.. function:: find_loader(fullname) + + Find a :pep:`302` "loader" object for *fullname*. + + If *fullname* contains dots, path must be the containing package's + ``__path__``. Returns ``None`` if the module cannot be found or imported. + This function uses :func:`iter_importers`, and is thus subject to the same + limitations regarding platform-specific special import locations such as the + Windows registry. + + +.. function:: get_importer(path_item) + + Retrieve a :pep:`302` importer for the given *path_item*. + + The returned importer is cached in :data:`sys.path_importer_cache` if it was + newly created by a path hook. + + If there is no importer, a wrapper around the basic import machinery is + returned. This wrapper is never inserted into the importer cache (None is + inserted instead). + + The cache (or part of it) can be cleared manually if a rescan of + :data:`sys.path_hooks` is necessary. + + +.. function:: get_loader(module_or_name) + + Get a :pep:`302` "loader" object for *module_or_name*. + + If the module or package is accessible via the normal import mechanism, a + wrapper around the relevant part of that machinery is returned. Returns + ``None`` if the module cannot be found or imported. If the named module is + not already imported, its containing package (if any) is imported, in order + to establish the package ``__path__``. + + This function uses :func:`iter_importers`, and is thus subject to the same + limitations regarding platform-specific special import locations such as the + Windows registry. + + +.. function:: iter_importers(fullname='') + + Yield :pep:`302` importers for the given module name. + + If fullname contains a '.', the importers will be for the package containing + fullname, otherwise they will be importers for :data:`sys.meta_path`, + :data:`sys.path`, and Python's "classic" import machinery, in that order. If + the named module is in a package, that package is imported as a side effect + of invoking this function. + + Non-:pep:`302` mechanisms (e.g. the Windows registry) used by the standard + import machinery to find files in alternative locations are partially + supported, but are searched *after* :data:`sys.path`. Normally, these + locations are searched *before* :data:`sys.path`, preventing :data:`sys.path` + entries from shadowing them. + + For this to cause a visible difference in behaviour, there must be a module + or package name that is accessible via both :data:`sys.path` and one of the + non-:pep:`302` file system mechanisms. In this case, the emulation will find + the former version, while the builtin import mechanism will find the latter. + + Items of the following types can be affected by this discrepancy: + ``imp.C_EXTENSION``, ``imp.PY_SOURCE``, ``imp.PY_COMPILED``, + ``imp.PKG_DIRECTORY``. + + +.. 
function:: iter_modules(path=None, prefix='') + + Yields ``(module_loader, name, ispkg)`` for all submodules on *path*, or, if + path is ``None``, all top-level modules on :data:`sys.path`. + + *path* should be either ``None`` or a list of paths to look for modules in. + + *prefix* is a string to output on the front of every module name on output. + + +.. function:: walk_packages(path=None, prefix='', onerror=None) + + Yields ``(module_loader, name, ispkg)`` for all modules recursively on + *path*, or, if path is ``None``, all accessible modules. + + *path* should be either ``None`` or a list of paths to look for modules in. + + *prefix* is a string to output on the front of every module name on output. + + Note that this function must import all *packages* (*not* all modules!) on + the given *path*, in order to access the ``__path__`` attribute to find + submodules. + + *onerror* is a function which gets called with one argument (the name of the + package which was being imported) if any exception occurs while trying to + import a package. If no *onerror* function is supplied, :exc:`ImportError`\s + are caught and ignored, while all other exceptions are propagated, + terminating the search. + + Examples:: + + # list all modules python can access + walk_packages() + + # list all submodules of ctypes + walk_packages(ctypes.__path__, ctypes.__name__ + '.') + .. function:: get_data(package, resource) Get a resource from a package. - This is a wrapper for the :pep:`302` loader :func:`get_data` API. The package - argument should be the name of a package, in standard module format - (foo.bar). The resource argument should be in the form of a relative - filename, using ``/`` as the path separator. The parent directory name + This is a wrapper for the :pep:`302` loader :func:`get_data` API. The + *package* argument should be the name of a package, in standard module format + (``foo.bar``). The *resource* argument should be in the form of a relative + filename, using ``/`` as the path separator. The parent directory name ``..`` is not allowed, and nor is a rooted name (starting with a ``/``). - The function returns a binary string that is the contents of the - specified resource. + The function returns a binary string that is the contents of the specified + resource. For packages located in the filesystem, which have already been imported, this is the rough equivalent of:: - d = os.path.dirname(sys.modules[package].__file__) - data = open(os.path.join(d, resource), 'rb').read() + d = os.path.dirname(sys.modules[package].__file__) + data = open(os.path.join(d, resource), 'rb').read() If the package cannot be located or loaded, or it uses a :pep:`302` loader - which does not support :func:`get_data`, then None is returned. + which does not support :func:`get_data`, then ``None`` is returned. -API Reference -============= +Installed distributions database +-------------------------------- -.. automodule:: distutils2._backport.pkgutil - :members: +Installed Python distributions are represented by instances of +:class:`~distutils2._backport.pkgutil.Distribution`, or its subclass +:class:`~distutils2._backport.pkgutil.EggInfoDistribution` for legacy ``.egg`` +and ``.egg-info`` formats). Most functions also provide an extra argument +``use_egg_info`` to take legacy distributions into account. + +.. TODO write docs here, don't rely on automodule + classes: Distribution and descendents + functions: provides, obsoletes, replaces, etc. 
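For the import-system utilities documented above, a compact sketch may be useful; the package and resource names are only examples, and ``get_data`` simply returns ``None`` when the loader does not support it::

    # Sketch only: enumerate submodules of a package and read a resource.
    import pkgutil
    import email

    # (module_loader, name, ispkg) for every submodule of the 'email' package.
    for loader, name, ispkg in pkgutil.iter_modules(email.__path__,
                                                    email.__name__ + '.'):
        print name, ispkg

    # Wrapper around the PEP 302 loader get_data() API; 'data/config.txt'
    # is a hypothetical resource shipped inside a hypothetical 'mypkg'.
    data = pkgutil.get_data('mypkg', 'data/config.txt')
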
Caching +++++++ @@ -86,11 +213,10 @@ :func:`~distutils2._backport.pkgutil.clear_cache`. +Examples +-------- -Example Usage -============= - -Print All Information About a Distribution +Print all information about a distribution ++++++++++++++++++++++++++++++++++++++++++ Given a path to a ``.dist-info`` distribution, we shall print out all @@ -182,7 +308,7 @@ ===== * It was installed as a dependency -Find Out Obsoleted Distributions +Find out obsoleted distributions ++++++++++++++++++++++++++++++++ Now, we take tackle a different problem, we are interested in finding out diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -128,6 +128,8 @@ This section describes the files included in the project. +- **packages_root**: the root directory containing all packages. If not provided + Distutils2 will use the current directory. *\*optional* - **packages**: a list of packages the project includes *\*optional* *\*multi* - **modules**: a list of packages the project includes *\*optional* *\*multi* - **scripts**: a list of scripts the project includes *\*optional* *\*multi* @@ -136,6 +138,7 @@ Example:: [files] + packages_root = src packages = pypi2rpm pypi2rpm.command -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Remove the call to generate_distutils_setup_py from utils. Message-ID: tarek.ziade pushed ddf305e7ab51 to distutils2: http://hg.python.org/distutils2/rev/ddf305e7ab51 changeset: 999:ddf305e7ab51 user: Alexis Metaireau date: Tue Feb 01 12:27:19 2011 +0000 summary: Remove the call to generate_distutils_setup_py from utils. files: distutils2/util.py diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -1120,5 +1120,3 @@ ) finally: handle.close() - -generate_distutils_setup_py() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Don't check twice the same thing (restructured text compatibility). Message-ID: tarek.ziade pushed d355f123ac79 to distutils2: http://hg.python.org/distutils2/rev/d355f123ac79 changeset: 1000:d355f123ac79 user: Alexis Metaireau date: Tue Feb 01 16:15:32 2011 +0000 summary: Don't check twice the same thing (restructured text compatibility). Add a note in DEVNOTES to be sure all tests are really runned (and not skipped cause of docutils) files: DEVNOTES.txt distutils2/command/check.py distutils2/metadata.py diff --git a/DEVNOTES.txt b/DEVNOTES.txt --- a/DEVNOTES.txt +++ b/DEVNOTES.txt @@ -6,4 +6,5 @@ one of these Python versions. - Always run tests.sh before you push a change. This implies - that you have all Python versions installed from 2.4 to 2.7. + that you have all Python versions installed from 2.4 to 2.7. Be sure to have + docutils installed on all python versions no avoid skipping tests as well. 
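The "print all information about a distribution" walk-through above reduces to a few calls against the installed-distributions database; a hedged sketch using the backport (the project name is one of the fake distributions used by the test suite and is purely illustrative)::

    # Sketch only: query the PEP 376 database for a single project.
    from distutils2._backport.pkgutil import get_distribution

    dist = get_distribution('choxie', use_egg_info=True)
    if dist is not None:
        print dist.name, dist.metadata['version']
        # Each entry is a (path, md5, size) triple.
        for path, checksum, size in dist.get_installed_files():
            print path, size
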
diff --git a/distutils2/command/check.py b/distutils2/command/check.py --- a/distutils2/command/check.py +++ b/distutils2/command/check.py @@ -64,7 +64,7 @@ def check_restructuredtext(self): """Checks if the long string fields are reST-compliant.""" - missing, warnings = self.distribution.metadata.check() + missing, warnings = self.distribution.metadata.check(restructuredtext=True) if self.distribution.metadata.docutils_support: for warning in warnings: line = warning[-1].get('line') diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -465,7 +465,7 @@ return None return value - def check(self, strict=False): + def check(self, strict=False, restructuredtext=False): """Check if the metadata is compliant. If strict is False then raise if no Name or Version are provided""" # XXX should check the versions (if the file was loaded) @@ -483,7 +483,7 @@ if attr not in self: missing.append(attr) - if _HAS_DOCUTILS: + if _HAS_DOCUTILS and restructuredtext: warnings.extend(self._check_rst_data(self['Description'])) # checking metadata 1.2 (XXX needs to check 1.1, 1.0) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: merge changes from central repo Message-ID: tarek.ziade pushed 1e1b39856550 to distutils2: http://hg.python.org/distutils2/rev/1e1b39856550 changeset: 998:1e1b39856550 parent: 997:a73c351afd57 parent: 992:a832bc472995 user: Godefroid Chapelle date: Tue Feb 01 10:31:10 2011 +0100 summary: merge changes from central repo files: diff --git a/distutils2/_backport/__init__.py b/distutils2/_backport/__init__.py --- a/distutils2/_backport/__init__.py +++ b/distutils2/_backport/__init__.py @@ -1,2 +1,8 @@ """Things that will land in the Python 3.3 std lib but which we must drag along -with us for now to support 2.x.""" + us for now to support 2.x.""" + +def any(seq): + for elem in seq: + if elem: + return True + return False diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -7,6 +7,13 @@ import warnings from csv import reader as csv_reader from types import ModuleType +from stat import ST_SIZE + +try: + from hashlib import md5 +except ImportError: + from md5 import md5 + from distutils2.errors import DistutilsError from distutils2.metadata import DistributionMetadata from distutils2.version import suggest_normalized_version, VersionPredicate @@ -922,10 +929,6 @@ for field in ('Obsoletes', 'Requires', 'Provides'): del self.metadata[field] - provides = "%s (%s)" % (self.metadata['name'], - self.metadata['version']) - self.metadata['Provides-Dist'] += (provides,) - reqs = [] if requires is not None: @@ -973,6 +976,33 @@ return '%s-%s at %s' % (self.name, self.metadata.version, self.path) def get_installed_files(self, local=False): + + def _md5(path): + f = open(path) + try: + content = f.read() + finally: + f.close() + return md5(content).hexdigest() + + def _size(path): + return os.stat(path)[ST_SIZE] + + path = self.path + if local: + path = path.replace('/', os.sep) + + # XXX What about scripts and data files ? 
+ if os.path.isfile(path): + return [(path, _md5(path), _size(path))] + else: + files = [] + for root, dir, files_ in os.walk(path): + for item in files_: + item = os.path.join(root, item) + files.append((item, _md5(item), _size(item))) + return files + return [] def uses(self, path): @@ -1029,7 +1059,7 @@ for dist in _yield_distributions(True, use_egg_info, paths): yield dist else: - _generate_cache(use_egg_info) + _generate_cache(use_egg_info, paths) for dist in _cache_path.itervalues(): yield dist @@ -1061,7 +1091,7 @@ if dist.name == name: return dist else: - _generate_cache(use_egg_info) + _generate_cache(use_egg_info, paths) if name in _cache_name: return _cache_name[name][0] diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -742,6 +742,8 @@ if extract_dir is None: extract_dir = os.getcwd() + func = None + if format is not None: try: format_info = _UNPACK_FORMATS[format] @@ -758,4 +760,9 @@ func = _UNPACK_FORMATS[format][1] kwargs = dict(_UNPACK_FORMATS[format][2]) - raise ValueError('Unknown archive format: %s' % filename) + func(filename, extract_dir, **kwargs) + + if func is None: + raise ValueError('Unknown archive format: %s' % filename) + + return extract_dir diff --git a/distutils2/_backport/tests/fake_dists/coconuts-aster-10.3.egg-info/PKG-INFO b/distutils2/_backport/tests/fake_dists/coconuts-aster-10.3.egg-info/PKG-INFO new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/coconuts-aster-10.3.egg-info/PKG-INFO @@ -0,0 +1,5 @@ +Metadata-Version: 1.2 +Name: coconuts-aster +Version: 10.3 +Provides-Dist: strawberry (0.6) +Provides-Dist: banana (0.4) diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -389,6 +389,7 @@ # Now, test if the egg-info distributions are found correctly as well fake_dists += [('bacon', '0.1'), ('cheese', '2.0.2'), + ('coconuts-aster', '10.3'), ('banana', '0.4'), ('strawberry', '0.6'), ('truffles', '5.0'), ('nut', 'funkyversion')] found_dists = [] @@ -494,18 +495,18 @@ l = [dist.name for dist in provides_distribution('truffles', '>1.5', use_egg_info=True)] - checkLists(l, ['bacon', 'truffles']) + checkLists(l, ['bacon']) l = [dist.name for dist in provides_distribution('truffles', '>=1.0')] checkLists(l, ['choxie', 'towel-stuff']) l = [dist.name for dist in provides_distribution('strawberry', '0.6', use_egg_info=True)] - checkLists(l, ['strawberry']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('strawberry', '>=0.5', use_egg_info=True)] - checkLists(l, ['strawberry']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('strawberry', '>0.6', use_egg_info=True)] @@ -513,11 +514,11 @@ l = [dist.name for dist in provides_distribution('banana', '0.4', use_egg_info=True)] - checkLists(l, ['banana']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('banana', '>=0.3', use_egg_info=True)] - checkLists(l, ['banana']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('banana', '!=0.4', use_egg_info=True)] @@ -557,7 +558,7 @@ eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), ('truffles', '5.0'), ('cheese', '2.0.2'), - ('nut', 'funkyversion')] + ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] dists = [('choxie', '2.0.0.9'), 
('grammar', '1.0a4'), ('towel-stuff', '0.1')] diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -3,14 +3,36 @@ Know how to read all config files Distutils2 uses. """ import os +import re import sys from ConfigParser import RawConfigParser +from shlex import split from distutils2 import logger from distutils2.errors import DistutilsOptionError +from distutils2.compiler.extension import Extension from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command +from distutils2.markers import interpret + + +def _pop_values(values_dct, key): + """Remove values from the dictionary and convert them as a list""" + vals_str = values_dct.pop(key, '') + if not vals_str: + return + fields = [] + for field in vals_str.split(os.linesep): + tmp_vals = field.split('--') + if (len(tmp_vals) == 2) and (not interpret(tmp_vals[1])): + continue + fields.append(tmp_vals[0]) + # Get bash options like `gcc -print-file-name=libgcc.a` + vals = split(' '.join(fields)) + if vals: + return vals + class Config(object): """Reads configuration files and work with the Distribution instance @@ -182,6 +204,34 @@ # manifest template self.dist.extra_files = files.get('extra_files', []) + ext_modules = self.dist.ext_modules + for section_key in content: + labels = section_key.split('=') + if (len(labels) == 2) and (labels[0] == 'extension'): + # labels[1] not used from now but should be implemented + # for extension build dependency + values_dct = content[section_key] + ext_modules.append(Extension( + values_dct.pop('name'), + _pop_values(values_dct, 'sources'), + _pop_values(values_dct, 'include_dirs'), + _pop_values(values_dct, 'define_macros'), + _pop_values(values_dct, 'undef_macros'), + _pop_values(values_dct, 'library_dirs'), + _pop_values(values_dct, 'libraries'), + _pop_values(values_dct, 'runtime_library_dirs'), + _pop_values(values_dct, 'extra_objects'), + _pop_values(values_dct, 'extra_compile_args'), + _pop_values(values_dct, 'extra_link_args'), + _pop_values(values_dct, 'export_symbols'), + _pop_values(values_dct, 'swig_opts'), + _pop_values(values_dct, 'depends'), + values_dct.pop('language', None), + values_dct.pop('optional', None), + **values_dct + )) + + def parse_config_files(self, filenames=None): if filenames is None: filenames = self.find_config_files() diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -149,6 +149,16 @@ dist = self.dists.values()[0] return dist + def unpack(self, path=None, prefer_source=True): + """Unpack the distribution to the given path. + + If not destination is given, creates a temporary location. + + Returns the location of the extracted files (root). + """ + return self.get_distribution(prefer_source=prefer_source)\ + .unpack(path=path) + def download(self, temp_path=None, prefer_source=True): """Download the distribution, using the requirements. 
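Given the ``[extension=...]`` handling added to ``distutils2/config.py`` above, a setup.cfg section along the following lines would be turned into an ``Extension`` object; the label after ``extension=``, the module name and the file names are purely illustrative, and a trailing ``-- marker`` drops that value on platforms where the environment marker is false::

    [extension=speedups]
    name = foo._speedups
    sources = src/_speedups.c
        src/_win_helpers.c -- sys.platform == 'win32'
    include_dirs = include
    libraries = m -- os.name == 'posix'
    language = c

This mirrors the multi-line value style already used for ``packages`` in the sample configuration earlier in this series.
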
@@ -312,7 +322,7 @@ if path is None: path = tempfile.mkdtemp() - filename = self.download() + filename = self.download(path) content_type = mimetypes.guess_type(filename)[0] self._unpacked_dir = unpack_archive(filename) @@ -332,8 +342,11 @@ % (hashval.hexdigest(), expected_hashval)) def __repr__(self): + if self.release is None: + return "" % self.dist_type + return "<%s %s %s>" % ( - self.release.name, self.release.version, self.dist_type or "") + self.release.name, self.release.version, self.dist_type or "") class ReleasesList(IndexReference): diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -1,17 +1,3 @@ -from tempfile import mkdtemp -import shutil -import os -import errno -import itertools - -from distutils2 import logger -from distutils2._backport.pkgutil import get_distributions -from distutils2._backport.sysconfig import get_config_var -from distutils2.depgraph import generate_graph -from distutils2.index import wrapper -from distutils2.index.errors import ProjectNotFound, ReleaseNotFound -from distutils2.version import get_version_predicate - """Provides installations scripts. The goal of this script is to install a release from the indexes (eg. @@ -20,6 +6,27 @@ It uses the work made in pkgutil and by the index crawlers to browse the installed distributions, and rely on the instalation commands to install. """ +import shutil +import os +import sys +import stat +import errno +import itertools +import tempfile + +from distutils2 import logger +from distutils2._backport.pkgutil import get_distributions +from distutils2._backport.pkgutil import get_distribution +from distutils2._backport.sysconfig import get_config_var +from distutils2.depgraph import generate_graph +from distutils2.index import wrapper +from distutils2.index.errors import ProjectNotFound, ReleaseNotFound +from distutils2.errors import DistutilsError +from distutils2.version import get_version_predicate + + +__all__ = ['install_dists', 'install_from_infos', 'get_infos', 'remove', + 'install'] class InstallationException(Exception): @@ -30,7 +37,7 @@ """Raised when a conflict is detected""" -def move_files(files, destination=None): +def _move_files(files, destination): """Move the list of files in the destination folder, keeping the same structure. @@ -38,13 +45,11 @@ :param files: a list of files to move. :param destination: the destination directory to put on the files. - if not defined, create a new one, using mkdtemp """ - if not destination: - destination = mkdtemp() - for old in files: - new = '%s%s' % (destination, old) + # not using os.path.join() because basename() might not be + # unique in destination + new = "%s%s" % (destination, old) # try to make the paths. try: @@ -55,7 +60,7 @@ else: raise e os.rename(old, new) - yield (old, new) + yield old, new def _run_d1_install(archive_dir, path): @@ -88,7 +93,7 @@ * copy the files in "path" * determine if the distribution is distutils2 or distutils1. """ - where = dist.unpack(archive) + where = dist.unpack(path) # get into the dir archive_dir = None @@ -114,7 +119,7 @@ os.chdir(old_dir) -def install_dists(dists, path=None): +def install_dists(dists, path, paths=sys.path): """Install all distributions provided in dists, with the given prefix. 
If an error occurs while installing one of the distributions, uninstall all @@ -124,27 +129,28 @@ :param dists: distributions to install :param path: base path to install distribution in + :param paths: list of paths (defaults to sys.path) to look for info """ - if not path: - path = mkdtemp() installed_dists, installed_files = [], [] - for d in dists: - logger.info('Installing %s %s' % (d.name, d.version)) + for dist in dists: + logger.info('Installing %s %s' % (dist.name, dist.version)) try: - installed_files.extend(_install_dist(d, path)) - installed_dists.append(d) - except Exception, e : + installed_files.extend(_install_dist(dist, path)) + installed_dists.append(dist) + except Exception, e: logger.info('Failed. %s' % str(e)) # reverting - for d in installed_dists: - uninstall(d) + for installed_dist in installed_dists: + _remove_dist(installed_dist, paths) raise e + return installed_files -def install_from_infos(install=[], remove=[], conflicts=[], install_path=None): +def install_from_infos(install_path=None, install=[], remove=[], conflicts=[], + paths=sys.path): """Install and remove the given distributions. The function signature is made to be compatible with the one of get_infos. @@ -163,35 +169,43 @@ 4. Else, move the distributions to the right locations, and remove for real the distributions thats need to be removed. - :param install: list of distributions that will be installed. + :param install_path: the installation path where we want to install the + distributions. + :param install: list of distributions that will be installed; install_path + must be provided if this list is not empty. :param remove: list of distributions that will be removed. :param conflicts: list of conflicting distributions, eg. that will be in conflict once the install and remove distribution will be processed. - :param install_path: the installation path where we want to install the - distributions. + :param paths: list of paths (defaults to sys.path) to look for info """ # first of all, if we have conflicts, stop here. if conflicts: raise InstallationConflict(conflicts) + if install and not install_path: + raise ValueError("Distributions are to be installed but `install_path`" + " is not provided.") + # before removing the files, we will start by moving them away # then, if any error occurs, we could replace them in the good place. temp_files = {} # contains lists of {dist: (old, new)} paths + temp_dir = None if remove: + temp_dir = tempfile.mkdtemp() for dist in remove: files = dist.get_installed_files() - temp_files[dist] = move_files(files) + temp_files[dist] = _move_files(files, temp_dir) try: if install: - installed_files = install_dists(install, install_path) # install to tmp first - + install_dists(install, install_path, paths) except: - # if an error occurs, put back the files in the good place. + # if an error occurs, put back the files in the right place. 
for files in temp_files.values(): for old, new in files: shutil.move(new, old) - + if temp_dir: + shutil.rmtree(temp_dir) # now re-raising raise @@ -199,6 +213,8 @@ for files in temp_files.values(): for old, new in files: os.remove(new) + if temp_dir: + shutil.rmtree(temp_dir) def _get_setuptools_deps(release): @@ -260,9 +276,9 @@ # Get all the releases that match the requirements try: releases = index.get_releases(requirements) - except (ReleaseNotFound, ProjectNotFound), e: + except (ReleaseNotFound, ProjectNotFound): raise InstallationException('Release not found: "%s"' % requirements) - + # Pick up a release, and try to get the dependency tree release = releases.get_last(requirements, prefer_final=prefer_final) @@ -279,6 +295,8 @@ else: deps = metadata['requires_dist'] + # XXX deps not used + distributions = itertools.chain(installed, [release]) depgraph = generate_graph(distributions) @@ -310,18 +328,71 @@ infos[key].extend(new_infos[key]) -def remove(project_name): +def _remove_dist(dist, paths=sys.path): + remove(dist.name, paths) + + +def remove(project_name, paths=sys.path): """Removes a single project from the installation""" - pass + dist = get_distribution(project_name, use_egg_info=True, paths=paths) + if dist is None: + raise DistutilsError('Distribution "%s" not found' % project_name) + files = dist.get_installed_files(local=True) + rmdirs = [] + rmfiles = [] + tmp = tempfile.mkdtemp(prefix=project_name + '-uninstall') + try: + for file_, md5, size in files: + if os.path.isfile(file_): + dirname, filename = os.path.split(file_) + tmpfile = os.path.join(tmp, filename) + try: + os.rename(file_, tmpfile) + finally: + if not os.path.isfile(file_): + os.rename(tmpfile, file_) + if file_ not in rmfiles: + rmfiles.append(file_) + if dirname not in rmdirs: + rmdirs.append(dirname) + finally: + shutil.rmtree(tmp) + logger.info('Removing %r...' % project_name) + file_count = 0 + for file_ in rmfiles: + os.remove(file_) + file_count +=1 + dir_count = 0 + for dirname in rmdirs: + if not os.path.exists(dirname): + # could + continue -def main(**attrs): - if 'script_args' not in attrs: - import sys - attrs['requirements'] = sys.argv[1] - get_infos(**attrs) + files_count = 0 + for root, dir, files in os.walk(dirname): + files_count += len(files) + + if files_count > 0: + # XXX Warning + continue + + # empty dirs with only empty dirs + if bool(os.stat(dirname).st_mode & stat.S_IWUSR): + # XXX Add a callable in shutil.rmtree to count + # the number of deleted elements + shutil.rmtree(dirname) + dir_count += 1 + + # removing the top path + # XXX count it ? + if os.path.exists(dist.path): + shutil.rmtree(dist.path) + + logger.info('Success ! Removed %d files and %d dirs' % \ + (file_count, dir_count)) def install(project): @@ -338,13 +409,20 @@ install_path = get_config_var('base') try: - install_from_infos(info['install'], info['remove'], info['conflict'], - install_path=install_path) + install_from_infos(install_path, + info['install'], info['remove'], info['conflict']) except InstallationConflict, e: projects = ['%s %s' % (p.name, p.version) for p in e.args[0]] logger.info('"%s" conflicts with "%s"' % (project, ','.join(projects))) +def _main(**attrs): + if 'script_args' not in attrs: + import sys + attrs['requirements'] = sys.argv[1] + get_infos(**attrs) + + if __name__ == '__main__': - main() + _main() diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -20,17 +20,27 @@ # Ask for the dependencies. 
# Ask for the Requires-Dist # Ask for the Provides-Dist +# Ask for a description # Detect scripts (not sure how. #! outside of package?) import os import sys import re import shutil +import glob +import re from ConfigParser import RawConfigParser from textwrap import dedent +if sys.version_info[:2] < (2, 6): + from sets import Set as set +try: + from hashlib import md5 +except ImportError: + from md5 import md5 # importing this with an underscore as it should be replaced by the # dict form or another structures for all purposes from distutils2._trove import all_classifiers as _CLASSIFIERS_LIST +from distutils2._backport import sysconfig _FILENAME = 'setup.cfg' @@ -82,6 +92,10 @@ Optionally, you can set other trove identifiers for things such as the human language, programming language, user interface, etc... ''', + 'setup.py found':''' +The setup.py script will be executed to retrieve the metadata. +A wizard will be run if you answer "n", +''' } # XXX everything needs docstrings and tests (both low-level tests of various @@ -158,16 +172,18 @@ LICENCES = _build_licences(_CLASSIFIERS_LIST) - class MainProgram(object): def __init__(self): self.configparser = None - self.classifiers = {} + self.classifiers = set([]) self.data = {} self.data['classifier'] = self.classifiers self.data['packages'] = [] self.data['modules'] = [] + self.data['platform'] = [] + self.data['resources'] = [] self.data['extra_files'] = [] + self.data['scripts'] = [] self.load_config_file() def lookup_option(self, key): @@ -178,6 +194,7 @@ def load_config_file(self): self.configparser = RawConfigParser() # TODO replace with section in distutils config file + #XXX freedesktop self.configparser.read(os.path.expanduser('~/.mkcfg')) self.data['author'] = self.lookup_option('author') self.data['author_email'] = self.lookup_option('author_email') @@ -194,6 +211,7 @@ if not valuesDifferent: return + #XXX freedesktop fp = open(os.path.expanduser('~/.mkcfgpy'), 'w') try: self.configparser.write(fp) @@ -201,19 +219,122 @@ fp.close() def load_existing_setup_script(self): - raise NotImplementedError - # Ideas: - # - define a mock module to assign to sys.modules['distutils'] before - # importing the setup script as a module (or executing it); it would - # provide setup (a function that just returns its args as a dict), - # Extension (ditto), find_packages (the real function) - # - we could even mock Distribution and commands to handle more setup - # scripts - # - we could use a sandbox (http://bugs.python.org/issue8680) - # - the cleanest way is to parse the file, not import it, but there is - # no way to do that across versions (the compiler package is - # deprecated or removed in recent Pythons, the ast module is not - # present before 2.6) + """ Generate a setup.cfg from an existing setup.py. + + It only exports the distutils metadata (setuptools specific metadata + is not actually supported). + """ + setuppath = 'setup.py' + if not os.path.exists(setuppath): + return + else: + ans = ask_yn(('A legacy setup.py has been found.\n' + 'Would you like to convert it to a setup.cfg ?'), + 'y', + _helptext['setup.py found']) + if ans != 'y': + return + + #_______mock setup start + data = self.data + def setup(**attrs): + """Mock the setup(**attrs) in order to retrive metadata.""" + # use the distutils v1 processings to correctly parse metadata. + #XXX we could also use the setuptools distibution ??? + from distutils.dist import Distribution + dist = Distribution(attrs) + dist.parse_config_files() + # 1. 
retrieves metadata that are quite similar PEP314<->PEP345 + labels = (('name',) * 2, + ('version',) * 2, + ('author',) * 2, + ('author_email',) * 2, + ('maintainer',) * 2, + ('maintainer_email',) * 2, + ('description', 'summary'), + ('long_description', 'description'), + ('url', 'home_page'), + ('platforms', 'platform')) + + if sys.version[:3] >= '2.5': + labels += (('provides', 'provides-dist'), + ('obsoletes', 'obsoletes-dist'), + ('requires', 'requires-dist'),) + get = lambda lab: getattr(dist.metadata, lab.replace('-', '_')) + data.update((new, get(old)) for (old, new) in labels if get(old)) + # 2. retrieves data that requires special processings. + data['classifier'].update(dist.get_classifiers() or []) + data['scripts'].extend(dist.scripts or []) + data['packages'].extend(dist.packages or []) + data['modules'].extend(dist.py_modules or []) + # 2.1 data_files -> resources. + if dist.data_files: + if len(dist.data_files) < 2 or \ + isinstance(dist.data_files[1], str): + dist.data_files = [('', dist.data_files)] + #add tokens in the destination paths + vars = {'distribution.name':data['name']} + path_tokens = sysconfig.get_paths(vars=vars).items() + #sort tokens to use the longest one first + path_tokens.sort(cmp=lambda x,y: cmp(len(y), len(x)), + key=lambda x: x[1]) + for dest, srcs in (dist.data_files or []): + dest = os.path.join(sys.prefix, dest) + for tok, path in path_tokens: + if dest.startswith(path): + dest = ('{%s}' % tok) + dest[len(path):] + files = [('/ '.join(src.rsplit('/', 1)), dest) + for src in srcs] + data['resources'].extend(files) + continue + # 2.2 package_data -> extra_files + package_dirs = dist.package_dir or {} + for package, extras in dist.package_data.iteritems() or []: + package_dir = package_dirs.get(package, package) + fils = [os.path.join(package_dir, fil) for fil in extras] + data['extra_files'].extend(fils) + + # Use README file if its content is the desciption + if "description" in data: + ref = md5(re.sub('\s', '', self.data['description']).lower()) + ref = ref.digest() + for readme in glob.glob('README*'): + fob = open(readme) + val = md5(re.sub('\s', '', fob.read()).lower()).digest() + fob.close() + if val == ref: + del data['description'] + data['description-file'] = readme + break + #_________ mock setup end + + # apply monkey patch to distutils (v1) and setuptools (if needed) + # (abord the feature if distutils v1 has been killed) + try: + import distutils.core as DC + getattr(DC, 'setup') # ensure distutils v1 + except ImportError, AttributeError: + return + saved_setups = [(DC, DC.setup)] + DC.setup = setup + try: + import setuptools + saved_setups.append((setuptools, setuptools.setup)) + setuptools.setup = setup + except ImportError, AttributeError: + pass + # get metadata by executing the setup.py with the patched setup(...) 
+ success = False # for python < 2.4 + try: + pyenv = globals().copy() + execfile(setuppath, pyenv) + success = True + finally: #revert monkey patches + for patched_module, original_setup in saved_setups: + patched_module.setup = original_setup + if not self.data: + raise ValueError('Unable to load metadata from setup.py') + return success def inspect_file(self, path): fp = open(path, 'r') @@ -222,9 +343,11 @@ m = re.match(r'^#!.*python((?P\d)(\.\d+)?)?$', line) if m: if m.group('major') == '3': - self.classifiers['Programming Language :: Python :: 3'] = 1 + self.classifiers.add( + 'Programming Language :: Python :: 3') else: - self.classifiers['Programming Language :: Python :: 2'] = 1 + self.classifiers.add( + 'Programming Language :: Python :: 2') finally: fp.close() @@ -370,7 +493,7 @@ for key in sorted(trove): if len(trove[key]) == 0: if ask_yn('Add "%s"' % desc[4:] + ' :: ' + key, 'n') == 'y': - classifiers[desc[4:] + ' :: ' + key] = 1 + classifiers.add(desc[4:] + ' :: ' + key) continue if ask_yn('Do you want to set items under\n "%s" (%d sub-items)' @@ -421,7 +544,7 @@ print ("ERROR: Invalid selection, type a number from the list " "above.") - classifiers[_CLASSIFIERS_LIST[index]] = 1 + classifiers.add(_CLASSIFIERS_LIST[index]) return def set_devel_status(self, classifiers): @@ -448,7 +571,7 @@ 'Development Status :: 5 - Production/Stable', 'Development Status :: 6 - Mature', 'Development Status :: 7 - Inactive'][choice] - classifiers[key] = 1 + classifiers.add(key) return except (IndexError, ValueError): print ("ERROR: Invalid selection, type a single digit " @@ -475,28 +598,39 @@ fp = open(_FILENAME, 'w') try: fp.write('[metadata]\n') - fp.write('name = %s\n' % self.data['name']) - fp.write('version = %s\n' % self.data['version']) - fp.write('author = %s\n' % self.data['author']) - fp.write('author_email = %s\n' % self.data['author_email']) - fp.write('summary = %s\n' % self.data['summary']) - fp.write('home_page = %s\n' % self.data['home_page']) - fp.write('\n') - if len(self.data['classifier']) > 0: - classifiers = '\n'.join([' %s' % clas for clas in - self.data['classifier']]) - fp.write('classifier = %s\n' % classifiers.strip()) - fp.write('\n') - - fp.write('[files]\n') - for element in ('packages', 'modules', 'extra_files'): - if len(self.data[element]) == 0: + # simple string entries + for name in ('name', 'version', 'summary', 'download_url'): + fp.write('%s = %s\n' % (name, self.data.get(name, 'UNKNOWN'))) + # optional string entries + if 'keywords' in self.data and self.data['keywords']: + fp.write('keywords = %s\n' % ' '.join(self.data['keywords'])) + for name in ('home_page', 'author', 'author_email', + 'maintainer', 'maintainer_email', 'description-file'): + if name in self.data and self.data[name]: + fp.write('%s = %s\n' % (name, self.data[name])) + if 'description' in self.data: + fp.write( + 'description = %s\n' + % '\n |'.join(self.data['description'].split('\n'))) + # multiple use string entries + for name in ('platform', 'supported-platform', 'classifier', + 'requires-dist', 'provides-dist', 'obsoletes-dist', + 'requires-external'): + if not(name in self.data and self.data[name]): continue - items = '\n'.join([' %s' % item for item in - self.data[element]]) - fp.write('%s = %s\n' % (element, items.strip())) - - fp.write('\n') + fp.write('%s = ' % name) + fp.write(''.join(' %s\n' % val + for val in self.data[name]).lstrip()) + fp.write('\n[files]\n') + for name in ('packages', 'modules', 'scripts', + 'package_data', 'extra_files'): + if not(name in self.data and 
self.data[name]): + continue + fp.write('%s = %s\n' + % (name, '\n '.join(self.data[name]).strip())) + fp.write('\n[resources]\n') + for src, dest in self.data['resources']: + fp.write('%s = %s\n' % (src, dest)) finally: fp.close() @@ -508,11 +642,12 @@ """Main entry point.""" program = MainProgram() # uncomment when implemented - #program.load_existing_setup_script() - program.inspect_directory() - program.query_user() - program.update_config_file() + if not program.load_existing_setup_script(): + program.inspect_directory() + program.query_user() + program.update_config_file() program.write_setup_script() + # istutils2.util.generate_distutils_setup_py() if __name__ == '__main__': diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -11,7 +11,7 @@ from distutils2 import __version__ from distutils2._backport.pkgutil import get_distributions, get_distribution from distutils2.depgraph import generate_graph -from distutils2.install import install +from distutils2.install import install, remove # This is a barebones help message generated displayed when the user # runs the setup script with no arguments at all. More useful help @@ -83,10 +83,10 @@ dist = distclass(attrs) except DistutilsSetupError, msg: if 'name' in attrs: - raise SystemExit, "error in %s setup command: %s" % \ - (attrs['name'], msg) + raise SystemExit("error in %s setup command: %s" % \ + (attrs['name'], msg)) else: - raise SystemExit, "error in setup command: %s" % msg + raise SystemExit("error in setup command: %s" % msg) # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. @@ -98,22 +98,21 @@ try: res = dist.parse_command_line() except DistutilsArgError, msg: - raise SystemExit, gen_usage(dist.script_name) + "\nerror: %s" % msg + raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) # And finally, run all the commands found on the command line. if res: try: dist.run_commands() except KeyboardInterrupt: - raise SystemExit, "interrupted" + raise SystemExit("interrupted") except (IOError, os.error), exc: error = grok_environment_error(exc) - raise SystemExit, error + raise SystemExit(error) except (DistutilsError, CCompilerError), msg: - raise - raise SystemExit, "error: " + str(msg) + raise SystemExit("error: " + str(msg)) return dist @@ -127,7 +126,10 @@ def main(): - """Main entry point for Distutils2""" + """Main entry point for Distutils2 + + Execute an action or delegate to the commands system. 
+ """ _set_logger() parser = OptionParser() parser.disable_interspersed_args() @@ -164,7 +166,7 @@ options, args = parser.parse_args() if options.version: print('Distutils2 %s' % __version__) -# sys.exit(0) + return 0 if len(options.metadata): from distutils2.dist import Distribution @@ -178,18 +180,18 @@ keys = options.metadata if len(keys) == 1: print metadata[keys[0]] - sys.exit(0) + return for key in keys: if key in metadata: - print(metadata._convert_name(key)+':') + print(metadata._convert_name(key) + ':') value = metadata[key] if isinstance(value, list): for v in value: - print(' '+v) + print(' ' + v) else: - print(' '+value.replace('\n', '\n ')) - sys.exit(0) + print(' ' + value.replace('\n', '\n ')) + return 0 if options.search is not None: search = options.search.lower() @@ -199,7 +201,7 @@ print('%s %s at %s' % (dist.name, dist.metadata['version'], dist.path)) - sys.exit(0) + return 0 if options.graph is not None: name = options.graph @@ -211,25 +213,29 @@ graph = generate_graph(dists) print(graph.repr_node(dist)) - sys.exit(0) + return 0 if options.fgraph: dists = get_distributions(use_egg_info=True) graph = generate_graph(dists) print(graph) - sys.exit(0) + return 0 if options.install is not None: install(options.install) - sys.exit(0) + return 0 + + if options.remove is not None: + remove(options.remove) + return 0 if len(args) == 0: parser.print_help() - sys.exit(0) + return 0 - return commands_main() -# sys.exit(0) + commands_main() + return 0 if __name__ == '__main__': - main() + sys.exit(main()) diff --git a/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz b/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz index 0000000000000000000000000000000000000000..333961eb18a6e7db80fefd41c339ab218d5180c4 GIT binary patch literal 110 zc$|~(=3uy!>FUeC{PvtR-ysJc)&sVu?9yZ7`(A1Di)P(6s!I71JWZ;--fWND`LA)=lAmk-7Jbj=XMlnFEsQ#U Kd|Vkc7#IK&xGYxy diff --git a/distutils2/tests/support.py b/distutils2/tests/support.py --- a/distutils2/tests/support.py +++ b/distutils2/tests/support.py @@ -159,6 +159,21 @@ dist = Distribution(attrs=kw) return project_dir, dist + def assertIsFile(self, *args): + path = os.path.join(*args) + dirname = os.path.dirname(path) + file = os.path.basename(path) + if os.path.isdir(dirname): + files = os.listdir(dirname) + msg = "%s not found in %s: %s" % (file, dirname, files) + assert os.path.isfile(path), msg + else: + raise AssertionError( + '%s not found. 
%s does not exist' % (file, dirname)) + + def assertIsNotFile(self, *args): + path = os.path.join(*args) + assert not os.path.isfile(path), "%s exist" % path class EnvironGuard(object): """TestCase-compatible mixin to save and restore the environment.""" diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -93,6 +93,34 @@ sub_commands = foo """ +# Can not be merged with SETUP_CFG else install_dist +# command will fail when trying to compile C sources +EXT_SETUP_CFG = """ +[files] +packages = one + two + +[extension=speed_coconuts] +name = one.speed_coconuts +sources = c_src/speed_coconuts.c +extra_link_args = "`gcc -print-file-name=libgcc.a`" -shared +define_macros = HAVE_CAIRO HAVE_GTK2 +libraries = gecodeint gecodekernel -- sys.platform != 'win32' + GecodeInt GecodeKernel -- sys.platform == 'win32' + +[extension=fast_taunt] +name = three.fast_taunt +sources = cxx_src/utils_taunt.cxx + cxx_src/python_module.cxx +include_dirs = /usr/include/gecode + /usr/include/blitz +extra_compile_args = -fPIC -O2 + -DGECODE_VERSION=$(./gecode_version) -- sys.platform != 'win32' + /DGECODE_VERSION='win32' -- sys.platform == 'win32' +language = cxx + +""" + class DCompiler(object): name = 'd' @@ -134,7 +162,14 @@ super(ConfigTestCase, self).setUp() self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) + sys.stdout = sys.stderr = StringIO() + self.addCleanup(os.chdir, os.getcwd()) + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.tempdir = tempdir + + self.addCleanup(setattr, sys, 'argv', sys.argv) def write_setup(self, kwargs=None): opts = {'description-file': 'README', 'extra-files':''} @@ -156,8 +191,6 @@ return dist def test_config(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup() self.write_file('README', 'yeah') @@ -232,9 +265,6 @@ def test_multiple_description_file(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_setup({'description-file': 'README CHANGES'}) self.write_file('README', 'yeah') self.write_file('CHANGES', 'changelog2') @@ -242,9 +272,6 @@ self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) def test_multiline_description_file(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_setup({'description-file': 'README\n CHANGES'}) self.write_file('README', 'yeah') self.write_file('CHANGES', 'changelog') @@ -252,9 +279,37 @@ self.assertEqual(dist.metadata['description'], 'yeah\nchangelog') self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + def test_parse_extensions_in_config(self): + self.write_file('setup.cfg', EXT_SETUP_CFG) + dist = self.run_setup('--version') + + ext_modules = dict((mod.name, mod) for mod in dist.ext_modules) + self.assertEqual(len(ext_modules), 2) + ext = ext_modules.get('one.speed_coconuts') + self.assertEqual(ext.sources, ['c_src/speed_coconuts.c']) + self.assertEqual(ext.define_macros, ['HAVE_CAIRO', 'HAVE_GTK2']) + libs = ['gecodeint', 'gecodekernel'] + if sys.platform == 'win32': + libs = ['GecodeInt', 'GecodeKernel'] + self.assertEqual(ext.libraries, libs) + self.assertEqual(ext.extra_link_args, + ['`gcc -print-file-name=libgcc.a`', '-shared']) + + ext = ext_modules.get('three.fast_taunt') + self.assertEqual(ext.sources, + ['cxx_src/utils_taunt.cxx', 'cxx_src/python_module.cxx']) + self.assertEqual(ext.include_dirs, + ['/usr/include/gecode', '/usr/include/blitz']) + cargs = ['-fPIC', '-O2'] + if sys.platform == 
'win32': + cargs.append("/DGECODE_VERSION='win32'") + else: + cargs.append('-DGECODE_VERSION=$(./gecode_version)') + self.assertEqual(ext.extra_compile_args, cargs) + self.assertEqual(ext.language, 'cxx') + + def test_metadata_requires_description_files_missing(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup({'description-file': 'README\n README2'}) self.write_file('README', 'yeah') self.write_file('README2', 'yeah') @@ -278,8 +333,6 @@ self.assertRaises(DistutilsFileError, cmd.make_distribution) def test_metadata_requires_description_files(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup({'description-file': 'README\n README2', 'extra-files':'\n README2'}) self.write_file('README', 'yeah') @@ -315,8 +368,6 @@ self.assertIn('README\nREADME2\n', open('MANIFEST').read()) def test_sub_commands(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup() self.write_file('README', 'yeah') self.write_file('haven.py', '#') diff --git a/distutils2/tests/test_index_dist.py b/distutils2/tests/test_index_dist.py --- a/distutils2/tests/test_index_dist.py +++ b/distutils2/tests/test_index_dist.py @@ -127,7 +127,7 @@ url = "%s/simple/foobar/foobar-0.1.tar.gz" % server.full_address # check md5 if given dist = Dist(url=url, hashname="md5", - hashval="d41d8cd98f00b204e9800998ecf8427e") + hashval="fe18804c5b722ff024cabdf514924fc4") dist.download(self.mkdtemp()) # a wrong md5 fails @@ -157,6 +157,25 @@ hashname="invalid_hashname", hashval="value") + @use_pypi_server('downloads_with_md5') + def test_unpack(self, server): + url = "%s/simple/foobar/foobar-0.1.tar.gz" % server.full_address + dist = Dist(url=url) + # doing an unpack + here = self.mkdtemp() + there = dist.unpack(here) + result = os.listdir(there) + self.assertIn('paf', result) + os.remove('paf') + + def test_hashname(self): + # Invalid hashnames raises an exception on assignation + Dist(hashname="md5", hashval="value") + + self.assertRaises(UnsupportedHashName, Dist, + hashname="invalid_hashname", + hashval="value") + class TestReleasesList(unittest.TestCase): diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -39,6 +39,11 @@ for f in range(0,3): self._real_files.append(mkstemp()) + def _unlink_installed_files(self): + if self._files: + for f in self._real_files: + os.unlink(f[1]) + def get_installed_files(self, **args): if self._files: return [f[1] for f in self._real_files] @@ -54,14 +59,14 @@ self._called_with = [] self._return_value = return_value self._raise = raise_exception - + def __call__(self, *args, **kwargs): self.called = True self._times_called = self._times_called + 1 self._called_with.append((args, kwargs)) iterable = hasattr(self._raise, '__iter__') if self._raise: - if ((not iterable and self._raise) + if ((not iterable and self._raise) or self._raise[self._times_called - 1]): raise Exception return self._return_value @@ -70,25 +75,8 @@ return (args, kwargs) in self._called_with -def patch(parent, to_patch): - """monkey match a module""" - def wrapper(func): - print func - print dir(func) - old_func = getattr(parent, to_patch) - def wrapped(*args, **kwargs): - parent.__dict__[to_patch] = MagicMock() - try: - out = func(*args, **kwargs) - finally: - setattr(parent, to_patch, old_func) - return out - return wrapped - return wrapper - - def get_installed_dists(dists): - """Return a list of fake installed dists. + """Return a list of fake installed dists. 
The list is name, version, deps""" objects = [] for (name, version, deps) in dists: @@ -100,12 +88,6 @@ def _get_client(self, server, *args, **kwargs): return Client(server.full_address, *args, **kwargs) - def _patch_run_install(self): - """Patch run install""" - - def _unpatch_run_install(self): - """Unpatch run install for d2 and d1""" - def _get_results(self, output): """return a list of results""" installed = [(o.name, '%s' % o.version) for o in output['install']] @@ -205,7 +187,7 @@ ]) # name, version, deps. - already_installed = [('bacon', '0.1', []), + already_installed = [('bacon', '0.1', []), ('chicken', '1.1', ['bacon (0.1)'])] output = install.get_infos("choxie", index=client, installed= get_installed_dists(already_installed)) @@ -236,7 +218,7 @@ files = [os.path.join(path, '%s' % x) for x in range(1, 20)] for f in files: file(f, 'a+') - output = [o for o in install.move_files(files, newpath)] + output = [o for o in install._move_files(files, newpath)] # check that output return the list of old/new places for f in files: @@ -265,19 +247,19 @@ old_install_dist = install._install_dist old_uninstall = getattr(install, 'uninstall', None) - install._install_dist = MagicMock(return_value=[], + install._install_dist = MagicMock(return_value=[], raise_exception=(False, True)) - install.uninstall = MagicMock() + install.remove = MagicMock() try: d1 = ToInstallDist() d2 = ToInstallDist() path = self.mkdtemp() self.assertRaises(Exception, install.install_dists, [d1, d2], path) self.assertTrue(install._install_dist.called_with(d1, path)) - self.assertTrue(install.uninstall.called) + self.assertTrue(install.remove.called) finally: install._install_dist = old_install_dist - install.uninstall = old_uninstall + install.remove = old_uninstall def test_install_dists_success(self): @@ -322,7 +304,7 @@ old_install_dist = install._install_dist old_uninstall = getattr(install, 'uninstall', None) - install._install_dist = MagicMock(return_value=[], + install._install_dist = MagicMock(return_value=[], raise_exception=(False, True)) install.uninstall = MagicMock() try: @@ -331,14 +313,17 @@ for i in range(0,2): remove.append(ToInstallDist(files=True)) to_install = [ToInstallDist(), ToInstallDist()] + temp_dir = self.mkdtemp() - self.assertRaises(Exception, install.install_from_infos, - remove=remove, install=to_install) + self.assertRaises(Exception, install.install_from_infos, + install_path=temp_dir, install=to_install, + remove=remove) # assert that the files are in the same place # assert that the files have been removed for dist in remove: for f in dist.get_installed_files(): self.assertTrue(os.path.exists(f)) + dist._unlink_installed_files() finally: install.install_dist = old_install_dist install.uninstall = old_uninstall @@ -352,8 +337,7 @@ install_path = "my_install_path" to_install = [ToInstallDist(), ToInstallDist()] - install.install_from_infos(install=to_install, - install_path=install_path) + install.install_from_infos(install_path, install=to_install) for dist in to_install: install._install_dist.called_with(install_path) finally: diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -1,10 +1,17 @@ +# -*- coding: utf-8 -*- """Tests for distutils.mkcfg.""" import os +import os.path as osp import sys import StringIO +if sys.version_info[:2] < (2, 6): + from sets import Set as set +from textwrap import dedent + from distutils2.tests import run_unittest, support, unittest from distutils2.mkcfg 
import MainProgram -from distutils2.mkcfg import ask_yn, ask +from distutils2.mkcfg import ask_yn, ask, main + class MkcfgTestCase(support.TempdirManager, unittest.TestCase): @@ -12,16 +19,20 @@ def setUp(self): super(MkcfgTestCase, self).setUp() self._stdin = sys.stdin - self._stdout = sys.stdout + self._stdout = sys.stdout sys.stdin = StringIO.StringIO() sys.stdout = StringIO.StringIO() - + self._cwd = os.getcwd() + self.wdir = self.mkdtemp() + os.chdir(self.wdir) + def tearDown(self): super(MkcfgTestCase, self).tearDown() sys.stdin = self._stdin sys.stdout = self._stdout - - def test_ask_yn(self): + os.chdir(self._cwd) + + def test_ask_yn(self): sys.stdin.write('y\n') sys.stdin.seek(0) self.assertEqual('y', ask_yn('is this a test')) @@ -40,13 +51,13 @@ main.data['author'] = [] main._set_multi('_set_multi test', 'author') self.assertEqual(['aaaaa'], main.data['author']) - + def test_find_files(self): # making sure we scan a project dir correctly main = MainProgram() # building the structure - tempdir = self.mkdtemp() + tempdir = self.wdir dirs = ['pkg1', 'data', 'pkg2', 'pkg2/sub'] files = ['README', 'setup.cfg', 'foo.py', 'pkg1/__init__.py', 'pkg1/bar.py', @@ -60,12 +71,7 @@ path = os.path.join(tempdir, file_) self.write_file(path, 'xxx') - old_dir = os.getcwd() - os.chdir(tempdir) - try: - main._find_files() - finally: - os.chdir(old_dir) + main._find_files() # do we have what we want ? self.assertEqual(main.data['packages'], ['pkg1', 'pkg2', 'pkg2.sub']) @@ -73,6 +79,131 @@ self.assertEqual(set(main.data['extra_files']), set(['setup.cfg', 'README', 'data/data1'])) + def test_convert_setup_py_to_cfg(self): + self.write_file((self.wdir, 'setup.py'), + dedent(""" + # -*- coding: utf-8 -*- + from distutils.core import setup + lg_dsc = '''My super Death-scription + barbar is now on the public domain, + ho, baby !''' + setup(name='pyxfoil', + version='0.2', + description='Python bindings for the Xfoil engine', + long_description = lg_dsc, + maintainer='Andr?? Espaze', + maintainer_email='andre.espaze at logilab.fr', + url='http://www.python-science.org/project/pyxfoil', + license='GPLv2', + packages=['pyxfoil', 'babar', 'me'], + data_files=[('share/doc/pyxfoil', ['README.rst']), + ('share/man', ['pyxfoil.1']), + ], + py_modules = ['my_lib', 'mymodule'], + package_dir = {'babar' : '', + 'me' : 'Martinique/Lamentin', + }, + package_data = {'babar': ['Pom', 'Flora', 'Alexander'], + 'me': ['dady', 'mumy', 'sys', 'bro'], + '': ['setup.py', 'README'], + 'pyxfoil' : ['fengine.so'], + }, + scripts = ['my_script', 'bin/run'], + ) + """)) + sys.stdin.write('y\n') + sys.stdin.seek(0) + main() + fid = open(osp.join(self.wdir, 'setup.cfg')) + lines = set([line.rstrip() for line in fid]) + fid.close() + self.assertEqual(lines, set(['', + '[metadata]', + 'version = 0.2', + 'name = pyxfoil', + 'maintainer = Andr?? 
Espaze', + 'description = My super Death-scription', + ' |barbar is now on the public domain,', + ' |ho, baby !', + 'maintainer_email = andre.espaze at logilab.fr', + 'home_page = http://www.python-science.org/project/pyxfoil', + 'download_url = UNKNOWN', + 'summary = Python bindings for the Xfoil engine', + '[files]', + 'modules = my_lib', + ' mymodule', + 'packages = pyxfoil', + ' babar', + ' me', + 'extra_files = Martinique/Lamentin/dady', + ' Martinique/Lamentin/mumy', + ' Martinique/Lamentin/sys', + ' Martinique/Lamentin/bro', + ' Pom', + ' Flora', + ' Alexander', + ' setup.py', + ' README', + ' pyxfoil/fengine.so', + 'scripts = my_script', + ' bin/run', + '[resources]', + 'README.rst = {doc}', + 'pyxfoil.1 = {man}', + ])) + + def test_convert_setup_py_to_cfg_with_description_in_readme(self): + self.write_file((self.wdir, 'setup.py'), + dedent(""" + # -*- coding: utf-8 -*- + from distutils.core import setup + lg_dsc = open('README.txt').read() + setup(name='pyxfoil', + version='0.2', + description='Python bindings for the Xfoil engine', + long_description=lg_dsc, + maintainer='Andr?? Espaze', + maintainer_email='andre.espaze at logilab.fr', + url='http://www.python-science.org/project/pyxfoil', + license='GPLv2', + packages=['pyxfoil'], + package_data={'pyxfoil' : ['fengine.so']}, + data_files=[ + ('share/doc/pyxfoil', ['README.rst']), + ('share/man', ['pyxfoil.1']), + ], + ) + """)) + self.write_file((self.wdir, 'README.txt'), + dedent(''' +My super Death-scription +barbar is now on the public domain, +ho, baby ! + ''')) + sys.stdin.write('y\n') + sys.stdin.seek(0) + main() + fid = open(osp.join(self.wdir, 'setup.cfg')) + lines = set([line.strip() for line in fid]) + fid.close() + self.assertEqual(lines, set(['', + '[metadata]', + 'version = 0.2', + 'name = pyxfoil', + 'maintainer = Andr?? 
Espaze', + 'maintainer_email = andre.espaze at logilab.fr', + 'home_page = http://www.python-science.org/project/pyxfoil', + 'download_url = UNKNOWN', + 'summary = Python bindings for the Xfoil engine', + 'description-file = README.txt', + '[files]', + 'packages = pyxfoil', + 'extra_files = pyxfoil/fengine.so', + '[resources]', + 'README.rst = {doc}', + 'pyxfoil.1 = {man}', + ])) + def test_suite(): return unittest.makeSuite(MkcfgTestCase) diff --git a/distutils2/tests/test_uninstall.py b/distutils2/tests/test_uninstall.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_uninstall.py @@ -0,0 +1,93 @@ +"""Tests for the uninstall command.""" +import os +import sys +from StringIO import StringIO +from distutils2._backport.pkgutil import disable_cache, enable_cache +from distutils2.tests import unittest, support, run_unittest +from distutils2.errors import DistutilsError +from distutils2.install import remove + +SETUP_CFG = """ +[metadata] +name = %(name)s +version = %(version)s + +[files] +packages = + %(name)s + %(name)s.sub +""" + +class UninstallTestCase(support.TempdirManager, + support.LoggingCatcher, + unittest.TestCase): + + def setUp(self): + super(UninstallTestCase, self).setUp() + self.addCleanup(setattr, sys, 'stdout', sys.stdout) + self.addCleanup(setattr, sys, 'stderr', sys.stderr) + self.addCleanup(os.chdir, os.getcwd()) + self.addCleanup(enable_cache) + self.root_dir = self.mkdtemp() + disable_cache() + + def run_setup(self, *args): + # run setup with args + #sys.stdout = StringIO() + sys.argv[:] = [''] + list(args) + old_sys = sys.argv[:] + try: + from distutils2.run import commands_main + dist = commands_main() + finally: + sys.argv[:] = old_sys + return dist + + def get_path(self, dist, name): + from distutils2.command.install_dist import install_dist + cmd = install_dist(dist) + cmd.prefix = self.root_dir + cmd.finalize_options() + return getattr(cmd, 'install_'+name) + + def make_dist(self, pkg_name='foo', **kw): + dirname = self.mkdtemp() + kw['name'] = pkg_name + if 'version' not in kw: + kw['version'] = '0.1' + self.write_file((dirname, 'setup.cfg'), SETUP_CFG % kw) + os.mkdir(os.path.join(dirname, pkg_name)) + self.write_file((dirname, '__init__.py'), '#') + self.write_file((dirname, pkg_name+'_utils.py'), '#') + os.mkdir(os.path.join(dirname, pkg_name, 'sub')) + self.write_file((dirname, pkg_name, 'sub', '__init__.py'), '#') + self.write_file((dirname, pkg_name, 'sub', pkg_name+'_utils.py'), '#') + return dirname + + def install_dist(self, pkg_name='foo', dirname=None, **kw): + if not dirname: + dirname = self.make_dist(pkg_name, **kw) + os.chdir(dirname) + dist = self.run_setup('install_dist', '--prefix='+self.root_dir) + install_lib = self.get_path(dist, 'purelib') + return dist, install_lib + + def test_uninstall_unknow_distribution(self): + self.assertRaises(DistutilsError, remove, 'foo', paths=[self.root_dir]) + + def test_uninstall(self): + dist, install_lib = self.install_dist() + self.assertIsFile(install_lib, 'foo', 'sub', '__init__.py') + self.assertIsFile(install_lib, 'foo-0.1.dist-info', 'RECORD') + remove('foo', paths=[install_lib]) + self.assertIsNotFile(install_lib, 'foo', 'sub', '__init__.py') + self.assertIsNotFile(install_lib, 'foo-0.1.dist-info', 'RECORD') + + + + +def test_suite(): + return unittest.makeSuite(UninstallTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) diff --git a/distutils2/tests/test_util.py b/distutils2/tests/test_util.py --- a/distutils2/tests/test_util.py +++ b/distutils2/tests/test_util.py @@ 
-414,6 +414,10 @@ @unittest.skipUnless(os.name in ('nt', 'posix'), 'runs only under posix or nt') def test_spawn(self): + # Do not patch subprocess on unix because + # distutils2.util._spawn_posix uses it + if os.name in 'posix': + subprocess.Popen = self.old_popen tmpdir = self.mkdtemp() # creating something executable diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -12,6 +12,7 @@ import shutil import tarfile import zipfile +from subprocess import call as sub_call from copy import copy from fnmatch import fnmatchcase from ConfigParser import RawConfigParser @@ -800,65 +801,16 @@ "command '%s' failed with exit status %d" % (cmd[0], rc)) -def _spawn_posix(cmd, search_path=1, verbose=0, dry_run=0, env=None): - logger.info(' '.join(cmd)) +def _spawn_posix(cmd, search_path=1, verbose=1, dry_run=0, env=None): + cmd = ' '.join(cmd) + if verbose: + logger.info(cmd) if dry_run: return - - if env is None: - exec_fn = search_path and os.execvp or os.execv - else: - exec_fn = search_path and os.execvpe or os.execve - - pid = os.fork() - - if pid == 0: # in the child - try: - if env is None: - exec_fn(cmd[0], cmd) - else: - exec_fn(cmd[0], cmd, env) - except OSError, e: - sys.stderr.write("unable to execute %s: %s\n" % - (cmd[0], e.strerror)) - os._exit(1) - - sys.stderr.write("unable to execute %s for unknown reasons" % cmd[0]) - os._exit(1) - else: # in the parent - # Loop until the child either exits or is terminated by a signal - # (ie. keep waiting if it's merely stopped) - while 1: - try: - pid, status = os.waitpid(pid, 0) - except OSError, exc: - import errno - if exc.errno == errno.EINTR: - continue - raise DistutilsExecError( - "command '%s' failed: %s" % (cmd[0], exc[-1])) - if os.WIFSIGNALED(status): - raise DistutilsExecError( - "command '%s' terminated by signal %d" % \ - (cmd[0], os.WTERMSIG(status))) - - elif os.WIFEXITED(status): - exit_status = os.WEXITSTATUS(status) - if exit_status == 0: - return # hey, it succeeded! - else: - raise DistutilsExecError( - "command '%s' failed with exit status %d" % \ - (cmd[0], exit_status)) - - elif os.WIFSTOPPED(status): - continue - - else: - raise DistutilsExecError( - "unknown error executing '%s': termination status %d" % \ - (cmd[0], status)) - + exit_status = sub_call(cmd, shell=True, env=env) + if exit_status != 0: + msg = "command '%s' failed with exit status %d" + raise DistutilsExecError(msg % (cmd, exit_status)) def find_executable(executable, path=None): """Tries to find 'executable' in the directories listed in 'path'. 
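For context on the ``_spawn_posix`` rewrite above (this is not part of the patch): the fork/exec/waitpid loop is replaced by a single ``subprocess.call`` with ``shell=True``. A minimal standalone sketch of that behaviour, using a placeholder command::

    # Rough equivalent of the new code path: join the argument list,
    # hand it to the shell, and raise if the exit status is non-zero.
    from subprocess import call

    cmd = ['echo', 'hello']            # placeholder command for illustration
    cmdline = ' '.join(cmd)
    exit_status = call(cmdline, shell=True)
    if exit_status != 0:
        raise RuntimeError("command '%s' failed with exit status %d"
                           % (cmdline, exit_status))

One consequence worth noting: because the arguments are joined with spaces before being handed to the shell, any argument containing shell metacharacters has to be quoted by the caller.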
@@ -1128,7 +1080,6 @@ # There is no such option in the setup.cfg if arg == "long_description": filename = has_get_option(config, section, "description_file") - print "We have a filename", filename if filename: in_cfg_value = open(filename).read() else: @@ -1156,12 +1107,18 @@ raise DistutilsFileError("A pre existing setup.py file exists") handle = open("setup.py", "w") - handle.write("# Distutils script using distutils2 setup.cfg to call the\n") - handle.write("# distutils.core.setup() with the right args.\n\n\n") - handle.write("import os\n") - handle.write("from distutils.core import setup\n") - handle.write("from ConfigParser import RawConfigParser\n\n") - handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) - handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") - handle.write("setup(**kwargs)") - handle.close() + try: + handle.write( + "# Distutils script using distutils2 setup.cfg to call the\n" + "# distutils.core.setup() with the right args.\n\n" + "import os\n" + "from distutils.core import setup\n" + "from ConfigParser import RawConfigParser\n\n" + "" + getsource(generate_distutils_kwargs_from_setup_cfg) + "\n\n" + "kwargs = generate_distutils_kwargs_from_setup_cfg()\n" + "setup(**kwargs)\n" + ) + finally: + handle.close() + +generate_distutils_setup_py() diff --git a/docs/design/configfile.rst b/docs/design/configfile.rst new file mode 100644 --- /dev/null +++ b/docs/design/configfile.rst @@ -0,0 +1,132 @@ +.. _setup-config: + +************************************ +Writing the Setup Configuration File +************************************ + +Often, it's not possible to write down everything needed to build a distribution +*a priori*: you may need to get some information from the user, or from the +user's system, in order to proceed. As long as that information is fairly +simple---a list of directories to search for C header files or libraries, for +example---then providing a configuration file, :file:`setup.cfg`, for users to +edit is a cheap and easy way to solicit it. Configuration files also let you +provide default values for any command option, which the installer can then +override either on the command line or by editing the config file. + +The setup configuration file is a useful middle-ground between the setup script +---which, ideally, would be opaque to installers [#]_---and the command line to +the setup script, which is outside of your control and entirely up to the +installer. In fact, :file:`setup.cfg` (and any other Distutils configuration +files present on the target system) are processed after the contents of the +setup script, but before the command line. This has several useful +consequences: + +.. If you have more advanced needs, such as determining which extensions to + build based on what capabilities are present on the target system, then you + need the Distutils auto-configuration facility. This started to appear in + Distutils 0.9 but, as of this writing, isn't mature or stable enough yet + for real-world use. + +* installers can override some of what you put in :file:`setup.py` by editing + :file:`setup.cfg` + +* you can provide non-standard defaults for options that are not easily set in + :file:`setup.py` + +* installers can override anything in :file:`setup.cfg` using the command-line + options to :file:`setup.py` + +The basic syntax of the configuration file is simple:: + + [command] + option=value + ... + +where *command* is one of the Distutils commands (e.g. 
:command:`build_py`, +:command:`install`), and *option* is one of the options that command supports. +Any number of options can be supplied for each command, and any number of +command sections can be included in the file. Blank lines are ignored, as are +comments, which run from a ``'#'`` character until the end of the line. Long +option values can be split across multiple lines simply by indenting the +continuation lines. + +You can find out the list of options supported by a particular command with the +universal :option:`--help` option, e.g. :: + + > python setup.py --help build_ext + [...] + Options for 'build_ext' command: + --build-lib (-b) directory for compiled extension modules + --build-temp (-t) directory for temporary files (build by-products) + --inplace (-i) ignore build-lib and put compiled extensions into the + source directory alongside your pure Python modules + --include-dirs (-I) list of directories to search for header files + --define (-D) C preprocessor macros to define + --undef (-U) C preprocessor macros to undefine + --swig-opts list of SWIG command-line options + [...] + +.. XXX do we want to support ``setup.py --help metadata``? + +Note that an option spelled :option:`--foo-bar` on the command line is spelled +:option:`foo_bar` in configuration files. + +For example, say you want your extensions to be built "in-place"---that is, you +have an extension :mod:`pkg.ext`, and you want the compiled extension file +(:file:`ext.so` on Unix, say) to be put in the same source directory as your +pure Python modules :mod:`pkg.mod1` and :mod:`pkg.mod2`. You can always use the +:option:`--inplace` option on the command line to ensure this:: + + python setup.py build_ext --inplace + +But this requires that you always specify the :command:`build_ext` command +explicitly, and remember to provide :option:`--inplace`. An easier way is to +"set and forget" this option, by encoding it in :file:`setup.cfg`, the +configuration file for this distribution:: + + [build_ext] + inplace=1 + +This will affect all builds of this module distribution, whether or not you +explicitly specify :command:`build_ext`. If you include :file:`setup.cfg` in +your source distribution, it will also affect end-user builds---which is +probably a bad idea for this option, since always building extensions in-place +would break installation of the module distribution. In certain peculiar cases, +though, modules are built right in their installation directory, so this is +conceivably a useful ability. (Distributing extensions that expect to be built +in their installation directory is almost always a bad idea, though.) + +Another example: certain commands take a lot of options that don't change from +run to run; for example, :command:`bdist_rpm` needs to know everything required +to generate a "spec" file for creating an RPM distribution. Some of this +information comes from the setup script, and some is automatically generated by +the Distutils (such as the list of files installed). But some of it has to be +supplied as options to :command:`bdist_rpm`, which would be very tedious to do +on the command line for every run. Hence, here is a snippet from the Distutils' +own :file:`setup.cfg`:: + + [bdist_rpm] + release = 1 + packager = Greg Ward + doc_files = CHANGES.txt + README.txt + USAGE.txt + doc/ + examples/ + +Note that the :option:`doc_files` option is simply a whitespace-separated string +split across multiple lines for readability. + + +.. 
seealso:: :ref:`inst-config-syntax` in "Installing Python Modules" More information on the configuration files is available in the manual for system administrators. .. rubric:: Footnotes .. [#] This ideal probably won't be achieved until auto-configuration is fully supported by the Distutils. diff --git a/docs/source/contributing.rst b/docs/source/contributing.rst new file mode 100644 --- /dev/null +++ b/docs/source/contributing.rst @@ -0,0 +1,52 @@ +========================== +Contributing to Distutils2 +========================== + +---------------- +Reporting Issues +---------------- + +When using, testing, or developing distutils2, you may encounter issues. Please refer to the following sections to learn how to report them. + +Please keep in mind that this guide is intended to ease the triage and fixing processes by giving the developers as much information as possible. It should not be viewed as mandatory, only advised ;). + +Issues regarding distutils2 commands ==================================== + +- Go to http://bugs.python.org/ (you'll need a Python Bugs account), then "Issues" > "Create ticket". +- **Title**: write a short summary of the issue. + * You may prefix the issue title with [d2_component], where d2_component can be: installer, sdist, setup.cfg, ... This will ease the triage process. + +- **Components**: choose "Distutils2" +- **Version**: choose "3rd party" +- **Comment**: use the following template for versions, reproduction conditions: + * If some of the fields presented don't apply to the issue, feel free to pick only the ones you need. + +:: + + Operating System: + Version of Python: + Version of Distutils2: + + How to reproduce: + + What happens: + + What should happen: + +- Filling in the fields: + * **How to reproduce**: indicate a test case that reproduces the issue. + * **What happens**: describe the error, and paste tracebacks if you have any. + * **What should happen**: indicate what you think the result of the test case should be (the wanted behaviour). + * **Versions**: + - If you're using a release of distutils2, you may want to test the latest version of the project (the code under development). + - If the issue is present in the latest version, please indicate the tip commit of the version tested. + - Be careful to indicate the remote reference (12 characters, for instance c3cf81fc64db), not the local reference (rXXX). + +- If it is relevant, please attach any file that will help reproduce the issue, or logs that help understand the problem (setup.cfg, strace outputs, ...). + +Issues regarding PyPI display of the distutils2 projects ======================================================== + +- Please send a bug report to the catalog-sig at python.org mailing list. +- You can include your setup.cfg, and a link to your project page. diff --git a/docs/source/distutils/sourcedist.rst b/docs/source/distutils/sourcedist.rst --- a/docs/source/distutils/sourcedist.rst +++ b/docs/source/distutils/sourcedist.rst @@ -86,8 +86,7 @@ distributions, but in the future there will be a standard for testing Python module distributions) -* :file:`README.txt` (or :file:`README`), :file:`setup.py` (or whatever you - called your setup script), and :file:`setup.cfg` +* The configuration file :file:`setup.cfg` * all files that match the ``package_data`` metadata. See :ref:`distutils-installing-package-data`. @@ -95,6 +94,10 @@ * all files that match the ``data_files`` metadata. See :ref:`distutils-additional-files`. +..
Warning:: + In Distutils2, setup.py and README (or README.txt) files are no longer + included in source distributions by default + Sometimes this is enough, but usually you will want to specify additional files to distribute. The typical way to do this is to write a *manifest template*, called :file:`MANIFEST.in` by default. The manifest template is just a list of diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -7,24 +7,35 @@ Each section contains a description of its options. -- Options that are marked *\*multi* can have multiple values, one value - per line. +- Options that are marked *\*multi* can have multiple values, one value per + line. - Options that are marked *\*optional* can be omitted. -- Options that are marked *\*environ* can use environement markes, as described - in PEP 345. +- Options that are marked *\*environ* can use environment markers, as described + in :PEP:`345`. + The sections are: -- global -- metadata -- files -- command sections +global + Global options for Distutils2. + +metadata + The metadata section contains the metadata for the project as described in + :PEP:`345`. + +files + Declaration of package files included in the project. + +`command` sections + Redefinition of user options for Distutils2 commands. global ====== -Contains global options for Distutils2. This section is shared with Distutils1. +Contains global options for Distutils2. This section is shared with Distutils1 +(the legacy version distributed in the Python 2.x standard library). + - **commands**: Defined Distutils2 command. A command is defined by its fully qualified name. @@ -38,13 +49,13 @@ *\*optional* *\*multi* - **compilers**: Defined Distutils2 compiler. A compiler is defined by its fully - qualified name. + qualified name. Example:: [global] compiler = - package.compilers.CustomCCompiler + package.compiler.CustomCCompiler *\*optional* *\*multi* @@ -52,21 +63,29 @@ :file:`setup.cfg` file is read. The callable receives the configuration in form of a mapping and can make some changes to it. *\*optional* + Example:: + + [global] + setup_hook = + distutils2.tests.test_config.hook + metadata ======== The metadata section contains the metadata for the project as described in -PEP 345. +:PEP:`345`. +.. Note:: + Field names are case-insensitive. Fields: - **name**: Name of the project. -- **version**: Version of the project. Must comply with PEP 386. +- **version**: Version of the project. Must comply with :PEP:`386`. - **platform**: Platform specification describing an operating system supported by the distribution which is not listed in the "Operating System" Trove - classifiers. *\*multi* *\*optional* + classifiers (:PEP:`301`). *\*multi* *\*optional* - **supported-platform**: Binary distributions containing a PKG-INFO file will use the Supported-Platform field in their metadata to specify the OS and CPU for which the binary distribution was compiled. The semantics of @@ -113,14 +132,18 @@ name = pypi2rpm version = 0.1 author = Tarek Ziade - author_email = tarek at ziade.org + author-email = tarek at ziade.org summary = Script that transforms a sdist archive into a rpm archive description-file = README - home_page = http://bitbucket.org/tarek/pypi2rpm + home-page = http://bitbucket.org/tarek/pypi2rpm + project-url: RSS feed, https://bitbucket.org/tarek/pypi2rpm/rss classifier = Development Status :: 3 - Alpha License :: OSI Approved :: Mozilla Public License 1.1 (MPL 1.1) +..
Note:: + Some metadata fields seen in :PEP:`345` are automatically generated + (for instance Metadata-Version value). files @@ -148,17 +171,30 @@ extra_files = setup.py + README +.. Note:: + In Distutils2, setup.cfg will be implicitly included. -command sections -================ +.. Warning:: + In Distutils2, setup.py and README (or README.txt) files are not more + included in source distribution by default -Each command can have its options described in :file:`setup.cfg` +`command` sections +================== + +Each Distutils2 command can have its own user options defined in :file:`setup.cfg` Example:: [sdist] - manifest_makers = package.module.Maker + manifest-builders = package.module.Maker +To override the build class in order to generate Python3 code from your Python2 base:: + + [build_py] + use-2to3 = True + + -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Remove dead code Message-ID: tarek.ziade pushed 8877bdd6079d to distutils2: http://hg.python.org/distutils2/rev/8877bdd6079d changeset: 1002:8877bdd6079d parent: 958:c732ac2c3105 user: ?ric Araujo date: Sat Jan 29 22:23:12 2011 +0100 summary: Remove dead code files: distutils2/tests/test_command_test.py diff --git a/distutils2/tests/test_command_test.py b/distutils2/tests/test_command_test.py --- a/distutils2/tests/test_command_test.py +++ b/distutils2/tests/test_command_test.py @@ -17,10 +17,6 @@ from distutils2.dist import Distribution from distutils2._backport import pkgutil -try: - any -except NameError: - from distutils2._backport import any EXPECTED_OUTPUT_RE = r'''FAIL: test_blah \(myowntestmodule.SomeTest\) ---------------------------------------------------------------------- -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:55 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:55 +0100 Subject: [Python-checkins] distutils2: Merge branch 'master' of /Users/gotcha/co/distutils2/git Message-ID: tarek.ziade pushed e65761262dbf to distutils2: http://hg.python.org/distutils2/rev/e65761262dbf changeset: 996:e65761262dbf parent: 995:7bbd7533bc87 parent: 994:f314fe720c70 user: Godefroid Chapelle date: Sun Jan 30 12:30:49 2011 +0100 summary: Merge branch 'master' of /Users/gotcha/co/distutils2/git files: diff --git a/distutils2/_backport/__init__.py b/distutils2/_backport/__init__.py --- a/distutils2/_backport/__init__.py +++ b/distutils2/_backport/__init__.py @@ -1,8 +1,2 @@ """Things that will land in the Python 3.3 std lib but which we must drag along with us for now to support 2.x.""" - -def any(seq): - for elem in seq: - if elem: - return True - return False diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -1,24 +1,19 @@ """Utilities to support packages.""" -# NOTE: This module must remain compatible with Python 2.3, as it is shared -# by setuptools for distribution with Python 2.3 and up. 
- import os import sys import imp -import os.path +import re +import warnings from csv import reader as csv_reader from types import ModuleType from distutils2.errors import DistutilsError from distutils2.metadata import DistributionMetadata from distutils2.version import suggest_normalized_version, VersionPredicate -import zipimport try: import cStringIO as StringIO except ImportError: import StringIO -import re -import warnings __all__ = [ @@ -28,10 +23,14 @@ 'Distribution', 'EggInfoDistribution', 'distinfo_dirname', 'get_distributions', 'get_distribution', 'get_file_users', 'provides_distribution', 'obsoletes_distribution', - 'enable_cache', 'disable_cache', 'clear_cache' + 'enable_cache', 'disable_cache', 'clear_cache', ] +########################## +# PEP 302 Implementation # +########################## + def read_code(stream): # This helper is needed in order for the :pep:`302` emulation to # correctly handle compiled files @@ -41,7 +40,7 @@ if magic != imp.get_magic(): return None - stream.read(4) # Skip timestamp + stream.read(4) # Skip timestamp return marshal.load(stream) @@ -173,7 +172,6 @@ #@simplegeneric def iter_importer_modules(importer, prefix=''): - "" if not hasattr(importer, 'iter_modules'): return [] return importer.iter_modules(prefix) @@ -331,9 +329,9 @@ def get_filename(self, fullname=None): fullname = self._fix_name(fullname) mod_type = self.etc[2] - if self.etc[2] == imp.PKG_DIRECTORY: + if mod_type == imp.PKG_DIRECTORY: return self._get_delegate().get_filename() - elif self.etc[2] in (imp.PY_SOURCE, imp.PY_COMPILED, imp.C_EXTENSION): + elif mod_type in (imp.PY_SOURCE, imp.PY_COMPILED, imp.C_EXTENSION): return self.filename return None @@ -432,7 +430,8 @@ import mechanism will find the latter. Items of the following types can be affected by this discrepancy: - ``imp.C_EXTENSION, imp.PY_SOURCE, imp.PY_COMPILED, imp.PKG_DIRECTORY`` + :data:`imp.C_EXTENSION`, :data:`imp.PY_SOURCE`, :data:`imp.PY_COMPILED`, + :data:`imp.PKG_DIRECTORY` """ if fullname.startswith('.'): raise ImportError("Relative module names not supported") @@ -534,13 +533,13 @@ # frozen package. Return the path unchanged in that case. return path - pname = os.path.join(*name.split('.')) # Reconstitute as relative path + pname = os.path.join(*name.split('.')) # Reconstitute as relative path # Just in case os.extsep != '.' sname = os.extsep.join(name.split('.')) sname_pkg = sname + os.extsep + "pkg" init_py = "__init__" + os.extsep + "py" - path = path[:] # Start with a copy of the existing path + path = path[:] # Start with a copy of the existing path for dir in sys.path: if not isinstance(dir, basestring) or not os.path.isdir(dir): @@ -565,7 +564,7 @@ line = line.rstrip('\n') if not line or line.startswith('#'): continue - path.append(line) # Don't check for existence! + path.append(line) # Don't check for existence! 
f.close() return path @@ -609,6 +608,7 @@ resource_name = os.path.join(*parts) return loader.get_data(resource_name) + ########################## # PEP 376 Implementation # ########################## @@ -616,12 +616,12 @@ DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED',) # Cache -_cache_name = {} # maps names to Distribution instances -_cache_name_egg = {} # maps names to EggInfoDistribution instances -_cache_path = {} # maps paths to Distribution instances -_cache_path_egg = {} # maps paths to EggInfoDistribution instances -_cache_generated = False # indicates if .dist-info distributions are cached -_cache_generated_egg = False # indicates if .dist-info and .egg are cached +_cache_name = {} # maps names to Distribution instances +_cache_name_egg = {} # maps names to EggInfoDistribution instances +_cache_path = {} # maps paths to Distribution instances +_cache_path_egg = {} # maps paths to EggInfoDistribution instances +_cache_generated = False # indicates if .dist-info distributions are cached +_cache_generated_egg = False # indicates if .dist-info and .egg are cached _cache_enabled = True @@ -636,6 +636,7 @@ _cache_enabled = True + def disable_cache(): """ Disables the internal cache. @@ -647,9 +648,10 @@ _cache_enabled = False + def clear_cache(): """ Clears the internal cache. """ - global _cache_name, _cache_name_egg, cache_path, _cache_path_egg, \ + global _cache_name, _cache_name_egg, _cache_path, _cache_path_egg, \ _cache_generated, _cache_generated_egg _cache_name = {} @@ -660,14 +662,14 @@ _cache_generated_egg = False -def _yield_distributions(include_dist, include_egg): +def _yield_distributions(include_dist, include_egg, paths=sys.path): """ Yield .dist-info and .egg(-info) distributions, based on the arguments :parameter include_dist: yield .dist-info distributions :parameter include_egg: yield .egg(-info) distributions """ - for path in sys.path: + for path in paths: realpath = os.path.realpath(path) if not os.path.isdir(realpath): continue @@ -679,7 +681,7 @@ dir.endswith('.egg')): yield EggInfoDistribution(dist_path) -def _generate_cache(use_egg_info=False): +def _generate_cache(use_egg_info=False, paths=sys.path): global _cache_generated, _cache_generated_egg if _cache_generated_egg or (_cache_generated and not use_egg_info): @@ -688,7 +690,7 @@ gen_dist = not _cache_generated gen_egg = use_egg_info - for dist in _yield_distributions(gen_dist, gen_egg): + for dist in _yield_distributions(gen_dist, gen_egg, paths): if isinstance(dist, Distribution): _cache_path[dist.path] = dist if not dist.name in _cache_name: @@ -872,7 +874,8 @@ if isinstance(strs, basestring): for s in strs.splitlines(): s = s.strip() - if s and not s.startswith('#'): # skip blank lines/comments + # skip blank lines/comments + if s and not s.startswith('#'): yield s else: for ss in strs: @@ -890,6 +893,7 @@ except IOError: requires = None else: + # FIXME handle the case where zipfile is not available zipf = zipimport.zipimporter(path) fileobj = StringIO.StringIO(zipf.get_data('EGG-INFO/PKG-INFO')) self.metadata = DistributionMetadata(fileobj=fileobj) @@ -952,7 +956,7 @@ version = match.group('first') if match.group('rest'): version += match.group('rest') - version = version.replace(' ', '') # trim spaces + version = version.replace(' ', '') # trim spaces if version is None: reqs.append(name) else: @@ -982,12 +986,6 @@ __hash__ = object.__hash__ -def _normalize_dist_name(name): - """Returns a normalized name from the given *name*. 
- :rtype: string""" - return name.replace('-', '_') - - def distinfo_dirname(name, version): """ The *name* and *version* parameters are converted into their @@ -1007,7 +1005,7 @@ :returns: directory name :rtype: string""" file_extension = '.dist-info' - name = _normalize_dist_name(name) + name = name.replace('-', '_') normalized_version = suggest_normalized_version(version) # Because this is a lookup procedure, something will be returned even if # it is a version that cannot be normalized @@ -1017,7 +1015,7 @@ return '-'.join([name, normalized_version]) + file_extension -def get_distributions(use_egg_info=False): +def get_distributions(use_egg_info=False, paths=sys.path): """ Provides an iterator that looks for ``.dist-info`` directories in ``sys.path`` and returns :class:`Distribution` instances for each one of @@ -1028,7 +1026,7 @@ instances """ if not _cache_enabled: - for dist in _yield_distributions(True, use_egg_info): + for dist in _yield_distributions(True, use_egg_info, paths): yield dist else: _generate_cache(use_egg_info) @@ -1041,7 +1039,7 @@ yield dist -def get_distribution(name, use_egg_info=False): +def get_distribution(name, use_egg_info=False, paths=sys.path): """ Scans all elements in ``sys.path`` and looks for all directories ending with ``.dist-info``. Returns a :class:`Distribution` @@ -1059,7 +1057,7 @@ :rtype: :class:`Distribution` or :class:`EggInfoDistribution` or None """ if not _cache_enabled: - for dist in _yield_distributions(True, use_egg_info): + for dist in _yield_distributions(True, use_egg_info, paths): if dist.name == name: return dist else: @@ -1148,7 +1146,7 @@ raise DistutilsError(('Distribution %s has invalid ' + 'provides field: %s') \ % (dist.name, p)) - p_ver = p_ver[1:-1] # trim off the parenthesis + p_ver = p_ver[1:-1] # trim off the parenthesis if p_name == name and predicate.match(p_ver): yield dist break diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -1,4 +1,4 @@ -"""Utility functions for copying files and directory trees. +"""Utility functions for copying and archiving files and directory trees. XXX The functions here don't copy the resource fork or other metadata on Mac. @@ -9,7 +9,13 @@ import stat from os.path import abspath import fnmatch -from warnings import warn +import errno + +try: + import bz2 + _BZ2_SUPPORTED = True +except ImportError: + _BZ2_SUPPORTED = False try: from pwd import getpwnam @@ -21,9 +27,12 @@ except ImportError: getgrnam = None -__all__ = ["copyfileobj","copyfile","copymode","copystat","copy","copy2", - "copytree","move","rmtree","Error", "SpecialFileError", - "ExecError","make_archive"] +__all__ = ["copyfileobj", "copyfile", "copymode", "copystat", "copy", "copy2", + "copytree", "move", "rmtree", "Error", "SpecialFileError", + "ExecError", "make_archive", "get_archive_formats", + "register_archive_format", "unregister_archive_format", + "get_unpack_formats", "register_unpack_format", + "unregister_unpack_format", "unpack_archive"] class Error(EnvironmentError): pass @@ -35,6 +44,14 @@ class ExecError(EnvironmentError): """Raised when a command could not be executed""" +class ReadError(EnvironmentError): + """Raised when an archive cannot be read""" + +class RegistryError(Exception): + """Raised when a registery operation with the archiving + and unpacking registeries fails""" + + try: WindowsError except NameError: @@ -50,7 +67,7 @@ def _samefile(src, dst): # Macintosh, Unix. 
- if hasattr(os.path,'samefile'): + if hasattr(os.path, 'samefile'): try: return os.path.samefile(src, dst) except OSError: @@ -63,10 +80,8 @@ def copyfile(src, dst): """Copy data from src to dst""" if _samefile(src, dst): - raise Error, "`%s` and `%s` are the same file" % (src, dst) + raise Error("`%s` and `%s` are the same file" % (src, dst)) - fsrc = None - fdst = None for fn in [src, dst]: try: st = os.stat(fn) @@ -77,15 +92,16 @@ # XXX What about other special files? (sockets, devices...) if stat.S_ISFIFO(st.st_mode): raise SpecialFileError("`%s` is a named pipe" % fn) + + fsrc = open(src, 'rb') try: - fsrc = open(src, 'rb') fdst = open(dst, 'wb') - copyfileobj(fsrc, fdst) + try: + copyfileobj(fsrc, fdst) + finally: + fdst.close() finally: - if fdst: - fdst.close() - if fsrc: - fsrc.close() + fsrc.close() def copymode(src, dst): """Copy mode bits from src to dst""" @@ -103,8 +119,12 @@ if hasattr(os, 'chmod'): os.chmod(dst, mode) if hasattr(os, 'chflags') and hasattr(st, 'st_flags'): - os.chflags(dst, st.st_flags) - + try: + os.chflags(dst, st.st_flags) + except OSError, why: + if (not hasattr(errno, 'EOPNOTSUPP') or + why.errno != errno.EOPNOTSUPP): + raise def copy(src, dst): """Copy data and mode bits ("cp src dst"). @@ -140,8 +160,9 @@ return set(ignored_names) return _ignore_patterns -def copytree(src, dst, symlinks=False, ignore=None): - """Recursively copy a directory tree using copy2(). +def copytree(src, dst, symlinks=False, ignore=None, copy_function=copy2, + ignore_dangling_symlinks=False): + """Recursively copy a directory tree. The destination directory must not already exist. If exception(s) occur, an Error is raised with a list of reasons. @@ -149,7 +170,13 @@ If the optional symlinks flag is true, symbolic links in the source tree result in symbolic links in the destination tree; if it is false, the contents of the files pointed to by symbolic - links are copied. + links are copied. If the file pointed by the symlink doesn't + exist, an exception will be added in the list of errors raised in + an Error exception at the end of the copy process. + + You can set the optional ignore_dangling_symlinks flag to true if you + want to silence this exception. Notice that this has no effect on + platforms that don't support os.symlink. The optional ignore argument is a callable. If given, it is called with the `src` parameter, which is the directory @@ -163,7 +190,10 @@ list of names relative to the `src` directory that should not be copied. - XXX Consider this example code rather than the ultimate tool. + The optional copy_function argument is a callable that will be used + to copy each file. It will be called with the source path and the + destination path as arguments. By default, copy2() is used, but any + function that supports the same signature (like copy()) can be used. """ names = os.listdir(src) @@ -182,14 +212,21 @@ srcname = os.path.join(src, name) dstname = os.path.join(dst, name) try: - if symlinks and os.path.islink(srcname): + if os.path.islink(srcname): linkto = os.readlink(srcname) - os.symlink(linkto, dstname) + if symlinks: + os.symlink(linkto, dstname) + else: + # ignore dangling symlink if the flag is on + if not os.path.exists(linkto) and ignore_dangling_symlinks: + continue + # otherwise let the copy occurs. 
copy2 will raise an error + copy_function(srcname, dstname) elif os.path.isdir(srcname): - copytree(srcname, dstname, symlinks, ignore) + copytree(srcname, dstname, symlinks, ignore, copy_function) else: # Will raise a SpecialFileError for unsupported file types - copy2(srcname, dstname) + copy_function(srcname, dstname) # catch the Error from the recursive copytree so that we can # continue with other files except Error, err: @@ -205,7 +242,7 @@ else: errors.extend((src, dst, str(why))) if errors: - raise Error, errors + raise Error(errors) def rmtree(path, ignore_errors=False, onerror=None): """Recursively delete a directory tree. @@ -235,7 +272,7 @@ names = [] try: names = os.listdir(path) - except os.error, err: + except os.error: onerror(os.listdir, path, sys.exc_info()) for name in names: fullname = os.path.join(path, name) @@ -248,7 +285,7 @@ else: try: os.remove(fullname) - except os.error, err: + except os.error: onerror(os.remove, fullname, sys.exc_info()) try: os.rmdir(path) @@ -282,13 +319,13 @@ if os.path.isdir(dst): real_dst = os.path.join(dst, _basename(src)) if os.path.exists(real_dst): - raise Error, "Destination path '%s' already exists" % real_dst + raise Error("Destination path '%s' already exists" % real_dst) try: os.rename(src, real_dst) except OSError: if os.path.isdir(src): if _destinsrc(src, dst): - raise Error, "Cannot move a directory '%s' into itself '%s'." % (src, dst) + raise Error("Cannot move a directory '%s' into itself '%s'." % (src, dst)) copytree(src, real_dst, symlinks=True) rmtree(src) else: @@ -333,40 +370,41 @@ """Create a (possibly compressed) tar file from all the files under 'base_dir'. - 'compress' must be "gzip" (the default), "compress", "bzip2", or None. - (compress will be deprecated in Python 3.2) + 'compress' must be "gzip" (the default), "bzip2", or None. 'owner' and 'group' can be used to define an owner and a group for the archive that is being built. If not provided, the current owner and group will be used. The output tar file will be named 'base_dir' + ".tar", possibly plus - the appropriate compression extension (".gz", ".bz2" or ".Z"). + the appropriate compression extension (".gz", or ".bz2"). Returns the output filename. 
""" - tar_compression = {'gzip': 'gz', 'bzip2': 'bz2', None: '', 'compress': ''} - compress_ext = {'gzip': '.gz', 'bzip2': '.bz2', 'compress': '.Z'} + tar_compression = {'gzip': 'gz', None: ''} + compress_ext = {'gzip': '.gz'} + + if _BZ2_SUPPORTED: + tar_compression['bzip2'] = 'bz2' + compress_ext['bzip2'] = '.bz2' # flags for compression program, each element of list will be an argument if compress is not None and compress not in compress_ext: - raise ValueError, \ - ("bad value for 'compress': must be None, 'gzip', 'bzip2' " - "or 'compress'") + raise ValueError("bad value for 'compress', or compression format not " + "supported: %s" % compress) - archive_name = base_name + '.tar' - if compress != 'compress': - archive_name += compress_ext.get(compress, '') + archive_name = base_name + '.tar' + compress_ext.get(compress, '') + archive_dir = os.path.dirname(archive_name) - archive_dir = os.path.dirname(archive_name) if not os.path.exists(archive_dir): if logger is not None: - logger.info("creating %s" % archive_dir) + logger.info("creating %s", archive_dir) if not dry_run: os.makedirs(archive_dir) - # creating the tarball + # XXX late import because of circular dependency between shutil and + # tarfile :( from distutils2._backport import tarfile if logger is not None: @@ -391,23 +429,9 @@ finally: tar.close() - # compression using `compress` - # XXX this block will be removed in Python 3.2 - if compress == 'compress': - warn("'compress' will be deprecated.", PendingDeprecationWarning) - # the option varies depending on the platform - compressed_name = archive_name + compress_ext[compress] - if sys.platform == 'win32': - cmd = [compress, archive_name, compressed_name] - else: - cmd = [compress, '-f', archive_name] - from distutils2.spawn import spawn - spawn(cmd, dry_run=dry_run) - return compressed_name - return archive_name -def _call_external_zip(directory, verbose=False): +def _call_external_zip(base_dir, zip_filename, verbose=False, dry_run=False): # XXX see if we want to keep an external call here if verbose: zipoptions = "-r" @@ -420,8 +444,7 @@ except DistutilsExecError: # XXX really should distinguish between "couldn't find # external 'zip' command" and "zip failed". - raise ExecError, \ - ("unable to create zip file '%s': " + raise ExecError("unable to create zip file '%s': " "could neither import the 'zipfile' module nor " "find a standalone zip utility") % zip_filename @@ -451,7 +474,7 @@ zipfile = None if zipfile is None: - _call_external_zip(base_dir, verbose) + _call_external_zip(base_dir, zip_filename, verbose, dry_run) else: if logger is not None: logger.info("creating '%s' and adding '%s' to it", @@ -475,12 +498,14 @@ _ARCHIVE_FORMATS = { 'gztar': (_make_tarball, [('compress', 'gzip')], "gzip'ed tar-file"), 'bztar': (_make_tarball, [('compress', 'bzip2')], "bzip2'ed tar-file"), - 'ztar': (_make_tarball, [('compress', 'compress')], - "compressed tar file"), 'tar': (_make_tarball, [('compress', None)], "uncompressed tar file"), - 'zip': (_make_zipfile, [],"ZIP file") + 'zip': (_make_zipfile, [], "ZIP file"), } +if _BZ2_SUPPORTED: + _ARCHIVE_FORMATS['bztar'] = (_make_tarball, [('compress', 'bzip2')], + "bzip2'ed tar-file") + def get_archive_formats(): """Returns a list of supported formats for archiving and unarchiving. 
@@ -507,7 +532,7 @@ if not isinstance(extra_args, (tuple, list)): raise TypeError('extra_args needs to be a sequence') for element in extra_args: - if not isinstance(element, (tuple, list)) or len(element) !=2 : + if not isinstance(element, (tuple, list)) or len(element) !=2: raise TypeError('extra_args elements are : (arg_name, value)') _ARCHIVE_FORMATS[name] = (function, extra_args, description) @@ -520,7 +545,7 @@ """Create an archive file (eg. zip or tar). 'base_name' is the name of the file to create, minus any format-specific - extension; 'format' is the archive format: one of "zip", "tar", "ztar", + extension; 'format' is the archive format: one of "zip", "tar", "bztar" or "gztar". 'root_dir' is a directory that will be the root directory of the @@ -549,7 +574,7 @@ try: format_info = _ARCHIVE_FORMATS[format] except KeyError: - raise ValueError, "unknown archive format '%s'" % format + raise ValueError("unknown archive format '%s'" % format) func = format_info[0] for arg, val in format_info[1]: @@ -568,3 +593,169 @@ os.chdir(save_cwd) return filename + + +def get_unpack_formats(): + """Returns a list of supported formats for unpacking. + + Each element of the returned sequence is a tuple + (name, extensions, description) + """ + formats = [(name, info[0], info[3]) for name, info in + _UNPACK_FORMATS.iteritems()] + formats.sort() + return formats + +def _check_unpack_options(extensions, function, extra_args): + """Checks what gets registered as an unpacker.""" + # first make sure no other unpacker is registered for this extension + existing_extensions = {} + for name, info in _UNPACK_FORMATS.iteritems(): + for ext in info[0]: + existing_extensions[ext] = name + + for extension in extensions: + if extension in existing_extensions: + msg = '%s is already registered for "%s"' + raise RegistryError(msg % (extension, + existing_extensions[extension])) + + if not callable(function): + raise TypeError('The registered function must be a callable') + + +def register_unpack_format(name, extensions, function, extra_args=None, + description=''): + """Registers an unpack format. + + `name` is the name of the format. `extensions` is a list of extensions + corresponding to the format. + + `function` is the callable that will be + used to unpack archives. The callable will receive archives to unpack. + If it's unable to handle an archive, it needs to raise a ReadError + exception. + + If provided, `extra_args` is a sequence of + (name, value) tuples that will be passed as arguments to the callable. + description can be provided to describe the format, and will be returned + by the get_unpack_formats() function. 
+ """ + if extra_args is None: + extra_args = [] + _check_unpack_options(extensions, function, extra_args) + _UNPACK_FORMATS[name] = extensions, function, extra_args, description + +def unregister_unpack_format(name): + """Removes the pack format from the registery.""" + del _UNPACK_FORMATS[name] + +def _ensure_directory(path): + """Ensure that the parent directory of `path` exists""" + dirname = os.path.dirname(path) + if not os.path.isdir(dirname): + os.makedirs(dirname) + +def _unpack_zipfile(filename, extract_dir): + """Unpack zip `filename` to `extract_dir` + """ + try: + import zipfile + except ImportError: + raise ReadError('zlib not supported, cannot unpack this archive.') + + if not zipfile.is_zipfile(filename): + raise ReadError("%s is not a zip file" % filename) + + zip = zipfile.ZipFile(filename) + try: + for info in zip.infolist(): + name = info.filename + + # don't extract absolute paths or ones with .. in them + if name.startswith('/') or '..' in name: + continue + + target = os.path.join(extract_dir, *name.split('/')) + if not target: + continue + + _ensure_directory(target) + if not name.endswith('/'): + # file + data = zip.read(info.filename) + f = open(target, 'wb') + try: + f.write(data) + finally: + f.close() + del data + finally: + zip.close() + +def _unpack_tarfile(filename, extract_dir): + """Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir` + """ + from distutils2._backport import tarfile + try: + tarobj = tarfile.open(filename) + except tarfile.TarError: + raise ReadError( + "%s is not a compressed or uncompressed tar file" % filename) + try: + tarobj.extractall(extract_dir) + finally: + tarobj.close() + +_UNPACK_FORMATS = { + 'gztar': (['.tar.gz', '.tgz'], _unpack_tarfile, [], "gzip'ed tar-file"), + 'tar': (['.tar'], _unpack_tarfile, [], "uncompressed tar file"), + 'zip': (['.zip'], _unpack_zipfile, [], "ZIP file") + } + +if _BZ2_SUPPORTED: + _UNPACK_FORMATS['bztar'] = (['.bz2'], _unpack_tarfile, [], + "bzip2'ed tar-file") + +def _find_unpack_format(filename): + for name, info in _UNPACK_FORMATS.iteritems(): + for extension in info[0]: + if filename.endswith(extension): + return name + return None + +def unpack_archive(filename, extract_dir=None, format=None): + """Unpack an archive. + + `filename` is the name of the archive. + + `extract_dir` is the name of the target directory, where the archive + is unpacked. If not provided, the current working directory is used. + + `format` is the archive format: one of "zip", "tar", or "gztar". Or any + other registered format. If not provided, unpack_archive will use the + filename extension and see if an unpacker was registered for that + extension. + + In case none is found, a ValueError is raised. 
+ """ + if extract_dir is None: + extract_dir = os.getcwd() + + if format is not None: + try: + format_info = _UNPACK_FORMATS[format] + except KeyError: + raise ValueError("Unknown unpack format '{0}'".format(format)) + + func = format_info[0] + func(filename, extract_dir, **dict(format_info[1])) + else: + # we need to look at the registered unpackers supported extensions + format = _find_unpack_format(filename) + if format is None: + raise ReadError("Unknown archive format '{0}'".format(filename)) + + func = _UNPACK_FORMATS[format][1] + kwargs = dict(_UNPACK_FORMATS[format][2]) + raise ValueError('Unknown archive format: %s' % filename) diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -12,10 +12,15 @@ except ImportError: from distutils2._backport.hashlib import md5 -from test.test_support import TESTFN +from distutils2.errors import DistutilsError +from distutils2.metadata import DistributionMetadata +from distutils2.tests import unittest, run_unittest, support -from distutils2.tests import unittest, run_unittest, support from distutils2._backport import pkgutil +from distutils2._backport.pkgutil import ( + Distribution, EggInfoDistribution, get_distribution, get_distributions, + provides_distribution, obsoletes_distribution, get_file_users, + distinfo_dirname, _yield_distributions) try: from os.path import relpath @@ -108,6 +113,12 @@ self.assertEqual(res1, RESOURCE_DATA) res2 = pkgutil.get_data(pkg, 'sub/res.txt') self.assertEqual(res2, RESOURCE_DATA) + + names = [] + for loader, name, ispkg in pkgutil.iter_modules([zip_file]): + names.append(name) + self.assertEqual(names, ['test_getdata_zipfile']) + del sys.path[0] del sys.modules[pkg] @@ -205,7 +216,7 @@ record_writer.writerow(record_pieces( os.path.join(distinfo_dir, file))) record_writer.writerow([relpath(record_file, sys.prefix)]) - del record_writer # causes the RECORD file to close + del record_writer # causes the RECORD file to close record_reader = csv.reader(open(record_file, 'rb')) record_data = [] for row in record_reader: @@ -225,9 +236,6 @@ def test_instantiation(self): # Test the Distribution class's instantiation provides us with usable # attributes. - # Import the Distribution class - from distutils2._backport.pkgutil import distinfo_dirname, Distribution - here = os.path.abspath(os.path.dirname(__file__)) name = 'choxie' version = '2.0.0.9' @@ -236,7 +244,6 @@ dist = Distribution(dist_path) self.assertEqual(dist.name, name) - from distutils2.metadata import DistributionMetadata self.assertTrue(isinstance(dist.metadata, DistributionMetadata)) self.assertEqual(dist.metadata['version'], version) self.assertTrue(isinstance(dist.requested, type(bool()))) @@ -244,7 +251,6 @@ def test_installed_files(self): # Test the iteration of installed files. # Test the distribution's installed files - from distutils2._backport.pkgutil import Distribution for distinfo_dir in self.distinfo_dirs: dist = Distribution(distinfo_dir) for path, md5_, size in dist.get_installed_files(): @@ -267,14 +273,12 @@ false_path = relpath(os.path.join(*false_path), sys.prefix) # Test if the distribution uses the file in question - from distutils2._backport.pkgutil import Distribution dist = Distribution(distinfo_dir) self.assertTrue(dist.uses(true_path)) self.assertFalse(dist.uses(false_path)) def test_get_distinfo_file(self): # Test the retrieval of dist-info file objects. 
- from distutils2._backport.pkgutil import Distribution distinfo_name = 'choxie-2.0.0.9' other_distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, @@ -295,7 +299,6 @@ # Is it the correct file? self.assertEqual(value.name, os.path.join(distinfo_dir, distfile)) - from distutils2.errors import DistutilsError # Test an absolute path that is part of another distributions dist-info other_distinfo_file = os.path.join(self.fake_dists_path, other_distinfo_name + '.dist-info', 'REQUESTED') @@ -307,7 +310,6 @@ def test_get_distinfo_files(self): # Test for the iteration of RECORD path entries. - from distutils2._backport.pkgutil import Distribution distinfo_name = 'towel_stuff-0.1' distinfo_dir = os.path.join(self.fake_dists_path, distinfo_name + '.dist-info') @@ -345,7 +347,7 @@ # Given a name and a version, we expect the distinfo_dirname function # to return a standard distribution information directory name. - items = [# (name, version, standard_dirname) + items = [ # (name, version, standard_dirname) # Test for a very simple single word name and decimal # version number ('docutils', '0.5', 'docutils-0.5.dist-info'), @@ -356,9 +358,6 @@ ('python-ldap', '2.5 a---5', 'python_ldap-2.5 a---5.dist-info'), ] - # Import the function in question - from distutils2._backport.pkgutil import distinfo_dirname - # Loop through the items to validate the results for name, version, standard_dirname in items: dirname = distinfo_dirname(name, version) @@ -371,11 +370,6 @@ ('towel-stuff', '0.1')] found_dists = [] - # Import the function in question - from distutils2._backport.pkgutil import get_distributions, \ - Distribution, \ - EggInfoDistribution - # Verify the fake dists have been found. dists = [dist for dist in get_distributions()] for dist in dists: @@ -416,12 +410,7 @@ def test_get_distribution(self): # Test for looking up a distribution by name. # Test the lookup of the towel-stuff distribution - name = 'towel-stuff' # Note: This is different from the directory name - - # Import the function in question - from distutils2._backport.pkgutil import get_distribution, \ - Distribution, \ - EggInfoDistribution + name = 'towel-stuff' # Note: This is different from the directory name # Lookup the distribution dist = get_distribution(name) @@ -461,7 +450,6 @@ def test_get_file_users(self): # Test the iteration of distributions that use a file. 
- from distutils2._backport.pkgutil import get_file_users, Distribution name = 'towel_stuff-0.1' path = os.path.join(self.fake_dists_path, name, 'towel_stuff', '__init__.py') @@ -471,9 +459,6 @@ def test_provides(self): # Test for looking up distributions by what they provide - from distutils2._backport.pkgutil import provides_distribution - from distutils2.errors import DistutilsError - checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) l = [dist.name for dist in provides_distribution('truffles')] @@ -522,12 +507,10 @@ use_egg_info=True)] checkLists(l, ['strawberry']) - l = [dist.name for dist in provides_distribution('strawberry', '>0.6', use_egg_info=True)] checkLists(l, []) - l = [dist.name for dist in provides_distribution('banana', '0.4', use_egg_info=True)] checkLists(l, ['banana']) @@ -536,16 +519,12 @@ use_egg_info=True)] checkLists(l, ['banana']) - l = [dist.name for dist in provides_distribution('banana', '!=0.4', use_egg_info=True)] checkLists(l, []) def test_obsoletes(self): # Test looking for distributions based on what they obsolete - from distutils2._backport.pkgutil import obsoletes_distribution - from distutils2.errors import DistutilsError - checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) l = [dist.name for dist in obsoletes_distribution('truffles', '1.0')] @@ -555,7 +534,6 @@ use_egg_info=True)] checkLists(l, ['cheese', 'bacon']) - l = [dist.name for dist in obsoletes_distribution('truffles', '0.8')] checkLists(l, ['choxie']) @@ -575,7 +553,6 @@ def test_yield_distribution(self): # tests the internal function _yield_distributions - from distutils2._backport.pkgutil import _yield_distributions checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), diff --git a/distutils2/_backport/tests/test_shutil.py b/distutils2/_backport/tests/test_shutil.py new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/test_shutil.py @@ -0,0 +1,945 @@ +import os +import sys +import tempfile +import stat +import tarfile +from os.path import splitdrive +from StringIO import StringIO + +from distutils.spawn import find_executable, spawn +from distutils2._backport import shutil +from distutils2._backport.shutil import ( + _make_tarball, _make_zipfile, make_archive, unpack_archive, + register_archive_format, unregister_archive_format, get_archive_formats, + register_unpack_format, unregister_unpack_format, get_unpack_formats, + Error, RegistryError) + +from distutils2.tests import unittest, support, TESTFN + +try: + import bz2 + BZ2_SUPPORTED = True +except ImportError: + BZ2_SUPPORTED = False + +TESTFN2 = TESTFN + "2" + +try: + import grp + import pwd + UID_GID_SUPPORT = True +except ImportError: + UID_GID_SUPPORT = False + +try: + import zlib +except ImportError: + zlib = None + +try: + import zipfile + ZIP_SUPPORT = True +except ImportError: + ZIP_SUPPORT = find_executable('zip') + +class TestShutil(unittest.TestCase): + + def setUp(self): + super(TestShutil, self).setUp() + self.tempdirs = [] + + def tearDown(self): + super(TestShutil, self).tearDown() + while self.tempdirs: + d = self.tempdirs.pop() + shutil.rmtree(d, os.name in ('nt', 'cygwin')) + + def write_file(self, path, content='xxx'): + """Writes a file in the given path. + + + path can be a string or a sequence. 
+ """ + if isinstance(path, (list, tuple)): + path = os.path.join(*path) + f = open(path, 'w') + try: + f.write(content) + finally: + f.close() + + def mkdtemp(self): + """Create a temporary directory that will be cleaned up. + + Returns the path of the directory. + """ + d = tempfile.mkdtemp() + self.tempdirs.append(d) + return d + + def test_rmtree_errors(self): + # filename is guaranteed not to exist + filename = tempfile.mktemp() + self.assertRaises(OSError, shutil.rmtree, filename) + + # See bug #1071513 for why we don't run this on cygwin + # and bug #1076467 for why we don't run this as root. + if (hasattr(os, 'chmod') and sys.platform[:6] != 'cygwin' + and not (hasattr(os, 'geteuid') and os.geteuid() == 0)): + def test_on_error(self): + self.errorState = 0 + os.mkdir(TESTFN) + self.childpath = os.path.join(TESTFN, 'a') + f = open(self.childpath, 'w') + f.close() + old_dir_mode = os.stat(TESTFN).st_mode + old_child_mode = os.stat(self.childpath).st_mode + # Make unwritable. + os.chmod(self.childpath, stat.S_IREAD) + os.chmod(TESTFN, stat.S_IREAD) + + shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) + # Test whether onerror has actually been called. + self.assertEqual(self.errorState, 2, + "Expected call to onerror function did not happen.") + + # Make writable again. + os.chmod(TESTFN, old_dir_mode) + os.chmod(self.childpath, old_child_mode) + + # Clean up. + shutil.rmtree(TESTFN) + + def check_args_to_onerror(self, func, arg, exc): + # test_rmtree_errors deliberately runs rmtree + # on a directory that is chmod 400, which will fail. + # This function is run when shutil.rmtree fails. + # 99.9% of the time it initially fails to remove + # a file in the directory, so the first time through + # func is os.remove. + # However, some Linux machines running ZFS on + # FUSE experienced a failure earlier in the process + # at os.listdir. The first failure may legally + # be either. + if self.errorState == 0: + if func is os.remove: + self.assertEqual(arg, self.childpath) + else: + self.assertIs(func, os.listdir, + "func must be either os.remove or os.listdir") + self.assertEqual(arg, TESTFN) + self.assertTrue(issubclass(exc[0], OSError)) + self.errorState = 1 + else: + self.assertEqual(func, os.rmdir) + self.assertEqual(arg, TESTFN) + self.assertTrue(issubclass(exc[0], OSError)) + self.errorState = 2 + + def test_rmtree_dont_delete_file(self): + # When called on a file instead of a directory, don't delete it. 
+ handle, path = tempfile.mkstemp() + os.fdopen(handle).close() + self.assertRaises(OSError, shutil.rmtree, path) + os.remove(path) + + def _write_data(self, path, data): + f = open(path, "w") + f.write(data) + f.close() + + def test_copytree_simple(self): + + def read_data(path): + f = open(path) + data = f.read() + f.close() + return data + + src_dir = tempfile.mkdtemp() + dst_dir = os.path.join(tempfile.mkdtemp(), 'destination') + self._write_data(os.path.join(src_dir, 'test.txt'), '123') + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + + try: + shutil.copytree(src_dir, dst_dir) + self.assertTrue(os.path.isfile(os.path.join(dst_dir, 'test.txt'))) + self.assertTrue(os.path.isdir(os.path.join(dst_dir, 'test_dir'))) + self.assertTrue(os.path.isfile(os.path.join(dst_dir, 'test_dir', + 'test.txt'))) + actual = read_data(os.path.join(dst_dir, 'test.txt')) + self.assertEqual(actual, '123') + actual = read_data(os.path.join(dst_dir, 'test_dir', 'test.txt')) + self.assertEqual(actual, '456') + finally: + for path in ( + os.path.join(src_dir, 'test.txt'), + os.path.join(dst_dir, 'test.txt'), + os.path.join(src_dir, 'test_dir', 'test.txt'), + os.path.join(dst_dir, 'test_dir', 'test.txt'), + ): + if os.path.exists(path): + os.remove(path) + for path in (src_dir, + os.path.dirname(dst_dir) + ): + if os.path.exists(path): + shutil.rmtree(path) + + def test_copytree_with_exclude(self): + + def read_data(path): + f = open(path) + data = f.read() + f.close() + return data + + # creating data + join = os.path.join + exists = os.path.exists + src_dir = tempfile.mkdtemp() + try: + dst_dir = join(tempfile.mkdtemp(), 'destination') + self._write_data(join(src_dir, 'test.txt'), '123') + self._write_data(join(src_dir, 'test.tmp'), '123') + os.mkdir(join(src_dir, 'test_dir')) + self._write_data(join(src_dir, 'test_dir', 'test.txt'), '456') + os.mkdir(join(src_dir, 'test_dir2')) + self._write_data(join(src_dir, 'test_dir2', 'test.txt'), '456') + os.mkdir(join(src_dir, 'test_dir2', 'subdir')) + os.mkdir(join(src_dir, 'test_dir2', 'subdir2')) + self._write_data(join(src_dir, 'test_dir2', 'subdir', 'test.txt'), + '456') + self._write_data(join(src_dir, 'test_dir2', 'subdir2', 'test.py'), + '456') + + + # testing glob-like patterns + try: + patterns = shutil.ignore_patterns('*.tmp', 'test_dir2') + shutil.copytree(src_dir, dst_dir, ignore=patterns) + # checking the result: some elements should not be copied + self.assertTrue(exists(join(dst_dir, 'test.txt'))) + self.assertTrue(not exists(join(dst_dir, 'test.tmp'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2'))) + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + try: + patterns = shutil.ignore_patterns('*.tmp', 'subdir*') + shutil.copytree(src_dir, dst_dir, ignore=patterns) + # checking the result: some elements should not be copied + self.assertTrue(not exists(join(dst_dir, 'test.tmp'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir2'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir'))) + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + + # testing callable-style + try: + def _filter(src, names): + res = [] + for name in names: + path = os.path.join(src, name) + + if (os.path.isdir(path) and + path.split()[-1] == 'subdir'): + res.append(name) + elif os.path.splitext(path)[-1] in ('.py'): + res.append(name) + return res + + shutil.copytree(src_dir, dst_dir, ignore=_filter) + + # checking the result: some elements 
should not be copied + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir2', + 'test.py'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir'))) + + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + finally: + shutil.rmtree(src_dir) + shutil.rmtree(os.path.dirname(dst_dir)) + + @support.skip_unless_symlink + def test_dont_copy_file_onto_link_to_itself(self): + # bug 851123. + os.mkdir(TESTFN) + src = os.path.join(TESTFN, 'cheese') + dst = os.path.join(TESTFN, 'shop') + try: + f = open(src, 'w') + f.write('cheddar') + f.close() + + if hasattr(os, "link"): + os.link(src, dst) + self.assertRaises(shutil.Error, shutil.copyfile, src, dst) + f = open(src, 'r') + try: + self.assertEqual(f.read(), 'cheddar') + finally: + f.close() + os.remove(dst) + + # Using `src` here would mean we end up with a symlink pointing + # to TESTFN/TESTFN/cheese, while it should point at + # TESTFN/cheese. + os.symlink('cheese', dst) + self.assertRaises(shutil.Error, shutil.copyfile, src, dst) + f = open(src, 'r') + try: + self.assertEqual(f.read(), 'cheddar') + finally: + f.close() + os.remove(dst) + finally: + try: + shutil.rmtree(TESTFN) + except OSError: + pass + + @support.skip_unless_symlink + def test_rmtree_on_symlink(self): + # bug 1669. + os.mkdir(TESTFN) + try: + src = os.path.join(TESTFN, 'cheese') + dst = os.path.join(TESTFN, 'shop') + os.mkdir(src) + os.symlink(src, dst) + self.assertRaises(OSError, shutil.rmtree, dst) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + if hasattr(os, "mkfifo"): + # Issue #3002: copyfile and copytree block indefinitely on named pipes + def test_copyfile_named_pipe(self): + os.mkfifo(TESTFN) + try: + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, TESTFN, TESTFN2) + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, __file__, TESTFN) + finally: + os.remove(TESTFN) + + @unittest.skipUnless(hasattr(os, 'mkfifo'), 'requires os.mkfifo') + def test_copytree_named_pipe(self): + os.mkdir(TESTFN) + try: + subdir = os.path.join(TESTFN, "subdir") + os.mkdir(subdir) + pipe = os.path.join(subdir, "mypipe") + os.mkfifo(pipe) + try: + shutil.copytree(TESTFN, TESTFN2) + except shutil.Error, e: + errors = e.args[0] + self.assertEqual(len(errors), 1) + src, dst, error_msg = errors[0] + self.assertEqual("`%s` is a named pipe" % pipe, error_msg) + else: + self.fail("shutil.Error should have been raised") + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + shutil.rmtree(TESTFN2, ignore_errors=True) + + def test_copytree_special_func(self): + + src_dir = self.mkdtemp() + dst_dir = os.path.join(self.mkdtemp(), 'destination') + self._write_data(os.path.join(src_dir, 'test.txt'), '123') + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + + copied = [] + def _copy(src, dst): + copied.append((src, dst)) + + shutil.copytree(src_dir, dst_dir, copy_function=_copy) + self.assertEquals(len(copied), 2) + + @support.skip_unless_symlink + def test_copytree_dangling_symlinks(self): + + # a dangling symlink raises an error at the end + src_dir = self.mkdtemp() + dst_dir = os.path.join(self.mkdtemp(), 'destination') + os.symlink('IDONTEXIST', os.path.join(src_dir, 'test.txt')) + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + self.assertRaises(Error, shutil.copytree, src_dir, dst_dir) + + # a dangling symlink is ignored with the proper flag + dst_dir = os.path.join(self.mkdtemp(), 
'destination2') + shutil.copytree(src_dir, dst_dir, ignore_dangling_symlinks=True) + self.assertNotIn('test.txt', os.listdir(dst_dir)) + + # a dangling symlink is copied if symlinks=True + dst_dir = os.path.join(self.mkdtemp(), 'destination3') + shutil.copytree(src_dir, dst_dir, symlinks=True) + self.assertIn('test.txt', os.listdir(dst_dir)) + + @unittest.skipUnless(zlib, "requires zlib") + def test_make_tarball(self): + # creating something to tar + tmpdir = self.mkdtemp() + self.write_file([tmpdir, 'file1'], 'xxx') + self.write_file([tmpdir, 'file2'], 'xxx') + os.mkdir(os.path.join(tmpdir, 'sub')) + self.write_file([tmpdir, 'sub', 'file3'], 'xxx') + + tmpdir2 = self.mkdtemp() + unittest.skipUnless(splitdrive(tmpdir)[0] == splitdrive(tmpdir2)[0], + "source and target should be on same drive") + + base_name = os.path.join(tmpdir2, 'archive') + + # working with relative paths to avoid tar warnings + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(splitdrive(base_name)[1], '.') + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + tarball = base_name + '.tar.gz' + self.assertTrue(os.path.exists(tarball)) + + # trying an uncompressed one + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(splitdrive(base_name)[1], '.', compress=None) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + def _tarinfo(self, path): + tar = tarfile.open(path) + try: + names = tar.getnames() + names.sort() + return tuple(names) + finally: + tar.close() + + def _create_files(self): + # creating something to tar + tmpdir = self.mkdtemp() + dist = os.path.join(tmpdir, 'dist') + os.mkdir(dist) + self.write_file([dist, 'file1'], 'xxx') + self.write_file([dist, 'file2'], 'xxx') + os.mkdir(os.path.join(dist, 'sub')) + self.write_file([dist, 'sub', 'file3'], 'xxx') + os.mkdir(os.path.join(dist, 'sub2')) + tmpdir2 = self.mkdtemp() + base_name = os.path.join(tmpdir2, 'archive') + return tmpdir, tmpdir2, base_name + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(find_executable('tar') and find_executable('gzip'), + 'Need the tar command to run') + def test_tarfile_vs_tar(self): + tmpdir, tmpdir2, base_name = self._create_files() + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist') + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + tarball = base_name + '.tar.gz' + self.assertTrue(os.path.exists(tarball)) + + # now create another tarball using `tar` + tarball2 = os.path.join(tmpdir, 'archive2.tar.gz') + tar_cmd = ['tar', '-cf', 'archive2.tar', 'dist'] + gzip_cmd = ['gzip', '-f9', 'archive2.tar'] + old_dir = os.getcwd() + old_stdout = sys.stdout + os.chdir(tmpdir) + sys.stdout = StringIO() + + try: + spawn(tar_cmd) + spawn(gzip_cmd) + finally: + os.chdir(old_dir) + sys.stdout = old_stdout + + self.assertTrue(os.path.exists(tarball2)) + # let's compare both tarballs + self.assertEquals(self._tarinfo(tarball), self._tarinfo(tarball2)) + + # trying an uncompressed one + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist', compress=None) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + # now for a dry_run + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist', compress=None, 
dry_run=True) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(ZIP_SUPPORT, 'Need zip support to run') + def test_make_zipfile(self): + # creating something to tar + tmpdir = self.mkdtemp() + self.write_file([tmpdir, 'file1'], 'xxx') + self.write_file([tmpdir, 'file2'], 'xxx') + + tmpdir2 = self.mkdtemp() + base_name = os.path.join(tmpdir2, 'archive') + _make_zipfile(base_name, tmpdir) + + # check if the compressed tarball was created + tarball = base_name + '.zip' + self.assertTrue(os.path.exists(tarball)) + + + def test_make_archive(self): + tmpdir = self.mkdtemp() + base_name = os.path.join(tmpdir, 'archive') + self.assertRaises(ValueError, make_archive, base_name, 'xxx') + + @unittest.skipUnless(zlib, "Requires zlib") + def test_make_archive_owner_group(self): + # testing make_archive with owner and group, with various combinations + # this works even if there's not gid/uid support + if UID_GID_SUPPORT: + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + else: + group = owner = 'root' + + base_dir, root_dir, base_name = self._create_files() + base_name = os.path.join(self.mkdtemp() , 'archive') + res = make_archive(base_name, 'zip', root_dir, base_dir, owner=owner, + group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'zip', root_dir, base_dir) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner=owner, group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner='kjhkjhkjg', group='oihohoh') + self.assertTrue(os.path.exists(res)) + + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(UID_GID_SUPPORT, "Requires grp and pwd support") + def test_tarfile_root_owner(self): + tmpdir, tmpdir2, base_name = self._create_files() + old_dir = os.getcwd() + os.chdir(tmpdir) + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + try: + archive_name = _make_tarball(base_name, 'dist', compress=None, + owner=owner, group=group) + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + self.assertTrue(os.path.exists(archive_name)) + + # now checks the rights + archive = tarfile.open(archive_name) + try: + for member in archive.getmembers(): + self.assertEquals(member.uid, 0) + self.assertEquals(member.gid, 0) + finally: + archive.close() + + def test_make_archive_cwd(self): + current_dir = os.getcwd() + def _breaks(*args, **kw): + raise RuntimeError() + + register_archive_format('xxx', _breaks, [], 'xxx file') + try: + try: + make_archive('xxx', 'xxx', root_dir=self.mkdtemp()) + except Exception: + pass + self.assertEquals(os.getcwd(), current_dir) + finally: + unregister_archive_format('xxx') + + def test_register_archive_format(self): + + self.assertRaises(TypeError, register_archive_format, 'xxx', 1) + self.assertRaises(TypeError, register_archive_format, 'xxx', lambda: x, + 1) + self.assertRaises(TypeError, register_archive_format, 'xxx', lambda: x, + [(1, 2), (1, 2, 3)]) + + register_archive_format('xxx', lambda: x, [(1, 2)], 'xxx file') + formats = [name for name, params in get_archive_formats()] + self.assertIn('xxx', formats) + + unregister_archive_format('xxx') + formats = [name for name, params in get_archive_formats()] + self.assertNotIn('xxx', formats) + + def _compare_dirs(self, dir1, dir2): + # check that dir1 and dir2 are equivalent, + # return the diff + diff = 
[] + for root, dirs, files in os.walk(dir1): + for file_ in files: + path = os.path.join(root, file_) + target_path = os.path.join(dir2, os.path.split(path)[-1]) + if not os.path.exists(target_path): + diff.append(file_) + return diff + + @unittest.skipUnless(zlib, "Requires zlib") + def test_unpack_archive(self): + formats = ['tar', 'gztar', 'zip'] + if BZ2_SUPPORTED: + formats.append('bztar') + + for format in formats: + tmpdir = self.mkdtemp() + base_dir, root_dir, base_name = self._create_files() + tmpdir2 = self.mkdtemp() + filename = make_archive(base_name, format, root_dir, base_dir) + + # let's try to unpack it now + unpack_archive(filename, tmpdir2) + diff = self._compare_dirs(tmpdir, tmpdir2) + self.assertEquals(diff, []) + + def test_unpack_registery(self): + + formats = get_unpack_formats() + + def _boo(filename, extract_dir, extra): + self.assertEquals(extra, 1) + self.assertEquals(filename, 'stuff.boo') + self.assertEquals(extract_dir, 'xx') + + register_unpack_format('Boo', ['.boo', '.b2'], _boo, [('extra', 1)]) + unpack_archive('stuff.boo', 'xx') + + # trying to register a .boo unpacker again + self.assertRaises(RegistryError, register_unpack_format, 'Boo2', + ['.boo'], _boo) + + # should work now + unregister_unpack_format('Boo') + register_unpack_format('Boo2', ['.boo'], _boo) + self.assertIn(('Boo2', ['.boo'], ''), get_unpack_formats()) + self.assertNotIn(('Boo', ['.boo'], ''), get_unpack_formats()) + + # let's leave a clean state + unregister_unpack_format('Boo2') + self.assertEquals(get_unpack_formats(), formats) + + +class TestMove(unittest.TestCase): + + def setUp(self): + filename = "foo" + self.src_dir = tempfile.mkdtemp() + self.dst_dir = tempfile.mkdtemp() + self.src_file = os.path.join(self.src_dir, filename) + self.dst_file = os.path.join(self.dst_dir, filename) + # Try to create a dir in the current directory, hoping that it is + # not located on the same filesystem as the system tmp dir. + try: + self.dir_other_fs = tempfile.mkdtemp( + dir=os.path.dirname(__file__)) + self.file_other_fs = os.path.join(self.dir_other_fs, + filename) + except OSError: + self.dir_other_fs = None + f = open(self.src_file, "wb") + try: + f.write("spam") + finally: + f.close() + + def tearDown(self): + for d in (self.src_dir, self.dst_dir, self.dir_other_fs): + try: + if d: + shutil.rmtree(d) + except: + pass + + def _check_move_file(self, src, dst, real_dst): + f = open(src, "rb") + try: + contents = f.read() + finally: + f.close() + + shutil.move(src, dst) + f = open(real_dst, "rb") + try: + self.assertEqual(contents, f.read()) + finally: + f.close() + + self.assertFalse(os.path.exists(src)) + + def _check_move_dir(self, src, dst, real_dst): + contents = sorted(os.listdir(src)) + shutil.move(src, dst) + self.assertEqual(contents, sorted(os.listdir(real_dst))) + self.assertFalse(os.path.exists(src)) + + def test_move_file(self): + # Move a file to another location on the same filesystem. + self._check_move_file(self.src_file, self.dst_file, self.dst_file) + + def test_move_file_to_dir(self): + # Move a file inside an existing dir on the same filesystem. + self._check_move_file(self.src_file, self.dst_dir, self.dst_file) + + def test_move_file_other_fs(self): + # Move a file to an existing dir on another filesystem. + if not self.dir_other_fs: + # skip + return + self._check_move_file(self.src_file, self.file_other_fs, + self.file_other_fs) + + def test_move_file_to_dir_other_fs(self): + # Move a file to another location on another filesystem. 
+ if not self.dir_other_fs: + # skip + return + self._check_move_file(self.src_file, self.dir_other_fs, + self.file_other_fs) + + def test_move_dir(self): + # Move a dir to another location on the same filesystem. + dst_dir = tempfile.mktemp() + try: + self._check_move_dir(self.src_dir, dst_dir, dst_dir) + finally: + try: + shutil.rmtree(dst_dir) + except: + pass + + def test_move_dir_other_fs(self): + # Move a dir to another location on another filesystem. + if not self.dir_other_fs: + # skip + return + dst_dir = tempfile.mktemp(dir=self.dir_other_fs) + try: + self._check_move_dir(self.src_dir, dst_dir, dst_dir) + finally: + try: + shutil.rmtree(dst_dir) + except: + pass + + def test_move_dir_to_dir(self): + # Move a dir inside an existing dir on the same filesystem. + self._check_move_dir(self.src_dir, self.dst_dir, + os.path.join(self.dst_dir, os.path.basename(self.src_dir))) + + def test_move_dir_to_dir_other_fs(self): + # Move a dir inside an existing dir on another filesystem. + if not self.dir_other_fs: + # skip + return + self._check_move_dir(self.src_dir, self.dir_other_fs, + os.path.join(self.dir_other_fs, os.path.basename(self.src_dir))) + + def test_existing_file_inside_dest_dir(self): + # A file with the same name inside the destination dir already exists. + f = open(self.dst_file, "wb") + try: + pass + finally: + f.close() + self.assertRaises(shutil.Error, shutil.move, self.src_file, self.dst_dir) + + def test_dont_move_dir_in_itself(self): + # Moving a dir inside itself raises an Error. + dst = os.path.join(self.src_dir, "bar") + self.assertRaises(shutil.Error, shutil.move, self.src_dir, dst) + + def test_destinsrc_false_negative(self): + os.mkdir(TESTFN) + try: + for src, dst in [('srcdir', 'srcdir/dest')]: + src = os.path.join(TESTFN, src) + dst = os.path.join(TESTFN, dst) + self.assertTrue(shutil._destinsrc(src, dst), + msg='_destinsrc() wrongly concluded that ' + 'dst (%s) is not in src (%s)' % (dst, src)) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + def test_destinsrc_false_positive(self): + os.mkdir(TESTFN) + try: + for src, dst in [('srcdir', 'src/dest'), ('srcdir', 'srcdir.new')]: + src = os.path.join(TESTFN, src) + dst = os.path.join(TESTFN, dst) + self.assertFalse(shutil._destinsrc(src, dst), + msg='_destinsrc() wrongly concluded that ' + 'dst (%s) is in src (%s)' % (dst, src)) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + +class TestCopyFile(unittest.TestCase): + + _delete = False + + class Faux(object): + _entered = False + _exited_with = None + _raised = False + + def __init__(self, raise_in_exit=False, suppress_at_exit=True): + self._raise_in_exit = raise_in_exit + self._suppress_at_exit = suppress_at_exit + + def read(self, *args): + return '' + + def __enter__(self): + self._entered = True + + def __exit__(self, exc_type, exc_val, exc_tb): + self._exited_with = exc_type, exc_val, exc_tb + if self._raise_in_exit: + self._raised = True + raise IOError("Cannot close") + return self._suppress_at_exit + + def tearDown(self): + if self._delete: + del shutil.open + + def _set_shutil_open(self, func): + shutil.open = func + self._delete = True + + def test_w_source_open_fails(self): + def _open(filename, mode='r'): + if filename == 'srcfile': + raise IOError('Cannot open "srcfile"') + assert 0 # shouldn't reach here. 
+ + self._set_shutil_open(_open) + + self.assertRaises(IOError, shutil.copyfile, 'srcfile', 'destfile') + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_dest_open_fails(self): + + srcfile = self.Faux() + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + raise IOError('Cannot open "destfile"') + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + shutil.copyfile('srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(srcfile._exited_with[0] is IOError) + self.assertEqual(srcfile._exited_with[1].args, + ('Cannot open "destfile"',)) + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_dest_close_fails(self): + + srcfile = self.Faux() + destfile = self.Faux(True) + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + return destfile + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + shutil.copyfile('srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(destfile._entered) + self.assertTrue(destfile._raised) + self.assertTrue(srcfile._exited_with[0] is IOError) + self.assertEqual(srcfile._exited_with[1].args, + ('Cannot close',)) + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_source_close_fails(self): + + srcfile = self.Faux(True) + destfile = self.Faux() + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + return destfile + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + self.assertRaises(IOError, + shutil.copyfile, 'srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(destfile._entered) + self.assertFalse(destfile._raised) + self.assertTrue(srcfile._exited_with[0] is None) + self.assertTrue(srcfile._raised) + + +def test_suite(): + suite = unittest.TestSuite() + load = unittest.defaultTestLoader.loadTestsFromTestCase + suite.addTest(load(TestCopyFile)) + suite.addTest(load(TestMove)) + suite.addTest(load(TestShutil)) + return suite + + +if __name__ == '__main__': + unittest.main(defaultTest='test_suite') diff --git a/distutils2/_backport/tests/test_sysconfig.py b/distutils2/_backport/tests/test_sysconfig.py --- a/distutils2/_backport/tests/test_sysconfig.py +++ b/distutils2/_backport/tests/test_sysconfig.py @@ -4,7 +4,7 @@ import sys import subprocess import shutil -from copy import copy, deepcopy +from copy import copy from ConfigParser import RawConfigParser from StringIO import StringIO @@ -15,13 +15,9 @@ get_scheme_names, _main, _SCHEMES) from distutils2.tests import unittest -from distutils2.tests.support import EnvironGuard +from distutils2.tests.support import EnvironGuard, skip_unless_symlink from test.test_support import TESTFN, unlink -try: - from test.test_support import skip_unless_symlink -except ImportError: - skip_unless_symlink = unittest.skip( - 'requires test.test_support.skip_unless_symlink') + class TestSysConfig(EnvironGuard, unittest.TestCase): diff --git a/distutils2/command/build_py.py b/distutils2/command/build_py.py --- a/distutils2/command/build_py.py +++ b/distutils2/command/build_py.py @@ -8,7 +8,6 @@ import logging from glob import glob -import distutils2 from distutils2.command.cmd import Command from distutils2.errors import DistutilsOptionError, DistutilsFileError from distutils2.util import convert_path @@ -66,10 +65,9 @@ self.packages = self.distribution.packages 
self.py_modules = self.distribution.py_modules self.package_data = self.distribution.package_data - self.package_dir = {} - if self.distribution.package_dir: - for name, path in self.distribution.package_dir.iteritems(): - self.package_dir[name] = convert_path(path) + self.package_dir = None + if self.distribution.package_dir is not None: + self.package_dir = convert_path(self.distribution.package_dir) self.data_files = self.get_data_files() # Ick, copied straight from install_lib.py (fancy_getopt needs a @@ -164,11 +162,13 @@ Helper function for `run()`. """ + # FIXME add tests for this method for package, src_dir, build_dir, filenames in self.data_files: for filename in filenames: target = os.path.join(build_dir, filename) + srcfile = os.path.join(src_dir, filename) self.mkpath(os.path.dirname(target)) - outf, copied = self.copy_file(os.path.join(src_dir, filename), + outf, copied = self.copy_file(srcfile, target, preserve_mode=False) if copied and srcfile in self.distribution.convert_2to3.doctests: self._doctests_2to3.append(outf) @@ -179,41 +179,14 @@ """Return the directory, relative to the top of the source distribution, where package 'package' should be found (at least according to the 'package_dir' option, if any).""" + path = package.split('.') + if self.package_dir is not None: + path.insert(0, self.package_dir) - path = package.split('.') + if len(path) > 0: + return os.path.join(*path) - if not self.package_dir: - if path: - return os.path.join(*path) - else: - return '' - else: - tail = [] - while path: - try: - pdir = self.package_dir['.'.join(path)] - except KeyError: - tail.insert(0, path[-1]) - del path[-1] - else: - tail.insert(0, pdir) - return os.path.join(*tail) - else: - # Oops, got all the way through 'path' without finding a - # match in package_dir. If package_dir defines a directory - # for the root (nameless) package, then fallback on it; - # otherwise, we might as well have not consulted - # package_dir at all, as we just use the directory implied - # by 'tail' (which should be the same as the original value - # of 'path' at this point). - pdir = self.package_dir.get('') - if pdir is not None: - tail.insert(0, pdir) - - if tail: - return os.path.join(*tail) - else: - return '' + return '' def check_package(self, package, package_dir): """Helper function for `find_package_modules()` and `find_modules()'. 
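As context for the build_py change just above: package_dir is now a single root directory (run through convert_path) rather than a per-package mapping, so resolving a package's source directory reduces to joining that root with the components of the dotted package name. The following is a minimal illustrative sketch of that lookup as a standalone helper; the helper name, parameter names, and example values are assumptions for illustration and are not part of the changeset.

    import os

    def get_package_dir(package, package_root=None):
        # Split the dotted package name into path components and, when a
        # single root directory is configured, prepend it. This mirrors
        # the simplified build_py.get_package_dir() shown above; it is an
        # illustrative sketch, not code from the patch.
        path = package.split('.')
        if package_root is not None:
            path.insert(0, package_root)
        if path:
            return os.path.join(*path)
        return ''

    # Example results on a POSIX system, using 'src' as the root (the
    # packages_root value that appears in the setup.cfg samples later in
    # this changeset):
    #   get_package_dir('two', 'src')      -> 'src/two'
    #   get_package_dir('pkg.sub', 'src')  -> 'src/pkg/sub'
    #   get_package_dir('one')             -> 'one'
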
diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -165,7 +165,10 @@ header = "command options for '%s':" % self.get_command_name() self.announce(indent + header, level=logging.INFO) indent = indent + " " + negative_opt = getattr(self, 'negative_opt', ()) for (option, _, _) in self.user_options: + if option in negative_opt: + continue option = option.replace('-', '_') if option[-1] == "=": option = option[:-1] diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py --- a/distutils2/command/sdist.py +++ b/distutils2/command/sdist.py @@ -18,7 +18,8 @@ from distutils2.command import get_command_names from distutils2.command.cmd import Command from distutils2.errors import (DistutilsPlatformError, DistutilsOptionError, - DistutilsTemplateError, DistutilsModuleError) + DistutilsTemplateError, DistutilsModuleError, + DistutilsFileError) from distutils2.manifest import Manifest from distutils2 import logger from distutils2.util import convert_path, resolve_name @@ -291,6 +292,12 @@ logger.warn("no files to distribute -- empty manifest?") else: logger.info(msg) + + for file in self.distribution.metadata.requires_files: + if file not in files: + msg = "'%s' must be included explicitly in 'extra_files'" % file + raise DistutilsFileError(msg) + for file in files: if not os.path.isfile(file): logger.warn("'%s' not a regular file -- skipping" % file) diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -7,7 +7,8 @@ from ConfigParser import RawConfigParser from distutils2 import logger -from distutils2.util import check_environ, resolve_name +from distutils2.errors import DistutilsOptionError +from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command @@ -76,10 +77,9 @@ return value def _multiline(self, value): - if '\n' in value: - value = [v for v in - [v.strip() for v in value.split('\n')] - if v != ''] + value = [v for v in + [v.strip() for v in value.split('\n')] + if v != ''] return value def _read_setup_cfg(self, parser): @@ -100,7 +100,9 @@ if 'metadata' in content: for key, value in content['metadata'].iteritems(): key = key.replace('_', '-') - value = self._multiline(value) + if metadata.is_multi_field(key): + value = self._multiline(value) + if key == 'project-url': value = [(label.strip(), url.strip()) for label, url in @@ -112,30 +114,45 @@ "mutually exclusive") raise DistutilsOptionError(msg) - f = open(value) # will raise if file not found - try: - value = f.read() - finally: - f.close() + if isinstance(value, list): + filenames = value + else: + filenames = value.split() + + # concatenate each files + value = '' + for filename in filenames: + f = open(filename) # will raise if file not found + try: + value += f.read().strip() + '\n' + finally: + f.close() + # add filename as a required file + if filename not in metadata.requires_files: + metadata.requires_files.append(filename) + value = value.strip() key = 'description' if metadata.is_metadata_field(key): metadata[key] = self._convert_metadata(key, value) + if 'files' in content: - files = dict([(key, self._multiline(value)) + def _convert(key, value): + if key not in ('packages_root',): + value = self._multiline(value) + return value + + files = dict([(key, _convert(key, value)) for key, value in content['files'].iteritems()]) self.dist.packages = [] - self.dist.package_dir = {} + 
self.dist.package_dir = pkg_dir = files.get('packages_root') packages = files.get('packages', []) if isinstance(packages, str): packages = [packages] for package in packages: - if ':' in package: - dir_, package = package.split(':') - self.dist.package_dir[package] = dir_ self.dist.packages.append(package) self.dist.py_modules = files.get('modules', []) diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -17,19 +17,19 @@ import urllib import urlparse import zipfile - try: import hashlib except ImportError: from distutils2._backport import hashlib +from distutils2._backport.shutil import unpack_archive from distutils2.errors import IrrationalVersionError from distutils2.index.errors import (HashDoesNotMatch, UnsupportedHashName, CantParseArchiveName) from distutils2.version import (suggest_normalized_version, NormalizedVersion, get_version_predicate) from distutils2.metadata import DistributionMetadata -from distutils2.util import untar_file, unzip_file, splitext +from distutils2.util import splitext __all__ = ['ReleaseInfo', 'DistInfo', 'ReleasesList', 'get_infos_from_url'] @@ -206,6 +206,7 @@ __hash__ = object.__hash__ + class DistInfo(IndexReference): """Represents a distribution retrieved from an index (sdist, bdist, ...) """ @@ -313,17 +314,8 @@ filename = self.download() content_type = mimetypes.guess_type(filename)[0] + self._unpacked_dir = unpack_archive(filename) - if (content_type == 'application/zip' - or filename.endswith('.zip') - or filename.endswith('.pybundle') - or zipfile.is_zipfile(filename)): - unzip_file(filename, path, flatten=not filename.endswith('.pybundle')) - elif (content_type == 'application/x-gzip' - or tarfile.is_tarfile(filename) - or splitext(filename)[1].lower() in ('.tar', '.tar.gz', '.tar.bz2', '.tgz', '.tbz')): - untar_file(filename, path) - self._unpacked_dir = path return self._unpacked_dir def _check_md5(self, filename): diff --git a/distutils2/index/xmlrpc.py b/distutils2/index/xmlrpc.py --- a/distutils2/index/xmlrpc.py +++ b/distutils2/index/xmlrpc.py @@ -127,10 +127,17 @@ return release def get_metadata(self, project_name, version): - """Retreive project metadatas. + """Retrieve project metadata. Return a ReleaseInfo object, with metadata informations filled in. """ + # to be case-insensitive, get the informations from the XMLRPC API + projects = [d['name'] for d in + self.proxy.search({'name': project_name}) + if d['name'].lower() == project_name] + if len(projects) > 0: + project_name = projects[0] + metadata = self.proxy.release_data(project_name, version) project = self._get_project(project_name) if version not in project.get_versions(): diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -1,14 +1,16 @@ from tempfile import mkdtemp -import logging import shutil import os import errno import itertools +from distutils2 import logger from distutils2._backport.pkgutil import get_distributions +from distutils2._backport.sysconfig import get_config_var from distutils2.depgraph import generate_graph from distutils2.index import wrapper from distutils2.index.errors import ProjectNotFound, ReleaseNotFound +from distutils2.version import get_version_predicate """Provides installations scripts. 
@@ -53,7 +55,63 @@ else: raise e os.rename(old, new) - yield(old, new) + yield (old, new) + + +def _run_d1_install(archive_dir, path): + # backward compat: using setuptools or plain-distutils + cmd = '%s setup.py install --root=%s --record=%s' + setup_py = os.path.join(archive_dir, 'setup.py') + if 'setuptools' in open(setup_py).read(): + cmd += ' --single-version-externally-managed' + + # how to place this file in the egg-info dir + # for non-distutils2 projects ? + record_file = os.path.join(archive_dir, 'RECORD') + os.system(cmd % (sys.executable, path, record_file)) + if not os.path.exists(record_file): + raise ValueError('Failed to install.') + return open(record_file).read().split('\n') + + +def _run_d2_install(archive_dir, path): + # using our own install command + raise NotImplementedError() + + +def _install_dist(dist, path): + """Install a distribution into a path. + + This: + + * unpack the distribution + * copy the files in "path" + * determine if the distribution is distutils2 or distutils1. + """ + where = dist.unpack(archive) + + # get into the dir + archive_dir = None + for item in os.listdir(where): + fullpath = os.path.join(where, item) + if os.path.isdir(fullpath): + archive_dir = fullpath + break + + if archive_dir is None: + raise ValueError('Cannot locate the unpacked archive') + + # install + old_dir = os.getcwd() + os.chdir(archive_dir) + try: + # distutils2 or distutils1 ? + if 'setup.py' in os.listdir(archive_dir): + return _run_d1_install(archive_dir, path) + else: + return _run_d2_install(archive_dir, path) + finally: + os.chdir(old_dir) def install_dists(dists, path=None): @@ -65,19 +123,23 @@ Return a list of installed files. :param dists: distributions to install - :param path: base path to install distribution on + :param path: base path to install distribution in """ if not path: path = mkdtemp() installed_dists, installed_files = [], [] for d in dists: + logger.info('Installing %s %s' % (d.name, d.version)) try: - installed_files.extend(d.install(path)) + installed_files.extend(_install_dist(d, path)) installed_dists.append(d) except Exception, e : + logger.info('Failed. %s' % str(e)) + + # reverting for d in installed_dists: - d.uninstall() + uninstall(d) raise e return installed_files @@ -123,16 +185,26 @@ try: if install: installed_files = install_dists(install, install_path) # install to tmp first - for files in temp_files.itervalues(): - for old, new in files: - os.remove(new) - except Exception,e: + except: # if an error occurs, put back the files in the good place. - for files in temp_files.itervalues(): + for files in temp_files.values(): for old, new in files: shutil.move(new, old) + # now re-raising + raise + + # we can remove them for good + for files in temp_files.values(): + for old, new in files: + os.remove(new) + + +def _get_setuptools_deps(release): + # NotImplementedError + pass + def get_infos(requirements, index=None, installed=None, prefer_final=True): """Return the informations on what's going to be installed and upgraded. @@ -154,44 +226,74 @@ Conflict contains all the conflicting distributions, if there is a conflict. """ + if not installed: + logger.info('Reading installed distributions') + installed = get_distributions(use_egg_info=True) + + infos = {'install': [], 'remove': [], 'conflict': []} + # Is a compatible version of the project is already installed ? 
+ predicate = get_version_predicate(requirements) + found = False + installed = list(installed) + + # check that the project isnt already installed + for installed_project in installed: + # is it a compatible project ? + if predicate.name.lower() != installed_project.name.lower(): + continue + found = True + logger.info('Found %s %s' % (installed_project.name, + installed_project.version)) + + # if we already have something installed, check it matches the + # requirements + if predicate.match(installed_project.version): + return infos + break + + if not found: + logger.info('Project not installed.') if not index: index = wrapper.ClientWrapper() - if not installed: - installed = get_distributions(use_egg_info=True) - # Get all the releases that match the requirements try: releases = index.get_releases(requirements) except (ReleaseNotFound, ProjectNotFound), e: raise InstallationException('Release not found: "%s"' % requirements) - + # Pick up a release, and try to get the dependency tree release = releases.get_last(requirements, prefer_final=prefer_final) - # Iter since we found something without conflicts + if release is None: + logger.info('Could not find a matching project') + return infos + + # this works for Metadata 1.2 metadata = release.fetch_metadata() - # Get the distributions already_installed on the system - # and add the one we want to install + # for earlier, we need to build setuptools deps if any + if 'requires_dist' not in metadata: + deps = _get_setuptools_deps(release) + else: + deps = metadata['requires_dist'] distributions = itertools.chain(installed, [release]) depgraph = generate_graph(distributions) # Store all the already_installed packages in a list, in case of rollback. - infos = {'install': [], 'remove': [], 'conflict': []} - # Get what the missing deps are - for dists in depgraph.missing.itervalues(): - if dists: - logging.info("missing dependencies found, installing them") - # we have missing deps - for dist in dists: - _update_infos(infos, get_infos(dist, index, installed)) + dists = depgraph.missing[release] + if dists: + logger.info("missing dependencies found, retrieving metadata") + # we have missing deps + for dist in dists: + _update_infos(infos, get_infos(dist, index, installed)) # Fill in the infos existing = [d for d in installed if d.name == release.name] + if existing: infos['remove'].append(existing[0]) infos['conflict'].extend(depgraph.reverse_list[existing[0]]) @@ -203,16 +305,46 @@ """extends the lists contained in the `info` dict with those contained in the `new_info` one """ - for key, value in infos.iteritems(): + for key, value in infos.items(): if key in new_infos: infos[key].extend(new_infos[key]) +def remove(project_name): + """Removes a single project from the installation""" + pass + + + + def main(**attrs): if 'script_args' not in attrs: import sys attrs['requirements'] = sys.argv[1] get_infos(**attrs) + +def install(project): + logger.info('Getting information about "%s".' % project) + try: + info = get_infos(project) + except InstallationException: + logger.info('Cound not find "%s".' 
% project) + return + + if info['install'] == []: + logger.info('Nothing to install.') + return + + install_path = get_config_var('base') + try: + install_from_infos(info['install'], info['remove'], info['conflict'], + install_path=install_path) + + except InstallationConflict, e: + projects = ['%s %s' % (p.name, p.version) for p in e.args[0]] + logger.info('"%s" conflicts with "%s"' % (project, ','.join(projects))) + + if __name__ == '__main__': main() diff --git a/distutils2/markers.py b/distutils2/markers.py new file mode 100644 --- /dev/null +++ b/distutils2/markers.py @@ -0,0 +1,194 @@ +""" Micro-language for PEP 345 environment markers +""" +import sys +import platform +import os +from tokenize import tokenize, NAME, OP, STRING, ENDMARKER +from StringIO import StringIO + +__all__ = ['interpret'] + + +# allowed operators +_OPERATORS = {'==': lambda x, y: x == y, + '!=': lambda x, y: x != y, + '>': lambda x, y: x > y, + '>=': lambda x, y: x >= y, + '<': lambda x, y: x < y, + '<=': lambda x, y: x <= y, + 'in': lambda x, y: x in y, + 'not in': lambda x, y: x not in y} + + +def _operate(operation, x, y): + return _OPERATORS[operation](x, y) + + +# restricted set of variables +_VARS = {'sys.platform': sys.platform, + 'python_version': sys.version[:3], + 'python_full_version': sys.version.split(' ', 1)[0], + 'os.name': os.name, + 'platform.version': platform.version(), + 'platform.machine': platform.machine()} + + +class _Operation(object): + + def __init__(self, execution_context=None): + self.left = None + self.op = None + self.right = None + if execution_context is None: + execution_context = {} + self.execution_context = execution_context + + def _get_var(self, name): + if name in self.execution_context: + return self.execution_context[name] + return _VARS[name] + + def __repr__(self): + return '%s %s %s' % (self.left, self.op, self.right) + + def _is_string(self, value): + if value is None or len(value) < 2: + return False + for delimiter in '"\'': + if value[0] == value[-1] == delimiter: + return True + return False + + def _is_name(self, value): + return value in _VARS + + def _convert(self, value): + if value in _VARS: + return self._get_var(value) + return value.strip('"\'') + + def _check_name(self, value): + if value not in _VARS: + raise NameError(value) + + def _nonsense_op(self): + msg = 'This operation is not supported : "%s"' % self + raise SyntaxError(msg) + + def __call__(self): + # make sure we do something useful + if self._is_string(self.left): + if self._is_string(self.right): + self._nonsense_op() + self._check_name(self.right) + else: + if not self._is_string(self.right): + self._nonsense_op() + self._check_name(self.left) + + if self.op not in _OPERATORS: + raise TypeError('Operator not supported "%s"' % self.op) + + left = self._convert(self.left) + right = self._convert(self.right) + return _operate(self.op, left, right) + + +class _OR(object): + def __init__(self, left, right=None): + self.left = left + self.right = right + + def filled(self): + return self.right is not None + + def __repr__(self): + return 'OR(%r, %r)' % (self.left, self.right) + + def __call__(self): + return self.left() or self.right() + + +class _AND(object): + def __init__(self, left, right=None): + self.left = left + self.right = right + + def filled(self): + return self.right is not None + + def __repr__(self): + return 'AND(%r, %r)' % (self.left, self.right) + + def __call__(self): + return self.left() and self.right() + + +class _CHAIN(object): + + def __init__(self, 
execution_context=None): + self.ops = [] + self.op_starting = True + self.execution_context = execution_context + + def eat(self, toktype, tokval, rowcol, line, logical_line): + if toktype not in (NAME, OP, STRING, ENDMARKER): + raise SyntaxError('Type not supported "%s"' % tokval) + + if self.op_starting: + op = _Operation(self.execution_context) + if len(self.ops) > 0: + last = self.ops[-1] + if isinstance(last, (_OR, _AND)) and not last.filled(): + last.right = op + else: + self.ops.append(op) + else: + self.ops.append(op) + self.op_starting = False + else: + op = self.ops[-1] + + if (toktype == ENDMARKER or + (toktype == NAME and tokval in ('and', 'or'))): + if toktype == NAME and tokval == 'and': + self.ops.append(_AND(self.ops.pop())) + elif toktype == NAME and tokval == 'or': + self.ops.append(_OR(self.ops.pop())) + self.op_starting = True + return + + if isinstance(op, (_OR, _AND)) and op.right is not None: + op = op.right + + if ((toktype in (NAME, STRING) and tokval not in ('in', 'not')) + or (toktype == OP and tokval == '.')): + if op.op is None: + if op.left is None: + op.left = tokval + else: + op.left += tokval + else: + if op.right is None: + op.right = tokval + else: + op.right += tokval + elif toktype == OP or tokval in ('in', 'not'): + if tokval == 'in' and op.op == 'not': + op.op = 'not in' + else: + op.op = tokval + + def result(self): + for op in self.ops: + if not op(): + return False + return True + + +def interpret(marker, execution_context=None): + """Interpret a marker and return a result depending on environment.""" + marker = marker.strip() + operations = _CHAIN(execution_context) + tokenize(StringIO(marker).readline, operations.eat) + return operations.result() diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -5,13 +5,12 @@ import os import sys -import platform import re from StringIO import StringIO from email import message_from_file -from tokenize import tokenize, NAME, OP, STRING, ENDMARKER from distutils2 import logger +from distutils2.markers import interpret from distutils2.version import (is_valid_predicate, is_valid_version, is_valid_versions) from distutils2.errors import (MetadataMissingError, @@ -78,13 +77,13 @@ 'Obsoletes-Dist', 'Requires-External', 'Maintainer', 'Maintainer-email', 'Project-URL') +_345_REQUIRED = ('Name', 'Version') + _ALL_FIELDS = set() _ALL_FIELDS.update(_241_FIELDS) _ALL_FIELDS.update(_314_FIELDS) _ALL_FIELDS.update(_345_FIELDS) -_345_REQUIRED = ('Name', 'Version') - def _version2fieldlist(version): if version == '1.0': return _241_FIELDS @@ -174,14 +173,19 @@ _LISTFIELDS = ('Platform', 'Classifier', 'Obsoletes', 'Requires', 'Provides', 'Obsoletes-Dist', 'Provides-Dist', 'Requires-Dist', 'Requires-External', - 'Project-URL') + 'Project-URL', 'Supported-Platform') _LISTTUPLEFIELDS = ('Project-URL',) _ELEMENTSFIELD = ('Keywords',) _UNICODEFIELDS = ('Author', 'Maintainer', 'Summary', 'Description') -_MISSING = object() +class NoDefault(object): + """Marker object used for clean representation""" + def __repr__(self): + return '' + +_MISSING = NoDefault() class DistributionMetadata(object): """The metadata of a release. 
@@ -202,6 +206,7 @@ self._fields = {} self.display_warnings = display_warnings self.version = None + self.requires_files = [] self.docutils_support = _HAS_DOCUTILS self.platform_dependent = platform_dependent self.execution_context = execution_context @@ -285,7 +290,7 @@ if not self.platform_dependent or ';' not in value: return True, value value, marker = value.split(';') - return _interpret(marker, self.execution_context), value + return interpret(marker, self.execution_context), value def _remove_line_prefix(self, value): return _LINE_PREFIX.sub('\n', value) @@ -294,13 +299,20 @@ # Public API # def get_fullname(self): + """Return the distribution name with version""" return '%s-%s' % (self['Name'], self['Version']) def is_metadata_field(self, name): + """return True if name is a valid metadata key""" name = self._convert_name(name) return name in _ALL_FIELDS + def is_multi_field(self, name): + name = self._convert_name(name) + return name in _LISTFIELDS + def read(self, filepath): + """Read the metadata values from a file path.""" self.read_file(open(filepath)) def read_file(self, fileob): @@ -454,7 +466,8 @@ return value def check(self, strict=False): - """Check if the metadata is compliant.""" + """Check if the metadata is compliant. If strict is False then raise if + no Name or Version are provided""" # XXX should check the versions (if the file was loaded) missing, warnings = [], [] @@ -494,198 +507,13 @@ return missing, warnings def keys(self): + """Dict like api""" return _version2fieldlist(self.version) def values(self): + """Dict like api""" return [self[key] for key in self.keys()] def items(self): + """Dict like api""" return [(key, self[key]) for key in self.keys()] - - -# -# micro-language for PEP 345 environment markers -# - -# allowed operators -_OPERATORS = {'==': lambda x, y: x == y, - '!=': lambda x, y: x != y, - '>': lambda x, y: x > y, - '>=': lambda x, y: x >= y, - '<': lambda x, y: x < y, - '<=': lambda x, y: x <= y, - 'in': lambda x, y: x in y, - 'not in': lambda x, y: x not in y} - - -def _operate(operation, x, y): - return _OPERATORS[operation](x, y) - -# restricted set of variables -_VARS = {'sys.platform': sys.platform, - 'python_version': sys.version[:3], - 'python_full_version': sys.version.split(' ', 1)[0], - 'os.name': os.name, - 'platform.version': platform.version(), - 'platform.machine': platform.machine()} - - -class _Operation(object): - - def __init__(self, execution_context=None): - self.left = None - self.op = None - self.right = None - if execution_context is None: - execution_context = {} - self.execution_context = execution_context - - def _get_var(self, name): - if name in self.execution_context: - return self.execution_context[name] - return _VARS[name] - - def __repr__(self): - return '%s %s %s' % (self.left, self.op, self.right) - - def _is_string(self, value): - if value is None or len(value) < 2: - return False - for delimiter in '"\'': - if value[0] == value[-1] == delimiter: - return True - return False - - def _is_name(self, value): - return value in _VARS - - def _convert(self, value): - if value in _VARS: - return self._get_var(value) - return value.strip('"\'') - - def _check_name(self, value): - if value not in _VARS: - raise NameError(value) - - def _nonsense_op(self): - msg = 'This operation is not supported : "%s"' % self - raise SyntaxError(msg) - - def __call__(self): - # make sure we do something useful - if self._is_string(self.left): - if self._is_string(self.right): - self._nonsense_op() - self._check_name(self.right) - 
else: - if not self._is_string(self.right): - self._nonsense_op() - self._check_name(self.left) - - if self.op not in _OPERATORS: - raise TypeError('Operator not supported "%s"' % self.op) - - left = self._convert(self.left) - right = self._convert(self.right) - return _operate(self.op, left, right) - - -class _OR(object): - def __init__(self, left, right=None): - self.left = left - self.right = right - - def filled(self): - return self.right is not None - - def __repr__(self): - return 'OR(%r, %r)' % (self.left, self.right) - - def __call__(self): - return self.left() or self.right() - - -class _AND(object): - def __init__(self, left, right=None): - self.left = left - self.right = right - - def filled(self): - return self.right is not None - - def __repr__(self): - return 'AND(%r, %r)' % (self.left, self.right) - - def __call__(self): - return self.left() and self.right() - - -class _CHAIN(object): - - def __init__(self, execution_context=None): - self.ops = [] - self.op_starting = True - self.execution_context = execution_context - - def eat(self, toktype, tokval, rowcol, line, logical_line): - if toktype not in (NAME, OP, STRING, ENDMARKER): - raise SyntaxError('Type not supported "%s"' % tokval) - - if self.op_starting: - op = _Operation(self.execution_context) - if len(self.ops) > 0: - last = self.ops[-1] - if isinstance(last, (_OR, _AND)) and not last.filled(): - last.right = op - else: - self.ops.append(op) - else: - self.ops.append(op) - self.op_starting = False - else: - op = self.ops[-1] - - if (toktype == ENDMARKER or - (toktype == NAME and tokval in ('and', 'or'))): - if toktype == NAME and tokval == 'and': - self.ops.append(_AND(self.ops.pop())) - elif toktype == NAME and tokval == 'or': - self.ops.append(_OR(self.ops.pop())) - self.op_starting = True - return - - if isinstance(op, (_OR, _AND)) and op.right is not None: - op = op.right - - if ((toktype in (NAME, STRING) and tokval not in ('in', 'not')) - or (toktype == OP and tokval == '.')): - if op.op is None: - if op.left is None: - op.left = tokval - else: - op.left += tokval - else: - if op.right is None: - op.right = tokval - else: - op.right += tokval - elif toktype == OP or tokval in ('in', 'not'): - if tokval == 'in' and op.op == 'not': - op.op = 'not in' - else: - op.op = tokval - - def result(self): - for op in self.ops: - if not op(): - return False - return True - - -def _interpret(marker, execution_context=None): - """Interpret a marker and return a result depending on environment.""" - marker = marker.strip() - operations = _CHAIN(execution_context) - tokenize(StringIO(marker).readline, operations.eat) - return operations.result() diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -1,7 +1,9 @@ import os import sys from optparse import OptionParser +import logging +from distutils2 import logger from distutils2.util import grok_environment_error from distutils2.errors import (DistutilsSetupError, DistutilsArgError, DistutilsError, CCompilerError) @@ -9,6 +11,7 @@ from distutils2 import __version__ from distutils2._backport.pkgutil import get_distributions, get_distribution from distutils2.depgraph import generate_graph +from distutils2.install import install # This is a barebones help message generated displayed when the user # runs the setup script with no arguments at all. 
More useful help @@ -109,13 +112,23 @@ except (DistutilsError, CCompilerError), msg: + raise raise SystemExit, "error: " + str(msg) return dist +def _set_logger(): + logger.setLevel(logging.INFO) + sth = logging.StreamHandler(sys.stderr) + sth.setLevel(logging.INFO) + logger.addHandler(sth) + logger.propagate = 0 + + def main(): """Main entry point for Distutils2""" + _set_logger() parser = OptionParser() parser.disable_interspersed_args() parser.usage = '%prog [options] cmd1 cmd2 ..' @@ -124,6 +137,10 @@ action="store_true", dest="version", default=False, help="Prints out the version of Distutils2 and exits.") + parser.add_option("-m", "--metadata", + action="append", dest="metadata", default=[], + help="List METADATA metadata or 'all' for all metadatas.") + parser.add_option("-s", "--search", action="store", dest="search", default=None, help="Search for installed distributions.") @@ -136,11 +153,44 @@ action="store_true", dest="fgraph", default=False, help="Display the full graph for installed distributions.") + parser.add_option("-i", "--install", + action="store", dest="install", + help="Install a project.") + + parser.add_option("-r", "--remove", + action="store", dest="remove", + help="Remove a project.") + options, args = parser.parse_args() if options.version: print('Distutils2 %s' % __version__) # sys.exit(0) + if len(options.metadata): + from distutils2.dist import Distribution + dist = Distribution() + dist.parse_config_files() + metadata = dist.metadata + + if 'all' in options.metadata: + keys = metadata.keys() + else: + keys = options.metadata + if len(keys) == 1: + print metadata[keys[0]] + sys.exit(0) + + for key in keys: + if key in metadata: + print(metadata._convert_name(key)+':') + value = metadata[key] + if isinstance(value, list): + for v in value: + print(' '+v) + else: + print(' '+value.replace('\n', '\n ')) + sys.exit(0) + if options.search is not None: search = options.search.lower() for dist in get_distributions(use_egg_info=True): @@ -169,6 +219,10 @@ print(graph) sys.exit(0) + if options.install is not None: + install(options.install) + sys.exit(0) + if len(args) == 0: parser.print_help() sys.exit(0) diff --git a/distutils2/tests/pypi_server.py b/distutils2/tests/pypi_server.py --- a/distutils2/tests/pypi_server.py +++ b/distutils2/tests/pypi_server.py @@ -375,6 +375,7 @@ def __init__(self, dists=[]): self._dists = dists + self._search_result = [] def add_distributions(self, dists): for dist in dists: diff --git a/distutils2/tests/support.py b/distutils2/tests/support.py --- a/distutils2/tests/support.py +++ b/distutils2/tests/support.py @@ -17,10 +17,11 @@ super(SomeTestCase, self).setUp() ... # other setup code -Read each class' docstring to see its purpose and usage. +Also provided is a DummyCommand class, useful to mock commands in the +tests of another command that needs them, a create_distribution function +and a skip_unless_symlink decorator. -Also provided is a DummyCommand class, useful to mock commands in the -tests of another command that needs them (see docstring). +Each class or function has a docstring to explain its purpose and usage. 
""" import os @@ -35,7 +36,8 @@ from distutils2.tests import unittest __all__ = ['LoggingCatcher', 'WarningsCatcher', 'TempdirManager', - 'EnvironGuard', 'DummyCommand', 'unittest'] + 'EnvironGuard', 'DummyCommand', 'unittest', 'create_distribution', + 'skip_unless_symlink'] class LoggingCatcher(object): @@ -135,7 +137,7 @@ finally: f.close() - def create_dist(self, pkg_name='foo', **kw): + def create_dist(self, **kw): """Create a stub distribution object and files. This function creates a Distribution instance (use keyword arguments @@ -143,17 +145,19 @@ (currently an empty directory). It returns the path to the directory and the Distribution instance. - You can use TempdirManager.write_file to write any file in that + You can use self.write_file to write any file in that directory, e.g. setup scripts or Python modules. """ # Late import so that third parties can import support without # loading a ton of distutils2 modules in memory. from distutils2.dist import Distribution + if 'name' not in kw: + kw['name'] = 'foo' tmp_dir = self.mkdtemp() - pkg_dir = os.path.join(tmp_dir, pkg_name) - os.mkdir(pkg_dir) + project_dir = os.path.join(tmp_dir, kw['name']) + os.mkdir(project_dir) dist = Distribution(attrs=kw) - return pkg_dir, dist + return project_dir, dist class EnvironGuard(object): @@ -211,3 +215,9 @@ d.parse_command_line() return d + +try: + from test.test_support import skip_unless_symlink +except ImportError: + skip_unless_symlink = unittest.skip( + 'requires test.test_support.skip_unless_symlink') diff --git a/distutils2/tests/test_command_build_ext.py b/distutils2/tests/test_command_build_ext.py --- a/distutils2/tests/test_command_build_ext.py +++ b/distutils2/tests/test_command_build_ext.py @@ -289,7 +289,7 @@ # inplace = 0, cmd.package = 'bar' build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {'': 'bar'} + build_py.package_dir = 'bar' path = cmd.get_ext_fullpath('foo') # checking that the last directory is the build_dir path = os.path.split(path)[0] @@ -318,7 +318,7 @@ dist = Distribution() cmd = build_ext(dist) cmd.inplace = 1 - cmd.distribution.package_dir = {'': 'src'} + cmd.distribution.package_dir = 'src' cmd.distribution.packages = ['lxml', 'lxml.html'] curdir = os.getcwd() wanted = os.path.join(curdir, 'src', 'lxml', 'etree' + ext) @@ -334,7 +334,7 @@ # building twisted.runner.portmap not inplace build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {} + build_py.package_dir = None cmd.distribution.packages = ['twisted', 'twisted.runner.portmap'] path = cmd.get_ext_fullpath('twisted.runner.portmap') wanted = os.path.join(curdir, 'tmpdir', 'twisted', 'runner', diff --git a/distutils2/tests/test_command_build_py.py b/distutils2/tests/test_command_build_py.py --- a/distutils2/tests/test_command_build_py.py +++ b/distutils2/tests/test_command_build_py.py @@ -17,12 +17,14 @@ def test_package_data(self): sources = self.mkdtemp() - f = open(os.path.join(sources, "__init__.py"), "w") + pkg_dir = os.path.join(sources, 'pkg') + os.mkdir(pkg_dir) + f = open(os.path.join(pkg_dir, "__init__.py"), "w") try: f.write("# Pretend this is a package.") finally: f.close() - f = open(os.path.join(sources, "README.txt"), "w") + f = open(os.path.join(pkg_dir, "README.txt"), "w") try: f.write("Info about this package") finally: @@ -31,8 +33,9 @@ destination = self.mkdtemp() dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": sources}}) + "package_dir": sources}) # script_name need not exist, it just need to be initialized + dist.script_name 
= os.path.join(sources, "setup.py") dist.command_obj["build"] = support.DummyCommand( force=0, @@ -42,7 +45,7 @@ use_2to3=False) dist.packages = ["pkg"] dist.package_data = {"pkg": ["README.txt"]} - dist.package_dir = {"pkg": sources} + dist.package_dir = sources cmd = build_py(dist) cmd.compile = 1 @@ -68,19 +71,20 @@ # create the distribution files. sources = self.mkdtemp() - open(os.path.join(sources, "__init__.py"), "w").close() - - testdir = os.path.join(sources, "doc") + pkg = os.path.join(sources, 'pkg') + os.mkdir(pkg) + open(os.path.join(pkg, "__init__.py"), "w").close() + testdir = os.path.join(pkg, "doc") os.mkdir(testdir) open(os.path.join(testdir, "testfile"), "w").close() os.chdir(sources) old_stdout = sys.stdout - sys.stdout = StringIO.StringIO() + #sys.stdout = StringIO.StringIO() try: dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": ""}, + "package_dir": sources, "package_data": {"pkg": ["doc/*"]}}) # script_name need not exist, it just need to be initialized dist.script_name = os.path.join(sources, "setup.py") @@ -89,7 +93,7 @@ try: dist.run_commands() - except DistutilsFileError: + except DistutilsFileError, e: self.fail("failed package_data test when package_dir is ''") finally: # Restore state. diff --git a/distutils2/tests/test_command_install_dist.py b/distutils2/tests/test_command_install_dist.py --- a/distutils2/tests/test_command_install_dist.py +++ b/distutils2/tests/test_command_install_dist.py @@ -180,8 +180,8 @@ cmd.user = 'user' self.assertRaises(DistutilsOptionError, cmd.finalize_options) - def test_record(self): - + def test_old_record(self): + # test pre-PEP 376 --record option (outside dist-info dir) install_dir = self.mkdtemp() pkgdir, dist = self.create_dist() @@ -189,11 +189,11 @@ cmd = install_dist(dist) dist.command_obj['install_dist'] = cmd cmd.root = install_dir - cmd.record = os.path.join(pkgdir, 'RECORD') + cmd.record = os.path.join(pkgdir, 'filelist') cmd.ensure_finalized() cmd.run() - # let's check the RECORD file was created with four + # let's check the record file was created with four # lines, one for each .dist-info entry: METADATA, # INSTALLER, REQUSTED, RECORD f = open(cmd.record) diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -5,6 +5,8 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest +from distutils2.command.sdist import sdist +from distutils2.errors import DistutilsFileError SETUP_CFG = """ @@ -16,7 +18,7 @@ maintainer = ??ric Araujo maintainer_email = merwok at netwok.org summary = A sample project demonstrating distutils2 packaging -description-file = README +description-file = %(description-file)s keywords = distutils2, packaging, sample project classifier = @@ -47,9 +49,11 @@ Fork in progress, http://bitbucket.org/Merwok/sample-distutils2-project [files] +packages_root = src + packages = one - src:two - src2:three + two + three modules = haven @@ -66,6 +70,8 @@ config = cfg/data.cfg /etc/init.d = init-script +extra_files = %(extra-files)s + # Replaces MANIFEST.in sdist_extra = include THANKS HACKING @@ -130,22 +136,33 @@ self.addCleanup(setattr, sys, 'stderr', sys.stderr) self.addCleanup(os.chdir, os.getcwd()) - def test_config(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) - self.write_file('README', 'yeah') + def write_setup(self, kwargs=None): + opts = {'description-file': 'README', 'extra-files':''} + if 
kwargs: + opts.update(kwargs) + self.write_file('setup.cfg', SETUP_CFG % opts) - # try to load the metadata now + + def run_setup(self, *args): + # run setup with args sys.stdout = StringIO() - sys.argv[:] = ['setup.py', '--version'] + sys.argv[:] = [''] + list(args) old_sys = sys.argv[:] - try: from distutils2.run import commands_main dist = commands_main() finally: sys.argv[:] = old_sys + return dist + + def test_config(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup() + self.write_file('README', 'yeah') + + # try to load the metadata now + dist = self.run_setup('--version') # sanity check self.assertEqual(sys.stdout.getvalue(), '0.6.4.dev1' + os.linesep) @@ -184,7 +201,6 @@ 'http://bitbucket.org/Merwok/sample-distutils2-project')] self.assertEqual(dist.metadata['Project-Url'], urls) - self.assertEqual(dist.packages, ['one', 'two', 'three']) self.assertEqual(dist.py_modules, ['haven']) self.assertEqual(dist.package_data, {'cheese': 'data/templates/*'}) @@ -192,7 +208,8 @@ [('bitmaps ', ['bm/b1.gif', 'bm/b2.gif']), ('config ', ['cfg/data.cfg']), ('/etc/init.d ', ['init-script'])]) - self.assertEqual(dist.package_dir['two'], 'src') + + self.assertEqual(dist.package_dir, 'src') # Make sure we get the foo command loaded. We use a string comparison # instead of assertIsInstance because the class is not the same when @@ -213,10 +230,94 @@ d = new_compiler(compiler='d') self.assertEqual(d.description, 'D Compiler') + + def test_multiple_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog2') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_multiline_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README\n CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata['description'], 'yeah\nchangelog') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_metadata_requires_description_files_missing(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--version') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + def test_metadata_requires_description_files(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2', + 'extra-files':'\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in 
('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--description') + self.assertIn('yeah\nyeah\n', sys.stdout.getvalue()) + + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + self.write_setup({'description-file': 'README\n README2', + 'extra-files': '\n README2\n README'}) + dist = self.run_setup('--description') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + cmd.make_distribution() + self.assertIn('README\nREADME2\n', open('MANIFEST').read()) + def test_sub_commands(self): tempdir = self.mkdtemp() os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) + self.write_setup() self.write_file('README', 'yeah') self.write_file('haven.py', '#') self.write_file('script1.py', '#') @@ -224,20 +325,15 @@ self.write_file(os.path.join('scripts', 'find-coconuts'), '#') os.mkdir('bin') self.write_file(os.path.join('bin', 'taunt'), '#') + os.mkdir('src') - for pkg in ('one', 'src', 'src2'): + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) os.mkdir(pkg) self.write_file(os.path.join(pkg, '__init__.py'), '#') # try to run the install command to see if foo is called - sys.stdout = sys.stderr = StringIO() - sys.argv[:] = ['', 'install_dist'] - old_sys = sys.argv[:] - try: - from distutils2.run import main - dist = main() - finally: - sys.argv[:] = old_sys + dist = self.run_setup('install_dist') self.assertEqual(dist.foo_was_here, 1) diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -29,27 +29,16 @@ class ToInstallDist(object): """Distribution that will be installed""" - def __init__(self, raise_error=False, files=False): - self._raise_error = raise_error + def __init__(self, files=False): self._files = files - self.install_called = False - self.install_called_with = {} self.uninstall_called = False self._real_files = [] + self.name = "fake" + self.version = "fake" if files: for f in range(0,3): self._real_files.append(mkstemp()) - def install(self, *args): - self.install_called = True - self.install_called_with = args - if self._raise_error: - raise Exception('Oops !') - return ['/path/to/foo', '/path/to/bar'] - - def uninstall(self, **args): - self.uninstall_called = True - def get_installed_files(self, **args): if self._files: return [f[1] for f in self._real_files] @@ -58,7 +47,49 @@ return self.get_installed_files() +class MagicMock(object): + def __init__(self, return_value=None, raise_exception=False): + self.called = False + self._times_called = 0 + self._called_with = [] + self._return_value = return_value + self._raise = raise_exception + + def __call__(self, *args, **kwargs): + self.called = True + self._times_called = self._times_called + 1 + self._called_with.append((args, kwargs)) + iterable = hasattr(self._raise, '__iter__') + if self._raise: + if ((not iterable and self._raise) + or self._raise[self._times_called - 1]): + raise Exception + return self._return_value + + def called_with(self, *args, **kwargs): + return (args, kwargs) in self._called_with + + +def patch(parent, to_patch): + """monkey match a module""" + def wrapper(func): + print func + print dir(func) + old_func = getattr(parent, to_patch) + def wrapped(*args, **kwargs): + parent.__dict__[to_patch] = MagicMock() + try: + out = func(*args, **kwargs) + finally: + setattr(parent, 
to_patch, old_func) + return out + return wrapped + return wrapper + + def get_installed_dists(dists): + """Return a list of fake installed dists. + The list is name, version, deps""" objects = [] for (name, version, deps) in dists: objects.append(InstalledDist(name, version, deps)) @@ -69,6 +100,12 @@ def _get_client(self, server, *args, **kwargs): return Client(server.full_address, *args, **kwargs) + def _patch_run_install(self): + """Patch run install""" + + def _unpatch_run_install(self): + """Unpatch run install for d2 and d1""" + def _get_results(self, output): """return a list of results""" installed = [(o.name, '%s' % o.version) for o in output['install']] @@ -150,6 +187,8 @@ # Tests that conflicts are detected client = self._get_client(server) archive_path = '%s/distribution.tar.gz' % server.full_address + + # choxie depends on towel-stuff, which depends on bacon. server.xmlrpc.set_distributions([ {'name':'choxie', 'version': '2.0.0.9', @@ -164,7 +203,9 @@ 'requires_dist': [], 'url': archive_path}, ]) - already_installed = [('bacon', '0.1', []), + + # name, version, deps. + already_installed = [('bacon', '0.1', []), ('chicken', '1.1', ['bacon (0.1)'])] output = install.get_infos("choxie", index=client, installed= get_installed_dists(already_installed)) @@ -221,23 +262,39 @@ # if one of the distribution installation fails, call uninstall on all # installed distributions. - d1 = ToInstallDist() - d2 = ToInstallDist(raise_error=True) - self.assertRaises(Exception, install.install_dists, [d1, d2]) - for dist in (d1, d2): - self.assertTrue(dist.install_called) - self.assertTrue(d1.uninstall_called) - self.assertFalse(d2.uninstall_called) + old_install_dist = install._install_dist + old_uninstall = getattr(install, 'uninstall', None) + + install._install_dist = MagicMock(return_value=[], + raise_exception=(False, True)) + install.uninstall = MagicMock() + try: + d1 = ToInstallDist() + d2 = ToInstallDist() + path = self.mkdtemp() + self.assertRaises(Exception, install.install_dists, [d1, d2], path) + self.assertTrue(install._install_dist.called_with(d1, path)) + self.assertTrue(install.uninstall.called) + finally: + install._install_dist = old_install_dist + install.uninstall = old_uninstall + def test_install_dists_success(self): - # test that the install method is called on each of the distributions. - d1 = ToInstallDist() - d2 = ToInstallDist() - install.install_dists([d1, d2]) - for dist in (d1, d2): - self.assertTrue(dist.install_called) - self.assertFalse(d1.uninstall_called) - self.assertFalse(d2.uninstall_called) + old_install_dist = install._install_dist + install._install_dist = MagicMock(return_value=[]) + try: + # test that the install method is called on each of the distributions. + d1 = ToInstallDist() + d2 = ToInstallDist() + + # should call install + path = self.mkdtemp() + install.install_dists([d1, d2], path) + for dist in (d1, d2): + self.assertTrue(install._install_dist.called_with(dist, path)) + finally: + install._install_dist = old_install_dist def test_install_from_infos_conflict(self): # assert conflicts raise an exception @@ -262,29 +319,46 @@ install.install_dists = old_install_dists def test_install_from_infos_remove_rollback(self): - # assert that if an error occurs, the removed files are restored. 
- remove = [] - for i in range(0,2): - remove.append(ToInstallDist(files=True, raise_error=True)) - to_install = [ToInstallDist(raise_error=True), - ToInstallDist()] + old_install_dist = install._install_dist + old_uninstall = getattr(install, 'uninstall', None) - install.install_from_infos(remove=remove, install=to_install) - # assert that the files are in the same place - # assert that the files have been removed - for dist in remove: - for f in dist.get_installed_files(): - self.assertTrue(os.path.exists(f)) + install._install_dist = MagicMock(return_value=[], + raise_exception=(False, True)) + install.uninstall = MagicMock() + try: + # assert that if an error occurs, the removed files are restored. + remove = [] + for i in range(0,2): + remove.append(ToInstallDist(files=True)) + to_install = [ToInstallDist(), ToInstallDist()] + + self.assertRaises(Exception, install.install_from_infos, + remove=remove, install=to_install) + # assert that the files are in the same place + # assert that the files have been removed + for dist in remove: + for f in dist.get_installed_files(): + self.assertTrue(os.path.exists(f)) + finally: + install.install_dist = old_install_dist + install.uninstall = old_uninstall + def test_install_from_infos_install_succes(self): - # assert that the distribution can be installed - install_path = "my_install_path" - to_install = [ToInstallDist(), ToInstallDist()] + old_install_dist = install._install_dist + install._install_dist = MagicMock([]) + try: + # assert that the distribution can be installed + install_path = "my_install_path" + to_install = [ToInstallDist(), ToInstallDist()] - install.install_from_infos(install=to_install, - install_path=install_path) - for dist in to_install: - self.assertEqual(dist.install_called_with, (install_path,)) + install.install_from_infos(install=to_install, + install_path=install_path) + for dist in to_install: + install._install_dist.called_with(install_path) + finally: + install._install_dist = old_install_dist + def test_suite(): suite = unittest.TestSuite() diff --git a/distutils2/tests/test_markers.py b/distutils2/tests/test_markers.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_markers.py @@ -0,0 +1,69 @@ +"""Tests for distutils.metadata.""" +import os +import sys +import platform +from StringIO import StringIO + +from distutils2.markers import interpret +from distutils2.tests import run_unittest, unittest +from distutils2.tests.support import LoggingCatcher, WarningsCatcher + + +class MarkersTestCase(LoggingCatcher, WarningsCatcher, + unittest.TestCase): + + def test_interpret(self): + sys_platform = sys.platform + version = sys.version.split()[0] + os_name = os.name + platform_version = platform.version() + platform_machine = platform.machine() + + self.assertTrue(interpret("sys.platform == '%s'" % sys_platform)) + self.assertTrue(interpret( + "sys.platform == '%s' or python_version == '2.4'" % sys_platform)) + self.assertTrue(interpret( + "sys.platform == '%s' and python_full_version == '%s'" % + (sys_platform, version))) + self.assertTrue(interpret("'%s' == sys.platform" % sys_platform)) + self.assertTrue(interpret('os.name == "%s"' % os_name)) + self.assertTrue(interpret( + 'platform.version == "%s" and platform.machine == "%s"' % + (platform_version, platform_machine))) + + # stuff that need to raise a syntax error + ops = ('os.name == os.name', 'os.name == 2', "'2' == '2'", + 'okpjonon', '', 'os.name ==', 'python_version == 2.4') + for op in ops: + self.assertRaises(SyntaxError, interpret, op) + + # 
combined operations + OP = 'os.name == "%s"' % os_name + AND = ' and ' + OR = ' or ' + self.assertTrue(interpret(OP + AND + OP)) + self.assertTrue(interpret(OP + AND + OP + AND + OP)) + self.assertTrue(interpret(OP + OR + OP)) + self.assertTrue(interpret(OP + OR + OP + OR + OP)) + + # other operators + self.assertTrue(interpret("os.name != 'buuuu'")) + self.assertTrue(interpret("python_version > '1.0'")) + self.assertTrue(interpret("python_version < '5.0'")) + self.assertTrue(interpret("python_version <= '5.0'")) + self.assertTrue(interpret("python_version >= '1.0'")) + self.assertTrue(interpret("'%s' in os.name" % os_name)) + self.assertTrue(interpret("'buuuu' not in os.name")) + self.assertTrue(interpret( + "'buuuu' not in os.name and '%s' in os.name" % os_name)) + + # execution context + self.assertTrue(interpret('python_version == "0.1"', + {'python_version': '0.1'})) + + +def test_suite(): + return unittest.makeSuite(MarkersTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -1,10 +1,10 @@ -"""Tests for distutils.command.bdist.""" +"""Tests for distutils.metadata.""" import os import sys import platform from StringIO import StringIO -from distutils2.metadata import (DistributionMetadata, _interpret, +from distutils2.metadata import (DistributionMetadata, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -46,55 +46,6 @@ self.assertRaises(TypeError, DistributionMetadata, PKG_INFO, mapping=m, fileobj=fp) - def test_interpret(self): - sys_platform = sys.platform - version = sys.version.split()[0] - os_name = os.name - platform_version = platform.version() - platform_machine = platform.machine() - - self.assertTrue(_interpret("sys.platform == '%s'" % sys_platform)) - self.assertTrue(_interpret( - "sys.platform == '%s' or python_version == '2.4'" % sys_platform)) - self.assertTrue(_interpret( - "sys.platform == '%s' and python_full_version == '%s'" % - (sys_platform, version))) - self.assertTrue(_interpret("'%s' == sys.platform" % sys_platform)) - self.assertTrue(_interpret('os.name == "%s"' % os_name)) - self.assertTrue(_interpret( - 'platform.version == "%s" and platform.machine == "%s"' % - (platform_version, platform_machine))) - - # stuff that need to raise a syntax error - ops = ('os.name == os.name', 'os.name == 2', "'2' == '2'", - 'okpjonon', '', 'os.name ==', 'python_version == 2.4') - for op in ops: - self.assertRaises(SyntaxError, _interpret, op) - - # combined operations - OP = 'os.name == "%s"' % os_name - AND = ' and ' - OR = ' or ' - self.assertTrue(_interpret(OP + AND + OP)) - self.assertTrue(_interpret(OP + AND + OP + AND + OP)) - self.assertTrue(_interpret(OP + OR + OP)) - self.assertTrue(_interpret(OP + OR + OP + OR + OP)) - - # other operators - self.assertTrue(_interpret("os.name != 'buuuu'")) - self.assertTrue(_interpret("python_version > '1.0'")) - self.assertTrue(_interpret("python_version < '5.0'")) - self.assertTrue(_interpret("python_version <= '5.0'")) - self.assertTrue(_interpret("python_version >= '1.0'")) - self.assertTrue(_interpret("'%s' in os.name" % os_name)) - self.assertTrue(_interpret("'buuuu' not in os.name")) - self.assertTrue(_interpret( - "'buuuu' not in os.name and '%s' in os.name" % os_name)) - - # execution context - self.assertTrue(_interpret('python_version == 
"0.1"', - {'python_version': '0.1'})) - def test_metadata_read_write(self): PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') metadata = DistributionMetadata(PKG_INFO) diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -15,6 +15,7 @@ from copy import copy from fnmatch import fnmatchcase from ConfigParser import RawConfigParser +from inspect import getsource from distutils2.errors import (DistutilsPlatformError, DistutilsFileError, DistutilsByteCompileError, DistutilsExecError) @@ -674,83 +675,6 @@ return base, ext -def unzip_file(filename, location, flatten=True): - """Unzip the file (zip file located at filename) to the destination - location""" - if not os.path.exists(location): - os.makedirs(location) - zipfp = open(filename, 'rb') - try: - zip = zipfile.ZipFile(zipfp) - leading = has_leading_dir(zip.namelist()) and flatten - for name in zip.namelist(): - data = zip.read(name) - fn = name - if leading: - fn = split_leading_dir(name)[1] - fn = os.path.join(location, fn) - dir = os.path.dirname(fn) - if not os.path.exists(dir): - os.makedirs(dir) - if fn.endswith('/') or fn.endswith('\\'): - # A directory - if not os.path.exists(fn): - os.makedirs(fn) - else: - fp = open(fn, 'wb') - try: - fp.write(data) - finally: - fp.close() - finally: - zipfp.close() - - -def untar_file(filename, location): - """Untar the file (tar file located at filename) to the destination - location - """ - if not os.path.exists(location): - os.makedirs(location) - if filename.lower().endswith('.gz') or filename.lower().endswith('.tgz'): - mode = 'r:gz' - elif (filename.lower().endswith('.bz2') - or filename.lower().endswith('.tbz')): - mode = 'r:bz2' - elif filename.lower().endswith('.tar'): - mode = 'r' - else: - mode = 'r:*' - tar = tarfile.open(filename, mode) - try: - leading = has_leading_dir([member.name for member in tar.getmembers()]) - for member in tar.getmembers(): - fn = member.name - if leading: - fn = split_leading_dir(fn)[1] - path = os.path.join(location, fn) - if member.isdir(): - if not os.path.exists(path): - os.makedirs(path) - else: - try: - fp = tar.extractfile(member) - except (KeyError, AttributeError): - # Some corrupt tar files seem to produce this - # (specifically bad symlinks) - continue - if not os.path.exists(os.path.dirname(path)): - os.makedirs(os.path.dirname(path)) - destfp = open(path, 'wb') - try: - shutil.copyfileobj(fp, destfp) - finally: - destfp.close() - fp.close() - finally: - tar.close() - - def has_leading_dir(paths): """Returns true if all the paths have the same leading path name (i.e., everything is in one subdirectory in an archive)""" @@ -1127,3 +1051,117 @@ """ Issues a call to util.run_2to3. """ return run_2to3(files, doctests_only, self.fixer_names, self.options, self.explicit) + + +def generate_distutils_kwargs_from_setup_cfg(file='setup.cfg'): + """ Distutils2 to distutils1 compatibility util. + + This method uses an existing setup.cfg to generate a dictionnary of + keywords that can be used by distutils.core.setup(kwargs**). + + :param file: + The setup.cfg path. + :raises DistutilsFileError: + When the setup.cfg file is not found. + + """ + # We need to declare the following constants here so that it's easier to + # generate the setup.py afterwards, using inspect.getsource. 
+ D1_D2_SETUP_ARGS = { + # D1 name : (D2_section, D2_name) + "name" : ("metadata",), + "version" : ("metadata",), + "author" : ("metadata",), + "author_email" : ("metadata",), + "maintainer" : ("metadata",), + "maintainer_email" : ("metadata",), + "url" : ("metadata", "home_page"), + "description" : ("metadata", "summary"), + "long_description" : ("metadata", "description"), + "download-url" : ("metadata",), + "classifiers" : ("metadata", "classifier"), + "platforms" : ("metadata", "platform"), # Needs testing + "license" : ("metadata",), + "requires" : ("metadata", "requires_dist"), + "provides" : ("metadata", "provides_dist"), # Needs testing + "obsoletes" : ("metadata", "obsoletes_dist"), # Needs testing + + "packages" : ("files",), + "scripts" : ("files",), + "py_modules" : ("files", "modules"), # Needs testing + } + + MULTI_FIELDS = ("classifiers", + "requires", + "platforms", + "packages", + "scripts") + + def has_get_option(config, section, option): + if config.has_option(section, option): + return config.get(section, option) + elif config.has_option(section, option.replace('_', '-')): + return config.get(section, option.replace('_', '-')) + else: + return False + + # The method source code really starts here. + config = RawConfigParser() + if not os.path.exists(file): + raise DistutilsFileError("file '%s' does not exist" % + os.path.abspath(file)) + config.read(file) + + kwargs = {} + for arg in D1_D2_SETUP_ARGS: + if len(D1_D2_SETUP_ARGS[arg]) == 2: + # The distutils field name is different than distutils2's. + section, option = D1_D2_SETUP_ARGS[arg] + + elif len(D1_D2_SETUP_ARGS[arg]) == 1: + # The distutils field name is the same thant distutils2's. + section = D1_D2_SETUP_ARGS[arg][0] + option = arg + + in_cfg_value = has_get_option(config, section, option) + if not in_cfg_value: + # There is no such option in the setup.cfg + if arg == "long_description": + filename = has_get_option(config, section, "description_file") + print "We have a filename", filename + if filename: + in_cfg_value = open(filename).read() + else: + continue + + if arg in MULTI_FIELDS: + # Special behaviour when we have a multi line option + if "\n" in in_cfg_value: + in_cfg_value = in_cfg_value.strip().split('\n') + else: + in_cfg_value = list((in_cfg_value,)) + + kwargs[arg] = in_cfg_value + + return kwargs + + +def generate_distutils_setup_py(): + """ Generate a distutils compatible setup.py using an existing setup.cfg. + + :raises DistutilsFileError: + When a setup.py already exists. + """ + if os.path.exists("setup.py"): + raise DistutilsFileError("A pre existing setup.py file exists") + + handle = open("setup.py", "w") + handle.write("# Distutils script using distutils2 setup.cfg to call the\n") + handle.write("# distutils.core.setup() with the right args.\n\n\n") + handle.write("import os\n") + handle.write("from distutils.core import setup\n") + handle.write("from ConfigParser import RawConfigParser\n\n") + handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) + handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") + handle.write("setup(**kwargs)") + handle.close() diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -1055,6 +1055,13 @@ Create a file called *filename* and write *contents* (a sequence of strings without line terminators) to it. 
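A quick illustrative sketch (not part of the patch) of how the setup.cfg-to-keywords helper added to distutils2/util.py above might be exercised; the project name, version and package names below are made up, and the section/option names follow the D1_D2_SETUP_ARGS mapping shown in the hunk:

    import os
    import tempfile

    from distutils2.util import generate_distutils_kwargs_from_setup_cfg

    # write a minimal setup.cfg in a scratch directory
    scratch = tempfile.mkdtemp()
    os.chdir(scratch)
    f = open('setup.cfg', 'w')
    f.write("[metadata]\n"
            "name = example\n"
            "version = 0.1\n"
            "summary = An example project\n"
            "\n"
            "[files]\n"
            "packages = example\n"
            "           example.sub\n")
    f.close()

    kwargs = generate_distutils_kwargs_from_setup_cfg()
    # 'summary' comes back under its distutils1 name 'description', and
    # multi-line fields such as 'packages' are split into lists
    print(kwargs['name'])         # 'example'
    print(kwargs['description'])  # 'An example project'
    print(kwargs['packages'])     # ['example', 'example.sub']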
+:mod:`distutils2.metadata` --- Metadata handling +================================================================ + +.. module:: distutils2.metadata + +.. autoclass:: distutils2.metadata.DistributionMetadata + :members: :mod:`distutils2.util` --- Miscellaneous other utility functions ================================================================ diff --git a/docs/source/distutils/examples.rst b/docs/source/distutils/examples.rst --- a/docs/source/distutils/examples.rst +++ b/docs/source/distutils/examples.rst @@ -301,7 +301,7 @@ :class:`distutils2.dist.DistributionMetadata` class and its :func:`read_pkg_file` method:: - >>> from distutils2.dist import DistributionMetadata + >>> from distutils2.metadata import DistributionMetadata >>> metadata = DistributionMetadata() >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) >>> metadata.name diff --git a/docs/source/library/distutils2.metadata.rst b/docs/source/library/distutils2.metadata.rst --- a/docs/source/library/distutils2.metadata.rst +++ b/docs/source/library/distutils2.metadata.rst @@ -2,7 +2,9 @@ Metadata ======== -Distutils2 provides a :class:`DistributionMetadata` class that can read and +.. module:: distutils2.metadata + +Distutils2 provides a :class:`~distutils2.metadata.DistributionMetadata` class that can read and write metadata files. This class is compatible with all metadata versions: * 1.0: :PEP:`241` @@ -17,7 +19,7 @@ Reading metadata ================ -The :class:`DistributionMetadata` class can be instantiated with the path of +The :class:`~distutils2.metadata.DistributionMetadata` class can be instantiated with the path of the metadata file, and provides a dict-like interface to the values:: >>> from distutils2.metadata import DistributionMetadata @@ -33,7 +35,7 @@ The fields that supports environment markers can be automatically ignored if the object is instantiated using the ``platform_dependent`` option. -:class:`DistributionMetadata` will interpret in the case the markers and will +:class:`~distutils2.metadata.DistributionMetadata` will interpret in the case the markers and will automatically remove the fields that are not compliant with the running environment. Here's an example under Mac OS X. The win32 dependency we saw earlier is ignored:: diff --git a/docs/source/library/distutils2.tests.pypi_server.rst b/docs/source/library/distutils2.tests.pypi_server.rst --- a/docs/source/library/distutils2.tests.pypi_server.rst +++ b/docs/source/library/distutils2.tests.pypi_server.rst @@ -77,6 +77,7 @@ @use_pypi_server() def test_somthing(self, server): # your tests goes here + ... The decorator will instantiate the server for you, and run and stop it just before and after your method call. You also can pass the server initializer, @@ -85,4 +86,4 @@ class SampleTestCase(TestCase): @use_pypi_server("test_case_name") def test_something(self, server): - # something + ... diff --git a/docs/source/library/pkgutil.rst b/docs/source/library/pkgutil.rst --- a/docs/source/library/pkgutil.rst +++ b/docs/source/library/pkgutil.rst @@ -4,77 +4,204 @@ .. module:: pkgutil :synopsis: Utilities to support packages. -.. TODO Follow the reST conventions used in the stdlib +This module provides utilities to manipulate packages: support for the +Importer protocol defined in :PEP:`302` and implementation of the API +described in :PEP:`376` to work with the database of installed Python +distributions. 
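As a small taste of the PEP 302 side summarized in the new introduction above (nothing here is specific to this patch, it is ordinary pkgutil usage against the backport):

    from distutils2._backport import pkgutil

    # iter_modules() yields (module_loader, name, ispkg) tuples for the
    # top-level modules and packages importable from sys.path
    for loader, name, ispkg in list(pkgutil.iter_modules())[:5]:
        print(name)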
-This module provides functions to manipulate packages, as well as -the necessary functions to provide support for the "Importer Protocol" as -described in :PEP:`302` and for working with the database of installed Python -distributions which is specified in :PEP:`376`. In addition to the functions -required in :PEP:`376`, back support for older ``.egg`` and ``.egg-info`` -distributions is provided as well. These distributions are represented by the -class :class:`~distutils2._backport.pkgutil.EggInfoDistribution` and most -functions provide an extra argument ``use_egg_info`` which indicates if -they should consider these old styled distributions. This document details -first the functions and classes available and then presents several use cases. - +Import system utilities +----------------------- .. function:: extend_path(path, name) - Extend the search path for the modules which comprise a package. Intended use is - to place the following code in a package's :file:`__init__.py`:: + Extend the search path for the modules which comprise a package. Intended + use is to place the following code in a package's :file:`__init__.py`:: from pkgutil import extend_path __path__ = extend_path(__path__, __name__) - This will add to the package's ``__path__`` all subdirectories of directories on - ``sys.path`` named after the package. This is useful if one wants to distribute - different parts of a single logical package as multiple directories. + This will add to the package's ``__path__`` all subdirectories of directories + on :data:`sys.path` named after the package. This is useful if one wants to + distribute different parts of a single logical package as multiple + directories. - It also looks for :file:`\*.pkg` files beginning where ``*`` matches the *name* - argument. This feature is similar to :file:`\*.pth` files (see the :mod:`site` - module for more information), except that it doesn't special-case lines starting - with ``import``. A :file:`\*.pkg` file is trusted at face value: apart from - checking for duplicates, all entries found in a :file:`\*.pkg` file are added to - the path, regardless of whether they exist on the filesystem. (This is a - feature.) + It also looks for :file:`\*.pkg` files beginning where ``*`` matches the + *name* argument. This feature is similar to :file:`\*.pth` files (see the + :mod:`site` module for more information), except that it doesn't special-case + lines starting with ``import``. A :file:`\*.pkg` file is trusted at face + value: apart from checking for duplicates, all entries found in a + :file:`\*.pkg` file are added to the path, regardless of whether they exist + on the filesystem. (This is a feature.) If the input path is not a list (as is the case for frozen packages) it is returned unchanged. The input path is not modified; an extended copy is returned. Items are only appended to the copy at the end. - It is assumed that ``sys.path`` is a sequence. Items of ``sys.path`` that are - not strings referring to existing directories are ignored. Unicode items on - ``sys.path`` that cause errors when used as filenames may cause this function - to raise an exception (in line with :func:`os.path.isdir` behavior). + It is assumed that :data:`sys.path` is a sequence. Items of :data:`sys.path` + that are not strings referring to existing directories are ignored. Unicode + items on :data:`sys.path` that cause errors when used as filenames may cause + this function to raise an exception (in line with :func:`os.path.isdir` + behavior). + + +.. 
class:: ImpImporter(dirname=None) + + :pep:`302` Importer that wraps Python's "classic" import algorithm. + + If *dirname* is a string, a :pep:`302` importer is created that searches that + directory. If *dirname* is ``None``, a :pep:`302` importer is created that + searches the current :data:`sys.path`, plus any modules that are frozen or + built-in. + + Note that :class:`ImpImporter` does not currently support being used by + placement on :data:`sys.meta_path`. + + +.. class:: ImpLoader(fullname, file, filename, etc) + + :pep:`302` Loader that wraps Python's "classic" import algorithm. + + +.. function:: find_loader(fullname) + + Find a :pep:`302` "loader" object for *fullname*. + + If *fullname* contains dots, path must be the containing package's + ``__path__``. Returns ``None`` if the module cannot be found or imported. + This function uses :func:`iter_importers`, and is thus subject to the same + limitations regarding platform-specific special import locations such as the + Windows registry. + + +.. function:: get_importer(path_item) + + Retrieve a :pep:`302` importer for the given *path_item*. + + The returned importer is cached in :data:`sys.path_importer_cache` if it was + newly created by a path hook. + + If there is no importer, a wrapper around the basic import machinery is + returned. This wrapper is never inserted into the importer cache (None is + inserted instead). + + The cache (or part of it) can be cleared manually if a rescan of + :data:`sys.path_hooks` is necessary. + + +.. function:: get_loader(module_or_name) + + Get a :pep:`302` "loader" object for *module_or_name*. + + If the module or package is accessible via the normal import mechanism, a + wrapper around the relevant part of that machinery is returned. Returns + ``None`` if the module cannot be found or imported. If the named module is + not already imported, its containing package (if any) is imported, in order + to establish the package ``__path__``. + + This function uses :func:`iter_importers`, and is thus subject to the same + limitations regarding platform-specific special import locations such as the + Windows registry. + + +.. function:: iter_importers(fullname='') + + Yield :pep:`302` importers for the given module name. + + If fullname contains a '.', the importers will be for the package containing + fullname, otherwise they will be importers for :data:`sys.meta_path`, + :data:`sys.path`, and Python's "classic" import machinery, in that order. If + the named module is in a package, that package is imported as a side effect + of invoking this function. + + Non-:pep:`302` mechanisms (e.g. the Windows registry) used by the standard + import machinery to find files in alternative locations are partially + supported, but are searched *after* :data:`sys.path`. Normally, these + locations are searched *before* :data:`sys.path`, preventing :data:`sys.path` + entries from shadowing them. + + For this to cause a visible difference in behaviour, there must be a module + or package name that is accessible via both :data:`sys.path` and one of the + non-:pep:`302` file system mechanisms. In this case, the emulation will find + the former version, while the builtin import mechanism will find the latter. + + Items of the following types can be affected by this discrepancy: + ``imp.C_EXTENSION``, ``imp.PY_SOURCE``, ``imp.PY_COMPILED``, + ``imp.PKG_DIRECTORY``. + + +.. 
function:: iter_modules(path=None, prefix='') + + Yields ``(module_loader, name, ispkg)`` for all submodules on *path*, or, if + path is ``None``, all top-level modules on :data:`sys.path`. + + *path* should be either ``None`` or a list of paths to look for modules in. + + *prefix* is a string to output on the front of every module name on output. + + +.. function:: walk_packages(path=None, prefix='', onerror=None) + + Yields ``(module_loader, name, ispkg)`` for all modules recursively on + *path*, or, if path is ``None``, all accessible modules. + + *path* should be either ``None`` or a list of paths to look for modules in. + + *prefix* is a string to output on the front of every module name on output. + + Note that this function must import all *packages* (*not* all modules!) on + the given *path*, in order to access the ``__path__`` attribute to find + submodules. + + *onerror* is a function which gets called with one argument (the name of the + package which was being imported) if any exception occurs while trying to + import a package. If no *onerror* function is supplied, :exc:`ImportError`\s + are caught and ignored, while all other exceptions are propagated, + terminating the search. + + Examples:: + + # list all modules python can access + walk_packages() + + # list all submodules of ctypes + walk_packages(ctypes.__path__, ctypes.__name__ + '.') + .. function:: get_data(package, resource) Get a resource from a package. - This is a wrapper for the :pep:`302` loader :func:`get_data` API. The package - argument should be the name of a package, in standard module format - (foo.bar). The resource argument should be in the form of a relative - filename, using ``/`` as the path separator. The parent directory name + This is a wrapper for the :pep:`302` loader :func:`get_data` API. The + *package* argument should be the name of a package, in standard module format + (``foo.bar``). The *resource* argument should be in the form of a relative + filename, using ``/`` as the path separator. The parent directory name ``..`` is not allowed, and nor is a rooted name (starting with a ``/``). - The function returns a binary string that is the contents of the - specified resource. + The function returns a binary string that is the contents of the specified + resource. For packages located in the filesystem, which have already been imported, this is the rough equivalent of:: - d = os.path.dirname(sys.modules[package].__file__) - data = open(os.path.join(d, resource), 'rb').read() + d = os.path.dirname(sys.modules[package].__file__) + data = open(os.path.join(d, resource), 'rb').read() If the package cannot be located or loaded, or it uses a :pep:`302` loader - which does not support :func:`get_data`, then None is returned. + which does not support :func:`get_data`, then ``None`` is returned. -API Reference -============= +Installed distributions database +-------------------------------- -.. automodule:: distutils2._backport.pkgutil - :members: +Installed Python distributions are represented by instances of +:class:`~distutils2._backport.pkgutil.Distribution`, or its subclass +:class:`~distutils2._backport.pkgutil.EggInfoDistribution` for legacy ``.egg`` +and ``.egg-info`` formats). Most functions also provide an extra argument +``use_egg_info`` to take legacy distributions into account. + +.. TODO write docs here, don't rely on automodule + classes: Distribution and descendents + functions: provides, obsoletes, replaces, etc. 
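The installed-distributions side described just above can also be poked at directly. A minimal sketch, mirroring the attribute access used by the backport's own tests later in this digest; the .dist-info path is a placeholder:

    from distutils2._backport.pkgutil import Distribution

    # point this at a real .dist-info directory written by a PEP 376
    # aware installer; the name below is invented
    dist = Distribution('/path/to/example-0.1.dist-info')

    print(dist.name)                 # project name taken from METADATA
    print(dist.metadata['version'])  # dict-like access to the METADATA file
    print(dist.requested)            # True when a REQUESTED file is present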
Caching +++++++ @@ -86,11 +213,10 @@ :func:`~distutils2._backport.pkgutil.clear_cache`. +Examples +-------- -Example Usage -============= - -Print All Information About a Distribution +Print all information about a distribution ++++++++++++++++++++++++++++++++++++++++++ Given a path to a ``.dist-info`` distribution, we shall print out all @@ -182,7 +308,7 @@ ===== * It was installed as a dependency -Find Out Obsoleted Distributions +Find out obsoleted distributions ++++++++++++++++++++++++++++++++ Now, we take tackle a different problem, we are interested in finding out diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -128,6 +128,8 @@ This section describes the files included in the project. +- **packages_root**: the root directory containing all packages. If not provided + Distutils2 will use the current directory. *\*optional* - **packages**: a list of packages the project includes *\*optional* *\*multi* - **modules**: a list of packages the project includes *\*optional* *\*multi* - **scripts**: a list of scripts the project includes *\*optional* *\*multi* @@ -136,6 +138,7 @@ Example:: [files] + packages_root = src packages = pypi2rpm pypi2rpm.command -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Add informations about contribution in the doc and in the README. Message-ID: tarek.ziade pushed b2b6b967a7ef to distutils2: http://hg.python.org/distutils2/rev/b2b6b967a7ef changeset: 1001:b2b6b967a7ef user: Alexis Metaireau date: Wed Feb 02 16:35:22 2011 +0000 summary: Add informations about contribution in the doc and in the README. files: README.txt docs/source/index.rst diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -10,6 +10,9 @@ See the documentation at http://packages.python.org/Distutils2 for more info. +If you want to contribute, please have a look to +http://distutils2.notmyidea.org/contributing.html + **Beware that Distutils2 is in its early stage and should not be used in production. Its API is subject to changes** diff --git a/docs/source/index.rst b/docs/source/index.rst --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -29,7 +29,7 @@ .. __: http://bitbucket.org/tarek/distutils2/wiki/GSoC_2010_teams If you???re looking for information on how to contribute, head to -:doc:`devresources`. +:doc:`devresources`, and be sure to have a look at :doc:`contributing`. Documentation @@ -76,6 +76,7 @@ distutils/index library/distutils2 library/pkgutil + contributing Indices and tables -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Fix wrong name in docstring and doc (port of r87277). Message-ID: tarek.ziade pushed d1d07904cc8e to distutils2: http://hg.python.org/distutils2/rev/d1d07904cc8e changeset: 1003:d1d07904cc8e user: ?ric Araujo date: Sat Jan 29 22:27:16 2011 +0100 summary: Fix wrong name in docstring and doc (port of r87277). files: distutils2/_backport/shutil.py docs/source/distutils/apiref.rst diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -376,7 +376,7 @@ archive that is being built. 
If not provided, the current owner and group will be used. - The output tar file will be named 'base_dir' + ".tar", possibly plus + The output tar file will be named 'base_name' + ".tar", possibly plus the appropriate compression extension (".gz", or ".bz2"). Returns the output filename. @@ -451,7 +451,7 @@ def _make_zipfile(base_name, base_dir, verbose=0, dry_run=0, logger=None): """Create a zip file from all the files under 'base_dir'. - The output zip file will be named 'base_dir' + ".zip". Uses either the + The output zip file will be named 'base_name' + ".zip". Uses either the "zipfile" Python module (if available) or the InfoZIP "zip" utility (if installed and found on the default search path). If neither tool is available, raises ExecError. Returns the name of the output zip diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -888,7 +888,7 @@ .. function:: make_zipfile(base_name, base_dir[, verbose=0, dry_run=0]) Create a zip file from all files in and under *base_dir*. The output zip file - will be named *base_dir* + :file:`.zip`. Uses either the :mod:`zipfile` Python + will be named *base_name* + :file:`.zip`. Uses either the :mod:`zipfile` Python module (if available) or the InfoZIP :file:`zip` utility (if installed and found on the default search path). If neither tool is available, raises :exc:`DistutilsExecError`. Returns the name of the output zip file. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Branch merge. Message-ID: tarek.ziade pushed 71223a14aa43 to distutils2: http://hg.python.org/distutils2/rev/71223a14aa43 changeset: 1004:71223a14aa43 parent: 1003:d1d07904cc8e parent: 955:4b89c997484d user: ?ric Araujo date: Sat Jan 29 22:41:40 2011 +0100 summary: Branch merge. Addition: don?t use shutil from the stdlib in 2.7, it has the logger=None bug (fixed in our backport). files: distutils2/command/cmd.py diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -10,14 +10,7 @@ from distutils2.errors import DistutilsOptionError from distutils2 import util from distutils2 import logger - -# XXX see if we want to backport this -from distutils2._backport.shutil import copytree, copyfile, move - -try: - from shutil import make_archive -except ImportError: - from distutils2._backport.shutil import make_archive +from distutils2._backport.shutil import copytree, copyfile, move, make_archive class Command(object): diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -3,15 +3,33 @@ Know how to read all config files Distutils2 uses. 
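To make the base_name/base_dir distinction from the docstring fixes above concrete, a self-contained sketch (illustrative only; the tree and names are invented, and the backport's make_archive is assumed to keep the stdlib signature):

    import os
    import tempfile

    from distutils2._backport.shutil import make_archive

    # build a tiny tree to archive
    root = tempfile.mkdtemp()
    os.makedirs(os.path.join(root, 'src', 'pkg'))
    open(os.path.join(root, 'src', 'pkg', '__init__.py'), 'w').close()

    # base_name names the output file (the chosen format appends the
    # extension); base_dir names the directory stored inside the archive
    archive = make_archive(os.path.join(root, 'project-1.0'), 'gztar',
                           root_dir=root, base_dir='src')
    print(archive)  # .../project-1.0.tar.gz containing the 'src' tree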
""" import os +import re import sys from ConfigParser import RawConfigParser from distutils2 import logger from distutils2.errors import DistutilsOptionError +from distutils2.compiler.extension import Extension from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command + +def _pop_values(values_dct, key): + """Remove values from the dictionary and convert them as a list""" + vals_str = values_dct.pop(key, None) + if not vals_str: + return + # Get bash options like `gcc -print-file-name=libgcc.a` + vals = re.search('(`.*?`)', vals_str) or [] + if vals: + vals = list(vals.groups()) + vals_str = re.sub('`.*?`', '', vals_str) + vals.extend(vals_str.split()) + if vals: + return vals + + class Config(object): """Reads configuration files and work with the Distribution instance """ @@ -182,6 +200,34 @@ # manifest template self.dist.extra_files = files.get('extra_files', []) + ext_modules = self.dist.ext_modules + for section_key in content: + labels = section_key.split('=') + if (len(labels) == 2) and (labels[0] == 'extension'): + # labels[1] not used from now but should be implemented + # for extension build dependency + values_dct = content[section_key] + ext_modules.append(Extension( + values_dct.pop('name'), + _pop_values(values_dct, 'sources'), + _pop_values(values_dct, 'include_dirs'), + _pop_values(values_dct, 'define_macros'), + _pop_values(values_dct, 'undef_macros'), + _pop_values(values_dct, 'library_dirs'), + _pop_values(values_dct, 'libraries'), + _pop_values(values_dct, 'runtime_library_dirs'), + _pop_values(values_dct, 'extra_objects'), + _pop_values(values_dct, 'extra_compile_args'), + _pop_values(values_dct, 'extra_link_args'), + _pop_values(values_dct, 'export_symbols'), + _pop_values(values_dct, 'swig_opts'), + _pop_values(values_dct, 'depends'), + values_dct.pop('language', None), + values_dct.pop('optional', None), + **values_dct + )) + + def parse_config_files(self, filenames=None): if filenames is None: filenames = self.find_config_files() diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -93,6 +93,30 @@ sub_commands = foo """ +# Can not be merged with SETUP_CFG else install_dist +# command will fail when trying to compile C sources +EXT_SETUP_CFG = """ +[files] +packages = one + two + +[extension=speed_coconuts] +name = one.speed_coconuts +sources = c_src/speed_coconuts.c +extra_link_args = `gcc -print-file-name=libgcc.a` -shared +define_macros = HAVE_CAIRO HAVE_GTK2 + +[extension=fast_taunt] +name = three.fast_taunt +sources = cxx_src/utils_taunt.cxx + cxx_src/python_module.cxx +include_dirs = /usr/include/gecode + /usr/include/blitz +extra_compile_args = -fPIC -O2 +language = cxx + +""" + class DCompiler(object): name = 'd' @@ -134,7 +158,14 @@ super(ConfigTestCase, self).setUp() self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) + sys.stdout = sys.stderr = StringIO() + self.addCleanup(os.chdir, os.getcwd()) + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.tempdir = tempdir + + self.addCleanup(setattr, sys, 'argv', sys.argv) def write_setup(self, kwargs=None): opts = {'description-file': 'README', 'extra-files':''} @@ -156,8 +187,6 @@ return dist def test_config(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup() self.write_file('README', 'yeah') @@ -232,9 +261,6 @@ def 
test_multiple_description_file(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_setup({'description-file': 'README CHANGES'}) self.write_file('README', 'yeah') self.write_file('CHANGES', 'changelog2') @@ -242,9 +268,6 @@ self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) def test_multiline_description_file(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_setup({'description-file': 'README\n CHANGES'}) self.write_file('README', 'yeah') self.write_file('CHANGES', 'changelog') @@ -252,9 +275,28 @@ self.assertEqual(dist.metadata['description'], 'yeah\nchangelog') self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + def test_parse_extensions_in_config(self): + self.write_file('setup.cfg', EXT_SETUP_CFG) + dist = self.run_setup('--version') + + ext_modules = dict((mod.name, mod) for mod in dist.ext_modules) + self.assertEqual(len(ext_modules), 2) + ext = ext_modules.get('one.speed_coconuts') + self.assertEqual(ext.sources, ['c_src/speed_coconuts.c']) + self.assertEqual(ext.define_macros, ['HAVE_CAIRO', 'HAVE_GTK2']) + self.assertEqual(ext.extra_link_args, + ['`gcc -print-file-name=libgcc.a`', '-shared']) + + ext = ext_modules.get('three.fast_taunt') + self.assertEqual(ext.sources, + ['cxx_src/utils_taunt.cxx', 'cxx_src/python_module.cxx']) + self.assertEqual(ext.include_dirs, + ['/usr/include/gecode', '/usr/include/blitz']) + self.assertEqual(ext.extra_compile_args, ['-fPIC', '-O2']) + self.assertEqual(ext.language, 'cxx') + + def test_metadata_requires_description_files_missing(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup({'description-file': 'README\n README2'}) self.write_file('README', 'yeah') self.write_file('README2', 'yeah') @@ -278,8 +320,6 @@ self.assertRaises(DistutilsFileError, cmd.make_distribution) def test_metadata_requires_description_files(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup({'description-file': 'README\n README2', 'extra-files':'\n README2'}) self.write_file('README', 'yeah') @@ -315,8 +355,6 @@ self.assertIn('README\nREADME2\n', open('MANIFEST').read()) def test_sub_commands(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup() self.write_file('README', 'yeah') self.write_file('haven.py', '#') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: renamed DistributionMetadata to Metadata Message-ID: tarek.ziade pushed 196f90ab8ee3 to distutils2: http://hg.python.org/distutils2/rev/196f90ab8ee3 changeset: 1005:196f90ab8ee3 parent: 896:272155a17d56 user: Christophe Combelles date: Fri Jan 28 10:43:00 2011 +0100 summary: renamed DistributionMetadata to Metadata files: CHANGES.txt distutils2/_backport/pkgutil.py distutils2/_backport/tests/test_pkgutil.py distutils2/dist.py distutils2/index/dist.py distutils2/index/simple.py distutils2/metadata.py distutils2/tests/test_command_install_distinfo.py distutils2/tests/test_dist.py distutils2/tests/test_install.py distutils2/tests/test_metadata.py docs/design/pep-0376.txt docs/source/distutils/examples.rst docs/source/library/distutils2.metadata.rst diff --git a/CHANGES.txt b/CHANGES.txt --- a/CHANGES.txt +++ b/CHANGES.txt @@ -10,6 +10,7 @@ - Issue #10409: Fixed the Licence selector in mkcfg [tarek] - Issue #9558: Fix build_ext with VS 8.0 [??ric] - Issue #6007: Add disclaimer about MinGW compatibility in 
docs [??ric] +- Renamed DistributionMetadata to Metadata [ccomb] 1.0a3 - 2010-10-08 ------------------ diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -10,7 +10,7 @@ from csv import reader as csv_reader from types import ModuleType from distutils2.errors import DistutilsError -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.version import suggest_normalized_version, VersionPredicate import zipimport try: @@ -716,7 +716,7 @@ name = '' """The name of the distribution.""" metadata = None - """A :class:`distutils2.metadata.DistributionMetadata` instance loaded with + """A :class:`distutils2.metadata.Metadata` instance loaded with the distribution's ``METADATA`` file.""" requested = False """A boolean that indicates whether the ``REQUESTED`` metadata file is @@ -728,7 +728,7 @@ self.metadata = _cache_path[path].metadata else: metadata_path = os.path.join(path, 'METADATA') - self.metadata = DistributionMetadata(path=metadata_path) + self.metadata = Metadata(path=metadata_path) self.path = path self.name = self.metadata['name'] @@ -849,7 +849,7 @@ name = '' """The name of the distribution.""" metadata = None - """A :class:`distutils2.metadata.DistributionMetadata` instance loaded with + """A :class:`distutils2.metadata.Metadata` instance loaded with the distribution's ``METADATA`` file.""" _REQUIREMENT = re.compile( \ r'(?P[-A-Za-z0-9_.]+)\s*' \ @@ -883,7 +883,7 @@ if path.endswith('.egg'): if os.path.isdir(path): meta_path = os.path.join(path, 'EGG-INFO', 'PKG-INFO') - self.metadata = DistributionMetadata(path=meta_path) + self.metadata = Metadata(path=meta_path) try: req_path = os.path.join(path, 'EGG-INFO', 'requires.txt') requires = open(req_path, 'r').read() @@ -892,7 +892,7 @@ else: zipf = zipimport.zipimporter(path) fileobj = StringIO.StringIO(zipf.get_data('EGG-INFO/PKG-INFO')) - self.metadata = DistributionMetadata(fileobj=fileobj) + self.metadata = Metadata(fileobj=fileobj) try: requires = zipf.get_data('EGG-INFO/requires.txt') except IOError: @@ -906,7 +906,7 @@ requires = req_f.read() except IOError: requires = None - self.metadata = DistributionMetadata(path=path) + self.metadata = Metadata(path=path) self.name = self.metadata['name'] else: raise ValueError('The path must end with .egg-info or .egg') diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -236,8 +236,8 @@ dist = Distribution(dist_path) self.assertEqual(dist.name, name) - from distutils2.metadata import DistributionMetadata - self.assertTrue(isinstance(dist.metadata, DistributionMetadata)) + from distutils2.metadata import Metadata + self.assertTrue(isinstance(dist.metadata, Metadata)) self.assertEqual(dist.metadata['version'], version) self.assertTrue(isinstance(dist.requested, type(bool()))) diff --git a/distutils2/dist.py b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -15,7 +15,7 @@ from distutils2.fancy_getopt import FancyGetopt from distutils2.util import strtobool, resolve_name from distutils2 import logger -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.config import Config from distutils2.command import get_command_class @@ -145,7 +145,7 @@ # forth) in a separate object -- we're getting to have enough # 
information here (and enough command-line options) that it's # worth it. - self.metadata = DistributionMetadata() + self.metadata = Metadata() # 'cmdclass' maps command names to class objects, so we # can 1) quickly figure out which class to instantiate when diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -28,7 +28,7 @@ CantParseArchiveName) from distutils2.version import (suggest_normalized_version, NormalizedVersion, get_version_predicate) -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.util import untar_file, unzip_file, splitext __all__ = ['ReleaseInfo', 'DistInfo', 'ReleasesList', 'get_infos_from_url'] @@ -66,7 +66,7 @@ self._version = None self.version = version if metadata: - self.metadata = DistributionMetadata(mapping=metadata) + self.metadata = Metadata(mapping=metadata) else: self.metadata = None self.dists = {} @@ -164,7 +164,7 @@ def set_metadata(self, metadata): if not self.metadata: - self.metadata = DistributionMetadata() + self.metadata = Metadata() self.metadata.update(metadata) def __getitem__(self, item): diff --git a/distutils2/index/simple.py b/distutils2/index/simple.py --- a/distutils2/index/simple.py +++ b/distutils2/index/simple.py @@ -22,7 +22,7 @@ UnableToDownload, CantParseArchiveName, ReleaseNotFound, ProjectNotFound) from distutils2.index.mirrors import get_mirrors -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.version import get_version_predicate from distutils2 import __version__ as __distutils2_version__ @@ -203,7 +203,7 @@ if not release._metadata: location = release.get_distribution().unpack() pkg_info = os.path.join(location, 'PKG-INFO') - release._metadata = DistributionMetadata(pkg_info) + release._metadata = Metadata(pkg_info) return release def _switch_to_next_mirror(self): diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -41,7 +41,7 @@ _HAS_DOCUTILS = False # public API of this module -__all__ = ('DistributionMetadata', 'PKG_INFO_ENCODING', +__all__ = ('Metadata', 'PKG_INFO_ENCODING', 'PKG_INFO_PREFERRED_VERSION') # Encoding used for the PKG-INFO files @@ -181,7 +181,7 @@ _MISSING = object() -class DistributionMetadata(object): +class Metadata(object): """The metadata of a release. Supports versions 1.0, 1.1 and 1.2 (auto-detected). 
You can diff --git a/distutils2/tests/test_command_install_distinfo.py b/distutils2/tests/test_command_install_distinfo.py --- a/distutils2/tests/test_command_install_distinfo.py +++ b/distutils2/tests/test_command_install_distinfo.py @@ -5,7 +5,7 @@ from distutils2.command.install_distinfo import install_distinfo from distutils2.command.cmd import Command -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import unittest, support try: @@ -64,7 +64,7 @@ self.assertEqual(open(os.path.join(dist_info, 'REQUESTED')).read(), '') meta_path = os.path.join(dist_info, 'METADATA') - self.assertTrue(DistributionMetadata(path=meta_path).check()) + self.assertTrue(Metadata(path=meta_path).check()) def test_installer(self): pkg_dir, dist = self.create_dist(name='foo', diff --git a/distutils2/tests/test_dist.py b/distutils2/tests/test_dist.py --- a/distutils2/tests/test_dist.py +++ b/distutils2/tests/test_dist.py @@ -68,7 +68,7 @@ distutils2.dist.DEBUG = False def test_write_pkg_file(self): - # Check DistributionMetadata handling of Unicode fields + # Check Metadata handling of Unicode fields tmp_dir = self.mkdtemp() my_file = os.path.join(tmp_dir, 'f') cls = Distribution diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -5,7 +5,7 @@ from distutils2 import install from distutils2.index.xmlrpc import Client -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import run_unittest from distutils2.tests.support import TempdirManager from distutils2.tests.pypi_server import use_xmlrpc_server @@ -18,7 +18,7 @@ def __init__(self, name, version, deps): self.name = name self.version = version - self.metadata = DistributionMetadata() + self.metadata = Metadata() self.metadata['Requires-Dist'] = deps self.metadata['Provides-Dist'] = ['%s (%s)' % (name, version)] diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -4,7 +4,7 @@ import platform from StringIO import StringIO -from distutils2.metadata import (DistributionMetadata, _interpret, +from distutils2.metadata import (Metadata, _interpret, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -12,7 +12,7 @@ MetadataUnrecognizedVersionError) -class DistributionMetadataTestCase(LoggingCatcher, WarningsCatcher, +class MetadataTestCase(LoggingCatcher, WarningsCatcher, unittest.TestCase): def test_instantiation(self): @@ -24,26 +24,26 @@ fp.close() fp = StringIO(contents) - m = DistributionMetadata() + m = Metadata() self.assertRaises(MetadataUnrecognizedVersionError, m.items) - m = DistributionMetadata(PKG_INFO) + m = Metadata(PKG_INFO) self.assertEqual(len(m.items()), 22) - m = DistributionMetadata(fileobj=fp) + m = Metadata(fileobj=fp) self.assertEqual(len(m.items()), 22) - m = DistributionMetadata(mapping=dict(name='Test', version='1.0')) + m = Metadata(mapping=dict(name='Test', version='1.0')) self.assertEqual(len(m.items()), 11) d = dict(m.items()) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, fileobj=fp) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, mapping=d) - self.assertRaises(TypeError, 
DistributionMetadata, + self.assertRaises(TypeError, Metadata, fileobj=fp, mapping=d) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, mapping=m, fileobj=fp) def test_interpret(self): @@ -97,11 +97,11 @@ def test_metadata_read_write(self): PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') - metadata = DistributionMetadata(PKG_INFO) + metadata = Metadata(PKG_INFO) out = StringIO() metadata.write_file(out) out.seek(0) - res = DistributionMetadata() + res = Metadata() res.read_file(out) for k in metadata.keys(): self.assertTrue(metadata[k] == res[k]) @@ -111,7 +111,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata(platform_dependent=True) + metadata = Metadata(platform_dependent=True) metadata.read_file(StringIO(content)) self.assertEqual(metadata['Requires-Dist'], ['bar']) metadata['Name'] = "baz; sys.platform == 'blah'" @@ -121,7 +121,7 @@ # test with context context = {'sys.platform': 'okook'} - metadata = DistributionMetadata(platform_dependent=True, + metadata = Metadata(platform_dependent=True, execution_context=context) metadata.read_file(StringIO(content)) self.assertEqual(metadata['Requires-Dist'], ['foo']) @@ -130,7 +130,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata() + metadata = Metadata() metadata.read_file(StringIO(content)) # see if we can read the description now @@ -149,7 +149,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata(fileobj=StringIO(content)) + metadata = Metadata(fileobj=StringIO(content)) self.assertIn('Version', metadata.keys()) self.assertIn('0.5', metadata.values()) self.assertIn(('Version', '0.5'), metadata.items()) @@ -160,7 +160,7 @@ self.assertEqual(metadata['Version'], '0.7') def test_versions(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Obsoletes'] = 'ok' self.assertEqual(metadata['Metadata-Version'], '1.1') @@ -191,7 +191,7 @@ # XXX Spurious Warnings were disabled def XXXtest_warnings(self): - metadata = DistributionMetadata() + metadata = Metadata() # these should raise a warning values = (('Requires-Dist', 'Funky (Groovie)'), @@ -204,7 +204,7 @@ self.assertEqual(len(self.logs), 2) def test_multiple_predicates(self): - metadata = DistributionMetadata() + metadata = Metadata() # see for "3" instead of "3.0" ??? 
# its seems like the MINOR VERSION can be omitted @@ -214,14 +214,14 @@ self.assertEqual(len(self.warnings), 0) def test_project_url(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Project-URL'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-URL'], [('one', 'http://ok')]) self.assertEqual(metadata.version, '1.2') def test_check(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = 'rr' metadata['Requires-dist'] = ['Foo (a)'] if metadata.docutils_support: @@ -233,7 +233,7 @@ self.assertEqual(len(warnings), 2) def test_best_choice(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' self.assertEqual(metadata.version, PKG_INFO_PREFERRED_VERSION) metadata['Classifier'] = ['ok'] @@ -242,7 +242,7 @@ def test_project_urls(self): # project-url is a bit specific, make sure we write it # properly in PKG-INFO - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Project-Url'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')]) @@ -253,13 +253,13 @@ self.assertIn('Project-URL: one,http://ok', res) file_.seek(0) - metadata = DistributionMetadata() + metadata = Metadata() metadata.read_file(file_) self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')]) def test_suite(): - return unittest.makeSuite(DistributionMetadataTestCase) + return unittest.makeSuite(MetadataTestCase) if __name__ == '__main__': run_unittest(test_suite()) diff --git a/docs/design/pep-0376.txt b/docs/design/pep-0376.txt --- a/docs/design/pep-0376.txt +++ b/docs/design/pep-0376.txt @@ -425,7 +425,7 @@ - ``name``: The name of the distribution. -- ``metadata``: A ``DistributionMetadata`` instance loaded with the +- ``metadata``: A ``Metadata`` instance loaded with the distribution's PKG-INFO file. - ``requested``: A boolean that indicates whether the REQUESTED diff --git a/docs/source/distutils/examples.rst b/docs/source/distutils/examples.rst --- a/docs/source/distutils/examples.rst +++ b/docs/source/distutils/examples.rst @@ -298,11 +298,11 @@ ``2.7`` or ``3.2``. You can read back this static file, by using the -:class:`distutils2.dist.DistributionMetadata` class and its +:class:`distutils2.dist.Metadata` class and its :func:`read_pkg_file` method:: - >>> from distutils2.dist import DistributionMetadata - >>> metadata = DistributionMetadata() + >>> from distutils2.dist import Metadata + >>> metadata = Metadata() >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) >>> metadata.name 'distribute' @@ -315,7 +315,7 @@ loads its values:: >>> pkg_info_path = 'distribute-0.6.8-py2.7.egg-info' - >>> DistributionMetadata(pkg_info_path).name + >>> Metadata(pkg_info_path).name 'distribute' diff --git a/docs/source/library/distutils2.metadata.rst b/docs/source/library/distutils2.metadata.rst --- a/docs/source/library/distutils2.metadata.rst +++ b/docs/source/library/distutils2.metadata.rst @@ -2,7 +2,7 @@ Metadata ======== -Distutils2 provides a :class:`DistributionMetadata` class that can read and +Distutils2 provides a :class:`Metadata` class that can read and write metadata files. 
This class is compatible with all metadata versions: * 1.0: :PEP:`241` @@ -17,11 +17,11 @@ Reading metadata ================ -The :class:`DistributionMetadata` class can be instantiated with the path of +The :class:`Metadata` class can be instantiated with the path of the metadata file, and provides a dict-like interface to the values:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata('PKG-INFO') + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata('PKG-INFO') >>> metadata.keys()[:5] ('Metadata-Version', 'Name', 'Version', 'Platform', 'Supported-Platform') >>> metadata['Name'] @@ -33,13 +33,13 @@ The fields that supports environment markers can be automatically ignored if the object is instantiated using the ``platform_dependent`` option. -:class:`DistributionMetadata` will interpret in the case the markers and will +:class:`Metadata` will interpret in the case the markers and will automatically remove the fields that are not compliant with the running environment. Here's an example under Mac OS X. The win32 dependency we saw earlier is ignored:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True) + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata('PKG-INFO', platform_dependent=True) >>> metadata['Requires-Dist'] ['bar'] @@ -51,9 +51,9 @@ Here's an example, simulating a win32 environment:: - >>> from distutils2.metadata import DistributionMetadata + >>> from distutils2.metadata import Metadata >>> context = {'sys.platform': 'win32'} - >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True, + >>> metadata = Metadata('PKG-INFO', platform_dependent=True, ... execution_context=context) ... >>> metadata['Requires-Dist'] = ["pywin32; sys.platform == 'win32'", @@ -81,8 +81,8 @@ Some fields in :PEP:`345` have to follow a version scheme in their versions predicate. When the scheme is violated, a warning is emitted:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata() + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata() >>> metadata['Requires-Dist'] = ['Funky (Groovie)'] "Funky (Groovie)" is not a valid predicate >>> metadata['Requires-Dist'] = ['Funky (1.2)'] -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Branch merge Message-ID: tarek.ziade pushed 41e4fb0c765b to distutils2: http://hg.python.org/distutils2/rev/41e4fb0c765b changeset: 1006:41e4fb0c765b parent: 1001:b2b6b967a7ef parent: 1004:71223a14aa43 user: ?ric Araujo date: Thu Feb 03 17:47:17 2011 +0100 summary: Branch merge files: distutils2/_backport/shutil.py diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -376,7 +376,7 @@ archive that is being built. If not provided, the current owner and group will be used. - The output tar file will be named 'base_dir' + ".tar", possibly plus + The output tar file will be named 'base_name' + ".tar", possibly plus the appropriate compression extension (".gz", or ".bz2"). Returns the output filename. @@ -451,7 +451,7 @@ def _make_zipfile(base_name, base_dir, verbose=0, dry_run=0, logger=None): """Create a zip file from all the files under 'base_dir'. 
- The output zip file will be named 'base_dir' + ".zip". Uses either the + The output zip file will be named 'base_name' + ".zip". Uses either the "zipfile" Python module (if available) or the InfoZIP "zip" utility (if installed and found on the default search path). If neither tool is available, raises ExecError. Returns the name of the output zip diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -10,14 +10,7 @@ from distutils2.errors import DistutilsOptionError from distutils2 import util from distutils2 import logger - -# XXX see if we want to backport this -from distutils2._backport.shutil import copytree, copyfile, move - -try: - from shutil import make_archive -except ImportError: - from distutils2._backport.shutil import make_archive +from distutils2._backport.shutil import copytree, copyfile, move, make_archive class Command(object): diff --git a/distutils2/tests/test_command_test.py b/distutils2/tests/test_command_test.py --- a/distutils2/tests/test_command_test.py +++ b/distutils2/tests/test_command_test.py @@ -17,10 +17,6 @@ from distutils2.dist import Distribution from distutils2._backport import pkgutil -try: - any -except NameError: - from distutils2._backport import any EXPECTED_OUTPUT_RE = r'''FAIL: test_blah \(myowntestmodule.SomeTest\) ---------------------------------------------------------------------- diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -888,7 +888,7 @@ .. function:: make_zipfile(base_name, base_dir[, verbose=0, dry_run=0]) Create a zip file from all files in and under *base_dir*. The output zip file - will be named *base_dir* + :file:`.zip`. Uses either the :mod:`zipfile` Python + will be named *base_name* + :file:`.zip`. Uses either the :mod:`zipfile` Python module (if available) or the InfoZIP :file:`zip` utility (if installed and found on the default search path). If neither tool is available, raises :exc:`DistutilsExecError`. Returns the name of the output zip file. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Branch merge. Message-ID: tarek.ziade pushed db87d8b26fa1 to distutils2: http://hg.python.org/distutils2/rev/db87d8b26fa1 changeset: 1007:db87d8b26fa1 parent: 1006:41e4fb0c765b parent: 1005:196f90ab8ee3 user: ?ric Araujo date: Thu Feb 03 19:48:52 2011 +0100 summary: Branch merge. There are a number of bugs and issues in the implementation of mkcfg. The tests should be partially rewritten too (to use RawInputs instead of sys.stdin.write for example, or to compare config file sections without using sets). I?ll look into it. 
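Since the merge below carries the DistributionMetadata-to-Metadata rename, here is a short sketch of the new spelling, mirroring the mapping form used by the updated tests (the values are made up):

    from distutils2.metadata import Metadata

    m = Metadata(mapping={'name': 'Example', 'version': '0.1'})
    print(m['Name'])     # expected: 'Example'
    print(m['Version'])  # expected: '0.1'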
files: distutils2/_backport/pkgutil.py distutils2/_backport/tests/test_pkgutil.py distutils2/index/dist.py distutils2/metadata.py distutils2/tests/test_install.py distutils2/tests/test_metadata.py docs/source/distutils/apiref.rst docs/source/distutils/examples.rst docs/source/library/distutils2.metadata.rst diff --git a/CHANGES.txt b/CHANGES.txt --- a/CHANGES.txt +++ b/CHANGES.txt @@ -10,6 +10,7 @@ - Issue #10409: Fixed the Licence selector in mkcfg [tarek] - Issue #9558: Fix build_ext with VS 8.0 [??ric] - Issue #6007: Add disclaimer about MinGW compatibility in docs [??ric] +- Renamed DistributionMetadata to Metadata [ccomb] 1.0a3 - 2010-10-08 ------------------ diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -15,7 +15,7 @@ from md5 import md5 from distutils2.errors import DistutilsError -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.version import suggest_normalized_version, VersionPredicate try: import cStringIO as StringIO @@ -725,7 +725,7 @@ name = '' """The name of the distribution.""" metadata = None - """A :class:`distutils2.metadata.DistributionMetadata` instance loaded with + """A :class:`distutils2.metadata.Metadata` instance loaded with the distribution's ``METADATA`` file.""" requested = False """A boolean that indicates whether the ``REQUESTED`` metadata file is @@ -737,7 +737,7 @@ self.metadata = _cache_path[path].metadata else: metadata_path = os.path.join(path, 'METADATA') - self.metadata = DistributionMetadata(path=metadata_path) + self.metadata = Metadata(path=metadata_path) self.path = path self.name = self.metadata['name'] @@ -858,7 +858,7 @@ name = '' """The name of the distribution.""" metadata = None - """A :class:`distutils2.metadata.DistributionMetadata` instance loaded with + """A :class:`distutils2.metadata.Metadata` instance loaded with the distribution's ``METADATA`` file.""" _REQUIREMENT = re.compile( \ r'(?P[-A-Za-z0-9_.]+)\s*' \ @@ -893,7 +893,7 @@ if path.endswith('.egg'): if os.path.isdir(path): meta_path = os.path.join(path, 'EGG-INFO', 'PKG-INFO') - self.metadata = DistributionMetadata(path=meta_path) + self.metadata = Metadata(path=meta_path) try: req_path = os.path.join(path, 'EGG-INFO', 'requires.txt') requires = open(req_path, 'r').read() @@ -903,7 +903,7 @@ # FIXME handle the case where zipfile is not available zipf = zipimport.zipimporter(path) fileobj = StringIO.StringIO(zipf.get_data('EGG-INFO/PKG-INFO')) - self.metadata = DistributionMetadata(fileobj=fileobj) + self.metadata = Metadata(fileobj=fileobj) try: requires = zipf.get_data('EGG-INFO/requires.txt') except IOError: @@ -917,7 +917,7 @@ requires = req_f.read() except IOError: requires = None - self.metadata = DistributionMetadata(path=path) + self.metadata = Metadata(path=path) self.name = self.metadata['name'] else: raise ValueError('The path must end with .egg-info or .egg') diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -13,7 +13,7 @@ from distutils2._backport.hashlib import md5 from distutils2.errors import DistutilsError -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import unittest, run_unittest, support from distutils2._backport import pkgutil @@ -244,7 +244,7 @@ dist = 
Distribution(dist_path) self.assertEqual(dist.name, name) - self.assertTrue(isinstance(dist.metadata, DistributionMetadata)) + self.assertTrue(isinstance(dist.metadata, Metadata)) self.assertEqual(dist.metadata['version'], version) self.assertTrue(isinstance(dist.requested, type(bool()))) diff --git a/distutils2/dist.py b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -15,7 +15,7 @@ from distutils2.fancy_getopt import FancyGetopt from distutils2.util import strtobool, resolve_name from distutils2 import logger -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.config import Config from distutils2.command import get_command_class @@ -145,7 +145,7 @@ # forth) in a separate object -- we're getting to have enough # information here (and enough command-line options) that it's # worth it. - self.metadata = DistributionMetadata() + self.metadata = Metadata() # 'cmdclass' maps command names to class objects, so we # can 1) quickly figure out which class to instantiate when diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -28,7 +28,7 @@ CantParseArchiveName) from distutils2.version import (suggest_normalized_version, NormalizedVersion, get_version_predicate) -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.util import splitext __all__ = ['ReleaseInfo', 'DistInfo', 'ReleasesList', 'get_infos_from_url'] @@ -66,7 +66,7 @@ self._version = None self.version = version if metadata: - self.metadata = DistributionMetadata(mapping=metadata) + self.metadata = Metadata(mapping=metadata) else: self.metadata = None self.dists = {} @@ -174,7 +174,7 @@ def set_metadata(self, metadata): if not self.metadata: - self.metadata = DistributionMetadata() + self.metadata = Metadata() self.metadata.update(metadata) def __getitem__(self, item): @@ -216,7 +216,6 @@ __hash__ = object.__hash__ - class DistInfo(IndexReference): """Represents a distribution retrieved from an index (sdist, bdist, ...) 
""" @@ -346,7 +345,7 @@ return "" % self.dist_type return "<%s %s %s>" % ( - self.release.name, self.release.version, self.dist_type or "") + self.release.name, self.release.version, self.dist_type or "") class ReleasesList(IndexReference): diff --git a/distutils2/index/simple.py b/distutils2/index/simple.py --- a/distutils2/index/simple.py +++ b/distutils2/index/simple.py @@ -22,7 +22,7 @@ UnableToDownload, CantParseArchiveName, ReleaseNotFound, ProjectNotFound) from distutils2.index.mirrors import get_mirrors -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.version import get_version_predicate from distutils2 import __version__ as __distutils2_version__ @@ -203,7 +203,7 @@ if not release._metadata: location = release.get_distribution().unpack() pkg_info = os.path.join(location, 'PKG-INFO') - release._metadata = DistributionMetadata(pkg_info) + release._metadata = Metadata(pkg_info) return release def _switch_to_next_mirror(self): diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -41,7 +41,7 @@ _HAS_DOCUTILS = False # public API of this module -__all__ = ('DistributionMetadata', 'PKG_INFO_ENCODING', +__all__ = ('Metadata', 'PKG_INFO_ENCODING', 'PKG_INFO_PREFERRED_VERSION') # Encoding used for the PKG-INFO files @@ -187,7 +187,7 @@ _MISSING = NoDefault() -class DistributionMetadata(object): +class Metadata(object): """The metadata of a release. Supports versions 1.0, 1.1 and 1.2 (auto-detected). You can diff --git a/distutils2/tests/test_command_install_distinfo.py b/distutils2/tests/test_command_install_distinfo.py --- a/distutils2/tests/test_command_install_distinfo.py +++ b/distutils2/tests/test_command_install_distinfo.py @@ -5,7 +5,7 @@ from distutils2.command.install_distinfo import install_distinfo from distutils2.command.cmd import Command -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import unittest, support try: @@ -64,7 +64,7 @@ self.assertEqual(open(os.path.join(dist_info, 'REQUESTED')).read(), '') meta_path = os.path.join(dist_info, 'METADATA') - self.assertTrue(DistributionMetadata(path=meta_path).check()) + self.assertTrue(Metadata(path=meta_path).check()) def test_installer(self): pkg_dir, dist = self.create_dist(name='foo', diff --git a/distutils2/tests/test_dist.py b/distutils2/tests/test_dist.py --- a/distutils2/tests/test_dist.py +++ b/distutils2/tests/test_dist.py @@ -68,7 +68,7 @@ distutils2.dist.DEBUG = False def test_write_pkg_file(self): - # Check DistributionMetadata handling of Unicode fields + # Check Metadata handling of Unicode fields tmp_dir = self.mkdtemp() my_file = os.path.join(tmp_dir, 'f') cls = Distribution diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -5,7 +5,7 @@ from distutils2 import install from distutils2.index.xmlrpc import Client -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import run_unittest from distutils2.tests.support import TempdirManager from distutils2.tests.pypi_server import use_xmlrpc_server @@ -18,7 +18,7 @@ def __init__(self, name, version, deps): self.name = name self.version = version - self.metadata = DistributionMetadata() + self.metadata = Metadata() self.metadata['Requires-Dist'] = deps self.metadata['Provides-Dist'] = ['%s (%s)' % (name, 
version)] diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -4,7 +4,7 @@ import platform from StringIO import StringIO -from distutils2.metadata import (DistributionMetadata, +from distutils2.metadata import (Metadata, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -12,7 +12,7 @@ MetadataUnrecognizedVersionError) -class DistributionMetadataTestCase(LoggingCatcher, WarningsCatcher, +class MetadataTestCase(LoggingCatcher, WarningsCatcher, unittest.TestCase): def test_instantiation(self): @@ -24,35 +24,35 @@ fp.close() fp = StringIO(contents) - m = DistributionMetadata() + m = Metadata() self.assertRaises(MetadataUnrecognizedVersionError, m.items) - m = DistributionMetadata(PKG_INFO) + m = Metadata(PKG_INFO) self.assertEqual(len(m.items()), 22) - m = DistributionMetadata(fileobj=fp) + m = Metadata(fileobj=fp) self.assertEqual(len(m.items()), 22) - m = DistributionMetadata(mapping=dict(name='Test', version='1.0')) + m = Metadata(mapping=dict(name='Test', version='1.0')) self.assertEqual(len(m.items()), 11) d = dict(m.items()) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, fileobj=fp) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, mapping=d) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, fileobj=fp, mapping=d) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, mapping=m, fileobj=fp) def test_metadata_read_write(self): PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') - metadata = DistributionMetadata(PKG_INFO) + metadata = Metadata(PKG_INFO) out = StringIO() metadata.write_file(out) out.seek(0) - res = DistributionMetadata() + res = Metadata() res.read_file(out) for k in metadata.keys(): self.assertTrue(metadata[k] == res[k]) @@ -62,7 +62,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata(platform_dependent=True) + metadata = Metadata(platform_dependent=True) metadata.read_file(StringIO(content)) self.assertEqual(metadata['Requires-Dist'], ['bar']) metadata['Name'] = "baz; sys.platform == 'blah'" @@ -72,7 +72,7 @@ # test with context context = {'sys.platform': 'okook'} - metadata = DistributionMetadata(platform_dependent=True, + metadata = Metadata(platform_dependent=True, execution_context=context) metadata.read_file(StringIO(content)) self.assertEqual(metadata['Requires-Dist'], ['foo']) @@ -81,7 +81,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata() + metadata = Metadata() metadata.read_file(StringIO(content)) # see if we can read the description now @@ -100,7 +100,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata(fileobj=StringIO(content)) + metadata = Metadata(fileobj=StringIO(content)) self.assertIn('Version', metadata.keys()) self.assertIn('0.5', metadata.values()) self.assertIn(('Version', '0.5'), metadata.items()) @@ -111,7 +111,7 @@ self.assertEqual(metadata['Version'], '0.7') def test_versions(self): 
- metadata = DistributionMetadata() + metadata = Metadata() metadata['Obsoletes'] = 'ok' self.assertEqual(metadata['Metadata-Version'], '1.1') @@ -142,7 +142,7 @@ # XXX Spurious Warnings were disabled def XXXtest_warnings(self): - metadata = DistributionMetadata() + metadata = Metadata() # these should raise a warning values = (('Requires-Dist', 'Funky (Groovie)'), @@ -155,7 +155,7 @@ self.assertEqual(len(self.logs), 2) def test_multiple_predicates(self): - metadata = DistributionMetadata() + metadata = Metadata() # see for "3" instead of "3.0" ??? # its seems like the MINOR VERSION can be omitted @@ -165,14 +165,14 @@ self.assertEqual(len(self.warnings), 0) def test_project_url(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Project-URL'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-URL'], [('one', 'http://ok')]) self.assertEqual(metadata.version, '1.2') def test_check_version(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -181,7 +181,7 @@ self.assertEqual(missing, ['Version']) def test_check_version_strict(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -190,7 +190,7 @@ self.assertRaises(MetadataMissingError, metadata.check, strict=True) def test_check_name(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -199,7 +199,7 @@ self.assertEqual(missing, ['Name']) def test_check_name_strict(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -208,7 +208,7 @@ self.assertRaises(MetadataMissingError, metadata.check, strict=True) def test_check_author(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' @@ -217,7 +217,7 @@ self.assertEqual(missing, ['Author']) def test_check_homepage(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Name'] = 'vimpdb' metadata['Author'] = 'Monty Python' @@ -226,7 +226,7 @@ self.assertEqual(missing, ['Home-page']) def test_check_predicates(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = 'rr' metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' @@ -242,7 +242,7 @@ self.assertEqual(len(warnings), 4) def test_best_choice(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' self.assertEqual(metadata.version, PKG_INFO_PREFERRED_VERSION) metadata['Classifier'] = ['ok'] @@ -251,7 +251,7 @@ def test_project_urls(self): # project-url is a bit specific, make sure we write it # properly in PKG-INFO - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Project-Url'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')]) @@ -262,13 +262,13 @@ self.assertIn('Project-URL: one,http://ok', res) file_.seek(0) - metadata = DistributionMetadata() + metadata = Metadata() metadata.read_file(file_) self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')]) def 
test_suite(): - return unittest.makeSuite(DistributionMetadataTestCase) + return unittest.makeSuite(MetadataTestCase) if __name__ == '__main__': run_unittest(test_suite()) diff --git a/docs/design/pep-0376.txt b/docs/design/pep-0376.txt --- a/docs/design/pep-0376.txt +++ b/docs/design/pep-0376.txt @@ -425,7 +425,7 @@ - ``name``: The name of the distribution. -- ``metadata``: A ``DistributionMetadata`` instance loaded with the +- ``metadata``: A ``Metadata`` instance loaded with the distribution's PKG-INFO file. - ``requested``: A boolean that indicates whether the REQUESTED diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -1060,7 +1060,9 @@ .. module:: distutils2.metadata -.. autoclass:: distutils2.metadata.DistributionMetadata +.. FIXME CPython/stdlib docs don't use autoclass, write doc manually here + +.. autoclass:: distutils2.metadata.Metadata :members: :mod:`distutils2.util` --- Miscellaneous other utility functions diff --git a/docs/source/distutils/examples.rst b/docs/source/distutils/examples.rst --- a/docs/source/distutils/examples.rst +++ b/docs/source/distutils/examples.rst @@ -298,11 +298,11 @@ ``2.7`` or ``3.2``. You can read back this static file, by using the -:class:`distutils2.dist.DistributionMetadata` class and its +:class:`distutils2.dist.Metadata` class and its :func:`read_pkg_file` method:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata() + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata() >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) >>> metadata.name 'distribute' @@ -315,7 +315,7 @@ loads its values:: >>> pkg_info_path = 'distribute-0.6.8-py2.7.egg-info' - >>> DistributionMetadata(pkg_info_path).name + >>> Metadata(pkg_info_path).name 'distribute' diff --git a/docs/source/library/distutils2.metadata.rst b/docs/source/library/distutils2.metadata.rst --- a/docs/source/library/distutils2.metadata.rst +++ b/docs/source/library/distutils2.metadata.rst @@ -4,7 +4,7 @@ .. module:: distutils2.metadata -Distutils2 provides a :class:`~distutils2.metadata.DistributionMetadata` class that can read and +Distutils2 provides a :class:`~distutils2.metadata.Metadata` class that can read and write metadata files. This class is compatible with all metadata versions: * 1.0: :PEP:`241` @@ -19,11 +19,11 @@ Reading metadata ================ -The :class:`~distutils2.metadata.DistributionMetadata` class can be instantiated with the path of +The :class:`~distutils2.metadata.Metadata` class can be instantiated with the path of the metadata file, and provides a dict-like interface to the values:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata('PKG-INFO') + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata('PKG-INFO') >>> metadata.keys()[:5] ('Metadata-Version', 'Name', 'Version', 'Platform', 'Supported-Platform') >>> metadata['Name'] @@ -35,13 +35,13 @@ The fields that supports environment markers can be automatically ignored if the object is instantiated using the ``platform_dependent`` option. -:class:`~distutils2.metadata.DistributionMetadata` will interpret in the case the markers and will +:class:`~distutils2.metadata.Metadata` will interpret in the case the markers and will automatically remove the fields that are not compliant with the running environment. Here's an example under Mac OS X. 
The win32 dependency we saw earlier is ignored:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True) + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata('PKG-INFO', platform_dependent=True) >>> metadata['Requires-Dist'] ['bar'] @@ -53,9 +53,9 @@ Here's an example, simulating a win32 environment:: - >>> from distutils2.metadata import DistributionMetadata + >>> from distutils2.metadata import Metadata >>> context = {'sys.platform': 'win32'} - >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True, + >>> metadata = Metadata('PKG-INFO', platform_dependent=True, ... execution_context=context) ... >>> metadata['Requires-Dist'] = ["pywin32; sys.platform == 'win32'", @@ -83,8 +83,8 @@ Some fields in :PEP:`345` have to follow a version scheme in their versions predicate. When the scheme is violated, a warning is emitted:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata() + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata() >>> metadata['Requires-Dist'] = ['Funky (Groovie)'] "Funky (Groovie)" is not a valid predicate >>> metadata['Requires-Dist'] = ['Funky (1.2)'] -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Slight cleanup for mkcfg + fixed two bugs with except clauses. Message-ID: tarek.ziade pushed 1d1d07ebb991 to distutils2: http://hg.python.org/distutils2/rev/1d1d07ebb991 changeset: 1008:1d1d07ebb991 user: ?ric Araujo date: Thu Feb 03 20:13:36 2011 +0100 summary: Slight cleanup for mkcfg + fixed two bugs with except clauses. files: distutils2/mkcfg.py distutils2/tests/test_mkcfg.py diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -25,18 +25,15 @@ import os import sys +import glob import re import shutil -import glob -import re from ConfigParser import RawConfigParser from textwrap import dedent -if sys.version_info[:2] < (2, 6): - from sets import Set as set try: from hashlib import md5 except ImportError: - from md5 import md5 + from distutils2._backport.hashlib import md5 # importing this with an underscore as it should be replaced by the # dict form or another structures for all purposes from distutils2._trove import all_classifiers as _CLASSIFIERS_LIST @@ -92,10 +89,10 @@ Optionally, you can set other trove identifiers for things such as the human language, programming language, user interface, etc... ''', - 'setup.py found':''' + 'setup.py found': ''' The setup.py script will be executed to retrieve the metadata. A wizard will be run if you answer "n", -''' +''', } # XXX everything needs docstrings and tests (both low-level tests of various @@ -162,6 +159,7 @@ CLASSIFIERS = _build_classifiers_dict(_CLASSIFIERS_LIST) + def _build_licences(classifiers): res = [] for index, item in enumerate(classifiers): @@ -172,6 +170,7 @@ LICENCES = _build_licences(_CLASSIFIERS_LIST) + class MainProgram(object): def __init__(self): self.configparser = None @@ -235,8 +234,8 @@ if ans != 'y': return - #_______mock setup start data = self.data + def setup(**attrs): """Mock the setup(**attrs) in order to retrive metadata.""" # use the distutils v1 processings to correctly parse metadata. 
@@ -272,11 +271,12 @@ if len(dist.data_files) < 2 or \ isinstance(dist.data_files[1], str): dist.data_files = [('', dist.data_files)] - #add tokens in the destination paths - vars = {'distribution.name':data['name']} + # add tokens in the destination paths + vars = {'distribution.name': data['name']} path_tokens = sysconfig.get_paths(vars=vars).items() - #sort tokens to use the longest one first - path_tokens.sort(cmp=lambda x,y: cmp(len(y), len(x)), + # sort tokens to use the longest one first + # TODO chain two sorted with key arguments, remove cmp + path_tokens.sort(cmp=lambda x, y: cmp(len(y), len(x)), key=lambda x: x[1]) for dest, srcs in (dist.data_files or []): dest = os.path.join(sys.prefix, dest) @@ -291,29 +291,31 @@ package_dirs = dist.package_dir or {} for package, extras in dist.package_data.iteritems() or []: package_dir = package_dirs.get(package, package) - fils = [os.path.join(package_dir, fil) for fil in extras] - data['extra_files'].extend(fils) + files = [os.path.join(package_dir, f) for f in extras] + data['extra_files'].extend(files) # Use README file if its content is the desciption if "description" in data: ref = md5(re.sub('\s', '', self.data['description']).lower()) ref = ref.digest() for readme in glob.glob('README*'): - fob = open(readme) - val = md5(re.sub('\s', '', fob.read()).lower()).digest() - fob.close() + fp = open(readme) + try: + contents = fp.read() + finally: + fp.close() + val = md5(re.sub('\s', '', contents.lower())).digest() if val == ref: del data['description'] data['description-file'] = readme break - #_________ mock setup end # apply monkey patch to distutils (v1) and setuptools (if needed) # (abord the feature if distutils v1 has been killed) try: import distutils.core as DC - getattr(DC, 'setup') # ensure distutils v1 - except ImportError, AttributeError: + DC.setup # ensure distutils v1 + except (ImportError, AttributeError): return saved_setups = [(DC, DC.setup)] DC.setup = setup @@ -321,15 +323,15 @@ import setuptools saved_setups.append((setuptools, setuptools.setup)) setuptools.setup = setup - except ImportError, AttributeError: + except (ImportError, AttributeError): pass # get metadata by executing the setup.py with the patched setup(...) 
- success = False # for python < 2.4 + success = False # for python < 2.4 try: pyenv = globals().copy() execfile(setuppath, pyenv) success = True - finally: #revert monkey patches + finally: # revert monkey patches for patched_module, original_setup in saved_setups: patched_module.setup = original_setup if not self.data: @@ -339,7 +341,8 @@ def inspect_file(self, path): fp = open(path, 'r') try: - for line in [fp.readline() for _ in range(10)]: + for _ in xrange(10): + line = fp.readline() m = re.match(r'^#!.*python((?P\d)(\.\d+)?)?$', line) if m: if m.group('major') == '3': @@ -393,7 +396,6 @@ helptext=_helptext['extra_files']) == 'y': self._set_multi('Extra file/dir name', 'extra_files') - if ask_yn('Do you want to set Trove classifiers?', helptext=_helptext['do_classifier']) == 'y': self.set_classifier() @@ -413,7 +415,6 @@ _pref = ['lib', 'include', 'dist', 'build', '.', '~'] _suf = ['.pyc'] - def to_skip(path): path = relative(path) diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -1,11 +1,8 @@ # -*- coding: utf-8 -*- """Tests for distutils.mkcfg.""" import os -import os.path as osp import sys import StringIO -if sys.version_info[:2] < (2, 6): - from sets import Set as set from textwrap import dedent from distutils2.tests import run_unittest, support, unittest @@ -114,9 +111,11 @@ sys.stdin.write('y\n') sys.stdin.seek(0) main() - fid = open(osp.join(self.wdir, 'setup.cfg')) - lines = set([line.rstrip() for line in fid]) - fid.close() + fp = open(os.path.join(self.wdir, 'setup.cfg')) + try: + lines = set([line.rstrip() for line in fp]) + finally: + fp.close() self.assertEqual(lines, set(['', '[metadata]', 'version = 0.2', @@ -183,9 +182,11 @@ sys.stdin.write('y\n') sys.stdin.seek(0) main() - fid = open(osp.join(self.wdir, 'setup.cfg')) - lines = set([line.strip() for line in fid]) - fid.close() + fp = open(os.path.join(self.wdir, 'setup.cfg')) + try: + lines = set([line.strip() for line in fp]) + finally: + fp.close() self.assertEqual(lines, set(['', '[metadata]', 'version = 0.2', -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Fix type of d2.metadata.__all__ Message-ID: tarek.ziade pushed bbc4437b851b to distutils2: http://hg.python.org/distutils2/rev/bbc4437b851b changeset: 1010:bbc4437b851b user: ?ric Araujo date: Sun Feb 06 18:12:35 2011 +0100 summary: Fix type of d2.metadata.__all__ files: distutils2/metadata.py diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -41,8 +41,8 @@ _HAS_DOCUTILS = False # public API of this module -__all__ = ('Metadata', 'PKG_INFO_ENCODING', - 'PKG_INFO_PREFERRED_VERSION') +__all__ = ['Metadata', 'PKG_INFO_ENCODING', + 'PKG_INFO_PREFERRED_VERSION'] # Encoding used for the PKG-INFO files PKG_INFO_ENCODING = 'utf-8' -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: pep8 formatting of the distutils2.index package Message-ID: tarek.ziade pushed 89b8e21129e6 to distutils2: http://hg.python.org/distutils2/rev/89b8e21129e6 changeset: 1011:89b8e21129e6 user: Kelsey Hightower date: Wed Feb 09 16:48:07 2011 -0500 
summary: pep8 formatting of the distutils2.index package files: distutils2/index/__init__.py distutils2/index/dist.py distutils2/index/mirrors.py distutils2/index/wrapper.py distutils2/index/xmlrpc.py diff --git a/distutils2/index/__init__.py b/distutils2/index/__init__.py --- a/distutils2/index/__init__.py +++ b/distutils2/index/__init__.py @@ -6,6 +6,6 @@ 'xmlrpc', 'dist', 'errors', - 'mirrors',] + 'mirrors'] from dist import ReleaseInfo, ReleasesList, DistInfo diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -109,7 +109,8 @@ self.dists = {} return self.dists - def add_distribution(self, dist_type='sdist', python_version=None, **params): + def add_distribution(self, dist_type='sdist', python_version=None, + **params): """Add distribution informations to this release. If distribution information is already set for this distribution type, add the given url paths to the distribution. This can be useful while diff --git a/distutils2/index/mirrors.py b/distutils2/index/mirrors.py --- a/distutils2/index/mirrors.py +++ b/distutils2/index/mirrors.py @@ -1,4 +1,4 @@ -"""Utilities related to the mirror infrastructure defined in PEP 381. +"""Utilities related to the mirror infrastructure defined in PEP 381. See http://www.python.org/dev/peps/pep-0381/ """ @@ -7,6 +7,7 @@ DEFAULT_MIRROR_URL = "last.pypi.python.org" + def get_mirrors(hostname=None): """Return the list of mirrors from the last record found on the DNS entry:: @@ -19,7 +20,7 @@ """ if hostname is None: hostname = DEFAULT_MIRROR_URL - + # return the last mirror registered on PyPI. try: hostname = socket.gethostbyname_ex(hostname)[0] @@ -30,23 +31,24 @@ # determine the list from the last one. return ["%s.%s" % (s, end_letter[1]) for s in string_range(end_letter[0])] + def string_range(last): """Compute the range of string between "a" and last. - + This works for simple "a to z" lists, but also for "a to zz" lists. """ for k in range(len(last)): - for x in product(ascii_lowercase, repeat=k+1): + for x in product(ascii_lowercase, repeat=(k + 1)): result = ''.join(x) yield result if result == last: return + def product(*args, **kwds): pools = map(tuple, args) * kwds.get('repeat', 1) result = [[]] for pool in pools: - result = [x+[y] for x in result for y in pool] + result = [x + [y] for x in result for y in pool] for prod in result: yield tuple(prod) - diff --git a/distutils2/index/wrapper.py b/distutils2/index/wrapper.py --- a/distutils2/index/wrapper.py +++ b/distutils2/index/wrapper.py @@ -9,6 +9,7 @@ _WRAPPER_INDEXES = {'xmlrpc': xmlrpc.Client, 'simple': simple.Crawler} + def switch_index_if_fails(func, wrapper): """Decorator that switch of index (for instance from xmlrpc to simple) if the first mirror return an empty list or raises an exception. 
@@ -82,11 +83,11 @@ other_indexes = [i for i in self._indexes if i != self._default_index] for index in other_indexes: - real_method = getattr(self._indexes[index], method_name, None) + real_method = getattr(self._indexes[index], method_name, + None) if real_method: break if real_method: return switch_index_if_fails(real_method, self) else: raise AttributeError("No index have attribute '%s'" % method_name) - diff --git a/distutils2/index/xmlrpc.py b/distutils2/index/xmlrpc.py --- a/distutils2/index/xmlrpc.py +++ b/distutils2/index/xmlrpc.py @@ -103,7 +103,6 @@ project.sort_releases(prefer_final) return project - def get_distributions(self, project_name, version): """Grab informations about distributions from XML-RPC. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Solve bug with --help-commands. Message-ID: tarek.ziade pushed 0da75325526c to distutils2: http://hg.python.org/distutils2/rev/0da75325526c changeset: 1009:0da75325526c user: ?ric Araujo date: Sun Feb 06 18:11:55 2011 +0100 summary: Solve bug with --help-commands. files: distutils2/command/__init__.py distutils2/dist.py diff --git a/distutils2/command/__init__.py b/distutils2/command/__init__.py --- a/distutils2/command/__init__.py +++ b/distutils2/command/__init__.py @@ -5,6 +5,9 @@ from distutils2.errors import DistutilsModuleError from distutils2.util import resolve_name +__all__ = ['get_command_names', 'set_command', 'get_command_class', + 'STANDARD_COMMANDS'] + _COMMANDS = { 'check': 'distutils2.command.check.check', 'test': 'distutils2.command.test.test', @@ -29,6 +32,8 @@ 'upload': 'distutils2.command.upload.upload', 'upload_docs': 'distutils2.command.upload_docs.upload_docs'} +STANDARD_COMMANDS = set(_COMMANDS) + def get_command_names(): """Return registered commands""" diff --git a/distutils2/dist.py b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -17,7 +17,7 @@ from distutils2 import logger from distutils2.metadata import Metadata from distutils2.config import Config -from distutils2.command import get_command_class +from distutils2.command import get_command_class, STANDARD_COMMANDS # Regex to define acceptable Distutils command names. This is not *quite* # the same as a Python NAME -- I don't allow leading underscores. The fact @@ -589,31 +589,26 @@ print(header + ":") for cmd in commands: - cls = self.cmdclass.get(cmd) - if not cls: - cls = get_command_class(cmd) - try: - description = cls.description - except AttributeError: - description = "(no description available)" + cls = self.cmdclass.get(cmd) or get_command_class(cmd) + description = getattr(cls, 'description', + '(no description available)') print(" %-*s %s" % (max_length, cmd, description)) def _get_command_groups(self): """Helper function to retrieve all the command class names divided - into standard commands (listed in distutils2.command.__all__) - and extra commands (given in self.cmdclass and not standard - commands). + into standard commands (listed in + distutils2.command.STANDARD_COMMANDS) and extra commands (given in + self.cmdclass and not standard commands). 
""" - from distutils2.command import __all__ as std_commands extra_commands = [cmd for cmd in self.cmdclass - if cmd not in std_commands] - return std_commands, extra_commands + if cmd not in STANDARD_COMMANDS] + return STANDARD_COMMANDS, extra_commands def print_commands(self): """Print out a help message listing all available commands with a description of each. The list is divided into standard commands - (listed in distutils2.command.__all__) and extra commands + (listed in distutils2.command.STANDARD_COMMANDS) and extra commands (given in self.cmdclass and not standard commands). The descriptions come from the command class attribute 'description'. @@ -633,10 +628,8 @@ "Extra commands", max_length) - # -- Command class/object methods ---------------------------------- - def get_command_obj(self, command, create=1): """Return the command object for 'command'. Normally this object is cached on a previous call to 'get_command_obj()'; if no command -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Fixing logging strings; improve lower-case consistency Message-ID: tarek.ziade pushed 39e2a0b88acf to distutils2: http://hg.python.org/distutils2/rev/39e2a0b88acf changeset: 1012:39e2a0b88acf user: Kelsey Hightower date: Wed Feb 09 18:57:42 2011 -0500 summary: Fixing logging strings; improve lower-case consistency files: distutils2/_backport/shutil.py distutils2/command/bdist_dumb.py distutils2/command/bdist_wininst.py distutils2/command/clean.py distutils2/command/register.py distutils2/command/sdist.py distutils2/compiler/bcppcompiler.py distutils2/compiler/msvc9compiler.py distutils2/compiler/msvccompiler.py distutils2/compiler/unixccompiler.py distutils2/index/simple.py distutils2/install.py diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -408,7 +408,7 @@ from distutils2._backport import tarfile if logger is not None: - logger.info('Creating tar archive') + logger.info('creating tar archive') uid = _get_uid(owner) gid = _get_gid(group) diff --git a/distutils2/command/bdist_dumb.py b/distutils2/command/bdist_dumb.py --- a/distutils2/command/bdist_dumb.py +++ b/distutils2/command/bdist_dumb.py @@ -129,7 +129,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s' % self.bdist_dir) + logger.info('removing %s' % self.bdist_dir) else: rmtree(self.bdist_dir) diff --git a/distutils2/command/bdist_wininst.py b/distutils2/command/bdist_wininst.py --- a/distutils2/command/bdist_wininst.py +++ b/distutils2/command/bdist_wininst.py @@ -192,7 +192,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s' % self.bdist_dir) + logger.info('removing %s' % self.bdist_dir) else: rmtree(self.bdist_dir) diff --git a/distutils2/command/clean.py b/distutils2/command/clean.py --- a/distutils2/command/clean.py +++ b/distutils2/command/clean.py @@ -48,7 +48,7 @@ # gone) if os.path.exists(self.build_temp): if self.dry_run: - logger.info('Removing %s' % self.build_temp) + logger.info('removing %s' % self.build_temp) else: rmtree(self.build_temp) else: @@ -62,7 +62,7 @@ self.build_scripts): if os.path.exists(directory): if self.dry_run: - logger.info('Removing %s' % directory) + logger.info('removing %s' % directory) else: rmtree(directory) else: diff --git a/distutils2/command/register.py 
b/distutils2/command/register.py --- a/distutils2/command/register.py +++ b/distutils2/command/register.py @@ -92,7 +92,7 @@ ''' # send the info to the server and report the result code, result = self.post_to_server(self.build_post_data('verify')) - logger.info('Server response (%s): %s' % (code, result)) + logger.info('server response (%s): %s' % (code, result)) def send_metadata(self): @@ -206,10 +206,10 @@ data['email'] = raw_input(' EMail: ') code, result = self.post_to_server(data) if code != 200: - logger.info('Server response (%s): %s' % (code, result)) + logger.info('server response (%s): %s' % (code, result)) else: - logger.info('You will receive an email shortly.') - logger.info(('Follow the instructions in it to ' + logger.info('you will receive an email shortly.') + logger.info(('follow the instructions in it to ' 'complete registration.')) elif choice == '3': data = {':action': 'password_reset'} @@ -217,7 +217,7 @@ while not data['email']: data['email'] = raw_input('Your email address: ') code, result = self.post_to_server(data) - logger.info('Server response (%s): %s' % (code, result)) + logger.info('server response (%s): %s' % (code, result)) def build_post_data(self, action): # figure the data to send - the metadata plus some additional diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py --- a/distutils2/command/sdist.py +++ b/distutils2/command/sdist.py @@ -336,7 +336,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s' % base_dir) + logger.info('removing %s' % base_dir) else: rmtree(base_dir) diff --git a/distutils2/compiler/bcppcompiler.py b/distutils2/compiler/bcppcompiler.py --- a/distutils2/compiler/bcppcompiler.py +++ b/distutils2/compiler/bcppcompiler.py @@ -191,7 +191,7 @@ self._fix_lib_args (libraries, library_dirs, runtime_library_dirs) if runtime_library_dirs: - logger.warning(("I don't know what to do with " + logger.warning(("don't know what to do with " "'runtime_library_dirs': %s"), str(runtime_library_dirs)) diff --git a/distutils2/compiler/msvc9compiler.py b/distutils2/compiler/msvc9compiler.py --- a/distutils2/compiler/msvc9compiler.py +++ b/distutils2/compiler/msvc9compiler.py @@ -215,7 +215,7 @@ productdir = Reg.get_value(r"%s\Setup\VC" % vsbase, "productdir") except KeyError: - logger.debug("Unable to find productdir in registry") + logger.debug("unable to find productdir in registry") productdir = None if not productdir or not os.path.isdir(productdir): @@ -229,14 +229,14 @@ logger.debug("%s is not a valid directory" % productdir) return None else: - logger.debug("Env var %s is not set or invalid" % toolskey) + logger.debug("env var %s is not set or invalid" % toolskey) if not productdir: - logger.debug("No productdir found") + logger.debug("no productdir found") return None vcvarsall = os.path.join(productdir, "vcvarsall.bat") if os.path.isfile(vcvarsall): return vcvarsall - logger.debug("Unable to find vcvarsall.bat") + logger.debug("unable to find vcvarsall.bat") return None def query_vcvarsall(version, arch="x86"): @@ -248,7 +248,7 @@ if vcvarsall is None: raise DistutilsPlatformError("Unable to find vcvarsall.bat") - logger.debug("Calling 'vcvarsall.bat %s' (version=%s)", arch, version) + logger.debug("calling 'vcvarsall.bat %s' (version=%s)", arch, version) popen = subprocess.Popen('"%s" %s & set' % (vcvarsall, arch), stdout=subprocess.PIPE, stderr=subprocess.PIPE) diff --git a/distutils2/compiler/msvccompiler.py b/distutils2/compiler/msvccompiler.py --- a/distutils2/compiler/msvccompiler.py +++ 
b/distutils2/compiler/msvccompiler.py @@ -44,9 +44,9 @@ RegError = win32api.error except ImportError: - logger.info("Warning: Can't read registry to find the " + logger.info("warning: can't read registry to find the " "necessary compiler setting\n" - "Make sure that Python modules _winreg, " + "make sure that Python modules _winreg, " "win32api or win32con are installed.") pass @@ -653,7 +653,7 @@ if get_build_version() >= 8.0: - logger.debug("Importing new compiler from distutils.msvc9compiler") + logger.debug("importing new compiler from distutils.msvc9compiler") OldMSVCCompiler = MSVCCompiler from distutils2.compiler.msvc9compiler import MSVCCompiler # get_build_architecture not really relevant now we support cross-compile diff --git a/distutils2/compiler/unixccompiler.py b/distutils2/compiler/unixccompiler.py --- a/distutils2/compiler/unixccompiler.py +++ b/distutils2/compiler/unixccompiler.py @@ -98,9 +98,9 @@ sysroot = compiler_so[idx+1] if sysroot and not os.path.isdir(sysroot): - logger.warning("Compiling with an SDK that doesn't seem to exist: %s", + logger.warning("compiling with an SDK that doesn't seem to exist: %s", sysroot) - logger.warning("Please check your Xcode installation") + logger.warning("please check your Xcode installation") return compiler_so diff --git a/distutils2/index/simple.py b/distutils2/index/simple.py --- a/distutils2/index/simple.py +++ b/distutils2/index/simple.py @@ -168,7 +168,7 @@ if predicate.name.lower() in self._projects and not force_update: return self._projects.get(predicate.name.lower()) prefer_final = self._get_prefer_final(prefer_final) - logger.info('Reading info on PyPI about %s' % predicate.name) + logger.info('reading info on PyPI about %s' % predicate.name) self._process_index_page(predicate.name) if predicate.name.lower() not in self._projects: diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -134,12 +134,12 @@ installed_dists, installed_files = [], [] for dist in dists: - logger.info('Installing %s %s' % (dist.name, dist.version)) + logger.info('installing %s %s' % (dist.name, dist.version)) try: installed_files.extend(_install_dist(dist, path)) installed_dists.append(dist) except Exception, e: - logger.info('Failed. %s' % str(e)) + logger.info('failed. %s' % str(e)) # reverting for installed_dist in installed_dists: @@ -243,7 +243,7 @@ conflict. """ if not installed: - logger.info('Reading installed distributions') + logger.info('reading installed distributions') installed = get_distributions(use_egg_info=True) infos = {'install': [], 'remove': [], 'conflict': []} @@ -258,7 +258,7 @@ if predicate.name.lower() != installed_project.name.lower(): continue found = True - logger.info('Found %s %s' % (installed_project.name, + logger.info('found %s %s' % (installed_project.name, installed_project.version)) # if we already have something installed, check it matches the @@ -268,7 +268,7 @@ break if not found: - logger.info('Project not installed.') + logger.info('project not installed.') if not index: index = wrapper.ClientWrapper() @@ -283,7 +283,7 @@ release = releases.get_last(requirements, prefer_final=prefer_final) if release is None: - logger.info('Could not find a matching project') + logger.info('could not find a matching project') return infos # this works for Metadata 1.2 @@ -358,7 +358,7 @@ finally: shutil.rmtree(tmp) - logger.info('Removing %r...' % project_name) + logger.info('removing %r...' 
% project_name) file_count = 0 for file_ in rmfiles: @@ -391,20 +391,20 @@ if os.path.exists(dist.path): shutil.rmtree(dist.path) - logger.info('Success ! Removed %d files and %d dirs' % \ + logger.info('success ! removed %d files and %d dirs' % \ (file_count, dir_count)) def install(project): - logger.info('Getting information about "%s".' % project) + logger.info('getting information about "%s".' % project) try: info = get_infos(project) except InstallationException: - logger.info('Cound not find "%s".' % project) + logger.info('could not find "%s".' % project) return if info['install'] == []: - logger.info('Nothing to install.') + logger.info('nothing to install.') return install_path = get_config_var('base') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Clean up metadata Message-ID: tarek.ziade pushed ba7ca508c539 to distutils2: http://hg.python.org/distutils2/rev/ba7ca508c539 changeset: 1014:ba7ca508c539 user: ?ric Araujo date: Wed Feb 09 21:08:20 2011 +0100 summary: Clean up metadata files: distutils2/metadata.py diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -3,8 +3,6 @@ Supports all metadata formats (1.0, 1.1, 1.2). """ -import os -import sys import re from StringIO import StringIO from email import message_from_file @@ -82,6 +80,7 @@ _ALL_FIELDS.update(_314_FIELDS) _ALL_FIELDS.update(_345_FIELDS) + def _version2fieldlist(version): if version == '1.0': return _241_FIELDS @@ -138,46 +137,48 @@ # default marker when 1.0 is disqualified return '1.2' -_ATTR2FIELD = {'metadata_version': 'Metadata-Version', - 'name': 'Name', - 'version': 'Version', - 'platform': 'Platform', - 'supported_platform': 'Supported-Platform', - 'summary': 'Summary', - 'description': 'Description', - 'keywords': 'Keywords', - 'home_page': 'Home-page', - 'author': 'Author', - 'author_email': 'Author-email', - 'maintainer': 'Maintainer', - 'maintainer_email': 'Maintainer-email', - 'license': 'License', - 'classifier': 'Classifier', - 'download_url': 'Download-URL', - 'obsoletes_dist': 'Obsoletes-Dist', - 'provides_dist': 'Provides-Dist', - 'requires_dist': 'Requires-Dist', - 'requires_python': 'Requires-Python', - 'requires_external': 'Requires-External', - 'requires': 'Requires', - 'provides': 'Provides', - 'obsoletes': 'Obsoletes', - 'project_url': 'Project-URL', - } +_ATTR2FIELD = { + 'metadata_version': 'Metadata-Version', + 'name': 'Name', + 'version': 'Version', + 'platform': 'Platform', + 'supported_platform': 'Supported-Platform', + 'summary': 'Summary', + 'description': 'Description', + 'keywords': 'Keywords', + 'home_page': 'Home-page', + 'author': 'Author', + 'author_email': 'Author-email', + 'maintainer': 'Maintainer', + 'maintainer_email': 'Maintainer-email', + 'license': 'License', + 'classifier': 'Classifier', + 'download_url': 'Download-URL', + 'obsoletes_dist': 'Obsoletes-Dist', + 'provides_dist': 'Provides-Dist', + 'requires_dist': 'Requires-Dist', + 'requires_python': 'Requires-Python', + 'requires_external': 'Requires-External', + 'requires': 'Requires', + 'provides': 'Provides', + 'obsoletes': 'Obsoletes', + 'project_url': 'Project-URL', +} _PREDICATE_FIELDS = ('Requires-Dist', 'Obsoletes-Dist', 'Provides-Dist') _VERSIONS_FIELDS = ('Requires-Python',) _VERSION_FIELDS = ('Version',) _LISTFIELDS = ('Platform', 'Classifier', 
'Obsoletes', - 'Requires', 'Provides', 'Obsoletes-Dist', - 'Provides-Dist', 'Requires-Dist', 'Requires-External', - 'Project-URL', 'Supported-Platform') + 'Requires', 'Provides', 'Obsoletes-Dist', + 'Provides-Dist', 'Requires-Dist', 'Requires-External', + 'Project-URL', 'Supported-Platform') _LISTTUPLEFIELDS = ('Project-URL',) _ELEMENTSFIELD = ('Keywords',) _UNICODEFIELDS = ('Author', 'Maintainer', 'Summary', 'Description') + class NoDefault(object): """Marker object used for clean representation""" def __repr__(self): @@ -185,6 +186,7 @@ _MISSING = NoDefault() + class Metadata(object): """The metadata of a release. @@ -474,7 +476,7 @@ missing.append(attr) if strict and missing != []: - msg = "missing required metadata: %s" % ', '.join(missing) + msg = 'missing required metadata: %s' % ', '.join(missing) raise MetadataMissingError(msg) for attr in ('Home-page', 'Author'): @@ -504,14 +506,13 @@ return missing, warnings + # Mapping API + def keys(self): - """Dict like api""" return _version2fieldlist(self.version) def values(self): - """Dict like api""" return [self[key] for key in self.keys()] def items(self): - """Dict like api""" return [(key, self[key]) for key in self.keys()] -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Remove unneeded constant in favor of comment. Message-ID: tarek.ziade pushed 7e1118583e5f to distutils2: http://hg.python.org/distutils2/rev/7e1118583e5f changeset: 1013:7e1118583e5f parent: 1010:bbc4437b851b user: ?ric Araujo date: Wed Feb 09 21:08:06 2011 +0100 summary: Remove unneeded constant in favor of comment. files: distutils2/metadata.py diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -77,8 +77,6 @@ 'Obsoletes-Dist', 'Requires-External', 'Maintainer', 'Maintainer-email', 'Project-URL') -_345_REQUIRED = ('Name', 'Version') - _ALL_FIELDS = set() _ALL_FIELDS.update(_241_FIELDS) _ALL_FIELDS.update(_314_FIELDS) @@ -471,7 +469,7 @@ # XXX should check the versions (if the file was loaded) missing, warnings = [], [] - for attr in ('Name', 'Version'): + for attr in ('Name', 'Version'): # required by PEP 345 if attr not in self: missing.append(attr) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Use lazy form in logging calls, again. Message-ID: tarek.ziade pushed 2318e50ccda4 to distutils2: http://hg.python.org/distutils2/rev/2318e50ccda4 changeset: 1015:2318e50ccda4 user: ?ric Araujo date: Thu Feb 10 00:51:42 2011 +0100 summary: Use lazy form in logging calls, again. Logging calls have the signature (msg, *args, **kwargs) so that the %-formatting can be delayed until it is needed. Logger objects also have an isEnabledFor method that can be used to isolate expensive code. Next steps: use only one of d2.logger methods or logging module functions; use a proper handler in our test machinery instead of monkey-patching; remove cmd.warn and cmd.announce and use logging instead. TODOs have been added in the modules and on the wiki. 
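For reference, the two patterns described in the summary look like this in isolation (a minimal editorial sketch, not taken from the patch below)::

    import logging

    logger = logging.getLogger('distutils2')
    name, version = 'spam', '1.0'

    # eager: the message is %-formatted even when INFO is disabled
    logger.info('Installing %s %s' % (name, version))

    # lazy: logging performs the %-formatting only if a record is emitted
    logger.info('Installing %s %s', name, version)

    # isEnabledFor guards work that exists only to build the message
    if logger.isEnabledFor(logging.DEBUG):
        # the join stands in for an expensive computation
        files = ', '.join(sorted(['setup.cfg', 'distutils.cfg']))
        logger.debug('using config files: %s', files)

The config.py hunk in the patch below applies exactly this isEnabledFor guard around the ', '.join of the config file names.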
files: distutils2/command/bdist_dumb.py distutils2/command/bdist_wininst.py distutils2/command/clean.py distutils2/command/cmd.py distutils2/command/config.py distutils2/command/install_dist.py distutils2/command/register.py distutils2/command/sdist.py distutils2/compiler/msvc9compiler.py distutils2/config.py distutils2/dist.py distutils2/index/simple.py distutils2/index/xmlrpc.py distutils2/install.py distutils2/manifest.py distutils2/tests/support.py distutils2/tests/test_command_build_py.py distutils2/tests/test_command_install_lib.py distutils2/tests/test_command_sdist.py distutils2/tests/test_manifest.py diff --git a/distutils2/command/bdist_dumb.py b/distutils2/command/bdist_dumb.py --- a/distutils2/command/bdist_dumb.py +++ b/distutils2/command/bdist_dumb.py @@ -87,7 +87,7 @@ install.skip_build = self.skip_build install.warn_dir = 0 - logger.info("installing to %s" % self.bdist_dir) + logger.info("installing to %s", self.bdist_dir) self.run_command('install_dist') # And make an archive relative to the root of the @@ -106,11 +106,10 @@ else: if (self.distribution.has_ext_modules() and (install.install_base != install.install_platbase)): - raise DistutilsPlatformError, \ - ("can't make a dumb built distribution where " - "base and platbase are different (%s, %s)" - % (repr(install.install_base), - repr(install.install_platbase))) + raise DistutilsPlatformError( + "can't make a dumb built distribution where base and " + "platbase are different (%r, %r)" % + (install.install_base, install.install_platbase)) else: archive_root = os.path.join( self.bdist_dir, @@ -129,7 +128,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s' % self.bdist_dir) + logger.info('Removing %s', self.bdist_dir) else: rmtree(self.bdist_dir) diff --git a/distutils2/command/bdist_wininst.py b/distutils2/command/bdist_wininst.py --- a/distutils2/command/bdist_wininst.py +++ b/distutils2/command/bdist_wininst.py @@ -192,7 +192,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s' % self.bdist_dir) + logger.info('Removing %s', self.bdist_dir) else: rmtree(self.bdist_dir) diff --git a/distutils2/command/clean.py b/distutils2/command/clean.py --- a/distutils2/command/clean.py +++ b/distutils2/command/clean.py @@ -48,7 +48,7 @@ # gone) if os.path.exists(self.build_temp): if self.dry_run: - logger.info('Removing %s' % self.build_temp) + logger.info('Removing %s', self.build_temp) else: rmtree(self.build_temp) else: @@ -62,7 +62,7 @@ self.build_scripts): if os.path.exists(directory): if self.dry_run: - logger.info('Removing %s' % directory) + logger.info('Removing %s', directory) else: rmtree(directory) else: diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -182,6 +182,7 @@ raise RuntimeError( "abstract method -- subclass %s must override" % self.__class__) + # TODO remove this method, just use logging def announce(self, msg, level=logging.INFO): """If the current verbosity level is of greater than or equal to 'level' print 'msg' to stdout. 
@@ -363,8 +364,9 @@ # -- External world manipulation ----------------------------------- + # TODO remove this method, just use logging def warn(self, msg): - logger.warning("warning: %s: %s\n" % (self.get_command_name(), msg)) + logger.warning("warning: %s: %s\n", self.get_command_name(), msg) def execute(self, func, args, msg=None, level=1): util.execute(func, args, msg, dry_run=self.dry_run) diff --git a/distutils2/command/config.py b/distutils2/command/config.py --- a/distutils2/command/config.py +++ b/distutils2/command/config.py @@ -345,7 +345,7 @@ If head is not None, will be dumped before the file content. """ if head is None: - logger.info('%s' % filename) + logger.info(filename) else: logger.info(head) file = open(filename) diff --git a/distutils2/command/install_dist.py b/distutils2/command/install_dist.py --- a/distutils2/command/install_dist.py +++ b/distutils2/command/install_dist.py @@ -418,7 +418,7 @@ else: opt_name = opt_name.replace('-', '_') val = getattr(self, opt_name) - logger.debug(" %s: %s" % (opt_name, val)) + logger.debug(" %s: %s", opt_name, val) def select_scheme(self, name): """Set the install directories by applying the install schemes.""" diff --git a/distutils2/command/register.py b/distutils2/command/register.py --- a/distutils2/command/register.py +++ b/distutils2/command/register.py @@ -92,7 +92,7 @@ ''' # send the info to the server and report the result code, result = self.post_to_server(self.build_post_data('verify')) - logger.info('Server response (%s): %s' % (code, result)) + logger.info('Server response (%s): %s', code, result) def send_metadata(self): @@ -206,18 +206,18 @@ data['email'] = raw_input(' EMail: ') code, result = self.post_to_server(data) if code != 200: - logger.info('Server response (%s): %s' % (code, result)) + logger.info('Server response (%s): %s', code, result) else: logger.info('You will receive an email shortly.') - logger.info(('Follow the instructions in it to ' - 'complete registration.')) + logger.info('Follow the instructions in it to ' + 'complete registration.') elif choice == '3': data = {':action': 'password_reset'} data['email'] = '' while not data['email']: data['email'] = raw_input('Your email address: ') code, result = self.post_to_server(data) - logger.info('Server response (%s): %s' % (code, result)) + logger.info('Server response (%s): %s', (code, result)) def build_post_data(self, action): # figure the data to send - the metadata plus some additional diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py --- a/distutils2/command/sdist.py +++ b/distutils2/command/sdist.py @@ -2,10 +2,7 @@ Implements the Distutils 'sdist' command (create a source distribution).""" import os -import string import sys -from glob import glob -from warnings import warn from shutil import rmtree import re from StringIO import StringIO @@ -18,11 +15,10 @@ from distutils2.command import get_command_names from distutils2.command.cmd import Command from distutils2.errors import (DistutilsPlatformError, DistutilsOptionError, - DistutilsTemplateError, DistutilsModuleError, - DistutilsFileError) + DistutilsModuleError, DistutilsFileError) from distutils2.manifest import Manifest from distutils2 import logger -from distutils2.util import convert_path, resolve_name +from distutils2.util import resolve_name def show_formats(): """Print all possible values for the 'formats' option (used by @@ -300,7 +296,7 @@ for file in files: if not os.path.isfile(file): - logger.warn("'%s' not a regular file -- skipping" % file) + 
logger.warn("'%s' not a regular file -- skipping", file) else: dest = os.path.join(base_dir, file) self.copy_file(file, dest, link=link) @@ -336,7 +332,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s' % base_dir) + logger.info('Removing %s', base_dir) else: rmtree(base_dir) diff --git a/distutils2/compiler/msvc9compiler.py b/distutils2/compiler/msvc9compiler.py --- a/distutils2/compiler/msvc9compiler.py +++ b/distutils2/compiler/msvc9compiler.py @@ -226,10 +226,10 @@ productdir = os.path.join(toolsdir, os.pardir, os.pardir, "VC") productdir = os.path.abspath(productdir) if not os.path.isdir(productdir): - logger.debug("%s is not a valid directory" % productdir) + logger.debug("%s is not a valid directory", productdir) return None else: - logger.debug("Env var %s is not set or invalid" % toolskey) + logger.debug("Env var %s is not set or invalid", toolskey) if not productdir: logger.debug("No productdir found") return None diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -3,8 +3,8 @@ Know how to read all config files Distutils2 uses. """ import os -import re import sys +import logging from ConfigParser import RawConfigParser from shlex import split @@ -90,7 +90,8 @@ if os.path.isfile(local_file): files.append(local_file) - logger.debug("using config files: %s" % ', '.join(files)) + if logger.isEnabledFor(logging.DEBUG): + logger.debug("using config files: %s", ', '.join(files)) return files def _convert_metadata(self, name, value): @@ -168,7 +169,7 @@ files = dict([(key, _convert(key, value)) for key, value in content['files'].iteritems()]) self.dist.packages = [] - self.dist.package_dir = pkg_dir = files.get('packages_root') + self.dist.package_dir = files.get('packages_root') packages = files.get('packages', []) if isinstance(packages, str): @@ -241,7 +242,7 @@ parser = RawConfigParser() for filename in filenames: - logger.debug(" reading %s" % filename) + logger.debug(" reading %s", filename) parser.read(filename) if os.path.split(filename)[-1] == 'setup.cfg': diff --git a/distutils2/dist.py b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -293,10 +293,10 @@ opt_dict = self.command_options.get(cmd_name) if opt_dict is None: self.announce(indent + - "no option dict for '%s' command" % cmd_name) + "no option dict for %r command" % cmd_name) else: self.announce(indent + - "option dict for '%s' command:" % cmd_name) + "option dict for %r command:" % cmd_name) out = pformat(opt_dict) for line in out.split('\n'): self.announce(indent + " " + line) @@ -401,7 +401,7 @@ # Pull the current command from the head of the command line command = args[0] if not command_re.match(command): - raise SystemExit("invalid command name '%s'" % command) + raise SystemExit("invalid command name %r" % command) self.commands.append(command) # Dig up the command class that implements this command, so we @@ -420,15 +420,15 @@ if hasattr(cmd_class, meth): continue raise DistutilsClassError( - 'command "%s" must implement "%s"' % (cmd_class, meth)) + 'command %r must implement %r' % (cmd_class, meth)) # Also make sure that the command object provides a list of its # known options. 
if not (hasattr(cmd_class, 'user_options') and isinstance(cmd_class.user_options, list)): raise DistutilsClassError( - ("command class %s must provide " - "'user_options' attribute (a list of tuples)") % cmd_class) + "command class %s must provide " + "'user_options' attribute (a list of tuples)" % cmd_class) # If the command class has a list of negative alias options, # merge it in with the global negative aliases. @@ -466,7 +466,7 @@ func() else: raise DistutilsClassError( - "invalid help function %r for help option '%s': " + "invalid help function %r for help option %r: " "must be a callable object (function, etc.)" % (func, help_option)) @@ -537,7 +537,7 @@ fix_help_options(cls.help_options)) else: parser.set_option_table(cls.user_options) - parser.print_help("Options for '%s' command:" % cls.__name__) + parser.print_help("Options for %r command:" % cls.__name__) print('') print(gen_usage(self.script_name)) @@ -639,7 +639,7 @@ cmd_obj = self.command_obj.get(command) if not cmd_obj and create: logger.debug("Distribution.get_command_obj(): " \ - "creating '%s' command object" % command) + "creating %r command object", command) cls = get_command_class(command) cmd_obj = self.command_obj[command] = cls(self) @@ -669,10 +669,10 @@ if option_dict is None: option_dict = self.get_option_dict(command_name) - logger.debug(" setting options for '%s' command:" % command_name) + logger.debug(" setting options for %r command:", command_name) for (option, (source, value)) in option_dict.iteritems(): - logger.debug(" %s = %s (from %s)" % (option, value, source)) + logger.debug(" %s = %s (from %s)", option, value, source) try: bool_opts = [x.replace('-', '_') for x in command_obj.boolean_options] @@ -693,7 +693,7 @@ setattr(command_obj, option, value) else: raise DistutilsOptionError( - "error in %s: command '%s' has no such option '%s'" % + "error in %s: command %r has no such option %r" % (source, command_name, option)) except ValueError, msg: raise DistutilsOptionError(msg) diff --git a/distutils2/index/simple.py b/distutils2/index/simple.py --- a/distutils2/index/simple.py +++ b/distutils2/index/simple.py @@ -11,7 +11,6 @@ import sys import urllib2 import urlparse -import logging import os from distutils2 import logger @@ -168,7 +167,7 @@ if predicate.name.lower() in self._projects and not force_update: return self._projects.get(predicate.name.lower()) prefer_final = self._get_prefer_final(prefer_final) - logger.info('Reading info on PyPI about %s' % predicate.name) + logger.info('Reading info on PyPI about %s', predicate.name) self._process_index_page(predicate.name) if predicate.name.lower() not in self._projects: @@ -306,8 +305,8 @@ infos = get_infos_from_url(link, project_name, is_external=not self.index_url in url) except CantParseArchiveName, e: - logging.warning("version has not been parsed: %s" - % e) + logger.warning( + "version has not been parsed: %s", e) else: self._register_release(release_info=infos) else: diff --git a/distutils2/index/xmlrpc.py b/distutils2/index/xmlrpc.py --- a/distutils2/index/xmlrpc.py +++ b/distutils2/index/xmlrpc.py @@ -165,7 +165,7 @@ p['version'], metadata={'summary': p['summary']}, index=self._index)) except IrrationalVersionError, e: - logging.warn("Irrational version error found: %s" % e) + logging.warn("Irrational version error found: %s", e) return [self._projects[p['name'].lower()] for p in projects] def get_all_projects(self): diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -12,6 
+12,7 @@ import stat import errno import itertools +import logging import tempfile from distutils2 import logger @@ -134,12 +135,12 @@ installed_dists, installed_files = [], [] for dist in dists: - logger.info('Installing %s %s' % (dist.name, dist.version)) + logger.info('Installing %s %s', dist.name, dist.version) try: installed_files.extend(_install_dist(dist, path)) installed_dists.append(dist) except Exception, e: - logger.info('Failed. %s' % str(e)) + logger.info('Failed. %s', e) # reverting for installed_dist in installed_dists: @@ -258,8 +259,8 @@ if predicate.name.lower() != installed_project.name.lower(): continue found = True - logger.info('Found %s %s' % (installed_project.name, - installed_project.version)) + logger.info('Found %s %s', installed_project.name, + installed_project.version) # if we already have something installed, check it matches the # requirements @@ -358,7 +359,7 @@ finally: shutil.rmtree(tmp) - logger.info('Removing %r...' % project_name) + logger.info('Removing %r...', project_name) file_count = 0 for file_ in rmfiles: @@ -391,16 +392,16 @@ if os.path.exists(dist.path): shutil.rmtree(dist.path) - logger.info('Success ! Removed %d files and %d dirs' % \ - (file_count, dir_count)) + logger.info('Success! Removed %d files and %d dirs', + file_count, dir_count) def install(project): - logger.info('Getting information about "%s".' % project) + logger.info('Getting information about "%s".', project) try: info = get_infos(project) except InstallationException: - logger.info('Cound not find "%s".' % project) + logger.info('Cound not find "%s".', project) return if info['install'] == []: @@ -413,8 +414,9 @@ info['install'], info['remove'], info['conflict']) except InstallationConflict, e: - projects = ['%s %s' % (p.name, p.version) for p in e.args[0]] - logger.info('"%s" conflicts with "%s"' % (project, ','.join(projects))) + if logger.isEnabledFor(logging.INFO): + projects = ['%s %s' % (p.name, p.version) for p in e.args[0]] + logger.info('%r conflicts with %s', project, ','.join(projects)) def _main(**attrs): diff --git a/distutils2/manifest.py b/distutils2/manifest.py --- a/distutils2/manifest.py +++ b/distutils2/manifest.py @@ -95,7 +95,7 @@ try: self._process_template_line(line) except DistutilsTemplateError, msg: - logging.warning("%s, %s" % (path_or_file, msg)) + logging.warning("%s, %s", path_or_file, msg) def write(self, path): """Write the file list in 'self.filelist' (presumably as filled in @@ -111,14 +111,14 @@ if first_line != '# file GENERATED by distutils, do NOT edit\n': logging.info("not writing to manually maintained " - "manifest file '%s'", path) + "manifest file %r", path) return self.sort() self.remove_duplicates() content = self.files[:] content.insert(0, '# file GENERATED by distutils, do NOT edit') - logging.info("writing manifest file '%s'", path) + logging.info("writing manifest file %r", path) write_file(path, content) def read(self, path): @@ -126,7 +126,7 @@ fill in 'self.filelist', the list of files to include in the source distribution. """ - logging.info("reading manifest file '%s'" % path) + logging.info("reading manifest file %r", path) manifest = open(path) try: for line in manifest.readlines(): @@ -168,14 +168,14 @@ 'global-include', 'global-exclude'): if len(words) < 2: raise DistutilsTemplateError( - "'%s' expects ..." % action) + "%r expects ..." % action) patterns = map(convert_path, words[1:]) elif action in ('recursive-include', 'recursive-exclude'): if len(words) < 3: raise DistutilsTemplateError( - "'%s' expects ..." 
% action) + "%r expects ..." % action) dir = convert_path(words[1]) patterns = map(convert_path, words[2:]) @@ -183,12 +183,12 @@ elif action in ('graft', 'prune'): if len(words) != 2: raise DistutilsTemplateError( - "'%s' expects a single " % action) + "%r expects a single " % action) dir_pattern = convert_path(words[1]) else: - raise DistutilsTemplateError("unknown action '%s'" % action) + raise DistutilsTemplateError("unknown action %r" % action) return action, patterns, dir, dir_pattern @@ -206,53 +206,53 @@ if action == 'include': for pattern in patterns: if not self._include_pattern(pattern, anchor=1): - logging.warning("warning: no files found matching '%s'" % - pattern) + logging.warning("warning: no files found matching %r", + pattern) elif action == 'exclude': for pattern in patterns: if not self.exclude_pattern(pattern, anchor=1): - logging.warning(("warning: no previously-included files " - "found matching '%s'") % pattern) + logging.warning("warning: no previously-included files " + "found matching %r", pattern) elif action == 'global-include': for pattern in patterns: if not self._include_pattern(pattern, anchor=0): - logging.warning(("warning: no files found matching '%s' " + - "anywhere in distribution") % pattern) + logging.warning("warning: no files found matching %r " + "anywhere in distribution", pattern) elif action == 'global-exclude': for pattern in patterns: if not self.exclude_pattern(pattern, anchor=0): - logging.warning(("warning: no previously-included files " - "matching '%s' found anywhere in distribution") % - pattern) + logging.warning("warning: no previously-included files " + "matching %r found anywhere in " + "distribution", pattern) elif action == 'recursive-include': for pattern in patterns: if not self._include_pattern(pattern, prefix=dir): - logging.warning(("warning: no files found matching '%s' " - "under directory '%s'" % (pattern, dir))) + logging.warning("warning: no files found matching %r " + "under directory %r", pattern, dir) elif action == 'recursive-exclude': for pattern in patterns: if not self.exclude_pattern(pattern, prefix=dir): - logging.warning(("warning: no previously-included files " - "matching '%s' found under directory '%s'") % - (pattern, dir)) + logging.warning("warning: no previously-included files " + "matching %r found under directory %r", + pattern, dir) elif action == 'graft': if not self._include_pattern(None, prefix=dir_pattern): - logging.warning("warning: no directories found matching '%s'" % - dir_pattern) + logging.warning("warning: no directories found matching %r", + dir_pattern) elif action == 'prune': if not self.exclude_pattern(None, prefix=dir_pattern): - logging.warning(("no previously-included directories found " + - "matching '%s'") % dir_pattern) + logging.warning("no previously-included directories found " + "matching %r", dir_pattern) else: raise DistutilsInternalError( - "this cannot happen: invalid action '%s'" % action) + "this cannot happen: invalid action %r" % action) def _include_pattern(self, pattern, anchor=1, prefix=None, is_regex=0): """Select strings (presumably filenames) from 'self.files' that diff --git a/distutils2/tests/support.py b/distutils2/tests/support.py --- a/distutils2/tests/support.py +++ b/distutils2/tests/support.py @@ -51,6 +51,9 @@ def setUp(self): super(LoggingCatcher, self).setUp() + # TODO read the new logging docs and/or the python-dev posts about + # logging and tests to properly use a handler instead of + # monkey-patching self.old_log = logger._log logger._log = 
self._log logger.setLevel(logging.INFO) diff --git a/distutils2/tests/test_command_build_py.py b/distutils2/tests/test_command_build_py.py --- a/distutils2/tests/test_command_build_py.py +++ b/distutils2/tests/test_command_build_py.py @@ -116,7 +116,7 @@ finally: sys.dont_write_bytecode = old_dont_write_bytecode - self.assertTrue('byte-compiling is disabled' in self.logs[0][1]) + self.assertIn('byte-compiling is disabled', self.logs[0][2][1]) def test_suite(): return unittest.makeSuite(BuildPyTestCase) diff --git a/distutils2/tests/test_command_install_lib.py b/distutils2/tests/test_command_install_lib.py --- a/distutils2/tests/test_command_install_lib.py +++ b/distutils2/tests/test_command_install_lib.py @@ -97,7 +97,7 @@ finally: sys.dont_write_bytecode = old_dont_write_bytecode - self.assertTrue('byte-compiling is disabled' in self.logs[0][1]) + self.assertIn('byte-compiling is disabled', self.logs[0][2][1]) def test_suite(): return unittest.makeSuite(InstallLibTestCase) diff --git a/distutils2/tests/test_command_sdist.py b/distutils2/tests/test_command_sdist.py --- a/distutils2/tests/test_command_sdist.py +++ b/distutils2/tests/test_command_sdist.py @@ -95,9 +95,6 @@ dist.include_package_data = True cmd = sdist(dist) cmd.dist_dir = 'dist' - def _warn(*args): - pass - cmd.warn = _warn return dist, cmd @unittest.skipUnless(zlib, "requires zlib") @@ -251,7 +248,7 @@ cmd.ensure_finalized() cmd.run() warnings = self.get_logs(logging.WARN) - self.assertEqual(len(warnings), 1) + self.assertEqual(len(warnings), 2) # trying with a complete set of metadata self.clear_logs() @@ -263,7 +260,8 @@ # removing manifest generated warnings warnings = [warn for warn in warnings if not warn.endswith('-- skipping')] - self.assertEqual(len(warnings), 0) + # the remaining warning is about the use of the default file list + self.assertEqual(len(warnings), 1) def test_show_formats(self): diff --git a/distutils2/tests/test_manifest.py b/distutils2/tests/test_manifest.py --- a/distutils2/tests/test_manifest.py +++ b/distutils2/tests/test_manifest.py @@ -1,6 +1,5 @@ """Tests for distutils.manifest.""" import os -import sys import logging from StringIO import StringIO @@ -25,10 +24,11 @@ class ManifestTestCase(support.TempdirManager, + # enable this after LoggingCatcher is fixed + #support.LoggingCatcher, unittest.TestCase): def test_manifest_reader(self): - tmpdir = self.mkdtemp() MANIFEST = os.path.join(tmpdir, 'MANIFEST.in') f = open(MANIFEST, 'w') @@ -38,9 +38,10 @@ f.close() manifest = Manifest() + # remove this when LoggingCatcher is fixed warns = [] - def _warn(msg): - warns.append(msg) + def _warn(*args): + warns.append(args[0]) old_warn = logging.warning logging.warning = _warn -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: More logging tweaks: use real warning method, merge some calls. Message-ID: tarek.ziade pushed 6a5628860de4 to distutils2: http://hg.python.org/distutils2/rev/6a5628860de4 changeset: 1016:6a5628860de4 user: Éric Araujo date: Thu Feb 10 01:48:09 2011 +0100 summary: More logging tweaks: use real warning method, merge some calls.
files: distutils2/command/cmd.py distutils2/command/register.py distutils2/compiler/bcppcompiler.py distutils2/compiler/ccompiler.py distutils2/compiler/msvccompiler.py distutils2/compiler/unixccompiler.py distutils2/manifest.py diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -182,7 +182,7 @@ raise RuntimeError( "abstract method -- subclass %s must override" % self.__class__) - # TODO remove this method, just use logging + # TODO remove this method, just use logging.info def announce(self, msg, level=logging.INFO): """If the current verbosity level is of greater than or equal to 'level' print 'msg' to stdout. @@ -364,7 +364,7 @@ # -- External world manipulation ----------------------------------- - # TODO remove this method, just use logging + # TODO remove this method, just use logging.warn def warn(self, msg): logger.warning("warning: %s: %s\n", self.get_command_name(), msg) diff --git a/distutils2/command/register.py b/distutils2/command/register.py --- a/distutils2/command/register.py +++ b/distutils2/command/register.py @@ -208,9 +208,8 @@ if code != 200: logger.info('Server response (%s): %s', code, result) else: - logger.info('You will receive an email shortly.') - logger.info('Follow the instructions in it to ' - 'complete registration.') + logger.info('You will receive an email shortly; follow the ' + 'instructions in it to complete registration.') elif choice == '3': data = {':action': 'password_reset'} data['email'] = '' diff --git a/distutils2/compiler/bcppcompiler.py b/distutils2/compiler/bcppcompiler.py --- a/distutils2/compiler/bcppcompiler.py +++ b/distutils2/compiler/bcppcompiler.py @@ -191,9 +191,8 @@ self._fix_lib_args (libraries, library_dirs, runtime_library_dirs) if runtime_library_dirs: - logger.warning(("I don't know what to do with " - "'runtime_library_dirs': %s"), - str(runtime_library_dirs)) + logger.warning("I don't know what to do with " + "'runtime_library_dirs': %r", runtime_library_dirs) if output_dir is not None: output_filename = os.path.join (output_dir, output_filename) diff --git a/distutils2/compiler/ccompiler.py b/distutils2/compiler/ccompiler.py --- a/distutils2/compiler/ccompiler.py +++ b/distutils2/compiler/ccompiler.py @@ -850,6 +850,7 @@ # -- Utility methods ----------------------------------------------- + # TODO use logging.info def announce(self, msg, level=None): logger.debug(msg) @@ -858,6 +859,7 @@ if DEBUG: print msg + # TODO use logging.warn def warn(self, msg): sys.stderr.write("warning: %s\n" % msg) diff --git a/distutils2/compiler/msvccompiler.py b/distutils2/compiler/msvccompiler.py --- a/distutils2/compiler/msvccompiler.py +++ b/distutils2/compiler/msvccompiler.py @@ -44,11 +44,10 @@ RegError = win32api.error except ImportError: - logger.info("Warning: Can't read registry to find the " - "necessary compiler setting\n" - "Make sure that Python modules _winreg, " - "win32api or win32con are installed.") - pass + logger.warning( + "Can't read registry to find the necessary compiler setting;\n" + "make sure that Python modules _winreg, win32api or win32con " + "are installed.") if _can_read_reg: HKEYS = (hkey_mod.HKEY_USERS, diff --git a/distutils2/compiler/unixccompiler.py b/distutils2/compiler/unixccompiler.py --- a/distutils2/compiler/unixccompiler.py +++ b/distutils2/compiler/unixccompiler.py @@ -98,9 +98,9 @@ sysroot = compiler_so[idx+1] if sysroot and not os.path.isdir(sysroot): - logger.warning("Compiling with an SDK that doesn't seem to exist: 
%s", - sysroot) - logger.warning("Please check your Xcode installation") + logger.warning( + "Compiling with an SDK that doesn't seem to exist: %r;\n" + "please check your Xcode installation", sysroot) return compiler_so diff --git a/distutils2/manifest.py b/distutils2/manifest.py --- a/distutils2/manifest.py +++ b/distutils2/manifest.py @@ -206,44 +206,43 @@ if action == 'include': for pattern in patterns: if not self._include_pattern(pattern, anchor=1): - logging.warning("warning: no files found matching %r", - pattern) + logging.warning("no files found matching %r", pattern) elif action == 'exclude': for pattern in patterns: if not self.exclude_pattern(pattern, anchor=1): - logging.warning("warning: no previously-included files " + logging.warning("no previously-included files " "found matching %r", pattern) elif action == 'global-include': for pattern in patterns: if not self._include_pattern(pattern, anchor=0): - logging.warning("warning: no files found matching %r " + logging.warning("no files found matching %r " "anywhere in distribution", pattern) elif action == 'global-exclude': for pattern in patterns: if not self.exclude_pattern(pattern, anchor=0): - logging.warning("warning: no previously-included files " + logging.warning("no previously-included files " "matching %r found anywhere in " "distribution", pattern) elif action == 'recursive-include': for pattern in patterns: if not self._include_pattern(pattern, prefix=dir): - logging.warning("warning: no files found matching %r " + logging.warning("no files found matching %r " "under directory %r", pattern, dir) elif action == 'recursive-exclude': for pattern in patterns: if not self.exclude_pattern(pattern, prefix=dir): - logging.warning("warning: no previously-included files " + logging.warning("no previously-included files " "matching %r found under directory %r", pattern, dir) elif action == 'graft': if not self._include_pattern(None, prefix=dir_pattern): - logging.warning("warning: no directories found matching %r", + logging.warning("no directories found matching %r", dir_pattern) elif action == 'prune': -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Merge Kelsey's edits with mine Message-ID: tarek.ziade pushed b201575c7cdc to distutils2: http://hg.python.org/distutils2/rev/b201575c7cdc changeset: 1017:b201575c7cdc parent: 1016:6a5628860de4 parent: 1012:39e2a0b88acf user: Éric Araujo date: Thu Feb 10 01:57:49 2011 +0100 summary: Merge Kelsey's edits with mine
+++ b/distutils2/command/bdist_dumb.py @@ -128,7 +128,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s', self.bdist_dir) + logger.info('removing %s', self.bdist_dir) else: rmtree(self.bdist_dir) diff --git a/distutils2/command/bdist_wininst.py b/distutils2/command/bdist_wininst.py --- a/distutils2/command/bdist_wininst.py +++ b/distutils2/command/bdist_wininst.py @@ -192,7 +192,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s', self.bdist_dir) + logger.info('removing %s', self.bdist_dir) else: rmtree(self.bdist_dir) diff --git a/distutils2/command/clean.py b/distutils2/command/clean.py --- a/distutils2/command/clean.py +++ b/distutils2/command/clean.py @@ -48,7 +48,7 @@ # gone) if os.path.exists(self.build_temp): if self.dry_run: - logger.info('Removing %s', self.build_temp) + logger.info('removing %s', self.build_temp) else: rmtree(self.build_temp) else: @@ -62,7 +62,7 @@ self.build_scripts): if os.path.exists(directory): if self.dry_run: - logger.info('Removing %s', directory) + logger.info('removing %s', directory) else: rmtree(directory) else: diff --git a/distutils2/command/register.py b/distutils2/command/register.py --- a/distutils2/command/register.py +++ b/distutils2/command/register.py @@ -92,7 +92,7 @@ ''' # send the info to the server and report the result code, result = self.post_to_server(self.build_post_data('verify')) - logger.info('Server response (%s): %s', code, result) + logger.info('server response (%s): %s', code, result) def send_metadata(self): @@ -206,9 +206,9 @@ data['email'] = raw_input(' EMail: ') code, result = self.post_to_server(data) if code != 200: - logger.info('Server response (%s): %s', code, result) + logger.info('server response (%s): %s', code, result) else: - logger.info('You will receive an email shortly; follow the ' + logger.info('you will receive an email shortly; follow the ' 'instructions in it to complete registration.') elif choice == '3': data = {':action': 'password_reset'} @@ -216,7 +216,7 @@ while not data['email']: data['email'] = raw_input('Your email address: ') code, result = self.post_to_server(data) - logger.info('Server response (%s): %s', (code, result)) + logger.info('server response (%s): %s', code, result) def build_post_data(self, action): # figure the data to send - the metadata plus some additional diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py --- a/distutils2/command/sdist.py +++ b/distutils2/command/sdist.py @@ -332,7 +332,7 @@ if not self.keep_temp: if self.dry_run: - logger.info('Removing %s', base_dir) + logger.info('removing %s', base_dir) else: rmtree(base_dir) diff --git a/distutils2/compiler/bcppcompiler.py b/distutils2/compiler/bcppcompiler.py --- a/distutils2/compiler/bcppcompiler.py +++ b/distutils2/compiler/bcppcompiler.py @@ -191,7 +191,7 @@ self._fix_lib_args (libraries, library_dirs, runtime_library_dirs) if runtime_library_dirs: - logger.warning("I don't know what to do with " + logger.warning("don't know what to do with " "'runtime_library_dirs': %r", runtime_library_dirs) if output_dir is not None: diff --git a/distutils2/compiler/msvc9compiler.py b/distutils2/compiler/msvc9compiler.py --- a/distutils2/compiler/msvc9compiler.py +++ b/distutils2/compiler/msvc9compiler.py @@ -229,14 +229,14 @@ logger.debug("%s is not a valid directory", productdir) return None else: - logger.debug("Env var %s is not set or invalid", toolskey) + logger.debug("env var %s is not set or invalid", toolskey) if not productdir: - logger.debug("No productdir 
found") + logger.debug("no productdir found") return None vcvarsall = os.path.join(productdir, "vcvarsall.bat") if os.path.isfile(vcvarsall): return vcvarsall - logger.debug("Unable to find vcvarsall.bat") + logger.debug("unable to find vcvarsall.bat") return None def query_vcvarsall(version, arch="x86"): @@ -248,7 +248,7 @@ if vcvarsall is None: raise DistutilsPlatformError("Unable to find vcvarsall.bat") - logger.debug("Calling 'vcvarsall.bat %s' (version=%s)", arch, version) + logger.debug("calling 'vcvarsall.bat %s' (version=%s)", arch, version) popen = subprocess.Popen('"%s" %s & set' % (vcvarsall, arch), stdout=subprocess.PIPE, stderr=subprocess.PIPE) diff --git a/distutils2/compiler/msvccompiler.py b/distutils2/compiler/msvccompiler.py --- a/distutils2/compiler/msvccompiler.py +++ b/distutils2/compiler/msvccompiler.py @@ -45,7 +45,7 @@ except ImportError: logger.warning( - "Can't read registry to find the necessary compiler setting;\n" + "can't read registry to find the necessary compiler setting;\n" "make sure that Python modules _winreg, win32api or win32con " "are installed.") @@ -652,7 +652,7 @@ if get_build_version() >= 8.0: - logger.debug("Importing new compiler from distutils.msvc9compiler") + logger.debug("importing new compiler from distutils.msvc9compiler") OldMSVCCompiler = MSVCCompiler from distutils2.compiler.msvc9compiler import MSVCCompiler # get_build_architecture not really relevant now we support cross-compile diff --git a/distutils2/compiler/unixccompiler.py b/distutils2/compiler/unixccompiler.py --- a/distutils2/compiler/unixccompiler.py +++ b/distutils2/compiler/unixccompiler.py @@ -99,7 +99,7 @@ if sysroot and not os.path.isdir(sysroot): logger.warning( - "Compiling with an SDK that doesn't seem to exist: %r;\n" + "compiling with an SDK that doesn't seem to exist: %r;\n" "please check your Xcode installation", sysroot) return compiler_so diff --git a/distutils2/index/__init__.py b/distutils2/index/__init__.py --- a/distutils2/index/__init__.py +++ b/distutils2/index/__init__.py @@ -6,6 +6,6 @@ 'xmlrpc', 'dist', 'errors', - 'mirrors',] + 'mirrors'] from dist import ReleaseInfo, ReleasesList, DistInfo diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -109,7 +109,8 @@ self.dists = {} return self.dists - def add_distribution(self, dist_type='sdist', python_version=None, **params): + def add_distribution(self, dist_type='sdist', python_version=None, + **params): """Add distribution informations to this release. If distribution information is already set for this distribution type, add the given url paths to the distribution. This can be useful while diff --git a/distutils2/index/mirrors.py b/distutils2/index/mirrors.py --- a/distutils2/index/mirrors.py +++ b/distutils2/index/mirrors.py @@ -1,4 +1,4 @@ -"""Utilities related to the mirror infrastructure defined in PEP 381. +"""Utilities related to the mirror infrastructure defined in PEP 381. See http://www.python.org/dev/peps/pep-0381/ """ @@ -7,6 +7,7 @@ DEFAULT_MIRROR_URL = "last.pypi.python.org" + def get_mirrors(hostname=None): """Return the list of mirrors from the last record found on the DNS entry:: @@ -19,7 +20,7 @@ """ if hostname is None: hostname = DEFAULT_MIRROR_URL - + # return the last mirror registered on PyPI. try: hostname = socket.gethostbyname_ex(hostname)[0] @@ -30,23 +31,24 @@ # determine the list from the last one. 
return ["%s.%s" % (s, end_letter[1]) for s in string_range(end_letter[0])] + def string_range(last): """Compute the range of string between "a" and last. - + This works for simple "a to z" lists, but also for "a to zz" lists. """ for k in range(len(last)): - for x in product(ascii_lowercase, repeat=k+1): + for x in product(ascii_lowercase, repeat=(k + 1)): result = ''.join(x) yield result if result == last: return + def product(*args, **kwds): pools = map(tuple, args) * kwds.get('repeat', 1) result = [[]] for pool in pools: - result = [x+[y] for x in result for y in pool] + result = [x + [y] for x in result for y in pool] for prod in result: yield tuple(prod) - diff --git a/distutils2/index/simple.py b/distutils2/index/simple.py --- a/distutils2/index/simple.py +++ b/distutils2/index/simple.py @@ -167,7 +167,7 @@ if predicate.name.lower() in self._projects and not force_update: return self._projects.get(predicate.name.lower()) prefer_final = self._get_prefer_final(prefer_final) - logger.info('Reading info on PyPI about %s', predicate.name) + logger.info('reading info on PyPI about %s', predicate.name) self._process_index_page(predicate.name) if predicate.name.lower() not in self._projects: diff --git a/distutils2/index/wrapper.py b/distutils2/index/wrapper.py --- a/distutils2/index/wrapper.py +++ b/distutils2/index/wrapper.py @@ -9,6 +9,7 @@ _WRAPPER_INDEXES = {'xmlrpc': xmlrpc.Client, 'simple': simple.Crawler} + def switch_index_if_fails(func, wrapper): """Decorator that switch of index (for instance from xmlrpc to simple) if the first mirror return an empty list or raises an exception. @@ -82,11 +83,11 @@ other_indexes = [i for i in self._indexes if i != self._default_index] for index in other_indexes: - real_method = getattr(self._indexes[index], method_name, None) + real_method = getattr(self._indexes[index], method_name, + None) if real_method: break if real_method: return switch_index_if_fails(real_method, self) else: raise AttributeError("No index have attribute '%s'" % method_name) - diff --git a/distutils2/index/xmlrpc.py b/distutils2/index/xmlrpc.py --- a/distutils2/index/xmlrpc.py +++ b/distutils2/index/xmlrpc.py @@ -103,7 +103,6 @@ project.sort_releases(prefer_final) return project - def get_distributions(self, project_name, version): """Grab informations about distributions from XML-RPC. diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -76,7 +76,7 @@ record_file = os.path.join(archive_dir, 'RECORD') os.system(cmd % (sys.executable, path, record_file)) if not os.path.exists(record_file): - raise ValueError('Failed to install.') + raise ValueError('failed to install') return open(record_file).read().split('\n') @@ -135,12 +135,12 @@ installed_dists, installed_files = [], [] for dist in dists: - logger.info('Installing %s %s', dist.name, dist.version) + logger.info('installing %s %s', dist.name, dist.version) try: installed_files.extend(_install_dist(dist, path)) installed_dists.append(dist) except Exception, e: - logger.info('Failed. %s', e) + logger.info('failed: %s', e) # reverting for installed_dist in installed_dists: @@ -244,7 +244,7 @@ conflict. 
""" if not installed: - logger.info('Reading installed distributions') + logger.info('reading installed distributions') installed = get_distributions(use_egg_info=True) infos = {'install': [], 'remove': [], 'conflict': []} @@ -259,7 +259,7 @@ if predicate.name.lower() != installed_project.name.lower(): continue found = True - logger.info('Found %s %s', installed_project.name, + logger.info('found %s %s', installed_project.name, installed_project.version) # if we already have something installed, check it matches the @@ -269,7 +269,7 @@ break if not found: - logger.info('Project not installed.') + logger.info('project not installed') if not index: index = wrapper.ClientWrapper() @@ -284,7 +284,7 @@ release = releases.get_last(requirements, prefer_final=prefer_final) if release is None: - logger.info('Could not find a matching project') + logger.info('could not find a matching project') return infos # this works for Metadata 1.2 @@ -359,7 +359,7 @@ finally: shutil.rmtree(tmp) - logger.info('Removing %r...', project_name) + logger.info('removing %r...', project_name) file_count = 0 for file_ in rmfiles: @@ -392,20 +392,20 @@ if os.path.exists(dist.path): shutil.rmtree(dist.path) - logger.info('Success! Removed %d files and %d dirs', + logger.info('success: removed %d files and %d dirs', file_count, dir_count) def install(project): - logger.info('Getting information about "%s".', project) + logger.info('getting information about %r', project) try: info = get_infos(project) except InstallationException: - logger.info('Cound not find "%s".', project) + logger.info('cound not find %r', project) return if info['install'] == []: - logger.info('Nothing to install.') + logger.info('nothing to install') return install_path = get_config_var('base') diff --git a/distutils2/tests/test_manifest.py b/distutils2/tests/test_manifest.py --- a/distutils2/tests/test_manifest.py +++ b/distutils2/tests/test_manifest.py @@ -54,7 +54,7 @@ # and 3 warnings issued (we ddidn't provided the files) self.assertEqual(len(warns), 3) for warn in warns: - self.assertIn('warning: no files found matching', warn) + self.assertIn('no files found matching', warn) # manifest also accepts file-like objects old_warn = logging.warning -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Refactoring -- Removing grok_environment_error. Message-ID: tarek.ziade pushed 2ca60fd8c79b to distutils2: http://hg.python.org/distutils2/rev/2ca60fd8c79b changeset: 1019:2ca60fd8c79b parent: 1017:b201575c7cdc user: Kelsey Hightower date: Thu Feb 10 21:37:08 2011 -0500 summary: Refactoring -- Removing grok_environment_error. 
files: distutils2/run.py distutils2/util.py docs/source/distutils/apiref.rst diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -4,7 +4,6 @@ import logging from distutils2 import logger -from distutils2.util import grok_environment_error from distutils2.errors import (DistutilsSetupError, DistutilsArgError, DistutilsError, CCompilerError) from distutils2.dist import Distribution @@ -106,12 +105,7 @@ dist.run_commands() except KeyboardInterrupt: raise SystemExit("interrupted") - except (IOError, os.error), exc: - error = grok_environment_error(exc) - raise SystemExit(error) - - except (DistutilsError, - CCompilerError), msg: + except (IOError, os.error, DistutilsError, CCompilerError), msg: raise SystemExit("error: " + str(msg)) return dist diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -177,29 +177,6 @@ raise ValueError("invalid variable '$%s'" % var) -def grok_environment_error(exc, prefix="error: "): - """Generate a useful error message from an EnvironmentError. - - This will generate an IOError or an OSError exception object. - Handles Python 1.5.1 and 1.5.2 styles, and - does what it can to deal with exception objects that don't have a - filename (which happens when the error is due to a two-file operation, - such as 'rename()' or 'link()'. Returns the error message as a string - prefixed with 'prefix'. - """ - # check for Python 1.5.2-style {IO,OS}Error exception objects - if hasattr(exc, 'filename') and hasattr(exc, 'strerror'): - if exc.filename: - error = prefix + "%s: %s" % (exc.filename, exc.strerror) - else: - # two-argument functions in posix module don't - # include the filename in the exception object! - error = prefix + "%s" % exc.strerror - else: - error = prefix + str(exc[-1]) - - return error - # Needed by 'split_quoted()' _wordchars_re = _squote_re = _dquote_re = None diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -1166,15 +1166,6 @@ an underscore. No { } or ( ) style quoting is available. -.. function:: grok_environment_error(exc[, prefix='error: ']) - - Generate a useful error message from an :exc:`EnvironmentError` - (:exc:`IOError` or :exc:`OSError`) exception object. Does what it can to deal - with exception objects that don't have a filename (which happens when the - error is due to a two-file operation, such as :func:`rename` or - :func:`link`). Returns the error message as a string prefixed with *prefix*. - - .. function:: split_quoted(s) Split a string up according to Unix shell-like rules for quotes and -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Changing the metadata.version API and relocating the metadata_to_dict function. Message-ID: tarek.ziade pushed a1f153865c36 to distutils2: http://hg.python.org/distutils2/rev/a1f153865c36 changeset: 1018:a1f153865c36 user: Kelsey Hightower date: Thu Feb 10 00:14:01 2011 -0500 summary: Changing the metadata.version API and relocating the metadata_to_dict function. A new function, get_metadata_version, replaces the metadata.version attribute as the preferred method of retrieving the metadata version. 
The metadata_to_dict function has been relocated from distutils2.util to distutils2.metadata to help improve organization. files: distutils2/command/register.py distutils2/command/upload.py distutils2/metadata.py distutils2/tests/test_metadata.py distutils2/util.py diff --git a/distutils2/command/register.py b/distutils2/command/register.py --- a/distutils2/command/register.py +++ b/distutils2/command/register.py @@ -14,9 +14,9 @@ from distutils2.command.cmd import Command from distutils2 import logger -from distutils2.util import (metadata_to_dict, read_pypirc, generate_pypirc, - DEFAULT_REPOSITORY, DEFAULT_REALM, - get_pypirc_path) +from distutils2.metadata import metadata_to_dict +from distutils2.util import (read_pypirc, generate_pypirc, DEFAULT_REPOSITORY, + DEFAULT_REALM, get_pypirc_path) class register(Command): diff --git a/distutils2/command/upload.py b/distutils2/command/upload.py --- a/distutils2/command/upload.py +++ b/distutils2/command/upload.py @@ -20,8 +20,8 @@ from distutils2.errors import DistutilsOptionError from distutils2.util import spawn from distutils2.command.cmd import Command -from distutils2.util import (metadata_to_dict, read_pypirc, - DEFAULT_REPOSITORY, DEFAULT_REALM) +from distutils2.metadata import metadata_to_dict +from distutils2.util import read_pypirc, DEFAULT_REPOSITORY, DEFAULT_REALM class upload(Command): diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -39,8 +39,8 @@ _HAS_DOCUTILS = False # public API of this module -__all__ = ['Metadata', 'PKG_INFO_ENCODING', - 'PKG_INFO_PREFERRED_VERSION'] +__all__ = ['Metadata', 'get_metadata_version', 'metadata_to_dict', + 'PKG_INFO_ENCODING', 'PKG_INFO_PREFERRED_VERSION'] # Encoding used for the PKG-INFO files PKG_INFO_ENCODING = 'utf-8' @@ -137,6 +137,53 @@ # default marker when 1.0 is disqualified return '1.2' + +def get_metadata_version(metadata): + """Return the Metadata-Version attribute + + - *metadata* give a METADATA object + """ + return metadata['Metadata-Version'] + + +def metadata_to_dict(metadata): + """Convert a metadata object to a dict + + - *metadata* give a METADATA object + """ + data = { + 'metadata_version': metadata['Metadata-Version'], + 'name': metadata['Name'], + 'version': metadata['Version'], + 'summary': metadata['Summary'], + 'home_page': metadata['Home-page'], + 'author': metadata['Author'], + 'author_email': metadata['Author-email'], + 'license': metadata['License'], + 'description': metadata['Description'], + 'keywords': metadata['Keywords'], + 'platform': metadata['Platform'], + 'classifier': metadata['Classifier'], + 'download_url': metadata['Download-URL'], + } + + if metadata['Metadata-Version'] == '1.2': + data['requires_dist'] = metadata['Requires-Dist'] + data['requires_python'] = metadata['Requires-Python'] + data['requires_external'] = metadata['Requires-External'] + data['provides_dist'] = metadata['Provides-Dist'] + data['obsoletes_dist'] = metadata['Obsoletes-Dist'] + data['project_url'] = [','.join(url) for url in + metadata['Project-URL']] + + elif metadata['Metadata-Version'] == '1.1': + data['provides'] = metadata['Provides'] + data['requires'] = metadata['Requires'] + data['obsoletes'] = metadata['Obsoletes'] + + return data + + _ATTR2FIELD = { 'metadata_version': 'Metadata-Version', 'name': 'Name', @@ -205,7 +252,6 @@ display_warnings=False): self._fields = {} self.display_warnings = display_warnings - self.version = None self.requires_files = [] self.docutils_support = _HAS_DOCUTILS 
self.platform_dependent = platform_dependent @@ -220,8 +266,7 @@ self.update(mapping) def _set_best_version(self): - self.version = _best_version(self._fields) - self._fields['Metadata-Version'] = self.version + self._fields['Metadata-Version'] = _best_version(self._fields) def _write_field(self, file, name, value): file.write('%s: %s\n' % (name, value)) @@ -318,9 +363,9 @@ def read_file(self, fileob): """Read the metadata values from a file object.""" msg = message_from_file(fileob) - self.version = msg['metadata-version'] + self._fields['Metadata-Version'] = msg['metadata-version'] - for field in _version2fieldlist(self.version): + for field in _version2fieldlist(self['Metadata-Version']): if field in _LISTFIELDS: # we can have multiple lines values = msg.get_all(field) @@ -344,7 +389,7 @@ def write_file(self, fileobject): """Write the PKG-INFO format data to a file object.""" self._set_best_version() - for field in _version2fieldlist(self.version): + for field in _version2fieldlist(self['Metadata-Version']): values = self.get(field) if field in _ELEMENTSFIELD: self._write_field(fileobject, field, ','.join(values)) @@ -509,7 +554,7 @@ # Mapping API def keys(self): - return _version2fieldlist(self.version) + return _version2fieldlist(self['Metadata-Version']) def values(self): return [self[key] for key in self.keys()] diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -4,7 +4,7 @@ import platform from StringIO import StringIO -from distutils2.metadata import (Metadata, +from distutils2.metadata import (Metadata, get_metadata_version, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -126,18 +126,23 @@ del metadata['Obsoletes-Dist'] metadata['Version'] = '1' self.assertEqual(metadata['Metadata-Version'], '1.0') + self.assertEqual(get_metadata_version(metadata), '1.0') PKG_INFO = os.path.join(os.path.dirname(__file__), 'SETUPTOOLS-PKG-INFO') metadata.read_file(StringIO(open(PKG_INFO).read())) self.assertEqual(metadata['Metadata-Version'], '1.0') + self.assertEqual(get_metadata_version(metadata), '1.0') PKG_INFO = os.path.join(os.path.dirname(__file__), 'SETUPTOOLS-PKG-INFO2') metadata.read_file(StringIO(open(PKG_INFO).read())) self.assertEqual(metadata['Metadata-Version'], '1.1') + self.assertEqual(get_metadata_version(metadata), '1.1') - metadata.version = '1.618' + # Update the _fields dict directly to prevent 'Metadata-Version' + # from being updated by the _set_best_version() method. 
+ metadata._fields['Metadata-Version'] = '1.618' self.assertRaises(MetadataUnrecognizedVersionError, metadata.keys) # XXX Spurious Warnings were disabled @@ -169,7 +174,7 @@ metadata['Project-URL'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-URL'], [('one', 'http://ok')]) - self.assertEqual(metadata.version, '1.2') + self.assertEqual(metadata['Metadata-Version'], '1.2') def test_check_version(self): metadata = Metadata() @@ -244,9 +249,13 @@ def test_best_choice(self): metadata = Metadata() metadata['Version'] = '1.0' - self.assertEqual(metadata.version, PKG_INFO_PREFERRED_VERSION) + self.assertEqual(metadata['Metadata-Version'], + PKG_INFO_PREFERRED_VERSION) + self.assertEqual(get_metadata_version(metadata), + PKG_INFO_PREFERRED_VERSION) metadata['Classifier'] = ['ok'] - self.assertEqual(metadata.version, '1.2') + self.assertEqual(metadata['Metadata-Version'], '1.2') + self.assertEqual(get_metadata_version(metadata), '1.2') def test_project_urls(self): # project-url is a bit specific, make sure we write it diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -924,41 +924,6 @@ return {} -def metadata_to_dict(meta): - """XXX might want to move it to the Metadata class.""" - data = { - 'metadata_version': meta.version, - 'name': meta['Name'], - 'version': meta['Version'], - 'summary': meta['Summary'], - 'home_page': meta['Home-page'], - 'author': meta['Author'], - 'author_email': meta['Author-email'], - 'license': meta['License'], - 'description': meta['Description'], - 'keywords': meta['Keywords'], - 'platform': meta['Platform'], - 'classifier': meta['Classifier'], - 'download_url': meta['Download-URL'], - } - - if meta.version == '1.2': - data['requires_dist'] = meta['Requires-Dist'] - data['requires_python'] = meta['Requires-Python'] - data['requires_external'] = meta['Requires-External'] - data['provides_dist'] = meta['Provides-Dist'] - data['obsoletes_dist'] = meta['Obsoletes-Dist'] - data['project_url'] = [','.join(url) for url in - meta['Project-URL']] - - elif meta.version == '1.1': - data['provides'] = meta['Provides'] - data['requires'] = meta['Requires'] - data['obsoletes'] = meta['Obsoletes'] - - return data - - # utility functions for 2to3 support def run_2to3(files, doctests_only=False, fixer_names=None, -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: branch merge Message-ID: tarek.ziade pushed 2c34d929388b to distutils2: http://hg.python.org/distutils2/rev/2c34d929388b changeset: 1020:2c34d929388b parent: 1019:2ca60fd8c79b parent: 1018:a1f153865c36 user: Alexis Metaireau date: Fri Feb 11 21:00:19 2011 +0000 summary: branch merge files: distutils2/util.py diff --git a/distutils2/command/register.py b/distutils2/command/register.py --- a/distutils2/command/register.py +++ b/distutils2/command/register.py @@ -14,9 +14,9 @@ from distutils2.command.cmd import Command from distutils2 import logger -from distutils2.util import (metadata_to_dict, read_pypirc, generate_pypirc, - DEFAULT_REPOSITORY, DEFAULT_REALM, - get_pypirc_path) +from distutils2.metadata import metadata_to_dict +from distutils2.util import (read_pypirc, generate_pypirc, DEFAULT_REPOSITORY, + DEFAULT_REALM, get_pypirc_path) class register(Command): diff --git a/distutils2/command/upload.py b/distutils2/command/upload.py --- a/distutils2/command/upload.py +++ 
b/distutils2/command/upload.py @@ -20,8 +20,8 @@ from distutils2.errors import DistutilsOptionError from distutils2.util import spawn from distutils2.command.cmd import Command -from distutils2.util import (metadata_to_dict, read_pypirc, - DEFAULT_REPOSITORY, DEFAULT_REALM) +from distutils2.metadata import metadata_to_dict +from distutils2.util import read_pypirc, DEFAULT_REPOSITORY, DEFAULT_REALM class upload(Command): diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -39,8 +39,8 @@ _HAS_DOCUTILS = False # public API of this module -__all__ = ['Metadata', 'PKG_INFO_ENCODING', - 'PKG_INFO_PREFERRED_VERSION'] +__all__ = ['Metadata', 'get_metadata_version', 'metadata_to_dict', + 'PKG_INFO_ENCODING', 'PKG_INFO_PREFERRED_VERSION'] # Encoding used for the PKG-INFO files PKG_INFO_ENCODING = 'utf-8' @@ -137,6 +137,53 @@ # default marker when 1.0 is disqualified return '1.2' + +def get_metadata_version(metadata): + """Return the Metadata-Version attribute + + - *metadata* give a METADATA object + """ + return metadata['Metadata-Version'] + + +def metadata_to_dict(metadata): + """Convert a metadata object to a dict + + - *metadata* give a METADATA object + """ + data = { + 'metadata_version': metadata['Metadata-Version'], + 'name': metadata['Name'], + 'version': metadata['Version'], + 'summary': metadata['Summary'], + 'home_page': metadata['Home-page'], + 'author': metadata['Author'], + 'author_email': metadata['Author-email'], + 'license': metadata['License'], + 'description': metadata['Description'], + 'keywords': metadata['Keywords'], + 'platform': metadata['Platform'], + 'classifier': metadata['Classifier'], + 'download_url': metadata['Download-URL'], + } + + if metadata['Metadata-Version'] == '1.2': + data['requires_dist'] = metadata['Requires-Dist'] + data['requires_python'] = metadata['Requires-Python'] + data['requires_external'] = metadata['Requires-External'] + data['provides_dist'] = metadata['Provides-Dist'] + data['obsoletes_dist'] = metadata['Obsoletes-Dist'] + data['project_url'] = [','.join(url) for url in + metadata['Project-URL']] + + elif metadata['Metadata-Version'] == '1.1': + data['provides'] = metadata['Provides'] + data['requires'] = metadata['Requires'] + data['obsoletes'] = metadata['Obsoletes'] + + return data + + _ATTR2FIELD = { 'metadata_version': 'Metadata-Version', 'name': 'Name', @@ -205,7 +252,6 @@ display_warnings=False): self._fields = {} self.display_warnings = display_warnings - self.version = None self.requires_files = [] self.docutils_support = _HAS_DOCUTILS self.platform_dependent = platform_dependent @@ -220,8 +266,7 @@ self.update(mapping) def _set_best_version(self): - self.version = _best_version(self._fields) - self._fields['Metadata-Version'] = self.version + self._fields['Metadata-Version'] = _best_version(self._fields) def _write_field(self, file, name, value): file.write('%s: %s\n' % (name, value)) @@ -318,9 +363,9 @@ def read_file(self, fileob): """Read the metadata values from a file object.""" msg = message_from_file(fileob) - self.version = msg['metadata-version'] + self._fields['Metadata-Version'] = msg['metadata-version'] - for field in _version2fieldlist(self.version): + for field in _version2fieldlist(self['Metadata-Version']): if field in _LISTFIELDS: # we can have multiple lines values = msg.get_all(field) @@ -344,7 +389,7 @@ def write_file(self, fileobject): """Write the PKG-INFO format data to a file object.""" self._set_best_version() - for field in 
_version2fieldlist(self.version): + for field in _version2fieldlist(self['Metadata-Version']): values = self.get(field) if field in _ELEMENTSFIELD: self._write_field(fileobject, field, ','.join(values)) @@ -509,7 +554,7 @@ # Mapping API def keys(self): - return _version2fieldlist(self.version) + return _version2fieldlist(self['Metadata-Version']) def values(self): return [self[key] for key in self.keys()] diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -4,7 +4,7 @@ import platform from StringIO import StringIO -from distutils2.metadata import (Metadata, +from distutils2.metadata import (Metadata, get_metadata_version, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -126,18 +126,23 @@ del metadata['Obsoletes-Dist'] metadata['Version'] = '1' self.assertEqual(metadata['Metadata-Version'], '1.0') + self.assertEqual(get_metadata_version(metadata), '1.0') PKG_INFO = os.path.join(os.path.dirname(__file__), 'SETUPTOOLS-PKG-INFO') metadata.read_file(StringIO(open(PKG_INFO).read())) self.assertEqual(metadata['Metadata-Version'], '1.0') + self.assertEqual(get_metadata_version(metadata), '1.0') PKG_INFO = os.path.join(os.path.dirname(__file__), 'SETUPTOOLS-PKG-INFO2') metadata.read_file(StringIO(open(PKG_INFO).read())) self.assertEqual(metadata['Metadata-Version'], '1.1') + self.assertEqual(get_metadata_version(metadata), '1.1') - metadata.version = '1.618' + # Update the _fields dict directly to prevent 'Metadata-Version' + # from being updated by the _set_best_version() method. + metadata._fields['Metadata-Version'] = '1.618' self.assertRaises(MetadataUnrecognizedVersionError, metadata.keys) # XXX Spurious Warnings were disabled @@ -169,7 +174,7 @@ metadata['Project-URL'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-URL'], [('one', 'http://ok')]) - self.assertEqual(metadata.version, '1.2') + self.assertEqual(metadata['Metadata-Version'], '1.2') def test_check_version(self): metadata = Metadata() @@ -244,9 +249,13 @@ def test_best_choice(self): metadata = Metadata() metadata['Version'] = '1.0' - self.assertEqual(metadata.version, PKG_INFO_PREFERRED_VERSION) + self.assertEqual(metadata['Metadata-Version'], + PKG_INFO_PREFERRED_VERSION) + self.assertEqual(get_metadata_version(metadata), + PKG_INFO_PREFERRED_VERSION) metadata['Classifier'] = ['ok'] - self.assertEqual(metadata.version, '1.2') + self.assertEqual(metadata['Metadata-Version'], '1.2') + self.assertEqual(get_metadata_version(metadata), '1.2') def test_project_urls(self): # project-url is a bit specific, make sure we write it diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -901,41 +901,6 @@ return {} -def metadata_to_dict(meta): - """XXX might want to move it to the Metadata class.""" - data = { - 'metadata_version': meta.version, - 'name': meta['Name'], - 'version': meta['Version'], - 'summary': meta['Summary'], - 'home_page': meta['Home-page'], - 'author': meta['Author'], - 'author_email': meta['Author-email'], - 'license': meta['License'], - 'description': meta['Description'], - 'keywords': meta['Keywords'], - 'platform': meta['Platform'], - 'classifier': meta['Classifier'], - 'download_url': meta['Download-URL'], - } - - if meta.version == '1.2': - data['requires_dist'] = meta['Requires-Dist'] - data['requires_python'] = meta['Requires-Python'] - 
data['requires_external'] = meta['Requires-External'] - data['provides_dist'] = meta['Provides-Dist'] - data['obsoletes_dist'] = meta['Obsoletes-Dist'] - data['project_url'] = [','.join(url) for url in - meta['Project-URL']] - - elif meta.version == '1.1': - data['provides'] = meta['Provides'] - data['requires'] = meta['Requires'] - data['obsoletes'] = meta['Obsoletes'] - - return data - - # utility functions for 2to3 support def run_2to3(files, doctests_only=False, fixer_names=None, -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Mock sysconfig.get_paths in the mkcfg tests, in order to have the same output Message-ID: tarek.ziade pushed f20a7df21e50 to distutils2: http://hg.python.org/distutils2/rev/f20a7df21e50 changeset: 1021:f20a7df21e50 user: Alexis Metaireau date: Sat Feb 12 00:02:26 2011 +0000 summary: Mock sysconfig.get_paths in the mkcfg tests, in order to have the same output in everywhere. files: distutils2/tests/test_mkcfg.py diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -8,6 +8,7 @@ from distutils2.tests import run_unittest, support, unittest from distutils2.mkcfg import MainProgram from distutils2.mkcfg import ask_yn, ask, main +from distutils2._backport import sysconfig class MkcfgTestCase(support.TempdirManager, @@ -22,12 +23,18 @@ self._cwd = os.getcwd() self.wdir = self.mkdtemp() os.chdir(self.wdir) + # patch sysconfig + self._old_get_paths = sysconfig.get_paths + sysconfig.get_paths = lambda *args, **kwargs: { + 'man': sys.prefix + '/share/man', + 'doc': sys.prefix + '/share/doc/pyxfoil',} def tearDown(self): super(MkcfgTestCase, self).tearDown() sys.stdin = self._stdin sys.stdout = self._stdout os.chdir(self._cwd) + sysconfig.get_paths = self._old_get_paths def test_ask_yn(self): sys.stdin.write('y\n') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Move installation exceptions from distutils2.install to distutils2.errors and Message-ID: tarek.ziade pushed 5dd221e9bb49 to distutils2: http://hg.python.org/distutils2/rev/5dd221e9bb49 changeset: 1023:5dd221e9bb49 user: Kelsey Hightower date: Sun Feb 13 12:07:46 2011 -0500 summary: Move installation exceptions from distutils2.install to distutils2.errors and pep8 clean up on both modules files: distutils2/errors.py distutils2/install.py diff --git a/distutils2/errors.py b/distutils2/errors.py --- a/distutils2/errors.py +++ b/distutils2/errors.py @@ -9,7 +9,6 @@ symbols whose names start with "Distutils" and end with "Error".""" - class DistutilsError(Exception): """The root of all Distutils evil.""" @@ -135,3 +134,11 @@ This guard can be disabled by setting that option False. 
""" pass + + +class InstallationException(Exception): + """Base exception for installation scripts""" + + +class InstallationConflict(InstallationException): + """Raised when a conflict is detected""" diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -22,7 +22,8 @@ from distutils2.depgraph import generate_graph from distutils2.index import wrapper from distutils2.index.errors import ProjectNotFound, ReleaseNotFound -from distutils2.errors import DistutilsError +from distutils2.errors import (DistutilsError, InstallationException, + InstallationConflict) from distutils2.version import get_version_predicate @@ -30,14 +31,6 @@ 'install'] -class InstallationException(Exception): - """Base exception for installation scripts""" - - -class InstallationConflict(InstallationException): - """Raised when a conflict is detected""" - - def _move_files(files, destination): """Move the list of files in the destination folder, keeping the same structure. @@ -364,7 +357,7 @@ file_count = 0 for file_ in rmfiles: os.remove(file_) - file_count +=1 + file_count += 1 dir_count = 0 for dirname in rmdirs: -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Added documentation about extra_path. Original patch by Ronald Oussoren Message-ID: tarek.ziade pushed 121bdbb58399 to distutils2: http://hg.python.org/distutils2/rev/121bdbb58399 changeset: 1022:121bdbb58399 user: Alexis Metaireau date: Sun Feb 13 14:33:09 2011 +0000 summary: Added documentation about extra_path. Original patch by Ronald Oussoren files: docs/source/distutils/apiref.rst diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -104,6 +104,26 @@ | *package_dir* | A mapping of package to | a dictionary | | | directory names | | +--------------------+--------------------------------+-------------------------------------------------------------+ + | *extra_path* | Information about an | a string, 1-tuple or 2-tuple | + | | intervening directory the | | + | | install directory and the | | + | | actual installation directory. | | + | | | | + | | If the value is a string is is | | + | | treated as a comma-separated | | + | | tuple. | | + | | | | + | | If the value is a 2-tuple, | | + | | the first element is the | | + | | ``.pth`` file and the second | | + | | is the name of the intervening | | + | | directory. | | + | | | | + | | If the value is a 1-tuple that | | + | | element is both the name of | | + | | the ``.pth`` file and the | | + | | intervening directory. 
| | + +--------------------+--------------------------------+-------------------------------------------------------------+ -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Fixing a bug in distutils2.index.dist.DistInfo.unpack; does not unpack into the Message-ID: tarek.ziade pushed c3706e13ec0b to distutils2: http://hg.python.org/distutils2/rev/c3706e13ec0b changeset: 1024:c3706e13ec0b user: Kelsey Hightower date: Sun Feb 13 12:09:44 2011 -0500 summary: Fixing a bug in distutils2.index.dist.DistInfo.unpack; does not unpack into the path specified in the path argument. files: distutils2/index/dist.py distutils2/tests/test_index_dist.py diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -324,7 +324,7 @@ filename = self.download(path) content_type = mimetypes.guess_type(filename)[0] - self._unpacked_dir = unpack_archive(filename) + self._unpacked_dir = unpack_archive(filename, path) return self._unpacked_dir diff --git a/distutils2/tests/test_index_dist.py b/distutils2/tests/test_index_dist.py --- a/distutils2/tests/test_index_dist.py +++ b/distutils2/tests/test_index_dist.py @@ -160,13 +160,23 @@ @use_pypi_server('downloads_with_md5') def test_unpack(self, server): url = "%s/simple/foobar/foobar-0.1.tar.gz" % server.full_address - dist = Dist(url=url) + dist1 = Dist(url=url) # doing an unpack - here = self.mkdtemp() - there = dist.unpack(here) - result = os.listdir(there) - self.assertIn('paf', result) - os.remove('paf') + dist1_here = self.mkdtemp() + dist1_there = dist1.unpack(path=dist1_here) + # assert we unpack to the path provided + self.assertEqual(dist1_here, dist1_there) + dist1_result = os.listdir(dist1_there) + self.assertIn('paf', dist1_result) + os.remove(os.path.join(dist1_there, 'paf')) + + # Test unpack works without a path argument + dist2 = Dist(url=url) + # doing an unpack + dist2_there = dist2.unpack() + dist2_result = os.listdir(dist2_there) + self.assertIn('paf', dist2_result) + os.remove(os.path.join(dist2_there, 'paf')) def test_hashname(self): # Invalid hashnames raises an exception on assignation -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Add example hierarchy for data files (example of wiki) Message-ID: tarek.ziade pushed e418b51ecbed to distutils2: http://hg.python.org/distutils2/rev/e418b51ecbed changeset: 1025:e418b51ecbed parent: 896:272155a17d56 user: FELD Boris date: Thu Jan 27 16:30:13 2011 +0100 summary: Add example hierarchy for data files (example of wiki) files: distutils2/tests/test_data_files/README distutils2/tests/test_data_files/mailman/database/mailman.db distutils2/tests/test_data_files/mailman/database/schemas/blah.schema distutils2/tests/test_data_files/mailman/developer-docs/api/toc.txt distutils2/tests/test_data_files/mailman/developer-docs/index.txt distutils2/tests/test_data_files/mailman/etc/my.cnf distutils2/tests/test_data_files/mailman/foo/some/path/bar/my.cfg distutils2/tests/test_data_files/mailman/foo/some/path/other.cfg distutils2/tests/test_data_files/some-new-semantic.sns distutils2/tests/test_data_files/some.tpl diff --git a/distutils2/tests/test_data_files/README 
b/distutils2/tests/test_data_files/README new file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/database/mailman.db b/distutils2/tests/test_data_files/mailman/database/mailman.db new file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/database/schemas/blah.schema b/distutils2/tests/test_data_files/mailman/database/schemas/blah.schema new file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/developer-docs/api/toc.txt b/distutils2/tests/test_data_files/mailman/developer-docs/api/toc.txt new file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/developer-docs/index.txt b/distutils2/tests/test_data_files/mailman/developer-docs/index.txt new file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/etc/my.cnf b/distutils2/tests/test_data_files/mailman/etc/my.cnf new file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/foo/some/path/bar/my.cfg b/distutils2/tests/test_data_files/mailman/foo/some/path/bar/my.cfg new file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/foo/some/path/other.cfg b/distutils2/tests/test_data_files/mailman/foo/some/path/other.cfg new file mode 100644 diff --git a/distutils2/tests/test_data_files/some-new-semantic.sns b/distutils2/tests/test_data_files/some-new-semantic.sns new file mode 100644 diff --git a/distutils2/tests/test_data_files/some.tpl b/distutils2/tests/test_data_files/some.tpl new file mode 100644 -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Add distutils2 data files glob and unittest. Message-ID: tarek.ziade pushed 866f5fad7919 to distutils2: http://hg.python.org/distutils2/rev/866f5fad7919 changeset: 1026:866f5fad7919 user: FELD Boris date: Thu Jan 27 17:15:13 2011 +0100 summary: Add distutils2 data files glob and unittest. 
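For readers following along, a minimal stand-alone sketch of the idea behind the glob/unittest work in this changeset (the function name and the rules here are invented for illustration; the changeset itself adds a similar helper, check_glob(), shown below)::

    from glob import glob

    def categorize(rules):
        # Map every file matched by a glob pattern to the set of
        # categories whose pattern it matches -- roughly the kind of
        # mapping the check_glob() helper below computes.
        mapping = {}
        for path_glob, category in rules:
            for filepath in glob(path_glob):
                mapping.setdefault(filepath, set()).add(category)
        return mapping

    # Assuming a scripts/ directory containing script.bin:
    # categorize([('scripts/*.bin', '{appdata}'), ('scripts/*', '{appscript}')])
    # would map 'scripts/script.bin' to both '{appdata}' and '{appscript}'.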
files: distutils2/data/__init__.py distutils2/data/tools.py distutils2/tests/test_data_files.py diff --git a/distutils2/data/__init__.py b/distutils2/data/__init__.py new file mode 100644 --- /dev/null +++ b/distutils2/data/__init__.py @@ -0,0 +1,3 @@ +"""distutils2.data + +contains tools for processing data files paths.""" \ No newline at end of file diff --git a/distutils2/data/tools.py b/distutils2/data/tools.py new file mode 100644 --- /dev/null +++ b/distutils2/data/tools.py @@ -0,0 +1,12 @@ +from glob import glob + +def check_glob(ressources): + correspondence = {} + for (path_glob,category) in ressources: + filepaths = glob(path_glob) + for filepath in filepaths: + if not filepath in correspondence: + correspondence[filepath] = [] + if not category in correspondence[filepath]: + correspondence[filepath].append(category) + return correspondence \ No newline at end of file diff --git a/distutils2/tests/test_data_files.py b/distutils2/tests/test_data_files.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_data_files.py @@ -0,0 +1,58 @@ +# -*- encoding: utf-8 -*- +"""Tests for distutils.data.""" +import os +import sys +from StringIO import StringIO + +from distutils2.tests import unittest, support, run_unittest +from distutils2.data.tools import check_glob + +class DataFilesTestCase(support.TempdirManager, + support.LoggingCatcher, + unittest.TestCase): + + def setUp(self): + super(DataFilesTestCase, self).setUp() + self.addCleanup(setattr, sys, 'stdout', sys.stdout) + self.addCleanup(setattr, sys, 'stderr', sys.stderr) + self.addCleanup(os.chdir, os.getcwd()) + + def test_simple_check_glob(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_file('coucou.tpl', '') + category = '{data}' + self.assertEquals( + check_glob([('*.tpl', category)]), + {'coucou.tpl' : [category]}) + + def test_multiple_glob_same_category(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + os.mkdir('scripts') + path = os.path.join('scripts', 'script.bin') + self.write_file(path, '') + category = '{appdata}' + self.assertEquals( + check_glob( + [('**/*.bin', category), ('scripts/*', category)]), + {path : [category]}) + + def test_multiple_glob_different_category(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + os.mkdir('scripts') + path = os.path.join('scripts', 'script.bin') + self.write_file(path, '') + category_1 = '{appdata}' + category_2 = '{appscript}' + self.assertEquals( + check_glob( + [('**/*.bin', category_1), ('scripts/*', category_2)]), + {path : [category_1, category_2]}) + +def test_suite(): + return unittest.makeSuite(DataFilesTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) \ No newline at end of file -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: [DATA FILES] Add one assert in test_rglob. All tests green, chuck norris will Message-ID: tarek.ziade pushed 84c024423fb3 to distutils2: http://hg.python.org/distutils2/rev/84c024423fb3 changeset: 1029:84c024423fb3 user: FELD Boris date: Thu Jan 27 19:44:36 2011 +0100 summary: [DATA FILES] Add one assert in test_rglob. All tests green, chuck norris will be happy. 
files: distutils2/tests/test_data_files.py diff --git a/distutils2/tests/test_data_files.py b/distutils2/tests/test_data_files.py --- a/distutils2/tests/test_data_files.py +++ b/distutils2/tests/test_data_files.py @@ -21,7 +21,7 @@ os.makedirs(os.path.join('mailman', 'database', 'schemas')) os.makedirs(os.path.join('mailman', 'etc')) os.makedirs(os.path.join('mailman', 'foo', 'some', 'path', 'bar')) - os.makedirs(os.path.join('developpers-docs', 'api')) + os.makedirs(os.path.join('developer-docs', 'api')) self.write_file('README', '') self.write_file('some.tpl', '') @@ -31,8 +31,8 @@ self.write_file(os.path.join('mailman', 'etc', 'my.cnf'), '') self.write_file(os.path.join('mailman', 'foo', 'some', 'path', 'bar', 'my.cfg'), '') self.write_file(os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg'), '') - self.write_file(os.path.join('developpers-docs', 'index.txt'), '') - self.write_file(os.path.join('developpers-docs', 'api', 'toc.txt'), '') + self.write_file(os.path.join('developer-docs', 'index.txt'), '') + self.write_file(os.path.join('developer-docs', 'api', 'toc.txt'), '') def test_simple_glob(self): @@ -73,14 +73,16 @@ tempdir = self.mkdtemp() os.chdir(tempdir) os.makedirs(os.path.join('scripts', 'bin')) + path0 = 'binary0.bin' path1 = os.path.join('scripts', 'binary1.bin') - path2 = os.path.join('scripts', 'bin', 'binary1.bin') + path2 = os.path.join('scripts', 'bin', 'binary2.bin') + self.write_file(path0, '') self.write_file(path1, '') self.write_file(path2, '') category = '{bin}' self.assertEquals(check_glob( [('**/*.bin', category)]), - {path1 : set([category]), path2 : set([category])}) + {path0 : set([category]), path1 : set([category]), path2 : set([category])}) def test_final_exemple_glob(self): tempdir = self.mkdtemp() @@ -99,15 +101,15 @@ result = { os.path.join('mailman', 'database', 'schemas', 'blah.schema') : set([resources[0][1]]), 'some.tpl' : set([resources[1][1]]), - os.path.join('developpers-docs', 'index.txt') : set([resources[2][1]]), - os.path.join('developpers-docs', 'api', 'toc.txt') : set([resources[2][1]]), + os.path.join('developer-docs', 'index.txt') : set([resources[2][1]]), + os.path.join('developer-docs', 'api', 'toc.txt') : set([resources[2][1]]), 'README' : set([resources[3][1]]), os.path.join('mailman', 'etc', 'my.cnf') : set([resources[4][1]]), os.path.join('mailman', 'foo', 'some', 'path', 'bar', 'my.cfg') : set([resources[5][1], resources[6][1]]), os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg') : set([resources[6][1]]), 'some-new-semantic.sns' : set([resources[7][1]]) } - self.assertEquals(sorted(check_glob(resources)), sorted(result)) + self.assertEquals(check_glob(resources), result) def test_suite(): return unittest.makeSuite(DataFilesTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: [DATA FILES] Add support for double wildcard. Message-ID: tarek.ziade pushed 747d2621234b to distutils2: http://hg.python.org/distutils2/rev/747d2621234b changeset: 1028:747d2621234b user: FELD Boris date: Thu Jan 27 19:31:52 2011 +0100 summary: [DATA FILES] Add support for double wildcard. 
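To illustrate what double-wildcard support means in practice, here is a small self-contained sketch -- not the code added in this changeset, which implements it with rglob()/glob() inside distutils2 -- of matching a pattern at every directory level under a base directory::

    import fnmatch
    import os

    def recursive_glob(base, pattern):
        # Walk the whole tree under *base* and yield every file whose
        # name matches *pattern*, i.e. roughly what a '**/<pattern>'
        # rule is meant to select.
        for dirpath, dirnames, filenames in os.walk(base):
            for name in filenames:
                if fnmatch.fnmatch(name, pattern):
                    yield os.path.join(dirpath, name)

    # Example: list(recursive_glob('scripts', '*.bin')) would pick up
    # scripts/binary1.bin as well as scripts/bin/binary2.bin.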
files: distutils2/data/tools.py distutils2/tests/test_data_files.py diff --git a/distutils2/data/tools.py b/distutils2/data/tools.py --- a/distutils2/data/tools.py +++ b/distutils2/data/tools.py @@ -1,12 +1,36 @@ -from glob import glob +from glob import glob as simple_glob +import os +from os import path as osp + +def glob(path_glob): + if '**' in path_glob: + return rglob(path_glob) + else: + return simple_glob(path_glob) def check_glob(ressources): - correspondence = {} + destinations = {} for (path_glob,category) in ressources: - filepaths = glob(path_glob) - for filepath in filepaths: - if not filepath in correspondence: - correspondence[filepath] = [] - if not category in correspondence[filepath]: - correspondence[filepath].append(category) - return correspondence \ No newline at end of file + project_path = os.getcwd() + abspath_glob = osp.join(project_path, path_glob) + + for file in glob(abspath_glob): + file = file[len(project_path):].lstrip('/') + destinations.setdefault(file, set()).add(category) + + return destinations + +def rglob(path_glob): + prefix, radical = path_glob.split('**', 1) + if prefix == '': + prefix = '.' + if radical == '': + radical = '*' + else: + radical = radical.lstrip('/') + glob_files = [] + for (path, dir, files) in os.walk(prefix): + for file in glob(osp.join(prefix, path, radical)): + glob_files.append(os.path.join(prefix, file)) + + return glob_files \ No newline at end of file diff --git a/distutils2/tests/test_data_files.py b/distutils2/tests/test_data_files.py --- a/distutils2/tests/test_data_files.py +++ b/distutils2/tests/test_data_files.py @@ -16,15 +16,33 @@ self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) self.addCleanup(os.chdir, os.getcwd()) + + def build_example(self): + os.makedirs(os.path.join('mailman', 'database', 'schemas')) + os.makedirs(os.path.join('mailman', 'etc')) + os.makedirs(os.path.join('mailman', 'foo', 'some', 'path', 'bar')) + os.makedirs(os.path.join('developpers-docs', 'api')) + + self.write_file('README', '') + self.write_file('some.tpl', '') + self.write_file('some-new-semantic.sns', '') + self.write_file(os.path.join('mailman', 'database', 'mailman.db'), '') + self.write_file(os.path.join('mailman', 'database', 'schemas', 'blah.schema'), '') + self.write_file(os.path.join('mailman', 'etc', 'my.cnf'), '') + self.write_file(os.path.join('mailman', 'foo', 'some', 'path', 'bar', 'my.cfg'), '') + self.write_file(os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg'), '') + self.write_file(os.path.join('developpers-docs', 'index.txt'), '') + self.write_file(os.path.join('developpers-docs', 'api', 'toc.txt'), '') + - def test_simple_check_glob(self): + def test_simple_glob(self): tempdir = self.mkdtemp() os.chdir(tempdir) self.write_file('coucou.tpl', '') category = '{data}' self.assertEquals( check_glob([('*.tpl', category)]), - {'coucou.tpl' : [category]}) + {'coucou.tpl' : set([category])}) def test_multiple_glob_same_category(self): tempdir = self.mkdtemp() @@ -35,8 +53,8 @@ category = '{appdata}' self.assertEquals( check_glob( - [('**/*.bin', category), ('scripts/*', category)]), - {path : [category]}) + [('scripts/*.bin', category), ('scripts/*', category)]), + {path : set([category])}) def test_multiple_glob_different_category(self): tempdir = self.mkdtemp() @@ -48,8 +66,48 @@ category_2 = '{appscript}' self.assertEquals( check_glob( - [('**/*.bin', category_1), ('scripts/*', category_2)]), - {path : [category_1, category_2]}) + [('scripts/*.bin', 
category_1), ('scripts/*', category_2)]), + {path : set([category_1, category_2])}) + + def test_rglob(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + os.makedirs(os.path.join('scripts', 'bin')) + path1 = os.path.join('scripts', 'binary1.bin') + path2 = os.path.join('scripts', 'bin', 'binary1.bin') + self.write_file(path1, '') + self.write_file(path2, '') + category = '{bin}' + self.assertEquals(check_glob( + [('**/*.bin', category)]), + {path1 : set([category]), path2 : set([category])}) + + def test_final_exemple_glob(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.build_example() + resources = [ + ('mailman/database/schemas/*', '{appdata}/schemas'), + ('**/*.tpl', '{appdata}/templates'), + ('developer-docs/**/*.txt', '{doc}'), + ('README', '{doc}'), + ('mailman/etc/*', '{config}'), + ('mailman/foo/**/bar/*.cfg', '{config}/baz'), + ('mailman/foo/**/*.cfg', '{config}/hmm'), + ('some-new-semantic.sns', '{funky-crazy-category}') + ] + result = { + os.path.join('mailman', 'database', 'schemas', 'blah.schema') : set([resources[0][1]]), + 'some.tpl' : set([resources[1][1]]), + os.path.join('developpers-docs', 'index.txt') : set([resources[2][1]]), + os.path.join('developpers-docs', 'api', 'toc.txt') : set([resources[2][1]]), + 'README' : set([resources[3][1]]), + os.path.join('mailman', 'etc', 'my.cnf') : set([resources[4][1]]), + os.path.join('mailman', 'foo', 'some', 'path', 'bar', 'my.cfg') : set([resources[5][1], resources[6][1]]), + os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg') : set([resources[6][1]]), + 'some-new-semantic.sns' : set([resources[7][1]]) + } + self.assertEquals(sorted(check_glob(resources)), sorted(result)) def test_suite(): return unittest.makeSuite(DataFilesTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:56 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:56 +0100 Subject: [Python-checkins] distutils2: Remove example hierarchy for data_files. Use tempdir instead. Message-ID: tarek.ziade pushed 92a3235d193e to distutils2: http://hg.python.org/distutils2/rev/92a3235d193e changeset: 1027:92a3235d193e user: FELD Boris date: Thu Jan 27 17:15:42 2011 +0100 summary: Remove example hierarchy for data_files. Use tempdir instead. 
files: distutils2/tests/test_data_files/README distutils2/tests/test_data_files/mailman/database/mailman.db distutils2/tests/test_data_files/mailman/database/schemas/blah.schema distutils2/tests/test_data_files/mailman/developer-docs/api/toc.txt distutils2/tests/test_data_files/mailman/developer-docs/index.txt distutils2/tests/test_data_files/mailman/etc/my.cnf distutils2/tests/test_data_files/mailman/foo/some/path/bar/my.cfg distutils2/tests/test_data_files/mailman/foo/some/path/other.cfg distutils2/tests/test_data_files/some-new-semantic.sns distutils2/tests/test_data_files/some.tpl diff --git a/distutils2/tests/test_data_files/README b/distutils2/tests/test_data_files/README deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/database/mailman.db b/distutils2/tests/test_data_files/mailman/database/mailman.db deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/database/schemas/blah.schema b/distutils2/tests/test_data_files/mailman/database/schemas/blah.schema deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/developer-docs/api/toc.txt b/distutils2/tests/test_data_files/mailman/developer-docs/api/toc.txt deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/developer-docs/index.txt b/distutils2/tests/test_data_files/mailman/developer-docs/index.txt deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/etc/my.cnf b/distutils2/tests/test_data_files/mailman/etc/my.cnf deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/foo/some/path/bar/my.cfg b/distutils2/tests/test_data_files/mailman/foo/some/path/bar/my.cfg deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/mailman/foo/some/path/other.cfg b/distutils2/tests/test_data_files/mailman/foo/some/path/other.cfg deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/some-new-semantic.sns b/distutils2/tests/test_data_files/some-new-semantic.sns deleted file mode 100644 diff --git a/distutils2/tests/test_data_files/some.tpl b/distutils2/tests/test_data_files/some.tpl deleted file mode 100644 -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles/test] First step of test refactoing + last rules overwrite the others Message-ID: tarek.ziade pushed 43be86619651 to distutils2: http://hg.python.org/distutils2/rev/43be86619651 changeset: 1031:43be86619651 user: Pierre-Yves David date: Fri Jan 28 12:05:53 2011 +0100 summary: [datafiles/test] First step of test refactoing + last rules overwrite the others files: distutils2/datafiles.py distutils2/tests/__init__.py distutils2/tests/test_datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -2,24 +2,29 @@ import os from os import path as osp + +class SmartGlob(object): + + def __init__(self, path_glob): + self.path_glob = path_glob + if '**' in path_glob: + self.base = path_glob.split('**', 1)[0] # XXX not exactly what we want + else: + self.base = osp.dirname(path_glob) + + + def expand(self, basepath, category): + for file in glob(osp.join(basepath, self.path_glob)): + file = file[len(basepath):].lstrip('/') + suffix = file[len(self.base):].lstrip('/') + yield file, osp.join(category, suffix) + def glob(path_glob): if '**' in path_glob: 
return rglob(path_glob) else: return simple_glob(path_glob) -def find_glob(ressources): - destinations = {} - for (path_glob,category) in ressources: - project_path = os.getcwd() - abspath_glob = osp.join(project_path, path_glob) - - for file in glob(abspath_glob): - file = file[len(project_path):].lstrip('/') - destinations.setdefault(file, set()).add(category) - - return destinations - def rglob(path_glob): prefix, radical = path_glob.split('**', 1) if prefix == '': @@ -34,3 +39,11 @@ glob_files.append(os.path.join(prefix, file)) return glob_files + +def resources_dests(resources_dir, rules): + destinations = {} + for (path_glob, glob_dest) in rules: + sglob = SmartGlob(path_glob) + for file, file_dest in sglob.expand(resources_dir, glob_dest): + destinations[file] = file_dest + return destinations diff --git a/distutils2/tests/__init__.py b/distutils2/tests/__init__.py --- a/distutils2/tests/__init__.py +++ b/distutils2/tests/__init__.py @@ -28,6 +28,7 @@ # external release of same package for older versions import unittest2 as unittest except ImportError: + raise sys.exit('Error: You have to install unittest2') diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -5,7 +5,14 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest -from distutils2.datafiles import find_glob +from distutils2.datafiles import resources_dests +import re +from os import path as osp + + + + +SLASH = re.compile(r'[^\\](?:\\{2})* ') class DataFilesTestCase(support.TempdirManager, support.LoggingCatcher, @@ -15,80 +22,38 @@ super(DataFilesTestCase, self).setUp() self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) - self.addCleanup(os.chdir, os.getcwd()) - def build_example(self): - os.makedirs(os.path.join('mailman', 'database', 'schemas')) - os.makedirs(os.path.join('mailman', 'etc')) - os.makedirs(os.path.join('mailman', 'foo', 'some', 'path', 'bar')) - os.makedirs(os.path.join('developer-docs', 'api')) - - self.write_file('README', '') - self.write_file('some.tpl', '') - self.write_file('some-new-semantic.sns', '') - self.write_file(os.path.join('mailman', 'database', 'mailman.db'), '') - self.write_file(os.path.join('mailman', 'database', 'schemas', 'blah.schema'), '') - self.write_file(os.path.join('mailman', 'etc', 'my.cnf'), '') - self.write_file(os.path.join('mailman', 'foo', 'some', 'path', 'bar', 'my.cfg'), '') - self.write_file(os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg'), '') - self.write_file(os.path.join('developer-docs', 'index.txt'), '') - self.write_file(os.path.join('developer-docs', 'api', 'toc.txt'), '') - + def assertFindGlob(self, rules, spec): + tempdir = self.mkdtemp() + for filepath in spec: + filepath = osp.join(tempdir, *SLASH.split(filepath)) + dirname = osp.dirname(filepath) + if dirname and not osp.exists(dirname): + os.makedirs(dirname) + self.write_file(filepath, 'babar') + result = resources_dests(tempdir, rules) + self.assertEquals(spec, result) def test_simple_glob(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_file('coucou.tpl', '') - category = '{data}' - self.assertEquals( - find_glob([('*.tpl', category)]), - {'coucou.tpl' : set([category])}) + rules = [('*.tpl', '{data}')] + spec = {'coucou.tpl': '{data}/coucou.tpl'} + self.assertFindGlob(rules, spec) - def test_multiple_glob_same_category(self): - tempdir = self.mkdtemp() - 
os.chdir(tempdir) - os.mkdir('scripts') - path = os.path.join('scripts', 'script.bin') - self.write_file(path, '') - category = '{appdata}' - self.assertEquals( - find_glob( - [('scripts/*.bin', category), ('scripts/*', category)]), - {path : set([category])}) + def test_multiple_match(self): + rules = [('scripts/*.bin', '{appdata}'), + ('scripts/*', '{appscript}')] + spec = {'scripts/script.bin': '{appscript}/script.bin'} + self.assertFindGlob(rules, spec) - def test_multiple_glob_different_category(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - os.mkdir('scripts') - path = os.path.join('scripts', 'script.bin') - self.write_file(path, '') - category_1 = '{appdata}' - category_2 = '{appscript}' - self.assertEquals( - find_glob( - [('scripts/*.bin', category_1), ('scripts/*', category_2)]), - {path : set([category_1, category_2])}) - - def test_rglob(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - os.makedirs(os.path.join('scripts', 'bin')) - path0 = 'binary0.bin' - path1 = os.path.join('scripts', 'binary1.bin') - path2 = os.path.join('scripts', 'bin', 'binary2.bin') - self.write_file(path0, '') - self.write_file(path1, '') - self.write_file(path2, '') - category = '{bin}' - self.assertEquals(find_glob( - [('**/*.bin', category)]), - {path0 : set([category]), path1 : set([category]), path2 : set([category])}) + def test_recursive_glob(self): + rules = [('**/*.bin', '{binary}')] + spec = {'binary0.bin': '{binary}/binary0.bin', + 'scripts/binary1.bin': '{binary}/scripts/binary1.bin', + 'scripts/bin/binary2.bin': '{binary}/scripts/bin/binary2.bin'} + self.assertFindGlob(rules, spec) def test_final_exemple_glob(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.build_example() - resources = [ + rules = [ ('mailman/database/schemas/*', '{appdata}/schemas'), ('**/*.tpl', '{appdata}/templates'), ('developer-docs/**/*.txt', '{doc}'), @@ -98,6 +63,19 @@ ('mailman/foo/**/*.cfg', '{config}/hmm'), ('some-new-semantic.sns', '{funky-crazy-category}') ] + spec = { + 'README': '{doc}/README', + 'some.tpl': '{appdata}/templates/some.tpl', + 'some-new-semantic.sns': '{funky-crazy-category}/some-new-semantic.sns', + 'mailman/database/mailman.db': None, + 'mailman/database/schemas/blah.schema': '{appdata}/schemas/blah.schema', + 'mailman/etc/my.cnf': '{config}/my.cnf', + 'mailman/foo/some/path/bar/my.cfg': '{config}/hmm/some/path/bar/my.cfg', + 'mailman/foo/some/path/other.cfg': '{config}/hmm/some/path/bar/other.cfg', + 'developer-docs/index.txt': '{doc}/index.txt', + 'developer-docs/api/toc.txt': '{doc}/api/toc.txt', + } + self.assertFindGlob(rules, spec) result = { os.path.join('mailman', 'database', 'schemas', 'blah.schema') : set([resources[0][1]]), 'some.tpl' : set([resources[1][1]]), -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles] rename modules and function Message-ID: tarek.ziade pushed b15201b9e015 to distutils2: http://hg.python.org/distutils2/rev/b15201b9e015 changeset: 1030:b15201b9e015 user: Pierre-Yves David date: Thu Jan 27 19:57:44 2011 +0100 summary: [datafiles] rename modules and function files: distutils2/data/__init__.py distutils2/data/tools.py distutils2/datafiles.py distutils2/tests/test_data_files.py distutils2/tests/test_datafiles.py diff --git a/distutils2/data/__init__.py b/distutils2/data/__init__.py deleted file mode 100644 --- a/distutils2/data/__init__.py 
+++ /dev/null @@ -1,3 +0,0 @@ -"""distutils2.data - -contains tools for processing data files paths.""" \ No newline at end of file diff --git a/distutils2/data/tools.py b/distutils2/datafiles.py rename from distutils2/data/tools.py rename to distutils2/datafiles.py --- a/distutils2/data/tools.py +++ b/distutils2/datafiles.py @@ -8,16 +8,16 @@ else: return simple_glob(path_glob) -def check_glob(ressources): +def find_glob(ressources): destinations = {} for (path_glob,category) in ressources: project_path = os.getcwd() abspath_glob = osp.join(project_path, path_glob) - + for file in glob(abspath_glob): file = file[len(project_path):].lstrip('/') destinations.setdefault(file, set()).add(category) - + return destinations def rglob(path_glob): @@ -28,9 +28,9 @@ radical = '*' else: radical = radical.lstrip('/') - glob_files = [] + glob_files = [] for (path, dir, files) in os.walk(prefix): for file in glob(osp.join(prefix, path, radical)): glob_files.append(os.path.join(prefix, file)) - - return glob_files \ No newline at end of file + + return glob_files diff --git a/distutils2/tests/test_data_files.py b/distutils2/tests/test_datafiles.py rename from distutils2/tests/test_data_files.py rename to distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_data_files.py +++ b/distutils2/tests/test_datafiles.py @@ -5,24 +5,24 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest -from distutils2.data.tools import check_glob +from distutils2.datafiles import find_glob class DataFilesTestCase(support.TempdirManager, support.LoggingCatcher, unittest.TestCase): - + def setUp(self): super(DataFilesTestCase, self).setUp() self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) self.addCleanup(os.chdir, os.getcwd()) - + def build_example(self): os.makedirs(os.path.join('mailman', 'database', 'schemas')) os.makedirs(os.path.join('mailman', 'etc')) os.makedirs(os.path.join('mailman', 'foo', 'some', 'path', 'bar')) os.makedirs(os.path.join('developer-docs', 'api')) - + self.write_file('README', '') self.write_file('some.tpl', '') self.write_file('some-new-semantic.sns', '') @@ -33,17 +33,17 @@ self.write_file(os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg'), '') self.write_file(os.path.join('developer-docs', 'index.txt'), '') self.write_file(os.path.join('developer-docs', 'api', 'toc.txt'), '') - - + + def test_simple_glob(self): tempdir = self.mkdtemp() os.chdir(tempdir) self.write_file('coucou.tpl', '') category = '{data}' self.assertEquals( - check_glob([('*.tpl', category)]), + find_glob([('*.tpl', category)]), {'coucou.tpl' : set([category])}) - + def test_multiple_glob_same_category(self): tempdir = self.mkdtemp() os.chdir(tempdir) @@ -52,10 +52,10 @@ self.write_file(path, '') category = '{appdata}' self.assertEquals( - check_glob( + find_glob( [('scripts/*.bin', category), ('scripts/*', category)]), {path : set([category])}) - + def test_multiple_glob_different_category(self): tempdir = self.mkdtemp() os.chdir(tempdir) @@ -65,10 +65,10 @@ category_1 = '{appdata}' category_2 = '{appscript}' self.assertEquals( - check_glob( + find_glob( [('scripts/*.bin', category_1), ('scripts/*', category_2)]), {path : set([category_1, category_2])}) - + def test_rglob(self): tempdir = self.mkdtemp() os.chdir(tempdir) @@ -80,10 +80,10 @@ self.write_file(path1, '') self.write_file(path2, '') category = '{bin}' - self.assertEquals(check_glob( + self.assertEquals(find_glob( [('**/*.bin', category)]), {path0 : 
set([category]), path1 : set([category]), path2 : set([category])}) - + def test_final_exemple_glob(self): tempdir = self.mkdtemp() os.chdir(tempdir) @@ -109,10 +109,10 @@ os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg') : set([resources[6][1]]), 'some-new-semantic.sns' : set([resources[7][1]]) } - self.assertEquals(check_glob(resources), result) - + self.assertEquals(find_glob(resources), result) + def test_suite(): return unittest.makeSuite(DataFilesTestCase) if __name__ == '__main__': - run_unittest(test_suite()) \ No newline at end of file + run_unittest(test_suite()) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles] implement suffix in resources_dests Message-ID: tarek.ziade pushed 045d9c2d723a to distutils2: http://hg.python.org/distutils2/rev/045d9c2d723a changeset: 1032:045d9c2d723a user: Pierre-Yves David date: Fri Jan 28 12:59:54 2011 +0100 summary: [datafiles] implement suffix in resources_dests files: distutils2/datafiles.py distutils2/tests/test_datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -5,25 +5,28 @@ class SmartGlob(object): - def __init__(self, path_glob): - self.path_glob = path_glob - if '**' in path_glob: - self.base = path_glob.split('**', 1)[0] # XXX not exactly what we want - else: - self.base = osp.dirname(path_glob) + def __init__(self, base, suffix): + self.base = base + self.suffix = suffix def expand(self, basepath, category): - for file in glob(osp.join(basepath, self.path_glob)): - file = file[len(basepath):].lstrip('/') - suffix = file[len(self.base):].lstrip('/') - yield file, osp.join(category, suffix) + if self.base: + base = osp.join(basepath, self.base) + else: + base = basepath + absglob = osp.join(base, self.suffix) + for file in glob(absglob): + path_suffix = file[len(base):].lstrip('/') + relpath = file[len(basepath):].lstrip('/') + yield relpath, osp.join(category, path_suffix) def glob(path_glob): if '**' in path_glob: - return rglob(path_glob) + files = rglob(path_glob) else: - return simple_glob(path_glob) + files = simple_glob(path_glob) + return files def rglob(path_glob): prefix, radical = path_glob.split('**', 1) @@ -37,13 +40,12 @@ for (path, dir, files) in os.walk(prefix): for file in glob(osp.join(prefix, path, radical)): glob_files.append(os.path.join(prefix, file)) - return glob_files def resources_dests(resources_dir, rules): destinations = {} - for (path_glob, glob_dest) in rules: - sglob = SmartGlob(path_glob) + for (base, suffix, glob_dest) in rules: + sglob = SmartGlob(base, suffix) for file, file_dest in sglob.expand(resources_dir, glob_dest): destinations[file] = file_dest return destinations diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -12,7 +12,7 @@ -SLASH = re.compile(r'[^\\](?:\\{2})* ') +SLASH = re.compile(r'(?<=[^\\])(?:\\{2})*/') class DataFilesTestCase(support.TempdirManager, support.LoggingCatcher, @@ -31,37 +31,43 @@ if dirname and not osp.exists(dirname): os.makedirs(dirname) self.write_file(filepath, 'babar') + for key, value in list(spec.items()): + if value is None: + del spec[key] result = resources_dests(tempdir, rules) self.assertEquals(spec, result) def test_simple_glob(self): - rules = 
[('*.tpl', '{data}')] - spec = {'coucou.tpl': '{data}/coucou.tpl'} + rules = [('', '*.tpl', '{data}')] + spec = {'coucou.tpl': '{data}/coucou.tpl', + 'Donotwant': None} self.assertFindGlob(rules, spec) def test_multiple_match(self): - rules = [('scripts/*.bin', '{appdata}'), - ('scripts/*', '{appscript}')] - spec = {'scripts/script.bin': '{appscript}/script.bin'} + rules = [('scripts', '*.bin', '{appdata}'), + ('scripts', '*', '{appscript}')] + spec = {'scripts/script.bin': '{appscript}/script.bin', + 'Babarlikestrawberry': None} self.assertFindGlob(rules, spec) def test_recursive_glob(self): - rules = [('**/*.bin', '{binary}')] + rules = [('', '**/*.bin', '{binary}')] spec = {'binary0.bin': '{binary}/binary0.bin', 'scripts/binary1.bin': '{binary}/scripts/binary1.bin', - 'scripts/bin/binary2.bin': '{binary}/scripts/bin/binary2.bin'} + 'scripts/bin/binary2.bin': '{binary}/scripts/bin/binary2.bin', + 'you/kill/pandabear.guy': None} self.assertFindGlob(rules, spec) def test_final_exemple_glob(self): rules = [ - ('mailman/database/schemas/*', '{appdata}/schemas'), - ('**/*.tpl', '{appdata}/templates'), - ('developer-docs/**/*.txt', '{doc}'), - ('README', '{doc}'), - ('mailman/etc/*', '{config}'), - ('mailman/foo/**/bar/*.cfg', '{config}/baz'), - ('mailman/foo/**/*.cfg', '{config}/hmm'), - ('some-new-semantic.sns', '{funky-crazy-category}') + ('mailman/database/schemas/','*', '{appdata}/schemas'), + ('', '**/*.tpl', '{appdata}/templates'), + ('developer-docs/', '**/*.txt', '{doc}'), + ('', 'README', '{doc}'), + ('mailman/etc/', '*', '{config}'), + ('mailman/foo/', '**/bar/*.cfg', '{config}/baz'), + ('mailman/foo/', '**/*.cfg', '{config}/hmm'), + ('', 'some-new-semantic.sns', '{funky-crazy-category}') ] spec = { 'README': '{doc}/README', @@ -71,23 +77,12 @@ 'mailman/database/schemas/blah.schema': '{appdata}/schemas/blah.schema', 'mailman/etc/my.cnf': '{config}/my.cnf', 'mailman/foo/some/path/bar/my.cfg': '{config}/hmm/some/path/bar/my.cfg', - 'mailman/foo/some/path/other.cfg': '{config}/hmm/some/path/bar/other.cfg', + 'mailman/foo/some/path/other.cfg': '{config}/hmm/some/path/other.cfg', 'developer-docs/index.txt': '{doc}/index.txt', 'developer-docs/api/toc.txt': '{doc}/api/toc.txt', } + self.maxDiff = None self.assertFindGlob(rules, spec) - result = { - os.path.join('mailman', 'database', 'schemas', 'blah.schema') : set([resources[0][1]]), - 'some.tpl' : set([resources[1][1]]), - os.path.join('developer-docs', 'index.txt') : set([resources[2][1]]), - os.path.join('developer-docs', 'api', 'toc.txt') : set([resources[2][1]]), - 'README' : set([resources[3][1]]), - os.path.join('mailman', 'etc', 'my.cnf') : set([resources[4][1]]), - os.path.join('mailman', 'foo', 'some', 'path', 'bar', 'my.cfg') : set([resources[5][1], resources[6][1]]), - os.path.join('mailman', 'foo', 'some', 'path', 'other.cfg') : set([resources[6][1]]), - 'some-new-semantic.sns' : set([resources[7][1]]) - } - self.assertEquals(find_glob(resources), result) def test_suite(): return unittest.makeSuite(DataFilesTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles/test] more complete suffix test Message-ID: tarek.ziade pushed 15b6a9c06b89 to distutils2: http://hg.python.org/distutils2/rev/15b6a9c06b89 changeset: 1034:15b6a9c06b89 user: Pierre-Yves David date: Fri Jan 28 13:02:36 2011 +0100 summary: [datafiles/test] more complete 
suffix test files: distutils2/tests/test_datafiles.py diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -62,7 +62,7 @@ rules = [ ('mailman/database/schemas/','*', '{appdata}/schemas'), ('', '**/*.tpl', '{appdata}/templates'), - ('developer-docs/', '**/*.txt', '{doc}'), + ('', 'developer-docs/**/*.txt', '{doc}'), ('', 'README', '{doc}'), ('mailman/etc/', '*', '{config}'), ('mailman/foo/', '**/bar/*.cfg', '{config}/baz'), @@ -78,8 +78,8 @@ 'mailman/etc/my.cnf': '{config}/my.cnf', 'mailman/foo/some/path/bar/my.cfg': '{config}/hmm/some/path/bar/my.cfg', 'mailman/foo/some/path/other.cfg': '{config}/hmm/some/path/other.cfg', - 'developer-docs/index.txt': '{doc}/index.txt', - 'developer-docs/api/toc.txt': '{doc}/api/toc.txt', + 'developer-docs/index.txt': '{doc}/developer-docs/index.txt', + 'developer-docs/api/toc.txt': '{doc}/developer-docs/api/toc.txt', } self.maxDiff = None self.assertFindGlob(rules, spec) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles] add {opt1, opt2, opt3} options to glob Message-ID: tarek.ziade pushed 94f663f71283 to distutils2: http://hg.python.org/distutils2/rev/94f663f71283 changeset: 1035:94f663f71283 user: Pierre-Yves David date: Fri Jan 28 18:16:23 2011 +0100 summary: [datafiles] add {opt1,opt2,opt3} options to glob files: distutils2/datafiles.py distutils2/tests/test_datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -1,6 +1,7 @@ -from glob import glob as simple_glob import os +import re from os import path as osp +from glob import iglob as simple_iglob class SmartGlob(object): @@ -15,32 +16,41 @@ base = osp.join(basepath, self.base) else: base = basepath + if '*' in base or '{' in base or '}' in base: + raise NotImplementedError('glob are not supported into base part of datafiles definition. %r is an invalide basepath' % base) absglob = osp.join(base, self.suffix) - for file in glob(absglob): + for file in iglob(absglob): path_suffix = file[len(base):].lstrip('/') relpath = file[len(basepath):].lstrip('/') yield relpath, osp.join(category, path_suffix) -def glob(path_glob): - if '**' in path_glob: - files = rglob(path_glob) +RICH_GLOB = re.compile(r'\{([^}]*)\}') + +# r'\\\{' match "\{" + +def iglob(path_glob): + rich_path_glob = RICH_GLOB.split(path_glob, 1) + if len(rich_path_glob) > 1: + assert len(rich_path_glob) == 3, rich_path_glob + prefix, set, suffix = rich_path_glob + for item in set.split(','): + for path in iglob( ''.join((prefix, item, suffix))): + yield path else: - files = simple_glob(path_glob) - return files - -def rglob(path_glob): - prefix, radical = path_glob.split('**', 1) - if prefix == '': - prefix = '.' - if radical == '': - radical = '*' - else: - radical = radical.lstrip('/') - glob_files = [] - for (path, dir, files) in os.walk(prefix): - for file in glob(osp.join(prefix, path, radical)): - glob_files.append(os.path.join(prefix, file)) - return glob_files + if '**' not in path_glob: + for item in simple_iglob(path_glob): + yield item + else: + prefix, radical = path_glob.split('**', 1) + if prefix == '': + prefix = '.' 
+ if radical == '': + radical = '*' + else: + radical = radical.lstrip('/') + for (path, dir, files) in os.walk(prefix): + for file in iglob(osp.join(prefix, path, radical)): + yield os.path.join(prefix, file) def resources_dests(resources_dir, rules): destinations = {} diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -5,15 +5,13 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest -from distutils2.datafiles import resources_dests +from distutils2.datafiles import resources_dests, RICH_GLOB import re from os import path as osp -SLASH = re.compile(r'(?<=[^\\])(?:\\{2})*/') - class DataFilesTestCase(support.TempdirManager, support.LoggingCatcher, unittest.TestCase): @@ -23,20 +21,30 @@ self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) - def assertFindGlob(self, rules, spec): + + def build_spec(self, spec, clean=True): tempdir = self.mkdtemp() for filepath in spec: - filepath = osp.join(tempdir, *SLASH.split(filepath)) + filepath = osp.join(tempdir, *filepath.split('/')) dirname = osp.dirname(filepath) if dirname and not osp.exists(dirname): os.makedirs(dirname) self.write_file(filepath, 'babar') - for key, value in list(spec.items()): - if value is None: - del spec[key] + if clean: + for key, value in list(spec.items()): + if value is None: + del spec[key] + return tempdir + + def assertFindGlob(self, rules, spec): + tempdir = self.build_spec(spec) result = resources_dests(tempdir, rules) self.assertEquals(spec, result) + def test_regex_rich_glob(self): + matches = RICH_GLOB.findall(r"babar aime les {fraises} est les {huitres}") + self.assertEquals(["fraises","huitres"], matches) + def test_simple_glob(self): rules = [('', '*.tpl', '{data}')] spec = {'coucou.tpl': '{data}/coucou.tpl', @@ -50,6 +58,27 @@ 'Babarlikestrawberry': None} self.assertFindGlob(rules, spec) + def test_set_match(self): + rules = [('scripts', '*.{bin,sh}', '{appscript}')] + spec = {'scripts/script.bin': '{appscript}/script.bin', + 'scripts/babar.sh': '{appscript}/babar.sh', + 'Babarlikestrawberry': None} + self.assertFindGlob(rules, spec) + + def test_set_match_multiple(self): + rules = [('scripts', 'script{s,}.{bin,sh}', '{appscript}')] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'scripts/script.sh': '{appscript}/script.sh', + 'Babarlikestrawberry': None} + self.assertFindGlob(rules, spec) + + def test_glob_in_base(self): + rules = [('scrip*', '*.bin', '{appscript}')] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'Babarlikestrawberry': None} + tempdir = self.build_spec(spec) + self.assertRaises(NotImplementedError, resources_dests, tempdir, rules) + def test_recursive_glob(self): rules = [('', '**/*.bin', '{binary}')] spec = {'binary0.bin': '{binary}/binary0.bin', -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Distutils 2 now install datafiles in the right place (indicated by sysconfig). Message-ID: tarek.ziade pushed 1bb3a0e4029d to distutils2: http://hg.python.org/distutils2/rev/1bb3a0e4029d changeset: 1033:1bb3a0e4029d user: FELD Boris date: Fri Jan 28 17:10:57 2011 +0100 summary: Distutils 2 now install datafiles in the right place (indicated by sysconfig). 
Data files are read from config file. Data files are installed to the expanded category file. Data files list is written in DATAFILES file in .dist-info dir. files: distutils2/command/install_data.py distutils2/command/install_dist.py distutils2/command/install_distinfo.py distutils2/config.py diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -9,6 +9,7 @@ import os from distutils2.command.cmd import Command from distutils2.util import change_root, convert_path +from distutils2._backport.sysconfig import _expand_vars, _subst_vars, get_paths class install_data(Command): @@ -28,6 +29,7 @@ def initialize_options(self): self.install_dir = None self.outfiles = [] + self.data_files_out = [] self.root = None self.force = 0 self.data_files = self.distribution.data_files @@ -40,50 +42,32 @@ def run(self): self.mkpath(self.install_dir) - for f in self.data_files: - if isinstance(f, str): - # it's a simple file, so copy it - f = convert_path(f) - if self.warn_dir: - self.warn("setup script did not provide a directory for " - "'%s' -- installing right in '%s'" % - (f, self.install_dir)) - (out, _) = self.copy_file(f, self.install_dir) - self.outfiles.append(out) - else: - # it's a tuple with path to install to and a list of files - dir = convert_path(f[0]) - if not os.path.isabs(dir): - dir = os.path.join(self.install_dir, dir) - elif self.root: - dir = change_root(self.root, dir) - self.mkpath(dir) + for file in self.data_files.items(): + destination = convert_path(self.expand_categories(file[1])) + dir_dest = os.path.abspath(os.path.dirname(destination)) + + self.mkpath(dir_dest) + (out, _) = self.copy_file(file[0], dir_dest) - if f[1] == []: - # If there are no files listed, the user must be - # trying to create an empty directory, so add the - # directory to the list of output files. - self.outfiles.append(dir) - else: - # Copy files, adding them to the list of output files. - for data in f[1]: - data = convert_path(data) - (out, _) = self.copy_file(data, dir) - self.outfiles.append(out) + self.outfiles.append(out) + self.data_files_out.append((file[0], destination)) + + def expand_categories(self, path_with_categories): + local_vars = get_paths() + local_vars['distribution.name'] = self.distribution.metadata['Name'] + expanded_path = _subst_vars(path_with_categories, local_vars) + expanded_path = _subst_vars(expanded_path, local_vars) + if '{' in expanded_path and '}' in expanded_path: + self.warn("Unable to expand %s, some categories may missing." 
% + path_with_categories) + return expanded_path def get_source_files(self): sources = [] - for item in self.data_files: - if isinstance(item, str): # plain file - item = convert_path(item) - if os.path.isfile(item): - sources.append(item) - else: # a (dirname, filenames) tuple - dirname, filenames = item - for f in filenames: - f = convert_path(f) - if os.path.isfile(f): - sources.append(f) + for file in self.data_files: + destination = convert_path(self.expand_categories(file[1])) + if os.path.file(destination): + sources.append(destination) return sources def get_inputs(self): @@ -91,3 +75,6 @@ def get_outputs(self): return self.outfiles + + def get_datafiles_out(self): + return self.data_files_out \ No newline at end of file diff --git a/distutils2/command/install_dist.py b/distutils2/command/install_dist.py --- a/distutils2/command/install_dist.py +++ b/distutils2/command/install_dist.py @@ -87,6 +87,8 @@ ('record=', None, "filename in which to record a list of installed files " "(not PEP 376-compliant)"), + ('datafiles=', None, + "data files mapping"), # .dist-info related arguments, read by install_dist_info ('no-distinfo', None, @@ -184,12 +186,14 @@ #self.install_info = None self.record = None + self.datafiles = None # .dist-info related options self.no_distinfo = None self.installer = None self.requested = None self.no_record = None + self.no_datafiles = None # -- Option finalizing methods ------------------------------------- # (This is rather more involved than for most commands, diff --git a/distutils2/command/install_distinfo.py b/distutils2/command/install_distinfo.py --- a/distutils2/command/install_distinfo.py +++ b/distutils2/command/install_distinfo.py @@ -39,9 +39,11 @@ "do not generate a REQUESTED file"), ('no-record', None, "do not generate a RECORD file"), + ('no-datafiles', None, + "do not generate a DATAFILES list installed file") ] - boolean_options = ['requested', 'no-record'] + boolean_options = ['requested', 'no-record', 'no-datafiles'] negative_opt = {'no-requested': 'requested'} @@ -50,6 +52,7 @@ self.installer = None self.requested = None self.no_record = None + self.no_datafiles = None def finalize_options(self): self.set_undefined_options('install_dist', @@ -142,6 +145,22 @@ finally: f.close() + if not self.no_datafiles: + datafiles_path = os.path.join(self.distinfo_dir, 'DATAFILES') + logger.info('creating %s', datafiles_path) + f = open(datafiles_path, 'wb') + try: + writer = csv.writer(f, delimiter=',', + lineterminator=os.linesep, + quotechar='"') + install_data = self.get_finalized_command('install_data') + if install_data.get_datafiles_out() != '': + for tuple in install_data.get_datafiles_out(): + writer.writerow(tuple) + finally: + f.close() + + def get_outputs(self): return self.outputs diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -2,14 +2,17 @@ Know how to read all config files Distutils2 uses. 
""" +import os.path import os import sys +import re from ConfigParser import RawConfigParser from distutils2 import logger from distutils2.util import check_environ, resolve_name from distutils2.compiler import set_compiler from distutils2.command import set_command +from distutils2.datafiles import resources_dests class Config(object): """Reads configuration files and work with the Distribution instance @@ -82,7 +85,7 @@ if v != ''] return value - def _read_setup_cfg(self, parser): + def _read_setup_cfg(self, parser, filename): content = {} for section in parser.sections(): content[section] = dict(parser.items(section)) @@ -164,6 +167,25 @@ # manifest template self.dist.extra_files = files.get('extra_files', []) + + if 'resources' in content: + resources = [] + regex = re.compile(r'[^\\](?:\\{2})* ') + for glob, destination in content['resources'].iteritems(): + splitted_glob = regex.split(glob, 1) + if len(splitted_glob) == 1: + prefix = '' + suffix = splitted_glob[0] + else: + prefix = splitted_glob[0] + suffix = splitted_glob[1] + + resources.append((prefix, suffix, destination)) + + dir = os.path.dirname(os.path.join(os.getcwd(), filename)) + data_files = resources_dests(dir, resources) + self.dist.data_files = data_files + def parse_config_files(self, filenames=None): if filenames is None: @@ -178,7 +200,7 @@ parser.read(filename) if os.path.split(filename)[-1] == 'setup.cfg': - self._read_setup_cfg(parser) + self._read_setup_cfg(parser, filename) for section in parser.sections(): if section == 'global': -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: merge Message-ID: tarek.ziade pushed 7f6a4b4b6e7b to distutils2: http://hg.python.org/distutils2/rev/7f6a4b4b6e7b changeset: 1039:7f6a4b4b6e7b parent: 1038:f1c840b37d7a parent: 1036:9d8967f87a31 user: FELD Boris date: Sat Jan 29 14:11:13 2011 +0100 summary: merge files: diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -167,19 +167,17 @@ # manifest template self.dist.extra_files = files.get('extra_files', []) - + if 'resources' in content: resources = [] - regex = re.compile(r'[^\\](?:\\{2})* ') for glob, destination in content['resources'].iteritems(): - splitted_glob = regex.split(glob, 1) + splitted_glob = glob.split(' ', 1) if len(splitted_glob) == 1: prefix = '' suffix = splitted_glob[0] else: prefix = splitted_glob[0] suffix = splitted_glob[1] - resources.append((prefix, suffix, destination)) dir = os.path.dirname(os.path.join(os.getcwd(), filename)) diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -1,6 +1,7 @@ -from glob import glob as simple_glob import os +import re from os import path as osp +from glob import iglob as simple_iglob class SmartGlob(object): @@ -15,32 +16,41 @@ base = osp.join(basepath, self.base) else: base = basepath + if '*' in base or '{' in base or '}' in base: + raise NotImplementedError('glob are not supported into base part of datafiles definition. 
%r is an invalide basepath' % base) absglob = osp.join(base, self.suffix) - for file in glob(absglob): + for file in iglob(absglob): path_suffix = file[len(base):].lstrip('/') relpath = file[len(basepath):].lstrip('/') yield relpath, osp.join(category, path_suffix) -def glob(path_glob): - if '**' in path_glob: - files = rglob(path_glob) +RICH_GLOB = re.compile(r'\{([^}]*)\}') + +# r'\\\{' match "\{" + +def iglob(path_glob): + rich_path_glob = RICH_GLOB.split(path_glob, 1) + if len(rich_path_glob) > 1: + assert len(rich_path_glob) == 3, rich_path_glob + prefix, set, suffix = rich_path_glob + for item in set.split(','): + for path in iglob( ''.join((prefix, item, suffix))): + yield path else: - files = simple_glob(path_glob) - return files - -def rglob(path_glob): - prefix, radical = path_glob.split('**', 1) - if prefix == '': - prefix = '.' - if radical == '': - radical = '*' - else: - radical = radical.lstrip('/') - glob_files = [] - for (path, dir, files) in os.walk(prefix): - for file in glob(osp.join(prefix, path, radical)): - glob_files.append(os.path.join(prefix, file)) - return glob_files + if '**' not in path_glob: + for item in simple_iglob(path_glob): + yield item + else: + prefix, radical = path_glob.split('**', 1) + if prefix == '': + prefix = '.' + if radical == '': + radical = '*' + else: + radical = radical.lstrip('/') + for (path, dir, files) in os.walk(prefix): + for file in iglob(osp.join(prefix, path, radical)): + yield os.path.join(prefix, file) def resources_dests(resources_dir, rules): destinations = {} diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -5,15 +5,13 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest -from distutils2.datafiles import resources_dests +from distutils2.datafiles import resources_dests, RICH_GLOB import re from os import path as osp -SLASH = re.compile(r'(?<=[^\\])(?:\\{2})*/') - class DataFilesTestCase(support.TempdirManager, support.LoggingCatcher, unittest.TestCase): @@ -23,20 +21,30 @@ self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) - def assertFindGlob(self, rules, spec): + + def build_spec(self, spec, clean=True): tempdir = self.mkdtemp() for filepath in spec: - filepath = osp.join(tempdir, *SLASH.split(filepath)) + filepath = osp.join(tempdir, *filepath.split('/')) dirname = osp.dirname(filepath) if dirname and not osp.exists(dirname): os.makedirs(dirname) self.write_file(filepath, 'babar') - for key, value in list(spec.items()): - if value is None: - del spec[key] + if clean: + for key, value in list(spec.items()): + if value is None: + del spec[key] + return tempdir + + def assertFindGlob(self, rules, spec): + tempdir = self.build_spec(spec) result = resources_dests(tempdir, rules) self.assertEquals(spec, result) + def test_regex_rich_glob(self): + matches = RICH_GLOB.findall(r"babar aime les {fraises} est les {huitres}") + self.assertEquals(["fraises","huitres"], matches) + def test_simple_glob(self): rules = [('', '*.tpl', '{data}')] spec = {'coucou.tpl': '{data}/coucou.tpl', @@ -50,6 +58,27 @@ 'Babarlikestrawberry': None} self.assertFindGlob(rules, spec) + def test_set_match(self): + rules = [('scripts', '*.{bin,sh}', '{appscript}')] + spec = {'scripts/script.bin': '{appscript}/script.bin', + 'scripts/babar.sh': '{appscript}/babar.sh', + 'Babarlikestrawberry': None} + self.assertFindGlob(rules, spec) + + def 
test_set_match_multiple(self): + rules = [('scripts', 'script{s,}.{bin,sh}', '{appscript}')] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'scripts/script.sh': '{appscript}/script.sh', + 'Babarlikestrawberry': None} + self.assertFindGlob(rules, spec) + + def test_glob_in_base(self): + rules = [('scrip*', '*.bin', '{appscript}')] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'Babarlikestrawberry': None} + tempdir = self.build_spec(spec) + self.assertRaises(NotImplementedError, resources_dests, tempdir, rules) + def test_recursive_glob(self): rules = [('', '**/*.bin', '{binary}')] spec = {'binary0.bin': '{binary}/binary0.bin', @@ -62,7 +91,7 @@ rules = [ ('mailman/database/schemas/','*', '{appdata}/schemas'), ('', '**/*.tpl', '{appdata}/templates'), - ('developer-docs/', '**/*.txt', '{doc}'), + ('', 'developer-docs/**/*.txt', '{doc}'), ('', 'README', '{doc}'), ('mailman/etc/', '*', '{config}'), ('mailman/foo/', '**/bar/*.cfg', '{config}/baz'), @@ -78,8 +107,8 @@ 'mailman/etc/my.cnf': '{config}/my.cnf', 'mailman/foo/some/path/bar/my.cfg': '{config}/hmm/some/path/bar/my.cfg', 'mailman/foo/some/path/other.cfg': '{config}/hmm/some/path/other.cfg', - 'developer-docs/index.txt': '{doc}/index.txt', - 'developer-docs/api/toc.txt': '{doc}/api/toc.txt', + 'developer-docs/index.txt': '{doc}/developer-docs/index.txt', + 'developer-docs/api/toc.txt': '{doc}/developer-docs/api/toc.txt', } self.maxDiff = None self.assertFindGlob(rules, spec) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Improve data_files documentation Message-ID: tarek.ziade pushed f1c840b37d7a to distutils2: http://hg.python.org/distutils2/rev/f1c840b37d7a changeset: 1038:f1c840b37d7a user: FELD Boris date: Sat Jan 29 11:19:02 2011 +0100 summary: Improve data_files documentation files: docs/source/setupcfg.rst diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -212,7 +212,50 @@ destination ----------- -The destination is a traditionnal path (with unix separator **/**) where some parts will be expanded at installation time. +The destination is a traditionnal path (with unix separator **/**) where some parts will be expanded at installation time. These parts look like **{category}**, they will be expanded by reading system-wide default-path stored in sysconfig.cfg. Defaults categories are : + +* config +* appdata +* appdata.arch +* appdata.persistent +* appdata.disposable +* help +* icon +* scripts +* doc +* info +* man + +A special category exists, named {distribution.name} which will be expanded into your distribution name. 
You should not use it in your destination path, as they are may be used in defaults categories:: + + [globals] + # These are the useful categories that are sometimes referenced at runtime, + # using pkgutil.open(): + # Configuration files + config = {confdir}/{distribution.name} + # Non-writable data that is independent of architecture (images, many xml/text files) + appdata = {datadir}/{distribution.name} + # Non-writable data that is architecture-dependent (some binary data formats) + appdata.arch = {libdir}/{distribution.name} + # Data, written by the package, that must be preserved (databases) + appdata.persistent = {statedir}/lib/{distribution.name} + # Data, written by the package, that can be safely discarded (cache) + appdata.disposable = {statedir}/cache/{distribution.name} + # Help or documentation files referenced at runtime + help = {datadir}/{distribution.name} + icon = {datadir}/pixmaps + scripts = {base}/bin + + # Non-runtime files. These are valid categories for marking files for + # install, but they should not be referenced by the app at runtime: + # Help or documentation files not referenced by the package at runtime + doc = {datadir}/doc/{distribution.name} + # GNU info documentation files + info = {datadir}/info + # man pages + man = {datadir}/man + +So, if you have this destination path : **{help}/api**, it will be expanded into **{datadir}/{distribution.name}/api**. {datadir} will be expanded depending on your system value (ex : confdir = datadir = /usr/share/). command sections ================ -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Improve data_files documentation Message-ID: tarek.ziade pushed b15e9887efa9 to distutils2: http://hg.python.org/distutils2/rev/b15e9887efa9 changeset: 1037:b15e9887efa9 parent: 1033:1bb3a0e4029d user: FELD Boris date: Sat Jan 29 10:27:11 2011 +0100 summary: Improve data_files documentation files: docs/source/setupcfg.rst diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -146,6 +146,73 @@ extra_files = setup.py +data-files +========== + +This section describes the files used by the project which must not be installed in the same place that python modules or libraries. + +The format for specifing data files is : + + **glob_syntax** = **destination** + +Example:: + + scripts/ *.bin = {scripts} + +It means that every file which match the glob_syntax will be placed in the destination. A part of the path of the file will be stripped when it will be expanded and another part will be append to the destination. For more informations about which part of the path will be stripped or not, take a look at next sub-section globsyntax_. + +The destination path will be expanded at the installation time using categories's default-path in the sysconfig.cfg file in the system. For more information about categories's default-paths, take a look at next next sub-section destination_. + +So, if you have this source tree:: + + mailman-1.0/ + README + scripts/ + start.sh + start.py + start.bat + LAUNCH + docs/ + index.rst + mailman/ + databases/ + main.db + mailman.py + + + + +.. _globsyntax: + +glob_syntax +----------- + +The glob syntax is traditionnal glob syntax (with unix separator **/**) with one more information : what part of the path will be stripped when path will be expanded ? 
+ +The special character which indicate the end of the part that will be stripped and the beginning of the part that will be added is whitespace, which can follow or replace a path separator. + +Example:: + + scripts/ *.bin + +is equivalent to:: + + scripts *.bin + +Theses examples means that all files with extensions bin in the directory scripts will be placed directly on **destination** directory. + +This glob example:: + + scripts/*.bin + +means that all files with extensions bin in the directory scripts will be placed directly on **destination/scripts** directory. + +.. _destination: + +destination +----------- + +The destination is a traditionnal path (with unix separator **/**) where some parts will be expanded at installation time. command sections ================ -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles] remove regex use to detect space. You just can't escape them Message-ID: tarek.ziade pushed 9d8967f87a31 to distutils2: http://hg.python.org/distutils2/rev/9d8967f87a31 changeset: 1036:9d8967f87a31 user: Pierre-Yves David date: Fri Jan 28 18:53:47 2011 +0100 summary: [datafiles] remove regex use to detect space. You just can't escape them files: distutils2/config.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -167,19 +167,17 @@ # manifest template self.dist.extra_files = files.get('extra_files', []) - + if 'resources' in content: resources = [] - regex = re.compile(r'[^\\](?:\\{2})* ') for glob, destination in content['resources'].iteritems(): - splitted_glob = regex.split(glob, 1) + splitted_glob = glob.split(' ', 1) if len(splitted_glob) == 1: prefix = '' suffix = splitted_glob[0] else: prefix = splitted_glob[0] suffix = splitted_glob[1] - resources.append((prefix, suffix, destination)) dir = os.path.dirname(os.path.join(os.getcwd(), filename)) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles] support for files exclusion in resources_dests Message-ID: tarek.ziade pushed 2f920a952c93 to distutils2: http://hg.python.org/distutils2/rev/2f920a952c93 changeset: 1040:2f920a952c93 user: Pierre-Yves David date: Sat Jan 29 09:51:39 2011 +0100 summary: [datafiles] support for files exclusion in resources_dests rules with destination set to None excludes maching files. 
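The effect of an exclusion rule can be illustrated without touching the
filesystem. The sketch below (a hypothetical helper apply_rules, matching an
in-memory list of relative POSIX-style paths with fnmatch instead of calling
the real resources_dests) only mirrors the intended semantics -- later rules
win, and a destination of None drops files matched by earlier rules::

    import fnmatch

    def apply_rules(paths, rules):
        # Later rules win; a destination of None removes files matched earlier.
        destinations = {}
        for base, suffix, dest in rules:
            pattern = base.rstrip('/') + '/' + suffix if base else suffix
            for path in paths:
                if not fnmatch.fnmatch(path, pattern):
                    continue
                if dest is None:
                    destinations.pop(path, None)
                else:
                    remainder = path[len(base):].lstrip('/')
                    destinations[path] = dest + '/' + remainder
        return destinations

    paths = ['scripts/script.bin', 'scripts/cleanup.sh', 'README']
    rules = [('scripts', '*', '{appscript}'),
             ('', '**/*.sh', None)]
    print(apply_rules(paths, rules))
    # {'scripts/script.bin': '{appscript}/script.bin'}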
files: distutils2/datafiles.py distutils2/tests/test_datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -11,7 +11,7 @@ self.suffix = suffix - def expand(self, basepath, category): + def expand(self, basepath, destination): if self.base: base = osp.join(basepath, self.base) else: @@ -22,7 +22,8 @@ for file in iglob(absglob): path_suffix = file[len(base):].lstrip('/') relpath = file[len(basepath):].lstrip('/') - yield relpath, osp.join(category, path_suffix) + dest = osp.join(destination, path_suffix) + yield relpath, dest RICH_GLOB = re.compile(r'\{([^}]*)\}') @@ -56,6 +57,15 @@ destinations = {} for (base, suffix, glob_dest) in rules: sglob = SmartGlob(base, suffix) - for file, file_dest in sglob.expand(resources_dir, glob_dest): - destinations[file] = file_dest + if glob_dest is None: + delete = True + dest = '' + else: + delete = False + dest = glob_dest + for file, file_dest in sglob.expand(resources_dir, dest): + if delete and file in destinations: + del destinations[file] + else: + destinations[file] = file_dest return destinations diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -72,6 +72,14 @@ 'Babarlikestrawberry': None} self.assertFindGlob(rules, spec) + def test_set_match_exclude(self): + rules = [('scripts', '*', '{appscript}'), + ('', '**/*.sh', None)] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'scripts/script.sh': None, + 'Babarlikestrawberry': None} + self.assertFindGlob(rules, spec) + def test_glob_in_base(self): rules = [('scrip*', '*.bin', '{appscript}')] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: add support for exclusing datafiles in setup.cfg Message-ID: tarek.ziade pushed 501afb46682c to distutils2: http://hg.python.org/distutils2/rev/501afb46682c changeset: 1041:501afb46682c user: Pierre-Yves David date: Sat Jan 29 09:53:51 2011 +0100 summary: add support for exclusing datafiles in setup.cfg files: distutils2/config.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -178,6 +178,8 @@ else: prefix = splitted_glob[0] suffix = splitted_glob[1] + if destination == '': + destination = None resources.append((prefix, suffix, destination)) dir = os.path.dirname(os.path.join(os.getcwd(), filename)) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Use public method from sysconfig instead of private one to expand paths. Message-ID: tarek.ziade pushed 7b7827c8e4dd to distutils2: http://hg.python.org/distutils2/rev/7b7827c8e4dd changeset: 1045:7b7827c8e4dd parent: 1043:9ab3a8ca0c6d user: FELD Boris date: Sat Jan 29 15:39:34 2011 +0100 summary: Use public method from sysconfig instead of private one to expand paths. Correct a type in config module. 
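The category expansion performed by expand_categories() in the diff below
amounts to substituting {category} markers with per-system values (plus the
special {distribution.name} key), applied twice so that nested categories
resolve. A minimal sketch, assuming a hard-coded mapping instead of the real
sysconfig paths (which vary by platform)::

    import re

    _CATEGORY = re.compile(r'\{([^{}]*)\}')

    def expand(path, mapping):
        # Replace each {category}; unknown categories are left untouched so
        # the caller can warn about them afterwards.
        return _CATEGORY.sub(lambda m: mapping.get(m.group(1), m.group(0)),
                             path)

    categories = {
        'datadir': '/usr/share',                     # assumed platform value
        'doc': '{datadir}/doc/{distribution.name}',  # categories can nest
        'distribution.name': 'babar',
    }

    once = expand('{doc}/api', categories)
    # '{datadir}/doc/{distribution.name}/api'
    twice = expand(once, categories)
    print(twice)   # '/usr/share/doc/babar/api'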
files: distutils2/_backport/pkgutil.py distutils2/command/install_data.py docs/source/setupcfg.rst diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -1169,4 +1169,4 @@ yield dist def open(distribution_name, relative_path): - + pass diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -9,7 +9,7 @@ import os from distutils2.command.cmd import Command from distutils2.util import change_root, convert_path -from distutils2._backport.sysconfig import _expand_vars, _subst_vars, get_paths +from distutils2._backport.sysconfig import get_paths class install_data(Command): @@ -55,8 +55,8 @@ def expand_categories(self, path_with_categories): local_vars = get_paths() local_vars['distribution.name'] = self.distribution.metadata['Name'] - expanded_path = _subst_vars(path_with_categories, local_vars) - expanded_path = _subst_vars(expanded_path, local_vars) + expanded_path = get_paths(path_with_categories, local_vars) + expanded_path = get_paths(expanded_path, local_vars) if '{' in expanded_path and '}' in expanded_path: self.warn("Unable to expand %s, some categories may missing." % path_with_categories) @@ -66,7 +66,7 @@ sources = [] for file in self.data_files: destination = convert_path(self.expand_categories(file[1])) - if os.path.file(destination): + if os.path.isfile(destination): sources.append(destination) return sources diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -149,15 +149,38 @@ data-files ========== +### +source -> destination + +fichier-final = destination + source + +There is an {alias} for each categories of datafiles +----- +source may be a glob (*, ?, **, {}) + +order + +exclude +-- +base-prefix + +#### +overwrite system config for {alias} + +#### +extra-categori + This section describes the files used by the project which must not be installed in the same place that python modules or libraries. The format for specifing data files is : - **glob_syntax** = **destination** + **source** = **destination** Example:: - scripts/ *.bin = {scripts} + scripts/script1.bin = {scripts} + +It means that the file scripts/script1.bin will be placed It means that every file which match the glob_syntax will be placed in the destination. A part of the path of the file will be stripped when it will be expanded and another part will be append to the destination. For more informations about which part of the path will be stripped or not, take a look at next sub-section globsyntax_. 
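The split-on-first-space rule described above is simple enough to restate in a
few lines. The helper name parse_resource is hypothetical -- the real parsing
lives in distutils2.config -- but the behaviour sketched here matches the
glob.split(' ', 1) logic shown earlier in this digest::

    def parse_resource(glob_spec, destination):
        # The part before the first space is stripped at install time; the
        # part after it is appended to the expanded destination.
        parts = glob_spec.split(' ', 1)
        if len(parts) == 1:
            prefix, suffix = '', parts[0]
        else:
            prefix, suffix = parts
        return prefix, suffix, destination

    print(parse_resource('scripts/ *.bin', '{scripts}'))
    # ('scripts/', '*.bin', '{scripts}')
    print(parse_resource('scripts/*.bin', '{scripts}'))
    # ('', 'scripts/*.bin', '{scripts}')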
-- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Merge Message-ID: tarek.ziade pushed 9ab3a8ca0c6d to distutils2: http://hg.python.org/distutils2/rev/9ab3a8ca0c6d changeset: 1043:9ab3a8ca0c6d parent: 1042:681504f2a84a parent: 1041:501afb46682c user: FELD Boris date: Sat Jan 29 14:42:03 2011 +0100 summary: Merge files: diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -178,6 +178,8 @@ else: prefix = splitted_glob[0] suffix = splitted_glob[1] + if destination == '': + destination = None resources.append((prefix, suffix, destination)) dir = os.path.dirname(os.path.join(os.getcwd(), filename)) diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -11,7 +11,7 @@ self.suffix = suffix - def expand(self, basepath, category): + def expand(self, basepath, destination): if self.base: base = osp.join(basepath, self.base) else: @@ -22,7 +22,8 @@ for file in iglob(absglob): path_suffix = file[len(base):].lstrip('/') relpath = file[len(basepath):].lstrip('/') - yield relpath, osp.join(category, path_suffix) + dest = osp.join(destination, path_suffix) + yield relpath, dest RICH_GLOB = re.compile(r'\{([^}]*)\}') @@ -56,6 +57,15 @@ destinations = {} for (base, suffix, glob_dest) in rules: sglob = SmartGlob(base, suffix) - for file, file_dest in sglob.expand(resources_dir, glob_dest): - destinations[file] = file_dest + if glob_dest is None: + delete = True + dest = '' + else: + delete = False + dest = glob_dest + for file, file_dest in sglob.expand(resources_dir, dest): + if delete and file in destinations: + del destinations[file] + else: + destinations[file] = file_dest return destinations diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -72,6 +72,14 @@ 'Babarlikestrawberry': None} self.assertFindGlob(rules, spec) + def test_set_match_exclude(self): + rules = [('scripts', '*', '{appscript}'), + ('', '**/*.sh', None)] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'scripts/script.sh': None, + 'Babarlikestrawberry': None} + self.assertFindGlob(rules, spec) + def test_glob_in_base(self): rules = [('scrip*', '*.bin', '{appscript}')] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles] Only resources_dest and iglob are public. Message-ID: tarek.ziade pushed 43429b148e37 to distutils2: http://hg.python.org/distutils2/rev/43429b148e37 changeset: 1047:43429b148e37 user: Pierre-Yves David date: Sat Jan 29 15:44:00 2011 +0100 summary: [datafiles] Only resources_dest and iglob are public. 
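Of the two public names, iglob is the one that adds brace sets on top of the
standard glob module. The generator below isolates just that step, reusing the
same RICH_GLOB pattern as datafiles.py; it turns a rich glob into plain glob
patterns and leaves the actual filesystem matching (glob.iglob plus the '**'
directory walk) to the real function, so treat it as an illustration rather
than the distutils2 implementation::

    import re

    RICH_GLOB = re.compile(r'\{([^}]*)\}')   # same pattern as datafiles.py

    def expand_braces(pattern):
        # Expand the first {a,b,...} set, then recurse until none is left.
        parts = RICH_GLOB.split(pattern, 1)
        if len(parts) == 1:
            yield pattern
        else:
            prefix, choices, rest = parts
            for choice in choices.split(','):
                for expanded in expand_braces(prefix + choice + rest):
                    yield expanded

    print(list(expand_braces('scripts/*.{sh,bat}')))
    # ['scripts/*.sh', 'scripts/*.bat']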
files: distutils2/datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -3,6 +3,7 @@ from os import path as osp from glob import iglob as simple_iglob +__all__ = ['iglob', 'resources_dests'] class SmartGlob(object): -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Add example for data-files documentation Message-ID: tarek.ziade pushed 681504f2a84a to distutils2: http://hg.python.org/distutils2/rev/681504f2a84a changeset: 1042:681504f2a84a parent: 1039:7f6a4b4b6e7b user: FELD Boris date: Sat Jan 29 14:41:47 2011 +0100 summary: Add example for data-files documentation files: distutils2/_backport/pkgutil.py docs/design/wiki.rst docs/source/setupcfg.rst diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -613,7 +613,7 @@ # PEP 376 Implementation # ########################## -DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED',) +DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED', 'RESOURCES') # Cache _cache_name = {} # maps names to Distribution instances @@ -1167,3 +1167,6 @@ for dist in get_distributions(): if dist.uses(path): yield dist + +def open(distribution_name, relative_path): + diff --git a/docs/design/wiki.rst b/docs/design/wiki.rst --- a/docs/design/wiki.rst +++ b/docs/design/wiki.rst @@ -250,8 +250,8 @@ == ==================================== =================================================================================================== 1 mailman/database/schemas/blah.schema /var/mailman/schemas/blah.schema 2 some.tpl /var/mailman/templates/some.tpl -3 path/to/some.tpl /var/mailman/templates/path/to/some.tpl -4 mailman/database/mailman.db /var/mailman/database/mailman.db +3 path/to/some.tpl /var/mailman/templates/path/to/some.tpl ! +4 mailman/database/mailman.db /var/mailman/database/mailman.db ! 5 developer-docs/index.txt /usr/share/doc/mailman/developer-docs/index.txt 6 developer-docs/api/toc.txt /usr/share/doc/mailman/developer-docs/api/toc.txt 7 README /usr/share/doc/mailman/README @@ -259,7 +259,7 @@ 9 mailman/foo/some/path/bar/my.cfg /etc/mailman/baz/some/path/bar/my.cfg AND /etc/mailman/hmm/some/path/bar/my.cfg + emit a warning -10 mailman/foo/some/path/other.cfg /etc/mailman/some/path/other.cfg +10 mailman/foo/some/path/other.cfg /etc/mailman/some/path/other.cfg ! 11 some-new-semantic.sns /var/funky/mailman/some-new-semantic.sns == ==================================== =================================================================================================== diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -163,24 +163,6 @@ The destination path will be expanded at the installation time using categories's default-path in the sysconfig.cfg file in the system. For more information about categories's default-paths, take a look at next next sub-section destination_. -So, if you have this source tree:: - - mailman-1.0/ - README - scripts/ - start.sh - start.py - start.bat - LAUNCH - docs/ - index.rst - mailman/ - databases/ - main.db - mailman.py - - - .. 
_globsyntax: @@ -257,6 +239,109 @@ So, if you have this destination path : **{help}/api**, it will be expanded into **{datadir}/{distribution.name}/api**. {datadir} will be expanded depending on your system value (ex : confdir = datadir = /usr/share/). + +Simple-example +-------------- + +Source tree:: + + babar-1.0/ + README + babar.sh + launch.sh + babar.py + +Setup.cfg:: + + [RESOURCES] + README = {doc} + *.sh = {scripts} + +So babar.sh and launch.sh will be placed in {scripts} directory. + +Now let's create to move all the scripts into a scripts/directory. + +Second-example +-------------- + +Source tree:: + + babar-1.1/ + README + scripts/ + babar.sh + launch.sh + LAUNCH + babar.py + +Setup.cfg:: + + [RESOURCES] + README = {doc} + scripts/ LAUNCH = {scripts} + scripts/ *.sh = {scripts} + +It's important to use the separator after scripts/ to install all the bash scripts into {scripts} instead of {scripts}/scripts. + +Now let's add some docs. + +Third-example +------------- + +Source tree:: + + babar-1.2/ + README + scripts/ + babar.sh + launch.sh + LAUNCH + docs/ + api + man + babar.py + +Setup.cfg:: + + [RESOURCES] + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.sh = {scripts} + doc/ * = {doc} + doc/ man = {man} + +You want to place all the file in the docs script into {doc} category, instead of man, which must be placed into {man} category, we will use the order of declaration of globs to choose the destination, the last glob that match the file is used. + +Now let's add some scripts for windows users. + +Final example +------------- + +Source tree:: + + babar-1.3/ + README + doc/ + api + man + scripts/ + babar.sh + launch.sh + babar.bat + launch.bat + LAUNCH + +Setup.cfg:: + + [RESOURCES] + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.{sh,bat} = {scripts} + doc/ * = {doc} + doc/ man = {man} + +We use brace expansion syntax to place all the bash and batch scripts into {scripts} category. 
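For reference, here is the source-to-destination mapping one would expect the
final example to produce, written in the spec style used by
distutils2/tests/test_datafiles.py. The rule tuples are the parsed form of the
[RESOURCES] section above; the entries for doc/api and doc/man assume both are
plain files, and a spec like this can be checked against resources_dests() the
same way the tests' assertFindGlob helper does::

    # (base, glob, destination) rules parsed from the section above
    rules = [('', 'README', '{doc}'),
             ('scripts', 'LAUNCH', '{doc}'),
             ('scripts', '*.{sh,bat}', '{scripts}'),
             ('doc', '*', '{doc}'),
             ('doc', 'man', '{man}')]

    # Expected source -> destination mapping (the last matching rule wins)
    expected = {
        'README': '{doc}/README',
        'scripts/LAUNCH': '{doc}/LAUNCH',
        'scripts/babar.sh': '{scripts}/babar.sh',
        'scripts/launch.sh': '{scripts}/launch.sh',
        'scripts/babar.bat': '{scripts}/babar.bat',
        'scripts/launch.bat': '{scripts}/launch.bat',
        'doc/api': '{doc}/api',
        'doc/man': '{man}/man',
    }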
+ command sections ================ -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: merge with main branch Message-ID: tarek.ziade pushed cc4a40ea2743 to distutils2: http://hg.python.org/distutils2/rev/cc4a40ea2743 changeset: 1044:cc4a40ea2743 parent: 1043:9ab3a8ca0c6d parent: 916:88b453387537 user: Pierre-Yves David date: Sat Jan 29 15:03:21 2011 +0100 summary: merge with main branch files: distutils2/config.py docs/source/setupcfg.rst diff --git a/distutils2/command/build_py.py b/distutils2/command/build_py.py --- a/distutils2/command/build_py.py +++ b/distutils2/command/build_py.py @@ -66,10 +66,9 @@ self.packages = self.distribution.packages self.py_modules = self.distribution.py_modules self.package_data = self.distribution.package_data - self.package_dir = {} - if self.distribution.package_dir: - for name, path in self.distribution.package_dir.iteritems(): - self.package_dir[name] = convert_path(path) + self.package_dir = None + if self.distribution.package_dir is not None: + self.package_dir = convert_path(self.distribution.package_dir) self.data_files = self.get_data_files() # Ick, copied straight from install_lib.py (fancy_getopt needs a @@ -179,41 +178,14 @@ """Return the directory, relative to the top of the source distribution, where package 'package' should be found (at least according to the 'package_dir' option, if any).""" + path = package.split('.') + if self.package_dir is not None: + path.insert(0, self.package_dir) - path = package.split('.') + if len(path) > 0: + return os.path.join(*path) - if not self.package_dir: - if path: - return os.path.join(*path) - else: - return '' - else: - tail = [] - while path: - try: - pdir = self.package_dir['.'.join(path)] - except KeyError: - tail.insert(0, path[-1]) - del path[-1] - else: - tail.insert(0, pdir) - return os.path.join(*tail) - else: - # Oops, got all the way through 'path' without finding a - # match in package_dir. If package_dir defines a directory - # for the root (nameless) package, then fallback on it; - # otherwise, we might as well have not consulted - # package_dir at all, as we just use the directory implied - # by 'tail' (which should be the same as the original value - # of 'path' at this point). - pdir = self.package_dir.get('') - if pdir is not None: - tail.insert(0, pdir) - - if tail: - return os.path.join(*tail) - else: - return '' + return '' def check_package(self, package, package_dir): """Helper function for `find_package_modules()` and `find_modules()'. diff --git a/distutils2/command/check.py b/distutils2/command/check.py --- a/distutils2/command/check.py +++ b/distutils2/command/check.py @@ -57,7 +57,7 @@ Warns if any are missing. 
""" - missing, __ = self.distribution.metadata.check() + missing, __ = self.distribution.metadata.check(strict=True) if missing != []: self.warn("missing required metadata: %s" % ', '.join(missing)) diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py --- a/distutils2/command/sdist.py +++ b/distutils2/command/sdist.py @@ -18,7 +18,8 @@ from distutils2.command import get_command_names from distutils2.command.cmd import Command from distutils2.errors import (DistutilsPlatformError, DistutilsOptionError, - DistutilsTemplateError, DistutilsModuleError) + DistutilsTemplateError, DistutilsModuleError, + DistutilsFileError) from distutils2.manifest import Manifest from distutils2 import logger from distutils2.util import convert_path, resolve_name @@ -214,8 +215,6 @@ def add_defaults(self): """Add all the default files to self.filelist: - - README or README.txt - - test/test*.py - all pure Python modules mentioned in setup script - all files pointed by package_data (build_py) - all files defined in data_files. @@ -225,32 +224,6 @@ Warns if (README or README.txt) or setup.py are missing; everything else is optional. """ - standards = [('README', 'README.txt')] - for fn in standards: - if isinstance(fn, tuple): - alts = fn - got_it = 0 - for fn in alts: - if os.path.exists(fn): - got_it = 1 - self.filelist.append(fn) - break - - if not got_it: - self.warn("standard file not found: should have one of " + - string.join(alts, ', ')) - else: - if os.path.exists(fn): - self.filelist.append(fn) - else: - self.warn("standard file '%s' not found" % fn) - - optional = ['test/test*.py', 'setup.cfg'] - for pattern in optional: - files = filter(os.path.isfile, glob(pattern)) - if files: - self.filelist.extend(files) - for cmd_name in get_command_names(): try: cmd_obj = self.get_finalized_command(cmd_name) @@ -319,6 +292,12 @@ logger.warn("no files to distribute -- empty manifest?") else: logger.info(msg) + + for file in self.distribution.metadata.requires_files: + if file not in files: + msg = "'%s' must be included explicitly in 'extra_files'" % file + raise DistutilsFileError(msg) + for file in files: if not os.path.isfile(file): logger.warn("'%s' not a regular file -- skipping" % file) @@ -376,4 +355,3 @@ # Now create them for dir in need_dirs: self.mkpath(dir, mode, verbose=verbose, dry_run=dry_run) - diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -79,10 +79,9 @@ return value def _multiline(self, value): - if '\n' in value: - value = [v for v in - [v.strip() for v in value.split('\n')] - if v != ''] + value = [v for v in + [v.strip() for v in value.split('\n')] + if v != ''] return value def _read_setup_cfg(self, parser, filename): @@ -103,7 +102,9 @@ if 'metadata' in content: for key, value in content['metadata'].iteritems(): key = key.replace('_', '-') - value = self._multiline(value) + if metadata.is_multi_field(key): + value = self._multiline(value) + if key == 'project-url': value = [(label.strip(), url.strip()) for label, url in @@ -115,30 +116,45 @@ "mutually exclusive") raise DistutilsOptionError(msg) - f = open(value) # will raise if file not found - try: - value = f.read() - finally: - f.close() + if isinstance(value, list): + filenames = value + else: + filenames = value.split() + + # concatenate each files + value = '' + for filename in filenames: + f = open(filename) # will raise if file not found + try: + value += f.read().strip() + '\n' + finally: + f.close() + # add filename as a required file + if filename 
not in metadata.requires_files: + metadata.requires_files.append(filename) + value = value.strip() key = 'description' if metadata.is_metadata_field(key): metadata[key] = self._convert_metadata(key, value) + if 'files' in content: - files = dict([(key, self._multiline(value)) + def _convert(key, value): + if key not in ('packages_root',): + value = self._multiline(value) + return value + + files = dict([(key, _convert(key, value)) for key, value in content['files'].iteritems()]) self.dist.packages = [] - self.dist.package_dir = {} + self.dist.package_dir = pkg_dir = files.get('packages_root') packages = files.get('packages', []) if isinstance(packages, str): packages = [packages] for package in packages: - if ':' in package: - dir_, package = package.split(':') - self.dist.package_dir[package] = dir_ self.dist.packages.append(package) self.dist.py_modules = files.get('modules', []) diff --git a/distutils2/errors.py b/distutils2/errors.py --- a/distutils2/errors.py +++ b/distutils2/errors.py @@ -110,6 +110,10 @@ """Attempt to process an unknown file type.""" +class MetadataMissingError(DistutilsError): + """A required metadata is missing""" + + class MetadataConflictError(DistutilsError): """Attempt to read or write metadata fields that are conflictual.""" diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -208,6 +208,13 @@ infos[key].extend(new_infos[key]) +def remove(project_name): + """Removes a single project from the installation""" + pass + + + + def main(**attrs): if 'script_args' not in attrs: import sys diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -14,7 +14,8 @@ from distutils2 import logger from distutils2.version import (is_valid_predicate, is_valid_version, is_valid_versions) -from distutils2.errors import (MetadataConflictError, +from distutils2.errors import (MetadataMissingError, + MetadataConflictError, MetadataUnrecognizedVersionError) try: @@ -77,12 +78,13 @@ 'Obsoletes-Dist', 'Requires-External', 'Maintainer', 'Maintainer-email', 'Project-URL') +_345_REQUIRED = ('Name', 'Version') + _ALL_FIELDS = set() _ALL_FIELDS.update(_241_FIELDS) _ALL_FIELDS.update(_314_FIELDS) _ALL_FIELDS.update(_345_FIELDS) - def _version2fieldlist(version): if version == '1.0': return _241_FIELDS @@ -172,14 +174,19 @@ _LISTFIELDS = ('Platform', 'Classifier', 'Obsoletes', 'Requires', 'Provides', 'Obsoletes-Dist', 'Provides-Dist', 'Requires-Dist', 'Requires-External', - 'Project-URL') + 'Project-URL', 'Supported-Platform') _LISTTUPLEFIELDS = ('Project-URL',) _ELEMENTSFIELD = ('Keywords',) _UNICODEFIELDS = ('Author', 'Maintainer', 'Summary', 'Description') -_MISSING = object() +class NoDefault(object): + """Marker object used for clean representation""" + def __repr__(self): + return '' + +_MISSING = NoDefault() class DistributionMetadata(object): """The metadata of a release. 
@@ -200,6 +207,7 @@ self._fields = {} self.display_warnings = display_warnings self.version = None + self.requires_files = [] self.docutils_support = _HAS_DOCUTILS self.platform_dependent = platform_dependent self.execution_context = execution_context @@ -292,13 +300,20 @@ # Public API # def get_fullname(self): + """Return the distribution name with version""" return '%s-%s' % (self['Name'], self['Version']) def is_metadata_field(self, name): + """return True if name is a valid metadata key""" name = self._convert_name(name) return name in _ALL_FIELDS + def is_multi_field(self, name): + name = self._convert_name(name) + return name in _LISTFIELDS + def read(self, filepath): + """Read the metadata values from a file path.""" self.read_file(open(filepath)) def read_file(self, fileob): @@ -451,11 +466,21 @@ return None return value - def check(self): - """Check if the metadata is compliant.""" + def check(self, strict=False): + """Check if the metadata is compliant. If strict is False then raise if + no Name or Version are provided""" # XXX should check the versions (if the file was loaded) missing, warnings = [], [] - for attr in ('Name', 'Version', 'Home-page'): + + for attr in ('Name', 'Version'): + if attr not in self: + missing.append(attr) + + if strict and missing != []: + msg = "missing required metadata: %s" % ', '.join(missing) + raise MetadataMissingError(msg) + + for attr in ('Home-page',): if attr not in self: missing.append(attr) @@ -483,12 +508,15 @@ return missing, warnings def keys(self): + """Dict like api""" return _version2fieldlist(self.version) def values(self): + """Dict like api""" return [self[key] for key in self.keys()] def items(self): + """Dict like api""" return [(key, self[key]) for key in self.keys()] diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -109,6 +109,7 @@ except (DistutilsError, CCompilerError), msg: + raise raise SystemExit, "error: " + str(msg) return dist @@ -124,6 +125,10 @@ action="store_true", dest="version", default=False, help="Prints out the version of Distutils2 and exits.") + parser.add_option("-m", "--metadata", + action="append", dest="metadata", default=[], + help="List METADATA metadata or 'all' for all metadatas.") + parser.add_option("-s", "--search", action="store", dest="search", default=None, help="Search for installed distributions.") @@ -141,6 +146,31 @@ print('Distutils2 %s' % __version__) # sys.exit(0) + if len(options.metadata): + from distutils2.dist import Distribution + dist = Distribution() + dist.parse_config_files() + metadata = dist.metadata + + if 'all' in options.metadata: + keys = metadata.keys() + else: + keys = options.metadata + if len(keys) == 1: + print metadata[keys[0]] + sys.exit(0) + + for key in keys: + if key in metadata: + print(metadata._convert_name(key)+':') + value = metadata[key] + if isinstance(value, list): + for v in value: + print(' '+v) + else: + print(' '+value.replace('\n', '\n ')) + sys.exit(0) + if options.search is not None: search = options.search.lower() for dist in get_distributions(use_egg_info=True): diff --git a/distutils2/tests/test_command_build_ext.py b/distutils2/tests/test_command_build_ext.py --- a/distutils2/tests/test_command_build_ext.py +++ b/distutils2/tests/test_command_build_ext.py @@ -289,7 +289,7 @@ # inplace = 0, cmd.package = 'bar' build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {'': 'bar'} + build_py.package_dir = 'bar' path = cmd.get_ext_fullpath('foo') # checking that the last directory 
is the build_dir path = os.path.split(path)[0] @@ -318,7 +318,7 @@ dist = Distribution() cmd = build_ext(dist) cmd.inplace = 1 - cmd.distribution.package_dir = {'': 'src'} + cmd.distribution.package_dir = 'src' cmd.distribution.packages = ['lxml', 'lxml.html'] curdir = os.getcwd() wanted = os.path.join(curdir, 'src', 'lxml', 'etree' + ext) @@ -334,7 +334,7 @@ # building twisted.runner.portmap not inplace build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {} + build_py.package_dir = None cmd.distribution.packages = ['twisted', 'twisted.runner.portmap'] path = cmd.get_ext_fullpath('twisted.runner.portmap') wanted = os.path.join(curdir, 'tmpdir', 'twisted', 'runner', diff --git a/distutils2/tests/test_command_build_py.py b/distutils2/tests/test_command_build_py.py --- a/distutils2/tests/test_command_build_py.py +++ b/distutils2/tests/test_command_build_py.py @@ -17,12 +17,14 @@ def test_package_data(self): sources = self.mkdtemp() - f = open(os.path.join(sources, "__init__.py"), "w") + pkg_dir = os.path.join(sources, 'pkg') + os.mkdir(pkg_dir) + f = open(os.path.join(pkg_dir, "__init__.py"), "w") try: f.write("# Pretend this is a package.") finally: f.close() - f = open(os.path.join(sources, "README.txt"), "w") + f = open(os.path.join(pkg_dir, "README.txt"), "w") try: f.write("Info about this package") finally: @@ -31,8 +33,9 @@ destination = self.mkdtemp() dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": sources}}) + "package_dir": sources}) # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") dist.command_obj["build"] = support.DummyCommand( force=0, @@ -42,7 +45,7 @@ use_2to3=False) dist.packages = ["pkg"] dist.package_data = {"pkg": ["README.txt"]} - dist.package_dir = {"pkg": sources} + dist.package_dir = sources cmd = build_py(dist) cmd.compile = 1 @@ -68,19 +71,20 @@ # create the distribution files. sources = self.mkdtemp() - open(os.path.join(sources, "__init__.py"), "w").close() - - testdir = os.path.join(sources, "doc") + pkg = os.path.join(sources, 'pkg') + os.mkdir(pkg) + open(os.path.join(pkg, "__init__.py"), "w").close() + testdir = os.path.join(pkg, "doc") os.mkdir(testdir) open(os.path.join(testdir, "testfile"), "w").close() os.chdir(sources) old_stdout = sys.stdout - sys.stdout = StringIO.StringIO() + #sys.stdout = StringIO.StringIO() try: dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": ""}, + "package_dir": sources, "package_data": {"pkg": ["doc/*"]}}) # script_name need not exist, it just need to be initialized dist.script_name = os.path.join(sources, "setup.py") @@ -89,7 +93,7 @@ try: dist.run_commands() - except DistutilsFileError: + except DistutilsFileError, e: self.fail("failed package_data test when package_dir is ''") finally: # Restore state. 
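The package_dir change exercised by these tests is easier to see outside the
command class. The function below is a slightly condensed, standalone
restatement of the new get_package_dir() logic from the build_py diff in this
changeset, assuming packages_root is either None or a single directory
string::

    import os

    def get_package_dir(package, packages_root=None):
        # New behaviour: one packages_root prefixes every package, replacing
        # the old per-package package_dir dictionary.
        path = package.split('.')
        if packages_root is not None:
            path.insert(0, packages_root)
        return os.path.join(*path)

    print(get_package_dir('two', 'src'))   # 'src/two' on POSIX
    print(get_package_dir('pkg.sub'))      # 'pkg/sub'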
diff --git a/distutils2/tests/test_command_check.py b/distutils2/tests/test_command_check.py --- a/distutils2/tests/test_command_check.py +++ b/distutils2/tests/test_command_check.py @@ -4,6 +4,7 @@ from distutils2.metadata import _HAS_DOCUTILS from distutils2.tests import unittest, support from distutils2.errors import DistutilsSetupError +from distutils2.errors import MetadataMissingError class CheckTestCase(support.LoggingCatcher, support.TempdirManager, @@ -11,7 +12,7 @@ def _run(self, metadata=None, **options): if metadata is None: - metadata = {} + metadata = {'name':'xxx', 'version':'xxx'} pkg_info, dist = self.create_dist(**metadata) cmd = check(dist) cmd.initialize_options() @@ -40,7 +41,8 @@ # now with the strict mode, we should # get an error if there are missing metadata - self.assertRaises(DistutilsSetupError, self._run, {}, **{'strict': 1}) + self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, {'name':'xxx', 'version':'xxx'}, **{'strict': 1}) # and of course, no error when all metadata fields are present cmd = self._run(metadata, strict=1) @@ -63,6 +65,9 @@ def test_check_all(self): self.assertRaises(DistutilsSetupError, self._run, + {'name':'xxx', 'version':'xxx'}, **{'strict': 1, + 'all': 1}) + self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1, 'all': 1}) diff --git a/distutils2/tests/test_command_register.py b/distutils2/tests/test_command_register.py --- a/distutils2/tests/test_command_register.py +++ b/distutils2/tests/test_command_register.py @@ -195,7 +195,7 @@ # long_description is not reSt compliant # empty metadata - cmd = self._get_cmd({}) + cmd = self._get_cmd({'name': 'xxx', 'version': 'xxx'}) cmd.ensure_finalized() cmd.strict = 1 inputs = RawInputs('1', 'tarek', 'y') diff --git a/distutils2/tests/test_command_sdist.py b/distutils2/tests/test_command_sdist.py --- a/distutils2/tests/test_command_sdist.py +++ b/distutils2/tests/test_command_sdist.py @@ -45,7 +45,6 @@ MANIFEST = """\ # file GENERATED by distutils, do NOT edit -README inroot.txt data%(sep)sdata.dt scripts%(sep)sscript.py @@ -141,7 +140,7 @@ zip_file.close() # making sure everything has been pruned correctly - self.assertEqual(len(content), 3) + self.assertEqual(len(content), 2) @unittest.skipUnless(zlib, "requires zlib") def test_make_distribution(self): @@ -236,7 +235,7 @@ zip_file.close() # making sure everything was added - self.assertEqual(len(content), 10) + self.assertEqual(len(content), 9) # checking the MANIFEST manifest = open(join(self.tmp_dir, 'MANIFEST')).read() @@ -245,7 +244,7 @@ @unittest.skipUnless(zlib, "requires zlib") def test_metadata_check_option(self): # testing the `check-metadata` option - dist, cmd = self.get_cmd(metadata={}) + dist, cmd = self.get_cmd(metadata={'name':'xxx', 'version':'xxx'}) # this should raise some warnings ! # with the `check` subcommand @@ -362,8 +361,7 @@ if line.strip() != ''] finally: f.close() - - self.assertEqual(len(manifest), 4) + self.assertEqual(len(manifest), 3) # adding a file self.write_file((self.tmp_dir, 'somecode', 'doc2.txt'), '#') @@ -383,7 +381,7 @@ f.close() # do we have the new file in MANIFEST ? 
- self.assertEqual(len(manifest2), 5) + self.assertEqual(len(manifest2), 4) self.assertIn('doc2.txt', manifest2[-1]) def test_manifest_marker(self): diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -5,6 +5,8 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest +from distutils2.command.sdist import sdist +from distutils2.errors import DistutilsFileError SETUP_CFG = """ @@ -16,7 +18,7 @@ maintainer = ??ric Araujo maintainer_email = merwok at netwok.org summary = A sample project demonstrating distutils2 packaging -description-file = README +description-file = %(description-file)s keywords = distutils2, packaging, sample project classifier = @@ -47,9 +49,11 @@ Fork in progress, http://bitbucket.org/Merwok/sample-distutils2-project [files] +packages_root = src + packages = one - src:two - src2:three + two + three modules = haven @@ -66,6 +70,8 @@ config = cfg/data.cfg /etc/init.d = init-script +extra_files = %(extra-files)s + # Replaces MANIFEST.in sdist_extra = include THANKS HACKING @@ -130,22 +136,33 @@ self.addCleanup(setattr, sys, 'stderr', sys.stderr) self.addCleanup(os.chdir, os.getcwd()) - def test_config(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) - self.write_file('README', 'yeah') + def write_setup(self, kwargs=None): + opts = {'description-file': 'README', 'extra-files':''} + if kwargs: + opts.update(kwargs) + self.write_file('setup.cfg', SETUP_CFG % opts) - # try to load the metadata now + + def run_setup(self, *args): + # run setup with args sys.stdout = StringIO() - sys.argv[:] = ['setup.py', '--version'] + sys.argv[:] = [''] + list(args) old_sys = sys.argv[:] - try: from distutils2.run import commands_main dist = commands_main() finally: sys.argv[:] = old_sys + return dist + + def test_config(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup() + self.write_file('README', 'yeah') + + # try to load the metadata now + dist = self.run_setup('--version') # sanity check self.assertEqual(sys.stdout.getvalue(), '0.6.4.dev1' + os.linesep) @@ -184,7 +201,6 @@ 'http://bitbucket.org/Merwok/sample-distutils2-project')] self.assertEqual(dist.metadata['Project-Url'], urls) - self.assertEqual(dist.packages, ['one', 'two', 'three']) self.assertEqual(dist.py_modules, ['haven']) self.assertEqual(dist.package_data, {'cheese': 'data/templates/*'}) @@ -192,7 +208,8 @@ [('bitmaps ', ['bm/b1.gif', 'bm/b2.gif']), ('config ', ['cfg/data.cfg']), ('/etc/init.d ', ['init-script'])]) - self.assertEqual(dist.package_dir['two'], 'src') + + self.assertEqual(dist.package_dir, 'src') # Make sure we get the foo command loaded. 
We use a string comparison # instead of assertIsInstance because the class is not the same when @@ -213,10 +230,94 @@ d = new_compiler(compiler='d') self.assertEqual(d.description, 'D Compiler') + + def test_multiple_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog2') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_multiline_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README\n CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata['description'], 'yeah\nchangelog') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_metadata_requires_description_files_missing(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--version') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + def test_metadata_requires_description_files(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2', + 'extra-files':'\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--description') + self.assertIn('yeah\nyeah\n', sys.stdout.getvalue()) + + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + self.write_setup({'description-file': 'README\n README2', + 'extra-files': '\n README2\n README'}) + dist = self.run_setup('--description') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + cmd.make_distribution() + self.assertIn('README\nREADME2\n', open('MANIFEST').read()) + def test_sub_commands(self): tempdir = self.mkdtemp() os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) + self.write_setup() self.write_file('README', 'yeah') self.write_file('haven.py', '#') self.write_file('script1.py', '#') @@ -224,20 +325,15 @@ self.write_file(os.path.join('scripts', 'find-coconuts'), '#') os.mkdir('bin') self.write_file(os.path.join('bin', 'taunt'), '#') + os.mkdir('src') - for pkg in ('one', 'src', 'src2'): + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) os.mkdir(pkg) self.write_file(os.path.join(pkg, '__init__.py'), '#') # try to run the 
install command to see if foo is called - sys.stdout = sys.stderr = StringIO() - sys.argv[:] = ['', 'install_dist'] - old_sys = sys.argv[:] - try: - from distutils2.run import main - dist = main() - finally: - sys.argv[:] = old_sys + dist = self.run_setup('install_dist') self.assertEqual(dist.foo_was_here, 1) diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -15,6 +15,7 @@ from copy import copy from fnmatch import fnmatchcase from ConfigParser import RawConfigParser +from inspect import getsource from distutils2.errors import (DistutilsPlatformError, DistutilsFileError, DistutilsByteCompileError, DistutilsExecError) @@ -1127,3 +1128,117 @@ """ Issues a call to util.run_2to3. """ return run_2to3(files, doctests_only, self.fixer_names, self.options, self.explicit) + + +def generate_distutils_kwargs_from_setup_cfg(file='setup.cfg'): + """ Distutils2 to distutils1 compatibility util. + + This method uses an existing setup.cfg to generate a dictionnary of + keywords that can be used by distutils.core.setup(kwargs**). + + :param file: + The setup.cfg path. + :raises DistutilsFileError: + When the setup.cfg file is not found. + + """ + # We need to declare the following constants here so that it's easier to + # generate the setup.py afterwards, using inspect.getsource. + D1_D2_SETUP_ARGS = { + # D1 name : (D2_section, D2_name) + "name" : ("metadata",), + "version" : ("metadata",), + "author" : ("metadata",), + "author_email" : ("metadata",), + "maintainer" : ("metadata",), + "maintainer_email" : ("metadata",), + "url" : ("metadata", "home_page"), + "description" : ("metadata", "summary"), + "long_description" : ("metadata", "description"), + "download-url" : ("metadata",), + "classifiers" : ("metadata", "classifier"), + "platforms" : ("metadata", "platform"), # Needs testing + "license" : ("metadata",), + "requires" : ("metadata", "requires_dist"), + "provides" : ("metadata", "provides_dist"), # Needs testing + "obsoletes" : ("metadata", "obsoletes_dist"), # Needs testing + + "packages" : ("files",), + "scripts" : ("files",), + "py_modules" : ("files", "modules"), # Needs testing + } + + MULTI_FIELDS = ("classifiers", + "requires", + "platforms", + "packages", + "scripts") + + def has_get_option(config, section, option): + if config.has_option(section, option): + return config.get(section, option) + elif config.has_option(section, option.replace('_', '-')): + return config.get(section, option.replace('_', '-')) + else: + return False + + # The method source code really starts here. + config = RawConfigParser() + if not os.path.exists(file): + raise DistutilsFileError("file '%s' does not exist" % + os.path.abspath(file)) + config.read(file) + + kwargs = {} + for arg in D1_D2_SETUP_ARGS: + if len(D1_D2_SETUP_ARGS[arg]) == 2: + # The distutils field name is different than distutils2's. + section, option = D1_D2_SETUP_ARGS[arg] + + elif len(D1_D2_SETUP_ARGS[arg]) == 1: + # The distutils field name is the same thant distutils2's. 
+ section = D1_D2_SETUP_ARGS[arg][0] + option = arg + + in_cfg_value = has_get_option(config, section, option) + if not in_cfg_value: + # There is no such option in the setup.cfg + if arg == "long_description": + filename = has_get_option(config, section, "description_file") + print "We have a filename", filename + if filename: + in_cfg_value = open(filename).read() + else: + continue + + if arg in MULTI_FIELDS: + # Special behaviour when we have a multi line option + if "\n" in in_cfg_value: + in_cfg_value = in_cfg_value.strip().split('\n') + else: + in_cfg_value = list((in_cfg_value,)) + + kwargs[arg] = in_cfg_value + + return kwargs + + +def generate_distutils_setup_py(): + """ Generate a distutils compatible setup.py using an existing setup.cfg. + + :raises DistutilsFileError: + When a setup.py already exists. + """ + if os.path.exists("setup.py"): + raise DistutilsFileError("A pre existing setup.py file exists") + + handle = open("setup.py", "w") + handle.write("# Distutils script using distutils2 setup.cfg to call the\n") + handle.write("# distutils.core.setup() with the right args.\n\n\n") + handle.write("import os\n") + handle.write("from distutils.core import setup\n") + handle.write("from ConfigParser import RawConfigParser\n\n") + handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) + handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") + handle.write("setup(**kwargs)") + handle.close() diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -1055,6 +1055,13 @@ Create a file called *filename* and write *contents* (a sequence of strings without line terminators) to it. +:mod:`distutils2.metadata` --- Metadata handling +================================================================ + +.. module:: distutils2.metadata + +.. autoclass:: distutils2.metadata.DistributionMetadata + :members: :mod:`distutils2.util` --- Miscellaneous other utility functions ================================================================ diff --git a/docs/source/distutils/examples.rst b/docs/source/distutils/examples.rst --- a/docs/source/distutils/examples.rst +++ b/docs/source/distutils/examples.rst @@ -301,7 +301,7 @@ :class:`distutils2.dist.DistributionMetadata` class and its :func:`read_pkg_file` method:: - >>> from distutils2.dist import DistributionMetadata + >>> from distutils2.metadata import DistributionMetadata >>> metadata = DistributionMetadata() >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) >>> metadata.name diff --git a/docs/source/library/distutils2.metadata.rst b/docs/source/library/distutils2.metadata.rst --- a/docs/source/library/distutils2.metadata.rst +++ b/docs/source/library/distutils2.metadata.rst @@ -2,7 +2,9 @@ Metadata ======== -Distutils2 provides a :class:`DistributionMetadata` class that can read and +.. module:: distutils2.metadata + +Distutils2 provides a :class:`~distutils2.metadata.DistributionMetadata` class that can read and write metadata files. 
This class is compatible with all metadata versions: * 1.0: :PEP:`241` @@ -17,7 +19,7 @@ Reading metadata ================ -The :class:`DistributionMetadata` class can be instantiated with the path of +The :class:`~distutils2.metadata.DistributionMetadata` class can be instantiated with the path of the metadata file, and provides a dict-like interface to the values:: >>> from distutils2.metadata import DistributionMetadata @@ -33,7 +35,7 @@ The fields that supports environment markers can be automatically ignored if the object is instantiated using the ``platform_dependent`` option. -:class:`DistributionMetadata` will interpret in the case the markers and will +:class:`~distutils2.metadata.DistributionMetadata` will interpret in the case the markers and will automatically remove the fields that are not compliant with the running environment. Here's an example under Mac OS X. The win32 dependency we saw earlier is ignored:: diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -128,6 +128,8 @@ This section describes the files included in the project. +- **packages_root**: the root directory containing all packages. If not provided + Distutils2 will use the current directory. *\*optional* - **packages**: a list of packages the project includes *\*optional* *\*multi* - **modules**: a list of packages the project includes *\*optional* *\*multi* - **scripts**: a list of scripts the project includes *\*optional* *\*multi* @@ -136,6 +138,7 @@ Example:: [files] + packages_root = src packages = pypi2rpm pypi2rpm.command diff --git a/patch b/patch new file mode 100644 --- /dev/null +++ b/patch @@ -0,0 +1,23 @@ +diff -r 5603e1bc5442 distutils2/util.py +--- a/distutils2/util.py Fri Jan 28 18:42:44 2011 +0100 ++++ b/distutils2/util.py Sat Jan 29 02:39:55 2011 +0100 +@@ -1203,12 +1203,13 @@ + in_cfg_value = has_get_option(config, section, option) + if not in_cfg_value: + # There is no such option in the setup.cfg +- continue +- +- if arg == "long_description": +- filename = has_get_option("description_file") +- if filename: +- in_cfg_value = open(filename).read() ++ if arg == "long_description": ++ filename = has_get_option(config, section, "description_file") ++ print "We have a filename", filename ++ if filename: ++ in_cfg_value = open(filename).read() ++ else: ++ continue + + if arg in MULTI_FIELDS: + # Special behaviour when we have a multi line option diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -7,11 +7,15 @@ from distutils2 import __version__ as VERSION from distutils2.util import find_packages from distutils import log -from distutils.core import setup, Extension from distutils.ccompiler import new_compiler from distutils.command.sdist import sdist from distutils.command.install import install +try: + from setuptools import setup, Extension +except ImportError: + from distutils.core import setup, Extension + f = open('README.txt') try: README = f.read() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: [datafiles] use os.path instead of osp Message-ID: tarek.ziade pushed b4c14d0649dd to distutils2: http://hg.python.org/distutils2/rev/b4c14d0649dd changeset: 1048:b4c14d0649dd user: Pierre-Yves David date: Sat Jan 29 15:44:26 2011 +0100 summary: [datafiles] use os.path instead of osp files: distutils2/datafiles.py 
distutils2/tests/test_datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -1,6 +1,5 @@ import os import re -from os import path as osp from glob import iglob as simple_iglob __all__ = ['iglob', 'resources_dests'] @@ -14,16 +13,16 @@ def expand(self, basepath, destination): if self.base: - base = osp.join(basepath, self.base) + base = os.path.join(basepath, self.base) else: base = basepath if '*' in base or '{' in base or '}' in base: raise NotImplementedError('glob are not supported into base part of datafiles definition. %r is an invalide basepath' % base) - absglob = osp.join(base, self.suffix) + absglob = os.path.join(base, self.suffix) for file in iglob(absglob): path_suffix = file[len(base):].lstrip('/') relpath = file[len(basepath):].lstrip('/') - dest = osp.join(destination, path_suffix) + dest = os.path.join(destination, path_suffix) yield relpath, dest RICH_GLOB = re.compile(r'\{([^}]*)\}') @@ -51,7 +50,7 @@ else: radical = radical.lstrip('/') for (path, dir, files) in os.walk(prefix): - for file in iglob(osp.join(prefix, path, radical)): + for file in iglob(os.path.join(prefix, path, radical)): yield os.path.join(prefix, file) def resources_dests(resources_dir, rules): diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -7,7 +7,6 @@ from distutils2.tests import unittest, support, run_unittest from distutils2.datafiles import resources_dests, RICH_GLOB import re -from os import path as osp @@ -25,9 +24,9 @@ def build_spec(self, spec, clean=True): tempdir = self.mkdtemp() for filepath in spec: - filepath = osp.join(tempdir, *filepath.split('/')) - dirname = osp.dirname(filepath) - if dirname and not osp.exists(dirname): + filepath = os.path.join(tempdir, *filepath.split('/')) + dirname = os.path.dirname(filepath) + if dirname and not os.path.exists(dirname): os.makedirs(dirname) self.write_file(filepath, 'babar') if clean: -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Merge with merge with main branche Message-ID: tarek.ziade pushed 25439a8e604b to distutils2: http://hg.python.org/distutils2/rev/25439a8e604b changeset: 1046:25439a8e604b parent: 1045:7b7827c8e4dd parent: 1044:cc4a40ea2743 user: FELD Boris date: Sat Jan 29 15:40:33 2011 +0100 summary: Merge with merge with main branche files: docs/source/setupcfg.rst diff --git a/distutils2/command/build_py.py b/distutils2/command/build_py.py --- a/distutils2/command/build_py.py +++ b/distutils2/command/build_py.py @@ -66,10 +66,9 @@ self.packages = self.distribution.packages self.py_modules = self.distribution.py_modules self.package_data = self.distribution.package_data - self.package_dir = {} - if self.distribution.package_dir: - for name, path in self.distribution.package_dir.iteritems(): - self.package_dir[name] = convert_path(path) + self.package_dir = None + if self.distribution.package_dir is not None: + self.package_dir = convert_path(self.distribution.package_dir) self.data_files = self.get_data_files() # Ick, copied straight from install_lib.py (fancy_getopt needs a @@ -179,41 +178,14 @@ """Return the directory, relative to the top of the source distribution, where package 'package' should be found (at least according to the 
'package_dir' option, if any).""" + path = package.split('.') + if self.package_dir is not None: + path.insert(0, self.package_dir) - path = package.split('.') + if len(path) > 0: + return os.path.join(*path) - if not self.package_dir: - if path: - return os.path.join(*path) - else: - return '' - else: - tail = [] - while path: - try: - pdir = self.package_dir['.'.join(path)] - except KeyError: - tail.insert(0, path[-1]) - del path[-1] - else: - tail.insert(0, pdir) - return os.path.join(*tail) - else: - # Oops, got all the way through 'path' without finding a - # match in package_dir. If package_dir defines a directory - # for the root (nameless) package, then fallback on it; - # otherwise, we might as well have not consulted - # package_dir at all, as we just use the directory implied - # by 'tail' (which should be the same as the original value - # of 'path' at this point). - pdir = self.package_dir.get('') - if pdir is not None: - tail.insert(0, pdir) - - if tail: - return os.path.join(*tail) - else: - return '' + return '' def check_package(self, package, package_dir): """Helper function for `find_package_modules()` and `find_modules()'. diff --git a/distutils2/command/check.py b/distutils2/command/check.py --- a/distutils2/command/check.py +++ b/distutils2/command/check.py @@ -57,7 +57,7 @@ Warns if any are missing. """ - missing, __ = self.distribution.metadata.check() + missing, __ = self.distribution.metadata.check(strict=True) if missing != []: self.warn("missing required metadata: %s" % ', '.join(missing)) diff --git a/distutils2/command/sdist.py b/distutils2/command/sdist.py --- a/distutils2/command/sdist.py +++ b/distutils2/command/sdist.py @@ -18,7 +18,8 @@ from distutils2.command import get_command_names from distutils2.command.cmd import Command from distutils2.errors import (DistutilsPlatformError, DistutilsOptionError, - DistutilsTemplateError, DistutilsModuleError) + DistutilsTemplateError, DistutilsModuleError, + DistutilsFileError) from distutils2.manifest import Manifest from distutils2 import logger from distutils2.util import convert_path, resolve_name @@ -214,8 +215,6 @@ def add_defaults(self): """Add all the default files to self.filelist: - - README or README.txt - - test/test*.py - all pure Python modules mentioned in setup script - all files pointed by package_data (build_py) - all files defined in data_files. @@ -225,32 +224,6 @@ Warns if (README or README.txt) or setup.py are missing; everything else is optional. 
""" - standards = [('README', 'README.txt')] - for fn in standards: - if isinstance(fn, tuple): - alts = fn - got_it = 0 - for fn in alts: - if os.path.exists(fn): - got_it = 1 - self.filelist.append(fn) - break - - if not got_it: - self.warn("standard file not found: should have one of " + - string.join(alts, ', ')) - else: - if os.path.exists(fn): - self.filelist.append(fn) - else: - self.warn("standard file '%s' not found" % fn) - - optional = ['test/test*.py', 'setup.cfg'] - for pattern in optional: - files = filter(os.path.isfile, glob(pattern)) - if files: - self.filelist.extend(files) - for cmd_name in get_command_names(): try: cmd_obj = self.get_finalized_command(cmd_name) @@ -319,6 +292,12 @@ logger.warn("no files to distribute -- empty manifest?") else: logger.info(msg) + + for file in self.distribution.metadata.requires_files: + if file not in files: + msg = "'%s' must be included explicitly in 'extra_files'" % file + raise DistutilsFileError(msg) + for file in files: if not os.path.isfile(file): logger.warn("'%s' not a regular file -- skipping" % file) @@ -376,4 +355,3 @@ # Now create them for dir in need_dirs: self.mkpath(dir, mode, verbose=verbose, dry_run=dry_run) - diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -79,10 +79,9 @@ return value def _multiline(self, value): - if '\n' in value: - value = [v for v in - [v.strip() for v in value.split('\n')] - if v != ''] + value = [v for v in + [v.strip() for v in value.split('\n')] + if v != ''] return value def _read_setup_cfg(self, parser, filename): @@ -103,7 +102,9 @@ if 'metadata' in content: for key, value in content['metadata'].iteritems(): key = key.replace('_', '-') - value = self._multiline(value) + if metadata.is_multi_field(key): + value = self._multiline(value) + if key == 'project-url': value = [(label.strip(), url.strip()) for label, url in @@ -115,30 +116,45 @@ "mutually exclusive") raise DistutilsOptionError(msg) - f = open(value) # will raise if file not found - try: - value = f.read() - finally: - f.close() + if isinstance(value, list): + filenames = value + else: + filenames = value.split() + + # concatenate each files + value = '' + for filename in filenames: + f = open(filename) # will raise if file not found + try: + value += f.read().strip() + '\n' + finally: + f.close() + # add filename as a required file + if filename not in metadata.requires_files: + metadata.requires_files.append(filename) + value = value.strip() key = 'description' if metadata.is_metadata_field(key): metadata[key] = self._convert_metadata(key, value) + if 'files' in content: - files = dict([(key, self._multiline(value)) + def _convert(key, value): + if key not in ('packages_root',): + value = self._multiline(value) + return value + + files = dict([(key, _convert(key, value)) for key, value in content['files'].iteritems()]) self.dist.packages = [] - self.dist.package_dir = {} + self.dist.package_dir = pkg_dir = files.get('packages_root') packages = files.get('packages', []) if isinstance(packages, str): packages = [packages] for package in packages: - if ':' in package: - dir_, package = package.split(':') - self.dist.package_dir[package] = dir_ self.dist.packages.append(package) self.dist.py_modules = files.get('modules', []) diff --git a/distutils2/errors.py b/distutils2/errors.py --- a/distutils2/errors.py +++ b/distutils2/errors.py @@ -110,6 +110,10 @@ """Attempt to process an unknown file type.""" +class MetadataMissingError(DistutilsError): + """A required metadata 
is missing""" + + class MetadataConflictError(DistutilsError): """Attempt to read or write metadata fields that are conflictual.""" diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -208,6 +208,13 @@ infos[key].extend(new_infos[key]) +def remove(project_name): + """Removes a single project from the installation""" + pass + + + + def main(**attrs): if 'script_args' not in attrs: import sys diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -14,7 +14,8 @@ from distutils2 import logger from distutils2.version import (is_valid_predicate, is_valid_version, is_valid_versions) -from distutils2.errors import (MetadataConflictError, +from distutils2.errors import (MetadataMissingError, + MetadataConflictError, MetadataUnrecognizedVersionError) try: @@ -77,12 +78,13 @@ 'Obsoletes-Dist', 'Requires-External', 'Maintainer', 'Maintainer-email', 'Project-URL') +_345_REQUIRED = ('Name', 'Version') + _ALL_FIELDS = set() _ALL_FIELDS.update(_241_FIELDS) _ALL_FIELDS.update(_314_FIELDS) _ALL_FIELDS.update(_345_FIELDS) - def _version2fieldlist(version): if version == '1.0': return _241_FIELDS @@ -172,14 +174,19 @@ _LISTFIELDS = ('Platform', 'Classifier', 'Obsoletes', 'Requires', 'Provides', 'Obsoletes-Dist', 'Provides-Dist', 'Requires-Dist', 'Requires-External', - 'Project-URL') + 'Project-URL', 'Supported-Platform') _LISTTUPLEFIELDS = ('Project-URL',) _ELEMENTSFIELD = ('Keywords',) _UNICODEFIELDS = ('Author', 'Maintainer', 'Summary', 'Description') -_MISSING = object() +class NoDefault(object): + """Marker object used for clean representation""" + def __repr__(self): + return '' + +_MISSING = NoDefault() class DistributionMetadata(object): """The metadata of a release. @@ -200,6 +207,7 @@ self._fields = {} self.display_warnings = display_warnings self.version = None + self.requires_files = [] self.docutils_support = _HAS_DOCUTILS self.platform_dependent = platform_dependent self.execution_context = execution_context @@ -292,13 +300,20 @@ # Public API # def get_fullname(self): + """Return the distribution name with version""" return '%s-%s' % (self['Name'], self['Version']) def is_metadata_field(self, name): + """return True if name is a valid metadata key""" name = self._convert_name(name) return name in _ALL_FIELDS + def is_multi_field(self, name): + name = self._convert_name(name) + return name in _LISTFIELDS + def read(self, filepath): + """Read the metadata values from a file path.""" self.read_file(open(filepath)) def read_file(self, fileob): @@ -451,11 +466,21 @@ return None return value - def check(self): - """Check if the metadata is compliant.""" + def check(self, strict=False): + """Check if the metadata is compliant. 
If strict is False then raise if + no Name or Version are provided""" # XXX should check the versions (if the file was loaded) missing, warnings = [], [] - for attr in ('Name', 'Version', 'Home-page'): + + for attr in ('Name', 'Version'): + if attr not in self: + missing.append(attr) + + if strict and missing != []: + msg = "missing required metadata: %s" % ', '.join(missing) + raise MetadataMissingError(msg) + + for attr in ('Home-page',): if attr not in self: missing.append(attr) @@ -483,12 +508,15 @@ return missing, warnings def keys(self): + """Dict like api""" return _version2fieldlist(self.version) def values(self): + """Dict like api""" return [self[key] for key in self.keys()] def items(self): + """Dict like api""" return [(key, self[key]) for key in self.keys()] diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -109,6 +109,7 @@ except (DistutilsError, CCompilerError), msg: + raise raise SystemExit, "error: " + str(msg) return dist @@ -124,6 +125,10 @@ action="store_true", dest="version", default=False, help="Prints out the version of Distutils2 and exits.") + parser.add_option("-m", "--metadata", + action="append", dest="metadata", default=[], + help="List METADATA metadata or 'all' for all metadatas.") + parser.add_option("-s", "--search", action="store", dest="search", default=None, help="Search for installed distributions.") @@ -141,6 +146,31 @@ print('Distutils2 %s' % __version__) # sys.exit(0) + if len(options.metadata): + from distutils2.dist import Distribution + dist = Distribution() + dist.parse_config_files() + metadata = dist.metadata + + if 'all' in options.metadata: + keys = metadata.keys() + else: + keys = options.metadata + if len(keys) == 1: + print metadata[keys[0]] + sys.exit(0) + + for key in keys: + if key in metadata: + print(metadata._convert_name(key)+':') + value = metadata[key] + if isinstance(value, list): + for v in value: + print(' '+v) + else: + print(' '+value.replace('\n', '\n ')) + sys.exit(0) + if options.search is not None: search = options.search.lower() for dist in get_distributions(use_egg_info=True): diff --git a/distutils2/tests/test_command_build_ext.py b/distutils2/tests/test_command_build_ext.py --- a/distutils2/tests/test_command_build_ext.py +++ b/distutils2/tests/test_command_build_ext.py @@ -289,7 +289,7 @@ # inplace = 0, cmd.package = 'bar' build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {'': 'bar'} + build_py.package_dir = 'bar' path = cmd.get_ext_fullpath('foo') # checking that the last directory is the build_dir path = os.path.split(path)[0] @@ -318,7 +318,7 @@ dist = Distribution() cmd = build_ext(dist) cmd.inplace = 1 - cmd.distribution.package_dir = {'': 'src'} + cmd.distribution.package_dir = 'src' cmd.distribution.packages = ['lxml', 'lxml.html'] curdir = os.getcwd() wanted = os.path.join(curdir, 'src', 'lxml', 'etree' + ext) @@ -334,7 +334,7 @@ # building twisted.runner.portmap not inplace build_py = cmd.get_finalized_command('build_py') - build_py.package_dir = {} + build_py.package_dir = None cmd.distribution.packages = ['twisted', 'twisted.runner.portmap'] path = cmd.get_ext_fullpath('twisted.runner.portmap') wanted = os.path.join(curdir, 'tmpdir', 'twisted', 'runner', diff --git a/distutils2/tests/test_command_build_py.py b/distutils2/tests/test_command_build_py.py --- a/distutils2/tests/test_command_build_py.py +++ b/distutils2/tests/test_command_build_py.py @@ -17,12 +17,14 @@ def test_package_data(self): sources = self.mkdtemp() - f = 
open(os.path.join(sources, "__init__.py"), "w") + pkg_dir = os.path.join(sources, 'pkg') + os.mkdir(pkg_dir) + f = open(os.path.join(pkg_dir, "__init__.py"), "w") try: f.write("# Pretend this is a package.") finally: f.close() - f = open(os.path.join(sources, "README.txt"), "w") + f = open(os.path.join(pkg_dir, "README.txt"), "w") try: f.write("Info about this package") finally: @@ -31,8 +33,9 @@ destination = self.mkdtemp() dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": sources}}) + "package_dir": sources}) # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") dist.command_obj["build"] = support.DummyCommand( force=0, @@ -42,7 +45,7 @@ use_2to3=False) dist.packages = ["pkg"] dist.package_data = {"pkg": ["README.txt"]} - dist.package_dir = {"pkg": sources} + dist.package_dir = sources cmd = build_py(dist) cmd.compile = 1 @@ -68,19 +71,20 @@ # create the distribution files. sources = self.mkdtemp() - open(os.path.join(sources, "__init__.py"), "w").close() - - testdir = os.path.join(sources, "doc") + pkg = os.path.join(sources, 'pkg') + os.mkdir(pkg) + open(os.path.join(pkg, "__init__.py"), "w").close() + testdir = os.path.join(pkg, "doc") os.mkdir(testdir) open(os.path.join(testdir, "testfile"), "w").close() os.chdir(sources) old_stdout = sys.stdout - sys.stdout = StringIO.StringIO() + #sys.stdout = StringIO.StringIO() try: dist = Distribution({"packages": ["pkg"], - "package_dir": {"pkg": ""}, + "package_dir": sources, "package_data": {"pkg": ["doc/*"]}}) # script_name need not exist, it just need to be initialized dist.script_name = os.path.join(sources, "setup.py") @@ -89,7 +93,7 @@ try: dist.run_commands() - except DistutilsFileError: + except DistutilsFileError, e: self.fail("failed package_data test when package_dir is ''") finally: # Restore state. 
diff --git a/distutils2/tests/test_command_check.py b/distutils2/tests/test_command_check.py --- a/distutils2/tests/test_command_check.py +++ b/distutils2/tests/test_command_check.py @@ -4,6 +4,7 @@ from distutils2.metadata import _HAS_DOCUTILS from distutils2.tests import unittest, support from distutils2.errors import DistutilsSetupError +from distutils2.errors import MetadataMissingError class CheckTestCase(support.LoggingCatcher, support.TempdirManager, @@ -11,7 +12,7 @@ def _run(self, metadata=None, **options): if metadata is None: - metadata = {} + metadata = {'name':'xxx', 'version':'xxx'} pkg_info, dist = self.create_dist(**metadata) cmd = check(dist) cmd.initialize_options() @@ -40,7 +41,8 @@ # now with the strict mode, we should # get an error if there are missing metadata - self.assertRaises(DistutilsSetupError, self._run, {}, **{'strict': 1}) + self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, {'name':'xxx', 'version':'xxx'}, **{'strict': 1}) # and of course, no error when all metadata fields are present cmd = self._run(metadata, strict=1) @@ -63,6 +65,9 @@ def test_check_all(self): self.assertRaises(DistutilsSetupError, self._run, + {'name':'xxx', 'version':'xxx'}, **{'strict': 1, + 'all': 1}) + self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1, 'all': 1}) diff --git a/distutils2/tests/test_command_register.py b/distutils2/tests/test_command_register.py --- a/distutils2/tests/test_command_register.py +++ b/distutils2/tests/test_command_register.py @@ -195,7 +195,7 @@ # long_description is not reSt compliant # empty metadata - cmd = self._get_cmd({}) + cmd = self._get_cmd({'name': 'xxx', 'version': 'xxx'}) cmd.ensure_finalized() cmd.strict = 1 inputs = RawInputs('1', 'tarek', 'y') diff --git a/distutils2/tests/test_command_sdist.py b/distutils2/tests/test_command_sdist.py --- a/distutils2/tests/test_command_sdist.py +++ b/distutils2/tests/test_command_sdist.py @@ -45,7 +45,6 @@ MANIFEST = """\ # file GENERATED by distutils, do NOT edit -README inroot.txt data%(sep)sdata.dt scripts%(sep)sscript.py @@ -141,7 +140,7 @@ zip_file.close() # making sure everything has been pruned correctly - self.assertEqual(len(content), 3) + self.assertEqual(len(content), 2) @unittest.skipUnless(zlib, "requires zlib") def test_make_distribution(self): @@ -236,7 +235,7 @@ zip_file.close() # making sure everything was added - self.assertEqual(len(content), 10) + self.assertEqual(len(content), 9) # checking the MANIFEST manifest = open(join(self.tmp_dir, 'MANIFEST')).read() @@ -245,7 +244,7 @@ @unittest.skipUnless(zlib, "requires zlib") def test_metadata_check_option(self): # testing the `check-metadata` option - dist, cmd = self.get_cmd(metadata={}) + dist, cmd = self.get_cmd(metadata={'name':'xxx', 'version':'xxx'}) # this should raise some warnings ! # with the `check` subcommand @@ -362,8 +361,7 @@ if line.strip() != ''] finally: f.close() - - self.assertEqual(len(manifest), 4) + self.assertEqual(len(manifest), 3) # adding a file self.write_file((self.tmp_dir, 'somecode', 'doc2.txt'), '#') @@ -383,7 +381,7 @@ f.close() # do we have the new file in MANIFEST ? 
- self.assertEqual(len(manifest2), 5) + self.assertEqual(len(manifest2), 4) self.assertIn('doc2.txt', manifest2[-1]) def test_manifest_marker(self): diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -5,6 +5,8 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest +from distutils2.command.sdist import sdist +from distutils2.errors import DistutilsFileError SETUP_CFG = """ @@ -16,7 +18,7 @@ maintainer = ??ric Araujo maintainer_email = merwok at netwok.org summary = A sample project demonstrating distutils2 packaging -description-file = README +description-file = %(description-file)s keywords = distutils2, packaging, sample project classifier = @@ -47,9 +49,11 @@ Fork in progress, http://bitbucket.org/Merwok/sample-distutils2-project [files] +packages_root = src + packages = one - src:two - src2:three + two + three modules = haven @@ -66,6 +70,8 @@ config = cfg/data.cfg /etc/init.d = init-script +extra_files = %(extra-files)s + # Replaces MANIFEST.in sdist_extra = include THANKS HACKING @@ -130,22 +136,33 @@ self.addCleanup(setattr, sys, 'stderr', sys.stderr) self.addCleanup(os.chdir, os.getcwd()) - def test_config(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) - self.write_file('README', 'yeah') + def write_setup(self, kwargs=None): + opts = {'description-file': 'README', 'extra-files':''} + if kwargs: + opts.update(kwargs) + self.write_file('setup.cfg', SETUP_CFG % opts) - # try to load the metadata now + + def run_setup(self, *args): + # run setup with args sys.stdout = StringIO() - sys.argv[:] = ['setup.py', '--version'] + sys.argv[:] = [''] + list(args) old_sys = sys.argv[:] - try: from distutils2.run import commands_main dist = commands_main() finally: sys.argv[:] = old_sys + return dist + + def test_config(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup() + self.write_file('README', 'yeah') + + # try to load the metadata now + dist = self.run_setup('--version') # sanity check self.assertEqual(sys.stdout.getvalue(), '0.6.4.dev1' + os.linesep) @@ -184,7 +201,6 @@ 'http://bitbucket.org/Merwok/sample-distutils2-project')] self.assertEqual(dist.metadata['Project-Url'], urls) - self.assertEqual(dist.packages, ['one', 'two', 'three']) self.assertEqual(dist.py_modules, ['haven']) self.assertEqual(dist.package_data, {'cheese': 'data/templates/*'}) @@ -192,7 +208,8 @@ [('bitmaps ', ['bm/b1.gif', 'bm/b2.gif']), ('config ', ['cfg/data.cfg']), ('/etc/init.d ', ['init-script'])]) - self.assertEqual(dist.package_dir['two'], 'src') + + self.assertEqual(dist.package_dir, 'src') # Make sure we get the foo command loaded. 
We use a string comparison # instead of assertIsInstance because the class is not the same when @@ -213,10 +230,94 @@ d = new_compiler(compiler='d') self.assertEqual(d.description, 'D Compiler') + + def test_multiple_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog2') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_multiline_description_file(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + + self.write_setup({'description-file': 'README\n CHANGES'}) + self.write_file('README', 'yeah') + self.write_file('CHANGES', 'changelog') + dist = self.run_setup('--version') + self.assertEqual(dist.metadata['description'], 'yeah\nchangelog') + self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + + def test_metadata_requires_description_files_missing(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--version') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + def test_metadata_requires_description_files(self): + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.write_setup({'description-file': 'README\n README2', + 'extra-files':'\n README2'}) + self.write_file('README', 'yeah') + self.write_file('README2', 'yeah') + self.write_file('haven.py', '#') + self.write_file('script1.py', '#') + os.mkdir('scripts') + self.write_file(os.path.join('scripts', 'find-coconuts'), '#') + os.mkdir('bin') + self.write_file(os.path.join('bin', 'taunt'), '#') + + os.mkdir('src') + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) + os.mkdir(pkg) + self.write_file(os.path.join(pkg, '__init__.py'), '#') + + dist = self.run_setup('--description') + self.assertIn('yeah\nyeah\n', sys.stdout.getvalue()) + + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + self.assertRaises(DistutilsFileError, cmd.make_distribution) + + self.write_setup({'description-file': 'README\n README2', + 'extra-files': '\n README2\n README'}) + dist = self.run_setup('--description') + cmd = sdist(dist) + cmd.finalize_options() + cmd.get_file_list() + cmd.make_distribution() + self.assertIn('README\nREADME2\n', open('MANIFEST').read()) + def test_sub_commands(self): tempdir = self.mkdtemp() os.chdir(tempdir) - self.write_file('setup.cfg', SETUP_CFG) + self.write_setup() self.write_file('README', 'yeah') self.write_file('haven.py', '#') self.write_file('script1.py', '#') @@ -224,20 +325,15 @@ self.write_file(os.path.join('scripts', 'find-coconuts'), '#') os.mkdir('bin') self.write_file(os.path.join('bin', 'taunt'), '#') + os.mkdir('src') - for pkg in ('one', 'src', 'src2'): + for pkg in ('one', 'two', 'three'): + pkg = os.path.join('src', pkg) os.mkdir(pkg) self.write_file(os.path.join(pkg, '__init__.py'), '#') # try to run the 
install command to see if foo is called - sys.stdout = sys.stderr = StringIO() - sys.argv[:] = ['', 'install_dist'] - old_sys = sys.argv[:] - try: - from distutils2.run import main - dist = main() - finally: - sys.argv[:] = old_sys + dist = self.run_setup('install_dist') self.assertEqual(dist.foo_was_here, 1) diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -15,6 +15,7 @@ from copy import copy from fnmatch import fnmatchcase from ConfigParser import RawConfigParser +from inspect import getsource from distutils2.errors import (DistutilsPlatformError, DistutilsFileError, DistutilsByteCompileError, DistutilsExecError) @@ -1127,3 +1128,117 @@ """ Issues a call to util.run_2to3. """ return run_2to3(files, doctests_only, self.fixer_names, self.options, self.explicit) + + +def generate_distutils_kwargs_from_setup_cfg(file='setup.cfg'): + """ Distutils2 to distutils1 compatibility util. + + This method uses an existing setup.cfg to generate a dictionnary of + keywords that can be used by distutils.core.setup(kwargs**). + + :param file: + The setup.cfg path. + :raises DistutilsFileError: + When the setup.cfg file is not found. + + """ + # We need to declare the following constants here so that it's easier to + # generate the setup.py afterwards, using inspect.getsource. + D1_D2_SETUP_ARGS = { + # D1 name : (D2_section, D2_name) + "name" : ("metadata",), + "version" : ("metadata",), + "author" : ("metadata",), + "author_email" : ("metadata",), + "maintainer" : ("metadata",), + "maintainer_email" : ("metadata",), + "url" : ("metadata", "home_page"), + "description" : ("metadata", "summary"), + "long_description" : ("metadata", "description"), + "download-url" : ("metadata",), + "classifiers" : ("metadata", "classifier"), + "platforms" : ("metadata", "platform"), # Needs testing + "license" : ("metadata",), + "requires" : ("metadata", "requires_dist"), + "provides" : ("metadata", "provides_dist"), # Needs testing + "obsoletes" : ("metadata", "obsoletes_dist"), # Needs testing + + "packages" : ("files",), + "scripts" : ("files",), + "py_modules" : ("files", "modules"), # Needs testing + } + + MULTI_FIELDS = ("classifiers", + "requires", + "platforms", + "packages", + "scripts") + + def has_get_option(config, section, option): + if config.has_option(section, option): + return config.get(section, option) + elif config.has_option(section, option.replace('_', '-')): + return config.get(section, option.replace('_', '-')) + else: + return False + + # The method source code really starts here. + config = RawConfigParser() + if not os.path.exists(file): + raise DistutilsFileError("file '%s' does not exist" % + os.path.abspath(file)) + config.read(file) + + kwargs = {} + for arg in D1_D2_SETUP_ARGS: + if len(D1_D2_SETUP_ARGS[arg]) == 2: + # The distutils field name is different than distutils2's. + section, option = D1_D2_SETUP_ARGS[arg] + + elif len(D1_D2_SETUP_ARGS[arg]) == 1: + # The distutils field name is the same thant distutils2's. 
+ section = D1_D2_SETUP_ARGS[arg][0] + option = arg + + in_cfg_value = has_get_option(config, section, option) + if not in_cfg_value: + # There is no such option in the setup.cfg + if arg == "long_description": + filename = has_get_option(config, section, "description_file") + print "We have a filename", filename + if filename: + in_cfg_value = open(filename).read() + else: + continue + + if arg in MULTI_FIELDS: + # Special behaviour when we have a multi line option + if "\n" in in_cfg_value: + in_cfg_value = in_cfg_value.strip().split('\n') + else: + in_cfg_value = list((in_cfg_value,)) + + kwargs[arg] = in_cfg_value + + return kwargs + + +def generate_distutils_setup_py(): + """ Generate a distutils compatible setup.py using an existing setup.cfg. + + :raises DistutilsFileError: + When a setup.py already exists. + """ + if os.path.exists("setup.py"): + raise DistutilsFileError("A pre existing setup.py file exists") + + handle = open("setup.py", "w") + handle.write("# Distutils script using distutils2 setup.cfg to call the\n") + handle.write("# distutils.core.setup() with the right args.\n\n\n") + handle.write("import os\n") + handle.write("from distutils.core import setup\n") + handle.write("from ConfigParser import RawConfigParser\n\n") + handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) + handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") + handle.write("setup(**kwargs)") + handle.close() diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -1055,6 +1055,13 @@ Create a file called *filename* and write *contents* (a sequence of strings without line terminators) to it. +:mod:`distutils2.metadata` --- Metadata handling +================================================================ + +.. module:: distutils2.metadata + +.. autoclass:: distutils2.metadata.DistributionMetadata + :members: :mod:`distutils2.util` --- Miscellaneous other utility functions ================================================================ diff --git a/docs/source/distutils/examples.rst b/docs/source/distutils/examples.rst --- a/docs/source/distutils/examples.rst +++ b/docs/source/distutils/examples.rst @@ -301,7 +301,7 @@ :class:`distutils2.dist.DistributionMetadata` class and its :func:`read_pkg_file` method:: - >>> from distutils2.dist import DistributionMetadata + >>> from distutils2.metadata import DistributionMetadata >>> metadata = DistributionMetadata() >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) >>> metadata.name diff --git a/docs/source/library/distutils2.metadata.rst b/docs/source/library/distutils2.metadata.rst --- a/docs/source/library/distutils2.metadata.rst +++ b/docs/source/library/distutils2.metadata.rst @@ -2,7 +2,9 @@ Metadata ======== -Distutils2 provides a :class:`DistributionMetadata` class that can read and +.. module:: distutils2.metadata + +Distutils2 provides a :class:`~distutils2.metadata.DistributionMetadata` class that can read and write metadata files. 
This class is compatible with all metadata versions: * 1.0: :PEP:`241` @@ -17,7 +19,7 @@ Reading metadata ================ -The :class:`DistributionMetadata` class can be instantiated with the path of +The :class:`~distutils2.metadata.DistributionMetadata` class can be instantiated with the path of the metadata file, and provides a dict-like interface to the values:: >>> from distutils2.metadata import DistributionMetadata @@ -33,7 +35,7 @@ The fields that supports environment markers can be automatically ignored if the object is instantiated using the ``platform_dependent`` option. -:class:`DistributionMetadata` will interpret in the case the markers and will +:class:`~distutils2.metadata.DistributionMetadata` will interpret in the case the markers and will automatically remove the fields that are not compliant with the running environment. Here's an example under Mac OS X. The win32 dependency we saw earlier is ignored:: diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -128,6 +128,8 @@ This section describes the files included in the project. +- **packages_root**: the root directory containing all packages. If not provided + Distutils2 will use the current directory. *\*optional* - **packages**: a list of packages the project includes *\*optional* *\*multi* - **modules**: a list of packages the project includes *\*optional* *\*multi* - **scripts**: a list of scripts the project includes *\*optional* *\*multi* @@ -136,6 +138,7 @@ Example:: [files] + packages_root = src packages = pypi2rpm pypi2rpm.command diff --git a/patch b/patch new file mode 100644 --- /dev/null +++ b/patch @@ -0,0 +1,23 @@ +diff -r 5603e1bc5442 distutils2/util.py +--- a/distutils2/util.py Fri Jan 28 18:42:44 2011 +0100 ++++ b/distutils2/util.py Sat Jan 29 02:39:55 2011 +0100 +@@ -1203,12 +1203,13 @@ + in_cfg_value = has_get_option(config, section, option) + if not in_cfg_value: + # There is no such option in the setup.cfg +- continue +- +- if arg == "long_description": +- filename = has_get_option("description_file") +- if filename: +- in_cfg_value = open(filename).read() ++ if arg == "long_description": ++ filename = has_get_option(config, section, "description_file") ++ print "We have a filename", filename ++ if filename: ++ in_cfg_value = open(filename).read() ++ else: ++ continue + + if arg in MULTI_FIELDS: + # Special behaviour when we have a multi line option diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -7,11 +7,15 @@ from distutils2 import __version__ as VERSION from distutils2.util import find_packages from distutils import log -from distutils.core import setup, Extension from distutils.ccompiler import new_compiler from distutils.command.sdist import sdist from distutils.command.install import install +try: + from setuptools import setup, Extension +except ImportError: + from distutils.core import setup, Extension + f = open('README.txt') try: README = f.read() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Merge Message-ID: tarek.ziade pushed b6f8c2fbc3ec to distutils2: http://hg.python.org/distutils2/rev/b6f8c2fbc3ec changeset: 1050:b6f8c2fbc3ec parent: 1049:b437c5373952 parent: 1048:b4c14d0649dd user: FELD Boris date: Sat Jan 29 16:59:48 2011 +0100 summary: Merge files: diff --git a/distutils2/datafiles.py 
b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -1,8 +1,8 @@ import os import re -from os import path as osp from glob import iglob as simple_iglob +__all__ = ['iglob', 'resources_dests'] class SmartGlob(object): @@ -13,16 +13,16 @@ def expand(self, basepath, destination): if self.base: - base = osp.join(basepath, self.base) + base = os.path.join(basepath, self.base) else: base = basepath if '*' in base or '{' in base or '}' in base: raise NotImplementedError('glob are not supported into base part of datafiles definition. %r is an invalide basepath' % base) - absglob = osp.join(base, self.suffix) + absglob = os.path.join(base, self.suffix) for file in iglob(absglob): path_suffix = file[len(base):].lstrip('/') relpath = file[len(basepath):].lstrip('/') - dest = osp.join(destination, path_suffix) + dest = os.path.join(destination, path_suffix) yield relpath, dest RICH_GLOB = re.compile(r'\{([^}]*)\}') @@ -50,7 +50,7 @@ else: radical = radical.lstrip('/') for (path, dir, files) in os.walk(prefix): - for file in iglob(osp.join(prefix, path, radical)): + for file in iglob(os.path.join(prefix, path, radical)): yield os.path.join(prefix, file) def resources_dests(resources_dir, rules): diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -7,7 +7,6 @@ from distutils2.tests import unittest, support, run_unittest from distutils2.datafiles import resources_dests, RICH_GLOB import re -from os import path as osp @@ -25,9 +24,9 @@ def build_spec(self, spec, clean=True): tempdir = self.mkdtemp() for filepath in spec: - filepath = osp.join(tempdir, *filepath.split('/')) - dirname = osp.dirname(filepath) - if dirname and not osp.exists(dirname): + filepath = os.path.join(tempdir, *filepath.split('/')) + dirname = os.path.dirname(filepath) + if dirname and not os.path.exists(dirname): os.makedirs(dirname) self.write_file(filepath, 'babar') if clean: -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Correct typo error in RESOURCE paths mapping file in pkgutil. Message-ID: tarek.ziade pushed b437c5373952 to distutils2: http://hg.python.org/distutils2/rev/b437c5373952 changeset: 1049:b437c5373952 parent: 1046:25439a8e604b user: FELD Boris date: Sat Jan 29 16:47:23 2011 +0100 summary: Correct typo error in RESOURCE paths mapping file in pkgutil. Add a format_value function in sysconfig. Correct bug in get_source_files in install_data. 
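As a rough standalone illustration of the new format_value helper mentioned in the summary, placeholders of the form {name} are substituted from a vars mapping and unknown placeholders are left untouched (the _VAR_REPL pattern below is an assumption made for this sketch; the real pattern is defined elsewhere in the sysconfig backport):

    import re

    _VAR_REPL = re.compile(r'\{([^{}]*)\}')  # assumed pattern, for illustration only

    def format_value(value, vars):
        def _replacer(matchobj):
            name = matchobj.group(1)
            if name in vars:
                return vars[name]
            return matchobj.group(0)
        return _VAR_REPL.sub(_replacer, value)

    # '{datadir}/foo.cfg' becomes '/usr/share/foo.cfg'; '{unknown}' stays as-is.
    print format_value('{datadir}/foo.cfg and {unknown}', {'datadir': '/usr/share'})
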
files: distutils2/_backport/pkgutil.py distutils2/_backport/sysconfig.py distutils2/command/install_data.py diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -613,7 +613,7 @@ # PEP 376 Implementation # ########################## -DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED', 'RESOURCES') +DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED', 'DATAFILES') # Cache _cache_name = {} # maps names to Distribution instances diff --git a/distutils2/_backport/sysconfig.py b/distutils2/_backport/sysconfig.py --- a/distutils2/_backport/sysconfig.py +++ b/distutils2/_backport/sysconfig.py @@ -120,6 +120,14 @@ res[key] = os.path.normpath(_subst_vars(value, vars)) return res +def format_value(value, vars): + def _replacer(matchobj): + name = matchobj.group(1) + if name in vars: + return vars[name] + return matchobj.group(0) + return _VAR_REPL.sub(_replacer, value) + def _get_default_scheme(): if os.name == 'posix': diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -9,7 +9,7 @@ import os from distutils2.command.cmd import Command from distutils2.util import change_root, convert_path -from distutils2._backport.sysconfig import get_paths +from distutils2._backport.sysconfig import get_paths, format_value class install_data(Command): @@ -55,20 +55,15 @@ def expand_categories(self, path_with_categories): local_vars = get_paths() local_vars['distribution.name'] = self.distribution.metadata['Name'] - expanded_path = get_paths(path_with_categories, local_vars) - expanded_path = get_paths(expanded_path, local_vars) + expanded_path = format_value(path_with_categories, local_vars) + expanded_path = format_value(expanded_path, local_vars) if '{' in expanded_path and '}' in expanded_path: self.warn("Unable to expand %s, some categories may missing." % path_with_categories) return expanded_path def get_source_files(self): - sources = [] - for file in self.data_files: - destination = convert_path(self.expand_categories(file[1])) - if os.path.isfile(destination): - sources.append(destination) - return sources + return self.data_files.keys() def get_inputs(self): return self.data_files or [] -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Only create DATAFILES if distribution include data files. Message-ID: tarek.ziade pushed 76c733a49022 to distutils2: http://hg.python.org/distutils2/rev/76c733a49022 changeset: 1053:76c733a49022 user: FELD Boris date: Sat Jan 29 17:42:22 2011 +0100 summary: Only create DATAFILES if distribution include data files. 
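A hedged sketch of the behaviour this change aims for, writing a DATAFILES record only when install_data actually reports entries (the function and argument names here are illustrative, not the command's real API):

    import csv
    import os

    def write_datafiles(distinfo_dir, datafiles_out):
        # Nothing to record: skip creating the DATAFILES file entirely.
        if not datafiles_out:
            return None
        path = os.path.join(distinfo_dir, 'DATAFILES')
        f = open(path, 'wb')
        try:
            writer = csv.writer(f, delimiter=',',
                                lineterminator=os.linesep, quotechar='"')
            for row in datafiles_out:
                writer.writerow(row)
        finally:
            f.close()
        return path
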
files: distutils2/command/install_distinfo.py diff --git a/distutils2/command/install_distinfo.py b/distutils2/command/install_distinfo.py --- a/distutils2/command/install_distinfo.py +++ b/distutils2/command/install_distinfo.py @@ -121,21 +121,21 @@ if not self.no_datafiles: - datafiles_path = os.path.join(self.distinfo_dir, 'DATAFILES') - logger.info('creating %s', datafiles_path) - f = open(datafiles_path, 'wb') - try: - writer = csv.writer(f, delimiter=',', - lineterminator=os.linesep, - quotechar='"') - install_data = self.get_finalized_command('install_data') - if install_data.get_datafiles_out() != '': + install_data = self.get_finalized_command('install_data') + if install_data.get_datafiles_out() != []: + datafiles_path = os.path.join(self.distinfo_dir, 'DATAFILES') + logger.info('creating %s', datafiles_path) + f = open(datafiles_path, 'wb') + try: + writer = csv.writer(f, delimiter=',', + lineterminator=os.linesep, + quotechar='"') for tuple in install_data.get_datafiles_out(): writer.writerow(tuple) - self.outputs.append(datafiles_path) - finally: - f.close() + self.outputs.append(datafiles_path) + finally: + f.close() if not self.no_record: record_path = os.path.join(self.distinfo_dir, 'RECORD') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Correct bug : DATAFILES not added in RECORD file. Message-ID: tarek.ziade pushed 08024169e638 to distutils2: http://hg.python.org/distutils2/rev/08024169e638 changeset: 1052:08024169e638 user: FELD Boris date: Sat Jan 29 17:32:29 2011 +0100 summary: Correct bug : DATAFILES not added in RECORD file. files: distutils2/command/install_distinfo.py diff --git a/distutils2/command/install_distinfo.py b/distutils2/command/install_distinfo.py --- a/distutils2/command/install_distinfo.py +++ b/distutils2/command/install_distinfo.py @@ -69,6 +69,9 @@ self.requested = True if self.no_record is None: self.no_record = False + if self.no_datafiles is None: + self.no_datafiles = False + metadata = self.distribution.metadata @@ -116,6 +119,24 @@ f.close() self.outputs.append(requested_path) + + if not self.no_datafiles: + datafiles_path = os.path.join(self.distinfo_dir, 'DATAFILES') + logger.info('creating %s', datafiles_path) + f = open(datafiles_path, 'wb') + try: + writer = csv.writer(f, delimiter=',', + lineterminator=os.linesep, + quotechar='"') + install_data = self.get_finalized_command('install_data') + if install_data.get_datafiles_out() != '': + for tuple in install_data.get_datafiles_out(): + writer.writerow(tuple) + + self.outputs.append(datafiles_path) + finally: + f.close() + if not self.no_record: record_path = os.path.join(self.distinfo_dir, 'RECORD') logger.info('creating %s', record_path) @@ -145,21 +166,6 @@ finally: f.close() - if not self.no_datafiles: - datafiles_path = os.path.join(self.distinfo_dir, 'DATAFILES') - logger.info('creating %s', datafiles_path) - f = open(datafiles_path, 'wb') - try: - writer = csv.writer(f, delimiter=',', - lineterminator=os.linesep, - quotechar='"') - install_data = self.get_finalized_command('install_data') - if install_data.get_datafiles_out() != '': - for tuple in install_data.get_datafiles_out(): - writer.writerow(tuple) - finally: - f.close() - def get_outputs(self): return self.outputs -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: 
python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Add a TODO for data files draft Message-ID: tarek.ziade pushed ba93466e49a9 to distutils2: http://hg.python.org/distutils2/rev/ba93466e49a9 changeset: 1051:ba93466e49a9 user: FELD Boris date: Sat Jan 29 17:09:34 2011 +0100 summary: Add a TODO for data files draft files: docs/source/setupcfg.rst diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -152,26 +152,29 @@ data-files ========== -### -source -> destination -fichier-final = destination + source +TODO : -There is an {alias} for each categories of datafiles ------ -source may be a glob (*, ?, **, {}) - -order - -exclude --- -base-prefix - -#### -overwrite system config for {alias} - -#### -extra-categori + ### + source -> destination + + final-path = destination + source + + There is an {alias} for each categories of datafiles + ----- + source may be a glob (*, ?, **, {}) + + order + + exclude + -- + base-prefix + + #### + overwrite system config for {alias} + + #### + extra-categories This section describes the files used by the project which must not be installed in the same place that python modules or libraries. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Rename pkgutil.open to avoid overwriting builtin function open Message-ID: tarek.ziade pushed 2f00ba92fc7d to distutils2: http://hg.python.org/distutils2/rev/2f00ba92fc7d changeset: 1055:2f00ba92fc7d user: FELD Boris date: Sat Jan 29 21:36:41 2011 +0100 summary: Rename pkgutil.open to avoid overwriting builtin function open files: distutils2/_backport/pkgutil.py diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -1166,5 +1166,5 @@ if dist.uses(path): yield dist -def open(distribution_name, relative_path): +def data_open(distribution_name, relative_path): pass -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Remove bad exception raise. Message-ID: tarek.ziade pushed 287f87c61aae to distutils2: http://hg.python.org/distutils2/rev/287f87c61aae changeset: 1059:287f87c61aae user: FELD Boris date: Sun Jan 30 11:59:48 2011 +0100 summary: Remove bad exception raise. 
files: distutils2/tests/test_mkcfg.py diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -117,7 +117,6 @@ fid = open(osp.join(self.wdir, 'setup.cfg')) lines = set([line.rstrip() for line in fid]) fid.close() - raise Exception(lines) self.assertEqual(lines, set(['', '[metadata]', 'version = 0.2', -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Merge with upstream and update some unittest to the new data-files Message-ID: tarek.ziade pushed 82e83fa442c4 to distutils2: http://hg.python.org/distutils2/rev/82e83fa442c4 changeset: 1057:82e83fa442c4 parent: 1056:225d826cf39e parent: 955:4b89c997484d user: FELD Boris date: Sat Jan 29 23:30:59 2011 +0100 summary: Merge with upstream and update some unittest to the new data-files implementation files: distutils2/config.py distutils2/tests/test_command_install_data.py distutils2/tests/test_command_sdist.py distutils2/tests/test_config.py distutils2/tests/test_mkcfg.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -4,17 +4,35 @@ """ import os.path import os +import re import sys import re from ConfigParser import RawConfigParser from distutils2 import logger from distutils2.errors import DistutilsOptionError +from distutils2.compiler.extension import Extension from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command from distutils2.datafiles import resources_dests + +def _pop_values(values_dct, key): + """Remove values from the dictionary and convert them as a list""" + vals_str = values_dct.pop(key, None) + if not vals_str: + return + # Get bash options like `gcc -print-file-name=libgcc.a` + vals = re.search('(`.*?`)', vals_str) or [] + if vals: + vals = list(vals.groups()) + vals_str = re.sub('`.*?`', '', vals_str) + vals.extend(vals_str.split()) + if vals: + return vals + + class Config(object): """Reads configuration files and work with the Distribution instance """ @@ -189,11 +207,37 @@ if destination == '': destination = None resources.append((prefix, suffix, destination)) - + dir = os.path.dirname(os.path.join(os.getcwd(), cfg_filename)) data_files = resources_dests(dir, resources) self.dist.data_files = data_files - + + ext_modules = self.dist.ext_modules + for section_key in content: + labels = section_key.split('=') + if (len(labels) == 2) and (labels[0] == 'extension'): + # labels[1] not used from now but should be implemented + # for extension build dependency + values_dct = content[section_key] + ext_modules.append(Extension( + values_dct.pop('name'), + _pop_values(values_dct, 'sources'), + _pop_values(values_dct, 'include_dirs'), + _pop_values(values_dct, 'define_macros'), + _pop_values(values_dct, 'undef_macros'), + _pop_values(values_dct, 'library_dirs'), + _pop_values(values_dct, 'libraries'), + _pop_values(values_dct, 'runtime_library_dirs'), + _pop_values(values_dct, 'extra_objects'), + _pop_values(values_dct, 'extra_compile_args'), + _pop_values(values_dct, 'extra_link_args'), + _pop_values(values_dct, 'export_symbols'), + _pop_values(values_dct, 'swig_opts'), + _pop_values(values_dct, 'depends'), + values_dct.pop('language', None), + values_dct.pop('optional', None), + 
**values_dct + )) def parse_config_files(self, filenames=None): if filenames is None: diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -20,17 +20,27 @@ # Ask for the dependencies. # Ask for the Requires-Dist # Ask for the Provides-Dist +# Ask for a description # Detect scripts (not sure how. #! outside of package?) import os import sys import re import shutil +import glob +import re from ConfigParser import RawConfigParser from textwrap import dedent +if sys.version_info[:2] < (2, 6): + from sets import Set as set +try: + from hashlib import md5 +except ImportError: + from md5 import md5 # importing this with an underscore as it should be replaced by the # dict form or another structures for all purposes from distutils2._trove import all_classifiers as _CLASSIFIERS_LIST +from distutils2._backport import sysconfig _FILENAME = 'setup.cfg' @@ -82,6 +92,10 @@ Optionally, you can set other trove identifiers for things such as the human language, programming language, user interface, etc... ''', + 'setup.py found':''' +The setup.py script will be executed to retrieve the metadata. +A wizard will be run if you answer "n", +''' } # XXX everything needs docstrings and tests (both low-level tests of various @@ -158,16 +172,18 @@ LICENCES = _build_licences(_CLASSIFIERS_LIST) - class MainProgram(object): def __init__(self): self.configparser = None - self.classifiers = {} + self.classifiers = set([]) self.data = {} self.data['classifier'] = self.classifiers self.data['packages'] = [] self.data['modules'] = [] + self.data['platform'] = [] + self.data['resources'] = [] self.data['extra_files'] = [] + self.data['scripts'] = [] self.load_config_file() def lookup_option(self, key): @@ -178,6 +194,7 @@ def load_config_file(self): self.configparser = RawConfigParser() # TODO replace with section in distutils config file + #XXX freedesktop self.configparser.read(os.path.expanduser('~/.mkcfg')) self.data['author'] = self.lookup_option('author') self.data['author_email'] = self.lookup_option('author_email') @@ -194,6 +211,7 @@ if not valuesDifferent: return + #XXX freedesktop fp = open(os.path.expanduser('~/.mkcfgpy'), 'w') try: self.configparser.write(fp) @@ -201,19 +219,118 @@ fp.close() def load_existing_setup_script(self): - raise NotImplementedError - # Ideas: - # - define a mock module to assign to sys.modules['distutils'] before - # importing the setup script as a module (or executing it); it would - # provide setup (a function that just returns its args as a dict), - # Extension (ditto), find_packages (the real function) - # - we could even mock Distribution and commands to handle more setup - # scripts - # - we could use a sandbox (http://bugs.python.org/issue8680) - # - the cleanest way is to parse the file, not import it, but there is - # no way to do that across versions (the compiler package is - # deprecated or removed in recent Pythons, the ast module is not - # present before 2.6) + """ Generate a setup.cfg from an existing setup.py. + + It only exports the distutils metadata (setuptools specific metadata + is not actually supported). 
+ """ + setuppath = 'setup.py' + if not os.path.exists(setuppath): + return + else: + ans = ask_yn(('A legacy setup.py has been found.\n' + 'Would you like to convert it to a setup.cfg ?'), + 'y', + _helptext['setup.py found']) + if ans != 'y': + return + + #_______mock setup start + data = self.data + def setup(**attrs): + """Mock the setup(**attrs) in order to retrive metadata.""" + # use the distutils v1 processings to correctly parse metadata. + #XXX we could also use the setuptools distibution ??? + from distutils.dist import Distribution + dist = Distribution(attrs) + dist.parse_config_files() + # 1. retrieves metadata that are quite similar PEP314<->PEP345 + labels = (('name',) * 2, + ('version',) * 2, + ('author',) * 2, + ('author_email',) * 2, + ('maintainer',) * 2, + ('maintainer_email',) * 2, + ('description', 'summary'), + ('long_description', 'description'), + ('url', 'home_page'), + ('platforms', 'platform'), + ('provides', 'provides-dist'), + ('obsoletes', 'obsoletes-dist'), + ('requires', 'requires-dist'),) + get = lambda lab: getattr(dist.metadata, lab.replace('-', '_')) + data.update((new, get(old)) for (old, new) in labels if get(old)) + # 2. retrieves data that requires special processings. + data['classifier'].update(dist.get_classifiers() or []) + data['scripts'].extend(dist.scripts or []) + data['packages'].extend(dist.packages or []) + data['modules'].extend(dist.py_modules or []) + # 2.1 data_files -> resources. + if len(dist.data_files) < 2 or isinstance(dist.data_files[1], str): + dist.data_files = [('', dist.data_files)] + #add tokens in the destination paths + vars = {'distribution.name':data['name']} + path_tokens = sysconfig.get_paths(vars=vars).items() + #sort tokens to use the longest one first + path_tokens.sort(cmp=lambda x,y: cmp(len(y), len(x)), + key=lambda x: x[1]) + for dest, srcs in (dist.data_files or []): + dest = os.path.join(sys.prefix, dest) + for tok, path in path_tokens: + if dest.startswith(path): + dest = ('{%s}' % tok) + dest[len(path):] + files = [('/ '.join(src.rsplit('/', 1)), dest) + for src in srcs] + data['resources'].extend(files) + continue + # 2.2 package_data -> extra_files + package_dirs = dist.package_dir or {} + for package, extras in dist.package_data.iteritems() or []: + package_dir = package_dirs.get(package, package) + fils = [os.path.join(package_dir, fil) for fil in extras] + data['extra_files'].extend(fils) + + # Use README file if its content is the desciption + if "description" in data: + ref = md5(re.sub('\s', '', self.data['description']).lower()) + ref = ref.digest() + for readme in glob.glob('README*'): + fob = open(readme) + val = md5(re.sub('\s', '', fob.read()).lower()).digest() + fob.close() + if val == ref: + del data['description'] + data['description-file'] = readme + break + #_________ mock setup end + + # apply monkey patch to distutils (v1) and setuptools (if needed) + # (abord the feature if distutils v1 has been killed) + try: + import distutils.core as DC + getattr(DC, 'setup') # ensure distutils v1 + except ImportError, AttributeError: + return + saved_setups = [(DC, DC.setup)] + DC.setup = setup + try: + import setuptools + saved_setups.append((setuptools, setuptools.setup)) + setuptools.setup = setup + except ImportError, AttributeError: + pass + # get metadata by executing the setup.py with the patched setup(...) 
+ success = False # for python < 2.4 + try: + pyenv = globals().copy() + execfile(setuppath, pyenv) + success = True + finally: #revert monkey patches + for patched_module, original_setup in saved_setups: + patched_module.setup = original_setup + if not self.data: + raise ValueError('Unable to load metadata from setup.py') + return success def inspect_file(self, path): fp = open(path, 'r') @@ -222,9 +339,11 @@ m = re.match(r'^#!.*python((?P\d)(\.\d+)?)?$', line) if m: if m.group('major') == '3': - self.classifiers['Programming Language :: Python :: 3'] = 1 + self.classifiers.add( + 'Programming Language :: Python :: 3') else: - self.classifiers['Programming Language :: Python :: 2'] = 1 + self.classifiers.add( + 'Programming Language :: Python :: 2') finally: fp.close() @@ -370,7 +489,7 @@ for key in sorted(trove): if len(trove[key]) == 0: if ask_yn('Add "%s"' % desc[4:] + ' :: ' + key, 'n') == 'y': - classifiers[desc[4:] + ' :: ' + key] = 1 + classifiers.add(desc[4:] + ' :: ' + key) continue if ask_yn('Do you want to set items under\n "%s" (%d sub-items)' @@ -421,7 +540,7 @@ print ("ERROR: Invalid selection, type a number from the list " "above.") - classifiers[_CLASSIFIERS_LIST[index]] = 1 + classifiers.add(_CLASSIFIERS_LIST[index]) return def set_devel_status(self, classifiers): @@ -448,7 +567,7 @@ 'Development Status :: 5 - Production/Stable', 'Development Status :: 6 - Mature', 'Development Status :: 7 - Inactive'][choice] - classifiers[key] = 1 + classifiers.add(key) return except (IndexError, ValueError): print ("ERROR: Invalid selection, type a single digit " @@ -475,28 +594,39 @@ fp = open(_FILENAME, 'w') try: fp.write('[metadata]\n') - fp.write('name = %s\n' % self.data['name']) - fp.write('version = %s\n' % self.data['version']) - fp.write('author = %s\n' % self.data['author']) - fp.write('author_email = %s\n' % self.data['author_email']) - fp.write('summary = %s\n' % self.data['summary']) - fp.write('home_page = %s\n' % self.data['home_page']) - fp.write('\n') - if len(self.data['classifier']) > 0: - classifiers = '\n'.join([' %s' % clas for clas in - self.data['classifier']]) - fp.write('classifier = %s\n' % classifiers.strip()) - fp.write('\n') - - fp.write('[files]\n') - for element in ('packages', 'modules', 'extra_files'): - if len(self.data[element]) == 0: + # simple string entries + for name in ('name', 'version', 'summary', 'download_url'): + fp.write('%s = %s\n' % (name, self.data.get(name, 'UNKNOWN'))) + # optional string entries + if 'keywords' in self.data and self.data['keywords']: + fp.write('keywords = %s\n' % ' '.join(self.data['keywords'])) + for name in ('home_page', 'author', 'author_email', + 'maintainer', 'maintainer_email', 'description-file'): + if name in self.data and self.data[name]: + fp.write('%s = %s\n' % (name, self.data[name])) + if 'description' in self.data: + fp.write( + 'description = %s\n' + % '\n |'.join(self.data['description'].split('\n'))) + # multiple use string entries + for name in ('platform', 'supported-platform', 'classifier', + 'requires-dist', 'provides-dist', 'obsoletes-dist', + 'requires-external'): + if not(name in self.data and self.data[name]): continue - items = '\n'.join([' %s' % item for item in - self.data[element]]) - fp.write('%s = %s\n' % (element, items.strip())) - - fp.write('\n') + fp.write('%s = ' % name) + fp.write(''.join(' %s\n' % val + for val in self.data[name]).lstrip()) + fp.write('\n[files]\n') + for name in ('packages', 'modules', 'scripts', + 'package_data', 'extra_files'): + if not(name in self.data and 
self.data[name]): + continue + fp.write('%s = %s\n' + % (name, '\n '.join(self.data[name]).strip())) + fp.write('\n[resources]\n') + for src, dest in self.data['resources']: + fp.write('%s = %s\n' % (src, dest)) finally: fp.close() @@ -508,11 +638,12 @@ """Main entry point.""" program = MainProgram() # uncomment when implemented - #program.load_existing_setup_script() - program.inspect_directory() - program.query_user() - program.update_config_file() + if not program.load_existing_setup_script(): + program.inspect_directory() + program.query_user() + program.update_config_file() program.write_setup_script() + # istutils2.util.generate_distutils_setup_py() if __name__ == '__main__': diff --git a/distutils2/tests/test_command_install_data.py b/distutils2/tests/test_command_install_data.py --- a/distutils2/tests/test_command_install_data.py +++ b/distutils2/tests/test_command_install_data.py @@ -14,16 +14,13 @@ cmd = install_data(dist) cmd.install_dir = inst = os.path.join(pkg_dir, 'inst') - # data_files can contain - # - simple files - # - a tuple with a path, and a list of file one = os.path.join(pkg_dir, 'one') self.write_file(one, 'xxx') inst2 = os.path.join(pkg_dir, 'inst2') two = os.path.join(pkg_dir, 'two') self.write_file(two, 'xxx') - cmd.data_files = [one, (inst2, [two])] + cmd.data_files = {'one' : '{appdata}/one', 'two' : '{appdata}/two'} self.assertEqual(cmd.get_inputs(), [one, (inst2, [two])]) # let's run the command diff --git a/distutils2/tests/test_command_sdist.py b/distutils2/tests/test_command_sdist.py --- a/distutils2/tests/test_command_sdist.py +++ b/distutils2/tests/test_command_sdist.py @@ -205,11 +205,11 @@ self.write_file((some_dir, 'file.txt'), '#') self.write_file((some_dir, 'other_file.txt'), '#') - dist.data_files = [('data', ['data/data.dt', - 'inroot.txt', - 'notexisting']), - 'some/file.txt', - 'some/other_file.txt'] + dist.data_files = {'data/data.dt' : '{appdata}/data.dt', + 'inroot.txt' : '{appdata}/inroot.txt', + 'notexisting' : '{appdata}/notexisting', + 'some/file.txt' : '{appdata}/file.txt', + 'some/other_file.txt' : '{appdata}/other_file.txt'} # adding a script script_dir = join(self.tmp_dir, 'scripts') diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -93,6 +93,30 @@ sub_commands = foo """ +# Can not be merged with SETUP_CFG else install_dist +# command will fail when trying to compile C sources +EXT_SETUP_CFG = """ +[files] +packages = one + two + +[extension=speed_coconuts] +name = one.speed_coconuts +sources = c_src/speed_coconuts.c +extra_link_args = `gcc -print-file-name=libgcc.a` -shared +define_macros = HAVE_CAIRO HAVE_GTK2 + +[extension=fast_taunt] +name = three.fast_taunt +sources = cxx_src/utils_taunt.cxx + cxx_src/python_module.cxx +include_dirs = /usr/include/gecode + /usr/include/blitz +extra_compile_args = -fPIC -O2 +language = cxx + +""" + class DCompiler(object): name = 'd' @@ -134,7 +158,14 @@ super(ConfigTestCase, self).setUp() self.addCleanup(setattr, sys, 'stdout', sys.stdout) self.addCleanup(setattr, sys, 'stderr', sys.stderr) + sys.stdout = sys.stderr = StringIO() + self.addCleanup(os.chdir, os.getcwd()) + tempdir = self.mkdtemp() + os.chdir(tempdir) + self.tempdir = tempdir + + self.addCleanup(setattr, sys, 'argv', sys.argv) def write_setup(self, kwargs=None): opts = {'description-file': 'README', 'extra-files':''} @@ -156,8 +187,6 @@ return dist def test_config(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) 
self.write_setup() self.write_file('README', 'yeah') os.mkdir('bm') @@ -239,9 +268,6 @@ def test_multiple_description_file(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_setup({'description-file': 'README CHANGES'}) self.write_file('README', 'yeah') self.write_file('CHANGES', 'changelog2') @@ -249,9 +275,6 @@ self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) def test_multiline_description_file(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) - self.write_setup({'description-file': 'README\n CHANGES'}) self.write_file('README', 'yeah') self.write_file('CHANGES', 'changelog') @@ -259,9 +282,28 @@ self.assertEqual(dist.metadata['description'], 'yeah\nchangelog') self.assertEqual(dist.metadata.requires_files, ['README', 'CHANGES']) + def test_parse_extensions_in_config(self): + self.write_file('setup.cfg', EXT_SETUP_CFG) + dist = self.run_setup('--version') + + ext_modules = dict((mod.name, mod) for mod in dist.ext_modules) + self.assertEqual(len(ext_modules), 2) + ext = ext_modules.get('one.speed_coconuts') + self.assertEqual(ext.sources, ['c_src/speed_coconuts.c']) + self.assertEqual(ext.define_macros, ['HAVE_CAIRO', 'HAVE_GTK2']) + self.assertEqual(ext.extra_link_args, + ['`gcc -print-file-name=libgcc.a`', '-shared']) + + ext = ext_modules.get('three.fast_taunt') + self.assertEqual(ext.sources, + ['cxx_src/utils_taunt.cxx', 'cxx_src/python_module.cxx']) + self.assertEqual(ext.include_dirs, + ['/usr/include/gecode', '/usr/include/blitz']) + self.assertEqual(ext.extra_compile_args, ['-fPIC', '-O2']) + self.assertEqual(ext.language, 'cxx') + + def test_metadata_requires_description_files_missing(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup({'description-file': 'README\n README2'}) self.write_file('README', 'yeah') self.write_file('README2', 'yeah') @@ -285,8 +327,6 @@ self.assertRaises(DistutilsFileError, cmd.make_distribution) def test_metadata_requires_description_files(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup({'description-file': 'README\n README2', 'extra-files':'\n README2'}) self.write_file('README', 'yeah') @@ -322,8 +362,6 @@ self.assertIn('README\nREADME2\n', open('MANIFEST').read()) def test_sub_commands(self): - tempdir = self.mkdtemp() - os.chdir(tempdir) self.write_setup() self.write_file('README', 'yeah') self.write_file('haven.py', '#') diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -1,10 +1,17 @@ +# -*- coding: utf-8 -*- """Tests for distutils.mkcfg.""" import os +import os.path as osp import sys import StringIO +if sys.version_info[:2] < (2, 6): + from sets import Set as set +from textwrap import dedent + from distutils2.tests import run_unittest, support, unittest from distutils2.mkcfg import MainProgram -from distutils2.mkcfg import ask_yn, ask +from distutils2.mkcfg import ask_yn, ask, main + class MkcfgTestCase(support.TempdirManager, unittest.TestCase): @@ -12,16 +19,20 @@ def setUp(self): super(MkcfgTestCase, self).setUp() self._stdin = sys.stdin - self._stdout = sys.stdout + self._stdout = sys.stdout sys.stdin = StringIO.StringIO() sys.stdout = StringIO.StringIO() - + self._cwd = os.getcwd() + self.wdir = self.mkdtemp() + os.chdir(self.wdir) + def tearDown(self): super(MkcfgTestCase, self).tearDown() sys.stdin = self._stdin sys.stdout = self._stdout - - def test_ask_yn(self): + os.chdir(self._cwd) + + def test_ask_yn(self): sys.stdin.write('y\n') sys.stdin.seek(0) 
self.assertEqual('y', ask_yn('is this a test')) @@ -40,13 +51,13 @@ main.data['author'] = [] main._set_multi('_set_multi test', 'author') self.assertEqual(['aaaaa'], main.data['author']) - + def test_find_files(self): # making sure we scan a project dir correctly main = MainProgram() # building the structure - tempdir = self.mkdtemp() + tempdir = self.wdir dirs = ['pkg1', 'data', 'pkg2', 'pkg2/sub'] files = ['README', 'setup.cfg', 'foo.py', 'pkg1/__init__.py', 'pkg1/bar.py', @@ -60,12 +71,7 @@ path = os.path.join(tempdir, file_) self.write_file(path, 'xxx') - old_dir = os.getcwd() - os.chdir(tempdir) - try: - main._find_files() - finally: - os.chdir(old_dir) + main._find_files() # do we have what we want ? self.assertEqual(main.data['packages'], ['pkg1', 'pkg2', 'pkg2.sub']) @@ -73,6 +79,132 @@ self.assertEqual(set(main.data['extra_files']), set(['setup.cfg', 'README', 'data/data1'])) + def test_convert_setup_py_to_cfg(self): + self.write_file((self.wdir, 'setup.py'), + dedent(""" + # -*- coding: utf-8 -*- + from distutils.core import setup + lg_dsc = '''My super Death-scription + barbar is now on the public domain, + ho, baby !''' + setup(name='pyxfoil', + version='0.2', + description='Python bindings for the Xfoil engine', + long_description = lg_dsc, + maintainer='Andr?? Espaze', + maintainer_email='andre.espaze at logilab.fr', + url='http://www.python-science.org/project/pyxfoil', + license='GPLv2', + packages=['pyxfoil', 'babar', 'me'], + data_files=[('share/doc/pyxfoil', ['README.rst']), + ('share/man', ['pyxfoil.1']), + ], + py_modules = ['my_lib', 'mymodule'], + package_dir = {'babar' : '', + 'me' : 'Martinique/Lamentin', + }, + package_data = {'babar': ['Pom', 'Flora', 'Alexander'], + 'me': ['dady', 'mumy', 'sys', 'bro'], + '': ['setup.py', 'README'], + 'pyxfoil' : ['fengine.so'], + }, + scripts = ['my_script', 'bin/run'], + ) + """)) + sys.stdin.write('y\n') + sys.stdin.seek(0) + main() + fid = open(osp.join(self.wdir, 'setup.cfg')) + lines = set([line.rstrip() for line in fid]) + fid.close() + raise Exception(lines) + self.assertEqual(lines, set(['', + '[metadata]', + 'version = 0.2', + 'name = pyxfoil', + 'maintainer = Andr?? Espaze', + 'description = My super Death-scription', + ' |barbar is now on the public domain,', + ' |ho, baby !', + 'maintainer_email = andre.espaze at logilab.fr', + 'home_page = http://www.python-science.org/project/pyxfoil', + 'download_url = UNKNOWN', + 'summary = Python bindings for the Xfoil engine', + '[files]', + 'modules = my_lib', + ' mymodule', + 'packages = pyxfoil', + ' babar', + ' me', + 'extra_files = Martinique/Lamentin/dady', + ' Martinique/Lamentin/mumy', + ' Martinique/Lamentin/sys', + ' Martinique/Lamentin/bro', + ' Pom', + ' Flora', + ' Alexander', + ' setup.py', + ' README', + ' pyxfoil/fengine.so', + 'scripts = my_script', + ' bin/run', + '[resources]', + 'README.rst = {doc}', + 'pyxfoil.1 = {man}', + ])) + + def test_convert_setup_py_to_cfg_with_description_in_readme(self): + self.write_file((self.wdir, 'setup.py'), + dedent(""" + # -*- coding: utf-8 -*- + from distutils.core import setup + lg_dsc = open('README.txt').read() + setup(name='pyxfoil', + version='0.2', + description='Python bindings for the Xfoil engine', + long_description=lg_dsc, + maintainer='Andr?? 
Espaze', + maintainer_email='andre.espaze at logilab.fr', + url='http://www.python-science.org/project/pyxfoil', + license='GPLv2', + packages=['pyxfoil'], + package_data={'pyxfoil' : ['fengine.so']}, + data_files=[ + ('share/doc/pyxfoil', ['README.rst']), + ('share/man', ['pyxfoil.1']), + ], + ) + """)) + self.write_file((self.wdir, 'README.txt'), + dedent(''' +My super Death-scription +barbar is now on the public domain, +ho, baby ! + ''')) + sys.stdin.write('y\n') + sys.stdin.seek(0) + main() + fid = open(osp.join(self.wdir, 'setup.cfg')) + lines = set([line.strip() for line in fid]) + fid.close() + self.assertEqual(lines, set(['', + '[metadata]', + 'version = 0.2', + 'name = pyxfoil', + 'maintainer = Andr?? Espaze', + 'maintainer_email = andre.espaze at logilab.fr', + 'home_page = http://www.python-science.org/project/pyxfoil', + 'download_url = UNKNOWN', + 'summary = Python bindings for the Xfoil engine', + 'description-file = README.txt', + '[files]', + 'packages = pyxfoil', + 'extra_files = pyxfoil/fengine.so', + '[resources]', + 'README.rst = {doc}', + 'pyxfoil.1 = {man}', + ])) + def test_suite(): return unittest.makeSuite(MkcfgTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: No file that does not exists can be present in distribution.data_files. Message-ID: tarek.ziade pushed 0b303234fe44 to distutils2: http://hg.python.org/distutils2/rev/0b303234fe44 changeset: 1058:0b303234fe44 user: FELD Boris date: Sun Jan 30 11:56:46 2011 +0100 summary: No file that does not exists can be present in distribution.data_files. Correct bugs in get_inputs in install_data. files: distutils2/command/install_data.py distutils2/tests/test_command_sdist.py diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -66,7 +66,7 @@ return self.data_files.keys() def get_inputs(self): - return self.data_files or [] + return self.data_files.keys() def get_outputs(self): return self.outfiles diff --git a/distutils2/tests/test_command_sdist.py b/distutils2/tests/test_command_sdist.py --- a/distutils2/tests/test_command_sdist.py +++ b/distutils2/tests/test_command_sdist.py @@ -207,7 +207,6 @@ dist.data_files = {'data/data.dt' : '{appdata}/data.dt', 'inroot.txt' : '{appdata}/inroot.txt', - 'notexisting' : '{appdata}/notexisting', 'some/file.txt' : '{appdata}/file.txt', 'some/other_file.txt' : '{appdata}/other_file.txt'} -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Update some unittest to the new way of declaring data_files. Message-ID: tarek.ziade pushed 225d826cf39e to distutils2: http://hg.python.org/distutils2/rev/225d826cf39e changeset: 1056:225d826cf39e user: FELD Boris date: Sat Jan 29 23:05:26 2011 +0100 summary: Update some unittest to the new way of declaring data_files. Remove implementation of old way to do it. Fix lots of bugs due to type errors. 
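To make the new declaration easier to follow: the old distutils-style data_files mixed plain
paths with (directory, [files]) tuples, while the new form maps every source file to a
destination template built from category tokens such as {icon}, {config} or {appdata}, which
the install machinery is presumably expected to expand to real paths. A minimal sketch of the
two forms, using only values taken from the tests in these changesets (the token expansion
itself is assumed, not shown):

    # old style: a list mixing plain file names and
    # (destination directory, [source files]) tuples
    data_files = [('bitmaps', ['bm/b1.gif', 'bm/b2.gif']),
                  ('config', ['cfg/data.cfg'])]

    # new style: one entry per source file, mapped to a destination
    # template; '{icon}', '{config}', ... are category tokens that the
    # install machinery is expected to resolve to real paths
    data_files = {'bm/b1.gif': '{icon}/b1.gif',
                  'bm/b2.gif': '{icon}/b2.gif',
                  'cfg/data.cfg': '{config}/data.cfg'}

Having exactly one entry per existing file is also why the sdist test above drops the
'notexisting' entry and why install_data.get_inputs() can simply return self.data_files.keys().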
files: distutils2/config.py distutils2/dist.py distutils2/tests/test_config.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -85,7 +85,7 @@ if v != ''] return value - def _read_setup_cfg(self, parser, filename): + def _read_setup_cfg(self, parser, cfg_filename): content = {} for section in parser.sections(): content[section] = dict(parser.items(section)) @@ -173,15 +173,6 @@ key, value = data self.dist.package_data[key.strip()] = value.strip() - self.dist.data_files = [] - for data in files.get('data_files', []): - data = data.split('=') - if len(data) != 2: - continue - key, value = data - values = [v.strip() for v in value.split(',')] - self.dist.data_files.append((key, values)) - # manifest template self.dist.extra_files = files.get('extra_files', []) @@ -199,7 +190,7 @@ destination = None resources.append((prefix, suffix, destination)) - dir = os.path.dirname(os.path.join(os.getcwd(), filename)) + dir = os.path.dirname(os.path.join(os.getcwd(), cfg_filename)) data_files = resources_dests(dir, resources) self.dist.data_files = data_files diff --git a/distutils2/dist.py b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -191,7 +191,7 @@ self.include_dirs = [] self.extra_path = None self.scripts = [] - self.data_files = [] + self.data_files = {} self.password = '' self.use_2to3 = False self.convert_2to3_doctests = [] diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -65,11 +65,6 @@ package_data = cheese = data/templates/* -data_files = - bitmaps = bm/b1.gif, bm/b2.gif - config = cfg/data.cfg - /etc/init.d = init-script - extra_files = %(extra-files)s # Replaces MANIFEST.in @@ -78,6 +73,11 @@ recursive-include examples *.txt *.py prune examples/sample?/build +[resources] +bm/ {b1,b2}.gif = {icon} +cfg/ data.cfg = {config} +init_script = {script} + [global] commands = distutils2.tests.test_config.FooBarBazTest @@ -160,6 +160,12 @@ os.chdir(tempdir) self.write_setup() self.write_file('README', 'yeah') + os.mkdir('bm') + self.write_file(os.path.join('bm', 'b1.gif'), '') + self.write_file(os.path.join('bm', 'b2.gif'), '') + os.mkdir('cfg') + self.write_file(os.path.join('cfg', 'data.cfg'), '') + self.write_file('init_script', '') # try to load the metadata now dist = self.run_setup('--version') @@ -205,9 +211,10 @@ self.assertEqual(dist.py_modules, ['haven']) self.assertEqual(dist.package_data, {'cheese': 'data/templates/*'}) self.assertEqual(dist.data_files, - [('bitmaps ', ['bm/b1.gif', 'bm/b2.gif']), - ('config ', ['cfg/data.cfg']), - ('/etc/init.d ', ['init-script'])]) + {'bm/b1.gif' : '{icon}/b1.gif', + 'bm/b2.gif' : '{icon}/b2.gif', + 'cfg/data.cfg' : '{config}/data.cfg', + 'init_script' : '{script}/init_script'}) self.assertEqual(dist.package_dir, 'src') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:57 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:57 +0100 Subject: [Python-checkins] distutils2: Merge with main branch Message-ID: tarek.ziade pushed 2968e9cec396 to distutils2: http://hg.python.org/distutils2/rev/2968e9cec396 changeset: 1054:2968e9cec396 parent: 1053:76c733a49022 parent: 949:f31564d7ed76 user: FELD Boris date: Sat Jan 29 18:07:13 2011 +0100 summary: Merge with main branch files: distutils2/_backport/pkgutil.py distutils2/config.py patch diff --git a/distutils2/_backport/__init__.py 
b/distutils2/_backport/__init__.py --- a/distutils2/_backport/__init__.py +++ b/distutils2/_backport/__init__.py @@ -1,8 +1,2 @@ """Things that will land in the Python 3.3 std lib but which we must drag along with us for now to support 2.x.""" - -def any(seq): - for elem in seq: - if elem: - return True - return False diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -1,24 +1,19 @@ """Utilities to support packages.""" -# NOTE: This module must remain compatible with Python 2.3, as it is shared -# by setuptools for distribution with Python 2.3 and up. - import os import sys import imp -import os.path +import re +import warnings from csv import reader as csv_reader from types import ModuleType from distutils2.errors import DistutilsError from distutils2.metadata import DistributionMetadata from distutils2.version import suggest_normalized_version, VersionPredicate -import zipimport try: import cStringIO as StringIO except ImportError: import StringIO -import re -import warnings __all__ = [ @@ -28,10 +23,14 @@ 'Distribution', 'EggInfoDistribution', 'distinfo_dirname', 'get_distributions', 'get_distribution', 'get_file_users', 'provides_distribution', 'obsoletes_distribution', - 'enable_cache', 'disable_cache', 'clear_cache' + 'enable_cache', 'disable_cache', 'clear_cache', ] +########################## +# PEP 302 Implementation # +########################## + def read_code(stream): # This helper is needed in order for the :pep:`302` emulation to # correctly handle compiled files @@ -41,7 +40,7 @@ if magic != imp.get_magic(): return None - stream.read(4) # Skip timestamp + stream.read(4) # Skip timestamp return marshal.load(stream) @@ -173,7 +172,6 @@ #@simplegeneric def iter_importer_modules(importer, prefix=''): - "" if not hasattr(importer, 'iter_modules'): return [] return importer.iter_modules(prefix) @@ -331,9 +329,9 @@ def get_filename(self, fullname=None): fullname = self._fix_name(fullname) mod_type = self.etc[2] - if self.etc[2] == imp.PKG_DIRECTORY: + if mod_type == imp.PKG_DIRECTORY: return self._get_delegate().get_filename() - elif self.etc[2] in (imp.PY_SOURCE, imp.PY_COMPILED, imp.C_EXTENSION): + elif mod_type in (imp.PY_SOURCE, imp.PY_COMPILED, imp.C_EXTENSION): return self.filename return None @@ -432,7 +430,8 @@ import mechanism will find the latter. Items of the following types can be affected by this discrepancy: - ``imp.C_EXTENSION, imp.PY_SOURCE, imp.PY_COMPILED, imp.PKG_DIRECTORY`` + :data:`imp.C_EXTENSION`, :data:`imp.PY_SOURCE`, :data:`imp.PY_COMPILED`, + :data:`imp.PKG_DIRECTORY` """ if fullname.startswith('.'): raise ImportError("Relative module names not supported") @@ -534,13 +533,13 @@ # frozen package. Return the path unchanged in that case. return path - pname = os.path.join(*name.split('.')) # Reconstitute as relative path + pname = os.path.join(*name.split('.')) # Reconstitute as relative path # Just in case os.extsep != '.' sname = os.extsep.join(name.split('.')) sname_pkg = sname + os.extsep + "pkg" init_py = "__init__" + os.extsep + "py" - path = path[:] # Start with a copy of the existing path + path = path[:] # Start with a copy of the existing path for dir in sys.path: if not isinstance(dir, basestring) or not os.path.isdir(dir): @@ -565,7 +564,7 @@ line = line.rstrip('\n') if not line or line.startswith('#'): continue - path.append(line) # Don't check for existence! + path.append(line) # Don't check for existence! 
f.close() return path @@ -609,6 +608,7 @@ resource_name = os.path.join(*parts) return loader.get_data(resource_name) + ########################## # PEP 376 Implementation # ########################## @@ -616,12 +616,12 @@ DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED', 'DATAFILES') # Cache -_cache_name = {} # maps names to Distribution instances -_cache_name_egg = {} # maps names to EggInfoDistribution instances -_cache_path = {} # maps paths to Distribution instances -_cache_path_egg = {} # maps paths to EggInfoDistribution instances -_cache_generated = False # indicates if .dist-info distributions are cached -_cache_generated_egg = False # indicates if .dist-info and .egg are cached +_cache_name = {} # maps names to Distribution instances +_cache_name_egg = {} # maps names to EggInfoDistribution instances +_cache_path = {} # maps paths to Distribution instances +_cache_path_egg = {} # maps paths to EggInfoDistribution instances +_cache_generated = False # indicates if .dist-info distributions are cached +_cache_generated_egg = False # indicates if .dist-info and .egg are cached _cache_enabled = True @@ -636,6 +636,7 @@ _cache_enabled = True + def disable_cache(): """ Disables the internal cache. @@ -647,9 +648,10 @@ _cache_enabled = False + def clear_cache(): """ Clears the internal cache. """ - global _cache_name, _cache_name_egg, cache_path, _cache_path_egg, \ + global _cache_name, _cache_name_egg, _cache_path, _cache_path_egg, \ _cache_generated, _cache_generated_egg _cache_name = {} @@ -660,14 +662,14 @@ _cache_generated_egg = False -def _yield_distributions(include_dist, include_egg): +def _yield_distributions(include_dist, include_egg, paths=sys.path): """ Yield .dist-info and .egg(-info) distributions, based on the arguments :parameter include_dist: yield .dist-info distributions :parameter include_egg: yield .egg(-info) distributions """ - for path in sys.path: + for path in paths: realpath = os.path.realpath(path) if not os.path.isdir(realpath): continue @@ -679,7 +681,7 @@ dir.endswith('.egg')): yield EggInfoDistribution(dist_path) -def _generate_cache(use_egg_info=False): +def _generate_cache(use_egg_info=False, paths=sys.path): global _cache_generated, _cache_generated_egg if _cache_generated_egg or (_cache_generated and not use_egg_info): @@ -688,7 +690,7 @@ gen_dist = not _cache_generated gen_egg = use_egg_info - for dist in _yield_distributions(gen_dist, gen_egg): + for dist in _yield_distributions(gen_dist, gen_egg, paths): if isinstance(dist, Distribution): _cache_path[dist.path] = dist if not dist.name in _cache_name: @@ -872,7 +874,8 @@ if isinstance(strs, basestring): for s in strs.splitlines(): s = s.strip() - if s and not s.startswith('#'): # skip blank lines/comments + # skip blank lines/comments + if s and not s.startswith('#'): yield s else: for ss in strs: @@ -890,6 +893,7 @@ except IOError: requires = None else: + # FIXME handle the case where zipfile is not available zipf = zipimport.zipimporter(path) fileobj = StringIO.StringIO(zipf.get_data('EGG-INFO/PKG-INFO')) self.metadata = DistributionMetadata(fileobj=fileobj) @@ -952,7 +956,7 @@ version = match.group('first') if match.group('rest'): version += match.group('rest') - version = version.replace(' ', '') # trim spaces + version = version.replace(' ', '') # trim spaces if version is None: reqs.append(name) else: @@ -982,12 +986,6 @@ __hash__ = object.__hash__ -def _normalize_dist_name(name): - """Returns a normalized name from the given *name*. 
- :rtype: string""" - return name.replace('-', '_') - - def distinfo_dirname(name, version): """ The *name* and *version* parameters are converted into their @@ -1007,7 +1005,7 @@ :returns: directory name :rtype: string""" file_extension = '.dist-info' - name = _normalize_dist_name(name) + name = name.replace('-', '_') normalized_version = suggest_normalized_version(version) # Because this is a lookup procedure, something will be returned even if # it is a version that cannot be normalized @@ -1017,7 +1015,7 @@ return '-'.join([name, normalized_version]) + file_extension -def get_distributions(use_egg_info=False): +def get_distributions(use_egg_info=False, paths=sys.path): """ Provides an iterator that looks for ``.dist-info`` directories in ``sys.path`` and returns :class:`Distribution` instances for each one of @@ -1028,10 +1026,10 @@ instances """ if not _cache_enabled: - for dist in _yield_distributions(True, use_egg_info): + for dist in _yield_distributions(True, use_egg_info, paths): yield dist else: - _generate_cache(use_egg_info) + _generate_cache(use_egg_info, paths) for dist in _cache_path.itervalues(): yield dist @@ -1041,7 +1039,7 @@ yield dist -def get_distribution(name, use_egg_info=False): +def get_distribution(name, use_egg_info=False, paths=sys.path): """ Scans all elements in ``sys.path`` and looks for all directories ending with ``.dist-info``. Returns a :class:`Distribution` @@ -1059,11 +1057,11 @@ :rtype: :class:`Distribution` or :class:`EggInfoDistribution` or None """ if not _cache_enabled: - for dist in _yield_distributions(True, use_egg_info): + for dist in _yield_distributions(True, use_egg_info, paths): if dist.name == name: return dist else: - _generate_cache(use_egg_info) + _generate_cache(use_egg_info, paths) if name in _cache_name: return _cache_name[name][0] @@ -1148,7 +1146,7 @@ raise DistutilsError(('Distribution %s has invalid ' + 'provides field: %s') \ % (dist.name, p)) - p_ver = p_ver[1:-1] # trim off the parenthesis + p_ver = p_ver[1:-1] # trim off the parenthesis if p_name == name and predicate.match(p_ver): yield dist break diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -1,4 +1,4 @@ -"""Utility functions for copying files and directory trees. +"""Utility functions for copying and archiving files and directory trees. XXX The functions here don't copy the resource fork or other metadata on Mac. 
@@ -9,7 +9,13 @@ import stat from os.path import abspath import fnmatch -from warnings import warn +import errno + +try: + import bz2 + _BZ2_SUPPORTED = True +except ImportError: + _BZ2_SUPPORTED = False try: from pwd import getpwnam @@ -21,9 +27,12 @@ except ImportError: getgrnam = None -__all__ = ["copyfileobj","copyfile","copymode","copystat","copy","copy2", - "copytree","move","rmtree","Error", "SpecialFileError", - "ExecError","make_archive"] +__all__ = ["copyfileobj", "copyfile", "copymode", "copystat", "copy", "copy2", + "copytree", "move", "rmtree", "Error", "SpecialFileError", + "ExecError", "make_archive", "get_archive_formats", + "register_archive_format", "unregister_archive_format", + "get_unpack_formats", "register_unpack_format", + "unregister_unpack_format", "unpack_archive"] class Error(EnvironmentError): pass @@ -35,6 +44,14 @@ class ExecError(EnvironmentError): """Raised when a command could not be executed""" +class ReadError(EnvironmentError): + """Raised when an archive cannot be read""" + +class RegistryError(Exception): + """Raised when a registery operation with the archiving + and unpacking registeries fails""" + + try: WindowsError except NameError: @@ -50,7 +67,7 @@ def _samefile(src, dst): # Macintosh, Unix. - if hasattr(os.path,'samefile'): + if hasattr(os.path, 'samefile'): try: return os.path.samefile(src, dst) except OSError: @@ -63,10 +80,8 @@ def copyfile(src, dst): """Copy data from src to dst""" if _samefile(src, dst): - raise Error, "`%s` and `%s` are the same file" % (src, dst) + raise Error("`%s` and `%s` are the same file" % (src, dst)) - fsrc = None - fdst = None for fn in [src, dst]: try: st = os.stat(fn) @@ -77,15 +92,16 @@ # XXX What about other special files? (sockets, devices...) if stat.S_ISFIFO(st.st_mode): raise SpecialFileError("`%s` is a named pipe" % fn) + + fsrc = open(src, 'rb') try: - fsrc = open(src, 'rb') fdst = open(dst, 'wb') - copyfileobj(fsrc, fdst) + try: + copyfileobj(fsrc, fdst) + finally: + fdst.close() finally: - if fdst: - fdst.close() - if fsrc: - fsrc.close() + fsrc.close() def copymode(src, dst): """Copy mode bits from src to dst""" @@ -103,8 +119,12 @@ if hasattr(os, 'chmod'): os.chmod(dst, mode) if hasattr(os, 'chflags') and hasattr(st, 'st_flags'): - os.chflags(dst, st.st_flags) - + try: + os.chflags(dst, st.st_flags) + except OSError, why: + if (not hasattr(errno, 'EOPNOTSUPP') or + why.errno != errno.EOPNOTSUPP): + raise def copy(src, dst): """Copy data and mode bits ("cp src dst"). @@ -140,8 +160,9 @@ return set(ignored_names) return _ignore_patterns -def copytree(src, dst, symlinks=False, ignore=None): - """Recursively copy a directory tree using copy2(). +def copytree(src, dst, symlinks=False, ignore=None, copy_function=copy2, + ignore_dangling_symlinks=False): + """Recursively copy a directory tree. The destination directory must not already exist. If exception(s) occur, an Error is raised with a list of reasons. @@ -149,7 +170,13 @@ If the optional symlinks flag is true, symbolic links in the source tree result in symbolic links in the destination tree; if it is false, the contents of the files pointed to by symbolic - links are copied. + links are copied. If the file pointed by the symlink doesn't + exist, an exception will be added in the list of errors raised in + an Error exception at the end of the copy process. + + You can set the optional ignore_dangling_symlinks flag to true if you + want to silence this exception. Notice that this has no effect on + platforms that don't support os.symlink. 
The optional ignore argument is a callable. If given, it is called with the `src` parameter, which is the directory @@ -163,7 +190,10 @@ list of names relative to the `src` directory that should not be copied. - XXX Consider this example code rather than the ultimate tool. + The optional copy_function argument is a callable that will be used + to copy each file. It will be called with the source path and the + destination path as arguments. By default, copy2() is used, but any + function that supports the same signature (like copy()) can be used. """ names = os.listdir(src) @@ -182,14 +212,21 @@ srcname = os.path.join(src, name) dstname = os.path.join(dst, name) try: - if symlinks and os.path.islink(srcname): + if os.path.islink(srcname): linkto = os.readlink(srcname) - os.symlink(linkto, dstname) + if symlinks: + os.symlink(linkto, dstname) + else: + # ignore dangling symlink if the flag is on + if not os.path.exists(linkto) and ignore_dangling_symlinks: + continue + # otherwise let the copy occurs. copy2 will raise an error + copy_function(srcname, dstname) elif os.path.isdir(srcname): - copytree(srcname, dstname, symlinks, ignore) + copytree(srcname, dstname, symlinks, ignore, copy_function) else: # Will raise a SpecialFileError for unsupported file types - copy2(srcname, dstname) + copy_function(srcname, dstname) # catch the Error from the recursive copytree so that we can # continue with other files except Error, err: @@ -205,7 +242,7 @@ else: errors.extend((src, dst, str(why))) if errors: - raise Error, errors + raise Error(errors) def rmtree(path, ignore_errors=False, onerror=None): """Recursively delete a directory tree. @@ -235,7 +272,7 @@ names = [] try: names = os.listdir(path) - except os.error, err: + except os.error: onerror(os.listdir, path, sys.exc_info()) for name in names: fullname = os.path.join(path, name) @@ -248,7 +285,7 @@ else: try: os.remove(fullname) - except os.error, err: + except os.error: onerror(os.remove, fullname, sys.exc_info()) try: os.rmdir(path) @@ -282,13 +319,13 @@ if os.path.isdir(dst): real_dst = os.path.join(dst, _basename(src)) if os.path.exists(real_dst): - raise Error, "Destination path '%s' already exists" % real_dst + raise Error("Destination path '%s' already exists" % real_dst) try: os.rename(src, real_dst) except OSError: if os.path.isdir(src): if _destinsrc(src, dst): - raise Error, "Cannot move a directory '%s' into itself '%s'." % (src, dst) + raise Error("Cannot move a directory '%s' into itself '%s'." % (src, dst)) copytree(src, real_dst, symlinks=True) rmtree(src) else: @@ -333,40 +370,41 @@ """Create a (possibly compressed) tar file from all the files under 'base_dir'. - 'compress' must be "gzip" (the default), "compress", "bzip2", or None. - (compress will be deprecated in Python 3.2) + 'compress' must be "gzip" (the default), "bzip2", or None. 'owner' and 'group' can be used to define an owner and a group for the archive that is being built. If not provided, the current owner and group will be used. The output tar file will be named 'base_dir' + ".tar", possibly plus - the appropriate compression extension (".gz", ".bz2" or ".Z"). + the appropriate compression extension (".gz", or ".bz2"). Returns the output filename. 
""" - tar_compression = {'gzip': 'gz', 'bzip2': 'bz2', None: '', 'compress': ''} - compress_ext = {'gzip': '.gz', 'bzip2': '.bz2', 'compress': '.Z'} + tar_compression = {'gzip': 'gz', None: ''} + compress_ext = {'gzip': '.gz'} + + if _BZ2_SUPPORTED: + tar_compression['bzip2'] = 'bz2' + compress_ext['bzip2'] = '.bz2' # flags for compression program, each element of list will be an argument if compress is not None and compress not in compress_ext: - raise ValueError, \ - ("bad value for 'compress': must be None, 'gzip', 'bzip2' " - "or 'compress'") + raise ValueError("bad value for 'compress', or compression format not " + "supported: %s" % compress) - archive_name = base_name + '.tar' - if compress != 'compress': - archive_name += compress_ext.get(compress, '') + archive_name = base_name + '.tar' + compress_ext.get(compress, '') + archive_dir = os.path.dirname(archive_name) - archive_dir = os.path.dirname(archive_name) if not os.path.exists(archive_dir): if logger is not None: - logger.info("creating %s" % archive_dir) + logger.info("creating %s", archive_dir) if not dry_run: os.makedirs(archive_dir) - # creating the tarball + # XXX late import because of circular dependency between shutil and + # tarfile :( from distutils2._backport import tarfile if logger is not None: @@ -391,23 +429,9 @@ finally: tar.close() - # compression using `compress` - # XXX this block will be removed in Python 3.2 - if compress == 'compress': - warn("'compress' will be deprecated.", PendingDeprecationWarning) - # the option varies depending on the platform - compressed_name = archive_name + compress_ext[compress] - if sys.platform == 'win32': - cmd = [compress, archive_name, compressed_name] - else: - cmd = [compress, '-f', archive_name] - from distutils2.spawn import spawn - spawn(cmd, dry_run=dry_run) - return compressed_name - return archive_name -def _call_external_zip(directory, verbose=False): +def _call_external_zip(base_dir, zip_filename, verbose=False, dry_run=False): # XXX see if we want to keep an external call here if verbose: zipoptions = "-r" @@ -420,8 +444,7 @@ except DistutilsExecError: # XXX really should distinguish between "couldn't find # external 'zip' command" and "zip failed". - raise ExecError, \ - ("unable to create zip file '%s': " + raise ExecError("unable to create zip file '%s': " "could neither import the 'zipfile' module nor " "find a standalone zip utility") % zip_filename @@ -451,7 +474,7 @@ zipfile = None if zipfile is None: - _call_external_zip(base_dir, verbose) + _call_external_zip(base_dir, zip_filename, verbose, dry_run) else: if logger is not None: logger.info("creating '%s' and adding '%s' to it", @@ -475,12 +498,14 @@ _ARCHIVE_FORMATS = { 'gztar': (_make_tarball, [('compress', 'gzip')], "gzip'ed tar-file"), 'bztar': (_make_tarball, [('compress', 'bzip2')], "bzip2'ed tar-file"), - 'ztar': (_make_tarball, [('compress', 'compress')], - "compressed tar file"), 'tar': (_make_tarball, [('compress', None)], "uncompressed tar file"), - 'zip': (_make_zipfile, [],"ZIP file") + 'zip': (_make_zipfile, [], "ZIP file"), } +if _BZ2_SUPPORTED: + _ARCHIVE_FORMATS['bztar'] = (_make_tarball, [('compress', 'bzip2')], + "bzip2'ed tar-file") + def get_archive_formats(): """Returns a list of supported formats for archiving and unarchiving. 
@@ -507,7 +532,7 @@ if not isinstance(extra_args, (tuple, list)): raise TypeError('extra_args needs to be a sequence') for element in extra_args: - if not isinstance(element, (tuple, list)) or len(element) !=2 : + if not isinstance(element, (tuple, list)) or len(element) !=2: raise TypeError('extra_args elements are : (arg_name, value)') _ARCHIVE_FORMATS[name] = (function, extra_args, description) @@ -520,7 +545,7 @@ """Create an archive file (eg. zip or tar). 'base_name' is the name of the file to create, minus any format-specific - extension; 'format' is the archive format: one of "zip", "tar", "ztar", + extension; 'format' is the archive format: one of "zip", "tar", "bztar" or "gztar". 'root_dir' is a directory that will be the root directory of the @@ -549,7 +574,7 @@ try: format_info = _ARCHIVE_FORMATS[format] except KeyError: - raise ValueError, "unknown archive format '%s'" % format + raise ValueError("unknown archive format '%s'" % format) func = format_info[0] for arg, val in format_info[1]: @@ -568,3 +593,169 @@ os.chdir(save_cwd) return filename + + +def get_unpack_formats(): + """Returns a list of supported formats for unpacking. + + Each element of the returned sequence is a tuple + (name, extensions, description) + """ + formats = [(name, info[0], info[3]) for name, info in + _UNPACK_FORMATS.iteritems()] + formats.sort() + return formats + +def _check_unpack_options(extensions, function, extra_args): + """Checks what gets registered as an unpacker.""" + # first make sure no other unpacker is registered for this extension + existing_extensions = {} + for name, info in _UNPACK_FORMATS.iteritems(): + for ext in info[0]: + existing_extensions[ext] = name + + for extension in extensions: + if extension in existing_extensions: + msg = '%s is already registered for "%s"' + raise RegistryError(msg % (extension, + existing_extensions[extension])) + + if not callable(function): + raise TypeError('The registered function must be a callable') + + +def register_unpack_format(name, extensions, function, extra_args=None, + description=''): + """Registers an unpack format. + + `name` is the name of the format. `extensions` is a list of extensions + corresponding to the format. + + `function` is the callable that will be + used to unpack archives. The callable will receive archives to unpack. + If it's unable to handle an archive, it needs to raise a ReadError + exception. + + If provided, `extra_args` is a sequence of + (name, value) tuples that will be passed as arguments to the callable. + description can be provided to describe the format, and will be returned + by the get_unpack_formats() function. 
+ """ + if extra_args is None: + extra_args = [] + _check_unpack_options(extensions, function, extra_args) + _UNPACK_FORMATS[name] = extensions, function, extra_args, description + +def unregister_unpack_format(name): + """Removes the pack format from the registery.""" + del _UNPACK_FORMATS[name] + +def _ensure_directory(path): + """Ensure that the parent directory of `path` exists""" + dirname = os.path.dirname(path) + if not os.path.isdir(dirname): + os.makedirs(dirname) + +def _unpack_zipfile(filename, extract_dir): + """Unpack zip `filename` to `extract_dir` + """ + try: + import zipfile + except ImportError: + raise ReadError('zlib not supported, cannot unpack this archive.') + + if not zipfile.is_zipfile(filename): + raise ReadError("%s is not a zip file" % filename) + + zip = zipfile.ZipFile(filename) + try: + for info in zip.infolist(): + name = info.filename + + # don't extract absolute paths or ones with .. in them + if name.startswith('/') or '..' in name: + continue + + target = os.path.join(extract_dir, *name.split('/')) + if not target: + continue + + _ensure_directory(target) + if not name.endswith('/'): + # file + data = zip.read(info.filename) + f = open(target, 'wb') + try: + f.write(data) + finally: + f.close() + del data + finally: + zip.close() + +def _unpack_tarfile(filename, extract_dir): + """Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir` + """ + from distutils2._backport import tarfile + try: + tarobj = tarfile.open(filename) + except tarfile.TarError: + raise ReadError( + "%s is not a compressed or uncompressed tar file" % filename) + try: + tarobj.extractall(extract_dir) + finally: + tarobj.close() + +_UNPACK_FORMATS = { + 'gztar': (['.tar.gz', '.tgz'], _unpack_tarfile, [], "gzip'ed tar-file"), + 'tar': (['.tar'], _unpack_tarfile, [], "uncompressed tar file"), + 'zip': (['.zip'], _unpack_zipfile, [], "ZIP file") + } + +if _BZ2_SUPPORTED: + _UNPACK_FORMATS['bztar'] = (['.bz2'], _unpack_tarfile, [], + "bzip2'ed tar-file") + +def _find_unpack_format(filename): + for name, info in _UNPACK_FORMATS.iteritems(): + for extension in info[0]: + if filename.endswith(extension): + return name + return None + +def unpack_archive(filename, extract_dir=None, format=None): + """Unpack an archive. + + `filename` is the name of the archive. + + `extract_dir` is the name of the target directory, where the archive + is unpacked. If not provided, the current working directory is used. + + `format` is the archive format: one of "zip", "tar", or "gztar". Or any + other registered format. If not provided, unpack_archive will use the + filename extension and see if an unpacker was registered for that + extension. + + In case none is found, a ValueError is raised. 
+ """ + if extract_dir is None: + extract_dir = os.getcwd() + + if format is not None: + try: + format_info = _UNPACK_FORMATS[format] + except KeyError: + raise ValueError("Unknown unpack format '{0}'".format(format)) + + func = format_info[0] + func(filename, extract_dir, **dict(format_info[1])) + else: + # we need to look at the registered unpackers supported extensions + format = _find_unpack_format(filename) + if format is None: + raise ReadError("Unknown archive format '{0}'".format(filename)) + + func = _UNPACK_FORMATS[format][1] + kwargs = dict(_UNPACK_FORMATS[format][2]) + raise ValueError('Unknown archive format: %s' % filename) diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -12,10 +12,15 @@ except ImportError: from distutils2._backport.hashlib import md5 -from test.test_support import TESTFN +from distutils2.errors import DistutilsError +from distutils2.metadata import DistributionMetadata +from distutils2.tests import unittest, run_unittest, support -from distutils2.tests import unittest, run_unittest, support from distutils2._backport import pkgutil +from distutils2._backport.pkgutil import ( + Distribution, EggInfoDistribution, get_distribution, get_distributions, + provides_distribution, obsoletes_distribution, get_file_users, + distinfo_dirname, _yield_distributions) try: from os.path import relpath @@ -108,6 +113,12 @@ self.assertEqual(res1, RESOURCE_DATA) res2 = pkgutil.get_data(pkg, 'sub/res.txt') self.assertEqual(res2, RESOURCE_DATA) + + names = [] + for loader, name, ispkg in pkgutil.iter_modules([zip_file]): + names.append(name) + self.assertEqual(names, ['test_getdata_zipfile']) + del sys.path[0] del sys.modules[pkg] @@ -205,7 +216,7 @@ record_writer.writerow(record_pieces( os.path.join(distinfo_dir, file))) record_writer.writerow([relpath(record_file, sys.prefix)]) - del record_writer # causes the RECORD file to close + del record_writer # causes the RECORD file to close record_reader = csv.reader(open(record_file, 'rb')) record_data = [] for row in record_reader: @@ -225,9 +236,6 @@ def test_instantiation(self): # Test the Distribution class's instantiation provides us with usable # attributes. - # Import the Distribution class - from distutils2._backport.pkgutil import distinfo_dirname, Distribution - here = os.path.abspath(os.path.dirname(__file__)) name = 'choxie' version = '2.0.0.9' @@ -236,7 +244,6 @@ dist = Distribution(dist_path) self.assertEqual(dist.name, name) - from distutils2.metadata import DistributionMetadata self.assertTrue(isinstance(dist.metadata, DistributionMetadata)) self.assertEqual(dist.metadata['version'], version) self.assertTrue(isinstance(dist.requested, type(bool()))) @@ -244,7 +251,6 @@ def test_installed_files(self): # Test the iteration of installed files. # Test the distribution's installed files - from distutils2._backport.pkgutil import Distribution for distinfo_dir in self.distinfo_dirs: dist = Distribution(distinfo_dir) for path, md5_, size in dist.get_installed_files(): @@ -267,14 +273,12 @@ false_path = relpath(os.path.join(*false_path), sys.prefix) # Test if the distribution uses the file in question - from distutils2._backport.pkgutil import Distribution dist = Distribution(distinfo_dir) self.assertTrue(dist.uses(true_path)) self.assertFalse(dist.uses(false_path)) def test_get_distinfo_file(self): # Test the retrieval of dist-info file objects. 
- from distutils2._backport.pkgutil import Distribution distinfo_name = 'choxie-2.0.0.9' other_distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, @@ -295,7 +299,6 @@ # Is it the correct file? self.assertEqual(value.name, os.path.join(distinfo_dir, distfile)) - from distutils2.errors import DistutilsError # Test an absolute path that is part of another distributions dist-info other_distinfo_file = os.path.join(self.fake_dists_path, other_distinfo_name + '.dist-info', 'REQUESTED') @@ -307,7 +310,6 @@ def test_get_distinfo_files(self): # Test for the iteration of RECORD path entries. - from distutils2._backport.pkgutil import Distribution distinfo_name = 'towel_stuff-0.1' distinfo_dir = os.path.join(self.fake_dists_path, distinfo_name + '.dist-info') @@ -345,7 +347,7 @@ # Given a name and a version, we expect the distinfo_dirname function # to return a standard distribution information directory name. - items = [# (name, version, standard_dirname) + items = [ # (name, version, standard_dirname) # Test for a very simple single word name and decimal # version number ('docutils', '0.5', 'docutils-0.5.dist-info'), @@ -356,9 +358,6 @@ ('python-ldap', '2.5 a---5', 'python_ldap-2.5 a---5.dist-info'), ] - # Import the function in question - from distutils2._backport.pkgutil import distinfo_dirname - # Loop through the items to validate the results for name, version, standard_dirname in items: dirname = distinfo_dirname(name, version) @@ -371,11 +370,6 @@ ('towel-stuff', '0.1')] found_dists = [] - # Import the function in question - from distutils2._backport.pkgutil import get_distributions, \ - Distribution, \ - EggInfoDistribution - # Verify the fake dists have been found. dists = [dist for dist in get_distributions()] for dist in dists: @@ -416,12 +410,7 @@ def test_get_distribution(self): # Test for looking up a distribution by name. # Test the lookup of the towel-stuff distribution - name = 'towel-stuff' # Note: This is different from the directory name - - # Import the function in question - from distutils2._backport.pkgutil import get_distribution, \ - Distribution, \ - EggInfoDistribution + name = 'towel-stuff' # Note: This is different from the directory name # Lookup the distribution dist = get_distribution(name) @@ -461,7 +450,6 @@ def test_get_file_users(self): # Test the iteration of distributions that use a file. 
- from distutils2._backport.pkgutil import get_file_users, Distribution name = 'towel_stuff-0.1' path = os.path.join(self.fake_dists_path, name, 'towel_stuff', '__init__.py') @@ -471,9 +459,6 @@ def test_provides(self): # Test for looking up distributions by what they provide - from distutils2._backport.pkgutil import provides_distribution - from distutils2.errors import DistutilsError - checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) l = [dist.name for dist in provides_distribution('truffles')] @@ -522,12 +507,10 @@ use_egg_info=True)] checkLists(l, ['strawberry']) - l = [dist.name for dist in provides_distribution('strawberry', '>0.6', use_egg_info=True)] checkLists(l, []) - l = [dist.name for dist in provides_distribution('banana', '0.4', use_egg_info=True)] checkLists(l, ['banana']) @@ -536,16 +519,12 @@ use_egg_info=True)] checkLists(l, ['banana']) - l = [dist.name for dist in provides_distribution('banana', '!=0.4', use_egg_info=True)] checkLists(l, []) def test_obsoletes(self): # Test looking for distributions based on what they obsolete - from distutils2._backport.pkgutil import obsoletes_distribution - from distutils2.errors import DistutilsError - checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) l = [dist.name for dist in obsoletes_distribution('truffles', '1.0')] @@ -555,7 +534,6 @@ use_egg_info=True)] checkLists(l, ['cheese', 'bacon']) - l = [dist.name for dist in obsoletes_distribution('truffles', '0.8')] checkLists(l, ['choxie']) @@ -575,7 +553,6 @@ def test_yield_distribution(self): # tests the internal function _yield_distributions - from distutils2._backport.pkgutil import _yield_distributions checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), diff --git a/distutils2/_backport/tests/test_shutil.py b/distutils2/_backport/tests/test_shutil.py new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/test_shutil.py @@ -0,0 +1,945 @@ +import os +import sys +import tempfile +import stat +import tarfile +from os.path import splitdrive +from StringIO import StringIO + +from distutils.spawn import find_executable, spawn +from distutils2._backport import shutil +from distutils2._backport.shutil import ( + _make_tarball, _make_zipfile, make_archive, unpack_archive, + register_archive_format, unregister_archive_format, get_archive_formats, + register_unpack_format, unregister_unpack_format, get_unpack_formats, + Error, RegistryError) + +from distutils2.tests import unittest, support, TESTFN + +try: + import bz2 + BZ2_SUPPORTED = True +except ImportError: + BZ2_SUPPORTED = False + +TESTFN2 = TESTFN + "2" + +try: + import grp + import pwd + UID_GID_SUPPORT = True +except ImportError: + UID_GID_SUPPORT = False + +try: + import zlib +except ImportError: + zlib = None + +try: + import zipfile + ZIP_SUPPORT = True +except ImportError: + ZIP_SUPPORT = find_executable('zip') + +class TestShutil(unittest.TestCase): + + def setUp(self): + super(TestShutil, self).setUp() + self.tempdirs = [] + + def tearDown(self): + super(TestShutil, self).tearDown() + while self.tempdirs: + d = self.tempdirs.pop() + shutil.rmtree(d, os.name in ('nt', 'cygwin')) + + def write_file(self, path, content='xxx'): + """Writes a file in the given path. + + + path can be a string or a sequence. 
+ """ + if isinstance(path, (list, tuple)): + path = os.path.join(*path) + f = open(path, 'w') + try: + f.write(content) + finally: + f.close() + + def mkdtemp(self): + """Create a temporary directory that will be cleaned up. + + Returns the path of the directory. + """ + d = tempfile.mkdtemp() + self.tempdirs.append(d) + return d + + def test_rmtree_errors(self): + # filename is guaranteed not to exist + filename = tempfile.mktemp() + self.assertRaises(OSError, shutil.rmtree, filename) + + # See bug #1071513 for why we don't run this on cygwin + # and bug #1076467 for why we don't run this as root. + if (hasattr(os, 'chmod') and sys.platform[:6] != 'cygwin' + and not (hasattr(os, 'geteuid') and os.geteuid() == 0)): + def test_on_error(self): + self.errorState = 0 + os.mkdir(TESTFN) + self.childpath = os.path.join(TESTFN, 'a') + f = open(self.childpath, 'w') + f.close() + old_dir_mode = os.stat(TESTFN).st_mode + old_child_mode = os.stat(self.childpath).st_mode + # Make unwritable. + os.chmod(self.childpath, stat.S_IREAD) + os.chmod(TESTFN, stat.S_IREAD) + + shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) + # Test whether onerror has actually been called. + self.assertEqual(self.errorState, 2, + "Expected call to onerror function did not happen.") + + # Make writable again. + os.chmod(TESTFN, old_dir_mode) + os.chmod(self.childpath, old_child_mode) + + # Clean up. + shutil.rmtree(TESTFN) + + def check_args_to_onerror(self, func, arg, exc): + # test_rmtree_errors deliberately runs rmtree + # on a directory that is chmod 400, which will fail. + # This function is run when shutil.rmtree fails. + # 99.9% of the time it initially fails to remove + # a file in the directory, so the first time through + # func is os.remove. + # However, some Linux machines running ZFS on + # FUSE experienced a failure earlier in the process + # at os.listdir. The first failure may legally + # be either. + if self.errorState == 0: + if func is os.remove: + self.assertEqual(arg, self.childpath) + else: + self.assertIs(func, os.listdir, + "func must be either os.remove or os.listdir") + self.assertEqual(arg, TESTFN) + self.assertTrue(issubclass(exc[0], OSError)) + self.errorState = 1 + else: + self.assertEqual(func, os.rmdir) + self.assertEqual(arg, TESTFN) + self.assertTrue(issubclass(exc[0], OSError)) + self.errorState = 2 + + def test_rmtree_dont_delete_file(self): + # When called on a file instead of a directory, don't delete it. 
+ handle, path = tempfile.mkstemp() + os.fdopen(handle).close() + self.assertRaises(OSError, shutil.rmtree, path) + os.remove(path) + + def _write_data(self, path, data): + f = open(path, "w") + f.write(data) + f.close() + + def test_copytree_simple(self): + + def read_data(path): + f = open(path) + data = f.read() + f.close() + return data + + src_dir = tempfile.mkdtemp() + dst_dir = os.path.join(tempfile.mkdtemp(), 'destination') + self._write_data(os.path.join(src_dir, 'test.txt'), '123') + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + + try: + shutil.copytree(src_dir, dst_dir) + self.assertTrue(os.path.isfile(os.path.join(dst_dir, 'test.txt'))) + self.assertTrue(os.path.isdir(os.path.join(dst_dir, 'test_dir'))) + self.assertTrue(os.path.isfile(os.path.join(dst_dir, 'test_dir', + 'test.txt'))) + actual = read_data(os.path.join(dst_dir, 'test.txt')) + self.assertEqual(actual, '123') + actual = read_data(os.path.join(dst_dir, 'test_dir', 'test.txt')) + self.assertEqual(actual, '456') + finally: + for path in ( + os.path.join(src_dir, 'test.txt'), + os.path.join(dst_dir, 'test.txt'), + os.path.join(src_dir, 'test_dir', 'test.txt'), + os.path.join(dst_dir, 'test_dir', 'test.txt'), + ): + if os.path.exists(path): + os.remove(path) + for path in (src_dir, + os.path.dirname(dst_dir) + ): + if os.path.exists(path): + shutil.rmtree(path) + + def test_copytree_with_exclude(self): + + def read_data(path): + f = open(path) + data = f.read() + f.close() + return data + + # creating data + join = os.path.join + exists = os.path.exists + src_dir = tempfile.mkdtemp() + try: + dst_dir = join(tempfile.mkdtemp(), 'destination') + self._write_data(join(src_dir, 'test.txt'), '123') + self._write_data(join(src_dir, 'test.tmp'), '123') + os.mkdir(join(src_dir, 'test_dir')) + self._write_data(join(src_dir, 'test_dir', 'test.txt'), '456') + os.mkdir(join(src_dir, 'test_dir2')) + self._write_data(join(src_dir, 'test_dir2', 'test.txt'), '456') + os.mkdir(join(src_dir, 'test_dir2', 'subdir')) + os.mkdir(join(src_dir, 'test_dir2', 'subdir2')) + self._write_data(join(src_dir, 'test_dir2', 'subdir', 'test.txt'), + '456') + self._write_data(join(src_dir, 'test_dir2', 'subdir2', 'test.py'), + '456') + + + # testing glob-like patterns + try: + patterns = shutil.ignore_patterns('*.tmp', 'test_dir2') + shutil.copytree(src_dir, dst_dir, ignore=patterns) + # checking the result: some elements should not be copied + self.assertTrue(exists(join(dst_dir, 'test.txt'))) + self.assertTrue(not exists(join(dst_dir, 'test.tmp'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2'))) + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + try: + patterns = shutil.ignore_patterns('*.tmp', 'subdir*') + shutil.copytree(src_dir, dst_dir, ignore=patterns) + # checking the result: some elements should not be copied + self.assertTrue(not exists(join(dst_dir, 'test.tmp'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir2'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir'))) + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + + # testing callable-style + try: + def _filter(src, names): + res = [] + for name in names: + path = os.path.join(src, name) + + if (os.path.isdir(path) and + path.split()[-1] == 'subdir'): + res.append(name) + elif os.path.splitext(path)[-1] in ('.py'): + res.append(name) + return res + + shutil.copytree(src_dir, dst_dir, ignore=_filter) + + # checking the result: some elements 
should not be copied + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir2', + 'test.py'))) + self.assertTrue(not exists(join(dst_dir, 'test_dir2', 'subdir'))) + + finally: + if os.path.exists(dst_dir): + shutil.rmtree(dst_dir) + finally: + shutil.rmtree(src_dir) + shutil.rmtree(os.path.dirname(dst_dir)) + + @support.skip_unless_symlink + def test_dont_copy_file_onto_link_to_itself(self): + # bug 851123. + os.mkdir(TESTFN) + src = os.path.join(TESTFN, 'cheese') + dst = os.path.join(TESTFN, 'shop') + try: + f = open(src, 'w') + f.write('cheddar') + f.close() + + if hasattr(os, "link"): + os.link(src, dst) + self.assertRaises(shutil.Error, shutil.copyfile, src, dst) + f = open(src, 'r') + try: + self.assertEqual(f.read(), 'cheddar') + finally: + f.close() + os.remove(dst) + + # Using `src` here would mean we end up with a symlink pointing + # to TESTFN/TESTFN/cheese, while it should point at + # TESTFN/cheese. + os.symlink('cheese', dst) + self.assertRaises(shutil.Error, shutil.copyfile, src, dst) + f = open(src, 'r') + try: + self.assertEqual(f.read(), 'cheddar') + finally: + f.close() + os.remove(dst) + finally: + try: + shutil.rmtree(TESTFN) + except OSError: + pass + + @support.skip_unless_symlink + def test_rmtree_on_symlink(self): + # bug 1669. + os.mkdir(TESTFN) + try: + src = os.path.join(TESTFN, 'cheese') + dst = os.path.join(TESTFN, 'shop') + os.mkdir(src) + os.symlink(src, dst) + self.assertRaises(OSError, shutil.rmtree, dst) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + if hasattr(os, "mkfifo"): + # Issue #3002: copyfile and copytree block indefinitely on named pipes + def test_copyfile_named_pipe(self): + os.mkfifo(TESTFN) + try: + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, TESTFN, TESTFN2) + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, __file__, TESTFN) + finally: + os.remove(TESTFN) + + @unittest.skipUnless(hasattr(os, 'mkfifo'), 'requires os.mkfifo') + def test_copytree_named_pipe(self): + os.mkdir(TESTFN) + try: + subdir = os.path.join(TESTFN, "subdir") + os.mkdir(subdir) + pipe = os.path.join(subdir, "mypipe") + os.mkfifo(pipe) + try: + shutil.copytree(TESTFN, TESTFN2) + except shutil.Error, e: + errors = e.args[0] + self.assertEqual(len(errors), 1) + src, dst, error_msg = errors[0] + self.assertEqual("`%s` is a named pipe" % pipe, error_msg) + else: + self.fail("shutil.Error should have been raised") + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + shutil.rmtree(TESTFN2, ignore_errors=True) + + def test_copytree_special_func(self): + + src_dir = self.mkdtemp() + dst_dir = os.path.join(self.mkdtemp(), 'destination') + self._write_data(os.path.join(src_dir, 'test.txt'), '123') + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + + copied = [] + def _copy(src, dst): + copied.append((src, dst)) + + shutil.copytree(src_dir, dst_dir, copy_function=_copy) + self.assertEquals(len(copied), 2) + + @support.skip_unless_symlink + def test_copytree_dangling_symlinks(self): + + # a dangling symlink raises an error at the end + src_dir = self.mkdtemp() + dst_dir = os.path.join(self.mkdtemp(), 'destination') + os.symlink('IDONTEXIST', os.path.join(src_dir, 'test.txt')) + os.mkdir(os.path.join(src_dir, 'test_dir')) + self._write_data(os.path.join(src_dir, 'test_dir', 'test.txt'), '456') + self.assertRaises(Error, shutil.copytree, src_dir, dst_dir) + + # a dangling symlink is ignored with the proper flag + dst_dir = os.path.join(self.mkdtemp(), 
'destination2') + shutil.copytree(src_dir, dst_dir, ignore_dangling_symlinks=True) + self.assertNotIn('test.txt', os.listdir(dst_dir)) + + # a dangling symlink is copied if symlinks=True + dst_dir = os.path.join(self.mkdtemp(), 'destination3') + shutil.copytree(src_dir, dst_dir, symlinks=True) + self.assertIn('test.txt', os.listdir(dst_dir)) + + @unittest.skipUnless(zlib, "requires zlib") + def test_make_tarball(self): + # creating something to tar + tmpdir = self.mkdtemp() + self.write_file([tmpdir, 'file1'], 'xxx') + self.write_file([tmpdir, 'file2'], 'xxx') + os.mkdir(os.path.join(tmpdir, 'sub')) + self.write_file([tmpdir, 'sub', 'file3'], 'xxx') + + tmpdir2 = self.mkdtemp() + unittest.skipUnless(splitdrive(tmpdir)[0] == splitdrive(tmpdir2)[0], + "source and target should be on same drive") + + base_name = os.path.join(tmpdir2, 'archive') + + # working with relative paths to avoid tar warnings + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(splitdrive(base_name)[1], '.') + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + tarball = base_name + '.tar.gz' + self.assertTrue(os.path.exists(tarball)) + + # trying an uncompressed one + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(splitdrive(base_name)[1], '.', compress=None) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + def _tarinfo(self, path): + tar = tarfile.open(path) + try: + names = tar.getnames() + names.sort() + return tuple(names) + finally: + tar.close() + + def _create_files(self): + # creating something to tar + tmpdir = self.mkdtemp() + dist = os.path.join(tmpdir, 'dist') + os.mkdir(dist) + self.write_file([dist, 'file1'], 'xxx') + self.write_file([dist, 'file2'], 'xxx') + os.mkdir(os.path.join(dist, 'sub')) + self.write_file([dist, 'sub', 'file3'], 'xxx') + os.mkdir(os.path.join(dist, 'sub2')) + tmpdir2 = self.mkdtemp() + base_name = os.path.join(tmpdir2, 'archive') + return tmpdir, tmpdir2, base_name + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(find_executable('tar') and find_executable('gzip'), + 'Need the tar command to run') + def test_tarfile_vs_tar(self): + tmpdir, tmpdir2, base_name = self._create_files() + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist') + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + tarball = base_name + '.tar.gz' + self.assertTrue(os.path.exists(tarball)) + + # now create another tarball using `tar` + tarball2 = os.path.join(tmpdir, 'archive2.tar.gz') + tar_cmd = ['tar', '-cf', 'archive2.tar', 'dist'] + gzip_cmd = ['gzip', '-f9', 'archive2.tar'] + old_dir = os.getcwd() + old_stdout = sys.stdout + os.chdir(tmpdir) + sys.stdout = StringIO() + + try: + spawn(tar_cmd) + spawn(gzip_cmd) + finally: + os.chdir(old_dir) + sys.stdout = old_stdout + + self.assertTrue(os.path.exists(tarball2)) + # let's compare both tarballs + self.assertEquals(self._tarinfo(tarball), self._tarinfo(tarball2)) + + # trying an uncompressed one + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist', compress=None) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + # now for a dry_run + base_name = os.path.join(tmpdir2, 'archive') + old_dir = os.getcwd() + os.chdir(tmpdir) + try: + _make_tarball(base_name, 'dist', compress=None, 
dry_run=True) + finally: + os.chdir(old_dir) + tarball = base_name + '.tar' + self.assertTrue(os.path.exists(tarball)) + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(ZIP_SUPPORT, 'Need zip support to run') + def test_make_zipfile(self): + # creating something to tar + tmpdir = self.mkdtemp() + self.write_file([tmpdir, 'file1'], 'xxx') + self.write_file([tmpdir, 'file2'], 'xxx') + + tmpdir2 = self.mkdtemp() + base_name = os.path.join(tmpdir2, 'archive') + _make_zipfile(base_name, tmpdir) + + # check if the compressed tarball was created + tarball = base_name + '.zip' + self.assertTrue(os.path.exists(tarball)) + + + def test_make_archive(self): + tmpdir = self.mkdtemp() + base_name = os.path.join(tmpdir, 'archive') + self.assertRaises(ValueError, make_archive, base_name, 'xxx') + + @unittest.skipUnless(zlib, "Requires zlib") + def test_make_archive_owner_group(self): + # testing make_archive with owner and group, with various combinations + # this works even if there's not gid/uid support + if UID_GID_SUPPORT: + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + else: + group = owner = 'root' + + base_dir, root_dir, base_name = self._create_files() + base_name = os.path.join(self.mkdtemp() , 'archive') + res = make_archive(base_name, 'zip', root_dir, base_dir, owner=owner, + group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'zip', root_dir, base_dir) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner=owner, group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner='kjhkjhkjg', group='oihohoh') + self.assertTrue(os.path.exists(res)) + + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(UID_GID_SUPPORT, "Requires grp and pwd support") + def test_tarfile_root_owner(self): + tmpdir, tmpdir2, base_name = self._create_files() + old_dir = os.getcwd() + os.chdir(tmpdir) + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + try: + archive_name = _make_tarball(base_name, 'dist', compress=None, + owner=owner, group=group) + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + self.assertTrue(os.path.exists(archive_name)) + + # now checks the rights + archive = tarfile.open(archive_name) + try: + for member in archive.getmembers(): + self.assertEquals(member.uid, 0) + self.assertEquals(member.gid, 0) + finally: + archive.close() + + def test_make_archive_cwd(self): + current_dir = os.getcwd() + def _breaks(*args, **kw): + raise RuntimeError() + + register_archive_format('xxx', _breaks, [], 'xxx file') + try: + try: + make_archive('xxx', 'xxx', root_dir=self.mkdtemp()) + except Exception: + pass + self.assertEquals(os.getcwd(), current_dir) + finally: + unregister_archive_format('xxx') + + def test_register_archive_format(self): + + self.assertRaises(TypeError, register_archive_format, 'xxx', 1) + self.assertRaises(TypeError, register_archive_format, 'xxx', lambda: x, + 1) + self.assertRaises(TypeError, register_archive_format, 'xxx', lambda: x, + [(1, 2), (1, 2, 3)]) + + register_archive_format('xxx', lambda: x, [(1, 2)], 'xxx file') + formats = [name for name, params in get_archive_formats()] + self.assertIn('xxx', formats) + + unregister_archive_format('xxx') + formats = [name for name, params in get_archive_formats()] + self.assertNotIn('xxx', formats) + + def _compare_dirs(self, dir1, dir2): + # check that dir1 and dir2 are equivalent, + # return the diff + diff = 
[] + for root, dirs, files in os.walk(dir1): + for file_ in files: + path = os.path.join(root, file_) + target_path = os.path.join(dir2, os.path.split(path)[-1]) + if not os.path.exists(target_path): + diff.append(file_) + return diff + + @unittest.skipUnless(zlib, "Requires zlib") + def test_unpack_archive(self): + formats = ['tar', 'gztar', 'zip'] + if BZ2_SUPPORTED: + formats.append('bztar') + + for format in formats: + tmpdir = self.mkdtemp() + base_dir, root_dir, base_name = self._create_files() + tmpdir2 = self.mkdtemp() + filename = make_archive(base_name, format, root_dir, base_dir) + + # let's try to unpack it now + unpack_archive(filename, tmpdir2) + diff = self._compare_dirs(tmpdir, tmpdir2) + self.assertEquals(diff, []) + + def test_unpack_registery(self): + + formats = get_unpack_formats() + + def _boo(filename, extract_dir, extra): + self.assertEquals(extra, 1) + self.assertEquals(filename, 'stuff.boo') + self.assertEquals(extract_dir, 'xx') + + register_unpack_format('Boo', ['.boo', '.b2'], _boo, [('extra', 1)]) + unpack_archive('stuff.boo', 'xx') + + # trying to register a .boo unpacker again + self.assertRaises(RegistryError, register_unpack_format, 'Boo2', + ['.boo'], _boo) + + # should work now + unregister_unpack_format('Boo') + register_unpack_format('Boo2', ['.boo'], _boo) + self.assertIn(('Boo2', ['.boo'], ''), get_unpack_formats()) + self.assertNotIn(('Boo', ['.boo'], ''), get_unpack_formats()) + + # let's leave a clean state + unregister_unpack_format('Boo2') + self.assertEquals(get_unpack_formats(), formats) + + +class TestMove(unittest.TestCase): + + def setUp(self): + filename = "foo" + self.src_dir = tempfile.mkdtemp() + self.dst_dir = tempfile.mkdtemp() + self.src_file = os.path.join(self.src_dir, filename) + self.dst_file = os.path.join(self.dst_dir, filename) + # Try to create a dir in the current directory, hoping that it is + # not located on the same filesystem as the system tmp dir. + try: + self.dir_other_fs = tempfile.mkdtemp( + dir=os.path.dirname(__file__)) + self.file_other_fs = os.path.join(self.dir_other_fs, + filename) + except OSError: + self.dir_other_fs = None + f = open(self.src_file, "wb") + try: + f.write("spam") + finally: + f.close() + + def tearDown(self): + for d in (self.src_dir, self.dst_dir, self.dir_other_fs): + try: + if d: + shutil.rmtree(d) + except: + pass + + def _check_move_file(self, src, dst, real_dst): + f = open(src, "rb") + try: + contents = f.read() + finally: + f.close() + + shutil.move(src, dst) + f = open(real_dst, "rb") + try: + self.assertEqual(contents, f.read()) + finally: + f.close() + + self.assertFalse(os.path.exists(src)) + + def _check_move_dir(self, src, dst, real_dst): + contents = sorted(os.listdir(src)) + shutil.move(src, dst) + self.assertEqual(contents, sorted(os.listdir(real_dst))) + self.assertFalse(os.path.exists(src)) + + def test_move_file(self): + # Move a file to another location on the same filesystem. + self._check_move_file(self.src_file, self.dst_file, self.dst_file) + + def test_move_file_to_dir(self): + # Move a file inside an existing dir on the same filesystem. + self._check_move_file(self.src_file, self.dst_dir, self.dst_file) + + def test_move_file_other_fs(self): + # Move a file to an existing dir on another filesystem. + if not self.dir_other_fs: + # skip + return + self._check_move_file(self.src_file, self.file_other_fs, + self.file_other_fs) + + def test_move_file_to_dir_other_fs(self): + # Move a file to another location on another filesystem. 
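
For the archive helpers tested above, the intended round trip is roughly the following sketch (directory and archive names are placeholders; the gztar format needs zlib, as the skips above note):

    from distutils2._backport.shutil import make_archive, unpack_archive

    # Pack ./project into project_backup.tar.gz, then unpack it elsewhere.
    archive = make_archive('project_backup', 'gztar',
                           root_dir='.', base_dir='project')
    unpack_archive(archive, 'unpacked')
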
+ if not self.dir_other_fs: + # skip + return + self._check_move_file(self.src_file, self.dir_other_fs, + self.file_other_fs) + + def test_move_dir(self): + # Move a dir to another location on the same filesystem. + dst_dir = tempfile.mktemp() + try: + self._check_move_dir(self.src_dir, dst_dir, dst_dir) + finally: + try: + shutil.rmtree(dst_dir) + except: + pass + + def test_move_dir_other_fs(self): + # Move a dir to another location on another filesystem. + if not self.dir_other_fs: + # skip + return + dst_dir = tempfile.mktemp(dir=self.dir_other_fs) + try: + self._check_move_dir(self.src_dir, dst_dir, dst_dir) + finally: + try: + shutil.rmtree(dst_dir) + except: + pass + + def test_move_dir_to_dir(self): + # Move a dir inside an existing dir on the same filesystem. + self._check_move_dir(self.src_dir, self.dst_dir, + os.path.join(self.dst_dir, os.path.basename(self.src_dir))) + + def test_move_dir_to_dir_other_fs(self): + # Move a dir inside an existing dir on another filesystem. + if not self.dir_other_fs: + # skip + return + self._check_move_dir(self.src_dir, self.dir_other_fs, + os.path.join(self.dir_other_fs, os.path.basename(self.src_dir))) + + def test_existing_file_inside_dest_dir(self): + # A file with the same name inside the destination dir already exists. + f = open(self.dst_file, "wb") + try: + pass + finally: + f.close() + self.assertRaises(shutil.Error, shutil.move, self.src_file, self.dst_dir) + + def test_dont_move_dir_in_itself(self): + # Moving a dir inside itself raises an Error. + dst = os.path.join(self.src_dir, "bar") + self.assertRaises(shutil.Error, shutil.move, self.src_dir, dst) + + def test_destinsrc_false_negative(self): + os.mkdir(TESTFN) + try: + for src, dst in [('srcdir', 'srcdir/dest')]: + src = os.path.join(TESTFN, src) + dst = os.path.join(TESTFN, dst) + self.assertTrue(shutil._destinsrc(src, dst), + msg='_destinsrc() wrongly concluded that ' + 'dst (%s) is not in src (%s)' % (dst, src)) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + def test_destinsrc_false_positive(self): + os.mkdir(TESTFN) + try: + for src, dst in [('srcdir', 'src/dest'), ('srcdir', 'srcdir.new')]: + src = os.path.join(TESTFN, src) + dst = os.path.join(TESTFN, dst) + self.assertFalse(shutil._destinsrc(src, dst), + msg='_destinsrc() wrongly concluded that ' + 'dst (%s) is in src (%s)' % (dst, src)) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + + +class TestCopyFile(unittest.TestCase): + + _delete = False + + class Faux(object): + _entered = False + _exited_with = None + _raised = False + + def __init__(self, raise_in_exit=False, suppress_at_exit=True): + self._raise_in_exit = raise_in_exit + self._suppress_at_exit = suppress_at_exit + + def read(self, *args): + return '' + + def __enter__(self): + self._entered = True + + def __exit__(self, exc_type, exc_val, exc_tb): + self._exited_with = exc_type, exc_val, exc_tb + if self._raise_in_exit: + self._raised = True + raise IOError("Cannot close") + return self._suppress_at_exit + + def tearDown(self): + if self._delete: + del shutil.open + + def _set_shutil_open(self, func): + shutil.open = func + self._delete = True + + def test_w_source_open_fails(self): + def _open(filename, mode='r'): + if filename == 'srcfile': + raise IOError('Cannot open "srcfile"') + assert 0 # shouldn't reach here. 
+ + self._set_shutil_open(_open) + + self.assertRaises(IOError, shutil.copyfile, 'srcfile', 'destfile') + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_dest_open_fails(self): + + srcfile = self.Faux() + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + raise IOError('Cannot open "destfile"') + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + shutil.copyfile('srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(srcfile._exited_with[0] is IOError) + self.assertEqual(srcfile._exited_with[1].args, + ('Cannot open "destfile"',)) + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_dest_close_fails(self): + + srcfile = self.Faux() + destfile = self.Faux(True) + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + return destfile + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + shutil.copyfile('srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(destfile._entered) + self.assertTrue(destfile._raised) + self.assertTrue(srcfile._exited_with[0] is IOError) + self.assertEqual(srcfile._exited_with[1].args, + ('Cannot close',)) + + @unittest.skip("can't use the with statement and support 2.4") + def test_w_source_close_fails(self): + + srcfile = self.Faux(True) + destfile = self.Faux() + + def _open(filename, mode='r'): + if filename == 'srcfile': + return srcfile + if filename == 'destfile': + return destfile + assert 0 # shouldn't reach here. + + self._set_shutil_open(_open) + + self.assertRaises(IOError, + shutil.copyfile, 'srcfile', 'destfile') + self.assertTrue(srcfile._entered) + self.assertTrue(destfile._entered) + self.assertFalse(destfile._raised) + self.assertTrue(srcfile._exited_with[0] is None) + self.assertTrue(srcfile._raised) + + +def test_suite(): + suite = unittest.TestSuite() + load = unittest.defaultTestLoader.loadTestsFromTestCase + suite.addTest(load(TestCopyFile)) + suite.addTest(load(TestMove)) + suite.addTest(load(TestShutil)) + return suite + + +if __name__ == '__main__': + unittest.main(defaultTest='test_suite') diff --git a/distutils2/_backport/tests/test_sysconfig.py b/distutils2/_backport/tests/test_sysconfig.py --- a/distutils2/_backport/tests/test_sysconfig.py +++ b/distutils2/_backport/tests/test_sysconfig.py @@ -4,7 +4,7 @@ import sys import subprocess import shutil -from copy import copy, deepcopy +from copy import copy from ConfigParser import RawConfigParser from StringIO import StringIO @@ -15,13 +15,9 @@ get_scheme_names, _main, _SCHEMES) from distutils2.tests import unittest -from distutils2.tests.support import EnvironGuard +from distutils2.tests.support import EnvironGuard, skip_unless_symlink from test.test_support import TESTFN, unlink -try: - from test.test_support import skip_unless_symlink -except ImportError: - skip_unless_symlink = unittest.skip( - 'requires test.test_support.skip_unless_symlink') + class TestSysConfig(EnvironGuard, unittest.TestCase): diff --git a/distutils2/command/build_py.py b/distutils2/command/build_py.py --- a/distutils2/command/build_py.py +++ b/distutils2/command/build_py.py @@ -8,7 +8,6 @@ import logging from glob import glob -import distutils2 from distutils2.command.cmd import Command from distutils2.errors import DistutilsOptionError, DistutilsFileError from distutils2.util import convert_path @@ -163,11 +162,13 @@ Helper function for `run()`. 
""" + # FIXME add tests for this method for package, src_dir, build_dir, filenames in self.data_files: for filename in filenames: target = os.path.join(build_dir, filename) + srcfile = os.path.join(src_dir, filename) self.mkpath(os.path.dirname(target)) - outf, copied = self.copy_file(os.path.join(src_dir, filename), + outf, copied = self.copy_file(srcfile, target, preserve_mode=False) if copied and srcfile in self.distribution.convert_2to3.doctests: self._doctests_2to3.append(outf) diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -165,7 +165,10 @@ header = "command options for '%s':" % self.get_command_name() self.announce(indent + header, level=logging.INFO) indent = indent + " " + negative_opt = getattr(self, 'negative_opt', ()) for (option, _, _) in self.user_options: + if option in negative_opt: + continue option = option.replace('-', '_') if option[-1] == "=": option = option[:-1] diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -9,7 +9,8 @@ from ConfigParser import RawConfigParser from distutils2 import logger -from distutils2.util import check_environ, resolve_name +from distutils2.errors import DistutilsOptionError +from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command from distutils2.datafiles import resources_dests diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -17,19 +17,19 @@ import urllib import urlparse import zipfile - try: import hashlib except ImportError: from distutils2._backport import hashlib +from distutils2._backport.shutil import unpack_archive from distutils2.errors import IrrationalVersionError from distutils2.index.errors import (HashDoesNotMatch, UnsupportedHashName, CantParseArchiveName) from distutils2.version import (suggest_normalized_version, NormalizedVersion, get_version_predicate) from distutils2.metadata import DistributionMetadata -from distutils2.util import untar_file, unzip_file, splitext +from distutils2.util import splitext __all__ = ['ReleaseInfo', 'DistInfo', 'ReleasesList', 'get_infos_from_url'] @@ -206,6 +206,7 @@ __hash__ = object.__hash__ + class DistInfo(IndexReference): """Represents a distribution retrieved from an index (sdist, bdist, ...) """ @@ -313,17 +314,8 @@ filename = self.download() content_type = mimetypes.guess_type(filename)[0] + self._unpacked_dir = unpack_archive(filename) - if (content_type == 'application/zip' - or filename.endswith('.zip') - or filename.endswith('.pybundle') - or zipfile.is_zipfile(filename)): - unzip_file(filename, path, flatten=not filename.endswith('.pybundle')) - elif (content_type == 'application/x-gzip' - or tarfile.is_tarfile(filename) - or splitext(filename)[1].lower() in ('.tar', '.tar.gz', '.tar.bz2', '.tgz', '.tbz')): - untar_file(filename, path) - self._unpacked_dir = path return self._unpacked_dir def _check_md5(self, filename): diff --git a/distutils2/index/xmlrpc.py b/distutils2/index/xmlrpc.py --- a/distutils2/index/xmlrpc.py +++ b/distutils2/index/xmlrpc.py @@ -127,10 +127,17 @@ return release def get_metadata(self, project_name, version): - """Retreive project metadatas. + """Retrieve project metadata. Return a ReleaseInfo object, with metadata informations filled in. 
""" + # to be case-insensitive, get the informations from the XMLRPC API + projects = [d['name'] for d in + self.proxy.search({'name': project_name}) + if d['name'].lower() == project_name] + if len(projects) > 0: + project_name = projects[0] + metadata = self.proxy.release_data(project_name, version) project = self._get_project(project_name) if version not in project.get_versions(): diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -1,14 +1,21 @@ from tempfile import mkdtemp -import logging import shutil import os +import sys +import stat import errno import itertools +import tempfile +from distutils2 import logger from distutils2._backport.pkgutil import get_distributions +from distutils2._backport.pkgutil import get_distribution +from distutils2._backport.sysconfig import get_config_var from distutils2.depgraph import generate_graph from distutils2.index import wrapper from distutils2.index.errors import ProjectNotFound, ReleaseNotFound +from distutils2.errors import DistutilsError +from distutils2.version import get_version_predicate """Provides installations scripts. @@ -53,7 +60,63 @@ else: raise e os.rename(old, new) - yield(old, new) + yield (old, new) + + +def _run_d1_install(archive_dir, path): + # backward compat: using setuptools or plain-distutils + cmd = '%s setup.py install --root=%s --record=%s' + setup_py = os.path.join(archive_dir, 'setup.py') + if 'setuptools' in open(setup_py).read(): + cmd += ' --single-version-externally-managed' + + # how to place this file in the egg-info dir + # for non-distutils2 projects ? + record_file = os.path.join(archive_dir, 'RECORD') + os.system(cmd % (sys.executable, path, record_file)) + if not os.path.exists(record_file): + raise ValueError('Failed to install.') + return open(record_file).read().split('\n') + + +def _run_d2_install(archive_dir, path): + # using our own install command + raise NotImplementedError() + + +def _install_dist(dist, path): + """Install a distribution into a path. + + This: + + * unpack the distribution + * copy the files in "path" + * determine if the distribution is distutils2 or distutils1. + """ + where = dist.unpack(archive) + + # get into the dir + archive_dir = None + for item in os.listdir(where): + fullpath = os.path.join(where, item) + if os.path.isdir(fullpath): + archive_dir = fullpath + break + + if archive_dir is None: + raise ValueError('Cannot locate the unpacked archive') + + # install + old_dir = os.getcwd() + os.chdir(archive_dir) + try: + # distutils2 or distutils1 ? + if 'setup.py' in os.listdir(archive_dir): + return _run_d1_install(archive_dir, path) + else: + return _run_d2_install(archive_dir, path) + finally: + os.chdir(old_dir) def install_dists(dists, path=None): @@ -65,19 +128,23 @@ Return a list of installed files. :param dists: distributions to install - :param path: base path to install distribution on + :param path: base path to install distribution in """ if not path: path = mkdtemp() installed_dists, installed_files = [], [] for d in dists: + logger.info('Installing %s %s' % (d.name, d.version)) try: - installed_files.extend(d.install(path)) + installed_files.extend(_install_dist(d, path)) installed_dists.append(d) except Exception, e : + logger.info('Failed. 
%s' % str(e)) + + # reverting for d in installed_dists: - d.uninstall() + uninstall(d) raise e return installed_files @@ -123,16 +190,26 @@ try: if install: installed_files = install_dists(install, install_path) # install to tmp first - for files in temp_files.itervalues(): - for old, new in files: - os.remove(new) - except Exception,e: + except: # if an error occurs, put back the files in the good place. - for files in temp_files.itervalues(): + for files in temp_files.values(): for old, new in files: shutil.move(new, old) + # now re-raising + raise + + # we can remove them for good + for files in temp_files.values(): + for old, new in files: + os.remove(new) + + +def _get_setuptools_deps(release): + # NotImplementedError + pass + def get_infos(requirements, index=None, installed=None, prefer_final=True): """Return the informations on what's going to be installed and upgraded. @@ -154,44 +231,74 @@ Conflict contains all the conflicting distributions, if there is a conflict. """ + if not installed: + logger.info('Reading installed distributions') + installed = get_distributions(use_egg_info=True) + + infos = {'install': [], 'remove': [], 'conflict': []} + # Is a compatible version of the project is already installed ? + predicate = get_version_predicate(requirements) + found = False + installed = list(installed) + + # check that the project isnt already installed + for installed_project in installed: + # is it a compatible project ? + if predicate.name.lower() != installed_project.name.lower(): + continue + found = True + logger.info('Found %s %s' % (installed_project.name, + installed_project.version)) + + # if we already have something installed, check it matches the + # requirements + if predicate.match(installed_project.version): + return infos + break + + if not found: + logger.info('Project not installed.') if not index: index = wrapper.ClientWrapper() - if not installed: - installed = get_distributions(use_egg_info=True) - # Get all the releases that match the requirements try: releases = index.get_releases(requirements) except (ReleaseNotFound, ProjectNotFound), e: raise InstallationException('Release not found: "%s"' % requirements) - + # Pick up a release, and try to get the dependency tree release = releases.get_last(requirements, prefer_final=prefer_final) - # Iter since we found something without conflicts + if release is None: + logger.info('Could not find a matching project') + return infos + + # this works for Metadata 1.2 metadata = release.fetch_metadata() - # Get the distributions already_installed on the system - # and add the one we want to install + # for earlier, we need to build setuptools deps if any + if 'requires_dist' not in metadata: + deps = _get_setuptools_deps(release) + else: + deps = metadata['requires_dist'] distributions = itertools.chain(installed, [release]) depgraph = generate_graph(distributions) # Store all the already_installed packages in a list, in case of rollback. 
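
To make the get_infos() contract described in its docstring concrete: the dict it builds and returns has the shape below, and a caller is expected to hand it straight to install_from_infos() (sketch only, values shown empty, some_path is a placeholder):

    # Shape of the value returned by get_infos(requirements):
    infos = {'install': [],    # releases to fetch and install
             'remove': [],     # installed dists that will be replaced
             'conflict': []}   # installed dists whose requirements would break

    # install_from_infos(infos['install'], infos['remove'],
    #                    infos['conflict'], install_path=some_path)
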
- infos = {'install': [], 'remove': [], 'conflict': []} - # Get what the missing deps are - for dists in depgraph.missing.itervalues(): - if dists: - logging.info("missing dependencies found, installing them") - # we have missing deps - for dist in dists: - _update_infos(infos, get_infos(dist, index, installed)) + dists = depgraph.missing[release] + if dists: + logger.info("missing dependencies found, retrieving metadata") + # we have missing deps + for dist in dists: + _update_infos(infos, get_infos(dist, index, installed)) # Fill in the infos existing = [d for d in installed if d.name == release.name] + if existing: infos['remove'].append(existing[0]) infos['conflict'].extend(depgraph.reverse_list[existing[0]]) @@ -203,15 +310,45 @@ """extends the lists contained in the `info` dict with those contained in the `new_info` one """ - for key, value in infos.iteritems(): + for key, value in infos.items(): if key in new_infos: infos[key].extend(new_infos[key]) -def remove(project_name): +def remove(project_name, paths=sys.path): """Removes a single project from the installation""" - pass + tmp = tempfile.mkdtemp(prefix=project_name+'-uninstall') + dist = get_distribution(project_name, paths=paths) + if dist is None: + raise DistutilsError('Distribution %s not found' % project_name) + files = dist.get_installed_files(local=True) + rmdirs = [] + rmfiles = [] + try: + for file, md5, size in files: + if os.path.isfile(file): + dirname, filename = os.path.split(file) + tmpfile = os.path.join(tmp, filename) + try: + os.rename(file, tmpfile) + finally: + if not os.path.isfile(file): + os.rename(tmpfile, file) + if file not in rmfiles: + rmfiles.append(file) + if dirname not in rmdirs: + rmdirs.append(dirname) + except OSError: + os.rmdir(tmp) + + for file in rmfiles: + os.remove(file) + + for dirname in rmdirs: + if not os.listdir(dirname): + if bool(os.stat(dirname).st_mode & stat.S_IWUSR): + os.rmdir(dirname) @@ -221,5 +358,28 @@ attrs['requirements'] = sys.argv[1] get_infos(**attrs) + +def install(project): + logger.info('Getting information about "%s".' % project) + try: + info = get_infos(project) + except InstallationException: + logger.info('Cound not find "%s".' 
% project) + return + + if info['install'] == []: + logger.info('Nothing to install.') + return + + install_path = get_config_var('base') + try: + install_from_infos(info['install'], info['remove'], info['conflict'], + install_path=install_path) + + except InstallationConflict, e: + projects = ['%s %s' % (p.name, p.version) for p in e.args[0]] + logger.info('"%s" conflicts with "%s"' % (project, ','.join(projects))) + + if __name__ == '__main__': main() diff --git a/distutils2/markers.py b/distutils2/markers.py new file mode 100644 --- /dev/null +++ b/distutils2/markers.py @@ -0,0 +1,194 @@ +""" Micro-language for PEP 345 environment markers +""" +import sys +import platform +import os +from tokenize import tokenize, NAME, OP, STRING, ENDMARKER +from StringIO import StringIO + +__all__ = ['interpret'] + + +# allowed operators +_OPERATORS = {'==': lambda x, y: x == y, + '!=': lambda x, y: x != y, + '>': lambda x, y: x > y, + '>=': lambda x, y: x >= y, + '<': lambda x, y: x < y, + '<=': lambda x, y: x <= y, + 'in': lambda x, y: x in y, + 'not in': lambda x, y: x not in y} + + +def _operate(operation, x, y): + return _OPERATORS[operation](x, y) + + +# restricted set of variables +_VARS = {'sys.platform': sys.platform, + 'python_version': sys.version[:3], + 'python_full_version': sys.version.split(' ', 1)[0], + 'os.name': os.name, + 'platform.version': platform.version(), + 'platform.machine': platform.machine()} + + +class _Operation(object): + + def __init__(self, execution_context=None): + self.left = None + self.op = None + self.right = None + if execution_context is None: + execution_context = {} + self.execution_context = execution_context + + def _get_var(self, name): + if name in self.execution_context: + return self.execution_context[name] + return _VARS[name] + + def __repr__(self): + return '%s %s %s' % (self.left, self.op, self.right) + + def _is_string(self, value): + if value is None or len(value) < 2: + return False + for delimiter in '"\'': + if value[0] == value[-1] == delimiter: + return True + return False + + def _is_name(self, value): + return value in _VARS + + def _convert(self, value): + if value in _VARS: + return self._get_var(value) + return value.strip('"\'') + + def _check_name(self, value): + if value not in _VARS: + raise NameError(value) + + def _nonsense_op(self): + msg = 'This operation is not supported : "%s"' % self + raise SyntaxError(msg) + + def __call__(self): + # make sure we do something useful + if self._is_string(self.left): + if self._is_string(self.right): + self._nonsense_op() + self._check_name(self.right) + else: + if not self._is_string(self.right): + self._nonsense_op() + self._check_name(self.left) + + if self.op not in _OPERATORS: + raise TypeError('Operator not supported "%s"' % self.op) + + left = self._convert(self.left) + right = self._convert(self.right) + return _operate(self.op, left, right) + + +class _OR(object): + def __init__(self, left, right=None): + self.left = left + self.right = right + + def filled(self): + return self.right is not None + + def __repr__(self): + return 'OR(%r, %r)' % (self.left, self.right) + + def __call__(self): + return self.left() or self.right() + + +class _AND(object): + def __init__(self, left, right=None): + self.left = left + self.right = right + + def filled(self): + return self.right is not None + + def __repr__(self): + return 'AND(%r, %r)' % (self.left, self.right) + + def __call__(self): + return self.left() and self.right() + + +class _CHAIN(object): + + def __init__(self, 
execution_context=None): + self.ops = [] + self.op_starting = True + self.execution_context = execution_context + + def eat(self, toktype, tokval, rowcol, line, logical_line): + if toktype not in (NAME, OP, STRING, ENDMARKER): + raise SyntaxError('Type not supported "%s"' % tokval) + + if self.op_starting: + op = _Operation(self.execution_context) + if len(self.ops) > 0: + last = self.ops[-1] + if isinstance(last, (_OR, _AND)) and not last.filled(): + last.right = op + else: + self.ops.append(op) + else: + self.ops.append(op) + self.op_starting = False + else: + op = self.ops[-1] + + if (toktype == ENDMARKER or + (toktype == NAME and tokval in ('and', 'or'))): + if toktype == NAME and tokval == 'and': + self.ops.append(_AND(self.ops.pop())) + elif toktype == NAME and tokval == 'or': + self.ops.append(_OR(self.ops.pop())) + self.op_starting = True + return + + if isinstance(op, (_OR, _AND)) and op.right is not None: + op = op.right + + if ((toktype in (NAME, STRING) and tokval not in ('in', 'not')) + or (toktype == OP and tokval == '.')): + if op.op is None: + if op.left is None: + op.left = tokval + else: + op.left += tokval + else: + if op.right is None: + op.right = tokval + else: + op.right += tokval + elif toktype == OP or tokval in ('in', 'not'): + if tokval == 'in' and op.op == 'not': + op.op = 'not in' + else: + op.op = tokval + + def result(self): + for op in self.ops: + if not op(): + return False + return True + + +def interpret(marker, execution_context=None): + """Interpret a marker and return a result depending on environment.""" + marker = marker.strip() + operations = _CHAIN(execution_context) + tokenize(StringIO(marker).readline, operations.eat) + return operations.result() diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -5,13 +5,12 @@ import os import sys -import platform import re from StringIO import StringIO from email import message_from_file -from tokenize import tokenize, NAME, OP, STRING, ENDMARKER from distutils2 import logger +from distutils2.markers import interpret from distutils2.version import (is_valid_predicate, is_valid_version, is_valid_versions) from distutils2.errors import (MetadataMissingError, @@ -291,7 +290,7 @@ if not self.platform_dependent or ';' not in value: return True, value value, marker = value.split(';') - return _interpret(marker, self.execution_context), value + return interpret(marker, self.execution_context), value def _remove_line_prefix(self, value): return _LINE_PREFIX.sub('\n', value) @@ -518,191 +517,3 @@ def items(self): """Dict like api""" return [(key, self[key]) for key in self.keys()] - - -# -# micro-language for PEP 345 environment markers -# - -# allowed operators -_OPERATORS = {'==': lambda x, y: x == y, - '!=': lambda x, y: x != y, - '>': lambda x, y: x > y, - '>=': lambda x, y: x >= y, - '<': lambda x, y: x < y, - '<=': lambda x, y: x <= y, - 'in': lambda x, y: x in y, - 'not in': lambda x, y: x not in y} - - -def _operate(operation, x, y): - return _OPERATORS[operation](x, y) - -# restricted set of variables -_VARS = {'sys.platform': sys.platform, - 'python_version': sys.version[:3], - 'python_full_version': sys.version.split(' ', 1)[0], - 'os.name': os.name, - 'platform.version': platform.version(), - 'platform.machine': platform.machine()} - - -class _Operation(object): - - def __init__(self, execution_context=None): - self.left = None - self.op = None - self.right = None - if execution_context is None: - execution_context = {} - 
self.execution_context = execution_context - - def _get_var(self, name): - if name in self.execution_context: - return self.execution_context[name] - return _VARS[name] - - def __repr__(self): - return '%s %s %s' % (self.left, self.op, self.right) - - def _is_string(self, value): - if value is None or len(value) < 2: - return False - for delimiter in '"\'': - if value[0] == value[-1] == delimiter: - return True - return False - - def _is_name(self, value): - return value in _VARS - - def _convert(self, value): - if value in _VARS: - return self._get_var(value) - return value.strip('"\'') - - def _check_name(self, value): - if value not in _VARS: - raise NameError(value) - - def _nonsense_op(self): - msg = 'This operation is not supported : "%s"' % self - raise SyntaxError(msg) - - def __call__(self): - # make sure we do something useful - if self._is_string(self.left): - if self._is_string(self.right): - self._nonsense_op() - self._check_name(self.right) - else: - if not self._is_string(self.right): - self._nonsense_op() - self._check_name(self.left) - - if self.op not in _OPERATORS: - raise TypeError('Operator not supported "%s"' % self.op) - - left = self._convert(self.left) - right = self._convert(self.right) - return _operate(self.op, left, right) - - -class _OR(object): - def __init__(self, left, right=None): - self.left = left - self.right = right - - def filled(self): - return self.right is not None - - def __repr__(self): - return 'OR(%r, %r)' % (self.left, self.right) - - def __call__(self): - return self.left() or self.right() - - -class _AND(object): - def __init__(self, left, right=None): - self.left = left - self.right = right - - def filled(self): - return self.right is not None - - def __repr__(self): - return 'AND(%r, %r)' % (self.left, self.right) - - def __call__(self): - return self.left() and self.right() - - -class _CHAIN(object): - - def __init__(self, execution_context=None): - self.ops = [] - self.op_starting = True - self.execution_context = execution_context - - def eat(self, toktype, tokval, rowcol, line, logical_line): - if toktype not in (NAME, OP, STRING, ENDMARKER): - raise SyntaxError('Type not supported "%s"' % tokval) - - if self.op_starting: - op = _Operation(self.execution_context) - if len(self.ops) > 0: - last = self.ops[-1] - if isinstance(last, (_OR, _AND)) and not last.filled(): - last.right = op - else: - self.ops.append(op) - else: - self.ops.append(op) - self.op_starting = False - else: - op = self.ops[-1] - - if (toktype == ENDMARKER or - (toktype == NAME and tokval in ('and', 'or'))): - if toktype == NAME and tokval == 'and': - self.ops.append(_AND(self.ops.pop())) - elif toktype == NAME and tokval == 'or': - self.ops.append(_OR(self.ops.pop())) - self.op_starting = True - return - - if isinstance(op, (_OR, _AND)) and op.right is not None: - op = op.right - - if ((toktype in (NAME, STRING) and tokval not in ('in', 'not')) - or (toktype == OP and tokval == '.')): - if op.op is None: - if op.left is None: - op.left = tokval - else: - op.left += tokval - else: - if op.right is None: - op.right = tokval - else: - op.right += tokval - elif toktype == OP or tokval in ('in', 'not'): - if tokval == 'in' and op.op == 'not': - op.op = 'not in' - else: - op.op = tokval - - def result(self): - for op in self.ops: - if not op(): - return False - return True - - -def _interpret(marker, execution_context=None): - """Interpret a marker and return a result depending on environment.""" - marker = marker.strip() - operations = _CHAIN(execution_context) - 
tokenize(StringIO(marker).readline, operations.eat) - return operations.result() diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -1,7 +1,9 @@ import os import sys from optparse import OptionParser +import logging +from distutils2 import logger from distutils2.util import grok_environment_error from distutils2.errors import (DistutilsSetupError, DistutilsArgError, DistutilsError, CCompilerError) @@ -9,6 +11,7 @@ from distutils2 import __version__ from distutils2._backport.pkgutil import get_distributions, get_distribution from distutils2.depgraph import generate_graph +from distutils2.install import install # This is a barebones help message generated displayed when the user # runs the setup script with no arguments at all. More useful help @@ -115,8 +118,17 @@ return dist +def _set_logger(): + logger.setLevel(logging.INFO) + sth = logging.StreamHandler(sys.stderr) + sth.setLevel(logging.INFO) + logger.addHandler(sth) + logger.propagate = 0 + + def main(): """Main entry point for Distutils2""" + _set_logger() parser = OptionParser() parser.disable_interspersed_args() parser.usage = '%prog [options] cmd1 cmd2 ..' @@ -141,6 +153,14 @@ action="store_true", dest="fgraph", default=False, help="Display the full graph for installed distributions.") + parser.add_option("-i", "--install", + action="store", dest="install", + help="Install a project.") + + parser.add_option("-r", "--remove", + action="store", dest="remove", + help="Remove a project.") + options, args = parser.parse_args() if options.version: print('Distutils2 %s' % __version__) @@ -199,6 +219,10 @@ print(graph) sys.exit(0) + if options.install is not None: + install(options.install) + sys.exit(0) + if len(args) == 0: parser.print_help() sys.exit(0) diff --git a/distutils2/tests/pypi_server.py b/distutils2/tests/pypi_server.py --- a/distutils2/tests/pypi_server.py +++ b/distutils2/tests/pypi_server.py @@ -375,6 +375,7 @@ def __init__(self, dists=[]): self._dists = dists + self._search_result = [] def add_distributions(self, dists): for dist in dists: diff --git a/distutils2/tests/support.py b/distutils2/tests/support.py --- a/distutils2/tests/support.py +++ b/distutils2/tests/support.py @@ -17,10 +17,11 @@ super(SomeTestCase, self).setUp() ... # other setup code -Read each class' docstring to see its purpose and usage. +Also provided is a DummyCommand class, useful to mock commands in the +tests of another command that needs them, a create_distribution function +and a skip_unless_symlink decorator. -Also provided is a DummyCommand class, useful to mock commands in the -tests of another command that needs them (see docstring). +Each class or function has a docstring to explain its purpose and usage. """ import os @@ -35,7 +36,8 @@ from distutils2.tests import unittest __all__ = ['LoggingCatcher', 'WarningsCatcher', 'TempdirManager', - 'EnvironGuard', 'DummyCommand', 'unittest'] + 'EnvironGuard', 'DummyCommand', 'unittest', 'create_distribution', + 'skip_unless_symlink'] class LoggingCatcher(object): @@ -135,7 +137,7 @@ finally: f.close() - def create_dist(self, pkg_name='foo', **kw): + def create_dist(self, **kw): """Create a stub distribution object and files. This function creates a Distribution instance (use keyword arguments @@ -143,18 +145,35 @@ (currently an empty directory). It returns the path to the directory and the Distribution instance. - You can use TempdirManager.write_file to write any file in that + You can use self.write_file to write any file in that directory, e.g. 
setup scripts or Python modules. """ # Late import so that third parties can import support without # loading a ton of distutils2 modules in memory. from distutils2.dist import Distribution + if 'name' not in kw: + kw['name'] = 'foo' tmp_dir = self.mkdtemp() - pkg_dir = os.path.join(tmp_dir, pkg_name) - os.mkdir(pkg_dir) + project_dir = os.path.join(tmp_dir, kw['name']) + os.mkdir(project_dir) dist = Distribution(attrs=kw) - return pkg_dir, dist + return project_dir, dist + def assertIsFile(self, *args): + path = os.path.join(*args) + dirname = os.path.dirname(path) + file = os.path.basename(path) + if os.path.isdir(dirname): + files = os.listdir(dirname) + msg = "%s not found in %s: %s" % (file, dirname, files) + assert os.path.isfile(path), msg + else: + raise AssertionError( + '%s not found. %s does not exist' % (file, dirname)) + + def assertIsNotFile(self, *args): + path = os.path.join(*args) + assert not os.path.isfile(path), "%s exist" % path class EnvironGuard(object): """TestCase-compatible mixin to save and restore the environment.""" @@ -211,3 +230,9 @@ d.parse_command_line() return d + +try: + from test.test_support import skip_unless_symlink +except ImportError: + skip_unless_symlink = unittest.skip( + 'requires test.test_support.skip_unless_symlink') diff --git a/distutils2/tests/test_command_install_dist.py b/distutils2/tests/test_command_install_dist.py --- a/distutils2/tests/test_command_install_dist.py +++ b/distutils2/tests/test_command_install_dist.py @@ -180,8 +180,8 @@ cmd.user = 'user' self.assertRaises(DistutilsOptionError, cmd.finalize_options) - def test_record(self): - + def test_old_record(self): + # test pre-PEP 376 --record option (outside dist-info dir) install_dir = self.mkdtemp() pkgdir, dist = self.create_dist() @@ -189,11 +189,11 @@ cmd = install_dist(dist) dist.command_obj['install_dist'] = cmd cmd.root = install_dir - cmd.record = os.path.join(pkgdir, 'RECORD') + cmd.record = os.path.join(pkgdir, 'filelist') cmd.ensure_finalized() cmd.run() - # let's check the RECORD file was created with four + # let's check the record file was created with four # lines, one for each .dist-info entry: METADATA, # INSTALLER, REQUSTED, RECORD f = open(cmd.record) diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -29,27 +29,16 @@ class ToInstallDist(object): """Distribution that will be installed""" - def __init__(self, raise_error=False, files=False): - self._raise_error = raise_error + def __init__(self, files=False): self._files = files - self.install_called = False - self.install_called_with = {} self.uninstall_called = False self._real_files = [] + self.name = "fake" + self.version = "fake" if files: for f in range(0,3): self._real_files.append(mkstemp()) - def install(self, *args): - self.install_called = True - self.install_called_with = args - if self._raise_error: - raise Exception('Oops !') - return ['/path/to/foo', '/path/to/bar'] - - def uninstall(self, **args): - self.uninstall_called = True - def get_installed_files(self, **args): if self._files: return [f[1] for f in self._real_files] @@ -58,7 +47,49 @@ return self.get_installed_files() +class MagicMock(object): + def __init__(self, return_value=None, raise_exception=False): + self.called = False + self._times_called = 0 + self._called_with = [] + self._return_value = return_value + self._raise = raise_exception + + def __call__(self, *args, **kwargs): + self.called = True + self._times_called = 
self._times_called + 1 + self._called_with.append((args, kwargs)) + iterable = hasattr(self._raise, '__iter__') + if self._raise: + if ((not iterable and self._raise) + or self._raise[self._times_called - 1]): + raise Exception + return self._return_value + + def called_with(self, *args, **kwargs): + return (args, kwargs) in self._called_with + + +def patch(parent, to_patch): + """monkey match a module""" + def wrapper(func): + print func + print dir(func) + old_func = getattr(parent, to_patch) + def wrapped(*args, **kwargs): + parent.__dict__[to_patch] = MagicMock() + try: + out = func(*args, **kwargs) + finally: + setattr(parent, to_patch, old_func) + return out + return wrapped + return wrapper + + def get_installed_dists(dists): + """Return a list of fake installed dists. + The list is name, version, deps""" objects = [] for (name, version, deps) in dists: objects.append(InstalledDist(name, version, deps)) @@ -69,6 +100,12 @@ def _get_client(self, server, *args, **kwargs): return Client(server.full_address, *args, **kwargs) + def _patch_run_install(self): + """Patch run install""" + + def _unpatch_run_install(self): + """Unpatch run install for d2 and d1""" + def _get_results(self, output): """return a list of results""" installed = [(o.name, '%s' % o.version) for o in output['install']] @@ -150,6 +187,8 @@ # Tests that conflicts are detected client = self._get_client(server) archive_path = '%s/distribution.tar.gz' % server.full_address + + # choxie depends on towel-stuff, which depends on bacon. server.xmlrpc.set_distributions([ {'name':'choxie', 'version': '2.0.0.9', @@ -164,7 +203,9 @@ 'requires_dist': [], 'url': archive_path}, ]) - already_installed = [('bacon', '0.1', []), + + # name, version, deps. + already_installed = [('bacon', '0.1', []), ('chicken', '1.1', ['bacon (0.1)'])] output = install.get_infos("choxie", index=client, installed= get_installed_dists(already_installed)) @@ -221,23 +262,39 @@ # if one of the distribution installation fails, call uninstall on all # installed distributions. - d1 = ToInstallDist() - d2 = ToInstallDist(raise_error=True) - self.assertRaises(Exception, install.install_dists, [d1, d2]) - for dist in (d1, d2): - self.assertTrue(dist.install_called) - self.assertTrue(d1.uninstall_called) - self.assertFalse(d2.uninstall_called) + old_install_dist = install._install_dist + old_uninstall = getattr(install, 'uninstall', None) + + install._install_dist = MagicMock(return_value=[], + raise_exception=(False, True)) + install.uninstall = MagicMock() + try: + d1 = ToInstallDist() + d2 = ToInstallDist() + path = self.mkdtemp() + self.assertRaises(Exception, install.install_dists, [d1, d2], path) + self.assertTrue(install._install_dist.called_with(d1, path)) + self.assertTrue(install.uninstall.called) + finally: + install._install_dist = old_install_dist + install.uninstall = old_uninstall + def test_install_dists_success(self): - # test that the install method is called on each of the distributions. - d1 = ToInstallDist() - d2 = ToInstallDist() - install.install_dists([d1, d2]) - for dist in (d1, d2): - self.assertTrue(dist.install_called) - self.assertFalse(d1.uninstall_called) - self.assertFalse(d2.uninstall_called) + old_install_dist = install._install_dist + install._install_dist = MagicMock(return_value=[]) + try: + # test that the install method is called on each of the distributions. 
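
The tests below rely on the hand-rolled MagicMock above plus a save/replace/restore idiom rather than a mocking library. A condensed sketch of that idiom, assuming a distutils2 checkout with this patch applied ('/tmp/target' is just a placeholder):

    from distutils2 import install
    from distutils2.tests.test_install import MagicMock, ToInstallDist

    old_install_dist = install._install_dist            # save the real one
    install._install_dist = MagicMock(return_value=[])  # swap in the stub
    try:
        install.install_dists([ToInstallDist()], '/tmp/target')
        print install._install_dist.called              # -> True
    finally:
        install._install_dist = old_install_dist        # always restore
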
+ d1 = ToInstallDist() + d2 = ToInstallDist() + + # should call install + path = self.mkdtemp() + install.install_dists([d1, d2], path) + for dist in (d1, d2): + self.assertTrue(install._install_dist.called_with(dist, path)) + finally: + install._install_dist = old_install_dist def test_install_from_infos_conflict(self): # assert conflicts raise an exception @@ -262,29 +319,46 @@ install.install_dists = old_install_dists def test_install_from_infos_remove_rollback(self): - # assert that if an error occurs, the removed files are restored. - remove = [] - for i in range(0,2): - remove.append(ToInstallDist(files=True, raise_error=True)) - to_install = [ToInstallDist(raise_error=True), - ToInstallDist()] + old_install_dist = install._install_dist + old_uninstall = getattr(install, 'uninstall', None) - install.install_from_infos(remove=remove, install=to_install) - # assert that the files are in the same place - # assert that the files have been removed - for dist in remove: - for f in dist.get_installed_files(): - self.assertTrue(os.path.exists(f)) + install._install_dist = MagicMock(return_value=[], + raise_exception=(False, True)) + install.uninstall = MagicMock() + try: + # assert that if an error occurs, the removed files are restored. + remove = [] + for i in range(0,2): + remove.append(ToInstallDist(files=True)) + to_install = [ToInstallDist(), ToInstallDist()] + + self.assertRaises(Exception, install.install_from_infos, + remove=remove, install=to_install) + # assert that the files are in the same place + # assert that the files have been removed + for dist in remove: + for f in dist.get_installed_files(): + self.assertTrue(os.path.exists(f)) + finally: + install.install_dist = old_install_dist + install.uninstall = old_uninstall + def test_install_from_infos_install_succes(self): - # assert that the distribution can be installed - install_path = "my_install_path" - to_install = [ToInstallDist(), ToInstallDist()] + old_install_dist = install._install_dist + install._install_dist = MagicMock([]) + try: + # assert that the distribution can be installed + install_path = "my_install_path" + to_install = [ToInstallDist(), ToInstallDist()] - install.install_from_infos(install=to_install, - install_path=install_path) - for dist in to_install: - self.assertEqual(dist.install_called_with, (install_path,)) + install.install_from_infos(install=to_install, + install_path=install_path) + for dist in to_install: + install._install_dist.called_with(install_path) + finally: + install._install_dist = old_install_dist + def test_suite(): suite = unittest.TestSuite() diff --git a/distutils2/tests/test_markers.py b/distutils2/tests/test_markers.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_markers.py @@ -0,0 +1,69 @@ +"""Tests for distutils.metadata.""" +import os +import sys +import platform +from StringIO import StringIO + +from distutils2.markers import interpret +from distutils2.tests import run_unittest, unittest +from distutils2.tests.support import LoggingCatcher, WarningsCatcher + + +class MarkersTestCase(LoggingCatcher, WarningsCatcher, + unittest.TestCase): + + def test_interpret(self): + sys_platform = sys.platform + version = sys.version.split()[0] + os_name = os.name + platform_version = platform.version() + platform_machine = platform.machine() + + self.assertTrue(interpret("sys.platform == '%s'" % sys_platform)) + self.assertTrue(interpret( + "sys.platform == '%s' or python_version == '2.4'" % sys_platform)) + self.assertTrue(interpret( + "sys.platform == '%s' and 
python_full_version == '%s'" % + (sys_platform, version))) + self.assertTrue(interpret("'%s' == sys.platform" % sys_platform)) + self.assertTrue(interpret('os.name == "%s"' % os_name)) + self.assertTrue(interpret( + 'platform.version == "%s" and platform.machine == "%s"' % + (platform_version, platform_machine))) + + # stuff that need to raise a syntax error + ops = ('os.name == os.name', 'os.name == 2', "'2' == '2'", + 'okpjonon', '', 'os.name ==', 'python_version == 2.4') + for op in ops: + self.assertRaises(SyntaxError, interpret, op) + + # combined operations + OP = 'os.name == "%s"' % os_name + AND = ' and ' + OR = ' or ' + self.assertTrue(interpret(OP + AND + OP)) + self.assertTrue(interpret(OP + AND + OP + AND + OP)) + self.assertTrue(interpret(OP + OR + OP)) + self.assertTrue(interpret(OP + OR + OP + OR + OP)) + + # other operators + self.assertTrue(interpret("os.name != 'buuuu'")) + self.assertTrue(interpret("python_version > '1.0'")) + self.assertTrue(interpret("python_version < '5.0'")) + self.assertTrue(interpret("python_version <= '5.0'")) + self.assertTrue(interpret("python_version >= '1.0'")) + self.assertTrue(interpret("'%s' in os.name" % os_name)) + self.assertTrue(interpret("'buuuu' not in os.name")) + self.assertTrue(interpret( + "'buuuu' not in os.name and '%s' in os.name" % os_name)) + + # execution context + self.assertTrue(interpret('python_version == "0.1"', + {'python_version': '0.1'})) + + +def test_suite(): + return unittest.makeSuite(MarkersTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -1,10 +1,10 @@ -"""Tests for distutils.command.bdist.""" +"""Tests for distutils.metadata.""" import os import sys import platform from StringIO import StringIO -from distutils2.metadata import (DistributionMetadata, _interpret, +from distutils2.metadata import (DistributionMetadata, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -46,55 +46,6 @@ self.assertRaises(TypeError, DistributionMetadata, PKG_INFO, mapping=m, fileobj=fp) - def test_interpret(self): - sys_platform = sys.platform - version = sys.version.split()[0] - os_name = os.name - platform_version = platform.version() - platform_machine = platform.machine() - - self.assertTrue(_interpret("sys.platform == '%s'" % sys_platform)) - self.assertTrue(_interpret( - "sys.platform == '%s' or python_version == '2.4'" % sys_platform)) - self.assertTrue(_interpret( - "sys.platform == '%s' and python_full_version == '%s'" % - (sys_platform, version))) - self.assertTrue(_interpret("'%s' == sys.platform" % sys_platform)) - self.assertTrue(_interpret('os.name == "%s"' % os_name)) - self.assertTrue(_interpret( - 'platform.version == "%s" and platform.machine == "%s"' % - (platform_version, platform_machine))) - - # stuff that need to raise a syntax error - ops = ('os.name == os.name', 'os.name == 2', "'2' == '2'", - 'okpjonon', '', 'os.name ==', 'python_version == 2.4') - for op in ops: - self.assertRaises(SyntaxError, _interpret, op) - - # combined operations - OP = 'os.name == "%s"' % os_name - AND = ' and ' - OR = ' or ' - self.assertTrue(_interpret(OP + AND + OP)) - self.assertTrue(_interpret(OP + AND + OP + AND + OP)) - self.assertTrue(_interpret(OP + OR + OP)) - self.assertTrue(_interpret(OP + OR + OP + OR + OP)) - - # other operators - 
self.assertTrue(_interpret("os.name != 'buuuu'")) - self.assertTrue(_interpret("python_version > '1.0'")) - self.assertTrue(_interpret("python_version < '5.0'")) - self.assertTrue(_interpret("python_version <= '5.0'")) - self.assertTrue(_interpret("python_version >= '1.0'")) - self.assertTrue(_interpret("'%s' in os.name" % os_name)) - self.assertTrue(_interpret("'buuuu' not in os.name")) - self.assertTrue(_interpret( - "'buuuu' not in os.name and '%s' in os.name" % os_name)) - - # execution context - self.assertTrue(_interpret('python_version == "0.1"', - {'python_version': '0.1'})) - def test_metadata_read_write(self): PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') metadata = DistributionMetadata(PKG_INFO) diff --git a/distutils2/tests/test_uninstall.py b/distutils2/tests/test_uninstall.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_uninstall.py @@ -0,0 +1,93 @@ +"""Tests for the uninstall command.""" +import os +import sys +from StringIO import StringIO +from distutils2._backport.pkgutil import disable_cache, enable_cache +from distutils2.tests import unittest, support, run_unittest +from distutils2.errors import DistutilsError +from distutils2.install import remove + +SETUP_CFG = """ +[metadata] +name = %(name)s +version = %(version)s + +[files] +packages = + %(name)s + %(name)s.sub +""" + +class UninstallTestCase(support.TempdirManager, + support.LoggingCatcher, + unittest.TestCase): + + def setUp(self): + super(UninstallTestCase, self).setUp() + self.addCleanup(setattr, sys, 'stdout', sys.stdout) + self.addCleanup(setattr, sys, 'stderr', sys.stderr) + self.addCleanup(os.chdir, os.getcwd()) + self.addCleanup(enable_cache) + self.root_dir = self.mkdtemp() + disable_cache() + + def run_setup(self, *args): + # run setup with args + #sys.stdout = StringIO() + sys.argv[:] = [''] + list(args) + old_sys = sys.argv[:] + try: + from distutils2.run import commands_main + dist = commands_main() + finally: + sys.argv[:] = old_sys + return dist + + def get_path(self, dist, name): + from distutils2.command.install_dist import install_dist + cmd = install_dist(dist) + cmd.prefix = self.root_dir + cmd.finalize_options() + return getattr(cmd, 'install_'+name) + + def make_dist(self, pkg_name='foo', **kw): + dirname = self.mkdtemp() + kw['name'] = pkg_name + if 'version' not in kw: + kw['version'] = '0.1' + self.write_file((dirname, 'setup.cfg'), SETUP_CFG % kw) + os.mkdir(os.path.join(dirname, pkg_name)) + self.write_file((dirname, '__init__.py'), '#') + self.write_file((dirname, pkg_name+'_utils.py'), '#') + os.mkdir(os.path.join(dirname, pkg_name, 'sub')) + self.write_file((dirname, pkg_name, 'sub', '__init__.py'), '#') + self.write_file((dirname, pkg_name, 'sub', pkg_name+'_utils.py'), '#') + return dirname + + def install_dist(self, pkg_name='foo', dirname=None, **kw): + if not dirname: + dirname = self.make_dist(pkg_name, **kw) + os.chdir(dirname) + dist = self.run_setup('install_dist', '--prefix='+self.root_dir) + install_lib = self.get_path(dist, 'purelib') + return dist, install_lib + + def test_uninstall_unknow_distribution(self): + self.assertRaises(DistutilsError, remove, 'foo', paths=[self.root_dir]) + + def test_uninstall(self): + dist, install_lib = self.install_dist() + self.assertIsFile(install_lib, 'foo', 'sub', '__init__.py') + self.assertIsFile(install_lib, 'foo-0.1.dist-info', 'RECORD') + remove('foo', paths=[install_lib]) + self.assertIsNotFile(install_lib, 'foo', 'sub', '__init__.py') + self.assertIsNotFile(install_lib, 'foo-0.1.dist-info', 
'RECORD') + + + + +def test_suite(): + return unittest.makeSuite(UninstallTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -675,83 +675,6 @@ return base, ext -def unzip_file(filename, location, flatten=True): - """Unzip the file (zip file located at filename) to the destination - location""" - if not os.path.exists(location): - os.makedirs(location) - zipfp = open(filename, 'rb') - try: - zip = zipfile.ZipFile(zipfp) - leading = has_leading_dir(zip.namelist()) and flatten - for name in zip.namelist(): - data = zip.read(name) - fn = name - if leading: - fn = split_leading_dir(name)[1] - fn = os.path.join(location, fn) - dir = os.path.dirname(fn) - if not os.path.exists(dir): - os.makedirs(dir) - if fn.endswith('/') or fn.endswith('\\'): - # A directory - if not os.path.exists(fn): - os.makedirs(fn) - else: - fp = open(fn, 'wb') - try: - fp.write(data) - finally: - fp.close() - finally: - zipfp.close() - - -def untar_file(filename, location): - """Untar the file (tar file located at filename) to the destination - location - """ - if not os.path.exists(location): - os.makedirs(location) - if filename.lower().endswith('.gz') or filename.lower().endswith('.tgz'): - mode = 'r:gz' - elif (filename.lower().endswith('.bz2') - or filename.lower().endswith('.tbz')): - mode = 'r:bz2' - elif filename.lower().endswith('.tar'): - mode = 'r' - else: - mode = 'r:*' - tar = tarfile.open(filename, mode) - try: - leading = has_leading_dir([member.name for member in tar.getmembers()]) - for member in tar.getmembers(): - fn = member.name - if leading: - fn = split_leading_dir(fn)[1] - path = os.path.join(location, fn) - if member.isdir(): - if not os.path.exists(path): - os.makedirs(path) - else: - try: - fp = tar.extractfile(member) - except (KeyError, AttributeError): - # Some corrupt tar files seem to produce this - # (specifically bad symlinks) - continue - if not os.path.exists(os.path.dirname(path)): - os.makedirs(os.path.dirname(path)) - destfp = open(path, 'wb') - try: - shutil.copyfileobj(fp, destfp) - finally: - destfp.close() - fp.close() - finally: - tar.close() - - def has_leading_dir(paths): """Returns true if all the paths have the same leading path name (i.e., everything is in one subdirectory in an archive)""" diff --git a/docs/source/library/distutils2.tests.pypi_server.rst b/docs/source/library/distutils2.tests.pypi_server.rst --- a/docs/source/library/distutils2.tests.pypi_server.rst +++ b/docs/source/library/distutils2.tests.pypi_server.rst @@ -77,6 +77,7 @@ @use_pypi_server() def test_somthing(self, server): # your tests goes here + ... The decorator will instantiate the server for you, and run and stop it just before and after your method call. You also can pass the server initializer, @@ -85,4 +86,4 @@ class SampleTestCase(TestCase): @use_pypi_server("test_case_name") def test_something(self, server): - # something + ... diff --git a/docs/source/library/pkgutil.rst b/docs/source/library/pkgutil.rst --- a/docs/source/library/pkgutil.rst +++ b/docs/source/library/pkgutil.rst @@ -4,77 +4,204 @@ .. module:: pkgutil :synopsis: Utilities to support packages. -.. TODO Follow the reST conventions used in the stdlib +This module provides utilities to manipulate packages: support for the +Importer protocol defined in :PEP:`302` and implementation of the API +described in :PEP:`376` to work with the database of installed Python +distributions. 
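To make the :PEP:`376` half of that sentence concrete, here is a minimal sketch of querying the installed-distributions database. It assumes the names used elsewhere in this patch series (``get_distribution`` and ``Distribution.get_installed_files`` in ``distutils2._backport.pkgutil``) keep the behaviour shown in ``install.remove()`` and ``run.py``; treat it as an illustration rather than documented API::

    from distutils2._backport.pkgutil import get_distribution

    # Look up one installed project by name; None is returned when the
    # project is not present in the database.
    dist = get_distribution('foo')
    if dist is not None:
        print('%s %s at %s' % (dist.name, dist.metadata['version'], dist.path))
        # RECORD-backed file list, yielding (path, md5, size) tuples
        for path, md5, size in dist.get_installed_files():
            print(path)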
-This module provides functions to manipulate packages, as well as -the necessary functions to provide support for the "Importer Protocol" as -described in :PEP:`302` and for working with the database of installed Python -distributions which is specified in :PEP:`376`. In addition to the functions -required in :PEP:`376`, back support for older ``.egg`` and ``.egg-info`` -distributions is provided as well. These distributions are represented by the -class :class:`~distutils2._backport.pkgutil.EggInfoDistribution` and most -functions provide an extra argument ``use_egg_info`` which indicates if -they should consider these old styled distributions. This document details -first the functions and classes available and then presents several use cases. - +Import system utilities +----------------------- .. function:: extend_path(path, name) - Extend the search path for the modules which comprise a package. Intended use is - to place the following code in a package's :file:`__init__.py`:: + Extend the search path for the modules which comprise a package. Intended + use is to place the following code in a package's :file:`__init__.py`:: from pkgutil import extend_path __path__ = extend_path(__path__, __name__) - This will add to the package's ``__path__`` all subdirectories of directories on - ``sys.path`` named after the package. This is useful if one wants to distribute - different parts of a single logical package as multiple directories. + This will add to the package's ``__path__`` all subdirectories of directories + on :data:`sys.path` named after the package. This is useful if one wants to + distribute different parts of a single logical package as multiple + directories. - It also looks for :file:`\*.pkg` files beginning where ``*`` matches the *name* - argument. This feature is similar to :file:`\*.pth` files (see the :mod:`site` - module for more information), except that it doesn't special-case lines starting - with ``import``. A :file:`\*.pkg` file is trusted at face value: apart from - checking for duplicates, all entries found in a :file:`\*.pkg` file are added to - the path, regardless of whether they exist on the filesystem. (This is a - feature.) + It also looks for :file:`\*.pkg` files beginning where ``*`` matches the + *name* argument. This feature is similar to :file:`\*.pth` files (see the + :mod:`site` module for more information), except that it doesn't special-case + lines starting with ``import``. A :file:`\*.pkg` file is trusted at face + value: apart from checking for duplicates, all entries found in a + :file:`\*.pkg` file are added to the path, regardless of whether they exist + on the filesystem. (This is a feature.) If the input path is not a list (as is the case for frozen packages) it is returned unchanged. The input path is not modified; an extended copy is returned. Items are only appended to the copy at the end. - It is assumed that ``sys.path`` is a sequence. Items of ``sys.path`` that are - not strings referring to existing directories are ignored. Unicode items on - ``sys.path`` that cause errors when used as filenames may cause this function - to raise an exception (in line with :func:`os.path.isdir` behavior). + It is assumed that :data:`sys.path` is a sequence. Items of :data:`sys.path` + that are not strings referring to existing directories are ignored. Unicode + items on :data:`sys.path` that cause errors when used as filenames may cause + this function to raise an exception (in line with :func:`os.path.isdir` + behavior). + + +.. 
class:: ImpImporter(dirname=None) + + :pep:`302` Importer that wraps Python's "classic" import algorithm. + + If *dirname* is a string, a :pep:`302` importer is created that searches that + directory. If *dirname* is ``None``, a :pep:`302` importer is created that + searches the current :data:`sys.path`, plus any modules that are frozen or + built-in. + + Note that :class:`ImpImporter` does not currently support being used by + placement on :data:`sys.meta_path`. + + +.. class:: ImpLoader(fullname, file, filename, etc) + + :pep:`302` Loader that wraps Python's "classic" import algorithm. + + +.. function:: find_loader(fullname) + + Find a :pep:`302` "loader" object for *fullname*. + + If *fullname* contains dots, path must be the containing package's + ``__path__``. Returns ``None`` if the module cannot be found or imported. + This function uses :func:`iter_importers`, and is thus subject to the same + limitations regarding platform-specific special import locations such as the + Windows registry. + + +.. function:: get_importer(path_item) + + Retrieve a :pep:`302` importer for the given *path_item*. + + The returned importer is cached in :data:`sys.path_importer_cache` if it was + newly created by a path hook. + + If there is no importer, a wrapper around the basic import machinery is + returned. This wrapper is never inserted into the importer cache (None is + inserted instead). + + The cache (or part of it) can be cleared manually if a rescan of + :data:`sys.path_hooks` is necessary. + + +.. function:: get_loader(module_or_name) + + Get a :pep:`302` "loader" object for *module_or_name*. + + If the module or package is accessible via the normal import mechanism, a + wrapper around the relevant part of that machinery is returned. Returns + ``None`` if the module cannot be found or imported. If the named module is + not already imported, its containing package (if any) is imported, in order + to establish the package ``__path__``. + + This function uses :func:`iter_importers`, and is thus subject to the same + limitations regarding platform-specific special import locations such as the + Windows registry. + + +.. function:: iter_importers(fullname='') + + Yield :pep:`302` importers for the given module name. + + If fullname contains a '.', the importers will be for the package containing + fullname, otherwise they will be importers for :data:`sys.meta_path`, + :data:`sys.path`, and Python's "classic" import machinery, in that order. If + the named module is in a package, that package is imported as a side effect + of invoking this function. + + Non-:pep:`302` mechanisms (e.g. the Windows registry) used by the standard + import machinery to find files in alternative locations are partially + supported, but are searched *after* :data:`sys.path`. Normally, these + locations are searched *before* :data:`sys.path`, preventing :data:`sys.path` + entries from shadowing them. + + For this to cause a visible difference in behaviour, there must be a module + or package name that is accessible via both :data:`sys.path` and one of the + non-:pep:`302` file system mechanisms. In this case, the emulation will find + the former version, while the builtin import mechanism will find the latter. + + Items of the following types can be affected by this discrepancy: + ``imp.C_EXTENSION``, ``imp.PY_SOURCE``, ``imp.PY_COMPILED``, + ``imp.PKG_DIRECTORY``. + + +.. 
function:: iter_modules(path=None, prefix='') + + Yields ``(module_loader, name, ispkg)`` for all submodules on *path*, or, if + path is ``None``, all top-level modules on :data:`sys.path`. + + *path* should be either ``None`` or a list of paths to look for modules in. + + *prefix* is a string to output on the front of every module name on output. + + +.. function:: walk_packages(path=None, prefix='', onerror=None) + + Yields ``(module_loader, name, ispkg)`` for all modules recursively on + *path*, or, if path is ``None``, all accessible modules. + + *path* should be either ``None`` or a list of paths to look for modules in. + + *prefix* is a string to output on the front of every module name on output. + + Note that this function must import all *packages* (*not* all modules!) on + the given *path*, in order to access the ``__path__`` attribute to find + submodules. + + *onerror* is a function which gets called with one argument (the name of the + package which was being imported) if any exception occurs while trying to + import a package. If no *onerror* function is supplied, :exc:`ImportError`\s + are caught and ignored, while all other exceptions are propagated, + terminating the search. + + Examples:: + + # list all modules python can access + walk_packages() + + # list all submodules of ctypes + walk_packages(ctypes.__path__, ctypes.__name__ + '.') + .. function:: get_data(package, resource) Get a resource from a package. - This is a wrapper for the :pep:`302` loader :func:`get_data` API. The package - argument should be the name of a package, in standard module format - (foo.bar). The resource argument should be in the form of a relative - filename, using ``/`` as the path separator. The parent directory name + This is a wrapper for the :pep:`302` loader :func:`get_data` API. The + *package* argument should be the name of a package, in standard module format + (``foo.bar``). The *resource* argument should be in the form of a relative + filename, using ``/`` as the path separator. The parent directory name ``..`` is not allowed, and nor is a rooted name (starting with a ``/``). - The function returns a binary string that is the contents of the - specified resource. + The function returns a binary string that is the contents of the specified + resource. For packages located in the filesystem, which have already been imported, this is the rough equivalent of:: - d = os.path.dirname(sys.modules[package].__file__) - data = open(os.path.join(d, resource), 'rb').read() + d = os.path.dirname(sys.modules[package].__file__) + data = open(os.path.join(d, resource), 'rb').read() If the package cannot be located or loaded, or it uses a :pep:`302` loader - which does not support :func:`get_data`, then None is returned. + which does not support :func:`get_data`, then ``None`` is returned. -API Reference -============= +Installed distributions database +-------------------------------- -.. automodule:: distutils2._backport.pkgutil - :members: +Installed Python distributions are represented by instances of +:class:`~distutils2._backport.pkgutil.Distribution`, or its subclass +:class:`~distutils2._backport.pkgutil.EggInfoDistribution` for legacy ``.egg`` +and ``.egg-info`` formats). Most functions also provide an extra argument +``use_egg_info`` to take legacy distributions into account. + +.. TODO write docs here, don't rely on automodule + classes: Distribution and descendents + functions: provides, obsoletes, replaces, etc. 
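Until that TODO is filled in, here is a rough sketch of the database queries as the backport's own tests exercise them in this changeset; the import location and exact signatures are assumptions taken from ``test_pkgutil`` and ``run.py``, not documented API::

    from distutils2._backport.pkgutil import (get_distributions,
                                              provides_distribution)

    # Walk every installed distribution, including legacy .egg/.egg-info
    # ones when use_egg_info is true.
    for dist in get_distributions(use_egg_info=True):
        print('%s %s' % (dist.name, dist.metadata['version']))

    # Which installed distributions provide "strawberry" in a version
    # matching the predicate?  (Mirrors the checks in test_pkgutil.)
    names = [dist.name for dist in
             provides_distribution('strawberry', '>=0.5', use_egg_info=True)]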
Caching +++++++ @@ -86,11 +213,10 @@ :func:`~distutils2._backport.pkgutil.clear_cache`. +Examples +-------- -Example Usage -============= - -Print All Information About a Distribution +Print all information about a distribution ++++++++++++++++++++++++++++++++++++++++++ Given a path to a ``.dist-info`` distribution, we shall print out all @@ -182,7 +308,7 @@ ===== * It was installed as a dependency -Find Out Obsoleted Distributions +Find out obsoleted distributions ++++++++++++++++++++++++++++++++ Now, we take tackle a different problem, we are interested in finding out diff --git a/patch b/patch deleted file mode 100644 --- a/patch +++ /dev/null @@ -1,23 +0,0 @@ -diff -r 5603e1bc5442 distutils2/util.py ---- a/distutils2/util.py Fri Jan 28 18:42:44 2011 +0100 -+++ b/distutils2/util.py Sat Jan 29 02:39:55 2011 +0100 -@@ -1203,12 +1203,13 @@ - in_cfg_value = has_get_option(config, section, option) - if not in_cfg_value: - # There is no such option in the setup.cfg -- continue -- -- if arg == "long_description": -- filename = has_get_option("description_file") -- if filename: -- in_cfg_value = open(filename).read() -+ if arg == "long_description": -+ filename = has_get_option(config, section, "description_file") -+ print "We have a filename", filename -+ if filename: -+ in_cfg_value = open(filename).read() -+ else: -+ continue - - if arg in MULTI_FIELDS: - # Special behaviour when we have a multi line option -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Remove raise into unittest2 check. Message-ID: tarek.ziade pushed d89fc47cfb15 to distutils2: http://hg.python.org/distutils2/rev/d89fc47cfb15 changeset: 1060:d89fc47cfb15 user: FELD Boris date: Sun Jan 30 12:00:47 2011 +0100 summary: Remove raise into unittest2 check. files: distutils2/tests/__init__.py diff --git a/distutils2/tests/__init__.py b/distutils2/tests/__init__.py --- a/distutils2/tests/__init__.py +++ b/distutils2/tests/__init__.py @@ -28,7 +28,6 @@ # external release of same package for older versions import unittest2 as unittest except ImportError: - raise sys.exit('Error: You have to install unittest2') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Fix last bug, check that data file not exists at the same path and has the same Message-ID: tarek.ziade pushed 84ae7ec1d532 to distutils2: http://hg.python.org/distutils2/rev/84ae7ec1d532 changeset: 1061:84ae7ec1d532 user: FELD Boris date: Sun Jan 30 14:42:59 2011 +0100 summary: Fix last bug, check that data file not exists at the same path and has the same content. 
files: distutils2/command/install_data.py distutils2/tests/test_command_install_data.py diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -10,6 +10,7 @@ from distutils2.command.cmd import Command from distutils2.util import change_root, convert_path from distutils2._backport.sysconfig import get_paths, format_value +from distutils2._backport.shutil import Error class install_data(Command): @@ -47,7 +48,11 @@ dir_dest = os.path.abspath(os.path.dirname(destination)) self.mkpath(dir_dest) - (out, _) = self.copy_file(file[0], dir_dest) + try: + (out, _) = self.copy_file(file[0], dir_dest) + except Error, e: + self.warn(e.message) + out = destination self.outfiles.append(out) self.data_files_out.append((file[0], destination)) diff --git a/distutils2/tests/test_command_install_data.py b/distutils2/tests/test_command_install_data.py --- a/distutils2/tests/test_command_install_data.py +++ b/distutils2/tests/test_command_install_data.py @@ -1,4 +1,5 @@ """Tests for distutils.command.install_data.""" +import cmd import os from distutils2.command.install_data import install_data @@ -10,18 +11,29 @@ unittest.TestCase): def test_simple_run(self): + from distutils2._backport.sysconfig import _SCHEMES as sysconfig_SCHEMES + from distutils2._backport.sysconfig import _get_default_scheme + #dirty but hit marmoute + + old_scheme = sysconfig_SCHEMES + pkg_dir, dist = self.create_dist() cmd = install_data(dist) cmd.install_dir = inst = os.path.join(pkg_dir, 'inst') + sysconfig_SCHEMES.set(_get_default_scheme(), 'inst', + os.path.join(pkg_dir, 'inst')) + sysconfig_SCHEMES.set(_get_default_scheme(), 'inst2', + os.path.join(pkg_dir, 'inst2')) + one = os.path.join(pkg_dir, 'one') self.write_file(one, 'xxx') inst2 = os.path.join(pkg_dir, 'inst2') two = os.path.join(pkg_dir, 'two') self.write_file(two, 'xxx') - cmd.data_files = {'one' : '{appdata}/one', 'two' : '{appdata}/two'} - self.assertEqual(cmd.get_inputs(), [one, (inst2, [two])]) + cmd.data_files = {one : '{inst}/one', two : '{inst2}/two'} + self.assertItemsEqual(cmd.get_inputs(), [one, two]) # let's run the command cmd.ensure_finalized() @@ -51,17 +63,22 @@ inst4 = os.path.join(pkg_dir, 'inst4') three = os.path.join(cmd.install_dir, 'three') self.write_file(three, 'xx') - cmd.data_files = [one, (inst2, [two]), - ('inst3', [three]), - (inst4, [])] + + sysconfig_SCHEMES.set(_get_default_scheme(), 'inst3', cmd.install_dir) + + cmd.data_files = {one : '{inst}/one', + two : '{inst2}/two', + three : '{inst3}/three'} cmd.ensure_finalized() cmd.run() # let's check the result - self.assertEqual(len(cmd.get_outputs()), 4) + self.assertEqual(len(cmd.get_outputs()), 3) self.assertTrue(os.path.exists(os.path.join(inst2, rtwo))) self.assertTrue(os.path.exists(os.path.join(inst, rone))) + sysconfig_SCHEMES = old_scheme + def test_suite(): return unittest.makeSuite(InstallDataTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Merge with upstream Message-ID: tarek.ziade pushed bab1d09babc8 to distutils2: http://hg.python.org/distutils2/rev/bab1d09babc8 changeset: 1063:bab1d09babc8 parent: 1062:7d859dd56e92 parent: 981:93d7c64c7a67 user: FELD Boris date: Sun Jan 30 15:40:43 2011 +0100 summary: Merge with upstream files: diff --git 
a/distutils2/_backport/__init__.py b/distutils2/_backport/__init__.py --- a/distutils2/_backport/__init__.py +++ b/distutils2/_backport/__init__.py @@ -1,2 +1,8 @@ """Things that will land in the Python 3.3 std lib but which we must drag along -with us for now to support 2.x.""" + us for now to support 2.x.""" + +def any(seq): + for elem in seq: + if elem: + return True + return False diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -254,10 +254,12 @@ ('description', 'summary'), ('long_description', 'description'), ('url', 'home_page'), - ('platforms', 'platform'), - ('provides', 'provides-dist'), - ('obsoletes', 'obsoletes-dist'), - ('requires', 'requires-dist'),) + ('platforms', 'platform')) + + if sys.version[:3] >= '2.5': + labels += (('provides', 'provides-dist'), + ('obsoletes', 'obsoletes-dist'), + ('requires', 'requires-dist'),) get = lambda lab: getattr(dist.metadata, lab.replace('-', '_')) data.update((new, get(old)) for (old, new) in labels if get(old)) # 2. retrieves data that requires special processings. @@ -281,7 +283,7 @@ for tok, path in path_tokens: if dest.startswith(path): dest = ('{%s}' % tok) + dest[len(path):] - files = [('/ '.join(src.rsplit('/', 1)), dest) + files = [('/ '.join(src.rsplit('/', 1)), dest) for src in srcs] data['resources'].extend(files) continue diff --git a/distutils2/tests/test_index_dist.py b/distutils2/tests/test_index_dist.py --- a/distutils2/tests/test_index_dist.py +++ b/distutils2/tests/test_index_dist.py @@ -166,6 +166,7 @@ there = dist.unpack(here) result = os.listdir(there) self.assertIn('paf', result) + os.remove('paf') def test_hashname(self): # Invalid hashnames raises an exception on assignation -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Merge Message-ID: tarek.ziade pushed 7d859dd56e92 to distutils2: http://hg.python.org/distutils2/rev/7d859dd56e92 changeset: 1062:7d859dd56e92 parent: 1061:84ae7ec1d532 parent: 978:301ad616c208 user: FELD Boris date: Sun Jan 30 15:09:13 2011 +0100 summary: Merge files: distutils2/_backport/pkgutil.py distutils2/config.py distutils2/tests/test_config.py docs/source/setupcfg.rst diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -922,10 +922,6 @@ for field in ('Obsoletes', 'Requires', 'Provides'): del self.metadata[field] - provides = "%s (%s)" % (self.metadata['name'], - self.metadata['version']) - self.metadata['Provides-Dist'] += (provides,) - reqs = [] if requires is not None: diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -742,6 +742,8 @@ if extract_dir is None: extract_dir = os.getcwd() + func = None + if format is not None: try: format_info = _UNPACK_FORMATS[format] @@ -758,4 +760,9 @@ func = _UNPACK_FORMATS[format][1] kwargs = dict(_UNPACK_FORMATS[format][2]) - raise ValueError('Unknown archive format: %s' % filename) + func(filename, extract_dir, **kwargs) + + if func is None: + raise ValueError('Unknown archive format: %s' % filename) + + return extract_dir diff --git a/distutils2/_backport/tests/fake_dists/coconuts-aster-10.3.egg-info/PKG-INFO b/distutils2/_backport/tests/fake_dists/coconuts-aster-10.3.egg-info/PKG-INFO new file 
mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/coconuts-aster-10.3.egg-info/PKG-INFO @@ -0,0 +1,5 @@ +Metadata-Version: 1.2 +Name: coconuts-aster +Version: 10.3 +Provides-Dist: strawberry (0.6) +Provides-Dist: banana (0.4) diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -389,6 +389,7 @@ # Now, test if the egg-info distributions are found correctly as well fake_dists += [('bacon', '0.1'), ('cheese', '2.0.2'), + ('coconuts-aster', '10.3'), ('banana', '0.4'), ('strawberry', '0.6'), ('truffles', '5.0'), ('nut', 'funkyversion')] found_dists = [] @@ -494,18 +495,18 @@ l = [dist.name for dist in provides_distribution('truffles', '>1.5', use_egg_info=True)] - checkLists(l, ['bacon', 'truffles']) + checkLists(l, ['bacon']) l = [dist.name for dist in provides_distribution('truffles', '>=1.0')] checkLists(l, ['choxie', 'towel-stuff']) l = [dist.name for dist in provides_distribution('strawberry', '0.6', use_egg_info=True)] - checkLists(l, ['strawberry']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('strawberry', '>=0.5', use_egg_info=True)] - checkLists(l, ['strawberry']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('strawberry', '>0.6', use_egg_info=True)] @@ -513,11 +514,11 @@ l = [dist.name for dist in provides_distribution('banana', '0.4', use_egg_info=True)] - checkLists(l, ['banana']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('banana', '>=0.3', use_egg_info=True)] - checkLists(l, ['banana']) + checkLists(l, ['coconuts-aster']) l = [dist.name for dist in provides_distribution('banana', '!=0.4', use_egg_info=True)] @@ -557,7 +558,7 @@ eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), ('truffles', '5.0'), ('cheese', '2.0.2'), - ('nut', 'funkyversion')] + ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] dists = [('choxie', '2.0.0.9'), ('grammar', '1.0a4'), ('towel-stuff', '0.1')] diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -8,6 +8,7 @@ import sys import re from ConfigParser import RawConfigParser +from shlex import split from distutils2 import logger from distutils2.errors import DistutilsOptionError @@ -20,15 +21,9 @@ def _pop_values(values_dct, key): """Remove values from the dictionary and convert them as a list""" - vals_str = values_dct.pop(key, None) - if not vals_str: - return + vals_str = values_dct.pop(key, '') # Get bash options like `gcc -print-file-name=libgcc.a` - vals = re.search('(`.*?`)', vals_str) or [] - if vals: - vals = list(vals.groups()) - vals_str = re.sub('`.*?`', '', vals_str) - vals.extend(vals_str.split()) + vals = split(vals_str) if vals: return vals diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -149,6 +149,16 @@ dist = self.dists.values()[0] return dist + def unpack(self, path=None, prefer_source=True): + """Unpack the distribution to the given path. + + If not destination is given, creates a temporary location. + + Returns the location of the extracted files (root). + """ + return self.get_distribution(prefer_source=prefer_source)\ + .unpack(path=path) + def download(self, temp_path=None, prefer_source=True): """Download the distribution, using the requirements. 
@@ -312,7 +322,7 @@ if path is None: path = tempfile.mkdtemp() - filename = self.download() + filename = self.download(path) content_type = mimetypes.guess_type(filename)[0] self._unpacked_dir = unpack_archive(filename) @@ -332,8 +342,11 @@ % (hashval.hexdigest(), expected_hashval)) def __repr__(self): + if self.release is None: + return "" % self.dist_type + return "<%s %s %s>" % ( - self.release.name, self.release.version, self.dist_type or "") + self.release.name, self.release.version, self.dist_type or "") class ReleasesList(IndexReference): diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -1,4 +1,11 @@ -from tempfile import mkdtemp +"""Provides installations scripts. + +The goal of this script is to install a release from the indexes (eg. +PyPI), including the dependencies of the releases if needed. + +It uses the work made in pkgutil and by the index crawlers to browse the +installed distributions, and rely on the instalation commands to install. +""" import shutil import os import sys @@ -17,14 +24,9 @@ from distutils2.errors import DistutilsError from distutils2.version import get_version_predicate -"""Provides installations scripts. -The goal of this script is to install a release from the indexes (eg. -PyPI), including the dependencies of the releases if needed. - -It uses the work made in pkgutil and by the index crawlers to browse the -installed distributions, and rely on the instalation commands to install. -""" +__all__ = ['install_dists', 'install_from_infos', 'get_infos', 'remove', + 'install'] class InstallationException(Exception): @@ -35,7 +37,7 @@ """Raised when a conflict is detected""" -def move_files(files, destination=None): +def _move_files(files, destination): """Move the list of files in the destination folder, keeping the same structure. @@ -43,13 +45,11 @@ :param files: a list of files to move. :param destination: the destination directory to put on the files. - if not defined, create a new one, using mkdtemp """ - if not destination: - destination = mkdtemp() - for old in files: - new = '%s%s' % (destination, old) + # not using os.path.join() because basename() might not be + # unique in destination + new = "%s%s" % (destination, old) # try to make the paths. try: @@ -60,7 +60,7 @@ else: raise e os.rename(old, new) - yield (old, new) + yield old, new def _run_d1_install(archive_dir, path): @@ -93,7 +93,7 @@ * copy the files in "path" * determine if the distribution is distutils2 or distutils1. """ - where = dist.unpack(archive) + where = dist.unpack(path) # get into the dir archive_dir = None @@ -119,7 +119,7 @@ os.chdir(old_dir) -def install_dists(dists, path=None): +def install_dists(dists, path, paths=sys.path): """Install all distributions provided in dists, with the given prefix. 
If an error occurs while installing one of the distributions, uninstall all @@ -129,27 +129,28 @@ :param dists: distributions to install :param path: base path to install distribution in + :param paths: list of paths (defaults to sys.path) to look for info """ - if not path: - path = mkdtemp() installed_dists, installed_files = [], [] - for d in dists: - logger.info('Installing %s %s' % (d.name, d.version)) + for dist in dists: + logger.info('Installing %s %s' % (dist.name, dist.version)) try: - installed_files.extend(_install_dist(d, path)) - installed_dists.append(d) - except Exception, e : + installed_files.extend(_install_dist(dist, path)) + installed_dists.append(dist) + except Exception, e: logger.info('Failed. %s' % str(e)) # reverting - for d in installed_dists: - uninstall(d) + for installed_dist in installed_dists: + _remove_dist(installed_dist, paths) raise e + return installed_files -def install_from_infos(install=[], remove=[], conflicts=[], install_path=None): +def install_from_infos(install_path=None, install=[], remove=[], conflicts=[], + paths=sys.path): """Install and remove the given distributions. The function signature is made to be compatible with the one of get_infos. @@ -168,35 +169,43 @@ 4. Else, move the distributions to the right locations, and remove for real the distributions thats need to be removed. - :param install: list of distributions that will be installed. + :param install_path: the installation path where we want to install the + distributions. + :param install: list of distributions that will be installed; install_path + must be provided if this list is not empty. :param remove: list of distributions that will be removed. :param conflicts: list of conflicting distributions, eg. that will be in conflict once the install and remove distribution will be processed. - :param install_path: the installation path where we want to install the - distributions. + :param paths: list of paths (defaults to sys.path) to look for info """ # first of all, if we have conflicts, stop here. if conflicts: raise InstallationConflict(conflicts) + if install and not install_path: + raise ValueError("Distributions are to be installed but `install_path`" + " is not provided.") + # before removing the files, we will start by moving them away # then, if any error occurs, we could replace them in the good place. temp_files = {} # contains lists of {dist: (old, new)} paths + temp_dir = None if remove: + temp_dir = tempfile.mkdtemp() for dist in remove: files = dist.get_installed_files() - temp_files[dist] = move_files(files) + temp_files[dist] = _move_files(files, temp_dir) try: if install: - installed_files = install_dists(install, install_path) # install to tmp first - + install_dists(install, install_path, paths) except: - # if an error occurs, put back the files in the good place. + # if an error occurs, put back the files in the right place. 
for files in temp_files.values(): for old, new in files: shutil.move(new, old) - + if temp_dir: + shutil.rmtree(temp_dir) # now re-raising raise @@ -204,6 +213,8 @@ for files in temp_files.values(): for old, new in files: os.remove(new) + if temp_dir: + shutil.rmtree(temp_dir) def _get_setuptools_deps(release): @@ -265,9 +276,9 @@ # Get all the releases that match the requirements try: releases = index.get_releases(requirements) - except (ReleaseNotFound, ProjectNotFound), e: + except (ReleaseNotFound, ProjectNotFound): raise InstallationException('Release not found: "%s"' % requirements) - + # Pick up a release, and try to get the dependency tree release = releases.get_last(requirements, prefer_final=prefer_final) @@ -284,6 +295,8 @@ else: deps = metadata['requires_dist'] + # XXX deps not used + distributions = itertools.chain(installed, [release]) depgraph = generate_graph(distributions) @@ -315,16 +328,19 @@ infos[key].extend(new_infos[key]) +def _remove_dist(dist, paths=sys.path): + remove(dist.name, paths) + + def remove(project_name, paths=sys.path): """Removes a single project from the installation""" - tmp = tempfile.mkdtemp(prefix=project_name+'-uninstall') dist = get_distribution(project_name, paths=paths) if dist is None: raise DistutilsError('Distribution %s not found' % project_name) files = dist.get_installed_files(local=True) rmdirs = [] rmfiles = [] - + tmp = tempfile.mkdtemp(prefix=project_name + '-uninstall') try: for file, md5, size in files: if os.path.isfile(file): @@ -339,8 +355,8 @@ rmfiles.append(file) if dirname not in rmdirs: rmdirs.append(dirname) - except OSError: - os.rmdir(tmp) + finally: + shutil.rmtree(tmp) for file in rmfiles: os.remove(file) @@ -351,14 +367,6 @@ os.rmdir(dirname) - -def main(**attrs): - if 'script_args' not in attrs: - import sys - attrs['requirements'] = sys.argv[1] - get_infos(**attrs) - - def install(project): logger.info('Getting information about "%s".' % project) try: @@ -373,13 +381,20 @@ install_path = get_config_var('base') try: - install_from_infos(info['install'], info['remove'], info['conflict'], - install_path=install_path) + install_from_infos(install_path, + info['install'], info['remove'], info['conflict']) except InstallationConflict, e: projects = ['%s %s' % (p.name, p.version) for p in e.args[0]] logger.info('"%s" conflicts with "%s"' % (project, ','.join(projects))) +def _main(**attrs): + if 'script_args' not in attrs: + import sys + attrs['requirements'] = sys.argv[1] + get_infos(**attrs) + + if __name__ == '__main__': - main() + _main() diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -266,23 +266,25 @@ data['packages'].extend(dist.packages or []) data['modules'].extend(dist.py_modules or []) # 2.1 data_files -> resources. 
- if len(dist.data_files) < 2 or isinstance(dist.data_files[1], str): - dist.data_files = [('', dist.data_files)] - #add tokens in the destination paths - vars = {'distribution.name':data['name']} - path_tokens = sysconfig.get_paths(vars=vars).items() - #sort tokens to use the longest one first - path_tokens.sort(cmp=lambda x,y: cmp(len(y), len(x)), - key=lambda x: x[1]) - for dest, srcs in (dist.data_files or []): - dest = os.path.join(sys.prefix, dest) - for tok, path in path_tokens: - if dest.startswith(path): - dest = ('{%s}' % tok) + dest[len(path):] - files = [('/ '.join(src.rsplit('/', 1)), dest) - for src in srcs] - data['resources'].extend(files) - continue + if dist.data_files: + if len(dist.data_files) < 2 or \ + isinstance(dist.data_files[1], str): + dist.data_files = [('', dist.data_files)] + #add tokens in the destination paths + vars = {'distribution.name':data['name']} + path_tokens = sysconfig.get_paths(vars=vars).items() + #sort tokens to use the longest one first + path_tokens.sort(cmp=lambda x,y: cmp(len(y), len(x)), + key=lambda x: x[1]) + for dest, srcs in (dist.data_files or []): + dest = os.path.join(sys.prefix, dest) + for tok, path in path_tokens: + if dest.startswith(path): + dest = ('{%s}' % tok) + dest[len(path):] + files = [('/ '.join(src.rsplit('/', 1)), dest) + for src in srcs] + data['resources'].extend(files) + continue # 2.2 package_data -> extra_files package_dirs = dist.package_dir or {} for package, extras in dist.package_data.iteritems() or []: diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -83,10 +83,10 @@ dist = distclass(attrs) except DistutilsSetupError, msg: if 'name' in attrs: - raise SystemExit, "error in %s setup command: %s" % \ - (attrs['name'], msg) + raise SystemExit("error in %s setup command: %s" % \ + (attrs['name'], msg)) else: - raise SystemExit, "error in setup command: %s" % msg + raise SystemExit("error in setup command: %s" % msg) # Find and parse the config file(s): they will override options from # the setup script, but be overridden by the command line. @@ -98,22 +98,21 @@ try: res = dist.parse_command_line() except DistutilsArgError, msg: - raise SystemExit, gen_usage(dist.script_name) + "\nerror: %s" % msg + raise SystemExit(gen_usage(dist.script_name) + "\nerror: %s" % msg) # And finally, run all the commands found on the command line. if res: try: dist.run_commands() except KeyboardInterrupt: - raise SystemExit, "interrupted" + raise SystemExit("interrupted") except (IOError, os.error), exc: error = grok_environment_error(exc) - raise SystemExit, error + raise SystemExit(error) except (DistutilsError, CCompilerError), msg: - raise - raise SystemExit, "error: " + str(msg) + raise SystemExit("error: " + str(msg)) return dist @@ -127,7 +126,10 @@ def main(): - """Main entry point for Distutils2""" + """Main entry point for Distutils2 + + Execute an action or delegate to the commands system. 
+ """ _set_logger() parser = OptionParser() parser.disable_interspersed_args() @@ -164,7 +166,7 @@ options, args = parser.parse_args() if options.version: print('Distutils2 %s' % __version__) -# sys.exit(0) + return 0 if len(options.metadata): from distutils2.dist import Distribution @@ -178,18 +180,18 @@ keys = options.metadata if len(keys) == 1: print metadata[keys[0]] - sys.exit(0) + return for key in keys: if key in metadata: - print(metadata._convert_name(key)+':') + print(metadata._convert_name(key) + ':') value = metadata[key] if isinstance(value, list): for v in value: - print(' '+v) + print(' ' + v) else: - print(' '+value.replace('\n', '\n ')) - sys.exit(0) + print(' ' + value.replace('\n', '\n ')) + return 0 if options.search is not None: search = options.search.lower() @@ -199,7 +201,7 @@ print('%s %s at %s' % (dist.name, dist.metadata['version'], dist.path)) - sys.exit(0) + return 0 if options.graph is not None: name = options.graph @@ -211,25 +213,25 @@ graph = generate_graph(dists) print(graph.repr_node(dist)) - sys.exit(0) + return 0 if options.fgraph: dists = get_distributions(use_egg_info=True) graph = generate_graph(dists) print(graph) - sys.exit(0) + return 0 if options.install is not None: install(options.install) - sys.exit(0) + return 0 if len(args) == 0: parser.print_help() - sys.exit(0) + return 0 - return commands_main() -# sys.exit(0) + commands_main() + return 0 if __name__ == '__main__': - main() + sys.exit(main()) diff --git a/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz b/distutils2/tests/pypiserver/downloads_with_md5/simple/foobar/foobar-0.1.tar.gz index 0000000000000000000000000000000000000000..333961eb18a6e7db80fefd41c339ab218d5180c4 GIT binary patch literal 110 zc$|~(=3uy!>FUeC{PvtR-ysJc)&sVu?9yZ7`(A1Di)P(6s!I71JWZ;--fWND`LA)=lAmk-7Jbj=XMlnFEsQ#U Kd|Vkc7#IK&xGYxy diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -103,7 +103,7 @@ [extension=speed_coconuts] name = one.speed_coconuts sources = c_src/speed_coconuts.c -extra_link_args = `gcc -print-file-name=libgcc.a` -shared +extra_link_args = "`gcc -print-file-name=libgcc.a`" -shared define_macros = HAVE_CAIRO HAVE_GTK2 [extension=fast_taunt] diff --git a/distutils2/tests/test_index_dist.py b/distutils2/tests/test_index_dist.py --- a/distutils2/tests/test_index_dist.py +++ b/distutils2/tests/test_index_dist.py @@ -127,7 +127,7 @@ url = "%s/simple/foobar/foobar-0.1.tar.gz" % server.full_address # check md5 if given dist = Dist(url=url, hashname="md5", - hashval="d41d8cd98f00b204e9800998ecf8427e") + hashval="fe18804c5b722ff024cabdf514924fc4") dist.download(self.mkdtemp()) # a wrong md5 fails @@ -157,6 +157,24 @@ hashname="invalid_hashname", hashval="value") + @use_pypi_server('downloads_with_md5') + def test_unpack(self, server): + url = "%s/simple/foobar/foobar-0.1.tar.gz" % server.full_address + dist = Dist(url=url) + # doing an unpack + here = self.mkdtemp() + there = dist.unpack(here) + result = os.listdir(there) + self.assertIn('paf', result) + + def test_hashname(self): + # Invalid hashnames raises an exception on assignation + Dist(hashname="md5", hashval="value") + + self.assertRaises(UnsupportedHashName, Dist, + hashname="invalid_hashname", + hashval="value") + class TestReleasesList(unittest.TestCase): diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ 
-39,6 +39,11 @@ for f in range(0,3): self._real_files.append(mkstemp()) + def _unlink_installed_files(self): + if self._files: + for f in self._real_files: + os.unlink(f[1]) + def get_installed_files(self, **args): if self._files: return [f[1] for f in self._real_files] @@ -54,14 +59,14 @@ self._called_with = [] self._return_value = return_value self._raise = raise_exception - + def __call__(self, *args, **kwargs): self.called = True self._times_called = self._times_called + 1 self._called_with.append((args, kwargs)) iterable = hasattr(self._raise, '__iter__') if self._raise: - if ((not iterable and self._raise) + if ((not iterable and self._raise) or self._raise[self._times_called - 1]): raise Exception return self._return_value @@ -70,25 +75,8 @@ return (args, kwargs) in self._called_with -def patch(parent, to_patch): - """monkey match a module""" - def wrapper(func): - print func - print dir(func) - old_func = getattr(parent, to_patch) - def wrapped(*args, **kwargs): - parent.__dict__[to_patch] = MagicMock() - try: - out = func(*args, **kwargs) - finally: - setattr(parent, to_patch, old_func) - return out - return wrapped - return wrapper - - def get_installed_dists(dists): - """Return a list of fake installed dists. + """Return a list of fake installed dists. The list is name, version, deps""" objects = [] for (name, version, deps) in dists: @@ -100,12 +88,6 @@ def _get_client(self, server, *args, **kwargs): return Client(server.full_address, *args, **kwargs) - def _patch_run_install(self): - """Patch run install""" - - def _unpatch_run_install(self): - """Unpatch run install for d2 and d1""" - def _get_results(self, output): """return a list of results""" installed = [(o.name, '%s' % o.version) for o in output['install']] @@ -205,7 +187,7 @@ ]) # name, version, deps. 
- already_installed = [('bacon', '0.1', []), + already_installed = [('bacon', '0.1', []), ('chicken', '1.1', ['bacon (0.1)'])] output = install.get_infos("choxie", index=client, installed= get_installed_dists(already_installed)) @@ -236,7 +218,7 @@ files = [os.path.join(path, '%s' % x) for x in range(1, 20)] for f in files: file(f, 'a+') - output = [o for o in install.move_files(files, newpath)] + output = [o for o in install._move_files(files, newpath)] # check that output return the list of old/new places for f in files: @@ -265,19 +247,19 @@ old_install_dist = install._install_dist old_uninstall = getattr(install, 'uninstall', None) - install._install_dist = MagicMock(return_value=[], + install._install_dist = MagicMock(return_value=[], raise_exception=(False, True)) - install.uninstall = MagicMock() + install.remove = MagicMock() try: d1 = ToInstallDist() d2 = ToInstallDist() path = self.mkdtemp() self.assertRaises(Exception, install.install_dists, [d1, d2], path) self.assertTrue(install._install_dist.called_with(d1, path)) - self.assertTrue(install.uninstall.called) + self.assertTrue(install.remove.called) finally: install._install_dist = old_install_dist - install.uninstall = old_uninstall + install.remove = old_uninstall def test_install_dists_success(self): @@ -322,7 +304,7 @@ old_install_dist = install._install_dist old_uninstall = getattr(install, 'uninstall', None) - install._install_dist = MagicMock(return_value=[], + install._install_dist = MagicMock(return_value=[], raise_exception=(False, True)) install.uninstall = MagicMock() try: @@ -331,14 +313,17 @@ for i in range(0,2): remove.append(ToInstallDist(files=True)) to_install = [ToInstallDist(), ToInstallDist()] + temp_dir = self.mkdtemp() - self.assertRaises(Exception, install.install_from_infos, - remove=remove, install=to_install) + self.assertRaises(Exception, install.install_from_infos, + install_path=temp_dir, install=to_install, + remove=remove) # assert that the files are in the same place # assert that the files have been removed for dist in remove: for f in dist.get_installed_files(): self.assertTrue(os.path.exists(f)) + dist._unlink_installed_files() finally: install.install_dist = old_install_dist install.uninstall = old_uninstall @@ -352,8 +337,7 @@ install_path = "my_install_path" to_install = [ToInstallDist(), ToInstallDist()] - install.install_from_infos(install=to_install, - install_path=install_path) + install.install_from_infos(install_path, install=to_install) for dist in to_install: install._install_dist.called_with(install_path) finally: diff --git a/docs/design/configfile.rst b/docs/design/configfile.rst new file mode 100644 --- /dev/null +++ b/docs/design/configfile.rst @@ -0,0 +1,132 @@ +.. _setup-config: + +************************************ +Writing the Setup Configuration File +************************************ + +Often, it's not possible to write down everything needed to build a distribution +*a priori*: you may need to get some information from the user, or from the +user's system, in order to proceed. As long as that information is fairly +simple---a list of directories to search for C header files or libraries, for +example---then providing a configuration file, :file:`setup.cfg`, for users to +edit is a cheap and easy way to solicit it. Configuration files also let you +provide default values for any command option, which the installer can then +override either on the command line or by editing the config file. 
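For example, the ``--prefix`` that the new uninstall tests pass to ``install_dist`` on the command line could just as well be shipped as a config-file default that an installer remains free to override; the section name follows the one-section-per-command syntax described just below, and the path is made up::

    [install_dist]
    prefix = /opt/mypackage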
+ +The setup configuration file is a useful middle-ground between the setup script +---which, ideally, would be opaque to installers [#]_---and the command line to +the setup script, which is outside of your control and entirely up to the +installer. In fact, :file:`setup.cfg` (and any other Distutils configuration +files present on the target system) are processed after the contents of the +setup script, but before the command line. This has several useful +consequences: + +.. If you have more advanced needs, such as determining which extensions to + build based on what capabilities are present on the target system, then you + need the Distutils auto-configuration facility. This started to appear in + Distutils 0.9 but, as of this writing, isn't mature or stable enough yet + for real-world use. + +* installers can override some of what you put in :file:`setup.py` by editing + :file:`setup.cfg` + +* you can provide non-standard defaults for options that are not easily set in + :file:`setup.py` + +* installers can override anything in :file:`setup.cfg` using the command-line + options to :file:`setup.py` + +The basic syntax of the configuration file is simple:: + + [command] + option=value + ... + +where *command* is one of the Distutils commands (e.g. :command:`build_py`, +:command:`install`), and *option* is one of the options that command supports. +Any number of options can be supplied for each command, and any number of +command sections can be included in the file. Blank lines are ignored, as are +comments, which run from a ``'#'`` character until the end of the line. Long +option values can be split across multiple lines simply by indenting the +continuation lines. + +You can find out the list of options supported by a particular command with the +universal :option:`--help` option, e.g. :: + + > python setup.py --help build_ext + [...] + Options for 'build_ext' command: + --build-lib (-b) directory for compiled extension modules + --build-temp (-t) directory for temporary files (build by-products) + --inplace (-i) ignore build-lib and put compiled extensions into the + source directory alongside your pure Python modules + --include-dirs (-I) list of directories to search for header files + --define (-D) C preprocessor macros to define + --undef (-U) C preprocessor macros to undefine + --swig-opts list of SWIG command-line options + [...] + +.. XXX do we want to support ``setup.py --help metadata``? + +Note that an option spelled :option:`--foo-bar` on the command line is spelled +:option:`foo_bar` in configuration files. + +For example, say you want your extensions to be built "in-place"---that is, you +have an extension :mod:`pkg.ext`, and you want the compiled extension file +(:file:`ext.so` on Unix, say) to be put in the same source directory as your +pure Python modules :mod:`pkg.mod1` and :mod:`pkg.mod2`. You can always use the +:option:`--inplace` option on the command line to ensure this:: + + python setup.py build_ext --inplace + +But this requires that you always specify the :command:`build_ext` command +explicitly, and remember to provide :option:`--inplace`. An easier way is to +"set and forget" this option, by encoding it in :file:`setup.cfg`, the +configuration file for this distribution:: + + [build_ext] + inplace=1 + +This will affect all builds of this module distribution, whether or not you +explicitly specify :command:`build_ext`. 
If you include :file:`setup.cfg` in +your source distribution, it will also affect end-user builds---which is +probably a bad idea for this option, since always building extensions in-place +would break installation of the module distribution. In certain peculiar cases, +though, modules are built right in their installation directory, so this is +conceivably a useful ability. (Distributing extensions that expect to be built +in their installation directory is almost always a bad idea, though.) + +Another example: certain commands take a lot of options that don't change from +run to run; for example, :command:`bdist_rpm` needs to know everything required +to generate a "spec" file for creating an RPM distribution. Some of this +information comes from the setup script, and some is automatically generated by +the Distutils (such as the list of files installed). But some of it has to be +supplied as options to :command:`bdist_rpm`, which would be very tedious to do +on the command line for every run. Hence, here is a snippet from the Distutils' +own :file:`setup.cfg`:: + + [bdist_rpm] + release = 1 + packager = Greg Ward + doc_files = CHANGES.txt + README.txt + USAGE.txt + doc/ + examples/ + +Note that the :option:`doc_files` option is simply a whitespace-separated string +split across multiple lines for readability. + + +.. seealso:: + + :ref:`inst-config-syntax` in "Installing Python Modules" + More information on the configuration files is available in the manual for + system administrators. + + +.. rubric:: Footnotes + +.. [#] This ideal probably won't be achieved until auto-configuration is fully + supported by the Distutils. + diff --git a/docs/source/distutils/sourcedist.rst b/docs/source/distutils/sourcedist.rst --- a/docs/source/distutils/sourcedist.rst +++ b/docs/source/distutils/sourcedist.rst @@ -86,8 +86,7 @@ distributions, but in the future there will be a standard for testing Python module distributions) -* :file:`README.txt` (or :file:`README`), :file:`setup.py` (or whatever you - called your setup script), and :file:`setup.cfg` +* The configuration file :file:`setup.cfg` * all files that matches the ``package_data`` metadata. See :ref:`distutils-installing-package-data`. @@ -95,6 +94,10 @@ * all files that matches the ``data_files`` metadata. See :ref:`distutils-additional-files`. +.. Warning:: + In Distutils2, setup.py and README (or README.txt) files are not more + included in source distribution by default + Sometimes this is enough, but usually you will want to specify additional files to distribute. The typical way to do this is to write a *manifest template*, called :file:`MANIFEST.in` by default. The manifest template is just a list of diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -7,24 +7,35 @@ Each section contains a description of its options. -- Options that are marked *\*multi* can have multiple values, one value - per line. +- Options that are marked *\*multi* can have multiple values, one value per + line. - Options that are marked *\*optional* can be omited. -- Options that are marked *\*environ* can use environement markes, as described - in PEP 345. +- Options that are marked *\*environ* can use environment markers, as described + in :PEP:`345`. + The sections are: -- global -- metadata -- files -- command sections +global + Global options for Distutils2. + +metadata + The metadata section contains the metadata for the project as described in + :PEP:`345`. 
+ +files + Declaration of package files included in the project. + +`command` sections + Redefinition of user options for Distutils2 commands. global ====== -Contains global options for Distutils2. This section is shared with Distutils1. +Contains global options for Distutils2. This section is shared with Distutils1 +(legacy version distributed in python 2.X standard library). + - **commands**: Defined Distutils2 command. A command is defined by its fully qualified name. @@ -38,13 +49,13 @@ *\*optional* *\*multi* - **compilers**: Defined Distutils2 compiler. A compiler is defined by its fully - qualified name. + qualified name. Example:: [global] compiler = - package.compilers.CustomCCompiler + package.compiler.CustomCCompiler *\*optional* *\*multi* @@ -52,21 +63,29 @@ :file:`setup.cfg` file is read. The callable receives the configuration in form of a mapping and can make some changes to it. *\*optional* + Example:: + + [global] + setup_hook = + distutils2.tests.test_config.hook + metadata ======== The metadata section contains the metadata for the project as described in -PEP 345. +:PEP:`345`. +.. Note:: + Field names are case-insensitive. Fields: - **name**: Name of the project. -- **version**: Version of the project. Must comply with PEP 386. +- **version**: Version of the project. Must comply with :PEP:`386`. - **platform**: Platform specification describing an operating system supported by the distribution which is not listed in the "Operating System" Trove - classifiers. *\*multi* *\*optional* + classifiers (:PEP:`301`). *\*multi* *\*optional* - **supported-platform**: Binary distributions containing a PKG-INFO file will use the Supported-Platform field in their metadata to specify the OS and CPU for which the binary distribution was compiled. The semantics of @@ -113,14 +132,18 @@ name = pypi2rpm version = 0.1 author = Tarek Ziade - author_email = tarek at ziade.org + author-email = tarek at ziade.org summary = Script that transforms a sdist archive into a rpm archive description-file = README - home_page = http://bitbucket.org/tarek/pypi2rpm + home-page = http://bitbucket.org/tarek/pypi2rpm + project-url: RSS feed, https://bitbucket.org/tarek/pypi2rpm/rss classifier = Development Status :: 3 - Alpha License :: OSI Approved :: Mozilla Public License 1.1 (MPL 1.1) +.. Note:: + Some metadata fields seen in :PEP:`345` are automatically generated + (for instance Metadata-Version value). files @@ -148,6 +171,10 @@ extra_files = setup.py + README + +.. Note:: + In Distutils2, setup.cfg will be implicitly included. data-files ========== @@ -369,17 +396,27 @@ doc/ * = {doc} doc/ man = {man} -We use brace expansion syntax to place all the bash and batch scripts into {scripts} category. +We use brace expansion syntax to place all the bash and batch scripts into {scripts} category. -command sections -================ +.. 
Warning:: + In Distutils2, setup.py and README (or README.txt) files are not more + included in source distribution by default -Each command can have its options described in :file:`setup.cfg` +`command` sections +================== + +Each Distutils2 command can have its own user options defined in :file:`setup.cfg` Example:: [sdist] - manifest_makers = package.module.Maker + manifest-builders = package.module.Maker +To override the build class in order to generate Python3 code from your Python2 base:: + + [build_py] + use-2to3 = True + + -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Move iglob into from distutils2.datafiles module into distutils2.util Message-ID: tarek.ziade pushed cbbcc8b44d6d to distutils2: http://hg.python.org/distutils2/rev/cbbcc8b44d6d changeset: 1064:cbbcc8b44d6d user: Pierre-Yves David date: Sun Jan 30 17:45:18 2011 +0100 summary: Move iglob into from distutils2.datafiles module into distutils2.util files: distutils2/datafiles.py distutils2/tests/test_datafiles.py distutils2/tests/test_glob.py distutils2/util.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -1,6 +1,6 @@ import os import re -from glob import iglob as simple_iglob +from distutils2.util import iglob __all__ = ['iglob', 'resources_dests'] @@ -25,34 +25,6 @@ dest = os.path.join(destination, path_suffix) yield relpath, dest -RICH_GLOB = re.compile(r'\{([^}]*)\}') - -# r'\\\{' match "\{" - -def iglob(path_glob): - rich_path_glob = RICH_GLOB.split(path_glob, 1) - if len(rich_path_glob) > 1: - assert len(rich_path_glob) == 3, rich_path_glob - prefix, set, suffix = rich_path_glob - for item in set.split(','): - for path in iglob( ''.join((prefix, item, suffix))): - yield path - else: - if '**' not in path_glob: - for item in simple_iglob(path_glob): - yield item - else: - prefix, radical = path_glob.split('**', 1) - if prefix == '': - prefix = '.' 
- if radical == '': - radical = '*' - else: - radical = radical.lstrip('/') - for (path, dir, files) in os.walk(prefix): - for file in iglob(os.path.join(prefix, path, radical)): - yield os.path.join(prefix, file) - def resources_dests(resources_dir, rules): destinations = {} for (base, suffix, glob_dest) in rules: diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -5,71 +5,55 @@ from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest -from distutils2.datafiles import resources_dests, RICH_GLOB +from distutils2.tests.test_glob import GlobTestCaseBase +from distutils2.datafiles import resources_dests import re -class DataFilesTestCase(support.TempdirManager, - support.LoggingCatcher, - unittest.TestCase): +class DataFilesTestCase(GlobTestCaseBase): - def setUp(self): - super(DataFilesTestCase, self).setUp() - self.addCleanup(setattr, sys, 'stdout', sys.stdout) - self.addCleanup(setattr, sys, 'stderr', sys.stderr) + def assertRulesMatch(self, rules, spec): + tempdir = self.build_files_tree(spec) + expected = self.clean_tree(spec) + result = resources_dests(tempdir, rules) + self.assertEquals(expected, result) - - def build_spec(self, spec, clean=True): - tempdir = self.mkdtemp() - for filepath in spec: - filepath = os.path.join(tempdir, *filepath.split('/')) - dirname = os.path.dirname(filepath) - if dirname and not os.path.exists(dirname): - os.makedirs(dirname) - self.write_file(filepath, 'babar') - if clean: - for key, value in list(spec.items()): - if value is None: - del spec[key] - return tempdir - - def assertFindGlob(self, rules, spec): - tempdir = self.build_spec(spec) - result = resources_dests(tempdir, rules) - self.assertEquals(spec, result) - - def test_regex_rich_glob(self): - matches = RICH_GLOB.findall(r"babar aime les {fraises} est les {huitres}") - self.assertEquals(["fraises","huitres"], matches) + def clean_tree(self, spec): + files = {} + for path, value in spec.items(): + if value is not None: + path = self.os_dependant_path(path) + files[path] = value + return files def test_simple_glob(self): rules = [('', '*.tpl', '{data}')] spec = {'coucou.tpl': '{data}/coucou.tpl', 'Donotwant': None} - self.assertFindGlob(rules, spec) + self.assertRulesMatch(rules, spec) def test_multiple_match(self): rules = [('scripts', '*.bin', '{appdata}'), ('scripts', '*', '{appscript}')] spec = {'scripts/script.bin': '{appscript}/script.bin', 'Babarlikestrawberry': None} - self.assertFindGlob(rules, spec) + self.assertRulesMatch(rules, spec) def test_set_match(self): rules = [('scripts', '*.{bin,sh}', '{appscript}')] spec = {'scripts/script.bin': '{appscript}/script.bin', 'scripts/babar.sh': '{appscript}/babar.sh', 'Babarlikestrawberry': None} - self.assertFindGlob(rules, spec) + self.assertRulesMatch(rules, spec) def test_set_match_multiple(self): rules = [('scripts', 'script{s,}.{bin,sh}', '{appscript}')] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', 'scripts/script.sh': '{appscript}/script.sh', 'Babarlikestrawberry': None} - self.assertFindGlob(rules, spec) + self.assertRulesMatch(rules, spec) def test_set_match_exclude(self): rules = [('scripts', '*', '{appscript}'), @@ -77,13 +61,13 @@ spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', 'scripts/script.sh': None, 'Babarlikestrawberry': None} - self.assertFindGlob(rules, spec) + self.assertRulesMatch(rules, spec) def test_glob_in_base(self): rules = [('scrip*', '*.bin', 
'{appscript}')] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', 'Babarlikestrawberry': None} - tempdir = self.build_spec(spec) + tempdir = self.build_files_tree(spec) self.assertRaises(NotImplementedError, resources_dests, tempdir, rules) def test_recursive_glob(self): @@ -92,7 +76,7 @@ 'scripts/binary1.bin': '{binary}/scripts/binary1.bin', 'scripts/bin/binary2.bin': '{binary}/scripts/bin/binary2.bin', 'you/kill/pandabear.guy': None} - self.assertFindGlob(rules, spec) + self.assertRulesMatch(rules, spec) def test_final_exemple_glob(self): rules = [ @@ -118,7 +102,7 @@ 'developer-docs/api/toc.txt': '{doc}/developer-docs/api/toc.txt', } self.maxDiff = None - self.assertFindGlob(rules, spec) + self.assertRulesMatch(rules, spec) def test_suite(): return unittest.makeSuite(DataFilesTestCase) diff --git a/distutils2/tests/test_glob.py b/distutils2/tests/test_glob.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_glob.py @@ -0,0 +1,160 @@ +import os +import sys +import re +from StringIO import StringIO + +from distutils2.tests import unittest, support, run_unittest +from distutils2.util import iglob, RICH_GLOB + +class GlobTestCaseBase(support.TempdirManager, + support.LoggingCatcher, + unittest.TestCase): + + def build_files_tree(self, files): + tempdir = self.mkdtemp() + for filepath in files: + is_dir = filepath.endswith('/') + filepath = os.path.join(tempdir, *filepath.split('/')) + if is_dir: + dirname = filepath + else: + dirname = os.path.dirname(filepath) + if dirname and not os.path.exists(dirname): + os.makedirs(dirname) + if not is_dir: + self.write_file(filepath, 'babar') + return tempdir + + @staticmethod + def os_dependant_path(path): + path = path.rstrip('/').split('/') + return os.path.join(*path) + + def clean_tree(self, spec): + files = [] + for path, includes in list(spec.items()): + if includes: + files.append(self.os_dependant_path(path)) + return files + +class GlobTestCase(GlobTestCaseBase): + + + def assertGlobMatch(self, glob, spec): + """""" + tempdir = self.build_files_tree(spec) + expected = self.clean_tree(spec) + self.addCleanup(os.chdir, os.getcwd()) + os.chdir(tempdir) + result = list(iglob(glob)) + self.assertItemsEqual(expected, result) + + def test_regex_rich_glob(self): + matches = RICH_GLOB.findall(r"babar aime les {fraises} est les {huitres}") + self.assertEquals(["fraises","huitres"], matches) + + def test_simple_glob(self): + glob = '*.tp?' + spec = {'coucou.tpl': True, + 'coucou.tpj': True, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_simple_glob_in_dir(self): + glob = 'babar/*.tp?' 
+ spec = {'babar/coucou.tpl': True, + 'babar/coucou.tpj': True, + 'babar/toto.bin': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_head(self): + glob = '**/tip/*.t?l' + spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, + 'babar/z/tip/coucou.tpl': True, + 'babar/tip/coucou.tpl': True, + 'babar/zeop/tip/babar/babar.tpl': False, + 'babar/z/tip/coucou.bin': False, + 'babar/toto.bin': False, + 'zozo/zuzu/tip/babar.tpl': True, + 'zozo/tip/babar.tpl': True, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_tail(self): + glob = 'babar/**' + spec = {'babar/zaza/': True, + 'babar/zaza/zuzu/': True, + 'babar/zaza/zuzu/babar.xml': True, + 'babar/zaza/zuzu/toto.xml': True, + 'babar/zaza/zuzu/toto.csv': True, + 'babar/zaza/coucou.tpl': True, + 'babar/bubu.tpl': True, + 'zozo/zuzu/tip/babar.tpl': False, + 'zozo/tip/babar.tpl': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_middle(self): + glob = 'babar/**/tip/*.t?l' + spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, + 'babar/z/tip/coucou.tpl': True, + 'babar/tip/coucou.tpl': True, + 'babar/zeop/tip/babar/babar.tpl': False, + 'babar/z/tip/coucou.bin': False, + 'babar/toto.bin': False, + 'zozo/zuzu/tip/babar.tpl': False, + 'zozo/tip/babar.tpl': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_tail(self): + glob = 'bin/*.{bin,sh,exe}' + spec = {'bin/babar.bin': True, + 'bin/zephir.sh': True, + 'bin/celestine.exe': True, + 'bin/cornelius.bat': False, + 'bin/cornelius.xml': False, + 'toto/yurg': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_middle(self): + glob = 'xml/{babar,toto}.xml' + spec = {'xml/babar.xml': True, + 'xml/toto.xml': True, + 'xml/babar.xslt': False, + 'xml/cornelius.sgml': False, + 'xml/zephir.xml': False, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_head(self): + glob = '{xml,xslt}/babar.*' + spec = {'xml/babar.xml': True, + 'xml/toto.xml': False, + 'xslt/babar.xslt': True, + 'xslt/toto.xslt': False, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_all(self): + glob = '{xml/*,xslt/**}/babar.xml' + spec = {'xml/a/babar.xml': True, + 'xml/b/babar.xml': True, + 'xml/a/c/babar.xml': False, + 'xslt/a/babar.xml': True, + 'xslt/b/babar.xml': True, + 'xslt/a/c/babar.xml': True, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + +def test_suite(): + return unittest.makeSuite(GlobTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -14,6 +14,7 @@ import zipfile from copy import copy from fnmatch import fnmatchcase +from glob import iglob as std_iglob from ConfigParser import RawConfigParser from inspect import getsource @@ -1052,6 +1053,33 @@ return run_2to3(files, doctests_only, self.fixer_names, self.options, self.explicit) +RICH_GLOB = re.compile(r'\{([^}]*)\}') +def iglob(path_glob): + """Richer glob than the std glob module support ** and {opt1,opt2,opt3}""" + rich_path_glob = RICH_GLOB.split(path_glob, 1) + if len(rich_path_glob) > 1: + assert len(rich_path_glob) == 3, rich_path_glob + prefix, set, suffix = rich_path_glob + for item in set.split(','): + for path in iglob( ''.join((prefix, item, suffix))): + yield path + else: + if '**' not in path_glob: + for item in 
std_iglob(path_glob): + yield item + else: + prefix, radical = path_glob.split('**', 1) + if prefix == '': + prefix = '.' + if radical == '': + radical = '*' + else: + radical = radical.lstrip('/') + for (path, dir, files) in os.walk(prefix): + path = os.path.normpath(path) + for file in iglob(os.path.join(path, radical)): + yield file + def generate_distutils_kwargs_from_setup_cfg(file='setup.cfg'): """ Distutils2 to distutils1 compatibility util. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Merge with upstream. Message-ID: tarek.ziade pushed b61e8db532cd to distutils2: http://hg.python.org/distutils2/rev/b61e8db532cd changeset: 1067:b61e8db532cd parent: 1066:e4474c0740d7 parent: 985:22028f4d78bc user: FELD Boris date: Sun Jan 30 18:35:07 2011 +0100 summary: Merge with upstream. files: distutils2/_backport/pkgutil.py distutils2/util.py diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -7,6 +7,13 @@ import warnings from csv import reader as csv_reader from types import ModuleType +from stat import ST_SIZE + +try: + from hashlib import md5 +except ImportError: + from md5 import md5 + from distutils2.errors import DistutilsError from distutils2.index.errors import DistributionNotFound from distutils2.metadata import DistributionMetadata @@ -979,6 +986,33 @@ return '%s-%s at %s' % (self.name, self.metadata.version, self.path) def get_installed_files(self, local=False): + + def _md5(path): + f = open(path) + try: + content = f.read() + finally: + f.close() + return md5(content).hexdigest() + + def _size(path): + return os.stat(path)[ST_SIZE] + + path = self.path + if local: + path = path.replace('/', os.sep) + + # XXX What about scripts and data files ? 
+ if os.path.isfile(path): + return [(path, _md5(path), _size(path))] + else: + files = [] + for root, dir, files_ in os.walk(path): + for item in files_: + item = os.path.join(root, item) + files.append((item, _md5(item), _size(item))) + return files + return [] def uses(self, path): diff --git a/distutils2/install.py b/distutils2/install.py --- a/distutils2/install.py +++ b/distutils2/install.py @@ -334,37 +334,65 @@ def remove(project_name, paths=sys.path): """Removes a single project from the installation""" - dist = get_distribution(project_name, paths=paths) + dist = get_distribution(project_name, use_egg_info=True, paths=paths) if dist is None: - raise DistutilsError('Distribution %s not found' % project_name) + raise DistutilsError('Distribution "%s" not found' % project_name) files = dist.get_installed_files(local=True) rmdirs = [] rmfiles = [] tmp = tempfile.mkdtemp(prefix=project_name + '-uninstall') try: - for file, md5, size in files: - if os.path.isfile(file): - dirname, filename = os.path.split(file) + for file_, md5, size in files: + if os.path.isfile(file_): + dirname, filename = os.path.split(file_) tmpfile = os.path.join(tmp, filename) try: - os.rename(file, tmpfile) + os.rename(file_, tmpfile) finally: - if not os.path.isfile(file): - os.rename(tmpfile, file) - if file not in rmfiles: - rmfiles.append(file) + if not os.path.isfile(file_): + os.rename(tmpfile, file_) + if file_ not in rmfiles: + rmfiles.append(file_) if dirname not in rmdirs: rmdirs.append(dirname) finally: shutil.rmtree(tmp) - for file in rmfiles: - os.remove(file) + logger.info('Removing %r...' % project_name) + file_count = 0 + for file_ in rmfiles: + os.remove(file_) + file_count +=1 + + dir_count = 0 for dirname in rmdirs: - if not os.listdir(dirname): - if bool(os.stat(dirname).st_mode & stat.S_IWUSR): - os.rmdir(dirname) + if not os.path.exists(dirname): + # could + continue + + files_count = 0 + for root, dir, files in os.walk(dirname): + files_count += len(files) + + if files_count > 0: + # XXX Warning + continue + + # empty dirs with only empty dirs + if bool(os.stat(dirname).st_mode & stat.S_IWUSR): + # XXX Add a callable in shutil.rmtree to count + # the number of deleted elements + shutil.rmtree(dirname) + dir_count += 1 + + # removing the top path + # XXX count it ? + if os.path.exists(dist.path): + shutil.rmtree(dist.path) + + logger.info('Success ! Removed %d files and %d dirs' % \ + (file_count, dir_count)) def install(project): diff --git a/distutils2/run.py b/distutils2/run.py --- a/distutils2/run.py +++ b/distutils2/run.py @@ -11,7 +11,7 @@ from distutils2 import __version__ from distutils2._backport.pkgutil import get_distributions, get_distribution from distutils2.depgraph import generate_graph -from distutils2.install import install +from distutils2.install import install, remove # This is a barebones help message generated displayed when the user # runs the setup script with no arguments at all. 
More useful help @@ -225,6 +225,10 @@ install(options.install) return 0 + if options.remove is not None: + remove(options.remove) + return 0 + if len(args) == 0: parser.print_help() return 0 diff --git a/distutils2/tests/test_util.py b/distutils2/tests/test_util.py --- a/distutils2/tests/test_util.py +++ b/distutils2/tests/test_util.py @@ -414,6 +414,10 @@ @unittest.skipUnless(os.name in ('nt', 'posix'), 'runs only under posix or nt') def test_spawn(self): + # Do not patch subprocess on unix because + # distutils2.util._spawn_posix uses it + if os.name in 'posix': + subprocess.Popen = self.old_popen tmpdir = self.mkdtemp() # creating something executable diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -12,6 +12,7 @@ import shutil import tarfile import zipfile +from subprocess import call as sub_call from copy import copy from fnmatch import fnmatchcase from glob import iglob as std_iglob @@ -801,65 +802,17 @@ "command '%s' failed with exit status %d" % (cmd[0], rc)) -def _spawn_posix(cmd, search_path=1, verbose=0, dry_run=0, env=None): - logger.info(' '.join(cmd)) +def _spawn_posix(cmd, search_path=1, verbose=1, dry_run=0, env=None): + cmd = ' '.join(cmd) + if verbose: + logger.info(cmd) + logger.info(env) if dry_run: return - - if env is None: - exec_fn = search_path and os.execvp or os.execv - else: - exec_fn = search_path and os.execvpe or os.execve - - pid = os.fork() - - if pid == 0: # in the child - try: - if env is None: - exec_fn(cmd[0], cmd) - else: - exec_fn(cmd[0], cmd, env) - except OSError, e: - sys.stderr.write("unable to execute %s: %s\n" % - (cmd[0], e.strerror)) - os._exit(1) - - sys.stderr.write("unable to execute %s for unknown reasons" % cmd[0]) - os._exit(1) - else: # in the parent - # Loop until the child either exits or is terminated by a signal - # (ie. keep waiting if it's merely stopped) - while 1: - try: - pid, status = os.waitpid(pid, 0) - except OSError, exc: - import errno - if exc.errno == errno.EINTR: - continue - raise DistutilsExecError( - "command '%s' failed: %s" % (cmd[0], exc[-1])) - if os.WIFSIGNALED(status): - raise DistutilsExecError( - "command '%s' terminated by signal %d" % \ - (cmd[0], os.WTERMSIG(status))) - - elif os.WIFEXITED(status): - exit_status = os.WEXITSTATUS(status) - if exit_status == 0: - return # hey, it succeeded! - else: - raise DistutilsExecError( - "command '%s' failed with exit status %d" % \ - (cmd[0], exit_status)) - - elif os.WIFSTOPPED(status): - continue - - else: - raise DistutilsExecError( - "unknown error executing '%s': termination status %d" % \ - (cmd[0], status)) - + exit_status = sub_call(cmd, shell=True, env=env) + if exit_status != 0: + msg = "command '%s' failed with exit status %d" + raise DistutilsExecError(msg % (cmd, exit_status)) def find_executable(executable, path=None): """Tries to find 'executable' in the directories listed in 'path'. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Merge with data files glob work. Message-ID: tarek.ziade pushed e4474c0740d7 to distutils2: http://hg.python.org/distutils2/rev/e4474c0740d7 changeset: 1066:e4474c0740d7 parent: 1064:cbbcc8b44d6d parent: 1065:7e8bef5cb5b3 user: FELD Boris date: Sun Jan 30 18:34:18 2011 +0100 summary: Merge with data files glob work. 
files: distutils2/tests/test_datafiles.py diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -8,6 +8,7 @@ from csv import reader as csv_reader from types import ModuleType from distutils2.errors import DistutilsError +from distutils2.index.errors import DistributionNotFound from distutils2.metadata import DistributionMetadata from distutils2.version import suggest_normalized_version, VersionPredicate try: @@ -742,8 +743,8 @@ return '%s-%s at %s' % (self.name, self.metadata.version, self.path) def _get_records(self, local=False): - RECORD = os.path.join(self.path, 'RECORD') - record_reader = csv_reader(open(RECORD, 'rb'), delimiter=',') + RECORD = self.get_distinfo_file('RECORD') + record_reader = csv_reader(RECORD, delimiter=',') for row in record_reader: path, md5, size = row[:] + [None for i in xrange(len(row), 3)] if local: @@ -751,6 +752,15 @@ path = os.path.join(sys.prefix, path) yield path, md5, size + def get_resource_path(self, relative_path): + datafiles_file = self.get_distinfo_file('DATAFILES') + datafiles_reader = csv_reader(datafiles_file, delimiter = ',') + for relative, destination in datafiles_reader: + if relative == relative_path: + return destination + raise KeyError('No data_file with relative path %s were installed' % + relative_path) + def get_installed_files(self, local=False): """ Iterates over the ``RECORD`` entries and returns a tuple @@ -1162,5 +1172,14 @@ if dist.uses(path): yield dist -def data_open(distribution_name, relative_path): - pass +def resource_path(distribution_name, relative_path): + dist = get_distribution(distribution_name) + if dist != None: + return dist.get_resource_path(relative_path) + raise DistributionNotFound('No distribution named %s is installed.' 
% + distribution_name) + +def resource_open(distribution_name, relative_path, *args, **kwargs): + file = open(resource_path(distribution_name, relative_path), *args, + **kwargs) + return file \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES @@ -0,0 +1,2 @@ +babar.png,babar.png +babar.cfg,babar.cfg \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/INSTALLER b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/INSTALLER new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA @@ -0,0 +1,4 @@ +Metadata-version: 1.2 +Name: babar +Version: 0.1 +Author: FELD Boris \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RECORD b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RECORD new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/REQUESTED b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/REQUESTED new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar.cfg b/distutils2/_backport/tests/fake_dists/babar.cfg new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar.cfg @@ -0,0 +1,1 @@ +Config \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar.png b/distutils2/_backport/tests/fake_dists/babar.png new file mode 100644 diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -1,11 +1,12 @@ # -*- coding: utf-8 -*- """Tests for PEP 376 pkgutil functionality""" +import imp import sys + +import csv import os -import csv -import imp +import shutil import tempfile -import shutil import zipfile try: from hashlib import md5 @@ -18,9 +19,9 @@ from distutils2._backport import pkgutil from distutils2._backport.pkgutil import ( - Distribution, EggInfoDistribution, get_distribution, get_distributions, - provides_distribution, obsoletes_distribution, get_file_users, - distinfo_dirname, _yield_distributions) + Distribution, EggInfoDistribution, get_distribution, get_distributions, + provides_distribution, obsoletes_distribution, get_file_users, + distinfo_dirname, _yield_distributions) try: from os.path import relpath @@ -123,7 +124,6 @@ del sys.modules[pkg] - # Adapted from Python 2.7's trunk @@ -182,7 +182,7 @@ def setUp(self): super(TestPkgUtilDistribution, self).setUp() self.fake_dists_path = os.path.abspath( - os.path.join(os.path.dirname(__file__), 'fake_dists')) + os.path.join(os.path.dirname(__file__), 'fake_dists')) pkgutil.disable_cache() self.distinfo_dirs = [os.path.join(self.fake_dists_path, dir) @@ -205,7 +205,7 @@ # Setup the RECORD file for this dist record_file = os.path.join(distinfo_dir, 'RECORD') record_writer = csv.writer(open(record_file, 'w'), delimiter=',', - quoting=csv.QUOTE_NONE) + quoting=csv.QUOTE_NONE) dist_location = distinfo_dir.replace('.dist-info', '') for path, dirs, files in os.walk(dist_location): @@ -214,15 +214,15 @@ os.path.join(path, f))) 
for file in ['INSTALLER', 'METADATA', 'REQUESTED']: record_writer.writerow(record_pieces( - os.path.join(distinfo_dir, file))) + os.path.join(distinfo_dir, file))) record_writer.writerow([relpath(record_file, sys.prefix)]) del record_writer # causes the RECORD file to close record_reader = csv.reader(open(record_file, 'rb')) record_data = [] for row in record_reader: path, md5_, size = row[:] + \ - [None for i in xrange(len(row), 3)] - record_data.append([path, (md5_, size,)]) + [None for i in xrange(len(row), 3)] + record_data.append([path, (md5_, size, )]) self.records[distinfo_dir] = dict(record_data) def tearDown(self): @@ -240,7 +240,7 @@ name = 'choxie' version = '2.0.0.9' dist_path = os.path.join(here, 'fake_dists', - distinfo_dirname(name, version)) + distinfo_dirname(name, version)) dist = Distribution(dist_path) self.assertEqual(dist.name, name) @@ -264,9 +264,9 @@ # Criteria to test against distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') true_path = [self.fake_dists_path, distinfo_name, \ - 'grammar', 'utils.py'] + 'grammar', 'utils.py'] true_path = relpath(os.path.join(*true_path), sys.prefix) false_path = [self.fake_dists_path, 'towel_stuff-0.1', 'towel_stuff', '__init__.py'] @@ -282,7 +282,7 @@ distinfo_name = 'choxie-2.0.0.9' other_distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') dist = Distribution(distinfo_dir) # Test for known good file matches distinfo_files = [ @@ -301,9 +301,9 @@ # Test an absolute path that is part of another distributions dist-info other_distinfo_file = os.path.join(self.fake_dists_path, - other_distinfo_name + '.dist-info', 'REQUESTED') + other_distinfo_name + '.dist-info', 'REQUESTED') self.assertRaises(DistutilsError, dist.get_distinfo_file, - other_distinfo_file) + other_distinfo_file) # Test for a file that does not exist and should not exist self.assertRaises(DistutilsError, dist.get_distinfo_file, \ 'ENTRYPOINTS') @@ -312,7 +312,7 @@ # Test for the iteration of RECORD path entries. distinfo_name = 'towel_stuff-0.1' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') dist = Distribution(distinfo_dir) # Test for the iteration of the raw path distinfo_record_paths = self.records[distinfo_dir].keys() @@ -324,6 +324,16 @@ found = [path for path in dist.get_distinfo_files(local=True)] self.assertEqual(sorted(found), sorted(distinfo_record_paths)) + def test_get_resources_path(self): + distinfo_name = 'babar-0.1' + distinfo_dir = os.path.join(self.fake_dists_path, + distinfo_name + '.dist-info') + dist = Distribution(distinfo_dir) + resource_path = dist.get_resource_path('babar.png') + self.assertEqual(resource_path, 'babar.png') + self.assertRaises(KeyError, lambda: dist.get_resource_path('notexist')) + + class TestPkgUtilPEP376(support.LoggingCatcher, support.WarningsCatcher, unittest.TestCase): @@ -347,7 +357,7 @@ # Given a name and a version, we expect the distinfo_dirname function # to return a standard distribution information directory name. 
- items = [ # (name, version, standard_dirname) + items = [# (name, version, standard_dirname) # Test for a very simple single word name and decimal # version number ('docutils', '0.5', 'docutils-0.5.dist-info'), @@ -375,10 +385,10 @@ for dist in dists: if not isinstance(dist, Distribution): self.fail("item received was not a Distribution instance: " - "%s" % type(dist)) + "%s" % type(dist)) if dist.name in dict(fake_dists) and \ - dist.path.startswith(self.fake_dists_path): - found_dists.append((dist.name, dist.metadata['version'],)) + dist.path.startswith(self.fake_dists_path): + found_dists.append((dist.name, dist.metadata['version'], )) else: # check that it doesn't find anything more than this self.assertFalse(dist.path.startswith(self.fake_dists_path)) @@ -389,9 +399,9 @@ # Now, test if the egg-info distributions are found correctly as well fake_dists += [('bacon', '0.1'), ('cheese', '2.0.2'), - ('coconuts-aster', '10.3'), - ('banana', '0.4'), ('strawberry', '0.6'), - ('truffles', '5.0'), ('nut', 'funkyversion')] + ('coconuts-aster', '10.3'), + ('banana', '0.4'), ('strawberry', '0.6'), + ('truffles', '5.0'), ('nut', 'funkyversion')] found_dists = [] dists = [dist for dist in get_distributions(use_egg_info=True)] @@ -401,8 +411,8 @@ self.fail("item received was not a Distribution or " "EggInfoDistribution instance: %s" % type(dist)) if dist.name in dict(fake_dists) and \ - dist.path.startswith(self.fake_dists_path): - found_dists.append((dist.name, dist.metadata['version'])) + dist.path.startswith(self.fake_dists_path): + found_dists.append((dist.name, dist.metadata['version'])) else: self.assertFalse(dist.path.startswith(self.fake_dists_path)) @@ -453,7 +463,7 @@ # Test the iteration of distributions that use a file. name = 'towel_stuff-0.1' path = os.path.join(self.fake_dists_path, name, - 'towel_stuff', '__init__.py') + 'towel_stuff', '__init__.py') for dist in get_file_users(path): self.assertTrue(isinstance(dist, Distribution)) self.assertEqual(dist.name, name) @@ -484,7 +494,7 @@ l = [dist.name for dist in provides_distribution('truffles', \ '!=1.1,<=2.0', - use_egg_info=True)] + use_egg_info=True)] checkLists(l, ['choxie', 'bacon', 'cheese']) l = [dist.name for dist in provides_distribution('truffles', '>1.0')] @@ -557,29 +567,31 @@ checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), - ('truffles', '5.0'), ('cheese', '2.0.2'), - ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] + ('truffles', '5.0'), ('cheese', '2.0.2'), + ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] dists = [('choxie', '2.0.0.9'), ('grammar', '1.0a4'), - ('towel-stuff', '0.1')] + ('towel-stuff', '0.1'), ('babar', '0.1')] checkLists([], _yield_distributions(False, False)) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(False, True) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(False, True) + if dist.path.startswith(self.fake_dists_path)] checkLists(eggs, found) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(True, False) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(True, False) + if dist.path.startswith(self.fake_dists_path)] checkLists(dists, found) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(True, True) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(True, True) + if 
dist.path.startswith(self.fake_dists_path)] checkLists(dists + eggs, found) + + def test_suite(): suite = unittest.TestSuite() load = unittest.defaultTestLoader.loadTestsFromTestCase diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -2,13 +2,10 @@ """Tests for distutils.data.""" import os import sys -from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest from distutils2.tests.test_glob import GlobTestCaseBase from distutils2.datafiles import resources_dests -import re - @@ -104,6 +101,42 @@ self.maxDiff = None self.assertRulesMatch(rules, spec) + def test_resource_open(self): + from distutils2._backport.sysconfig import _SCHEMES as sysconfig_SCHEMES + from distutils2._backport.sysconfig import _get_default_scheme + #dirty but hit marmoute + + tempdir = self.mkdtemp() + + old_scheme = sysconfig_SCHEMES + + sysconfig_SCHEMES.set(_get_default_scheme(), 'config', + tempdir) + + pkg_dir, dist = self.create_dist() + dist_name = dist.metadata['name'] + + test_path = os.path.join(pkg_dir, 'test.cfg') + self.write_file(test_path, 'Config') + dist.data_files = {test_path : '{config}/test.cfg'} + + cmd = install_dist(dist) + cmd.install_dir = os.path.join(pkg_dir, 'inst') + content = 'Config' + + cmd.ensure_finalized() + cmd.run() + + cfg_dest = os.path.join(tempdir, 'test.cfg') + + self.assertEqual(resource_path(dist_name, test_path), cfg_dest) + self.assertRaises(KeyError, lambda: resource_path(dist_name, 'notexis')) + + self.assertEqual(resource_open(dist_name, test_path).read(), content) + self.assertRaises(KeyError, lambda: resource_open(dist_name, 'notexis')) + + sysconfig_SCHEMES = old_scheme + def test_suite(): return unittest.makeSuite(DataFilesTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Add some tests for retrieving resource path and file object. Message-ID: tarek.ziade pushed 7e8bef5cb5b3 to distutils2: http://hg.python.org/distutils2/rev/7e8bef5cb5b3 changeset: 1065:7e8bef5cb5b3 parent: 1063:bab1d09babc8 user: FELD Boris date: Sun Jan 30 18:29:28 2011 +0100 summary: Add some tests for retrieving resource path and file object. Add methods for retrieving resource path and file object. 
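
The two helpers added here, resource_path() and resource_open(), are meant to
be used roughly as follows (a usage sketch based on the code in the diff below,
not an example taken from the patch; the 'babar' distribution and its
'babar.cfg' data file are the fake test data that this changeset also adds):

    from distutils2._backport.pkgutil import resource_path, resource_open

    # Where did the installer put this data file of the 'babar'
    # distribution? The answer comes from its DATAFILES metadata.
    path = resource_path('babar', 'babar.cfg')

    # Open the file directly; extra arguments are passed on to open().
    f = resource_open('babar', 'babar.cfg')
    try:
        print f.read()
    finally:
        f.close()

An unknown relative path raises KeyError (this is what the new tests exercise),
and an unknown distribution name raises DistributionNotFound.
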
files: distutils2/_backport/pkgutil.py distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/INSTALLER distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RECORD distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/REQUESTED distutils2/_backport/tests/fake_dists/babar.cfg distutils2/_backport/tests/fake_dists/babar.png distutils2/_backport/tests/test_pkgutil.py distutils2/tests/test_datafiles.py diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -8,6 +8,7 @@ from csv import reader as csv_reader from types import ModuleType from distutils2.errors import DistutilsError +from distutils2.index.errors import DistributionNotFound from distutils2.metadata import DistributionMetadata from distutils2.version import suggest_normalized_version, VersionPredicate try: @@ -742,8 +743,8 @@ return '%s-%s at %s' % (self.name, self.metadata.version, self.path) def _get_records(self, local=False): - RECORD = os.path.join(self.path, 'RECORD') - record_reader = csv_reader(open(RECORD, 'rb'), delimiter=',') + RECORD = self.get_distinfo_file('RECORD') + record_reader = csv_reader(RECORD, delimiter=',') for row in record_reader: path, md5, size = row[:] + [None for i in xrange(len(row), 3)] if local: @@ -751,6 +752,15 @@ path = os.path.join(sys.prefix, path) yield path, md5, size + def get_resource_path(self, relative_path): + datafiles_file = self.get_distinfo_file('DATAFILES') + datafiles_reader = csv_reader(datafiles_file, delimiter = ',') + for relative, destination in datafiles_reader: + if relative == relative_path: + return destination + raise KeyError('No data_file with relative path %s were installed' % + relative_path) + def get_installed_files(self, local=False): """ Iterates over the ``RECORD`` entries and returns a tuple @@ -1162,5 +1172,14 @@ if dist.uses(path): yield dist -def data_open(distribution_name, relative_path): - pass +def resource_path(distribution_name, relative_path): + dist = get_distribution(distribution_name) + if dist != None: + return dist.get_resource_path(relative_path) + raise DistributionNotFound('No distribution named %s is installed.' 
% + distribution_name) + +def resource_open(distribution_name, relative_path, *args, **kwargs): + file = open(resource_path(distribution_name, relative_path), *args, + **kwargs) + return file \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES @@ -0,0 +1,2 @@ +babar.png,babar.png +babar.cfg,babar.cfg \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/INSTALLER b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/INSTALLER new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA @@ -0,0 +1,4 @@ +Metadata-version: 1.2 +Name: babar +Version: 0.1 +Author: FELD Boris \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RECORD b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RECORD new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/REQUESTED b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/REQUESTED new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar.cfg b/distutils2/_backport/tests/fake_dists/babar.cfg new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar.cfg @@ -0,0 +1,1 @@ +Config \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar.png b/distutils2/_backport/tests/fake_dists/babar.png new file mode 100644 diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -1,11 +1,12 @@ # -*- coding: utf-8 -*- """Tests for PEP 376 pkgutil functionality""" +import imp import sys + +import csv import os -import csv -import imp +import shutil import tempfile -import shutil import zipfile try: from hashlib import md5 @@ -18,9 +19,9 @@ from distutils2._backport import pkgutil from distutils2._backport.pkgutil import ( - Distribution, EggInfoDistribution, get_distribution, get_distributions, - provides_distribution, obsoletes_distribution, get_file_users, - distinfo_dirname, _yield_distributions) + Distribution, EggInfoDistribution, get_distribution, get_distributions, + provides_distribution, obsoletes_distribution, get_file_users, + distinfo_dirname, _yield_distributions) try: from os.path import relpath @@ -123,7 +124,6 @@ del sys.modules[pkg] - # Adapted from Python 2.7's trunk @@ -182,7 +182,7 @@ def setUp(self): super(TestPkgUtilDistribution, self).setUp() self.fake_dists_path = os.path.abspath( - os.path.join(os.path.dirname(__file__), 'fake_dists')) + os.path.join(os.path.dirname(__file__), 'fake_dists')) pkgutil.disable_cache() self.distinfo_dirs = [os.path.join(self.fake_dists_path, dir) @@ -205,7 +205,7 @@ # Setup the RECORD file for this dist record_file = os.path.join(distinfo_dir, 'RECORD') record_writer = csv.writer(open(record_file, 'w'), delimiter=',', - quoting=csv.QUOTE_NONE) + quoting=csv.QUOTE_NONE) dist_location = distinfo_dir.replace('.dist-info', '') for path, dirs, files in os.walk(dist_location): @@ -214,15 +214,15 @@ os.path.join(path, f))) 
for file in ['INSTALLER', 'METADATA', 'REQUESTED']: record_writer.writerow(record_pieces( - os.path.join(distinfo_dir, file))) + os.path.join(distinfo_dir, file))) record_writer.writerow([relpath(record_file, sys.prefix)]) del record_writer # causes the RECORD file to close record_reader = csv.reader(open(record_file, 'rb')) record_data = [] for row in record_reader: path, md5_, size = row[:] + \ - [None for i in xrange(len(row), 3)] - record_data.append([path, (md5_, size,)]) + [None for i in xrange(len(row), 3)] + record_data.append([path, (md5_, size, )]) self.records[distinfo_dir] = dict(record_data) def tearDown(self): @@ -240,7 +240,7 @@ name = 'choxie' version = '2.0.0.9' dist_path = os.path.join(here, 'fake_dists', - distinfo_dirname(name, version)) + distinfo_dirname(name, version)) dist = Distribution(dist_path) self.assertEqual(dist.name, name) @@ -264,9 +264,9 @@ # Criteria to test against distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') true_path = [self.fake_dists_path, distinfo_name, \ - 'grammar', 'utils.py'] + 'grammar', 'utils.py'] true_path = relpath(os.path.join(*true_path), sys.prefix) false_path = [self.fake_dists_path, 'towel_stuff-0.1', 'towel_stuff', '__init__.py'] @@ -282,7 +282,7 @@ distinfo_name = 'choxie-2.0.0.9' other_distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') dist = Distribution(distinfo_dir) # Test for known good file matches distinfo_files = [ @@ -301,9 +301,9 @@ # Test an absolute path that is part of another distributions dist-info other_distinfo_file = os.path.join(self.fake_dists_path, - other_distinfo_name + '.dist-info', 'REQUESTED') + other_distinfo_name + '.dist-info', 'REQUESTED') self.assertRaises(DistutilsError, dist.get_distinfo_file, - other_distinfo_file) + other_distinfo_file) # Test for a file that does not exist and should not exist self.assertRaises(DistutilsError, dist.get_distinfo_file, \ 'ENTRYPOINTS') @@ -312,7 +312,7 @@ # Test for the iteration of RECORD path entries. distinfo_name = 'towel_stuff-0.1' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') dist = Distribution(distinfo_dir) # Test for the iteration of the raw path distinfo_record_paths = self.records[distinfo_dir].keys() @@ -324,6 +324,16 @@ found = [path for path in dist.get_distinfo_files(local=True)] self.assertEqual(sorted(found), sorted(distinfo_record_paths)) + def test_get_resources_path(self): + distinfo_name = 'babar-0.1' + distinfo_dir = os.path.join(self.fake_dists_path, + distinfo_name + '.dist-info') + dist = Distribution(distinfo_dir) + resource_path = dist.get_resource_path('babar.png') + self.assertEqual(resource_path, 'babar.png') + self.assertRaises(KeyError, lambda: dist.get_resource_path('notexist')) + + class TestPkgUtilPEP376(support.LoggingCatcher, support.WarningsCatcher, unittest.TestCase): @@ -347,7 +357,7 @@ # Given a name and a version, we expect the distinfo_dirname function # to return a standard distribution information directory name. 
- items = [ # (name, version, standard_dirname) + items = [# (name, version, standard_dirname) # Test for a very simple single word name and decimal # version number ('docutils', '0.5', 'docutils-0.5.dist-info'), @@ -375,10 +385,10 @@ for dist in dists: if not isinstance(dist, Distribution): self.fail("item received was not a Distribution instance: " - "%s" % type(dist)) + "%s" % type(dist)) if dist.name in dict(fake_dists) and \ - dist.path.startswith(self.fake_dists_path): - found_dists.append((dist.name, dist.metadata['version'],)) + dist.path.startswith(self.fake_dists_path): + found_dists.append((dist.name, dist.metadata['version'], )) else: # check that it doesn't find anything more than this self.assertFalse(dist.path.startswith(self.fake_dists_path)) @@ -389,9 +399,9 @@ # Now, test if the egg-info distributions are found correctly as well fake_dists += [('bacon', '0.1'), ('cheese', '2.0.2'), - ('coconuts-aster', '10.3'), - ('banana', '0.4'), ('strawberry', '0.6'), - ('truffles', '5.0'), ('nut', 'funkyversion')] + ('coconuts-aster', '10.3'), + ('banana', '0.4'), ('strawberry', '0.6'), + ('truffles', '5.0'), ('nut', 'funkyversion')] found_dists = [] dists = [dist for dist in get_distributions(use_egg_info=True)] @@ -401,8 +411,8 @@ self.fail("item received was not a Distribution or " "EggInfoDistribution instance: %s" % type(dist)) if dist.name in dict(fake_dists) and \ - dist.path.startswith(self.fake_dists_path): - found_dists.append((dist.name, dist.metadata['version'])) + dist.path.startswith(self.fake_dists_path): + found_dists.append((dist.name, dist.metadata['version'])) else: self.assertFalse(dist.path.startswith(self.fake_dists_path)) @@ -453,7 +463,7 @@ # Test the iteration of distributions that use a file. name = 'towel_stuff-0.1' path = os.path.join(self.fake_dists_path, name, - 'towel_stuff', '__init__.py') + 'towel_stuff', '__init__.py') for dist in get_file_users(path): self.assertTrue(isinstance(dist, Distribution)) self.assertEqual(dist.name, name) @@ -484,7 +494,7 @@ l = [dist.name for dist in provides_distribution('truffles', \ '!=1.1,<=2.0', - use_egg_info=True)] + use_egg_info=True)] checkLists(l, ['choxie', 'bacon', 'cheese']) l = [dist.name for dist in provides_distribution('truffles', '>1.0')] @@ -557,29 +567,31 @@ checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), - ('truffles', '5.0'), ('cheese', '2.0.2'), - ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] + ('truffles', '5.0'), ('cheese', '2.0.2'), + ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] dists = [('choxie', '2.0.0.9'), ('grammar', '1.0a4'), - ('towel-stuff', '0.1')] + ('towel-stuff', '0.1'), ('babar', '0.1')] checkLists([], _yield_distributions(False, False)) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(False, True) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(False, True) + if dist.path.startswith(self.fake_dists_path)] checkLists(eggs, found) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(True, False) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(True, False) + if dist.path.startswith(self.fake_dists_path)] checkLists(dists, found) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(True, True) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(True, True) + if 
dist.path.startswith(self.fake_dists_path)] checkLists(dists + eggs, found) + + def test_suite(): suite = unittest.TestSuite() load = unittest.defaultTestLoader.loadTestsFromTestCase diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -2,13 +2,11 @@ """Tests for distutils.data.""" import os import sys -from StringIO import StringIO from distutils2.tests import unittest, support, run_unittest from distutils2.datafiles import resources_dests, RICH_GLOB -import re - - +from distutils2._backport.pkgutil import resource_path, resource_open +from distutils2.command.install_dist import install_dist class DataFilesTestCase(support.TempdirManager, @@ -120,6 +118,42 @@ self.maxDiff = None self.assertFindGlob(rules, spec) + def test_resource_open(self): + from distutils2._backport.sysconfig import _SCHEMES as sysconfig_SCHEMES + from distutils2._backport.sysconfig import _get_default_scheme + #dirty but hit marmoute + + tempdir = self.mkdtemp() + + old_scheme = sysconfig_SCHEMES + + sysconfig_SCHEMES.set(_get_default_scheme(), 'config', + tempdir) + + pkg_dir, dist = self.create_dist() + dist_name = dist.metadata['name'] + + test_path = os.path.join(pkg_dir, 'test.cfg') + self.write_file(test_path, 'Config') + dist.data_files = {test_path : '{config}/test.cfg'} + + cmd = install_dist(dist) + cmd.install_dir = os.path.join(pkg_dir, 'inst') + content = 'Config' + + cmd.ensure_finalized() + cmd.run() + + cfg_dest = os.path.join(tempdir, 'test.cfg') + + self.assertEqual(resource_path(dist_name, test_path), cfg_dest) + self.assertRaises(KeyError, lambda: resource_path(dist_name, 'notexis')) + + self.assertEqual(resource_open(dist_name, test_path).read(), content) + self.assertRaises(KeyError, lambda: resource_open(dist_name, 'notexis')) + + sysconfig_SCHEMES = old_scheme + def test_suite(): return unittest.makeSuite(DataFilesTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Remove lambda use in test._pkgutil Message-ID: tarek.ziade pushed 99fbaa234554 to distutils2: http://hg.python.org/distutils2/rev/99fbaa234554 changeset: 1069:99fbaa234554 user: FELD Boris date: Mon Jan 31 17:13:14 2011 +0100 summary: Remove lambda use in test._pkgutil files: distutils2/_backport/tests/test_pkgutil.py diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -331,7 +331,7 @@ dist = Distribution(distinfo_dir) resource_path = dist.get_resource_path('babar.png') self.assertEqual(resource_path, 'babar.png') - self.assertRaises(KeyError, lambda: dist.get_resource_path('notexist')) + self.assertRaises(KeyError, dist.get_resource_path, 'notexist') -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Fix missing import. Message-ID: tarek.ziade pushed c556ca5c471d to distutils2: http://hg.python.org/distutils2/rev/c556ca5c471d changeset: 1068:c556ca5c471d user: FELD Boris date: Sun Jan 30 18:38:26 2011 +0100 summary: Fix missing import. 
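(For readers following the resources work in this batch: the helpers exercised by the new tests above, and imported by the fix just below, are meant to be used roughly as follows. This is only a sketch built around the 'babar' fixture from the tests; none of it is part of the patches themselves.)

    from distutils2._backport.pkgutil import resource_path, resource_open

    # Map a resource name declared by the installed 'babar' distribution to
    # the path it was actually installed to (both names come from the
    # fake_dists fixture used in the tests).
    png_path = resource_path('babar', 'babar.png')

    # Or open the resource directly.  An unknown resource name raises
    # KeyError; an unknown distribution name raises LookupError (see the
    # pkgutil change a little further down in this batch).
    config_text = resource_open('babar', 'babar.cfg').read()

The change away from the lambda in test_pkgutil works because assertRaises accepts the callable and its arguments directly, as in assertRaises(KeyError, dist.get_resource_path, 'notexist'), so the call is made, and the exception caught, inside the assertion itself.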
files: distutils2/tests/test_datafiles.py diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_datafiles.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_datafiles.py @@ -6,7 +6,8 @@ from distutils2.tests import unittest, support, run_unittest from distutils2.tests.test_glob import GlobTestCaseBase from distutils2.datafiles import resources_dests - +from distutils2.command.install_dist import install_dist +from distutils2._backport.pkgutil import resource_open, resource_path class DataFilesTestCase(GlobTestCaseBase): -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Use Lookup exception instead of DistributionNotFound in pkgutil Message-ID: tarek.ziade pushed bcc6c1157e08 to distutils2: http://hg.python.org/distutils2/rev/bcc6c1157e08 changeset: 1070:bcc6c1157e08 user: FELD Boris date: Mon Jan 31 17:19:59 2011 +0100 summary: Use Lookup exception instead of DistributionNotFound in pkgutil files: distutils2/_backport/pkgutil.py diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -15,7 +15,6 @@ from md5 import md5 from distutils2.errors import DistutilsError -from distutils2.index.errors import DistributionNotFound from distutils2.metadata import DistributionMetadata from distutils2.version import suggest_normalized_version, VersionPredicate try: @@ -1210,7 +1209,7 @@ dist = get_distribution(distribution_name) if dist != None: return dist.get_resource_path(relative_path) - raise DistributionNotFound('No distribution named %s is installed.' % + raise LookupError('No distribution named %s is installed.' % distribution_name) def resource_open(distribution_name, relative_path, *args, **kwargs): -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Improve compatibility with pep8 for file datafiles.py Message-ID: tarek.ziade pushed 7e820b8fe5ca to distutils2: http://hg.python.org/distutils2/rev/7e820b8fe5ca changeset: 1072:7e820b8fe5ca user: FELD Boris date: Mon Jan 31 17:37:12 2011 +0100 summary: Improve compatibility with pep8 for file datafiles.py files: distutils2/datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -1,5 +1,4 @@ import os -import re from distutils2.util import iglob __all__ = ['iglob', 'resources_dests'] @@ -17,11 +16,12 @@ else: base = basepath if '*' in base or '{' in base or '}' in base: - raise NotImplementedError('glob are not supported into base part of datafiles definition. %r is an invalide basepath' % base) + raise NotImplementedError('glob are not supported into base part\ + of datafiles definition. 
%r is an invalide basepath' % base) absglob = os.path.join(base, self.suffix) - for file in iglob(absglob): - path_suffix = file[len(base):].lstrip('/') - relpath = file[len(basepath):].lstrip('/') + for glob_file in iglob(absglob): + path_suffix = glob_file[len(base):].lstrip('/') + relpath = glob_file[len(basepath):].lstrip('/') dest = os.path.join(destination, path_suffix) yield relpath, dest @@ -35,9 +35,9 @@ else: delete = False dest = glob_dest - for file, file_dest in sglob.expand(resources_dir, dest): - if delete and file in destinations: - del destinations[file] + for resource_file, file_dest in sglob.expand(resources_dir, dest): + if delete and resource_file in destinations: + del destinations[resource_file] else: - destinations[file] = file_dest + destinations[resource_file] = file_dest return destinations -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Improve compatibility with pep8 for file config.py Message-ID: tarek.ziade pushed a82aff76a5c9 to distutils2: http://hg.python.org/distutils2/rev/a82aff76a5c9 changeset: 1071:a82aff76a5c9 user: FELD Boris date: Mon Jan 31 17:35:03 2011 +0100 summary: Improve compatibility with pep8 for file config.py files: distutils2/config.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -4,9 +4,7 @@ """ import os.path import os -import re import sys -import re from ConfigParser import RawConfigParser from shlex import split @@ -138,11 +136,12 @@ # concatenate each files value = '' for filename in filenames: - f = open(filename) # will raise if file not found + # will raise if file not found + description_file = open(filename) try: - value += f.read().strip() + '\n' + value += description_file.read().strip() + '\n' finally: - f.close() + description_file.close() # add filename as a required file if filename not in metadata.requires_files: metadata.requires_files.append(filename) @@ -203,8 +202,8 @@ destination = None resources.append((prefix, suffix, destination)) - dir = os.path.dirname(os.path.join(os.getcwd(), cfg_filename)) - data_files = resources_dests(dir, resources) + directory = os.path.dirname(os.path.join(os.getcwd(), cfg_filename)) + data_files = resources_dests(directory, resources) self.dist.data_files = data_files ext_modules = self.dist.ext_modules -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Remove iglob from disutils.datafiles public interface Message-ID: tarek.ziade pushed d714bef9308d to distutils2: http://hg.python.org/distutils2/rev/d714bef9308d changeset: 1073:d714bef9308d user: Pierre-Yves David date: Tue Feb 01 23:20:26 2011 +0100 summary: Remove iglob from disutils.datafiles public interface This method have been moved to util. 
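(Some context for the datafiles/resources changes in this series: a rough sketch of how the extended glob helper and the rule expansion are meant to fit together. The rule values below are lifted from the test cases elsewhere in this thread and are purely illustrative.)

    from distutils2.util import iglob
    # datafiles.py is renamed to resources.py later in this series
    from distutils2.datafiles import resources_dests

    # iglob extends ordinary globbing: '**' matches across directory levels
    # and '{a,b}' expands to alternatives.
    templates = list(iglob('**/*.tpl'))

    # resources_dests expands (base, glob, destination) rules relative to a
    # source directory and returns a {relative source path: destination}
    # mapping; a destination of None excludes the files it matches.
    rules = [('scripts', '*.{bin,sh}', '{appscript}'),
             ('', '**/*.tpl', '{appdata}/templates'),
             ('', '**/*.sh', None)]
    destinations = resources_dests('.', rules)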
files: distutils2/datafiles.py diff --git a/distutils2/datafiles.py b/distutils2/datafiles.py --- a/distutils2/datafiles.py +++ b/distutils2/datafiles.py @@ -1,7 +1,7 @@ import os from distutils2.util import iglob -__all__ = ['iglob', 'resources_dests'] +__all__ = ['resources_dests'] class SmartGlob(object): -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: renames datafiles in ressources for consistency with setup.cfg section Message-ID: tarek.ziade pushed e707b3a6ce74 to distutils2: http://hg.python.org/distutils2/rev/e707b3a6ce74 changeset: 1075:e707b3a6ce74 user: Pierre-Yves David date: Wed Feb 02 11:38:07 2011 +0100 summary: renames datafiles in ressources for consistency with setup.cfg section files: distutils2/_backport/pkgutil.py distutils2/command/install_data.py distutils2/command/install_dist.py distutils2/command/install_distinfo.py distutils2/config.py distutils2/datafiles.py distutils2/resources.py distutils2/tests/test_datafiles.py distutils2/tests/test_resources.py diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -759,9 +759,9 @@ yield path, md5, size def get_resource_path(self, relative_path): - datafiles_file = self.get_distinfo_file('DATAFILES') - datafiles_reader = csv_reader(datafiles_file, delimiter = ',') - for relative, destination in datafiles_reader: + resources_file = self.get_distinfo_file('DATAFILES') + resources_reader = csv_reader(resources_file, delimiter = ',') + for relative, destination in resources_reader: if relative == relative_path: return destination raise KeyError('No data_file with relative path %s were installed' % diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -76,5 +76,5 @@ def get_outputs(self): return self.outfiles - def get_datafiles_out(self): + def get_resources_out(self): return self.data_files_out \ No newline at end of file diff --git a/distutils2/command/install_dist.py b/distutils2/command/install_dist.py --- a/distutils2/command/install_dist.py +++ b/distutils2/command/install_dist.py @@ -87,7 +87,7 @@ ('record=', None, "filename in which to record a list of installed files " "(not PEP 376-compliant)"), - ('datafiles=', None, + ('resources=', None, "data files mapping"), # .dist-info related arguments, read by install_dist_info @@ -186,14 +186,14 @@ #self.install_info = None self.record = None - self.datafiles = None + self.resources = None # .dist-info related options self.no_distinfo = None self.installer = None self.requested = None self.no_record = None - self.no_datafiles = None + self.no_resources = None # -- Option finalizing methods ------------------------------------- # (This is rather more involved than for most commands, diff --git a/distutils2/command/install_distinfo.py b/distutils2/command/install_distinfo.py --- a/distutils2/command/install_distinfo.py +++ b/distutils2/command/install_distinfo.py @@ -39,11 +39,11 @@ "do not generate a REQUESTED file"), ('no-record', None, "do not generate a RECORD file"), - ('no-datafiles', None, + ('no-resources', None, "do not generate a DATAFILES list installed file") ] - boolean_options = ['requested', 'no-record', 'no-datafiles'] + boolean_options = ['requested', 
'no-record', 'no-resources'] negative_opt = {'no-requested': 'requested'} @@ -52,7 +52,7 @@ self.installer = None self.requested = None self.no_record = None - self.no_datafiles = None + self.no_resources = None def finalize_options(self): self.set_undefined_options('install_dist', @@ -69,8 +69,8 @@ self.requested = True if self.no_record is None: self.no_record = False - if self.no_datafiles is None: - self.no_datafiles = False + if self.no_resources is None: + self.no_resources = False metadata = self.distribution.metadata @@ -120,20 +120,20 @@ self.outputs.append(requested_path) - if not self.no_datafiles: + if not self.no_resources: install_data = self.get_finalized_command('install_data') - if install_data.get_datafiles_out() != []: - datafiles_path = os.path.join(self.distinfo_dir, 'DATAFILES') - logger.info('creating %s', datafiles_path) - f = open(datafiles_path, 'wb') + if install_data.get_resources_out() != []: + resources_path = os.path.join(self.distinfo_dir, 'DATAFILES') + logger.info('creating %s', resources_path) + f = open(resources_path, 'wb') try: writer = csv.writer(f, delimiter=',', lineterminator=os.linesep, quotechar='"') - for tuple in install_data.get_datafiles_out(): + for tuple in install_data.get_resources_out(): writer.writerow(tuple) - self.outputs.append(datafiles_path) + self.outputs.append(resources_path) finally: f.close() diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -14,7 +14,7 @@ from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command -from distutils2.datafiles import resources_dests +from distutils2.resources import resources_dests from distutils2.markers import interpret diff --git a/distutils2/datafiles.py b/distutils2/resources.py rename from distutils2/datafiles.py rename to distutils2/resources.py --- a/distutils2/datafiles.py +++ b/distutils2/resources.py @@ -17,7 +17,7 @@ base = basepath if '*' in base or '{' in base or '}' in base: raise NotImplementedError('glob are not supported into base part\ - of datafiles definition. %r is an invalide basepath' % base) + of resources definition. 
%r is an invalide basepath' % base) absglob = os.path.join(base, self.suffix) for glob_file in iglob(absglob): path_suffix = glob_file[len(base):].lstrip('/') diff --git a/distutils2/tests/test_datafiles.py b/distutils2/tests/test_resources.py rename from distutils2/tests/test_datafiles.py rename to distutils2/tests/test_resources.py --- a/distutils2/tests/test_datafiles.py +++ b/distutils2/tests/test_resources.py @@ -5,7 +5,7 @@ from distutils2.tests import unittest, support, run_unittest from distutils2.tests.test_glob import GlobTestCaseBase -from distutils2.datafiles import resources_dests +from distutils2.resources import resources_dests from distutils2.command.install_dist import install_dist from distutils2._backport.pkgutil import resource_open, resource_path -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: merge test_glob.py into test_util.py Message-ID: tarek.ziade pushed 14b50b3b0b5a to distutils2: http://hg.python.org/distutils2/rev/14b50b3b0b5a changeset: 1076:14b50b3b0b5a user: Pierre-Yves David date: Wed Feb 02 11:49:57 2011 +0100 summary: merge test_glob.py into test_util.py files: distutils2/tests/test_glob.py distutils2/tests/test_resources.py distutils2/tests/test_util.py diff --git a/distutils2/tests/test_glob.py b/distutils2/tests/test_glob.py deleted file mode 100644 --- a/distutils2/tests/test_glob.py +++ /dev/null @@ -1,160 +0,0 @@ -import os -import sys -import re -from StringIO import StringIO - -from distutils2.tests import unittest, support, run_unittest -from distutils2.util import iglob, RICH_GLOB - -class GlobTestCaseBase(support.TempdirManager, - support.LoggingCatcher, - unittest.TestCase): - - def build_files_tree(self, files): - tempdir = self.mkdtemp() - for filepath in files: - is_dir = filepath.endswith('/') - filepath = os.path.join(tempdir, *filepath.split('/')) - if is_dir: - dirname = filepath - else: - dirname = os.path.dirname(filepath) - if dirname and not os.path.exists(dirname): - os.makedirs(dirname) - if not is_dir: - self.write_file(filepath, 'babar') - return tempdir - - @staticmethod - def os_dependant_path(path): - path = path.rstrip('/').split('/') - return os.path.join(*path) - - def clean_tree(self, spec): - files = [] - for path, includes in list(spec.items()): - if includes: - files.append(self.os_dependant_path(path)) - return files - -class GlobTestCase(GlobTestCaseBase): - - - def assertGlobMatch(self, glob, spec): - """""" - tempdir = self.build_files_tree(spec) - expected = self.clean_tree(spec) - self.addCleanup(os.chdir, os.getcwd()) - os.chdir(tempdir) - result = list(iglob(glob)) - self.assertItemsEqual(expected, result) - - def test_regex_rich_glob(self): - matches = RICH_GLOB.findall(r"babar aime les {fraises} est les {huitres}") - self.assertEquals(["fraises","huitres"], matches) - - def test_simple_glob(self): - glob = '*.tp?' - spec = {'coucou.tpl': True, - 'coucou.tpj': True, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_simple_glob_in_dir(self): - glob = 'babar/*.tp?' 
- spec = {'babar/coucou.tpl': True, - 'babar/coucou.tpj': True, - 'babar/toto.bin': False, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_recursive_glob_head(self): - glob = '**/tip/*.t?l' - spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, - 'babar/z/tip/coucou.tpl': True, - 'babar/tip/coucou.tpl': True, - 'babar/zeop/tip/babar/babar.tpl': False, - 'babar/z/tip/coucou.bin': False, - 'babar/toto.bin': False, - 'zozo/zuzu/tip/babar.tpl': True, - 'zozo/tip/babar.tpl': True, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_recursive_glob_tail(self): - glob = 'babar/**' - spec = {'babar/zaza/': True, - 'babar/zaza/zuzu/': True, - 'babar/zaza/zuzu/babar.xml': True, - 'babar/zaza/zuzu/toto.xml': True, - 'babar/zaza/zuzu/toto.csv': True, - 'babar/zaza/coucou.tpl': True, - 'babar/bubu.tpl': True, - 'zozo/zuzu/tip/babar.tpl': False, - 'zozo/tip/babar.tpl': False, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_recursive_glob_middle(self): - glob = 'babar/**/tip/*.t?l' - spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, - 'babar/z/tip/coucou.tpl': True, - 'babar/tip/coucou.tpl': True, - 'babar/zeop/tip/babar/babar.tpl': False, - 'babar/z/tip/coucou.bin': False, - 'babar/toto.bin': False, - 'zozo/zuzu/tip/babar.tpl': False, - 'zozo/tip/babar.tpl': False, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_glob_set_tail(self): - glob = 'bin/*.{bin,sh,exe}' - spec = {'bin/babar.bin': True, - 'bin/zephir.sh': True, - 'bin/celestine.exe': True, - 'bin/cornelius.bat': False, - 'bin/cornelius.xml': False, - 'toto/yurg': False, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_glob_set_middle(self): - glob = 'xml/{babar,toto}.xml' - spec = {'xml/babar.xml': True, - 'xml/toto.xml': True, - 'xml/babar.xslt': False, - 'xml/cornelius.sgml': False, - 'xml/zephir.xml': False, - 'toto/yurg.xml': False, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_glob_set_head(self): - glob = '{xml,xslt}/babar.*' - spec = {'xml/babar.xml': True, - 'xml/toto.xml': False, - 'xslt/babar.xslt': True, - 'xslt/toto.xslt': False, - 'toto/yurg.xml': False, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - def test_glob_all(self): - glob = '{xml/*,xslt/**}/babar.xml' - spec = {'xml/a/babar.xml': True, - 'xml/b/babar.xml': True, - 'xml/a/c/babar.xml': False, - 'xslt/a/babar.xml': True, - 'xslt/b/babar.xml': True, - 'xslt/a/c/babar.xml': True, - 'toto/yurg.xml': False, - 'Donotwant': False} - self.assertGlobMatch(glob, spec) - - -def test_suite(): - return unittest.makeSuite(GlobTestCase) - -if __name__ == '__main__': - run_unittest(test_suite()) diff --git a/distutils2/tests/test_resources.py b/distutils2/tests/test_resources.py --- a/distutils2/tests/test_resources.py +++ b/distutils2/tests/test_resources.py @@ -4,7 +4,7 @@ import sys from distutils2.tests import unittest, support, run_unittest -from distutils2.tests.test_glob import GlobTestCaseBase +from distutils2.tests.test_util import GlobTestCaseBase from distutils2.resources import resources_dests from distutils2.command.install_dist import install_dist from distutils2._backport.pkgutil import resource_open, resource_path diff --git a/distutils2/tests/test_util.py b/distutils2/tests/test_util.py --- a/distutils2/tests/test_util.py +++ b/distutils2/tests/test_util.py @@ -18,7 +18,7 @@ _find_exe_version, _MAC_OS_X_LD_VERSION, byte_compile, find_packages, spawn, find_executable, _nt_quote_args, get_pypirc_path, generate_pypirc, - read_pypirc, 
resolve_name) + read_pypirc, resolve_name, iglob, RICH_GLOB) from distutils2 import util from distutils2.tests import unittest, support @@ -479,9 +479,158 @@ content = open(rc).read() self.assertEqual(content, WANTED) +class GlobTestCaseBase(support.TempdirManager, + support.LoggingCatcher, + unittest.TestCase): + + def build_files_tree(self, files): + tempdir = self.mkdtemp() + for filepath in files: + is_dir = filepath.endswith('/') + filepath = os.path.join(tempdir, *filepath.split('/')) + if is_dir: + dirname = filepath + else: + dirname = os.path.dirname(filepath) + if dirname and not os.path.exists(dirname): + os.makedirs(dirname) + if not is_dir: + self.write_file(filepath, 'babar') + return tempdir + + @staticmethod + def os_dependant_path(path): + path = path.rstrip('/').split('/') + return os.path.join(*path) + + def clean_tree(self, spec): + files = [] + for path, includes in list(spec.items()): + if includes: + files.append(self.os_dependant_path(path)) + return files + +class GlobTestCase(GlobTestCaseBase): + + + def assertGlobMatch(self, glob, spec): + """""" + tempdir = self.build_files_tree(spec) + expected = self.clean_tree(spec) + self.addCleanup(os.chdir, os.getcwd()) + os.chdir(tempdir) + result = list(iglob(glob)) + self.assertItemsEqual(expected, result) + + def test_regex_rich_glob(self): + matches = RICH_GLOB.findall(r"babar aime les {fraises} est les {huitres}") + self.assertEquals(["fraises","huitres"], matches) + + def test_simple_glob(self): + glob = '*.tp?' + spec = {'coucou.tpl': True, + 'coucou.tpj': True, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_simple_glob_in_dir(self): + glob = 'babar/*.tp?' + spec = {'babar/coucou.tpl': True, + 'babar/coucou.tpj': True, + 'babar/toto.bin': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_head(self): + glob = '**/tip/*.t?l' + spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, + 'babar/z/tip/coucou.tpl': True, + 'babar/tip/coucou.tpl': True, + 'babar/zeop/tip/babar/babar.tpl': False, + 'babar/z/tip/coucou.bin': False, + 'babar/toto.bin': False, + 'zozo/zuzu/tip/babar.tpl': True, + 'zozo/tip/babar.tpl': True, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_tail(self): + glob = 'babar/**' + spec = {'babar/zaza/': True, + 'babar/zaza/zuzu/': True, + 'babar/zaza/zuzu/babar.xml': True, + 'babar/zaza/zuzu/toto.xml': True, + 'babar/zaza/zuzu/toto.csv': True, + 'babar/zaza/coucou.tpl': True, + 'babar/bubu.tpl': True, + 'zozo/zuzu/tip/babar.tpl': False, + 'zozo/tip/babar.tpl': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_middle(self): + glob = 'babar/**/tip/*.t?l' + spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, + 'babar/z/tip/coucou.tpl': True, + 'babar/tip/coucou.tpl': True, + 'babar/zeop/tip/babar/babar.tpl': False, + 'babar/z/tip/coucou.bin': False, + 'babar/toto.bin': False, + 'zozo/zuzu/tip/babar.tpl': False, + 'zozo/tip/babar.tpl': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_tail(self): + glob = 'bin/*.{bin,sh,exe}' + spec = {'bin/babar.bin': True, + 'bin/zephir.sh': True, + 'bin/celestine.exe': True, + 'bin/cornelius.bat': False, + 'bin/cornelius.xml': False, + 'toto/yurg': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_middle(self): + glob = 'xml/{babar,toto}.xml' + spec = {'xml/babar.xml': True, + 'xml/toto.xml': True, + 'xml/babar.xslt': False, + 'xml/cornelius.sgml': False, + 
'xml/zephir.xml': False, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_head(self): + glob = '{xml,xslt}/babar.*' + spec = {'xml/babar.xml': True, + 'xml/toto.xml': False, + 'xslt/babar.xslt': True, + 'xslt/toto.xslt': False, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_all(self): + glob = '{xml/*,xslt/**}/babar.xml' + spec = {'xml/a/babar.xml': True, + 'xml/b/babar.xml': True, + 'xml/a/c/babar.xml': False, + 'xslt/a/babar.xml': True, + 'xslt/b/babar.xml': True, + 'xslt/a/c/babar.xml': True, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + def test_suite(): - return unittest.makeSuite(UtilTestCase) + suite = unittest.makeSuite(UtilTestCase) + suite.addTest(unittest.makeSuite(GlobTestCase)) + return suite + if __name__ == "__main__": unittest.main(defaultTest="test_suite") -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: merge with upstream Message-ID: tarek.ziade pushed c7edf9ddc916 to distutils2: http://hg.python.org/distutils2/rev/c7edf9ddc916 changeset: 1074:c7edf9ddc916 parent: 1073:d714bef9308d parent: 1000:d355f123ac79 user: Pierre-Yves David date: Wed Feb 02 11:32:45 2011 +0100 summary: merge with upstream files: distutils2/config.py distutils2/tests/test_config.py distutils2/util.py diff --git a/DEVNOTES.txt b/DEVNOTES.txt --- a/DEVNOTES.txt +++ b/DEVNOTES.txt @@ -6,4 +6,5 @@ one of these Python versions. - Always run tests.sh before you push a change. This implies - that you have all Python versions installed from 2.4 to 2.7. + that you have all Python versions installed from 2.4 to 2.7. Be sure to have + docutils installed on all python versions no avoid skipping tests as well. diff --git a/distutils2/command/check.py b/distutils2/command/check.py --- a/distutils2/command/check.py +++ b/distutils2/command/check.py @@ -52,18 +52,19 @@ def check_metadata(self): """Ensures that all required elements of metadata are supplied. - name, version, URL, (author and author_email) or - (maintainer and maintainer_email)). + name, version, URL, author Warns if any are missing. 
""" - missing, __ = self.distribution.metadata.check(strict=True) + missing, warnings = self.distribution.metadata.check(strict=True) if missing != []: self.warn("missing required metadata: %s" % ', '.join(missing)) + for warning in warnings: + self.warn(warning) def check_restructuredtext(self): """Checks if the long string fields are reST-compliant.""" - missing, warnings = self.distribution.metadata.check() + missing, warnings = self.distribution.metadata.check(restructuredtext=True) if self.distribution.metadata.docutils_support: for warning in warnings: line = warning[-1].get('line') diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -15,13 +15,22 @@ from distutils2.compiler import set_compiler from distutils2.command import set_command from distutils2.datafiles import resources_dests +from distutils2.markers import interpret def _pop_values(values_dct, key): """Remove values from the dictionary and convert them as a list""" vals_str = values_dct.pop(key, '') + if not vals_str: + return + fields = [] + for field in vals_str.split(os.linesep): + tmp_vals = field.split('--') + if (len(tmp_vals) == 2) and (not interpret(tmp_vals[1])): + continue + fields.append(tmp_vals[0]) # Get bash options like `gcc -print-file-name=libgcc.a` - vals = split(vals_str) + vals = split(' '.join(fields)) if vals: return vals diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -465,7 +465,7 @@ return None return value - def check(self, strict=False): + def check(self, strict=False, restructuredtext=False): """Check if the metadata is compliant. If strict is False then raise if no Name or Version are provided""" # XXX should check the versions (if the file was loaded) @@ -479,11 +479,11 @@ msg = "missing required metadata: %s" % ', '.join(missing) raise MetadataMissingError(msg) - for attr in ('Home-page',): + for attr in ('Home-page', 'Author'): if attr not in self: missing.append(attr) - if _HAS_DOCUTILS: + if _HAS_DOCUTILS and restructuredtext: warnings.extend(self._check_rst_data(self['Description'])) # checking metadata 1.2 (XXX needs to check 1.1, 1.0) diff --git a/distutils2/tests/test_command_check.py b/distutils2/tests/test_command_check.py --- a/distutils2/tests/test_command_check.py +++ b/distutils2/tests/test_command_check.py @@ -6,13 +6,14 @@ from distutils2.errors import DistutilsSetupError from distutils2.errors import MetadataMissingError + class CheckTestCase(support.LoggingCatcher, support.TempdirManager, unittest.TestCase): def _run(self, metadata=None, **options): if metadata is None: - metadata = {'name':'xxx', 'version':'xxx'} + metadata = {'name': 'xxx', 'version': 'xxx'} pkg_info, dist = self.create_dist(**metadata) cmd = check(dist) cmd.initialize_options() @@ -34,7 +35,7 @@ # any warning anymore metadata = {'home_page': 'xxx', 'author': 'xxx', 'author_email': 'xxx', - 'name': 'xxx', 'version': 'xxx' + 'name': 'xxx', 'version': 'xxx', } cmd = self._run(metadata) self.assertEqual(len(cmd._warnings), 0) @@ -42,12 +43,52 @@ # now with the strict mode, we should # get an error if there are missing metadata self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1}) - self.assertRaises(DistutilsSetupError, self._run, {'name':'xxx', 'version':'xxx'}, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, + {'name': 'xxx', 'version': 'xxx'}, **{'strict': 1}) # and of course, no error when all metadata fields are present cmd = 
self._run(metadata, strict=1) self.assertEqual(len(cmd._warnings), 0) + def test_check_metadata_1_2(self): + # let's run the command with no metadata at all + # by default, check is checking the metadata + # should have some warnings + cmd = self._run() + self.assertTrue(len(cmd._warnings) > 0) + + # now let's add the required fields + # and run it again, to make sure we don't get + # any warning anymore + # let's use requires_python as a marker to enforce + # Metadata-Version 1.2 + metadata = {'home_page': 'xxx', 'author': 'xxx', + 'author_email': 'xxx', + 'name': 'xxx', 'version': 'xxx', + 'requires_python': '2.4', + } + cmd = self._run(metadata) + self.assertEqual(len(cmd._warnings), 1) + + # now with the strict mode, we should + # get an error if there are missing metadata + self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1}) + self.assertRaises(DistutilsSetupError, self._run, + {'name': 'xxx', 'version': 'xxx'}, **{'strict': 1}) + + # complain about version format + self.assertRaises(DistutilsSetupError, self._run, metadata, + **{'strict': 1}) + + # now with correct version format + metadata = {'home_page': 'xxx', 'author': 'xxx', + 'author_email': 'xxx', + 'name': 'xxx', 'version': '1.2', + 'requires_python': '2.4', + } + cmd = self._run(metadata, strict=1) + self.assertEqual(len(cmd._warnings), 0) + @unittest.skipUnless(_HAS_DOCUTILS, "requires docutils") def test_check_restructuredtext(self): # let's see if it detects broken rest in long_description @@ -65,7 +106,7 @@ def test_check_all(self): self.assertRaises(DistutilsSetupError, self._run, - {'name':'xxx', 'version':'xxx'}, **{'strict': 1, + {'name': 'xxx', 'version': 'xxx'}, **{'strict': 1, 'all': 1}) self.assertRaises(MetadataMissingError, self._run, {}, **{'strict': 1, @@ -79,7 +120,7 @@ cmd = check(dist) cmd.check_hooks_resolvable() self.assertEqual(len(cmd._warnings), 1) - + def test_suite(): return unittest.makeSuite(CheckTestCase) diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -105,6 +105,8 @@ sources = c_src/speed_coconuts.c extra_link_args = "`gcc -print-file-name=libgcc.a`" -shared define_macros = HAVE_CAIRO HAVE_GTK2 +libraries = gecodeint gecodekernel -- sys.platform != 'win32' + GecodeInt GecodeKernel -- sys.platform == 'win32' [extension=fast_taunt] name = three.fast_taunt @@ -113,6 +115,8 @@ include_dirs = /usr/include/gecode /usr/include/blitz extra_compile_args = -fPIC -O2 + -DGECODE_VERSION=$(./gecode_version) -- sys.platform != 'win32' + /DGECODE_VERSION='win32' -- sys.platform == 'win32' language = cxx """ @@ -291,6 +295,10 @@ ext = ext_modules.get('one.speed_coconuts') self.assertEqual(ext.sources, ['c_src/speed_coconuts.c']) self.assertEqual(ext.define_macros, ['HAVE_CAIRO', 'HAVE_GTK2']) + libs = ['gecodeint', 'gecodekernel'] + if sys.platform == 'win32': + libs = ['GecodeInt', 'GecodeKernel'] + self.assertEqual(ext.libraries, libs) self.assertEqual(ext.extra_link_args, ['`gcc -print-file-name=libgcc.a`', '-shared']) @@ -299,7 +307,12 @@ ['cxx_src/utils_taunt.cxx', 'cxx_src/python_module.cxx']) self.assertEqual(ext.include_dirs, ['/usr/include/gecode', '/usr/include/blitz']) - self.assertEqual(ext.extra_compile_args, ['-fPIC', '-O2']) + cargs = ['-fPIC', '-O2'] + if sys.platform == 'win32': + cargs.append("/DGECODE_VERSION='win32'") + else: + cargs.append('-DGECODE_VERSION=$(./gecode_version)') + self.assertEqual(ext.extra_compile_args, cargs) self.assertEqual(ext.language, 
'cxx') diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -171,17 +171,75 @@ [('one', 'http://ok')]) self.assertEqual(metadata.version, '1.2') - def test_check(self): + def test_check_version(self): + metadata = DistributionMetadata() + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Version']) + + def test_check_version_strict(self): + metadata = DistributionMetadata() + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + from distutils2.errors import MetadataMissingError + self.assertRaises(MetadataMissingError, metadata.check, strict=True) + + def test_check_name(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Name']) + + def test_check_name_strict(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + from distutils2.errors import MetadataMissingError + self.assertRaises(MetadataMissingError, metadata.check, strict=True) + + def test_check_author(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Author']) + + def test_check_homepage(self): + metadata = DistributionMetadata() + metadata['Version'] = '1.0' + metadata['Name'] = 'vimpdb' + metadata['Author'] = 'Monty Python' + metadata.docutils_support = False + missing, warnings = metadata.check() + self.assertEqual(missing, ['Home-page']) + + def test_check_predicates(self): metadata = DistributionMetadata() metadata['Version'] = 'rr' + metadata['Name'] = 'vimpdb' + metadata['Home-page'] = 'http://pypi.python.org' + metadata['Author'] = 'Monty Python' metadata['Requires-dist'] = ['Foo (a)'] + metadata['Obsoletes-dist'] = ['Foo (a)'] + metadata['Provides-dist'] = ['Foo (a)'] if metadata.docutils_support: missing, warnings = metadata.check() - self.assertEqual(len(warnings), 2) + self.assertEqual(len(warnings), 4) metadata.docutils_support = False missing, warnings = metadata.check() - self.assertEqual(missing, ['Name', 'Home-page']) - self.assertEqual(len(warnings), 2) + self.assertEqual(len(warnings), 4) def test_best_choice(self): metadata = DistributionMetadata() diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -806,7 +806,6 @@ cmd = ' '.join(cmd) if verbose: logger.info(cmd) - logger.info(env) if dry_run: return exit_status = sub_call(cmd, shell=True, env=env) @@ -1109,7 +1108,6 @@ # There is no such option in the setup.cfg if arg == "long_description": filename = has_get_option(config, section, "description_file") - print "We have a filename", filename if filename: in_cfg_value = open(filename).read() else: @@ -1137,12 +1135,16 @@ raise DistutilsFileError("A pre existing setup.py file exists") handle = open("setup.py", "w") - 
handle.write("# Distutils script using distutils2 setup.cfg to call the\n") - handle.write("# distutils.core.setup() with the right args.\n\n\n") - handle.write("import os\n") - handle.write("from distutils.core import setup\n") - handle.write("from ConfigParser import RawConfigParser\n\n") - handle.write(getsource(generate_distutils_kwargs_from_setup_cfg)) - handle.write("\n\nkwargs = generate_distutils_kwargs_from_setup_cfg()\n") - handle.write("setup(**kwargs)") - handle.close() + try: + handle.write( + "# Distutils script using distutils2 setup.cfg to call the\n" + "# distutils.core.setup() with the right args.\n\n" + "import os\n" + "from distutils.core import setup\n" + "from ConfigParser import RawConfigParser\n\n" + "" + getsource(generate_distutils_kwargs_from_setup_cfg) + "\n\n" + "kwargs = generate_distutils_kwargs_from_setup_cfg()\n" + "setup(**kwargs)\n" + ) + finally: + handle.close() diff --git a/docs/source/contributing.rst b/docs/source/contributing.rst new file mode 100644 --- /dev/null +++ b/docs/source/contributing.rst @@ -0,0 +1,52 @@ +========================== +Contributing to Distutils2 +========================== + +---------------- +Reporting Issues +---------------- + +When using, testing, developping distutils2, you may encounter issues. Please report to the following sections to know how these issues should be reported. + +Please keep in mind that this guide is intended to ease the triage and fixing processes by giving the maximum information to the developers. It should not be viewed as mandatory, only advised ;). + +Issues regarding distutils2 commands +==================================== + +- Go to http://bugs.python.org/ (you'll need a Python Bugs account), then "Issues" > "Create ticket". +- **Title**: write in a short summary of the issue. + * You may prefix the issue title with [d2_component], where d2_component can be : installer, sdist, setup.cfg, ... This will ease up the triage process. + +- **Components**: choose "Distutils2" +- **Version**: choose "3rd party" +- **Comment**: use the following template for versions, reproduction conditions: + * If some of the fields presented don't apply to the issue, feel free to pick only the ones you need. + +:: + + Operating System: + Version of Python: + Version of Distutils2: + + How to reproduce: + + What happens: + + What should happen: + +- Filling in the fields: + * **How to reproduce**: indicate some test case to reproduce the issue. + * **What happens**: describe what is the error, paste tracebacks if you have any. + * **What should happen**: indicate what you think should be the result of the test case (wanted behaviour). + * **Versions**: + - If you're using a release of distutils2, you may want to test the latest version of the project (under developpment code). + - If the issue is present in the latest version, please indicate the tip commit of the version tested. + - Be careful to indicate the remote reference (12 characters, for instance c3cf81fc64db), not the local reference (rXXX). + +- If it is relevant, please join any file that will help reproducing the issue or logs to understand the problem (setup.cfg, strace ouptups, ...). + +Issues regarding PyPI display of the distutils2 projects +======================================================== + +- Please send a bug report to the catalog-sig at python.org mailing list. +- You can include your setup.cfg, and a link to your project page. 
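(A small practical addition to the reporting guide above: the 12-character remote reference it asks for can usually be read straight from a working copy with ``hg id -i``, or with ``hg log -r tip --template "{node|short}\n"``. This is a convenience hint, not part of the contributed file.)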
-- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Finish the renaming of datafiles to resources. Message-ID: tarek.ziade pushed b6ad6f924cc3 to distutils2: http://hg.python.org/distutils2/rev/b6ad6f924cc3 changeset: 1077:b6ad6f924cc3 user: FELD Boris date: Fri Feb 04 18:47:37 2011 +0100 summary: Finish the renaming of datafiles to resources. Change test_ressources.test_resources_open to avoid to install a complete distribution, create a fake one manually. files: .hgignore distutils2/_backport/pkgutil.py distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RESOURCES distutils2/command/install_distinfo.py distutils2/tests/test_resources.py diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -15,3 +15,4 @@ include bin nosetests.xml +Distutils2.egg-info diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -1,13 +1,14 @@ """Utilities to support packages.""" +import imp +import sys + +from csv import reader as csv_reader import os -import sys -import imp import re +from stat import ST_SIZE +from types import ModuleType import warnings -from csv import reader as csv_reader -from types import ModuleType -from stat import ST_SIZE try: from hashlib import md5 @@ -55,7 +56,7 @@ """Make a trivial single-dispatch generic function""" registry = {} - def wrapper(*args, **kw): + def wrapper(*args, ** kw): ob = args[0] try: cls = ob.__class__ @@ -70,12 +71,12 @@ pass mro = cls.__mro__[1:] except TypeError: - mro = object, # must be an ExtensionClass or some such :( + mro = object, # must be an ExtensionClass or some such :( for t in mro: if t in registry: - return registry[t](*args, **kw) + return registry[t](*args, ** kw) else: - return func(*args, **kw) + return func(*args, ** kw) try: wrapper.__name__ = func.__name__ except (TypeError, AttributeError): @@ -620,7 +621,7 @@ # PEP 376 Implementation # ########################## -DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED', 'DATAFILES') +DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED', 'RESOURCES') # Cache _cache_name = {} # maps names to Distribution instances @@ -659,7 +660,7 @@ def clear_cache(): """ Clears the internal cache. 
""" global _cache_name, _cache_name_egg, _cache_path, _cache_path_egg, \ - _cache_generated, _cache_generated_egg + _cache_generated, _cache_generated_egg _cache_name = {} _cache_name_egg = {} @@ -759,13 +760,13 @@ yield path, md5, size def get_resource_path(self, relative_path): - resources_file = self.get_distinfo_file('DATAFILES') - resources_reader = csv_reader(resources_file, delimiter = ',') + resources_file = self.get_distinfo_file('RESOURCES') + resources_reader = csv_reader(resources_file, delimiter=',') for relative, destination in resources_reader: if relative == relative_path: return destination - raise KeyError('No data_file with relative path %s were installed' % - relative_path) + raise KeyError('No resource file with relative path %s were installed' % + relative_path) def get_installed_files(self, local=False): """ @@ -824,13 +825,13 @@ distinfo_dirname, path = path.split(os.sep)[-2:] if distinfo_dirname != self.path.split(os.sep)[-1]: raise DistutilsError("Requested dist-info file does not " - "belong to the %s distribution. '%s' was requested." \ - % (self.name, os.sep.join([distinfo_dirname, path]))) + "belong to the %s distribution. '%s' was requested." \ + % (self.name, os.sep.join([distinfo_dirname, path]))) # The file must be relative if path not in DIST_FILES: raise DistutilsError("Requested an invalid dist-info file: " - "%s" % path) + "%s" % path) # Convert the relative path back to absolute path = os.path.join(self.path, path) @@ -869,11 +870,11 @@ metadata = None """A :class:`distutils2.metadata.DistributionMetadata` instance loaded with the distribution's ``METADATA`` file.""" - _REQUIREMENT = re.compile( \ - r'(?P[-A-Za-z0-9_.]+)\s*' \ - r'(?P(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)?\s*' \ - r'(?P(?:\s*,\s*(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)*)\s*' \ - r'(?P\[.*\])?') + _REQUIREMENT = re.compile(\ + r'(?P[-A-Za-z0-9_.]+)\s*' \ + r'(?P(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)?\s*' \ + r'(?P(?:\s*,\s*(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)*)\s*' \ + r'(?P\[.*\])?') def __init__(self, path, display_warnings=False): self.path = path @@ -959,8 +960,8 @@ else: if match.group('extras'): s = (('Distribution %s uses extra requirements ' - 'which are not supported in distutils') \ - % (self.name)) + 'which are not supported in distutils') \ + % (self.name)) warnings.warn(s) name = match.group('name') version = None @@ -1019,7 +1020,7 @@ def __eq__(self, other): return isinstance(other, EggInfoDistribution) and \ - self.path == other.path + self.path == other.path # See http://docs.python.org/reference/datamodel#object.__hash__ __hash__ = object.__hash__ @@ -1078,7 +1079,7 @@ yield dist -def get_distribution(name, use_egg_info=False, paths=sys.path): +def get_distribution(name, use_egg_info=False, paths=None): """ Scans all elements in ``sys.path`` and looks for all directories ending with ``.dist-info``. 
Returns a :class:`Distribution` @@ -1095,6 +1096,9 @@ :rtype: :class:`Distribution` or :class:`EggInfoDistribution` or None """ + if paths == None: + paths = sys.path + if not _cache_enabled: for dist in _yield_distributions(True, use_egg_info, paths): if dist.name == name: @@ -1137,7 +1141,7 @@ predicate = VersionPredicate(obs) except ValueError: raise DistutilsError(('Distribution %s has ill formed' + - ' obsoletes field') % (dist.name,)) + ' obsoletes field') % (dist.name,)) if name == o_components[0] and predicate.match(version): yield dist break @@ -1183,8 +1187,8 @@ p_name, p_ver = p_components if len(p_ver) < 2 or p_ver[0] != '(' or p_ver[-1] != ')': raise DistutilsError(('Distribution %s has invalid ' + - 'provides field: %s') \ - % (dist.name, p)) + 'provides field: %s') \ + % (dist.name, p)) p_ver = p_ver[1:-1] # trim off the parenthesis if p_name == name and predicate.match(p_ver): yield dist @@ -1206,13 +1210,13 @@ yield dist def resource_path(distribution_name, relative_path): - dist = get_distribution(distribution_name) - if dist != None: - return dist.get_resource_path(relative_path) - raise LookupError('No distribution named %s is installed.' % - distribution_name) + dist = get_distribution(distribution_name) + if dist != None: + return dist.get_resource_path(relative_path) + raise LookupError('No distribution named %s is installed.' % + distribution_name) -def resource_open(distribution_name, relative_path, *args, **kwargs): - file = open(resource_path(distribution_name, relative_path), *args, - **kwargs) +def resource_open(distribution_name, relative_path, * args, ** kwargs): + file = open(resource_path(distribution_name, relative_path), * args, + ** kwargs) return file \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES deleted file mode 100644 --- a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/DATAFILES +++ /dev/null @@ -1,2 +0,0 @@ -babar.png,babar.png -babar.cfg,babar.cfg \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RESOURCES b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RESOURCES new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RESOURCES @@ -0,0 +1,2 @@ +babar.png,babar.png +babar.cfg,babar.cfg \ No newline at end of file diff --git a/distutils2/command/install_distinfo.py b/distutils2/command/install_distinfo.py --- a/distutils2/command/install_distinfo.py +++ b/distutils2/command/install_distinfo.py @@ -12,12 +12,12 @@ # This file was created from the code for the former command install_egg_info -import os import csv -import re -from distutils2.command.cmd import Command from distutils2 import logger from distutils2._backport.shutil import rmtree +from distutils2.command.cmd import Command +import os +import re try: import hashlib except ImportError: @@ -40,7 +40,7 @@ ('no-record', None, "do not generate a RECORD file"), ('no-resources', None, - "do not generate a DATAFILES list installed file") + "do not generate a RESSOURCES list installed file") ] boolean_options = ['requested', 'no-record', 'no-resources'] @@ -76,9 +76,9 @@ metadata = self.distribution.metadata basename = "%s-%s.dist-info" % ( - to_filename(safe_name(metadata['Name'])), - to_filename(safe_version(metadata['Version'])), - ) + to_filename(safe_name(metadata['Name'])), + to_filename(safe_version(metadata['Version'])), + ) self.distinfo_dir = 
os.path.join(self.distinfo_dir, basename) self.outputs = [] @@ -123,7 +123,8 @@ if not self.no_resources: install_data = self.get_finalized_command('install_data') if install_data.get_resources_out() != []: - resources_path = os.path.join(self.distinfo_dir, 'DATAFILES') + resources_path = os.path.join(self.distinfo_dir, + 'RESOURCES') logger.info('creating %s', resources_path) f = open(resources_path, 'wb') try: diff --git a/distutils2/tests/test_resources.py b/distutils2/tests/test_resources.py --- a/distutils2/tests/test_resources.py +++ b/distutils2/tests/test_resources.py @@ -1,13 +1,19 @@ # -*- encoding: utf-8 -*- """Tests for distutils.data.""" -import os +import pkgutil import sys -from distutils2.tests import unittest, support, run_unittest +from distutils2._backport.pkgutil import resource_open +from distutils2._backport.pkgutil import resource_path +from distutils2._backport.pkgutil import disable_cache +from distutils2._backport.pkgutil import enable_cache +from distutils2.command.install_dist import install_dist +from distutils2.resources import resources_dests +from distutils2.tests import run_unittest +from distutils2.tests import unittest from distutils2.tests.test_util import GlobTestCaseBase -from distutils2.resources import resources_dests -from distutils2.command.install_dist import install_dist -from distutils2._backport.pkgutil import resource_open, resource_path +import os +import tempfile class DataFilesTestCase(GlobTestCaseBase): @@ -29,56 +35,56 @@ def test_simple_glob(self): rules = [('', '*.tpl', '{data}')] spec = {'coucou.tpl': '{data}/coucou.tpl', - 'Donotwant': None} + 'Donotwant': None} self.assertRulesMatch(rules, spec) def test_multiple_match(self): rules = [('scripts', '*.bin', '{appdata}'), - ('scripts', '*', '{appscript}')] + ('scripts', '*', '{appscript}')] spec = {'scripts/script.bin': '{appscript}/script.bin', - 'Babarlikestrawberry': None} + 'Babarlikestrawberry': None} self.assertRulesMatch(rules, spec) def test_set_match(self): rules = [('scripts', '*.{bin,sh}', '{appscript}')] spec = {'scripts/script.bin': '{appscript}/script.bin', - 'scripts/babar.sh': '{appscript}/babar.sh', - 'Babarlikestrawberry': None} + 'scripts/babar.sh': '{appscript}/babar.sh', + 'Babarlikestrawberry': None} self.assertRulesMatch(rules, spec) def test_set_match_multiple(self): rules = [('scripts', 'script{s,}.{bin,sh}', '{appscript}')] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', - 'scripts/script.sh': '{appscript}/script.sh', - 'Babarlikestrawberry': None} + 'scripts/script.sh': '{appscript}/script.sh', + 'Babarlikestrawberry': None} self.assertRulesMatch(rules, spec) def test_set_match_exclude(self): rules = [('scripts', '*', '{appscript}'), - ('', '**/*.sh', None)] + ('', '**/*.sh', None)] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', - 'scripts/script.sh': None, - 'Babarlikestrawberry': None} + 'scripts/script.sh': None, + 'Babarlikestrawberry': None} self.assertRulesMatch(rules, spec) def test_glob_in_base(self): rules = [('scrip*', '*.bin', '{appscript}')] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', - 'Babarlikestrawberry': None} + 'Babarlikestrawberry': None} tempdir = self.build_files_tree(spec) self.assertRaises(NotImplementedError, resources_dests, tempdir, rules) def test_recursive_glob(self): rules = [('', '**/*.bin', '{binary}')] spec = {'binary0.bin': '{binary}/binary0.bin', - 'scripts/binary1.bin': '{binary}/scripts/binary1.bin', - 'scripts/bin/binary2.bin': '{binary}/scripts/bin/binary2.bin', - 
'you/kill/pandabear.guy': None} + 'scripts/binary1.bin': '{binary}/scripts/binary1.bin', + 'scripts/bin/binary2.bin': '{binary}/scripts/bin/binary2.bin', + 'you/kill/pandabear.guy': None} self.assertRulesMatch(rules, spec) def test_final_exemple_glob(self): rules = [ - ('mailman/database/schemas/','*', '{appdata}/schemas'), + ('mailman/database/schemas/', '*', '{appdata}/schemas'), ('', '**/*.tpl', '{appdata}/templates'), ('', 'developer-docs/**/*.txt', '{doc}'), ('', 'README', '{doc}'), @@ -103,40 +109,62 @@ self.assertRulesMatch(rules, spec) def test_resource_open(self): - from distutils2._backport.sysconfig import _SCHEMES as sysconfig_SCHEMES - from distutils2._backport.sysconfig import _get_default_scheme - #dirty but hit marmoute - tempdir = self.mkdtemp() - - old_scheme = sysconfig_SCHEMES - sysconfig_SCHEMES.set(_get_default_scheme(), 'config', - tempdir) + #Create a fake-dist + temp_site_packages = tempfile.mkdtemp() - pkg_dir, dist = self.create_dist() - dist_name = dist.metadata['name'] + dist_name = 'test' + dist_info = os.path.join(temp_site_packages, 'test-0.1.dist-info') + os.mkdir(dist_info) - test_path = os.path.join(pkg_dir, 'test.cfg') - self.write_file(test_path, 'Config') - dist.data_files = {test_path : '{config}/test.cfg'} + metadata_path = os.path.join(dist_info, 'METADATA') + resources_path = os.path.join(dist_info, 'RESOURCES') - cmd = install_dist(dist) - cmd.install_dir = os.path.join(pkg_dir, 'inst') - content = 'Config' - - cmd.ensure_finalized() - cmd.run() + metadata_file = open(metadata_path, 'w') - cfg_dest = os.path.join(tempdir, 'test.cfg') + metadata_file.write( +"""Metadata-Version: 1.2 +Name: test +Version: 0.1 +Summary: test +Author: me + """) - self.assertEqual(resource_path(dist_name, test_path), cfg_dest) - self.assertRaises(KeyError, lambda: resource_path(dist_name, 'notexis')) + metadata_file.close() + + test_path = 'test.cfg' + + _, test_resource_path = tempfile.mkstemp() + + test_resource_file = open(test_resource_path, 'w') + + content = 'Config' + test_resource_file.write(content) + test_resource_file.close() + + resources_file = open(resources_path, 'w') + + resources_file.write("""%s,%s""" % (test_path, test_resource_path)) + resources_file.close() + + #Add fake site-packages to sys.path to retrieve fake dist + old_sys_path = sys.path + sys.path.insert(0, temp_site_packages) + + #Force pkgutil to rescan the sys.path + disable_cache() + + #Try to retrieve resources paths and files + self.assertEqual(resource_path(dist_name, test_path), test_resource_path) + self.assertRaises(KeyError, resource_path, dist_name, 'notexis') self.assertEqual(resource_open(dist_name, test_path).read(), content) - self.assertRaises(KeyError, lambda: resource_open(dist_name, 'notexis')) - - sysconfig_SCHEMES = old_scheme + self.assertRaises(KeyError, resource_open, dist_name, 'notexis') + + sys.path = old_sys_path + + enable_cache() def test_suite(): return unittest.makeSuite(DataFilesTestCase) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: merge upstream into resources branches Message-ID: tarek.ziade pushed cc75ed246974 to distutils2: http://hg.python.org/distutils2/rev/cc75ed246974 changeset: 1078:cc75ed246974 parent: 1077:b6ad6f924cc3 parent: 1008:1d1d07ebb991 user: Pierre-Yves David date: Fri Feb 04 23:41:38 2011 +0100 summary: merge upstream into resources branches 
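(To make the rewritten test_resource_open in the previous changeset easier to follow, here is the shape of the RESOURCES file it creates and the lookup that file drives. The distribution name and paths are illustrative; the two-column CSV layout matches get_resource_path in the pkgutil backport shown earlier in this batch.)

    # test-0.1.dist-info/RESOURCES is a plain CSV file, one resource per
    # line, mapping the name a project asks for to the place the installer
    # actually put it, e.g.:
    #
    #   test.cfg,/tmp/tmpXXXXXX

    from distutils2._backport.pkgutil import (resource_path, resource_open,
                                              disable_cache, enable_cache)

    disable_cache()            # force a rescan after sys.path was modified
    try:
        real_path = resource_path('test', 'test.cfg')
        content = resource_open('test', 'test.cfg').read()
    finally:
        enable_cache()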
files: distutils2/_backport/pkgutil.py distutils2/_backport/tests/test_pkgutil.py distutils2/dist.py distutils2/index/simple.py distutils2/tests/test_mkcfg.py diff --git a/CHANGES.txt b/CHANGES.txt --- a/CHANGES.txt +++ b/CHANGES.txt @@ -10,6 +10,7 @@ - Issue #10409: Fixed the Licence selector in mkcfg [tarek] - Issue #9558: Fix build_ext with VS 8.0 [??ric] - Issue #6007: Add disclaimer about MinGW compatibility in docs [??ric] +- Renamed DistributionMetadata to Metadata [ccomb] 1.0a3 - 2010-10-08 ------------------ diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -10,6 +10,9 @@ See the documentation at http://packages.python.org/Distutils2 for more info. +If you want to contribute, please have a look to +http://distutils2.notmyidea.org/contributing.html + **Beware that Distutils2 is in its early stage and should not be used in production. Its API is subject to changes** diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -16,7 +16,7 @@ from md5 import md5 from distutils2.errors import DistutilsError -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.version import suggest_normalized_version, VersionPredicate try: import cStringIO as StringIO @@ -726,7 +726,7 @@ name = '' """The name of the distribution.""" metadata = None - """A :class:`distutils2.metadata.DistributionMetadata` instance loaded with + """A :class:`distutils2.metadata.Metadata` instance loaded with the distribution's ``METADATA`` file.""" requested = False """A boolean that indicates whether the ``REQUESTED`` metadata file is @@ -738,7 +738,7 @@ self.metadata = _cache_path[path].metadata else: metadata_path = os.path.join(path, 'METADATA') - self.metadata = DistributionMetadata(path=metadata_path) + self.metadata = Metadata(path=metadata_path) self.path = path self.name = self.metadata['name'] @@ -868,7 +868,7 @@ name = '' """The name of the distribution.""" metadata = None - """A :class:`distutils2.metadata.DistributionMetadata` instance loaded with + """A :class:`distutils2.metadata.Metadata` instance loaded with the distribution's ``METADATA`` file.""" _REQUIREMENT = re.compile(\ r'(?P[-A-Za-z0-9_.]+)\s*' \ @@ -903,7 +903,7 @@ if path.endswith('.egg'): if os.path.isdir(path): meta_path = os.path.join(path, 'EGG-INFO', 'PKG-INFO') - self.metadata = DistributionMetadata(path=meta_path) + self.metadata = Metadata(path=meta_path) try: req_path = os.path.join(path, 'EGG-INFO', 'requires.txt') requires = open(req_path, 'r').read() @@ -913,7 +913,7 @@ # FIXME handle the case where zipfile is not available zipf = zipimport.zipimporter(path) fileobj = StringIO.StringIO(zipf.get_data('EGG-INFO/PKG-INFO')) - self.metadata = DistributionMetadata(fileobj=fileobj) + self.metadata = Metadata(fileobj=fileobj) try: requires = zipf.get_data('EGG-INFO/requires.txt') except IOError: @@ -927,7 +927,7 @@ requires = req_f.read() except IOError: requires = None - self.metadata = DistributionMetadata(path=path) + self.metadata = Metadata(path=path) self.name = self.metadata['name'] else: raise ValueError('The path must end with .egg-info or .egg') diff --git a/distutils2/_backport/shutil.py b/distutils2/_backport/shutil.py --- a/distutils2/_backport/shutil.py +++ b/distutils2/_backport/shutil.py @@ -376,7 +376,7 @@ archive that is being built. If not provided, the current owner and group will be used. 
- The output tar file will be named 'base_dir' + ".tar", possibly plus + The output tar file will be named 'base_name' + ".tar", possibly plus the appropriate compression extension (".gz", or ".bz2"). Returns the output filename. @@ -451,7 +451,7 @@ def _make_zipfile(base_name, base_dir, verbose=0, dry_run=0, logger=None): """Create a zip file from all the files under 'base_dir'. - The output zip file will be named 'base_dir' + ".zip". Uses either the + The output zip file will be named 'base_name' + ".zip". Uses either the "zipfile" Python module (if available) or the InfoZIP "zip" utility (if installed and found on the default search path). If neither tool is available, raises ExecError. Returns the name of the output zip diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -14,7 +14,7 @@ from distutils2._backport.hashlib import md5 from distutils2.errors import DistutilsError -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import unittest, run_unittest, support from distutils2._backport import pkgutil @@ -244,7 +244,7 @@ dist = Distribution(dist_path) self.assertEqual(dist.name, name) - self.assertTrue(isinstance(dist.metadata, DistributionMetadata)) + self.assertTrue(isinstance(dist.metadata, Metadata)) self.assertEqual(dist.metadata['version'], version) self.assertTrue(isinstance(dist.requested, type(bool()))) diff --git a/distutils2/command/cmd.py b/distutils2/command/cmd.py --- a/distutils2/command/cmd.py +++ b/distutils2/command/cmd.py @@ -10,14 +10,7 @@ from distutils2.errors import DistutilsOptionError from distutils2 import util from distutils2 import logger - -# XXX see if we want to backport this -from distutils2._backport.shutil import copytree, copyfile, move - -try: - from shutil import make_archive -except ImportError: - from distutils2._backport.shutil import make_archive +from distutils2._backport.shutil import copytree, copyfile, move, make_archive class Command(object): diff --git a/distutils2/dist.py b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -15,7 +15,7 @@ from distutils2.fancy_getopt import FancyGetopt from distutils2.util import strtobool, resolve_name from distutils2 import logger -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.config import Config from distutils2.command import get_command_class @@ -145,7 +145,7 @@ # forth) in a separate object -- we're getting to have enough # information here (and enough command-line options) that it's # worth it. 
- self.metadata = DistributionMetadata() + self.metadata = Metadata() # 'cmdclass' maps command names to class objects, so we # can 1) quickly figure out which class to instantiate when diff --git a/distutils2/index/dist.py b/distutils2/index/dist.py --- a/distutils2/index/dist.py +++ b/distutils2/index/dist.py @@ -28,7 +28,7 @@ CantParseArchiveName) from distutils2.version import (suggest_normalized_version, NormalizedVersion, get_version_predicate) -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.util import splitext __all__ = ['ReleaseInfo', 'DistInfo', 'ReleasesList', 'get_infos_from_url'] @@ -66,7 +66,7 @@ self._version = None self.version = version if metadata: - self.metadata = DistributionMetadata(mapping=metadata) + self.metadata = Metadata(mapping=metadata) else: self.metadata = None self.dists = {} @@ -174,7 +174,7 @@ def set_metadata(self, metadata): if not self.metadata: - self.metadata = DistributionMetadata() + self.metadata = Metadata() self.metadata.update(metadata) def __getitem__(self, item): @@ -216,7 +216,6 @@ __hash__ = object.__hash__ - class DistInfo(IndexReference): """Represents a distribution retrieved from an index (sdist, bdist, ...) """ @@ -346,7 +345,7 @@ return "" % self.dist_type return "<%s %s %s>" % ( - self.release.name, self.release.version, self.dist_type or "") + self.release.name, self.release.version, self.dist_type or "") class ReleasesList(IndexReference): diff --git a/distutils2/index/simple.py b/distutils2/index/simple.py --- a/distutils2/index/simple.py +++ b/distutils2/index/simple.py @@ -22,7 +22,7 @@ UnableToDownload, CantParseArchiveName, ReleaseNotFound, ProjectNotFound) from distutils2.index.mirrors import get_mirrors -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.version import get_version_predicate from distutils2 import __version__ as __distutils2_version__ @@ -203,7 +203,7 @@ if not release._metadata: location = release.get_distribution().unpack() pkg_info = os.path.join(location, 'PKG-INFO') - release._metadata = DistributionMetadata(pkg_info) + release._metadata = Metadata(pkg_info) return release def _switch_to_next_mirror(self): diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -41,7 +41,7 @@ _HAS_DOCUTILS = False # public API of this module -__all__ = ('DistributionMetadata', 'PKG_INFO_ENCODING', +__all__ = ('Metadata', 'PKG_INFO_ENCODING', 'PKG_INFO_PREFERRED_VERSION') # Encoding used for the PKG-INFO files @@ -187,7 +187,7 @@ _MISSING = NoDefault() -class DistributionMetadata(object): +class Metadata(object): """The metadata of a release. Supports versions 1.0, 1.1 and 1.2 (auto-detected). 
You can diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -25,18 +25,15 @@ import os import sys +import glob import re import shutil -import glob -import re from ConfigParser import RawConfigParser from textwrap import dedent -if sys.version_info[:2] < (2, 6): - from sets import Set as set try: from hashlib import md5 except ImportError: - from md5 import md5 + from distutils2._backport.hashlib import md5 # importing this with an underscore as it should be replaced by the # dict form or another structures for all purposes from distutils2._trove import all_classifiers as _CLASSIFIERS_LIST @@ -92,10 +89,10 @@ Optionally, you can set other trove identifiers for things such as the human language, programming language, user interface, etc... ''', - 'setup.py found':''' + 'setup.py found': ''' The setup.py script will be executed to retrieve the metadata. A wizard will be run if you answer "n", -''' +''', } # XXX everything needs docstrings and tests (both low-level tests of various @@ -162,6 +159,7 @@ CLASSIFIERS = _build_classifiers_dict(_CLASSIFIERS_LIST) + def _build_licences(classifiers): res = [] for index, item in enumerate(classifiers): @@ -172,6 +170,7 @@ LICENCES = _build_licences(_CLASSIFIERS_LIST) + class MainProgram(object): def __init__(self): self.configparser = None @@ -235,8 +234,8 @@ if ans != 'y': return - #_______mock setup start data = self.data + def setup(**attrs): """Mock the setup(**attrs) in order to retrive metadata.""" # use the distutils v1 processings to correctly parse metadata. @@ -272,11 +271,12 @@ if len(dist.data_files) < 2 or \ isinstance(dist.data_files[1], str): dist.data_files = [('', dist.data_files)] - #add tokens in the destination paths - vars = {'distribution.name':data['name']} + # add tokens in the destination paths + vars = {'distribution.name': data['name']} path_tokens = sysconfig.get_paths(vars=vars).items() - #sort tokens to use the longest one first - path_tokens.sort(cmp=lambda x,y: cmp(len(y), len(x)), + # sort tokens to use the longest one first + # TODO chain two sorted with key arguments, remove cmp + path_tokens.sort(cmp=lambda x, y: cmp(len(y), len(x)), key=lambda x: x[1]) for dest, srcs in (dist.data_files or []): dest = os.path.join(sys.prefix, dest) @@ -291,29 +291,31 @@ package_dirs = dist.package_dir or {} for package, extras in dist.package_data.iteritems() or []: package_dir = package_dirs.get(package, package) - fils = [os.path.join(package_dir, fil) for fil in extras] - data['extra_files'].extend(fils) + files = [os.path.join(package_dir, f) for f in extras] + data['extra_files'].extend(files) # Use README file if its content is the desciption if "description" in data: ref = md5(re.sub('\s', '', self.data['description']).lower()) ref = ref.digest() for readme in glob.glob('README*'): - fob = open(readme) - val = md5(re.sub('\s', '', fob.read()).lower()).digest() - fob.close() + fp = open(readme) + try: + contents = fp.read() + finally: + fp.close() + val = md5(re.sub('\s', '', contents.lower())).digest() if val == ref: del data['description'] data['description-file'] = readme break - #_________ mock setup end # apply monkey patch to distutils (v1) and setuptools (if needed) # (abord the feature if distutils v1 has been killed) try: import distutils.core as DC - getattr(DC, 'setup') # ensure distutils v1 - except ImportError, AttributeError: + DC.setup # ensure distutils v1 + except (ImportError, AttributeError): return saved_setups = [(DC, DC.setup)] DC.setup = setup @@ 
-321,15 +323,15 @@ import setuptools saved_setups.append((setuptools, setuptools.setup)) setuptools.setup = setup - except ImportError, AttributeError: + except (ImportError, AttributeError): pass # get metadata by executing the setup.py with the patched setup(...) - success = False # for python < 2.4 + success = False # for python < 2.4 try: pyenv = globals().copy() execfile(setuppath, pyenv) success = True - finally: #revert monkey patches + finally: # revert monkey patches for patched_module, original_setup in saved_setups: patched_module.setup = original_setup if not self.data: @@ -339,7 +341,8 @@ def inspect_file(self, path): fp = open(path, 'r') try: - for line in [fp.readline() for _ in range(10)]: + for _ in xrange(10): + line = fp.readline() m = re.match(r'^#!.*python((?P\d)(\.\d+)?)?$', line) if m: if m.group('major') == '3': @@ -393,7 +396,6 @@ helptext=_helptext['extra_files']) == 'y': self._set_multi('Extra file/dir name', 'extra_files') - if ask_yn('Do you want to set Trove classifiers?', helptext=_helptext['do_classifier']) == 'y': self.set_classifier() @@ -413,7 +415,6 @@ _pref = ['lib', 'include', 'dist', 'build', '.', '~'] _suf = ['.pyc'] - def to_skip(path): path = relative(path) diff --git a/distutils2/tests/test_command_install_distinfo.py b/distutils2/tests/test_command_install_distinfo.py --- a/distutils2/tests/test_command_install_distinfo.py +++ b/distutils2/tests/test_command_install_distinfo.py @@ -5,7 +5,7 @@ from distutils2.command.install_distinfo import install_distinfo from distutils2.command.cmd import Command -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import unittest, support try: @@ -64,7 +64,7 @@ self.assertEqual(open(os.path.join(dist_info, 'REQUESTED')).read(), '') meta_path = os.path.join(dist_info, 'METADATA') - self.assertTrue(DistributionMetadata(path=meta_path).check()) + self.assertTrue(Metadata(path=meta_path).check()) def test_installer(self): pkg_dir, dist = self.create_dist(name='foo', diff --git a/distutils2/tests/test_command_test.py b/distutils2/tests/test_command_test.py --- a/distutils2/tests/test_command_test.py +++ b/distutils2/tests/test_command_test.py @@ -17,10 +17,6 @@ from distutils2.dist import Distribution from distutils2._backport import pkgutil -try: - any -except NameError: - from distutils2._backport import any EXPECTED_OUTPUT_RE = r'''FAIL: test_blah \(myowntestmodule.SomeTest\) ---------------------------------------------------------------------- diff --git a/distutils2/tests/test_dist.py b/distutils2/tests/test_dist.py --- a/distutils2/tests/test_dist.py +++ b/distutils2/tests/test_dist.py @@ -68,7 +68,7 @@ distutils2.dist.DEBUG = False def test_write_pkg_file(self): - # Check DistributionMetadata handling of Unicode fields + # Check Metadata handling of Unicode fields tmp_dir = self.mkdtemp() my_file = os.path.join(tmp_dir, 'f') cls = Distribution diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -5,7 +5,7 @@ from distutils2 import install from distutils2.index.xmlrpc import Client -from distutils2.metadata import DistributionMetadata +from distutils2.metadata import Metadata from distutils2.tests import run_unittest from distutils2.tests.support import TempdirManager from distutils2.tests.pypi_server import use_xmlrpc_server @@ -18,7 +18,7 @@ def __init__(self, name, version, deps): self.name = name self.version = version - 
self.metadata = DistributionMetadata() + self.metadata = Metadata() self.metadata['Requires-Dist'] = deps self.metadata['Provides-Dist'] = ['%s (%s)' % (name, version)] diff --git a/distutils2/tests/test_metadata.py b/distutils2/tests/test_metadata.py --- a/distutils2/tests/test_metadata.py +++ b/distutils2/tests/test_metadata.py @@ -4,7 +4,7 @@ import platform from StringIO import StringIO -from distutils2.metadata import (DistributionMetadata, +from distutils2.metadata import (Metadata, PKG_INFO_PREFERRED_VERSION) from distutils2.tests import run_unittest, unittest from distutils2.tests.support import LoggingCatcher, WarningsCatcher @@ -12,7 +12,7 @@ MetadataUnrecognizedVersionError) -class DistributionMetadataTestCase(LoggingCatcher, WarningsCatcher, +class MetadataTestCase(LoggingCatcher, WarningsCatcher, unittest.TestCase): def test_instantiation(self): @@ -24,35 +24,35 @@ fp.close() fp = StringIO(contents) - m = DistributionMetadata() + m = Metadata() self.assertRaises(MetadataUnrecognizedVersionError, m.items) - m = DistributionMetadata(PKG_INFO) + m = Metadata(PKG_INFO) self.assertEqual(len(m.items()), 22) - m = DistributionMetadata(fileobj=fp) + m = Metadata(fileobj=fp) self.assertEqual(len(m.items()), 22) - m = DistributionMetadata(mapping=dict(name='Test', version='1.0')) + m = Metadata(mapping=dict(name='Test', version='1.0')) self.assertEqual(len(m.items()), 11) d = dict(m.items()) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, fileobj=fp) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, mapping=d) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, fileobj=fp, mapping=d) - self.assertRaises(TypeError, DistributionMetadata, + self.assertRaises(TypeError, Metadata, PKG_INFO, mapping=m, fileobj=fp) def test_metadata_read_write(self): PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') - metadata = DistributionMetadata(PKG_INFO) + metadata = Metadata(PKG_INFO) out = StringIO() metadata.write_file(out) out.seek(0) - res = DistributionMetadata() + res = Metadata() res.read_file(out) for k in metadata.keys(): self.assertTrue(metadata[k] == res[k]) @@ -62,7 +62,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata(platform_dependent=True) + metadata = Metadata(platform_dependent=True) metadata.read_file(StringIO(content)) self.assertEqual(metadata['Requires-Dist'], ['bar']) metadata['Name'] = "baz; sys.platform == 'blah'" @@ -72,7 +72,7 @@ # test with context context = {'sys.platform': 'okook'} - metadata = DistributionMetadata(platform_dependent=True, + metadata = Metadata(platform_dependent=True, execution_context=context) metadata.read_file(StringIO(content)) self.assertEqual(metadata['Requires-Dist'], ['foo']) @@ -81,7 +81,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata() + metadata = Metadata() metadata.read_file(StringIO(content)) # see if we can read the description now @@ -100,7 +100,7 @@ PKG_INFO = os.path.join(os.path.dirname(__file__), 'PKG-INFO') content = open(PKG_INFO).read() content = content % sys.platform - metadata = DistributionMetadata(fileobj=StringIO(content)) + metadata = Metadata(fileobj=StringIO(content)) self.assertIn('Version', metadata.keys()) self.assertIn('0.5', 
metadata.values()) self.assertIn(('Version', '0.5'), metadata.items()) @@ -111,7 +111,7 @@ self.assertEqual(metadata['Version'], '0.7') def test_versions(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Obsoletes'] = 'ok' self.assertEqual(metadata['Metadata-Version'], '1.1') @@ -142,7 +142,7 @@ # XXX Spurious Warnings were disabled def XXXtest_warnings(self): - metadata = DistributionMetadata() + metadata = Metadata() # these should raise a warning values = (('Requires-Dist', 'Funky (Groovie)'), @@ -155,7 +155,7 @@ self.assertEqual(len(self.logs), 2) def test_multiple_predicates(self): - metadata = DistributionMetadata() + metadata = Metadata() # see for "3" instead of "3.0" ??? # its seems like the MINOR VERSION can be omitted @@ -165,14 +165,14 @@ self.assertEqual(len(self.warnings), 0) def test_project_url(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Project-URL'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-URL'], [('one', 'http://ok')]) self.assertEqual(metadata.version, '1.2') def test_check_version(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -181,7 +181,7 @@ self.assertEqual(missing, ['Version']) def test_check_version_strict(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -190,7 +190,7 @@ self.assertRaises(MetadataMissingError, metadata.check, strict=True) def test_check_name(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -199,7 +199,7 @@ self.assertEqual(missing, ['Name']) def test_check_name_strict(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Home-page'] = 'http://pypi.python.org' metadata['Author'] = 'Monty Python' @@ -208,7 +208,7 @@ self.assertRaises(MetadataMissingError, metadata.check, strict=True) def test_check_author(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' @@ -217,7 +217,7 @@ self.assertEqual(missing, ['Author']) def test_check_homepage(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Name'] = 'vimpdb' metadata['Author'] = 'Monty Python' @@ -226,7 +226,7 @@ self.assertEqual(missing, ['Home-page']) def test_check_predicates(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = 'rr' metadata['Name'] = 'vimpdb' metadata['Home-page'] = 'http://pypi.python.org' @@ -242,7 +242,7 @@ self.assertEqual(len(warnings), 4) def test_best_choice(self): - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' self.assertEqual(metadata.version, PKG_INFO_PREFERRED_VERSION) metadata['Classifier'] = ['ok'] @@ -251,7 +251,7 @@ def test_project_urls(self): # project-url is a bit specific, make sure we write it # properly in PKG-INFO - metadata = DistributionMetadata() + metadata = Metadata() metadata['Version'] = '1.0' metadata['Project-Url'] = [('one', 'http://ok')] self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')]) @@ -262,13 +262,13 @@ self.assertIn('Project-URL: one,http://ok', res) 
file_.seek(0) - metadata = DistributionMetadata() + metadata = Metadata() metadata.read_file(file_) self.assertEqual(metadata['Project-Url'], [('one', 'http://ok')]) def test_suite(): - return unittest.makeSuite(DistributionMetadataTestCase) + return unittest.makeSuite(MetadataTestCase) if __name__ == '__main__': run_unittest(test_suite()) diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -1,11 +1,8 @@ # -*- coding: utf-8 -*- """Tests for distutils.mkcfg.""" import os -import os.path as osp import sys import StringIO -if sys.version_info[:2] < (2, 6): - from sets import Set as set from textwrap import dedent from distutils2.tests import run_unittest, support, unittest @@ -114,9 +111,11 @@ sys.stdin.write('y\n') sys.stdin.seek(0) main() - fid = open(osp.join(self.wdir, 'setup.cfg')) - lines = set([line.rstrip() for line in fid]) - fid.close() + fp = open(os.path.join(self.wdir, 'setup.cfg')) + try: + lines = set([line.rstrip() for line in fp]) + finally: + fp.close() self.assertEqual(lines, set(['', '[metadata]', 'version = 0.2', @@ -183,9 +182,11 @@ sys.stdin.write('y\n') sys.stdin.seek(0) main() - fid = open(osp.join(self.wdir, 'setup.cfg')) - lines = set([line.strip() for line in fid]) - fid.close() + fp = open(os.path.join(self.wdir, 'setup.cfg')) + try: + lines = set([line.strip() for line in fp]) + finally: + fp.close() self.assertEqual(lines, set(['', '[metadata]', 'version = 0.2', diff --git a/docs/design/pep-0376.txt b/docs/design/pep-0376.txt --- a/docs/design/pep-0376.txt +++ b/docs/design/pep-0376.txt @@ -425,7 +425,7 @@ - ``name``: The name of the distribution. -- ``metadata``: A ``DistributionMetadata`` instance loaded with the +- ``metadata``: A ``Metadata`` instance loaded with the distribution's PKG-INFO file. - ``requested``: A boolean that indicates whether the REQUESTED diff --git a/docs/source/distutils/apiref.rst b/docs/source/distutils/apiref.rst --- a/docs/source/distutils/apiref.rst +++ b/docs/source/distutils/apiref.rst @@ -888,7 +888,7 @@ .. function:: make_zipfile(base_name, base_dir[, verbose=0, dry_run=0]) Create a zip file from all files in and under *base_dir*. The output zip file - will be named *base_dir* + :file:`.zip`. Uses either the :mod:`zipfile` Python + will be named *base_name* + :file:`.zip`. Uses either the :mod:`zipfile` Python module (if available) or the InfoZIP :file:`zip` utility (if installed and found on the default search path). If neither tool is available, raises :exc:`DistutilsExecError`. Returns the name of the output zip file. @@ -1060,7 +1060,9 @@ .. module:: distutils2.metadata -.. autoclass:: distutils2.metadata.DistributionMetadata +.. FIXME CPython/stdlib docs don't use autoclass, write doc manually here + +.. autoclass:: distutils2.metadata.Metadata :members: :mod:`distutils2.util` --- Miscellaneous other utility functions diff --git a/docs/source/distutils/examples.rst b/docs/source/distutils/examples.rst --- a/docs/source/distutils/examples.rst +++ b/docs/source/distutils/examples.rst @@ -298,11 +298,11 @@ ``2.7`` or ``3.2``. 
You can read back this static file, by using the -:class:`distutils2.dist.DistributionMetadata` class and its +:class:`distutils2.dist.Metadata` class and its :func:`read_pkg_file` method:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata() + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata() >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) >>> metadata.name 'distribute' @@ -315,7 +315,7 @@ loads its values:: >>> pkg_info_path = 'distribute-0.6.8-py2.7.egg-info' - >>> DistributionMetadata(pkg_info_path).name + >>> Metadata(pkg_info_path).name 'distribute' diff --git a/docs/source/index.rst b/docs/source/index.rst --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -29,7 +29,7 @@ .. __: http://bitbucket.org/tarek/distutils2/wiki/GSoC_2010_teams If you???re looking for information on how to contribute, head to -:doc:`devresources`. +:doc:`devresources`, and be sure to have a look at :doc:`contributing`. Documentation @@ -76,6 +76,7 @@ distutils/index library/distutils2 library/pkgutil + contributing Indices and tables diff --git a/docs/source/library/distutils2.metadata.rst b/docs/source/library/distutils2.metadata.rst --- a/docs/source/library/distutils2.metadata.rst +++ b/docs/source/library/distutils2.metadata.rst @@ -4,7 +4,7 @@ .. module:: distutils2.metadata -Distutils2 provides a :class:`~distutils2.metadata.DistributionMetadata` class that can read and +Distutils2 provides a :class:`~distutils2.metadata.Metadata` class that can read and write metadata files. This class is compatible with all metadata versions: * 1.0: :PEP:`241` @@ -19,11 +19,11 @@ Reading metadata ================ -The :class:`~distutils2.metadata.DistributionMetadata` class can be instantiated with the path of +The :class:`~distutils2.metadata.Metadata` class can be instantiated with the path of the metadata file, and provides a dict-like interface to the values:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata('PKG-INFO') + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata('PKG-INFO') >>> metadata.keys()[:5] ('Metadata-Version', 'Name', 'Version', 'Platform', 'Supported-Platform') >>> metadata['Name'] @@ -35,13 +35,13 @@ The fields that supports environment markers can be automatically ignored if the object is instantiated using the ``platform_dependent`` option. -:class:`~distutils2.metadata.DistributionMetadata` will interpret in the case the markers and will +:class:`~distutils2.metadata.Metadata` will interpret in the case the markers and will automatically remove the fields that are not compliant with the running environment. Here's an example under Mac OS X. The win32 dependency we saw earlier is ignored:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True) + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata('PKG-INFO', platform_dependent=True) >>> metadata['Requires-Dist'] ['bar'] @@ -53,9 +53,9 @@ Here's an example, simulating a win32 environment:: - >>> from distutils2.metadata import DistributionMetadata + >>> from distutils2.metadata import Metadata >>> context = {'sys.platform': 'win32'} - >>> metadata = DistributionMetadata('PKG-INFO', platform_dependent=True, + >>> metadata = Metadata('PKG-INFO', platform_dependent=True, ... execution_context=context) ... 
>>> metadata['Requires-Dist'] = ["pywin32; sys.platform == 'win32'", @@ -83,8 +83,8 @@ Some fields in :PEP:`345` have to follow a version scheme in their versions predicate. When the scheme is violated, a warning is emitted:: - >>> from distutils2.metadata import DistributionMetadata - >>> metadata = DistributionMetadata() + >>> from distutils2.metadata import Metadata + >>> metadata = Metadata() >>> metadata['Requires-Dist'] = ['Funky (Groovie)'] "Funky (Groovie)" is not a valid predicate >>> metadata['Requires-Dist'] = ['Funky (1.2)'] -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: fix indent in test Message-ID: tarek.ziade pushed cdb85a835177 to distutils2: http://hg.python.org/distutils2/rev/cdb85a835177 changeset: 1079:cdb85a835177 user: Pierre-Yves David date: Sat Feb 05 01:32:06 2011 +0100 summary: fix indent in test files: distutils2/_backport/tests/test_pkgutil.py diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -399,9 +399,9 @@ # Now, test if the egg-info distributions are found correctly as well fake_dists += [('bacon', '0.1'), ('cheese', '2.0.2'), - ('coconuts-aster', '10.3'), - ('banana', '0.4'), ('strawberry', '0.6'), - ('truffles', '5.0'), ('nut', 'funkyversion')] + ('coconuts-aster', '10.3'), + ('banana', '0.4'), ('strawberry', '0.6'), + ('truffles', '5.0'), ('nut', 'funkyversion')] found_dists = [] dists = [dist for dist in get_distributions(use_egg_info=True)] @@ -494,7 +494,7 @@ l = [dist.name for dist in provides_distribution('truffles', \ '!=1.1,<=2.0', - use_egg_info=True)] + use_egg_info=True)] checkLists(l, ['choxie', 'bacon', 'cheese']) l = [dist.name for dist in provides_distribution('truffles', '>1.0')] @@ -567,31 +567,29 @@ checkLists = lambda x, y: self.assertListEqual(sorted(x), sorted(y)) eggs = [('bacon', '0.1'), ('banana', '0.4'), ('strawberry', '0.6'), - ('truffles', '5.0'), ('cheese', '2.0.2'), - ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] + ('truffles', '5.0'), ('cheese', '2.0.2'), + ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] dists = [('choxie', '2.0.0.9'), ('grammar', '1.0a4'), - ('towel-stuff', '0.1'), ('babar', '0.1')] + ('towel-stuff', '0.1'), ('babar', '0.1')] checkLists([], _yield_distributions(False, False)) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(False, True) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(False, True) + if dist.path.startswith(self.fake_dists_path)] checkLists(eggs, found) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(True, False) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(True, False) + if dist.path.startswith(self.fake_dists_path)] checkLists(dists, found) found = [(dist.name, dist.metadata['Version']) - for dist in _yield_distributions(True, True) - if dist.path.startswith(self.fake_dists_path)] + for dist in _yield_distributions(True, True) + if dist.path.startswith(self.fake_dists_path)] checkLists(dists + eggs, found) - - def test_suite(): suite = unittest.TestSuite() load = unittest.defaultTestLoader.loadTestsFromTestCase -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 
16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: fix pkgutil test by adding babar 0.1 to installed fake_dist Message-ID: tarek.ziade pushed a35da5d5a84b to distutils2: http://hg.python.org/distutils2/rev/a35da5d5a84b changeset: 1080:a35da5d5a84b user: Pierre-Yves David date: Sat Feb 05 01:44:37 2011 +0100 summary: fix pkgutil test by adding babar 0.1 to installed fake_dist files: distutils2/_backport/tests/test_pkgutil.py diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -377,7 +377,7 @@ # Lookup all distributions found in the ``sys.path``. # This test could potentially pick up other installed distributions fake_dists = [('grammar', '1.0a4'), ('choxie', '2.0.0.9'), - ('towel-stuff', '0.1')] + ('towel-stuff', '0.1'), ('babar', '0.1')] found_dists = [] # Verify the fake dists have been found. -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Improve iglob error handling. Message-ID: tarek.ziade pushed a9b19122fb32 to distutils2: http://hg.python.org/distutils2/rev/a9b19122fb32 changeset: 1082:a9b19122fb32 user: Pierre-Yves David date: Sat Feb 05 02:58:54 2011 +0100 summary: Improve iglob error handling. iblog now raise ValueError when riche iglob are malformated. Related test are included in this changeset. files: distutils2/tests/test_util.py distutils2/util.py diff --git a/distutils2/tests/test_util.py b/distutils2/tests/test_util.py --- a/distutils2/tests/test_util.py +++ b/distutils2/tests/test_util.py @@ -625,6 +625,34 @@ 'Donotwant': False} self.assertGlobMatch(glob, spec) + def test_invalid_glob_pattern(self): + invalids = [ + 'ppooa**', + 'azzaeaz4**/', + '/**ddsfs', + '**##1e"&e', + 'DSFb**c009', + '{' + '{aaQSDFa' + '}' + 'aQSDFSaa}' + '{**a,' + ',**a}' + '{a**,' + ',b**}' + '{a**a,babar}' + '{bob,b**z}' + ] + msg = "%r is not supposed to be a valid pattern" + for pattern in invalids: + try: + iglob(pattern) + except ValueError: + continue + else: + self.fail("%r is not a valid iglob pattern" % pattern) + + def test_suite(): suite = unittest.makeSuite(UtilTestCase) diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -1009,14 +1009,28 @@ self.options, self.explicit) RICH_GLOB = re.compile(r'\{([^}]*)\}') +_CHECK_RECURSIVE_GLOB = re.compile(r'[^/,{]\*\*|\*\*[^/,}]') +_CHECK_MISMATCH_SET = re.compile(r'^[^{]*\}|\{[^}]*$') + def iglob(path_glob): """Richer glob than the std glob module support ** and {opt1,opt2,opt3}""" + if _CHECK_RECURSIVE_GLOB.search(path_glob): + msg = """Invalid glob %r: Recursive glob "**" must be used alone""" + raise ValueError(msg % path_glob) + if _CHECK_MISMATCH_SET.search(path_glob): + msg = """Invalid glob %r: Mismatching set marker '{' or '}'""" + raise ValueError(msg % path_glob) + return _iglob(path_glob) + + +def _iglob(path_glob): + """Actual logic of the iglob function""" rich_path_glob = RICH_GLOB.split(path_glob, 1) if len(rich_path_glob) > 1: assert len(rich_path_glob) == 3, rich_path_glob prefix, set, suffix = rich_path_glob for item in set.split(','): - for path in iglob( ''.join((prefix, item, suffix))): + for path in _iglob( ''.join((prefix, item, suffix))): yield 
path else: if '**' not in path_glob: @@ -1032,7 +1046,7 @@ radical = radical.lstrip('/') for (path, dir, files) in os.walk(prefix): path = os.path.normpath(path) - for file in iglob(os.path.join(path, radical)): + for file in _iglob(os.path.join(path, radical)): yield file -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: when stdlib iglob is not available use glob instead for extended iblog. Message-ID: tarek.ziade pushed 3d3b78d98f78 to distutils2: http://hg.python.org/distutils2/rev/3d3b78d98f78 changeset: 1081:3d3b78d98f78 user: Pierre-Yves David date: Sat Feb 05 02:54:48 2011 +0100 summary: when stdlib iglob is not available use glob instead for extended iblog. files: distutils2/util.py diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -15,7 +15,10 @@ from subprocess import call as sub_call from copy import copy from fnmatch import fnmatchcase -from glob import iglob as std_iglob +try: + from glob import iglob as std_iglob +except ImportError: + from glob import glob as std_iglob # for python < 2.5 from ConfigParser import RawConfigParser from inspect import getsource -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: remove useless SmartGlob class used by distutils2 module. Message-ID: tarek.ziade pushed 06cf46799b59 to distutils2: http://hg.python.org/distutils2/rev/06cf46799b59 changeset: 1083:06cf46799b59 user: Pierre-Yves David date: Sat Feb 05 17:08:29 2011 +0100 summary: remove useless SmartGlob class used by distutils2 module. files: distutils2/resources.py diff --git a/distutils2/resources.py b/distutils2/resources.py --- a/distutils2/resources.py +++ b/distutils2/resources.py @@ -3,39 +3,39 @@ __all__ = ['resources_dests'] -class SmartGlob(object): +def _expand(root_dir, glob_base, glob_suffix, destination): + """search for file in a directory and return they expected destination. - def __init__(self, base, suffix): - self.base = base - self.suffix = suffix - - - def expand(self, basepath, destination): - if self.base: - base = os.path.join(basepath, self.base) - else: - base = basepath - if '*' in base or '{' in base or '}' in base: - raise NotImplementedError('glob are not supported into base part\ - of resources definition. %r is an invalide basepath' % base) - absglob = os.path.join(base, self.suffix) - for glob_file in iglob(absglob): - path_suffix = glob_file[len(base):].lstrip('/') - relpath = glob_file[len(basepath):].lstrip('/') - dest = os.path.join(destination, path_suffix) - yield relpath, dest + root_dir: directory where to search for resources. + glob_base: part of the path not included in destination. + glob_suffix: part of the path reused in the destination. + destination: base part of the destination. + """ + if glob_base: + base = os.path.join(root_dir, glob_base) + else: + base = root_dir + if '*' in base or '{' in base or '}' in base: + raise NotImplementedError('glob are not supported into base part\ + of resources definition. 
%r is an invalide root_dir' % base) + absglob = os.path.join(base, glob_suffix) + for glob_file in iglob(absglob): + path_suffix = glob_file[len(base):].lstrip('/') + relpath = glob_file[len(root_dir):].lstrip('/') + dest = os.path.join(destination, path_suffix) + yield relpath, dest def resources_dests(resources_dir, rules): + """find destination of ressources files""" destinations = {} for (base, suffix, glob_dest) in rules: - sglob = SmartGlob(base, suffix) if glob_dest is None: delete = True dest = '' else: delete = False dest = glob_dest - for resource_file, file_dest in sglob.expand(resources_dir, dest): + for resource_file, file_dest in _expand(resources_dir, base, suffix, dest): if delete and resource_file in destinations: del destinations[resource_file] else: -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: simplify expand function by remove the "destination" argument. Message-ID: tarek.ziade pushed 9e136b0d0eea to distutils2: http://hg.python.org/distutils2/rev/9e136b0d0eea changeset: 1084:9e136b0d0eea user: Pierre-Yves David date: Sat Feb 05 17:14:32 2011 +0100 summary: simplify expand function by remove the "destination" argument. files: distutils2/resources.py diff --git a/distutils2/resources.py b/distutils2/resources.py --- a/distutils2/resources.py +++ b/distutils2/resources.py @@ -3,13 +3,12 @@ __all__ = ['resources_dests'] -def _expand(root_dir, glob_base, glob_suffix, destination): - """search for file in a directory and return they expected destination. +def _expand(root_dir, glob_base, glob_suffix): + """search for file in a directory and return they radical part. root_dir: directory where to search for resources. - glob_base: part of the path not included in destination. - glob_suffix: part of the path reused in the destination. - destination: base part of the destination. + glob_base: part of the path not included in radical. + glob_suffix: part of the path used as radical. 
""" if glob_base: base = os.path.join(root_dir, glob_base) @@ -22,22 +21,15 @@ for glob_file in iglob(absglob): path_suffix = glob_file[len(base):].lstrip('/') relpath = glob_file[len(root_dir):].lstrip('/') - dest = os.path.join(destination, path_suffix) - yield relpath, dest + yield relpath, path_suffix def resources_dests(resources_dir, rules): """find destination of ressources files""" destinations = {} for (base, suffix, glob_dest) in rules: - if glob_dest is None: - delete = True - dest = '' - else: - delete = False - dest = glob_dest - for resource_file, file_dest in _expand(resources_dir, base, suffix, dest): - if delete and resource_file in destinations: - del destinations[resource_file] + for resource_file, radical in _expand(resources_dir, base, suffix): + if glob_dest is None: + destinations.pop(resource_file, None) #remove the entry if it was here else: - destinations[resource_file] = file_dest + destinations[resource_file] = os.path.join(glob_dest, radical) return destinations -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: Allow glob in base part of ressource files Message-ID: tarek.ziade pushed 495547cc414e to distutils2: http://hg.python.org/distutils2/rev/495547cc414e changeset: 1086:495547cc414e user: Pierre-Yves David date: Sat Feb 05 17:44:01 2011 +0100 summary: Allow glob in base part of ressource files We should be carefull that conflicting rename are detected. files: distutils2/resources.py distutils2/tests/test_resources.py diff --git a/distutils2/resources.py b/distutils2/resources.py --- a/distutils2/resources.py +++ b/distutils2/resources.py @@ -18,14 +18,12 @@ base = os.path.join(root_dir, glob_base) else: base = root_dir - if '*' in base or '{' in base or '}' in base: - raise NotImplementedError('glob are not supported into base part\ - of resources definition. 
%r is an invalide root_dir' % base) - absglob = os.path.join(base, glob_suffix) - for glob_file in iglob(absglob): - path_suffix = _rel_path(base, glob_file) - relpath = _rel_path(root_dir, glob_file) - yield relpath, path_suffix + for base_dir in iglob(base): + absglob = os.path.join(base_dir, glob_suffix) + for glob_file in iglob(absglob): + path_suffix = _rel_path(base_dir, glob_file) + relpath = _rel_path(root_dir, glob_file) + yield relpath, path_suffix def resources_dests(resources_dir, rules): """find destination of ressources files""" diff --git a/distutils2/tests/test_resources.py b/distutils2/tests/test_resources.py --- a/distutils2/tests/test_resources.py +++ b/distutils2/tests/test_resources.py @@ -70,9 +70,10 @@ def test_glob_in_base(self): rules = [('scrip*', '*.bin', '{appscript}')] spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', - 'Babarlikestrawberry': None} - tempdir = self.build_files_tree(spec) - self.assertRaises(NotImplementedError, resources_dests, tempdir, rules) + 'scripouille/babar.bin': '{appscript}/babar.bin', + 'scriptortu/lotus.bin': '{appscript}/lotus.bin', + 'Babarlikestrawberry': None} + self.assertRulesMatch(rules, spec) def test_recursive_glob(self): rules = [('', '**/*.bin', '{binary}')] -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:58 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:58 +0100 Subject: [Python-checkins] distutils2: put relative path logic in a dedicated function Message-ID: tarek.ziade pushed 57af7fec7fd4 to distutils2: http://hg.python.org/distutils2/rev/57af7fec7fd4 changeset: 1085:57af7fec7fd4 user: Pierre-Yves David date: Sat Feb 05 17:35:54 2011 +0100 summary: put relative path logic in a dedicated function files: distutils2/resources.py diff --git a/distutils2/resources.py b/distutils2/resources.py --- a/distutils2/resources.py +++ b/distutils2/resources.py @@ -3,6 +3,10 @@ __all__ = ['resources_dests'] +def _rel_path(base, path): + assert path.startswith(base) + return path[len(base):].lstrip('/') + def _expand(root_dir, glob_base, glob_suffix): """search for file in a directory and return they radical part. @@ -19,8 +23,8 @@ of resources definition. 
%r is an invalide root_dir' % base) absglob = os.path.join(base, glob_suffix) for glob_file in iglob(absglob): - path_suffix = glob_file[len(base):].lstrip('/') - relpath = glob_file[len(root_dir):].lstrip('/') + path_suffix = _rel_path(base, glob_file) + relpath = _rel_path(root_dir, glob_file) yield relpath, path_suffix def resources_dests(resources_dir, rules): -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: simplify resources_dests by removing _expand function Message-ID: tarek.ziade pushed d610c0f6107e to distutils2: http://hg.python.org/distutils2/rev/d610c0f6107e changeset: 1087:d610c0f6107e user: Pierre-Yves David date: Sat Feb 05 18:04:10 2011 +0100 summary: simplify resources_dests by removing _expand function files: distutils2/resources.py diff --git a/distutils2/resources.py b/distutils2/resources.py --- a/distutils2/resources.py +++ b/distutils2/resources.py @@ -1,37 +1,22 @@ import os from distutils2.util import iglob -__all__ = ['resources_dests'] - def _rel_path(base, path): assert path.startswith(base) return path[len(base):].lstrip('/') -def _expand(root_dir, glob_base, glob_suffix): - """search for file in a directory and return they radical part. - - root_dir: directory where to search for resources. - glob_base: part of the path not included in radical. - glob_suffix: part of the path used as radical. - """ - if glob_base: - base = os.path.join(root_dir, glob_base) - else: - base = root_dir - for base_dir in iglob(base): - absglob = os.path.join(base_dir, glob_suffix) - for glob_file in iglob(absglob): - path_suffix = _rel_path(base_dir, glob_file) - relpath = _rel_path(root_dir, glob_file) - yield relpath, path_suffix - -def resources_dests(resources_dir, rules): +def resources_dests(resources_root, rules): """find destination of ressources files""" destinations = {} - for (base, suffix, glob_dest) in rules: - for resource_file, radical in _expand(resources_dir, base, suffix): - if glob_dest is None: - destinations.pop(resource_file, None) #remove the entry if it was here - else: - destinations[resource_file] = os.path.join(glob_dest, radical) + for (base, suffix, dest) in rules: + prefix = os.path.join(resources_root, base) + for abs_base in iglob(prefix): + abs_glob = os.path.join(abs_base, suffix) + for abs_path in iglob(abs_glob): + resource_file = _rel_path(resources_root, abs_path) + if dest is None: #remove the entry if it was here + destinations.pop(resource_file, None) + else: + rel_path = _rel_path(abs_base, abs_path) + destinations[resource_file] = os.path.join(dest, rel_path) return destinations -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: Add an XXX marker on a silently passing error in package data files. Message-ID: tarek.ziade pushed 375581a47a69 to distutils2: http://hg.python.org/distutils2/rev/375581a47a69 changeset: 1088:375581a47a69 user: Pierre-Yves David date: Sun Feb 06 13:21:19 2011 +0100 summary: Add an XXX marker on a silently passing error in package data files. 
files: distutils2/config.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -190,7 +190,7 @@ for data in files.get('package_data', []): data = data.split('=') if len(data) != 2: - continue + continue # XXX error should never pass silently key, value = data self.dist.package_data[key.strip()] = value.strip() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: Change resource files declaration in setup.cfg Message-ID: tarek.ziade pushed 3841fa64b64c to distutils2: http://hg.python.org/distutils2/rev/3841fa64b64c changeset: 1089:3841fa64b64c user: Pierre-Yves David date: Sun Feb 06 18:23:06 2011 +0100 summary: Change resource files declaration in setup.cfg Resource files declaration move from they own section ``[resources]`` to a ``resources`` key in the ``[files]`` section. The format of the declaration in the ``resources`` key stay the same:: [] = {dispatch}/destination/ this change is motivated by the fact that config parser "key" are always lower case and can't hold file path information propertly. files: distutils2/config.py distutils2/mkcfg.py distutils2/tests/test_config.py distutils2/tests/test_mkcfg.py diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -106,6 +106,7 @@ return value def _read_setup_cfg(self, parser, cfg_filename): + cfg_directory = os.path.dirname(os.path.abspath(cfg_filename)) content = {} for section in parser.sections(): content[section] = dict(parser.items(section)) @@ -197,24 +198,21 @@ # manifest template self.dist.extra_files = files.get('extra_files', []) - if 'resources' in content: resources = [] - for glob, destination in content['resources'].iteritems(): - splitted_glob = glob.split(' ', 1) - if len(splitted_glob) == 1: + for rule in files.get('resources', []): + glob , destination = rule.split('=', 1) + rich_glob = glob.strip().split(' ', 1) + if len(rich_glob) == 2: + prefix, suffix = rich_glob + else: + assert len(rich_glob) == 1 prefix = '' - suffix = splitted_glob[0] - else: - prefix = splitted_glob[0] - suffix = splitted_glob[1] + suffix = glob if destination == '': destination = None - resources.append((prefix, suffix, destination)) - - directory = os.path.dirname(os.path.join(os.getcwd(), cfg_filename)) - data_files = resources_dests(directory, resources) - self.dist.data_files = data_files - + resources.append((prefix.strip(), suffix.strip(), destination.strip())) + self.dist.data_files = resources_dests(cfg_directory, resources) + ext_modules = self.dist.ext_modules for section_key in content: labels = section_key.split('=') diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -629,9 +629,11 @@ continue fp.write('%s = %s\n' % (name, '\n '.join(self.data[name]).strip())) - fp.write('\n[resources]\n') + fp.write('\nresources =\n') for src, dest in self.data['resources']: - fp.write('%s = %s\n' % (src, dest)) + fp.write(' %s = %s\n' % (src, dest)) + fp.write('\n') + finally: fp.close() diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -73,10 +73,10 @@ recursive-include examples *.txt *.py prune examples/sample?/build -[resources] -bm/ {b1,b2}.gif = {icon} -cfg/ data.cfg = 
{config} -init_script = {script} +resources= + bm/ {b1,b2}.gif = {icon} + Cf*/ *.CFG = {config}/baBar/ + init_script = {script}/JunGle/ [global] commands = @@ -196,8 +196,8 @@ os.mkdir('bm') self.write_file(os.path.join('bm', 'b1.gif'), '') self.write_file(os.path.join('bm', 'b2.gif'), '') - os.mkdir('cfg') - self.write_file(os.path.join('cfg', 'data.cfg'), '') + os.mkdir('Cfg') + self.write_file(os.path.join('Cfg', 'data.CFG'), '') self.write_file('init_script', '') # try to load the metadata now @@ -243,11 +243,12 @@ self.assertEqual(dist.packages, ['one', 'two', 'three']) self.assertEqual(dist.py_modules, ['haven']) self.assertEqual(dist.package_data, {'cheese': 'data/templates/*'}) - self.assertEqual(dist.data_files, + self.assertEqual( {'bm/b1.gif' : '{icon}/b1.gif', 'bm/b2.gif' : '{icon}/b2.gif', - 'cfg/data.cfg' : '{config}/data.cfg', - 'init_script' : '{script}/init_script'}) + 'Cfg/data.CFG' : '{config}/baBar/data.CFG', + 'init_script' : '{script}/JunGle/init_script'}, + dist.data_files) self.assertEqual(dist.package_dir, 'src') diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -146,9 +146,9 @@ ' pyxfoil/fengine.so', 'scripts = my_script', ' bin/run', - '[resources]', - 'README.rst = {doc}', - 'pyxfoil.1 = {man}', + 'resources =', + ' README.rst = {doc}', + ' pyxfoil.1 = {man}', ])) def test_convert_setup_py_to_cfg_with_description_in_readme(self): @@ -166,7 +166,7 @@ url='http://www.python-science.org/project/pyxfoil', license='GPLv2', packages=['pyxfoil'], - package_data={'pyxfoil' : ['fengine.so']}, + package_data={'pyxfoil' : ['fengine.so', 'babar.so']}, data_files=[ ('share/doc/pyxfoil', ['README.rst']), ('share/man', ['pyxfoil.1']), @@ -184,7 +184,7 @@ main() fp = open(os.path.join(self.wdir, 'setup.cfg')) try: - lines = set([line.strip() for line in fp]) + lines = set([line.rstrip() for line in fp]) finally: fp.close() self.assertEqual(lines, set(['', @@ -200,9 +200,10 @@ '[files]', 'packages = pyxfoil', 'extra_files = pyxfoil/fengine.so', - '[resources]', - 'README.rst = {doc}', - 'pyxfoil.1 = {man}', + ' pyxfoil/babar.so', + 'resources =', + ' README.rst = {doc}', + ' pyxfoil.1 = {man}', ])) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: Merge with upstream default branch Message-ID: tarek.ziade pushed 4f9da8d9ebcb to distutils2: http://hg.python.org/distutils2/rev/4f9da8d9ebcb changeset: 1090:4f9da8d9ebcb parent: 1089:3841fa64b64c parent: 1010:bbc4437b851b user: FELD Boris date: Wed Feb 09 09:49:22 2011 +0100 summary: Merge with upstream default branch files: distutils2/dist.py diff --git a/distutils2/command/__init__.py b/distutils2/command/__init__.py --- a/distutils2/command/__init__.py +++ b/distutils2/command/__init__.py @@ -5,6 +5,9 @@ from distutils2.errors import DistutilsModuleError from distutils2.util import resolve_name +__all__ = ['get_command_names', 'set_command', 'get_command_class', + 'STANDARD_COMMANDS'] + _COMMANDS = { 'check': 'distutils2.command.check.check', 'test': 'distutils2.command.test.test', @@ -29,6 +32,8 @@ 'upload': 'distutils2.command.upload.upload', 'upload_docs': 'distutils2.command.upload_docs.upload_docs'} +STANDARD_COMMANDS = set(_COMMANDS) + def get_command_names(): """Return registered commands""" diff --git a/distutils2/dist.py 
b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -17,7 +17,7 @@ from distutils2 import logger from distutils2.metadata import Metadata from distutils2.config import Config -from distutils2.command import get_command_class +from distutils2.command import get_command_class, STANDARD_COMMANDS # Regex to define acceptable Distutils command names. This is not *quite* # the same as a Python NAME -- I don't allow leading underscores. The fact @@ -589,31 +589,26 @@ print(header + ":") for cmd in commands: - cls = self.cmdclass.get(cmd) - if not cls: - cls = get_command_class(cmd) - try: - description = cls.description - except AttributeError: - description = "(no description available)" + cls = self.cmdclass.get(cmd) or get_command_class(cmd) + description = getattr(cls, 'description', + '(no description available)') print(" %-*s %s" % (max_length, cmd, description)) def _get_command_groups(self): """Helper function to retrieve all the command class names divided - into standard commands (listed in distutils2.command.__all__) - and extra commands (given in self.cmdclass and not standard - commands). + into standard commands (listed in + distutils2.command.STANDARD_COMMANDS) and extra commands (given in + self.cmdclass and not standard commands). """ - from distutils2.command import __all__ as std_commands extra_commands = [cmd for cmd in self.cmdclass - if cmd not in std_commands] - return std_commands, extra_commands + if cmd not in STANDARD_COMMANDS] + return STANDARD_COMMANDS, extra_commands def print_commands(self): """Print out a help message listing all available commands with a description of each. The list is divided into standard commands - (listed in distutils2.command.__all__) and extra commands + (listed in distutils2.command.STANDARD_COMMANDS) and extra commands (given in self.cmdclass and not standard commands). The descriptions come from the command class attribute 'description'. @@ -633,10 +628,8 @@ "Extra commands", max_length) - # -- Command class/object methods ---------------------------------- - def get_command_obj(self, command, create=1): """Return the command object for 'command'. Normally this object is cached on a previous call to 'get_command_obj()'; if no command diff --git a/distutils2/metadata.py b/distutils2/metadata.py --- a/distutils2/metadata.py +++ b/distutils2/metadata.py @@ -41,8 +41,8 @@ _HAS_DOCUTILS = False # public API of this module -__all__ = ('Metadata', 'PKG_INFO_ENCODING', - 'PKG_INFO_PREFERRED_VERSION') +__all__ = ['Metadata', 'PKG_INFO_ENCODING', + 'PKG_INFO_PREFERRED_VERSION'] # Encoding used for the PKG-INFO files PKG_INFO_ENCODING = 'utf-8' -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: Improve distutils2 resources documentation. Message-ID: tarek.ziade pushed db16e7c6b2ed to distutils2: http://hg.python.org/distutils2/rev/db16e7c6b2ed changeset: 1091:db16e7c6b2ed user: FELD Boris date: Wed Feb 09 13:37:15 2011 +0100 summary: Improve distutils2 resources documentation. files: docs/source/setupcfg.rst diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -176,81 +176,46 @@ .. Note:: In Distutils2, setup.cfg will be implicitly included. 
-data-files -========== +Resources +========= +This section describes the files used by the project which must not be installed in the same place that python modules or libraries, they are called **resources**. They are for example documentation files, script files, databases, etc... -TODO : +For declaring resources, you must use this notation :: - ### - source -> destination + source = destination + +Data-files are declared in the **resources** field in the **file** section, for example:: + + [files] + resources = + source1 = destination1 + source2 = destination2 + +The **source** part of the declaration are relative paths of resources files (using unix path separator **/**). For example, if you've this source tree:: + + foo/ + doc/ + doc.man + scripts/ + foo.sh + +Your setup.cfg will look like:: + + [files] + resources = + doc/doc.man = destination_doc + scripts/foo.sh = destination_scripts - final-path = destination + source - - There is an {alias} for each categories of datafiles - ----- - source may be a glob (*, ?, **, {}) - - order - - exclude - -- - base-prefix - - #### - overwrite system config for {alias} - - #### - extra-categories +The final paths where files will be placed are composed by : **source** + **destination**. In the previous example, **doc/doc.man** will be placed in **destination_doc/doc/doc.man** and **scripts/foo.sh** will be placed in **destination_scripts/scripts/foo.sh**. (If you want more control on the final path, take a look at base_prefix_). -This section describes the files used by the project which must not be installed in the same place that python modules or libraries. +The **destination** part of resources declaration are paths with categories. Indeed, it's generally a bad idea to give absolute path as it will be cross incompatible. So, you must use resources categories in your **destination** declaration. Categories will be replaced by their real path at the installation time. Using categories is all benefit, your declaration will be simpler, cross platform and it will allow packager to place resources files where they want without breaking your code. -The format for specifing data files is : +Categories can be specified by using this syntax:: - **source** = **destination** - -Example:: - - scripts/script1.bin = {scripts} + {category} -It means that the file scripts/script1.bin will be placed - -It means that every file which match the glob_syntax will be placed in the destination. A part of the path of the file will be stripped when it will be expanded and another part will be append to the destination. For more informations about which part of the path will be stripped or not, take a look at next sub-section globsyntax_. - -The destination path will be expanded at the installation time using categories's default-path in the sysconfig.cfg file in the system. For more information about categories's default-paths, take a look at next next sub-section destination_. - - -.. _globsyntax: - -glob_syntax ------------ - -The glob syntax is traditionnal glob syntax (with unix separator **/**) with one more information : what part of the path will be stripped when path will be expanded ? - -The special character which indicate the end of the part that will be stripped and the beginning of the part that will be added is whitespace, which can follow or replace a path separator. 
- -Example:: - - scripts/ *.bin - -is equivalent to:: - - scripts *.bin - -Theses examples means that all files with extensions bin in the directory scripts will be placed directly on **destination** directory. - -This glob example:: - - scripts/*.bin - -means that all files with extensions bin in the directory scripts will be placed directly on **destination/scripts** directory. - -.. _destination: - -destination ------------ - -The destination is a traditionnal path (with unix separator **/**) where some parts will be expanded at installation time. These parts look like **{category}**, they will be expanded by reading system-wide default-path stored in sysconfig.cfg. Defaults categories are : +Default categories are:: * config * appdata @@ -264,40 +229,199 @@ * info * man -A special category exists, named {distribution.name} which will be expanded into your distribution name. You should not use it in your destination path, as they are may be used in defaults categories:: +A special category also exists **{distribution.name}** that will be replaced by the name of the distribution, but as most of the defaults categories use them, so it's not necessary to add **{distribution.name}** into your destination. - [globals] - # These are the useful categories that are sometimes referenced at runtime, - # using pkgutil.open(): - # Configuration files - config = {confdir}/{distribution.name} - # Non-writable data that is independent of architecture (images, many xml/text files) - appdata = {datadir}/{distribution.name} - # Non-writable data that is architecture-dependent (some binary data formats) - appdata.arch = {libdir}/{distribution.name} - # Data, written by the package, that must be preserved (databases) - appdata.persistent = {statedir}/lib/{distribution.name} - # Data, written by the package, that can be safely discarded (cache) - appdata.disposable = {statedir}/cache/{distribution.name} - # Help or documentation files referenced at runtime - help = {datadir}/{distribution.name} - icon = {datadir}/pixmaps - scripts = {base}/bin +If you use categories in your declarations, and you are encouraged to do, final path will be:: + + source + destination_expanded + +.. _example_final_path: + +For example, if you have this setup.cfg:: + + [metadata] + name = foo + + [files] + resources = + doc/doc.man = {doc} + +And if **{doc}** is replaced by **{datadir}/doc/{distribution.name}**, final path will be:: + + {datadir}/doc/foo/doc/doc.man - # Non-runtime files. These are valid categories for marking files for - # install, but they should not be referenced by the app at runtime: - # Help or documentation files not referenced by the package at runtime +Where {datafir} category will be platform-dependent. + + +More control on source part +--------------------------- + +Glob syntax +___________ + +When you declare source file, you can use a glob-like syntax to match multiples file, for example:: + + scripts/* = {script} + +Will match all the files in the scripts directory and placed them in the script category. + +Glob tokens are: + + * * : match all files. + * ? : match any character. + * ** : match any level of tree recursion (even 0). + * {} : will match any part separated by comma (example : {sh,bat}). + +TODO :: + + Add an example + +Order of declaration +____________________ + +The order of declaration is important if one file match multiple rules. 
The last rules matched by file is used, this is useful if you have this source tree:: + + foo/ + doc/ + index.rst + setup.rst + documentation.txt + doc.tex + README + +And you want all the files in the doc directory to be placed in {doc} category, but README must be placed in {help} category, instead of listing all the files one by one, you can declare them in this way:: + + [files] + resources = + doc/* = {doc} + doc/README = {help} + +Exclude +_______ + +You can exclude some files of resources declaration by giving no destination, it can be useful if you have a non-resources file in the same directory of resources files:: + + foo/ + doc/ + RELEASES + doc.tex + documentation.txt + docu.rst + +Your **file** section will be:: + + [files] + resources = + doc/* = {doc} + doc/RELEASES = + +More control on destination part +-------------------------------- + +.. _base_prefix: + +Define a base-prefix +____________________ + +When you define your resources, you can have more control of how the final path is compute. + +By default, the final path is:: + + destination + source + +This can generate long paths, for example (example_final_path_):: + + {datadir}/doc/foo/doc/doc.man + +When you declare your source, you can use a separator to split the source in **prefix** **suffix**. The supported separator are : + + * Whitespace + +So, for example, if you have this source:: + + docs/ doc.man + +The **prefix** is "docs/" and the **suffix** is "doc.html". + +.. note:: + + Separator can be placed after a path separator or replace it. So theses two sources are equivalent:: + + docs/ doc.man + docs doc.man + +.. note:: + + Glob syntax is working the same way with standard source and splitted source. So theses rules:: + + docs/* + docs/ * + docs * + + Will match all the files in the docs directory. + +When you use splitted source, the final path is compute in this way:: + + destination + prefix + +So for example, if you have this setup.cfg:: + + [metadata] + name = foo + + [files] + resources = + doc/ doc.man = {doc} + +And if **{doc}** is replaced by **{datadir}/doc/{distribution.name}**, final path will be:: + + {datadir}/doc/foo/doc.man + + +Overwrite paths for categories +------------------------------ + +.. warning:: + + This part is intended for system administrator or packager. + +The real paths of categories are registered in the *sysconfig.cfg* file installed in your python installation. The format of this file is INI-like. The content of the file is organized into several sections : + + * globals : Standard categories's paths. + * posix_prefix : Standard paths for categories and installation paths for posix system. + * other one... + +Standard categories's paths are platform independent, they generally refers to other categories, which are platform dependent. Sysconfig module will choose these category from sections matching os.name. For example:: + doc = {datadir}/doc/{distribution.name} - # GNU info documentation files - info = {datadir}/info - # man pages - man = {datadir}/man -So, if you have this destination path : **{help}/api**, it will be expanded into **{datadir}/{distribution.name}/api**. {datadir} will be expanded depending on your system value (ex : confdir = datadir = /usr/share/). +It refers to datadir category, which can be different between platforms. 
In posix system, it may be:: + datadir = /usr/share + +So the final path will be:: -Simple-example --------------- + doc = /usr/share/doc/{distribution.name} + +The platform dependent categories are : + + * confdir + * datadir + * libdir + * base + +Define extra-categories +----------------------- + +Examples +-------- + +.. note:: + + These examples are incremental but works unitarily. + +Resources in root dir +_____________________ Source tree:: @@ -309,16 +433,17 @@ Setup.cfg:: - [RESOURCES] - README = {doc} - *.sh = {scripts} + [files] + resources = + README = {doc} + *.sh = {scripts} So babar.sh and launch.sh will be placed in {scripts} directory. -Now let's create to move all the scripts into a scripts/directory. +Now let's move all the scripts into a scripts directory. -Second-example --------------- +Resources in sub-directory +__________________________ Source tree:: @@ -332,17 +457,18 @@ Setup.cfg:: - [RESOURCES] - README = {doc} - scripts/ LAUNCH = {scripts} - scripts/ *.sh = {scripts} + [files] + resources = + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.sh = {scripts} It's important to use the separator after scripts/ to install all the bash scripts into {scripts} instead of {scripts}/scripts. Now let's add some docs. -Third-example -------------- +Resources in multiple sub-directories +_____________________________________ Source tree:: @@ -359,19 +485,20 @@ Setup.cfg:: - [RESOURCES] - README = {doc} - scripts/ LAUNCH = {doc} - scripts/ *.sh = {scripts} - doc/ * = {doc} - doc/ man = {man} + [files] + resources = + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.sh = {scripts} + doc/ * = {doc} + doc/ man = {man} You want to place all the file in the docs script into {doc} category, instead of man, which must be placed into {man} category, we will use the order of declaration of globs to choose the destination, the last glob that match the file is used. Now let's add some scripts for windows users. -Final example -------------- +Complete example +________________ Source tree:: @@ -389,12 +516,13 @@ Setup.cfg:: - [RESOURCES] - README = {doc} - scripts/ LAUNCH = {doc} - scripts/ *.{sh,bat} = {scripts} - doc/ * = {doc} - doc/ man = {man} + [files] + resources = + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.{sh,bat} = {scripts} + doc/ * = {doc} + doc/ man = {man} We use brace expansion syntax to place all the bash and batch scripts into {scripts} category. @@ -402,7 +530,6 @@ In Distutils2, setup.py and README (or README.txt) files are not more included in source distribution by default - `command` sections ================== -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: Merging the resource branch ! Message-ID: tarek.ziade pushed d4dad8855be3 to distutils2: http://hg.python.org/distutils2/rev/d4dad8855be3 changeset: 1092:d4dad8855be3 parent: 1024:c3706e13ec0b parent: 1091:db16e7c6b2ed user: Alexis Metaireau date: Sun Feb 13 23:07:59 2011 +0000 summary: Merging the resource branch ! 
files: distutils2/command/install_data.py distutils2/command/install_dist.py distutils2/config.py distutils2/dist.py distutils2/index/simple.py distutils2/tests/test_command_sdist.py distutils2/tests/test_mkcfg.py distutils2/util.py diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -15,3 +15,4 @@ include bin nosetests.xml +Distutils2.egg-info diff --git a/distutils2/_backport/pkgutil.py b/distutils2/_backport/pkgutil.py --- a/distutils2/_backport/pkgutil.py +++ b/distutils2/_backport/pkgutil.py @@ -1,13 +1,14 @@ """Utilities to support packages.""" +import imp +import sys + +from csv import reader as csv_reader import os -import sys -import imp import re +from stat import ST_SIZE +from types import ModuleType import warnings -from csv import reader as csv_reader -from types import ModuleType -from stat import ST_SIZE try: from hashlib import md5 @@ -55,7 +56,7 @@ """Make a trivial single-dispatch generic function""" registry = {} - def wrapper(*args, **kw): + def wrapper(*args, ** kw): ob = args[0] try: cls = ob.__class__ @@ -70,12 +71,12 @@ pass mro = cls.__mro__[1:] except TypeError: - mro = object, # must be an ExtensionClass or some such :( + mro = object, # must be an ExtensionClass or some such :( for t in mro: if t in registry: - return registry[t](*args, **kw) + return registry[t](*args, ** kw) else: - return func(*args, **kw) + return func(*args, ** kw) try: wrapper.__name__ = func.__name__ except (TypeError, AttributeError): @@ -620,7 +621,7 @@ # PEP 376 Implementation # ########################## -DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED',) +DIST_FILES = ('INSTALLER', 'METADATA', 'RECORD', 'REQUESTED', 'RESOURCES') # Cache _cache_name = {} # maps names to Distribution instances @@ -659,7 +660,7 @@ def clear_cache(): """ Clears the internal cache. """ global _cache_name, _cache_name_egg, _cache_path, _cache_path_egg, \ - _cache_generated, _cache_generated_egg + _cache_generated, _cache_generated_egg _cache_name = {} _cache_name_egg = {} @@ -749,8 +750,8 @@ return '%s-%s at %s' % (self.name, self.metadata.version, self.path) def _get_records(self, local=False): - RECORD = os.path.join(self.path, 'RECORD') - record_reader = csv_reader(open(RECORD, 'rb'), delimiter=',') + RECORD = self.get_distinfo_file('RECORD') + record_reader = csv_reader(RECORD, delimiter=',') for row in record_reader: path, md5, size = row[:] + [None for i in xrange(len(row), 3)] if local: @@ -758,6 +759,15 @@ path = os.path.join(sys.prefix, path) yield path, md5, size + def get_resource_path(self, relative_path): + resources_file = self.get_distinfo_file('RESOURCES') + resources_reader = csv_reader(resources_file, delimiter=',') + for relative, destination in resources_reader: + if relative == relative_path: + return destination + raise KeyError('No resource file with relative path %s were installed' % + relative_path) + def get_installed_files(self, local=False): """ Iterates over the ``RECORD`` entries and returns a tuple @@ -815,13 +825,13 @@ distinfo_dirname, path = path.split(os.sep)[-2:] if distinfo_dirname != self.path.split(os.sep)[-1]: raise DistutilsError("Requested dist-info file does not " - "belong to the %s distribution. '%s' was requested." \ - % (self.name, os.sep.join([distinfo_dirname, path]))) + "belong to the %s distribution. '%s' was requested." 
\ + % (self.name, os.sep.join([distinfo_dirname, path]))) # The file must be relative if path not in DIST_FILES: raise DistutilsError("Requested an invalid dist-info file: " - "%s" % path) + "%s" % path) # Convert the relative path back to absolute path = os.path.join(self.path, path) @@ -860,11 +870,11 @@ metadata = None """A :class:`distutils2.metadata.Metadata` instance loaded with the distribution's ``METADATA`` file.""" - _REQUIREMENT = re.compile( \ - r'(?P[-A-Za-z0-9_.]+)\s*' \ - r'(?P(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)?\s*' \ - r'(?P(?:\s*,\s*(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)*)\s*' \ - r'(?P\[.*\])?') + _REQUIREMENT = re.compile(\ + r'(?P[-A-Za-z0-9_.]+)\s*' \ + r'(?P(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)?\s*' \ + r'(?P(?:\s*,\s*(?:<|<=|!=|==|>=|>)[-A-Za-z0-9_.]+)*)\s*' \ + r'(?P\[.*\])?') def __init__(self, path, display_warnings=False): self.path = path @@ -950,8 +960,8 @@ else: if match.group('extras'): s = (('Distribution %s uses extra requirements ' - 'which are not supported in distutils') \ - % (self.name)) + 'which are not supported in distutils') \ + % (self.name)) warnings.warn(s) name = match.group('name') version = None @@ -1010,7 +1020,7 @@ def __eq__(self, other): return isinstance(other, EggInfoDistribution) and \ - self.path == other.path + self.path == other.path # See http://docs.python.org/reference/datamodel#object.__hash__ __hash__ = object.__hash__ @@ -1069,7 +1079,7 @@ yield dist -def get_distribution(name, use_egg_info=False, paths=sys.path): +def get_distribution(name, use_egg_info=False, paths=None): """ Scans all elements in ``sys.path`` and looks for all directories ending with ``.dist-info``. Returns a :class:`Distribution` @@ -1086,6 +1096,9 @@ :rtype: :class:`Distribution` or :class:`EggInfoDistribution` or None """ + if paths == None: + paths = sys.path + if not _cache_enabled: for dist in _yield_distributions(True, use_egg_info, paths): if dist.name == name: @@ -1128,7 +1141,7 @@ predicate = VersionPredicate(obs) except ValueError: raise DistutilsError(('Distribution %s has ill formed' + - ' obsoletes field') % (dist.name,)) + ' obsoletes field') % (dist.name,)) if name == o_components[0] and predicate.match(version): yield dist break @@ -1174,8 +1187,8 @@ p_name, p_ver = p_components if len(p_ver) < 2 or p_ver[0] != '(' or p_ver[-1] != ')': raise DistutilsError(('Distribution %s has invalid ' + - 'provides field: %s') \ - % (dist.name, p)) + 'provides field: %s') \ + % (dist.name, p)) p_ver = p_ver[1:-1] # trim off the parenthesis if p_name == name and predicate.match(p_ver): yield dist @@ -1195,3 +1208,15 @@ for dist in get_distributions(): if dist.uses(path): yield dist + +def resource_path(distribution_name, relative_path): + dist = get_distribution(distribution_name) + if dist != None: + return dist.get_resource_path(relative_path) + raise LookupError('No distribution named %s is installed.' 
% + distribution_name) + +def resource_open(distribution_name, relative_path, * args, ** kwargs): + file = open(resource_path(distribution_name, relative_path), * args, + ** kwargs) + return file \ No newline at end of file diff --git a/distutils2/_backport/sysconfig.py b/distutils2/_backport/sysconfig.py --- a/distutils2/_backport/sysconfig.py +++ b/distutils2/_backport/sysconfig.py @@ -120,6 +120,14 @@ res[key] = os.path.normpath(_subst_vars(value, vars)) return res +def format_value(value, vars): + def _replacer(matchobj): + name = matchobj.group(1) + if name in vars: + return vars[name] + return matchobj.group(0) + return _VAR_REPL.sub(_replacer, value) + def _get_default_scheme(): if os.name == 'posix': diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/INSTALLER b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/INSTALLER new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/METADATA @@ -0,0 +1,4 @@ +Metadata-version: 1.2 +Name: babar +Version: 0.1 +Author: FELD Boris \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RECORD b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RECORD new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/REQUESTED b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/REQUESTED new file mode 100644 diff --git a/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RESOURCES b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RESOURCES new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar-0.1.dist-info/RESOURCES @@ -0,0 +1,2 @@ +babar.png,babar.png +babar.cfg,babar.cfg \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar.cfg b/distutils2/_backport/tests/fake_dists/babar.cfg new file mode 100644 --- /dev/null +++ b/distutils2/_backport/tests/fake_dists/babar.cfg @@ -0,0 +1,1 @@ +Config \ No newline at end of file diff --git a/distutils2/_backport/tests/fake_dists/babar.png b/distutils2/_backport/tests/fake_dists/babar.png new file mode 100644 diff --git a/distutils2/_backport/tests/test_pkgutil.py b/distutils2/_backport/tests/test_pkgutil.py --- a/distutils2/_backport/tests/test_pkgutil.py +++ b/distutils2/_backport/tests/test_pkgutil.py @@ -1,11 +1,12 @@ # -*- coding: utf-8 -*- """Tests for PEP 376 pkgutil functionality""" +import imp import sys + +import csv import os -import csv -import imp +import shutil import tempfile -import shutil import zipfile try: from hashlib import md5 @@ -18,9 +19,9 @@ from distutils2._backport import pkgutil from distutils2._backport.pkgutil import ( - Distribution, EggInfoDistribution, get_distribution, get_distributions, - provides_distribution, obsoletes_distribution, get_file_users, - distinfo_dirname, _yield_distributions) + Distribution, EggInfoDistribution, get_distribution, get_distributions, + provides_distribution, obsoletes_distribution, get_file_users, + distinfo_dirname, _yield_distributions) try: from os.path import relpath @@ -123,7 +124,6 @@ del sys.modules[pkg] - # Adapted from Python 2.7's trunk @@ -182,7 +182,7 @@ def setUp(self): super(TestPkgUtilDistribution, self).setUp() self.fake_dists_path = os.path.abspath( - os.path.join(os.path.dirname(__file__), 'fake_dists')) + 
os.path.join(os.path.dirname(__file__), 'fake_dists')) pkgutil.disable_cache() self.distinfo_dirs = [os.path.join(self.fake_dists_path, dir) @@ -205,7 +205,7 @@ # Setup the RECORD file for this dist record_file = os.path.join(distinfo_dir, 'RECORD') record_writer = csv.writer(open(record_file, 'w'), delimiter=',', - quoting=csv.QUOTE_NONE) + quoting=csv.QUOTE_NONE) dist_location = distinfo_dir.replace('.dist-info', '') for path, dirs, files in os.walk(dist_location): @@ -214,15 +214,15 @@ os.path.join(path, f))) for file in ['INSTALLER', 'METADATA', 'REQUESTED']: record_writer.writerow(record_pieces( - os.path.join(distinfo_dir, file))) + os.path.join(distinfo_dir, file))) record_writer.writerow([relpath(record_file, sys.prefix)]) del record_writer # causes the RECORD file to close record_reader = csv.reader(open(record_file, 'rb')) record_data = [] for row in record_reader: path, md5_, size = row[:] + \ - [None for i in xrange(len(row), 3)] - record_data.append([path, (md5_, size,)]) + [None for i in xrange(len(row), 3)] + record_data.append([path, (md5_, size, )]) self.records[distinfo_dir] = dict(record_data) def tearDown(self): @@ -240,7 +240,7 @@ name = 'choxie' version = '2.0.0.9' dist_path = os.path.join(here, 'fake_dists', - distinfo_dirname(name, version)) + distinfo_dirname(name, version)) dist = Distribution(dist_path) self.assertEqual(dist.name, name) @@ -264,9 +264,9 @@ # Criteria to test against distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') true_path = [self.fake_dists_path, distinfo_name, \ - 'grammar', 'utils.py'] + 'grammar', 'utils.py'] true_path = relpath(os.path.join(*true_path), sys.prefix) false_path = [self.fake_dists_path, 'towel_stuff-0.1', 'towel_stuff', '__init__.py'] @@ -282,7 +282,7 @@ distinfo_name = 'choxie-2.0.0.9' other_distinfo_name = 'grammar-1.0a4' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') dist = Distribution(distinfo_dir) # Test for known good file matches distinfo_files = [ @@ -301,9 +301,9 @@ # Test an absolute path that is part of another distributions dist-info other_distinfo_file = os.path.join(self.fake_dists_path, - other_distinfo_name + '.dist-info', 'REQUESTED') + other_distinfo_name + '.dist-info', 'REQUESTED') self.assertRaises(DistutilsError, dist.get_distinfo_file, - other_distinfo_file) + other_distinfo_file) # Test for a file that does not exist and should not exist self.assertRaises(DistutilsError, dist.get_distinfo_file, \ 'ENTRYPOINTS') @@ -312,7 +312,7 @@ # Test for the iteration of RECORD path entries. 
distinfo_name = 'towel_stuff-0.1' distinfo_dir = os.path.join(self.fake_dists_path, - distinfo_name + '.dist-info') + distinfo_name + '.dist-info') dist = Distribution(distinfo_dir) # Test for the iteration of the raw path distinfo_record_paths = self.records[distinfo_dir].keys() @@ -324,6 +324,16 @@ found = [path for path in dist.get_distinfo_files(local=True)] self.assertEqual(sorted(found), sorted(distinfo_record_paths)) + def test_get_resources_path(self): + distinfo_name = 'babar-0.1' + distinfo_dir = os.path.join(self.fake_dists_path, + distinfo_name + '.dist-info') + dist = Distribution(distinfo_dir) + resource_path = dist.get_resource_path('babar.png') + self.assertEqual(resource_path, 'babar.png') + self.assertRaises(KeyError, dist.get_resource_path, 'notexist') + + class TestPkgUtilPEP376(support.LoggingCatcher, support.WarningsCatcher, unittest.TestCase): @@ -347,7 +357,7 @@ # Given a name and a version, we expect the distinfo_dirname function # to return a standard distribution information directory name. - items = [ # (name, version, standard_dirname) + items = [# (name, version, standard_dirname) # Test for a very simple single word name and decimal # version number ('docutils', '0.5', 'docutils-0.5.dist-info'), @@ -367,7 +377,7 @@ # Lookup all distributions found in the ``sys.path``. # This test could potentially pick up other installed distributions fake_dists = [('grammar', '1.0a4'), ('choxie', '2.0.0.9'), - ('towel-stuff', '0.1')] + ('towel-stuff', '0.1'), ('babar', '0.1')] found_dists = [] # Verify the fake dists have been found. @@ -375,10 +385,10 @@ for dist in dists: if not isinstance(dist, Distribution): self.fail("item received was not a Distribution instance: " - "%s" % type(dist)) + "%s" % type(dist)) if dist.name in dict(fake_dists) and \ - dist.path.startswith(self.fake_dists_path): - found_dists.append((dist.name, dist.metadata['version'],)) + dist.path.startswith(self.fake_dists_path): + found_dists.append((dist.name, dist.metadata['version'], )) else: # check that it doesn't find anything more than this self.assertFalse(dist.path.startswith(self.fake_dists_path)) @@ -401,8 +411,8 @@ self.fail("item received was not a Distribution or " "EggInfoDistribution instance: %s" % type(dist)) if dist.name in dict(fake_dists) and \ - dist.path.startswith(self.fake_dists_path): - found_dists.append((dist.name, dist.metadata['version'])) + dist.path.startswith(self.fake_dists_path): + found_dists.append((dist.name, dist.metadata['version'])) else: self.assertFalse(dist.path.startswith(self.fake_dists_path)) @@ -453,7 +463,7 @@ # Test the iteration of distributions that use a file. 
name = 'towel_stuff-0.1' path = os.path.join(self.fake_dists_path, name, - 'towel_stuff', '__init__.py') + 'towel_stuff', '__init__.py') for dist in get_file_users(path): self.assertTrue(isinstance(dist, Distribution)) self.assertEqual(dist.name, name) @@ -560,7 +570,7 @@ ('truffles', '5.0'), ('cheese', '2.0.2'), ('coconuts-aster', '10.3'), ('nut', 'funkyversion')] dists = [('choxie', '2.0.0.9'), ('grammar', '1.0a4'), - ('towel-stuff', '0.1')] + ('towel-stuff', '0.1'), ('babar', '0.1')] checkLists([], _yield_distributions(False, False)) diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -9,6 +9,8 @@ import os from distutils2.command.cmd import Command from distutils2.util import change_root, convert_path +from distutils2._backport.sysconfig import get_paths, format_value +from distutils2._backport.shutil import Error class install_data(Command): @@ -28,6 +30,7 @@ def initialize_options(self): self.install_dir = None self.outfiles = [] + self.data_files_out = [] self.root = None self.force = 0 self.data_files = self.distribution.data_files @@ -40,54 +43,38 @@ def run(self): self.mkpath(self.install_dir) - for f in self.data_files: - if isinstance(f, str): - # it's a simple file, so copy it - f = convert_path(f) - if self.warn_dir: - self.warn("setup script did not provide a directory for " - "'%s' -- installing right in '%s'" % - (f, self.install_dir)) - (out, _) = self.copy_file(f, self.install_dir) - self.outfiles.append(out) - else: - # it's a tuple with path to install to and a list of files - dir = convert_path(f[0]) - if not os.path.isabs(dir): - dir = os.path.join(self.install_dir, dir) - elif self.root: - dir = change_root(self.root, dir) - self.mkpath(dir) + for file in self.data_files.items(): + destination = convert_path(self.expand_categories(file[1])) + dir_dest = os.path.abspath(os.path.dirname(destination)) + + self.mkpath(dir_dest) + try: + (out, _) = self.copy_file(file[0], dir_dest) + except Error, e: + self.warn(e.message) + out = destination - if f[1] == []: - # If there are no files listed, the user must be - # trying to create an empty directory, so add the - # directory to the list of output files. - self.outfiles.append(dir) - else: - # Copy files, adding them to the list of output files. - for data in f[1]: - data = convert_path(data) - (out, _) = self.copy_file(data, dir) - self.outfiles.append(out) + self.outfiles.append(out) + self.data_files_out.append((file[0], destination)) + + def expand_categories(self, path_with_categories): + local_vars = get_paths() + local_vars['distribution.name'] = self.distribution.metadata['Name'] + expanded_path = format_value(path_with_categories, local_vars) + expanded_path = format_value(expanded_path, local_vars) + if '{' in expanded_path and '}' in expanded_path: + self.warn("Unable to expand %s, some categories may missing." 
% + path_with_categories) + return expanded_path def get_source_files(self): - sources = [] - for item in self.data_files: - if isinstance(item, str): # plain file - item = convert_path(item) - if os.path.isfile(item): - sources.append(item) - else: # a (dirname, filenames) tuple - dirname, filenames = item - for f in filenames: - f = convert_path(f) - if os.path.isfile(f): - sources.append(f) - return sources + return self.data_files.keys() def get_inputs(self): - return self.data_files or [] + return self.data_files.keys() def get_outputs(self): return self.outfiles + + def get_resources_out(self): + return self.data_files_out diff --git a/distutils2/command/install_dist.py b/distutils2/command/install_dist.py --- a/distutils2/command/install_dist.py +++ b/distutils2/command/install_dist.py @@ -87,6 +87,8 @@ ('record=', None, "filename in which to record a list of installed files " "(not PEP 376-compliant)"), + ('resources=', None, + "data files mapping"), # .dist-info related arguments, read by install_dist_info ('no-distinfo', None, @@ -184,12 +186,14 @@ #self.install_info = None self.record = None + self.resources = None # .dist-info related options self.no_distinfo = None self.installer = None self.requested = None self.no_record = None + self.no_resources = None # -- Option finalizing methods ------------------------------------- # (This is rather more involved than for most commands, diff --git a/distutils2/command/install_distinfo.py b/distutils2/command/install_distinfo.py --- a/distutils2/command/install_distinfo.py +++ b/distutils2/command/install_distinfo.py @@ -12,12 +12,12 @@ # This file was created from the code for the former command install_egg_info -import os import csv -import re -from distutils2.command.cmd import Command from distutils2 import logger from distutils2._backport.shutil import rmtree +from distutils2.command.cmd import Command +import os +import re try: import hashlib except ImportError: @@ -39,9 +39,11 @@ "do not generate a REQUESTED file"), ('no-record', None, "do not generate a RECORD file"), + ('no-resources', None, + "do not generate a RESSOURCES list installed file") ] - boolean_options = ['requested', 'no-record'] + boolean_options = ['requested', 'no-record', 'no-resources'] negative_opt = {'no-requested': 'requested'} @@ -50,6 +52,7 @@ self.installer = None self.requested = None self.no_record = None + self.no_resources = None def finalize_options(self): self.set_undefined_options('install_dist', @@ -66,13 +69,16 @@ self.requested = True if self.no_record is None: self.no_record = False + if self.no_resources is None: + self.no_resources = False + metadata = self.distribution.metadata basename = "%s-%s.dist-info" % ( - to_filename(safe_name(metadata['Name'])), - to_filename(safe_version(metadata['Version'])), - ) + to_filename(safe_name(metadata['Name'])), + to_filename(safe_version(metadata['Version'])), + ) self.distinfo_dir = os.path.join(self.distinfo_dir, basename) self.outputs = [] @@ -113,6 +119,25 @@ f.close() self.outputs.append(requested_path) + + if not self.no_resources: + install_data = self.get_finalized_command('install_data') + if install_data.get_resources_out() != []: + resources_path = os.path.join(self.distinfo_dir, + 'RESOURCES') + logger.info('creating %s', resources_path) + f = open(resources_path, 'wb') + try: + writer = csv.writer(f, delimiter=',', + lineterminator=os.linesep, + quotechar='"') + for tuple in install_data.get_resources_out(): + writer.writerow(tuple) + + self.outputs.append(resources_path) + finally: + 
f.close() + if not self.no_record: record_path = os.path.join(self.distinfo_dir, 'RECORD') logger.info('creating %s', record_path) @@ -142,6 +167,7 @@ finally: f.close() + def get_outputs(self): return self.outputs diff --git a/distutils2/config.py b/distutils2/config.py --- a/distutils2/config.py +++ b/distutils2/config.py @@ -2,6 +2,7 @@ Know how to read all config files Distutils2 uses. """ +import os.path import os import sys import logging @@ -14,6 +15,7 @@ from distutils2.util import check_environ, resolve_name, strtobool from distutils2.compiler import set_compiler from distutils2.command import set_command +from distutils2.resources import resources_dests from distutils2.markers import interpret @@ -105,7 +107,8 @@ if v != ''] return value - def _read_setup_cfg(self, parser): + def _read_setup_cfg(self, parser, cfg_filename): + cfg_directory = os.path.dirname(os.path.abspath(cfg_filename)) content = {} for section in parser.sections(): content[section] = dict(parser.items(section)) @@ -145,11 +148,12 @@ # concatenate each files value = '' for filename in filenames: - f = open(filename) # will raise if file not found + # will raise if file not found + description_file = open(filename) try: - value += f.read().strip() + '\n' + value += description_file.read().strip() + '\n' finally: - f.close() + description_file.close() # add filename as a required file if filename not in metadata.requires_files: metadata.requires_files.append(filename) @@ -189,22 +193,28 @@ for data in files.get('package_data', []): data = data.split('=') if len(data) != 2: - continue + continue # XXX error should never pass silently key, value = data self.dist.package_data[key.strip()] = value.strip() - self.dist.data_files = [] - for data in files.get('data_files', []): - data = data.split('=') - if len(data) != 2: - continue - key, value = data - values = [v.strip() for v in value.split(',')] - self.dist.data_files.append((key, values)) - # manifest template self.dist.extra_files = files.get('extra_files', []) + resources = [] + for rule in files.get('resources', []): + glob , destination = rule.split('=', 1) + rich_glob = glob.strip().split(' ', 1) + if len(rich_glob) == 2: + prefix, suffix = rich_glob + else: + assert len(rich_glob) == 1 + prefix = '' + suffix = glob + if destination == '': + destination = None + resources.append((prefix.strip(), suffix.strip(), destination.strip())) + self.dist.data_files = resources_dests(cfg_directory, resources) + ext_modules = self.dist.ext_modules for section_key in content: labels = section_key.split('=') @@ -232,7 +242,6 @@ **values_dct )) - def parse_config_files(self, filenames=None): if filenames is None: filenames = self.find_config_files() @@ -246,7 +255,7 @@ parser.read(filename) if os.path.split(filename)[-1] == 'setup.cfg': - self._read_setup_cfg(parser) + self._read_setup_cfg(parser, filename) for section in parser.sections(): if section == 'global': diff --git a/distutils2/dist.py b/distutils2/dist.py --- a/distutils2/dist.py +++ b/distutils2/dist.py @@ -191,7 +191,7 @@ self.include_dirs = [] self.extra_path = None self.scripts = [] - self.data_files = [] + self.data_files = {} self.password = '' self.use_2to3 = False self.convert_2to3_doctests = [] diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -629,9 +629,11 @@ continue fp.write('%s = %s\n' % (name, '\n '.join(self.data[name]).strip())) - fp.write('\n[resources]\n') + fp.write('\nresources =\n') for src, dest in self.data['resources']: - 
fp.write('%s = %s\n' % (src, dest)) + fp.write(' %s = %s\n' % (src, dest)) + fp.write('\n') + finally: fp.close() diff --git a/distutils2/resources.py b/distutils2/resources.py new file mode 100644 --- /dev/null +++ b/distutils2/resources.py @@ -0,0 +1,22 @@ +import os +from distutils2.util import iglob + +def _rel_path(base, path): + assert path.startswith(base) + return path[len(base):].lstrip('/') + +def resources_dests(resources_root, rules): + """find destination of ressources files""" + destinations = {} + for (base, suffix, dest) in rules: + prefix = os.path.join(resources_root, base) + for abs_base in iglob(prefix): + abs_glob = os.path.join(abs_base, suffix) + for abs_path in iglob(abs_glob): + resource_file = _rel_path(resources_root, abs_path) + if dest is None: #remove the entry if it was here + destinations.pop(resource_file, None) + else: + rel_path = _rel_path(abs_base, abs_path) + destinations[resource_file] = os.path.join(dest, rel_path) + return destinations diff --git a/distutils2/tests/test_command_install_data.py b/distutils2/tests/test_command_install_data.py --- a/distutils2/tests/test_command_install_data.py +++ b/distutils2/tests/test_command_install_data.py @@ -1,4 +1,5 @@ """Tests for distutils.command.install_data.""" +import cmd import os from distutils2.command.install_data import install_data @@ -10,21 +11,29 @@ unittest.TestCase): def test_simple_run(self): + from distutils2._backport.sysconfig import _SCHEMES as sysconfig_SCHEMES + from distutils2._backport.sysconfig import _get_default_scheme + #dirty but hit marmoute + + old_scheme = sysconfig_SCHEMES + pkg_dir, dist = self.create_dist() cmd = install_data(dist) cmd.install_dir = inst = os.path.join(pkg_dir, 'inst') - # data_files can contain - # - simple files - # - a tuple with a path, and a list of file + sysconfig_SCHEMES.set(_get_default_scheme(), 'inst', + os.path.join(pkg_dir, 'inst')) + sysconfig_SCHEMES.set(_get_default_scheme(), 'inst2', + os.path.join(pkg_dir, 'inst2')) + one = os.path.join(pkg_dir, 'one') self.write_file(one, 'xxx') inst2 = os.path.join(pkg_dir, 'inst2') two = os.path.join(pkg_dir, 'two') self.write_file(two, 'xxx') - cmd.data_files = [one, (inst2, [two])] - self.assertEqual(cmd.get_inputs(), [one, (inst2, [two])]) + cmd.data_files = {one : '{inst}/one', two : '{inst2}/two'} + self.assertItemsEqual(cmd.get_inputs(), [one, two]) # let's run the command cmd.ensure_finalized() @@ -54,17 +63,22 @@ inst4 = os.path.join(pkg_dir, 'inst4') three = os.path.join(cmd.install_dir, 'three') self.write_file(three, 'xx') - cmd.data_files = [one, (inst2, [two]), - ('inst3', [three]), - (inst4, [])] + + sysconfig_SCHEMES.set(_get_default_scheme(), 'inst3', cmd.install_dir) + + cmd.data_files = {one : '{inst}/one', + two : '{inst2}/two', + three : '{inst3}/three'} cmd.ensure_finalized() cmd.run() # let's check the result - self.assertEqual(len(cmd.get_outputs()), 4) + self.assertEqual(len(cmd.get_outputs()), 3) self.assertTrue(os.path.exists(os.path.join(inst2, rtwo))) self.assertTrue(os.path.exists(os.path.join(inst, rone))) + sysconfig_SCHEMES = old_scheme + def test_suite(): return unittest.makeSuite(InstallDataTestCase) diff --git a/distutils2/tests/test_command_sdist.py b/distutils2/tests/test_command_sdist.py --- a/distutils2/tests/test_command_sdist.py +++ b/distutils2/tests/test_command_sdist.py @@ -202,11 +202,10 @@ self.write_file((some_dir, 'file.txt'), '#') self.write_file((some_dir, 'other_file.txt'), '#') - dist.data_files = [('data', ['data/data.dt', - 'inroot.txt', - 
'notexisting']), - 'some/file.txt', - 'some/other_file.txt'] + dist.data_files = {'data/data.dt' : '{appdata}/data.dt', + 'inroot.txt' : '{appdata}/inroot.txt', + 'some/file.txt' : '{appdata}/file.txt', + 'some/other_file.txt' : '{appdata}/other_file.txt'} # adding a script script_dir = join(self.tmp_dir, 'scripts') diff --git a/distutils2/tests/test_config.py b/distutils2/tests/test_config.py --- a/distutils2/tests/test_config.py +++ b/distutils2/tests/test_config.py @@ -65,11 +65,6 @@ package_data = cheese = data/templates/* -data_files = - bitmaps = bm/b1.gif, bm/b2.gif - config = cfg/data.cfg - /etc/init.d = init-script - extra_files = %(extra-files)s # Replaces MANIFEST.in @@ -78,6 +73,11 @@ recursive-include examples *.txt *.py prune examples/sample?/build +resources= + bm/ {b1,b2}.gif = {icon} + Cf*/ *.CFG = {config}/baBar/ + init_script = {script}/JunGle/ + [global] commands = distutils2.tests.test_config.FooBarBazTest @@ -193,6 +193,12 @@ def test_config(self): self.write_setup() self.write_file('README', 'yeah') + os.mkdir('bm') + self.write_file(os.path.join('bm', 'b1.gif'), '') + self.write_file(os.path.join('bm', 'b2.gif'), '') + os.mkdir('Cfg') + self.write_file(os.path.join('Cfg', 'data.CFG'), '') + self.write_file('init_script', '') # try to load the metadata now dist = self.run_setup('--version') @@ -237,10 +243,12 @@ self.assertEqual(dist.packages, ['one', 'two', 'three']) self.assertEqual(dist.py_modules, ['haven']) self.assertEqual(dist.package_data, {'cheese': 'data/templates/*'}) - self.assertEqual(dist.data_files, - [('bitmaps ', ['bm/b1.gif', 'bm/b2.gif']), - ('config ', ['cfg/data.cfg']), - ('/etc/init.d ', ['init-script'])]) + self.assertEqual( + {'bm/b1.gif' : '{icon}/b1.gif', + 'bm/b2.gif' : '{icon}/b2.gif', + 'Cfg/data.CFG' : '{config}/baBar/data.CFG', + 'init_script' : '{script}/JunGle/init_script'}, + dist.data_files) self.assertEqual(dist.package_dir, 'src') diff --git a/distutils2/tests/test_mkcfg.py b/distutils2/tests/test_mkcfg.py --- a/distutils2/tests/test_mkcfg.py +++ b/distutils2/tests/test_mkcfg.py @@ -153,9 +153,9 @@ ' pyxfoil/fengine.so', 'scripts = my_script', ' bin/run', - '[resources]', - 'README.rst = {doc}', - 'pyxfoil.1 = {man}', + 'resources =', + ' README.rst = {doc}', + ' pyxfoil.1 = {man}', ])) def test_convert_setup_py_to_cfg_with_description_in_readme(self): @@ -173,7 +173,7 @@ url='http://www.python-science.org/project/pyxfoil', license='GPLv2', packages=['pyxfoil'], - package_data={'pyxfoil' : ['fengine.so']}, + package_data={'pyxfoil' : ['fengine.so', 'babar.so']}, data_files=[ ('share/doc/pyxfoil', ['README.rst']), ('share/man', ['pyxfoil.1']), @@ -191,7 +191,7 @@ main() fp = open(os.path.join(self.wdir, 'setup.cfg')) try: - lines = set([line.strip() for line in fp]) + lines = set([line.rstrip() for line in fp]) finally: fp.close() self.assertEqual(lines, set(['', @@ -207,9 +207,10 @@ '[files]', 'packages = pyxfoil', 'extra_files = pyxfoil/fengine.so', - '[resources]', - 'README.rst = {doc}', - 'pyxfoil.1 = {man}', + ' pyxfoil/babar.so', + 'resources =', + ' README.rst = {doc}', + ' pyxfoil.1 = {man}', ])) diff --git a/distutils2/tests/test_resources.py b/distutils2/tests/test_resources.py new file mode 100644 --- /dev/null +++ b/distutils2/tests/test_resources.py @@ -0,0 +1,174 @@ +# -*- encoding: utf-8 -*- +"""Tests for distutils.data.""" +import pkgutil +import sys + +from distutils2._backport.pkgutil import resource_open +from distutils2._backport.pkgutil import resource_path +from distutils2._backport.pkgutil import 
disable_cache +from distutils2._backport.pkgutil import enable_cache +from distutils2.command.install_dist import install_dist +from distutils2.resources import resources_dests +from distutils2.tests import run_unittest +from distutils2.tests import unittest +from distutils2.tests.test_util import GlobTestCaseBase +import os +import tempfile + + +class DataFilesTestCase(GlobTestCaseBase): + + def assertRulesMatch(self, rules, spec): + tempdir = self.build_files_tree(spec) + expected = self.clean_tree(spec) + result = resources_dests(tempdir, rules) + self.assertEquals(expected, result) + + def clean_tree(self, spec): + files = {} + for path, value in spec.items(): + if value is not None: + path = self.os_dependant_path(path) + files[path] = value + return files + + def test_simple_glob(self): + rules = [('', '*.tpl', '{data}')] + spec = {'coucou.tpl': '{data}/coucou.tpl', + 'Donotwant': None} + self.assertRulesMatch(rules, spec) + + def test_multiple_match(self): + rules = [('scripts', '*.bin', '{appdata}'), + ('scripts', '*', '{appscript}')] + spec = {'scripts/script.bin': '{appscript}/script.bin', + 'Babarlikestrawberry': None} + self.assertRulesMatch(rules, spec) + + def test_set_match(self): + rules = [('scripts', '*.{bin,sh}', '{appscript}')] + spec = {'scripts/script.bin': '{appscript}/script.bin', + 'scripts/babar.sh': '{appscript}/babar.sh', + 'Babarlikestrawberry': None} + self.assertRulesMatch(rules, spec) + + def test_set_match_multiple(self): + rules = [('scripts', 'script{s,}.{bin,sh}', '{appscript}')] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'scripts/script.sh': '{appscript}/script.sh', + 'Babarlikestrawberry': None} + self.assertRulesMatch(rules, spec) + + def test_set_match_exclude(self): + rules = [('scripts', '*', '{appscript}'), + ('', '**/*.sh', None)] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'scripts/script.sh': None, + 'Babarlikestrawberry': None} + self.assertRulesMatch(rules, spec) + + def test_glob_in_base(self): + rules = [('scrip*', '*.bin', '{appscript}')] + spec = {'scripts/scripts.bin': '{appscript}/scripts.bin', + 'scripouille/babar.bin': '{appscript}/babar.bin', + 'scriptortu/lotus.bin': '{appscript}/lotus.bin', + 'Babarlikestrawberry': None} + self.assertRulesMatch(rules, spec) + + def test_recursive_glob(self): + rules = [('', '**/*.bin', '{binary}')] + spec = {'binary0.bin': '{binary}/binary0.bin', + 'scripts/binary1.bin': '{binary}/scripts/binary1.bin', + 'scripts/bin/binary2.bin': '{binary}/scripts/bin/binary2.bin', + 'you/kill/pandabear.guy': None} + self.assertRulesMatch(rules, spec) + + def test_final_exemple_glob(self): + rules = [ + ('mailman/database/schemas/', '*', '{appdata}/schemas'), + ('', '**/*.tpl', '{appdata}/templates'), + ('', 'developer-docs/**/*.txt', '{doc}'), + ('', 'README', '{doc}'), + ('mailman/etc/', '*', '{config}'), + ('mailman/foo/', '**/bar/*.cfg', '{config}/baz'), + ('mailman/foo/', '**/*.cfg', '{config}/hmm'), + ('', 'some-new-semantic.sns', '{funky-crazy-category}') + ] + spec = { + 'README': '{doc}/README', + 'some.tpl': '{appdata}/templates/some.tpl', + 'some-new-semantic.sns': '{funky-crazy-category}/some-new-semantic.sns', + 'mailman/database/mailman.db': None, + 'mailman/database/schemas/blah.schema': '{appdata}/schemas/blah.schema', + 'mailman/etc/my.cnf': '{config}/my.cnf', + 'mailman/foo/some/path/bar/my.cfg': '{config}/hmm/some/path/bar/my.cfg', + 'mailman/foo/some/path/other.cfg': '{config}/hmm/some/path/other.cfg', + 'developer-docs/index.txt': 
'{doc}/developer-docs/index.txt', + 'developer-docs/api/toc.txt': '{doc}/developer-docs/api/toc.txt', + } + self.maxDiff = None + self.assertRulesMatch(rules, spec) + + def test_resource_open(self): + + + #Create a fake-dist + temp_site_packages = tempfile.mkdtemp() + + dist_name = 'test' + dist_info = os.path.join(temp_site_packages, 'test-0.1.dist-info') + os.mkdir(dist_info) + + metadata_path = os.path.join(dist_info, 'METADATA') + resources_path = os.path.join(dist_info, 'RESOURCES') + + metadata_file = open(metadata_path, 'w') + + metadata_file.write( +"""Metadata-Version: 1.2 +Name: test +Version: 0.1 +Summary: test +Author: me + """) + + metadata_file.close() + + test_path = 'test.cfg' + + _, test_resource_path = tempfile.mkstemp() + + test_resource_file = open(test_resource_path, 'w') + + content = 'Config' + test_resource_file.write(content) + test_resource_file.close() + + resources_file = open(resources_path, 'w') + + resources_file.write("""%s,%s""" % (test_path, test_resource_path)) + resources_file.close() + + #Add fake site-packages to sys.path to retrieve fake dist + old_sys_path = sys.path + sys.path.insert(0, temp_site_packages) + + #Force pkgutil to rescan the sys.path + disable_cache() + + #Try to retrieve resources paths and files + self.assertEqual(resource_path(dist_name, test_path), test_resource_path) + self.assertRaises(KeyError, resource_path, dist_name, 'notexis') + + self.assertEqual(resource_open(dist_name, test_path).read(), content) + self.assertRaises(KeyError, resource_open, dist_name, 'notexis') + + sys.path = old_sys_path + + enable_cache() + +def test_suite(): + return unittest.makeSuite(DataFilesTestCase) + +if __name__ == '__main__': + run_unittest(test_suite()) diff --git a/distutils2/tests/test_util.py b/distutils2/tests/test_util.py --- a/distutils2/tests/test_util.py +++ b/distutils2/tests/test_util.py @@ -18,7 +18,7 @@ _find_exe_version, _MAC_OS_X_LD_VERSION, byte_compile, find_packages, spawn, find_executable, _nt_quote_args, get_pypirc_path, generate_pypirc, - read_pypirc, resolve_name) + read_pypirc, resolve_name, iglob, RICH_GLOB) from distutils2 import util from distutils2.tests import unittest, support @@ -479,9 +479,186 @@ content = open(rc).read() self.assertEqual(content, WANTED) +class GlobTestCaseBase(support.TempdirManager, + support.LoggingCatcher, + unittest.TestCase): + + def build_files_tree(self, files): + tempdir = self.mkdtemp() + for filepath in files: + is_dir = filepath.endswith('/') + filepath = os.path.join(tempdir, *filepath.split('/')) + if is_dir: + dirname = filepath + else: + dirname = os.path.dirname(filepath) + if dirname and not os.path.exists(dirname): + os.makedirs(dirname) + if not is_dir: + self.write_file(filepath, 'babar') + return tempdir + + @staticmethod + def os_dependant_path(path): + path = path.rstrip('/').split('/') + return os.path.join(*path) + + def clean_tree(self, spec): + files = [] + for path, includes in list(spec.items()): + if includes: + files.append(self.os_dependant_path(path)) + return files + +class GlobTestCase(GlobTestCaseBase): + + + def assertGlobMatch(self, glob, spec): + """""" + tempdir = self.build_files_tree(spec) + expected = self.clean_tree(spec) + self.addCleanup(os.chdir, os.getcwd()) + os.chdir(tempdir) + result = list(iglob(glob)) + self.assertItemsEqual(expected, result) + + def test_regex_rich_glob(self): + matches = RICH_GLOB.findall(r"babar aime les {fraises} est les {huitres}") + self.assertEquals(["fraises","huitres"], matches) + + def test_simple_glob(self): + glob 
= '*.tp?' + spec = {'coucou.tpl': True, + 'coucou.tpj': True, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_simple_glob_in_dir(self): + glob = 'babar/*.tp?' + spec = {'babar/coucou.tpl': True, + 'babar/coucou.tpj': True, + 'babar/toto.bin': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_head(self): + glob = '**/tip/*.t?l' + spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, + 'babar/z/tip/coucou.tpl': True, + 'babar/tip/coucou.tpl': True, + 'babar/zeop/tip/babar/babar.tpl': False, + 'babar/z/tip/coucou.bin': False, + 'babar/toto.bin': False, + 'zozo/zuzu/tip/babar.tpl': True, + 'zozo/tip/babar.tpl': True, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_tail(self): + glob = 'babar/**' + spec = {'babar/zaza/': True, + 'babar/zaza/zuzu/': True, + 'babar/zaza/zuzu/babar.xml': True, + 'babar/zaza/zuzu/toto.xml': True, + 'babar/zaza/zuzu/toto.csv': True, + 'babar/zaza/coucou.tpl': True, + 'babar/bubu.tpl': True, + 'zozo/zuzu/tip/babar.tpl': False, + 'zozo/tip/babar.tpl': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_recursive_glob_middle(self): + glob = 'babar/**/tip/*.t?l' + spec = {'babar/zaza/zuzu/tip/coucou.tpl': True, + 'babar/z/tip/coucou.tpl': True, + 'babar/tip/coucou.tpl': True, + 'babar/zeop/tip/babar/babar.tpl': False, + 'babar/z/tip/coucou.bin': False, + 'babar/toto.bin': False, + 'zozo/zuzu/tip/babar.tpl': False, + 'zozo/tip/babar.tpl': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_tail(self): + glob = 'bin/*.{bin,sh,exe}' + spec = {'bin/babar.bin': True, + 'bin/zephir.sh': True, + 'bin/celestine.exe': True, + 'bin/cornelius.bat': False, + 'bin/cornelius.xml': False, + 'toto/yurg': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_middle(self): + glob = 'xml/{babar,toto}.xml' + spec = {'xml/babar.xml': True, + 'xml/toto.xml': True, + 'xml/babar.xslt': False, + 'xml/cornelius.sgml': False, + 'xml/zephir.xml': False, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_set_head(self): + glob = '{xml,xslt}/babar.*' + spec = {'xml/babar.xml': True, + 'xml/toto.xml': False, + 'xslt/babar.xslt': True, + 'xslt/toto.xslt': False, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_glob_all(self): + glob = '{xml/*,xslt/**}/babar.xml' + spec = {'xml/a/babar.xml': True, + 'xml/b/babar.xml': True, + 'xml/a/c/babar.xml': False, + 'xslt/a/babar.xml': True, + 'xslt/b/babar.xml': True, + 'xslt/a/c/babar.xml': True, + 'toto/yurg.xml': False, + 'Donotwant': False} + self.assertGlobMatch(glob, spec) + + def test_invalid_glob_pattern(self): + invalids = [ + 'ppooa**', + 'azzaeaz4**/', + '/**ddsfs', + '**##1e"&e', + 'DSFb**c009', + '{' + '{aaQSDFa' + '}' + 'aQSDFSaa}' + '{**a,' + ',**a}' + '{a**,' + ',b**}' + '{a**a,babar}' + '{bob,b**z}' + ] + msg = "%r is not supposed to be a valid pattern" + for pattern in invalids: + try: + iglob(pattern) + except ValueError: + continue + else: + self.fail("%r is not a valid iglob pattern" % pattern) + + def test_suite(): - return unittest.makeSuite(UtilTestCase) + suite = unittest.makeSuite(UtilTestCase) + suite.addTest(unittest.makeSuite(GlobTestCase)) + return suite + if __name__ == "__main__": unittest.main(defaultTest="test_suite") diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -15,6 +15,10 @@ from 
subprocess import call as sub_call from copy import copy from fnmatch import fnmatchcase +try: + from glob import iglob as std_iglob +except ImportError: + from glob import glob as std_iglob # for python < 2.5 from ConfigParser import RawConfigParser from inspect import getsource @@ -946,6 +950,47 @@ return run_2to3(files, doctests_only, self.fixer_names, self.options, self.explicit) +RICH_GLOB = re.compile(r'\{([^}]*)\}') +_CHECK_RECURSIVE_GLOB = re.compile(r'[^/,{]\*\*|\*\*[^/,}]') +_CHECK_MISMATCH_SET = re.compile(r'^[^{]*\}|\{[^}]*$') + +def iglob(path_glob): + """Richer glob than the std glob module support ** and {opt1,opt2,opt3}""" + if _CHECK_RECURSIVE_GLOB.search(path_glob): + msg = """Invalid glob %r: Recursive glob "**" must be used alone""" + raise ValueError(msg % path_glob) + if _CHECK_MISMATCH_SET.search(path_glob): + msg = """Invalid glob %r: Mismatching set marker '{' or '}'""" + raise ValueError(msg % path_glob) + return _iglob(path_glob) + + +def _iglob(path_glob): + """Actual logic of the iglob function""" + rich_path_glob = RICH_GLOB.split(path_glob, 1) + if len(rich_path_glob) > 1: + assert len(rich_path_glob) == 3, rich_path_glob + prefix, set, suffix = rich_path_glob + for item in set.split(','): + for path in _iglob( ''.join((prefix, item, suffix))): + yield path + else: + if '**' not in path_glob: + for item in std_iglob(path_glob): + yield item + else: + prefix, radical = path_glob.split('**', 1) + if prefix == '': + prefix = '.' + if radical == '': + radical = '*' + else: + radical = radical.lstrip('/') + for (path, dir, files) in os.walk(prefix): + path = os.path.normpath(path) + for file in _iglob(os.path.join(path, radical)): + yield file + def generate_distutils_kwargs_from_setup_cfg(file='setup.cfg'): """ Distutils2 to distutils1 compatibility util. diff --git a/docs/design/wiki.rst b/docs/design/wiki.rst --- a/docs/design/wiki.rst +++ b/docs/design/wiki.rst @@ -250,8 +250,8 @@ == ==================================== =================================================================================================== 1 mailman/database/schemas/blah.schema /var/mailman/schemas/blah.schema 2 some.tpl /var/mailman/templates/some.tpl -3 path/to/some.tpl /var/mailman/templates/path/to/some.tpl -4 mailman/database/mailman.db /var/mailman/database/mailman.db +3 path/to/some.tpl /var/mailman/templates/path/to/some.tpl ! +4 mailman/database/mailman.db /var/mailman/database/mailman.db ! 5 developer-docs/index.txt /usr/share/doc/mailman/developer-docs/index.txt 6 developer-docs/api/toc.txt /usr/share/doc/mailman/developer-docs/api/toc.txt 7 README /usr/share/doc/mailman/README @@ -259,7 +259,7 @@ 9 mailman/foo/some/path/bar/my.cfg /etc/mailman/baz/some/path/bar/my.cfg AND /etc/mailman/hmm/some/path/bar/my.cfg + emit a warning -10 mailman/foo/some/path/other.cfg /etc/mailman/some/path/other.cfg +10 mailman/foo/some/path/other.cfg /etc/mailman/some/path/other.cfg ! 11 some-new-semantic.sns /var/funky/mailman/some-new-semantic.sns == ==================================== =================================================================================================== diff --git a/docs/source/setupcfg.rst b/docs/source/setupcfg.rst --- a/docs/source/setupcfg.rst +++ b/docs/source/setupcfg.rst @@ -176,11 +176,360 @@ .. Note:: In Distutils2, setup.cfg will be implicitly included. +Resources +========= + +This section describes the files used by the project which must not be installed in the same place that python modules or libraries, they are called **resources**. 
Examples include documentation files, script files, and databases. + +To declare resources, use this notation:: + + source = destination + +Data files are declared in the **resources** field of the **files** section, for example:: + + [files] + resources = + source1 = destination1 + source2 = destination2 + +The **source** part of a declaration is the path of a resource file, relative to the project root and using the Unix path separator **/**. For example, if you have this source tree:: + + foo/ + doc/ + doc.man + scripts/ + foo.sh + +Your setup.cfg will look like:: + + [files] + resources = + doc/doc.man = destination_doc + scripts/foo.sh = destination_scripts + +The final path where a file is placed is composed of **destination** + **source**. In the previous example, **doc/doc.man** will be placed in **destination_doc/doc/doc.man** and **scripts/foo.sh** will be placed in **destination_scripts/scripts/foo.sh**. (If you want more control over the final path, take a look at base_prefix_.) + +The **destination** part of a resource declaration is a path built from categories. It is generally a bad idea to give an absolute path, because it is not portable. Instead, use resource categories in your **destination** declaration; categories are replaced by their real paths at installation time. Using categories is all benefit: your declarations are simpler and cross-platform, and packagers can place resource files where they want without breaking your code. + +Categories are specified with this syntax:: + + {category} + +The default categories are: + +* config +* appdata +* appdata.arch +* appdata.persistent +* appdata.disposable +* help +* icon +* scripts +* doc +* info +* man + +A special category, **{distribution.name}**, is also available; it is replaced by the name of the distribution. Since most of the default categories already use it, it is usually not necessary to add **{distribution.name}** to your destination. + +If you use categories in your declarations (and you are encouraged to do so), the final path will be:: + + destination_expanded + source + +.. _example_final_path: + +For example, if you have this setup.cfg:: + + [metadata] + name = foo + + [files] + resources = + doc/doc.man = {doc} + +And if **{doc}** is replaced by **{datadir}/doc/{distribution.name}**, the final path will be:: + + {datadir}/doc/foo/doc/doc.man + +The {datadir} category itself is platform-dependent. + + +More control over the source part +--------------------------------- + +Glob syntax +___________ + +When you declare a source, you can use a glob-like syntax to match multiple files, for example:: + + scripts/* = {script} + +This will match all the files in the scripts directory and place them in the script category. + +The glob tokens are: + + * * : matches any sequence of characters in a name. + * ? : matches any single character. + * ** : matches any level of tree recursion (even 0). + * {} : matches any one of the comma-separated alternatives (example: {sh,bat}). + +TODO :: + + Add an example
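For illustration, here is a minimal sketch of these tokens in action. It exercises the iglob() helper added to distutils2/util.py earlier in this changeset, which supports the same ** and {} tokens; the directory layout it assumes is hypothetical::

    from distutils2.util import iglob

    # '**' descends any number of directory levels (even none) and
    # '{sh,bat}' matches either alternative, so this walks scripts/
    # and yields every .sh and .bat file found below it.
    for path in iglob('scripts/**/*.{sh,bat}'):
        print(path)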
+Order of declaration +____________________ + +The order of declaration matters when one file matches multiple rules: the last rule that matches the file wins. This is useful if you have this source tree:: + + foo/ + doc/ + index.rst + setup.rst + documentation.txt + doc.tex + README + +If you want all the files in the doc directory to be placed in the {doc} category, except README, which must go in the {help} category, then instead of listing the files one by one you can declare them this way:: + + [files] + resources = + doc/* = {doc} + doc/README = {help} + +Exclude +_______ + +You can exclude files from the resources declaration by giving them no destination. This can be useful if you have a non-resource file in the same directory as resource files:: + + foo/ + doc/ + RELEASES + doc.tex + documentation.txt + docu.rst + +Your **files** section will be:: + + [files] + resources = + doc/* = {doc} + doc/RELEASES = + +More control over the destination part +--------------------------------------- + +.. _base_prefix: + +Define a base-prefix +____________________ + +When you define your resources, you can take more control over how the final path is computed. + +By default, the final path is:: + + destination + source + +This can generate long paths, for example (example_final_path_):: + + {datadir}/doc/foo/doc/doc.man + +When you declare a source, you can use a separator to split it into a **prefix** and a **suffix**. The supported separator is: + + * Whitespace + +So, for example, if you have this source:: + + docs/ doc.man + +The **prefix** is "docs/" and the **suffix** is "doc.man". + +.. note:: + + The separator can be placed after a path separator or replace it, so these two sources are equivalent:: + + docs/ doc.man + docs doc.man + +.. note:: + + Glob syntax works the same way with a standard source and a split source, so these rules:: + + docs/* + docs/ * + docs * + + all match the files in the docs directory. + +When you use a split source, the final path is computed this way:: + + destination + suffix + +So, for example, if you have this setup.cfg:: + + [metadata] + name = foo + + [files] + resources = + doc/ doc.man = {doc} + +And if **{doc}** is replaced by **{datadir}/doc/{distribution.name}**, the final path will be:: + + {datadir}/doc/foo/doc.man + + +Overwrite paths for categories +------------------------------ + +.. warning:: + + This part is intended for system administrators and packagers. + +The real paths of the categories are registered in the *sysconfig.cfg* file installed in your Python installation. The format of this file is INI-like, and its content is organized into several sections: + + * globals : standard category paths. + * posix_prefix : standard category paths and installation paths for posix systems. + * other platform-specific sections... + +Standard category paths are platform-independent; they generally refer to other categories, which are platform-dependent. The sysconfig module picks the concrete values from the section matching os.name. For example:: + + doc = {datadir}/doc/{distribution.name} + +This refers to the datadir category, which can differ between platforms. On a posix system it may be:: + + datadir = /usr/share + +So the final path will be:: + + doc = /usr/share/doc/{distribution.name} + +A sketch of this substitution is shown after the list below. The platform-dependent categories are: + + * confdir + * datadir + * libdir + * base
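The substitution itself is plain string replacement. As an illustration only (this is not the actual distutils2/sysconfig code, and the mapping values are made up), the expansion of {category} markers could be sketched like this::

    import re

    _CATEGORY = re.compile(r'\{([^{}]+)\}')

    def expand(path, categories):
        # Replace {name} markers repeatedly until none are left, so that
        # a category may refer to other categories.
        while True:
            expanded = _CATEGORY.sub(lambda m: categories[m.group(1)], path)
            if expanded == path:
                return expanded
            path = expanded

    categories = {'distribution.name': 'foo',
                  'datadir': '/usr/share',
                  'doc': '{datadir}/doc/{distribution.name}'}

    print(expand('{doc}', categories))   # -> /usr/share/doc/foo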
+ +Define extra-categories +----------------------- + +Examples +-------- + +.. note:: + + These examples are incremental, but each one also works on its own. + +Resources in root dir +_____________________ + +Source tree:: + + babar-1.0/ + README + babar.sh + launch.sh + babar.py + +Setup.cfg:: + + [files] + resources = + README = {doc} + *.sh = {scripts} + +So babar.sh and launch.sh will be placed in the {scripts} directory. + +Now let's move all the scripts into a scripts directory. + +Resources in sub-directory +__________________________ + +Source tree:: + + babar-1.1/ + README + scripts/ + babar.sh + launch.sh + LAUNCH + babar.py + +Setup.cfg:: + + [files] + resources = + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.sh = {scripts} + +It's important to use the separator after scripts/ to install all the bash scripts into {scripts} instead of {scripts}/scripts. + +Now let's add some docs. + +Resources in multiple sub-directories +_____________________________________ + +Source tree:: + + babar-1.2/ + README + scripts/ + babar.sh + launch.sh + LAUNCH + docs/ + api + man + babar.py + +Setup.cfg:: + + [files] + resources = + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.sh = {scripts} + doc/ * = {doc} + doc/ man = {man} + +We want all the files in the docs directory to go into the {doc} category, except man, which must go into the {man} category. The order of declaration of the globs chooses the destination: the last glob that matches a file is used. + +Now let's add some scripts for Windows users. + +Complete example +________________ + +Source tree:: + + babar-1.3/ + README + doc/ + api + man + scripts/ + babar.sh + launch.sh + babar.bat + launch.bat + LAUNCH + +Setup.cfg:: + + [files] + resources = + README = {doc} + scripts/ LAUNCH = {doc} + scripts/ *.{sh,bat} = {scripts} + doc/ * = {doc} + doc/ man = {man} + +We use the brace expansion syntax to place all the bash and batch scripts into the {scripts} category. + .. Warning:: In Distutils2, setup.py and README (or README.txt) files are no longer included in the source distribution by default - `command` sections ================== -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: Silence distutils2.tests.test_install log output. Message-ID: tarek.ziade pushed bf37a129641b to distutils2: http://hg.python.org/distutils2/rev/bf37a129641b changeset: 1094:bf37a129641b parent: 1024:c3706e13ec0b user: Kelsey Hightower date: Sun Feb 13 17:52:29 2011 -0500 summary: Silence distutils2.tests.test_install log output.
files: distutils2/tests/test_install.py diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -1,15 +1,14 @@ """Tests for the distutils2.install module.""" import os + from tempfile import mkstemp - from distutils2 import install from distutils2.index.xmlrpc import Client from distutils2.metadata import Metadata from distutils2.tests import run_unittest -from distutils2.tests.support import TempdirManager +from distutils2.tests.support import LoggingCatcher, TempdirManager, unittest from distutils2.tests.pypi_server import use_xmlrpc_server -from distutils2.tests.support import unittest class InstalledDist(object): @@ -84,7 +83,7 @@ return objects -class TestInstall(TempdirManager, unittest.TestCase): +class TestInstall(LoggingCatcher, TempdirManager, unittest.TestCase): def _get_client(self, server, *args, **kwargs): return Client(server.full_address, *args, **kwargs) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: EnvironmentError does not always have a message attribute (python 2.4) Message-ID: tarek.ziade pushed 422fcb29d738 to distutils2: http://hg.python.org/distutils2/rev/422fcb29d738 changeset: 1093:422fcb29d738 user: Alexis Metaireau date: Sun Feb 13 23:18:44 2011 +0000 summary: EnvironmentError does not always have a message attribute (python 2.4) files: distutils2/command/install_data.py diff --git a/distutils2/command/install_data.py b/distutils2/command/install_data.py --- a/distutils2/command/install_data.py +++ b/distutils2/command/install_data.py @@ -51,7 +51,7 @@ try: (out, _) = self.copy_file(file[0], dir_dest) except Error, e: - self.warn(e.message) + self.warn(e) out = destination self.outfiles.append(out) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: Merge Kelsey changes Message-ID: tarek.ziade pushed 53e2bbd550a2 to distutils2: http://hg.python.org/distutils2/rev/53e2bbd550a2 changeset: 1095:53e2bbd550a2 parent: 1093:422fcb29d738 parent: 1094:bf37a129641b user: Alexis Metaireau date: Sun Feb 13 23:29:08 2011 +0000 summary: Merge Kelsey changes files: diff --git a/distutils2/tests/test_install.py b/distutils2/tests/test_install.py --- a/distutils2/tests/test_install.py +++ b/distutils2/tests/test_install.py @@ -1,15 +1,14 @@ """Tests for the distutils2.install module.""" import os + from tempfile import mkstemp - from distutils2 import install from distutils2.index.xmlrpc import Client from distutils2.metadata import Metadata from distutils2.tests import run_unittest -from distutils2.tests.support import TempdirManager +from distutils2.tests.support import LoggingCatcher, TempdirManager, unittest from distutils2.tests.pypi_server import use_xmlrpc_server -from distutils2.tests.support import unittest class InstalledDist(object): @@ -84,7 +83,7 @@ return objects -class TestInstall(TempdirManager, unittest.TestCase): +class TestInstall(LoggingCatcher, TempdirManager, unittest.TestCase): def _get_client(self, server, *args, **kwargs): return Client(server.full_address, *args, **kwargs) -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:23:59 
2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:23:59 +0100 Subject: [Python-checkins] distutils2: cleanup util + better names Message-ID: tarek.ziade pushed 66427841fb2b to distutils2: http://hg.python.org/distutils2/rev/66427841fb2b changeset: 1096:66427841fb2b tag: tip user: Tarek Ziade date: Wed Feb 16 22:23:27 2011 +0100 summary: cleanup util + better names files: distutils2/mkcfg.py distutils2/util.py diff --git a/distutils2/mkcfg.py b/distutils2/mkcfg.py --- a/distutils2/mkcfg.py +++ b/distutils2/mkcfg.py @@ -650,7 +650,7 @@ program.query_user() program.update_config_file() program.write_setup_script() - # istutils2.util.generate_distutils_setup_py() + # distutils2.util.cfg_to_args() if __name__ == '__main__': diff --git a/distutils2/util.py b/distutils2/util.py --- a/distutils2/util.py +++ b/distutils2/util.py @@ -9,9 +9,6 @@ import re import string import sys -import shutil -import tarfile -import zipfile from subprocess import call as sub_call from copy import copy from fnmatch import fnmatchcase @@ -525,9 +522,9 @@ if missing == 'error': # blow up when we stat() the file pass elif missing == 'ignore': # missing source dropped from - continue # target's dependency list + continue # target's dependency list elif missing == 'newer': # missing source means target is - return True # out-of-date + return True # out-of-date if os.stat(source).st_mtime > target_mtime: return True @@ -793,6 +790,7 @@ msg = "command '%s' failed with exit status %d" raise DistutilsExecError(msg % (cmd, exit_status)) + def find_executable(executable, path=None): """Tries to find 'executable' in the directories listed in 'path'. @@ -924,7 +922,8 @@ if fixer_names: for fixername in fixer_names: - fixers.extend([fixer for fixer in get_fixers_from_package(fixername)]) + fixers.extend([fixer for fixer in + get_fixers_from_package(fixername)]) r = RefactoringTool(fixers, options=options) r.refactor(files, write=True, doctests_only=doctests_only) @@ -992,7 +991,7 @@ yield file -def generate_distutils_kwargs_from_setup_cfg(file='setup.cfg'): +def cfg_to_args(path='setup.cfg'): """ Distutils2 to distutils1 compatibility util. This method uses an existing setup.cfg to generate a dictionnary of @@ -1006,29 +1005,28 @@ """ # We need to declare the following constants here so that it's easier to # generate the setup.py afterwards, using inspect.getsource. 
- D1_D2_SETUP_ARGS = { - # D1 name : (D2_section, D2_name) - "name" : ("metadata",), - "version" : ("metadata",), - "author" : ("metadata",), - "author_email" : ("metadata",), - "maintainer" : ("metadata",), - "maintainer_email" : ("metadata",), - "url" : ("metadata", "home_page"), - "description" : ("metadata", "summary"), - "long_description" : ("metadata", "description"), - "download-url" : ("metadata",), - "classifiers" : ("metadata", "classifier"), - "platforms" : ("metadata", "platform"), # Needs testing - "license" : ("metadata",), - "requires" : ("metadata", "requires_dist"), - "provides" : ("metadata", "provides_dist"), # Needs testing - "obsoletes" : ("metadata", "obsoletes_dist"), # Needs testing - - "packages" : ("files",), - "scripts" : ("files",), - "py_modules" : ("files", "modules"), # Needs testing - } + + # XXX ** == needs testing + D1_D2_SETUP_ARGS = {"name": ("metadata",), + "version": ("metadata",), + "author": ("metadata",), + "author_email": ("metadata",), + "maintainer": ("metadata",), + "maintainer_email": ("metadata",), + "url": ("metadata", "home_page"), + "description": ("metadata", "summary"), + "long_description": ("metadata", "description"), + "download-url": ("metadata",), + "classifiers": ("metadata", "classifier"), + "platforms": ("metadata", "platform"), # ** + "license": ("metadata",), + "requires": ("metadata", "requires_dist"), + "provides": ("metadata", "provides_dist"), # ** + "obsoletes": ("metadata", "obsoletes_dist"), # ** + "packages": ("files",), + "scripts": ("files",), + "py_modules": ("files", "modules"), # ** + } MULTI_FIELDS = ("classifiers", "requires", @@ -1049,7 +1047,7 @@ if not os.path.exists(file): raise DistutilsFileError("file '%s' does not exist" % os.path.abspath(file)) - config.read(file) + config.read(path) kwargs = {} for arg in D1_D2_SETUP_ARGS: @@ -1084,8 +1082,20 @@ return kwargs -def generate_distutils_setup_py(): - """ Generate a distutils compatible setup.py using an existing setup.cfg. +_SETUP_TMPL = """\ +# This script was automatically generated by Distutils2 +import os +from distutils.core import setup +from ConfigParser import RawConfigParser + +%(func)s + +setup(**cfg_to_args()) +""" + + +def generate_setup_py(): + """Generates a distutils compatible setup.py using an existing setup.cfg. :raises DistutilsFileError: When a setup.py already exists. 
@@ -1095,15 +1105,6 @@ handle = open("setup.py", "w") try: - handle.write( - "# Distutils script using distutils2 setup.cfg to call the\n" - "# distutils.core.setup() with the right args.\n\n" - "import os\n" - "from distutils.core import setup\n" - "from ConfigParser import RawConfigParser\n\n" - "" + getsource(generate_distutils_kwargs_from_setup_cfg) + "\n\n" - "kwargs = generate_distutils_kwargs_from_setup_cfg()\n" - "setup(**kwargs)\n" - ) + handle.write(_SETUP_TMPL % {'func': getsource(cfg_to_args)}) finally: handle.close() -- Repository URL: http://hg.python.org/distutils2 From python-checkins at python.org Wed Feb 16 22:33:23 2011 From: python-checkins at python.org (tarek.ziade) Date: Wed, 16 Feb 2011 22:33:23 +0100 Subject: [Python-checkins] distutils2: cleaned up fancygetopt before refactoring Message-ID: tarek.ziade pushed a8c8ef7c3b92 to distutils2: http://hg.python.org/distutils2/rev/a8c8ef7c3b92 changeset: 1097:a8c8ef7c3b92 tag: tip user: Tarek Ziade date: Wed Feb 16 22:33:11 2011 +0100 summary: cleaned up fancygetopt before refactoring files: distutils2/fancy_getopt.py diff --git a/distutils2/fancy_getopt.py b/distutils2/fancy_getopt.py --- a/distutils2/fancy_getopt.py +++ b/distutils2/fancy_getopt.py @@ -25,6 +25,7 @@ # For recognizing "negative alias" options, eg. "quiet=!verbose" neg_alias_re = re.compile("^(%s)=!(%s)$" % (longopt_pat, longopt_pat)) + class FancyGetopt(object): """Wrapper around the standard 'getopt()' module that provides some handy extra functionality: @@ -36,7 +37,6 @@ --quiet is the "negative alias" of --verbose, then "--quiet" on the command line sets 'verbose' to false """ - def __init__(self, option_table=None): # The option table is (currently) a list of tuples. The @@ -78,51 +78,47 @@ # but expands short options, converts aliases, etc. 
self.option_order = [] - # __init__ () - - - def _build_index (self): + def _build_index(self): self.option_index.clear() for option in self.option_table: self.option_index[option[0]] = option - def set_option_table (self, option_table): + def set_option_table(self, option_table): self.option_table = option_table self._build_index() - def add_option (self, long_option, short_option=None, help_string=None): + def add_option(self, long_option, short_option=None, help_string=None): if long_option in self.option_index: - raise DistutilsGetoptError, \ - "option conflict: already an option '%s'" % long_option + raise DistutilsGetoptError( + "option conflict: already an option '%s'" % long_option) else: option = (long_option, short_option, help_string) self.option_table.append(option) self.option_index[long_option] = option - - def has_option (self, long_option): + def has_option(self, long_option): """Return true if the option table for this parser has an option with long name 'long_option'.""" return long_option in self.option_index - def _check_alias_dict (self, aliases, what): + def _check_alias_dict(self, aliases, what): assert isinstance(aliases, dict) for (alias, opt) in aliases.iteritems(): if alias not in self.option_index: - raise DistutilsGetoptError, \ + raise DistutilsGetoptError( ("invalid %s '%s': " - "option '%s' not defined") % (what, alias, alias) + "option '%s' not defined") % (what, alias, alias)) if opt not in self.option_index: - raise DistutilsGetoptError, \ + raise DistutilsGetoptError( ("invalid %s '%s': " - "aliased option '%s' not defined") % (what, alias, opt) + "aliased option '%s' not defined") % (what, alias, opt)) - def set_aliases (self, alias): + def set_aliases(self, alias): """Set the aliases for this option parser.""" self._check_alias_dict(alias, "alias") self.alias = alias - def set_negative_aliases (self, negative_alias): + def set_negative_aliases(self, negative_alias): """Set the negative aliases for this option parser. 'negative_alias' should be a dictionary mapping option names to option names, both the key and value must already be defined @@ -130,8 +126,7 @@ self._check_alias_dict(negative_alias, "negative alias") self.negative_alias = negative_alias - - def _grok_option_table (self): + def _grok_option_table(self): """Populate the various data structures that keep tabs on the option table. Called by 'getopt()' before it can do anything worthwhile. 
@@ -150,19 +145,19 @@ else: # the option table is part of the code, so simply # assert that it is correct - raise ValueError, "invalid option tuple: %r" % (option,) + raise ValueError("invalid option tuple: %r" % option) # Type- and value-check the option names if not isinstance(long, str) or len(long) < 2: - raise DistutilsGetoptError, \ + raise DistutilsGetoptError( ("invalid long option '%s': " - "must be a string of length >= 2") % long + "must be a string of length >= 2") % long) if (not ((short is None) or (isinstance(short, str) and len(short) == 1))): - raise DistutilsGetoptError, \ + raise DistutilsGetoptError( ("invalid short option '%s': " - "must a single character or None") % short + "must a single character or None") % short) self.repeat[long] = repeat self.long_opts.append(long) @@ -179,12 +174,12 @@ alias_to = self.negative_alias.get(long) if alias_to is not None: if self.takes_arg[alias_to]: - raise DistutilsGetoptError, \ + raise DistutilsGetoptError( ("invalid negative alias '%s': " "aliased option '%s' takes a value") % \ - (long, alias_to) + (long, alias_to)) - self.long_opts[-1] = long # XXX redundant?! + self.long_opts[-1] = long # XXX redundant?! self.takes_arg[long] = 0 else: @@ -195,32 +190,26 @@ alias_to = self.alias.get(long) if alias_to is not None: if self.takes_arg[long] != self.takes_arg[alias_to]: - raise DistutilsGetoptError, \ + raise DistutilsGetoptError( ("invalid alias '%s': inconsistent with " "aliased option '%s' (one of them takes a value, " - "the other doesn't") % (long, alias_to) - + "the other doesn't") % (long, alias_to)) # Now enforce some bondage on the long option name, so we can # later translate it to an attribute name on some object. Have # to do this a bit late to make sure we've removed any trailing # '='. if not longopt_re.match(long): - raise DistutilsGetoptError, \ + raise DistutilsGetoptError( ("invalid long option name '%s' " + - "(must be letters, numbers, hyphens only") % long + "(must be letters, numbers, hyphens only") % long) self.attr_name[long] = long.replace('-', '_') if short: self.short_opts.append(short) self.short2long[short[0]] = long - # for option_table - - # _grok_option_table() - - - def getopt (self, args=None, object=None): + def getopt(self, args=None, object=None): """Parse command-line options in args. Store as attributes on object. If 'args' is None or not supplied, uses 'sys.argv[1:]'. If @@ -245,10 +234,10 @@ try: opts, args = getopt.getopt(args, short_opts, self.long_opts) except getopt.error, msg: - raise DistutilsArgError, msg + raise DistutilsArgError(msg) for opt, val in opts: - if len(opt) == 2 and opt[0] == '-': # it's a short option + if len(opt) == 2 and opt[0] == '-': # it's a short option opt = self.short2long[opt[1]] else: assert len(opt) > 2 and opt[:2] == '--' @@ -281,21 +270,17 @@ else: return args - # getopt() - - - def get_option_order (self): + def get_option_order(self): """Returns the list of (option, value) tuples processed by the previous run of 'getopt()'. Raises RuntimeError if 'getopt()' hasn't been called yet. """ if self.option_order is None: - raise RuntimeError, "'getopt()' hasn't been called yet" - else: - return self.option_order + raise RuntimeError("'getopt()' hasn't been called yet") + return self.option_order - def generate_help (self, header=None): + def generate_help(self, header=None): """Generate help text (a list of strings, one per suggested line of output) from the option table for this FancyGetopt object. 
""" @@ -373,22 +358,16 @@ for l in text[1:]: lines.append(big_indent + l) - # for self.option_table - return lines - # generate_help () - - def print_help (self, header=None, file=None): + def print_help(self, header=None, file=None): if file is None: file = sys.stdout for line in self.generate_help(header): file.write(line + "\n") -# class FancyGetopt - -def fancy_getopt (options, negative_opt, object, args): +def fancy_getopt(options, negative_opt, object, args): parser = FancyGetopt(options) parser.set_negative_aliases(negative_opt) return parser.getopt(args, object) @@ -396,7 +375,8 @@ WS_TRANS = string.maketrans(string.whitespace, ' ' * len(string.whitespace)) -def wrap_text (text, width): + +def wrap_text(text, width): """wrap_text(text : string, width : int) -> [string] Split 'text' into multiple lines of no more than 'width' characters @@ -450,8 +430,6 @@ # string, of course! lines.append(string.join(cur_line, '')) - # while chunks - return lines @@ -459,7 +437,7 @@ """Dummy class just used as a place to hold command-line option values as instance attributes.""" - def __init__ (self, options=[]): + def __init__(self, options=[]): """Create a new OptionDummy instance. The attributes listed in 'options' will be initialized to None.""" for opt in options: -- Repository URL: http://hg.python.org/distutils2 From nnorwitz at gmail.com Thu Feb 17 10:00:48 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Thu, 17 Feb 2011 04:00:48 -0500 Subject: [Python-checkins] Python Regression Test Failures doc (1) Message-ID: <20110217090048.GA30982@kbk-i386-bb.dyndns.org> rm -rf build/* rm -rf tools/sphinx rm -rf tools/pygments rm -rf tools/jinja2 rm -rf tools/docutils Checking out Sphinx... svn: PROPFIND request failed on '/projects/external/Sphinx-0.6.5/sphinx' svn: PROPFIND of '/projects/external/Sphinx-0.6.5/sphinx': Could not resolve hostname `svn.python.org': Host not found (http://svn.python.org) make: *** [checkout] Error 1 From python-checkins at python.org Thu Feb 17 20:05:53 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 17 Feb 2011 20:05:53 +0100 (CET) Subject: [Python-checkins] r88431 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110217190553.CA1A9F6FF@mail.python.org> Author: raymond.hettinger Date: Thu Feb 17 20:05:53 2011 New Revision: 88431 Log: Fix an import and add a citation. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Thu Feb 17 20:05:53 2011 @@ -263,8 +263,8 @@ A simple of example of :class:`~concurrent.futures.ThreadPoolExecutor` is a launch of four parallel threads for copying files:: - import threading, shutil - with threading.ThreadPoolExecutor(max_workers=4) as e: + import concurrent.futures, shutil + with concurrent.futures.ThreadPoolExecutor(max_workers=4) as e: e.submit(shutil.copy, 'src1.txt', 'dest1.txt') e.submit(shutil.copy, 'src2.txt', 'dest2.txt') e.submit(shutil.copy, 'src3.txt', 'dest3.txt') @@ -767,8 +767,11 @@ >>> get_phone_number.cache_clear() - (Contributed by Raymond Hettinger and incorporating design ideas from - Jim Baker, Miki Tebeka, and Nick Coghlan.) + (Contributed by Raymond Hettinger and incorporating design ideas from Jim + Baker, Miki Tebeka, and Nick Coghlan; see `recipe 498245 + `_\, `recipe 577479 + `_\, :issue:`10586`, and + :issue:`10593`.) 
* The :func:`functools.wraps` decorator now adds a :attr:`__wrapped__` attribute pointing to the original callable function. This allows wrapped functions to From python-checkins at python.org Thu Feb 17 20:19:44 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 17 Feb 2011 20:19:44 +0100 (CET) Subject: [Python-checkins] r88432 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110217191944.E21EFEE982@mail.python.org> Author: raymond.hettinger Date: Thu Feb 17 20:19:44 2011 New Revision: 88432 Log: Fix-up logging.dictConfig() example. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Thu Feb 17 20:19:44 2011 @@ -193,7 +193,7 @@ {"version": 1, "formatters": {"brief": {"format": "%(levelname)-8s: %(name)-15s: %(message)s"}, - "full": {"format": "%(asctime)s %(name)-15s %(levelname)-8s %(message)s"}, + "full": {"format": "%(asctime)s %(name)-15s %(levelname)-8s %(message)s"} }, "handlers": {"console": { "class": "logging.StreamHandler", @@ -204,7 +204,7 @@ "class": "logging.StreamHandler", "formatter": "full", "level": "ERROR", - "stream": "ext://sys.stderr"}, + "stream": "ext://sys.stderr"} }, "root": {"level": "DEBUG", "handlers": ["console", "console_priority"]}} @@ -213,11 +213,13 @@ loaded and called with code like this:: >>> import json, logging.config - >>> with open('conf.json', 'rb') as f: + >>> with open('conf.json') as f: conf = json.load(f) >>> logging.config.dictConfig(conf) >>> logging.info("Transaction completed normally") + INFO : root : Transaction completed normally >>> logging.critical("Abnormal termination") + 2011-02-17 11:14:36,694 root CRITICAL Abnormal termination .. 
seealso:: From python-checkins at python.org Thu Feb 17 22:46:45 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 17 Feb 2011 22:46:45 +0100 Subject: [Python-checkins] hooks: Make merges more distinctive by their subject line Message-ID: antoine.pitrou pushed 78a366c3cf93 to hooks: http://hg.python.org/hooks/rev/78a366c3cf93 changeset: 18:78a366c3cf93 tag: tip user: Antoine Pitrou date: Thu Feb 17 22:46:16 2011 +0100 summary: Make merges more distinctive by their subject line files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -63,6 +63,9 @@ desc = desc.rsplit(' ', 1)[0] subj = '%s%s: %s' % (path, branch_insert, desc) + if len(parents) > 1: + subj = "merge in " + subj + send(subj, FROM % user, to, '\n'.join(body)) print 'notified %s of incoming changeset %s' % (to, ctx) return False -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Fri Feb 18 00:12:05 2011 From: python-checkins at python.org (michael.foord) Date: Fri, 18 Feb 2011 00:12:05 +0100 (CET) Subject: [Python-checkins] r88433 - peps/trunk/pep-0008.txt Message-ID: <20110217231205.0AFC8EE9DA@mail.python.org> Author: michael.foord Date: Fri Feb 18 00:12:04 2011 New Revision: 88433 Log: Typo correction in pep 8 Modified: peps/trunk/pep-0008.txt Modified: peps/trunk/pep-0008.txt ============================================================================== --- peps/trunk/pep-0008.txt (original) +++ peps/trunk/pep-0008.txt Fri Feb 18 00:12:04 2011 @@ -126,7 +126,7 @@ Encodings (PEP 263) - Code in the core Python distribution should aways use the ASCII or + Code in the core Python distribution should always use the ASCII or Latin-1 encoding (a.k.a. ISO-8859-1). For Python 3.0 and beyond, UTF-8 is preferred over Latin-1, see PEP 3120. From python-checkins at python.org Fri Feb 18 01:53:55 2011 From: python-checkins at python.org (raymond.hettinger) Date: Fri, 18 Feb 2011 01:53:55 +0100 (CET) Subject: [Python-checkins] r88434 - in python/branches/py3k/Doc: library/os.rst whatsnew/3.2.rst Message-ID: <20110218005355.C9C51F8C3@mail.python.org> Author: raymond.hettinger Date: Fri Feb 18 01:53:55 2011 New Revision: 88434 Log: Doc fixups. Modified: python/branches/py3k/Doc/library/os.rst python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/library/os.rst ============================================================================== --- python/branches/py3k/Doc/library/os.rst (original) +++ python/branches/py3k/Doc/library/os.rst Fri Feb 18 01:53:55 2011 @@ -1342,9 +1342,11 @@ >>> import os >>> statinfo = os.stat('somefile.txt') >>> statinfo - (33188, 422511, 769, 1, 1032, 100, 926, 1105022698,1105022732, 1105022732) + posix.stat_result(st_mode=33188, st_ino=7876932, st_dev=234881026, + st_nlink=1, st_uid=501, st_gid=501, st_size=264, st_atime=1297230295, + st_mtime=1297230027, st_ctime=1297230027) >>> statinfo.st_size - 926 + 264 Availability: Unix, Windows. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Fri Feb 18 01:53:55 2011 @@ -1354,7 +1354,8 @@ The :func:`os.popen` and :func:`subprocess.Popen` functions now support :keyword:`with` statements for auto-closing of the file descriptors. -(Contributed by Antoine Pitrou in :issue:`7461`.) +(Contributed by Antoine Pitrou and Brian Curtin in :issue:`7461` and +:issue:`10554`.) 
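For illustration of the feature mentioned above (a sketch, not part of the committed diff; it assumes a python executable is on the PATH), in 3.2 a subprocess.Popen object can be used as a context manager so its pipes are closed automatically::

    import subprocess

    # On leaving the with-block, Popen.__exit__ closes the stdout pipe
    # and waits for the child process to finish.
    with subprocess.Popen(['python', '-c', 'print(42)'],
                          stdout=subprocess.PIPE) as proc:
        print(proc.stdout.read())   # b'42\n'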
select ------ From python-checkins at python.org Fri Feb 18 02:34:28 2011 From: python-checkins at python.org (brett.cannon) Date: Fri, 18 Feb 2011 02:34:28 +0100 (CET) Subject: [Python-checkins] r88435 - python/branches/py3k/Doc/howto/pyporting.rst Message-ID: <20110218013428.EED9DEE98F@mail.python.org> Author: brett.cannon Date: Fri Feb 18 02:34:28 2011 New Revision: 88435 Log: Update the porting HOWTO to be a little less harsh on using 2to3. Patch reviewed by Raymond Hettinger, permission from Georg Brandl to commit during an RC. Modified: python/branches/py3k/Doc/howto/pyporting.rst Modified: python/branches/py3k/Doc/howto/pyporting.rst ============================================================================== --- python/branches/py3k/Doc/howto/pyporting.rst (original) +++ python/branches/py3k/Doc/howto/pyporting.rst Fri Feb 18 02:34:28 2011 @@ -31,23 +31,18 @@ want to consider writing/porting :ref:`all of your code for Python 3 and use 3to2 ` to port your code for Python 2. -If your project has a pre-existing Python 2 codebase and you would like Python -3 support to start off a new branch or version of your project, then you will -most likely want to :ref:`port using 2to3 `. This will allow you to -port your Python 2 code to Python 3 in a semi-automated fashion and begin to -maintain it separately from your Python 2 code. This approach can also work if -your codebase is small and/or simple enough for the translation to occur -quickly. - -Finally, if you want to maintain Python 2 and Python 3 versions of your project -simultaneously and with no differences, then you can write :ref:`Python 2/3 -source-compatible code `. While the code is not quite as -idiomatic as it would be written just for Python 3 or automating the port from -Python 2, it does makes it easier to continue to do rapid development -regardless of what major version of Python you are developing against at the -time. +If you would prefer to maintain a codebase which is semantically **and** +syntactically compatible with Python 2 & 3 simultaneously, you can write +:ref:`use_same_source`. While this tends to lead to somewhat non-idiomatic +code, it does mean you keep a rapid development process for you, the developer. + +Finally, you do have the option of :ref:`using 2to3 ` to translate +Python 2 code into Python 3 code (with some manual help). This can take the +form of branching your code and using 2to3 to start a Python 3 branch. You can +also have users perform the translation as installation time automatically so +that you only have to maintain a Python 2 codebase. -Regardless of which approach you choose, porting is probably not as hard or +Regardless of which approach you choose, porting is not as hard or time-consuming as you might initially think. You can also tackle the problem piece-meal as a good portion of porting is simply updating your code to follow current best practices in a Python 2/3 compatible way. @@ -595,7 +590,8 @@ source-compatible between Python 2 & 3. It does lead to code that is not entirely idiomatic Python (e.g., having to extract the currently raised exception from ``sys.exc_info()[1]``), but it can be run under Python 2 -**and** Python 3 without using 2to3_ as a translation step. This allows you to +**and** Python 3 without using 2to3_ as a translation step (although the tool +should be used to help find potential portability problems). This allows you to continue to have a rapid development process regardless of whether you are developing under Python 2 or Python 3. 
Whether this approach or using :ref:`use_2to3` works best for you will be a per-project decision. @@ -611,8 +607,8 @@ .. _What's New in Python 3.0: http://docs.python.org/release/3.0/whatsnew/3.0.html -Follow The Steps for Using 2to3_ (sans 2to3) --------------------------------------------- +Follow The Steps for Using 2to3_ +-------------------------------- All of the steps outlined in how to :ref:`port Python 2 code with 2to3 ` apply @@ -689,10 +685,11 @@ Other Resources =============== -The authors of the following blog posts and wiki pages deserve special thanks -for making public their tips for porting Python 2 code to Python 3 (and thus -helping provide information for this document): +The authors of the following blog posts, wiki pages, and books deserve special +thanks for making public their tips for porting Python 2 code to Python 3 (and +thus helping provide information for this document): +* http://python3porting.com/ * http://docs.pythonsprints.com/python3_porting/py-porting.html * http://techspot.zzzeek.org/2011/01/24/zzzeek-s-guide-to-python-3-porting/ * http://dabeaz.blogspot.com/2011/01/porting-py65-and-my-superboard-to.html From python-checkins at python.org Fri Feb 18 18:32:36 2011 From: python-checkins at python.org (dirkjan.ochtman) Date: Fri, 18 Feb 2011 18:32:36 +0100 Subject: [Python-checkins] merge in hooks: merge test Message-ID: dirkjan.ochtman pushed ddf747c3d272 to hooks: http://hg.python.org/hooks/rev/ddf747c3d272 changeset: 21:ddf747c3d272 tag: tip parent: 20:f3e489ffbf86 parent: 19:7ea3d7f6e688 user: Antoine Pitrou date: Fri Feb 18 18:32:26 2011 +0100 summary: merge test files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -9,6 +9,8 @@ CSET_URL = BASE + '%s/rev/%s' FROM = '%s ' +# foo + def send(sub, sender, to, body): msg = MIMEMultipart() msg['Subject'] = sub -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Fri Feb 18 18:32:35 2011 From: python-checkins at python.org (dirkjan.ochtman) Date: Fri, 18 Feb 2011 18:32:35 +0100 Subject: [Python-checkins] hooks: (test a) Message-ID: dirkjan.ochtman pushed 7ea3d7f6e688 to hooks: http://hg.python.org/hooks/rev/7ea3d7f6e688 changeset: 19:7ea3d7f6e688 user: Antoine Pitrou date: Fri Feb 18 18:31:51 2011 +0100 summary: (test a) files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -9,6 +9,8 @@ CSET_URL = BASE + '%s/rev/%s' FROM = '%s ' +# foo + def send(sub, sender, to, body): msg = MIMEMultipart() msg['Subject'] = sub -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Fri Feb 18 18:32:36 2011 From: python-checkins at python.org (dirkjan.ochtman) Date: Fri, 18 Feb 2011 18:32:36 +0100 Subject: [Python-checkins] hooks: test b Message-ID: dirkjan.ochtman pushed f3e489ffbf86 to hooks: http://hg.python.org/hooks/rev/f3e489ffbf86 changeset: 20:f3e489ffbf86 parent: 18:78a366c3cf93 user: Antoine Pitrou date: Fri Feb 18 18:32:11 2011 +0100 summary: test b files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -20,6 +20,8 @@ smtp.sendmail(sender, msg['To'], msg.as_string()) smtp.close() +# bar + def incoming(ui, repo, **kwargs): displayer = cmdutil.changeset_printer(ui, repo, False, False, True) -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Fri Feb 18 18:33:07 2011 From: python-checkins at python.org (dirkjan.ochtman) Date: Fri, 18 Feb 2011 18:33:07 +0100 Subject: [Python-checkins] hooks: Remove test lines Message-ID: dirkjan.ochtman pushed 03920e0377f0 to hooks: 
http://hg.python.org/hooks/rev/03920e0377f0 changeset: 22:03920e0377f0 tag: tip user: Antoine Pitrou date: Fri Feb 18 18:33:04 2011 +0100 summary: Remove test lines files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -9,8 +9,6 @@ CSET_URL = BASE + '%s/rev/%s' FROM = '%s ' -# foo - def send(sub, sender, to, body): msg = MIMEMultipart() msg['Subject'] = sub @@ -22,8 +20,6 @@ smtp.sendmail(sender, msg['To'], msg.as_string()) smtp.close() -# bar - def incoming(ui, repo, **kwargs): displayer = cmdutil.changeset_printer(ui, repo, False, False, True) -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Fri Feb 18 18:42:24 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 18 Feb 2011 18:42:24 +0100 Subject: [Python-checkins] hooks: Another test Message-ID: antoine.pitrou pushed e88f105b8e59 to hooks: http://hg.python.org/hooks/rev/e88f105b8e59 changeset: 23:e88f105b8e59 tag: tip user: Antoine Pitrou date: Fri Feb 18 18:42:21 2011 +0100 summary: Another test files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -69,3 +69,4 @@ send(subj, FROM % user, to, '\n'.join(body)) print 'notified %s of incoming changeset %s' % (to, ctx) return False + -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Fri Feb 18 18:47:23 2011 From: python-checkins at python.org (eric.araujo) Date: Fri, 18 Feb 2011 18:47:23 +0100 (CET) Subject: [Python-checkins] r88436 - python/branches/release27-maint/Doc/library/sets.rst Message-ID: <20110218174723.DA616EE981@mail.python.org> Author: eric.araujo Date: Fri Feb 18 18:47:23 2011 New Revision: 88436 Log: Link from deprecated sets module to builtins set and frozenset (#11238) Modified: python/branches/release27-maint/Doc/library/sets.rst Modified: python/branches/release27-maint/Doc/library/sets.rst ============================================================================== --- python/branches/release27-maint/Doc/library/sets.rst (original) +++ python/branches/release27-maint/Doc/library/sets.rst Fri Feb 18 18:47:23 2011 @@ -14,7 +14,7 @@ .. versionadded:: 2.3 .. deprecated:: 2.6 - The built-in ``set``/``frozenset`` types replace this module. + The built-in :class:`set`/:class:`frozenset` types replace this module. The :mod:`sets` module provides classes for constructing and manipulating unordered collections of unique elements. 
Common uses include membership From python-checkins at python.org Fri Feb 18 18:49:09 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 18 Feb 2011 18:49:09 +0100 Subject: [Python-checkins] hooks: A last test (everything should be ok now) Message-ID: antoine.pitrou pushed 6b294f1ff7ff to hooks: http://hg.python.org/hooks/rev/6b294f1ff7ff changeset: 24:6b294f1ff7ff tag: tip user: Antoine Pitrou date: Fri Feb 18 18:49:07 2011 +0100 summary: A last test (everything should be ok now) files: mail.py diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -69,4 +69,3 @@ send(subj, FROM % user, to, '\n'.join(body)) print 'notified %s of incoming changeset %s' % (to, ctx) return False - -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Fri Feb 18 20:37:53 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 18 Feb 2011 20:37:53 +0100 (CET) Subject: [Python-checkins] r88437 - peps/trunk/roman.py Message-ID: <20110218193753.6D803F6FF@mail.python.org> Author: antoine.pitrou Date: Fri Feb 18 20:37:53 2011 New Revision: 88437 Log: Test commit for hg mirrors Modified: peps/trunk/roman.py Modified: peps/trunk/roman.py ============================================================================== --- peps/trunk/roman.py (original) +++ peps/trunk/roman.py Fri Feb 18 20:37:53 2011 @@ -16,7 +16,7 @@ import re -#Define exceptions +# Define exceptions class RomanError(Exception): pass class OutOfRangeError(RomanError): pass class NotIntegerError(RomanError): pass From solipsis at pitrou.net Sat Feb 19 05:27:28 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 19 Feb 2011 05:27:28 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88435): sum=-55 Message-ID: py3k results for svn r88435 (hg cset 4401f56cc761) -------------------------------------------------- test_pyexpat leaked [0, 0, -56] references, sum=-56 test_timeout leaked [0, 1, 0] references, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogzrEpYD', '-x'] From python-checkins at python.org Sat Feb 19 09:44:47 2011 From: python-checkins at python.org (georg.brandl) Date: Sat, 19 Feb 2011 09:44:47 +0100 (CET) Subject: [Python-checkins] r88438 - in python/branches/py3k/Misc: NEWS README.AIX Message-ID: <20110219084447.E3729EE9B2@mail.python.org> Author: georg.brandl Date: Sat Feb 19 09:44:47 2011 New Revision: 88438 Log: #10709: add back an updated AIX-NOTES (as README.AIX). Added: python/branches/py3k/Misc/README.AIX Modified: python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 19 09:44:47 2011 @@ -18,6 +18,11 @@ - Issue #941346: Fix broken shared library build on AIX. +Documentation +------------- + +- Issue #10709: Add updated AIX notes in Misc/README.AIX. + What's New in Python 3.2 Release Candidate 3? ============================================= Added: python/branches/py3k/Misc/README.AIX ============================================================================== --- (empty file) +++ python/branches/py3k/Misc/README.AIX Sat Feb 19 09:44:47 2011 @@ -0,0 +1,137 @@ + +This documentation tries to help people who intend to use Python on +AIX. 
+ +There used to be many issues with Python on AIX, but the major ones +have been corrected for version 3.2, so that Python should now work +rather well on this platform. The remaining known issues are listed in +this document. + + +====================================================================== + Compiling Python +---------------------------------------------------------------------- + +You can compile Python with gcc or the native AIX compiler. The native +compiler used to give better performances on this system with older +versions of Python. With Python 3.2 it may not be the case anymore, +as this compiler does not allow compiling Python with computed gotos. +Some benchmarks need to be done. + +Compiling with gcc: + +cd Python-3.2 +CC=gcc OPT="-O2" ./configure --enable-shared +make + +There are various aliases for the native compiler. The recommended +alias for compiling Python is 'xlc_r', which provides a better level of +compatibility and handles thread initialization properly. + +It is a good idea to add the '-qmaxmem=70000' option, otherwise the +compiler considers various files too complex to optimize. + +Compiling with xlc: + +cd Python-3.2 +CC=xlc_r OPT="-O2 -qmaxmem=70000" ./configure --without-computed-gotos --enable-shared +make + +Note: +On AIX 5.3 and earlier, you will also need to specify the +"--disable-ipv6" flag to configure. This has been corrected in AIX +6.1. + + +====================================================================== + Memory Limitations +---------------------------------------------------------------------- + +Note: this section may not apply when compiling Python as a 64 bit +application. + +By default on AIX each program gets one segment register for its data +segment. As each segment register covers 256 MB, a Python program that +would use more than 256MB will raise a MemoryError. The standard +Python test suite is one such application. + +To allocate more segment registers to Python, you must use the linker +option -bmaxdata or the ldedit tool to specify the number of bytes you +need in the data segment. + +For example, if you want to allow 512MB of memory for Python (this is +enough for the test suite to run without MemoryErrors), you should run +the following command at the end of compilation: + +ldedit -b maxdata:0x20000000 ./python + +You can allow up to 2GB of memory for Python by using the value +0x80000000 for maxdata. + +It is also possible to go beyond 2GB of memory by activating Large +Page Use. You should consult the IBM documentation if you need to use +this option. You can also follow the discussion of this problem +in issue 11212 at bugs.python.org. + +http://publib.boulder.ibm.com/infocenter/aix/v6r1/index.jsp?topic=/com.ibm.aix.cmds/doc/aixcmds3/ldedit.htm + + +====================================================================== + Known issues +---------------------------------------------------------------------- + +Those issues are currently affecting Python on AIX: + +* Python has not been fully tested on AIX when compiled as a 64 bit + application. + +* issue 3526: the memory used by a Python process will never be + released to the system. 
If you have a Python application on AIX that + uses a lot of memory, you should read this issue and you may + consider using the provided patch that implements a custom malloc + implementation + +* issue 11184: support for large files is currently broken + +* issue 11185: os.wait4 does not behave correctly with option WNOHANG + +* issue 1745108: there may be some problems with curses.panel + +* issue 11192: test_socket fails + +* issue 11190: test_locale fails + +* issue 11193: test_subprocess fails + +* issue 9920: minor arithmetic issues in cmath + +* issue 11215: test_fileio fails + +* issue 11188: test_time fails + + +====================================================================== + Implementation details for developers +---------------------------------------------------------------------- + +Python and python modules can now be built as shared libraries on AIX +as usual. + +AIX shared libraries require that an "export" and "import" file be +provided at compile time to list all extern symbols which may be +shared between modules. The "export" file (named python.exp) for the +modules and the libraries that belong to the Python core is created by +the "makexp_aix" script before performing the link of the python +binary. It lists all global symbols (exported during the link) of the +modules and the libraries that make up the python executable. + +When shared library modules (.so files) are made, a second shell +script is invoked. This script is named "ld_so_aix" and is also +provided with the distribution in the Modules subdirectory. This +script acts as an "ld" wrapper which hides the explicit management of +"export" and "import" files; it adds the appropriate arguments (in the +appropriate order) to the link command that creates the shared module. +Among other things, it specifies that the "python.exp" file is an +"import" file for the shared module. + +This mechanism should be transparent. From python-checkins at python.org Sat Feb 19 09:47:14 2011 From: python-checkins at python.org (georg.brandl) Date: Sat, 19 Feb 2011 09:47:14 +0100 (CET) Subject: [Python-checkins] r88439 - in python/branches/py3k: Makefile.pre.in Misc/NEWS Message-ID: <20110219084714.79B63EE99D@mail.python.org> Author: georg.brandl Date: Sat Feb 19 09:47:14 2011 New Revision: 88439 Log: #11222: fix non-framework shared library build on Mac, patch by Ned Deily. 
Modified: python/branches/py3k/Makefile.pre.in python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Makefile.pre.in ============================================================================== --- python/branches/py3k/Makefile.pre.in (original) +++ python/branches/py3k/Makefile.pre.in Sat Feb 19 09:47:14 2011 @@ -458,8 +458,8 @@ libpython3.so: libpython$(LDVERSION).so $(BLDSHARED) -o $@ -Wl,-hl$@ $^ -libpython$(VERSION).dylib: $(LIBRARY_OBJS) - $(CC) -dynamiclib -Wl,-single_module $(PY_LDFLAGS) -undefined dynamic_lookup -Wl,-install_name,$(prefix)/lib/libpython$(VERSION).dylib -Wl,-compatibility_version,$(VERSION) -Wl,-current_version,$(VERSION) -o $@ $(LIBRARY_OBJS) $(SHLIBS) $(LIBC) $(LIBM) $(LDLAST); \ +libpython$(LDVERSION).dylib: $(LIBRARY_OBJS) + $(CC) -dynamiclib -Wl,-single_module $(PY_LDFLAGS) -undefined dynamic_lookup -Wl,-install_name,$(prefix)/lib/libpython$(LDVERSION).dylib -Wl,-compatibility_version,$(VERSION) -Wl,-current_version,$(VERSION) -o $@ $(LIBRARY_OBJS) $(SHLIBS) $(LIBC) $(LIBM) $(LDLAST); \ libpython$(VERSION).sl: $(LIBRARY_OBJS) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 19 09:47:14 2011 @@ -16,6 +16,8 @@ Build ----- +- Issue #11222: Fix non-framework shared library build on Mac OS X. + - Issue #941346: Fix broken shared library build on AIX. Documentation From python-checkins at python.org Sat Feb 19 09:58:24 2011 From: python-checkins at python.org (georg.brandl) Date: Sat, 19 Feb 2011 09:58:24 +0100 (CET) Subject: [Python-checkins] r88440 - in python/branches/py3k: Misc/NEWS configure configure.in Message-ID: <20110219085824.29176EE988@mail.python.org> Author: georg.brandl Date: Sat Feb 19 09:58:23 2011 New Revision: 88440 Log: #11184: Fix large file support on AIX. Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/configure python/branches/py3k/configure.in Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 19 09:58:23 2011 @@ -18,6 +18,8 @@ - Issue #11222: Fix non-framework shared library build on Mac OS X. +- Issue #11184: Fix large-file support on AIX. + - Issue #941346: Fix broken shared library build on AIX. Documentation Modified: python/branches/py3k/configure ============================================================================== --- python/branches/py3k/configure (original) +++ python/branches/py3k/configure Sat Feb 19 09:58:23 2011 @@ -1,5 +1,5 @@ #! /bin/sh -# From configure.in Revision: 88426 . +# From configure.in Revision: 88430 . # Guess values for system-dependent variables and create Makefiles. # Generated by GNU Autoconf 2.68 for python 3.2. 
# @@ -6414,6 +6414,13 @@ if test "$use_lfs" = "yes"; then # Two defines needed to enable largefile support on various platforms # These may affect some typedefs +case $ac_sys_system/$ac_sys_release in +AIX*) + +$as_echo "#define _LARGE_FILES 1" >>confdefs.h + + ;; +esac $as_echo "#define _LARGEFILE_SOURCE 1" >>confdefs.h Modified: python/branches/py3k/configure.in ============================================================================== --- python/branches/py3k/configure.in (original) +++ python/branches/py3k/configure.in Sat Feb 19 09:58:23 2011 @@ -1376,6 +1376,12 @@ if test "$use_lfs" = "yes"; then # Two defines needed to enable largefile support on various platforms # These may affect some typedefs +case $ac_sys_system/$ac_sys_release in +AIX*) + AC_DEFINE(_LARGE_FILES, 1, + [This must be defined on AIX systems to enable large file support.]) + ;; +esac AC_DEFINE(_LARGEFILE_SOURCE, 1, [This must be defined on some systems to enable large file support.]) AC_DEFINE(_FILE_OFFSET_BITS, 64, From python-checkins at python.org Sat Feb 19 13:18:39 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 19 Feb 2011 13:18:39 +0100 Subject: [Python-checkins] hooks: Do not hide status messages from the buildbot hook Message-ID: antoine.pitrou pushed 0db5eb2adbff to hooks: http://hg.python.org/hooks/rev/0db5eb2adbff changeset: 25:0db5eb2adbff tag: tip user: Antoine Pitrou date: Sat Feb 19 13:18:37 2011 +0100 summary: Do not hide status messages from the buildbot hook files: buildbot.py diff --git a/buildbot.py b/buildbot.py --- a/buildbot.py +++ b/buildbot.py @@ -51,7 +51,7 @@ c['files'], c['username']) for change in changes: d.addCallback(send, change) - #d.addCallbacks(s.printSuccess, s.printFailure) + d.addCallbacks(s.printSuccess, s.printFailure) d.addBoth(s.stop) s.run() -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Sat Feb 19 13:27:06 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 19 Feb 2011 13:27:06 +0100 Subject: [Python-checkins] hooks: Remove forking as it will then re-enter hg concurrently into both child Message-ID: antoine.pitrou pushed 86f51cb4d89a to hooks: http://hg.python.org/hooks/rev/86f51cb4d89a changeset: 26:86f51cb4d89a tag: tip user: Antoine Pitrou date: Sat Feb 19 13:27:00 2011 +0100 summary: Remove forking as it will then re-enter hg concurrently into both child and parent. Simply run tasks sequentially instead. 
files: buildbot.py diff --git a/buildbot.py b/buildbot.py --- a/buildbot.py +++ b/buildbot.py @@ -34,13 +34,14 @@ except ImportError: pass +from twisted.internet import defer, reactor + sys.path.append('/data/buildbot/lib/python') def sendchanges(master, changes): # send change information to one master from buildbot.clients import sendchange - from twisted.internet import defer, reactor s = sendchange.Sender(master, None) d = defer.Deferred() @@ -53,7 +54,6 @@ d.addCallback(send, change) d.addCallbacks(s.printSuccess, s.printFailure) d.addBoth(s.stop) - s.run() def hook(ui, repo, hooktype, node=None, source=None, **kwargs): @@ -93,8 +93,6 @@ }) for master in masters: - # fork off for each master: reactor can only run once per process - child_pid = os.fork() - if child_pid == 0: - sendchanges(master, changes) - return + sendchanges(master, changes) + reactor.run() + -- Repository URL: http://hg.python.org/hooks From nnorwitz at gmail.com Sat Feb 19 10:00:48 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Sat, 19 Feb 2011 04:00:48 -0500 Subject: [Python-checkins] Python Regression Test Failures doc (1) Message-ID: <20110219090048.GA10469@kbk-i386-bb.dyndns.org> rm -rf build/* rm -rf tools/sphinx rm -rf tools/pygments rm -rf tools/jinja2 rm -rf tools/docutils Checking out Sphinx... svn: PROPFIND request failed on '/projects/external/Sphinx-0.6.5/sphinx' svn: PROPFIND of '/projects/external/Sphinx-0.6.5/sphinx': Could not resolve hostname `svn.python.org': Host not found (http://svn.python.org) make: *** [checkout] Error 1 From python-checkins at python.org Sat Feb 19 19:06:51 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 19 Feb 2011 19:06:51 +0100 (CET) Subject: [Python-checkins] r88441 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110219180651.1113AEE9DA@mail.python.org> Author: eric.araujo Date: Sat Feb 19 19:06:50 2011 New Revision: 88441 Log: Some more grammar fixes/typos for what's new (approved by Raymond; #11071) Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Sat Feb 19 19:06:50 2011 @@ -333,7 +333,7 @@ * The :mod:`py_compile` and :mod:`compileall` modules have been updated to reflect the new naming convention and target directory. The command-line - invocation of *compileall* has new command-line options: ``-i`` for + invocation of *compileall* has new options: ``-i`` for specifying a list of files and directories to compile and ``-b`` which causes bytecode files to be written to their legacy location rather than *__pycache__*. @@ -450,7 +450,7 @@ * There is also a new :meth:`str.format_map` method that extends the capabilities of the existing :meth:`str.format` method by accepting arbitrary :term:`mapping` objects. This new method makes it possible to use string - formatting with any of one of Python's many dictionary-like tools such as + formatting with any of Python's many dictionary-like objects such as :class:`~collections.defaultdict`, :class:`~shelve.Shelf`, :class:`~configparser.ConfigParser`, or :mod:`dbm`. It is also useful with custom :class:`dict` subclasses that normalize keys before look-up or that @@ -477,7 +477,7 @@ (Suggested by Raymond Hettinger and implemented by Eric Smith in :issue:`6081`.)
-* The interpreter can now be started with a quiet option, ``-q``, to suppress +* The interpreter can now be started with a quiet option, ``-q``, to prevent the copyright and version information from being displayed in the interactive mode. The option can be introspected using the :attr:`sys.flags` attribute:: @@ -1410,7 +1410,7 @@ tarfile ------- -The :class:`~tarfile.TarFile` class can now be used as a content manager. In +The :class:`~tarfile.TarFile` class can now be used as a context manager. In addition, its :meth:`~tarfile.TarFile.add` method has a new option, *filter*, that controls which files are added to the archive and allows the file metadata to be edited. @@ -1485,7 +1485,7 @@ ... ValueError: malformed node or string: <_ast.Call object at 0x101739a10> -(Implemented by Georg Brandl.) +(Implemented by Benjamin Peterson and Georg Brandl.) os -- @@ -2005,7 +2005,7 @@ $ python -m site --user-site /Users/raymondhettinger/.local/lib/python3.2/site-packages -(Contributed by Tarek Ziadé.) +(Contributed by Tarek Ziadé in :issue:`6693`.) sysconfig --------- From python-checkins at python.org Sat Feb 19 19:46:02 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 19 Feb 2011 19:46:02 +0100 (CET) Subject: [Python-checkins] r88442 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110219184602.577C0EE984@mail.python.org> Author: eric.araujo Date: Sat Feb 19 19:46:02 2011 New Revision: 88442 Log: Fix two typos in what's new (#11234). Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Sat Feb 19 19:46:02 2011 @@ -373,7 +373,7 @@ >>> sysconfig.get_config_var('SOABI') # find the version tag 'cpython-32mu' >>> sysconfig.get_config_var('SO') # find the full filename extension - 'cpython-32mu.so' + '.cpython-32mu.so' .. seealso:: @@ -2199,7 +2199,7 @@ >>> r DefragResult(url='http://python.org/about/', fragment='target') >>> r[0] - 'http://python.org/about/ + 'http://python.org/about/' >>> r.fragment 'target' From python-checkins at python.org Sat Feb 19 22:47:04 2011 From: python-checkins at python.org (georg.brandl) Date: Sat, 19 Feb 2011 22:47:04 +0100 (CET) Subject: [Python-checkins] r88443 - in python/branches/py3k: Misc/NEWS Objects/typeobject.c Message-ID: <20110219214704.39514EE98B@mail.python.org> Author: georg.brandl Date: Sat Feb 19 22:47:02 2011 New Revision: 88443 Log: #11249: in PyType_FromSpec, copy tp_doc slot since it usually will point to a static string literal which should not be deallocated together with the type. Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/Objects/typeobject.c Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 19 22:47:02 2011 @@ -10,6 +10,8 @@ Core and Builtins ----------------- +- Issue #11249: Fix potential crashes when using the limited API.
+ Library ------- Modified: python/branches/py3k/Objects/typeobject.c ============================================================================== --- python/branches/py3k/Objects/typeobject.c (original) +++ python/branches/py3k/Objects/typeobject.c Sat Feb 19 22:47:02 2011 @@ -2347,6 +2347,17 @@ goto fail; } *(void**)(res_start + slotoffsets[slot->slot]) = slot->pfunc; + + /* need to make a copy of the docstring slot, which usually + points to a static string literal */ + if (slot->slot == Py_tp_doc) { + ssize_t len = strlen(slot->pfunc)+1; + char *tp_doc = PyObject_MALLOC(len); + if (tp_doc == NULL) + goto fail; + memcpy(tp_doc, slot->pfunc, len); + res->ht_type.tp_doc = tp_doc; + } } return (PyObject*)res; From solipsis at pitrou.net Sun Feb 20 05:14:20 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 20 Feb 2011 05:14:20 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88443): sum=-322 Message-ID: py3k results for svn r88443 (hg cset 2d3aded9fdd8) -------------------------------------------------- test_pyexpat leaked [0, -56, -267] references, sum=-323 test_timeout leaked [1, 0, 0] references, sum=1 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogWZNKGM', '-x'] From python-checkins at python.org Sun Feb 20 11:22:47 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 11:22:47 +0100 (CET) Subject: [Python-checkins] r88444 - in python/branches/py3k: Doc/tools/sphinxext/susp-ignored.csv Lib/pydoc_data/topics.py Message-ID: <20110220102247.6843DEE981@mail.python.org> Author: georg.brandl Date: Sun Feb 20 11:22:41 2011 New Revision: 88444 Log: Topic and suspicious update. Modified: python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv python/branches/py3k/Lib/pydoc_data/topics.py Modified: python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv ============================================================================== --- python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv (original) +++ python/branches/py3k/Doc/tools/sphinxext/susp-ignored.csv Sun Feb 20 11:22:41 2011 @@ -368,35 +368,21 @@ documenting/markup,613,`,:ref:`link text ` library/imaplib,116,:MM,"""DD-Mmm-YYYY HH:MM:SS" library/imaplib,116,:SS,"""DD-Mmm-YYYY HH:MM:SS" -whatsnew/3.2,573,::,"$ export PYTHONWARNINGS='ignore::RuntimeWarning::,once::UnicodeWarning::'" -whatsnew/3.2,1374,:gz,">>> with tarfile.open(name='myarchive.tar.gz', mode='w:gz') as tf:" -whatsnew/3.2,2061,:directory,${buildout:directory}/downloads/dist -whatsnew/3.2,2061,:location,zope9-location = ${zope9:location} -whatsnew/3.2,2061,:prefix,zope-conf = ${custom:prefix}/etc/zope.conf -whatsnew/3.2,2102,:beef,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2102,:cafe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2102,:affe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2102,:deaf,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2102,:feed,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2102,:beef,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," -whatsnew/3.2,2102,:cafe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," -whatsnew/3.2,2102,:affe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," -whatsnew/3.2,2102,:deaf,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," 
-whatsnew/3.2,2102,:feed,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,,::,"$ export PYTHONWARNINGS='ignore::RuntimeWarning::,once::UnicodeWarning::'" howto/pyporting,75,::,# make sure to use :: Python *and* :: Python :: 3 so howto/pyporting,75,::,"'Programming Language :: Python'," howto/pyporting,75,::,'Programming Language :: Python :: 3' -whatsnew/3.2,1419,:gz,">>> with tarfile.open(name='myarchive.tar.gz', mode='w:gz') as tf:" -whatsnew/3.2,2135,:directory,${buildout:directory}/downloads/dist -whatsnew/3.2,2135,:location,zope9-location = ${zope9:location} -whatsnew/3.2,2135,:prefix,zope-conf = ${custom:prefix}/etc/zope.conf -whatsnew/3.2,2178,:beef,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2178,:cafe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2178,:affe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2178,:deaf,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2178,:feed,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') -whatsnew/3.2,2178,:beef,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," -whatsnew/3.2,2178,:cafe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," -whatsnew/3.2,2178,:affe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," -whatsnew/3.2,2178,:deaf,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," -whatsnew/3.2,2178,:feed,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,,:gz,">>> with tarfile.open(name='myarchive.tar.gz', mode='w:gz') as tf:" +whatsnew/3.2,,:directory,${buildout:directory}/downloads/dist +whatsnew/3.2,,:location,zope9-location = ${zope9:location} +whatsnew/3.2,,:prefix,zope-conf = ${custom:prefix}/etc/zope.conf +whatsnew/3.2,,:beef,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,,:cafe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,,:affe,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,,:deaf,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,,:feed,>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/') +whatsnew/3.2,,:beef,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,,:cafe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,,:affe,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,,:deaf,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," +whatsnew/3.2,,:feed,"netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]'," Modified: python/branches/py3k/Lib/pydoc_data/topics.py ============================================================================== --- python/branches/py3k/Lib/pydoc_data/topics.py (original) +++ python/branches/py3k/Lib/pydoc_data/topics.py Sun Feb 20 11:22:41 2011 @@ -1,4 +1,4 @@ -# Autogenerated by Sphinx on Sun Feb 13 11:01:51 2011 +# Autogenerated by Sphinx on Sun Feb 20 10:16:17 2011 topics = {'assert': '\nThe ``assert`` statement\n************************\n\nAssert statements are a convenient way to insert debugging assertions\ninto a program:\n\n assert_stmt ::= "assert" expression ["," expression]\n\nThe simple form, ``assert expression``, is equivalent to\n\n if __debug__:\n if not expression: raise AssertionError\n\nThe extended form, ``assert expression1, 
expression2``, is equivalent\nto\n\n if __debug__:\n if not expression1: raise AssertionError(expression2)\n\nThese equivalences assume that ``__debug__`` and ``AssertionError``\nrefer to the built-in variables with those names. In the current\nimplementation, the built-in variable ``__debug__`` is ``True`` under\nnormal circumstances, ``False`` when optimization is requested\n(command line option -O). The current code generator emits no code\nfor an assert statement when optimization is requested at compile\ntime. Note that it is unnecessary to include the source code for the\nexpression that failed in the error message; it will be displayed as\npart of the stack trace.\n\nAssignments to ``__debug__`` are illegal. The value for the built-in\nvariable is determined when the interpreter starts.\n', 'assignment': '\nAssignment statements\n*********************\n\nAssignment statements are used to (re)bind names to values and to\nmodify attributes or items of mutable objects:\n\n assignment_stmt ::= (target_list "=")+ (expression_list | yield_expression)\n target_list ::= target ("," target)* [","]\n target ::= identifier\n | "(" target_list ")"\n | "[" target_list "]"\n | attributeref\n | subscription\n | slicing\n | "*" target\n\n(See section *Primaries* for the syntax definitions for the last three\nsymbols.)\n\nAn assignment statement evaluates the expression list (remember that\nthis can be a single expression or a comma-separated list, the latter\nyielding a tuple) and assigns the single resulting object to each of\nthe target lists, from left to right.\n\nAssignment is defined recursively depending on the form of the target\n(list). When a target is part of a mutable object (an attribute\nreference, subscription or slicing), the mutable object must\nultimately perform the assignment and decide about its validity, and\nmay raise an exception if the assignment is unacceptable. The rules\nobserved by various types and the exceptions raised are given with the\ndefinition of the object types (see section *The standard type\nhierarchy*).\n\nAssignment of an object to a target list, optionally enclosed in\nparentheses or square brackets, is recursively defined as follows.\n\n* If the target list is a single target: The object is assigned to\n that target.\n\n* If the target list is a comma-separated list of targets: The object\n must be an iterable with the same number of items as there are\n targets in the target list, and the items are assigned, from left to\n right, to the corresponding targets. (This rule is relaxed as of\n Python 1.5; in earlier versions, the object had to be a tuple.\n Since strings are sequences, an assignment like ``a, b = "xy"`` is\n now legal as long as the string has the right length.)\n\n * If the target list contains one target prefixed with an asterisk,\n called a "starred" target: The object must be a sequence with at\n least as many items as there are targets in the target list, minus\n one. The first items of the sequence are assigned, from left to\n right, to the targets before the starred target. The final items\n of the sequence are assigned to the targets after the starred\n target. 
A list of the remaining items in the sequence is then\n assigned to the starred target (the list can be empty).\n\n * Else: The object must be a sequence with the same number of items\n as there are targets in the target list, and the items are\n assigned, from left to right, to the corresponding targets.\n\nAssignment of an object to a single target is recursively defined as\nfollows.\n\n* If the target is an identifier (name):\n\n * If the name does not occur in a ``global`` or ``nonlocal``\n statement in the current code block: the name is bound to the\n object in the current local namespace.\n\n * Otherwise: the name is bound to the object in the global namespace\n or the outer namespace determined by ``nonlocal``, respectively.\n\n The name is rebound if it was already bound. This may cause the\n reference count for the object previously bound to the name to reach\n zero, causing the object to be deallocated and its destructor (if it\n has one) to be called.\n\n* If the target is a target list enclosed in parentheses or in square\n brackets: The object must be an iterable with the same number of\n items as there are targets in the target list, and its items are\n assigned, from left to right, to the corresponding targets.\n\n* If the target is an attribute reference: The primary expression in\n the reference is evaluated. It should yield an object with\n assignable attributes; if this is not the case, ``TypeError`` is\n raised. That object is then asked to assign the assigned object to\n the given attribute; if it cannot perform the assignment, it raises\n an exception (usually but not necessarily ``AttributeError``).\n\n Note: If the object is a class instance and the attribute reference\n occurs on both sides of the assignment operator, the RHS expression,\n ``a.x`` can access either an instance attribute or (if no instance\n attribute exists) a class attribute. The LHS target ``a.x`` is\n always set as an instance attribute, creating it if necessary.\n Thus, the two occurrences of ``a.x`` do not necessarily refer to the\n same attribute: if the RHS expression refers to a class attribute,\n the LHS creates a new instance attribute as the target of the\n assignment:\n\n class Cls:\n x = 3 # class variable\n inst = Cls()\n inst.x = inst.x + 1 # writes inst.x as 4 leaving Cls.x as 3\n\n This description does not necessarily apply to descriptor\n attributes, such as properties created with ``property()``.\n\n* If the target is a subscription: The primary expression in the\n reference is evaluated. It should yield either a mutable sequence\n object (such as a list) or a mapping object (such as a dictionary).\n Next, the subscript expression is evaluated.\n\n If the primary is a mutable sequence object (such as a list), the\n subscript must yield an integer. If it is negative, the sequence\'s\n length is added to it. The resulting value must be a nonnegative\n integer less than the sequence\'s length, and the sequence is asked\n to assign the assigned object to its item with that index. If the\n index is out of range, ``IndexError`` is raised (assignment to a\n subscripted sequence cannot add new items to a list).\n\n If the primary is a mapping object (such as a dictionary), the\n subscript must have a type compatible with the mapping\'s key type,\n and the mapping is then asked to create a key/datum pair which maps\n the subscript to the assigned object. 
This can either replace an\n existing key/value pair with the same key value, or insert a new\n key/value pair (if no key with the same value existed).\n\n For user-defined objects, the ``__setitem__()`` method is called\n with appropriate arguments.\n\n* If the target is a slicing: The primary expression in the reference\n is evaluated. It should yield a mutable sequence object (such as a\n list). The assigned object should be a sequence object of the same\n type. Next, the lower and upper bound expressions are evaluated,\n insofar they are present; defaults are zero and the sequence\'s\n length. The bounds should evaluate to integers. If either bound is\n negative, the sequence\'s length is added to it. The resulting\n bounds are clipped to lie between zero and the sequence\'s length,\n inclusive. Finally, the sequence object is asked to replace the\n slice with the items of the assigned sequence. The length of the\n slice may be different from the length of the assigned sequence,\n thus changing the length of the target sequence, if the object\n allows it.\n\n**CPython implementation detail:** In the current implementation, the\nsyntax for targets is taken to be the same as for expressions, and\ninvalid syntax is rejected during the code generation phase, causing\nless detailed error messages.\n\nWARNING: Although the definition of assignment implies that overlaps\nbetween the left-hand side and the right-hand side are \'safe\' (for\nexample ``a, b = b, a`` swaps two variables), overlaps *within* the\ncollection of assigned-to variables are not safe! For instance, the\nfollowing program prints ``[0, 2]``:\n\n x = [0, 1]\n i = 0\n i, x[i] = 1, 2\n print(x)\n\nSee also:\n\n **PEP 3132** - Extended Iterable Unpacking\n The specification for the ``*target`` feature.\n\n\nAugmented assignment statements\n===============================\n\nAugmented assignment is the combination, in a single statement, of a\nbinary operation and an assignment statement:\n\n augmented_assignment_stmt ::= augtarget augop (expression_list | yield_expression)\n augtarget ::= identifier | attributeref | subscription | slicing\n augop ::= "+=" | "-=" | "*=" | "/=" | "//=" | "%=" | "**="\n | ">>=" | "<<=" | "&=" | "^=" | "|="\n\n(See section *Primaries* for the syntax definitions for the last three\nsymbols.)\n\nAn augmented assignment evaluates the target (which, unlike normal\nassignment statements, cannot be an unpacking) and the expression\nlist, performs the binary operation specific to the type of assignment\non the two operands, and assigns the result to the original target.\nThe target is only evaluated once.\n\nAn augmented assignment expression like ``x += 1`` can be rewritten as\n``x = x + 1`` to achieve a similar, but not exactly equal effect. In\nthe augmented version, ``x`` is only evaluated once. Also, when\npossible, the actual operation is performed *in-place*, meaning that\nrather than creating a new object and assigning that to the target,\nthe old object is modified instead.\n\nWith the exception of assigning to tuples and multiple targets in a\nsingle statement, the assignment done by augmented assignment\nstatements is handled the same way as normal assignments. 
Similarly,\nwith the exception of the possible *in-place* behavior, the binary\noperation performed by augmented assignment is the same as the normal\nbinary operations.\n\nFor targets which are attribute references, the same *caveat about\nclass and instance attributes* applies as for regular assignments.\n', 'atom-identifiers': '\nIdentifiers (Names)\n*******************\n\nAn identifier occurring as an atom is a name. See section\n*Identifiers and keywords* for lexical definition and section *Naming\nand binding* for documentation of naming and binding.\n\nWhen the name is bound to an object, evaluation of the atom yields\nthat object. When a name is not bound, an attempt to evaluate it\nraises a ``NameError`` exception.\n\n**Private name mangling:** When an identifier that textually occurs in\na class definition begins with two or more underscore characters and\ndoes not end in two or more underscores, it is considered a *private\nname* of that class. Private names are transformed to a longer form\nbefore code is generated for them. The transformation inserts the\nclass name in front of the name, with leading underscores removed, and\na single underscore inserted in front of the class name. For example,\nthe identifier ``__spam`` occurring in a class named ``Ham`` will be\ntransformed to ``_Ham__spam``. This transformation is independent of\nthe syntactical context in which the identifier is used. If the\ntransformed name is extremely long (longer than 255 characters),\nimplementation defined truncation may happen. If the class name\nconsists only of underscores, no transformation is done.\n', From python-checkins at python.org Sun Feb 20 11:29:04 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 11:29:04 +0100 (CET) Subject: [Python-checkins] r88445 - in python/branches/py3k: Include/patchlevel.h Lib/distutils/__init__.py Lib/idlelib/idlever.py Misc/NEWS Misc/RPM/python-3.2.spec README Message-ID: <20110220102904.A1961EE981@mail.python.org> Author: georg.brandl Date: Sun Feb 20 11:29:04 2011 New Revision: 88445 Log: Version bump to 3.2 final. Modified: python/branches/py3k/Include/patchlevel.h python/branches/py3k/Lib/distutils/__init__.py python/branches/py3k/Lib/idlelib/idlever.py python/branches/py3k/Misc/NEWS python/branches/py3k/Misc/RPM/python-3.2.spec python/branches/py3k/README Modified: python/branches/py3k/Include/patchlevel.h ============================================================================== --- python/branches/py3k/Include/patchlevel.h (original) +++ python/branches/py3k/Include/patchlevel.h Sun Feb 20 11:29:04 2011 @@ -19,11 +19,11 @@ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 2 #define PY_MICRO_VERSION 0 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 3 +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL +#define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.2rc3+" +#define PY_VERSION "3.2" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository) */ Modified: python/branches/py3k/Lib/distutils/__init__.py ============================================================================== --- python/branches/py3k/Lib/distutils/__init__.py (original) +++ python/branches/py3k/Lib/distutils/__init__.py Sun Feb 20 11:29:04 2011 @@ -15,5 +15,5 @@ # Updated automatically by the Python release process. 
# #--start constants-- -__version__ = "3.2rc3" +__version__ = "3.2" #--end constants-- Modified: python/branches/py3k/Lib/idlelib/idlever.py ============================================================================== --- python/branches/py3k/Lib/idlelib/idlever.py (original) +++ python/branches/py3k/Lib/idlelib/idlever.py Sun Feb 20 11:29:04 2011 @@ -1 +1 @@ -IDLE_VERSION = "3.2rc3" +IDLE_VERSION = "3.2" Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sun Feb 20 11:29:04 2011 @@ -12,9 +12,6 @@ - Issue #11249: Fix potential crashes when using the limited API. -Library -------- - Build ----- @@ -42,40 +39,40 @@ - Issue #11135: Remove redundant doc field from PyType_Spec. -- Issue #11067: Add PyType_GetFlags, to support PyUnicode_Check - in the limited ABI. +- Issue #11067: Add PyType_GetFlags, to support PyUnicode_Check in the limited + ABI. - Issue #11118: Fix bogus export of None in python3.dll. Library ------- -- Issue #11116: any error during addition of a message to a mailbox now causes - a rollback, instead of leaving the mailbox partially modified. +- Issue #11116: any error during addition of a message to a mailbox now causes a + rollback, instead of leaving the mailbox partially modified. -- Issue #11132: Fix passing of "optimize" parameter when recursing - in compileall.compile_dir(). +- Issue #11132: Fix passing of "optimize" parameter when recursing in + compileall.compile_dir(). - Issue #11110: Fix a potential decref of a NULL in sqlite3. -- Issue #8275: Fix passing of callback arguments with ctypes under Win64. - Patch by Stan Mihai. +- Issue #8275: Fix passing of callback arguments with ctypes under Win64. Patch + by Stan Mihai. Build ----- -- Issue #11079: The /Applications/Python x.x folder created by the Mac - OS X installers now includes a link to the installed documentation - and no longer includes an Extras directory. The Tools directory is - now installed in the framework under share/doc. +- Issue #11079: The /Applications/Python x.x folder created by the Mac OS X + installers now includes a link to the installed documentation and no longer + includes an Extras directory. The Tools directory is now installed in the + framework under share/doc. - Issue #11121: Fix building with --enable-shared. Tests ----- -- Issue #10971: test_zipimport_support is once again compatible with the - refleak hunter feature of test.regrtest. +- Issue #10971: test_zipimport_support is once again compatible with the refleak + hunter feature of test.regrtest. What's New in Python 3.2 Release Candidate 2? 
Modified: python/branches/py3k/Misc/RPM/python-3.2.spec ============================================================================== --- python/branches/py3k/Misc/RPM/python-3.2.spec (original) +++ python/branches/py3k/Misc/RPM/python-3.2.spec Sun Feb 20 11:29:04 2011 @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 3.2rc3 +%define version 3.2 %define libvers 3.2 #--end constants-- %define release 1pydotorg Modified: python/branches/py3k/README ============================================================================== --- python/branches/py3k/README (original) +++ python/branches/py3k/README Sun Feb 20 11:29:04 2011 @@ -1,9 +1,8 @@ -This is Python version 3.2, release candidate 3 -=============================================== +This is Python version 3.2 +========================== Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 -Python Software Foundation. -All rights reserved. +Python Software Foundation. All rights reserved. Python 3.x is a new version of the language, which is incompatible with the 2.x line of releases. The language is mostly the same, but many details, especially @@ -153,8 +152,8 @@ ------------------------- If you have a proposal to change Python, you may want to send an email to the -comp.lang.python or python-ideas mailing lists for inital feedback. A Python -Enhancement Proposal (PEP) may be submitted if your idea gains ground. All +comp.lang.python or python-ideas mailing lists for inital feedback. A Python +Enhancement Proposal (PEP) may be submitted if your idea gains ground. All current PEPs, as well as guidelines for submitting a new PEP, are listed at http://www.python.org/dev/peps/. @@ -168,26 +167,21 @@ Copyright and License Information --------------------------------- -Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010 -Python Software Foundation. -All rights reserved. +Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 +Python Software Foundation. All rights reserved. -Copyright (c) 2000 BeOpen.com. -All rights reserved. +Copyright (c) 2000 BeOpen.com. All rights reserved. -Copyright (c) 1995-2001 Corporation for National Research Initiatives. -All rights reserved. +Copyright (c) 1995-2001 Corporation for National Research Initiatives. All +rights reserved. -Copyright (c) 1991-1995 Stichting Mathematisch Centrum. -All rights reserved. +Copyright (c) 1991-1995 Stichting Mathematisch Centrum. All rights reserved. -See the file "LICENSE" for information on the history of this -software, terms & conditions for usage, and a DISCLAIMER OF ALL -WARRANTIES. +See the file "LICENSE" for information on the history of this software, terms & +conditions for usage, and a DISCLAIMER OF ALL WARRANTIES. -This Python distribution contains *no* GNU General Public License -(GPL) code, so it may be used in proprietary projects. There are -interfaces to some GNU code but these are entirely optional. +This Python distribution contains *no* GNU General Public License (GPL) code, so +it may be used in proprietary projects. There are interfaces to some GNU code +but these are entirely optional. -All trademarks referenced herein are property of their respective -holders. +All trademarks referenced herein are property of their respective holders. 
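The constants changed by r88445 in Include/patchlevel.h (PY_MAJOR_VERSION, PY_MINOR_VERSION, PY_MICRO_VERSION, PY_RELEASE_LEVEL, PY_RELEASE_SERIAL) are what an installed interpreter reports back through sys.version_info and sys.hexversion. A minimal sketch of reading them at runtime, assuming a 3.2.0 final build (the output in the comments is illustrative):

    # minimal sketch: how the patchlevel.h values surface at runtime
    import sys
    print(sys.version_info)     # sys.version_info(major=3, minor=2, micro=0, releaselevel='final', serial=0)
    print(hex(sys.hexversion))  # 0x30200f0, where the low byte 0xf0 encodes release level "final", serial 0
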
From python-checkins at python.org Sun Feb 20 11:31:59 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 11:31:59 +0100 (CET) Subject: [Python-checkins] r88446 - python/branches/release32-maint Message-ID: <20110220103159.95816EE981@mail.python.org> Author: georg.brandl Date: Sun Feb 20 11:31:59 2011 New Revision: 88446 Log: Make a 3.2 maintenance branch. Added: python/branches/release32-maint/ - copied from r88445, /python/branches/py3k/ From python-checkins at python.org Sun Feb 20 11:33:21 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 11:33:21 +0100 (CET) Subject: [Python-checkins] r88447 - in python/branches/py3k/Misc/RPM: python-3.2.spec python-3.3.spec Message-ID: <20110220103321.E0309EE981@mail.python.org> Author: georg.brandl Date: Sun Feb 20 11:33:21 2011 New Revision: 88447 Log: Bump to 3.3a0. Added: python/branches/py3k/Misc/RPM/python-3.3.spec - copied unchanged from r88445, /python/branches/py3k/Misc/RPM/python-3.2.spec Removed: python/branches/py3k/Misc/RPM/python-3.2.spec Deleted: python/branches/py3k/Misc/RPM/python-3.2.spec ============================================================================== --- python/branches/py3k/Misc/RPM/python-3.2.spec Sun Feb 20 11:33:21 2011 +++ (empty file) @@ -1,390 +0,0 @@ -########################## -# User-modifiable configs -########################## - -# Is the resulting package and the installed binary named "python" or -# "python2"? -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_binsuffix none -%define config_binsuffix 2.6 - -# Build tkinter? "auto" enables it if /usr/bin/wish exists. -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_tkinter no -%define config_tkinter yes -%define config_tkinter auto - -# Use pymalloc? The last line (commented or not) determines wether -# pymalloc is used. -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_pymalloc no -%define config_pymalloc yes - -# Enable IPV6? -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_ipv6 yes -%define config_ipv6 no - -# Build shared libraries or .a library? -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_sharedlib no -%define config_sharedlib yes - -# Location of the HTML directory. 
-%define config_htmldir /var/www/html/python - -################################# -# End of user-modifiable configs -################################# - -%define name python -#--start constants-- -%define version 3.2 -%define libvers 3.2 -#--end constants-- -%define release 1pydotorg -%define __prefix /usr - -# kludge to get around rpm define weirdness -%define ipv6 %(if [ "%{config_ipv6}" = yes ]; then echo --enable-ipv6; else echo --disable-ipv6; fi) -%define pymalloc %(if [ "%{config_pymalloc}" = yes ]; then echo --with-pymalloc; else echo --without-pymalloc; fi) -%define binsuffix %(if [ "%{config_binsuffix}" = none ]; then echo ; else echo "%{config_binsuffix}"; fi) -%define include_tkinter %(if [ \\( "%{config_tkinter}" = auto -a -f /usr/bin/wish \\) -o "%{config_tkinter}" = yes ]; then echo 1; else echo 0; fi) -%define libdirname %(( uname -m | egrep -q '_64$' && [ -d /usr/lib64 ] && echo lib64 ) || echo lib) -%define sharedlib %(if [ "%{config_sharedlib}" = yes ]; then echo --enable-shared; else echo ; fi) -%define include_sharedlib %(if [ "%{config_sharedlib}" = yes ]; then echo 1; else echo 0; fi) - -# detect if documentation is available -%define include_docs %(if [ -f "%{_sourcedir}/html-%{version}.tar.bz2" ]; then echo 1; else echo 0; fi) - -Summary: An interpreted, interactive, object-oriented programming language. -Name: %{name}%{binsuffix} -Version: %{version} -Release: %{release} -License: PSF -Group: Development/Languages -Source: Python-%{version}.tar.bz2 -%if %{include_docs} -Source1: html-%{version}.tar.bz2 -%endif -BuildRoot: %{_tmppath}/%{name}-%{version}-root -BuildPrereq: expat-devel -BuildPrereq: db4-devel -BuildPrereq: gdbm-devel -BuildPrereq: sqlite-devel -Prefix: %{__prefix} -Packager: Sean Reifschneider - -%description -Python is an interpreted, interactive, object-oriented programming -language. It incorporates modules, exceptions, dynamic typing, very high -level dynamic data types, and classes. Python combines remarkable power -with very clear syntax. It has interfaces to many system calls and -libraries, as well as to various window systems, and is extensible in C or -C++. It is also usable as an extension language for applications that need -a programmable interface. Finally, Python is portable: it runs on many -brands of UNIX, on PCs under Windows, MS-DOS, and OS/2, and on the -Mac. - -%package devel -Summary: The libraries and header files needed for Python extension development. -Prereq: python%{binsuffix} = %{PACKAGE_VERSION} -Group: Development/Libraries - -%description devel -The Python programming language's interpreter can be extended with -dynamically loaded extensions and can be embedded in other programs. -This package contains the header files and libraries needed to do -these types of tasks. - -Install python-devel if you want to develop Python extensions. The -python package will also need to be installed. You'll probably also -want to install the python-docs package, which contains Python -documentation. - -%if %{include_tkinter} -%package tkinter -Summary: A graphical user interface for the Python scripting language. -Group: Development/Languages -Prereq: python%{binsuffix} = %{PACKAGE_VERSION}-%{release} - -%description tkinter -The Tkinter (Tk interface) program is an graphical user interface for -the Python scripting language. - -You should install the tkinter package if you'd like to use a graphical -user interface for Python programming. -%endif - -%package tools -Summary: A collection of development tools included with Python. 
-Group: Development/Tools -Prereq: python%{binsuffix} = %{PACKAGE_VERSION}-%{release} - -%description tools -The Python package includes several development tools that are used -to build python programs. This package contains a selection of those -tools, including the IDLE Python IDE. - -Install python-tools if you want to use these tools to develop -Python programs. You will also need to install the python and -tkinter packages. - -%if %{include_docs} -%package docs -Summary: Python-related documentation. -Group: Development/Documentation - -%description docs -Documentation relating to the Python programming language in HTML and info -formats. -%endif - -%changelog -* Mon Dec 20 2004 Sean Reifschneider [2.4-2pydotorg] -- Changing the idle wrapper so that it passes arguments to idle. - -* Tue Oct 19 2004 Sean Reifschneider [2.4b1-1pydotorg] -- Updating to 2.4. - -* Thu Jul 22 2004 Sean Reifschneider [2.3.4-3pydotorg] -- Paul Tiemann fixes for %{prefix}. -- Adding permission changes for directory as suggested by reimeika.ca -- Adding code to detect when it should be using lib64. -- Adding a define for the location of /var/www/html for docs. - -* Thu May 27 2004 Sean Reifschneider [2.3.4-2pydotorg] -- Including changes from Ian Holsman to build under Red Hat 7.3. -- Fixing some problems with the /usr/local path change. - -* Sat Mar 27 2004 Sean Reifschneider [2.3.2-3pydotorg] -- Being more agressive about finding the paths to fix for - #!/usr/local/bin/python. - -* Sat Feb 07 2004 Sean Reifschneider [2.3.3-2pydotorg] -- Adding code to remove "#!/usr/local/bin/python" from particular files and - causing the RPM build to terminate if there are any unexpected files - which have that line in them. - -* Mon Oct 13 2003 Sean Reifschneider [2.3.2-1pydotorg] -- Adding code to detect wether documentation is available to build. - -* Fri Sep 19 2003 Sean Reifschneider [2.3.1-1pydotorg] -- Updating to the 2.3.1 release. - -* Mon Feb 24 2003 Sean Reifschneider [2.3b1-1pydotorg] -- Updating to 2.3b1 release. - -* Mon Feb 17 2003 Sean Reifschneider [2.3a1-1] -- Updating to 2.3 release. - -* Sun Dec 23 2001 Sean Reifschneider -[Release 2.2-2] -- Added -docs package. -- Added "auto" config_tkinter setting which only enables tk if - /usr/bin/wish exists. - -* Sat Dec 22 2001 Sean Reifschneider -[Release 2.2-1] -- Updated to 2.2. -- Changed the extension to "2" from "2.2". - -* Tue Nov 18 2001 Sean Reifschneider -[Release 2.2c1-1] -- Updated to 2.2c1. - -* Thu Nov 1 2001 Sean Reifschneider -[Release 2.2b1-3] -- Changed the way the sed for fixing the #! in pydoc works. - -* Wed Oct 24 2001 Sean Reifschneider -[Release 2.2b1-2] -- Fixed missing "email" package, thanks to anonymous report on sourceforge. -- Fixed missing "compiler" package. - -* Mon Oct 22 2001 Sean Reifschneider -[Release 2.2b1-1] -- Updated to 2.2b1. - -* Mon Oct 9 2001 Sean Reifschneider -[Release 2.2a4-4] -- otto at balinor.mat.unimi.it mentioned that the license file is missing. - -* Sun Sep 30 2001 Sean Reifschneider -[Release 2.2a4-3] -- Ignacio Vazquez-Abrams pointed out that I had a spruious double-quote in - the spec files. Thanks. - -* Wed Jul 25 2001 Sean Reifschneider -[Release 2.2a1-1] -- Updated to 2.2a1 release. 
-- Changed idle and pydoc to use binsuffix macro - -####### -# PREP -####### -%prep -%setup -n Python-%{version} - -######## -# BUILD -######## -%build -echo "Setting for ipv6: %{ipv6}" -echo "Setting for pymalloc: %{pymalloc}" -echo "Setting for binsuffix: %{binsuffix}" -echo "Setting for include_tkinter: %{include_tkinter}" -echo "Setting for libdirname: %{libdirname}" -echo "Setting for sharedlib: %{sharedlib}" -echo "Setting for include_sharedlib: %{include_sharedlib}" -./configure --enable-unicode=ucs4 %{sharedlib} %{ipv6} %{pymalloc} --prefix=%{__prefix} -make - -########## -# INSTALL -########## -%install -# set the install path -echo '[install_scripts]' >setup.cfg -echo 'install_dir='"${RPM_BUILD_ROOT}%{__prefix}/bin" >>setup.cfg - -[ -d "$RPM_BUILD_ROOT" -a "$RPM_BUILD_ROOT" != "/" ] && rm -rf $RPM_BUILD_ROOT -mkdir -p $RPM_BUILD_ROOT%{__prefix}/%{libdirname}/python%{libvers}/lib-dynload -make prefix=$RPM_BUILD_ROOT%{__prefix} install - -# REPLACE PATH IN PYDOC -if [ ! -z "%{binsuffix}" ] -then - ( - cd $RPM_BUILD_ROOT%{__prefix}/bin - mv pydoc pydoc.old - sed 's|#!.*|#!%{__prefix}/bin/env python'%{binsuffix}'|' \ - pydoc.old >pydoc - chmod 755 pydoc - rm -f pydoc.old - ) -fi - -# add the binsuffix -if [ ! -z "%{binsuffix}" ] -then - rm -f $RPM_BUILD_ROOT%{__prefix}/bin/python[0-9a-zA-Z]* - ( cd $RPM_BUILD_ROOT%{__prefix}/bin; - for file in *; do mv "$file" "$file"%{binsuffix}; done ) - ( cd $RPM_BUILD_ROOT%{_mandir}/man1; mv python.1 python%{binsuffix}.1 ) -fi - -######## -# Tools -echo '#!%{__prefix}/bin/env python%{binsuffix}' >${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'import os, sys' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'os.execvp("%{__prefix}/bin/python%{binsuffix}", ["%{__prefix}/bin/python%{binsuffix}", "%{__prefix}/lib/python%{libvers}/idlelib/idle.py"] + sys.argv[1:])' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'print "Failed to exec Idle"' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'sys.exit(1)' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -chmod 755 $RPM_BUILD_ROOT%{__prefix}/bin/idle%{binsuffix} -cp -a Tools $RPM_BUILD_ROOT%{__prefix}/%{libdirname}/python%{libvers} - -# MAKE FILE LISTS -rm -f mainpkg.files -find "$RPM_BUILD_ROOT""%{__prefix}"/%{libdirname}/python%{libvers} -type f | - sed "s|^${RPM_BUILD_ROOT}|/|" | - grep -v -e '/python%{libvers}/config$' -e '_tkinter.so$' >mainpkg.files -find "$RPM_BUILD_ROOT""%{__prefix}"/bin -type f -o -type l | - sed "s|^${RPM_BUILD_ROOT}|/|" | - grep -v -e '/bin/2to3%{binsuffix}$' | - grep -v -e '/bin/pydoc%{binsuffix}$' | - grep -v -e '/bin/smtpd.py%{binsuffix}$' | - grep -v -e '/bin/idle%{binsuffix}$' >>mainpkg.files - -rm -f tools.files -find "$RPM_BUILD_ROOT""%{__prefix}"/%{libdirname}/python%{libvers}/idlelib \ - "$RPM_BUILD_ROOT""%{__prefix}"/%{libdirname}/python%{libvers}/Tools -type f | - sed "s|^${RPM_BUILD_ROOT}|/|" >tools.files -echo "%{__prefix}"/bin/2to3%{binsuffix} >>tools.files -echo "%{__prefix}"/bin/pydoc%{binsuffix} >>tools.files -echo "%{__prefix}"/bin/smtpd.py%{binsuffix} >>tools.files -echo "%{__prefix}"/bin/idle%{binsuffix} >>tools.files - -###### -# Docs -%if %{include_docs} -mkdir -p "$RPM_BUILD_ROOT"%{config_htmldir} -( - cd "$RPM_BUILD_ROOT"%{config_htmldir} - bunzip2 < %{SOURCE1} | tar x -) -%endif - -# fix the #! 
line in installed files -find "$RPM_BUILD_ROOT" -type f -print0 | - xargs -0 grep -l /usr/local/bin/python | while read file -do - FIXFILE="$file" - sed 's|^#!.*python|#!%{__prefix}/bin/env python'"%{binsuffix}"'|' \ - "$FIXFILE" >/tmp/fix-python-path.$$ - cat /tmp/fix-python-path.$$ >"$FIXFILE" - rm -f /tmp/fix-python-path.$$ -done - -# check to see if there are any straggling #! lines -find "$RPM_BUILD_ROOT" -type f | xargs egrep -n '^#! */usr/local/bin/python' \ - | grep ':1:#!' >/tmp/python-rpm-files.$$ || true -if [ -s /tmp/python-rpm-files.$$ ] -then - echo '*****************************************************' - cat /tmp/python-rpm-files.$$ - cat <<@EOF - ***************************************************** - There are still files referencing /usr/local/bin/python in the - install directory. They are listed above. Please fix the .spec - file and try again. If you are an end-user, you probably want - to report this to jafo-rpms at tummy.com as well. - ***************************************************** - at EOF - rm -f /tmp/python-rpm-files.$$ - exit 1 -fi -rm -f /tmp/python-rpm-files.$$ - -######## -# CLEAN -######## -%clean -[ -n "$RPM_BUILD_ROOT" -a "$RPM_BUILD_ROOT" != / ] && rm -rf $RPM_BUILD_ROOT -rm -f mainpkg.files tools.files - -######## -# FILES -######## -%files -f mainpkg.files -%defattr(-,root,root) -%doc Misc/README Misc/cheatsheet Misc/Porting -%doc LICENSE Misc/ACKS Misc/HISTORY Misc/NEWS -%{_mandir}/man1/python%{binsuffix}.1* - -%attr(755,root,root) %dir %{__prefix}/include/python%{libvers} -%attr(755,root,root) %dir %{__prefix}/%{libdirname}/python%{libvers}/ -%if %{include_sharedlib} -%{__prefix}/%{libdirname}/libpython* -%endif - -%files devel -%defattr(-,root,root) -%{__prefix}/include/python%{libvers}/*.h -%{__prefix}/%{libdirname}/python%{libvers}/config - -%files -f tools.files tools -%defattr(-,root,root) - -%if %{include_tkinter} -%files tkinter -%defattr(-,root,root) -%{__prefix}/%{libdirname}/python%{libvers}/tkinter -%{__prefix}/%{libdirname}/python%{libvers}/lib-dynload/_tkinter.so* -%endif - -%if %{include_docs} -%files docs -%defattr(-,root,root) -%{config_htmldir}/* -%endif From python-checkins at python.org Sun Feb 20 11:37:07 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 11:37:07 +0100 (CET) Subject: [Python-checkins] r88448 - in python/branches/py3k: Doc/license.rst Doc/tutorial/interpreter.rst Doc/tutorial/stdlib.rst Doc/tutorial/stdlib2.rst Include/patchlevel.h LICENSE Lib/distutils/__init__.py Lib/idlelib/idlever.py Misc/NEWS Misc/RPM/python-3.3.spec README configure configure.in Message-ID: <20110220103707.5FC3DEE984@mail.python.org> Author: georg.brandl Date: Sun Feb 20 11:37:07 2011 New Revision: 88448 Log: Bump trunk to 3.3 alpha 0. 
Modified: python/branches/py3k/Doc/license.rst python/branches/py3k/Doc/tutorial/interpreter.rst python/branches/py3k/Doc/tutorial/stdlib.rst python/branches/py3k/Doc/tutorial/stdlib2.rst python/branches/py3k/Include/patchlevel.h python/branches/py3k/LICENSE python/branches/py3k/Lib/distutils/__init__.py python/branches/py3k/Lib/idlelib/idlever.py python/branches/py3k/Misc/NEWS python/branches/py3k/Misc/RPM/python-3.3.spec python/branches/py3k/README python/branches/py3k/configure python/branches/py3k/configure.in Modified: python/branches/py3k/Doc/license.rst ============================================================================== --- python/branches/py3k/Doc/license.rst (original) +++ python/branches/py3k/Doc/license.rst Sun Feb 20 11:37:07 2011 @@ -110,6 +110,8 @@ +----------------+--------------+------------+------------+-----------------+ | 3.2 | 3.1 | 2011 | PSF | yes | +----------------+--------------+------------+------------+-----------------+ +| 3.3 | 3.2 | 2012 | PSF | yes | ++----------------+--------------+------------+------------+-----------------+ .. note:: Modified: python/branches/py3k/Doc/tutorial/interpreter.rst ============================================================================== --- python/branches/py3k/Doc/tutorial/interpreter.rst (original) +++ python/branches/py3k/Doc/tutorial/interpreter.rst Sun Feb 20 11:37:07 2011 @@ -10,11 +10,11 @@ Invoking the Interpreter ======================== -The Python interpreter is usually installed as :file:`/usr/local/bin/python3.2` +The Python interpreter is usually installed as :file:`/usr/local/bin/python3.3` on those machines where it is available; putting :file:`/usr/local/bin` in your Unix shell's search path makes it possible to start it by typing the command :: - python3.2 + python3.3 to the shell. [#]_ Since the choice of the directory where the interpreter lives is an installation option, other places are possible; check with your local @@ -22,11 +22,11 @@ popular alternative location.) On Windows machines, the Python installation is usually placed in -:file:`C:\\Python32`, though you can change this when you're running the +:file:`C:\\Python33`, though you can change this when you're running the installer. To add this directory to your path, you can type the following command into the command prompt in a DOS box:: - set path=%path%;C:\python32 + set path=%path%;C:\python33 Typing an end-of-file character (:kbd:`Control-D` on Unix, :kbd:`Control-Z` on Windows) at the primary prompt causes the interpreter to exit with a zero exit @@ -94,8 +94,8 @@ prints a welcome message stating its version number and a copyright notice before printing the first prompt:: - $ python3.2 - Python 3.2 (py3k, Sep 12 2007, 12:21:02) + $ python3.3 + Python 3.3 (py3k, Sep 12 2007, 12:21:02) [GCC 3.4.6 20060404 (Red Hat 3.4.6-8)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> @@ -148,7 +148,7 @@ On BSD'ish Unix systems, Python scripts can be made directly executable, like shell scripts, by putting the line :: - #! /usr/bin/env python3.2 + #! /usr/bin/env python3.3 (assuming that the interpreter is on the user's :envvar:`PATH`) at the beginning of the script and giving the file an executable mode. 
The ``#!`` must be the Modified: python/branches/py3k/Doc/tutorial/stdlib.rst ============================================================================== --- python/branches/py3k/Doc/tutorial/stdlib.rst (original) +++ python/branches/py3k/Doc/tutorial/stdlib.rst Sun Feb 20 11:37:07 2011 @@ -15,7 +15,7 @@ >>> import os >>> os.getcwd() # Return the current working directory - 'C:\\Python31' + 'C:\\Python33' >>> os.chdir('/server/accesslogs') # Change current working directory >>> os.system('mkdir today') # Run the command mkdir in the system shell 0 Modified: python/branches/py3k/Doc/tutorial/stdlib2.rst ============================================================================== --- python/branches/py3k/Doc/tutorial/stdlib2.rst (original) +++ python/branches/py3k/Doc/tutorial/stdlib2.rst Sun Feb 20 11:37:07 2011 @@ -271,7 +271,7 @@ Traceback (most recent call last): File "", line 1, in d['primary'] # entry was automatically removed - File "C:/python31/lib/weakref.py", line 46, in __getitem__ + File "C:/python33/lib/weakref.py", line 46, in __getitem__ o = self.data[key]() KeyError: 'primary' Modified: python/branches/py3k/Include/patchlevel.h ============================================================================== --- python/branches/py3k/Include/patchlevel.h (original) +++ python/branches/py3k/Include/patchlevel.h Sun Feb 20 11:37:07 2011 @@ -17,13 +17,13 @@ /* Version parsed out into numeric values */ /*--start constants--*/ #define PY_MAJOR_VERSION 3 -#define PY_MINOR_VERSION 2 +#define PY_MINOR_VERSION 3 #define PY_MICRO_VERSION 0 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_ALPHA #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.2" +#define PY_VERSION "3.3a0" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository) */ Modified: python/branches/py3k/LICENSE ============================================================================== --- python/branches/py3k/LICENSE (original) +++ python/branches/py3k/LICENSE Sun Feb 20 11:37:07 2011 @@ -69,6 +69,7 @@ 3.1.1 3.1 2009 PSF yes 3.1.2 3.1 2010 PSF yes 3.2 3.1 2011 PSF yes + 3.3 3.2 2012 PSF yes Footnotes: Modified: python/branches/py3k/Lib/distutils/__init__.py ============================================================================== --- python/branches/py3k/Lib/distutils/__init__.py (original) +++ python/branches/py3k/Lib/distutils/__init__.py Sun Feb 20 11:37:07 2011 @@ -15,5 +15,5 @@ # Updated automatically by the Python release process. # #--start constants-- -__version__ = "3.2" +__version__ = "3.3a0" #--end constants-- Modified: python/branches/py3k/Lib/idlelib/idlever.py ============================================================================== --- python/branches/py3k/Lib/idlelib/idlever.py (original) +++ python/branches/py3k/Lib/idlelib/idlever.py Sun Feb 20 11:37:07 2011 @@ -1 +1 @@ -IDLE_VERSION = "3.2" +IDLE_VERSION = "3.3a0" Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sun Feb 20 11:37:07 2011 @@ -2,6 +2,12 @@ Python News +++++++++++ +What's New in Python 3.3 Alpha 1? +================================= + +*Release date: XX-XXX-20XX* + + What's New in Python 3.2? 
========================= Modified: python/branches/py3k/Misc/RPM/python-3.3.spec ============================================================================== --- python/branches/py3k/Misc/RPM/python-3.3.spec (original) +++ python/branches/py3k/Misc/RPM/python-3.3.spec Sun Feb 20 11:37:07 2011 @@ -39,8 +39,8 @@ %define name python #--start constants-- -%define version 3.2 -%define libvers 3.2 +%define version 3.3a0 +%define libvers 3.3 #--end constants-- %define release 1pydotorg %define __prefix /usr Modified: python/branches/py3k/README ============================================================================== --- python/branches/py3k/README (original) +++ python/branches/py3k/README Sun Feb 20 11:37:07 2011 @@ -1,5 +1,5 @@ -This is Python version 3.2 -========================== +This is Python version 3.3 alpha 0 +================================== Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 Python Software Foundation. All rights reserved. Modified: python/branches/py3k/configure ============================================================================== --- python/branches/py3k/configure (original) +++ python/branches/py3k/configure Sun Feb 20 11:37:07 2011 @@ -1,7 +1,7 @@ #! /bin/sh -# From configure.in Revision: 88430 . +# From configure.in Revision: 88440 . # Guess values for system-dependent variables and create Makefiles. -# Generated by GNU Autoconf 2.68 for python 3.2. +# Generated by GNU Autoconf 2.68 for python 3.3. # # Report bugs to . # @@ -561,8 +561,8 @@ # Identity of this package. PACKAGE_NAME='python' PACKAGE_TARNAME='python' -PACKAGE_VERSION='3.2' -PACKAGE_STRING='python 3.2' +PACKAGE_VERSION='3.3' +PACKAGE_STRING='python 3.3' PACKAGE_BUGREPORT='http://bugs.python.org/' PACKAGE_URL='' @@ -1317,7 +1317,7 @@ # Omit some internal or obsolete options to make the list less imposing. # This message is too long to be a string in the A/UX 3.1 sh. cat <<_ACEOF -\`configure' configures python 3.2 to adapt to many kinds of systems. +\`configure' configures python 3.3 to adapt to many kinds of systems. Usage: $0 [OPTION]... [VAR=VALUE]... @@ -1378,7 +1378,7 @@ if test -n "$ac_init_help"; then case $ac_init_help in - short | recursive ) echo "Configuration of python 3.2:";; + short | recursive ) echo "Configuration of python 3.3:";; esac cat <<\_ACEOF @@ -1516,7 +1516,7 @@ test -n "$ac_init_help" && exit $ac_status if $ac_init_version; then cat <<\_ACEOF -python configure 3.2 +python configure 3.3 generated by GNU Autoconf 2.68 Copyright (C) 2010 Free Software Foundation, Inc. @@ -2347,7 +2347,7 @@ This file contains any messages produced by compilers while running configure, to aid debugging if configure makes a mistake. -It was created by python $as_me 3.2, which was +It was created by python $as_me 3.3, which was generated by GNU Autoconf 2.68. Invocation command line was $ $0 $@ @@ -2714,7 +2714,7 @@ mv confdefs.h.new confdefs.h -VERSION=3.2 +VERSION=3.3 # Version number of Python's own shared library file. @@ -14311,7 +14311,7 @@ # report actual input values of CONFIG_FILES etc. instead of their # values after options handling. ac_log=" -This file was extended by python $as_me 3.2, which was +This file was extended by python $as_me 3.3, which was generated by GNU Autoconf 2.68. 
Invocation command line was CONFIG_FILES = $CONFIG_FILES @@ -14373,7 +14373,7 @@ cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" ac_cs_version="\\ -python config.status 3.2 +python config.status 3.3 configured by $0, generated by GNU Autoconf 2.68, with options \\"\$ac_cs_config\\" Modified: python/branches/py3k/configure.in ============================================================================== --- python/branches/py3k/configure.in (original) +++ python/branches/py3k/configure.in Sun Feb 20 11:37:07 2011 @@ -3,7 +3,7 @@ dnl *********************************************** # Set VERSION so we only need to edit in one place (i.e., here) -m4_define(PYTHON_VERSION, 3.2) +m4_define(PYTHON_VERSION, 3.3) dnl Some m4 magic to ensure that the configure script is generated dnl by the correct autoconf version. From python-checkins at python.org Sun Feb 20 11:41:32 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 11:41:32 +0100 (CET) Subject: [Python-checkins] r88449 - in python/branches/py3k: PC/VC6/pythoncore.dsp PC/VC6/readme.txt PC/VS7.1/pythoncore.vcproj PC/VS7.1/readme.txt PC/VS8.0/build_ssl.bat PC/VS8.0/kill_python.c PC/VS8.0/pyproject.vsprops PC/example_nt/example.vcproj PC/os2emx/Makefile PC/os2emx/README.os2emx PC/os2emx/python27.def PC/os2emx/python33.def PC/pyconfig.h PC/python3.def PC/python3.mak PC/python32gen.py PC/python32stub.def PC/python33gen.py PC/python33stub.def PCbuild/build_ssl.bat PCbuild/kill_python.c PCbuild/pyproject.vsprops PCbuild/readme.txt README Message-ID: <20110220104132.4D741EE984@mail.python.org> Author: georg.brandl Date: Sun Feb 20 11:41:31 2011 New Revision: 88449 Log: More automated version replacement. 
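    (Illustrative aside, not part of any checkin: the constants bumped in Include/patchlevel.h above combine into the new PY_VERSION string roughly as follows; the Python names below simply mirror the C macros.)

        # Sketch only -- mirrors PY_MAJOR_VERSION 3, PY_MINOR_VERSION 3,
        # PY_MICRO_VERSION 0, PY_RELEASE_LEVEL_ALPHA, PY_RELEASE_SERIAL 0
        # from the patchlevel.h diff above; not code from any checkin.
        MAJOR, MINOR, MICRO = 3, 3, 0
        LEVEL, SERIAL = 'alpha', 0
        SUFFIX = {'alpha': 'a', 'beta': 'b', 'candidate': 'rc', 'final': ''}

        version = '{0}.{1}'.format(MAJOR, MINOR)
        if MICRO:
            version += '.{0}'.format(MICRO)
        if LEVEL != 'final':
            version += '{0}{1}'.format(SUFFIX[LEVEL], SERIAL)

        print(version)   # 3.3a0, matching the new PY_VERSION; 3.2.1a0 follows
                         # the same pattern with MICRO = 1 (see r88462 below)
    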
Added: python/branches/py3k/PC/os2emx/python33.def - copied, changed from r88444, /python/branches/py3k/PC/os2emx/python27.def python/branches/py3k/PC/python33gen.py - copied, changed from r88444, /python/branches/py3k/PC/python32gen.py python/branches/py3k/PC/python33stub.def - copied, changed from r88444, /python/branches/py3k/PC/python32stub.def Removed: python/branches/py3k/PC/os2emx/python27.def python/branches/py3k/PC/python32gen.py python/branches/py3k/PC/python32stub.def Modified: python/branches/py3k/PC/VC6/pythoncore.dsp python/branches/py3k/PC/VC6/readme.txt python/branches/py3k/PC/VS7.1/pythoncore.vcproj python/branches/py3k/PC/VS7.1/readme.txt python/branches/py3k/PC/VS8.0/build_ssl.bat python/branches/py3k/PC/VS8.0/kill_python.c python/branches/py3k/PC/VS8.0/pyproject.vsprops python/branches/py3k/PC/example_nt/example.vcproj python/branches/py3k/PC/os2emx/Makefile python/branches/py3k/PC/os2emx/README.os2emx python/branches/py3k/PC/pyconfig.h python/branches/py3k/PC/python3.def python/branches/py3k/PC/python3.mak python/branches/py3k/PCbuild/build_ssl.bat python/branches/py3k/PCbuild/kill_python.c python/branches/py3k/PCbuild/pyproject.vsprops python/branches/py3k/PCbuild/readme.txt python/branches/py3k/README Modified: python/branches/py3k/PC/VC6/pythoncore.dsp ============================================================================== --- python/branches/py3k/PC/VC6/pythoncore.dsp (original) +++ python/branches/py3k/PC/VC6/pythoncore.dsp Sun Feb 20 11:41:31 2011 @@ -54,7 +54,7 @@ # ADD BSC32 /nologo LINK32=link.exe # ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /dll /machine:I386 -# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python32.dll" +# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python33.dll" # SUBTRACT LINK32 /pdb:none !ELSEIF "$(CFG)" == "pythoncore - Win32 Debug" @@ -82,7 +82,7 @@ # ADD BSC32 /nologo LINK32=link.exe # ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /dll /debug /machine:I386 /pdbtype:sept -# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python32_d.dll" /pdbtype:sept +# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python33_d.dll" /pdbtype:sept # SUBTRACT LINK32 /pdb:none !ENDIF Modified: python/branches/py3k/PC/VC6/readme.txt ============================================================================== --- python/branches/py3k/PC/VC6/readme.txt (original) +++ python/branches/py3k/PC/VC6/readme.txt Sun Feb 20 11:41:31 2011 @@ -12,7 +12,7 @@ The proper order to build subprojects: 1) pythoncore (this builds the main Python DLL and library files, - python32.{dll, lib} in Release mode) + python33.{dll, lib} in Release mode) 2) python (this builds the main Python executable, python.exe in Release mode) @@ -23,7 +23,7 @@ to the subsystems they implement; see SUBPROJECTS 
below) When using the Debug setting, the output files have a _d added to -their name: python32_d.dll, python_d.exe, pyexpat_d.pyd, and so on. +their name: python33_d.dll, python_d.exe, pyexpat_d.pyd, and so on. SUBPROJECTS ----------- Modified: python/branches/py3k/PC/VS7.1/pythoncore.vcproj ============================================================================== --- python/branches/py3k/PC/VS7.1/pythoncore.vcproj (original) +++ python/branches/py3k/PC/VS7.1/pythoncore.vcproj Sun Feb 20 11:41:31 2011 @@ -39,15 +39,15 @@ @@ -99,15 +99,15 @@ @@ -166,15 +166,15 @@ Name="VCLinkerTool" AdditionalOptions=" /MACHINE:IA64 /USELINK:MS_SDK" AdditionalDependencies="getbuildinfo.o" - OutputFile="./python32.dll" + OutputFile="./python33.dll" LinkIncremental="1" SuppressStartupBanner="FALSE" IgnoreDefaultLibraryNames="libc" GenerateDebugInformation="TRUE" - ProgramDatabaseFile=".\./python32.pdb" + ProgramDatabaseFile=".\./python33.pdb" SubSystem="2" BaseAddress="0x1e000000" - ImportLibrary=".\./python32.lib" + ImportLibrary=".\./python33.lib" TargetMachine="0"/> @@ -233,15 +233,15 @@ Name="VCLinkerTool" AdditionalOptions=" /MACHINE:AMD64 /USELINK:MS_SDK" AdditionalDependencies="getbuildinfo.o" - OutputFile="./python32.dll" + OutputFile="./python33.dll" LinkIncremental="1" SuppressStartupBanner="TRUE" IgnoreDefaultLibraryNames="libc" GenerateDebugInformation="TRUE" - ProgramDatabaseFile=".\./python32.pdb" + ProgramDatabaseFile=".\./python33.pdb" SubSystem="2" BaseAddress="0x1e000000" - ImportLibrary=".\./python32.lib" + ImportLibrary=".\./python33.lib" TargetMachine="0"/> Modified: python/branches/py3k/PC/VS7.1/readme.txt ============================================================================== --- python/branches/py3k/PC/VS7.1/readme.txt (original) +++ python/branches/py3k/PC/VS7.1/readme.txt Sun Feb 20 11:41:31 2011 @@ -12,7 +12,7 @@ The proper order to build subprojects: 1) pythoncore (this builds the main Python DLL and library files, - python26.{dll, lib} in Release mode) + python33.{dll, lib} in Release mode) NOTE: in previous releases, this subproject was named after the release number, e.g. python20. @@ -26,7 +26,7 @@ test slave; see SUBPROJECTS below) When using the Debug setting, the output files have a _d added to -their name: python26_d.dll, python_d.exe, parser_d.pyd, and so on. +their name: python33_d.dll, python_d.exe, parser_d.pyd, and so on. SUBPROJECTS ----------- Modified: python/branches/py3k/PC/VS8.0/build_ssl.bat ============================================================================== --- python/branches/py3k/PC/VS8.0/build_ssl.bat (original) +++ python/branches/py3k/PC/VS8.0/build_ssl.bat Sun Feb 20 11:41:31 2011 @@ -2,10 +2,10 @@ if not defined HOST_PYTHON ( if %1 EQU Debug ( set HOST_PYTHON=python_d.exe - if not exist python32_d.dll exit 1 + if not exist python33_d.dll exit 1 ) ELSE ( set HOST_PYTHON=python.exe - if not exist python32.dll exit 1 + if not exist python33.dll exit 1 ) ) %HOST_PYTHON% build_ssl.py %1 %2 %3 Modified: python/branches/py3k/PC/VS8.0/kill_python.c ============================================================================== --- python/branches/py3k/PC/VS8.0/kill_python.c (original) +++ python/branches/py3k/PC/VS8.0/kill_python.c Sun Feb 20 11:41:31 2011 @@ -106,7 +106,7 @@ /* * XXX TODO: if we really wanted to be fancy, we could check the * modules for all processes (not just the python[_d].exe ones) - * and see if any of our DLLs are loaded (i.e. python32[_d].dll), + * and see if any of our DLLs are loaded (i.e. 
python33[_d].dll), * as that would also inhibit our ability to rebuild the solution. * Not worth loosing sleep over though; for now, a simple check * for just the python executable should be sufficient. Modified: python/branches/py3k/PC/VS8.0/pyproject.vsprops ============================================================================== --- python/branches/py3k/PC/VS8.0/pyproject.vsprops (original) +++ python/branches/py3k/PC/VS8.0/pyproject.vsprops Sun Feb 20 11:41:31 2011 @@ -38,7 +38,7 @@ /> Author: georg.brandl Date: Sun Feb 20 11:42:18 2011 New Revision: 88450 Log: Tag release 3.2. Added: python/tags/r32/ - copied from r88449, /python/branches/release32-maint/ From python-checkins at python.org Sun Feb 20 12:18:09 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 12:18:09 +0100 (CET) Subject: [Python-checkins] r88451 - in python/branches/py3k/Lib/unittest: case.py test/_test_warnings.py test/test_assertions.py test/test_case.py test/test_runner.py Message-ID: <20110220111809.5C41CEE987@mail.python.org> Author: georg.brandl Date: Sun Feb 20 12:18:09 2011 New Revision: 88451 Log: Remove unittest methods scheduled for removal in 3.3 -- makes the unittest test suite pass again. Modified: python/branches/py3k/Lib/unittest/case.py python/branches/py3k/Lib/unittest/test/_test_warnings.py python/branches/py3k/Lib/unittest/test/test_assertions.py python/branches/py3k/Lib/unittest/test/test_case.py python/branches/py3k/Lib/unittest/test/test_runner.py Modified: python/branches/py3k/Lib/unittest/case.py ============================================================================== --- python/branches/py3k/Lib/unittest/case.py (original) +++ python/branches/py3k/Lib/unittest/case.py Sun Feb 20 12:18:09 2011 @@ -938,77 +938,6 @@ standardMsg = self._truncateMessage(standardMsg, diff) self.fail(self._formatMessage(msg, standardMsg)) - def assertDictContainsSubset(self, subset, dictionary, msg=None): - """Checks whether dictionary is a superset of subset.""" - warnings.warn('assertDictContainsSubset is deprecated', - DeprecationWarning) - missing = [] - mismatched = [] - for key, value in subset.items(): - if key not in dictionary: - missing.append(key) - elif value != dictionary[key]: - mismatched.append('%s, expected: %s, actual: %s' % - (safe_repr(key), safe_repr(value), - safe_repr(dictionary[key]))) - - if not (missing or mismatched): - return - - standardMsg = '' - if missing: - standardMsg = 'Missing: %s' % ','.join(safe_repr(m) for m in - missing) - if mismatched: - if standardMsg: - standardMsg += '; ' - standardMsg += 'Mismatched values: %s' % ','.join(mismatched) - - self.fail(self._formatMessage(msg, standardMsg)) - - def assertSameElements(self, expected_seq, actual_seq, msg=None): - """An unordered sequence specific comparison. - - Raises with an error message listing which elements of expected_seq - are missing from actual_seq and vice versa if any. - - Duplicate elements are ignored when comparing *expected_seq* and - *actual_seq*. It is the equivalent of ``assertEqual(set(expected), - set(actual))`` but it works with sequences of unhashable objects as - well. - """ - warnings.warn('assertSameElements is deprecated', - DeprecationWarning) - try: - expected = set(expected_seq) - actual = set(actual_seq) - missing = sorted(expected.difference(actual)) - unexpected = sorted(actual.difference(expected)) - except TypeError: - # Fall back to slower list-compare if any of the objects are - # not hashable. 
- expected = list(expected_seq) - actual = list(actual_seq) - try: - expected.sort() - actual.sort() - except TypeError: - missing, unexpected = unorderable_list_difference(expected, - actual) - else: - missing, unexpected = sorted_list_difference(expected, actual) - errors = [] - if missing: - errors.append('Expected, but missing:\n %s' % - safe_repr(missing)) - if unexpected: - errors.append('Unexpected, but present:\n %s' % - safe_repr(unexpected)) - if errors: - standardMsg = '\n'.join(errors) - self.fail(self._formatMessage(msg, standardMsg)) - - def assertCountEqual(self, first, second, msg=None): """An unordered sequence comparison asserting that the same elements, regardless of order. If the same element occurs more than once, @@ -1183,13 +1112,11 @@ # The fail* methods can be removed in 3.3, the 5 assert* methods will # have to stay around for a few more versions. See #9424. - failUnlessEqual = assertEquals = _deprecate(assertEqual) - failIfEqual = assertNotEquals = _deprecate(assertNotEqual) - failUnlessAlmostEqual = assertAlmostEquals = _deprecate(assertAlmostEqual) - failIfAlmostEqual = assertNotAlmostEquals = _deprecate(assertNotAlmostEqual) - failUnless = assert_ = _deprecate(assertTrue) - failUnlessRaises = _deprecate(assertRaises) - failIf = _deprecate(assertFalse) + assertEquals = _deprecate(assertEqual) + assertNotEquals = _deprecate(assertNotEqual) + assertAlmostEquals = _deprecate(assertAlmostEqual) + assertNotAlmostEquals = _deprecate(assertNotAlmostEqual) + assert_ = _deprecate(assertTrue) assertRaisesRegexp = _deprecate(assertRaisesRegex) assertRegexpMatches = _deprecate(assertRegex) Modified: python/branches/py3k/Lib/unittest/test/_test_warnings.py ============================================================================== --- python/branches/py3k/Lib/unittest/test/_test_warnings.py (original) +++ python/branches/py3k/Lib/unittest/test/_test_warnings.py Sun Feb 20 12:18:09 2011 @@ -19,17 +19,12 @@ warnings.warn('rw', RuntimeWarning) class TestWarnings(unittest.TestCase): - # unittest warnings will be printed at most once per type (max one message - # for the fail* methods, and one for the assert* methods) + # unittest warnings will be printed at most once per type def test_assert(self): self.assertEquals(2+2, 4) self.assertEquals(2*2, 4) self.assertEquals(2**2, 4) - def test_fail(self): - self.failUnless(1) - self.failUnless(True) - def test_other_unittest(self): self.assertAlmostEqual(2+2, 4) self.assertNotAlmostEqual(4+4, 2) Modified: python/branches/py3k/Lib/unittest/test/test_assertions.py ============================================================================== --- python/branches/py3k/Lib/unittest/test/test_assertions.py (original) +++ python/branches/py3k/Lib/unittest/test/test_assertions.py Sun Feb 20 12:18:09 2011 @@ -223,15 +223,6 @@ "\+ \{'key': 'value'\}$", "\+ \{'key': 'value'\} : oops$"]) - def testAssertDictContainsSubset(self): - with warnings.catch_warnings(): - warnings.simplefilter("ignore", DeprecationWarning) - - self.assertMessages('assertDictContainsSubset', ({'key': 'value'}, {}), - ["^Missing: 'key'$", "^oops$", - "^Missing: 'key'$", - "^Missing: 'key' : oops$"]) - def testAssertMultiLineEqual(self): self.assertMessages('assertMultiLineEqual', ("", "foo"), [r"\+ foo$", "^oops$", Modified: python/branches/py3k/Lib/unittest/test/test_case.py ============================================================================== --- python/branches/py3k/Lib/unittest/test/test_case.py (original) +++ 
python/branches/py3k/Lib/unittest/test/test_case.py Sun Feb 20 12:18:09 2011 @@ -488,36 +488,6 @@ self.assertRaises(self.failureException, self.assertNotIn, 'cow', animals) - def testAssertDictContainsSubset(self): - with warnings.catch_warnings(): - warnings.simplefilter("ignore", DeprecationWarning) - - self.assertDictContainsSubset({}, {}) - self.assertDictContainsSubset({}, {'a': 1}) - self.assertDictContainsSubset({'a': 1}, {'a': 1}) - self.assertDictContainsSubset({'a': 1}, {'a': 1, 'b': 2}) - self.assertDictContainsSubset({'a': 1, 'b': 2}, {'a': 1, 'b': 2}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({1: "one"}, {}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'a': 2}, {'a': 1}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'c': 1}, {'a': 1}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'a': 1, 'c': 1}, {'a': 1}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'a': 1, 'c': 1}, {'a': 1}) - - one = ''.join(chr(i) for i in range(255)) - # this used to cause a UnicodeDecodeError constructing the failure msg - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'foo': one}, {'foo': '\uFFFD'}) - def testAssertEqual(self): equal_pairs = [ ((), ()), @@ -1094,20 +1064,11 @@ have to stay around for a few more versions. See #9424. """ old = ( - (self.failIfEqual, (3, 5)), (self.assertNotEquals, (3, 5)), - (self.failUnlessEqual, (3, 3)), (self.assertEquals, (3, 3)), - (self.failUnlessAlmostEqual, (2.0, 2.0)), (self.assertAlmostEquals, (2.0, 2.0)), - (self.failIfAlmostEqual, (3.0, 5.0)), (self.assertNotAlmostEquals, (3.0, 5.0)), - (self.failUnless, (True,)), (self.assert_, (True,)), - (self.failUnlessRaises, (TypeError, lambda _: 3.14 + 'spam')), - (self.failIf, (False,)), - (self.assertSameElements, ([1, 1, 2, 3], [1, 2, 3])), - (self.assertDictContainsSubset, (dict(a=1, b=2), dict(a=1, b=2, c=3))), (self.assertRaisesRegexp, (KeyError, 'foo', lambda: {}['foo'])), (self.assertRegexpMatches, ('bar', 'bar')), ) @@ -1115,19 +1076,6 @@ with self.assertWarns(DeprecationWarning): meth(*args) - def testDeprecatedFailMethods(self): - """Test that the deprecated fail* methods get removed in 3.3""" - if sys.version_info[:2] < (3, 3): - return - deprecated_names = [ - 'failIfEqual', 'failUnlessEqual', 'failUnlessAlmostEqual', - 'failIfAlmostEqual', 'failUnless', 'failUnlessRaises', 'failIf', - 'assertSameElements', 'assertDictContainsSubset', - ] - for deprecated_name in deprecated_names: - with self.assertRaises(AttributeError): - getattr(self, deprecated_name) # remove these in 3.3 - def testDeepcopy(self): # Issue: 5660 class TestableTest(unittest.TestCase): Modified: python/branches/py3k/Lib/unittest/test/test_runner.py ============================================================================== --- python/branches/py3k/Lib/unittest/test/test_runner.py (original) +++ python/branches/py3k/Lib/unittest/test/test_runner.py Sun Feb 20 12:18:09 2011 @@ -257,19 +257,17 @@ return [b.splitlines() for b in p.communicate()] opts = dict(stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=os.path.dirname(__file__)) - ae_msg = b'Please use assertEqual instead.' - at_msg = b'Please use assertTrue instead.' 
# no args -> all the warnings are printed, unittest warnings only once p = subprocess.Popen([sys.executable, '_test_warnings.py'], **opts) out, err = get_parse_out_err(p) self.assertIn(b'OK', err) # check that the total number of warnings in the output is correct - self.assertEqual(len(out), 12) + self.assertEqual(len(out), 11) # check that the numbers of the different kind of warnings is correct for msg in [b'dw', b'iw', b'uw']: self.assertEqual(out.count(msg), 3) - for msg in [ae_msg, at_msg, b'rw']: + for msg in [b'rw']: self.assertEqual(out.count(msg), 1) args_list = ( @@ -294,11 +292,9 @@ **opts) out, err = get_parse_out_err(p) self.assertIn(b'OK', err) - self.assertEqual(len(out), 14) + self.assertEqual(len(out), 13) for msg in [b'dw', b'iw', b'uw', b'rw']: self.assertEqual(out.count(msg), 3) - for msg in [ae_msg, at_msg]: - self.assertEqual(out.count(msg), 1) def testStdErrLookedUpAtInstantiationTime(self): # see issue 10786 From python-checkins at python.org Sun Feb 20 15:42:33 2011 From: python-checkins at python.org (georg.brandl) Date: Sun, 20 Feb 2011 15:42:33 +0100 (CET) Subject: [Python-checkins] r88452 - python/branches/release27-maint/Doc/tools/dailybuild.py Message-ID: <20110220144233.D5C7BC392@mail.python.org> Author: georg.brandl Date: Sun Feb 20 15:42:33 2011 New Revision: 88452 Log: Update daily build locations. Modified: python/branches/release27-maint/Doc/tools/dailybuild.py Modified: python/branches/release27-maint/Doc/tools/dailybuild.py ============================================================================== --- python/branches/release27-maint/Doc/tools/dailybuild.py (original) +++ python/branches/release27-maint/Doc/tools/dailybuild.py Sun Feb 20 15:42:33 2011 @@ -33,9 +33,9 @@ BRANCHES = [ # checkout, target, isdev - (BUILDROOT + '/python32', WWWROOT + '/dev', True), + (BUILDROOT + '/python33', WWWROOT + '/dev', True), (BUILDROOT + '/python27', WWWROOT, False), - (BUILDROOT + '/python31', WWWROOT + '/py3k', False), + (BUILDROOT + '/python32', WWWROOT + '/py3k', False), ] From python-checkins at python.org Mon Feb 21 00:23:29 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 21 Feb 2011 00:23:29 +0100 (CET) Subject: [Python-checkins] r88453 - python/branches/release27-maint/Doc/tools/sphinxext/indexsidebar.html Message-ID: <20110220232329.1EF19EC21@mail.python.org> Author: georg.brandl Date: Mon Feb 21 00:23:28 2011 New Revision: 88453 Log: Update versions in sidebar. Modified: python/branches/release27-maint/Doc/tools/sphinxext/indexsidebar.html Modified: python/branches/release27-maint/Doc/tools/sphinxext/indexsidebar.html ============================================================================== --- python/branches/release27-maint/Doc/tools/sphinxext/indexsidebar.html (original) +++ python/branches/release27-maint/Doc/tools/sphinxext/indexsidebar.html Mon Feb 21 00:23:28 2011 @@ -3,8 +3,8 @@

    Docs for other versions (the changed sidebar entries, an HTML list of per-version documentation links, were dropped by the archive's HTML-to-text conversion)
    
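    (Porting sketch, not from any checkin: the unittest aliases removed in r88451 above map onto assert* names that remain supported in 3.3; the test case below is hypothetical.)

        import unittest

        class PortedAssertions(unittest.TestCase):
            # Hypothetical example; the comments name the old spellings each call
            # replaces (the fail* names, assertSameElements and
            # assertDictContainsSubset were removed in r88451, the rest stay
            # only as deprecated aliases).
            def test_modern_spellings(self):
                self.assertTrue(1 < 2)             # failUnless / assert_
                self.assertFalse(2 < 1)            # failIf
                self.assertEqual(2 + 2, 4)         # failUnlessEqual / assertEquals
                self.assertNotEqual(2 + 2, 5)      # failIfEqual / assertNotEquals
                with self.assertRaises(ZeroDivisionError):   # failUnlessRaises
                    1 / 0
                # assertSameElements -> assertCountEqual (duplicate counts now matter)
                self.assertCountEqual([1, 1, 2, 3], [3, 2, 1, 1])
                # assertDictContainsSubset -> check the keys of interest directly
                subset, actual = {'a': 1}, {'a': 1, 'b': 2}
                for key, value in subset.items():
                    self.assertEqual(actual[key], value)

        if __name__ == '__main__':
            unittest.main()
    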

From solipsis at pitrou.net Mon Feb 21 05:18:31 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 21 Feb 2011 05:18:31 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88451): sum=0 Message-ID: py3k results for svn r88451 (hg cset 02b70cb59701) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogTXH62r', '-x'] From python-checkins at python.org Mon Feb 21 17:18:49 2011 From: python-checkins at python.org (martin.v.loewis) Date: Mon, 21 Feb 2011 17:18:49 +0100 (CET) Subject: [Python-checkins] r88455 - python/branches/release32-maint Message-ID: <20110221161849.46B4DEE9B9@mail.python.org> Author: martin.v.loewis Date: Mon Feb 21 17:18:49 2011 New Revision: 88455 Log: Initialized merge tracking via "svnmerge" with revisions "1-88445" from svn+ssh://pythondev at svn.python.org/python/branches/py3k Modified: python/branches/release32-maint/ (props changed) From python-checkins at python.org Mon Feb 21 17:24:00 2011 From: python-checkins at python.org (martin.v.loewis) Date: Mon, 21 Feb 2011 17:24:00 +0100 (CET) Subject: [Python-checkins] r88456 - in python/branches/py3k: Misc/NEWS Objects/typeobject.c Message-ID: <20110221162400.79A82EE99A@mail.python.org> Author: martin.v.loewis Date: Mon Feb 21 17:24:00 2011 New Revision: 88456 Log: - Check for NULL result in PyType_FromSpec. Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/Objects/typeobject.c Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 17:24:00 2011 @@ -7,6 +7,11 @@ *Release date: XX-XXX-20XX* +Core and Builtins +----------------- + +- Check for NULL result in PyType_FromSpec. + What's New in Python 3.2? ========================= Modified: python/branches/py3k/Objects/typeobject.c ============================================================================== --- python/branches/py3k/Objects/typeobject.c (original) +++ python/branches/py3k/Objects/typeobject.c Mon Feb 21 17:24:00 2011 @@ -2330,6 +2330,8 @@ char *res_start = (char*)res; PyType_Slot *slot; + if (res == NULL) + return NULL; res->ht_name = PyUnicode_FromString(spec->name); if (!res->ht_name) goto fail; From python-checkins at python.org Mon Feb 21 17:26:47 2011 From: python-checkins at python.org (martin.v.loewis) Date: Mon, 21 Feb 2011 17:26:47 +0100 (CET) Subject: [Python-checkins] r88457 - in python/branches/release32-maint: Misc/NEWS Objects/typeobject.c Message-ID: <20110221162647.D1E8CEE99A@mail.python.org> Author: martin.v.loewis Date: Mon Feb 21 17:26:47 2011 New Revision: 88457 Log: Merged revisions 88456 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88456 | martin.v.loewis | 2011-02-21 17:24:00 +0100 (Mo, 21 Feb 2011) | 2 lines - Check for NULL result in PyType_FromSpec. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Misc/NEWS python/branches/release32-maint/Objects/typeobject.c Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Mon Feb 21 17:26:47 2011 @@ -2,6 +2,17 @@ Python News +++++++++++ +What's New in Python 3.2.1? 
+=========================== + +*Release date: XX-XXX-20XX* + +Core and Builtins +----------------- + +- Check for NULL result in PyType_FromSpec. + + What's New in Python 3.2? ========================= Modified: python/branches/release32-maint/Objects/typeobject.c ============================================================================== --- python/branches/release32-maint/Objects/typeobject.c (original) +++ python/branches/release32-maint/Objects/typeobject.c Mon Feb 21 17:26:47 2011 @@ -2330,6 +2330,8 @@ char *res_start = (char*)res; PyType_Slot *slot; + if (res == NULL) + return NULL; res->ht_name = PyUnicode_FromString(spec->name); if (!res->ht_name) goto fail; From python-checkins at python.org Mon Feb 21 18:53:17 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 21 Feb 2011 18:53:17 +0100 (CET) Subject: [Python-checkins] r88458 - python/branches/release32-maint/Doc/whatsnew/3.2.rst Message-ID: <20110221175317.354E3F6AA@mail.python.org> Author: raymond.hettinger Date: Mon Feb 21 18:53:16 2011 New Revision: 88458 Log: Issue 10160: Both single-arg and multi-arg calls have been sped-up. Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/release32-maint/Doc/whatsnew/3.2.rst (original) +++ python/branches/release32-maint/Doc/whatsnew/3.2.rst Mon Feb 21 18:53:16 2011 @@ -2372,9 +2372,9 @@ :issue:`8685`). The :meth:`array.repeat` method has a faster implementation (:issue:`1569291` by Alexander Belopolsky). The :class:`BaseHTTPRequestHandler` has more efficient buffering (:issue:`3709` by Andrew Schaaf). The -multi-argument form of :func:`operator.attrgetter` function now runs slightly -faster (:issue:`10160` by Christos Georgiou). And :class:`ConfigParser` loads -multi-line arguments a bit faster (:issue:`7113` by ?ukasz Langa). +:func:`operator.attrgetter` function has been sped-up (:issue:`10160` by +Christos Georgiou). And :class:`ConfigParser` loads multi-line arguments a bit +faster (:issue:`7113` by ?ukasz Langa). Unicode From python-checkins at python.org Mon Feb 21 18:54:36 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 21 Feb 2011 18:54:36 +0100 (CET) Subject: [Python-checkins] r88459 - python/branches/py3k/Doc/whatsnew/3.2.rst Message-ID: <20110221175436.45C98EE985@mail.python.org> Author: raymond.hettinger Date: Mon Feb 21 18:54:36 2011 New Revision: 88459 Log: Issue 10160: Both single-arg and multi-arg calls have been sped-up. Modified: python/branches/py3k/Doc/whatsnew/3.2.rst Modified: python/branches/py3k/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/py3k/Doc/whatsnew/3.2.rst (original) +++ python/branches/py3k/Doc/whatsnew/3.2.rst Mon Feb 21 18:54:36 2011 @@ -2372,9 +2372,9 @@ :issue:`8685`). The :meth:`array.repeat` method has a faster implementation (:issue:`1569291` by Alexander Belopolsky). The :class:`BaseHTTPRequestHandler` has more efficient buffering (:issue:`3709` by Andrew Schaaf). The -multi-argument form of :func:`operator.attrgetter` function now runs slightly -faster (:issue:`10160` by Christos Georgiou). And :class:`ConfigParser` loads -multi-line arguments a bit faster (:issue:`7113` by ?ukasz Langa). +:func:`operator.attrgetter` function has been sped-up (:issue:`10160` by +Christos Georgiou). 
And :class:`ConfigParser` loads multi-line arguments a bit +faster (:issue:`7113` by ?ukasz Langa). Unicode From python-checkins at python.org Mon Feb 21 19:03:13 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 19:03:13 +0100 (CET) Subject: [Python-checkins] r88460 - in python/branches/py3k: Lib/test/test_zlib.py Misc/NEWS Modules/zlibmodule.c Message-ID: <20110221180313.A0731EE9E2@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 19:03:13 2011 New Revision: 88460 Log: Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. Modified: python/branches/py3k/Lib/test/test_zlib.py python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/zlibmodule.c Modified: python/branches/py3k/Lib/test/test_zlib.py ============================================================================== --- python/branches/py3k/Lib/test/test_zlib.py (original) +++ python/branches/py3k/Lib/test/test_zlib.py Mon Feb 21 19:03:13 2011 @@ -2,10 +2,16 @@ from test import support import binascii import random +import sys from test.support import precisionbigmemtest, _1G, _4G zlib = support.import_module('zlib') +try: + import mmap +except ImportError: + mmap = None + class ChecksumTestCase(unittest.TestCase): # checksum test cases @@ -57,6 +63,28 @@ self.assertEqual(binascii.crc32(b'spam'), zlib.crc32(b'spam')) +# Issue #10276 - check that inputs >=4GB are handled correctly. +class ChecksumBigBufferTestCase(unittest.TestCase): + + def setUp(self): + with open(support.TESTFN, "wb+") as f: + f.seek(_4G) + f.write(b"asdf") + f.flush() + self.mapping = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) + + def tearDown(self): + self.mapping.close() + support.unlink(support.TESTFN) + + @unittest.skipUnless(mmap, "mmap() is not available.") + @unittest.skipUnless(sys.maxsize > _4G, "Can't run on a 32-bit system.") + @unittest.skipUnless(support.is_resource_enabled("largefile"), + "May use lots of disk space.") + def test_big_buffer(self): + self.assertEqual(zlib.crc32(self.mapping), 3058686908) + self.assertEqual(zlib.adler32(self.mapping), 82837919) + class ExceptionTestCase(unittest.TestCase): # make sure we generate some expected errors @@ -577,6 +605,7 @@ def test_main(): support.run_unittest( ChecksumTestCase, + ChecksumBigBufferTestCase, ExceptionTestCase, CompressTestCase, CompressObjectTestCase Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 19:03:13 2011 @@ -12,6 +12,12 @@ - Check for NULL result in PyType_FromSpec. +Library +------- + +- Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers + larger than 4GB. Patch by Nadeem Vawda. + What's New in Python 3.2? ========================= Modified: python/branches/py3k/Modules/zlibmodule.c ============================================================================== --- python/branches/py3k/Modules/zlibmodule.c (original) +++ python/branches/py3k/Modules/zlibmodule.c Mon Feb 21 19:03:13 2011 @@ -945,8 +945,18 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { + void *buf = pbuf.buf; + Py_ssize_t len = pbuf.len; + Py_BEGIN_ALLOW_THREADS - adler32val = adler32(adler32val, pbuf.buf, pbuf.len); + /* Avoid truncation of length for very large buffers. adler32() takes + length as an unsigned int, which may be narrower than Py_ssize_t. 
*/ + while (len > (Py_ssize_t)UINT_MAX) { + adler32val = adler32(adler32val, buf, UINT_MAX); + buf += UINT_MAX; + len -= UINT_MAX; + } + adler32val = adler32(adler32val, buf, len); Py_END_ALLOW_THREADS } else { adler32val = adler32(adler32val, pbuf.buf, pbuf.len); @@ -973,8 +983,18 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { + void *buf = pbuf.buf; + Py_ssize_t len = pbuf.len; + Py_BEGIN_ALLOW_THREADS - signed_val = crc32(crc32val, pbuf.buf, pbuf.len); + /* Avoid truncation of length for very large buffers. crc32() takes + length as an unsigned int, which may be narrower than Py_ssize_t. */ + while (len > (Py_ssize_t)UINT_MAX) { + crc32val = crc32(crc32val, buf, UINT_MAX); + buf += UINT_MAX; + len -= UINT_MAX; + } + signed_val = crc32(crc32val, buf, len); Py_END_ALLOW_THREADS } else { signed_val = crc32(crc32val, pbuf.buf, pbuf.len); From python-checkins at python.org Mon Feb 21 19:09:01 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 19:09:01 +0100 (CET) Subject: [Python-checkins] r88461 - in python/branches/release32-maint: Lib/test/test_zlib.py Misc/NEWS Modules/zlibmodule.c Message-ID: <20110221180901.0924CEE99C@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 19:09:00 2011 New Revision: 88461 Log: Merged revisions 88460 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88460 | antoine.pitrou | 2011-02-21 19:03:13 +0100 (lun., 21 f?vr. 2011) | 4 lines Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_zlib.py python/branches/release32-maint/Misc/NEWS python/branches/release32-maint/Modules/zlibmodule.c Modified: python/branches/release32-maint/Lib/test/test_zlib.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_zlib.py (original) +++ python/branches/release32-maint/Lib/test/test_zlib.py Mon Feb 21 19:09:00 2011 @@ -2,10 +2,16 @@ from test import support import binascii import random +import sys from test.support import precisionbigmemtest, _1G, _4G zlib = support.import_module('zlib') +try: + import mmap +except ImportError: + mmap = None + class ChecksumTestCase(unittest.TestCase): # checksum test cases @@ -57,6 +63,28 @@ self.assertEqual(binascii.crc32(b'spam'), zlib.crc32(b'spam')) +# Issue #10276 - check that inputs >=4GB are handled correctly. 
+class ChecksumBigBufferTestCase(unittest.TestCase): + + def setUp(self): + with open(support.TESTFN, "wb+") as f: + f.seek(_4G) + f.write(b"asdf") + f.flush() + self.mapping = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) + + def tearDown(self): + self.mapping.close() + support.unlink(support.TESTFN) + + @unittest.skipUnless(mmap, "mmap() is not available.") + @unittest.skipUnless(sys.maxsize > _4G, "Can't run on a 32-bit system.") + @unittest.skipUnless(support.is_resource_enabled("largefile"), + "May use lots of disk space.") + def test_big_buffer(self): + self.assertEqual(zlib.crc32(self.mapping), 3058686908) + self.assertEqual(zlib.adler32(self.mapping), 82837919) + class ExceptionTestCase(unittest.TestCase): # make sure we generate some expected errors @@ -577,6 +605,7 @@ def test_main(): support.run_unittest( ChecksumTestCase, + ChecksumBigBufferTestCase, ExceptionTestCase, CompressTestCase, CompressObjectTestCase Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Mon Feb 21 19:09:00 2011 @@ -12,6 +12,12 @@ - Check for NULL result in PyType_FromSpec. +Library +------- + +- Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers + larger than 4GB. Patch by Nadeem Vawda. + What's New in Python 3.2? ========================= Modified: python/branches/release32-maint/Modules/zlibmodule.c ============================================================================== --- python/branches/release32-maint/Modules/zlibmodule.c (original) +++ python/branches/release32-maint/Modules/zlibmodule.c Mon Feb 21 19:09:00 2011 @@ -945,8 +945,18 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { + void *buf = pbuf.buf; + Py_ssize_t len = pbuf.len; + Py_BEGIN_ALLOW_THREADS - adler32val = adler32(adler32val, pbuf.buf, pbuf.len); + /* Avoid truncation of length for very large buffers. adler32() takes + length as an unsigned int, which may be narrower than Py_ssize_t. */ + while (len > (Py_ssize_t)UINT_MAX) { + adler32val = adler32(adler32val, buf, UINT_MAX); + buf += UINT_MAX; + len -= UINT_MAX; + } + adler32val = adler32(adler32val, buf, len); Py_END_ALLOW_THREADS } else { adler32val = adler32(adler32val, pbuf.buf, pbuf.len); @@ -973,8 +983,18 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { + void *buf = pbuf.buf; + Py_ssize_t len = pbuf.len; + Py_BEGIN_ALLOW_THREADS - signed_val = crc32(crc32val, pbuf.buf, pbuf.len); + /* Avoid truncation of length for very large buffers. crc32() takes + length as an unsigned int, which may be narrower than Py_ssize_t. */ + while (len > (Py_ssize_t)UINT_MAX) { + crc32val = crc32(crc32val, buf, UINT_MAX); + buf += UINT_MAX; + len -= UINT_MAX; + } + signed_val = crc32(crc32val, buf, len); Py_END_ALLOW_THREADS } else { signed_val = crc32(crc32val, pbuf.buf, pbuf.len); From python-checkins at python.org Mon Feb 21 19:13:44 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 21 Feb 2011 19:13:44 +0100 (CET) Subject: [Python-checkins] r88462 - python/branches/release32-maint/Include/patchlevel.h Message-ID: <20110221181344.E4EE0DA87@mail.python.org> Author: georg.brandl Date: Mon Feb 21 19:13:44 2011 New Revision: 88462 Log: Update in-development version. 
Modified: python/branches/release32-maint/Include/patchlevel.h Modified: python/branches/release32-maint/Include/patchlevel.h ============================================================================== --- python/branches/release32-maint/Include/patchlevel.h (original) +++ python/branches/release32-maint/Include/patchlevel.h Mon Feb 21 19:13:44 2011 @@ -23,7 +23,7 @@ #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.2" +#define PY_VERSION "3.2.1a0" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository) */ From python-checkins at python.org Mon Feb 21 19:18:50 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 19:18:50 +0100 Subject: [Python-checkins] hooks: Don't choke on non-ascii commit messages / diffs Message-ID: antoine.pitrou pushed 0a651c3be11c to hooks: http://hg.python.org/hooks/rev/0a651c3be11c changeset: 27:0a651c3be11c tag: tip user: Antoine Pitrou date: Mon Feb 21 19:18:43 2011 +0100 summary: Don't choke on non-ascii commit messages / diffs files: buildbot.py mail.py diff --git a/buildbot.py b/buildbot.py --- a/buildbot.py +++ b/buildbot.py @@ -87,7 +87,7 @@ changes.append({ 'username': user, 'revision': hex(node), - 'comments': desc, + 'comments': desc.decode('utf8', 'replace'), 'files': files, 'branch': branch, }) diff --git a/mail.py b/mail.py --- a/mail.py +++ b/mail.py @@ -14,7 +14,7 @@ msg['Subject'] = sub msg['To'] = to msg['From'] = sender - msg.attach(MIMEText(body, _subtype='plain')) + msg.attach(MIMEText(body, _subtype='plain', _charset='utf8')) smtp = smtplib.SMTP() smtp.connect() smtp.sendmail(sender, msg['To'], msg.as_string()) -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Mon Feb 21 19:37:01 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 21 Feb 2011 19:37:01 +0100 (CET) Subject: [Python-checkins] r88463 - python/branches/py3k/Doc/tools/sphinxext/indexsidebar.html Message-ID: <20110221183701.B5E96EE9F0@mail.python.org> Author: georg.brandl Date: Mon Feb 21 19:37:01 2011 New Revision: 88463 Log: 3.2 is stable now. Modified: python/branches/py3k/Doc/tools/sphinxext/indexsidebar.html Modified: python/branches/py3k/Doc/tools/sphinxext/indexsidebar.html ============================================================================== --- python/branches/py3k/Doc/tools/sphinxext/indexsidebar.html (original) +++ python/branches/py3k/Doc/tools/sphinxext/indexsidebar.html Mon Feb 21 19:37:01 2011 @@ -3,7 +3,7 @@

    Docs for other versions (the changed sidebar entries, an HTML list of per-version documentation links, were dropped by the archive's HTML-to-text conversion)
    
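    (Minimal pure-Python sketch of the chunking idea from the zlib checkins above, r88460/r88461, refined further in the checkins that follow: feed zlib.crc32()/zlib.adler32() bounded slices and carry the running value forward, so no single call sees more data than an unsigned int can describe. The helper name and chunk size are illustrative only, not from any checkin.)

        import zlib

        CHUNK = 64 * 1024 * 1024   # 64 MiB per call; any bound below UINT_MAX works

        def crc32_chunked(data, value=0):
            # data may be bytes, a bytearray or an mmap.mmap object, as in the
            # new ChecksumBigBufferTestCase; memoryview slicing avoids copies.
            view = memoryview(data)
            for start in range(0, len(view), CHUNK):
                value = zlib.crc32(view[start:start + CHUNK], value)
            return value & 0xffffffff

        # crc32_chunked(mapping) should agree with zlib.crc32(mapping) once the
        # C-level fix above is applied; adler32 can be chunked the same way.
    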

From python-checkins at python.org Mon Feb 21 20:05:09 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 20:05:09 +0100 (CET) Subject: [Python-checkins] r88464 - python/branches/py3k/Modules/zlibmodule.c Message-ID: <20110221190509.6127BEE99F@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 20:05:08 2011 New Revision: 88464 Log: Fix issues on 32-bit systems introduced by r88460 Modified: python/branches/py3k/Modules/zlibmodule.c Modified: python/branches/py3k/Modules/zlibmodule.c ============================================================================== --- python/branches/py3k/Modules/zlibmodule.c (original) +++ python/branches/py3k/Modules/zlibmodule.c Mon Feb 21 20:05:08 2011 @@ -951,10 +951,10 @@ Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. adler32() takes length as an unsigned int, which may be narrower than Py_ssize_t. */ - while (len > (Py_ssize_t)UINT_MAX) { + while (len > (size_t) UINT_MAX) { adler32val = adler32(adler32val, buf, UINT_MAX); - buf += UINT_MAX; - len -= UINT_MAX; + buf += (size_t) UINT_MAX; + len -= (size_t) UINT_MAX; } adler32val = adler32(adler32val, buf, len); Py_END_ALLOW_THREADS @@ -989,10 +989,10 @@ Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. crc32() takes length as an unsigned int, which may be narrower than Py_ssize_t. */ - while (len > (Py_ssize_t)UINT_MAX) { + while (len > (size_t) UINT_MAX) { crc32val = crc32(crc32val, buf, UINT_MAX); - buf += UINT_MAX; - len -= UINT_MAX; + buf += (size_t) UINT_MAX; + len -= (size_t) UINT_MAX; } signed_val = crc32(crc32val, buf, len); Py_END_ALLOW_THREADS From python-checkins at python.org Mon Feb 21 20:24:10 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 21 Feb 2011 20:24:10 +0100 (CET) Subject: [Python-checkins] r88465 - python/branches/py3k/.hgignore Message-ID: <20110221192410.8FB21EE9C7@mail.python.org> Author: brett.cannon Date: Mon Feb 21 20:24:10 2011 New Revision: 88465 Log: Ignore Vim .swp files. Modified: python/branches/py3k/.hgignore Modified: python/branches/py3k/.hgignore ============================================================================== --- python/branches/py3k/.hgignore (original) +++ python/branches/py3k/.hgignore Mon Feb 21 20:24:10 2011 @@ -39,6 +39,7 @@ syntax: glob libpython*.a +*.swp *.o *.pyc *.pyo From python-checkins at python.org Mon Feb 21 20:28:41 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 20:28:41 +0100 (CET) Subject: [Python-checkins] r88466 - python/branches/py3k/Modules/zlibmodule.c Message-ID: <20110221192841.1B280C9E9@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 20:28:40 2011 New Revision: 88466 Log: Fix compile error under MSVC introduced by r88460. 
Modified: python/branches/py3k/Modules/zlibmodule.c Modified: python/branches/py3k/Modules/zlibmodule.c ============================================================================== --- python/branches/py3k/Modules/zlibmodule.c (original) +++ python/branches/py3k/Modules/zlibmodule.c Mon Feb 21 20:28:40 2011 @@ -945,7 +945,7 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { - void *buf = pbuf.buf; + unsigned char *buf = pbuf.buf; Py_ssize_t len = pbuf.len; Py_BEGIN_ALLOW_THREADS @@ -983,7 +983,7 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { - void *buf = pbuf.buf; + unsigned char *buf = pbuf.buf; Py_ssize_t len = pbuf.len; Py_BEGIN_ALLOW_THREADS From python-checkins at python.org Mon Feb 21 20:29:56 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 21 Feb 2011 20:29:56 +0100 (CET) Subject: [Python-checkins] r88467 - in python/branches/py3k: Lib/doctest.py Lib/test/pickletester.py Lib/test/regrtest.py Lib/test/support.py Lib/test/test_doctest.py Lib/test/test_exceptions.py Lib/test/test_io.py Lib/test/test_pdb.py Lib/test/test_richcmp.py Lib/test/test_runpy.py Lib/test/test_scope.py Lib/test/test_sys_settrace.py Lib/test/test_trace.py Lib/test/test_zipimport_support.py Misc/NEWS Message-ID: <20110221192956.8D6F4C959@mail.python.org> Author: brett.cannon Date: Mon Feb 21 20:29:56 2011 New Revision: 88467 Log: Issue #10990: Prevent tests from clobbering a set trace function. Many tests simply didn't care if they unset a pre-existing trace function. This made test coverage impossible. This patch fixes various tests to put back any pre-existing trace function. It also introduces test.support.no_tracing as a decorator which will temporarily unset the trace function for tests which simply fail otherwise. Thanks to Kristian Vlaardingerbroek for helping to find the cause of various trace function unsets. Modified: python/branches/py3k/Lib/doctest.py python/branches/py3k/Lib/test/pickletester.py python/branches/py3k/Lib/test/regrtest.py python/branches/py3k/Lib/test/support.py python/branches/py3k/Lib/test/test_doctest.py python/branches/py3k/Lib/test/test_exceptions.py python/branches/py3k/Lib/test/test_io.py python/branches/py3k/Lib/test/test_pdb.py python/branches/py3k/Lib/test/test_richcmp.py python/branches/py3k/Lib/test/test_runpy.py python/branches/py3k/Lib/test/test_scope.py python/branches/py3k/Lib/test/test_sys_settrace.py python/branches/py3k/Lib/test/test_trace.py python/branches/py3k/Lib/test/test_zipimport_support.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/doctest.py ============================================================================== --- python/branches/py3k/Lib/doctest.py (original) +++ python/branches/py3k/Lib/doctest.py Mon Feb 21 20:29:56 2011 @@ -1373,6 +1373,7 @@ # Note that the interactive output will go to *our* # save_stdout, even if that's not the real sys.stdout; this # allows us to write test cases for the set_trace behavior. 
+ save_trace = sys.gettrace() save_set_trace = pdb.set_trace self.debugger = _OutputRedirectingPdb(save_stdout) self.debugger.reset() @@ -1392,6 +1393,7 @@ finally: sys.stdout = save_stdout pdb.set_trace = save_set_trace + sys.settrace(save_trace) linecache.getlines = self.save_linecache_getlines sys.displayhook = save_displayhook if clear_globs: Modified: python/branches/py3k/Lib/test/pickletester.py ============================================================================== --- python/branches/py3k/Lib/test/pickletester.py (original) +++ python/branches/py3k/Lib/test/pickletester.py Mon Feb 21 20:29:56 2011 @@ -5,7 +5,7 @@ import copyreg from http.cookies import SimpleCookie -from test.support import TestFailed, TESTFN, run_with_locale +from test.support import TestFailed, TESTFN, run_with_locale, no_tracing from pickle import bytes_types @@ -1002,6 +1002,7 @@ y = self.loads(s) self.assertEqual(y._reduce_called, 1) + @no_tracing def test_bad_getattr(self): x = BadGetattr() for proto in 0, 1: Modified: python/branches/py3k/Lib/test/regrtest.py ============================================================================== --- python/branches/py3k/Lib/test/regrtest.py (original) +++ python/branches/py3k/Lib/test/regrtest.py Mon Feb 21 20:29:56 2011 @@ -827,7 +827,7 @@ resources = ('sys.argv', 'cwd', 'sys.stdin', 'sys.stdout', 'sys.stderr', 'os.environ', 'sys.path', 'sys.path_hooks', '__import__', 'warnings.filters', 'asyncore.socket_map', - 'logging._handlers', 'logging._handlerList') + 'logging._handlers', 'logging._handlerList', 'sys.gettrace') def get_sys_argv(self): return id(sys.argv), sys.argv, sys.argv[:] @@ -874,6 +874,11 @@ sys.path_hooks = saved_hooks[1] sys.path_hooks[:] = saved_hooks[2] + def get_sys_gettrace(self): + return sys.gettrace() + def restore_sys_gettrace(self, trace_fxn): + sys.settrace(trace_fxn) + def get___import__(self): return builtins.__import__ def restore___import__(self, import_): Modified: python/branches/py3k/Lib/test/support.py ============================================================================== --- python/branches/py3k/Lib/test/support.py (original) +++ python/branches/py3k/Lib/test/support.py Mon Feb 21 20:29:56 2011 @@ -1108,6 +1108,21 @@ return guards.get(platform.python_implementation().lower(), default) +def no_tracing(func): + """Decorator to temporarily turn off tracing for the duration of a test.""" + if not hasattr(sys, 'gettrace'): + return func + else: + @functools.wraps(func) + def wrapper(*args, **kwargs): + original_trace = sys.gettrace() + try: + sys.settrace(None) + return func(*args, **kwargs) + finally: + sys.settrace(original_trace) + return wrapper + def _run_suite(suite): """Run tests from a unittest.TestSuite-derived class.""" Modified: python/branches/py3k/Lib/test/test_doctest.py ============================================================================== --- python/branches/py3k/Lib/test/test_doctest.py (original) +++ python/branches/py3k/Lib/test/test_doctest.py Mon Feb 21 20:29:56 2011 @@ -5,6 +5,7 @@ from test import support import doctest import os +import sys # NOTE: There are some additional tests relating to interaction with @@ -373,7 +374,7 @@ >>> tests = finder.find(sample_func) >>> print(tests) # doctest: +ELLIPSIS - [] + [] The exact name depends on how test_doctest was invoked, so allow for leading path components. @@ -1686,226 +1687,227 @@ """ -def test_pdb_set_trace(): - """Using pdb.set_trace from a doctest. - - You can use pdb.set_trace from a doctest. 
To do so, you must - retrieve the set_trace function from the pdb module at the time - you use it. The doctest module changes sys.stdout so that it can - capture program output. It also temporarily replaces pdb.set_trace - with a version that restores stdout. This is necessary for you to - see debugger output. - - >>> doc = ''' - ... >>> x = 42 - ... >>> raise Exception('cl?') - ... Traceback (most recent call last): - ... Exception: cl? - ... >>> import pdb; pdb.set_trace() - ... ''' - >>> parser = doctest.DocTestParser() - >>> test = parser.get_doctest(doc, {}, "foo-bar at baz", "foo-bar at baz.py", 0) - >>> runner = doctest.DocTestRunner(verbose=False) - - To demonstrate this, we'll create a fake standard input that - captures our debugger input: - - >>> import tempfile - >>> real_stdin = sys.stdin - >>> sys.stdin = _FakeInput([ - ... 'print(x)', # print data defined by the example - ... 'continue', # stop debugging - ... '']) - - >>> try: runner.run(test) - ... finally: sys.stdin = real_stdin - --Return-- - > (1)()->None - -> import pdb; pdb.set_trace() - (Pdb) print(x) - 42 - (Pdb) continue - TestResults(failed=0, attempted=3) - - You can also put pdb.set_trace in a function called from a test: - - >>> def calls_set_trace(): - ... y=2 - ... import pdb; pdb.set_trace() - - >>> doc = ''' - ... >>> x=1 - ... >>> calls_set_trace() - ... ''' - >>> test = parser.get_doctest(doc, globals(), "foo-bar at baz", "foo-bar at baz.py", 0) - >>> real_stdin = sys.stdin - >>> sys.stdin = _FakeInput([ - ... 'print(y)', # print data defined in the function - ... 'up', # out of function - ... 'print(x)', # print data defined by the example - ... 'continue', # stop debugging - ... '']) - - >>> try: - ... runner.run(test) - ... finally: - ... sys.stdin = real_stdin - --Return-- - > (3)calls_set_trace()->None - -> import pdb; pdb.set_trace() - (Pdb) print(y) - 2 - (Pdb) up - > (1)() - -> calls_set_trace() - (Pdb) print(x) - 1 - (Pdb) continue - TestResults(failed=0, attempted=2) - - During interactive debugging, source code is shown, even for - doctest examples: - - >>> doc = ''' - ... >>> def f(x): - ... ... g(x*2) - ... >>> def g(x): - ... ... print(x+3) - ... ... import pdb; pdb.set_trace() - ... >>> f(3) - ... ''' - >>> test = parser.get_doctest(doc, globals(), "foo-bar at baz", "foo-bar at baz.py", 0) - >>> real_stdin = sys.stdin - >>> sys.stdin = _FakeInput([ - ... 'list', # list source from example 2 - ... 'next', # return from g() - ... 'list', # list source from example 1 - ... 'next', # return from f() - ... 'list', # list source from example 3 - ... 'continue', # stop debugging - ... '']) - >>> try: runner.run(test) - ... finally: sys.stdin = real_stdin - ... # doctest: +NORMALIZE_WHITESPACE - --Return-- - > (3)g()->None - -> import pdb; pdb.set_trace() - (Pdb) list - 1 def g(x): - 2 print(x+3) - 3 -> import pdb; pdb.set_trace() - [EOF] - (Pdb) next - --Return-- - > (2)f()->None - -> g(x*2) - (Pdb) list - 1 def f(x): - 2 -> g(x*2) - [EOF] - (Pdb) next - --Return-- - > (1)()->None - -> f(3) - (Pdb) list - 1 -> f(3) - [EOF] - (Pdb) continue - ********************************************************************** - File "foo-bar at baz.py", line 7, in foo-bar at baz - Failed example: - f(3) - Expected nothing - Got: - 9 - TestResults(failed=1, attempted=3) - """ - -def test_pdb_set_trace_nested(): - """This illustrates more-demanding use of set_trace with nested functions. - - >>> class C(object): - ... def calls_set_trace(self): - ... y = 1 - ... import pdb; pdb.set_trace() - ... self.f1() - ... 
y = 2 - ... def f1(self): - ... x = 1 - ... self.f2() - ... x = 2 - ... def f2(self): - ... z = 1 - ... z = 2 - - >>> calls_set_trace = C().calls_set_trace - - >>> doc = ''' - ... >>> a = 1 - ... >>> calls_set_trace() - ... ''' - >>> parser = doctest.DocTestParser() - >>> runner = doctest.DocTestRunner(verbose=False) - >>> test = parser.get_doctest(doc, globals(), "foo-bar at baz", "foo-bar at baz.py", 0) - >>> real_stdin = sys.stdin - >>> sys.stdin = _FakeInput([ - ... 'print(y)', # print data defined in the function - ... 'step', 'step', 'step', 'step', 'step', 'step', 'print(z)', - ... 'up', 'print(x)', - ... 'up', 'print(y)', - ... 'up', 'print(foo)', - ... 'continue', # stop debugging - ... '']) - - >>> try: - ... runner.run(test) - ... finally: - ... sys.stdin = real_stdin - ... # doctest: +REPORT_NDIFF - > (5)calls_set_trace() - -> self.f1() - (Pdb) print(y) - 1 - (Pdb) step - --Call-- - > (7)f1() - -> def f1(self): - (Pdb) step - > (8)f1() - -> x = 1 - (Pdb) step - > (9)f1() - -> self.f2() - (Pdb) step - --Call-- - > (11)f2() - -> def f2(self): - (Pdb) step - > (12)f2() - -> z = 1 - (Pdb) step - > (13)f2() - -> z = 2 - (Pdb) print(z) - 1 - (Pdb) up - > (9)f1() - -> self.f2() - (Pdb) print(x) - 1 - (Pdb) up - > (5)calls_set_trace() - -> self.f1() - (Pdb) print(y) - 1 - (Pdb) up - > (1)() - -> calls_set_trace() - (Pdb) print(foo) - *** NameError: name 'foo' is not defined - (Pdb) continue - TestResults(failed=0, attempted=2) -""" +if not hasattr(sys, 'gettrace') or not sys.gettrace(): + def test_pdb_set_trace(): + """Using pdb.set_trace from a doctest. + + You can use pdb.set_trace from a doctest. To do so, you must + retrieve the set_trace function from the pdb module at the time + you use it. The doctest module changes sys.stdout so that it can + capture program output. It also temporarily replaces pdb.set_trace + with a version that restores stdout. This is necessary for you to + see debugger output. + + >>> doc = ''' + ... >>> x = 42 + ... >>> raise Exception('cl?') + ... Traceback (most recent call last): + ... Exception: cl? + ... >>> import pdb; pdb.set_trace() + ... ''' + >>> parser = doctest.DocTestParser() + >>> test = parser.get_doctest(doc, {}, "foo-bar at baz", "foo-bar at baz.py", 0) + >>> runner = doctest.DocTestRunner(verbose=False) + + To demonstrate this, we'll create a fake standard input that + captures our debugger input: + + >>> import tempfile + >>> real_stdin = sys.stdin + >>> sys.stdin = _FakeInput([ + ... 'print(x)', # print data defined by the example + ... 'continue', # stop debugging + ... '']) + + >>> try: runner.run(test) + ... finally: sys.stdin = real_stdin + --Return-- + > (1)()->None + -> import pdb; pdb.set_trace() + (Pdb) print(x) + 42 + (Pdb) continue + TestResults(failed=0, attempted=3) + + You can also put pdb.set_trace in a function called from a test: + + >>> def calls_set_trace(): + ... y=2 + ... import pdb; pdb.set_trace() + + >>> doc = ''' + ... >>> x=1 + ... >>> calls_set_trace() + ... ''' + >>> test = parser.get_doctest(doc, globals(), "foo-bar at baz", "foo-bar at baz.py", 0) + >>> real_stdin = sys.stdin + >>> sys.stdin = _FakeInput([ + ... 'print(y)', # print data defined in the function + ... 'up', # out of function + ... 'print(x)', # print data defined by the example + ... 'continue', # stop debugging + ... '']) + + >>> try: + ... runner.run(test) + ... finally: + ... 
sys.stdin = real_stdin + --Return-- + > (3)calls_set_trace()->None + -> import pdb; pdb.set_trace() + (Pdb) print(y) + 2 + (Pdb) up + > (1)() + -> calls_set_trace() + (Pdb) print(x) + 1 + (Pdb) continue + TestResults(failed=0, attempted=2) + + During interactive debugging, source code is shown, even for + doctest examples: + + >>> doc = ''' + ... >>> def f(x): + ... ... g(x*2) + ... >>> def g(x): + ... ... print(x+3) + ... ... import pdb; pdb.set_trace() + ... >>> f(3) + ... ''' + >>> test = parser.get_doctest(doc, globals(), "foo-bar at baz", "foo-bar at baz.py", 0) + >>> real_stdin = sys.stdin + >>> sys.stdin = _FakeInput([ + ... 'list', # list source from example 2 + ... 'next', # return from g() + ... 'list', # list source from example 1 + ... 'next', # return from f() + ... 'list', # list source from example 3 + ... 'continue', # stop debugging + ... '']) + >>> try: runner.run(test) + ... finally: sys.stdin = real_stdin + ... # doctest: +NORMALIZE_WHITESPACE + --Return-- + > (3)g()->None + -> import pdb; pdb.set_trace() + (Pdb) list + 1 def g(x): + 2 print(x+3) + 3 -> import pdb; pdb.set_trace() + [EOF] + (Pdb) next + --Return-- + > (2)f()->None + -> g(x*2) + (Pdb) list + 1 def f(x): + 2 -> g(x*2) + [EOF] + (Pdb) next + --Return-- + > (1)()->None + -> f(3) + (Pdb) list + 1 -> f(3) + [EOF] + (Pdb) continue + ********************************************************************** + File "foo-bar at baz.py", line 7, in foo-bar at baz + Failed example: + f(3) + Expected nothing + Got: + 9 + TestResults(failed=1, attempted=3) + """ + + def test_pdb_set_trace_nested(): + """This illustrates more-demanding use of set_trace with nested functions. + + >>> class C(object): + ... def calls_set_trace(self): + ... y = 1 + ... import pdb; pdb.set_trace() + ... self.f1() + ... y = 2 + ... def f1(self): + ... x = 1 + ... self.f2() + ... x = 2 + ... def f2(self): + ... z = 1 + ... z = 2 + + >>> calls_set_trace = C().calls_set_trace + + >>> doc = ''' + ... >>> a = 1 + ... >>> calls_set_trace() + ... ''' + >>> parser = doctest.DocTestParser() + >>> runner = doctest.DocTestRunner(verbose=False) + >>> test = parser.get_doctest(doc, globals(), "foo-bar at baz", "foo-bar at baz.py", 0) + >>> real_stdin = sys.stdin + >>> sys.stdin = _FakeInput([ + ... 'print(y)', # print data defined in the function + ... 'step', 'step', 'step', 'step', 'step', 'step', 'print(z)', + ... 'up', 'print(x)', + ... 'up', 'print(y)', + ... 'up', 'print(foo)', + ... 'continue', # stop debugging + ... '']) + + >>> try: + ... runner.run(test) + ... finally: + ... sys.stdin = real_stdin + ... # doctest: +REPORT_NDIFF + > (5)calls_set_trace() + -> self.f1() + (Pdb) print(y) + 1 + (Pdb) step + --Call-- + > (7)f1() + -> def f1(self): + (Pdb) step + > (8)f1() + -> x = 1 + (Pdb) step + > (9)f1() + -> self.f2() + (Pdb) step + --Call-- + > (11)f2() + -> def f2(self): + (Pdb) step + > (12)f2() + -> z = 1 + (Pdb) step + > (13)f2() + -> z = 2 + (Pdb) print(z) + 1 + (Pdb) up + > (9)f1() + -> self.f2() + (Pdb) print(x) + 1 + (Pdb) up + > (5)calls_set_trace() + -> self.f1() + (Pdb) print(y) + 1 + (Pdb) up + > (1)() + -> calls_set_trace() + (Pdb) print(foo) + *** NameError: name 'foo' is not defined + (Pdb) continue + TestResults(failed=0, attempted=2) + """ def test_DocTestSuite(): """DocTestSuite creates a unittest test suite from a doctest. 
Modified: python/branches/py3k/Lib/test/test_exceptions.py ============================================================================== --- python/branches/py3k/Lib/test/test_exceptions.py (original) +++ python/branches/py3k/Lib/test/test_exceptions.py Mon Feb 21 20:29:56 2011 @@ -7,7 +7,7 @@ import weakref from test.support import (TESTFN, unlink, run_unittest, captured_output, - gc_collect, cpython_only) + gc_collect, cpython_only, no_tracing) # XXX This is not really enough, each *operation* should be tested! @@ -388,6 +388,7 @@ x = DerivedException(fancy_arg=42) self.assertEqual(x.fancy_arg, 42) + @no_tracing def testInfiniteRecursion(self): def f(): return f() @@ -631,6 +632,7 @@ u.start = 1000 self.assertEqual(str(u), "can't translate characters in position 1000-4: 965230951443685724997") + @no_tracing def test_badisinstance(self): # Bug #2542: if issubclass(e, MyException) raises an exception, # it should be ignored @@ -741,6 +743,7 @@ self.fail("MemoryError not raised") self.assertEqual(wr(), None) + @no_tracing def test_recursion_error_cleanup(self): # Same test as above, but with "recursion exceeded" errors class C: Modified: python/branches/py3k/Lib/test/test_io.py ============================================================================== --- python/branches/py3k/Lib/test/test_io.py (original) +++ python/branches/py3k/Lib/test/test_io.py Mon Feb 21 20:29:56 2011 @@ -2214,6 +2214,7 @@ with self.open(support.TESTFN, "w", errors="replace") as f: self.assertEqual(f.errors, "replace") + @support.no_tracing @unittest.skipUnless(threading, 'Threading required for this test.') def test_threads_write(self): # Issue6750: concurrent writes could duplicate data @@ -2669,6 +2670,7 @@ def test_interrupted_write_text(self): self.check_interrupted_write("xy", b"xy", mode="w", encoding="ascii") + @support.no_tracing def check_reentrant_write(self, data, **fdopen_kwargs): def on_alarm(*args): # Will be called reentrantly from the same thread Modified: python/branches/py3k/Lib/test/test_pdb.py ============================================================================== --- python/branches/py3k/Lib/test/test_pdb.py (original) +++ python/branches/py3k/Lib/test/test_pdb.py Mon Feb 21 20:29:56 2011 @@ -20,9 +20,12 @@ def __enter__(self): self.real_stdin = sys.stdin sys.stdin = _FakeInput(self.input) + self.orig_trace = sys.gettrace() if hasattr(sys, 'gettrace') else None def __exit__(self, *exc): sys.stdin = self.real_stdin + if self.orig_trace: + sys.settrace(self.orig_trace) def test_pdb_displayhook(): Modified: python/branches/py3k/Lib/test/test_richcmp.py ============================================================================== --- python/branches/py3k/Lib/test/test_richcmp.py (original) +++ python/branches/py3k/Lib/test/test_richcmp.py Mon Feb 21 20:29:56 2011 @@ -220,6 +220,7 @@ for func in (do, operator.not_): self.assertRaises(Exc, func, Bad()) + @support.no_tracing def test_recursion(self): # Check that comparison for recursive objects fails gracefully from collections import UserList Modified: python/branches/py3k/Lib/test/test_runpy.py ============================================================================== --- python/branches/py3k/Lib/test/test_runpy.py (original) +++ python/branches/py3k/Lib/test/test_runpy.py Mon Feb 21 20:29:56 2011 @@ -6,7 +6,8 @@ import re import tempfile import py_compile -from test.support import forget, make_legacy_pyc, run_unittest, unload, verbose +from test.support import ( + forget, make_legacy_pyc, run_unittest, unload, verbose, no_tracing) 
from test.script_helper import ( make_pkg, make_script, make_zip_pkg, make_zip_script, temp_dir) @@ -395,6 +396,7 @@ msg = "can't find '__main__' module in %r" % zip_name self._check_import_error(zip_name, msg) + @no_tracing def test_main_recursion_error(self): with temp_dir() as script_dir, temp_dir() as dummy_dir: mod_name = '__main__' Modified: python/branches/py3k/Lib/test/test_scope.py ============================================================================== --- python/branches/py3k/Lib/test/test_scope.py (original) +++ python/branches/py3k/Lib/test/test_scope.py Mon Feb 21 20:29:56 2011 @@ -1,5 +1,5 @@ import unittest -from test.support import check_syntax_error, run_unittest +from test.support import check_syntax_error, cpython_only, run_unittest class ScopeTests(unittest.TestCase): @@ -496,23 +496,22 @@ self.assertNotIn("x", varnames) self.assertIn("y", varnames) + @cpython_only def testLocalsClass_WithTrace(self): # Issue23728: after the trace function returns, the locals() # dictionary is used to update all variables, this used to # include free variables. But in class statements, free # variables are not inserted... import sys + self.addCleanup(sys.settrace, sys.gettrace()) sys.settrace(lambda a,b,c:None) - try: - x = 12 + x = 12 - class C: - def f(self): - return x + class C: + def f(self): + return x - self.assertEqual(x, 12) # Used to raise UnboundLocalError - finally: - sys.settrace(None) + self.assertEqual(x, 12) # Used to raise UnboundLocalError def testBoundAndFree(self): # var is bound and free in class @@ -527,6 +526,7 @@ inst = f(3)() self.assertEqual(inst.a, inst.m()) + @cpython_only def testInteractionWithTraceFunc(self): import sys @@ -543,6 +543,7 @@ class TestClass: pass + self.addCleanup(sys.settrace, sys.gettrace()) sys.settrace(tracer) adaptgetter("foo", TestClass, (1, "")) sys.settrace(None) Modified: python/branches/py3k/Lib/test/test_sys_settrace.py ============================================================================== --- python/branches/py3k/Lib/test/test_sys_settrace.py (original) +++ python/branches/py3k/Lib/test/test_sys_settrace.py Mon Feb 21 20:29:56 2011 @@ -251,6 +251,7 @@ def setUp(self): self.using_gc = gc.isenabled() gc.disable() + self.addCleanup(sys.settrace, sys.gettrace()) def tearDown(self): if self.using_gc: @@ -389,6 +390,9 @@ class RaisingTraceFuncTestCase(unittest.TestCase): + def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) + def trace(self, frame, event, arg): """A trace function that raises an exception in response to a specific trace event.""" @@ -688,6 +692,10 @@ class JumpTestCase(unittest.TestCase): + def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) + sys.settrace(None) + def compare_jump_output(self, expected, received): if received != expected: self.fail( "Outputs don't match:\n" + @@ -739,6 +747,8 @@ def test_18_no_jump_to_non_integers(self): self.run_test(no_jump_to_non_integers) def test_19_no_jump_without_trace_function(self): + # Must set sys.settrace(None) in setUp(), else condition is not + # triggered. 
no_jump_without_trace_function() def test_20_large_function(self): Modified: python/branches/py3k/Lib/test/test_trace.py ============================================================================== --- python/branches/py3k/Lib/test/test_trace.py (original) +++ python/branches/py3k/Lib/test/test_trace.py Mon Feb 21 20:29:56 2011 @@ -102,6 +102,7 @@ class TestLineCounts(unittest.TestCase): """White-box testing of line-counting, via runfunc""" def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) self.tracer = Trace(count=1, trace=0, countfuncs=0, countcallers=0) self.my_py_filename = fix_ext_py(__file__) @@ -192,6 +193,7 @@ """A simple sanity test of line-counting, via runctx (exec)""" def setUp(self): self.my_py_filename = fix_ext_py(__file__) + self.addCleanup(sys.settrace, sys.gettrace()) def test_exec_counts(self): self.tracer = Trace(count=1, trace=0, countfuncs=0, countcallers=0) @@ -218,6 +220,7 @@ class TestFuncs(unittest.TestCase): """White-box testing of funcs tracing""" def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) self.tracer = Trace(count=0, trace=0, countfuncs=1) self.filemod = my_file_and_modname() @@ -257,6 +260,7 @@ class TestCallers(unittest.TestCase): """White-box testing of callers tracing""" def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) self.tracer = Trace(count=0, trace=0, countcallers=1) self.filemod = my_file_and_modname() @@ -280,6 +284,9 @@ # Created separately for issue #3821 class TestCoverage(unittest.TestCase): + def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) + def tearDown(self): rmtree(TESTFN) unlink(TESTFN) Modified: python/branches/py3k/Lib/test/test_zipimport_support.py ============================================================================== --- python/branches/py3k/Lib/test/test_zipimport_support.py (original) +++ python/branches/py3k/Lib/test/test_zipimport_support.py Mon Feb 21 20:29:56 2011 @@ -163,20 +163,24 @@ test_zipped_doctest.test_DocTestRunner.verbose_flag, test_zipped_doctest.test_Example, test_zipped_doctest.test_debug, - test_zipped_doctest.test_pdb_set_trace, - test_zipped_doctest.test_pdb_set_trace_nested, test_zipped_doctest.test_testsource, test_zipped_doctest.test_trailing_space_in_test, test_zipped_doctest.test_DocTestSuite, test_zipped_doctest.test_DocTestFinder, ] - # These remaining tests are the ones which need access + # These tests are the ones which need access # to the data files, so we don't run them fail_due_to_missing_data_files = [ test_zipped_doctest.test_DocFileSuite, test_zipped_doctest.test_testfile, test_zipped_doctest.test_unittest_reportflags, ] + # These tests are skipped when a trace funciton is set + can_fail_due_to_tracing = [ + test_zipped_doctest.test_pdb_set_trace, + test_zipped_doctest.test_pdb_set_trace_nested, + ] + for obj in known_good_tests: _run_object_doctest(obj, test_zipped_doctest) finally: Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 20:29:56 2011 @@ -18,6 +18,10 @@ - Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. +Tests + +- Issue #10990: Prevent tests from clobbering a set trace function. + What's New in Python 3.2? 
========================= From python-checkins at python.org Mon Feb 21 20:31:38 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 20:31:38 +0100 (CET) Subject: [Python-checkins] r88468 - in python/branches/release32-maint: Modules/zlibmodule.c Message-ID: <20110221193138.800F4DA87@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 20:31:38 2011 New Revision: 88468 Log: Merged revisions 88464,88466 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88464 | antoine.pitrou | 2011-02-21 20:05:08 +0100 (lun., 21 f?vr. 2011) | 3 lines Fix issues on 32-bit systems introduced by r88460 ........ r88466 | antoine.pitrou | 2011-02-21 20:28:40 +0100 (lun., 21 f?vr. 2011) | 3 lines Fix compile error under MSVC introduced by r88460. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Modules/zlibmodule.c Modified: python/branches/release32-maint/Modules/zlibmodule.c ============================================================================== --- python/branches/release32-maint/Modules/zlibmodule.c (original) +++ python/branches/release32-maint/Modules/zlibmodule.c Mon Feb 21 20:31:38 2011 @@ -945,16 +945,16 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { - void *buf = pbuf.buf; + unsigned char *buf = pbuf.buf; Py_ssize_t len = pbuf.len; Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. adler32() takes length as an unsigned int, which may be narrower than Py_ssize_t. */ - while (len > (Py_ssize_t)UINT_MAX) { + while (len > (size_t) UINT_MAX) { adler32val = adler32(adler32val, buf, UINT_MAX); - buf += UINT_MAX; - len -= UINT_MAX; + buf += (size_t) UINT_MAX; + len -= (size_t) UINT_MAX; } adler32val = adler32(adler32val, buf, len); Py_END_ALLOW_THREADS @@ -983,16 +983,16 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { - void *buf = pbuf.buf; + unsigned char *buf = pbuf.buf; Py_ssize_t len = pbuf.len; Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. crc32() takes length as an unsigned int, which may be narrower than Py_ssize_t. */ - while (len > (Py_ssize_t)UINT_MAX) { + while (len > (size_t) UINT_MAX) { crc32val = crc32(crc32val, buf, UINT_MAX); - buf += UINT_MAX; - len -= UINT_MAX; + buf += (size_t) UINT_MAX; + len -= (size_t) UINT_MAX; } signed_val = crc32(crc32val, buf, len); Py_END_ALLOW_THREADS From python-checkins at python.org Mon Feb 21 20:38:54 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 21 Feb 2011 20:38:54 +0100 (CET) Subject: [Python-checkins] r88469 - in python/branches/release32-maint: Lib/collections.py Lib/configparser.py Misc/NEWS Message-ID: <20110221193854.26D36EE9D5@mail.python.org> Author: raymond.hettinger Date: Mon Feb 21 20:38:53 2011 New Revision: 88469 Log: Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. 
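For illustration only (not part of the patch), a minimal sketch of the chained-lookup idea applied here: instead of copying the defaults and the section into a fresh dict on every lookup, the mappings are searched in priority order. All names and values below are made up.

    vardict = {'answer': '42'}
    sectiondict = {'host': 'example.org'}
    defaults = {'host': 'localhost', 'timeout': '10'}

    def chained_lookup(key, *maps):
        # First mapping that contains the key wins; nothing is copied.
        for mapping in maps:
            if key in mapping:
                return mapping[key]
        raise KeyError(key)

    chained_lookup('host', vardict, sectiondict, defaults)     # -> 'example.org'
    chained_lookup('timeout', vardict, sectiondict, defaults)  # -> '10'
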
Modified: python/branches/release32-maint/Lib/collections.py python/branches/release32-maint/Lib/configparser.py python/branches/release32-maint/Misc/NEWS Modified: python/branches/release32-maint/Lib/collections.py ============================================================================== --- python/branches/release32-maint/Lib/collections.py (original) +++ python/branches/release32-maint/Lib/collections.py Mon Feb 21 20:38:53 2011 @@ -631,6 +631,97 @@ return result +######################################################################## +### ChainMap (helper for configparser) +######################################################################## + +class _ChainMap(MutableMapping): + ''' A ChainMap groups multiple dicts (or other mappings) together + to create a single, updateable view. + + The underlying mappings are stored in a list. That list is public and can + accessed or updated using the *maps* attribute. There is no other state. + + Lookups search the underlying mappings successively until a key is found. + In contrast, writes, updates, and deletions only operate on the first + mapping. + + ''' + + def __init__(self, *maps): + '''Initialize a ChainMap by setting *maps* to the given mappings. + If no mappings are provided, a single empty dictionary is used. + + ''' + self.maps = list(maps) or [{}] # always at least one map + + def __missing__(self, key): + raise KeyError(key) + + def __getitem__(self, key): + for mapping in self.maps: + try: + return mapping[key] # can't use 'key in mapping' with defaultdict + except KeyError: + pass + return self.__missing__(key) # support subclasses that define __missing__ + + def get(self, key, default=None): + return self[key] if key in self else default + + def __len__(self): + return len(set().union(*self.maps)) # reuses stored hash values if possible + + def __iter__(self): + return iter(set().union(*self.maps)) + + def __contains__(self, key): + return any(key in m for m in self.maps) + + @_recursive_repr() + def __repr__(self): + return '{0.__class__.__name__}({1})'.format( + self, ', '.join(map(repr, self.maps))) + + @classmethod + def fromkeys(cls, iterable, *args): + 'Create a ChainMap with a single dict created from the iterable.' + return cls(dict.fromkeys(iterable, *args)) + + def copy(self): + 'New ChainMap or subclass with a new copy of maps[0] and refs to maps[1:]' + return self.__class__(self.maps[0].copy(), *self.maps[1:]) + + __copy__ = copy + + def __setitem__(self, key, value): + self.maps[0][key] = value + + def __delitem__(self, key): + try: + del self.maps[0][key] + except KeyError: + raise KeyError('Key not found in the first mapping: {!r}'.format(key)) + + def popitem(self): + 'Remove and return an item pair from maps[0]. Raise KeyError is maps[0] is empty.' + try: + return self.maps[0].popitem() + except KeyError: + raise KeyError('No keys found in the first mapping.') + + def pop(self, key, *args): + 'Remove *key* from maps[0] and return its value. Raise KeyError if *key* not in maps[0].' + try: + return self.maps[0].pop(key, *args) + except KeyError: + raise KeyError('Key not found in the first mapping: {!r}'.format(key)) + + def clear(self): + 'Clear maps[0], leaving maps[1:] intact.' 
+ self.maps[0].clear() + + ################################################################################ ### UserDict ################################################################################ Modified: python/branches/release32-maint/Lib/configparser.py ============================================================================== --- python/branches/release32-maint/Lib/configparser.py (original) +++ python/branches/release32-maint/Lib/configparser.py Mon Feb 21 20:38:53 2011 @@ -119,7 +119,7 @@ between keys and values are surrounded by spaces. """ -from collections import MutableMapping, OrderedDict as _default_dict +from collections import MutableMapping, OrderedDict as _default_dict, _ChainMap import functools import io import itertools @@ -1099,23 +1099,24 @@ return exc def _unify_values(self, section, vars): - """Create a copy of the DEFAULTSECT with values from a specific - `section' and the `vars' dictionary. If provided, values in `vars' - take precendence. + """Create a sequence of lookups with 'vars' taking priority over + the 'section' which takes priority over the DEFAULTSECT. + """ - d = self._defaults.copy() + sectiondict = {} try: - d.update(self._sections[section]) + sectiondict = self._sections[section] except KeyError: if section != self.default_section: raise NoSectionError(section) # Update with the entry specific variables + vardict = {} if vars: for key, value in vars.items(): if value is not None: value = str(value) - d[self.optionxform(key)] = value - return d + vardict[self.optionxform(key)] = value + return _ChainMap(vardict, sectiondict, self._defaults) def _convert_to_boolean(self, value): """Return a boolean value translating from other types if necessary. Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Mon Feb 21 20:38:53 2011 @@ -15,6 +15,9 @@ Library ------- +- Issue #11089: Fix performance issue limiting the use of ConfigParser() + with large config files. + - Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. From python-checkins at python.org Mon Feb 21 20:42:11 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 21 Feb 2011 20:42:11 +0100 (CET) Subject: [Python-checkins] r88470 - in python/branches/py3k: Lib/collections.py Lib/configparser.py Misc/NEWS Message-ID: <20110221194211.B282BF9B5@mail.python.org> Author: raymond.hettinger Date: Mon Feb 21 20:42:11 2011 New Revision: 88470 Log: Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. Modified: python/branches/py3k/Lib/collections.py python/branches/py3k/Lib/configparser.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/collections.py ============================================================================== --- python/branches/py3k/Lib/collections.py (original) +++ python/branches/py3k/Lib/collections.py Mon Feb 21 20:42:11 2011 @@ -631,6 +631,97 @@ return result +######################################################################## +### ChainMap (helper for configparser) +######################################################################## + +class _ChainMap(MutableMapping): + ''' A ChainMap groups multiple dicts (or other mappings) together + to create a single, updateable view. + + The underlying mappings are stored in a list. 
That list is public and can + accessed or updated using the *maps* attribute. There is no other state. + + Lookups search the underlying mappings successively until a key is found. + In contrast, writes, updates, and deletions only operate on the first + mapping. + + ''' + + def __init__(self, *maps): + '''Initialize a ChainMap by setting *maps* to the given mappings. + If no mappings are provided, a single empty dictionary is used. + + ''' + self.maps = list(maps) or [{}] # always at least one map + + def __missing__(self, key): + raise KeyError(key) + + def __getitem__(self, key): + for mapping in self.maps: + try: + return mapping[key] # can't use 'key in mapping' with defaultdict + except KeyError: + pass + return self.__missing__(key) # support subclasses that define __missing__ + + def get(self, key, default=None): + return self[key] if key in self else default + + def __len__(self): + return len(set().union(*self.maps)) # reuses stored hash values if possible + + def __iter__(self): + return iter(set().union(*self.maps)) + + def __contains__(self, key): + return any(key in m for m in self.maps) + + @_recursive_repr() + def __repr__(self): + return '{0.__class__.__name__}({1})'.format( + self, ', '.join(map(repr, self.maps))) + + @classmethod + def fromkeys(cls, iterable, *args): + 'Create a ChainMap with a single dict created from the iterable.' + return cls(dict.fromkeys(iterable, *args)) + + def copy(self): + 'New ChainMap or subclass with a new copy of maps[0] and refs to maps[1:]' + return self.__class__(self.maps[0].copy(), *self.maps[1:]) + + __copy__ = copy + + def __setitem__(self, key, value): + self.maps[0][key] = value + + def __delitem__(self, key): + try: + del self.maps[0][key] + except KeyError: + raise KeyError('Key not found in the first mapping: {!r}'.format(key)) + + def popitem(self): + 'Remove and return an item pair from maps[0]. Raise KeyError is maps[0] is empty.' + try: + return self.maps[0].popitem() + except KeyError: + raise KeyError('No keys found in the first mapping.') + + def pop(self, key, *args): + 'Remove *key* from maps[0] and return its value. Raise KeyError if *key* not in maps[0].' + try: + return self.maps[0].pop(key, *args) + except KeyError: + raise KeyError('Key not found in the first mapping: {!r}'.format(key)) + + def clear(self): + 'Clear maps[0], leaving maps[1:] intact.' + self.maps[0].clear() + + ################################################################################ ### UserDict ################################################################################ Modified: python/branches/py3k/Lib/configparser.py ============================================================================== --- python/branches/py3k/Lib/configparser.py (original) +++ python/branches/py3k/Lib/configparser.py Mon Feb 21 20:42:11 2011 @@ -119,7 +119,7 @@ between keys and values are surrounded by spaces. """ -from collections import MutableMapping, OrderedDict as _default_dict +from collections import MutableMapping, OrderedDict as _default_dict, _ChainMap import functools import io import itertools @@ -1099,23 +1099,24 @@ return exc def _unify_values(self, section, vars): - """Create a copy of the DEFAULTSECT with values from a specific - `section' and the `vars' dictionary. If provided, values in `vars' - take precendence. + """Create a sequence of lookups with 'vars' taking priority over + the 'section' which takes priority over the DEFAULTSECT. 
+ """ - d = self._defaults.copy() + sectiondict = {} try: - d.update(self._sections[section]) + sectiondict = self._sections[section] except KeyError: if section != self.default_section: raise NoSectionError(section) # Update with the entry specific variables + vardict = {} if vars: for key, value in vars.items(): if value is not None: value = str(value) - d[self.optionxform(key)] = value - return d + vardict[self.optionxform(key)] = value + return _ChainMap(vardict, sectiondict, self._defaults) def _convert_to_boolean(self, value): """Return a boolean value translating from other types if necessary. Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 20:42:11 2011 @@ -15,6 +15,9 @@ Library ------- +- Issue #11089: Fix performance issue limiting the use of ConfigParser() + with large config files. + - Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. From python-checkins at python.org Mon Feb 21 20:49:22 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 21 Feb 2011 20:49:22 +0100 (CET) Subject: [Python-checkins] r88471 - in python/branches/release32-maint: .hgignore Message-ID: <20110221194922.3CE43EBC8@mail.python.org> Author: brett.cannon Date: Mon Feb 21 20:49:20 2011 New Revision: 88471 Log: Merged revisions 88465 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88465 | brett.cannon | 2011-02-21 11:24:10 -0800 (Mon, 21 Feb 2011) | 2 lines Ignore Vim .swp files. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/.hgignore Modified: python/branches/release32-maint/.hgignore ============================================================================== --- python/branches/release32-maint/.hgignore (original) +++ python/branches/release32-maint/.hgignore Mon Feb 21 20:49:20 2011 @@ -39,6 +39,7 @@ syntax: glob libpython*.a +*.swp *.o *.pyc *.pyo From python-checkins at python.org Mon Feb 21 20:50:24 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 21 Feb 2011 20:50:24 +0100 (CET) Subject: [Python-checkins] r88472 - python/branches/release32-maint Message-ID: <20110221195024.EDDF5EE98C@mail.python.org> Author: brett.cannon Date: Mon Feb 21 20:50:24 2011 New Revision: 88472 Log: Blocked revisions 88467 via svnmerge ........ r88467 | brett.cannon | 2011-02-21 11:29:56 -0800 (Mon, 21 Feb 2011) | 11 lines Issue #10990: Prevent tests from clobbering a set trace function. Many tests simply didn't care if they unset a pre-existing trace function. This made test coverage impossible. This patch fixes various tests to put back any pre-existing trace function. It also introduces test.support.no_tracing as a decorator which will temporarily unset the trace function for tests which simply fail otherwise. Thanks to Kristian Vlaardingerbroek for helping to find the cause of various trace function unsets. ........ 
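For illustration only (not part of the blocked revision), a hypothetical use of the test.support.no_tracing decorator described above; the test class, method name and body are made up:

    from test.support import no_tracing
    import unittest

    class RecursionTests(unittest.TestCase):

        @no_tracing
        def test_infinite_recursion(self):
            # Any active trace function is removed for the duration of the
            # test and restored afterwards by the decorator.
            def f():
                return f()
            self.assertRaises(RuntimeError, f)
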
Modified: python/branches/release32-maint/ (props changed) From python-checkins at python.org Mon Feb 21 20:56:24 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 21 Feb 2011 20:56:24 +0100 (CET) Subject: [Python-checkins] r88473 - python/branches/release32-maint/Doc/library/ftplib.rst Message-ID: <20110221195624.390DEEE9F4@mail.python.org> Author: raymond.hettinger Date: Mon Feb 21 20:56:24 2011 New Revision: 88473 Log: Issue 11263: Fix link to source code. Modified: python/branches/release32-maint/Doc/library/ftplib.rst Modified: python/branches/release32-maint/Doc/library/ftplib.rst ============================================================================== --- python/branches/release32-maint/Doc/library/ftplib.rst (original) +++ python/branches/release32-maint/Doc/library/ftplib.rst Mon Feb 21 20:56:24 2011 @@ -9,7 +9,7 @@ pair: FTP; protocol single: FTP; ftplib (standard module) -**Source code:** :source:`Lib/ftp.py` +**Source code:** :source:`Lib/ftplib.py` -------------- From python-checkins at python.org Mon Feb 21 20:58:38 2011 From: python-checkins at python.org (raymond.hettinger) Date: Mon, 21 Feb 2011 20:58:38 +0100 (CET) Subject: [Python-checkins] r88474 - python/branches/py3k/Doc/library/ftplib.rst Message-ID: <20110221195838.0334BC392@mail.python.org> Author: raymond.hettinger Date: Mon Feb 21 20:58:37 2011 New Revision: 88474 Log: Issue 11263: Fix link to source code. Modified: python/branches/py3k/Doc/library/ftplib.rst Modified: python/branches/py3k/Doc/library/ftplib.rst ============================================================================== --- python/branches/py3k/Doc/library/ftplib.rst (original) +++ python/branches/py3k/Doc/library/ftplib.rst Mon Feb 21 20:58:37 2011 @@ -9,7 +9,7 @@ pair: FTP; protocol single: FTP; ftplib (standard module) -**Source code:** :source:`Lib/ftp.py` +**Source code:** :source:`Lib/ftplib.py` -------------- From python-checkins at python.org Mon Feb 21 21:17:37 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 21:17:37 +0100 Subject: [Python-checkins] hooks: The buildbot hook now takes a "rev_url" config parameter to allow Message-ID: antoine.pitrou pushed dea047a97b74 to hooks: http://hg.python.org/hooks/rev/dea047a97b74 changeset: 28:dea047a97b74 tag: tip user: Antoine Pitrou date: Mon Feb 21 21:17:35 2011 +0100 summary: The buildbot hook now takes a "rev_url" config parameter to allow customizing the web link to revisions. 
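For illustration only (not part of the changeset), a hypothetical hgrc fragment using the new option; since the hook builds the link with url % {'rev': hex(node)}, the value is expected to contain a %(rev)s placeholder. The repository URL below is invented and any other required hgbuildbot settings are omitted:

    [hgbuildbot]
    prefix = hooks/
    rev_url = http://hg.python.org/hooks/rev/%(rev)s
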
files: buildbot.py diff --git a/buildbot.py b/buildbot.py --- a/buildbot.py +++ b/buildbot.py @@ -48,8 +48,7 @@ reactor.callLater(0, d.callback, None) def send(res, c): - return s.send(c['branch'], c['revision'], c['comments'], - c['files'], c['username']) + return s.send(**c) for change in changes: d.addCallback(send, change) d.addCallbacks(s.printSuccess, s.printFailure) @@ -64,6 +63,7 @@ "order to use buildbot hook\n") return prefix = ui.config('hgbuildbot', 'prefix', '') + url = ui.config('hgbuildbot', 'rev_url', '') if hooktype != 'changegroup': ui.status('hgbuildbot: hook %s not supported\n' % hooktype) @@ -85,9 +85,10 @@ # add artificial prefix if configured files = [prefix + f for f in files] changes.append({ - 'username': user, + 'user': user, 'revision': hex(node), 'comments': desc.decode('utf8', 'replace'), + 'revlink': (url % {'rev': hex(node)}) if url else '', 'files': files, 'branch': branch, }) -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Mon Feb 21 21:44:27 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 21 Feb 2011 21:44:27 +0100 (CET) Subject: [Python-checkins] r88475 - in python/branches/py3k: Mac/BuildScript/scripts/postflight.documentation Misc/NEWS Message-ID: <20110221204427.CDC03EE99C@mail.python.org> Author: ned.deily Date: Mon Feb 21 21:44:27 2011 New Revision: 88475 Log: Issue #11268: Prevent Mac OS X Installer failure if Documentation package had previously been installed. Modified: python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation ============================================================================== --- python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation (original) +++ python/branches/py3k/Mac/BuildScript/scripts/postflight.documentation Mon Feb 21 21:44:27 2011 @@ -27,6 +27,6 @@ if [ -d "${SHARE_DIR}" ]; then mkdir -p "${SHARE_DOCDIR}" # make relative link to html doc directory - ln -s "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" + ln -fhs "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" fi Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 21:44:27 2011 @@ -21,7 +21,14 @@ - Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. +Build +----- + +- Issue #11268: Prevent Mac OS X Installer failure if Documentation + package had previously been installed. + Tests +----- - Issue #10990: Prevent tests from clobbering a set trace function. From python-checkins at python.org Mon Feb 21 21:51:29 2011 From: python-checkins at python.org (victor.stinner) Date: Mon, 21 Feb 2011 21:51:29 +0100 (CET) Subject: [Python-checkins] r88476 - in python/branches/py3k: Misc/NEWS Objects/unicodeobject.c Message-ID: <20110221205129.32C0FEE98C@mail.python.org> Author: victor.stinner Date: Mon Feb 21 21:51:28 2011 New Revision: 88476 Log: Remove bootstrap code of PyUnicode_AsEncodedString() Issue #11187: Remove bootstrap code (use ASCII) of PyUnicode_AsEncodedString(), it was replaced by a better fallback (use the locale encoding) in PyUnicode_EncodeFSDefault(). Prepare also empty sections in NEWS. 
Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/Objects/unicodeobject.c Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 21:51:28 2011 @@ -12,11 +12,15 @@ - Check for NULL result in PyType_FromSpec. +- Issue #11187: Remove bootstrap code (use ASCII) of + PyUnicode_AsEncodedString(), it was replaced by a better fallback (use the + locale encoding) in PyUnicode_EncodeFSDefault(). + Library ------- - Issue #11089: Fix performance issue limiting the use of ConfigParser() - with large config files. + with large config files. - Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. Modified: python/branches/py3k/Objects/unicodeobject.c ============================================================================== --- python/branches/py3k/Objects/unicodeobject.c (original) +++ python/branches/py3k/Objects/unicodeobject.c Mon Feb 21 21:51:28 2011 @@ -1673,21 +1673,6 @@ PyUnicode_GET_SIZE(unicode), errors); } - /* During bootstrap, we may need to find the encodings - package, to load the file system encoding, and require the - file system encoding in order to load the encodings - package. - - Break out of this dependency by assuming that the path to - the encodings module is ASCII-only. XXX could try wcstombs - instead, if the file system encoding is the locale's - encoding. */ - if (Py_FileSystemDefaultEncoding && - strcmp(encoding, Py_FileSystemDefaultEncoding) == 0 && - !PyThreadState_GET()->interp->codecs_initialized) - return PyUnicode_EncodeASCII(PyUnicode_AS_UNICODE(unicode), - PyUnicode_GET_SIZE(unicode), - errors); /* Encode via the codec registry */ v = PyCodec_Encode(unicode, encoding, errors); From python-checkins at python.org Mon Feb 21 21:52:58 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 21 Feb 2011 21:52:58 +0100 (CET) Subject: [Python-checkins] r88477 - in python/branches/release32-maint: Mac/BuildScript/scripts/postflight.documentation Misc/NEWS Message-ID: <20110221205258.7A12DEE99C@mail.python.org> Author: ned.deily Date: Mon Feb 21 21:52:58 2011 New Revision: 88477 Log: Merged revisions 88475 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88475 | ned.deily | 2011-02-21 12:44:27 -0800 (Mon, 21 Feb 2011) | 3 lines Issue #11268: Prevent Mac OS X Installer failure if Documentation package had previously been installed. ........ 
Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Mac/BuildScript/scripts/postflight.documentation python/branches/release32-maint/Misc/NEWS Modified: python/branches/release32-maint/Mac/BuildScript/scripts/postflight.documentation ============================================================================== --- python/branches/release32-maint/Mac/BuildScript/scripts/postflight.documentation (original) +++ python/branches/release32-maint/Mac/BuildScript/scripts/postflight.documentation Mon Feb 21 21:52:58 2011 @@ -27,6 +27,6 @@ if [ -d "${SHARE_DIR}" ]; then mkdir -p "${SHARE_DOCDIR}" # make relative link to html doc directory - ln -s "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" + ln -fhs "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" fi Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Mon Feb 21 21:52:58 2011 @@ -21,6 +21,12 @@ - Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. +Build +----- + +- Issue #11268: Prevent Mac OS X Installer failure if Documentation + package had previously been installed. + What's New in Python 3.2? ========================= From python-checkins at python.org Mon Feb 21 21:58:02 2011 From: python-checkins at python.org (victor.stinner) Date: Mon, 21 Feb 2011 21:58:02 +0100 (CET) Subject: [Python-checkins] r88478 - in python/branches/py3k: Lib/compileall.py Lib/test/test_compileall.py Misc/NEWS Message-ID: <20110221205802.3589CEE9B1@mail.python.org> Author: victor.stinner Date: Mon Feb 21 21:58:02 2011 New Revision: 88478 Log: compileall uses repr() to format filenames/paths Issue #11169: compileall module uses repr() to format filenames and paths to escape surrogate characters and show spaces. 
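For illustration only (not part of the commit), the visible effect of switching to {!r} formatting, using a made-up directory name that contains a space and a surrogate-escaped byte:

    # Hypothetical undecodable directory name (surrogate escape); not from the patch.
    dirname = 'My Projects\udcee'

    # Old style, print('Listing', dirname, '...'), may raise UnicodeEncodeError
    # when the lone surrogate is written to the terminal, and the bare space
    # hides where the name ends.

    print('Listing {!r}...'.format(dirname))
    # -> Listing 'My Projects\udcee'...
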
Modified: python/branches/py3k/Lib/compileall.py python/branches/py3k/Lib/test/test_compileall.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/compileall.py ============================================================================== --- python/branches/py3k/Lib/compileall.py (original) +++ python/branches/py3k/Lib/compileall.py Mon Feb 21 21:58:02 2011 @@ -35,11 +35,11 @@ optimize: optimization level or -1 for level of the interpreter """ if not quiet: - print('Listing', dir, '...') + print('Listing {!r}...'.format(dir)) try: names = os.listdir(dir) except os.error: - print("Can't list", dir) + print("Can't list {!r}".format(dir)) names = [] names.sort() success = 1 @@ -109,13 +109,13 @@ except IOError: pass if not quiet: - print('Compiling', fullname, '...') + print('Compiling {!r}...'.format(fullname)) try: ok = py_compile.compile(fullname, cfile, dfile, True, optimize=optimize) except py_compile.PyCompileError as err: if quiet: - print('*** Error compiling', fullname, '...') + print('*** Error compiling {!r}...'.format(fullname)) else: print('*** ', end='') # escape non-printable characters in msg @@ -126,7 +126,7 @@ success = 0 except (SyntaxError, UnicodeError, IOError) as e: if quiet: - print('*** Error compiling', fullname, '...') + print('*** Error compiling {!r}...'.format(fullname)) else: print('*** ', end='') print(e.__class__.__name__ + ':', e) Modified: python/branches/py3k/Lib/test/test_compileall.py ============================================================================== --- python/branches/py3k/Lib/test/test_compileall.py (original) +++ python/branches/py3k/Lib/test/test_compileall.py Mon Feb 21 21:58:02 2011 @@ -345,7 +345,7 @@ def test_invalid_arg_produces_message(self): out = self.assertRunOK('badfilename') - self.assertRegex(out, b"Can't list badfilename") + self.assertRegex(out, b"Can't list 'badfilename'") def test_main(): Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 21:58:02 2011 @@ -19,6 +19,9 @@ Library ------- +- Issue #11169: compileall module uses repr() to format filenames and paths to + escape surrogate characters and show spaces. + - Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. From python-checkins at python.org Mon Feb 21 22:00:09 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 21 Feb 2011 22:00:09 +0100 (CET) Subject: [Python-checkins] r88479 - in python/branches/release27-maint: Mac/BuildScript/scripts/postflight.documentation Misc/NEWS Message-ID: <20110221210009.7111BF46E@mail.python.org> Author: ned.deily Date: Mon Feb 21 22:00:09 2011 New Revision: 88479 Log: Merged revisions 88475 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88475 | ned.deily | 2011-02-21 12:44:27 -0800 (Mon, 21 Feb 2011) | 3 lines Issue #11268: Prevent Mac OS X Installer failure if Documentation package had previously been installed. ........ 
Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation python/branches/release27-maint/Misc/NEWS Modified: python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation ============================================================================== --- python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation (original) +++ python/branches/release27-maint/Mac/BuildScript/scripts/postflight.documentation Mon Feb 21 22:00:09 2011 @@ -27,6 +27,6 @@ if [ -d "${SHARE_DIR}" ]; then mkdir -p "${SHARE_DOCDIR}" # make relative link to html doc directory - ln -s "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" + ln -fhs "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" fi Modified: python/branches/release27-maint/Misc/NEWS ============================================================================== --- python/branches/release27-maint/Misc/NEWS (original) +++ python/branches/release27-maint/Misc/NEWS Mon Feb 21 22:00:09 2011 @@ -173,6 +173,9 @@ Build ----- +- Issue #11268: Prevent Mac OS X Installer failure if Documentation + package had previously been installed. + - Issue #11079: The /Applications/Python x.x folder created by the Mac OS X installers now includes a link to the installed documentation. From python-checkins at python.org Mon Feb 21 22:05:51 2011 From: python-checkins at python.org (victor.stinner) Date: Mon, 21 Feb 2011 22:05:51 +0100 (CET) Subject: [Python-checkins] r88480 - in python/branches/py3k: Misc/NEWS Python/ceval.c Message-ID: <20110221210551.11CDCEE98C@mail.python.org> Author: victor.stinner Date: Mon Feb 21 22:05:50 2011 New Revision: 88480 Log: Remove filename variable from ceval.c Issue #11168: Remove filename debug variable from PyEval_EvalFrameEx(). It encoded the Unicode filename to UTF-8, but the encoding fails on undecodable filename (on surrogate characters) which raises an unexpected UnicodeEncodeError on recursion limit. Modified: python/branches/py3k/Misc/NEWS python/branches/py3k/Python/ceval.c Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 22:05:50 2011 @@ -10,12 +10,17 @@ Core and Builtins ----------------- -- Check for NULL result in PyType_FromSpec. +- Issue #11168: Remove filename debug variable from PyEval_EvalFrameEx(). + It encoded the Unicode filename to UTF-8, but the encoding fails on + undecodable filename (on surrogate characters) which raises an unexpected + UnicodeEncodeError on recursion limit. - Issue #11187: Remove bootstrap code (use ASCII) of PyUnicode_AsEncodedString(), it was replaced by a better fallback (use the locale encoding) in PyUnicode_EncodeFSDefault(). +- Check for NULL result in PyType_FromSpec. 
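A short illustration (not taken from the patch) of the failure mode described above: UTF-8-encoding a filename that contains a lone surrogate, as produced by the surrogateescape error handler for undecodable bytes, raises UnicodeEncodeError:

    >>> '\udcff.py'.encode('utf-8')
    Traceback (most recent call last):
      ...
    UnicodeEncodeError: 'utf-8' codec can't encode character '\udcff' in position 0: surrogates not allowed
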
+ Library ------- Modified: python/branches/py3k/Python/ceval.c ============================================================================== --- python/branches/py3k/Python/ceval.c (original) +++ python/branches/py3k/Python/ceval.c Mon Feb 21 22:05:50 2011 @@ -811,10 +811,6 @@ unsigned char *first_instr; PyObject *names; PyObject *consts; -#if defined(Py_DEBUG) || defined(LLTRACE) - /* Make it easier to find out where we are with a debugger */ - char *filename; -#endif /* Computed GOTOs, or the-optimization-commonly-but-improperly-known-as-"threaded code" @@ -1227,18 +1223,6 @@ #ifdef LLTRACE lltrace = PyDict_GetItemString(f->f_globals, "__lltrace__") != NULL; #endif -#if defined(Py_DEBUG) || defined(LLTRACE) - { - PyObject *error_type, *error_value, *error_traceback; - PyErr_Fetch(&error_type, &error_value, &error_traceback); - filename = _PyUnicode_AsString(co->co_filename); - if (filename == NULL && tstate->overflowed) { - /* maximum recursion depth exceeded */ - goto exit_eval_frame; - } - PyErr_Restore(error_type, error_value, error_traceback); - } -#endif why = WHY_NOT; err = 0; From python-checkins at python.org Mon Feb 21 22:13:44 2011 From: python-checkins at python.org (victor.stinner) Date: Mon, 21 Feb 2011 22:13:44 +0100 (CET) Subject: [Python-checkins] r88481 - in python/branches/py3k: Lib/test/test_unicode.py Misc/NEWS Objects/unicodeobject.c Message-ID: <20110221211344.CE3A2DCD5@mail.python.org> Author: victor.stinner Date: Mon Feb 21 22:13:44 2011 New Revision: 88481 Log: Fix PyUnicode_FromFormatV("%c") for non-BMP char Issue #10830: Fix PyUnicode_FromFormatV("%c") for non-BMP characters on narrow build. Modified: python/branches/py3k/Lib/test/test_unicode.py python/branches/py3k/Misc/NEWS python/branches/py3k/Objects/unicodeobject.c Modified: python/branches/py3k/Lib/test/test_unicode.py ============================================================================== --- python/branches/py3k/Lib/test/test_unicode.py (original) +++ python/branches/py3k/Lib/test/test_unicode.py Mon Feb 21 22:13:44 2011 @@ -1427,7 +1427,7 @@ # Test PyUnicode_FromFormat() def test_from_format(self): support.import_module('ctypes') - from ctypes import pythonapi, py_object + from ctypes import pythonapi, py_object, c_int if sys.maxunicode == 65535: name = "PyUnicodeUCS2_FromFormat" else: @@ -1452,6 +1452,9 @@ 'string, got a non-ASCII byte: 0xe9$', PyUnicode_FromFormat, b'unicode\xe9=%s', 'ascii') + self.assertEqual(PyUnicode_FromFormat(b'%c', c_int(0xabcd)), '\uabcd') + self.assertEqual(PyUnicode_FromFormat(b'%c', c_int(0x10ffff)), '\U0010ffff') + # other tests text = PyUnicode_FromFormat(b'%%A:%A', 'abc\xe9\uabcd\U0010ffff') self.assertEqual(text, r"%A:'abc\xe9\uabcd\U0010ffff'") Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 21 22:13:44 2011 @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #10830: Fix PyUnicode_FromFormatV("%c") for non-BMP characters on + narrow build. + - Issue #11168: Remove filename debug variable from PyEval_EvalFrameEx(). 
It encoded the Unicode filename to UTF-8, but the encoding fails on undecodable filename (on surrogate characters) which raises an unexpected Modified: python/branches/py3k/Objects/unicodeobject.c ============================================================================== --- python/branches/py3k/Objects/unicodeobject.c (original) +++ python/branches/py3k/Objects/unicodeobject.c Mon Feb 21 22:13:44 2011 @@ -813,8 +813,19 @@ switch (*f) { case 'c': + { +#ifndef Py_UNICODE_WIDE + int ordinal = va_arg(count, int); + if (ordinal > 0xffff) + n += 2; + else + n++; +#else (void)va_arg(count, int); - /* fall through... */ + n++; +#endif + break; + } case '%': n++; break; @@ -992,8 +1003,18 @@ switch (*f) { case 'c': - *s++ = va_arg(vargs, int); + { + int ordinal = va_arg(vargs, int); +#ifndef Py_UNICODE_WIDE + if (ordinal > 0xffff) { + ordinal -= 0x10000; + *s++ = 0xD800 | (ordinal >> 10); + *s++ = 0xDC00 | (ordinal & 0x3FF); + } else +#endif + *s++ = ordinal; break; + } case 'd': makefmt(fmt, longflag, longlongflag, size_tflag, zeropad, width, precision, 'd'); From python-checkins at python.org Mon Feb 21 22:25:39 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 22:25:39 +0100 (CET) Subject: [Python-checkins] r88482 - python/branches/issue10276-snowleopard Message-ID: <20110221212539.6AD3DEE98C@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 22:25:39 2011 New Revision: 88482 Log: temp branch to debug crashes on snow leopard buildbot Added: python/branches/issue10276-snowleopard/ - copied from r88481, /python/branches/py3k/ From python-checkins at python.org Mon Feb 21 22:30:55 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 22:30:55 +0100 (CET) Subject: [Python-checkins] r88483 - in python/branches/issue10276-snowleopard: Makefile.pre.in Modules/zlibmodule.c Message-ID: <20110221213055.9E045EC04@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 22:30:55 2011 New Revision: 88483 Log: Try s/UINT_MAX/INT_MAX/ Modified: python/branches/issue10276-snowleopard/Makefile.pre.in python/branches/issue10276-snowleopard/Modules/zlibmodule.c Modified: python/branches/issue10276-snowleopard/Makefile.pre.in ============================================================================== --- python/branches/issue10276-snowleopard/Makefile.pre.in (original) +++ python/branches/issue10276-snowleopard/Makefile.pre.in Mon Feb 21 22:30:55 2011 @@ -784,7 +784,7 @@ - at if which pybuildbot.identify >/dev/null 2>&1; then \ pybuildbot.identify "CC='$(CC)'" "CXX='$(CXX)'"; \ fi - $(TESTPYTHON) $(TESTPROG) -uall -rwW $(TESTOPTS) + $(TESTPYTHON) $(TESTPROG) -uall -rwW $(TESTOPTS) -v test_zlib QUICKTESTOPTS= $(TESTOPTS) -x test_subprocess test_io test_lib2to3 \ test_multibytecodec test_urllib2_localnet test_itertools \ Modified: python/branches/issue10276-snowleopard/Modules/zlibmodule.c ============================================================================== --- python/branches/issue10276-snowleopard/Modules/zlibmodule.c (original) +++ python/branches/issue10276-snowleopard/Modules/zlibmodule.c Mon Feb 21 22:30:55 2011 @@ -951,10 +951,10 @@ Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. adler32() takes length as an unsigned int, which may be narrower than Py_ssize_t. 
*/ - while (len > (size_t) UINT_MAX) { - adler32val = adler32(adler32val, buf, UINT_MAX); - buf += (size_t) UINT_MAX; - len -= (size_t) UINT_MAX; + while (len > (size_t) INT_MAX) { + adler32val = adler32(adler32val, buf, INT_MAX); + buf += (size_t) INT_MAX; + len -= (size_t) INT_MAX; } adler32val = adler32(adler32val, buf, len); Py_END_ALLOW_THREADS @@ -989,10 +989,10 @@ Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. crc32() takes length as an unsigned int, which may be narrower than Py_ssize_t. */ - while (len > (size_t) UINT_MAX) { - crc32val = crc32(crc32val, buf, UINT_MAX); - buf += (size_t) UINT_MAX; - len -= (size_t) UINT_MAX; + while (len > (size_t) INT_MAX) { + crc32val = crc32(crc32val, buf, INT_MAX); + buf += (size_t) INT_MAX; + len -= (size_t) INT_MAX; } signed_val = crc32(crc32val, buf, len); Py_END_ALLOW_THREADS From python-checkins at python.org Mon Feb 21 22:55:48 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 22:55:48 +0100 (CET) Subject: [Python-checkins] r88484 - in python/branches/py3k: Lib/test/subprocessdata/fd_status.py Lib/test/test_subprocess.py Misc/NEWS Message-ID: <20110221215548.98F23EE98F@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 22:55:48 2011 New Revision: 88484 Log: Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due to open door files. Modified: python/branches/py3k/Lib/test/subprocessdata/fd_status.py python/branches/py3k/Lib/test/test_subprocess.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/test/subprocessdata/fd_status.py ============================================================================== --- python/branches/py3k/Lib/test/subprocessdata/fd_status.py (original) +++ python/branches/py3k/Lib/test/subprocessdata/fd_status.py Mon Feb 21 22:55:48 2011 @@ -3,22 +3,22 @@ import errno import os -import fcntl try: _MAXFD = os.sysconf("SC_OPEN_MAX") except: _MAXFD = 256 -def isopen(fd): - """Return True if the fd is open, and False otherwise""" - try: - fcntl.fcntl(fd, fcntl.F_GETFD, 0) - except IOError as e: - if e.errno == errno.EBADF: - return False - raise - return True - if __name__ == "__main__": - print(','.join(str(fd) for fd in range(0, _MAXFD) if isopen(fd))) + fds = [] + for fd in range(0, _MAXFD): + try: + st = os.fstat(fd) + except OSError as e: + if e.errno == errno.EBADF: + continue + raise + # Ignore Solaris door files + if st.st_mode & 0xF000 != 0xd000: + fds.append(fd) + print(','.join(map(str, fds))) Modified: python/branches/py3k/Lib/test/test_subprocess.py ============================================================================== --- python/branches/py3k/Lib/test/test_subprocess.py (original) +++ python/branches/py3k/Lib/test/test_subprocess.py Mon Feb 21 22:55:48 2011 @@ -1156,9 +1156,6 @@ open_fds = set() - if support.verbose: - print(" -- maxfd =", subprocess.MAXFD) - for x in range(5): fds = os.pipe() self.addCleanup(os.close, fds[0]) @@ -1173,10 +1170,6 @@ remaining_fds = set(map(int, output.split(b','))) to_be_closed = open_fds - {fd} - # Temporary debug output for intermittent failures - if support.verbose: - print(" -- fds that should have been closed:", to_be_closed) - print(" -- fds that remained open:", remaining_fds) self.assertIn(fd, remaining_fds, "fd to be passed not passed") self.assertFalse(remaining_fds & to_be_closed, Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ 
python/branches/py3k/Misc/NEWS Mon Feb 21 22:55:48 2011 @@ -45,6 +45,9 @@ Tests ----- +- Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due + to open door files. + - Issue #10990: Prevent tests from clobbering a set trace function. From python-checkins at python.org Mon Feb 21 22:58:42 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 21 Feb 2011 22:58:42 +0100 (CET) Subject: [Python-checkins] r88485 - in python/branches/release32-maint: Lib/test/subprocessdata/fd_status.py Lib/test/test_subprocess.py Misc/NEWS Message-ID: <20110221215842.E155AEE98F@mail.python.org> Author: antoine.pitrou Date: Mon Feb 21 22:58:42 2011 New Revision: 88485 Log: Merged revisions 88484 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88484 | antoine.pitrou | 2011-02-21 22:55:48 +0100 (lun., 21 f?vr. 2011) | 4 lines Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due to open door files. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/subprocessdata/fd_status.py python/branches/release32-maint/Lib/test/test_subprocess.py python/branches/release32-maint/Misc/NEWS Modified: python/branches/release32-maint/Lib/test/subprocessdata/fd_status.py ============================================================================== --- python/branches/release32-maint/Lib/test/subprocessdata/fd_status.py (original) +++ python/branches/release32-maint/Lib/test/subprocessdata/fd_status.py Mon Feb 21 22:58:42 2011 @@ -3,22 +3,22 @@ import errno import os -import fcntl try: _MAXFD = os.sysconf("SC_OPEN_MAX") except: _MAXFD = 256 -def isopen(fd): - """Return True if the fd is open, and False otherwise""" - try: - fcntl.fcntl(fd, fcntl.F_GETFD, 0) - except IOError as e: - if e.errno == errno.EBADF: - return False - raise - return True - if __name__ == "__main__": - print(','.join(str(fd) for fd in range(0, _MAXFD) if isopen(fd))) + fds = [] + for fd in range(0, _MAXFD): + try: + st = os.fstat(fd) + except OSError as e: + if e.errno == errno.EBADF: + continue + raise + # Ignore Solaris door files + if st.st_mode & 0xF000 != 0xd000: + fds.append(fd) + print(','.join(map(str, fds))) Modified: python/branches/release32-maint/Lib/test/test_subprocess.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_subprocess.py (original) +++ python/branches/release32-maint/Lib/test/test_subprocess.py Mon Feb 21 22:58:42 2011 @@ -1156,9 +1156,6 @@ open_fds = set() - if support.verbose: - print(" -- maxfd =", subprocess.MAXFD) - for x in range(5): fds = os.pipe() self.addCleanup(os.close, fds[0]) @@ -1173,10 +1170,6 @@ remaining_fds = set(map(int, output.split(b','))) to_be_closed = open_fds - {fd} - # Temporary debug output for intermittent failures - if support.verbose: - print(" -- fds that should have been closed:", to_be_closed) - print(" -- fds that remained open:", remaining_fds) self.assertIn(fd, remaining_fds, "fd to be passed not passed") self.assertFalse(remaining_fds & to_be_closed, Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Mon Feb 21 22:58:42 2011 @@ -27,6 +27,12 @@ - Issue #11268: Prevent Mac OS X Installer failure if Documentation package had previously been installed. 
+Tests +----- + +- Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due + to open door files. + What's New in Python 3.2? ========================= From python-checkins at python.org Tue Feb 22 00:41:12 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 00:41:12 +0100 (CET) Subject: [Python-checkins] r88486 - in python/branches/py3k: Lib/test/test_mmap.py Misc/NEWS Modules/mmapmodule.c Message-ID: <20110221234112.CE213EE9B1@mail.python.org> Author: antoine.pitrou Date: Tue Feb 22 00:41:12 2011 New Revision: 88486 Log: Issue #4681: Allow mmap() to work on file sizes and offsets larger than 4GB, even on 32-bit builds. Initial patch by Ross Lagerwall, adapted for 32-bit Windows. Modified: python/branches/py3k/Lib/test/test_mmap.py python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/mmapmodule.c Modified: python/branches/py3k/Lib/test/test_mmap.py ============================================================================== --- python/branches/py3k/Lib/test/test_mmap.py (original) +++ python/branches/py3k/Lib/test/test_mmap.py Tue Feb 22 00:41:12 2011 @@ -1,4 +1,4 @@ -from test.support import TESTFN, run_unittest, import_module +from test.support import TESTFN, run_unittest, import_module, unlink, requires import unittest import os import re @@ -646,9 +646,56 @@ "wrong exception raised in context manager") self.assertTrue(m.closed, "context manager failed") +class LargeMmapTests(unittest.TestCase): + + def setUp(self): + unlink(TESTFN) + + def tearDown(self): + unlink(TESTFN) + + def _working_largefile(self): + # Only run if the current filesystem supports large files. + f = open(TESTFN, 'wb', buffering=0) + try: + f.seek(0x80000001) + f.write(b'x') + f.flush() + except (IOError, OverflowError): + raise unittest.SkipTest("filesystem does not have largefile support") + finally: + f.close() + unlink(TESTFN) + + def test_large_offset(self): + if sys.platform[:3] == 'win' or sys.platform == 'darwin': + requires('largefile', + 'test requires %s bytes and a long time to run' % str(0x180000000)) + self._working_largefile() + with open(TESTFN, 'wb') as f: + f.seek(0x14FFFFFFF) + f.write(b" ") + + with open(TESTFN, 'rb') as f: + with mmap.mmap(f.fileno(), 0, offset=0x140000000, access=mmap.ACCESS_READ) as m: + self.assertEqual(m[0xFFFFFFF], 32) + + def test_large_filesize(self): + if sys.platform[:3] == 'win' or sys.platform == 'darwin': + requires('largefile', + 'test requires %s bytes and a long time to run' % str(0x180000000)) + self._working_largefile() + with open(TESTFN, 'wb') as f: + f.seek(0x17FFFFFFF) + f.write(b" ") + + with open(TESTFN, 'rb') as f: + with mmap.mmap(f.fileno(), 0x10000, access=mmap.ACCESS_READ) as m: + self.assertEqual(m.size(), 0x180000000) + def test_main(): - run_unittest(MmapTests) + run_unittest(MmapTests, LargeMmapTests) if __name__ == '__main__': test_main() Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Tue Feb 22 00:41:12 2011 @@ -27,6 +27,10 @@ Library ------- +- Issue #4681: Allow mmap() to work on file sizes and offsets larger than + 4GB, even on 32-bit builds. Initial patch by Ross Lagerwall, adapted for + 32-bit Windows. + - Issue #11169: compileall module uses repr() to format filenames and paths to escape surrogate characters and show spaces. 
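The LargeMmapTests added above map a region that starts beyond the 4 GB mark; the following stand-alone sketch condenses what test_large_offset does, using only the public mmap API. The file name and sizes are illustrative, and the run assumes a filesystem with sparse large-file support (the test guards for this with _working_largefile()); on Windows and OS X the real test additionally requires the 'largefile' resource.

    # Condensed sketch of the behavior enabled by this change: an mmap
    # offset larger than 4 GB, which previously overflowed on 32-bit builds.
    import mmap

    with open("sparse.bin", "wb") as f:       # illustrative path
        f.seek(0x14FFFFFFF)                   # seek past ~5.4 GB
        f.write(b" ")                         # write a single space byte there

    with open("sparse.bin", "rb") as f:
        # The offset is now parsed as a 64-bit quantity (off_t / PY_LONG_LONG),
        # so an offset above 4 GB is accepted even on 32-bit builds.
        m = mmap.mmap(f.fileno(), 0, offset=0x140000000,
                      access=mmap.ACCESS_READ)
        try:
            assert m[0xFFFFFFF] == 32         # ord(" "); integer indexing on Python 3
        finally:
            m.close()
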
Modified: python/branches/py3k/Modules/mmapmodule.c ============================================================================== --- python/branches/py3k/Modules/mmapmodule.c (original) +++ python/branches/py3k/Modules/mmapmodule.c Tue Feb 22 00:41:12 2011 @@ -90,7 +90,11 @@ char * data; size_t size; size_t pos; /* relative to offset */ - size_t offset; +#ifdef MS_WINDOWS + PY_LONG_LONG offset; +#else + off_t offset; +#endif int exports; #ifdef MS_WINDOWS @@ -433,7 +437,11 @@ PyErr_SetFromErrno(mmap_module_error); return NULL; } - return PyLong_FromSsize_t(buf.st_size); +#ifdef HAVE_LARGEFILE_SUPPORT + return PyLong_FromLongLong(buf.st_size); +#else + return PyLong_FromLong(buf.st_size); +#endif } #endif /* UNIX */ } @@ -467,17 +475,10 @@ CloseHandle(self->map_handle); self->map_handle = NULL; /* Move to the desired EOF position */ -#if SIZEOF_SIZE_T > 4 newSizeHigh = (DWORD)((self->offset + new_size) >> 32); newSizeLow = (DWORD)((self->offset + new_size) & 0xFFFFFFFF); off_hi = (DWORD)(self->offset >> 32); off_lo = (DWORD)(self->offset & 0xFFFFFFFF); -#else - newSizeHigh = 0; - newSizeLow = (DWORD)(self->offset + new_size); - off_hi = 0; - off_lo = (DWORD)self->offset; -#endif SetFilePointer(self->file_handle, newSizeLow, &newSizeHigh, FILE_BEGIN); /* Change the size of the file */ @@ -1051,6 +1052,12 @@ } #ifdef UNIX +#ifdef HAVE_LARGEFILE_SUPPORT +#define _Py_PARSE_OFF_T "L" +#else +#define _Py_PARSE_OFF_T "l" +#endif + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { @@ -1058,8 +1065,9 @@ struct stat st; #endif mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + off_t offset = 0; int fd, flags = MAP_SHARED, prot = PROT_WRITE | PROT_READ; int devzero = -1; int access = (int)ACCESS_DEFAULT; @@ -1067,16 +1075,18 @@ "flags", "prot", "access", "offset", NULL}; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iiiO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iii" _Py_PARSE_OFF_T, keywords, &fd, &map_size_obj, &flags, &prot, - &access, &offset_obj)) + &access, &offset)) return NULL; map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must be positive"); return NULL; + } if ((access != (int)ACCESS_DEFAULT) && ((flags != MAP_SHARED) || (prot != (PROT_WRITE | PROT_READ)))) @@ -1121,8 +1131,14 @@ "mmap offset is greater than file size"); return NULL; } - map_size = st.st_size - offset; - } else if ((size_t)offset + (size_t)map_size > st.st_size) { + off_t calc_size = st.st_size - offset; + map_size = calc_size; + if (map_size != calc_size) { + PyErr_SetString(PyExc_ValueError, + "mmap length is too large"); + return NULL; + } + } else if (offset + (size_t)map_size > st.st_size) { PyErr_SetString(PyExc_ValueError, "mmap length is greater than file size"); return NULL; @@ -1183,12 +1199,19 @@ #endif /* UNIX */ #ifdef MS_WINDOWS + +/* A note on sizes and offsets: while the actual map size must hold in a + Py_ssize_t, both the total file size and the start offset can be longer + than a Py_ssize_t, so we use PY_LONG_LONG which is always 64-bit. 
+*/ + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + PY_LONG_LONG offset = 0, size; DWORD off_hi; /* upper 32 bits of offset */ DWORD off_lo; /* lower 32 bits of offset */ DWORD size_hi; /* upper 32 bits of size */ @@ -1203,9 +1226,9 @@ "tagname", "access", "offset", NULL }; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziL", keywords, &fileno, &map_size_obj, - &tagname, &access, &offset_obj)) { + &tagname, &access, &offset)) { return NULL; } @@ -1230,9 +1253,11 @@ map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must be positive"); return NULL; + } /* assume -1 and 0 both mean invalid filedescriptor to 'anonymously' map memory. @@ -1296,28 +1321,26 @@ return PyErr_SetFromWindowsErr(dwErr); } -#if SIZEOF_SIZE_T > 4 - m_obj->size = (((size_t)high)<<32) + low; -#else - if (high) - /* File is too large to map completely */ - m_obj->size = (size_t)-1; - else - m_obj->size = low; -#endif - if (offset >= m_obj->size) { + size = (((PY_LONG_LONG) high) << 32) + low; + if (offset >= size) { PyErr_SetString(PyExc_ValueError, "mmap offset is greater than file size"); Py_DECREF(m_obj); return NULL; } - m_obj->size -= offset; + if (offset - size > PY_SSIZE_T_MAX) + /* Map area too large to fit in memory */ + m_obj->size = (Py_ssize_t) -1; + else + m_obj->size = (Py_ssize_t) (size - offset); } else { m_obj->size = map_size; + size = offset + map_size; } } else { m_obj->size = map_size; + size = offset + map_size; } /* set the initial position */ @@ -1338,22 +1361,10 @@ m_obj->tagname = NULL; m_obj->access = (access_mode)access; - /* DWORD is a 4-byte int. If we're on a box where size_t consumes - * more than 4 bytes, we need to break it apart. Else (size_t - * consumes 4 bytes), C doesn't define what happens if we shift - * right by 32, so we need different code. - */ -#if SIZEOF_SIZE_T > 4 - size_hi = (DWORD)((offset + m_obj->size) >> 32); - size_lo = (DWORD)((offset + m_obj->size) & 0xFFFFFFFF); + size_hi = (DWORD)(size >> 32); + size_lo = (DWORD)(size & 0xFFFFFFFF); off_hi = (DWORD)(offset >> 32); off_lo = (DWORD)(offset & 0xFFFFFFFF); -#else - size_hi = 0; - size_lo = (DWORD)(offset + m_obj->size); - off_hi = 0; - off_lo = (DWORD)offset; -#endif /* For files, it would be sufficient to pass 0 as size. For anonymous maps, we have to pass the size explicitly. */ m_obj->map_handle = CreateFileMapping(m_obj->file_handle, From python-checkins at python.org Tue Feb 22 00:46:27 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 00:46:27 +0100 (CET) Subject: [Python-checkins] r88487 - in python/branches/release32-maint: Lib/test/test_mmap.py Misc/NEWS Modules/mmapmodule.c Message-ID: <20110221234627.53F10EE9B1@mail.python.org> Author: antoine.pitrou Date: Tue Feb 22 00:46:27 2011 New Revision: 88487 Log: Merged revisions 88486 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88486 | antoine.pitrou | 2011-02-22 00:41:12 +0100 (mar., 22 f?vr. 2011) | 5 lines Issue #4681: Allow mmap() to work on file sizes and offsets larger than 4GB, even on 32-bit builds. 
Initial patch by Ross Lagerwall, adapted for 32-bit Windows. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_mmap.py python/branches/release32-maint/Misc/NEWS python/branches/release32-maint/Modules/mmapmodule.c Modified: python/branches/release32-maint/Lib/test/test_mmap.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_mmap.py (original) +++ python/branches/release32-maint/Lib/test/test_mmap.py Tue Feb 22 00:46:27 2011 @@ -1,4 +1,4 @@ -from test.support import TESTFN, run_unittest, import_module +from test.support import TESTFN, run_unittest, import_module, unlink, requires import unittest import os import re @@ -646,9 +646,56 @@ "wrong exception raised in context manager") self.assertTrue(m.closed, "context manager failed") +class LargeMmapTests(unittest.TestCase): + + def setUp(self): + unlink(TESTFN) + + def tearDown(self): + unlink(TESTFN) + + def _working_largefile(self): + # Only run if the current filesystem supports large files. + f = open(TESTFN, 'wb', buffering=0) + try: + f.seek(0x80000001) + f.write(b'x') + f.flush() + except (IOError, OverflowError): + raise unittest.SkipTest("filesystem does not have largefile support") + finally: + f.close() + unlink(TESTFN) + + def test_large_offset(self): + if sys.platform[:3] == 'win' or sys.platform == 'darwin': + requires('largefile', + 'test requires %s bytes and a long time to run' % str(0x180000000)) + self._working_largefile() + with open(TESTFN, 'wb') as f: + f.seek(0x14FFFFFFF) + f.write(b" ") + + with open(TESTFN, 'rb') as f: + with mmap.mmap(f.fileno(), 0, offset=0x140000000, access=mmap.ACCESS_READ) as m: + self.assertEqual(m[0xFFFFFFF], 32) + + def test_large_filesize(self): + if sys.platform[:3] == 'win' or sys.platform == 'darwin': + requires('largefile', + 'test requires %s bytes and a long time to run' % str(0x180000000)) + self._working_largefile() + with open(TESTFN, 'wb') as f: + f.seek(0x17FFFFFFF) + f.write(b" ") + + with open(TESTFN, 'rb') as f: + with mmap.mmap(f.fileno(), 0x10000, access=mmap.ACCESS_READ) as m: + self.assertEqual(m.size(), 0x180000000) + def test_main(): - run_unittest(MmapTests) + run_unittest(MmapTests, LargeMmapTests) if __name__ == '__main__': test_main() Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Tue Feb 22 00:46:27 2011 @@ -15,6 +15,10 @@ Library ------- +- Issue #4681: Allow mmap() to work on file sizes and offsets larger than + 4GB, even on 32-bit builds. Initial patch by Ross Lagerwall, adapted for + 32-bit Windows. + - Issue #11089: Fix performance issue limiting the use of ConfigParser() with large config files. 
Modified: python/branches/release32-maint/Modules/mmapmodule.c ============================================================================== --- python/branches/release32-maint/Modules/mmapmodule.c (original) +++ python/branches/release32-maint/Modules/mmapmodule.c Tue Feb 22 00:46:27 2011 @@ -90,7 +90,11 @@ char * data; size_t size; size_t pos; /* relative to offset */ - size_t offset; +#ifdef MS_WINDOWS + PY_LONG_LONG offset; +#else + off_t offset; +#endif int exports; #ifdef MS_WINDOWS @@ -433,7 +437,11 @@ PyErr_SetFromErrno(mmap_module_error); return NULL; } - return PyLong_FromSsize_t(buf.st_size); +#ifdef HAVE_LARGEFILE_SUPPORT + return PyLong_FromLongLong(buf.st_size); +#else + return PyLong_FromLong(buf.st_size); +#endif } #endif /* UNIX */ } @@ -467,17 +475,10 @@ CloseHandle(self->map_handle); self->map_handle = NULL; /* Move to the desired EOF position */ -#if SIZEOF_SIZE_T > 4 newSizeHigh = (DWORD)((self->offset + new_size) >> 32); newSizeLow = (DWORD)((self->offset + new_size) & 0xFFFFFFFF); off_hi = (DWORD)(self->offset >> 32); off_lo = (DWORD)(self->offset & 0xFFFFFFFF); -#else - newSizeHigh = 0; - newSizeLow = (DWORD)(self->offset + new_size); - off_hi = 0; - off_lo = (DWORD)self->offset; -#endif SetFilePointer(self->file_handle, newSizeLow, &newSizeHigh, FILE_BEGIN); /* Change the size of the file */ @@ -1051,6 +1052,12 @@ } #ifdef UNIX +#ifdef HAVE_LARGEFILE_SUPPORT +#define _Py_PARSE_OFF_T "L" +#else +#define _Py_PARSE_OFF_T "l" +#endif + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { @@ -1058,8 +1065,9 @@ struct stat st; #endif mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + off_t offset = 0; int fd, flags = MAP_SHARED, prot = PROT_WRITE | PROT_READ; int devzero = -1; int access = (int)ACCESS_DEFAULT; @@ -1067,16 +1075,18 @@ "flags", "prot", "access", "offset", NULL}; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iiiO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iii" _Py_PARSE_OFF_T, keywords, &fd, &map_size_obj, &flags, &prot, - &access, &offset_obj)) + &access, &offset)) return NULL; map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must be positive"); return NULL; + } if ((access != (int)ACCESS_DEFAULT) && ((flags != MAP_SHARED) || (prot != (PROT_WRITE | PROT_READ)))) @@ -1121,8 +1131,14 @@ "mmap offset is greater than file size"); return NULL; } - map_size = st.st_size - offset; - } else if ((size_t)offset + (size_t)map_size > st.st_size) { + off_t calc_size = st.st_size - offset; + map_size = calc_size; + if (map_size != calc_size) { + PyErr_SetString(PyExc_ValueError, + "mmap length is too large"); + return NULL; + } + } else if (offset + (size_t)map_size > st.st_size) { PyErr_SetString(PyExc_ValueError, "mmap length is greater than file size"); return NULL; @@ -1183,12 +1199,19 @@ #endif /* UNIX */ #ifdef MS_WINDOWS + +/* A note on sizes and offsets: while the actual map size must hold in a + Py_ssize_t, both the total file size and the start offset can be longer + than a Py_ssize_t, so we use PY_LONG_LONG which is always 64-bit. 
+*/ + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + PY_LONG_LONG offset = 0, size; DWORD off_hi; /* upper 32 bits of offset */ DWORD off_lo; /* lower 32 bits of offset */ DWORD size_hi; /* upper 32 bits of size */ @@ -1203,9 +1226,9 @@ "tagname", "access", "offset", NULL }; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziL", keywords, &fileno, &map_size_obj, - &tagname, &access, &offset_obj)) { + &tagname, &access, &offset)) { return NULL; } @@ -1230,9 +1253,11 @@ map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must be positive"); return NULL; + } /* assume -1 and 0 both mean invalid filedescriptor to 'anonymously' map memory. @@ -1296,28 +1321,26 @@ return PyErr_SetFromWindowsErr(dwErr); } -#if SIZEOF_SIZE_T > 4 - m_obj->size = (((size_t)high)<<32) + low; -#else - if (high) - /* File is too large to map completely */ - m_obj->size = (size_t)-1; - else - m_obj->size = low; -#endif - if (offset >= m_obj->size) { + size = (((PY_LONG_LONG) high) << 32) + low; + if (offset >= size) { PyErr_SetString(PyExc_ValueError, "mmap offset is greater than file size"); Py_DECREF(m_obj); return NULL; } - m_obj->size -= offset; + if (offset - size > PY_SSIZE_T_MAX) + /* Map area too large to fit in memory */ + m_obj->size = (Py_ssize_t) -1; + else + m_obj->size = (Py_ssize_t) (size - offset); } else { m_obj->size = map_size; + size = offset + map_size; } } else { m_obj->size = map_size; + size = offset + map_size; } /* set the initial position */ @@ -1338,22 +1361,10 @@ m_obj->tagname = NULL; m_obj->access = (access_mode)access; - /* DWORD is a 4-byte int. If we're on a box where size_t consumes - * more than 4 bytes, we need to break it apart. Else (size_t - * consumes 4 bytes), C doesn't define what happens if we shift - * right by 32, so we need different code. - */ -#if SIZEOF_SIZE_T > 4 - size_hi = (DWORD)((offset + m_obj->size) >> 32); - size_lo = (DWORD)((offset + m_obj->size) & 0xFFFFFFFF); + size_hi = (DWORD)(size >> 32); + size_lo = (DWORD)(size & 0xFFFFFFFF); off_hi = (DWORD)(offset >> 32); off_lo = (DWORD)(offset & 0xFFFFFFFF); -#else - size_hi = 0; - size_lo = (DWORD)(offset + m_obj->size); - off_hi = 0; - off_lo = (DWORD)offset; -#endif /* For files, it would be sufficient to pass 0 as size. For anonymous maps, we have to pass the size explicitly. */ m_obj->map_handle = CreateFileMapping(m_obj->file_handle, From python-checkins at python.org Tue Feb 22 00:59:20 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 00:59:20 +0100 (CET) Subject: [Python-checkins] r88488 - in python/branches/release27-maint: Lib/test/test_mmap.py Misc/NEWS Modules/mmapmodule.c Message-ID: <20110221235920.914FCEE9AD@mail.python.org> Author: antoine.pitrou Date: Tue Feb 22 00:59:20 2011 New Revision: 88488 Log: Merged revisions 88486 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88486 | antoine.pitrou | 2011-02-22 00:41:12 +0100 (mar., 22 f?vr. 2011) | 5 lines Issue #4681: Allow mmap() to work on file sizes and offsets larger than 4GB, even on 32-bit builds. 
Initial patch by Ross Lagerwall, adapted for 32-bit Windows. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Lib/test/test_mmap.py python/branches/release27-maint/Misc/NEWS python/branches/release27-maint/Modules/mmapmodule.c Modified: python/branches/release27-maint/Lib/test/test_mmap.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_mmap.py (original) +++ python/branches/release27-maint/Lib/test/test_mmap.py Tue Feb 22 00:59:20 2011 @@ -1,6 +1,6 @@ -from test.test_support import TESTFN, run_unittest, import_module +from test.test_support import TESTFN, run_unittest, import_module, unlink, requires import unittest -import os, re, itertools, socket +import os, re, itertools, socket, sys mmap = import_module('mmap') @@ -627,8 +627,63 @@ finally: s.close() + +class LargeMmapTests(unittest.TestCase): + + def setUp(self): + unlink(TESTFN) + + def tearDown(self): + unlink(TESTFN) + + def _working_largefile(self): + # Only run if the current filesystem supports large files. + f = open(TESTFN, 'wb', buffering=0) + try: + f.seek(0x80000001) + f.write(b'x') + f.flush() + except (IOError, OverflowError): + raise unittest.SkipTest("filesystem does not have largefile support") + finally: + f.close() + unlink(TESTFN) + + def test_large_offset(self): + if sys.platform[:3] == 'win' or sys.platform == 'darwin': + requires('largefile', + 'test requires %s bytes and a long time to run' % str(0x180000000)) + self._working_largefile() + with open(TESTFN, 'wb') as f: + f.seek(0x14FFFFFFF) + f.write(b" ") + + with open(TESTFN, 'rb') as f: + m = mmap.mmap(f.fileno(), 0, offset=0x140000000, access=mmap.ACCESS_READ) + try: + self.assertEqual(m[0xFFFFFFF], b" ") + finally: + m.close() + + def test_large_filesize(self): + if sys.platform[:3] == 'win' or sys.platform == 'darwin': + requires('largefile', + 'test requires %s bytes and a long time to run' % str(0x180000000)) + self._working_largefile() + with open(TESTFN, 'wb') as f: + f.seek(0x17FFFFFFF) + f.write(b" ") + + with open(TESTFN, 'rb') as f: + m = mmap.mmap(f.fileno(), 0x10000, access=mmap.ACCESS_READ) + try: + self.assertEqual(m.size(), 0x180000000) + finally: + m.close() + + def test_main(): - run_unittest(MmapTests) + run_unittest(MmapTests, LargeMmapTests) if __name__ == '__main__': test_main() Modified: python/branches/release27-maint/Misc/NEWS ============================================================================== --- python/branches/release27-maint/Misc/NEWS (original) +++ python/branches/release27-maint/Misc/NEWS Tue Feb 22 00:59:20 2011 @@ -37,6 +37,10 @@ Library ------- +- Issue #4681: Allow mmap() to work on file sizes and offsets larger than + 4GB, even on 32-bit builds. Initial patch by Ross Lagerwall, adapted for + 32-bit Windows. + - Issue #11171: Fix detection of config/Makefile when --prefix != --exec-prefix, which caused Python to not start. 
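One difference between this 2.7 backport and the py3k original shows up in test_large_offset: indexing an mmap object yields a length-one string on Python 2 but an integer on Python 3, which is why the backported test compares against b" " where the py3k test compares against 32. A minimal, version-agnostic illustration (file name is illustrative):

    # Indexing an mmap: 1-byte str on Python 2, int on Python 3.
    import mmap
    import sys

    with open("demo.bin", "wb") as f:
        f.write(b"A")

    with open("demo.bin", "rb") as f:
        m = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        try:
            if sys.version_info[0] >= 3:
                assert m[0] == 65        # integer ordinal
            else:
                assert m[0] == b"A"      # one-byte string
        finally:
            m.close()
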
Modified: python/branches/release27-maint/Modules/mmapmodule.c ============================================================================== --- python/branches/release27-maint/Modules/mmapmodule.c (original) +++ python/branches/release27-maint/Modules/mmapmodule.c Tue Feb 22 00:59:20 2011 @@ -90,7 +90,11 @@ char * data; size_t size; size_t pos; /* relative to offset */ - size_t offset; +#ifdef MS_WINDOWS + PY_LONG_LONG offset; +#else + off_t offset; +#endif #ifdef MS_WINDOWS HANDLE map_handle; @@ -424,7 +428,11 @@ PyErr_SetFromErrno(mmap_module_error); return NULL; } - return PyInt_FromSsize_t(buf.st_size); +#ifdef HAVE_LARGEFILE_SUPPORT + return PyLong_FromLongLong(buf.st_size); +#else + return PyInt_FromLong(buf.st_size); +#endif } #endif /* UNIX */ } @@ -458,17 +466,10 @@ CloseHandle(self->map_handle); self->map_handle = NULL; /* Move to the desired EOF position */ -#if SIZEOF_SIZE_T > 4 newSizeHigh = (DWORD)((self->offset + new_size) >> 32); newSizeLow = (DWORD)((self->offset + new_size) & 0xFFFFFFFF); off_hi = (DWORD)(self->offset >> 32); off_lo = (DWORD)(self->offset & 0xFFFFFFFF); -#else - newSizeHigh = 0; - newSizeLow = (DWORD)(self->offset + new_size); - off_hi = 0; - off_lo = (DWORD)self->offset; -#endif SetFilePointer(self->file_handle, newSizeLow, &newSizeHigh, FILE_BEGIN); /* Change the size of the file */ @@ -1099,6 +1100,12 @@ } #ifdef UNIX +#ifdef HAVE_LARGEFILE_SUPPORT +#define _Py_PARSE_OFF_T "L" +#else +#define _Py_PARSE_OFF_T "l" +#endif + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { @@ -1106,8 +1113,9 @@ struct stat st; #endif mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + off_t offset = 0; int fd, flags = MAP_SHARED, prot = PROT_WRITE | PROT_READ; int devzero = -1; int access = (int)ACCESS_DEFAULT; @@ -1115,16 +1123,18 @@ "flags", "prot", "access", "offset", NULL}; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iiiO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iii" _Py_PARSE_OFF_T, keywords, &fd, &map_size_obj, &flags, &prot, - &access, &offset_obj)) + &access, &offset)) return NULL; map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must be positive"); return NULL; + } if ((access != (int)ACCESS_DEFAULT) && ((flags != MAP_SHARED) || (prot != (PROT_WRITE | PROT_READ)))) @@ -1169,8 +1179,14 @@ "mmap offset is greater than file size"); return NULL; } - map_size = st.st_size - offset; - } else if ((size_t)offset + (size_t)map_size > st.st_size) { + off_t calc_size = st.st_size - offset; + map_size = calc_size; + if (map_size != calc_size) { + PyErr_SetString(PyExc_ValueError, + "mmap length is too large"); + return NULL; + } + } else if (offset + (size_t)map_size > st.st_size) { PyErr_SetString(PyExc_ValueError, "mmap length is greater than file size"); return NULL; @@ -1230,12 +1246,19 @@ #endif /* UNIX */ #ifdef MS_WINDOWS + +/* A note on sizes and offsets: while the actual map size must hold in a + Py_ssize_t, both the total file size and the start offset can be longer + than a Py_ssize_t, so we use PY_LONG_LONG which is always 64-bit. 
+*/ + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + PY_LONG_LONG offset = 0, size; DWORD off_hi; /* upper 32 bits of offset */ DWORD off_lo; /* lower 32 bits of offset */ DWORD size_hi; /* upper 32 bits of size */ @@ -1250,9 +1273,9 @@ "tagname", "access", "offset", NULL }; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziL", keywords, &fileno, &map_size_obj, - &tagname, &access, &offset_obj)) { + &tagname, &access, &offset)) { return NULL; } @@ -1277,9 +1300,11 @@ map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must be positive"); return NULL; + } /* assume -1 and 0 both mean invalid filedescriptor to 'anonymously' map memory. @@ -1342,28 +1367,26 @@ return PyErr_SetFromWindowsErr(dwErr); } -#if SIZEOF_SIZE_T > 4 - m_obj->size = (((size_t)high)<<32) + low; -#else - if (high) - /* File is too large to map completely */ - m_obj->size = (size_t)-1; - else - m_obj->size = low; -#endif - if (offset >= m_obj->size) { + size = (((PY_LONG_LONG) high) << 32) + low; + if (offset >= size) { PyErr_SetString(PyExc_ValueError, "mmap offset is greater than file size"); Py_DECREF(m_obj); return NULL; } - m_obj->size -= offset; + if (offset - size > PY_SSIZE_T_MAX) + /* Map area too large to fit in memory */ + m_obj->size = (Py_ssize_t) -1; + else + m_obj->size = (Py_ssize_t) (size - offset); } else { m_obj->size = map_size; + size = offset + map_size; } } else { m_obj->size = map_size; + size = offset + map_size; } /* set the initial position */ @@ -1383,22 +1406,10 @@ m_obj->tagname = NULL; m_obj->access = (access_mode)access; - /* DWORD is a 4-byte int. If we're on a box where size_t consumes - * more than 4 bytes, we need to break it apart. Else (size_t - * consumes 4 bytes), C doesn't define what happens if we shift - * right by 32, so we need different code. - */ -#if SIZEOF_SIZE_T > 4 - size_hi = (DWORD)((offset + m_obj->size) >> 32); - size_lo = (DWORD)((offset + m_obj->size) & 0xFFFFFFFF); + size_hi = (DWORD)(size >> 32); + size_lo = (DWORD)(size & 0xFFFFFFFF); off_hi = (DWORD)(offset >> 32); off_lo = (DWORD)(offset & 0xFFFFFFFF); -#else - size_hi = 0; - size_lo = (DWORD)(offset + m_obj->size); - off_hi = 0; - off_lo = (DWORD)offset; -#endif /* For files, it would be sufficient to pass 0 as size. For anonymous maps, we have to pass the size explicitly. */ m_obj->map_handle = CreateFileMapping(m_obj->file_handle, From python-checkins at python.org Tue Feb 22 01:41:50 2011 From: python-checkins at python.org (raymond.hettinger) Date: Tue, 22 Feb 2011 01:41:50 +0100 (CET) Subject: [Python-checkins] r88490 - in python/branches/py3k: Doc/library/collections.abc.rst Doc/library/collections.rst Doc/library/datatypes.rst Lib/_abcoll.py Lib/collections Lib/collections.py Lib/collections/__init__.py Lib/collections/abc.py Lib/os.py Lib/test/regrtest.py Misc/NEWS Message-ID: <20110222004150.61A6FEE986@mail.python.org> Author: raymond.hettinger Date: Tue Feb 22 01:41:50 2011 New Revision: 88490 Log: Issue #11085: Moved collections abstract base classes into a separate module called collections.abc, following the pattern used by importlib.abc. 
For backwards compatibility, the names continue to also be imported into the collections module. Added: python/branches/py3k/Doc/library/collections.abc.rst python/branches/py3k/Lib/collections/ python/branches/py3k/Lib/collections/__init__.py - copied, changed from r88470, /python/branches/py3k/Lib/collections.py python/branches/py3k/Lib/collections/abc.py - copied, changed from r88469, /python/branches/py3k/Lib/_abcoll.py Removed: python/branches/py3k/Lib/_abcoll.py python/branches/py3k/Lib/collections.py Modified: python/branches/py3k/Doc/library/collections.rst python/branches/py3k/Doc/library/datatypes.rst python/branches/py3k/Lib/os.py python/branches/py3k/Lib/test/regrtest.py python/branches/py3k/Misc/NEWS Added: python/branches/py3k/Doc/library/collections.abc.rst ============================================================================== --- (empty file) +++ python/branches/py3k/Doc/library/collections.abc.rst Tue Feb 22 01:41:50 2011 @@ -0,0 +1,139 @@ +:mod:`collections.abc` --- Abstract Base Classes for Containers +=============================================================== + +.. module:: collections.abc + :synopsis: Abstract base classes for containers +.. moduleauthor:: Raymond Hettinger +.. sectionauthor:: Raymond Hettinger + +.. testsetup:: * + + from collections import * + import itertools + __name__ = '' + +**Source code:** :source:`Lib/collections/abc.py` + +-------------- + +This module provides :term:`abstract base classes ` that +can be used to test whether a class provides a particular interface; for +example, whether it is hashable or whether it is a mapping. + +.. versionchanged:: 3.3 + Formerly, this module was part of the :mod:`collections` module. + +.. _abstract-base-classes: + +Collections Abstract Base Classes +--------------------------------- + +The collections module offers the following ABCs: + +========================= ===================== ====================== ==================================================== +ABC Inherits Abstract Methods Mixin Methods +========================= ===================== ====================== ==================================================== +:class:`Container` ``__contains__`` +:class:`Hashable` ``__hash__`` +:class:`Iterable` ``__iter__`` +:class:`Iterator` :class:`Iterable` ``__next__`` ``__iter__`` +:class:`Sized` ``__len__`` +:class:`Callable` ``__call__`` + +:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``, ``__iter__``, ``__reversed__``, + :class:`Iterable`, ``index``, and ``count`` + :class:`Container` + +:class:`MutableSequence` :class:`Sequence` ``__setitem__`` Inherited Sequence methods and + ``__delitem__``, ``append``, ``reverse``, ``extend``, ``pop``, + and ``insert`` ``remove``, and ``__iadd__`` + +:class:`Set` :class:`Sized`, ``__le__``, ``__lt__``, ``__eq__``, ``__ne__``, + :class:`Iterable`, ``__gt__``, ``__ge__``, ``__and__``, ``__or__``, + :class:`Container` ``__sub__``, ``__xor__``, and ``isdisjoint`` + +:class:`MutableSet` :class:`Set` ``add`` and Inherited Set methods and + ``discard`` ``clear``, ``pop``, ``remove``, ``__ior__``, + ``__iand__``, ``__ixor__``, and ``__isub__`` + +:class:`Mapping` :class:`Sized`, ``__getitem__`` ``__contains__``, ``keys``, ``items``, ``values``, + :class:`Iterable`, ``get``, ``__eq__``, and ``__ne__`` + :class:`Container` + +:class:`MutableMapping` :class:`Mapping` ``__setitem__`` and Inherited Mapping methods and + ``__delitem__`` ``pop``, ``popitem``, ``clear``, ``update``, + and ``setdefault`` + + +:class:`MappingView` 
:class:`Sized` ``__len__`` +:class:`KeysView` :class:`MappingView`, ``__contains__``, + :class:`Set` ``__iter__`` +:class:`ItemsView` :class:`MappingView`, ``__contains__``, + :class:`Set` ``__iter__`` +:class:`ValuesView` :class:`MappingView` ``__contains__``, ``__iter__`` +========================= ===================== ====================== ==================================================== + +These ABCs allow us to ask classes or instances if they provide +particular functionality, for example:: + + size = None + if isinstance(myvar, collections.Sized): + size = len(myvar) + +Several of the ABCs are also useful as mixins that make it easier to develop +classes supporting container APIs. For example, to write a class supporting +the full :class:`Set` API, it only necessary to supply the three underlying +abstract methods: :meth:`__contains__`, :meth:`__iter__`, and :meth:`__len__`. +The ABC supplies the remaining methods such as :meth:`__and__` and +:meth:`isdisjoint` :: + + class ListBasedSet(collections.Set): + ''' Alternate set implementation favoring space over speed + and not requiring the set elements to be hashable. ''' + def __init__(self, iterable): + self.elements = lst = [] + for value in iterable: + if value not in lst: + lst.append(value) + def __iter__(self): + return iter(self.elements) + def __contains__(self, value): + return value in self.elements + def __len__(self): + return len(self.elements) + + s1 = ListBasedSet('abcdef') + s2 = ListBasedSet('defghi') + overlap = s1 & s2 # The __and__() method is supported automatically + +Notes on using :class:`Set` and :class:`MutableSet` as a mixin: + +(1) + Since some set operations create new sets, the default mixin methods need + a way to create new instances from an iterable. The class constructor is + assumed to have a signature in the form ``ClassName(iterable)``. + That assumption is factored-out to an internal classmethod called + :meth:`_from_iterable` which calls ``cls(iterable)`` to produce a new set. + If the :class:`Set` mixin is being used in a class with a different + constructor signature, you will need to override :meth:`from_iterable` + with a classmethod that can construct new instances from + an iterable argument. + +(2) + To override the comparisons (presumably for speed, as the + semantics are fixed), redefine :meth:`__le__` and + then the other operations will automatically follow suit. + +(3) + The :class:`Set` mixin provides a :meth:`_hash` method to compute a hash value + for the set; however, :meth:`__hash__` is not defined because not all sets + are hashable or immutable. To add set hashabilty using mixins, + inherit from both :meth:`Set` and :meth:`Hashable`, then define + ``__hash__ = Set._hash``. + +.. seealso:: + + * `OrderedSet recipe `_ that uses + :class:`MutableSet`. + + * For more about ABCs, see the :mod:`abc` module and :pep:`3119`. 
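The commit log above states that, for backwards compatibility, the ABC names remain importable from the collections package itself after the move to collections.abc. A short sketch of what that means in practice, assuming an interpreter that includes this change:

    # Both import paths resolve to the same class objects after the split.
    import collections
    from collections.abc import Sized, MutableMapping

    assert collections.Sized is Sized
    assert collections.MutableMapping is MutableMapping

    # Built-in containers satisfy the ABCs via registration or
    # __subclasshook__, so isinstance() checks work without inheritance.
    assert isinstance({}, MutableMapping)   # dict is registered
    assert isinstance([], Sized)            # list defines __len__
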
Modified: python/branches/py3k/Doc/library/collections.rst ============================================================================== --- python/branches/py3k/Doc/library/collections.rst (original) +++ python/branches/py3k/Doc/library/collections.rst Tue Feb 22 01:41:50 2011 @@ -12,7 +12,7 @@ import itertools __name__ = '' -**Source code:** :source:`Lib/collections.py` and :source:`Lib/_abcoll.py` +**Source code:** :source:`Lib/collections/__init__.py` -------------- @@ -31,9 +31,10 @@ :class:`UserString` wrapper around string objects for easier string subclassing ===================== ==================================================================== -In addition to the concrete container classes, the collections module provides -:ref:`abstract-base-classes` that can be used to test whether a class provides a -particular interface, for example, whether it is hashable or a mapping. +.. versionchanged:: 3.3 + Moved :ref:`abstract-base-classes` to the :mod:`collections.abc` module. + For backwards compatibility, they continue to be visible in this module + as well. :class:`Counter` objects @@ -957,121 +958,3 @@ be an instance of :class:`bytes`, :class:`str`, :class:`UserString` (or a subclass) or an arbitrary sequence which can be converted into a string using the built-in :func:`str` function. - -.. _abstract-base-classes: - -ABCs - abstract base classes ----------------------------- - -The collections module offers the following ABCs: - -========================= ===================== ====================== ==================================================== -ABC Inherits Abstract Methods Mixin Methods -========================= ===================== ====================== ==================================================== -:class:`Container` ``__contains__`` -:class:`Hashable` ``__hash__`` -:class:`Iterable` ``__iter__`` -:class:`Iterator` :class:`Iterable` ``__next__`` ``__iter__`` -:class:`Sized` ``__len__`` -:class:`Callable` ``__call__`` - -:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``, ``__iter__``, ``__reversed__``, - :class:`Iterable`, ``index``, and ``count`` - :class:`Container` - -:class:`MutableSequence` :class:`Sequence` ``__setitem__`` Inherited Sequence methods and - ``__delitem__``, ``append``, ``reverse``, ``extend``, ``pop``, - and ``insert`` ``remove``, and ``__iadd__`` - -:class:`Set` :class:`Sized`, ``__le__``, ``__lt__``, ``__eq__``, ``__ne__``, - :class:`Iterable`, ``__gt__``, ``__ge__``, ``__and__``, ``__or__``, - :class:`Container` ``__sub__``, ``__xor__``, and ``isdisjoint`` - -:class:`MutableSet` :class:`Set` ``add`` and Inherited Set methods and - ``discard`` ``clear``, ``pop``, ``remove``, ``__ior__``, - ``__iand__``, ``__ixor__``, and ``__isub__`` - -:class:`Mapping` :class:`Sized`, ``__getitem__`` ``__contains__``, ``keys``, ``items``, ``values``, - :class:`Iterable`, ``get``, ``__eq__``, and ``__ne__`` - :class:`Container` - -:class:`MutableMapping` :class:`Mapping` ``__setitem__`` and Inherited Mapping methods and - ``__delitem__`` ``pop``, ``popitem``, ``clear``, ``update``, - and ``setdefault`` - - -:class:`MappingView` :class:`Sized` ``__len__`` -:class:`KeysView` :class:`MappingView`, ``__contains__``, - :class:`Set` ``__iter__`` -:class:`ItemsView` :class:`MappingView`, ``__contains__``, - :class:`Set` ``__iter__`` -:class:`ValuesView` :class:`MappingView` ``__contains__``, ``__iter__`` -========================= ===================== ====================== ==================================================== - -These ABCs 
allow us to ask classes or instances if they provide -particular functionality, for example:: - - size = None - if isinstance(myvar, collections.Sized): - size = len(myvar) - -Several of the ABCs are also useful as mixins that make it easier to develop -classes supporting container APIs. For example, to write a class supporting -the full :class:`Set` API, it only necessary to supply the three underlying -abstract methods: :meth:`__contains__`, :meth:`__iter__`, and :meth:`__len__`. -The ABC supplies the remaining methods such as :meth:`__and__` and -:meth:`isdisjoint` :: - - class ListBasedSet(collections.Set): - ''' Alternate set implementation favoring space over speed - and not requiring the set elements to be hashable. ''' - def __init__(self, iterable): - self.elements = lst = [] - for value in iterable: - if value not in lst: - lst.append(value) - def __iter__(self): - return iter(self.elements) - def __contains__(self, value): - return value in self.elements - def __len__(self): - return len(self.elements) - - s1 = ListBasedSet('abcdef') - s2 = ListBasedSet('defghi') - overlap = s1 & s2 # The __and__() method is supported automatically - -Notes on using :class:`Set` and :class:`MutableSet` as a mixin: - -(1) - Since some set operations create new sets, the default mixin methods need - a way to create new instances from an iterable. The class constructor is - assumed to have a signature in the form ``ClassName(iterable)``. - That assumption is factored-out to an internal classmethod called - :meth:`_from_iterable` which calls ``cls(iterable)`` to produce a new set. - If the :class:`Set` mixin is being used in a class with a different - constructor signature, you will need to override :meth:`from_iterable` - with a classmethod that can construct new instances from - an iterable argument. - -(2) - To override the comparisons (presumably for speed, as the - semantics are fixed), redefine :meth:`__le__` and - then the other operations will automatically follow suit. - -(3) - The :class:`Set` mixin provides a :meth:`_hash` method to compute a hash value - for the set; however, :meth:`__hash__` is not defined because not all sets - are hashable or immutable. To add set hashabilty using mixins, - inherit from both :meth:`Set` and :meth:`Hashable`, then define - ``__hash__ = Set._hash``. - -.. seealso:: - - * Latest version of the :source:`Python source code for the collections - abstract base classes ` - - * `OrderedSet recipe `_ for an - example built on :class:`MutableSet`. - - * For more about ABCs, see the :mod:`abc` module and :pep:`3119`. Modified: python/branches/py3k/Doc/library/datatypes.rst ============================================================================== --- python/branches/py3k/Doc/library/datatypes.rst (original) +++ python/branches/py3k/Doc/library/datatypes.rst Tue Feb 22 01:41:50 2011 @@ -21,6 +21,7 @@ datetime.rst calendar.rst collections.rst + collections.abc.rst heapq.rst bisect.rst array.rst Deleted: python/branches/py3k/Lib/_abcoll.py ============================================================================== --- python/branches/py3k/Lib/_abcoll.py Tue Feb 22 01:41:50 2011 +++ (empty file) @@ -1,623 +0,0 @@ -# Copyright 2007 Google, Inc. All Rights Reserved. -# Licensed to PSF under a Contributor Agreement. - -"""Abstract Base Classes (ABCs) for collections, according to PEP 3119. - -DON'T USE THIS MODULE DIRECTLY! The classes here should be imported -via collections; they are defined here only to alleviate certain -bootstrapping issues. 
Unit tests are in test_collections. -""" - -from abc import ABCMeta, abstractmethod -import sys - -__all__ = ["Hashable", "Iterable", "Iterator", - "Sized", "Container", "Callable", - "Set", "MutableSet", - "Mapping", "MutableMapping", - "MappingView", "KeysView", "ItemsView", "ValuesView", - "Sequence", "MutableSequence", - "ByteString", - ] - - -### collection related types which are not exposed through builtin ### -## iterators ## -bytes_iterator = type(iter(b'')) -bytearray_iterator = type(iter(bytearray())) -#callable_iterator = ??? -dict_keyiterator = type(iter({}.keys())) -dict_valueiterator = type(iter({}.values())) -dict_itemiterator = type(iter({}.items())) -list_iterator = type(iter([])) -list_reverseiterator = type(iter(reversed([]))) -range_iterator = type(iter(range(0))) -set_iterator = type(iter(set())) -str_iterator = type(iter("")) -tuple_iterator = type(iter(())) -zip_iterator = type(iter(zip())) -## views ## -dict_keys = type({}.keys()) -dict_values = type({}.values()) -dict_items = type({}.items()) -## misc ## -dict_proxy = type(type.__dict__) - - -### ONE-TRICK PONIES ### - -class Hashable(metaclass=ABCMeta): - - @abstractmethod - def __hash__(self): - return 0 - - @classmethod - def __subclasshook__(cls, C): - if cls is Hashable: - for B in C.__mro__: - if "__hash__" in B.__dict__: - if B.__dict__["__hash__"]: - return True - break - return NotImplemented - - -class Iterable(metaclass=ABCMeta): - - @abstractmethod - def __iter__(self): - while False: - yield None - - @classmethod - def __subclasshook__(cls, C): - if cls is Iterable: - if any("__iter__" in B.__dict__ for B in C.__mro__): - return True - return NotImplemented - - -class Iterator(Iterable): - - @abstractmethod - def __next__(self): - raise StopIteration - - def __iter__(self): - return self - - @classmethod - def __subclasshook__(cls, C): - if cls is Iterator: - if (any("__next__" in B.__dict__ for B in C.__mro__) and - any("__iter__" in B.__dict__ for B in C.__mro__)): - return True - return NotImplemented - -Iterator.register(bytes_iterator) -Iterator.register(bytearray_iterator) -#Iterator.register(callable_iterator) -Iterator.register(dict_keyiterator) -Iterator.register(dict_valueiterator) -Iterator.register(dict_itemiterator) -Iterator.register(list_iterator) -Iterator.register(list_reverseiterator) -Iterator.register(range_iterator) -Iterator.register(set_iterator) -Iterator.register(str_iterator) -Iterator.register(tuple_iterator) -Iterator.register(zip_iterator) - -class Sized(metaclass=ABCMeta): - - @abstractmethod - def __len__(self): - return 0 - - @classmethod - def __subclasshook__(cls, C): - if cls is Sized: - if any("__len__" in B.__dict__ for B in C.__mro__): - return True - return NotImplemented - - -class Container(metaclass=ABCMeta): - - @abstractmethod - def __contains__(self, x): - return False - - @classmethod - def __subclasshook__(cls, C): - if cls is Container: - if any("__contains__" in B.__dict__ for B in C.__mro__): - return True - return NotImplemented - - -class Callable(metaclass=ABCMeta): - - @abstractmethod - def __call__(self, *args, **kwds): - return False - - @classmethod - def __subclasshook__(cls, C): - if cls is Callable: - if any("__call__" in B.__dict__ for B in C.__mro__): - return True - return NotImplemented - - -### SETS ### - - -class Set(Sized, Iterable, Container): - - """A set is a finite, iterable container. - - This class provides concrete generic implementations of all - methods except for __contains__, __iter__ and __len__. 
- - To override the comparisons (presumably for speed, as the - semantics are fixed), all you have to do is redefine __le__ and - then the other operations will automatically follow suit. - """ - - def __le__(self, other): - if not isinstance(other, Set): - return NotImplemented - if len(self) > len(other): - return False - for elem in self: - if elem not in other: - return False - return True - - def __lt__(self, other): - if not isinstance(other, Set): - return NotImplemented - return len(self) < len(other) and self.__le__(other) - - def __gt__(self, other): - if not isinstance(other, Set): - return NotImplemented - return other < self - - def __ge__(self, other): - if not isinstance(other, Set): - return NotImplemented - return other <= self - - def __eq__(self, other): - if not isinstance(other, Set): - return NotImplemented - return len(self) == len(other) and self.__le__(other) - - def __ne__(self, other): - return not (self == other) - - @classmethod - def _from_iterable(cls, it): - '''Construct an instance of the class from any iterable input. - - Must override this method if the class constructor signature - does not accept an iterable for an input. - ''' - return cls(it) - - def __and__(self, other): - if not isinstance(other, Iterable): - return NotImplemented - return self._from_iterable(value for value in other if value in self) - - def isdisjoint(self, other): - for value in other: - if value in self: - return False - return True - - def __or__(self, other): - if not isinstance(other, Iterable): - return NotImplemented - chain = (e for s in (self, other) for e in s) - return self._from_iterable(chain) - - def __sub__(self, other): - if not isinstance(other, Set): - if not isinstance(other, Iterable): - return NotImplemented - other = self._from_iterable(other) - return self._from_iterable(value for value in self - if value not in other) - - def __xor__(self, other): - if not isinstance(other, Set): - if not isinstance(other, Iterable): - return NotImplemented - other = self._from_iterable(other) - return (self - other) | (other - self) - - def _hash(self): - """Compute the hash value of a set. - - Note that we don't define __hash__: not all sets are hashable. - But if you define a hashable set type, its __hash__ should - call this function. - - This must be compatible __eq__. - - All sets ought to compare equal if they contain the same - elements, regardless of how they are implemented, and - regardless of the order of the elements; so there's not much - freedom for __eq__ or __hash__. We match the algorithm used - by the built-in frozenset type. - """ - MAX = sys.maxsize - MASK = 2 * MAX + 1 - n = len(self) - h = 1927868237 * (n + 1) - h &= MASK - for x in self: - hx = hash(x) - h ^= (hx ^ (hx << 16) ^ 89869747) * 3644798167 - h &= MASK - h = h * 69069 + 907133923 - h &= MASK - if h > MAX: - h -= MASK + 1 - if h == -1: - h = 590923713 - return h - -Set.register(frozenset) - - -class MutableSet(Set): - - @abstractmethod - def add(self, value): - """Add an element.""" - raise NotImplementedError - - @abstractmethod - def discard(self, value): - """Remove an element. Do not raise an exception if absent.""" - raise NotImplementedError - - def remove(self, value): - """Remove an element. If not a member, raise a KeyError.""" - if value not in self: - raise KeyError(value) - self.discard(value) - - def pop(self): - """Return the popped value. 
Raise KeyError if empty.""" - it = iter(self) - try: - value = next(it) - except StopIteration: - raise KeyError - self.discard(value) - return value - - def clear(self): - """This is slow (creates N new iterators!) but effective.""" - try: - while True: - self.pop() - except KeyError: - pass - - def __ior__(self, it): - for value in it: - self.add(value) - return self - - def __iand__(self, it): - for value in (self - it): - self.discard(value) - return self - - def __ixor__(self, it): - if it is self: - self.clear() - else: - if not isinstance(it, Set): - it = self._from_iterable(it) - for value in it: - if value in self: - self.discard(value) - else: - self.add(value) - return self - - def __isub__(self, it): - if it is self: - self.clear() - else: - for value in it: - self.discard(value) - return self - -MutableSet.register(set) - - -### MAPPINGS ### - - -class Mapping(Sized, Iterable, Container): - - @abstractmethod - def __getitem__(self, key): - raise KeyError - - def get(self, key, default=None): - try: - return self[key] - except KeyError: - return default - - def __contains__(self, key): - try: - self[key] - except KeyError: - return False - else: - return True - - def keys(self): - return KeysView(self) - - def items(self): - return ItemsView(self) - - def values(self): - return ValuesView(self) - - def __eq__(self, other): - if not isinstance(other, Mapping): - return NotImplemented - return dict(self.items()) == dict(other.items()) - - def __ne__(self, other): - return not (self == other) - - -class MappingView(Sized): - - def __init__(self, mapping): - self._mapping = mapping - - def __len__(self): - return len(self._mapping) - - def __repr__(self): - return '{0.__class__.__name__}({0._mapping!r})'.format(self) - - -class KeysView(MappingView, Set): - - @classmethod - def _from_iterable(self, it): - return set(it) - - def __contains__(self, key): - return key in self._mapping - - def __iter__(self): - for key in self._mapping: - yield key - -KeysView.register(dict_keys) - - -class ItemsView(MappingView, Set): - - @classmethod - def _from_iterable(self, it): - return set(it) - - def __contains__(self, item): - key, value = item - try: - v = self._mapping[key] - except KeyError: - return False - else: - return v == value - - def __iter__(self): - for key in self._mapping: - yield (key, self._mapping[key]) - -ItemsView.register(dict_items) - - -class ValuesView(MappingView): - - def __contains__(self, value): - for key in self._mapping: - if value == self._mapping[key]: - return True - return False - - def __iter__(self): - for key in self._mapping: - yield self._mapping[key] - -ValuesView.register(dict_values) - - -class MutableMapping(Mapping): - - @abstractmethod - def __setitem__(self, key, value): - raise KeyError - - @abstractmethod - def __delitem__(self, key): - raise KeyError - - __marker = object() - - def pop(self, key, default=__marker): - try: - value = self[key] - except KeyError: - if default is self.__marker: - raise - return default - else: - del self[key] - return value - - def popitem(self): - try: - key = next(iter(self)) - except StopIteration: - raise KeyError - value = self[key] - del self[key] - return key, value - - def clear(self): - try: - while True: - self.popitem() - except KeyError: - pass - - def update(*args, **kwds): - if len(args) > 2: - raise TypeError("update() takes at most 2 positional " - "arguments ({} given)".format(len(args))) - elif not args: - raise TypeError("update() takes at least 1 argument (0 given)") - self = args[0] - other = 
args[1] if len(args) >= 2 else () - - if isinstance(other, Mapping): - for key in other: - self[key] = other[key] - elif hasattr(other, "keys"): - for key in other.keys(): - self[key] = other[key] - else: - for key, value in other: - self[key] = value - for key, value in kwds.items(): - self[key] = value - - def setdefault(self, key, default=None): - try: - return self[key] - except KeyError: - self[key] = default - return default - -MutableMapping.register(dict) - - -### SEQUENCES ### - - -class Sequence(Sized, Iterable, Container): - - """All the operations on a read-only sequence. - - Concrete subclasses must override __new__ or __init__, - __getitem__, and __len__. - """ - - @abstractmethod - def __getitem__(self, index): - raise IndexError - - def __iter__(self): - i = 0 - try: - while True: - v = self[i] - yield v - i += 1 - except IndexError: - return - - def __contains__(self, value): - for v in self: - if v == value: - return True - return False - - def __reversed__(self): - for i in reversed(range(len(self))): - yield self[i] - - def index(self, value): - for i, v in enumerate(self): - if v == value: - return i - raise ValueError - - def count(self, value): - return sum(1 for v in self if v == value) - -Sequence.register(tuple) -Sequence.register(str) -Sequence.register(range) - - -class ByteString(Sequence): - - """This unifies bytes and bytearray. - - XXX Should add all their methods. - """ - -ByteString.register(bytes) -ByteString.register(bytearray) - - -class MutableSequence(Sequence): - - @abstractmethod - def __setitem__(self, index, value): - raise IndexError - - @abstractmethod - def __delitem__(self, index): - raise IndexError - - @abstractmethod - def insert(self, index, value): - raise IndexError - - def append(self, value): - self.insert(len(self), value) - - def reverse(self): - n = len(self) - for i in range(n//2): - self[i], self[n-i-1] = self[n-i-1], self[i] - - def extend(self, values): - for v in values: - self.append(v) - - def pop(self, index=-1): - v = self[index] - del self[index] - return v - - def remove(self, value): - del self[self.index(value)] - - def __iadd__(self, values): - self.extend(values) - return self - -MutableSequence.register(list) -MutableSequence.register(bytearray) # Multiply inheriting, see ByteString Deleted: python/branches/py3k/Lib/collections.py ============================================================================== --- python/branches/py3k/Lib/collections.py Tue Feb 22 01:41:50 2011 +++ (empty file) @@ -1,1031 +0,0 @@ -__all__ = ['deque', 'defaultdict', 'namedtuple', 'UserDict', 'UserList', - 'UserString', 'Counter', 'OrderedDict'] -# For bootstrapping reasons, the collection ABCs are defined in _abcoll.py. -# They should however be considered an integral part of collections.py. 
-from _abcoll import * -import _abcoll -__all__ += _abcoll.__all__ - -from _collections import deque, defaultdict -from operator import itemgetter as _itemgetter -from keyword import iskeyword as _iskeyword -import sys as _sys -import heapq as _heapq -from weakref import proxy as _proxy -from itertools import repeat as _repeat, chain as _chain, starmap as _starmap -from reprlib import recursive_repr as _recursive_repr - -################################################################################ -### OrderedDict -################################################################################ - -class _Link(object): - __slots__ = 'prev', 'next', 'key', '__weakref__' - -class OrderedDict(dict): - 'Dictionary that remembers insertion order' - # An inherited dict maps keys to values. - # The inherited dict provides __getitem__, __len__, __contains__, and get. - # The remaining methods are order-aware. - # Big-O running times for all methods are the same as for regular dictionaries. - - # The internal self.__map dictionary maps keys to links in a doubly linked list. - # The circular doubly linked list starts and ends with a sentinel element. - # The sentinel element never gets deleted (this simplifies the algorithm). - # The sentinel is stored in self.__hardroot with a weakref proxy in self.__root. - # The prev/next links are weakref proxies (to prevent circular references). - # Individual links are kept alive by the hard reference in self.__map. - # Those hard references disappear when a key is deleted from an OrderedDict. - - def __init__(self, *args, **kwds): - '''Initialize an ordered dictionary. Signature is the same as for - regular dictionaries, but keyword arguments are not recommended - because their insertion order is arbitrary. - - ''' - if len(args) > 1: - raise TypeError('expected at most 1 arguments, got %d' % len(args)) - try: - self.__root - except AttributeError: - self.__hardroot = _Link() - self.__root = root = _proxy(self.__hardroot) - root.prev = root.next = root - self.__map = {} - self.__update(*args, **kwds) - - def __setitem__(self, key, value, - dict_setitem=dict.__setitem__, proxy=_proxy, Link=_Link): - 'od.__setitem__(i, y) <==> od[i]=y' - # Setting a new item creates a new link which goes at the end of the linked - # list, and the inherited dictionary is updated with the new key/value pair. - if key not in self: - self.__map[key] = link = Link() - root = self.__root - last = root.prev - link.prev, link.next, link.key = last, root, key - last.next = link - root.prev = proxy(link) - dict_setitem(self, key, value) - - def __delitem__(self, key, dict_delitem=dict.__delitem__): - 'od.__delitem__(y) <==> del od[y]' - # Deleting an existing item uses self.__map to find the link which is - # then removed by updating the links in the predecessor and successor nodes. - dict_delitem(self, key) - link = self.__map.pop(key) - link_prev = link.prev - link_next = link.next - link_prev.next = link_next - link_next.prev = link_prev - - def __iter__(self): - 'od.__iter__() <==> iter(od)' - # Traverse the linked list in order. - root = self.__root - curr = root.next - while curr is not root: - yield curr.key - curr = curr.next - - def __reversed__(self): - 'od.__reversed__() <==> reversed(od)' - # Traverse the linked list in reverse order. - root = self.__root - curr = root.prev - while curr is not root: - yield curr.key - curr = curr.prev - - def clear(self): - 'od.clear() -> None. Remove all items from od.' 
- root = self.__root - root.prev = root.next = root - self.__map.clear() - dict.clear(self) - - def popitem(self, last=True): - '''od.popitem() -> (k, v), return and remove a (key, value) pair. - Pairs are returned in LIFO order if last is true or FIFO order if false. - - ''' - if not self: - raise KeyError('dictionary is empty') - root = self.__root - if last: - link = root.prev - link_prev = link.prev - link_prev.next = root - root.prev = link_prev - else: - link = root.next - link_next = link.next - root.next = link_next - link_next.prev = root - key = link.key - del self.__map[key] - value = dict.pop(self, key) - return key, value - - def move_to_end(self, key, last=True): - '''Move an existing element to the end (or beginning if last==False). - - Raises KeyError if the element does not exist. - When last=True, acts like a fast version of self[key]=self.pop(key). - - ''' - link = self.__map[key] - link_prev = link.prev - link_next = link.next - link_prev.next = link_next - link_next.prev = link_prev - root = self.__root - if last: - last = root.prev - link.prev = last - link.next = root - last.next = root.prev = link - else: - first = root.next - link.prev = root - link.next = first - root.next = first.prev = link - - def __reduce__(self): - 'Return state information for pickling' - items = [[k, self[k]] for k in self] - tmp = self.__map, self.__root, self.__hardroot - del self.__map, self.__root, self.__hardroot - inst_dict = vars(self).copy() - self.__map, self.__root, self.__hardroot = tmp - if inst_dict: - return (self.__class__, (items,), inst_dict) - return self.__class__, (items,) - - def __sizeof__(self): - sizeof = _sys.getsizeof - n = len(self) + 1 # number of links including root - size = sizeof(self.__dict__) # instance dictionary - size += sizeof(self.__map) * 2 # internal dict and inherited dict - size += sizeof(self.__hardroot) * n # link objects - size += sizeof(self.__root) * n # proxy objects - return size - - update = __update = MutableMapping.update - keys = MutableMapping.keys - values = MutableMapping.values - items = MutableMapping.items - __ne__ = MutableMapping.__ne__ - - __marker = object() - - def pop(self, key, default=__marker): - if key in self: - result = self[key] - del self[key] - return result - if default is self.__marker: - raise KeyError(key) - return default - - def setdefault(self, key, default=None): - 'OD.setdefault(k[,d]) -> OD.get(k,d), also set OD[k]=d if k not in OD' - if key in self: - return self[key] - self[key] = default - return default - - @_recursive_repr() - def __repr__(self): - 'od.__repr__() <==> repr(od)' - if not self: - return '%s()' % (self.__class__.__name__,) - return '%s(%r)' % (self.__class__.__name__, list(self.items())) - - def copy(self): - 'od.copy() -> a shallow copy of od' - return self.__class__(self) - - @classmethod - def fromkeys(cls, iterable, value=None): - '''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S - and values equal to v (which defaults to None). - - ''' - d = cls() - for key in iterable: - d[key] = value - return d - - def __eq__(self, other): - '''od.__eq__(y) <==> od==y. Comparison to another OD is order-sensitive - while comparison to a regular mapping is order-insensitive. 
- - ''' - if isinstance(other, OrderedDict): - return len(self)==len(other) and \ - all(p==q for p, q in zip(self.items(), other.items())) - return dict.__eq__(self, other) - - -################################################################################ -### namedtuple -################################################################################ - -def namedtuple(typename, field_names, verbose=False, rename=False): - """Returns a new subclass of tuple with named fields. - - >>> Point = namedtuple('Point', 'x y') - >>> Point.__doc__ # docstring for the new class - 'Point(x, y)' - >>> p = Point(11, y=22) # instantiate with positional args or keywords - >>> p[0] + p[1] # indexable like a plain tuple - 33 - >>> x, y = p # unpack like a regular tuple - >>> x, y - (11, 22) - >>> p.x + p.y # fields also accessable by name - 33 - >>> d = p._asdict() # convert to a dictionary - >>> d['x'] - 11 - >>> Point(**d) # convert from a dictionary - Point(x=11, y=22) - >>> p._replace(x=100) # _replace() is like str.replace() but targets named fields - Point(x=100, y=22) - - """ - - # Parse and validate the field names. Validation serves two purposes, - # generating informative error messages and preventing template injection attacks. - if isinstance(field_names, str): - field_names = field_names.replace(',', ' ').split() # names separated by whitespace and/or commas - field_names = tuple(map(str, field_names)) - if rename: - names = list(field_names) - seen = set() - for i, name in enumerate(names): - if (not all(c.isalnum() or c=='_' for c in name) or _iskeyword(name) - or not name or name[0].isdigit() or name.startswith('_') - or name in seen): - names[i] = '_%d' % i - seen.add(name) - field_names = tuple(names) - for name in (typename,) + field_names: - if not all(c.isalnum() or c=='_' for c in name): - raise ValueError('Type names and field names can only contain alphanumeric characters and underscores: %r' % name) - if _iskeyword(name): - raise ValueError('Type names and field names cannot be a keyword: %r' % name) - if name[0].isdigit(): - raise ValueError('Type names and field names cannot start with a number: %r' % name) - seen_names = set() - for name in field_names: - if name.startswith('_') and not rename: - raise ValueError('Field names cannot start with an underscore: %r' % name) - if name in seen_names: - raise ValueError('Encountered duplicate field name: %r' % name) - seen_names.add(name) - - # Create and fill-in the class template - numfields = len(field_names) - argtxt = repr(field_names).replace("'", "")[1:-1] # tuple repr without parens or quotes - reprtxt = ', '.join('%s=%%r' % name for name in field_names) - template = '''class %(typename)s(tuple): - '%(typename)s(%(argtxt)s)' \n - __slots__ = () \n - _fields = %(field_names)r \n - def __new__(_cls, %(argtxt)s): - 'Create new instance of %(typename)s(%(argtxt)s)' - return _tuple.__new__(_cls, (%(argtxt)s)) \n - @classmethod - def _make(cls, iterable, new=tuple.__new__, len=len): - 'Make a new %(typename)s object from a sequence or iterable' - result = new(cls, iterable) - if len(result) != %(numfields)d: - raise TypeError('Expected %(numfields)d arguments, got %%d' %% len(result)) - return result \n - def __repr__(self): - 'Return a nicely formatted representation string' - return self.__class__.__name__ + '(%(reprtxt)s)' %% self \n - def _asdict(self): - 'Return a new OrderedDict which maps field names to their values' - return OrderedDict(zip(self._fields, self)) \n - def _replace(_self, **kwds): - 'Return a new %(typename)s 
object replacing specified fields with new values' - result = _self._make(map(kwds.pop, %(field_names)r, _self)) - if kwds: - raise ValueError('Got unexpected field names: %%r' %% kwds.keys()) - return result \n - def __getnewargs__(self): - 'Return self as a plain tuple. Used by copy and pickle.' - return tuple(self) \n\n''' % locals() - for i, name in enumerate(field_names): - template += " %s = _property(_itemgetter(%d), doc='Alias for field number %d')\n" % (name, i, i) - if verbose: - print(template) - - # Execute the template string in a temporary namespace and - # support tracing utilities by setting a value for frame.f_globals['__name__'] - namespace = dict(_itemgetter=_itemgetter, __name__='namedtuple_%s' % typename, - OrderedDict=OrderedDict, _property=property, _tuple=tuple) - try: - exec(template, namespace) - except SyntaxError as e: - raise SyntaxError(e.msg + ':\n\n' + template) - result = namespace[typename] - - # For pickling to work, the __module__ variable needs to be set to the frame - # where the named tuple is created. Bypass this step in enviroments where - # sys._getframe is not defined (Jython for example) or sys._getframe is not - # defined for arguments greater than 0 (IronPython). - try: - result.__module__ = _sys._getframe(1).f_globals.get('__name__', '__main__') - except (AttributeError, ValueError): - pass - - return result - - -######################################################################## -### Counter -######################################################################## - -def _count_elements(mapping, iterable): - 'Tally elements from the iterable.' - mapping_get = mapping.get - for elem in iterable: - mapping[elem] = mapping_get(elem, 0) + 1 - -try: # Load C helper function if available - from _collections import _count_elements -except ImportError: - pass - -class Counter(dict): - '''Dict subclass for counting hashable items. Sometimes called a bag - or multiset. Elements are stored as dictionary keys and their counts - are stored as dictionary values. - - >>> c = Counter('abcdeabcdabcaba') # count elements from a string - - >>> c.most_common(3) # three most common elements - [('a', 5), ('b', 4), ('c', 3)] - >>> sorted(c) # list all unique elements - ['a', 'b', 'c', 'd', 'e'] - >>> ''.join(sorted(c.elements())) # list elements with repetitions - 'aaaaabbbbcccdde' - >>> sum(c.values()) # total of all counts - 15 - - >>> c['a'] # count of letter 'a' - 5 - >>> for elem in 'shazam': # update counts from an iterable - ... c[elem] += 1 # by adding 1 to each element's count - >>> c['a'] # now there are seven 'a' - 7 - >>> del c['b'] # remove all 'b' - >>> c['b'] # now there are zero 'b' - 0 - - >>> d = Counter('simsalabim') # make another counter - >>> c.update(d) # add in the second counter - >>> c['a'] # now there are nine 'a' - 9 - - >>> c.clear() # empty the counter - >>> c - Counter() - - Note: If a count is set to zero or reduced to zero, it will remain - in the counter until the entry is deleted or the counter is cleared: - - >>> c = Counter('aaabbc') - >>> c['b'] -= 2 # reduce the count of 'b' by two - >>> c.most_common() # 'b' is still in, but its count is zero - [('a', 3), ('c', 1), ('b', 0)] - - ''' - # References: - # http://en.wikipedia.org/wiki/Multiset - # http://www.gnu.org/software/smalltalk/manual-base/html_node/Bag.html - # http://www.demo2s.com/Tutorial/Cpp/0380__set-multiset/Catalog0380__set-multiset.htm - # http://code.activestate.com/recipes/259174/ - # Knuth, TAOCP Vol. 
II section 4.6.3 - - def __init__(self, iterable=None, **kwds): - '''Create a new, empty Counter object. And if given, count elements - from an input iterable. Or, initialize the count from another mapping - of elements to their counts. - - >>> c = Counter() # a new, empty counter - >>> c = Counter('gallahad') # a new counter from an iterable - >>> c = Counter({'a': 4, 'b': 2}) # a new counter from a mapping - >>> c = Counter(a=4, b=2) # a new counter from keyword args - - ''' - super().__init__() - self.update(iterable, **kwds) - - def __missing__(self, key): - 'The count of elements not in the Counter is zero.' - # Needed so that self[missing_item] does not raise KeyError - return 0 - - def most_common(self, n=None): - '''List the n most common elements and their counts from the most - common to the least. If n is None, then list all element counts. - - >>> Counter('abcdeabcdabcaba').most_common(3) - [('a', 5), ('b', 4), ('c', 3)] - - ''' - # Emulate Bag.sortedByCount from Smalltalk - if n is None: - return sorted(self.items(), key=_itemgetter(1), reverse=True) - return _heapq.nlargest(n, self.items(), key=_itemgetter(1)) - - def elements(self): - '''Iterator over elements repeating each as many times as its count. - - >>> c = Counter('ABCABC') - >>> sorted(c.elements()) - ['A', 'A', 'B', 'B', 'C', 'C'] - - # Knuth's example for prime factors of 1836: 2**2 * 3**3 * 17**1 - >>> prime_factors = Counter({2: 2, 3: 3, 17: 1}) - >>> product = 1 - >>> for factor in prime_factors.elements(): # loop over factors - ... product *= factor # and multiply them - >>> product - 1836 - - Note, if an element's count has been set to zero or is a negative - number, elements() will ignore it. - - ''' - # Emulate Bag.do from Smalltalk and Multiset.begin from C++. - return _chain.from_iterable(_starmap(_repeat, self.items())) - - # Override dict methods where necessary - - @classmethod - def fromkeys(cls, iterable, v=None): - # There is no equivalent method for counters because setting v=1 - # means that no element can have a count greater than one. - raise NotImplementedError( - 'Counter.fromkeys() is undefined. Use Counter(iterable) instead.') - - def update(self, iterable=None, **kwds): - '''Like dict.update() but add counts instead of replacing them. - - Source can be an iterable, a dictionary, or another Counter instance. - - >>> c = Counter('which') - >>> c.update('witch') # add elements from another iterable - >>> d = Counter('watch') - >>> c.update(d) # add elements from another counter - >>> c['h'] # four 'h' in which, witch, and watch - 4 - - ''' - # The regular dict.update() operation makes no sense here because the - # replace behavior results in the some of original untouched counts - # being mixed-in with all of the other counts for a mismash that - # doesn't have a straight-forward interpretation in most counting - # contexts. Instead, we implement straight-addition. Both the inputs - # and outputs are allowed to contain zero and negative counts. - - if iterable is not None: - if isinstance(iterable, Mapping): - if self: - self_get = self.get - for elem, count in iterable.items(): - self[elem] = count + self_get(elem, 0) - else: - super().update(iterable) # fast path when counter is empty - else: - _count_elements(self, iterable) - if kwds: - self.update(kwds) - - def subtract(self, iterable=None, **kwds): - '''Like dict.update() but subtracts counts instead of replacing them. - Counts can be reduced below zero. Both the inputs and outputs are - allowed to contain zero and negative counts. 
- - Source can be an iterable, a dictionary, or another Counter instance. - - >>> c = Counter('which') - >>> c.subtract('witch') # subtract elements from another iterable - >>> c.subtract(Counter('watch')) # subtract elements from another counter - >>> c['h'] # 2 in which, minus 1 in witch, minus 1 in watch - 0 - >>> c['w'] # 1 in which, minus 1 in witch, minus 1 in watch - -1 - - ''' - if iterable is not None: - self_get = self.get - if isinstance(iterable, Mapping): - for elem, count in iterable.items(): - self[elem] = self_get(elem, 0) - count - else: - for elem in iterable: - self[elem] = self_get(elem, 0) - 1 - if kwds: - self.subtract(kwds) - - def copy(self): - 'Like dict.copy() but returns a Counter instance instead of a dict.' - return Counter(self) - - def __reduce__(self): - return self.__class__, (dict(self),) - - def __delitem__(self, elem): - 'Like dict.__delitem__() but does not raise KeyError for missing values.' - if elem in self: - super().__delitem__(elem) - - def __repr__(self): - if not self: - return '%s()' % self.__class__.__name__ - items = ', '.join(map('%r: %r'.__mod__, self.most_common())) - return '%s({%s})' % (self.__class__.__name__, items) - - # Multiset-style mathematical operations discussed in: - # Knuth TAOCP Volume II section 4.6.3 exercise 19 - # and at http://en.wikipedia.org/wiki/Multiset - # - # Outputs guaranteed to only include positive counts. - # - # To strip negative and zero counts, add-in an empty counter: - # c += Counter() - - def __add__(self, other): - '''Add counts from two counters. - - >>> Counter('abbb') + Counter('bcc') - Counter({'b': 4, 'c': 2, 'a': 1}) - - ''' - if not isinstance(other, Counter): - return NotImplemented - result = Counter() - for elem in set(self) | set(other): - newcount = self[elem] + other[elem] - if newcount > 0: - result[elem] = newcount - return result - - def __sub__(self, other): - ''' Subtract count, but keep only results with positive counts. - - >>> Counter('abbbc') - Counter('bccd') - Counter({'b': 2, 'a': 1}) - - ''' - if not isinstance(other, Counter): - return NotImplemented - result = Counter() - for elem in set(self) | set(other): - newcount = self[elem] - other[elem] - if newcount > 0: - result[elem] = newcount - return result - - def __or__(self, other): - '''Union is the maximum of value in either of the input counters. - - >>> Counter('abbb') | Counter('bcc') - Counter({'b': 3, 'c': 2, 'a': 1}) - - ''' - if not isinstance(other, Counter): - return NotImplemented - result = Counter() - for elem in set(self) | set(other): - p, q = self[elem], other[elem] - newcount = q if p < q else p - if newcount > 0: - result[elem] = newcount - return result - - def __and__(self, other): - ''' Intersection is the minimum of corresponding counts. - - >>> Counter('abbb') & Counter('bcc') - Counter({'b': 1}) - - ''' - if not isinstance(other, Counter): - return NotImplemented - result = Counter() - if len(self) < len(other): - self, other = other, self - for elem in filter(self.__contains__, other): - p, q = self[elem], other[elem] - newcount = p if p < q else q - if newcount > 0: - result[elem] = newcount - return result - - -######################################################################## -### ChainMap (helper for configparser) -######################################################################## - -class _ChainMap(MutableMapping): - ''' A ChainMap groups multiple dicts (or other mappings) together - to create a single, updateable view. - - The underlying mappings are stored in a list. 
That list is public and can - accessed or updated using the *maps* attribute. There is no other state. - - Lookups search the underlying mappings successively until a key is found. - In contrast, writes, updates, and deletions only operate on the first - mapping. - - ''' - - def __init__(self, *maps): - '''Initialize a ChainMap by setting *maps* to the given mappings. - If no mappings are provided, a single empty dictionary is used. - - ''' - self.maps = list(maps) or [{}] # always at least one map - - def __missing__(self, key): - raise KeyError(key) - - def __getitem__(self, key): - for mapping in self.maps: - try: - return mapping[key] # can't use 'key in mapping' with defaultdict - except KeyError: - pass - return self.__missing__(key) # support subclasses that define __missing__ - - def get(self, key, default=None): - return self[key] if key in self else default - - def __len__(self): - return len(set().union(*self.maps)) # reuses stored hash values if possible - - def __iter__(self): - return iter(set().union(*self.maps)) - - def __contains__(self, key): - return any(key in m for m in self.maps) - - @_recursive_repr() - def __repr__(self): - return '{0.__class__.__name__}({1})'.format( - self, ', '.join(map(repr, self.maps))) - - @classmethod - def fromkeys(cls, iterable, *args): - 'Create a ChainMap with a single dict created from the iterable.' - return cls(dict.fromkeys(iterable, *args)) - - def copy(self): - 'New ChainMap or subclass with a new copy of maps[0] and refs to maps[1:]' - return self.__class__(self.maps[0].copy(), *self.maps[1:]) - - __copy__ = copy - - def __setitem__(self, key, value): - self.maps[0][key] = value - - def __delitem__(self, key): - try: - del self.maps[0][key] - except KeyError: - raise KeyError('Key not found in the first mapping: {!r}'.format(key)) - - def popitem(self): - 'Remove and return an item pair from maps[0]. Raise KeyError is maps[0] is empty.' - try: - return self.maps[0].popitem() - except KeyError: - raise KeyError('No keys found in the first mapping.') - - def pop(self, key, *args): - 'Remove *key* from maps[0] and return its value. Raise KeyError if *key* not in maps[0].' - try: - return self.maps[0].pop(key, *args) - except KeyError: - raise KeyError('Key not found in the first mapping: {!r}'.format(key)) - - def clear(self): - 'Clear maps[0], leaving maps[1:] intact.' 
- self.maps[0].clear() - - -################################################################################ -### UserDict -################################################################################ - -class UserDict(MutableMapping): - - # Start by filling-out the abstract methods - def __init__(self, dict=None, **kwargs): - self.data = {} - if dict is not None: - self.update(dict) - if len(kwargs): - self.update(kwargs) - def __len__(self): return len(self.data) - def __getitem__(self, key): - if key in self.data: - return self.data[key] - if hasattr(self.__class__, "__missing__"): - return self.__class__.__missing__(self, key) - raise KeyError(key) - def __setitem__(self, key, item): self.data[key] = item - def __delitem__(self, key): del self.data[key] - def __iter__(self): - return iter(self.data) - - # Modify __contains__ to work correctly when __missing__ is present - def __contains__(self, key): - return key in self.data - - # Now, add the methods in dicts but not in MutableMapping - def __repr__(self): return repr(self.data) - def copy(self): - if self.__class__ is UserDict: - return UserDict(self.data.copy()) - import copy - data = self.data - try: - self.data = {} - c = copy.copy(self) - finally: - self.data = data - c.update(self) - return c - @classmethod - def fromkeys(cls, iterable, value=None): - d = cls() - for key in iterable: - d[key] = value - return d - - - -################################################################################ -### UserList -################################################################################ - -class UserList(MutableSequence): - """A more or less complete user-defined wrapper around list objects.""" - def __init__(self, initlist=None): - self.data = [] - if initlist is not None: - # XXX should this accept an arbitrary sequence? 
- if type(initlist) == type(self.data): - self.data[:] = initlist - elif isinstance(initlist, UserList): - self.data[:] = initlist.data[:] - else: - self.data = list(initlist) - def __repr__(self): return repr(self.data) - def __lt__(self, other): return self.data < self.__cast(other) - def __le__(self, other): return self.data <= self.__cast(other) - def __eq__(self, other): return self.data == self.__cast(other) - def __ne__(self, other): return self.data != self.__cast(other) - def __gt__(self, other): return self.data > self.__cast(other) - def __ge__(self, other): return self.data >= self.__cast(other) - def __cast(self, other): - return other.data if isinstance(other, UserList) else other - def __contains__(self, item): return item in self.data - def __len__(self): return len(self.data) - def __getitem__(self, i): return self.data[i] - def __setitem__(self, i, item): self.data[i] = item - def __delitem__(self, i): del self.data[i] - def __add__(self, other): - if isinstance(other, UserList): - return self.__class__(self.data + other.data) - elif isinstance(other, type(self.data)): - return self.__class__(self.data + other) - return self.__class__(self.data + list(other)) - def __radd__(self, other): - if isinstance(other, UserList): - return self.__class__(other.data + self.data) - elif isinstance(other, type(self.data)): - return self.__class__(other + self.data) - return self.__class__(list(other) + self.data) - def __iadd__(self, other): - if isinstance(other, UserList): - self.data += other.data - elif isinstance(other, type(self.data)): - self.data += other - else: - self.data += list(other) - return self - def __mul__(self, n): - return self.__class__(self.data*n) - __rmul__ = __mul__ - def __imul__(self, n): - self.data *= n - return self - def append(self, item): self.data.append(item) - def insert(self, i, item): self.data.insert(i, item) - def pop(self, i=-1): return self.data.pop(i) - def remove(self, item): self.data.remove(item) - def count(self, item): return self.data.count(item) - def index(self, item, *args): return self.data.index(item, *args) - def reverse(self): self.data.reverse() - def sort(self, *args, **kwds): self.data.sort(*args, **kwds) - def extend(self, other): - if isinstance(other, UserList): - self.data.extend(other.data) - else: - self.data.extend(other) - - - -################################################################################ -### UserString -################################################################################ - -class UserString(Sequence): - def __init__(self, seq): - if isinstance(seq, str): - self.data = seq - elif isinstance(seq, UserString): - self.data = seq.data[:] - else: - self.data = str(seq) - def __str__(self): return str(self.data) - def __repr__(self): return repr(self.data) - def __int__(self): return int(self.data) - def __float__(self): return float(self.data) - def __complex__(self): return complex(self.data) - def __hash__(self): return hash(self.data) - - def __eq__(self, string): - if isinstance(string, UserString): - return self.data == string.data - return self.data == string - def __ne__(self, string): - if isinstance(string, UserString): - return self.data != string.data - return self.data != string - def __lt__(self, string): - if isinstance(string, UserString): - return self.data < string.data - return self.data < string - def __le__(self, string): - if isinstance(string, UserString): - return self.data <= string.data - return self.data <= string - def __gt__(self, string): - if isinstance(string, 
UserString): - return self.data > string.data - return self.data > string - def __ge__(self, string): - if isinstance(string, UserString): - return self.data >= string.data - return self.data >= string - - def __contains__(self, char): - if isinstance(char, UserString): - char = char.data - return char in self.data - - def __len__(self): return len(self.data) - def __getitem__(self, index): return self.__class__(self.data[index]) - def __add__(self, other): - if isinstance(other, UserString): - return self.__class__(self.data + other.data) - elif isinstance(other, str): - return self.__class__(self.data + other) - return self.__class__(self.data + str(other)) - def __radd__(self, other): - if isinstance(other, str): - return self.__class__(other + self.data) - return self.__class__(str(other) + self.data) - def __mul__(self, n): - return self.__class__(self.data*n) - __rmul__ = __mul__ - def __mod__(self, args): - return self.__class__(self.data % args) - - # the following methods are defined in alphabetical order: - def capitalize(self): return self.__class__(self.data.capitalize()) - def center(self, width, *args): - return self.__class__(self.data.center(width, *args)) - def count(self, sub, start=0, end=_sys.maxsize): - if isinstance(sub, UserString): - sub = sub.data - return self.data.count(sub, start, end) - def encode(self, encoding=None, errors=None): # XXX improve this? - if encoding: - if errors: - return self.__class__(self.data.encode(encoding, errors)) - return self.__class__(self.data.encode(encoding)) - return self.__class__(self.data.encode()) - def endswith(self, suffix, start=0, end=_sys.maxsize): - return self.data.endswith(suffix, start, end) - def expandtabs(self, tabsize=8): - return self.__class__(self.data.expandtabs(tabsize)) - def find(self, sub, start=0, end=_sys.maxsize): - if isinstance(sub, UserString): - sub = sub.data - return self.data.find(sub, start, end) - def format(self, *args, **kwds): - return self.data.format(*args, **kwds) - def index(self, sub, start=0, end=_sys.maxsize): - return self.data.index(sub, start, end) - def isalpha(self): return self.data.isalpha() - def isalnum(self): return self.data.isalnum() - def isdecimal(self): return self.data.isdecimal() - def isdigit(self): return self.data.isdigit() - def isidentifier(self): return self.data.isidentifier() - def islower(self): return self.data.islower() - def isnumeric(self): return self.data.isnumeric() - def isspace(self): return self.data.isspace() - def istitle(self): return self.data.istitle() - def isupper(self): return self.data.isupper() - def join(self, seq): return self.data.join(seq) - def ljust(self, width, *args): - return self.__class__(self.data.ljust(width, *args)) - def lower(self): return self.__class__(self.data.lower()) - def lstrip(self, chars=None): return self.__class__(self.data.lstrip(chars)) - def partition(self, sep): - return self.data.partition(sep) - def replace(self, old, new, maxsplit=-1): - if isinstance(old, UserString): - old = old.data - if isinstance(new, UserString): - new = new.data - return self.__class__(self.data.replace(old, new, maxsplit)) - def rfind(self, sub, start=0, end=_sys.maxsize): - if isinstance(sub, UserString): - sub = sub.data - return self.data.rfind(sub, start, end) - def rindex(self, sub, start=0, end=_sys.maxsize): - return self.data.rindex(sub, start, end) - def rjust(self, width, *args): - return self.__class__(self.data.rjust(width, *args)) - def rpartition(self, sep): - return self.data.rpartition(sep) - def rstrip(self, 
chars=None): - return self.__class__(self.data.rstrip(chars)) - def split(self, sep=None, maxsplit=-1): - return self.data.split(sep, maxsplit) - def rsplit(self, sep=None, maxsplit=-1): - return self.data.rsplit(sep, maxsplit) - def splitlines(self, keepends=0): return self.data.splitlines(keepends) - def startswith(self, prefix, start=0, end=_sys.maxsize): - return self.data.startswith(prefix, start, end) - def strip(self, chars=None): return self.__class__(self.data.strip(chars)) - def swapcase(self): return self.__class__(self.data.swapcase()) - def title(self): return self.__class__(self.data.title()) - def translate(self, *args): - return self.__class__(self.data.translate(*args)) - def upper(self): return self.__class__(self.data.upper()) - def zfill(self, width): return self.__class__(self.data.zfill(width)) - - - -################################################################################ -### Simple tests -################################################################################ - -if __name__ == '__main__': - # verify that instances can be pickled - from pickle import loads, dumps - Point = namedtuple('Point', 'x, y', True) - p = Point(x=10, y=20) - assert p == loads(dumps(p)) - - # test and demonstrate ability to override methods - class Point(namedtuple('Point', 'x y')): - __slots__ = () - @property - def hypot(self): - return (self.x ** 2 + self.y ** 2) ** 0.5 - def __str__(self): - return 'Point: x=%6.3f y=%6.3f hypot=%6.3f' % (self.x, self.y, self.hypot) - - for p in Point(3, 4), Point(14, 5/7.): - print (p) - - class Point(namedtuple('Point', 'x y')): - 'Point class with optimized _make() and _replace() without error-checking' - __slots__ = () - _make = classmethod(tuple.__new__) - def _replace(self, _map=map, **kwds): - return self._make(_map(kwds.get, ('x', 'y'), self)) - - print(Point(11, 22)._replace(x=100)) - - Point3D = namedtuple('Point3D', Point._fields + ('z',)) - print(Point3D.__doc__) - - import doctest - TestResults = namedtuple('TestResults', 'failed attempted') - print(TestResults(*doctest.testmod())) Copied: python/branches/py3k/Lib/collections/__init__.py (from r88470, /python/branches/py3k/Lib/collections.py) ============================================================================== --- /python/branches/py3k/Lib/collections.py (original) +++ python/branches/py3k/Lib/collections/__init__.py Tue Feb 22 01:41:50 2011 @@ -1,10 +1,11 @@ __all__ = ['deque', 'defaultdict', 'namedtuple', 'UserDict', 'UserList', 'UserString', 'Counter', 'OrderedDict'] -# For bootstrapping reasons, the collection ABCs are defined in _abcoll.py. -# They should however be considered an integral part of collections.py. -from _abcoll import * -import _abcoll -__all__ += _abcoll.__all__ + +# For backwards compatability, continue to make the collections ABCs +# available through the collections module. +from collections.abc import * +import collections.abc +__all__ += collections.abc.__all__ from _collections import deque, defaultdict from operator import itemgetter as _itemgetter Copied: python/branches/py3k/Lib/collections/abc.py (from r88469, /python/branches/py3k/Lib/_abcoll.py) ============================================================================== --- /python/branches/py3k/Lib/_abcoll.py (original) +++ python/branches/py3k/Lib/collections/abc.py Tue Feb 22 01:41:50 2011 @@ -3,9 +3,7 @@ """Abstract Base Classes (ABCs) for collections, according to PEP 3119. -DON'T USE THIS MODULE DIRECTLY! 
The classes here should be imported -via collections; they are defined here only to alleviate certain -bootstrapping issues. Unit tests are in test_collections. +Unit tests are in test_collections. """ from abc import ABCMeta, abstractmethod Modified: python/branches/py3k/Lib/os.py ============================================================================== --- python/branches/py3k/Lib/os.py (original) +++ python/branches/py3k/Lib/os.py Tue Feb 22 01:41:50 2011 @@ -434,7 +434,7 @@ # Change environ to automatically call putenv(), unsetenv if they exist. -from _abcoll import MutableMapping # Can't use collections (bootstrap) +from collections.abc import MutableMapping class _Environ(MutableMapping): def __init__(self, data, encodekey, decodekey, encodevalue, decodevalue, putenv, unsetenv): Modified: python/branches/py3k/Lib/test/regrtest.py ============================================================================== --- python/branches/py3k/Lib/test/regrtest.py (original) +++ python/branches/py3k/Lib/test/regrtest.py Tue Feb 22 01:41:50 2011 @@ -1056,7 +1056,8 @@ False if the test didn't leak references; True if we detected refleaks. """ # This code is hackish and inelegant, but it seems to do the job. - import copyreg, _abcoll + import copyreg + import collections.abc if not hasattr(sys, 'gettotalrefcount'): raise Exception("Tracking reference leaks requires a debug build " @@ -1073,7 +1074,7 @@ else: zdc = zipimport._zip_directory_cache.copy() abcs = {} - for abc in [getattr(_abcoll, a) for a in _abcoll.__all__]: + for abc in [getattr(collections.abc, a) for a in collections.abc.__all__]: if not isabstract(abc): continue for obj in abc.__subclasses__() + [abc]: @@ -1119,7 +1120,7 @@ import gc, copyreg import _strptime, linecache import urllib.parse, urllib.request, mimetypes, doctest - import struct, filecmp, _abcoll + import struct, filecmp, collections.abc from distutils.dir_util import _path_created from weakref import WeakSet @@ -1146,7 +1147,7 @@ sys._clear_type_cache() # Clear ABC registries, restoring previously saved ABC registries. - for abc in [getattr(_abcoll, a) for a in _abcoll.__all__]: + for abc in [getattr(collections.abc, a) for a in collections.abc.__all__]: if not isabstract(abc): continue for obj in abc.__subclasses__() + [abc]: Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Tue Feb 22 01:41:50 2011 @@ -27,6 +27,11 @@ Library ------- +- Issue #11085: Moved collections abstract base classes into a separate + module called collections.abc, following the pattern used by importlib.abc. + For backwards compatibility, the names are imported into the collections + module. + - Issue #4681: Allow mmap() to work on file sizes and offsets larger than 4GB, even on 32-bit builds. Initial patch by Ross Lagerwall, adapted for 32-bit Windows. From python-checkins at python.org Tue Feb 22 02:48:33 2011 From: python-checkins at python.org (raymond.hettinger) Date: Tue, 22 Feb 2011 02:48:33 +0100 (CET) Subject: [Python-checkins] r88491 - in python/branches/py3k/Lib/test: test_cfgparser.py test_configparser.py Message-ID: <20110222014833.E7F4CEE985@mail.python.org> Author: raymond.hettinger Date: Tue Feb 22 02:48:33 2011 New Revision: 88491 Log: Have the test filename match the module filename. 
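As context for the Issue #11085 entry in the Misc/NEWS diff above: a minimal sketch, assuming only the re-export shim shown in the collections/__init__.py diff (the names stay reachable through the collections namespace while collections.abc becomes the canonical location). The LowerDict class below and its methods are illustrative only and are not part of any commit in this digest.

    import collections
    import collections.abc

    # Both names resolve to the same ABC object, because
    # collections/__init__.py does 'from collections.abc import *'
    # and extends __all__, as shown in the diff above.
    assert collections.MutableMapping is collections.abc.MutableMapping

    # New code can spell out the collections.abc location explicitly.
    # LowerDict is a hypothetical example, not stdlib code.
    class LowerDict(collections.abc.MutableMapping):
        """Toy mapping that lower-cases its keys."""
        def __init__(self):
            self._data = {}
        def __getitem__(self, key):
            return self._data[key.lower()]
        def __setitem__(self, key, value):
            self._data[key.lower()] = value
        def __delitem__(self, key):
            del self._data[key.lower()]
        def __iter__(self):
            return iter(self._data)
        def __len__(self):
            return len(self._data)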
Added: python/branches/py3k/Lib/test/test_configparser.py - copied unchanged from r88489, /python/branches/py3k/Lib/test/test_cfgparser.py Removed: python/branches/py3k/Lib/test/test_cfgparser.py Deleted: python/branches/py3k/Lib/test/test_cfgparser.py ============================================================================== --- python/branches/py3k/Lib/test/test_cfgparser.py Tue Feb 22 02:48:33 2011 +++ (empty file) @@ -1,1342 +0,0 @@ -import collections -import configparser -import io -import os -import sys -import textwrap -import unittest -import warnings - -from test import support - -class SortedDict(collections.UserDict): - - def items(self): - return sorted(self.data.items()) - - def keys(self): - return sorted(self.data.keys()) - - def values(self): - return [i[1] for i in self.items()] - - def iteritems(self): return iter(self.items()) - def iterkeys(self): return iter(self.keys()) - __iter__ = iterkeys - def itervalues(self): return iter(self.values()) - - -class CfgParserTestCaseClass(unittest.TestCase): - allow_no_value = False - delimiters = ('=', ':') - comment_prefixes = (';', '#') - inline_comment_prefixes = (';', '#') - empty_lines_in_values = True - dict_type = configparser._default_dict - strict = False - default_section = configparser.DEFAULTSECT - interpolation = configparser._UNSET - - def newconfig(self, defaults=None): - arguments = dict( - defaults=defaults, - allow_no_value=self.allow_no_value, - delimiters=self.delimiters, - comment_prefixes=self.comment_prefixes, - inline_comment_prefixes=self.inline_comment_prefixes, - empty_lines_in_values=self.empty_lines_in_values, - dict_type=self.dict_type, - strict=self.strict, - default_section=self.default_section, - interpolation=self.interpolation, - ) - instance = self.config_class(**arguments) - return instance - - def fromstring(self, string, defaults=None): - cf = self.newconfig(defaults) - cf.read_string(string) - return cf - -class BasicTestCase(CfgParserTestCaseClass): - - def basic_test(self, cf): - E = ['Commented Bar', - 'Foo Bar', - 'Internationalized Stuff', - 'Long Line', - 'Section\\with$weird%characters[\t', - 'Spaces', - 'Spacey Bar', - 'Spacey Bar From The Beginning', - 'Types', - ] - - if self.allow_no_value: - E.append('NoValue') - E.sort() - F = [('baz', 'qwe'), ('foo', 'bar3')] - - # API access - L = cf.sections() - L.sort() - eq = self.assertEqual - eq(L, E) - L = cf.items('Spacey Bar From The Beginning') - L.sort() - eq(L, F) - - # mapping access - L = [section for section in cf] - L.sort() - E.append(self.default_section) - E.sort() - eq(L, E) - L = cf['Spacey Bar From The Beginning'].items() - L = sorted(list(L)) - eq(L, F) - L = cf.items() - L = sorted(list(L)) - self.assertEqual(len(L), len(E)) - for name, section in L: - eq(name, section.name) - eq(cf.defaults(), cf[self.default_section]) - - # The use of spaces in the section names serves as a - # regression test for SourceForge bug #583248: - # http://www.python.org/sf/583248 - - # API access - eq(cf.get('Foo Bar', 'foo'), 'bar1') - eq(cf.get('Spacey Bar', 'foo'), 'bar2') - eq(cf.get('Spacey Bar From The Beginning', 'foo'), 'bar3') - eq(cf.get('Spacey Bar From The Beginning', 'baz'), 'qwe') - eq(cf.get('Commented Bar', 'foo'), 'bar4') - eq(cf.get('Commented Bar', 'baz'), 'qwe') - eq(cf.get('Spaces', 'key with spaces'), 'value') - eq(cf.get('Spaces', 'another with spaces'), 'splat!') - eq(cf.getint('Types', 'int'), 42) - eq(cf.get('Types', 'int'), "42") - self.assertAlmostEqual(cf.getfloat('Types', 'float'), 0.44) - 
eq(cf.get('Types', 'float'), "0.44") - eq(cf.getboolean('Types', 'boolean'), False) - eq(cf.get('Types', '123'), 'strange but acceptable') - if self.allow_no_value: - eq(cf.get('NoValue', 'option-without-value'), None) - - # test vars= and fallback= - eq(cf.get('Foo Bar', 'foo', fallback='baz'), 'bar1') - eq(cf.get('Foo Bar', 'foo', vars={'foo': 'baz'}), 'baz') - with self.assertRaises(configparser.NoSectionError): - cf.get('No Such Foo Bar', 'foo') - with self.assertRaises(configparser.NoOptionError): - cf.get('Foo Bar', 'no-such-foo') - eq(cf.get('No Such Foo Bar', 'foo', fallback='baz'), 'baz') - eq(cf.get('Foo Bar', 'no-such-foo', fallback='baz'), 'baz') - eq(cf.get('Spacey Bar', 'foo', fallback=None), 'bar2') - eq(cf.get('No Such Spacey Bar', 'foo', fallback=None), None) - eq(cf.getint('Types', 'int', fallback=18), 42) - eq(cf.getint('Types', 'no-such-int', fallback=18), 18) - eq(cf.getint('Types', 'no-such-int', fallback="18"), "18") # sic! - with self.assertRaises(configparser.NoOptionError): - cf.getint('Types', 'no-such-int') - self.assertAlmostEqual(cf.getfloat('Types', 'float', - fallback=0.0), 0.44) - self.assertAlmostEqual(cf.getfloat('Types', 'no-such-float', - fallback=0.0), 0.0) - eq(cf.getfloat('Types', 'no-such-float', fallback="0.0"), "0.0") # sic! - with self.assertRaises(configparser.NoOptionError): - cf.getfloat('Types', 'no-such-float') - eq(cf.getboolean('Types', 'boolean', fallback=True), False) - eq(cf.getboolean('Types', 'no-such-boolean', fallback="yes"), - "yes") # sic! - eq(cf.getboolean('Types', 'no-such-boolean', fallback=True), True) - with self.assertRaises(configparser.NoOptionError): - cf.getboolean('Types', 'no-such-boolean') - eq(cf.getboolean('No Such Types', 'boolean', fallback=True), True) - if self.allow_no_value: - eq(cf.get('NoValue', 'option-without-value', fallback=False), None) - eq(cf.get('NoValue', 'no-such-option-without-value', - fallback=False), False) - - # mapping access - eq(cf['Foo Bar']['foo'], 'bar1') - eq(cf['Spacey Bar']['foo'], 'bar2') - section = cf['Spacey Bar From The Beginning'] - eq(section.name, 'Spacey Bar From The Beginning') - self.assertIs(section.parser, cf) - with self.assertRaises(AttributeError): - section.name = 'Name is read-only' - with self.assertRaises(AttributeError): - section.parser = 'Parser is read-only' - eq(section['foo'], 'bar3') - eq(section['baz'], 'qwe') - eq(cf['Commented Bar']['foo'], 'bar4') - eq(cf['Commented Bar']['baz'], 'qwe') - eq(cf['Spaces']['key with spaces'], 'value') - eq(cf['Spaces']['another with spaces'], 'splat!') - eq(cf['Long Line']['foo'], - 'this line is much, much longer than my editor\nlikes it.') - if self.allow_no_value: - eq(cf['NoValue']['option-without-value'], None) - # test vars= and fallback= - eq(cf['Foo Bar'].get('foo', 'baz'), 'bar1') - eq(cf['Foo Bar'].get('foo', fallback='baz'), 'bar1') - eq(cf['Foo Bar'].get('foo', vars={'foo': 'baz'}), 'baz') - with self.assertRaises(KeyError): - cf['No Such Foo Bar']['foo'] - with self.assertRaises(KeyError): - cf['Foo Bar']['no-such-foo'] - with self.assertRaises(KeyError): - cf['No Such Foo Bar'].get('foo', fallback='baz') - eq(cf['Foo Bar'].get('no-such-foo', 'baz'), 'baz') - eq(cf['Foo Bar'].get('no-such-foo', fallback='baz'), 'baz') - eq(cf['Foo Bar'].get('no-such-foo'), None) - eq(cf['Spacey Bar'].get('foo', None), 'bar2') - eq(cf['Spacey Bar'].get('foo', fallback=None), 'bar2') - with self.assertRaises(KeyError): - cf['No Such Spacey Bar'].get('foo', None) - eq(cf['Types'].getint('int', 18), 42) - 
eq(cf['Types'].getint('int', fallback=18), 42) - eq(cf['Types'].getint('no-such-int', 18), 18) - eq(cf['Types'].getint('no-such-int', fallback=18), 18) - eq(cf['Types'].getint('no-such-int', "18"), "18") # sic! - eq(cf['Types'].getint('no-such-int', fallback="18"), "18") # sic! - eq(cf['Types'].getint('no-such-int'), None) - self.assertAlmostEqual(cf['Types'].getfloat('float', 0.0), 0.44) - self.assertAlmostEqual(cf['Types'].getfloat('float', - fallback=0.0), 0.44) - self.assertAlmostEqual(cf['Types'].getfloat('no-such-float', 0.0), 0.0) - self.assertAlmostEqual(cf['Types'].getfloat('no-such-float', - fallback=0.0), 0.0) - eq(cf['Types'].getfloat('no-such-float', "0.0"), "0.0") # sic! - eq(cf['Types'].getfloat('no-such-float', fallback="0.0"), "0.0") # sic! - eq(cf['Types'].getfloat('no-such-float'), None) - eq(cf['Types'].getboolean('boolean', True), False) - eq(cf['Types'].getboolean('boolean', fallback=True), False) - eq(cf['Types'].getboolean('no-such-boolean', "yes"), "yes") # sic! - eq(cf['Types'].getboolean('no-such-boolean', fallback="yes"), - "yes") # sic! - eq(cf['Types'].getboolean('no-such-boolean', True), True) - eq(cf['Types'].getboolean('no-such-boolean', fallback=True), True) - eq(cf['Types'].getboolean('no-such-boolean'), None) - if self.allow_no_value: - eq(cf['NoValue'].get('option-without-value', False), None) - eq(cf['NoValue'].get('option-without-value', fallback=False), None) - eq(cf['NoValue'].get('no-such-option-without-value', False), False) - eq(cf['NoValue'].get('no-such-option-without-value', - fallback=False), False) - - # Make sure the right things happen for remove_section() and - # remove_option(); added to include check for SourceForge bug #123324. - - cf[self.default_section]['this_value'] = '1' - cf[self.default_section]['that_value'] = '2' - - # API access - self.assertTrue(cf.remove_section('Spaces')) - self.assertFalse(cf.has_option('Spaces', 'key with spaces')) - self.assertFalse(cf.remove_section('Spaces')) - self.assertFalse(cf.remove_section(self.default_section)) - self.assertTrue(cf.remove_option('Foo Bar', 'foo'), - "remove_option() failed to report existence of option") - self.assertFalse(cf.has_option('Foo Bar', 'foo'), - "remove_option() failed to remove option") - self.assertFalse(cf.remove_option('Foo Bar', 'foo'), - "remove_option() failed to report non-existence of option" - " that was removed") - self.assertTrue(cf.has_option('Foo Bar', 'this_value')) - self.assertFalse(cf.remove_option('Foo Bar', 'this_value')) - self.assertTrue(cf.remove_option(self.default_section, 'this_value')) - self.assertFalse(cf.has_option('Foo Bar', 'this_value')) - self.assertFalse(cf.remove_option(self.default_section, 'this_value')) - - with self.assertRaises(configparser.NoSectionError) as cm: - cf.remove_option('No Such Section', 'foo') - self.assertEqual(cm.exception.args, ('No Such Section',)) - - eq(cf.get('Long Line', 'foo'), - 'this line is much, much longer than my editor\nlikes it.') - - # mapping access - del cf['Types'] - self.assertFalse('Types' in cf) - with self.assertRaises(KeyError): - del cf['Types'] - with self.assertRaises(ValueError): - del cf[self.default_section] - del cf['Spacey Bar']['foo'] - self.assertFalse('foo' in cf['Spacey Bar']) - with self.assertRaises(KeyError): - del cf['Spacey Bar']['foo'] - self.assertTrue('that_value' in cf['Spacey Bar']) - with self.assertRaises(KeyError): - del cf['Spacey Bar']['that_value'] - del cf[self.default_section]['that_value'] - self.assertFalse('that_value' in cf['Spacey Bar']) - with 
self.assertRaises(KeyError): - del cf[self.default_section]['that_value'] - with self.assertRaises(KeyError): - del cf['No Such Section']['foo'] - - # Don't add new asserts below in this method as most of the options - # and sections are now removed. - - def test_basic(self): - config_string = """\ -[Foo Bar] -foo{0[0]}bar1 -[Spacey Bar] -foo {0[0]} bar2 -[Spacey Bar From The Beginning] - foo {0[0]} bar3 - baz {0[0]} qwe -[Commented Bar] -foo{0[1]} bar4 {1[1]} comment -baz{0[0]}qwe {1[0]}another one -[Long Line] -foo{0[1]} this line is much, much longer than my editor - likes it. -[Section\\with$weird%characters[\t] -[Internationalized Stuff] -foo[bg]{0[1]} Bulgarian -foo{0[0]}Default -foo[en]{0[0]}English -foo[de]{0[0]}Deutsch -[Spaces] -key with spaces {0[1]} value -another with spaces {0[0]} splat! -[Types] -int {0[1]} 42 -float {0[0]} 0.44 -boolean {0[0]} NO -123 {0[1]} strange but acceptable -""".format(self.delimiters, self.comment_prefixes) - if self.allow_no_value: - config_string += ( - "[NoValue]\n" - "option-without-value\n" - ) - cf = self.fromstring(config_string) - self.basic_test(cf) - if self.strict: - with self.assertRaises(configparser.DuplicateOptionError): - cf.read_string(textwrap.dedent("""\ - [Duplicate Options Here] - option {0[0]} with a value - option {0[1]} with another value - """.format(self.delimiters))) - with self.assertRaises(configparser.DuplicateSectionError): - cf.read_string(textwrap.dedent("""\ - [And Now For Something] - completely different {0[0]} True - [And Now For Something] - the larch {0[1]} 1 - """.format(self.delimiters))) - else: - cf.read_string(textwrap.dedent("""\ - [Duplicate Options Here] - option {0[0]} with a value - option {0[1]} with another value - """.format(self.delimiters))) - - cf.read_string(textwrap.dedent("""\ - [And Now For Something] - completely different {0[0]} True - [And Now For Something] - the larch {0[1]} 1 - """.format(self.delimiters))) - - def test_basic_from_dict(self): - config = { - "Foo Bar": { - "foo": "bar1", - }, - "Spacey Bar": { - "foo": "bar2", - }, - "Spacey Bar From The Beginning": { - "foo": "bar3", - "baz": "qwe", - }, - "Commented Bar": { - "foo": "bar4", - "baz": "qwe", - }, - "Long Line": { - "foo": "this line is much, much longer than my editor\nlikes " - "it.", - }, - "Section\\with$weird%characters[\t": { - }, - "Internationalized Stuff": { - "foo[bg]": "Bulgarian", - "foo": "Default", - "foo[en]": "English", - "foo[de]": "Deutsch", - }, - "Spaces": { - "key with spaces": "value", - "another with spaces": "splat!", - }, - "Types": { - "int": 42, - "float": 0.44, - "boolean": False, - 123: "strange but acceptable", - }, - } - if self.allow_no_value: - config.update({ - "NoValue": { - "option-without-value": None, - } - }) - cf = self.newconfig() - cf.read_dict(config) - self.basic_test(cf) - if self.strict: - with self.assertRaises(configparser.DuplicateSectionError): - cf.read_dict({ - '1': {'key': 'value'}, - 1: {'key2': 'value2'}, - }) - with self.assertRaises(configparser.DuplicateOptionError): - cf.read_dict({ - "Duplicate Options Here": { - 'option': 'with a value', - 'OPTION': 'with another value', - }, - }) - else: - cf.read_dict({ - 'section': {'key': 'value'}, - 'SECTION': {'key2': 'value2'}, - }) - cf.read_dict({ - "Duplicate Options Here": { - 'option': 'with a value', - 'OPTION': 'with another value', - }, - }) - - def test_case_sensitivity(self): - cf = self.newconfig() - cf.add_section("A") - cf.add_section("a") - cf.add_section("B") - L = cf.sections() - L.sort() - eq = 
self.assertEqual - eq(L, ["A", "B", "a"]) - cf.set("a", "B", "value") - eq(cf.options("a"), ["b"]) - eq(cf.get("a", "b"), "value", - "could not locate option, expecting case-insensitive option names") - with self.assertRaises(configparser.NoSectionError): - # section names are case-sensitive - cf.set("b", "A", "value") - self.assertTrue(cf.has_option("a", "b")) - self.assertFalse(cf.has_option("b", "b")) - cf.set("A", "A-B", "A-B value") - for opt in ("a-b", "A-b", "a-B", "A-B"): - self.assertTrue( - cf.has_option("A", opt), - "has_option() returned false for option which should exist") - eq(cf.options("A"), ["a-b"]) - eq(cf.options("a"), ["b"]) - cf.remove_option("a", "B") - eq(cf.options("a"), []) - - # SF bug #432369: - cf = self.fromstring( - "[MySection]\nOption{} first line \n\tsecond line \n".format( - self.delimiters[0])) - eq(cf.options("MySection"), ["option"]) - eq(cf.get("MySection", "Option"), "first line\nsecond line") - - # SF bug #561822: - cf = self.fromstring("[section]\n" - "nekey{}nevalue\n".format(self.delimiters[0]), - defaults={"key":"value"}) - self.assertTrue(cf.has_option("section", "Key")) - - - def test_case_sensitivity_mapping_access(self): - cf = self.newconfig() - cf["A"] = {} - cf["a"] = {"B": "value"} - cf["B"] = {} - L = [section for section in cf] - L.sort() - eq = self.assertEqual - elem_eq = self.assertCountEqual - eq(L, sorted(["A", "B", self.default_section, "a"])) - eq(cf["a"].keys(), {"b"}) - eq(cf["a"]["b"], "value", - "could not locate option, expecting case-insensitive option names") - with self.assertRaises(KeyError): - # section names are case-sensitive - cf["b"]["A"] = "value" - self.assertTrue("b" in cf["a"]) - cf["A"]["A-B"] = "A-B value" - for opt in ("a-b", "A-b", "a-B", "A-B"): - self.assertTrue( - opt in cf["A"], - "has_option() returned false for option which should exist") - eq(cf["A"].keys(), {"a-b"}) - eq(cf["a"].keys(), {"b"}) - del cf["a"]["B"] - elem_eq(cf["a"].keys(), {}) - - # SF bug #432369: - cf = self.fromstring( - "[MySection]\nOption{} first line \n\tsecond line \n".format( - self.delimiters[0])) - eq(cf["MySection"].keys(), {"option"}) - eq(cf["MySection"]["Option"], "first line\nsecond line") - - # SF bug #561822: - cf = self.fromstring("[section]\n" - "nekey{}nevalue\n".format(self.delimiters[0]), - defaults={"key":"value"}) - self.assertTrue("Key" in cf["section"]) - - def test_default_case_sensitivity(self): - cf = self.newconfig({"foo": "Bar"}) - self.assertEqual( - cf.get(self.default_section, "Foo"), "Bar", - "could not locate option, expecting case-insensitive option names") - cf = self.newconfig({"Foo": "Bar"}) - self.assertEqual( - cf.get(self.default_section, "Foo"), "Bar", - "could not locate option, expecting case-insensitive defaults") - - def test_parse_errors(self): - cf = self.newconfig() - self.parse_error(cf, configparser.ParsingError, - "[Foo]\n" - "{}val-without-opt-name\n".format(self.delimiters[0])) - self.parse_error(cf, configparser.ParsingError, - "[Foo]\n" - "{}val-without-opt-name\n".format(self.delimiters[1])) - e = self.parse_error(cf, configparser.MissingSectionHeaderError, - "No Section!\n") - self.assertEqual(e.args, ('', 1, "No Section!\n")) - if not self.allow_no_value: - e = self.parse_error(cf, configparser.ParsingError, - "[Foo]\n wrong-indent\n") - self.assertEqual(e.args, ('',)) - # read_file on a real file - tricky = support.findfile("cfgparser.3") - if self.delimiters[0] == '=': - error = configparser.ParsingError - expected = (tricky,) - else: - error = 
configparser.MissingSectionHeaderError - expected = (tricky, 1, - ' # INI with as many tricky parts as possible\n') - with open(tricky, encoding='utf-8') as f: - e = self.parse_error(cf, error, f) - self.assertEqual(e.args, expected) - - def parse_error(self, cf, exc, src): - if hasattr(src, 'readline'): - sio = src - else: - sio = io.StringIO(src) - with self.assertRaises(exc) as cm: - cf.read_file(sio) - return cm.exception - - def test_query_errors(self): - cf = self.newconfig() - self.assertEqual(cf.sections(), [], - "new ConfigParser should have no defined sections") - self.assertFalse(cf.has_section("Foo"), - "new ConfigParser should have no acknowledged " - "sections") - with self.assertRaises(configparser.NoSectionError): - cf.options("Foo") - with self.assertRaises(configparser.NoSectionError): - cf.set("foo", "bar", "value") - e = self.get_error(cf, configparser.NoSectionError, "foo", "bar") - self.assertEqual(e.args, ("foo",)) - cf.add_section("foo") - e = self.get_error(cf, configparser.NoOptionError, "foo", "bar") - self.assertEqual(e.args, ("bar", "foo")) - - def get_error(self, cf, exc, section, option): - try: - cf.get(section, option) - except exc as e: - return e - else: - self.fail("expected exception type %s.%s" - % (exc.__module__, exc.__name__)) - - def test_boolean(self): - cf = self.fromstring( - "[BOOLTEST]\n" - "T1{equals}1\n" - "T2{equals}TRUE\n" - "T3{equals}True\n" - "T4{equals}oN\n" - "T5{equals}yes\n" - "F1{equals}0\n" - "F2{equals}FALSE\n" - "F3{equals}False\n" - "F4{equals}oFF\n" - "F5{equals}nO\n" - "E1{equals}2\n" - "E2{equals}foo\n" - "E3{equals}-1\n" - "E4{equals}0.1\n" - "E5{equals}FALSE AND MORE".format(equals=self.delimiters[0]) - ) - for x in range(1, 5): - self.assertTrue(cf.getboolean('BOOLTEST', 't%d' % x)) - self.assertFalse(cf.getboolean('BOOLTEST', 'f%d' % x)) - self.assertRaises(ValueError, - cf.getboolean, 'BOOLTEST', 'e%d' % x) - - def test_weird_errors(self): - cf = self.newconfig() - cf.add_section("Foo") - with self.assertRaises(configparser.DuplicateSectionError) as cm: - cf.add_section("Foo") - e = cm.exception - self.assertEqual(str(e), "Section 'Foo' already exists") - self.assertEqual(e.args, ("Foo", None, None)) - - if self.strict: - with self.assertRaises(configparser.DuplicateSectionError) as cm: - cf.read_string(textwrap.dedent("""\ - [Foo] - will this be added{equals}True - [Bar] - what about this{equals}True - [Foo] - oops{equals}this won't - """.format(equals=self.delimiters[0])), source='') - e = cm.exception - self.assertEqual(str(e), "While reading from [line 5]: " - "section 'Foo' already exists") - self.assertEqual(e.args, ("Foo", '', 5)) - - with self.assertRaises(configparser.DuplicateOptionError) as cm: - cf.read_dict({'Bar': {'opt': 'val', 'OPT': 'is really `opt`'}}) - e = cm.exception - self.assertEqual(str(e), "While reading from : option 'opt' " - "in section 'Bar' already exists") - self.assertEqual(e.args, ("Bar", "opt", "", None)) - - def test_write(self): - config_string = ( - "[Long Line]\n" - "foo{0[0]} this line is much, much longer than my editor\n" - " likes it.\n" - "[{default_section}]\n" - "foo{0[1]} another very\n" - " long line\n" - "[Long Line - With Comments!]\n" - "test {0[1]} we {comment} can\n" - " also {comment} place\n" - " comments {comment} in\n" - " multiline {comment} values" - "\n".format(self.delimiters, comment=self.comment_prefixes[0], - default_section=self.default_section) - ) - if self.allow_no_value: - config_string += ( - "[Valueless]\n" - "option-without-value\n" - ) - - cf = 
self.fromstring(config_string) - for space_around_delimiters in (True, False): - output = io.StringIO() - cf.write(output, space_around_delimiters=space_around_delimiters) - delimiter = self.delimiters[0] - if space_around_delimiters: - delimiter = " {} ".format(delimiter) - expect_string = ( - "[{default_section}]\n" - "foo{equals}another very\n" - "\tlong line\n" - "\n" - "[Long Line]\n" - "foo{equals}this line is much, much longer than my editor\n" - "\tlikes it.\n" - "\n" - "[Long Line - With Comments!]\n" - "test{equals}we\n" - "\talso\n" - "\tcomments\n" - "\tmultiline\n" - "\n".format(equals=delimiter, - default_section=self.default_section) - ) - if self.allow_no_value: - expect_string += ( - "[Valueless]\n" - "option-without-value\n" - "\n" - ) - self.assertEqual(output.getvalue(), expect_string) - - def test_set_string_types(self): - cf = self.fromstring("[sect]\n" - "option1{eq}foo\n".format(eq=self.delimiters[0])) - # Check that we don't get an exception when setting values in - # an existing section using strings: - class mystr(str): - pass - cf.set("sect", "option1", "splat") - cf.set("sect", "option1", mystr("splat")) - cf.set("sect", "option2", "splat") - cf.set("sect", "option2", mystr("splat")) - cf.set("sect", "option1", "splat") - cf.set("sect", "option2", "splat") - - def test_read_returns_file_list(self): - if self.delimiters[0] != '=': - # skip reading the file if we're using an incompatible format - return - file1 = support.findfile("cfgparser.1") - # check when we pass a mix of readable and non-readable files: - cf = self.newconfig() - parsed_files = cf.read([file1, "nonexistent-file"]) - self.assertEqual(parsed_files, [file1]) - self.assertEqual(cf.get("Foo Bar", "foo"), "newbar") - # check when we pass only a filename: - cf = self.newconfig() - parsed_files = cf.read(file1) - self.assertEqual(parsed_files, [file1]) - self.assertEqual(cf.get("Foo Bar", "foo"), "newbar") - # check when we pass only missing files: - cf = self.newconfig() - parsed_files = cf.read(["nonexistent-file"]) - self.assertEqual(parsed_files, []) - # check when we pass no files: - cf = self.newconfig() - parsed_files = cf.read([]) - self.assertEqual(parsed_files, []) - - # shared by subclasses - def get_interpolation_config(self): - return self.fromstring( - "[Foo]\n" - "bar{equals}something %(with1)s interpolation (1 step)\n" - "bar9{equals}something %(with9)s lots of interpolation (9 steps)\n" - "bar10{equals}something %(with10)s lots of interpolation (10 steps)\n" - "bar11{equals}something %(with11)s lots of interpolation (11 steps)\n" - "with11{equals}%(with10)s\n" - "with10{equals}%(with9)s\n" - "with9{equals}%(with8)s\n" - "with8{equals}%(With7)s\n" - "with7{equals}%(WITH6)s\n" - "with6{equals}%(with5)s\n" - "With5{equals}%(with4)s\n" - "WITH4{equals}%(with3)s\n" - "with3{equals}%(with2)s\n" - "with2{equals}%(with1)s\n" - "with1{equals}with\n" - "\n" - "[Mutual Recursion]\n" - "foo{equals}%(bar)s\n" - "bar{equals}%(foo)s\n" - "\n" - "[Interpolation Error]\n" - # no definition for 'reference' - "name{equals}%(reference)s\n".format(equals=self.delimiters[0])) - - def check_items_config(self, expected): - cf = self.fromstring(""" - [section] - name {0[0]} %(value)s - key{0[1]} |%(name)s| - getdefault{0[1]} |%(default)s| - """.format(self.delimiters), defaults={"default": ""}) - L = list(cf.items("section", vars={'value': 'value'})) - L.sort() - self.assertEqual(L, expected) - with self.assertRaises(configparser.NoSectionError): - cf.items("no such section") - - -class 
StrictTestCase(BasicTestCase): - config_class = configparser.RawConfigParser - strict = True - - -class ConfigParserTestCase(BasicTestCase): - config_class = configparser.ConfigParser - - def test_interpolation(self): - cf = self.get_interpolation_config() - eq = self.assertEqual - eq(cf.get("Foo", "bar"), "something with interpolation (1 step)") - eq(cf.get("Foo", "bar9"), - "something with lots of interpolation (9 steps)") - eq(cf.get("Foo", "bar10"), - "something with lots of interpolation (10 steps)") - e = self.get_error(cf, configparser.InterpolationDepthError, "Foo", "bar11") - if self.interpolation == configparser._UNSET: - self.assertEqual(e.args, ("bar11", "Foo", "%(with1)s")) - elif isinstance(self.interpolation, configparser.LegacyInterpolation): - self.assertEqual(e.args, ("bar11", "Foo", - "something %(with11)s lots of interpolation (11 steps)")) - - def test_interpolation_missing_value(self): - cf = self.get_interpolation_config() - e = self.get_error(cf, configparser.InterpolationMissingOptionError, - "Interpolation Error", "name") - self.assertEqual(e.reference, "reference") - self.assertEqual(e.section, "Interpolation Error") - self.assertEqual(e.option, "name") - if self.interpolation == configparser._UNSET: - self.assertEqual(e.args, ('name', 'Interpolation Error', - '', 'reference')) - elif isinstance(self.interpolation, configparser.LegacyInterpolation): - self.assertEqual(e.args, ('name', 'Interpolation Error', - '%(reference)s', 'reference')) - - def test_items(self): - self.check_items_config([('default', ''), - ('getdefault', '||'), - ('key', '|value|'), - ('name', 'value'), - ('value', 'value')]) - - def test_safe_interpolation(self): - # See http://www.python.org/sf/511737 - cf = self.fromstring("[section]\n" - "option1{eq}xxx\n" - "option2{eq}%(option1)s/xxx\n" - "ok{eq}%(option1)s/%%s\n" - "not_ok{eq}%(option2)s/%%s".format( - eq=self.delimiters[0])) - self.assertEqual(cf.get("section", "ok"), "xxx/%s") - if self.interpolation == configparser._UNSET: - self.assertEqual(cf.get("section", "not_ok"), "xxx/xxx/%s") - elif isinstance(self.interpolation, configparser.LegacyInterpolation): - with self.assertRaises(TypeError): - cf.get("section", "not_ok") - - def test_set_malformatted_interpolation(self): - cf = self.fromstring("[sect]\n" - "option1{eq}foo\n".format(eq=self.delimiters[0])) - - self.assertEqual(cf.get('sect', "option1"), "foo") - - self.assertRaises(ValueError, cf.set, "sect", "option1", "%foo") - self.assertRaises(ValueError, cf.set, "sect", "option1", "foo%") - self.assertRaises(ValueError, cf.set, "sect", "option1", "f%oo") - - self.assertEqual(cf.get('sect', "option1"), "foo") - - # bug #5741: double percents are *not* malformed - cf.set("sect", "option2", "foo%%bar") - self.assertEqual(cf.get("sect", "option2"), "foo%bar") - - def test_set_nonstring_types(self): - cf = self.fromstring("[sect]\n" - "option1{eq}foo\n".format(eq=self.delimiters[0])) - # Check that we get a TypeError when setting non-string values - # in an existing section: - self.assertRaises(TypeError, cf.set, "sect", "option1", 1) - self.assertRaises(TypeError, cf.set, "sect", "option1", 1.0) - self.assertRaises(TypeError, cf.set, "sect", "option1", object()) - self.assertRaises(TypeError, cf.set, "sect", "option2", 1) - self.assertRaises(TypeError, cf.set, "sect", "option2", 1.0) - self.assertRaises(TypeError, cf.set, "sect", "option2", object()) - self.assertRaises(TypeError, cf.set, "sect", 123, "invalid opt name!") - self.assertRaises(TypeError, cf.add_section, 123) - - def 
test_add_section_default(self): - cf = self.newconfig() - self.assertRaises(ValueError, cf.add_section, self.default_section) - -class ConfigParserTestCaseLegacyInterpolation(ConfigParserTestCase): - config_class = configparser.ConfigParser - interpolation = configparser.LegacyInterpolation() - - def test_set_malformatted_interpolation(self): - cf = self.fromstring("[sect]\n" - "option1{eq}foo\n".format(eq=self.delimiters[0])) - - self.assertEqual(cf.get('sect', "option1"), "foo") - - cf.set("sect", "option1", "%foo") - self.assertEqual(cf.get('sect', "option1"), "%foo") - cf.set("sect", "option1", "foo%") - self.assertEqual(cf.get('sect', "option1"), "foo%") - cf.set("sect", "option1", "f%oo") - self.assertEqual(cf.get('sect', "option1"), "f%oo") - - # bug #5741: double percents are *not* malformed - cf.set("sect", "option2", "foo%%bar") - self.assertEqual(cf.get("sect", "option2"), "foo%%bar") - -class ConfigParserTestCaseNonStandardDelimiters(ConfigParserTestCase): - delimiters = (':=', '$') - comment_prefixes = ('//', '"') - inline_comment_prefixes = ('//', '"') - -class ConfigParserTestCaseNonStandardDefaultSection(ConfigParserTestCase): - default_section = 'general' - -class MultilineValuesTestCase(BasicTestCase): - config_class = configparser.ConfigParser - wonderful_spam = ("I'm having spam spam spam spam " - "spam spam spam beaked beans spam " - "spam spam and spam!").replace(' ', '\t\n') - - def setUp(self): - cf = self.newconfig() - for i in range(100): - s = 'section{}'.format(i) - cf.add_section(s) - for j in range(10): - cf.set(s, 'lovely_spam{}'.format(j), self.wonderful_spam) - with open(support.TESTFN, 'w') as f: - cf.write(f) - - def tearDown(self): - os.unlink(support.TESTFN) - - def test_dominating_multiline_values(self): - # We're reading from file because this is where the code changed - # during performance updates in Python 3.2 - cf_from_file = self.newconfig() - with open(support.TESTFN) as f: - cf_from_file.read_file(f) - self.assertEqual(cf_from_file.get('section8', 'lovely_spam4'), - self.wonderful_spam.replace('\t\n', '\n')) - -class RawConfigParserTestCase(BasicTestCase): - config_class = configparser.RawConfigParser - - def test_interpolation(self): - cf = self.get_interpolation_config() - eq = self.assertEqual - eq(cf.get("Foo", "bar"), - "something %(with1)s interpolation (1 step)") - eq(cf.get("Foo", "bar9"), - "something %(with9)s lots of interpolation (9 steps)") - eq(cf.get("Foo", "bar10"), - "something %(with10)s lots of interpolation (10 steps)") - eq(cf.get("Foo", "bar11"), - "something %(with11)s lots of interpolation (11 steps)") - - def test_items(self): - self.check_items_config([('default', ''), - ('getdefault', '|%(default)s|'), - ('key', '|%(name)s|'), - ('name', '%(value)s'), - ('value', 'value')]) - - def test_set_nonstring_types(self): - cf = self.newconfig() - cf.add_section('non-string') - cf.set('non-string', 'int', 1) - cf.set('non-string', 'list', [0, 1, 1, 2, 3, 5, 8, 13]) - cf.set('non-string', 'dict', {'pi': 3.14159}) - self.assertEqual(cf.get('non-string', 'int'), 1) - self.assertEqual(cf.get('non-string', 'list'), - [0, 1, 1, 2, 3, 5, 8, 13]) - self.assertEqual(cf.get('non-string', 'dict'), {'pi': 3.14159}) - cf.add_section(123) - cf.set(123, 'this is sick', True) - self.assertEqual(cf.get(123, 'this is sick'), True) - if cf._dict.__class__ is configparser._default_dict: - # would not work for SortedDict; only checking for the most common - # default dictionary (OrderedDict) - cf.optionxform = lambda x: x - cf.set('non-string', 1, 
1) - self.assertEqual(cf.get('non-string', 1), 1) - -class RawConfigParserTestCaseNonStandardDelimiters(RawConfigParserTestCase): - delimiters = (':=', '$') - comment_prefixes = ('//', '"') - inline_comment_prefixes = ('//', '"') - -class RawConfigParserTestSambaConf(CfgParserTestCaseClass): - config_class = configparser.RawConfigParser - comment_prefixes = ('#', ';', '----') - inline_comment_prefixes = ('//',) - empty_lines_in_values = False - - def test_reading(self): - smbconf = support.findfile("cfgparser.2") - # check when we pass a mix of readable and non-readable files: - cf = self.newconfig() - parsed_files = cf.read([smbconf, "nonexistent-file"], encoding='utf-8') - self.assertEqual(parsed_files, [smbconf]) - sections = ['global', 'homes', 'printers', - 'print$', 'pdf-generator', 'tmp', 'Agustin'] - self.assertEqual(cf.sections(), sections) - self.assertEqual(cf.get("global", "workgroup"), "MDKGROUP") - self.assertEqual(cf.getint("global", "max log size"), 50) - self.assertEqual(cf.get("global", "hosts allow"), "127.") - self.assertEqual(cf.get("tmp", "echo command"), "cat %s; rm %s") - -class ConfigParserTestCaseExtendedInterpolation(BasicTestCase): - config_class = configparser.ConfigParser - interpolation = configparser.ExtendedInterpolation() - default_section = 'common' - - def test_extended_interpolation(self): - cf = self.fromstring(textwrap.dedent(""" - [common] - favourite Beatle = Paul - favourite color = green - - [tom] - favourite band = ${favourite color} day - favourite pope = John ${favourite Beatle} II - sequel = ${favourite pope}I - - [ambv] - favourite Beatle = George - son of Edward VII = ${favourite Beatle} V - son of George V = ${son of Edward VII}I - - [stanley] - favourite Beatle = ${ambv:favourite Beatle} - favourite pope = ${tom:favourite pope} - favourite color = black - favourite state of mind = paranoid - favourite movie = soylent ${common:favourite color} - favourite song = ${favourite color} sabbath - ${favourite state of mind} - """).strip()) - - eq = self.assertEqual - eq(cf['common']['favourite Beatle'], 'Paul') - eq(cf['common']['favourite color'], 'green') - eq(cf['tom']['favourite Beatle'], 'Paul') - eq(cf['tom']['favourite color'], 'green') - eq(cf['tom']['favourite band'], 'green day') - eq(cf['tom']['favourite pope'], 'John Paul II') - eq(cf['tom']['sequel'], 'John Paul III') - eq(cf['ambv']['favourite Beatle'], 'George') - eq(cf['ambv']['favourite color'], 'green') - eq(cf['ambv']['son of Edward VII'], 'George V') - eq(cf['ambv']['son of George V'], 'George VI') - eq(cf['stanley']['favourite Beatle'], 'George') - eq(cf['stanley']['favourite color'], 'black') - eq(cf['stanley']['favourite state of mind'], 'paranoid') - eq(cf['stanley']['favourite movie'], 'soylent green') - eq(cf['stanley']['favourite pope'], 'John Paul II') - eq(cf['stanley']['favourite song'], - 'black sabbath - paranoid') - - def test_endless_loop(self): - cf = self.fromstring(textwrap.dedent(""" - [one for you] - ping = ${one for me:pong} - - [one for me] - pong = ${one for you:ping} - - [selfish] - me = ${me} - """).strip()) - - with self.assertRaises(configparser.InterpolationDepthError): - cf['one for you']['ping'] - with self.assertRaises(configparser.InterpolationDepthError): - cf['selfish']['me'] - - def test_strange_options(self): - cf = self.fromstring(""" - [dollars] - $var = $$value - $var2 = ${$var} - ${sick} = cannot interpolate me - - [interpolated] - $other = ${dollars:$var} - $trying = ${dollars:${sick}} - """) - - self.assertEqual(cf['dollars']['$var'], 
'$value') - self.assertEqual(cf['interpolated']['$other'], '$value') - self.assertEqual(cf['dollars']['${sick}'], 'cannot interpolate me') - exception_class = configparser.InterpolationMissingOptionError - with self.assertRaises(exception_class) as cm: - cf['interpolated']['$trying'] - self.assertEqual(cm.exception.reference, 'dollars:${sick') - self.assertEqual(cm.exception.args[2], '}') #rawval - - - def test_other_errors(self): - cf = self.fromstring(""" - [interpolation fail] - case1 = ${where's the brace - case2 = ${does_not_exist} - case3 = ${wrong_section:wrong_value} - case4 = ${i:like:colon:characters} - case5 = $100 for Fail No 5! - """) - - with self.assertRaises(configparser.InterpolationSyntaxError): - cf['interpolation fail']['case1'] - with self.assertRaises(configparser.InterpolationMissingOptionError): - cf['interpolation fail']['case2'] - with self.assertRaises(configparser.InterpolationMissingOptionError): - cf['interpolation fail']['case3'] - with self.assertRaises(configparser.InterpolationSyntaxError): - cf['interpolation fail']['case4'] - with self.assertRaises(configparser.InterpolationSyntaxError): - cf['interpolation fail']['case5'] - with self.assertRaises(ValueError): - cf['interpolation fail']['case6'] = "BLACK $ABBATH" - - -class ConfigParserTestCaseNoValue(ConfigParserTestCase): - allow_no_value = True - -class ConfigParserTestCaseTrickyFile(CfgParserTestCaseClass): - config_class = configparser.ConfigParser - delimiters = {'='} - comment_prefixes = {'#'} - allow_no_value = True - - def test_cfgparser_dot_3(self): - tricky = support.findfile("cfgparser.3") - cf = self.newconfig() - self.assertEqual(len(cf.read(tricky, encoding='utf-8')), 1) - self.assertEqual(cf.sections(), ['strange', - 'corruption', - 'yeah, sections can be ' - 'indented as well', - 'another one!', - 'no values here', - 'tricky interpolation', - 'more interpolation']) - self.assertEqual(cf.getint(self.default_section, 'go', - vars={'interpolate': '-1'}), -1) - with self.assertRaises(ValueError): - # no interpolation will happen - cf.getint(self.default_section, 'go', raw=True, - vars={'interpolate': '-1'}) - self.assertEqual(len(cf.get('strange', 'other').split('\n')), 4) - self.assertEqual(len(cf.get('corruption', 'value').split('\n')), 10) - longname = 'yeah, sections can be indented as well' - self.assertFalse(cf.getboolean(longname, 'are they subsections')) - self.assertEqual(cf.get(longname, 'lets use some Unicode'), '???') - self.assertEqual(len(cf.items('another one!')), 5) # 4 in section and - # `go` from DEFAULT - with self.assertRaises(configparser.InterpolationMissingOptionError): - cf.items('no values here') - self.assertEqual(cf.get('tricky interpolation', 'lets'), 'do this') - self.assertEqual(cf.get('tricky interpolation', 'lets'), - cf.get('tricky interpolation', 'go')) - self.assertEqual(cf.get('more interpolation', 'lets'), 'go shopping') - - def test_unicode_failure(self): - tricky = support.findfile("cfgparser.3") - cf = self.newconfig() - with self.assertRaises(UnicodeDecodeError): - cf.read(tricky, encoding='ascii') - - -class Issue7005TestCase(unittest.TestCase): - """Test output when None is set() as a value and allow_no_value == False. - - http://bugs.python.org/issue7005 - - """ - - expected_output = "[section]\noption = None\n\n" - - def prepare(self, config_class): - # This is the default, but that's the point. 
- cp = config_class(allow_no_value=False) - cp.add_section("section") - cp.set("section", "option", None) - sio = io.StringIO() - cp.write(sio) - return sio.getvalue() - - def test_none_as_value_stringified(self): - cp = configparser.ConfigParser(allow_no_value=False) - cp.add_section("section") - with self.assertRaises(TypeError): - cp.set("section", "option", None) - - def test_none_as_value_stringified_raw(self): - output = self.prepare(configparser.RawConfigParser) - self.assertEqual(output, self.expected_output) - - -class SortedTestCase(RawConfigParserTestCase): - dict_type = SortedDict - - def test_sorted(self): - cf = self.fromstring("[b]\n" - "o4=1\n" - "o3=2\n" - "o2=3\n" - "o1=4\n" - "[a]\n" - "k=v\n") - output = io.StringIO() - cf.write(output) - self.assertEqual(output.getvalue(), - "[a]\n" - "k = v\n\n" - "[b]\n" - "o1 = 4\n" - "o2 = 3\n" - "o3 = 2\n" - "o4 = 1\n\n") - - -class CompatibleTestCase(CfgParserTestCaseClass): - config_class = configparser.RawConfigParser - comment_prefixes = '#;' - inline_comment_prefixes = ';' - - def test_comment_handling(self): - config_string = textwrap.dedent("""\ - [Commented Bar] - baz=qwe ; a comment - foo: bar # not a comment! - # but this is a comment - ; another comment - quirk: this;is not a comment - ; a space must precede an inline comment - """) - cf = self.fromstring(config_string) - self.assertEqual(cf.get('Commented Bar', 'foo'), - 'bar # not a comment!') - self.assertEqual(cf.get('Commented Bar', 'baz'), 'qwe') - self.assertEqual(cf.get('Commented Bar', 'quirk'), - 'this;is not a comment') - -class CopyTestCase(BasicTestCase): - config_class = configparser.ConfigParser - - def fromstring(self, string, defaults=None): - cf = self.newconfig(defaults) - cf.read_string(string) - cf_copy = self.newconfig() - cf_copy.read_dict(cf) - # we have to clean up option duplicates that appeared because of - # the magic DEFAULTSECT behaviour. - for section in cf_copy.values(): - if section.name == self.default_section: - continue - for default, value in cf[self.default_section].items(): - if section[default] == value: - del section[default] - return cf_copy - -class CoverageOneHundredTestCase(unittest.TestCase): - """Covers edge cases in the codebase.""" - - def test_duplicate_option_error(self): - error = configparser.DuplicateOptionError('section', 'option') - self.assertEqual(error.section, 'section') - self.assertEqual(error.option, 'option') - self.assertEqual(error.source, None) - self.assertEqual(error.lineno, None) - self.assertEqual(error.args, ('section', 'option', None, None)) - self.assertEqual(str(error), "Option 'option' in section 'section' " - "already exists") - - def test_interpolation_depth_error(self): - error = configparser.InterpolationDepthError('option', 'section', - 'rawval') - self.assertEqual(error.args, ('option', 'section', 'rawval')) - self.assertEqual(error.option, 'option') - self.assertEqual(error.section, 'section') - - def test_parsing_error(self): - with self.assertRaises(ValueError) as cm: - configparser.ParsingError() - self.assertEqual(str(cm.exception), "Required argument `source' not " - "given.") - with self.assertRaises(ValueError) as cm: - configparser.ParsingError(source='source', filename='filename') - self.assertEqual(str(cm.exception), "Cannot specify both `filename' " - "and `source'. 
Use `source'.") - error = configparser.ParsingError(filename='source') - self.assertEqual(error.source, 'source') - with warnings.catch_warnings(record=True) as w: - warnings.simplefilter("always", DeprecationWarning) - self.assertEqual(error.filename, 'source') - error.filename = 'filename' - self.assertEqual(error.source, 'filename') - for warning in w: - self.assertTrue(warning.category is DeprecationWarning) - - def test_interpolation_validation(self): - parser = configparser.ConfigParser() - parser.read_string(""" - [section] - invalid_percent = % - invalid_reference = %(() - invalid_variable = %(does_not_exist)s - """) - with self.assertRaises(configparser.InterpolationSyntaxError) as cm: - parser['section']['invalid_percent'] - self.assertEqual(str(cm.exception), "'%' must be followed by '%' or " - "'(', found: '%'") - with self.assertRaises(configparser.InterpolationSyntaxError) as cm: - parser['section']['invalid_reference'] - self.assertEqual(str(cm.exception), "bad interpolation variable " - "reference '%(()'") - - def test_readfp_deprecation(self): - sio = io.StringIO(""" - [section] - option = value - """) - parser = configparser.ConfigParser() - with warnings.catch_warnings(record=True) as w: - warnings.simplefilter("always", DeprecationWarning) - parser.readfp(sio, filename='StringIO') - for warning in w: - self.assertTrue(warning.category is DeprecationWarning) - self.assertEqual(len(parser), 2) - self.assertEqual(parser['section']['option'], 'value') - - def test_safeconfigparser_deprecation(self): - with warnings.catch_warnings(record=True) as w: - warnings.simplefilter("always", DeprecationWarning) - parser = configparser.SafeConfigParser() - for warning in w: - self.assertTrue(warning.category is DeprecationWarning) - - def test_sectionproxy_repr(self): - parser = configparser.ConfigParser() - parser.read_string(""" - [section] - key = value - """) - self.assertEqual(repr(parser['section']), '') - -def test_main(): - support.run_unittest( - ConfigParserTestCase, - ConfigParserTestCaseNonStandardDelimiters, - ConfigParserTestCaseNoValue, - ConfigParserTestCaseExtendedInterpolation, - ConfigParserTestCaseLegacyInterpolation, - ConfigParserTestCaseTrickyFile, - MultilineValuesTestCase, - RawConfigParserTestCase, - RawConfigParserTestCaseNonStandardDelimiters, - RawConfigParserTestSambaConf, - SortedTestCase, - Issue7005TestCase, - StrictTestCase, - CompatibleTestCase, - CopyTestCase, - ConfigParserTestCaseNonStandardDefaultSection, - CoverageOneHundredTestCase, - ) From python-checkins at python.org Tue Feb 22 02:55:36 2011 From: python-checkins at python.org (raymond.hettinger) Date: Tue, 22 Feb 2011 02:55:36 +0100 (CET) Subject: [Python-checkins] r88492 - in python/branches/py3k/Lib: collections/__init__.py string.py Message-ID: <20110222015536.DA88AEE985@mail.python.org> Author: raymond.hettinger Date: Tue Feb 22 02:55:36 2011 New Revision: 88492 Log: Factor-out common code for helper classes. 
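The diff below replaces string._multimap with the shared collections._ChainMap
helper. As a rough sketch of the behaviour both classes provide -- the first
mapping that contains a key wins and later mappings act as fallbacks -- the
snippet below may help; the class name _DemoChainMap and its implementation
are illustrative only, not the stdlib code.

    class _DemoChainMap:
        def __init__(self, *maps):
            self.maps = maps          # earlier mappings take priority

        def __getitem__(self, key):
            for mapping in self.maps:
                try:
                    return mapping[key]
                except KeyError:
                    pass
            raise KeyError(key)

    keywords = {'who': 'world'}
    positional = {'who': 'ignored', 'greeting': 'Hello'}
    combined = _DemoChainMap(keywords, positional)
    assert combined['who'] == 'world'        # first mapping wins
    assert combined['greeting'] == 'Hello'   # falls back to the second mapping

This mirrors how Template.substitute(mapping, **kws) resolves placeholders in
the code being changed: keyword arguments are consulted first, then the
positional mapping.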
Modified: python/branches/py3k/Lib/collections/__init__.py python/branches/py3k/Lib/string.py Modified: python/branches/py3k/Lib/collections/__init__.py ============================================================================== --- python/branches/py3k/Lib/collections/__init__.py (original) +++ python/branches/py3k/Lib/collections/__init__.py Tue Feb 22 02:55:36 2011 @@ -633,7 +633,7 @@ ######################################################################## -### ChainMap (helper for configparser) +### ChainMap (helper for configparser and string.Template) ######################################################################## class _ChainMap(MutableMapping): Modified: python/branches/py3k/Lib/string.py ============================================================================== --- python/branches/py3k/Lib/string.py (original) +++ python/branches/py3k/Lib/string.py Tue Feb 22 02:55:36 2011 @@ -46,23 +46,7 @@ #################################################################### import re as _re - -class _multimap: - """Helper class for combining multiple mappings. - - Used by .{safe_,}substitute() to combine the mapping and keyword - arguments. - """ - def __init__(self, primary, secondary): - self._primary = primary - self._secondary = secondary - - def __getitem__(self, key): - try: - return self._primary[key] - except KeyError: - return self._secondary[key] - +from collections import _ChainMap class _TemplateMetaclass(type): pattern = r""" @@ -116,7 +100,7 @@ if not args: mapping = kws elif kws: - mapping = _multimap(kws, args[0]) + mapping = _ChainMap(kws, args[0]) else: mapping = args[0] # Helper function for .sub() @@ -142,7 +126,7 @@ if not args: mapping = kws elif kws: - mapping = _multimap(kws, args[0]) + mapping = _ChainMap(kws, args[0]) else: mapping = args[0] # Helper function for .sub() From python-checkins at python.org Tue Feb 22 03:42:41 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 03:42:41 +0100 (CET) Subject: [Python-checkins] r88493 - python/branches/py3k/Lib/collections Message-ID: <20110222024241.78768ECD1@mail.python.org> Author: brett.cannon Date: Tue Feb 22 03:42:41 2011 New Revision: 88493 Log: Ignore __pycache__ in Lib/collections. Modified: python/branches/py3k/Lib/collections/ (props changed) From python-checkins at python.org Tue Feb 22 04:04:07 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:04:07 +0100 (CET) Subject: [Python-checkins] r88494 - in python/branches/py3k: Lib/ctypes/test/test_memfunctions.py Lib/ctypes/test/test_python_api.py Lib/ctypes/test/test_refcounts.py Lib/ctypes/test/test_stringptr.py Lib/test/support.py Lib/test/test_descr.py Lib/test/test_gc.py Lib/test/test_genexps.py Lib/test/test_metaclass.py Lib/test/test_pydoc.py Lib/test/test_sys.py Lib/test/test_trace.py Misc/NEWS Message-ID: <20110222030407.45EA1FB4F@mail.python.org> Author: brett.cannon Date: Tue Feb 22 04:04:06 2011 New Revision: 88494 Log: Issue #10992: make tests pass when run under coverage. Various tests fail when run under coverage. A primary culprit is refcount tests which fail as the counts are thrown off by the coverage code. A new decorator -- test.support.refcount_test -- is used to decorate tests which test refcounts and to skip them when running under coverage. Other tests simply fail because of changes in the system (e.g., __local__ suddenly appearing). Thanks to Kristian Vlaardingerbroek for helping to diagnose the test failures. 
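Before the file-by-file diff, a hedged sketch of the guard pattern this change
applies throughout the test suite: skip tests whose measurements are distorted
whenever a trace function (such as the one coverage.py installs) is active.
The test class and test body below are invented for illustration; the real
changes are in the diff that follows.

    import sys
    import unittest

    def trace_is_active():
        return hasattr(sys, 'gettrace') and sys.gettrace() is not None

    class DemoRefcountTests(unittest.TestCase):

        @unittest.skipIf(trace_is_active(),
                         'trace function throws off refcount measurements')
        def test_refcount_is_stable(self):
            obj = object()
            before = sys.getrefcount(obj)
            extra = obj                   # bind one more reference
            self.assertEqual(sys.getrefcount(obj), before + 1)
            del extra

    if __name__ == '__main__':
        unittest.main()

The new test.support.refcount_test decorator goes one step further for
refcount-specific tests: per its docstring it combines cpython_only with
no_tracing, so any trace function is unset for the duration of the test
instead of the test being skipped.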
Modified: python/branches/py3k/Lib/ctypes/test/test_memfunctions.py python/branches/py3k/Lib/ctypes/test/test_python_api.py python/branches/py3k/Lib/ctypes/test/test_refcounts.py python/branches/py3k/Lib/ctypes/test/test_stringptr.py python/branches/py3k/Lib/test/support.py python/branches/py3k/Lib/test/test_descr.py python/branches/py3k/Lib/test/test_gc.py python/branches/py3k/Lib/test/test_genexps.py python/branches/py3k/Lib/test/test_metaclass.py python/branches/py3k/Lib/test/test_pydoc.py python/branches/py3k/Lib/test/test_sys.py python/branches/py3k/Lib/test/test_trace.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/ctypes/test/test_memfunctions.py ============================================================================== --- python/branches/py3k/Lib/ctypes/test/test_memfunctions.py (original) +++ python/branches/py3k/Lib/ctypes/test/test_memfunctions.py Tue Feb 22 04:04:06 2011 @@ -1,4 +1,5 @@ import sys +from test import support import unittest from ctypes import * @@ -49,6 +50,7 @@ self.assertEqual(cast(a, POINTER(c_byte))[:7:7], [97]) + @support.refcount_test def test_string_at(self): s = string_at(b"foo bar") # XXX The following may be wrong, depending on how Python Modified: python/branches/py3k/Lib/ctypes/test/test_python_api.py ============================================================================== --- python/branches/py3k/Lib/ctypes/test/test_python_api.py (original) +++ python/branches/py3k/Lib/ctypes/test/test_python_api.py Tue Feb 22 04:04:06 2011 @@ -1,5 +1,6 @@ from ctypes import * import unittest, sys +from test import support from ctypes.test import is_resource_enabled ################################################################ @@ -25,6 +26,7 @@ self.assertEqual(PyBytes_FromStringAndSize(b"abcdefghi", 3), b"abc") + @support.refcount_test def test_PyString_FromString(self): pythonapi.PyBytes_FromString.restype = py_object pythonapi.PyBytes_FromString.argtypes = (c_char_p,) @@ -56,6 +58,7 @@ del res self.assertEqual(grc(42), ref42) + @support.refcount_test def test_PyObj_FromPtr(self): s = "abc def ghi jkl" ref = grc(s) Modified: python/branches/py3k/Lib/ctypes/test/test_refcounts.py ============================================================================== --- python/branches/py3k/Lib/ctypes/test/test_refcounts.py (original) +++ python/branches/py3k/Lib/ctypes/test/test_refcounts.py Tue Feb 22 04:04:06 2011 @@ -1,4 +1,5 @@ import unittest +from test import support import ctypes import gc @@ -10,6 +11,7 @@ class RefcountTestCase(unittest.TestCase): + @support.refcount_test def test_1(self): from sys import getrefcount as grc @@ -34,6 +36,7 @@ self.assertEqual(grc(callback), 2) + @support.refcount_test def test_refcount(self): from sys import getrefcount as grc def func(*args): Modified: python/branches/py3k/Lib/ctypes/test/test_stringptr.py ============================================================================== --- python/branches/py3k/Lib/ctypes/test/test_stringptr.py (original) +++ python/branches/py3k/Lib/ctypes/test/test_stringptr.py Tue Feb 22 04:04:06 2011 @@ -1,4 +1,5 @@ import unittest +from test import support from ctypes import * import _ctypes_test @@ -7,6 +8,7 @@ class StringPtrTestCase(unittest.TestCase): + @support.refcount_test def test__POINTER_c_char(self): class X(Structure): _fields_ = [("str", POINTER(c_char))] Modified: python/branches/py3k/Lib/test/support.py ============================================================================== --- python/branches/py3k/Lib/test/support.py (original) +++ 
python/branches/py3k/Lib/test/support.py Tue Feb 22 04:04:06 2011 @@ -1124,6 +1124,17 @@ return wrapper +def refcount_test(test): + """Decorator for tests which involve reference counting. + + To start, the decorator does not run the test if is not run by CPython. + After that, any trace function is unset during the test to prevent + unexpected refcounts caused by the trace function. + + """ + return no_tracing(cpython_only(test)) + + def _run_suite(suite): """Run tests from a unittest.TestSuite-derived class.""" if verbose: Modified: python/branches/py3k/Lib/test/test_descr.py ============================================================================== --- python/branches/py3k/Lib/test/test_descr.py (original) +++ python/branches/py3k/Lib/test/test_descr.py Tue Feb 22 04:04:06 2011 @@ -4243,6 +4243,8 @@ pass self.C = C + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __local__') def test_iter_keys(self): # Testing dict-proxy keys... it = self.C.__dict__.keys() @@ -4252,6 +4254,8 @@ self.assertEqual(keys, ['__dict__', '__doc__', '__module__', '__weakref__', 'meth']) + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __local__') def test_iter_values(self): # Testing dict-proxy values... it = self.C.__dict__.values() @@ -4259,6 +4263,8 @@ values = list(it) self.assertEqual(len(values), 5) + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __local__') def test_iter_items(self): # Testing dict-proxy iteritems... it = self.C.__dict__.items() Modified: python/branches/py3k/Lib/test/test_gc.py ============================================================================== --- python/branches/py3k/Lib/test/test_gc.py (original) +++ python/branches/py3k/Lib/test/test_gc.py Tue Feb 22 04:04:06 2011 @@ -1,5 +1,6 @@ import unittest -from test.support import verbose, run_unittest, strip_python_stderr +from test.support import (verbose, refcount_test, run_unittest, + strip_python_stderr) import sys import gc import weakref @@ -175,6 +176,7 @@ del d self.assertEqual(gc.collect(), 2) + @refcount_test def test_frame(self): def f(): frame = sys._getframe() @@ -242,6 +244,7 @@ # For example: # - disposed tuples are not freed, but reused # - the call to assertEqual somehow avoids building its args tuple + @refcount_test def test_get_count(self): # Avoid future allocation of method object assertEqual = self._baseAssertEqual @@ -252,6 +255,7 @@ # the dict, and the tuple returned by get_count() assertEqual(gc.get_count(), (2, 0, 0)) + @refcount_test def test_collect_generations(self): # Avoid future allocation of method object assertEqual = self.assertEqual Modified: python/branches/py3k/Lib/test/test_genexps.py ============================================================================== --- python/branches/py3k/Lib/test/test_genexps.py (original) +++ python/branches/py3k/Lib/test/test_genexps.py Tue Feb 22 04:04:06 2011 @@ -257,11 +257,15 @@ """ +import sys -__test__ = {'doctests' : doctests} +# Trace function can throw off the tuple reuse test. 
+if hasattr(sys, 'gettrace') and sys.gettrace(): + __test__ = {} +else: + __test__ = {'doctests' : doctests} def test_main(verbose=None): - import sys from test import support from test import test_genexps support.run_doctest(test_genexps, verbose) Modified: python/branches/py3k/Lib/test/test_metaclass.py ============================================================================== --- python/branches/py3k/Lib/test/test_metaclass.py (original) +++ python/branches/py3k/Lib/test/test_metaclass.py Tue Feb 22 04:04:06 2011 @@ -246,7 +246,13 @@ """ -__test__ = {'doctests' : doctests} +import sys + +# Trace function introduces __locals__ which causes various tests to fail. +if hasattr(sys, 'gettrace') and sys.gettrace(): + __test__ = {} +else: + __test__ = {'doctests' : doctests} def test_main(verbose=False): from test import support Modified: python/branches/py3k/Lib/test/test_pydoc.py ============================================================================== --- python/branches/py3k/Lib/test/test_pydoc.py (original) +++ python/branches/py3k/Lib/test/test_pydoc.py Tue Feb 22 04:04:06 2011 @@ -248,6 +248,8 @@ @unittest.skipIf(sys.flags.optimize >= 2, "Docstrings are omitted with -O2 and above") + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') def test_html_doc(self): result, doc_loc = get_pydoc_html(pydoc_mod) mod_file = inspect.getabsfile(pydoc_mod) @@ -263,6 +265,8 @@ @unittest.skipIf(sys.flags.optimize >= 2, "Docstrings are omitted with -O2 and above") + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') def test_text_doc(self): result, doc_loc = get_pydoc_text(pydoc_mod) expected_text = expected_text_pattern % \ @@ -340,6 +344,8 @@ @unittest.skipIf(sys.flags.optimize >= 2, 'Docstrings are omitted with -O2 and above') + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') def test_help_output_redirect(self): # issue 940286, if output is set in Helper, then all output from # Helper.help should be redirected Modified: python/branches/py3k/Lib/test/test_sys.py ============================================================================== --- python/branches/py3k/Lib/test/test_sys.py (original) +++ python/branches/py3k/Lib/test/test_sys.py Tue Feb 22 04:04:06 2011 @@ -303,6 +303,7 @@ self.assertEqual(sys.getdlopenflags(), oldflags+1) sys.setdlopenflags(oldflags) + @test.support.refcount_test def test_refcount(self): # n here must be a global in order for this test to pass while # tracing with a python function. 
Tracing calls PyFrame_FastToLocals Modified: python/branches/py3k/Lib/test/test_trace.py ============================================================================== --- python/branches/py3k/Lib/test/test_trace.py (original) +++ python/branches/py3k/Lib/test/test_trace.py Tue Feb 22 04:04:06 2011 @@ -245,6 +245,8 @@ } self.assertEqual(self.tracer.results().calledfuncs, expected) + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'pre-existing trace function throws off measurements') def test_inst_method_calling(self): obj = TracedClass(20) self.tracer.runfunc(obj.inst_method_calling, 1) @@ -264,6 +266,8 @@ self.tracer = Trace(count=0, trace=0, countcallers=1) self.filemod = my_file_and_modname() + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'pre-existing trace function throws off measurements') def test_loop_caller_importing(self): self.tracer.runfunc(traced_func_importing_caller, 1) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Tue Feb 22 04:04:06 2011 @@ -54,6 +54,8 @@ Tests ----- +- Issue #10992: Make tests pass under coverage. + - Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due to open door files. From python-checkins at python.org Tue Feb 22 04:06:27 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:06:27 +0100 Subject: [Python-checkins] devguide: Remove warnings about coverage issues as the patches have been applied. Message-ID: brett.cannon pushed 4c1f51eafc70 to devguide: http://hg.python.org/devguide/rev/4c1f51eafc70 changeset: 310:4c1f51eafc70 tag: tip parent: 297:b560997b365d user: Brett Cannon date: Mon Feb 21 19:06:20 2011 -0800 summary: Remove warnings about coverage issues as the patches have been applied. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -46,15 +46,6 @@ Measuring Coverage """""""""""""""""" -.. warning:: - Running the entire test suite under coverage (using either technique listed - below) currently fails as some tests are resetting the trace function; - see http://bugs.python.org/issue10990 for a patch. - - There are also various tests that simply fail as they have not been made - robust in the face of coverage measuring/having a trace function set; - see http://bugs.python.org/issue10992 for a patch. - It should be noted that a quirk of running coverage over Python's own stdlib is that certain modules are imported as part of interpreter startup. Those modules required by Python itself will not be viewed as executed by the coverage tools -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 22 04:07:16 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:07:16 +0100 Subject: [Python-checkins] merge in devguide (hg_transition): merge from default Message-ID: brett.cannon pushed bbd90a65d228 to devguide: http://hg.python.org/devguide/rev/bbd90a65d228 changeset: 311:bbd90a65d228 branch: hg_transition tag: tip parent: 309:d1402dfa33ac parent: 310:4c1f51eafc70 user: Brett Cannon date: Mon Feb 21 19:07:10 2011 -0800 summary: merge from default files: diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -46,15 +46,6 @@ Measuring Coverage """""""""""""""""" -.. 
warning:: - Running the entire test suite under coverage (using either technique listed - below) currently fails as some tests are resetting the trace function; - see http://bugs.python.org/issue10990 for a patch. - - There are also various tests that simply fail as they have not been made - robust in the face of coverage measuring/having a trace function set; - see http://bugs.python.org/issue10992 for a patch. - It should be noted that a quirk of running coverage over Python's own stdlib is that certain modules are imported as part of interpreter startup. Those modules required by Python itself will not be viewed as executed by the coverage tools -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Tue Feb 22 04:07:39 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:07:39 +0100 (CET) Subject: [Python-checkins] r88495 - python/branches/release32-maint Message-ID: <20110222030739.D8513EE98B@mail.python.org> Author: brett.cannon Date: Tue Feb 22 04:07:39 2011 New Revision: 88495 Log: Blocked revisions 88494 via svnmerge ........ r88494 | brett.cannon | 2011-02-21 19:04:06 -0800 (Mon, 21 Feb 2011) | 10 lines Issue #10992: make tests pass when run under coverage. Various tests fail when run under coverage. A primary culprit is refcount tests which fail as the counts are thrown off by the coverage code. A new decorator -- test.support.refcount_test -- is used to decorate tests which test refcounts and to skip them when running under coverage. Other tests simply fail because of changes in the system (e.g., __local__ suddenly appearing). Thanks to Kristian Vlaardingerbroek for helping to diagnose the test failures. ........ Modified: python/branches/release32-maint/ (props changed) From python-checkins at python.org Tue Feb 22 04:14:12 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:14:12 +0100 (CET) Subject: [Python-checkins] r88496 - in python/branches/py3k: Lib/test/test_cgi.py Misc/NEWS Message-ID: <20110222031412.51D3AEE986@mail.python.org> Author: brett.cannon Date: Tue Feb 22 04:14:12 2011 New Revision: 88496 Log: Issue #10512: close the log file in cgi when running tests. Thanks to Nadeem Vawda for the find and an initial fix. Modified: python/branches/py3k/Lib/test/test_cgi.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/test/test_cgi.py ============================================================================== --- python/branches/py3k/Lib/test/test_cgi.py (original) +++ python/branches/py3k/Lib/test/test_cgi.py Tue Feb 22 04:14:12 2011 @@ -155,6 +155,7 @@ cgi.logfp = None cgi.logfile = "/dev/null" cgi.initlog("%s", "Testing log 3") + self.addCleanup(cgi.logfp.close) cgi.log("Testing log 4") def test_fieldstorage_readline(self): Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Tue Feb 22 04:14:12 2011 @@ -54,6 +54,8 @@ Tests ----- +- Issue #10512: Properly close sockets under test.test_cgi. + - Issue #10992: Make tests pass under coverage. 
- Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due From python-checkins at python.org Tue Feb 22 04:16:07 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:16:07 +0100 (CET) Subject: [Python-checkins] r88497 - in python/branches/release32-maint: Lib/test/test_cgi.py Message-ID: <20110222031607.58564EE9AE@mail.python.org> Author: brett.cannon Date: Tue Feb 22 04:16:07 2011 New Revision: 88497 Log: Merged revisions 88496 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88496 | brett.cannon | 2011-02-21 19:14:12 -0800 (Mon, 21 Feb 2011) | 4 lines Issue #10512: close the log file in cgi when running tests. Thanks to Nadeem Vawda for the find and an initial fix. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_cgi.py Modified: python/branches/release32-maint/Lib/test/test_cgi.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_cgi.py (original) +++ python/branches/release32-maint/Lib/test/test_cgi.py Tue Feb 22 04:16:07 2011 @@ -155,6 +155,7 @@ cgi.logfp = None cgi.logfile = "/dev/null" cgi.initlog("%s", "Testing log 3") + self.addCleanup(cgi.logfp.close) cgi.log("Testing log 4") def test_fieldstorage_readline(self): From python-checkins at python.org Tue Feb 22 04:25:12 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:25:12 +0100 (CET) Subject: [Python-checkins] r88498 - in python/branches/py3k: Lib/tokenize.py Misc/NEWS Message-ID: <20110222032512.C5B43C341@mail.python.org> Author: brett.cannon Date: Tue Feb 22 04:25:12 2011 New Revision: 88498 Log: Issue #11074: Make 'tokenize' so it can be reloaded. The module stored away the 'open' object as found in the global namespace (which fell through to the built-in namespace) since it defined its own 'open'. Problem is that if you reloaded the module it then grabbed the 'open' defined in the previous load, leading to code that infinite recursed. Switched to simply call builtins.open directly. Modified: python/branches/py3k/Lib/tokenize.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/tokenize.py ============================================================================== --- python/branches/py3k/Lib/tokenize.py (original) +++ python/branches/py3k/Lib/tokenize.py Tue Feb 22 04:25:12 2011 @@ -24,6 +24,7 @@ __credits__ = ('GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, ' 'Skip Montanaro, Raymond Hettinger, Trent Nelson, ' 'Michael Foord') +import builtins import re import sys from token import * @@ -335,13 +336,11 @@ return default, [first, second] -_builtin_open = open - def open(filename): """Open a file in read only mode using the encoding detected by detect_encoding(). """ - buffer = _builtin_open(filename, 'rb') + buffer = builtins.open(filename, 'rb') encoding, lines = detect_encoding(buffer.readline) buffer.seek(0) text = TextIOWrapper(buffer, encoding, line_buffering=True) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Tue Feb 22 04:25:12 2011 @@ -27,6 +27,8 @@ Library ------- +- Issue #11074: Make 'tokenize' so it can be reloaded. + - Issue #11085: Moved collections abstract base classes into a separate module called collections.abc, following the pattern used by importlib.abc. 
For backwards compatibility, the names are imported into the collections From python-checkins at python.org Tue Feb 22 04:35:18 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 04:35:18 +0100 (CET) Subject: [Python-checkins] r88499 - in python/branches/release32-maint: Lib/tokenize.py Misc/NEWS Message-ID: <20110222033518.8AAA6EE9C9@mail.python.org> Author: brett.cannon Date: Tue Feb 22 04:35:18 2011 New Revision: 88499 Log: Merged revisions 88498 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88498 | brett.cannon | 2011-02-21 19:25:12 -0800 (Mon, 21 Feb 2011) | 8 lines Issue #11074: Make 'tokenize' so it can be reloaded. The module stored away the 'open' object as found in the global namespace (which fell through to the built-in namespace) since it defined its own 'open'. Problem is that if you reloaded the module it then grabbed the 'open' defined in the previous load, leading to code that infinite recursed. Switched to simply call builtins.open directly. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/tokenize.py python/branches/release32-maint/Misc/NEWS Modified: python/branches/release32-maint/Lib/tokenize.py ============================================================================== --- python/branches/release32-maint/Lib/tokenize.py (original) +++ python/branches/release32-maint/Lib/tokenize.py Tue Feb 22 04:35:18 2011 @@ -24,6 +24,7 @@ __credits__ = ('GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, ' 'Skip Montanaro, Raymond Hettinger, Trent Nelson, ' 'Michael Foord') +import builtins import re import sys from token import * @@ -335,13 +336,11 @@ return default, [first, second] -_builtin_open = open - def open(filename): """Open a file in read only mode using the encoding detected by detect_encoding(). """ - buffer = _builtin_open(filename, 'rb') + buffer = builtins.open(filename, 'rb') encoding, lines = detect_encoding(buffer.readline) buffer.seek(0) text = TextIOWrapper(buffer, encoding, line_buffering=True) Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Tue Feb 22 04:35:18 2011 @@ -15,6 +15,8 @@ Library ------- +- Issue #11074: Make 'tokenize' so it can be reloaded. + - Issue #4681: Allow mmap() to work on file sizes and offsets larger than 4GB, even on 32-bit builds. Initial patch by Ross Lagerwall, adapted for 32-bit Windows. 
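For readers who want to see the failure mode behind r88498/r88499 in
isolation, here is a minimal sketch of the shadowing pitfall; it is not the
tokenize source, just the same pattern reduced to a few lines. The names
_saved_open and open_fixed are invented for the demo; the real module saved
the builtin as _builtin_open and keeps the public name open.

    import builtins

    # Buggy pattern, reduced to its essence: snapshot whatever the name
    # `open` resolves to at import time.  On the first import that is the
    # builtin, so everything works.  After the module is reloaded (e.g. with
    # imp.reload()), the module globals already contain the wrapper defined
    # below, so the snapshot captures the previous wrapper and each call
    # loops back into itself until the recursion limit is hit.
    _saved_open = open

    def open(filename):                  # shadows the builtin in this module
        return _saved_open(filename)

    # Fixed pattern from the commit: resolve the builtin explicitly on every
    # call; reloading cannot disturb that lookup.
    def open_fixed(filename):
        return builtins.open(filename, 'rb')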
From solipsis at pitrou.net Tue Feb 22 05:06:40 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 22 Feb 2011 05:06:40 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88492): sum=0 Message-ID: py3k results for svn r88492 (hg cset 01e34f50b68b) -------------------------------------------------- test_pydoc leaked [0, -323, 323] references, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogaQIo9C', '-x'] From python-checkins at python.org Tue Feb 22 11:55:44 2011 From: python-checkins at python.org (sean.reifschneider) Date: Tue, 22 Feb 2011 11:55:44 +0100 (CET) Subject: [Python-checkins] r88500 - in python/branches/py3k: Doc/library/crypt.rst Lib/crypt.py Lib/test/test_crypt.py Misc/NEWS Modules/Setup.dist Modules/_cryptmodule.c Modules/cryptmodule.c setup.py Message-ID: <20110222105544.7A1B1EE989@mail.python.org> Author: sean.reifschneider Date: Tue Feb 22 11:55:44 2011 New Revision: 88500 Log: Issue #10924: Adding salt and Modular Crypt Format to crypt library. Added: python/branches/py3k/Lib/crypt.py python/branches/py3k/Modules/_cryptmodule.c - copied, changed from r88042, /python/branches/py3k/Modules/cryptmodule.c Removed: python/branches/py3k/Modules/cryptmodule.c Modified: python/branches/py3k/Doc/library/crypt.rst python/branches/py3k/Lib/test/test_crypt.py python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/Setup.dist python/branches/py3k/setup.py Modified: python/branches/py3k/Doc/library/crypt.rst ============================================================================== --- python/branches/py3k/Doc/library/crypt.rst (original) +++ python/branches/py3k/Doc/library/crypt.rst Tue Feb 22 11:55:44 2011 @@ -15,9 +15,9 @@ This module implements an interface to the :manpage:`crypt(3)` routine, which is a one-way hash function based upon a modified DES algorithm; see the Unix man -page for further details. Possible uses include allowing Python scripts to -accept typed passwords from the user, or attempting to crack Unix passwords with -a dictionary. +page for further details. Possible uses include storing hashed passwords +so you can check passwords without storing the actual password, or attempting +to crack Unix passwords with a dictionary. .. index:: single: crypt(3) @@ -26,15 +26,67 @@ extensions available on the current implementation will also be available on this module. +Hashing Methods +--------------- -.. function:: crypt(word, salt) +The :mod:`crypt` module defines the list of hashing methods (not all methods +are available on all platforms): + +.. data:: METHOD_SHA512 + + A Modular Crypt Format method with 16 character salt and 86 character + hash. This is the strongest method. + +.. versionadded:: 3.3 + +.. data:: METHOD_SHA256 + + Another Modular Crypt Format method with 16 character salt and 43 + character hash. + +.. versionadded:: 3.3 + +.. data:: METHOD_MD5 + + Another Modular Crypt Format method with 8 character salt and 22 + character hash. + +.. versionadded:: 3.3 + +.. data:: METHOD_CRYPT + + The traditional method with a 2 character salt and 13 characters of + hash. This is the weakest method. + +.. versionadded:: 3.3 + +Module Functions +---------------- + +The :mod:`crypt` module defines the following functions: + +.. function:: crypt(word, salt=None) *word* will usually be a user's password as typed at a prompt or in a graphical - interface. 
*salt* is usually a random two-character string which will be used - to perturb the DES algorithm in one of 4096 ways. The characters in *salt* must - be in the set ``[./a-zA-Z0-9]``. Returns the hashed password as a string, which - will be composed of characters from the same alphabet as the salt (the first two - characters represent the salt itself). + interface. The optional *salt* is either a string as returned from + :func:`mksalt`, one of the ``crypt.METHOD_*`` values (though not all + may be available on all platforms), or a full encrypted password + including salt, as returned by this function. If *salt* is not + provided, the strongest method will be used (as returned by + :func:`methods`. + + Checking a password is usually done by passing the plain-text password + as *word* and the full results of a previous :func:`crypt` call, + which should be the same as the results of this call. + + *salt* (either a random 2 or 16 character string, possibly prefixed with + ``$digit$`` to indicate the method) which will be used to perturb the + encryption algorithm. The characters in *salt* must be in the set + ``[./a-zA-Z0-9]``, with the exception of Modular Crypt Format which + prefixes a ``$digit$``. + + Returns the hashed password as a string, which will be composed of + characters from the same alphabet as the salt. .. index:: single: crypt(3) @@ -42,6 +94,34 @@ different sizes in the *salt*, it is recommended to use the full crypted password as salt when checking for a password. +.. versionchanged:: 3.3 + Before version 3.3, *salt* must be specified as a string and cannot + accept ``crypt.METHOD_*`` values (which don't exist anyway). + +.. function:: methods() + + Return a list of available password hashing algorithms, as + ``crypt.METHOD_*`` objects. This list is sorted from strongest to + weakest, and is guaranteed to have at least ``crypt.METHOD_CRYPT``. + +.. versionadded:: 3.3 + +.. function:: mksalt(method=None) + + Return a randomly generated salt of the specified method. If no + *method* is given, the strongest method available as returned by + :func:`methods` is used. + + The return value is a string either of 2 characters in length for + ``crypt.METHOD_CRYPT``, or 19 characters starting with ``$digit$`` and + 16 random characters from the set ``[./a-zA-Z0-9]``, suitable for + passing as the *salt* argument to :func:`crypt`. + +.. versionadded:: 3.3 + +Examples +-------- + A simple example illustrating typical use:: import crypt, getpass, pwd @@ -57,3 +137,11 @@ else: return 1 +To generate a hash of a password using the strongest available method and +check it against the original:: + + import crypt + + hashed = crypt.crypt(plaintext) + if hashed != crypt.crypt(plaintext, hashed): + raise "Hashed version doesn't validate against original" Added: python/branches/py3k/Lib/crypt.py ============================================================================== --- (empty file) +++ python/branches/py3k/Lib/crypt.py Tue Feb 22 11:55:44 2011 @@ -0,0 +1,61 @@ +'''Wrapper to the POSIX crypt library call and associated functionality. 
+''' + +import _crypt + +saltchars = 'abcdefghijklmnopqrstuvwxyz' +saltchars += saltchars.upper() +saltchars += '0123456789./' + + +class _MethodClass: + '''Class representing a salt method per the Modular Crypt Format or the + legacy 2-character crypt method.''' + def __init__(self, name, ident, salt_chars, total_size): + self.name = name + self.ident = ident + self.salt_chars = salt_chars + self.total_size = total_size + + def __repr__(self): + return '' % self.name + + +# available salting/crypto methods +METHOD_CRYPT = _MethodClass('CRYPT', None, 2, 13) +METHOD_MD5 = _MethodClass('MD5', '1', 8, 34) +METHOD_SHA256 = _MethodClass('SHA256', '5', 16, 63) +METHOD_SHA512 = _MethodClass('SHA512', '6', 16, 106) + + +def methods(): + '''Return a list of methods that are available in the platform ``crypt()`` + library, sorted from strongest to weakest. This is guaranteed to always + return at least ``[METHOD_CRYPT]``''' + method_list = [ METHOD_SHA512, METHOD_SHA256, METHOD_MD5 ] + ret = [ method for method in method_list + if len(crypt('', method)) == method.total_size ] + ret.append(METHOD_CRYPT) + return ret + + +def mksalt(method = None): + '''Generate a salt for the specified method. If not specified, the + strongest available method will be used.''' + import random + + if method == None: method = methods()[0] + s = '$%s$' % method.ident if method.ident else '' + s += ''.join([ random.choice(saltchars) for x in range(method.salt_chars) ]) + return(s) + + +def crypt(word, salt = None): + '''Return a string representing the one-way hash of a password, preturbed + by a salt. If ``salt`` is not specified or is ``None``, the strongest + available method will be selected and a salt generated. Otherwise, + ``salt`` may be one of the ``crypt.METHOD_*`` values, or a string as + returned by ``crypt.mksalt()``.''' + if salt == None: salt = mksalt() + elif isinstance(salt, _MethodClass): salt = mksalt(salt) + return(_crypt.crypt(word, salt)) Modified: python/branches/py3k/Lib/test/test_crypt.py ============================================================================== --- python/branches/py3k/Lib/test/test_crypt.py (original) +++ python/branches/py3k/Lib/test/test_crypt.py Tue Feb 22 11:55:44 2011 @@ -10,6 +10,23 @@ if support.verbose: print('Test encryption: ', c) + def test_salt(self): + self.assertEqual(len(crypt.saltchars), 64) + for method in crypt.methods(): + salt = crypt.mksalt(method) + self.assertEqual(len(salt), + method.salt_chars + (3 if method.ident else 0)) + + def test_saltedcrypt(self): + for method in crypt.methods(): + pw = crypt.crypt('assword', method) + self.assertEqual(len(pw), method.total_size) + pw = crypt.crypt('assword', crypt.mksalt(method)) + self.assertEqual(len(pw), method.total_size) + + def test_methods(self): + self.assertTrue(len(crypt.methods()) > 1) + def test_main(): support.run_unittest(CryptTestCase) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Tue Feb 22 11:55:44 2011 @@ -27,6 +27,10 @@ Library ------- +- Issue #10924: Adding salt and Modular Crypt Format to crypt library. + Moved old C wrapper to _crypt, and added a Python wrapper with + enhanced salt generation and simpler API for password generation. + - Issue #11074: Make 'tokenize' so it can be reloaded. 
- Issue #11085: Moved collections abstract base classes into a separate Modified: python/branches/py3k/Modules/Setup.dist ============================================================================== --- python/branches/py3k/Modules/Setup.dist (original) +++ python/branches/py3k/Modules/Setup.dist Tue Feb 22 11:55:44 2011 @@ -207,7 +207,7 @@ # # First, look at Setup.config; configure may have set this for you. -#crypt cryptmodule.c # -lcrypt # crypt(3); needs -lcrypt on some systems +#_crypt _cryptmodule.c # -lcrypt # crypt(3); needs -lcrypt on some systems # Some more UNIX dependent modules -- off by default, since these Copied: python/branches/py3k/Modules/_cryptmodule.c (from r88042, /python/branches/py3k/Modules/cryptmodule.c) ============================================================================== --- /python/branches/py3k/Modules/cryptmodule.c (original) +++ python/branches/py3k/Modules/_cryptmodule.c Tue Feb 22 11:55:44 2011 @@ -45,7 +45,7 @@ static struct PyModuleDef cryptmodule = { PyModuleDef_HEAD_INIT, - "crypt", + "_crypt", NULL, -1, crypt_methods, @@ -56,7 +56,7 @@ }; PyMODINIT_FUNC -PyInit_crypt(void) +PyInit__crypt(void) { return PyModule_Create(&cryptmodule); } Deleted: python/branches/py3k/Modules/cryptmodule.c ============================================================================== --- python/branches/py3k/Modules/cryptmodule.c Tue Feb 22 11:55:44 2011 +++ (empty file) @@ -1,62 +0,0 @@ -/* cryptmodule.c - by Steve Majewski - */ - -#include "Python.h" - -#include - -#ifdef __VMS -#include -#endif - -/* Module crypt */ - - -static PyObject *crypt_crypt(PyObject *self, PyObject *args) -{ - char *word, *salt; -#ifndef __VMS - extern char * crypt(const char *, const char *); -#endif - - if (!PyArg_ParseTuple(args, "ss:crypt", &word, &salt)) { - return NULL; - } - /* On some platforms (AtheOS) crypt returns NULL for an invalid - salt. Return None in that case. XXX Maybe raise an exception? */ - return Py_BuildValue("s", crypt(word, salt)); - -} - -PyDoc_STRVAR(crypt_crypt__doc__, -"crypt(word, salt) -> string\n\ -word will usually be a user's password. salt is a 2-character string\n\ -which will be used to select one of 4096 variations of DES. The characters\n\ -in salt must be either \".\", \"/\", or an alphanumeric character. 
Returns\n\ -the hashed password as a string, which will be composed of characters from\n\ -the same alphabet as the salt."); - - -static PyMethodDef crypt_methods[] = { - {"crypt", crypt_crypt, METH_VARARGS, crypt_crypt__doc__}, - {NULL, NULL} /* sentinel */ -}; - - -static struct PyModuleDef cryptmodule = { - PyModuleDef_HEAD_INIT, - "crypt", - NULL, - -1, - crypt_methods, - NULL, - NULL, - NULL, - NULL -}; - -PyMODINIT_FUNC -PyInit_crypt(void) -{ - return PyModule_Create(&cryptmodule); -} Modified: python/branches/py3k/setup.py ============================================================================== --- python/branches/py3k/setup.py (original) +++ python/branches/py3k/setup.py Tue Feb 22 11:55:44 2011 @@ -636,7 +636,7 @@ libs = ['crypt'] else: libs = [] - exts.append( Extension('crypt', ['cryptmodule.c'], libraries=libs) ) + exts.append( Extension('_crypt', ['_cryptmodule.c'], libraries=libs) ) # CSV files exts.append( Extension('_csv', ['_csv.c']) ) From python-checkins at python.org Tue Feb 22 16:56:20 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Tue, 22 Feb 2011 16:56:20 +0100 (CET) Subject: [Python-checkins] r88501 - python/branches/py3k/Lib/smtplib.py Message-ID: <20110222155620.B82D2EE9B5@mail.python.org> Author: giampaolo.rodola Date: Tue Feb 22 16:56:20 2011 New Revision: 88501 Log: smtlib.py PEP8 normalization via pep8.py script. Modified: python/branches/py3k/Lib/smtplib.py Modified: python/branches/py3k/Lib/smtplib.py ============================================================================== --- python/branches/py3k/Lib/smtplib.py (original) +++ python/branches/py3k/Lib/smtplib.py Tue Feb 22 16:56:20 2011 @@ -52,15 +52,15 @@ from email.base64mime import body_encode as encode_base64 from sys import stderr -__all__ = ["SMTPException","SMTPServerDisconnected","SMTPResponseException", - "SMTPSenderRefused","SMTPRecipientsRefused","SMTPDataError", - "SMTPConnectError","SMTPHeloError","SMTPAuthenticationError", - "quoteaddr","quotedata","SMTP"] +__all__ = ["SMTPException", "SMTPServerDisconnected", "SMTPResponseException", + "SMTPSenderRefused", "SMTPRecipientsRefused", "SMTPDataError", + "SMTPConnectError", "SMTPHeloError", "SMTPAuthenticationError", + "quoteaddr", "quotedata", "SMTP"] SMTP_PORT = 25 SMTP_SSL_PORT = 465 -CRLF="\r\n" -bCRLF=b"\r\n" +CRLF = "\r\n" +bCRLF = b"\r\n" OLDSTYLE_AUTH = re.compile(r"auth=(.*)", re.I) @@ -113,7 +113,7 @@ def __init__(self, recipients): self.recipients = recipients - self.args = ( recipients,) + self.args = (recipients,) class SMTPDataError(SMTPResponseException): @@ -142,7 +142,7 @@ m = email.utils.parseaddr(addr)[1] except AttributeError: pass - if m == (None, None): # Indicates parse failure or AttributeError + if m == (None, None): # Indicates parse failure or AttributeError # something weird here.. punt -ddm return "<%s>" % addr elif m is None: @@ -185,7 +185,8 @@ chr = None while chr != b"\n": chr = self.sslobj.read(1) - if not chr: break + if not chr: + break str += chr return str @@ -280,10 +281,11 @@ def _get_socket(self, host, port, timeout): # This makes it simpler for SMTP_SSL to use the SMTP connect code # and just alter the socket connection bit. - if self.debuglevel > 0: print('connect:', (host, port), file=stderr) + if self.debuglevel > 0: + print('connect:', (host, port), file=stderr) return socket.create_connection((host, port), timeout) - def connect(self, host='localhost', port = 0): + def connect(self, host='localhost', port=0): """Connect to a host on a given port. 
If the hostname ends with a colon (`:') followed by a number, and @@ -297,20 +299,25 @@ if not port and (host.find(':') == host.rfind(':')): i = host.rfind(':') if i >= 0: - host, port = host[:i], host[i+1:] - try: port = int(port) + host, port = host[:i], host[i + 1:] + try: + port = int(port) except ValueError: raise socket.error("nonnumeric port") - if not port: port = self.default_port - if self.debuglevel > 0: print('connect:', (host, port), file=stderr) + if not port: + port = self.default_port + if self.debuglevel > 0: + print('connect:', (host, port), file=stderr) self.sock = self._get_socket(host, port, self.timeout) (code, msg) = self.getreply() - if self.debuglevel > 0: print("connect:", msg, file=stderr) + if self.debuglevel > 0: + print("connect:", msg, file=stderr) return (code, msg) def send(self, s): """Send `s' to the server.""" - if self.debuglevel > 0: print('send:', repr(s), file=stderr) + if self.debuglevel > 0: + print('send:', repr(s), file=stderr) if hasattr(self, 'sock') and self.sock: if isinstance(s, str): s = s.encode("ascii") @@ -343,7 +350,7 @@ Raises SMTPServerDisconnected if end-of-file is reached. """ - resp=[] + resp = [] if self.file is None: self.file = self.sock.makefile('rb') while 1: @@ -354,9 +361,10 @@ if not line: self.close() raise SMTPServerDisconnected("Connection unexpectedly closed") - if self.debuglevel > 0: print('reply:', repr(line), file=stderr) + if self.debuglevel > 0: + print('reply:', repr(line), file=stderr) resp.append(line[4:].strip(b' \t\r\n')) - code=line[:3] + code = line[:3] # Check that the error code is syntactically correct. # Don't attempt to read a continuation line if it is broken. try: @@ -370,12 +378,12 @@ errmsg = b"\n".join(resp) if self.debuglevel > 0: - print('reply: retcode (%s); Msg: %s' % (errcode,errmsg), file=stderr) + print('reply: retcode (%s); Msg: %s' % (errcode, errmsg), file=stderr) return errcode, errmsg def docmd(self, cmd, args=""): """Send a command, and return its response code.""" - self.putcmd(cmd,args) + self.putcmd(cmd, args) return self.getreply() # std smtp commands @@ -385,9 +393,9 @@ host. """ self.putcmd("helo", name or self.local_hostname) - (code,msg)=self.getreply() - self.helo_resp=msg - return (code,msg) + (code, msg) = self.getreply() + self.helo_resp = msg + return (code, msg) def ehlo(self, name=''): """ SMTP 'ehlo' command. @@ -396,20 +404,20 @@ """ self.esmtp_features = {} self.putcmd(self.ehlo_msg, name or self.local_hostname) - (code,msg)=self.getreply() + (code, msg) = self.getreply() # According to RFC1869 some (badly written) # MTA's will disconnect on an ehlo. Toss an exception if # that happens -ddm if code == -1 and len(msg) == 0: self.close() raise SMTPServerDisconnected("Server not connected") - self.ehlo_resp=msg + self.ehlo_resp = msg if code != 250: - return (code,msg) - self.does_esmtp=1 + return (code, msg) + self.does_esmtp = 1 #parse the ehlo response -ddm assert isinstance(self.ehlo_resp, bytes), repr(self.ehlo_resp) - resp=self.ehlo_resp.decode("latin-1").split('\n') + resp = self.ehlo_resp.decode("latin-1").split('\n') del resp[0] for each in resp: # To be able to communicate with as many SMTP servers as possible, @@ -429,16 +437,16 @@ # It's actually stricter, in that only spaces are allowed between # parameters, but were not going to check for that here. Note # that the space isn't present if there are no parameters. 
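(Side note on the connect() behaviour shown above, which this reformatting does not change: a trailing ``:port`` in the host string is split off automatically. The host name below is invented.)::

    import smtplib

    s = smtplib.SMTP('mail.example.com:2525')   # port parsed from the string
    s.set_debuglevel(1)                         # uses the prints tidied in this patch
    print(s.noop())                             # (250, b'Ok') or similar
    s.quit()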
- m=re.match(r'(?P[A-Za-z0-9][A-Za-z0-9\-]*) ?',each) + m = re.match(r'(?P[A-Za-z0-9][A-Za-z0-9\-]*) ?', each) if m: - feature=m.group("feature").lower() - params=m.string[m.end("feature"):].strip() + feature = m.group("feature").lower() + params = m.string[m.end("feature"):].strip() if feature == "auth": self.esmtp_features[feature] = self.esmtp_features.get(feature, "") \ + " " + params else: - self.esmtp_features[feature]=params - return (code,msg) + self.esmtp_features[feature] = params + return (code, msg) def has_extn(self, opt): """Does the server support a given SMTP service extension?""" @@ -458,23 +466,23 @@ """SMTP 'noop' command -- doesn't do anything :>""" return self.docmd("noop") - def mail(self,sender,options=[]): + def mail(self, sender, options=[]): """SMTP 'mail' command -- begins mail xfer session.""" optionlist = '' if options and self.does_esmtp: optionlist = ' ' + ' '.join(options) - self.putcmd("mail", "FROM:%s%s" % (quoteaddr(sender) ,optionlist)) + self.putcmd("mail", "FROM:%s%s" % (quoteaddr(sender), optionlist)) return self.getreply() - def rcpt(self,recip,options=[]): + def rcpt(self, recip, options=[]): """SMTP 'rcpt' command -- indicates 1 recipient for this mail.""" optionlist = '' if options and self.does_esmtp: optionlist = ' ' + ' '.join(options) - self.putcmd("rcpt","TO:%s%s" % (quoteaddr(recip),optionlist)) + self.putcmd("rcpt", "TO:%s%s" % (quoteaddr(recip), optionlist)) return self.getreply() - def data(self,msg): + def data(self, msg): """SMTP 'DATA' command -- sends message data to server. Automatically quotes lines beginning with a period per rfc821. @@ -485,10 +493,11 @@ '\r\n' characters. If msg is bytes, it is transmitted as is. """ self.putcmd("data") - (code,repl)=self.getreply() - if self.debuglevel >0 : print("data:", (code,repl), file=stderr) + (code, repl) = self.getreply() + if self.debuglevel > 0: + print("data:", (code, repl), file=stderr) if code != 354: - raise SMTPDataError(code,repl) + raise SMTPDataError(code, repl) else: if isinstance(msg, str): msg = _fix_eols(msg).encode('ascii') @@ -497,16 +506,17 @@ q = q + bCRLF q = q + b"." + bCRLF self.send(q) - (code,msg)=self.getreply() - if self.debuglevel >0 : print("data:", (code,msg), file=stderr) - return (code,msg) + (code, msg) = self.getreply() + if self.debuglevel > 0: + print("data:", (code, msg), file=stderr) + return (code, msg) def verify(self, address): """SMTP 'verify' command -- checks for address validity.""" self.putcmd("vrfy", quoteaddr(address)) return self.getreply() # a.k.a. - vrfy=verify + vrfy = verify def expn(self, address): """SMTP 'expn' command -- expands a mailing list.""" @@ -564,7 +574,6 @@ s = "\0%s\0%s" % (user, password) return encode_base64(s.encode('ascii'), eol='') - AUTH_PLAIN = "PLAIN" AUTH_CRAM_MD5 = "CRAM-MD5" AUTH_LOGIN = "LOGIN" @@ -613,7 +622,7 @@ # We could not login sucessfully. Return result of last attempt. raise SMTPAuthenticationError(code, resp) - def starttls(self, keyfile = None, certfile = None): + def starttls(self, keyfile=None, certfile=None): """Puts the connection to the SMTP server into TLS mode. 
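(One more illustrative aside, since login() and the AUTH mechanisms appear just above: a typical submission session. The host, port 587 and the credentials are placeholders.)::

    import smtplib

    s = smtplib.SMTP('mail.example.com', 587)
    s.starttls()                       # upgrade to TLS before authenticating
    s.login('user', 'secret')          # tries CRAM-MD5, PLAIN or LOGIN as advertised
    s.sendmail('me@example.com', ['you@example.com'],
               'Subject: test\r\n\r\nhello')
    s.quit()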
If there has been no previous EHLO or HELO command this session, this @@ -721,22 +730,22 @@ esmtp_opts.append("size=%d" % len(msg)) for option in mail_options: esmtp_opts.append(option) - (code,resp) = self.mail(from_addr, esmtp_opts) + (code, resp) = self.mail(from_addr, esmtp_opts) if code != 250: self.rset() raise SMTPSenderRefused(code, resp, from_addr) - senderrs={} + senderrs = {} if isinstance(to_addrs, str): to_addrs = [to_addrs] for each in to_addrs: - (code,resp)=self.rcpt(each, rcpt_options) + (code, resp) = self.rcpt(each, rcpt_options) if (code != 250) and (code != 251): - senderrs[each]=(code,resp) - if len(senderrs)==len(to_addrs): + senderrs[each] = (code, resp) + if len(senderrs) == len(to_addrs): # the server refused all our recipients self.rset() raise SMTPRecipientsRefused(senderrs) - (code,resp) = self.data(msg) + (code, resp) = self.data(msg) if code != 250: self.rset() raise SMTPDataError(code, resp) @@ -770,7 +779,6 @@ return self.sendmail(from_addr, to_addrs, flatmsg, mail_options, rcpt_options) - def close(self): """Close the connection to the SMTP server.""" if self.file: @@ -780,7 +788,6 @@ self.sock.close() self.sock = None - def quit(self): """Terminate the SMTP session.""" res = self.docmd("quit") @@ -806,7 +813,8 @@ self.default_port = SMTP_SSL_PORT def _get_socket(self, host, port, timeout): - if self.debuglevel > 0: print('connect:', (host, port), file=stderr) + if self.debuglevel > 0: + print('connect:', (host, port), file=stderr) new_socket = socket.create_connection((host, port), timeout) new_socket = ssl.wrap_socket(new_socket, self.keyfile, self.certfile) self.file = SSLFakeFile(new_socket) @@ -834,11 +842,11 @@ ehlo_msg = "lhlo" - def __init__(self, host = '', port = LMTP_PORT, local_hostname = None): + def __init__(self, host='', port=LMTP_PORT, local_hostname=None): """Initialize a new instance.""" SMTP.__init__(self, host, port, local_hostname) - def connect(self, host = 'localhost', port = 0): + def connect(self, host='localhost', port=0): """Connect to the LMTP daemon, on either a Unix or a TCP socket.""" if host[0] != '/': return SMTP.connect(self, host, port) @@ -848,13 +856,15 @@ self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) self.sock.connect(host) except socket.error as msg: - if self.debuglevel > 0: print('connect fail:', host, file=stderr) + if self.debuglevel > 0: + print('connect fail:', host, file=stderr) if self.sock: self.sock.close() self.sock = None raise socket.error(msg) (code, msg) = self.getreply() - if self.debuglevel > 0: print('connect:', msg, file=stderr) + if self.debuglevel > 0: + print('connect:', msg, file=stderr) return (code, msg) @@ -868,7 +878,7 @@ return sys.stdin.readline().strip() fromaddr = prompt("From") - toaddrs = prompt("To").split(',') + toaddrs = prompt("To").split(',') print("Enter message, end with ^D:") msg = '' while 1: From python-checkins at python.org Tue Feb 22 17:24:07 2011 From: python-checkins at python.org (terry.reedy) Date: Tue, 22 Feb 2011 17:24:07 +0100 (CET) Subject: [Python-checkins] r88502 - python/branches/release32-maint/Doc/whatsnew/3.2.rst Message-ID: <20110222162407.ED20DEE98B@mail.python.org> Author: terry.reedy Date: Tue Feb 22 17:24:07 2011 New Revision: 88502 Log: Fix typo reported by 'Paddy' on python-list. 
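(The corrected lines, seen in the diff below, belong to a larger ThreadPoolExecutor example in the 3.2 What's New; in full it reads roughly as follows, with the worker count and file names purely illustrative.)::

    import concurrent.futures, shutil

    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as e:
        e.submit(shutil.copy, 'src1.txt', 'dest1.txt')
        e.submit(shutil.copy, 'src2.txt', 'dest2.txt')
        e.submit(shutil.copy, 'src3.txt', 'dest3.txt')
        e.submit(shutil.copy, 'src4.txt', 'dest4.txt')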
Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/release32-maint/Doc/whatsnew/3.2.rst (original) +++ python/branches/release32-maint/Doc/whatsnew/3.2.rst Tue Feb 22 17:24:07 2011 @@ -270,7 +270,7 @@ e.submit(shutil.copy, 'src1.txt', 'dest1.txt') e.submit(shutil.copy, 'src2.txt', 'dest2.txt') e.submit(shutil.copy, 'src3.txt', 'dest3.txt') - e.submit(shutil.copy, 'src3.txt', 'dest4.txt') + e.submit(shutil.copy, 'src4.txt', 'dest4.txt') .. seealso:: From nnorwitz at gmail.com Tue Feb 22 10:00:09 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Tue, 22 Feb 2011 04:00:09 -0500 Subject: [Python-checkins] Python Regression Test Failures doc (1) Message-ID: <20110222090009.GA28206@kbk-i386-bb.dyndns.org> rm -rf build/* rm -rf tools/sphinx rm -rf tools/pygments rm -rf tools/jinja2 rm -rf tools/docutils Checking out Sphinx... svn: PROPFIND request failed on '/projects/external/Sphinx-0.6.5/sphinx' svn: PROPFIND of '/projects/external/Sphinx-0.6.5/sphinx': could not connect to server (http://svn.python.org) make: *** [checkout] Error 1 From python-checkins at python.org Tue Feb 22 20:12:43 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 20:12:43 +0100 (CET) Subject: [Python-checkins] r88503 - in python/branches/py3k: Lib/lib2to3/__main__.py Tools/scripts/2to3 Message-ID: <20110222191243.810C3F634@mail.python.org> Author: brett.cannon Date: Tue Feb 22 20:12:43 2011 New Revision: 88503 Log: Add lib2to3.__main__ to make it easier for debugging purposes to run 2to3. Added: python/branches/py3k/Lib/lib2to3/__main__.py Modified: python/branches/py3k/Tools/scripts/2to3 Added: python/branches/py3k/Lib/lib2to3/__main__.py ============================================================================== --- (empty file) +++ python/branches/py3k/Lib/lib2to3/__main__.py Tue Feb 22 20:12:43 2011 @@ -0,0 +1,4 @@ +import sys +from .main import main + +sys.exit(main("lib2to3.fixes")) Modified: python/branches/py3k/Tools/scripts/2to3 ============================================================================== --- python/branches/py3k/Tools/scripts/2to3 (original) +++ python/branches/py3k/Tools/scripts/2to3 Tue Feb 22 20:12:43 2011 @@ -1,5 +1,4 @@ #!/usr/bin/env python -import sys -from lib2to3.main import main +import runpy -sys.exit(main("lib2to3.fixes")) +runpy.run_module('lib2to3', run_name='__main__', alter_sys=True) From python-checkins at python.org Tue Feb 22 20:13:46 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 20:13:46 +0100 (CET) Subject: [Python-checkins] r88504 - python/branches/release32-maint Message-ID: <20110222191346.9A10FEE9BB@mail.python.org> Author: brett.cannon Date: Tue Feb 22 20:13:46 2011 New Revision: 88504 Log: Blocked revisions 88503 via svnmerge ........ r88503 | brett.cannon | 2011-02-22 11:12:43 -0800 (Tue, 22 Feb 2011) | 1 line Add lib2to3.__main__ to make it easier for debugging purposes to run 2to3. ........ 
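(In practical terms, the new ``Lib/lib2to3/__main__.py`` means a source build can run the converter with ``./python -m lib2to3 -w example.py``; the rewritten wrapper script reaches the same entry point via runpy. The file name here is invented.)::

    import runpy

    # Same effect as invoking Tools/scripts/2to3; lib2to3.__main__ calls
    # sys.exit(main("lib2to3.fixes")), so this ends with SystemExit.
    runpy.run_module('lib2to3', run_name='__main__', alter_sys=True)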
Modified: python/branches/release32-maint/ (props changed) From python-checkins at python.org Tue Feb 22 20:24:33 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Tue, 22 Feb 2011 20:24:33 +0100 (CET) Subject: [Python-checkins] r88505 - python/branches/py3k/Lib/ftplib.py Message-ID: <20110222192433.60A86EEA06@mail.python.org> Author: giampaolo.rodola Date: Tue Feb 22 20:24:33 2011 New Revision: 88505 Log: In FTP.close() method, make sure to also close the socket object, not only the file. Modified: python/branches/py3k/Lib/ftplib.py Modified: python/branches/py3k/Lib/ftplib.py ============================================================================== --- python/branches/py3k/Lib/ftplib.py (original) +++ python/branches/py3k/Lib/ftplib.py Tue Feb 22 20:24:33 2011 @@ -589,11 +589,11 @@ def close(self): '''Close the connection without assuming anything about it.''' - if self.file: + if self.file is not None: self.file.close() + if self.sock is not None: self.sock.close() - self.file = self.sock = None - + self.file = self.sock = None try: import ssl From python-checkins at python.org Tue Feb 22 21:15:45 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 21:15:45 +0100 (CET) Subject: [Python-checkins] r88506 - in python/branches/py3k: Include/pydebug.h Include/pyerrors.h Modules/_ctypes/cfield.c Modules/_datetimemodule.c Modules/_io/bytesio.c Modules/_json.c Modules/_multiprocessing/multiprocessing.h Modules/_sqlite/connection.c Modules/_sqlite/cursor.c Modules/_sqlite/statement.c Modules/_ssl.c Modules/_threadmodule.c Modules/_tkinter.c Modules/arraymodule.c Modules/audioop.c Modules/cjkcodecs/_codecs_iso2022.c Modules/cjkcodecs/multibytecodec.c Modules/main.c Modules/socketmodule.c Objects/bytearrayobject.c Objects/fileobject.c Objects/floatobject.c Objects/obmalloc.c Objects/tupleobject.c Objects/typeobject.c Objects/unicodeobject.c Objects/weakrefobject.c Parser/parsetok.c Parser/pgenmain.c Python/ast.c Python/bltinmodule.c Python/ceval.c Python/dtoa.c Python/getargs.c Python/pystrtod.c Python/sysmodule.c Message-ID: <20110222201545.985C1EE9A5@mail.python.org> Author: brett.cannon Date: Tue Feb 22 21:15:44 2011 New Revision: 88506 Log: Issue #8914: fix various warnings from the Clang static analyzer v254. 
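(Back to the ftplib change above for a moment: with close() now releasing both the response file and the socket, a plain close via ``contextlib.closing`` is enough to tear the connection down even when quit() is never sent. The host name is invented.)::

    from contextlib import closing
    from ftplib import FTP

    with closing(FTP('ftp.example.com')) as ftp:
        ftp.login()               # anonymous login
        ftp.retrlines('LIST')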
Modified: python/branches/py3k/Include/pydebug.h python/branches/py3k/Include/pyerrors.h python/branches/py3k/Modules/_ctypes/cfield.c python/branches/py3k/Modules/_datetimemodule.c python/branches/py3k/Modules/_io/bytesio.c python/branches/py3k/Modules/_json.c python/branches/py3k/Modules/_multiprocessing/multiprocessing.h python/branches/py3k/Modules/_sqlite/connection.c python/branches/py3k/Modules/_sqlite/cursor.c python/branches/py3k/Modules/_sqlite/statement.c python/branches/py3k/Modules/_ssl.c python/branches/py3k/Modules/_threadmodule.c python/branches/py3k/Modules/_tkinter.c python/branches/py3k/Modules/arraymodule.c python/branches/py3k/Modules/audioop.c python/branches/py3k/Modules/cjkcodecs/_codecs_iso2022.c python/branches/py3k/Modules/cjkcodecs/multibytecodec.c python/branches/py3k/Modules/main.c python/branches/py3k/Modules/socketmodule.c python/branches/py3k/Objects/bytearrayobject.c python/branches/py3k/Objects/fileobject.c python/branches/py3k/Objects/floatobject.c python/branches/py3k/Objects/obmalloc.c python/branches/py3k/Objects/tupleobject.c python/branches/py3k/Objects/typeobject.c python/branches/py3k/Objects/unicodeobject.c python/branches/py3k/Objects/weakrefobject.c python/branches/py3k/Parser/parsetok.c python/branches/py3k/Parser/pgenmain.c python/branches/py3k/Python/ast.c python/branches/py3k/Python/bltinmodule.c python/branches/py3k/Python/ceval.c python/branches/py3k/Python/dtoa.c python/branches/py3k/Python/getargs.c python/branches/py3k/Python/pystrtod.c python/branches/py3k/Python/sysmodule.c Modified: python/branches/py3k/Include/pydebug.h ============================================================================== --- python/branches/py3k/Include/pydebug.h (original) +++ python/branches/py3k/Include/pydebug.h Tue Feb 22 21:15:44 2011 @@ -26,8 +26,6 @@ PYTHONPATH and PYTHONHOME from the environment */ #define Py_GETENV(s) (Py_IgnoreEnvironmentFlag ? 
NULL : getenv(s)) -PyAPI_FUNC(void) Py_FatalError(const char *message); - #ifdef __cplusplus } #endif Modified: python/branches/py3k/Include/pyerrors.h ============================================================================== --- python/branches/py3k/Include/pyerrors.h (original) +++ python/branches/py3k/Include/pyerrors.h Tue Feb 22 21:15:44 2011 @@ -70,7 +70,17 @@ PyAPI_FUNC(void) PyErr_Clear(void); PyAPI_FUNC(void) PyErr_Fetch(PyObject **, PyObject **, PyObject **); PyAPI_FUNC(void) PyErr_Restore(PyObject *, PyObject *, PyObject *); -PyAPI_FUNC(void) Py_FatalError(const char *message); + +#if defined(__clang__) || \ + (defined(__GNUC__) && \ + ((__GNUC_MAJOR__ >= 3) || \ + (__GNUC_MAJOR__ == 2) && (__GNUC_MINOR__ >= 5))) +#define _Py_NO_RETURN __attribute__((__noreturn__)) +#else +#define _Py_NO_RETURN +#endif + +PyAPI_FUNC(void) Py_FatalError(const char *message) _Py_NO_RETURN; #if defined(Py_DEBUG) || defined(Py_LIMITED_API) #define _PyErr_OCCURRED() PyErr_Occurred() Modified: python/branches/py3k/Modules/_ctypes/cfield.c ============================================================================== --- python/branches/py3k/Modules/_ctypes/cfield.c (original) +++ python/branches/py3k/Modules/_ctypes/cfield.c Tue Feb 22 21:15:44 2011 @@ -52,7 +52,7 @@ { CFieldObject *self; PyObject *proto; - Py_ssize_t size, align, length; + Py_ssize_t size, align; SETFUNC setfunc = NULL; GETFUNC getfunc = NULL; StgDictObject *dict; @@ -106,7 +106,6 @@ } size = dict->size; - length = dict->length; proto = desc; /* Field descriptors for 'c_char * n' are be scpecial cased to Modified: python/branches/py3k/Modules/_datetimemodule.c ============================================================================== --- python/branches/py3k/Modules/_datetimemodule.c (original) +++ python/branches/py3k/Modules/_datetimemodule.c Tue Feb 22 21:15:44 2011 @@ -1461,7 +1461,7 @@ goto Done; Py_DECREF(x1); Py_DECREF(x2); - x1 = x2 = NULL; + /* x1 = */ x2 = NULL; /* x3 has days+seconds in seconds */ x1 = PyNumber_Multiply(x3, us_per_second); /* us */ Modified: python/branches/py3k/Modules/_io/bytesio.c ============================================================================== --- python/branches/py3k/Modules/_io/bytesio.c (original) +++ python/branches/py3k/Modules/_io/bytesio.c Tue Feb 22 21:15:44 2011 @@ -938,13 +938,11 @@ bytesiobuf_getbuffer(bytesiobuf *obj, Py_buffer *view, int flags) { int ret; - void *ptr; bytesio *b = (bytesio *) obj->source; if (view == NULL) { b->exports++; return 0; } - ptr = (void *) obj; ret = PyBuffer_FillInfo(view, (PyObject*)obj, b->buf, b->string_size, 0, flags); if (ret >= 0) { Modified: python/branches/py3k/Modules/_json.c ============================================================================== --- python/branches/py3k/Modules/_json.c (original) +++ python/branches/py3k/Modules/_json.c Tue Feb 22 21:15:44 2011 @@ -335,7 +335,7 @@ PyObject *rval = NULL; Py_ssize_t len = PyUnicode_GET_SIZE(pystr); Py_ssize_t begin = end - 1; - Py_ssize_t next = begin; + Py_ssize_t next /* = begin */; const Py_UNICODE *buf = PyUnicode_AS_UNICODE(pystr); PyObject *chunks = NULL; PyObject *chunk = NULL; @@ -1532,13 +1532,12 @@ goto bail; Py_CLEAR(ident); } + /* TODO DOES NOT RUN; dead code if (s->indent != Py_None) { - /* TODO: DOES NOT RUN */ indent_level -= 1; - /* - yield '\n' + (' ' * (_indent * _current_indent_level)) - */ - } + + yield '\n' + (' ' * (_indent * _current_indent_level)) + }*/ if (PyList_Append(rval, close_dict)) goto bail; return 0; @@ -1624,13 +1623,13 @@ goto bail; 
Py_CLEAR(ident); } + + /* TODO: DOES NOT RUN if (s->indent != Py_None) { - /* TODO: DOES NOT RUN */ indent_level -= 1; - /* - yield '\n' + (' ' * (_indent * _current_indent_level)) - */ - } + + yield '\n' + (' ' * (_indent * _current_indent_level)) + }*/ if (PyList_Append(rval, close_array)) goto bail; Py_DECREF(s_fast); Modified: python/branches/py3k/Modules/_multiprocessing/multiprocessing.h ============================================================================== --- python/branches/py3k/Modules/_multiprocessing/multiprocessing.h (original) +++ python/branches/py3k/Modules/_multiprocessing/multiprocessing.h Tue Feb 22 21:15:44 2011 @@ -4,7 +4,7 @@ #define PY_SSIZE_T_CLEAN #ifdef __sun -/* The control message API is only available on Solaris +/* The control message API is only available on Solaris if XPG 4.2 or later is requested. */ #define _XOPEN_SOURCE 500 #endif Modified: python/branches/py3k/Modules/_sqlite/connection.c ============================================================================== --- python/branches/py3k/Modules/_sqlite/connection.c (original) +++ python/branches/py3k/Modules/_sqlite/connection.c Tue Feb 22 21:15:44 2011 @@ -673,7 +673,6 @@ { PyObject* function_result = NULL; PyObject** aggregate_instance; - PyObject* aggregate_class; #ifdef WITH_THREAD PyGILState_STATE threadstate; @@ -681,8 +680,6 @@ threadstate = PyGILState_Ensure(); #endif - aggregate_class = (PyObject*)sqlite3_user_data(context); - aggregate_instance = (PyObject**)sqlite3_aggregate_context(context, sizeof(PyObject*)); if (!*aggregate_instance) { /* this branch is executed if there was an exception in the aggregate's Modified: python/branches/py3k/Modules/_sqlite/cursor.c ============================================================================== --- python/branches/py3k/Modules/_sqlite/cursor.c (original) +++ python/branches/py3k/Modules/_sqlite/cursor.c Tue Feb 22 21:15:44 2011 @@ -126,11 +126,9 @@ static void pysqlite_cursor_dealloc(pysqlite_Cursor* self) { - int rc; - /* Reset the statement if the user has not closed the cursor */ if (self->statement) { - rc = pysqlite_statement_reset(self->statement); + pysqlite_statement_reset(self->statement); Py_DECREF(self->statement); } @@ -529,7 +527,7 @@ if (self->statement != NULL) { /* There is an active statement */ - rc = pysqlite_statement_reset(self->statement); + pysqlite_statement_reset(self->statement); } operation_cstr = _PyUnicode_AsStringAndSize(operation, &operation_len); @@ -734,7 +732,7 @@ } if (multiple) { - rc = pysqlite_statement_reset(self->statement); + pysqlite_statement_reset(self->statement); } Py_XDECREF(parameters); } Modified: python/branches/py3k/Modules/_sqlite/statement.c ============================================================================== --- python/branches/py3k/Modules/_sqlite/statement.c (original) +++ python/branches/py3k/Modules/_sqlite/statement.c Tue Feb 22 21:15:44 2011 @@ -369,11 +369,9 @@ void pysqlite_statement_dealloc(pysqlite_Statement* self) { - int rc; - if (self->st) { Py_BEGIN_ALLOW_THREADS - rc = sqlite3_finalize(self->st); + sqlite3_finalize(self->st); Py_END_ALLOW_THREADS } Modified: python/branches/py3k/Modules/_ssl.c ============================================================================== --- python/branches/py3k/Modules/_ssl.c (original) +++ python/branches/py3k/Modules/_ssl.c Tue Feb 22 21:15:44 2011 @@ -354,7 +354,6 @@ /* Actually negotiate SSL connection */ /* XXX If SSL_do_handshake() returns 0, it's also a failure. 
*/ - sockstate = 0; do { PySSL_BEGIN_ALLOW_THREADS ret = SSL_do_handshake(self->ssl); @@ -1090,7 +1089,6 @@ goto error; } do { - err = 0; PySSL_BEGIN_ALLOW_THREADS len = SSL_write(self->ssl, buf.buf, buf.len); err = SSL_get_error(self->ssl, len); @@ -1226,7 +1224,6 @@ } } do { - err = 0; PySSL_BEGIN_ALLOW_THREADS count = SSL_read(self->ssl, mem, len); err = SSL_get_error(self->ssl, count); Modified: python/branches/py3k/Modules/_threadmodule.c ============================================================================== --- python/branches/py3k/Modules/_threadmodule.c (original) +++ python/branches/py3k/Modules/_threadmodule.c Tue Feb 22 21:15:44 2011 @@ -53,8 +53,9 @@ _PyTime_timeval curtime; _PyTime_timeval endtime; + + _PyTime_gettimeofday(&endtime); if (microseconds > 0) { - _PyTime_gettimeofday(&endtime); endtime.tv_sec += microseconds / (1000 * 1000); endtime.tv_usec += microseconds % (1000 * 1000); } Modified: python/branches/py3k/Modules/_tkinter.c ============================================================================== --- python/branches/py3k/Modules/_tkinter.c (original) +++ python/branches/py3k/Modules/_tkinter.c Tue Feb 22 21:15:44 2011 @@ -2005,7 +2005,7 @@ PythonCmd(ClientData clientData, Tcl_Interp *interp, int argc, char *argv[]) { PythonCmd_ClientData *data = (PythonCmd_ClientData *)clientData; - PyObject *self, *func, *arg, *res; + PyObject *func, *arg, *res; int i, rv; Tcl_Obj *obj_res; @@ -2014,7 +2014,6 @@ /* TBD: no error checking here since we know, via the * Tkapp_CreateCommand() that the client data is a two-tuple */ - self = data->self; func = data->func; /* Create argument list (argv1, ..., argvN) */ Modified: python/branches/py3k/Modules/arraymodule.c ============================================================================== --- python/branches/py3k/Modules/arraymodule.c (original) +++ python/branches/py3k/Modules/arraymodule.c Tue Feb 22 21:15:44 2011 @@ -876,7 +876,6 @@ if (Py_SIZE(self) > 0) { if (n < 0) n = 0; - items = self->ob_item; if ((self->ob_descr->itemsize != 0) && (Py_SIZE(self) > PY_SSIZE_T_MAX / self->ob_descr->itemsize)) { return PyErr_NoMemory(); Modified: python/branches/py3k/Modules/audioop.c ============================================================================== --- python/branches/py3k/Modules/audioop.c (original) +++ python/branches/py3k/Modules/audioop.c Tue Feb 22 21:15:44 2011 @@ -513,7 +513,6 @@ best_result = result; best_j = 0; - j = 0; for ( j=1; j<=len1-len2; j++) { aj_m1 = (double)cp1[j-1]; @@ -599,7 +598,6 @@ best_result = result; best_j = 0; - j = 0; for ( j=1; j<=len1-len2; j++) { aj_m1 = (double)cp1[j-1]; @@ -1433,7 +1431,6 @@ if ( state == Py_None ) { /* First time, it seems. Set defaults */ valpred = 0; - step = 7; index = 0; } else if ( !PyArg_ParseTuple(state, "ii", &valpred, &index) ) return 0; @@ -1534,7 +1531,6 @@ if ( state == Py_None ) { /* First time, it seems. 
Set defaults */ valpred = 0; - step = 7; index = 0; } else if ( !PyArg_ParseTuple(state, "ii", &valpred, &index) ) return 0; Modified: python/branches/py3k/Modules/cjkcodecs/_codecs_iso2022.c ============================================================================== --- python/branches/py3k/Modules/cjkcodecs/_codecs_iso2022.c (original) +++ python/branches/py3k/Modules/cjkcodecs/_codecs_iso2022.c Tue Feb 22 21:15:44 2011 @@ -123,7 +123,7 @@ CODEC_INIT(iso2022) { - const struct iso2022_designation *desig = CONFIG_DESIGNATIONS; + const struct iso2022_designation *desig; for (desig = CONFIG_DESIGNATIONS; desig->mark; desig++) if (desig->initializer != NULL && desig->initializer() != 0) return -1; Modified: python/branches/py3k/Modules/cjkcodecs/multibytecodec.c ============================================================================== --- python/branches/py3k/Modules/cjkcodecs/multibytecodec.c (original) +++ python/branches/py3k/Modules/cjkcodecs/multibytecodec.c Tue Feb 22 21:15:44 2011 @@ -483,6 +483,7 @@ return PyBytes_FromStringAndSize(NULL, 0); buf.excobj = NULL; + buf.outobj = NULL; buf.inbuf = buf.inbuf_top = *data; buf.inbuf_end = buf.inbuf_top + datalen; Modified: python/branches/py3k/Modules/main.c ============================================================================== --- python/branches/py3k/Modules/main.c (original) +++ python/branches/py3k/Modules/main.c Tue Feb 22 21:15:44 2011 @@ -577,7 +577,6 @@ if ((p = Py_GETENV("PYTHONEXECUTABLE")) && *p != '\0') { wchar_t* buffer; size_t len = strlen(p); - size_t r; buffer = malloc(len * sizeof(wchar_t)); if (buffer == NULL) { @@ -585,7 +584,7 @@ "not enough memory to copy PYTHONEXECUTABLE"); } - r = mbstowcs(buffer, p, len); + mbstowcs(buffer, p, len); Py_SetProgramName(buffer); /* buffer is now handed off - do not free */ } else { Modified: python/branches/py3k/Modules/socketmodule.c ============================================================================== --- python/branches/py3k/Modules/socketmodule.c (original) +++ python/branches/py3k/Modules/socketmodule.c Tue Feb 22 21:15:44 2011 @@ -3407,7 +3407,7 @@ goto finally; af = sa->sa_family; ap = NULL; - al = 0; + /* al = 0; */ switch (af) { case AF_INET: ap = (char *)&((struct sockaddr_in *)sa)->sin_addr; Modified: python/branches/py3k/Objects/bytearrayobject.c ============================================================================== --- python/branches/py3k/Objects/bytearrayobject.c (original) +++ python/branches/py3k/Objects/bytearrayobject.c Tue Feb 22 21:15:44 2011 @@ -2439,7 +2439,7 @@ static PyObject * bytearray_rstrip(PyByteArrayObject *self, PyObject *args) { - Py_ssize_t left, right, mysize, argsize; + Py_ssize_t right, mysize, argsize; void *myptr, *argptr; PyObject *arg = Py_None; Py_buffer varg; @@ -2457,11 +2457,10 @@ } myptr = self->ob_bytes; mysize = Py_SIZE(self); - left = 0; right = rstrip_helper(myptr, mysize, argptr, argsize); if (arg != Py_None) PyBuffer_Release(&varg); - return PyByteArray_FromStringAndSize(self->ob_bytes + left, right - left); + return PyByteArray_FromStringAndSize(self->ob_bytes, right); } PyDoc_STRVAR(decode_doc, Modified: python/branches/py3k/Objects/fileobject.c ============================================================================== --- python/branches/py3k/Objects/fileobject.c (original) +++ python/branches/py3k/Objects/fileobject.c Tue Feb 22 21:15:44 2011 @@ -297,8 +297,8 @@ *p++ = c; if (c == '\n') break; } - if ( c == EOF && skipnextlf ) - newlinetypes |= NEWLINE_CR; + /* if ( c == EOF && skipnextlf ) + 
newlinetypes |= NEWLINE_CR; */ FUNLOCKFILE(stream); *p = '\0'; if ( skipnextlf ) { Modified: python/branches/py3k/Objects/floatobject.c ============================================================================== --- python/branches/py3k/Objects/floatobject.c (original) +++ python/branches/py3k/Objects/floatobject.c Tue Feb 22 21:15:44 2011 @@ -197,7 +197,6 @@ Py_DECREF(s_buffer); return NULL; } - last = s + len; } else if (PyObject_AsCharBuffer(v, &s, &len)) { PyErr_SetString(PyExc_TypeError, @@ -2246,7 +2245,7 @@ /* Eighth byte */ *p = flo & 0xFF; - p += incr; + /* p += incr; */ /* Done */ return 0; Modified: python/branches/py3k/Objects/obmalloc.c ============================================================================== --- python/branches/py3k/Objects/obmalloc.c (original) +++ python/branches/py3k/Objects/obmalloc.c Tue Feb 22 21:15:44 2011 @@ -1775,7 +1775,6 @@ * will be living in full pools -- would be a shame to miss them. */ for (i = 0; i < maxarenas; ++i) { - uint poolsinarena; uint j; uptr base = arenas[i].address; @@ -1784,7 +1783,6 @@ continue; narenas += 1; - poolsinarena = arenas[i].ntotalpools; numfreepools += arenas[i].nfreepools; /* round up to pool alignment */ Modified: python/branches/py3k/Objects/tupleobject.c ============================================================================== --- python/branches/py3k/Objects/tupleobject.c (original) +++ python/branches/py3k/Objects/tupleobject.c Tue Feb 22 21:15:44 2011 @@ -86,7 +86,7 @@ { return PyErr_NoMemory(); } - nbytes += sizeof(PyTupleObject) - sizeof(PyObject *); + /* nbytes += sizeof(PyTupleObject) - sizeof(PyObject *); */ op = PyObject_GC_NewVar(PyTupleObject, &PyTuple_Type, size); if (op == NULL) Modified: python/branches/py3k/Objects/typeobject.c ============================================================================== --- python/branches/py3k/Objects/typeobject.c (original) +++ python/branches/py3k/Objects/typeobject.c Tue Feb 22 21:15:44 2011 @@ -902,7 +902,7 @@ /* Find the nearest base with a different tp_dealloc */ base = type; - while ((basedealloc = base->tp_dealloc) == subtype_dealloc) { + while ((/*basedealloc =*/ base->tp_dealloc) == subtype_dealloc) { base = base->tp_base; assert(base); } Modified: python/branches/py3k/Objects/unicodeobject.c ============================================================================== --- python/branches/py3k/Objects/unicodeobject.c (original) +++ python/branches/py3k/Objects/unicodeobject.c Tue Feb 22 21:15:44 2011 @@ -5418,7 +5418,6 @@ if (!result) return NULL; for (i = 0; i < 256; i++) { - key = value = NULL; key = PyLong_FromLong(decode[i]); value = PyLong_FromLong(i); if (!key || !value) @@ -6935,7 +6934,7 @@ } } else { - Py_ssize_t n, i, j, e; + Py_ssize_t n, i, j; Py_ssize_t product, new_size, delta; Py_UNICODE *p; @@ -6967,7 +6966,6 @@ return NULL; i = 0; p = u->str; - e = self->length - str1->length; if (str1->length > 0) { while (n-- > 0) { /* look for next match */ Modified: python/branches/py3k/Objects/weakrefobject.c ============================================================================== --- python/branches/py3k/Objects/weakrefobject.c (original) +++ python/branches/py3k/Objects/weakrefobject.c Tue Feb 22 21:15:44 2011 @@ -168,13 +168,20 @@ PyErr_Clear(); else if (PyUnicode_Check(nameobj)) name = _PyUnicode_AsString(nameobj); - PyOS_snprintf(buffer, sizeof(buffer), - name ? 
"" - : "", - self, - Py_TYPE(PyWeakref_GET_OBJECT(self))->tp_name, - PyWeakref_GET_OBJECT(self), - name); + if (name) + PyOS_snprintf(buffer, sizeof(buffer), + "", + self, + Py_TYPE(PyWeakref_GET_OBJECT(self))->tp_name, + PyWeakref_GET_OBJECT(self), + name); + else + PyOS_snprintf(buffer, sizeof(buffer), + "", + self, + Py_TYPE(PyWeakref_GET_OBJECT(self))->tp_name, + PyWeakref_GET_OBJECT(self)); + Py_XDECREF(nameobj); } return PyUnicode_FromString(buffer); Modified: python/branches/py3k/Parser/parsetok.c ============================================================================== --- python/branches/py3k/Parser/parsetok.c (original) +++ python/branches/py3k/Parser/parsetok.c Tue Feb 22 21:15:44 2011 @@ -127,7 +127,7 @@ { parser_state *ps; node *n; - int started = 0, handling_import = 0, handling_with = 0; + int started = 0; if ((ps = PyParser_New(g, start)) == NULL) { fprintf(stderr, "no mem for new parser\n"); @@ -154,7 +154,6 @@ } if (type == ENDMARKER && started) { type = NEWLINE; /* Add an extra newline */ - handling_with = handling_import = 0; started = 0; /* Add the right number of dedent tokens, except if a certain flag is given -- Modified: python/branches/py3k/Parser/pgenmain.c ============================================================================== --- python/branches/py3k/Parser/pgenmain.c (original) +++ python/branches/py3k/Parser/pgenmain.c Tue Feb 22 21:15:44 2011 @@ -29,6 +29,8 @@ /* Forward */ grammar *getgrammar(char *filename); +void Py_Exit(int) _Py_NO_RETURN; + void Py_Exit(int sts) { Modified: python/branches/py3k/Python/ast.c ============================================================================== --- python/branches/py3k/Python/ast.c (original) +++ python/branches/py3k/Python/ast.c Tue Feb 22 21:15:44 2011 @@ -3231,7 +3231,6 @@ const char *end; if (encoding == NULL) { - buf = (char *)s; u = NULL; } else { /* check for integer overflow */ Modified: python/branches/py3k/Python/bltinmodule.c ============================================================================== --- python/branches/py3k/Python/bltinmodule.c (original) +++ python/branches/py3k/Python/bltinmodule.c Tue Feb 22 21:15:44 2011 @@ -37,7 +37,7 @@ { PyObject *func, *name, *bases, *mkw, *meta, *prep, *ns, *cell; PyObject *cls = NULL; - Py_ssize_t nargs, nbases; + Py_ssize_t nargs; assert(args != NULL); if (!PyTuple_Check(args)) { @@ -61,7 +61,6 @@ bases = PyTuple_GetSlice(args, 2, nargs); if (bases == NULL) return NULL; - nbases = nargs - 2; if (kwds == NULL) { meta = NULL; @@ -766,7 +765,6 @@ { PyObject *v; PyObject *prog, *globals = Py_None, *locals = Py_None; - int plain = 0; if (!PyArg_UnpackTuple(args, "exec", 1, 3, &prog, &globals, &locals)) return NULL; @@ -775,7 +773,6 @@ globals = PyEval_GetGlobals(); if (locals == Py_None) { locals = PyEval_GetLocals(); - plain = 1; } if (!globals || !locals) { PyErr_SetString(PyExc_SystemError, Modified: python/branches/py3k/Python/ceval.c ============================================================================== --- python/branches/py3k/Python/ceval.c (original) +++ python/branches/py3k/Python/ceval.c Tue Feb 22 21:15:44 2011 @@ -2690,7 +2690,7 @@ Py_DECREF(*pfunc); *pfunc = self; na++; - n++; + /* n++; */ } else Py_INCREF(func); sp = stack_pointer; @@ -3026,7 +3026,7 @@ PyTrace_RETURN, retval)) { Py_XDECREF(retval); retval = NULL; - why = WHY_EXCEPTION; + /* why = WHY_EXCEPTION; */ } } } Modified: python/branches/py3k/Python/dtoa.c ============================================================================== --- 
python/branches/py3k/Python/dtoa.c (original) +++ python/branches/py3k/Python/dtoa.c Tue Feb 22 21:15:44 2011 @@ -2055,7 +2055,7 @@ + Exp_msk1 ; word1(&rv) = 0; - dsign = 0; + /* dsign = 0; */ break; } } @@ -2092,7 +2092,7 @@ goto undfl; } } - dsign = 1 - dsign; + /* dsign = 1 - dsign; */ break; } if ((aadj = ratio(delta, bs)) <= 2.) { Modified: python/branches/py3k/Python/getargs.c ============================================================================== --- python/branches/py3k/Python/getargs.c (original) +++ python/branches/py3k/Python/getargs.c Tue Feb 22 21:15:44 2011 @@ -966,9 +966,10 @@ case 'u': /* raw unicode buffer (Py_UNICODE *) */ case 'Z': /* raw unicode buffer or None */ { + Py_UNICODE **p = va_arg(*p_va, Py_UNICODE **); + if (*format == '#') { /* any buffer-like object */ /* "s#" or "Z#" */ - Py_UNICODE **p = va_arg(*p_va, Py_UNICODE **); FETCH_SIZE; if (c == 'Z' && arg == Py_None) { @@ -984,8 +985,6 @@ format++; } else { /* "s" or "Z" */ - Py_UNICODE **p = va_arg(*p_va, Py_UNICODE **); - if (c == 'Z' && arg == Py_None) *p = NULL; else if (PyUnicode_Check(arg)) { Modified: python/branches/py3k/Python/pystrtod.c ============================================================================== --- python/branches/py3k/Python/pystrtod.c (original) +++ python/branches/py3k/Python/pystrtod.c Tue Feb 22 21:15:44 2011 @@ -954,7 +954,7 @@ /* shouldn't get here: Gay's code should always return something starting with a digit, an 'I', or 'N' */ strncpy(p, "ERR", 3); - p += 3; + /* p += 3; */ assert(0); } goto exit; Modified: python/branches/py3k/Python/sysmodule.c ============================================================================== --- python/branches/py3k/Python/sysmodule.c (original) +++ python/branches/py3k/Python/sysmodule.c Tue Feb 22 21:15:44 2011 @@ -1176,7 +1176,6 @@ PyObject *opts; PyObject *name = NULL, *value = NULL; const wchar_t *name_end; - int r; opts = get_xoptions(); if (opts == NULL) @@ -1194,7 +1193,7 @@ } if (name == NULL || value == NULL) goto error; - r = PyDict_SetItem(opts, name, value); + PyDict_SetItem(opts, name, value); Py_DECREF(name); Py_DECREF(value); return; From python-checkins at python.org Tue Feb 22 21:17:14 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 21:17:14 +0100 (CET) Subject: [Python-checkins] r88507 - python/branches/py3k/Lib/test/test_crypt.py Message-ID: <20110222201714.81316EE9C8@mail.python.org> Author: brett.cannon Date: Tue Feb 22 21:17:14 2011 New Revision: 88507 Log: Fix test.test_crypt.test_methods() to pass on OS X. Modified: python/branches/py3k/Lib/test/test_crypt.py Modified: python/branches/py3k/Lib/test/test_crypt.py ============================================================================== --- python/branches/py3k/Lib/test/test_crypt.py (original) +++ python/branches/py3k/Lib/test/test_crypt.py Tue Feb 22 21:17:14 2011 @@ -25,7 +25,10 @@ self.assertEqual(len(pw), method.total_size) def test_methods(self): - self.assertTrue(len(crypt.methods()) > 1) + # Gurantee that METHOD_CRYPT is the last method in crypt.methods(). 
+ methods = crypt.methods() + self.assertTrue(len(methods) >= 1) + self.assertEqual(crypt.METHOD_CRYPT, methods[-1]) def test_main(): support.run_unittest(CryptTestCase) From python-checkins at python.org Tue Feb 22 21:17:24 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 21:17:24 +0100 (CET) Subject: [Python-checkins] r88508 - python/branches/release32-maint Message-ID: <20110222201724.E290CEE9CE@mail.python.org> Author: brett.cannon Date: Tue Feb 22 21:17:24 2011 New Revision: 88508 Log: Blocked revisions 88506 via svnmerge ........ r88506 | brett.cannon | 2011-02-22 12:15:44 -0800 (Tue, 22 Feb 2011) | 2 lines Issue #8914: fix various warnings from the Clang static analyzer v254. ........ Modified: python/branches/release32-maint/ (props changed) From python-checkins at python.org Tue Feb 22 21:18:11 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 21:18:11 +0100 (CET) Subject: [Python-checkins] r88509 - python/branches/release32-maint Message-ID: <20110222201811.60EDDEE9DE@mail.python.org> Author: brett.cannon Date: Tue Feb 22 21:18:11 2011 New Revision: 88509 Log: Blocked revisions 88507 via svnmerge ........ r88507 | brett.cannon | 2011-02-22 12:17:14 -0800 (Tue, 22 Feb 2011) | 1 line Fix test.test_crypt.test_methods() to pass on OS X. ........ Modified: python/branches/release32-maint/ (props changed) From python-checkins at python.org Tue Feb 22 21:29:49 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 21:29:49 +0100 (CET) Subject: [Python-checkins] r88510 - python/branches/issue10276-snowleopard/Lib/test/test_zlib.py Message-ID: <20110222202949.9033EEE9AD@mail.python.org> Author: antoine.pitrou Date: Tue Feb 22 21:29:49 2011 New Revision: 88510 Log: Another try at fixing /working around Snow Leopard crash Modified: python/branches/issue10276-snowleopard/Lib/test/test_zlib.py Modified: python/branches/issue10276-snowleopard/Lib/test/test_zlib.py ============================================================================== --- python/branches/issue10276-snowleopard/Lib/test/test_zlib.py (original) +++ python/branches/issue10276-snowleopard/Lib/test/test_zlib.py Tue Feb 22 21:29:49 2011 @@ -70,7 +70,7 @@ with open(support.TESTFN, "wb+") as f: f.seek(_4G) f.write(b"asdf") - f.flush() + with open(support.TESTFN, "rb") as f: self.mapping = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) def tearDown(self): From python-checkins at python.org Tue Feb 22 22:42:56 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 22:42:56 +0100 (CET) Subject: [Python-checkins] r88511 - python/branches/py3k/Lib/test/test_zlib.py Message-ID: <20110222214256.C6F43EE98C@mail.python.org> Author: antoine.pitrou Date: Tue Feb 22 22:42:56 2011 New Revision: 88511 Log: Issue #11277: finally fix Snow Leopard crash following r88460. 
(probably an OS-related issue with mmap) Modified: python/branches/py3k/Lib/test/test_zlib.py Modified: python/branches/py3k/Lib/test/test_zlib.py ============================================================================== --- python/branches/py3k/Lib/test/test_zlib.py (original) +++ python/branches/py3k/Lib/test/test_zlib.py Tue Feb 22 22:42:56 2011 @@ -70,7 +70,7 @@ with open(support.TESTFN, "wb+") as f: f.seek(_4G) f.write(b"asdf") - f.flush() + with open(support.TESTFN, "rb") as f: self.mapping = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) def tearDown(self): From python-checkins at python.org Tue Feb 22 22:48:06 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 22:48:06 +0100 (CET) Subject: [Python-checkins] r88512 - in python/branches/py3k: Doc/library/crypt.rst Lib/crypt.py Lib/test/test_crypt.py Message-ID: <20110222214806.BC0A8EE98C@mail.python.org> Author: brett.cannon Date: Tue Feb 22 22:48:06 2011 New Revision: 88512 Log: Make Lib/crypt.py meet PEP 8 standards. This also led to a tweak in the new API by making methods() into a module attribute as it is statically calculated. Modified: python/branches/py3k/Doc/library/crypt.rst python/branches/py3k/Lib/crypt.py python/branches/py3k/Lib/test/test_crypt.py Modified: python/branches/py3k/Doc/library/crypt.rst ============================================================================== --- python/branches/py3k/Doc/library/crypt.rst (original) +++ python/branches/py3k/Doc/library/crypt.rst Tue Feb 22 22:48:06 2011 @@ -60,6 +60,20 @@ .. versionadded:: 3.3 + +Module Attributes +----------------- + + +.. attribute:: methods + + A list of available password hashing algorithms, as + ``crypt.METHOD_*`` objects. This list is sorted from strongest to + weakest, and is guaranteed to have at least ``crypt.METHOD_CRYPT``. + +.. versionadded:: 3.3 + + Module Functions ---------------- @@ -98,13 +112,6 @@ Before version 3.3, *salt* must be specified as a string and cannot accept ``crypt.METHOD_*`` values (which don't exist anyway). -.. function:: methods() - - Return a list of available password hashing algorithms, as - ``crypt.METHOD_*`` objects. This list is sorted from strongest to - weakest, and is guaranteed to have at least ``crypt.METHOD_CRYPT``. - -.. versionadded:: 3.3 .. function:: mksalt(method=None) Modified: python/branches/py3k/Lib/crypt.py ============================================================================== --- python/branches/py3k/Lib/crypt.py (original) +++ python/branches/py3k/Lib/crypt.py Tue Feb 22 22:48:06 2011 @@ -1,61 +1,57 @@ -'''Wrapper to the POSIX crypt library call and associated functionality. 
-''' +"""Wrapper to the POSIX crypt library call and associated functionality.""" import _crypt +import string +from random import choice +from collections import namedtuple -saltchars = 'abcdefghijklmnopqrstuvwxyz' -saltchars += saltchars.upper() -saltchars += '0123456789./' +_saltchars = string.ascii_letters + string.digits + './' -class _MethodClass: - '''Class representing a salt method per the Modular Crypt Format or the - legacy 2-character crypt method.''' - def __init__(self, name, ident, salt_chars, total_size): - self.name = name - self.ident = ident - self.salt_chars = salt_chars - self.total_size = total_size + +class _Method(namedtuple('_Method', 'name ident salt_chars total_size')): + + """Class representing a salt method per the Modular Crypt Format or the + legacy 2-character crypt method.""" def __repr__(self): - return '' % self.name + return ''.format(self.name) -# available salting/crypto methods -METHOD_CRYPT = _MethodClass('CRYPT', None, 2, 13) -METHOD_MD5 = _MethodClass('MD5', '1', 8, 34) -METHOD_SHA256 = _MethodClass('SHA256', '5', 16, 63) -METHOD_SHA512 = _MethodClass('SHA512', '6', 16, 106) - - -def methods(): - '''Return a list of methods that are available in the platform ``crypt()`` - library, sorted from strongest to weakest. This is guaranteed to always - return at least ``[METHOD_CRYPT]``''' - method_list = [ METHOD_SHA512, METHOD_SHA256, METHOD_MD5 ] - ret = [ method for method in method_list - if len(crypt('', method)) == method.total_size ] - ret.append(METHOD_CRYPT) - return ret - - -def mksalt(method = None): - '''Generate a salt for the specified method. If not specified, the - strongest available method will be used.''' - import random - - if method == None: method = methods()[0] - s = '$%s$' % method.ident if method.ident else '' - s += ''.join([ random.choice(saltchars) for x in range(method.salt_chars) ]) - return(s) - - -def crypt(word, salt = None): - '''Return a string representing the one-way hash of a password, preturbed - by a salt. If ``salt`` is not specified or is ``None``, the strongest + +def mksalt(method=None): + """Generate a salt for the specified method. + + If not specified, the strongest available method will be used. + + """ + if method is None: + method = methods[0] + s = '${}$'.format(method.ident) if method.ident else '' + s += ''.join(choice(_saltchars) for _ in range(method.salt_chars)) + return s + + +def crypt(word, salt=None): + """Return a string representing the one-way hash of a password, with a salt + prepended. + + If ``salt`` is not specified or is ``None``, the strongest available method will be selected and a salt generated. Otherwise, ``salt`` may be one of the ``crypt.METHOD_*`` values, or a string as - returned by ``crypt.mksalt()``.''' - if salt == None: salt = mksalt() - elif isinstance(salt, _MethodClass): salt = mksalt(salt) - return(_crypt.crypt(word, salt)) + returned by ``crypt.mksalt()``. 
+ + """ + if salt is None or isinstance(salt, _Method): + salt = mksalt(salt) + return _crypt.crypt(word, salt) + + +# available salting/crypto methods +METHOD_CRYPT = _Method('CRYPT', None, 2, 13) +METHOD_MD5 = _Method('MD5', '1', 8, 34) +METHOD_SHA256 = _Method('SHA256', '5', 16, 63) +METHOD_SHA512 = _Method('SHA512', '6', 16, 106) + +methods = [METHOD_SHA512, METHOD_SHA256, METHOD_MD5, METHOD_CRYPT] +methods[:-1] = [m for m in methods[:-1] if len(crypt('', m)) == m.total_size] Modified: python/branches/py3k/Lib/test/test_crypt.py ============================================================================== --- python/branches/py3k/Lib/test/test_crypt.py (original) +++ python/branches/py3k/Lib/test/test_crypt.py Tue Feb 22 22:48:06 2011 @@ -11,24 +11,23 @@ print('Test encryption: ', c) def test_salt(self): - self.assertEqual(len(crypt.saltchars), 64) - for method in crypt.methods(): + self.assertEqual(len(crypt._saltchars), 64) + for method in crypt.methods: salt = crypt.mksalt(method) self.assertEqual(len(salt), method.salt_chars + (3 if method.ident else 0)) def test_saltedcrypt(self): - for method in crypt.methods(): + for method in crypt.methods: pw = crypt.crypt('assword', method) self.assertEqual(len(pw), method.total_size) pw = crypt.crypt('assword', crypt.mksalt(method)) self.assertEqual(len(pw), method.total_size) def test_methods(self): - # Gurantee that METHOD_CRYPT is the last method in crypt.methods(). - methods = crypt.methods() - self.assertTrue(len(methods) >= 1) - self.assertEqual(crypt.METHOD_CRYPT, methods[-1]) + # Gurantee that METHOD_CRYPT is the last method in crypt.methods. + self.assertTrue(len(crypt.methods) >= 1) + self.assertEqual(crypt.METHOD_CRYPT, crypt.methods[-1]) def test_main(): support.run_unittest(CryptTestCase) From python-checkins at python.org Tue Feb 22 22:55:52 2011 From: python-checkins at python.org (brett.cannon) Date: Tue, 22 Feb 2011 22:55:52 +0100 (CET) Subject: [Python-checkins] r88513 - python/branches/py3k/Lib/crypt.py Message-ID: <20110222215552.09955EE98A@mail.python.org> Author: brett.cannon Date: Tue Feb 22 22:55:51 2011 New Revision: 88513 Log: A crypt algorithm may not be available by returning None. Modified: python/branches/py3k/Lib/crypt.py Modified: python/branches/py3k/Lib/crypt.py ============================================================================== --- python/branches/py3k/Lib/crypt.py (original) +++ python/branches/py3k/Lib/crypt.py Tue Feb 22 22:55:51 2011 @@ -53,5 +53,10 @@ METHOD_SHA256 = _Method('SHA256', '5', 16, 63) METHOD_SHA512 = _Method('SHA512', '6', 16, 106) -methods = [METHOD_SHA512, METHOD_SHA256, METHOD_MD5, METHOD_CRYPT] -methods[:-1] = [m for m in methods[:-1] if len(crypt('', m)) == m.total_size] +methods = [] +for _method in (METHOD_SHA512, METHOD_SHA256, METHOD_MD5): + _result = crypt('', _method) + if _result and len(_result) == _method.total_size: + methods.append(_method) +methods.append(METHOD_CRYPT) +del _result, _method From python-checkins at python.org Tue Feb 22 23:36:07 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 23:36:07 +0100 (CET) Subject: [Python-checkins] r88514 - in python/branches/release32-maint: Lib/test/test_zlib.py Message-ID: <20110222223607.5C47CEE988@mail.python.org> Author: antoine.pitrou Date: Tue Feb 22 23:36:07 2011 New Revision: 88514 Log: Merged revisions 88511 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88511 | antoine.pitrou | 2011-02-22 22:42:56 +0100 (mar., 22 f?vr. 
2011) | 4 lines Issue #11277: finally fix Snow Leopard crash following r88460. (probably an OS-related issue with mmap) ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_zlib.py Modified: python/branches/release32-maint/Lib/test/test_zlib.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_zlib.py (original) +++ python/branches/release32-maint/Lib/test/test_zlib.py Tue Feb 22 23:36:07 2011 @@ -70,7 +70,7 @@ with open(support.TESTFN, "wb+") as f: f.seek(_4G) f.write(b"asdf") - f.flush() + with open(support.TESTFN, "rb") as f: self.mapping = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) def tearDown(self): From python-checkins at python.org Tue Feb 22 23:44:15 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 23:44:15 +0100 Subject: [Python-checkins] pymigr: Add 3.2 to branchmap Message-ID: antoine.pitrou pushed c4bf3feb1136 to pymigr: http://hg.python.org/pymigr/rev/c4bf3feb1136 changeset: 77:c4bf3feb1136 user: Antoine Pitrou date: Tue Feb 22 23:32:23 2011 +0100 summary: Add 3.2 to branchmap files: branchmap.txt diff --git a/branchmap.txt b/branchmap.txt --- a/branchmap.txt +++ b/branchmap.txt @@ -1,5 +1,6 @@ py3k = 3.x p3yk = 3.x +release32-maint = 3.2.x release31-maint = 3.1.x release30-maint = 3.0.x release27-maint = 2.7.x -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Tue Feb 22 23:44:15 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 23:44:15 +0100 Subject: [Python-checkins] pymigr: Add docstring to authors.py Message-ID: antoine.pitrou pushed 246b74f0a32e to pymigr: http://hg.python.org/pymigr/rev/246b74f0a32e changeset: 79:246b74f0a32e tag: tip user: Antoine Pitrou date: Tue Feb 22 23:44:11 2011 +0100 summary: Add docstring to authors.py files: authors.py diff --git a/authors.py b/authors.py old mode 100644 new mode 100755 --- a/authors.py +++ b/authors.py @@ -1,3 +1,12 @@ +#!/usr/bin/env python +""" +Find the authors not yet in the author-map file. +First argument must be the filesystem path (not URL!) to the Subversion repo +(not hg clone!). 
Example: +$ ./authors.py ../python-svn +""" + + import util, sys MAP = 'author-map' @@ -10,7 +19,6 @@ if 'svn:log' in props and 'cvs2svn' in props['svn:log']: author = 'cvs2svn' else: - print rev, props continue data.add(author) return data -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Tue Feb 22 23:44:15 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 23:44:15 +0100 Subject: [Python-checkins] pymigr: Fix indentation of Python scripts Message-ID: antoine.pitrou pushed eaa307f60200 to pymigr: http://hg.python.org/pymigr/rev/eaa307f60200 changeset: 78:eaa307f60200 user: Antoine Pitrou date: Tue Feb 22 23:41:08 2011 +0100 summary: Fix indentation of Python scripts files: authors.py binfo.py findlarge.py findmerges.py query.py revprops.py senders.py split.py util.py diff --git a/authors.py b/authors.py --- a/authors.py +++ b/authors.py @@ -3,24 +3,24 @@ MAP = 'author-map' def findauthors(repo): - data = set() - for rev, props in util.walkrevs(repo): - author = props.get('svn:author') - if author is None: - if 'svn:log' in props and 'cvs2svn' in props['svn:log']: - author = 'cvs2svn' - else: - print rev, props - continue - data.add(author) - return data + data = set() + for rev, props in util.walkrevs(repo): + author = props.get('svn:author') + if author is None: + if 'svn:log' in props and 'cvs2svn' in props['svn:log']: + author = 'cvs2svn' + else: + print rev, props + continue + data.add(author) + return data def parse(): - f = open(MAP) - mapped = (i.split('=') for i in f) - return dict((k.strip(), v.strip()) for (k, v) in mapped) + f = open(MAP) + mapped = (i.split('=') for i in f) + return dict((k.strip(), v.strip()) for (k, v) in mapped) if __name__ == '__main__': - authors = findauthors(sys.argv[1]) - map = parse() - print authors - set(map) + authors = findauthors(sys.argv[1]) + map = parse() + print authors - set(map) diff --git a/binfo.py b/binfo.py --- a/binfo.py +++ b/binfo.py @@ -2,44 +2,43 @@ import sys def svnrev(ctx): - extra = ctx.extra() - if 'convert_revision' in extra: - return int(extra['convert_revision'].rsplit('@')[1]) - return None + extra = ctx.extra() + if 'convert_revision' in extra: + return int(extra['convert_revision'].rsplit('@')[1]) + return None def find(repo, branch): - - first = None - last = None - number = 0 - closed = False - mentioned = [] - - for i in repo: - ctx = repo[i] - if first and branch in ctx.description() and ctx.branch() != branch: - line = ctx.description().split('\n')[0] - mentioned.append((ctx.rev(), svnrev(ctx), line)) - if ctx.branch() != branch: - continue - if first is None: - first = ctx.rev(), svnrev(ctx) - if svnrev(ctx): - last = ctx.rev(), svnrev(ctx) - if 'close' in ctx.extra(): - pre = svnrev(repo[i - 1]) - closed = ctx.rev(), int(pre) + 1 if pre else None - number += 1 - - return first, last, number, closed, mentioned + first = None + last = None + number = 0 + closed = False + mentioned = [] + + for i in repo: + ctx = repo[i] + if first and branch in ctx.description() and ctx.branch() != branch: + line = ctx.description().split('\n')[0] + mentioned.append((ctx.rev(), svnrev(ctx), line)) + if ctx.branch() != branch: + continue + if first is None: + first = ctx.rev(), svnrev(ctx) + if svnrev(ctx): + last = ctx.rev(), svnrev(ctx) + if 'close' in ctx.extra(): + pre = svnrev(repo[i - 1]) + closed = ctx.rev(), int(pre) + 1 if pre else None + number += 1 + + return first, last, number, closed, mentioned if __name__ == '__main__': - repo = hg.repository(ui.ui(), 
sys.argv[1]) - first, last, number, closed, mentioned = find(repo, sys.argv[2]) - print 'First:', first - print 'Last:', last - print 'Closed:', closed - print 'Number:', number - for rev, srev, desc in mentioned: - if srev > first[1]: - print rev, srev, desc + repo = hg.repository(ui.ui(), sys.argv[1]) + first, last, number, closed, mentioned = find(repo, sys.argv[2]) + print 'First:', first + print 'Last:', last + print 'Closed:', closed + print 'Number:', number + for rev, srev, desc in mentioned: + if srev > first[1]: + print rev, srev, desc diff --git a/findlarge.py b/findlarge.py --- a/findlarge.py +++ b/findlarge.py @@ -2,9 +2,9 @@ files = [] for dir, dirs, fns in os.walk('data'): - for fn in fns: - fn = os.path.join(dir, fn) - files.append((os.path.getsize(fn), fn)) + for fn in fns: + fn = os.path.join(dir, fn) + files.append((os.path.getsize(fn), fn)) for s, fn in sorted(files, reverse=True)[0:20]: - print s, fn + print s, fn diff --git a/findmerges.py b/findmerges.py --- a/findmerges.py +++ b/findmerges.py @@ -4,28 +4,28 @@ IGNORE = set(('default', 'None', 'empty', 'py3k', 'ssize_t', 'SourceForge')) def ctxs(repo): - for i in repo: - yield repo[i] + for i in repo: + yield repo[i] def check(repo): - branches = set(repo.branchtags()) - branches -= IGNORE - for ctx in ctxs(repo): - desc = ctx.description() - if ctx.branch() != 'default': - continue - if 'cvs2svn' in desc: - continue - if desc.startswith('Merged revisions'): - continue - for b in branches: - if b.endswith('-maint'): - continue - if b in desc: - if 'convert_revision' not in ctx.extra(): - print 'WEIRD', ctx, ctx.extra() - else: - print ctx.extra()['convert_revision'].rsplit('@')[1], b, desc.split('\n')[0] + branches = set(repo.branchtags()) + branches -= IGNORE + for ctx in ctxs(repo): + desc = ctx.description() + if ctx.branch() != 'default': + continue + if 'cvs2svn' in desc: + continue + if desc.startswith('Merged revisions'): + continue + for b in branches: + if b.endswith('-maint'): + continue + if b in desc: + if 'convert_revision' not in ctx.extra(): + print 'WEIRD', ctx, ctx.extra() + else: + print ctx.extra()['convert_revision'].rsplit('@')[1], b, desc.split('\n')[0] if __name__ == '__main__': - check(hg.repository(ui.ui(), sys.argv[1])) + check(hg.repository(ui.ui(), sys.argv[1])) diff --git a/query.py b/query.py --- a/query.py +++ b/query.py @@ -1,12 +1,12 @@ import util, sys def authorrevs(repo, author): - for rev, props in util.walkrevs(repo): - if 'svn:author' in props and props['svn:author'].startswith(author): - print '%i:' % rev - for k, v in props.iteritems(): - print '%s: %s' % (k, v) - print '=' * 30 + for rev, props in util.walkrevs(repo): + if 'svn:author' in props and props['svn:author'].startswith(author): + print '%i:' % rev + for k, v in props.iteritems(): + print '%s: %s' % (k, v) + print '=' * 30 if __name__ == '__main__': - authorrevs(sys.argv[1], sys.argv[2]) + authorrevs(sys.argv[1], sys.argv[2]) diff --git a/revprops.py b/revprops.py --- a/revprops.py +++ b/revprops.py @@ -1,14 +1,14 @@ import util, sys def get(repo): - freq = {} - for rev, props in util.walkrevs(repo): - for k, v in props.iteritems(): - cur = freq.get(k, (0, None)) - freq[k] = cur[0] + 1, rev - return freq + freq = {} + for rev, props in util.walkrevs(repo): + for k, v in props.iteritems(): + cur = freq.get(k, (0, None)) + freq[k] = cur[0] + 1, rev + return freq if __name__ == '__main__': - freq = get(sys.argv[1]) - for k, v in freq.iteritems(): - print '%s (%s): %s' % (k, v[1], v[0]) + freq = get(sys.argv[1]) + for k, 
v in freq.iteritems(): + print '%s (%s): %s' % (k, v[1], v[0]) diff --git a/senders.py b/senders.py --- a/senders.py +++ b/senders.py @@ -6,89 +6,89 @@ simple2 = re.compile('"(.*?)" \<([\w@\.-]+)\>') def autodecode(s): - try: - encoding = chardet.detect(s)['encoding'] - if encoding is not None: - return s.decode(s, encoding) - else: - return s.decode('ascii') - except LookupError: - return s.decode('latin1') + try: + encoding = chardet.detect(s)['encoding'] + if encoding is not None: + return s.decode(s, encoding) + else: + return s.decode('ascii') + except LookupError: + return s.decode('latin1') def parse(dir): - people = {} - for fn in sorted(os.listdir(dir)): - - print 'FILE', fn - - fn = os.path.join(dir, fn) - for ln in open(fn): - if not ln.startswith('From:'): - continue - ln = ln[5:].strip() - - m = simple.match(ln) - if m: - email, name = m.groups() - name = autodecode(name) - people.setdefault(name, []).append(email.lower()) - continue - - m = obfuscated.match(ln) - if m: - user, host, name = m.groups() - email = user + '@' + host - if not name.startswith('=?'): - name = autodecode(name) - people.setdefault(name, []).append(email.lower()) - continue - - encend = name[2:].index('?', 1) - encoding = name[2:2 + encend].lower() - name = name[5 + encend:-2] - name = name.replace('=22', '') - - postfix = '' - if '?=' in name: - name, _, postfix = name.partition('?=') - - ss = "'%s'" % name.strip('=').replace('=', '\\x') - - try: - escaped = eval(ss) - name = escaped.decode(encoding) - except UnicodeDecodeError: - name = autodecode(escaped) - - name = name.replace('_', ' ') + postfix - if ' ' not in name: - continue - - people.setdefault(name, []).append(email.lower()) - continue - - m = double.match(ln) - if m: - name1, email, name2 = m.groups() - if name1 != name2: - continue - name = autodecode(name1) - people.setdefault(name, []).append(email.lower()) - continue - - m = simple2.match(ln) - if m: - name, email = m.groups() - name = autodecode(name) - people.setdefault(name, []).append(email.lower()) - continue - - return people + people = {} + for fn in sorted(os.listdir(dir)): + + print 'FILE', fn + + fn = os.path.join(dir, fn) + for ln in open(fn): + if not ln.startswith('From:'): + continue + ln = ln[5:].strip() + + m = simple.match(ln) + if m: + email, name = m.groups() + name = autodecode(name) + people.setdefault(name, []).append(email.lower()) + continue + + m = obfuscated.match(ln) + if m: + user, host, name = m.groups() + email = user + '@' + host + if not name.startswith('=?'): + name = autodecode(name) + people.setdefault(name, []).append(email.lower()) + continue + + encend = name[2:].index('?', 1) + encoding = name[2:2 + encend].lower() + name = name[5 + encend:-2] + name = name.replace('=22', '') + + postfix = '' + if '?=' in name: + name, _, postfix = name.partition('?=') + + ss = "'%s'" % name.strip('=').replace('=', '\\x') + + try: + escaped = eval(ss) + name = escaped.decode(encoding) + except UnicodeDecodeError: + name = autodecode(escaped) + + name = name.replace('_', ' ') + postfix + if ' ' not in name: + continue + + people.setdefault(name, []).append(email.lower()) + continue + + m = double.match(ln) + if m: + name1, email, name2 = m.groups() + if name1 != name2: + continue + name = autodecode(name1) + people.setdefault(name, []).append(email.lower()) + continue + + m = simple2.match(ln) + if m: + name, email = m.groups() + name = autodecode(name) + people.setdefault(name, []).append(email.lower()) + continue + + return people if __name__ == '__main__': - 
people = parse(sys.argv[1]) - print 'Writing output file...' - out = open('addresses.txt', 'w') - for k, v in sorted(people.iteritems()): - print >> out, k.encode('utf-8'), '===', v[-1] - out.close() + people = parse(sys.argv[1]) + print 'Writing output file...' + out = open('addresses.txt', 'w') + for k, v in sorted(people.iteritems()): + print >> out, k.encode('utf-8'), '===', v[-1] + out.close() diff --git a/split.py b/split.py --- a/split.py +++ b/split.py @@ -1,57 +1,56 @@ from mercurial import ui, hg def rtags(): - tags = {} - for ln in open('tagmap.txt'): - src, dst = ln.strip().split('=') - if dst.strip(): - tags[src.strip()] = dst.strip() - return tags + tags = {} + for ln in open('tagmap.txt'): + src, dst = ln.strip().split('=') + if dst.strip(): + tags[src.strip()] = dst.strip() + return tags def find(repo): - - want = rtags() - main = set() - for n in repo.heads(): - - ctx = repo[n] - path = ctx.extra()['convert_revision'].split('/', 1)[1] - path = path.rsplit('@', 1)[0] - - tags = ctx.tags() - if 'tip' in tags: - tags.remove('tip') - - if '/tags/' in path and not tags: - continue - elif '/tags/' not in path and tags: - continue - #print 'unexpected tags', tags - - if path.endswith('-maint'): - continue - - name = path.split('/')[-1] - if not tags and name not in set(('py3k', 'p3yk', 'trunk')): - #print 'clone', path, '->', path.split('/')[-1] - continue - - if path == 'python/trunk': - print 'hg pull -r%s ../python-hg' % ctx - elif path == 'python/branches/py3k': - print 'hg pull -r%s ../python-hg' % ctx - elif path == 'python/branches/p3yk': - print 'hg pull -r%s ../python-hg' % ctx - print 'hg up -C %s' % ctx.branch() - print 'hg parents' - print 'hg merge %s' % ctx - print '''hg ci -m "Close branch for tag '%s'."''' % name - else: - print 'hg pull -r%s ../python-hg' % ctx - print 'hg up -C %s' % ctx.branch() - print 'hg merge --tool internal:local %s' % ctx - print '''hg ci -m "Close branch for tag '%s'."''' % name + want = rtags() + main = set() + for n in repo.heads(): + + ctx = repo[n] + path = ctx.extra()['convert_revision'].split('/', 1)[1] + path = path.rsplit('@', 1)[0] + + tags = ctx.tags() + if 'tip' in tags: + tags.remove('tip') + + if '/tags/' in path and not tags: + continue + elif '/tags/' not in path and tags: + continue + #print 'unexpected tags', tags + + if path.endswith('-maint'): + continue + + name = path.split('/')[-1] + if not tags and name not in set(('py3k', 'p3yk', 'trunk')): + #print 'clone', path, '->', path.split('/')[-1] + continue + + if path == 'python/trunk': + print 'hg pull -r%s ../python-hg' % ctx + elif path == 'python/branches/py3k': + print 'hg pull -r%s ../python-hg' % ctx + elif path == 'python/branches/p3yk': + print 'hg pull -r%s ../python-hg' % ctx + print 'hg up -C %s' % ctx.branch() + print 'hg parents' + print 'hg merge %s' % ctx + print '''hg ci -m "Close branch for tag '%s'."''' % name + else: + print 'hg pull -r%s ../python-hg' % ctx + print 'hg up -C %s' % ctx.branch() + print 'hg merge --tool internal:local %s' % ctx + print '''hg ci -m "Close branch for tag '%s'."''' % name if __name__ == '__main__': - repo = hg.repository(ui.ui(), 'python-hg') - branches = find(repo) + repo = hg.repository(ui.ui(), 'python-hg') + branches = find(repo) diff --git a/util.py b/util.py --- a/util.py +++ b/util.py @@ -1,26 +1,26 @@ import sys, os def parse(fn): - f = open(fn, 'rb') - data = {} - while f.read(1) == 'K': - kl = int(f.readline().strip()) - kv = f.read(kl) - assert f.read(3) == '\nV ' - vl = int(f.readline().strip()) - vv = 
f.read(vl) - data[kv] = vv - f.read(1) # eat post-value newline - assert f.read() == 'ND\n' # E eaten by loop header - return data + f = open(fn, 'rb') + data = {} + while f.read(1) == 'K': + kl = int(f.readline().strip()) + kv = f.read(kl) + assert f.read(3) == '\nV ' + vl = int(f.readline().strip()) + vv = f.read(vl) + data[kv] = vv + f.read(1) # eat post-value newline + assert f.read() == 'ND\n' # E eaten by loop header + return data def walkrevs(repo): - rpdir = os.path.join(repo, 'db', 'revprops') - data = set() - for (root, dirs, files) in os.walk(rpdir): - for fn in files: - full = os.path.join(root, fn) - yield int(fn), parse(full) + rpdir = os.path.join(repo, 'db', 'revprops') + data = set() + for (root, dirs, files) in os.walk(rpdir): + for fn in files: + full = os.path.join(root, fn) + yield int(fn), parse(full) if __name__ == '__main__': - print parse(sys.argv[1]) + print parse(sys.argv[1]) -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Tue Feb 22 23:51:16 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 23:51:16 +0100 Subject: [Python-checkins] pymigr: Add new committers to author map Message-ID: antoine.pitrou pushed 1691905dd2df to pymigr: http://hg.python.org/pymigr/rev/1691905dd2df changeset: 80:1691905dd2df tag: tip user: Antoine Pitrou date: Tue Feb 22 23:51:13 2011 +0100 summary: Add new committers to author map files: author-map diff --git a/author-map b/author-map --- a/author-map +++ b/author-map @@ -57,6 +57,7 @@ edloper = Edward Loper effbot = Fredrik Lundh elguavas = Steven M. Gava +eli.bendersky = Eli Bendersky eprice = Eric Price eric.araujo = Éric Araujo eric.smith = Eric Smith @@ -72,6 +73,7 @@ fred.drake = Fred Drake fredrik.lundh = Fredrik Lundh georg.brandl = Georg Brandl +george.boutsioukis = George Boutsioukis george.yoshida = George Yoshida gerhard.haering = Gerhard Häring giampaolo.rodola = Giampaolo Rodolà
@@ -100,6 +102,7 @@ jackjansen = Jack Jansen jafo = Sean Reifschneider jean-paul.calderone = Jean-Paul Calderone +jeff.senn = Jeff Senn jeffrey.yasskin = Jeffrey Yasskin jeremy.hylton = Jeremy Hylton jeroen.ruigrok = Jeroen Ruigrok van der Werven @@ -144,6 +147,7 @@ nascheme = Neil Schemenauer ncoghlan = Nick Coghlan neal.norwitz = Neal Norwitz +ned.deily = Ned Deily neil.schemenauer = Neil Schemenauer nick.coghlan = Nick Coghlan niemeyer = Gustavo Niemeyer -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Tue Feb 22 23:53:37 2011 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 22 Feb 2011 23:53:37 +0100 Subject: [Python-checkins] pymigr: chmod +x on all scripts Message-ID: antoine.pitrou pushed 70c1b42fb8f1 to pymigr: http://hg.python.org/pymigr/rev/70c1b42fb8f1 changeset: 81:70c1b42fb8f1 tag: tip user: Antoine Pitrou date: Tue Feb 22 23:53:34 2011 +0100 summary: chmod +x on all scripts files: binfo.py findlarge.py findmerges.py query.py revprops.py rewrite.py senders.py split.py diff --git a/binfo.py b/binfo.py old mode 100644 new mode 100755 diff --git a/findlarge.py b/findlarge.py old mode 100644 new mode 100755 diff --git a/findmerges.py b/findmerges.py old mode 100644 new mode 100755 diff --git a/query.py b/query.py old mode 100644 new mode 100755 diff --git a/revprops.py b/revprops.py old mode 100644 new mode 100755 diff --git a/rewrite.py b/rewrite.py old mode 100644 new mode 100755 diff --git a/senders.py b/senders.py old mode 100644 new mode 100755 diff --git a/split.py b/split.py old mode 100644 new mode 100755 -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Wed Feb 23 00:00:33 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 23 Feb 2011 00:00:33 +0100 Subject: [Python-checkins] pymigr: Ignore svndump files Message-ID: antoine.pitrou pushed eaa98abb2409 to pymigr: http://hg.python.org/pymigr/rev/eaa98abb2409 changeset: 82:eaa98abb2409 user: Antoine Pitrou date: Tue Feb 22 23:57:23 2011 +0100 summary: Ignore svndump files files: .hgignore diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -3,3 +3,4 @@ archives/* python-svn python-hg +svndump* -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Wed Feb 23 00:00:33 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 23 Feb 2011 00:00:33 +0100 Subject: [Python-checkins] pymigr: Add docstring in convert.sh Message-ID: antoine.pitrou pushed 431033bc5ec5 to pymigr: http://hg.python.org/pymigr/rev/431033bc5ec5 changeset: 83:431033bc5ec5 tag: tip user: Antoine Pitrou date: Wed Feb 23 00:00:30 2011 +0100 summary: Add docstring in convert.sh files: convert.sh diff --git a/convert.sh b/convert.sh --- a/convert.sh +++ b/convert.sh @@ -1,3 +1,7 @@ +#!/bin/sh + +# Run this script from the pymigr working copy. The SVN repo should be +# in ./python-svn. hg init python-hg mkdir python-hg/.hg/svn -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Wed Feb 23 00:12:29 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 00:12:29 +0100 (CET) Subject: [Python-checkins] r88515 - in python/branches/py3k: PC/import_nt.c Python/import.c Message-ID: <20110222231229.231C4EE983@mail.python.org> Author: victor.stinner Date: Wed Feb 23 00:12:28 2011 New Revision: 88515 Log: Issue #3080: Mark PyWin_FindRegisteredModule() as private This function was not declared in Python public API (in any .h file) and not documented. 
Mark it as private to prepare a change of its API. Modified: python/branches/py3k/PC/import_nt.c python/branches/py3k/Python/import.c Modified: python/branches/py3k/PC/import_nt.c ============================================================================== --- python/branches/py3k/PC/import_nt.c (original) +++ python/branches/py3k/PC/import_nt.c Wed Feb 23 00:12:28 2011 @@ -15,10 +15,10 @@ /* a string loaded from the DLL at startup */ extern const char *PyWin_DLLVersionString; -FILE *PyWin_FindRegisteredModule(const char *moduleName, - struct filedescr **ppFileDesc, - char *pathBuf, - Py_ssize_t pathLen) +FILE *_PyWin_FindRegisteredModule(const char *moduleName, + struct filedescr **ppFileDesc, + char *pathBuf, + Py_ssize_t pathLen) { char *moduleKey; const char keyPrefix[] = "Software\\Python\\PythonCore\\"; Modified: python/branches/py3k/Python/import.c ============================================================================== --- python/branches/py3k/Python/import.c (original) +++ python/branches/py3k/Python/import.c Wed Feb 23 00:12:28 2011 @@ -1547,8 +1547,8 @@ pathname and an open file. Return NULL if the module is not found. */ #ifdef MS_COREDLL -extern FILE *PyWin_FindRegisteredModule(const char *, struct filedescr **, - char *, Py_ssize_t); +extern FILE *_PyWin_FindRegisteredModule(const char *, struct filedescr **, + char *, Py_ssize_t); #endif static int case_ok(char *, Py_ssize_t, Py_ssize_t, char *); @@ -1631,7 +1631,7 @@ return &fd_builtin; } #ifdef MS_COREDLL - fp = PyWin_FindRegisteredModule(name, &fdp, buf, buflen); + fp = _PyWin_FindRegisteredModule(name, &fdp, buf, buflen); if (fp != NULL) { *p_fp = fp; return fdp; From python-checkins at python.org Wed Feb 23 00:16:20 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 00:16:20 +0100 (CET) Subject: [Python-checkins] r88516 - in python/branches/py3k/Python: dynload_aix.c dynload_dl.c dynload_hpux.c dynload_next.c dynload_os2.c dynload_shlib.c dynload_win.c importdl.c Message-ID: <20110222231620.1668BEE9A5@mail.python.org> Author: victor.stinner Date: Wed Feb 23 00:16:19 2011 New Revision: 88516 Log: Issue #3080: Remove unused argument of _PyImport_GetDynLoadFunc() The first argument, fqname, was not used. 
Modified: python/branches/py3k/Python/dynload_aix.c python/branches/py3k/Python/dynload_dl.c python/branches/py3k/Python/dynload_hpux.c python/branches/py3k/Python/dynload_next.c python/branches/py3k/Python/dynload_os2.c python/branches/py3k/Python/dynload_shlib.c python/branches/py3k/Python/dynload_win.c python/branches/py3k/Python/importdl.c Modified: python/branches/py3k/Python/dynload_aix.c ============================================================================== --- python/branches/py3k/Python/dynload_aix.c (original) +++ python/branches/py3k/Python/dynload_aix.c Wed Feb 23 00:16:19 2011 @@ -154,7 +154,7 @@ } -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/py3k/Python/dynload_dl.c ============================================================================== --- python/branches/py3k/Python/dynload_dl.c (original) +++ python/branches/py3k/Python/dynload_dl.c Wed Feb 23 00:16:19 2011 @@ -16,7 +16,7 @@ }; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { char funcname[258]; Modified: python/branches/py3k/Python/dynload_hpux.c ============================================================================== --- python/branches/py3k/Python/dynload_hpux.c (original) +++ python/branches/py3k/Python/dynload_hpux.c Wed Feb 23 00:16:19 2011 @@ -19,7 +19,7 @@ {0, 0} }; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/py3k/Python/dynload_next.c ============================================================================== --- python/branches/py3k/Python/dynload_next.c (original) +++ python/branches/py3k/Python/dynload_next.c Wed Feb 23 00:16:19 2011 @@ -31,8 +31,8 @@ #define LINKOPTIONS NSLINKMODULE_OPTION_BINDNOW| \ NSLINKMODULE_OPTION_RETURN_ON_ERROR|NSLINKMODULE_OPTION_PRIVATE #endif -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, - const char *pathname, FILE *fp) +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, + const char *pathname, FILE *fp) { dl_funcptr p = NULL; char funcname[258]; Modified: python/branches/py3k/Python/dynload_os2.c ============================================================================== --- python/branches/py3k/Python/dynload_os2.c (original) +++ python/branches/py3k/Python/dynload_os2.c Wed Feb 23 00:16:19 2011 @@ -15,7 +15,7 @@ {0, 0} }; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/py3k/Python/dynload_shlib.c ============================================================================== --- python/branches/py3k/Python/dynload_shlib.c (original) +++ python/branches/py3k/Python/dynload_shlib.c Wed Feb 23 00:16:19 2011 @@ -75,7 +75,7 @@ static int nhandles = 0; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/py3k/Python/dynload_win.c ============================================================================== --- 
python/branches/py3k/Python/dynload_win.c (original) +++ python/branches/py3k/Python/dynload_win.c Wed Feb 23 00:16:19 2011 @@ -171,7 +171,7 @@ return NULL; } -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/py3k/Python/importdl.c ============================================================================== --- python/branches/py3k/Python/importdl.c (original) +++ python/branches/py3k/Python/importdl.c Wed Feb 23 00:16:19 2011 @@ -12,8 +12,7 @@ #include "importdl.h" -extern dl_funcptr _PyImport_GetDynLoadFunc(const char *name, - const char *shortname, +extern dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp); @@ -48,7 +47,7 @@ shortname = lastdot+1; } - p0 = _PyImport_GetDynLoadFunc(name, shortname, pathname, fp); + p0 = _PyImport_GetDynLoadFunc(shortname, pathname, fp); p = (PyObject*(*)(void))p0; if (PyErr_Occurred()) goto error; From python-checkins at python.org Wed Feb 23 00:30:18 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 23 Feb 2011 00:30:18 +0100 Subject: [Python-checkins] pymigr: Add recent tags Message-ID: antoine.pitrou pushed 672ac0562bed to pymigr: http://hg.python.org/pymigr/rev/672ac0562bed changeset: 84:672ac0562bed user: Antoine Pitrou date: Wed Feb 23 00:10:18 2011 +0100 summary: Add recent tags files: tagmap.txt diff --git a/tagmap.txt b/tagmap.txt --- a/tagmap.txt +++ b/tagmap.txt @@ -1,7 +1,14 @@ +r32 = 3.2 +r32rc3 = 3.2rc3 +r32rc2 = 3.2rc2 +r32rc1 = 3.2rc1 +r32b2 = 3.2b2 +r32b1 = 3.2b1 r32a4 = 3.2a4 r32a3 = 3.2a3 r32a2 = 3.2a2 r32a1 = 3.2a1 +r313 = 3.1.3 r31rc2 = 3.1rc2 r31rc1 = 3.1rc1 r31b1 = 3.1b1 @@ -26,6 +33,7 @@ r30a1 = 3.0a1 r301 = 3.0.1 r30 = 3.0 +r271 = 2.7.1 r27rc2 = 2.7rc2 r27rc1 = 2.7rc1 r27b2 = 2.7b2 -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Wed Feb 23 00:30:18 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 23 Feb 2011 00:30:18 +0100 Subject: [Python-checkins] pymigr: Add recent branches Message-ID: antoine.pitrou pushed bf92515ba872 to pymigr: http://hg.python.org/pymigr/rev/bf92515ba872 changeset: 85:bf92515ba872 user: Antoine Pitrou date: Wed Feb 23 00:15:51 2011 +0100 summary: Add recent branches files: branchmap.txt diff --git a/branchmap.txt b/branchmap.txt --- a/branchmap.txt +++ b/branchmap.txt @@ -126,6 +126,7 @@ sqlite-integration = default tim-obmalloc = default import_unicode = default +test_subprocess_10826 = 3.x py3k-signalfd-issue8407 = 3.x sslopts-4870 = 3.x py3k-dtoa = 3.x @@ -156,3 +157,7 @@ pep-382 = 3.x dmalcolm-ast-optimization-branch = 3.x pep-3151 = 3.x +import-unicode = 3.x +py3k-stat-on-windows-v2 = 3.x +py3k-futures-on-windows = 3.x +issue10276-snowleopard = 3.x -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Wed Feb 23 00:30:18 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 23 Feb 2011 00:30:18 +0100 Subject: [Python-checkins] pymigr: Add missing shebang lines Message-ID: antoine.pitrou pushed f600f5c151a0 to pymigr: http://hg.python.org/pymigr/rev/f600f5c151a0 changeset: 86:f600f5c151a0 tag: tip user: Antoine Pitrou date: Wed Feb 23 00:30:12 2011 +0100 summary: Add missing shebang lines files: binfo.py findlarge.py findmerges.py query.py revprops.py senders.py split.py diff --git a/binfo.py b/binfo.py --- a/binfo.py +++ b/binfo.py @@ -1,3 +1,5 @@ +#!/usr/bin/env python + from 
mercurial import ui, hg import sys diff --git a/findlarge.py b/findlarge.py --- a/findlarge.py +++ b/findlarge.py @@ -1,3 +1,5 @@ +#!/usr/bin/env python + import os files = [] diff --git a/findmerges.py b/findmerges.py --- a/findmerges.py +++ b/findmerges.py @@ -1,3 +1,5 @@ +#!/usr/bin/env python + from mercurial import hg, ui import sys diff --git a/query.py b/query.py --- a/query.py +++ b/query.py @@ -1,3 +1,5 @@ +#!/usr/bin/env python + import util, sys def authorrevs(repo, author): diff --git a/revprops.py b/revprops.py --- a/revprops.py +++ b/revprops.py @@ -1,3 +1,5 @@ +#!/usr/bin/env python + import util, sys def get(repo): diff --git a/senders.py b/senders.py --- a/senders.py +++ b/senders.py @@ -1,3 +1,5 @@ +#!/usr/bin/env python + import sys, os, re, chardet simple = re.compile('([\w@\.\-+]+) \((.*?)\)') diff --git a/split.py b/split.py --- a/split.py +++ b/split.py @@ -1,3 +1,5 @@ +#!/usr/bin/env python + from mercurial import ui, hg def rtags(): -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Wed Feb 23 00:38:34 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 00:38:34 +0100 (CET) Subject: [Python-checkins] r88517 - in python/branches/py3k/Include: import.h moduleobject.h pycapsule.h Message-ID: <20110222233834.3EDC2EE9F8@mail.python.org> Author: victor.stinner Date: Wed Feb 23 00:38:34 2011 New Revision: 88517 Log: Issue #3080: document encoding used by import functions Modified: python/branches/py3k/Include/import.h python/branches/py3k/Include/moduleobject.h python/branches/py3k/Include/pycapsule.h Modified: python/branches/py3k/Include/import.h ============================================================================== --- python/branches/py3k/Include/import.h (original) +++ python/branches/py3k/Include/import.h Wed Feb 23 00:38:34 2011 @@ -9,26 +9,49 @@ PyAPI_FUNC(long) PyImport_GetMagicNumber(void); PyAPI_FUNC(const char *) PyImport_GetMagicTag(void); -PyAPI_FUNC(PyObject *) PyImport_ExecCodeModule(char *name, PyObject *co); +PyAPI_FUNC(PyObject *) PyImport_ExecCodeModule( + char *name, /* UTF-8 encoded string */ + PyObject *co + ); PyAPI_FUNC(PyObject *) PyImport_ExecCodeModuleEx( - char *name, PyObject *co, char *pathname); + char *name, /* UTF-8 encoded string */ + PyObject *co, + char *pathname /* decoded from the filesystem encoding */ + ); PyAPI_FUNC(PyObject *) PyImport_ExecCodeModuleWithPathnames( - char *name, PyObject *co, char *pathname, char *cpathname); + char *name, /* UTF-8 encoded string */ + PyObject *co, + char *pathname, /* decoded from the filesystem encoding */ + char *cpathname /* decoded from the filesystem encoding */ + ); PyAPI_FUNC(PyObject *) PyImport_GetModuleDict(void); -PyAPI_FUNC(PyObject *) PyImport_AddModule(const char *name); -PyAPI_FUNC(PyObject *) PyImport_ImportModule(const char *name); -PyAPI_FUNC(PyObject *) PyImport_ImportModuleNoBlock(const char *); -PyAPI_FUNC(PyObject *) PyImport_ImportModuleLevel(char *name, - PyObject *globals, PyObject *locals, PyObject *fromlist, int level); +PyAPI_FUNC(PyObject *) PyImport_AddModule( + const char *name /* UTF-8 encoded string */ + ); +PyAPI_FUNC(PyObject *) PyImport_ImportModule( + const char *name /* UTF-8 encoded string */ + ); +PyAPI_FUNC(PyObject *) PyImport_ImportModuleNoBlock( + const char *name /* UTF-8 encoded string */ + ); +PyAPI_FUNC(PyObject *) PyImport_ImportModuleLevel( + char *name, /* UTF-8 encoded string */ + PyObject *globals, + PyObject *locals, + PyObject *fromlist, + int level + ); #define 
PyImport_ImportModuleEx(n, g, l, f) \ - PyImport_ImportModuleLevel(n, g, l, f, -1) + PyImport_ImportModuleLevel(n, g, l, f, -1) PyAPI_FUNC(PyObject *) PyImport_GetImporter(PyObject *path); PyAPI_FUNC(PyObject *) PyImport_Import(PyObject *name); PyAPI_FUNC(PyObject *) PyImport_ReloadModule(PyObject *m); PyAPI_FUNC(void) PyImport_Cleanup(void); -PyAPI_FUNC(int) PyImport_ImportFrozenModule(char *); +PyAPI_FUNC(int) PyImport_ImportFrozenModule( + char *name /* UTF-8 encoded string */ + ); #ifndef Py_LIMITED_API #ifdef WITH_THREAD @@ -41,9 +64,14 @@ PyAPI_FUNC(void) _PyImport_ReInitLock(void); -PyAPI_FUNC(PyObject *)_PyImport_FindBuiltin(char *); +PyAPI_FUNC(PyObject *)_PyImport_FindBuiltin( + char *name /* UTF-8 encoded string */ + ); PyAPI_FUNC(PyObject *)_PyImport_FindExtensionUnicode(char *, PyObject *); -PyAPI_FUNC(int)_PyImport_FixupBuiltin(PyObject*, char *); +PyAPI_FUNC(int)_PyImport_FixupBuiltin( + PyObject *mod, + char *name /* UTF-8 encoded string */ + ); PyAPI_FUNC(int)_PyImport_FixupExtensionUnicode(PyObject*, char *, PyObject *); struct _inittab { @@ -56,11 +84,14 @@ PyAPI_DATA(PyTypeObject) PyNullImporter_Type; -PyAPI_FUNC(int) PyImport_AppendInittab(const char *name, PyObject* (*initfunc)(void)); +PyAPI_FUNC(int) PyImport_AppendInittab( + const char *name, /* ASCII encoded string */ + PyObject* (*initfunc)(void) + ); #ifndef Py_LIMITED_API struct _frozen { - char *name; + char *name; /* ASCII encoded string */ unsigned char *code; int size; }; Modified: python/branches/py3k/Include/moduleobject.h ============================================================================== --- python/branches/py3k/Include/moduleobject.h (original) +++ python/branches/py3k/Include/moduleobject.h Wed Feb 23 00:38:34 2011 @@ -12,7 +12,9 @@ #define PyModule_Check(op) PyObject_TypeCheck(op, &PyModule_Type) #define PyModule_CheckExact(op) (Py_TYPE(op) == &PyModule_Type) -PyAPI_FUNC(PyObject *) PyModule_New(const char *); +PyAPI_FUNC(PyObject *) PyModule_New( + const char *name /* UTF-8 encoded string */ + ); PyAPI_FUNC(PyObject *) PyModule_GetDict(PyObject *); PyAPI_FUNC(const char *) PyModule_GetName(PyObject *); PyAPI_FUNC(const char *) PyModule_GetFilename(PyObject *); Modified: python/branches/py3k/Include/pycapsule.h ============================================================================== --- python/branches/py3k/Include/pycapsule.h (original) +++ python/branches/py3k/Include/pycapsule.h Wed Feb 23 00:38:34 2011 @@ -48,7 +48,9 @@ PyAPI_FUNC(int) PyCapsule_SetContext(PyObject *capsule, void *context); -PyAPI_FUNC(void *) PyCapsule_Import(const char *name, int no_block); +PyAPI_FUNC(void *) PyCapsule_Import( + const char *name, /* UTF-8 encoded string */ + int no_block); #ifdef __cplusplus From python-checkins at python.org Wed Feb 23 00:43:57 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 00:43:57 +0100 (CET) Subject: [Python-checkins] r88518 - in python/branches/release32-maint: Include/import.h Include/moduleobject.h Include/pycapsule.h Message-ID: <20110222234357.C0405EE9FD@mail.python.org> Author: victor.stinner Date: Wed Feb 23 00:43:57 2011 New Revision: 88518 Log: Merged revisions 88517 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88517 | victor.stinner | 2011-02-23 00:38:34 +0100 (mer., 23 f?vr. 2011) | 1 line Issue #3080: document encoding used by import functions ........ 
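A side note for readers of the encoding comments being merged here: "decoded from the filesystem encoding" is the convention that os.fsdecode() and os.fsencode() expose at the Python level, using the PEP 383 surrogateescape handler for bytes that do not decode. A minimal sketch of that round trip -- illustrative only, not part of the changeset, and assuming a UTF-8 filesystem encoding:

    import os, sys

    raw = b'caf\xe9.py'               # not valid UTF-8
    name = os.fsdecode(raw)           # 'caf\udce9.py' on a UTF-8 system
    assert os.fsencode(name) == raw   # the round trip is lossless
    print(sys.getfilesystemencoding(), ascii(name))

The escaped surrogate in such a name is also what repr() shows, which is what the switch to %R in r88520 below takes advantage of.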
Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Include/import.h python/branches/release32-maint/Include/moduleobject.h python/branches/release32-maint/Include/pycapsule.h Modified: python/branches/release32-maint/Include/import.h ============================================================================== --- python/branches/release32-maint/Include/import.h (original) +++ python/branches/release32-maint/Include/import.h Wed Feb 23 00:43:57 2011 @@ -9,26 +9,49 @@ PyAPI_FUNC(long) PyImport_GetMagicNumber(void); PyAPI_FUNC(const char *) PyImport_GetMagicTag(void); -PyAPI_FUNC(PyObject *) PyImport_ExecCodeModule(char *name, PyObject *co); +PyAPI_FUNC(PyObject *) PyImport_ExecCodeModule( + char *name, /* UTF-8 encoded string */ + PyObject *co + ); PyAPI_FUNC(PyObject *) PyImport_ExecCodeModuleEx( - char *name, PyObject *co, char *pathname); + char *name, /* UTF-8 encoded string */ + PyObject *co, + char *pathname /* decoded from the filesystem encoding */ + ); PyAPI_FUNC(PyObject *) PyImport_ExecCodeModuleWithPathnames( - char *name, PyObject *co, char *pathname, char *cpathname); + char *name, /* UTF-8 encoded string */ + PyObject *co, + char *pathname, /* decoded from the filesystem encoding */ + char *cpathname /* decoded from the filesystem encoding */ + ); PyAPI_FUNC(PyObject *) PyImport_GetModuleDict(void); -PyAPI_FUNC(PyObject *) PyImport_AddModule(const char *name); -PyAPI_FUNC(PyObject *) PyImport_ImportModule(const char *name); -PyAPI_FUNC(PyObject *) PyImport_ImportModuleNoBlock(const char *); -PyAPI_FUNC(PyObject *) PyImport_ImportModuleLevel(char *name, - PyObject *globals, PyObject *locals, PyObject *fromlist, int level); +PyAPI_FUNC(PyObject *) PyImport_AddModule( + const char *name /* UTF-8 encoded string */ + ); +PyAPI_FUNC(PyObject *) PyImport_ImportModule( + const char *name /* UTF-8 encoded string */ + ); +PyAPI_FUNC(PyObject *) PyImport_ImportModuleNoBlock( + const char *name /* UTF-8 encoded string */ + ); +PyAPI_FUNC(PyObject *) PyImport_ImportModuleLevel( + char *name, /* UTF-8 encoded string */ + PyObject *globals, + PyObject *locals, + PyObject *fromlist, + int level + ); #define PyImport_ImportModuleEx(n, g, l, f) \ - PyImport_ImportModuleLevel(n, g, l, f, -1) + PyImport_ImportModuleLevel(n, g, l, f, -1) PyAPI_FUNC(PyObject *) PyImport_GetImporter(PyObject *path); PyAPI_FUNC(PyObject *) PyImport_Import(PyObject *name); PyAPI_FUNC(PyObject *) PyImport_ReloadModule(PyObject *m); PyAPI_FUNC(void) PyImport_Cleanup(void); -PyAPI_FUNC(int) PyImport_ImportFrozenModule(char *); +PyAPI_FUNC(int) PyImport_ImportFrozenModule( + char *name /* UTF-8 encoded string */ + ); #ifndef Py_LIMITED_API #ifdef WITH_THREAD @@ -41,9 +64,14 @@ PyAPI_FUNC(void) _PyImport_ReInitLock(void); -PyAPI_FUNC(PyObject *)_PyImport_FindBuiltin(char *); +PyAPI_FUNC(PyObject *)_PyImport_FindBuiltin( + char *name /* UTF-8 encoded string */ + ); PyAPI_FUNC(PyObject *)_PyImport_FindExtensionUnicode(char *, PyObject *); -PyAPI_FUNC(int)_PyImport_FixupBuiltin(PyObject*, char *); +PyAPI_FUNC(int)_PyImport_FixupBuiltin( + PyObject *mod, + char *name /* UTF-8 encoded string */ + ); PyAPI_FUNC(int)_PyImport_FixupExtensionUnicode(PyObject*, char *, PyObject *); struct _inittab { @@ -56,11 +84,14 @@ PyAPI_DATA(PyTypeObject) PyNullImporter_Type; -PyAPI_FUNC(int) PyImport_AppendInittab(const char *name, PyObject* (*initfunc)(void)); +PyAPI_FUNC(int) PyImport_AppendInittab( + const char *name, /* ASCII encoded string */ + PyObject* (*initfunc)(void) + ); #ifndef 
Py_LIMITED_API struct _frozen { - char *name; + char *name; /* ASCII encoded string */ unsigned char *code; int size; }; Modified: python/branches/release32-maint/Include/moduleobject.h ============================================================================== --- python/branches/release32-maint/Include/moduleobject.h (original) +++ python/branches/release32-maint/Include/moduleobject.h Wed Feb 23 00:43:57 2011 @@ -12,7 +12,9 @@ #define PyModule_Check(op) PyObject_TypeCheck(op, &PyModule_Type) #define PyModule_CheckExact(op) (Py_TYPE(op) == &PyModule_Type) -PyAPI_FUNC(PyObject *) PyModule_New(const char *); +PyAPI_FUNC(PyObject *) PyModule_New( + const char *name /* UTF-8 encoded string */ + ); PyAPI_FUNC(PyObject *) PyModule_GetDict(PyObject *); PyAPI_FUNC(const char *) PyModule_GetName(PyObject *); PyAPI_FUNC(const char *) PyModule_GetFilename(PyObject *); Modified: python/branches/release32-maint/Include/pycapsule.h ============================================================================== --- python/branches/release32-maint/Include/pycapsule.h (original) +++ python/branches/release32-maint/Include/pycapsule.h Wed Feb 23 00:43:57 2011 @@ -48,7 +48,9 @@ PyAPI_FUNC(int) PyCapsule_SetContext(PyObject *capsule, void *context); -PyAPI_FUNC(void *) PyCapsule_Import(const char *name, int no_block); +PyAPI_FUNC(void *) PyCapsule_Import( + const char *name, /* UTF-8 encoded string */ + int no_block); #ifdef __cplusplus From python-checkins at python.org Wed Feb 23 01:02:00 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 01:02:00 +0100 (CET) Subject: [Python-checkins] r88519 - in python/branches/py3k: Include/import.h Python/import.c Message-ID: <20110223000200.8C482F05E@mail.python.org> Author: victor.stinner Date: Wed Feb 23 01:02:00 2011 New Revision: 88519 Log: Issue #3080: Mark _PyImport_FindBuiltin() argument as constant And as a consequence, mark also name argument of _PyImport_FindExtensionUnicode() constant too. But I plan to change this argument type to PyObject* later. 
Modified: python/branches/py3k/Include/import.h python/branches/py3k/Python/import.c Modified: python/branches/py3k/Include/import.h ============================================================================== --- python/branches/py3k/Include/import.h (original) +++ python/branches/py3k/Include/import.h Wed Feb 23 01:02:00 2011 @@ -65,9 +65,9 @@ PyAPI_FUNC(void) _PyImport_ReInitLock(void); PyAPI_FUNC(PyObject *)_PyImport_FindBuiltin( - char *name /* UTF-8 encoded string */ + const char *name /* UTF-8 encoded string */ ); -PyAPI_FUNC(PyObject *)_PyImport_FindExtensionUnicode(char *, PyObject *); +PyAPI_FUNC(PyObject *)_PyImport_FindExtensionUnicode(const char *, PyObject *); PyAPI_FUNC(int)_PyImport_FixupBuiltin( PyObject *mod, char *name /* UTF-8 encoded string */ Modified: python/branches/py3k/Python/import.c ============================================================================== --- python/branches/py3k/Python/import.c (original) +++ python/branches/py3k/Python/import.c Wed Feb 23 01:02:00 2011 @@ -636,7 +636,7 @@ } PyObject * -_PyImport_FindExtensionUnicode(char *name, PyObject *filename) +_PyImport_FindExtensionUnicode(const char *name, PyObject *filename) { PyObject *mod, *mdict; PyModuleDef* def; @@ -680,7 +680,7 @@ } PyObject * -_PyImport_FindBuiltin(char *name) +_PyImport_FindBuiltin(const char *name) { PyObject *res, *filename; filename = PyUnicode_FromString(name); From python-checkins at python.org Wed Feb 23 01:21:43 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 01:21:43 +0100 (CET) Subject: [Python-checkins] r88520 - in python/branches/py3k: Doc/c-api/module.rst Include/moduleobject.h Objects/moduleobject.c Message-ID: <20110223002143.828F0EE9B1@mail.python.org> Author: victor.stinner Date: Wed Feb 23 01:21:43 2011 New Revision: 88520 Log: Issue #3080: Add PyModule_GetNameObject() repr(module) uses %R to format module name and filenames, instead of '%s' and '%U', so surrogates from undecodable bytes in a filename (PEP 383) are escaped. Modified: python/branches/py3k/Doc/c-api/module.rst python/branches/py3k/Include/moduleobject.h python/branches/py3k/Objects/moduleobject.c Modified: python/branches/py3k/Doc/c-api/module.rst ============================================================================== --- python/branches/py3k/Doc/c-api/module.rst (original) +++ python/branches/py3k/Doc/c-api/module.rst Wed Feb 23 01:21:43 2011 @@ -52,7 +52,7 @@ manipulate a module's :attr:`__dict__`. -.. c:function:: char* PyModule_GetName(PyObject *module) +.. c:function:: PyObject* PyModule_GetNameObject(PyObject *module) .. index:: single: __name__ (module attribute) @@ -61,15 +61,13 @@ Return *module*'s :attr:`__name__` value. If the module does not provide one, or if it is not a string, :exc:`SystemError` is raised and *NULL* is returned. + .. versionadded:: 3.3 -.. c:function:: char* PyModule_GetFilename(PyObject *module) - Similar to :c:func:`PyModule_GetFilenameObject` but return the filename - encoded to 'utf-8'. +.. c:function:: char* PyModule_GetName(PyObject *module) - .. deprecated:: 3.2 - :c:func:`PyModule_GetFilename` raises :c:type:`UnicodeEncodeError` on - unencodable filenames, use :c:func:`PyModule_GetFilenameObject` instead. + Similar to :c:func:`PyModule_GetNameObject` but return the name encoded to + ``'utf-8'``. .. c:function:: PyObject* PyModule_GetFilenameObject(PyObject *module) @@ -86,6 +84,16 @@ .. versionadded:: 3.2 +.. 
c:function:: char* PyModule_GetFilename(PyObject *module) + + Similar to :c:func:`PyModule_GetFilenameObject` but return the filename + encoded to 'utf-8'. + + .. deprecated:: 3.2 + :c:func:`PyModule_GetFilename` raises :c:type:`UnicodeEncodeError` on + unencodable filenames, use :c:func:`PyModule_GetFilenameObject` instead. + + .. c:function:: void* PyModule_GetState(PyObject *module) Return the "state" of the module, that is, a pointer to the block of memory Modified: python/branches/py3k/Include/moduleobject.h ============================================================================== --- python/branches/py3k/Include/moduleobject.h (original) +++ python/branches/py3k/Include/moduleobject.h Wed Feb 23 01:21:43 2011 @@ -16,6 +16,7 @@ const char *name /* UTF-8 encoded string */ ); PyAPI_FUNC(PyObject *) PyModule_GetDict(PyObject *); +PyAPI_FUNC(PyObject *) PyModule_GetNameObject(PyObject *); PyAPI_FUNC(const char *) PyModule_GetName(PyObject *); PyAPI_FUNC(const char *) PyModule_GetFilename(PyObject *); PyAPI_FUNC(PyObject *) PyModule_GetFilenameObject(PyObject *); Modified: python/branches/py3k/Objects/moduleobject.c ============================================================================== --- python/branches/py3k/Objects/moduleobject.c (original) +++ python/branches/py3k/Objects/moduleobject.c Wed Feb 23 01:21:43 2011 @@ -169,24 +169,35 @@ return d; } -const char * -PyModule_GetName(PyObject *m) +PyObject* +PyModule_GetNameObject(PyObject *m) { PyObject *d; - PyObject *nameobj; + PyObject *name; if (!PyModule_Check(m)) { PyErr_BadArgument(); return NULL; } d = ((PyModuleObject *)m)->md_dict; if (d == NULL || - (nameobj = PyDict_GetItemString(d, "__name__")) == NULL || - !PyUnicode_Check(nameobj)) + (name = PyDict_GetItemString(d, "__name__")) == NULL || + !PyUnicode_Check(name)) { PyErr_SetString(PyExc_SystemError, "nameless module"); return NULL; } - return _PyUnicode_AsString(nameobj); + Py_INCREF(name); + return name; +} + +const char * +PyModule_GetName(PyObject *m) +{ + PyObject *name = PyModule_GetNameObject(m); + if (name == NULL) + return NULL; + Py_DECREF(name); /* module dict has still a reference */ + return _PyUnicode_AsString(name); } PyObject* @@ -219,7 +230,7 @@ if (fileobj == NULL) return NULL; utf8 = _PyUnicode_AsString(fileobj); - Py_DECREF(fileobj); + Py_DECREF(fileobj); /* module dict has still a reference */ return utf8; } @@ -347,21 +358,25 @@ static PyObject * module_repr(PyModuleObject *m) { - const char *name; - PyObject *filename, *repr; + PyObject *name, *filename, *repr; - name = PyModule_GetName((PyObject *)m); + name = PyModule_GetNameObject((PyObject *)m); if (name == NULL) { PyErr_Clear(); - name = "?"; + name = PyUnicode_FromStringAndSize("?", 1); + if (name == NULL) + return NULL; } filename = PyModule_GetFilenameObject((PyObject *)m); if (filename == NULL) { PyErr_Clear(); - return PyUnicode_FromFormat("", name); + repr = PyUnicode_FromFormat("", name); + } + else { + repr = PyUnicode_FromFormat("", name, filename); + Py_DECREF(filename); } - repr = PyUnicode_FromFormat("", name, filename); - Py_DECREF(filename); + Py_DECREF(name); return repr; } From python-checkins at python.org Wed Feb 23 01:46:29 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 23 Feb 2011 01:46:29 +0100 (CET) Subject: [Python-checkins] r88521 - in python/branches/py3k/Lib: configparser.py random.py test/support.py test/test_collections.py test/test_nntplib.py test/test_shelve.py test/test_xmlrpc_net.py Message-ID: 
<20110223004629.2655AF3BA@mail.python.org> Author: raymond.hettinger Date: Wed Feb 23 01:46:28 2011 New Revision: 88521 Log: Fix imports from collections.abc Modified: python/branches/py3k/Lib/configparser.py python/branches/py3k/Lib/random.py python/branches/py3k/Lib/test/support.py python/branches/py3k/Lib/test/test_collections.py python/branches/py3k/Lib/test/test_nntplib.py python/branches/py3k/Lib/test/test_shelve.py python/branches/py3k/Lib/test/test_xmlrpc_net.py Modified: python/branches/py3k/Lib/configparser.py ============================================================================== --- python/branches/py3k/Lib/configparser.py (original) +++ python/branches/py3k/Lib/configparser.py Wed Feb 23 01:46:28 2011 @@ -119,7 +119,8 @@ between keys and values are surrounded by spaces. """ -from collections import MutableMapping, OrderedDict as _default_dict, _ChainMap +from collections.abc import MutableMapping +from collections import OrderedDict as _default_dict, _ChainMap import functools import io import itertools Modified: python/branches/py3k/Lib/random.py ============================================================================== --- python/branches/py3k/Lib/random.py (original) +++ python/branches/py3k/Lib/random.py Wed Feb 23 01:46:28 2011 @@ -42,7 +42,7 @@ from math import log as _log, exp as _exp, pi as _pi, e as _e, ceil as _ceil from math import sqrt as _sqrt, acos as _acos, cos as _cos, sin as _sin from os import urandom as _urandom -import collections as _collections +from collections.abc import Set as _Set, Sequence as _Sequence from hashlib import sha512 as _sha512 __all__ = ["Random","seed","random","uniform","randint","choice","sample", @@ -293,10 +293,10 @@ # preferred since the list takes less space than the # set and it doesn't suffer from frequent reselections. - if isinstance(population, _collections.Set): + if isinstance(population, _Set): population = tuple(population) - if not isinstance(population, _collections.Sequence): - raise TypeError("Population must be a sequence or Set. For dicts, use list(d).") + if not isinstance(population, _Sequence): + raise TypeError("Population must be a sequence or set. For dicts, use list(d).") randbelow = self._randbelow n = len(population) if not 0 <= k <= n: Modified: python/branches/py3k/Lib/test/support.py ============================================================================== --- python/branches/py3k/Lib/test/support.py (original) +++ python/branches/py3k/Lib/test/support.py Wed Feb 23 01:46:28 2011 @@ -15,7 +15,7 @@ import warnings import unittest import importlib -import collections +import collections.abc import re import subprocess import imp @@ -682,7 +682,7 @@ sys.modules.update(self.original_modules) -class EnvironmentVarGuard(collections.MutableMapping): +class EnvironmentVarGuard(collections.abc.MutableMapping): """Class to help protect the environment variable properly. 
Can be used as a context manager.""" Modified: python/branches/py3k/Lib/test/test_collections.py ============================================================================== --- python/branches/py3k/Lib/test/test_collections.py (original) +++ python/branches/py3k/Lib/test/test_collections.py Wed Feb 23 01:46:28 2011 @@ -10,12 +10,13 @@ import keyword import re import sys -from collections import Hashable, Iterable, Iterator -from collections import Sized, Container, Callable -from collections import Set, MutableSet -from collections import Mapping, MutableMapping, KeysView, ItemsView, UserDict -from collections import Sequence, MutableSequence -from collections import ByteString +from collections import UserDict +from collections.abc import Hashable, Iterable, Iterator +from collections.abc import Sized, Container, Callable +from collections.abc import Set, MutableSet +from collections.abc import Mapping, MutableMapping, KeysView, ItemsView +from collections.abc import Sequence, MutableSequence +from collections.abc import ByteString TestNT = namedtuple('TestNT', 'x y z') # type used for pickle tests @@ -507,7 +508,7 @@ def test_issue_4920(self): # MutableSet.pop() method did not work - class MySet(collections.MutableSet): + class MySet(MutableSet): __slots__=['__s'] def __init__(self,items=None): if items is None: @@ -553,7 +554,7 @@ self.assertTrue(issubclass(sample, Mapping)) self.validate_abstract_methods(Mapping, '__contains__', '__iter__', '__len__', '__getitem__') - class MyMapping(collections.Mapping): + class MyMapping(Mapping): def __len__(self): return 0 def __getitem__(self, i): Modified: python/branches/py3k/Lib/test/test_nntplib.py ============================================================================== --- python/branches/py3k/Lib/test/test_nntplib.py (original) +++ python/branches/py3k/Lib/test/test_nntplib.py Wed Feb 23 01:46:28 2011 @@ -4,7 +4,7 @@ import unittest import functools import contextlib -import collections +import collections.abc from test import support from nntplib import NNTP, GroupInfo, _have_ssl import nntplib @@ -246,7 +246,7 @@ if not name.startswith('test_'): continue meth = getattr(cls, name) - if not isinstance(meth, collections.Callable): + if not isinstance(meth, collections.abc.Callable): continue # Need to use a closure so that meth remains bound to its current # value Modified: python/branches/py3k/Lib/test/test_shelve.py ============================================================================== --- python/branches/py3k/Lib/test/test_shelve.py (original) +++ python/branches/py3k/Lib/test/test_shelve.py Wed Feb 23 01:46:28 2011 @@ -2,7 +2,7 @@ import shelve import glob from test import support -from collections import MutableMapping +from collections.abc import MutableMapping from test.test_dbm import dbm_iterator def L1(s): Modified: python/branches/py3k/Lib/test/test_xmlrpc_net.py ============================================================================== --- python/branches/py3k/Lib/test/test_xmlrpc_net.py (original) +++ python/branches/py3k/Lib/test/test_xmlrpc_net.py Wed Feb 23 01:46:28 2011 @@ -1,6 +1,6 @@ #!/usr/bin/env python3 -import collections +import collections.abc import errno import socket import sys @@ -48,7 +48,7 @@ # Perform a minimal sanity check on the result, just to be sure # the request means what we think it means. 
- self.assertIsInstance(builders, collections.Sequence) + self.assertIsInstance(builders, collections.abc.Sequence) self.assertTrue([x for x in builders if "3.x" in x], builders) From python-checkins at python.org Wed Feb 23 02:09:24 2011 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 23 Feb 2011 02:09:24 +0100 Subject: [Python-checkins] pymigr: Update the script used for shrinking revlogs Message-ID: antoine.pitrou pushed ff58f39081e3 to pymigr: http://hg.python.org/pymigr/rev/ff58f39081e3 changeset: 87:ff58f39081e3 tag: tip user: Antoine Pitrou date: Wed Feb 23 02:09:17 2011 +0100 summary: Update the script used for shrinking revlogs files: findlarge.py diff --git a/findlarge.py b/findlarge.py --- a/findlarge.py +++ b/findlarge.py @@ -1,12 +1,53 @@ #!/usr/bin/env python +""" +Find the name of the ".i" files in an hg repo's data store which have the +largest data. + +Typical use (from a "python-hg" subdirectory of the working copy): + $ ../findlarge.py | xargs -n 1 hg shrink --revlog + +Full recipe for shrinking a repo: + $ hg shrink + $ ../findlarge.py | xargs -0 -n 1 hg shrink --revlog + $ hg verify + $ find .hg -name "*.old" -print0 | xargs -0 rm + +""" import os +import sys +import collections -files = [] -for dir, dirs, fns in os.walk('data'): - for fn in fns: - fn = os.path.join(dir, fn) - files.append((os.path.getsize(fn), fn)) - -for s, fn in sorted(files, reverse=True)[0:20]: - print s, fn +def file_sizes(path): + files = collections.defaultdict(int) + if not os.path.exists(path): + return files + for dir, dirs, fns in os.walk(path): + for fn in fns: + if fn.endswith('.d'): + fni = fn[:-2] + '.i' + elif fn.endswith('.i'): + fni = fn + else: + continue + fn = os.path.join(dir, fn) + fni = os.path.join(dir, fni) + # Strange issue with this file, skip it + if fni.endswith('_misc/python-mode.el.i'): + continue + files[fni] += os.path.getsize(fn) + return files + +files = ( + [(v, k) for (k, v) in file_sizes('.hg/store/data').items()] + + [(v, k) for (k, v) in file_sizes('.hg/store/dh').items()] + ) + +nlargest = sorted(files, reverse=True)[0:1000] +if '-v' in sys.argv: + for s, fn in nlargest: + print s, fn + print "(total %d)" % sum(s for s, fn in nlargest) +else: + print '\0'.join(fn for s, fn in nlargest) + -- Repository URL: http://hg.python.org/pymigr From benjamin at python.org Wed Feb 23 03:23:54 2011 From: benjamin at python.org (Benjamin Peterson) Date: Tue, 22 Feb 2011 20:23:54 -0600 Subject: [Python-checkins] r88503 - in python/branches/py3k: Lib/lib2to3/__main__.py Tools/scripts/2to3 In-Reply-To: <20110222191243.810C3F634@mail.python.org> References: <20110222191243.810C3F634@mail.python.org> Message-ID: 2011/2/22 brett.cannon : > Author: brett.cannon > Date: Tue Feb 22 20:12:43 2011 > New Revision: 88503 > > Log: > Add lib2to3.__main__ to make it easier for debugging purposes to run 2to3. Please revert this and do it in the sandbox. > > Added: > ? python/branches/py3k/Lib/lib2to3/__main__.py > Modified: > ? python/branches/py3k/Tools/scripts/2to3 > > Added: python/branches/py3k/Lib/lib2to3/__main__.py > ============================================================================== > --- (empty file) > +++ python/branches/py3k/Lib/lib2to3/__main__.py ? ? ? 
?Tue Feb 22 20:12:43 2011 > @@ -0,0 +1,4 @@ > +import sys > +from .main import main > + > +sys.exit(main("lib2to3.fixes")) > > Modified: python/branches/py3k/Tools/scripts/2to3 > ============================================================================== > --- python/branches/py3k/Tools/scripts/2to3 ? ? (original) > +++ python/branches/py3k/Tools/scripts/2to3 ? ? Tue Feb 22 20:12:43 2011 > @@ -1,5 +1,4 @@ > ?#!/usr/bin/env python > -import sys > -from lib2to3.main import main > +import runpy > > -sys.exit(main("lib2to3.fixes")) > +runpy.run_module('lib2to3', run_name='__main__', alter_sys=True) > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > http://mail.python.org/mailman/listinfo/python-checkins > -- Regards, Benjamin From solipsis at pitrou.net Wed Feb 23 05:10:15 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 23 Feb 2011 05:10:15 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88521): sum=0 Message-ID: py3k results for svn r88521 (hg cset 753620a48812) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflog1TdaNj', '-x'] From python-checkins at python.org Wed Feb 23 05:22:31 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 23 Feb 2011 05:22:31 +0100 (CET) Subject: [Python-checkins] r88522 - python/branches/release32-maint/Doc/whatsnew/3.2.rst Message-ID: <20110223042231.B9BBBEE9E3@mail.python.org> Author: raymond.hettinger Date: Wed Feb 23 05:22:31 2011 New Revision: 88522 Log: nits Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/release32-maint/Doc/whatsnew/3.2.rst (original) +++ python/branches/release32-maint/Doc/whatsnew/3.2.rst Wed Feb 23 05:22:31 2011 @@ -2471,14 +2471,14 @@ In addition to the existing Subversion code repository at http://svn.python.org there is now a `Mercurial `_ repository at -http://hg.python.org/ . +http://hg.python.org/\. After the 3.2 release, there are plans to switch to Mercurial as the primary repository. This distributed version control system should make it easier for members of the community to create and share external changesets. See :pep:`385` for details. -To learn to use the new version control system, see the `tutorial by Joel +To learn the new version control system, see the `tutorial by Joel Spolsky `_ or the `Guide to Mercurial Workflows `_. From python-checkins at python.org Wed Feb 23 08:30:13 2011 From: python-checkins at python.org (georg.brandl) Date: Wed, 23 Feb 2011 08:30:13 +0100 (CET) Subject: [Python-checkins] r88523 - python/branches/py3k/Makefile.pre.in Message-ID: <20110223073013.3BB14EEA41@mail.python.org> Author: georg.brandl Date: Wed Feb 23 08:30:12 2011 New Revision: 88523 Log: Add new subdirectory to LIBSUBDIRS. 
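For context on this Makefile change and on r88521 above: ``collections`` is now a package (the ABCs live in the new ``collections.abc`` submodule), so its directory has to be installed via LIBSUBDIRS. A minimal sketch, not part of any checkin, of the import style the stdlib is migrating to (it assumes an interpreter that already ships ``collections.abc``)::

    # The ABCs are imported from collections.abc; the concrete containers
    # (OrderedDict, Counter, namedtuple, ...) stay in collections itself.
    from collections.abc import MutableMapping, Sequence
    from collections import OrderedDict

    assert isinstance(OrderedDict(), MutableMapping)   # dicts register as MutableMapping
    assert isinstance((1, 2, 3), Sequence)             # tuples register as Sequence
    assert not isinstance({1, 2, 3}, Sequence)         # sets are not sequences
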
Modified: python/branches/py3k/Makefile.pre.in Modified: python/branches/py3k/Makefile.pre.in ============================================================================== --- python/branches/py3k/Makefile.pre.in (original) +++ python/branches/py3k/Makefile.pre.in Wed Feb 23 08:30:12 2011 @@ -906,7 +906,7 @@ tkinter/test/test_ttk site-packages test \ test/decimaltestdata test/xmltestdata test/subprocessdata \ test/tracedmodules test/encoded_modules \ - concurrent concurrent/futures encodings \ + collections concurrent concurrent/futures encodings \ email email/mime email/test email/test/data \ html json test/json_tests http dbm xmlrpc \ sqlite3 sqlite3/test \ From python-checkins at python.org Wed Feb 23 08:31:24 2011 From: python-checkins at python.org (georg.brandl) Date: Wed, 23 Feb 2011 08:31:24 +0100 (CET) Subject: [Python-checkins] r88524 - python/branches/py3k/Doc/library/crypt.rst Message-ID: <20110223073124.CD5ACEEA48@mail.python.org> Author: georg.brandl Date: Wed Feb 23 08:31:24 2011 New Revision: 88524 Log: Indent "versionadded" properly. Modified: python/branches/py3k/Doc/library/crypt.rst Modified: python/branches/py3k/Doc/library/crypt.rst ============================================================================== --- python/branches/py3k/Doc/library/crypt.rst (original) +++ python/branches/py3k/Doc/library/crypt.rst Wed Feb 23 08:31:24 2011 @@ -37,28 +37,28 @@ A Modular Crypt Format method with 16 character salt and 86 character hash. This is the strongest method. -.. versionadded:: 3.3 + .. versionadded:: 3.3 .. data:: METHOD_SHA256 Another Modular Crypt Format method with 16 character salt and 43 character hash. -.. versionadded:: 3.3 + .. versionadded:: 3.3 .. data:: METHOD_MD5 Another Modular Crypt Format method with 8 character salt and 22 character hash. -.. versionadded:: 3.3 + .. versionadded:: 3.3 .. data:: METHOD_CRYPT The traditional method with a 2 character salt and 13 characters of hash. This is the weakest method. -.. versionadded:: 3.3 + .. versionadded:: 3.3 Module Attributes @@ -71,7 +71,7 @@ ``crypt.METHOD_*`` objects. This list is sorted from strongest to weakest, and is guaranteed to have at least ``crypt.METHOD_CRYPT``. -.. versionadded:: 3.3 + .. versionadded:: 3.3 Module Functions From python-checkins at python.org Wed Feb 23 08:56:53 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 23 Feb 2011 08:56:53 +0100 (CET) Subject: [Python-checkins] r88525 - in python/branches/py3k/Lib: collections/__init__.py test/test_collections.py Message-ID: <20110223075653.30E81EEA4B@mail.python.org> Author: raymond.hettinger Date: Wed Feb 23 08:56:53 2011 New Revision: 88525 Log: Add tests for the _ChainMap helper class. Modified: python/branches/py3k/Lib/collections/__init__.py python/branches/py3k/Lib/test/test_collections.py Modified: python/branches/py3k/Lib/collections/__init__.py ============================================================================== --- python/branches/py3k/Lib/collections/__init__.py (original) +++ python/branches/py3k/Lib/collections/__init__.py Wed Feb 23 08:56:53 2011 @@ -695,6 +695,15 @@ __copy__ = copy + def new_child(self): # like Django's Context.push() + 'New ChainMap with a new dict followed by all previous maps.' + return self.__class__({}, *self.maps) + + @property + def parents(self): # like Django's Context.pop() + 'New ChainMap from maps[1:].' 
+ return self.__class__(*self.maps[1:]) + def __setitem__(self, key, value): self.maps[0][key] = value Modified: python/branches/py3k/Lib/test/test_collections.py ============================================================================== --- python/branches/py3k/Lib/test/test_collections.py (original) +++ python/branches/py3k/Lib/test/test_collections.py Wed Feb 23 08:56:53 2011 @@ -11,6 +11,7 @@ import re import sys from collections import UserDict +from collections import _ChainMap as ChainMap from collections.abc import Hashable, Iterable, Iterator from collections.abc import Sized, Container, Callable from collections.abc import Set, MutableSet @@ -18,6 +19,97 @@ from collections.abc import Sequence, MutableSequence from collections.abc import ByteString + +################################################################################ +### _ChainMap (helper class for configparser and the string module) +################################################################################ + +class TestChainMap(unittest.TestCase): + + def test_basics(self): + c = ChainMap() + c['a'] = 1 + c['b'] = 2 + d = c.new_child() + d['b'] = 20 + d['c'] = 30 + self.assertEqual(d.maps, [{'b':20, 'c':30}, {'a':1, 'b':2}]) # check internal state + self.assertEqual(d.items(), dict(a=1, b=20, c=30).items()) # check items/iter/getitem + self.assertEqual(len(d), 3) # check len + for key in 'abc': # check contains + self.assertIn(key, d) + for k, v in dict(a=1, b=20, c=30, z=100).items(): # check get + self.assertEqual(d.get(k, 100), v) + + del d['b'] # unmask a value + self.assertEqual(d.maps, [{'c':30}, {'a':1, 'b':2}]) # check internal state + self.assertEqual(d.items(), dict(a=1, b=2, c=30).items()) # check items/iter/getitem + self.assertEqual(len(d), 3) # check len + for key in 'abc': # check contains + self.assertIn(key, d) + for k, v in dict(a=1, b=2, c=30, z=100).items(): # check get + self.assertEqual(d.get(k, 100), v) + self.assertIn(repr(d), [ # check repr + type(d).__name__ + "({'c': 30}, {'a': 1, 'b': 2})", + type(d).__name__ + "({'c': 30}, {'b': 2, 'a': 1})" + ]) + + for e in d.copy(), copy.copy(d): # check shallow copies + self.assertEqual(d, e) + self.assertEqual(d.maps, e.maps) + self.assertIsNot(d, e) + self.assertIsNot(d.maps[0], e.maps[0]) + for m1, m2 in zip(d.maps[1:], e.maps[1:]): + self.assertIs(m1, m2) + + for e in [pickle.loads(pickle.dumps(d)), + copy.deepcopy(d), + eval(repr(d)) + ]: # check deep copies + self.assertEqual(d, e) + self.assertEqual(d.maps, e.maps) + self.assertIsNot(d, e) + for m1, m2 in zip(d.maps, e.maps): + self.assertIsNot(m1, m2, e) + + d.new_child() + d['b'] = 5 + self.assertEqual(d.maps, [{'b': 5}, {'c':30}, {'a':1, 'b':2}]) + self.assertEqual(d.parents.maps, [{'c':30}, {'a':1, 'b':2}]) # check parents + self.assertEqual(d['b'], 5) # find first in chain + self.assertEqual(d.parents['b'], 2) # look beyond maps[0] + + def test_contructor(self): + self.assertEqual(ChainedContext().maps, [{}]) # no-args --> one new dict + self.assertEqual(ChainMap({1:2}).maps, [{1:2}]) # 1 arg --> list + + def test_missing(self): + class DefaultChainMap(ChainMap): + def __missing__(self, key): + return 999 + d = DefaultChainMap(dict(a=1, b=2), dict(b=20, c=30)) + for k, v in dict(a=1, b=2, c=30, d=999).items(): + self.assertEqual(d[k], v) # check __getitem__ w/missing + for k, v in dict(a=1, b=2, c=30, d=77).items(): + self.assertEqual(d.get(k, 77), v) # check get() w/ missing + for k, v in dict(a=True, b=True, c=True, d=False).items(): + self.assertEqual(k in d, v) # check 
__contains__ w/missing + self.assertEqual(d.pop('a', 1001), 1, d) + self.assertEqual(d.pop('a', 1002), 1002) # check pop() w/missing + self.assertEqual(d.popitem(), ('b', 2)) # check popitem() w/missing + with self.assertRaises(KeyError): + d.popitem() + + def test_dict_coercion(self): + d = ChainMap(dict(a=1, b=2), dict(b=20, c=30)) + self.assertEqual(dict(d), dict(a=1, b=2, c=30)) + self.assertEqual(dict(d.items()), dict(a=1, b=2, c=30)) + + +################################################################################ +### Named Tuples +################################################################################ + TestNT = namedtuple('TestNT', 'x y z') # type used for pickle tests class TestNamedTuple(unittest.TestCase): @@ -229,6 +321,10 @@ self.assertEqual(repr(B(1)), 'B(x=1)') +################################################################################ +### Abstract Base Classes +################################################################################ + class ABCTestCase(unittest.TestCase): def validate_abstract_methods(self, abc, *names): @@ -626,6 +722,11 @@ self.validate_abstract_methods(MutableSequence, '__contains__', '__iter__', '__len__', '__getitem__', '__setitem__', '__delitem__', 'insert') + +################################################################################ +### Counter +################################################################################ + class TestCounter(unittest.TestCase): def test_basics(self): @@ -789,6 +890,11 @@ self.assertEqual(m, OrderedDict([('a', 5), ('b', 2), ('r', 2), ('c', 1), ('d', 1)])) + +################################################################################ +### OrderedDict +################################################################################ + class TestOrderedDict(unittest.TestCase): def test_init(self): @@ -1067,6 +1173,10 @@ self.assertRaises(KeyError, d.popitem) +################################################################################ +### Run tests +################################################################################ + import doctest, collections def test_main(verbose=None): From python-checkins at python.org Wed Feb 23 09:28:06 2011 From: python-checkins at python.org (raymond.hettinger) Date: Wed, 23 Feb 2011 09:28:06 +0100 (CET) Subject: [Python-checkins] r88526 - in python/branches/release32-maint/Lib: collections.py test/test_collections.py Message-ID: <20110223082806.3AC47EE9E3@mail.python.org> Author: raymond.hettinger Date: Wed Feb 23 09:28:06 2011 New Revision: 88526 Log: Add tests for the collections helper class and sync-up with py3k branch. Modified: python/branches/release32-maint/Lib/collections.py python/branches/release32-maint/Lib/test/test_collections.py Modified: python/branches/release32-maint/Lib/collections.py ============================================================================== --- python/branches/release32-maint/Lib/collections.py (original) +++ python/branches/release32-maint/Lib/collections.py Wed Feb 23 09:28:06 2011 @@ -694,6 +694,15 @@ __copy__ = copy + def new_child(self): # like Django's Context.push() + 'New ChainMap with a new dict followed by all previous maps.' + return self.__class__({}, *self.maps) + + @property + def parents(self): # like Django's Context.pop() + 'New ChainMap from maps[1:].' 
+ return self.__class__(*self.maps[1:]) + def __setitem__(self, key, value): self.maps[0][key] = value Modified: python/branches/release32-maint/Lib/test/test_collections.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_collections.py (original) +++ python/branches/release32-maint/Lib/test/test_collections.py Wed Feb 23 09:28:06 2011 @@ -10,6 +10,7 @@ import keyword import re import sys +from collections import _ChainMap as ChainMap from collections import Hashable, Iterable, Iterator from collections import Sized, Container, Callable from collections import Set, MutableSet @@ -17,6 +18,97 @@ from collections import Sequence, MutableSequence from collections import ByteString + +################################################################################ +### _ChainMap (helper class for configparser) +################################################################################ + +class TestChainMap(unittest.TestCase): + + def test_basics(self): + c = ChainMap() + c['a'] = 1 + c['b'] = 2 + d = c.new_child() + d['b'] = 20 + d['c'] = 30 + self.assertEqual(d.maps, [{'b':20, 'c':30}, {'a':1, 'b':2}]) # check internal state + self.assertEqual(d.items(), dict(a=1, b=20, c=30).items()) # check items/iter/getitem + self.assertEqual(len(d), 3) # check len + for key in 'abc': # check contains + self.assertIn(key, d) + for k, v in dict(a=1, b=20, c=30, z=100).items(): # check get + self.assertEqual(d.get(k, 100), v) + + del d['b'] # unmask a value + self.assertEqual(d.maps, [{'c':30}, {'a':1, 'b':2}]) # check internal state + self.assertEqual(d.items(), dict(a=1, b=2, c=30).items()) # check items/iter/getitem + self.assertEqual(len(d), 3) # check len + for key in 'abc': # check contains + self.assertIn(key, d) + for k, v in dict(a=1, b=2, c=30, z=100).items(): # check get + self.assertEqual(d.get(k, 100), v) + self.assertIn(repr(d), [ # check repr + type(d).__name__ + "({'c': 30}, {'a': 1, 'b': 2})", + type(d).__name__ + "({'c': 30}, {'b': 2, 'a': 1})" + ]) + + for e in d.copy(), copy.copy(d): # check shallow copies + self.assertEqual(d, e) + self.assertEqual(d.maps, e.maps) + self.assertIsNot(d, e) + self.assertIsNot(d.maps[0], e.maps[0]) + for m1, m2 in zip(d.maps[1:], e.maps[1:]): + self.assertIs(m1, m2) + + for e in [pickle.loads(pickle.dumps(d)), + copy.deepcopy(d), + eval(repr(d)) + ]: # check deep copies + self.assertEqual(d, e) + self.assertEqual(d.maps, e.maps) + self.assertIsNot(d, e) + for m1, m2 in zip(d.maps, e.maps): + self.assertIsNot(m1, m2, e) + + d.new_child() + d['b'] = 5 + self.assertEqual(d.maps, [{'b': 5}, {'c':30}, {'a':1, 'b':2}]) + self.assertEqual(d.parents.maps, [{'c':30}, {'a':1, 'b':2}]) # check parents + self.assertEqual(d['b'], 5) # find first in chain + self.assertEqual(d.parents['b'], 2) # look beyond maps[0] + + def test_contructor(self): + self.assertEqual(ChainedContext().maps, [{}]) # no-args --> one new dict + self.assertEqual(ChainMap({1:2}).maps, [{1:2}]) # 1 arg --> list + + def test_missing(self): + class DefaultChainMap(ChainMap): + def __missing__(self, key): + return 999 + d = DefaultChainMap(dict(a=1, b=2), dict(b=20, c=30)) + for k, v in dict(a=1, b=2, c=30, d=999).items(): + self.assertEqual(d[k], v) # check __getitem__ w/missing + for k, v in dict(a=1, b=2, c=30, d=77).items(): + self.assertEqual(d.get(k, 77), v) # check get() w/ missing + for k, v in dict(a=True, b=True, c=True, d=False).items(): + self.assertEqual(k in d, v) # check __contains__ w/missing + 
self.assertEqual(d.pop('a', 1001), 1, d) + self.assertEqual(d.pop('a', 1002), 1002) # check pop() w/missing + self.assertEqual(d.popitem(), ('b', 2)) # check popitem() w/missing + with self.assertRaises(KeyError): + d.popitem() + + def test_dict_coercion(self): + d = ChainMap(dict(a=1, b=2), dict(b=20, c=30)) + self.assertEqual(dict(d), dict(a=1, b=2, c=30)) + self.assertEqual(dict(d.items()), dict(a=1, b=2, c=30)) + + +################################################################################ +### Named Tuples +################################################################################ + TestNT = namedtuple('TestNT', 'x y z') # type used for pickle tests class TestNamedTuple(unittest.TestCase): @@ -228,6 +320,10 @@ self.assertEqual(repr(B(1)), 'B(x=1)') +################################################################################ +### Abstract Base Classes +################################################################################ + class ABCTestCase(unittest.TestCase): def validate_abstract_methods(self, abc, *names): @@ -507,7 +603,7 @@ def test_issue_4920(self): # MutableSet.pop() method did not work - class MySet(collections.MutableSet): + class MySet(MutableSet): __slots__=['__s'] def __init__(self,items=None): if items is None: @@ -553,7 +649,7 @@ self.assertTrue(issubclass(sample, Mapping)) self.validate_abstract_methods(Mapping, '__contains__', '__iter__', '__len__', '__getitem__') - class MyMapping(collections.Mapping): + class MyMapping(Mapping): def __len__(self): return 0 def __getitem__(self, i): @@ -625,6 +721,11 @@ self.validate_abstract_methods(MutableSequence, '__contains__', '__iter__', '__len__', '__getitem__', '__setitem__', '__delitem__', 'insert') + +################################################################################ +### Counter +################################################################################ + class TestCounter(unittest.TestCase): def test_basics(self): @@ -788,6 +889,11 @@ self.assertEqual(m, OrderedDict([('a', 5), ('b', 2), ('r', 2), ('c', 1), ('d', 1)])) + +################################################################################ +### OrderedDict +################################################################################ + class TestOrderedDict(unittest.TestCase): def test_init(self): @@ -1066,6 +1172,10 @@ self.assertRaises(KeyError, d.popitem) +################################################################################ +### Run tests +################################################################################ + import doctest, collections def test_main(verbose=None): From python-checkins at python.org Wed Feb 23 12:29:29 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 12:29:29 +0100 (CET) Subject: [Python-checkins] r88527 - python/branches/py3k/Python/dynload_dl.c Message-ID: <20110223112929.3FFCAEE99C@mail.python.org> Author: victor.stinner Date: Wed Feb 23 12:29:28 2011 New Revision: 88527 Log: dynload_dl.c: replace tabs by spaces Modified: python/branches/py3k/Python/dynload_dl.c Modified: python/branches/py3k/Python/dynload_dl.c ============================================================================== --- python/branches/py3k/Python/dynload_dl.c (original) +++ python/branches/py3k/Python/dynload_dl.c Wed Feb 23 12:29:28 2011 @@ -10,17 +10,17 @@ extern char *Py_GetProgramName(void); const struct filedescr _PyImport_DynLoadFiletab[] = { - {".o", "rb", C_EXTENSION}, - {"module.o", "rb", C_EXTENSION}, - {0, 0} + {".o", "rb", C_EXTENSION}, + 
{"module.o", "rb", C_EXTENSION}, + {0, 0} }; dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, - const char *pathname, FILE *fp) + const char *pathname, FILE *fp) { - char funcname[258]; + char funcname[258]; - PyOS_snprintf(funcname, sizeof(funcname), "PyInit_%.200s", shortname); - return dl_loadmod(Py_GetProgramName(), pathname, funcname); + PyOS_snprintf(funcname, sizeof(funcname), "PyInit_%.200s", shortname); + return dl_loadmod(Py_GetProgramName(), pathname, funcname); } From python-checkins at python.org Wed Feb 23 12:42:22 2011 From: python-checkins at python.org (lars.gustaebel) Date: Wed, 23 Feb 2011 12:42:22 +0100 (CET) Subject: [Python-checkins] r88528 - in python/branches/py3k: Lib/tarfile.py Lib/test/test_tarfile.py Misc/NEWS Message-ID: <20110223114222.4C708EE99F@mail.python.org> Author: lars.gustaebel Date: Wed Feb 23 12:42:22 2011 New Revision: 88528 Log: Issue #11224: Improved sparse file read support (r85916) introduced a regression in _FileInFile which is used in file-like objects returned by TarFile.extractfile(). The inefficient design of the _FileInFile.read() method causes various dramatic side-effects and errors: - The data segment of a file member is read completely into memory every(!) time a small block is accessed. This is not only slow but may cause unexpected MemoryErrors with very large files. - Reading members from compressed tar archives is even slower because of the excessive backwards seeking which is done when the same data segment is read over and over again. - As a backwards seek on a TarFile opened in stream mode is not possible, using extractfile() fails with a StreamError. Modified: python/branches/py3k/Lib/tarfile.py python/branches/py3k/Lib/test/test_tarfile.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/tarfile.py ============================================================================== --- python/branches/py3k/Lib/tarfile.py (original) +++ python/branches/py3k/Lib/tarfile.py Wed Feb 23 12:42:22 2011 @@ -760,9 +760,8 @@ self.map_index = 0 length = min(size, stop - self.position) if data: - self.fileobj.seek(offset) - block = self.fileobj.read(stop - start) - buf += block[self.position - start:self.position + length] + self.fileobj.seek(offset + (self.position - start)) + buf += self.fileobj.read(length) else: buf += NUL * length size -= length Modified: python/branches/py3k/Lib/test/test_tarfile.py ============================================================================== --- python/branches/py3k/Lib/test/test_tarfile.py (original) +++ python/branches/py3k/Lib/test/test_tarfile.py Wed Feb 23 12:42:22 2011 @@ -419,6 +419,22 @@ mode="r|" + def test_read_through(self): + # Issue #11224: A poorly designed _FileInFile.read() method + # caused seeking errors with stream tar files. 
+ for tarinfo in self.tar: + if not tarinfo.isreg(): + continue + fobj = self.tar.extractfile(tarinfo) + while True: + try: + buf = fobj.read(512) + except tarfile.StreamError: + self.fail("simple read-through using TarFile.extractfile() failed") + if not buf: + break + fobj.close() + def test_fileobj_regular_file(self): tarinfo = self.tar.next() # get "regtype" (can't use getmember) fobj = self.tar.extractfile(tarinfo) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Wed Feb 23 12:42:22 2011 @@ -27,6 +27,10 @@ Library ------- +- Issue #11224: Fixed a regression in tarfile that affected the file-like + objects returned by TarFile.extractfile() regarding performance, memory + consumption and failures with the stream interface. + - Issue #10924: Adding salt and Modular Crypt Format to crypt library. Moved old C wrapper to _crypt, and added a Python wrapper with enhanced salt generation and simpler API for password generation. From python-checkins at python.org Wed Feb 23 12:52:38 2011 From: python-checkins at python.org (lars.gustaebel) Date: Wed, 23 Feb 2011 12:52:38 +0100 (CET) Subject: [Python-checkins] r88529 - in python/branches/release32-maint: Lib/tarfile.py Lib/test/test_tarfile.py Misc/NEWS Message-ID: <20110223115238.22CEFEE98F@mail.python.org> Author: lars.gustaebel Date: Wed Feb 23 12:52:31 2011 New Revision: 88529 Log: Merged revisions 88528 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88528 | lars.gustaebel | 2011-02-23 12:42:22 +0100 (Wed, 23 Feb 2011) | 16 lines Issue #11224: Improved sparse file read support (r85916) introduced a regression in _FileInFile which is used in file-like objects returned by TarFile.extractfile(). The inefficient design of the _FileInFile.read() method causes various dramatic side-effects and errors: - The data segment of a file member is read completely into memory every(!) time a small block is accessed. This is not only slow but may cause unexpected MemoryErrors with very large files. - Reading members from compressed tar archives is even slower because of the excessive backwards seeking which is done when the same data segment is read over and over again. - As a backwards seek on a TarFile opened in stream mode is not possible, using extractfile() fails with a StreamError. ........ 
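To make the regression described above concrete, here is a small usage sketch, not taken from the checkin, of the streaming read-through pattern that issue #11224 broke; the archive name is an illustrative assumption. With mode ``"r|"`` no backwards seeking is possible, so the file objects returned by extractfile() must not re-read earlier data::

    import tarfile

    # "example.tar" is an assumed archive; any tar readable as a stream will do.
    with tarfile.open("example.tar", mode="r|") as tar:
        for member in tar:
            if not member.isreg():
                continue
            fobj = tar.extractfile(member)
            while True:
                # Read in small blocks; before the fix this pattern could raise
                # tarfile.StreamError or re-read the whole data segment each time.
                buf = fobj.read(512)
                if not buf:
                    break
            fobj.close()
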
Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/tarfile.py python/branches/release32-maint/Lib/test/test_tarfile.py python/branches/release32-maint/Misc/NEWS Modified: python/branches/release32-maint/Lib/tarfile.py ============================================================================== --- python/branches/release32-maint/Lib/tarfile.py (original) +++ python/branches/release32-maint/Lib/tarfile.py Wed Feb 23 12:52:31 2011 @@ -760,9 +760,8 @@ self.map_index = 0 length = min(size, stop - self.position) if data: - self.fileobj.seek(offset) - block = self.fileobj.read(stop - start) - buf += block[self.position - start:self.position + length] + self.fileobj.seek(offset + (self.position - start)) + buf += self.fileobj.read(length) else: buf += NUL * length size -= length Modified: python/branches/release32-maint/Lib/test/test_tarfile.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_tarfile.py (original) +++ python/branches/release32-maint/Lib/test/test_tarfile.py Wed Feb 23 12:52:31 2011 @@ -419,6 +419,22 @@ mode="r|" + def test_read_through(self): + # Issue #11224: A poorly designed _FileInFile.read() method + # caused seeking errors with stream tar files. + for tarinfo in self.tar: + if not tarinfo.isreg(): + continue + fobj = self.tar.extractfile(tarinfo) + while True: + try: + buf = fobj.read(512) + except tarfile.StreamError: + self.fail("simple read-through using TarFile.extractfile() failed") + if not buf: + break + fobj.close() + def test_fileobj_regular_file(self): tarinfo = self.tar.next() # get "regtype" (can't use getmember) fobj = self.tar.extractfile(tarinfo) Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Wed Feb 23 12:52:31 2011 @@ -15,6 +15,10 @@ Library ------- +- Issue #11224: Fixed a regression in tarfile that affected the file-like + objects returned by TarFile.extractfile() regarding performance, memory + consumption and failures with the stream interface. + - Issue #11074: Make 'tokenize' so it can be reloaded. - Issue #4681: Allow mmap() to work on file sizes and offsets larger than From python-checkins at python.org Wed Feb 23 13:07:38 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 13:07:38 +0100 (CET) Subject: [Python-checkins] r88530 - in python/branches/py3k: Lib/test/test_cmd_line.py Misc/NEWS Python/bltinmodule.c Python/pythonrun.c Message-ID: <20110223120738.1EEFCEE9AD@mail.python.org> Author: victor.stinner Date: Wed Feb 23 13:07:37 2011 New Revision: 88530 Log: Issue #11272: Fix input() and sys.stdin for Windows newline On Windows, input() strips '\r' (and not only '\n'), and sys.stdin uses universal newline (replace '\r\n' by '\n'). 
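A minimal sketch, modelled loosely on the test added to test_cmd_line.py below, of how the new behaviour can be exercised; the temporary-file plumbing and the literal CRLF input are illustrative assumptions, not the exact test code::

    import subprocess
    import sys
    import tempfile

    with tempfile.NamedTemporaryFile("wb+") as stdin:
        stdin.write(b"abc\r\ndef\r\n")        # CRLF-terminated input
        stdin.flush()
        stdin.seek(0)
        out = subprocess.check_output(
            [sys.executable, "-c", "print(repr(input()))"], stdin=stdin)

    # On Windows with this fix applied, the child prints 'abc' -- no stray '\r'.
    print(out.decode().rstrip())
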
Modified: python/branches/py3k/Lib/test/test_cmd_line.py python/branches/py3k/Misc/NEWS python/branches/py3k/Python/bltinmodule.c python/branches/py3k/Python/pythonrun.c Modified: python/branches/py3k/Lib/test/test_cmd_line.py ============================================================================== --- python/branches/py3k/Lib/test/test_cmd_line.py (original) +++ python/branches/py3k/Lib/test/test_cmd_line.py Wed Feb 23 13:07:37 2011 @@ -6,6 +6,7 @@ import os import sys import subprocess +import tempfile from test.script_helper import spawn_python, kill_python, assert_python_ok, assert_python_failure @@ -239,6 +240,31 @@ escaped = repr(text).encode(encoding, 'backslashreplace') self.assertIn(escaped, data) + def check_input(self, code, expected): + with tempfile.NamedTemporaryFile("wb+") as stdin: + sep = os.linesep.encode('ASCII') + stdin.write(sep.join((b'abc', b'def'))) + stdin.flush() + stdin.seek(0) + with subprocess.Popen( + (sys.executable, "-c", code), + stdin=stdin, stdout=subprocess.PIPE) as proc: + stdout, stderr = proc.communicate() + self.assertEqual(stdout.rstrip(), expected) + + def test_stdin_readline(self): + # Issue #11272: check that sys.stdin.readline() replaces '\r\n' by '\n' + # on Windows (sys.stdin is opened in binary mode) + self.check_input( + "import sys; print(repr(sys.stdin.readline()))", + b"'abc\\n'") + + def test_builtin_input(self): + # Issue #11272: check that input() strips newlines ('\n' or '\r\n') + self.check_input( + "print(repr(input()))", + b"'abc'") + def test_main(): test.support.run_unittest(CmdLineTest) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Wed Feb 23 13:07:37 2011 @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #11272: On Windows, input() strips '\r' (and not only '\n'), and + sys.stdin uses universal newline (replace '\r\n' by '\n'). + - Issue #10830: Fix PyUnicode_FromFormatV("%c") for non-BMP characters on narrow build. 
Modified: python/branches/py3k/Python/bltinmodule.c ============================================================================== --- python/branches/py3k/Python/bltinmodule.c (original) +++ python/branches/py3k/Python/bltinmodule.c Wed Feb 23 13:07:37 2011 @@ -1618,6 +1618,7 @@ PyObject *stdin_encoding; char *stdin_encoding_str; PyObject *result; + size_t len; stdin_encoding = PyObject_GetAttrString(fin, "encoding"); if (!stdin_encoding) @@ -1682,19 +1683,23 @@ Py_DECREF(stdin_encoding); return NULL; } - if (*s == '\0') { + + len = strlen(s); + if (len == 0) { PyErr_SetNone(PyExc_EOFError); result = NULL; } - else { /* strip trailing '\n' */ - size_t len = strlen(s); + else { if (len > PY_SSIZE_T_MAX) { PyErr_SetString(PyExc_OverflowError, "input: input too long"); result = NULL; } else { - result = PyUnicode_Decode(s, len-1, stdin_encoding_str, NULL); + len--; /* strip trailing '\n' */ + if (len != 0 && s[len-1] == '\r') + len--; /* strip trailing '\r' */ + result = PyUnicode_Decode(s, len, stdin_encoding_str, NULL); } } Py_DECREF(stdin_encoding); Modified: python/branches/py3k/Python/pythonrun.c ============================================================================== --- python/branches/py3k/Python/pythonrun.c (original) +++ python/branches/py3k/Python/pythonrun.c Wed Feb 23 13:07:37 2011 @@ -778,6 +778,7 @@ { PyObject *buf = NULL, *stream = NULL, *text = NULL, *raw = NULL, *res; const char* mode; + const char* newline; PyObject *line_buffering; int buffering, isatty; @@ -828,9 +829,17 @@ Py_CLEAR(raw); Py_CLEAR(text); + newline = "\n"; +#ifdef MS_WINDOWS + if (!write_mode) { + /* translate \r\n to \n for sys.stdin on Windows */ + newline = NULL; + } +#endif + stream = PyObject_CallMethod(io, "TextIOWrapper", "OsssO", buf, encoding, errors, - "\n", line_buffering); + newline, line_buffering); Py_CLEAR(buf); if (stream == NULL) goto error; From python-checkins at python.org Wed Feb 23 13:10:23 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 13:10:23 +0100 (CET) Subject: [Python-checkins] r88531 - in python/branches/release32-maint: Lib/test/test_cmd_line.py Misc/NEWS Python/bltinmodule.c Python/pythonrun.c Message-ID: <20110223121023.7558BEE9FE@mail.python.org> Author: victor.stinner Date: Wed Feb 23 13:10:23 2011 New Revision: 88531 Log: Merged revisions 88530 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88530 | victor.stinner | 2011-02-23 13:07:37 +0100 (mer., 23 f?vr. 2011) | 4 lines Issue #11272: Fix input() and sys.stdin for Windows newline On Windows, input() strips '\r' (and not only '\n'), and sys.stdin uses universal newline (replace '\r\n' by '\n'). ........ 
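The pythonrun.c hunk being merged here passes ``newline=NULL`` (universal newlines) to TextIOWrapper for sys.stdin on Windows, while write streams keep ``"\n"``. A small io-level sketch of the difference, using an arbitrary byte string::

    import io

    translated = io.TextIOWrapper(io.BytesIO(b"abc\r\ndef\r\n"),
                                  encoding="utf-8", newline=None)
    assert translated.readline() == "abc\n"        # '\r\n' becomes '\n'

    verbatim = io.TextIOWrapper(io.BytesIO(b"abc\r\ndef\r\n"),
                                encoding="utf-8", newline="\n")
    assert verbatim.readline() == "abc\r\n"        # no translation with newline="\n"
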
Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_cmd_line.py python/branches/release32-maint/Misc/NEWS python/branches/release32-maint/Python/bltinmodule.c python/branches/release32-maint/Python/pythonrun.c Modified: python/branches/release32-maint/Lib/test/test_cmd_line.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_cmd_line.py (original) +++ python/branches/release32-maint/Lib/test/test_cmd_line.py Wed Feb 23 13:10:23 2011 @@ -6,6 +6,7 @@ import os import sys import subprocess +import tempfile from test.script_helper import spawn_python, kill_python, assert_python_ok, assert_python_failure @@ -239,6 +240,31 @@ escaped = repr(text).encode(encoding, 'backslashreplace') self.assertIn(escaped, data) + def check_input(self, code, expected): + with tempfile.NamedTemporaryFile("wb+") as stdin: + sep = os.linesep.encode('ASCII') + stdin.write(sep.join((b'abc', b'def'))) + stdin.flush() + stdin.seek(0) + with subprocess.Popen( + (sys.executable, "-c", code), + stdin=stdin, stdout=subprocess.PIPE) as proc: + stdout, stderr = proc.communicate() + self.assertEqual(stdout.rstrip(), expected) + + def test_stdin_readline(self): + # Issue #11272: check that sys.stdin.readline() replaces '\r\n' by '\n' + # on Windows (sys.stdin is opened in binary mode) + self.check_input( + "import sys; print(repr(sys.stdin.readline()))", + b"'abc\\n'") + + def test_builtin_input(self): + # Issue #11272: check that input() strips newlines ('\n' or '\r\n') + self.check_input( + "print(repr(input()))", + b"'abc'") + def test_main(): test.support.run_unittest(CmdLineTest) Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Wed Feb 23 13:10:23 2011 @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #11272: On Windows, input() strips '\r' (and not only '\n'), and + sys.stdin uses universal newline (replace '\r\n' by '\n'). + - Check for NULL result in PyType_FromSpec. Library @@ -26,7 +29,7 @@ 32-bit Windows. - Issue #11089: Fix performance issue limiting the use of ConfigParser() - with large config files. + with large config files. - Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers larger than 4GB. Patch by Nadeem Vawda. 
Modified: python/branches/release32-maint/Python/bltinmodule.c ============================================================================== --- python/branches/release32-maint/Python/bltinmodule.c (original) +++ python/branches/release32-maint/Python/bltinmodule.c Wed Feb 23 13:10:23 2011 @@ -1621,6 +1621,7 @@ PyObject *stdin_encoding; char *stdin_encoding_str; PyObject *result; + size_t len; stdin_encoding = PyObject_GetAttrString(fin, "encoding"); if (!stdin_encoding) @@ -1685,19 +1686,23 @@ Py_DECREF(stdin_encoding); return NULL; } - if (*s == '\0') { + + len = strlen(s); + if (len == 0) { PyErr_SetNone(PyExc_EOFError); result = NULL; } - else { /* strip trailing '\n' */ - size_t len = strlen(s); + else { if (len > PY_SSIZE_T_MAX) { PyErr_SetString(PyExc_OverflowError, "input: input too long"); result = NULL; } else { - result = PyUnicode_Decode(s, len-1, stdin_encoding_str, NULL); + len--; /* strip trailing '\n' */ + if (len != 0 && s[len-1] == '\r') + len--; /* strip trailing '\r' */ + result = PyUnicode_Decode(s, len, stdin_encoding_str, NULL); } } Py_DECREF(stdin_encoding); Modified: python/branches/release32-maint/Python/pythonrun.c ============================================================================== --- python/branches/release32-maint/Python/pythonrun.c (original) +++ python/branches/release32-maint/Python/pythonrun.c Wed Feb 23 13:10:23 2011 @@ -778,6 +778,7 @@ { PyObject *buf = NULL, *stream = NULL, *text = NULL, *raw = NULL, *res; const char* mode; + const char* newline; PyObject *line_buffering; int buffering, isatty; @@ -828,9 +829,17 @@ Py_CLEAR(raw); Py_CLEAR(text); + newline = "\n"; +#ifdef MS_WINDOWS + if (!write_mode) { + /* translate \r\n to \n for sys.stdin on Windows */ + newline = NULL; + } +#endif + stream = PyObject_CallMethod(io, "TextIOWrapper", "OsssO", buf, encoding, errors, - "\n", line_buffering); + newline, line_buffering); Py_CLEAR(buf); if (stream == NULL) goto error; From python-checkins at python.org Wed Feb 23 13:14:23 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 13:14:23 +0100 (CET) Subject: [Python-checkins] r88532 - in python/branches/release32-maint: Lib/test/test_unicode.py Misc/NEWS Objects/unicodeobject.c Message-ID: <20110223121423.0BA25EE9CA@mail.python.org> Author: victor.stinner Date: Wed Feb 23 13:14:22 2011 New Revision: 88532 Log: Merged revisions 88481 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88481 | victor.stinner | 2011-02-21 22:13:44 +0100 (lun., 21 f?vr. 2011) | 4 lines Fix PyUnicode_FromFormatV("%c") for non-BMP char Issue #10830: Fix PyUnicode_FromFormatV("%c") for non-BMP characters on narrow build. ........ 
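For readers following the "%c" fix being merged here, a short Python transcription of the surrogate-pair arithmetic the C patch performs on narrow (UTF-16) builds; the helper name is ours, not part of the patch::

    def utf16_units(ordinal):
        """Return the UTF-16 code unit(s) for a code point, mirroring the patch."""
        if ordinal > 0xFFFF:
            ordinal -= 0x10000
            return (0xD800 | (ordinal >> 10), 0xDC00 | (ordinal & 0x3FF))
        return (ordinal,)

    assert utf16_units(0xABCD) == (0xABCD,)                 # BMP: one code unit
    assert utf16_units(0x10FFFF) == (0xDBFF, 0xDFFF)        # high, low surrogate
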
Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_unicode.py python/branches/release32-maint/Misc/NEWS python/branches/release32-maint/Objects/unicodeobject.c Modified: python/branches/release32-maint/Lib/test/test_unicode.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_unicode.py (original) +++ python/branches/release32-maint/Lib/test/test_unicode.py Wed Feb 23 13:14:22 2011 @@ -1427,7 +1427,7 @@ # Test PyUnicode_FromFormat() def test_from_format(self): support.import_module('ctypes') - from ctypes import pythonapi, py_object + from ctypes import pythonapi, py_object, c_int if sys.maxunicode == 65535: name = "PyUnicodeUCS2_FromFormat" else: @@ -1452,6 +1452,9 @@ 'string, got a non-ASCII byte: 0xe9$', PyUnicode_FromFormat, b'unicode\xe9=%s', 'ascii') + self.assertEqual(PyUnicode_FromFormat(b'%c', c_int(0xabcd)), '\uabcd') + self.assertEqual(PyUnicode_FromFormat(b'%c', c_int(0x10ffff)), '\U0010ffff') + # other tests text = PyUnicode_FromFormat(b'%%A:%A', 'abc\xe9\uabcd\U0010ffff') self.assertEqual(text, r"%A:'abc\xe9\uabcd\U0010ffff'") Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Wed Feb 23 13:14:22 2011 @@ -13,6 +13,9 @@ - Issue #11272: On Windows, input() strips '\r' (and not only '\n'), and sys.stdin uses universal newline (replace '\r\n' by '\n'). +- Issue #10830: Fix PyUnicode_FromFormatV("%c") for non-BMP characters on + narrow build. + - Check for NULL result in PyType_FromSpec. Library Modified: python/branches/release32-maint/Objects/unicodeobject.c ============================================================================== --- python/branches/release32-maint/Objects/unicodeobject.c (original) +++ python/branches/release32-maint/Objects/unicodeobject.c Wed Feb 23 13:14:22 2011 @@ -813,8 +813,19 @@ switch (*f) { case 'c': + { +#ifndef Py_UNICODE_WIDE + int ordinal = va_arg(count, int); + if (ordinal > 0xffff) + n += 2; + else + n++; +#else (void)va_arg(count, int); - /* fall through... */ + n++; +#endif + break; + } case '%': n++; break; @@ -992,8 +1003,18 @@ switch (*f) { case 'c': - *s++ = va_arg(vargs, int); + { + int ordinal = va_arg(vargs, int); +#ifndef Py_UNICODE_WIDE + if (ordinal > 0xffff) { + ordinal -= 0x10000; + *s++ = 0xD800 | (ordinal >> 10); + *s++ = 0xDC00 | (ordinal & 0x3FF); + } else +#endif + *s++ = ordinal; break; + } case 'd': makefmt(fmt, longflag, longlongflag, size_tflag, zeropad, width, precision, 'd'); From python-checkins at python.org Wed Feb 23 15:14:49 2011 From: python-checkins at python.org (victor.stinner) Date: Wed, 23 Feb 2011 15:14:49 +0100 (CET) Subject: [Python-checkins] r88533 - python/branches/py3k/Lib/test/test_reprlib.py Message-ID: <20110223141449.350F4EEA1B@mail.python.org> Author: victor.stinner Date: Wed Feb 23 15:14:48 2011 New Revision: 88533 Log: Issue #3080: Fix test_reprlib on Windows Fix the test for last module changes (r88520). 
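As background for the test adjustment below: a module's repr() embeds its __name__ and, when it has one, its __file__, so the expected strings in test_reprlib are built from those attributes rather than hard-coded (paths differ between platforms, notably on Windows). A quick, hedged illustration::

    import sys
    import tarfile            # any pure-Python stdlib module with a __file__

    r = repr(tarfile)         # e.g. <module 'tarfile' from '.../tarfile.py'>
    assert tarfile.__name__ in r
    assert tarfile.__file__ in r

    assert "built-in" in repr(sys)   # built-in modules have no __file__
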
Modified: python/branches/py3k/Lib/test/test_reprlib.py Modified: python/branches/py3k/Lib/test/test_reprlib.py ============================================================================== --- python/branches/py3k/Lib/test/test_reprlib.py (original) +++ python/branches/py3k/Lib/test/test_reprlib.py Wed Feb 23 15:14:48 2011 @@ -234,7 +234,7 @@ touch(os.path.join(self.subpkgname, self.pkgname + '.py')) from areallylongpackageandmodulenametotestreprtruncation.areallylongpackageandmodulenametotestreprtruncation import areallylongpackageandmodulenametotestreprtruncation eq(repr(areallylongpackageandmodulenametotestreprtruncation), - "" % (areallylongpackageandmodulenametotestreprtruncation.__name__, areallylongpackageandmodulenametotestreprtruncation.__file__)) + "" % (areallylongpackageandmodulenametotestreprtruncation.__name__, areallylongpackageandmodulenametotestreprtruncation.__file__)) eq(repr(sys), "") def test_type(self): From python-checkins at python.org Wed Feb 23 19:48:52 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 23 Feb 2011 19:48:52 +0100 (CET) Subject: [Python-checkins] r88534 - in python/branches/py3k: Lib/lib2to3/__main__.py Tools/scripts/2to3 Message-ID: <20110223184852.93F49EE9C9@mail.python.org> Author: brett.cannon Date: Wed Feb 23 19:48:52 2011 New Revision: 88534 Log: Revert r88503 as Benjamin's request. Removed: python/branches/py3k/Lib/lib2to3/__main__.py Modified: python/branches/py3k/Tools/scripts/2to3 Deleted: python/branches/py3k/Lib/lib2to3/__main__.py ============================================================================== --- python/branches/py3k/Lib/lib2to3/__main__.py Wed Feb 23 19:48:52 2011 +++ (empty file) @@ -1,4 +0,0 @@ -import sys -from .main import main - -sys.exit(main("lib2to3.fixes")) Modified: python/branches/py3k/Tools/scripts/2to3 ============================================================================== --- python/branches/py3k/Tools/scripts/2to3 (original) +++ python/branches/py3k/Tools/scripts/2to3 Wed Feb 23 19:48:52 2011 @@ -1,4 +1,5 @@ #!/usr/bin/env python -import runpy +import sys +from lib2to3.main import main -runpy.run_module('lib2to3', run_name='__main__', alter_sys=True) +sys.exit(main("lib2to3.fixes")) From python-checkins at python.org Wed Feb 23 20:46:46 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 23 Feb 2011 20:46:46 +0100 (CET) Subject: [Python-checkins] r88535 - sandbox/trunk/2to3/lib2to3/__main__.py Message-ID: <20110223194646.F3025EE9E3@mail.python.org> Author: brett.cannon Date: Wed Feb 23 20:46:46 2011 New Revision: 88535 Log: Add lib2to3.__main__ for easy testing from the console. Added: sandbox/trunk/2to3/lib2to3/__main__.py Added: sandbox/trunk/2to3/lib2to3/__main__.py ============================================================================== --- (empty file) +++ sandbox/trunk/2to3/lib2to3/__main__.py Wed Feb 23 20:46:46 2011 @@ -0,0 +1,4 @@ +import sys +from .main import main + +sys.exit(main("lib2to3.fixes")) From python-checkins at python.org Wed Feb 23 21:16:19 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 23 Feb 2011 21:16:19 +0100 Subject: [Python-checkins] devguide: Add a Quick Links section to the devguide front page for easy reference. Message-ID: brett.cannon pushed e5da5c96d7f0 to devguide: http://hg.python.org/devguide/rev/e5da5c96d7f0 changeset: 312:e5da5c96d7f0 parent: 310:4c1f51eafc70 user: Brett Cannon date: Wed Feb 23 12:15:51 2011 -0800 summary: Add a Quick Links section to the devguide front page for easy reference. 
files: index.rst diff --git a/index.rst b/index.rst --- a/index.rst +++ b/index.rst @@ -33,6 +33,19 @@ faq + +Quick Links +----------- + +Here are some links that you may find you refererence frequently while +contributing to Python. + +* `Issue tracker `_ +* Buildbots_ +* :ref:`faq` +* PEPs_ (Python Enhancement Proposals) + + Contributing ------------ @@ -76,6 +89,7 @@ .. _resources: + Resources --------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 23 21:16:21 2011 From: python-checkins at python.org (brett.cannon) Date: Wed, 23 Feb 2011 21:16:21 +0100 Subject: [Python-checkins] merge in devguide (hg_transition): Merge from default Message-ID: brett.cannon pushed 7d021dd30e9c to devguide: http://hg.python.org/devguide/rev/7d021dd30e9c changeset: 313:7d021dd30e9c branch: hg_transition tag: tip parent: 311:bbd90a65d228 parent: 312:e5da5c96d7f0 user: Brett Cannon date: Wed Feb 23 12:16:13 2011 -0800 summary: Merge from default files: index.rst diff --git a/index.rst b/index.rst --- a/index.rst +++ b/index.rst @@ -33,6 +33,19 @@ faq + +Quick Links +----------- + +Here are some links that you may find you refererence frequently while +contributing to Python. + +* `Issue tracker `_ +* Buildbots_ +* :ref:`faq` +* PEPs_ (Python Enhancement Proposals) + + Contributing ------------ @@ -78,6 +91,7 @@ .. _resources: + Resources --------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Feb 23 22:49:33 2011 From: python-checkins at python.org (georg.brandl) Date: Wed, 23 Feb 2011 22:49:33 +0100 Subject: [Python-checkins] pymigr: Fix recipe. Message-ID: georg.brandl pushed 655c7ac7a583 to pymigr: http://hg.python.org/pymigr/rev/655c7ac7a583 changeset: 88:655c7ac7a583 tag: tip user: Georg Brandl date: Wed Feb 23 22:48:40 2011 +0100 summary: Fix recipe. files: findlarge.py diff --git a/findlarge.py b/findlarge.py --- a/findlarge.py +++ b/findlarge.py @@ -4,7 +4,7 @@ largest data. Typical use (from a "python-hg" subdirectory of the working copy): - $ ../findlarge.py | xargs -n 1 hg shrink --revlog + $ ../findlarge.py | xargs -0 -n 1 hg shrink --revlog Full recipe for shrinking a repo: $ hg shrink -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 01:00:30 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 24 Feb 2011 01:00:30 +0100 (CET) Subject: [Python-checkins] r88536 - python/branches/release27-maint/Doc/tutorial/inputoutput.rst Message-ID: <20110224000030.EA98DEEA20@mail.python.org> Author: raymond.hettinger Date: Thu Feb 24 01:00:30 2011 New Revision: 88536 Log: Issue #11304: Input/output tutorial - PI is rounded not truncated. Modified: python/branches/release27-maint/Doc/tutorial/inputoutput.rst Modified: python/branches/release27-maint/Doc/tutorial/inputoutput.rst ============================================================================== --- python/branches/release27-maint/Doc/tutorial/inputoutput.rst (original) +++ python/branches/release27-maint/Doc/tutorial/inputoutput.rst Thu Feb 24 01:00:30 2011 @@ -160,7 +160,7 @@ An optional ``':'`` and format specifier can follow the field name. This allows greater control over how the value is formatted. The following example -truncates Pi to three places after the decimal. +rounds Pi to three places after the decimal. 
>>> import math >>> print 'The value of PI is approximately {0:.3f}.'.format(math.pi) From python-checkins at python.org Thu Feb 24 01:06:17 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 24 Feb 2011 01:06:17 +0100 (CET) Subject: [Python-checkins] r88537 - python/branches/release32-maint/Doc/tutorial/inputoutput.rst Message-ID: <20110224000617.3CE03EEA2A@mail.python.org> Author: raymond.hettinger Date: Thu Feb 24 01:06:16 2011 New Revision: 88537 Log: Issue #11304: Input/output tutorial - PI is rounded not truncated. Modified: python/branches/release32-maint/Doc/tutorial/inputoutput.rst Modified: python/branches/release32-maint/Doc/tutorial/inputoutput.rst ============================================================================== --- python/branches/release32-maint/Doc/tutorial/inputoutput.rst (original) +++ python/branches/release32-maint/Doc/tutorial/inputoutput.rst Thu Feb 24 01:06:16 2011 @@ -163,7 +163,7 @@ An optional ``':'`` and format specifier can follow the field name. This allows greater control over how the value is formatted. The following example -truncates Pi to three places after the decimal. +rounds Pi to three places after the decimal. >>> import math >>> print('The value of PI is approximately {0:.3f}.'.format(math.pi)) From python-checkins at python.org Thu Feb 24 01:08:13 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 24 Feb 2011 01:08:13 +0100 (CET) Subject: [Python-checkins] r88538 - python/branches/py3k/Doc/tutorial/inputoutput.rst Message-ID: <20110224000813.A3209EEA26@mail.python.org> Author: raymond.hettinger Date: Thu Feb 24 01:08:13 2011 New Revision: 88538 Log: Issue #11304: Input/output tutorial - PI is rounded not truncated. Modified: python/branches/py3k/Doc/tutorial/inputoutput.rst Modified: python/branches/py3k/Doc/tutorial/inputoutput.rst ============================================================================== --- python/branches/py3k/Doc/tutorial/inputoutput.rst (original) +++ python/branches/py3k/Doc/tutorial/inputoutput.rst Thu Feb 24 01:08:13 2011 @@ -163,7 +163,7 @@ An optional ``':'`` and format specifier can follow the field name. This allows greater control over how the value is formatted. The following example -truncates Pi to three places after the decimal. +rounds Pi to three places after the decimal. >>> import math >>> print('The value of PI is approximately {0:.3f}.'.format(math.pi)) From python-checkins at python.org Thu Feb 24 02:08:50 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 02:08:50 +0100 Subject: [Python-checkins] pymigr: Clarify convert.sh a bit Message-ID: antoine.pitrou pushed 91902af54714 to pymigr: http://hg.python.org/pymigr/rev/91902af54714 changeset: 90:91902af54714 user: Antoine Pitrou date: Wed Feb 23 19:22:37 2011 +0100 summary: Clarify convert.sh a bit files: convert.sh diff --git a/convert.sh b/convert.sh --- a/convert.sh +++ b/convert.sh @@ -3,17 +3,22 @@ # Run this script from the pymigr working copy. The SVN repo should be # in ./python-svn. -hg init python-hg -mkdir python-hg/.hg/svn +# NOTE: branch p3yk (sic) was created from trunk in r43030. 
+ +HGREPO=python-hg + +hg init $HGREPO +mkdir $HGREPO/.hg/svn cp author-map python-hg/.hg/svn/authors cp branchmap.txt python-hg/.hg/svn/branchmap cp tagmap.txt python-hg/.hg/svn/tag-renames -echo '[paths]' > python-hg/.hg/hgrc -echo "default = file://`pwd`/python-svn/python" >> python-hg/.hg/hgrc -echo '' >> python-hg/.hg/hgrc -echo '[hgsubversion]' >> python-hg/.hg/hgrc -echo 'defaultauthors = False' >> python-hg/.hg/hgrc +echo '[paths]' > $HGREPO/.hg/hgrc +echo "default = file://`pwd`/python-svn/python" >> $HGREPO/.hg/hgrc +echo '' >> $HGREPO/.hg/hgrc +echo '[hgsubversion]' >> $HGREPO/.hg/hgrc +echo 'defaultauthors = False' >> $HGREPO/.hg/hgrc -cd python-hg +cd $HGREPO + HGSUBVERSION_BINDINGS=subvertpy hg pull -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 02:08:50 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 02:08:50 +0100 Subject: [Python-checkins] pymigr: Fix branchmap: 3.x should be the default branch, not 2.x Message-ID: antoine.pitrou pushed 6b6be999edb8 to pymigr: http://hg.python.org/pymigr/rev/6b6be999edb8 changeset: 91:6b6be999edb8 user: Antoine Pitrou date: Thu Feb 24 02:07:59 2011 +0100 summary: Fix branchmap: 3.x should be the default branch, not 2.x files: branchmap.txt diff --git a/branchmap.txt b/branchmap.txt --- a/branchmap.txt +++ b/branchmap.txt @@ -1,5 +1,5 @@ -py3k = 3.x -p3yk = 3.x +py3k = default +p3yk = default release32-maint = 3.2.x release31-maint = 3.1.x release30-maint = 3.0.x @@ -11,153 +11,153 @@ release22-maint = 2.2.x release21-maint = 2.1.x release20-maint = 2.0.x -siginterrupt-reset-issue8354 = default -asyncore-tests-issue8490 = default -signalfd-issue8407 = default -tarek_sysconfig = default -pep370 = default -tk_and_idle_maintenance = default -multiprocessing-autoconf = default -tlee-ast-optimize = default -empty = default -tnelson-trunk-bsddb-47-upgrade = default -ctypes-branch = default -decimal-branch = default -bcannon-objcap = default -amk-mailbox = default -twouters-dictviews-backport = default -bcannon-sandboxing = default -theller_modulefinder = default -hoxworth-stdlib_logging-soc = default -tim-exc_sanity = default -IDLE-syntax-branch = default -jim-doctest = default -release23-branch = default -jim-modulator = default -indexing-cleanup-branch = default -r23c1-branch = default -r23b2-branch = default -anthony-parser-branch = default -r23b1-branch = default -idlefork-merge-branch = default -getargs_mask_mods = default -cache-attr-branch = default -folding-reimpl-branch = default -r23a2-branch = default -bsddb-bsddb3-schizo-branch = default -r23a1-branch = default -py-cvs-vendor-branch = default -DS_RPC_BRANCH = default -SourceForge = default -release22-branch = default -r22rc1-branch = default -r22b2-branch = default -r22b1-branch = default -r22a4-branch = default -r22a3-branch = default -r22a2-branch = default -r161-branch = default -cnri-16-start = default -universal-33 = default -None = default -avendor = default -Distutils_0_1_3-branch = default -release152p1-patches = default -string_methods = default -PYIDE = default -OSAM = default -PYTHONSCRIPT = default -BBPY = default -jar = default -alpha100 = default -unlabeled-2.36.4 = default -unlabeled-2.1.4 = default -unlabeled-2.25.4 = default -branch_libffi-3_0_10-win = default -iter-branch = default -gen-branch = default -descr-branch = default -ast-branch = default -ast-objects = default -ast-arena = default -benjaminp-testing = default -fix-test-ftplib = default -trunk-math = default -okkoto-sizeof = 
default -trunk-bytearray = default -libffi3-branch = default -../ctypes-branch = default -pep302_phase2 = default -py3k-buffer = default -unlabeled-2.9.2 = default -unlabeled-1.5.4 = default -unlabeled-1.1.2 = default -unlabeled-1.1.1 = default -unlabeled-2.9.4 = default -unlabeled-2.10.2 = default -unlabeled-2.1.2 = default -unlabeled-2.108.2 = default -unlabeled-2.36.2 = default -unlabeled-2.54.2 = default -unlabeled-1.3.2 = default -unlabeled-1.23.4 = default -unlabeled-2.25.2 = default -unlabeled-1.2.2 = default -unlabeled-1.5.2 = default -unlabeled-1.98.2 = default -unlabeled-2.16.2 = default -unlabeled-2.3.2 = default -unlabeled-1.9.2 = default -unlabeled-1.8.2 = default -aimacintyre-sf1454481 = default -tim-current_frames = default -bippolito-newstruct = default -runar-longslice-branch = default -steve-notracing = default -rjones-funccall = default -sreifschneider-newnewexcept = default -tim-doctest-branch = default -blais-bytebuf = default -../bippolito-newstruct = default -rjones-prealloc = default -sreifschneider-64ints = default -stdlib-cleanup = default -ssize_t = default -sqlite-integration = default -tim-obmalloc = default -import_unicode = default -test_subprocess_10826 = 3.x -py3k-signalfd-issue8407 = 3.x -sslopts-4870 = 3.x -py3k-dtoa = 3.x -py3k-jit = 3.x -py3k-cdecimal = 3.x -py3k-issue4970 = 3.x -pep-0384 = 3.x -pep-0383 = 3.x -py3k-issue1717 = 3.x -io-c = 3.x -py3k-short-float-repr = 3.x -py3k-ctypes-pep3118 = 3.x -py3k-grandrenaming = 3.x -py3k-pep3137 = 3.x -py3k-importlib = 3.x -alex-py3k = 3.x -cpy_merge = 3.x -py3k-struni = 3.x -int_unification = 3.x -p3yk-noslice = 3.x -py3k-importhook = 3.x -py3k-urllib = 3.x -p3yk_no_args_on_exc = 3.x -py3k-ttk-debug-on-xp5 = 3.x -py3k-stat-on-windows = 3.x -issue9437 = 3.x -issue9003 = 3.x -pep-382 = 3.x -dmalcolm-ast-optimization-branch = 3.x -pep-3151 = 3.x -import-unicode = 3.x -py3k-stat-on-windows-v2 = 3.x -py3k-futures-on-windows = 3.x -issue10276-snowleopard = 3.x +siginterrupt-reset-issue8354 = 2.x +asyncore-tests-issue8490 = 2.x +signalfd-issue8407 = 2.x +tarek_sysconfig = 2.x +pep370 = 2.x +tk_and_idle_maintenance = 2.x +multiprocessing-autoconf = 2.x +tlee-ast-optimize = 2.x +empty = 2.x +tnelson-trunk-bsddb-47-upgrade = 2.x +ctypes-branch = 2.x +decimal-branch = 2.x +bcannon-objcap = 2.x +amk-mailbox = 2.x +twouters-dictviews-backport = 2.x +bcannon-sandboxing = 2.x +theller_modulefinder = 2.x +hoxworth-stdlib_logging-soc = 2.x +tim-exc_sanity = 2.x +IDLE-syntax-branch = 2.x +jim-doctest = 2.x +release23-branch = 2.x +jim-modulator = 2.x +indexing-cleanup-branch = 2.x +r23c1-branch = 2.x +r23b2-branch = 2.x +anthony-parser-branch = 2.x +r23b1-branch = 2.x +idlefork-merge-branch = 2.x +getargs_mask_mods = 2.x +cache-attr-branch = 2.x +folding-reimpl-branch = 2.x +r23a2-branch = 2.x +bsddb-bsddb3-schizo-branch = 2.x +r23a1-branch = 2.x +py-cvs-vendor-branch = 2.x +DS_RPC_BRANCH = 2.x +SourceForge = 2.x +release22-branch = 2.x +r22rc1-branch = 2.x +r22b2-branch = 2.x +r22b1-branch = 2.x +r22a4-branch = 2.x +r22a3-branch = 2.x +r22a2-branch = 2.x +r161-branch = 2.x +cnri-16-start = 2.x +universal-33 = 2.x +None = 2.x +avendor = 2.x +Distutils_0_1_3-branch = 2.x +release152p1-patches = 2.x +string_methods = 2.x +PYIDE = 2.x +OSAM = 2.x +PYTHONSCRIPT = 2.x +BBPY = 2.x +jar = 2.x +alpha100 = 2.x +unlabeled-2.36.4 = 2.x +unlabeled-2.1.4 = 2.x +unlabeled-2.25.4 = 2.x +branch_libffi-3_0_10-win = 2.x +iter-branch = 2.x +gen-branch = 2.x +descr-branch = 2.x +ast-branch = 2.x +ast-objects = 2.x +ast-arena = 2.x 
+benjaminp-testing = 2.x +fix-test-ftplib = 2.x +trunk-math = 2.x +okkoto-sizeof = 2.x +trunk-bytearray = 2.x +libffi3-branch = 2.x +../ctypes-branch = 2.x +pep302_phase2 = 2.x +py3k-buffer = 2.x +unlabeled-2.9.2 = 2.x +unlabeled-1.5.4 = 2.x +unlabeled-1.1.2 = 2.x +unlabeled-1.1.1 = 2.x +unlabeled-2.9.4 = 2.x +unlabeled-2.10.2 = 2.x +unlabeled-2.1.2 = 2.x +unlabeled-2.108.2 = 2.x +unlabeled-2.36.2 = 2.x +unlabeled-2.54.2 = 2.x +unlabeled-1.3.2 = 2.x +unlabeled-1.23.4 = 2.x +unlabeled-2.25.2 = 2.x +unlabeled-1.2.2 = 2.x +unlabeled-1.5.2 = 2.x +unlabeled-1.98.2 = 2.x +unlabeled-2.16.2 = 2.x +unlabeled-2.3.2 = 2.x +unlabeled-1.9.2 = 2.x +unlabeled-1.8.2 = 2.x +aimacintyre-sf1454481 = 2.x +tim-current_frames = 2.x +bippolito-newstruct = 2.x +runar-longslice-branch = 2.x +steve-notracing = 2.x +rjones-funccall = 2.x +sreifschneider-newnewexcept = 2.x +tim-doctest-branch = 2.x +blais-bytebuf = 2.x +../bippolito-newstruct = 2.x +rjones-prealloc = 2.x +sreifschneider-64ints = 2.x +stdlib-cleanup = 2.x +ssize_t = 2.x +sqlite-integration = 2.x +tim-obmalloc = 2.x +import_unicode = 2.x +test_subprocess_10826 = default +py3k-signalfd-issue8407 = default +sslopts-4870 = default +py3k-dtoa = default +py3k-jit = default +py3k-cdecimal = default +py3k-issue4970 = default +pep-0384 = default +pep-0383 = default +py3k-issue1717 = default +io-c = default +py3k-short-float-repr = default +py3k-ctypes-pep3118 = default +py3k-grandrenaming = default +py3k-pep3137 = default +py3k-importlib = default +alex-py3k = default +cpy_merge = default +py3k-struni = default +int_unification = default +p3yk-noslice = default +py3k-importhook = default +py3k-urllib = default +p3yk_no_args_on_exc = default +py3k-ttk-debug-on-xp5 = default +py3k-stat-on-windows = default +issue9437 = default +issue9003 = default +pep-382 = default +dmalcolm-ast-optimization-branch = default +pep-3151 = default +import-unicode = default +py3k-stat-on-windows-v2 = default +py3k-futures-on-windows = default +issue10276-snowleopard = default -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 02:08:50 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 02:08:50 +0100 Subject: [Python-checkins] pymigr: Fix split.py Message-ID: antoine.pitrou pushed cd1527a081be to pymigr: http://hg.python.org/pymigr/rev/cd1527a081be changeset: 89:cd1527a081be parent: 87:ff58f39081e3 user: Antoine Pitrou date: Wed Feb 23 19:05:28 2011 +0100 summary: Fix split.py files: split.py diff --git a/split.py b/split.py --- a/split.py +++ b/split.py @@ -16,7 +16,10 @@ for n in repo.heads(): ctx = repo[n] - path = ctx.extra()['convert_revision'].split('/', 1)[1] + convert_revision = ctx.extra().get('convert_revision') + if not convert_revision: + continue + path = convert_revision.split('/', 1)[1] path = path.rsplit('@', 1)[0] tags = ctx.tags() -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 02:08:50 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 02:08:50 +0100 Subject: [Python-checkins] merge in pymigr: Merge Message-ID: antoine.pitrou pushed 7e674f360039 to pymigr: http://hg.python.org/pymigr/rev/7e674f360039 changeset: 92:7e674f360039 tag: tip parent: 91:6b6be999edb8 parent: 88:655c7ac7a583 user: Antoine Pitrou date: Thu Feb 24 02:08:20 2011 +0100 summary: Merge files: diff --git a/findlarge.py b/findlarge.py --- a/findlarge.py +++ b/findlarge.py @@ -4,7 +4,7 @@ largest data. 
Typical use (from a "python-hg" subdirectory of the working copy): - $ ../findlarge.py | xargs -n 1 hg shrink --revlog + $ ../findlarge.py | xargs -0 -n 1 hg shrink --revlog Full recipe for shrinking a repo: $ hg shrink -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 02:41:46 2011 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 24 Feb 2011 02:41:46 +0100 (CET) Subject: [Python-checkins] r88539 - in python/branches/py3k/Lib/lib2to3: __main__.py Message-ID: <20110224014146.65256EE9F0@mail.python.org> Author: benjamin.peterson Date: Thu Feb 24 02:41:46 2011 New Revision: 88539 Log: Merged revisions 88535 via svnmerge from svn+ssh://pythondev at svn.python.org/sandbox/trunk/2to3/lib2to3 ........ r88535 | brett.cannon | 2011-02-23 13:46:46 -0600 (Wed, 23 Feb 2011) | 1 line Add lib2to3.__main__ for easy testing from the console. ........ Added: python/branches/py3k/Lib/lib2to3/__main__.py - copied unchanged from r88535, /sandbox/trunk/2to3/lib2to3/__main__.py Modified: python/branches/py3k/Lib/lib2to3/ (props changed) From python-checkins at python.org Thu Feb 24 03:46:00 2011 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 24 Feb 2011 03:46:00 +0100 (CET) Subject: [Python-checkins] r88540 - python/branches/py3k/Doc/library/io.rst Message-ID: <20110224024600.B915FEE9CB@mail.python.org> Author: benjamin.peterson Date: Thu Feb 24 03:46:00 2011 New Revision: 88540 Log: this seems to be pointlessly nested Modified: python/branches/py3k/Doc/library/io.rst Modified: python/branches/py3k/Doc/library/io.rst ============================================================================== --- python/branches/py3k/Doc/library/io.rst (original) +++ python/branches/py3k/Doc/library/io.rst Thu Feb 24 03:46:00 2011 @@ -785,17 +785,14 @@ inherits :class:`codecs.IncrementalDecoder`. -Advanced topics ---------------- - -Here we will discuss several advanced topics pertaining to the concrete -I/O implementations described above. - Performance -^^^^^^^^^^^ +----------- + +This section discusses the performance of the provided concrete IO +implementations. Binary I/O -"""""""""" +^^^^^^^^^^ By reading and writing only large chunks of data even when the user asks for a single byte, buffered I/O is designed to hide any inefficiency in @@ -808,7 +805,7 @@ use buffered I/O rather than unbuffered I/O. Text I/O -"""""""" +^^^^^^^^ Text I/O over a binary storage (such as a file) is significantly slower than binary I/O over the same storage, because it implies conversions from From python-checkins at python.org Thu Feb 24 03:53:06 2011 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 24 Feb 2011 03:53:06 +0100 (CET) Subject: [Python-checkins] r88541 - python/branches/py3k/Doc/library/io.rst Message-ID: <20110224025306.1318CEC21@mail.python.org> Author: benjamin.peterson Date: Thu Feb 24 03:53:05 2011 New Revision: 88541 Log: rewrite Modified: python/branches/py3k/Doc/library/io.rst Modified: python/branches/py3k/Doc/library/io.rst ============================================================================== --- python/branches/py3k/Doc/library/io.rst (original) +++ python/branches/py3k/Doc/library/io.rst Thu Feb 24 03:53:05 2011 @@ -788,31 +788,30 @@ Performance ----------- -This section discusses the performance of the provided concrete IO +This section discusses the performance of the provided concrete I/O implementations. 
Binary I/O ^^^^^^^^^^ -By reading and writing only large chunks of data even when the user asks -for a single byte, buffered I/O is designed to hide any inefficiency in -calling and executing the operating system's unbuffered I/O routines. The -gain will vary very much depending on the OS and the kind of I/O which is -performed (for example, on some contemporary OSes such as Linux, unbuffered -disk I/O can be as fast as buffered I/O). The bottom line, however, is -that buffered I/O will offer you predictable performance regardless of the -platform and the backing device. Therefore, it is most always preferable to -use buffered I/O rather than unbuffered I/O. +By reading and writing only large chunks of data even when the user asks for a +single byte, buffered I/O hides any inefficiency in calling and executing the +operating system's unbuffered I/O routines. The gain depends on the OS and the +kind of I/O which is performed. For example, on some modern OSes such as Linux, +unbuffered disk I/O can be as fast as buffered I/O. The bottom line, however, +is that buffered I/O offers predictable performance regardless of the platform +and the backing device. Therefore, it is most always preferable to use buffered +I/O rather than unbuffered I/O for binary datal Text I/O ^^^^^^^^ Text I/O over a binary storage (such as a file) is significantly slower than -binary I/O over the same storage, because it implies conversions from -unicode to binary data using a character codec. This can become noticeable -if you handle huge amounts of text data (for example very large log files). -Also, :meth:`TextIOWrapper.tell` and :meth:`TextIOWrapper.seek` are both -quite slow due to the reconstruction algorithm used. +binary I/O over the same storage, because it requires conversions between +unicode and binary data using a character codec. This can become noticeable +handling huge amounts of text data like large log files. Also, +:meth:`TextIOWrapper.tell` and :meth:`TextIOWrapper.seek` are both quite slow +due to the reconstruction algorithm used. :class:`StringIO`, however, is a native in-memory unicode container and will exhibit similar speed to :class:`BytesIO`. @@ -820,9 +819,8 @@ Multi-threading ^^^^^^^^^^^^^^^ -:class:`FileIO` objects are thread-safe to the extent that the operating -system calls (such as ``read(2)`` under Unix) they are wrapping are thread-safe -too. +:class:`FileIO` objects are thread-safe to the extent that the operating system +calls (such as ``read(2)`` under Unix) they wrap are thread-safe too. Binary buffered objects (instances of :class:`BufferedReader`, :class:`BufferedWriter`, :class:`BufferedRandom` and :class:`BufferedRWPair`) @@ -837,12 +835,13 @@ Binary buffered objects (instances of :class:`BufferedReader`, :class:`BufferedWriter`, :class:`BufferedRandom` and :class:`BufferedRWPair`) are not reentrant. While reentrant calls will not happen in normal situations, -they can arise if you are doing I/O in a :mod:`signal` handler. If it is -attempted to enter a buffered object again while already being accessed -*from the same thread*, then a :exc:`RuntimeError` is raised. - -The above implicitly extends to text files, since the :func:`open()` -function will wrap a buffered object inside a :class:`TextIOWrapper`. This -includes standard streams and therefore affects the built-in function -:func:`print()` as well. +they can arise from doing I/O in a :mod:`signal` handler. If a thread tries to +renter a buffered object which it is already accessing, a :exc:`RuntimeError` is +raised. 
Note this doesn't prohibit a different thread from entering the +buffered object. + +The above implicitly extends to text files, since the :func:`open()` function +will wrap a buffered object inside a :class:`TextIOWrapper`. This includes +standard streams and therefore affects the built-in function :func:`print()` as +well. From python-checkins at python.org Thu Feb 24 04:03:46 2011 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 24 Feb 2011 04:03:46 +0100 (CET) Subject: [Python-checkins] r88542 - in python/branches/release32-maint: Doc/library/io.rst Message-ID: <20110224030346.3CF0FFB4F@mail.python.org> Author: benjamin.peterson Date: Thu Feb 24 04:03:46 2011 New Revision: 88542 Log: Merged revisions 88540-88541 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88540 | benjamin.peterson | 2011-02-23 20:46:00 -0600 (Wed, 23 Feb 2011) | 1 line this seems to be pointlessly nested ........ r88541 | benjamin.peterson | 2011-02-23 20:53:05 -0600 (Wed, 23 Feb 2011) | 1 line rewrite ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Doc/library/io.rst Modified: python/branches/release32-maint/Doc/library/io.rst ============================================================================== --- python/branches/release32-maint/Doc/library/io.rst (original) +++ python/branches/release32-maint/Doc/library/io.rst Thu Feb 24 04:03:46 2011 @@ -785,37 +785,33 @@ inherits :class:`codecs.IncrementalDecoder`. -Advanced topics ---------------- - -Here we will discuss several advanced topics pertaining to the concrete -I/O implementations described above. - Performance -^^^^^^^^^^^ +----------- + +This section discusses the performance of the provided concrete I/O +implementations. Binary I/O -"""""""""" +^^^^^^^^^^ -By reading and writing only large chunks of data even when the user asks -for a single byte, buffered I/O is designed to hide any inefficiency in -calling and executing the operating system's unbuffered I/O routines. The -gain will vary very much depending on the OS and the kind of I/O which is -performed (for example, on some contemporary OSes such as Linux, unbuffered -disk I/O can be as fast as buffered I/O). The bottom line, however, is -that buffered I/O will offer you predictable performance regardless of the -platform and the backing device. Therefore, it is most always preferable to -use buffered I/O rather than unbuffered I/O. +By reading and writing only large chunks of data even when the user asks for a +single byte, buffered I/O hides any inefficiency in calling and executing the +operating system's unbuffered I/O routines. The gain depends on the OS and the +kind of I/O which is performed. For example, on some modern OSes such as Linux, +unbuffered disk I/O can be as fast as buffered I/O. The bottom line, however, +is that buffered I/O offers predictable performance regardless of the platform +and the backing device. Therefore, it is most always preferable to use buffered +I/O rather than unbuffered I/O for binary datal Text I/O -"""""""" +^^^^^^^^ Text I/O over a binary storage (such as a file) is significantly slower than -binary I/O over the same storage, because it implies conversions from -unicode to binary data using a character codec. This can become noticeable -if you handle huge amounts of text data (for example very large log files). -Also, :meth:`TextIOWrapper.tell` and :meth:`TextIOWrapper.seek` are both -quite slow due to the reconstruction algorithm used. 
+binary I/O over the same storage, because it requires conversions between +unicode and binary data using a character codec. This can become noticeable +handling huge amounts of text data like large log files. Also, +:meth:`TextIOWrapper.tell` and :meth:`TextIOWrapper.seek` are both quite slow +due to the reconstruction algorithm used. :class:`StringIO`, however, is a native in-memory unicode container and will exhibit similar speed to :class:`BytesIO`. @@ -823,9 +819,8 @@ Multi-threading ^^^^^^^^^^^^^^^ -:class:`FileIO` objects are thread-safe to the extent that the operating -system calls (such as ``read(2)`` under Unix) they are wrapping are thread-safe -too. +:class:`FileIO` objects are thread-safe to the extent that the operating system +calls (such as ``read(2)`` under Unix) they wrap are thread-safe too. Binary buffered objects (instances of :class:`BufferedReader`, :class:`BufferedWriter`, :class:`BufferedRandom` and :class:`BufferedRWPair`) @@ -840,12 +835,13 @@ Binary buffered objects (instances of :class:`BufferedReader`, :class:`BufferedWriter`, :class:`BufferedRandom` and :class:`BufferedRWPair`) are not reentrant. While reentrant calls will not happen in normal situations, -they can arise if you are doing I/O in a :mod:`signal` handler. If it is -attempted to enter a buffered object again while already being accessed -*from the same thread*, then a :exc:`RuntimeError` is raised. - -The above implicitly extends to text files, since the :func:`open()` -function will wrap a buffered object inside a :class:`TextIOWrapper`. This -includes standard streams and therefore affects the built-in function -:func:`print()` as well. +they can arise from doing I/O in a :mod:`signal` handler. If a thread tries to +renter a buffered object which it is already accessing, a :exc:`RuntimeError` is +raised. Note this doesn't prohibit a different thread from entering the +buffered object. + +The above implicitly extends to text files, since the :func:`open()` function +will wrap a buffered object inside a :class:`TextIOWrapper`. This includes +standard streams and therefore affects the built-in function :func:`print()` as +well. 
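A minimal, illustrative sketch of the buffered/unbuffered distinction described in the io.rst text above (the scratch file is created only for the demonstration; nothing here is taken from the patch itself):

    import io
    import tempfile

    # Create a small scratch file to read back.
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(b"x" * 4096)
        path = tmp.name

    raw = open(path, "rb", buffering=0)   # unbuffered: every read() goes to the OS (a FileIO)
    buf = open(path, "rb")                # buffered (default): reads served from an in-memory buffer
    print(isinstance(raw, io.FileIO))          # True
    print(isinstance(buf, io.BufferedReader))  # True
    raw.close()
    buf.close()
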
From python-checkins at python.org Thu Feb 24 04:34:26 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 04:34:26 +0100 Subject: [Python-checkins] pymigr: Add new branches, and fix that import_unicode is in 3.x Message-ID: antoine.pitrou pushed 85dbdb67a20f to pymigr: http://hg.python.org/pymigr/rev/85dbdb67a20f changeset: 93:85dbdb67a20f user: Antoine Pitrou date: Thu Feb 24 02:47:48 2011 +0100 summary: Add new branches, and fix that import_unicode is in 3.x files: branchmap.txt diff --git a/branchmap.txt b/branchmap.txt --- a/branchmap.txt +++ b/branchmap.txt @@ -125,7 +125,7 @@ ssize_t = 2.x sqlite-integration = 2.x tim-obmalloc = 2.x -import_unicode = 2.x +release27-maint-ttk-debug-on-xp5 = 2.x test_subprocess_10826 = default py3k-signalfd-issue8407 = default sslopts-4870 = default @@ -161,3 +161,9 @@ py3k-stat-on-windows-v2 = default py3k-futures-on-windows = default issue10276-snowleopard = default +import_unicode = default +py3k-issue9978 = default +issue5639 = default +issue4388 = default +issue10209 = default +bbtest = default -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 04:34:26 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 04:34:26 +0100 Subject: [Python-checkins] pymigr: Add an experimental tool to dummy-merge useless heads, to make the Message-ID: antoine.pitrou pushed cb91d06e83e9 to pymigr: http://hg.python.org/pymigr/rev/cb91d06e83e9 changeset: 94:cb91d06e83e9 user: Antoine Pitrou date: Thu Feb 24 03:46:21 2011 +0100 summary: Add an experimental tool to dummy-merge useless heads, to make the topology much simpler for hg. files: closebranches.py diff --git a/closebranches.py b/closebranches.py new file mode 100755 --- /dev/null +++ b/closebranches.py @@ -0,0 +1,64 @@ +#!/usr/bin/env python +""" +Experimental tool to make "dummy merges" of obsolete heads (all heads except +tips of named branches), to reduce the number of heads and make life easier for hg. +""" + +from pprint import pprint +import sys + +from mercurial import ui, hg, commands + +# see http://mercurial.selenic.com/wiki/TipsAndTricks#mergemineortheir + +head_tmpl = """\ +hg up %(branch)s +hg -y merge -t internal:fail %(head_rev)s +hg revert -a -r . +hg resolve -am +hg ci -m "(hg migration: dummy merge of %(svn_branch)r)" +""" + +def walk(ui, repo): + # The heads in descending order ("latest" first) + heads = repo.heads() + # Latest head by branch + branch_tips = {} + for n in heads: + ctx = repo[n] + branch = ctx.branch() + if branch not in branch_tips: + branch_tips[branch] = n + print "branch tips:" + pprint(branch_tips) + + # Then list those heads which are *not* branch-local tips + for n in heads: + ctx = repo[n] + branch = ctx.branch() + if n == branch_tips[branch]: + continue + extra = ctx.extra() + svn_rev = extra['convert_revision'] + svn_branch = '/'.join(svn_rev.split('/')[-2:]) + print "dummy merge of %s (%r)" % (ctx.hex(), svn_branch) + #print head_tmpl % dict( + #branch=branch, head_rev=ctx.hex(), svn_branch=svn_branch) + commands.update(ui, repo, rev=branch) + commands.merge(ui, repo, n, tool='internal:local') + commands.revert(ui, repo, all=True, rev=branch) + commands.resolve(ui, repo, all=True, mark=True) + commands.commit(ui, repo, + message="(hg migration: dummy merge of %r)" % svn_branch) + + +if __name__ == '__main__': + if len(sys.argv) > 1: + repo_path = sys.argv[1] + else: + repo_path = '.' 
+ ui = ui.ui() + # Allow merges to proceed without asking questions + ui.setconfig('ui', 'interactive', False) + repo = hg.repository(ui, repo_path) + walk(ui, repo) -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 04:34:26 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 04:34:26 +0100 Subject: [Python-checkins] pymigr: Fix findlarge.py, suggest patch and more actions Message-ID: antoine.pitrou pushed 54a76c200148 to pymigr: http://hg.python.org/pymigr/rev/54a76c200148 changeset: 95:54a76c200148 tag: tip user: Antoine Pitrou date: Thu Feb 24 04:34:20 2011 +0100 summary: Fix findlarge.py, suggest patch and more actions files: findlarge.py diff --git a/findlarge.py b/findlarge.py --- a/findlarge.py +++ b/findlarge.py @@ -9,11 +9,78 @@ Full recipe for shrinking a repo: $ hg shrink $ ../findlarge.py | xargs -0 -n 1 hg shrink --revlog + # the following optional $ hg verify $ find .hg -name "*.old" -print0 | xargs -0 rm +You can also try an other reordering algorithm: + $ ../findlarge.py | xargs -0 -n 1 hg shrink --sort postorderreverse --revlog + + +NOTE: Many operations become *much* slower after the shrink of a full converted +repo (probably because of the large number of heads - more than 400 - combined +with the new non-natural revlog order). +So it's better to first build a bundle of py3k before shrinking: + + $ hg bundle --base null -r 3.x -r 3.2.x -r 3.1.x -r 3.0.x py3k.bundle + +(replace "3.x" with "default" if the newer branchmap is used) + """ +# !--- It is advised to apply the following patch to your hg tree ---! + +''' +diff -r 5fc7c84ed9b0 contrib/shrink-revlog.py +--- a/contrib/shrink-revlog.py Tue Feb 01 17:30:13 2011 -0600 ++++ b/contrib/shrink-revlog.py Thu Feb 24 04:11:42 2011 +0100 +@@ -145,6 +145,7 @@ def report(ui, r1, r2): + shrink_factor = oldsize / newsize + ui.write(_('shrinkage: %.1f%% (%.1fx)\n') + % (shrink_percent, shrink_factor)) ++ return shrink_factor + + def shrink(ui, repo, **opts): + """shrink a revlog by reordering revisions +@@ -235,7 +236,7 @@ def shrink(ui, repo, **opts): + ui.note(_('%d suboptimal nodes\n') % suboptimal) + + writerevs(ui, r1, r2, order, tr) +- report(ui, r1, r2) ++ shrink_factor = report(ui, r1, r2) + tr.close() + except: + # Abort transaction first, so we truncate the files before +@@ -244,7 +245,7 @@ def shrink(ui, repo, **opts): + for fn in (tmpindexfn, tmpdatafn): + ignoremissing(os.unlink)(fn) + raise +- if not opts.get('dry_run'): ++ if not opts.get('dry_run') and shrink_factor > 1.0: + # racy, both files cannot be renamed atomically + # copy files + util.os_link(indexfn, oldindexfn) +@@ -253,7 +254,7 @@ def shrink(ui, repo, **opts): + # rename + util.rename(tmpindexfn, indexfn) + try: +- os.chmod(tmpdatafn, os.stat(datafn).st_mode) ++ os.chmod(tmpdatafn, os.stat(indexfn).st_mode) + util.rename(tmpdatafn, datafn) + except OSError, inst: + if inst.errno != errno.ENOENT: +@@ -265,7 +266,7 @@ def shrink(ui, repo, **opts): + finally: + lock.release() + +- if not opts.get('dry_run'): ++ if not opts.get('dry_run') and shrink_factor > 1.0: + ui.write(_('note: old revlog saved in:\n' + ' %s\n' + ' %s\n' +''' + + import os import sys import collections @@ -32,9 +99,11 @@ continue fn = os.path.join(dir, fn) fni = os.path.join(dir, fni) - # Strange issue with this file, skip it - if fni.endswith('_misc/python-mode.el.i'): - continue + # Strange issues with these files, skip + #if (fni.endswith('_misc/python-mode.el.i') or + #'objects/listobject.c' in fni or + 
#'objects/dictobject.c' in fni): + #continue files[fni] += os.path.getsize(fn) return files @@ -43,11 +112,12 @@ [(v, k) for (k, v) in file_sizes('.hg/store/dh').items()] ) -nlargest = sorted(files, reverse=True)[0:1000] +nlargest = sorted(files, reverse=True)[0:2000] if '-v' in sys.argv: for s, fn in nlargest: print s, fn print "(total %d)" % sum(s for s, fn in nlargest) else: - print '\0'.join(fn for s, fn in nlargest) + # no final \n + sys.stdout.write('\0'.join(fn for s, fn in nlargest)) -- Repository URL: http://hg.python.org/pymigr From solipsis at pitrou.net Thu Feb 24 05:06:23 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 24 Feb 2011 05:06:23 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88539): sum=0 Message-ID: py3k results for svn r88539 (hg cset 0a23ca25cb7b) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogPPXw89', '-x'] From python-checkins at python.org Thu Feb 24 08:34:06 2011 From: python-checkins at python.org (georg.brandl) Date: Thu, 24 Feb 2011 08:34:06 +0100 (CET) Subject: [Python-checkins] r88543 - peps/trunk/pep-0101.txt Message-ID: <20110224073406.78B87ECD1@mail.python.org> Author: georg.brandl Date: Thu Feb 24 08:34:06 2011 New Revision: 88543 Log: Add Apache alias step. Modified: peps/trunk/pep-0101.txt Modified: peps/trunk/pep-0101.txt ============================================================================== --- peps/trunk/pep-0101.txt (original) +++ peps/trunk/pep-0101.txt Thu Feb 24 08:34:06 2011 @@ -362,7 +362,8 @@ directory, You'll move everything in there when the final release comes out. - ___ Move the release .tgz, tar.bz2, and .msi files into place + ___ Move the release .tgz, tar.bz2, and .msi files into place, as well + as the .asc GPG signature files. Make sure they are world readable. They should also be group writable, and group-owned by webmaster. @@ -447,9 +448,6 @@ ___ edit the previous release's last release content.ht page to point to the new release. - ___ Mention the release as the most recent stable one in - `doc/faq/general/content.ht` (section "How stable is Python?") - ___ update `doc/content.ht` to indicate the new current documentation version, and remove the current version from any 'in development' section. Update the version in the "What's New" link. @@ -467,14 +465,10 @@ ___ cd to download/releases/X.Y.Z ___ Edit the version numbers in content.ht - ___ Comment out the link to the documentation if this is not a final, + ___ Comment out the link to the CHM file if this is not a final, remove the comment if it is. - ___ Copy the new .asc files into place ___ Update the md5 checksums - ___ Copy Misc/NEWS to download/releases/X.Y.Z/NEWS.txt - ___ Copy Lib/idlelib/NEWS.txt to download/releases/X.Y.Z/IDLENEWS.txt - Note, you don't have to copy the actual .tgz or tar.bz2 tarballs into this directory because they only live on dinsdale in the ftp directory. @@ -482,6 +476,9 @@ will trigger the live site to update itself, and at that point the release is live. + ___ If this is a final release, create a new python.org/X.Y Apache alias + (or ask pydotorg to do so for you). + Now it's time to write the announcement for the mailing lists. This is the fuzzy bit because not much can be automated. You can use an earlier announcement as a template, but edit it for content! 
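As a hedged aid to the "update the md5 checksums" step in the PEP 101 checklist above, a throwaway helper of the kind one might use (the script name and the release file names are hypothetical, not part of PEP 101):

    # md5sums.py -- print an MD5 checksum for each file named on the command line,
    # e.g.: python md5sums.py Python-X.Y.Z.tgz Python-X.Y.Z.tar.bz2
    import hashlib
    import sys

    for name in sys.argv[1:]:
        md5 = hashlib.md5()
        with open(name, "rb") as f:
            # Read in 1 MiB chunks so large tarballs don't have to fit in memory.
            for chunk in iter(lambda: f.read(1 << 20), b""):
                md5.update(chunk)
        sys.stdout.write("%s  %s\n" % (md5.hexdigest(), name))
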
From python-checkins at python.org Thu Feb 24 12:15:36 2011 From: python-checkins at python.org (raymond.hettinger) Date: Thu, 24 Feb 2011 12:15:36 +0100 (CET) Subject: [Python-checkins] r88544 - python/branches/release32-maint/Doc/whatsnew/3.2.rst Message-ID: <20110224111536.53ED5DCCF@mail.python.org> Author: raymond.hettinger Date: Thu Feb 24 12:15:36 2011 New Revision: 88544 Log: Issue #11296: rsplit() mentioned twice for the same change. Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst Modified: python/branches/release32-maint/Doc/whatsnew/3.2.rst ============================================================================== --- python/branches/release32-maint/Doc/whatsnew/3.2.rst (original) +++ python/branches/release32-maint/Doc/whatsnew/3.2.rst Thu Feb 24 12:15:36 2011 @@ -2354,7 +2354,7 @@ (Contributed by Antoine Pitrou; :issue:`3001`.) * The fast-search algorithm in stringlib is now used by the :meth:`split`, - :meth:`rsplit`, :meth:`splitlines` and :meth:`replace` methods on + :meth:`splitlines` and :meth:`replace` methods on :class:`bytes`, :class:`bytearray` and :class:`str` objects. Likewise, the algorithm is also used by :meth:`rfind`, :meth:`rindex`, :meth:`rsplit` and :meth:`rpartition`. From python-checkins at python.org Thu Feb 24 16:40:13 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 16:40:13 +0100 Subject: [Python-checkins] pymigr: Use branch name "trunk" instead of "2.x", since some of this is pre-2.x. Message-ID: antoine.pitrou pushed 5fc097a1c66e to pymigr: http://hg.python.org/pymigr/rev/5fc097a1c66e changeset: 96:5fc097a1c66e user: Antoine Pitrou date: Thu Feb 24 16:39:01 2011 +0100 summary: Use branch name "trunk" instead of "2.x", since some of this is pre-2.x. files: branchmap.txt diff --git a/branchmap.txt b/branchmap.txt --- a/branchmap.txt +++ b/branchmap.txt @@ -11,121 +11,121 @@ release22-maint = 2.2.x release21-maint = 2.1.x release20-maint = 2.0.x -siginterrupt-reset-issue8354 = 2.x -asyncore-tests-issue8490 = 2.x -signalfd-issue8407 = 2.x -tarek_sysconfig = 2.x -pep370 = 2.x -tk_and_idle_maintenance = 2.x -multiprocessing-autoconf = 2.x -tlee-ast-optimize = 2.x -empty = 2.x -tnelson-trunk-bsddb-47-upgrade = 2.x -ctypes-branch = 2.x -decimal-branch = 2.x -bcannon-objcap = 2.x -amk-mailbox = 2.x -twouters-dictviews-backport = 2.x -bcannon-sandboxing = 2.x -theller_modulefinder = 2.x -hoxworth-stdlib_logging-soc = 2.x -tim-exc_sanity = 2.x -IDLE-syntax-branch = 2.x -jim-doctest = 2.x -release23-branch = 2.x -jim-modulator = 2.x -indexing-cleanup-branch = 2.x -r23c1-branch = 2.x -r23b2-branch = 2.x -anthony-parser-branch = 2.x -r23b1-branch = 2.x -idlefork-merge-branch = 2.x -getargs_mask_mods = 2.x -cache-attr-branch = 2.x -folding-reimpl-branch = 2.x -r23a2-branch = 2.x -bsddb-bsddb3-schizo-branch = 2.x -r23a1-branch = 2.x -py-cvs-vendor-branch = 2.x -DS_RPC_BRANCH = 2.x -SourceForge = 2.x -release22-branch = 2.x -r22rc1-branch = 2.x -r22b2-branch = 2.x -r22b1-branch = 2.x -r22a4-branch = 2.x -r22a3-branch = 2.x -r22a2-branch = 2.x -r161-branch = 2.x -cnri-16-start = 2.x -universal-33 = 2.x -None = 2.x -avendor = 2.x -Distutils_0_1_3-branch = 2.x -release152p1-patches = 2.x -string_methods = 2.x -PYIDE = 2.x -OSAM = 2.x -PYTHONSCRIPT = 2.x -BBPY = 2.x -jar = 2.x -alpha100 = 2.x -unlabeled-2.36.4 = 2.x -unlabeled-2.1.4 = 2.x -unlabeled-2.25.4 = 2.x -branch_libffi-3_0_10-win = 2.x -iter-branch = 2.x -gen-branch = 2.x -descr-branch = 2.x -ast-branch = 2.x -ast-objects = 2.x -ast-arena = 2.x -benjaminp-testing = 
2.x -fix-test-ftplib = 2.x -trunk-math = 2.x -okkoto-sizeof = 2.x -trunk-bytearray = 2.x -libffi3-branch = 2.x -../ctypes-branch = 2.x -pep302_phase2 = 2.x -py3k-buffer = 2.x -unlabeled-2.9.2 = 2.x -unlabeled-1.5.4 = 2.x -unlabeled-1.1.2 = 2.x -unlabeled-1.1.1 = 2.x -unlabeled-2.9.4 = 2.x -unlabeled-2.10.2 = 2.x -unlabeled-2.1.2 = 2.x -unlabeled-2.108.2 = 2.x -unlabeled-2.36.2 = 2.x -unlabeled-2.54.2 = 2.x -unlabeled-1.3.2 = 2.x -unlabeled-1.23.4 = 2.x -unlabeled-2.25.2 = 2.x -unlabeled-1.2.2 = 2.x -unlabeled-1.5.2 = 2.x -unlabeled-1.98.2 = 2.x -unlabeled-2.16.2 = 2.x -unlabeled-2.3.2 = 2.x -unlabeled-1.9.2 = 2.x -unlabeled-1.8.2 = 2.x -aimacintyre-sf1454481 = 2.x -tim-current_frames = 2.x -bippolito-newstruct = 2.x -runar-longslice-branch = 2.x -steve-notracing = 2.x -rjones-funccall = 2.x -sreifschneider-newnewexcept = 2.x -tim-doctest-branch = 2.x -blais-bytebuf = 2.x -../bippolito-newstruct = 2.x -rjones-prealloc = 2.x -sreifschneider-64ints = 2.x -stdlib-cleanup = 2.x -ssize_t = 2.x -sqlite-integration = 2.x -tim-obmalloc = 2.x -release27-maint-ttk-debug-on-xp5 = 2.x +siginterrupt-reset-issue8354 = trunk +asyncore-tests-issue8490 = trunk +signalfd-issue8407 = trunk +tarek_sysconfig = trunk +pep370 = trunk +tk_and_idle_maintenance = trunk +multiprocessing-autoconf = trunk +tlee-ast-optimize = trunk +empty = trunk +tnelson-trunk-bsddb-47-upgrade = trunk +ctypes-branch = trunk +decimal-branch = trunk +bcannon-objcap = trunk +amk-mailbox = trunk +twouters-dictviews-backport = trunk +bcannon-sandboxing = trunk +theller_modulefinder = trunk +hoxworth-stdlib_logging-soc = trunk +tim-exc_sanity = trunk +IDLE-syntax-branch = trunk +jim-doctest = trunk +release23-branch = trunk +jim-modulator = trunk +indexing-cleanup-branch = trunk +r23c1-branch = trunk +r23b2-branch = trunk +anthony-parser-branch = trunk +r23b1-branch = trunk +idlefork-merge-branch = trunk +getargs_mask_mods = trunk +cache-attr-branch = trunk +folding-reimpl-branch = trunk +r23a2-branch = trunk +bsddb-bsddb3-schizo-branch = trunk +r23a1-branch = trunk +py-cvs-vendor-branch = trunk +DS_RPC_BRANCH = trunk +SourceForge = trunk +release22-branch = trunk +r22rc1-branch = trunk +r22b2-branch = trunk +r22b1-branch = trunk +r22a4-branch = trunk +r22a3-branch = trunk +r22a2-branch = trunk +r161-branch = trunk +cnri-16-start = trunk +universal-33 = trunk +None = trunk +avendor = trunk +Distutils_0_1_3-branch = trunk +release152p1-patches = trunk +string_methods = trunk +PYIDE = trunk +OSAM = trunk +PYTHONSCRIPT = trunk +BBPY = trunk +jar = trunk +alpha100 = trunk +unlabeled-2.36.4 = trunk +unlabeled-2.1.4 = trunk +unlabeled-2.25.4 = trunk +branch_libffi-3_0_10-win = trunk +iter-branch = trunk +gen-branch = trunk +descr-branch = trunk +ast-branch = trunk +ast-objects = trunk +ast-arena = trunk +benjaminp-testing = trunk +fix-test-ftplib = trunk +trunk-math = trunk +okkoto-sizeof = trunk +trunk-bytearray = trunk +libffi3-branch = trunk +../ctypes-branch = trunk +pep302_phase2 = trunk +py3k-buffer = trunk +unlabeled-2.9.2 = trunk +unlabeled-1.5.4 = trunk +unlabeled-1.1.2 = trunk +unlabeled-1.1.1 = trunk +unlabeled-2.9.4 = trunk +unlabeled-2.10.2 = trunk +unlabeled-2.1.2 = trunk +unlabeled-2.108.2 = trunk +unlabeled-2.36.2 = trunk +unlabeled-2.54.2 = trunk +unlabeled-1.3.2 = trunk +unlabeled-1.23.4 = trunk +unlabeled-2.25.2 = trunk +unlabeled-1.2.2 = trunk +unlabeled-1.5.2 = trunk +unlabeled-1.98.2 = trunk +unlabeled-2.16.2 = trunk +unlabeled-2.3.2 = trunk +unlabeled-1.9.2 = trunk +unlabeled-1.8.2 = trunk +aimacintyre-sf1454481 = trunk 
+tim-current_frames = trunk +bippolito-newstruct = trunk +runar-longslice-branch = trunk +steve-notracing = trunk +rjones-funccall = trunk +sreifschneider-newnewexcept = trunk +tim-doctest-branch = trunk +blais-bytebuf = trunk +../bippolito-newstruct = trunk +rjones-prealloc = trunk +sreifschneider-64ints = trunk +stdlib-cleanup = trunk +ssize_t = trunk +sqlite-integration = trunk +tim-obmalloc = trunk +release27-maint-ttk-debug-on-xp5 = trunk test_subprocess_10826 = default py3k-signalfd-issue8407 = default sslopts-4870 = default -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 16:40:13 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 16:40:13 +0100 Subject: [Python-checkins] pymigr: Use the trunkbranch option and mention the modified hgsubversion Message-ID: antoine.pitrou pushed 7d8692d368b0 to pymigr: http://hg.python.org/pymigr/rev/7d8692d368b0 changeset: 97:7d8692d368b0 tag: tip user: Antoine Pitrou date: Thu Feb 24 16:40:08 2011 +0100 summary: Use the trunkbranch option and mention the modified hgsubversion (this is a bit experimental at this point, but should allow to have "default" tip point to 3.x tip without having another head on "default" pointing to the last SVN trunk commit) files: convert.sh diff --git a/convert.sh b/convert.sh --- a/convert.sh +++ b/convert.sh @@ -2,6 +2,8 @@ # Run this script from the pymigr working copy. The SVN repo should be # in ./python-svn. +# The 'trunkbranch' hgsubversion option needs the modified hgsubversion +# from http://hg.python.org/hgsubversion/ # NOTE: branch p3yk (sic) was created from trunk in r43030. @@ -18,6 +20,7 @@ echo '' >> $HGREPO/.hg/hgrc echo '[hgsubversion]' >> $HGREPO/.hg/hgrc echo 'defaultauthors = False' >> $HGREPO/.hg/hgrc +echo 'trunkbranch = trunk' >> $HGREPO/.hg/hgrc cd $HGREPO -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 16:50:34 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 16:50:34 +0100 Subject: [Python-checkins] pymigr: Mention branch name Message-ID: antoine.pitrou pushed de117f1aefe6 to pymigr: http://hg.python.org/pymigr/rev/de117f1aefe6 changeset: 98:de117f1aefe6 tag: tip user: Antoine Pitrou date: Thu Feb 24 16:50:32 2011 +0100 summary: Mention branch name files: convert.sh diff --git a/convert.sh b/convert.sh --- a/convert.sh +++ b/convert.sh @@ -3,7 +3,7 @@ # Run this script from the pymigr working copy. The SVN repo should be # in ./python-svn. # The 'trunkbranch' hgsubversion option needs the modified hgsubversion -# from http://hg.python.org/hgsubversion/ +# from http://hg.python.org/hgsubversion/#pymigr # NOTE: branch p3yk (sic) was created from trunk in r43030. 
-- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 18:28:47 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 18:28:47 +0100 Subject: [Python-checkins] pymigr: Improve instructions Message-ID: antoine.pitrou pushed 38f0f23904fc to pymigr: http://hg.python.org/pymigr/rev/38f0f23904fc changeset: 99:38f0f23904fc tag: tip user: Antoine Pitrou date: Thu Feb 24 18:28:42 2011 +0100 summary: Improve instructions files: README.txt diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -1,14 +1,68 @@ How to convert ============== -- Get a fresh clone of hgsubversion (http://bitbucket.org/durin42/hgsubversion) -- Add an extensions.hgsubversion entry to your ~/.hgrc -- Get a clone of the pymigr repo (http://hg.python.org/pymigr/) -- Get a copy of the Python SVN repo by asking djc or someone else who has one -- Run the following command: +Software +-------- - hg clone -A pymigr/author-map --branchmap branchmap.txt \ - file:///path/to/local/svnrepo/python python-hg +- Get a fresh clone of hgsubversion (from http://hg.python.org/hgsubversion/#pymigr) + and also install its dependencies (e.g. subvertpy) -This should get you a hg repo in python-hg. hgsubversion may leak memory; you -should be able to kill it if it uses up too much. Resume with hg pull. +- Enable hgsubversion by adding it to the "[extensions]" section in your .hgrc. + +- Enable the "hg shrink" extension by adding + shrink = path/to/mercurial-source-tree/contrib/shrink-revlog.py + to the "[extensions]" section in your .hgrc. It is also recommended that you + apply the patch that is inside findlarge.py (it only modifies shrink-revlog.py) + +Data +---- + +- You should have a local copy of the Python SVN repo in the "python-svn" + subdirectory of this working copy. The recommended way is to "svnadmin load" + the results of an "svnadmin dump" done on the server (once you have a dump, + you can create further incremental dumps and load them in your copy). + +Steps +----- + +- Run "./convert.sh". This will mirror the "python-svn" SVN repo in a "python-hg" + Mercurial repo. The full conversion takes several hours, but you can stop it + before the end if you like, and then do "hg pull" inside "python-hg" to + resume converting. "hg pull" should also allow you to add new SVN changesets + when you have loaded an incremental dump into your "python-svn" repo. + +- Inside, "python-hg", prepare a bundle of the branches needed for the "work + repository" (a subset of the full converted repo containing only active + dev branches): + + $ hg bundle --base null \ + -r default -r 3.2.x -r 3.1.x -r 3.0.x \ + -r trunk -r 2.7.x -r 2.6.x -r 2.5.x \ + ../work.bundle + +- Create the work repository: + + $ cd .. + $ hg init work + $ cd work + $ hg unbundle ../work.bundle + +- It is then recommended to shrink the work repo and the full history repo. + See findlarge.py's docstring for indications. You should do the shrinking + in a separate clone, in case it gets corrupted (you don't want to convert + again from scratch). + A shrunk work repository should contain 8 branches (all non-closed), + 7 heads (all non-closed), and weigh around 170MB (I'm talking about the ".hg" + directory). 
+ + Don't hesitate to inspect the results of the following commands: + + $ hg branches -c # displays all branches, including closed + $ hg heads -c # displays all heads, inclusing closed + $ du -hs .hg # displays on-disk repo size + + Also, the size of a full bundle gives (IIUC) an approximation of the + amount of bytes transfered over the network when doing a remote clone: + + $ hg bundle -a cpython.bundle + -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 19:03:10 2011 From: python-checkins at python.org (eric.araujo) Date: Thu, 24 Feb 2011 19:03:10 +0100 (CET) Subject: [Python-checkins] r88545 - in python/branches/py3k: Doc/library/abc.rst Lib/abc.py Lib/test/test_abc.py Misc/ACKS Misc/NEWS Message-ID: <20110224180310.CA123EE989@mail.python.org> Author: eric.araujo Date: Thu Feb 24 19:03:10 2011 New Revision: 88545 Log: Allow usage of SomeABC.register as a class decorator. Patch by Edoardo Spadolini (#10868). Modified: python/branches/py3k/Doc/library/abc.rst python/branches/py3k/Lib/abc.py python/branches/py3k/Lib/test/test_abc.py python/branches/py3k/Misc/ACKS python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Doc/library/abc.rst ============================================================================== --- python/branches/py3k/Doc/library/abc.rst (original) +++ python/branches/py3k/Doc/library/abc.rst Thu Feb 24 19:03:10 2011 @@ -55,6 +55,9 @@ assert issubclass(tuple, MyABC) assert isinstance((), MyABC) + .. versionchanged:: 3.3 + Returns the registered subclass, to allow usage as a class decorator. + You can also override this method in an abstract base class: .. method:: __subclasshook__(subclass) Modified: python/branches/py3k/Lib/abc.py ============================================================================== --- python/branches/py3k/Lib/abc.py (original) +++ python/branches/py3k/Lib/abc.py Thu Feb 24 19:03:10 2011 @@ -133,11 +133,14 @@ return cls def register(cls, subclass): - """Register a virtual subclass of an ABC.""" + """Register a virtual subclass of an ABC. + + Returns the subclass, to allow usage as a class decorator. + """ if not isinstance(subclass, type): raise TypeError("Can only register classes") if issubclass(subclass, cls): - return # Already a subclass + return subclass # Already a subclass # Subtle: test for cycles *after* testing for "already a subclass"; # this means we allow X.register(X) and interpret it as a no-op. 
if issubclass(cls, subclass): @@ -145,6 +148,7 @@ raise RuntimeError("Refusing to create an inheritance cycle") cls._abc_registry.add(subclass) ABCMeta._abc_invalidation_counter += 1 # Invalidate negative cache + return subclass def _dump_registry(cls, file=None): """Debug helper to print the ABC registry.""" Modified: python/branches/py3k/Lib/test/test_abc.py ============================================================================== --- python/branches/py3k/Lib/test/test_abc.py (original) +++ python/branches/py3k/Lib/test/test_abc.py Thu Feb 24 19:03:10 2011 @@ -121,11 +121,12 @@ self.assertFalse(issubclass(B, (A,))) self.assertNotIsInstance(b, A) self.assertNotIsInstance(b, (A,)) - A.register(B) + B1 = A.register(B) self.assertTrue(issubclass(B, A)) self.assertTrue(issubclass(B, (A,))) self.assertIsInstance(b, A) self.assertIsInstance(b, (A,)) + self.assertIs(B1, B) class C(B): pass c = C() @@ -134,6 +135,27 @@ self.assertIsInstance(c, A) self.assertIsInstance(c, (A,)) + def test_register_as_class_deco(self): + class A(metaclass=abc.ABCMeta): + pass + @A.register + class B(object): + pass + b = B() + self.assertTrue(issubclass(B, A)) + self.assertTrue(issubclass(B, (A,))) + self.assertIsInstance(b, A) + self.assertIsInstance(b, (A,)) + @A.register + class C(B): + pass + c = C() + self.assertTrue(issubclass(C, A)) + self.assertTrue(issubclass(C, (A,))) + self.assertIsInstance(c, A) + self.assertIsInstance(c, (A,)) + self.assertIs(C, A.register(C)) + def test_isinstance_invalidation(self): class A(metaclass=abc.ABCMeta): pass Modified: python/branches/py3k/Misc/ACKS ============================================================================== --- python/branches/py3k/Misc/ACKS (original) +++ python/branches/py3k/Misc/ACKS Thu Feb 24 19:03:10 2011 @@ -798,6 +798,7 @@ Dirk Soede Paul Sokolovsky Cody Somerville +Edoardo Spadolini Clay Spence Per Spilling Joshua Spoerri Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Thu Feb 24 19:03:10 2011 @@ -30,6 +30,9 @@ Library ------- +- Issue #10868: Allow usage of the register method of an ABC as a class + decorator. + - Issue #11224: Fixed a regression in tarfile that affected the file-like objects returned by TarFile.extractfile() regarding performance, memory consumption and failures with the stream interface. From python-checkins at python.org Thu Feb 24 19:05:28 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 19:05:28 +0100 Subject: [Python-checkins] pymigr: Use a "v" prefix for tag names representing version numbers, to help Message-ID: antoine.pitrou pushed 9ab93ebea923 to pymigr: http://hg.python.org/pymigr/rev/9ab93ebea923 changeset: 100:9ab93ebea923 user: Antoine Pitrou date: Thu Feb 24 19:03:28 2011 +0100 summary: Use a "v" prefix for tag names representing version numbers, to help distinguish with branch names and other data. 
files: tagmap.txt diff --git a/tagmap.txt b/tagmap.txt --- a/tagmap.txt +++ b/tagmap.txt @@ -1,202 +1,202 @@ -r32 = 3.2 -r32rc3 = 3.2rc3 -r32rc2 = 3.2rc2 -r32rc1 = 3.2rc1 -r32b2 = 3.2b2 -r32b1 = 3.2b1 -r32a4 = 3.2a4 -r32a3 = 3.2a3 -r32a2 = 3.2a2 -r32a1 = 3.2a1 -r313 = 3.1.3 -r31rc2 = 3.1rc2 -r31rc1 = 3.1rc1 -r31b1 = 3.1b1 -r31a2 = 3.1a2 -r31a1 = 3.1a1 -r313rc1 = 3.1.3rc1 -r312rc1 = 3.1.2rc1 -r312 = 3.1.2 -r311rc1 = 3.1.1rc1 -r311 = 3.1.1 -r31 = 3.1 -r30rc3 = 3.0rc3 -r30rc2 = 3.0rc2 -r30rc1 = 3.0rc1 -r30b3 = 3.0b3 -r30b2 = 3.0b2 -r30b1 = 3.0b1 -r30a5 = 3.0a5 -r30a4 = 3.0a4 -r30a3 = 3.0a3 -r30a2 = 3.0a2 -r30a1 = 3.0a1 -r301 = 3.0.1 -r30 = 3.0 -r271 = 2.7.1 -r27rc2 = 2.7rc2 -r27rc1 = 2.7rc1 -r27b2 = 2.7b2 -r27b1 = 2.7b1 -r27a4 = 2.7a4 -r27a3 = 2.7a3 -r27a2 = 2.7a2 -r27a1 = 2.7a1 -r271rc1 = 2.7rc1 -r27 = 2.7 -r26rc2 = 2.6rc2 -r26rc1 = 2.6rc1 -r26b3 = 2.6b3 -r26b2 = 2.6b2 -r26b1 = 2.6b1 -r26a3 = 2.6a3 -r26a2 = 2.6a2 -r26a1 = 2.6a1 -r266rc2 = 2.6.6rc2 -r266rc1 = 2.6.6rc1 -r266 = 2.6.6 -r265rc2 = 2.6.5rc2 -r265rc1 = 2.6.5rc1 -r265 = 2.6.5 -r264rc2 = 2.6.4rc2 -r264rc1 = 2.6.4rc1 -r264 = 2.6.4 -r263rc1 = 2.6.3rc1 -r263 = 2.6.3 -r262c1 = 2.6.2c1 -r262 = 2.6.2 -r261 = 2.6.1 -r26 = 2.6 -r25c2 = 2.5c2 -r25c1 = 2.5c1 -r25b3 = 2.5b3 -r25b2 = 2.5b2 -r25b1 = 2.5b1 -r25a2 = 2.5a2 -r25a1 = 2.5a1 -r25a0 = 2.5a0 -r255c2 = 2.5.5c2 -r255c1 = 2.5.5c1 -r255 = 2.5.5 -r254 = 2.5.4 -r253c1 = 2.5.3c1 -r253 = 2.5.3 -r252c1 = 2.5.2c1 -r252 = 2.5.2 -r251c1 = 2.5.1c1 -r251 = 2.5.1 -r25 = 2.5 -r24c1 = 2.4c1 -r24b2 = 2.4b2 -r24b1 = 2.4b1 -r24a3 = 2.4a3 -r24a2 = 2.4a2 -r24a1 = 2.4a1 -r246c1 = 2.4.6c1 -r246 = 2.4.6 -r245c1 = 2.4.5c1 -r245 = 2.4.5 -r244c1 = 2.4.4c1 -r244 = 2.4.4 -r243c1 = 2.4.3c1 -r243 = 2.4.3 -r242c1 = 2.4.2c1 -r242 = 2.4.2 -r241c2 = 2.4.1c2 -r241c1 = 2.4.1c1 -r241 = 2.4.1 -r24 = 2.4 -r23c2 = 2.3c2 -r23c1 = 2.3c1 -r23b2 = 2.3b2 -r23b1 = 2.3b1 -r23a2 = 2.3a2 -r23a1 = 2.3a1 -r237c1 = 2.3.7c1 -r237 = 2.3.7 -r236c1 = 2.3.6c1 -r236 = 2.3.6 -r235c1 = 2.3.5c1 -r235 = 2.3.5 -r234c1 = 2.3.4c1 -r234 = 2.3.4 -r233c1 = 2.3.3c1 -r233 = 2.3.3 -r232c1 = 2.3.2c1 -r232 = 2.3.2 -r231 = 2.3.1 -r23 = 2.3 -r22c1 = 2.2c1 -r22b2 = 2.2b2 -r22b1 = 2.2b1 -r22a4 = 2.2a4 -r22a3 = 2.2a3 -r22a2 = 2.2a2 -r22a1 = 2.2a1 -r223c1 = 2.2.3c1 -r223 = 2.2.3 -r222b1 = 2.2.2b1 -r222 = 2.2.2 -r221c2 = 2.2.1c2 -r221c1 = 2.2.1c1 -r221 = 2.2.1 -release22 = 2.2 -r22p1 = 2.2 -r21c2 = 2.1c2 -r21c1 = 2.1c1 -r21b2 = 2.1b2 -r21b1 = 2.1b1 -r21a2 = 2.1a2 -r21a1 = 2.1a1 -r213 = 2.1.3 -r212c1 = 2.1.2c1 -r212 = 2.1.2 -r211c1 = 2.1.1c1 -r211 = 2.1.1 -release21 = 2.1 -r20c1 = 2.0c1 -r20b2 = 2.0b2 -r20b1 = 2.0b1 -r201c1 = 2.0.1c1 -r201 = 2.0.1 -release20 = 2.0 -r16b1 = 1.6b1 -r16a2 = 1.6a2 -r16a1 = 1.6a1 -r161 = 1.6.1 -release16 = 1.6 -r15b2 = 1.5b2 -r15b1 = 1.5b1 -r15a4 = 1.5a4 -r15a3 = 1.5a3 -r15a2 = 1.5a2 -r15a1 = 1.5a1 -r152c1 = 1.5.2c1 -r152b2 = 1.5.2b2 -r152b1 = 1.5.2b1 -r152a2 = 1.5.2a2 -r152a1 = 1.5.2a1 -r152 = 1.5.2 -r151 = 1.5.1 -release15 = 1.5 -r14beta3 = 1.4b3 -r14beta2 = 1.4b2 -r14beta1 = 1.4b1 -release14 = 1.4 -r13beta1 = 1.3b1 -release13 = 1.3 -r12beta4 = 1.2b4 -r12beta3 = 1.2b3 -r12beta2 = 1.2b2 -r12beta1 = 1.2b1 -release12 = 1.2 -release111 = 1.1.1 -release11 = 1.1 -release102 = 1.0.2 -release101 = 1.0.1 -release100 = 1.0.0 -release099 = 0.9.9 -release098 = 0.9.8 +r32 = v3.2 +r32rc3 = v3.2rc3 +r32rc2 = v3.2rc2 +r32rc1 = v3.2rc1 +r32b2 = v3.2b2 +r32b1 = v3.2b1 +r32a4 = v3.2a4 +r32a3 = v3.2a3 +r32a2 = v3.2a2 +r32a1 = v3.2a1 +r313 = v3.1.3 +r31rc2 = v3.1rc2 +r31rc1 = v3.1rc1 +r31b1 = v3.1b1 +r31a2 = v3.1a2 +r31a1 = v3.1a1 +r313rc1 = v3.1.3rc1 +r312rc1 
= v3.1.2rc1 +r312 = v3.1.2 +r311rc1 = v3.1.1rc1 +r311 = v3.1.1 +r31 = v3.1 +r30rc3 = v3.0rc3 +r30rc2 = v3.0rc2 +r30rc1 = v3.0rc1 +r30b3 = v3.0b3 +r30b2 = v3.0b2 +r30b1 = v3.0b1 +r30a5 = v3.0a5 +r30a4 = v3.0a4 +r30a3 = v3.0a3 +r30a2 = v3.0a2 +r30a1 = v3.0a1 +r301 = v3.0.1 +r30 = v3.0 +r271 = v2.7.1 +r27rc2 = v2.7rc2 +r27rc1 = v2.7rc1 +r27b2 = v2.7b2 +r27b1 = v2.7b1 +r27a4 = v2.7a4 +r27a3 = v2.7a3 +r27a2 = v2.7a2 +r27a1 = v2.7a1 +r271rc1 = v2.7rc1 +r27 = v2.7 +r26rc2 = v2.6rc2 +r26rc1 = v2.6rc1 +r26b3 = v2.6b3 +r26b2 = v2.6b2 +r26b1 = v2.6b1 +r26a3 = v2.6a3 +r26a2 = v2.6a2 +r26a1 = v2.6a1 +r266rc2 = v2.6.6rc2 +r266rc1 = v2.6.6rc1 +r266 = v2.6.6 +r265rc2 = v2.6.5rc2 +r265rc1 = v2.6.5rc1 +r265 = v2.6.5 +r264rc2 = v2.6.4rc2 +r264rc1 = v2.6.4rc1 +r264 = v2.6.4 +r263rc1 = v2.6.3rc1 +r263 = v2.6.3 +r262c1 = v2.6.2c1 +r262 = v2.6.2 +r261 = v2.6.1 +r26 = v2.6 +r25c2 = v2.5c2 +r25c1 = v2.5c1 +r25b3 = v2.5b3 +r25b2 = v2.5b2 +r25b1 = v2.5b1 +r25a2 = v2.5a2 +r25a1 = v2.5a1 +r25a0 = v2.5a0 +r255c2 = v2.5.5c2 +r255c1 = v2.5.5c1 +r255 = v2.5.5 +r254 = v2.5.4 +r253c1 = v2.5.3c1 +r253 = v2.5.3 +r252c1 = v2.5.2c1 +r252 = v2.5.2 +r251c1 = v2.5.1c1 +r251 = v2.5.1 +r25 = v2.5 +r24c1 = v2.4c1 +r24b2 = v2.4b2 +r24b1 = v2.4b1 +r24a3 = v2.4a3 +r24a2 = v2.4a2 +r24a1 = v2.4a1 +r246c1 = v2.4.6c1 +r246 = v2.4.6 +r245c1 = v2.4.5c1 +r245 = v2.4.5 +r244c1 = v2.4.4c1 +r244 = v2.4.4 +r243c1 = v2.4.3c1 +r243 = v2.4.3 +r242c1 = v2.4.2c1 +r242 = v2.4.2 +r241c2 = v2.4.1c2 +r241c1 = v2.4.1c1 +r241 = v2.4.1 +r24 = v2.4 +r23c2 = v2.3c2 +r23c1 = v2.3c1 +r23b2 = v2.3b2 +r23b1 = v2.3b1 +r23a2 = v2.3a2 +r23a1 = v2.3a1 +r237c1 = v2.3.7c1 +r237 = v2.3.7 +r236c1 = v2.3.6c1 +r236 = v2.3.6 +r235c1 = v2.3.5c1 +r235 = v2.3.5 +r234c1 = v2.3.4c1 +r234 = v2.3.4 +r233c1 = v2.3.3c1 +r233 = v2.3.3 +r232c1 = v2.3.2c1 +r232 = v2.3.2 +r231 = v2.3.1 +r23 = v2.3 +r22c1 = v2.2c1 +r22b2 = v2.2b2 +r22b1 = v2.2b1 +r22a4 = v2.2a4 +r22a3 = v2.2a3 +r22a2 = v2.2a2 +r22a1 = v2.2a1 +r223c1 = v2.2.3c1 +r223 = v2.2.3 +r222b1 = v2.2.2b1 +r222 = v2.2.2 +r221c2 = v2.2.1c2 +r221c1 = v2.2.1c1 +r221 = v2.2.1 +release22 = v2.2 +r22p1 = v2.2 +r21c2 = v2.1c2 +r21c1 = v2.1c1 +r21b2 = v2.1b2 +r21b1 = v2.1b1 +r21a2 = v2.1a2 +r21a1 = v2.1a1 +r213 = v2.1.3 +r212c1 = v2.1.2c1 +r212 = v2.1.2 +r211c1 = v2.1.1c1 +r211 = v2.1.1 +release21 = v2.1 +r20c1 = v2.0c1 +r20b2 = v2.0b2 +r20b1 = v2.0b1 +r201c1 = v2.0.1c1 +r201 = v2.0.1 +release20 = v2.0 +r16b1 = v1.6b1 +r16a2 = v1.6a2 +r16a1 = v1.6a1 +r161 = v1.6.1 +release16 = v1.6 +r15b2 = v1.5b2 +r15b1 = v1.5b1 +r15a4 = v1.5a4 +r15a3 = v1.5a3 +r15a2 = v1.5a2 +r15a1 = v1.5a1 +r152c1 = v1.5.2c1 +r152b2 = v1.5.2b2 +r152b1 = v1.5.2b1 +r152a2 = v1.5.2a2 +r152a1 = v1.5.2a1 +r152 = v1.5.2 +r151 = v1.5.1 +release15 = v1.5 +r14beta3 = v1.4b3 +r14beta2 = v1.4b2 +r14beta1 = v1.4b1 +release14 = v1.4 +r13beta1 = v1.3b1 +release13 = v1.3 +r12beta4 = v1.2b4 +r12beta3 = v1.2b3 +r12beta2 = v1.2b2 +r12beta1 = v1.2b1 +release12 = v1.2 +release111 = v1.1.1 +release11 = v1.1 +release102 = v1.0.2 +release101 = v1.0.1 +release100 = v1.0.0 +release099 = v0.9.9 +release098 = v0.9.8 BBPY_0_2_3 = BEFORE_RESTART = Beta_05-Jul-1996 = -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 19:05:28 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 19:05:28 +0100 Subject: [Python-checkins] pymigr: Use shorter (easier to type) branch names Message-ID: antoine.pitrou pushed 1be3aecc9bea to pymigr: http://hg.python.org/pymigr/rev/1be3aecc9bea changeset: 101:1be3aecc9bea tag: tip user: Antoine Pitrou date: 
Thu Feb 24 19:05:23 2011 +0100 summary: Use shorter (easier to type) branch names files: branchmap.txt diff --git a/branchmap.txt b/branchmap.txt --- a/branchmap.txt +++ b/branchmap.txt @@ -1,16 +1,16 @@ py3k = default p3yk = default -release32-maint = 3.2.x -release31-maint = 3.1.x -release30-maint = 3.0.x -release27-maint = 2.7.x -release26-maint = 2.6.x -release25-maint = 2.5.x -release24-maint = 2.4.x -release23-maint = 2.3.x -release22-maint = 2.2.x -release21-maint = 2.1.x -release20-maint = 2.0.x +release32-maint = 3.2 +release31-maint = 3.1 +release30-maint = 3.0 +release27-maint = 2.7 +release26-maint = 2.6 +release25-maint = 2.5 +release24-maint = 2.4 +release23-maint = 2.3 +release22-maint = 2.2 +release21-maint = 2.1 +release20-maint = 2.0 siginterrupt-reset-issue8354 = trunk asyncore-tests-issue8490 = trunk signalfd-issue8407 = trunk -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 19:51:02 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 19:51:02 +0100 Subject: [Python-checkins] hgsubversion (pymigr): Really drop tags that are marked empty in the tagmap Message-ID: antoine.pitrou pushed 19fc4050d75e to hgsubversion: http://hg.python.org/hgsubversion/rev/19fc4050d75e changeset: 789:19fc4050d75e branch: pymigr tag: tip user: Antoine Pitrou date: Thu Feb 24 19:50:58 2011 +0100 summary: Really drop tags that are marked empty in the tagmap (this is the same test as in committags()) files: hgsubversion/svnmeta.py diff --git a/hgsubversion/svnmeta.py b/hgsubversion/svnmeta.py --- a/hgsubversion/svnmeta.py +++ b/hgsubversion/svnmeta.py @@ -603,6 +603,8 @@ def movetag(self, tag, hash, rev, date): if tag in self.tags and self.tags[tag] == hash: return + if tag in self.tagmap and not self.tagmap[tag]: + return # determine branch from earliest unclosed ancestor branchparent = self.repo[hash] -- Repository URL: http://hg.python.org/hgsubversion From python-checkins at python.org Thu Feb 24 20:03:25 2011 From: python-checkins at python.org (georg.brandl) Date: Thu, 24 Feb 2011 20:03:25 +0100 Subject: [Python-checkins] pymigr: Add more null tags. Message-ID: georg.brandl pushed 90cc0a859a1e to pymigr: http://hg.python.org/pymigr/rev/90cc0a859a1e changeset: 102:90cc0a859a1e tag: tip user: Georg Brandl date: Thu Feb 24 20:02:27 2011 +0100 summary: Add more null tags. files: tagmap.txt diff --git a/tagmap.txt b/tagmap.txt --- a/tagmap.txt +++ b/tagmap.txt @@ -23,10 +23,13 @@ r30rc3 = v3.0rc3 r30rc2 = v3.0rc2 r30rc1 = v3.0rc1 +r30rc1/py3k = r30b3 = v3.0b3 r30b2 = v3.0b2 r30b1 = v3.0b1 r30a5 = v3.0a5 +r30a5/py3k = +r30a5-real = r30a4 = v3.0a4 r30a3 = v3.0a3 r30a2 = v3.0a2 -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 20:04:54 2011 From: python-checkins at python.org (georg.brandl) Date: Thu, 24 Feb 2011 20:04:54 +0100 Subject: [Python-checkins] hgsubversion (pymigr): Require a space before "#" starting non-full-line comments in tagmap. Message-ID: georg.brandl pushed 8f3cd69dc34d to hgsubversion: http://hg.python.org/hgsubversion/rev/8f3cd69dc34d changeset: 790:8f3cd69dc34d branch: pymigr tag: tip user: Georg Brandl date: Thu Feb 24 20:04:08 2011 +0100 summary: Require a space before "#" starting non-full-line comments in tagmap. 
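To make the comment rule summarized above concrete, here is a small standalone sketch (not the hgsubversion code itself; the real change is in the maps.py diff that follows). It assumes the same convention: a line whose first character is "#" is a full-line comment, while an inline comment must be introduced by a space followed by "#"::

    def strip_tagmap_comments(lines):
        """Yield the non-comment content of each tagmap line.

        Full-line comments (first character '#') are dropped entirely;
        inline comments must be introduced by ' #', so tag names that
        happen to contain '#' are left untouched.
        """
        for line in lines:
            if line.startswith('#'):
                continue                        # full-line comment
            line = line.split(' #')[0]          # strip inline comment
            if line.strip():
                yield line.rstrip()

    # Hypothetical tagmap content, for illustration only:
    sample = ["# full-line comment",
              "r32 = v3.2  # inline comment",
              "weird#tag = v9.9"]
    print(list(strip_tagmap_comments(sample)))
    # -> ['r32 = v3.2', 'weird#tag = v9.9']
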
files: hgsubversion/maps.py diff --git a/hgsubversion/maps.py b/hgsubversion/maps.py --- a/hgsubversion/maps.py +++ b/hgsubversion/maps.py @@ -392,7 +392,9 @@ if writing: writing.write(line) - line = line.split('#')[0] + if line.startswith('#'): + continue + line = line.split(' #')[0] if not line.strip(): continue -- Repository URL: http://hg.python.org/hgsubversion From python-checkins at python.org Thu Feb 24 20:15:56 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 20:15:56 +0100 Subject: [Python-checkins] pymigr: Update instructions with new branch names Message-ID: antoine.pitrou pushed 134668fee160 to pymigr: http://hg.python.org/pymigr/rev/134668fee160 changeset: 103:134668fee160 tag: tip user: Antoine Pitrou date: Thu Feb 24 20:15:53 2011 +0100 summary: Update instructions with new branch names files: README.txt diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -33,11 +33,12 @@ - Inside, "python-hg", prepare a bundle of the branches needed for the "work repository" (a subset of the full converted repo containing only active - dev branches): + dev branches and main maintenance branches): $ hg bundle --base null \ - -r default -r 3.2.x -r 3.1.x -r 3.0.x \ - -r trunk -r 2.7.x -r 2.6.x -r 2.5.x \ + -r default -r 3.2 -r 3.1 -r 3.0 \ + -r trunk -r 2.7 -r 2.6 -r 2.5 -r 2.4 \ + -r 2.3 -r 2.2 -r 2.1 -r 2.0 \ ../work.bundle - Create the work repository: -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 20:40:09 2011 From: python-checkins at python.org (alexander.belopolsky) Date: Thu, 24 Feb 2011 20:40:09 +0100 (CET) Subject: [Python-checkins] r88546 - in python/branches/py3k: Lib/test/pickletester.py Lib/test/test_pickle.py Lib/test/test_pickletools.py Modules/_pickle.c Message-ID: <20110224194009.5CE24EE985@mail.python.org> Author: alexander.belopolsky Date: Thu Feb 24 20:40:09 2011 New Revision: 88546 Log: Issue #11286: Fixed unpickling of empty 2.x strings. 
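Seen from Python code, the behaviour exercised by the new pickletester.py test below looks like this (a usage sketch, assuming an interpreter that includes this fix; the byte string is the same payload as in the test, a pickled empty 2.x str)::

    import pickle

    # Pickle of an empty 2.x byte string, as used by the new test.
    data = b'\x80\x03U\x00q\x00.'

    # With this checkin the zero-length read is handled correctly and
    # the empty 2.x string decodes to an empty text string.
    print(repr(pickle.loads(data, encoding='koi8-r')))   # -> ''
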
Modified: python/branches/py3k/Lib/test/pickletester.py python/branches/py3k/Lib/test/test_pickle.py python/branches/py3k/Lib/test/test_pickletools.py python/branches/py3k/Modules/_pickle.c Modified: python/branches/py3k/Lib/test/pickletester.py ============================================================================== --- python/branches/py3k/Lib/test/pickletester.py (original) +++ python/branches/py3k/Lib/test/pickletester.py Thu Feb 24 20:40:09 2011 @@ -1094,6 +1094,10 @@ self.assertEqual(len(loaded), len(data)) self.assertEqual(loaded, data) + def test_empty_bytestring(self): + # issue 11286 + empty = self.loads(b'\x80\x03U\x00q\x00.', encoding='koi8-r') + self.assertEqual(empty, '') # Test classes for reduce_ex Modified: python/branches/py3k/Lib/test/test_pickle.py ============================================================================== --- python/branches/py3k/Lib/test/test_pickle.py (original) +++ python/branches/py3k/Lib/test/test_pickle.py Thu Feb 24 20:40:09 2011 @@ -31,9 +31,9 @@ f.seek(0) return bytes(f.read()) - def loads(self, buf): + def loads(self, buf, **kwds): f = io.BytesIO(buf) - u = self.unpickler(f) + u = self.unpickler(f, **kwds) return u.load() @@ -45,8 +45,8 @@ def dumps(self, arg, proto=None): return pickle.dumps(arg, proto) - def loads(self, buf): - return pickle.loads(buf) + def loads(self, buf, **kwds): + return pickle.loads(buf, **kwds) class PyPersPicklerTests(AbstractPersistentPicklerTests): @@ -64,12 +64,12 @@ f.seek(0) return f.read() - def loads(self, buf): + def loads(self, buf, **kwds): class PersUnpickler(self.unpickler): def persistent_load(subself, obj): return self.persistent_load(obj) f = io.BytesIO(buf) - u = PersUnpickler(f) + u = PersUnpickler(f, **kwds) return u.load() Modified: python/branches/py3k/Lib/test/test_pickletools.py ============================================================================== --- python/branches/py3k/Lib/test/test_pickletools.py (original) +++ python/branches/py3k/Lib/test/test_pickletools.py Thu Feb 24 20:40:09 2011 @@ -9,8 +9,8 @@ def dumps(self, arg, proto=None): return pickletools.optimize(pickle.dumps(arg, proto)) - def loads(self, buf): - return pickle.loads(buf) + def loads(self, buf, **kwds): + return pickle.loads(buf, **kwds) # Test relies on precise output of dumps() test_pickle_to_2x = None Modified: python/branches/py3k/Modules/_pickle.c ============================================================================== --- python/branches/py3k/Modules/_pickle.c (original) +++ python/branches/py3k/Modules/_pickle.c Thu Feb 24 20:40:09 2011 @@ -977,11 +977,6 @@ { Py_ssize_t num_read; - if (n == 0) { - *s = NULL; - return 0; - } - if (self->next_read_idx + n <= self->input_len) { *s = self->input_buffer + self->next_read_idx; self->next_read_idx += n; From python-checkins at python.org Thu Feb 24 21:12:35 2011 From: python-checkins at python.org (georg.brandl) Date: Thu, 24 Feb 2011 21:12:35 +0100 Subject: [Python-checkins] pymigr: When looking up SVN revisions, search through the repo backwards: most Message-ID: georg.brandl pushed 356575b1cbd6 to pymigr: http://hg.python.org/pymigr/rev/356575b1cbd6 changeset: 104:356575b1cbd6 tag: tip user: Georg Brandl date: Thu Feb 24 21:04:51 2011 +0100 summary: When looking up SVN revisions, search through the repo backwards: most looked-up SVN revisions will be relatively recent ones. 
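The idea behind the hglookup.py diff that follows, shown as a standalone sketch (the revision objects here are plain dicts standing in for Mercurial's repo[rev].extra(), and range() stands in for the script's xrange()): scan from the newest local revision down to 0, so lookups of recent SVN revisions terminate after only a few probes::

    def find_converted_rev(revisions, svnrev):
        """Return the index of the revision whose convert_revision ends
        with ``svnrev``, scanning newest-first."""
        for rev in range(len(revisions) - 1, -1, -1):   # newest first
            cvt = revisions[rev].get('convert_revision', '')
            if cvt.endswith(svnrev):
                return rev
        return None

    # Hypothetical data: a recent lookup hits on the first probe.
    revs = [{'convert_revision': 'svn:example/@%d' % n} for n in range(100)]
    print(find_converted_rev(revs, '@99'))   # -> 99
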
files: hglookup.py diff --git a/hglookup.py b/hglookup.py --- a/hglookup.py +++ b/hglookup.py @@ -47,7 +47,7 @@ svnrev = '@' + node[1:] name = 'cpython' repo = self.repos[0][1] - for rev in repo: + for rev in xrange(len(repo)-1, -1, -1): cvt = repo[rev].extra().get('convert_revision', '') if cvt.endswith(svnrev): full = repo[rev].node() -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Thu Feb 24 21:25:11 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Thu, 24 Feb 2011 21:25:11 +0100 (CET) Subject: [Python-checkins] r88547 - in python/branches/release32-maint: Lib/smtplib.py Message-ID: <20110224202511.91010EE986@mail.python.org> Author: giampaolo.rodola Date: Thu Feb 24 21:25:11 2011 New Revision: 88547 Log: Merged revisions 88501 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88501 | giampaolo.rodola | 2011-02-22 16:56:20 +0100 (mar, 22 feb 2011) | 1 line smtlib.py PEP8 normalization via pep8.py script. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/smtplib.py Modified: python/branches/release32-maint/Lib/smtplib.py ============================================================================== --- python/branches/release32-maint/Lib/smtplib.py (original) +++ python/branches/release32-maint/Lib/smtplib.py Thu Feb 24 21:25:11 2011 @@ -52,15 +52,15 @@ from email.base64mime import body_encode as encode_base64 from sys import stderr -__all__ = ["SMTPException","SMTPServerDisconnected","SMTPResponseException", - "SMTPSenderRefused","SMTPRecipientsRefused","SMTPDataError", - "SMTPConnectError","SMTPHeloError","SMTPAuthenticationError", - "quoteaddr","quotedata","SMTP"] +__all__ = ["SMTPException", "SMTPServerDisconnected", "SMTPResponseException", + "SMTPSenderRefused", "SMTPRecipientsRefused", "SMTPDataError", + "SMTPConnectError", "SMTPHeloError", "SMTPAuthenticationError", + "quoteaddr", "quotedata", "SMTP"] SMTP_PORT = 25 SMTP_SSL_PORT = 465 -CRLF="\r\n" -bCRLF=b"\r\n" +CRLF = "\r\n" +bCRLF = b"\r\n" OLDSTYLE_AUTH = re.compile(r"auth=(.*)", re.I) @@ -113,7 +113,7 @@ def __init__(self, recipients): self.recipients = recipients - self.args = ( recipients,) + self.args = (recipients,) class SMTPDataError(SMTPResponseException): @@ -142,7 +142,7 @@ m = email.utils.parseaddr(addr)[1] except AttributeError: pass - if m == (None, None): # Indicates parse failure or AttributeError + if m == (None, None): # Indicates parse failure or AttributeError # something weird here.. punt -ddm return "<%s>" % addr elif m is None: @@ -185,7 +185,8 @@ chr = None while chr != b"\n": chr = self.sslobj.read(1) - if not chr: break + if not chr: + break str += chr return str @@ -280,10 +281,11 @@ def _get_socket(self, host, port, timeout): # This makes it simpler for SMTP_SSL to use the SMTP connect code # and just alter the socket connection bit. - if self.debuglevel > 0: print('connect:', (host, port), file=stderr) + if self.debuglevel > 0: + print('connect:', (host, port), file=stderr) return socket.create_connection((host, port), timeout) - def connect(self, host='localhost', port = 0): + def connect(self, host='localhost', port=0): """Connect to a host on a given port. 
If the hostname ends with a colon (`:') followed by a number, and @@ -297,20 +299,25 @@ if not port and (host.find(':') == host.rfind(':')): i = host.rfind(':') if i >= 0: - host, port = host[:i], host[i+1:] - try: port = int(port) + host, port = host[:i], host[i + 1:] + try: + port = int(port) except ValueError: raise socket.error("nonnumeric port") - if not port: port = self.default_port - if self.debuglevel > 0: print('connect:', (host, port), file=stderr) + if not port: + port = self.default_port + if self.debuglevel > 0: + print('connect:', (host, port), file=stderr) self.sock = self._get_socket(host, port, self.timeout) (code, msg) = self.getreply() - if self.debuglevel > 0: print("connect:", msg, file=stderr) + if self.debuglevel > 0: + print("connect:", msg, file=stderr) return (code, msg) def send(self, s): """Send `s' to the server.""" - if self.debuglevel > 0: print('send:', repr(s), file=stderr) + if self.debuglevel > 0: + print('send:', repr(s), file=stderr) if hasattr(self, 'sock') and self.sock: if isinstance(s, str): s = s.encode("ascii") @@ -343,7 +350,7 @@ Raises SMTPServerDisconnected if end-of-file is reached. """ - resp=[] + resp = [] if self.file is None: self.file = self.sock.makefile('rb') while 1: @@ -354,9 +361,10 @@ if not line: self.close() raise SMTPServerDisconnected("Connection unexpectedly closed") - if self.debuglevel > 0: print('reply:', repr(line), file=stderr) + if self.debuglevel > 0: + print('reply:', repr(line), file=stderr) resp.append(line[4:].strip(b' \t\r\n')) - code=line[:3] + code = line[:3] # Check that the error code is syntactically correct. # Don't attempt to read a continuation line if it is broken. try: @@ -370,12 +378,12 @@ errmsg = b"\n".join(resp) if self.debuglevel > 0: - print('reply: retcode (%s); Msg: %s' % (errcode,errmsg), file=stderr) + print('reply: retcode (%s); Msg: %s' % (errcode, errmsg), file=stderr) return errcode, errmsg def docmd(self, cmd, args=""): """Send a command, and return its response code.""" - self.putcmd(cmd,args) + self.putcmd(cmd, args) return self.getreply() # std smtp commands @@ -385,9 +393,9 @@ host. """ self.putcmd("helo", name or self.local_hostname) - (code,msg)=self.getreply() - self.helo_resp=msg - return (code,msg) + (code, msg) = self.getreply() + self.helo_resp = msg + return (code, msg) def ehlo(self, name=''): """ SMTP 'ehlo' command. @@ -396,20 +404,20 @@ """ self.esmtp_features = {} self.putcmd(self.ehlo_msg, name or self.local_hostname) - (code,msg)=self.getreply() + (code, msg) = self.getreply() # According to RFC1869 some (badly written) # MTA's will disconnect on an ehlo. Toss an exception if # that happens -ddm if code == -1 and len(msg) == 0: self.close() raise SMTPServerDisconnected("Server not connected") - self.ehlo_resp=msg + self.ehlo_resp = msg if code != 250: - return (code,msg) - self.does_esmtp=1 + return (code, msg) + self.does_esmtp = 1 #parse the ehlo response -ddm assert isinstance(self.ehlo_resp, bytes), repr(self.ehlo_resp) - resp=self.ehlo_resp.decode("latin-1").split('\n') + resp = self.ehlo_resp.decode("latin-1").split('\n') del resp[0] for each in resp: # To be able to communicate with as many SMTP servers as possible, @@ -429,16 +437,16 @@ # It's actually stricter, in that only spaces are allowed between # parameters, but were not going to check for that here. Note # that the space isn't present if there are no parameters. 
- m=re.match(r'(?P[A-Za-z0-9][A-Za-z0-9\-]*) ?',each) + m = re.match(r'(?P[A-Za-z0-9][A-Za-z0-9\-]*) ?', each) if m: - feature=m.group("feature").lower() - params=m.string[m.end("feature"):].strip() + feature = m.group("feature").lower() + params = m.string[m.end("feature"):].strip() if feature == "auth": self.esmtp_features[feature] = self.esmtp_features.get(feature, "") \ + " " + params else: - self.esmtp_features[feature]=params - return (code,msg) + self.esmtp_features[feature] = params + return (code, msg) def has_extn(self, opt): """Does the server support a given SMTP service extension?""" @@ -458,23 +466,23 @@ """SMTP 'noop' command -- doesn't do anything :>""" return self.docmd("noop") - def mail(self,sender,options=[]): + def mail(self, sender, options=[]): """SMTP 'mail' command -- begins mail xfer session.""" optionlist = '' if options and self.does_esmtp: optionlist = ' ' + ' '.join(options) - self.putcmd("mail", "FROM:%s%s" % (quoteaddr(sender) ,optionlist)) + self.putcmd("mail", "FROM:%s%s" % (quoteaddr(sender), optionlist)) return self.getreply() - def rcpt(self,recip,options=[]): + def rcpt(self, recip, options=[]): """SMTP 'rcpt' command -- indicates 1 recipient for this mail.""" optionlist = '' if options and self.does_esmtp: optionlist = ' ' + ' '.join(options) - self.putcmd("rcpt","TO:%s%s" % (quoteaddr(recip),optionlist)) + self.putcmd("rcpt", "TO:%s%s" % (quoteaddr(recip), optionlist)) return self.getreply() - def data(self,msg): + def data(self, msg): """SMTP 'DATA' command -- sends message data to server. Automatically quotes lines beginning with a period per rfc821. @@ -485,10 +493,11 @@ '\r\n' characters. If msg is bytes, it is transmitted as is. """ self.putcmd("data") - (code,repl)=self.getreply() - if self.debuglevel >0 : print("data:", (code,repl), file=stderr) + (code, repl) = self.getreply() + if self.debuglevel > 0: + print("data:", (code, repl), file=stderr) if code != 354: - raise SMTPDataError(code,repl) + raise SMTPDataError(code, repl) else: if isinstance(msg, str): msg = _fix_eols(msg).encode('ascii') @@ -497,16 +506,17 @@ q = q + bCRLF q = q + b"." + bCRLF self.send(q) - (code,msg)=self.getreply() - if self.debuglevel >0 : print("data:", (code,msg), file=stderr) - return (code,msg) + (code, msg) = self.getreply() + if self.debuglevel > 0: + print("data:", (code, msg), file=stderr) + return (code, msg) def verify(self, address): """SMTP 'verify' command -- checks for address validity.""" self.putcmd("vrfy", quoteaddr(address)) return self.getreply() # a.k.a. - vrfy=verify + vrfy = verify def expn(self, address): """SMTP 'expn' command -- expands a mailing list.""" @@ -564,7 +574,6 @@ s = "\0%s\0%s" % (user, password) return encode_base64(s.encode('ascii'), eol='') - AUTH_PLAIN = "PLAIN" AUTH_CRAM_MD5 = "CRAM-MD5" AUTH_LOGIN = "LOGIN" @@ -613,7 +622,7 @@ # We could not login sucessfully. Return result of last attempt. raise SMTPAuthenticationError(code, resp) - def starttls(self, keyfile = None, certfile = None): + def starttls(self, keyfile=None, certfile=None): """Puts the connection to the SMTP server into TLS mode. 
If there has been no previous EHLO or HELO command this session, this @@ -721,22 +730,22 @@ esmtp_opts.append("size=%d" % len(msg)) for option in mail_options: esmtp_opts.append(option) - (code,resp) = self.mail(from_addr, esmtp_opts) + (code, resp) = self.mail(from_addr, esmtp_opts) if code != 250: self.rset() raise SMTPSenderRefused(code, resp, from_addr) - senderrs={} + senderrs = {} if isinstance(to_addrs, str): to_addrs = [to_addrs] for each in to_addrs: - (code,resp)=self.rcpt(each, rcpt_options) + (code, resp) = self.rcpt(each, rcpt_options) if (code != 250) and (code != 251): - senderrs[each]=(code,resp) - if len(senderrs)==len(to_addrs): + senderrs[each] = (code, resp) + if len(senderrs) == len(to_addrs): # the server refused all our recipients self.rset() raise SMTPRecipientsRefused(senderrs) - (code,resp) = self.data(msg) + (code, resp) = self.data(msg) if code != 250: self.rset() raise SMTPDataError(code, resp) @@ -770,7 +779,6 @@ return self.sendmail(from_addr, to_addrs, flatmsg, mail_options, rcpt_options) - def close(self): """Close the connection to the SMTP server.""" if self.file: @@ -780,7 +788,6 @@ self.sock.close() self.sock = None - def quit(self): """Terminate the SMTP session.""" res = self.docmd("quit") @@ -806,7 +813,8 @@ self.default_port = SMTP_SSL_PORT def _get_socket(self, host, port, timeout): - if self.debuglevel > 0: print('connect:', (host, port), file=stderr) + if self.debuglevel > 0: + print('connect:', (host, port), file=stderr) new_socket = socket.create_connection((host, port), timeout) new_socket = ssl.wrap_socket(new_socket, self.keyfile, self.certfile) self.file = SSLFakeFile(new_socket) @@ -834,11 +842,11 @@ ehlo_msg = "lhlo" - def __init__(self, host = '', port = LMTP_PORT, local_hostname = None): + def __init__(self, host='', port=LMTP_PORT, local_hostname=None): """Initialize a new instance.""" SMTP.__init__(self, host, port, local_hostname) - def connect(self, host = 'localhost', port = 0): + def connect(self, host='localhost', port=0): """Connect to the LMTP daemon, on either a Unix or a TCP socket.""" if host[0] != '/': return SMTP.connect(self, host, port) @@ -848,13 +856,15 @@ self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) self.sock.connect(host) except socket.error as msg: - if self.debuglevel > 0: print('connect fail:', host, file=stderr) + if self.debuglevel > 0: + print('connect fail:', host, file=stderr) if self.sock: self.sock.close() self.sock = None raise socket.error(msg) (code, msg) = self.getreply() - if self.debuglevel > 0: print('connect:', msg, file=stderr) + if self.debuglevel > 0: + print('connect:', msg, file=stderr) return (code, msg) @@ -868,7 +878,7 @@ return sys.stdin.readline().strip() fromaddr = prompt("From") - toaddrs = prompt("To").split(',') + toaddrs = prompt("To").split(',') print("Enter message, end with ^D:") msg = '' while 1: From python-checkins at python.org Thu Feb 24 21:34:39 2011 From: python-checkins at python.org (alexander.belopolsky) Date: Thu, 24 Feb 2011 21:34:39 +0100 (CET) Subject: [Python-checkins] r88548 - in python/branches/release32-maint: Lib/test/pickletester.py Lib/test/test_pickle.py Lib/test/test_pickletools.py Modules/_pickle.c Message-ID: <20110224203439.2B075F8C9@mail.python.org> Author: alexander.belopolsky Date: Thu Feb 24 21:34:38 2011 New Revision: 88548 Log: Merged revisions 88546 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ 
r88546 | alexander.belopolsky | 2011-02-24 14:40:09 -0500 (Thu, 24 Feb 2011) | 3 lines Issue #11286: Fixed unpickling of empty 2.x strings. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/pickletester.py python/branches/release32-maint/Lib/test/test_pickle.py python/branches/release32-maint/Lib/test/test_pickletools.py python/branches/release32-maint/Modules/_pickle.c Modified: python/branches/release32-maint/Lib/test/pickletester.py ============================================================================== --- python/branches/release32-maint/Lib/test/pickletester.py (original) +++ python/branches/release32-maint/Lib/test/pickletester.py Thu Feb 24 21:34:38 2011 @@ -1093,6 +1093,10 @@ self.assertEqual(len(loaded), len(data)) self.assertEqual(loaded, data) + def test_empty_bytestring(self): + # issue 11286 + empty = self.loads(b'\x80\x03U\x00q\x00.', encoding='koi8-r') + self.assertEqual(empty, '') # Test classes for reduce_ex Modified: python/branches/release32-maint/Lib/test/test_pickle.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_pickle.py (original) +++ python/branches/release32-maint/Lib/test/test_pickle.py Thu Feb 24 21:34:38 2011 @@ -31,9 +31,9 @@ f.seek(0) return bytes(f.read()) - def loads(self, buf): + def loads(self, buf, **kwds): f = io.BytesIO(buf) - u = self.unpickler(f) + u = self.unpickler(f, **kwds) return u.load() @@ -45,8 +45,8 @@ def dumps(self, arg, proto=None): return pickle.dumps(arg, proto) - def loads(self, buf): - return pickle.loads(buf) + def loads(self, buf, **kwds): + return pickle.loads(buf, **kwds) class PyPersPicklerTests(AbstractPersistentPicklerTests): @@ -64,12 +64,12 @@ f.seek(0) return f.read() - def loads(self, buf): + def loads(self, buf, **kwds): class PersUnpickler(self.unpickler): def persistent_load(subself, obj): return self.persistent_load(obj) f = io.BytesIO(buf) - u = PersUnpickler(f) + u = PersUnpickler(f, **kwds) return u.load() Modified: python/branches/release32-maint/Lib/test/test_pickletools.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_pickletools.py (original) +++ python/branches/release32-maint/Lib/test/test_pickletools.py Thu Feb 24 21:34:38 2011 @@ -9,8 +9,8 @@ def dumps(self, arg, proto=None): return pickletools.optimize(pickle.dumps(arg, proto)) - def loads(self, buf): - return pickle.loads(buf) + def loads(self, buf, **kwds): + return pickle.loads(buf, **kwds) # Test relies on precise output of dumps() test_pickle_to_2x = None Modified: python/branches/release32-maint/Modules/_pickle.c ============================================================================== --- python/branches/release32-maint/Modules/_pickle.c (original) +++ python/branches/release32-maint/Modules/_pickle.c Thu Feb 24 21:34:38 2011 @@ -977,11 +977,6 @@ { Py_ssize_t num_read; - if (n == 0) { - *s = NULL; - return 0; - } - if (self->next_read_idx + n <= self->input_len) { *s = self->input_buffer + self->next_read_idx; self->next_read_idx += n; From python-checkins at python.org Thu Feb 24 21:43:11 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Thu, 24 Feb 2011 21:43:11 +0100 (CET) Subject: [Python-checkins] r88549 - python/branches/release27-maint/Lib/smtplib.py Message-ID: <20110224204311.35CDBF405@mail.python.org> Author: giampaolo.rodola Date: Thu Feb 24 21:43:11 2011 New Revision: 88549 Log: smtlib.py 
PEP8 normalization via pep8.py script. Modified: python/branches/release27-maint/Lib/smtplib.py Modified: python/branches/release27-maint/Lib/smtplib.py ============================================================================== --- python/branches/release27-maint/Lib/smtplib.py (original) +++ python/branches/release27-maint/Lib/smtplib.py Thu Feb 24 21:43:11 2011 @@ -49,17 +49,18 @@ from email.base64mime import encode as encode_base64 from sys import stderr -__all__ = ["SMTPException","SMTPServerDisconnected","SMTPResponseException", - "SMTPSenderRefused","SMTPRecipientsRefused","SMTPDataError", - "SMTPConnectError","SMTPHeloError","SMTPAuthenticationError", - "quoteaddr","quotedata","SMTP"] +__all__ = ["SMTPException", "SMTPServerDisconnected", "SMTPResponseException", + "SMTPSenderRefused", "SMTPRecipientsRefused", "SMTPDataError", + "SMTPConnectError", "SMTPHeloError", "SMTPAuthenticationError", + "quoteaddr", "quotedata", "SMTP"] SMTP_PORT = 25 SMTP_SSL_PORT = 465 -CRLF="\r\n" +CRLF = "\r\n" OLDSTYLE_AUTH = re.compile(r"auth=(.*)", re.I) + # Exception classes used by this module. class SMTPException(Exception): """Base class for all exceptions raised by this module.""" @@ -109,7 +110,7 @@ def __init__(self, recipients): self.recipients = recipients - self.args = ( recipients,) + self.args = (recipients,) class SMTPDataError(SMTPResponseException): @@ -128,6 +129,7 @@ combination provided. """ + def quoteaddr(addr): """Quote a subset of the email addresses defined by RFC 821. @@ -138,7 +140,7 @@ m = email.utils.parseaddr(addr)[1] except AttributeError: pass - if m == (None, None): # Indicates parse failure or AttributeError + if m == (None, None): # Indicates parse failure or AttributeError # something weird here.. punt -ddm return "<%s>" % addr elif m is None: @@ -175,7 +177,8 @@ chr = None while chr != "\n": chr = self.sslobj.read(1) - if not chr: break + if not chr: + break str += chr return str @@ -269,10 +272,11 @@ def _get_socket(self, port, host, timeout): # This makes it simpler for SMTP_SSL to use the SMTP connect code # and just alter the socket connection bit. - if self.debuglevel > 0: print>>stderr, 'connect:', (host, port) + if self.debuglevel > 0: + print>>stderr, 'connect:', (host, port) return socket.create_connection((port, host), timeout) - def connect(self, host='localhost', port = 0): + def connect(self, host='localhost', port=0): """Connect to a host on a given port. If the hostname ends with a colon (`:') followed by a number, and @@ -286,20 +290,25 @@ if not port and (host.find(':') == host.rfind(':')): i = host.rfind(':') if i >= 0: - host, port = host[:i], host[i+1:] - try: port = int(port) + host, port = host[:i], host[i + 1:] + try: + port = int(port) except ValueError: raise socket.error, "nonnumeric port" - if not port: port = self.default_port - if self.debuglevel > 0: print>>stderr, 'connect:', (host, port) + if not port: + port = self.default_port + if self.debuglevel > 0: + print>>stderr, 'connect:', (host, port) self.sock = self._get_socket(host, port, self.timeout) (code, msg) = self.getreply() - if self.debuglevel > 0: print>>stderr, "connect:", msg + if self.debuglevel > 0: + print>>stderr, "connect:", msg return (code, msg) def send(self, str): """Send `str' to the server.""" - if self.debuglevel > 0: print>>stderr, 'send:', repr(str) + if self.debuglevel > 0: + print>>stderr, 'send:', repr(str) if hasattr(self, 'sock') and self.sock: try: self.sock.sendall(str) @@ -330,7 +339,7 @@ Raises SMTPServerDisconnected if end-of-file is reached. 
""" - resp=[] + resp = [] if self.file is None: self.file = self.sock.makefile('rb') while 1: @@ -341,9 +350,10 @@ if line == '': self.close() raise SMTPServerDisconnected("Connection unexpectedly closed") - if self.debuglevel > 0: print>>stderr, 'reply:', repr(line) + if self.debuglevel > 0: + print>>stderr, 'reply:', repr(line) resp.append(line[4:].strip()) - code=line[:3] + code = line[:3] # Check that the error code is syntactically correct. # Don't attempt to read a continuation line if it is broken. try: @@ -352,17 +362,17 @@ errcode = -1 break # Check if multiline response. - if line[3:4]!="-": + if line[3:4] != "-": break errmsg = "\n".join(resp) if self.debuglevel > 0: - print>>stderr, 'reply: retcode (%s); Msg: %s' % (errcode,errmsg) + print>>stderr, 'reply: retcode (%s); Msg: %s' % (errcode, errmsg) return errcode, errmsg def docmd(self, cmd, args=""): """Send a command, and return its response code.""" - self.putcmd(cmd,args) + self.putcmd(cmd, args) return self.getreply() # std smtp commands @@ -372,9 +382,9 @@ host. """ self.putcmd("helo", name or self.local_hostname) - (code,msg)=self.getreply() - self.helo_resp=msg - return (code,msg) + (code, msg) = self.getreply() + self.helo_resp = msg + return (code, msg) def ehlo(self, name=''): """ SMTP 'ehlo' command. @@ -383,19 +393,19 @@ """ self.esmtp_features = {} self.putcmd(self.ehlo_msg, name or self.local_hostname) - (code,msg)=self.getreply() + (code, msg) = self.getreply() # According to RFC1869 some (badly written) # MTA's will disconnect on an ehlo. Toss an exception if # that happens -ddm if code == -1 and len(msg) == 0: self.close() raise SMTPServerDisconnected("Server not connected") - self.ehlo_resp=msg + self.ehlo_resp = msg if code != 250: - return (code,msg) - self.does_esmtp=1 + return (code, msg) + self.does_esmtp = 1 #parse the ehlo response -ddm - resp=self.ehlo_resp.split('\n') + resp = self.ehlo_resp.split('\n') del resp[0] for each in resp: # To be able to communicate with as many SMTP servers as possible, @@ -415,16 +425,16 @@ # It's actually stricter, in that only spaces are allowed between # parameters, but were not going to check for that here. Note # that the space isn't present if there are no parameters. 
- m=re.match(r'(?P[A-Za-z0-9][A-Za-z0-9\-]*) ?',each) + m = re.match(r'(?P[A-Za-z0-9][A-Za-z0-9\-]*) ?', each) if m: - feature=m.group("feature").lower() - params=m.string[m.end("feature"):].strip() + feature = m.group("feature").lower() + params = m.string[m.end("feature"):].strip() if feature == "auth": self.esmtp_features[feature] = self.esmtp_features.get(feature, "") \ + " " + params else: - self.esmtp_features[feature]=params - return (code,msg) + self.esmtp_features[feature] = params + return (code, msg) def has_extn(self, opt): """Does the server support a given SMTP service extension?""" @@ -444,23 +454,23 @@ """SMTP 'noop' command -- doesn't do anything :>""" return self.docmd("noop") - def mail(self,sender,options=[]): + def mail(self, sender, options=[]): """SMTP 'mail' command -- begins mail xfer session.""" optionlist = '' if options and self.does_esmtp: optionlist = ' ' + ' '.join(options) - self.putcmd("mail", "FROM:%s%s" % (quoteaddr(sender) ,optionlist)) + self.putcmd("mail", "FROM:%s%s" % (quoteaddr(sender), optionlist)) return self.getreply() - def rcpt(self,recip,options=[]): + def rcpt(self, recip, options=[]): """SMTP 'rcpt' command -- indicates 1 recipient for this mail.""" optionlist = '' if options and self.does_esmtp: optionlist = ' ' + ' '.join(options) - self.putcmd("rcpt","TO:%s%s" % (quoteaddr(recip),optionlist)) + self.putcmd("rcpt", "TO:%s%s" % (quoteaddr(recip), optionlist)) return self.getreply() - def data(self,msg): + def data(self, msg): """SMTP 'DATA' command -- sends message data to server. Automatically quotes lines beginning with a period per rfc821. @@ -469,26 +479,28 @@ response code received when the all data is sent. """ self.putcmd("data") - (code,repl)=self.getreply() - if self.debuglevel >0 : print>>stderr, "data:", (code,repl) + (code, repl) = self.getreply() + if self.debuglevel > 0: + print>>stderr, "data:", (code, repl) if code != 354: - raise SMTPDataError(code,repl) + raise SMTPDataError(code, repl) else: q = quotedata(msg) if q[-2:] != CRLF: q = q + CRLF q = q + "." + CRLF self.send(q) - (code,msg)=self.getreply() - if self.debuglevel >0 : print>>stderr, "data:", (code,msg) - return (code,msg) + (code, msg) = self.getreply() + if self.debuglevel > 0: + print>>stderr, "data:", (code, msg) + return (code, msg) def verify(self, address): """SMTP 'verify' command -- checks for address validity.""" self.putcmd("vrfy", quoteaddr(address)) return self.getreply() # a.k.a. - vrfy=verify + vrfy = verify def expn(self, address): """SMTP 'expn' command -- expands a mailing list.""" @@ -592,7 +604,7 @@ raise SMTPAuthenticationError(code, resp) return (code, resp) - def starttls(self, keyfile = None, certfile = None): + def starttls(self, keyfile=None, certfile=None): """Puts the connection to the SMTP server into TLS mode. 
If there has been no previous EHLO or HELO command this session, this @@ -695,22 +707,22 @@ for option in mail_options: esmtp_opts.append(option) - (code,resp) = self.mail(from_addr, esmtp_opts) + (code, resp) = self.mail(from_addr, esmtp_opts) if code != 250: self.rset() raise SMTPSenderRefused(code, resp, from_addr) - senderrs={} + senderrs = {} if isinstance(to_addrs, basestring): to_addrs = [to_addrs] for each in to_addrs: - (code,resp)=self.rcpt(each, rcpt_options) + (code, resp) = self.rcpt(each, rcpt_options) if (code != 250) and (code != 251): - senderrs[each]=(code,resp) - if len(senderrs)==len(to_addrs): + senderrs[each] = (code, resp) + if len(senderrs) == len(to_addrs): # the server refused all our recipients self.rset() raise SMTPRecipientsRefused(senderrs) - (code,resp) = self.data(msg) + (code, resp) = self.data(msg) if code != 250: self.rset() raise SMTPDataError(code, resp) @@ -753,7 +765,8 @@ self.default_port = SMTP_SSL_PORT def _get_socket(self, host, port, timeout): - if self.debuglevel > 0: print>>stderr, 'connect:', (host, port) + if self.debuglevel > 0: + print>>stderr, 'connect:', (host, port) new_socket = socket.create_connection((host, port), timeout) new_socket = ssl.wrap_socket(new_socket, self.keyfile, self.certfile) self.file = SSLFakeFile(new_socket) @@ -781,11 +794,11 @@ ehlo_msg = "lhlo" - def __init__(self, host = '', port = LMTP_PORT, local_hostname = None): + def __init__(self, host='', port=LMTP_PORT, local_hostname=None): """Initialize a new instance.""" SMTP.__init__(self, host, port, local_hostname) - def connect(self, host = 'localhost', port = 0): + def connect(self, host='localhost', port=0): """Connect to the LMTP daemon, on either a Unix or a TCP socket.""" if host[0] != '/': return SMTP.connect(self, host, port) @@ -795,13 +808,15 @@ self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) self.sock.connect(host) except socket.error, msg: - if self.debuglevel > 0: print>>stderr, 'connect fail:', host + if self.debuglevel > 0: + print>>stderr, 'connect fail:', host if self.sock: self.sock.close() self.sock = None raise socket.error, msg (code, msg) = self.getreply() - if self.debuglevel > 0: print>>stderr, "connect:", msg + if self.debuglevel > 0: + print>>stderr, "connect:", msg return (code, msg) @@ -815,7 +830,7 @@ return sys.stdin.readline().strip() fromaddr = prompt("From") - toaddrs = prompt("To").split(',') + toaddrs = prompt("To").split(',') print "Enter message, end with ^D:" msg = '' while 1: From python-checkins at python.org Thu Feb 24 21:50:49 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 21:50:49 +0100 (CET) Subject: [Python-checkins] r88550 - in python/branches/py3k: Lib/test/test_capi.py Misc/NEWS Modules/_testcapimodule.c Objects/memoryobject.c Message-ID: <20110224205049.38EEFEE981@mail.python.org> Author: antoine.pitrou Date: Thu Feb 24 21:50:49 2011 New Revision: 88550 Log: Issue #11286: Raise a ValueError from calling PyMemoryView_FromBuffer with a buffer struct having a NULL data pointer. 
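From the Python side, the new behaviour is what the test added below checks: the _testcapi helper builds a Py_buffer with a NULL data pointer, and PyMemoryView_FromBuffer now refuses it. A brief sketch of that check (requires a CPython build that includes this checkin, since the helper is added in _testcapimodule.c)::

    import _testcapi

    try:
        _testcapi.make_memoryview_from_NULL_pointer()
    except ValueError as exc:
        # The C layer now rejects the NULL data pointer instead of
        # handing back an unusable memoryview object.
        print('refused:', exc)
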
Modified: python/branches/py3k/Lib/test/test_capi.py python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/_testcapimodule.c python/branches/py3k/Objects/memoryobject.c Modified: python/branches/py3k/Lib/test/test_capi.py ============================================================================== --- python/branches/py3k/Lib/test/test_capi.py (original) +++ python/branches/py3k/Lib/test/test_capi.py Thu Feb 24 21:50:49 2011 @@ -50,6 +50,8 @@ b'Fatal Python error:' b' PyThreadState_Get: no current thread') + def test_memoryview_from_NULL_pointer(self): + self.assertRaises(ValueError, _testcapi.make_memoryview_from_NULL_pointer) @unittest.skipUnless(threading, 'Threading required for this test.') class TestPendingCalls(unittest.TestCase): Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Thu Feb 24 21:50:49 2011 @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #11286: Raise a ValueError from calling PyMemoryView_FromBuffer with + a buffer struct having a NULL data pointer. + - Issue #11272: On Windows, input() strips '\r' (and not only '\n'), and sys.stdin uses universal newline (replace '\r\n' by '\n'). Modified: python/branches/py3k/Modules/_testcapimodule.c ============================================================================== --- python/branches/py3k/Modules/_testcapimodule.c (original) +++ python/branches/py3k/Modules/_testcapimodule.c Thu Feb 24 21:50:49 2011 @@ -2231,6 +2231,15 @@ return PyErr_NewExceptionWithDoc(name, doc, base, dict); } +static PyObject * +make_memoryview_from_NULL_pointer(PyObject *self) +{ + Py_buffer info; + if (PyBuffer_FillInfo(&info, NULL, NULL, 1, 1, PyBUF_FULL_RO) < 0) + return NULL; + return PyMemoryView_FromBuffer(&info); +} + /* Test that the fatal error from not having a current thread doesn't cause an infinite loop. Run via Lib/test/test_capi.py */ static PyObject * @@ -2326,6 +2335,8 @@ {"code_newempty", code_newempty, METH_VARARGS}, {"make_exception_with_doc", (PyCFunction)make_exception_with_doc, METH_VARARGS | METH_KEYWORDS}, + {"make_memoryview_from_NULL_pointer", (PyCFunction)make_memoryview_from_NULL_pointer, + METH_NOARGS}, {"crash_no_current_thread", (PyCFunction)crash_no_current_thread, METH_NOARGS}, {NULL, NULL} /* sentinel */ }; Modified: python/branches/py3k/Objects/memoryobject.c ============================================================================== --- python/branches/py3k/Objects/memoryobject.c (original) +++ python/branches/py3k/Objects/memoryobject.c Thu Feb 24 21:50:49 2011 @@ -75,6 +75,11 @@ { PyMemoryViewObject *mview; + if (info->buf == NULL) { + PyErr_SetString(PyExc_ValueError, + "cannot make memory view from a buffer with a NULL data pointer"); + return NULL; + } mview = (PyMemoryViewObject *) PyObject_GC_New(PyMemoryViewObject, &PyMemoryView_Type); if (mview == NULL) From python-checkins at python.org Thu Feb 24 21:53:48 2011 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 24 Feb 2011 21:53:48 +0100 (CET) Subject: [Python-checkins] r88551 - in python/branches/release32-maint: Lib/test/test_capi.py Misc/NEWS Modules/_testcapimodule.c Objects/memoryobject.c Message-ID: <20110224205348.BC191F8ED@mail.python.org> Author: antoine.pitrou Date: Thu Feb 24 21:53:48 2011 New Revision: 88551 Log: Merged revisions 88550 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ 
r88550 | antoine.pitrou | 2011-02-24 21:50:49 +0100 (jeu., 24 f?vr. 2011) | 4 lines Issue #11286: Raise a ValueError from calling PyMemoryView_FromBuffer with a buffer struct having a NULL data pointer. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_capi.py python/branches/release32-maint/Misc/NEWS python/branches/release32-maint/Modules/_testcapimodule.c python/branches/release32-maint/Objects/memoryobject.c Modified: python/branches/release32-maint/Lib/test/test_capi.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_capi.py (original) +++ python/branches/release32-maint/Lib/test/test_capi.py Thu Feb 24 21:53:48 2011 @@ -50,6 +50,8 @@ b'Fatal Python error:' b' PyThreadState_Get: no current thread') + def test_memoryview_from_NULL_pointer(self): + self.assertRaises(ValueError, _testcapi.make_memoryview_from_NULL_pointer) @unittest.skipUnless(threading, 'Threading required for this test.') class TestPendingCalls(unittest.TestCase): Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Thu Feb 24 21:53:48 2011 @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #11286: Raise a ValueError from calling PyMemoryView_FromBuffer with + a buffer struct having a NULL data pointer. + - Issue #11272: On Windows, input() strips '\r' (and not only '\n'), and sys.stdin uses universal newline (replace '\r\n' by '\n'). Modified: python/branches/release32-maint/Modules/_testcapimodule.c ============================================================================== --- python/branches/release32-maint/Modules/_testcapimodule.c (original) +++ python/branches/release32-maint/Modules/_testcapimodule.c Thu Feb 24 21:53:48 2011 @@ -2231,6 +2231,15 @@ return PyErr_NewExceptionWithDoc(name, doc, base, dict); } +static PyObject * +make_memoryview_from_NULL_pointer(PyObject *self) +{ + Py_buffer info; + if (PyBuffer_FillInfo(&info, NULL, NULL, 1, 1, PyBUF_FULL_RO) < 0) + return NULL; + return PyMemoryView_FromBuffer(&info); +} + /* Test that the fatal error from not having a current thread doesn't cause an infinite loop. 
Run via Lib/test/test_capi.py */ static PyObject * @@ -2326,6 +2335,8 @@ {"code_newempty", code_newempty, METH_VARARGS}, {"make_exception_with_doc", (PyCFunction)make_exception_with_doc, METH_VARARGS | METH_KEYWORDS}, + {"make_memoryview_from_NULL_pointer", (PyCFunction)make_memoryview_from_NULL_pointer, + METH_NOARGS}, {"crash_no_current_thread", (PyCFunction)crash_no_current_thread, METH_NOARGS}, {NULL, NULL} /* sentinel */ }; Modified: python/branches/release32-maint/Objects/memoryobject.c ============================================================================== --- python/branches/release32-maint/Objects/memoryobject.c (original) +++ python/branches/release32-maint/Objects/memoryobject.c Thu Feb 24 21:53:48 2011 @@ -75,6 +75,11 @@ { PyMemoryViewObject *mview; + if (info->buf == NULL) { + PyErr_SetString(PyExc_ValueError, + "cannot make memory view from a buffer with a NULL data pointer"); + return NULL; + } mview = (PyMemoryViewObject *) PyObject_GC_New(PyMemoryViewObject, &PyMemoryView_Type); if (mview == NULL) From python-checkins at python.org Thu Feb 24 21:59:48 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Thu, 24 Feb 2011 21:59:48 +0100 (CET) Subject: [Python-checkins] r88552 - python/branches/py3k/Doc/whatsnew/3.3.rst Message-ID: <20110224205948.CCFC4ED56@mail.python.org> Author: giampaolo.rodola Date: Thu Feb 24 21:59:48 2011 New Revision: 88552 Log: Adds Python 3.3 what's new document. Added: python/branches/py3k/Doc/whatsnew/3.3.rst Added: python/branches/py3k/Doc/whatsnew/3.3.rst ============================================================================== --- (empty file) +++ python/branches/py3k/Doc/whatsnew/3.3.rst Thu Feb 24 21:59:48 2011 @@ -0,0 +1,95 @@ +**************************** + What's New In Python 3.3 +**************************** + +:Author: Raymond Hettinger +:Release: |release| +:Date: |today| + +.. $Id$ + Rules for maintenance: + + * Anyone can add text to this document. Do not spend very much time + on the wording of your changes, because your text will probably + get rewritten to some degree. + + * The maintainer will go through Misc/NEWS periodically and add + changes; it's therefore more important to add your changes to + Misc/NEWS than to this file. + + * This is not a complete list of every single change; completeness + is the purpose of Misc/NEWS. Some changes I consider too small + or esoteric to include. If such a change is added to the text, + I'll just remove it. (This is another reason you shouldn't spend + too much time on writing your addition.) + + * If you want to draw your new text to the attention of the + maintainer, add 'XXX' to the beginning of the paragraph or + section. + + * It's OK to just add a fragmentary note about a change. For + example: "XXX Describe the transmogrify() function added to the + socket module." The maintainer will research the change and + write the necessary text. + + * You can comment out your additions if you like, but it's not + necessary (especially when a final release is some months away). + + * Credit the author of a patch or bugfix. Just the name is + sufficient; the e-mail address isn't necessary. + + * It's helpful to add the bug/patch number as a comment: + + % Patch 12345 + XXX Describe the transmogrify() function added to the socket + module. + (Contributed by P.Y. Developer.) + + This saves the maintainer the effort of going through the SVN log + when researching a change. + +This article explains the new features in Python 3.3, compared to 3.2. 
+ + +PEP XXX: Stub +============= + + +Other Language Changes +====================== + +Some smaller changes made to the core Python language are: + +* Stub + + +New, Improved, and Deprecated Modules +===================================== + +* Stub + + +Optimizations +============= + +Major performance enhancements have been added: + +* Stub + + +Build and C API Changes +======================= + +Changes to Python's build process and to the C API include: + +* Stub + + +Porting to Python 3.3 +===================== + +This section lists previously described changes and other bugfixes +that may require changes to your code: + +* Stub + From python-checkins at python.org Fri Feb 25 00:31:41 2011 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 25 Feb 2011 00:31:41 +0100 Subject: [Python-checkins] devguide (hg_transition): Update instructions for the new branch names Message-ID: antoine.pitrou pushed 53f440806875 to devguide: http://hg.python.org/devguide/rev/53f440806875 changeset: 314:53f440806875 branch: hg_transition tag: tip user: Antoine Pitrou date: Fri Feb 25 00:31:39 2011 +0100 summary: Update instructions for the new branch names files: committing.rst coredev.rst devcycle.rst setup.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -97,32 +97,32 @@ XXX Update to using hg qimport if that ends up being the way non-core developers are told to go. -Assume that Python 3.2 is the current in-development version of Python and that -you have a patch that should also be applied to Python 3.1. To properly port +Assume that Python 3.3 is the current in-development version of Python and that +you have a patch that should also be applied to Python 3.2. To properly port the patch to both versions of Python, you should first apply the patch to -Python 3.1:: +Python 3.2:: - hg update release-31maint + hg update 3.2 patch -p1 < patch.diff # Compile; run the test suite hg commit -With the patch now committed, you want to merge the patch up into Python 3.2. +With the patch now committed, you want to merge the patch up into Python 3.3. This should be done *before* pushing your changes to hg.python.org, so that the branches are in sync on the public repository. Assuming you are doing all of your work in a single clone, do:: - hg update py3k - hg merge release-31maint + hg update default + hg merge 3.2 # Fix any conflicts; compile; run the test suite hg commit .. note:: - *If the patch shouldn't be ported* from Python 3.1 to Python 3.2, you must + *If the patch shouldn't be ported* from Python 3.2 to Python 3.3, you must also make it explicit: merge the changes but revert them before committing:: - hg update py3k - hg merge release-31maint + hg update default + hg merge 3.2 hg revert -a hg commit @@ -135,7 +135,7 @@ hg push -This will push changes in both the Python 3.1 and Python 3.2 branches to +This will push changes in both the Python 3.2 and Python 3.3 branches to hg.python.org. @@ -147,9 +147,9 @@ "hg qimport -r tip -P" afterwards but that would add another level of complexity. -To move a patch between, e.g., Python 3.1 and 2.7, use the `transplant +To move a patch between, e.g., Python 3.2 and 2.7, use the `transplant extension`_. 
Assuming you committed in Python 2.7 first, to pull changeset -``a7df1a869e4a`` into Python 3.1, do:: +``a7df1a869e4a`` into Python 3.2, do:: hg transplant -s a7df1a869e4a # Compile; run the test suite diff --git a/coredev.rst b/coredev.rst --- a/coredev.rst +++ b/coredev.rst @@ -107,12 +107,15 @@ are different than those for read-only checkouts as SSH is used instead of HTTP. -For the development branch, you can check out the development branch with:: +You can clone the repository (which contains all active branches) with:: - hg clone hg at hg.python.org/cpython#py3k + hg clone ssh://hg at hg.python.org/cpython -Make the appropriate changes to the URL to checkout maintenance branches by -removing ``py3k`` and replacing it with the name of the branch you want. +The default branch in that repository is the current development branch. +You can of course switch your working copy to one of the maintenance branches, +for example:: + + hg update 2.7 Responsibilities diff --git a/devcycle.rst b/devcycle.rst --- a/devcycle.rst +++ b/devcycle.rst @@ -47,9 +47,8 @@ http://hg.python.org/cpython#py3k. Once a :ref:`final` release is made from the in-development branch (say, 3.2), a -new :ref:`maintenance branch ` (e.g. ``release32-maint``) -is created to host all bug fixing activity for further micro versions -(3.2.1, 3.2.2, etc.). +new :ref:`maintenance branch ` is created to host all bug fixing +activity for further micro versions (3.2.1, 3.2.2, etc.). .. _maintbranch: diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -44,11 +44,9 @@ i.e., a version in :ref:`maintenance mode `, you can update your working copy. For instance, to update your working copy to Python 3.1, do:: - hg update release-31maint + hg update 3.1 -To get a version of Python other than 3.1, simply change the number in -the above example to the major/minor version (e.g., ``release27-maint`` for -Python 2.7). You will need to re-compile CPython when you do an update. +You will need to re-compile CPython when you do such an update. Do note that CPython will notice that it is being run from a working copy. This means that it if you edit CPython's source code in your working copy the -- Repository URL: http://hg.python.org/devguide From nnorwitz at gmail.com Fri Feb 25 00:48:48 2011 From: nnorwitz at gmail.com (Neal Norwitz) Date: Thu, 24 Feb 2011 18:48:48 -0500 Subject: [Python-checkins] Python Regression Test Failures refleak (1) Message-ID: <20110224234848.GA30592@kbk-i386-bb.dyndns.org> More important issues: ---------------------- test_itertools leaked [0, 0, 32] references, sum=32 Less important issues: ---------------------- From python-checkins at python.org Fri Feb 25 01:47:22 2011 From: python-checkins at python.org (senthil.kumaran) Date: Fri, 25 Feb 2011 01:47:22 +0100 Subject: [Python-checkins] cpython: Test Commit. Message-ID: senthil.kumaran pushed 42daecbc905c to cpython: http://hg.python.org/cpython/rev/42daecbc905c changeset: 68027:42daecbc905c tag: tip user: orsenthil at gmail.com date: Fri Feb 25 08:47:09 2011 +0800 summary: Test Commit. files: Misc/NEWS diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2,9 +2,17 @@ Python News +++++++++++ + What's New in Python 3.3 Alpha 1? ================================= +Infrastructure +-------------- + +- Mercurial Repository for CPython http://hg.python.org/cpython/ (Under Test) + by __ap__ and birkenfeld. 
+ + *Release date: XX-XXX-20XX* Core and Builtins -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Feb 25 02:14:14 2011 From: python-checkins at python.org (alexander.belopolsky) Date: Fri, 25 Feb 2011 02:14:14 +0100 (CET) Subject: [Python-checkins] r88553 - in python/branches/py3k: Lib/test/test_bytes.py Lib/test/test_unicode.py Objects/unicodeobject.c Message-ID: <20110225011414.D3225EE98F@mail.python.org> Author: alexander.belopolsky Date: Fri Feb 25 02:14:14 2011 New Revision: 88553 Log: Issue #11311: Short-circuit default encoding case in PyUnicode_Decode() and PyUnicode_AsEncodedString(). Thanks Ezio Melotti for the tests and the review. Modified: python/branches/py3k/Lib/test/test_bytes.py python/branches/py3k/Lib/test/test_unicode.py python/branches/py3k/Objects/unicodeobject.c Modified: python/branches/py3k/Lib/test/test_bytes.py ============================================================================== --- python/branches/py3k/Lib/test/test_bytes.py (original) +++ python/branches/py3k/Lib/test/test_bytes.py Fri Feb 25 02:14:14 2011 @@ -206,6 +206,8 @@ self.assertEqual(b.decode("utf8", "ignore"), "Hello world\n") self.assertEqual(b.decode(errors="ignore", encoding="utf8"), "Hello world\n") + # Default encoding is utf-8 + self.assertEqual(self.type2test(b'\xe2\x98\x83').decode(), '\u2603') def test_from_int(self): b = self.type2test(0) Modified: python/branches/py3k/Lib/test/test_unicode.py ============================================================================== --- python/branches/py3k/Lib/test/test_unicode.py (original) +++ python/branches/py3k/Lib/test/test_unicode.py Fri Feb 25 02:14:14 2011 @@ -1187,6 +1187,9 @@ self.assertEqual('hello'.encode('utf-16-be'), b'\000h\000e\000l\000l\000o') self.assertEqual('hello'.encode('latin-1'), b'hello') + # Default encoding is utf-8 + self.assertEqual('\u2603'.encode(), b'\xe2\x98\x83') + # Roundtrip safety for BMP (just the first 1024 chars) for c in range(1024): u = chr(c) Modified: python/branches/py3k/Objects/unicodeobject.c ============================================================================== --- python/branches/py3k/Objects/unicodeobject.c (original) +++ python/branches/py3k/Objects/unicodeobject.c Fri Feb 25 02:14:14 2011 @@ -1462,7 +1462,7 @@ char lower[11]; /* Enough for any encoding shortcut */ if (encoding == NULL) - encoding = PyUnicode_GetDefaultEncoding(); + return PyUnicode_DecodeUTF8(s, size, errors); /* Shortcuts for common default encodings */ if (normalize_encoding(encoding, lower, sizeof(lower))) { @@ -1670,7 +1670,9 @@ } if (encoding == NULL) - encoding = PyUnicode_GetDefaultEncoding(); + return PyUnicode_EncodeUTF8(PyUnicode_AS_UNICODE(unicode), + PyUnicode_GET_SIZE(unicode), + errors); /* Shortcuts for common default encodings */ if (normalize_encoding(encoding, lower, sizeof(lower))) { From python-checkins at python.org Fri Feb 25 02:30:35 2011 From: python-checkins at python.org (raymond.hettinger) Date: Fri, 25 Feb 2011 02:30:35 +0100 Subject: [Python-checkins] cpython: Add entry for ChainMap Message-ID: raymond.hettinger pushed 873adf3744ca to cpython: http://hg.python.org/cpython/rev/873adf3744ca changeset: 68028:873adf3744ca tag: tip user: Raymond Hettinger date: Thu Feb 24 17:30:25 2011 -0800 summary: Add entry for ChainMap files: Doc/library/collections.rst diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst --- a/Doc/library/collections.rst +++ b/Doc/library/collections.rst @@ -23,6 +23,7 @@ ===================== 
==================================================================== :func:`namedtuple` factory function for creating tuple subclasses with named fields :class:`deque` list-like container with fast appends and pops on either end +:class:`ChainMap` dict-like view of multiple mappings :class:`Counter` dict subclass for counting hashable objects :class:`OrderedDict` dict subclass that remembers the order entries were added :class:`defaultdict` dict subclass that calls a factory function to supply missing values -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Fri Feb 25 05:05:00 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 25 Feb 2011 05:05:00 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88553): sum=0 Message-ID: py3k results for svn r88553 (hg cset 1699d32c9580) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogVHwbZm', '-x'] From python-checkins at python.org Fri Feb 25 06:47:54 2011 From: python-checkins at python.org (eli.bendersky) Date: Fri, 25 Feb 2011 06:47:54 +0100 (CET) Subject: [Python-checkins] r88554 - in python/branches/py3k: Doc/library/stdtypes.rst Lib/collections/__init__.py Lib/test/list_tests.py Lib/test/test_descrtut.py Misc/NEWS Objects/listobject.c Message-ID: <20110225054754.3349EEE984@mail.python.org> Author: eli.bendersky Date: Fri Feb 25 06:47:53 2011 New Revision: 88554 Log: Issue #10516: adding list.clear() and list.copy() methods Modified: python/branches/py3k/Doc/library/stdtypes.rst python/branches/py3k/Lib/collections/__init__.py python/branches/py3k/Lib/test/list_tests.py python/branches/py3k/Lib/test/test_descrtut.py python/branches/py3k/Misc/NEWS python/branches/py3k/Objects/listobject.c Modified: python/branches/py3k/Doc/library/stdtypes.rst ============================================================================== --- python/branches/py3k/Doc/library/stdtypes.rst (original) +++ python/branches/py3k/Doc/library/stdtypes.rst Fri Feb 25 06:47:53 2011 @@ -1642,6 +1642,8 @@ single: append() (sequence method) single: extend() (sequence method) single: count() (sequence method) + single: clear() (sequence method) + single: copy() (sequence method) single: index() (sequence method) single: insert() (sequence method) single: pop() (sequence method) @@ -1673,6 +1675,12 @@ | ``s.extend(x)`` | same as ``s[len(s):len(s)] = | \(2) | | | x`` | | +------------------------------+--------------------------------+---------------------+ +| ``s.clear()`` | remove all items from ``s`` | \(8) | +| | | | ++------------------------------+--------------------------------+---------------------+ +| ``s.copy()`` | return a shallow copy of ``s`` | \(8) | +| | | | ++------------------------------+--------------------------------+---------------------+ | ``s.count(x)`` | return number of *i*'s for | | | | which ``s[i] == x`` | | +------------------------------+--------------------------------+---------------------+ @@ -1749,7 +1757,11 @@ detect that the list has been mutated during a sort. (8) - :meth:`sort` is not supported by :class:`bytearray` objects. + :meth:`clear`, :meth:`!copy` and :meth:`sort` are not supported by + :class:`bytearray` objects. + + .. versionadded:: 3.3 + :meth:`clear` and :meth:`!copy` methods. .. 
_bytes-methods: Modified: python/branches/py3k/Lib/collections/__init__.py ============================================================================== --- python/branches/py3k/Lib/collections/__init__.py (original) +++ python/branches/py3k/Lib/collections/__init__.py Fri Feb 25 06:47:53 2011 @@ -844,6 +844,8 @@ def insert(self, i, item): self.data.insert(i, item) def pop(self, i=-1): return self.data.pop(i) def remove(self, item): self.data.remove(item) + def clear(self): self.data.clear() + def copy(self): return self.data.copy() def count(self, item): return self.data.count(item) def index(self, item, *args): return self.data.index(item, *args) def reverse(self): self.data.reverse() Modified: python/branches/py3k/Lib/test/list_tests.py ============================================================================== --- python/branches/py3k/Lib/test/list_tests.py (original) +++ python/branches/py3k/Lib/test/list_tests.py Fri Feb 25 06:47:53 2011 @@ -425,6 +425,48 @@ self.assertRaises(TypeError, u.reverse, 42) + def test_clear(self): + u = self.type2test([2, 3, 4]) + u.clear() + self.assertEqual(u, []) + + u = self.type2test([]) + u.clear() + self.assertEqual(u, []) + + u = self.type2test([]) + u.append(1) + u.clear() + u.append(2) + self.assertEqual(u, [2]) + + self.assertRaises(TypeError, u.clear, None) + + def test_copy(self): + u = self.type2test([1, 2, 3]) + v = u.copy() + self.assertEqual(v, [1, 2, 3]) + + u = self.type2test([]) + v = u.copy() + self.assertEqual(v, []) + + # test that it's indeed a copy and not a reference + u = self.type2test(['a', 'b']) + v = u.copy() + v.append('i') + self.assertEqual(u, ['a', 'b']) + self.assertEqual(v, u + ['i']) + + # test that it's a shallow, not a deep copy + u = self.type2test([1, 2, [3, 4], 5]) + v = u.copy() + v[2].append(666) + self.assertEqual(u, [1, 2, [3, 4, 666], 5]) + self.assertEqual(u, v) + + self.assertRaises(TypeError, u.copy, None) + def test_sort(self): u = self.type2test([1, 0]) u.sort() Modified: python/branches/py3k/Lib/test/test_descrtut.py ============================================================================== --- python/branches/py3k/Lib/test/test_descrtut.py (original) +++ python/branches/py3k/Lib/test/test_descrtut.py Fri Feb 25 06:47:53 2011 @@ -199,6 +199,8 @@ '__str__', '__subclasshook__', 'append', + 'clear', + 'copy', 'count', 'extend', 'index', Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Fri Feb 25 06:47:53 2011 @@ -30,6 +30,8 @@ - Check for NULL result in PyType_FromSpec. +- Issue #10516: New copy() and clear() methods for lists. 
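A minimal sketch of what the new Issue #10516 methods do (written for this digest, not taken from the patch; "nums" is just an illustrative name):

    nums = [1, 2, [3, 4]]
    dup = nums.copy()        # shallow copy, equivalent to nums[:]
    dup[2].append(5)         # the nested list is shared between both copies
    nums.clear()             # empties nums in place, like del nums[:]
    print(nums)              # []
    print(dup)               # [1, 2, [3, 4, 5]]
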
+ Library ------- Modified: python/branches/py3k/Objects/listobject.c ============================================================================== --- python/branches/py3k/Objects/listobject.c (original) +++ python/branches/py3k/Objects/listobject.c Fri Feb 25 06:47:53 2011 @@ -747,6 +747,19 @@ } static PyObject * +listclear(PyListObject *self) +{ + list_clear(self); + Py_RETURN_NONE; +} + +static PyObject * +listcopy(PyListObject *self) +{ + return list_slice(self, 0, Py_SIZE(self)); +} + +static PyObject * listappend(PyListObject *self, PyObject *v) { if (app1(self, v) == 0) @@ -2322,6 +2335,10 @@ "L.__reversed__() -- return a reverse iterator over the list"); PyDoc_STRVAR(sizeof_doc, "L.__sizeof__() -- size of L in memory, in bytes"); +PyDoc_STRVAR(clear_doc, +"L.clear() -> None -- remove all items from L"); +PyDoc_STRVAR(copy_doc, +"L.copy() -> list -- a shallow copy of L"); PyDoc_STRVAR(append_doc, "L.append(object) -- append object to end"); PyDoc_STRVAR(extend_doc, @@ -2350,9 +2367,11 @@ {"__getitem__", (PyCFunction)list_subscript, METH_O|METH_COEXIST, getitem_doc}, {"__reversed__",(PyCFunction)list_reversed, METH_NOARGS, reversed_doc}, {"__sizeof__", (PyCFunction)list_sizeof, METH_NOARGS, sizeof_doc}, + {"clear", (PyCFunction)listclear, METH_NOARGS, clear_doc}, + {"copy", (PyCFunction)listcopy, METH_NOARGS, copy_doc}, {"append", (PyCFunction)listappend, METH_O, append_doc}, {"insert", (PyCFunction)listinsert, METH_VARARGS, insert_doc}, - {"extend", (PyCFunction)listextend, METH_O, extend_doc}, + {"extend", (PyCFunction)listextend, METH_O, extend_doc}, {"pop", (PyCFunction)listpop, METH_VARARGS, pop_doc}, {"remove", (PyCFunction)listremove, METH_O, remove_doc}, {"index", (PyCFunction)listindex, METH_VARARGS, index_doc}, From python-checkins at python.org Fri Feb 25 07:34:19 2011 From: python-checkins at python.org (georg.brandl) Date: Fri, 25 Feb 2011 07:34:19 +0100 Subject: [Python-checkins] cpython: Remove unused or duplicate tags. Message-ID: georg.brandl pushed 4a128b3a7e1d to cpython: http://hg.python.org/cpython/rev/4a128b3a7e1d changeset: 68029:4a128b3a7e1d tag: tip user: Georg Brandl date: Fri Feb 25 07:33:20 2011 +0100 summary: Remove unused or duplicate tags. 
files: .hgtags diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -7,9 +7,6 @@ 0100c8cd81ab8940cabe57a718140f456a91a8d7 v1.2b1 f70d079881e01e6185a54ee1c358fa3193e53552 v1.2b2 d18868883417a7b8505324444c11c1da13dc03d8 v1.2b3 -4dae144b69208c10fcbfa9f7520561d2a02374f2 Beta_14-Mar-1995-#2 -27694bad85e50d70266c8db7b9189ef9ef554999 Beta_14-Mar-1995-#3 -6c8b7cf638097a70276913caa5f799fb35fb121a Beta_15-Mar-1995-#2 e3c717d1c4fda7005cbc0067155a82857525e2c2 v1.2b4 9b35203a3faa3086e952444e6668127776edcf26 v1.2 3585c210bea96cf9cb52df0b1db392d0ea491427 v1.3b1 @@ -59,23 +56,11 @@ a9b75e9fef87d0108b5df68736f2f716543a2d4f v3.0a2 7f931276ca3474fbb025be1a3ea44489ee6ff5b4 v3.0a3 5453f5a4dc4a0af2b3dd543a1723f4e06e71d78c v3.0a4 -3579d9ff602e6fac345e6b5f591461729dc067c2 v3.0a5 -b9cfab14a7963becb64364eb27fa7187536dde72 r30a5/py3k -0000000000000000000000000000000000000000 r30a5/py3k -b9cfab14a7963becb64364eb27fa7187536dde72 r30a5-real -0000000000000000000000000000000000000000 v3.0a5 -0000000000000000000000000000000000000000 r30a5-real b9cfab14a7963becb64364eb27fa7187536dde72 v3.0a5 85e8adbc2a4e76931eba48368a31f626c331fe75 v3.0b1 628485807fb9981196b481a6792834ffc933673c v3.0b2 d75444c81ebd16b3ed2bf33a16e15e8b6a104b8f v3.0b3 003f8abfb1683efc68dd3c42cc30c2db18d50b7c v3.0rc1 -8a9665fd08246e46c3fded66c9fd26ef0d6eb9b8 r30rc1/py3k -0000000000000000000000000000000000000000 r30rc1/py3k -70f94fcb2359c21d1ad8c88cb1c954ff8c2227b5 v3.0rc2 -0000000000000000000000000000000000000000 v3.0rc2 -e6deba920da959a2e280d5b540a16c43eff48b20 v3.0rc2 -0000000000000000000000000000000000000000 v3.0rc2 3fa415e7f7247089561065b19904a1cf49728681 v3.0rc2 f131f3d09012705aca05621edbca3103405dbe70 v3.0rc3 2067a7ad807ccc8adc8f5a15f4b128181d5edcb7 v3.1a1 @@ -85,13 +70,9 @@ 5823c3912150e1f089c6484468d54fbba2e52f47 v3.1rc2 7723bc83e6a3fff2d668fb3f0d450e6f0fbb6700 v3.1 37ddbc7e3a1bfb5c662d32767af812dfb1bb4180 v3.2a1 -1271447d4654f5082e10547e6c9dffbe16366afd v3.2a2 -0000000000000000000000000000000000000000 v3.2a2 948171bc42811e1fc3d71989fe2443cf7ddf33cb v3.2a2 713a595cf39d336fbf072c34064fe70327586a2a v3.2a3 055120bdeb975f6b0ddabb4c6981ec2ed3d0ec1c v3.2a4 -53119407bd0811b9806fe3efceff505047fe4c40 v3.2b1 -0000000000000000000000000000000000000000 v3.2b1 0a65a0ac892a25088388a8d7f0c21af8d5a00430 v3.2b1 a522787a0bec661342c93a3f0d293f7a77e6f418 v3.2b2 34beaba435d572273f7c9d64df8da7de2451e643 v3.2rc1 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Feb 25 09:25:27 2011 From: python-checkins at python.org (georg.brandl) Date: Fri, 25 Feb 2011 09:25:27 +0100 (CET) Subject: [Python-checkins] r88555 - python/branches/release32-maint Message-ID: <20110225082527.B537BEE987@mail.python.org> Author: georg.brandl Date: Fri Feb 25 09:25:27 2011 New Revision: 88555 Log: Blocked revisions 88447-88449,88451,88459,88523-88524 via svnmerge ........ r88447 | georg.brandl | 2011-02-20 11:33:21 +0100 (So, 20 Feb 2011) | 2 lines Bump to 3.3a0. ........ r88448 | georg.brandl | 2011-02-20 11:37:07 +0100 (So, 20 Feb 2011) | 1 line Bump trunk to 3.3 alpha 0. ........ r88449 | georg.brandl | 2011-02-20 11:41:31 +0100 (So, 20 Feb 2011) | 1 line More automated version replacement. ........ r88451 | georg.brandl | 2011-02-20 12:18:09 +0100 (So, 20 Feb 2011) | 1 line Remove unittest methods scheduled for removal in 3.3 -- makes the unittest test suite pass again. ........ r88459 | raymond.hettinger | 2011-02-21 18:54:36 +0100 (Mo, 21 Feb 2011) | 1 line Issue 10160: Both single-arg and multi-arg calls have been sped-up. ........ 
r88523 | georg.brandl | 2011-02-23 08:30:12 +0100 (Mi, 23 Feb 2011) | 1 line Add new subdirectory to LIBSUBDIRS. ........ r88524 | georg.brandl | 2011-02-23 08:31:24 +0100 (Mi, 23 Feb 2011) | 1 line Indent "versionadded" properly. ........ Modified: python/branches/release32-maint/ (props changed) From python-checkins at python.org Fri Feb 25 10:48:22 2011 From: python-checkins at python.org (georg.brandl) Date: Fri, 25 Feb 2011 10:48:22 +0100 (CET) Subject: [Python-checkins] r88556 - in python/branches/release31-maint: Doc/c-api/init.rst Doc/c-api/list.rst Doc/howto/unicode.rst Doc/library/dbm.rst Doc/library/difflib.rst Doc/library/logging.rst Doc/library/os.rst Doc/library/pickle.rst Doc/library/urllib.request.rst Doc/library/zipfile.rst Doc/reference/expressions.rst Lib/_weakrefset.py Lib/pydoc.py Lib/test/test_weakset.py Lib/urllib/parse.py Misc/NEWS Objects/bytearrayobject.c Message-ID: <20110225094822.62848EE98A@mail.python.org> Author: georg.brandl Date: Fri Feb 25 10:48:21 2011 New Revision: 88556 Log: Merged revisions 86537,86867-86868,86881,86887,86913-86915,86931-86933,86960,86964,86974,86980,86996,87008,87050 via svnmerge from svn+ssh://svn.python.org/python/branches/py3k ........ r86537 | georg.brandl | 2010-11-19 23:09:04 +0100 (Fr, 19 Nov 2010) | 1 line Do not put a raw REPLACEMENT CHARACTER in the document. ........ r86867 | georg.brandl | 2010-11-29 15:50:54 +0100 (Mo, 29 Nov 2010) | 1 line Fix indentation bug. ........ r86868 | georg.brandl | 2010-11-29 15:53:15 +0100 (Mo, 29 Nov 2010) | 1 line Fix heading style inconsistencies. ........ r86881 | georg.brandl | 2010-11-30 08:43:28 +0100 (Di, 30 Nov 2010) | 1 line #10584: fix bad links. ........ r86887 | georg.brandl | 2010-11-30 15:57:54 +0100 (Di, 30 Nov 2010) | 1 line Fix typo. ........ r86913 | georg.brandl | 2010-12-01 16:32:43 +0100 (Mi, 01 Dez 2010) | 1 line Add missing word, and add a better reference to the actual function. ........ r86914 | georg.brandl | 2010-12-01 16:36:33 +0100 (Mi, 01 Dez 2010) | 1 line #10594: fix parameter names in PyList API docs. ........ r86915 | georg.brandl | 2010-12-01 16:44:25 +0100 (Mi, 01 Dez 2010) | 1 line Fix some markup and style in the unittest docs. ........ r86931 | georg.brandl | 2010-12-02 10:06:12 +0100 (Do, 02 Dez 2010) | 1 line Fix-up documentation of makedirs(). ........ r86932 | david.malcolm | 2010-12-02 17:41:00 +0100 (Do, 02 Dez 2010) | 2 lines Fix spelling of Jamie Zawinski's surname in urllib.parse docstring (issue 10606) ........ r86933 | georg.brandl | 2010-12-02 19:02:01 +0100 (Do, 02 Dez 2010) | 1 line #10597: fix Py_SetPythonHome docs by pointing to where the meaning of PYTHONHOME is already documented. ........ r86960 | georg.brandl | 2010-12-03 08:55:44 +0100 (Fr, 03 Dez 2010) | 1 line #10360: catch TypeError in WeakSet.__contains__, just like WeakKeyDictionary does. ........ r86964 | georg.brandl | 2010-12-03 10:58:38 +0100 (Fr, 03 Dez 2010) | 1 line #10549: fix interface of docclass() for text documenter. ........ r86974 | georg.brandl | 2010-12-03 16:30:09 +0100 (Fr, 03 Dez 2010) | 1 line Markup consistency fixes. ........ r86980 | georg.brandl | 2010-12-03 18:19:27 +0100 (Fr, 03 Dez 2010) | 1 line Fix punctuation. ........ r86996 | georg.brandl | 2010-12-03 20:56:42 +0100 (Fr, 03 Dez 2010) | 1 line Fix indentation. ........ r87008 | georg.brandl | 2010-12-04 10:04:04 +0100 (Sa, 04 Dez 2010) | 1 line Fix typo. ........ r87050 | georg.brandl | 2010-12-04 18:09:30 +0100 (Sa, 04 Dez 2010) | 1 line Fix typo. ........ 
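As an illustration of the r86960 / issue #10360 change listed in the merge log above, WeakSet membership tests no longer raise TypeError for objects that cannot be weakly referenced. A small sketch (illustrative only; the Node class is invented for the example):

    import weakref

    class Node:
        pass

    obj = Node()
    live = weakref.WeakSet([obj])
    print(obj in live)    # True
    print(1 in live)      # False; ints are not weak-referenceable, but the
                          # TypeError is now caught inside __contains__
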
Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/c-api/init.rst python/branches/release31-maint/Doc/c-api/list.rst python/branches/release31-maint/Doc/howto/unicode.rst python/branches/release31-maint/Doc/library/dbm.rst python/branches/release31-maint/Doc/library/difflib.rst python/branches/release31-maint/Doc/library/logging.rst python/branches/release31-maint/Doc/library/os.rst python/branches/release31-maint/Doc/library/pickle.rst python/branches/release31-maint/Doc/library/urllib.request.rst python/branches/release31-maint/Doc/library/zipfile.rst python/branches/release31-maint/Doc/reference/expressions.rst python/branches/release31-maint/Lib/_weakrefset.py python/branches/release31-maint/Lib/pydoc.py python/branches/release31-maint/Lib/test/test_weakset.py python/branches/release31-maint/Lib/urllib/parse.py python/branches/release31-maint/Misc/NEWS python/branches/release31-maint/Objects/bytearrayobject.c Modified: python/branches/release31-maint/Doc/c-api/init.rst ============================================================================== --- python/branches/release31-maint/Doc/c-api/init.rst (original) +++ python/branches/release31-maint/Doc/c-api/init.rst Fri Feb 25 10:48:21 2011 @@ -318,8 +318,9 @@ .. cfunction:: void Py_SetPythonHome(wchar_t *home) Set the default "home" directory, that is, the location of the standard - Python libraries. The libraries are searched in - :file:`{home}/lib/python{version}` and :file:`{home}/lib/python{version}`. + Python libraries. See :envvar:`PYTHONHOME` for the meaning of the + argument string. + The argument should point to a zero-terminated character string in static storage whose contents will not change for the duration of the program's execution. No code in the Python interpreter will change the contents of Modified: python/branches/release31-maint/Doc/c-api/list.rst ============================================================================== --- python/branches/release31-maint/Doc/c-api/list.rst (original) +++ python/branches/release31-maint/Doc/c-api/list.rst Fri Feb 25 10:48:21 2011 @@ -37,7 +37,7 @@ .. note:: - If *length* is greater than zero, the returned list object's items are + If *len* is greater than zero, the returned list object's items are set to ``NULL``. Thus you cannot use abstract API functions such as :cfunc:`PySequence_SetItem` or expose the object to Python code before setting all items to a real object with :cfunc:`PyList_SetItem`. @@ -58,9 +58,9 @@ .. cfunction:: PyObject* PyList_GetItem(PyObject *list, Py_ssize_t index) - Return the object at position *pos* in the list pointed to by *p*. The + Return the object at position *index* in the list pointed to by *list*. The position must be positive, indexing from the end of the list is not - supported. If *pos* is out of bounds, return *NULL* and set an + supported. If *index* is out of bounds, return *NULL* and set an :exc:`IndexError` exception. 
Modified: python/branches/release31-maint/Doc/howto/unicode.rst ============================================================================== --- python/branches/release31-maint/Doc/howto/unicode.rst (original) +++ python/branches/release31-maint/Doc/howto/unicode.rst Fri Feb 25 10:48:21 2011 @@ -265,10 +265,13 @@ UnicodeDecodeError: 'utf8' codec can't decode byte 0x80 in position 0: unexpected code byte >>> b'\x80abc'.decode("utf-8", "replace") - '\ufffdabc' + '?abc' >>> b'\x80abc'.decode("utf-8", "ignore") 'abc' +(In this code example, the Unicode replacement character has been replaced by +a question mark because it may not be displayed on some systems.) + Encodings are specified as strings containing the encoding's name. Python 3.2 comes with roughly 100 different encodings; see the Python Library Reference at :ref:`standard-encodings` for a list. Some encodings have multiple names; for Modified: python/branches/release31-maint/Doc/library/dbm.rst ============================================================================== --- python/branches/release31-maint/Doc/library/dbm.rst (original) +++ python/branches/release31-maint/Doc/library/dbm.rst Fri Feb 25 10:48:21 2011 @@ -20,7 +20,7 @@ .. function:: whichdb(filename) - This functionattempts to guess which of the several simple database modules + This function attempts to guess which of the several simple database modules available --- :mod:`dbm.gnu`, :mod:`dbm.ndbm` or :mod:`dbm.dumb` --- should be used to open a given file. Modified: python/branches/release31-maint/Doc/library/difflib.rst ============================================================================== --- python/branches/release31-maint/Doc/library/difflib.rst (original) +++ python/branches/release31-maint/Doc/library/difflib.rst Fri Feb 25 10:48:21 2011 @@ -347,7 +347,6 @@ :class:`SequenceMatcher` objects have the following methods: - .. method:: set_seqs(a, b) Set the two sequences to be compared. Modified: python/branches/release31-maint/Doc/library/logging.rst ============================================================================== --- python/branches/release31-maint/Doc/library/logging.rst (original) +++ python/branches/release31-maint/Doc/library/logging.rst Fri Feb 25 10:48:21 2011 @@ -473,9 +473,7 @@ just "foo". .. versionadded:: 3.1 - The :class:`NullHandler` class was not present in previous versions, but is - now included, so that it need not be defined in library code. - + The :class:`NullHandler` class. Logging Levels @@ -593,8 +591,7 @@ more information. .. versionadded:: 3.1 - -The :class:`NullHandler` class was not present in previous versions. + The :class:`NullHandler` class. The :class:`NullHandler`, :class:`StreamHandler` and :class:`FileHandler` classes are defined in the core logging package. The other handlers are @@ -1816,6 +1813,7 @@ Outputs the record to the file. + .. _null-handler: NullHandler @@ -1827,12 +1825,10 @@ does not do any formatting or output. It is essentially a "no-op" handler for use by library developers. - .. class:: NullHandler() Returns a new instance of the :class:`NullHandler` class. - .. method:: emit(record) This method does nothing. @@ -2609,6 +2605,7 @@ Returns the message for this :class:`LogRecord` instance after merging any user-supplied arguments with the message. + .. _logger-adapter: LoggerAdapter Objects @@ -2616,22 +2613,21 @@ :class:`LoggerAdapter` instances are used to conveniently pass contextual information into logging calls. 
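As a concrete illustration of the LoggerAdapter pattern this section describes, a minimal sketch (the logger name and the client_ip value are invented for the example; this is not part of the quoted patch):

    import logging

    logging.basicConfig(format="%(client_ip)s %(message)s", level=logging.INFO)
    log = logging.LoggerAdapter(logging.getLogger("app"),
                                {"client_ip": "203.0.113.7"})
    log.info("user logged in")    # -> 203.0.113.7 user logged in
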
For a usage example , see the section on -`adding contextual information to your logging output`__. +:ref:`adding contextual information to your logging output `. -__ context-info_ .. class:: LoggerAdapter(logger, extra) - Returns an instance of :class:`LoggerAdapter` initialized with an - underlying :class:`Logger` instance and a dict-like object. + Returns an instance of :class:`LoggerAdapter` initialized with an + underlying :class:`Logger` instance and a dict-like object. - .. method:: process(msg, kwargs) + .. method:: process(msg, kwargs) - Modifies the message and/or keyword arguments passed to a logging call in - order to insert contextual information. This implementation takes the object - passed as *extra* to the constructor and adds it to *kwargs* using key - 'extra'. The return value is a (*msg*, *kwargs*) tuple which has the - (possibly modified) versions of the arguments passed in. + Modifies the message and/or keyword arguments passed to a logging call in + order to insert contextual information. This implementation takes the object + passed as *extra* to the constructor and adds it to *kwargs* using key + 'extra'. The return value is a (*msg*, *kwargs*) tuple which has the + (possibly modified) versions of the arguments passed in. In addition to the above, :class:`LoggerAdapter` supports all the logging methods of :class:`Logger`, i.e. :meth:`debug`, :meth:`info`, :meth:`warning`, Modified: python/branches/release31-maint/Doc/library/os.rst ============================================================================== --- python/branches/release31-maint/Doc/library/os.rst (original) +++ python/branches/release31-maint/Doc/library/os.rst Fri Feb 25 10:48:21 2011 @@ -1015,7 +1015,7 @@ Availability: Unix, Windows. -.. function:: makedirs(path[, mode]) +.. function:: makedirs(path, mode=0o777) .. index:: single: directory; creating @@ -1029,8 +1029,8 @@ .. note:: - :func:`makedirs` will become confused if the path elements to create include - :data:`os.pardir`. + :func:`makedirs` will become confused if the path elements to create + include :data:`pardir`. This function handles UNC paths correctly. Modified: python/branches/release31-maint/Doc/library/pickle.rst ============================================================================== --- python/branches/release31-maint/Doc/library/pickle.rst (original) +++ python/branches/release31-maint/Doc/library/pickle.rst Fri Feb 25 10:48:21 2011 @@ -42,7 +42,7 @@ objects. :mod:`marshal` exists primarily to support Python's :file:`.pyc` files. -The :mod:`pickle` module differs from :mod:`marshal` several significant ways: +The :mod:`pickle` module differs from :mod:`marshal` in several significant ways: * The :mod:`pickle` module keeps track of the objects it has already serialized, so that later references to the same object won't be serialized again. Modified: python/branches/release31-maint/Doc/library/urllib.request.rst ============================================================================== --- python/branches/release31-maint/Doc/library/urllib.request.rst (original) +++ python/branches/release31-maint/Doc/library/urllib.request.rst Fri Feb 25 10:48:21 2011 @@ -1,4 +1,4 @@ -:mod:`urllib.request` --- extensible library for opening URLs +:mod:`urllib.request` --- Extensible library for opening URLs ============================================================= .. module:: urllib.request @@ -299,18 +299,19 @@ users for the required information on the controlling terminal. 
A subclass may override this method to support more appropriate behavior if needed. - The :class:`FancyURLopener` class offers one additional method that should be - overloaded to provide the appropriate behavior: + The :class:`FancyURLopener` class offers one additional method that should be + overloaded to provide the appropriate behavior: - .. method:: prompt_user_passwd(host, realm) + .. method:: prompt_user_passwd(host, realm) + + Return information needed to authenticate the user at the given host in the + specified security realm. The return value should be a tuple, ``(user, + password)``, which can be used for basic authentication. + + The implementation prompts for this information on the terminal; an application + should override this method to use an appropriate interaction model in the local + environment. - Return information needed to authenticate the user at the given host in the - specified security realm. The return value should be a tuple, ``(user, - password)``, which can be used for basic authentication. - - The implementation prompts for this information on the terminal; an application - should override this method to use an appropriate interaction model in the local - environment. .. class:: OpenerDirector() @@ -1249,8 +1250,8 @@ -:mod:`urllib.response` --- Response classes used by urllib. -=========================================================== +:mod:`urllib.response` --- Response classes used by urllib +========================================================== .. module:: urllib.response :synopsis: Response classes used by urllib. Modified: python/branches/release31-maint/Doc/library/zipfile.rst ============================================================================== --- python/branches/release31-maint/Doc/library/zipfile.rst (original) +++ python/branches/release31-maint/Doc/library/zipfile.rst Fri Feb 25 10:48:21 2011 @@ -163,8 +163,8 @@ .. note:: The file-like object is read-only and provides the following methods: - :meth:`read`, :meth:`readline`, :meth:`readlines`, :meth:`__iter__`, - :meth:`__next__`. + :meth:`!read`, :meth:`!readline`, :meth:`!readlines`, :meth:`!__iter__`, + :meth:`!__next__`. .. note:: Modified: python/branches/release31-maint/Doc/reference/expressions.rst ============================================================================== --- python/branches/release31-maint/Doc/reference/expressions.rst (original) +++ python/branches/release31-maint/Doc/reference/expressions.rst Fri Feb 25 10:48:21 2011 @@ -1318,8 +1318,8 @@ true numerically due to roundoff. For example, and assuming a platform on which a Python float is an IEEE 754 double-precision number, in order that ``-1e-100 % 1e100`` have the same sign as ``1e100``, the computed result is ``-1e-100 + - 1e100``, which is numerically exactly equal to ``1e100``. Function :func:`fmod` - in the :mod:`math` module returns a result whose sign matches the sign of the + 1e100``, which is numerically exactly equal to ``1e100``. The function + :func:`math.fmod` returns a result whose sign matches the sign of the first argument instead, and so returns ``-1e-100`` in this case. Which approach is more appropriate depends on the application. @@ -1340,7 +1340,8 @@ the :keyword:`is` operator, like those involving comparisons between instance methods, or constants. Check their documentation for more info. -.. [#] The ``%`` is also used for string formatting; the same precedence applies. +.. [#] The ``%`` operator is also used for string formatting; the same + precedence applies. .. 
[#] The power operator ``**`` binds less tightly than an arithmetic or bitwise unary operator on its right, that is, ``2**-1`` is ``0.5``. Modified: python/branches/release31-maint/Lib/_weakrefset.py ============================================================================== --- python/branches/release31-maint/Lib/_weakrefset.py (original) +++ python/branches/release31-maint/Lib/_weakrefset.py Fri Feb 25 10:48:21 2011 @@ -66,7 +66,11 @@ return sum(x() is not None for x in self.data) def __contains__(self, item): - return ref(item) in self.data + try: + wr = ref(item) + except TypeError: + return False + return wr in self.data def __reduce__(self): return (self.__class__, (list(self),), Modified: python/branches/release31-maint/Lib/pydoc.py ============================================================================== --- python/branches/release31-maint/Lib/pydoc.py (original) +++ python/branches/release31-maint/Lib/pydoc.py Fri Feb 25 10:48:21 2011 @@ -1110,7 +1110,7 @@ result = result + self.section('FILE', file) return result - def docclass(self, object, name=None, mod=None): + def docclass(self, object, name=None, mod=None, *ignored): """Produce text documentation for a given class object.""" realname = object.__name__ name = name or realname Modified: python/branches/release31-maint/Lib/test/test_weakset.py ============================================================================== --- python/branches/release31-maint/Lib/test/test_weakset.py (original) +++ python/branches/release31-maint/Lib/test/test_weakset.py Fri Feb 25 10:48:21 2011 @@ -50,7 +50,8 @@ def test_contains(self): for c in self.letters: self.assertEqual(c in self.s, c in self.d) - self.assertRaises(TypeError, self.s.__contains__, [[]]) + # 1 is not weakref'able, but that TypeError is caught by __contains__ + self.assertNotIn(1, self.s) self.assertTrue(self.obj in self.fs) del self.obj self.assertTrue(ustr('F') not in self.fs) Modified: python/branches/release31-maint/Lib/urllib/parse.py ============================================================================== --- python/branches/release31-maint/Lib/urllib/parse.py (original) +++ python/branches/release31-maint/Lib/urllib/parse.py Fri Feb 25 10:48:21 2011 @@ -8,7 +8,7 @@ RFC 2396: "Uniform Resource Identifiers (URI)": Generic Syntax by T. Berners-Lee, R. Fielding, and L. Masinter, August 1998. -RFC 2368: "The mailto URL scheme", by P.Hoffman , L Masinter, J. Zwinski, July 1998. +RFC 2368: "The mailto URL scheme", by P.Hoffman , L Masinter, J. Zawinski, July 1998. RFC 1808: "Relative Uniform Resource Locators", by R. Fielding, UC Irvine, June 1995. Modified: python/branches/release31-maint/Misc/NEWS ============================================================================== --- python/branches/release31-maint/Misc/NEWS (original) +++ python/branches/release31-maint/Misc/NEWS Fri Feb 25 10:48:21 2011 @@ -42,6 +42,11 @@ without folding whitespace. It now uses the continuation_ws, as it does for continuation lines that it creates itself. +- Issue #10360: In WeakSet, do not raise TypeErrors when testing for + membership of non-weakrefable objects. + +- Issue #10549: Fix pydoc traceback when text-documenting certain classes. + - Issue #11110: Fix _sqlite to not deref a NULL when module creation fails. 
- Issue #11089: Fix performance issue limiting the use of ConfigParser() Modified: python/branches/release31-maint/Objects/bytearrayobject.c ============================================================================== --- python/branches/release31-maint/Objects/bytearrayobject.c (original) +++ python/branches/release31-maint/Objects/bytearrayobject.c Fri Feb 25 10:48:21 2011 @@ -595,7 +595,7 @@ needed = 0; } else if (values == (PyObject *)self || !PyByteArray_Check(values)) { - /* Make a copy an call this function recursively */ + /* Make a copy and call this function recursively */ int err; values = PyByteArray_FromObject(values); if (values == NULL) From python-checkins at python.org Fri Feb 25 11:03:35 2011 From: python-checkins at python.org (georg.brandl) Date: Fri, 25 Feb 2011 11:03:35 +0100 (CET) Subject: [Python-checkins] r88557 - in python/branches/release31-maint: Doc/ACKS.txt Doc/c-api/slice.rst Doc/c-api/typeobj.rst Doc/extending/windows.rst Doc/glossary.rst Doc/library/collections.rst Doc/library/dbm.rst Doc/library/exceptions.rst Doc/library/socket.rst Doc/library/stdtypes.rst Doc/library/xml.dom.minidom.rst Doc/tutorial/interpreter.rst Lib/test/crashers/README Lib/test/test_array.py Lib/tkinter/scrolledtext.py Misc/SpecialBuilds.txt Message-ID: <20110225100335.2109EEE98F@mail.python.org> Author: georg.brandl Date: Fri Feb 25 11:03:34 2011 New Revision: 88557 Log: Merged revisions 87101,87146,87156,87172,87175,87371,87378,87522-87524,87526-87528,87530-87536,87581 via svnmerge from svn+ssh://svn.python.org/python/branches/py3k ........ r87101 | georg.brandl | 2010-12-06 23:02:48 +0100 (Mo, 06 Dez 2010) | 1 line Remove visible XXX comments. ........ r87146 | georg.brandl | 2010-12-09 19:08:43 +0100 (Do, 09 Dez 2010) | 1 line Fix "seperate". ........ r87156 | georg.brandl | 2010-12-10 11:01:44 +0100 (Fr, 10 Dez 2010) | 1 line #10668: fix wrong call of __init__. ........ r87172 | georg.brandl | 2010-12-11 20:10:30 +0100 (Sa, 11 Dez 2010) | 1 line Avoid AttributeError(_closed) when a TemporaryDirectory is deallocated whose mkdtemp call failed. ........ r87175 | georg.brandl | 2010-12-11 23:19:34 +0100 (Sa, 11 Dez 2010) | 1 line Fix markup. ........ r87371 | georg.brandl | 2010-12-18 17:21:58 +0100 (Sa, 18 Dez 2010) | 1 line Fix typo. ........ r87378 | georg.brandl | 2010-12-18 18:51:28 +0100 (Sa, 18 Dez 2010) | 1 line #10723: add missing builtin exceptions. ........ r87522 | georg.brandl | 2010-12-28 10:16:12 +0100 (Di, 28 Dez 2010) | 1 line Replace sys.maxint mention by sys.maxsize. ........ r87523 | georg.brandl | 2010-12-28 10:18:24 +0100 (Di, 28 Dez 2010) | 1 line Remove confusing paragraph -- this is relevant only to advanced users anyway and does not belong into the tutorial. ........ r87524 | georg.brandl | 2010-12-28 10:29:19 +0100 (Di, 28 Dez 2010) | 1 line Fix advice: call PyType_Ready to fill in ob_type of custom types. ........ r87526 | georg.brandl | 2010-12-28 11:38:33 +0100 (Di, 28 Dez 2010) | 1 line #10777: fix iteration over dict keys while mutating the dict. ........ r87527 | georg.brandl | 2010-12-28 11:56:20 +0100 (Di, 28 Dez 2010) | 1 line #10768: fix ScrolledText widget construction, and make the example work from the interactive shell. ........ r87528 | georg.brandl | 2010-12-28 12:02:12 +0100 (Di, 28 Dez 2010) | 1 line Add news entry and clarify another. ........ r87530 | georg.brandl | 2010-12-28 12:06:07 +0100 (Di, 28 Dez 2010) | 1 line #10767: update README in crashers; not all may have a bug entry and/or be fixed. ........ 
r87531 | georg.brandl | 2010-12-28 12:08:17 +0100 (Di, 28 Dez 2010) | 1 line #10742: document readonly attribute of memoryviews. ........ r87532 | georg.brandl | 2010-12-28 12:15:49 +0100 (Di, 28 Dez 2010) | 1 line #10781: clarify that *encoding* is not a parameter for Node objects in general. ........ r87533 | georg.brandl | 2010-12-28 12:38:12 +0100 (Di, 28 Dez 2010) | 1 line Remove history; adapt a bit more to reST, since this will once be part of the dev guide. ........ r87534 | georg.brandl | 2010-12-28 12:48:53 +0100 (Di, 28 Dez 2010) | 1 line Rewrap. ........ r87535 | georg.brandl | 2010-12-28 12:49:41 +0100 (Di, 28 Dez 2010) | 1 line #10739: document that on Windows, socket.makefile() does not make a file that has a true file descriptor usable where such a thing is expected. ........ r87536 | georg.brandl | 2010-12-28 12:53:25 +0100 (Di, 28 Dez 2010) | 1 line #10609: fix non-working dbm example. ........ r87581 | georg.brandl | 2010-12-30 18:36:17 +0100 (Do, 30 Dez 2010) | 1 line Fix NameErrors. ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/ACKS.txt python/branches/release31-maint/Doc/c-api/slice.rst python/branches/release31-maint/Doc/c-api/typeobj.rst python/branches/release31-maint/Doc/extending/windows.rst python/branches/release31-maint/Doc/glossary.rst python/branches/release31-maint/Doc/library/collections.rst python/branches/release31-maint/Doc/library/dbm.rst python/branches/release31-maint/Doc/library/exceptions.rst python/branches/release31-maint/Doc/library/socket.rst python/branches/release31-maint/Doc/library/stdtypes.rst python/branches/release31-maint/Doc/library/xml.dom.minidom.rst python/branches/release31-maint/Doc/tutorial/interpreter.rst python/branches/release31-maint/Lib/test/crashers/README python/branches/release31-maint/Lib/test/test_array.py python/branches/release31-maint/Lib/tkinter/scrolledtext.py python/branches/release31-maint/Misc/SpecialBuilds.txt Modified: python/branches/release31-maint/Doc/ACKS.txt ============================================================================== --- python/branches/release31-maint/Doc/ACKS.txt (original) +++ python/branches/release31-maint/Doc/ACKS.txt Fri Feb 25 11:03:34 2011 @@ -112,6 +112,7 @@ * Andrew M. Kuchling * Dave Kuhlman * Erno Kuusela + * Ross Lagerwall * Thomas Lamb * Detlef Lannert * Piers Lauder Modified: python/branches/release31-maint/Doc/c-api/slice.rst ============================================================================== --- python/branches/release31-maint/Doc/c-api/slice.rst (original) +++ python/branches/release31-maint/Doc/c-api/slice.rst Fri Feb 25 11:03:34 2011 @@ -48,4 +48,3 @@ normal slices. Returns 0 on success and -1 on error with exception set. - Modified: python/branches/release31-maint/Doc/c-api/typeobj.rst ============================================================================== --- python/branches/release31-maint/Doc/c-api/typeobj.rst (original) +++ python/branches/release31-maint/Doc/c-api/typeobj.rst Fri Feb 25 11:03:34 2011 @@ -705,7 +705,9 @@ This field is not inherited by subtypes (computed attributes are inherited through a different mechanism). - Docs for PyGetSetDef (XXX belong elsewhere):: + .. XXX belongs elsewhere + + Docs for PyGetSetDef:: typedef PyObject *(*getter)(PyObject *, void *); typedef int (*setter)(PyObject *, PyObject *, void *); @@ -752,7 +754,7 @@ PyObject * tp_descr_get(PyObject *self, PyObject *obj, PyObject *type); - XXX explain. + .. XXX explain. This field is inherited by subtypes. 
@@ -767,7 +769,7 @@ This field is inherited by subtypes. - XXX explain. + .. XXX explain. .. cmember:: long PyTypeObject.tp_dictoffset Modified: python/branches/release31-maint/Doc/extending/windows.rst ============================================================================== --- python/branches/release31-maint/Doc/extending/windows.rst (original) +++ python/branches/release31-maint/Doc/extending/windows.rst Fri Feb 25 11:03:34 2011 @@ -110,7 +110,7 @@ Now your options are: #. Copy :file:`example.sln` and :file:`example.vcproj`, rename them to - :file:`spam.\*`, and edit them by hand, or + :file:`spam.\*`, and edit them by hand, or #. Create a brand new project; instructions are below. @@ -179,8 +179,8 @@ and add the following to the module initialization function:: - MyObject_Type.ob_type = &PyType_Type; - + if (PyType_Ready(&MyObject_Type) < 0) + return NULL; .. _dynamic-linking: Modified: python/branches/release31-maint/Doc/glossary.rst ============================================================================== --- python/branches/release31-maint/Doc/glossary.rst (original) +++ python/branches/release31-maint/Doc/glossary.rst Fri Feb 25 11:03:34 2011 @@ -179,22 +179,22 @@ not expressions. extension module - A module written in C or C++, using Python's C API to interact with the core and - with user code. + A module written in C or C++, using Python's C API to interact with the + core and with user code. file object An object exposing a file-oriented API (with methods such as - :meth:`read()` or :meth:`write()`) to an underlying resource. - Depending on the way it was created, a file object can mediate access - to a real on-disk file or to another other type of storage or - communication device (for example standard input/output, in-memory - buffers, sockets, pipes, etc.). File objects are also called - :dfn:`file-like objects` or :dfn:`streams`. + :meth:`read()` or :meth:`write()`) to an underlying resource. Depending + on the way it was created, a file object can mediate access to a real + on-disk file or to another other type of storage or communication device + (for example standard input/output, in-memory buffers, sockets, pipes, + etc.). File objects are also called :dfn:`file-like objects` or + :dfn:`streams`. - There are actually three categories of file objects: raw binary - files, buffered binary files and text files. Their interfaces are - defined in the :mod:`io` module. The canonical way to create a - file object is by using the :func:`open` function. + There are actually three categories of file objects: raw binary files, + buffered binary files and text files. Their interfaces are defined in the + :mod:`io` module. The canonical way to create a file object is by using + the :func:`open` function. file-like object A synonym for :term:`file object`. 
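To make the glossary's three file object categories concrete, a short sketch (assumes a writable working directory and an invented demo.txt file name; not part of the quoted patch):

    import io

    with open("demo.txt", "w", encoding="utf-8") as f:
        assert isinstance(f, io.TextIOBase)          # text file

    with open("demo.txt", "rb") as f:
        assert isinstance(f, io.BufferedIOBase)      # buffered binary file

    with open("demo.txt", "rb", buffering=0) as f:
        assert isinstance(f, io.RawIOBase)           # raw binary file
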
Modified: python/branches/release31-maint/Doc/library/collections.rst ============================================================================== --- python/branches/release31-maint/Doc/library/collections.rst (original) +++ python/branches/release31-maint/Doc/library/collections.rst Fri Feb 25 11:03:34 2011 @@ -810,7 +810,7 @@ original insertion position is changed and moved to the end:: class LastUpdatedOrderedDict(OrderedDict): - 'Store items is the order the keys were last added' + 'Store items in the order the keys were last added' def __setitem__(self, key, value): if key in self: del self[key] Modified: python/branches/release31-maint/Doc/library/dbm.rst ============================================================================== --- python/branches/release31-maint/Doc/library/dbm.rst (original) +++ python/branches/release31-maint/Doc/library/dbm.rst Fri Feb 25 11:03:34 2011 @@ -86,10 +86,8 @@ # Notice how the value is now in bytes. assert db['www.cnn.com'] == b'Cable News Network' - # Loop through contents. Other dictionary methods - # such as .keys(), .values() also work. - for k, v in db.iteritems(): - print(k, '\t', v) + # Often-used methods of the dict interface work too. + print(db.get('python.org', b'not present')) # Storing a non-string key or value will raise an exception (most # likely a TypeError). Modified: python/branches/release31-maint/Doc/library/exceptions.rst ============================================================================== --- python/branches/release31-maint/Doc/library/exceptions.rst (original) +++ python/branches/release31-maint/Doc/library/exceptions.rst Fri Feb 25 11:03:34 2011 @@ -77,6 +77,12 @@ :exc:`FloatingPointError`. +.. exception:: BufferError + + Raised when a :ref:`buffer ` related operation cannot be + performed. + + .. exception:: LookupError The base class for the exceptions that are raised when a key or index used on @@ -271,6 +277,18 @@ of the exception instance returns only the message. +.. exception:: IndentationError + + Base class for syntax errors related to incorrect indentation. This is a + subclass of :exc:`SyntaxError`. + + +.. exception:: TabError + + Raised when indentation contains an inconsistent use of tabs and spaces. + This is a subclass of :exc:`IndentationError`. + + .. exception:: SystemError Raised when the interpreter finds an internal error, but the situation does not Modified: python/branches/release31-maint/Doc/library/socket.rst ============================================================================== --- python/branches/release31-maint/Doc/library/socket.rst (original) +++ python/branches/release31-maint/Doc/library/socket.rst Fri Feb 25 11:03:34 2011 @@ -616,16 +616,21 @@ .. index:: single: I/O control; buffering - Return a :term:`file object` associated with the socket. The exact - returned type depends on the arguments given to :meth:`makefile`. These - arguments are interpreted the same way as by the built-in :func:`open` - function. + Return a :term:`file object` associated with the socket. The exact returned + type depends on the arguments given to :meth:`makefile`. These arguments are + interpreted the same way as by the built-in :func:`open` function. Closing the file object won't close the socket unless there are no remaining references to the socket. The socket must be in blocking mode; it can have a timeout, but the file object's internal buffer may end up in a inconsistent state if a timeout occurs. + .. 
note:: + + On Windows, the file-like object created by :meth:`makefile` cannot be + used where a file object with a file descriptor is expected, such as the + stream arguments of :meth:`subprocess.Popen`. + .. method:: socket.recv(bufsize[, flags]) Modified: python/branches/release31-maint/Doc/library/stdtypes.rst ============================================================================== --- python/branches/release31-maint/Doc/library/stdtypes.rst (original) +++ python/branches/release31-maint/Doc/library/stdtypes.rst Fri Feb 25 11:03:34 2011 @@ -2203,6 +2203,10 @@ A tuple of integers the length of :attr:`ndim` giving the size in bytes to access each element for each dimension of the array. + .. attribute:: readonly + + A bool indicating whether the memory is read only. + .. memoryview.suboffsets isn't documented because it only seems useful for C Modified: python/branches/release31-maint/Doc/library/xml.dom.minidom.rst ============================================================================== --- python/branches/release31-maint/Doc/library/xml.dom.minidom.rst (original) +++ python/branches/release31-maint/Doc/library/xml.dom.minidom.rst Fri Feb 25 11:03:34 2011 @@ -125,7 +125,7 @@ to discard children of that node. -.. method:: Node.writexml(writer, indent="", addindent="", newl="", encoding="") +.. method:: Node.writexml(writer, indent="", addindent="", newl="") Write XML to the writer object. The writer should have a :meth:`write` method which matches that of the file object interface. The *indent* parameter is the @@ -133,8 +133,8 @@ indentation to use for subnodes of the current one. The *newl* parameter specifies the string to use to terminate newlines. - For the :class:`Document` node, an additional keyword argument *encoding* can be - used to specify the encoding field of the XML header. + For the :class:`Document` node, an additional keyword argument *encoding* can + be used to specify the encoding field of the XML header. .. method:: Node.toxml(encoding=None) Modified: python/branches/release31-maint/Doc/tutorial/interpreter.rst ============================================================================== --- python/branches/release31-maint/Doc/tutorial/interpreter.rst (original) +++ python/branches/release31-maint/Doc/tutorial/interpreter.rst Fri Feb 25 11:03:34 2011 @@ -58,14 +58,6 @@ ``python -m module [arg] ...``, which executes the source file for *module* as if you had spelled out its full name on the command line. -Note that there is a difference between ``python file`` and ``python ->> 23 23 @@ -19,75 +22,72 @@ >>> Note that if this count increases when you're not storing away new objects, -there's probably a leak. Remember, though, that in interactive mode the -special name "_" holds a reference to the last result displayed! +there's probably a leak. Remember, though, that in interactive mode the special +name "_" holds a reference to the last result displayed! -Py_REF_DEBUG also checks after every decref to verify that the refcount -hasn't gone negative, and causes an immediate fatal error if it has. +Py_REF_DEBUG also checks after every decref to verify that the refcount hasn't +gone negative, and causes an immediate fatal error if it has. Special gimmicks: sys.gettotalrefcount() Return current total of all refcounts. - Available under Py_REF_DEBUG in Python 2.3. - Before 2.3, Py_TRACE_REFS was required to enable this function. 
---------------------------------------------------------------------------- -Py_TRACE_REFS introduced in 1.4 - named TRACE_REFS before 1.4 - -Turn on heavy reference debugging. This is major surgery. Every PyObject -grows two more pointers, to maintain a doubly-linked list of all live -heap-allocated objects. Most built-in type objects are not in this list, -as they're statically allocated. Starting in Python 2.3, if COUNT_ALLOCS -(see below) is also defined, a static type object T does appear in this -list if at least one object of type T has been created. + + +Py_TRACE_REFS +------------- + +Turn on heavy reference debugging. This is major surgery. Every PyObject grows +two more pointers, to maintain a doubly-linked list of all live heap-allocated +objects. Most built-in type objects are not in this list, as they're statically +allocated. Starting in Python 2.3, if COUNT_ALLOCS (see below) is also defined, +a static type object T does appear in this list if at least one object of type T +has been created. Note that because the fundamental PyObject layout changes, Python modules -compiled with Py_TRACE_REFS are incompatible with modules compiled without -it. +compiled with Py_TRACE_REFS are incompatible with modules compiled without it. Py_TRACE_REFS implies Py_REF_DEBUG. Special gimmicks: sys.getobjects(max[, type]) - Return list of the (no more than) max most-recently allocated objects, - most recently allocated first in the list, least-recently allocated - last in the list. max=0 means no limit on list length. - If an optional type object is passed, the list is also restricted to - objects of that type. - The return list itself, and some temp objects created just to call - sys.getobjects(), are excluded from the return list. Note that the - list returned is just another object, though, so may appear in the - return list the next time you call getobjects(); note that every - object in the list is kept alive too, simply by virtue of being in - the list. - -envar PYTHONDUMPREFS - If this envar exists, Py_Finalize() arranges to print a list of - all still-live heap objects. This is printed twice, in different - formats, before and after Py_Finalize has cleaned up everything it - can clean up. The first output block produces the repr() of each - object so is more informative; however, a lot of stuff destined to - die is still alive then. The second output block is much harder - to work with (repr() can't be invoked anymore -- the interpreter - has been torn down too far), but doesn't list any objects that will - die. The tool script combinerefs.py can be run over this to combine - the info from both output blocks. The second output block, and + Return list of the (no more than) max most-recently allocated objects, most + recently allocated first in the list, least-recently allocated last in the + list. max=0 means no limit on list length. If an optional type object is + passed, the list is also restricted to objects of that type. The return + list itself, and some temp objects created just to call sys.getobjects(), + are excluded from the return list. Note that the list returned is just + another object, though, so may appear in the return list the next time you + call getobjects(); note that every object in the list is kept alive too, + simply by virtue of being in the list. + +envvar PYTHONDUMPREFS + If this envvar exists, Py_Finalize() arranges to print a list of all + still-live heap objects. 
This is printed twice, in different formats, + before and after Py_Finalize has cleaned up everything it can clean up. The + first output block produces the repr() of each object so is more + informative; however, a lot of stuff destined to die is still alive then. + The second output block is much harder to work with (repr() can't be invoked + anymore -- the interpreter has been torn down too far), but doesn't list any + objects that will die. The tool script combinerefs.py can be run over this + to combine the info from both output blocks. The second output block, and combinerefs.py, were new in Python 2.3b1. ---------------------------------------------------------------------------- -PYMALLOC_DEBUG introduced in 2.3 + + +PYMALLOC_DEBUG +-------------- When pymalloc is enabled (WITH_PYMALLOC is defined), calls to the PyObject_ -memory routines are handled by Python's own small-object allocator, while -calls to the PyMem_ memory routines are directed to the system malloc/ -realloc/free. If PYMALLOC_DEBUG is also defined, calls to both PyObject_ -and PyMem_ memory routines are directed to a special debugging mode of -Python's small-object allocator. - -This mode fills dynamically allocated memory blocks with special, -recognizable bit patterns, and adds debugging info on each end of -dynamically allocated memory blocks. The special bit patterns are: +memory routines are handled by Python's own small-object allocator, while calls +to the PyMem_ memory routines are directed to the system malloc/ realloc/free. +If PYMALLOC_DEBUG is also defined, calls to both PyObject_ and PyMem_ memory +routines are directed to a special debugging mode of Python's small-object +allocator. + +This mode fills dynamically allocated memory blocks with special, recognizable +bit patterns, and adds debugging info on each end of dynamically allocated +memory blocks. The special bit patterns are: #define CLEANBYTE 0xCB /* clean (newly allocated) memory */ #define DEADBYTE 0xDB /* dead (newly freed) memory */ @@ -96,73 +96,70 @@ Strings of these bytes are unlikely to be valid addresses, floats, or 7-bit ASCII strings. -Let S = sizeof(size_t). 2*S bytes are added at each end of each block of N -bytes requested. The memory layout is like so, where p represents the -address returned by a malloc-like or realloc-like function (p[i:j] means -the slice of bytes from *(p+i) inclusive up to *(p+j) exclusive; note that -the treatment of negative indices differs from a Python slice): +Let S = sizeof(size_t). 2*S bytes are added at each end of each block of N bytes +requested. The memory layout is like so, where p represents the address +returned by a malloc-like or realloc-like function (p[i:j] means the slice of +bytes from *(p+i) inclusive up to *(p+j) exclusive; note that the treatment of +negative indices differs from a Python slice): p[-2*S:-S] - Number of bytes originally asked for. This is a size_t, big-endian - (easier to read in a memory dump). + Number of bytes originally asked for. This is a size_t, big-endian (easier + to read in a memory dump). p[-S:0] Copies of FORBIDDENBYTE. Used to catch under- writes and reads. p[0:N] The requested memory, filled with copies of CLEANBYTE, used to catch - reference to uninitialized memory. - When a realloc-like function is called requesting a larger memory - block, the new excess bytes are also filled with CLEANBYTE. - When a free-like function is called, these are overwritten with - DEADBYTE, to catch reference to freed memory. 
When a realloc- - like function is called requesting a smaller memory block, the excess - old bytes are also filled with DEADBYTE. + reference to uninitialized memory. When a realloc-like function is called + requesting a larger memory block, the new excess bytes are also filled with + CLEANBYTE. When a free-like function is called, these are overwritten with + DEADBYTE, to catch reference to freed memory. When a realloc- like function + is called requesting a smaller memory block, the excess old bytes are also + filled with DEADBYTE. p[N:N+S] Copies of FORBIDDENBYTE. Used to catch over- writes and reads. p[N+S:N+2*S] A serial number, incremented by 1 on each call to a malloc-like or - realloc-like function. - Big-endian size_t. - If "bad memory" is detected later, the serial number gives an - excellent way to set a breakpoint on the next run, to capture the - instant at which this block was passed out. The static function - bumpserialno() in obmalloc.c is the only place the serial number - is incremented, and exists so you can set such a breakpoint easily. - -A realloc-like or free-like function first checks that the FORBIDDENBYTEs -at each end are intact. If they've been altered, diagnostic output is -written to stderr, and the program is aborted via Py_FatalError(). The -other main failure mode is provoking a memory error when a program -reads up one of the special bit patterns and tries to use it as an address. -If you get in a debugger then and look at the object, you're likely -to see that it's entirely filled with 0xDB (meaning freed memory is -getting used) or 0xCB (meaning uninitialized memory is getting used). + realloc-like function. Big-endian size_t. If "bad memory" is detected + later, the serial number gives an excellent way to set a breakpoint on the + next run, to capture the instant at which this block was passed out. The + static function bumpserialno() in obmalloc.c is the only place the serial + number is incremented, and exists so you can set such a breakpoint easily. + +A realloc-like or free-like function first checks that the FORBIDDENBYTEs at +each end are intact. If they've been altered, diagnostic output is written to +stderr, and the program is aborted via Py_FatalError(). The other main failure +mode is provoking a memory error when a program reads up one of the special bit +patterns and tries to use it as an address. If you get in a debugger then and +look at the object, you're likely to see that it's entirely filled with 0xDB +(meaning freed memory is getting used) or 0xCB (meaning uninitialized memory is +getting used). Note that PYMALLOC_DEBUG requires WITH_PYMALLOC. Special gimmicks: -envar PYTHONMALLOCSTATS - If this envar exists, a report of pymalloc summary statistics is - printed to stderr whenever a new arena is allocated, and also - by Py_Finalize(). +envvar PYTHONMALLOCSTATS + If this envvar exists, a report of pymalloc summary statistics is printed to + stderr whenever a new arena is allocated, and also by Py_Finalize(). Changed in 2.5: The number of extra bytes allocated is 4*sizeof(size_t). Before it was 16 on all boxes, reflecting that Python couldn't make use of allocations >= 2**32 bytes even on 64-bit boxes before 2.5. ---------------------------------------------------------------------------- -Py_DEBUG introduced in 1.5 - named DEBUG before 1.5 + + +Py_DEBUG +-------- This is what is generally meant by "a debug build" of Python. -Py_DEBUG implies LLTRACE, Py_REF_DEBUG, Py_TRACE_REFS, and -PYMALLOC_DEBUG (if WITH_PYMALLOC is enabled). 
In addition, C -assert()s are enabled (via the C way: by not defining NDEBUG), and -some routines do additional sanity checks inside "#ifdef Py_DEBUG" -blocks. ---------------------------------------------------------------------------- -COUNT_ALLOCS introduced in 0.9.9 - partly broken in 2.2 and 2.2.1 +Py_DEBUG implies LLTRACE, Py_REF_DEBUG, Py_TRACE_REFS, and PYMALLOC_DEBUG (if +WITH_PYMALLOC is enabled). In addition, C assert()s are enabled (via the C way: +by not defining NDEBUG), and some routines do additional sanity checks inside +"#ifdef Py_DEBUG" blocks. + + +COUNT_ALLOCS +------------ Each type object grows three new members: @@ -178,84 +175,85 @@ */ int tp_maxalloc; -Allocation and deallocation code keeps these counts up to date. -Py_Finalize() displays a summary of the info returned by sys.getcounts() -(see below), along with assorted other special allocation counts (like -the number of tuple allocations satisfied by a tuple free-list, the number -of 1-character strings allocated, etc). +Allocation and deallocation code keeps these counts up to date. Py_Finalize() +displays a summary of the info returned by sys.getcounts() (see below), along +with assorted other special allocation counts (like the number of tuple +allocations satisfied by a tuple free-list, the number of 1-character strings +allocated, etc). Before Python 2.2, type objects were immortal, and the COUNT_ALLOCS -implementation relies on that. As of Python 2.2, heap-allocated type/ -class objects can go away. COUNT_ALLOCS can blow up in 2.2 and 2.2.1 -because of this; this was fixed in 2.2.2. Use of COUNT_ALLOCS makes -all heap-allocated type objects immortal, except for those for which no -object of that type is ever allocated. +implementation relies on that. As of Python 2.2, heap-allocated type/ class +objects can go away. COUNT_ALLOCS can blow up in 2.2 and 2.2.1 because of this; +this was fixed in 2.2.2. Use of COUNT_ALLOCS makes all heap-allocated type +objects immortal, except for those for which no object of that type is ever +allocated. Starting with Python 2.3, If Py_TRACE_REFS is also defined, COUNT_ALLOCS -arranges to ensure that the type object for each allocated object -appears in the doubly-linked list of all objects maintained by -Py_TRACE_REFS. +arranges to ensure that the type object for each allocated object appears in the +doubly-linked list of all objects maintained by Py_TRACE_REFS. Special gimmicks: sys.getcounts() - Return a list of 4-tuples, one entry for each type object for which - at least one object of that type was allocated. Each tuple is of - the form: + Return a list of 4-tuples, one entry for each type object for which at least + one object of that type was allocated. Each tuple is of the form: (tp_name, tp_allocs, tp_frees, tp_maxalloc) - Each distinct type object gets a distinct entry in this list, even - if two or more type objects have the same tp_name (in which case - there's no way to distinguish them by looking at this list). The - list is ordered by time of first object allocation: the type object - for which the first allocation of an object of that type occurred - most recently is at the front of the list. ---------------------------------------------------------------------------- -LLTRACE introduced well before 1.0 + Each distinct type object gets a distinct entry in this list, even if two or + more type objects have the same tp_name (in which case there's no way to + distinguish them by looking at this list). 
The list is ordered by time of + first object allocation: the type object for which the first allocation of + an object of that type occurred most recently is at the front of the list. + + +LLTRACE +------- Compile in support for Low Level TRACE-ing of the main interpreter loop. -When this preprocessor symbol is defined, before PyEval_EvalFrame -(eval_frame in 2.3 and 2.2, eval_code2 before that) executes a frame's code -it checks the frame's global namespace for a variable "__lltrace__". If -such a variable is found, mounds of information about what the interpreter -is doing are sprayed to stdout, such as every opcode and opcode argument -and values pushed onto and popped off the value stack. +When this preprocessor symbol is defined, before PyEval_EvalFrame (eval_frame in +2.3 and 2.2, eval_code2 before that) executes a frame's code it checks the +frame's global namespace for a variable "__lltrace__". If such a variable is +found, mounds of information about what the interpreter is doing are sprayed to +stdout, such as every opcode and opcode argument and values pushed onto and +popped off the value stack. Not useful very often, but very useful when needed. ---------------------------------------------------------------------------- -CALL_PROFILE introduced for Python 2.3 + +CALL_PROFILE +------------ Count the number of function calls executed. -When this symbol is defined, the ceval mainloop and helper functions -count the number of function calls made. It keeps detailed statistics -about what kind of object was called and whether the call hit any of -the special fast paths in the code. +When this symbol is defined, the ceval mainloop and helper functions count the +number of function calls made. It keeps detailed statistics about what kind of +object was called and whether the call hit any of the special fast paths in the +code. + ---------------------------------------------------------------------------- -WITH_TSC introduced for Python 2.4 +WITH_TSC +-------- -Super-lowlevel profiling of the interpreter. When enabled, the sys -module grows a new function: +Super-lowlevel profiling of the interpreter. When enabled, the sys module grows +a new function: settscdump(bool) - If true, tell the Python interpreter to dump VM measurements to - stderr. If false, turn off dump. The measurements are based on the - processor's time-stamp counter. - -This build option requires a small amount of platform specific code. -Currently this code is present for linux/x86 and any PowerPC platform -that uses GCC (i.e. OS X and linux/ppc). - -On the PowerPC the rate at which the time base register is incremented -is not defined by the architecture specification, so you'll need to -find the manual for your specific processor. For the 750CX, 750CXe -and 750FX (all sold as the G3) we find: + If true, tell the Python interpreter to dump VM measurements to stderr. If + false, turn off dump. The measurements are based on the processor's + time-stamp counter. + +This build option requires a small amount of platform specific code. Currently +this code is present for linux/x86 and any PowerPC platform that uses GCC +(i.e. OS X and linux/ppc). + +On the PowerPC the rate at which the time base register is incremented is not +defined by the architecture specification, so you'll need to find the manual for +your specific processor. For the 750CX, 750CXe and 750FX (all sold as the G3) +we find: - The time base counter is clocked at a frequency that is - one-fourth that of the bus clock. 
+ The time base counter is clocked at a frequency that is one-fourth that of + the bus clock. This build is enabled by the --with-tsc flag to configure. From python-checkins at python.org Fri Feb 25 11:14:17 2011 From: python-checkins at python.org (eli.bendersky) Date: Fri, 25 Feb 2011 11:14:17 +0100 (CET) Subject: [Python-checkins] r88558 - in python/branches/py3k/Lib/test: support.py test_builtin.py test_float.py Message-ID: <20110225101417.3C0A8EC21@mail.python.org> Author: eli.bendersky Date: Fri Feb 25 11:14:17 2011 New Revision: 88558 Log: Removed fcmp and FUZZ from test.support, following the discussion on python-dev: http://mail.python.org/pipermail/python-dev/2011-January/107735.html Modified: python/branches/py3k/Lib/test/support.py python/branches/py3k/Lib/test/test_builtin.py python/branches/py3k/Lib/test/test_float.py Modified: python/branches/py3k/Lib/test/support.py ============================================================================== --- python/branches/py3k/Lib/test/support.py (original) +++ python/branches/py3k/Lib/test/support.py Fri Feb 25 11:14:17 2011 @@ -33,7 +33,7 @@ "verbose", "use_resources", "max_memuse", "record_original_stdout", "get_original_stdout", "unload", "unlink", "rmtree", "forget", "is_resource_enabled", "requires", "find_unused_port", "bind_port", - "fcmp", "is_jython", "TESTFN", "HOST", "FUZZ", "SAVEDCWD", "temp_cwd", + "is_jython", "TESTFN", "HOST", "SAVEDCWD", "temp_cwd", "findfile", "sortdict", "check_syntax_error", "open_urlresource", "check_warnings", "CleanImport", "EnvironmentVarGuard", "TransientResource", "captured_output", "captured_stdout", @@ -349,24 +349,6 @@ port = sock.getsockname()[1] return port -FUZZ = 1e-6 - -def fcmp(x, y): # fuzzy comparison function - if isinstance(x, float) or isinstance(y, float): - try: - fuzz = (abs(x) + abs(y)) * FUZZ - if abs(x-y) <= fuzz: - return 0 - except: - pass - elif type(x) == type(y) and isinstance(x, (tuple, list)): - for i in range(min(len(x), len(y))): - outcome = fcmp(x[i], y[i]) - if outcome != 0: - return outcome - return (len(x) > len(y)) - (len(x) < len(y)) - return (x > y) - (x < y) - # decorator for skipping tests on non-IEEE 754 platforms requires_IEEE_754 = unittest.skipUnless( float.__getformat__("double").startswith("IEEE"), Modified: python/branches/py3k/Lib/test/test_builtin.py ============================================================================== --- python/branches/py3k/Lib/test/test_builtin.py (original) +++ python/branches/py3k/Lib/test/test_builtin.py Fri Feb 25 11:14:17 2011 @@ -10,7 +10,7 @@ import types import builtins import random -from test.support import fcmp, TESTFN, unlink, run_unittest, check_warnings +from test.support import TESTFN, unlink, run_unittest, check_warnings from operator import neg @@ -394,10 +394,13 @@ self.assertEqual(divmod(-sys.maxsize-1, -1), (sys.maxsize+1, 0)) - self.assertTrue(not fcmp(divmod(3.25, 1.0), (3.0, 0.25))) - self.assertTrue(not fcmp(divmod(-3.25, 1.0), (-4.0, 0.75))) - self.assertTrue(not fcmp(divmod(3.25, -1.0), (-4.0, -0.75))) - self.assertTrue(not fcmp(divmod(-3.25, -1.0), (3.0, -0.25))) + for num, denom, exp_result in [ (3.25, 1.0, (3.0, 0.25)), + (-3.25, 1.0, (-4.0, 0.75)), + (3.25, -1.0, (-4.0, -0.75)), + (-3.25, -1.0, (3.0, -0.25))]: + result = divmod(num, denom) + self.assertAlmostEqual(result[0], exp_result[0]) + self.assertAlmostEqual(result[1], exp_result[1]) self.assertRaises(TypeError, divmod) Modified: python/branches/py3k/Lib/test/test_float.py 
============================================================================== --- python/branches/py3k/Lib/test/test_float.py (original) +++ python/branches/py3k/Lib/test/test_float.py Fri Feb 25 11:14:17 2011 @@ -88,7 +88,7 @@ self.assertRaises(ValueError, float, " -0x3.p-1 ") self.assertRaises(ValueError, float, " +0x3.p-1 ") self.assertEqual(float(" 25.e-1 "), 2.5) - self.assertEqual(support.fcmp(float(" .25e-1 "), .025), 0) + self.assertAlmostEqual(float(" .25e-1 "), .025) def test_floatconversion(self): # Make sure that calls to __float__() work properly From python-checkins at python.org Fri Feb 25 11:18:12 2011 From: python-checkins at python.org (georg.brandl) Date: Fri, 25 Feb 2011 11:18:12 +0100 (CET) Subject: [Python-checkins] r88559 - in python/branches/release31-maint: Doc/ACKS.txt Doc/library/collections.rst Doc/library/ctypes.rst Doc/library/optparse.rst Doc/library/ssl.rst Doc/library/string.rst Doc/tools/sphinxext/indexcontent.html Doc/tools/sphinxext/static/basic.css Lib/email/charset.py Lib/http/client.py Message-ID: <20110225101812.0CE56F69F@mail.python.org> Author: georg.brandl Date: Fri Feb 25 11:18:11 2011 New Revision: 88559 Log: Merged revisions 87627,87638,87739,87760,87771,87787,87984,87986,88108,88115,88144,88165,88329,88364-88365,88369-88370,88423-88424 via svnmerge from svn+ssh://svn.python.org/python/branches/py3k ........ r87627 | georg.brandl | 2011-01-02 15:23:43 +0100 (So, 02 Jan 2011) | 1 line #1665333: add more docs for optparse.OptionGroup. ........ r87638 | georg.brandl | 2011-01-02 20:07:51 +0100 (So, 02 Jan 2011) | 1 line Fix code indentation. ........ r87739 | georg.brandl | 2011-01-04 18:27:13 +0100 (Di, 04 Jan 2011) | 1 line Fix exception catching. ........ r87760 | georg.brandl | 2011-01-05 11:59:48 +0100 (Mi, 05 Jan 2011) | 1 line Fix duplicate end tag. ........ r87771 | georg.brandl | 2011-01-05 22:47:47 +0100 (Mi, 05 Jan 2011) | 1 line On Py3k, -tt and -3 are no-op and unsupported respectively. ........ r87787 | georg.brandl | 2011-01-06 10:15:45 +0100 (Do, 06 Jan 2011) | 1 line Remove doc for nonexisting parameter. ........ r87984 | georg.brandl | 2011-01-13 08:24:40 +0100 (Do, 13 Jan 2011) | 1 line Add semicolon for consistency. ........ r87986 | georg.brandl | 2011-01-13 08:31:18 +0100 (Do, 13 Jan 2011) | 1 line Fix the example output of count(). ........ r88108 | georg.brandl | 2011-01-19 09:42:03 +0100 (Mi, 19 Jan 2011) | 1 line Suppress trailing spaces in table paragraphs. ........ r88115 | georg.brandl | 2011-01-19 21:05:49 +0100 (Mi, 19 Jan 2011) | 1 line #10944: add c_bool to types table. ........ r88144 | georg.brandl | 2011-01-22 23:06:24 +0100 (Sa, 22 Jan 2011) | 1 line #10983: fix several bugs in the _tunnel implementation that seem to have missed while porting between branches. A unittest is needed! ........ r88165 | georg.brandl | 2011-01-24 20:53:18 +0100 (Mo, 24 Jan 2011) | 1 line Typo fix. ........ r88329 | georg.brandl | 2011-02-03 08:08:25 +0100 (Do, 03 Feb 2011) | 1 line Punctuation typos. ........ r88364 | georg.brandl | 2011-02-07 13:10:46 +0100 (Mo, 07 Feb 2011) | 1 line #11138: fix order of fill and align specifiers. ........ r88365 | georg.brandl | 2011-02-07 13:13:58 +0100 (Mo, 07 Feb 2011) | 1 line #8691: document that right alignment is default for numbers. ........ r88369 | georg.brandl | 2011-02-07 16:30:45 +0100 (Mo, 07 Feb 2011) | 1 line Consistent heading spacing, and fix two typos. ........ r88370 | georg.brandl | 2011-02-07 16:44:27 +0100 (Mo, 07 Feb 2011) | 1 line Spelling fixes. ........ 
r88423 | georg.brandl | 2011-02-15 13:41:17 +0100 (Di, 15 Feb 2011) | 1 line Apply logging SocketHandler doc update by Vinay. ........ r88424 | georg.brandl | 2011-02-15 13:44:43 +0100 (Di, 15 Feb 2011) | 1 line Remove editing slip. ........ Modified: python/branches/release31-maint/ (props changed) python/branches/release31-maint/Doc/ACKS.txt python/branches/release31-maint/Doc/library/collections.rst python/branches/release31-maint/Doc/library/ctypes.rst python/branches/release31-maint/Doc/library/optparse.rst python/branches/release31-maint/Doc/library/ssl.rst python/branches/release31-maint/Doc/library/string.rst python/branches/release31-maint/Doc/tools/sphinxext/indexcontent.html python/branches/release31-maint/Doc/tools/sphinxext/static/basic.css python/branches/release31-maint/Lib/email/charset.py python/branches/release31-maint/Lib/http/client.py Modified: python/branches/release31-maint/Doc/ACKS.txt ============================================================================== --- python/branches/release31-maint/Doc/ACKS.txt (original) +++ python/branches/release31-maint/Doc/ACKS.txt Fri Feb 25 11:18:11 2011 @@ -130,6 +130,7 @@ * Andrew MacIntyre * Vladimir Marangozov * Vincent Marchetti + * Westley Mart?nez * Laura Matson * Daniel May * Rebecca McCreary Modified: python/branches/release31-maint/Doc/library/collections.rst ============================================================================== --- python/branches/release31-maint/Doc/library/collections.rst (original) +++ python/branches/release31-maint/Doc/library/collections.rst Fri Feb 25 11:18:11 2011 @@ -918,7 +918,7 @@ :class:`Sized` ``__len__`` :class:`Callable` ``__call__`` -:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``. ``__iter__``, ``__reversed__``. +:class:`Sequence` :class:`Sized`, ``__getitem__`` ``__contains__``. ``__iter__``, ``__reversed__``, :class:`Iterable`, ``index``, and ``count`` :class:`Container` @@ -927,7 +927,7 @@ and ``insert`` ``remove``, and ``__iadd__`` :class:`Set` :class:`Sized`, ``__le__``, ``__lt__``, ``__eq__``, ``__ne__``, - :class:`Iterable`, ``__gt__``, ``__ge__``, ``__and__``, ``__or__`` + :class:`Iterable`, ``__gt__``, ``__ge__``, ``__and__``, ``__or__``, :class:`Container` ``__sub__``, ``__xor__``, and ``isdisjoint`` :class:`MutableSet` :class:`Set` ``add`` and Inherited Set methods and Modified: python/branches/release31-maint/Doc/library/ctypes.rst ============================================================================== --- python/branches/release31-maint/Doc/library/ctypes.rst (original) +++ python/branches/release31-maint/Doc/library/ctypes.rst Fri Feb 25 11:18:11 2011 @@ -216,6 +216,8 @@ +----------------------+----------------------------------------+----------------------------+ | ctypes type | C type | Python type | +======================+========================================+============================+ +| :class:`c_bool` | :c:type:`_Bool` | bool (1) | ++----------------------+----------------------------------------+----------------------------+ | :class:`c_char` | :ctype:`char` | 1-character bytes object | +----------------------+----------------------------------------+----------------------------+ | :class:`c_wchar` | :ctype:`wchar_t` | 1-character string | @@ -254,6 +256,9 @@ | :class:`c_void_p` | :ctype:`void *` | int or ``None`` | +----------------------+----------------------------------------+----------------------------+ +(1) + The constructor accepts any object with a truth value. 
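
An illustrative snippet (not part of the patch) showing what footnote (1) above
means in practice; ctypes.c_bool has been available since Python 2.6::

    from ctypes import c_bool

    # c_bool stores the truth value of whatever object it is constructed with
    print(c_bool(True).value)        # True
    print(c_bool(0).value)           # False
    print(c_bool([]).value)          # False (an empty list is false)
    print(c_bool("anything").value)  # True  (a non-empty string is true)
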
+ All these types can be created by calling them with an optional initializer of the correct type and value:: Modified: python/branches/release31-maint/Doc/library/optparse.rst ============================================================================== --- python/branches/release31-maint/Doc/library/optparse.rst (original) +++ python/branches/release31-maint/Doc/library/optparse.rst Fri Feb 25 11:18:11 2011 @@ -55,9 +55,9 @@ .. code-block:: text - usage: [options] + Usage: [options] - options: + Options: -h, --help show this help message and exit -f FILE, --file=FILE write report to FILE -q, --quiet don't print status messages to stdout @@ -486,9 +486,9 @@ .. code-block:: text - usage: [options] arg1 arg2 + Usage: [options] arg1 arg2 - options: + Options: -h, --help show this help message and exit -v, --verbose make lots of noise [default] -q, --quiet be vewwy quiet (I'm hunting wabbits) @@ -512,7 +512,7 @@ is then printed before the detailed option help. If you don't supply a usage string, :mod:`optparse` uses a bland but sensible - default: ``"usage: %prog [options]"``, which is fine if your script doesn't + default: ``"Usage: %prog [options]"``, which is fine if your script doesn't take any positional arguments. * every option defines a help string, and doesn't worry about line-wrapping--- @@ -544,12 +544,33 @@ default value. If an option has no default value (or the default value is ``None``), ``%default`` expands to ``none``. +Grouping Options +++++++++++++++++ + When dealing with many options, it is convenient to group these options for better help output. An :class:`OptionParser` can contain several option groups, each of which can contain several options. -Continuing with the parser defined above, adding an :class:`OptionGroup` to a -parser is easy:: +An option group is obtained using the class :class:`OptionGroup`: + +.. class:: OptionGroup(parser, title, description=None) + + where + + * parser is the :class:`OptionParser` instance the group will be insterted in + to + * title is the group title + * description, optional, is a long description of the group + +:class:`OptionGroup` inherits from :class:`OptionContainer` (like +:class:`OptionParser`) and so the :meth:`add_option` method can be used to add +an option to the group. + +Once all the options are declared, using the :class:`OptionParser` method +:meth:`add_option_group` the group is added to the previously defined parser. + +Continuing with the parser defined in the previous section, adding an +:class:`OptionGroup` to a parser is easy:: group = OptionGroup(parser, "Dangerous Options", "Caution: use these options at your own risk. " @@ -561,20 +582,73 @@ .. code-block:: text - usage: [options] arg1 arg2 + Usage: [options] arg1 arg2 + + Options: + -h, --help show this help message and exit + -v, --verbose make lots of noise [default] + -q, --quiet be vewwy quiet (I'm hunting wabbits) + -f FILE, --filename=FILE + write output to FILE + -m MODE, --mode=MODE interaction mode: novice, intermediate, or + expert [default: intermediate] + + Dangerous Options: + Caution: use these options at your own risk. It is believed that some + of them bite. + + -g Group option. + +A bit more complete example might invole using more than one group: still +extendind the previous example:: + + group = OptionGroup(parser, "Dangerous Options", + "Caution: use these options at your own risk. 
" + "It is believed that some of them bite.") + group.add_option("-g", action="store_true", help="Group option.") + parser.add_option_group(group) + + group = OptionGroup(parser, "Debug Options") + group.add_option("-d", "--debug", action="store_true", + help="Print debug information") + group.add_option("-s", "--sql", action="store_true", + help="Print all SQL statements executed") + group.add_option("-e", action="store_true", help="Print every action done") + parser.add_option_group(group) + +that results in the following output: + +.. code-block:: text + + Usage: [options] arg1 arg2 + + Options: + -h, --help show this help message and exit + -v, --verbose make lots of noise [default] + -q, --quiet be vewwy quiet (I'm hunting wabbits) + -f FILE, --filename=FILE + write output to FILE + -m MODE, --mode=MODE interaction mode: novice, intermediate, or expert + [default: intermediate] + + Dangerous Options: + Caution: use these options at your own risk. It is believed that some + of them bite. + + -g Group option. + + Debug Options: + -d, --debug Print debug information + -s, --sql Print all SQL statements executed + -e Print every action done + +Another interesting method, in particular when working programmatically with +option groups is: + +.. method:: OptionParser.get_option_group(opt_str) - options: - -h, --help show this help message and exit - -v, --verbose make lots of noise [default] - -q, --quiet be vewwy quiet (I'm hunting wabbits) - -fFILE, --file=FILE write output to FILE - -mMODE, --mode=MODE interaction mode: one of 'novice', 'intermediate' - [default], 'expert' - - Dangerous Options: - Caution: use of these options is at your own risk. It is believed that - some of them bite. - -g Group option. + Return, if defined, the :class:`OptionGroup` that has the title or the long + description equals to *opt_str* .. _optparse-printing-version-string: @@ -646,14 +720,14 @@ that takes an integer:: $ /usr/bin/foo -n 4x - usage: foo [options] + Usage: foo [options] foo: error: option -n: invalid integer value: '4x' Or, where the user fails to pass a value at all:: $ /usr/bin/foo -n - usage: foo [options] + Usage: foo [options] foo: error: -n option requires an argument @@ -1155,9 +1229,9 @@ .. code-block:: text - usage: foo.py [options] + Usage: foo.py [options] - options: + Options: -h, --help Show this help message and exit -v Be moderately verbose --file=FILENAME Input file to read data from @@ -1352,7 +1426,7 @@ option strings. Now ``--dry-run`` is the only way for the user to activate that option. If the user asks for help, the help message will reflect that:: - options: + Options: --dry-run do no harm [...] -n, --noisy be noisy @@ -1368,7 +1442,7 @@ At this point, the original ``-n``/``--dry-run`` option is no longer accessible, so :mod:`optparse` removes it, leaving this help text:: - options: + Options: [...] -n, --noisy be noisy --dry-run new dry-run option Modified: python/branches/release31-maint/Doc/library/ssl.rst ============================================================================== --- python/branches/release31-maint/Doc/library/ssl.rst (original) +++ python/branches/release31-maint/Doc/library/ssl.rst Fri Feb 25 11:18:11 2011 @@ -460,11 +460,11 @@ should use the following idiom:: try: - import ssl + import ssl except ImportError: - pass + pass else: - [ do something that requires SSL support ] + ... 
# do something that requires SSL support Client-side operation ^^^^^^^^^^^^^^^^^^^^^ @@ -553,16 +553,15 @@ are finished with the client (or the client is finished with you):: def deal_with_client(connstream): - - data = connstream.read() - # null data means the client is finished with us - while data: - if not do_something(connstream, data): - # we'll assume do_something returns False - # when we're finished with client - break - data = connstream.read() - # finished with client + data = connstream.read() + # null data means the client is finished with us + while data: + if not do_something(connstream, data): + # we'll assume do_something returns False + # when we're finished with client + break + data = connstream.read() + # finished with client And go back to listening for new client connections. Modified: python/branches/release31-maint/Doc/library/string.rst ============================================================================== --- python/branches/release31-maint/Doc/library/string.rst (original) +++ python/branches/release31-maint/Doc/library/string.rst Fri Feb 25 11:18:11 2011 @@ -86,7 +86,7 @@ The :class:`Formatter` class has the following public methods: - .. method:: format(format_string, *args, *kwargs) + .. method:: format(format_string, *args, **kwargs) :meth:`format` is the primary API method. It takes a format template string, and an arbitrary set of positional and keyword argument. @@ -306,10 +306,10 @@ | Option | Meaning | +=========+==========================================================+ | ``'<'`` | Forces the field to be left-aligned within the available | - | | space (this is the default). | + | | space (this is the default for most objects). | +---------+----------------------------------------------------------+ | ``'>'`` | Forces the field to be right-aligned within the | - | | available space. | + | | available space (this is the default for numbers). | +---------+----------------------------------------------------------+ | ``'='`` | Forces the padding to be placed after the sign (if any) | | | but before the digits. This is used for printing fields | @@ -582,7 +582,7 @@ Nesting arguments and more complex examples:: >>> for align, text in zip('<^>', ['left', 'center', 'right']): - ... '{0:{align}{fill}16}'.format(text, fill=align, align=align) + ... '{0:{fill}{align}16}'.format(text, fill=align, align=align) ... 'left<<<<<<<<<<<<' '^^^^^center^^^^^' Modified: python/branches/release31-maint/Doc/tools/sphinxext/indexcontent.html ============================================================================== --- python/branches/release31-maint/Doc/tools/sphinxext/indexcontent.html (original) +++ python/branches/release31-maint/Doc/tools/sphinxext/indexcontent.html Fri Feb 25 11:18:11 2011 @@ -4,7 +4,7 @@
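
The string.rst hunks above fix two things that are easy to check from the
interpreter: the fill character must come before the alignment flag in a nested
format spec, and numbers are right-aligned by default while most other objects
are left-aligned. A small illustrative check (not part of the patch)::

    # Corrected nesting example: fill precedes align in '{0:{fill}{align}16}'
    for align, text in zip('<^>', ['left', 'center', 'right']):
        print('{0:{fill}{align}16}'.format(text, fill=align, align=align))
    # left<<<<<<<<<<<<
    # ^^^^^center^^^^^
    # >>>>>>>>>>>right

    # Default alignment depends on the type being formatted
    print('[{0:16}]'.format('text'))   # [text            ]  strings: left-aligned
    print('[{0:16}]'.format(12345))    # [           12345]  numbers: right-aligned
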
@@ -223,24 +222,34 @@ output = doc.docmodule(module) - # cleanup the extra text formatting that pydoc preforms + # clean up the extra text formatting that pydoc performs patt = re.compile('\b.') output = patt.sub('', output) return output.strip(), loc def print_diffs(text1, text2): "Prints unified diffs for two texts" + # XXX now obsolete, use unittest built-in support lines1 = text1.splitlines(True) lines2 = text2.splitlines(True) diffs = difflib.unified_diff(lines1, lines2, n=0, fromfile='expected', tofile='got') print('\n' + ''.join(diffs)) +def get_html_title(text): + # Bit of hack, but good enough for test purposes + header, _, _ = text.partition("") + _, _, title = header.partition("") + title, _, _ = title.partition("") + return title -class PyDocDocTest(unittest.TestCase): + +class PydocDocTest(unittest.TestCase): @unittest.skipIf(sys.flags.optimize >= 2, "Docstrings are omitted with -O2 and above") + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') def test_html_doc(self): result, doc_loc = get_pydoc_html(pydoc_mod) mod_file = inspect.getabsfile(pydoc_mod) @@ -256,10 +265,12 @@ @unittest.skipIf(sys.flags.optimize >= 2, "Docstrings are omitted with -O2 and above") + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') def test_text_doc(self): result, doc_loc = get_pydoc_text(pydoc_mod) expected_text = expected_text_pattern % \ - (inspect.getabsfile(pydoc_mod), doc_loc) + (doc_loc, inspect.getabsfile(pydoc_mod)) if result != expected_text: print_diffs(expected_text, result) self.fail("outputs are not equal, see diff above") @@ -331,6 +342,43 @@ self.assertEqual(stripid(""), "") + @unittest.skipIf(sys.flags.optimize >= 2, + 'Docstrings are omitted with -O2 and above') + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') + def test_help_output_redirect(self): + # issue 940286, if output is set in Helper, then all output from + # Helper.help should be redirected + old_pattern = expected_text_pattern + getpager_old = pydoc.getpager + getpager_new = lambda: (lambda x: x) + self.maxDiff = None + + buf = StringIO() + helper = pydoc.Helper(output=buf) + unused, doc_loc = get_pydoc_text(pydoc_mod) + module = "test.pydoc_mod" + help_header = """ + Help on module test.pydoc_mod in test: + + """.lstrip() + help_header = textwrap.dedent(help_header) + expected_help_pattern = help_header + expected_text_pattern + + pydoc.getpager = getpager_new + try: + with captured_output('stdout') as output, \ + captured_output('stderr') as err: + helper.help(module) + result = buf.getvalue().strip() + expected_text = expected_help_pattern % \ + (doc_loc, inspect.getabsfile(pydoc_mod)) + self.assertEqual('', output.getvalue()) + self.assertEqual('', err.getvalue()) + self.assertEqual(expected_text, result) + finally: + pydoc.getpager = getpager_old + class TestDescriptions(unittest.TestCase): @@ -340,16 +388,8 @@ doc = pydoc.render_doc(pydocfodder) self.assertIn("pydocfodder", doc) - def test_classic_class(self): - class C: "Classic class" - c = C() - self.assertEqual(pydoc.describe(C), 'class C') - self.assertEqual(pydoc.describe(c), 'C') - expected = 'C in module %s' % __name__ - self.assertIn(expected, pydoc.render_doc(c)) - def test_class(self): - class C(object): "New-style class" + class C: "New-style class" c = C() self.assertEqual(pydoc.describe(C), 'class C') @@ -358,8 +398,75 @@ self.assertIn(expected, 
pydoc.render_doc(c)) +class PydocServerTest(unittest.TestCase): + """Tests for pydoc._start_server""" + + def test_server(self): + + # Minimal test that starts the server, then stops it. + def my_url_handler(url, content_type): + text = 'the URL sent was: (%s, %s)' % (url, content_type) + return text + + serverthread = pydoc._start_server(my_url_handler, port=0) + starttime = time.time() + timeout = 1 #seconds + + while serverthread.serving: + time.sleep(.01) + if serverthread.serving and time.time() - starttime > timeout: + serverthread.stop() + break + + self.assertEqual(serverthread.error, None) + + +class PydocUrlHandlerTest(unittest.TestCase): + """Tests for pydoc._url_handler""" + + def test_content_type_err(self): + f = pydoc._url_handler + self.assertRaises(TypeError, f, 'A', '') + self.assertRaises(TypeError, f, 'B', 'foobar') + + def test_url_requests(self): + # Test for the correct title in the html pages returned. + # This tests the different parts of the URL handler without + # getting too picky about the exact html. + requests = [ + ("", "Pydoc: Index of Modules"), + ("get?key=", "Pydoc: Index of Modules"), + ("index", "Pydoc: Index of Modules"), + ("topics", "Pydoc: Topics"), + ("keywords", "Pydoc: Keywords"), + ("pydoc", "Pydoc: module pydoc"), + ("get?key=pydoc", "Pydoc: module pydoc"), + ("search?key=pydoc", "Pydoc: Search Results"), + ("topic?key=def", "Pydoc: KEYWORD def"), + ("topic?key=STRINGS", "Pydoc: TOPIC STRINGS"), + ("foobar", "Pydoc: Error - foobar"), + ("getfile?key=foobar", "Pydoc: Error - getfile?key=foobar"), + ] + + for url, title in requests: + text = pydoc._url_handler(url, "text/html") + result = get_html_title(text) + self.assertEqual(result, title) + + path = string.__file__ + title = "Pydoc: getfile " + path + url = "getfile?key=" + path + text = pydoc._url_handler(url, "text/html") + result = get_html_title(text) + self.assertEqual(result, title) + + def test_main(): - test.support.run_unittest(PyDocDocTest, TestDescriptions) + test.support.run_unittest(PydocDocTest, + TestDescriptions, + PydocServerTest, + PydocUrlHandlerTest, + ) if __name__ == "__main__": test_main() Modified: python/branches/pep-3151/Lib/test/test_pyexpat.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_pyexpat.py (original) +++ python/branches/pep-3151/Lib/test/test_pyexpat.py Sat Feb 26 08:16:32 2011 @@ -23,12 +23,12 @@ def test_ordered_attributes(self): for x, y in self.set_get_pairs: self.parser.ordered_attributes = x - self.assertEquals(self.parser.ordered_attributes, y) + self.assertEqual(self.parser.ordered_attributes, y) def test_specified_attributes(self): for x, y in self.set_get_pairs: self.parser.specified_attributes = x - self.assertEquals(self.parser.specified_attributes, y) + self.assertEqual(self.parser.specified_attributes, y) data = b'''\ @@ -155,6 +155,14 @@ 'ElementDeclHandler', 'AttlistDeclHandler', 'SkippedEntityHandler', ] + def _hookup_callbacks(self, parser, handler): + """ + Set each of the callbacks defined on handler and named in + self.handler_names on the given parser. 
+ """ + for name in self.handler_names: + setattr(parser, name, getattr(handler, name)) + def _verify_parse_output(self, operations): expected_operations = [ ('XML declaration', ('1.0', 'iso-8859-1', 0)), @@ -190,26 +198,26 @@ "End element: 'root'", ] for operation, expected_operation in zip(operations, expected_operations): - self.assertEquals(operation, expected_operation) + self.assertEqual(operation, expected_operation) def test_unicode(self): # Try the parse again, this time producing Unicode output out = self.Outputter() parser = expat.ParserCreate(namespace_separator='!') - for name in self.handler_names: - setattr(parser, name, getattr(out, name)) + self._hookup_callbacks(parser, out) parser.Parse(data, 1) operations = out.out self._verify_parse_output(operations) + # Issue #6697. + self.assertRaises(AttributeError, getattr, parser, '\uD800') def test_parse_file(self): # Try parsing a file out = self.Outputter() parser = expat.ParserCreate(namespace_separator='!') - for name in self.handler_names: - setattr(parser, name, getattr(out, name)) + self._hookup_callbacks(parser, out) file = BytesIO(data) parser.ParseFile(file) @@ -230,14 +238,14 @@ expat.ParserCreate(namespace_separator=42) self.fail() except TypeError as e: - self.assertEquals(str(e), + self.assertEqual(str(e), 'ParserCreate() argument 2 must be str or None, not int') try: expat.ParserCreate(namespace_separator='too long') self.fail() except ValueError as e: - self.assertEquals(str(e), + self.assertEqual(str(e), 'namespace_separator must be at most one character, omitted, or None') def test_zero_length(self): @@ -263,7 +271,7 @@ p.EndElementHandler = collector p.Parse(" ", 1) tag = L[0] - self.assertEquals(len(L), 6) + self.assertEqual(len(L), 6) for entry in L: # L should have the same string repeated over and over. self.assertTrue(tag is entry) @@ -285,7 +293,7 @@ out = ExternalOutputter(parser) parser.ExternalEntityRefHandler = out.ExternalEntityRefHandler parser.Parse(data, 1) - self.assertEquals(out.parser_result, 1) + self.assertEqual(out.parser_result, 1) class BufferTextTest(unittest.TestCase): @@ -296,7 +304,7 @@ self.parser.CharacterDataHandler = self.CharacterDataHandler def check(self, expected, label): - self.assertEquals(self.stuff, expected, + self.assertEqual(self.stuff, expected, "%s\nstuff = %r\nexpected = %r" % (label, self.stuff, map(str, expected))) @@ -329,47 +337,47 @@ # Make sure buffering is turned on self.assertTrue(self.parser.buffer_text) self.parser.Parse("123", 1) - self.assertEquals(self.stuff, ['123'], - "buffered text not properly collapsed") + self.assertEqual(self.stuff, ['123'], + "buffered text not properly collapsed") def test1(self): # XXX This test exposes more detail of Expat's text chunking than we # XXX like, but it tests what we need to concisely. 
self.setHandlers(["StartElementHandler"]) self.parser.Parse("12\n34\n5", 1) - self.assertEquals(self.stuff, - ["", "1", "", "2", "\n", "3", "", "4\n5"], - "buffering control not reacting as expected") + self.assertEqual(self.stuff, + ["", "1", "", "2", "\n", "3", "", "4\n5"], + "buffering control not reacting as expected") def test2(self): self.parser.Parse("1<2> \n 3", 1) - self.assertEquals(self.stuff, ["1<2> \n 3"], - "buffered text not properly collapsed") + self.assertEqual(self.stuff, ["1<2> \n 3"], + "buffered text not properly collapsed") def test3(self): self.setHandlers(["StartElementHandler"]) self.parser.Parse("123", 1) - self.assertEquals(self.stuff, ["", "1", "", "2", "", "3"], - "buffered text not properly split") + self.assertEqual(self.stuff, ["", "1", "", "2", "", "3"], + "buffered text not properly split") def test4(self): self.setHandlers(["StartElementHandler", "EndElementHandler"]) self.parser.CharacterDataHandler = None self.parser.Parse("123", 1) - self.assertEquals(self.stuff, - ["", "", "", "", "", ""]) + self.assertEqual(self.stuff, + ["", "", "", "", "", ""]) def test5(self): self.setHandlers(["StartElementHandler", "EndElementHandler"]) self.parser.Parse("123", 1) - self.assertEquals(self.stuff, + self.assertEqual(self.stuff, ["", "1", "", "", "2", "", "", "3", ""]) def test6(self): self.setHandlers(["CommentHandler", "EndElementHandler", "StartElementHandler"]) self.parser.Parse("12345 ", 1) - self.assertEquals(self.stuff, + self.assertEqual(self.stuff, ["", "1", "", "", "2", "", "", "345", ""], "buffered text not properly split") @@ -377,10 +385,10 @@ self.setHandlers(["CommentHandler", "EndElementHandler", "StartElementHandler"]) self.parser.Parse("12345 ", 1) - self.assertEquals(self.stuff, - ["", "1", "", "", "2", "", "", "3", - "", "4", "", "5", ""], - "buffered text not properly split") + self.assertEqual(self.stuff, + ["", "1", "", "", "2", "", "", "3", + "", "4", "", "5", ""], + "buffered text not properly split") # Test handling of exception from callback: @@ -395,9 +403,9 @@ parser.Parse("", 1) self.fail() except RuntimeError as e: - self.assertEquals(e.args[0], 'a', - "Expected RuntimeError for element 'a', but" + \ - " found %r" % e.args[0]) + self.assertEqual(e.args[0], 'a', + "Expected RuntimeError for element 'a', but" + \ + " found %r" % e.args[0]) # Test Current* members: @@ -416,7 +424,7 @@ self.assertTrue(self.upto < len(self.expected_list), 'too many parser events') expected = self.expected_list[self.upto] - self.assertEquals(pos, expected, + self.assertEqual(pos, expected, 'Expected position %s, got position %s' %(pos, expected)) self.upto += 1 @@ -457,10 +465,10 @@ """ def test_1025_bytes(self): - self.assertEquals(self.small_buffer_test(1025), 2) + self.assertEqual(self.small_buffer_test(1025), 2) def test_1000_bytes(self): - self.assertEquals(self.small_buffer_test(1000), 1) + self.assertEqual(self.small_buffer_test(1000), 1) def test_wrong_size(self): parser = expat.ParserCreate() @@ -483,15 +491,15 @@ # once. self.n = 0 parser.Parse(xml1) - self.assertEquals(self.n, 1) + self.assertEqual(self.n, 1) # Reassign to buffer_size, but assign the same size. 
parser.buffer_size = parser.buffer_size - self.assertEquals(self.n, 1) + self.assertEqual(self.n, 1) # Try parsing rest of the document parser.Parse(xml2) - self.assertEquals(self.n, 2) + self.assertEqual(self.n, 2) def test_disabling_buffer(self): @@ -502,27 +510,27 @@ parser.CharacterDataHandler = self.counting_handler parser.buffer_text = 1 parser.buffer_size = 1024 - self.assertEquals(parser.buffer_size, 1024) + self.assertEqual(parser.buffer_size, 1024) # Parse one chunk of XML self.n = 0 parser.Parse(xml1, 0) - self.assertEquals(parser.buffer_size, 1024) - self.assertEquals(self.n, 1) + self.assertEqual(parser.buffer_size, 1024) + self.assertEqual(self.n, 1) # Turn off buffering and parse the next chunk. parser.buffer_text = 0 self.assertFalse(parser.buffer_text) - self.assertEquals(parser.buffer_size, 1024) + self.assertEqual(parser.buffer_size, 1024) for i in range(10): parser.Parse(xml2, 0) - self.assertEquals(self.n, 11) + self.assertEqual(self.n, 11) parser.buffer_text = 1 self.assertTrue(parser.buffer_text) - self.assertEquals(parser.buffer_size, 1024) + self.assertEqual(parser.buffer_size, 1024) parser.Parse(xml3, 1) - self.assertEquals(self.n, 12) + self.assertEqual(self.n, 12) @@ -550,14 +558,14 @@ parser.CharacterDataHandler = self.counting_handler parser.buffer_text = 1 parser.buffer_size = 1024 - self.assertEquals(parser.buffer_size, 1024) + self.assertEqual(parser.buffer_size, 1024) self.n = 0 parser.Parse(xml1, 0) parser.buffer_size *= 2 - self.assertEquals(parser.buffer_size, 2048) + self.assertEqual(parser.buffer_size, 2048) parser.Parse(xml2, 1) - self.assertEquals(self.n, 2) + self.assertEqual(self.n, 2) def test_change_size_2(self): xml1 = "a%s" % ('a' * 1023) @@ -566,14 +574,14 @@ parser.CharacterDataHandler = self.counting_handler parser.buffer_text = 1 parser.buffer_size = 2048 - self.assertEquals(parser.buffer_size, 2048) + self.assertEqual(parser.buffer_size, 2048) self.n=0 parser.Parse(xml1, 0) parser.buffer_size = parser.buffer_size // 2 - self.assertEquals(parser.buffer_size, 1024) + self.assertEqual(parser.buffer_size, 1024) parser.Parse(xml2, 1) - self.assertEquals(self.n, 4) + self.assertEqual(self.n, 4) class MalformedInputTest(unittest.TestCase): def test1(self): @@ -583,7 +591,7 @@ parser.Parse(xml, True) self.fail() except expat.ExpatError as e: - self.assertEquals(str(e), 'unclosed token: line 2, column 0') + self.assertEqual(str(e), 'unclosed token: line 2, column 0') def test2(self): xml = "\r\n" @@ -592,7 +600,7 @@ parser.Parse(xml, True) self.fail() except expat.ExpatError as e: - self.assertEquals(str(e), 'XML declaration not well-formed: line 1, column 14') + self.assertEqual(str(e), 'XML declaration not well-formed: line 1, column 14') class ErrorMessageTest(unittest.TestCase): def test_codes(self): @@ -607,8 +615,50 @@ parser.Parse(xml, True) self.fail() except expat.ExpatError as e: - self.assertEquals(e.code, - errors.codes[errors.XML_ERROR_UNCLOSED_TOKEN]) + self.assertEqual(e.code, + errors.codes[errors.XML_ERROR_UNCLOSED_TOKEN]) + + +class ForeignDTDTests(unittest.TestCase): + """ + Tests for the UseForeignDTD method of expat parser objects. + """ + def test_use_foreign_dtd(self): + """ + If UseForeignDTD is passed True and a document without an external + entity reference is parsed, ExternalEntityRefHandler is first called + with None for the public and system ids. 
+ """ + handler_call_args = [] + def resolve_entity(context, base, system_id, public_id): + handler_call_args.append((public_id, system_id)) + return 1 + + parser = expat.ParserCreate() + parser.UseForeignDTD(True) + parser.SetParamEntityParsing(expat.XML_PARAM_ENTITY_PARSING_ALWAYS) + parser.ExternalEntityRefHandler = resolve_entity + parser.Parse("") + self.assertEqual(handler_call_args, [(None, None)]) + + def test_ignore_use_foreign_dtd(self): + """ + If UseForeignDTD is passed True and a document with an external + entity reference is parsed, ExternalEntityRefHandler is called with + the public and system ids from the document. + """ + handler_call_args = [] + def resolve_entity(context, base, system_id, public_id): + handler_call_args.append((public_id, system_id)) + return 1 + + parser = expat.ParserCreate() + parser.UseForeignDTD(True) + parser.SetParamEntityParsing(expat.XML_PARAM_ENTITY_PARSING_ALWAYS) + parser.ExternalEntityRefHandler = resolve_entity + parser.Parse( + "") + self.assertEqual(handler_call_args, [("bar", "baz")]) def test_main(): @@ -622,7 +672,8 @@ sf1296433Test, ChardataBufferTest, MalformedInputTest, - ErrorMessageTest) + ErrorMessageTest, + ForeignDTDTests) if __name__ == "__main__": test_main() Modified: python/branches/pep-3151/Lib/test/test_queue.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_queue.py (original) +++ python/branches/pep-3151/Lib/test/test_queue.py Sat Feb 26 08:16:32 2011 @@ -100,8 +100,8 @@ LifoQueue = [222, 333, 111], PriorityQueue = [111, 222, 333]) actual_order = [q.get(), q.get(), q.get()] - self.assertEquals(actual_order, target_order[q.__class__.__name__], - "Didn't seem to queue the correct data!") + self.assertEqual(actual_order, target_order[q.__class__.__name__], + "Didn't seem to queue the correct data!") for i in range(QUEUE_SIZE-1): q.put(i) self.assertTrue(q.qsize(), "Queue should not be empty") @@ -161,8 +161,8 @@ for i in range(100): q.put(i) q.join() - self.assertEquals(self.cum, sum(range(100)), - "q.join() did not block until all tasks were done") + self.assertEqual(self.cum, sum(range(100)), + "q.join() did not block until all tasks were done") for i in (0,1): q.put(-1) # instruct the threads to close q.join() # verify that you can join twice Modified: python/branches/pep-3151/Lib/test/test_random.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_random.py (original) +++ python/branches/pep-3151/Lib/test/test_random.py Sat Feb 26 08:16:32 2011 @@ -246,8 +246,8 @@ '0x1.1ebb4352e4c4dp-1', '0x1.1a7422abf9c11p-1']) self.gen.seed("the quick brown fox", version=2) self.assertEqual([self.gen.random().hex() for i in range(4)], - ['0x1.1294009b9eda4p-2', '0x1.2ff96171b0010p-1', - '0x1.459e0989bd8e0p-5', '0x1.8b5f55892ddcbp-1']) + ['0x1.1239ddfb11b7cp-3', '0x1.b3cbb5c51b120p-4', + '0x1.8c4f55116b60fp-1', '0x1.63eb525174a27p-1']) def test_setstate_first_arg(self): self.assertRaises(ValueError, self.gen.setstate, (1, None, None)) Modified: python/branches/pep-3151/Lib/test/test_range.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_range.py (original) +++ python/branches/pep-3151/Lib/test/test_range.py Sat Feb 26 08:16:32 2011 @@ -90,6 +90,250 @@ r = range(-sys.maxsize, sys.maxsize, 2) self.assertEqual(len(r), sys.maxsize) + def test_large_operands(self): + x = range(10**20, 10**20+10, 3) + 
self.assertEqual(len(x), 4) + self.assertEqual(len(list(x)), 4) + + x = range(10**20+10, 10**20, 3) + self.assertEqual(len(x), 0) + self.assertEqual(len(list(x)), 0) + + x = range(10**20, 10**20+10, -3) + self.assertEqual(len(x), 0) + self.assertEqual(len(list(x)), 0) + + x = range(10**20+10, 10**20, -3) + self.assertEqual(len(x), 4) + self.assertEqual(len(list(x)), 4) + + # Now test range() with longs + self.assertEqual(list(range(-2**100)), []) + self.assertEqual(list(range(0, -2**100)), []) + self.assertEqual(list(range(0, 2**100, -1)), []) + self.assertEqual(list(range(0, 2**100, -1)), []) + + a = int(10 * sys.maxsize) + b = int(100 * sys.maxsize) + c = int(50 * sys.maxsize) + + self.assertEqual(list(range(a, a+2)), [a, a+1]) + self.assertEqual(list(range(a+2, a, -1)), [a+2, a+1]) + self.assertEqual(list(range(a+4, a, -2)), [a+4, a+2]) + + seq = list(range(a, b, c)) + self.assertIn(a, seq) + self.assertNotIn(b, seq) + self.assertEqual(len(seq), 2) + self.assertEqual(seq[0], a) + self.assertEqual(seq[-1], a+c) + + seq = list(range(b, a, -c)) + self.assertIn(b, seq) + self.assertNotIn(a, seq) + self.assertEqual(len(seq), 2) + self.assertEqual(seq[0], b) + self.assertEqual(seq[-1], b-c) + + seq = list(range(-a, -b, -c)) + self.assertIn(-a, seq) + self.assertNotIn(-b, seq) + self.assertEqual(len(seq), 2) + self.assertEqual(seq[0], -a) + self.assertEqual(seq[-1], -a-c) + + def test_large_range(self): + # Check long ranges (len > sys.maxsize) + # len() is expected to fail due to limitations of the __len__ protocol + def _range_len(x): + try: + length = len(x) + except OverflowError: + step = x[1] - x[0] + length = 1 + ((x[-1] - x[0]) // step) + return length + a = -sys.maxsize + b = sys.maxsize + expected_len = b - a + x = range(a, b) + self.assertIn(a, x) + self.assertNotIn(b, x) + self.assertRaises(OverflowError, len, x) + self.assertEqual(_range_len(x), expected_len) + self.assertEqual(x[0], a) + idx = sys.maxsize+1 + self.assertEqual(x[idx], a+idx) + self.assertEqual(x[idx:idx+1][0], a+idx) + with self.assertRaises(IndexError): + x[-expected_len-1] + with self.assertRaises(IndexError): + x[expected_len] + + a = 0 + b = 2 * sys.maxsize + expected_len = b - a + x = range(a, b) + self.assertIn(a, x) + self.assertNotIn(b, x) + self.assertRaises(OverflowError, len, x) + self.assertEqual(_range_len(x), expected_len) + self.assertEqual(x[0], a) + idx = sys.maxsize+1 + self.assertEqual(x[idx], a+idx) + self.assertEqual(x[idx:idx+1][0], a+idx) + with self.assertRaises(IndexError): + x[-expected_len-1] + with self.assertRaises(IndexError): + x[expected_len] + + a = 0 + b = sys.maxsize**10 + c = 2*sys.maxsize + expected_len = 1 + (b - a) // c + x = range(a, b, c) + self.assertIn(a, x) + self.assertNotIn(b, x) + self.assertRaises(OverflowError, len, x) + self.assertEqual(_range_len(x), expected_len) + self.assertEqual(x[0], a) + idx = sys.maxsize+1 + self.assertEqual(x[idx], a+(idx*c)) + self.assertEqual(x[idx:idx+1][0], a+(idx*c)) + with self.assertRaises(IndexError): + x[-expected_len-1] + with self.assertRaises(IndexError): + x[expected_len] + + a = sys.maxsize**10 + b = 0 + c = -2*sys.maxsize + expected_len = 1 + (b - a) // c + x = range(a, b, c) + self.assertIn(a, x) + self.assertNotIn(b, x) + self.assertRaises(OverflowError, len, x) + self.assertEqual(_range_len(x), expected_len) + self.assertEqual(x[0], a) + idx = sys.maxsize+1 + self.assertEqual(x[idx], a+(idx*c)) + self.assertEqual(x[idx:idx+1][0], a+(idx*c)) + with self.assertRaises(IndexError): + x[-expected_len-1] + with 
self.assertRaises(IndexError): + x[expected_len] + + def test_invalid_invocation(self): + self.assertRaises(TypeError, range) + self.assertRaises(TypeError, range, 1, 2, 3, 4) + self.assertRaises(ValueError, range, 1, 2, 0) + a = int(10 * sys.maxsize) + self.assertRaises(ValueError, range, a, a + 1, int(0)) + self.assertRaises(TypeError, range, 1., 1., 1.) + self.assertRaises(TypeError, range, 1e100, 1e101, 1e101) + self.assertRaises(TypeError, range, 0, "spam") + self.assertRaises(TypeError, range, 0, 42, "spam") + # Exercise various combinations of bad arguments, to check + # refcounting logic + self.assertRaises(TypeError, range, 0.0) + self.assertRaises(TypeError, range, 0, 0.0) + self.assertRaises(TypeError, range, 0.0, 0) + self.assertRaises(TypeError, range, 0.0, 0.0) + self.assertRaises(TypeError, range, 0, 0, 1.0) + self.assertRaises(TypeError, range, 0, 0.0, 1) + self.assertRaises(TypeError, range, 0, 0.0, 1.0) + self.assertRaises(TypeError, range, 0.0, 0, 1) + self.assertRaises(TypeError, range, 0.0, 0, 1.0) + self.assertRaises(TypeError, range, 0.0, 0.0, 1) + self.assertRaises(TypeError, range, 0.0, 0.0, 1.0) + + def test_index(self): + u = range(2) + self.assertEqual(u.index(0), 0) + self.assertEqual(u.index(1), 1) + self.assertRaises(ValueError, u.index, 2) + + u = range(-2, 3) + self.assertEqual(u.count(0), 1) + self.assertEqual(u.index(0), 2) + self.assertRaises(TypeError, u.index) + + class BadExc(Exception): + pass + + class BadCmp: + def __eq__(self, other): + if other == 2: + raise BadExc() + return False + + a = range(4) + self.assertRaises(BadExc, a.index, BadCmp()) + + a = range(-2, 3) + self.assertEqual(a.index(0), 2) + self.assertEqual(range(1, 10, 3).index(4), 1) + self.assertEqual(range(1, -10, -3).index(-5), 2) + + self.assertEqual(range(10**20).index(1), 1) + self.assertEqual(range(10**20).index(10**20 - 1), 10**20 - 1) + + self.assertRaises(ValueError, range(1, 2**100, 2).index, 2**87) + self.assertEqual(range(1, 2**100, 2).index(2**87+1), 2**86) + + class AlwaysEqual(object): + def __eq__(self, other): + return True + always_equal = AlwaysEqual() + self.assertEqual(range(10).index(always_equal), 0) + + def test_user_index_method(self): + bignum = 2*sys.maxsize + smallnum = 42 + + # User-defined class with an __index__ method + class I: + def __init__(self, n): + self.n = int(n) + def __index__(self): + return self.n + self.assertEqual(list(range(I(bignum), I(bignum + 1))), [bignum]) + self.assertEqual(list(range(I(smallnum), I(smallnum + 1))), [smallnum]) + + # User-defined class with a failing __index__ method + class IX: + def __index__(self): + raise RuntimeError + self.assertRaises(RuntimeError, range, IX()) + + # User-defined class with an invalid __index__ method + class IN: + def __index__(self): + return "not a number" + + self.assertRaises(TypeError, range, IN()) + + def test_count(self): + self.assertEqual(range(3).count(-1), 0) + self.assertEqual(range(3).count(0), 1) + self.assertEqual(range(3).count(1), 1) + self.assertEqual(range(3).count(2), 1) + self.assertEqual(range(3).count(3), 0) + self.assertIs(type(range(3).count(-1)), int) + self.assertIs(type(range(3).count(1)), int) + self.assertEqual(range(10**20).count(1), 1) + self.assertEqual(range(10**20).count(10**20), 0) + self.assertEqual(range(3).index(1), 1) + self.assertEqual(range(1, 2**100, 2).count(2**87), 0) + self.assertEqual(range(1, 2**100, 2).count(2**87+1), 1) + + class AlwaysEqual(object): + def __eq__(self, other): + return True + always_equal = AlwaysEqual() + 
self.assertEqual(range(10).count(always_equal), 10) + + self.assertEqual(len(range(sys.maxsize, sys.maxsize+10)), 10) + def test_repr(self): self.assertEqual(repr(range(1)), 'range(0, 1)') self.assertEqual(repr(range(1, 2)), 'range(1, 2)') @@ -101,8 +345,8 @@ for proto in range(pickle.HIGHEST_PROTOCOL + 1): for t in testcases: r = range(*t) - self.assertEquals(list(pickle.loads(pickle.dumps(r, proto))), - list(r)) + self.assertEqual(list(pickle.loads(pickle.dumps(r, proto))), + list(r)) def test_odd_bug(self): # This used to raise a "SystemError: NULL result without error" @@ -191,6 +435,70 @@ test_id = "reversed(range({}, {}, {}))".format(start, end, step) self.assert_iterators_equal(iter1, iter2, test_id, limit=100) + def test_slice(self): + def check(start, stop, step=None): + i = slice(start, stop, step) + self.assertEqual(list(r[i]), list(r)[i]) + self.assertEqual(len(r[i]), len(list(r)[i])) + for r in [range(10), + range(0), + range(1, 9, 3), + range(8, 0, -3), + range(sys.maxsize+1, sys.maxsize+10), + ]: + check(0, 2) + check(0, 20) + check(1, 2) + check(20, 30) + check(-30, -20) + check(-1, 100, 2) + check(0, -1) + check(-1, -3, -1) + + def test_contains(self): + r = range(10) + self.assertIn(0, r) + self.assertIn(1, r) + self.assertIn(5.0, r) + self.assertNotIn(5.1, r) + self.assertNotIn(-1, r) + self.assertNotIn(10, r) + self.assertNotIn("", r) + r = range(9, -1, -1) + self.assertIn(0, r) + self.assertIn(1, r) + self.assertIn(5.0, r) + self.assertNotIn(5.1, r) + self.assertNotIn(-1, r) + self.assertNotIn(10, r) + self.assertNotIn("", r) + r = range(0, 10, 2) + self.assertIn(0, r) + self.assertNotIn(1, r) + self.assertNotIn(5.0, r) + self.assertNotIn(5.1, r) + self.assertNotIn(-1, r) + self.assertNotIn(10, r) + self.assertNotIn("", r) + r = range(9, -1, -2) + self.assertNotIn(0, r) + self.assertIn(1, r) + self.assertIn(5.0, r) + self.assertNotIn(5.1, r) + self.assertNotIn(-1, r) + self.assertNotIn(10, r) + self.assertNotIn("", r) + + def test_reverse_iteration(self): + for r in [range(10), + range(0), + range(1, 9, 3), + range(8, 0, -3), + range(sys.maxsize+1, sys.maxsize+10), + ]: + self.assertEqual(list(reversed(r)), list(r)[::-1]) + + def test_main(): test.support.run_unittest(RangeTest) Modified: python/branches/pep-3151/Lib/test/test_reprlib.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_reprlib.py (original) +++ python/branches/pep-3151/Lib/test/test_reprlib.py Sat Feb 26 08:16:32 2011 @@ -23,7 +23,7 @@ class ReprTests(unittest.TestCase): def test_string(self): - eq = self.assertEquals + eq = self.assertEqual eq(r("abc"), "'abc'") eq(r("abcdefghijklmnop"),"'abcdefghijklmnop'") @@ -37,7 +37,7 @@ eq(r(s), expected) def test_tuple(self): - eq = self.assertEquals + eq = self.assertEqual eq(r((1,)), "(1,)") t3 = (1, 2, 3) @@ -52,7 +52,7 @@ from array import array from collections import deque - eq = self.assertEquals + eq = self.assertEqual # Tuples give up after 6 elements eq(r(()), "()") eq(r((1,)), "(1,)") @@ -102,7 +102,7 @@ "array('i', [1, 2, 3, 4, 5, ...])") def test_numbers(self): - eq = self.assertEquals + eq = self.assertEqual eq(r(123), repr(123)) eq(r(123), repr(123)) eq(r(1.0/3), repr(1.0/3)) @@ -112,7 +112,7 @@ eq(r(n), expected) def test_instance(self): - eq = self.assertEquals + eq = self.assertEqual i1 = ClassWithRepr("a") eq(r(i1), repr(i1)) @@ -134,7 +134,7 @@ # XXX anonymous functions? 
see func_repr def test_builtin_function(self): - eq = self.assertEquals + eq = self.assertEqual # Functions eq(repr(hash), '') # Methods @@ -142,13 +142,13 @@ '") # XXX member descriptors @@ -230,15 +230,15 @@ del sys.path[0] def test_module(self): - eq = self.assertEquals + eq = self.assertEqual touch(os.path.join(self.subpkgname, self.pkgname + '.py')) from areallylongpackageandmodulenametotestreprtruncation.areallylongpackageandmodulenametotestreprtruncation import areallylongpackageandmodulenametotestreprtruncation eq(repr(areallylongpackageandmodulenametotestreprtruncation), - "" % (areallylongpackageandmodulenametotestreprtruncation.__name__, areallylongpackageandmodulenametotestreprtruncation.__file__)) + "" % (areallylongpackageandmodulenametotestreprtruncation.__name__, areallylongpackageandmodulenametotestreprtruncation.__file__)) eq(repr(sys), "") def test_type(self): - eq = self.assertEquals + eq = self.assertEqual touch(os.path.join(self.subpkgname, 'foo.py'), '''\ class foo(object): pass @@ -259,7 +259,7 @@ ''') from areallylongpackageandmodulenametotestreprtruncation.areallylongpackageandmodulenametotestreprtruncation import bar # Module name may be prefixed with "test.", depending on how run. - self.assertEquals(repr(bar.bar), "" % bar.__name__) + self.assertEqual(repr(bar.bar), "" % bar.__name__) def test_instance(self): touch(os.path.join(self.subpkgname, 'baz.py'), '''\ @@ -272,7 +272,7 @@ "<%s.baz object at 0x" % baz.__name__)) def test_method(self): - eq = self.assertEquals + eq = self.assertEqual touch(os.path.join(self.subpkgname, 'qux.py'), '''\ class aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa: def amethod(self): pass Modified: python/branches/pep-3151/Lib/test/test_resource.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_resource.py (original) +++ python/branches/pep-3151/Lib/test/test_resource.py Sat Feb 26 08:16:32 2011 @@ -102,6 +102,10 @@ usageboth = resource.getrusage(resource.RUSAGE_BOTH) except (ValueError, AttributeError): pass + try: + usage_thread = resource.getrusage(resource.RUSAGE_THREAD) + except (ValueError, AttributeError): + pass def test_main(verbose=None): support.run_unittest(ResourceTest) Modified: python/branches/pep-3151/Lib/test/test_richcmp.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_richcmp.py (original) +++ python/branches/pep-3151/Lib/test/test_richcmp.py Sat Feb 26 08:16:32 2011 @@ -220,6 +220,7 @@ for func in (do, operator.not_): self.assertRaises(Exc, func, Bad()) + @support.no_tracing def test_recursion(self): # Check that comparison for recursive objects fails gracefully from collections import UserList Modified: python/branches/pep-3151/Lib/test/test_runpy.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_runpy.py (original) +++ python/branches/pep-3151/Lib/test/test_runpy.py Sat Feb 26 08:16:32 2011 @@ -6,7 +6,8 @@ import re import tempfile import py_compile -from test.support import forget, make_legacy_pyc, run_unittest, unload, verbose +from test.support import ( + forget, make_legacy_pyc, run_unittest, unload, verbose, no_tracing) from test.script_helper import ( make_pkg, make_script, make_zip_pkg, make_zip_script, temp_dir) @@ -329,7 +330,7 
@@ def _check_import_error(self, script_name, msg): msg = re.escape(msg) - self.assertRaisesRegexp(ImportError, msg, run_path, script_name) + self.assertRaisesRegex(ImportError, msg, run_path, script_name) def test_basic_script(self): with temp_dir() as script_dir: @@ -395,6 +396,7 @@ msg = "can't find '__main__' module in %r" % zip_name self._check_import_error(zip_name, msg) + @no_tracing def test_main_recursion_error(self): with temp_dir() as script_dir, temp_dir() as dummy_dir: mod_name = '__main__' @@ -403,7 +405,7 @@ script_name = self._make_test_script(script_dir, mod_name, source) zip_name, fname = make_zip_script(script_dir, 'test_zip', script_name) msg = "recursion depth exceeded" - self.assertRaisesRegexp(RuntimeError, msg, run_path, zip_name) + self.assertRaisesRegex(RuntimeError, msg, run_path, zip_name) Modified: python/branches/pep-3151/Lib/test/test_sax.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_sax.py (original) +++ python/branches/pep-3151/Lib/test/test_sax.py Sat Feb 26 08:16:32 2011 @@ -20,8 +20,8 @@ TEST_XMLFILE = findfile("test.xml", subdir="xmltestdata") TEST_XMLFILE_OUT = findfile("test.xml.out", subdir="xmltestdata") try: - TEST_XMLFILE.encode("utf8") - TEST_XMLFILE_OUT.encode("utf8") + TEST_XMLFILE.encode("utf-8") + TEST_XMLFILE_OUT.encode("utf-8") except UnicodeEncodeError: raise unittest.SkipTest("filename is not encodable to utf8") @@ -34,16 +34,16 @@ self.assertRaises(KeyError, attrs.getNameByQName, "attr") self.assertRaises(KeyError, attrs.getQNameByName, "attr") self.assertRaises(KeyError, attrs.__getitem__, "attr") - self.assertEquals(attrs.getLength(), 0) - self.assertEquals(attrs.getNames(), []) - self.assertEquals(attrs.getQNames(), []) - self.assertEquals(len(attrs), 0) + self.assertEqual(attrs.getLength(), 0) + self.assertEqual(attrs.getNames(), []) + self.assertEqual(attrs.getQNames(), []) + self.assertEqual(len(attrs), 0) self.assertNotIn("attr", attrs) - self.assertEquals(list(attrs.keys()), []) - self.assertEquals(attrs.get("attrs"), None) - self.assertEquals(attrs.get("attrs", 25), 25) - self.assertEquals(list(attrs.items()), []) - self.assertEquals(list(attrs.values()), []) + self.assertEqual(list(attrs.keys()), []) + self.assertEqual(attrs.get("attrs"), None) + self.assertEqual(attrs.get("attrs", 25), 25) + self.assertEqual(list(attrs.items()), []) + self.assertEqual(list(attrs.values()), []) def verify_empty_nsattrs(self, attrs): self.assertRaises(KeyError, attrs.getValue, (ns_uri, "attr")) @@ -51,33 +51,33 @@ self.assertRaises(KeyError, attrs.getNameByQName, "ns:attr") self.assertRaises(KeyError, attrs.getQNameByName, (ns_uri, "attr")) self.assertRaises(KeyError, attrs.__getitem__, (ns_uri, "attr")) - self.assertEquals(attrs.getLength(), 0) - self.assertEquals(attrs.getNames(), []) - self.assertEquals(attrs.getQNames(), []) - self.assertEquals(len(attrs), 0) + self.assertEqual(attrs.getLength(), 0) + self.assertEqual(attrs.getNames(), []) + self.assertEqual(attrs.getQNames(), []) + self.assertEqual(len(attrs), 0) self.assertNotIn((ns_uri, "attr"), attrs) - self.assertEquals(list(attrs.keys()), []) - self.assertEquals(attrs.get((ns_uri, "attr")), None) - self.assertEquals(attrs.get((ns_uri, "attr"), 25), 25) - self.assertEquals(list(attrs.items()), []) - self.assertEquals(list(attrs.values()), []) + self.assertEqual(list(attrs.keys()), []) + self.assertEqual(attrs.get((ns_uri, "attr")), None) + self.assertEqual(attrs.get((ns_uri, "attr"), 25), 25) + 
self.assertEqual(list(attrs.items()), []) + self.assertEqual(list(attrs.values()), []) def verify_attrs_wattr(self, attrs): - self.assertEquals(attrs.getLength(), 1) - self.assertEquals(attrs.getNames(), ["attr"]) - self.assertEquals(attrs.getQNames(), ["attr"]) - self.assertEquals(len(attrs), 1) + self.assertEqual(attrs.getLength(), 1) + self.assertEqual(attrs.getNames(), ["attr"]) + self.assertEqual(attrs.getQNames(), ["attr"]) + self.assertEqual(len(attrs), 1) self.assertIn("attr", attrs) - self.assertEquals(list(attrs.keys()), ["attr"]) - self.assertEquals(attrs.get("attr"), "val") - self.assertEquals(attrs.get("attr", 25), "val") - self.assertEquals(list(attrs.items()), [("attr", "val")]) - self.assertEquals(list(attrs.values()), ["val"]) - self.assertEquals(attrs.getValue("attr"), "val") - self.assertEquals(attrs.getValueByQName("attr"), "val") - self.assertEquals(attrs.getNameByQName("attr"), "attr") - self.assertEquals(attrs["attr"], "val") - self.assertEquals(attrs.getQNameByName("attr"), "attr") + self.assertEqual(list(attrs.keys()), ["attr"]) + self.assertEqual(attrs.get("attr"), "val") + self.assertEqual(attrs.get("attr", 25), "val") + self.assertEqual(list(attrs.items()), [("attr", "val")]) + self.assertEqual(list(attrs.values()), ["val"]) + self.assertEqual(attrs.getValue("attr"), "val") + self.assertEqual(attrs.getValueByQName("attr"), "val") + self.assertEqual(attrs.getNameByQName("attr"), "attr") + self.assertEqual(attrs["attr"], "val") + self.assertEqual(attrs.getQNameByName("attr"), "attr") class MakeParserTest(unittest.TestCase): def test_make_parser2(self): @@ -107,47 +107,47 @@ class SaxutilsTest(unittest.TestCase): # ===== escape def test_escape_basic(self): - self.assertEquals(escape("Donald Duck & Co"), "Donald Duck & Co") + self.assertEqual(escape("Donald Duck & Co"), "Donald Duck & Co") def test_escape_all(self): - self.assertEquals(escape(""), - "<Donald Duck & Co>") + self.assertEqual(escape(""), + "<Donald Duck & Co>") def test_escape_extra(self): - self.assertEquals(escape("Hei p?? deg", {"??" : "å"}), - "Hei på deg") + self.assertEqual(escape("Hei p?? deg", {"??" : "å"}), + "Hei på deg") # ===== unescape def test_unescape_basic(self): - self.assertEquals(unescape("Donald Duck & Co"), "Donald Duck & Co") + self.assertEqual(unescape("Donald Duck & Co"), "Donald Duck & Co") def test_unescape_all(self): - self.assertEquals(unescape("<Donald Duck & Co>"), - "") + self.assertEqual(unescape("<Donald Duck & Co>"), + "") def test_unescape_extra(self): - self.assertEquals(unescape("Hei p?? deg", {"??" : "å"}), - "Hei på deg") + self.assertEqual(unescape("Hei p?? deg", {"??" 
: "å"}), + "Hei på deg") def test_unescape_amp_extra(self): - self.assertEquals(unescape("&foo;", {"&foo;": "splat"}), "&foo;") + self.assertEqual(unescape("&foo;", {"&foo;": "splat"}), "&foo;") # ===== quoteattr def test_quoteattr_basic(self): - self.assertEquals(quoteattr("Donald Duck & Co"), - '"Donald Duck & Co"') + self.assertEqual(quoteattr("Donald Duck & Co"), + '"Donald Duck & Co"') def test_single_quoteattr(self): - self.assertEquals(quoteattr('Includes "double" quotes'), - '\'Includes "double" quotes\'') + self.assertEqual(quoteattr('Includes "double" quotes'), + '\'Includes "double" quotes\'') def test_double_quoteattr(self): - self.assertEquals(quoteattr("Includes 'single' quotes"), - "\"Includes 'single' quotes\"") + self.assertEqual(quoteattr("Includes 'single' quotes"), + "\"Includes 'single' quotes\"") def test_single_double_quoteattr(self): - self.assertEquals(quoteattr("Includes 'single' and \"double\" quotes"), - "\"Includes 'single' and "double" quotes\"") + self.assertEqual(quoteattr("Includes 'single' and \"double\" quotes"), + "\"Includes 'single' and "double" quotes\"") # ===== make_parser def test_make_parser(self): @@ -169,7 +169,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + "") + self.assertEqual(result.getvalue(), start + "") def test_xmlgen_basic_empty(self): result = StringIO() @@ -179,7 +179,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + "") + self.assertEqual(result.getvalue(), start + "") def test_xmlgen_content(self): result = StringIO() @@ -191,7 +191,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + "huhei") + self.assertEqual(result.getvalue(), start + "huhei") def test_xmlgen_content_empty(self): result = StringIO() @@ -203,7 +203,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + "huhei") + self.assertEqual(result.getvalue(), start + "huhei") def test_xmlgen_pi(self): result = StringIO() @@ -215,7 +215,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + "") + self.assertEqual(result.getvalue(), start + "") def test_xmlgen_content_escape(self): result = StringIO() @@ -227,7 +227,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), + self.assertEqual(result.getvalue(), start + "<huhei&") def test_xmlgen_attr_escape(self): @@ -245,7 +245,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + + self.assertEqual(result.getvalue(), start + ("" "" "")) @@ -260,7 +260,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + " ") + self.assertEqual(result.getvalue(), start + " ") def test_xmlgen_ignorable_empty(self): result = StringIO() @@ -272,7 +272,7 @@ gen.endElement("doc") gen.endDocument() - self.assertEquals(result.getvalue(), start + " ") + self.assertEqual(result.getvalue(), start + " ") def test_xmlgen_ns(self): result = StringIO() @@ -288,7 +288,7 @@ gen.endPrefixMapping("ns1") gen.endDocument() - self.assertEquals(result.getvalue(), start + \ + self.assertEqual(result.getvalue(), start + \ ('' % ns_uri)) @@ -306,7 +306,7 @@ gen.endPrefixMapping("ns1") gen.endDocument() - self.assertEquals(result.getvalue(), start + \ + self.assertEqual(result.getvalue(), start + \ ('' % ns_uri)) @@ -319,7 +319,7 @@ gen.endElementNS((None, 'a'), 'a') gen.endDocument() - self.assertEquals(result.getvalue(), start+'') + self.assertEqual(result.getvalue(), 
start+'') def test_1463026_1_empty(self): result = StringIO() @@ -330,7 +330,7 @@ gen.endElementNS((None, 'a'), 'a') gen.endDocument() - self.assertEquals(result.getvalue(), start+'') + self.assertEqual(result.getvalue(), start+'') def test_1463026_2(self): result = StringIO() @@ -343,7 +343,7 @@ gen.endPrefixMapping(None) gen.endDocument() - self.assertEquals(result.getvalue(), start+'') + self.assertEqual(result.getvalue(), start+'') def test_1463026_2_empty(self): result = StringIO() @@ -356,7 +356,7 @@ gen.endPrefixMapping(None) gen.endDocument() - self.assertEquals(result.getvalue(), start+'') + self.assertEqual(result.getvalue(), start+'') def test_1463026_3(self): result = StringIO() @@ -369,7 +369,7 @@ gen.endPrefixMapping('my') gen.endDocument() - self.assertEquals(result.getvalue(), + self.assertEqual(result.getvalue(), start+'') def test_1463026_3_empty(self): @@ -383,7 +383,7 @@ gen.endPrefixMapping('my') gen.endDocument() - self.assertEquals(result.getvalue(), + self.assertEqual(result.getvalue(), start+'') def test_5027_1(self): @@ -406,11 +406,11 @@ parser.setContentHandler(gen) parser.parse(test_xml) - self.assertEquals(result.getvalue(), - start + ( - '' - 'Hello' - '')) + self.assertEqual(result.getvalue(), + start + ( + '' + 'Hello' + '')) def test_5027_2(self): # The xml prefix (as in xml:lang below) is reserved and bound by @@ -434,11 +434,11 @@ gen.endPrefixMapping('a') gen.endDocument() - self.assertEquals(result.getvalue(), - start + ( - '' - 'Hello' - '')) + self.assertEqual(result.getvalue(), + start + ( + '' + 'Hello' + '')) class XMLFilterBaseTest(unittest.TestCase): @@ -455,7 +455,7 @@ filter.endElement("doc") filter.endDocument() - self.assertEquals(result.getvalue(), start + "content ") + self.assertEqual(result.getvalue(), start + "content ") # =========================================================================== # @@ -479,7 +479,7 @@ with open(TEST_XMLFILE) as f: parser.parse(f) - self.assertEquals(result.getvalue(), xml_test_out) + self.assertEqual(result.getvalue(), xml_test_out) # ===== DTDHandler support @@ -507,9 +507,9 @@ parser.feed('') parser.close() - self.assertEquals(handler._notations, + self.assertEqual(handler._notations, [("GIF", "-//CompuServe//NOTATION Graphics Interchange Format 89a//EN", None)]) - self.assertEquals(handler._entities, [("img", None, "expat.gif", "GIF")]) + self.assertEqual(handler._entities, [("img", None, "expat.gif", "GIF")]) # ===== EntityResolver support @@ -532,8 +532,8 @@ parser.feed('&test;') parser.close() - self.assertEquals(result.getvalue(), start + - "") + self.assertEqual(result.getvalue(), start + + "") # ===== Attributes support @@ -585,18 +585,18 @@ attrs = gather._attrs - self.assertEquals(attrs.getLength(), 1) - self.assertEquals(attrs.getNames(), [(ns_uri, "attr")]) + self.assertEqual(attrs.getLength(), 1) + self.assertEqual(attrs.getNames(), [(ns_uri, "attr")]) self.assertTrue((attrs.getQNames() == [] or attrs.getQNames() == ["ns:attr"])) - self.assertEquals(len(attrs), 1) + self.assertEqual(len(attrs), 1) self.assertIn((ns_uri, "attr"), attrs) - self.assertEquals(attrs.get((ns_uri, "attr")), "val") - self.assertEquals(attrs.get((ns_uri, "attr"), 25), "val") - self.assertEquals(list(attrs.items()), [((ns_uri, "attr"), "val")]) - self.assertEquals(list(attrs.values()), ["val"]) - self.assertEquals(attrs.getValue((ns_uri, "attr")), "val") - self.assertEquals(attrs[(ns_uri, "attr")], "val") + self.assertEqual(attrs.get((ns_uri, "attr")), "val") + self.assertEqual(attrs.get((ns_uri, "attr"), 25), 
"val") + self.assertEqual(list(attrs.items()), [((ns_uri, "attr"), "val")]) + self.assertEqual(list(attrs.values()), ["val"]) + self.assertEqual(attrs.getValue((ns_uri, "attr")), "val") + self.assertEqual(attrs[(ns_uri, "attr")], "val") # ===== InputSource support @@ -608,7 +608,7 @@ parser.setContentHandler(xmlgen) parser.parse(TEST_XMLFILE) - self.assertEquals(result.getvalue(), xml_test_out) + self.assertEqual(result.getvalue(), xml_test_out) def test_expat_inpsource_sysid(self): parser = create_parser() @@ -618,7 +618,7 @@ parser.setContentHandler(xmlgen) parser.parse(InputSource(TEST_XMLFILE)) - self.assertEquals(result.getvalue(), xml_test_out) + self.assertEqual(result.getvalue(), xml_test_out) def test_expat_inpsource_stream(self): parser = create_parser() @@ -631,7 +631,7 @@ inpsrc.setByteStream(f) parser.parse(inpsrc) - self.assertEquals(result.getvalue(), xml_test_out) + self.assertEqual(result.getvalue(), xml_test_out) # ===== IncrementalParser support @@ -645,7 +645,7 @@ parser.feed("") parser.close() - self.assertEquals(result.getvalue(), start + "") + self.assertEqual(result.getvalue(), start + "") def test_expat_incremental_reset(self): result = StringIO() @@ -666,7 +666,7 @@ parser.feed("") parser.close() - self.assertEquals(result.getvalue(), start + "text") + self.assertEqual(result.getvalue(), start + "text") # ===== Locator support @@ -680,9 +680,9 @@ parser.feed("") parser.close() - self.assertEquals(parser.getSystemId(), None) - self.assertEquals(parser.getPublicId(), None) - self.assertEquals(parser.getLineNumber(), 1) + self.assertEqual(parser.getSystemId(), None) + self.assertEqual(parser.getPublicId(), None) + self.assertEqual(parser.getLineNumber(), 1) def test_expat_locator_withinfo(self): result = StringIO() @@ -691,8 +691,8 @@ parser.setContentHandler(xmlgen) parser.parse(TEST_XMLFILE) - self.assertEquals(parser.getSystemId(), TEST_XMLFILE) - self.assertEquals(parser.getPublicId(), None) + self.assertEqual(parser.getSystemId(), TEST_XMLFILE) + self.assertEqual(parser.getPublicId(), None) # =========================================================================== @@ -713,7 +713,7 @@ parser.parse(source) self.fail() except SAXException as e: - self.assertEquals(e.getSystemId(), name) + self.assertEqual(e.getSystemId(), name) def test_expat_incomplete(self): parser = create_parser() @@ -777,21 +777,21 @@ attrs = AttributesNSImpl({(ns_uri, "attr") : "val"}, {(ns_uri, "attr") : "ns:attr"}) - self.assertEquals(attrs.getLength(), 1) - self.assertEquals(attrs.getNames(), [(ns_uri, "attr")]) - self.assertEquals(attrs.getQNames(), ["ns:attr"]) - self.assertEquals(len(attrs), 1) + self.assertEqual(attrs.getLength(), 1) + self.assertEqual(attrs.getNames(), [(ns_uri, "attr")]) + self.assertEqual(attrs.getQNames(), ["ns:attr"]) + self.assertEqual(len(attrs), 1) self.assertIn((ns_uri, "attr"), attrs) - self.assertEquals(list(attrs.keys()), [(ns_uri, "attr")]) - self.assertEquals(attrs.get((ns_uri, "attr")), "val") - self.assertEquals(attrs.get((ns_uri, "attr"), 25), "val") - self.assertEquals(list(attrs.items()), [((ns_uri, "attr"), "val")]) - self.assertEquals(list(attrs.values()), ["val"]) - self.assertEquals(attrs.getValue((ns_uri, "attr")), "val") - self.assertEquals(attrs.getValueByQName("ns:attr"), "val") - self.assertEquals(attrs.getNameByQName("ns:attr"), (ns_uri, "attr")) - self.assertEquals(attrs[(ns_uri, "attr")], "val") - self.assertEquals(attrs.getQNameByName((ns_uri, "attr")), "ns:attr") + self.assertEqual(list(attrs.keys()), [(ns_uri, "attr")]) + 
self.assertEqual(attrs.get((ns_uri, "attr")), "val") + self.assertEqual(attrs.get((ns_uri, "attr"), 25), "val") + self.assertEqual(list(attrs.items()), [((ns_uri, "attr"), "val")]) + self.assertEqual(list(attrs.values()), ["val"]) + self.assertEqual(attrs.getValue((ns_uri, "attr")), "val") + self.assertEqual(attrs.getValueByQName("ns:attr"), "val") + self.assertEqual(attrs.getNameByQName("ns:attr"), (ns_uri, "attr")) + self.assertEqual(attrs[(ns_uri, "attr")], "val") + self.assertEqual(attrs.getQNameByName((ns_uri, "attr")), "ns:attr") # During the development of Python 2.5, an attempt to move the "xml" @@ -827,7 +827,7 @@ try: import xml.sax.expatreader module = xml.sax.expatreader - self.assertEquals(module.__name__, "xml.sax.expatreader") + self.assertEqual(module.__name__, "xml.sax.expatreader") finally: sys.modules.update(old_modules) Modified: python/branches/pep-3151/Lib/test/test_scope.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_scope.py (original) +++ python/branches/pep-3151/Lib/test/test_scope.py Sat Feb 26 08:16:32 2011 @@ -1,5 +1,5 @@ import unittest -from test.support import check_syntax_error, run_unittest +from test.support import check_syntax_error, cpython_only, run_unittest class ScopeTests(unittest.TestCase): @@ -496,23 +496,22 @@ self.assertNotIn("x", varnames) self.assertIn("y", varnames) + @cpython_only def testLocalsClass_WithTrace(self): # Issue23728: after the trace function returns, the locals() # dictionary is used to update all variables, this used to # include free variables. But in class statements, free # variables are not inserted... import sys + self.addCleanup(sys.settrace, sys.gettrace()) sys.settrace(lambda a,b,c:None) - try: - x = 12 + x = 12 - class C: - def f(self): - return x + class C: + def f(self): + return x - self.assertEquals(x, 12) # Used to raise UnboundLocalError - finally: - sys.settrace(None) + self.assertEqual(x, 12) # Used to raise UnboundLocalError def testBoundAndFree(self): # var is bound and free in class @@ -527,6 +526,7 @@ inst = f(3)() self.assertEqual(inst.a, inst.m()) + @cpython_only def testInteractionWithTraceFunc(self): import sys @@ -543,6 +543,7 @@ class TestClass: pass + self.addCleanup(sys.settrace, sys.gettrace()) sys.settrace(tracer) adaptgetter("foo", TestClass, (1, "")) sys.settrace(None) Modified: python/branches/pep-3151/Lib/test/test_shelve.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_shelve.py (original) +++ python/branches/pep-3151/Lib/test/test_shelve.py Sat Feb 26 08:16:32 2011 @@ -2,7 +2,7 @@ import shelve import glob from test import support -from collections import MutableMapping +from collections.abc import MutableMapping from test.test_dbm import dbm_iterator def L1(s): @@ -122,6 +122,19 @@ self.assertEqual(len(d1), 1) self.assertEqual(len(d2), 1) + def test_keyencoding(self): + d = {} + key = 'P????p' + # the default keyencoding is utf-8 + shelve.Shelf(d)[key] = [1] + self.assertIn(key.encode('utf-8'), d) + # but a different one can be given + shelve.Shelf(d, keyencoding='latin-1')[key] = [1] + self.assertIn(key.encode('latin-1'), d) + # with all consequences + s = shelve.Shelf(d, keyencoding='ascii') + self.assertRaises(UnicodeEncodeError, s.__setitem__, key, [1]) + def test_writeback_also_writes_immediately(self): # Issue 5754 d = {} Modified: python/branches/pep-3151/Lib/test/test_shutil.py 
============================================================================== --- python/branches/pep-3151/Lib/test/test_shutil.py (original) +++ python/branches/pep-3151/Lib/test/test_shutil.py Sat Feb 26 08:16:32 2011 @@ -271,24 +271,35 @@ shutil.rmtree(src_dir) shutil.rmtree(os.path.dirname(dst_dir)) - @support.skip_unless_symlink + @unittest.skipUnless(hasattr(os, 'link'), 'requires os.link') def test_dont_copy_file_onto_link_to_itself(self): + # Temporarily disable test on Windows. + if os.name == 'nt': + return # bug 851123. os.mkdir(TESTFN) src = os.path.join(TESTFN, 'cheese') dst = os.path.join(TESTFN, 'shop') try: - f = open(src, 'w') - f.write('cheddar') - f.close() - - if hasattr(os, "link"): - os.link(src, dst) - self.assertRaises(shutil.Error, shutil.copyfile, src, dst) - with open(src, 'r') as f: - self.assertEqual(f.read(), 'cheddar') - os.remove(dst) + with open(src, 'w') as f: + f.write('cheddar') + os.link(src, dst) + self.assertRaises(shutil.Error, shutil.copyfile, src, dst) + with open(src, 'r') as f: + self.assertEqual(f.read(), 'cheddar') + os.remove(dst) + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + @support.skip_unless_symlink + def test_dont_copy_file_onto_symlink_to_itself(self): + # bug 851123. + os.mkdir(TESTFN) + src = os.path.join(TESTFN, 'cheese') + dst = os.path.join(TESTFN, 'shop') + try: + with open(src, 'w') as f: + f.write('cheddar') # Using `src` here would mean we end up with a symlink pointing # to TESTFN/TESTFN/cheese, while it should point at # TESTFN/cheese. @@ -298,10 +309,7 @@ self.assertEqual(f.read(), 'cheddar') os.remove(dst) finally: - try: - shutil.rmtree(TESTFN) - except OSError: - pass + shutil.rmtree(TESTFN, ignore_errors=True) @support.skip_unless_symlink def test_rmtree_on_symlink(self): @@ -328,26 +336,26 @@ finally: os.remove(TESTFN) - @unittest.skipUnless(hasattr(os, 'mkfifo'), 'requires os.mkfifo') - def test_copytree_named_pipe(self): - os.mkdir(TESTFN) - try: - subdir = os.path.join(TESTFN, "subdir") - os.mkdir(subdir) - pipe = os.path.join(subdir, "mypipe") - os.mkfifo(pipe) + @support.skip_unless_symlink + def test_copytree_named_pipe(self): + os.mkdir(TESTFN) try: - shutil.copytree(TESTFN, TESTFN2) - except shutil.Error as e: - errors = e.args[0] - self.assertEqual(len(errors), 1) - src, dst, error_msg = errors[0] - self.assertEqual("`%s` is a named pipe" % pipe, error_msg) - else: - self.fail("shutil.Error should have been raised") - finally: - shutil.rmtree(TESTFN, ignore_errors=True) - shutil.rmtree(TESTFN2, ignore_errors=True) + subdir = os.path.join(TESTFN, "subdir") + os.mkdir(subdir) + pipe = os.path.join(subdir, "mypipe") + os.mkfifo(pipe) + try: + shutil.copytree(TESTFN, TESTFN2) + except shutil.Error as e: + errors = e.args[0] + self.assertEqual(len(errors), 1) + src, dst, error_msg = errors[0] + self.assertEqual("`%s` is a named pipe" % pipe, error_msg) + else: + self.fail("shutil.Error should have been raised") + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + shutil.rmtree(TESTFN2, ignore_errors=True) def test_copytree_special_func(self): @@ -362,7 +370,7 @@ copied.append((src, dst)) shutil.copytree(src_dir, dst_dir, copy_function=_copy) - self.assertEquals(len(copied), 2) + self.assertEqual(len(copied), 2) @support.skip_unless_symlink def test_copytree_dangling_symlinks(self): @@ -477,7 +485,7 @@ self.assertTrue(os.path.exists(tarball2)) # let's compare both tarballs - self.assertEquals(self._tarinfo(tarball), self._tarinfo(tarball2)) + self.assertEqual(self._tarinfo(tarball), 
self._tarinfo(tarball2)) # trying an uncompressed one base_name = os.path.join(tmpdir2, 'archive') @@ -572,8 +580,8 @@ archive = tarfile.open(archive_name) try: for member in archive.getmembers(): - self.assertEquals(member.uid, 0) - self.assertEquals(member.gid, 0) + self.assertEqual(member.uid, 0) + self.assertEqual(member.gid, 0) finally: archive.close() @@ -588,7 +596,7 @@ make_archive('xxx', 'xxx', root_dir=self.mkdtemp()) except Exception: pass - self.assertEquals(os.getcwd(), current_dir) + self.assertEqual(os.getcwd(), current_dir) finally: unregister_archive_format('xxx') @@ -635,16 +643,16 @@ # let's try to unpack it now unpack_archive(filename, tmpdir2) diff = self._compare_dirs(tmpdir, tmpdir2) - self.assertEquals(diff, []) + self.assertEqual(diff, []) def test_unpack_registery(self): formats = get_unpack_formats() def _boo(filename, extract_dir, extra): - self.assertEquals(extra, 1) - self.assertEquals(filename, 'stuff.boo') - self.assertEquals(extract_dir, 'xx') + self.assertEqual(extra, 1) + self.assertEqual(filename, 'stuff.boo') + self.assertEqual(extract_dir, 'xx') register_unpack_format('Boo', ['.boo', '.b2'], _boo, [('extra', 1)]) unpack_archive('stuff.boo', 'xx') @@ -661,7 +669,7 @@ # let's leave a clean state unregister_unpack_format('Boo2') - self.assertEquals(get_unpack_formats(), formats) + self.assertEqual(get_unpack_formats(), formats) class TestMove(unittest.TestCase): Modified: python/branches/pep-3151/Lib/test/test_signal.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_signal.py (original) +++ python/branches/pep-3151/Lib/test/test_signal.py Sat Feb 26 08:16:32 2011 @@ -203,10 +203,10 @@ def test_getsignal(self): hup = signal.signal(signal.SIGHUP, self.trivial_signal_handler) - self.assertEquals(signal.getsignal(signal.SIGHUP), - self.trivial_signal_handler) + self.assertEqual(signal.getsignal(signal.SIGHUP), + self.trivial_signal_handler) signal.signal(signal.SIGHUP, hup) - self.assertEquals(signal.getsignal(signal.SIGHUP), hup) + self.assertEqual(signal.getsignal(signal.SIGHUP), hup) @unittest.skipUnless(sys.platform == "win32", "Windows specific") @@ -456,9 +456,9 @@ "high") # virtual itimer should be (0.0, 0.0) now - self.assertEquals(signal.getitimer(self.itimer), (0.0, 0.0)) + self.assertEqual(signal.getitimer(self.itimer), (0.0, 0.0)) # and the handler should have been called - self.assertEquals(self.hndl_called, True) + self.assertEqual(self.hndl_called, True) # Issue 3864, unknown if this affects earlier versions of freebsd also @unittest.skipIf(sys.platform=='freebsd6', @@ -479,7 +479,7 @@ "high") # profiling itimer should be (0.0, 0.0) now - self.assertEquals(signal.getitimer(self.itimer), (0.0, 0.0)) + self.assertEqual(signal.getitimer(self.itimer), (0.0, 0.0)) # and the handler should have been called self.assertEqual(self.hndl_called, True) Modified: python/branches/pep-3151/Lib/test/test_site.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_site.py (original) +++ python/branches/pep-3151/Lib/test/test_site.py Sat Feb 26 08:16:32 2011 @@ -6,9 +6,11 @@ """ import unittest from test.support import run_unittest, TESTFN, EnvironmentVarGuard +from test.support import captured_stderr import builtins import os import sys +import re import encodings import subprocess import sysconfig @@ -90,6 +92,58 @@ finally: pth_file.cleanup() + def make_pth(self, contents, pth_dir='.', pth_name=TESTFN): + # Create a 
.pth file and return its (abspath, basename). + pth_dir = os.path.abspath(pth_dir) + pth_basename = pth_name + '.pth' + pth_fn = os.path.join(pth_dir, pth_basename) + pth_file = open(pth_fn, 'w', encoding='utf-8') + self.addCleanup(lambda: os.remove(pth_fn)) + pth_file.write(contents) + pth_file.close() + return pth_dir, pth_basename + + def test_addpackage_import_bad_syntax(self): + # Issue 10642 + pth_dir, pth_fn = self.make_pth("import bad)syntax\n") + with captured_stderr() as err_out: + site.addpackage(pth_dir, pth_fn, set()) + self.assertRegex(err_out.getvalue(), "line 1") + self.assertRegex(err_out.getvalue(), + re.escape(os.path.join(pth_dir, pth_fn))) + # XXX: the previous two should be independent checks so that the + # order doesn't matter. The next three could be a single check + # but my regex foo isn't good enough to write it. + self.assertRegex(err_out.getvalue(), 'Traceback') + self.assertRegex(err_out.getvalue(), r'import bad\)syntax') + self.assertRegex(err_out.getvalue(), 'SyntaxError') + + def test_addpackage_import_bad_exec(self): + # Issue 10642 + pth_dir, pth_fn = self.make_pth("randompath\nimport nosuchmodule\n") + with captured_stderr() as err_out: + site.addpackage(pth_dir, pth_fn, set()) + self.assertRegex(err_out.getvalue(), "line 2") + self.assertRegex(err_out.getvalue(), + re.escape(os.path.join(pth_dir, pth_fn))) + # XXX: ditto previous XXX comment. + self.assertRegex(err_out.getvalue(), 'Traceback') + self.assertRegex(err_out.getvalue(), 'ImportError') + + @unittest.skipIf(sys.platform == "win32", "Windows does not raise an " + "error for file paths containing null characters") + def test_addpackage_import_bad_pth_file(self): + # Issue 5258 + pth_dir, pth_fn = self.make_pth("abc\x00def\n") + with captured_stderr() as err_out: + site.addpackage(pth_dir, pth_fn, set()) + self.assertRegex(err_out.getvalue(), "line 1") + self.assertRegex(err_out.getvalue(), + re.escape(os.path.join(pth_dir, pth_fn))) + # XXX: ditto previous XXX comment. 
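
For orientation, the .pth processing that these Issue 10642 tests exercise can be sketched standalone (the file name and temporary directory below are illustrative, not taken from the patch): each line of a .pth file is either a directory to append to sys.path or, when it starts with "import", a statement that site.addpackage() executes, and a failing line is reported on stderr rather than aborting interpreter startup.

    import os
    import site
    import tempfile

    pth_dir = tempfile.mkdtemp()
    with open(os.path.join(pth_dir, 'demo.pth'), 'w') as f:
        # One path entry and one deliberately broken import line.
        f.write('some/extra/dir\n')
        f.write('import bad)syntax\n')

    # The nonexistent path entry is silently skipped; the broken import line
    # is reported on stderr with a traceback, and processing continues.
    site.addpackage(pth_dir, 'demo.pth', set())
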
+ self.assertRegex(err_out.getvalue(), 'Traceback') + self.assertRegex(err_out.getvalue(), 'TypeError') + def test_addsitedir(self): # Same tests for test_addpackage since addsitedir() essentially just # calls addpackage() for every .pth file in the directory @@ -107,12 +161,16 @@ usersite = site.USER_SITE self.assertIn(usersite, sys.path) + env = os.environ.copy() rc = subprocess.call([sys.executable, '-c', - 'import sys; sys.exit(%r in sys.path)' % usersite]) + 'import sys; sys.exit(%r in sys.path)' % usersite], + env=env) self.assertEqual(rc, 1) + env = os.environ.copy() rc = subprocess.call([sys.executable, '-s', '-c', - 'import sys; sys.exit(%r in sys.path)' % usersite]) + 'import sys; sys.exit(%r in sys.path)' % usersite], + env=env) self.assertEqual(rc, 0) env = os.environ.copy() @@ -134,7 +192,7 @@ user_base = site.getuserbase() # the call sets site.USER_BASE - self.assertEquals(site.USER_BASE, user_base) + self.assertEqual(site.USER_BASE, user_base) # let's set PYTHONUSERBASE and see if it uses it site.USER_BASE = None @@ -152,7 +210,7 @@ user_site = site.getusersitepackages() # the call sets USER_BASE *and* USER_SITE - self.assertEquals(site.USER_SITE, user_site) + self.assertEqual(site.USER_SITE, user_site) self.assertTrue(user_site.startswith(site.USER_BASE), user_site) def test_getsitepackages(self): @@ -162,19 +220,19 @@ if sys.platform in ('os2emx', 'riscos'): self.assertEqual(len(dirs), 1) wanted = os.path.join('xoxo', 'Lib', 'site-packages') - self.assertEquals(dirs[0], wanted) + self.assertEqual(dirs[0], wanted) elif os.sep == '/': self.assertEqual(len(dirs), 2) wanted = os.path.join('xoxo', 'lib', 'python' + sys.version[:3], 'site-packages') - self.assertEquals(dirs[0], wanted) + self.assertEqual(dirs[0], wanted) wanted = os.path.join('xoxo', 'lib', 'site-python') - self.assertEquals(dirs[1], wanted) + self.assertEqual(dirs[1], wanted) else: self.assertEqual(len(dirs), 2) - self.assertEquals(dirs[0], 'xoxo') + self.assertEqual(dirs[0], 'xoxo') wanted = os.path.join('xoxo', 'lib', 'site-packages') - self.assertEquals(dirs[1], wanted) + self.assertEqual(dirs[1], wanted) # let's try the specific Apple location if (sys.platform == "darwin" and @@ -184,7 +242,7 @@ self.assertEqual(len(dirs), 3) wanted = os.path.join('/Library', 'Python', sys.version[:3], 'site-packages') - self.assertEquals(dirs[2], wanted) + self.assertEqual(dirs[2], wanted) class PthFile(object): """Helper class for handling testing of .pth files""" Modified: python/branches/pep-3151/Lib/test/test_slice.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_slice.py (original) +++ python/branches/pep-3151/Lib/test/test_slice.py Sat Feb 26 08:16:32 2011 @@ -118,7 +118,7 @@ x = X() x[1:2] = 42 - self.assertEquals(tmp, [(slice(1, 2), 42)]) + self.assertEqual(tmp, [(slice(1, 2), 42)]) def test_pickle(self): s = slice(10, 20, 3) Modified: python/branches/pep-3151/Lib/test/test_smtpd.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_smtpd.py (original) +++ python/branches/pep-3151/Lib/test/test_smtpd.py Sat Feb 26 08:16:32 2011 @@ -121,6 +121,24 @@ self.assertEqual(self.channel.socket.last, b'451 Internal confusion\r\n') + def test_command_too_long(self): + self.write_line(b'MAIL from ' + + b'a' * self.channel.command_size_limit + + b'@example') + self.assertEqual(self.channel.socket.last, + b'500 Error: line too long\r\n') + + def test_data_too_long(self): + # Small 
hack. Setting limit to 2K octets here will save us some time. + self.channel.data_size_limit = 2048 + self.write_line(b'MAIL From:eggs at example') + self.write_line(b'RCPT To:spam at example') + self.write_line(b'DATA') + self.write_line(b'A' * self.channel.data_size_limit + + b'A\r\n.') + self.assertEqual(self.channel.socket.last, + b'552 Error: Too much mail data\r\n') + def test_need_MAIL(self): self.write_line(b'RCPT to:spam at example') self.assertEqual(self.channel.socket.last, Modified: python/branches/pep-3151/Lib/test/test_smtplib.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_smtplib.py (original) +++ python/branches/pep-3151/Lib/test/test_smtplib.py Sat Feb 26 08:16:32 2011 @@ -319,12 +319,12 @@ self.assertEqual(self.output.getvalue(), mexpect) debugout = smtpd.DEBUGSTREAM.getvalue() sender = re.compile("^sender: foo at bar.com$", re.MULTILINE) - self.assertRegexpMatches(debugout, sender) + self.assertRegex(debugout, sender) for addr in ('John', 'Sally', 'Fred', 'root at localhost', 'warped at silly.walks.com'): to_addr = re.compile(r"^recips: .*'{}'.*$".format(addr), re.MULTILINE) - self.assertRegexpMatches(debugout, to_addr) + self.assertRegex(debugout, to_addr) def testSendMessageWithSomeAddresses(self): # Make sure nothing breaks if not all of the three 'to' headers exist @@ -347,11 +347,11 @@ self.assertEqual(self.output.getvalue(), mexpect) debugout = smtpd.DEBUGSTREAM.getvalue() sender = re.compile("^sender: foo at bar.com$", re.MULTILINE) - self.assertRegexpMatches(debugout, sender) + self.assertRegex(debugout, sender) for addr in ('John', 'Dinsdale'): to_addr = re.compile(r"^recips: .*'{}'.*$".format(addr), re.MULTILINE) - self.assertRegexpMatches(debugout, to_addr) + self.assertRegex(debugout, to_addr) class NonConnectingTests(unittest.TestCase): Modified: python/branches/pep-3151/Lib/test/test_socket.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_socket.py (original) +++ python/branches/pep-3151/Lib/test/test_socket.py Sat Feb 26 08:16:32 2011 @@ -44,7 +44,7 @@ return 0, 0, 0 HOST = support.HOST -MSG = 'Michael Gilfix was here\u1234\r\n'.encode('utf8') ## test unicode string and carriage return +MSG = 'Michael Gilfix was here\u1234\r\n'.encode('utf-8') ## test unicode string and carriage return SUPPORTS_IPV6 = socket.has_ipv6 and try_address('::1', family=socket.AF_INET6) try: @@ -446,8 +446,8 @@ return # No inet_aton, nothing to check # Test that issue1008086 and issue767150 are fixed. # It must return 4 bytes. 
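
As background for the inet_aton assertions in this hunk, a minimal standalone sketch of the conversion (not part of the patch; it assumes a build whose socket module provides inet_aton and inet_ntoa):

    import socket

    # inet_aton packs a dotted-quad IPv4 address into exactly 4 bytes.
    assert socket.inet_aton('0.0.0.0') == b'\x00' * 4
    assert socket.inet_aton('255.255.255.255') == b'\xff' * 4

    # inet_ntoa is the inverse conversion, from 4 packed bytes back to a string.
    assert socket.inet_ntoa(b'\x01\x02\x03\x04') == '1.2.3.4'
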
- self.assertEquals(b'\x00'*4, socket.inet_aton('0.0.0.0')) - self.assertEquals(b'\xff'*4, socket.inet_aton('255.255.255.255')) + self.assertEqual(b'\x00'*4, socket.inet_aton('0.0.0.0')) + self.assertEqual(b'\xff'*4, socket.inet_aton('255.255.255.255')) def testIPv4toString(self): if not hasattr(socket, 'inet_pton'): @@ -455,16 +455,16 @@ from socket import inet_aton as f, inet_pton, AF_INET g = lambda a: inet_pton(AF_INET, a) - self.assertEquals(b'\x00\x00\x00\x00', f('0.0.0.0')) - self.assertEquals(b'\xff\x00\xff\x00', f('255.0.255.0')) - self.assertEquals(b'\xaa\xaa\xaa\xaa', f('170.170.170.170')) - self.assertEquals(b'\x01\x02\x03\x04', f('1.2.3.4')) - self.assertEquals(b'\xff\xff\xff\xff', f('255.255.255.255')) - - self.assertEquals(b'\x00\x00\x00\x00', g('0.0.0.0')) - self.assertEquals(b'\xff\x00\xff\x00', g('255.0.255.0')) - self.assertEquals(b'\xaa\xaa\xaa\xaa', g('170.170.170.170')) - self.assertEquals(b'\xff\xff\xff\xff', g('255.255.255.255')) + self.assertEqual(b'\x00\x00\x00\x00', f('0.0.0.0')) + self.assertEqual(b'\xff\x00\xff\x00', f('255.0.255.0')) + self.assertEqual(b'\xaa\xaa\xaa\xaa', f('170.170.170.170')) + self.assertEqual(b'\x01\x02\x03\x04', f('1.2.3.4')) + self.assertEqual(b'\xff\xff\xff\xff', f('255.255.255.255')) + + self.assertEqual(b'\x00\x00\x00\x00', g('0.0.0.0')) + self.assertEqual(b'\xff\x00\xff\x00', g('255.0.255.0')) + self.assertEqual(b'\xaa\xaa\xaa\xaa', g('170.170.170.170')) + self.assertEqual(b'\xff\xff\xff\xff', g('255.255.255.255')) def testIPv6toString(self): if not hasattr(socket, 'inet_pton'): @@ -477,10 +477,10 @@ return f = lambda a: inet_pton(AF_INET6, a) - self.assertEquals(b'\x00' * 16, f('::')) - self.assertEquals(b'\x00' * 16, f('0::0')) - self.assertEquals(b'\x00\x01' + b'\x00' * 14, f('1::')) - self.assertEquals( + self.assertEqual(b'\x00' * 16, f('::')) + self.assertEqual(b'\x00' * 16, f('0::0')) + self.assertEqual(b'\x00\x01' + b'\x00' * 14, f('1::')) + self.assertEqual( b'\x45\xef\x76\xcb\x00\x1a\x56\xef\xaf\xeb\x0b\xac\x19\x24\xae\xae', f('45ef:76cb:1a:56ef:afeb:bac:1924:aeae') ) @@ -491,14 +491,14 @@ from socket import inet_ntoa as f, inet_ntop, AF_INET g = lambda a: inet_ntop(AF_INET, a) - self.assertEquals('1.0.1.0', f(b'\x01\x00\x01\x00')) - self.assertEquals('170.85.170.85', f(b'\xaa\x55\xaa\x55')) - self.assertEquals('255.255.255.255', f(b'\xff\xff\xff\xff')) - self.assertEquals('1.2.3.4', f(b'\x01\x02\x03\x04')) - - self.assertEquals('1.0.1.0', g(b'\x01\x00\x01\x00')) - self.assertEquals('170.85.170.85', g(b'\xaa\x55\xaa\x55')) - self.assertEquals('255.255.255.255', g(b'\xff\xff\xff\xff')) + self.assertEqual('1.0.1.0', f(b'\x01\x00\x01\x00')) + self.assertEqual('170.85.170.85', f(b'\xaa\x55\xaa\x55')) + self.assertEqual('255.255.255.255', f(b'\xff\xff\xff\xff')) + self.assertEqual('1.2.3.4', f(b'\x01\x02\x03\x04')) + + self.assertEqual('1.0.1.0', g(b'\x01\x00\x01\x00')) + self.assertEqual('170.85.170.85', g(b'\xaa\x55\xaa\x55')) + self.assertEqual('255.255.255.255', g(b'\xff\xff\xff\xff')) def testStringToIPv6(self): if not hasattr(socket, 'inet_ntop'): @@ -511,9 +511,9 @@ return f = lambda a: inet_ntop(AF_INET6, a) - self.assertEquals('::', f(b'\x00' * 16)) - self.assertEquals('::1', f(b'\x00' * 15 + b'\x01')) - self.assertEquals( + self.assertEqual('::', f(b'\x00' * 16)) + self.assertEqual('::1', f(b'\x00' * 15 + b'\x01')) + self.assertEqual( 'aef:b01:506:1001:ffff:9997:55:170', f(b'\x0a\xef\x0b\x01\x05\x06\x10\x01\xff\xff\x99\x97\x00\x55\x01\x70') ) @@ -544,7 +544,11 @@ # XXX(nnorwitz): http://tinyurl.com/os5jz seems to 
indicate # it reasonable to get the host's addr in addition to 0.0.0.0. # At least for eCos. This is required for the S/390 to pass. - my_ip_addr = socket.gethostbyname(socket.gethostname()) + try: + my_ip_addr = socket.gethostbyname(socket.gethostname()) + except socket.error: + # Probably name lookup wasn't set up right; skip this test + return self.assertIn(name[0], ("0.0.0.0", my_ip_addr), '%s invalid' % name[0]) self.assertEqual(name[1], port) @@ -663,6 +667,8 @@ type=socket.SOCK_STREAM, proto=0, flags=socket.AI_PASSIVE) self.assertEqual(a, b) + # Issue #6697. + self.assertRaises(UnicodeEncodeError, socket.getaddrinfo, 'localhost', '\uD800') def test_getnameinfo(self): # only IP addresses are allowed @@ -732,6 +738,12 @@ f = None support.gc_collect() + def test_name_closed_socketio(self): + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock: + fp = sock.makefile("rb") + fp.close() + self.assertEqual(repr(fp), "<_io.BufferedReader name=-1>") + @unittest.skipUnless(thread, 'Threading required for this test.') class BasicTCPTest(SocketConnectedTest): @@ -970,6 +982,23 @@ def _testInitNonBlocking(self): pass + def testInheritFlags(self): + # Issue #7995: when calling accept() on a listening socket with a + # timeout, the resulting socket should not be non-blocking. + self.serv.settimeout(10) + try: + conn, addr = self.serv.accept() + message = conn.recv(len(MSG)) + finally: + conn.close() + self.serv.settimeout(None) + + def _testInheritFlags(self): + time.sleep(0.1) + self.cli.connect((HOST, self.port)) + time.sleep(0.5) + self.cli.send(MSG) + def testAccept(self): # Testing non-blocking accept self.serv.setblocking(0) @@ -1036,7 +1065,7 @@ """ bufsize = -1 # Use default buffer size - encoding = 'utf8' + encoding = 'utf-8' errors = 'strict' newline = None @@ -1080,6 +1109,23 @@ self.write_file = None SocketConnectedTest.clientTearDown(self) + def testReadAfterTimeout(self): + # Issue #7322: A file object must disallow further reads + # after a timeout has occurred. 
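
The Issue #7322 behaviour described in the comment above can be reproduced outside the test harness; a rough sketch, assuming a POSIX build since it uses socket.socketpair():

    import socket

    a, b = socket.socketpair()
    a.settimeout(0.5)
    f = a.makefile('rb')
    try:
        f.read(1)            # nothing was sent, so this read times out
    except socket.timeout:
        pass
    try:
        f.read(1)            # further reads on the timed-out object are refused
    except IOError as exc:
        print(exc)           # e.g. "cannot read from timed out object"
    finally:
        f.close()
        b.close()
        a.close()
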
+ self.cli_conn.settimeout(1) + self.read_file.read(3) + # First read raises a timeout + self.assertRaises(socket.timeout, self.read_file.read, 1) + # Second read is disallowed + with self.assertRaises(IOError) as ctx: + self.read_file.read(1) + self.assertIn("cannot read from timed out object", str(ctx.exception)) + + def _testReadAfterTimeout(self): + self.write_file.write(self.write_msg[0:3]) + self.write_file.flush() + self.serv_finished.wait() + def testSmallRead(self): # Performing small file read test first_seg = self.read_file.read(len(self.read_msg)-3) @@ -1222,8 +1268,8 @@ lambda : b"", # XXX(gps): io library does an extra EOF read ]) fo = mock_sock._textiowrap_for_test(buffering=buffering) - self.assertEquals(fo.readline(size), "This is the first line\n") - self.assertEquals(fo.readline(size), "And the second line is here\n") + self.assertEqual(fo.readline(size), "This is the first line\n") + self.assertEqual(fo.readline(size), "And the second line is here\n") def _test_read(self, size=-1, buffering=-1): mock_sock = self.MockSocket(recv_funcs=[ @@ -1240,13 +1286,13 @@ data = b'' else: data = '' - expecting = expecting.decode('utf8') + expecting = expecting.decode('utf-8') while len(data) != len(expecting): part = fo.read(size) if not part: break data += part - self.assertEquals(data, expecting) + self.assertEqual(data, expecting) def test_default(self): self._test_readline() @@ -1270,8 +1316,8 @@ lambda : b"", ]) fo = mock_sock._textiowrap_for_test(buffering=0) - self.assertEquals(fo.readline(size), b"a\n") - self.assertEquals(fo.readline(size), b"Bb") + self.assertEqual(fo.readline(size), b"a\n") + self.assertEqual(fo.readline(size), b"Bb") def test_no_buffer(self): self._test_readline_no_buffer() @@ -1398,7 +1444,7 @@ """Tests for socket.makefile() in text mode (rather than binary)""" read_mode = 'r' - read_msg = MSG.decode('utf8') + read_msg = MSG.decode('utf-8') write_mode = 'wb' write_msg = MSG newline = '' @@ -1410,7 +1456,7 @@ read_mode = 'rb' read_msg = MSG write_mode = 'w' - write_msg = MSG.decode('utf8') + write_msg = MSG.decode('utf-8') newline = '' @@ -1418,9 +1464,9 @@ """Tests for socket.makefile() in text mode (rather than binary)""" read_mode = 'r' - read_msg = MSG.decode('utf8') + read_msg = MSG.decode('utf-8') write_mode = 'w' - write_msg = MSG.decode('utf8') + write_msg = MSG.decode('utf-8') newline = '' @@ -1521,7 +1567,7 @@ self.addCleanup(self.cli.close) finally: socket.setdefaulttimeout(None) - self.assertEquals(self.cli.gettimeout(), 42) + self.assertEqual(self.cli.gettimeout(), 42) testTimeoutNone = _justAccept def _testTimeoutNone(self): @@ -1672,25 +1718,25 @@ def testLinuxAbstractNamespace(self): address = b"\x00python-test-hello\x00\xff" - s1 = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) - s1.bind(address) - s1.listen(1) - s2 = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) - s2.connect(s1.getsockname()) - s1.accept() - self.assertEqual(s1.getsockname(), address) - self.assertEqual(s2.getpeername(), address) + with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s1: + s1.bind(address) + s1.listen(1) + with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s2: + s2.connect(s1.getsockname()) + with s1.accept()[0] as s3: + self.assertEqual(s1.getsockname(), address) + self.assertEqual(s2.getpeername(), address) def testMaxName(self): address = b"\x00" + b"h" * (self.UNIX_PATH_MAX - 1) - s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) - s.bind(address) - self.assertEqual(s.getsockname(), address) + with 
socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s: + s.bind(address) + self.assertEqual(s.getsockname(), address) def testNameOverflow(self): address = "\x00" + "h" * self.UNIX_PATH_MAX - s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) - self.assertRaises(socket.error, s.bind, address) + with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s: + self.assertRaises(socket.error, s.bind, address) @unittest.skipUnless(thread, 'Threading required for this test.') @@ -1892,10 +1938,10 @@ if v < (2, 6, 28): self.skipTest("Linux kernel 2.6.28 or higher required, not %s" % ".".join(map(str, v))) - s = socket.socket(socket.AF_INET, - socket.SOCK_STREAM | socket.SOCK_CLOEXEC) - self.assertTrue(s.type & socket.SOCK_CLOEXEC) - self.assertTrue(fcntl.fcntl(s, fcntl.F_GETFD) & fcntl.FD_CLOEXEC) + with socket.socket(socket.AF_INET, + socket.SOCK_STREAM | socket.SOCK_CLOEXEC) as s: + self.assertTrue(s.type & socket.SOCK_CLOEXEC) + self.assertTrue(fcntl.fcntl(s, fcntl.F_GETFD) & fcntl.FD_CLOEXEC) @unittest.skipUnless(hasattr(socket, "SOCK_NONBLOCK"), @@ -1916,29 +1962,33 @@ % ".".join(map(str, v))) # a lot of it seems silly and redundant, but I wanted to test that # changing back and forth worked ok - s = socket.socket(socket.AF_INET, - socket.SOCK_STREAM | socket.SOCK_NONBLOCK) - self.checkNonblock(s) - s.setblocking(1) - self.checkNonblock(s, False) - s.setblocking(0) - self.checkNonblock(s) - s.settimeout(None) - self.checkNonblock(s, False) - s.settimeout(2.0) - self.checkNonblock(s, timeout=2.0) - s.setblocking(1) - self.checkNonblock(s, False) + with socket.socket(socket.AF_INET, + socket.SOCK_STREAM | socket.SOCK_NONBLOCK) as s: + self.checkNonblock(s) + s.setblocking(1) + self.checkNonblock(s, False) + s.setblocking(0) + self.checkNonblock(s) + s.settimeout(None) + self.checkNonblock(s, False) + s.settimeout(2.0) + self.checkNonblock(s, timeout=2.0) + s.setblocking(1) + self.checkNonblock(s, False) # defaulttimeout t = socket.getdefaulttimeout() socket.setdefaulttimeout(0.0) - self.checkNonblock(socket.socket()) + with socket.socket() as s: + self.checkNonblock(s) socket.setdefaulttimeout(None) - self.checkNonblock(socket.socket(), False) + with socket.socket() as s: + self.checkNonblock(s, False) socket.setdefaulttimeout(2.0) - self.checkNonblock(socket.socket(), timeout=2.0) + with socket.socket() as s: + self.checkNonblock(s, timeout=2.0) socket.setdefaulttimeout(None) - self.checkNonblock(socket.socket(), False) + with socket.socket() as s: + self.checkNonblock(s, False) socket.setdefaulttimeout(t) Modified: python/branches/pep-3151/Lib/test/test_socketserver.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_socketserver.py (original) +++ python/branches/pep-3151/Lib/test/test_socketserver.py Sat Feb 26 08:16:32 2011 @@ -57,8 +57,8 @@ os._exit(72) yield None pid2, status = os.waitpid(pid, 0) - testcase.assertEquals(pid2, pid) - testcase.assertEquals(72 << 8, status) + testcase.assertEqual(pid2, pid) + testcase.assertEqual(72 << 8, status) @unittest.skipUnless(threading, 'Threading required for this test.') @@ -120,7 +120,7 @@ if verbose: print("creating server") server = MyServer(addr, MyHandler) - self.assertEquals(server.server_address, server.socket.getsockname()) + self.assertEqual(server.server_address, server.socket.getsockname()) return server @unittest.skipUnless(threading, 'Threading required for this test.') @@ -151,6 +151,7 @@ if verbose: print("waiting for server") server.shutdown() t.join() + 
server.server_close() if verbose: print("done") def stream_examine(self, proto, addr): @@ -161,7 +162,7 @@ while data and b'\n' not in buf: data = receive(s, 100) buf += data - self.assertEquals(buf, TEST_STR) + self.assertEqual(buf, TEST_STR) s.close() def dgram_examine(self, proto, addr): @@ -171,7 +172,7 @@ while data and b'\n' not in buf: data = receive(s, 100) buf += data - self.assertEquals(buf, TEST_STR) + self.assertEqual(buf, TEST_STR) s.close() def test_TCPServer(self): @@ -270,6 +271,7 @@ s.shutdown() for t, s in threads: t.join() + s.server_close() def test_main(): Modified: python/branches/pep-3151/Lib/test/test_ssl.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_ssl.py (original) +++ python/branches/pep-3151/Lib/test/test_ssl.py Sat Feb 26 08:16:32 2011 @@ -185,17 +185,17 @@ def test_errors(self): sock = socket.socket() - self.assertRaisesRegexp(ValueError, + self.assertRaisesRegex(ValueError, "certfile must be specified", ssl.wrap_socket, sock, keyfile=CERTFILE) - self.assertRaisesRegexp(ValueError, + self.assertRaisesRegex(ValueError, "certfile must be specified for server-side operations", ssl.wrap_socket, sock, server_side=True) - self.assertRaisesRegexp(ValueError, + self.assertRaisesRegex(ValueError, "certfile must be specified for server-side operations", ssl.wrap_socket, sock, server_side=True, certfile="") s = ssl.wrap_socket(sock, server_side=True, certfile=CERTFILE) - self.assertRaisesRegexp(ValueError, "can't connect in server-side mode", + self.assertRaisesRegex(ValueError, "can't connect in server-side mode", s.connect, (HOST, 8080)) with self.assertRaises(IOError) as cm: with socket.socket() as sock: @@ -310,7 +310,7 @@ ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) ctx.set_ciphers("ALL") ctx.set_ciphers("DEFAULT") - with self.assertRaisesRegexp(ssl.SSLError, "No cipher can be selected"): + with self.assertRaisesRegex(ssl.SSLError, "No cipher can be selected"): ctx.set_ciphers("^$:,;?*'dorothyx") @skip_if_broken_ubuntu_ssl @@ -358,24 +358,24 @@ with self.assertRaises(IOError) as cm: ctx.load_cert_chain(WRONGCERT) self.assertEqual(cm.exception.errno, errno.ENOENT) - with self.assertRaisesRegexp(ssl.SSLError, "PEM lib"): + with self.assertRaisesRegex(ssl.SSLError, "PEM lib"): ctx.load_cert_chain(BADCERT) - with self.assertRaisesRegexp(ssl.SSLError, "PEM lib"): + with self.assertRaisesRegex(ssl.SSLError, "PEM lib"): ctx.load_cert_chain(EMPTYCERT) # Separate key and cert ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) ctx.load_cert_chain(ONLYCERT, ONLYKEY) ctx.load_cert_chain(certfile=ONLYCERT, keyfile=ONLYKEY) ctx.load_cert_chain(certfile=BYTES_ONLYCERT, keyfile=BYTES_ONLYKEY) - with self.assertRaisesRegexp(ssl.SSLError, "PEM lib"): + with self.assertRaisesRegex(ssl.SSLError, "PEM lib"): ctx.load_cert_chain(ONLYCERT) - with self.assertRaisesRegexp(ssl.SSLError, "PEM lib"): + with self.assertRaisesRegex(ssl.SSLError, "PEM lib"): ctx.load_cert_chain(ONLYKEY) - with self.assertRaisesRegexp(ssl.SSLError, "PEM lib"): + with self.assertRaisesRegex(ssl.SSLError, "PEM lib"): ctx.load_cert_chain(certfile=ONLYKEY, keyfile=ONLYCERT) # Mismatching key and cert ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) - with self.assertRaisesRegexp(ssl.SSLError, "key values mismatch"): + with self.assertRaisesRegex(ssl.SSLError, "key values mismatch"): ctx.load_cert_chain(SVN_PYTHON_ORG_ROOT_CERT, ONLYKEY) def test_load_verify_locations(self): @@ -389,11 +389,14 @@ with self.assertRaises(IOError) as cm: 
ctx.load_verify_locations(WRONGCERT) self.assertEqual(cm.exception.errno, errno.ENOENT) - with self.assertRaisesRegexp(ssl.SSLError, "PEM lib"): + with self.assertRaisesRegex(ssl.SSLError, "PEM lib"): ctx.load_verify_locations(BADCERT) ctx.load_verify_locations(CERTFILE, CAPATH) ctx.load_verify_locations(CERTFILE, capath=BYTES_CAPATH) + # Issue #10989: crash if the second argument type is invalid + self.assertRaises(TypeError, ctx.load_verify_locations, None, True) + @skip_if_broken_ubuntu_ssl def test_session_stats(self): for proto in PROTOCOLS: @@ -412,6 +415,12 @@ 'cache_full': 0, }) + def test_set_default_verify_paths(self): + # There's not much we can do to test that it acts as expected, + # so just check it doesn't crash or raise an exception. + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + ctx.set_default_verify_paths() + class NetworkedTests(unittest.TestCase): @@ -428,8 +437,8 @@ # this should fail because we have no verification certs s = ssl.wrap_socket(socket.socket(socket.AF_INET), cert_reqs=ssl.CERT_REQUIRED) - self.assertRaisesRegexp(ssl.SSLError, "certificate verify failed", - s.connect, ("svn.python.org", 443)) + self.assertRaisesRegex(ssl.SSLError, "certificate verify failed", + s.connect, ("svn.python.org", 443)) s.close() # this should succeed because we specify the root cert @@ -463,7 +472,7 @@ # This should fail because we have no verification certs ctx.verify_mode = ssl.CERT_REQUIRED s = ctx.wrap_socket(socket.socket(socket.AF_INET)) - self.assertRaisesRegexp(ssl.SSLError, "certificate verify failed", + self.assertRaisesRegex(ssl.SSLError, "certificate verify failed", s.connect, ("svn.python.org", 443)) s.close() # This should succeed because we specify the root cert @@ -581,7 +590,7 @@ cert_reqs=ssl.CERT_NONE, ciphers="DEFAULT") s.connect(remote) # Error checking can happen at instantiation or when connecting - with self.assertRaisesRegexp(ssl.SSLError, "No cipher can be selected"): + with self.assertRaisesRegex(ssl.SSLError, "No cipher can be selected"): with socket.socket(socket.AF_INET) as sock: s = ssl.wrap_socket(sock, cert_reqs=ssl.CERT_NONE, ciphers="^$:,;?*'dorothyx") @@ -593,10 +602,10 @@ # SHA256 was added in OpenSSL 0.9.8 if ssl.OPENSSL_VERSION_INFO < (0, 9, 8, 0, 15): self.skipTest("SHA256 not available on %r" % ssl.OPENSSL_VERSION) - # NOTE: https://sha256.tbs-internet.com is another possible test host - remote = ("sha2.hboeck.de", 443) + # https://sha2.hboeck.de/ was used until 2011-01-08 (no route to host) + remote = ("sha256.tbs-internet.com", 443) sha256_cert = os.path.join(os.path.dirname(__file__), "sha256.pem") - with support.transient_internet("sha2.hboeck.de"): + with support.transient_internet("sha256.tbs-internet.com"): s = ssl.wrap_socket(socket.socket(socket.AF_INET), cert_reqs=ssl.CERT_REQUIRED, ca_certs=sha256_cert,) @@ -1493,8 +1502,8 @@ c.settimeout(0.2) c.connect((host, port)) # Will attempt handshake and time out - self.assertRaisesRegexp(ssl.SSLError, "timed out", - ssl.wrap_socket, c) + self.assertRaisesRegex(socket.timeout, "timed out", + ssl.wrap_socket, c) finally: c.close() try: @@ -1502,8 +1511,8 @@ c = ssl.wrap_socket(c) c.settimeout(0.2) # Will attempt handshake and time out - self.assertRaisesRegexp(ssl.SSLError, "timed out", - c.connect, (host, port)) + self.assertRaisesRegex(socket.timeout, "timed out", + c.connect, (host, port)) finally: c.close() finally: Modified: python/branches/pep-3151/Lib/test/test_strlit.py ============================================================================== --- 
python/branches/pep-3151/Lib/test/test_strlit.py (original) +++ python/branches/pep-3151/Lib/test/test_strlit.py Sat Feb 26 08:16:32 2011 @@ -130,7 +130,7 @@ self.assertRaises(SyntaxError, self.check_encoding, "utf-8", extra) def test_file_utf8(self): - self.check_encoding("utf8") + self.check_encoding("utf-8") def test_file_iso_8859_1(self): self.check_encoding("iso-8859-1") Modified: python/branches/pep-3151/Lib/test/test_struct.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_struct.py (original) +++ python/branches/pep-3151/Lib/test/test_struct.py Sat Feb 26 08:16:32 2011 @@ -82,58 +82,52 @@ # Test some of the new features in detail # (format, argument, big-endian result, little-endian result, asymmetric) tests = [ - ('c', 'a', 'a', 'a', 0), - ('xc', 'a', '\0a', '\0a', 0), - ('cx', 'a', 'a\0', 'a\0', 0), - ('s', 'a', 'a', 'a', 0), - ('0s', 'helloworld', '', '', 1), - ('1s', 'helloworld', 'h', 'h', 1), - ('9s', 'helloworld', 'helloworl', 'helloworl', 1), - ('10s', 'helloworld', 'helloworld', 'helloworld', 0), - ('11s', 'helloworld', 'helloworld\0', 'helloworld\0', 1), - ('20s', 'helloworld', 'helloworld'+10*'\0', 'helloworld'+10*'\0', 1), - ('b', 7, '\7', '\7', 0), - ('b', -7, '\371', '\371', 0), - ('B', 7, '\7', '\7', 0), - ('B', 249, '\371', '\371', 0), - ('h', 700, '\002\274', '\274\002', 0), - ('h', -700, '\375D', 'D\375', 0), - ('H', 700, '\002\274', '\274\002', 0), - ('H', 0x10000-700, '\375D', 'D\375', 0), - ('i', 70000000, '\004,\035\200', '\200\035,\004', 0), - ('i', -70000000, '\373\323\342\200', '\200\342\323\373', 0), - ('I', 70000000, '\004,\035\200', '\200\035,\004', 0), - ('I', 0x100000000-70000000, '\373\323\342\200', '\200\342\323\373', 0), - ('l', 70000000, '\004,\035\200', '\200\035,\004', 0), - ('l', -70000000, '\373\323\342\200', '\200\342\323\373', 0), - ('L', 70000000, '\004,\035\200', '\200\035,\004', 0), - ('L', 0x100000000-70000000, '\373\323\342\200', '\200\342\323\373', 0), - ('f', 2.0, '@\000\000\000', '\000\000\000@', 0), - ('d', 2.0, '@\000\000\000\000\000\000\000', - '\000\000\000\000\000\000\000@', 0), - ('f', -2.0, '\300\000\000\000', '\000\000\000\300', 0), - ('d', -2.0, '\300\000\000\000\000\000\000\000', - '\000\000\000\000\000\000\000\300', 0), - ('?', 0, '\0', '\0', 0), - ('?', 3, '\1', '\1', 1), - ('?', True, '\1', '\1', 0), - ('?', [], '\0', '\0', 1), - ('?', (1,), '\1', '\1', 1), + ('c', b'a', b'a', b'a', 0), + ('xc', b'a', b'\0a', b'\0a', 0), + ('cx', b'a', b'a\0', b'a\0', 0), + ('s', b'a', b'a', b'a', 0), + ('0s', b'helloworld', b'', b'', 1), + ('1s', b'helloworld', b'h', b'h', 1), + ('9s', b'helloworld', b'helloworl', b'helloworl', 1), + ('10s', b'helloworld', b'helloworld', b'helloworld', 0), + ('11s', b'helloworld', b'helloworld\0', b'helloworld\0', 1), + ('20s', b'helloworld', b'helloworld'+10*b'\0', b'helloworld'+10*b'\0', 1), + ('b', 7, b'\7', b'\7', 0), + ('b', -7, b'\371', b'\371', 0), + ('B', 7, b'\7', b'\7', 0), + ('B', 249, b'\371', b'\371', 0), + ('h', 700, b'\002\274', b'\274\002', 0), + ('h', -700, b'\375D', b'D\375', 0), + ('H', 700, b'\002\274', b'\274\002', 0), + ('H', 0x10000-700, b'\375D', b'D\375', 0), + ('i', 70000000, b'\004,\035\200', b'\200\035,\004', 0), + ('i', -70000000, b'\373\323\342\200', b'\200\342\323\373', 0), + ('I', 70000000, b'\004,\035\200', b'\200\035,\004', 0), + ('I', 0x100000000-70000000, b'\373\323\342\200', b'\200\342\323\373', 0), + ('l', 70000000, b'\004,\035\200', b'\200\035,\004', 0), + ('l', -70000000, 
b'\373\323\342\200', b'\200\342\323\373', 0), + ('L', 70000000, b'\004,\035\200', b'\200\035,\004', 0), + ('L', 0x100000000-70000000, b'\373\323\342\200', b'\200\342\323\373', 0), + ('f', 2.0, b'@\000\000\000', b'\000\000\000@', 0), + ('d', 2.0, b'@\000\000\000\000\000\000\000', + b'\000\000\000\000\000\000\000@', 0), + ('f', -2.0, b'\300\000\000\000', b'\000\000\000\300', 0), + ('d', -2.0, b'\300\000\000\000\000\000\000\000', + b'\000\000\000\000\000\000\000\300', 0), + ('?', 0, b'\0', b'\0', 0), + ('?', 3, b'\1', b'\1', 1), + ('?', True, b'\1', b'\1', 0), + ('?', [], b'\0', b'\0', 1), + ('?', (1,), b'\1', b'\1', 1), ] for fmt, arg, big, lil, asy in tests: - big = bytes(big, "latin-1") - lil = bytes(lil, "latin-1") for (xfmt, exp) in [('>'+fmt, big), ('!'+fmt, big), ('<'+fmt, lil), ('='+fmt, ISBIGENDIAN and big or lil)]: res = struct.pack(xfmt, arg) self.assertEqual(res, exp) self.assertEqual(struct.calcsize(xfmt), len(res)) rev = struct.unpack(xfmt, res)[0] - if isinstance(arg, str): - # Strings are returned as bytes since you can't know the - # encoding of the string when packed. - arg = bytes(arg, 'latin1') if rev != arg: self.assertTrue(asy) @@ -334,15 +328,14 @@ def test_p_code(self): # Test p ("Pascal string") code. for code, input, expected, expectedback in [ - ('p','abc', '\x00', b''), - ('1p', 'abc', '\x00', b''), - ('2p', 'abc', '\x01a', b'a'), - ('3p', 'abc', '\x02ab', b'ab'), - ('4p', 'abc', '\x03abc', b'abc'), - ('5p', 'abc', '\x03abc\x00', b'abc'), - ('6p', 'abc', '\x03abc\x00\x00', b'abc'), - ('1000p', 'x'*1000, '\xff' + 'x'*999, b'x'*255)]: - expected = bytes(expected, "latin-1") + ('p', b'abc', b'\x00', b''), + ('1p', b'abc', b'\x00', b''), + ('2p', b'abc', b'\x01a', b'a'), + ('3p', b'abc', b'\x02ab', b'ab'), + ('4p', b'abc', b'\x03abc', b'abc'), + ('5p', b'abc', b'\x03abc\x00', b'abc'), + ('6p', b'abc', b'\x03abc\x00\x00', b'abc'), + ('1000p', b'x'*1000, b'\xff' + b'x'*999, b'x'*255)]: got = struct.pack(code, input) self.assertEqual(got, expected) (got,) = struct.unpack(code, got) @@ -401,15 +394,11 @@ s = struct.Struct(fmt) for cls in (bytes, bytearray): data = cls(test_string) - if not isinstance(data, (bytes, bytearray)): - bytes_data = bytes(data, 'latin1') - else: - bytes_data = data self.assertEqual(s.unpack_from(data), (b'abcd',)) self.assertEqual(s.unpack_from(data, 2), (b'cd01',)) self.assertEqual(s.unpack_from(data, 4), (b'0123',)) for i in range(6): - self.assertEqual(s.unpack_from(data, i), (bytes_data[i:i+4],)) + self.assertEqual(s.unpack_from(data, i), (data[i:i+4],)) for i in range(6, len(test_string) + 1): self.assertRaises(struct.error, s.unpack_from, data, i) for cls in (bytes, bytearray): Modified: python/branches/pep-3151/Lib/test/test_structmembers.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_structmembers.py (original) +++ python/branches/pep-3151/Lib/test/test_structmembers.py Sat Feb 26 08:16:32 2011 @@ -28,64 +28,64 @@ def test_bool(self): ts.T_BOOL = True - self.assertEquals(ts.T_BOOL, True) + self.assertEqual(ts.T_BOOL, True) ts.T_BOOL = False - self.assertEquals(ts.T_BOOL, False) + self.assertEqual(ts.T_BOOL, False) self.assertRaises(TypeError, setattr, ts, 'T_BOOL', 1) def test_byte(self): ts.T_BYTE = CHAR_MAX - self.assertEquals(ts.T_BYTE, CHAR_MAX) + self.assertEqual(ts.T_BYTE, CHAR_MAX) ts.T_BYTE = CHAR_MIN - self.assertEquals(ts.T_BYTE, CHAR_MIN) + self.assertEqual(ts.T_BYTE, CHAR_MIN) ts.T_UBYTE = UCHAR_MAX - self.assertEquals(ts.T_UBYTE, UCHAR_MAX) + 
self.assertEqual(ts.T_UBYTE, UCHAR_MAX) def test_short(self): ts.T_SHORT = SHRT_MAX - self.assertEquals(ts.T_SHORT, SHRT_MAX) + self.assertEqual(ts.T_SHORT, SHRT_MAX) ts.T_SHORT = SHRT_MIN - self.assertEquals(ts.T_SHORT, SHRT_MIN) + self.assertEqual(ts.T_SHORT, SHRT_MIN) ts.T_USHORT = USHRT_MAX - self.assertEquals(ts.T_USHORT, USHRT_MAX) + self.assertEqual(ts.T_USHORT, USHRT_MAX) def test_int(self): ts.T_INT = INT_MAX - self.assertEquals(ts.T_INT, INT_MAX) + self.assertEqual(ts.T_INT, INT_MAX) ts.T_INT = INT_MIN - self.assertEquals(ts.T_INT, INT_MIN) + self.assertEqual(ts.T_INT, INT_MIN) ts.T_UINT = UINT_MAX - self.assertEquals(ts.T_UINT, UINT_MAX) + self.assertEqual(ts.T_UINT, UINT_MAX) def test_long(self): ts.T_LONG = LONG_MAX - self.assertEquals(ts.T_LONG, LONG_MAX) + self.assertEqual(ts.T_LONG, LONG_MAX) ts.T_LONG = LONG_MIN - self.assertEquals(ts.T_LONG, LONG_MIN) + self.assertEqual(ts.T_LONG, LONG_MIN) ts.T_ULONG = ULONG_MAX - self.assertEquals(ts.T_ULONG, ULONG_MAX) + self.assertEqual(ts.T_ULONG, ULONG_MAX) def test_py_ssize_t(self): ts.T_PYSSIZET = PY_SSIZE_T_MAX - self.assertEquals(ts.T_PYSSIZET, PY_SSIZE_T_MAX) + self.assertEqual(ts.T_PYSSIZET, PY_SSIZE_T_MAX) ts.T_PYSSIZET = PY_SSIZE_T_MIN - self.assertEquals(ts.T_PYSSIZET, PY_SSIZE_T_MIN) + self.assertEqual(ts.T_PYSSIZET, PY_SSIZE_T_MIN) @unittest.skipUnless(hasattr(ts, "T_LONGLONG"), "long long not present") def test_longlong(self): ts.T_LONGLONG = LLONG_MAX - self.assertEquals(ts.T_LONGLONG, LLONG_MAX) + self.assertEqual(ts.T_LONGLONG, LLONG_MAX) ts.T_LONGLONG = LLONG_MIN - self.assertEquals(ts.T_LONGLONG, LLONG_MIN) + self.assertEqual(ts.T_LONGLONG, LLONG_MIN) ts.T_ULONGLONG = ULLONG_MAX - self.assertEquals(ts.T_ULONGLONG, ULLONG_MAX) + self.assertEqual(ts.T_ULONGLONG, ULLONG_MAX) ## make sure these will accept a plain int as well as a long ts.T_LONGLONG = 3 - self.assertEquals(ts.T_LONGLONG, 3) + self.assertEqual(ts.T_LONGLONG, 3) ts.T_ULONGLONG = 4 - self.assertEquals(ts.T_ULONGLONG, 4) + self.assertEqual(ts.T_ULONGLONG, 4) def test_bad_assignments(self): integer_attributes = [ @@ -106,7 +106,7 @@ self.assertRaises(TypeError, setattr, ts, attr, nonint) def test_inplace_string(self): - self.assertEquals(ts.T_STRING_INPLACE, "hi") + self.assertEqual(ts.T_STRING_INPLACE, "hi") self.assertRaises(TypeError, setattr, ts, "T_STRING_INPLACE", "s") self.assertRaises(TypeError, delattr, ts, "T_STRING_INPLACE") Modified: python/branches/pep-3151/Lib/test/test_subprocess.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_subprocess.py (original) +++ python/branches/pep-3151/Lib/test/test_subprocess.py Sat Feb 26 08:16:32 2011 @@ -9,6 +9,8 @@ import time import re import sysconfig +import warnings +import select try: import gc except ImportError: @@ -364,22 +366,28 @@ self.assertEqual(stdout, b"banana") self.assertStderrEqual(stderr, b"pineapple") - # This test is Linux specific for simplicity to at least have - # some coverage. It is not a platform specific bug. - @unittest.skipUnless(os.path.isdir('/proc/%d/fd' % os.getpid()), - "Linux specific") # Test for the fd leak reported in http://bugs.python.org/issue2791. 
def test_communicate_pipe_fd_leak(self): - fd_directory = '/proc/%d/fd' % os.getpid() - num_fds_before_popen = len(os.listdir(fd_directory)) - p = subprocess.Popen([sys.executable, "-c", "print()"], - stdout=subprocess.PIPE) - p.communicate() - num_fds_after_communicate = len(os.listdir(fd_directory)) - del p - num_fds_after_destruction = len(os.listdir(fd_directory)) - self.assertEqual(num_fds_before_popen, num_fds_after_destruction) - self.assertEqual(num_fds_before_popen, num_fds_after_communicate) + for stdin_pipe in (False, True): + for stdout_pipe in (False, True): + for stderr_pipe in (False, True): + options = {} + if stdin_pipe: + options['stdin'] = subprocess.PIPE + if stdout_pipe: + options['stdout'] = subprocess.PIPE + if stderr_pipe: + options['stderr'] = subprocess.PIPE + if not options: + continue + p = subprocess.Popen((sys.executable, "-c", "pass"), **options) + p.communicate() + if p.stdin is not None: + self.assertTrue(p.stdin.closed) + if p.stdout is not None: + self.assertTrue(p.stdout.closed) + if p.stderr is not None: + self.assertTrue(p.stderr.closed) def test_communicate_returns(self): # communicate() should return None if no redirection is active @@ -591,7 +599,7 @@ "[sys.executable, '-c', 'print(\"Hello World!\")'])", 'assert retcode == 0')) output = subprocess.check_output([sys.executable, '-c', code]) - self.assert_(output.startswith(b'Hello World!'), ascii(output)) + self.assertTrue(output.startswith(b'Hello World!'), ascii(output)) def test_handles_closed_on_exception(self): # If CreateProcess exits with an error, ensure the @@ -666,6 +674,7 @@ # string and instead capture the exception that we want to see # below for comparison. desired_exception = e + desired_exception.strerror += ': ' + repr(sys.executable) else: self.fail("chdir to nonexistant directory %s succeeded." % nonexistent_dir) @@ -894,6 +903,106 @@ self.assertStderrEqual(stderr, b'') self.assertEqual(p.wait(), -signal.SIGTERM) + def check_close_std_fds(self, fds): + # Issue #9905: test that subprocess pipes still work properly with + # some standard fds closed + stdin = 0 + newfds = [] + for a in fds: + b = os.dup(a) + newfds.append(b) + if a == 0: + stdin = b + try: + for fd in fds: + os.close(fd) + out, err = subprocess.Popen([sys.executable, "-c", + 'import sys;' + 'sys.stdout.write("apple");' + 'sys.stdout.flush();' + 'sys.stderr.write("orange")'], + stdin=stdin, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE).communicate() + err = support.strip_python_stderr(err) + self.assertEqual((out, err), (b'apple', b'orange')) + finally: + for b, a in zip(newfds, fds): + os.dup2(b, a) + for b in newfds: + os.close(b) + + def test_close_fd_0(self): + self.check_close_std_fds([0]) + + def test_close_fd_1(self): + self.check_close_std_fds([1]) + + def test_close_fd_2(self): + self.check_close_std_fds([2]) + + def test_close_fds_0_1(self): + self.check_close_std_fds([0, 1]) + + def test_close_fds_0_2(self): + self.check_close_std_fds([0, 2]) + + def test_close_fds_1_2(self): + self.check_close_std_fds([1, 2]) + + def test_close_fds_0_1_2(self): + # Issue #10806: test that subprocess pipes still work properly with + # all standard fds closed. 
+ self.check_close_std_fds([0, 1, 2]) + + def test_remapping_std_fds(self): + # open up some temporary files + temps = [mkstemp() for i in range(3)] + try: + temp_fds = [fd for fd, fname in temps] + + # unlink the files -- we won't need to reopen them + for fd, fname in temps: + os.unlink(fname) + + # write some data to what will become stdin, and rewind + os.write(temp_fds[1], b"STDIN") + os.lseek(temp_fds[1], 0, 0) + + # move the standard file descriptors out of the way + saved_fds = [os.dup(fd) for fd in range(3)] + try: + # duplicate the file objects over the standard fd's + for fd, temp_fd in enumerate(temp_fds): + os.dup2(temp_fd, fd) + + # now use those files in the "wrong" order, so that subprocess + # has to rearrange them in the child + p = subprocess.Popen([sys.executable, "-c", + 'import sys; got = sys.stdin.read();' + 'sys.stdout.write("got %s"%got); sys.stderr.write("err")'], + stdin=temp_fds[1], + stdout=temp_fds[2], + stderr=temp_fds[0]) + p.wait() + finally: + # restore the original fd's underneath sys.stdin, etc. + for std, saved in enumerate(saved_fds): + os.dup2(saved, std) + os.close(saved) + + for fd in temp_fds: + os.lseek(fd, 0, 0) + + out = os.read(temp_fds[2], 1024) + err = support.strip_python_stderr(os.read(temp_fds[0], 1024)) + self.assertEqual(out, b"got STDIN") + self.assertEqual(err, b"err") + + finally: + for fd in temp_fds: + os.close(fd) + def test_surrogates_error_message(self): def prepare(): raise ValueError("surrogate:\uDCff") @@ -927,7 +1036,7 @@ [sys.executable, "-c", script], env=env) stdout = stdout.rstrip(b'\n\r') - self.assertEquals(stdout.decode('ascii'), ascii(value)) + self.assertEqual(stdout.decode('ascii'), ascii(value)) # test bytes key = key.encode("ascii", "surrogateescape") @@ -939,7 +1048,7 @@ [sys.executable, "-c", script], env=env) stdout = stdout.rstrip(b'\n\r') - self.assertEquals(stdout.decode('ascii'), ascii(value)) + self.assertEqual(stdout.decode('ascii'), ascii(value)) def test_bytes_program(self): abs_program = os.fsencode(sys.executable) @@ -948,19 +1057,141 @@ # absolute bytes path exitcode = subprocess.call([abs_program, "-c", "pass"]) - self.assertEquals(exitcode, 0) + self.assertEqual(exitcode, 0) # bytes program, unicode PATH env = os.environ.copy() env["PATH"] = path exitcode = subprocess.call([program, "-c", "pass"], env=env) - self.assertEquals(exitcode, 0) + self.assertEqual(exitcode, 0) # bytes program, bytes PATH envb = os.environb.copy() envb[b"PATH"] = os.fsencode(path) exitcode = subprocess.call([program, "-c", "pass"], env=envb) - self.assertEquals(exitcode, 0) + self.assertEqual(exitcode, 0) + + def test_pipe_cloexec(self): + sleeper = support.findfile("input_reader.py", subdir="subprocessdata") + fd_status = support.findfile("fd_status.py", subdir="subprocessdata") + + p1 = subprocess.Popen([sys.executable, sleeper], + stdin=subprocess.PIPE, stdout=subprocess.PIPE, + stderr=subprocess.PIPE, close_fds=False) + + self.addCleanup(p1.communicate, b'') + + p2 = subprocess.Popen([sys.executable, fd_status], + stdout=subprocess.PIPE, close_fds=False) + + output, error = p2.communicate() + result_fds = set(map(int, output.split(b','))) + unwanted_fds = set([p1.stdin.fileno(), p1.stdout.fileno(), + p1.stderr.fileno()]) + + self.assertFalse(result_fds & unwanted_fds, + "Expected no fds from %r to be open in child, " + "found %r" % + (unwanted_fds, result_fds & unwanted_fds)) + + def test_pipe_cloexec_real_tools(self): + qcat = support.findfile("qcat.py", subdir="subprocessdata") + qgrep = support.findfile("qgrep.py", 
subdir="subprocessdata") + + subdata = b'zxcvbn' + data = subdata * 4 + b'\n' + + p1 = subprocess.Popen([sys.executable, qcat], + stdin=subprocess.PIPE, stdout=subprocess.PIPE, + close_fds=False) + + p2 = subprocess.Popen([sys.executable, qgrep, subdata], + stdin=p1.stdout, stdout=subprocess.PIPE, + close_fds=False) + + self.addCleanup(p1.wait) + self.addCleanup(p2.wait) + self.addCleanup(p1.terminate) + self.addCleanup(p2.terminate) + + p1.stdin.write(data) + p1.stdin.close() + + readfiles, ignored1, ignored2 = select.select([p2.stdout], [], [], 10) + + self.assertTrue(readfiles, "The child hung") + self.assertEqual(p2.stdout.read(), data) + + p1.stdout.close() + p2.stdout.close() + + def test_close_fds(self): + fd_status = support.findfile("fd_status.py", subdir="subprocessdata") + + fds = os.pipe() + self.addCleanup(os.close, fds[0]) + self.addCleanup(os.close, fds[1]) + + open_fds = set(fds) + + p = subprocess.Popen([sys.executable, fd_status], + stdout=subprocess.PIPE, close_fds=False) + output, ignored = p.communicate() + remaining_fds = set(map(int, output.split(b','))) + + self.assertEqual(remaining_fds & open_fds, open_fds, + "Some fds were closed") + + p = subprocess.Popen([sys.executable, fd_status], + stdout=subprocess.PIPE, close_fds=True) + output, ignored = p.communicate() + remaining_fds = set(map(int, output.split(b','))) + + self.assertFalse(remaining_fds & open_fds, + "Some fds were left open") + self.assertIn(1, remaining_fds, "Subprocess failed") + + def test_pass_fds(self): + fd_status = support.findfile("fd_status.py", subdir="subprocessdata") + + open_fds = set() + + for x in range(5): + fds = os.pipe() + self.addCleanup(os.close, fds[0]) + self.addCleanup(os.close, fds[1]) + open_fds.update(fds) + + for fd in open_fds: + p = subprocess.Popen([sys.executable, fd_status], + stdout=subprocess.PIPE, close_fds=True, + pass_fds=(fd, )) + output, ignored = p.communicate() + + remaining_fds = set(map(int, output.split(b','))) + to_be_closed = open_fds - {fd} + + self.assertIn(fd, remaining_fds, "fd to be passed not passed") + self.assertFalse(remaining_fds & to_be_closed, + "fd to be closed passed") + + # pass_fds overrides close_fds with a warning. + with self.assertWarns(RuntimeWarning) as context: + self.assertFalse(subprocess.call( + [sys.executable, "-c", "import sys; sys.exit(0)"], + close_fds=False, pass_fds=(fd, ))) + self.assertIn('overriding close_fds', str(context.warning)) + + def test_wait_when_sigchild_ignored(self): + # NOTE: sigchild_ignore.py may not be an effective test on all OSes. 
+ sigchild_ignore = support.findfile("sigchild_ignore.py", + subdir="subprocessdata") + p = subprocess.Popen([sys.executable, sigchild_ignore], + stdout=subprocess.PIPE, stderr=subprocess.PIPE) + stdout, stderr = p.communicate() + self.assertEqual(0, p.returncode, "sigchild_ignore.py exited" + " non-zero with this error:\n%s" % + stderr.decode('utf-8')) @unittest.skipUnless(mswindows, "Windows specific tests") @@ -1182,6 +1413,47 @@ # call() function with sequence argument with spaces on Windows self.with_spaces([sys.executable, self.fname, "ab cd"]) + +class ContextManagerTests(ProcessTestCase): + + def test_pipe(self): + with subprocess.Popen([sys.executable, "-c", + "import sys;" + "sys.stdout.write('stdout');" + "sys.stderr.write('stderr');"], + stdout=subprocess.PIPE, + stderr=subprocess.PIPE) as proc: + self.assertEqual(proc.stdout.read(), b"stdout") + self.assertStderrEqual(proc.stderr.read(), b"stderr") + + self.assertTrue(proc.stdout.closed) + self.assertTrue(proc.stderr.closed) + + def test_returncode(self): + with subprocess.Popen([sys.executable, "-c", + "import sys; sys.exit(100)"]) as proc: + proc.wait() + self.assertEqual(proc.returncode, 100) + + def test_communicate_stdin(self): + with subprocess.Popen([sys.executable, "-c", + "import sys;" + "sys.exit(sys.stdin.read() == 'context')"], + stdin=subprocess.PIPE) as proc: + proc.communicate(b"context") + self.assertEqual(proc.returncode, 1) + + def test_invalid_args(self): + with self.assertRaises(EnvironmentError) as c: + with subprocess.Popen(['nonexisting_i_hope'], + stdout=subprocess.PIPE, + stderr=subprocess.PIPE) as proc: + pass + + if c.exception.errno != errno.ENOENT: # ignore "no such file" + raise c.exception + + def test_main(): unit_tests = (ProcessTestCase, POSIXProcessTestCase, @@ -1190,7 +1462,8 @@ CommandTests, ProcessTestCaseNoPoll, HelperFunctionTests, - CommandsWithSpaces) + CommandsWithSpaces, + ContextManagerTests) support.run_unittest(*unit_tests) support.reap_children() Modified: python/branches/pep-3151/Lib/test/test_sys.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_sys.py (original) +++ python/branches/pep-3151/Lib/test/test_sys.py Sat Feb 26 08:16:32 2011 @@ -93,7 +93,7 @@ try: sys.exit(0) except SystemExit as exc: - self.assertEquals(exc.code, 0) + self.assertEqual(exc.code, 0) except: self.fail("wrong exception") else: @@ -104,7 +104,7 @@ try: sys.exit(42) except SystemExit as exc: - self.assertEquals(exc.code, 42) + self.assertEqual(exc.code, 42) except: self.fail("wrong exception") else: @@ -114,7 +114,7 @@ try: sys.exit((42,)) except SystemExit as exc: - self.assertEquals(exc.code, 42) + self.assertEqual(exc.code, 42) except: self.fail("wrong exception") else: @@ -124,7 +124,7 @@ try: sys.exit("exit") except SystemExit as exc: - self.assertEquals(exc.code, "exit") + self.assertEqual(exc.code, "exit") except: self.fail("wrong exception") else: @@ -134,7 +134,7 @@ try: sys.exit((17, 23)) except SystemExit as exc: - self.assertEquals(exc.code, (17, 23)) + self.assertEqual(exc.code, (17, 23)) except: self.fail("wrong exception") else: @@ -188,7 +188,7 @@ orig = sys.getcheckinterval() for n in 0, 100, 120, orig: # orig last to restore starting state sys.setcheckinterval(n) - self.assertEquals(sys.getcheckinterval(), n) + self.assertEqual(sys.getcheckinterval(), n) @unittest.skipUnless(threading, 'Threading required for this test.') def test_switchinterval(self): @@ -202,7 +202,7 @@ try: for n in 0.00001, 0.05, 3.0, orig: 
sys.setswitchinterval(n) - self.assertAlmostEquals(sys.getswitchinterval(), n) + self.assertAlmostEqual(sys.getswitchinterval(), n) finally: sys.setswitchinterval(orig) @@ -215,6 +215,8 @@ self.assertEqual(sys.getrecursionlimit(), 10000) sys.setrecursionlimit(oldlimit) + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'fatal error if run with a trace function') def test_recursionlimit_recovery(self): # NOTE: this test is slightly fragile in that it depends on the current # recursion count when executing the test being low enough so as to @@ -301,6 +303,7 @@ self.assertEqual(sys.getdlopenflags(), oldflags+1) sys.setdlopenflags(oldflags) + @test.support.refcount_test def test_refcount(self): # n here must be a global in order for this test to pass while # tracing with a python function. Tracing calls PyFrame_FastToLocals @@ -501,7 +504,7 @@ attrs = ("debug", "division_warning", "inspect", "interactive", "optimize", "dont_write_bytecode", "no_user_site", "no_site", "ignore_environment", "verbose", - "bytes_warning") + "bytes_warning", "quiet") for attr in attrs: self.assertTrue(hasattr(sys.flags, attr), attr) self.assertEqual(type(getattr(sys.flags, attr)), int, attr) @@ -569,7 +572,7 @@ TPFLAGS_HEAPTYPE = 1<<9 def setUp(self): - self.c = len(struct.pack('c', ' ')) + self.c = len(struct.pack('c', b' ')) self.H = len(struct.pack('H', 0)) self.i = len(struct.pack('i', 0)) self.l = len(struct.pack('l', 0)) @@ -782,8 +785,8 @@ # reverse check(reversed(''), size(h + 'PP')) # range - check(range(1), size(h + '3P')) - check(range(66000), size(h + '3P')) + check(range(1), size(h + '4P')) + check(range(66000), size(h + '4P')) # set # frozenset PySet_MINSIZE = 8 Modified: python/branches/pep-3151/Lib/test/test_sys_settrace.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_sys_settrace.py (original) +++ python/branches/pep-3151/Lib/test/test_sys_settrace.py Sat Feb 26 08:16:32 2011 @@ -251,6 +251,7 @@ def setUp(self): self.using_gc = gc.isenabled() gc.disable() + self.addCleanup(sys.settrace, sys.gettrace()) def tearDown(self): if self.using_gc: @@ -389,6 +390,9 @@ class RaisingTraceFuncTestCase(unittest.TestCase): + def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) + def trace(self, frame, event, arg): """A trace function that raises an exception in response to a specific trace event.""" @@ -688,6 +692,10 @@ class JumpTestCase(unittest.TestCase): + def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) + sys.settrace(None) + def compare_jump_output(self, expected, received): if received != expected: self.fail( "Outputs don't match:\n" + @@ -739,6 +747,8 @@ def test_18_no_jump_to_non_integers(self): self.run_test(no_jump_to_non_integers) def test_19_no_jump_without_trace_function(self): + # Must set sys.settrace(None) in setUp(), else condition is not + # triggered. 
no_jump_without_trace_function() def test_20_large_function(self): Modified: python/branches/pep-3151/Lib/test/test_sysconfig.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_sysconfig.py (original) +++ python/branches/pep-3151/Lib/test/test_sysconfig.py Sat Feb 26 08:16:32 2011 @@ -88,7 +88,7 @@ shutil.rmtree(path) def test_get_path_names(self): - self.assertEquals(get_path_names(), sysconfig._SCHEME_KEYS) + self.assertEqual(get_path_names(), sysconfig._SCHEME_KEYS) def test_get_paths(self): scheme = get_paths() @@ -98,7 +98,7 @@ wanted.sort() scheme = list(scheme.items()) scheme.sort() - self.assertEquals(scheme, wanted) + self.assertEqual(scheme, wanted) def test_get_path(self): # xxx make real tests here @@ -117,21 +117,21 @@ sys.version = ('2.4.4 (#71, Oct 18 2006, 08:34:43) ' '[MSC v.1310 32 bit (Intel)]') sys.platform = 'win32' - self.assertEquals(get_platform(), 'win32') + self.assertEqual(get_platform(), 'win32') # windows XP, amd64 os.name = 'nt' sys.version = ('2.4.4 (#71, Oct 18 2006, 08:34:43) ' '[MSC v.1310 32 bit (Amd64)]') sys.platform = 'win32' - self.assertEquals(get_platform(), 'win-amd64') + self.assertEqual(get_platform(), 'win-amd64') # windows XP, itanium os.name = 'nt' sys.version = ('2.4.4 (#71, Oct 18 2006, 08:34:43) ' '[MSC v.1310 32 bit (Itanium)]') sys.platform = 'win32' - self.assertEquals(get_platform(), 'win-ia64') + self.assertEqual(get_platform(), 'win-ia64') # macbook os.name = 'posix' @@ -150,9 +150,9 @@ maxint = sys.maxsize try: sys.maxsize = 2147483647 - self.assertEquals(get_platform(), 'macosx-10.3-ppc') + self.assertEqual(get_platform(), 'macosx-10.3-ppc') sys.maxsize = 9223372036854775807 - self.assertEquals(get_platform(), 'macosx-10.3-ppc64') + self.assertEqual(get_platform(), 'macosx-10.3-ppc64') finally: sys.maxsize = maxint @@ -169,9 +169,9 @@ maxint = sys.maxsize try: sys.maxsize = 2147483647 - self.assertEquals(get_platform(), 'macosx-10.3-i386') + self.assertEqual(get_platform(), 'macosx-10.3-i386') sys.maxsize = 9223372036854775807 - self.assertEquals(get_platform(), 'macosx-10.3-x86_64') + self.assertEqual(get_platform(), 'macosx-10.3-x86_64') finally: sys.maxsize = maxint @@ -182,33 +182,33 @@ '-fno-strict-aliasing -fno-common ' '-dynamic -DNDEBUG -g -O3') - self.assertEquals(get_platform(), 'macosx-10.4-fat') + self.assertEqual(get_platform(), 'macosx-10.4-fat') get_config_vars()['CFLAGS'] = ('-arch x86_64 -arch i386 -isysroot ' '/Developer/SDKs/MacOSX10.4u.sdk ' '-fno-strict-aliasing -fno-common ' '-dynamic -DNDEBUG -g -O3') - self.assertEquals(get_platform(), 'macosx-10.4-intel') + self.assertEqual(get_platform(), 'macosx-10.4-intel') get_config_vars()['CFLAGS'] = ('-arch x86_64 -arch ppc -arch i386 -isysroot ' '/Developer/SDKs/MacOSX10.4u.sdk ' '-fno-strict-aliasing -fno-common ' '-dynamic -DNDEBUG -g -O3') - self.assertEquals(get_platform(), 'macosx-10.4-fat3') + self.assertEqual(get_platform(), 'macosx-10.4-fat3') get_config_vars()['CFLAGS'] = ('-arch ppc64 -arch x86_64 -arch ppc -arch i386 -isysroot ' '/Developer/SDKs/MacOSX10.4u.sdk ' '-fno-strict-aliasing -fno-common ' '-dynamic -DNDEBUG -g -O3') - self.assertEquals(get_platform(), 'macosx-10.4-universal') + self.assertEqual(get_platform(), 'macosx-10.4-universal') get_config_vars()['CFLAGS'] = ('-arch x86_64 -arch ppc64 -isysroot ' '/Developer/SDKs/MacOSX10.4u.sdk ' '-fno-strict-aliasing -fno-common ' '-dynamic -DNDEBUG -g -O3') - self.assertEquals(get_platform(), 'macosx-10.4-fat64') + 
self.assertEqual(get_platform(), 'macosx-10.4-fat64') for arch in ('ppc', 'i386', 'x86_64', 'ppc64'): get_config_vars()['CFLAGS'] = ('-arch %s -isysroot ' @@ -216,7 +216,7 @@ '-fno-strict-aliasing -fno-common ' '-dynamic -DNDEBUG -g -O3'%(arch,)) - self.assertEquals(get_platform(), 'macosx-10.4-%s'%(arch,)) + self.assertEqual(get_platform(), 'macosx-10.4-%s'%(arch,)) # linux debian sarge os.name = 'posix' @@ -226,7 +226,7 @@ self._set_uname(('Linux', 'aglae', '2.6.21.1dedibox-r7', '#1 Mon Apr 30 17:25:38 CEST 2007', 'i686')) - self.assertEquals(get_platform(), 'linux-i686') + self.assertEqual(get_platform(), 'linux-i686') # XXX more platforms to tests here @@ -243,7 +243,7 @@ def test_get_scheme_names(self): wanted = ('nt', 'nt_user', 'os2', 'os2_home', 'osx_framework_user', 'posix_home', 'posix_prefix', 'posix_user') - self.assertEquals(get_scheme_names(), wanted) + self.assertEqual(get_scheme_names(), wanted) @skip_unless_symlink def test_symlink(self): @@ -275,7 +275,7 @@ for name in ('stdlib', 'platstdlib', 'purelib', 'platlib'): global_path = get_path(name, 'posix_prefix') user_path = get_path(name, 'posix_user') - self.assertEquals(user_path, global_path.replace(base, user)) + self.assertEqual(user_path, global_path.replace(base, user)) def test_main(self): # just making sure _main() runs and returns things in the stdout Modified: python/branches/pep-3151/Lib/test/test_syslog.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_syslog.py (original) +++ python/branches/pep-3151/Lib/test/test_syslog.py Sat Feb 26 08:16:32 2011 @@ -11,6 +11,8 @@ def test_openlog(self): syslog.openlog('python') + # Issue #6697. + self.assertRaises(UnicodeEncodeError, syslog.openlog, '\uD800') def test_syslog(self): syslog.openlog('python') Modified: python/branches/pep-3151/Lib/test/test_tarfile.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_tarfile.py (original) +++ python/branches/pep-3151/Lib/test/test_tarfile.py Sat Feb 26 08:16:32 2011 @@ -419,6 +419,22 @@ mode="r|" + def test_read_through(self): + # Issue #11224: A poorly designed _FileInFile.read() method + # caused seeking errors with stream tar files. + for tarinfo in self.tar: + if not tarinfo.isreg(): + continue + fobj = self.tar.extractfile(tarinfo) + while True: + try: + buf = fobj.read(512) + except tarfile.StreamError: + self.fail("simple read-through using TarFile.extractfile() failed") + if not buf: + break + fobj.close() + def test_fileobj_regular_file(self): tarinfo = self.tar.next() # get "regtype" (can't use getmember) fobj = self.tar.extractfile(tarinfo) @@ -919,6 +935,10 @@ finally: tar.close() + # Verify that filter is a keyword-only argument + with self.assertRaises(TypeError): + tar.add(tempdir, "empty_dir", True, None, filter) + tar = tarfile.open(tmpname, "r") try: for tarinfo in tar: @@ -1000,7 +1020,7 @@ tar = tarfile.open(tmpname, "r") try: for t in tar: - self.assert_(t.name == "." or t.name.startswith("./")) + self.assertTrue(t.name == "." or t.name.startswith("./")) finally: tar.close() finally: @@ -1269,7 +1289,7 @@ self._test_unicode_filename("utf7") def test_utf8_filename(self): - self._test_unicode_filename("utf8") + self._test_unicode_filename("utf-8") def _test_unicode_filename(self, encoding): tar = tarfile.open(tmpname, "w", format=self.format, encoding=encoding, errors="strict") @@ -1348,7 +1368,7 @@ def test_bad_pax_header(self): # Test for issue #8633. 
GNU tar <= 1.23 creates raw binary fields # without a hdrcharset=BINARY header. - for encoding, name in (("utf8", "pax/bad-pax-\udce4\udcf6\udcfc"), + for encoding, name in (("utf-8", "pax/bad-pax-\udce4\udcf6\udcfc"), ("iso8859-1", "pax/bad-pax-\xe4\xf6\xfc"),): with tarfile.open(tarname, encoding=encoding, errors="surrogateescape") as tar: try: @@ -1363,7 +1383,7 @@ def test_binary_header(self): # Test a POSIX.1-2008 compatible header with a hdrcharset=BINARY field. - for encoding, name in (("utf8", "pax/hdrcharset-\udce4\udcf6\udcfc"), + for encoding, name in (("utf-8", "pax/hdrcharset-\udce4\udcf6\udcfc"), ("iso8859-1", "pax/hdrcharset-\xe4\xf6\xfc"),): with tarfile.open(tarname, encoding=encoding, errors="surrogateescape") as tar: try: Modified: python/branches/pep-3151/Lib/test/test_telnetlib.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_telnetlib.py (original) +++ python/branches/pep-3151/Lib/test/test_telnetlib.py Sat Feb 26 08:16:32 2011 @@ -342,6 +342,16 @@ expected = "send b'xxx'\n" self.assertIn(expected, telnet._messages) + def test_debug_accepts_str_port(self): + # Issue 10695 + with test_socket([]): + telnet = TelnetAlike('dummy', '0') + telnet._messages = '' + telnet.set_debuglevel(1) + telnet.msg('test') + self.assertRegex(telnet._messages, r'0.*test') + + def test_main(verbose=None): support.run_unittest(GeneralTests, ReadTests, WriteTests, OptionTests) Modified: python/branches/pep-3151/Lib/test/test_tempfile.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_tempfile.py (original) +++ python/branches/pep-3151/Lib/test/test_tempfile.py Sat Feb 26 08:16:32 2011 @@ -319,7 +319,7 @@ f.write(b"blat\x1a") f.write(b"extra\n") os.lseek(f.fd, 0, os.SEEK_SET) - self.assertEquals(os.read(f.fd, 20), b"blat") + self.assertEqual(os.read(f.fd, 20), b"blat") test_classes.append(test__mkstemp_inner) @@ -879,7 +879,7 @@ with tempfile.TemporaryFile(*args, **kwargs) as fileobj: fileobj.write(input) fileobj.seek(0) - self.assertEquals(input, fileobj.read()) + self.assertEqual(input, fileobj.read()) roundtrip(b"1234", "w+b") roundtrip("abdc\n", "w+") @@ -925,6 +925,15 @@ f.write(b"Hello world!") return tmp + def test_mkdtemp_failure(self): + # Check no additional exception if mkdtemp fails + # Previously would raise AttributeError instead + # (noted as part of Issue #10188) + with tempfile.TemporaryDirectory() as nonexistent: + pass + with self.assertRaises(os.error): + tempfile.TemporaryDirectory(dir=nonexistent) + def test_explicit_cleanup(self): # A TemporaryDirectory is deleted when cleaned up dir = tempfile.mkdtemp() @@ -955,20 +964,56 @@ def test_del_on_shutdown(self): # A TemporaryDirectory may be cleaned up during shutdown # Make sure it works with the relevant modules nulled out - dir = tempfile.mkdtemp() - try: + with self.do_create() as dir: d = self.do_create(dir=dir) # Mimic the nulling out of modules that # occurs during system shutdown modules = [os, os.path] if has_stat: modules.append(stat) - with NulledModules(*modules): - d.cleanup() + # Currently broken, so suppress the warning + # that is otherwise emitted on stdout + with support.captured_stderr() as err: + with NulledModules(*modules): + d.cleanup() + # Currently broken, so stop spurious exception by + # indicating the object has already been closed + d._closed = True + # And this assert will fail, as expected by the + # unittest decorator... 
self.assertFalse(os.path.exists(d.name), "TemporaryDirectory %s exists after cleanup" % d.name) - finally: - os.rmdir(dir) + + def test_warnings_on_cleanup(self): + # Two kinds of warning on shutdown + # Issue 10888: may write to stderr if modules are nulled out + # ResourceWarning will be triggered by __del__ + with self.do_create() as dir: + if os.sep != '\\': + # Embed a backslash in order to make sure string escaping + # in the displayed error message is dealt with correctly + suffix = '\\check_backslash_handling' + else: + suffix = '' + d = self.do_create(dir=dir, suf=suffix) + + #Check for the Issue 10888 message + modules = [os, os.path] + if has_stat: + modules.append(stat) + with support.captured_stderr() as err: + with NulledModules(*modules): + d.cleanup() + message = err.getvalue().replace('\\\\', '\\') + self.assertIn("while cleaning up", message) + self.assertIn(d.name, message) + + # Check for the resource warning + with support.check_warnings(('Implicitly', ResourceWarning), quiet=False): + warnings.filterwarnings("always", category=ResourceWarning) + d.__del__() + self.assertFalse(os.path.exists(d.name), + "TemporaryDirectory %s exists after __del__" % d.name) def test_multiple_close(self): # Can be cleaned-up many times without error Modified: python/branches/pep-3151/Lib/test/test_textwrap.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_textwrap.py (original) +++ python/branches/pep-3151/Lib/test/test_textwrap.py Sat Feb 26 08:16:32 2011 @@ -29,7 +29,7 @@ def check(self, result, expect): - self.assertEquals(result, expect, + self.assertEqual(result, expect, 'expected:\n%s\nbut got:\n%s' % ( self.show(expect), self.show(result))) @@ -39,9 +39,9 @@ def check_split(self, text, expect): result = self.wrapper._split(text) - self.assertEquals(result, expect, - "\nexpected %r\n" - "but got %r" % (expect, result)) + self.assertEqual(result, expect, + "\nexpected %r\n" + "but got %r" % (expect, result)) class WrapTestCase(BaseTestCase): @@ -490,7 +490,7 @@ def assertUnchanged(self, text): """assert that dedent() has no effect on 'text'""" - self.assertEquals(text, dedent(text)) + self.assertEqual(text, dedent(text)) def test_dedent_nomargin(self): # No lines indented. @@ -513,17 +513,17 @@ # All lines indented by two spaces. text = " Hello there.\n How are ya?\n Oh good." expect = "Hello there.\nHow are ya?\nOh good." - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) # Same, with blank lines. text = " Hello there.\n\n How are ya?\n Oh good.\n" expect = "Hello there.\n\nHow are ya?\nOh good.\n" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) # Now indent one of the blank lines. text = " Hello there.\n \n How are ya?\n Oh good.\n" expect = "Hello there.\n\nHow are ya?\nOh good.\n" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) def test_dedent_uneven(self): # Lines indented unevenly. @@ -537,27 +537,27 @@ while 1: return foo ''' - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) # Uneven indentation with a blank line. text = " Foo\n Bar\n\n Baz\n" expect = "Foo\n Bar\n\n Baz\n" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) # Uneven indentation with a whitespace-only line. 
text = " Foo\n Bar\n \n Baz\n" expect = "Foo\n Bar\n\n Baz\n" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) # dedent() should not mangle internal tabs def test_dedent_preserve_internal_tabs(self): text = " hello\tthere\n how are\tyou?" expect = "hello\tthere\nhow are\tyou?" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) # make sure that it preserves tabs when it's not making any # changes at all - self.assertEquals(expect, dedent(expect)) + self.assertEqual(expect, dedent(expect)) # dedent() should not mangle tabs in the margin (i.e. # tabs and spaces both count as margin, but are *not* @@ -573,17 +573,17 @@ # dedent() only removes whitespace that can be uniformly removed! text = "\thello there\n\thow are you?" expect = "hello there\nhow are you?" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) text = " \thello there\n \thow are you?" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) text = " \t hello there\n \t how are you?" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) text = " \thello there\n \t how are you?" expect = "hello there\n how are you?" - self.assertEquals(expect, dedent(text)) + self.assertEqual(expect, dedent(text)) def test_main(): Modified: python/branches/pep-3151/Lib/test/test_threadedtempfile.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_threadedtempfile.py (original) +++ python/branches/pep-3151/Lib/test/test_threadedtempfile.py Sat Feb 26 08:16:32 2011 @@ -68,8 +68,8 @@ msg = "Errors: errors %d ok %d\n%s" % (len(errors), ok, '\n'.join(errors)) - self.assertEquals(errors, [], msg) - self.assertEquals(ok, NUM_THREADS * FILES_PER_THREAD) + self.assertEqual(errors, [], msg) + self.assertEqual(ok, NUM_THREADS * FILES_PER_THREAD) def test_main(): run_unittest(ThreadedTempFileTest) Modified: python/branches/pep-3151/Lib/test/test_threading.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_threading.py (original) +++ python/branches/pep-3151/Lib/test/test_threading.py Sat Feb 26 08:16:32 2011 @@ -11,6 +11,7 @@ import unittest import weakref import os +import subprocess from test import lock_tests @@ -272,7 +273,6 @@ except ImportError: raise unittest.SkipTest("cannot import ctypes") - import subprocess rc = subprocess.call([sys.executable, "-c", """if 1: import ctypes, sys, time, _thread @@ -303,7 +303,6 @@ def test_finalize_with_trace(self): # Issue1733757 # Avoid a deadlock when sys.settrace steps into threading._shutdown - import subprocess p = subprocess.Popen([sys.executable, "-c", """if 1: import sys, threading @@ -338,7 +337,6 @@ def test_join_nondaemon_on_shutdown(self): # Issue 1722344 # Raising SystemExit skipped threading._shutdown - import subprocess p = subprocess.Popen([sys.executable, "-c", """if 1: import threading from time import sleep @@ -398,17 +396,17 @@ weak_cyclic_object = weakref.ref(cyclic_object) cyclic_object.thread.join() del cyclic_object - self.assertEquals(None, weak_cyclic_object(), - msg=('%d references still around' % - sys.getrefcount(weak_cyclic_object()))) + self.assertIsNone(weak_cyclic_object(), + msg=('%d references still around' % + sys.getrefcount(weak_cyclic_object()))) raising_cyclic_object = RunSelfFunction(should_raise=True) weak_raising_cyclic_object = weakref.ref(raising_cyclic_object) 
raising_cyclic_object.thread.join() del raising_cyclic_object - self.assertEquals(None, weak_raising_cyclic_object(), - msg=('%d references still around' % - sys.getrefcount(weak_raising_cyclic_object()))) + self.assertIsNone(weak_raising_cyclic_object(), + msg=('%d references still around' % + sys.getrefcount(weak_raising_cyclic_object()))) def test_old_threading_api(self): # Just a quick sanity check to make sure the old method names are @@ -429,6 +427,14 @@ t.daemon = True self.assertTrue('daemon' in repr(t)) + def test_deamon_param(self): + t = threading.Thread() + self.assertFalse(t.daemon) + t = threading.Thread(daemon=False) + self.assertFalse(t.daemon) + t = threading.Thread(daemon=True) + self.assertTrue(t.daemon) + class ThreadJoinOnShutdown(BaseTestCase): @@ -445,7 +451,6 @@ sys.stdout.flush() \n""" + script - import subprocess p = subprocess.Popen([sys.executable, "-c", script], stdout=subprocess.PIPE) rc = p.wait() data = p.stdout.read().decode().replace('\r', '') @@ -512,6 +517,152 @@ """ self._run_and_join(script) + def assertScriptHasOutput(self, script, expected_output): + p = subprocess.Popen([sys.executable, "-c", script], + stdout=subprocess.PIPE) + stdout, stderr = p.communicate() + data = stdout.decode().replace('\r', '') + self.assertEqual(p.returncode, 0, "Unexpected error") + self.assertEqual(data, expected_output) + + @unittest.skipUnless(hasattr(os, 'fork'), "needs os.fork()") + def test_4_joining_across_fork_in_worker_thread(self): + # There used to be a possible deadlock when forking from a child + # thread. See http://bugs.python.org/issue6643. + + # Skip platforms with known problems forking from a worker thread. + # See http://bugs.python.org/issue3863. + if sys.platform in ('freebsd4', 'freebsd5', 'freebsd6', 'os2emx'): + raise unittest.SkipTest('due to known OS bugs on ' + sys.platform) + + # The script takes the following steps: + # - The main thread in the parent process starts a new thread and then + # tries to join it. + # - The join operation acquires the Lock inside the thread's _block + # Condition. (See threading.py:Thread.join().) + # - We stub out the acquire method on the condition to force it to wait + # until the child thread forks. (See LOCK ACQUIRED HERE) + # - The child thread forks. (See LOCK HELD and WORKER THREAD FORKS + # HERE) + # - The main thread of the parent process enters Condition.wait(), + # which releases the lock on the child thread. + # - The child process returns. Without the necessary fix, when the + # main thread of the child process (which used to be the child thread + # in the parent process) attempts to exit, it will try to acquire the + # lock in the Thread._block Condition object and hang, because the + # lock was held across the fork. + + script = """if 1: + import os, time, threading + + finish_join = False + start_fork = False + + def worker(): + # Wait until this thread's lock is acquired before forking to + # create the deadlock. + global finish_join + while not start_fork: + time.sleep(0.01) + # LOCK HELD: Main thread holds lock across this call. + childpid = os.fork() + finish_join = True + if childpid != 0: + # Parent process just waits for child. + os.waitpid(childpid, 0) + # Child process should just return. + + w = threading.Thread(target=worker) + + # Stub out the private condition variable's lock acquire method. + # This acquires the lock and then waits until the child has forked + # before returning, which will release the lock soon after. 
If + # someone else tries to fix this test case by acquiring this lock + # before forking instead of reseting it, the test case will + # deadlock when it shouldn't. + condition = w._block + orig_acquire = condition.acquire + call_count_lock = threading.Lock() + call_count = 0 + def my_acquire(): + global call_count + global start_fork + orig_acquire() # LOCK ACQUIRED HERE + start_fork = True + if call_count == 0: + while not finish_join: + time.sleep(0.01) # WORKER THREAD FORKS HERE + with call_count_lock: + call_count += 1 + condition.acquire = my_acquire + + w.start() + w.join() + print('end of main') + """ + self.assertScriptHasOutput(script, "end of main\n") + + @unittest.skipUnless(hasattr(os, 'fork'), "needs os.fork()") + def test_5_clear_waiter_locks_to_avoid_crash(self): + # Check that a spawned thread that forks doesn't segfault on certain + # platforms, namely OS X. This used to happen if there was a waiter + # lock in the thread's condition variable's waiters list. Even though + # we know the lock will be held across the fork, it is not safe to + # release locks held across forks on all platforms, so releasing the + # waiter lock caused a segfault on OS X. Furthermore, since locks on + # OS X are (as of this writing) implemented with a mutex + condition + # variable instead of a semaphore, while we know that the Python-level + # lock will be acquired, we can't know if the internal mutex will be + # acquired at the time of the fork. + + # Skip platforms with known problems forking from a worker thread. + # See http://bugs.python.org/issue3863. + if sys.platform in ('freebsd4', 'freebsd5', 'freebsd6', 'os2emx'): + raise unittest.SkipTest('due to known OS bugs on ' + sys.platform) + script = """if True: + import os, time, threading + + start_fork = False + + def worker(): + # Wait until the main thread has attempted to join this thread + # before continuing. + while not start_fork: + time.sleep(0.01) + childpid = os.fork() + if childpid != 0: + # Parent process just waits for child. + (cpid, rc) = os.waitpid(childpid, 0) + assert cpid == childpid + assert rc == 0 + print('end of worker thread') + else: + # Child process should just return. + pass + + w = threading.Thread(target=worker) + + # Stub out the private condition variable's _release_save method. + # This releases the condition's lock and flips the global that + # causes the worker to fork. At this point, the problematic waiter + # lock has been acquired once by the waiter and has been put onto + # the waiters list. + condition = w._block + orig_release_save = condition._release_save + def my_release_save(): + global start_fork + orig_release_save() + # Waiter lock held here, condition lock released. 
+ start_fork = True + condition._release_save = my_release_save + + w.start() + w.join() + print('end of main thread') + """ + output = "end of worker thread\nend of main thread\n" + self.assertScriptHasOutput(script, output) + class ThreadingExceptionTests(BaseTestCase): # A RuntimeError should be raised if Thread.start() is called Modified: python/branches/pep-3151/Lib/test/test_threadsignals.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_threadsignals.py (original) +++ python/branches/pep-3151/Lib/test/test_threadsignals.py Sat Feb 26 08:16:32 2011 @@ -6,6 +6,7 @@ import sys from test.support import run_unittest, import_module thread = import_module('_thread') +import time if sys.platform[:3] in ('win', 'os2') or sys.platform=='riscos': raise unittest.SkipTest("Can't test signal on %s" % sys.platform) @@ -34,12 +35,12 @@ signalled_all.release() class ThreadSignals(unittest.TestCase): - """Test signal handling semantics of threads. - We spawn a thread, have the thread send two signals, and - wait for it to finish. Check that we got both signals - and that they were run by the main thread. - """ + def test_signals(self): + # Test signal handling semantics of threads. + # We spawn a thread, have the thread send two signals, and + # wait for it to finish. Check that we got both signals + # and that they were run by the main thread. signalled_all.acquire() self.spawnSignallingThread() signalled_all.acquire() @@ -66,6 +67,121 @@ def spawnSignallingThread(self): thread.start_new_thread(send_signals, ()) + def alarm_interrupt(self, sig, frame): + raise KeyboardInterrupt + + def test_lock_acquire_interruption(self): + # Mimic receiving a SIGINT (KeyboardInterrupt) with SIGALRM while stuck + # in a deadlock. + oldalrm = signal.signal(signal.SIGALRM, self.alarm_interrupt) + try: + lock = thread.allocate_lock() + lock.acquire() + signal.alarm(1) + self.assertRaises(KeyboardInterrupt, lock.acquire) + finally: + signal.signal(signal.SIGALRM, oldalrm) + + def test_rlock_acquire_interruption(self): + # Mimic receiving a SIGINT (KeyboardInterrupt) with SIGALRM while stuck + # in a deadlock. + oldalrm = signal.signal(signal.SIGALRM, self.alarm_interrupt) + try: + rlock = thread.RLock() + # For reentrant locks, the initial acquisition must be in another + # thread. + def other_thread(): + rlock.acquire() + thread.start_new_thread(other_thread, ()) + # Wait until we can't acquire it without blocking... + while rlock.acquire(blocking=False): + rlock.release() + time.sleep(0.01) + signal.alarm(1) + self.assertRaises(KeyboardInterrupt, rlock.acquire) + finally: + signal.signal(signal.SIGALRM, oldalrm) + + def acquire_retries_on_intr(self, lock): + self.sig_recvd = False + def my_handler(signal, frame): + self.sig_recvd = True + old_handler = signal.signal(signal.SIGUSR1, my_handler) + try: + def other_thread(): + # Acquire the lock in a non-main thread, so this test works for + # RLocks. + lock.acquire() + # Wait until the main thread is blocked in the lock acquire, and + # then wake it up with this. + time.sleep(0.5) + os.kill(process_pid, signal.SIGUSR1) + # Let the main thread take the interrupt, handle it, and retry + # the lock acquisition. Then we'll let it run. + time.sleep(0.5) + lock.release() + thread.start_new_thread(other_thread, ()) + # Wait until we can't acquire it without blocking... + while lock.acquire(blocking=False): + lock.release() + time.sleep(0.01) + result = lock.acquire() # Block while we receive a signal. 
+ self.assertTrue(self.sig_recvd) + self.assertTrue(result) + finally: + signal.signal(signal.SIGUSR1, old_handler) + + def test_lock_acquire_retries_on_intr(self): + self.acquire_retries_on_intr(thread.allocate_lock()) + + def test_rlock_acquire_retries_on_intr(self): + self.acquire_retries_on_intr(thread.RLock()) + + def test_interrupted_timed_acquire(self): + # Test to make sure we recompute lock acquisition timeouts when we + # receive a signal. Check this by repeatedly interrupting a lock + # acquire in the main thread, and make sure that the lock acquire times + # out after the right amount of time. + # NOTE: this test only behaves as expected if C signals get delivered + # to the main thread. Otherwise lock.acquire() itself doesn't get + # interrupted and the test trivially succeeds. + self.start = None + self.end = None + self.sigs_recvd = 0 + done = thread.allocate_lock() + done.acquire() + lock = thread.allocate_lock() + lock.acquire() + def my_handler(signum, frame): + self.sigs_recvd += 1 + old_handler = signal.signal(signal.SIGUSR1, my_handler) + try: + def timed_acquire(): + self.start = time.time() + lock.acquire(timeout=0.5) + self.end = time.time() + def send_signals(): + for _ in range(40): + time.sleep(0.02) + os.kill(process_pid, signal.SIGUSR1) + done.release() + + # Send the signals from the non-main thread, since the main thread + # is the only one that can process signals. + thread.start_new_thread(send_signals, ()) + timed_acquire() + # Wait for thread to finish + done.acquire() + # This allows for some timing and scheduling imprecision + self.assertLess(self.end - self.start, 2.0) + self.assertGreater(self.end - self.start, 0.3) + # If the signal is received several times before PyErr_CheckSignals() + # is called, the handler will get called less than 40 times. Just + # check it's been called at least once. + self.assertGreater(self.sigs_recvd, 0) + finally: + signal.signal(signal.SIGUSR1, old_handler) + def test_main(): global signal_blackboard Modified: python/branches/pep-3151/Lib/test/test_time.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_time.py (original) +++ python/branches/pep-3151/Lib/test/test_time.py Sat Feb 26 08:16:32 2011 @@ -2,6 +2,8 @@ import time import unittest import locale +import sysconfig +import warnings class TimeTestCase(unittest.TestCase): @@ -18,10 +20,10 @@ time.clock() def test_conversions(self): - self.assertTrue(time.ctime(self.t) - == time.asctime(time.localtime(self.t))) - self.assertTrue(int(time.mktime(time.localtime(self.t))) - == int(self.t)) + self.assertEqual(time.ctime(self.t), + time.asctime(time.localtime(self.t))) + self.assertEqual(int(time.mktime(time.localtime(self.t))), + int(self.t)) def test_sleep(self): time.sleep(1.2) @@ -41,14 +43,8 @@ # Make sure that strftime() checks the bounds of the various parts #of the time tuple (0 is valid for *all* values). - # Check year [1900, max(int)] - self.assertRaises(ValueError, func, - (1899, 1, 1, 0, 0, 0, 0, 1, -1)) - if time.accept2dyear: - self.assertRaises(ValueError, func, - (-1, 1, 1, 0, 0, 0, 0, 1, -1)) - self.assertRaises(ValueError, func, - (100, 1, 1, 0, 0, 0, 0, 1, -1)) + # The year field is tested by other test cases above + # Check month [1, 12] + zero support self.assertRaises(ValueError, func, (1900, -1, 1, 0, 0, 0, 0, 1, -1)) @@ -96,8 +92,9 @@ # No test for daylight savings since strftime() does not change output # based on its value. 
expected = "2000 01 01 00 00 00 1 001" - result = time.strftime("%Y %m %d %H %M %S %w %j", (0,)*9) - self.assertEquals(expected, result) + with support.check_warnings(): + result = time.strftime("%Y %m %d %H %M %S %w %j", (0,)*9) + self.assertEqual(expected, result) def test_strptime(self): # Should be able to go round-trip from strftime to strptime without @@ -121,14 +118,38 @@ def test_asctime(self): time.asctime(time.gmtime(self.t)) + + # Max year is only limited by the size of C int. + sizeof_int = sysconfig.get_config_var('SIZEOF_INT') or 4 + bigyear = (1 << 8 * sizeof_int - 1) - 1 + asc = time.asctime((bigyear, 6, 1) + (0,)*6) + self.assertEqual(asc[-len(str(bigyear)):], str(bigyear)) + self.assertRaises(OverflowError, time.asctime, (bigyear + 1,) + (0,)*8) self.assertRaises(TypeError, time.asctime, 0) + self.assertRaises(TypeError, time.asctime, ()) + self.assertRaises(TypeError, time.asctime, (0,) * 10) def test_asctime_bounding_check(self): self._bounds_checking(time.asctime) + def test_ctime(self): + t = time.mktime((1973, 9, 16, 1, 3, 52, 0, 0, -1)) + self.assertEqual(time.ctime(t), 'Sun Sep 16 01:03:52 1973') + t = time.mktime((2000, 1, 1, 0, 0, 0, 0, 0, -1)) + self.assertEqual(time.ctime(t), 'Sat Jan 1 00:00:00 2000') + for year in [-100, 100, 1000, 2000, 10000]: + try: + testval = time.mktime((year, 1, 10) + (0,)*6) + except (ValueError, OverflowError): + # If mktime fails, ctime will fail too. This may happen + # on some platforms. + pass + else: + self.assertEqual(time.ctime(testval)[20:], str(year)) + + @unittest.skipIf(not hasattr(time, "tzset"), + "time module has no attribute tzset") def test_tzset(self): - if not hasattr(time, "tzset"): - return # Can't test this; don't want the test suite to fail from os import environ @@ -215,14 +236,14 @@ gt1 = time.gmtime(None) t0 = time.mktime(gt0) t1 = time.mktime(gt1) - self.assertTrue(0 <= (t1-t0) < 0.2) + self.assertAlmostEqual(t1, t0, delta=0.2) def test_localtime_without_arg(self): lt0 = time.localtime() lt1 = time.localtime(None) t0 = time.mktime(lt0) t1 = time.mktime(lt1) - self.assertTrue(0 <= (t1-t0) < 0.2) + self.assertAlmostEqual(t1, t0, delta=0.2) class TestLocale(unittest.TestCase): def setUp(self): @@ -240,8 +261,142 @@ # This should not cause an exception time.strftime("%B", (2009,2,1,0,0,0,0,0,0)) + +class _BaseYearTest(unittest.TestCase): + accept2dyear = None + + def setUp(self): + self.saved_accept2dyear = time.accept2dyear + time.accept2dyear = self.accept2dyear + + def tearDown(self): + time.accept2dyear = self.saved_accept2dyear + + def yearstr(self, y): + raise NotImplementedError() + +class _TestAsctimeYear: + def yearstr(self, y): + return time.asctime((y,) + (0,) * 8).split()[-1] + + def test_large_year(self): + # Check that it doesn't crash for year > 9999 + self.assertEqual(self.yearstr(12345), '12345') + self.assertEqual(self.yearstr(123456789), '123456789') + +class _TestStrftimeYear: + def yearstr(self, y): + return time.strftime('%Y', (y,) + (0,) * 8).split()[-1] + + def test_large_year(self): + # Check that it doesn't crash for year > 9999 + try: + text = self.yearstr(12345) + except ValueError: + # strftime() is limited to [1; 9999] with Visual Studio + return + # Issue #10864: OpenIndiana is limited to 4 digits, + # but Python doesn't raise a ValueError + #self.assertEqual(text, '12345') + #self.assertEqual(self.yearstr(123456789), '123456789') + self.assertIn(text, ('2345', '12345')) + self.assertIn(self.yearstr(123456789), ('123456789', '6789')) + +class _Test2dYear(_BaseYearTest): + 
accept2dyear = 1 + + def test_year(self): + with support.check_warnings(): + self.assertEqual(self.yearstr(0), '2000') + self.assertEqual(self.yearstr(69), '1969') + self.assertEqual(self.yearstr(68), '2068') + self.assertEqual(self.yearstr(99), '1999') + + def test_invalid(self): + self.assertRaises(ValueError, self.yearstr, -1) + self.assertRaises(ValueError, self.yearstr, 100) + self.assertRaises(ValueError, self.yearstr, 999) + +class _Test4dYear(_BaseYearTest): + accept2dyear = 0 + + def test_year(self): + self.assertIn(self.yearstr(1), ('1', '0001')) + self.assertIn(self.yearstr(68), ('68', '0068')) + self.assertIn(self.yearstr(69), ('69', '0069')) + self.assertIn(self.yearstr(99), ('99', '0099')) + self.assertIn(self.yearstr(999), ('999', '0999')) + self.assertEqual(self.yearstr(9999), '9999') + + def test_negative(self): + try: + text = self.yearstr(-1) + except ValueError: + # strftime() is limited to [1; 9999] with Visual Studio + return + self.assertIn(text, ('-1', '-001')) + + self.assertEqual(self.yearstr(-1234), '-1234') + self.assertEqual(self.yearstr(-123456), '-123456') + + + def test_mktime(self): + # Issue #1726687 + for t in (-2, -1, 0, 1): + try: + tt = time.localtime(t) + except (OverflowError, ValueError): + pass + else: + self.assertEqual(time.mktime(tt), t) + # It may not be possible to reliably make mktime return error + # on all platfom. This will make sure that no other exception + # than OverflowError is raised for an extreme value. + try: + time.mktime((-1, 1, 1, 0, 0, 0, -1, -1, -1)) + except OverflowError: + pass + +class TestAsctimeAccept2dYear(_TestAsctimeYear, _Test2dYear): + pass + +class TestStrftimeAccept2dYear(_TestStrftimeYear, _Test2dYear): + pass + +class TestAsctime4dyear(_TestAsctimeYear, _Test4dYear): + pass + +class TestStrftime4dyear(_TestStrftimeYear, _Test4dYear): + pass + +class Test2dyearBool(_TestAsctimeYear, _Test2dYear): + accept2dyear = True + +class Test4dyearBool(_TestAsctimeYear, _Test4dYear): + accept2dyear = False + +class TestAccept2YearBad(_TestAsctimeYear, _BaseYearTest): + class X: + def __bool__(self): + raise RuntimeError('boo') + accept2dyear = X() + def test_2dyear(self): + pass + def test_invalid(self): + self.assertRaises(RuntimeError, self.yearstr, 200) + + def test_main(): - support.run_unittest(TimeTestCase, TestLocale) + support.run_unittest( + TimeTestCase, + TestLocale, + TestAsctimeAccept2dYear, + TestStrftimeAccept2dYear, + TestAsctime4dyear, + TestStrftime4dyear, + Test2dyearBool, + Test4dyearBool, + TestAccept2YearBad) if __name__ == "__main__": test_main() Modified: python/branches/pep-3151/Lib/test/test_timeout.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_timeout.py (original) +++ python/branches/pep-3151/Lib/test/test_timeout.py Sat Feb 26 08:16:32 2011 @@ -7,6 +7,7 @@ skip_expected = not support.is_resource_enabled('network') import time +import errno import socket @@ -88,8 +89,6 @@ class TimeoutTestCase(unittest.TestCase): - """Test case for socket.socket() timeout functions""" - # There are a number of tests here trying to make sure that an operation # doesn't take too much longer than expected. But competing machine # activity makes it inevitable that such tests will fail at times. @@ -98,10 +97,42 @@ # solution. fuzz = 2.0 + localhost = '127.0.0.1' + + def setUp(self): + raise NotImplementedError() + + tearDown = setUp + + def _sock_operation(self, count, timeout, method, *args): + """ + Test the specified socket method. 
+ + The method is run at most `count` times and must raise a socket.timeout + within `timeout` + self.fuzz seconds. + """ + self.sock.settimeout(timeout) + method = getattr(self.sock, method) + for i in range(count): + t1 = time.time() + try: + method(*args) + except socket.timeout as e: + delta = time.time() - t1 + break + else: + self.fail('socket.timeout was not raised') + # These checks should account for timing unprecision + self.assertLess(delta, timeout + self.fuzz) + self.assertGreater(delta, timeout - 1.0) + + +class TCPTimeoutTestCase(TimeoutTestCase): + """TCP test case for socket.socket() timeout functions""" + def setUp(self): self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) self.addr_remote = ('www.python.org.', 80) - self.localhost = '127.0.0.1' def tearDown(self): self.sock.close() @@ -113,90 +144,73 @@ # with the connect time. This avoids failing the assertion that # the timeout occurred fast enough. addr = ('10.0.0.0', 12345) - - # Test connect() timeout - _timeout = 0.001 - self.sock.settimeout(_timeout) - - _t1 = time.time() - self.assertRaises(socket.error, self.sock.connect, addr) - _t2 = time.time() - - _delta = abs(_t1 - _t2) - self.assertTrue(_delta < _timeout + self.fuzz, - "timeout (%g) is more than %g seconds more than expected (%g)" - %(_delta, self.fuzz, _timeout)) + with support.transient_internet(addr[0]): + self._sock_operation(1, 0.001, 'connect', addr) def testRecvTimeout(self): # Test recv() timeout - _timeout = 0.02 - with support.transient_internet(self.addr_remote[0]): self.sock.connect(self.addr_remote) - self.sock.settimeout(_timeout) - - _t1 = time.time() - self.assertRaises(socket.timeout, self.sock.recv, 1024) - _t2 = time.time() - - _delta = abs(_t1 - _t2) - self.assertTrue(_delta < _timeout + self.fuzz, - "timeout (%g) is %g seconds more than expected (%g)" - %(_delta, self.fuzz, _timeout)) + self._sock_operation(1, 1.5, 'recv', 1024) def testAcceptTimeout(self): # Test accept() timeout - _timeout = 2 - self.sock.settimeout(_timeout) - # Prevent "Address already in use" socket exceptions support.bind_port(self.sock, self.localhost) self.sock.listen(5) - - _t1 = time.time() - self.assertRaises(socket.error, self.sock.accept) - _t2 = time.time() - - _delta = abs(_t1 - _t2) - self.assertTrue(_delta < _timeout + self.fuzz, - "timeout (%g) is %g seconds more than expected (%g)" - %(_delta, self.fuzz, _timeout)) - - def testRecvfromTimeout(self): - # Test recvfrom() timeout - _timeout = 2 - self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) - self.sock.settimeout(_timeout) - # Prevent "Address already in use" socket exceptions - support.bind_port(self.sock, self.localhost) - - _t1 = time.time() - self.assertRaises(socket.error, self.sock.recvfrom, 8192) - _t2 = time.time() - - _delta = abs(_t1 - _t2) - self.assertTrue(_delta < _timeout + self.fuzz, - "timeout (%g) is %g seconds more than expected (%g)" - %(_delta, self.fuzz, _timeout)) + self._sock_operation(1, 1.5, 'accept') def testSend(self): # Test send() timeout - # couldn't figure out how to test it - pass + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as serv: + support.bind_port(serv, self.localhost) + serv.listen(5) + self.sock.connect(serv.getsockname()) + # Send a lot of data in order to bypass buffering in the TCP stack. 
+ self._sock_operation(100, 1.5, 'send', b"X" * 200000) def testSendto(self): # Test sendto() timeout - # couldn't figure out how to test it - pass + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as serv: + support.bind_port(serv, self.localhost) + serv.listen(5) + self.sock.connect(serv.getsockname()) + # The address argument is ignored since we already connected. + self._sock_operation(100, 1.5, 'sendto', b"X" * 200000, + serv.getsockname()) def testSendall(self): # Test sendall() timeout - # couldn't figure out how to test it - pass + with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as serv: + support.bind_port(serv, self.localhost) + serv.listen(5) + self.sock.connect(serv.getsockname()) + # Send a lot of data in order to bypass buffering in the TCP stack. + self._sock_operation(100, 1.5, 'sendall', b"X" * 200000) + + +class UDPTimeoutTestCase(TimeoutTestCase): + """UDP test case for socket.socket() timeout functions""" + + def setUp(self): + self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) + + def tearDown(self): + self.sock.close() + + def testRecvfromTimeout(self): + # Test recvfrom() timeout + # Prevent "Address already in use" socket exceptions + support.bind_port(self.sock, self.localhost) + self._sock_operation(1, 1.5, 'recvfrom', 1024) def test_main(): support.requires('network') - support.run_unittest(CreationTestCase, TimeoutTestCase) + support.run_unittest( + CreationTestCase, + TCPTimeoutTestCase, + UDPTimeoutTestCase, + ) if __name__ == "__main__": test_main() Modified: python/branches/pep-3151/Lib/test/test_tokenize.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_tokenize.py (original) +++ python/branches/pep-3151/Lib/test/test_tokenize.py Sat Feb 26 08:16:32 2011 @@ -689,8 +689,8 @@ # skip the initial encoding token and the end token tokens = list(_tokenize(readline, encoding='utf-8'))[1:-1] expected_tokens = [(3, '"??????????"', (1, 0), (1, 7), '"??????????"')] - self.assertEquals(tokens, expected_tokens, - "bytes not decoded with encoding") + self.assertEqual(tokens, expected_tokens, + "bytes not decoded with encoding") def test__tokenize_does_not_decode_with_encoding_none(self): literal = '"??????????"' @@ -706,8 +706,8 @@ # skip the end token tokens = list(_tokenize(readline, encoding=None))[:-1] expected_tokens = [(3, '"??????????"', (1, 0), (1, 7), '"??????????"')] - self.assertEquals(tokens, expected_tokens, - "string not tokenized when encoding is None") + self.assertEqual(tokens, expected_tokens, + "string not tokenized when encoding is None") class TestDetectEncoding(TestCase): @@ -730,8 +730,8 @@ b'do_something(else)\n' ) encoding, consumed_lines = detect_encoding(self.get_readline(lines)) - self.assertEquals(encoding, 'utf-8') - self.assertEquals(consumed_lines, list(lines[:2])) + self.assertEqual(encoding, 'utf-8') + self.assertEqual(consumed_lines, list(lines[:2])) def test_bom_no_cookie(self): lines = ( @@ -740,9 +740,9 @@ b'do_something(else)\n' ) encoding, consumed_lines = detect_encoding(self.get_readline(lines)) - self.assertEquals(encoding, 'utf-8-sig') - self.assertEquals(consumed_lines, - [b'# something\n', b'print(something)\n']) + self.assertEqual(encoding, 'utf-8-sig') + self.assertEqual(consumed_lines, + [b'# something\n', b'print(something)\n']) def test_cookie_first_line_no_bom(self): lines = ( @@ -751,8 +751,8 @@ b'do_something(else)\n' ) encoding, consumed_lines = detect_encoding(self.get_readline(lines)) - self.assertEquals(encoding, 
'iso-8859-1') - self.assertEquals(consumed_lines, [b'# -*- coding: latin-1 -*-\n']) + self.assertEqual(encoding, 'iso-8859-1') + self.assertEqual(consumed_lines, [b'# -*- coding: latin-1 -*-\n']) def test_matched_bom_and_cookie_first_line(self): lines = ( @@ -761,8 +761,8 @@ b'do_something(else)\n' ) encoding, consumed_lines = detect_encoding(self.get_readline(lines)) - self.assertEquals(encoding, 'utf-8-sig') - self.assertEquals(consumed_lines, [b'# coding=utf-8\n']) + self.assertEqual(encoding, 'utf-8-sig') + self.assertEqual(consumed_lines, [b'# coding=utf-8\n']) def test_mismatched_bom_and_cookie_first_line_raises_syntaxerror(self): lines = ( @@ -781,9 +781,9 @@ b'do_something(else)\n' ) encoding, consumed_lines = detect_encoding(self.get_readline(lines)) - self.assertEquals(encoding, 'ascii') + self.assertEqual(encoding, 'ascii') expected = [b'#! something\n', b'# vim: set fileencoding=ascii :\n'] - self.assertEquals(consumed_lines, expected) + self.assertEqual(consumed_lines, expected) def test_matched_bom_and_cookie_second_line(self): lines = ( @@ -793,9 +793,9 @@ b'do_something(else)\n' ) encoding, consumed_lines = detect_encoding(self.get_readline(lines)) - self.assertEquals(encoding, 'utf-8-sig') - self.assertEquals(consumed_lines, - [b'#! something\n', b'f# coding=utf-8\n']) + self.assertEqual(encoding, 'utf-8-sig') + self.assertEqual(consumed_lines, + [b'#! something\n', b'f# coding=utf-8\n']) def test_mismatched_bom_and_cookie_second_line_raises_syntaxerror(self): lines = ( @@ -820,7 +820,7 @@ b"do_something += 4\n") rl = self.get_readline(lines) found, consumed_lines = detect_encoding(rl) - self.assertEquals(found, "iso-8859-1") + self.assertEqual(found, "iso-8859-1") def test_utf8_normalization(self): # See get_normal_name() in tokenizer.c. 
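# A minimal usage sketch (not part of the patch above) showing the
# tokenize.detect_encoding() behaviour these tests exercise: a coding
# cookie such as "latin-1" is reported under its normalized name
# "iso-8859-1". The sample source bytes below are illustrative only.
import io
from tokenize import detect_encoding

source = b'# -*- coding: latin-1 -*-\nprint("hello")\n'
encoding, consumed_lines = detect_encoding(io.BytesIO(source).readline)
# encoding == 'iso-8859-1'
# consumed_lines == [b'# -*- coding: latin-1 -*-\n']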
@@ -833,27 +833,27 @@ b"1 + 3\n") rl = self.get_readline(lines) found, consumed_lines = detect_encoding(rl) - self.assertEquals(found, "utf-8") + self.assertEqual(found, "utf-8") def test_short_files(self): readline = self.get_readline((b'print(something)\n',)) encoding, consumed_lines = detect_encoding(readline) - self.assertEquals(encoding, 'utf-8') - self.assertEquals(consumed_lines, [b'print(something)\n']) + self.assertEqual(encoding, 'utf-8') + self.assertEqual(consumed_lines, [b'print(something)\n']) encoding, consumed_lines = detect_encoding(self.get_readline(())) - self.assertEquals(encoding, 'utf-8') - self.assertEquals(consumed_lines, []) + self.assertEqual(encoding, 'utf-8') + self.assertEqual(consumed_lines, []) readline = self.get_readline((b'\xef\xbb\xbfprint(something)\n',)) encoding, consumed_lines = detect_encoding(readline) - self.assertEquals(encoding, 'utf-8-sig') - self.assertEquals(consumed_lines, [b'print(something)\n']) + self.assertEqual(encoding, 'utf-8-sig') + self.assertEqual(consumed_lines, [b'print(something)\n']) readline = self.get_readline((b'\xef\xbb\xbf',)) encoding, consumed_lines = detect_encoding(readline) - self.assertEquals(encoding, 'utf-8-sig') - self.assertEquals(consumed_lines, []) + self.assertEqual(encoding, 'utf-8-sig') + self.assertEqual(consumed_lines, []) readline = self.get_readline((b'# coding: bad\n',)) self.assertRaises(SyntaxError, detect_encoding, readline) @@ -912,7 +912,7 @@ tokenize_module._tokenize = mock__tokenize try: results = tokenize(mock_readline) - self.assertEquals(list(results), ['first', 'second', 1, 2, 3, 4]) + self.assertEqual(list(results), ['first', 'second', 1, 2, 3, 4]) finally: tokenize_module.detect_encoding = orig_detect_encoding tokenize_module._tokenize = orig__tokenize Modified: python/branches/pep-3151/Lib/test/test_trace.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_trace.py (original) +++ python/branches/pep-3151/Lib/test/test_trace.py Sat Feb 26 08:16:32 2011 @@ -102,6 +102,7 @@ class TestLineCounts(unittest.TestCase): """White-box testing of line-counting, via runfunc""" def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) self.tracer = Trace(count=1, trace=0, countfuncs=0, countcallers=0) self.my_py_filename = fix_ext_py(__file__) @@ -192,6 +193,7 @@ """A simple sanity test of line-counting, via runctx (exec)""" def setUp(self): self.my_py_filename = fix_ext_py(__file__) + self.addCleanup(sys.settrace, sys.gettrace()) def test_exec_counts(self): self.tracer = Trace(count=1, trace=0, countfuncs=0, countcallers=0) @@ -218,6 +220,7 @@ class TestFuncs(unittest.TestCase): """White-box testing of funcs tracing""" def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) self.tracer = Trace(count=0, trace=0, countfuncs=1) self.filemod = my_file_and_modname() @@ -242,6 +245,8 @@ } self.assertEqual(self.tracer.results().calledfuncs, expected) + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'pre-existing trace function throws off measurements') def test_inst_method_calling(self): obj = TracedClass(20) self.tracer.runfunc(obj.inst_method_calling, 1) @@ -257,9 +262,12 @@ class TestCallers(unittest.TestCase): """White-box testing of callers tracing""" def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) self.tracer = Trace(count=0, trace=0, countcallers=1) self.filemod = my_file_and_modname() + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'pre-existing trace function 
throws off measurements') def test_loop_caller_importing(self): self.tracer.runfunc(traced_func_importing_caller, 1) @@ -280,6 +288,9 @@ # Created separately for issue #3821 class TestCoverage(unittest.TestCase): + def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) + def tearDown(self): rmtree(TESTFN) unlink(TESTFN) @@ -311,7 +322,7 @@ self._coverage(tracer) if os.path.exists(TESTFN): files = os.listdir(TESTFN) - self.assertEquals(files, []) + self.assertEqual(files, []) def test_issue9936(self): tracer = trace.Trace(trace=0, count=1) @@ -330,7 +341,7 @@ lines, cov, module = line.split()[:3] coverage[module] = (int(lines), int(cov[:-1])) # XXX This is needed to run regrtest.py as a script - modname = trace.fullmodname(sys.modules[modname].__file__) + modname = trace._fullmodname(sys.modules[modname].__file__) self.assertIn(modname, coverage) self.assertEqual(coverage[modname], (5, 100)) @@ -340,7 +351,7 @@ class Test_Ignore(unittest.TestCase): def test_ignored(self): jn = os.path.join - ignore = trace.Ignore(['x', 'y.z'], [jn('foo', 'bar')]) + ignore = trace._Ignore(['x', 'y.z'], [jn('foo', 'bar')]) self.assertTrue(ignore.names('x.py', 'x')) self.assertFalse(ignore.names('xy.py', 'xy')) self.assertFalse(ignore.names('y.py', 'y')) Modified: python/branches/pep-3151/Lib/test/test_traceback.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_traceback.py (original) +++ python/branches/pep-3151/Lib/test/test_traceback.py Sat Feb 26 08:16:32 2011 @@ -164,11 +164,11 @@ raise Error("unable to create test traceback string") # Make sure that Python and the traceback module format the same thing - self.assertEquals(traceback_fmt, python_fmt) + self.assertEqual(traceback_fmt, python_fmt) # Make sure that the traceback is properly indented. 
tb_lines = python_fmt.splitlines() - self.assertEquals(len(tb_lines), 3) + self.assertEqual(len(tb_lines), 3) banner, location, source_line = tb_lines self.assertTrue(banner.startswith('Traceback')) self.assertTrue(location.startswith(' File')) @@ -212,7 +212,7 @@ except ZeroDivisionError as _: e = _ lines = self.get_report(e).splitlines() - self.assertEquals(len(lines), 4) + self.assertEqual(len(lines), 4) self.assertTrue(lines[0].startswith('Traceback')) self.assertTrue(lines[1].startswith(' File')) self.assertIn('1/0 # Marker', lines[2]) @@ -227,8 +227,8 @@ def outer_raise(): inner_raise() # Marker blocks = boundaries.split(self.get_report(outer_raise)) - self.assertEquals(len(blocks), 3) - self.assertEquals(blocks[1], cause_message) + self.assertEqual(len(blocks), 3) + self.assertEqual(blocks[1], cause_message) self.check_zero_div(blocks[0]) self.assertIn('inner_raise() # Marker', blocks[2]) @@ -241,8 +241,8 @@ def outer_raise(): inner_raise() # Marker blocks = boundaries.split(self.get_report(outer_raise)) - self.assertEquals(len(blocks), 3) - self.assertEquals(blocks[1], context_message) + self.assertEqual(len(blocks), 3) + self.assertEqual(blocks[1], context_message) self.check_zero_div(blocks[0]) self.assertIn('inner_raise() # Marker', blocks[2]) @@ -261,8 +261,8 @@ def outer_raise(): inner_raise() # Marker blocks = boundaries.split(self.get_report(outer_raise)) - self.assertEquals(len(blocks), 3) - self.assertEquals(blocks[1], cause_message) + self.assertEqual(len(blocks), 3) + self.assertEqual(blocks[1], cause_message) self.check_zero_div(blocks[0]) self.assertIn('inner_raise() # Marker', blocks[2]) @@ -279,8 +279,8 @@ def outer_raise(): inner_raise() # Marker blocks = boundaries.split(self.get_report(outer_raise)) - self.assertEquals(len(blocks), 3) - self.assertEquals(blocks[1], cause_message) + self.assertEqual(len(blocks), 3) + self.assertEqual(blocks[1], cause_message) # The first block is the KeyError raised from the ZeroDivisionError self.assertIn('raise KeyError from e', blocks[0]) self.assertNotIn('1/0', blocks[0]) @@ -313,7 +313,7 @@ traceback.format_exception(type(e), e, e.__traceback__)) with captured_output("stderr") as sio: traceback.print_exception(type(e), e, e.__traceback__) - self.assertEquals(sio.getvalue(), s) + self.assertEqual(sio.getvalue(), s) return s Modified: python/branches/pep-3151/Lib/test/test_ttk_guionly.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_ttk_guionly.py (original) +++ python/branches/pep-3151/Lib/test/test_ttk_guionly.py Sat Feb 26 08:16:32 2011 @@ -8,6 +8,7 @@ from _tkinter import TclError from tkinter import ttk from tkinter.test import runtktests +from tkinter.test.support import get_tk_root try: ttk.Button() @@ -22,8 +23,11 @@ elif 'gui' not in support.use_resources: support.use_resources.append('gui') - support.run_unittest( - *runtktests.get_tests(text=False, packages=['test_ttk'])) + try: + support.run_unittest( + *runtktests.get_tests(text=False, packages=['test_ttk'])) + finally: + get_tk_root().destroy() if __name__ == '__main__': test_main(enable_gui=True) Modified: python/branches/pep-3151/Lib/test/test_tuple.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_tuple.py (original) +++ python/branches/pep-3151/Lib/test/test_tuple.py Sat Feb 26 08:16:32 2011 @@ -6,7 +6,7 @@ type2test = tuple def test_constructors(self): - super().test_len() + super().test_constructors() # calling 
built-in types without argument must return empty self.assertEqual(tuple(), ()) t0_3 = (0, 1, 2, 3) Modified: python/branches/pep-3151/Lib/test/test_types.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_types.py (original) +++ python/branches/pep-3151/Lib/test/test_types.py Sat Feb 26 08:16:32 2011 @@ -396,13 +396,9 @@ self.assertEqual(len(format(0, cfmt)), len(format(x, cfmt))) def test_float__format__(self): - # these should be rewritten to use both format(x, spec) and - # x.__format__(spec) - def test(f, format_spec, result): - assert type(f) == float - assert type(format_spec) == str self.assertEqual(f.__format__(format_spec), result) + self.assertEqual(format(f, format_spec), result) test(0.0, 'f', '0.000000') @@ -516,9 +512,27 @@ self.assertRaises(ValueError, format, 1e-100, format_spec) self.assertRaises(ValueError, format, -1e-100, format_spec) - # Alternate formatting is not supported - self.assertRaises(ValueError, format, 0.0, '#') - self.assertRaises(ValueError, format, 0.0, '#20f') + # Alternate float formatting + test(1.0, '.0e', '1e+00') + test(1.0, '#.0e', '1.e+00') + test(1.0, '.0f', '1') + test(1.0, '#.0f', '1.') + test(1.1, 'g', '1.1') + test(1.1, '#g', '1.10000') + test(1.0, '.0%', '100%') + test(1.0, '#.0%', '100.%') + + # Issue 7094: Alternate formatting (specified by #) + test(1.0, '0e', '1.000000e+00') + test(1.0, '#0e', '1.000000e+00') + test(1.0, '0f', '1.000000' ) + test(1.0, '#0f', '1.000000') + test(1.0, '.1e', '1.0e+00') + test(1.0, '#.1e', '1.0e+00') + test(1.0, '.1f', '1.0') + test(1.0, '#.1f', '1.0') + test(1.0, '.1%', '100.0%') + test(1.0, '#.1%', '100.0%') # Issue 6902 test(12345.6, "0<20", '12345.60000000000000') Modified: python/branches/pep-3151/Lib/test/test_ucn.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_ucn.py (original) +++ python/branches/pep-3151/Lib/test/test_ucn.py Sat Feb 26 08:16:32 2011 @@ -88,9 +88,13 @@ self.checkletter("CJK UNIFIED IDEOGRAPH-3400", "\u3400") self.checkletter("CJK UNIFIED IDEOGRAPH-4DB5", "\u4db5") self.checkletter("CJK UNIFIED IDEOGRAPH-4E00", "\u4e00") - self.checkletter("CJK UNIFIED IDEOGRAPH-9FA5", "\u9fa5") + self.checkletter("CJK UNIFIED IDEOGRAPH-9FCB", "\u9fCB") self.checkletter("CJK UNIFIED IDEOGRAPH-20000", "\U00020000") self.checkletter("CJK UNIFIED IDEOGRAPH-2A6D6", "\U0002a6d6") + self.checkletter("CJK UNIFIED IDEOGRAPH-2A700", "\U0002A700") + self.checkletter("CJK UNIFIED IDEOGRAPH-2B734", "\U0002B734") + self.checkletter("CJK UNIFIED IDEOGRAPH-2B740", "\U0002B740") + self.checkletter("CJK UNIFIED IDEOGRAPH-2B81D", "\U0002B81D") def test_bmp_characters(self): import unicodedata Modified: python/branches/pep-3151/Lib/test/test_unicode.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_unicode.py (original) +++ python/branches/pep-3151/Lib/test/test_unicode.py Sat Feb 26 08:16:32 2011 @@ -11,6 +11,7 @@ import unittest import warnings from test import support, string_tests +import _string # Error handling (bad decoder return) def search_function(encoding): @@ -1168,19 +1169,27 @@ # Error handling (wrong arguments) self.assertRaises(TypeError, "hello".encode, 42, 42, 42) - # Error handling (PyUnicode_EncodeDecimal()) - self.assertRaises(UnicodeError, int, "\u0200") + # Error handling (lone surrogate in PyUnicode_TransformDecimalToASCII()) + self.assertRaises(UnicodeError, int, 
"\ud800") + self.assertRaises(UnicodeError, int, "\udf00") + self.assertRaises(UnicodeError, float, "\ud800") + self.assertRaises(UnicodeError, float, "\udf00") + self.assertRaises(UnicodeError, complex, "\ud800") + self.assertRaises(UnicodeError, complex, "\udf00") def test_codecs(self): # Encoding self.assertEqual('hello'.encode('ascii'), b'hello') self.assertEqual('hello'.encode('utf-7'), b'hello') self.assertEqual('hello'.encode('utf-8'), b'hello') - self.assertEqual('hello'.encode('utf8'), b'hello') + self.assertEqual('hello'.encode('utf-8'), b'hello') self.assertEqual('hello'.encode('utf-16-le'), b'h\000e\000l\000l\000o\000') self.assertEqual('hello'.encode('utf-16-be'), b'\000h\000e\000l\000l\000o') self.assertEqual('hello'.encode('latin-1'), b'hello') + # Default encoding is utf-8 + self.assertEqual('\u2603'.encode(), b'\xe2\x98\x83') + # Roundtrip safety for BMP (just the first 1024 chars) for c in range(1024): u = chr(c) @@ -1418,48 +1427,66 @@ self.assertEqual("%s" % s, '__str__ overridden') self.assertEqual("{}".format(s), '__str__ overridden') + # Test PyUnicode_FromFormat() def test_from_format(self): - from _testcapi import format_unicode + support.import_module('ctypes') + from ctypes import pythonapi, py_object, c_int + if sys.maxunicode == 65535: + name = "PyUnicodeUCS2_FromFormat" + else: + name = "PyUnicodeUCS4_FromFormat" + _PyUnicode_FromFormat = getattr(pythonapi, name) + _PyUnicode_FromFormat.restype = py_object + + def PyUnicode_FromFormat(format, *args): + cargs = tuple( + py_object(arg) if isinstance(arg, str) else arg + for arg in args) + return _PyUnicode_FromFormat(format, *cargs) # ascii format, non-ascii argument - text = format_unicode(b'ascii\x7f=%U', 'unicode\xe9') + text = PyUnicode_FromFormat(b'ascii\x7f=%U', 'unicode\xe9') self.assertEqual(text, 'ascii\x7f=unicode\xe9') - # non-ascii format, ascii argument: ensure that PyUnicode_FromFormat() - # raises an error for a non-ascii format string. 
- self.assertRaisesRegexp(ValueError, + # non-ascii format, ascii argument: ensure that PyUnicode_FromFormatV() + # raises an error + self.assertRaisesRegex(ValueError, '^PyUnicode_FromFormatV\(\) expects an ASCII-encoded format ' 'string, got a non-ASCII byte: 0xe9$', - format_unicode, b'unicode\xe9=%s', 'ascii') + PyUnicode_FromFormat, b'unicode\xe9=%s', 'ascii') + + self.assertEqual(PyUnicode_FromFormat(b'%c', c_int(0xabcd)), '\uabcd') + self.assertEqual(PyUnicode_FromFormat(b'%c', c_int(0x10ffff)), '\U0010ffff') # other tests - text = format_unicode(b'%%A:%A', 'abc\xe9\uabcd\U0010ffff') + text = PyUnicode_FromFormat(b'%%A:%A', 'abc\xe9\uabcd\U0010ffff') self.assertEqual(text, r"%A:'abc\xe9\uabcd\U0010ffff'") # Test PyUnicode_AsWideChar() def test_aswidechar(self): from _testcapi import unicode_aswidechar + support.import_module('ctypes') from ctypes import c_wchar, sizeof wchar, size = unicode_aswidechar('abcdef', 2) - self.assertEquals(size, 2) - self.assertEquals(wchar, 'ab') + self.assertEqual(size, 2) + self.assertEqual(wchar, 'ab') wchar, size = unicode_aswidechar('abc', 3) - self.assertEquals(size, 3) - self.assertEquals(wchar, 'abc') + self.assertEqual(size, 3) + self.assertEqual(wchar, 'abc') wchar, size = unicode_aswidechar('abc', 4) - self.assertEquals(size, 3) - self.assertEquals(wchar, 'abc\0') + self.assertEqual(size, 3) + self.assertEqual(wchar, 'abc\0') wchar, size = unicode_aswidechar('abc', 10) - self.assertEquals(size, 3) - self.assertEquals(wchar, 'abc\0') + self.assertEqual(size, 3) + self.assertEqual(wchar, 'abc\0') wchar, size = unicode_aswidechar('abc\0def', 20) - self.assertEquals(size, 7) - self.assertEquals(wchar, 'abc\0def\0') + self.assertEqual(size, 7) + self.assertEqual(wchar, 'abc\0def\0') nonbmp = chr(0x10ffff) if sizeof(c_wchar) == 2: @@ -1469,21 +1496,22 @@ buflen = 2 nchar = 1 wchar, size = unicode_aswidechar(nonbmp, buflen) - self.assertEquals(size, nchar) - self.assertEquals(wchar, nonbmp + '\0') + self.assertEqual(size, nchar) + self.assertEqual(wchar, nonbmp + '\0') # Test PyUnicode_AsWideCharString() def test_aswidecharstring(self): from _testcapi import unicode_aswidecharstring + support.import_module('ctypes') from ctypes import c_wchar, sizeof wchar, size = unicode_aswidecharstring('abc') - self.assertEquals(size, 3) - self.assertEquals(wchar, 'abc\0') + self.assertEqual(size, 3) + self.assertEqual(wchar, 'abc\0') wchar, size = unicode_aswidecharstring('abc\0def') - self.assertEquals(size, 7) - self.assertEquals(wchar, 'abc\0def\0') + self.assertEqual(size, 7) + self.assertEqual(wchar, 'abc\0def\0') nonbmp = chr(0x10ffff) if sizeof(c_wchar) == 2: @@ -1491,8 +1519,59 @@ else: # sizeof(c_wchar) == 4 nchar = 1 wchar, size = unicode_aswidecharstring(nonbmp) - self.assertEquals(size, nchar) - self.assertEquals(wchar, nonbmp + '\0') + self.assertEqual(size, nchar) + self.assertEqual(wchar, nonbmp + '\0') + + +class StringModuleTest(unittest.TestCase): + def test_formatter_parser(self): + def parse(format): + return list(_string.formatter_parser(format)) + + formatter = parse("prefix {2!s}xxx{0:^+10.3f}{obj.attr!s} {z[0]!s:10}") + self.assertEqual(formatter, [ + ('prefix ', '2', '', 's'), + ('xxx', '0', '^+10.3f', None), + ('', 'obj.attr', '', 's'), + (' ', 'z[0]', '10', 's'), + ]) + + formatter = parse("prefix {} suffix") + self.assertEqual(formatter, [ + ('prefix ', '', '', None), + (' suffix', None, None, None), + ]) + + formatter = parse("str") + self.assertEqual(formatter, [ + ('str', None, None, None), + ]) + + formatter = parse("") + 
self.assertEqual(formatter, []) + + formatter = parse("{0}") + self.assertEqual(formatter, [ + ('', '0', '', None), + ]) + + self.assertRaises(TypeError, _string.formatter_parser, 1) + + def test_formatter_field_name_split(self): + def split(name): + items = list(_string.formatter_field_name_split(name)) + items[1] = list(items[1]) + return items + self.assertEqual(split("obj"), ["obj", []]) + self.assertEqual(split("obj.arg"), ["obj", [(True, 'arg')]]) + self.assertEqual(split("obj[key]"), ["obj", [(False, 'key')]]) + self.assertEqual(split("obj.arg[key1][key2]"), [ + "obj", + [(True, 'arg'), + (False, 'key1'), + (False, 'key2'), + ]]) + self.assertRaises(TypeError, _string.formatter_field_name_split, 1) def test_main(): Modified: python/branches/pep-3151/Lib/test/test_unicodedata.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_unicodedata.py (original) +++ python/branches/pep-3151/Lib/test/test_unicodedata.py Sat Feb 26 08:16:32 2011 @@ -188,9 +188,22 @@ def test_pr29(self): # http://www.unicode.org/review/pr-29.html - for text in ("\u0b47\u0300\u0b3e", "\u1100\u0300\u1161"): + # See issues #1054943 and #10254. + composed = ("\u0b47\u0300\u0b3e", "\u1100\u0300\u1161", + 'Li\u030dt-s\u1e73\u0301', + '\u092e\u093e\u0930\u094d\u0915 \u091c\u093c' + + '\u0941\u0915\u0947\u0930\u092c\u0930\u094d\u0917', + '\u0915\u093f\u0930\u094d\u0917\u093f\u091c\u093c' + + '\u0938\u094d\u0924\u093e\u0928') + for text in composed: self.assertEqual(self.db.normalize('NFC', text), text) + def test_issue10254(self): + # Crash reported in #10254 + a = 'C\u0338' * 20 + 'C\u0327' + b = 'C\u0338' * 20 + '\xC7' + self.assertEqual(self.db.normalize('NFC', a), b) + def test_east_asian_width(self): eaw = self.db.east_asian_width self.assertRaises(TypeError, eaw, b'a') @@ -254,7 +267,7 @@ self.assertTrue(count >= 10) # should have tested at least the ASCII digits def test_bug_1704793(self): - self.assertEquals(self.db.lookup("GOTHIC LETTER FAIHU"), '\U00010346') + self.assertEqual(self.db.lookup("GOTHIC LETTER FAIHU"), '\U00010346') def test_ucd_510(self): import unicodedata Modified: python/branches/pep-3151/Lib/test/test_unittest.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_unittest.py (original) +++ python/branches/pep-3151/Lib/test/test_unittest.py Sat Feb 26 08:16:32 2011 @@ -4,8 +4,13 @@ def test_main(): + # used by regrtest support.run_unittest(unittest.test.suite()) support.reap_children() +def load_tests(*_): + # used by unittest + return unittest.test.suite() + if __name__ == "__main__": test_main() Modified: python/branches/pep-3151/Lib/test/test_urllib.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_urllib.py (original) +++ python/branches/pep-3151/Lib/test/test_urllib.py Sat Feb 26 08:16:32 2011 @@ -139,8 +139,10 @@ def fakehttp(self, fakedata): class FakeSocket(io.BytesIO): + io_refs = 1 def sendall(self, str): pass def makefile(self, *args, **kwds): + self.io_refs += 1 return self def read(self, amt=None): if self.closed: return b"" @@ -148,6 +150,10 @@ def readline(self, length=None): if self.closed: return b"" return io.BytesIO.readline(self, length) + def close(self): + self.io_refs -= 1 + if self.io_refs == 0: + io.BytesIO.close(self) class FakeHTTPConnection(http.client.HTTPConnection): def connect(self): self.sock = FakeSocket(fakedata) @@ -157,8 +163,8 @@ def 
unfakehttp(self): http.client.HTTPConnection = self._connection_class - def test_read(self): - self.fakehttp(b"Hello!") + def check_read(self, ver): + self.fakehttp(b"HTTP/" + ver + b" 200 OK\r\n\r\nHello!") try: fp = urlopen("http://python.org/") self.assertEqual(fp.readline(), b"Hello!") @@ -168,6 +174,17 @@ finally: self.unfakehttp() + def test_read_0_9(self): + # "0.9" response accepted (but not "simple responses" without + # a status line) + self.check_read(b"0.9") + + def test_read_1_0(self): + self.check_read(b"1.0") + + def test_read_1_1(self): + self.check_read(b"1.1") + def test_read_bogus(self): # urlopen() should raise IOError for many error codes. self.fakehttp(b'''HTTP/1.1 401 Authentication Required @@ -191,7 +208,7 @@ self.unfakehttp() def test_userpass_inurl(self): - self.fakehttp(b"Hello!") + self.fakehttp(b"HTTP/1.0 200 OK\r\n\r\nHello!") try: fp = urlopen("http://user:pass at python.org/") self.assertEqual(fp.readline(), b"Hello!") @@ -234,7 +251,7 @@ def constructLocalFileUrl(self, filePath): filePath = os.path.abspath(filePath) try: - filePath.encode("utf8") + filePath.encode("utf-8") except UnicodeEncodeError: raise unittest.SkipTest("filePath is not encodable to utf8") return "file://%s" % urllib.request.pathname2url(filePath) Modified: python/branches/pep-3151/Lib/test/test_urllib2.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_urllib2.py (original) +++ python/branches/pep-3151/Lib/test/test_urllib2.py Sat Feb 26 08:16:32 2011 @@ -4,6 +4,7 @@ import os import io import socket +import array import urllib.request from urllib.request import Request, OpenerDirector @@ -598,7 +599,7 @@ def sanepathname2url(path): try: - path.encode("utf8") + path.encode("utf-8") except UnicodeEncodeError: raise unittest.SkipTest("path is not encodable to utf8") urlpath = urllib.request.pathname2url(path) @@ -631,22 +632,32 @@ h = NullFTPHandler(data) o = h.parent = MockOpener() - for url, host, port, type_, dirs, filename, mimetype in [ + for url, host, port, user, passwd, type_, dirs, filename, mimetype in [ ("ftp://localhost/foo/bar/baz.html", - "localhost", ftplib.FTP_PORT, "I", + "localhost", ftplib.FTP_PORT, "", "", "I", + ["foo", "bar"], "baz.html", "text/html"), + ("ftp://parrot at localhost/foo/bar/baz.html", + "localhost", ftplib.FTP_PORT, "parrot", "", "I", + ["foo", "bar"], "baz.html", "text/html"), + ("ftp://%25parrot at localhost/foo/bar/baz.html", + "localhost", ftplib.FTP_PORT, "%parrot", "", "I", + ["foo", "bar"], "baz.html", "text/html"), + ("ftp://%2542parrot at localhost/foo/bar/baz.html", + "localhost", ftplib.FTP_PORT, "%42parrot", "", "I", ["foo", "bar"], "baz.html", "text/html"), ("ftp://localhost:80/foo/bar/", - "localhost", 80, "D", + "localhost", 80, "", "", "D", ["foo", "bar"], "", None), ("ftp://localhost/baz.gif;type=a", - "localhost", ftplib.FTP_PORT, "A", + "localhost", ftplib.FTP_PORT, "", "", "A", [], "baz.gif", None), # XXX really this should guess image/gif ]: req = Request(url) req.timeout = None r = h.ftp_open(req) # ftp authentication not yet implemented by FTPHandler - self.assertTrue(h.user == h.passwd == "") + self.assertEqual(h.user, user) + self.assertEqual(h.passwd, passwd) self.assertEqual(h.host, socket.gethostbyname(host)) self.assertEqual(h.port, port) self.assertEqual(h.dirs, dirs) @@ -747,7 +758,7 @@ else: self.assertIs(o.req, req) self.assertEqual(req.type, "ftp") - self.assertEqual(req.type is "ftp", ftp) + self.assertEqual(req.type == "ftp", ftp) def 
test_http(self): @@ -755,7 +766,7 @@ o = h.parent = MockOpener() url = "http://example.com/" - for method, data in [("GET", None), ("POST", "blah")]: + for method, data in [("GET", None), ("POST", b"blah")]: req = Request(url, data, {"Foo": "bar"}) req.timeout = None req.add_unredirected_header("Spam", "eggs") @@ -783,9 +794,13 @@ http.raise_on_endheaders = True self.assertRaises(urllib.error.URLError, h.do_open, http, req) + # Check for TypeError on POST data which is str. + req = Request("http://example.com/","badpost") + self.assertRaises(TypeError, h.do_request_, req) + # check adding of standard headers o.addheaders = [("Spam", "eggs")] - for data in "", None: # POST, GET + for data in b"", None: # POST, GET req = Request("http://example.com/", data) r = MockResponse(200, "OK", {}, "") newreq = h.do_request_(req) @@ -811,6 +826,48 @@ self.assertEqual(req.unredirected_hdrs["Host"], "baz") self.assertEqual(req.unredirected_hdrs["Spam"], "foo") + # Check iterable body support + def iterable_body(): + yield b"one" + yield b"two" + yield b"three" + + for headers in {}, {"Content-Length": 11}: + req = Request("http://example.com/", iterable_body(), headers) + if not headers: + # Having an iterable body without a Content-Length should + # raise an exception + self.assertRaises(ValueError, h.do_request_, req) + else: + newreq = h.do_request_(req) + + # A file object. + # Test only Content-Length attribute of request. + + file_obj = io.BytesIO() + file_obj.write(b"Something\nSomething\nSomething\n") + + for headers in {}, {"Content-Length": 30}: + req = Request("http://example.com/", file_obj, headers) + if not headers: + # Having an iterable body without a Content-Length should + # raise an exception + self.assertRaises(ValueError, h.do_request_, req) + else: + newreq = h.do_request_(req) + self.assertEqual(int(newreq.get_header('Content-length')),30) + + file_obj.close() + + # array.array Iterable - Content Length is calculated + + iterable_array = array.array("I",[1,2,3,4]) + + for headers in {}, {"Content-Length": 16}: + req = Request("http://example.com/", iterable_array, headers) + newreq = h.do_request_(req) + self.assertEqual(int(newreq.get_header('Content-length')),16) + def test_http_doubleslash(self): # Checks the presence of any unnecessary double slash in url does not # break anything. 
Previously, a double slash directly after the host @@ -818,7 +875,7 @@ h = urllib.request.AbstractHTTPHandler() o = h.parent = MockOpener() - data = "" + data = b"" ds_urls = [ "http://example.com/foo/bar/baz.html", "http://example.com//foo/bar/baz.html", @@ -838,6 +895,25 @@ p_ds_req = h.do_request_(ds_req) self.assertEqual(p_ds_req.unredirected_hdrs["Host"],"example.com") + def test_fixpath_in_weirdurls(self): + # Issue4493: urllib2 to supply '/' when to urls where path does not + # start with'/' + + h = urllib.request.AbstractHTTPHandler() + o = h.parent = MockOpener() + + weird_url = 'http://www.python.org?getspam' + req = Request(weird_url) + newreq = h.do_request_(req) + self.assertEqual(newreq.host,'www.python.org') + self.assertEqual(newreq.selector,'/?getspam') + + url_without_path = 'http://www.python.org' + req = Request(url_without_path) + newreq = h.do_request_(req) + self.assertEqual(newreq.host,'www.python.org') + self.assertEqual(newreq.selector,'') + def test_errors(self): h = urllib.request.HTTPErrorProcessor() Modified: python/branches/pep-3151/Lib/test/test_urllibnet.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_urllibnet.py (original) +++ python/branches/pep-3151/Lib/test/test_urllibnet.py Sat Feb 26 08:16:32 2011 @@ -13,7 +13,7 @@ class URLTimeoutTest(unittest.TestCase): - TIMEOUT = 10.0 + TIMEOUT = 30.0 def setUp(self): socket.setdefaulttimeout(self.TIMEOUT) Modified: python/branches/pep-3151/Lib/test/test_urlparse.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_urlparse.py (original) +++ python/branches/pep-3151/Lib/test/test_urlparse.py Sat Feb 26 08:16:32 2011 @@ -24,6 +24,17 @@ ("&a=b", [('a', 'b')]), ("a=a+b&b=b+c", [('a', 'a b'), ('b', 'b c')]), ("a=1&a=2", [('a', '1'), ('a', '2')]), + (b"", []), + (b"&", []), + (b"&&", []), + (b"=", [(b'', b'')]), + (b"=a", [(b'', b'a')]), + (b"a", [(b'a', b'')]), + (b"a=", [(b'a', b'')]), + (b"a=", [(b'a', b'')]), + (b"&a=b", [(b'a', b'b')]), + (b"a=a+b&b=b+c", [(b'a', b'a b'), (b'b', b'b c')]), + (b"a=1&a=2", [(b'a', b'1'), (b'a', b'2')]), ] class UrlParseTestCase(unittest.TestCase): @@ -86,7 +97,7 @@ def test_roundtrips(self): - testcases = [ + str_cases = [ ('file:///tmp/junk.txt', ('file', '', '/tmp/junk.txt', '', '', ''), ('file', '', '/tmp/junk.txt', '', '')), @@ -110,16 +121,21 @@ ('git+ssh', 'git at github.com','/user/project.git', '','',''), ('git+ssh', 'git at github.com','/user/project.git', - '', '')) + '', '')), ] - for url, parsed, split in testcases: + def _encode(t): + return (t[0].encode('ascii'), + tuple(x.encode('ascii') for x in t[1]), + tuple(x.encode('ascii') for x in t[2])) + bytes_cases = [_encode(x) for x in str_cases] + for url, parsed, split in str_cases + bytes_cases: self.checkRoundtrips(url, parsed, split) def test_http_roundtrips(self): # urllib.parse.urlsplit treats 'http:' as an optimized special case, # so we test both 'http:' and 'https:' in all the following. # Three cheers for white box knowledge! 
- testcases = [ + str_cases = [ ('://www.python.org', ('www.python.org', '', '', '', ''), ('www.python.org', '', '', '')), @@ -136,19 +152,34 @@ ('a', '/b/c/d', 'p', 'q', 'f'), ('a', '/b/c/d;p', 'q', 'f')), ] - for scheme in ('http', 'https'): - for url, parsed, split in testcases: - url = scheme + url - parsed = (scheme,) + parsed - split = (scheme,) + split - self.checkRoundtrips(url, parsed, split) + def _encode(t): + return (t[0].encode('ascii'), + tuple(x.encode('ascii') for x in t[1]), + tuple(x.encode('ascii') for x in t[2])) + bytes_cases = [_encode(x) for x in str_cases] + str_schemes = ('http', 'https') + bytes_schemes = (b'http', b'https') + str_tests = str_schemes, str_cases + bytes_tests = bytes_schemes, bytes_cases + for schemes, test_cases in (str_tests, bytes_tests): + for scheme in schemes: + for url, parsed, split in test_cases: + url = scheme + url + parsed = (scheme,) + parsed + split = (scheme,) + split + self.checkRoundtrips(url, parsed, split) def checkJoin(self, base, relurl, expected): - self.assertEqual(urllib.parse.urljoin(base, relurl), expected, - (base, relurl, expected)) + str_components = (base, relurl, expected) + self.assertEqual(urllib.parse.urljoin(base, relurl), expected) + bytes_components = baseb, relurlb, expectedb = [ + x.encode('ascii') for x in str_components] + self.assertEqual(urllib.parse.urljoin(baseb, relurlb), expectedb) def test_unparse_parse(self): - for u in ['Python', './Python','x-newscheme://foo.com/stuff','x://y','x:/y','x:/','/',]: + str_cases = ['Python', './Python','x-newscheme://foo.com/stuff','x://y','x:/y','x:/','/',] + bytes_cases = [x.encode('ascii') for x in str_cases] + for u in str_cases + bytes_cases: self.assertEqual(urllib.parse.urlunsplit(urllib.parse.urlsplit(u)), u) self.assertEqual(urllib.parse.urlunparse(urllib.parse.urlparse(u)), u) @@ -296,6 +327,9 @@ #self.checkJoin(RFC3986_BASE, 'http:g','http:g') # strict parser self.checkJoin(RFC3986_BASE, 'http:g','http://a/b/c/g') #relaxed parser + # Test for issue9721 + self.checkJoin('http://a/b/c/de', ';x','http://a/b/c/;x') + def test_urljoins(self): self.checkJoin(SIMPLE_BASE, 'g:h','g:h') self.checkJoin(SIMPLE_BASE, 'http:g','http://a/b/c/g') @@ -328,7 +362,7 @@ self.checkJoin(SIMPLE_BASE, 'http:g?y/./x','http://a/b/c/g?y/./x') def test_RFC2732(self): - for url, hostname, port in [ + str_cases = [ ('http://Test.python.org:5432/foo/', 'test.python.org', 5432), ('http://12.34.56.78:5432/foo/', '12.34.56.78', 5432), ('http://[::1]:5432/foo/', '::1', 5432), @@ -349,20 +383,26 @@ ('http://[::12.34.56.78]/foo/', '::12.34.56.78', None), ('http://[::ffff:12.34.56.78]/foo/', '::ffff:12.34.56.78', None), - ]: + ] + def _encode(t): + return t[0].encode('ascii'), t[1].encode('ascii'), t[2] + bytes_cases = [_encode(x) for x in str_cases] + for url, hostname, port in str_cases + bytes_cases: urlparsed = urllib.parse.urlparse(url) self.assertEqual((urlparsed.hostname, urlparsed.port) , (hostname, port)) - for invalid_url in [ + str_cases = [ 'http://::12.34.56.78]/', 'http://[::1/foo/', 'ftp://[::1/foo/bad]/bad', 'http://[::1/foo/bad]/bad', - 'http://[::ffff:12.34.56.78']: + 'http://[::ffff:12.34.56.78'] + bytes_cases = [x.encode('ascii') for x in str_cases] + for invalid_url in str_cases + bytes_cases: self.assertRaises(ValueError, urllib.parse.urlparse, invalid_url) def test_urldefrag(self): - for url, defrag, frag in [ + str_cases = [ ('http://python.org#frag', 'http://python.org', 'frag'), ('http://python.org', 'http://python.org', ''), ('http://python.org/#frag', 
'http://python.org/', 'frag'), @@ -373,8 +413,16 @@ ('http://python.org/p?q', 'http://python.org/p?q', ''), (RFC1808_BASE, 'http://a/b/c/d;p?q', 'f'), (RFC2396_BASE, 'http://a/b/c/d;p?q', ''), - ]: - self.assertEqual(urllib.parse.urldefrag(url), (defrag, frag)) + ] + def _encode(t): + return type(t)(x.encode('ascii') for x in t) + bytes_cases = [_encode(x) for x in str_cases] + for url, defrag, frag in str_cases + bytes_cases: + result = urllib.parse.urldefrag(url) + self.assertEqual(result.geturl(), url) + self.assertEqual(result, (defrag, frag)) + self.assertEqual(result.url, defrag) + self.assertEqual(result.fragment, frag) def test_urlsplit_attributes(self): url = "HTTP://WWW.PYTHON.ORG/doc/#frag" @@ -390,7 +438,8 @@ self.assertEqual(p.port, None) # geturl() won't return exactly the original URL in this case # since the scheme is always case-normalized - #self.assertEqual(p.geturl(), url) + # We handle this by ignoring the first 4 characters of the URL + self.assertEqual(p.geturl()[4:], url[4:]) url = "http://User:Pass at www.python.org:080/doc/?query=yes#frag" p = urllib.parse.urlsplit(url) @@ -422,6 +471,45 @@ self.assertEqual(p.port, 80) self.assertEqual(p.geturl(), url) + # And check them all again, only with bytes this time + url = b"HTTP://WWW.PYTHON.ORG/doc/#frag" + p = urllib.parse.urlsplit(url) + self.assertEqual(p.scheme, b"http") + self.assertEqual(p.netloc, b"WWW.PYTHON.ORG") + self.assertEqual(p.path, b"/doc/") + self.assertEqual(p.query, b"") + self.assertEqual(p.fragment, b"frag") + self.assertEqual(p.username, None) + self.assertEqual(p.password, None) + self.assertEqual(p.hostname, b"www.python.org") + self.assertEqual(p.port, None) + self.assertEqual(p.geturl()[4:], url[4:]) + + url = b"http://User:Pass at www.python.org:080/doc/?query=yes#frag" + p = urllib.parse.urlsplit(url) + self.assertEqual(p.scheme, b"http") + self.assertEqual(p.netloc, b"User:Pass at www.python.org:080") + self.assertEqual(p.path, b"/doc/") + self.assertEqual(p.query, b"query=yes") + self.assertEqual(p.fragment, b"frag") + self.assertEqual(p.username, b"User") + self.assertEqual(p.password, b"Pass") + self.assertEqual(p.hostname, b"www.python.org") + self.assertEqual(p.port, 80) + self.assertEqual(p.geturl(), url) + + url = b"http://User at example.com:Pass at www.python.org:080/doc/?query=yes#frag" + p = urllib.parse.urlsplit(url) + self.assertEqual(p.scheme, b"http") + self.assertEqual(p.netloc, b"User at example.com:Pass at www.python.org:080") + self.assertEqual(p.path, b"/doc/") + self.assertEqual(p.query, b"query=yes") + self.assertEqual(p.fragment, b"frag") + self.assertEqual(p.username, b"User at example.com") + self.assertEqual(p.password, b"Pass") + self.assertEqual(p.hostname, b"www.python.org") + self.assertEqual(p.port, 80) + self.assertEqual(p.geturl(), url) def test_attributes_bad_port(self): """Check handling of non-integer ports.""" @@ -433,6 +521,15 @@ self.assertEqual(p.netloc, "www.example.net:foo") self.assertRaises(ValueError, lambda: p.port) + # Once again, repeat ourselves to test bytes + p = urllib.parse.urlsplit(b"http://www.example.net:foo") + self.assertEqual(p.netloc, b"www.example.net:foo") + self.assertRaises(ValueError, lambda: p.port) + + p = urllib.parse.urlparse(b"http://www.example.net:foo") + self.assertEqual(p.netloc, b"www.example.net:foo") + self.assertRaises(ValueError, lambda: p.port) + def test_attributes_without_netloc(self): # This example is straight from RFC 3261. 
It looks like it # should allow the username, hostname, and port to be filled @@ -456,10 +553,30 @@ self.assertEqual(p.port, None) self.assertEqual(p.geturl(), uri) + # You guessed it, repeating the test with bytes input + uri = b"sip:alice at atlanta.com;maddr=239.255.255.1;ttl=15" + p = urllib.parse.urlsplit(uri) + self.assertEqual(p.netloc, b"") + self.assertEqual(p.username, None) + self.assertEqual(p.password, None) + self.assertEqual(p.hostname, None) + self.assertEqual(p.port, None) + self.assertEqual(p.geturl(), uri) + + p = urllib.parse.urlparse(uri) + self.assertEqual(p.netloc, b"") + self.assertEqual(p.username, None) + self.assertEqual(p.password, None) + self.assertEqual(p.hostname, None) + self.assertEqual(p.port, None) + self.assertEqual(p.geturl(), uri) + def test_noslash(self): # Issue 1637: http://foo.com?query is legal self.assertEqual(urllib.parse.urlparse("http://example.com?blahblah=/foo"), ('http', 'example.com', '', '', 'blahblah=/foo', '')) + self.assertEqual(urllib.parse.urlparse(b"http://example.com?blahblah=/foo"), + (b'http', b'example.com', b'', b'', b'blahblah=/foo', b'')) def test_withoutscheme(self): # Test urlparse without scheme @@ -472,6 +589,13 @@ ('','www.python.org:80','','','','')) self.assertEqual(urllib.parse.urlparse("http://www.python.org:80"), ('http','www.python.org:80','','','','')) + # Repeat for bytes input + self.assertEqual(urllib.parse.urlparse(b"path"), + (b'',b'',b'path',b'',b'',b'')) + self.assertEqual(urllib.parse.urlparse(b"//www.python.org:80"), + (b'',b'www.python.org:80',b'',b'',b'',b'')) + self.assertEqual(urllib.parse.urlparse(b"http://www.python.org:80"), + (b'http',b'www.python.org:80',b'',b'',b'',b'')) def test_portseparator(self): # Issue 754016 makes changes for port separator ':' from scheme separator @@ -481,6 +605,13 @@ self.assertEqual(urllib.parse.urlparse("https:"),('https','','','','','')) self.assertEqual(urllib.parse.urlparse("http://www.python.org:80"), ('http','www.python.org:80','','','','')) + # As usual, need to check bytes input as well + self.assertEqual(urllib.parse.urlparse(b"path:80"), + (b'',b'',b'path:80',b'',b'',b'')) + self.assertEqual(urllib.parse.urlparse(b"http:"),(b'http',b'',b'',b'',b'',b'')) + self.assertEqual(urllib.parse.urlparse(b"https:"),(b'https',b'',b'',b'',b'',b'')) + self.assertEqual(urllib.parse.urlparse(b"http://www.python.org:80"), + (b'http',b'www.python.org:80',b'',b'',b'',b'')) def test_usingsys(self): # Issue 3314: sys module is used in the error @@ -492,6 +623,98 @@ ('s3', 'foo.com', '/stuff', '', '', '')) self.assertEqual(urllib.parse.urlparse("x-newscheme://foo.com/stuff"), ('x-newscheme', 'foo.com', '/stuff', '', '', '')) + # And for bytes... + self.assertEqual(urllib.parse.urlparse(b"s3://foo.com/stuff"), + (b's3', b'foo.com', b'/stuff', b'', b'', b'')) + self.assertEqual(urllib.parse.urlparse(b"x-newscheme://foo.com/stuff"), + (b'x-newscheme', b'foo.com', b'/stuff', b'', b'', b'')) + + def test_mixed_types_rejected(self): + # Several functions that process either strings or ASCII encoded bytes + # accept multiple arguments. 
Check they reject mixed type input + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlparse("www.python.org", b"http") + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlparse(b"www.python.org", "http") + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlsplit("www.python.org", b"http") + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlsplit(b"www.python.org", "http") + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlunparse(( b"http", "www.python.org","","","","")) + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlunparse(("http", b"www.python.org","","","","")) + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlunsplit((b"http", "www.python.org","","","")) + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urlunsplit(("http", b"www.python.org","","","")) + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urljoin("http://python.org", b"http://python.org") + with self.assertRaisesRegex(TypeError, "Cannot mix str"): + urllib.parse.urljoin(b"http://python.org", "http://python.org") + + def _check_result_type(self, str_type): + num_args = len(str_type._fields) + bytes_type = str_type._encoded_counterpart + self.assertIs(bytes_type._decoded_counterpart, str_type) + str_args = ('',) * num_args + bytes_args = (b'',) * num_args + str_result = str_type(*str_args) + bytes_result = bytes_type(*bytes_args) + encoding = 'ascii' + errors = 'strict' + self.assertEqual(str_result, str_args) + self.assertEqual(bytes_result.decode(), str_args) + self.assertEqual(bytes_result.decode(), str_result) + self.assertEqual(bytes_result.decode(encoding), str_args) + self.assertEqual(bytes_result.decode(encoding), str_result) + self.assertEqual(bytes_result.decode(encoding, errors), str_args) + self.assertEqual(bytes_result.decode(encoding, errors), str_result) + self.assertEqual(bytes_result, bytes_args) + self.assertEqual(str_result.encode(), bytes_args) + self.assertEqual(str_result.encode(), bytes_result) + self.assertEqual(str_result.encode(encoding), bytes_args) + self.assertEqual(str_result.encode(encoding), bytes_result) + self.assertEqual(str_result.encode(encoding, errors), bytes_args) + self.assertEqual(str_result.encode(encoding, errors), bytes_result) + + def test_result_pairs(self): + # Check encoding and decoding between result pairs + result_types = [ + urllib.parse.DefragResult, + urllib.parse.SplitResult, + urllib.parse.ParseResult, + ] + for result_type in result_types: + self._check_result_type(result_type) + + def test_parse_qs_encoding(self): + result = urllib.parse.parse_qs("key=\u0141%E9", encoding="latin-1") + self.assertEqual(result, {'key': ['\u0141\xE9']}) + result = urllib.parse.parse_qs("key=\u0141%C3%A9", encoding="utf-8") + self.assertEqual(result, {'key': ['\u0141\xE9']}) + result = urllib.parse.parse_qs("key=\u0141%C3%A9", encoding="ascii") + self.assertEqual(result, {'key': ['\u0141\ufffd\ufffd']}) + result = urllib.parse.parse_qs("key=\u0141%E9-", encoding="ascii") + self.assertEqual(result, {'key': ['\u0141\ufffd-']}) + result = urllib.parse.parse_qs("key=\u0141%E9-", encoding="ascii", + errors="ignore") + self.assertEqual(result, {'key': ['\u0141-']}) + + def test_parse_qsl_encoding(self): + result = urllib.parse.parse_qsl("key=\u0141%E9", encoding="latin-1") + self.assertEqual(result, [('key', '\u0141\xE9')]) + result = 
urllib.parse.parse_qsl("key=\u0141%C3%A9", encoding="utf-8") + self.assertEqual(result, [('key', '\u0141\xE9')]) + result = urllib.parse.parse_qsl("key=\u0141%C3%A9", encoding="ascii") + self.assertEqual(result, [('key', '\u0141\ufffd\ufffd')]) + result = urllib.parse.parse_qsl("key=\u0141%E9-", encoding="ascii") + self.assertEqual(result, [('key', '\u0141\ufffd-')]) + result = urllib.parse.parse_qsl("key=\u0141%E9-", encoding="ascii", + errors="ignore") + self.assertEqual(result, [('key', '\u0141-')]) + + def test_main(): support.run_unittest(UrlParseTestCase) Modified: python/branches/pep-3151/Lib/test/test_uuid.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_uuid.py (original) +++ python/branches/pep-3151/Lib/test/test_uuid.py Sat Feb 26 08:16:32 2011 @@ -471,14 +471,14 @@ if pid == 0: os.close(fds[0]) value = uuid.uuid4() - os.write(fds[1], value.hex.encode('latin1')) + os.write(fds[1], value.hex.encode('latin-1')) os._exit(0) else: os.close(fds[1]) parent_value = uuid.uuid4().hex os.waitpid(pid, 0) - child_value = os.read(fds[0], 100).decode('latin1') + child_value = os.read(fds[0], 100).decode('latin-1') self.assertNotEqual(parent_value, child_value) Modified: python/branches/pep-3151/Lib/test/test_warnings.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_warnings.py (original) +++ python/branches/pep-3151/Lib/test/test_warnings.py Sat Feb 26 08:16:32 2011 @@ -80,7 +80,7 @@ self.module.resetwarnings() self.module.filterwarnings("ignore", category=UserWarning) self.module.warn("FilterTests.test_ignore", UserWarning) - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) def test_always(self): with original_warnings.catch_warnings(record=True, @@ -102,10 +102,10 @@ for x in range(2): self.module.warn(message, UserWarning) if x == 0: - self.assertEquals(w[-1].message, message) + self.assertEqual(w[-1].message, message) del w[:] elif x == 1: - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) else: raise ValueError("loop variant unhandled") @@ -116,10 +116,10 @@ self.module.filterwarnings("module", category=UserWarning) message = UserWarning("FilterTests.test_module") self.module.warn(message, UserWarning) - self.assertEquals(w[-1].message, message) + self.assertEqual(w[-1].message, message) del w[:] self.module.warn(message, UserWarning) - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) def test_once(self): with original_warnings.catch_warnings(record=True, @@ -129,14 +129,14 @@ message = UserWarning("FilterTests.test_once") self.module.warn_explicit(message, UserWarning, "test_warnings.py", 42) - self.assertEquals(w[-1].message, message) + self.assertEqual(w[-1].message, message) del w[:] self.module.warn_explicit(message, UserWarning, "test_warnings.py", 13) - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) self.module.warn_explicit(message, UserWarning, "test_warnings2.py", 42) - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) def test_inheritance(self): with original_warnings.catch_warnings(module=self.module) as w: @@ -157,7 +157,7 @@ self.module.warn("FilterTests.test_ordering", UserWarning) except UserWarning: self.fail("order handling for actions failed") - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) def test_filterwarnings(self): # Test filterwarnings(). @@ -217,7 +217,7 @@ self.module.warn(ob) # Don't directly compare objects since # ``Warning() != Warning()``. 
- self.assertEquals(str(w[-1].message), str(UserWarning(ob))) + self.assertEqual(str(w[-1].message), str(UserWarning(ob))) def test_filename(self): with warnings_state(self.module): @@ -448,7 +448,7 @@ self.assertEqual(w[-1].message, message) del w[:] self.module.warn_explicit(message, UserWarning, "file", 42) - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) # Test the resetting of onceregistry. self.module.onceregistry = {} __warningregistry__ = {} @@ -459,7 +459,7 @@ del self.module.onceregistry __warningregistry__ = {} self.module.warn_explicit(message, UserWarning, "file", 42) - self.assertEquals(len(w), 0) + self.assertEqual(len(w), 0) finally: self.module.onceregistry = original_registry Modified: python/branches/pep-3151/Lib/test/test_weakset.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_weakset.py (original) +++ python/branches/pep-3151/Lib/test/test_weakset.py Sat Feb 26 08:16:32 2011 @@ -50,7 +50,8 @@ def test_contains(self): for c in self.letters: self.assertEqual(c in self.s, c in self.d) - self.assertRaises(TypeError, self.s.__contains__, [[]]) + # 1 is not weakref'able, but that TypeError is caught by __contains__ + self.assertNotIn(1, self.s) self.assertIn(self.obj, self.fs) del self.obj self.assertNotIn(ustr('F'), self.fs) Modified: python/branches/pep-3151/Lib/test/test_winreg.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_winreg.py (original) +++ python/branches/pep-3151/Lib/test/test_winreg.py Sat Feb 26 08:16:32 2011 @@ -82,12 +82,12 @@ # Check we wrote as many items as we thought. nkeys, nvalues, since_mod = QueryInfoKey(key) - self.assertEquals(nkeys, 1, "Not the correct number of sub keys") - self.assertEquals(nvalues, 1, "Not the correct number of values") + self.assertEqual(nkeys, 1, "Not the correct number of sub keys") + self.assertEqual(nvalues, 1, "Not the correct number of values") nkeys, nvalues, since_mod = QueryInfoKey(sub_key) - self.assertEquals(nkeys, 0, "Not the correct number of sub keys") - self.assertEquals(nvalues, len(test_data), - "Not the correct number of values") + self.assertEqual(nkeys, 0, "Not the correct number of sub keys") + self.assertEqual(nvalues, len(test_data), + "Not the correct number of values") # Close this key this way... # (but before we do, copy the key as an integer - this allows # us to test that the key really gets closed). @@ -112,8 +112,8 @@ def _read_test_data(self, root_key, subkeystr="sub_key", OpenKey=OpenKey): # Check we can get default value for this key. 
val = QueryValue(root_key, test_key_name) - self.assertEquals(val, "Default value", - "Registry didn't give back the correct value") + self.assertEqual(val, "Default value", + "Registry didn't give back the correct value") key = OpenKey(root_key, test_key_name) # Read the sub-keys @@ -125,22 +125,22 @@ data = EnumValue(sub_key, index) except EnvironmentError: break - self.assertEquals(data in test_data, True, - "Didn't read back the correct test data") + self.assertEqual(data in test_data, True, + "Didn't read back the correct test data") index = index + 1 - self.assertEquals(index, len(test_data), - "Didn't read the correct number of items") + self.assertEqual(index, len(test_data), + "Didn't read the correct number of items") # Check I can directly access each item for value_name, value_data, value_type in test_data: read_val, read_typ = QueryValueEx(sub_key, value_name) - self.assertEquals(read_val, value_data, - "Could not directly read the value") - self.assertEquals(read_typ, value_type, - "Could not directly read the value") + self.assertEqual(read_val, value_data, + "Could not directly read the value") + self.assertEqual(read_typ, value_type, + "Could not directly read the value") sub_key.Close() # Enumerate our main key. read_val = EnumKey(key, 0) - self.assertEquals(read_val, subkeystr, "Read subkey value wrong") + self.assertEqual(read_val, subkeystr, "Read subkey value wrong") try: EnumKey(key, 1) self.fail("Was able to get a second key when I only have one!") @@ -159,8 +159,8 @@ DeleteValue(sub_key, value_name) nkeys, nvalues, since_mod = QueryInfoKey(sub_key) - self.assertEquals(nkeys, 0, "subkey not empty before delete") - self.assertEquals(nvalues, 0, "subkey not empty before delete") + self.assertEqual(nkeys, 0, "subkey not empty before delete") + self.assertEqual(nvalues, 0, "subkey not empty before delete") sub_key.Close() DeleteKey(key, subkeystr) @@ -341,8 +341,8 @@ with OpenKey(HKEY_LOCAL_MACHINE, "Software") as key: # HKLM\Software is redirected but not reflected in all OSes self.assertTrue(QueryReflectionKey(key)) - self.assertEquals(None, EnableReflectionKey(key)) - self.assertEquals(None, DisableReflectionKey(key)) + self.assertIsNone(EnableReflectionKey(key)) + self.assertIsNone(DisableReflectionKey(key)) self.assertTrue(QueryReflectionKey(key)) @unittest.skipUnless(HAS_REFLECTION, "OS doesn't support reflection") Modified: python/branches/pep-3151/Lib/test/test_winsound.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_winsound.py (original) +++ python/branches/pep-3151/Lib/test/test_winsound.py Sat Feb 26 08:16:32 2011 @@ -249,6 +249,7 @@ p = subprocess.Popen([cscript_path, check_script], stdout=subprocess.PIPE) __have_soundcard_cache = not p.wait() + p.stdout.close() return __have_soundcard_cache Modified: python/branches/pep-3151/Lib/test/test_with.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_with.py (original) +++ python/branches/pep-3151/Lib/test/test_with.py Sat Feb 26 08:16:32 2011 @@ -9,26 +9,26 @@ import sys import unittest from collections import deque -from contextlib import GeneratorContextManager, contextmanager +from contextlib import _GeneratorContextManager, contextmanager from test.support import run_unittest -class MockContextManager(GeneratorContextManager): +class MockContextManager(_GeneratorContextManager): def __init__(self, gen): - GeneratorContextManager.__init__(self, gen) + 
_GeneratorContextManager.__init__(self, gen) self.enter_called = False self.exit_called = False self.exit_args = None def __enter__(self): self.enter_called = True - return GeneratorContextManager.__enter__(self) + return _GeneratorContextManager.__enter__(self) def __exit__(self, type, value, traceback): self.exit_called = True self.exit_args = (type, value, traceback) - return GeneratorContextManager.__exit__(self, type, - value, traceback) + return _GeneratorContextManager.__exit__(self, type, + value, traceback) def mock_contextmanager(func): @@ -734,10 +734,10 @@ def testEnterReturnsTuple(self): with self.Dummy(value=(1,2)) as (a1, a2), \ self.Dummy(value=(10, 20)) as (b1, b2): - self.assertEquals(1, a1) - self.assertEquals(2, a2) - self.assertEquals(10, b1) - self.assertEquals(20, b2) + self.assertEqual(1, a1) + self.assertEqual(2, a2) + self.assertEqual(10, b1) + self.assertEqual(20, b2) def test_main(): run_unittest(FailureTestCase, NonexceptionalTestCase, Modified: python/branches/pep-3151/Lib/test/test_wsgiref.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_wsgiref.py (original) +++ python/branches/pep-3151/Lib/test/test_wsgiref.py Sat Feb 26 08:16:32 2011 @@ -342,6 +342,10 @@ self.checkReqURI("http://127.0.0.1/sp%C3%A4m", SCRIPT_NAME="/sp??m") self.checkReqURI("http://127.0.0.1/spammity/spam", SCRIPT_NAME="/spammity", PATH_INFO="/spam") + self.checkReqURI("http://127.0.0.1/spammity/spam;ham", + SCRIPT_NAME="/spammity", PATH_INFO="/spam;ham") + self.checkReqURI("http://127.0.0.1/spammity/spam;cookie=1234,5678", + SCRIPT_NAME="/spammity", PATH_INFO="/spam;cookie=1234,5678") self.checkReqURI("http://127.0.0.1/spammity/spam?say=ni", SCRIPT_NAME="/spammity", PATH_INFO="/spam",QUERY_STRING="say=ni") self.checkReqURI("http://127.0.0.1/spammity/spam", 0, @@ -516,6 +520,11 @@ s('200 OK',[]) return ['\u0442\u0435\u0441\u0442'.encode("utf-8")] + def trivial_app4(e,s): + # Simulate a response to a HEAD request + s('200 OK',[('Content-Length', '12345')]) + return [] + h = TestHandler() h.run(trivial_app1) self.assertEqual(h.stdout.getvalue(), @@ -539,10 +548,12 @@ b'\r\n' b'\xd1\x82\xd0\xb5\xd1\x81\xd1\x82') - - - - + h = TestHandler() + h.run(trivial_app4) + self.assertEqual(h.stdout.getvalue(), + b'Status: 200 OK\r\n' + b'Content-Length: 12345\r\n' + b'\r\n') def testBasicErrorOutput(self): Modified: python/branches/pep-3151/Lib/test/test_xml_etree.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_xml_etree.py (original) +++ python/branches/pep-3151/Lib/test/test_xml_etree.py Sat Feb 26 08:16:32 2011 @@ -22,7 +22,7 @@ SIMPLE_XMLFILE = findfile("simple.xml", subdir="xmltestdata") try: - SIMPLE_XMLFILE.encode("utf8") + SIMPLE_XMLFILE.encode("utf-8") except UnicodeEncodeError: raise unittest.SkipTest("filename is not encodable to utf8") SIMPLE_NS_XMLFILE = findfile("simple-ns.xml", subdir="xmltestdata") @@ -1255,8 +1255,8 @@ >>> ET.tostring(ET.PI('test', '')) b'?>' - >>> ET.tostring(ET.PI('test', '\xe3'), 'latin1') - b"\\n\\xe3?>" + >>> ET.tostring(ET.PI('test', '\xe3'), 'latin-1') + b"\\n\\xe3?>" """ # @@ -1841,6 +1841,15 @@ """ +def check_issue10777(): + """ + Registering a namespace twice caused a "dictionary changed size during + iteration" bug. 
+ + >>> ET.register_namespace('test10777', 'http://myuri/') + >>> ET.register_namespace('test10777', 'http://myuri/') + """ + # -------------------------------------------------------------------- Modified: python/branches/pep-3151/Lib/test/test_xml_etree_c.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_xml_etree_c.py (original) +++ python/branches/pep-3151/Lib/test/test_xml_etree_c.py Sat Feb 26 08:16:32 2011 @@ -1,6 +1,8 @@ # xml.etree test for cElementTree from test import support +from test.support import precisionbigmemtest, _2G +import unittest cET = support.import_module('xml.etree.cElementTree') @@ -8,19 +10,51 @@ # cElementTree specific tests def sanity(): - """ + r""" Import sanity. >>> from xml.etree import cElementTree + + Issue #6697. + + >>> e = cElementTree.Element('a') + >>> getattr(e, '\uD800') # doctest: +ELLIPSIS + Traceback (most recent call last): + ... + UnicodeEncodeError: ... + + >>> p = cElementTree.XMLParser() + >>> p.version.split()[0] + 'Expat' + >>> getattr(p, '\uD800') + Traceback (most recent call last): + ... + AttributeError: 'XMLParser' object has no attribute '\ud800' """ +class MiscTests(unittest.TestCase): + # Issue #8651. + @support.precisionbigmemtest(size=support._2G + 100, memuse=1) + def test_length_overflow(self, size): + if size < support._2G + 100: + self.skipTest("not enough free memory, need at least 2 GB") + data = b'x' * size + parser = cET.XMLParser() + try: + self.assertRaises(OverflowError, parser.feed, data) + finally: + data = None + + def test_main(): from test import test_xml_etree, test_xml_etree_c # Run the tests specific to the C implementation support.run_doctest(test_xml_etree_c, verbosity=True) + support.run_unittest(MiscTests) + # Assign the C implementation before running the doctests # Patch the __name__, to prevent confusion with the pure Python test pyET = test_xml_etree.ET Modified: python/branches/pep-3151/Lib/test/test_xmlrpc.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_xmlrpc.py (original) +++ python/branches/pep-3151/Lib/test/test_xmlrpc.py Sat Feb 26 08:16:32 2011 @@ -39,7 +39,7 @@ def test_dump_load(self): dump = xmlrpclib.dumps((alist,)) load = xmlrpclib.loads(dump) - self.assertEquals(alist, load[0][0]) + self.assertEqual(alist, load[0][0]) def test_dump_bare_datetime(self): # This checks that an unwrapped datetime.date object can be handled @@ -49,22 +49,22 @@ dt = datetime.datetime(2005, 2, 10, 11, 41, 23) s = xmlrpclib.dumps((dt,)) (newdt,), m = xmlrpclib.loads(s, use_datetime=1) - self.assertEquals(newdt, dt) - self.assertEquals(m, None) + self.assertEqual(newdt, dt) + self.assertEqual(m, None) (newdt,), m = xmlrpclib.loads(s, use_datetime=0) - self.assertEquals(newdt, xmlrpclib.DateTime('20050210T11:41:23')) + self.assertEqual(newdt, xmlrpclib.DateTime('20050210T11:41:23')) def test_datetime_before_1900(self): # same as before but with a date before 1900 dt = datetime.datetime(1, 2, 10, 11, 41, 23) s = xmlrpclib.dumps((dt,)) (newdt,), m = xmlrpclib.loads(s, use_datetime=1) - self.assertEquals(newdt, dt) - self.assertEquals(m, None) + self.assertEqual(newdt, dt) + self.assertEqual(m, None) (newdt,), m = xmlrpclib.loads(s, use_datetime=0) - self.assertEquals(newdt, xmlrpclib.DateTime('00010210T11:41:23')) + self.assertEqual(newdt, xmlrpclib.DateTime('00010210T11:41:23')) def test_cmp_datetime_DateTime(self): now = datetime.datetime.now() @@ -92,7 +92,7 @@ 
t.x = 100 t.y = "Hello" ((t2,), dummy) = xmlrpclib.loads(xmlrpclib.dumps((t,))) - self.assertEquals(t2, t.__dict__) + self.assertEqual(t2, t.__dict__) def test_dump_big_long(self): self.assertRaises(OverflowError, xmlrpclib.dumps, (2**99,)) @@ -138,18 +138,30 @@ value = alist + [None] arg1 = (alist + [None],) strg = xmlrpclib.dumps(arg1, allow_none=True) - self.assertEquals(value, + self.assertEqual(value, xmlrpclib.loads(strg)[0][0]) self.assertRaises(TypeError, xmlrpclib.dumps, (arg1,)) def test_get_host_info(self): # see bug #3613, this raised a TypeError transp = xmlrpc.client.Transport() - self.assertEquals(transp.get_host_info("user at host.tld"), + self.assertEqual(transp.get_host_info("user at host.tld"), ('host.tld', [('Authorization', 'Basic dXNlcg==')], {})) - + def test_ssl_presence(self): + try: + import ssl + except ImportError: + has_ssl = False + else: + has_ssl = True + try: + xmlrpc.client.ServerProxy('https://localhost:9999').bad_function() + except NotImplementedError: + self.assertFalse(has_ssl, "xmlrpc client's error with SSL support") + except socket.error: + self.assertTrue(has_ssl) class HelperTestCase(unittest.TestCase): def test_escape(self): @@ -167,8 +179,8 @@ f = xmlrpclib.Fault(42, 'Test Fault') s = xmlrpclib.dumps((f,)) (newf,), m = xmlrpclib.loads(s) - self.assertEquals(newf, {'faultCode': 42, 'faultString': 'Test Fault'}) - self.assertEquals(m, None) + self.assertEqual(newf, {'faultCode': 42, 'faultString': 'Test Fault'}) + self.assertEqual(m, None) s = xmlrpclib.Marshaller().dumps(f) self.assertRaises(xmlrpclib.Fault, xmlrpclib.loads, s) @@ -617,6 +629,7 @@ self.assertEqual(p.pow(6,8), 6**8) self.assertEqual(p.pow(6,8), 6**8) self.assertEqual(p.pow(6,8), 6**8) + p("close")() #they should have all been handled by a single request handler self.assertEqual(len(self.RequestHandler.myRequests), 1) @@ -625,6 +638,7 @@ #due to thread scheduling) self.assertGreaterEqual(len(self.RequestHandler.myRequests[-1]), 2) + #test special attribute access on the serverproxy, through the __call__ #function. class KeepaliveServerTestCase2(BaseKeepaliveServerTestCase): @@ -641,6 +655,7 @@ self.assertEqual(p.pow(6,8), 6**8) self.assertEqual(p.pow(6,8), 6**8) self.assertEqual(p.pow(6,8), 6**8) + p("close")() #they should have all been two request handlers, each having logged at least #two complete requests @@ -648,12 +663,14 @@ self.assertGreaterEqual(len(self.RequestHandler.myRequests[-1]), 2) self.assertGreaterEqual(len(self.RequestHandler.myRequests[-2]), 2) + def test_transport(self): p = xmlrpclib.ServerProxy(URL) #do some requests with close. self.assertEqual(p.pow(6,8), 6**8) p("transport").close() #same as above, really. 
self.assertEqual(p.pow(6,8), 6**8) + p("close")() self.assertEqual(len(self.RequestHandler.myRequests), 2) #A test case that verifies that gzip encoding works in both directions @@ -697,16 +714,18 @@ self.assertEqual(p.pow(6,8), 6**8) b = self.RequestHandler.content_length self.assertTrue(a>b) + p("close")() def test_bad_gzip_request(self): t = self.Transport() t.encode_threshold = None t.fake_gzip = True p = xmlrpclib.ServerProxy(URL, transport=t) - cm = self.assertRaisesRegexp(xmlrpclib.ProtocolError, - re.compile(r"\b400\b")) + cm = self.assertRaisesRegex(xmlrpclib.ProtocolError, + re.compile(r"\b400\b")) with cm: p.pow(6, 8) + p("close")() def test_gsip_response(self): t = self.Transport() @@ -717,6 +736,7 @@ a = t.response_length self.requestHandler.encode_threshold = 0 #always encode self.assertEqual(p.pow(6,8), 6**8) + p("close")() b = t.response_length self.requestHandler.encode_threshold = old self.assertTrue(a>b) @@ -906,7 +926,7 @@ content = handle[handle.find("=4GB are handled correctly. +class ChecksumBigBufferTestCase(unittest.TestCase): + + def setUp(self): + with open(support.TESTFN, "wb+") as f: + f.seek(_4G) + f.write(b"asdf") + with open(support.TESTFN, "rb") as f: + self.mapping = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) + + def tearDown(self): + self.mapping.close() + support.unlink(support.TESTFN) + + @unittest.skipUnless(mmap, "mmap() is not available.") + @unittest.skipUnless(sys.maxsize > _4G, "Can't run on a 32-bit system.") + @unittest.skipUnless(support.is_resource_enabled("largefile"), + "May use lots of disk space.") + def test_big_buffer(self): + self.assertEqual(zlib.crc32(self.mapping), 3058686908) + self.assertEqual(zlib.adler32(self.mapping), 82837919) + class ExceptionTestCase(unittest.TestCase): # make sure we generate some expected errors @@ -143,7 +171,7 @@ def test_incomplete_stream(self): # An useful error message is given x = zlib.compress(HAMLET_SCENE) - self.assertRaisesRegexp(zlib.error, + self.assertRaisesRegex(zlib.error, "Error -5 while decompressing data: incomplete or truncated stream", zlib.decompress, x[:-1]) @@ -158,6 +186,16 @@ def test_big_decompress_buffer(self, size): self.check_big_decompress_buffer(size, zlib.decompress) + @precisionbigmemtest(size=_4G + 100, memuse=1) + def test_length_overflow(self, size): + if size < _4G + 100: + self.skipTest("not enough free memory, need at least 4 GB") + data = b'x' * size + try: + self.assertRaises(OverflowError, zlib.compress, data, 1) + finally: + data = None + class CompressObjectTestCase(BaseCompressTestCase, unittest.TestCase): # Test compression object @@ -567,6 +605,7 @@ def test_main(): support.run_unittest( ChecksumTestCase, + ChecksumBigBufferTestCase, ExceptionTestCase, CompressTestCase, CompressObjectTestCase Modified: python/branches/pep-3151/Lib/test/win_console_handler.py ============================================================================== --- python/branches/pep-3151/Lib/test/win_console_handler.py (original) +++ python/branches/pep-3151/Lib/test/win_console_handler.py Sat Feb 26 08:16:32 2011 @@ -40,7 +40,7 @@ print("Unable to add SetConsoleCtrlHandler") exit(-1) - # Awaken mail process + # Awake main process m = mmap.mmap(-1, 1, sys.argv[1]) m[0] = 1 Modified: python/branches/pep-3151/Lib/threading.py ============================================================================== --- python/branches/pep-3151/Lib/threading.py (original) +++ python/branches/pep-3151/Lib/threading.py Sat Feb 26 08:16:32 2011 @@ -17,12 +17,11 @@ # with the multiprocessing 
module, which doesn't provide the old # Java inspired names. - -# Rename some stuff so "from threading import *" is safe __all__ = ['active_count', 'Condition', 'current_thread', 'enumerate', 'Event', - 'Lock', 'RLock', 'Semaphore', 'BoundedSemaphore', 'Thread', + 'Lock', 'RLock', 'Semaphore', 'BoundedSemaphore', 'Thread', 'Barrier', 'Timer', 'setprofile', 'settrace', 'local', 'stack_size'] +# Rename some stuff so "from threading import *" is safe _start_new_thread = _thread.start_new_thread _allocate_lock = _thread.allocate_lock _get_ident = _thread.get_ident @@ -36,10 +35,6 @@ # Debug support (adapted from ihooks.py). -# All the major classes here derive from _Verbose. We force that to -# be a new-style class so that all the major classes here are new-style. -# This helps debugging (type(instance) is more revealing for instances -# of new-style classes). _VERBOSE = False @@ -55,8 +50,14 @@ def _note(self, format, *args): if self._verbose: format = format % args - format = "%s: %s\n" % ( - current_thread().name, format) + # Issue #4188: calling current_thread() can incur an infinite + # recursion if it has to create a DummyThread on the fly. + ident = _get_ident() + try: + name = _active[ident].name + except KeyError: + name = "" % ident + format = "%s: %s\n" % (name, format) _sys.stderr.write(format) else: @@ -254,6 +255,32 @@ finally: self._acquire_restore(saved_state) + def wait_for(self, predicate, timeout=None): + endtime = None + waittime = timeout + result = predicate() + while not result: + if waittime is not None: + if endtime is None: + endtime = _time() + waittime + else: + waittime = endtime - _time() + if waittime <= 0: + if __debug__: + self._note("%s.wait_for(%r, %r): Timed out.", + self, predicate, timeout) + break + if __debug__: + self._note("%s.wait_for(%r, %r): Waiting with timeout=%s.", + self, predicate, timeout, waittime) + self.wait(waittime) + result = predicate() + else: + if __debug__: + self._note("%s.wait_for(%r, %r): Success.", + self, predicate, timeout) + return result + def notify(self, n=1): if not self._is_owned(): raise RuntimeError("cannot notify on un-acquired lock") @@ -363,6 +390,10 @@ self._cond = Condition(Lock()) self._flag = False + def _reset_internal_locks(self): + # private! called by Thread._reset_internal_locks by _after_fork() + self._cond.__init__() + def is_set(self): return self._flag @@ -482,13 +513,12 @@ # Wait in the barrier until we are relased. Raise an exception # if the barrier is reset or broken. def _wait(self, timeout): - while self._state == 0: - if self._cond.wait(timeout) is False: - #timed out. Break the barrier - self._break() - raise BrokenBarrierError - if self._state < 0: - raise BrokenBarrierError + if not self._cond.wait_for(lambda : self._state != 0, timeout): + #timed out. 
Break the barrier + self._break() + raise BrokenBarrierError + if self._state < 0: + raise BrokenBarrierError assert self._state == 1 # If we are the last thread to exit the barrier, signal any threads @@ -592,7 +622,7 @@ #XXX __exc_clear = _sys.exc_clear def __init__(self, group=None, target=None, name=None, - args=(), kwargs=None, verbose=None): + args=(), kwargs=None, verbose=None, *, daemon=None): assert group is None, "group argument must be None for now" _Verbose.__init__(self, verbose) if kwargs is None: @@ -601,7 +631,10 @@ self._name = str(name or _newname()) self._args = args self._kwargs = kwargs - self._daemonic = self._set_daemon() + if daemon is not None: + self._daemonic = daemon + else: + self._daemonic = current_thread().daemon self._ident = None self._started = Event() self._stopped = False @@ -611,9 +644,12 @@ # sys.exc_info since it can be changed between instances self._stderr = _sys.stderr - def _set_daemon(self): - # Overridden in _MainThread and _DummyThread - return current_thread().daemon + def _reset_internal_locks(self): + # private! Called by _after_fork() to reset our internal locks as + # they may be in an invalid state leading to a deadlock or crash. + if hasattr(self, '_block'): # DummyThread deletes _block + self._block.__init__() + self._started._reset_internal_locks() def __repr__(self): assert self._initialized, "Thread.__init__() was not called" @@ -911,15 +947,12 @@ class _MainThread(Thread): def __init__(self): - Thread.__init__(self, name="MainThread") + Thread.__init__(self, name="MainThread", daemon=False) self._started.set() self._set_ident() with _active_limbo_lock: _active[self._ident] = self - def _set_daemon(self): - return False - def _exitfunc(self): self._stop() t = _pickSomeNonDaemonThread() @@ -951,22 +984,18 @@ class _DummyThread(Thread): def __init__(self): - Thread.__init__(self, name=_newname("Dummy-%d")) + Thread.__init__(self, name=_newname("Dummy-%d"), daemon=True) - # Thread.__block consumes an OS-level locking primitive, which + # Thread._block consumes an OS-level locking primitive, which # can never be used by a _DummyThread. Since a _DummyThread # instance is immortal, that's bad, so release this resource. del self._block - self._started.set() self._set_ident() with _active_limbo_lock: _active[self._ident] = self - def _set_daemon(self): - return True - def join(self, timeout=None): assert False, "cannot join a dummy thread" @@ -1033,6 +1062,9 @@ # its new value since it can have changed. ident = _get_ident() thread._ident = ident + # Any condition variables hanging off of the active thread may + # be in an invalid state, so we reinitialize them. + thread._reset_internal_locks() new_active[ident] = thread else: # All the others are already stopped. Modified: python/branches/pep-3151/Lib/tkinter/__init__.py ============================================================================== --- python/branches/pep-3151/Lib/tkinter/__init__.py (original) +++ python/branches/pep-3151/Lib/tkinter/__init__.py Sat Feb 26 08:16:32 2011 @@ -216,8 +216,8 @@ self._master.deletecommand(cbname) def trace_vinfo(self): """Return all trace callback information.""" - return map(self._tk.split, self._tk.splitlist( - self._tk.call("trace", "vinfo", self._name))) + return [self._tk.split(x) for x in self._tk.splitlist( + self._tk.call("trace", "vinfo", self._name))] def __eq__(self, other): """Comparison for equality (==). 
@@ -855,7 +855,7 @@ includeids and 'includeids' or None)) if isinstance(data, str): data = [self.tk.split(data)] - return map(self.__winfo_parseitem, data) + return [self.__winfo_parseitem(x) for x in data] def __winfo_parseitem(self, t): """Internal function.""" return t[:1] + tuple(map(self.__winfo_getint, t[1:])) @@ -1200,8 +1200,8 @@ self.configure({key: value}) def keys(self): """Return a list of all resource names of this widget.""" - return map(lambda x: x[0][1:], - self.tk.split(self.tk.call(self._w, 'configure'))) + return [x[0][1:] for x in + self.tk.split(self.tk.call(self._w, 'configure'))] def __str__(self): """Return the window path name of this widget.""" return self._w @@ -1223,18 +1223,18 @@ def pack_slaves(self): """Return a list of all slaves of this widget in its packing order.""" - return map(self._nametowidget, - self.tk.splitlist( - self.tk.call('pack', 'slaves', self._w))) + return [self._nametowidget(x) for x in + self.tk.splitlist( + self.tk.call('pack', 'slaves', self._w))] slaves = pack_slaves # Place method that applies to the master def place_slaves(self): """Return a list of all slaves of this widget in its packing order.""" - return map(self._nametowidget, - self.tk.splitlist( + return [self._nametowidget(x) for x in + self.tk.splitlist( self.tk.call( - 'place', 'slaves', self._w))) + 'place', 'slaves', self._w))] # Grid methods that apply to the master def grid_bbox(self, column=None, row=None, col2=None, row2=None): """Return a tuple of integer coordinates for the bounding @@ -1338,9 +1338,9 @@ args = args + ('-row', row) if column is not None: args = args + ('-column', column) - return map(self._nametowidget, - self.tk.splitlist(self.tk.call( - ('grid', 'slaves', self._w) + args))) + return [self._nametowidget(x) for x in + self.tk.splitlist(self.tk.call( + ('grid', 'slaves', self._w) + args))] # Support for the "event" command, new in Tk 4.2. # By Case Roole. @@ -1494,7 +1494,7 @@ if len(wlist) > 1: wlist = (wlist,) # Tk needs a list of windows here args = ('wm', 'colormapwindows', self._w) + wlist - return map(self._nametowidget, self.tk.call(args)) + return [self._nametowidget(x) for x in self.tk.call(args)] colormapwindows = wm_colormapwindows def wm_command(self, value=None): """Store VALUE in WM_COMMAND property. It is the command @@ -2157,9 +2157,9 @@ def coords(self, *args): """Return a list of coordinates for the item given in ARGS.""" # XXX Should use _flatten on args - return map(getdouble, + return [getdouble(x) for x in self.tk.splitlist( - self.tk.call((self._w, 'coords') + args))) + self.tk.call((self._w, 'coords') + args))] def _create(self, itemType, args, kw): # Args: (val, val, ..., cnf={}) """Internal function.""" args = _flatten(args) Modified: python/branches/pep-3151/Lib/tkinter/scrolledtext.py ============================================================================== --- python/branches/pep-3151/Lib/tkinter/scrolledtext.py (original) +++ python/branches/pep-3151/Lib/tkinter/scrolledtext.py Sat Feb 26 08:16:32 2011 @@ -30,8 +30,8 @@ # Copy geometry methods of self.frame without overriding Text # methods -- hack! 
text_meths = vars(Text).keys() - methods = vars(Pack).keys() + vars(Grid).keys() + vars(Place).keys() - methods = set(methods).difference(text_meths) + methods = vars(Pack).keys() | vars(Grid).keys() | vars(Place).keys() + methods = methods.difference(text_meths) for m in methods: if m[0] != '_' and m != 'config' and m != 'configure': @@ -42,11 +42,10 @@ def example(): - import __main__ from tkinter.constants import END stext = ScrolledText(bg='white', height=10) - stext.insert(END, __main__.__doc__) + stext.insert(END, __doc__) stext.pack(fill=BOTH, side=LEFT, expand=True) stext.focus_set() stext.mainloop() Modified: python/branches/pep-3151/Lib/tkinter/test/test_tkinter/test_loadtk.py ============================================================================== --- python/branches/pep-3151/Lib/tkinter/test/test_tkinter/test_loadtk.py (original) +++ python/branches/pep-3151/Lib/tkinter/test/test_tkinter/test_loadtk.py Sat Feb 26 08:16:32 2011 @@ -31,7 +31,8 @@ # doesn't actually carry through to the process level # because they don't support unsetenv # If that's the case, abort. - display = os.popen('echo $DISPLAY').read().strip() + with os.popen('echo $DISPLAY') as pipe: + display = pipe.read().strip() if display: return Modified: python/branches/pep-3151/Lib/tkinter/test/test_ttk/test_widgets.py ============================================================================== --- python/branches/pep-3151/Lib/tkinter/test/test_ttk/test_widgets.py (original) +++ python/branches/pep-3151/Lib/tkinter/test/test_ttk/test_widgets.py Sat Feb 26 08:16:32 2011 @@ -1,5 +1,6 @@ import unittest import tkinter +import os from tkinter import ttk from test.support import requires, run_unittest @@ -925,7 +926,8 @@ self.assertRaises(tkinter.TclError, self.tv.heading, '#0', anchor=1) - + # XXX skipping for now; should be fixed to work with newer ttk + @unittest.skip def test_heading_callback(self): def simulate_heading_click(x, y): support.simulate_mouse_click(self.tv, x, y) Modified: python/branches/pep-3151/Lib/tkinter/tix.py ============================================================================== --- python/branches/pep-3151/Lib/tkinter/tix.py (original) +++ python/branches/pep-3151/Lib/tkinter/tix.py Sat Feb 26 08:16:32 2011 @@ -268,10 +268,10 @@ return self.tk.call('tixForm', 'info', self._w, option) def slaves(self): - return map(self._nametowidget, - self.tk.splitlist( + return [self._nametowidget(x) for x in + self.tk.splitlist( self.tk.call( - 'tixForm', 'slaves', self._w))) + 'tixForm', 'slaves', self._w))] Modified: python/branches/pep-3151/Lib/token.py ============================================================================== --- python/branches/pep-3151/Lib/token.py (original) +++ python/branches/pep-3151/Lib/token.py Sat Feb 26 08:16:32 2011 @@ -1,7 +1,7 @@ -#! /usr/bin/env python3 - """Token constants (from "token.h").""" +__all__ = ['tok_name', 'ISTERMINAL', 'ISNONTERMINAL', 'ISEOF'] + # This file is automatically generated; please don't muck it up! 
# # To update the symbols in this file, 'cd' to the top directory of @@ -68,12 +68,10 @@ NT_OFFSET = 256 #--end constants-- -tok_name = {} -for _name, _value in list(globals().items()): - if type(_value) is type(0): - tok_name[_value] = _name -del _name, _value - +tok_name = {value: name + for name, value in globals().items() + if isinstance(value, int)} +__all__.extend(tok_name.values()) def ISTERMINAL(x): return x < NT_OFFSET @@ -85,7 +83,7 @@ return x == ENDMARKER -def main(): +def _main(): import re import sys args = sys.argv[1:] @@ -139,4 +137,4 @@ if __name__ == "__main__": - main() + _main() Modified: python/branches/pep-3151/Lib/tokenize.py ============================================================================== --- python/branches/pep-3151/Lib/tokenize.py (original) +++ python/branches/pep-3151/Lib/tokenize.py Sat Feb 26 08:16:32 2011 @@ -24,6 +24,7 @@ __credits__ = ('GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, ' 'Skip Montanaro, Raymond Hettinger, Trent Nelson, ' 'Michael Foord') +import builtins import re import sys from token import * @@ -33,9 +34,8 @@ cookie_re = re.compile("coding[:=]\s*([-\w.]+)") import token -__all__ = [x for x in dir(token) if not x.startswith("_")] -__all__.extend(["COMMENT", "tokenize", "detect_encoding", "NL", "untokenize", - "ENCODING", "TokenInfo"]) +__all__ = token.__all__ + ["COMMENT", "tokenize", "detect_encoding", + "NL", "untokenize", "ENCODING", "TokenInfo"] del token COMMENT = N_TOKENS @@ -336,13 +336,11 @@ return default, [first, second] -_builtin_open = open - def open(filename): """Open a file in read only mode using the encoding detected by detect_encoding(). """ - buffer = _builtin_open(filename, 'rb') + buffer = builtins.open(filename, 'rb') encoding, lines = detect_encoding(buffer.readline) buffer.seek(0) text = TextIOWrapper(buffer, encoding, line_buffering=True) Modified: python/branches/pep-3151/Lib/trace.py ============================================================================== --- python/branches/pep-3151/Lib/trace.py (original) +++ python/branches/pep-3151/Lib/trace.py Sat Feb 26 08:16:32 2011 @@ -47,7 +47,7 @@ r = tracer.results() r.write_results(show_missing=True, coverdir="/tmp") """ - +__all__ = ['Trace', 'CoverageResults'] import io import linecache import os @@ -60,6 +60,7 @@ import gc import dis import pickle +from warnings import warn as _warn try: import threading @@ -77,7 +78,7 @@ sys.settrace(None) threading.settrace(None) -def usage(outfile): +def _usage(outfile): outfile.write("""Usage: %s [OPTIONS] [ARGS] Meta-options: @@ -127,7 +128,7 @@ # Simple rx to find lines with no code. 
rx_blank = re.compile(r'^\s*(#.*)?$') -class Ignore: +class _Ignore: def __init__(self, modules=None, dirs=None): self._mods = set() if not modules else set(modules) self._dirs = [] if not dirs else [os.path.normpath(d) @@ -177,14 +178,14 @@ self._ignore[modulename] = 0 return 0 -def modname(path): +def _modname(path): """Return a plausible module name for the patch.""" base = os.path.basename(path) filename, ext = os.path.splitext(base) return filename -def fullmodname(path): +def _fullmodname(path): """Return a plausible module name for the path.""" # If the file 'path' is part of a package, then the filename isn't @@ -311,17 +312,17 @@ if coverdir is None: dir = os.path.dirname(os.path.abspath(filename)) - modulename = modname(filename) + modulename = _modname(filename) else: dir = coverdir if not os.path.exists(dir): os.makedirs(dir) - modulename = fullmodname(filename) + modulename = _fullmodname(filename) # If desired, get a list of the line numbers which represent # executable content (returned as a dict for better lookup speed) if show_missing: - lnotab = find_executable_linenos(filename) + lnotab = _find_executable_linenos(filename) else: lnotab = {} @@ -384,7 +385,7 @@ return n_hits, n_lines -def find_lines_from_code(code, strs): +def _find_lines_from_code(code, strs): """Return dict where keys are lines in the line number table.""" linenos = {} @@ -394,19 +395,19 @@ return linenos -def find_lines(code, strs): +def _find_lines(code, strs): """Return lineno dict for all code objects reachable from code.""" # get all of the lineno information from the code of this scope level - linenos = find_lines_from_code(code, strs) + linenos = _find_lines_from_code(code, strs) # and check the constants for references to other code objects for c in code.co_consts: if inspect.iscode(c): # find another code object, so recurse into it - linenos.update(find_lines(c, strs)) + linenos.update(_find_lines(c, strs)) return linenos -def find_strings(filename, encoding=None): +def _find_strings(filename, encoding=None): """Return a dict of possible docstring positions. The dict maps line numbers to strings. 
There is an entry for @@ -429,7 +430,7 @@ prev_ttype = ttype return d -def find_executable_linenos(filename): +def _find_executable_linenos(filename): """Return dict where keys are line numbers in the line number table.""" try: with tokenize.open(filename) as f: @@ -440,8 +441,8 @@ % (filename, err)), file=sys.stderr) return {} code = compile(prog, filename, "exec") - strs = find_strings(filename, encoding) - return find_lines(code, strs) + strs = _find_strings(filename, encoding) + return _find_lines(code, strs) class Trace: def __init__(self, count=1, trace=1, countfuncs=0, countcallers=0, @@ -466,7 +467,7 @@ """ self.infile = infile self.outfile = outfile - self.ignore = Ignore(ignoremods, ignoredirs) + self.ignore = _Ignore(ignoremods, ignoredirs) self.counts = {} # keys are (filename, linenumber) self.pathtobasename = {} # for memoizing os.path.basename self.donothing = 0 @@ -525,7 +526,7 @@ code = frame.f_code filename = code.co_filename if filename: - modulename = modname(filename) + modulename = _modname(filename) else: modulename = None @@ -592,9 +593,9 @@ code = frame.f_code filename = frame.f_globals.get('__file__', None) if filename: - # XXX modname() doesn't work right for packages, so + # XXX _modname() doesn't work right for packages, so # the ignore support won't work right for packages - modulename = modname(filename) + modulename = _modname(filename) if modulename is not None: ignore_it = self.ignore.names(filename, modulename) if not ignore_it: @@ -810,5 +811,47 @@ if not no_report: results.write_results(missing, summary=summary, coverdir=coverdir) +# Deprecated API +def usage(outfile): + _warn("The trace.usage() function is deprecated", + DeprecationWarning, 2) + _usage(outfile) + +class Ignore(_Ignore): + def __init__(self, modules=None, dirs=None): + _warn("The class trace.Ignore is deprecated", + DeprecationWarning, 2) + _Ignore.__init__(self, modules, dirs) + +def modname(path): + _warn("The trace.modname() function is deprecated", + DeprecationWarning, 2) + return _modname(path) + +def fullmodname(path): + _warn("The trace.fullmodname() function is deprecated", + DeprecationWarning, 2) + return _fullmodname(path) + +def find_lines_from_code(code, strs): + _warn("The trace.find_lines_from_code() function is deprecated", + DeprecationWarning, 2) + return _find_lines_from_code(code, strs) + +def find_lines(code, strs): + _warn("The trace.find_lines() function is deprecated", + DeprecationWarning, 2) + return _find_lines(code, strs) + +def find_strings(filename, encoding=None): + _warn("The trace.find_strings() function is deprecated", + DeprecationWarning, 2) + return _find_strings(filename, encoding=None) + +def find_executable_linenos(filename): + _warn("The trace.find_executable_linenos() function is deprecated", + DeprecationWarning, 2) + return _find_executable_linenos(filename) + if __name__=='__main__': main() Modified: python/branches/pep-3151/Lib/turtle.py ============================================================================== --- python/branches/pep-3151/Lib/turtle.py (original) +++ python/branches/pep-3151/Lib/turtle.py Sat Feb 26 08:16:32 2011 @@ -752,7 +752,7 @@ [(0.0, 9.9999999999999982), (0.0, -9.9999999999999982), (9.9999999999999982, 0.0)] >>> """ - cl = list(self.cv.coords(item)) + cl = self.cv.coords(item) pl = [(cl[i], -cl[i+1]) for i in range(0, len(cl), 2)] return pl Modified: python/branches/pep-3151/Lib/turtledemo/about_turtledemo.txt ============================================================================== --- 
python/branches/pep-3151/Lib/turtledemo/about_turtledemo.txt (original) +++ python/branches/pep-3151/Lib/turtledemo/about_turtledemo.txt Sat Feb 26 08:16:32 2011 @@ -1,9 +1,9 @@ -------------------------------------- - About turtleDemo.py + About this viewer -------------------------------------- - Tiny demo Viewer to view turtle graphics example scripts. + Tiny demo viewer to view turtle graphics example scripts. Quickly and dirtyly assembled by Gregor Lingl. June, 2006 Modified: python/branches/pep-3151/Lib/turtledemo/demohelp.txt ============================================================================== --- python/branches/pep-3151/Lib/turtledemo/demohelp.txt (original) +++ python/branches/pep-3151/Lib/turtledemo/demohelp.txt Sat Feb 26 08:16:32 2011 @@ -53,12 +53,7 @@ (2) How to add your own demos to the demo repository - - scriptname: must begin with tdemo_ , - so it must have the form tdemo_.py - - - place: same directory as turtleDemo.py or some - subdirectory, the name of which must also begin with - tdemo_..... + - place: same directory as turtledemo/__main__.py - requirements on source code: code must contain a main() function which will Modified: python/branches/pep-3151/Lib/unittest/case.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/case.py (original) +++ python/branches/pep-3151/Lib/unittest/case.py Sat Feb 26 08:16:32 2011 @@ -6,10 +6,12 @@ import pprint import re import warnings +import collections from . import result from .util import (strclass, safe_repr, sorted_list_difference, - unorderable_list_difference) + unorderable_list_difference, _count_diff_all_purpose, + _count_diff_hashable) __unittest = True @@ -24,7 +26,6 @@ Usually you can use TestResult.skip() or one of the skipping decorators instead of raising this directly. """ - pass class _ExpectedFailure(Exception): """ @@ -41,7 +42,17 @@ """ The test was supposed to fail, but it didn't! 
""" - pass + + +class _Outcome(object): + def __init__(self): + self.success = True + self.skipped = None + self.unexpectedSuccess = None + self.expectedFailure = None + self.errors = [] + self.failures = [] + def _id(obj): return obj @@ -93,7 +104,7 @@ class _AssertRaisesBaseContext(object): def __init__(self, expected, test_case, callable_obj=None, - expected_regexp=None): + expected_regex=None): self.expected = expected self.failureException = test_case.failureException if callable_obj is not None: @@ -103,9 +114,9 @@ self.obj_name = str(callable_obj) else: self.obj_name = None - if isinstance(expected_regexp, (bytes, str)): - expected_regexp = re.compile(expected_regexp) - self.expected_regexp = expected_regexp + if isinstance(expected_regex, (bytes, str)): + expected_regex = re.compile(expected_regex) + self.expected_regex = expected_regex class _AssertRaisesContext(_AssertRaisesBaseContext): @@ -131,13 +142,13 @@ return False # store exception, without traceback, for later retrieval self.exception = exc_value.with_traceback(None) - if self.expected_regexp is None: + if self.expected_regex is None: return True - expected_regexp = self.expected_regexp - if not expected_regexp.search(str(exc_value)): + expected_regex = self.expected_regex + if not expected_regex.search(str(exc_value)): raise self.failureException('"%s" does not match "%s"' % - (expected_regexp.pattern, str(exc_value))) + (expected_regex.pattern, str(exc_value))) return True @@ -171,8 +182,8 @@ continue if first_matching is None: first_matching = w - if (self.expected_regexp is not None and - not self.expected_regexp.search(str(w))): + if (self.expected_regex is not None and + not self.expected_regex.search(str(w))): continue # store warning for later retrieval self.warning = w @@ -182,7 +193,7 @@ # Now we simply try to choose a helpful failure message if first_matching is not None: raise self.failureException('"%s" does not match "%s"' % - (self.expected_regexp.pattern, str(first_matching))) + (self.expected_regex.pattern, str(first_matching))) if self.obj_name: raise self.failureException("{0} not triggered by {1}" .format(exc_name, self.obj_name)) @@ -191,6 +202,27 @@ .format(exc_name)) +class _TypeEqualityDict(object): + + def __init__(self, testcase): + self.testcase = testcase + self._store = {} + + def __setitem__(self, key, value): + self._store[key] = value + + def __getitem__(self, key): + value = self._store[key] + if isinstance(value, str): + return getattr(self.testcase, value) + return value + + def get(self, key, default=None): + if key in self._store: + return self[key] + return default + + class TestCase(object): """A class whose instances are single test cases. @@ -223,7 +255,7 @@ # objects used in assert methods) will be printed on failure in *addition* # to any explicit message passed. - longMessage = False + longMessage = True # This attribute sets the maximum length of a diff in failure messages # by assert methods using difflib. It is looked up as an instance attribute @@ -241,25 +273,30 @@ not have a method with the specified name. 
""" self._testMethodName = methodName - self._resultForDoCleanups = None + self._outcomeForDoCleanups = None + self._testMethodDoc = 'No test' try: testMethod = getattr(self, methodName) except AttributeError: - raise ValueError("no such test method in %s: %s" % - (self.__class__, methodName)) - self._testMethodDoc = testMethod.__doc__ + if methodName != 'runTest': + # we allow instantiation with no explicit method name + # but not an *incorrect* or missing method name + raise ValueError("no such test method in %s: %s" % + (self.__class__, methodName)) + else: + self._testMethodDoc = testMethod.__doc__ self._cleanups = [] # Map types to custom assertEqual functions that will compare # instances of said type in more detail to generate a more useful # error message. - self._type_equality_funcs = {} - self.addTypeEqualityFunc(dict, self.assertDictEqual) - self.addTypeEqualityFunc(list, self.assertListEqual) - self.addTypeEqualityFunc(tuple, self.assertTupleEqual) - self.addTypeEqualityFunc(set, self.assertSetEqual) - self.addTypeEqualityFunc(frozenset, self.assertSetEqual) - self.addTypeEqualityFunc(str, self.assertMultiLineEqual) + self._type_equality_funcs = _TypeEqualityDict(self) + self.addTypeEqualityFunc(dict, 'assertDictEqual') + self.addTypeEqualityFunc(list, 'assertListEqual') + self.addTypeEqualityFunc(tuple, 'assertTupleEqual') + self.addTypeEqualityFunc(set, 'assertSetEqual') + self.addTypeEqualityFunc(frozenset, 'assertSetEqual') + self.addTypeEqualityFunc(str, 'assertMultiLineEqual') def addTypeEqualityFunc(self, typeobj, function): """Add a type specific assertEqual style function to compare a type. @@ -345,6 +382,36 @@ RuntimeWarning, 2) result.addSuccess(self) + def _executeTestPart(self, function, outcome, isTest=False): + try: + function() + except KeyboardInterrupt: + raise + except SkipTest as e: + outcome.success = False + outcome.skipped = str(e) + except _UnexpectedSuccess: + exc_info = sys.exc_info() + outcome.success = False + if isTest: + outcome.unexpectedSuccess = exc_info + else: + outcome.errors.append(exc_info) + except _ExpectedFailure: + outcome.success = False + exc_info = sys.exc_info() + if isTest: + outcome.expectedFailure = exc_info + else: + outcome.errors.append(exc_info) + except self.failureException: + outcome.success = False + outcome.failures.append(sys.exc_info()) + exc_info = sys.exc_info() + except: + outcome.success = False + outcome.errors.append(sys.exc_info()) + def run(self, result=None): orig_result = result if result is None: @@ -353,7 +420,6 @@ if startTestRun is not None: startTestRun() - self._resultForDoCleanups = result result.startTest(self) testMethod = getattr(self, self._testMethodName) @@ -368,51 +434,42 @@ result.stopTest(self) return try: - success = False - try: - self.setUp() - except SkipTest as e: - self._addSkip(result, str(e)) - except Exception: - result.addError(self, sys.exc_info()) + outcome = _Outcome() + self._outcomeForDoCleanups = outcome + + self._executeTestPart(self.setUp, outcome) + if outcome.success: + self._executeTestPart(testMethod, outcome, isTest=True) + self._executeTestPart(self.tearDown, outcome) + + self.doCleanups() + if outcome.success: + result.addSuccess(self) else: - try: - testMethod() - except self.failureException: - result.addFailure(self, sys.exc_info()) - except _ExpectedFailure as e: - addExpectedFailure = getattr(result, 'addExpectedFailure', None) - if addExpectedFailure is not None: - addExpectedFailure(self, e.exc_info) - else: - warnings.warn("TestResult has no addExpectedFailure 
method, reporting as passes", - RuntimeWarning) - result.addSuccess(self) - except _UnexpectedSuccess: + if outcome.skipped is not None: + self._addSkip(result, outcome.skipped) + for exc_info in outcome.errors: + result.addError(self, exc_info) + for exc_info in outcome.failures: + result.addFailure(self, exc_info) + if outcome.unexpectedSuccess is not None: addUnexpectedSuccess = getattr(result, 'addUnexpectedSuccess', None) if addUnexpectedSuccess is not None: addUnexpectedSuccess(self) else: warnings.warn("TestResult has no addUnexpectedSuccess method, reporting as failures", RuntimeWarning) - result.addFailure(self, sys.exc_info()) - except SkipTest as e: - self._addSkip(result, str(e)) - except Exception: - result.addError(self, sys.exc_info()) - else: - success = True + result.addFailure(self, outcome.unexpectedSuccess) + + if outcome.expectedFailure is not None: + addExpectedFailure = getattr(result, 'addExpectedFailure', None) + if addExpectedFailure is not None: + addExpectedFailure(self, outcome.expectedFailure) + else: + warnings.warn("TestResult has no addExpectedFailure method, reporting as passes", + RuntimeWarning) + result.addSuccess(self) - try: - self.tearDown() - except Exception: - result.addError(self, sys.exc_info()) - success = False - - cleanUpSuccess = self.doCleanups() - success = success and cleanUpSuccess - if success: - result.addSuccess(self) finally: result.stopTest(self) if orig_result is None: @@ -423,16 +480,15 @@ def doCleanups(self): """Execute all cleanup functions. Normally called for you after tearDown.""" - result = self._resultForDoCleanups - ok = True + outcome = self._outcomeForDoCleanups or _Outcome() while self._cleanups: - function, args, kwargs = self._cleanups.pop(-1) - try: - function(*args, **kwargs) - except Exception: - ok = False - result.addError(self, sys.exc_info()) - return ok + function, args, kwargs = self._cleanups.pop() + part = lambda: function(*args, **kwargs) + self._executeTestPart(part, outcome) + + # return this for backwards compatibility + # even though we no longer us it internally + return outcome.success def __call__(self, *args, **kwds): return self.run(*args, **kwds) @@ -455,15 +511,15 @@ raise self.failureException(msg) def assertFalse(self, expr, msg=None): - "Fail the test if the expression is true." + """Check that the expression is false.""" if expr: - msg = self._formatMessage(msg, "%s is not False" % safe_repr(expr)) + msg = self._formatMessage(msg, "%s is not false" % safe_repr(expr)) raise self.failureException(msg) def assertTrue(self, expr, msg=None): - """Fail the test unless the expression is true.""" + """Check that the expression is true.""" if not expr: - msg = self._formatMessage(msg, "%s is not True" % safe_repr(expr)) + msg = self._formatMessage(msg, "%s is not true" % safe_repr(expr)) raise self.failureException(msg) def _formatMessage(self, msg, standardMsg): @@ -666,34 +722,6 @@ msg = self._formatMessage(msg, standardMsg) raise self.failureException(msg) - # Synonyms for assertion methods - - # The plurals are undocumented. Keep them that way to discourage use. - # Do not add more. Do not remove. - # Going through a deprecation cycle on these would annoy many people. 
- assertEquals = assertEqual - assertNotEquals = assertNotEqual - assertAlmostEquals = assertAlmostEqual - assertNotAlmostEquals = assertNotAlmostEqual - assert_ = assertTrue - - # These fail* assertion method names are pending deprecation and will - # be a DeprecationWarning in 3.2; http://bugs.python.org/issue2578 - def _deprecate(original_func): - def deprecated_func(*args, **kwargs): - warnings.warn( - 'Please use {0} instead.'.format(original_func.__name__), - DeprecationWarning, 2) - return original_func(*args, **kwargs) - return deprecated_func - - failUnlessEqual = _deprecate(assertEqual) - failIfEqual = _deprecate(assertNotEqual) - failUnlessAlmostEqual = _deprecate(assertAlmostEqual) - failIfAlmostEqual = _deprecate(assertNotAlmostEqual) - failUnless = _deprecate(assertTrue) - failUnlessRaises = _deprecate(assertRaises) - failIf = _deprecate(assertFalse) def assertSequenceEqual(self, seq1, seq2, msg=None, seq_type=None): """An equality assertion for ordered sequences (like lists and tuples). @@ -899,8 +927,8 @@ self.fail(self._formatMessage(msg, standardMsg)) def assertDictEqual(self, d1, d2, msg=None): - self.assert_(isinstance(d1, dict), 'First argument is not a dictionary') - self.assert_(isinstance(d2, dict), 'Second argument is not a dictionary') + self.assertIsInstance(d1, dict, 'First argument is not a dictionary') + self.assertIsInstance(d2, dict, 'Second argument is not a dictionary') if d1 != d2: standardMsg = '%s != %s' % (safe_repr(d1, True), safe_repr(d2, True)) @@ -910,118 +938,43 @@ standardMsg = self._truncateMessage(standardMsg, diff) self.fail(self._formatMessage(msg, standardMsg)) - def assertDictContainsSubset(self, expected, actual, msg=None): - """Checks whether actual is a superset of expected.""" - missing = [] - mismatched = [] - for key, value in expected.items(): - if key not in actual: - missing.append(key) - elif value != actual[key]: - mismatched.append('%s, expected: %s, actual: %s' % - (safe_repr(key), safe_repr(value), - safe_repr(actual[key]))) - - if not (missing or mismatched): - return - - standardMsg = '' - if missing: - standardMsg = 'Missing: %s' % ','.join(safe_repr(m) for m in - missing) - if mismatched: - if standardMsg: - standardMsg += '; ' - standardMsg += 'Mismatched values: %s' % ','.join(mismatched) - - self.fail(self._formatMessage(msg, standardMsg)) - - def assertSameElements(self, expected_seq, actual_seq, msg=None): - """An unordered sequence specific comparison. - - Raises with an error message listing which elements of expected_seq - are missing from actual_seq and vice versa if any. - - Duplicate elements are ignored when comparing *expected_seq* and - *actual_seq*. It is the equivalent of ``assertEqual(set(expected), - set(actual))`` but it works with sequences of unhashable objects as - well. - """ - warnings.warn('assertSameElements is deprecated', - DeprecationWarning) - try: - expected = set(expected_seq) - actual = set(actual_seq) - missing = sorted(expected.difference(actual)) - unexpected = sorted(actual.difference(expected)) - except TypeError: - # Fall back to slower list-compare if any of the objects are - # not hashable. 
- expected = list(expected_seq) - actual = list(actual_seq) - try: - expected.sort() - actual.sort() - except TypeError: - missing, unexpected = unorderable_list_difference(expected, - actual) - else: - missing, unexpected = sorted_list_difference(expected, actual) - errors = [] - if missing: - errors.append('Expected, but missing:\n %s' % - safe_repr(missing)) - if unexpected: - errors.append('Unexpected, but present:\n %s' % - safe_repr(unexpected)) - if errors: - standardMsg = '\n'.join(errors) - self.fail(self._formatMessage(msg, standardMsg)) - + def assertCountEqual(self, first, second, msg=None): + """An unordered sequence comparison asserting that the same elements, + regardless of order. If the same element occurs more than once, + it verifies that the elements occur the same number of times. - def assertItemsEqual(self, expected_seq, actual_seq, msg=None): - """An unordered sequence / set specific comparison. It asserts that - expected_seq and actual_seq contain the same elements. It is - the equivalent of:: + self.assertEqual(Counter(list(first)), + Counter(list(second))) - self.assertEqual(sorted(expected_seq), sorted(actual_seq)) - - Raises with an error message listing which elements of expected_seq - are missing from actual_seq and vice versa if any. - - Asserts that each element has the same count in both sequences. - Example: + Example: - [0, 1, 1] and [1, 0, 1] compare equal. - [0, 0, 1] and [0, 1] compare unequal. + """ + first_seq, second_seq = list(first), list(second) try: - expected = sorted(expected_seq) - actual = sorted(actual_seq) + first = collections.Counter(first_seq) + second = collections.Counter(second_seq) except TypeError: - # Unsortable items (example: set(), complex(), ...) - expected = list(expected_seq) - actual = list(actual_seq) - missing, unexpected = unorderable_list_difference(expected, actual) + # Handle case with unhashable elements + differences = _count_diff_all_purpose(first_seq, second_seq) else: - return self.assertSequenceEqual(expected, actual, msg=msg) + if first == second: + return + differences = _count_diff_hashable(first_seq, second_seq) - errors = [] - if missing: - errors.append('Expected, but missing:\n %s' % - safe_repr(missing)) - if unexpected: - errors.append('Unexpected, but present:\n %s' % - safe_repr(unexpected)) - if errors: - standardMsg = '\n'.join(errors) - self.fail(self._formatMessage(msg, standardMsg)) + if differences: + standardMsg = 'Element counts were not equal:\n' + lines = ['First has %d, Second has %d: %r' % diff for diff in differences] + diffMsg = '\n'.join(lines) + standardMsg = self._truncateMessage(standardMsg, diffMsg) + msg = self._formatMessage(msg, standardMsg) + self.fail(msg) def assertMultiLineEqual(self, first, second, msg=None): """Assert that two multi-line strings are equal.""" - self.assert_(isinstance(first, str), ( - 'First argument is not a string')) - self.assert_(isinstance(second, str), ( - 'Second argument is not a string')) + self.assertIsInstance(first, str, 'First argument is not a string') + self.assertIsInstance(second, str, 'Second argument is not a string') if first != second: firstlines = first.splitlines(True) @@ -1084,27 +1037,27 @@ standardMsg = '%s is an instance of %r' % (safe_repr(obj), cls) self.fail(self._formatMessage(msg, standardMsg)) - def assertRaisesRegexp(self, expected_exception, expected_regexp, - callable_obj=None, *args, **kwargs): - """Asserts that the message in a raised exception matches a regexp. 
+ def assertRaisesRegex(self, expected_exception, expected_regex, + callable_obj=None, *args, **kwargs): + """Asserts that the message in a raised exception matches a regex. Args: expected_exception: Exception class expected to be raised. - expected_regexp: Regexp (re pattern object or string) expected + expected_regex: Regex (re pattern object or string) expected to be found in error message. callable_obj: Function to be called. args: Extra args. kwargs: Extra kwargs. """ context = _AssertRaisesContext(expected_exception, self, callable_obj, - expected_regexp) + expected_regex) if callable_obj is None: return context with context: callable_obj(*args, **kwargs) - def assertWarnsRegexp(self, expected_warning, expected_regexp, - callable_obj=None, *args, **kwargs): + def assertWarnsRegex(self, expected_warning, expected_regex, + callable_obj=None, *args, **kwargs): """Asserts that the message in a triggered warning matches a regexp. Basic functioning is similar to assertWarns() with the addition that only warnings whose messages also match the regular expression @@ -1112,42 +1065,63 @@ Args: expected_warning: Warning class expected to be triggered. - expected_regexp: Regexp (re pattern object or string) expected + expected_regex: Regex (re pattern object or string) expected to be found in error message. callable_obj: Function to be called. args: Extra args. kwargs: Extra kwargs. """ context = _AssertWarnsContext(expected_warning, self, callable_obj, - expected_regexp) + expected_regex) if callable_obj is None: return context with context: callable_obj(*args, **kwargs) - def assertRegexpMatches(self, text, expected_regexp, msg=None): + def assertRegex(self, text, expected_regex, msg=None): """Fail the test unless the text matches the regular expression.""" - if isinstance(expected_regexp, (str, bytes)): - expected_regexp = re.compile(expected_regexp) - if not expected_regexp.search(text): - msg = msg or "Regexp didn't match" - msg = '%s: %r not found in %r' % (msg, expected_regexp.pattern, text) + if isinstance(expected_regex, (str, bytes)): + assert expected_regex, "expected_regex must not be empty." + expected_regex = re.compile(expected_regex) + if not expected_regex.search(text): + msg = msg or "Regex didn't match" + msg = '%s: %r not found in %r' % (msg, expected_regex.pattern, text) raise self.failureException(msg) - def assertNotRegexpMatches(self, text, unexpected_regexp, msg=None): + def assertNotRegex(self, text, unexpected_regex, msg=None): """Fail the test if the text matches the regular expression.""" - if isinstance(unexpected_regexp, (str, bytes)): - unexpected_regexp = re.compile(unexpected_regexp) - match = unexpected_regexp.search(text) + if isinstance(unexpected_regex, (str, bytes)): + unexpected_regex = re.compile(unexpected_regex) + match = unexpected_regex.search(text) if match: - msg = msg or "Regexp matched" + msg = msg or "Regex matched" msg = '%s: %r matches %r in %r' % (msg, text[match.start():match.end()], - unexpected_regexp.pattern, + unexpected_regex.pattern, text) raise self.failureException(msg) + def _deprecate(original_func): + def deprecated_func(*args, **kwargs): + warnings.warn( + 'Please use {0} instead.'.format(original_func.__name__), + DeprecationWarning, 2) + return original_func(*args, **kwargs) + return deprecated_func + + # The fail* methods can be removed in 3.3, the 5 assert* methods will + # have to stay around for a few more versions. See #9424. 
+ assertEquals = _deprecate(assertEqual) + assertNotEquals = _deprecate(assertNotEqual) + assertAlmostEquals = _deprecate(assertAlmostEqual) + assertNotAlmostEquals = _deprecate(assertNotAlmostEqual) + assert_ = _deprecate(assertTrue) + assertRaisesRegexp = _deprecate(assertRaisesRegex) + assertRegexpMatches = _deprecate(assertRegex) + + + class FunctionTestCase(TestCase): """A test case that wraps a test function. Modified: python/branches/pep-3151/Lib/unittest/loader.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/loader.py (original) +++ python/branches/pep-3151/Lib/unittest/loader.py Sat Feb 26 08:16:32 2011 @@ -147,9 +147,9 @@ def discover(self, start_dir, pattern='test*.py', top_level_dir=None): """Find and return all test modules from the specified start - directory, recursing into subdirectories to find them. Only test files - that match the pattern will be loaded. (Using shell style pattern - matching.) + directory, recursing into subdirectories to find them and return all + tests found within them. Only test files that match the pattern will + be loaded. (Using shell style pattern matching.) All test modules must be importable from the top level of the project. If the start directory is not the top level directory then the top Modified: python/branches/pep-3151/Lib/unittest/main.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/main.py (original) +++ python/branches/pep-3151/Lib/unittest/main.py Sat Feb 26 08:16:32 2011 @@ -58,7 +58,24 @@ in MyTestCase """ +def _convert_name(name): + # on Linux / Mac OS X 'foo.PY' is not importable, but on + # Windows it is. Simpler to do a case insensitive match + # a better check would be to check that the name is a + # valid Python module name. + if os.path.isfile(name) and name.lower().endswith('.py'): + if os.path.isabs(name): + rel_path = os.path.relpath(name, os.getcwd()) + if os.path.isabs(rel_path) or rel_path.startswith(os.pardir): + return name + name = rel_path + # on Windows both '\' and '/' are used as path + # separators. Better to replace both than rely on os.path.sep + return name[:-3].replace('\\', '.').replace('/', '.') + return name +def _convert_names(names): + return [_convert_name(name) for name in names] class TestProgram(object): """A command-line program that runs a set of tests; this is primarily @@ -67,12 +84,12 @@ USAGE = USAGE_FROM_MODULE # defaults for testing - failfast = catchbreak = buffer = progName = None + failfast = catchbreak = buffer = progName = warnings = None def __init__(self, module='__main__', defaultTest=None, argv=None, testRunner=None, testLoader=loader.defaultTestLoader, exit=True, verbosity=1, failfast=None, catchbreak=None, - buffer=None): + buffer=None, warnings=None): if isinstance(module, str): self.module = __import__(module) for part in module.split('.')[1:]: @@ -87,6 +104,18 @@ self.catchbreak = catchbreak self.verbosity = verbosity self.buffer = buffer + if warnings is None and not sys.warnoptions: + # even if DreprecationWarnings are ignored by default + # print them anyway unless other warnings settings are + # specified by the warnings arg or the -W python flag + self.warnings = 'default' + else: + # here self.warnings is set either to the value passed + # to the warnings args or to None. + # If the user didn't pass a value self.warnings will + # be None. 
This means that the behavior is unchanged + # and depends on the values passed to -W. + self.warnings = warnings self.defaultTest = defaultTest self.testRunner = testRunner self.testLoader = testLoader @@ -109,7 +138,8 @@ sys.exit(2) def parseArgs(self, argv): - if len(argv) > 1 and argv[1].lower() == 'discover': + if ((len(argv) > 1 and argv[1].lower() == 'discover') or + (len(argv) == 1 and self.module is None)): self._do_discovery(argv[2:]) return @@ -117,38 +147,48 @@ long_opts = ['help', 'verbose', 'quiet', 'failfast', 'catch', 'buffer'] try: options, args = getopt.getopt(argv[1:], 'hHvqfcb', long_opts) - for opt, value in options: - if opt in ('-h','-H','--help'): - self.usageExit() - if opt in ('-q','--quiet'): - self.verbosity = 0 - if opt in ('-v','--verbose'): - self.verbosity = 2 - if opt in ('-f','--failfast'): - if self.failfast is None: - self.failfast = True - # Should this raise an exception if -f is not valid? - if opt in ('-c','--catch'): - if self.catchbreak is None: - self.catchbreak = True - # Should this raise an exception if -c is not valid? - if opt in ('-b','--buffer'): - if self.buffer is None: - self.buffer = True - # Should this raise an exception if -b is not valid? - if len(args) == 0 and self.defaultTest is None: - # createTests will load tests from self.module - self.testNames = None - elif len(args) > 0: - self.testNames = args - if __name__ == '__main__': - # to support python -m unittest ... - self.module = None - else: - self.testNames = (self.defaultTest,) - self.createTests() except getopt.error as msg: self.usageExit(msg) + return + + for opt, value in options: + if opt in ('-h','-H','--help'): + self.usageExit() + if opt in ('-q','--quiet'): + self.verbosity = 0 + if opt in ('-v','--verbose'): + self.verbosity = 2 + if opt in ('-f','--failfast'): + if self.failfast is None: + self.failfast = True + # Should this raise an exception if -f is not valid? + if opt in ('-c','--catch'): + if self.catchbreak is None: + self.catchbreak = True + # Should this raise an exception if -c is not valid? + if opt in ('-b','--buffer'): + if self.buffer is None: + self.buffer = True + # Should this raise an exception if -b is not valid? + + if len(args) == 0 and self.module is None: + # this allows "python -m unittest -v" to still work for + # test discovery. This means -c / -b / -v / -f options will + # be handled twice, which is harmless but not ideal. + self._do_discovery(argv[1:]) + return + + if len(args) == 0 and self.defaultTest is None: + # createTests will load tests from self.module + self.testNames = None + elif len(args) > 0: + self.testNames = _convert_names(args) + if __name__ == '__main__': + # to support python -m unittest ... + self.module = None + else: + self.testNames = (self.defaultTest,) + self.createTests() def createTests(self): if self.testNames is None: @@ -219,7 +259,8 @@ try: testRunner = self.testRunner(verbosity=self.verbosity, failfast=self.failfast, - buffer=self.buffer) + buffer=self.buffer, + warnings=self.warnings) except TypeError: # didn't accept the verbosity, buffer or failfast arguments testRunner = self.testRunner() Modified: python/branches/pep-3151/Lib/unittest/runner.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/runner.py (original) +++ python/branches/pep-3151/Lib/unittest/runner.py Sat Feb 26 08:16:32 2011 @@ -2,6 +2,7 @@ import sys import time +import warnings from . 
import result from .signals import registerResult @@ -124,13 +125,16 @@ """ resultclass = TextTestResult - def __init__(self, stream=sys.stderr, descriptions=True, verbosity=1, - failfast=False, buffer=False, resultclass=None): + def __init__(self, stream=None, descriptions=True, verbosity=1, + failfast=False, buffer=False, resultclass=None, warnings=None): + if stream is None: + stream = sys.stderr self.stream = _WritelnDecorator(stream) self.descriptions = descriptions self.verbosity = verbosity self.failfast = failfast self.buffer = buffer + self.warnings = warnings if resultclass is not None: self.resultclass = resultclass @@ -143,17 +147,30 @@ registerResult(result) result.failfast = self.failfast result.buffer = self.buffer - startTime = time.time() - startTestRun = getattr(result, 'startTestRun', None) - if startTestRun is not None: - startTestRun() - try: - test(result) - finally: - stopTestRun = getattr(result, 'stopTestRun', None) - if stopTestRun is not None: - stopTestRun() - stopTime = time.time() + with warnings.catch_warnings(): + if self.warnings: + # if self.warnings is set, use it to filter all the warnings + warnings.simplefilter(self.warnings) + # if the filter is 'default' or 'always', special-case the + # warnings from the deprecated unittest methods to show them + # no more than once per module, because they can be fairly + # noisy. The -Wd and -Wa flags can be used to bypass this + # only when self.warnings is None. + if self.warnings in ['default', 'always']: + warnings.filterwarnings('module', + category=DeprecationWarning, + message='Please use assert\w+ instead.') + startTime = time.time() + startTestRun = getattr(result, 'startTestRun', None) + if startTestRun is not None: + startTestRun() + try: + test(result) + finally: + stopTestRun = getattr(result, 'stopTestRun', None) + if stopTestRun is not None: + stopTestRun() + stopTime = time.time() timeTaken = stopTime - startTime result.printErrors() if hasattr(result, 'separator2'): @@ -168,9 +185,10 @@ results = map(len, (result.expectedFailures, result.unexpectedSuccesses, result.skipped)) - expectedFails, unexpectedSuccesses, skipped = results except AttributeError: pass + else: + expectedFails, unexpectedSuccesses, skipped = results infos = [] if not result.wasSuccessful(): Modified: python/branches/pep-3151/Lib/unittest/suite.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/suite.py (original) +++ python/branches/pep-3151/Lib/unittest/suite.py Sat Feb 26 08:16:32 2011 @@ -104,6 +104,7 @@ if topLevel: self._tearDownPreviousClass(None, result) self._handleModuleTearDown(result) + result._testRunEntered = False return result def debug(self): Modified: python/branches/pep-3151/Lib/unittest/test/test_assertions.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_assertions.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_assertions.py Sat Feb 26 08:16:32 2011 @@ -1,5 +1,5 @@ import datetime - +import warnings import unittest @@ -92,15 +92,15 @@ else: self.fail("assertRaises() didn't let exception pass through") - def testAssertNotRegexpMatches(self): - self.assertNotRegexpMatches('Ala ma kota', r'r+') + def testAssertNotRegex(self): + self.assertNotRegex('Ala ma kota', r'r+') try: - self.assertNotRegexpMatches('Ala ma kota', r'k.t', 'Message') + self.assertNotRegex('Ala ma kota', r'k.t', 'Message') except self.failureException as e: 
self.assertIn("'kot'", e.args[0]) self.assertIn('Message', e.args[0]) else: - self.fail('assertNotRegexpMatches should have failed.') + self.fail('assertNotRegex should have failed.') class TestLongMessage(unittest.TestCase): @@ -127,14 +127,14 @@ self.testableFalse = TestableTestFalse('testTest') def testDefault(self): - self.assertFalse(unittest.TestCase.longMessage) + self.assertTrue(unittest.TestCase.longMessage) def test_formatMsg(self): - self.assertEquals(self.testableFalse._formatMessage(None, "foo"), "foo") - self.assertEquals(self.testableFalse._formatMessage("foo", "bar"), "foo") + self.assertEqual(self.testableFalse._formatMessage(None, "foo"), "foo") + self.assertEqual(self.testableFalse._formatMessage("foo", "bar"), "foo") - self.assertEquals(self.testableTrue._formatMessage(None, "foo"), "foo") - self.assertEquals(self.testableTrue._formatMessage("foo", "bar"), "bar : foo") + self.assertEqual(self.testableTrue._formatMessage(None, "foo"), "foo") + self.assertEqual(self.testableTrue._formatMessage("foo", "bar"), "bar : foo") # This blows up if _formatMessage uses string concatenation self.testableTrue._formatMessage(object(), 'foo') @@ -153,26 +153,26 @@ test = self.testableTrue return getattr(test, methodName) - for i, expected_regexp in enumerate(errors): + for i, expected_regex in enumerate(errors): testMethod = getMethod(i) kwargs = {} withMsg = i % 2 if withMsg: kwargs = {"msg": "oops"} - with self.assertRaisesRegexp(self.failureException, - expected_regexp=expected_regexp): + with self.assertRaisesRegex(self.failureException, + expected_regex=expected_regex): testMethod(*args, **kwargs) def testAssertTrue(self): self.assertMessages('assertTrue', (False,), - ["^False is not True$", "^oops$", "^False is not True$", - "^False is not True : oops$"]) + ["^False is not true$", "^oops$", "^False is not true$", + "^False is not true : oops$"]) def testAssertFalse(self): self.assertMessages('assertFalse', (True,), - ["^True is not False$", "^oops$", "^True is not False$", - "^True is not False : oops$"]) + ["^True is not false$", "^oops$", "^True is not false$", + "^True is not false : oops$"]) def testNotEqual(self): self.assertMessages('assertNotEqual', (1, 1), @@ -223,18 +223,6 @@ "\+ \{'key': 'value'\}$", "\+ \{'key': 'value'\} : oops$"]) - def testAssertDictContainsSubset(self): - self.assertMessages('assertDictContainsSubset', ({'key': 'value'}, {}), - ["^Missing: 'key'$", "^oops$", - "^Missing: 'key'$", - "^Missing: 'key' : oops$"]) - - def testAssertItemsEqual(self): - self.assertMessages('assertItemsEqual', ([], [None]), - [r"\[None\]$", "^oops$", - r"\[None\]$", - r"\[None\] : oops$"]) - def testAssertMultiLineEqual(self): self.assertMessages('assertMultiLineEqual', ("", "foo"), [r"\+ foo$", "^oops$", Modified: python/branches/pep-3151/Lib/unittest/test/test_break.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_break.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_break.py Sat Feb 26 08:16:32 2011 @@ -209,7 +209,8 @@ self.assertEqual(FakeRunner.initArgs, [((), {'buffer': None, 'verbosity': verbosity, - 'failfast': failfast})]) + 'failfast': failfast, + 'warnings': None})]) self.assertEqual(FakeRunner.runArgs, [test]) self.assertEqual(p.result, result) @@ -222,7 +223,8 @@ self.assertEqual(FakeRunner.initArgs, [((), {'buffer': None, 'verbosity': verbosity, - 'failfast': failfast})]) + 'failfast': failfast, + 'warnings': None})]) self.assertEqual(FakeRunner.runArgs, 
[test]) self.assertEqual(p.result, result) Modified: python/branches/pep-3151/Lib/unittest/test/test_case.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_case.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_case.py Sat Feb 26 08:16:32 2011 @@ -1,5 +1,6 @@ import difflib import pprint +import pickle import re import sys import warnings @@ -76,6 +77,16 @@ self.assertEqual(Test().id()[-13:], '.Test.runTest') + # test that TestCase can be instantiated with no args + # primarily for use at the interactive interpreter + test = unittest.TestCase() + test.assertEqual(3, 3) + with test.assertRaises(test.failureException): + test.assertEqual(3, 2) + + with self.assertRaises(AttributeError): + test.run() + # "class TestCase([methodName])" # ... # "Each instance of TestCase will run a single test method: the @@ -176,8 +187,8 @@ super(Foo, self).test() raise RuntimeError('raised by Foo.test') - expected = ['startTest', 'setUp', 'test', 'addError', 'tearDown', - 'stopTest'] + expected = ['startTest', 'setUp', 'test', 'tearDown', + 'addError', 'stopTest'] Foo(events).run(result) self.assertEqual(events, expected) @@ -194,8 +205,8 @@ super(Foo, self).test() raise RuntimeError('raised by Foo.test') - expected = ['startTestRun', 'startTest', 'setUp', 'test', 'addError', - 'tearDown', 'stopTest', 'stopTestRun'] + expected = ['startTestRun', 'startTest', 'setUp', 'test', + 'tearDown', 'addError', 'stopTest', 'stopTestRun'] Foo(events).run() self.assertEqual(events, expected) @@ -215,8 +226,8 @@ super(Foo, self).test() self.fail('raised by Foo.test') - expected = ['startTest', 'setUp', 'test', 'addFailure', 'tearDown', - 'stopTest'] + expected = ['startTest', 'setUp', 'test', 'tearDown', + 'addFailure', 'stopTest'] Foo(events).run(result) self.assertEqual(events, expected) @@ -230,8 +241,8 @@ super(Foo, self).test() self.fail('raised by Foo.test') - expected = ['startTestRun', 'startTest', 'setUp', 'test', 'addFailure', - 'tearDown', 'stopTest', 'stopTestRun'] + expected = ['startTestRun', 'startTest', 'setUp', 'test', + 'tearDown', 'addFailure', 'stopTest', 'stopTestRun'] events = [] Foo(events).run() self.assertEqual(events, expected) @@ -477,33 +488,6 @@ self.assertRaises(self.failureException, self.assertNotIn, 'cow', animals) - def testAssertDictContainsSubset(self): - self.assertDictContainsSubset({}, {}) - self.assertDictContainsSubset({}, {'a': 1}) - self.assertDictContainsSubset({'a': 1}, {'a': 1}) - self.assertDictContainsSubset({'a': 1}, {'a': 1, 'b': 2}) - self.assertDictContainsSubset({'a': 1, 'b': 2}, {'a': 1, 'b': 2}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({1: "one"}, {}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'a': 2}, {'a': 1}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'c': 1}, {'a': 1}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'a': 1, 'c': 1}, {'a': 1}) - - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'a': 1, 'c': 1}, {'a': 1}) - - one = ''.join(chr(i) for i in range(255)) - # this used to cause a UnicodeDecodeError constructing the failure msg - with self.assertRaises(self.failureException): - self.assertDictContainsSubset({'foo': one}, {'foo': '\uFFFD'}) - def testAssertEqual(self): equal_pairs = [ ((), ()), @@ -671,46 +655,68 @@ else: self.fail('assertMultiLineEqual did not fail') - 
def testAssertItemsEqual(self): + def testAssertCountEqual(self): a = object() - self.assertItemsEqual([1, 2, 3], [3, 2, 1]) - self.assertItemsEqual(['foo', 'bar', 'baz'], ['bar', 'baz', 'foo']) - self.assertItemsEqual([a, a, 2, 2, 3], (a, 2, 3, a, 2)) - self.assertItemsEqual([1, "2", "a", "a"], ["a", "2", True, "a"]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertCountEqual([1, 2, 3], [3, 2, 1]) + self.assertCountEqual(['foo', 'bar', 'baz'], ['bar', 'baz', 'foo']) + self.assertCountEqual([a, a, 2, 2, 3], (a, 2, 3, a, 2)) + self.assertCountEqual([1, "2", "a", "a"], ["a", "2", True, "a"]) + self.assertRaises(self.failureException, self.assertCountEqual, [1, 2] + [3] * 100, [1] * 100 + [2, 3]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [1, "2", "a", "a"], ["a", "2", True, 1]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [10], [10, 11]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [10, 11], [10]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [10, 11, 10], [10, 11]) # Test that sequences of unhashable objects can be tested for sameness: - self.assertItemsEqual([[1, 2], [3, 4], 0], [False, [3, 4], [1, 2]]) + self.assertCountEqual([[1, 2], [3, 4], 0], [False, [3, 4], [1, 2]]) + # Test that iterator of unhashable objects can be tested for sameness: + self.assertCountEqual(iter([1, 2, [], 3, 4]), + iter([1, 2, [], 3, 4])) # hashable types, but not orderable - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [], [divmod, 'x', 1, 5j, 2j, frozenset()]) # comparing dicts - self.assertItemsEqual([{'a': 1}, {'b': 2}], [{'b': 2}, {'a': 1}]) + self.assertCountEqual([{'a': 1}, {'b': 2}], [{'b': 2}, {'a': 1}]) # comparing heterogenous non-hashable sequences - self.assertItemsEqual([1, 'x', divmod, []], [divmod, [], 'x', 1]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertCountEqual([1, 'x', divmod, []], [divmod, [], 'x', 1]) + self.assertRaises(self.failureException, self.assertCountEqual, [], [divmod, [], 'x', 1, 5j, 2j, set()]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [[1]], [[2]]) # Same elements, but not same sequence length - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [1, 1, 2], [2, 1]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [1, 1, "2", "a", "a"], ["2", "2", True, "a"]) - self.assertRaises(self.failureException, self.assertItemsEqual, + self.assertRaises(self.failureException, self.assertCountEqual, [1, {'b': 2}, None, True], [{'b': 2}, True, None]) + # Same elements which don't reliably compare, in + # different order, see issue 10242 + a = [{2,4}, {1,2}] + b = a[::-1] + self.assertCountEqual(a, b) + + # test utility functions supporting assertCountEqual() + + diffs = set(unittest.util._count_diff_all_purpose('aaabccd', 'abbbcce')) + expected = {(3,1,'a'), (1,3,'b'), (1,0,'d'), (0,1,'e')} + self.assertEqual(diffs, expected) + + diffs = 
unittest.util._count_diff_all_purpose([[]], []) + self.assertEqual(diffs, [(1, 0, [])]) + + diffs = set(unittest.util._count_diff_hashable('aaabccd', 'abbbcce')) + expected = {(3,1,'a'), (1,3,'b'), (1,0,'d'), (0,1,'e')} + self.assertEqual(diffs, expected) + def testAssertSetEqual(self): set1 = set() set2 = set() @@ -864,44 +870,44 @@ self.assertIsNotNone('DjZoPloGears on Rails') self.assertRaises(self.failureException, self.assertIsNotNone, None) - def testAssertRegexpMatches(self): - self.assertRegexpMatches('asdfabasdf', r'ab+') - self.assertRaises(self.failureException, self.assertRegexpMatches, + def testAssertRegex(self): + self.assertRegex('asdfabasdf', r'ab+') + self.assertRaises(self.failureException, self.assertRegex, 'saaas', r'aaaa') - def testAssertRaisesRegexp(self): + def testAssertRaisesRegex(self): class ExceptionMock(Exception): pass def Stub(): raise ExceptionMock('We expect') - self.assertRaisesRegexp(ExceptionMock, re.compile('expect$'), Stub) - self.assertRaisesRegexp(ExceptionMock, 'expect$', Stub) + self.assertRaisesRegex(ExceptionMock, re.compile('expect$'), Stub) + self.assertRaisesRegex(ExceptionMock, 'expect$', Stub) - def testAssertNotRaisesRegexp(self): - self.assertRaisesRegexp( + def testAssertNotRaisesRegex(self): + self.assertRaisesRegex( self.failureException, '^Exception not raised by $', - self.assertRaisesRegexp, Exception, re.compile('x'), + self.assertRaisesRegex, Exception, re.compile('x'), lambda: None) - self.assertRaisesRegexp( + self.assertRaisesRegex( self.failureException, '^Exception not raised by $', - self.assertRaisesRegexp, Exception, 'x', + self.assertRaisesRegex, Exception, 'x', lambda: None) - def testAssertRaisesRegexpMismatch(self): + def testAssertRaisesRegexMismatch(self): def Stub(): raise Exception('Unexpected') - self.assertRaisesRegexp( + self.assertRaisesRegex( self.failureException, r'"\^Expected\$" does not match "Unexpected"', - self.assertRaisesRegexp, Exception, '^Expected$', + self.assertRaisesRegex, Exception, '^Expected$', Stub) - self.assertRaisesRegexp( + self.assertRaisesRegex( self.failureException, r'"\^Expected\$" does not match "Unexpected"', - self.assertRaisesRegexp, Exception, + self.assertRaisesRegex, Exception, re.compile('^Expected$'), Stub) def testAssertRaisesExcValue(self): @@ -985,26 +991,26 @@ with self.assertWarns(DeprecationWarning): _runtime_warn() - def testAssertWarnsRegexpCallable(self): + def testAssertWarnsRegexCallable(self): def _runtime_warn(msg): warnings.warn(msg, RuntimeWarning) - self.assertWarnsRegexp(RuntimeWarning, "o+", - _runtime_warn, "foox") + self.assertWarnsRegex(RuntimeWarning, "o+", + _runtime_warn, "foox") # Failure when no warning is triggered with self.assertRaises(self.failureException): - self.assertWarnsRegexp(RuntimeWarning, "o+", - lambda: 0) + self.assertWarnsRegex(RuntimeWarning, "o+", + lambda: 0) # Failure when another warning is triggered with warnings.catch_warnings(): # Force default filter (in case tests are run with -We) warnings.simplefilter("default", RuntimeWarning) with self.assertRaises(self.failureException): - self.assertWarnsRegexp(DeprecationWarning, "o+", - _runtime_warn, "foox") + self.assertWarnsRegex(DeprecationWarning, "o+", + _runtime_warn, "foox") # Failure when message doesn't match with self.assertRaises(self.failureException): - self.assertWarnsRegexp(RuntimeWarning, "o+", - _runtime_warn, "barz") + self.assertWarnsRegex(RuntimeWarning, "o+", + _runtime_warn, "barz") # A little trickier: we ask RuntimeWarnings to be raised, and then # check 
for some of them. It is implementation-defined whether # non-matching RuntimeWarnings are simply re-raised, or produce a @@ -1012,15 +1018,15 @@ with warnings.catch_warnings(): warnings.simplefilter("error", RuntimeWarning) with self.assertRaises((RuntimeWarning, self.failureException)): - self.assertWarnsRegexp(RuntimeWarning, "o+", - _runtime_warn, "barz") + self.assertWarnsRegex(RuntimeWarning, "o+", + _runtime_warn, "barz") - def testAssertWarnsRegexpContext(self): - # Same as above, but with assertWarnsRegexp as a context manager + def testAssertWarnsRegexContext(self): + # Same as above, but with assertWarnsRegex as a context manager def _runtime_warn(msg): warnings.warn(msg, RuntimeWarning) _runtime_warn_lineno = inspect.getsourcelines(_runtime_warn)[1] - with self.assertWarnsRegexp(RuntimeWarning, "o+") as cm: + with self.assertWarnsRegex(RuntimeWarning, "o+") as cm: _runtime_warn("foox") self.assertIsInstance(cm.warning, RuntimeWarning) self.assertEqual(cm.warning.args[0], "foox") @@ -1028,18 +1034,18 @@ self.assertEqual(cm.lineno, _runtime_warn_lineno + 1) # Failure when no warning is triggered with self.assertRaises(self.failureException): - with self.assertWarnsRegexp(RuntimeWarning, "o+"): + with self.assertWarnsRegex(RuntimeWarning, "o+"): pass # Failure when another warning is triggered with warnings.catch_warnings(): # Force default filter (in case tests are run with -We) warnings.simplefilter("default", RuntimeWarning) with self.assertRaises(self.failureException): - with self.assertWarnsRegexp(DeprecationWarning, "o+"): + with self.assertWarnsRegex(DeprecationWarning, "o+"): _runtime_warn("foox") # Failure when message doesn't match with self.assertRaises(self.failureException): - with self.assertWarnsRegexp(RuntimeWarning, "o+"): + with self.assertWarnsRegex(RuntimeWarning, "o+"): _runtime_warn("barz") # A little trickier: we ask RuntimeWarnings to be raised, and then # check for some of them. It is implementation-defined whether @@ -1048,42 +1054,27 @@ with warnings.catch_warnings(): warnings.simplefilter("error", RuntimeWarning) with self.assertRaises((RuntimeWarning, self.failureException)): - with self.assertWarnsRegexp(RuntimeWarning, "o+"): + with self.assertWarnsRegex(RuntimeWarning, "o+"): _runtime_warn("barz") - def testSynonymAssertMethodNames(self): - """Test undocumented method name synonyms. - - Please do not use these methods names in your own code. - - This test confirms their continued existence and functionality - in order to avoid breaking existing code. - """ - self.assertNotEquals(3, 5) - self.assertEquals(3, 3) - self.assertAlmostEquals(2.0, 2.0) - self.assertNotAlmostEquals(3.0, 5.0) - self.assert_(True) - - def testPendingDeprecationMethodNames(self): - """Test fail* methods pending deprecation, they will warn in 3.2. + def testDeprecatedMethodNames(self): + """Test that the deprecated methods raise a DeprecationWarning. - Do not use these methods. They will go away in 3.3. + The fail* methods will be removed in 3.3. The assert* methods will + have to stay around for a few more versions. See #9424. 
""" old = ( - (self.failIfEqual, (3, 5)), - (self.failUnlessEqual, (3, 3)), - (self.failUnlessAlmostEqual, (2.0, 2.0)), - (self.failIfAlmostEqual, (3.0, 5.0)), - (self.failUnless, (True,)), - (self.failUnlessRaises, (TypeError, lambda _: 3.14 + 'spam')), - (self.failIf, (False,)), - (self.assertSameElements, ([1, 1, 2, 3], [1, 2, 3])) + (self.assertNotEquals, (3, 5)), + (self.assertEquals, (3, 3)), + (self.assertAlmostEquals, (2.0, 2.0)), + (self.assertNotAlmostEquals, (3.0, 5.0)), + (self.assert_, (True,)), + (self.assertRaisesRegexp, (KeyError, 'foo', lambda: {}['foo'])), + (self.assertRegexpMatches, ('bar', 'bar')), ) for meth, args in old: - with support.check_warnings(('', DeprecationWarning)) as w: + with self.assertWarns(DeprecationWarning): meth(*args) - self.assertEqual(len(w.warnings), 1) def testDeepcopy(self): # Issue: 5660 @@ -1095,3 +1086,99 @@ # This shouldn't blow up deepcopy(test) + + def testPickle(self): + # Issue 10326 + + # Can't use TestCase classes defined in Test class as + # pickle does not work with inner classes + test = unittest.TestCase('run') + for protocol in range(pickle.HIGHEST_PROTOCOL + 1): + + # blew up prior to fix + pickled_test = pickle.dumps(test, protocol=protocol) + unpickled_test = pickle.loads(pickled_test) + self.assertEqual(test, unpickled_test) + + # exercise the TestCase instance in a way that will invoke + # the type equality lookup mechanism + unpickled_test.assertEqual(set(), set()) + + def testKeyboardInterrupt(self): + def _raise(self=None): + raise KeyboardInterrupt + def nothing(self): + pass + + class Test1(unittest.TestCase): + test_something = _raise + + class Test2(unittest.TestCase): + setUp = _raise + test_something = nothing + + class Test3(unittest.TestCase): + test_something = nothing + tearDown = _raise + + class Test4(unittest.TestCase): + def test_something(self): + self.addCleanup(_raise) + + for klass in (Test1, Test2, Test3, Test4): + with self.assertRaises(KeyboardInterrupt): + klass('test_something').run() + + def testSkippingEverywhere(self): + def _skip(self=None): + raise unittest.SkipTest('some reason') + def nothing(self): + pass + + class Test1(unittest.TestCase): + test_something = _skip + + class Test2(unittest.TestCase): + setUp = _skip + test_something = nothing + + class Test3(unittest.TestCase): + test_something = nothing + tearDown = _skip + + class Test4(unittest.TestCase): + def test_something(self): + self.addCleanup(_skip) + + for klass in (Test1, Test2, Test3, Test4): + result = unittest.TestResult() + klass('test_something').run(result) + self.assertEqual(len(result.skipped), 1) + self.assertEqual(result.testsRun, 1) + + def testSystemExit(self): + def _raise(self=None): + raise SystemExit + def nothing(self): + pass + + class Test1(unittest.TestCase): + test_something = _raise + + class Test2(unittest.TestCase): + setUp = _raise + test_something = nothing + + class Test3(unittest.TestCase): + test_something = nothing + tearDown = _raise + + class Test4(unittest.TestCase): + def test_something(self): + self.addCleanup(_raise) + + for klass in (Test1, Test2, Test3, Test4): + result = unittest.TestResult() + klass('test_something').run(result) + self.assertEqual(len(result.errors), 1) + self.assertEqual(result.testsRun, 1) Modified: python/branches/pep-3151/Lib/unittest/test/test_discovery.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_discovery.py (original) +++ 
python/branches/pep-3151/Lib/unittest/test/test_discovery.py Sat Feb 26 08:16:32 2011 @@ -5,6 +5,18 @@ import unittest +class TestableTestProgram(unittest.TestProgram): + module = '__main__' + exit = True + defaultTest = failfast = catchbreak = buffer = None + verbosity = 1 + progName = '' + testRunner = testLoader = None + + def __init__(self): + pass + + class TestDiscovery(unittest.TestCase): # Heavily mocked tests so I can avoid hitting the filesystem @@ -195,8 +207,7 @@ test.test_this_does_not_exist() def test_command_line_handling_parseArgs(self): - # Haha - take that uninstantiable class - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() args = [] def do_discovery(argv): @@ -208,13 +219,39 @@ program.parseArgs(['something', 'discover', 'foo', 'bar']) self.assertEqual(args, ['foo', 'bar']) + def test_command_line_handling_discover_by_default(self): + program = TestableTestProgram() + program.module = None + + self.called = False + def do_discovery(argv): + self.called = True + self.assertEqual(argv, []) + program._do_discovery = do_discovery + program.parseArgs(['something']) + self.assertTrue(self.called) + + def test_command_line_handling_discover_by_default_with_options(self): + program = TestableTestProgram() + program.module = None + + args = ['something', '-v', '-b', '-v', '-c', '-f'] + self.called = False + def do_discovery(argv): + self.called = True + self.assertEqual(argv, args[1:]) + program._do_discovery = do_discovery + program.parseArgs(args) + self.assertTrue(self.called) + + def test_command_line_handling_do_discovery_too_many_arguments(self): class Stop(Exception): pass def usageExit(): raise Stop - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program.usageExit = usageExit with self.assertRaises(Stop): @@ -223,7 +260,7 @@ def test_command_line_handling_do_discovery_calls_loader(self): - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() class Loader(object): args = [] @@ -237,49 +274,49 @@ self.assertEqual(Loader.args, [('.', 'test*.py', None)]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['--verbose'], Loader=Loader) self.assertEqual(program.test, 'tests') self.assertEqual(Loader.args, [('.', 'test*.py', None)]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery([], Loader=Loader) self.assertEqual(program.test, 'tests') self.assertEqual(Loader.args, [('.', 'test*.py', None)]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['fish'], Loader=Loader) self.assertEqual(program.test, 'tests') self.assertEqual(Loader.args, [('fish', 'test*.py', None)]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['fish', 'eggs'], Loader=Loader) self.assertEqual(program.test, 'tests') self.assertEqual(Loader.args, [('fish', 'eggs', None)]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['fish', 'eggs', 'ham'], Loader=Loader) self.assertEqual(program.test, 'tests') self.assertEqual(Loader.args, [('fish', 'eggs', 'ham')]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['-s', 'fish'], Loader=Loader) self.assertEqual(program.test, 'tests') 
self.assertEqual(Loader.args, [('fish', 'test*.py', None)]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['-t', 'fish'], Loader=Loader) self.assertEqual(program.test, 'tests') self.assertEqual(Loader.args, [('.', 'test*.py', 'fish')]) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['-p', 'fish'], Loader=Loader) self.assertEqual(program.test, 'tests') self.assertEqual(Loader.args, [('.', 'fish', None)]) @@ -287,7 +324,7 @@ self.assertFalse(program.catchbreak) Loader.args = [] - program = object.__new__(unittest.TestProgram) + program = TestableTestProgram() program._do_discovery(['-p', 'eggs', '-s', 'fish', '-v', '-f', '-c'], Loader=Loader) self.assertEqual(program.test, 'tests') @@ -330,7 +367,7 @@ expected_dir = os.path.abspath('foo') msg = re.escape(r"'foo' module incorrectly imported from %r. Expected %r. " "Is this module globally installed?" % (mod_dir, expected_dir)) - self.assertRaisesRegexp( + self.assertRaisesRegex( ImportError, '^%s$' % msg, loader.discover, start_dir='foo', pattern='foo.py' ) Modified: python/branches/pep-3151/Lib/unittest/test/test_functiontestcase.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_functiontestcase.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_functiontestcase.py Sat Feb 26 08:16:32 2011 @@ -58,8 +58,8 @@ def tearDown(): events.append('tearDown') - expected = ['startTest', 'setUp', 'test', 'addError', 'tearDown', - 'stopTest'] + expected = ['startTest', 'setUp', 'test', 'tearDown', + 'addError', 'stopTest'] unittest.FunctionTestCase(test, setUp, tearDown).run(result) self.assertEqual(events, expected) @@ -84,8 +84,8 @@ def tearDown(): events.append('tearDown') - expected = ['startTest', 'setUp', 'test', 'addFailure', 'tearDown', - 'stopTest'] + expected = ['startTest', 'setUp', 'test', 'tearDown', + 'addFailure', 'stopTest'] unittest.FunctionTestCase(test, setUp, tearDown).run(result) self.assertEqual(events, expected) Modified: python/branches/pep-3151/Lib/unittest/test/test_loader.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_loader.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_loader.py Sat Feb 26 08:16:32 2011 @@ -167,11 +167,11 @@ loader = unittest.TestLoader() suite = loader.loadTestsFromModule(m) self.assertIsInstance(suite, unittest.TestSuite) - self.assertEquals(load_tests_args, [loader, suite, None]) + self.assertEqual(load_tests_args, [loader, suite, None]) load_tests_args = [] suite = loader.loadTestsFromModule(m, use_load_tests=False) - self.assertEquals(load_tests_args, []) + self.assertEqual(load_tests_args, []) def test_loadTestsFromModule__faulty_load_tests(self): m = types.ModuleType('m') @@ -186,7 +186,7 @@ self.assertEqual(suite.countTestCases(), 1) test = list(suite)[0] - self.assertRaisesRegexp(TypeError, "some failure", test.m) + self.assertRaisesRegex(TypeError, "some failure", test.m) ################################################################ ### /Tests for TestLoader.loadTestsFromModule() Modified: python/branches/pep-3151/Lib/unittest/test/test_program.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_program.py (original) +++ 
python/branches/pep-3151/Lib/unittest/test/test_program.py Sat Feb 26 08:16:32 2011 @@ -99,6 +99,7 @@ defaultTest = None testRunner = None testLoader = unittest.defaultTestLoader + module = '__main__' progName = 'test' test = 'test' def __init__(self, *args): @@ -182,6 +183,27 @@ program.parseArgs([None, opt]) self.assertEqual(getattr(program, attr), not_none) + def testWarning(self): + """Test the warnings argument""" + # see #10535 + class FakeTP(unittest.TestProgram): + def parseArgs(self, *args, **kw): pass + def runTests(self, *args, **kw): pass + warnoptions = sys.warnoptions + try: + sys.warnoptions[:] = [] + # no warn options, no arg -> default + self.assertEqual(FakeTP().warnings, 'default') + # no warn options, w/ arg -> arg value + self.assertEqual(FakeTP(warnings='ignore').warnings, 'ignore') + sys.warnoptions[:] = ['somevalue'] + # warn options, no arg -> None + # warn options, w/ arg -> arg value + self.assertEqual(FakeTP().warnings, None) + self.assertEqual(FakeTP(warnings='ignore').warnings, 'ignore') + finally: + sys.warnoptions[:] = warnoptions + def testRunTestsRunnerClass(self): program = self.program @@ -189,12 +211,14 @@ program.verbosity = 'verbosity' program.failfast = 'failfast' program.buffer = 'buffer' + program.warnings = 'warnings' program.runTests() self.assertEqual(FakeRunner.initArgs, {'verbosity': 'verbosity', 'failfast': 'failfast', - 'buffer': 'buffer'}) + 'buffer': 'buffer', + 'warnings': 'warnings'}) self.assertEqual(FakeRunner.test, 'test') self.assertIs(program.result, RESULT) @@ -250,6 +274,85 @@ program.runTests() self.assertTrue(self.installed) + def _patch_isfile(self, names, exists=True): + def isfile(path): + return path in names + original = os.path.isfile + os.path.isfile = isfile + def restore(): + os.path.isfile = original + self.addCleanup(restore) + + + def testParseArgsFileNames(self): + # running tests with filenames instead of module names + program = self.program + argv = ['progname', 'foo.py', 'bar.Py', 'baz.PY', 'wing.txt'] + self._patch_isfile(argv) + + program.createTests = lambda: None + program.parseArgs(argv) + + # note that 'wing.txt' is not a Python file so the name should + # *not* be converted to a module name + expected = ['foo', 'bar', 'baz', 'wing.txt'] + self.assertEqual(program.testNames, expected) + + + def testParseArgsFilePaths(self): + program = self.program + argv = ['progname', 'foo/bar/baz.py', 'green\\red.py'] + self._patch_isfile(argv) + + program.createTests = lambda: None + program.parseArgs(argv) + + expected = ['foo.bar.baz', 'green.red'] + self.assertEqual(program.testNames, expected) + + + def testParseArgsNonExistentFiles(self): + program = self.program + argv = ['progname', 'foo/bar/baz.py', 'green\\red.py'] + self._patch_isfile([]) + + program.createTests = lambda: None + program.parseArgs(argv) + + self.assertEqual(program.testNames, argv[1:]) + + def testParseArgsAbsolutePathsThatCanBeConverted(self): + cur_dir = os.getcwd() + program = self.program + def _join(name): + return os.path.join(cur_dir, name) + argv = ['progname', _join('foo/bar/baz.py'), _join('green\\red.py')] + self._patch_isfile(argv) + + program.createTests = lambda: None + program.parseArgs(argv) + + expected = ['foo.bar.baz', 'green.red'] + self.assertEqual(program.testNames, expected) + + def testParseArgsAbsolutePathsThatCannotBeConverted(self): + program = self.program + # even on Windows '/...' 
is considered absolute by os.path.abspath + argv = ['progname', '/foo/bar/baz.py', '/green/red.py'] + self._patch_isfile(argv) + + program.createTests = lambda: None + program.parseArgs(argv) + + self.assertEqual(program.testNames, argv[1:]) + + # it may be better to use platform specific functions to normalise paths + # rather than accepting '.PY' and '\' as file seprator on Linux / Mac + # it would also be better to check that a filename is a valid module + # identifier (we have a regex for this in loader.py) + # for invalid filenames should we raise a useful error rather than + # leaving the current error message (import of filename fails) in place? + if __name__ == '__main__': unittest.main() Modified: python/branches/pep-3151/Lib/unittest/test/test_runner.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_runner.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_runner.py Sat Feb 26 08:16:32 2011 @@ -1,5 +1,8 @@ import io +import os +import sys import pickle +import subprocess import unittest @@ -31,9 +34,7 @@ [(cleanup1, (1, 2, 3), dict(four='hello', five='goodbye')), (cleanup2, (), {})]) - result = test.doCleanups() - self.assertTrue(result) - + self.assertTrue(test.doCleanups()) self.assertEqual(cleanups, [(2, (), {}), (1, (1, 2, 3), dict(four='hello', five='goodbye'))]) def testCleanUpWithErrors(self): @@ -41,14 +42,12 @@ def testNothing(self): pass - class MockResult(object): + class MockOutcome(object): + success = True errors = [] - def addError(self, test, exc_info): - self.errors.append((test, exc_info)) - result = MockResult() test = TestableTest('testNothing') - test._resultForDoCleanups = result + test._outcomeForDoCleanups = MockOutcome exc1 = Exception('foo') exc2 = Exception('bar') @@ -62,10 +61,11 @@ test.addCleanup(cleanup2) self.assertFalse(test.doCleanups()) + self.assertFalse(MockOutcome.success) - (test1, (Type1, instance1, _)), (test2, (Type2, instance2, _)) = reversed(MockResult.errors) - self.assertEqual((test1, Type1, instance1), (test, Exception, exc1)) - self.assertEqual((test2, Type2, instance2), (test, Exception, exc2)) + (Type1, instance1, _), (Type2, instance2, _) = reversed(MockOutcome.errors) + self.assertEqual((Type1, instance1), (Exception, exc1)) + self.assertEqual((Type2, instance2), (Exception, exc2)) def testCleanupInRun(self): blowUp = False @@ -144,6 +144,7 @@ self.assertFalse(runner.failfast) self.assertFalse(runner.buffer) self.assertEqual(runner.verbosity, 1) + self.assertEqual(runner.warnings, None) self.assertTrue(runner.descriptions) self.assertEqual(runner.resultclass, unittest.TextTestResult) @@ -244,3 +245,70 @@ expectedresult = (runner.stream, DESCRIPTIONS, VERBOSITY) self.assertEqual(runner._makeResult(), expectedresult) + + def test_warnings(self): + """ + Check that warnings argument of TextTestRunner correctly affects the + behavior of the warnings. 
+ """ + # see #10535 and the _test_warnings file for more information + + def get_parse_out_err(p): + return [b.splitlines() for b in p.communicate()] + opts = dict(stdout=subprocess.PIPE, stderr=subprocess.PIPE, + cwd=os.path.dirname(__file__)) + + # no args -> all the warnings are printed, unittest warnings only once + p = subprocess.Popen([sys.executable, '_test_warnings.py'], **opts) + out, err = get_parse_out_err(p) + self.assertIn(b'OK', err) + # check that the total number of warnings in the output is correct + self.assertEqual(len(out), 11) + # check that the numbers of the different kind of warnings is correct + for msg in [b'dw', b'iw', b'uw']: + self.assertEqual(out.count(msg), 3) + for msg in [b'rw']: + self.assertEqual(out.count(msg), 1) + + args_list = ( + # passing 'ignore' as warnings arg -> no warnings + [sys.executable, '_test_warnings.py', 'ignore'], + # -W doesn't affect the result if the arg is passed + [sys.executable, '-Wa', '_test_warnings.py', 'ignore'], + # -W affects the result if the arg is not passed + [sys.executable, '-Wi', '_test_warnings.py'] + ) + # in all these cases no warnings are printed + for args in args_list: + p = subprocess.Popen(args, **opts) + out, err = get_parse_out_err(p) + self.assertIn(b'OK', err) + self.assertEqual(len(out), 0) + + + # passing 'always' as warnings arg -> all the warnings printed, + # unittest warnings only once + p = subprocess.Popen([sys.executable, '_test_warnings.py', 'always'], + **opts) + out, err = get_parse_out_err(p) + self.assertIn(b'OK', err) + self.assertEqual(len(out), 13) + for msg in [b'dw', b'iw', b'uw', b'rw']: + self.assertEqual(out.count(msg), 3) + + def testStdErrLookedUpAtInstantiationTime(self): + # see issue 10786 + old_stderr = sys.stderr + f = io.StringIO() + sys.stderr = f + try: + runner = unittest.TextTestRunner() + self.assertTrue(runner.stream.stream is f) + finally: + sys.stderr = old_stderr + + def testSpecifiedStreamUsed(self): + # see issue 10786 + f = io.StringIO() + runner = unittest.TextTestRunner(f) + self.assertTrue(runner.stream.stream is f) Modified: python/branches/pep-3151/Lib/unittest/test/test_setups.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_setups.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_setups.py Sat Feb 26 08:16:32 2011 @@ -500,7 +500,7 @@ messages = ('setUpModule', 'tearDownModule', 'setUpClass', 'tearDownClass', 'test_something') for phase, msg in enumerate(messages): - with self.assertRaisesRegexp(Exception, msg): + with self.assertRaisesRegex(Exception, msg): suite.debug() if __name__ == '__main__': Modified: python/branches/pep-3151/Lib/unittest/test/test_suite.py ============================================================================== --- python/branches/pep-3151/Lib/unittest/test/test_suite.py (original) +++ python/branches/pep-3151/Lib/unittest/test/test_suite.py Sat Feb 26 08:16:32 2011 @@ -353,11 +353,16 @@ unittest.TestSuite.__call__(self, *args, **kw) suite = MySuite() + result = unittest.TestResult() wrapper = unittest.TestSuite() wrapper.addTest(suite) - wrapper(unittest.TestResult()) + wrapper(result) self.assertTrue(suite.called) + # reusing results should be permitted even if abominable + self.assertFalse(result._testRunEntered) + + if __name__ == '__main__': unittest.main() Modified: python/branches/pep-3151/Lib/unittest/util.py ============================================================================== --- 
python/branches/pep-3151/Lib/unittest/util.py (original) +++ python/branches/pep-3151/Lib/unittest/util.py Sat Feb 26 08:16:32 2011 @@ -1,5 +1,7 @@ """Various utility functions.""" +from collections import namedtuple, OrderedDict + __unittest = True _MAX_LENGTH = 80 @@ -12,7 +14,6 @@ return result return result[:_MAX_LENGTH] + ' [truncated]...' - def strclass(cls): return "%s.%s" % (cls.__module__, cls.__name__) @@ -77,3 +78,63 @@ def three_way_cmp(x, y): """Return -1 if x < y, 0 if x == y and 1 if x > y""" return (x > y) - (x < y) + +_Mismatch = namedtuple('Mismatch', 'actual expected value') + +def _count_diff_all_purpose(actual, expected): + 'Returns list of (cnt_act, cnt_exp, elem) triples where the counts differ' + # elements need not be hashable + s, t = list(actual), list(expected) + m, n = len(s), len(t) + NULL = object() + result = [] + for i, elem in enumerate(s): + if elem is NULL: + continue + cnt_s = cnt_t = 0 + for j in range(i, m): + if s[j] == elem: + cnt_s += 1 + s[j] = NULL + for j, other_elem in enumerate(t): + if other_elem == elem: + cnt_t += 1 + t[j] = NULL + if cnt_s != cnt_t: + diff = _Mismatch(cnt_s, cnt_t, elem) + result.append(diff) + + for i, elem in enumerate(t): + if elem is NULL: + continue + cnt_t = 0 + for j in range(i, n): + if t[j] == elem: + cnt_t += 1 + t[j] = NULL + diff = _Mismatch(0, cnt_t, elem) + result.append(diff) + return result + +def _ordered_count(iterable): + 'Return dict of element counts, in the order they were first seen' + c = OrderedDict() + for elem in iterable: + c[elem] = c.get(elem, 0) + 1 + return c + +def _count_diff_hashable(actual, expected): + 'Returns list of (cnt_act, cnt_exp, elem) triples where the counts differ' + # elements must be hashable + s, t = _ordered_count(actual), _ordered_count(expected) + result = [] + for elem, cnt_s in s.items(): + cnt_t = t.get(elem, 0) + if cnt_s != cnt_t: + diff = _Mismatch(cnt_s, cnt_t, elem) + result.append(diff) + for elem, cnt_t in t.items(): + if elem not in s: + diff = _Mismatch(0, cnt_t, elem) + result.append(diff) + return result Modified: python/branches/pep-3151/Lib/urllib/parse.py ============================================================================== --- python/branches/pep-3151/Lib/urllib/parse.py (original) +++ python/branches/pep-3151/Lib/urllib/parse.py Sat Feb 26 08:16:32 2011 @@ -11,7 +11,7 @@ RFC 2396: "Uniform Resource Identifiers (URI)": Generic Syntax by T. Berners-Lee, R. Fielding, and L. Masinter, August 1998. -RFC 2368: "The mailto URL scheme", by P.Hoffman , L Masinter, J. Zwinski, July 1998. +RFC 2368: "The mailto URL scheme", by P.Hoffman , L Masinter, J. Zawinski, July 1998. RFC 1808: "Relative Uniform Resource Locators", by R. Fielding, UC Irvine, June 1995. @@ -60,6 +60,7 @@ '0123456789' '+-.') +# XXX: Consider replacing with functools.lru_cache MAX_CACHE_SIZE = 20 _parse_cache = {} @@ -69,66 +70,210 @@ _safe_quoters.clear() -class ResultMixin(object): - """Shared methods for the parsed result objects.""" +# Helpers for bytes handling +# For 3.2, we deliberately require applications that +# handle improperly quoted URLs to do their own +# decoding and encoding. 
If valid use cases are +# presented, we may relax this by using latin-1 +# decoding internally for 3.3 +_implicit_encoding = 'ascii' +_implicit_errors = 'strict' + +def _noop(obj): + return obj + +def _encode_result(obj, encoding=_implicit_encoding, + errors=_implicit_errors): + return obj.encode(encoding, errors) + +def _decode_args(args, encoding=_implicit_encoding, + errors=_implicit_errors): + return tuple(x.decode(encoding, errors) if x else '' for x in args) + +def _coerce_args(*args): + # Invokes decode if necessary to create str args + # and returns the coerced inputs along with + # an appropriate result coercion function + # - noop for str inputs + # - encoding function otherwise + str_input = isinstance(args[0], str) + for arg in args[1:]: + # We special-case the empty string to support the + # "scheme=''" default argument to some functions + if arg and isinstance(arg, str) != str_input: + raise TypeError("Cannot mix str and non-str arguments") + if str_input: + return args + (_noop,) + return _decode_args(args) + (_encode_result,) + +# Result objects are more helpful than simple tuples +class _ResultMixinStr(object): + """Standard approach to encoding parsed results from str to bytes""" + __slots__ = () + + def encode(self, encoding='ascii', errors='strict'): + return self._encoded_counterpart(*(x.encode(encoding, errors) for x in self)) + + +class _ResultMixinBytes(object): + """Standard approach to decoding parsed results from bytes to str""" + __slots__ = () + + def decode(self, encoding='ascii', errors='strict'): + return self._decoded_counterpart(*(x.decode(encoding, errors) for x in self)) + + +class _NetlocResultMixinBase(object): + """Shared methods for the parsed result objects containing a netloc element""" + __slots__ = () @property def username(self): - netloc = self.netloc - if "@" in netloc: - userinfo = netloc.rsplit("@", 1)[0] - if ":" in userinfo: - userinfo = userinfo.split(":", 1)[0] - return userinfo - return None + return self._userinfo[0] @property def password(self): - netloc = self.netloc - if "@" in netloc: - userinfo = netloc.rsplit("@", 1)[0] - if ":" in userinfo: - return userinfo.split(":", 1)[1] - return None + return self._userinfo[1] @property def hostname(self): - netloc = self.netloc.split('@')[-1] - if '[' in netloc and ']' in netloc: - return netloc.split(']')[0][1:].lower() - elif ':' in netloc: - return netloc.split(':')[0].lower() - elif netloc == '': - return None - else: - return netloc.lower() + hostname = self._hostinfo[0] + if not hostname: + hostname = None + elif hostname is not None: + hostname = hostname.lower() + return hostname @property def port(self): - netloc = self.netloc.split('@')[-1].split(']')[-1] - if ':' in netloc: - port = netloc.split(':')[1] - return int(port, 10) + port = self._hostinfo[1] + if port is not None: + port = int(port, 10) + return port + + +class _NetlocResultMixinStr(_NetlocResultMixinBase, _ResultMixinStr): + __slots__ = () + + @property + def _userinfo(self): + netloc = self.netloc + userinfo, have_info, hostinfo = netloc.rpartition('@') + if have_info: + username, have_password, password = userinfo.partition(':') + if not have_password: + password = None + else: + username = password = None + return username, password + + @property + def _hostinfo(self): + netloc = self.netloc + _, _, hostinfo = netloc.rpartition('@') + _, have_open_br, bracketed = hostinfo.partition('[') + if have_open_br: + hostname, _, port = bracketed.partition(']') + _, have_port, port = port.partition(':') + else: + hostname, 
have_port, port = hostinfo.partition(':') + if not have_port: + port = None + return hostname, port + + +class _NetlocResultMixinBytes(_NetlocResultMixinBase, _ResultMixinBytes): + __slots__ = () + + @property + def _userinfo(self): + netloc = self.netloc + userinfo, have_info, hostinfo = netloc.rpartition(b'@') + if have_info: + username, have_password, password = userinfo.partition(b':') + if not have_password: + password = None else: - return None + username = password = None + return username, password + + @property + def _hostinfo(self): + netloc = self.netloc + _, _, hostinfo = netloc.rpartition(b'@') + _, have_open_br, bracketed = hostinfo.partition(b'[') + if have_open_br: + hostname, _, port = bracketed.partition(b']') + _, have_port, port = port.partition(b':') + else: + hostname, have_port, port = hostinfo.partition(b':') + if not have_port: + port = None + return hostname, port + from collections import namedtuple -class SplitResult(namedtuple('SplitResult', 'scheme netloc path query fragment'), ResultMixin): +_DefragResultBase = namedtuple('DefragResult', 'url fragment') +_SplitResultBase = namedtuple('SplitResult', 'scheme netloc path query fragment') +_ParseResultBase = namedtuple('ParseResult', 'scheme netloc path params query fragment') + +# For backwards compatibility, alias _NetlocResultMixinStr +# ResultBase is no longer part of the documented API, but it is +# retained since deprecating it isn't worth the hassle +ResultBase = _NetlocResultMixinStr +# Structured result objects for string data +class DefragResult(_DefragResultBase, _ResultMixinStr): __slots__ = () + def geturl(self): + if self.fragment: + return self.url + '#' + self.fragment + else: + return self.url +class SplitResult(_SplitResultBase, _NetlocResultMixinStr): + __slots__ = () def geturl(self): return urlunsplit(self) +class ParseResult(_ParseResultBase, _NetlocResultMixinStr): + __slots__ = () + def geturl(self): + return urlunparse(self) -class ParseResult(namedtuple('ParseResult', 'scheme netloc path params query fragment'), ResultMixin): +# Structured result objects for bytes data +class DefragResultBytes(_DefragResultBase, _ResultMixinBytes): + __slots__ = () + def geturl(self): + if self.fragment: + return self.url + b'#' + self.fragment + else: + return self.url +class SplitResultBytes(_SplitResultBase, _NetlocResultMixinBytes): __slots__ = () + def geturl(self): + return urlunsplit(self) +class ParseResultBytes(_ParseResultBase, _NetlocResultMixinBytes): + __slots__ = () def geturl(self): return urlunparse(self) +# Set up the encode/decode result pairs +def _fix_result_transcoding(): + _result_pairs = ( + (DefragResult, DefragResultBytes), + (SplitResult, SplitResultBytes), + (ParseResult, ParseResultBytes), + ) + for _decoded, _encoded in _result_pairs: + _decoded._encoded_counterpart = _encoded + _encoded._decoded_counterpart = _decoded + +_fix_result_transcoding() +del _fix_result_transcoding def urlparse(url, scheme='', allow_fragments=True): """Parse a URL into 6 components: @@ -136,13 +281,15 @@ Return a 6-tuple: (scheme, netloc, path, params, query, fragment). Note that we don't break the components up in smaller bits (e.g. 
netloc is a single string) and we don't expand % escapes.""" + url, scheme, _coerce_result = _coerce_args(url, scheme) tuple = urlsplit(url, scheme, allow_fragments) scheme, netloc, url, query, fragment = tuple if scheme in uses_params and ';' in url: url, params = _splitparams(url) else: params = '' - return ParseResult(scheme, netloc, url, params, query, fragment) + result = ParseResult(scheme, netloc, url, params, query, fragment) + return _coerce_result(result) def _splitparams(url): if '/' in url: @@ -167,11 +314,12 @@ Return a 5-tuple: (scheme, netloc, path, query, fragment). Note that we don't break the components up in smaller bits (e.g. netloc is a single string) and we don't expand % escapes.""" + url, scheme, _coerce_result = _coerce_args(url, scheme) allow_fragments = bool(allow_fragments) key = url, scheme, allow_fragments, type(url), type(scheme) cached = _parse_cache.get(key, None) if cached: - return cached + return _coerce_result(cached) if len(_parse_cache) >= MAX_CACHE_SIZE: # avoid runaway growth clear_cache() netloc = query = fragment = '' @@ -191,7 +339,7 @@ url, query = url.split('?', 1) v = SplitResult(scheme, netloc, url, query, fragment) _parse_cache[key] = v - return v + return _coerce_result(v) if url.endswith(':') or not url[i+1].isdigit(): for c in url[:i]: if c not in scheme_chars: @@ -209,17 +357,18 @@ url, query = url.split('?', 1) v = SplitResult(scheme, netloc, url, query, fragment) _parse_cache[key] = v - return v + return _coerce_result(v) def urlunparse(components): """Put a parsed URL back together again. This may result in a slightly different, but equivalent URL, if the URL that was parsed originally had redundant delimiters, e.g. a ? with an empty query (the draft states that these are equivalent).""" - scheme, netloc, url, params, query, fragment = components + scheme, netloc, url, params, query, fragment, _coerce_result = ( + _coerce_args(*components)) if params: url = "%s;%s" % (url, params) - return urlunsplit((scheme, netloc, url, query, fragment)) + return _coerce_result(urlunsplit((scheme, netloc, url, query, fragment))) def urlunsplit(components): """Combine the elements of a tuple as returned by urlsplit() into a @@ -227,7 +376,8 @@ This may result in a slightly different, but equivalent URL, if the URL that was parsed originally had unnecessary delimiters (for example, a ? with an empty query; the RFC states that these are equivalent).""" - scheme, netloc, url, query, fragment = components + scheme, netloc, url, query, fragment, _coerce_result = ( + _coerce_args(*components)) if netloc or (scheme and scheme in uses_netloc and url[:2] != '//'): if url and url[:1] != '/': url = '/' + url url = '//' + (netloc or '') + url @@ -237,7 +387,7 @@ url = url + '?' 
+ query if fragment: url = url + '#' + fragment - return url + return _coerce_result(url) def urljoin(base, url, allow_fragments=True): """Join a base URL and a possibly relative URL to form an absolute @@ -246,32 +396,28 @@ return url if not url: return base + base, url, _coerce_result = _coerce_args(base, url) bscheme, bnetloc, bpath, bparams, bquery, bfragment = \ urlparse(base, '', allow_fragments) scheme, netloc, path, params, query, fragment = \ urlparse(url, bscheme, allow_fragments) if scheme != bscheme or scheme not in uses_relative: - return url + return _coerce_result(url) if scheme in uses_netloc: if netloc: - return urlunparse((scheme, netloc, path, - params, query, fragment)) + return _coerce_result(urlunparse((scheme, netloc, path, + params, query, fragment))) netloc = bnetloc if path[:1] == '/': - return urlunparse((scheme, netloc, path, - params, query, fragment)) - if not path: + return _coerce_result(urlunparse((scheme, netloc, path, + params, query, fragment))) + if not path and not params: path = bpath - if not params: - params = bparams - else: - path = path[:-1] - return urlunparse((scheme, netloc, path, - params, query, fragment)) + params = bparams if not query: query = bquery - return urlunparse((scheme, netloc, path, - params, query, fragment)) + return _coerce_result(urlunparse((scheme, netloc, path, + params, query, fragment))) segments = bpath.split('/')[:-1] + path.split('/') # XXX The stuff below is bogus in various ways... if segments[-1] == '.': @@ -293,8 +439,8 @@ segments[-1] = '' elif len(segments) >= 2 and segments[-1] == '..': segments[-2:] = [''] - return urlunparse((scheme, netloc, '/'.join(segments), - params, query, fragment)) + return _coerce_result(urlunparse((scheme, netloc, '/'.join(segments), + params, query, fragment))) def urldefrag(url): """Removes any existing fragment from URL. @@ -303,12 +449,14 @@ the URL contained no fragments, the second element is the empty string. """ + url, _coerce_result = _coerce_args(url) if '#' in url: s, n, p, a, q, frag = urlparse(url) defrag = urlunparse((s, n, p, a, q, '')) - return defrag, frag else: - return url, '' + frag = '' + defrag = url + return _coerce_result(DefragResult(defrag, frag)) def unquote_to_bytes(string): """unquote_to_bytes('abc%20def') -> b'abc def'.""" @@ -375,7 +523,8 @@ string += pct_sequence.decode(encoding, errors) return string -def parse_qs(qs, keep_blank_values=False, strict_parsing=False): +def parse_qs(qs, keep_blank_values=False, strict_parsing=False, + encoding='utf-8', errors='replace'): """Parse a query given as a string argument. Arguments: @@ -392,16 +541,22 @@ strict_parsing: flag indicating what to do with parsing errors. If false (the default), errors are silently ignored. If true, errors raise a ValueError exception. + + encoding and errors: specify how to decode percent-encoded sequences + into Unicode characters, as accepted by the bytes.decode() method. """ dict = {} - for name, value in parse_qsl(qs, keep_blank_values, strict_parsing): + pairs = parse_qsl(qs, keep_blank_values, strict_parsing, + encoding=encoding, errors=errors) + for name, value in pairs: if name in dict: dict[name].append(value) else: dict[name] = [value] return dict -def parse_qsl(qs, keep_blank_values=False, strict_parsing=False): +def parse_qsl(qs, keep_blank_values=False, strict_parsing=False, + encoding='utf-8', errors='replace'): """Parse a query given as a string argument. Arguments: @@ -418,8 +573,12 @@ false (the default), errors are silently ignored. 
If true, errors raise a ValueError exception. + encoding and errors: specify how to decode percent-encoded sequences + into Unicode characters, as accepted by the bytes.decode() method. + Returns a list, as G-d intended. """ + qs, _coerce_result = _coerce_args(qs) pairs = [s2 for s1 in qs.split('&') for s2 in s1.split(';')] r = [] for name_value in pairs: @@ -435,10 +594,13 @@ else: continue if len(nv[1]) or keep_blank_values: - name = unquote(nv[0].replace('+', ' ')) - value = unquote(nv[1].replace('+', ' ')) + name = nv[0].replace('+', ' ') + name = unquote(name, encoding=encoding, errors=errors) + name = _coerce_result(name) + value = nv[1].replace('+', ' ') + value = unquote(value, encoding=encoding, errors=errors) + value = _coerce_result(value) r.append((name, value)) - return r def unquote_plus(string, encoding='utf-8', errors='replace'): @@ -699,7 +861,12 @@ _hostprog = re.compile('^//([^/?]*)(.*)$') match = _hostprog.match(url) - if match: return match.group(1, 2) + if match: + host_port = match.group(1) + path = match.group(2) + if path and not path.startswith('/'): + path = '/' + path + return host_port, path return None, url _userprog = None @@ -711,7 +878,7 @@ _userprog = re.compile('^(.*)@(.*)$') match = _userprog.match(host) - if match: return map(unquote, match.group(1, 2)) + if match: return match.group(1, 2) return None, host _passwdprog = None Modified: python/branches/pep-3151/Lib/urllib/request.py ============================================================================== --- python/branches/pep-3151/Lib/urllib/request.py (original) +++ python/branches/pep-3151/Lib/urllib/request.py Sat Feb 26 08:16:32 2011 @@ -94,6 +94,7 @@ import socket import sys import time +import collections from urllib.error import URLError, HTTPError, ContentTooShortError from urllib.parse import ( @@ -105,7 +106,7 @@ # check for SSL try: import ssl -except: +except ImportError: _have_ssl = False else: _have_ssl = True @@ -274,8 +275,9 @@ def __init__(self): client_version = "Python-urllib/%s" % __version__ self.addheaders = [('User-agent', client_version)] - # manage the individual handlers + # self.handlers is retained only for backward compatibility self.handlers = [] + # manage the individual handlers self.handle_open = {} self.handle_error = {} self.process_response = {} @@ -325,8 +327,6 @@ added = True if added: - # the handlers must work in an specific order, the order - # is specified in a Handler attribute bisect.insort(self.handlers, handler) handler.add_parent(self) @@ -1048,13 +1048,24 @@ if request.data is not None: # POST data = request.data + if isinstance(data, str): + raise TypeError("POST data should be bytes" + " or an iterable of bytes. 
It cannot be str.") if not request.has_header('Content-type'): request.add_unredirected_header( 'Content-type', 'application/x-www-form-urlencoded') if not request.has_header('Content-length'): - request.add_unredirected_header( - 'Content-length', '%d' % len(data)) + try: + mv = memoryview(data) + except TypeError: + if isinstance(data, collections.Iterable): + raise ValueError("Content-Length should be specified " + "for iterable data of type %r %r" % (type(data), + data)) + else: + request.add_unredirected_header( + 'Content-length', '%d' % (len(mv) * mv.itemsize)) sel_host = host if request.has_proxy(): @@ -1300,8 +1311,8 @@ else: passwd = None host = unquote(host) - user = unquote(user or '') - passwd = unquote(passwd or '') + user = user or '' + passwd = passwd or '' try: host = socket.gethostbyname(host) @@ -1835,7 +1846,7 @@ if encoding == 'base64': import base64 # XXX is this encoding/decoding ok? - data = base64.decodebytes(data.encode('ascii')).decode('latin1') + data = base64.decodebytes(data.encode('ascii')).decode('latin-1') else: data = unquote(data) msg.append('Content-Length: %d' % len(data)) Modified: python/branches/pep-3151/Lib/wave.py ============================================================================== --- python/branches/pep-3151/Lib/wave.py (original) +++ python/branches/pep-3151/Lib/wave.py Sat Feb 26 08:16:32 2011 @@ -467,11 +467,11 @@ self._datalength = self._nframes * self._nchannels * self._sampwidth self._form_length_pos = self._file.tell() self._file.write(struct.pack('' % (self.text,) def __hash__(self): return hash(self.text) def __le__(self, other): @@ -1066,7 +1068,7 @@ def register_namespace(prefix, uri): if re.match("ns\d+$", prefix): raise ValueError("Prefix format reserved for internal use") - for k, v in _namespace_map.items(): + for k, v in list(_namespace_map.items()): if k == uri or v == prefix: del _namespace_map[k] _namespace_map[uri] = prefix Modified: python/branches/pep-3151/Lib/xmlrpc/client.py ============================================================================== --- python/branches/pep-3151/Lib/xmlrpc/client.py (original) +++ python/branches/pep-3151/Lib/xmlrpc/client.py Sat Feb 26 08:16:32 2011 @@ -1297,8 +1297,12 @@ def parse_response(self, response): # read response data from httpresponse, and parse it - if response.getheader("Content-Encoding", "") == "gzip": - stream = GzipDecodedResponse(response) + # Check for new http response object, otherwise it is a file object. + if hasattr(response, 'getheader'): + if response.getheader("Content-Encoding", "") == "gzip": + stream = GzipDecodedResponse(response) + else: + stream = response else: stream = response @@ -1330,7 +1334,7 @@ if self._connection and host == self._connection[0]: return self._connection[1] - if not hasattr(socket, "ssl"): + if not hasattr(http.client, "HTTPSConnection"): raise NotImplementedError( "your version of http.client doesn't support HTTPS") # create a HTTPS connection object from a host descriptor Modified: python/branches/pep-3151/Lib/zipfile.py ============================================================================== --- python/branches/pep-3151/Lib/zipfile.py (original) +++ python/branches/pep-3151/Lib/zipfile.py Sat Feb 26 08:16:32 2011 @@ -473,9 +473,11 @@ # Search for universal newlines or line chunks. 
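A note on the urllib.parse hunks earlier in this changeset: the new _coerce_args/_decode_args machinery lets the parsing functions accept either str or bytes (bytes are decoded internally as strict ASCII), refuses to mix the two in one call, and returns result objects that convert between the str and bytes families via .encode()/.decode(). A minimal sketch of that behavior against the patched module (the URL is an arbitrary example):

    from urllib.parse import urlparse

    # str input -> ParseResult with str fields, split via rpartition/partition
    r = urlparse('http://user:pw@example.com:8042/path;params?q=1#frag')
    assert (r.username, r.password) == ('user', 'pw')
    assert (r.hostname, r.port) == ('example.com', 8042)

    # bytes input -> ParseResultBytes with bytes fields
    rb = urlparse(b'http://example.com:8042/path?q=1#frag')
    assert rb.netloc == b'example.com:8042' and rb.port == 8042

    # the two result families convert into one another
    assert r.encode().netloc == b'user:pw@example.com:8042'
    assert rb.decode().geturl() == 'http://example.com:8042/path?q=1#frag'

    # mixing str and bytes arguments is rejected
    try:
        urlparse('http://example.com/', b'http')
    except TypeError as exc:
        print(exc)          # Cannot mix str and non-str arguments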
PATTERN = re.compile(br'^(?P[^\r\n]+)|(?P\n|\r\n?)') - def __init__(self, fileobj, mode, zipinfo, decrypter=None): + def __init__(self, fileobj, mode, zipinfo, decrypter=None, + close_fileobj=False): self._fileobj = fileobj self._decrypter = decrypter + self._close_fileobj = close_fileobj self._compress_type = zipinfo.compress_type self._compress_size = zipinfo.compress_size @@ -647,6 +649,12 @@ self._offset += len(data) return data + def close(self): + try: + if self._close_fileobj: + self._fileobj.close() + finally: + super().close() class ZipFile: @@ -869,8 +877,12 @@ def setpassword(self, pwd): """Set default password for encrypted files.""" - assert isinstance(pwd, bytes) - self.pwd = pwd + if pwd and not isinstance(pwd, bytes): + raise TypeError("pwd: expected bytes, got %s" % type(pwd)) + if pwd: + self.pwd = pwd + else: + self.pwd = None def read(self, name, pwd=None): """Return file bytes (as a string) for name.""" @@ -881,6 +893,8 @@ """Return file-like object for 'name'.""" if mode not in ("r", "U", "rU"): raise RuntimeError('open() requires mode "r", "U", or "rU"') + if pwd and not isinstance(pwd, bytes): + raise TypeError("pwd: expected bytes, got %s" % type(pwd)) if not self.fp: raise RuntimeError( "Attempt to read ZIP archive that was already closed") @@ -898,8 +912,12 @@ zinfo = name else: # Get info object for name - zinfo = self.getinfo(name) - + try: + zinfo = self.getinfo(name) + except KeyError: + if not self._filePassed: + zef_file.close() + raise zef_file.seek(zinfo.header_offset, 0) # Skip the file header: @@ -912,7 +930,15 @@ if fheader[_FH_EXTRA_FIELD_LENGTH]: zef_file.read(fheader[_FH_EXTRA_FIELD_LENGTH]) - if fname != zinfo.orig_filename.encode("utf-8"): + if zinfo.flag_bits & 0x800: + # UTF-8 filename + fname_str = fname.decode("utf-8") + else: + fname_str = fname.decode("cp437") + + if fname_str != zinfo.orig_filename: + if not self._filePassed: + zef_file.close() raise BadZipFile( 'File name in directory %r and header %r differ.' % (zinfo.orig_filename, fname)) @@ -924,6 +950,8 @@ if not pwd: pwd = self.pwd if not pwd: + if not self._filePassed: + zef_file.close() raise RuntimeError("File %s is encrypted, " "password required for extraction" % name) @@ -933,8 +961,8 @@ # completely random, while the 12th contains the MSB of the CRC, # or the MSB of the file time depending on the header type # and is used to check the correctness of the password. 
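The zipfile hunks above tighten password handling: ZipFile.setpassword() and ZipFile.open(pwd=...) now reject anything that is not bytes with a TypeError instead of relying on an assert, and member names without the UTF-8 flag bit (0x800) are decoded as cp437 rather than assumed to be UTF-8. A small sketch of the new type check, using a throwaway in-memory archive (names invented for illustration):

    import io
    import zipfile

    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w') as zf:
        zf.writestr('hello.txt', b'hello world')

    zf = zipfile.ZipFile(buf)
    try:
        zf.setpassword('secret')       # str is rejected with TypeError
    except TypeError as exc:
        print(exc)                     # pwd: expected bytes, got ...
    zf.setpassword(b'secret')          # bytes are accepted
    print(zf.read('hello.txt'))        # member is unencrypted, pwd unused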
- bytes = zef_file.read(12) - h = list(map(zd, bytes[0:12])) + header = zef_file.read(12) + h = list(map(zd, header[0:12])) if zinfo.flag_bits & 0x8: # compare against the file type from extended local headers check_byte = (zinfo._raw_time >> 8) & 0xff @@ -942,9 +970,12 @@ # compare against the CRC otherwise check_byte = (zinfo.CRC >> 24) & 0xff if h[11] != check_byte: + if not self._filePassed: + zef_file.close() raise RuntimeError("Bad password for file", name) - return ZipExtFile(zef_file, mode, zinfo, zd) + return ZipExtFile(zef_file, mode, zinfo, zd, + close_fileobj=not self._filePassed) def extract(self, member, path=None, pwd=None): """Extract a member from the archive to the current working directory, @@ -1276,6 +1307,12 @@ class PyZipFile(ZipFile): """Class to create ZIP archives with Python library files and packages.""" + def __init__(self, file, mode="r", compression=ZIP_STORED, + allowZip64=False, optimize=-1): + ZipFile.__init__(self, file, mode=mode, compression=compression, + allowZip64=allowZip64) + self._optimize = optimize + def writepy(self, pathname, basename=""): """Add all files from "pathname" to the ZIP archive. @@ -1348,44 +1385,63 @@ archive name, compiling if necessary. For example, given /python/lib/string, return (/python/lib/string.pyc, string). """ + def _compile(file, optimize=-1): + import py_compile + if self.debug: + print("Compiling", file) + try: + py_compile.compile(file, doraise=True, optimize=optimize) + except py_compile.PyCompileError as error: + print(err.msg) + return False + return True + file_py = pathname + ".py" file_pyc = pathname + ".pyc" file_pyo = pathname + ".pyo" pycache_pyc = imp.cache_from_source(file_py, True) pycache_pyo = imp.cache_from_source(file_py, False) - if (os.path.isfile(file_pyo) and - os.stat(file_pyo).st_mtime >= os.stat(file_py).st_mtime): - # Use .pyo file. - arcname = fname = file_pyo - elif (os.path.isfile(file_pyc) and - os.stat(file_pyc).st_mtime >= os.stat(file_py).st_mtime): - # Use .pyc file. - arcname = fname = file_pyc - elif (os.path.isfile(pycache_pyc) and - os.stat(pycache_pyc).st_mtime >= os.stat(file_py).st_mtime): - # Use the __pycache__/*.pyc file, but write it to the legacy pyc - # file name in the archive. - fname = pycache_pyc - arcname = file_pyc - elif (os.path.isfile(pycache_pyo) and - os.stat(pycache_pyo).st_mtime >= os.stat(file_py).st_mtime): - # Use the __pycache__/*.pyo file, but write it to the legacy pyo - # file name in the archive. - fname = pycache_pyo - arcname = file_pyo + if self._optimize == -1: + # legacy mode: use whatever file is present + if (os.path.isfile(file_pyo) and + os.stat(file_pyo).st_mtime >= os.stat(file_py).st_mtime): + # Use .pyo file. + arcname = fname = file_pyo + elif (os.path.isfile(file_pyc) and + os.stat(file_pyc).st_mtime >= os.stat(file_py).st_mtime): + # Use .pyc file. + arcname = fname = file_pyc + elif (os.path.isfile(pycache_pyc) and + os.stat(pycache_pyc).st_mtime >= os.stat(file_py).st_mtime): + # Use the __pycache__/*.pyc file, but write it to the legacy pyc + # file name in the archive. + fname = pycache_pyc + arcname = file_pyc + elif (os.path.isfile(pycache_pyo) and + os.stat(pycache_pyo).st_mtime >= os.stat(file_py).st_mtime): + # Use the __pycache__/*.pyo file, but write it to the legacy pyo + # file name in the archive. + fname = pycache_pyo + arcname = file_pyo + else: + # Compile py into PEP 3147 pyc file. 
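The PyZipFile hunk above adds an optimize parameter so callers can pick the bytecode optimization level that gets archived, while the default of -1 keeps the old "use whatever compiled file is already present" behavior. A hedged usage sketch with a throwaway one-module package (all names invented):

    import os
    import zipfile

    os.makedirs('demo_pkg', exist_ok=True)
    with open('demo_pkg/__init__.py', 'w') as f:
        f.write('ANSWER = 42\n')

    # optimize=0 compiles and stores ordinary .pyc files for the package.
    with zipfile.PyZipFile('demo_pkg.zip', 'w', optimize=0) as pzf:
        pzf.writepy('demo_pkg')
        print(pzf.namelist())          # e.g. ['demo_pkg/__init__.pyc']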
+ if _compile(file_py): + fname = (pycache_pyc if __debug__ else pycache_pyo) + arcname = (file_pyc if __debug__ else file_pyo) + else: + fname = arcname = file_py else: - # Compile py into PEP 3147 pyc file. - import py_compile - if self.debug: - print("Compiling", file_py) - try: - py_compile.compile(file_py, doraise=True) - except py_compile.PyCompileError as error: - print(err.msg) - fname = file_py + # new mode: use given optimization level + if self._optimize == 0: + fname = pycache_pyc + arcname = file_pyc else: - fname = (pycache_pyc if __debug__ else pycache_pyo) - arcname = (file_pyc if __debug__ else file_pyo) + fname = pycache_pyo + arcname = file_pyo + if not (os.path.isfile(fname) and + os.stat(fname).st_mtime >= os.stat(file_py).st_mtime): + if not _compile(file_py, optimize=self._optimize): + fname = arcname = file_py archivename = os.path.split(arcname)[1] if basename: archivename = "%s/%s" % (basename, archivename) Modified: python/branches/pep-3151/Mac/BuildScript/README.txt ============================================================================== --- python/branches/pep-3151/Mac/BuildScript/README.txt (original) +++ python/branches/pep-3151/Mac/BuildScript/README.txt Sat Feb 26 08:16:32 2011 @@ -1,78 +1,147 @@ -Building a MacPython distribution -================================= +Building a Python Mac OS X distribution +======================================= -The ``build-install.py`` script creates MacPython distributions, including -sleepycat db4, sqlite3 and readline support. It builds a complete +The ``build-install.py`` script creates Python distributions, including +certain third-party libraries as necessary. It builds a complete framework-based Python out-of-tree, installs it in a funny place with $DESTROOT, massages that installation to remove .pyc files and such, creates an Installer package from the installation plus other files in ``resources`` and ``scripts`` and placed that on a ``.dmg`` disk image. -Prerequisites -------------- +As of Python 2.7.x and 3.2, PSF practice is to build two installer variants +for each release: -* A MacOS X 10.4 (or later) +1. 32-bit-only, i386 and PPC universal, capable on running on all machines + supported by Mac OS X 10.3.9 through (at least) 10.6:: -* XCode 2.2 (or later), with the universal SDK + python build-installer.py \ + --sdk-path=/Developer/SDKs/MacOSX10.4u.sdk \ + --universal-archs=32-bit \ + --dep-target=10.3 + # These are the current default options -* No Fink (in ``/sw``) or DarwinPorts (in ``/opt/local``), those could + - builds the following third-party libraries + + * Bzip2 + * Zlib 1.2.3 + * GNU Readline (GPL) + * SQLite 3 + * NCurses + * Oracle Sleepycat DB 4.8 (Python 2.x only) + + - requires ActiveState ``Tcl/Tk 8.4`` (currently 8.4.19) to be installed for building + + - current target build environment: + + * Mac OS X 10.5.8 PPC or Intel + * Xcode 3.1.4 (or later) + * ``MacOSX10.4u`` SDK (later SDKs do not support PPC G3 processors) + * ``MACOSX_DEPLOYMENT_TARGET=10.3`` + * Apple ``gcc-4.0`` + * Python 2.n (n >= 4) for documentation build with Sphinx + + - alternate build environments: + + * Mac OS X 10.4.11 with Xcode 2.5 + * Mac OS X 10.6.6 with Xcode 3.2.5 + - need to change ``/System/Library/Frameworks/{Tcl,Tk}.framework/Version/Current`` to ``8.4`` + +2. 
64-bit / 32-bit, x86_64 and i386 universal, for OS X 10.6 (and later):: + + python build-installer.py \ + --sdk-path=/Developer/SDKs/MacOSX10.6.sdk \ + --universal-archs=intel \ + --dep-target=10.6 + + - uses system-supplied versions of third-party libraries + + * readline module links with Apple BSD editline (libedit) + * builds Oracle Sleepycat DB 4.8 (Python 2.x only) + + - requires ActiveState Tcl/Tk 8.5.9 (or later) to be installed for building + + - current target build environment: + + * Mac OS X 10.6.6 (or later) + * Xcode 3.2.5 (or later) + * ``MacOSX10.6`` SDK + * ``MACOSX_DEPLOYMENT_TARGET=10.6`` + * Apple ``gcc-4.2`` + * Python 2.n (n >= 4) for documentation build with Sphinx + + - alternate build environments: + + * none + + +General Prerequisites +--------------------- + +* No Fink (in ``/sw``) or MacPorts (in ``/opt/local``) or other local + libraries or utilities (in ``/usr/local``) as they could interfere with the build. -* The documentation for the release must be available on python.org +* The documentation for the release is built using Sphinx because it is included in the installer. +* It is safest to start each variant build with an empty source directory + populated with a fresh copy of the untarred source. + The Recipe ---------- -Here are the steps you need to follow to build a MacPython installer: +Here are the steps you need to follow to build a Python installer: -* Run ``./build-installer.py``. Optionally you can pass a number of arguments - to specify locations of various files. Please see the top of +* Run ``build-installer.py``. Optionally you can pass a number of arguments + to specify locations of various files. Please see the top of ``build-installer.py`` for its usage. - Running this script takes some time, I will not only build Python itself + Running this script takes some time, it will not only build Python itself but also some 3th-party libraries that are needed for extensions. * When done the script will tell you where the DMG image is (by default somewhere in ``/tmp/_py``). -Building a 4-way universal installer -.................................... +Building other universal installers +................................... It is also possible to build a 4-way universal installer that runs on -OSX Leopard or later:: +OS X Leopard or later:: - $ ./build-installer.py --dep-target=10.5 --universal-archs=all --sdk=/ + python 2.6 /build-installer.py \ + --dep-target=10.5 + --universal-archs=all + --sdk-path=/Developer/SDKs/MacOSX10.5.sdk + +This requires that the deployment target is 10.5, and hence +also that you are building on at least OS X 10.5. 4-way includes +``i386``, ``x86_64``, ``ppc``, and ``ppc64`` (G5). ``ppc64`` executable +variants can only be run on G5 machines running 10.5. Note that, +while OS X 10.6 is only supported on Intel-based machines, it is possible +to run ``ppc`` (32-bit) executables unmodified thanks to the Rosetta ppc +emulation in OS X 10.5 and 10.6. + +Other ``--universal-archs`` options are ``64-bit`` (``x86_64``, ``ppc64``), +and ``3-way`` (``ppc``, ``i386``, ``x86_64``). None of these options +are regularly exercised; use at your own risk. -This requires that the deployment target is 10.5 (or later), and hence -also that your building on at least OSX 10.5. Testing ------- -The resulting binaries should work on MacOSX 10.3.9 or later. I usually run -the installer on a 10.3.9, a 10.4.x PPC and a 10.4.x Intel system and then -run the testsuite to make sure. 
- - -Announcements -------------- - -(This is mostly of historic interest) - -When all is done, announcements can be posted to at least the following -places: -- pythonmac-sig at python.org -- python-dev at python.org -- python-announce at python.org -- archivist at info-mac.org -- adcnews at apple.com -- news at macnn.com -- http://www.macupdate.com -- http://guide.apple.com/usindex.lasso -- http://www.apple.com/downloads/macosx/submit -- http://www.versiontracker.com/ (userid Jack.Jansen at oratrix.com) -- http://www.macshareware.net (userid jackjansen) +Ideally, the resulting binaries should be installed and the test suite run +on all supported OS X releases and architectures. As a practical matter, +that is generally not possible. At a minimum, variant 1 should be run on +at least one Intel, one PPC G4, and one PPC G3 system and one each of +OS X 10.6, 10.5, 10.4, and 10.3.9. Not all tests run on 10.3.9. +Variant 2 should be run on 10.6 in both 32-bit and 64-bit modes.:: + + arch -i386 /usr/local/bin/pythonn.n -m test.regrtest -w -u all + arch -X86_64 /usr/local/bin/pythonn.n -m test.regrtest -w -u all + +Certain tests will be skipped and some cause the interpreter to fail +which will likely generate ``Python quit unexpectedly`` alert messages +to be generated at several points during a test run. These can +be ignored. -Also, check out Stephan Deibels http://pythonology.org/market contact list Modified: python/branches/pep-3151/Mac/BuildScript/build-installer.py ============================================================================== --- python/branches/pep-3151/Mac/BuildScript/build-installer.py (original) +++ python/branches/pep-3151/Mac/BuildScript/build-installer.py Sat Feb 26 08:16:32 2011 @@ -1,12 +1,14 @@ #!/usr/bin/python """ -This script is used to build the "official unofficial" universal build on -Mac OS X. It requires Mac OS X 10.4, Xcode 2.2 and the 10.4u SDK to do its -work. 64-bit or four-way universal builds require at least OS X 10.5 and -the 10.5 SDK. - -Please ensure that this script keeps working with Python 2.3, to avoid -bootstrap issues (/usr/bin/python is Python 2.3 on OSX 10.4) +This script is used to build "official" universal installers on Mac OS X. +It requires at least Mac OS X 10.4, Xcode 2.2 and the 10.4u SDK for +32-bit builds. 64-bit or four-way universal builds require at least +OS X 10.5 and the 10.5 SDK. + +Please ensure that this script keeps working with Python 2.5, to avoid +bootstrap issues (/usr/bin/python is Python 2.5 on OSX 10.5). Sphinx, +which is used to build the documentation, currently requires at least +Python 2.4. Usage: see USAGE variable in the script. """ @@ -144,9 +146,9 @@ if DEPTARGET < '10.5': result.extend([ dict( - name="Bzip2 1.0.5", - url="http://www.bzip.org/1.0.5/bzip2-1.0.5.tar.gz", - checksum='3c15a0c8d1d3ee1c46a1634d00617b1a', + name="Bzip2 1.0.6", + url="http://bzip.org/1.0.6/bzip2-1.0.6.tar.gz", + checksum='00b516f4704d4a7cb50a1d97e6e8e15b', configure=None, install='make install CC=%s PREFIX=%s/usr/local/ CFLAGS="-arch %s -isysroot %s"'%( CC, @@ -169,29 +171,33 @@ ), dict( # Note that GNU readline is GPL'd software - name="GNU Readline 5.1.4", - url="http://ftp.gnu.org/pub/gnu/readline/readline-5.1.tar.gz" , - checksum='7ee5a692db88b30ca48927a13fd60e46', + name="GNU Readline 6.1.2", + url="http://ftp.gnu.org/pub/gnu/readline/readline-6.1.tar.gz" , + checksum='fc2f7e714fe792db1ce6ddc4c9fb4ef3', patchlevel='0', patches=[ # The readline maintainers don't do actual micro releases, but # just ship a set of patches. 
- 'http://ftp.gnu.org/pub/gnu/readline/readline-5.1-patches/readline51-001', - 'http://ftp.gnu.org/pub/gnu/readline/readline-5.1-patches/readline51-002', - 'http://ftp.gnu.org/pub/gnu/readline/readline-5.1-patches/readline51-003', - 'http://ftp.gnu.org/pub/gnu/readline/readline-5.1-patches/readline51-004', + 'http://ftp.gnu.org/pub/gnu/readline/readline-6.1-patches/readline61-001', + 'http://ftp.gnu.org/pub/gnu/readline/readline-6.1-patches/readline61-002', ] ), dict( - name="SQLite 3.6.11", - url="http://www.sqlite.org/sqlite-3.6.11.tar.gz", - checksum='7ebb099696ab76cc6ff65dd496d17858', + name="SQLite 3.7.4", + url="http://www.sqlite.org/sqlite-autoconf-3070400.tar.gz", + checksum='8f0c690bfb33c3cbbc2471c3d9ba0158', + configure_env=('CFLAGS="-Os' + ' -DSQLITE_ENABLE_FTS3' + ' -DSQLITE_ENABLE_FTS3_PARENTHESIS' + ' -DSQLITE_ENABLE_RTREE' + ' -DSQLITE_TCL=0' + '"'), configure_pre=[ '--enable-threadsafe', - '--enable-tempstore', '--enable-shared=no', '--enable-static=yes', - '--disable-tcl', + '--disable-readline', + '--disable-dependency-tracking', ] ), dict( @@ -199,6 +205,7 @@ url="http://ftp.gnu.org/pub/gnu/ncurses/ncurses-5.5.tar.gz", checksum='e73c1ac10b4bfc46db43b2ddfd6244ef', configure_pre=[ + "--enable-widec", "--without-cxx", "--without-ada", "--without-progs", @@ -225,18 +232,19 @@ ), ]) - result.extend([ - dict( - name="Sleepycat DB 4.7.25", - url="http://download.oracle.com/berkeley-db/db-4.7.25.tar.gz", - checksum='ec2b87e833779681a0c3a814aa71359e', - buildDir="build_unix", - configure="../dist/configure", - configure_pre=[ - '--includedir=/usr/local/include/db4', - ] - ), - ]) + if not PYTHON_3: + result.extend([ + dict( + name="Sleepycat DB 4.7.25", + url="http://download.oracle.com/berkeley-db/db-4.7.25.tar.gz", + checksum='ec2b87e833779681a0c3a814aa71359e', + buildDir="build_unix", + configure="../dist/configure", + configure_pre=[ + '--includedir=/usr/local/include/db4', + ] + ), + ]) return result @@ -399,6 +407,9 @@ Check that we're running on a supported system. """ + if sys.version_info[0:2] < (2, 4): + fatal("This script must be run with Python 2.4 or later") + if platform.system() != 'Darwin': fatal("This script should be run on a Mac OS X 10.4 (or later) system") @@ -418,15 +429,16 @@ # to install a newer patch level. for framework in ['Tcl', 'Tk']: - fw = dict(lower=framework.lower(), - upper=framework.upper(), - cap=framework.capitalize()) - fwpth = "Library/Frameworks/%(cap)s.framework/%(lower)sConfig.sh" % fw - sysfw = os.path.join('/System', fwpth) + #fw = dict(lower=framework.lower(), + # upper=framework.upper(), + # cap=framework.capitalize()) + #fwpth = "Library/Frameworks/%(cap)s.framework/%(lower)sConfig.sh" % fw + fwpth = 'Library/Frameworks/Tcl.framework/Versions/Current' + sysfw = os.path.join(SDKPATH, 'System', fwpth) libfw = os.path.join('/', fwpth) usrfw = os.path.join(os.getenv('HOME'), fwpth) - version = "%(upper)s_VERSION" % fw - if getTclTkVersion(libfw, version) != getTclTkVersion(sysfw, version): + #version = "%(upper)s_VERSION" % fw + if os.readlink(libfw) != os.readlink(sysfw): fatal("Version of %s must match %s" % (libfw, sysfw) ) if os.path.exists(usrfw): fatal("Please rename %s to avoid possible dynamic load issues." 
@@ -696,6 +708,9 @@ configure_args.insert(0, configure) configure_args = [ shellQuote(a) for a in configure_args ] + if 'configure_env' in recipe: + configure_args.insert(0, recipe['configure_env']) + print "Running configure for %s"%(name,) runCommand(' '.join(configure_args) + ' 2>&1') @@ -751,9 +766,9 @@ shutil.rmtree(buildDir) if os.path.exists(rootDir): shutil.rmtree(rootDir) - os.mkdir(buildDir) - os.mkdir(rootDir) - os.mkdir(os.path.join(rootDir, 'empty-dir')) + os.makedirs(buildDir) + os.makedirs(rootDir) + os.makedirs(os.path.join(rootDir, 'empty-dir')) curdir = os.getcwd() os.chdir(buildDir) @@ -825,12 +840,33 @@ os.chmod(p, stat.S_IMODE(st.st_mode) | stat.S_IWGRP) os.chown(p, -1, gid) + if PYTHON_3: + LDVERSION=None + VERSION=None + ABIFLAGS=None + + fp = open(os.path.join(buildDir, 'Makefile'), 'r') + for ln in fp: + if ln.startswith('VERSION='): + VERSION=ln.split()[1] + if ln.startswith('ABIFLAGS='): + ABIFLAGS=ln.split()[1] + if ln.startswith('LDVERSION='): + LDVERSION=ln.split()[1] + fp.close() + + LDVERSION = LDVERSION.replace('$(VERSION)', VERSION) + LDVERSION = LDVERSION.replace('$(ABIFLAGS)', ABIFLAGS) + config_suffix = '-' + LDVERSION + else: + config_suffix = '' # Python 2.x + # We added some directories to the search path during the configure # phase. Remove those because those directories won't be there on # the end-users system. path =os.path.join(rootDir, 'Library', 'Frameworks', 'Python.framework', 'Versions', version, 'lib', 'python%s'%(version,), - 'config', 'Makefile') + 'config' + config_suffix, 'Makefile') fp = open(path, 'r') data = fp.read() fp.close() Modified: python/branches/pep-3151/Mac/BuildScript/resources/ReadMe.txt ============================================================================== --- python/branches/pep-3151/Mac/BuildScript/resources/ReadMe.txt (original) +++ python/branches/pep-3151/Mac/BuildScript/resources/ReadMe.txt Sat Feb 26 08:16:32 2011 @@ -1,29 +1,36 @@ This package will install Python $FULL_VERSION for Mac OS X -$MACOSX_DEPLOYMENT_TARGET for the following -architecture(s): $ARCHITECTURES. +$MACOSX_DEPLOYMENT_TARGET for the following architecture(s): +$ARCHITECTURES. -Separate installers are available for older versions -of Mac OS X, see the homepage, below. +Installation requires approximately $INSTALL_SIZE MB of disk space, +ignore the message that it will take zero bytes. -Installation requires approximately $INSTALL_SIZE MB of disk -space, ignore the message that it will take zero bytes. +You must install onto your current boot disk, even though the +installer does not enforce this, otherwise things will not work. -You must install onto your current boot disk, even -though the installer does not enforce this, otherwise -things will not work. - -Python consists of the Python programming language -interpreter, plus a set of programs to allow easy -access to it for Mac users including an integrated development -environment, IDLE, plus a set of pre-built extension modules -that open up specific Macintosh technologies to Python programs. - -The installer puts the applications in "Python $VERSION" -in your Applications folder, and the underlying machinery in -$PYTHONFRAMEWORKINSTALLDIR. It can optionally place -links to the command-line tools in /usr/local as well, -by default you have to add the "bin" directory inside -the framework to you shell's search path. 
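The build-installer.py hunk above recovers VERSION, ABIFLAGS and LDVERSION from the generated Makefile because, for Python 3 framework builds, the config directory is now named config-$(LDVERSION) (for example config-3.2m) instead of plain config. A rough standalone illustration of that substitution, using made-up values rather than a real Makefile:

    # Hypothetical values as configure might write them into the Makefile.
    makefile_vars = {
        'VERSION': '3.2',
        'ABIFLAGS': 'm',
        'LDVERSION': '$(VERSION)$(ABIFLAGS)',
    }

    ldversion = makefile_vars['LDVERSION']
    ldversion = ldversion.replace('$(VERSION)', makefile_vars['VERSION'])
    ldversion = ldversion.replace('$(ABIFLAGS)', makefile_vars['ABIFLAGS'])
    config_suffix = '-' + ldversion
    print(config_suffix)               # -3.2m, as in lib/python3.2/config-3.2m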
+Python consists of the Python programming language interpreter, plus +a set of programs to allow easy access to it for Mac users including +an integrated development environment, IDLE, plus a set of pre-built +extension modules that open up specific Macintosh technologies to +Python programs. + + **** IMPORTANT **** + +Before using IDLE or other programs using the tkinter graphical user +interface toolkit, visit http://www.python.org/download/mac/tcltk/ +for current information about supported and recommended versions +of Tcl/Tk for this version of Python and Mac OS X. + + ******************* + +The installer puts applications, an "Update Shell Profile" command, +and a link to the optionally installed Python Documentation into the +"Python $VERSION" subfolder of the system Applications folder, +and puts the underlying machinery into the folder +$PYTHONFRAMEWORKINSTALLDIR. It can +optionally place links to the command-line tools in /usr/local/bin as +well. Double-click on the "Update Shell Profile" command to add the +"bin" directory inside the framework to your shell's search path. More information on Python in general can be found at http://www.python.org. Modified: python/branches/pep-3151/Mac/BuildScript/resources/Welcome.rtf ============================================================================== --- python/branches/pep-3151/Mac/BuildScript/resources/Welcome.rtf (original) +++ python/branches/pep-3151/Mac/BuildScript/resources/Welcome.rtf Sat Feb 26 08:16:32 2011 @@ -1,19 +1,35 @@ -{\rtf1\ansi\ansicpg1252\cocoartf949\cocoasubrtf430 +{\rtf1\ansi\ansicpg1252\cocoartf1038\cocoasubrtf350 {\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} -\paperw11900\paperh16840\margl1440\margr1440\vieww9920\viewh10660\viewkind0 +\paperw11904\paperh16836\margl1440\margr1440\vieww9640\viewh10620\viewkind0 \pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\ql\qnatural \f0\fs24 \cf0 This package will install -\b MacPython $FULL_VERSION +\b Python $FULL_VERSION \b0 for \b Mac OS X $MACOSX_DEPLOYMENT_TARGET \b0 .\ \ -MacPython consists of the Python programming language interpreter, plus a set of programs to allow easy access to it for Mac users including an integrated development environment \b IDLE\b0 plus a set of pre-built extension modules that open up specific Macintosh technologies to Python programs.\ + +\b Python for Mac OS X +\b0 consists of the Python programming language interpreter, plus a set of programs to allow easy access to it for Mac OS X users including an integrated development environment +\b IDLE +\b0 and a set of pre-built extension modules that open up specific Macintosh technologies to Python programs.\ \ -See the ReadMe file for more information.\ +See the ReadMe file and the Python documentation for more information.\ \ \b NOTE: -\b0 This package will by default not update your shell profile and will also not install files in /usr/local. Double-click \b Update Shell Profile\b0 at any time to make $FULL_VERSION the default Python.} +\b0 This package will by default not update your shell profile and will also not install files in /usr/local. Double-click +\b Update Shell Profile +\b0 at any time to make $FULL_VERSION the default Python.\ +\ + +\b IMPORTANT: +\b0 +\b IDLE +\b0 and other programs using the +\b tkinter +\b0 graphical user interface toolkit require specific versions of the +\b Tcl/Tk +\b0 platform independent windowing toolkit. 
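The ReadMe and Welcome texts above now point users at python.org for the supported Tcl/Tk versions rather than hard-coding one. For reference, the Tcl/Tk actually linked against can be checked from Python itself; a small sketch (requires a running window server, output values are only examples):

    import tkinter

    print(tkinter.TclVersion, tkinter.TkVersion)   # e.g. 8.5 8.5
    root = tkinter.Tk()
    print(root.tk.call('info', 'patchlevel'))      # e.g. 8.5.9
    root.destroy()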
Visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for current information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file Modified: python/branches/pep-3151/Mac/BuildScript/scripts/postflight.documentation ============================================================================== --- python/branches/pep-3151/Mac/BuildScript/scripts/postflight.documentation (original) +++ python/branches/pep-3151/Mac/BuildScript/scripts/postflight.documentation Sat Feb 26 08:16:32 2011 @@ -1,11 +1,32 @@ #!/bin/sh PYVER="@PYVER@" +FWK="/Library/Frameworks/Python.framework/Versions/${PYVER}" +FWK_DOCDIR_SUBPATH="Resources/English.lproj/Documentation" +FWK_DOCDIR="${FWK}/${FWK_DOCDIR_SUBPATH}" +APPDIR="/Applications/Python ${PYVER}" +DEV_DOCDIR="/Developer/Documentation" +SHARE_DIR="${FWK}/share" +SHARE_DOCDIR="${SHARE_DIR}/doc/python${PYVER}" +SHARE_DOCDIR_TO_FWK="../../.." -if [ -d /Developer/Documentation ]; then - if [ ! -d /Developer/Documentation/Python ]; then - mkdir -p /Developer/Documentation/Python - fi +# make link in /Developer/Documentation/ for Xcode users +if [ -d "${DEV_DOCDIR}" ]; then + if [ ! -d "${DEV_DOCDIR}/Python" ]; then + mkdir -p "${DEV_DOCDIR}/Python" + fi + ln -fhs "${FWK_DOCDIR}" "${DEV_DOCDIR}/Python/Reference Documentation ${PYVER}" +fi + +# make link in /Applications/Python m.n/ for Finder users +if [ -d "${APPDIR}" ]; then + ln -fhs "${FWK_DOCDIR}/index.html" "${APPDIR}/Python Documentation.html" +fi - ln -fhs /Library/Frameworks/Python.framework/Versions/${PYVER}/Resources/English.lproj/Documentation "/Developer/Documentation/Python/Reference Documentation @PYVER@" +# make share/doc link in framework for command line users +if [ -d "${SHARE_DIR}" ]; then + mkdir -p "${SHARE_DOCDIR}" + # make relative link to html doc directory + ln -fhs "${SHARE_DOCDIR_TO_FWK}/${FWK_DOCDIR_SUBPATH}" "${SHARE_DOCDIR}/html" fi + Deleted: python/branches/pep-3151/Mac/Extras.ReadMe.txt ============================================================================== --- python/branches/pep-3151/Mac/Extras.ReadMe.txt Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,5 +0,0 @@ -This folder contains examples of Python usage and useful scripts and tools. - -You should be aware that these are not Macintosh-specific but are shared -among Python on all platforms, so there are some that only run on Windows -or Unix or another platform, and/or make little sense on a Macintosh. Modified: python/branches/pep-3151/Mac/IDLE/IDLE.app/Contents/Info.plist ============================================================================== --- python/branches/pep-3151/Mac/IDLE/IDLE.app/Contents/Info.plist (original) +++ python/branches/pep-3151/Mac/IDLE/IDLE.app/Contents/Info.plist Sat Feb 26 08:16:32 2011 @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ?? 2001-2008 Python Software Foundation + %version%, ?? 2001-2011 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier Deleted: python/branches/pep-3151/Mac/IDLE/idlemain.py ============================================================================== --- python/branches/pep-3151/Mac/IDLE/idlemain.py Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,30 +0,0 @@ -""" -Bootstrap script for IDLE as an application bundle. 
-""" -import sys, os - -from idlelib.PyShell import main - -# Change the current directory the user's home directory, that way we'll get -# a more useful default location in the open/save dialogs. -os.chdir(os.path.expanduser('~/Documents')) - - -# Make sure sys.executable points to the python interpreter inside the -# framework, instead of at the helper executable inside the application -# bundle (the latter works, but doesn't allow access to the window server) -if sys.executable.endswith('-32'): - sys.executable = os.path.join(sys.prefix, 'bin', 'python-32') -else: - sys.executable = os.path.join(sys.prefix, 'bin', 'python') - -# Look for the -psn argument that the launcher adds and remove it, it will -# only confuse the IDLE startup code. -for idx, value in enumerate(sys.argv): - if value.startswith('-psn_'): - del sys.argv[idx] - break - -#argvemulator.ArgvCollector().mainloop() -if __name__ == '__main__': - main() Modified: python/branches/pep-3151/Mac/Makefile.in ============================================================================== --- python/branches/pep-3151/Mac/Makefile.in (original) +++ python/branches/pep-3151/Mac/Makefile.in Sat Feb 26 08:16:32 2011 @@ -47,8 +47,7 @@ compileall=$(srcdir)/../Lib/compileall.py installapps: install_Python install_pythonw install_PythonLauncher install_IDLE \ - checkapplepython install_versionedtools - + checkapplepython install_pythonw: pythonw $(INSTALL_PROGRAM) $(STRIPFLAG) pythonw "$(DESTDIR)$(prefix)/bin/pythonw$(VERSION)" @@ -92,27 +91,6 @@ ln -fs "$(prefix)/bin/$${fn}" "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin/$${fn}" ;\ done -# By default most tools are installed without a version in their basename, to -# make it easier to install (and use) several python versions side-by-side move -# the tools to a version-specific name and add the non-versioned name as an -# alias. -install_versionedtools: - for fn in idle pydoc ;\ - do \ - if [ -h "$(DESTDIR)$(prefix)/bin/$${fn}3" ]; then \ - continue ;\ - fi ;\ - mv "$(DESTDIR)$(prefix)/bin/$${fn}3" "$(DESTDIR)$(prefix)/bin/$${fn}$(VERSION)" ;\ - ln -sf "$${fn}$(VERSION)" "$(DESTDIR)$(prefix)/bin/$${fn}3" ;\ - done - mv "$(DESTDIR)$(prefix)/bin/2to3" "$(DESTDIR)$(prefix)/bin/2to3-$(VERSION)" - ln -sf "2to3-$(VERSION)" "$(DESTDIR)$(prefix)/bin/2to3" - if [ ! -h "$(DESTDIR)$(prefix)/bin/python3-config" ]; then \ - mv "$(DESTDIR)$(prefix)/bin/python3-config" "$(DESTDIR)$(prefix)/bin/python$(VERSION)-config" ;\ - ln -sf "python$(VERSION)-config" "$(DESTDIR)$(prefix)/bin/python3-config" ; \ - fi - - pythonw: $(srcdir)/Tools/pythonw.c Makefile $(CC) $(LDFLAGS) -DPYTHONFRAMEWORK='"$(PYTHONFRAMEWORK)"' -o $@ $(srcdir)/Tools/pythonw.c -I.. 
-I$(srcdir)/../Include ../$(PYTHONFRAMEWORK).framework/Versions/$(VERSION)/$(PYTHONFRAMEWORK) @@ -199,13 +177,11 @@ $(INSTALLED_PYTHONAPP): install_Python -installextras: $(srcdir)/Extras.ReadMe.txt $(srcdir)/Extras.install.py - $(INSTALL) -d "$(DESTDIR)$(PYTHONAPPSDIR)/Extras" - $(INSTALL) $(srcdir)/Extras.ReadMe.txt "$(DESTDIR)$(PYTHONAPPSDIR)/Extras/ReadMe.txt" - $(RUNSHARED) $(BUILDPYTHON) $(srcdir)/Extras.install.py $(srcdir)/../Demo \ - "$(DESTDIR)$(PYTHONAPPSDIR)/Extras/Demo" - $(RUNSHARED) $(BUILDPYTHON) $(srcdir)/Extras.install.py $(srcdir)/Demo \ - "$(DESTDIR)$(PYTHONAPPSDIR)/Extras/Demo.Mac" +installextras: $(srcdir)/Extras.install.py + $(INSTALL) -d "$(DESTDIR)$(prefix)/share/doc/python$(VERSION)/examples" + $(RUNSHARED) $(BUILDPYTHON) $(srcdir)/Extras.install.py $(srcdir)/../Tools \ + "$(DESTDIR)$(prefix)/share/doc/python$(VERSION)/examples/Tools" ; \ + chmod -R ugo+rX,go-w "$(DESTDIR)$(prefix)/share/doc/python$(VERSION)/examples/Tools" checkapplepython: $(srcdir)/Tools/fixapplepython23.py Modified: python/branches/pep-3151/Mac/PythonLauncher/Info.plist.in ============================================================================== --- python/branches/pep-3151/Mac/PythonLauncher/Info.plist.in (original) +++ python/branches/pep-3151/Mac/PythonLauncher/Info.plist.in Sat Feb 26 08:16:32 2011 @@ -40,7 +40,7 @@ CFBundleExecutable PythonLauncher CFBundleGetInfoString - %VERSION%, ?? 2001-2008 Python Software Foundation + %VERSION%, ?? 2001-2011 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier Modified: python/branches/pep-3151/Mac/README ============================================================================== --- python/branches/pep-3151/Mac/README (original) +++ python/branches/pep-3151/Mac/README Sat Feb 26 08:16:32 2011 @@ -188,8 +188,8 @@ framework itself, the Mac subtree, the applications and the unix tools. There is an extra target frameworkinstallextras that is not part of the -normal frameworkinstall which installs the Demo and Tools directories -into "/Applications/MacPython ", this is useful for binary +normal frameworkinstall which installs the Tools directory into +"/Applications/MacPython ", this is useful for binary distributions. What do all these programs do? Modified: python/branches/pep-3151/Mac/Resources/app/Info.plist.in ============================================================================== --- python/branches/pep-3151/Mac/Resources/app/Info.plist.in (original) +++ python/branches/pep-3151/Mac/Resources/app/Info.plist.in Sat Feb 26 08:16:32 2011 @@ -20,7 +20,7 @@ CFBundleExecutable Python CFBundleGetInfoString - %version%, (c) 2004-2010 Python Software Foundation. + %version%, (c) 2004-2011 Python Software Foundation. CFBundleHelpBookFolder Documentation @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2004-2010 Python Software Foundation. + %version%, (c) 2004-2011 Python Software Foundation. CFBundleName Python CFBundlePackageType @@ -55,6 +55,6 @@ NSAppleScriptEnabled NSHumanReadableCopyright - (c) 2004 Python Software Foundation. + (c) 2011 Python Software Foundation. 
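The Makefile.pre.in hunks that follow carry the same ABI-qualified naming through the core build: headers move to include/python$(LDVERSION), the interpreter is installed as python$(LDVERSION)$(EXE) with a python$(VERSION) symlink, and the config directory becomes config-$(LDVERSION). On an installed POSIX build these values can be inspected at runtime, for instance (a sketch, not part of this patch; exact values depend on the build):

    import sys
    import sysconfig

    print(sys.abiflags)                            # e.g. 'm'
    print(sysconfig.get_config_var('LDVERSION'))   # e.g. '3.2m'
    print(sysconfig.get_path('include'))           # .../include/python3.2m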
Modified: python/branches/pep-3151/Mac/Resources/framework/Info.plist.in ============================================================================== --- python/branches/pep-3151/Mac/Resources/framework/Info.plist.in (original) +++ python/branches/pep-3151/Mac/Resources/framework/Info.plist.in Sat Feb 26 08:16:32 2011 @@ -17,9 +17,9 @@ CFBundlePackageType FMWK CFBundleShortVersionString - %VERSION%, (c) 2004-2008 Python Software Foundation. + %VERSION%, (c) 2004-2011 Python Software Foundation. CFBundleLongVersionString - %VERSION%, (c) 2004-2008 Python Software Foundation. + %VERSION%, (c) 2004-2011 Python Software Foundation. CFBundleSignature ???? CFBundleVersion Modified: python/branches/pep-3151/Makefile.pre.in ============================================================================== --- python/branches/pep-3151/Makefile.pre.in (original) +++ python/branches/pep-3151/Makefile.pre.in Sat Feb 26 08:16:32 2011 @@ -108,9 +108,8 @@ # Detailed destination directories BINLIBDEST= $(LIBDIR)/python$(VERSION) LIBDEST= $(SCRIPTDIR)/python$(VERSION) -INCLUDEPY= $(INCLUDEDIR)/python$(VERSION) -CONFINCLUDEPY= $(CONFINCLUDEDIR)/python$(VERSION) -LIBP= $(LIBDIR)/python$(VERSION) +INCLUDEPY= $(INCLUDEDIR)/python$(LDVERSION) +CONFINCLUDEPY= $(CONFINCLUDEDIR)/python$(LDVERSION) # Symbols used for using shared libraries SO= @SO@ @@ -155,7 +154,7 @@ SRCDIRS= @SRCDIRS@ # Other subdirectories -SUBDIRSTOO= Include Lib Misc Demo +SUBDIRSTOO= Include Lib Misc # Files and directories to be distributed CONFIGFILES= configure configure.in acconfig.h pyconfig.h.in Makefile.pre.in @@ -167,6 +166,7 @@ LIBRARY= @LIBRARY@ LDLIBRARY= @LDLIBRARY@ BLDLIBRARY= @BLDLIBRARY@ +PY3LIBRARY= @PY3LIBRARY@ DLLLIBRARY= @DLLLIBRARY@ LDLIBRARYDIR= @LDLIBRARYDIR@ INSTSONAME= @INSTSONAME@ @@ -421,7 +421,7 @@ # Build the interpreter -$(BUILDPYTHON): Modules/python.o $(LIBRARY) $(LDLIBRARY) +$(BUILDPYTHON): Modules/python.o $(LIBRARY) $(LDLIBRARY) $(PY3LIBRARY) $(LINKCC) $(PY_LDFLAGS) $(LINKFORSHARED) -o $@ Modules/python.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) $(LDLAST) platform: $(BUILDPYTHON) @@ -455,8 +455,11 @@ $(BLDSHARED) -o $@ $(LIBRARY_OBJS) $(MODLIBS) $(SHLIBS) $(LIBC) $(LIBM) $(LDLAST); \ fi -libpython$(VERSION).dylib: $(LIBRARY_OBJS) - $(CC) -dynamiclib -Wl,-single_module $(PY_LDFLAGS) -undefined dynamic_lookup -Wl,-install_name,$(prefix)/lib/libpython$(VERSION).dylib -Wl,-compatibility_version,$(VERSION) -Wl,-current_version,$(VERSION) -o $@ $(LIBRARY_OBJS) $(SHLIBS) $(LIBC) $(LIBM) $(LDLAST); \ +libpython3.so: libpython$(LDVERSION).so + $(BLDSHARED) -o $@ -Wl,-hl$@ $^ + +libpython$(LDVERSION).dylib: $(LIBRARY_OBJS) + $(CC) -dynamiclib -Wl,-single_module $(PY_LDFLAGS) -undefined dynamic_lookup -Wl,-install_name,$(prefix)/lib/libpython$(LDVERSION).dylib -Wl,-compatibility_version,$(VERSION) -Wl,-current_version,$(VERSION) -o $@ $(LIBRARY_OBJS) $(SHLIBS) $(LIBC) $(LIBM) $(LDLAST); \ libpython$(VERSION).sl: $(LIBRARY_OBJS) @@ -499,7 +502,6 @@ $(PYTHONFRAMEWORKDIR)/Versions/$(VERSION)/Resources/Info.plist $(LN) -fsn $(VERSION) $(PYTHONFRAMEWORKDIR)/Versions/Current $(LN) -fsn Versions/Current/$(PYTHONFRAMEWORK) $(PYTHONFRAMEWORKDIR)/$(PYTHONFRAMEWORK) - $(LN) -fsn Versions/Current/Headers $(PYTHONFRAMEWORKDIR)/Headers $(LN) -fsn Versions/Current/Resources $(PYTHONFRAMEWORKDIR)/Resources # This rule builds the Cygwin Python DLL and import library if configured @@ -579,7 +581,7 @@ $(GRAMMAR_H) $(GRAMMAR_C): Parser/pgen.stamp Parser/pgen.stamp: $(PGEN) $(GRAMMAR_INPUT) -@$(INSTALL) -d Include - -$(PGEN) 
$(GRAMMAR_INPUT) $(GRAMMAR_H) $(GRAMMAR_C) + $(PGEN) $(GRAMMAR_INPUT) $(GRAMMAR_H) $(GRAMMAR_C) -touch Parser/pgen.stamp $(PGEN): $(PGENOBJS) @@ -642,6 +644,9 @@ $(BYTESTR_DEPS) \ $(srcdir)/Objects/stringlib/formatter.h +Objects/typeobject.o: $(srcdir)/Objects/typeslots.inc +$(srcdir)/Objects/typeslots.inc: $(srcdir)/Include/typeslots.h $(srcdir)/Objects/typeslots.py + $(PYTHON) $(srcdir)/Objects/typeslots.py < $(srcdir)/Include/typeslots.h > $(srcdir)/Objects/typeslots.inc ############################################################################ # Header files @@ -834,7 +839,13 @@ else true; \ fi; \ done - $(INSTALL_PROGRAM) $(BUILDPYTHON) $(DESTDIR)$(BINDIR)/python$(VERSION)$(EXE) + $(INSTALL_PROGRAM) $(BUILDPYTHON) $(DESTDIR)$(BINDIR)/python$(LDVERSION)$(EXE) + -if test "$(VERSION)" != "$(LDVERSION)"; then \ + if test -f $(DESTDIR)$(BINDIR)/$(PYTHON)$(VERSION)$(EXE) -o -h $(DESTDIR)$(BINDIR)/$(PYTHON)$(VERSION)$(EXE); \ + then rm -f $(DESTDIR)$(BINDIR)/python$(VERSION)$(EXE); \ + fi; \ + (cd $(DESTDIR)$(BINDIR); $(LN) python$(LDVERSION)$(EXE) python$(VERSION)$(EXE)); \ + fi if test -f $(LDLIBRARY); then \ if test -n "$(DLLLIBRARY)" ; then \ $(INSTALL_SHARED) $(DLLLIBRARY) $(DESTDIR)$(BINDIR); \ @@ -844,6 +855,9 @@ (cd $(DESTDIR)$(LIBDIR); $(LN) -sf $(INSTSONAME) $(LDLIBRARY)) \ fi \ fi; \ + if test -n "$(PY3LIBRARY)"; then \ + $(INSTALL_SHARED) $(PY3LIBRARY) $(DESTDIR)$(LIBDIR)/$(PY3LIBRARY); \ + fi; \ else true; \ fi @@ -853,10 +867,22 @@ else true; \ fi (cd $(DESTDIR)$(BINDIR); $(LN) python$(VERSION)$(EXE) $(PYTHON)3$(EXE)) + -if test "$(VERSION)" != "$(LDVERSION)"; then \ + rm -f $(DESTDIR)$(BINDIR)/python$(VERSION)-config; \ + (cd $(DESTDIR)$(BINDIR); $(LN) -s python$(LDVERSION)-config python$(VERSION)-config); \ + rm -f $(DESTDIR)$(LIBPC)/python-$(LDVERSION).pc; \ + (cd $(DESTDIR)$(LIBPC); $(LN) -s python-$(VERSION).pc python-$(LDVERSION).pc); \ + fi -rm -f $(DESTDIR)$(BINDIR)/python3-config (cd $(DESTDIR)$(BINDIR); $(LN) -s python$(VERSION)-config python3-config) -rm -f $(DESTDIR)$(LIBPC)/python3.pc (cd $(DESTDIR)$(LIBPC); $(LN) -s python-$(VERSION).pc python3.pc) + -rm -f $(DESTDIR)$(BINDIR)/idle3 + (cd $(DESTDIR)$(BINDIR); $(LN) -s idle$(VERSION) idle3) + -rm -f $(DESTDIR)$(BINDIR)/pydoc3 + (cd $(DESTDIR)$(BINDIR); $(LN) -s pydoc$(VERSION) pydoc3) + -rm -f $(DESTDIR)$(BINDIR)/2to3 + (cd $(DESTDIR)$(BINDIR); $(LN) -s 2to3-$(VERSION) 2to3) # Install the manual page maninstall: @@ -878,11 +904,11 @@ XMLLIBSUBDIRS= xml xml/dom xml/etree xml/parsers xml/sax LIBSUBDIRS= tkinter tkinter/test tkinter/test/test_tkinter \ tkinter/test/test_ttk site-packages test \ - test/decimaltestdata test/xmltestdata \ + test/decimaltestdata test/xmltestdata test/subprocessdata \ test/tracedmodules test/encoded_modules \ - concurrent concurrent/futures encodings \ + collections concurrent concurrent/futures encodings \ email email/mime email/test email/test/data \ - html json json/tests http dbm xmlrpc \ + html json test/json_tests http dbm xmlrpc \ sqlite3 sqlite3/test \ logging csv wsgiref urllib \ lib2to3 lib2to3/fixes lib2to3/pgen2 lib2to3/tests \ @@ -986,7 +1012,7 @@ python-config: $(srcdir)/Misc/python-config.in # Substitution happens here, as the completely-expanded BINDIR # is not available in configure - sed -e "s, at EXENAME@,$(BINDIR)/python$(VERSION)$(EXE)," < $(srcdir)/Misc/python-config.in >python-config + sed -e "s, at EXENAME@,$(BINDIR)/python$(LDVERSION)$(EXE)," < $(srcdir)/Misc/python-config.in >python-config # Install the include files INCLDIRSTOMAKE=$(INCLUDEDIR) 
$(CONFINCLUDEDIR) $(INCLUDEPY) $(CONFINCLUDEPY) @@ -1008,13 +1034,13 @@ # Install the library and miscellaneous stuff needed for extending/embedding # This goes into $(exec_prefix) -LIBPL= $(LIBP)/config +LIBPL= $(LIBDEST)/config-$(LDVERSION) # pkgconfig directory LIBPC= $(LIBDIR)/pkgconfig libainstall: all python-config - @for i in $(LIBDIR) $(LIBP) $(LIBPL) $(LIBPC); \ + @for i in $(LIBDIR) $(LIBPL) $(LIBPC); \ do \ if test ! -d $(DESTDIR)$$i; then \ echo "Creating directory $$i"; \ @@ -1044,7 +1070,7 @@ $(INSTALL_DATA) Misc/python.pc $(DESTDIR)$(LIBPC)/python-$(VERSION).pc $(INSTALL_SCRIPT) $(srcdir)/Modules/makesetup $(DESTDIR)$(LIBPL)/makesetup $(INSTALL_SCRIPT) $(srcdir)/install-sh $(DESTDIR)$(LIBPL)/install-sh - $(INSTALL_SCRIPT) python-config $(DESTDIR)$(BINDIR)/python$(VERSION)-config + $(INSTALL_SCRIPT) python-config $(DESTDIR)$(BINDIR)/python$(LDVERSION)-config rm python-config @if [ -s Modules/python.exp -a \ "`echo $(MACHDEP) | sed 's/^\(...\).*/\1/'`" = "aix" ]; then \ @@ -1102,7 +1128,7 @@ else true; \ fi; \ done - $(LN) -fsn include/python$(VERSION) $(DESTDIR)$(prefix)/Headers + $(LN) -fsn include/python$(LDVERSION) $(DESTDIR)$(prefix)/Headers sed 's/%VERSION%/'"`$(RUNSHARED) ./$(BUILDPYTHON) -c 'import platform; print(platform.python_version())'`"'/g' < $(RESSRCDIR)/Info.plist > $(DESTDIR)$(prefix)/Resources/Info.plist $(LN) -fsn $(VERSION) $(DESTDIR)$(PYTHONFRAMEWORKINSTALLDIR)/Versions/Current $(LN) -fsn Versions/Current/$(PYTHONFRAMEWORK) $(DESTDIR)$(PYTHONFRAMEWORKINSTALLDIR)/$(PYTHONFRAMEWORK) @@ -1114,8 +1140,8 @@ # Install a number of symlinks to keep software that expects a normal unix # install (which includes python-config) happy. frameworkinstallmaclib: - ln -fs "../../../$(PYTHONFRAMEWORK)" "$(DESTDIR)$(prefix)/lib/python$(VERSION)/config/libpython$(VERSION).a" - ln -fs "../../../$(PYTHONFRAMEWORK)" "$(DESTDIR)$(prefix)/lib/python$(VERSION)/config/libpython$(VERSION).dylib" + ln -fs "../../../$(PYTHONFRAMEWORK)" "$(DESTDIR)$(prefix)/lib/python$(VERSION)/config-$(LDVERSION)/libpython$(VERSION).a" + ln -fs "../../../$(PYTHONFRAMEWORK)" "$(DESTDIR)$(prefix)/lib/python$(VERSION)/config-$(LDVERSION)/libpython$(VERSION).dylib" ln -fs "../$(PYTHONFRAMEWORK)" "$(DESTDIR)$(prefix)/lib/libpython$(VERSION).dylib" # This installs the IDE, the Launcher and other apps into /Applications @@ -1129,7 +1155,7 @@ frameworkaltinstallunixtools: cd Mac && $(MAKE) altinstallunixtools DESTDIR="$(DESTDIR)" -# This installs the Demos and Tools into the applications directory. +# This installs the Tools into the applications directory. # It is not part of a normal frameworkinstall frameworkinstallextras: cd Mac && $(MAKE) installextras DESTDIR="$(DESTDIR)" @@ -1230,7 +1256,7 @@ done -rm -f core Makefile Makefile.pre config.status \ Modules/Setup Modules/Setup.local Modules/Setup.config \ - Modules/ld_so_aix Misc/python.pc + Modules/ld_so_aix Modules/python.exp Misc/python.pc -rm -f python*-gdb.py -rm -f pybuilddir.txt find $(srcdir) '(' -name '*.fdc' -o -name '*~' \ @@ -1297,3 +1323,6 @@ .PHONY: gdbhooks # IF YOU PUT ANYTHING HERE IT WILL GO AWAY +# Local Variables: +# mode: makefile +# End: Modified: python/branches/pep-3151/Misc/ACKS ============================================================================== --- python/branches/pep-3151/Misc/ACKS (original) +++ python/branches/pep-3151/Misc/ACKS Sat Feb 26 08:16:32 2011 @@ -12,12 +12,14 @@ and the list is in rough alphabetical order by last names. 
David Abrahams +Ron Adam Jim Ahlstrom Farhan Ahmad Matthew Ahrens Nir Aides Yaniv Aknin Jyrki Alakuijala +Ray Allen Billy G. Allie Kevin Altis Joe Amenta @@ -44,6 +46,7 @@ Greg Ball Luigi Ballabio Jeff Balogh +Matt Bandy Michael J. Barber Chris Barker Nick Barnes @@ -76,6 +79,7 @@ Steven Bethard Stephen Bevan Ron Bickers +Adrian von Bidder David Binger Dominic Binks Philippe Biondi @@ -235,6 +239,7 @@ Maxim Dzumanenko Walter D??rwald Hans Eckardt +Rodolpho Eckhardt Grant Edwards John Ehresman Eric Eisner @@ -257,6 +262,7 @@ David Everly Greg Ewing Martijn Faassen +Clovis Fabricio Andreas Faerber Bill Fancher Troy J. Farrell @@ -317,6 +323,8 @@ Hans de Graaff Eddy De Greef Duncan Grisby +Fabian Groffen +Eric Groo Dag Gruneau Michael Guravage Lars Gust??bel @@ -337,6 +345,7 @@ Lynda Hardman Derek Harland Jason Harper +Brian Harring Larry Hastings Shane Hathaway Rycharde Hawkes @@ -392,6 +401,7 @@ Fredrik H????rd Mihai Ibanescu Lars Immisch +Bobby Impollonia Meador Inge Tony Ingraldi John Interrante @@ -415,6 +425,7 @@ Gregory K. Johnson Simon Johnston Thomas Jollans +Nicolas Joly Evan Jones Jeremy Jones Richard Jones @@ -447,6 +458,7 @@ Steve Kirsch Sebastian Kirsche Ron Klatchko +Reid Kleckner Bastian Kleineidam Bob Kline Matthias Klose @@ -455,6 +467,7 @@ Pat Knight Greg Kochanski Damon Kohler +Vlad Korolev Joseph Koshy Maksim Kozyarchuk Stefan Krah @@ -467,6 +480,7 @@ Ivan Krsti?? Andrew Kuchling Vladimir Kushnir +Ross Lagerwall Cameron Laird Jean-Baptiste "Jiba" Lamy Torsten Landschoff @@ -497,6 +511,7 @@ Christopher Tur Lesniewski-Laas Mark Levinson William Lewis +Xuanji Li Robert van Liere Ross Light Shawn Ligocki @@ -534,6 +549,7 @@ Doug Marien Alex Martelli Anthony Martin +Owen Martin S??bastien Martini Roger Masse Nick Mathewson @@ -558,6 +574,7 @@ Mike Meyer Steven Miale Trent Mick +Stan Mihai Aristotelis Mikropoulos Damien Miller Chad Miller @@ -637,6 +654,7 @@ Gabriel de Perthuis Tim Peters Benjamin Peterson +Joe Peterson Chris Petrilli Bjorn Pettersen Geoff Philbrick @@ -663,6 +681,7 @@ Steve Purcell Fernando P??rez Eduardo P??rez +Pierre Quentel Brian Quinlan Anders Qvist Burton Radons @@ -684,6 +703,7 @@ Steven Reiz Roeland Rengelink Tim Rice +Francesco Ricciardi Jan Pieter Riegel Armin Rigo Nicholas Riley @@ -693,6 +713,7 @@ Mark Roberts Jim Robinson Andy Robinson +Mark Roddy Kevin Rodgers Giampaolo Rodola Mike Romberg @@ -721,14 +742,17 @@ George Sakkis Rich Salz Kevin Samborn +Adrian Sampson Ilya Sandler Mark Sapiro Ty Sarna Ben Sayer +Andrew Schaaf Michael Scharf Andreas Schawo Neil Schemenauer David Scherer +Bob Schmertz Gregor Schmid Ralf Schmitt Michael Schneider @@ -774,6 +798,7 @@ Dirk Soede Paul Sokolovsky Cody Somerville +Edoardo Spadolini Clay Spence Per Spilling Joshua Spoerri @@ -793,6 +818,7 @@ Ken Stox Dan Stromberg Daniel Stutzbach +Andreas St??hrk Pal Subbiah Nathan Sullivan Mark Summerfield @@ -816,7 +842,9 @@ Tobias Thelen James Thomas Robin Thomas +Jeremy Thurgood Eric Tiedemann +July Tikhonov Tracy Tims Oren Tirosh Jason Tishler @@ -848,6 +876,7 @@ Atul Varma Dmitry Vasiliev Alexandre Vassalotti +Nadeem Vawda Frank Vercruesse Mike Verdone Jaap Vermeulen @@ -864,6 +893,7 @@ Charles Waldman Richard Walker Larry Wall +Kevin Walzer Rodrigo Steinmuller Wanderley Greg Ward Barry Warsaw @@ -880,6 +910,7 @@ Rickard Westman Jeff Wheeler Christopher White +David White Mats Wichmann Truida Wiedijk Felix Wiemann Deleted: python/branches/pep-3151/Misc/AIX-NOTES ============================================================================== --- 
python/branches/pep-3151/Misc/AIX-NOTES Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,155 +0,0 @@ -Subject: AIX - Misc/AIX-NOTES -From: Vladimir Marangozov -To: guido at CNRI.Reston.Va.US (Guido van Rossum) -Date: Wed, 6 Aug 1997 11:41:00 +0200 (EET) - -============================================================================== - COMPILER INFORMATION ------------------------------------------------------------------------------- - -(1) A problem has been reported with "make test" failing because of "weird - indentation." Searching the comp.lang.python newsgroup reveals several - threads on this subject, and it seems to be a compiler bug in an old - version of the AIX CC compiler. However, the compiler/OS combination - which has this problem is not identified. In preparation for the 1.4 - release, Vladimir Marangozov (Vladimir.Marangozov at imag.fr) and Manus Hand - (mhand at csn.net) reported no such troubles for the following compilers and - operating system versions: - AIX C compiler version 3.1.2 on AIX 4.1.3 and AIX 4.1.4 - AIX C compiler version 1.3.0 on AIX 3.2.5 - If you have this problem, please report the compiler/OS version. - -(2) Stefan Esser (se at MI.Uni-Koeln.DE), in work done to compile Python - 1.0.0 on AIX 3.2.4, reports that AIX compilers don't like the LANG - environment varaiable set to European locales. This makes the compiler - generate floating point constants using "," as the decimal separator, - which the assembler doesn't understand (or perhaps it is the other way - around, with the assembler expecting, but not getting "," in float - numbers). "LANG=C; export LANG" solves the problem, as does - "LANG=C $(MAKE) ..." in the master Makefile. - -(3) The cc (or xlc) compiler considers "Python/ceval.c" too complex to - optimize, except when invoked with "-qmaxmem=4000" - -(4) Some problems (due to _AIX not being #defined) when python 1.0.0 was - compiled using 'gcc -ansi' were reported by Stefan Esser, but were not - investigated. - -(5) The cc compiler has internal variables named "__abs" and "__div". These - names are reserved and may not be used as program variables in compiled - source. (As an anecdote in support of this, the implementation of - Python/operator.c had this problem in the 1.4 beta releases, and the - solution was to re#define some core-source variables having these names, - to give these python variables different names if the build is being done - on AIX.) - -(6) As mentioned in the README, builds done immediately after previous builds - (without "make clean" or "make clobber") sometimes fail for mysterious - reasons. There are some unpredictable results when the configuration - is changed (that is, if you "configure" with different parameters) or if - intermediate changes are made to some files. Performing "make clean" or - "make clobber" resolves the problems. - -============================================================================== - THREAD SUPPORT ------------------------------------------------------------------------------- - -As of AIX version 4, there are two (incompatible) types of pthreads on AIX: - a) AIX DCE pthreads (on AIX 3.2.5) - b) AIX 4 pthreads (on AIX 4.1 and up) -Support has been added to Python to handle the distinction. - -The cc and gcc compilers do not initialize pthreads properly. The only -compilers that can initialize pthreads properly are IBM *_r* compilers, -which use the crt0_r.o module, and which invoke ld with the reentrant -version of libc (libc_r). 
- -In order to enable thread support, follow these steps: - 1. Uncomment the thread module in Modules/Setup - 2. configure --without-gcc --with-thread ... - 3. make CC="cc_r" OPT="-O -qmaxmem=4000" - -For example, to make with both threads and readline, use: - ./configure --without-gcc --with-thread --with-readline=/usr/local/lib - make CC=cc_r OPT="-O2 -qmaxmem=4000" - -If the "make" which is used ignores the "CC=cc_r" directive, one could alias -the cc command to cc_r (for example, in C-shell, perform an "alias cc cc_r"). - -Vladimir Marangozov (Vladimir.Marangozov at imag.fr) provided this information, -and he reports that a cc_r build initializes threads properly and that all -demos on threads run okay with cc_r. - -============================================================================== - SHARED LIBRARY SUPPORT ------------------------------------------------------------------------------- - -AIX shared library support was added to Python in the 1.4 release by Manus -Hand (mhand at csn.net) and Vladimir Marangozov (Vladimir.Marangozov at imag.fr). - -Python modules may now be built as shared libraries on AIX using the normal -process of uncommenting the "*shared*" line in Modules/Setup before the -build. - -AIX shared libraries require that an "export" and "import" file be provided -at compile time to list all extern symbols which may be shared between -modules. The "export" file (named python.exp) for the modules and the -libraries that belong to the Python core is created by the "makexp_aix" -script before performing the link of the python binary. It lists all global -symbols (exported during the link) of the modules and the libraries that -make up the python executable. - -When shared library modules (.so files) are made, a second shell script -is invoked. This script is named "ld_so_aix" and is also provided with -the distribution in the Modules subdirectory. This script acts as an "ld" -wrapper which hides the explicit management of "export" and "import" files; -it adds the appropriate arguments (in the appropriate order) to the link -command that creates the shared module. Among other things, it specifies -that the "python.exp" file is an "import" file for the shared module. - -At the time of this writing, neither the python.exp file nor the makexp_aix -or ld_so_aix scripts are installed by the make procedure, so you should -remember to keep these and/or copy them to a different location for -safekeeping if you wish to use them to add shared extension modules to -python. However, if the make process has been updated since this writing, -these files MAY have been installed for you during the make by the -LIBAINSTALL rule, in which case the need to make safe copies is obviated. - -If you wish to add a shared extension module to the language, you would follow -the steps given in the example below (the example adds the shared extension -module "spam" to python): - 1. Make sure that "ld_so_aix" and "makexp_aix" are in your path. - 2. The "python.exp" file should be in the current directory. - 3. Issue the following commands or include them in your Makefile: - cc -c spammodule.c - ld_so_aix cc spammodule.o -o spammodule.so - -For more detailed information on the shared library support, examine the -contents of the "ld_so_aix" and "makexp_aix" scripts or refer to the AIX -documentation. - -NOTE: If the extension module is written in C++ and contains templates, - an alternative to "ld_so_aix" is the /usr/lpp/xlC/bin/makeC++SharedLib - script. 
Chris Myers (myers at TC.Cornell.EDU) reports that ld_so_aix - works well for some C++ (including the C++ that is generated - automatically by the Python SWIG package [SWIG can be found at - http://www.cs.utah.edu/~beazley/SWIG/swig.html]). However, it is not - known whether makeC++SharedLib can be used as a complete substitute - for ld_so_aix. - -According to Gary Hook from IBM, the format of the export file changed -in AIX 4.2. For AIX 4.2 and later, a period "." is required on the -first line after "#!". If python crashes while importing a shared -library, you can try modifying the LINKCC variable in the Makefile. -It probably looks like this: - - LINKCC= $(srcdir)/Modules/makexp_aix Modules/python.exp \"\" $(LIBRARY); $(PURIFY) $(CXX) - -You should modify the \"\" to be a period: - - LINKCC= $(srcdir)/Modules/makexp_aix Modules/python.exp . $(LIBRARY); $(PURIFY) $(CXX) - -Using a period fixed the problem in the snake farm. YMMV. -This fix has been incorporated into Python 2.3. - -============================================================================== Modified: python/branches/pep-3151/Misc/NEWS ============================================================================== --- python/branches/pep-3151/Misc/NEWS (original) +++ python/branches/pep-3151/Misc/NEWS Sat Feb 26 08:16:32 2011 @@ -2,10 +2,986 @@ Python News +++++++++++ +What's New in Python 3.3 Alpha 1? +================================= + +*Release date: XX-XXX-20XX* + +Core and Builtins +----------------- + +- Issue #11286: Raise a ValueError from calling PyMemoryView_FromBuffer with + a buffer struct having a NULL data pointer. + +- Issue #11272: On Windows, input() strips '\r' (and not only '\n'), and + sys.stdin uses universal newline (replace '\r\n' by '\n'). + +- Issue #10830: Fix PyUnicode_FromFormatV("%c") for non-BMP characters on + narrow build. + +- Issue #11168: Remove filename debug variable from PyEval_EvalFrameEx(). + It encoded the Unicode filename to UTF-8, but the encoding fails on + undecodable filename (on surrogate characters) which raises an unexpected + UnicodeEncodeError on recursion limit. + +- Issue #11187: Remove bootstrap code (use ASCII) of + PyUnicode_AsEncodedString(), it was replaced by a better fallback (use the + locale encoding) in PyUnicode_EncodeFSDefault(). + +- Check for NULL result in PyType_FromSpec. + +- Issue #10516: New copy() and clear() methods for lists. + +Library +------- + +- Issue #11297: Add collections.ChainMap(). + +- Issue #10755: Add the posix.fdlistdir() function. Patch by Ross Lagerwall. + +- Issue #4761: Add the *at() family of functions (openat(), etc.) to the posix + module. Patch by Ross Lagerwall. + +- Issue #7322: Trying to read from a socket's file-like object after a timeout + occurred now raises an error instead of silently losing data. + +- Issue 11291: poplib.POP no longer suppresses errors on quit(). + +- Issue 11177: asyncore's create_socket() arguments can now be omitted. + +- Issue #6064: Add a ``daemon`` keyword argument to the threading.Thread + and multiprocessing.Process constructors in order to override the + default behaviour of inheriting the daemonic property from the current + thread/process. + +- Issue #10956: Buffered I/O classes retry reading or writing after a signal + has arrived and the handler returned successfully. + +- Issue #10784: New os.getpriority() and os.setpriority() functions. + +- Issue #11114: Fix catastrophic performance of tell() on text files (up + to 1000x faster in some cases). 
It is still one to two order of magnitudes + slower than binary tell(). + +- Issue 10882: Add os.sendfile function. + +- Issue #10868: Allow usage of the register method of an ABC as a class + decorator. + +- Issue #11224: Fixed a regression in tarfile that affected the file-like + objects returned by TarFile.extractfile() regarding performance, memory + consumption and failures with the stream interface. + +- Issue #10924: Adding salt and Modular Crypt Format to crypt library. + Moved old C wrapper to _crypt, and added a Python wrapper with + enhanced salt generation and simpler API for password generation. + +- Issue #11074: Make 'tokenize' so it can be reloaded. + +- Issue #11085: Moved collections abstract base classes into a separate + module called collections.abc, following the pattern used by importlib.abc. + For backwards compatibility, the names are imported into the collections + module. + +- Issue #4681: Allow mmap() to work on file sizes and offsets larger than + 4GB, even on 32-bit builds. Initial patch by Ross Lagerwall, adapted for + 32-bit Windows. + +- Issue #11169: compileall module uses repr() to format filenames and paths to + escape surrogate characters and show spaces. + +- Issue #11089: Fix performance issue limiting the use of ConfigParser() + with large config files. + +- Issue #10276: Fix the results of zlib.crc32() and zlib.adler32() on buffers + larger than 4GB. Patch by Nadeem Vawda. + +Build +----- + +- Issue #11268: Prevent Mac OS X Installer failure if Documentation + package had previously been installed. + +Tests +----- + +- Issue #10512: Properly close sockets under test.test_cgi. + +- Issue #10992: Make tests pass under coverage. + +- Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due + to open door files. + +- Issue #10990: Prevent tests from clobbering a set trace function. + + +What's New in Python 3.2? +========================= + +*Release date: 20-Feb-2011* + +Core and Builtins +----------------- + +- Issue #11249: Fix potential crashes when using the limited API. + +Build +----- + +- Issue #11222: Fix non-framework shared library build on Mac OS X. + +- Issue #11184: Fix large-file support on AIX. + +- Issue #941346: Fix broken shared library build on AIX. + +Documentation +------------- + +- Issue #10709: Add updated AIX notes in Misc/README.AIX. + + +What's New in Python 3.2 Release Candidate 3? +============================================= + +*Release date: 13-Feb-2011* + +Core and Builtins +----------------- + +- Issue #11134: Add missing fields to typeslots.h. + +- Issue #11135: Remove redundant doc field from PyType_Spec. + +- Issue #11067: Add PyType_GetFlags, to support PyUnicode_Check in the limited + ABI. + +- Issue #11118: Fix bogus export of None in python3.dll. + +Library +------- + +- Issue #11116: any error during addition of a message to a mailbox now causes a + rollback, instead of leaving the mailbox partially modified. + +- Issue #11132: Fix passing of "optimize" parameter when recursing in + compileall.compile_dir(). + +- Issue #11110: Fix a potential decref of a NULL in sqlite3. + +- Issue #8275: Fix passing of callback arguments with ctypes under Win64. Patch + by Stan Mihai. + +Build +----- + +- Issue #11079: The /Applications/Python x.x folder created by the Mac OS X + installers now includes a link to the installed documentation and no longer + includes an Extras directory. The Tools directory is now installed in the + framework under share/doc. + +- Issue #11121: Fix building with --enable-shared. 
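The 3.3 Library entry further above on allowing an ABC's ``register()`` method to be used as a class decorator (Issue #10868) is easier to see with a minimal sketch; the ``RingBuffer`` class below is invented purely for illustration::

    from collections.abc import Sequence

    @Sequence.register          # register() now returns its argument, so it works as a decorator
    class RingBuffer:
        def __len__(self):
            return 0
        def __getitem__(self, index):
            raise IndexError(index)

    print(issubclass(RingBuffer, Sequence))   # True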
+ +Tests +----- + +- Issue #10971: test_zipimport_support is once again compatible with the refleak + hunter feature of test.regrtest. + + +What's New in Python 3.2 Release Candidate 2? +============================================= + +*Release date: 30-Jan-2011* + +Core and Builtins +----------------- + +- Issue #10451: memoryview objects could allow to mutate a readable buffer. + Initial patch by Ross Lagerwall. + +Library +------- + +- Issue #9124: mailbox now accepts binary input and reads and writes mailbox + files in binary mode, using the email package's binary support to parse + arbitrary email messages. StringIO and text file input is deprecated, + and string input fails early if non-ASCII characters are used, where + previously it would fail when the email was processed in a later step. + +- Issue #10845: Mitigate the incompatibility between the multiprocessing + module on Windows and the use of package, zipfile or directory execution + by special casing main modules that actually *are* called __main__.py. + +- Issue #11045: Protect logging call against None argument. + +- Issue #11052: Correct IDLE menu accelerators on Mac OS X for Save + commands. + +- Issue #11053: Fix IDLE "Syntax Error" windows to behave as in 2.x, + preventing a confusing hung appearance on OS X with the windows + obscured. + +- Issue #10940: Workaround an IDLE hang on Mac OS X 10.6 when using the + menu accelerators for Open Module, Go to Line, and New Indent Width. + The accelerators still work but no longer appear in the menu items. + +- Issue #10989: Fix a crash on SSLContext.load_verify_locations(None, True). + +- Issue #11020: Command-line pyclbr was broken because of missing 2-to-3 + conversion. + +- Issue #11019: Fixed BytesGenerator so that it correctly handles a Message + with a None body. + +- Issue #11014: Make 'filter' argument in tarfile.Tarfile.add() into a + keyword-only argument. The preceding positional argument was deprecated, + so it made no sense to add filter as a positional argument. + +- Issue #11004: Repaired edge case in deque.count(). + +- Issue #10974: IDLE no longer crashes if its recent files list includes files + with non-ASCII characters in their path names. + +- Have hashlib.algorithms_available and hashlib.algorithms_guaranteed both + return sets instead of one returning a tuple and the other a frozenset. + +- Issue #10987: Fix the recursion limit handling in the _pickle module. + +- Issue #10983: Fix several bugs making tunnel requests in http.client. + +- Issue #10955: zipimport uses ASCII encoding instead of cp437 to decode + filenames, at bootstrap, if the codec registry is not ready yet. It is still + possible to have non-ASCII filenames using the Unicode flag (UTF-8 encoding) + for all file entries in the ZIP file. + +- Issue #10949: Improved robustness of rotating file handlers. + +- Issue #10955: Fix a potential crash when trying to mmap() a file past its + length. Initial patch by Ross Lagerwall. + +- Issue #10898: Allow compiling the posix module when the C library defines + a symbol named FSTAT. + +- Issue #10980: the HTTP server now encodes headers with iso-8859-1 (latin1) + encoding. This is the preferred encoding of PEP 3333 and the base encoding + of HTTP 1.1. + +- To match the behaviour of HTTP server, the HTTP client library now also + encodes headers with iso-8859-1 (latin1) encoding. It was already doing + that for incoming headers which makes this behaviour now consistent in + both incoming and outgoing direction. 
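The entry above making the *filter* argument to ``tarfile.TarFile.add()`` keyword-only (Issue #11014) looks like this in practice; the archive and directory names are placeholders for the sketch::

    import tarfile

    def strip_owner(info):
        # drop ownership metadata before the member is added
        info.uname = info.gname = ""
        info.uid = info.gid = 0
        return info

    with tarfile.open("backup.tar.gz", "w:gz") as tar:
        tar.add("project/", filter=strip_owner)   # *filter* must now be passed by keyword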
+ +- Issue #9509: argparse now properly handles IOErrors raised by + argparse.FileType. + +- Issue #10961: The new pydoc server now better handles exceptions raised + during request handling. + +- Issue #10680: Fix mutually exclusive arguments for argument groups in + argparse. + +Build +----- + +- Issue #11054: Allow Mac OS X installer builds to again work on 10.5 with + the system-provided Python. + + +What's New in Python 3.2 Release Candidate 1 +============================================ + +*Release date: 16-Jan-2011* + +Core and Builtins +----------------- + +- Issue #10889: range indexing and slicing now works correctly on ranges with + a length that exceeds sys.maxsize. + +- Issue #10892: Don't segfault when trying to delete __abstractmethods__ from a + class. + +- Issue #8020: Avoid a crash where the small objects allocator would read + non-Python managed memory while it is being modified by another thread. Patch + by Matt Bandy. + +- Issue #10841: On Windows, set the binary mode on stdin, stdout, stderr and all + io.FileIO objects (to not translate newlines, \r\n <=> \n). The Python parser + translates newlines (\r\n => \n). + +- Remove buffer API from stable ABI for now, see #10181. + +- Issue #8651: PyArg_Parse*() functions raise an OverflowError if the file + doesn't have PY_SSIZE_T_CLEAN define and the size doesn't fit in an int + (length bigger than 2^31-1 bytes). + +- Issue #9015, #9611: FileIO.readinto(), FileIO.write(), os.write() and + stdprinter.write() clamp the length to 2^31-1 on Windows. + +- Issue #8278: On Windows and with a NTFS filesystem, os.stat() and os.utime() + can now handle dates after 2038. + +- Issue #10780: PyErr_SetFromWindowsErrWithFilename() and + PyErr_SetExcFromWindowsErrWithFilename() decode the filename from the + filesystem encoding instead of UTF-8. + +- Issue #10779: PyErr_WarnExplicit() decodes the filename from the filesystem + encoding instead of UTF-8. + +- Add sys.flags attribute for the new -q command-line option. + +Library +------- + +- Issue #10916: mmap should not segfault when a file is mapped using 0 as length + and a non-zero offset, and an attempt to read past the end of file is made + (IndexError is raised instead). Patch by Ross Lagerwall. + +- Issue #10907: Warn OS X 10.6 IDLE users to use ActiveState Tcl/Tk 8.5, rather + than the currently problematic Apple-supplied one, when running with the + 64-/32-bit installer variant. + +- Issue #4953: cgi.FieldStorage and cgi.parse() parse the request as bytes, not + as unicode, and accept binary files. Add encoding and errors attributes to + cgi.FieldStorage. Patch written by Pierre Quentel (with many inputs by Glenn + Linderman). + +- Add encoding and errors arguments to urllib.parse_qs() and urllib.parse_qsl(). + +- Issue #10899: No function type annotations in the standard library. Removed + function type annotations from _pyio.py. + +- Issue #10875: Update Regular Expression HOWTO; patch by 'SilentGhost'. + +- Issue #10872: The repr() of TextIOWrapper objects now includes the mode + if available. + +- Issue #10869: Fixed bug where ast.increment_lineno modified the root node + twice. + +- Issue #5871: email.header.Header.encode now raises an error if any + continuation line in the formatted value has no leading white space and looks + like a header. Since Generator uses Header to format all headers, this check + is made for all headers in any serialized message at serialization time. This + provides protection against header injection attacks. 
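A minimal sketch of the header-injection protection described in the ``email.header.Header.encode`` entry immediately above (Issue #5871); the exact exception type is not named in the entry, so a broad catch is used here::

    from email.header import Header

    value = "innocuous\nBcc: attacker@example.com"   # continuation line looks like a header
    try:
        Header(value).encode()
    except Exception as exc:                         # HeaderParseError in practice
        print("rejected:", exc)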
+ +- Issue #10859: Make ``contextlib.GeneratorContextManager`` officially + private by renaming it to ``_GeneratorContextManager``. + +- Issue #10042: Fixed the total_ordering decorator to handle cross-type + comparisons that could lead to infinite recursion. + +- Issue #10686: the email package now :rfc:`2047`\ -encodes headers with + non-ASCII bytes (parsed by a Bytes Parser) when doing conversion to 7bit-clean + presentation, instead of replacing them with ?s. + +- email.header.Header was incorrectly encoding folding white space when + rfc2047-encoding header values with embedded newlines, leaving them without + folding whitespace. It now uses the continuation_ws, as it does for + continuation lines that it creates itself. + +- Issue #1777412, #10827: Changed the rules for 2-digit years. The + time.asctime(), time.ctime() and time.strftime() functions will now format + any year when ``time.accept2dyear`` is False and will accept years >= 1000 + otherwise. ``time.mktime`` and ``time.strftime`` now accept full range + supported by the OS. With Visual Studio or on Solaris, the year is limited to + the range [1; 9999]. Conversion of 2-digit years to 4-digit is deprecated. + +- Issue #7858: Raise an error properly when os.utime() fails under Windows + on an existing file. + +- Issue #3839: wsgiref should not override a Content-Length header set by + the application. Initial patch by Clovis Fabricio. + +- Issue #10492: bdb.Bdb.run() only traces the execution of the code, not the + compilation (if the input is a string). + +- Issue #7995: When calling accept() on a socket with a timeout, the returned + socket is now always blocking, regardless of the operating system. + +- Issue #10756: atexit normalizes the exception before displaying it. Patch by + Andreas St??hrk. + +- Issue #10790: email.header.Header.append's charset logic now works correctly + for charsets whose output codec is different from its input codec. + +- Issue #10819: SocketIO.name property returns -1 when its closed, instead of + raising a ValueError, to fix repr(). + +- Issue #8650: zlib.compress() and zlib.decompress() raise an OverflowError if + the input buffer length doesn't fit into an unsigned int (length bigger than + 2^32-1 bytes). + +- Issue #6643: Reinitialize locks held within the threading module after fork to + avoid a potential rare deadlock or crash on some platforms. + +- Issue #10806, issue #9905: Fix subprocess pipes when some of the standard file + descriptors (0, 1, 2) are closed in the parent process. Initial patch by Ross + Lagerwall. + +- `unittest.TestCase` can be instantiated without a method name; for simpler + exploration from the interactive interpreter. + +- Issue #10798: Reject supporting concurrent.futures if the system has too + few POSIX semaphores. + +- Issue #10807: Remove base64, bz2, hex, quopri, rot13, uu and zlib codecs from + the codec aliases. They are still accessible via codecs.lookup(). + +- Issue #10801: In zipfile, support different encodings for the header and the + filenames. + +- Issue #6285: IDLE no longer crashes on missing help file; patch by Scott + David Daniels. + +- Fix collections.OrderedDict.setdefault() so that it works in subclasses that + define __missing__(). + +- Issue #10786: unittest.TextTestRunner default stream no longer bound at import + time. `sys.stderr` now looked up at instantiation time. Fix contributed by + Mark Roddy. 
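The entry above on instantiating ``unittest.TestCase`` without a method name is mainly a convenience for interactive exploration; a short sketch::

    import unittest

    tc = unittest.TestCase()              # no test method name required any more
    tc.assertEqual(2 + 2, 4)
    tc.assertRaises(ZeroDivisionError, lambda: 1 / 0)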
+ +- Issue #10753: Characters ';','=' and ',' in the PATH_INFO environment variable + won't be quoted when the URI is constructed by the wsgiref.util 's request_uri + method. According to RFC 3986, these characters can be a part of params in + PATH component of URI and need not be quoted. + +- Issue #10738: Fix webbrowser.Opera.raise_opts. + +- Issue #9824: SimpleCookie now encodes , and ; in values to cater to how + browsers actually parse cookies. + +- Issue #9333: os.symlink now available regardless of user privileges. The + function now raises OSError on Windows >=6.0 when the user is unable to create + symbolic links. XP and 2003 still raise NotImplementedError. + +- Issue #10783: struct.pack() no longer implicitly encodes unicode to UTF-8. + +- Issue #10730: Add SVG mime types to mimetypes module. + +- Issue #10768: Make the Tkinter ScrolledText widget work again. + +- Issue #10777: Fix "dictionary changed size during iteration" bug in + ElementTree register_namespace(). + +- Issue #10626: test_logging now preserves logger disabled states. + +- Issue #10774: test_logging now removes temp files created during tests. + +- Issue #5258/#10642: if site.py encounters a .pth file that generates an error, + it now prints the filename, line number, and traceback to stderr and skips + the rest of that individual file, instead of stopping processing entirely. + +- Issue #10763: subprocess.communicate() closes stdout and stderr if both are + pipes (bug specific to Windows). + +- Issue #1693546: fix email.message RFC 2231 parameter encoding to be in better + compliance (no "s around encoded values). + +- Improved the diff message in the unittest module's assertCountEqual(). + +- Issue #1155362: email.utils.parsedate_tz now handles a missing space before + the '-' of a timezone field as well as before a '+'. + +- Issue #4871: The zipfile module now gives a more useful error message if + an attempt is made to use a string to specify the archive password. + +- Issue #10750: The ``raw`` attribute of buffered IO objects is now read-only. + +- Deprecated assertDictContainsSubset() in the unittest module. + +C-API +----- + +- Issue #10913: Deprecate misleading functions PyEval_AcquireLock() and + PyEval_ReleaseLock(). The thread-state aware APIs should be used instead. + +- Issue #10333: Remove ancient GC API, which has been deprecated since Python + 2.2. + +Build +----- + +- Issue #10843: Update third-party library versions used in OS X 32-bit + installer builds: bzip2 1.0.6, readline 6.1.2, SQLite 3.7.4 (with FTS3/FTS4 + and RTREE enabled), and ncursesw 5.5 (wide-char support enabled). + +- Issue #10820: Fix OS X framework installs to support version-specific + scripts (#10679). + +- Issue #7716: Under Solaris, don't assume existence of /usr/xpg4/bin/grep in + the configure script but use $GREP instead. Patch by Fabian Groffen. + +- Issue #10475: Don't hardcode compilers for LDSHARED/LDCXXSHARED on NetBSD + and DragonFly BSD. Patch by Nicolas Joly. + +- Issue #10679: The "idle", "pydoc" and "2to3" scripts are now installed with + a version-specific suffix on "make altinstall". + +- Issue #10655: Fix the build on PowerPC on Linux with GCC when building with + timestamp profiling (--with-tsc): the preprocessor test for the PowerPC + support now looks for "__powerpc__" as well as "__ppc__": the latter seems to + only be present on OS X; the former is the correct one for Linux with GCC. 
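The ``struct.pack()`` entry above (Issue #10783) means text strings are now rejected instead of being silently UTF-8 encoded; for example::

    import struct

    struct.pack("4s", b"spam")            # bytes are required for the "s" format
    try:
        struct.pack("4s", "spam")         # str is no longer implicitly encoded
    except struct.error as exc:
        print("rejected:", exc)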
+ +Tools/Demos +----------- + +- Issue #10843: Install the Tools directory on OS X in the applications Extras + (/Applications/Python 3.n/Extras/) where the Demo directory had previous been + installed. + +- Issue #7962: The Demo directory is gone. Most of the old and unmaintained + demos have been removed, others integrated in documentation or a new + Tools/demo subdirectory. + +- Issue #10502: Addition of the unittestgui tool. Originally by Steve Purcell. + Updated for test discovery by Mark Roddy and Python 3 compatibility by Brian + Curtin. + +Tests +----- + +- Issue #10822: Fix test_posix:test_getgroups failure under Solaris. Patch + by Ross Lagerwall. + +- Make the --coverage flag work for test.regrtest. + +- Issue #1677694: Refactor and improve test_timeout. Original patch by + Bj??rn Lindqvist. + +- Issue #5485: Add tests for the UseForeignDTD method of expat parser objects. + Patch by Jean-Paul Calderone and Sandro Tosi. + +- Issue #6293: Have regrtest.py echo back sys.flags. This is done by default in + whole runs and enabled selectively using ``--header`` when running an explicit + list of tests. Original patch by Collin Winter. + + +What's New in Python 3.2 Beta 2? +================================ + +*Release date: 19-Dec-2010* + +Core and Builtins +----------------- + +- Issue #8844: Regular and recursive lock acquisitions can now be interrupted + by signals on platforms using pthreads. Patch by Reid Kleckner. + +- Issue #4236: PyModule_Create2 now checks the import machinery directly + rather than the Py_IsInitialized flag, avoiding a Fatal Python + error in certain circumstances when an import is done in __del__. + +- Issue #5587: add a repr to dict_proxy objects. Patch by David Stanek and + Daniel Urban. + +Library +------- + +- Issue #3243: Support iterable bodies in httplib. Patch Contributions by + Xuanji Li and Chris AtLee. + +- Issue #10611: SystemExit exception will no longer kill a unittest run. + +- Issue #9857: It is now possible to skip a test in a setUp, tearDown or clean + up function. + +- Issue #10573: use actual/expected consistently in unittest methods. + The order of the args of assertCountEqual is also changed. + +- Issue #9286: email.utils.parseaddr no longer concatenates blank-separated + words in the local part of email addresses, thereby preserving the input. + +- Issue #6791: Limit header line length (to 65535 bytes) in http.client + and http.server, to avoid denial of services from the other party. + +- Issue #10404: Use ctl-button-1 on OSX for the context menu in Idle. + +- Issue #9907: Fix tab handling on OSX when using editline by calling + rl_initialize first, then setting our custom defaults, then reading .editrc. + +- Issue #4188: Avoid creating dummy thread objects when logging operations + from the threading module (with the internal verbose flag activated). + +- Issue #10711: Remove HTTP 0.9 support from http.client. The ``strict`` + parameter to HTTPConnection and friends is deprecated. + +- Issue #9721: Fix the behavior of urljoin when the relative url starts with a + ';' character. Patch by Wes Chow. + +- Issue #10714: Limit length of incoming request in http.server to 65536 bytes + for security reasons. Initial patch by Ross Lagerwall. + +- Issue #9558: Fix distutils.command.build_ext with VS 8.0. + +- Issue #10667: Fast path for collections.Counter(). + +- Issue #10695: passing the port as a string value to telnetlib no longer + causes debug mode to fail. 
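The entry above on skipping a test from ``setUp()`` (Issue #9857) can be sketched as follows; ``network_available()`` is a made-up stand-in for a real availability check::

    import unittest

    class NeedsNetwork(unittest.TestCase):
        def setUp(self):
            if not self.network_available():       # hypothetical helper for the sketch
                self.skipTest("network not reachable")

        @staticmethod
        def network_available():
            return False

        def test_download(self):
            self.fail("not reached when setUp() skips")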
+ +- Issue #1078919: add_header now automatically RFC2231 encodes parameters + that contain non-ascii values. + +- Issue #10188 (partial resolution): tempfile.TemporaryDirectory emits + a warning on sys.stderr rather than throwing a misleading exception + if cleanup fails due to nulling out of modules during shutdown. + Also avoids an AttributeError when mkdtemp call fails and issues + a ResourceWarning on implicit cleanup via __del__. + +- Issue #10107: Warn about unsaved files in IDLE on OSX. + +- Issue #7213: subprocess.Popen's default for close_fds has been changed. + It is now True in most cases other than on Windows when input, output or + error handles are provided. + +- Issue #6559: subprocess.Popen has a new pass_fds parameter (actually + added in 3.2beta1) to allow specifying a specific list of file descriptors + to keep open in the child process. + +- Issue #1731717: Fixed the problem where subprocess.wait() could cause an + OSError exception when The OS had been told to ignore SIGCLD in our process + or otherwise not wait for exiting child processes. + +Tests +----- + +- Issue #775964: test_grp now skips YP/NIS entries instead of failing when + encountering them. + +Tools/Demos +----------- + +- Issue #6075: IDLE on Mac OS X now works with both Carbon AquaTk and + Cocoa AquaTk. + +- Issue #10710: ``Misc/setuid-prog.c`` is removed from the source tree. + +- Issue #10706: Remove outdated script runtests.sh. Either ``make test`` + or ``python -m test`` should be used instead. + +Build +----- + +- The Windows build now uses Tcl/Tk 8.5.9 and sqlite3 3.7.4. + +- Issue #9234: argparse supports alias names for subparsers. + + +What's New in Python 3.2 Beta 1? +================================ + +*Release date: 05-Dec-2010* + +Core and Builtins +----------------- + +- Issue #10630: Return dict views from the dict proxy keys()/values()/items() + methods. + +- Issue #10596: Fix float.__mod__ to have the same behaviour as float.__divmod__ + with respect to signed zeros. -4.0 % 4.0 should be 0.0, not -0.0. + +- Issue #1772833: Add the -q command-line option to suppress copyright and + version output in interactive mode. + +- Provide an *optimize* parameter in the built-in compile() function. + +- Fixed several corner case issues on Windows in os.stat/os.lstat related to + reparse points. + +- PEP 384 (Defining a Stable ABI) is implemented. + +- Issue #2690: Range objects support negative indices and slicing. + +- Issue #9915: Speed up sorting with a key. + +- Issue #8685: Speed up set difference ``a - b`` when source set ``a`` is much + larger than operand ``b``. Patch by Andrew Bennetts. + +- Issue #10518: Bring back the callable() builtin. + +- Issue #7094: Added alternate formatting (specified by '#') to ``__format__`` + method of float, complex, and Decimal. This allows more precise control over + when decimal points are displayed. + +- Issue #10474: range.count() should return integers. + +- Issue #1574217: isinstance now catches only AttributeError, rather than + masking all errors. + +Library +------- + +- logging: added "handler of last resort". See http://bit.ly/last-resort-handler + +- test.support: Added TestHandler and Matcher classes for better support of + assertions about logging. + +- Issue #4391: Use proper plural forms in argparse. + +- Issue #10601: sys.displayhook uses 'backslashreplace' error handler on + UnicodeEncodeError. + +- Add the "display" and "undisplay" pdb commands. 
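A minimal POSIX-only sketch of the ``pass_fds`` parameter mentioned above (Issue #6559), which keeps selected descriptors open now that ``close_fds`` defaults to True (Issue #7213)::

    import os
    import subprocess
    import sys

    r, w = os.pipe()
    child = subprocess.Popen(
        [sys.executable, "-c",
         "import os, sys; os.write(int(sys.argv[1]), b'hi')", str(w)],
        pass_fds=(w,))                    # w stays open in the child despite close_fds=True
    child.wait()
    os.close(w)
    print(os.read(r, 2))                  # b'hi'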
+ +- Issue #7245: Add a SIGINT handler in pdb that allows to break a program again + after a "continue" command. + +- Add the "interact" pdb command. + +- Issue #7905: Actually respect the keyencoding parameter to shelve.Shelf. + +- Issue #1569291: Speed up array.repeat(). + +- Provide an interface to set the optimization level of compilation in + py_compile, compileall and zipfile.PyZipFile. + +- Issue #7904: Changes to urllib.parse.urlsplit to handle schemes as defined by + RFC3986. Anything before :// is considered a scheme and is followed by an + authority (or netloc) and by '/' led path, which is optional. + +- Issue #6045: dbm.gnu databases now support get() and setdefault() methods. + +- Issue #10620: `python -m unittest` can accept file paths instead of module + names for running specific tests. + +- Issue #9424: Deprecate the `unittest.TestCase` methods `assertEquals`, + `assertNotEquals`, `assertAlmostEquals`, `assertNotAlmostEquals` and `assert_` + and replace them with the correct methods in the Python test suite. + +- Issue #10272: The ssl module now raises socket.timeout instead of a generic + SSLError on socket timeouts. + +- Issue #10528: Allow translators to reorder placeholders in localizable + messages from argparse. + +- Issue #10497: Fix incorrect use of gettext in argparse. + +- Issue #10478: Reentrant calls inside buffered IO objects (for example by + way of a signal handler) now raise a RuntimeError instead of freezing the + current process. + +- logging: Added getLogRecordFactory/setLogRecordFactory with docs and tests. + +- Issue #10549: Fix pydoc traceback when text-documenting certain classes. + +- Issue #2001: New HTML server with enhanced Web page features. Patch by Ron + Adam. + +- Issue #10360: In WeakSet, do not raise TypeErrors when testing for membership + of non-weakrefable objects. + +- Issue #940286: pydoc.Helper.help() ignores input/output init parameters. + +- Issue #1745035: Add a command size and data size limit to smtpd.py, to prevent + DoS attacks. Patch by Savio Sena. + +- Issue #4925: Add filename to error message when executable can't be found in + subprocess. + +- Issue #10391: Don't dereference invalid memory in error messages in the ast + module. + +- Issue #10027: st_nlink was not being set on Windows calls to os.stat or + os.lstat. Patch by Hirokazu Yamamoto. + +- Issue #9333: Expose os.symlink only when the SeCreateSymbolicLinkPrivilege is + held by the user's account, i.e., when the function can actually be used. + +- Issue #8879: Add os.link support for Windows. + +- Issue #7911: ``unittest.TestCase.longMessage`` defaults to True for improved + failure messages by default. Patch by Mark Roddy. + +- Issue #1486713: HTMLParser now has an optional tolerant mode where it tries to + guess at the correct parsing of invalid html. + +- Issue #10554: Add context manager support to subprocess.Popen objects. + +- Issue #8989: email.utils.make_msgid now has a domain parameter that can + override the domain name used in the generated msgid. + +- Issue #9299: Add exist_ok parameter to os.makedirs to suppress the 'File + exists' exception when a target directory already exists with the specified + mode. Patch by Ray Allen. + +- Issue #9573: os.fork() now works correctly when triggered as a side effect of + a module import. + +- Issue #10464: netrc now correctly handles lines with embedded '#' characters. + +- Added itertools.accumulate(). + +- Issue #4113: Added custom ``__repr__`` method to ``functools.partial``. + Original patch by Daniel Urban. 
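The new ``itertools.accumulate()`` mentioned above yields running totals; a one-liner shows the idea::

    import itertools

    print(list(itertools.accumulate([1, 2, 3, 4, 5])))   # [1, 3, 6, 10, 15]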
+ +- Issue #10273: Rename `assertRegexpMatches` and `assertRaisesRegexp` to + `assertRegex` and `assertRaisesRegex`. + +- Issue #10535: Enable silenced warnings in unittest by default. + +- Issue #9873: The URL parsing functions in urllib.parse now accept ASCII byte + sequences as input in addition to character strings. + +- Issue #10586: The statistics API for the new functools.lru_cache has been + changed to a single cache_info() method returning a named tuple. + +- Issue #10323: itertools.islice() now consumes the minimum number of inputs + before stopping. Formerly, the final state of the underlying iterator was + undefined. + +- Issue #10565: The collections.Iterator ABC now checks for both __iter__ and + __next__. + +- Issue #10242: Fixed implementation of unittest.ItemsEqual and gave it a new + more informative name, unittest.CountEqual. + +- Issue #10561: In pdb, clear the breakpoints by the breakpoint number. + +- Issue #2986: difflib.SequenceMatcher gets a new parameter, autojunk, which can + be set to False to turn off the previously undocumented 'popularity' + heuristic. Patch by Terry Reedy and Eli Bendersky. + +- Issue #10534: in difflib, expose bjunk and bpopular sets; deprecate + undocumented and now redundant isbjunk and isbpopular methods. + +- Issue #9846: zipfile is now correctly closing underlying file objects. + +- Issue #10459: Update CJK character names to Unicode 6.0. + +- Issue #4493: urllib.request adds '/' in front of path components which does not + start with '/. Common behavior exhibited by browsers and other clients. + +- Issue #6378: idle.bat now runs with the appropriate Python version rather than + the system default. Patch by Sridhar Ratnakumar. + +- Issue #10470: 'python -m unittest' will now run test discovery by default, + when no extra arguments have been provided. + +- Issue #3709: BaseHTTPRequestHandler will buffer the headers and write to + output stream only when end_headers is invoked. This is a speedup and an + internal optimization. Patch by endian. + +- Issue #10220: Added inspect.getgeneratorstate. Initial patch by Rodolpho + Eckhardt. + +- Issue #10453: compileall now uses argparse instead of getopt, and thus + provides clean output when called with '-h'. + +- Issue #8078: Add constants for higher baud rates in the termios module. Patch + by Rodolpho Eckhardt. + +- Issue #10407: Fix two NameErrors in distutils. + +- Issue #10371: Deprecated undocumented functions in the trace module. + +- Issue #10467: Fix BytesIO.readinto() after seeking into a position after the + end of the file. + +- configparser: 100% test coverage. + +- Issue #10499: configparser supports pluggable interpolation handlers. The + default classic interpolation handler is called BasicInterpolation. Another + interpolation handler added (ExtendedInterpolation) which supports the syntax + used by zc.buildout (e.g. interpolation between sections). + +- configparser: the SafeConfigParser class has been renamed to ConfigParser. + The legacy ConfigParser class has been removed but its interpolation mechanism + is still available as LegacyInterpolation. + +- configparser: Usage of RawConfigParser is now discouraged for new projects + in favor of ConfigParser(interpolation=None). + +- Issue #1682942: configparser supports alternative option/value delimiters. + +- Issue #5412: configparser supports mapping protocol access. + +- Issue #9411: configparser supports specifying encoding for read operations. 
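The ``functools.lru_cache`` entry above (Issue #10586) replaces the old statistics attributes with a single ``cache_info()`` method; a small sketch::

    import functools

    @functools.lru_cache(maxsize=128)
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    fib(30)
    print(fib.cache_info())   # CacheInfo(hits=28, misses=31, maxsize=128, currsize=31)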
+ +- Issue #9421: configparser's getint(), getfloat() and getboolean() methods + accept vars and default arguments just like get() does. + +- Issue #9452: configparser supports reading from strings and dictionaries + (thanks to the mapping protocol API, the latter can be used to copy data + between parsers). + +- configparser: accepted INI file structure is now customizable, including + comment prefixes, name of the DEFAULT section, empty lines in multiline + values, and indentation. + +- Issue 10326: unittest.TestCase instances can be pickled. + +- Issue 9926: Wrapped TestSuite subclass does not get __call__ executed. + +- Issue #9920: Skip tests for cmath.atan and cmath.atanh applied to complex + zeros on systems where the log1p function fails to respect the sign of zero. + This fixes a test failure on AIX. + +- Issue #9732: Addition of getattr_static to the inspect module. + +- Issue #10446: Module documentation generated by pydoc now links to a + version-specific online reference manual. + +- Make the 'No module named' exception message from importlib consistent. + +- Issue #10443: Add the SSLContext.set_default_verify_paths() method. + +- Issue #10440: Support RUSAGE_THREAD as a constant in the resource module. + Patch by Robert Collins. + +- Issue #10429: IMAP.starttls() stored the capabilities as bytes objects, rather + than strings. + +C-API +----- + +- Issue #10557: Added a new API function, PyUnicode_TransformDecimalToASCII(), + which transforms non-ASCII decimal digits in a Unicode string to their ASCII + equivalents. + +- Issue #9518: Extend the PyModuleDef_HEAD_INIT macro to explicitly + zero-initialize all fields, fixing compiler warnings seen when building + extension modules with gcc with "-Wmissing-field-initializers" (implied by + "-W"). + +- Issue #10255: Fix reference leak in Py_InitializeEx(). Patch by Neil + Schemenauer. + +- structseq.h is now included in Python.h. + +- Loosen PyArg_ValidateKeywordArguments to allow dict subclasses. + +Tests +----- + +- regrtest.py once again ensures the test directory is removed from sys.path + when it is invoked directly as the __main__ module. + +- `python -m test` can be used to run the test suite as well as `python -m + test.regrtest`. + +- Do not fail test_socket when the IP address of the local hostname cannot be + looked up. + +- Issue #8886: Use context managers throughout test_zipfile. Patch by Eric + Carstensen. + +Build +----- + +- Issue #10325: Fix two issues in the fallback definitions for PY_ULLONG_MAX and + PY_LLONG_MAX that made them unsuitable for use in preprocessor conditionals. + +Documentation +------------- + +- Issue #10299: List the built-in functions in a table in functions.rst. + + What's New in Python 3.2 Alpha 4? ================================= -*Release date: XX-Nov-2010* +*Release date: 13-Nov-2010* Core and Builtins ----------------- @@ -13,46 +989,46 @@ - Issue #10372: Import the warnings module only after the IO library is initialized, so as to avoid bootstrap issues with the '-W' option. -- Issue #10293: Remove obsolete field in the PyMemoryView structure, - unused undocumented value PyBUF_SHADOW, and strangely-looking code in +- Issue #10293: Remove obsolete field in the PyMemoryView structure, unused + undocumented value PyBUF_SHADOW, and strangely-looking code in PyMemoryView_GetContiguous. -- Issue #6081: Add str.format_map, similar to str.format(**mapping). +- Issue #6081: Add str.format_map(), similar to ``str.format(**mapping)``. - If FileIO.__init__ fails, close the file descriptor. 
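The ``str.format_map()`` entry above (Issue #6081) is most useful with a mapping that defines ``__missing__``, which ``str.format(**mapping)`` would lose by copying into a plain dict; a short sketch::

    class Default(dict):
        def __missing__(self, key):
            return "<" + key + ">"

    # format_map() passes the mapping through unchanged, so __missing__ is honoured
    print("{user} logged in from {host}".format_map(Default(user="guido")))
    # -> 'guido logged in from <host>'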
- Issue #10221: dict.pop(k) now has a key error message that includes the missing key (same message d[k] returns for missing keys). -- Issue #5437: A preallocated MemoryError instance should not hold traceback +- Issue #5437: A preallocated MemoryError instance should not keep traceback data (including local variables caught in the stack trace) alive infinitely. - Issue #10186: Fix the SyntaxError caret when the offset is equal to the length of the offending line. -- Issue #10089: Add support for arbitrary -X options on the command-line. - They can be retrieved through a new attribute ``sys._xoptions``. +- Issue #10089: Add support for arbitrary -X options on the command line. They + can be retrieved through a new attribute ``sys._xoptions``. -- Issue #4388: On Mac OS X, decode command line arguments from UTF-8, instead - of the locale encoding. If the LANG (and LC_ALL and LC_CTYPE) environment +- Issue #4388: On Mac OS X, decode command line arguments from UTF-8, instead of + the locale encoding. If the LANG (and LC_ALL and LC_CTYPE) environment variable is not set, the locale encoding is ISO-8859-1, whereas most programs - (including Python) expect UTF-8. Python already uses UTF-8 for the filesystem + (including Python) expect UTF-8. Python already uses UTF-8 for the filesystem encoding and to encode command line arguments on this OS. -- Issue #9713, #10114: Parser functions (eg. PyParser_ASTFromFile) expects - filenames encoded to the filesystem encoding with surrogateescape error +- Issue #9713, #10114: Parser functions (e.g. PyParser_ASTFromFile) expect + filenames encoded to the filesystem encoding with the surrogateescape error handler (to support undecodable bytes), instead of UTF-8 in strict mode. - Issue #9997: Don't let the name "top" have special significance in scope resolution. -- Issue #9862: Compensate for broken PIPE_BUF in AIX by hard coding - its value as the default 512 when compiling on AIX. +- Issue #9862: Compensate for broken PIPE_BUF in AIX by hard coding its value as + the default 512 when compiling on AIX. - Use locale encoding instead of UTF-8 to encode and decode filenames if Py_FileSystemDefaultEncoding is not set. -- Issue #10095: fp_setreadl() doesn't reopen the file, reuse instead the file +- Issue #10095: fp_setreadl() doesn't reopen the file, instead reuse the file descriptor. - Issue #9418: Moved private string methods ``_formatter_parser`` and @@ -63,34 +1039,54 @@ Library ------- -- Issue #4471: Properly shutdown socket in IMAP.shutdown(). Patch by - Lorenzo M. Catucci. +- Issue #10465: fix broken delegating of attributes by gzip._PaddedFile. + +- Issue #10356: Decimal.__hash__(-1) should return -2. + +- Issue #1553375: logging: Added stack_info kwarg to display stack information. + +- Issue #5111: IPv6 Host in the Header is wrapped inside [ ]. Patch by Chandru. + +- Fix Fraction.__hash__ so that Fraction.__hash__(-1) is -2. (See also issue + #10356.) + +- Issue #4471: Add the IMAP.starttls() method to enable encryption on standard + IMAP4 connections. Original patch by Lorenzo M. Catucci. + +- Issue #1466065: Add 'validate' option to base64.b64decode to raise an error if + there are non-base64 alphabet characters in the input. + +- Issue #10386: Add __all__ to token module; this simplifies importing in + tokenize module and prevents leaking of private names through ``import *``. + +- Issue #4471: Properly shutdown socket in IMAP.shutdown(). Patch by Lorenzo + M. Catucci. - Fix IMAP.login() to work properly. 
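The new *validate* option for ``base64.b64decode()`` mentioned above (Issue #1466065) rejects input containing non-alphabet characters instead of silently discarding them::

    import base64
    import binascii

    print(base64.b64decode(b"c3Bh bQ=="))              # lenient: space is discarded -> b'spam'
    try:
        base64.b64decode(b"c3Bh bQ==", validate=True)  # strict: non-alphabet byte rejected
    except binascii.Error as exc:
        print("rejected:", exc)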
- Issue #9244: multiprocessing pool worker processes could terminate - unexpectedly if the return value of a task could not be pickled. Only - the ``repr`` of such errors are now sent back, wrapped in an + unexpectedly if the return value of a task could not be pickled. Only the + ``repr`` of such errors are now sent back, wrapped in an ``MaybeEncodingError`` exception. -- Issue #9244: The ``apply_async()`` and ``map_async()`` methods - of ``multiprocessing.Pool`` now accepts a ``error_callback`` argument. - This can be a callback with the signature ``callback(exc)``, which will - be called if the target raises an exception. +- Issue #9244: The ``apply_async()`` and ``map_async()`` methods of + ``multiprocessing.Pool`` now accepts a ``error_callback`` argument. This can + be a callback with the signature ``callback(exc)``, which will be called if + the target raises an exception. -- Issue #10022: The dictionary returned by the ``getpeercert()`` method - of SSL sockets now has additional items such as ``issuer`` and ``notBefore``. +- Issue #10022: The dictionary returned by the ``getpeercert()`` method of SSL + sockets now has additional items such as ``issuer`` and ``notBefore``. - ``usenetrc`` is now false by default for NNTP objects. -- Issue #1926: Add support for NNTP over SSL on port 563, as well as - STARTTLS. Patch by Andrew Vant. +- Issue #1926: Add support for NNTP over SSL on port 563, as well as STARTTLS. + Patch by Andrew Vant. - Issue #10335: Add tokenize.open(), detect the file encoding using tokenize.detect_encoding() and open it in read only mode. -- Issue #10321: Added support for binary data to smtplib.SMTP.sendmail, - and a new method send_message to send an email.message.Message object. +- Issue #10321: Add support for binary data to smtplib.SMTP.sendmail, and a new + method send_message to send an email.message.Message object. - Issue #6011: sysconfig and distutils.sysconfig use the surrogateescape error handler to parse the Makefile file. Avoid a UnicodeDecodeError if the source @@ -115,60 +1111,59 @@ - Issue #10180: Pickling file objects is now explicitly forbidden, since unpickling them produced nonsensical results. -- Issue #10311: The signal module now restores errno before returning from - its low-level signal handler. Patch by Hallvard B Furuseth. +- Issue #10311: The signal module now restores errno before returning from its + low-level signal handler. Patch by Hallvard B Furuseth. - Issue #10282: Add a ``nntp_implementation`` attribute to NNTP objects. - Issue #10283: Add a ``group_pattern`` argument to NNTP.list(). -- Issue #10155: Add IISCGIHandler to wsgiref.handlers to support IIS - CGI environment better, and to correct unicode environment values - for WSGI 1.0.1. +- Issue #10155: Add IISCGIHandler to wsgiref.handlers to support IIS CGI + environment better, and to correct unicode environment values for WSGI 1.0.1. - Issue #10281: nntplib now returns None for absent fields in the OVER/XOVER response, instead of raising an exception. - wsgiref now implements and validates PEP 3333, rather than an experimental extension of PEP 333. (Note: earlier versions of Python 3.x may have - incorrectly validated some non-compliant applications as WSGI compliant; - if your app validates with Python <3.2b1+, but not on this version, it is - likely the case that your app was not compliant.) 
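A small sketch of ``tokenize.open()`` from the entry above (Issue #10335); ``setup.py`` is just a placeholder for any existing source file::

    import tokenize

    # opens the file read-only, with the encoding taken from its coding cookie or BOM
    with tokenize.open("setup.py") as f:
        print(f.encoding, f.readline().rstrip())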
+ incorrectly validated some non-compliant applications as WSGI compliant; if + your app validates with Python <3.2b1+, but not on this version, it is likely + the case that your app was not compliant.) -- Issue #10280: NNTP.nntp_version should reflect the highest version - advertised by the server. +- Issue #10280: NNTP.nntp_version should reflect the highest version advertised + by the server. - Issue #10184: Touch directories only once when extracting a tarfile. -- Issue #10199: New package, ``turtledemo`` now contains selected demo - scripts that were formerly found under Demo/turtle. +- Issue #10199: New package, ``turtledemo`` now contains selected demo scripts + that were formerly found under Demo/turtle. - Issue #10265: Close file objects explicitly in sunau. Patch by Brian Brazil. -- Issue #10266: uu.decode didn't close in_file explicitly when it was given - as a filename. Patch by Brian Brazil. +- Issue #10266: uu.decode didn't close in_file explicitly when it was given as a + filename. Patch by Brian Brazil. -- Issue #10110: Queue objects didn't recognize full queues when the - maxsize parameter had been reduced. +- Issue #10110: Queue objects didn't recognize full queues when the maxsize + parameter had been reduced. - Issue #10160: Speed up operator.attrgetter. Patch by Christos Georgiou. - logging: Added style option to basicConfig() to allow %, {} or $-formatting. -- Issue #5729: json.dumps() now supports using a string such as '\t' - for pretty-printing multilevel objects. +- Issue #5729: json.dumps() now supports using a string such as '\t' for + pretty-printing multilevel objects. -- Issue #10253: FileIO leaks a file descriptor when trying to open a file - for append that isn't seekable. Patch by Brian Brazil. +- Issue #10253: FileIO leaks a file descriptor when trying to open a file for + append that isn't seekable. Patch by Brian Brazil. -- Support context manager protocol for file-like objects returned by - mailbox ``get_file()`` methods. +- Support context manager protocol for file-like objects returned by mailbox + ``get_file()`` methods. - Issue #10246: uu.encode didn't close file objects explicitly when filenames were given to it. Patch by Brian Brazil. -- Issue #10198: fix duplicate header written to wave files when writeframes() - is called without data. +- Issue #10198: fix duplicate header written to wave files when writeframes() is + called without data. - Close file objects in modulefinder in a timely manner. @@ -178,20 +1173,20 @@ - Close a file object in pkgutil in a timely manner. -- Issue #10233: Close file objects in a timely manner in the tarfile module - and its test suite. +- Issue #10233: Close file objects in a timely manner in the tarfile module and + its test suite. - Issue #10093: ResourceWarnings are now issued when files and sockets are - deallocated without explicit closing. These warnings are silenced by - default, except in pydebug mode. + deallocated without explicit closing. These warnings are silenced by default, + except in pydebug mode. -- tarfile.py: Add support for all missing variants of the GNU sparse - extensions and create files with holes when extracting sparse members. +- tarfile.py: Add support for all missing variants of the GNU sparse extensions + and create files with holes when extracting sparse members. - Issue #10218: Return timeout status from ``Condition.wait`` in threading. -- Issue #7351: Add ``zipfile.BadZipFile`` spelling of the exception name - and deprecate the old name ``zipfile.BadZipfile``. 
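The json entry above (Issue #5729) allows a string such as ``"\t"`` for *indent*; for example::

    import json

    data = {"name": "spam", "tags": ["a", "b"]}
    print(json.dumps(data, indent="\t", sort_keys=True))   # tab-indented, multi-level output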
+- Issue #7351: Add ``zipfile.BadZipFile`` spelling of the exception name and + deprecate the old name ``zipfile.BadZipfile``. - Issue #5027: The standard ``xml`` namespace is now understood by xml.sax.saxutils.XMLGenerator as being bound to @@ -203,21 +1198,19 @@ - logging: Added style option to Formatter to allow %, {} or $-formatting. -- Issue #5178: Added tempfile.TemporaryDirectory class that can be used - as a context manager. +- Issue #5178: Added tempfile.TemporaryDirectory class that can be used as a + context manager. - Issue #1349106: Generator (and BytesGenerator) flatten method and Header encode method now support a 'linesep' argument. -- Issue #5639: Add a *server_hostname* argument to ``SSLContext.wrap_socket`` - in order to support the TLS SNI extension. ``HTTPSConnection`` and - ``urlopen()`` also use this argument, so that HTTPS virtual hosts are now - supported. +- Issue #5639: Add a *server_hostname* argument to ``SSLContext.wrap_socket`` in + order to support the TLS SNI extension. ``HTTPSConnection`` and ``urlopen()`` + also use this argument, so that HTTPS virtual hosts are now supported. - Issue #10166: Avoid recursion in pstats Stats.add() for many stats items. -- Issue #10163: Skip unreadable registry keys during mimetypes - initialization. +- Issue #10163: Skip unreadable registry keys during mimetypes initialization. - logging: Made StreamHandler terminator configurable. @@ -230,9 +1223,9 @@ - logging: Added _logRecordClass, getLogRecordClass, setLogRecordClass to increase flexibility of LogRecord creation. -- Issue #5117: Case normalization was needed on ntpath.relpath(). And - fixed root directory issue on posixpath.relpath(). (Ported working fixes - from ntpath) +- Issue #5117: Case normalization was needed on ntpath.relpath(). Also fixed + root directory issue on posixpath.relpath(). (Ported working fixes from + ntpath.) - Issue #1343: xml.sax.saxutils.XMLGenerator now has an option short_empty_elements to direct it to use self-closing tags when appropriate. @@ -251,34 +1244,33 @@ - Issue #9409: Fix the regex to match all kind of filenames, for interactive debugging in doctests. -- Issue #9183: ``datetime.timezone(datetime.timedelta(0))`` will now - return the same instance as ``datetime.timezone.utc``. +- Issue #9183: ``datetime.timezone(datetime.timedelta(0))`` will now return the + same instance as ``datetime.timezone.utc``. -- Issue #7523: Add SOCK_CLOEXEC and SOCK_NONBLOCK to the socket module, - where supported by the system. Patch by Nikita Vetoshkin. +- Issue #7523: Add SOCK_CLOEXEC and SOCK_NONBLOCK to the socket module, where + supported by the system. Patch by Nikita Vetoshkin. - Issue #10063: file:// scheme will stop accessing remote hosts via ftp protocol. file:// urls had fallback to access remote hosts via ftp. This was - not correct, change is made to raise a URLError when a remote host is tried - to access via file:// scheme. + not correct, change is made to raise a URLError when a remote host is tried to + access via file:// scheme. - Issue #1710703: Write structures for an empty ZIP archive when a ZipFile is created in modes 'a' or 'w' and then closed without adding any files. Raise BadZipfile (rather than IOError) when opening small non-ZIP files. -- Issue #10041: The signature of optional arguments in socket.makefile() - didn't match that of io.open(), and they also didn't get forwarded - properly to TextIOWrapper in text mode. Patch by Kai Zhu. 
+- Issue #10041: The signature of optional arguments in socket.makefile() didn't + match that of io.open(), and they also didn't get forwarded properly to + TextIOWrapper in text mode. Patch by Kai Zhu. - Issue #9003: http.client.HTTPSConnection, urllib.request.HTTPSHandler and - urllib.request.urlopen now take optional arguments to allow for - server certificate checking, as recommended in public uses of HTTPS. + urllib.request.urlopen now take optional arguments to allow for server + certificate checking, as recommended in public uses of HTTPS. - Issue #6612: Fix site and sysconfig to catch os.getcwd() error, eg. if the current directory was deleted. Patch written by W. Trevor King. -- Issue #3873: Speed up unpickling from file objects which have a peek() - method. +- Issue #3873: Speed up unpickling from file objects that have a peek() method. - Issue #10075: Add a session_stats() method to SSLContext objects. @@ -287,6 +1279,9 @@ Extension Modules ----------------- +- Issue #5109: array.array constructor will now use fast code when + initial data is provided in an array object with correct type. + - Issue #6317: Now winsound.PlaySound only accepts unicode. - Issue #6317: Now winsound.PlaySound can accept non ascii filename. @@ -299,12 +1294,12 @@ - Issue #678250: Make mmap flush a noop on ACCESS_READ and ACCESS_COPY. -- Issue #9054: Fix a crash occurring when using the pyexpat module - with expat version 2.0.1. +- Issue #9054: Fix a crash occurring when using the pyexpat module with expat + version 2.0.1. -- Issue #5355: Provide mappings from Expat error numbers to string - descriptions and backwards, in order to actually make it possible - to analyze error codes provided by ExpatError. +- Issue #5355: Provide mappings from Expat error numbers to string descriptions + and backwards, in order to actually make it possible to analyze error codes + provided by ExpatError. - The Unicode database was updated to 6.0.0. @@ -312,8 +1307,7 @@ ----- - Issue #10288: The deprecated family of "char"-handling macros - (ISLOWER()/ISUPPER()/etc) have now been removed: use Py_ISLOWER() etc - instead. + (ISLOWER()/ISUPPER()/etc) have now been removed: use Py_ISLOWER() etc instead. - Issue #9778: Hash values are now always the size of pointers. A new Py_hash_t type has been introduced. @@ -321,22 +1315,22 @@ Tools/Demos ----------- -- Issue #10117: Tools/scripts/reindent.py now accepts source files - that use encoding other than ASCII or UTF-8. Source encoding is - preserved when reindented code is written to a file. +- Issue #10117: Tools/scripts/reindent.py now accepts source files that use + encoding other than ASCII or UTF-8. Source encoding is preserved when + reindented code is written to a file. - Issue #7287: Demo/imputil/knee.py was removed. Tests ----- -- Issue #3699: Fix test_bigaddrspace and extend it to test bytestrings - as well as unicode strings. Initial patch by Sandro Tosi. +- Issue #3699: Fix test_bigaddrspace and extend it to test bytestrings as well + as unicode strings. Initial patch by Sandro Tosi. - Issue #10294: Remove dead code form test_unicode_file. -- Issue #10123: Don't use non-ascii filenames in test_doctest tests. Add a - new test specific to unicode (non-ascii name and filename). +- Issue #10123: Don't use non-ascii filenames in test_doctest tests. Add a new + test specific to unicode (non-ascii name and filename). Build ----- @@ -351,8 +1345,8 @@ - Accept Oracle Berkeley DB 5.0 and 5.1 as backend for the dbm extension. 
-- Issue #7473: avoid link errors when building a framework with a different - set of architectures than the one that is currently installed. +- Issue #7473: avoid link errors when building a framework with a different set + of architectures than the one that is currently installed. What's New in Python 3.2 Alpha 3? @@ -408,7 +1402,7 @@ - Issue #9252: PyImport_Import no longer uses a fromlist hack to return the module that was imported, but instead gets the module from sys.modules. -- Issue #9212: The range type_items now provides index() and count() methods, to +- Issue #9213: The range type_items now provides index() and count() methods, to conform to the Sequence ABC. Patch by Daniel Urban and Daniel Stutzbach. - Issue #7994: Issue a PendingDeprecationWarning if object.__format__ is called @@ -836,6 +1830,12 @@ Extension Modules ----------------- +- Issue #8013: time.asctime and time.ctime no longer call system + asctime and ctime functions. The year range for time.asctime is now + 1900 through maxint. The range for time.ctime is the same as for + time.localtime. The string produced by these functions is longer + than 24 characters when year is greater than 9999. + - Issue #6608: time.asctime is now checking struct tm fields its input before passing it to the system asctime. Patch by MunSic Jeong. @@ -1091,8 +2091,8 @@ - Issue #8230: Fix Lib/test/sortperf.py. -- Issue #8620: when a Cmd is fed input that reaches EOF without a final newline, - it no longer truncates the last character of the last command line. +- Issue #8620: when a cmd.Cmd() is fed input that reaches EOF without a final + newline, it no longer truncates the last character of the last command line. - Issue #5146: Handle UID THREAD command correctly in imaplib. @@ -1816,7 +2816,7 @@ - Issue #8897: Fix sunau module, use bytes to write the header. Patch written by Thomas Jollans. -- Issue #8899: time.struct_time now has class and atribute docstrings. +- Issue #8899: time.struct_time now has class and attribute docstrings. - Issue #6470: Drop UNC prefix in FixTk. Deleted: python/branches/pep-3151/Misc/PURIFY.README ============================================================================== --- python/branches/pep-3151/Misc/PURIFY.README Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,97 +0,0 @@ -Purify (tm) and Quantify (tm) are commercial software quality -assurance tools available from IBM . -Purify is essentially a memory access -verifier and leak detector; Quantify is a C level profiler. The rest -of this file assumes you generally know how to use Purify and -Quantify, and that you have installed valid licenses for these -products. If you haven't installed such licenses, you can ignore the -following since it won't help you a bit! - -You can easily build a Purify or Quantify instrumented version of the -Python interpreter by passing the PURIFY variable to the make command -at the top of the Python tree: - - make PURIFY=purify - -This assumes that the `purify' program is on your $PATH. Note that -you cannot both Purify and Quantify the Python interpreter (or any -program for that matter) at the same time. If you want to build a -Quantify'd interpreter, do this: - - make PURIFY=quantify - -Starting with Python 2.3, pymalloc is enabled by default. This -will cause many supurious warnings. Modify Objects/obmalloc.c -and enable Py_USING_MEMORY_DEBUGGER by uncommenting it. -README.valgrind has more details about why this is necessary. -See below about setting up suppressions. 
Some tests may not -run well with Purify due to heavy memory or CPU usage. These -tests may include: test_largefile, test_import, and test_long. - -Please report any findings (problems or no warnings) to python-dev at python.org. -It may be useful to submit a bug report for any problems. - -When running the regression test (make test), I have found it useful -to set my PURIFYOPTIONS environment variable using the following -(bash) shell function. Check out the Purify documentation for -details: - -p() { - chainlen='-chain-length=12' - ignoresigs='-ignore-signals="SIGHUP,SIGINT,SIGQUIT,SIGILL,SIGTRAP,SIGAVRT,SIGEMT,SIGFPE,SIGKILL,SIGBUS,SIGSEGV,SIGPIPE,SIGTERM,SIGUSR1,SIGUSR2,SIGPOLL,SIGXCPU,SIGXFSZ,SIGFREEZE,SIGTHAW,SIGRTMIN,SIGRTMAX"' - followchild='-follow-child-processes=yes' - threads='-max-threads=50' - export PURIFYOPTIONS="$chainlen $ignoresigs $followchild $threads" - echo $PURIFYOPTIONS -} - -Note that you may want to crank -chain-length up even further. A -value of 20 should get you the entire stack up into the Python C code -in all situations. - -With the regression test on a fatly configured interpreter -(i.e. including as many modules as possible in your Modules/Setup -file), you'll probably get a gabillion UMR errors, and a few MLK -errors. I think most of these can be safely suppressed by putting the -following in your .purify file: - - suppress umr ...; "socketmodule.c" - suppress umr ...; time_strftime - suppress umr ...; "_dbmmodule.c" - suppress umr ...; "_gdbmmodule.c" - suppress umr ...; "grpmodule.c" - suppress umr ...; "nismodule.c" - suppress umr ...; "pwdmodule.c" - -Note: this list is very old and may not be accurate any longer. -It's possible some of these no longer need to be suppressed. -You will also need to suppress warnings (at least umr) -from Py_ADDRESS_IN_RANGE. - -This will still leave you with just a few UMR, mostly in the readline -library, which you can safely ignore. A lot of work has gone into -Python 1.5 to plug as many leaks as possible. - -Using Purify or Quantify in this way will give you coarse grained -reports on the whole Python interpreter. You can actually get more -fine grained control over both by linking with the optional `pure' -module, which exports (most of) the Purify and Quantify C API's into -Python. To link in this module (it must be statically linked), edit -your Modules/Setup file for your site, and rebuild the interpreter. -You might want to check out the comments in the Modules/puremodule.c -file for some idiosyncrasies. - -Using this module, you can actually profile or leak test a small -section of code, instead of the whole interpreter. Using this in -conjuction with pdb.py, dbx, or the profiler.py module really gives -you quite a bit of introspective power. - -Naturally there are a couple of caveats. This has only been tested -with Purify 4.0.1 and Quantify 2.1-beta on Solaris 2.5. Purify 4.0.1 -does not work with Solaris 2.6, but Purify 4.1 which reportedly will, -is currently in beta test. There are funky problems when Purify'ing a -Python interpreter build with threads. I've had a lot of problems -getting this to work, so I generally don't build with threads when I'm -Purify'ing. If you get this to work, let us know! 
- --Barry Warsaw Modified: python/branches/pep-3151/Misc/README ============================================================================== --- python/branches/pep-3151/Misc/README (original) +++ python/branches/pep-3151/Misc/README Sat Feb 26 08:16:32 2011 @@ -8,32 +8,19 @@ ---------------- ACKS Acknowledgements -AIX-NOTES Notes for building Python on AIX build.sh Script to build and test latest Python from the repository -cheatsheet Quick summary of Python by Ken Manheimer -developers.txt A history of who got developer permissions, and why gdbinit Handy stuff to put in your .gdbinit file, if you use gdb HISTORY News from previous releases -- oldest last indent.pro GNU indent profile approximating my C style -maintainers.rst A list of maintainers for library modules NEWS News for this release (for some meaning of "this") -NEWS.help How to edit NEWS Porting Mini-FAQ on porting to new platforms -PURIFY.README Information for Purify users -pymemcompat.h Memory interface compatibility file. python-config.in Python script template for python-config python.man UNIX man page for the python interpreter -python-mode.el Emacs mode for editing Python programs python.pc.in Package configuration info template for pkg-config -python-wing.wpr Wing IDE project file +python-wing*.wpr Wing IDE project file README The file you're reading now -README.coverity Information about running Coverity's Prevent on Python -README.klocwork Information about running Klocwork's K7 on Python -README.OpenBSD Help for building problems on OpenBSD README.valgrind Information for Valgrind users, see valgrind-python.supp -RFD Request For Discussion about a Python newsgroup RPM (Old) tools to build RPMs -setuid-prog.c C helper program for set-uid Python scripts SpecialBuilds.txt Describes extra symbols you can set for debug builds TextMate A TextMate bundle for Python development valgrind-python.supp Valgrind suppression file, see README.valgrind Deleted: python/branches/pep-3151/Misc/README.Emacs ============================================================================== --- python/branches/pep-3151/Misc/README.Emacs Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,32 +0,0 @@ -============= -Emacs support -============= - -If you want to edit Python code in Emacs, you should download python-mode.el -and install it somewhere on your load-path. See the project page to download: - - https://launchpad.net/python-mode - -While Emacs comes with a python.el file, it is not recommended. -python-mode.el is maintained by core Python developers and is generally -considered more Python programmer friendly. For example, python-mode.el -includes a killer feature called `pdbtrack` which allows you to set a pdb -breakpoint in your code, run your program in an Emacs shell buffer, and do gud -style debugging when the breakpoint is hit. - -python-mode.el is compatible with both GNU Emacs from the FSF, and XEmacs. - -For more information and bug reporting, see the above project page. For help, -development, or discussions, see the python-mode mailing list: - - http://mail.python.org/mailman/listinfo/python-mode - - -.. 
- Local Variables: - mode: indented-text - indent-tabs-mode: nil - sentence-end-double-space: t - fill-column: 78 - coding: utf-8 - End: Deleted: python/branches/pep-3151/Misc/README.OpenBSD ============================================================================== --- python/branches/pep-3151/Misc/README.OpenBSD Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,38 +0,0 @@ - -2005-01-08 - -If you are have a problem building on OpenBSD and see output like this -while running configure: - -checking curses.h presence... yes -configure: WARNING: curses.h: present but cannot be compiled -configure: WARNING: curses.h: check for missing prerequisite headers? -configure: WARNING: curses.h: see the Autoconf documentation -configure: WARNING: curses.h: section "Present But Cannot Be Compiled" -configure: WARNING: curses.h: proceeding with the preprocessor's result -configure: WARNING: curses.h: in the future, the compiler will take precedence - -there is likely a problem that will prevent building python. -If you see the messages above and are able to completely build python, -please tell python-dev at python.org indicating your version of OpenBSD -and any other relevant system configuration. - -The build error that occurs while making may look something like this: - - /usr/include/sys/event.h:53: error: syntax error before "u_int" - /usr/include/sys/event.h:55: error: syntax error before "u_short" - -To fix this problem, you will probably need update Python's configure -script to disable certain options. Search for a line that looks like: - - OpenBSD/2.* | OpenBSD/3.@<:@012345678@:>@) - -If your version is not in that list, e.g., 3.9, add the version -number. In this case, you would just need to add a 9 after the 8. -If you modify configure.in, you will need to regenerate configure -with autoconf. - -If your version is already in the list, this is not a known problem. -Please submit a bug report here: - - http://sourceforge.net/tracker/?group_id=5470&atid=105470 Deleted: python/branches/pep-3151/Misc/README.klocwork ============================================================================== --- python/branches/pep-3151/Misc/README.klocwork Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,30 +0,0 @@ - -Klocwork has a static analysis tool (K7) which is similar to Coverity. -They will run their tool on the Python source code on demand. -The results are available at: - - https://opensource.klocwork.com/ - -Currently, only Neal Norwitz has access to the analysis reports. Other -people can be added by request. - -K7 was first run on the Python 2.5 source code in mid-July 2006. -This is after Coverity had been making their results available. -There were originally 175 defects reported. Most of these -were false positives. However, there were numerous real issues -also uncovered. - -Each warning has a unique id and comments that can be made on it. -When checking in changes due to a K7 report, the unique id -as reported by the tool was added to the SVN commit message. -A comment was added to the K7 warning indicating the SVN revision -in addition to any analysis. - -False positives were also annotated so that the comments can -be reviewed and reversed if the analysis was incorrect. - -A second run was performed on 10-Aug-2006. The tool was tuned to remove -some false positives and perform some additional checks. ~150 new -warnings were produced, primarily related to dereferencing NULL pointers. - -Contact python-dev at python.org for more information. 
Deleted: python/branches/pep-3151/Misc/RFD ============================================================================== --- python/branches/pep-3151/Misc/RFD Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,114 +0,0 @@ -To: python-list -Subject: comp.lang.python RFD again -From: Guido.van.Rossum at cwi.nl - -I've followed the recent discussion and trimmed the blurb RFD down a bit -(and added the word "object-oriented" to the blurb). - -I don't think it's too early to *try* to create the newsgroup -- -whether we will succeed may depend on how many Python supporters there -are outside the mailing list. - -I'm personally not worried about moderation, and anyway I haven't -heard from any volunteers for moderation (and I won't volunteer -myself) so I suggest that we'll continue to ask for one unmoderated -newsgroup. - -My next action will be to post an updated FAQ (which will hint at the -upcoming RFD) to comp.lang.misc; then finalize the 1.0.0 release and -put it on the ftp site. I'll also try to get it into -comp.sources.unix or .misc. And all this before the end of January! - ---Guido van Rossum, CWI, Amsterdam -URL: - -====================================================================== - -These are the steps required (in case you don't know about the -newsgroup creation process): - -First, we need to draw up an RFD (Request For Discussion). This is a -document that tells what the purpose of the group is, and gives a case -for its creation. We post this to relevant groups (comp.lang.misc, -the mailing list, news.groups, etc.) Discussion is held on -news.groups. - -Then, after a few weeks, we run the official CFV (Call For Votes). -The votes are then collected over a period of weeks. We need 100 more -yes votes than no votes, and a 2/3 majority, to get the group. - -There are some restrictions on the vote taker: [s]he cannot actively -campaign for/against the group during the vote process. So the main -benefit to Steve instead of me running the vote is that I will be free -to campaign for its creation! - -The following is our current draft for the RFD. - -====================================================================== - -Request For Discussion: comp.lang.python - - -Purpose -------- - -The newsgroup will be for discussion on the Python computer language. -Possible topics include requests for information, general programming, -development, and bug reports. The group will be unmoderated. - - -What is Python? ---------------- - -Python is a relatively new very-high-level language developed in -Amsterdam. Python is a simple, object-oriented procedural language, -with features taken from ABC, Icon, Modula-3, and C/C++. - -Its central goal is to provide the best of both worlds: the dynamic -nature of scripting languages like Perl/TCL/REXX, but also support for -general programming found in the more traditional languages like Icon, -C, Modula,... - -Python may be FTP'd from the following sites: - - ftp.cwi.nl in directory /pub/python (its "home site", also has a FAQ) - ftp.uu.net in directory /languages/python - gatekeeper.dec.com in directory /pub/plan/python/cwi - - -Rationale ---------- - -Currently there is a mailing list with over 130 subscribers. -The activity of this list is high, and to make handling the -traffic more reasonable, a newsgroup is being proposed. We -also feel that comp.lang.misc would not be a suitable forum -for this volume of discussion on a particular language. 
- - -Charter -------- - -Comp.lang.python is an unmoderated newsgroup which will serve -as a forum for discussing the Python computer language. The -group will serve both those who just program in Python and -those who work on developing the language. Topics that -may be discussed include: - - - announcements of new versions of the language and - applications written in Python. - - - discussion on the internals of the Python language. - - - general information about the language. - - - discussion on programming in Python. - - -Discussion ----------- - -Any objections to this RFD will be considered and, if determined -to be appropriate, will be incorporated. The discussion period -will be for a period of 21 days after which the first CFV will be -issued. Deleted: python/branches/pep-3151/Misc/RPM/python-3.2.spec ============================================================================== --- python/branches/pep-3151/Misc/RPM/python-3.2.spec Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,390 +0,0 @@ -########################## -# User-modifiable configs -########################## - -# Is the resulting package and the installed binary named "python" or -# "python2"? -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_binsuffix none -%define config_binsuffix 2.6 - -# Build tkinter? "auto" enables it if /usr/bin/wish exists. -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_tkinter no -%define config_tkinter yes -%define config_tkinter auto - -# Use pymalloc? The last line (commented or not) determines wether -# pymalloc is used. -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_pymalloc no -%define config_pymalloc yes - -# Enable IPV6? -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_ipv6 yes -%define config_ipv6 no - -# Build shared libraries or .a library? -#WARNING: Commenting out doesn't work. Last line is what's used. -%define config_sharedlib no -%define config_sharedlib yes - -# Location of the HTML directory. -%define config_htmldir /var/www/html/python - -################################# -# End of user-modifiable configs -################################# - -%define name python -#--start constants-- -%define version 3.2a3 -%define libvers 3.2 -#--end constants-- -%define release 1pydotorg -%define __prefix /usr - -# kludge to get around rpm define weirdness -%define ipv6 %(if [ "%{config_ipv6}" = yes ]; then echo --enable-ipv6; else echo --disable-ipv6; fi) -%define pymalloc %(if [ "%{config_pymalloc}" = yes ]; then echo --with-pymalloc; else echo --without-pymalloc; fi) -%define binsuffix %(if [ "%{config_binsuffix}" = none ]; then echo ; else echo "%{config_binsuffix}"; fi) -%define include_tkinter %(if [ \\( "%{config_tkinter}" = auto -a -f /usr/bin/wish \\) -o "%{config_tkinter}" = yes ]; then echo 1; else echo 0; fi) -%define libdirname %(( uname -m | egrep -q '_64$' && [ -d /usr/lib64 ] && echo lib64 ) || echo lib) -%define sharedlib %(if [ "%{config_sharedlib}" = yes ]; then echo --enable-shared; else echo ; fi) -%define include_sharedlib %(if [ "%{config_sharedlib}" = yes ]; then echo 1; else echo 0; fi) - -# detect if documentation is available -%define include_docs %(if [ -f "%{_sourcedir}/html-%{version}.tar.bz2" ]; then echo 1; else echo 0; fi) - -Summary: An interpreted, interactive, object-oriented programming language. 
-Name: %{name}%{binsuffix} -Version: %{version} -Release: %{release} -License: PSF -Group: Development/Languages -Source: Python-%{version}.tar.bz2 -%if %{include_docs} -Source1: html-%{version}.tar.bz2 -%endif -BuildRoot: %{_tmppath}/%{name}-%{version}-root -BuildPrereq: expat-devel -BuildPrereq: db4-devel -BuildPrereq: gdbm-devel -BuildPrereq: sqlite-devel -Prefix: %{__prefix} -Packager: Sean Reifschneider - -%description -Python is an interpreted, interactive, object-oriented programming -language. It incorporates modules, exceptions, dynamic typing, very high -level dynamic data types, and classes. Python combines remarkable power -with very clear syntax. It has interfaces to many system calls and -libraries, as well as to various window systems, and is extensible in C or -C++. It is also usable as an extension language for applications that need -a programmable interface. Finally, Python is portable: it runs on many -brands of UNIX, on PCs under Windows, MS-DOS, and OS/2, and on the -Mac. - -%package devel -Summary: The libraries and header files needed for Python extension development. -Prereq: python%{binsuffix} = %{PACKAGE_VERSION} -Group: Development/Libraries - -%description devel -The Python programming language's interpreter can be extended with -dynamically loaded extensions and can be embedded in other programs. -This package contains the header files and libraries needed to do -these types of tasks. - -Install python-devel if you want to develop Python extensions. The -python package will also need to be installed. You'll probably also -want to install the python-docs package, which contains Python -documentation. - -%if %{include_tkinter} -%package tkinter -Summary: A graphical user interface for the Python scripting language. -Group: Development/Languages -Prereq: python%{binsuffix} = %{PACKAGE_VERSION}-%{release} - -%description tkinter -The Tkinter (Tk interface) program is an graphical user interface for -the Python scripting language. - -You should install the tkinter package if you'd like to use a graphical -user interface for Python programming. -%endif - -%package tools -Summary: A collection of development tools included with Python. -Group: Development/Tools -Prereq: python%{binsuffix} = %{PACKAGE_VERSION}-%{release} - -%description tools -The Python package includes several development tools that are used -to build python programs. This package contains a selection of those -tools, including the IDLE Python IDE. - -Install python-tools if you want to use these tools to develop -Python programs. You will also need to install the python and -tkinter packages. - -%if %{include_docs} -%package docs -Summary: Python-related documentation. -Group: Development/Documentation - -%description docs -Documentation relating to the Python programming language in HTML and info -formats. -%endif - -%changelog -* Mon Dec 20 2004 Sean Reifschneider [2.4-2pydotorg] -- Changing the idle wrapper so that it passes arguments to idle. - -* Tue Oct 19 2004 Sean Reifschneider [2.4b1-1pydotorg] -- Updating to 2.4. - -* Thu Jul 22 2004 Sean Reifschneider [2.3.4-3pydotorg] -- Paul Tiemann fixes for %{prefix}. -- Adding permission changes for directory as suggested by reimeika.ca -- Adding code to detect when it should be using lib64. -- Adding a define for the location of /var/www/html for docs. - -* Thu May 27 2004 Sean Reifschneider [2.3.4-2pydotorg] -- Including changes from Ian Holsman to build under Red Hat 7.3. -- Fixing some problems with the /usr/local path change. 
- -* Sat Mar 27 2004 Sean Reifschneider [2.3.2-3pydotorg] -- Being more agressive about finding the paths to fix for - #!/usr/local/bin/python. - -* Sat Feb 07 2004 Sean Reifschneider [2.3.3-2pydotorg] -- Adding code to remove "#!/usr/local/bin/python" from particular files and - causing the RPM build to terminate if there are any unexpected files - which have that line in them. - -* Mon Oct 13 2003 Sean Reifschneider [2.3.2-1pydotorg] -- Adding code to detect wether documentation is available to build. - -* Fri Sep 19 2003 Sean Reifschneider [2.3.1-1pydotorg] -- Updating to the 2.3.1 release. - -* Mon Feb 24 2003 Sean Reifschneider [2.3b1-1pydotorg] -- Updating to 2.3b1 release. - -* Mon Feb 17 2003 Sean Reifschneider [2.3a1-1] -- Updating to 2.3 release. - -* Sun Dec 23 2001 Sean Reifschneider -[Release 2.2-2] -- Added -docs package. -- Added "auto" config_tkinter setting which only enables tk if - /usr/bin/wish exists. - -* Sat Dec 22 2001 Sean Reifschneider -[Release 2.2-1] -- Updated to 2.2. -- Changed the extension to "2" from "2.2". - -* Tue Nov 18 2001 Sean Reifschneider -[Release 2.2c1-1] -- Updated to 2.2c1. - -* Thu Nov 1 2001 Sean Reifschneider -[Release 2.2b1-3] -- Changed the way the sed for fixing the #! in pydoc works. - -* Wed Oct 24 2001 Sean Reifschneider -[Release 2.2b1-2] -- Fixed missing "email" package, thanks to anonymous report on sourceforge. -- Fixed missing "compiler" package. - -* Mon Oct 22 2001 Sean Reifschneider -[Release 2.2b1-1] -- Updated to 2.2b1. - -* Mon Oct 9 2001 Sean Reifschneider -[Release 2.2a4-4] -- otto at balinor.mat.unimi.it mentioned that the license file is missing. - -* Sun Sep 30 2001 Sean Reifschneider -[Release 2.2a4-3] -- Ignacio Vazquez-Abrams pointed out that I had a spruious double-quote in - the spec files. Thanks. - -* Wed Jul 25 2001 Sean Reifschneider -[Release 2.2a1-1] -- Updated to 2.2a1 release. -- Changed idle and pydoc to use binsuffix macro - -####### -# PREP -####### -%prep -%setup -n Python-%{version} - -######## -# BUILD -######## -%build -echo "Setting for ipv6: %{ipv6}" -echo "Setting for pymalloc: %{pymalloc}" -echo "Setting for binsuffix: %{binsuffix}" -echo "Setting for include_tkinter: %{include_tkinter}" -echo "Setting for libdirname: %{libdirname}" -echo "Setting for sharedlib: %{sharedlib}" -echo "Setting for include_sharedlib: %{include_sharedlib}" -./configure --enable-unicode=ucs4 %{sharedlib} %{ipv6} %{pymalloc} --prefix=%{__prefix} -make - -########## -# INSTALL -########## -%install -# set the install path -echo '[install_scripts]' >setup.cfg -echo 'install_dir='"${RPM_BUILD_ROOT}%{__prefix}/bin" >>setup.cfg - -[ -d "$RPM_BUILD_ROOT" -a "$RPM_BUILD_ROOT" != "/" ] && rm -rf $RPM_BUILD_ROOT -mkdir -p $RPM_BUILD_ROOT%{__prefix}/%{libdirname}/python%{libvers}/lib-dynload -make prefix=$RPM_BUILD_ROOT%{__prefix} install - -# REPLACE PATH IN PYDOC -if [ ! -z "%{binsuffix}" ] -then - ( - cd $RPM_BUILD_ROOT%{__prefix}/bin - mv pydoc pydoc.old - sed 's|#!.*|#!%{__prefix}/bin/env python'%{binsuffix}'|' \ - pydoc.old >pydoc - chmod 755 pydoc - rm -f pydoc.old - ) -fi - -# add the binsuffix -if [ ! 
-z "%{binsuffix}" ] -then - rm -f $RPM_BUILD_ROOT%{__prefix}/bin/python[0-9a-zA-Z]* - ( cd $RPM_BUILD_ROOT%{__prefix}/bin; - for file in *; do mv "$file" "$file"%{binsuffix}; done ) - ( cd $RPM_BUILD_ROOT%{_mandir}/man1; mv python.1 python%{binsuffix}.1 ) -fi - -######## -# Tools -echo '#!%{__prefix}/bin/env python%{binsuffix}' >${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'import os, sys' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'os.execvp("%{__prefix}/bin/python%{binsuffix}", ["%{__prefix}/bin/python%{binsuffix}", "%{__prefix}/lib/python%{libvers}/idlelib/idle.py"] + sys.argv[1:])' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'print "Failed to exec Idle"' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -echo 'sys.exit(1)' >>${RPM_BUILD_ROOT}%{__prefix}/bin/idle%{binsuffix} -chmod 755 $RPM_BUILD_ROOT%{__prefix}/bin/idle%{binsuffix} -cp -a Tools $RPM_BUILD_ROOT%{__prefix}/%{libdirname}/python%{libvers} - -# MAKE FILE LISTS -rm -f mainpkg.files -find "$RPM_BUILD_ROOT""%{__prefix}"/%{libdirname}/python%{libvers} -type f | - sed "s|^${RPM_BUILD_ROOT}|/|" | - grep -v -e '/python%{libvers}/config$' -e '_tkinter.so$' >mainpkg.files -find "$RPM_BUILD_ROOT""%{__prefix}"/bin -type f -o -type l | - sed "s|^${RPM_BUILD_ROOT}|/|" | - grep -v -e '/bin/2to3%{binsuffix}$' | - grep -v -e '/bin/pydoc%{binsuffix}$' | - grep -v -e '/bin/smtpd.py%{binsuffix}$' | - grep -v -e '/bin/idle%{binsuffix}$' >>mainpkg.files - -rm -f tools.files -find "$RPM_BUILD_ROOT""%{__prefix}"/%{libdirname}/python%{libvers}/idlelib \ - "$RPM_BUILD_ROOT""%{__prefix}"/%{libdirname}/python%{libvers}/Tools -type f | - sed "s|^${RPM_BUILD_ROOT}|/|" >tools.files -echo "%{__prefix}"/bin/2to3%{binsuffix} >>tools.files -echo "%{__prefix}"/bin/pydoc%{binsuffix} >>tools.files -echo "%{__prefix}"/bin/smtpd.py%{binsuffix} >>tools.files -echo "%{__prefix}"/bin/idle%{binsuffix} >>tools.files - -###### -# Docs -%if %{include_docs} -mkdir -p "$RPM_BUILD_ROOT"%{config_htmldir} -( - cd "$RPM_BUILD_ROOT"%{config_htmldir} - bunzip2 < %{SOURCE1} | tar x -) -%endif - -# fix the #! line in installed files -find "$RPM_BUILD_ROOT" -type f -print0 | - xargs -0 grep -l /usr/local/bin/python | while read file -do - FIXFILE="$file" - sed 's|^#!.*python|#!%{__prefix}/bin/env python'"%{binsuffix}"'|' \ - "$FIXFILE" >/tmp/fix-python-path.$$ - cat /tmp/fix-python-path.$$ >"$FIXFILE" - rm -f /tmp/fix-python-path.$$ -done - -# check to see if there are any straggling #! lines -find "$RPM_BUILD_ROOT" -type f | xargs egrep -n '^#! */usr/local/bin/python' \ - | grep ':1:#!' >/tmp/python-rpm-files.$$ || true -if [ -s /tmp/python-rpm-files.$$ ] -then - echo '*****************************************************' - cat /tmp/python-rpm-files.$$ - cat <<@EOF - ***************************************************** - There are still files referencing /usr/local/bin/python in the - install directory. They are listed above. Please fix the .spec - file and try again. If you are an end-user, you probably want - to report this to jafo-rpms at tummy.com as well. 
- ***************************************************** - at EOF - rm -f /tmp/python-rpm-files.$$ - exit 1 -fi -rm -f /tmp/python-rpm-files.$$ - -######## -# CLEAN -######## -%clean -[ -n "$RPM_BUILD_ROOT" -a "$RPM_BUILD_ROOT" != / ] && rm -rf $RPM_BUILD_ROOT -rm -f mainpkg.files tools.files - -######## -# FILES -######## -%files -f mainpkg.files -%defattr(-,root,root) -%doc Misc/README Misc/cheatsheet Misc/Porting -%doc LICENSE Misc/ACKS Misc/HISTORY Misc/NEWS -%{_mandir}/man1/python%{binsuffix}.1* - -%attr(755,root,root) %dir %{__prefix}/include/python%{libvers} -%attr(755,root,root) %dir %{__prefix}/%{libdirname}/python%{libvers}/ -%if %{include_sharedlib} -%{__prefix}/%{libdirname}/libpython* -%endif - -%files devel -%defattr(-,root,root) -%{__prefix}/include/python%{libvers}/*.h -%{__prefix}/%{libdirname}/python%{libvers}/config - -%files -f tools.files tools -%defattr(-,root,root) - -%if %{include_tkinter} -%files tkinter -%defattr(-,root,root) -%{__prefix}/%{libdirname}/python%{libvers}/tkinter -%{__prefix}/%{libdirname}/python%{libvers}/lib-dynload/_tkinter.so* -%endif - -%if %{include_docs} -%files docs -%defattr(-,root,root) -%{config_htmldir}/* -%endif Modified: python/branches/pep-3151/Misc/SpecialBuilds.txt ============================================================================== --- python/branches/pep-3151/Misc/SpecialBuilds.txt (original) +++ python/branches/pep-3151/Misc/SpecialBuilds.txt Sat Feb 26 08:16:32 2011 @@ -1,17 +1,20 @@ -This file describes some special Python build types enabled via -compile-time preprocessor defines. +This file describes some special Python build types enabled via compile-time +preprocessor defines. -It is best to define these options in the EXTRA_CFLAGS make variable; +IMPORTANT: if you want to build a debug-enabled Python, it is recommended that +you use ``./configure --with-pydebug``, rather than the options listed here. + +However, if you wish to define some of these options individually, it is best +to define them in the EXTRA_CFLAGS make variable; ``make EXTRA_CFLAGS="-DPy_REF_DEBUG"``. ---------------------------------------------------------------------------- -Py_REF_DEBUG introduced in 1.4 - named REF_DEBUG before 1.4 - -Turn on aggregate reference counting. This arranges that extern -_Py_RefTotal hold a count of all references, the sum of ob_refcnt across -all objects. In a debug-mode build, this is where the "8288" comes from -in + +Py_REF_DEBUG +------------ + +Turn on aggregate reference counting. This arranges that extern _Py_RefTotal +hold a count of all references, the sum of ob_refcnt across all objects. In a +debug-mode build, this is where the "8288" comes from in >>> 23 23 @@ -19,75 +22,72 @@ >>> Note that if this count increases when you're not storing away new objects, -there's probably a leak. Remember, though, that in interactive mode the -special name "_" holds a reference to the last result displayed! +there's probably a leak. Remember, though, that in interactive mode the special +name "_" holds a reference to the last result displayed! -Py_REF_DEBUG also checks after every decref to verify that the refcount -hasn't gone negative, and causes an immediate fatal error if it has. +Py_REF_DEBUG also checks after every decref to verify that the refcount hasn't +gone negative, and causes an immediate fatal error if it has. Special gimmicks: sys.gettotalrefcount() Return current total of all refcounts. - Available under Py_REF_DEBUG in Python 2.3. - Before 2.3, Py_TRACE_REFS was required to enable this function. 
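One common way to use the Py_REF_DEBUG gimmick described here is to compare sys.gettotalrefcount() before and after repeated calls of the code under suspicion. The following is only a minimal sketch: it assumes a --with-pydebug build (sys.gettotalrefcount() does not exist in release builds), and check_leaks() is an illustrative helper name, not a stdlib API:

    import sys

    def check_leaks(func, repeat=10):
        # Only Py_REF_DEBUG (debug) builds provide sys.gettotalrefcount().
        if not hasattr(sys, "gettotalrefcount"):
            raise RuntimeError("requires a --with-pydebug build of Python")
        func()                      # warm-up call so one-time caches don't skew the count
        before = sys.gettotalrefcount()
        for _ in range(repeat):
            func()
        delta = sys.gettotalrefcount() - before
        # A delta that keeps growing roughly in proportion to `repeat`
        # suggests that func() leaks references.
        print("total refcount delta after", repeat, "calls:", delta)

    check_leaks(lambda: sorted(range(100)))

regrtest's -R option automates the same idea across whole test files.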
---------------------------------------------------------------------------- -Py_TRACE_REFS introduced in 1.4 - named TRACE_REFS before 1.4 - -Turn on heavy reference debugging. This is major surgery. Every PyObject -grows two more pointers, to maintain a doubly-linked list of all live -heap-allocated objects. Most built-in type objects are not in this list, -as they're statically allocated. Starting in Python 2.3, if COUNT_ALLOCS -(see below) is also defined, a static type object T does appear in this -list if at least one object of type T has been created. + + +Py_TRACE_REFS +------------- + +Turn on heavy reference debugging. This is major surgery. Every PyObject grows +two more pointers, to maintain a doubly-linked list of all live heap-allocated +objects. Most built-in type objects are not in this list, as they're statically +allocated. Starting in Python 2.3, if COUNT_ALLOCS (see below) is also defined, +a static type object T does appear in this list if at least one object of type T +has been created. Note that because the fundamental PyObject layout changes, Python modules -compiled with Py_TRACE_REFS are incompatible with modules compiled without -it. +compiled with Py_TRACE_REFS are incompatible with modules compiled without it. Py_TRACE_REFS implies Py_REF_DEBUG. Special gimmicks: sys.getobjects(max[, type]) - Return list of the (no more than) max most-recently allocated objects, - most recently allocated first in the list, least-recently allocated - last in the list. max=0 means no limit on list length. - If an optional type object is passed, the list is also restricted to - objects of that type. - The return list itself, and some temp objects created just to call - sys.getobjects(), are excluded from the return list. Note that the - list returned is just another object, though, so may appear in the - return list the next time you call getobjects(); note that every - object in the list is kept alive too, simply by virtue of being in - the list. - -envar PYTHONDUMPREFS - If this envar exists, Py_Finalize() arranges to print a list of - all still-live heap objects. This is printed twice, in different - formats, before and after Py_Finalize has cleaned up everything it - can clean up. The first output block produces the repr() of each - object so is more informative; however, a lot of stuff destined to - die is still alive then. The second output block is much harder - to work with (repr() can't be invoked anymore -- the interpreter - has been torn down too far), but doesn't list any objects that will - die. The tool script combinerefs.py can be run over this to combine - the info from both output blocks. The second output block, and + Return list of the (no more than) max most-recently allocated objects, most + recently allocated first in the list, least-recently allocated last in the + list. max=0 means no limit on list length. If an optional type object is + passed, the list is also restricted to objects of that type. The return + list itself, and some temp objects created just to call sys.getobjects(), + are excluded from the return list. Note that the list returned is just + another object, though, so may appear in the return list the next time you + call getobjects(); note that every object in the list is kept alive too, + simply by virtue of being in the list. + +envvar PYTHONDUMPREFS + If this envvar exists, Py_Finalize() arranges to print a list of all + still-live heap objects. 
This is printed twice, in different formats, + before and after Py_Finalize has cleaned up everything it can clean up. The + first output block produces the repr() of each object so is more + informative; however, a lot of stuff destined to die is still alive then. + The second output block is much harder to work with (repr() can't be invoked + anymore -- the interpreter has been torn down too far), but doesn't list any + objects that will die. The tool script combinerefs.py can be run over this + to combine the info from both output blocks. The second output block, and combinerefs.py, were new in Python 2.3b1. ---------------------------------------------------------------------------- -PYMALLOC_DEBUG introduced in 2.3 + + +PYMALLOC_DEBUG +-------------- When pymalloc is enabled (WITH_PYMALLOC is defined), calls to the PyObject_ -memory routines are handled by Python's own small-object allocator, while -calls to the PyMem_ memory routines are directed to the system malloc/ -realloc/free. If PYMALLOC_DEBUG is also defined, calls to both PyObject_ -and PyMem_ memory routines are directed to a special debugging mode of -Python's small-object allocator. - -This mode fills dynamically allocated memory blocks with special, -recognizable bit patterns, and adds debugging info on each end of -dynamically allocated memory blocks. The special bit patterns are: +memory routines are handled by Python's own small-object allocator, while calls +to the PyMem_ memory routines are directed to the system malloc/ realloc/free. +If PYMALLOC_DEBUG is also defined, calls to both PyObject_ and PyMem_ memory +routines are directed to a special debugging mode of Python's small-object +allocator. + +This mode fills dynamically allocated memory blocks with special, recognizable +bit patterns, and adds debugging info on each end of dynamically allocated +memory blocks. The special bit patterns are: #define CLEANBYTE 0xCB /* clean (newly allocated) memory */ #define DEADBYTE 0xDB /* dead (newly freed) memory */ @@ -96,73 +96,70 @@ Strings of these bytes are unlikely to be valid addresses, floats, or 7-bit ASCII strings. -Let S = sizeof(size_t). 2*S bytes are added at each end of each block of N -bytes requested. The memory layout is like so, where p represents the -address returned by a malloc-like or realloc-like function (p[i:j] means -the slice of bytes from *(p+i) inclusive up to *(p+j) exclusive; note that -the treatment of negative indices differs from a Python slice): +Let S = sizeof(size_t). 2*S bytes are added at each end of each block of N bytes +requested. The memory layout is like so, where p represents the address +returned by a malloc-like or realloc-like function (p[i:j] means the slice of +bytes from *(p+i) inclusive up to *(p+j) exclusive; note that the treatment of +negative indices differs from a Python slice): p[-2*S:-S] - Number of bytes originally asked for. This is a size_t, big-endian - (easier to read in a memory dump). + Number of bytes originally asked for. This is a size_t, big-endian (easier + to read in a memory dump). p[-S:0] Copies of FORBIDDENBYTE. Used to catch under- writes and reads. p[0:N] The requested memory, filled with copies of CLEANBYTE, used to catch - reference to uninitialized memory. - When a realloc-like function is called requesting a larger memory - block, the new excess bytes are also filled with CLEANBYTE. - When a free-like function is called, these are overwritten with - DEADBYTE, to catch reference to freed memory. 
When a realloc- - like function is called requesting a smaller memory block, the excess - old bytes are also filled with DEADBYTE. + reference to uninitialized memory. When a realloc-like function is called + requesting a larger memory block, the new excess bytes are also filled with + CLEANBYTE. When a free-like function is called, these are overwritten with + DEADBYTE, to catch reference to freed memory. When a realloc- like function + is called requesting a smaller memory block, the excess old bytes are also + filled with DEADBYTE. p[N:N+S] Copies of FORBIDDENBYTE. Used to catch over- writes and reads. p[N+S:N+2*S] A serial number, incremented by 1 on each call to a malloc-like or - realloc-like function. - Big-endian size_t. - If "bad memory" is detected later, the serial number gives an - excellent way to set a breakpoint on the next run, to capture the - instant at which this block was passed out. The static function - bumpserialno() in obmalloc.c is the only place the serial number - is incremented, and exists so you can set such a breakpoint easily. - -A realloc-like or free-like function first checks that the FORBIDDENBYTEs -at each end are intact. If they've been altered, diagnostic output is -written to stderr, and the program is aborted via Py_FatalError(). The -other main failure mode is provoking a memory error when a program -reads up one of the special bit patterns and tries to use it as an address. -If you get in a debugger then and look at the object, you're likely -to see that it's entirely filled with 0xDB (meaning freed memory is -getting used) or 0xCB (meaning uninitialized memory is getting used). + realloc-like function. Big-endian size_t. If "bad memory" is detected + later, the serial number gives an excellent way to set a breakpoint on the + next run, to capture the instant at which this block was passed out. The + static function bumpserialno() in obmalloc.c is the only place the serial + number is incremented, and exists so you can set such a breakpoint easily. + +A realloc-like or free-like function first checks that the FORBIDDENBYTEs at +each end are intact. If they've been altered, diagnostic output is written to +stderr, and the program is aborted via Py_FatalError(). The other main failure +mode is provoking a memory error when a program reads up one of the special bit +patterns and tries to use it as an address. If you get in a debugger then and +look at the object, you're likely to see that it's entirely filled with 0xDB +(meaning freed memory is getting used) or 0xCB (meaning uninitialized memory is +getting used). Note that PYMALLOC_DEBUG requires WITH_PYMALLOC. Special gimmicks: -envar PYTHONMALLOCSTATS - If this envar exists, a report of pymalloc summary statistics is - printed to stderr whenever a new arena is allocated, and also - by Py_Finalize(). +envvar PYTHONMALLOCSTATS + If this envvar exists, a report of pymalloc summary statistics is printed to + stderr whenever a new arena is allocated, and also by Py_Finalize(). Changed in 2.5: The number of extra bytes allocated is 4*sizeof(size_t). Before it was 16 on all boxes, reflecting that Python couldn't make use of allocations >= 2**32 bytes even on 64-bit boxes before 2.5. ---------------------------------------------------------------------------- -Py_DEBUG introduced in 1.5 - named DEBUG before 1.5 + + +Py_DEBUG +-------- This is what is generally meant by "a debug build" of Python. -Py_DEBUG implies LLTRACE, Py_REF_DEBUG, Py_TRACE_REFS, and -PYMALLOC_DEBUG (if WITH_PYMALLOC is enabled). 
In addition, C -assert()s are enabled (via the C way: by not defining NDEBUG), and -some routines do additional sanity checks inside "#ifdef Py_DEBUG" -blocks. ---------------------------------------------------------------------------- -COUNT_ALLOCS introduced in 0.9.9 - partly broken in 2.2 and 2.2.1 +Py_DEBUG implies LLTRACE, Py_REF_DEBUG, Py_TRACE_REFS, and PYMALLOC_DEBUG (if +WITH_PYMALLOC is enabled). In addition, C assert()s are enabled (via the C way: +by not defining NDEBUG), and some routines do additional sanity checks inside +"#ifdef Py_DEBUG" blocks. + + +COUNT_ALLOCS +------------ Each type object grows three new members: @@ -178,84 +175,85 @@ */ int tp_maxalloc; -Allocation and deallocation code keeps these counts up to date. -Py_Finalize() displays a summary of the info returned by sys.getcounts() -(see below), along with assorted other special allocation counts (like -the number of tuple allocations satisfied by a tuple free-list, the number -of 1-character strings allocated, etc). +Allocation and deallocation code keeps these counts up to date. Py_Finalize() +displays a summary of the info returned by sys.getcounts() (see below), along +with assorted other special allocation counts (like the number of tuple +allocations satisfied by a tuple free-list, the number of 1-character strings +allocated, etc). Before Python 2.2, type objects were immortal, and the COUNT_ALLOCS -implementation relies on that. As of Python 2.2, heap-allocated type/ -class objects can go away. COUNT_ALLOCS can blow up in 2.2 and 2.2.1 -because of this; this was fixed in 2.2.2. Use of COUNT_ALLOCS makes -all heap-allocated type objects immortal, except for those for which no -object of that type is ever allocated. +implementation relies on that. As of Python 2.2, heap-allocated type/ class +objects can go away. COUNT_ALLOCS can blow up in 2.2 and 2.2.1 because of this; +this was fixed in 2.2.2. Use of COUNT_ALLOCS makes all heap-allocated type +objects immortal, except for those for which no object of that type is ever +allocated. Starting with Python 2.3, If Py_TRACE_REFS is also defined, COUNT_ALLOCS -arranges to ensure that the type object for each allocated object -appears in the doubly-linked list of all objects maintained by -Py_TRACE_REFS. +arranges to ensure that the type object for each allocated object appears in the +doubly-linked list of all objects maintained by Py_TRACE_REFS. Special gimmicks: sys.getcounts() - Return a list of 4-tuples, one entry for each type object for which - at least one object of that type was allocated. Each tuple is of - the form: + Return a list of 4-tuples, one entry for each type object for which at least + one object of that type was allocated. Each tuple is of the form: (tp_name, tp_allocs, tp_frees, tp_maxalloc) - Each distinct type object gets a distinct entry in this list, even - if two or more type objects have the same tp_name (in which case - there's no way to distinguish them by looking at this list). The - list is ordered by time of first object allocation: the type object - for which the first allocation of an object of that type occurred - most recently is at the front of the list. ---------------------------------------------------------------------------- -LLTRACE introduced well before 1.0 + Each distinct type object gets a distinct entry in this list, even if two or + more type objects have the same tp_name (in which case there's no way to + distinguish them by looking at this list). 
The list is ordered by time of + first object allocation: the type object for which the first allocation of + an object of that type occurred most recently is at the front of the list. + + +LLTRACE +------- Compile in support for Low Level TRACE-ing of the main interpreter loop. -When this preprocessor symbol is defined, before PyEval_EvalFrame -(eval_frame in 2.3 and 2.2, eval_code2 before that) executes a frame's code -it checks the frame's global namespace for a variable "__lltrace__". If -such a variable is found, mounds of information about what the interpreter -is doing are sprayed to stdout, such as every opcode and opcode argument -and values pushed onto and popped off the value stack. +When this preprocessor symbol is defined, before PyEval_EvalFrame (eval_frame in +2.3 and 2.2, eval_code2 before that) executes a frame's code it checks the +frame's global namespace for a variable "__lltrace__". If such a variable is +found, mounds of information about what the interpreter is doing are sprayed to +stdout, such as every opcode and opcode argument and values pushed onto and +popped off the value stack. Not useful very often, but very useful when needed. ---------------------------------------------------------------------------- -CALL_PROFILE introduced for Python 2.3 + +CALL_PROFILE +------------ Count the number of function calls executed. -When this symbol is defined, the ceval mainloop and helper functions -count the number of function calls made. It keeps detailed statistics -about what kind of object was called and whether the call hit any of -the special fast paths in the code. +When this symbol is defined, the ceval mainloop and helper functions count the +number of function calls made. It keeps detailed statistics about what kind of +object was called and whether the call hit any of the special fast paths in the +code. + ---------------------------------------------------------------------------- -WITH_TSC introduced for Python 2.4 +WITH_TSC +-------- -Super-lowlevel profiling of the interpreter. When enabled, the sys -module grows a new function: +Super-lowlevel profiling of the interpreter. When enabled, the sys module grows +a new function: settscdump(bool) - If true, tell the Python interpreter to dump VM measurements to - stderr. If false, turn off dump. The measurements are based on the - processor's time-stamp counter. - -This build option requires a small amount of platform specific code. -Currently this code is present for linux/x86 and any PowerPC platform -that uses GCC (i.e. OS X and linux/ppc). - -On the PowerPC the rate at which the time base register is incremented -is not defined by the architecture specification, so you'll need to -find the manual for your specific processor. For the 750CX, 750CXe -and 750FX (all sold as the G3) we find: + If true, tell the Python interpreter to dump VM measurements to stderr. If + false, turn off dump. The measurements are based on the processor's + time-stamp counter. + +This build option requires a small amount of platform specific code. Currently +this code is present for linux/x86 and any PowerPC platform that uses GCC +(i.e. OS X and linux/ppc). + +On the PowerPC the rate at which the time base register is incremented is not +defined by the architecture specification, so you'll need to find the manual for +your specific processor. For the 750CX, 750CXe and 750FX (all sold as the G3) +we find: - The time base counter is clocked at a frequency that is - one-fourth that of the bus clock. 
+ The time base counter is clocked at a frequency that is one-fourth that of + the bus clock. This build is enabled by the --with-tsc flag to configure. Deleted: python/branches/pep-3151/Misc/cheatsheet ============================================================================== --- python/branches/pep-3151/Misc/cheatsheet Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,2179 +0,0 @@ - Python 2.3 Quick Reference - - - 25 Jan 2003 upgraded by Raymond Hettinger for Python 2.3 - 16 May 2001 upgraded by Richard Gruet and Simon Brunning for Python 2.0 - 2000/07/18 upgraded by Richard Gruet, rgruet at intraware.com for Python 1.5.2 -from V1.3 ref -1995/10/30, by Chris Hoffmann, choffman at vicorp.com - -Based on: - Python Bestiary, Author: Ken Manheimer, ken.manheimer at nist.gov - Python manuals, Authors: Guido van Rossum and Fred Drake - What's new in Python 2.0, Authors: A.M. Kuchling and Moshe Zadka - python-mode.el, Author: Tim Peters, tim_one at email.msn.com - - and the readers of comp.lang.python - -Python's nest: http://www.python.org Developement: http:// -python.sourceforge.net/ ActivePython : http://www.ActiveState.com/ASPN/ -Python/ -newsgroup: comp.lang.python Help desk: help at python.org -Resources: http://starship.python.net/ - http://www.vex.net/parnassus/ - http://aspn.activestate.com/ASPN/Cookbook/Python -FAQ: http://www.python.org/cgi-bin/faqw.py -Full documentation: http://www.python.org/doc/ -Excellent reference books: - Python Essential Reference by David Beazley (New Riders) - Python Pocket Reference by Mark Lutz (O'Reilly) - - -Invocation Options - -python [-diOStuUvxX?] [-c command | script | - ] [args] - - Invocation Options -Option Effect --c cmd program passed in as string (terminates option list) --d Outputs parser debugging information (also PYTHONDEBUG=x) --E ignore environment variables (such as PYTHONPATH) --h print this help message and exit --i Inspect interactively after running script (also PYTHONINSPECT=x) and - force prompts, even if stdin appears not to be a terminal --m mod run library module as a script (terminates option list --O optimize generated bytecode (a tad; also PYTHONOPTIMIZE=x) --OO remove doc-strings in addition to the -O optimizations --Q arg division options: -Qold (default), -Qwarn, -Qwarnall, -Qnew --S Don't perform 'import site' on initialization --u Unbuffered binary stdout and stderr (also PYTHONUNBUFFERED=x). --v Verbose (trace import statements) (also PYTHONVERBOSE=x) --W arg : warning control (arg is action:message:category:module:lineno) --x Skip first line of source, allowing use of non-unix Forms of #!cmd --? Help! --c Specify the command to execute (see next section). This terminates the -command option list (following options are passed as arguments to the command). - the name of a python file (.py) to execute read from stdin. -script Anything afterward is passed as options to python script or command, - not interpreted as an option to interpreter itself. -args passed to script or command (in sys.argv[1:]) - If no script or command, Python enters interactive mode. - - * Available IDEs in std distrib: IDLE (tkinter based, portable), Pythonwin - (Windows). - - - -Environment variables - - Environment variables - Variable Effect -PYTHONHOME Alternate prefix directory (or prefix;exec_prefix). The - default module search path uses prefix/lib - Augments the default search path for module files. 
The format - is the same as the shell's $PATH: one or more directory - pathnames separated by ':' or ';' without spaces around - (semi-)colons! -PYTHONPATH On Windows first search for Registry key HKEY_LOCAL_MACHINE\ - Software\Python\PythonCore\x.y\PythonPath (default value). You - may also define a key named after your application with a - default string value giving the root directory path of your - app. - If this is the name of a readable file, the Python commands in -PYTHONSTARTUP that file are executed before the first prompt is displayed in - interactive mode (no default). -PYTHONDEBUG If non-empty, same as -d option -PYTHONINSPECT If non-empty, same as -i option -PYTHONSUPPRESS If non-empty, same as -s option -PYTHONUNBUFFERED If non-empty, same as -u option -PYTHONVERBOSE If non-empty, same as -v option -PYTHONCASEOK If non-empty, ignore case in file/module names (imports) - - - - -Notable lexical entities - -Keywords - - and del for is raise - assert elif from lambda return - break else global not try - class except if or while - continue exec import pass yield - def finally in print - - * (list of keywords in std module: keyword) - * Illegitimate Tokens (only valid in strings): @ $ ? - * A statement must all be on a single line. To break a statement over - multiple lines use "\", as with the C preprocessor. - Exception: can always break when inside any (), [], or {} pair, or in - triple-quoted strings. - * More than one statement can appear on a line if they are separated with - semicolons (";"). - * Comments start with "#" and continue to end of line. - -Identifiers - - (letter | "_") (letter | digit | "_")* - - * Python identifiers keywords, attributes, etc. are case-sensitive. - * Special forms: _ident (not imported by 'from module import *'); __ident__ - (system defined name); - __ident (class-private name mangling) - -Strings - - "a string enclosed by double quotes" - 'another string delimited by single quotes and with a " inside' - '''a string containing embedded newlines and quote (') marks, can be - delimited with triple quotes.''' - """ may also use 3- double quotes as delimiters """ - u'a unicode string' U"Another unicode string" - r'a raw string where \ are kept (literalized): handy for regular - expressions and windows paths!' - R"another raw string" -- raw strings cannot end with a \ - ur'a unicode raw string' UR"another raw unicode" - - Use \ at end of line to continue a string on next line. - adjacent strings are concatened, e.g. 'Monty' ' Python' is the same as - 'Monty Python'. - u'hello' + ' world' --> u'hello world' (coerced to unicode) - - String Literal Escapes - - \newline Ignored (escape newline) - \\ Backslash (\) \e Escape (ESC) \v Vertical Tab (VT) - \' Single quote (') \f Formfeed (FF) \OOO char with octal value OOO - \" Double quote (") \n Linefeed (LF) - \a Bell (BEL) \r Carriage Return (CR) \xHH char with hex value HH - \b Backspace (BS) \t Horizontal Tab (TAB) - \uHHHH unicode char with hex value HHHH, can only be used in unicode string - \UHHHHHHHH unicode char with hex value HHHHHHHH, can only be used in unicode string - \AnyOtherChar is left as-is - - * NUL byte (\000) is NOT an end-of-string marker; NULs may be embedded in - strings. - * Strings (and tuples) are immutable: they cannot be modified. 
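A minimal illustrative sketch of the string literal forms described above (Python 2-era syntax, as used throughout this reference):

    s1 = 'single quotes'
    s2 = "double quotes with a ' inside"
    s3 = '''triple-quoted,
    can span lines'''
    r1 = r'C:\temp\name'          # raw string: backslashes kept literally
    u1 = u'caf\xe9'               # unicode string
    both = 'Monty' ' Python'      # adjacent literals are concatenated
    u2 = u'hello' + ' world'      # result is coerced to unicode
    # strings are immutable: s1[0] = 'X' would raise TypeError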
- -Numbers - - Decimal integer: 1234, 1234567890546378940L (or l) - Octal integer: 0177, 0177777777777777777 (begin with a 0) - Hex integer: 0xFF, 0XFFFFffffFFFFFFFFFF (begin with 0x or 0X) - Long integer (unlimited precision): 1234567890123456 - Float (double precision): 3.14e-10, .001, 10., 1E3 - Complex: 1J, 2+3J, 4+5j (ends with J or j, + separates (float) real and - imaginary parts) - -Sequences - - * String of length 0, 1, 2 (see above) - '', '1', "12", 'hello\n' - * Tuple of length 0, 1, 2, etc: - () (1,) (1,2) # parentheses are optional if len > 0 - * List of length 0, 1, 2, etc: - [] [1] [1,2] - -Indexing is 0-based. Negative indices (usually) mean count backwards from end -of sequence. - -Sequence slicing [starting-at-index : but-less-than-index]. Start defaults to -'0'; End defaults to 'sequence-length'. - -a = (0,1,2,3,4,5,6,7) - a[3] ==> 3 - a[-1] ==> 7 - a[2:4] ==> (2, 3) - a[1:] ==> (1, 2, 3, 4, 5, 6, 7) - a[:3] ==> (0, 1, 2) - a[:] ==> (0,1,2,3,4,5,6,7) # makes a copy of the sequence. - -Dictionaries (Mappings) - - {} # Zero length empty dictionary - {1 : 'first'} # Dictionary with one (key, value) pair - {1 : 'first', 'next': 'second'} - dict([('one',1),('two',2)]) # Construct a dict from an item list - dict('one'=1, 'two'=2) # Construct a dict using keyword args - dict.fromkeys(['one', 'keys']) # Construct a dict from a sequence - -Operators and their evaluation order - - Operators and their evaluation order -Highest Operator Comment - (...) [...] {...} `...` Tuple, list & dict. creation; string - conv. - s[i] s[i:j] s.attr f(...) indexing & slicing; attributes, fct - calls - +x, -x, ~x Unary operators - x**y Power - x*y x/y x%y x//y mult, division, modulo, floor division - x+y x-y addition, subtraction - x<>y Bit shifting - x&y Bitwise and - x^y Bitwise exclusive or - x|y Bitwise or - xy x>=y x==y x!=y Comparison, - x is y x is not y membership - x in s x not in s - not x boolean negation - x and y boolean and - x or y boolean or -Lowest lambda args: expr anonymous function - -Alternate names are defined in module operator (e.g. __add__ and add for +) -Most operators are overridable. - -Many binary operators also support augmented assignment: - x += 1 # Same as x = x + 1 - - -Basic Types and Their Operations - -Comparisons (defined between *any* types) - - Comparisons -Comparison Meaning Notes -< strictly less than (1) -<= less than or equal to -> strictly greater than ->= greater than or equal to -== equal to -!= not equal to -is object identity (2) -is not negated object identity (2) - -Notes : - Comparison behavior can be overridden for a given class by defining special -method __cmp__. - The above comparisons return True or False which are of type bool -(a subclass of int) and behave exactly as 1 or 0 except for their type and -that they print as True or False instead of 1 or 0. - (1) X < Y < Z < W has expected meaning, unlike C - (2) Compare object identities (i.e. id(object)), not object values. - -Boolean values and operators - - Boolean values and operators - Value or Operator Returns Notes -None, numeric zeros, empty sequences and False -mappings -all other values True -not x True if x is False, else - True -x or y if x is False then y, else (1) - x -x and y if x is False then x, else (1) - y - -Notes : - Truth testing behavior can be overridden for a given class by defining -special method __bool__. - (1) Evaluate second arg only if necessary to determine outcome. - -None - - None is used as default return value on functions. 
Built-in single object - with type NoneType. - Input that evaluates to None does not print when running Python - interactively. - -Numeric types - -Floats and integers. - - Floats are implemented with C doubles. - Integers have unlimited size (only limit is system resources) - -Operators on all numeric types - - Operators on all numeric types - Operation Result -abs(x) the absolute value of x -int(x) x converted to integer -float(x) x converted to floating point --x x negated -+x x unchanged -x + y the sum of x and y -x - y difference of x and y -x * y product of x and y -x / y quotient of x and y -x % y remainder of x / y -divmod(x, y) the tuple (x/y, x%y) -x ** y x to the power y (the same as pow(x, y)) - -Bit operators on integers - - Bit operators -Operation >Result -~x the bits of x inverted -x ^ y bitwise exclusive or of x and y -x & y bitwise and of x and y -x | y bitwise or of x and y -x << n x shifted left by n bits -x >> n x shifted right by n bits - -Complex Numbers - - * represented as a pair of machine-level double precision floating point - numbers. - * The real and imaginary value of a complex number z can be retrieved through - the attributes z.real and z.imag. - -Numeric exceptions - -TypeError - raised on application of arithmetic operation to non-number -OverflowError - numeric bounds exceeded -ZeroDivisionError - raised when zero second argument of div or modulo op -FloatingPointError - raised when a floating point operation fails - -Operations on all sequence types (lists, tuples, strings) - - Operations on all sequence types -Operation Result Notes -x in s True if an item of s is equal to x, else False -x not in s False if an item of s is equal to x, else True -for x in s: loops over the sequence -s + t the concatenation of s and t -s * n, n*s n copies of s concatenated -s[i] i'th item of s, origin 0 (1) -s[i:j] slice of s from i (included) to j (excluded) (1), (2) -len(s) length of s -min(s) smallest item of s -max(s) largest item of (s) -iter(s) returns an iterator over s. iterators define __iter__ and next() - -Notes : - (1) if i or j is negative, the index is relative to the end of the string, -ie len(s)+ i or len(s)+j is - substituted. But note that -0 is still 0. - (2) The slice of s from i to j is defined as the sequence of items with -index k such that i <= k < j. - If i or j is greater than len(s), use len(s). If i is omitted, use -len(s). If i is greater than or - equal to j, the slice is empty. - -Operations on mutable (=modifiable) sequences (lists) - - Operations on mutable sequences - Operation Result Notes -s[i] =x item i of s is replaced by x -s[i:j] = t slice of s from i to j is replaced by t -del s[i:j] same as s[i:j] = [] -s.append(x) same as s[len(s) : len(s)] = [x] -s.count(x) return number of i's for which s[i] == x -s.extend(x) same as s[len(s):len(s)]= x -s.index(x) return smallest i such that s[i] == x (1) -s.insert(i, x) same as s[i:i] = [x] if i >= 0 -s.pop([i]) same as x = s[i]; del s[i]; return x (4) -s.remove(x) same as del s[s.index(x)] (1) -s.reverse() reverse the items of s in place (3) -s.sort([cmpFct]) sort the items of s in place (2), (3) - -Notes : - (1) raise a ValueError exception when x is not found in s (i.e. out of -range). - (2) The sort() method takes an optional argument specifying a comparison -fct of 2 arguments (list items) which should - return -1, 0, or 1 depending on whether the 1st argument is -considered smaller than, equal to, or larger than the 2nd - argument. 
Note that this slows the sorting process down considerably. - (3) The sort() and reverse() methods modify the list in place for economy -of space when sorting or reversing a large list. - They don't return the sorted or reversed list to remind you of this -side effect. - (4) [New 1.5.2] The optional argument i defaults to -1, so that by default the last -item is removed and returned. - - - -Operations on mappings (dictionaries) - - Operations on mappings - Operation Result Notes -len(d) the number of items in d -d[k] the item of d with key k (1) -d[k] = x set d[k] to x -del d[k] remove d[k] from d (1) -d.clear() remove all items from d -d.copy() a shallow copy of d -d.get(k,defaultval) the item of d with key k (4) -d.has_key(k) True if d has key k, else False -d.items() a copy of d's list of (key, item) pairs (2) -d.iteritems() an iterator over (key, value) pairs (7) -d.iterkeys() an iterator over the keys of d (7) -d.itervalues() an iterator over the values of d (7) -d.keys() a copy of d's list of keys (2) -d1.update(d2) for k, v in d2.items(): d1[k] = v (3) -d.values() a copy of d's list of values (2) -d.pop(k) remove d[k] and return its value -d.popitem() remove and return an arbitrary (6) - (key, item) pair -d.setdefault(k,defaultval) the item of d with key k (5) - - Notes : - TypeError is raised if key is not acceptable - (1) KeyError is raised if key k is not in the map - (2) Keys and values are listed in random order - (3) d2 must be of the same type as d1 - (4) Never raises an exception if k is not in the map, instead it returns - defaultVal. - defaultVal is optional, when not provided and k is not in the map, - None is returned. - (5) Never raises an exception if k is not in the map, instead it returns - defaultVal, and adds k to map with value defaultVal. defaultVal is - optional. When not provided and k is not in the map, None is returned and - added to map. - (6) Raises a KeyError if the dictionary is emtpy. - (7) While iterating over a dictionary, the values may be updated but - the keys cannot be changed. - -Operations on strings - -Note that these string methods largely (but not completely) supersede the -functions available in the string module. - - - Operations on strings - Operation Result Notes -s.capitalize() return a copy of s with only its first character - capitalized. -s.center(width) return a copy of s centered in a string of length width (1) - . -s.count(sub[ return the number of occurrences of substring sub in (2) -,start[,end]]) string s. -s.decode(([ return a decoded version of s. (3) - encoding - [,errors]]) -s.encode([ return an encoded version of s. Default encoding is the - encoding current default string encoding. (3) - [,errors]]) -s.endswith(suffix return true if s ends with the specified suffix, (2) - [,start[,end]]) otherwise return False. -s.expandtabs([ return a copy of s where all tab characters are (4) -tabsize]) expanded using spaces. -s.find(sub[,start return the lowest index in s where substring sub is (2) -[,end]]) found. Return -1 if sub is not found. -s.index(sub[ like find(), but raise ValueError when the substring is (2) -,start[,end]]) not found. -s.isalnum() return True if all characters in s are alphanumeric, (5) - False otherwise. -s.isalpha() return True if all characters in s are alphabetic, (5) - False otherwise. -s.isdigit() return True if all characters in s are digit (5) - characters, False otherwise. -s.islower() return True if all characters in s are lowercase, False (6) - otherwise. 
-s.isspace() return True if all characters in s are whitespace (5) - characters, False otherwise. -s.istitle() return True if string s is a titlecased string, False (7) - otherwise. -s.isupper() return True if all characters in s are uppercase, False (6) - otherwise. -s.join(seq) return a concatenation of the strings in the sequence - seq, separated by 's's. -s.ljust(width) return s left justified in a string of length width. (1), - (8) -s.lower() return a copy of s converted to lowercase. -s.lstrip() return a copy of s with leading whitespace removed. -s.replace(old, return a copy of s with all occurrences of substring (9) -new[, maxsplit]) old replaced by new. -s.rfind(sub[ return the highest index in s where substring sub is (2) -,start[,end]]) found. Return -1 if sub is not found. -s.rindex(sub[ like rfind(), but raise ValueError when the substring (2) -,start[,end]]) is not found. -s.rjust(width) return s right justified in a string of length width. (1), - (8) -s.rstrip() return a copy of s with trailing whitespace removed. -s.split([sep[ return a list of the words in s, using sep as the (10) -,maxsplit]]) delimiter string. -s.splitlines([ return a list of the lines in s, breaking at line (11) -keepends]) boundaries. -s.startswith return true if s starts with the specified prefix, -(prefix[,start[ otherwise return false. (2) -,end]]) -s.strip() return a copy of s with leading and trailing whitespace - removed. -s.swapcase() return a copy of s with uppercase characters converted - to lowercase and vice versa. - return a titlecased copy of s, i.e. words start with -s.title() uppercase characters, all remaining cased characters - are lowercase. -s.translate(table return a copy of s mapped through translation table (12) -[,deletechars]) table. -s.upper() return a copy of s converted to uppercase. -s.zfill(width) return a string padded with zeroes on the left side and - sliding a minus sign left if necessary. never truncates. - -Notes : - (1) Padding is done using spaces. - (2) If optional argument start is supplied, substring s[start:] is -processed. If optional arguments start and end are supplied, substring s[start: -end] is processed. - (3) Optional argument errors may be given to set a different error handling -scheme. The default for errors is 'strict', meaning that encoding errors raise -a ValueError. Other possible values are 'ignore' and 'replace'. - (4) If optional argument tabsize is not given, a tab size of 8 characters -is assumed. - (5) Returns false if string s does not contain at least one character. - (6) Returns false if string s does not contain at least one cased -character. - (7) A titlecased string is a string in which uppercase characters may only -follow uncased characters and lowercase characters only cased ones. - (8) s is returned if width is less than len(s). - (9) If the optional argument maxsplit is given, only the first maxsplit -occurrences are replaced. - (10) If sep is not specified or None, any whitespace string is a separator. -If maxsplit is given, at most maxsplit splits are done. - (11) Line breaks are not included in the resulting list unless keepends is -given and true. - (12) table must be a string of length 256. All characters occurring in the -optional argument deletechars are removed prior to translation. - -String formatting with the % operator - -formatString % args--> evaluates to a string - - * formatString uses C printf format codes : %, c, s, i, d, u, o, x, X, e, E, - f, g, G, r (details below). 
- * Width and precision may be a * to specify that an integer argument gives - the actual width or precision. - * The flag characters -, +, blank, # and 0 are understood. (details below) - * %s will convert any type argument to string (uses str() function) - * args may be a single arg or a tuple of args - - '%s has %03d quote types.' % ('Python', 2) # => 'Python has 002 quote types.' - - * Right-hand-side can also be a mapping: - - a = '%(lang)s has %(c)03d quote types.' % {'c':2, 'lang':'Python} -(vars() function very handy to use on right-hand-side.) - - Format codes -Conversion Meaning -d Signed integer decimal. -i Signed integer decimal. -o Unsigned octal. -u Unsigned decimal. -x Unsigned hexadecimal (lowercase). -X Unsigned hexadecimal (uppercase). -e Floating point exponential format (lowercase). -E Floating point exponential format (uppercase). -f Floating point decimal format. -F Floating point decimal format. -g Same as "e" if exponent is greater than -4 or less than precision, - "f" otherwise. -G Same as "E" if exponent is greater than -4 or less than precision, - "F" otherwise. -c Single character (accepts integer or single character string). -r String (converts any python object using repr()). -s String (converts any python object using str()). -% No argument is converted, results in a "%" character in the result. - (The complete specification is %%.) - - Conversion flag characters -Flag Meaning -# The value conversion will use the ``alternate form''. -0 The conversion will be zero padded. -- The converted value is left adjusted (overrides "-"). - (a space) A blank should be left before a positive number (or empty - string) produced by a signed conversion. -+ A sign character ("+" or "-") will precede the conversion (overrides a - "space" flag). - -File Objects - -Created with built-in function open; may be created by other modules' functions -as well. - -Operators on file objects - - File operations - Operation Result -f.close() Close file f. -f.fileno() Get fileno (fd) for file f. -f.flush() Flush file f's internal buffer. -f.isatty() True if file f is connected to a tty-like dev, else False. -f.read([size]) Read at most size bytes from file f and return as a string - object. If size omitted, read to EOF. -f.readline() Read one entire line from file f. -f.readlines() Read until EOF with readline() and return list of lines read. - Set file f's position, like "stdio's fseek()". -f.seek(offset[, whence == 0 then use absolute indexing. -whence=0]) whence == 1 then offset relative to current pos. - whence == 2 then offset relative to file end. -f.tell() Return file f's current position (byte offset). -f.write(str) Write string to file f. -f.writelines(list Write list of strings to file f. -) - -File Exceptions - - EOFError - End-of-file hit when reading (may be raised many times, e.g. if f is a - tty). - IOError - Other I/O-related I/O operation failure. - OSError - OS system call failed. 
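A minimal illustrative sketch of the file-object operations listed above; example.txt is a hypothetical file name:

    f = open('example.txt', 'w')               # hypothetical file, opened for writing
    f.write('first line\n')
    f.writelines(['second line\n', 'third line\n'])
    f.close()

    f = open('example.txt', 'r')
    print f.readline(),                        # prints: first line
    f.seek(0)                                  # rewind to the start of the file
    print len(f.readlines())                   # 3 (all lines, returned as a list)
    f.close()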
- - - Advanced Types - - -See manuals for more details - - + Module objects - + Class objects - + Class instance objects - + Type objects (see module: types) - + File objects (see above) - + Slice objects - + XRange objects - + Callable types: - o User-defined (written in Python): - # User-defined Function objects - # User-defined Method objects - o Built-in (written in C): - # Built-in Function objects - # Built-in Method objects - + Internal Types: - o Code objects (byte-compile executable Python code: bytecode) - o Frame objects (execution frames) - o Traceback objects (stack trace of an exception) - - - Statements - - pass -- Null statement - del name[,name]* -- Unbind name(s) from object. Object will be indirectly - (and automatically) deleted only if no longer referenced. - print [>> fileobject,] [s1 [, s2 ]* [,] - -- Writes to sys.stdout, or to fileobject if supplied. - Puts spaces between arguments. Puts newline at end - unless statement ends with comma. - Print is not required when running interactively, - simply typing an expression will print its value, - unless the value is None. - exec x [in globals [,locals]] - -- Executes x in namespaces provided. Defaults - to current namespaces. x can be a string, file - object or a function object. - callable(value,... [id=value], [*args], [**kw]) - -- Call function callable with parameters. Parameters can - be passed by name or be omitted if function - defines default values. E.g. if callable is defined as - "def callable(p1=1, p2=2)" - "callable()" <=> "callable(1, 2)" - "callable(10)" <=> "callable(10, 2)" - "callable(p2=99)" <=> "callable(1, 99)" - *args is a tuple of positional arguments. - **kw is a dictionary of keyword arguments. - - Assignment operators - - Caption - Operator Result Notes - a = b Basic assignment - assign object b to label a (1) - a += b Roughly equivalent to a = a + b (2) - a -= b Roughly equivalent to a = a - b (2) - a *= b Roughly equivalent to a = a * b (2) - a /= b Roughly equivalent to a = a / b (2) - a %= b Roughly equivalent to a = a % b (2) - a **= b Roughly equivalent to a = a ** b (2) - a &= b Roughly equivalent to a = a & b (2) - a |= b Roughly equivalent to a = a | b (2) - a ^= b Roughly equivalent to a = a ^ b (2) - a >>= b Roughly equivalent to a = a >> b (2) - a <<= b Roughly equivalent to a = a << b (2) - - Notes : - (1) Can unpack tuples, lists, and strings. - first, second = a[0:2]; [f, s] = range(2); c1,c2,c3='abc' - Tip: x,y = y,x swaps x and y. - (2) Not exactly equivalent - a is evaluated only once. Also, where - possible, operation performed in-place - a is modified rather than - replaced. - - Control Flow - - if condition: suite - [elif condition: suite]* - [else: suite] -- usual if/else_if/else statement - while condition: suite - [else: suite] - -- usual while statement. "else" suite is executed - after loop exits, unless the loop is exited with - "break" - for element in sequence: suite - [else: suite] - -- iterates over sequence, assigning each element to element. - Use built-in range function to iterate a number of times. - "else" suite executed at end unless loop exited - with "break" - break -- immediately exits "for" or "while" loop - continue -- immediately does next iteration of "for" or "while" loop - return [result] -- Exits from function (or method) and returns result (use a tuple to - return more than one value). If no result given, then returns None. - yield result -- Freezes the execution frame of a generator and returns the result - to the iterator's .__next__() method. 
Upon the next call to __next__(), - resumes execution at the frozen point with all of the local variables - still intact. - - Exception Statements - - assert expr[, message] - -- expr is evaluated. if false, raises exception AssertionError - with message. Inhibited if __debug__ is 0. - try: suite1 - [except [exception [, value]: suite2]+ - [else: suite3] - -- statements in suite1 are executed. If an exception occurs, look - in "except" clauses for matching . If matches or bare - "except" execute suite of that clause. If no exception happens - suite in "else" clause is executed after suite1. - If exception has a value, it is put in value. - exception can also be tuple of exceptions, e.g. - "except (KeyError, NameError), val: print val" - try: suite1 - finally: suite2 - -- statements in suite1 are executed. If no - exception, execute suite2 (even if suite1 is - exited with a "return", "break" or "continue" - statement). If exception did occur, executes - suite2 and then immediately reraises exception. - raise exception [,value [, traceback]] - -- raises exception with optional value - value. Arg traceback specifies a traceback object to - use when printing the exception's backtrace. - raise -- a raise statement without arguments re-raises - the last exception raised in the current function -An exception is either a string (object) or a class instance. - Can create a new one simply by creating a new string: - - my_exception = 'You did something wrong' - try: - if bad: - raise my_exception, bad - except my_exception, value: - print 'Oops', value - -Exception classes must be derived from the predefined class: Exception, e.g.: - class text_exception(Exception): pass - try: - if bad: - raise text_exception() - # This is a shorthand for the form - # "raise , " - except Exception: - print 'Oops' - # This will be printed because - # text_exception is a subclass of Exception -When an error message is printed for an unhandled exception which is a -class, the class name is printed, then a colon and a space, and -finally the instance converted to a string using the built-in function -str(). -All built-in exception classes derives from Exception, itself -derived from BaseException. - -Name Space Statements - -[1.51: On Mac & Windows, the case of module file names must now match the case -as used - in the import statement] -Packages (>1.5): a package is a name space which maps to a directory including - module(s) and the special initialization module '__init__.py' - (possibly empty). Packages/dirs can be nested. You address a - module's symbol via '[package.[package...]module.symbol's. -import module1 [as name1] [, module2]* - -- imports modules. Members of module must be - referred to by qualifying with [package.]module name: - "import sys; print sys.argv:" - "import package1.subpackage.module; package1.subpackage.module.foo()" - module1 renamed as name1, if supplied. -from module import name1 [as othername1] [, name2]* - -- imports names from module module in current namespace. - "from sys import argv; print argv" - "from package1 import module; module.foo()" - "from package1.module import foo; foo()" - name1 renamed as othername1, if supplied. -from module import * - -- imports all names in module, except those starting with "_"; - *to be used sparsely, beware of name clashes* : - "from sys import *; print argv" - "from package.module import *; print x' - NB: "from package import *" only imports the symbols defined - in the package's __init__.py file, not those in the - template modules! 
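A minimal sketch of these import forms, reusing the hypothetical package1.subpackage.module names from the examples above:

    import package1.subpackage.module            # qualified names required to reach foo
    package1.subpackage.module.foo()

    from package1.subpackage import module as m  # bound locally under the name m
    m.foo()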
-global name1 [, name2]* - -- names are from global scope (usually meaning from module) - rather than local (usually meaning only in function). - -- E.g. in fct without "global" statements, assuming - "a" is name that hasn't been used in fct or module - so far: - -Try to read from "a" -> NameError - -Try to write to "a" -> creates "a" local to fcn - -If "a" not defined in fct, but is in module, then - -Try to read from "a", gets value from module - -Try to write to "a", creates "a" local to fct - But note "a[0]=3" starts with search for "a", - will use to global "a" if no local "a". - -Function Definition - -def func_id ([param_list]): suite - -- Creates a function object & binds it to name func_id. - - param_list ::= [id [, id]*] - id ::= value | id = value | *id | **id - [Args are passed by value.Thus only args representing a mutable object - can be modified (are inout parameters). Use a tuple to return more than - one value] - -Example: - def test (p1, p2 = 1+1, *rest, **keywords): - -- Parameters with "=" have default value (v is - evaluated when function defined). - If list has "*id" then id is assigned a tuple of - all remaining args passed to function (like C vararg) - If list has "**id" then id is assigned a dictionary of - all extra arguments passed as keywords. - -Class Definition - -class [( [,]*)]: - -- Creates a class object and assigns it name - may contain local "defs" of class methods and - assignments to class attributes. -Example: - class my_class (class1, class_list[3]): ... - Creates a class object inheriting from both "class1" and whatever - class object "class_list[3]" evaluates to. Assigns new - class object to name "my_class". - - First arg to class methods is always instance object, called 'self' - by convention. - - Special method __init__() is called when instance is created. - - Special method __del__() called when no more reference to object. - - Create instance by "calling" class object, possibly with arg - (thus instance=apply(aClassObject, args...) creates an instance!) - - In current implementation, can't subclass off built-in - classes. But can "wrap" them, see UserDict & UserList modules, - and see __getattr__() below. -Example: - class c (c_parent): - def __init__(self, name): self.name = name - def print_name(self): print "I'm", self.name - def call_parent(self): c_parent.print_name(self) - instance = c('tom') - print instance.name - 'tom' - instance.print_name() - "I'm tom" - Call parent's super class by accessing parent's method - directly and passing "self" explicitly (see "call_parent" - in example above). - Many other special methods available for implementing - arithmetic operators, sequence, mapping indexing, etc. - -Documentation Strings - -Modules, classes and functions may be documented by placing a string literal by -itself as the first statement in the suite. The documentation can be retrieved -by getting the '__doc__' attribute from the module, class or function. -Example: - class C: - "A description of C" - def __init__(self): - "A description of the constructor" - # etc. -Then c.__doc__ == "A description of C". -Then c.__init__.__doc__ == "A description of the constructor". - -Others - -lambda [param_list]: returnedExpr - -- Creates an anonymous function. returnedExpr must be - an expression, not a statement (e.g., not "if xx:...", - "print xxx", etc.) and thus can't contain newlines. - Used mostly for filter(), map() functions, and GUI callbacks.. 
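A minimal illustrative sketch of lambda used with map() and filter(), the typical uses noted above (Python 2 semantics: both return lists):

    double = lambda x: x * 2
    print double(5)                                  # 10
    print map(lambda x: x * 2, [1, 2, 3])            # [2, 4, 6]
    print filter(lambda x: x % 2, [1, 2, 3, 4, 5])   # [1, 3, 5]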
-List comprehensions -result = [expression for item1 in sequence1 [if condition1] - [for item2 in sequence2 ... for itemN in sequenceN] - ] -is equivalent to: -result = [] -for item1 in sequence1: - for item2 in sequence2: - ... - for itemN in sequenceN: - if (condition1) and furthur conditions: - result.append(expression) - - - -Built-In Functions - - Built-In Functions - Function Result -__import__(name[, Imports module within the given context (see lib ref for -globals[, locals[, more details) -fromlist]]]) -abs(x) Return the absolute value of number x. -bool(x) Returns True when the argument x is true and False otherwise. -buffer(obj) Creates a buffer reference to an object. -chr(i) Returns one-character string whose ASCII code isinteger i -classmethod(f) Converts a function f, into a method with the class as the - first argument. Useful for creating alternative constructors. -cmp(x,y) Returns negative, 0, positive if x <, ==, > to y -compile(string, from which the code was read, or eg. ''if not read -filename, kind) from file.kind can be 'eval' if string is a single stmt, or - 'single' which prints the output of expression statements - that evaluate to something else than None, or be 'exec'. -complex(real[, Builds a complex object (can also be done using J or j -image]) suffix,e.g. 1+3J) -delattr(obj, name) deletes attribute named name of object obj <=> del obj.name - If no args, returns the list of names in current -dict([items]) Create a new dictionary from the specified item list. -dir([object]) local symbol table. With a module, class or class - instance object as arg, returns list of names in its attr. - dict. -divmod(a,b) Returns tuple of (a/b, a%b) -enumerate(seq) Return a iterator giving: (0, seq[0]), (1, seq[1]), ... -eval(s[, globals[, Eval string s in (optional) globals, locals contexts.s must -locals]]) have no NUL's or newlines. s can also be acode object. - Example: x = 1; incr_x = eval('x + 1') -filter(function, Constructs a list from those elements of sequence for which -sequence) function returns true. function takes one parameter. -float(x) Converts a number or a string to floating point. -getattr(object, [ arg added in 1.5.2]Gets attribute called name -name[, default])) from object,e.g. getattr(x, 'f') <=> x.f). If not found, - raises AttributeError or returns default if specified. -globals() Returns a dictionary containing current global variables. -hasattr(object, Returns true if object has attr called name. -name) -hash(object) Returns the hash value of the object (if it has one) -help(f) Display documentation on object f. -hex(x) Converts a number x to a hexadecimal string. -id(object) Returns a unique 'identity' integer for an object. -int(x[, base]) base paramenter specifies base from which to convert string - values. -isinstance(obj, Returns true if obj is an instance of class. Ifissubclass -class) (A,B) then isinstance(x,A) => isinstance(x,B) -issubclass(class1, returns true if class1 is derived from class2 -class2) - Returns the length (the number of items) of an object -iter(collection) Returns an iterator over the collection. -len(obj) (sequence, dictionary, or instance of class implementing - __len__). -list(sequence) Converts sequence into a list. If already a list,returns a - copy of it. -locals() Returns a dictionary containing current local variables. - Applies function to every item of list and returns a listof -map(function, list, the results. If additional arguments are passed,function -...) 
must take that many arguments and it is givento function on - each call. -max(seq) Returns the largest item of the non-empty sequence seq. -min(seq) Returns the smallest item of a non-empty sequence seq. -oct(x) Converts a number to an octal string. -open(filename [, Returns a new file object. First two args are same asthose -mode='r', [bufsize= for C's "stdio open" function. bufsize is 0for unbuffered, -implementation 1 for line-buffered, negative forsys-default, all else, of -dependent]]) (about) given size. -ord(c) Returns integer ASCII value of c (a string of len 1). Works - with Unicode char. -object() Create a base type. Used as a superclass for new-style objects. -open(name Open a file. - [, mode - [, buffering]]) -pow(x, y [, z]) Returns x to power y [modulo z]. See also ** operator. -property() Created a property with access controlled by functions. -range(start [,end Returns list of ints from >= start and < end. With 1 arg, -[, step]]) list from 0..arg-1. With 2 args, list from start..end-1. - With 3 args, list from start up to end by step - after fixing it. -repr(object) Returns a string containing a printable and if possible - evaluable representation of an object. - Class redefinable (__repr__). See also str(). -round(x, n=0) Returns the floating point value x rounded to n digitsafter - the decimal point. -setattr(object, This is the counterpart of getattr(). setattr(o, 'foobar', -name, value) 3) <=> o.foobar = 3. Creates attribute if it doesn't exist! -slice([start,] stop Returns a slice object representing a range, with R/ -[, step]) O attributes: start, stop, step. -staticmethod() Convert a function to method with no self or class - argument. Useful for methods associated with a class that - do not need access to an object's internal state. -str(object) Returns a string containing a nicely - printable representation of an object. Class overridable - (__str__).See also repr(). -super(type) Create an unbound super object. Used to call cooperative - superclass methods. -sum(sequence, Add the values in the sequence and return the sum. - [start]) -tuple(sequence) Creates a tuple with same elements as sequence. If already - a tuple, return itself (not a copy). - Returns a type object [see module types] representing - thetype of obj. Example: import typesif type(x) == -type(obj) types.StringType: print 'It is a string'NB: it is - recommanded to use the following form:if isinstance(x, - types.StringType): etc... -unichr(code) code. -unicode(string[, Creates a Unicode string from a 8-bit string, using -encoding[, error thegiven encoding name and error treatment ('strict', -]]]) 'ignore',or 'replace'}. - Without arguments, returns a dictionary correspondingto the - current local symbol table. With a module,class or class -vars([object]) instance object as argumentreturns a dictionary - corresponding to the object'ssymbol table. Useful with "%" - formatting operator. -zip(seq1[, seq2, Returns an iterator of tuples where each tuple contains -...]) the nth element of each of the argument sequences. - - - -Built-In Exceptions - -Exception> - Root class for all exceptions - SystemExit - On 'sys.exit()' - StopIteration - Signal the end from iterator.__next__() - ArithmeticError - Base class for OverflowError, ZeroDivisionError, - FloatingPointError - FloatingPointError - When a floating point operation fails. - OverflowError - On excessively large arithmetic operation - ZeroDivisionError - On division or modulo operation with 0 as 2nd arg - AssertionError - When an assert statement fails. 
- AttributeError - On attribute reference or assignment failure - EnvironmentError [new in 1.5.2] - On error outside Python; error arg tuple is (errno, errMsg...) - IOError [changed in 1.5.2] - I/O-related operation failure - OSError [new in 1.5.2] - used by the os module's os.error exception. - EOFError - Immediate end-of-file hit by input() or raw_input() - ImportError - On failure of `import' to find module or name - KeyboardInterrupt - On user entry of the interrupt key (often `Control-C') - LookupError - base class for IndexError, KeyError - IndexError - On out-of-range sequence subscript - KeyError - On reference to a non-existent mapping (dict) key - MemoryError - On recoverable memory exhaustion - NameError - On failure to find a local or global (unqualified) name - RuntimeError - Obsolete catch-all; define a suitable error instead - NotImplementedError [new in 1.5.2] - On method not implemented - SyntaxError - On parser encountering a syntax error - IndentationError - On parser encountering an indentation syntax error - TabError - On parser encountering an indentation syntax error - SystemError - On non-fatal interpreter error - bug - report it - TypeError - On passing inappropriate type to built-in op or func - ValueError - On arg error not covered by TypeError or more precise - Warning - UserWarning - DeprecationWarning - PendingDeprecationWarning - SyntaxWarning - RuntimeWarning - FutureWarning - - - -Standard methods & operators redefinition in classes - -Standard methods & operators map to special '__methods__' and thus may be - redefined (mostly in user-defined classes), e.g.: - class x: - def __init__(self, v): self.value = v - def __add__(self, r): return self.value + r - a = x(3) # sort of like calling x.__init__(a, 3) - a + 4 # is equivalent to a.__add__(4) - -Special methods for any class - -(s: self, o: other) - __init__(s, args) instance initialization (on construction) - __del__(s) called on object demise (refcount becomes 0) - __repr__(s) repr() and `...` conversions - __str__(s) str() and 'print' statement - __cmp__(s, o) Compares s to o and returns <0, 0, or >0. - Implements >, <, == etc... - __hash__(s) Compute a 32 bit hash code; hash() and dictionary ops - __bool__(s) Returns False or True for truth value testing - __getattr__(s, name) called when attr lookup doesn't find - __setattr__(s, name, val) called when setting an attr - (inside, don't use "self.name = value" - use "self.__dict__[name] = val") - __delattr__(s, name) called to delete attr - __call__(self, *args) called when an instance is called as function. - -Operators - - See list in the operator module. Operator function names are provided with - 2 variants, with or without - ading & trailing '__' (eg. __add__ or add). 
- - Numeric operations special methods - (s: self, o: other) - - s+o = __add__(s,o) s-o = __sub__(s,o) - s*o = __mul__(s,o) s/o = __div__(s,o) - s%o = __mod__(s,o) divmod(s,o) = __divmod__(s,o) - s**o = __pow__(s,o) - s&o = __and__(s,o) - s^o = __xor__(s,o) s|o = __or__(s,o) - s<>o = __rshift__(s,o) - bool(s) = __bool__(s) (used in boolean testing) - -s = __neg__(s) +s = __pos__(s) - abs(s) = __abs__(s) ~s = __invert__(s) (bitwise) - s+=o = __iadd__(s,o) s-=o = __isub__(s,o) - s*=o = __imul__(s,o) s/=o = __idiv__(s,o) - s%=o = __imod__(s,o) - s**=o = __ipow__(s,o) - s&=o = __iand__(s,o) - s^=o = __ixor__(s,o) s|=o = __ior__(s,o) - s<<=o = __ilshift__(s,o) s>>=o = __irshift__(s,o) - Conversions - int(s) = __int__(s) - float(s) = __float__(s) complex(s) = __complex__(s) - oct(s) = __oct__(s) hex(s) = __hex__(s) - Right-hand-side equivalents for all binary operators exist; - are called when class instance is on r-h-s of operator: - a + 3 calls __add__(a, 3) - 3 + a calls __radd__(a, 3) - - All seqs and maps, general operations plus: - (s: self, i: index or key) - - len(s) = __len__(s) length of object, >= 0. Length 0 == false - s[i] = __getitem__(s,i) Element at index/key i, origin 0 - - Sequences, general methods, plus: - s[i]=v = __setitem__(s,i,v) - del s[i] = __delitem__(s,i) - s[i:j] = __getslice__(s,i,j) - s[i:j]=seq = __setslice__(s,i,j,seq) - del s[i:j] = __delslice__(s,i,j) == s[i:j] = [] - seq * n = __repeat__(seq, n) - s1 + s2 = __concat__(s1, s2) - i in s = __contains__(s, i) - Mappings, general methods, plus - hash(s) = __hash__(s) - hash value for dictionary references - s[k]=v = __setitem__(s,k,v) - del s[k] = __delitem__(s,k) - -Special informative state attributes for some types: - - Modules: - __doc__ (string/None, R/O): doc string (<=> __dict__['__doc__']) - __name__(string, R/O): module name (also in __dict__['__name__']) - __dict__ (dict, R/O): module's name space - __file__(string/undefined, R/O): pathname of .pyc, .pyo or .pyd (undef for - modules statically linked to the interpreter) - - Classes: [in bold: writable since 1.5.2] - __doc__ (string/None, R/W): doc string (<=> __dict__['__doc__']) - __module__ is the module name in which the class was defined - __name__(string, R/W): class name (also in __dict__['__name__']) - __bases__ (tuple, R/W): parent classes - __dict__ (dict, R/W): attributes (class name space) - - Instances: - __class__ (class, R/W): instance's class - __dict__ (dict, R/W): attributes - - User-defined functions: [bold: writable since 1.5.2] - __doc__ (string/None, R/W): doc string - __name__(string, R/O): function name - func_doc (R/W): same as __doc__ - func_name (R/O): same as __name__ - func_defaults (tuple/None, R/W): default args values if any - func_code (code, R/W): code object representing the compiled function body - func_globals (dict, R/O): ref to dictionary of func global variables - func_dict (dict, R/W): same as __dict__ contains the namespace supporting - arbitrary function attributes - func_closure (R/O): None or a tuple of cells that contain bindings - for the function's free variables. 
- - - User-defined Methods: - __doc__ (string/None, R/O): doc string - __name__(string, R/O): method name (same as im_func.__name__) - im_class (class, R/O): class defining the method (may be a base class) - im_self (instance/None, R/O): target instance object (None if unbound) - im_func (function, R/O): function object - - Built-in Functions & methods: - __doc__ (string/None, R/O): doc string - __name__ (string, R/O): function name - __self__ : [methods only] target object - - Codes: - co_name (string, R/O): function name - co_argcount (int, R/0): number of positional args - co_nlocals (int, R/O): number of local vars (including args) - co_varnames (tuple, R/O): names of local vars (starting with args) - co_cellvars (tuple, R/O)) the names of local variables referenced by - nested functions - co_freevars (tuple, R/O)) names of free variables - co_code (string, R/O): sequence of bytecode instructions - co_consts (tuple, R/O): litterals used by the bytecode, 1st one is - fct doc (or None) - co_names (tuple, R/O): names used by the bytecode - co_filename (string, R/O): filename from which the code was compiled - co_firstlineno (int, R/O): first line number of the function - co_lnotab (string, R/O): string encoding bytecode offsets to line numbers. - co_stacksize (int, R/O): required stack size (including local vars) - co_flags (int, R/O): flags for the interpreter - bit 2 set if fct uses "*arg" syntax - bit 3 set if fct uses '**keywords' syntax - Frames: - f_back (frame/None, R/O): previous stack frame (toward the caller) - f_code (code, R/O): code object being executed in this frame - f_locals (dict, R/O): local vars - f_globals (dict, R/O): global vars - f_builtins (dict, R/O): built-in (intrinsic) names - f_restricted (int, R/O): flag indicating whether fct is executed in - restricted mode - f_lineno (int, R/O): current line number - f_lasti (int, R/O): precise instruction (index into bytecode) - f_trace (function/None, R/W): debug hook called at start of each source line - Tracebacks: - tb_next (frame/None, R/O): next level in stack trace (toward the frame where - the exception occurred) - tb_frame (frame, R/O): execution frame of the current level - tb_lineno (int, R/O): line number where the exception occurred - tb_lasti (int, R/O): precise instruction (index into bytecode) - - Slices: - start (any/None, R/O): lowerbound - stop (any/None, R/O): upperbound - step (any/None, R/O): step value - - Complex numbers: - real (float, R/O): real part - imag (float, R/O): imaginary part - - -Important Modules - - sys - - Some sys variables - Variable Content -argv The list of command line arguments passed to aPython - script. sys.argv[0] is the script name. -builtin_module_names A list of strings giving the names of all moduleswritten - in C that are linked into this interpreter. -check_interval How often to check for thread switches or signals(measured - in number of virtual machine instructions) -last_type, Set only when an exception not handled andinterpreter -last_value, prints an error. Used by debuggers. -last_traceback -maxint maximum positive value for integers -modules Dictionary of modules that have already been loaded. -path Search path for external modules. Can be modifiedby - program. sys.path[0] == dir of script executing -platform The current platform, e.g. "sunos5", "win32" -ps1, ps2 prompts to use in interactive mode. - File objects used for I/O. 
One can redirect byassigning a -stdin, stdout, new file object to them (or any object:.with a method -stderr write(string) for stdout/stderr,.with a method readline() - for stdin) -version string containing version info about Python interpreter. - (and also: copyright, dllhandle, exec_prefix, prefix) -version_info tuple containing Python version info - (major, minor, - micro, level, serial). - - Some sys functions - Function Result -exit(n) Exits with status n. Raises SystemExit exception.(Hence can - be caught and ignored by program) -getrefcount(object Returns the reference count of the object. Generally one -) higher than you might expect, because of object arg temp - reference. -setcheckinterval( Sets the interpreter's thread switching interval (in number -interval) of virtual code instructions, default:100). -settrace(func) Sets a trace function: called before each line ofcode is - exited. -setprofile(func) Sets a profile function for performance profiling. - Info on exception currently being handled; this is atuple - (exc_type, exc_value, exc_traceback).Warning: assigning the -exc_info() traceback return value to a local variable in a - function handling an exception will cause a circular - reference. -getrecursionlimit Retrieve maximum recursion depth. -() -setrecursionlimit Set maximum recursion depth. (Defaults to 1000.) -() - - - - os -"synonym" for whatever O/S-specific module is proper for current environment. -this module uses posix whenever possible. -(see also M.A. Lemburg's utility http://www.lemburg.com/files/python/ -platform.py) - - Some os variables - Variable Meaning -name name of O/S-specific module (e.g. "posix", "mac", "nt") -path O/S-specific module for path manipulations. - On Unix, os.path.split() <=> posixpath.split() -curdir string used to represent current directory ('.') -pardir string used to represent parent directory ('..') -sep string used to separate directories ('/' or '\'). Tip: use - os.path.join() to build portable paths. -altsep Alternate sep -if applicable (None -otherwise) -pathsep character used to separate search path components (as in - $PATH), eg. ';' for windows. -linesep line separator as used in binary files, ie '\n' on Unix, '\ - r\n' on Dos/Win, '\r' - - Some os functions - Function Result -makedirs(path[, Recursive directory creation (create required intermediary -mode=0777]) dirs); os.error if fails. -removedirs(path) Recursive directory delete (delete intermediary empty - dirs); if fails. -renames(old, new) Recursive directory or file renaming; os.error if fails. - - - - posix -don't import this module directly, import os instead ! -(see also module: shutil for file copy & remove fcts) - - posix Variables -Variable Meaning -environ dictionary of environment variables, e.g.posix.environ['HOME']. -error exception raised on POSIX-related error. - Corresponding value is tuple of errno code and perror() string. - - Some posix functions - Function Result -chdir(path) Changes current directory to path. -chmod(path, Changes the mode of path to the numeric mode -mode) -close(fd) Closes file descriptor fd opened with posix.open. -_exit(n) Immediate exit, with no cleanups, no SystemExit,etc. Should use - this to exit a child process. -execv(p, args) "Become" executable p with args args -getcwd() Returns a string representing the current working directory -getpid() Returns the current process id -fork() Like C's fork(). 
Returns 0 to child, child pid to parent.[Not - on Windows] -kill(pid, Like C's kill [Not on Windows] -signal) -listdir(path) Lists (base)names of entries in directory path, excluding '.' - and '..' -lseek(fd, pos, Sets current position in file fd to position pos, expressedas -how) an offset relative to beginning of file (how=0), tocurrent - position (how=1), or to end of file (how=2) -mkdir(path[, Creates a directory named path with numeric mode (default 0777) -mode]) -open(file, Like C's open(). Returns file descriptor. Use file object -flags, mode) fctsrather than this low level ones. -pipe() Creates a pipe. Returns pair of file descriptors (r, w) [Not on - Windows]. -popen(command, Opens a pipe to or from command. Result is a file object to -mode='r', read to orwrite from, as indicated by mode being 'r' or 'w'. -bufSize=0) Use it to catch acommand output ('r' mode) or to feed it ('w' - mode). -remove(path) See unlink. -rename(src, dst Renames/moves the file or directory src to dst. [error iftarget -) name already exists] -rmdir(path) Removes the empty directory path -read(fd, n) Reads n bytes from file descriptor fd and return as string. - Returns st_mode, st_ino, st_dev, st_nlink, st_uid,st_gid, -stat(path) st_size, st_atime, st_mtime, st_ctime.[st_ino, st_uid, st_gid - are dummy on Windows] -system(command) Executes string command in a subshell. Returns exitstatus of - subshell (usually 0 means OK). - Returns accumulated CPU times in sec (user, system, children's -times() user,children's sys, elapsed real time). [3 last not on - Windows] -unlink(path) Unlinks ("deletes") the file (not dir!) path. same as: remove -utime(path, ( Sets the access & modified time of the file to the given tuple -aTime, mTime)) of values. -wait() Waits for child process completion. Returns tuple ofpid, - exit_status [Not on Windows] -waitpid(pid, Waits for process pid to complete. Returns tuple ofpid, -options) exit_status [Not on Windows] -write(fd, str) Writes str to file fd. Returns nb of bytes written. - - - - posixpath -Do not import this module directly, import os instead and refer to this module -as os.path. (e.g. os.path.exists(p)) ! - - Some posixpath functions - Function Result -abspath(p) Returns absolute path for path p, taking current working dir in - account. -dirname/ -basename(p directory and name parts of the path p. See also split. -) -exists(p) True if string p is an existing path (file or directory) -expanduser Returns string that is (a copy of) p with "~" expansion done. -(p) -expandvars Returns string that is (a copy of) p with environment vars expanded. -(p) [Windows: case significant; must use Unix: $var notation, not %var%] -getsize( return the size in bytes of filename. raise os.error. -filename) -getmtime( return last modification time of filename (integer nb of seconds -filename) since epoch). -getatime( return last access time of filename (integer nb of seconds since -filename) epoch). -isabs(p) True if string p is an absolute path. -isdir(p) True if string p is a directory. -islink(p) True if string p is a symbolic link. -ismount(p) True if string p is a mount point [true for all dirs on Windows]. -join(p[,q Joins one or more path components intelligently. -[,...]]) - Splits p into (head, tail) where tail is lastpathname component and -split(p) is everything leadingup to that. 
<=> (dirname(p), basename - (p)) -splitdrive Splits path p in a pair ('drive:', tail) [Windows] -(p) -splitext(p Splits into (root, ext) where last comp of root contains no periods -) and ext is empty or startswith a period. - Calls the function visit with arguments(arg, dirname, names) for - each directory recursively inthe directory tree rooted at p -walk(p, (including p itself if it's a dir)The argument dirname specifies the -visit, arg visited directory, the argumentnames lists the files in the -) directory. The visit function maymodify names to influence the set - of directories visited belowdirname, e.g., to avoid visiting certain - parts of the tree. - - - - shutil -high-level file operations (copying, deleting). - - Main shutil functions - Function Result -copy(src, dst) Copies the contents of file src to file dst, retaining file - permissions. -copytree(src, dst Recursively copies an entire directory tree rooted at src -[, symlinks]) into dst (which should not already exist). If symlinks is - true, links insrc are kept as such in dst. -rmtree(path[, Deletes an entire directory tree, ignoring errors if -ignore_errors[, ignore_errors true,or calling onerror(func, path, -onerror]]) sys.exc_info()) if supplied with - -(and also: copyfile, copymode, copystat, copy2) - -time - - Variables -Variable Meaning -altzone signed offset of local DST timezone in sec west of the 0th meridian. -daylight nonzero if a DST timezone is specified - - Functions - Function Result -time() return a float representing UTC time in seconds since the epoch. -gmtime(secs), return a tuple representing time : (year aaaa, month(1-12),day -localtime( (1-31), hour(0-23), minute(0-59), second(0-59), weekday(0-6, 0 is -secs) monday), Julian day(1-366), daylight flag(-1,0 or 1)) -asctime( -timeTuple), -strftime( -format, return a formatted string representing time. -timeTuple) -mktime(tuple) inverse of localtime(). Return a float. -strptime( parse a formatted string representing time, return tuple as in -string[, gmtime(). -format]) -sleep(secs) Suspend execution for seconds. can be a float. - -and also: clock, ctime. - - string - -As of Python 2.0, much (though not all) of the functionality provided by the -string module have been superseded by built-in string methods - see Operations -on strings for details. - - Some string variables - Variable Meaning -digits The string '0123456789' -hexdigits, octdigits legal hexadecimal & octal digits -letters, uppercase, lowercase, Strings containing the appropriate -whitespace characters -index_error Exception raised by index() if substr not - found. - - Some string functions - Function Result -expandtabs(s, returns a copy of string with tabs expanded. -tabSize) -find/rfind(s, sub Return the lowest/highest index in where the substring -[, start=0[, end= is found such that is wholly contained ins -0]) [start:end]. Return -1 if not found. -ljust/rjust/center Return a copy of string left/right justified/centerd in -(s, width) afield of given width, padded with spaces. is - nevertruncated. -lower/upper(s) Return a string that is (a copy of) in lowercase/ - uppercase -split(s[, sep= Return a list containing the words of the string ,using -whitespace[, the string as a separator. -maxsplit=0]]) -join(words[, sep=' Concatenate a list or tuple of words with -']) interveningseparators; inverse of split. -replace(s, old, Returns a copy of string with all occurrences of -new[, maxsplit=0] substring replaced by . Limits to - firstsubstitutions if specified. 
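(As an aside before the remaining entries of this table: the functions above now
exist as methods on string objects themselves, which is the preferred spelling.
A minimal hedged sketch follows; the sample strings are made up, and
string.capwords is one helper that still lives in the module.)

    import string

    s = 'one  two three'
    s.split()                    # ['one', 'two', 'three'] -- str method, preferred
    ' '.join(['one', 'two'])     # 'one two'
    s.replace('three', '3')      # 'one  two 3'
    string.capwords(s)           # 'One Two Three' -- helper still in the module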
-strip(s)      Return a string that is (a copy of) s without leading and
-              trailing whitespace. See also lstrip, rstrip.
-
-
-
-                              re (sre)
-
-Handles Unicode strings. Implemented in the module sre; re is now a mere
-front-end kept for compatibility.
-Patterns are specified as strings. Tip: use raw strings (e.g. r'\w*') to keep
-backslashes literal.
-
-
-                      Regular expression syntax
- Form          Description
-.              matches any character (including newline if DOTALL flag specified)
-^              matches start of the string (of every line in MULTILINE mode)
-$              matches end of the string (of every line in MULTILINE mode)
-*              0 or more of preceding regular expression (as many as possible)
-+              1 or more of preceding regular expression (as many as possible)
-?              0 or 1 occurrence of preceding regular expression
-*?, +?, ??     Same as *, + and ? but matches as few characters as possible
-{m,n}          matches from m to n repetitions of preceding RE
-{m,n}?         idem, attempting to match as few repetitions as possible
-[ ]            defines character set: e.g. '[a-zA-Z]' to match all letters
-               (see also \w \S)
-[^ ]           defines complemented character set: matches if char is NOT in set
-\              escapes special chars '*?+&$|()' and introduces special sequences
-               (see below). Due to Python string rules, write as '\\' or r'\'
-               in the pattern string.
-\\             matches a literal '\'; due to Python string rules, write as
-               '\\\\' in the pattern string, or better use a raw string: r'\\'.
-|              specifies alternative: 'foo|bar' matches 'foo' or 'bar'
-(...)          matches any RE inside (), and delimits a group.
-(?:...)        idem but doesn't delimit a group.
-(?=...)        matches if ... matches next, but doesn't consume any of the
-               string; e.g. 'Isaac (?=Asimov)' matches 'Isaac' only if followed
-               by 'Asimov'.
-(?!...)        matches if ... doesn't match next. Negative of (?=...)
-(?P<name>...)  matches any RE inside () and delimits a named group
-               (e.g. r'(?P<id>[a-zA-Z_]\w*)' defines a group named id).
-(?P=name)      matches whatever text was matched by the earlier group named name.
-(?#...)        A comment; ignored.
-(?letter)      letter is one of 'i', 'L', 'm', 's', 'x'. Set the corresponding
-               flags (re.I, re.L, re.M, re.S, re.X) for the entire RE.
-
-                          Special sequences
-Sequence       Description
-\number        matches content of the group of the same number; groups are
-               numbered starting from 1
-\A             matches only at the start of the string
-\b             empty str at beg or end of word: '\bis\b' matches 'is', but not 'his'
-\B             empty str NOT at beginning or end of word
-\d             any decimal digit (<=> [0-9])
-\D             any non-decimal digit char (<=> [^0-9])
-\s             any whitespace char (<=> [ \t\n\r\f\v])
-\S             any non-whitespace char (<=> [^ \t\n\r\f\v])
-\w             any alphanumeric char (depends on LOCALE flag)
-\W             any non-alphanumeric char (depends on LOCALE flag)
-\Z             matches only at the end of the string
-
-                             Variables
-Variable       Meaning
-error          Exception raised when the pattern string isn't a valid regexp.
-
-                             Functions
- Function                     Result
-compile(pattern[, flags=0])
-              Compile a RE pattern string into a regular expression object.
-              Flags (combinable by |):
-                I or IGNORECASE or (?i)  case insensitive matching
-                L or LOCALE or (?L)      make \w, \W, \b, \B dependent on the
-                                         current locale
-                M or MULTILINE or (?m)   '^' and '$' match at every new line,
-                                         not only at start/end of the whole
-                                         string
-                S or DOTALL or (?s)      '.' matches ALL chars, including
-                                         newline
-                X or VERBOSE or (?x)     ignores whitespace outside character
-                                         sets
-escape(string)
-              Return (a copy of) string with all non-alphanumerics backslashed.
-match(pattern, string[, flags])
-              If 0 or more chars at the beginning of string match the RE
-              pattern, return a corresponding MatchObject instance, or None if
-              no match.
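A brief hedged sketch of compile() and match() as just described; the pattern
and sample text are made up, and search() with the remaining functions follows
below.

    import re

    pat = re.compile(r'(\d+)-(\d+)', re.IGNORECASE)  # flags are combinable with |
    m = pat.match('12-34 and more')                  # match() anchors at the start
    if m is not None:
        print(m.group(1), m.group(2))                # 12 34
    print(pat.match('x 12-34'))                      # None: no match at the beginning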
-search(pattern scan thru for a location matching , return -, string[, acorresponding MatchObject instance, or None if no match. -flags]) -split(pattern, split by occurrences of . If capturing () are -string[, used inpattern, then occurrences of patterns or subpatterns are -maxsplit=0]) also returned. -findall( return a list of non-overlapping matches in , either a -pattern, list ofgroups or a list of tuples if the pattern has more than 1 -string) group. - return string obtained by replacing the ( first) lefmost -sub(pattern, non-overlapping occurrences of (a string or a RE -repl, string[, object) in by ; can be a string or a fct -count=0]) called with a single MatchObj arg, which must return the - replacement string. -subn(pattern, -repl, string[, same as sub(), but returns a tuple (newString, numberOfSubsMade) -count=0]) - -Regular Expression Objects - - -(RE objects are returned by the compile fct) - - re object attributes -Attribute Descrition -flags flags arg used when RE obj was compiled, or 0 if none provided -groupindex dictionary of {group name: group number} in pattern -pattern pattern string from which RE obj was compiled - - re object methods - Method Result - If zero or more characters at the beginning of string match this - regular expression, return a corresponding MatchObject instance. - Return None if the string does not match the pattern; note that - this is different from a zero-length match. - The optional second parameter pos gives an index in the string -match( where the search is to start; it defaults to 0. This is not -string[, completely equivalent to slicing the string; the '' pattern -pos][, character matches at the real beginning of the string and at -endpos]) positions just after a newline, but not necessarily at the index - where the search is to start. - The optional parameter endpos limits how far the string will be - searched; it will be as if the string is endpos characters long, so - only the characters from pos to endpos will be searched for a - match. - Scan through string looking for a location where this regular -search( expression produces a match, and return a corresponding MatchObject -string[, instance. Return None if no position in the string matches the -pos][, pattern; note that this is different from finding a zero-length -endpos]) match at some point in the string. - The optional pos and endpos parameters have the same meaning as for - the match() method. -split( -string[, Identical to the split() function, using the compiled pattern. -maxsplit= -0]) -findall( Identical to the findall() function, using the compiled pattern. -string) -sub(repl, -string[, Identical to the sub() function, using the compiled pattern. -count=0]) -subn(repl, -string[, Identical to the subn() function, using the compiled pattern. -count=0]) - -Match Objects - - -(Match objects are returned by the match & search functions) - - Match object attributes -Attribute Description -pos value of pos passed to search or match functions; index intostring at - which RE engine started search. -endpos value of endpos passed to search or match functions; index intostring - beyond which RE engine won't go. -re RE object whose match or search fct produced this MatchObj instance -string string passed to match() or search() - - Match object functions -Function Result - returns one or more groups of the match. If one arg, result is a -group([g1 string;if multiple args, result is a tuple with one item per arg. 
If -, g2, gi is 0,return value is entire matching string; if 1 <= gi <= 99, -...]) returnstring matching group #gi (or None if no such group); gi may - also bea group name. - returns a tuple of all groups of the match; groups not -groups() participatingto the match have a value of None. Returns a string - instead of tupleif len(tuple)=1 -start( -group), returns indices of start & end of substring matched by group (or -end(group Noneif group exists but doesn't contribute to the match) -) -span( returns the 2-tuple (start(group), end(group)); can be (None, None)if -group) group didn't contibute to the match. - - - - math - -Variables: -pi -e -Functions (see ordinary C man pages for info): -acos(x) -asin(x) -atan(x) -atan2(x, y) -ceil(x) -cos(x) -cosh(x) -degrees(x) -exp(x) -fabs(x) -floor(x) -fmod(x, y) -frexp(x) -- Unlike C: (float, int) = frexp(float) -ldexp(x, y) -log(x [,base]) -log10(x) -modf(x) -- Unlike C: (float, float) = modf(float) -pow(x, y) -radians(x) -sin(x) -sinh(x) -sqrt(x) -tan(x) -tanh(x) - - getopt - -Functions: -getopt(list, optstr) -- Similar to C. is option - letters to look for. Put ':' after letter - if option takes arg. E.g. - # invocation was "python test.py -c hi -a arg1 arg2" - opts, args = getopt.getopt(sys.argv[1:], 'ab:c:') - # opts would be - [('-c', 'hi'), ('-a', '')] - # args would be - ['arg1', 'arg2'] - - -List of modules and packages in base distribution - -(built-ins and content of python Lib directory) -(Python NT distribution, may be slightly different in other distributions) - - Standard library modules - Operation Result -aifc Stuff to parse AIFF-C and AIFF files. -asynchat Support for 'chat' style protocols -asyncore Asynchronous File I/O (in select style) -atexit Register functions to be called at exit of Python interpreter. -base64 Conversions to/from base64 RFC-MIME transport encoding . -bdb A generic Python debugger base class. -binhex Macintosh binhex compression/decompression. -bisect List bisection algorithms. -bz2 Support for bz2 compression/decompression. -calendar Calendar printing functions. -cgi Wraps the WWW Forms Common Gateway Interface (CGI). -cgitb Utility for handling CGI tracebacks. -cmd A generic class to build line-oriented command interpreters. -datetime Basic date and time types. -code Utilities needed to emulate Python's interactive interpreter -codecs Lookup existing Unicode encodings and register new ones. -colorsys Conversion functions between RGB and other color systems. -compileall Force "compilation" of all .py files in a directory. -configparser Configuration file parser (much like windows .ini files) -copy Generic shallow and deep copying operations. -copyreg Helper to provide extensibility for pickle/cPickle. -csv Read and write files with comma separated values. -dbm Generic interface to all dbm clones (dbm.bsd, dbm.gnu, - dbm.ndbm, dbm.dumb). -dircache Sorted list of files in a dir, using a cache. -difflib Tool for creating delta between sequences. -dis Bytecode disassembler. -distutils Package installation system. -doctest Tool for running and verifying tests inside doc strings. -dospath Common operations on DOS pathnames. -email Comprehensive support for internet email. -filecmp File comparison. -fileinput Helper class to quickly write a loop over all standard input - files. -fnmatch Filename matching with shell patterns. -formatter A test formatter. -fpformat General floating point formatting functions. -ftplib An FTP client class. Based on RFC 959. 
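(Stepping back briefly before the module list continues: the MatchObject
attributes and methods described above can be tied together in a short hedged
sketch; the pattern and text are made up for illustration.)

    import re

    m = re.search(r'(?P<word>\w+)\s+(\d+)', 'item 42')
    if m is not None:
        m.group(0)         # 'item 42' -- the whole match
        m.group('word')    # 'item'    -- named group
        m.group(2)         # '42'
        m.groups()         # ('item', '42')
        m.span(2)          # (5, 7)    -- start/end indices of group 2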
-gc              Perform garbage collection, obtain GC debug stats, and tune
-                GC parameters.
-getopt          Standard command line processing. See also ftp://
-                www.pauahtun.org/pub/getargspy.zip
-getpass         Utilities to get a password and/or the current user name.
-glob            Filename globbing.
-gzip            Read & write gzipped files.
-heapq           Priority queue implemented using lists organized as heaps.
-hmac            Keyed-Hashing for Message Authentication -- RFC 2104.
-html.entities   HTML entity definitions.
-html.parser     A parser for HTML and XHTML.
-http.client     HTTP client class.
-http.server     HTTP server services.
-ihooks          Hooks into the "import" mechanism.
-imaplib         IMAP4 client. Based on RFC 2060.
-imghdr          Recognizing image files based on their first few bytes.
-imputil         Provides a way of writing customised import hooks.
-inspect         Tool for probing live Python objects.
-keyword         List of Python keywords.
-linecache       Cache lines from files.
-locale          Support for number formatting using the current locale
-                settings.
-logging         Python logging facility.
-macpath         Pathname (or related) operations for the Macintosh.
-macurl2path     Mac specific module for conversion between pathnames and URLs.
-mailbox         A class to handle a unix-style or mmdf-style mailbox.
-mailcap         Mailcap file handling (RFC 1524).
-mhlib           MH (mailbox) interface.
-mimetypes       Guess the MIME type of a file.
-mmap            Interface to memory-mapped files - they behave like mutable
-                strings.
-multifile       Class to make multi-file messages easier to handle.
-mutex           Mutual exclusion -- for use with module sched.
-netrc           Parse ~/.netrc files.
-nntplib         An NNTP client class. Based on RFC 977.
-ntpath          Common operations on DOS pathnames.
-nturl2path      NT specific module for conversion between pathnames and URLs.
-optparse        A comprehensive tool for processing command line options.
-os              Either mac, dos or posix depending on the system.
-pdb             A Python debugger.
-pickle          Pickling (save and restore) of Python objects (a faster
-                C implementation exists in built-in module: cPickle).
-pipes           Conversion pipeline templates.
-pkgutil         Utilities for working with Python packages.
-poplib          A POP3 client class. Based on the J. Myers POP3 draft.
-posixpath       Common operations on POSIX pathnames.
-pprint          Support to pretty-print lists, tuples, & dictionaries
-                recursively.
-profile         Class for profiling python code.
-pstats          Class for printing reports on profiled python code.
-pydoc           Utility for generating documentation from source files.
-pty             Pseudo terminal utilities.
-pyexpat         Interface to the Expat XML parser.
-py_compile      Routine to "compile" a .py file to a .pyc file.
-pyclbr          Parse a Python file and retrieve classes and methods.
-queue           A multi-producer, multi-consumer queue.
-quopri          Conversions to/from quoted-printable transport encoding.
-random          Random variable generators.
-re              Regular expressions.
-reprlib         Redo repr() but with limits on most sizes.
-rlcompleter     Word completion for GNU readline 2.0.
-sched           A generally useful event scheduler class.
-shelve          Manage shelves of pickled objects.
-shlex           Lexical analyzer class for simple shell-like syntaxes.
-shutil          Utility functions usable in a shell-like program.
-site            Append module search paths for third-party packages to
-                sys.path.
-smtplib         SMTP client class (RFC 821).
-sndhdr          Several routines that help recognizing sound files.
-socketserver    Generic socket server classes.
-stat            Constants and functions for interpreting stat/lstat structs.
-statvfs         Constants for interpreting the statvfs struct as returned by
-                os.statvfs() and os.fstatvfs() (if they exist).
-string          A collection of string operations.
-sunau Stuff to parse Sun and NeXT audio files. -sunaudio Interpret sun audio headers. -symbol Non-terminal symbols of Python grammar (from "graminit.h"). -tabnanny Check Python source for ambiguous indentation. -tarfile Facility for reading and writing to the *nix tarfile format. -telnetlib TELNET client class. Based on RFC 854. -tempfile Temporary file name allocation. -textwrap Object for wrapping and filling text. -threading Proposed new higher-level threading interfaces -token Tokens (from "token.h"). -tokenize Compiles a regular expression that recognizes Python tokens. -traceback Format and print Python stack traces. -tty Terminal utilities. -turtle LogoMation-like turtle graphics -types Define names for all type symbols in the std interpreter. -tzparse Parse a timezone specification. -unicodedata Interface to unicode properties. -urllib.parse Parse URLs according to latest draft of standard. -urllib.request Open an arbitrary URL. -urllib.robotparser Parse robots.txt files, useful for web spiders. -user Hook to allow user-specified customization code to run. -uu UUencode/UUdecode. -unittest Utilities for implementing unit testing. -wave Stuff to parse WAVE files. -weakref Tools for creating and managing weakly referenced objects. -webbrowser Platform independent URL launcher. -xdrlib Implements (a subset of) Sun XDR (eXternal Data - Representation). -xml.dom Classes for processing XML using the Document Object Model. -xml.sax Classes for processing XML using the SAX API. -xmlrpc.client Support for remote procedure calls using XML. -xmlrpc.server Create XMLRPC servers. -zipfile Read & write PK zipped files. - - - -* Built-ins * - - sys Interpreter state vars and functions - __built-in__ Access to all built-in python identifiers - __main__ Scope of the interpreters main program, script or stdin - array Obj efficiently representing arrays of basic values - math Math functions of C standard - time Time-related functions (also the newer datetime module) - marshal Read and write some python values in binary format - struct Convert between python values and C structs - -* Standard * - - getopt Parse cmd line args in sys.argv. A la UNIX 'getopt'. - os A more portable interface to OS dependent functionality - re Functions useful for working with regular expressions - string Useful string and characters functions and exceptions - random Mersenne Twister pseudo-random number generator - _thread Low-level primitives for working with process threads - threading idem, new recommended interface. - -* Unix/Posix * - - dbm Interface to Unix dbm databases - grp Interface to Unix group database - posix OS functionality standardized by C and POSIX standards - posixpath POSIX pathname functions - pwd Access to the Unix password database - select Access to Unix select multiplex file synchronization - socket Access to BSD socket interface - -* Tk User-interface Toolkit * - - tkinter Main interface to Tk - -* Multimedia * - - audioop Useful operations on sound fragments - imageop Useful operations on images - jpeg Access to jpeg image compressor and decompressor - rgbimg Access SGI imglib image files - -* Cryptographic Extensions * - - md5 Interface to RSA's MD5 message digest algorithm - sha Interface to the SHA message digest algorithm - HMAC Keyed-Hashing for Message Authentication -- RFC 2104. 
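To make the last group (the cryptographic extensions) concrete, here is a
minimal hedged sketch of keyed hashing; it uses the modern hmac/hashlib
spelling rather than the old md5/sha modules, and the key and message are
made-up placeholders.

    import hmac
    import hashlib

    # RFC 2104 keyed hash; key and message are placeholders for illustration
    digest = hmac.new(b'secret-key', b'some message', hashlib.sha256).hexdigest()
    print(digest)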
- -* SGI IRIX * (4 & 5) - - al SGI audio facilities - AL al constants - fl Interface to FORMS library - FL fl constants - flp Functions for form designer - fm Access to font manager library - gl Access to graphics library - GL Constants for gl - DEVICE More constants for gl - imgfile Imglib image file interface - - -Workspace exploration and idiom hints - - dir() list functions, variables in - dir() get object keys, defaults to local name space - if __name__ == '__main__': main() invoke main if running as script - map(None, lst1, lst2, ...) merge lists - b = a[:] create copy of seq structure - _ in interactive mode, is last value printed - - - - - - - -Python Mode for Emacs - -(Not revised, possibly not up to date) -Type C-c ? when in python-mode for extensive help. -INDENTATION -Primarily for entering new code: - TAB indent line appropriately - LFD insert newline, then indent - DEL reduce indentation, or delete single character -Primarily for reindenting existing code: - C-c : guess py-indent-offset from file content; change locally - C-u C-c : ditto, but change globally - C-c TAB reindent region to match its context - C-c < shift region left by py-indent-offset - C-c > shift region right by py-indent-offset -MARKING & MANIPULATING REGIONS OF CODE -C-c C-b mark block of lines -M-C-h mark smallest enclosing def -C-u M-C-h mark smallest enclosing class -C-c # comment out region of code -C-u C-c # uncomment region of code -MOVING POINT -C-c C-p move to statement preceding point -C-c C-n move to statement following point -C-c C-u move up to start of current block -M-C-a move to start of def -C-u M-C-a move to start of class -M-C-e move to end of def -C-u M-C-e move to end of class -EXECUTING PYTHON CODE -C-c C-c sends the entire buffer to the Python interpreter -C-c | sends the current region -C-c ! starts a Python interpreter window; this will be used by - subsequent C-c C-c or C-c | commands -C-c C-w runs PyChecker - -VARIABLES -py-indent-offset indentation increment -py-block-comment-prefix comment string used by py-comment-region -py-python-command shell command to invoke Python interpreter -py-scroll-process-buffer t means always scroll Python process buffer -py-temp-directory directory used for temp files (if needed) -py-beep-if-tab-change ring the bell if tab-width is changed - - -The Python Debugger - -(Not revised, possibly not up to date, see 1.5.2 Library Ref section 9.1; in 1.5.2, you may also use debugger integrated in IDLE) - -Accessing - -import pdb (it's a module written in Python) - -- defines functions : - run(statement[,globals[, locals]]) - -- execute statement string under debugger control, with optional - global & local environment. - runeval(expression[,globals[, locals]]) - -- same as run, but evaluate expression and return value. - runcall(function[, argument, ...]) - -- run function object with given arg(s) - pm() -- run postmortem on last exception (like debugging a core file) - post_mortem(t) - -- run postmortem on traceback object - - -- defines class Pdb : - use Pdb to create reusable debugger objects. Object - preserves state (i.e. break points) between calls. - - runs until a breakpoint hit, exception, or end of program - If exception, variable '__exception__' holds (exception,value). 
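Before the command list, a minimal hedged sketch of the module-level entry
points just described; the div function and its arguments are made up, and
running this from a terminal drops into the interactive (Pdb) prompt.

    import pdb

    def div(a, b):
        return a / b

    # step through a single call under debugger control
    pdb.runcall(div, 6, 3)

    # after an uncaught exception in an interactive session:
    #   >>> pdb.pm()        # post-mortem on the last traceback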
- -Commands - -h, help - brief reminder of commands -b, break [] - if numeric, break at line in current file - if is function object, break on entry to fcn - if no arg, list breakpoints -cl, clear [] - if numeric, clear breakpoint at in current file - if no arg, clear all breakpoints after confirmation -w, where - print current call stack -u, up - move up one stack frame (to top-level caller) -d, down - move down one stack frame -s, step - advance one line in the program, stepping into calls -n, next - advance one line, stepping over calls -r, return - continue execution until current function returns - (return value is saved in variable "__return__", which - can be printed or manipulated from debugger) -c, continue - continue until next breakpoint -j, jump lineno - Set the next line that will be executed -a, args - print args to current function -rv, retval - prints return value from last function that returned -p, print - prints value of in current stack frame -l, list [ [, ]] - List source code for the current file. - Without arguments, list 11 lines around the current line - or continue the previous listing. - With one argument, list 11 lines starting at that line. - With two arguments, list the given range; - if the second argument is less than the first, it is a count. -whatis - prints type of -! - executes rest of line as a Python statement in the current stack frame -q quit - immediately stop execution and leave debugger - - executes last command again -Any input debugger doesn't recognize as a command is assumed to be a -Python statement to execute in the current stack frame, the same way -the exclamation mark ("!") command does. - -Example - -(1394) python -Python 1.0.3 (Sep 26 1994) -Copyright 1991-1994 Stichting Mathematisch Centrum, Amsterdam ->>> import rm ->>> rm.run() -Traceback (innermost last): - File "", line 1 - File "./rm.py", line 7 - x = div(3) - File "./rm.py", line 2 - return a / r -ZeroDivisionError: integer division or modulo ->>> import pdb ->>> pdb.pm() -> ./rm.py(2)div: return a / r -(Pdb) list - 1 def div(a): - 2 -> return a / r - 3 - 4 def run(): - 5 global r - 6 r = 0 - 7 x = div(3) - 8 print x -[EOF] -(Pdb) print r -0 -(Pdb) q ->>> pdb.runcall(rm.run) -etc. - -Quirks - -Breakpoints are stored as filename, line number tuples. If a module is reloaded -after editing, any remembered breakpoints are likely to be wrong. - -Always single-steps through top-most stack frame. That is, "c" acts like "n". Deleted: python/branches/pep-3151/Misc/developers.txt ============================================================================== --- python/branches/pep-3151/Misc/developers.txt Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,346 +0,0 @@ -Developer Log -============= - -This file is a running log of developers given permissions on SourceForge. - -The purpose is to provide some institutional memory of who was given access -and why. - -The first entry starts in April 2005. In keeping with the style of -Misc/NEWS, newer entries should be added to the top. Any markup should -be in the form of ReST. Entries should include the initials of the -project admin who made the change or granted access. Feel free to revise -the format to accommodate documentation needs as they arise. - -Note, when giving new commit permissions, be sure to get a contributor -agreement from the committer. See http://www.python.org/psf/contrib/ -for details. When the agreement is signed, please note it in this log. - -This file is encoded in UTF-8. 
If the usual form for a name is not in
-a Latin or extended Latin alphabet, make sure to include an ASCII
-transliteration too.
-
-Permissions History
--------------------
-
-- David Malcolm was given commit access on Oct 27 2010 by GFB,
-  at recommendation by Antoine Pitrou and Raymond Hettinger.
-
-- Tal Einat was given commit access on Oct 4 2010 by MvL,
-  for improving IDLE.
-
-- Łukasz Langa was given commit access on Sep 08 2010 by GFB,
-  at suggestion of Antoine Pitrou, for general bug fixing.
-
-- Daniel Stutzbach was given commit access on Aug 22 2010 by MvL,
-  for general bug fixing.
-
-- Ask Solem was given commit access on Aug 17 2010 by MvL,
-  on recommendation by Jesse Noller, for work on the multiprocessing
-  library.
-
-- George Boutsioukis was given commit access on Aug 10 2010
-  by MvL, for work on 2to3.
-
-- Éric Araujo was given commit access on Aug 10 2010 by BAC,
-  at suggestion of Tarek Ziadé.
-
-- Terry Reedy was given commit access on Aug 04 2010 by MvL,
-  at suggestion of Nick Coghlan.
-
-- Brian Quinlan was given commit access on Jul 26 2010 by GFB,
-  for work related to PEP 3148.
-
-- Reid Kleckner was given commit access on Jul 11 2010 by GFB,
-  for work on the py3k-jit branch, at suggestion of the Unladen
-  Swallow team.
-
-- Alexander Belopolsky was given commit access on May 25 2010
-  by MvL at suggestion of Mark Dickinson.
-
-- Tim Golden was given commit access on April 21 2010 by MvL,
-  at suggestion of Michael Foord.
-
-- Giampaolo Rodolà was given commit access on April 17 2010 by
-  MvL, at suggestion of R. David Murray.
-
-- Jean-Paul Calderone was given commit access on April 6 2010 by
-  GFB, at suggestion of Michael Foord and others.
-
-- Brian Curtin was given commit access on March 24 2010 by MvL.
-
-- Florent Xicluna was given commit access on February 25 2010 by
-  MvL, based on Antoine Pitrou's recommendation.
-
-- Dino Viehland was given SVN access on February 23 2010 by Brett
-  Cannon, for backporting tests from IronPython.
-
-- Larry Hastings was given SVN access on February 22 2010 by
-  Andrew Kuchling, based on Brett Cannon's recommendation.
-
-- Victor Stinner was given SVN access on January 30 2010 by MvL,
-  at recommendation by Mark Dickinson and Amaury Forgeot d'Arc.
-
-- Stefan Krah was given SVN access on January 5 2010 by GFB, at
-  suggestion of Mark Dickinson, for work on the decimal module.
-
-- Doug Hellmann was given SVN access on September 19 2009 by GFB, at
-  suggestion of Jesse Noller, for documentation work.
-
-- Ezio Melotti was given SVN access on June 7 2009 by GFB, for work on and
-  fixes to the documentation.
-
-- Paul Kippes was given commit privileges at PyCon 2009 by BAC to work on 3to2.
-
-- Ron DuPlain was given commit privileges at PyCon 2009 by BAC to work on 3to2.
-
-- Several developers of alternative Python implementations were
-  given access for test suite and library adaptations by MvL:
-  Allison Randal (Parrot), Michael Foord (IronPython),
-  Jim Baker, Philip Jenvey, and Frank Wierzbicki (all Jython).
-
-- R. David Murray was given SVN access on March 30 2009 by MvL, after
-  recommendation by BAC.
-
-- Chris Withers was given SVN access on March 8 2009 by MvL,
-  after recommendation by GvR.
-
-- Tarek Ziadé was given SVN access on December 21 2008 by NCN,
-  for maintenance of distutils.
-
-- Hirokazu Yamamoto was given SVN access on August 12 2008 by MvL,
-  for contributions to the Windows build.
- -- Antoine Pitrou was given SVN access on July 16 2008, by recommendation - from GvR, for general contributions to Python. - -- Jesse Noller was given SVN access on 16 June 2008 by GFB, - for work on the multiprocessing module. - -- Gregor Lingl was given SVN access on 10 June 2008 by MvL, - for work on the turtle module. - -- Robert Schuppenies was given SVN access on 21 May 2008 by MvL, - for GSoC contributions. - -- Rodrigo Bernardo Pimentel was given SVN access on 29 April 2008 by MvL, - for GSoC contributions. - -- Heiko Weinen was given SVN access on 29 April 2008 by MvL, - for GSoC contributions. - -- Jesus Cea was given SVN access on 24 April 2008 by MvL, - for maintenance of bsddb. - -- Guilherme Polo was given SVN access on 24 April 2008 by MvL, - for GSoC contributions. - -- Thomas Lee was given SVN access on 21 April 2008 by NCN, - for work on branches (ast/optimizer related). - -- Jeroen Ruigrok van der Werven was given SVN access on 12 April 2008 - by GFB, for documentation work. - -- Josiah Carlson was given SVN access on 26 March 2008 by GFB, - for work on asyncore/asynchat. - -- Benjamin Peterson was given SVN access on 25 March 2008 by GFB, - for bug triage work. - -- Jerry Seutter was given SVN access on 20 March 2008 by BAC, for - general contributions to Python. - -- Jeff Rush was given SVN access on 18 March 2008 by AMK, for Distutils work. - -- David Wolever was given SVN access on 17 March 2008 by MvL, - for 2to3 work. - -- Trent Nelson was given SVN access on 17 March 2008 by MvL, - for general contributions to Python. - -- Mark Dickinson was given SVN access on 6 January 2008 by Facundo - Batista for his work on mathemathics and number related issues. - -- Amaury Forgeot d'Arc was given SVN access on 9 November 2007 by MvL, - for general contributions to Python. - -- Christian Heimes was given SVN access on 31 October 2007 by MvL, - for general contributions to Python. - -- Chris Monson was given SVN access on 20 October 2007 by NCN, - for his work on editing PEPs. - -- Bill Janssen was given SVN access on 28 August 2007 by NCN, - for his work on the SSL module and other things related to (SSL) sockets. - -- Jeffrey Yasskin was given SVN access on 9 August 2007 by NCN, - for his work on PEPs and other general patches. - -- Mark Summerfield was given SVN access on 1 August 2007 by GFB, - for work on documentation. - -- Armin Ronacher was given SVN access on 23 July 2007 by GFB, - for work on the documentation toolset. He now maintains the - ast module. - -- Senthil Kumaran was given SVN access on 16 June 2007 by MvL, - for his Summer-of-Code project, mentored by Skip Montanaro. - -- Alexandre Vassalotti was given SVN access on 21 May 2007 by MvL, - for his Summer-of-Code project, mentored by Brett Cannon. - -- Travis Oliphant was given SVN access on 17 Apr 2007 by MvL, - for implementing the extended buffer protocol. - -- Ziga Seilnacht was given SVN access on 09 Mar 2007 by MvL, - for general maintenance. - -- Pete Shinners was given SVN access on 04 Mar 2007 by NCN, - for PEP 3101 work in the sandbox. - -- Pat Maupin and Eric V. Smith were given SVN access on 28 Feb 2007 by NCN, - for PEP 3101 work in the sandbox. - -- Steven Bethard (SF name "bediviere") added to the SourceForge Python - project 26 Feb 2007, by NCN, as a tracker tech. - -- Josiah Carlson (SF name "josiahcarlson") added to the SourceForge Python - project 06 Jan 2007, by NCN, as a tracker tech. He will maintain asyncore. 
- -- Collin Winter was given SVN access on 05 Jan 2007 by NCN, for PEP - update access. - -- Lars Gustaebel was given SVN access on 20 Dec 2006 by NCN, for tarfile.py - related work. - -- 2006 Summer of Code entries: SoC developers are expected to work - primarily in nondist/sandbox or on a branch of their own, and will - have their work reviewed before changes are accepted into the trunk. - - - Matt Fleming was added on 25 May 2006 by AMK; he'll be working on - enhancing the Python debugger. - - - Jackilyn Hoxworth was added on 25 May 2006 by AMK; she'll be adding logging - to the standard library. - - - Mateusz Rukowicz was added on 30 May 2006 by AMK; he'll be - translating the decimal module into C. - -- SVN access granted to the "Need for Speed" Iceland sprint attendees, - between May 17 and 21, 2006, by Tim Peters. All work is to be done - in new sandbox projects or on new branches, with merging to the - trunk as approved: - - Andrew Dalke - Christian Tismer - Jack Diederich - John Benediktsson - Kristj??n V. J??nsson - Martin Blais - Richard Emslie - Richard Jones - Runar Petursson - Steve Holden - Richard M. Tew - -- Steven Bethard was given SVN access on 27 Apr 2006 by DJG, for PEP - update access. - -- Talin was given SVN access on 27 Apr 2006 by DJG, for PEP update - access. - -- George Yoshida (SF name "quiver") added to the SourceForge Python - project 14 Apr 2006, by Tim Peters, as a tracker admin. See - contemporaneous python-checkins thread with the unlikely Subject: - - r45329 - python/trunk/Doc/whatsnew/whatsnew25.tex - -- Ronald Oussoren was given SVN access on 3 Mar 2006 by NCN, for Mac - related work. - -- Bob Ippolito was given SVN access on 2 Mar 2006 by NCN, for Mac - related work. - -- Nick Coghlan requested CVS access so he could update his PEP directly. - Granted by GvR on 16 Oct 2005. - -- Added two new developers for the Summer of Code project. 8 July 2005 - by RDH. Andrew Kuchling will be mentoring Gregory K Johnson for a - project to enhance mailbox. Brett Cannon requested access for Flovis - Bruynooghe (sirolf) to work on pstats, profile, and hotshot. Both users - are expected to work primarily in nondist/sandbox and have their work - reviewed before making updates to active code. - -- Georg Brandl was given SF tracker permissions on 28 May 2005 - by RDH. Since the beginning of 2005, he has been active in discussions - on python-dev and has submitted a dozen patch reviews. The permissions - add the ability to change tracker status and to attach patches. On - 3 June 2005, this was expanded by RDH to include checkin permissions. - -- Terry Reedy was given SF tracker permissions on 7 Apr 2005 by RDH. - -- Nick Coghlan was given SF tracker permissions on 5 Apr 2005 by RDH. - For several months, he has been active in reviewing and contributing - patches. The added permissions give him greater flexibility in - working with the tracker. - -- Eric Price was made a developer on 2 May 2003 by TGP. This was - specifically to work on the new ``decimal`` package, which lived in - ``nondist/sandbox/decimal/`` at the time. - -- Eric S. Raymond was made a developer on 2 Jul 2000 by TGP, for general - library work. His request is archived here: - - http://mail.python.org/pipermail/python-dev/2000-July/005314.html - - -Permissions Dropped on Request ------------------------------- - -- Roy Smith, Matt Fleming and Richard Emslie sent drop requests. 
- 4 Aug 2008 GFB - -- Per note from Andrew Kuchling, the permissions for Gregory K Johnson - and the Summer Of Code project are no longer needed. 4 Aug 2008 GFB - -- Per note from Andrew Kuchling, the permissions for Gregory K Johnson - and the Summer Of Code project are no longer needed. AMK will make - any future checkins directly. 16 Oct 2005 RDH - -- Johannes Gijsbers sent a drop request. 27 July 2005 RDH - -- Flovis Bruynooghe sent a drop request. 14 July 2005 RDH - -- Paul Prescod sent a drop request. 30 Apr 2005 RDH - -- Finn Bock sent a drop request. 13 Apr 2005 RDH - -- Eric Price sent a drop request. 10 Apr 2005 RDH - -- Irmen de Jong requested dropping CVS access while keeping tracker - access. 10 Apr 2005 RDH - -- Moshe Zadka and Ken Manheimer sent drop requests. 8 Apr 2005 by RDH - -- Steve Holden, Gerhard Haring, and David Cole sent email stating that - they no longer use their access. 7 Apr 2005 RDH - - -Permissions Dropped after Loss of Contact ------------------------------------------ - -- Several unsuccessful efforts were made to contact Charles G Waldman. - Removed on 8 Apr 2005 by RDH. - - -Initials of Project Admins --------------------------- - -TGP: Tim Peters -GFB: Georg Brandl -BAC: Brett Cannon -NCN: Neal Norwitz -DJG: David Goodger -MvL: Martin v. Loewis -GvR: Guido van Rossum -RDH: Raymond Hettinger Modified: python/branches/pep-3151/Misc/indent.pro ============================================================================== --- python/branches/pep-3151/Misc/indent.pro (original) +++ python/branches/pep-3151/Misc/indent.pro Sat Feb 26 08:16:32 2011 @@ -1,15 +1,24 @@ --sob --nbad --bap --br --nce --ncs --npcs --i8 --ip8 --c25 +--blank-lines-after-declarations +--blank-lines-after-procedures +--braces-after-func-def-line +--braces-on-if-line +--braces-on-struct-decl-line +--break-after-boolean-operator +--comment-indentation25 +--comment-line-length79 +--continue-at-parentheses +--dont-cuddle-do-while +--dont-cuddle-else +--indent-level4 +--line-length79 +--no-space-after-casts +--no-space-after-function-call-names +--no-space-after-parentheses +--no-tabs +--procnames-start-lines +--space-after-for +--space-after-if +--space-after-while +--swallow-optional-blank-lines +-T PyCFunction -T PyObject - - - - Deleted: python/branches/pep-3151/Misc/maintainers.rst ============================================================================== --- python/branches/pep-3151/Misc/maintainers.rst Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,309 +0,0 @@ -Maintainers Index -================= - -This document has tables that list Python Modules, Tools, Platforms and -Interest Areas and names for each item that indicate a maintainer or an -expert in the field. This list is intended to be used by issue submitters, -issue triage people, and other issue participants to find people to add to -the nosy list or to contact directly by email for help and decisions on -feature requests and bug fixes. People on this list may be asked to render -final judgement on a feature or bug. If no active maintainer is listed for -a given module, then questionable changes should go to python-dev, while -any other issues can and should be decided by any committer. - -Unless a name is followed by a '*', you should never assign an issue to -that person, only make them nosy. Names followed by a '*' may be assigned -issues involving the module or topic for which the name has a '*'. - -The Platform and Interest Area tables list broader fields in which various -people have expertise. 
These people can also be contacted for help, -opinions, and decisions when issues involve their areas. - -If a listed maintainer does not respond to requests for comment for an -extended period (three weeks or more), they should be marked as inactive -in this list by placing the word 'inactive' in parenthesis behind their -tracker id. They are of course free to remove that inactive mark at -any time. - -Committers should update these tables as their areas of expertise widen. -New topics may be added to the Interest Area table at will. - -The existence of this list is not meant to indicate that these people -*must* be contacted for decisions; it is, rather, a resource to be used -by non-committers to find responsible parties, and by committers who do -not feel qualified to make a decision in a particular context. - -See also `PEP 291`_ and `PEP 360`_ for information about certain modules -with special rules. - -.. _`PEP 291`: http://www.python.org/dev/peps/pep-0291/ -.. _`PEP 360`: http://www.python.org/dev/peps/pep-0360/ - - -================== =========== -Module Maintainers -================== =========== -__future__ -__main__ gvanrossum -_dummy_thread brett.cannon -_thread pitrou -abc -aifc r.david.murray -argparse bethard -array -ast -asynchat josiahcarlson, giampaolo.rodola, stutzbach -asyncore josiahcarlson, giampaolo.rodola, stutzbach -atexit -audioop -base64 -bdb -binascii -binhex -bisect rhettinger -builtins -bz2 -calendar -cgi -cgitb -chunk -cmath mark.dickinson -cmd -code -codecs lemburg, doerwalter -codeop -collections rhettinger, stutzbach -colorsys -compileall -concurrent.futures brian.quinlan -configparser lukasz.langa -contextlib -copy alexandre.vassalotti -copyreg alexandre.vassalotti -cProfile -crypt -csv -ctypes theller -curses akuchling -datetime alexander.belopolsky -dbm -decimal facundobatista, rhettinger, mark.dickinson -difflib tim_one -dis -distutils tarek*, eric.araujo* -doctest tim_one (inactive) -dummy_threading brett.cannon -email barry, r.david.murray* -encodings lemburg, loewis -errno -exceptions -fcntl -filecmp -fileinput -fnmatch -formatter -fpectl -fractions mark.dickinson, rhettinger -ftplib giampaolo.rodola -functools -gc pitrou -getopt -getpass -gettext loewis -glob -grp -gzip -hashlib -heapq rhettinger, stutzbach -hmac -html -http -idlelib kbk -imaplib -imghdr -imp -importlib brett.cannon -inspect -io pitrou, benjamin.peterson, stutzbach -itertools rhettinger -json bob.ippolito (inactive) -keyword -lib2to3 benjamin.peterson -linecache -locale loewis, lemburg -logging vinay.sajip -macpath -mailbox akuchling -mailcap -marshal -math mark.dickinson, rhettinger, stutzbach -mimetypes -mmap -modulefinder theller, jvr -msilib loewis -msvcrt -multiprocessing jnoller -netrc -nis -nntplib -numbers -operator -optparse aronacher -os loewis -ossaudiodev -parser -pdb georg.brandl* -pickle alexandre.vassalotti, pitrou -pickletools alexandre.vassalotti -pipes -pkgutil -platform lemburg -plistlib -poplib -posix -pprint fdrake -profile georg.brandl -pstats georg.brandl -pty -pwd -py_compile -pybench lemburg, pitrou -pyclbr -pydoc -queue rhettinger -quopri -random rhettinger -re effbot (inactive), pitrou, ezio.melotti -readline -reprlib -resource -rlcompleter -runpy ncoghlan -sched -select -shelve -shlex -shutil tarek -signal -site -smtpd -smtplib -sndhdr -socket -socketserver -spwd -sqlite3 ghaering -ssl janssen, pitrou, giampaolo.rodola -stat -string georg.brandl* -stringprep -struct mark.dickinson -subprocess astrand (inactive) -sunau -symbol -symtable 
benjamin.peterson -sys -sysconfig tarek -syslog jafo -tabnanny tim_one -tarfile lars.gustaebel -telnetlib -tempfile georg.brandl -termios -test -textwrap georg.brandl -threading pitrou -time alexander.belopolsky -timeit georg.brandl -tkinter gpolo -token georg.brandl -tokenize -trace alexander.belopolsky -traceback georg.brandl* -tty -turtle gregorlingl -types -unicodedata loewis, lemburg, ezio.melotti -unittest michael.foord -urllib orsenthil -uu -uuid -warnings brett.cannon -wave -weakref fdrake, pitrou -webbrowser georg.brandl -winreg brian.curtin*, stutzbach -winsound effbot (inactive) -wsgiref pje -xdrlib -xml.dom -xml.dom.minidom -xml.dom.pulldom -xml.etree effbot (inactive) -xml.parsers.expat -xml.sax -xml.sax.handler -xml.sax.saxutils -xml.sax.xmlreader -xmlrpc loewis -zipfile alanmcintyre -zipimport -zlib -================== =========== - - -================== =========== -Tool Maintainers ------------------- ----------- -pybench lemburg -================== =========== - - -================== =========== -Platform Maintainers ------------------- ----------- -AIX -Cygwin jlt63, stutzbach -FreeBSD -HP-UX -Linux -Mac ronaldoussoren -NetBSD1 -OS2/EMX aimacintyre -Solaris -Windows tim.golden, brian.curtin -================== =========== - - -================== =========== -Interest Area Maintainers ------------------- ----------- -algorithms -ast/compiler ncoghlan, benjamin.peterson, brett.cannon, georg.brandl -autoconf/makefiles -bsd -bug tracker ezio.melotti -buildbots -bytecode pitrou -data formats mark.dickinson, georg.brandl -database lemburg -documentation georg.brandl, ezio.melotti -GUI -i18n lemburg -import machinery brett.cannon, ncoghlan -io pitrou, benjamin.peterson, stutzbach -locale lemburg, loewis -mathematics mark.dickinson, eric.smith, lemburg, stutzbach -memory management tim_one, lemburg -networking giampaolo.rodola -packaging tarek, lemburg -py3 transition benjamin.peterson -release management tarek, lemburg, benjamin.peterson, barry, loewis, - gvanrossum, anthonybaxter -str.format eric.smith -testing michael.foord, pitrou, giampaolo.rodola -threads pitrou -time and dates lemburg -unicode lemburg, ezio.melotti, haypo -version control -================== =========== Deleted: python/branches/pep-3151/Misc/pymemcompat.h ============================================================================== --- python/branches/pep-3151/Misc/pymemcompat.h Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,85 +0,0 @@ -/* The idea of this file is that you bundle it with your extension, - #include it, program to Python 2.3's memory API and have your - extension build with any version of Python from 1.5.2 through to - 2.3 (and hopefully beyond). */ - -#ifndef Py_PYMEMCOMPAT_H -#define Py_PYMEMCOMPAT_H - -#include "Python.h" - -/* There are three "families" of memory API: the "raw memory", "object - memory" and "object" families. (This is ignoring the matter of the - cycle collector, about which more is said below). - - Raw Memory: - - PyMem_Malloc, PyMem_Realloc, PyMem_Free - - Object Memory: - - PyObject_Malloc, PyObject_Realloc, PyObject_Free - - Object: - - PyObject_New, PyObject_NewVar, PyObject_Del - - The raw memory and object memory allocators both mimic the - malloc/realloc/free interface from ANSI C, but the object memory - allocator can (and, since 2.3, does by default) use a different - allocation strategy biased towards lots of "small" allocations. 
- - The object family is used for allocating Python objects, and the - initializers take care of some basic initialization (setting the - refcount to 1 and filling out the ob_type field) as well as having - a somewhat different interface. - - Do not mix the families! E.g. do not allocate memory with - PyMem_Malloc and free it with PyObject_Free. You may get away with - it quite a lot of the time, but there *are* scenarios where this - will break. You Have Been Warned. - - Also, in many versions of Python there are an insane amount of - memory interfaces to choose from. Use the ones described above. */ - -#if PY_VERSION_HEX < 0x01060000 -/* raw memory interface already present */ - -/* there is no object memory interface in 1.5.2 */ -#define PyObject_Malloc PyMem_Malloc -#define PyObject_Realloc PyMem_Realloc -#define PyObject_Free PyMem_Free - -/* the object interface is there, but the names have changed */ -#define PyObject_New PyObject_NEW -#define PyObject_NewVar PyObject_NEW_VAR -#define PyObject_Del PyMem_Free -#endif - -/* If your object is a container you probably want to support the - cycle collector, which was new in Python 2.0. - - Unfortunately, the interface to the collector that was present in - Python 2.0 and 2.1 proved to be tricky to use, and so changed in - 2.2 -- in a way that can't easily be papered over with macros. - - This file contains macros that let you program to the 2.2 GC API. - Your module will compile against any Python since version 1.5.2, - but the type will only participate in the GC in versions 2.2 and - up. Some work is still necessary on your part to only fill out the - tp_traverse and tp_clear fields when they exist and set tp_flags - appropriately. - - It is possible to support both the 2.0 and 2.2 GC APIs, but it's - not pretty and this comment block is too narrow to contain a - description of what's required... */ - -#if PY_VERSION_HEX < 0x020200B1 -#define PyObject_GC_New PyObject_New -#define PyObject_GC_NewVar PyObject_NewVar -#define PyObject_GC_Del PyObject_Del -#define PyObject_GC_Track(op) -#define PyObject_GC_UnTrack(op) -#endif - -#endif /* !Py_PYMEMCOMPAT_H */ Modified: python/branches/pep-3151/Misc/python-wing4.wpr ============================================================================== --- python/branches/pep-3151/Misc/python-wing4.wpr (original) +++ python/branches/pep-3151/Misc/python-wing4.wpr Sat Feb 26 08:16:32 2011 @@ -5,7 +5,11 @@ ################################################################## [project attributes] proj.directory-list = [{'dirloc': loc('..'), - 'excludes': [u'Lib/__pycache__'], + 'excludes': [u'Lib/unittest/test/__pycache__', + u'Lib/__pycache__', + u'Doc/build', + u'Lib/unittest/__pycache__', + u'build'], 'filter': '*', 'include_hidden': False, 'recursive': True, Modified: python/branches/pep-3151/Misc/python.man ============================================================================== --- python/branches/pep-3151/Misc/python.man (original) +++ python/branches/pep-3151/Misc/python.man Sat Feb 26 08:16:32 2011 @@ -26,6 +26,9 @@ .B \-m .I module-name ] +[ +.B \-q +] .br [ .B \-O @@ -145,6 +148,10 @@ .B \-O0 Discard docstrings in addition to the \fB-O\fP optimizations. .TP +.B \-q +Do not print the version and copyright messages. These messages are +also suppressed in non-interactive mode. +.TP .BI "\-Q " argument Division control; see PEP 238. 
The argument must be one of "old" (the default, int/int and long/long return an int or long), "new" (new Modified: python/branches/pep-3151/Misc/python.pc.in ============================================================================== --- python/branches/pep-3151/Misc/python.pc.in (original) +++ python/branches/pep-3151/Misc/python.pc.in Sat Feb 26 08:16:32 2011 @@ -1,3 +1,4 @@ +# See: man pkg-config prefix=@prefix@ exec_prefix=@exec_prefix@ libdir=@libdir@ @@ -9,5 +10,4 @@ Version: @VERSION@ Libs.private: @LIBS@ Libs: -L${libdir} -lpython at VERSION@@ABIFLAGS@ -Cflags: -I${includedir}/python at VERSION@ - +Cflags: -I${includedir}/python at VERSION@@ABIFLAGS@ Deleted: python/branches/pep-3151/Misc/setuid-prog.c ============================================================================== --- python/branches/pep-3151/Misc/setuid-prog.c Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,176 +0,0 @@ -/* - Template for a setuid program that calls a script. - - The script should be in an unwritable directory and should itself - be unwritable. In fact all parent directories up to the root - should be unwritable. The script must not be setuid, that's what - this program is for. - - This is a template program. You need to fill in the name of the - script that must be executed. This is done by changing the - definition of FULL_PATH below. - - There are also some rules that should be adhered to when writing - the script itself. - - The first and most important rule is to never, ever trust that the - user of the program will behave properly. Program defensively. - Check your arguments for reasonableness. If the user is allowed to - create files, check the names of the files. If the program depends - on argv[0] for the action it should perform, check it. - - Assuming the script is a Bourne shell script, the first line of the - script should be - #!/bin/sh - - The - is important, don't omit it. If you're using esh, the first - line should be - #!/usr/local/bin/esh -f - and for ksh, the first line should be - #!/usr/local/bin/ksh -p - The script should then set the variable IFS to the string - consisting of , , and . After this (*not* - before!), the PATH variable should be set to a reasonable value and - exported. Do not expect the PATH to have a reasonable value, so do - not trust the old value of PATH. You should then set the umask of - the program by calling - umask 077 # or 022 if you want the files to be readable - If you plan to change directories, you should either unset CDPATH - or set it to a good value. Setting CDPATH to just ``.'' (dot) is a - good idea. - If, for some reason, you want to use csh, the first line should be - #!/bin/csh -fb - You should then set the path variable to something reasonable, - without trusting the inherited path. Here too, you should set the - umask using the command - umask 077 # or 022 if you want the files to be readable -*/ - -#include -#include -#include -#include -#include -#include - -/* CONFIGURATION SECTION */ - -#ifndef FULL_PATH /* so that this can be specified from the Makefile */ -/* Uncomment the following line: -#define FULL_PATH "/full/path/of/script" -* Then comment out the #error line. 
*/ -#error "You must define FULL_PATH somewhere" -#endif -#ifndef UMASK -#define UMASK 077 -#endif - -/* END OF CONFIGURATION SECTION */ - -#if defined(__STDC__) && defined(__sgi) -#define environ _environ -#endif - -/* don't change def_IFS */ -char def_IFS[] = "IFS= \t\n"; -/* you may want to change def_PATH, but you should really change it in */ -/* your script */ -#ifdef __sgi -char def_PATH[] = "PATH=/usr/bsd:/usr/bin:/bin:/usr/local/bin:/usr/sbin"; -#else -char def_PATH[] = "PATH=/usr/ucb:/usr/bin:/bin:/usr/local/bin"; -#endif -/* don't change def_CDPATH */ -char def_CDPATH[] = "CDPATH=."; -/* don't change def_ENV */ -char def_ENV[] = "ENV=:"; - -/* - This function changes all environment variables that start with LD_ - into variables that start with XD_. This is important since we - don't want the script that is executed to use any funny shared - libraries. - - The other changes to the environment are, strictly speaking, not - needed here. They can safely be done in the script. They are done - here because we don't trust the script writer (just like the script - writer shouldn't trust the user of the script). - If IFS is set in the environment, set it to space,tab,newline. - If CDPATH is set in the environment, set it to ``.''. - Set PATH to a reasonable default. -*/ -void -clean_environ(void) -{ - char **p; - extern char **environ; - - for (p = environ; *p; p++) { - if (strncmp(*p, "LD_", 3) == 0) - **p = 'X'; - else if (strncmp(*p, "_RLD", 4) == 0) - **p = 'X'; - else if (strncmp(*p, "PYTHON", 6) == 0) - **p = 'X'; - else if (strncmp(*p, "IFS=", 4) == 0) - *p = def_IFS; - else if (strncmp(*p, "CDPATH=", 7) == 0) - *p = def_CDPATH; - else if (strncmp(*p, "ENV=", 4) == 0) - *p = def_ENV; - } - putenv(def_PATH); -} - -int -main(int argc, char **argv) -{ - struct stat statb; - gid_t egid = getegid(); - uid_t euid = geteuid(); - - /* - Sanity check #1. - This check should be made compile-time, but that's not possible. - If you're sure that you specified a full path name for FULL_PATH, - you can omit this check. - */ - if (FULL_PATH[0] != '/') { - fprintf(stderr, "%s: %s is not a full path name\n", argv[0], - FULL_PATH); - fprintf(stderr, "You can only use this wrapper if you\n"); - fprintf(stderr, "compile it with an absolute path.\n"); - exit(1); - } - - /* - Sanity check #2. - Check that the owner of the script is equal to either the - effective uid or the super user. - */ - if (stat(FULL_PATH, &statb) < 0) { - perror("stat"); - exit(1); - } - if (statb.st_uid != 0 && statb.st_uid != euid) { - fprintf(stderr, "%s: %s has the wrong owner\n", argv[0], - FULL_PATH); - fprintf(stderr, "The script should be owned by root,\n"); - fprintf(stderr, "and shouldn't be writable by anyone.\n"); - exit(1); - } - - if (setregid(egid, egid) < 0) - perror("setregid"); - if (setreuid(euid, euid) < 0) - perror("setreuid"); - - clean_environ(); - - umask(UMASK); - - while (**argv == '-') /* don't let argv[0] start with '-' */ - (*argv)++; - execv(FULL_PATH, argv); - fprintf(stderr, "%s: could not execute the script\n", argv[0]); - exit(1); -} Modified: python/branches/pep-3151/Modules/Setup.dist ============================================================================== --- python/branches/pep-3151/Modules/Setup.dist (original) +++ python/branches/pep-3151/Modules/Setup.dist Sat Feb 26 08:16:32 2011 @@ -207,7 +207,7 @@ # # First, look at Setup.config; configure may have set this for you. 
-#crypt cryptmodule.c # -lcrypt # crypt(3); needs -lcrypt on some systems +#_crypt _cryptmodule.c # -lcrypt # crypt(3); needs -lcrypt on some systems # Some more UNIX dependent modules -- off by default, since these Modified: python/branches/pep-3151/Modules/_collectionsmodule.c ============================================================================== --- python/branches/pep-3151/Modules/_collectionsmodule.c (original) +++ python/branches/pep-3151/Modules/_collectionsmodule.c Sat Feb 26 08:16:32 2011 @@ -485,7 +485,8 @@ /* Advance left block/index pair */ leftindex++; if (leftindex == BLOCKLEN) { - assert (leftblock->rightlink != NULL); + if (leftblock->rightlink == NULL) + break; leftblock = leftblock->rightlink; leftindex = 0; } @@ -493,7 +494,8 @@ /* Step backwards with the right block/index pair */ rightindex--; if (rightindex == -1) { - assert (rightblock->leftlink != NULL); + if (rightblock->leftlink == NULL) + break; rightblock = rightblock->leftlink; rightindex = BLOCKLEN - 1; } @@ -509,7 +511,7 @@ { block *leftblock = deque->leftblock; Py_ssize_t leftindex = deque->leftindex; - Py_ssize_t n = (deque->len); + Py_ssize_t n = deque->len; Py_ssize_t i; Py_ssize_t count = 0; PyObject *item; @@ -533,7 +535,8 @@ /* Advance left block/index pair */ leftindex++; if (leftindex == BLOCKLEN) { - assert (leftblock->rightlink != NULL); + if (leftblock->rightlink == NULL) /* can occur when i==n-1 */ + break; leftblock = leftblock->rightlink; leftindex = 0; } @@ -1518,6 +1521,95 @@ PyObject_GC_Del, /* tp_free */ }; +/* helper function for Counter *********************************************/ + +PyDoc_STRVAR(_count_elements_doc, +"_count_elements(mapping, iterable) -> None\n\ +\n\ +Count elements in the iterable, updating the mappping"); + +static PyObject * +_count_elements(PyObject *self, PyObject *args) +{ + PyObject *it, *iterable, *mapping, *oldval; + PyObject *newval = NULL; + PyObject *key = NULL; + PyObject *one = NULL; + + if (!PyArg_UnpackTuple(args, "_count_elements", 2, 2, &mapping, &iterable)) + return NULL; + + it = PyObject_GetIter(iterable); + if (it == NULL) + return NULL; + + one = PyLong_FromLong(1); + if (one == NULL) { + Py_DECREF(it); + return NULL; + } + + if (PyDict_CheckExact(mapping)) { + while (1) { + key = PyIter_Next(it); + if (key == NULL) { + if (PyErr_Occurred() && PyErr_ExceptionMatches(PyExc_StopIteration)) + PyErr_Clear(); + else + break; + } + oldval = PyDict_GetItem(mapping, key); + if (oldval == NULL) { + if (PyDict_SetItem(mapping, key, one) == -1) + break; + } else { + newval = PyNumber_Add(oldval, one); + if (newval == NULL) + break; + if (PyDict_SetItem(mapping, key, newval) == -1) + break; + Py_CLEAR(newval); + } + Py_DECREF(key); + } + } else { + while (1) { + key = PyIter_Next(it); + if (key == NULL) { + if (PyErr_Occurred() && PyErr_ExceptionMatches(PyExc_StopIteration)) + PyErr_Clear(); + else + break; + } + oldval = PyObject_GetItem(mapping, key); + if (oldval == NULL) { + if (!PyErr_Occurred() || !PyErr_ExceptionMatches(PyExc_KeyError)) + break; + PyErr_Clear(); + Py_INCREF(one); + newval = one; + } else { + newval = PyNumber_Add(oldval, one); + Py_DECREF(oldval); + if (newval == NULL) + break; + } + if (PyObject_SetItem(mapping, key, newval) == -1) + break; + Py_CLEAR(newval); + Py_DECREF(key); + } + } + + Py_DECREF(it); + Py_XDECREF(key); + Py_XDECREF(newval); + Py_DECREF(one); + if (PyErr_Occurred()) + return NULL; + Py_RETURN_NONE; +} + /* module level code ********************************************************/ PyDoc_STRVAR(module_doc, 
@@ -1526,13 +1618,17 @@ - defaultdict: dict subclass with a default value factory\n\ "); +static struct PyMethodDef module_functions[] = { + {"_count_elements", _count_elements, METH_VARARGS, _count_elements_doc}, + {NULL, NULL} /* sentinel */ +}; static struct PyModuleDef _collectionsmodule = { PyModuleDef_HEAD_INIT, "_collections", module_doc, -1, - NULL, + module_functions, NULL, NULL, NULL, Modified: python/branches/pep-3151/Modules/_ctypes/_ctypes.c ============================================================================== --- python/branches/pep-3151/Modules/_ctypes/_ctypes.c (original) +++ python/branches/pep-3151/Modules/_ctypes/_ctypes.c Sat Feb 26 08:16:32 2011 @@ -1155,7 +1155,7 @@ result = -1; goto done; } - result = PyUnicode_AsWideChar((PyUnicodeObject *)value, + result = PyUnicode_AsWideChar(value, (wchar_t *)self->b_ptr, self->b_size/sizeof(wchar_t)); if (result >= 0 && (size_t)result < self->b_size/sizeof(wchar_t)) @@ -3925,14 +3925,14 @@ Returns -1 on error, or the index of next argument on success. */ -static int +static Py_ssize_t _init_pos_args(PyObject *self, PyTypeObject *type, PyObject *args, PyObject *kwds, - int index) + Py_ssize_t index) { StgDictObject *dict; PyObject *fields; - int i; + Py_ssize_t i; if (PyType_stgdict((PyObject *)type->tp_base)) { index = _init_pos_args(self, type->tp_base, @@ -4174,7 +4174,7 @@ PyObject *np; Py_ssize_t start, stop, step, slicelen, cur, i; - if (PySlice_GetIndicesEx((PySliceObject *)item, + if (PySlice_GetIndicesEx(item, self->b_length, &start, &stop, &step, &slicelen) < 0) { return NULL; @@ -4308,7 +4308,7 @@ else if (PySlice_Check(item)) { Py_ssize_t start, stop, step, slicelen, otherlen, i, cur; - if (PySlice_GetIndicesEx((PySliceObject *)item, + if (PySlice_GetIndicesEx(item, self->b_length, &start, &stop, &step, &slicelen) < 0) { return -1; Modified: python/branches/pep-3151/Modules/_ctypes/_ctypes_test.c ============================================================================== --- python/branches/pep-3151/Modules/_ctypes/_ctypes_test.c (original) +++ python/branches/pep-3151/Modules/_ctypes/_ctypes_test.c Sat Feb 26 08:16:32 2011 @@ -12,6 +12,20 @@ /* some functions handy for testing */ +EXPORT(int) +_testfunc_cbk_reg_int(int a, int b, int c, int d, int e, + int (*func)(int, int, int, int, int)) +{ + return func(a*a, b*b, c*c, d*d, e*e); +} + +EXPORT(double) +_testfunc_cbk_reg_double(double a, double b, double c, double d, double e, + double (*func)(double, double, double, double, double)) +{ + return func(a*a, b*b, c*c, d*d, e*e); +} + EXPORT(void)testfunc_array(int values[4]) { printf("testfunc_array %d %d %d %d\n", Modified: python/branches/pep-3151/Modules/_ctypes/cfield.c ============================================================================== --- python/branches/pep-3151/Modules/_ctypes/cfield.c (original) +++ python/branches/pep-3151/Modules/_ctypes/cfield.c Sat Feb 26 08:16:32 2011 @@ -52,7 +52,7 @@ { CFieldObject *self; PyObject *proto; - Py_ssize_t size, align, length; + Py_ssize_t size, align; SETFUNC setfunc = NULL; GETFUNC getfunc = NULL; StgDictObject *dict; @@ -106,7 +106,6 @@ } size = dict->size; - length = dict->length; proto = desc; /* Field descriptors for 'c_char * n' are be scpecial cased to @@ -1214,7 +1213,7 @@ } else Py_INCREF(value); - len = PyUnicode_AsWideChar((PyUnicodeObject *)value, chars, 2); + len = PyUnicode_AsWideChar(value, chars, 2); if (len != 1) { Py_DECREF(value); PyErr_SetString(PyExc_TypeError, @@ -1292,7 +1291,7 @@ } else if (size < length-1) /* copy terminating 
NUL character if there is space */ size += 1; - PyUnicode_AsWideChar((PyUnicodeObject *)value, (wchar_t *)ptr, size); + PyUnicode_AsWideChar(value, (wchar_t *)ptr, size); return value; } Modified: python/branches/pep-3151/Modules/_ctypes/libffi_msvc/ffi.c ============================================================================== --- python/branches/pep-3151/Modules/_ctypes/libffi_msvc/ffi.c (original) +++ python/branches/pep-3151/Modules/_ctypes/libffi_msvc/ffi.c Sat Feb 26 08:16:32 2011 @@ -380,7 +380,7 @@ short bytes; char *tramp; #ifdef _WIN64 - int mask; + int mask = 0; #endif FFI_ASSERT (cif->abi == FFI_SYSV); Modified: python/branches/pep-3151/Modules/_datetimemodule.c ============================================================================== --- python/branches/pep-3151/Modules/_datetimemodule.c (original) +++ python/branches/pep-3151/Modules/_datetimemodule.c Sat Feb 26 08:16:32 2011 @@ -3,7 +3,6 @@ */ #include "Python.h" -#include "modsupport.h" #include "structmember.h" #include @@ -1167,10 +1166,10 @@ if (!pin) return NULL; - /* Give up if the year is before 1900. + /* Give up if the year is before 1000. * Python strftime() plays games with the year, and different * games depending on whether envar PYTHON2K is set. This makes - * years before 1900 a nightmare, even if the platform strftime + * years before 1000 a nightmare, even if the platform strftime * supports them (and not all do). * We could get a lot farther here by avoiding Python's strftime * wrapper and calling the C strftime() directly, but that isn't @@ -1183,10 +1182,10 @@ assert(PyLong_Check(pyyear)); year = PyLong_AsLong(pyyear); Py_DECREF(pyyear); - if (year < 1900) { + if (year < 1000) { PyErr_Format(PyExc_ValueError, "year=%ld is before " - "1900; the datetime strftime() " - "methods require year >= 1900", + "1000; the datetime strftime() " + "methods require year >= 1000", year); return NULL; } @@ -1258,7 +1257,8 @@ assert(PyUnicode_Check(Zreplacement)); ptoappend = _PyUnicode_AsStringAndSize(Zreplacement, &ntoappend); - ntoappend = Py_SIZE(Zreplacement); + if (ptoappend == NULL) + goto Done; } else if (ch == 'f') { /* format microseconds */ @@ -1461,7 +1461,7 @@ goto Done; Py_DECREF(x1); Py_DECREF(x2); - x1 = x2 = NULL; + /* x1 = */ x2 = NULL; /* x3 has days+seconds in seconds */ x1 = PyNumber_Multiply(x3, us_per_second); /* us */ @@ -3663,7 +3663,7 @@ /* Python's strftime does insane things with the year part of the * timetuple. The year is forced to (the otherwise nonsensical) - * 1900 to worm around that. + * 1900 to work around that. 
*/ tuple = Py_BuildValue("iiiiiiiii", 1900, 1, 1, /* year, month, day */ Modified: python/branches/pep-3151/Modules/_elementtree.c ============================================================================== --- python/branches/pep-3151/Modules/_elementtree.c (original) +++ python/branches/pep-3151/Modules/_elementtree.c Sat Feb 26 08:16:32 2011 @@ -1272,7 +1272,7 @@ if (!self->extra) return PyList_New(0); - if (PySlice_GetIndicesEx((PySliceObject *)item, + if (PySlice_GetIndicesEx(item, self->extra->length, &start, &stop, &step, &slicelen) < 0) { return NULL; @@ -1331,7 +1331,7 @@ if (!self->extra) element_new_extra(self, NULL); - if (PySlice_GetIndicesEx((PySliceObject *)item, + if (PySlice_GetIndicesEx(item, self->extra->length, &start, &stop, &step, &slicelen) < 0) { return -1; @@ -1483,6 +1483,9 @@ if (PyUnicode_Check(nameobj)) name = _PyUnicode_AsString(nameobj); + + if (name == NULL) + return NULL; /* handle common attributes first */ if (strcmp(name, "tag") == 0) { @@ -2139,7 +2142,7 @@ PyObject *position; char buffer[256]; - sprintf(buffer, "%s: line %d, column %d", message, line, column); + sprintf(buffer, "%.100s: line %d, column %d", message, line, column); error = PyObject_CallFunction(elementtree_parseerror_obj, "s", buffer); if (!error) @@ -2194,8 +2197,8 @@ Py_XDECREF(res); } else if (!PyErr_Occurred()) { /* Report the first error, not the last */ - char message[128]; - sprintf(message, "undefined entity &%.100s;", _PyUnicode_AsString(key)); + char message[128] = "undefined entity "; + strncat(message, data_in, data_len < 100?data_len:100); expat_set_error( message, EXPAT(GetErrorLineNumber)(self->parser), @@ -2796,29 +2799,25 @@ static PyObject* xmlparser_getattro(XMLParserObject* self, PyObject* nameobj) { - PyObject* res; - char *name = ""; - - if (PyUnicode_Check(nameobj)) - name = _PyUnicode_AsString(nameobj); - - PyErr_Clear(); - - if (strcmp(name, "entity") == 0) - res = self->entity; - else if (strcmp(name, "target") == 0) - res = self->target; - else if (strcmp(name, "version") == 0) { - char buffer[100]; - sprintf(buffer, "Expat %d.%d.%d", XML_MAJOR_VERSION, + if (PyUnicode_Check(nameobj)) { + PyObject* res; + if (PyUnicode_CompareWithASCIIString(nameobj, "entity") == 0) + res = self->entity; + else if (PyUnicode_CompareWithASCIIString(nameobj, "target") == 0) + res = self->target; + else if (PyUnicode_CompareWithASCIIString(nameobj, "version") == 0) { + return PyUnicode_FromFormat( + "Expat %d.%d.%d", XML_MAJOR_VERSION, XML_MINOR_VERSION, XML_MICRO_VERSION); - return PyUnicode_DecodeUTF8(buffer, strlen(buffer), "strict"); - } else { - return PyObject_GenericGetAttr((PyObject*) self, nameobj); - } + } + else + goto generic; - Py_INCREF(res); - return res; + Py_INCREF(res); + return res; + } + generic: + return PyObject_GenericGetAttr((PyObject*) self, nameobj); } static PyTypeObject XMLParser_Type = { Modified: python/branches/pep-3151/Modules/_functoolsmodule.c ============================================================================== --- python/branches/pep-3151/Modules/_functoolsmodule.c (original) +++ python/branches/pep-3151/Modules/_functoolsmodule.c Sat Feb 26 08:16:32 2011 @@ -196,6 +196,48 @@ {NULL} /* Sentinel */ }; +static PyObject * +partial_repr(partialobject *pto) +{ + PyObject *result; + PyObject *arglist; + PyObject *tmp; + Py_ssize_t i, n; + + arglist = PyUnicode_FromString(""); + if (arglist == NULL) { + return NULL; + } + /* Pack positional arguments */ + assert (PyTuple_Check(pto->args)); + n = PyTuple_GET_SIZE(pto->args); + for (i = 0; i 
< n; i++) { + tmp = PyUnicode_FromFormat("%U, %R", arglist, + PyTuple_GET_ITEM(pto->args, i)); + Py_DECREF(arglist); + if (tmp == NULL) + return NULL; + arglist = tmp; + } + /* Pack keyword arguments */ + assert (pto->kw == Py_None || PyDict_Check(pto->kw)); + if (pto->kw != Py_None) { + PyObject *key, *value; + for (i = 0; PyDict_Next(pto->kw, &i, &key, &value);) { + tmp = PyUnicode_FromFormat("%U, %U=%R", arglist, + key, value); + Py_DECREF(arglist); + if (tmp == NULL) + return NULL; + arglist = tmp; + } + } + result = PyUnicode_FromFormat("%s(%R%U)", Py_TYPE(pto)->tp_name, + pto->fn, arglist); + Py_DECREF(arglist); + return result; +} + /* Pickle strategy: __reduce__ by itself doesn't support getting kwargs in the unpickle operation so we define a __setstate__ that replaces all the information @@ -254,7 +296,7 @@ 0, /* tp_getattr */ 0, /* tp_setattr */ 0, /* tp_reserved */ - 0, /* tp_repr */ + (reprfunc)partial_repr, /* tp_repr */ 0, /* tp_as_number */ 0, /* tp_as_sequence */ 0, /* tp_as_mapping */ Modified: python/branches/pep-3151/Modules/_gdbmmodule.c ============================================================================== --- python/branches/pep-3151/Modules/_gdbmmodule.c (original) +++ python/branches/pep-3151/Modules/_gdbmmodule.c Sat Feb 26 08:16:32 2011 @@ -135,6 +135,28 @@ return v; } +PyDoc_STRVAR(dbm_get__doc__, +"get(key[, default]) -> value\n\ +Get the value for key, or default if not present; if not given,\n\ +default is None."); + +static PyObject * +dbm_get(dbmobject *dp, PyObject *args) +{ + PyObject *v, *res; + PyObject *def = Py_None; + + if (!PyArg_UnpackTuple(args, "get", 1, 2, &v, &def)) + return NULL; + res = dbm_subscript(dp, v); + if (res == NULL && PyErr_ExceptionMatches(PyExc_KeyError)) { + PyErr_Clear(); + Py_INCREF(def); + return def; + } + return res; +} + static int dbm_ass_sub(dbmobject *dp, PyObject *v, PyObject *w) { @@ -176,6 +198,29 @@ return 0; } +PyDoc_STRVAR(dbm_setdefault__doc__, +"setdefault(key[, default]) -> value\n\ +Get value for key, or set it to default and return default if not present;\n\ +if not given, default is None."); + +static PyObject * +dbm_setdefault(dbmobject *dp, PyObject *args) +{ + PyObject *v, *res; + PyObject *def = Py_None; + + if (!PyArg_UnpackTuple(args, "setdefault", 1, 2, &v, &def)) + return NULL; + res = dbm_subscript(dp, v); + if (res == NULL && PyErr_ExceptionMatches(PyExc_KeyError)) { + PyErr_Clear(); + if (dbm_ass_sub(dp, v, def) < 0) + return NULL; + return dbm_subscript(dp, v); + } + return res; +} + static PyMappingMethods dbm_as_mapping = { (lenfunc)dbm_length, /*mp_length*/ (binaryfunc)dbm_subscript, /*mp_subscript*/ @@ -378,6 +423,8 @@ {"nextkey", (PyCFunction)dbm_nextkey, METH_VARARGS, dbm_nextkey__doc__}, {"reorganize",(PyCFunction)dbm_reorganize,METH_NOARGS, dbm_reorganize__doc__}, {"sync", (PyCFunction)dbm_sync, METH_NOARGS, dbm_sync__doc__}, + {"get", (PyCFunction)dbm_get, METH_VARARGS, dbm_get__doc__}, + {"setdefault",(PyCFunction)dbm_setdefault,METH_VARARGS, dbm_setdefault__doc__}, {NULL, NULL} /* sentinel */ }; Modified: python/branches/pep-3151/Modules/_io/bufferedio.c ============================================================================== --- python/branches/pep-3151/Modules/_io/bufferedio.c (original) +++ python/branches/pep-3151/Modules/_io/bufferedio.c Sat Feb 26 08:16:32 2011 @@ -225,6 +225,7 @@ #ifdef WITH_THREAD PyThread_type_lock lock; + volatile long owner; #endif Py_ssize_t buffer_size; @@ -260,17 +261,34 @@ /* These macros protect the buffered object against concurrent 
operations. */ #ifdef WITH_THREAD -#define ENTER_BUFFERED(self) \ - if (!PyThread_acquire_lock(self->lock, 0)) { \ - Py_BEGIN_ALLOW_THREADS \ - PyThread_acquire_lock(self->lock, 1); \ - Py_END_ALLOW_THREADS \ + +static int +_enter_buffered_busy(buffered *self) +{ + if (self->owner == PyThread_get_thread_ident()) { + PyErr_Format(PyExc_RuntimeError, + "reentrant call inside %R", self); + return 0; } + Py_BEGIN_ALLOW_THREADS + PyThread_acquire_lock(self->lock, 1); + Py_END_ALLOW_THREADS + return 1; +} + +#define ENTER_BUFFERED(self) \ + ( (PyThread_acquire_lock(self->lock, 0) ? \ + 1 : _enter_buffered_busy(self)) \ + && (self->owner = PyThread_get_thread_ident(), 1) ) #define LEAVE_BUFFERED(self) \ - PyThread_release_lock(self->lock); + do { \ + self->owner = 0; \ + PyThread_release_lock(self->lock); \ + } while(0); + #else -#define ENTER_BUFFERED(self) +#define ENTER_BUFFERED(self) 1 #define LEAVE_BUFFERED(self) #endif @@ -387,7 +405,7 @@ /* Because this can call arbitrary code, it shouldn't be called when the refcount is 0 (that is, not directly from tp_dealloc unless the refcount has been temporarily re-incremented). */ -PyObject * +static PyObject * buffered_dealloc_warn(buffered *self, PyObject *source) { if (self->ok && self->raw) { @@ -444,7 +462,8 @@ int r; CHECK_INITIALIZED(self) - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; r = buffered_closed(self); if (r < 0) @@ -465,7 +484,8 @@ /* flush() will most probably re-take the lock, so drop it first */ LEAVE_BUFFERED(self) res = PyObject_CallMethodObjArgs((PyObject *)self, _PyIO_str_flush, NULL); - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; if (res == NULL) { goto end; } @@ -679,6 +699,7 @@ PyErr_SetString(PyExc_RuntimeError, "can't allocate read lock"); return -1; } + self->owner = 0; #endif /* Find out whether buffer_size is a power of 2 */ /* XXX is this optimization useful? */ @@ -693,6 +714,39 @@ return 0; } +/* Return 1 if an EnvironmentError with errno == EINTR is set (and then + clears the error indicator), 0 otherwise. + Should only be called when PyErr_Occurred() is true. 
+*/ +static int +_trap_eintr(void) +{ + static PyObject *eintr_int = NULL; + PyObject *typ, *val, *tb; + PyEnvironmentErrorObject *env_err; + + if (eintr_int == NULL) { + eintr_int = PyLong_FromLong(EINTR); + assert(eintr_int != NULL); + } + if (!PyErr_ExceptionMatches(PyExc_EnvironmentError)) + return 0; + PyErr_Fetch(&typ, &val, &tb); + PyErr_NormalizeException(&typ, &val, &tb); + env_err = (PyEnvironmentErrorObject *) val; + assert(env_err != NULL); + if (env_err->myerrno != NULL && + PyObject_RichCompareBool(env_err->myerrno, eintr_int, Py_EQ) > 0) { + Py_DECREF(typ); + Py_DECREF(val); + Py_XDECREF(tb); + return 1; + } + /* This silences any error set by PyObject_RichCompareBool() */ + PyErr_Restore(typ, val, tb); + return 0; +} + /* * Shared methods and wrappers */ @@ -705,7 +759,8 @@ CHECK_INITIALIZED(self) CHECK_CLOSED(self, "flush of closed file") - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; res = _bufferedwriter_flush_unlocked(self, 0); if (res != NULL && self->readable) { /* Rewind the raw stream so that its position corresponds to @@ -732,7 +787,8 @@ return NULL; } - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; if (self->writable) { res = _bufferedwriter_flush_unlocked(self, 1); @@ -767,7 +823,8 @@ if (n == -1) { /* The number of bytes is unspecified, read until the end of stream */ - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; res = _bufferedreader_read_all(self); LEAVE_BUFFERED(self) } @@ -775,7 +832,8 @@ res = _bufferedreader_read_fast(self, n); if (res == Py_None) { Py_DECREF(res); - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; res = _bufferedreader_read_generic(self, n); LEAVE_BUFFERED(self) } @@ -803,7 +861,8 @@ if (n == 0) return PyBytes_FromStringAndSize(NULL, 0); - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; if (self->writable) { res = _bufferedwriter_flush_unlocked(self, 1); @@ -859,7 +918,8 @@ /* TODO: use raw.readinto() instead! */ if (self->writable) { - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; res = _bufferedwriter_flush_unlocked(self, 0); LEAVE_BUFFERED(self) if (res == NULL) @@ -903,7 +963,8 @@ goto end_unlocked; } - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + goto end_unlocked; /* Now we try to get some more from the raw stream */ if (self->writable) { @@ -1053,7 +1114,8 @@ } } - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; /* Fallback: invoke raw seek() method and clear buffer */ if (self->writable) { @@ -1091,7 +1153,8 @@ return NULL; } - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) + return NULL; if (self->writable) { res = _bufferedwriter_flush_unlocked(self, 0); @@ -1239,7 +1302,14 @@ memobj = PyMemoryView_FromBuffer(&buf); if (memobj == NULL) return -1; - res = PyObject_CallMethodObjArgs(self->raw, _PyIO_str_readinto, memobj, NULL); + /* NOTE: PyErr_SetFromErrno() calls PyErr_CheckSignals() when EINTR + occurs so we needn't do it ourselves. + We then retry reading, ignoring the signal if no handler has + raised (see issue #10956). 
+ */ + do { + res = PyObject_CallMethodObjArgs(self->raw, _PyIO_str_readinto, memobj, NULL); + } while (res == NULL && _trap_eintr()); Py_DECREF(memobj); if (res == NULL) return -1; @@ -1511,7 +1581,7 @@ }; static PyMemberDef bufferedreader_members[] = { - {"raw", T_OBJECT, offsetof(buffered, raw), 0}, + {"raw", T_OBJECT, offsetof(buffered, raw), READONLY}, {NULL} }; @@ -1648,7 +1718,14 @@ memobj = PyMemoryView_FromBuffer(&buf); if (memobj == NULL) return -1; - res = PyObject_CallMethodObjArgs(self->raw, _PyIO_str_write, memobj, NULL); + /* NOTE: PyErr_SetFromErrno() calls PyErr_CheckSignals() when EINTR + occurs so we needn't do it ourselves. + We then retry writing, ignoring the signal if no handler has + raised (see issue #10956). + */ + do { + res = PyObject_CallMethodObjArgs(self->raw, _PyIO_str_write, memobj, NULL); + } while (res == NULL && _trap_eintr()); Py_DECREF(memobj); if (res == NULL) return -1; @@ -1748,7 +1825,10 @@ return NULL; } - ENTER_BUFFERED(self) + if (!ENTER_BUFFERED(self)) { + PyBuffer_Release(&buf); + return NULL; + } /* Fast path: the data to write can be fully buffered. */ if (!VALID_READ_BUFFER(self) && !VALID_WRITE_BUFFER(self)) { @@ -1893,7 +1973,7 @@ }; static PyMemberDef bufferedwriter_members[] = { - {"raw", T_OBJECT, offsetof(buffered, raw), 0}, + {"raw", T_OBJECT, offsetof(buffered, raw), READONLY}, {NULL} }; @@ -2287,7 +2367,7 @@ }; static PyMemberDef bufferedrandom_members[] = { - {"raw", T_OBJECT, offsetof(buffered, raw), 0}, + {"raw", T_OBJECT, offsetof(buffered, raw), READONLY}, {NULL} }; Modified: python/branches/pep-3151/Modules/_io/bytesio.c ============================================================================== --- python/branches/pep-3151/Modules/_io/bytesio.c (original) +++ python/branches/pep-3151/Modules/_io/bytesio.c Sat Feb 26 08:16:32 2011 @@ -430,15 +430,20 @@ bytesio_readinto(bytesio *self, PyObject *buffer) { void *raw_buffer; - Py_ssize_t len; + Py_ssize_t len, n; CHECK_CLOSED(self); if (PyObject_AsWriteBuffer(buffer, &raw_buffer, &len) == -1) return NULL; - if (self->pos + len > self->string_size) - len = self->string_size - self->pos; + /* adjust invalid sizes */ + n = self->string_size - self->pos; + if (len > n) { + len = n; + if (len < 0) + len = 0; + } memcpy(raw_buffer, self->buf + self->pos, len); assert(self->pos + len < PY_SSIZE_T_MAX); @@ -933,13 +938,11 @@ bytesiobuf_getbuffer(bytesiobuf *obj, Py_buffer *view, int flags) { int ret; - void *ptr; bytesio *b = (bytesio *) obj->source; if (view == NULL) { b->exports++; return 0; } - ptr = (void *) obj; ret = PyBuffer_FillInfo(view, (PyObject*)obj, b->buf, b->string_size, 0, flags); if (ret >= 0) { Modified: python/branches/pep-3151/Modules/_io/fileio.c ============================================================================== --- python/branches/pep-3151/Modules/_io/fileio.c (original) +++ python/branches/pep-3151/Modules/_io/fileio.c Sat Feb 26 08:16:32 2011 @@ -388,6 +388,11 @@ goto error; } +#if defined(MS_WINDOWS) || defined(__CYGWIN__) + /* don't translate newlines (\r\n <=> \n) */ + _setmode(self->fd, O_BINARY); +#endif + if (PyObject_SetAttrString((PyObject *)self, "name", nameobj) < 0) goto error; @@ -506,7 +511,7 @@ fileio_readinto(fileio *self, PyObject *args) { Py_buffer pbuf; - Py_ssize_t n; + Py_ssize_t n, len; if (self->fd < 0) return err_closed(); @@ -517,9 +522,16 @@ return NULL; if (_PyVerify_fd(self->fd)) { + len = pbuf.len; Py_BEGIN_ALLOW_THREADS errno = 0; - n = read(self->fd, pbuf.buf, pbuf.len); +#if defined(MS_WIN64) || defined(MS_WINDOWS) + 
if (len > INT_MAX) + len = INT_MAX; + n = read(self->fd, pbuf.buf, (int)len); +#else + n = read(self->fd, pbuf.buf, len); +#endif Py_END_ALLOW_THREADS } else n = -1; @@ -685,7 +697,7 @@ fileio_write(fileio *self, PyObject *args) { Py_buffer pbuf; - Py_ssize_t n; + Py_ssize_t n, len; if (self->fd < 0) return err_closed(); @@ -698,7 +710,14 @@ if (_PyVerify_fd(self->fd)) { Py_BEGIN_ALLOW_THREADS errno = 0; - n = write(self->fd, pbuf.buf, pbuf.len); + len = pbuf.len; +#if defined(MS_WIN64) || defined(MS_WINDOWS) + if (len > INT_MAX) + len = INT_MAX; + n = write(self->fd, pbuf.buf, (int)len); +#else + n = write(self->fd, pbuf.buf, len); +#endif Py_END_ALLOW_THREADS } else n = -1; Modified: python/branches/pep-3151/Modules/_io/textio.c ============================================================================== --- python/branches/pep-3151/Modules/_io/textio.c (original) +++ python/branches/pep-3151/Modules/_io/textio.c Sat Feb 26 08:16:32 2011 @@ -678,12 +678,16 @@ PyObject *pending_bytes; /* list of bytes objects waiting to be written, or NULL */ Py_ssize_t pending_bytes_count; - PyObject *snapshot; + /* snapshot is either None, or a tuple (dec_flags, next_input) where * dec_flags is the second (integer) item of the decoder state and * next_input is the chunk of input bytes that comes next after the * snapshot point. We use this to reconstruct decoder states in tell(). */ + PyObject *snapshot; + /* Bytes-to-characters ratio for the current chunk. Serves as input for + the heuristic in tell(). */ + double b2cratio; /* Cache raw object if it's a FileIO object */ PyObject *raw; @@ -850,6 +854,7 @@ self->decoded_chars_used = 0; self->pending_bytes_count = 0; self->encodefunc = NULL; + self->b2cratio = 0.0; if (encoding == NULL) { /* Try os.device_encoding(fileno) */ @@ -1390,6 +1395,7 @@ PyObject *dec_flags = NULL; PyObject *input_chunk = NULL; PyObject *decoded_chars, *chunk_size; + Py_ssize_t nbytes, nchars; int eof; /* The return value is True unless EOF was reached. The decoded string is @@ -1435,7 +1441,8 @@ goto fail; assert(PyBytes_Check(input_chunk)); - eof = (PyBytes_Size(input_chunk) == 0); + nbytes = PyBytes_Size(input_chunk); + eof = (nbytes == 0); if (Py_TYPE(self->decoder) == &PyIncrementalNewlineDecoder_Type) { decoded_chars = _PyIncrementalNewlineDecoder_decode( @@ -1450,7 +1457,12 @@ if (decoded_chars == NULL) goto fail; textiowrapper_set_decoded_chars(self, decoded_chars); - if (PyUnicode_GET_SIZE(decoded_chars) > 0) + nchars = PyUnicode_GET_SIZE(decoded_chars); + if (nchars > 0) + self->b2cratio = (double) nbytes / nchars; + else + self->b2cratio = 0.0; + if (nchars > 0) eof = 0; if (self->telling) { @@ -2139,8 +2151,12 @@ cookie_type cookie = {0,0,0,0,0}; PyObject *next_input; Py_ssize_t chars_to_skip, chars_decoded; + Py_ssize_t skip_bytes, skip_back; PyObject *saved_state = NULL; char *input, *input_end; + char *dec_buffer; + Py_ssize_t dec_buffer_len; + int dec_flags; CHECK_INITIALIZED(self); CHECK_CLOSED(self); @@ -2176,6 +2192,7 @@ #else cookie.start_pos = PyLong_AsLong(posobj); #endif + Py_DECREF(posobj); if (PyErr_Occurred()) goto fail; @@ -2190,57 +2207,99 @@ /* How many decoded characters have been used up since the snapshot? */ if (self->decoded_chars_used == 0) { /* We haven't moved from the snapshot point. */ - Py_DECREF(posobj); return textiowrapper_build_cookie(&cookie); } chars_to_skip = self->decoded_chars_used; - /* Starting from the snapshot position, we will walk the decoder - * forward until it gives us enough decoded characters. 
- */ + /* Decoder state will be restored at the end */ saved_state = PyObject_CallMethodObjArgs(self->decoder, _PyIO_str_getstate, NULL); if (saved_state == NULL) goto fail; - /* Note our initial start point. */ - if (_textiowrapper_decoder_setstate(self, &cookie) < 0) - goto fail; +#define DECODER_GETSTATE() do { \ + PyObject *_state = PyObject_CallMethodObjArgs(self->decoder, \ + _PyIO_str_getstate, NULL); \ + if (_state == NULL) \ + goto fail; \ + if (!PyArg_Parse(_state, "(y#i)", &dec_buffer, &dec_buffer_len, &dec_flags)) { \ + Py_DECREF(_state); \ + goto fail; \ + } \ + Py_DECREF(_state); \ + } while (0) + + /* TODO: replace assert with exception */ +#define DECODER_DECODE(start, len, res) do { \ + PyObject *_decoded = PyObject_CallMethod( \ + self->decoder, "decode", "y#", start, len); \ + if (_decoded == NULL) \ + goto fail; \ + assert (PyUnicode_Check(_decoded)); \ + res = PyUnicode_GET_SIZE(_decoded); \ + Py_DECREF(_decoded); \ + } while (0) - /* Feed the decoder one byte at a time. As we go, note the - * nearest "safe start point" before the current location - * (a point where the decoder has nothing buffered, so seek() + /* Fast search for an acceptable start point, close to our + current pos */ + skip_bytes = (Py_ssize_t) (self->b2cratio * chars_to_skip); + skip_back = 1; + assert(skip_back <= PyBytes_GET_SIZE(next_input)); + input = PyBytes_AS_STRING(next_input); + while (skip_bytes > 0) { + /* Decode up to temptative start point */ + if (_textiowrapper_decoder_setstate(self, &cookie) < 0) + goto fail; + DECODER_DECODE(input, skip_bytes, chars_decoded); + if (chars_decoded <= chars_to_skip) { + DECODER_GETSTATE(); + if (dec_buffer_len == 0) { + /* Before pos and no bytes buffered in decoder => OK */ + cookie.dec_flags = dec_flags; + chars_to_skip -= chars_decoded; + break; + } + /* Skip back by buffered amount and reset heuristic */ + skip_bytes -= dec_buffer_len; + skip_back = 1; + } + else { + /* We're too far ahead, skip back a bit */ + skip_bytes -= skip_back; + skip_back *= 2; + } + } + if (skip_bytes <= 0) { + skip_bytes = 0; + if (_textiowrapper_decoder_setstate(self, &cookie) < 0) + goto fail; + } + + /* Note our initial start point. */ + cookie.start_pos += skip_bytes; + cookie.chars_to_skip = chars_to_skip; + if (chars_to_skip == 0) + goto finally; + + /* We should be close to the desired position. Now feed the decoder one + * byte at a time until we reach the `chars_to_skip` target. + * As we go, note the nearest "safe start point" before the current + * location (a point where the decoder has nothing buffered, so seek() * can safely start from there and advance to this location). 
*/ chars_decoded = 0; input = PyBytes_AS_STRING(next_input); input_end = input + PyBytes_GET_SIZE(next_input); + input += skip_bytes; while (input < input_end) { - PyObject *state; - char *dec_buffer; - Py_ssize_t dec_buffer_len; - int dec_flags; - - PyObject *decoded = PyObject_CallMethod( - self->decoder, "decode", "y#", input, 1); - if (decoded == NULL) - goto fail; - assert (PyUnicode_Check(decoded)); - chars_decoded += PyUnicode_GET_SIZE(decoded); - Py_DECREF(decoded); + Py_ssize_t n; + DECODER_DECODE(input, 1, n); + /* We got n chars for 1 byte */ + chars_decoded += n; cookie.bytes_to_feed += 1; - - state = PyObject_CallMethodObjArgs(self->decoder, - _PyIO_str_getstate, NULL); - if (state == NULL) - goto fail; - if (!PyArg_Parse(state, "(y#i)", &dec_buffer, &dec_buffer_len, &dec_flags)) { - Py_DECREF(state); - goto fail; - } - Py_DECREF(state); + DECODER_GETSTATE(); if (dec_buffer_len == 0 && chars_decoded <= chars_to_skip) { /* Decoder buffer is empty, so this is a safe start point. */ @@ -2272,8 +2331,7 @@ } } - /* finally */ - Py_XDECREF(posobj); +finally: res = PyObject_CallMethod(self->decoder, "setstate", "(O)", saved_state); Py_DECREF(saved_state); if (res == NULL) @@ -2284,8 +2342,7 @@ cookie.chars_to_skip = Py_SAFE_DOWNCAST(chars_to_skip, Py_ssize_t, int); return textiowrapper_build_cookie(&cookie); - fail: - Py_XDECREF(posobj); +fail: if (saved_state) { PyObject *type, *value, *traceback; PyErr_Fetch(&type, &value, &traceback); @@ -2323,25 +2380,52 @@ static PyObject * textiowrapper_repr(textio *self) { - PyObject *nameobj, *res; + PyObject *nameobj, *modeobj, *res, *s; CHECK_INITIALIZED(self); + res = PyUnicode_FromString("<_io.TextIOWrapper"); + if (res == NULL) + return NULL; nameobj = PyObject_GetAttrString((PyObject *) self, "name"); if (nameobj == NULL) { if (PyErr_ExceptionMatches(PyExc_AttributeError)) PyErr_Clear(); else - return NULL; - res = PyUnicode_FromFormat("<_io.TextIOWrapper encoding=%R>", - self->encoding); + goto error; } else { - res = PyUnicode_FromFormat("<_io.TextIOWrapper name=%R encoding=%R>", - nameobj, self->encoding); + s = PyUnicode_FromFormat(" name=%R", nameobj); Py_DECREF(nameobj); + if (s == NULL) + goto error; + PyUnicode_AppendAndDel(&res, s); + if (res == NULL) + return NULL; + } + modeobj = PyObject_GetAttrString((PyObject *) self, "mode"); + if (modeobj == NULL) { + if (PyErr_ExceptionMatches(PyExc_AttributeError)) + PyErr_Clear(); + else + goto error; } - return res; + else { + s = PyUnicode_FromFormat(" mode=%R", modeobj); + Py_DECREF(modeobj); + if (s == NULL) + goto error; + PyUnicode_AppendAndDel(&res, s); + if (res == NULL) + return NULL; + } + s = PyUnicode_FromFormat("%U encoding=%R>", + res, self->encoding); + Py_DECREF(res); + return s; +error: + Py_XDECREF(res); + return NULL; } Modified: python/branches/pep-3151/Modules/_json.c ============================================================================== --- python/branches/pep-3151/Modules/_json.c (original) +++ python/branches/pep-3151/Modules/_json.c Sat Feb 26 08:16:32 2011 @@ -335,7 +335,7 @@ PyObject *rval = NULL; Py_ssize_t len = PyUnicode_GET_SIZE(pystr); Py_ssize_t begin = end - 1; - Py_ssize_t next = begin; + Py_ssize_t next /* = begin */; const Py_UNICODE *buf = PyUnicode_AS_UNICODE(pystr); PyObject *chunks = NULL; PyObject *chunk = NULL; @@ -1532,13 +1532,12 @@ goto bail; Py_CLEAR(ident); } + /* TODO DOES NOT RUN; dead code if (s->indent != Py_None) { - /* TODO: DOES NOT RUN */ indent_level -= 1; - /* - yield '\n' + (' ' * (_indent * _current_indent_level)) - 
*/ - } + + yield '\n' + (' ' * (_indent * _current_indent_level)) + }*/ if (PyList_Append(rval, close_dict)) goto bail; return 0; @@ -1624,13 +1623,13 @@ goto bail; Py_CLEAR(ident); } + + /* TODO: DOES NOT RUN if (s->indent != Py_None) { - /* TODO: DOES NOT RUN */ indent_level -= 1; - /* - yield '\n' + (' ' * (_indent * _current_indent_level)) - */ - } + + yield '\n' + (' ' * (_indent * _current_indent_level)) + }*/ if (PyList_Append(rval, close_array)) goto bail; Py_DECREF(s_fast); Modified: python/branches/pep-3151/Modules/_lsprof.c ============================================================================== --- python/branches/pep-3151/Modules/_lsprof.c (original) +++ python/branches/pep-3151/Modules/_lsprof.c Sat Feb 26 08:16:32 2011 @@ -1,7 +1,5 @@ #include "Python.h" -#include "compile.h" #include "frameobject.h" -#include "structseq.h" #include "rotatingtree.h" #if !defined(HAVE_LONG_LONG) @@ -180,7 +178,16 @@ PyObject *mod = fn->m_module; const char *modname; if (mod && PyUnicode_Check(mod)) { + /* XXX: The following will truncate module names with embedded + * null-characters. It is unlikely that this can happen in + * practice and the concequences are not serious enough to + * introduce extra checks here. + */ modname = _PyUnicode_AsString(mod); + if (modname == NULL) { + modname = ""; + PyErr_Clear(); + } } else if (mod && PyModule_Check(mod)) { modname = PyModule_GetName(mod); Modified: python/branches/pep-3151/Modules/_multiprocessing/multiprocessing.h ============================================================================== --- python/branches/pep-3151/Modules/_multiprocessing/multiprocessing.h (original) +++ python/branches/pep-3151/Modules/_multiprocessing/multiprocessing.h Sat Feb 26 08:16:32 2011 @@ -4,7 +4,7 @@ #define PY_SSIZE_T_CLEAN #ifdef __sun -/* The control message API is only available on Solaris +/* The control message API is only available on Solaris if XPG 4.2 or later is requested. */ #define _XOPEN_SOURCE 500 #endif Modified: python/branches/pep-3151/Modules/_pickle.c ============================================================================== --- python/branches/pep-3151/Modules/_pickle.c (original) +++ python/branches/pep-3151/Modules/_pickle.c Sat Feb 26 08:16:32 2011 @@ -977,11 +977,6 @@ { Py_ssize_t num_read; - if (n == 0) { - *s = NULL; - return 0; - } - if (self->next_read_idx + n <= self->input_len) { *s = self->input_buffer + self->next_read_idx; self->next_read_idx += n; @@ -2244,19 +2239,21 @@ if (len != 0) { /* Materialize the list elements. */ if (PyList_CheckExact(obj) && self->proto > 0) { - if (Py_EnterRecursiveCall(" while pickling an object") == 0) { - status = batch_list_exact(self, obj); - Py_LeaveRecursiveCall(); - } + if (Py_EnterRecursiveCall(" while pickling an object")) + goto error; + status = batch_list_exact(self, obj); + Py_LeaveRecursiveCall(); } else { PyObject *iter = PyObject_GetIter(obj); if (iter == NULL) goto error; - if (Py_EnterRecursiveCall(" while pickling an object") == 0) { - status = batch_list(self, iter); - Py_LeaveRecursiveCall(); + if (Py_EnterRecursiveCall(" while pickling an object")) { + Py_DECREF(iter); + goto error; } + status = batch_list(self, iter); + Py_LeaveRecursiveCall(); Py_DECREF(iter); } } @@ -2504,10 +2501,10 @@ if (PyDict_CheckExact(obj) && self->proto > 0) { /* We can take certain shortcuts if we know this is a dict and not a dict subclass. 
*/ - if (Py_EnterRecursiveCall(" while pickling an object") == 0) { - status = batch_dict_exact(self, obj); - Py_LeaveRecursiveCall(); - } + if (Py_EnterRecursiveCall(" while pickling an object")) + goto error; + status = batch_dict_exact(self, obj); + Py_LeaveRecursiveCall(); } else { items = PyObject_CallMethod(obj, "items", "()"); if (items == NULL) @@ -2516,7 +2513,12 @@ Py_DECREF(items); if (iter == NULL) goto error; + if (Py_EnterRecursiveCall(" while pickling an object")) { + Py_DECREF(iter); + goto error; + } status = batch_dict(self, iter); + Py_LeaveRecursiveCall(); Py_DECREF(iter); } } @@ -3044,7 +3046,7 @@ PyObject *reduce_value = NULL; int status = 0; - if (Py_EnterRecursiveCall(" while pickling an object") < 0) + if (Py_EnterRecursiveCall(" while pickling an object")) return -1; /* The extra pers_save argument is necessary to avoid calling save_pers() Modified: python/branches/pep-3151/Modules/_posixsubprocess.c ============================================================================== --- python/branches/pep-3151/Modules/_posixsubprocess.c (original) +++ python/branches/pep-3151/Modules/_posixsubprocess.c Sat Feb 26 08:16:32 2011 @@ -1,6 +1,10 @@ /* Authors: Gregory P. Smith & Jeffrey Yasskin */ #include "Python.h" +#ifdef HAVE_PIPE2 +#define _GNU_SOURCE +#endif #include +#include #define POSIX_CALL(call) if ((call) == -1) goto error @@ -42,13 +46,14 @@ int errread, int errwrite, int errpipe_read, int errpipe_write, int close_fds, int restore_signals, - int call_setsid, + int call_setsid, Py_ssize_t num_fds_to_keep, + PyObject *py_fds_to_keep, PyObject *preexec_fn, PyObject *preexec_fn_args_tuple) { int i, saved_errno, fd_num; PyObject *result; - const char* err_msg; + const char* err_msg = ""; /* Buffer large enough to hold a hex integer. We can't malloc. */ char hex_errno[sizeof(saved_errno)*2+1]; @@ -64,38 +69,68 @@ } POSIX_CALL(close(errpipe_read)); - /* Dup fds for child. */ - if (p2cread != -1) { + /* Dup fds for child. + dup2() removes the CLOEXEC flag but we must do it ourselves if dup2() + would be a no-op (issue #10806). */ + if (p2cread == 0) { + int old = fcntl(p2cread, F_GETFD); + if (old != -1) + fcntl(p2cread, F_SETFD, old & ~FD_CLOEXEC); + } else if (p2cread != -1) { POSIX_CALL(dup2(p2cread, 0)); /* stdin */ } - if (c2pwrite != -1) { + if (c2pwrite == 1) { + int old = fcntl(c2pwrite, F_GETFD); + if (old != -1) + fcntl(c2pwrite, F_SETFD, old & ~FD_CLOEXEC); + } else if (c2pwrite != -1) { POSIX_CALL(dup2(c2pwrite, 1)); /* stdout */ } - if (errwrite != -1) { + if (errwrite == 2) { + int old = fcntl(errwrite, F_GETFD); + if (old != -1) + fcntl(errwrite, F_SETFD, old & ~FD_CLOEXEC); + } else if (errwrite != -1) { POSIX_CALL(dup2(errwrite, 2)); /* stderr */ } /* Close pipe fds. Make sure we don't close the same fd more than */ /* once, or standard fds. */ - if (p2cread != -1 && p2cread != 0) { + if (p2cread > 2) { POSIX_CALL(close(p2cread)); } - if (c2pwrite != -1 && c2pwrite != p2cread && c2pwrite != 1) { + if (c2pwrite > 2) { POSIX_CALL(close(c2pwrite)); } - if (errwrite != -1 && errwrite != p2cread && - errwrite != c2pwrite && errwrite != 2) { + if (errwrite != c2pwrite && errwrite > 2) { POSIX_CALL(close(errwrite)); } /* close() is intentionally not checked for errors here as we are closing */ /* a large range of fds, some of which may be invalid. 
*/ if (close_fds) { - for (fd_num = 3; fd_num < errpipe_write; ++fd_num) { - close(fd_num); - } - for (fd_num = errpipe_write+1; fd_num < max_fd; ++fd_num) { - close(fd_num); + Py_ssize_t keep_seq_idx; + int start_fd = 3; + for (keep_seq_idx = 0; keep_seq_idx < num_fds_to_keep; ++keep_seq_idx) { + PyObject* py_keep_fd = PySequence_Fast_GET_ITEM(py_fds_to_keep, + keep_seq_idx); + int keep_fd = PyLong_AsLong(py_keep_fd); + if (keep_fd < 0) { /* Negative number, overflow or not a Long. */ + err_msg = "bad value in fds_to_keep."; + errno = 0; /* We don't want to report an OSError. */ + goto error; + } + if (keep_fd < start_fd) + continue; + for (fd_num = start_fd; fd_num < keep_fd; ++fd_num) { + close(fd_num); + } + start_fd = keep_fd + 1; + } + if (start_fd <= max_fd) { + for (fd_num = start_fd; fd_num < max_fd; ++fd_num) { + close(fd_num); + } } } @@ -170,7 +205,7 @@ subprocess_fork_exec(PyObject* self, PyObject *args) { PyObject *gc_module = NULL; - PyObject *executable_list, *py_close_fds; + PyObject *executable_list, *py_close_fds, *py_fds_to_keep; PyObject *env_list, *preexec_fn; PyObject *process_args, *converted_args = NULL, *fast_args = NULL; PyObject *preexec_fn_args_tuple = NULL; @@ -182,11 +217,11 @@ pid_t pid; int need_to_reenable_gc = 0; char *const *exec_array, *const *argv = NULL, *const *envp = NULL; - Py_ssize_t arg_num; + Py_ssize_t arg_num, num_fds_to_keep; if (!PyArg_ParseTuple( - args, "OOOOOiiiiiiiiiiO:fork_exec", - &process_args, &executable_list, &py_close_fds, + args, "OOOOOOiiiiiiiiiiO:fork_exec", + &process_args, &executable_list, &py_close_fds, &py_fds_to_keep, &cwd_obj, &env_list, &p2cread, &p2cwrite, &c2pread, &c2pwrite, &errread, &errwrite, &errpipe_read, &errpipe_write, @@ -198,6 +233,11 @@ PyErr_SetString(PyExc_ValueError, "errpipe_write must be >= 3"); return NULL; } + num_fds_to_keep = PySequence_Length(py_fds_to_keep); + if (num_fds_to_keep < 0) { + PyErr_SetString(PyExc_ValueError, "bad fds_to_keep"); + return NULL; + } /* We need to call gc.disable() when we'll be calling preexec_fn */ if (preexec_fn != Py_None) { @@ -298,6 +338,7 @@ p2cread, p2cwrite, c2pread, c2pwrite, errread, errwrite, errpipe_read, errpipe_write, close_fds, restore_signals, call_setsid, + num_fds_to_keep, py_fds_to_keep, preexec_fn, preexec_fn_args_tuple); _exit(255); return NULL; /* Dead code to avoid a potential compiler warning. */ @@ -374,6 +415,57 @@ Raises: Only on an error in the parent process.\n\ "); +PyDoc_STRVAR(subprocess_cloexec_pipe_doc, +"cloexec_pipe() -> (read_end, write_end)\n\n\ +Create a pipe whose ends have the cloexec flag set."); + +static PyObject * +subprocess_cloexec_pipe(PyObject *self, PyObject *noargs) +{ + int fds[2]; + int res; +#ifdef HAVE_PIPE2 + Py_BEGIN_ALLOW_THREADS + res = pipe2(fds, O_CLOEXEC); + Py_END_ALLOW_THREADS + if (res != 0 && errno == ENOSYS) + { + if (PyErr_WarnEx( + PyExc_RuntimeWarning, + "pipe2 set errno ENOSYS; falling " + "back to non-atomic pipe+fcntl.", 1) != 0) { + return NULL; + } + { +#endif + /* We hold the GIL which offers some protection from other code calling + * fork() before the CLOEXEC flags have been set but we can't guarantee + * anything without pipe2(). 
*/ + long oldflags; + + res = pipe(fds); + + if (res == 0) { + oldflags = fcntl(fds[0], F_GETFD, 0); + if (oldflags < 0) res = oldflags; + } + if (res == 0) + res = fcntl(fds[0], F_SETFD, oldflags | FD_CLOEXEC); + + if (res == 0) { + oldflags = fcntl(fds[1], F_GETFD, 0); + if (oldflags < 0) res = oldflags; + } + if (res == 0) + res = fcntl(fds[1], F_SETFD, oldflags | FD_CLOEXEC); +#ifdef HAVE_PIPE2 + } + } +#endif + if (res != 0) + return PyErr_SetFromErrno(PyExc_OSError); + return Py_BuildValue("(ii)", fds[0], fds[1]); +} /* module level code ********************************************************/ @@ -383,6 +475,7 @@ static PyMethodDef module_methods[] = { {"fork_exec", subprocess_fork_exec, METH_VARARGS, subprocess_fork_exec_doc}, + {"cloexec_pipe", subprocess_cloexec_pipe, METH_NOARGS, subprocess_cloexec_pipe_doc}, {NULL, NULL} /* sentinel */ }; Modified: python/branches/pep-3151/Modules/_sqlite/connection.c ============================================================================== --- python/branches/pep-3151/Modules/_sqlite/connection.c (original) +++ python/branches/pep-3151/Modules/_sqlite/connection.c Sat Feb 26 08:16:32 2011 @@ -673,7 +673,6 @@ { PyObject* function_result = NULL; PyObject** aggregate_instance; - PyObject* aggregate_class; #ifdef WITH_THREAD PyGILState_STATE threadstate; @@ -681,8 +680,6 @@ threadstate = PyGILState_Ensure(); #endif - aggregate_class = (PyObject*)sqlite3_user_data(context); - aggregate_instance = (PyObject**)sqlite3_aggregate_context(context, sizeof(PyObject*)); if (!*aggregate_instance) { /* this branch is executed if there was an exception in the aggregate's Modified: python/branches/pep-3151/Modules/_sqlite/cursor.c ============================================================================== --- python/branches/pep-3151/Modules/_sqlite/cursor.c (original) +++ python/branches/pep-3151/Modules/_sqlite/cursor.c Sat Feb 26 08:16:32 2011 @@ -126,11 +126,9 @@ static void pysqlite_cursor_dealloc(pysqlite_Cursor* self) { - int rc; - /* Reset the statement if the user has not closed the cursor */ if (self->statement) { - rc = pysqlite_statement_reset(self->statement); + pysqlite_statement_reset(self->statement); Py_DECREF(self->statement); } @@ -529,7 +527,7 @@ if (self->statement != NULL) { /* There is an active statement */ - rc = pysqlite_statement_reset(self->statement); + pysqlite_statement_reset(self->statement); } operation_cstr = _PyUnicode_AsStringAndSize(operation, &operation_len); @@ -734,7 +732,7 @@ } if (multiple) { - rc = pysqlite_statement_reset(self->statement); + pysqlite_statement_reset(self->statement); } Py_XDECREF(parameters); } Modified: python/branches/pep-3151/Modules/_sqlite/module.c ============================================================================== --- python/branches/pep-3151/Modules/_sqlite/module.c (original) +++ python/branches/pep-3151/Modules/_sqlite/module.c Sat Feb 26 08:16:32 2011 @@ -329,7 +329,7 @@ (pysqlite_statement_setup_types() < 0) || (pysqlite_prepare_protocol_setup_types() < 0) ) { - Py_DECREF(module); + Py_XDECREF(module); return NULL; } Modified: python/branches/pep-3151/Modules/_sqlite/statement.c ============================================================================== --- python/branches/pep-3151/Modules/_sqlite/statement.c (original) +++ python/branches/pep-3151/Modules/_sqlite/statement.c Sat Feb 26 08:16:32 2011 @@ -369,11 +369,9 @@ void pysqlite_statement_dealloc(pysqlite_Statement* self) { - int rc; - if (self->st) { Py_BEGIN_ALLOW_THREADS - rc = sqlite3_finalize(self->st); + 
sqlite3_finalize(self->st); Py_END_ALLOW_THREADS } Modified: python/branches/pep-3151/Modules/_ssl.c ============================================================================== --- python/branches/pep-3151/Modules/_ssl.c (original) +++ python/branches/pep-3151/Modules/_ssl.c Sat Feb 26 08:16:32 2011 @@ -354,7 +354,6 @@ /* Actually negotiate SSL connection */ /* XXX If SSL_do_handshake() returns 0, it's also a failure. */ - sockstate = 0; do { PySSL_BEGIN_ALLOW_THREADS ret = SSL_do_handshake(self->ssl); @@ -370,7 +369,7 @@ sockstate = SOCKET_OPERATION_OK; } if (sockstate == SOCKET_HAS_TIMED_OUT) { - PyErr_SetString(PySSLErrorObject, + PyErr_SetString(PySocketModule.timeout_error, ERRSTR("The handshake operation timed out")); goto error; } else if (sockstate == SOCKET_HAS_BEEN_CLOSED) { @@ -928,10 +927,10 @@ char *cipher_protocol; if (self->ssl == NULL) - return Py_None; + Py_RETURN_NONE; current = SSL_get_current_cipher(self->ssl); if (current == NULL) - return Py_None; + Py_RETURN_NONE; retval = PyTuple_New(3); if (retval == NULL) @@ -939,6 +938,7 @@ cipher_name = (char *) SSL_CIPHER_get_name(current); if (cipher_name == NULL) { + Py_INCREF(Py_None); PyTuple_SET_ITEM(retval, 0, Py_None); } else { v = PyUnicode_FromString(cipher_name); @@ -948,6 +948,7 @@ } cipher_protocol = SSL_CIPHER_get_version(current); if (cipher_protocol == NULL) { + Py_INCREF(Py_None); PyTuple_SET_ITEM(retval, 1, Py_None); } else { v = PyUnicode_FromString(cipher_protocol); @@ -1075,7 +1076,7 @@ sockstate = check_socket_and_wait_for_timeout(sock, 1); if (sockstate == SOCKET_HAS_TIMED_OUT) { - PyErr_SetString(PySSLErrorObject, + PyErr_SetString(PySocketModule.timeout_error, "The write operation timed out"); goto error; } else if (sockstate == SOCKET_HAS_BEEN_CLOSED) { @@ -1088,7 +1089,6 @@ goto error; } do { - err = 0; PySSL_BEGIN_ALLOW_THREADS len = SSL_write(self->ssl, buf.buf, buf.len); err = SSL_get_error(self->ssl, len); @@ -1104,7 +1104,7 @@ sockstate = SOCKET_OPERATION_OK; } if (sockstate == SOCKET_HAS_TIMED_OUT) { - PyErr_SetString(PySSLErrorObject, + PyErr_SetString(PySocketModule.timeout_error, "The write operation timed out"); goto error; } else if (sockstate == SOCKET_HAS_BEEN_CLOSED) { @@ -1211,7 +1211,7 @@ if (!count) { sockstate = check_socket_and_wait_for_timeout(sock, 0); if (sockstate == SOCKET_HAS_TIMED_OUT) { - PyErr_SetString(PySSLErrorObject, + PyErr_SetString(PySocketModule.timeout_error, "The read operation timed out"); goto error; } else if (sockstate == SOCKET_TOO_LARGE_FOR_SELECT) { @@ -1224,7 +1224,6 @@ } } do { - err = 0; PySSL_BEGIN_ALLOW_THREADS count = SSL_read(self->ssl, mem, len); err = SSL_get_error(self->ssl, count); @@ -1245,7 +1244,7 @@ sockstate = SOCKET_OPERATION_OK; } if (sockstate == SOCKET_HAS_TIMED_OUT) { - PyErr_SetString(PySSLErrorObject, + PyErr_SetString(PySocketModule.timeout_error, "The read operation timed out"); goto error; } else if (sockstate == SOCKET_IS_NONBLOCKING) { @@ -1340,10 +1339,10 @@ break; if (sockstate == SOCKET_HAS_TIMED_OUT) { if (ssl_err == SSL_ERROR_WANT_READ) - PyErr_SetString(PySSLErrorObject, + PyErr_SetString(PySocketModule.timeout_error, "The read operation timed out"); else - PyErr_SetString(PySSLErrorObject, + PyErr_SetString(PySocketModule.timeout_error, "The write operation timed out"); goto error; } @@ -1681,7 +1680,7 @@ return NULL; } if (capath && !PyUnicode_FSConverter(capath, &capath_bytes)) { - Py_DECREF(cafile_bytes); + Py_XDECREF(cafile_bytes); PyErr_SetString(PyExc_TypeError, "capath should be a valid filesystem path"); return 
NULL; @@ -1783,6 +1782,16 @@ return NULL; } +static PyObject * +set_default_verify_paths(PySSLContext *self, PyObject *unused) +{ + if (!SSL_CTX_set_default_verify_paths(self->ctx)) { + _setSSLError(NULL, 0, __FILE__, __LINE__); + return NULL; + } + Py_RETURN_NONE; +} + static PyGetSetDef context_getsetlist[] = { {"options", (getter) get_options, (setter) set_options, NULL}, @@ -1802,6 +1811,8 @@ METH_VARARGS | METH_KEYWORDS, NULL}, {"session_stats", (PyCFunction) session_stats, METH_NOARGS, NULL}, + {"set_default_verify_paths", (PyCFunction) set_default_verify_paths, + METH_NOARGS, NULL}, {NULL, NULL} /* sentinel */ }; Modified: python/branches/pep-3151/Modules/_struct.c ============================================================================== --- python/branches/pep-3151/Modules/_struct.c (original) +++ python/branches/pep-3151/Modules/_struct.c Sat Feb 26 08:16:32 2011 @@ -6,7 +6,6 @@ #define PY_SSIZE_T_CLEAN #include "Python.h" -#include "structseq.h" #include "structmember.h" #include @@ -463,14 +462,9 @@ static int np_char(char *p, PyObject *v, const formatdef *f) { - if (PyUnicode_Check(v)) { - v = _PyUnicode_AsDefaultEncodedString(v, NULL); - if (v == NULL) - return -1; - } if (!PyBytes_Check(v) || PyBytes_Size(v) != 1) { PyErr_SetString(StructError, - "char format requires bytes or string of length 1"); + "char format requires a bytes object of length 1"); return -1; } *p = *PyBytes_AsString(v); @@ -1346,7 +1340,7 @@ if (!PyBytes_Check(o_format)) { Py_DECREF(o_format); PyErr_Format(PyExc_TypeError, - "Struct() argument 1 must be bytes, not %.200s", + "Struct() argument 1 must be a bytes object, not %.200s", Py_TYPE(o_format)->tp_name); return -1; } @@ -1424,7 +1418,7 @@ return NULL; if (vbuf.len != soself->s_size) { PyErr_Format(StructError, - "unpack requires a bytes argument of length %zd", + "unpack requires a bytes object of length %zd", soself->s_size); PyBuffer_Release(&vbuf); return NULL; @@ -1504,15 +1498,10 @@ if (e->format == 's') { int isstring; void *p; - if (PyUnicode_Check(v)) { - v = _PyUnicode_AsDefaultEncodedString(v, NULL); - if (v == NULL) - return -1; - } isstring = PyBytes_Check(v); if (!isstring && !PyByteArray_Check(v)) { PyErr_SetString(StructError, - "argument for 's' must be a bytes or string"); + "argument for 's' must be a bytes object"); return -1; } if (isstring) { @@ -1530,15 +1519,10 @@ } else if (e->format == 'p') { int isstring; void *p; - if (PyUnicode_Check(v)) { - v = _PyUnicode_AsDefaultEncodedString(v, NULL); - if (v == NULL) - return -1; - } isstring = PyBytes_Check(v); if (!isstring && !PyByteArray_Check(v)) { PyErr_SetString(StructError, - "argument for 'p' must be a bytes or string"); + "argument for 'p' must be a bytes object"); return -1; } if (isstring) { @@ -1692,7 +1676,7 @@ {NULL, NULL} /* sentinel */ }; -PyDoc_STRVAR(s__doc__, +PyDoc_STRVAR(s__doc__, "Struct(fmt) --> compiled struct object\n" "\n" "Return a new Struct object which writes and reads binary data according to\n" Modified: python/branches/pep-3151/Modules/_testcapimodule.c ============================================================================== --- python/branches/pep-3151/Modules/_testcapimodule.c (original) +++ python/branches/pep-3151/Modules/_testcapimodule.c Sat Feb 26 08:16:32 2011 @@ -1398,7 +1398,7 @@ if (buffer == NULL) return PyErr_NoMemory(); - size = PyUnicode_AsWideChar((PyUnicodeObject*)unicode, buffer, buflen); + size = PyUnicode_AsWideChar(unicode, buffer, buflen); if (size == -1) { PyMem_Free(buffer); return NULL; @@ -1741,15 +1741,16 @@ { 
PyObject *result; char *msg; + static const Py_UNICODE one[] = {'1', 0}; -#define CHECK_1_FORMAT(FORMAT, TYPE) \ - result = PyUnicode_FromFormat(FORMAT, (TYPE)1); \ - if (result == NULL) \ - return NULL; \ - if (strcmp(_PyUnicode_AsString(result), "1")) { \ - msg = FORMAT " failed at 1"; \ - goto Fail; \ - } \ +#define CHECK_1_FORMAT(FORMAT, TYPE) \ + result = PyUnicode_FromFormat(FORMAT, (TYPE)1); \ + if (result == NULL) \ + return NULL; \ + if (Py_UNICODE_strcmp(PyUnicode_AS_UNICODE(result), one)) { \ + msg = FORMAT " failed at 1"; \ + goto Fail; \ + } \ Py_DECREF(result) CHECK_1_FORMAT("%d", int); @@ -2187,7 +2188,7 @@ /* argument converter not called? */ return NULL; /* Should be 1 */ - res = PyLong_FromLong(Py_REFCNT(str2)); + res = PyLong_FromSsize_t(Py_REFCNT(str2)); Py_DECREF(str2); PyErr_Clear(); return res; @@ -2230,6 +2231,15 @@ return PyErr_NewExceptionWithDoc(name, doc, base, dict); } +static PyObject * +make_memoryview_from_NULL_pointer(PyObject *self) +{ + Py_buffer info; + if (PyBuffer_FillInfo(&info, NULL, NULL, 1, 1, PyBUF_FULL_RO) < 0) + return NULL; + return PyMemoryView_FromBuffer(&info); +} + /* Test that the fatal error from not having a current thread doesn't cause an infinite loop. Run via Lib/test/test_capi.py */ static PyObject * @@ -2245,17 +2255,6 @@ return NULL; } -static PyObject * -format_unicode(PyObject *self, PyObject *args) -{ - const char *format; - PyObject *arg; - if (!PyArg_ParseTuple(args, "yU", &format, &arg)) - return NULL; - return PyUnicode_FromFormat(format, arg); - -} - static PyMethodDef TestMethods[] = { {"raise_exception", raise_exception, METH_VARARGS}, {"raise_memoryerror", (PyCFunction)raise_memoryerror, METH_NOARGS}, @@ -2336,8 +2335,9 @@ {"code_newempty", code_newempty, METH_VARARGS}, {"make_exception_with_doc", (PyCFunction)make_exception_with_doc, METH_VARARGS | METH_KEYWORDS}, + {"make_memoryview_from_NULL_pointer", (PyCFunction)make_memoryview_from_NULL_pointer, + METH_NOARGS}, {"crash_no_current_thread", (PyCFunction)crash_no_current_thread, METH_NOARGS}, - {"format_unicode", format_unicode, METH_VARARGS}, {NULL, NULL} /* sentinel */ }; Modified: python/branches/pep-3151/Modules/_threadmodule.c ============================================================================== --- python/branches/pep-3151/Modules/_threadmodule.c (original) +++ python/branches/pep-3151/Modules/_threadmodule.c Sat Feb 26 08:16:32 2011 @@ -40,6 +40,59 @@ PyObject_Del(self); } +/* Helper to acquire an interruptible lock with a timeout. If the lock acquire + * is interrupted, signal handlers are run, and if they raise an exception, + * PY_LOCK_INTR is returned. Otherwise, PY_LOCK_ACQUIRED or PY_LOCK_FAILURE + * are returned, depending on whether the lock can be acquired withing the + * timeout. + */ +static PyLockStatus +acquire_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds) +{ + PyLockStatus r; + _PyTime_timeval curtime; + _PyTime_timeval endtime; + + + _PyTime_gettimeofday(&endtime); + if (microseconds > 0) { + endtime.tv_sec += microseconds / (1000 * 1000); + endtime.tv_usec += microseconds % (1000 * 1000); + } + + + do { + Py_BEGIN_ALLOW_THREADS + r = PyThread_acquire_lock_timed(lock, microseconds, 1); + Py_END_ALLOW_THREADS + + if (r == PY_LOCK_INTR) { + /* Run signal handlers if we were interrupted. Propagate + * exceptions from signal handlers, such as KeyboardInterrupt, by + * passing up PY_LOCK_INTR. 
*/ + if (Py_MakePendingCalls() < 0) { + return PY_LOCK_INTR; + } + + /* If we're using a timeout, recompute the timeout after processing + * signals, since those can take time. */ + if (microseconds >= 0) { + _PyTime_gettimeofday(&curtime); + microseconds = ((endtime.tv_sec - curtime.tv_sec) * 1000000 + + (endtime.tv_usec - curtime.tv_usec)); + + /* Check for negative values, since those mean block forever. + */ + if (microseconds <= 0) { + r = PY_LOCK_FAILURE; + } + } + } + } while (r == PY_LOCK_INTR); /* Retry if we were interrupted. */ + + return r; +} + static PyObject * lock_PyThread_acquire_lock(lockobject *self, PyObject *args, PyObject *kwds) { @@ -47,7 +100,7 @@ int blocking = 1; double timeout = -1; PY_TIMEOUT_T microseconds; - int r; + PyLockStatus r; if (!PyArg_ParseTupleAndKeywords(args, kwds, "|id:acquire", kwlist, &blocking, &timeout)) @@ -77,11 +130,12 @@ microseconds = (PY_TIMEOUT_T) timeout; } - Py_BEGIN_ALLOW_THREADS - r = PyThread_acquire_lock_timed(self->lock_lock, microseconds); - Py_END_ALLOW_THREADS + r = acquire_timed(self->lock_lock, microseconds); + if (r == PY_LOCK_INTR) { + return NULL; + } - return PyBool_FromLong(r); + return PyBool_FromLong(r == PY_LOCK_ACQUIRED); } PyDoc_STRVAR(acquire_doc, @@ -93,7 +147,7 @@ the lock, and return None once the lock is acquired.\n\ With an argument, this will only block if the argument is true,\n\ and the return value reflects whether the lock is acquired.\n\ -The blocking operation is not interruptible."); +The blocking operation is interruptible."); static PyObject * lock_PyThread_release_lock(lockobject *self) @@ -218,7 +272,7 @@ double timeout = -1; PY_TIMEOUT_T microseconds; long tid; - int r = 1; + PyLockStatus r = PY_LOCK_ACQUIRED; if (!PyArg_ParseTupleAndKeywords(args, kwds, "|id:acquire", kwlist, &blocking, &timeout)) @@ -265,17 +319,18 @@ if (microseconds == 0) { Py_RETURN_FALSE; } - Py_BEGIN_ALLOW_THREADS - r = PyThread_acquire_lock_timed(self->rlock_lock, microseconds); - Py_END_ALLOW_THREADS + r = acquire_timed(self->rlock_lock, microseconds); } - if (r) { + if (r == PY_LOCK_ACQUIRED) { assert(self->rlock_count == 0); self->rlock_owner = tid; self->rlock_count = 1; } + else if (r == PY_LOCK_INTR) { + return NULL; + } - return PyBool_FromLong(r); + return PyBool_FromLong(r == PY_LOCK_ACQUIRED); } PyDoc_STRVAR(rlock_acquire_doc, @@ -287,7 +342,7 @@ immediately. 
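
From Python, the acquire_timed helper above shows up as the timeout argument to Lock.acquire() and as a blocking acquire that the main thread can now interrupt with Ctrl-C. A minimal sketch of the timeout path, assuming an interpreter built with this change:

    import threading

    lock = threading.Lock()
    lock.acquire()                       # the lock is now held

    # The second acquire blocks for about half a second, rechecking the
    # deadline after any signal handlers run, and returns False because
    # the lock cannot be obtained within the timeout.
    got_it = lock.acquire(timeout=0.5)
    print(got_it)                        # False
    lock.release()
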
If `blocking` is True and another thread holds\n\ the lock, the method will wait for the lock to be released,\n\ take it and then return True.\n\ -(note: the blocking operation is not interruptible.)\n\ +(note: the blocking operation is interruptible.)\n\ \n\ In all other cases, the method will return True immediately.\n\ Precisely, if the current thread already holds the lock, its\n\ Modified: python/branches/pep-3151/Modules/_tkinter.c ============================================================================== --- python/branches/pep-3151/Modules/_tkinter.c (original) +++ python/branches/pep-3151/Modules/_tkinter.c Sat Feb 26 08:16:32 2011 @@ -2005,7 +2005,7 @@ PythonCmd(ClientData clientData, Tcl_Interp *interp, int argc, char *argv[]) { PythonCmd_ClientData *data = (PythonCmd_ClientData *)clientData; - PyObject *self, *func, *arg, *res; + PyObject *func, *arg, *res; int i, rv; Tcl_Obj *obj_res; @@ -2014,7 +2014,6 @@ /* TBD: no error checking here since we know, via the * Tkapp_CreateCommand() that the client data is a two-tuple */ - self = data->self; func = data->func; /* Create argument list (argv1, ..., argvN) */ Modified: python/branches/pep-3151/Modules/arraymodule.c ============================================================================== --- python/branches/pep-3151/Modules/arraymodule.c (original) +++ python/branches/pep-3151/Modules/arraymodule.c Sat Feb 26 08:16:32 2011 @@ -674,11 +674,9 @@ static PyObject * array_repeat(arrayobject *a, Py_ssize_t n) { - Py_ssize_t i; Py_ssize_t size; arrayobject *np; - char *p; - Py_ssize_t nbytes; + Py_ssize_t oldbytes, newbytes; if (n < 0) n = 0; if ((Py_SIZE(a) != 0) && (n > PY_SSIZE_T_MAX / Py_SIZE(a))) { @@ -688,13 +686,23 @@ np = (arrayobject *) newarrayobject(&Arraytype, size, a->ob_descr); if (np == NULL) return NULL; - p = np->ob_item; - nbytes = Py_SIZE(a) * a->ob_descr->itemsize; - for (i = 0; i < n; i++) { - memcpy(p, a->ob_item, nbytes); - p += nbytes; + if (n == 0) + return (PyObject *)np; + oldbytes = Py_SIZE(a) * a->ob_descr->itemsize; + newbytes = oldbytes * n; + /* this follows the code in unicode_repeat */ + if (oldbytes == 1) { + memset(np->ob_item, a->ob_item[0], newbytes); + } else { + Py_ssize_t done = oldbytes; + Py_MEMCPY(np->ob_item, a->ob_item, oldbytes); + while (done < newbytes) { + Py_ssize_t ncopy = (done <= newbytes-done) ? 
done : newbytes-done; + Py_MEMCPY(np->ob_item+done, np->ob_item, ncopy); + done += ncopy; + } } - return (PyObject *) np; + return (PyObject *)np; } static int @@ -868,7 +876,6 @@ if (Py_SIZE(self) > 0) { if (n < 0) n = 0; - items = self->ob_item; if ((self->ob_descr->itemsize != 0) && (Py_SIZE(self) > PY_SSIZE_T_MAX / self->ob_descr->itemsize)) { return PyErr_NoMemory(); @@ -1448,7 +1455,7 @@ { Py_UNICODE *ustr; Py_ssize_t n; - char typecode; + Py_UNICODE typecode; if (!PyArg_ParseTuple(args, "u#:fromunicode", &ustr, &n)) return NULL; @@ -1483,7 +1490,7 @@ static PyObject * array_tounicode(arrayobject *self, PyObject *unused) { - char typecode; + Py_UNICODE typecode; typecode = self->ob_descr->typecode; if ((typecode != 'u')) { PyErr_SetString(PyExc_ValueError, @@ -2002,8 +2009,8 @@ static PyObject * array_get_typecode(arrayobject *a, void *closure) { - char tc = a->ob_descr->typecode; - return PyUnicode_FromStringAndSize(&tc, 1); + Py_UNICODE tc = a->ob_descr->typecode; + return PyUnicode_FromUnicode(&tc, 1); } static PyObject * @@ -2075,21 +2082,21 @@ static PyObject * array_repr(arrayobject *a) { - char typecode; + Py_UNICODE typecode; PyObject *s, *v = NULL; Py_ssize_t len; len = Py_SIZE(a); typecode = a->ob_descr->typecode; if (len == 0) { - return PyUnicode_FromFormat("array('%c')", typecode); + return PyUnicode_FromFormat("array('%c')", (int)typecode); } if ((typecode == 'u')) v = array_tounicode(a, NULL); else v = array_tolist(a, NULL); - s = PyUnicode_FromFormat("array('%c', %R)", typecode, v); + s = PyUnicode_FromFormat("array('%c', %R)", (int)typecode, v); Py_DECREF(v); return s; } @@ -2112,7 +2119,7 @@ arrayobject* ar; int itemsize = self->ob_descr->itemsize; - if (PySlice_GetIndicesEx((PySliceObject*)item, Py_SIZE(self), + if (PySlice_GetIndicesEx(item, Py_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return NULL; } @@ -2183,7 +2190,7 @@ return (*self->ob_descr->setitem)(self, i, value); } else if (PySlice_Check(item)) { - if (PySlice_GetIndicesEx((PySliceObject *)item, + if (PySlice_GetIndicesEx(item, Py_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return -1; @@ -2397,7 +2404,9 @@ || PyByteArray_Check(initial) || PyBytes_Check(initial) || PyTuple_Check(initial) - || ((c=='u') && PyUnicode_Check(initial)))) { + || ((c=='u') && PyUnicode_Check(initial)) + || (array_Check(initial) + && c == ((arrayobject*)initial)->ob_descr->typecode))) { it = PyObject_GetIter(initial); if (it == NULL) return NULL; @@ -2413,17 +2422,20 @@ PyObject *a; Py_ssize_t len; - if (initial == NULL || !(PyList_Check(initial) - || PyTuple_Check(initial))) + if (initial == NULL) len = 0; + else if (PyList_Check(initial)) + len = PyList_GET_SIZE(initial); + else if (PyTuple_Check(initial) || array_Check(initial)) + len = Py_SIZE(initial); else - len = PySequence_Size(initial); + len = 0; a = newarrayobject(type, len, descr); if (a == NULL) return NULL; - if (len > 0) { + if (len > 0 && !array_Check(initial)) { Py_ssize_t i; for (i = 0; i < len; i++) { PyObject *v = @@ -2474,6 +2486,11 @@ self->allocated = Py_SIZE(self); } } + else if (initial != NULL && array_Check(initial)) { + arrayobject *self = (arrayobject *)a; + arrayobject *other = (arrayobject *)initial; + memcpy(self->ob_item, other->ob_item, len * other->ob_descr->itemsize); + } if (it != NULL) { if (array_iter_extend((arrayobject *)a, it) == -1) { Py_DECREF(it); Modified: python/branches/pep-3151/Modules/atexitmodule.c ============================================================================== --- 
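
The array_new changes above let the constructor recognise another array with the same typecode and copy its buffer wholesale instead of iterating item by item. Roughly, at the Python level:

    from array import array

    a = array('d', [0.0, 1.5, 3.0])
    b = array('d', a)        # same typecode, so the buffer is copied with memcpy
    assert b == a and b is not a
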
python/branches/pep-3151/Modules/atexitmodule.c (original) +++ python/branches/pep-3151/Modules/atexitmodule.c Sat Feb 26 08:16:32 2011 @@ -72,6 +72,7 @@ PyErr_Fetch(&exc_type, &exc_value, &exc_tb); if (!PyErr_ExceptionMatches(PyExc_SystemExit)) { PySys_WriteStderr("Error in atexit._run_exitfuncs:\n"); + PyErr_NormalizeException(&exc_type, &exc_value, &exc_tb); PyErr_Display(exc_type, exc_value, exc_tb); } } Modified: python/branches/pep-3151/Modules/audioop.c ============================================================================== --- python/branches/pep-3151/Modules/audioop.c (original) +++ python/branches/pep-3151/Modules/audioop.c Sat Feb 26 08:16:32 2011 @@ -309,7 +309,7 @@ } static int -audioop_check_parameters(int len, int size) +audioop_check_parameters(Py_ssize_t len, int size) { if (!audioop_check_size(size)) return 0; @@ -513,7 +513,6 @@ best_result = result; best_j = 0; - j = 0; for ( j=1; j<=len1-len2; j++) { aj_m1 = (double)cp1[j-1]; @@ -599,7 +598,6 @@ best_result = result; best_j = 0; - j = 0; for ( j=1; j<=len1-len2; j++) { aj_m1 = (double)cp1[j-1]; @@ -1433,7 +1431,6 @@ if ( state == Py_None ) { /* First time, it seems. Set defaults */ valpred = 0; - step = 7; index = 0; } else if ( !PyArg_ParseTuple(state, "ii", &valpred, &index) ) return 0; @@ -1534,7 +1531,6 @@ if ( state == Py_None ) { /* First time, it seems. Set defaults */ valpred = 0; - step = 7; index = 0; } else if ( !PyArg_ParseTuple(state, "ii", &valpred, &index) ) return 0; Modified: python/branches/pep-3151/Modules/cjkcodecs/_codecs_iso2022.c ============================================================================== --- python/branches/pep-3151/Modules/cjkcodecs/_codecs_iso2022.c (original) +++ python/branches/pep-3151/Modules/cjkcodecs/_codecs_iso2022.c Sat Feb 26 08:16:32 2011 @@ -123,7 +123,7 @@ CODEC_INIT(iso2022) { - const struct iso2022_designation *desig = CONFIG_DESIGNATIONS; + const struct iso2022_designation *desig; for (desig = CONFIG_DESIGNATIONS; desig->mark; desig++) if (desig->initializer != NULL && desig->initializer() != 0) return -1; Modified: python/branches/pep-3151/Modules/cjkcodecs/multibytecodec.c ============================================================================== --- python/branches/pep-3151/Modules/cjkcodecs/multibytecodec.c (original) +++ python/branches/pep-3151/Modules/cjkcodecs/multibytecodec.c Sat Feb 26 08:16:32 2011 @@ -483,6 +483,7 @@ return PyBytes_FromStringAndSize(NULL, 0); buf.excobj = NULL; + buf.outobj = NULL; buf.inbuf = buf.inbuf_top = *data; buf.inbuf_end = buf.inbuf_top + datalen; Deleted: python/branches/pep-3151/Modules/cryptmodule.c ============================================================================== --- python/branches/pep-3151/Modules/cryptmodule.c Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,62 +0,0 @@ -/* cryptmodule.c - by Steve Majewski - */ - -#include "Python.h" - -#include - -#ifdef __VMS -#include -#endif - -/* Module crypt */ - - -static PyObject *crypt_crypt(PyObject *self, PyObject *args) -{ - char *word, *salt; -#ifndef __VMS - extern char * crypt(const char *, const char *); -#endif - - if (!PyArg_ParseTuple(args, "ss:crypt", &word, &salt)) { - return NULL; - } - /* On some platforms (AtheOS) crypt returns NULL for an invalid - salt. Return None in that case. XXX Maybe raise an exception? */ - return Py_BuildValue("s", crypt(word, salt)); - -} - -PyDoc_STRVAR(crypt_crypt__doc__, -"crypt(word, salt) -> string\n\ -word will usually be a user's password. 
salt is a 2-character string\n\ -which will be used to select one of 4096 variations of DES. The characters\n\ -in salt must be either \".\", \"/\", or an alphanumeric character. Returns\n\ -the hashed password as a string, which will be composed of characters from\n\ -the same alphabet as the salt."); - - -static PyMethodDef crypt_methods[] = { - {"crypt", crypt_crypt, METH_VARARGS, crypt_crypt__doc__}, - {NULL, NULL} /* sentinel */ -}; - - -static struct PyModuleDef cryptmodule = { - PyModuleDef_HEAD_INIT, - "crypt", - NULL, - -1, - crypt_methods, - NULL, - NULL, - NULL, - NULL -}; - -PyMODINIT_FUNC -PyInit_crypt(void) -{ - return PyModule_Create(&cryptmodule); -} Modified: python/branches/pep-3151/Modules/gcmodule.c ============================================================================== --- python/branches/pep-3151/Modules/gcmodule.c (original) +++ python/branches/pep-3151/Modules/gcmodule.c Sat Feb 26 08:16:32 2011 @@ -1511,11 +1511,3 @@ } PyObject_FREE(g); } - -/* for binary compatibility with 2.2 */ -#undef _PyObject_GC_Del -void -_PyObject_GC_Del(PyObject *op) -{ - PyObject_GC_Del(op); -} Modified: python/branches/pep-3151/Modules/getpath.c ============================================================================== --- python/branches/pep-3151/Modules/getpath.c (original) +++ python/branches/pep-3151/Modules/getpath.c Sat Feb 26 08:16:32 2011 @@ -361,7 +361,7 @@ decoded = PyUnicode_DecodeUTF8(buf, n, "surrogateescape"); if (decoded != NULL) { Py_ssize_t k; - k = PyUnicode_AsWideChar((PyUnicodeObject*)decoded, + k = PyUnicode_AsWideChar(decoded, rel_builddir_path, MAXPATHLEN); Py_DECREF(decoded); if (k >= 0) { Modified: python/branches/pep-3151/Modules/grpmodule.c ============================================================================== --- python/branches/pep-3151/Modules/grpmodule.c (original) +++ python/branches/pep-3151/Modules/grpmodule.c Sat Feb 26 08:16:32 2011 @@ -2,7 +2,6 @@ /* UNIX group file access module */ #include "Python.h" -#include "structseq.h" #include #include @@ -160,7 +159,9 @@ name is not valid, raise KeyError."}, {"getgrall", grp_getgrall, METH_NOARGS, "getgrall() -> list of tuples\n\ -Return a list of all available group entries, in arbitrary order."}, +Return a list of all available group entries, in arbitrary order.\n\ +An entry whose name starts with '+' or '-' represents an instruction\n\ +to use YP/NIS and may not be accessible via getgrnam or getgrgid."}, {NULL, NULL} /* sentinel */ }; Modified: python/branches/pep-3151/Modules/itertoolsmodule.c ============================================================================== --- python/branches/pep-3151/Modules/itertoolsmodule.c (original) +++ python/branches/pep-3151/Modules/itertoolsmodule.c Sat Feb 26 08:16:32 2011 @@ -1215,6 +1215,7 @@ { PyObject *item; PyObject *it = lz->it; + Py_ssize_t stop = lz->stop; Py_ssize_t oldnext; PyObject *(*iternext)(PyObject *); @@ -1226,7 +1227,7 @@ Py_DECREF(item); lz->cnt++; } - if (lz->stop != -1 && lz->cnt >= lz->stop) + if (stop != -1 && lz->cnt >= stop) return NULL; item = iternext(it); if (item == NULL) @@ -1234,8 +1235,8 @@ lz->cnt++; oldnext = lz->next; lz->next += lz->step; - if (lz->next < oldnext) /* Check for overflow */ - lz->next = lz->stop; + if (lz->next < oldnext || (stop != -1 && lz->next > stop)) + lz->next = stop; return item; } @@ -2583,6 +2584,138 @@ PyObject_GC_Del, /* tp_free */ }; +/* accumulate object ************************************************************/ + +typedef struct { + PyObject_HEAD + PyObject *total; + PyObject 
*it; +} accumulateobject; + +static PyTypeObject accumulate_type; + +static PyObject * +accumulate_new(PyTypeObject *type, PyObject *args, PyObject *kwds) +{ + static char *kwargs[] = {"iterable", NULL}; + PyObject *iterable; + PyObject *it; + accumulateobject *lz; + + if (!PyArg_ParseTupleAndKeywords(args, kwds, "O:accumulate", kwargs, &iterable)) + return NULL; + + /* Get iterator. */ + it = PyObject_GetIter(iterable); + if (it == NULL) + return NULL; + + /* create accumulateobject structure */ + lz = (accumulateobject *)type->tp_alloc(type, 0); + if (lz == NULL) { + Py_DECREF(it); + return NULL; + } + + lz->total = NULL; + lz->it = it; + return (PyObject *)lz; +} + +static void +accumulate_dealloc(accumulateobject *lz) +{ + PyObject_GC_UnTrack(lz); + Py_XDECREF(lz->total); + Py_XDECREF(lz->it); + Py_TYPE(lz)->tp_free(lz); +} + +static int +accumulate_traverse(accumulateobject *lz, visitproc visit, void *arg) +{ + Py_VISIT(lz->it); + Py_VISIT(lz->total); + return 0; +} + +static PyObject * +accumulate_next(accumulateobject *lz) +{ + PyObject *val, *oldtotal, *newtotal; + + val = PyIter_Next(lz->it); + if (val == NULL) + return NULL; + + if (lz->total == NULL) { + Py_INCREF(val); + lz->total = val; + return lz->total; + } + + newtotal = PyNumber_Add(lz->total, val); + Py_DECREF(val); + if (newtotal == NULL) + return NULL; + + oldtotal = lz->total; + lz->total = newtotal; + Py_DECREF(oldtotal); + + Py_INCREF(newtotal); + return newtotal; +} + +PyDoc_STRVAR(accumulate_doc, +"accumulate(iterable) --> accumulate object\n\ +\n\ +Return series of accumulated sums."); + +static PyTypeObject accumulate_type = { + PyVarObject_HEAD_INIT(NULL, 0) + "itertools.accumulate", /* tp_name */ + sizeof(accumulateobject), /* tp_basicsize */ + 0, /* tp_itemsize */ + /* methods */ + (destructor)accumulate_dealloc, /* tp_dealloc */ + 0, /* tp_print */ + 0, /* tp_getattr */ + 0, /* tp_setattr */ + 0, /* tp_reserved */ + 0, /* tp_repr */ + 0, /* tp_as_number */ + 0, /* tp_as_sequence */ + 0, /* tp_as_mapping */ + 0, /* tp_hash */ + 0, /* tp_call */ + 0, /* tp_str */ + PyObject_GenericGetAttr, /* tp_getattro */ + 0, /* tp_setattro */ + 0, /* tp_as_buffer */ + Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | + Py_TPFLAGS_BASETYPE, /* tp_flags */ + accumulate_doc, /* tp_doc */ + (traverseproc)accumulate_traverse, /* tp_traverse */ + 0, /* tp_clear */ + 0, /* tp_richcompare */ + 0, /* tp_weaklistoffset */ + PyObject_SelfIter, /* tp_iter */ + (iternextfunc)accumulate_next, /* tp_iternext */ + 0, /* tp_methods */ + 0, /* tp_members */ + 0, /* tp_getset */ + 0, /* tp_base */ + 0, /* tp_dict */ + 0, /* tp_descr_get */ + 0, /* tp_descr_set */ + 0, /* tp_dictoffset */ + 0, /* tp_init */ + 0, /* tp_alloc */ + accumulate_new, /* tp_new */ + PyObject_GC_Del, /* tp_free */ +}; + /* compress object ************************************************************/ @@ -3495,6 +3628,7 @@ repeat(elem [,n]) --> elem, elem, elem, ... endlessly or up to n times\n\ \n\ Iterators terminating on the shortest input sequence:\n\ +accumulate(p, start=0) --> p0, p0+p1, p0+p1+p2\n\ chain(p, q, ...) --> p0, p1, ... plast, q0, q1, ... 
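
The accumulate object above yields running totals computed with PyNumber_Add. Note that the module docstring advertises accumulate(p, start=0) while accumulate_new only parses a single iterable argument, so this sketch sticks to the one-argument form:

    >>> from itertools import accumulate
    >>> list(accumulate([1, 2, 3, 4, 5]))
    [1, 3, 6, 10, 15]
    >>> list(accumulate(['a', 'b', 'c']))    # any objects supporting +
    ['a', 'ab', 'abc']
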
\n\ compress(data, selectors) --> (d[0] if s[0]), (d[1] if s[1]), ...\n\ dropwhile(pred, seq) --> seq[n], seq[n+1], starting when pred fails\n\ @@ -3540,6 +3674,7 @@ PyObject *m; char *name; PyTypeObject *typelist[] = { + &accumulate_type, &combinations_type, &cwr_type, &cycle_type, Modified: python/branches/pep-3151/Modules/ld_so_aix.in ============================================================================== --- python/branches/pep-3151/Modules/ld_so_aix.in (original) +++ python/branches/pep-3151/Modules/ld_so_aix.in Sat Feb 26 08:16:32 2011 @@ -131,7 +131,7 @@ shift done -if test "$objfile" = "libpython at VERSION@.so"; then +if test "$objfile" = "libpython at VERSION@@ABIFLAGS at .so"; then ldsocoremode="true" fi Modified: python/branches/pep-3151/Modules/main.c ============================================================================== --- python/branches/pep-3151/Modules/main.c (original) +++ python/branches/pep-3151/Modules/main.c Sat Feb 26 08:16:32 2011 @@ -2,7 +2,6 @@ #include "Python.h" #include "osdefs.h" -#include "import.h" #include @@ -47,7 +46,7 @@ static int orig_argc; /* command line options */ -#define BASE_OPTS L"bBc:dEhiJm:OsStuvVW:xX:?" +#define BASE_OPTS L"bBc:dEhiJm:OqsStuvVW:xX:?" #define PROGRAM_OPTS BASE_OPTS @@ -72,6 +71,7 @@ -m mod : run library module as a script (terminates option list)\n\ -O : optimize generated bytecode slightly; also PYTHONOPTIMIZE=x\n\ -OO : remove doc-strings in addition to the -O optimizations\n\ +-q : don't print version and copyright messages on interactive startup\n\ -s : don't add user site directory to sys.path; also PYTHONNOUSERSITE\n\ -S : don't imply 'import site' on initialization\n\ "; @@ -425,6 +425,10 @@ PySys_AddXOption(_PyOS_optarg); break; + case 'q': + Py_QuietFlag++; + break; + /* This space reserved for other options */ default: @@ -523,11 +527,14 @@ stdin_is_interactive = Py_FdIsInteractive(stdin, (char *)0); - if (Py_UnbufferedStdioFlag) { #if defined(MS_WINDOWS) || defined(__CYGWIN__) - _setmode(fileno(stdin), O_BINARY); - _setmode(fileno(stdout), O_BINARY); + /* don't translate newlines (\r\n <=> \n) */ + _setmode(fileno(stdin), O_BINARY); + _setmode(fileno(stdout), O_BINARY); + _setmode(fileno(stderr), O_BINARY); #endif + + if (Py_UnbufferedStdioFlag) { #ifdef HAVE_SETVBUF setvbuf(stdin, (char *)NULL, _IONBF, BUFSIZ); setvbuf(stdout, (char *)NULL, _IONBF, BUFSIZ); @@ -570,7 +577,6 @@ if ((p = Py_GETENV("PYTHONEXECUTABLE")) && *p != '\0') { wchar_t* buffer; size_t len = strlen(p); - size_t r; buffer = malloc(len * sizeof(wchar_t)); if (buffer == NULL) { @@ -578,7 +584,7 @@ "not enough memory to copy PYTHONEXECUTABLE"); } - r = mbstowcs(buffer, p, len); + mbstowcs(buffer, p, len); Py_SetProgramName(buffer); /* buffer is now handed off - do not free */ } else { @@ -589,8 +595,9 @@ #endif Py_Initialize(); - if (Py_VerboseFlag || - (command == NULL && filename == NULL && module == NULL && stdin_is_interactive)) { + if (!Py_QuietFlag && (Py_VerboseFlag || + (command == NULL && filename == NULL && + module == NULL && stdin_is_interactive))) { fprintf(stderr, "Python %s on %s\n", Py_GetVersion(), Py_GetPlatform()); if (!Py_NoSiteFlag) @@ -712,7 +719,6 @@ * trade off slower shutdown for less distraction in the memory * reports. 
-baw */ - _Py_ReleaseInternedStrings(); _Py_ReleaseInternedUnicodeStrings(); #endif /* __INSURE__ */ Modified: python/branches/pep-3151/Modules/md5module.c ============================================================================== --- python/branches/pep-3151/Modules/md5module.c (original) +++ python/branches/pep-3151/Modules/md5module.c Sat Feb 26 08:16:32 2011 @@ -228,9 +228,9 @@ @param inlen The length of the data (octets) */ void md5_process(struct md5_state *md5, - const unsigned char *in, unsigned long inlen) + const unsigned char *in, Py_ssize_t inlen) { - unsigned long n; + Py_ssize_t n; assert(md5 != NULL); assert(in != NULL); Modified: python/branches/pep-3151/Modules/mmapmodule.c ============================================================================== --- python/branches/pep-3151/Modules/mmapmodule.c (original) +++ python/branches/pep-3151/Modules/mmapmodule.c Sat Feb 26 08:16:32 2011 @@ -88,7 +88,11 @@ char * data; size_t size; size_t pos; /* relative to offset */ - size_t offset; +#ifdef MS_WINDOWS + PY_LONG_LONG offset; +#else + off_t offset; +#endif int exports; #ifdef MS_WINDOWS @@ -431,7 +435,11 @@ PyErr_SetFromErrno(PyExc_IOError); return NULL; } - return PyLong_FromSsize_t(buf.st_size); +#ifdef HAVE_LARGEFILE_SUPPORT + return PyLong_FromLongLong(buf.st_size); +#else + return PyLong_FromLong(buf.st_size); +#endif } #endif /* UNIX */ } @@ -465,17 +473,10 @@ CloseHandle(self->map_handle); self->map_handle = NULL; /* Move to the desired EOF position */ -#if SIZEOF_SIZE_T > 4 newSizeHigh = (DWORD)((self->offset + new_size) >> 32); newSizeLow = (DWORD)((self->offset + new_size) & 0xFFFFFFFF); off_hi = (DWORD)(self->offset >> 32); off_lo = (DWORD)(self->offset & 0xFFFFFFFF); -#else - newSizeHigh = 0; - newSizeLow = (DWORD)(self->offset + new_size); - off_hi = 0; - off_lo = (DWORD)self->offset; -#endif SetFilePointer(self->file_handle, newSizeLow, &newSizeHigh, FILE_BEGIN); /* Change the size of the file */ @@ -760,7 +761,7 @@ else if (PySlice_Check(item)) { Py_ssize_t start, stop, step, slicelen; - if (PySlice_GetIndicesEx((PySliceObject *)item, self->size, + if (PySlice_GetIndicesEx(item, self->size, &start, &stop, &step, &slicelen) < 0) { return NULL; } @@ -886,7 +887,7 @@ Py_ssize_t start, stop, step, slicelen; Py_buffer vbuf; - if (PySlice_GetIndicesEx((PySliceObject *)item, + if (PySlice_GetIndicesEx(item, self->size, &start, &stop, &step, &slicelen) < 0) { return -1; @@ -1049,6 +1050,12 @@ } #ifdef UNIX +#ifdef HAVE_LARGEFILE_SUPPORT +#define _Py_PARSE_OFF_T "L" +#else +#define _Py_PARSE_OFF_T "l" +#endif + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { @@ -1056,8 +1063,9 @@ struct stat st; #endif mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + off_t offset = 0; int fd, flags = MAP_SHARED, prot = PROT_WRITE | PROT_READ; int devzero = -1; int access = (int)ACCESS_DEFAULT; @@ -1065,16 +1073,18 @@ "flags", "prot", "access", "offset", NULL}; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iiiO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|iii" _Py_PARSE_OFF_T, keywords, &fd, &map_size_obj, &flags, &prot, - &access, &offset_obj)) + &access, &offset)) return NULL; map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must 
be positive"); return NULL; + } if ((access != (int)ACCESS_DEFAULT) && ((flags != MAP_SHARED) || (prot != (PROT_WRITE | PROT_READ)))) @@ -1114,8 +1124,19 @@ # endif if (fd != -1 && fstat(fd, &st) == 0 && S_ISREG(st.st_mode)) { if (map_size == 0) { - map_size = st.st_size; - } else if ((size_t)offset + (size_t)map_size > st.st_size) { + if (offset >= st.st_size) { + PyErr_SetString(PyExc_ValueError, + "mmap offset is greater than file size"); + return NULL; + } + off_t calc_size = st.st_size - offset; + map_size = calc_size; + if (map_size != calc_size) { + PyErr_SetString(PyExc_ValueError, + "mmap length is too large"); + return NULL; + } + } else if (offset + (size_t)map_size > st.st_size) { PyErr_SetString(PyExc_ValueError, "mmap length is greater than file size"); return NULL; @@ -1176,12 +1197,19 @@ #endif /* UNIX */ #ifdef MS_WINDOWS + +/* A note on sizes and offsets: while the actual map size must hold in a + Py_ssize_t, both the total file size and the start offset can be longer + than a Py_ssize_t, so we use PY_LONG_LONG which is always 64-bit. +*/ + static PyObject * new_mmap_object(PyTypeObject *type, PyObject *args, PyObject *kwdict) { mmap_object *m_obj; - PyObject *map_size_obj = NULL, *offset_obj = NULL; - Py_ssize_t map_size, offset; + PyObject *map_size_obj = NULL; + Py_ssize_t map_size; + PY_LONG_LONG offset = 0, size; DWORD off_hi; /* upper 32 bits of offset */ DWORD off_lo; /* lower 32 bits of offset */ DWORD size_hi; /* upper 32 bits of size */ @@ -1196,9 +1224,9 @@ "tagname", "access", "offset", NULL }; - if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziO", keywords, + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iO|ziL", keywords, &fileno, &map_size_obj, - &tagname, &access, &offset_obj)) { + &tagname, &access, &offset)) { return NULL; } @@ -1223,9 +1251,11 @@ map_size = _GetMapSize(map_size_obj, "size"); if (map_size < 0) return NULL; - offset = _GetMapSize(offset_obj, "offset"); - if (offset < 0) + if (offset < 0) { + PyErr_SetString(PyExc_OverflowError, + "memory mapped offset must be positive"); return NULL; + } /* assume -1 and 0 both mean invalid filedescriptor to 'anonymously' map memory. @@ -1289,21 +1319,26 @@ return PyErr_SetFromWindowsErr(dwErr); } -#if SIZEOF_SIZE_T > 4 - m_obj->size = (((size_t)high)<<32) + low; -#else - if (high) - /* File is too large to map completely */ - m_obj->size = (size_t)-1; + size = (((PY_LONG_LONG) high) << 32) + low; + if (offset >= size) { + PyErr_SetString(PyExc_ValueError, + "mmap offset is greater than file size"); + Py_DECREF(m_obj); + return NULL; + } + if (offset - size > PY_SSIZE_T_MAX) + /* Map area too large to fit in memory */ + m_obj->size = (Py_ssize_t) -1; else - m_obj->size = low; -#endif + m_obj->size = (Py_ssize_t) (size - offset); } else { m_obj->size = map_size; + size = offset + map_size; } } else { m_obj->size = map_size; + size = offset + map_size; } /* set the initial position */ @@ -1324,22 +1359,10 @@ m_obj->tagname = NULL; m_obj->access = (access_mode)access; - /* DWORD is a 4-byte int. If we're on a box where size_t consumes - * more than 4 bytes, we need to break it apart. Else (size_t - * consumes 4 bytes), C doesn't define what happens if we shift - * right by 32, so we need different code. 
- */ -#if SIZEOF_SIZE_T > 4 - size_hi = (DWORD)((offset + m_obj->size) >> 32); - size_lo = (DWORD)((offset + m_obj->size) & 0xFFFFFFFF); + size_hi = (DWORD)(size >> 32); + size_lo = (DWORD)(size & 0xFFFFFFFF); off_hi = (DWORD)(offset >> 32); off_lo = (DWORD)(offset & 0xFFFFFFFF); -#else - size_hi = 0; - size_lo = (DWORD)(offset + m_obj->size); - off_hi = 0; - off_lo = (DWORD)offset; -#endif /* For files, it would be sufficient to pass 0 as size. For anonymous maps, we have to pass the size explicitly. */ m_obj->map_handle = CreateFileMapping(m_obj->file_handle, Modified: python/branches/pep-3151/Modules/parsermodule.c ============================================================================== --- python/branches/pep-3151/Modules/parsermodule.c (original) +++ python/branches/pep-3151/Modules/parsermodule.c Sat Feb 26 08:16:32 2011 @@ -34,10 +34,8 @@ #include "grammar.h" #include "parsetok.h" /* ISTERMINAL() / ISNONTERMINAL() */ -#include "compile.h" #undef Yield #include "ast.h" -#include "pyarena.h" extern grammar _PyParser_Grammar; /* From graminit.c */ @@ -794,6 +792,11 @@ } } temp_str = _PyUnicode_AsStringAndSize(temp, &len); + if (temp_str == NULL) { + Py_DECREF(temp); + Py_XDECREF(elem); + return 0; + } strn = (char *)PyObject_MALLOC(len + 1); if (strn != NULL) (void) memcpy(strn, temp_str, len + 1); @@ -872,6 +875,8 @@ encoding = PySequence_GetItem(tuple, 2); /* tuple isn't borrowed anymore here, need to DECREF */ tuple = PySequence_GetSlice(tuple, 0, 2); + if (tuple == NULL) + return NULL; } res = PyNode_New(num); if (res != NULL) { @@ -883,6 +888,12 @@ Py_ssize_t len; const char *temp; temp = _PyUnicode_AsStringAndSize(encoding, &len); + if (temp == NULL) { + Py_DECREF(res); + Py_DECREF(encoding); + Py_DECREF(tuple); + return NULL; + } res->n_str = (char *)PyObject_MALLOC(len + 1); if (res->n_str != NULL && temp != NULL) (void) memcpy(res->n_str, temp, len + 1); Modified: python/branches/pep-3151/Modules/posixmodule.c ============================================================================== --- python/branches/pep-3151/Modules/posixmodule.c (original) +++ python/branches/pep-3151/Modules/posixmodule.c Sat Feb 26 08:16:32 2011 @@ -28,7 +28,6 @@ #define PY_SSIZE_T_CLEAN #include "Python.h" -#include "structseq.h" #if defined(__VMS) # include @@ -96,6 +95,20 @@ #include #endif +#ifdef HAVE_SYS_SENDFILE_H +#include +#endif + +#if defined(__FreeBSD__) || defined(__DragonFly__) || defined(__APPLE__) +#ifdef HAVE_SYS_SOCKET_H +#include +#endif + +#ifdef HAVE_SYS_UIO_H +#include +#endif +#endif + /* Various compilers have only certain posix functions */ /* XXX Gosh I wish these were all moved into pyconfig.h */ #if defined(PYCC_VACPP) && defined(PYOS_OS2) @@ -122,7 +135,7 @@ #ifdef _MSC_VER /* Microsoft compiler */ #define HAVE_GETCWD 1 #define HAVE_GETPPID 1 -#define HAVE_GETLOGIN 1 +#define HAVE_GETLOGIN 1 #define HAVE_SPAWNV 1 #define HAVE_EXECV 1 #define HAVE_PIPE 1 @@ -278,6 +291,10 @@ #include #include /* for ShellExecute() */ #include /* for UNLEN */ +#ifdef SE_CREATE_SYMBOLIC_LINK_NAME /* Available starting with Vista */ +#define HAVE_SYMLINK +static int win32_can_symlink = 0; +#endif #endif /* _MSC_VER */ #if defined(PYCC_VACPP) && defined(PYOS_OS2) @@ -323,6 +340,8 @@ /* choose the appropriate stat and fstat functions and return structs */ #undef STAT +#undef FSTAT +#undef STRUCT_STAT #if defined(MS_WIN64) || defined(MS_WINDOWS) # define STAT win32_stat # define FSTAT win32_fstat @@ -344,6 +363,20 @@ #endif #endif +static int +_parse_off_t(PyObject* arg, void* addr) +{ 
+#if !defined(HAVE_LARGEFILE_SUPPORT) + *((off_t*)addr) = PyLong_AsLong(arg); +#else + *((off_t*)addr) = PyLong_Check(arg) ? PyLong_AsLongLong(arg) + : PyLong_AsLong(arg); +#endif + if (PyErr_Occurred()) + return 0; + return 1; +} + #if defined _MSC_VER && _MSC_VER >= 1400 /* Microsoft CRT in VS2005 and higher will verify that a filehandle is * valid and throw an assertion if it isn't. @@ -437,6 +470,98 @@ #define _PyVerify_fd_dup2(A, B) (1) #endif +#ifdef MS_WINDOWS +/* The following structure was copied from + http://msdn.microsoft.com/en-us/library/ms791514.aspx as the required + include doesn't seem to be present in the Windows SDK (at least as included + with Visual Studio Express). */ +typedef struct _REPARSE_DATA_BUFFER { + ULONG ReparseTag; + USHORT ReparseDataLength; + USHORT Reserved; + union { + struct { + USHORT SubstituteNameOffset; + USHORT SubstituteNameLength; + USHORT PrintNameOffset; + USHORT PrintNameLength; + ULONG Flags; + WCHAR PathBuffer[1]; + } SymbolicLinkReparseBuffer; + + struct { + USHORT SubstituteNameOffset; + USHORT SubstituteNameLength; + USHORT PrintNameOffset; + USHORT PrintNameLength; + WCHAR PathBuffer[1]; + } MountPointReparseBuffer; + + struct { + UCHAR DataBuffer[1]; + } GenericReparseBuffer; + }; +} REPARSE_DATA_BUFFER, *PREPARSE_DATA_BUFFER; + +#define REPARSE_DATA_BUFFER_HEADER_SIZE FIELD_OFFSET(REPARSE_DATA_BUFFER,\ + GenericReparseBuffer) +#define MAXIMUM_REPARSE_DATA_BUFFER_SIZE ( 16 * 1024 ) + +static int +win32_read_link(HANDLE reparse_point_handle, ULONG *reparse_tag, wchar_t **target_path) +{ + char target_buffer[MAXIMUM_REPARSE_DATA_BUFFER_SIZE]; + REPARSE_DATA_BUFFER *rdb = (REPARSE_DATA_BUFFER *)target_buffer; + DWORD n_bytes_returned; + const wchar_t *ptr; + wchar_t *buf; + size_t len; + + if (0 == DeviceIoControl( + reparse_point_handle, + FSCTL_GET_REPARSE_POINT, + NULL, 0, /* in buffer */ + target_buffer, sizeof(target_buffer), + &n_bytes_returned, + NULL)) /* we're not using OVERLAPPED_IO */ + return -1; + + if (reparse_tag) + *reparse_tag = rdb->ReparseTag; + + if (target_path) { + switch (rdb->ReparseTag) { + case IO_REPARSE_TAG_SYMLINK: + /* XXX: Maybe should use SubstituteName? */ + ptr = rdb->SymbolicLinkReparseBuffer.PathBuffer + + rdb->SymbolicLinkReparseBuffer.PrintNameOffset/sizeof(WCHAR); + len = rdb->SymbolicLinkReparseBuffer.PrintNameLength/sizeof(WCHAR); + break; + case IO_REPARSE_TAG_MOUNT_POINT: + ptr = rdb->MountPointReparseBuffer.PathBuffer + + rdb->MountPointReparseBuffer.SubstituteNameOffset/sizeof(WCHAR); + len = rdb->MountPointReparseBuffer.SubstituteNameLength/sizeof(WCHAR); + break; + default: + SetLastError(ERROR_REPARSE_TAG_MISMATCH); /* XXX: Proper error code? 
*/ + return -1; + } + buf = (wchar_t *)malloc(sizeof(wchar_t)*(len+1)); + if (!buf) { + SetLastError(ERROR_OUTOFMEMORY); + return -1; + } + wcsncpy(buf, ptr, len); + buf[len] = L'\0'; + if (wcsncmp(buf, L"\\??\\", 4) == 0) + buf[1] = L'\\'; + *target_path = buf; + } + + return 0; +} +#endif /* MS_WINDOWS */ + /* Return a dictionary corresponding to the POSIX environment table */ #ifdef WITH_NEXT_FRAMEWORK /* On Darwin/MacOSX a shared library or framework has no access to @@ -881,18 +1006,18 @@ int st_gid; int st_rdev; __int64 st_size; - int st_atime; + time_t st_atime; int st_atime_nsec; - int st_mtime; + time_t st_mtime; int st_mtime_nsec; - int st_ctime; + time_t st_ctime; int st_ctime_nsec; }; static __int64 secs_between_epochs = 11644473600; /* Seconds between 1.1.1601 and 1.1.1970 */ static void -FILE_TIME_to_time_t_nsec(FILETIME *in_ptr, int *time_out, int* nsec_out) +FILE_TIME_to_time_t_nsec(FILETIME *in_ptr, time_t *time_out, int* nsec_out) { /* XXX endianness. Shouldn't matter, as all Windows implementations are little-endian */ /* Cannot simply cast and dereference in_ptr, @@ -900,12 +1025,11 @@ __int64 in; memcpy(&in, in_ptr, sizeof(in)); *nsec_out = (int)(in % 10000000) * 100; /* FILETIME is in units of 100 nsec. */ - /* XXX Win32 supports time stamps past 2038; we currently don't */ - *time_out = Py_SAFE_DOWNCAST((in / 10000000) - secs_between_epochs, __int64, int); + *time_out = Py_SAFE_DOWNCAST((in / 10000000) - secs_between_epochs, __int64, time_t); } static void -time_t_to_FILE_TIME(int time_in, int nsec_in, FILETIME *out_ptr) +time_t_to_FILE_TIME(time_t time_in, int nsec_in, FILETIME *out_ptr) { /* XXX endianness */ __int64 out; @@ -934,7 +1058,7 @@ } static int -attribute_data_to_stat(WIN32_FILE_ATTRIBUTE_DATA *info, struct win32_stat *result) +attribute_data_to_stat(BY_HANDLE_FILE_INFORMATION *info, ULONG reparse_tag, struct win32_stat *result) { memset(result, 0, sizeof(*result)); result->st_mode = attributes_to_mode(info->dwFileAttributes); @@ -942,12 +1066,20 @@ FILE_TIME_to_time_t_nsec(&info->ftCreationTime, &result->st_ctime, &result->st_ctime_nsec); FILE_TIME_to_time_t_nsec(&info->ftLastWriteTime, &result->st_mtime, &result->st_mtime_nsec); FILE_TIME_to_time_t_nsec(&info->ftLastAccessTime, &result->st_atime, &result->st_atime_nsec); + result->st_nlink = info->nNumberOfLinks; + result->st_ino = (((__int64)info->nFileIndexHigh)<<32) + info->nFileIndexLow; + if (reparse_tag == IO_REPARSE_TAG_SYMLINK) { + /* first clear the S_IFMT bits */ + result->st_mode ^= (result->st_mode & 0170000); + /* now set the bits that make this a symlink */ + result->st_mode |= 0120000; + } return 0; } static BOOL -attributes_from_dir(LPCSTR pszFile, LPWIN32_FILE_ATTRIBUTE_DATA pfad) +attributes_from_dir(LPCSTR pszFile, BY_HANDLE_FILE_INFORMATION *info, ULONG *reparse_tag) { HANDLE hFindFile; WIN32_FIND_DATAA FileData; @@ -955,17 +1087,22 @@ if (hFindFile == INVALID_HANDLE_VALUE) return FALSE; FindClose(hFindFile); - pfad->dwFileAttributes = FileData.dwFileAttributes; - pfad->ftCreationTime = FileData.ftCreationTime; - pfad->ftLastAccessTime = FileData.ftLastAccessTime; - pfad->ftLastWriteTime = FileData.ftLastWriteTime; - pfad->nFileSizeHigh = FileData.nFileSizeHigh; - pfad->nFileSizeLow = FileData.nFileSizeLow; + memset(info, 0, sizeof(*info)); + *reparse_tag = 0; + info->dwFileAttributes = FileData.dwFileAttributes; + info->ftCreationTime = FileData.ftCreationTime; + info->ftLastAccessTime = FileData.ftLastAccessTime; + info->ftLastWriteTime = FileData.ftLastWriteTime; + 
info->nFileSizeHigh = FileData.nFileSizeHigh; + info->nFileSizeLow = FileData.nFileSizeLow; +/* info->nNumberOfLinks = 1; */ + if (FileData.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) + *reparse_tag = FileData.dwReserved0; return TRUE; } static BOOL -attributes_from_dir_w(LPCWSTR pszFile, LPWIN32_FILE_ATTRIBUTE_DATA pfad) +attributes_from_dir_w(LPCWSTR pszFile, BY_HANDLE_FILE_INFORMATION *info, ULONG *reparse_tag) { HANDLE hFindFile; WIN32_FIND_DATAW FileData; @@ -973,178 +1110,40 @@ if (hFindFile == INVALID_HANDLE_VALUE) return FALSE; FindClose(hFindFile); - pfad->dwFileAttributes = FileData.dwFileAttributes; - pfad->ftCreationTime = FileData.ftCreationTime; - pfad->ftLastAccessTime = FileData.ftLastAccessTime; - pfad->ftLastWriteTime = FileData.ftLastWriteTime; - pfad->nFileSizeHigh = FileData.nFileSizeHigh; - pfad->nFileSizeLow = FileData.nFileSizeLow; + memset(info, 0, sizeof(*info)); + *reparse_tag = 0; + info->dwFileAttributes = FileData.dwFileAttributes; + info->ftCreationTime = FileData.ftCreationTime; + info->ftLastAccessTime = FileData.ftLastAccessTime; + info->ftLastWriteTime = FileData.ftLastWriteTime; + info->nFileSizeHigh = FileData.nFileSizeHigh; + info->nFileSizeLow = FileData.nFileSizeLow; +/* info->nNumberOfLinks = 1; */ + if (FileData.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) + *reparse_tag = FileData.dwReserved0; return TRUE; } -/* About the following functions: win32_lstat, win32_lstat_w, win32_stat, - win32_stat_w - - In Posix, stat automatically traverses symlinks and returns the stat - structure for the target. In Windows, the equivalent GetFileAttributes by - default does not traverse symlinks and instead returns attributes for - the symlink. - - Therefore, win32_lstat will get the attributes traditionally, and - win32_stat will first explicitly resolve the symlink target and then will - call win32_lstat on that result. - - The _w represent Unicode equivalents of the aformentioned ANSI functions. */ - -static int -win32_lstat(const char* path, struct win32_stat *result) -{ - WIN32_FILE_ATTRIBUTE_DATA info; - int code; - char *dot; - WIN32_FIND_DATAA find_data; - HANDLE find_data_handle; - if (!GetFileAttributesExA(path, GetFileExInfoStandard, &info)) { - if (GetLastError() != ERROR_SHARING_VIOLATION) { - /* Protocol violation: we explicitly clear errno, instead of - setting it to a POSIX error. Callers should use GetLastError. */ - errno = 0; - return -1; - } else { - /* Could not get attributes on open file. Fall back to - reading the directory. */ - if (!attributes_from_dir(path, &info)) { - /* Very strange. This should not fail now */ - errno = 0; - return -1; - } - } - } - - code = attribute_data_to_stat(&info, result); - if (code != 0) - return code; - - /* Get WIN32_FIND_DATA structure for the path to determine if - it is a symlink */ - if(info.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) { - find_data_handle = FindFirstFileA(path, &find_data); - if(find_data_handle != INVALID_HANDLE_VALUE) { - if(find_data.dwReserved0 == IO_REPARSE_TAG_SYMLINK) { - /* first clear the S_IFMT bits */ - result->st_mode ^= (result->st_mode & 0170000); - /* now set the bits that make this a symlink */ - result->st_mode |= 0120000; - } - FindClose(find_data_handle); - } - } - - /* Set S_IFEXEC if it is an .exe, .bat, ... 
*/ - dot = strrchr(path, '.'); - if (dot) { - if (stricmp(dot, ".bat") == 0 || stricmp(dot, ".cmd") == 0 || - stricmp(dot, ".exe") == 0 || stricmp(dot, ".com") == 0) - result->st_mode |= 0111; - } - return code; -} - -static int -win32_lstat_w(const wchar_t* path, struct win32_stat *result) -{ - int code; - const wchar_t *dot; - WIN32_FILE_ATTRIBUTE_DATA info; - WIN32_FIND_DATAW find_data; - HANDLE find_data_handle; - if (!GetFileAttributesExW(path, GetFileExInfoStandard, &info)) { - if (GetLastError() != ERROR_SHARING_VIOLATION) { - /* Protocol violation: we explicitly clear errno, instead of - setting it to a POSIX error. Callers should use GetLastError. */ - errno = 0; - return -1; - } else { - /* Could not get attributes on open file. Fall back to reading - the directory. */ - if (!attributes_from_dir_w(path, &info)) { - /* Very strange. This should not fail now */ - errno = 0; - return -1; - } - } - } - code = attribute_data_to_stat(&info, result); - if (code < 0) - return code; - - /* Get WIN32_FIND_DATA structure for the path to determine if - it is a symlink */ - if(info.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) { - find_data_handle = FindFirstFileW(path, &find_data); - if(find_data_handle != INVALID_HANDLE_VALUE) { - if(find_data.dwReserved0 == IO_REPARSE_TAG_SYMLINK) { - /* first clear the S_IFMT bits */ - result->st_mode ^= (result->st_mode & 0170000); - /* now set the bits that make this a symlink */ - result->st_mode |= 0120000; - } - FindClose(find_data_handle); - } - } - - /* Set IFEXEC if it is an .exe, .bat, ... */ - dot = wcsrchr(path, '.'); - if (dot) { - if (_wcsicmp(dot, L".bat") == 0 || _wcsicmp(dot, L".cmd") == 0 || - _wcsicmp(dot, L".exe") == 0 || _wcsicmp(dot, L".com") == 0) - result->st_mode |= 0111; - } - return code; -} +#ifndef SYMLOOP_MAX +#define SYMLOOP_MAX ( 88 ) +#endif -/* Grab GetFinalPathNameByHandle dynamically from kernel32 */ -static int has_GetFinalPathNameByHandle = 0; -static DWORD (CALLBACK *Py_GetFinalPathNameByHandleA)(HANDLE, LPSTR, DWORD, - DWORD); -static DWORD (CALLBACK *Py_GetFinalPathNameByHandleW)(HANDLE, LPWSTR, DWORD, - DWORD); static int -check_GetFinalPathNameByHandle() -{ - HINSTANCE hKernel32; - /* only recheck */ - if (!has_GetFinalPathNameByHandle) - { - hKernel32 = GetModuleHandle("KERNEL32"); - *(FARPROC*)&Py_GetFinalPathNameByHandleA = GetProcAddress(hKernel32, - "GetFinalPathNameByHandleA"); - *(FARPROC*)&Py_GetFinalPathNameByHandleW = GetProcAddress(hKernel32, - "GetFinalPathNameByHandleW"); - has_GetFinalPathNameByHandle = Py_GetFinalPathNameByHandleA && - Py_GetFinalPathNameByHandleW; - } - return has_GetFinalPathNameByHandle; -} +win32_xstat_impl_w(const wchar_t *path, struct win32_stat *result, BOOL traverse, int depth); static int -win32_stat(const char* path, struct win32_stat *result) +win32_xstat_impl(const char *path, struct win32_stat *result, BOOL traverse, int depth) { - /* Traverse the symlink to the target using - GetFinalPathNameByHandle() - */ int code; HANDLE hFile; - int buf_size; - char *target_path; - int result_length; - WIN32_FILE_ATTRIBUTE_DATA info; - - if(!check_GetFinalPathNameByHandle()) { - /* if the OS doesn't have GetFinalPathNameByHandle, it doesn't - have symlinks, so just fall back to the traditional behavior - found in lstat. */ - return win32_lstat(path, result); + BY_HANDLE_FILE_INFORMATION info; + ULONG reparse_tag = 0; + wchar_t *target_path; + const char *dot; + + if (depth > SYMLOOP_MAX) { + SetLastError(ERROR_CANT_RESOLVE_FILENAME); /* XXX: ELOOP? 
*/ + return -1; } hFile = CreateFileA( @@ -1154,75 +1153,70 @@ NULL, /* security attributes */ OPEN_EXISTING, /* FILE_FLAG_BACKUP_SEMANTICS is required to open a directory */ - FILE_ATTRIBUTE_NORMAL|FILE_FLAG_BACKUP_SEMANTICS, + FILE_ATTRIBUTE_NORMAL|FILE_FLAG_BACKUP_SEMANTICS|FILE_FLAG_OPEN_REPARSE_POINT, NULL); - - if(hFile == INVALID_HANDLE_VALUE) { + + if (hFile == INVALID_HANDLE_VALUE) { /* Either the target doesn't exist, or we don't have access to get a handle to it. If the former, we need to return an error. If the latter, we can use attributes_from_dir. */ - if (GetLastError() != ERROR_SHARING_VIOLATION) { - /* Protocol violation: we explicitly clear errno, instead of - setting it to a POSIX error. Callers should use GetLastError. */ - errno = 0; + if (GetLastError() != ERROR_SHARING_VIOLATION) return -1; - } else { - /* Could not get attributes on open file. Fall back to - reading the directory. */ - if (!attributes_from_dir(path, &info)) { - /* Very strange. This should not fail now */ - errno = 0; + /* Could not get attributes on open file. Fall back to + reading the directory. */ + if (!attributes_from_dir(path, &info, &reparse_tag)) + /* Very strange. This should not fail now */ + return -1; + if (info.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) { + if (traverse) { + /* Should traverse, but could not open reparse point handle */ + SetLastError(ERROR_SHARING_VIOLATION); return -1; } } - code = attribute_data_to_stat(&info, result); + } else { + if (!GetFileInformationByHandle(hFile, &info)) { + CloseHandle(hFile); + return -1;; + } + if (info.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) { + code = win32_read_link(hFile, &reparse_tag, traverse ? &target_path : NULL); + CloseHandle(hFile); + if (code < 0) + return code; + if (traverse) { + code = win32_xstat_impl_w(target_path, result, traverse, depth + 1); + free(target_path); + return code; + } + } else + CloseHandle(hFile); } - else { - /* We have a good handle to the target, use it to determine the target - path name (then we'll call lstat on it). */ - buf_size = Py_GetFinalPathNameByHandleA(hFile, 0, 0, VOLUME_NAME_DOS); - if(!buf_size) return -1; - /* Due to a slight discrepancy between GetFinalPathNameByHandleA - and GetFinalPathNameByHandleW, we must allocate one more byte - than reported. */ - target_path = (char *)malloc((buf_size+2)*sizeof(char)); - result_length = Py_GetFinalPathNameByHandleA(hFile, target_path, - buf_size+1, VOLUME_NAME_DOS); - - if(!result_length) { - free(target_path); - return -1; - } - - if(!CloseHandle(hFile)) { - free(target_path); - return -1; - } + attribute_data_to_stat(&info, reparse_tag, result); - target_path[result_length] = 0; - code = win32_lstat(target_path, result); - free(target_path); + /* Set S_IEXEC if it is an .exe, .bat, ... 
*/ + dot = strrchr(path, '.'); + if (dot) { + if (stricmp(dot, ".bat") == 0 || stricmp(dot, ".cmd") == 0 || + stricmp(dot, ".exe") == 0 || stricmp(dot, ".com") == 0) + result->st_mode |= 0111; } - - return code; + return 0; } -static int -win32_stat_w(const wchar_t* path, struct win32_stat *result) +static int +win32_xstat_impl_w(const wchar_t *path, struct win32_stat *result, BOOL traverse, int depth) { - /* Traverse the symlink to the target using GetFinalPathNameByHandle() */ int code; HANDLE hFile; - int buf_size; - wchar_t *target_path; - int result_length; - WIN32_FILE_ATTRIBUTE_DATA info; - - if(!check_GetFinalPathNameByHandle()) { - /* If the OS doesn't have GetFinalPathNameByHandle, it doesn't have - symlinks, so just fall back to the traditional behavior found - in lstat. */ - return win32_lstat_w(path, result); + BY_HANDLE_FILE_INFORMATION info; + ULONG reparse_tag = 0; + wchar_t *target_path; + const wchar_t *dot; + + if (depth > SYMLOOP_MAX) { + SetLastError(ERROR_CANT_RESOLVE_FILENAME); /* XXX: ELOOP? */ + return -1; } hFile = CreateFileW( @@ -1232,75 +1226,132 @@ NULL, /* security attributes */ OPEN_EXISTING, /* FILE_FLAG_BACKUP_SEMANTICS is required to open a directory */ - FILE_ATTRIBUTE_NORMAL|FILE_FLAG_BACKUP_SEMANTICS, + FILE_ATTRIBUTE_NORMAL|FILE_FLAG_BACKUP_SEMANTICS|FILE_FLAG_OPEN_REPARSE_POINT, NULL); - if(hFile == INVALID_HANDLE_VALUE) { + if (hFile == INVALID_HANDLE_VALUE) { /* Either the target doesn't exist, or we don't have access to get a handle to it. If the former, we need to return an error. If the latter, we can use attributes_from_dir. */ - if (GetLastError() != ERROR_SHARING_VIOLATION) { - /* Protocol violation: we explicitly clear errno, instead of - setting it to a POSIX error. Callers should use GetLastError. */ - errno = 0; + if (GetLastError() != ERROR_SHARING_VIOLATION) return -1; - } else { - /* Could not get attributes on open file. Fall back to - reading the directory. */ - if (!attributes_from_dir_w(path, &info)) { - /* Very strange. This should not fail now */ - errno = 0; + /* Could not get attributes on open file. Fall back to + reading the directory. */ + if (!attributes_from_dir_w(path, &info, &reparse_tag)) + /* Very strange. This should not fail now */ + return -1; + if (info.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) { + if (traverse) { + /* Should traverse, but could not open reparse point handle */ + SetLastError(ERROR_SHARING_VIOLATION); return -1; } } - code = attribute_data_to_stat(&info, result); + } else { + if (!GetFileInformationByHandle(hFile, &info)) { + CloseHandle(hFile); + return -1;; + } + if (info.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) { + code = win32_read_link(hFile, &reparse_tag, traverse ? &target_path : NULL); + CloseHandle(hFile); + if (code < 0) + return code; + if (traverse) { + code = win32_xstat_impl_w(target_path, result, traverse, depth + 1); + free(target_path); + return code; + } + } else + CloseHandle(hFile); } - else { - /* We have a good handle to the target, use it to determine the target - path name (then we'll call lstat on it). 
*/ - buf_size = Py_GetFinalPathNameByHandleW(hFile, 0, 0, VOLUME_NAME_DOS); - if(!buf_size) - return -1; - - target_path = (wchar_t *)malloc((buf_size+1)*sizeof(wchar_t)); - result_length = Py_GetFinalPathNameByHandleW(hFile, target_path, - buf_size, VOLUME_NAME_DOS); - - if(!result_length) { - free(target_path); - return -1; - } - - if(!CloseHandle(hFile)) { - free(target_path); - return -1; - } + attribute_data_to_stat(&info, reparse_tag, result); - target_path[result_length] = 0; - code = win32_lstat_w(target_path, result); - free(target_path); + /* Set S_IEXEC if it is an .exe, .bat, ... */ + dot = wcsrchr(path, '.'); + if (dot) { + if (_wcsicmp(dot, L".bat") == 0 || _wcsicmp(dot, L".cmd") == 0 || + _wcsicmp(dot, L".exe") == 0 || _wcsicmp(dot, L".com") == 0) + result->st_mode |= 0111; } - - return code; + return 0; } static int -win32_fstat(int file_number, struct win32_stat *result) +win32_xstat(const char *path, struct win32_stat *result, BOOL traverse) { - BY_HANDLE_FILE_INFORMATION info; - HANDLE h; - int type; - - h = (HANDLE)_get_osfhandle(file_number); + /* Protocol violation: we explicitly clear errno, instead of + setting it to a POSIX error. Callers should use GetLastError. */ + int code = win32_xstat_impl(path, result, traverse, 0); + errno = 0; + return code; +} +static int +win32_xstat_w(const wchar_t *path, struct win32_stat *result, BOOL traverse) +{ /* Protocol violation: we explicitly clear errno, instead of setting it to a POSIX error. Callers should use GetLastError. */ + int code = win32_xstat_impl_w(path, result, traverse, 0); errno = 0; + return code; +} - if (h == INVALID_HANDLE_VALUE) { - /* This is really a C library error (invalid file handle). - We set the Win32 error to the closes one matching. */ - SetLastError(ERROR_INVALID_HANDLE); +/* About the following functions: win32_lstat, win32_lstat_w, win32_stat, + win32_stat_w + + In Posix, stat automatically traverses symlinks and returns the stat + structure for the target. In Windows, the equivalent GetFileAttributes by + default does not traverse symlinks and instead returns attributes for + the symlink. + + Therefore, win32_lstat will get the attributes traditionally, and + win32_stat will first explicitly resolve the symlink target and then will + call win32_lstat on that result. + + The _w represent Unicode equivalents of the aformentioned ANSI functions. */ + +static int +win32_lstat(const char* path, struct win32_stat *result) +{ + return win32_xstat(path, result, FALSE); +} + +static int +win32_lstat_w(const wchar_t* path, struct win32_stat *result) +{ + return win32_xstat_w(path, result, FALSE); +} + +static int +win32_stat(const char* path, struct win32_stat *result) +{ + return win32_xstat(path, result, TRUE); +} + +static int +win32_stat_w(const wchar_t* path, struct win32_stat *result) +{ + return win32_xstat_w(path, result, TRUE); +} + +static int +win32_fstat(int file_number, struct win32_stat *result) +{ + BY_HANDLE_FILE_INFORMATION info; + HANDLE h; + int type; + + h = (HANDLE)_get_osfhandle(file_number); + + /* Protocol violation: we explicitly clear errno, instead of + setting it to a POSIX error. Callers should use GetLastError. */ + errno = 0; + + if (h == INVALID_HANDLE_VALUE) { + /* This is really a C library error (invalid file handle). + We set the Win32 error to the closes one matching. 
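
At the Python level, the win32_xstat machinery above is what makes os.stat() traverse a symbolic link on Windows (up to SYMLOOP_MAX levels) while os.lstat() describes the link itself. A sketch, assuming a Vista-or-later system where the process is allowed to create symlinks and a file target.txt (hypothetical) already exists:

    import os, stat

    os.symlink("target.txt", "link.txt")
    print(stat.S_ISLNK(os.lstat("link.txt").st_mode))   # True:  the link itself
    print(stat.S_ISLNK(os.stat("link.txt").st_mode))    # False: the traversed target
    os.remove("link.txt")
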
*/ + SetLastError(ERROR_INVALID_HANDLE); return -1; } memset(result, 0, sizeof(*result)); @@ -1309,7 +1360,7 @@ if (type == FILE_TYPE_UNKNOWN) { DWORD error = GetLastError(); if (error != 0) { - return -1; + return -1; } /* else: valid but unknown file */ } @@ -1326,17 +1377,8 @@ return -1; } - /* similar to stat() */ - result->st_mode = attributes_to_mode(info.dwFileAttributes); - result->st_size = (((__int64)info.nFileSizeHigh)<<32) + info.nFileSizeLow; - FILE_TIME_to_time_t_nsec(&info.ftCreationTime, &result->st_ctime, - &result->st_ctime_nsec); - FILE_TIME_to_time_t_nsec(&info.ftLastWriteTime, &result->st_mtime, - &result->st_mtime_nsec); - FILE_TIME_to_time_t_nsec(&info.ftLastAccessTime, &result->st_atime, - &result->st_atime_nsec); + attribute_data_to_stat(&info, 0, result); /* specific to fstat() */ - result->st_nlink = info.nNumberOfLinks; result->st_ino = (((__int64)info.nFileIndexHigh)<<32) + info.nFileIndexLow; return 0; } @@ -2057,7 +2099,7 @@ static PyObject * posix_fsync(PyObject *self, PyObject *fdobj) { - return posix_fildes(fdobj, fsync); + return posix_fildes(fdobj, fsync); } #endif /* HAVE_FSYNC */ @@ -2075,7 +2117,7 @@ static PyObject * posix_fdatasync(PyObject *self, PyObject *fdobj) { - return posix_fildes(fdobj, fdatasync); + return posix_fildes(fdobj, fdatasync); } #endif /* HAVE_FDATASYNC */ @@ -2247,6 +2289,54 @@ } #endif /* HAVE_LINK */ +#ifdef MS_WINDOWS +PyDoc_STRVAR(win32_link__doc__, +"link(src, dst)\n\n\ +Create a hard link to a file."); + +static PyObject * +win32_link(PyObject *self, PyObject *args) +{ + PyObject *osrc, *odst; + char *src, *dst; + BOOL rslt; + + PyUnicodeObject *usrc, *udst; + if (PyArg_ParseTuple(args, "UU:link", &usrc, &udst)) { + Py_BEGIN_ALLOW_THREADS + rslt = CreateHardLinkW(PyUnicode_AS_UNICODE(udst), + PyUnicode_AS_UNICODE(usrc), NULL); + Py_END_ALLOW_THREADS + + if (rslt == 0) + return win32_error("link", NULL); + + Py_RETURN_NONE; + } + + /* Narrow strings also valid. */ + PyErr_Clear(); + + if (!PyArg_ParseTuple(args, "O&O&:link", PyUnicode_FSConverter, &osrc, + PyUnicode_FSConverter, &odst)) + return NULL; + + src = PyBytes_AsString(osrc); + dst = PyBytes_AsString(odst); + + Py_BEGIN_ALLOW_THREADS + rslt = CreateHardLinkA(dst, src, NULL); + Py_END_ALLOW_THREADS + + Py_DECREF(osrc); + Py_DECREF(odst); + if (rslt == 0) + return win32_error("link", NULL); + + Py_RETURN_NONE; +} +#endif /* MS_WINDOWS */ + PyDoc_STRVAR(posix_listdir__doc__, "listdir([path]) -> list_of_strings\n\n\ @@ -2277,7 +2367,7 @@ if (PyArg_ParseTuple(args, "|U:listdir", &po)) { WIN32_FIND_DATAW wFileData; Py_UNICODE *wnamebuf, *po_wchars; - + if (po == NULL) { /* Default arg: "." */ po_wchars = L"."; len = 1; @@ -2371,6 +2461,7 @@ } strcpy(namebuf, PyBytes_AsString(opath)); len = PyObject_Size(opath); + Py_DECREF(opath); if (len > 0) { char ch = namebuf[len-1]; if (ch != SEP && ch != ALTSEP && ch != ':') @@ -2526,7 +2617,7 @@ if (!PyArg_ParseTuple(args, "|O&:listdir", PyUnicode_FSConverter, &oname)) return NULL; if (oname == NULL) { /* Default arg: "." 
*/ - oname = PyBytes_FromString("."); + oname = PyBytes_FromString("."); } name = PyBytes_AsString(oname); Py_BEGIN_ALLOW_THREADS @@ -2587,6 +2678,76 @@ #endif /* which OS */ } /* end of posix_listdir */ +#ifdef HAVE_FDOPENDIR +PyDoc_STRVAR(posix_fdlistdir__doc__, +"fdlistdir(fd) -> list_of_strings\n\n\ +Like listdir(), but uses a file descriptor instead.\n\ +After succesful execution of this function, fd will be closed."); + +static PyObject * +posix_fdlistdir(PyObject *self, PyObject *args) +{ + PyObject *d, *v; + DIR *dirp; + struct dirent *ep; + int fd; + + errno = 0; + if (!PyArg_ParseTuple(args, "i:fdlistdir", &fd)) + return NULL; + Py_BEGIN_ALLOW_THREADS + dirp = fdopendir(fd); + Py_END_ALLOW_THREADS + if (dirp == NULL) { + close(fd); + return posix_error(); + } + if ((d = PyList_New(0)) == NULL) { + Py_BEGIN_ALLOW_THREADS + closedir(dirp); + Py_END_ALLOW_THREADS + return NULL; + } + for (;;) { + errno = 0; + Py_BEGIN_ALLOW_THREADS + ep = readdir(dirp); + Py_END_ALLOW_THREADS + if (ep == NULL) { + if (errno == 0) { + break; + } else { + Py_BEGIN_ALLOW_THREADS + closedir(dirp); + Py_END_ALLOW_THREADS + Py_DECREF(d); + return posix_error(); + } + } + if (ep->d_name[0] == '.' && + (NAMLEN(ep) == 1 || + (ep->d_name[1] == '.' && NAMLEN(ep) == 2))) + continue; + v = PyUnicode_DecodeFSDefaultAndSize(ep->d_name, NAMLEN(ep)); + if (v == NULL) { + Py_CLEAR(d); + break; + } + if (PyList_Append(d, v) != 0) { + Py_DECREF(v); + Py_CLEAR(d); + break; + } + Py_DECREF(v); + } + Py_BEGIN_ALLOW_THREADS + closedir(dirp); + Py_END_ALLOW_THREADS + + return d; +} +#endif + #ifdef MS_WINDOWS /* A helper function for abspath on win32 */ static PyObject * @@ -2644,6 +2805,30 @@ return PyBytes_FromString(outbuf); } /* end of posix__getfullpathname */ +/* Grab GetFinalPathNameByHandle dynamically from kernel32 */ +static int has_GetFinalPathNameByHandle = 0; +static DWORD (CALLBACK *Py_GetFinalPathNameByHandleA)(HANDLE, LPSTR, DWORD, + DWORD); +static DWORD (CALLBACK *Py_GetFinalPathNameByHandleW)(HANDLE, LPWSTR, DWORD, + DWORD); +static int +check_GetFinalPathNameByHandle() +{ + HINSTANCE hKernel32; + /* only recheck */ + if (!has_GetFinalPathNameByHandle) + { + hKernel32 = GetModuleHandle("KERNEL32"); + *(FARPROC*)&Py_GetFinalPathNameByHandleA = GetProcAddress(hKernel32, + "GetFinalPathNameByHandleA"); + *(FARPROC*)&Py_GetFinalPathNameByHandleW = GetProcAddress(hKernel32, + "GetFinalPathNameByHandleW"); + has_GetFinalPathNameByHandle = Py_GetFinalPathNameByHandleA && + Py_GetFinalPathNameByHandleW; + } + return has_GetFinalPathNameByHandle; +} + /* A helper function for samepath on windows */ static PyObject * posix__getfinalpathname(PyObject *self, PyObject *args) @@ -2654,7 +2839,7 @@ int result_length; PyObject *result; wchar_t *path; - + if (!PyArg_ParseTuple(args, "u|:_getfinalpathname", &path)) { return NULL; } @@ -2675,7 +2860,7 @@ /* FILE_FLAG_BACKUP_SEMANTICS is required to open a directory */ FILE_FLAG_BACKUP_SEMANTICS, NULL); - + if(hFile == INVALID_HANDLE_VALUE) { return win32_error_unicode("GetFinalPathNamyByHandle", path); return PyErr_Format(PyExc_RuntimeError, @@ -2718,14 +2903,12 @@ if (!PyArg_ParseTuple(args, "i:_getfileinformation", &fd)) return NULL; - if (!_PyVerify_fd(fd)) { - PyErr_SetString(PyExc_ValueError, "received invalid file descriptor"); - return NULL; - } + if (!_PyVerify_fd(fd)) + return posix_error(); hFile = (HANDLE)_get_osfhandle(fd); if (hFile == INVALID_HANDLE_VALUE) - return win32_error("_getfileinformation", NULL); + return posix_error(); if 
(!GetFileInformationByHandle(hFile, &info)) return win32_error("_getfileinformation", NULL); @@ -2845,6 +3028,52 @@ } #endif /* HAVE_NICE */ + +#ifdef HAVE_GETPRIORITY +PyDoc_STRVAR(posix_getpriority__doc__, +"getpriority(which, who) -> current_priority\n\n\ +Get program scheduling priority."); + +static PyObject * +posix_getpriority(PyObject *self, PyObject *args) +{ + int which, who, retval; + + if (!PyArg_ParseTuple(args, "ii", &which, &who)) + return NULL; + errno = 0; + Py_BEGIN_ALLOW_THREADS + retval = getpriority(which, who); + Py_END_ALLOW_THREADS + if (errno != 0) + return posix_error(); + return PyLong_FromLong((long)retval); +} +#endif /* HAVE_GETPRIORITY */ + + +#ifdef HAVE_SETPRIORITY +PyDoc_STRVAR(posix_setpriority__doc__, +"setpriority(which, who, prio) -> None\n\n\ +Set program scheduling priority."); + +static PyObject * +posix_setpriority(PyObject *self, PyObject *args) +{ + int which, who, prio, retval; + + if (!PyArg_ParseTuple(args, "iii", &which, &who, &prio)) + return NULL; + Py_BEGIN_ALLOW_THREADS + retval = setpriority(which, who, prio); + Py_END_ALLOW_THREADS + if (retval == -1) + return posix_error(); + Py_RETURN_NONE; +} +#endif /* HAVE_SETPRIORITY */ + + PyDoc_STRVAR(posix_rename__doc__, "rename(old, new)\n\n\ Rename a file or directory."); @@ -2987,7 +3216,7 @@ if (GetFileAttributesExW(lpFileName, GetFileExInfoStandard, &info)) { is_directory = info.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY; - + /* Get WIN32_FIND_DATA structure for the path to determine if it is a symlink */ if(is_directory && @@ -3054,15 +3283,19 @@ #endif /* HAVE_UNAME */ static int -extract_time(PyObject *t, long* sec, long* usec) +extract_time(PyObject *t, time_t* sec, long* usec) { - long intval; + time_t intval; if (PyFloat_Check(t)) { double tval = PyFloat_AsDouble(t); - PyObject *intobj = Py_TYPE(t)->tp_as_number->nb_int(t); + PyObject *intobj = PyNumber_Long(t); if (!intobj) return -1; +#if SIZEOF_TIME_T > SIZEOF_LONG + intval = PyLong_AsUnsignedLongLongMask(intobj); +#else intval = PyLong_AsLong(intobj); +#endif Py_DECREF(intobj); if (intval == -1 && PyErr_Occurred()) return -1; @@ -3074,7 +3307,11 @@ *usec = 0; return 0; } +#if SIZEOF_TIME_T > SIZEOF_LONG + intval = PyLong_AsUnsignedLongLongMask(t); +#else intval = PyLong_AsLong(t); +#endif if (intval == -1 && PyErr_Occurred()) return -1; *sec = intval; @@ -3098,7 +3335,8 @@ PyObject *oapath; char *apath; HANDLE hFile; - long atimesec, mtimesec, ausec, musec; + time_t atimesec, mtimesec; + long ausec, musec; FILETIME atime, mtime; PyObject *result = NULL; @@ -3141,7 +3379,7 @@ !SystemTimeToFileTime(&now, &atime)) { win32_error("utime", NULL); goto done; - } + } } else if (!PyTuple_Check(arg) || PyTuple_Size(arg) != 2) { PyErr_SetString(PyExc_TypeError, @@ -3164,6 +3402,7 @@ something is wrong with the file, when it also could be the time stamp that gives a problem. */ win32_error("utime", NULL); + goto done; } Py_INCREF(Py_None); result = Py_None; @@ -3174,7 +3413,8 @@ PyObject *opath; char *path; - long atime, mtime, ausec, musec; + time_t atime, mtime; + long ausec, musec; int res; PyObject* arg; @@ -4166,7 +4406,7 @@ #endif gid_t grouplist[MAX_GROUPS]; - /* On MacOSX getgroups(2) can return more than MAX_GROUPS results + /* On MacOSX getgroups(2) can return more than MAX_GROUPS results * This is a helper variable to store the intermediate result when * that happens. 
* @@ -4209,9 +4449,9 @@ for (i = 0; i < n; ++i) { PyObject *o = PyLong_FromLong((long)alt_grouplist[i]); if (o == NULL) { - Py_DECREF(result); - result = NULL; - break; + Py_DECREF(result); + result = NULL; + break; } PyList_SET_ITEM(result, i, o); } @@ -4382,15 +4622,15 @@ posix_getlogin(PyObject *self, PyObject *noargs) { PyObject *result = NULL; -#ifdef MS_WINDOWS +#ifdef MS_WINDOWS wchar_t user_name[UNLEN + 1]; DWORD num_chars = sizeof(user_name)/sizeof(user_name[0]); if (GetUserNameW(user_name, &num_chars)) { /* num_chars is the number of unicode chars plus null terminator */ result = PyUnicode_FromWideChar(user_name, num_chars - 1); - } - else + } + else result = PyErr_SetFromWindowsErr(GetLastError()); #else char *name; @@ -5026,7 +5266,7 @@ #endif /* HAVE_READLINK */ -#ifdef HAVE_SYMLINK +#if defined(HAVE_SYMLINK) && !defined(MS_WINDOWS) PyDoc_STRVAR(posix_symlink__doc__, "symlink(src, dst)\n\n\ Create a symbolic link pointing to src named dst."); @@ -5044,43 +5284,6 @@ "readlink(path) -> path\n\n\ Return a string representing the path to which the symbolic link points."); -/* The following structure was copied from - http://msdn.microsoft.com/en-us/library/ms791514.aspx as the required - include doesn't seem to be present in the Windows SDK (at least as included - with Visual Studio Express). */ -typedef struct _REPARSE_DATA_BUFFER { - ULONG ReparseTag; - USHORT ReparseDataLength; - USHORT Reserved; - union { - struct { - USHORT SubstituteNameOffset; - USHORT SubstituteNameLength; - USHORT PrintNameOffset; - USHORT PrintNameLength; - ULONG Flags; - WCHAR PathBuffer[1]; - } SymbolicLinkReparseBuffer; - - struct { - USHORT SubstituteNameOffset; - USHORT SubstituteNameLength; - USHORT PrintNameOffset; - USHORT PrintNameLength; - WCHAR PathBuffer[1]; - } MountPointReparseBuffer; - - struct { - UCHAR DataBuffer[1]; - } GenericReparseBuffer; - }; -} REPARSE_DATA_BUFFER, *PREPARSE_DATA_BUFFER; - -#define REPARSE_DATA_BUFFER_HEADER_SIZE FIELD_OFFSET(REPARSE_DATA_BUFFER,\ - GenericReparseBuffer) - -#define MAXIMUM_REPARSE_DATA_BUFFER_SIZE ( 16 * 1024 ) - /* Windows readlink implementation */ static PyObject * win_readlink(PyObject *self, PyObject *args) @@ -5111,12 +5314,12 @@ FILE_FLAG_OPEN_REPARSE_POINT|FILE_FLAG_BACKUP_SEMANTICS, 0); Py_END_ALLOW_THREADS - + if (reparse_point_handle==INVALID_HANDLE_VALUE) { return win32_error_unicode("readlink", path); } - + Py_BEGIN_ALLOW_THREADS /* New call DeviceIoControl to read the reparse point */ io_result = DeviceIoControl( @@ -5151,7 +5354,7 @@ #endif /* !defined(HAVE_READLINK) && defined(MS_WINDOWS) */ -#if !defined(HAVE_SYMLINK) && defined(MS_WINDOWS) +#if defined(HAVE_SYMLINK) && defined(MS_WINDOWS) /* Grab CreateSymbolicLinkW dynamically from kernel32 */ static int has_CreateSymbolicLinkW = 0; @@ -5187,7 +5390,7 @@ int target_is_directory = 0; DWORD res; WIN32_FILE_ATTRIBUTE_DATA src_info; - + if (!check_CreateSymbolicLinkW()) { /* raise NotImplementedError */ @@ -5197,12 +5400,16 @@ if (!PyArg_ParseTupleAndKeywords(args, kwargs, "OO|i:symlink", kwlist, &src, &dest, &target_is_directory)) return NULL; + + if (win32_can_symlink == 0) + return PyErr_Format(PyExc_OSError, "symbolic link privilege not held"); + if (!convert_to_unicode(&src)) { return NULL; } if (!convert_to_unicode(&dest)) { Py_DECREF(src); return NULL; } - + /* if src is a directory, ensure target_is_directory==1 */ if( GetFileAttributesExW( @@ -5225,11 +5432,11 @@ { return win32_error_unicode("symlink", PyUnicode_AsUnicode(src)); } - + Py_INCREF(Py_None); return Py_None; } 
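/* [Editor's note: illustrative sketch, not part of the committed patch.]
   win_symlink() above refuses to run when the symlink privilege is not
   held.  The flag it tests, win32_can_symlink, is a module-level int
   (its declaration is outside the hunks quoted here) that is set exactly
   once, in the module init function before PyModule_Create(), from the
   enable_symlink() helper that appears further down in this patch:

       win32_can_symlink = enable_symlink();

   enable_symlink() attempts to enable SE_CREATE_SYMBOLIC_LINK_NAME on the
   process token and returns 0 when AdjustTokenPrivileges() reports
   ERROR_NOT_ALL_ASSIGNED, so an unprivileged process gets a clear OSError
   ("symbolic link privilege not held") from os.symlink() instead of an
   obscure Windows error from CreateSymbolicLinkW(). */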
-#endif /* !defined(HAVE_SYMLINK) && defined(MS_WINDOWS) */ +#endif /* defined(HAVE_SYMLINK) && defined(MS_WINDOWS) */ #ifdef HAVE_TIMES #if defined(PYCC_VACPP) && defined(PYOS_OS2) @@ -5610,8 +5817,10 @@ buffer = PyBytes_FromStringAndSize((char *)NULL, size); if (buffer == NULL) return NULL; - if (!_PyVerify_fd(fd)) + if (!_PyVerify_fd(fd)) { + Py_DECREF(buffer); return posix_error(); + } Py_BEGIN_ALLOW_THREADS n = read(fd, PyBytes_AS_STRING(buffer), size); Py_END_ALLOW_THREADS @@ -5634,21 +5843,203 @@ { Py_buffer pbuf; int fd; - Py_ssize_t size; + Py_ssize_t size, len; if (!PyArg_ParseTuple(args, "iy*:write", &fd, &pbuf)) return NULL; - if (!_PyVerify_fd(fd)) + if (!_PyVerify_fd(fd)) { + PyBuffer_Release(&pbuf); return posix_error(); + } + len = pbuf.len; Py_BEGIN_ALLOW_THREADS - size = write(fd, pbuf.buf, (size_t)pbuf.len); +#if defined(MS_WIN64) || defined(MS_WINDOWS) + if (len > INT_MAX) + len = INT_MAX; + size = write(fd, pbuf.buf, (int)len); +#else + size = write(fd, pbuf.buf, len); +#endif Py_END_ALLOW_THREADS - PyBuffer_Release(&pbuf); + PyBuffer_Release(&pbuf); if (size < 0) return posix_error(); return PyLong_FromSsize_t(size); } +#ifdef HAVE_SENDFILE +#if defined(__FreeBSD__) || defined(__DragonFly__) || defined(__APPLE__) +static int +iov_setup(struct iovec **iov, Py_buffer **buf, PyObject *seq, int cnt, int type) +{ + int i, j; + *iov = PyMem_New(struct iovec, cnt); + if (*iov == NULL) { + PyErr_NoMemory(); + return 0; + } + *buf = PyMem_New(Py_buffer, cnt); + if (*buf == NULL) { + PyMem_Del(*iov); + PyErr_NoMemory(); + return 0; + } + + for (i = 0; i < cnt; i++) { + if (PyObject_GetBuffer(PySequence_GetItem(seq, i), &(*buf)[i], + type) == -1) { + PyMem_Del(*iov); + for (j = 0; j < i; j++) { + PyBuffer_Release(&(*buf)[j]); + } + PyMem_Del(*buf); + return 0; + } + (*iov)[i].iov_base = (*buf)[i].buf; + (*iov)[i].iov_len = (*buf)[i].len; + } + return 1; +} + +static void +iov_cleanup(struct iovec *iov, Py_buffer *buf, int cnt) +{ + int i; + PyMem_Del(iov); + for (i = 0; i < cnt; i++) { + PyBuffer_Release(&buf[i]); + } + PyMem_Del(buf); +} +#endif + +PyDoc_STRVAR(posix_sendfile__doc__, +"sendfile(out, in, offset, nbytes) -> byteswritten\n\ +sendfile(out, in, offset, nbytes, headers=None, trailers=None, flags=0)\n\ + -> byteswritten\n\ +Copy nbytes bytes from file descriptor in to file descriptor out."); + +static PyObject * +posix_sendfile(PyObject *self, PyObject *args, PyObject *kwdict) +{ + int in, out; + Py_ssize_t ret; + off_t offset; + +#if defined(__FreeBSD__) || defined(__DragonFly__) || defined(__APPLE__) +#ifndef __APPLE__ + Py_ssize_t len; +#endif + PyObject *headers = NULL, *trailers = NULL; + Py_buffer *hbuf, *tbuf; + off_t sbytes; + struct sf_hdtr sf; + int flags = 0; + sf.headers = NULL; + sf.trailers = NULL; + static char *keywords[] = {"out", "in", + "offset", "count", + "headers", "trailers", "flags", NULL}; + +#ifdef __APPLE__ + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iiO&O&|OOi:sendfile", + keywords, &out, &in, _parse_off_t, &offset, _parse_off_t, &sbytes, +#else + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iiO&n|OOi:sendfile", + keywords, &out, &in, _parse_off_t, &offset, &len, +#endif + &headers, &trailers, &flags)) + return NULL; + if (headers != NULL) { + if (!PySequence_Check(headers)) { + PyErr_SetString(PyExc_TypeError, + "sendfile() headers must be a sequence or None"); + return NULL; + } else { + sf.hdr_cnt = PySequence_Size(headers); + if (sf.hdr_cnt > 0 && !iov_setup(&(sf.headers), &hbuf, + headers, sf.hdr_cnt, PyBUF_SIMPLE)) + return 
NULL; + } + } + if (trailers != NULL) { + if (!PySequence_Check(trailers)) { + PyErr_SetString(PyExc_TypeError, + "sendfile() trailers must be a sequence or None"); + return NULL; + } else { + sf.trl_cnt = PySequence_Size(trailers); + if (sf.trl_cnt > 0 && !iov_setup(&(sf.trailers), &tbuf, + trailers, sf.trl_cnt, PyBUF_SIMPLE)) + return NULL; + } + } + + Py_BEGIN_ALLOW_THREADS +#ifdef __APPLE__ + ret = sendfile(in, out, offset, &sbytes, &sf, flags); +#else + ret = sendfile(in, out, offset, len, &sf, &sbytes, flags); +#endif + Py_END_ALLOW_THREADS + + if (sf.headers != NULL) + iov_cleanup(sf.headers, hbuf, sf.hdr_cnt); + if (sf.trailers != NULL) + iov_cleanup(sf.trailers, tbuf, sf.trl_cnt); + + if (ret < 0) { + if ((errno == EAGAIN) || (errno == EBUSY)) { + if (sbytes != 0) { + // some data has been sent + goto done; + } + else { + // no data has been sent; upper application is supposed + // to retry on EAGAIN or EBUSY + return posix_error(); + } + } + return posix_error(); + } + goto done; + +done: + #if !defined(HAVE_LARGEFILE_SUPPORT) + return Py_BuildValue("l", sbytes); + #else + return Py_BuildValue("L", sbytes); + #endif + +#else + Py_ssize_t count; + PyObject *offobj; + static char *keywords[] = {"out", "in", + "offset", "count", NULL}; + if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iiOn:sendfile", + keywords, &out, &in, &offobj, &count)) + return NULL; +#ifdef linux + if (offobj == Py_None) { + Py_BEGIN_ALLOW_THREADS + ret = sendfile(out, in, NULL, count); + Py_END_ALLOW_THREADS + if (ret < 0) + return posix_error(); + Py_INCREF(Py_None); + return Py_BuildValue("nO", ret, Py_None); + } +#endif + _parse_off_t(offobj, &offset); + Py_BEGIN_ALLOW_THREADS + ret = sendfile(out, in, &offset, count); + Py_END_ALLOW_THREADS + if (ret < 0) + return posix_error(); + return Py_BuildValue("n", ret); +#endif +} +#endif PyDoc_STRVAR(posix_fstat__doc__, "fstat(fd) -> stat result\n\n\ @@ -6348,38 +6739,38 @@ size_t tablesize) { if (PyLong_Check(arg)) { - *valuep = PyLong_AS_LONG(arg); - return 1; + *valuep = PyLong_AS_LONG(arg); + return 1; } else { - /* look up the value in the table using a binary search */ - size_t lo = 0; - size_t mid; - size_t hi = tablesize; - int cmp; - const char *confname; - if (!PyUnicode_Check(arg)) { - PyErr_SetString(PyExc_TypeError, - "configuration names must be strings or integers"); - return 0; - } - confname = _PyUnicode_AsString(arg); - if (confname == NULL) - return 0; - while (lo < hi) { - mid = (lo + hi) / 2; - cmp = strcmp(confname, table[mid].name); - if (cmp < 0) - hi = mid; - else if (cmp > 0) - lo = mid + 1; - else { - *valuep = table[mid].value; - return 1; + /* look up the value in the table using a binary search */ + size_t lo = 0; + size_t mid; + size_t hi = tablesize; + int cmp; + const char *confname; + if (!PyUnicode_Check(arg)) { + PyErr_SetString(PyExc_TypeError, + "configuration names must be strings or integers"); + return 0; } - } - PyErr_SetString(PyExc_ValueError, "unrecognized configuration name"); - return 0; + confname = _PyUnicode_AsString(arg); + if (confname == NULL) + return 0; + while (lo < hi) { + mid = (lo + hi) / 2; + cmp = strcmp(confname, table[mid].name); + if (cmp < 0) + hi = mid; + else if (cmp > 0) + lo = mid + 1; + else { + *valuep = table[mid].value; + return 1; + } + } + PyErr_SetString(PyExc_ValueError, "unrecognized configuration name"); + return 0; } } @@ -6495,14 +6886,14 @@ if (PyArg_ParseTuple(args, "iO&:fpathconf", &fd, conv_path_confname, &name)) { - long limit; + long limit; - errno = 0; - limit = fpathconf(fd, 
name); - if (limit == -1 && errno != 0) - posix_error(); - else - result = PyLong_FromLong(limit); + errno = 0; + limit = fpathconf(fd, name); + if (limit == -1 && errno != 0) + posix_error(); + else + result = PyLong_FromLong(limit); } return result; } @@ -6530,10 +6921,10 @@ limit = pathconf(path, name); if (limit == -1 && errno != 0) { if (errno == EINVAL) - /* could be a path or name problem */ - posix_error(); + /* could be a path or name problem */ + posix_error(); else - posix_error_with_filename(path); + posix_error_with_filename(path); } else result = PyLong_FromLong(limit); @@ -6714,7 +7105,7 @@ PyObject *result = NULL; int name; char buffer[255]; - int len; + int len; if (!PyArg_ParseTuple(args, "O&:confstr", conv_confstr_confname, &name)) return NULL; @@ -7329,21 +7720,21 @@ sizeof(posix_constants_pathconf) / sizeof(struct constdef), "pathconf_names", module)) - return -1; + return -1; #endif #ifdef HAVE_CONFSTR if (setup_confname_table(posix_constants_confstr, sizeof(posix_constants_confstr) / sizeof(struct constdef), "confstr_names", module)) - return -1; + return -1; #endif #ifdef HAVE_SYSCONF if (setup_confname_table(posix_constants_sysconf, sizeof(posix_constants_sysconf) / sizeof(struct constdef), "sysconf_names", module)) - return -1; + return -1; #endif return 0; } @@ -7456,10 +7847,10 @@ { double loadavg[3]; if (getloadavg(loadavg, 3)!=3) { - PyErr_SetString(PyExc_OSError, "Load averages are unobtainable"); - return NULL; + PyErr_SetString(PyExc_OSError, "Load averages are unobtainable"); + return NULL; } else - return Py_BuildValue("ddd", loadavg[0], loadavg[1], loadavg[2]); + return Py_BuildValue("ddd", loadavg[0], loadavg[1], loadavg[2]); } #endif @@ -7688,6 +8079,552 @@ } #endif +/* Posix *at family of functions: + faccessat, fchmodat, fchownat, fstatat, futimesat, + linkat, mkdirat, mknodat, openat, readlinkat, renameat, symlinkat, + unlinkat, utimensat, mkfifoat */ + +#ifdef HAVE_FACCESSAT +PyDoc_STRVAR(posix_faccessat__doc__, +"faccessat(dirfd, path, mode, flags=0) -> True if granted, False otherwise\n\n\ +Like access() but if path is relative, it is taken as relative to dirfd.\n\ +flags is optional and can be constructed by ORing together zero or more\n\ +of these values: AT_SYMLINK_NOFOLLOW, AT_EACCESS.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_faccessat(PyObject *self, PyObject *args) +{ + PyObject *opath; + char *path; + int mode; + int res; + int dirfd, flags = 0; + if (!PyArg_ParseTuple(args, "iO&i|i:faccessat", + &dirfd, PyUnicode_FSConverter, &opath, &mode, &flags)) + return NULL; + path = PyBytes_AsString(opath); + Py_BEGIN_ALLOW_THREADS + res = faccessat(dirfd, path, mode, flags); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + return PyBool_FromLong(res == 0); +} +#endif + +#ifdef HAVE_FCHMODAT +PyDoc_STRVAR(posix_fchmodat__doc__, +"fchmodat(dirfd, path, mode, flags=0)\n\n\ +Like chmod() but if path is relative, it is taken as relative to dirfd.\n\ +flags is optional and may be 0 or AT_SYMLINK_NOFOLLOW.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_fchmodat(PyObject *self, PyObject *args) +{ + int dirfd, mode, res; + int flags = 0; + PyObject *opath; + char *path; + + if (!PyArg_ParseTuple(args, "iO&i|i:fchmodat", + &dirfd, PyUnicode_FSConverter, &opath, &mode, &flags)) + return NULL; + + path = 
PyBytes_AsString(opath); + + Py_BEGIN_ALLOW_THREADS + res = fchmodat(dirfd, path, mode, flags); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif /* HAVE_FCHMODAT */ + +#ifdef HAVE_FCHOWNAT +PyDoc_STRVAR(posix_fchownat__doc__, +"fchownat(dirfd, path, uid, gid, flags=0)\n\n\ +Like chown() but if path is relative, it is taken as relative to dirfd.\n\ +flags is optional and may be 0 or AT_SYMLINK_NOFOLLOW.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_fchownat(PyObject *self, PyObject *args) +{ + PyObject *opath; + int dirfd, res; + long uid, gid; + int flags = 0; + char *path; + + if (!PyArg_ParseTuple(args, "iO&ll|i:fchownat", + &dirfd, PyUnicode_FSConverter, &opath, &uid, &gid, &flags)) + return NULL; + + path = PyBytes_AsString(opath); + + Py_BEGIN_ALLOW_THREADS + res = fchownat(dirfd, path, (uid_t) uid, (gid_t) gid, flags); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif /* HAVE_FCHOWNAT */ + +#ifdef HAVE_FSTATAT +PyDoc_STRVAR(posix_fstatat__doc__, +"fstatat(dirfd, path, flags=0) -> stat result\n\n\ +Like stat() but if path is relative, it is taken as relative to dirfd.\n\ +flags is optional and may be 0 or AT_SYMLINK_NOFOLLOW.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_fstatat(PyObject *self, PyObject *args) +{ + PyObject *opath; + char *path; + STRUCT_STAT st; + int dirfd, res, flags = 0; + + if (!PyArg_ParseTuple(args, "iO&|i:fstatat", + &dirfd, PyUnicode_FSConverter, &opath, &flags)) + return NULL; + path = PyBytes_AsString(opath); + + Py_BEGIN_ALLOW_THREADS + res = fstatat(dirfd, path, &st, flags); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (res != 0) + return posix_error(); + + return _pystat_fromstructstat(&st); +} +#endif + +#ifdef HAVE_FUTIMESAT +PyDoc_STRVAR(posix_futimesat__doc__, +"futimesat(dirfd, path, (atime, mtime))\n\ +futimesat(dirfd, path, None)\n\n\ +Like utime() but if path is relative, it is taken as relative to dirfd.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_futimesat(PyObject *self, PyObject *args) +{ + PyObject *opath; + char *path; + int res, dirfd; + PyObject* arg; + + struct timeval buf[2]; + + if (!PyArg_ParseTuple(args, "iO&O:futimesat", + &dirfd, PyUnicode_FSConverter, &opath, &arg)) + return NULL; + path = PyBytes_AsString(opath); + if (arg == Py_None) { + /* optional time values not given */ + Py_BEGIN_ALLOW_THREADS + res = futimesat(dirfd, path, NULL); + Py_END_ALLOW_THREADS + } + else if (!PyTuple_Check(arg) || PyTuple_Size(arg) != 2) { + PyErr_SetString(PyExc_TypeError, + "futimesat() arg 3 must be a tuple (atime, mtime)"); + Py_DECREF(opath); + return NULL; + } + else { + if (extract_time(PyTuple_GET_ITEM(arg, 0), + &(buf[0].tv_sec), &(buf[0].tv_usec)) == -1) { + Py_DECREF(opath); + return NULL; + } + if (extract_time(PyTuple_GET_ITEM(arg, 1), + &(buf[1].tv_sec), &(buf[1].tv_usec)) == -1) { + Py_DECREF(opath); + return NULL; + } + Py_BEGIN_ALLOW_THREADS + res = futimesat(dirfd, path, buf); + Py_END_ALLOW_THREADS + } + Py_DECREF(opath); + if (res < 0) { + return posix_error(); + } + Py_RETURN_NONE; +} +#endif + +#ifdef HAVE_LINKAT 
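/* [Editor's note: illustrative sketch, not part of the committed patch.]
   posix_linkat() below, like the other *at wrappers in this section,
   follows one pattern: convert the path(s) with PyUnicode_FSConverter,
   release the GIL around the libc call, and map a negative return value
   to posix_error().  The underlying libc calls behave roughly like this
   (the descriptor and file names are made up for illustration, and error
   checking is omitted):

       int dirfd = open("/tmp", O_RDONLY);
       linkat(dirfd, "a.txt", dirfd, "b.txt", 0);    create b.txt as a hard
                                                     link to a.txt
       unlinkat(dirfd, "b.txt", 0);
       faccessat(AT_FDCWD, "a.txt", R_OK, 0);        AT_FDCWD resolves the
                                                     path against the CWD
       close(dirfd);

   The AT_FDCWD constant and the AT_* flags are exported by this patch in
   all_ins() further below, so the equivalent calls are available from
   Python as posix.linkat(), posix.unlinkat(), posix.faccessat(), etc. */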
+PyDoc_STRVAR(posix_linkat__doc__, +"linkat(srcfd, srcpath, dstfd, dstpath, flags=0)\n\n\ +Like link() but if srcpath is relative, it is taken as relative to srcfd\n\ +and if dstpath is relative, it is taken as relative to dstfd.\n\ +flags is optional and may be 0 or AT_SYMLINK_FOLLOW.\n\ +If srcpath is relative and srcfd is the special value AT_FDCWD, then\n\ +srcpath is interpreted relative to the current working directory. This\n\ +also applies for dstpath."); + +static PyObject * +posix_linkat(PyObject *self, PyObject *args) +{ + PyObject *osrc, *odst; + char *src, *dst; + int res, srcfd, dstfd; + int flags = 0; + + if (!PyArg_ParseTuple(args, "iO&iO&|i:linkat", + &srcfd, PyUnicode_FSConverter, &osrc, &dstfd, PyUnicode_FSConverter, &odst, &flags)) + return NULL; + src = PyBytes_AsString(osrc); + dst = PyBytes_AsString(odst); + Py_BEGIN_ALLOW_THREADS + res = linkat(srcfd, src, dstfd, dst, flags); + Py_END_ALLOW_THREADS + Py_DECREF(osrc); + Py_DECREF(odst); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif /* HAVE_LINKAT */ + +#ifdef HAVE_MKDIRAT +PyDoc_STRVAR(posix_mkdirat__doc__, +"mkdirat(dirfd, path, mode=0o777)\n\n\ +Like mkdir() but if path is relative, it is taken as relative to dirfd.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_mkdirat(PyObject *self, PyObject *args) +{ + int res, dirfd; + PyObject *opath; + char *path; + int mode = 0777; + + if (!PyArg_ParseTuple(args, "iO&|i:mkdirat", + &dirfd, PyUnicode_FSConverter, &opath, &mode)) + return NULL; + path = PyBytes_AsString(opath); + Py_BEGIN_ALLOW_THREADS + res = mkdirat(dirfd, path, mode); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif + +#if defined(HAVE_MKNODAT) && defined(HAVE_MAKEDEV) +PyDoc_STRVAR(posix_mknodat__doc__, +"mknodat(dirfd, path, mode=0o600, device=0)\n\n\ +Like mknod() but if path is relative, it is taken as relative to dirfd.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_mknodat(PyObject *self, PyObject *args) +{ + PyObject *opath; + char *filename; + int mode = 0600; + int device = 0; + int res, dirfd; + if (!PyArg_ParseTuple(args, "iO&|ii:mknodat", &dirfd, + PyUnicode_FSConverter, &opath, &mode, &device)) + return NULL; + filename = PyBytes_AS_STRING(opath); + Py_BEGIN_ALLOW_THREADS + res = mknodat(dirfd, filename, mode, device); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif + +#ifdef HAVE_OPENAT +PyDoc_STRVAR(posix_openat__doc__, +"openat(dirfd, path, flag, mode=0o777) -> fd\n\n\ +Like open() but if path is relative, it is taken as relative to dirfd.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_openat(PyObject *self, PyObject *args) +{ + PyObject *ofile; + char *file; + int flag, dirfd, fd; + int mode = 0777; + + if (!PyArg_ParseTuple(args, "iO&i|i:openat", + &dirfd, PyUnicode_FSConverter, &ofile, + &flag, &mode)) + return NULL; + file = PyBytes_AsString(ofile); + Py_BEGIN_ALLOW_THREADS + fd = openat(dirfd, file, flag, mode); + Py_END_ALLOW_THREADS + Py_DECREF(ofile); + if (fd < 0) + return posix_error(); + return PyLong_FromLong((long)fd); +} +#endif + +#ifdef HAVE_READLINKAT 
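/* [Editor's note: descriptive comment added for this digest, not part of
   the committed patch.]  posix_readlinkat() below mirrors the existing
   readlink() wrapper: it re-inspects the original second argument (via
   PySequence_GetItem on the args tuple) to see whether the caller passed
   str or bytes, and decodes the result with
   PyUnicode_DecodeFSDefaultAndSize() only in the str case, so
   readlinkat(fd, "link") returns str while readlinkat(fd, b"link")
   returns bytes. */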
+PyDoc_STRVAR(posix_readlinkat__doc__, +"readlinkat(dirfd, path) -> path\n\n\ +Like readlink() but if path is relative, it is taken as relative to dirfd.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_readlinkat(PyObject *self, PyObject *args) +{ + PyObject *v, *opath; + char buf[MAXPATHLEN]; + char *path; + int n, dirfd; + int arg_is_unicode = 0; + + if (!PyArg_ParseTuple(args, "iO&:readlinkat", + &dirfd, PyUnicode_FSConverter, &opath)) + return NULL; + path = PyBytes_AsString(opath); + v = PySequence_GetItem(args, 1); + if (v == NULL) { + Py_DECREF(opath); + return NULL; + } + + if (PyUnicode_Check(v)) { + arg_is_unicode = 1; + } + Py_DECREF(v); + + Py_BEGIN_ALLOW_THREADS + n = readlinkat(dirfd, path, buf, (int) sizeof buf); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (n < 0) + return posix_error(); + + if (arg_is_unicode) + return PyUnicode_DecodeFSDefaultAndSize(buf, n); + else + return PyBytes_FromStringAndSize(buf, n); +} +#endif /* HAVE_READLINKAT */ + +#ifdef HAVE_RENAMEAT +PyDoc_STRVAR(posix_renameat__doc__, +"renameat(olddirfd, oldpath, newdirfd, newpath)\n\n\ +Like rename() but if oldpath is relative, it is taken as relative to\n\ +olddirfd and if newpath is relative, it is taken as relative to newdirfd.\n\ +If oldpath is relative and olddirfd is the special value AT_FDCWD, then\n\ +oldpath is interpreted relative to the current working directory. This\n\ +also applies for newpath."); + +static PyObject * +posix_renameat(PyObject *self, PyObject *args) +{ + int res; + PyObject *opathold, *opathnew; + char *opath, *npath; + int oldfd, newfd; + + if (!PyArg_ParseTuple(args, "iO&iO&:renameat", + &oldfd, PyUnicode_FSConverter, &opathold, &newfd, PyUnicode_FSConverter, &opathnew)) + return NULL; + opath = PyBytes_AsString(opathold); + npath = PyBytes_AsString(opathnew); + Py_BEGIN_ALLOW_THREADS + res = renameat(oldfd, opath, newfd, npath); + Py_END_ALLOW_THREADS + Py_DECREF(opathold); + Py_DECREF(opathnew); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif + +#if HAVE_SYMLINKAT +PyDoc_STRVAR(posix_symlinkat__doc__, +"symlinkat(src, dstfd, dst)\n\n\ +Like symlink() but if dst is relative, it is taken as relative to dstfd.\n\ +If dst is relative and dstfd is the special value AT_FDCWD, then dst\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_symlinkat(PyObject *self, PyObject *args) +{ + int res, dstfd; + PyObject *osrc, *odst; + char *src, *dst; + + if (!PyArg_ParseTuple(args, "O&iO&:symlinkat", + PyUnicode_FSConverter, &osrc, &dstfd, PyUnicode_FSConverter, &odst)) + return NULL; + src = PyBytes_AsString(osrc); + dst = PyBytes_AsString(odst); + Py_BEGIN_ALLOW_THREADS + res = symlinkat(src, dstfd, dst); + Py_END_ALLOW_THREADS + Py_DECREF(osrc); + Py_DECREF(odst); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif /* HAVE_SYMLINKAT */ + +#ifdef HAVE_UNLINKAT +PyDoc_STRVAR(posix_unlinkat__doc__, +"unlinkat(dirfd, path, flags=0)\n\n\ +Like unlink() but if path is relative, it is taken as relative to dirfd.\n\ +flags is optional and may be 0 or AT_REMOVEDIR. 
If AT_REMOVEDIR is\n\ +specified, unlinkat() behaves like rmdir().\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_unlinkat(PyObject *self, PyObject *args) +{ + int dirfd, res, flags = 0; + PyObject *opath; + char *path; + + if (!PyArg_ParseTuple(args, "iO&|i:unlinkat", + &dirfd, PyUnicode_FSConverter, &opath, &flags)) + return NULL; + path = PyBytes_AsString(opath); + Py_BEGIN_ALLOW_THREADS + res = unlinkat(dirfd, path, flags); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (res < 0) + return posix_error(); + Py_RETURN_NONE; +} +#endif + +#ifdef HAVE_UTIMENSAT +PyDoc_STRVAR(posix_utimensat__doc__, +"utimensat(dirfd, path, (atime_sec, atime_nsec),\n\ + (mtime_sec, mtime_nsec), flags)\n\ +utimensat(dirfd, path, None, None, flags)\n\n\ +Updates the timestamps of a file with nanosecond precision. If path is\n\ +relative, it is taken as relative to dirfd.\n\ +The second form sets atime and mtime to the current time.\n\ +flags is optional and may be 0 or AT_SYMLINK_NOFOLLOW.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory.\n\ +If *_nsec is specified as UTIME_NOW, the timestamp is updated to the\n\ +current time.\n\ +If *_nsec is specified as UTIME_OMIT, the timestamp is not updated."); + +static PyObject * +posix_utimensat(PyObject *self, PyObject *args) +{ + PyObject *opath; + char *path; + int res, dirfd, flags = 0; + PyObject *atime, *mtime; + + struct timespec buf[2]; + + if (!PyArg_ParseTuple(args, "iO&OO|i:utimensat", + &dirfd, PyUnicode_FSConverter, &opath, &atime, &mtime, &flags)) + return NULL; + path = PyBytes_AsString(opath); + if (atime == Py_None && mtime == Py_None) { + /* optional time values not given */ + Py_BEGIN_ALLOW_THREADS + res = utimensat(dirfd, path, NULL, flags); + Py_END_ALLOW_THREADS + } + else if (!PyTuple_Check(atime) || PyTuple_Size(atime) != 2) { + PyErr_SetString(PyExc_TypeError, + "utimensat() arg 3 must be a tuple (atime_sec, atime_nsec)"); + Py_DECREF(opath); + return NULL; + } + else if (!PyTuple_Check(mtime) || PyTuple_Size(mtime) != 2) { + PyErr_SetString(PyExc_TypeError, + "utimensat() arg 4 must be a tuple (mtime_sec, mtime_nsec)"); + Py_DECREF(opath); + return NULL; + } + else { + if (!PyArg_ParseTuple(atime, "ll:utimensat", + &(buf[0].tv_sec), &(buf[0].tv_nsec))) { + Py_DECREF(opath); + return NULL; + } + if (!PyArg_ParseTuple(mtime, "ll:utimensat", + &(buf[1].tv_sec), &(buf[1].tv_nsec))) { + Py_DECREF(opath); + return NULL; + } + Py_BEGIN_ALLOW_THREADS + res = utimensat(dirfd, path, buf, flags); + Py_END_ALLOW_THREADS + } + Py_DECREF(opath); + if (res < 0) { + return posix_error(); + } + Py_RETURN_NONE; +} +#endif + +#ifdef HAVE_MKFIFOAT +PyDoc_STRVAR(posix_mkfifoat__doc__, +"mkfifoat(dirfd, path, mode=0o666)\n\n\ +Like mkfifo() but if path is relative, it is taken as relative to dirfd.\n\ +If path is relative and dirfd is the special value AT_FDCWD, then path\n\ +is interpreted relative to the current working directory."); + +static PyObject * +posix_mkfifoat(PyObject *self, PyObject *args) +{ + PyObject *opath; + char *filename; + int mode = 0666; + int res, dirfd; + if (!PyArg_ParseTuple(args, "iO&|i:mkfifoat", + &dirfd, PyUnicode_FSConverter, &opath, &mode)) + return NULL; + filename = PyBytes_AS_STRING(opath); + Py_BEGIN_ALLOW_THREADS + res = mkfifoat(dirfd, filename, mode); + Py_END_ALLOW_THREADS + Py_DECREF(opath); + if (res < 0) + return 
posix_error(); + Py_RETURN_NONE; +} +#endif + static PyMethodDef posix_methods[] = { {"access", posix_access, METH_VARARGS, posix_access__doc__}, #ifdef HAVE_TTYNAME @@ -7732,11 +8669,20 @@ {"link", posix_link, METH_VARARGS, posix_link__doc__}, #endif /* HAVE_LINK */ {"listdir", posix_listdir, METH_VARARGS, posix_listdir__doc__}, +#ifdef HAVE_FDOPENDIR + {"fdlistdir", posix_fdlistdir, METH_VARARGS, posix_fdlistdir__doc__}, +#endif {"lstat", posix_lstat, METH_VARARGS, posix_lstat__doc__}, {"mkdir", posix_mkdir, METH_VARARGS, posix_mkdir__doc__}, #ifdef HAVE_NICE {"nice", posix_nice, METH_VARARGS, posix_nice__doc__}, #endif /* HAVE_NICE */ +#ifdef HAVE_GETPRIORITY + {"getpriority", posix_getpriority, METH_VARARGS, posix_getpriority__doc__}, +#endif /* HAVE_GETPRIORITY */ +#ifdef HAVE_SETPRIORITY + {"setpriority", posix_setpriority, METH_VARARGS, posix_setpriority__doc__}, +#endif /* HAVE_SETPRIORITY */ #ifdef HAVE_READLINK {"readlink", posix_readlink, METH_VARARGS, posix_readlink__doc__}, #endif /* HAVE_READLINK */ @@ -7747,13 +8693,13 @@ {"rmdir", posix_rmdir, METH_VARARGS, posix_rmdir__doc__}, {"stat", posix_stat, METH_VARARGS, posix_stat__doc__}, {"stat_float_times", stat_float_times, METH_VARARGS, stat_float_times__doc__}, -#ifdef HAVE_SYMLINK +#if defined(HAVE_SYMLINK) && !defined(MS_WINDOWS) {"symlink", posix_symlink, METH_VARARGS, posix_symlink__doc__}, #endif /* HAVE_SYMLINK */ -#if !defined(HAVE_SYMLINK) && defined(MS_WINDOWS) +#if defined(HAVE_SYMLINK) && defined(MS_WINDOWS) {"symlink", (PyCFunction)win_symlink, METH_VARARGS | METH_KEYWORDS, - win_symlink__doc__}, -#endif /* !defined(HAVE_SYMLINK) && defined(MS_WINDOWS) */ + win_symlink__doc__}, +#endif /* defined(HAVE_SYMLINK) && defined(MS_WINDOWS) */ #ifdef HAVE_SYSTEM {"system", posix_system, METH_VARARGS, posix_system__doc__}, #endif @@ -7829,6 +8775,7 @@ #ifdef MS_WINDOWS {"startfile", win32_startfile, METH_VARARGS, win32_startfile__doc__}, {"kill", win32_kill, METH_VARARGS, win32_kill__doc__}, + {"link", win32_link, METH_VARARGS, win32_link__doc__}, #endif #ifdef HAVE_SETUID {"setuid", posix_setuid, METH_VARARGS, posix_setuid__doc__}, @@ -7896,6 +8843,10 @@ {"lseek", posix_lseek, METH_VARARGS, posix_lseek__doc__}, {"read", posix_read, METH_VARARGS, posix_read__doc__}, {"write", posix_write, METH_VARARGS, posix_write__doc__}, +#ifdef HAVE_SENDFILE + {"sendfile", (PyCFunction)posix_sendfile, METH_VARARGS | METH_KEYWORDS, + posix_sendfile__doc__}, +#endif {"fstat", posix_fstat, METH_VARARGS, posix_fstat__doc__}, {"isatty", posix_isatty, METH_VARARGS, posix_isatty__doc__}, #ifdef HAVE_PIPE @@ -8003,6 +8954,52 @@ {"getresgid", posix_getresgid, METH_NOARGS, posix_getresgid__doc__}, #endif +/* posix *at family of functions */ +#ifdef HAVE_FACCESSAT + {"faccessat", posix_faccessat, METH_VARARGS, posix_faccessat__doc__}, +#endif +#ifdef HAVE_FCHMODAT + {"fchmodat", posix_fchmodat, METH_VARARGS, posix_fchmodat__doc__}, +#endif /* HAVE_FCHMODAT */ +#ifdef HAVE_FCHOWNAT + {"fchownat", posix_fchownat, METH_VARARGS, posix_fchownat__doc__}, +#endif /* HAVE_FCHOWNAT */ +#ifdef HAVE_FSTATAT + {"fstatat", posix_fstatat, METH_VARARGS, posix_fstatat__doc__}, +#endif +#ifdef HAVE_FUTIMESAT + {"futimesat", posix_futimesat, METH_VARARGS, posix_futimesat__doc__}, +#endif +#ifdef HAVE_LINKAT + {"linkat", posix_linkat, METH_VARARGS, posix_linkat__doc__}, +#endif /* HAVE_LINKAT */ +#ifdef HAVE_MKDIRAT + {"mkdirat", posix_mkdirat, METH_VARARGS, posix_mkdirat__doc__}, +#endif +#if defined(HAVE_MKNODAT) && defined(HAVE_MAKEDEV) + {"mknodat", 
posix_mknodat, METH_VARARGS, posix_mknodat__doc__}, +#endif +#ifdef HAVE_OPENAT + {"openat", posix_openat, METH_VARARGS, posix_openat__doc__}, +#endif +#ifdef HAVE_READLINKAT + {"readlinkat", posix_readlinkat, METH_VARARGS, posix_readlinkat__doc__}, +#endif /* HAVE_READLINKAT */ +#ifdef HAVE_RENAMEAT + {"renameat", posix_renameat, METH_VARARGS, posix_renameat__doc__}, +#endif +#if HAVE_SYMLINKAT + {"symlinkat", posix_symlinkat, METH_VARARGS, posix_symlinkat__doc__}, +#endif /* HAVE_SYMLINKAT */ +#ifdef HAVE_UNLINKAT + {"unlinkat", posix_unlinkat, METH_VARARGS, posix_unlinkat__doc__}, +#endif +#ifdef HAVE_UTIMENSAT + {"utimensat", posix_utimensat, METH_VARARGS, posix_utimensat__doc__}, +#endif +#ifdef HAVE_MKFIFOAT + {"mkfifoat", posix_mkfifoat, METH_VARARGS, posix_mkfifoat__doc__}, +#endif {NULL, NULL} /* Sentinel */ }; @@ -8066,6 +9063,35 @@ } #endif +#if defined(HAVE_SYMLINK) && defined(MS_WINDOWS) +static int +enable_symlink() +{ + HANDLE tok; + TOKEN_PRIVILEGES tok_priv; + LUID luid; + int meth_idx = 0; + + if (!OpenProcessToken(GetCurrentProcess(), TOKEN_ALL_ACCESS, &tok)) + return 0; + + if (!LookupPrivilegeValue(NULL, SE_CREATE_SYMBOLIC_LINK_NAME, &luid)) + return 0; + + tok_priv.PrivilegeCount = 1; + tok_priv.Privileges[0].Luid = luid; + tok_priv.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED; + + if (!AdjustTokenPrivileges(tok, FALSE, &tok_priv, + sizeof(TOKEN_PRIVILEGES), + (PTOKEN_PRIVILEGES) NULL, (PDWORD) NULL)) + return 0; + + /* ERROR_NOT_ALL_ASSIGNED returned when the privilege can't be assigned. */ + return GetLastError() == ERROR_NOT_ALL_ASSIGNED ? 0 : 1; +} +#endif /* defined(HAVE_SYMLINK) && defined(MS_WINDOWS) */ + static int all_ins(PyObject *d) { @@ -8150,6 +9176,38 @@ #ifdef O_EXLOCK if (ins(d, "O_EXLOCK", (long)O_EXLOCK)) return -1; #endif +#ifdef PRIO_PROCESS + if (ins(d, "PRIO_PROCESS", (long)PRIO_PROCESS)) return -1; +#endif +#ifdef PRIO_PGRP + if (ins(d, "PRIO_PGRP", (long)PRIO_PGRP)) return -1; +#endif +#ifdef PRIO_USER + if (ins(d, "PRIO_USER", (long)PRIO_USER)) return -1; +#endif +/* posix - constants for *at functions */ +#ifdef AT_SYMLINK_NOFOLLOW + if (ins(d, "AT_SYMLINK_NOFOLLOW", (long)AT_SYMLINK_NOFOLLOW)) return -1; +#endif +#ifdef AT_EACCESS + if (ins(d, "AT_EACCESS", (long)AT_EACCESS)) return -1; +#endif +#ifdef AT_FDCWD + if (ins(d, "AT_FDCWD", (long)AT_FDCWD)) return -1; +#endif +#ifdef AT_REMOVEDIR + if (ins(d, "AT_REMOVEDIR", (long)AT_REMOVEDIR)) return -1; +#endif +#ifdef AT_SYMLINK_FOLLOW + if (ins(d, "AT_SYMLINK_FOLLOW", (long)AT_SYMLINK_FOLLOW)) return -1; +#endif +#ifdef UTIME_NOW + if (ins(d, "UTIME_NOW", (long)UTIME_NOW)) return -1; +#endif +#ifdef UTIME_OMIT + if (ins(d, "UTIME_OMIT", (long)UTIME_OMIT)) return -1; +#endif + /* MS Windows */ #ifdef O_NOINHERIT @@ -8258,6 +9316,17 @@ if (ins(d, "ST_NOSUID", (long)ST_NOSUID)) return -1; #endif /* ST_NOSUID */ + /* FreeBSD sendfile() constants */ +#ifdef SF_NODISKIO + if (ins(d, "SF_NODISKIO", (long)SF_NODISKIO)) return -1; +#endif +#ifdef SF_MNOWAIT + if (ins(d, "SF_MNOWAIT", (long)SF_MNOWAIT)) return -1; +#endif +#ifdef SF_SYNC + if (ins(d, "SF_SYNC", (long)SF_SYNC)) return -1; +#endif + #ifdef HAVE_SPAWNV #if defined(PYOS_OS2) && defined(PYCC_GCC) if (ins(d, "P_WAIT", (long)P_WAIT)) return -1; @@ -8327,6 +9396,10 @@ { PyObject *m, *v; +#if defined(HAVE_SYMLINK) && defined(MS_WINDOWS) + win32_can_symlink = enable_symlink(); +#endif + m = PyModule_Create(&posixmodule); if (m == NULL) return NULL; Modified: python/branches/pep-3151/Modules/pwdmodule.c 
============================================================================== --- python/branches/pep-3151/Modules/pwdmodule.c (original) +++ python/branches/pep-3151/Modules/pwdmodule.c Sat Feb 26 08:16:32 2011 @@ -2,7 +2,6 @@ /* UNIX password file access module */ #include "Python.h" -#include "structseq.h" #include #include Modified: python/branches/pep-3151/Modules/pyexpat.c ============================================================================== --- python/branches/pep-3151/Modules/pyexpat.c (original) +++ python/branches/pep-3151/Modules/pyexpat.c Sat Feb 26 08:16:32 2011 @@ -797,25 +797,13 @@ static int readinst(char *buf, int buf_size, PyObject *meth) { - PyObject *arg = NULL; - PyObject *bytes = NULL; - PyObject *str = NULL; - int len = -1; + PyObject *str; + Py_ssize_t len; char *ptr; - if ((bytes = PyLong_FromLong(buf_size)) == NULL) - goto finally; - - if ((arg = PyTuple_New(1)) == NULL) { - Py_DECREF(bytes); - goto finally; - } - - PyTuple_SET_ITEM(arg, 0, bytes); - - str = PyObject_Call(meth, arg, NULL); + str = PyObject_CallFunction(meth, "n", buf_size); if (str == NULL) - goto finally; + goto error; if (PyBytes_Check(str)) ptr = PyBytes_AS_STRING(str); @@ -825,21 +813,24 @@ PyErr_Format(PyExc_TypeError, "read() did not return a bytes object (type=%.400s)", Py_TYPE(str)->tp_name); - goto finally; + goto error; } len = Py_SIZE(str); if (len > buf_size) { PyErr_Format(PyExc_ValueError, "read() returned too much data: " - "%i bytes requested, %i returned", + "%i bytes requested, %zd returned", buf_size, len); - goto finally; + goto error; } memcpy(buf, ptr, len); -finally: - Py_XDECREF(arg); + Py_DECREF(str); + /* len <= buf_size <= INT_MAX */ + return (int)len; + +error: Py_XDECREF(str); - return len; + return -1; } PyDoc_STRVAR(xmlparse_ParseFile__doc__, @@ -1215,11 +1206,12 @@ } static int -handlername2int(const char *name) +handlername2int(PyObject *name) { int i; for (i = 0; handler_info[i].name != NULL; i++) { - if (strcmp(name, handler_info[i].name) == 0) { + if (PyUnicode_CompareWithASCIIString( + name, handler_info[i].name) == 0) { return i; } } @@ -1237,13 +1229,13 @@ static PyObject * xmlparse_getattro(xmlparseobject *self, PyObject *nameobj) { - char *name = ""; + Py_UNICODE *name; int handlernum = -1; - if (PyUnicode_Check(nameobj)) - name = _PyUnicode_AsString(nameobj); + if (!PyUnicode_Check(nameobj)) + goto generic; - handlernum = handlername2int(name); + handlernum = handlername2int(nameobj); if (handlernum != -1) { PyObject *result = self->handlers[handlernum]; @@ -1252,46 +1244,48 @@ Py_INCREF(result); return result; } + + name = PyUnicode_AS_UNICODE(nameobj); if (name[0] == 'E') { - if (strcmp(name, "ErrorCode") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "ErrorCode") == 0) return PyLong_FromLong((long) XML_GetErrorCode(self->itself)); - if (strcmp(name, "ErrorLineNumber") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "ErrorLineNumber") == 0) return PyLong_FromLong((long) XML_GetErrorLineNumber(self->itself)); - if (strcmp(name, "ErrorColumnNumber") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "ErrorColumnNumber") == 0) return PyLong_FromLong((long) XML_GetErrorColumnNumber(self->itself)); - if (strcmp(name, "ErrorByteIndex") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "ErrorByteIndex") == 0) return PyLong_FromLong((long) XML_GetErrorByteIndex(self->itself)); } if (name[0] == 'C') { - if (strcmp(name, "CurrentLineNumber") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "CurrentLineNumber") == 0) return 
PyLong_FromLong((long) XML_GetCurrentLineNumber(self->itself)); - if (strcmp(name, "CurrentColumnNumber") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "CurrentColumnNumber") == 0) return PyLong_FromLong((long) XML_GetCurrentColumnNumber(self->itself)); - if (strcmp(name, "CurrentByteIndex") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "CurrentByteIndex") == 0) return PyLong_FromLong((long) XML_GetCurrentByteIndex(self->itself)); } if (name[0] == 'b') { - if (strcmp(name, "buffer_size") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "buffer_size") == 0) return PyLong_FromLong((long) self->buffer_size); - if (strcmp(name, "buffer_text") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "buffer_text") == 0) return get_pybool(self->buffer != NULL); - if (strcmp(name, "buffer_used") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "buffer_used") == 0) return PyLong_FromLong((long) self->buffer_used); } - if (strcmp(name, "namespace_prefixes") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "namespace_prefixes") == 0) return get_pybool(self->ns_prefixes); - if (strcmp(name, "ordered_attributes") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "ordered_attributes") == 0) return get_pybool(self->ordered_attributes); - if (strcmp(name, "specified_attributes") == 0) + if (PyUnicode_CompareWithASCIIString(nameobj, "specified_attributes") == 0) return get_pybool((long) self->specified_attributes); - if (strcmp(name, "intern") == 0) { + if (PyUnicode_CompareWithASCIIString(nameobj, "intern") == 0) { if (self->intern == NULL) { Py_INCREF(Py_None); return Py_None; @@ -1301,7 +1295,7 @@ return self->intern; } } - + generic: return PyObject_GenericGetAttr((PyObject*)self, nameobj); } @@ -1352,7 +1346,7 @@ } static int -sethandler(xmlparseobject *self, const char *name, PyObject* v) +sethandler(xmlparseobject *self, PyObject *name, PyObject* v) { int handlernum = handlername2int(name); if (handlernum >= 0) { @@ -1388,14 +1382,15 @@ } static int -xmlparse_setattr(xmlparseobject *self, char *name, PyObject *v) +xmlparse_setattro(xmlparseobject *self, PyObject *name, PyObject *v) { /* Set attribute 'name' to value 'v'. 
v==NULL means delete */ if (v == NULL) { PyErr_SetString(PyExc_RuntimeError, "Cannot delete attribute"); return -1; } - if (strcmp(name, "buffer_text") == 0) { + assert(PyUnicode_Check(name)); + if (PyUnicode_CompareWithASCIIString(name, "buffer_text") == 0) { if (PyObject_IsTrue(v)) { if (self->buffer == NULL) { self->buffer = malloc(self->buffer_size); @@ -1414,7 +1409,7 @@ } return 0; } - if (strcmp(name, "namespace_prefixes") == 0) { + if (PyUnicode_CompareWithASCIIString(name, "namespace_prefixes") == 0) { if (PyObject_IsTrue(v)) self->ns_prefixes = 1; else @@ -1422,14 +1417,14 @@ XML_SetReturnNSTriplet(self->itself, self->ns_prefixes); return 0; } - if (strcmp(name, "ordered_attributes") == 0) { + if (PyUnicode_CompareWithASCIIString(name, "ordered_attributes") == 0) { if (PyObject_IsTrue(v)) self->ordered_attributes = 1; else self->ordered_attributes = 0; return 0; } - if (strcmp(name, "specified_attributes") == 0) { + if (PyUnicode_CompareWithASCIIString(name, "specified_attributes") == 0) { if (PyObject_IsTrue(v)) self->specified_attributes = 1; else @@ -1437,7 +1432,7 @@ return 0; } - if (strcmp(name, "buffer_size") == 0) { + if (PyUnicode_CompareWithASCIIString(name, "buffer_size") == 0) { long new_buffer_size; if (!PyLong_Check(v)) { PyErr_SetString(PyExc_TypeError, "buffer_size must be an integer"); @@ -1480,7 +1475,7 @@ return 0; } - if (strcmp(name, "CharacterDataHandler") == 0) { + if (PyUnicode_CompareWithASCIIString(name, "CharacterDataHandler") == 0) { /* If we're changing the character data handler, flush all * cached data with the old handler. Not sure there's a * "right" thing to do, though, but this probably won't @@ -1492,7 +1487,7 @@ if (sethandler(self, name, v)) { return 0; } - PyErr_SetString(PyExc_AttributeError, name); + PyErr_SetObject(PyExc_AttributeError, name); return -1; } @@ -1518,13 +1513,13 @@ static PyTypeObject Xmlparsetype = { PyVarObject_HEAD_INIT(NULL, 0) "pyexpat.xmlparser", /*tp_name*/ - sizeof(xmlparseobject) + PyGC_HEAD_SIZE,/*tp_basicsize*/ + sizeof(xmlparseobject), /*tp_basicsize*/ 0, /*tp_itemsize*/ /* methods */ (destructor)xmlparse_dealloc, /*tp_dealloc*/ (printfunc)0, /*tp_print*/ 0, /*tp_getattr*/ - (setattrfunc)xmlparse_setattr, /*tp_setattr*/ + 0, /*tp_setattr*/ 0, /*tp_reserved*/ (reprfunc)0, /*tp_repr*/ 0, /*tp_as_number*/ @@ -1534,7 +1529,7 @@ (ternaryfunc)0, /*tp_call*/ (reprfunc)0, /*tp_str*/ (getattrofunc)xmlparse_getattro, /* tp_getattro */ - 0, /* tp_setattro */ + (setattrofunc)xmlparse_setattro, /* tp_setattro */ 0, /* tp_as_buffer */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, /*tp_flags*/ Xmlparsetype__doc__, /* tp_doc - Documentation string */ @@ -1803,7 +1798,7 @@ Py_XDECREF(rev_codes_dict); return NULL; } - + #define MYCONST(name) \ if (PyModule_AddStringConstant(errors_module, #name, \ (char *)XML_ErrorString(name)) < 0) \ @@ -1869,7 +1864,7 @@ return NULL; if (PyModule_AddObject(errors_module, "messages", rev_codes_dict) < 0) return NULL; - + #undef MYCONST #define MYCONST(c) PyModule_AddIntConstant(m, #c, c) Modified: python/branches/pep-3151/Modules/readline.c ============================================================================== --- python/branches/pep-3151/Modules/readline.c (original) +++ python/branches/pep-3151/Modules/readline.c Sat Feb 26 08:16:32 2011 @@ -889,6 +889,14 @@ Py_FatalError("not enough memory to save locale"); #endif +#ifdef __APPLE__ + /* the libedit readline emulation resets key bindings etc + * when calling rl_initialize. 
So call it upfront + */ + if (using_libedit_emulation) + rl_initialize(); +#endif /* __APPLE__ */ + using_history(); rl_readline_name = "python"; @@ -920,8 +928,13 @@ * XXX: A bug in the readline-2.2 library causes a memory leak * inside this function. Nothing we can do about it. */ - rl_initialize(); - +#ifdef __APPLE__ + if (using_libedit_emulation) + rl_read_init_file(NULL); + else +#endif /* __APPLE__ */ + rl_initialize(); + RESTORE_LOCALE(saved_locale) } Modified: python/branches/pep-3151/Modules/resource.c ============================================================================== --- python/branches/pep-3151/Modules/resource.c (original) +++ python/branches/pep-3151/Modules/resource.c Sat Feb 26 08:16:32 2011 @@ -1,6 +1,5 @@ #include "Python.h" -#include "structseq.h" #include #include #include @@ -325,6 +324,10 @@ PyModule_AddIntConstant(m, "RUSAGE_BOTH", RUSAGE_BOTH); #endif +#ifdef RUSAGE_THREAD + PyModule_AddIntConstant(m, "RUSAGE_THREAD", RUSAGE_THREAD); +#endif + #if defined(HAVE_LONG_LONG) if (sizeof(RLIM_INFINITY) > sizeof(long)) { v = PyLong_FromLongLong((PY_LONG_LONG) RLIM_INFINITY); Modified: python/branches/pep-3151/Modules/selectmodule.c ============================================================================== --- python/branches/pep-3151/Modules/selectmodule.c (original) +++ python/branches/pep-3151/Modules/selectmodule.c Sat Feb 26 08:16:32 2011 @@ -79,10 +79,9 @@ static int seq2set(PyObject *seq, fd_set *set, pylist fd2obj[FD_SETSIZE + 1]) { - int i; int max = -1; int index = 0; - int len = -1; + Py_ssize_t i, len = -1; PyObject* fast_seq = NULL; PyObject* o = NULL; Modified: python/branches/pep-3151/Modules/sha1module.c ============================================================================== --- python/branches/pep-3151/Modules/sha1module.c (original) +++ python/branches/pep-3151/Modules/sha1module.c Sat Feb 26 08:16:32 2011 @@ -203,9 +203,9 @@ @param inlen The length of the data (octets) */ void sha1_process(struct sha1_state *sha1, - const unsigned char *in, unsigned long inlen) + const unsigned char *in, Py_ssize_t inlen) { - unsigned long n; + Py_ssize_t n; assert(sha1 != NULL); assert(in != NULL); Modified: python/branches/pep-3151/Modules/sha256module.c ============================================================================== --- python/branches/pep-3151/Modules/sha256module.c (original) +++ python/branches/pep-3151/Modules/sha256module.c Sat Feb 26 08:16:32 2011 @@ -265,9 +265,9 @@ /* update the SHA digest */ static void -sha_update(SHAobject *sha_info, SHA_BYTE *buffer, int count) +sha_update(SHAobject *sha_info, SHA_BYTE *buffer, Py_ssize_t count) { - int i; + Py_ssize_t i; SHA_INT32 clo; clo = sha_info->count_lo + ((SHA_INT32) count << 3); Modified: python/branches/pep-3151/Modules/sha512module.c ============================================================================== --- python/branches/pep-3151/Modules/sha512module.c (original) +++ python/branches/pep-3151/Modules/sha512module.c Sat Feb 26 08:16:32 2011 @@ -291,9 +291,9 @@ /* update the SHA digest */ static void -sha512_update(SHAobject *sha_info, SHA_BYTE *buffer, int count) +sha512_update(SHAobject *sha_info, SHA_BYTE *buffer, Py_ssize_t count) { - int i; + Py_ssize_t i; SHA_INT32 clo; clo = sha_info->count_lo + ((SHA_INT32) count << 3); Modified: python/branches/pep-3151/Modules/signalmodule.c ============================================================================== --- python/branches/pep-3151/Modules/signalmodule.c (original) +++ 
python/branches/pep-3151/Modules/signalmodule.c Sat Feb 26 08:16:32 2011 @@ -4,7 +4,6 @@ /* XXX Signals should be recorded per thread, now we have thread state. */ #include "Python.h" -#include "intrcheck.h" #ifdef MS_WINDOWS #include Modified: python/branches/pep-3151/Modules/socketmodule.c ============================================================================== --- python/branches/pep-3151/Modules/socketmodule.c (original) +++ python/branches/pep-3151/Modules/socketmodule.c Sat Feb 26 08:16:32 2011 @@ -1405,9 +1405,9 @@ { struct sockaddr_hci *addr = (struct sockaddr_hci *)addr_ret; #if defined(__NetBSD__) || defined(__DragonFly__) - char *straddr = PyBytes_AS_STRING(args); + char *straddr = PyBytes_AS_STRING(args); - _BT_HCI_MEMB(addr, family) = AF_BLUETOOTH; + _BT_HCI_MEMB(addr, family) = AF_BLUETOOTH; if (straddr == NULL) { PyErr_SetString(PyExc_IOError, "getsockaddrarg: " "wrong format"); @@ -3406,7 +3406,7 @@ goto finally; af = sa->sa_family; ap = NULL; - al = 0; + /* al = 0; */ switch (af) { case AF_INET: ap = (char *)&((struct sockaddr_in *)sa)->sin_addr; @@ -4021,8 +4021,10 @@ pptr = pbuf; } else if (PyUnicode_Check(pobj)) { pptr = _PyUnicode_AsString(pobj); + if (pptr == NULL) + goto err; } else if (PyBytes_Check(pobj)) { - pptr = PyBytes_AsString(pobj); + pptr = PyBytes_AS_STRING(pobj); } else if (pobj == Py_None) { pptr = (char *)NULL; } else { @@ -4357,6 +4359,7 @@ PySocketModule_APIObject PySocketModuleAPI = { &sock_type, + NULL, NULL }; @@ -4421,6 +4424,7 @@ PyExc_IOError, NULL); if (socket_timeout == NULL) return NULL; + PySocketModuleAPI.timeout_error = socket_timeout; Py_INCREF(socket_timeout); PyModule_AddObject(m, "timeout", socket_timeout); Py_INCREF((PyObject *)&sock_type); Modified: python/branches/pep-3151/Modules/socketmodule.h ============================================================================== --- python/branches/pep-3151/Modules/socketmodule.h (original) +++ python/branches/pep-3151/Modules/socketmodule.h Sat Feb 26 08:16:32 2011 @@ -196,6 +196,7 @@ typedef struct { PyTypeObject *Sock_Type; PyObject *error; + PyObject *timeout_error; } PySocketModule_APIObject; #define PySocketModule_ImportModuleAndAPI() PyCapsule_Import(PySocket_CAPSULE_NAME, 1) Modified: python/branches/pep-3151/Modules/spwdmodule.c ============================================================================== --- python/branches/pep-3151/Modules/spwdmodule.c (original) +++ python/branches/pep-3151/Modules/spwdmodule.c Sat Feb 26 08:16:32 2011 @@ -4,7 +4,6 @@ /* For info also see http://www.unixpapa.com/incnote/passwd.html */ #include "Python.h" -#include "structseq.h" #include #ifdef HAVE_SHADOW_H Modified: python/branches/pep-3151/Modules/symtablemodule.c ============================================================================== --- python/branches/pep-3151/Modules/symtablemodule.c (original) +++ python/branches/pep-3151/Modules/symtablemodule.c Sat Feb 26 08:16:32 2011 @@ -1,7 +1,6 @@ #include "Python.h" #include "code.h" -#include "compile.h" #include "Python-ast.h" #include "symtable.h" Modified: python/branches/pep-3151/Modules/syslogmodule.c ============================================================================== --- python/branches/pep-3151/Modules/syslogmodule.c (original) +++ python/branches/pep-3151/Modules/syslogmodule.c Sat Feb 26 08:16:32 2011 @@ -68,9 +68,9 @@ * is optional. 
*/ - Py_ssize_t argv_len; + Py_ssize_t argv_len, scriptlen; PyObject *scriptobj; - char *atslash; + Py_UNICODE *atslash, *atstart; PyObject *argv = PySys_GetObject("argv"); if (argv == NULL) { @@ -90,13 +90,16 @@ if (!PyUnicode_Check(scriptobj)) { return(NULL); } - if (PyUnicode_GET_SIZE(scriptobj) == 0) { + scriptlen = PyUnicode_GET_SIZE(scriptobj); + if (scriptlen == 0) { return(NULL); } - atslash = strrchr(_PyUnicode_AsString(scriptobj), SEP); + atstart = PyUnicode_AS_UNICODE(scriptobj); + atslash = Py_UNICODE_strrchr(atstart, SEP); if (atslash) { - return(PyUnicode_FromString(atslash + 1)); + return(PyUnicode_FromUnicode(atslash + 1, + scriptlen - (atslash - atstart) - 1)); } else { Py_INCREF(scriptobj); return(scriptobj); @@ -113,6 +116,7 @@ long facility = LOG_USER; PyObject *new_S_ident_o = NULL; static char *keywords[] = {"ident", "logoption", "facility", 0}; + char *ident = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwds, "|Ull:openlog", keywords, &new_S_ident_o, &logopt, &facility)) @@ -120,12 +124,12 @@ if (new_S_ident_o) { Py_INCREF(new_S_ident_o); - } + } /* get sys.argv[0] or NULL if we can't for some reason */ if (!new_S_ident_o) { new_S_ident_o = syslog_get_argv(); - } + } Py_XDECREF(S_ident_o); S_ident_o = new_S_ident_o; @@ -134,8 +138,13 @@ * make a copy, and syslog(3) later uses it. We can't garbagecollect it * If NULL, just let openlog figure it out (probably using C argv[0]). */ + if (S_ident_o) { + ident = _PyUnicode_AsString(S_ident_o); + if (ident == NULL) + return NULL; + } - openlog(S_ident_o ? _PyUnicode_AsString(S_ident_o) : NULL, logopt, facility); + openlog(ident, logopt, facility); S_log_open = 1; Py_INCREF(Py_None); Modified: python/branches/pep-3151/Modules/termios.c ============================================================================== --- python/branches/pep-3151/Modules/termios.c (original) +++ python/branches/pep-3151/Modules/termios.c Sat Feb 26 08:16:32 2011 @@ -347,6 +347,43 @@ #ifdef B230400 {"B230400", B230400}, #endif +#ifdef B460800 + {"B460800", B460800}, +#endif +#ifdef B500000 + {"B500000", B500000}, +#endif +#ifdef B576000 + {"B576000", B576000}, +#endif +#ifdef B921600 + {"B921600", B921600}, +#endif +#ifdef B1000000 + {"B1000000", B1000000}, +#endif +#ifdef B1152000 + {"B1152000", B1152000}, +#endif +#ifdef B1500000 + {"B1500000", B1500000}, +#endif +#ifdef B2000000 + {"B2000000", B2000000}, +#endif +#ifdef B2500000 + {"B2500000", B2500000}, +#endif +#ifdef B3000000 + {"B3000000", B3000000}, +#endif +#ifdef B3500000 + {"B3500000", B3500000}, +#endif +#ifdef B4000000 + {"B4000000", B4000000}, +#endif + #ifdef CBAUDEX {"CBAUDEX", CBAUDEX}, #endif Modified: python/branches/pep-3151/Modules/timemodule.c ============================================================================== --- python/branches/pep-3151/Modules/timemodule.c (original) +++ python/branches/pep-3151/Modules/timemodule.c Sat Feb 26 08:16:32 2011 @@ -1,8 +1,6 @@ - /* Time module */ #include "Python.h" -#include "structseq.h" #include "_time.h" #define TZNAME_ENCODING "utf-8" @@ -219,29 +217,6 @@ } static PyObject * -structtime_totuple(PyObject *t) -{ - PyObject *x = NULL; - unsigned int i; - PyObject *v = PyTuple_New(9); - if (v == NULL) - return NULL; - - for (i=0; i<9; i++) { - x = PyStructSequence_GET_ITEM(t, i); - Py_INCREF(x); - PyTuple_SET_ITEM(v, i, x); - } - - if (PyErr_Occurred()) { - Py_XDECREF(v); - return NULL; - } - - return v; -} - -static PyObject * time_convert(double when, struct tm * (*function)(const time_t *)) { struct tm *p; @@ -322,56 +297,62 @@ 
gettmarg(PyObject *args, struct tm *p) { int y; - PyObject *t = NULL; memset((void *) p, '\0', sizeof(struct tm)); - if (PyTuple_Check(args)) { - t = args; - Py_INCREF(t); - } - else if (Py_TYPE(args) == &StructTimeType) { - t = structtime_totuple(args); - } - else { + if (!PyTuple_Check(args)) { PyErr_SetString(PyExc_TypeError, "Tuple or struct_time argument required"); return 0; } - if (t == NULL || !PyArg_ParseTuple(t, "iiiiiiiii", - &y, - &p->tm_mon, - &p->tm_mday, - &p->tm_hour, - &p->tm_min, - &p->tm_sec, - &p->tm_wday, - &p->tm_yday, - &p->tm_isdst)) { - Py_XDECREF(t); + if (!PyArg_ParseTuple(args, "iiiiiiiii", + &y, &p->tm_mon, &p->tm_mday, + &p->tm_hour, &p->tm_min, &p->tm_sec, + &p->tm_wday, &p->tm_yday, &p->tm_isdst)) return 0; - } - Py_DECREF(t); - if (y < 1900) { + /* If year is specified with less than 4 digits, its interpretation + * depends on the accept2dyear value. + * + * If accept2dyear is true (default), a backward compatibility behavior is + * invoked as follows: + * + * - for 2-digit year, century is guessed according to POSIX rules for + * %y strptime format: 21st century for y < 69, 20th century + * otherwise. A deprecation warning is issued when century + * information is guessed in this way. + * + * - for 3-digit or negative year, a ValueError exception is raised. + * + * If accept2dyear is false (set by the program or as a result of a + * non-empty value assigned to PYTHONY2K environment variable) all year + * values are interpreted as given. + */ + if (y < 1000) { PyObject *accept = PyDict_GetItemString(moddict, "accept2dyear"); - if (accept == NULL || !PyLong_CheckExact(accept) || - !PyObject_IsTrue(accept)) { - PyErr_SetString(PyExc_ValueError, - "year >= 1900 required"); - return 0; + if (accept != NULL) { + int acceptval = PyObject_IsTrue(accept); + if (acceptval == -1) + return 0; + if (acceptval) { + if (0 <= y && y < 69) + y += 2000; + else if (69 <= y && y < 100) + y += 1900; + else { + PyErr_SetString(PyExc_ValueError, + "year out of range"); + return 0; + } + if (PyErr_WarnEx(PyExc_DeprecationWarning, + "Century info guessed for a 2-digit year.", 1) != 0) + return 0; + } } - if (69 <= y && y <= 99) - y += 1900; - else if (0 <= y && y <= 68) - y += 2000; - else { - PyErr_SetString(PyExc_ValueError, - "year out of range"); + else return 0; - } } p->tm_year = y - 1900; p->tm_mon--; @@ -493,6 +474,15 @@ else if (!gettmarg(tup, &buf) || !checktm(&buf)) return NULL; +#if defined(_MSC_VER) || defined(sun) + if (buf.tm_year + 1900 < 1 || 9999 < buf.tm_year + 1900) { + PyErr_Format(PyExc_ValueError, + "strftime() requires year in [1; 9999]", + buf.tm_year + 1900); + return NULL; + } +#endif + /* Normalize tm_isdst just in case someone foolishly implements %Z based on the assumption that tm_isdst falls within the range of [-1, 1] */ @@ -600,19 +590,51 @@ return strptime_result; } + PyDoc_STRVAR(strptime_doc, "strptime(string, format) -> struct_time\n\ \n\ Parse a string to a time tuple according to a format specification.\n\ See the library reference manual for formatting codes (same as strftime())."); +static PyObject * +_asctime(struct tm *timeptr) +{ + /* Inspired by Open Group reference implementation available at + * http://pubs.opengroup.org/onlinepubs/009695399/functions/asctime.html */ + static char wday_name[7][3] = { + "Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat" + }; + static char mon_name[12][3] = { + "Jan", "Feb", "Mar", "Apr", "May", "Jun", + "Jul", "Aug", "Sep", "Oct", "Nov", "Dec" + }; + char buf[20]; /* 'Sun Sep 16 01:03:52\0' */ + int n; + 
+ n = PyOS_snprintf(buf, sizeof(buf), "%.3s %.3s%3d %.2d:%.2d:%.2d", + wday_name[timeptr->tm_wday], + mon_name[timeptr->tm_mon], + timeptr->tm_mday, timeptr->tm_hour, + timeptr->tm_min, timeptr->tm_sec); + /* XXX: since the fields used by snprintf above are validated in checktm, + * the following condition should never trigger. We keep the check because + * historically fixed size buffer used in asctime was the source of + * crashes. */ + if (n + 1 != sizeof(buf)) { + PyErr_SetString(PyExc_ValueError, "unconvertible time"); + return NULL; + } + + return PyUnicode_FromFormat("%s %d", buf, 1900 + timeptr->tm_year); +} static PyObject * time_asctime(PyObject *self, PyObject *args) { PyObject *tup = NULL; struct tm buf; - char *p; + if (!PyArg_UnpackTuple(args, "asctime", 0, 1, &tup)) return NULL; if (tup == NULL) { @@ -620,10 +642,7 @@ buf = *localtime(&tt); } else if (!gettmarg(tup, &buf) || !checktm(&buf)) return NULL; - p = asctime(&buf); - if (p[24] == '\n') - p[24] = '\0'; - return PyUnicode_FromString(p); + return _asctime(&buf); } PyDoc_STRVAR(asctime_doc, @@ -638,7 +657,7 @@ { PyObject *ot = NULL; time_t tt; - char *p; + struct tm *timeptr; if (!PyArg_UnpackTuple(args, "ctime", 0, 1, &ot)) return NULL; @@ -652,14 +671,12 @@ if (tt == (time_t)-1 && PyErr_Occurred()) return NULL; } - p = ctime(&tt); - if (p == NULL) { + timeptr = localtime(&tt); + if (timeptr == NULL) { PyErr_SetString(PyExc_ValueError, "unconvertible time"); return NULL; } - if (p[24] == '\n') - p[24] = '\0'; - return PyUnicode_FromString(p); + return _asctime(timeptr); } PyDoc_STRVAR(ctime_doc, @@ -677,8 +694,11 @@ time_t tt; if (!gettmarg(tup, &buf)) return NULL; + buf.tm_wday = -1; /* sentinel; original value ignored */ tt = mktime(&buf); - if (tt == (time_t)(-1)) { + /* Return value of -1 does not necessarily mean an error, but tm_wday + * cannot remain set to -1 if mktime succedded. */ + if (tt == (time_t)(-1) && buf.tm_wday == -1) { PyErr_SetString(PyExc_OverflowError, "mktime argument out of range"); return NULL; @@ -716,7 +736,7 @@ } PyDoc_STRVAR(tzset_doc, -"tzset(zone)\n\ +"tzset()\n\ \n\ Initialize, or reinitialize, the local timezone to the value stored in\n\ os.environ['TZ']. The TZ environment variable should be specified in\n\ Modified: python/branches/pep-3151/Modules/unicodedata.c ============================================================================== --- python/branches/pep-3151/Modules/unicodedata.c (original) +++ python/branches/pep-3151/Modules/unicodedata.c Sat Feb 26 08:16:32 2011 @@ -403,7 +403,8 @@ { PyUnicodeObject *v; char decomp[256]; - int code, index, count, i; + int code, index, count; + size_t i; unsigned int prefix_index; Py_UCS4 c; @@ -450,15 +451,12 @@ while (count-- > 0) { if (i) decomp[i++] = ' '; - assert((size_t)i < sizeof(decomp)); + assert(i < sizeof(decomp)); PyOS_snprintf(decomp + i, sizeof(decomp) - i, "%04X", decomp_data[++index]); i += strlen(decomp + i); } - - decomp[i] = '\0'; - - return PyUnicode_FromString(decomp); + return PyUnicode_FromStringAndSize(decomp, i); } static void @@ -684,10 +682,14 @@ comb = 0; while (i1 < end) { int comb1 = _getrecord_ex(*i1)->combining; - if (comb && (comb1 == 0 || comb == comb1)) { - /* Character is blocked. */ - i1++; - continue; + if (comb) { + if (comb1 == 0) + break; + if (comb >= comb1) { + /* Character is blocked. */ + i1++; + continue; + } } l = find_nfc_index(self, nfc_last, *i1); /* *i1 cannot be combined with *i. If *i1 @@ -711,6 +713,7 @@ /* Replace the original character. 
*/ *i = code; /* Mark the second character unused. */ + assert(cskipped < 20); skipped[cskipped++] = i1; i1++; f = find_nfc_index(self, nfc_first, *i); @@ -866,13 +869,16 @@ { 0, 0, "H" } }; +/* These ranges need to match makeunicodedata.py:cjk_ranges. */ static int is_unified_ideograph(Py_UCS4 code) { - return ( - (0x3400 <= code && code <= 0x4DB5) || /* CJK Ideograph Extension A */ - (0x4E00 <= code && code <= 0x9FBB) || /* CJK Ideograph */ - (0x20000 <= code && code <= 0x2A6D6));/* CJK Ideograph Extension B */ + return + (0x3400 <= code && code <= 0x4DB5) || /* CJK Ideograph Extension A */ + (0x4E00 <= code && code <= 0x9FCB) || /* CJK Ideograph */ + (0x20000 <= code && code <= 0x2A6D6) || /* CJK Ideograph Extension B */ + (0x2A700 <= code && code <= 0x2B734) || /* CJK Ideograph Extension C */ + (0x2B740 <= code && code <= 0x2B81D); /* CJK Ideograph Extension D */ } static int Modified: python/branches/pep-3151/Modules/zipimport.c ============================================================================== --- python/branches/pep-3151/Modules/zipimport.c (original) +++ python/branches/pep-3151/Modules/zipimport.c Sat Feb 26 08:16:32 2011 @@ -725,6 +725,7 @@ long arc_offset; /* offset from beginning of file to start of zip-archive */ PyObject *pathobj; const char *charset; + int bootstrap; if (PyUnicode_GET_SIZE(archive_obj) > MAXPATHLEN) { PyErr_SetString(PyExc_OverflowError, @@ -801,13 +802,30 @@ *p = 0; /* Add terminating null byte */ header_offset += header_size; + bootstrap = 0; if (flags & 0x0800) charset = "utf-8"; + else if (!PyThreadState_GET()->interp->codecs_initialized) { + /* During bootstrap, we may need to load the encodings + package from a ZIP file. But the cp437 encoding is implemented + in Python in the encodings package. + + Break out of this dependency by assuming that the path to + the encodings module is ASCII-only. 
*/ + charset = "ascii"; + bootstrap = 1; + } else charset = "cp437"; nameobj = PyUnicode_Decode(name, name_size, charset, NULL); - if (nameobj == NULL) + if (nameobj == NULL) { + if (bootstrap) + PyErr_Format(PyExc_NotImplementedError, + "bootstrap issue: python%i%i.zip contains non-ASCII " + "filenames without the unicode flag", + PY_MAJOR_VERSION, PY_MINOR_VERSION); goto error; + } Py_UNICODE_strncpy(path + length + 1, PyUnicode_AS_UNICODE(nameobj), MAXPATHLEN - length - 1); pathobj = PyUnicode_FromUnicode(path, Py_UNICODE_strlen(path)); Modified: python/branches/pep-3151/Modules/zlibmodule.c ============================================================================== --- python/branches/pep-3151/Modules/zlibmodule.c (original) +++ python/branches/pep-3151/Modules/zlibmodule.c Sat Feb 26 08:16:32 2011 @@ -117,14 +117,21 @@ PyObject *ReturnVal = NULL; Py_buffer pinput; Byte *input, *output; - int length, level=Z_DEFAULT_COMPRESSION, err; + unsigned int length; + int level=Z_DEFAULT_COMPRESSION, err; z_stream zst; /* require Python string object, optional 'level' arg */ if (!PyArg_ParseTuple(args, "y*|i:compress", &pinput, &level)) return NULL; - input = pinput.buf; + + if (pinput.len > UINT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "size does not fit in an unsigned int"); + return NULL; + } length = pinput.len; + input = pinput.buf; zst.avail_out = length + length/1000 + 12 + 1; @@ -199,7 +206,8 @@ PyObject *result_str; Py_buffer pinput; Byte *input; - int length, err; + unsigned int length; + int err; int wsize=DEF_WBITS; Py_ssize_t r_strlen=DEFAULTALLOC; z_stream zst; @@ -207,8 +215,14 @@ if (!PyArg_ParseTuple(args, "y*|in:decompress", &pinput, &wsize, &r_strlen)) return NULL; - input = pinput.buf; + + if (pinput.len > UINT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "size does not fit in an unsigned int"); + return NULL; + } length = pinput.len; + input = pinput.buf; if (r_strlen <= 0) r_strlen = 1; @@ -931,8 +945,18 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { + unsigned char *buf = pbuf.buf; + Py_ssize_t len = pbuf.len; + Py_BEGIN_ALLOW_THREADS - adler32val = adler32(adler32val, pbuf.buf, pbuf.len); + /* Avoid truncation of length for very large buffers. adler32() takes + length as an unsigned int, which may be narrower than Py_ssize_t. */ + while (len > (size_t) UINT_MAX) { + adler32val = adler32(adler32val, buf, UINT_MAX); + buf += (size_t) UINT_MAX; + len -= (size_t) UINT_MAX; + } + adler32val = adler32(adler32val, buf, len); Py_END_ALLOW_THREADS } else { adler32val = adler32(adler32val, pbuf.buf, pbuf.len); @@ -959,8 +983,18 @@ /* Releasing the GIL for very small buffers is inefficient and may lower performance */ if (pbuf.len > 1024*5) { + unsigned char *buf = pbuf.buf; + Py_ssize_t len = pbuf.len; + Py_BEGIN_ALLOW_THREADS - signed_val = crc32(crc32val, pbuf.buf, pbuf.len); + /* Avoid truncation of length for very large buffers. crc32() takes + length as an unsigned int, which may be narrower than Py_ssize_t. 
*/ + while (len > (size_t) UINT_MAX) { + crc32val = crc32(crc32val, buf, UINT_MAX); + buf += (size_t) UINT_MAX; + len -= (size_t) UINT_MAX; + } + signed_val = crc32(crc32val, buf, len); Py_END_ALLOW_THREADS } else { signed_val = crc32(crc32val, pbuf.buf, pbuf.len); Modified: python/branches/pep-3151/Objects/abstract.c ============================================================================== --- python/branches/pep-3151/Objects/abstract.c (original) +++ python/branches/pep-3151/Objects/abstract.c Sat Feb 26 08:16:32 2011 @@ -2500,7 +2500,10 @@ if (retval == 0) { PyObject *c = PyObject_GetAttr(inst, __class__); if (c == NULL) { - PyErr_Clear(); + if (PyErr_ExceptionMatches(PyExc_AttributeError)) + PyErr_Clear(); + else + retval = -1; } else { if (c != (PyObject *)(inst->ob_type) && @@ -2518,8 +2521,10 @@ return -1; icls = PyObject_GetAttr(inst, __class__); if (icls == NULL) { - PyErr_Clear(); - retval = 0; + if (PyErr_ExceptionMatches(PyExc_AttributeError)) + PyErr_Clear(); + else + retval = -1; } else { retval = abstract_issubclass(icls, cls); Modified: python/branches/pep-3151/Objects/bytearrayobject.c ============================================================================== --- python/branches/pep-3151/Objects/bytearrayobject.c (original) +++ python/branches/pep-3151/Objects/bytearrayobject.c Sat Feb 26 08:16:32 2011 @@ -389,7 +389,7 @@ } else if (PySlice_Check(index)) { Py_ssize_t start, stop, step, slicelength, cur, i; - if (PySlice_GetIndicesEx((PySliceObject *)index, + if (PySlice_GetIndicesEx(index, PyByteArray_GET_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return NULL; @@ -573,7 +573,7 @@ } } else if (PySlice_Check(index)) { - if (PySlice_GetIndicesEx((PySliceObject *)index, + if (PySlice_GetIndicesEx(index, PyByteArray_GET_SIZE(self), &start, &stop, &step, &slicelen) < 0) { return -1; @@ -589,7 +589,7 @@ needed = 0; } else if (values == (PyObject *)self || !PyByteArray_Check(values)) { - /* Make a copy an call this function recursively */ + /* Make a copy and call this function recursively */ int err; values = PyByteArray_FromObject(values); if (values == NULL) @@ -2439,7 +2439,7 @@ static PyObject * bytearray_rstrip(PyByteArrayObject *self, PyObject *args) { - Py_ssize_t left, right, mysize, argsize; + Py_ssize_t right, mysize, argsize; void *myptr, *argptr; PyObject *arg = Py_None; Py_buffer varg; @@ -2457,11 +2457,10 @@ } myptr = self->ob_bytes; mysize = Py_SIZE(self); - left = 0; right = rstrip_helper(myptr, mysize, argptr, argsize); if (arg != Py_None) PyBuffer_Release(&varg); - return PyByteArray_FromStringAndSize(self->ob_bytes + left, right - left); + return PyByteArray_FromStringAndSize(self->ob_bytes, right); } PyDoc_STRVAR(decode_doc, Modified: python/branches/pep-3151/Objects/bytesobject.c ============================================================================== --- python/branches/pep-3151/Objects/bytesobject.c (original) +++ python/branches/pep-3151/Objects/bytesobject.c Sat Feb 26 08:16:32 2011 @@ -911,7 +911,7 @@ char* result_buf; PyObject* result; - if (PySlice_GetIndicesEx((PySliceObject*)item, + if (PySlice_GetIndicesEx(item, PyBytes_GET_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return NULL; Modified: python/branches/pep-3151/Objects/codeobject.c ============================================================================== --- python/branches/pep-3151/Objects/codeobject.c (original) +++ python/branches/pep-3151/Objects/codeobject.c Sat Feb 26 08:16:32 2011 @@ -347,11 +347,11 @@ lineno = -1; if (co->co_filename && 
PyUnicode_Check(co->co_filename)) { return PyUnicode_FromFormat( - "", + "", co->co_name, co, co->co_filename, lineno); } else { return PyUnicode_FromFormat( - "", + "", co->co_name, co, lineno); } } @@ -492,7 +492,7 @@ int PyCode_Addr2Line(PyCodeObject *co, int addrq) { - int size = PyBytes_Size(co->co_lnotab) / 2; + Py_ssize_t size = PyBytes_Size(co->co_lnotab) / 2; unsigned char *p = (unsigned char*)PyBytes_AsString(co->co_lnotab); int line = co->co_firstlineno; int addr = 0; @@ -510,7 +510,8 @@ int _PyCode_CheckLineNumber(PyCodeObject* co, int lasti, PyAddrPair *bounds) { - int size, addr, line; + Py_ssize_t size; + int addr, line; unsigned char* p; p = (unsigned char*)PyBytes_AS_STRING(co->co_lnotab); Modified: python/branches/pep-3151/Objects/complexobject.c ============================================================================== --- python/branches/pep-3151/Objects/complexobject.c (original) +++ python/branches/pep-3151/Objects/complexobject.c Sat Feb 26 08:16:32 2011 @@ -324,10 +324,11 @@ op->ob_type->tp_free(op); } - static PyObject * -complex_format(PyComplexObject *v, int precision, char format_code) +complex_repr(PyComplexObject *v) { + int precision = 0; + char format_code = 'r'; PyObject *result = NULL; Py_ssize_t len; @@ -344,6 +345,8 @@ char *tail = ""; if (v->cval.real == 0. && copysign(1.0, v->cval.real)==1.0) { + /* Real part is +0: just output the imaginary part and do not + include parens. */ re = ""; im = PyOS_double_to_string(v->cval.imag, format_code, precision, 0, NULL); @@ -352,7 +355,8 @@ goto done; } } else { - /* Format imaginary part with sign, real part without */ + /* Format imaginary part with sign, real part without. Include + parens in the result. */ pre = PyOS_double_to_string(v->cval.real, format_code, precision, 0, NULL); if (!pre) { @@ -371,7 +375,7 @@ tail = ")"; } /* Alloc the final buffer. Add one for the "j" in the format string, - and one for the trailing zero. */ + and one for the trailing zero byte. 
*/ len = strlen(lead) + strlen(re) + strlen(im) + strlen(tail) + 2; buf = PyMem_Malloc(len); if (!buf) { @@ -388,12 +392,6 @@ return result; } -static PyObject * -complex_repr(PyComplexObject *v) -{ - return complex_format(v, 0, 'r'); -} - static Py_hash_t complex_hash(PyComplexObject *v) { @@ -766,24 +764,30 @@ char *end; double x=0.0, y=0.0, z; int got_bracket=0; - char *s_buffer = NULL; + PyObject *s_buffer = NULL; Py_ssize_t len; if (PyUnicode_Check(v)) { - s_buffer = (char *)PyMem_MALLOC(PyUnicode_GET_SIZE(v) + 1); + Py_ssize_t i, buflen = PyUnicode_GET_SIZE(v); + Py_UNICODE *bufptr; + s_buffer = PyUnicode_TransformDecimalToASCII( + PyUnicode_AS_UNICODE(v), buflen); if (s_buffer == NULL) - return PyErr_NoMemory(); - if (PyUnicode_EncodeDecimal(PyUnicode_AS_UNICODE(v), - PyUnicode_GET_SIZE(v), - s_buffer, - NULL)) + return NULL; + /* Replace non-ASCII whitespace with ' ' */ + bufptr = PyUnicode_AS_UNICODE(s_buffer); + for (i = 0; i < buflen; i++) { + Py_UNICODE ch = bufptr[i]; + if (ch > 127 && Py_UNICODE_ISSPACE(ch)) + bufptr[i] = ' '; + } + s = _PyUnicode_AsStringAndSize(s_buffer, &len); + if (s == NULL) goto error; - s = s_buffer; - len = strlen(s); } else if (PyObject_AsCharBuffer(v, &s, &len)) { PyErr_SetString(PyExc_TypeError, - "complex() arg is not a string"); + "complex() argument must be a string or a number"); return NULL; } @@ -894,16 +898,14 @@ if (s-start != len) goto parse_error; - if (s_buffer) - PyMem_FREE(s_buffer); + Py_XDECREF(s_buffer); return complex_subtype_from_doubles(type, x, y); parse_error: PyErr_SetString(PyExc_ValueError, "complex() arg is a malformed string"); error: - if (s_buffer) - PyMem_FREE(s_buffer); + Py_XDECREF(s_buffer); return NULL; } Modified: python/branches/pep-3151/Objects/descrobject.c ============================================================================== --- python/branches/pep-3151/Objects/descrobject.c (original) +++ python/branches/pep-3151/Objects/descrobject.c Sat Feb 26 08:16:32 2011 @@ -710,19 +710,19 @@ static PyObject * proxy_keys(proxyobject *pp) { - return PyMapping_Keys(pp->dict); + return PyObject_CallMethod(pp->dict, "keys", NULL); } static PyObject * proxy_values(proxyobject *pp) { - return PyMapping_Values(pp->dict); + return PyObject_CallMethod(pp->dict, "values", NULL); } static PyObject * proxy_items(proxyobject *pp) { - return PyMapping_Items(pp->dict); + return PyObject_CallMethod(pp->dict, "items", NULL); } static PyObject * @@ -766,6 +766,12 @@ return PyObject_Str(pp->dict); } +static PyObject * +proxy_repr(proxyobject *pp) +{ + return PyUnicode_FromFormat("dict_proxy(%R)", pp->dict); +} + static int proxy_traverse(PyObject *self, visitproc visit, void *arg) { @@ -791,7 +797,7 @@ 0, /* tp_getattr */ 0, /* tp_setattr */ 0, /* tp_reserved */ - 0, /* tp_repr */ + (reprfunc)proxy_repr, /* tp_repr */ 0, /* tp_as_number */ &proxy_as_sequence, /* tp_as_sequence */ &proxy_as_mapping, /* tp_as_mapping */ @@ -1190,7 +1196,7 @@ PyErr_SetString(PyExc_AttributeError, "unreadable attribute"); return NULL; } - return PyObject_CallFunction(gs->prop_get, "(O)", obj); + return PyObject_CallFunctionObjArgs(gs->prop_get, obj, NULL); } static int @@ -1211,9 +1217,9 @@ return -1; } if (value == NULL) - res = PyObject_CallFunction(func, "(O)", obj); + res = PyObject_CallFunctionObjArgs(func, obj, NULL); else - res = PyObject_CallFunction(func, "(OO)", obj, value); + res = PyObject_CallFunctionObjArgs(func, obj, value, NULL); if (res == NULL) return -1; Py_DECREF(res); Modified: python/branches/pep-3151/Objects/dictobject.c 
============================================================================== --- python/branches/pep-3151/Objects/dictobject.c (original) +++ python/branches/pep-3151/Objects/dictobject.c Sat Feb 26 08:16:32 2011 @@ -454,7 +454,7 @@ { Py_ssize_t pos = 0; PyObject *key, *value; - assert(PyDict_CheckExact(dict)); + assert(PyDict_Check(dict)); /* Shortcut */ if (((PyDictObject *)dict)->ma_lookup == lookdict_unicode) return 1; Modified: python/branches/pep-3151/Objects/fileobject.c ============================================================================== --- python/branches/pep-3151/Objects/fileobject.c (original) +++ python/branches/pep-3151/Objects/fileobject.c Sat Feb 26 08:16:32 2011 @@ -297,8 +297,8 @@ *p++ = c; if (c == '\n') break; } - if ( c == EOF && skipnextlf ) - newlinetypes |= NEWLINE_CR; + /* if ( c == EOF && skipnextlf ) + newlinetypes |= NEWLINE_CR; */ FUNLOCKFILE(stream); *p = '\0'; if ( skipnextlf ) { @@ -344,7 +344,7 @@ } static int -fileio_init(PyObject *self, PyObject *args, PyObject *kwds) +stdprinter_init(PyObject *self, PyObject *args, PyObject *kwds) { PyErr_SetString(PyExc_TypeError, "cannot create 'stderrprinter' instances"); @@ -390,7 +390,13 @@ Py_BEGIN_ALLOW_THREADS errno = 0; +#if defined(MS_WIN64) || defined(MS_WINDOWS) + if (n > INT_MAX) + n = INT_MAX; + n = write(self->fd, c, (int)n); +#else n = write(self->fd, c, n); +#endif Py_END_ALLOW_THREADS if (n < 0) { @@ -509,7 +515,7 @@ 0, /* tp_descr_get */ 0, /* tp_descr_set */ 0, /* tp_dictoffset */ - fileio_init, /* tp_init */ + stdprinter_init, /* tp_init */ PyType_GenericAlloc, /* tp_alloc */ stdprinter_new, /* tp_new */ PyObject_Del, /* tp_free */ Modified: python/branches/pep-3151/Objects/floatobject.c ============================================================================== --- python/branches/pep-3151/Objects/floatobject.c (original) +++ python/branches/pep-3151/Objects/floatobject.c Sat Feb 26 08:16:32 2011 @@ -5,7 +5,6 @@ for any kind of float exception without losing portability. */ #include "Python.h" -#include "structseq.h" #include #include @@ -175,22 +174,29 @@ { const char *s, *last, *end; double x; - char buffer[256]; /* for errors */ - char *s_buffer = NULL; + PyObject *s_buffer = NULL; Py_ssize_t len; PyObject *result = NULL; if (PyUnicode_Check(v)) { - s_buffer = (char *)PyMem_MALLOC(PyUnicode_GET_SIZE(v)+1); + Py_ssize_t i, buflen = PyUnicode_GET_SIZE(v); + Py_UNICODE *bufptr; + s_buffer = PyUnicode_TransformDecimalToASCII( + PyUnicode_AS_UNICODE(v), buflen); if (s_buffer == NULL) - return PyErr_NoMemory(); - if (PyUnicode_EncodeDecimal(PyUnicode_AS_UNICODE(v), - PyUnicode_GET_SIZE(v), - s_buffer, - NULL)) - goto error; - s = s_buffer; - len = strlen(s); + return NULL; + /* Replace non-ASCII whitespace with ' ' */ + bufptr = PyUnicode_AS_UNICODE(s_buffer); + for (i = 0; i < buflen; i++) { + Py_UNICODE ch = bufptr[i]; + if (ch > 127 && Py_UNICODE_ISSPACE(ch)) + bufptr[i] = ' '; + } + s = _PyUnicode_AsStringAndSize(s_buffer, &len); + if (s == NULL) { + Py_DECREF(s_buffer); + return NULL; + } } else if (PyObject_AsCharBuffer(v, &s, &len)) { PyErr_SetString(PyExc_TypeError, @@ -198,29 +204,27 @@ return NULL; } last = s + len; - - while (Py_ISSPACE(*s)) + /* strip space */ + while (s < last && Py_ISSPACE(*s)) s++; + while (s < last - 1 && Py_ISSPACE(last[-1])) + last--; /* We don't care about overflow or underflow. If the platform * supports them, infinities and signed zeroes (on underflow) are * fine. 
*/ x = PyOS_string_to_double(s, (char **)&end, NULL); - if (x == -1.0 && PyErr_Occurred()) - goto error; - while (Py_ISSPACE(*end)) - end++; - if (end == last) - result = PyFloat_FromDouble(x); - else { - PyOS_snprintf(buffer, sizeof(buffer), - "invalid literal for float(): %.200s", s); - PyErr_SetString(PyExc_ValueError, buffer); + if (end != last) { + PyErr_Format(PyExc_ValueError, + "could not convert string to float: " + "%R", v); result = NULL; } + else if (x == -1.0 && PyErr_Occurred()) + result = NULL; + else + result = PyFloat_FromDouble(x); - error: - if (s_buffer) - PyMem_FREE(s_buffer); + Py_XDECREF(s_buffer); return result; } @@ -570,13 +574,11 @@ double a,b; CONVERT_TO_DOUBLE(v, a); CONVERT_TO_DOUBLE(w, b); -#ifdef Py_NAN if (b == 0.0) { PyErr_SetString(PyExc_ZeroDivisionError, "float division by zero"); return NULL; } -#endif PyFPE_START_PROTECT("divide", return 0) a = a / b; PyFPE_END_PROTECT(a) @@ -590,19 +592,24 @@ double mod; CONVERT_TO_DOUBLE(v, vx); CONVERT_TO_DOUBLE(w, wx); -#ifdef Py_NAN if (wx == 0.0) { PyErr_SetString(PyExc_ZeroDivisionError, "float modulo"); return NULL; } -#endif PyFPE_START_PROTECT("modulo", return 0) mod = fmod(vx, wx); - /* note: checking mod*wx < 0 is incorrect -- underflows to - 0 if wx < sqrt(smallest nonzero double) */ - if (mod && ((wx < 0) != (mod < 0))) { - mod += wx; + if (mod) { + /* ensure the remainder has the same sign as the denominator */ + if ((wx < 0) != (mod < 0)) { + mod += wx; + } + } + else { + /* the remainder is zero, and in the presence of signed zeroes + fmod returns different results across platforms; ensure + it has the same sign as the denominator. */ + mod = copysign(0.0, wx); } PyFPE_END_PROTECT(mod) return PyFloat_FromDouble(mod); @@ -638,11 +645,8 @@ else { /* the remainder is zero, and in the presence of signed zeroes fmod returns different results across platforms; ensure - it has the same sign as the denominator; we'd like to do - "mod = wx * 0.0", but that may get optimized away */ - mod *= mod; /* hide "mod = +0" from optimizer */ - if (wx < 0.0) - mod = -mod; + it has the same sign as the denominator. 
*/ + mod = copysign(0.0, wx); } /* snap quotient to nearest integral value */ if (div) { @@ -652,8 +656,7 @@ } else { /* div is zero - get the same sign as the true quotient */ - div *= div; /* hide "div = +0" from optimizers */ - floordiv = div * vx / wx; /* zero w/ sign of vx/wx */ + floordiv = copysign(0.0, vx / wx); /* zero w/ sign of vx/wx */ } PyFPE_END_PROTECT(floordiv) return Py_BuildValue("(dd)", floordiv, mod); @@ -1487,13 +1490,11 @@ "Cannot pass infinity to float.as_integer_ratio."); return NULL; } -#ifdef Py_NAN if (Py_IS_NAN(self)) { PyErr_SetString(PyExc_ValueError, "Cannot pass NaN to float.as_integer_ratio."); return NULL; } -#endif PyFPE_START_PROTECT("as_integer_ratio", goto error); float_part = frexp(self, &exponent); /* self == float_part * 2**exponent exactly */ @@ -2244,7 +2245,7 @@ /* Eighth byte */ *p = flo & 0xFF; - p += incr; + /* p += incr; */ /* Done */ return 0; Modified: python/branches/pep-3151/Objects/funcobject.c ============================================================================== --- python/branches/pep-3151/Objects/funcobject.c (original) +++ python/branches/pep-3151/Objects/funcobject.c Sat Feb 26 08:16:32 2011 @@ -3,7 +3,6 @@ #include "Python.h" #include "code.h" -#include "eval.h" #include "structmember.h" PyObject * @@ -628,7 +627,7 @@ } result = PyEval_EvalCodeEx( - (PyCodeObject *)PyFunction_GET_CODE(func), + PyFunction_GET_CODE(func), PyFunction_GET_GLOBALS(func), (PyObject *)NULL, &PyTuple_GET_ITEM(arg, 0), PyTuple_GET_SIZE(arg), k, nk, d, nd, Modified: python/branches/pep-3151/Objects/genobject.c ============================================================================== --- python/branches/pep-3151/Objects/genobject.c (original) +++ python/branches/pep-3151/Objects/genobject.c Sat Feb 26 08:16:32 2011 @@ -2,8 +2,6 @@ #include "Python.h" #include "frameobject.h" -#include "genobject.h" -#include "ceval.h" #include "structmember.h" #include "opcode.h" Modified: python/branches/pep-3151/Objects/listobject.c ============================================================================== --- python/branches/pep-3151/Objects/listobject.c (original) +++ python/branches/pep-3151/Objects/listobject.c Sat Feb 26 08:16:32 2011 @@ -747,6 +747,19 @@ } static PyObject * +listclear(PyListObject *self) +{ + list_clear(self); + Py_RETURN_NONE; +} + +static PyObject * +listcopy(PyListObject *self) +{ + return list_slice(self, 0, Py_SIZE(self)); +} + +static PyObject * listappend(PyListObject *self, PyObject *v) { if (app1(self, v) == 0) @@ -940,6 +953,71 @@ * pieces to this algorithm; read listsort.txt for overviews and details. */ +/* A sortslice contains a pointer to an array of keys and a pointer to + * an array of corresponding values. In other words, keys[i] + * corresponds with values[i]. If values == NULL, then the keys are + * also the values. + * + * Several convenience routines are provided here, so that keys and + * values are always moved in sync. 
+ */ + +typedef struct { + PyObject **keys; + PyObject **values; +} sortslice; + +Py_LOCAL_INLINE(void) +sortslice_copy(sortslice *s1, Py_ssize_t i, sortslice *s2, Py_ssize_t j) +{ + s1->keys[i] = s2->keys[j]; + if (s1->values != NULL) + s1->values[i] = s2->values[j]; +} + +Py_LOCAL_INLINE(void) +sortslice_copy_incr(sortslice *dst, sortslice *src) +{ + *dst->keys++ = *src->keys++; + if (dst->values != NULL) + *dst->values++ = *src->values++; +} + +Py_LOCAL_INLINE(void) +sortslice_copy_decr(sortslice *dst, sortslice *src) +{ + *dst->keys-- = *src->keys--; + if (dst->values != NULL) + *dst->values-- = *src->values--; +} + + +Py_LOCAL_INLINE(void) +sortslice_memcpy(sortslice *s1, Py_ssize_t i, sortslice *s2, Py_ssize_t j, + Py_ssize_t n) +{ + memcpy(&s1->keys[i], &s2->keys[j], sizeof(PyObject *) * n); + if (s1->values != NULL) + memcpy(&s1->values[i], &s2->values[j], sizeof(PyObject *) * n); +} + +Py_LOCAL_INLINE(void) +sortslice_memmove(sortslice *s1, Py_ssize_t i, sortslice *s2, Py_ssize_t j, + Py_ssize_t n) +{ + memmove(&s1->keys[i], &s2->keys[j], sizeof(PyObject *) * n); + if (s1->values != NULL) + memmove(&s1->values[i], &s2->values[j], sizeof(PyObject *) * n); +} + +Py_LOCAL_INLINE(void) +sortslice_advance(sortslice *slice, Py_ssize_t n) +{ + slice->keys += n; + if (slice->values != NULL) + slice->values += n; +} + /* Comparison function: PyObject_RichCompareBool with Py_LT. * Returns -1 on error, 1 if x < y, 0 if x >= y. */ @@ -965,19 +1043,19 @@ the input (nothing is lost or duplicated). */ static int -binarysort(PyObject **lo, PyObject **hi, PyObject **start) +binarysort(sortslice lo, PyObject **hi, PyObject **start) { register Py_ssize_t k; register PyObject **l, **p, **r; register PyObject *pivot; - assert(lo <= start && start <= hi); + assert(lo.keys <= start && start <= hi); /* assert [lo, start) is sorted */ - if (lo == start) + if (lo.keys == start) ++start; for (; start < hi; ++start) { /* set l to where *start belongs */ - l = lo; + l = lo.keys; r = start; pivot = *r; /* Invariants: @@ -1004,6 +1082,15 @@ for (p = start; p > l; --p) *p = *(p-1); *l = pivot; + if (lo.values != NULL) { + Py_ssize_t offset = lo.values - lo.keys; + p = start + offset; + pivot = *p; + l += offset; + for (p = start + offset; p > l; --p) + *p = *(p-1); + *l = pivot; + } } return 0; @@ -1272,7 +1359,7 @@ * a convenient way to pass state around among the helper functions. */ struct s_slice { - PyObject **base; + sortslice base; Py_ssize_t len; }; @@ -1286,7 +1373,7 @@ /* 'a' is temp storage to help with merges. It contains room for * alloced entries. */ - PyObject **a; /* may point to temparray below */ + sortslice a; /* may point to temparray below */ Py_ssize_t alloced; /* A stack of n pending runs yet to be merged. Run #i starts at @@ -1307,11 +1394,29 @@ /* Conceptually a MergeState's constructor. */ static void -merge_init(MergeState *ms) +merge_init(MergeState *ms, Py_ssize_t list_size, int has_keyfunc) { assert(ms != NULL); - ms->a = ms->temparray; - ms->alloced = MERGESTATE_TEMP_SIZE; + if (has_keyfunc) { + /* The temporary space for merging will need at most half the list + * size rounded up. Use the minimum possible space so we can use the + * rest of temparray for other things. In particular, if there is + * enough extra space, listsort() will use it to store the keys. + */ + ms->alloced = (list_size + 1) / 2; + + /* ms->alloced describes how many keys will be stored at + ms->temparray, but we also need to store the values. Hence, + ms->alloced is capped at half of MERGESTATE_TEMP_SIZE. 
*/ + if (MERGESTATE_TEMP_SIZE / 2 < ms->alloced) + ms->alloced = MERGESTATE_TEMP_SIZE / 2; + ms->a.values = &ms->temparray[ms->alloced]; + } + else { + ms->alloced = MERGESTATE_TEMP_SIZE; + ms->a.values = NULL; + } + ms->a.keys = ms->temparray; ms->n = 0; ms->min_gallop = MIN_GALLOP; } @@ -1324,10 +1429,8 @@ merge_freemem(MergeState *ms) { assert(ms != NULL); - if (ms->a != ms->temparray) - PyMem_Free(ms->a); - ms->a = ms->temparray; - ms->alloced = MERGESTATE_TEMP_SIZE; + if (ms->a.keys != ms->temparray) + PyMem_Free(ms->a.keys); } /* Ensure enough temp memory for 'need' array slots is available. @@ -1336,52 +1439,60 @@ static int merge_getmem(MergeState *ms, Py_ssize_t need) { + int multiplier; + assert(ms != NULL); if (need <= ms->alloced) return 0; + + multiplier = ms->a.values != NULL ? 2 : 1; + /* Don't realloc! That can cost cycles to copy the old data, but * we don't care what's in the block. */ merge_freemem(ms); - if ((size_t)need > PY_SSIZE_T_MAX / sizeof(PyObject*)) { + if ((size_t)need > PY_SSIZE_T_MAX / sizeof(PyObject*) / multiplier) { PyErr_NoMemory(); return -1; } - ms->a = (PyObject **)PyMem_Malloc(need * sizeof(PyObject*)); - if (ms->a) { + ms->a.keys = (PyObject**)PyMem_Malloc(multiplier * need + * sizeof(PyObject *)); + if (ms->a.keys != NULL) { ms->alloced = need; + if (ms->a.values != NULL) + ms->a.values = &ms->a.keys[need]; return 0; } PyErr_NoMemory(); - merge_freemem(ms); /* reset to sane state */ return -1; } #define MERGE_GETMEM(MS, NEED) ((NEED) <= (MS)->alloced ? 0 : \ merge_getmem(MS, NEED)) -/* Merge the na elements starting at pa with the nb elements starting at pb - * in a stable way, in-place. na and nb must be > 0, and pa + na == pb. - * Must also have that *pb < *pa, that pa[na-1] belongs at the end of the - * merge, and should have na <= nb. See listsort.txt for more info. - * Return 0 if successful, -1 if error. +/* Merge the na elements starting at ssa with the nb elements starting at + * ssb.keys = ssa.keys + na in a stable way, in-place. na and nb must be > 0. + * Must also have that ssa.keys[na-1] belongs at the end of the merge, and + * should have na <= nb. See listsort.txt for more info. Return 0 if + * successful, -1 if error. 
*/ static Py_ssize_t -merge_lo(MergeState *ms, PyObject **pa, Py_ssize_t na, - PyObject **pb, Py_ssize_t nb) +merge_lo(MergeState *ms, sortslice ssa, Py_ssize_t na, + sortslice ssb, Py_ssize_t nb) { Py_ssize_t k; - PyObject **dest; + sortslice dest; int result = -1; /* guilty until proved innocent */ Py_ssize_t min_gallop; - assert(ms && pa && pb && na > 0 && nb > 0 && pa + na == pb); + assert(ms && ssa.keys && ssb.keys && na > 0 && nb > 0); + assert(ssa.keys + na == ssb.keys); if (MERGE_GETMEM(ms, na) < 0) return -1; - memcpy(ms->a, pa, na * sizeof(PyObject*)); - dest = pa; - pa = ms->a; + sortslice_memcpy(&ms->a, 0, &ssa, 0, na); + dest = ssa; + ssa = ms->a; - *dest++ = *pb++; + sortslice_copy_incr(&dest, &ssb); --nb; if (nb == 0) goto Succeed; @@ -1398,11 +1509,11 @@ */ for (;;) { assert(na > 1 && nb > 0); - k = ISLT(*pb, *pa); + k = ISLT(ssb.keys[0], ssa.keys[0]); if (k) { if (k < 0) goto Fail; - *dest++ = *pb++; + sortslice_copy_incr(&dest, &ssb); ++bcount; acount = 0; --nb; @@ -1412,7 +1523,7 @@ break; } else { - *dest++ = *pa++; + sortslice_copy_incr(&dest, &ssa); ++acount; bcount = 0; --na; @@ -1433,14 +1544,14 @@ assert(na > 1 && nb > 0); min_gallop -= min_gallop > 1; ms->min_gallop = min_gallop; - k = gallop_right(*pb, pa, na, 0); + k = gallop_right(ssb.keys[0], ssa.keys, na, 0); acount = k; if (k) { if (k < 0) goto Fail; - memcpy(dest, pa, k * sizeof(PyObject *)); - dest += k; - pa += k; + sortslice_memcpy(&dest, 0, &ssa, 0, k); + sortslice_advance(&dest, k); + sortslice_advance(&ssa, k); na -= k; if (na == 1) goto CopyB; @@ -1451,24 +1562,24 @@ if (na == 0) goto Succeed; } - *dest++ = *pb++; + sortslice_copy_incr(&dest, &ssb); --nb; if (nb == 0) goto Succeed; - k = gallop_left(*pa, pb, nb, 0); + k = gallop_left(ssa.keys[0], ssb.keys, nb, 0); bcount = k; if (k) { if (k < 0) goto Fail; - memmove(dest, pb, k * sizeof(PyObject *)); - dest += k; - pb += k; + sortslice_memmove(&dest, 0, &ssb, 0, k); + sortslice_advance(&dest, k); + sortslice_advance(&ssb, k); nb -= k; if (nb == 0) goto Succeed; } - *dest++ = *pa++; + sortslice_copy_incr(&dest, &ssa); --na; if (na == 1) goto CopyB; @@ -1480,43 +1591,46 @@ result = 0; Fail: if (na) - memcpy(dest, pa, na * sizeof(PyObject*)); + sortslice_memcpy(&dest, 0, &ssa, 0, na); return result; CopyB: assert(na == 1 && nb > 0); - /* The last element of pa belongs at the end of the merge. */ - memmove(dest, pb, nb * sizeof(PyObject *)); - dest[nb] = *pa; + /* The last element of ssa belongs at the end of the merge. */ + sortslice_memmove(&dest, 0, &ssb, 0, nb); + sortslice_copy(&dest, nb, &ssa, 0); return 0; } -/* Merge the na elements starting at pa with the nb elements starting at pb - * in a stable way, in-place. na and nb must be > 0, and pa + na == pb. - * Must also have that *pb < *pa, that pa[na-1] belongs at the end of the - * merge, and should have na >= nb. See listsort.txt for more info. - * Return 0 if successful, -1 if error. +/* Merge the na elements starting at pa with the nb elements starting at + * ssb.keys = ssa.keys + na in a stable way, in-place. na and nb must be > 0. + * Must also have that ssa.keys[na-1] belongs at the end of the merge, and + * should have na >= nb. See listsort.txt for more info. Return 0 if + * successful, -1 if error. 
*/ static Py_ssize_t -merge_hi(MergeState *ms, PyObject **pa, Py_ssize_t na, PyObject **pb, Py_ssize_t nb) +merge_hi(MergeState *ms, sortslice ssa, Py_ssize_t na, + sortslice ssb, Py_ssize_t nb) { Py_ssize_t k; - PyObject **dest; + sortslice dest, basea, baseb; int result = -1; /* guilty until proved innocent */ - PyObject **basea; - PyObject **baseb; Py_ssize_t min_gallop; - assert(ms && pa && pb && na > 0 && nb > 0 && pa + na == pb); + assert(ms && ssa.keys && ssb.keys && na > 0 && nb > 0); + assert(ssa.keys + na == ssb.keys); if (MERGE_GETMEM(ms, nb) < 0) return -1; - dest = pb + nb - 1; - memcpy(ms->a, pb, nb * sizeof(PyObject*)); - basea = pa; + dest = ssb; + sortslice_advance(&dest, nb-1); + sortslice_memcpy(&ms->a, 0, &ssb, 0, nb); + basea = ssa; baseb = ms->a; - pb = ms->a + nb - 1; - pa += na - 1; + ssb.keys = ms->a.keys + nb - 1; + if (ssb.values != NULL) + ssb.values = ms->a.values + nb - 1; + sortslice_advance(&ssa, na - 1); - *dest-- = *pa--; + sortslice_copy_decr(&dest, &ssa); --na; if (na == 0) goto Succeed; @@ -1533,11 +1647,11 @@ */ for (;;) { assert(na > 0 && nb > 1); - k = ISLT(*pb, *pa); + k = ISLT(ssb.keys[0], ssa.keys[0]); if (k) { if (k < 0) goto Fail; - *dest-- = *pa--; + sortslice_copy_decr(&dest, &ssa); ++acount; bcount = 0; --na; @@ -1547,7 +1661,7 @@ break; } else { - *dest-- = *pb--; + sortslice_copy_decr(&dest, &ssb); ++bcount; acount = 0; --nb; @@ -1568,33 +1682,33 @@ assert(na > 0 && nb > 1); min_gallop -= min_gallop > 1; ms->min_gallop = min_gallop; - k = gallop_right(*pb, basea, na, na-1); + k = gallop_right(ssb.keys[0], basea.keys, na, na-1); if (k < 0) goto Fail; k = na - k; acount = k; if (k) { - dest -= k; - pa -= k; - memmove(dest+1, pa+1, k * sizeof(PyObject *)); + sortslice_advance(&dest, -k); + sortslice_advance(&ssa, -k); + sortslice_memmove(&dest, 1, &ssa, 1, k); na -= k; if (na == 0) goto Succeed; } - *dest-- = *pb--; + sortslice_copy_decr(&dest, &ssb); --nb; if (nb == 1) goto CopyA; - k = gallop_left(*pa, baseb, nb, nb-1); + k = gallop_left(ssa.keys[0], baseb.keys, nb, nb-1); if (k < 0) goto Fail; k = nb - k; bcount = k; if (k) { - dest -= k; - pb -= k; - memcpy(dest+1, pb+1, k * sizeof(PyObject *)); + sortslice_advance(&dest, -k); + sortslice_advance(&ssb, -k); + sortslice_memcpy(&dest, 1, &ssb, 1, k); nb -= k; if (nb == 1) goto CopyA; @@ -1605,7 +1719,7 @@ if (nb == 0) goto Succeed; } - *dest-- = *pa--; + sortslice_copy_decr(&dest, &ssa); --na; if (na == 0) goto Succeed; @@ -1617,15 +1731,15 @@ result = 0; Fail: if (nb) - memcpy(dest-(nb-1), baseb, nb * sizeof(PyObject*)); + sortslice_memcpy(&dest, -(nb-1), &baseb, 0, nb); return result; CopyA: assert(nb == 1 && na > 0); - /* The first element of pb belongs at the front of the merge. */ - dest -= na; - pa -= na; - memmove(dest+1, pa+1, na * sizeof(PyObject *)); - *dest = *pb; + /* The first element of ssb belongs at the front of the merge. 
*/ + sortslice_memmove(&dest, 1-na, &ssa, 1-na, na); + sortslice_advance(&dest, -na); + sortslice_advance(&ssa, -na); + sortslice_copy(&dest, 0, &ssb, 0); return 0; } @@ -1635,7 +1749,7 @@ static Py_ssize_t merge_at(MergeState *ms, Py_ssize_t i) { - PyObject **pa, **pb; + sortslice ssa, ssb; Py_ssize_t na, nb; Py_ssize_t k; @@ -1644,12 +1758,12 @@ assert(i >= 0); assert(i == ms->n - 2 || i == ms->n - 3); - pa = ms->pending[i].base; + ssa = ms->pending[i].base; na = ms->pending[i].len; - pb = ms->pending[i+1].base; + ssb = ms->pending[i+1].base; nb = ms->pending[i+1].len; assert(na > 0 && nb > 0); - assert(pa + na == pb); + assert(ssa.keys + na == ssb.keys); /* Record the length of the combined runs; if i is the 3rd-last * run now, also slide over the last run (which isn't involved @@ -1663,10 +1777,10 @@ /* Where does b start in a? Elements in a before that can be * ignored (already in place). */ - k = gallop_right(*pb, pa, na, 0); + k = gallop_right(*ssb.keys, ssa.keys, na, 0); if (k < 0) return -1; - pa += k; + sortslice_advance(&ssa, k); na -= k; if (na == 0) return 0; @@ -1674,7 +1788,7 @@ /* Where does a end in b? Elements in b after that can be * ignored (already in place). */ - nb = gallop_left(pa[na-1], pb, nb, nb-1); + nb = gallop_left(ssa.keys[na-1], ssb.keys, nb, nb-1); if (nb <= 0) return nb; @@ -1682,9 +1796,9 @@ * min(na, nb) elements. */ if (na <= nb) - return merge_lo(ms, pa, na, pb, nb); + return merge_lo(ms, ssa, na, ssb, nb); else - return merge_hi(ms, pa, na, pb, nb); + return merge_hi(ms, ssa, na, ssb, nb); } /* Examine the stack of runs waiting to be merged, merging adjacent runs @@ -1765,103 +1879,12 @@ return n + r; } -/* Special wrapper to support stable sorting using the decorate-sort-undecorate - pattern. Holds a key which is used for comparisons and the original record - which is returned during the undecorate phase. By exposing only the key - during comparisons, the underlying sort stability characteristics are left - unchanged. Also, the comparison function will only see the key instead of - a full record. 
*/ - -typedef struct { - PyObject_HEAD - PyObject *key; - PyObject *value; -} sortwrapperobject; - -PyDoc_STRVAR(sortwrapper_doc, "Object wrapper with a custom sort key."); -static PyObject * -sortwrapper_richcompare(sortwrapperobject *, sortwrapperobject *, int); static void -sortwrapper_dealloc(sortwrapperobject *); - -PyTypeObject PySortWrapper_Type = { - PyVarObject_HEAD_INIT(&PyType_Type, 0) - "sortwrapper", /* tp_name */ - sizeof(sortwrapperobject), /* tp_basicsize */ - 0, /* tp_itemsize */ - /* methods */ - (destructor)sortwrapper_dealloc, /* tp_dealloc */ - 0, /* tp_print */ - 0, /* tp_getattr */ - 0, /* tp_setattr */ - 0, /* tp_reserved */ - 0, /* tp_repr */ - 0, /* tp_as_number */ - 0, /* tp_as_sequence */ - 0, /* tp_as_mapping */ - 0, /* tp_hash */ - 0, /* tp_call */ - 0, /* tp_str */ - PyObject_GenericGetAttr, /* tp_getattro */ - 0, /* tp_setattro */ - 0, /* tp_as_buffer */ - Py_TPFLAGS_DEFAULT, /* tp_flags */ - sortwrapper_doc, /* tp_doc */ - 0, /* tp_traverse */ - 0, /* tp_clear */ - (richcmpfunc)sortwrapper_richcompare, /* tp_richcompare */ -}; - - -static PyObject * -sortwrapper_richcompare(sortwrapperobject *a, sortwrapperobject *b, int op) -{ - if (!PyObject_TypeCheck(b, &PySortWrapper_Type)) { - PyErr_SetString(PyExc_TypeError, - "expected a sortwrapperobject"); - return NULL; - } - return PyObject_RichCompare(a->key, b->key, op); -} - -static void -sortwrapper_dealloc(sortwrapperobject *so) +reverse_sortslice(sortslice *s, Py_ssize_t n) { - Py_XDECREF(so->key); - Py_XDECREF(so->value); - PyObject_Del(so); -} - -/* Returns a new reference to a sortwrapper. - Consumes the references to the two underlying objects. */ - -static PyObject * -build_sortwrapper(PyObject *key, PyObject *value) -{ - sortwrapperobject *so; - - so = PyObject_New(sortwrapperobject, &PySortWrapper_Type); - if (so == NULL) - return NULL; - so->key = key; - so->value = value; - return (PyObject *)so; -} - -/* Returns a new reference to the value underlying the wrapper. */ -static PyObject * -sortwrapper_getvalue(PyObject *so) -{ - PyObject *value; - - if (!PyObject_TypeCheck(so, &PySortWrapper_Type)) { - PyErr_SetString(PyExc_TypeError, - "expected a sortwrapperobject"); - return NULL; - } - value = ((sortwrapperobject *)so)->value; - Py_INCREF(value); - return value; + reverse_slice(s->keys, &s->keys[n]); + if (s->values != NULL) + reverse_slice(s->values, &s->values[n]); } /* An adaptive, stable, natural mergesort. See listsort.txt. 
@@ -1873,9 +1896,9 @@ listsort(PyListObject *self, PyObject *args, PyObject *kwds) { MergeState ms; - PyObject **lo, **hi; Py_ssize_t nremaining; Py_ssize_t minrun; + sortslice lo; Py_ssize_t saved_ob_size, saved_allocated; PyObject **saved_ob_item; PyObject **final_ob_item; @@ -1883,8 +1906,8 @@ int reverse = 0; PyObject *keyfunc = NULL; Py_ssize_t i; - PyObject *key, *value, *kvpair; static char *kwlist[] = {"key", "reverse", 0}; + PyObject **keys; assert(self != NULL); assert (PyList_Check(self)); @@ -1913,28 +1936,36 @@ self->ob_item = NULL; self->allocated = -1; /* any operation will reset it to >= 0 */ - if (keyfunc != NULL) { - for (i=0 ; i < saved_ob_size ; i++) { - value = saved_ob_item[i]; - key = PyObject_CallFunctionObjArgs(keyfunc, value, - NULL); - if (key == NULL) { - for (i=i-1 ; i>=0 ; i--) { - kvpair = saved_ob_item[i]; - value = sortwrapper_getvalue(kvpair); - saved_ob_item[i] = value; - Py_DECREF(kvpair); - } - goto dsu_fail; + if (keyfunc == NULL) { + keys = NULL; + lo.keys = saved_ob_item; + lo.values = NULL; + } + else { + if (saved_ob_size < MERGESTATE_TEMP_SIZE/2) + /* Leverage stack space we allocated but won't otherwise use */ + keys = &ms.temparray[saved_ob_size+1]; + else { + keys = PyMem_MALLOC(sizeof(PyObject *) * saved_ob_size); + if (keys == NULL) + return NULL; + } + + for (i = 0; i < saved_ob_size ; i++) { + keys[i] = PyObject_CallFunctionObjArgs(keyfunc, saved_ob_item[i], + NULL); + if (keys[i] == NULL) { + for (i=i-1 ; i>=0 ; i--) + Py_DECREF(keys[i]); + goto keyfunc_fail; } - kvpair = build_sortwrapper(key, value); - if (kvpair == NULL) - goto dsu_fail; - saved_ob_item[i] = kvpair; } + + lo.keys = keys; + lo.values = saved_ob_item; } - merge_init(&ms); + merge_init(&ms, saved_ob_size, keys != NULL); nremaining = saved_ob_size; if (nremaining < 2) @@ -1942,30 +1973,31 @@ /* Reverse sort stability achieved by initially reversing the list, applying a stable forward sort, then reversing the final result. */ - if (reverse) - reverse_slice(saved_ob_item, saved_ob_item + saved_ob_size); + if (reverse) { + if (keys != NULL) + reverse_slice(&keys[0], &keys[saved_ob_size]); + reverse_slice(&saved_ob_item[0], &saved_ob_item[saved_ob_size]); + } /* March over the array once, left to right, finding natural runs, * and extending short natural runs to minrun elements. */ - lo = saved_ob_item; - hi = lo + nremaining; minrun = merge_compute_minrun(nremaining); do { int descending; Py_ssize_t n; /* Identify next run. */ - n = count_run(lo, hi, &descending); + n = count_run(lo.keys, lo.keys + nremaining, &descending); if (n < 0) goto fail; if (descending) - reverse_slice(lo, lo + n); + reverse_sortslice(&lo, n); /* If short, extend to min(minrun, nremaining). */ if (n < minrun) { const Py_ssize_t force = nremaining <= minrun ? nremaining : minrun; - if (binarysort(lo, lo + force, lo + n) < 0) + if (binarysort(lo, lo.keys + force, lo.keys + n) < 0) goto fail; n = force; } @@ -1977,27 +2009,27 @@ if (merge_collapse(&ms) < 0) goto fail; /* Advance to find next run. */ - lo += n; + sortslice_advance(&lo, n); nremaining -= n; } while (nremaining); - assert(lo == hi); if (merge_force_collapse(&ms) < 0) goto fail; assert(ms.n == 1); - assert(ms.pending[0].base == saved_ob_item); + assert(keys == NULL + ? 
ms.pending[0].base.keys == saved_ob_item + : ms.pending[0].base.keys == &keys[0]); assert(ms.pending[0].len == saved_ob_size); + lo = ms.pending[0].base; succeed: result = Py_None; fail: - if (keyfunc != NULL) { - for (i=0 ; i < saved_ob_size ; i++) { - kvpair = saved_ob_item[i]; - value = sortwrapper_getvalue(kvpair); - saved_ob_item[i] = value; - Py_DECREF(kvpair); - } + if (keys != NULL) { + for (i = 0; i < saved_ob_size; i++) + Py_DECREF(keys[i]); + if (keys != &ms.temparray[saved_ob_size+1]) + PyMem_FREE(keys); } if (self->allocated != -1 && result != NULL) { @@ -2013,7 +2045,7 @@ merge_freemem(&ms); -dsu_fail: +keyfunc_fail: final_ob_item = self->ob_item; i = Py_SIZE(self); Py_SIZE(self) = saved_ob_size; @@ -2303,6 +2335,10 @@ "L.__reversed__() -- return a reverse iterator over the list"); PyDoc_STRVAR(sizeof_doc, "L.__sizeof__() -- size of L in memory, in bytes"); +PyDoc_STRVAR(clear_doc, +"L.clear() -> None -- remove all items from L"); +PyDoc_STRVAR(copy_doc, +"L.copy() -> list -- a shallow copy of L"); PyDoc_STRVAR(append_doc, "L.append(object) -- append object to end"); PyDoc_STRVAR(extend_doc, @@ -2331,9 +2367,11 @@ {"__getitem__", (PyCFunction)list_subscript, METH_O|METH_COEXIST, getitem_doc}, {"__reversed__",(PyCFunction)list_reversed, METH_NOARGS, reversed_doc}, {"__sizeof__", (PyCFunction)list_sizeof, METH_NOARGS, sizeof_doc}, + {"clear", (PyCFunction)listclear, METH_NOARGS, clear_doc}, + {"copy", (PyCFunction)listcopy, METH_NOARGS, copy_doc}, {"append", (PyCFunction)listappend, METH_O, append_doc}, {"insert", (PyCFunction)listinsert, METH_VARARGS, insert_doc}, - {"extend", (PyCFunction)listextend, METH_O, extend_doc}, + {"extend", (PyCFunction)listextend, METH_O, extend_doc}, {"pop", (PyCFunction)listpop, METH_VARARGS, pop_doc}, {"remove", (PyCFunction)listremove, METH_O, remove_doc}, {"index", (PyCFunction)listindex, METH_VARARGS, index_doc}, @@ -2378,7 +2416,7 @@ PyObject* it; PyObject **src, **dest; - if (PySlice_GetIndicesEx((PySliceObject*)item, Py_SIZE(self), + if (PySlice_GetIndicesEx(item, Py_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return NULL; } @@ -2427,7 +2465,7 @@ else if (PySlice_Check(item)) { Py_ssize_t start, stop, step, slicelength; - if (PySlice_GetIndicesEx((PySliceObject*)item, Py_SIZE(self), + if (PySlice_GetIndicesEx(item, Py_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return -1; } @@ -2862,4 +2900,3 @@ len = 0; return PyLong_FromSsize_t(len); } - Modified: python/branches/pep-3151/Objects/longobject.c ============================================================================== --- python/branches/pep-3151/Objects/longobject.c (original) +++ python/branches/pep-3151/Objects/longobject.c Sat Feb 26 08:16:32 2011 @@ -4,7 +4,6 @@ #include "Python.h" #include "longintrepr.h" -#include "structseq.h" #include #include @@ -2134,17 +2133,34 @@ PyLong_FromUnicode(Py_UNICODE *u, Py_ssize_t length, int base) { PyObject *result; - char *buffer = (char *)PyMem_MALLOC(length+1); - - if (buffer == NULL) - return NULL; - - if (PyUnicode_EncodeDecimal(u, length, buffer, NULL)) { - PyMem_FREE(buffer); + PyObject *asciidig; + char *buffer, *end; + Py_ssize_t i, buflen; + Py_UNICODE *ptr; + + asciidig = PyUnicode_TransformDecimalToASCII(u, length); + if (asciidig == NULL) + return NULL; + /* Replace non-ASCII whitespace with ' ' */ + ptr = PyUnicode_AS_UNICODE(asciidig); + for (i = 0; i < length; i++) { + Py_UNICODE ch = ptr[i]; + if (ch > 127 && Py_UNICODE_ISSPACE(ch)) + ptr[i] = ' '; + } + buffer = _PyUnicode_AsStringAndSize(asciidig, 
&buflen); + if (buffer == NULL) { + Py_DECREF(asciidig); return NULL; } - result = PyLong_FromString(buffer, NULL, base); - PyMem_FREE(buffer); + result = PyLong_FromString(buffer, &end, base); + if (result != NULL && end != buffer + buflen) { + PyErr_SetString(PyExc_ValueError, + "null byte in argument for int()"); + Py_DECREF(result); + result = NULL; + } + Py_DECREF(asciidig); return result; } Modified: python/branches/pep-3151/Objects/memoryobject.c ============================================================================== --- python/branches/pep-3151/Objects/memoryobject.c (original) +++ python/branches/pep-3151/Objects/memoryobject.c Sat Feb 26 08:16:32 2011 @@ -52,9 +52,6 @@ { int res = 0; CHECK_RELEASED_INT(self); - /* XXX for whatever reason fixing the flags seems necessary */ - if (self->view.readonly) - flags &= ~PyBUF_WRITABLE; if (self->view.obj != NULL) res = PyObject_GetBuffer(self->view.obj, view, flags); if (view) @@ -78,6 +75,11 @@ { PyMemoryViewObject *mview; + if (info->buf == NULL) { + PyErr_SetString(PyExc_ValueError, + "cannot make memory view from a buffer with a NULL data pointer"); + return NULL; + } mview = (PyMemoryViewObject *) PyObject_GC_New(PyMemoryViewObject, &PyMemoryView_Type); if (mview == NULL) @@ -599,7 +601,7 @@ else if (PySlice_Check(key)) { Py_ssize_t start, stop, step, slicelength; - if (PySlice_GetIndicesEx((PySliceObject*)key, get_shape0(view), + if (PySlice_GetIndicesEx(key, get_shape0(view), &start, &stop, &step, &slicelength) < 0) { return NULL; } @@ -678,7 +680,7 @@ else if (PySlice_Check(key)) { Py_ssize_t stop, step; - if (PySlice_GetIndicesEx((PySliceObject*)key, get_shape0(view), + if (PySlice_GetIndicesEx(key, get_shape0(view), &start, &stop, &step, &len) < 0) { return -1; } Modified: python/branches/pep-3151/Objects/moduleobject.c ============================================================================== --- python/branches/pep-3151/Objects/moduleobject.c (original) +++ python/branches/pep-3151/Objects/moduleobject.c Sat Feb 26 08:16:32 2011 @@ -63,8 +63,9 @@ PyMethodDef *ml; const char* name; PyModuleObject *m; - if (!Py_IsInitialized()) - Py_FatalError("Interpreter not initialized (version mismatch?)"); + PyInterpreterState *interp = PyThreadState_Get()->interp; + if (interp->modules == NULL) + Py_FatalError("Python import machinery not initialized"); if (PyType_Ready(&moduledef_type) < 0) return NULL; if (module->m_base.m_index == 0) { @@ -74,7 +75,7 @@ module->m_base.m_index = max_module_number; } name = module->m_name; - if (module_api_version != PYTHON_API_VERSION) { + if (module_api_version != PYTHON_API_VERSION && module_api_version != PYTHON_ABI_VERSION) { int err; err = PyErr_WarnFormat(PyExc_RuntimeWarning, 1, "Python C API version mismatch for module %.100s: " @@ -168,24 +169,35 @@ return d; } -const char * -PyModule_GetName(PyObject *m) +PyObject* +PyModule_GetNameObject(PyObject *m) { PyObject *d; - PyObject *nameobj; + PyObject *name; if (!PyModule_Check(m)) { PyErr_BadArgument(); return NULL; } d = ((PyModuleObject *)m)->md_dict; if (d == NULL || - (nameobj = PyDict_GetItemString(d, "__name__")) == NULL || - !PyUnicode_Check(nameobj)) + (name = PyDict_GetItemString(d, "__name__")) == NULL || + !PyUnicode_Check(name)) { PyErr_SetString(PyExc_SystemError, "nameless module"); return NULL; } - return _PyUnicode_AsString(nameobj); + Py_INCREF(name); + return name; +} + +const char * +PyModule_GetName(PyObject *m) +{ + PyObject *name = PyModule_GetNameObject(m); + if (name == NULL) + return NULL; + Py_DECREF(name); /* 
module dict has still a reference */ + return _PyUnicode_AsString(name); } PyObject* @@ -218,7 +230,7 @@ if (fileobj == NULL) return NULL; utf8 = _PyUnicode_AsString(fileobj); - Py_DECREF(fileobj); + Py_DECREF(fileobj); /* module dict has still a reference */ return utf8; } @@ -346,21 +358,25 @@ static PyObject * module_repr(PyModuleObject *m) { - const char *name; - PyObject *filename, *repr; + PyObject *name, *filename, *repr; - name = PyModule_GetName((PyObject *)m); + name = PyModule_GetNameObject((PyObject *)m); if (name == NULL) { PyErr_Clear(); - name = "?"; + name = PyUnicode_FromStringAndSize("?", 1); + if (name == NULL) + return NULL; } filename = PyModule_GetFilenameObject((PyObject *)m); if (filename == NULL) { PyErr_Clear(); - return PyUnicode_FromFormat("", name); + repr = PyUnicode_FromFormat("", name); + } + else { + repr = PyUnicode_FromFormat("", name, filename); + Py_DECREF(filename); } - repr = PyUnicode_FromFormat("", name, filename); - Py_DECREF(filename); + Py_DECREF(name); return repr; } Modified: python/branches/pep-3151/Objects/object.c ============================================================================== --- python/branches/pep-3151/Objects/object.c (original) +++ python/branches/pep-3151/Objects/object.c Sat Feb 26 08:16:32 2011 @@ -2,7 +2,6 @@ /* Generic object operations; and implementation of None (NoObject) */ #include "Python.h" -#include "sliceobject.h" /* For PyEllipsis_Type */ #include "frameobject.h" #ifdef __cplusplus @@ -1756,7 +1755,6 @@ #endif - /* Hack to force loading of pycapsule.o */ PyTypeObject *_PyCapsule_hack = &PyCapsule_Type; @@ -1901,6 +1899,19 @@ } } +#ifndef Py_TRACE_REFS +/* For Py_LIMITED_API, we need an out-of-line version of _Py_Dealloc. + Define this here, so we can undefine the macro. */ +#undef _Py_Dealloc +PyAPI_FUNC(void) _Py_Dealloc(PyObject *); +void +_Py_Dealloc(PyObject *op) +{ + _Py_INC_TPFREES(op) _Py_COUNT_ALLOCS_COMMA + (*Py_TYPE(op)->tp_dealloc)(op); +} +#endif + #ifdef __cplusplus } #endif Modified: python/branches/pep-3151/Objects/obmalloc.c ============================================================================== --- python/branches/pep-3151/Objects/obmalloc.c (original) +++ python/branches/pep-3151/Objects/obmalloc.c Sat Feb 26 08:16:32 2011 @@ -249,7 +249,7 @@ /* Pool for small blocks. */ struct pool_header { union { block *_padding; - uint count; } ref; /* number of allocated blocks */ + uint count; } ref; /* number of allocated blocks */ block *freeblock; /* pool's free list head */ struct pool_header *nextpool; /* next pool of this size class */ struct pool_header *prevpool; /* previous pool "" */ @@ -404,7 +404,7 @@ immediately follow a pool_header's first two members: union { block *_padding; - uint count; } ref; + uint count; } ref; block *freeblock; each of which consume sizeof(block *) bytes. So what usedpools[i+i] really @@ -682,11 +682,19 @@ obmalloc in a small constant time, independent of the number of arenas obmalloc controls. Since this test is needed at every entry point, it's extremely desirable that it be this fast. + +Since Py_ADDRESS_IN_RANGE may be reading from memory which was not allocated +by Python, it is important that (POOL)->arenaindex is read only once, as +another thread may be concurrently modifying the value without holding the +GIL. To accomplish this, the arenaindex_temp variable is used to store +(POOL)->arenaindex for the duration of the Py_ADDRESS_IN_RANGE macro's +execution. The caller of the macro is responsible for declaring this +variable. 
*/ #define Py_ADDRESS_IN_RANGE(P, POOL) \ - ((POOL)->arenaindex < maxarenas && \ - (uptr)(P) - arenas[(POOL)->arenaindex].address < (uptr)ARENA_SIZE && \ - arenas[(POOL)->arenaindex].address != 0) + ((arenaindex_temp = (POOL)->arenaindex) < maxarenas && \ + (uptr)(P) - arenas[arenaindex_temp].address < (uptr)ARENA_SIZE && \ + arenas[arenaindex_temp].address != 0) /* This is only useful when running memory debuggers such as @@ -709,7 +717,7 @@ #undef Py_ADDRESS_IN_RANGE #if defined(__GNUC__) && ((__GNUC__ == 3) && (__GNUC_MINOR__ >= 1) || \ - (__GNUC__ >= 4)) + (__GNUC__ >= 4)) #define Py_NO_INLINE __attribute__((__noinline__)) #else #define Py_NO_INLINE @@ -945,6 +953,9 @@ block *lastfree; poolp next, prev; uint size; +#ifndef Py_USING_MEMORY_DEBUGGER + uint arenaindex_temp; +#endif if (p == NULL) /* free(NULL) has no effect */ return; @@ -1167,6 +1178,9 @@ void *bp; poolp pool; size_t size; +#ifndef Py_USING_MEMORY_DEBUGGER + uint arenaindex_temp; +#endif if (p == NULL) return PyObject_Malloc(nbytes); @@ -1514,7 +1528,7 @@ if (nbytes > original_nbytes) { /* growing: mark new extra memory clean */ memset(q + original_nbytes, CLEANBYTE, - nbytes - original_nbytes); + nbytes - original_nbytes); } return q; @@ -1641,11 +1655,11 @@ fputs("FORBIDDENBYTE, as expected.\n", stderr); else { fprintf(stderr, "not all FORBIDDENBYTE (0x%02x):\n", - FORBIDDENBYTE); + FORBIDDENBYTE); for (i = 0; i < SST; ++i) { const uchar byte = tail[i]; fprintf(stderr, " at tail+%d: 0x%02x", - i, byte); + i, byte); if (byte != FORBIDDENBYTE) fputs(" *** OUCH", stderr); fputc('\n', stderr); @@ -1751,7 +1765,7 @@ char buf[128]; fprintf(stderr, "Small block threshold = %d, in %u size classes.\n", - SMALL_REQUEST_THRESHOLD, numclasses); + SMALL_REQUEST_THRESHOLD, numclasses); for (i = 0; i < numclasses; ++i) numpools[i] = numblocks[i] = numfreeblocks[i] = 0; @@ -1761,7 +1775,6 @@ * will be living in full pools -- would be a shame to miss them. */ for (i = 0; i < maxarenas; ++i) { - uint poolsinarena; uint j; uptr base = arenas[i].address; @@ -1770,7 +1783,6 @@ continue; narenas += 1; - poolsinarena = arenas[i].ntotalpools; numfreepools += arenas[i].nfreepools; /* round up to pool alignment */ @@ -1809,7 +1821,7 @@ fputc('\n', stderr); fputs("class size num pools blocks in use avail blocks\n" "----- ---- --------- ------------- ------------\n", - stderr); + stderr); for (i = 0; i < numclasses; ++i) { size_t p = numpools[i]; @@ -1824,7 +1836,7 @@ "%11" PY_FORMAT_SIZE_T "u " "%15" PY_FORMAT_SIZE_T "u " "%13" PY_FORMAT_SIZE_T "u\n", - i, size, p, b, f); + i, size, p, b, f); allocated_bytes += b * size; available_bytes += f * size; pool_header_bytes += p * POOL_OVERHEAD; @@ -1867,8 +1879,10 @@ int Py_ADDRESS_IN_RANGE(void *P, poolp pool) { - return pool->arenaindex < maxarenas && - (uptr)P - arenas[pool->arenaindex].address < (uptr)ARENA_SIZE && - arenas[pool->arenaindex].address != 0; + uint arenaindex_temp = pool->arenaindex; + + return arenaindex_temp < maxarenas && + (uptr)P - arenas[arenaindex_temp].address < (uptr)ARENA_SIZE && + arenas[arenaindex_temp].address != 0; } #endif Modified: python/branches/pep-3151/Objects/rangeobject.c ============================================================================== --- python/branches/pep-3151/Objects/rangeobject.c (original) +++ python/branches/pep-3151/Objects/rangeobject.c Sat Feb 26 08:16:32 2011 @@ -14,6 +14,7 @@ PyObject *start; PyObject *stop; PyObject *step; + PyObject *length; } rangeobject; /* Helper function for validating step. 
Always returns a new reference or @@ -43,6 +44,31 @@ return step; } +static PyObject * +compute_range_length(PyObject *start, PyObject *stop, PyObject *step); + +static rangeobject * +make_range_object(PyTypeObject *type, PyObject *start, + PyObject *stop, PyObject *step) +{ + rangeobject *obj = NULL; + PyObject *length; + length = compute_range_length(start, stop, step); + if (length == NULL) { + return NULL; + } + obj = PyObject_New(rangeobject, type); + if (obj == NULL) { + Py_DECREF(length); + return NULL; + } + obj->start = start; + obj->stop = stop; + obj->step = step; + obj->length = length; + return obj; +} + /* XXX(nnorwitz): should we error check if the user passes any empty ranges? range(-10) range(0, -5) @@ -51,7 +77,7 @@ static PyObject * range_new(PyTypeObject *type, PyObject *args, PyObject *kw) { - rangeobject *obj = NULL; + rangeobject *obj; PyObject *start = NULL, *stop = NULL, *step = NULL; if (!_PyArg_NoKeywords("range()", kw)) @@ -97,15 +123,11 @@ } } - obj = PyObject_New(rangeobject, &PyRange_Type); - if (obj == NULL) - goto Fail; - obj->start = start; - obj->stop = stop; - obj->step = step; - return (PyObject *) obj; + obj = make_range_object(type, start, stop, step); + if (obj != NULL) + return (PyObject *) obj; -Fail: + /* Failed to create object, release attributes */ Py_XDECREF(start); Py_XDECREF(stop); Py_XDECREF(step); @@ -115,7 +137,7 @@ PyDoc_STRVAR(range_doc, "range([start,] stop[, step]) -> range object\n\ \n\ -Returns an iterator that generates the numbers in the range on demand."); +Returns a virtual sequence of numbers from start to stop by step."); static void range_dealloc(rangeobject *r) @@ -123,6 +145,7 @@ Py_DECREF(r->start); Py_DECREF(r->stop); Py_DECREF(r->step); + Py_DECREF(r->length); PyObject_Del(r); } @@ -131,7 +154,7 @@ * PyLong_Check(). Return NULL when there is an error. */ static PyObject* -range_length_obj(rangeobject *r) +compute_range_length(PyObject *start, PyObject *stop, PyObject *step) { /* ------------------------------------------------------------- Algorithm is equal to that of get_len_of_range(), but it operates @@ -139,7 +162,6 @@ ---------------------------------------------------------------*/ int cmp_result; PyObject *lo, *hi; - PyObject *step = NULL; PyObject *diff = NULL; PyObject *one = NULL; PyObject *tmp1 = NULL, *tmp2 = NULL, *result; @@ -148,20 +170,19 @@ PyObject *zero = PyLong_FromLong(0); if (zero == NULL) return NULL; - cmp_result = PyObject_RichCompareBool(r->step, zero, Py_GT); + cmp_result = PyObject_RichCompareBool(step, zero, Py_GT); Py_DECREF(zero); if (cmp_result == -1) return NULL; if (cmp_result == 1) { - lo = r->start; - hi = r->stop; - step = r->step; + lo = start; + hi = stop; Py_INCREF(step); } else { - lo = r->stop; - hi = r->start; - step = PyNumber_Negative(r->step); + lo = stop; + hi = start; + step = PyNumber_Negative(step); if (!step) return NULL; } @@ -206,71 +227,325 @@ static Py_ssize_t range_length(rangeobject *r) { - PyObject *len = range_length_obj(r); - Py_ssize_t result = -1; - if (len) { - result = PyLong_AsSsize_t(len); - Py_DECREF(len); - } - return result; + return PyLong_AsSsize_t(r->length); } -/* range(...)[x] is necessary for: seq[:] = range(...) 
*/ +static PyObject * +compute_item(rangeobject *r, PyObject *i) +{ + PyObject *incr, *result; + /* PyLong equivalent to: + * return r->start + (i * r->step) + */ + incr = PyNumber_Multiply(i, r->step); + if (!incr) + return NULL; + result = PyNumber_Add(r->start, incr); + Py_DECREF(incr); + return result; +} static PyObject * -range_item(rangeobject *r, Py_ssize_t i) +compute_range_item(rangeobject *r, PyObject *arg) { - Py_ssize_t len = range_length(r); - PyObject *rem, *incr, *result; + int cmp_result; + PyObject *i, *result; - /* XXX(nnorwitz): should negative indices be supported? */ - /* XXX(nnorwitz): should support range[x] where x > PY_SSIZE_T_MAX? */ - if (i < 0 || i >= len) { - if (!PyErr_Occurred()) - PyErr_SetString(PyExc_IndexError, - "range object index out of range"); + PyObject *zero = PyLong_FromLong(0); + if (zero == NULL) return NULL; - } - /* XXX(nnorwitz): optimize for short ints. */ - rem = PyLong_FromSsize_t(i); - if (!rem) + /* PyLong equivalent to: + * if (arg < 0) { + * i = r->length + arg + * } else { + * i = arg + * } + */ + cmp_result = PyObject_RichCompareBool(arg, zero, Py_LT); + if (cmp_result == -1) { + Py_DECREF(zero); return NULL; - incr = PyNumber_Multiply(rem, r->step); - Py_DECREF(rem); - if (!incr) + } + if (cmp_result == 1) { + i = PyNumber_Add(r->length, arg); + if (!i) { + Py_DECREF(zero); return NULL; - result = PyNumber_Add(r->start, incr); - Py_DECREF(incr); + } + } else { + i = arg; + Py_INCREF(i); + } + + /* PyLong equivalent to: + * if (i < 0 || i >= r->length) { + * + * } + */ + cmp_result = PyObject_RichCompareBool(i, zero, Py_LT); + Py_DECREF(zero); + if (cmp_result == 0) { + cmp_result = PyObject_RichCompareBool(i, r->length, Py_GE); + } + if (cmp_result == -1) { + Py_DECREF(i); + return NULL; + } + if (cmp_result == 1) { + Py_DECREF(i); + PyErr_SetString(PyExc_IndexError, + "range object index out of range"); + return NULL; + } + + result = compute_item(r, i); + Py_DECREF(i); return result; } static PyObject * -range_repr(rangeobject *r) +range_item(rangeobject *r, Py_ssize_t i) { - Py_ssize_t istep; + PyObject *res, *arg = PyLong_FromLong(i); + if (!arg) { + return NULL; + } + res = compute_range_item(r, arg); + Py_DECREF(arg); + return res; +} - /* Check for special case values for printing. We don't always - need the step value. We don't care about errors - (it means overflow), so clear the errors. 
*/ - istep = PyNumber_AsSsize_t(r->step, NULL); - if (istep != 1 || (istep == -1 && PyErr_Occurred())) { - PyErr_Clear(); +/* Additional helpers, since the standard slice helpers + * all clip to PY_SSIZE_T_MAX + */ + +/* Replace _PyEval_SliceIndex */ +static PyObject * +compute_slice_element(PyObject *obj) +{ + PyObject *result = NULL; + if (obj != NULL) { + if (PyIndex_Check(obj)) { + result = PyNumber_Index(obj); + } + } + if (result == NULL) { + PyErr_SetString(PyExc_TypeError, + "slice indices must be integers or " + "None or have an __index__ method"); } + return result; +} - if (istep == 1) - return PyUnicode_FromFormat("range(%R, %R)", r->start, r->stop); - else - return PyUnicode_FromFormat("range(%R, %R, %R)", - r->start, r->stop, r->step); +/* Replace PySlice_GetIndicesEx + * Result indicates whether or not the slice is empty + * (-1 = error, 0 = empty slice, 1 = slice contains elements) + */ +static int +compute_slice_indices(rangeobject *r, PySliceObject *slice, + PyObject **start, PyObject **stop, PyObject **step) +{ + int cmp_result, has_elements; + Py_ssize_t clamped_step = 0; + PyObject *zero = NULL, *one = NULL, *neg_one = NULL, *candidate = NULL; + PyObject *tmp_start = NULL, *tmp_stop = NULL, *tmp_step = NULL; + zero = PyLong_FromLong(0); + if (zero == NULL) goto Fail; + one = PyLong_FromLong(1); + if (one == NULL) goto Fail; + neg_one = PyLong_FromLong(-1); + if (neg_one == NULL) goto Fail; + + /* Calculate step value */ + if (slice->step == Py_None) { + clamped_step = 1; + tmp_step = one; + Py_INCREF(tmp_step); + } else { + if (!_PyEval_SliceIndex(slice->step, &clamped_step)) goto Fail; + if (clamped_step == 0) { + PyErr_SetString(PyExc_ValueError, + "slice step cannot be zero"); + goto Fail; + } + tmp_step = compute_slice_element(slice->step); + if (tmp_step == NULL) goto Fail; + } + + /* Calculate start value */ + if (slice->start == Py_None) { + if (clamped_step < 0) { + tmp_start = PyNumber_Subtract(r->length, one); + if (tmp_start == NULL) goto Fail; + } else { + tmp_start = zero; + Py_INCREF(tmp_start); + } + } else { + candidate = compute_slice_element(slice->start); + if (candidate == NULL) goto Fail; + cmp_result = PyObject_RichCompareBool(candidate, zero, Py_LT); + if (cmp_result == -1) goto Fail; + if (cmp_result) { + /* candidate < 0 */ + tmp_start = PyNumber_Add(r->length, candidate); + if (tmp_start == NULL) goto Fail; + Py_CLEAR(candidate); + } else { + /* candidate >= 0 */ + tmp_start = candidate; + candidate = NULL; + } + cmp_result = PyObject_RichCompareBool(tmp_start, zero, Py_LT); + if (cmp_result == -1) goto Fail; + if (cmp_result) { + /* tmp_start < 0 */ + Py_CLEAR(tmp_start); + if (clamped_step < 0) { + tmp_start = neg_one; + } else { + tmp_start = zero; + } + Py_INCREF(tmp_start); + } else { + /* tmp_start >= 0 */ + cmp_result = PyObject_RichCompareBool(tmp_start, r->length, Py_GE); + if (cmp_result == -1) goto Fail; + if (cmp_result) { + /* tmp_start >= r->length */ + Py_CLEAR(tmp_start); + if (clamped_step < 0) { + tmp_start = PyNumber_Subtract(r->length, one); + if (tmp_start == NULL) goto Fail; + } else { + tmp_start = r->length; + Py_INCREF(tmp_start); + } + } + } + } + + /* Calculate stop value */ + if (slice->stop == Py_None) { + if (clamped_step < 0) { + tmp_stop = neg_one; + } else { + tmp_stop = r->length; + } + Py_INCREF(tmp_stop); + } else { + candidate = compute_slice_element(slice->stop); + if (candidate == NULL) goto Fail; + cmp_result = PyObject_RichCompareBool(candidate, zero, Py_LT); + if (cmp_result == -1) goto Fail; + if 
(cmp_result) { + /* candidate < 0 */ + tmp_stop = PyNumber_Add(r->length, candidate); + if (tmp_stop == NULL) goto Fail; + Py_CLEAR(candidate); + } else { + /* candidate >= 0 */ + tmp_stop = candidate; + candidate = NULL; + } + cmp_result = PyObject_RichCompareBool(tmp_stop, zero, Py_LT); + if (cmp_result == -1) goto Fail; + if (cmp_result) { + /* tmp_stop < 0 */ + Py_CLEAR(tmp_stop); + if (clamped_step < 0) { + tmp_stop = neg_one; + } else { + tmp_stop = zero; + } + Py_INCREF(tmp_stop); + } else { + /* tmp_stop >= 0 */ + cmp_result = PyObject_RichCompareBool(tmp_stop, r->length, Py_GE); + if (cmp_result == -1) goto Fail; + if (cmp_result) { + /* tmp_stop >= r->length */ + Py_CLEAR(tmp_stop); + if (clamped_step < 0) { + tmp_stop = PyNumber_Subtract(r->length, one); + if (tmp_stop == NULL) goto Fail; + } else { + tmp_stop = r->length; + Py_INCREF(tmp_start); + } + } + } + } + + /* Check if the slice is empty or not */ + if (clamped_step < 0) { + has_elements = PyObject_RichCompareBool(tmp_start, tmp_stop, Py_GT); + } else { + has_elements = PyObject_RichCompareBool(tmp_start, tmp_stop, Py_LT); + } + if (has_elements == -1) goto Fail; + + *start = tmp_start; + *stop = tmp_stop; + *step = tmp_step; + Py_DECREF(neg_one); + Py_DECREF(one); + Py_DECREF(zero); + return has_elements; + + Fail: + Py_XDECREF(tmp_start); + Py_XDECREF(tmp_stop); + Py_XDECREF(tmp_step); + Py_XDECREF(candidate); + Py_XDECREF(neg_one); + Py_XDECREF(one); + Py_XDECREF(zero); + return -1; } -/* Pickling support */ static PyObject * -range_reduce(rangeobject *r, PyObject *args) +compute_slice(rangeobject *r, PyObject *_slice) { - return Py_BuildValue("(O(OOO))", Py_TYPE(r), - r->start, r->stop, r->step); + PySliceObject *slice = (PySliceObject *) _slice; + rangeobject *result; + PyObject *start = NULL, *stop = NULL, *step = NULL; + PyObject *substart = NULL, *substop = NULL, *substep = NULL; + int has_elements; + + has_elements = compute_slice_indices(r, slice, &start, &stop, &step); + if (has_elements == -1) return NULL; + + substep = PyNumber_Multiply(r->step, step); + if (substep == NULL) goto fail; + Py_CLEAR(step); + + substart = compute_item(r, start); + if (substart == NULL) goto fail; + Py_CLEAR(start); + + if (has_elements) { + substop = compute_item(r, stop); + if (substop == NULL) goto fail; + } else { + substop = substart; + Py_INCREF(substop); + } + Py_CLEAR(stop); + + result = make_range_object(Py_TYPE(r), substart, substop, substep); + if (result != NULL) { + return (PyObject *) result; + } +fail: + Py_XDECREF(start); + Py_XDECREF(stop); + Py_XDECREF(step); + Py_XDECREF(substart); + Py_XDECREF(substop); + Py_XDECREF(substep); + return NULL; } /* Assumes (PyLong_CheckExact(ob) || PyBool_Check(ob)) */ @@ -325,7 +600,8 @@ } static int -range_contains(rangeobject *r, PyObject *ob) { +range_contains(rangeobject *r, PyObject *ob) +{ if (PyLong_CheckExact(ob) || PyBool_Check(ob)) return range_contains_long(r, ob); @@ -337,10 +613,13 @@ range_count(rangeobject *r, PyObject *ob) { if (PyLong_CheckExact(ob) || PyBool_Check(ob)) { - if (range_contains_long(r, ob)) - Py_RETURN_TRUE; + int result = range_contains_long(r, ob); + if (result == -1) + return NULL; + else if (result) + return PyLong_FromLong(1); else - Py_RETURN_FALSE; + return PyLong_FromLong(0); } else { Py_ssize_t count; count = _PySequence_IterSearch((PyObject*)r, ob, PY_ITERSEARCH_COUNT); @@ -353,10 +632,7 @@ static PyObject * range_index(rangeobject *r, PyObject *ob) { - PyObject *idx, *tmp; int contains; - PyObject *format_tuple, *err_string; - static 
PyObject *err_format = NULL; if (!PyLong_CheckExact(ob) && !PyBool_Check(ob)) { Py_ssize_t index; @@ -370,49 +646,88 @@ if (contains == -1) return NULL; - if (!contains) - goto value_error; - - tmp = PyNumber_Subtract(ob, r->start); - if (tmp == NULL) - return NULL; - - /* idx = (ob - r.start) // r.step */ - idx = PyNumber_FloorDivide(tmp, r->step); - Py_DECREF(tmp); - return idx; - -value_error: - - /* object is not in the range */ - if (err_format == NULL) { - err_format = PyUnicode_FromString("%r is not in range"); - if (err_format == NULL) + if (contains) { + PyObject *idx, *tmp = PyNumber_Subtract(ob, r->start); + if (tmp == NULL) return NULL; + /* idx = (ob - r.start) // r.step */ + idx = PyNumber_FloorDivide(tmp, r->step); + Py_DECREF(tmp); + return idx; } - format_tuple = PyTuple_Pack(1, ob); - if (format_tuple == NULL) - return NULL; - err_string = PyUnicode_Format(err_format, format_tuple); - Py_DECREF(format_tuple); - if (err_string == NULL) - return NULL; - PyErr_SetObject(PyExc_ValueError, err_string); - Py_DECREF(err_string); + + /* object is not in the range */ + PyErr_Format(PyExc_ValueError, "%R is not in range", ob); return NULL; } static PySequenceMethods range_as_sequence = { (lenfunc)range_length, /* sq_length */ - 0, /* sq_concat */ - 0, /* sq_repeat */ - (ssizeargfunc)range_item, /* sq_item */ - 0, /* sq_slice */ - 0, /* sq_ass_item */ - 0, /* sq_ass_slice */ + 0, /* sq_concat */ + 0, /* sq_repeat */ + (ssizeargfunc)range_item, /* sq_item */ + 0, /* sq_slice */ + 0, /* sq_ass_item */ + 0, /* sq_ass_slice */ (objobjproc)range_contains, /* sq_contains */ }; +static PyObject * +range_repr(rangeobject *r) +{ + Py_ssize_t istep; + + /* Check for special case values for printing. We don't always + need the step value. We don't care about errors + (it means overflow), so clear the errors. 
*/ + istep = PyNumber_AsSsize_t(r->step, NULL); + if (istep != 1 || (istep == -1 && PyErr_Occurred())) { + PyErr_Clear(); + } + + if (istep == 1) + return PyUnicode_FromFormat("range(%R, %R)", r->start, r->stop); + else + return PyUnicode_FromFormat("range(%R, %R, %R)", + r->start, r->stop, r->step); +} + +/* Pickling support */ +static PyObject * +range_reduce(rangeobject *r, PyObject *args) +{ + return Py_BuildValue("(O(OOO))", Py_TYPE(r), + r->start, r->stop, r->step); +} + +static PyObject * +range_subscript(rangeobject* self, PyObject* item) +{ + if (PyIndex_Check(item)) { + PyObject *i, *result; + i = PyNumber_Index(item); + if (!i) + return NULL; + result = compute_range_item(self, i); + Py_DECREF(i); + return result; + } + if (PySlice_Check(item)) { + return compute_slice(self, item); + } + PyErr_Format(PyExc_TypeError, + "range indices must be integers or slices, not %.200s", + item->ob_type->tp_name); + return NULL; +} + + +static PyMappingMethods range_as_mapping = { + (lenfunc)range_length, /* mp_length */ + (binaryfunc)range_subscript, /* mp_subscript */ + (objobjargproc)0, /* mp_ass_subscript */ +}; + static PyObject * range_iter(PyObject *seq); static PyObject * range_reverse(PyObject *seq); @@ -447,7 +762,7 @@ (reprfunc)range_repr, /* tp_repr */ 0, /* tp_as_number */ &range_as_sequence, /* tp_as_sequence */ - 0, /* tp_as_mapping */ + &range_as_mapping, /* tp_as_mapping */ 0, /* tp_hash */ 0, /* tp_call */ 0, /* tp_str */ @@ -507,22 +822,6 @@ return PyLong_FromLong(r->len - r->index); } -typedef struct { - PyObject_HEAD - PyObject *index; - PyObject *start; - PyObject *step; - PyObject *len; -} longrangeiterobject; - -static PyObject * -longrangeiter_len(longrangeiterobject *r, PyObject *no_args) -{ - return PyNumber_Subtract(r->len, r->index); -} - -static PyObject *rangeiter_new(PyTypeObject *, PyObject *args, PyObject *kw); - PyDoc_STRVAR(length_hint_doc, "Private method returning an estimate of len(list(it))."); @@ -532,6 +831,8 @@ {NULL, NULL} /* sentinel */ }; +static PyObject *rangeiter_new(PyTypeObject *, PyObject *args, PyObject *kw); + PyTypeObject PyRangeIter_Type = { PyVarObject_HEAD_INIT(&PyType_Type, 0) "range_iterator", /* tp_name */ @@ -606,7 +907,7 @@ is not representable as a C long, OverflowError is raised. */ static PyObject * -int_range_iter(long start, long stop, long step) +fast_range_iter(long start, long stop, long step) { rangeiterobject *it = PyObject_New(rangeiterobject, &PyRangeIter_Type); unsigned long ulen; @@ -638,7 +939,21 @@ &start, &stop, &step)) return NULL; - return int_range_iter(start, stop, step); + return fast_range_iter(start, stop, step); +} + +typedef struct { + PyObject_HEAD + PyObject *index; + PyObject *start; + PyObject *step; + PyObject *len; +} longrangeiterobject; + +static PyObject * +longrangeiter_len(longrangeiterobject *r, PyObject *no_args) +{ + return PyNumber_Subtract(r->len, r->index); } static PyMethodDef longrangeiter_methods[] = { @@ -752,7 +1067,7 @@ PyErr_Clear(); goto long_range; } - int_it = int_range_iter(lstart, lstop, lstep); + int_it = fast_range_iter(lstart, lstop, lstep); if (int_it == NULL && PyErr_ExceptionMatches(PyExc_OverflowError)) { PyErr_Clear(); goto long_range; @@ -767,14 +1082,11 @@ /* Do all initialization here, so we can DECREF on failure. 
*/ it->start = r->start; it->step = r->step; + it->len = r->length; Py_INCREF(it->start); Py_INCREF(it->step); + Py_INCREF(it->len); - it->len = it->index = NULL; - - it->len = range_length_obj(r); - if (!it->len) - goto create_failure; it->index = PyLong_FromLong(0); if (!it->index) goto create_failure; @@ -791,7 +1103,7 @@ { rangeobject *range = (rangeobject*) seq; longrangeiterobject *it; - PyObject *one, *sum, *diff, *len = NULL, *product; + PyObject *one, *sum, *diff, *product; long lstart, lstop, lstep, new_start, new_stop; unsigned long ulen; @@ -854,7 +1166,7 @@ new_stop = lstart - lstep; new_start = (long)(new_stop + ulen * lstep); - return int_range_iter(new_start, new_stop, -lstep); + return fast_range_iter(new_start, new_stop, -lstep); long_range: it = PyObject_New(longrangeiterobject, &PyLongRangeIter_Type); @@ -862,18 +1174,14 @@ return NULL; /* start + (len - 1) * step */ - len = range_length_obj(range); - if (!len) - goto create_failure; - - /* Steal reference to len. */ - it->len = len; + it->len = range->length; + Py_INCREF(it->len); one = PyLong_FromLong(1); if (!one) goto create_failure; - diff = PyNumber_Subtract(len, one); + diff = PyNumber_Subtract(it->len, one); Py_DECREF(one); if (!diff) goto create_failure; Modified: python/branches/pep-3151/Objects/setobject.c ============================================================================== --- python/branches/pep-3151/Objects/setobject.c (original) +++ python/branches/pep-3151/Objects/setobject.c Sat Feb 26 08:16:32 2011 @@ -1525,6 +1525,20 @@ "Remove all elements of another set from this set."); static PyObject * +set_copy_and_difference(PySetObject *so, PyObject *other) +{ + PyObject *result; + + result = set_copy(so); + if (result == NULL) + return NULL; + if (set_difference_update_internal((PySetObject *) result, other) != -1) + return result; + Py_DECREF(result); + return NULL; +} + +static PyObject * set_difference(PySetObject *so, PyObject *other) { PyObject *result; @@ -1532,13 +1546,13 @@ Py_ssize_t pos = 0; if (!PyAnySet_Check(other) && !PyDict_CheckExact(other)) { - result = set_copy(so); - if (result == NULL) - return NULL; - if (set_difference_update_internal((PySetObject *)result, other) != -1) - return result; - Py_DECREF(result); - return NULL; + return set_copy_and_difference(so, other); + } + + /* If len(so) much more than len(other), it's more efficient to simply copy + * so and then iterate other looking for common elements. */ + if ((PySet_GET_SIZE(so) >> 2) > PyObject_Size(other)) { + return set_copy_and_difference(so, other); } result = make_new_set_basetype(Py_TYPE(so), NULL); @@ -1560,6 +1574,7 @@ return result; } + /* Iterate over so, checking for common elements in other. 
*/ while (set_next(so, &pos, &entry)) { int rv = set_contains_entry((PySetObject *)other, entry); if (rv == -1) { Modified: python/branches/pep-3151/Objects/sliceobject.c ============================================================================== --- python/branches/pep-3151/Objects/sliceobject.c (original) +++ python/branches/pep-3151/Objects/sliceobject.c Sat Feb 26 08:16:32 2011 @@ -99,9 +99,10 @@ } int -PySlice_GetIndices(PySliceObject *r, Py_ssize_t length, +PySlice_GetIndices(PyObject *_r, Py_ssize_t length, Py_ssize_t *start, Py_ssize_t *stop, Py_ssize_t *step) { + PySliceObject *r = (PySliceObject*)_r; /* XXX support long ints */ if (r->step == Py_None) { *step = 1; @@ -130,10 +131,11 @@ } int -PySlice_GetIndicesEx(PySliceObject *r, Py_ssize_t length, +PySlice_GetIndicesEx(PyObject *_r, Py_ssize_t length, Py_ssize_t *start, Py_ssize_t *stop, Py_ssize_t *step, Py_ssize_t *slicelength) { + PySliceObject *r = (PySliceObject*)_r; /* this is harder to get right than you might think */ Py_ssize_t defstart, defstop; @@ -256,7 +258,7 @@ return NULL; } - if (PySlice_GetIndicesEx(self, ilen, &start, &stop, + if (PySlice_GetIndicesEx((PyObject*)self, ilen, &start, &stop, &step, &slicelength) < 0) { return NULL; } Modified: python/branches/pep-3151/Objects/stringlib/formatter.h ============================================================================== --- python/branches/pep-3151/Objects/stringlib/formatter.h (original) +++ python/branches/pep-3151/Objects/stringlib/formatter.h Sat Feb 26 08:16:32 2011 @@ -941,13 +941,8 @@ from a hard-code pseudo-locale */ LocaleInfo locale; - /* Alternate is not allowed on floats. */ - if (format->alternate) { - PyErr_SetString(PyExc_ValueError, - "Alternate form (#) not allowed in float format " - "specifier"); - goto done; - } + if (format->alternate) + flags |= Py_DTSF_ALT; if (type == '\0') { /* Omitted type specifier. Behaves in the same way as repr(x) @@ -1104,15 +1099,7 @@ from a hard-code pseudo-locale */ LocaleInfo locale; - /* Alternate is not allowed on complex. */ - if (format->alternate) { - PyErr_SetString(PyExc_ValueError, - "Alternate form (#) not allowed in complex format " - "specifier"); - goto done; - } - - /* Neither is zero pading. */ + /* Zero padding is not allowed. */ if (format->fill_char == '0') { PyErr_SetString(PyExc_ValueError, "Zero padding is not allowed in complex format " @@ -1135,6 +1122,9 @@ if (im == -1.0 && PyErr_Occurred()) goto done; + if (format->alternate) + flags |= Py_DTSF_ALT; + if (type == '\0') { /* Omitted type specifier. Should be like str(self). 
*/ type = 'r'; Modified: python/branches/pep-3151/Objects/stringlib/string_format.h ============================================================================== --- python/branches/pep-3151/Objects/stringlib/string_format.h (original) +++ python/branches/pep-3151/Objects/stringlib/string_format.h Sat Feb 26 08:16:32 2011 @@ -1192,6 +1192,11 @@ { formatteriterobject *it; + if (!PyUnicode_Check(self)) { + PyErr_Format(PyExc_TypeError, "expected str, got %s", Py_TYPE(self)->tp_name); + return NULL; + } + it = PyObject_New(formatteriterobject, &PyFormatterIter_Type); if (it == NULL) return NULL; @@ -1332,6 +1337,11 @@ PyObject *first_obj = NULL; PyObject *result = NULL; + if (!PyUnicode_Check(self)) { + PyErr_Format(PyExc_TypeError, "expected str, got %s", Py_TYPE(self)->tp_name); + return NULL; + } + it = PyObject_New(fieldnameiterobject, &PyFieldNameIter_Type); if (it == NULL) return NULL; Modified: python/branches/pep-3151/Objects/structseq.c ============================================================================== --- python/branches/pep-3151/Objects/structseq.c (original) +++ python/branches/pep-3151/Objects/structseq.c Sat Feb 26 08:16:32 2011 @@ -3,7 +3,6 @@ #include "Python.h" #include "structmember.h" -#include "structseq.h" static char visible_length_key[] = "n_sequence_fields"; static char real_length_key[] = "n_fields"; @@ -44,6 +43,18 @@ return (PyObject*)obj; } +void +PyStructSequence_SetItem(PyObject* op, Py_ssize_t i, PyObject* v) +{ + PyStructSequence_SET_ITEM(op, i, v); +} + +PyObject* +PyStructSequence_GetItem(PyObject* op, Py_ssize_t i) +{ + return PyStructSequence_GET_ITEM(op, i); +} + static void structseq_dealloc(PyStructSequence *obj) { @@ -366,3 +377,11 @@ SET_DICT_FROM_INT(real_length_key, n_members); SET_DICT_FROM_INT(unnamed_fields_key, n_unnamed_members); } + +PyTypeObject* +PyStructSequence_NewType(PyStructSequence_Desc *desc) +{ + PyTypeObject *result = (PyTypeObject*)PyType_GenericAlloc(&PyType_Type, 0); + PyStructSequence_InitType(result, desc); + return result; +} Modified: python/branches/pep-3151/Objects/tupleobject.c ============================================================================== --- python/branches/pep-3151/Objects/tupleobject.c (original) +++ python/branches/pep-3151/Objects/tupleobject.c Sat Feb 26 08:16:32 2011 @@ -86,7 +86,7 @@ { return PyErr_NoMemory(); } - nbytes += sizeof(PyTupleObject) - sizeof(PyObject *); + /* nbytes += sizeof(PyTupleObject) - sizeof(PyObject *); */ op = PyObject_GC_NewVar(PyTupleObject, &PyTuple_Type, size); if (op == NULL) @@ -689,7 +689,7 @@ PyObject* it; PyObject **src, **dest; - if (PySlice_GetIndicesEx((PySliceObject*)item, + if (PySlice_GetIndicesEx(item, PyTuple_GET_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return NULL; Modified: python/branches/pep-3151/Objects/typeobject.c ============================================================================== --- python/branches/pep-3151/Objects/typeobject.c (original) +++ python/branches/pep-3151/Objects/typeobject.c Sat Feb 26 08:16:32 2011 @@ -326,7 +326,7 @@ if (type != &PyType_Type) mod = PyDict_GetItemString(type->tp_dict, "__abstractmethods__"); if (!mod) { - PyErr_Format(PyExc_AttributeError, "__abstractmethods__"); + PyErr_SetString(PyExc_AttributeError, "__abstractmethods__"); return NULL; } Py_XINCREF(mod); @@ -340,8 +340,17 @@ abc.ABCMeta.__new__, so this function doesn't do anything special to update subclasses. 
*/ - int res = PyDict_SetItemString(type->tp_dict, - "__abstractmethods__", value); + int res; + if (value != NULL) { + res = PyDict_SetItemString(type->tp_dict, "__abstractmethods__", value); + } + else { + res = PyDict_DelItemString(type->tp_dict, "__abstractmethods__"); + if (res && PyErr_ExceptionMatches(PyExc_KeyError)) { + PyErr_SetString(PyExc_AttributeError, "__abstractmethods__"); + return -1; + } + } if (res == 0) { PyType_Modified(type); if (value && PyObject_IsTrue(value)) { @@ -893,7 +902,7 @@ /* Find the nearest base with a different tp_dealloc */ base = type; - while ((basedealloc = base->tp_dealloc) == subtype_dealloc) { + while ((/*basedealloc =*/ base->tp_dealloc) == subtype_dealloc) { base = base->tp_base; assert(base); } @@ -1895,6 +1904,12 @@ return res; } +long +PyType_GetFlags(PyTypeObject *type) +{ + return type->tp_flags; +} + static PyObject * type_new(PyTypeObject *metatype, PyObject *args, PyObject *kwds) { @@ -2304,6 +2319,57 @@ return (PyObject *)type; } +static short slotoffsets[] = { + -1, /* invalid slot */ +#include "typeslots.inc" +}; + +PyObject* PyType_FromSpec(PyType_Spec *spec) +{ + PyHeapTypeObject *res = (PyHeapTypeObject*)PyType_GenericAlloc(&PyType_Type, 0); + char *res_start = (char*)res; + PyType_Slot *slot; + + if (res == NULL) + return NULL; + res->ht_name = PyUnicode_FromString(spec->name); + if (!res->ht_name) + goto fail; + res->ht_type.tp_name = _PyUnicode_AsString(res->ht_name); + if (!res->ht_type.tp_name) + goto fail; + + res->ht_type.tp_basicsize = spec->basicsize; + res->ht_type.tp_itemsize = spec->itemsize; + res->ht_type.tp_flags = spec->flags | Py_TPFLAGS_HEAPTYPE; + + for (slot = spec->slots; slot->slot; slot++) { + if (slot->slot >= sizeof(slotoffsets)/sizeof(slotoffsets[0])) { + PyErr_SetString(PyExc_RuntimeError, "invalid slot offset"); + goto fail; + } + *(void**)(res_start + slotoffsets[slot->slot]) = slot->pfunc; + + /* need to make a copy of the docstring slot, which usually + points to a static string literal */ + if (slot->slot == Py_tp_doc) { + ssize_t len = strlen(slot->pfunc)+1; + char *tp_doc = PyObject_MALLOC(len); + if (tp_doc == NULL) + goto fail; + memcpy(tp_doc, slot->pfunc, len); + res->ht_type.tp_doc = tp_doc; + } + } + + return (PyObject*)res; + + fail: + Py_DECREF(res); + return NULL; +} + + /* Internal API to look for a name through the MRO. This returns a borrowed reference, and doesn't set an exception! */ PyObject * @@ -2898,10 +2964,7 @@ Py_ssize_t size; PyObject *slots_a, *slots_b; - if (base != b->tp_base) - return 0; - if (equiv_structs(a, base) && equiv_structs(b, base)) - return 1; + assert(base == b->tp_base); size = base->tp_basicsize; if (a->tp_dictoffset == size && b->tp_dictoffset == size) size += sizeof(PyObject *); @@ -6164,7 +6227,7 @@ and first local variable on the stack. 
*/ PyFrameObject *f = PyThreadState_GET()->frame; PyCodeObject *co = f->f_code; - int i, n; + Py_ssize_t i, n; if (co == NULL) { PyErr_SetString(PyExc_SystemError, "super(): no code object"); Modified: python/branches/pep-3151/Objects/unicodectype.c ============================================================================== --- python/branches/pep-3151/Objects/unicodectype.c (original) +++ python/branches/pep-3151/Objects/unicodectype.c Sat Feb 26 08:16:32 2011 @@ -9,7 +9,6 @@ */ #include "Python.h" -#include "unicodeobject.h" #define ALPHA_MASK 0x01 #define DECIMAL_MASK 0x02 Modified: python/branches/pep-3151/Objects/unicodeobject.c ============================================================================== --- python/branches/pep-3151/Objects/unicodeobject.c (original) +++ python/branches/pep-3151/Objects/unicodeobject.c Sat Feb 26 08:16:32 2011 @@ -41,9 +41,6 @@ #define PY_SSIZE_T_CLEAN #include "Python.h" -#include "bytes_methods.h" - -#include "unicodeobject.h" #include "ucnhash.h" #ifdef MS_WINDOWS @@ -145,16 +142,18 @@ 0, 0, 0, 0, 0, 0, 0, 0 }; -static PyObject *unicode_encode_call_errorhandler(const char *errors, +static PyObject * +unicode_encode_call_errorhandler(const char *errors, PyObject **errorHandler,const char *encoding, const char *reason, const Py_UNICODE *unicode, Py_ssize_t size, PyObject **exceptionObject, Py_ssize_t startpos, Py_ssize_t endpos, Py_ssize_t *newpos); -static void raise_encode_exception(PyObject **exceptionObject, - const char *encoding, - const Py_UNICODE *unicode, Py_ssize_t size, - Py_ssize_t startpos, Py_ssize_t endpos, - const char *reason); +static void +raise_encode_exception(PyObject **exceptionObject, + const char *encoding, + const Py_UNICODE *unicode, Py_ssize_t size, + Py_ssize_t startpos, Py_ssize_t endpos, + const char *reason); /* Same for linebreaks */ static unsigned char ascii_linebreak[] = { @@ -226,7 +225,8 @@ ((ch) < 128U ? 
ascii_linebreak[(ch)] : \ (BLOOM(bloom_linebreak, (ch)) && Py_UNICODE_ISLINEBREAK(ch))) -Py_LOCAL_INLINE(BLOOM_MASK) make_bloom_mask(Py_UNICODE* ptr, Py_ssize_t len) +Py_LOCAL_INLINE(BLOOM_MASK) +make_bloom_mask(Py_UNICODE* ptr, Py_ssize_t len) { /* calculate simple bloom-style bitmask for a given unicode string */ @@ -240,7 +240,8 @@ return mask; } -Py_LOCAL_INLINE(int) unicode_member(Py_UNICODE chr, Py_UNICODE* set, Py_ssize_t setlen) +Py_LOCAL_INLINE(int) +unicode_member(Py_UNICODE chr, Py_UNICODE* set, Py_ssize_t setlen) { Py_ssize_t i; @@ -256,9 +257,9 @@ /* --- Unicode Object ----------------------------------------------------- */ -static -int unicode_resize(register PyUnicodeObject *unicode, - Py_ssize_t length) +static int +unicode_resize(register PyUnicodeObject *unicode, + Py_ssize_t length) { void *oldstr; @@ -314,8 +315,8 @@ */ -static -PyUnicodeObject *_PyUnicode_New(Py_ssize_t length) +static PyUnicodeObject * +_PyUnicode_New(Py_ssize_t length) { register PyUnicodeObject *unicode; @@ -386,8 +387,8 @@ return NULL; } -static -void unicode_dealloc(register PyUnicodeObject *unicode) +static void +unicode_dealloc(register PyUnicodeObject *unicode) { switch (PyUnicode_CHECK_INTERNED(unicode)) { case SSTATE_NOT_INTERNED: @@ -431,8 +432,8 @@ } } -static -int _PyUnicode_Resize(PyUnicodeObject **unicode, Py_ssize_t length) +static int +_PyUnicode_Resize(PyUnicodeObject **unicode, Py_ssize_t length) { register PyUnicodeObject *v; @@ -467,13 +468,14 @@ return unicode_resize(v, length); } -int PyUnicode_Resize(PyObject **unicode, Py_ssize_t length) +int +PyUnicode_Resize(PyObject **unicode, Py_ssize_t length) { return _PyUnicode_Resize((PyUnicodeObject **)unicode, length); } -PyObject *PyUnicode_FromUnicode(const Py_UNICODE *u, - Py_ssize_t size) +PyObject * +PyUnicode_FromUnicode(const Py_UNICODE *u, Py_ssize_t size) { PyUnicodeObject *unicode; @@ -514,7 +516,8 @@ return (PyObject *)unicode; } -PyObject *PyUnicode_FromStringAndSize(const char *u, Py_ssize_t size) +PyObject * +PyUnicode_FromStringAndSize(const char *u, Py_ssize_t size) { PyUnicodeObject *unicode; @@ -561,7 +564,8 @@ return (PyObject *)unicode; } -PyObject *PyUnicode_FromString(const char *u) +PyObject * +PyUnicode_FromString(const char *u) { size_t size = strlen(u); if (size > PY_SSIZE_T_MAX) { @@ -583,8 +587,8 @@ /* Here sizeof(wchar_t) is 4 but Py_UNICODE_SIZE == 2, so we need to convert from UTF32 to UTF16. */ -PyObject *PyUnicode_FromWideChar(register const wchar_t *w, - Py_ssize_t size) +PyObject * +PyUnicode_FromWideChar(register const wchar_t *w, Py_ssize_t size) { PyUnicodeObject *unicode; register Py_ssize_t i; @@ -634,8 +638,8 @@ #else -PyObject *PyUnicode_FromWideChar(register const wchar_t *w, - Py_ssize_t size) +PyObject * +PyUnicode_FromWideChar(register const wchar_t *w, Py_ssize_t size) { PyUnicodeObject *unicode; @@ -816,8 +820,19 @@ switch (*f) { case 'c': + { +#ifndef Py_UNICODE_WIDE + int ordinal = va_arg(count, int); + if (ordinal > 0xffff) + n += 2; + else + n++; +#else (void)va_arg(count, int); - /* fall through... 
*/ + n++; +#endif + break; + } case '%': n++; break; @@ -995,8 +1010,18 @@ switch (*f) { case 'c': - *s++ = va_arg(vargs, int); + { + int ordinal = va_arg(vargs, int); +#ifndef Py_UNICODE_WIDE + if (ordinal > 0xffff) { + ordinal -= 0x10000; + *s++ = 0xD800 | (ordinal >> 10); + *s++ = 0xDC00 | (ordinal & 0x3FF); + } else +#endif + *s++ = ordinal; break; + } case 'd': makefmt(fmt, longflag, longlongflag, size_tflag, zeropad, width, precision, 'd'); @@ -1264,7 +1289,7 @@ } Py_ssize_t -PyUnicode_AsWideChar(PyUnicodeObject *unicode, +PyUnicode_AsWideChar(PyObject *unicode, wchar_t *w, Py_ssize_t size) { @@ -1272,7 +1297,7 @@ PyErr_BadInternalCall(); return -1; } - return unicode_aswidechar(unicode, w, size); + return unicode_aswidechar((PyUnicodeObject*)unicode, w, size); } wchar_t* @@ -1306,7 +1331,8 @@ #endif -PyObject *PyUnicode_FromOrdinal(int ordinal) +PyObject * +PyUnicode_FromOrdinal(int ordinal) { Py_UNICODE s[2]; @@ -1329,7 +1355,8 @@ return PyUnicode_FromUnicode(s, 1); } -PyObject *PyUnicode_FromObject(register PyObject *obj) +PyObject * +PyUnicode_FromObject(register PyObject *obj) { /* XXX Perhaps we should make this API an alias of PyObject_Str() instead ?! */ @@ -1349,9 +1376,10 @@ return NULL; } -PyObject *PyUnicode_FromEncodedObject(register PyObject *obj, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_FromEncodedObject(register PyObject *obj, + const char *encoding, + const char *errors) { Py_buffer buffer; PyObject *v; @@ -1434,23 +1462,26 @@ return 1; } -PyObject *PyUnicode_Decode(const char *s, - Py_ssize_t size, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_Decode(const char *s, + Py_ssize_t size, + const char *encoding, + const char *errors) { PyObject *buffer = NULL, *unicode; Py_buffer info; char lower[11]; /* Enough for any encoding shortcut */ if (encoding == NULL) - encoding = PyUnicode_GetDefaultEncoding(); + return PyUnicode_DecodeUTF8(s, size, errors); /* Shortcuts for common default encodings */ if (normalize_encoding(encoding, lower, sizeof(lower))) { - if (strcmp(lower, "utf-8") == 0) + if ((strcmp(lower, "utf-8") == 0) || + (strcmp(lower, "utf8") == 0)) return PyUnicode_DecodeUTF8(s, size, errors); else if ((strcmp(lower, "latin-1") == 0) || + (strcmp(lower, "latin1") == 0) || (strcmp(lower, "iso-8859-1") == 0)) return PyUnicode_DecodeLatin1(s, size, errors); #if defined(MS_WINDOWS) && defined(HAVE_USABLE_WCHAR_T) @@ -1490,9 +1521,10 @@ return NULL; } -PyObject *PyUnicode_AsDecodedObject(PyObject *unicode, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_AsDecodedObject(PyObject *unicode, + const char *encoding, + const char *errors) { PyObject *v; @@ -1514,9 +1546,10 @@ return NULL; } -PyObject *PyUnicode_AsDecodedUnicode(PyObject *unicode, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_AsDecodedUnicode(PyObject *unicode, + const char *encoding, + const char *errors) { PyObject *v; @@ -1545,10 +1578,11 @@ return NULL; } -PyObject *PyUnicode_Encode(const Py_UNICODE *s, - Py_ssize_t size, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_Encode(const Py_UNICODE *s, + Py_ssize_t size, + const char *encoding, + const char *errors) { PyObject *v, *unicode; @@ -1560,9 +1594,10 @@ return v; } -PyObject *PyUnicode_AsEncodedObject(PyObject *unicode, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_AsEncodedObject(PyObject *unicode, + const char *encoding, + const char *errors) { PyObject *v; @@ -1639,9 +1674,10 @@ #endif } -PyObject 
*PyUnicode_AsEncodedString(PyObject *unicode, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_AsEncodedString(PyObject *unicode, + const char *encoding, + const char *errors) { PyObject *v; char lower[11]; /* Enough for any encoding shortcut */ @@ -1652,15 +1688,19 @@ } if (encoding == NULL) - encoding = PyUnicode_GetDefaultEncoding(); + return PyUnicode_EncodeUTF8(PyUnicode_AS_UNICODE(unicode), + PyUnicode_GET_SIZE(unicode), + errors); /* Shortcuts for common default encodings */ if (normalize_encoding(encoding, lower, sizeof(lower))) { - if (strcmp(lower, "utf-8") == 0) + if ((strcmp(lower, "utf-8") == 0) || + (strcmp(lower, "utf8") == 0)) return PyUnicode_EncodeUTF8(PyUnicode_AS_UNICODE(unicode), PyUnicode_GET_SIZE(unicode), errors); else if ((strcmp(lower, "latin-1") == 0) || + (strcmp(lower, "latin1") == 0) || (strcmp(lower, "iso-8859-1") == 0)) return PyUnicode_EncodeLatin1(PyUnicode_AS_UNICODE(unicode), PyUnicode_GET_SIZE(unicode), @@ -1676,21 +1716,6 @@ PyUnicode_GET_SIZE(unicode), errors); } - /* During bootstrap, we may need to find the encodings - package, to load the file system encoding, and require the - file system encoding in order to load the encodings - package. - - Break out of this dependency by assuming that the path to - the encodings module is ASCII-only. XXX could try wcstombs - instead, if the file system encoding is the locale's - encoding. */ - if (Py_FileSystemDefaultEncoding && - strcmp(encoding, Py_FileSystemDefaultEncoding) == 0 && - !PyThreadState_GET()->interp->codecs_initialized) - return PyUnicode_EncodeASCII(PyUnicode_AS_UNICODE(unicode), - PyUnicode_GET_SIZE(unicode), - errors); /* Encode via the codec registry */ v = PyCodec_Encode(unicode, encoding, errors); @@ -1726,9 +1751,10 @@ return NULL; } -PyObject *PyUnicode_AsEncodedUnicode(PyObject *unicode, - const char *encoding, - const char *errors) +PyObject * +PyUnicode_AsEncodedUnicode(PyObject *unicode, + const char *encoding, + const char *errors) { PyObject *v; @@ -1757,8 +1783,9 @@ return NULL; } -PyObject *_PyUnicode_AsDefaultEncodedString(PyObject *unicode, - const char *errors) +PyObject * +_PyUnicode_AsDefaultEncodedString(PyObject *unicode, + const char *errors) { PyObject *v = ((PyUnicodeObject *)unicode)->defenc; if (v) @@ -1924,7 +1951,8 @@ return _PyUnicode_AsStringAndSize(unicode, NULL); } -Py_UNICODE *PyUnicode_AsUnicode(PyObject *unicode) +Py_UNICODE * +PyUnicode_AsUnicode(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -1936,7 +1964,8 @@ return NULL; } -Py_ssize_t PyUnicode_GetSize(PyObject *unicode) +Py_ssize_t +PyUnicode_GetSize(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -1948,7 +1977,8 @@ return -1; } -const char *PyUnicode_GetDefaultEncoding(void) +const char * +PyUnicode_GetDefaultEncoding(void) { return "utf-8"; } @@ -1987,12 +2017,12 @@ return 0 on success, -1 on error */ -static -int unicode_decode_call_errorhandler(const char *errors, PyObject **errorHandler, - const char *encoding, const char *reason, - const char **input, const char **inend, Py_ssize_t *startinpos, - Py_ssize_t *endinpos, PyObject **exceptionObject, const char **inptr, - PyUnicodeObject **output, Py_ssize_t *outpos, Py_UNICODE **outptr) +static int +unicode_decode_call_errorhandler(const char *errors, PyObject **errorHandler, + const char *encoding, const char *reason, + const char **input, const char **inend, Py_ssize_t *startinpos, + Py_ssize_t *endinpos, PyObject **exceptionObject, const char **inptr, + PyUnicodeObject 
**output, Py_ssize_t *outpos, Py_UNICODE **outptr) { static char *argparse = "O!n;decoding error handler must return (str, int) tuple"; @@ -2162,9 +2192,10 @@ (directWS && (utf7_category[(c)] == 2)) || \ (directO && (utf7_category[(c)] == 1)))) -PyObject *PyUnicode_DecodeUTF7(const char *s, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_DecodeUTF7(const char *s, + Py_ssize_t size, + const char *errors) { return PyUnicode_DecodeUTF7Stateful(s, size, errors, NULL); } @@ -2176,10 +2207,11 @@ * all the shift state (seen bits, number of bits seen, high * surrogate). */ -PyObject *PyUnicode_DecodeUTF7Stateful(const char *s, - Py_ssize_t size, - const char *errors, - Py_ssize_t *consumed) +PyObject * +PyUnicode_DecodeUTF7Stateful(const char *s, + Py_ssize_t size, + const char *errors, + Py_ssize_t *consumed) { const char *starts = s; Py_ssize_t startinpos; @@ -2366,11 +2398,12 @@ } -PyObject *PyUnicode_EncodeUTF7(const Py_UNICODE *s, - Py_ssize_t size, - int base64SetO, - int base64WhiteSpace, - const char *errors) +PyObject * +PyUnicode_EncodeUTF7(const Py_UNICODE *s, + Py_ssize_t size, + int base64SetO, + int base64WhiteSpace, + const char *errors) { PyObject *v; /* It might be possible to tighten this worst case */ @@ -2491,9 +2524,10 @@ 4, 4, 4, 4, 4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 /* F0-F4 + F5-FF */ }; -PyObject *PyUnicode_DecodeUTF8(const char *s, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_DecodeUTF8(const char *s, + Py_ssize_t size, + const char *errors) { return PyUnicode_DecodeUTF8Stateful(s, size, errors, NULL); } @@ -2511,10 +2545,11 @@ # error C 'long' size should be either 4 or 8! #endif -PyObject *PyUnicode_DecodeUTF8Stateful(const char *s, - Py_ssize_t size, - const char *errors, - Py_ssize_t *consumed) +PyObject * +PyUnicode_DecodeUTF8Stateful(const char *s, + Py_ssize_t size, + const char *errors, + Py_ssize_t *consumed) { const char *starts = s; int n; @@ -3020,7 +3055,8 @@ #undef MAX_SHORT_UNICHARS } -PyObject *PyUnicode_AsUTF8String(PyObject *unicode) +PyObject * +PyUnicode_AsUTF8String(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -3294,7 +3330,8 @@ #undef STORECHAR } -PyObject *PyUnicode_AsUTF32String(PyObject *unicode) +PyObject * +PyUnicode_AsUTF32String(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -3684,7 +3721,8 @@ #undef STORECHAR } -PyObject *PyUnicode_AsUTF16String(PyObject *unicode) +PyObject * +PyUnicode_AsUTF16String(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -3700,9 +3738,10 @@ static _PyUnicode_Name_CAPI *ucnhash_CAPI = NULL; -PyObject *PyUnicode_DecodeUnicodeEscape(const char *s, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_DecodeUnicodeEscape(const char *s, + Py_ssize_t size, + const char *errors) { const char *starts = s; Py_ssize_t startinpos; @@ -3958,8 +3997,9 @@ static const char *hexdigits = "0123456789abcdef"; -PyObject *PyUnicode_EncodeUnicodeEscape(const Py_UNICODE *s, - Py_ssize_t size) +PyObject * +PyUnicode_EncodeUnicodeEscape(const Py_UNICODE *s, + Py_ssize_t size) { PyObject *repr; char *p; @@ -4099,7 +4139,8 @@ return repr; } -PyObject *PyUnicode_AsUnicodeEscapeString(PyObject *unicode) +PyObject * +PyUnicode_AsUnicodeEscapeString(PyObject *unicode) { PyObject *s; if (!PyUnicode_Check(unicode)) { @@ -4113,9 +4154,10 @@ /* --- Raw Unicode Escape Codec ------------------------------------------- */ -PyObject *PyUnicode_DecodeRawUnicodeEscape(const char *s, - Py_ssize_t size, - const char 
*errors) +PyObject * +PyUnicode_DecodeRawUnicodeEscape(const char *s, + Py_ssize_t size, + const char *errors) { const char *starts = s; Py_ssize_t startinpos; @@ -4230,8 +4272,9 @@ return NULL; } -PyObject *PyUnicode_EncodeRawUnicodeEscape(const Py_UNICODE *s, - Py_ssize_t size) +PyObject * +PyUnicode_EncodeRawUnicodeEscape(const Py_UNICODE *s, + Py_ssize_t size) { PyObject *repr; char *p; @@ -4318,7 +4361,8 @@ return repr; } -PyObject *PyUnicode_AsRawUnicodeEscapeString(PyObject *unicode) +PyObject * +PyUnicode_AsRawUnicodeEscapeString(PyObject *unicode) { PyObject *s; if (!PyUnicode_Check(unicode)) { @@ -4333,9 +4377,10 @@ /* --- Unicode Internal Codec ------------------------------------------- */ -PyObject *_PyUnicode_DecodeUnicodeInternal(const char *s, - Py_ssize_t size, - const char *errors) +PyObject * +_PyUnicode_DecodeUnicodeInternal(const char *s, + Py_ssize_t size, + const char *errors) { const char *starts = s; Py_ssize_t startinpos; @@ -4411,9 +4456,10 @@ /* --- Latin-1 Codec ------------------------------------------------------ */ -PyObject *PyUnicode_DecodeLatin1(const char *s, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_DecodeLatin1(const char *s, + Py_ssize_t size, + const char *errors) { PyUnicodeObject *v; Py_UNICODE *p; @@ -4453,11 +4499,12 @@ } /* create or adjust a UnicodeEncodeError */ -static void make_encode_exception(PyObject **exceptionObject, - const char *encoding, - const Py_UNICODE *unicode, Py_ssize_t size, - Py_ssize_t startpos, Py_ssize_t endpos, - const char *reason) +static void +make_encode_exception(PyObject **exceptionObject, + const char *encoding, + const Py_UNICODE *unicode, Py_ssize_t size, + Py_ssize_t startpos, Py_ssize_t endpos, + const char *reason) { if (*exceptionObject == NULL) { *exceptionObject = PyUnicodeEncodeError_Create( @@ -4478,11 +4525,12 @@ } /* raises a UnicodeEncodeError */ -static void raise_encode_exception(PyObject **exceptionObject, - const char *encoding, - const Py_UNICODE *unicode, Py_ssize_t size, - Py_ssize_t startpos, Py_ssize_t endpos, - const char *reason) +static void +raise_encode_exception(PyObject **exceptionObject, + const char *encoding, + const Py_UNICODE *unicode, Py_ssize_t size, + Py_ssize_t startpos, Py_ssize_t endpos, + const char *reason) { make_encode_exception(exceptionObject, encoding, unicode, size, startpos, endpos, reason); @@ -4494,12 +4542,13 @@ build arguments, call the callback and check the arguments, put the result into newpos and return the replacement string, which has to be freed by the caller */ -static PyObject *unicode_encode_call_errorhandler(const char *errors, - PyObject **errorHandler, - const char *encoding, const char *reason, - const Py_UNICODE *unicode, Py_ssize_t size, PyObject **exceptionObject, - Py_ssize_t startpos, Py_ssize_t endpos, - Py_ssize_t *newpos) +static PyObject * +unicode_encode_call_errorhandler(const char *errors, + PyObject **errorHandler, + const char *encoding, const char *reason, + const Py_UNICODE *unicode, Py_ssize_t size, PyObject **exceptionObject, + Py_ssize_t startpos, Py_ssize_t endpos, + Py_ssize_t *newpos) { static char *argparse = "On;encoding error handler must return (str/bytes, int) tuple"; @@ -4548,10 +4597,11 @@ return resunicode; } -static PyObject *unicode_encode_ucs1(const Py_UNICODE *p, - Py_ssize_t size, - const char *errors, - int limit) +static PyObject * +unicode_encode_ucs1(const Py_UNICODE *p, + Py_ssize_t size, + const char *errors, + int limit) { /* output object */ PyObject *res; @@ -4744,14 +4794,16 @@ 
return NULL; } -PyObject *PyUnicode_EncodeLatin1(const Py_UNICODE *p, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_EncodeLatin1(const Py_UNICODE *p, + Py_ssize_t size, + const char *errors) { return unicode_encode_ucs1(p, size, errors, 256); } -PyObject *PyUnicode_AsLatin1String(PyObject *unicode) +PyObject * +PyUnicode_AsLatin1String(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -4764,9 +4816,10 @@ /* --- 7-bit ASCII Codec -------------------------------------------------- */ -PyObject *PyUnicode_DecodeASCII(const char *s, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_DecodeASCII(const char *s, + Py_ssize_t size, + const char *errors) { const char *starts = s; PyUnicodeObject *v; @@ -4823,14 +4876,16 @@ return NULL; } -PyObject *PyUnicode_EncodeASCII(const Py_UNICODE *p, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_EncodeASCII(const Py_UNICODE *p, + Py_ssize_t size, + const char *errors) { return unicode_encode_ucs1(p, size, errors, 128); } -PyObject *PyUnicode_AsASCIIString(PyObject *unicode) +PyObject * +PyUnicode_AsASCIIString(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -4854,7 +4909,8 @@ b) IsDBCSLeadByte (probably) does not work for non-DBCS multi-byte encodings, see IsDBCSLeadByteEx documentation. */ -static int is_dbcs_lead_byte(const char *s, int offset) +static int +is_dbcs_lead_byte(const char *s, int offset) { const char *curr = s + offset; @@ -4869,11 +4925,12 @@ * Decode MBCS string into unicode object. If 'final' is set, converts * trailing lead-byte too. Returns consumed size if succeed, -1 otherwise. */ -static int decode_mbcs(PyUnicodeObject **v, - const char *s, /* MBCS string */ - int size, /* sizeof MBCS string */ - int final, - const char *errors) +static int +decode_mbcs(PyUnicodeObject **v, + const char *s, /* MBCS string */ + int size, /* sizeof MBCS string */ + int final, + const char *errors) { Py_UNICODE *p; Py_ssize_t n; @@ -4952,10 +5009,11 @@ return -1; } -PyObject *PyUnicode_DecodeMBCSStateful(const char *s, - Py_ssize_t size, - const char *errors, - Py_ssize_t *consumed) +PyObject * +PyUnicode_DecodeMBCSStateful(const char *s, + Py_ssize_t size, + const char *errors, + Py_ssize_t *consumed) { PyUnicodeObject *v = NULL; int done; @@ -4990,9 +5048,10 @@ return (PyObject *)v; } -PyObject *PyUnicode_DecodeMBCS(const char *s, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_DecodeMBCS(const char *s, + Py_ssize_t size, + const char *errors) { return PyUnicode_DecodeMBCSStateful(s, size, errors, NULL); } @@ -5001,10 +5060,11 @@ * Convert unicode into string object (MBCS). * Returns 0 if succeed, -1 otherwise. 
*/ -static int encode_mbcs(PyObject **repr, - const Py_UNICODE *p, /* unicode */ - int size, /* size of unicode */ - const char* errors) +static int +encode_mbcs(PyObject **repr, + const Py_UNICODE *p, /* unicode */ + int size, /* size of unicode */ + const char* errors) { BOOL usedDefaultChar = FALSE; BOOL *pusedDefaultChar; @@ -5077,9 +5137,10 @@ return -1; } -PyObject *PyUnicode_EncodeMBCS(const Py_UNICODE *p, - Py_ssize_t size, - const char *errors) +PyObject * +PyUnicode_EncodeMBCS(const Py_UNICODE *p, + Py_ssize_t size, + const char *errors) { PyObject *repr = NULL; int ret; @@ -5108,7 +5169,8 @@ return repr; } -PyObject *PyUnicode_AsMBCSString(PyObject *unicode) +PyObject * +PyUnicode_AsMBCSString(PyObject *unicode) { if (!PyUnicode_Check(unicode)) { PyErr_BadArgument(); @@ -5125,10 +5187,11 @@ /* --- Character Mapping Codec -------------------------------------------- */ -PyObject *PyUnicode_DecodeCharmap(const char *s, - Py_ssize_t size, - PyObject *mapping, - const char *errors) +PyObject * +PyUnicode_DecodeCharmap(const char *s, + Py_ssize_t size, + PyObject *mapping, + const char *errors) { const char *starts = s; Py_ssize_t startinpos; @@ -5288,7 +5351,7 @@ /* Charmap encoding: the lookup table */ -struct encoding_map{ +struct encoding_map { PyObject_HEAD unsigned char level1[32]; int count2, count3; @@ -5415,7 +5478,6 @@ if (!result) return NULL; for (i = 0; i < 256; i++) { - key = value = NULL; key = PyLong_FromLong(decode[i]); value = PyLong_FromLong(i); if (!key || !value) @@ -5503,7 +5565,8 @@ /* Lookup the character ch in the mapping. If the character can't be found, Py_None is returned (or NULL, if another error occurred). */ -static PyObject *charmapencode_lookup(Py_UNICODE c, PyObject *mapping) +static PyObject * +charmapencode_lookup(Py_UNICODE c, PyObject *mapping) { PyObject *w = PyLong_FromLong((long)c); PyObject *x; @@ -5560,16 +5623,16 @@ typedef enum charmapencode_result { enc_SUCCESS, enc_FAILED, enc_EXCEPTION -}charmapencode_result; +} charmapencode_result; /* lookup the character, put the result in the output string and adjust various state variables. Resize the output bytes object if not enough space is available. Return a new reference to the object that was put in the output buffer, or Py_None, if the mapping was undefined (in which case no character was written) or NULL, if a reallocation error occurred. 
The caller must decref the result */ -static -charmapencode_result charmapencode_output(Py_UNICODE c, PyObject *mapping, - PyObject **outobj, Py_ssize_t *outpos) +static charmapencode_result +charmapencode_output(Py_UNICODE c, PyObject *mapping, + PyObject **outobj, Py_ssize_t *outpos) { PyObject *rep; char *outstart; @@ -5625,8 +5688,8 @@ /* handle an error in PyUnicode_EncodeCharmap Return 0 on success, -1 on error */ -static -int charmap_encoding_error( +static int +charmap_encoding_error( const Py_UNICODE *p, Py_ssize_t size, Py_ssize_t *inpos, PyObject *mapping, PyObject **exceptionObject, int *known_errorHandler, PyObject **errorHandler, const char *errors, @@ -5760,10 +5823,11 @@ return 0; } -PyObject *PyUnicode_EncodeCharmap(const Py_UNICODE *p, - Py_ssize_t size, - PyObject *mapping, - const char *errors) +PyObject * +PyUnicode_EncodeCharmap(const Py_UNICODE *p, + Py_ssize_t size, + PyObject *mapping, + const char *errors) { /* output object */ PyObject *res = NULL; @@ -5824,8 +5888,9 @@ return NULL; } -PyObject *PyUnicode_AsCharmapString(PyObject *unicode, - PyObject *mapping) +PyObject * +PyUnicode_AsCharmapString(PyObject *unicode, + PyObject *mapping) { if (!PyUnicode_Check(unicode) || mapping == NULL) { PyErr_BadArgument(); @@ -5838,10 +5903,11 @@ } /* create or adjust a UnicodeTranslateError */ -static void make_translate_exception(PyObject **exceptionObject, - const Py_UNICODE *unicode, Py_ssize_t size, - Py_ssize_t startpos, Py_ssize_t endpos, - const char *reason) +static void +make_translate_exception(PyObject **exceptionObject, + const Py_UNICODE *unicode, Py_ssize_t size, + Py_ssize_t startpos, Py_ssize_t endpos, + const char *reason) { if (*exceptionObject == NULL) { *exceptionObject = PyUnicodeTranslateError_Create( @@ -5862,10 +5928,11 @@ } /* raises a UnicodeTranslateError */ -static void raise_translate_exception(PyObject **exceptionObject, - const Py_UNICODE *unicode, Py_ssize_t size, - Py_ssize_t startpos, Py_ssize_t endpos, - const char *reason) +static void +raise_translate_exception(PyObject **exceptionObject, + const Py_UNICODE *unicode, Py_ssize_t size, + Py_ssize_t startpos, Py_ssize_t endpos, + const char *reason) { make_translate_exception(exceptionObject, unicode, size, startpos, endpos, reason); @@ -5877,12 +5944,13 @@ build arguments, call the callback and check the arguments, put the result into newpos and return the replacement string, which has to be freed by the caller */ -static PyObject *unicode_translate_call_errorhandler(const char *errors, - PyObject **errorHandler, - const char *reason, - const Py_UNICODE *unicode, Py_ssize_t size, PyObject **exceptionObject, - Py_ssize_t startpos, Py_ssize_t endpos, - Py_ssize_t *newpos) +static PyObject * +unicode_translate_call_errorhandler(const char *errors, + PyObject **errorHandler, + const char *reason, + const Py_UNICODE *unicode, Py_ssize_t size, PyObject **exceptionObject, + Py_ssize_t startpos, Py_ssize_t endpos, + Py_ssize_t *newpos) { static char *argparse = "O!n;translating error handler must return (str, int) tuple"; @@ -5932,8 +6000,8 @@ /* Lookup the character ch in the mapping and put the result in result, which must be decrefed by the caller. 
Return 0 on success, -1 on error */ -static -int charmaptranslate_lookup(Py_UNICODE c, PyObject *mapping, PyObject **result) +static int +charmaptranslate_lookup(Py_UNICODE c, PyObject *mapping, PyObject **result) { PyObject *w = PyLong_FromLong((long)c); PyObject *x; @@ -5982,8 +6050,8 @@ /* ensure that *outobj is at least requiredsize characters long, if not reallocate and adjust various state variables. Return 0 on success, -1 on error */ -static -int charmaptranslate_makespace(PyObject **outobj, Py_UNICODE **outp, +static int +charmaptranslate_makespace(PyObject **outobj, Py_UNICODE **outp, Py_ssize_t requiredsize) { Py_ssize_t oldsize = PyUnicode_GET_SIZE(*outobj); @@ -6005,10 +6073,10 @@ undefined (in which case no character was written). The called must decref result. Return 0 on success, -1 on error. */ -static -int charmaptranslate_output(const Py_UNICODE *startinp, const Py_UNICODE *curinp, - Py_ssize_t insize, PyObject *mapping, PyObject **outobj, Py_UNICODE **outp, - PyObject **res) +static int +charmaptranslate_output(const Py_UNICODE *startinp, const Py_UNICODE *curinp, + Py_ssize_t insize, PyObject *mapping, PyObject **outobj, Py_UNICODE **outp, + PyObject **res) { if (charmaptranslate_lookup(*curinp, mapping, res)) return -1; @@ -6044,10 +6112,11 @@ return 0; } -PyObject *PyUnicode_TranslateCharmap(const Py_UNICODE *p, - Py_ssize_t size, - PyObject *mapping, - const char *errors) +PyObject * +PyUnicode_TranslateCharmap(const Py_UNICODE *p, + Py_ssize_t size, + PyObject *mapping, + const char *errors) { /* output object */ PyObject *res = NULL; @@ -6186,9 +6255,10 @@ return NULL; } -PyObject *PyUnicode_Translate(PyObject *str, - PyObject *mapping, - const char *errors) +PyObject * +PyUnicode_Translate(PyObject *str, + PyObject *mapping, + const char *errors) { PyObject *result; @@ -6207,12 +6277,37 @@ return NULL; } +PyObject * +PyUnicode_TransformDecimalToASCII(Py_UNICODE *s, + Py_ssize_t length) +{ + PyObject *result; + Py_UNICODE *p; /* write pointer into result */ + Py_ssize_t i; + /* Copy to a new string */ + result = (PyObject *)_PyUnicode_New(length); + Py_UNICODE_COPY(PyUnicode_AS_UNICODE(result), s, length); + if (result == NULL) + return result; + p = PyUnicode_AS_UNICODE(result); + /* Iterate over code points */ + for (i = 0; i < length; i++) { + Py_UNICODE ch =s[i]; + if (ch > 127) { + int decimal = Py_UNICODE_TODECIMAL(ch); + if (decimal >= 0) + p[i] = '0' + decimal; + } + } + return result; +} /* --- Decimal Encoder ---------------------------------------------------- */ -int PyUnicode_EncodeDecimal(Py_UNICODE *s, - Py_ssize_t length, - char *output, - const char *errors) +int +PyUnicode_EncodeDecimal(Py_UNICODE *s, + Py_ssize_t length, + char *output, + const char *errors) { Py_UNICODE *p, *end; PyObject *errorHandler = NULL; @@ -6373,10 +6468,11 @@ start = 0; \ } -Py_ssize_t PyUnicode_Count(PyObject *str, - PyObject *substr, - Py_ssize_t start, - Py_ssize_t end) +Py_ssize_t +PyUnicode_Count(PyObject *str, + PyObject *substr, + Py_ssize_t start, + Py_ssize_t end) { Py_ssize_t result; PyUnicodeObject* str_obj; @@ -6403,11 +6499,12 @@ return result; } -Py_ssize_t PyUnicode_Find(PyObject *str, - PyObject *sub, - Py_ssize_t start, - Py_ssize_t end, - int direction) +Py_ssize_t +PyUnicode_Find(PyObject *str, + PyObject *sub, + Py_ssize_t start, + Py_ssize_t end, + int direction) { Py_ssize_t result; @@ -6439,12 +6536,12 @@ return result; } -static -int tailmatch(PyUnicodeObject *self, - PyUnicodeObject *substring, - Py_ssize_t start, - Py_ssize_t end, - int 
direction) +static int +tailmatch(PyUnicodeObject *self, + PyUnicodeObject *substring, + Py_ssize_t start, + Py_ssize_t end, + int direction) { if (substring->length == 0) return 1; @@ -6465,11 +6562,12 @@ return 0; } -Py_ssize_t PyUnicode_Tailmatch(PyObject *str, - PyObject *substr, - Py_ssize_t start, - Py_ssize_t end, - int direction) +Py_ssize_t +PyUnicode_Tailmatch(PyObject *str, + PyObject *substr, + Py_ssize_t start, + Py_ssize_t end, + int direction) { Py_ssize_t result; @@ -6493,9 +6591,9 @@ /* Apply fixfct filter to the Unicode object self and return a reference to the modified object */ -static -PyObject *fixup(PyUnicodeObject *self, - int (*fixfct)(PyUnicodeObject *s)) +static PyObject * +fixup(PyUnicodeObject *self, + int (*fixfct)(PyUnicodeObject *s)) { PyUnicodeObject *u; @@ -6517,8 +6615,8 @@ return (PyObject*) u; } -static -int fixupper(PyUnicodeObject *self) +static int +fixupper(PyUnicodeObject *self) { Py_ssize_t len = self->length; Py_UNICODE *s = self->str; @@ -6538,8 +6636,8 @@ return status; } -static -int fixlower(PyUnicodeObject *self) +static int +fixlower(PyUnicodeObject *self) { Py_ssize_t len = self->length; Py_UNICODE *s = self->str; @@ -6559,8 +6657,8 @@ return status; } -static -int fixswapcase(PyUnicodeObject *self) +static int +fixswapcase(PyUnicodeObject *self) { Py_ssize_t len = self->length; Py_UNICODE *s = self->str; @@ -6580,8 +6678,8 @@ return status; } -static -int fixcapitalize(PyUnicodeObject *self) +static int +fixcapitalize(PyUnicodeObject *self) { Py_ssize_t len = self->length; Py_UNICODE *s = self->str; @@ -6604,8 +6702,8 @@ return status; } -static -int fixtitle(PyUnicodeObject *self) +static int +fixtitle(PyUnicodeObject *self) { register Py_UNICODE *p = PyUnicode_AS_UNICODE(self); register Py_UNICODE *e; @@ -6755,11 +6853,11 @@ return NULL; } -static -PyUnicodeObject *pad(PyUnicodeObject *self, - Py_ssize_t left, - Py_ssize_t right, - Py_UNICODE fill) +static PyUnicodeObject * +pad(PyUnicodeObject *self, + Py_ssize_t left, + Py_ssize_t right, + Py_UNICODE fill) { PyUnicodeObject *u; @@ -6790,7 +6888,8 @@ return u; } -PyObject *PyUnicode_Splitlines(PyObject *string, int keepends) +PyObject * +PyUnicode_Splitlines(PyObject *string, int keepends) { PyObject *list; @@ -6806,10 +6905,10 @@ return list; } -static -PyObject *split(PyUnicodeObject *self, - PyUnicodeObject *substring, - Py_ssize_t maxcount) +static PyObject * +split(PyUnicodeObject *self, + PyUnicodeObject *substring, + Py_ssize_t maxcount) { if (maxcount < 0) maxcount = PY_SSIZE_T_MAX; @@ -6826,10 +6925,10 @@ ); } -static -PyObject *rsplit(PyUnicodeObject *self, - PyUnicodeObject *substring, - Py_ssize_t maxcount) +static PyObject * +rsplit(PyUnicodeObject *self, + PyUnicodeObject *substring, + Py_ssize_t maxcount) { if (maxcount < 0) maxcount = PY_SSIZE_T_MAX; @@ -6846,11 +6945,11 @@ ); } -static -PyObject *replace(PyUnicodeObject *self, - PyUnicodeObject *str1, - PyUnicodeObject *str2, - Py_ssize_t maxcount) +static PyObject * +replace(PyUnicodeObject *self, + PyUnicodeObject *str1, + PyUnicodeObject *str2, + Py_ssize_t maxcount) { PyUnicodeObject *u; @@ -6908,7 +7007,7 @@ } } else { - Py_ssize_t n, i, j, e; + Py_ssize_t n, i, j; Py_ssize_t product, new_size, delta; Py_UNICODE *p; @@ -6940,7 +7039,6 @@ return NULL; i = 0; p = u->str; - e = self->length - str1->length; if (str1->length > 0) { while (n-- > 0) { /* look for next match */ @@ -7185,8 +7283,8 @@ #endif -int PyUnicode_Compare(PyObject *left, - PyObject *right) +int +PyUnicode_Compare(PyObject *left, PyObject *right) { 
if (PyUnicode_Check(left) && PyUnicode_Check(right)) return unicode_compare((PyUnicodeObject *)left, @@ -7222,9 +7320,8 @@ #define TEST_COND(cond) \ ((cond) ? Py_True : Py_False) -PyObject *PyUnicode_RichCompare(PyObject *left, - PyObject *right, - int op) +PyObject * +PyUnicode_RichCompare(PyObject *left, PyObject *right, int op) { int result; @@ -7279,8 +7376,8 @@ return Py_NotImplemented; } -int PyUnicode_Contains(PyObject *container, - PyObject *element) +int +PyUnicode_Contains(PyObject *container, PyObject *element) { PyObject *str, *sub; int result; @@ -7310,8 +7407,8 @@ /* Concat to string or Unicode object giving a new Unicode object. */ -PyObject *PyUnicode_Concat(PyObject *left, - PyObject *right) +PyObject * +PyUnicode_Concat(PyObject *left, PyObject *right) { PyUnicodeObject *u = NULL, *v = NULL, *w; @@ -7425,26 +7522,11 @@ static char *kwlist[] = {"encoding", "errors", 0}; char *encoding = NULL; char *errors = NULL; - PyObject *v; if (!PyArg_ParseTupleAndKeywords(args, kwargs, "|ss:encode", kwlist, &encoding, &errors)) return NULL; - v = PyUnicode_AsEncodedString((PyObject *)self, encoding, errors); - if (v == NULL) - goto onError; - if (!PyBytes_Check(v)) { - PyErr_Format(PyExc_TypeError, - "encoder did not return a bytes object " - "(type=%.400s)", - Py_TYPE(v)->tp_name); - Py_DECREF(v); - return NULL; - } - return v; - - onError: - return NULL; + return PyUnicode_AsEncodedString((PyObject *)self, encoding, errors); } PyDoc_STRVAR(expandtabs__doc__, @@ -8242,10 +8324,11 @@ return (PyObject*) u; } -PyObject *PyUnicode_Replace(PyObject *obj, - PyObject *subobj, - PyObject *replobj, - Py_ssize_t maxcount) +PyObject * +PyUnicode_Replace(PyObject *obj, + PyObject *subobj, + PyObject *replobj, + Py_ssize_t maxcount) { PyObject *self; PyObject *str1; @@ -8309,8 +8392,8 @@ return result; } -static -PyObject *unicode_repr(PyObject *unicode) +static PyObject * +unicode_repr(PyObject *unicode) { PyObject *repr; Py_UNICODE *p; @@ -8543,9 +8626,8 @@ return (PyObject*) pad(self, width - self->length, 0, fillchar); } -PyObject *PyUnicode_Split(PyObject *s, - PyObject *sep, - Py_ssize_t maxsplit) +PyObject * +PyUnicode_Split(PyObject *s, PyObject *sep, Py_ssize_t maxsplit) { PyObject *result; @@ -8674,9 +8756,8 @@ return PyUnicode_RPartition((PyObject *)self, separator); } -PyObject *PyUnicode_RSplit(PyObject *s, - PyObject *sep, - Py_ssize_t maxsplit) +PyObject * +PyUnicode_RSplit(PyObject *s, PyObject *sep, Py_ssize_t maxsplit) { PyObject *result; @@ -8945,6 +9026,13 @@ { return PyLong_FromLong(numfree); } + +static PyObject * +unicode__decimal2ascii(PyObject *self) +{ + return PyUnicode_TransformDecimalToASCII(PyUnicode_AS_UNICODE(self), + PyUnicode_GET_SIZE(self)); +} #endif PyDoc_STRVAR(startswith__doc__, @@ -9086,7 +9174,6 @@ return Py_BuildValue("(u#)", v->str, v->length); } - static PyMethodDef unicode_methods[] = { /* Order is according to common usage: often used methods should @@ -9143,8 +9230,9 @@ #endif #if 0 - /* This one is just used for debugging the implementation. */ + /* These methods are just used for debugging the implementation. 
*/ {"freelistsize", (PyCFunction) unicode_freelistsize, METH_NOARGS}, + {"_decimal2ascii", (PyCFunction) unicode__decimal2ascii, METH_NOARGS}, #endif {"__getnewargs__", (PyCFunction)unicode_getnewargs, METH_NOARGS}, @@ -9195,7 +9283,7 @@ Py_UNICODE* result_buf; PyObject* result; - if (PySlice_GetIndicesEx((PySliceObject*)item, PyUnicode_GET_SIZE(self), + if (PySlice_GetIndicesEx(item, PyUnicode_GET_SIZE(self), &start, &stop, &step, &slicelength) < 0) { return NULL; } @@ -9361,8 +9449,8 @@ */ #define FORMATBUFLEN (size_t)10 -PyObject *PyUnicode_Format(PyObject *format, - PyObject *args) +PyObject * +PyUnicode_Format(PyObject *format, PyObject *args) { Py_UNICODE *fmt, *res; Py_ssize_t fmtcnt, rescnt, reslen, arglen, argidx; @@ -10053,7 +10141,8 @@ return s; } -void _Py_ReleaseInternedUnicodeStrings(void) +void +_Py_ReleaseInternedUnicodeStrings(void) { PyObject *keys; PyUnicodeObject *s; Modified: python/branches/pep-3151/Objects/weakrefobject.c ============================================================================== --- python/branches/pep-3151/Objects/weakrefobject.c (original) +++ python/branches/pep-3151/Objects/weakrefobject.c Sat Feb 26 08:16:32 2011 @@ -168,13 +168,20 @@ PyErr_Clear(); else if (PyUnicode_Check(nameobj)) name = _PyUnicode_AsString(nameobj); - PyOS_snprintf(buffer, sizeof(buffer), - name ? "" - : "", - self, - Py_TYPE(PyWeakref_GET_OBJECT(self))->tp_name, - PyWeakref_GET_OBJECT(self), - name); + if (name) + PyOS_snprintf(buffer, sizeof(buffer), + "", + self, + Py_TYPE(PyWeakref_GET_OBJECT(self))->tp_name, + PyWeakref_GET_OBJECT(self), + name); + else + PyOS_snprintf(buffer, sizeof(buffer), + "", + self, + Py_TYPE(PyWeakref_GET_OBJECT(self))->tp_name, + PyWeakref_GET_OBJECT(self)); + Py_XDECREF(nameobj); } return PyUnicode_FromString(buffer); Modified: python/branches/pep-3151/PC/VC6/pythoncore.dsp ============================================================================== --- python/branches/pep-3151/PC/VC6/pythoncore.dsp (original) +++ python/branches/pep-3151/PC/VC6/pythoncore.dsp Sat Feb 26 08:16:32 2011 @@ -54,7 +54,7 @@ # ADD BSC32 /nologo LINK32=link.exe # ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /dll /machine:I386 -# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python32.dll" +# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python33.dll" # SUBTRACT LINK32 /pdb:none !ELSEIF "$(CFG)" == "pythoncore - Win32 Debug" @@ -82,7 +82,7 @@ # ADD BSC32 /nologo LINK32=link.exe # ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /dll /debug /machine:I386 /pdbtype:sept -# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python32_d.dll" /pdbtype:sept +# ADD LINK32 largeint.lib kernel32.lib user32.lib advapi32.lib shell32.lib /nologo /base:"0x1e000000" /subsystem:windows /dll /debug /machine:I386 /nodefaultlib:"libc" /out:"./python33_d.dll" /pdbtype:sept # SUBTRACT LINK32 /pdb:none !ENDIF Modified: 
python/branches/pep-3151/PC/VC6/readme.txt ============================================================================== --- python/branches/pep-3151/PC/VC6/readme.txt (original) +++ python/branches/pep-3151/PC/VC6/readme.txt Sat Feb 26 08:16:32 2011 @@ -12,7 +12,7 @@ The proper order to build subprojects: 1) pythoncore (this builds the main Python DLL and library files, - python32.{dll, lib} in Release mode) + python33.{dll, lib} in Release mode) 2) python (this builds the main Python executable, python.exe in Release mode) @@ -23,7 +23,7 @@ to the subsystems they implement; see SUBPROJECTS below) When using the Debug setting, the output files have a _d added to -their name: python32_d.dll, python_d.exe, pyexpat_d.pyd, and so on. +their name: python33_d.dll, python_d.exe, pyexpat_d.pyd, and so on. SUBPROJECTS ----------- @@ -158,9 +158,17 @@ You can (theoretically) use any version of OpenSSL you like - the build process will automatically select the latest version. - You must also install ActivePerl from + You can install the NASM assembler from + http://www.nasm.us/ + for x86 builds. Put nasmw.exe anywhere in your PATH. + Note: recent releases of nasm only have nasm.exe. Just rename it to + nasmw.exe. + + You can also install ActivePerl from http://www.activestate.com/activeperl/ - as this is used by the OpenSSL build process. Complain to them . + if you like to use the official sources instead of the files from + python's subversion repository. The svn version contains pre-build + makefiles and assembly files. The MSVC project simply invokes PC/VC6/build_ssl.py to perform the build. This Python script locates and builds your OpenSSL Modified: python/branches/pep-3151/PC/VS7.1/build_ssl.bat ============================================================================== --- python/branches/pep-3151/PC/VS7.1/build_ssl.bat (original) +++ python/branches/pep-3151/PC/VS7.1/build_ssl.bat Sat Feb 26 08:16:32 2011 @@ -1,12 +1,12 @@ -if "%1" == "ReleaseAMD64" call "%MSSdk%\SetEnv" /XP64 /RETAIL - - at echo off -if not defined HOST_PYTHON ( - if %1 EQU Debug ( - set HOST_PYTHON=python_d.exe - ) ELSE ( - set HOST_PYTHON=python.exe - ) -) -%HOST_PYTHON% build_ssl.py %1 %2 - +if "%1" == "ReleaseAMD64" call "%MSSdk%\SetEnv" /XP64 /RETAIL + + at echo off +if not defined HOST_PYTHON ( + if %1 EQU Debug ( + set HOST_PYTHON=python_d.exe + ) ELSE ( + set HOST_PYTHON=python.exe + ) +) +%HOST_PYTHON% build_ssl.py %1 %2 + Modified: python/branches/pep-3151/PC/VS7.1/pythoncore.vcproj ============================================================================== --- python/branches/pep-3151/PC/VS7.1/pythoncore.vcproj (original) +++ python/branches/pep-3151/PC/VS7.1/pythoncore.vcproj Sat Feb 26 08:16:32 2011 @@ -39,15 +39,15 @@ @@ -99,15 +99,15 @@ @@ -166,15 +166,15 @@ Name="VCLinkerTool" AdditionalOptions=" /MACHINE:IA64 /USELINK:MS_SDK" AdditionalDependencies="getbuildinfo.o" - OutputFile="./python32.dll" + OutputFile="./python33.dll" LinkIncremental="1" SuppressStartupBanner="FALSE" IgnoreDefaultLibraryNames="libc" GenerateDebugInformation="TRUE" - ProgramDatabaseFile=".\./python32.pdb" + ProgramDatabaseFile=".\./python33.pdb" SubSystem="2" BaseAddress="0x1e000000" - ImportLibrary=".\./python32.lib" + ImportLibrary=".\./python33.lib" TargetMachine="0"/> @@ -233,15 +233,15 @@ Name="VCLinkerTool" AdditionalOptions=" /MACHINE:AMD64 /USELINK:MS_SDK" AdditionalDependencies="getbuildinfo.o" - OutputFile="./python32.dll" + OutputFile="./python33.dll" LinkIncremental="1" SuppressStartupBanner="TRUE" 
IgnoreDefaultLibraryNames="libc" GenerateDebugInformation="TRUE" - ProgramDatabaseFile=".\./python32.pdb" + ProgramDatabaseFile=".\./python33.pdb" SubSystem="2" BaseAddress="0x1e000000" - ImportLibrary=".\./python32.lib" + ImportLibrary=".\./python33.lib" TargetMachine="0"/> Modified: python/branches/pep-3151/PC/VS7.1/readme.txt ============================================================================== --- python/branches/pep-3151/PC/VS7.1/readme.txt (original) +++ python/branches/pep-3151/PC/VS7.1/readme.txt Sat Feb 26 08:16:32 2011 @@ -12,7 +12,7 @@ The proper order to build subprojects: 1) pythoncore (this builds the main Python DLL and library files, - python26.{dll, lib} in Release mode) + python33.{dll, lib} in Release mode) NOTE: in previous releases, this subproject was named after the release number, e.g. python20. @@ -26,7 +26,7 @@ test slave; see SUBPROJECTS below) When using the Debug setting, the output files have a _d added to -their name: python26_d.dll, python_d.exe, parser_d.pyd, and so on. +their name: python33_d.dll, python_d.exe, parser_d.pyd, and so on. SUBPROJECTS ----------- Modified: python/branches/pep-3151/PC/VS8.0/build.bat ============================================================================== --- python/branches/pep-3151/PC/VS8.0/build.bat (original) +++ python/branches/pep-3151/PC/VS8.0/build.bat Sat Feb 26 08:16:32 2011 @@ -1,17 +1,17 @@ - at echo off -rem A batch program to build or rebuild a particular configuration. -rem just for convenience. - -setlocal -set platf=Win32 -set conf=Release -set build=/build - -:CheckOpts -if "%1"=="-c" (set conf=%2) & shift & shift & goto CheckOpts -if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts -if "%1"=="-r" (set build=/rebuild) & shift & goto CheckOpts - -set cmd=devenv pcbuild.sln %build% "%conf%|%platf%" -echo %cmd% -%cmd% + at echo off +rem A batch program to build or rebuild a particular configuration. +rem just for convenience. + +setlocal +set platf=Win32 +set conf=Release +set build=/build + +:CheckOpts +if "%1"=="-c" (set conf=%2) & shift & shift & goto CheckOpts +if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts +if "%1"=="-r" (set build=/rebuild) & shift & goto CheckOpts + +set cmd=devenv pcbuild.sln %build% "%conf%|%platf%" +echo %cmd% +%cmd% Modified: python/branches/pep-3151/PC/VS8.0/build_env.bat ============================================================================== --- python/branches/pep-3151/PC/VS8.0/build_env.bat (original) +++ python/branches/pep-3151/PC/VS8.0/build_env.bat Sat Feb 26 08:16:32 2011 @@ -1 +1 @@ -@%comspec% /k env.bat %* +@%comspec% /k env.bat %* Modified: python/branches/pep-3151/PC/VS8.0/build_pgo.bat ============================================================================== --- python/branches/pep-3151/PC/VS8.0/build_pgo.bat (original) +++ python/branches/pep-3151/PC/VS8.0/build_pgo.bat Sat Feb 26 08:16:32 2011 @@ -1,41 +1,41 @@ - at echo off -rem A batch program to build PGO (Profile guided optimization) by first -rem building instrumented binaries, then running the testsuite, and -rem finally building the optimized code. -rem Note, after the first instrumented run, one can just keep on -rem building the PGUpdate configuration while developing. - -setlocal -set platf=Win32 - -rem use the performance testsuite. 
This is quick and simple -set job1=..\..\tools\pybench\pybench.py -n 1 -C 1 --with-gc -set path1=..\..\tools\pybench - -rem or the whole testsuite for more thorough testing -set job2=..\..\lib\test\regrtest.py -set path2=..\..\lib - -set job=%job1% -set clrpath=%path1% - -:CheckOpts -if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts -if "%1"=="-2" (set job=%job2%) & (set clrpath=%path2%) & shift & goto CheckOpts - -set PGI=%platf%-pgi -set PGO=%platf%-pgo - - at echo on -rem build the instrumented version -call build -p %platf% -c PGInstrument - -rem remove .pyc files, .pgc files and execute the job -%PGI%\python.exe rmpyc.py %clrpath% -del %PGI%\*.pgc -%PGI%\python.exe %job% - -rem finally build the optimized version -if exist %PGO% del /s /q %PGO% -call build -p %platf% -c PGUpdate - + at echo off +rem A batch program to build PGO (Profile guided optimization) by first +rem building instrumented binaries, then running the testsuite, and +rem finally building the optimized code. +rem Note, after the first instrumented run, one can just keep on +rem building the PGUpdate configuration while developing. + +setlocal +set platf=Win32 + +rem use the performance testsuite. This is quick and simple +set job1=..\..\tools\pybench\pybench.py -n 1 -C 1 --with-gc +set path1=..\..\tools\pybench + +rem or the whole testsuite for more thorough testing +set job2=..\..\lib\test\regrtest.py +set path2=..\..\lib + +set job=%job1% +set clrpath=%path1% + +:CheckOpts +if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts +if "%1"=="-2" (set job=%job2%) & (set clrpath=%path2%) & shift & goto CheckOpts + +set PGI=%platf%-pgi +set PGO=%platf%-pgo + + at echo on +rem build the instrumented version +call build -p %platf% -c PGInstrument + +rem remove .pyc files, .pgc files and execute the job +%PGI%\python.exe rmpyc.py %clrpath% +del %PGI%\*.pgc +%PGI%\python.exe %job% + +rem finally build the optimized version +if exist %PGO% del /s /q %PGO% +call build -p %platf% -c PGUpdate + Modified: python/branches/pep-3151/PC/VS8.0/build_ssl.bat ============================================================================== --- python/branches/pep-3151/PC/VS8.0/build_ssl.bat (original) +++ python/branches/pep-3151/PC/VS8.0/build_ssl.bat Sat Feb 26 08:16:32 2011 @@ -1,12 +1,12 @@ - at echo off -if not defined HOST_PYTHON ( - if %1 EQU Debug ( - set HOST_PYTHON=python_d.exe - if not exist python32_d.dll exit 1 - ) ELSE ( - set HOST_PYTHON=python.exe - if not exist python32.dll exit 1 - ) -) -%HOST_PYTHON% build_ssl.py %1 %2 %3 - + at echo off +if not defined HOST_PYTHON ( + if %1 EQU Debug ( + set HOST_PYTHON=python_d.exe + if not exist python33_d.dll exit 1 + ) ELSE ( + set HOST_PYTHON=python.exe + if not exist python33.dll exit 1 + ) +) +%HOST_PYTHON% build_ssl.py %1 %2 %3 + Modified: python/branches/pep-3151/PC/VS8.0/env.bat ============================================================================== --- python/branches/pep-3151/PC/VS8.0/env.bat (original) +++ python/branches/pep-3151/PC/VS8.0/env.bat Sat Feb 26 08:16:32 2011 @@ -1,5 +1,5 @@ - at echo off -set VS8=%ProgramFiles%\Microsoft Visual Studio 8 -echo Build environments: x86, ia64, amd64, x86_amd64, x86_ia64 -echo. -call "%VS8%\VC\vcvarsall.bat" %1 + at echo off +set VS8=%ProgramFiles%\Microsoft Visual Studio 8 +echo Build environments: x86, ia64, amd64, x86_amd64, x86_ia64 +echo. 
+call "%VS8%\VC\vcvarsall.bat" %1 Modified: python/branches/pep-3151/PC/VS8.0/idle.bat ============================================================================== --- python/branches/pep-3151/PC/VS8.0/idle.bat (original) +++ python/branches/pep-3151/PC/VS8.0/idle.bat Sat Feb 26 08:16:32 2011 @@ -1,15 +1,15 @@ - at echo off -rem start idle -rem Usage: idle [-d] -rem -d Run Debug build (python_d.exe). Else release build. - -setlocal -set exe=python -PATH %PATH%;..\..\..\tcltk\bin - -if "%1"=="-d" (set exe=python_d) & shift - -set cmd=%exe% ../../Lib/idlelib/idle.py %1 %2 %3 %4 %5 %6 %7 %8 %9 - -echo on -%cmd% + at echo off +rem start idle +rem Usage: idle [-d] +rem -d Run Debug build (python_d.exe). Else release build. + +setlocal +set exe=python +PATH %PATH%;..\..\..\tcltk\bin + +if "%1"=="-d" (set exe=python_d) & shift + +set cmd=%exe% ../../Lib/idlelib/idle.py %1 %2 %3 %4 %5 %6 %7 %8 %9 + +echo on +%cmd% Modified: python/branches/pep-3151/PC/VS8.0/kill_python.c ============================================================================== --- python/branches/pep-3151/PC/VS8.0/kill_python.c (original) +++ python/branches/pep-3151/PC/VS8.0/kill_python.c Sat Feb 26 08:16:32 2011 @@ -106,7 +106,7 @@ /* * XXX TODO: if we really wanted to be fancy, we could check the * modules for all processes (not just the python[_d].exe ones) - * and see if any of our DLLs are loaded (i.e. python32[_d].dll), + * and see if any of our DLLs are loaded (i.e. python33[_d].dll), * as that would also inhibit our ability to rebuild the solution. * Not worth loosing sleep over though; for now, a simple check * for just the python executable should be sufficient. Modified: python/branches/pep-3151/PC/VS8.0/pyproject.vsprops ============================================================================== --- python/branches/pep-3151/PC/VS8.0/pyproject.vsprops (original) +++ python/branches/pep-3151/PC/VS8.0/pyproject.vsprops Sat Feb 26 08:16:32 2011 @@ -38,7 +38,7 @@ /> -# Microsoft Developer Studio Generated Build File, Format Version 6.00 -# ** DO NOT EDIT ** - -# TARGTYPE "Win32 (x86) Application" 0x0101 - -CFG=wininst - Win32 Debug -!MESSAGE This is not a valid makefile. To build this project using NMAKE, -!MESSAGE use the Export Makefile command and run -!MESSAGE -!MESSAGE NMAKE /f "wininst.mak". -!MESSAGE -!MESSAGE You can specify a configuration when running NMAKE -!MESSAGE by defining the macro CFG on the command line. 
For example: -!MESSAGE -!MESSAGE NMAKE /f "wininst.mak" CFG="wininst - Win32 Debug" -!MESSAGE -!MESSAGE Possible choices for configuration are: -!MESSAGE -!MESSAGE "wininst - Win32 Release" (based on "Win32 (x86) Application") -!MESSAGE "wininst - Win32 Debug" (based on "Win32 (x86) Application") -!MESSAGE - -# Begin Project -# PROP AllowPerConfigDependencies 0 -# PROP Scc_ProjName "" -# PROP Scc_LocalPath "" -CPP=cl.exe -MTL=midl.exe -RSC=rc.exe - -!IF "$(CFG)" == "wininst - Win32 Release" - -# PROP BASE Use_MFC 0 -# PROP BASE Use_Debug_Libraries 0 -# PROP BASE Output_Dir "Release" -# PROP BASE Intermediate_Dir "Release" -# PROP BASE Target_Dir "" -# PROP Use_MFC 0 -# PROP Use_Debug_Libraries 0 -# PROP Output_Dir "..\..\lib\distutils\command" -# PROP Intermediate_Dir "temp-release" -# PROP Ignore_Export_Lib 0 -# PROP Target_Dir "" -# ADD BASE CPP /nologo /W3 /GX /O2 /D "WIN32" /D "NDEBUG" /D "_WINDOWS" /D "_MBCS" /YX /FD /c -# ADD CPP /nologo /MD /W3 /O1 /I "..\..\Include" /I "..\..\..\zlib-1.2.3" /D "WIN32" /D "NDEBUG" /D "_WINDOWS" /D "_MBCS" /YX /FD /c -# ADD BASE MTL /nologo /D "NDEBUG" /mktyplib203 /win32 -# ADD MTL /nologo /D "NDEBUG" /mktyplib203 /win32 -# ADD BASE RSC /l 0x407 /d "NDEBUG" -# ADD RSC /l 0x407 /d "NDEBUG" -BSC32=bscmake.exe -# ADD BASE BSC32 /nologo -# ADD BSC32 /nologo -LINK32=link.exe -# ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /machine:I386 -# ADD LINK32 ..\..\..\zlib-1.2.3\zlib.lib imagehlp.lib comdlg32.lib ole32.lib comctl32.lib kernel32.lib user32.lib gdi32.lib advapi32.lib shell32.lib /nologo /subsystem:windows /machine:I386 /nodefaultlib:"LIBC" /out:"..\..\lib\distutils\command/wininst-6.0.exe" - -!ELSEIF "$(CFG)" == "wininst - Win32 Debug" - -# PROP BASE Use_MFC 0 -# PROP BASE Use_Debug_Libraries 1 -# PROP BASE Output_Dir "Debug" -# PROP BASE Intermediate_Dir "Debug" -# PROP BASE Target_Dir "" -# PROP Use_MFC 0 -# PROP Use_Debug_Libraries 1 -# PROP Output_Dir "." 
-# PROP Intermediate_Dir "temp-debug" -# PROP Ignore_Export_Lib 0 -# PROP Target_Dir "" -# ADD BASE CPP /nologo /W3 /GX /ZI /Od /D "WIN32" /D "_DEBUG" /D "_WINDOWS" /D "_MBCS" /YX /FD /GZ /c -# ADD CPP /nologo /MD /W3 /Z7 /Od /I "..\..\Include" /I "..\..\..\zlib-1.2.1" /D "WIN32" /D "_DEBUG" /D "_WINDOWS" /D "_MBCS" /FR /YX /FD /c -# ADD BASE MTL /nologo /D "_DEBUG" /mktyplib203 /win32 -# ADD MTL /nologo /D "_DEBUG" /mktyplib203 /win32 -# ADD BASE RSC /l 0x407 /d "_DEBUG" -# ADD RSC /l 0x407 /d "_DEBUG" -BSC32=bscmake.exe -# ADD BASE BSC32 /nologo -# ADD BSC32 /nologo -LINK32=link.exe -# ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /debug /machine:I386 /pdbtype:sept -# ADD LINK32 ..\..\..\zlib-1.2.3\zlib.lib imagehlp.lib comdlg32.lib ole32.lib comctl32.lib kernel32.lib user32.lib gdi32.lib advapi32.lib shell32.lib /nologo /subsystem:windows /pdb:none /debug /machine:I386 /nodefaultlib:"LIBC" /out:"..\..\lib\distutils\command/wininst-6.0_d.exe" - -!ENDIF - -# Begin Target - -# Name "wininst - Win32 Release" -# Name "wininst - Win32 Debug" -# Begin Group "Source Files" - -# PROP Default_Filter "cpp;c;cxx;rc;def;r;odl;idl;hpj;bat" -# Begin Source File - -SOURCE=.\extract.c -# End Source File -# Begin Source File - -SOURCE=.\install.c -# End Source File -# Begin Source File - -SOURCE=.\install.rc -# End Source File -# End Group -# Begin Group "Header Files" - -# PROP Default_Filter "h;hpp;hxx;hm;inl" -# Begin Source File - -SOURCE=.\archive.h -# End Source File -# End Group -# Begin Group "Resource Files" - -# PROP Default_Filter "ico;cur;bmp;dlg;rc2;rct;bin;rgs;gif;jpg;jpeg;jpe" -# Begin Source File - -SOURCE=.\PythonPowered.bmp -# End Source File -# End Group -# End Target -# End Project +# Microsoft Developer Studio Project File - Name="wininst" - Package Owner=<4> +# Microsoft Developer Studio Generated Build File, Format Version 6.00 +# ** DO NOT EDIT ** + +# TARGTYPE "Win32 (x86) Application" 0x0101 + +CFG=wininst - Win32 Debug +!MESSAGE This is not a valid makefile. To build this project using NMAKE, +!MESSAGE use the Export Makefile command and run +!MESSAGE +!MESSAGE NMAKE /f "wininst.mak". +!MESSAGE +!MESSAGE You can specify a configuration when running NMAKE +!MESSAGE by defining the macro CFG on the command line. 
For example: +!MESSAGE +!MESSAGE NMAKE /f "wininst.mak" CFG="wininst - Win32 Debug" +!MESSAGE +!MESSAGE Possible choices for configuration are: +!MESSAGE +!MESSAGE "wininst - Win32 Release" (based on "Win32 (x86) Application") +!MESSAGE "wininst - Win32 Debug" (based on "Win32 (x86) Application") +!MESSAGE + +# Begin Project +# PROP AllowPerConfigDependencies 0 +# PROP Scc_ProjName "" +# PROP Scc_LocalPath "" +CPP=cl.exe +MTL=midl.exe +RSC=rc.exe + +!IF "$(CFG)" == "wininst - Win32 Release" + +# PROP BASE Use_MFC 0 +# PROP BASE Use_Debug_Libraries 0 +# PROP BASE Output_Dir "Release" +# PROP BASE Intermediate_Dir "Release" +# PROP BASE Target_Dir "" +# PROP Use_MFC 0 +# PROP Use_Debug_Libraries 0 +# PROP Output_Dir "..\..\lib\distutils\command" +# PROP Intermediate_Dir "temp-release" +# PROP Ignore_Export_Lib 0 +# PROP Target_Dir "" +# ADD BASE CPP /nologo /W3 /GX /O2 /D "WIN32" /D "NDEBUG" /D "_WINDOWS" /D "_MBCS" /YX /FD /c +# ADD CPP /nologo /MD /W3 /O1 /I "..\..\Include" /I "..\..\..\zlib-1.2.3" /D "WIN32" /D "NDEBUG" /D "_WINDOWS" /D "_MBCS" /YX /FD /c +# ADD BASE MTL /nologo /D "NDEBUG" /mktyplib203 /win32 +# ADD MTL /nologo /D "NDEBUG" /mktyplib203 /win32 +# ADD BASE RSC /l 0x407 /d "NDEBUG" +# ADD RSC /l 0x407 /d "NDEBUG" +BSC32=bscmake.exe +# ADD BASE BSC32 /nologo +# ADD BSC32 /nologo +LINK32=link.exe +# ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /machine:I386 +# ADD LINK32 ..\..\..\zlib-1.2.3\zlib.lib imagehlp.lib comdlg32.lib ole32.lib comctl32.lib kernel32.lib user32.lib gdi32.lib advapi32.lib shell32.lib /nologo /subsystem:windows /machine:I386 /nodefaultlib:"LIBC" /out:"..\..\lib\distutils\command/wininst-6.0.exe" + +!ELSEIF "$(CFG)" == "wininst - Win32 Debug" + +# PROP BASE Use_MFC 0 +# PROP BASE Use_Debug_Libraries 1 +# PROP BASE Output_Dir "Debug" +# PROP BASE Intermediate_Dir "Debug" +# PROP BASE Target_Dir "" +# PROP Use_MFC 0 +# PROP Use_Debug_Libraries 1 +# PROP Output_Dir "." 
+# PROP Intermediate_Dir "temp-debug" +# PROP Ignore_Export_Lib 0 +# PROP Target_Dir "" +# ADD BASE CPP /nologo /W3 /GX /ZI /Od /D "WIN32" /D "_DEBUG" /D "_WINDOWS" /D "_MBCS" /YX /FD /GZ /c +# ADD CPP /nologo /MD /W3 /Z7 /Od /I "..\..\Include" /I "..\..\..\zlib-1.2.1" /D "WIN32" /D "_DEBUG" /D "_WINDOWS" /D "_MBCS" /FR /YX /FD /c +# ADD BASE MTL /nologo /D "_DEBUG" /mktyplib203 /win32 +# ADD MTL /nologo /D "_DEBUG" /mktyplib203 /win32 +# ADD BASE RSC /l 0x407 /d "_DEBUG" +# ADD RSC /l 0x407 /d "_DEBUG" +BSC32=bscmake.exe +# ADD BASE BSC32 /nologo +# ADD BSC32 /nologo +LINK32=link.exe +# ADD BASE LINK32 kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib odbccp32.lib /nologo /subsystem:windows /debug /machine:I386 /pdbtype:sept +# ADD LINK32 ..\..\..\zlib-1.2.3\zlib.lib imagehlp.lib comdlg32.lib ole32.lib comctl32.lib kernel32.lib user32.lib gdi32.lib advapi32.lib shell32.lib /nologo /subsystem:windows /pdb:none /debug /machine:I386 /nodefaultlib:"LIBC" /out:"..\..\lib\distutils\command/wininst-6.0_d.exe" + +!ENDIF + +# Begin Target + +# Name "wininst - Win32 Release" +# Name "wininst - Win32 Debug" +# Begin Group "Source Files" + +# PROP Default_Filter "cpp;c;cxx;rc;def;r;odl;idl;hpj;bat" +# Begin Source File + +SOURCE=.\extract.c +# End Source File +# Begin Source File + +SOURCE=.\install.c +# End Source File +# Begin Source File + +SOURCE=.\install.rc +# End Source File +# End Group +# Begin Group "Header Files" + +# PROP Default_Filter "h;hpp;hxx;hm;inl" +# Begin Source File + +SOURCE=.\archive.h +# End Source File +# End Group +# Begin Group "Resource Files" + +# PROP Default_Filter "ico;cur;bmp;dlg;rc2;rct;bin;rgs;gif;jpg;jpeg;jpe" +# Begin Source File + +SOURCE=.\PythonPowered.bmp +# End Source File +# End Group +# End Target +# End Project Modified: python/branches/pep-3151/PC/bdist_wininst/wininst.dsw ============================================================================== --- python/branches/pep-3151/PC/bdist_wininst/wininst.dsw (original) +++ python/branches/pep-3151/PC/bdist_wininst/wininst.dsw Sat Feb 26 08:16:32 2011 @@ -1,29 +1,29 @@ -Microsoft Developer Studio Workspace File, Format Version 6.00 -# WARNING: DO NOT EDIT OR DELETE THIS WORKSPACE FILE! - -############################################################################### - -Project: "wininst"=.\wininst.dsp - Package Owner=<4> - -Package=<5> -{{{ -}}} - -Package=<4> -{{{ -}}} - -############################################################################### - -Global: - -Package=<5> -{{{ -}}} - -Package=<3> -{{{ -}}} - -############################################################################### - +Microsoft Developer Studio Workspace File, Format Version 6.00 +# WARNING: DO NOT EDIT OR DELETE THIS WORKSPACE FILE! 
+ +############################################################################### + +Project: "wininst"=.\wininst.dsp - Package Owner=<4> + +Package=<5> +{{{ +}}} + +Package=<4> +{{{ +}}} + +############################################################################### + +Global: + +Package=<5> +{{{ +}}} + +Package=<3> +{{{ +}}} + +############################################################################### + Modified: python/branches/pep-3151/PC/example_nt/example.vcproj ============================================================================== --- python/branches/pep-3151/PC/example_nt/example.vcproj (original) +++ python/branches/pep-3151/PC/example_nt/example.vcproj Sat Feb 26 08:16:32 2011 @@ -39,7 +39,7 @@ 2) { tmpdir = argv[2]; strcat_s(tmppath, _countof(tmppath), tmpdir); + /* Hack fix for bad command line: If the command is issued like this: + * $(SolutionDir)make_buildinfo.exe" Debug "$(IntDir)" + * we will get a trailing quote because IntDir ends with a backslash that then + * escapes the final ". To simplify the life for developers, catch that problem + * here by cutting it off. + * The proper command line, btw is: + * $(SolutionDir)make_buildinfo.exe" Debug "$(IntDir)\" + * Hooray for command line parsing on windows. + */ + if (strlen(tmppath) > 0 && tmppath[strlen(tmppath)-1] == '"') + tmppath[strlen(tmppath)-1] = '\0'; strcat_s(tmppath, _countof(tmppath), "\\"); } if ((do_unlink = make_buildinfo2(tmppath))) { + strcat_s(command, CMD_SIZE, "\""); strcat_s(command, CMD_SIZE, tmppath); - strcat_s(command, CMD_SIZE, "getbuildinfo2.c -DSUBWCREV "); + strcat_s(command, CMD_SIZE, "getbuildinfo2.c\" -DSUBWCREV "); } else strcat_s(command, CMD_SIZE, "..\\Modules\\getbuildinfo.c"); - strcat_s(command, CMD_SIZE, " -Fo"); + strcat_s(command, CMD_SIZE, " -Fo\""); strcat_s(command, CMD_SIZE, tmppath); - strcat_s(command, CMD_SIZE, "getbuildinfo.o -I..\\Include -I..\\PC"); + strcat_s(command, CMD_SIZE, "getbuildinfo.o\" -I..\\Include -I..\\PC"); puts(command); fflush(stdout); result = system(command); if (do_unlink) { command[0] = '\0'; + strcat_s(command, CMD_SIZE, "\""); strcat_s(command, CMD_SIZE, tmppath); - strcat_s(command, CMD_SIZE, "getbuildinfo2.c"); + strcat_s(command, CMD_SIZE, "getbuildinfo2.c\""); _unlink(command); } if (result < 0) Modified: python/branches/pep-3151/PCbuild/pcbuild.sln ============================================================================== --- python/branches/pep-3151/PCbuild/pcbuild.sln (original) +++ python/branches/pep-3151/PCbuild/pcbuild.sln Sat Feb 26 08:16:32 2011 @@ -73,8 +73,8 @@ ProjectSection(ProjectDependencies) = postProject {B11D750F-CD1F-4A96-85CE-E69A5C5259F9} = {B11D750F-CD1F-4A96-85CE-E69A5C5259F9} {86937F53-C189-40EF-8CE8-8759D8E7D480} = {86937F53-C189-40EF-8CE8-8759D8E7D480} - {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} = {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} {E5B04CC0-EB4C-42AB-B4DC-18EF95F864B0} = {E5B04CC0-EB4C-42AB-B4DC-18EF95F864B0} + {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} = {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} EndProjectSection EndProject Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "_testcapi", "_testcapi.vcproj", "{6901D91C-6E48-4BB7-9FEC-700C8131DF1D}" @@ -112,8 +112,8 @@ Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "_hashlib", "_hashlib.vcproj", "{447F05A8-F581-4CAC-A466-5AC7936E207E}" ProjectSection(ProjectDependencies) = postProject {B11D750F-CD1F-4A96-85CE-E69A5C5259F9} = {B11D750F-CD1F-4A96-85CE-E69A5C5259F9} - {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} = {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} 
{E5B04CC0-EB4C-42AB-B4DC-18EF95F864B0} = {E5B04CC0-EB4C-42AB-B4DC-18EF95F864B0} + {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} = {CF7AC3D1-E2DF-41D2-BEA6-1E2556CDEA26} EndProjectSection EndProject Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "sqlite3", "sqlite3.vcproj", "{A1A295E5-463C-437F-81CA-1F32367685DA}" @@ -133,6 +133,10 @@ EndProject Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "kill_python", "kill_python.vcproj", "{6DE10744-E396-40A5-B4E2-1B69AA7C8D31}" EndProject +Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "python3dll", "python3dll.vcproj", "{885D4898-D08D-4091-9C40-C700CFE3FC5A}" +EndProject +Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "xxlimited", "xxlimited.vcproj", "{F749B822-B489-4CA5-A3AD-CE078F5F338A}" +EndProject Global GlobalSection(SolutionConfigurationPlatforms) = preSolution Debug|Win32 = Debug|Win32 @@ -553,6 +557,37 @@ {6DE10744-E396-40A5-B4E2-1B69AA7C8D31}.Release|Win32.Build.0 = Release|Win32 {6DE10744-E396-40A5-B4E2-1B69AA7C8D31}.Release|x64.ActiveCfg = Release|x64 {6DE10744-E396-40A5-B4E2-1B69AA7C8D31}.Release|x64.Build.0 = Release|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.Debug|Win32.ActiveCfg = PGInstrument|Win32 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.Debug|x64.ActiveCfg = Debug|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.Debug|x64.Build.0 = Debug|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGInstrument|Win32.ActiveCfg = PGInstrument|Win32 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGInstrument|Win32.Build.0 = PGInstrument|Win32 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGInstrument|x64.ActiveCfg = Release|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGInstrument|x64.Build.0 = Release|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGUpdate|Win32.ActiveCfg = PGUpdate|Win32 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGUpdate|Win32.Build.0 = PGUpdate|Win32 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGUpdate|x64.ActiveCfg = Release|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.PGUpdate|x64.Build.0 = Release|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.Release|Win32.ActiveCfg = Release|Win32 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.Release|Win32.Build.0 = Release|Win32 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.Release|x64.ActiveCfg = Release|x64 + {885D4898-D08D-4091-9C40-C700CFE3FC5A}.Release|x64.Build.0 = Release|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Debug|Win32.ActiveCfg = Debug|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Debug|Win32.Build.0 = Debug|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Debug|x64.ActiveCfg = Debug|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Debug|x64.Build.0 = Debug|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGInstrument|Win32.ActiveCfg = Release|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGInstrument|Win32.Build.0 = Release|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGInstrument|x64.ActiveCfg = Release|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGInstrument|x64.Build.0 = Release|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGUpdate|Win32.ActiveCfg = Release|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGUpdate|Win32.Build.0 = Release|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGUpdate|x64.ActiveCfg = Release|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.PGUpdate|x64.Build.0 = Release|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Release|Win32.ActiveCfg = Release|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Release|Win32.Build.0 = Release|Win32 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Release|x64.ActiveCfg = Release|x64 + {F749B822-B489-4CA5-A3AD-CE078F5F338A}.Release|x64.Build.0 = 
Release|x64 EndGlobalSection GlobalSection(SolutionProperties) = preSolution HideSolutionNode = FALSE Modified: python/branches/pep-3151/PCbuild/pyproject.vsprops ============================================================================== --- python/branches/pep-3151/PCbuild/pyproject.vsprops (original) +++ python/branches/pep-3151/PCbuild/pyproject.vsprops Sat Feb 26 08:16:32 2011 @@ -38,7 +38,7 @@ /> decoding_state == STATE_NORMAL) { /* We already have a codec associated with @@ -585,12 +586,16 @@ if (badchar) { /* Need to add 1 to the line number, since this line has not been counted, yet. */ - PyErr_Format(PyExc_SyntaxError, - "Non-UTF-8 code starting with '\\x%.2x' " - "in file %.200s on line %i, " - "but no encoding declared; " - "see http://python.org/dev/peps/pep-0263/ for details", - badchar, tok->filename, tok->lineno + 1); + filename = PyUnicode_DecodeFSDefault(tok->filename); + if (filename != NULL) { + PyErr_Format(PyExc_SyntaxError, + "Non-UTF-8 code starting with '\\x%.2x' " + "in file %U on line %i, " + "but no encoding declared; " + "see http://python.org/dev/peps/pep-0263/ for details", + badchar, filename, tok->lineno + 1); + Py_DECREF(filename); + } return error_ret(tok); } #endif @@ -888,6 +893,13 @@ if (tok->prompt != NULL) { char *newtok = PyOS_Readline(stdin, stdout, tok->prompt); #ifndef PGEN + if (newtok != NULL) { + char *translated = translate_newlines(newtok, 0, tok); + PyMem_FREE(newtok); + if (translated == NULL) + return EOF; + newtok = translated; + } if (tok->encoding && newtok && *newtok) { /* Recode to UTF-8 */ Py_ssize_t buflen; Modified: python/branches/pep-3151/Python/Python-ast.c ============================================================================== --- python/branches/pep-3151/Python/Python-ast.c (original) +++ python/branches/pep-3151/Python/Python-ast.c Sat Feb 26 08:16:32 2011 @@ -3374,9 +3374,9 @@ int obj2ast_mod(PyObject* obj, mod_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; + PyObject *tmp = NULL; if (obj == Py_None) { *out = NULL; @@ -3514,10 +3514,8 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of mod, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: + PyErr_Format(PyExc_TypeError, "expected some sort of mod, but got %R", obj); + failed: Py_XDECREF(tmp); return 1; } @@ -3525,9 +3523,9 @@ int obj2ast_stmt(PyObject* obj, stmt_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; + PyObject *tmp = NULL; int lineno; int col_offset; @@ -4712,10 +4710,8 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of stmt, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: + PyErr_Format(PyExc_TypeError, "expected some sort of stmt, but got %R", obj); + failed: Py_XDECREF(tmp); return 1; } @@ -4723,9 +4719,9 @@ int obj2ast_expr(PyObject* obj, expr_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; + PyObject *tmp = NULL; int lineno; int col_offset; @@ -5830,10 +5826,8 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of expr, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: + PyErr_Format(PyExc_TypeError, "expected some sort of expr, but got %R", obj); + failed: Py_XDECREF(tmp); return 1; } @@ -5841,7 +5835,6 @@ int obj2ast_expr_context(PyObject* obj, expr_context_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; isinstance = 
PyObject_IsInstance(obj, (PyObject *)Load_type); @@ -5893,20 +5886,16 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of expr_context, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: - Py_XDECREF(tmp); + PyErr_Format(PyExc_TypeError, "expected some sort of expr_context, but got %R", obj); return 1; } int obj2ast_slice(PyObject* obj, slice_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; + PyObject *tmp = NULL; if (obj == Py_None) { *out = NULL; @@ -6018,10 +6007,8 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of slice, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: + PyErr_Format(PyExc_TypeError, "expected some sort of slice, but got %R", obj); + failed: Py_XDECREF(tmp); return 1; } @@ -6029,7 +6016,6 @@ int obj2ast_boolop(PyObject* obj, boolop_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; isinstance = PyObject_IsInstance(obj, (PyObject *)And_type); @@ -6049,18 +6035,13 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of boolop, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: - Py_XDECREF(tmp); + PyErr_Format(PyExc_TypeError, "expected some sort of boolop, but got %R", obj); return 1; } int obj2ast_operator(PyObject* obj, operator_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; isinstance = PyObject_IsInstance(obj, (PyObject *)Add_type); @@ -6160,18 +6141,13 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of operator, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: - Py_XDECREF(tmp); + PyErr_Format(PyExc_TypeError, "expected some sort of operator, but got %R", obj); return 1; } int obj2ast_unaryop(PyObject* obj, unaryop_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; isinstance = PyObject_IsInstance(obj, (PyObject *)Invert_type); @@ -6207,18 +6183,13 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of unaryop, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: - Py_XDECREF(tmp); + PyErr_Format(PyExc_TypeError, "expected some sort of unaryop, but got %R", obj); return 1; } int obj2ast_cmpop(PyObject* obj, cmpop_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; isinstance = PyObject_IsInstance(obj, (PyObject *)Eq_type); @@ -6302,11 +6273,7 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of cmpop, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: - Py_XDECREF(tmp); + PyErr_Format(PyExc_TypeError, "expected some sort of cmpop, but got %R", obj); return 1; } @@ -6377,9 +6344,9 @@ int obj2ast_excepthandler(PyObject* obj, excepthandler_ty* out, PyArena* arena) { - PyObject* tmp = NULL; int isinstance; + PyObject *tmp = NULL; int lineno; int col_offset; @@ -6473,10 +6440,8 @@ return 0; } - tmp = PyObject_Repr(obj); - if (tmp == NULL) goto failed; - PyErr_Format(PyExc_TypeError, "expected some sort of excepthandler, but got %.400s", PyBytes_AS_STRING(tmp)); -failed: + PyErr_Format(PyExc_TypeError, "expected some sort of excepthandler, but got %R", obj); + failed: Py_XDECREF(tmp); return 1; } Modified: python/branches/pep-3151/Python/_warnings.c ============================================================================== --- 
python/branches/pep-3151/Python/_warnings.c (original) +++ python/branches/pep-3151/Python/_warnings.c Sat Feb 26 08:16:32 2011 @@ -783,7 +783,7 @@ { PyObject *res; PyObject *message = PyUnicode_FromString(text); - PyObject *filename = PyUnicode_FromString(filename_str); + PyObject *filename = PyUnicode_DecodeFSDefault(filename_str); PyObject *module = NULL; int ret = -1; Modified: python/branches/pep-3151/Python/ast.c ============================================================================== --- python/branches/pep-3151/Python/ast.c (original) +++ python/branches/pep-3151/Python/ast.c Sat Feb 26 08:16:32 2011 @@ -7,7 +7,6 @@ #include "Python-ast.h" #include "grammar.h" #include "node.h" -#include "pyarena.h" #include "ast.h" #include "token.h" #include "parsetok.h" @@ -3232,7 +3231,6 @@ const char *end; if (encoding == NULL) { - buf = (char *)s; u = NULL; } else { /* check for integer overflow */ Modified: python/branches/pep-3151/Python/bltinmodule.c ============================================================================== --- python/branches/pep-3151/Python/bltinmodule.c (original) +++ python/branches/pep-3151/Python/bltinmodule.c Sat Feb 26 08:16:32 2011 @@ -5,7 +5,6 @@ #include "node.h" #include "code.h" -#include "eval.h" #include @@ -38,7 +37,7 @@ { PyObject *func, *name, *bases, *mkw, *meta, *prep, *ns, *cell; PyObject *cls = NULL; - Py_ssize_t nargs, nbases; + Py_ssize_t nargs; assert(args != NULL); if (!PyTuple_Check(args)) { @@ -62,7 +61,6 @@ bases = PyTuple_GetSlice(args, 2, nargs); if (bases == NULL) return NULL; - nbases = nargs - 2; if (kwds == NULL) { meta = NULL; @@ -311,6 +309,20 @@ Return the binary representation of an integer or long integer."); +static PyObject * +builtin_callable(PyObject *self, PyObject *v) +{ + return PyBool_FromLong((long)PyCallable_Check(v)); +} + +PyDoc_STRVAR(callable_doc, +"callable(object) -> bool\n\ +\n\ +Return whether the object is callable (i.e., some kind of function).\n\ +Note that classes are callable, as are instances of classes with a\n\ +__call__() method."); + + typedef struct { PyObject_HEAD PyObject *func; @@ -530,19 +542,20 @@ int mode = -1; int dont_inherit = 0; int supplied_flags = 0; + int optimize = -1; int is_ast; PyCompilerFlags cf; PyObject *cmd; static char *kwlist[] = {"source", "filename", "mode", "flags", - "dont_inherit", NULL}; + "dont_inherit", "optimize", NULL}; int start[] = {Py_file_input, Py_eval_input, Py_single_input}; PyObject *result; - if (!PyArg_ParseTupleAndKeywords(args, kwds, "OO&s|ii:compile", kwlist, + if (!PyArg_ParseTupleAndKeywords(args, kwds, "OO&s|iii:compile", kwlist, &cmd, PyUnicode_FSConverter, &filename_obj, &startstr, &supplied_flags, - &dont_inherit)) + &dont_inherit, &optimize)) return NULL; filename = PyBytes_AS_STRING(filename_obj); @@ -557,6 +570,12 @@ } /* XXX Warn if (supplied_flags & PyCF_MASK_OBSOLETE) != 0? 
*/ + if (optimize < -1 || optimize > 2) { + PyErr_SetString(PyExc_ValueError, + "compile(): invalid optimize value"); + goto error; + } + if (!dont_inherit) { PyEval_MergeCompilerFlags(&cf); } @@ -591,8 +610,8 @@ PyArena_Free(arena); goto error; } - result = (PyObject*)PyAST_Compile(mod, filename, - &cf, arena); + result = (PyObject*)PyAST_CompileEx(mod, filename, + &cf, optimize, arena); PyArena_Free(arena); } goto finally; @@ -602,7 +621,7 @@ if (str == NULL) goto error; - result = Py_CompileStringFlags(str, filename, start[mode], &cf); + result = Py_CompileStringExFlags(str, filename, start[mode], &cf, optimize); goto finally; error: @@ -714,7 +733,7 @@ "code object passed to eval() may not contain free variables"); return NULL; } - return PyEval_EvalCode((PyCodeObject *) cmd, globals, locals); + return PyEval_EvalCode(cmd, globals, locals); } cf.cf_flags = PyCF_SOURCE_IS_UTF8; @@ -746,7 +765,6 @@ { PyObject *v; PyObject *prog, *globals = Py_None, *locals = Py_None; - int plain = 0; if (!PyArg_UnpackTuple(args, "exec", 1, 3, &prog, &globals, &locals)) return NULL; @@ -755,7 +773,6 @@ globals = PyEval_GetGlobals(); if (locals == Py_None) { locals = PyEval_GetLocals(); - plain = 1; } if (!globals || !locals) { PyErr_SetString(PyExc_SystemError, @@ -790,7 +807,7 @@ "contain free variables"); return NULL; } - v = PyEval_EvalCode((PyCodeObject *) prog, globals, locals); + v = PyEval_EvalCode(prog, globals, locals); } else { char *str; @@ -1601,6 +1618,7 @@ PyObject *stdin_encoding; char *stdin_encoding_str; PyObject *result; + size_t len; stdin_encoding = PyObject_GetAttrString(fin, "encoding"); if (!stdin_encoding) @@ -1665,19 +1683,23 @@ Py_DECREF(stdin_encoding); return NULL; } - if (*s == '\0') { + + len = strlen(s); + if (len == 0) { PyErr_SetNone(PyExc_EOFError); result = NULL; } - else { /* strip trailing '\n' */ - size_t len = strlen(s); + else { if (len > PY_SSIZE_T_MAX) { PyErr_SetString(PyExc_OverflowError, "input: input too long"); result = NULL; } else { - result = PyUnicode_Decode(s, len-1, stdin_encoding_str, NULL); + len--; /* strip trailing '\n' */ + if (len != 0 && s[len-1] == '\r') + len--; /* strip trailing '\r' */ + result = PyUnicode_Decode(s, len, stdin_encoding_str, NULL); } } Py_DECREF(stdin_encoding); @@ -2242,6 +2264,7 @@ {"any", builtin_any, METH_O, any_doc}, {"ascii", builtin_ascii, METH_O, ascii_doc}, {"bin", builtin_bin, METH_O, bin_doc}, + {"callable", builtin_callable, METH_O, callable_doc}, {"chr", builtin_chr, METH_VARARGS, chr_doc}, {"compile", (PyCFunction)builtin_compile, METH_VARARGS | METH_KEYWORDS, compile_doc}, {"delattr", builtin_delattr, METH_VARARGS, delattr_doc}, Modified: python/branches/pep-3151/Python/ceval.c ============================================================================== --- python/branches/pep-3151/Python/ceval.c (original) +++ python/branches/pep-3151/Python/ceval.c Sat Feb 26 08:16:32 2011 @@ -13,7 +13,6 @@ #include "code.h" #include "frameobject.h" -#include "eval.h" #include "opcode.h" #include "structmember.h" @@ -27,10 +26,11 @@ typedef unsigned long long uint64; -#if defined(__ppc__) /* <- Don't know if this is the correct symbol; this - section should work for GCC on any PowerPC - platform, irrespective of OS. - POWER? Who knows :-) */ +/* PowerPC suppport. 
+ "__ppc__" appears to be the preprocessor definition to detect on OS X, whereas + "__powerpc__" appears to be the correct one for Linux with GCC +*/ +#if defined(__ppc__) || defined (__powerpc__) #define READ_TIMESTAMP(var) ppc_getcounter(&var) @@ -756,7 +756,7 @@ PyObject * -PyEval_EvalCode(PyCodeObject *co, PyObject *globals, PyObject *locals) +PyEval_EvalCode(PyObject *co, PyObject *globals, PyObject *locals) { return PyEval_EvalCodeEx(co, globals, locals, @@ -811,10 +811,6 @@ unsigned char *first_instr; PyObject *names; PyObject *consts; -#if defined(Py_DEBUG) || defined(LLTRACE) - /* Make it easier to find out where we are with a debugger */ - char *filename; -#endif /* Computed GOTOs, or the-optimization-commonly-but-improperly-known-as-"threaded code" @@ -1227,18 +1223,6 @@ #ifdef LLTRACE lltrace = PyDict_GetItemString(f->f_globals, "__lltrace__") != NULL; #endif -#if defined(Py_DEBUG) || defined(LLTRACE) - { - PyObject *error_type, *error_value, *error_traceback; - PyErr_Fetch(&error_type, &error_value, &error_traceback); - filename = _PyUnicode_AsString(co->co_filename); - if (filename == NULL && tstate->overflowed) { - /* maximum recursion depth exceeded */ - goto exit_eval_frame; - } - PyErr_Restore(error_type, error_value, error_traceback); - } -#endif why = WHY_NOT; err = 0; @@ -2706,7 +2690,7 @@ Py_DECREF(*pfunc); *pfunc = self; na++; - n++; + /* n++; */ } else Py_INCREF(func); sp = stack_pointer; @@ -3042,7 +3026,7 @@ PyTrace_RETURN, retval)) { Py_XDECREF(retval); retval = NULL; - why = WHY_EXCEPTION; + /* why = WHY_EXCEPTION; */ } } } @@ -3060,10 +3044,11 @@ the test in the if statements in Misc/gdbinit (pystack and pystackv). */ PyObject * -PyEval_EvalCodeEx(PyCodeObject *co, PyObject *globals, PyObject *locals, +PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, PyObject **args, int argcount, PyObject **kws, int kwcount, PyObject **defs, int defcount, PyObject *kwdefs, PyObject *closure) { + PyCodeObject* co = (PyCodeObject*)_co; register PyFrameObject *f; register PyObject *retval = NULL; register PyObject **fastlocals, **freevars; @@ -3969,7 +3954,7 @@ d = &PyTuple_GET_ITEM(argdefs, 0); nd = Py_SIZE(argdefs); } - return PyEval_EvalCodeEx(co, globals, + return PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL, (*pp_stack)-n, na, (*pp_stack)-2*nk, nk, d, nd, kwdefs, PyFunction_GET_CLOSURE(func)); Modified: python/branches/pep-3151/Python/ceval_gil.h ============================================================================== --- python/branches/pep-3151/Python/ceval_gil.h (original) +++ python/branches/pep-3151/Python/ceval_gil.h Sat Feb 26 08:16:32 2011 @@ -334,12 +334,15 @@ static void drop_gil(PyThreadState *tstate) { - /* NOTE: tstate is allowed to be NULL. */ if (!_Py_atomic_load_relaxed(&gil_locked)) Py_FatalError("drop_gil: GIL is not locked"); - if (tstate != NULL && - tstate != _Py_atomic_load_relaxed(&gil_last_holder)) - Py_FatalError("drop_gil: wrong thread state"); + /* tstate is allowed to be NULL (early interpreter init) */ + if (tstate != NULL) { + /* Sub-interpreter support: threads might have been switched + under our feet using PyThreadState_Swap(). Fix the GIL last + holder variable so that our heuristics work. 
*/ + _Py_atomic_store_relaxed(&gil_last_holder, tstate); + } MUTEX_LOCK(gil_mutex); _Py_ANNOTATE_RWLOCK_RELEASED(&gil_locked, /*is_write=*/1); Modified: python/branches/pep-3151/Python/compile.c ============================================================================== --- python/branches/pep-3151/Python/compile.c (original) +++ python/branches/pep-3151/Python/compile.c Sat Feb 26 08:16:32 2011 @@ -25,10 +25,8 @@ #include "Python-ast.h" #include "node.h" -#include "pyarena.h" #include "ast.h" #include "code.h" -#include "compile.h" #include "symtable.h" #include "opcode.h" @@ -141,6 +139,7 @@ PyFutureFeatures *c_future; /* pointer to module's __future__ */ PyCompilerFlags *c_flags; + int c_optimize; /* optimization level */ int c_interactive; /* true if in interactive mode */ int c_nestlevel; @@ -177,7 +176,7 @@ static int compiler_in_loop(struct compiler *); static int inplace_binop(struct compiler *, operator_ty); -static int expr_constant(expr_ty e); +static int expr_constant(struct compiler *, expr_ty); static int compiler_with(struct compiler *, stmt_ty); static int compiler_call_helper(struct compiler *c, int n, @@ -256,8 +255,8 @@ } PyCodeObject * -PyAST_Compile(mod_ty mod, const char *filename, PyCompilerFlags *flags, - PyArena *arena) +PyAST_CompileEx(mod_ty mod, const char *filename, PyCompilerFlags *flags, + int optimize, PyArena *arena) { struct compiler c; PyCodeObject *co = NULL; @@ -285,6 +284,7 @@ c.c_future->ff_features = merged; flags->cf_flags = merged; c.c_flags = flags; + c.c_optimize = (optimize == -1) ? Py_OptimizeFlag : optimize; c.c_nestlevel = 0; c.c_st = PySymtable_Build(mod, filename, c.c_future); @@ -1151,7 +1151,7 @@ if (!asdl_seq_LEN(stmts)) return 1; st = (stmt_ty)asdl_seq_GET(stmts, 0); - if (compiler_isdocstring(st) && Py_OptimizeFlag < 2) { + if (compiler_isdocstring(st) && c->c_optimize < 2) { /* don't generate docstrings if -OO */ i = 1; VISIT(c, expr, st->v.Expr.value); @@ -1465,7 +1465,7 @@ st = (stmt_ty)asdl_seq_GET(s->v.FunctionDef.body, 0); docstring = compiler_isdocstring(st); - if (docstring && Py_OptimizeFlag < 2) + if (docstring && c->c_optimize < 2) first_const = st->v.Expr.value->v.Str.s; if (compiler_add_o(c, c->u->u_consts, first_const) < 0) { compiler_exit_scope(c); @@ -1699,7 +1699,7 @@ if (end == NULL) return 0; - constant = expr_constant(s->v.If.test); + constant = expr_constant(c, s->v.If.test); /* constant = 0: "if 0" * constant = 1: "if 1", "if 2", ... * constant = -1: rest */ @@ -1761,7 +1761,7 @@ compiler_while(struct compiler *c, stmt_ty s) { basicblock *loop, *orelse, *end, *anchor = NULL; - int constant = expr_constant(s->v.While.test); + int constant = expr_constant(c, s->v.While.test); if (constant == 0) { if (s->v.While.orelse) @@ -2213,7 +2213,7 @@ static PyObject *assertion_error = NULL; basicblock *end; - if (Py_OptimizeFlag) + if (c->c_optimize) return 1; if (assertion_error == NULL) { assertion_error = PyUnicode_InternFromString("AssertionError"); @@ -3013,7 +3013,7 @@ */ static int -expr_constant(expr_ty e) +expr_constant(struct compiler *c, expr_ty e) { char *id; switch (e->kind) { @@ -3031,7 +3031,7 @@ if (strcmp(id, "False") == 0) return 0; if (strcmp(id, "None") == 0) return 0; if (strcmp(id, "__debug__") == 0) - return ! Py_OptimizeFlag; + return ! 
c->c_optimize; /* fall through */ default: return -1; @@ -4082,3 +4082,13 @@ assemble_free(&a); return co; } + +#undef PyAST_Compile +PyAPI_FUNC(PyCodeObject *) +PyAST_Compile(mod_ty mod, const char *filename, PyCompilerFlags *flags, + PyArena *arena) +{ + return PyAST_CompileEx(mod, filename, flags, -1, arena); +} + + Modified: python/branches/pep-3151/Python/dtoa.c ============================================================================== --- python/branches/pep-3151/Python/dtoa.c (original) +++ python/branches/pep-3151/Python/dtoa.c Sat Feb 26 08:16:32 2011 @@ -2055,7 +2055,7 @@ + Exp_msk1 ; word1(&rv) = 0; - dsign = 0; + /* dsign = 0; */ break; } } @@ -2092,7 +2092,7 @@ goto undfl; } } - dsign = 1 - dsign; + /* dsign = 1 - dsign; */ break; } if ((aadj = ratio(delta, bs)) <= 2.) { Modified: python/branches/pep-3151/Python/dynload_aix.c ============================================================================== --- python/branches/pep-3151/Python/dynload_aix.c (original) +++ python/branches/pep-3151/Python/dynload_aix.c Sat Feb 26 08:16:32 2011 @@ -12,7 +12,7 @@ #ifdef AIX_GENUINE_CPLUSPLUS -#include "/usr/lpp/xlC/include/load.h" +#include #define aix_load loadAndInit #else #define aix_load load @@ -154,7 +154,7 @@ } -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/pep-3151/Python/dynload_dl.c ============================================================================== --- python/branches/pep-3151/Python/dynload_dl.c (original) +++ python/branches/pep-3151/Python/dynload_dl.c Sat Feb 26 08:16:32 2011 @@ -10,17 +10,17 @@ extern char *Py_GetProgramName(void); const struct filedescr _PyImport_DynLoadFiletab[] = { - {".o", "rb", C_EXTENSION}, - {"module.o", "rb", C_EXTENSION}, - {0, 0} + {".o", "rb", C_EXTENSION}, + {"module.o", "rb", C_EXTENSION}, + {0, 0} }; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, - const char *pathname, FILE *fp) +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, + const char *pathname, FILE *fp) { - char funcname[258]; + char funcname[258]; - PyOS_snprintf(funcname, sizeof(funcname), "PyInit_%.200s", shortname); - return dl_loadmod(Py_GetProgramName(), pathname, funcname); + PyOS_snprintf(funcname, sizeof(funcname), "PyInit_%.200s", shortname); + return dl_loadmod(Py_GetProgramName(), pathname, funcname); } Modified: python/branches/pep-3151/Python/dynload_hpux.c ============================================================================== --- python/branches/pep-3151/Python/dynload_hpux.c (original) +++ python/branches/pep-3151/Python/dynload_hpux.c Sat Feb 26 08:16:32 2011 @@ -19,7 +19,7 @@ {0, 0} }; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/pep-3151/Python/dynload_next.c ============================================================================== --- python/branches/pep-3151/Python/dynload_next.c (original) +++ python/branches/pep-3151/Python/dynload_next.c Sat Feb 26 08:16:32 2011 @@ -31,8 +31,8 @@ #define LINKOPTIONS NSLINKMODULE_OPTION_BINDNOW| \ NSLINKMODULE_OPTION_RETURN_ON_ERROR|NSLINKMODULE_OPTION_PRIVATE #endif -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, - const char *pathname, FILE *fp) +dl_funcptr _PyImport_GetDynLoadFunc(const char 
*shortname, + const char *pathname, FILE *fp) { dl_funcptr p = NULL; char funcname[258]; Modified: python/branches/pep-3151/Python/dynload_os2.c ============================================================================== --- python/branches/pep-3151/Python/dynload_os2.c (original) +++ python/branches/pep-3151/Python/dynload_os2.c Sat Feb 26 08:16:32 2011 @@ -15,7 +15,7 @@ {0, 0} }; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/pep-3151/Python/dynload_shlib.c ============================================================================== --- python/branches/pep-3151/Python/dynload_shlib.c (original) +++ python/branches/pep-3151/Python/dynload_shlib.c Sat Feb 26 08:16:32 2011 @@ -53,6 +53,8 @@ #else /* !__VMS */ {"." SOABI ".so", "rb", C_EXTENSION}, {"module." SOABI ".so", "rb", C_EXTENSION}, + {".abi" PYTHON_ABI_STRING ".so", "rb", C_EXTENSION}, + {"module.abi" PYTHON_ABI_STRING ".so", "rb", C_EXTENSION}, {".so", "rb", C_EXTENSION}, {"module.so", "rb", C_EXTENSION}, #endif /* __VMS */ @@ -73,7 +75,7 @@ static int nhandles = 0; -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; Modified: python/branches/pep-3151/Python/dynload_win.c ============================================================================== --- python/branches/pep-3151/Python/dynload_win.c (original) +++ python/branches/pep-3151/Python/dynload_win.c Sat Feb 26 08:16:32 2011 @@ -134,6 +134,15 @@ !strncmp(import_name,"python",6)) { char *pch; +#ifndef _DEBUG + /* In a release version, don't claim that python3.dll is + a Python DLL. */ + if (strcmp(import_name, "python3.dll") == 0) { + import_data += 20; + continue; + } +#endif + /* Ensure python prefix is followed only by numbers to the end of the basename */ pch = import_name + 6; @@ -162,13 +171,16 @@ return NULL; } - -dl_funcptr _PyImport_GetDynLoadFunc(const char *fqname, const char *shortname, +dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp) { dl_funcptr p; char funcname[258], *import_python; +#ifndef _DEBUG + _Py_CheckPython3(); +#endif + PyOS_snprintf(funcname, sizeof(funcname), "PyInit_%.200s", shortname); { Modified: python/branches/pep-3151/Python/errors.c ============================================================================== --- python/branches/pep-3151/Python/errors.c (original) +++ python/branches/pep-3151/Python/errors.c Sat Feb 26 08:16:32 2011 @@ -520,7 +520,7 @@ int ierr, const char *filename) { - PyObject *name = filename ? PyUnicode_FromString(filename) : NULL; + PyObject *name = filename ? PyUnicode_DecodeFSDefault(filename) : NULL; PyObject *ret = PyErr_SetExcFromWindowsErrWithFilenameObject(exc, ierr, name); @@ -557,7 +557,7 @@ int ierr, const char *filename) { - PyObject *name = filename ? PyUnicode_FromString(filename) : NULL; + PyObject *name = filename ? 
PyUnicode_DecodeFSDefault(filename) : NULL; PyObject *result = PyErr_SetExcFromWindowsErrWithFilenameObject( PyExc_WindowsError, ierr, name); Modified: python/branches/pep-3151/Python/future.c ============================================================================== --- python/branches/pep-3151/Python/future.c (original) +++ python/branches/pep-3151/Python/future.c Sat Feb 26 08:16:32 2011 @@ -4,7 +4,6 @@ #include "token.h" #include "graminit.h" #include "code.h" -#include "compile.h" #include "symtable.h" #define UNDEFINED_FUTURE_FEATURE "future feature %.100s is not defined" Modified: python/branches/pep-3151/Python/getargs.c ============================================================================== --- python/branches/pep-3151/Python/getargs.c (original) +++ python/branches/pep-3151/Python/getargs.c Sat Feb 26 08:16:32 2011 @@ -146,10 +146,19 @@ } static int -addcleanup(void *ptr, PyObject **freelist, PyCapsule_Destructor destr) +addcleanup(void *ptr, PyObject **freelist, int is_buffer) { PyObject *cobj; const char *name; + PyCapsule_Destructor destr; + + if (is_buffer) { + destr = cleanup_buffer; + name = GETARGS_CAPSULE_NAME_CLEANUP_BUFFER; + } else { + destr = cleanup_ptr; + name = GETARGS_CAPSULE_NAME_CLEANUP_PTR; + } if (!*freelist) { *freelist = PyList_New(0); @@ -159,13 +168,6 @@ } } - if (destr == cleanup_ptr) { - name = GETARGS_CAPSULE_NAME_CLEANUP_PTR; - } else if (destr == cleanup_buffer) { - name = GETARGS_CAPSULE_NAME_CLEANUP_BUFFER; - } else { - return -1; - } cobj = PyCapsule_New(ptr, name, destr); if (!cobj) { destr(ptr); @@ -597,8 +599,19 @@ #define FETCH_SIZE int *q=NULL;Py_ssize_t *q2=NULL;\ if (flags & FLAG_SIZE_T) q2=va_arg(*p_va, Py_ssize_t*); \ else q=va_arg(*p_va, int*); -#define STORE_SIZE(s) if (flags & FLAG_SIZE_T) *q2=s; else *q=s; +#define STORE_SIZE(s) \ + if (flags & FLAG_SIZE_T) \ + *q2=s; \ + else { \ + if (INT_MAX < s) { \ + PyErr_SetString(PyExc_OverflowError, \ + "size does not fit in an int"); \ + return converterr("", arg, msgbuf, bufsize); \ + } \ + *q=s; \ + } #define BUFFER_LEN ((flags & FLAG_SIZE_T) ? 
*q2:*q) +#define RETURN_ERR_OCCURRED return msgbuf const char *format = *p_format; char c = *format++; @@ -610,19 +623,19 @@ char *p = va_arg(*p_va, char *); long ival; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = PyLong_AsLong(arg); if (ival == -1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else if (ival < 0) { PyErr_SetString(PyExc_OverflowError, - "unsigned byte integer is less than minimum"); - return converterr("integer", arg, msgbuf, bufsize); + "unsigned byte integer is less than minimum"); + RETURN_ERR_OCCURRED; } else if (ival > UCHAR_MAX) { PyErr_SetString(PyExc_OverflowError, - "unsigned byte integer is greater than maximum"); - return converterr("integer", arg, msgbuf, bufsize); + "unsigned byte integer is greater than maximum"); + RETURN_ERR_OCCURRED; } else *p = (unsigned char) ival; @@ -634,10 +647,10 @@ char *p = va_arg(*p_va, char *); long ival; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = PyLong_AsUnsignedLongMask(arg); if (ival == -1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = (unsigned char) ival; break; @@ -647,19 +660,19 @@ short *p = va_arg(*p_va, short *); long ival; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = PyLong_AsLong(arg); if (ival == -1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else if (ival < SHRT_MIN) { PyErr_SetString(PyExc_OverflowError, - "signed short integer is less than minimum"); - return converterr("integer", arg, msgbuf, bufsize); + "signed short integer is less than minimum"); + RETURN_ERR_OCCURRED; } else if (ival > SHRT_MAX) { PyErr_SetString(PyExc_OverflowError, - "signed short integer is greater than maximum"); - return converterr("integer", arg, msgbuf, bufsize); + "signed short integer is greater than maximum"); + RETURN_ERR_OCCURRED; } else *p = (short) ival; @@ -671,10 +684,10 @@ unsigned short *p = va_arg(*p_va, unsigned short *); long ival; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = PyLong_AsUnsignedLongMask(arg); if (ival == -1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = (unsigned short) ival; break; @@ -684,19 +697,19 @@ int *p = va_arg(*p_va, int *); long ival; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = PyLong_AsLong(arg); if (ival == -1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else if (ival > INT_MAX) { PyErr_SetString(PyExc_OverflowError, - "signed integer is greater than maximum"); - return converterr("integer", arg, msgbuf, bufsize); + "signed integer is greater than maximum"); + RETURN_ERR_OCCURRED; } else if (ival < INT_MIN) { PyErr_SetString(PyExc_OverflowError, - "signed integer is less than minimum"); - return converterr("integer", arg, msgbuf, bufsize); + "signed integer is less than minimum"); + RETURN_ERR_OCCURRED; } else *p = ival; @@ -708,10 +721,10 @@ unsigned int *p = va_arg(*p_va, unsigned int *); unsigned int ival; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = (unsigned int)PyLong_AsUnsignedLongMask(arg); if 
(ival == (unsigned int)-1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = ival; break; @@ -723,14 +736,14 @@ Py_ssize_t *p = va_arg(*p_va, Py_ssize_t *); Py_ssize_t ival = -1; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; iobj = PyNumber_Index(arg); if (iobj != NULL) { ival = PyLong_AsSsize_t(iobj); Py_DECREF(iobj); } if (ival == -1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; *p = ival; break; } @@ -738,10 +751,10 @@ long *p = va_arg(*p_va, long *); long ival; if (float_argument_error(arg)) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = PyLong_AsLong(arg); if (ival == -1 && PyErr_Occurred()) - return converterr("integer", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = ival; break; @@ -763,10 +776,10 @@ PY_LONG_LONG *p = va_arg( *p_va, PY_LONG_LONG * ); PY_LONG_LONG ival; if (float_argument_error(arg)) - return converterr("long", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; ival = PyLong_AsLongLong(arg); if (ival == (PY_LONG_LONG)-1 && PyErr_Occurred()) - return converterr("long", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = ival; break; @@ -788,7 +801,7 @@ float *p = va_arg(*p_va, float *); double dval = PyFloat_AsDouble(arg); if (PyErr_Occurred()) - return converterr("float", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = (float) dval; break; @@ -798,7 +811,7 @@ double *p = va_arg(*p_va, double *); double dval = PyFloat_AsDouble(arg); if (PyErr_Occurred()) - return converterr("float", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = dval; break; @@ -809,7 +822,7 @@ Py_complex cval; cval = PyComplex_AsCComplex(arg); if (PyErr_Occurred()) - return converterr("complex", arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; else *p = cval; break; @@ -845,7 +858,7 @@ if (getbuffer(arg, (Py_buffer*)p, &buf) < 0) return converterr(buf, arg, msgbuf, bufsize); format++; - if (addcleanup(p, freelist, cleanup_buffer)) { + if (addcleanup(p, freelist, 1)) { return converterr( "(cleanup problem)", arg, msgbuf, bufsize); @@ -891,7 +904,7 @@ if (getbuffer(arg, p, &buf) < 0) return converterr(buf, arg, msgbuf, bufsize); } - if (addcleanup(p, freelist, cleanup_buffer)) { + if (addcleanup(p, freelist, 1)) { return converterr( "(cleanup problem)", arg, msgbuf, bufsize); @@ -953,9 +966,10 @@ case 'u': /* raw unicode buffer (Py_UNICODE *) */ case 'Z': /* raw unicode buffer or None */ { + Py_UNICODE **p = va_arg(*p_va, Py_UNICODE **); + if (*format == '#') { /* any buffer-like object */ /* "s#" or "Z#" */ - Py_UNICODE **p = va_arg(*p_va, Py_UNICODE **); FETCH_SIZE; if (c == 'Z' && arg == Py_None) { @@ -971,8 +985,6 @@ format++; } else { /* "s" or "Z" */ - Py_UNICODE **p = va_arg(*p_va, Py_UNICODE **); - if (c == 'Z' && arg == Py_None) *p = NULL; else if (PyUnicode_Check(arg)) { @@ -1095,11 +1107,9 @@ if (*buffer == NULL) { Py_DECREF(s); PyErr_NoMemory(); - return converterr( - "(memory error)", - arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; } - if (addcleanup(*buffer, freelist, cleanup_ptr)) { + if (addcleanup(*buffer, freelist, 0)) { Py_DECREF(s); return converterr( "(cleanup problem)", @@ -1139,10 +1149,9 @@ if (*buffer == NULL) { Py_DECREF(s); PyErr_NoMemory(); - return converterr("(memory error)", - arg, msgbuf, bufsize); + RETURN_ERR_OCCURRED; } - if (addcleanup(*buffer, freelist, cleanup_ptr)) { + if (addcleanup(*buffer, freelist, 0)) { Py_DECREF(s); return 
converterr("(cleanup problem)", arg, msgbuf, bufsize); @@ -1234,7 +1243,7 @@ PyBuffer_Release((Py_buffer*)p); return converterr("contiguous buffer", arg, msgbuf, bufsize); } - if (addcleanup(p, freelist, cleanup_buffer)) { + if (addcleanup(p, freelist, 1)) { return converterr( "(cleanup problem)", arg, msgbuf, bufsize); @@ -1249,6 +1258,11 @@ *p_format = format; return NULL; + +#undef FETCH_SIZE +#undef STORE_SIZE +#undef BUFFER_LEN +#undef RETURN_ERR_OCCURRED } static Py_ssize_t @@ -1394,7 +1408,7 @@ int PyArg_ValidateKeywordArguments(PyObject *kwargs) { - if (!PyDict_CheckExact(kwargs)) { + if (!PyDict_Check(kwargs)) { PyErr_BadInternalCall(); return 0; } Modified: python/branches/pep-3151/Python/getcopyright.c ============================================================================== --- python/branches/pep-3151/Python/getcopyright.c (original) +++ python/branches/pep-3151/Python/getcopyright.c Sat Feb 26 08:16:32 2011 @@ -4,7 +4,7 @@ static char cprt[] = "\ -Copyright (c) 2001-2010 Python Software Foundation.\n\ +Copyright (c) 2001-2011 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ Modified: python/branches/pep-3151/Python/import.c ============================================================================== --- python/branches/pep-3151/Python/import.c (original) +++ python/branches/pep-3151/Python/import.c Sat Feb 26 08:16:32 2011 @@ -5,13 +5,9 @@ #include "Python-ast.h" #undef Yield /* undefine macro conflicting with winbase.h */ -#include "pyarena.h" -#include "pythonrun.h" #include "errcode.h" #include "marshal.h" #include "code.h" -#include "compile.h" -#include "eval.h" #include "osdefs.h" #include "importdl.h" @@ -329,8 +325,17 @@ { if (import_lock != NULL) import_lock = PyThread_allocate_lock(); - import_lock_thread = -1; - import_lock_level = 0; + if (import_lock_level > 1) { + /* Forked as a side effect of import */ + long me = PyThread_get_thread_ident(); + PyThread_acquire_lock(import_lock, 0); + /* XXX: can the previous line fail? */ + import_lock_thread = me; + import_lock_level--; + } else { + import_lock_thread = -1; + import_lock_level = 0; + } } #endif @@ -631,7 +636,7 @@ } PyObject * -_PyImport_FindExtensionUnicode(char *name, PyObject *filename) +_PyImport_FindExtensionUnicode(const char *name, PyObject *filename) { PyObject *mod, *mdict; PyModuleDef* def; @@ -675,7 +680,7 @@ } PyObject * -_PyImport_FindBuiltin(char *name) +_PyImport_FindBuiltin(const char *name) { PyObject *res, *filename; filename = PyUnicode_FromString(name); @@ -801,7 +806,7 @@ PyErr_Clear(); /* Not important enough to report */ Py_DECREF(v); - v = PyEval_EvalCode((PyCodeObject *)co, d, d); + v = PyEval_EvalCode(co, d, d); if (v == NULL) goto error; Py_DECREF(v); @@ -1542,8 +1547,8 @@ pathname and an open file. Return NULL if the module is not found. 
*/ #ifdef MS_COREDLL -extern FILE *PyWin_FindRegisteredModule(const char *, struct filedescr **, - char *, Py_ssize_t); +extern FILE *_PyWin_FindRegisteredModule(const char *, struct filedescr **, + char *, Py_ssize_t); #endif static int case_ok(char *, Py_ssize_t, Py_ssize_t, char *); @@ -1626,7 +1631,7 @@ return &fd_builtin; } #ifdef MS_COREDLL - fp = PyWin_FindRegisteredModule(name, &fdp, buf, buflen); + fp = _PyWin_FindRegisteredModule(name, &fdp, buf, buflen); if (fp != NULL) { *p_fp = fp; return fdp; @@ -3201,14 +3206,14 @@ static PyObject * imp_find_module(PyObject *self, PyObject *args) { - char *name; + PyObject *name; PyObject *ret, *path = NULL; - if (!PyArg_ParseTuple(args, "es|O:find_module", - Py_FileSystemDefaultEncoding, &name, + if (!PyArg_ParseTuple(args, "O&|O:find_module", + PyUnicode_FSConverter, &name, &path)) return NULL; - ret = call_find_module(name, path); - PyMem_Free(name); + ret = call_find_module(PyBytes_AS_STRING(name), path); + Py_DECREF(name); return ret; } @@ -3326,23 +3331,23 @@ imp_load_compiled(PyObject *self, PyObject *args) { char *name; - char *pathname; + PyObject *pathname; PyObject *fob = NULL; PyObject *m; FILE *fp; - if (!PyArg_ParseTuple(args, "ses|O:load_compiled", + if (!PyArg_ParseTuple(args, "sO&|O:load_compiled", &name, - Py_FileSystemDefaultEncoding, &pathname, + PyUnicode_FSConverter, &pathname, &fob)) return NULL; - fp = get_file(pathname, fob, "rb"); + fp = get_file(PyBytes_AS_STRING(pathname), fob, "rb"); if (fp == NULL) { - PyMem_Free(pathname); + Py_DECREF(pathname); return NULL; } - m = load_compiled_module(name, pathname, fp); + m = load_compiled_module(name, PyBytes_AS_STRING(pathname), fp); fclose(fp); - PyMem_Free(pathname); + Py_DECREF(pathname); return m; } @@ -3381,22 +3386,22 @@ imp_load_source(PyObject *self, PyObject *args) { char *name; - char *pathname; + PyObject *pathname; PyObject *fob = NULL; PyObject *m; FILE *fp; - if (!PyArg_ParseTuple(args, "ses|O:load_source", + if (!PyArg_ParseTuple(args, "sO&|O:load_source", &name, - Py_FileSystemDefaultEncoding, &pathname, + PyUnicode_FSConverter, &pathname, &fob)) return NULL; - fp = get_file(pathname, fob, "r"); + fp = get_file(PyBytes_AS_STRING(pathname), fob, "r"); if (fp == NULL) { - PyMem_Free(pathname); + Py_DECREF(pathname); return NULL; } - m = load_source_module(name, pathname, fp); - PyMem_Free(pathname); + m = load_source_module(name, PyBytes_AS_STRING(pathname), fp); + Py_DECREF(pathname); fclose(fp); return m; } @@ -3450,13 +3455,13 @@ imp_load_package(PyObject *self, PyObject *args) { char *name; - char *pathname; + PyObject *pathname; PyObject * ret; - if (!PyArg_ParseTuple(args, "ses:load_package", - &name, Py_FileSystemDefaultEncoding, &pathname)) + if (!PyArg_ParseTuple(args, "sO&:load_package", + &name, PyUnicode_FSConverter, &pathname)) return NULL; - ret = load_package(name, pathname); - PyMem_Free(pathname); + ret = load_package(name, PyBytes_AS_STRING(pathname)); + Py_DECREF(pathname); return ret; } @@ -3529,21 +3534,23 @@ { static char *kwlist[] = {"path", NULL}; + PyObject *pathname_obj; char *pathname; char buf[MAXPATHLEN+1]; if (!PyArg_ParseTupleAndKeywords( - args, kws, "es", kwlist, - Py_FileSystemDefaultEncoding, &pathname)) + args, kws, "O&", kwlist, + PyUnicode_FSConverter, &pathname_obj)) return NULL; + pathname = PyBytes_AS_STRING(pathname_obj); if (make_source_pathname(pathname, buf) == NULL) { PyErr_Format(PyExc_ValueError, "Not a PEP 3147 pyc path: %s", pathname); - PyMem_Free(pathname); + Py_DECREF(pathname_obj); return NULL; } - 
PyMem_Free(pathname); + Py_DECREF(pathname_obj); return PyUnicode_FromString(buf); } Modified: python/branches/pep-3151/Python/importdl.c ============================================================================== --- python/branches/pep-3151/Python/importdl.c (original) +++ python/branches/pep-3151/Python/importdl.c Sat Feb 26 08:16:32 2011 @@ -12,8 +12,7 @@ #include "importdl.h" -extern dl_funcptr _PyImport_GetDynLoadFunc(const char *name, - const char *shortname, +extern dl_funcptr _PyImport_GetDynLoadFunc(const char *shortname, const char *pathname, FILE *fp); @@ -48,7 +47,7 @@ shortname = lastdot+1; } - p0 = _PyImport_GetDynLoadFunc(name, shortname, pathname, fp); + p0 = _PyImport_GetDynLoadFunc(shortname, pathname, fp); p = (PyObject*(*)(void))p0; if (PyErr_Occurred()) goto error; Modified: python/branches/pep-3151/Python/peephole.c ============================================================================== --- python/branches/pep-3151/Python/peephole.c (original) +++ python/branches/pep-3151/Python/peephole.c Sat Feb 26 08:16:32 2011 @@ -4,10 +4,8 @@ #include "Python-ast.h" #include "node.h" -#include "pyarena.h" #include "ast.h" #include "code.h" -#include "compile.h" #include "symtable.h" #include "opcode.h" Modified: python/branches/pep-3151/Python/pyarena.c ============================================================================== --- python/branches/pep-3151/Python/pyarena.c (original) +++ python/branches/pep-3151/Python/pyarena.c Sat Feb 26 08:16:32 2011 @@ -1,5 +1,4 @@ #include "Python.h" -#include "pyarena.h" /* A simple arena block structure. Modified: python/branches/pep-3151/Python/pystrtod.c ============================================================================== --- python/branches/pep-3151/Python/pystrtod.c (original) +++ python/branches/pep-3151/Python/pystrtod.c Sat Feb 26 08:16:32 2011 @@ -954,7 +954,7 @@ /* shouldn't get here: Gay's code should always return something starting with a digit, an 'I', or 'N' */ strncpy(p, "ERR", 3); - p += 3; + /* p += 3; */ assert(0); } goto exit; Modified: python/branches/pep-3151/Python/pythonrun.c ============================================================================== --- python/branches/pep-3151/Python/pythonrun.c (original) +++ python/branches/pep-3151/Python/pythonrun.c Sat Feb 26 08:16:32 2011 @@ -11,14 +11,10 @@ #include "parsetok.h" #include "errcode.h" #include "code.h" -#include "compile.h" #include "symtable.h" -#include "pyarena.h" #include "ast.h" -#include "eval.h" #include "marshal.h" #include "osdefs.h" -#include "abstract.h" #ifdef HAVE_SIGNAL_H #include @@ -82,6 +78,7 @@ int Py_DebugFlag; /* Needed by parser.c */ int Py_VerboseFlag; /* Needed by import.c */ +int Py_QuietFlag; /* Needed by sysmodule.c */ int Py_InteractiveFlag; /* Needed by Py_FdIsInteractive() below */ int Py_InspectFlag; /* Needed to determine whether to exit at SystemError */ int Py_NoSiteFlag; /* Suppress 'import site' */ @@ -781,6 +778,7 @@ { PyObject *buf = NULL, *stream = NULL, *text = NULL, *raw = NULL, *res; const char* mode; + const char* newline; PyObject *line_buffering; int buffering, isatty; @@ -831,9 +829,17 @@ Py_CLEAR(raw); Py_CLEAR(text); + newline = "\n"; +#ifdef MS_WINDOWS + if (!write_mode) { + /* translate \r\n to \n for sys.stdin on Windows */ + newline = NULL; + } +#endif + stream = PyObject_CallMethod(io, "TextIOWrapper", "OsssO", buf, encoding, errors, - "\n", line_buffering); + newline, line_buffering); Py_CLEAR(buf); if (stream == NULL) goto error; @@ -893,8 +899,10 @@ /* Set builtins.open */ if 
(PyObject_SetAttrString(bimod, "open", wrapper) == -1) { + Py_DECREF(wrapper); goto error; } + Py_DECREF(wrapper); encoding = Py_GETENV("PYTHONIOENCODING"); errors = NULL; @@ -1203,7 +1211,8 @@ { PyObject *m, *d, *v; const char *ext; - int set_file_name = 0, ret, len; + int set_file_name = 0, ret; + size_t len; m = PyImport_AddModule("__main__"); if (m == NULL) @@ -1757,7 +1766,7 @@ co = PyAST_Compile(mod, filename, flags, arena); if (co == NULL) return NULL; - v = PyEval_EvalCode(co, globals, locals); + v = PyEval_EvalCode((PyObject*)co, globals, locals); Py_DECREF(co); return v; } @@ -1787,7 +1796,7 @@ return NULL; } co = (PyCodeObject *)v; - v = PyEval_EvalCode(co, globals, locals); + v = PyEval_EvalCode((PyObject*)co, globals, locals); if (v && flags) flags->cf_flags |= (co->co_flags & PyCF_MASK); Py_DECREF(co); @@ -1795,8 +1804,8 @@ } PyObject * -Py_CompileStringFlags(const char *str, const char *filename, int start, - PyCompilerFlags *flags) +Py_CompileStringExFlags(const char *str, const char *filename, int start, + PyCompilerFlags *flags, int optimize) { PyCodeObject *co; mod_ty mod; @@ -1814,11 +1823,19 @@ PyArena_Free(arena); return result; } - co = PyAST_Compile(mod, filename, flags, arena); + co = PyAST_CompileEx(mod, filename, flags, optimize, arena); PyArena_Free(arena); return (PyObject *)co; } +/* For use in Py_LIMITED_API */ +#undef Py_CompileString +PyObject * +PyCompileString(const char *str, const char *filename, int start) +{ + return Py_CompileStringFlags(str, filename, start, NULL); +} + struct symtable * Py_SymtableString(const char *str, const char *filename, int start) { @@ -2444,7 +2461,15 @@ PyAPI_FUNC(PyObject *) Py_CompileString(const char *str, const char *p, int s) { - return Py_CompileStringFlags(str, p, s, NULL); + return Py_CompileStringExFlags(str, p, s, NULL, -1); +} + +#undef Py_CompileStringFlags +PyAPI_FUNC(PyObject *) +Py_CompileStringFlags(const char *str, const char *p, int s, + PyCompilerFlags *flags) +{ + return Py_CompileStringExFlags(str, p, s, flags, -1); } #undef PyRun_InteractiveOne Modified: python/branches/pep-3151/Python/sysmodule.c ============================================================================== --- python/branches/pep-3151/Python/sysmodule.c (original) +++ python/branches/pep-3151/Python/sysmodule.c Sat Feb 26 08:16:32 2011 @@ -15,10 +15,8 @@ */ #include "Python.h" -#include "structseq.h" #include "code.h" #include "frameobject.h" -#include "eval.h" #include "osdefs.h" @@ -67,6 +65,68 @@ return PyDict_SetItemString(sd, name, v); } +/* Write repr(o) to sys.stdout using sys.stdout.encoding and 'backslashreplace' + error handler. If sys.stdout has a buffer attribute, use + sys.stdout.buffer.write(encoded), otherwise redecode the string and use + sys.stdout.write(redecoded). + + Helper function for sys_displayhook(). 
*/ +static int +sys_displayhook_unencodable(PyObject *outf, PyObject *o) +{ + PyObject *stdout_encoding = NULL; + PyObject *encoded, *escaped_str, *repr_str, *buffer, *result; + char *stdout_encoding_str; + int ret; + + stdout_encoding = PyObject_GetAttrString(outf, "encoding"); + if (stdout_encoding == NULL) + goto error; + stdout_encoding_str = _PyUnicode_AsString(stdout_encoding); + if (stdout_encoding_str == NULL) + goto error; + + repr_str = PyObject_Repr(o); + if (repr_str == NULL) + goto error; + encoded = PyUnicode_AsEncodedString(repr_str, + stdout_encoding_str, + "backslashreplace"); + Py_DECREF(repr_str); + if (encoded == NULL) + goto error; + + buffer = PyObject_GetAttrString(outf, "buffer"); + if (buffer) { + result = PyObject_CallMethod(buffer, "write", "(O)", encoded); + Py_DECREF(buffer); + Py_DECREF(encoded); + if (result == NULL) + goto error; + Py_DECREF(result); + } + else { + PyErr_Clear(); + escaped_str = PyUnicode_FromEncodedObject(encoded, + stdout_encoding_str, + "strict"); + Py_DECREF(encoded); + if (PyFile_WriteObject(escaped_str, outf, Py_PRINT_RAW) != 0) { + Py_DECREF(escaped_str); + goto error; + } + Py_DECREF(escaped_str); + } + ret = 0; + goto finally; + +error: + ret = -1; +finally: + Py_XDECREF(stdout_encoding); + return ret; +} + static PyObject * sys_displayhook(PyObject *self, PyObject *o) { @@ -74,6 +134,7 @@ PyInterpreterState *interp = PyThreadState_GET()->interp; PyObject *modules = interp->modules; PyObject *builtins = PyDict_GetItemString(modules, "builtins"); + int err; if (builtins == NULL) { PyErr_SetString(PyExc_RuntimeError, "lost builtins module"); @@ -94,8 +155,19 @@ PyErr_SetString(PyExc_RuntimeError, "lost sys.stdout"); return NULL; } - if (PyFile_WriteObject(o, outf, 0) != 0) - return NULL; + if (PyFile_WriteObject(o, outf, 0) != 0) { + if (PyErr_ExceptionMatches(PyExc_UnicodeEncodeError)) { + /* repr(o) is not encodable to sys.stdout.encoding with + * sys.stdout.errors error handler (which is probably 'strict') */ + PyErr_Clear(); + err = sys_displayhook_unencodable(outf, o); + if (err) + return NULL; + } + else { + return NULL; + } + } if (PyFile_WriteString("\n", outf) != 0) return NULL; if (PyObject_SetAttrString(builtins, "_", o) != 0) @@ -1104,7 +1176,6 @@ PyObject *opts; PyObject *name = NULL, *value = NULL; const wchar_t *name_end; - int r; opts = get_xoptions(); if (opts == NULL) @@ -1122,7 +1193,7 @@ } if (name == NULL || value == NULL) goto error; - r = PyDict_SetItem(opts, name, value); + PyDict_SetItem(opts, name, value); Py_DECREF(name); Py_DECREF(value); return; @@ -1345,7 +1416,8 @@ #endif /* {"unbuffered", "-u"}, */ /* {"skip_first", "-x"}, */ - {"bytes_warning", "-b"}, + {"bytes_warning", "-b"}, + {"quiet", "-q"}, {0} }; @@ -1354,9 +1426,9 @@ flags__doc__, /* doc */ flags_fields, /* fields */ #ifdef RISCOS - 12 + 13 #else - 11 + 12 #endif }; @@ -1389,6 +1461,7 @@ /* SetFlag(saw_unbuffered_flag); */ /* SetFlag(skipfirstline); */ SetFlag(Py_BytesWarningFlag); + SetFlag(Py_QuietFlag); #undef SetFlag if (PyErr_Occurred()) { Modified: python/branches/pep-3151/Python/thread_nt.h ============================================================================== --- python/branches/pep-3151/Python/thread_nt.h (original) +++ python/branches/pep-3151/Python/thread_nt.h Sat Feb 26 08:16:32 2011 @@ -238,10 +238,13 @@ * and 0 if the lock was not acquired. This means a 0 is returned * if the lock has already been acquired by this thread! 
*/ -int -PyThread_acquire_lock_timed(PyThread_type_lock aLock, PY_TIMEOUT_T microseconds) +PyLockStatus +PyThread_acquire_lock_timed(PyThread_type_lock aLock, + PY_TIMEOUT_T microseconds, int intr_flag) { - int success ; + /* Fow now, intr_flag does nothing on Windows, and lock acquires are + * uninterruptible. */ + PyLockStatus success; PY_TIMEOUT_T milliseconds; if (microseconds >= 0) { @@ -258,7 +261,13 @@ dprintf(("%ld: PyThread_acquire_lock_timed(%p, %lld) called\n", PyThread_get_thread_ident(), aLock, microseconds)); - success = aLock && EnterNonRecursiveMutex((PNRMUTEX) aLock, (DWORD) milliseconds) == WAIT_OBJECT_0 ; + if (aLock && EnterNonRecursiveMutex((PNRMUTEX)aLock, + (DWORD)milliseconds) == WAIT_OBJECT_0) { + success = PY_LOCK_ACQUIRED; + } + else { + success = PY_LOCK_FAILURE; + } dprintf(("%ld: PyThread_acquire_lock(%p, %lld) -> %d\n", PyThread_get_thread_ident(), aLock, microseconds, success)); @@ -268,7 +277,7 @@ int PyThread_acquire_lock(PyThread_type_lock aLock, int waitflag) { - return PyThread_acquire_lock_timed(aLock, waitflag ? -1 : 0); + return PyThread_acquire_lock_timed(aLock, waitflag ? -1 : 0, 0); } void Modified: python/branches/pep-3151/Python/thread_pthread.h ============================================================================== --- python/branches/pep-3151/Python/thread_pthread.h (original) +++ python/branches/pep-3151/Python/thread_pthread.h Sat Feb 26 08:16:32 2011 @@ -316,16 +316,17 @@ return (status == -1) ? errno : status; } -int -PyThread_acquire_lock_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds) +PyLockStatus +PyThread_acquire_lock_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds, + int intr_flag) { - int success; + PyLockStatus success; sem_t *thelock = (sem_t *)lock; int status, error = 0; struct timespec ts; - dprintf(("PyThread_acquire_lock_timed(%p, %lld) called\n", - lock, microseconds)); + dprintf(("PyThread_acquire_lock_timed(%p, %lld, %d) called\n", + lock, microseconds, intr_flag)); if (microseconds > 0) MICROSECONDS_TO_TIMESPEC(microseconds, ts); @@ -336,33 +337,38 @@ status = fix_status(sem_trywait(thelock)); else status = fix_status(sem_wait(thelock)); - } while (status == EINTR); /* Retry if interrupted by a signal */ - - if (microseconds > 0) { - if (status != ETIMEDOUT) - CHECK_STATUS("sem_timedwait"); - } - else if (microseconds == 0) { - if (status != EAGAIN) - CHECK_STATUS("sem_trywait"); - } - else { - CHECK_STATUS("sem_wait"); + /* Retry if interrupted by a signal, unless the caller wants to be + notified. */ + } while (!intr_flag && status == EINTR); + + /* Don't check the status if we're stopping because of an interrupt. */ + if (!(intr_flag && status == EINTR)) { + if (microseconds > 0) { + if (status != ETIMEDOUT) + CHECK_STATUS("sem_timedwait"); + } + else if (microseconds == 0) { + if (status != EAGAIN) + CHECK_STATUS("sem_trywait"); + } + else { + CHECK_STATUS("sem_wait"); + } } - success = (status == 0) ? 1 : 0; + if (status == 0) { + success = PY_LOCK_ACQUIRED; + } else if (intr_flag && status == EINTR) { + success = PY_LOCK_INTR; + } else { + success = PY_LOCK_FAILURE; + } - dprintf(("PyThread_acquire_lock_timed(%p, %lld) -> %d\n", - lock, microseconds, success)); + dprintf(("PyThread_acquire_lock_timed(%p, %lld, %d) -> %d\n", + lock, microseconds, intr_flag, success)); return success; } -int -PyThread_acquire_lock(PyThread_type_lock lock, int waitflag) -{ - return PyThread_acquire_lock_timed(lock, waitflag ? 
-1 : 0); -} - void PyThread_release_lock(PyThread_type_lock lock) { @@ -436,21 +442,25 @@ free((void *)thelock); } -int -PyThread_acquire_lock_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds) +PyLockStatus +PyThread_acquire_lock_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds, + int intr_flag) { - int success; + PyLockStatus success; pthread_lock *thelock = (pthread_lock *)lock; int status, error = 0; - dprintf(("PyThread_acquire_lock_timed(%p, %lld) called\n", - lock, microseconds)); + dprintf(("PyThread_acquire_lock_timed(%p, %lld, %d) called\n", + lock, microseconds, intr_flag)); status = pthread_mutex_lock( &thelock->mut ); CHECK_STATUS("pthread_mutex_lock[1]"); - success = thelock->locked == 0; - if (!success && microseconds != 0) { + if (thelock->locked == 0) { + success = PY_LOCK_ACQUIRED; + } else if (microseconds == 0) { + success = PY_LOCK_FAILURE; + } else { struct timespec ts; if (microseconds > 0) MICROSECONDS_TO_TIMESPEC(microseconds, ts); @@ -458,7 +468,8 @@ /* mut must be locked by me -- part of the condition * protocol */ - while (thelock->locked) { + success = PY_LOCK_FAILURE; + while (success == PY_LOCK_FAILURE) { if (microseconds > 0) { status = pthread_cond_timedwait( &thelock->lock_released, @@ -473,25 +484,30 @@ &thelock->mut); CHECK_STATUS("pthread_cond_wait"); } + + if (intr_flag && status == 0 && thelock->locked) { + /* We were woken up, but didn't get the lock. We probably received + * a signal. Return PY_LOCK_INTR to allow the caller to handle + * it and retry. */ + success = PY_LOCK_INTR; + break; + } else if (status == 0 && !thelock->locked) { + success = PY_LOCK_ACQUIRED; + } else { + success = PY_LOCK_FAILURE; + } } - success = (status == 0); } - if (success) thelock->locked = 1; + if (success == PY_LOCK_ACQUIRED) thelock->locked = 1; status = pthread_mutex_unlock( &thelock->mut ); CHECK_STATUS("pthread_mutex_unlock[1]"); - if (error) success = 0; - dprintf(("PyThread_acquire_lock_timed(%p, %lld) -> %d\n", - lock, microseconds, success)); + if (error) success = PY_LOCK_FAILURE; + dprintf(("PyThread_acquire_lock_timed(%p, %lld, %d) -> %d\n", + lock, microseconds, intr_flag, success)); return success; } -int -PyThread_acquire_lock(PyThread_type_lock lock, int waitflag) -{ - return PyThread_acquire_lock_timed(lock, waitflag ? -1 : 0); -} - void PyThread_release_lock(PyThread_type_lock lock) { @@ -515,6 +531,12 @@ #endif /* USE_SEMAPHORES */ +int +PyThread_acquire_lock(PyThread_type_lock lock, int waitflag) +{ + return PyThread_acquire_lock_timed(lock, waitflag ? -1 : 0, /*intr_flag=*/0); +} + /* set the thread stack size. * Return 0 if size is valid, -1 if size is invalid, * -2 if setting stack size is not supported. 
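The thread_nt.h and thread_pthread.h hunks above change PyThread_acquire_lock_timed() to take a third intr_flag argument and to return a PyLockStatus (PY_LOCK_ACQUIRED, PY_LOCK_FAILURE, or PY_LOCK_INTR) instead of a plain int. A minimal caller sketch, assuming only the signature and enum values visible in the patch; the helper name acquire_interruptible() is hypothetical, and for brevity the remaining timeout is not recomputed when the wait is retried:

    /* Sketch only: acquire a lock while still reacting to signals.
       Assumes the three-argument PyThread_acquire_lock_timed() and the
       PyLockStatus values shown above; acquire_interruptible() is a
       hypothetical helper, not part of the patch. */
    #include "Python.h"
    #include "pythread.h"

    static PyLockStatus
    acquire_interruptible(PyThread_type_lock lock, PY_TIMEOUT_T microseconds)
    {
        PyLockStatus r;
        do {
            Py_BEGIN_ALLOW_THREADS
            /* intr_flag=1: report PY_LOCK_INTR instead of silently
               retrying when the wait is interrupted by a signal. */
            r = PyThread_acquire_lock_timed(lock, microseconds, 1);
            Py_END_ALLOW_THREADS
            if (r == PY_LOCK_INTR && PyErr_CheckSignals() < 0)
                return PY_LOCK_FAILURE;  /* a signal handler raised */
        } while (r == PY_LOCK_INTR);     /* otherwise retry the acquire */
        return r;
    }
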
Modified: python/branches/pep-3151/Python/traceback.c ============================================================================== --- python/branches/pep-3151/Python/traceback.c (original) +++ python/branches/pep-3151/Python/traceback.c Sat Feb 26 08:16:32 2011 @@ -7,7 +7,6 @@ #include "frameobject.h" #include "structmember.h" #include "osdefs.h" -#include "traceback.h" #ifdef HAVE_FCNTL_H #include #endif Modified: python/branches/pep-3151/README ============================================================================== --- python/branches/pep-3151/README (original) +++ python/branches/pep-3151/README Sat Feb 26 08:16:32 2011 @@ -1,9 +1,8 @@ -This is Python version 3.2 alpha 3 +This is Python version 3.3 alpha 0 ================================== -Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010 -Python Software Foundation. -All rights reserved. +Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 +Python Software Foundation. All rights reserved. Python 3.x is a new version of the language, which is incompatible with the 2.x line of releases. The language is mostly the same, but many details, especially @@ -51,9 +50,9 @@ ---------- We try to have a comprehensive overview of the changes in the "What's New in -Python 3.2" document, found at +Python 3.3" document, found at - http://docs.python.org/3.2/whatsnew/3.2.html + http://docs.python.org/3.3/whatsnew/3.3.html For a more detailed change log, read Misc/NEWS (though this file, too, is incomplete, and also doesn't list anything merged in from the 2.7 release under @@ -66,9 +65,9 @@ Documentation ------------- -Documentation for Python 3.2 is online, updated daily: +Documentation for Python 3.3 is online, updated daily: - http://docs.python.org/3.2/ + http://docs.python.org/3.3/ It can also be downloaded in many formats for faster access. The documentation is downloadable in HTML, PDF, and reStructuredText formats; the latter version @@ -86,7 +85,7 @@ A source-to-source translation tool, "2to3", can take care of the mundane task of converting large amounts of source code. It is not a complete solution but is complemented by the deprecation warnings in 2.6. See -http://docs.python.org/3.2/library/2to3.html for more information. +http://docs.python.org/3.3/library/2to3.html for more information. Testing @@ -97,10 +96,7 @@ left by the previous test run). The test set produces some output. You can generally ignore the messages about skipped tests due to optional features which can't be imported. If a message is printed about a failed test or a traceback -or core dump is produced, something is wrong. On some Linux systems (those that -are not yet using glibc 6), test_strftime fails due to a non-standard -implementation of strftime() in the C library. Please ignore this, or upgrade to -glibc version 6. +or core dump is produced, something is wrong. By default, tests are prevented from overusing resources like disk space and memory. To enable these tests, run "make testall". @@ -109,7 +105,7 @@ include the output of "make test". It is useless. Run the failing test manually, as follows: - ./python Lib/test/regrtest.py -v test_whatever + ./python -m test -v test_whatever (substituting the top of the source tree for '.' if you built in a different directory). This runs the test in verbose mode. @@ -129,8 +125,8 @@ Install that version using "make install". Install all other versions using "make altinstall". 
-For example, if you want to install Python 2.5, 2.6 and 3.2 with 2.6 being the -primary version, you would execute "make install" in your 2.6 build directory +For example, if you want to install Python 2.6, 2.7 and 3.3 with 2.7 being the +primary version, you would execute "make install" in your 2.7 build directory and "make altinstall" in the others. @@ -156,8 +152,8 @@ ------------------------- If you have a proposal to change Python, you may want to send an email to the -comp.lang.python or python-ideas mailing lists for inital feedback. A Python -Enhancement Proposal (PEP) may be submitted if your idea gains ground. All +comp.lang.python or python-ideas mailing lists for inital feedback. A Python +Enhancement Proposal (PEP) may be submitted if your idea gains ground. All current PEPs, as well as guidelines for submitting a new PEP, are listed at http://www.python.org/dev/peps/. @@ -165,32 +161,27 @@ Release Schedule ---------------- -See PEP 392 for release details: http://www.python.org/dev/peps/pep-0392/ +See PEP XXX for release details: http://www.python.org/dev/peps/pep-0XXX/ Copyright and License Information --------------------------------- -Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010 -Python Software Foundation. -All rights reserved. +Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 +Python Software Foundation. All rights reserved. -Copyright (c) 2000 BeOpen.com. -All rights reserved. +Copyright (c) 2000 BeOpen.com. All rights reserved. -Copyright (c) 1995-2001 Corporation for National Research Initiatives. -All rights reserved. +Copyright (c) 1995-2001 Corporation for National Research Initiatives. All +rights reserved. -Copyright (c) 1991-1995 Stichting Mathematisch Centrum. -All rights reserved. +Copyright (c) 1991-1995 Stichting Mathematisch Centrum. All rights reserved. -See the file "LICENSE" for information on the history of this -software, terms & conditions for usage, and a DISCLAIMER OF ALL -WARRANTIES. +See the file "LICENSE" for information on the history of this software, terms & +conditions for usage, and a DISCLAIMER OF ALL WARRANTIES. -This Python distribution contains *no* GNU General Public License -(GPL) code, so it may be used in proprietary projects. There are -interfaces to some GNU code but these are entirely optional. +This Python distribution contains *no* GNU General Public License (GPL) code, so +it may be used in proprietary projects. There are interfaces to some GNU code +but these are entirely optional. -All trademarks referenced herein are property of their respective -holders. +All trademarks referenced herein are property of their respective holders. Modified: python/branches/pep-3151/Tools/README ============================================================================== --- python/branches/pep-3151/Tools/README (original) +++ python/branches/pep-3151/Tools/README Sat Feb 26 08:16:32 2011 @@ -5,7 +5,7 @@ ccbench A Python concurrency benchmark. -framer Generate boilerplate code for C extension types. +demo Several Python programming demos. freeze Create a stand-alone executable from a Python program. @@ -21,6 +21,8 @@ msi Support for packaging Python as an MSI package on Windows. +parser Un-parsing tool to generate code from an AST. + pybench Comprehensive Python benchmarking suite. pynche A Tkinter-based color editor. @@ -30,10 +32,14 @@ tabs and spaces, and 2to3, which converts Python 2 code to Python 3 code. -ssl Currently, a tool to fetch server certificates. 
+test2to3 A demonstration of how to use 2to3 transparently in setup.py. + +unicode Tools for generating unicodedata and codecs from unicode.org + and other mapping files (by Fredrik Lundh, Marc-Andre Lemburg + and Martin von Loewis). -unicode Tools used to generate unicode database files for - Python 2.0 (by Fredrik Lundh). +unittestgui A Tkinter based GUI test runner for unittest, with test + discovery. world Script to take a list of Internet addresses and print out where in the world those addresses originate from, Modified: python/branches/pep-3151/Tools/buildbot/build-amd64.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/build-amd64.bat (original) +++ python/branches/pep-3151/Tools/buildbot/build-amd64.bat Sat Feb 26 08:16:32 2011 @@ -1,6 +1,6 @@ - at rem Used by the buildbot "compile" step. -cmd /c Tools\buildbot\external-amd64.bat -call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 -cmd /c Tools\buildbot\clean-amd64.bat -vcbuild /useenv PCbuild\kill_python.vcproj "Debug|x64" && PCbuild\amd64\kill_python_d.exe -vcbuild PCbuild\pcbuild.sln "Debug|x64" + at rem Used by the buildbot "compile" step. +cmd /c Tools\buildbot\external-amd64.bat +call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 +cmd /c Tools\buildbot\clean-amd64.bat +vcbuild /useenv PCbuild\kill_python.vcproj "Debug|x64" && PCbuild\amd64\kill_python_d.exe +vcbuild PCbuild\pcbuild.sln "Debug|x64" Modified: python/branches/pep-3151/Tools/buildbot/build.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/build.bat (original) +++ python/branches/pep-3151/Tools/buildbot/build.bat Sat Feb 26 08:16:32 2011 @@ -1,7 +1,7 @@ - at rem Used by the buildbot "compile" step. -cmd /c Tools\buildbot\external.bat -call "%VS90COMNTOOLS%vsvars32.bat" -cmd /c Tools\buildbot\clean.bat -vcbuild /useenv PCbuild\kill_python.vcproj "Debug|Win32" && PCbuild\kill_python_d.exe -vcbuild /useenv PCbuild\pcbuild.sln "Debug|Win32" - + at rem Used by the buildbot "compile" step. +cmd /c Tools\buildbot\external.bat +call "%VS90COMNTOOLS%vsvars32.bat" +cmd /c Tools\buildbot\clean.bat +vcbuild /useenv PCbuild\kill_python.vcproj "Debug|Win32" && PCbuild\kill_python_d.exe +vcbuild /useenv PCbuild\pcbuild.sln "Debug|Win32" + Modified: python/branches/pep-3151/Tools/buildbot/buildmsi.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/buildmsi.bat (original) +++ python/branches/pep-3151/Tools/buildbot/buildmsi.bat Sat Feb 26 08:16:32 2011 @@ -1,21 +1,21 @@ - at rem Used by the buildbot "buildmsi" step. - -cmd /c Tools\buildbot\external.bat - at rem build release versions of things -call "%VS90COMNTOOLS%vsvars32.bat" - - at rem build Python -vcbuild /useenv PCbuild\pcbuild.sln "Release|Win32" - - at rem build the documentation -bash.exe -c 'cd Doc;make PYTHON=python2.5 update htmlhelp' -"%ProgramFiles%\HTML Help Workshop\hhc.exe" Doc\build\htmlhelp\python26a3.hhp - - at rem build the MSI file -cd PC -nmake /f icons.mak -cd ..\Tools\msi -del *.msi -nmake /f msisupport.mak -%HOST_PYTHON% msi.py - + at rem Used by the buildbot "buildmsi" step. 
+ +cmd /c Tools\buildbot\external.bat + at rem build release versions of things +call "%VS90COMNTOOLS%vsvars32.bat" + + at rem build Python +vcbuild /useenv PCbuild\pcbuild.sln "Release|Win32" + + at rem build the documentation +bash.exe -c 'cd Doc;make PYTHON=python2.5 update htmlhelp' +"%ProgramFiles%\HTML Help Workshop\hhc.exe" Doc\build\htmlhelp\python26a3.hhp + + at rem build the MSI file +cd PC +nmake /f icons.mak +cd ..\Tools\msi +del *.msi +nmake /f msisupport.mak +%HOST_PYTHON% msi.py + Modified: python/branches/pep-3151/Tools/buildbot/clean-amd64.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/clean-amd64.bat (original) +++ python/branches/pep-3151/Tools/buildbot/clean-amd64.bat Sat Feb 26 08:16:32 2011 @@ -1,7 +1,7 @@ - at rem Used by the buildbot "clean" step. -call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 -cd PCbuild - at echo Deleting .pyc/.pyo files ... -del /s Lib\*.pyc Lib\*.pyo -vcbuild /clean pcbuild.sln "Release|x64" -vcbuild /clean pcbuild.sln "Debug|x64" + at rem Used by the buildbot "clean" step. +call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 +cd PCbuild + at echo Deleting .pyc/.pyo files ... +del /s Lib\*.pyc Lib\*.pyo +vcbuild /clean pcbuild.sln "Release|x64" +vcbuild /clean pcbuild.sln "Debug|x64" Modified: python/branches/pep-3151/Tools/buildbot/clean.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/clean.bat (original) +++ python/branches/pep-3151/Tools/buildbot/clean.bat Sat Feb 26 08:16:32 2011 @@ -1,7 +1,7 @@ - at rem Used by the buildbot "clean" step. -call "%VS90COMNTOOLS%vsvars32.bat" - at echo Deleting .pyc/.pyo files ... -del /s Lib\*.pyc Lib\*.pyo -cd PCbuild -vcbuild /clean pcbuild.sln "Release|Win32" -vcbuild /clean pcbuild.sln "Debug|Win32" + at rem Used by the buildbot "clean" step. +call "%VS90COMNTOOLS%vsvars32.bat" + at echo Deleting .pyc/.pyo files ... +del /s Lib\*.pyc Lib\*.pyo +cd PCbuild +vcbuild /clean pcbuild.sln "Release|Win32" +vcbuild /clean pcbuild.sln "Debug|Win32" Modified: python/branches/pep-3151/Tools/buildbot/external-amd64.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/external-amd64.bat (original) +++ python/branches/pep-3151/Tools/buildbot/external-amd64.bat Sat Feb 26 08:16:32 2011 @@ -1,21 +1,21 @@ - at rem Fetches (and builds if necessary) external dependencies - - at rem Assume we start inside the Python source directory -call "Tools\buildbot\external-common.bat" -call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 - -if not exist tcltk64\bin\tcl85g.dll ( - cd tcl-8.5.2.1\win - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 install - cd ..\.. -) - -if not exist tcltk64\bin\tk85g.dll ( - cd tk-8.5.2.0\win - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.2.1 clean - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.2.1 all - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.2.1 install - cd ..\.. 
-) - + at rem Fetches (and builds if necessary) external dependencies + + at rem Assume we start inside the Python source directory +call "Tools\buildbot\external-common.bat" +call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 + +if not exist tcltk64\bin\tcl85g.dll ( + cd tcl-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 install + cd ..\.. +) + +if not exist tcltk64\bin\tk85g.dll ( + cd tk-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 clean + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 all + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 install + cd ..\.. +) + Modified: python/branches/pep-3151/Tools/buildbot/external-common.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/external-common.bat (original) +++ python/branches/pep-3151/Tools/buildbot/external-common.bat Sat Feb 26 08:16:32 2011 @@ -1,43 +1,43 @@ - at rem Common file shared between external.bat and external-amd64.bat. Responsible for - at rem fetching external components into the root\.. buildbot directories. - -cd .. - at rem XXX: If you need to force the buildbots to start from a fresh environment, uncomment - at rem the following, check it in, then check it out, comment it out, then check it back in. - at rem if exist bzip2-1.0.5 rd /s/q bzip2-1.0.5 - at rem if exist tcltk rd /s/q tcltk - at rem if exist tcltk64 rd /s/q tcltk64 - at rem if exist tcl8.4.12 rd /s/q tcl8.4.12 - at rem if exist tcl8.4.16 rd /s/q tcl8.4.16 - at rem if exist tcl-8.4.18.1 rd /s/q tcl-8.4.18.1 - at rem if exist tk8.4.12 rd /s/q tk8.4.12 - at rem if exist tk8.4.16 rd /s/q tk8.4.16 - at rem if exist tk-8.4.18.1 rd /s/q tk-8.4.18.1 - at rem if exist db-4.4.20 rd /s/q db-4.4.20 - at rem if exist openssl-1.0.0a rd /s/q openssl-1.0.0a - at rem if exist sqlite-3.6.21 rd /s/q sqlite-3.6.21 - - at rem bzip -if not exist bzip2-1.0.5 ( - rd /s/q bzip2-1.0.3 - svn export http://svn.python.org/projects/external/bzip2-1.0.5 -) - - at rem Sleepycat db -if not exist db-4.4.20 svn export http://svn.python.org/projects/external/db-4.4.20-vs9 db-4.4.20 - - at rem OpenSSL -if not exist openssl-1.0.0a svn export http://svn.python.org/projects/external/openssl-1.0.0a - - at rem tcl/tk -if not exist tcl-8.5.2.1 ( - rd /s/q tcltk tcltk64 - svn export http://svn.python.org/projects/external/tcl-8.5.2.1 -) -if not exist tk-8.5.2.0 svn export http://svn.python.org/projects/external/tk-8.5.2.0 - - at rem sqlite3 -if not exist sqlite-3.6.21 ( - rd /s/q sqlite-source-3.3.4 - svn export http://svn.python.org/projects/external/sqlite-3.6.21 -) + at rem Common file shared between external.bat and external-amd64.bat. Responsible for + at rem fetching external components into the root\.. buildbot directories. + +cd .. + at rem XXX: If you need to force the buildbots to start from a fresh environment, uncomment + at rem the following, check it in, then check it out, comment it out, then check it back in. 
+ at rem if exist bzip2-1.0.5 rd /s/q bzip2-1.0.5 + at rem if exist tcltk rd /s/q tcltk + at rem if exist tcltk64 rd /s/q tcltk64 + at rem if exist tcl8.4.12 rd /s/q tcl8.4.12 + at rem if exist tcl8.4.16 rd /s/q tcl8.4.16 + at rem if exist tcl-8.4.18.1 rd /s/q tcl-8.4.18.1 + at rem if exist tk8.4.12 rd /s/q tk8.4.12 + at rem if exist tk8.4.16 rd /s/q tk8.4.16 + at rem if exist tk-8.4.18.1 rd /s/q tk-8.4.18.1 + at rem if exist db-4.4.20 rd /s/q db-4.4.20 + at rem if exist openssl-1.0.0a rd /s/q openssl-1.0.0a + at rem if exist sqlite-3.7.4 rd /s/q sqlite-3.7.4 + + at rem bzip +if not exist bzip2-1.0.5 ( + rd /s/q bzip2-1.0.3 + svn export http://svn.python.org/projects/external/bzip2-1.0.5 +) + + at rem Sleepycat db +if not exist db-4.4.20 svn export http://svn.python.org/projects/external/db-4.4.20-vs9 db-4.4.20 + + at rem OpenSSL +if not exist openssl-1.0.0a svn export http://svn.python.org/projects/external/openssl-1.0.0a + + at rem tcl/tk +if not exist tcl-8.5.9.0 ( + rd /s/q tcltk tcltk64 + svn export http://svn.python.org/projects/external/tcl-8.5.9.0 +) +if not exist tk-8.5.9.0 svn export http://svn.python.org/projects/external/tk-8.5.9.0 + + at rem sqlite3 +if not exist sqlite-3.7.4 ( + rd /s/q sqlite-source-3.6.21 + svn export http://svn.python.org/projects/external/sqlite-3.7.4 +) Modified: python/branches/pep-3151/Tools/buildbot/external.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/external.bat (original) +++ python/branches/pep-3151/Tools/buildbot/external.bat Sat Feb 26 08:16:32 2011 @@ -1,21 +1,21 @@ - at rem Fetches (and builds if necessary) external dependencies - - at rem Assume we start inside the Python source directory -call "Tools\buildbot\external-common.bat" -call "%VS90COMNTOOLS%\vsvars32.bat" - -if not exist tcltk\bin\tcl85g.dll ( - @rem all and install need to be separate invocations, otherwise nmakehlp is not found on install - cd tcl-8.5.2.1\win - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 INSTALLDIR=..\..\tcltk clean all - nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk install - cd ..\.. -) - -if not exist tcltk\bin\tk85g.dll ( - cd tk-8.5.2.0\win - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.2.1 clean - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.2.1 all - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.2.1 install - cd ..\.. -) + at rem Fetches (and builds if necessary) external dependencies + + at rem Assume we start inside the Python source directory +call "Tools\buildbot\external-common.bat" +call "%VS90COMNTOOLS%\vsvars32.bat" + +if not exist tcltk\bin\tcl85g.dll ( + @rem all and install need to be separate invocations, otherwise nmakehlp is not found on install + cd tcl-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 INSTALLDIR=..\..\tcltk clean all + nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk install + cd ..\.. 
+) + +if not exist tcltk\bin\tk85g.dll ( + cd tk-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 clean + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 all + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 install + cd ..\.. +) Modified: python/branches/pep-3151/Tools/buildbot/test-amd64.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/test-amd64.bat (original) +++ python/branches/pep-3151/Tools/buildbot/test-amd64.bat Sat Feb 26 08:16:32 2011 @@ -1,3 +1,3 @@ - at rem Used by the buildbot "test" step. -cd PCbuild -call rt.bat -q -d -x64 -uall -rw + at rem Used by the buildbot "test" step. +cd PCbuild +call rt.bat -q -d -x64 -uall -rw Modified: python/branches/pep-3151/Tools/buildbot/test.bat ============================================================================== --- python/branches/pep-3151/Tools/buildbot/test.bat (original) +++ python/branches/pep-3151/Tools/buildbot/test.bat Sat Feb 26 08:16:32 2011 @@ -1,4 +1,4 @@ - at rem Used by the buildbot "test" step. -cd PCbuild -call rt.bat -d -q -uall -rw -n - + at rem Used by the buildbot "test" step. +cd PCbuild +call rt.bat -d -q -uall -rwW -n + Modified: python/branches/pep-3151/Tools/ccbench/ccbench.py ============================================================================== --- python/branches/pep-3151/Tools/ccbench/ccbench.py (original) +++ python/branches/pep-3151/Tools/ccbench/ccbench.py Sat Feb 26 08:16:32 2011 @@ -276,19 +276,19 @@ return sock.recv(n).decode('ascii') def latency_client(addr, nb_pings, interval): - sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) - _time = time.time - _sleep = time.sleep - def _ping(): - _sendto(sock, "%r\n" % _time(), addr) - # The first ping signals the parent process that we are ready. - _ping() - # We give the parent a bit of time to notice. - _sleep(1.0) - for i in range(nb_pings): - _sleep(interval) + with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock: + _time = time.time + _sleep = time.sleep + def _ping(): + _sendto(sock, "%r\n" % _time(), addr) + # The first ping signals the parent process that we are ready. _ping() - _sendto(sock, LAT_END + "\n", addr) + # We give the parent a bit of time to notice. 
+ _sleep(1.0) + for i in range(nb_pings): + _sleep(interval) + _ping() + _sendto(sock, LAT_END + "\n", addr) def run_latency_client(**kwargs): cmd_line = [sys.executable, '-E', os.path.abspath(__file__)] @@ -363,6 +363,7 @@ for t in threads: t.join() process.wait() + sock.close() for recv_time, chunk in chunks: # NOTE: it is assumed that a line sent by a client wasn't received Modified: python/branches/pep-3151/Tools/msi/msi.py ============================================================================== --- python/branches/pep-3151/Tools/msi/msi.py (original) +++ python/branches/pep-3151/Tools/msi/msi.py Sat Feb 26 08:16:32 2011 @@ -1025,6 +1025,7 @@ lib.glob("*.pem") lib.glob("*.pck") lib.glob("cfgparser.*") + lib.add_file("zip_cp437_header.zip") lib.add_file("zipdir.zip") if dir=='capath': lib.glob("*.0") @@ -1054,6 +1055,10 @@ if dir=="macholib": lib.add_file("README.ctypes") lib.glob("fetch_macholib*") + if dir=='turtledemo': + lib.add_file("turtle.cfg") + if dir=="pydoc_data": + lib.add_file("_pydoc.css") if dir=="data" and parent.physical=="test" and parent.basedir.physical=="email": # This should contain all non-.svn files listed in subversion for f in os.listdir(lib.absolute): @@ -1082,6 +1087,7 @@ continue dlls.append(f) lib.add_file(f) + lib.add_file('python3.dll') # Add sqlite if msilib.msi_type=="Intel64;1033": sqlite_arch = "/ia64" @@ -1118,6 +1124,7 @@ for f in dlls: lib.add_file(f.replace('pyd','lib')) lib.add_file('python%s%s.lib' % (major, minor)) + lib.add_file('python3.lib') # Add the mingw-format library if have_mingw: lib.add_file('libpython%s%s.a' % (major, minor)) Modified: python/branches/pep-3151/Tools/msi/uuids.py ============================================================================== --- python/branches/pep-3151/Tools/msi/uuids.py (original) +++ python/branches/pep-3151/Tools/msi/uuids.py Sat Feb 26 08:16:32 2011 @@ -80,9 +80,11 @@ '3.2.101' :'{b411f168-7a36-4fff-902c-a554d1c78a4f}', # 3.2a1 '3.2.102' :'{79ff73b7-8359-410f-b9c5-152d2026f8c8}', # 3.2a2 '3.2.103' :'{e7635c65-c221-4b9b-b70a-5611b8369d77}', # 3.2a3 + '3.2.104' :'{748cd139-75b8-4ca8-98a7-58262298181e}', # 3.2a4 '3.2.111' :'{20bfc16f-c7cd-4fc0-8f96-9914614a3c50}', # 3.2b1 '3.2.112' :'{0e350c98-8d73-4993-b686-cfe87160046e}', # 3.2b2 '3.2.121' :'{2094968d-7583-47f6-a7fd-22304532e09f}', # 3.2rc1 '3.2.122' :'{4f3edfa6-cf70-469a-825f-e1206aa7f412}', # 3.2rc2 + '3.2.123' :'{90c673d7-8cfd-4969-9816-f7d70bad87f3}', # 3.2rc3 '3.2.150' :'{b2042d5e-986d-44ec-aee3-afe4108ccc93}', # 3.2.0 } Modified: python/branches/pep-3151/Tools/scripts/README ============================================================================== --- python/branches/pep-3151/Tools/scripts/README (original) +++ python/branches/pep-3151/Tools/scripts/README Sat Feb 26 08:16:32 2011 @@ -2,8 +2,6 @@ useful while building, extending or managing Python. Some (e.g., dutree or lll) are also generally useful UNIX tools. -See also the Demo/scripts directory! 
- 2to3 Main script for running the 2to3 conversion tool analyze_dxp.py Analyzes the result of sys.getdxp() byext.py Print lines/words/chars stats of files by extension Modified: python/branches/pep-3151/Tools/scripts/find_recursionlimit.py ============================================================================== --- python/branches/pep-3151/Tools/scripts/find_recursionlimit.py (original) +++ python/branches/pep-3151/Tools/scripts/find_recursionlimit.py Sat Feb 26 08:16:32 2011 @@ -77,14 +77,15 @@ except ImportError: print("cannot import _pickle, skipped!") return - l = None + k, l = None, None for n in itertools.count(): try: l = _cache[n] continue # Already tried and it works, let's save some time except KeyError: for i in range(100): - l = [l] + l = [k, l] + k = {i: l} _pickle.Pickler(io.BytesIO(), protocol=-1).dump(l) _cache[n] = l Modified: python/branches/pep-3151/Tools/scripts/patchcheck.py ============================================================================== --- python/branches/pep-3151/Tools/scripts/patchcheck.py (original) +++ python/branches/pep-3151/Tools/scripts/patchcheck.py Sat Feb 26 08:16:32 2011 @@ -45,13 +45,16 @@ sys.exit('need a checkout to get modified files') st = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE) - st.wait() - if vcs == 'hg': - return [x.decode().rstrip() for x in st.stdout] - else: - output = (x.decode().rstrip().rsplit(None, 1)[-1] - for x in st.stdout if x[0] in b'AM') + try: + st.wait() + if vcs == 'hg': + return [x.decode().rstrip() for x in st.stdout] + else: + output = (x.decode().rstrip().rsplit(None, 1)[-1] + for x in st.stdout if x[0] in b'AM') return set(path for path in output if os.path.isfile(path)) + finally: + st.stdout.close() def report_modified_files(file_paths): Deleted: python/branches/pep-3151/Tools/scripts/redemo.py ============================================================================== --- python/branches/pep-3151/Tools/scripts/redemo.py Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,171 +0,0 @@ -"""Basic regular expression demostration facility (Perl style syntax).""" - -from tkinter import * -import re - -class ReDemo: - - def __init__(self, master): - self.master = master - - self.promptdisplay = Label(self.master, anchor=W, - text="Enter a Perl-style regular expression:") - self.promptdisplay.pack(side=TOP, fill=X) - - self.regexdisplay = Entry(self.master) - self.regexdisplay.pack(fill=X) - self.regexdisplay.focus_set() - - self.addoptions() - - self.statusdisplay = Label(self.master, text="", anchor=W) - self.statusdisplay.pack(side=TOP, fill=X) - - self.labeldisplay = Label(self.master, anchor=W, - text="Enter a string to search:") - self.labeldisplay.pack(fill=X) - self.labeldisplay.pack(fill=X) - - self.showframe = Frame(master) - self.showframe.pack(fill=X, anchor=W) - - self.showvar = StringVar(master) - self.showvar.set("first") - - self.showfirstradio = Radiobutton(self.showframe, - text="Highlight first match", - variable=self.showvar, - value="first", - command=self.recompile) - self.showfirstradio.pack(side=LEFT) - - self.showallradio = Radiobutton(self.showframe, - text="Highlight all matches", - variable=self.showvar, - value="all", - command=self.recompile) - self.showallradio.pack(side=LEFT) - - self.stringdisplay = Text(self.master, width=60, height=4) - self.stringdisplay.pack(fill=BOTH, expand=1) - self.stringdisplay.tag_configure("hit", background="yellow") - - self.grouplabel = Label(self.master, text="Groups:", anchor=W) - self.grouplabel.pack(fill=X) - - self.grouplist = 
Listbox(self.master) - self.grouplist.pack(expand=1, fill=BOTH) - - self.regexdisplay.bind('', self.recompile) - self.stringdisplay.bind('', self.reevaluate) - - self.compiled = None - self.recompile() - - btags = self.regexdisplay.bindtags() - self.regexdisplay.bindtags(btags[1:] + btags[:1]) - - btags = self.stringdisplay.bindtags() - self.stringdisplay.bindtags(btags[1:] + btags[:1]) - - def addoptions(self): - self.frames = [] - self.boxes = [] - self.vars = [] - for name in ('IGNORECASE', - 'LOCALE', - 'MULTILINE', - 'DOTALL', - 'VERBOSE'): - if len(self.boxes) % 3 == 0: - frame = Frame(self.master) - frame.pack(fill=X) - self.frames.append(frame) - val = getattr(re, name) - var = IntVar() - box = Checkbutton(frame, - variable=var, text=name, - offvalue=0, onvalue=val, - command=self.recompile) - box.pack(side=LEFT) - self.boxes.append(box) - self.vars.append(var) - - def getflags(self): - flags = 0 - for var in self.vars: - flags = flags | var.get() - flags = flags - return flags - - def recompile(self, event=None): - try: - self.compiled = re.compile(self.regexdisplay.get(), - self.getflags()) - bg = self.promptdisplay['background'] - self.statusdisplay.config(text="", background=bg) - except re.error as msg: - self.compiled = None - self.statusdisplay.config( - text="re.error: %s" % str(msg), - background="red") - self.reevaluate() - - def reevaluate(self, event=None): - try: - self.stringdisplay.tag_remove("hit", "1.0", END) - except TclError: - pass - try: - self.stringdisplay.tag_remove("hit0", "1.0", END) - except TclError: - pass - self.grouplist.delete(0, END) - if not self.compiled: - return - self.stringdisplay.tag_configure("hit", background="yellow") - self.stringdisplay.tag_configure("hit0", background="orange") - text = self.stringdisplay.get("1.0", END) - last = 0 - nmatches = 0 - while last <= len(text): - m = self.compiled.search(text, last) - if m is None: - break - first, last = m.span() - if last == first: - last = first+1 - tag = "hit0" - else: - tag = "hit" - pfirst = "1.0 + %d chars" % first - plast = "1.0 + %d chars" % last - self.stringdisplay.tag_add(tag, pfirst, plast) - if nmatches == 0: - self.stringdisplay.yview_pickplace(pfirst) - groups = list(m.groups()) - groups.insert(0, m.group()) - for i in range(len(groups)): - g = "%2d: %r" % (i, groups[i]) - self.grouplist.insert(END, g) - nmatches = nmatches + 1 - if self.showvar.get() == "first": - break - - if nmatches == 0: - self.statusdisplay.config(text="(no match)", - background="yellow") - else: - self.statusdisplay.config(text="") - - -# Main function, run when invoked as a stand-alone Python program. 
- -def main(): - root = Tk() - demo = ReDemo(root) - root.protocol('WM_DELETE_WINDOW', root.quit) - root.mainloop() - -if __name__ == '__main__': - main() Modified: python/branches/pep-3151/Tools/scripts/untabify.py ============================================================================== --- python/branches/pep-3151/Tools/scripts/untabify.py (original) +++ python/branches/pep-3151/Tools/scripts/untabify.py Sat Feb 26 08:16:32 2011 @@ -5,7 +5,7 @@ import os import sys import getopt - +import tokenize def main(): tabsize = 8 @@ -27,8 +27,9 @@ def process(filename, tabsize, verbose=True): try: - with open(filename) as f: + with tokenize.open(filename) as f: text = f.read() + encoding = f.encoding except IOError as msg: print("%r: I/O error: %s" % (filename, msg)) return @@ -44,7 +45,7 @@ os.rename(filename, backup) except os.error: pass - with open(filename, "w") as f: + with open(filename, "w", encoding=encoding) as f: f.write(newtext) if verbose: print(filename) Modified: python/branches/pep-3151/Tools/unicode/gencodec.py ============================================================================== --- python/branches/pep-3151/Tools/unicode/gencodec.py (original) +++ python/branches/pep-3151/Tools/unicode/gencodec.py Sat Feb 26 08:16:32 2011 @@ -34,6 +34,9 @@ # Standard undefined Unicode code point UNI_UNDEFINED = chr(0xFFFE) +# Placeholder for a missing codepoint +MISSING_CODE = -1 + mapRE = re.compile('((?:0x[0-9a-fA-F]+\+?)+)' '\s+' '((?:(?:0x[0-9a-fA-Z]+|<[A-Za-z]+>)\+?)*)' @@ -52,7 +55,7 @@ """ if not codes: - return None + return MISSING_CODE l = codes.split('+') if len(l) == 1: return int(l[0],16) @@ -60,8 +63,8 @@ try: l[i] = int(l[i],16) except ValueError: - l[i] = None - l = [x for x in l if x is not None] + l[i] = MISSING_CODE + l = [x for x in l if x != MISSING_CODE] if len(l) == 1: return l[0] else: @@ -113,7 +116,7 @@ # mappings to None for the rest if len(identity) >= len(unmapped): for enc in unmapped: - enc2uni[enc] = (None, "") + enc2uni[enc] = (MISSING_CODE, "") enc2uni['IDENTITY'] = 256 return enc2uni @@ -211,7 +214,7 @@ (mapkey, mapcomment) = mapkey if isinstance(mapvalue, tuple): (mapvalue, mapcomment) = mapvalue - if mapkey is None: + if mapkey == MISSING_CODE: continue table[mapkey] = (mapvalue, mapcomment) if mapkey > maxkey: @@ -223,11 +226,11 @@ # Create table code for key in range(maxkey + 1): if key not in table: - mapvalue = None + mapvalue = MISSING_CODE mapcomment = 'UNDEFINED' else: mapvalue, mapcomment = table[key] - if mapvalue is None: + if mapvalue == MISSING_CODE: mapchar = UNI_UNDEFINED else: if isinstance(mapvalue, tuple): Modified: python/branches/pep-3151/Tools/unicode/genwincodecs.bat ============================================================================== --- python/branches/pep-3151/Tools/unicode/genwincodecs.bat (original) +++ python/branches/pep-3151/Tools/unicode/genwincodecs.bat Sat Feb 26 08:16:32 2011 @@ -1,7 +1,7 @@ - at rem Recreate some python charmap codecs from the Windows function - at rem MultiByteToWideChar. - - at cd /d %~dp0 - at mkdir build - at rem Arabic DOS code page -c:\python30\python genwincodec.py 720 > build/cp720.py + at rem Recreate some python charmap codecs from the Windows function + at rem MultiByteToWideChar. 
+ + at cd /d %~dp0 + at mkdir build + at rem Arabic DOS code page +c:\python30\python genwincodec.py 720 > build/cp720.py Modified: python/branches/pep-3151/Tools/unicode/makeunicodedata.py ============================================================================== --- python/branches/pep-3151/Tools/unicode/makeunicodedata.py (original) +++ python/branches/pep-3151/Tools/unicode/makeunicodedata.py Sat Feb 26 08:16:32 2011 @@ -70,6 +70,15 @@ NODELTA_MASK = 0x800 NUMERIC_MASK = 0x1000 +# these ranges need to match unicodedata.c:is_unified_ideograph +cjk_ranges = [ + ('3400', '4DB5'), + ('4E00', '9FCB'), + ('20000', '2A6D6'), + ('2A700', '2B734'), + ('2B740', '2B81D') +] + def maketables(trace=0): print("--- Reading", UNICODE_DATA % "", "...") @@ -81,7 +90,7 @@ for version in old_versions: print("--- Reading", UNICODE_DATA % ("-"+version), "...") - old_unicode = UnicodeData(version) + old_unicode = UnicodeData(version, cjk_check=False) print(len(list(filter(None, old_unicode.table))), "characters") merge_old_version(version, unicode, old_unicode) @@ -804,7 +813,8 @@ def __init__(self, version, linebreakprops=False, - expand=1): + expand=1, + cjk_check=True): self.changed = [] file = open_data(UNICODE_DATA, version) table = [None] * 0x110000 @@ -816,6 +826,8 @@ char = int(s[0], 16) table[char] = s + cjk_ranges_found = [] + # expand first-last ranges if expand: field = None @@ -826,12 +838,17 @@ s[1] = "" field = s elif s[1][-5:] == "Last>": + if s[1].startswith(". # @@ -553,8 +553,8 @@ # Identity of this package. PACKAGE_NAME='python' PACKAGE_TARNAME='python' -PACKAGE_VERSION='3.2' -PACKAGE_STRING='python 3.2' +PACKAGE_VERSION='3.3' +PACKAGE_STRING='python 3.3' PACKAGE_BUGREPORT='http://bugs.python.org/' PACKAGE_URL='' @@ -647,6 +647,7 @@ RUNSHARED INSTSONAME LDLIBRARYDIR +PY3LIBRARY BLDLIBRARY DLLLIBRARY LDLIBRARY @@ -1307,7 +1308,7 @@ # Omit some internal or obsolete options to make the list less imposing. # This message is too long to be a string in the A/UX 3.1 sh. cat <<_ACEOF -\`configure' configures python 3.2 to adapt to many kinds of systems. +\`configure' configures python 3.3 to adapt to many kinds of systems. Usage: $0 [OPTION]... [VAR=VALUE]... @@ -1368,7 +1369,7 @@ if test -n "$ac_init_help"; then case $ac_init_help in - short | recursive ) echo "Configuration of python 3.2:";; + short | recursive ) echo "Configuration of python 3.3:";; esac cat <<\_ACEOF @@ -1506,7 +1507,7 @@ test -n "$ac_init_help" && exit $ac_status if $ac_init_version; then cat <<\_ACEOF -python configure 3.2 +python configure 3.3 generated by GNU Autoconf 2.65 Copyright (C) 2009 Free Software Foundation, Inc. @@ -1935,11 +1936,11 @@ cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ $ac_includes_default - enum { N = $2 / 2 - 1 }; int main () { -static int test_array [1 - 2 * !(0 < ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1))]; +static int test_array [1 - 2 * !(enum { N = $2 / 2 - 1 }; + 0 < ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1))]; test_array [0] = 0 ; @@ -1950,11 +1951,11 @@ cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. 
*/ $ac_includes_default - enum { N = $2 / 2 - 1 }; int main () { -static int test_array [1 - 2 * !(($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1) +static int test_array [1 - 2 * !(enum { N = $2 / 2 - 1 }; + ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 1) < ($ac_type) ((((($ac_type) 1 << N) << N) - 1) * 2 + 2))]; test_array [0] = 0 @@ -2334,7 +2335,7 @@ This file contains any messages produced by compilers while running configure, to aid debugging if configure makes a mistake. -It was created by python $as_me 3.2, which was +It was created by python $as_me 3.3, which was generated by GNU Autoconf 2.65. Invocation command line was $ $0 $@ @@ -2700,7 +2701,7 @@ mv confdefs.h.new confdefs.h -VERSION=3.2 +VERSION=3.3 # Version number of Python's own shared library file. @@ -4707,13 +4708,14 @@ + LDLIBRARY="$LIBRARY" BLDLIBRARY='$(LDLIBRARY)' INSTSONAME='$(LDLIBRARY)' DLLLIBRARY='' LDLIBRARYDIR='' RUNSHARED='' -LDVERSION="$(VERSION)" +LDVERSION="$VERSION" # LINKCC is the command that links the python executable -- default is $(CC). # If CXX is set, and if it is needed to link a main function that was @@ -4903,6 +4905,10 @@ BLDLIBRARY='-Wl,-R,$(LIBDIR) -L. -lpython$(LDVERSION)' RUNSHARED=LD_LIBRARY_PATH=`pwd`:${LD_LIBRARY_PATH} INSTSONAME="$LDLIBRARY".$SOVERSION + if test "$with_pydebug" != yes + then + PY3LIBRARY=libpython3.so + fi ;; Linux*|GNU*|NetBSD*|FreeBSD*|DragonFly*) LDLIBRARY='libpython$(LDVERSION).so' @@ -4914,6 +4920,10 @@ ;; esac INSTSONAME="$LDLIBRARY".$SOVERSION + if test "$with_pydebug" != yes + then + PY3LIBRARY=libpython3.so + fi ;; hp*|HP*) case `uname -m` in @@ -5988,10 +5998,10 @@ unistd.h utime.h \ sys/audioio.h sys/bsdtty.h sys/epoll.h sys/event.h sys/file.h sys/loadavg.h \ sys/lock.h sys/mkdev.h sys/modem.h \ -sys/param.h sys/poll.h sys/select.h sys/socket.h sys/statvfs.h sys/stat.h \ -sys/termio.h sys/time.h \ -sys/times.h sys/types.h sys/un.h sys/utsname.h sys/wait.h pty.h libutil.h \ -sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \ +sys/param.h sys/poll.h sys/select.h sys/sendfile.h sys/socket.h sys/statvfs.h \ +sys/stat.h sys/termio.h sys/time.h \ +sys/times.h sys/types.h sys/uio.h sys/un.h sys/utsname.h sys/wait.h pty.h \ +libutil.h sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \ bluetooth/bluetooth.h linux/tipc.h spawn.h util.h do : as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh` @@ -6389,6 +6399,13 @@ if test "$use_lfs" = "yes"; then # Two defines needed to enable largefile support on various platforms # These may affect some typedefs +case $ac_sys_system/$ac_sys_release in +AIX*) + +$as_echo "#define _LARGE_FILES 1" >>confdefs.h + + ;; +esac $as_echo "#define _LARGEFILE_SOURCE 1" >>confdefs.h @@ -7417,7 +7434,7 @@ then case $ac_sys_system/$ac_sys_release in AIX*) - BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:Modules/python.exp -L\$(srcdir)" + BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:\$(srcdir)/Modules/python.exp" LDSHARED="\$(BINLIBDEST)/config/ld_so_aix \$(CC) -bI:\$(BINLIBDEST)/config/python.exp" ;; IRIX/5*) LDSHARED="ld -shared";; @@ -7524,8 +7541,8 @@ esac fi;; NetBSD*|DragonFly*) - LDSHARED="cc -shared" - LDCXXSHARED="c++ -shared";; + LDSHARED='$(CC) -shared' + LDCXXSHARED='$(CXX) -shared';; OpenUNIX*|UnixWare*) if test "$GCC" = "yes" ; then LDSHARED='$(CC) -shared' @@ -7676,6 +7693,51 @@ # checks for libraries +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for sendfile in -lsendfile" >&5 +$as_echo_n "checking for sendfile in -lsendfile... 
" >&6; } +if test "${ac_cv_lib_sendfile_sendfile+set}" = set; then : + $as_echo_n "(cached) " >&6 +else + ac_check_lib_save_LIBS=$LIBS +LIBS="-lsendfile $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +#ifdef __cplusplus +extern "C" +#endif +char sendfile (); +int +main () +{ +return sendfile (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_lib_sendfile_sendfile=yes +else + ac_cv_lib_sendfile_sendfile=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_sendfile_sendfile" >&5 +$as_echo "$ac_cv_lib_sendfile_sendfile" >&6; } +if test "x$ac_cv_lib_sendfile_sendfile" = x""yes; then : + cat >>confdefs.h <<_ACEOF +#define HAVE_LIBSENDFILE 1 +_ACEOF + + LIBS="-lsendfile $LIBS" + +fi + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for dlopen in -ldl" >&5 $as_echo_n "checking for dlopen in -ldl... " >&6; } if test "${ac_cv_lib_dl_dlopen+set}" = set; then : @@ -7881,7 +7943,7 @@ cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ - #include "/usr/lpp/xlC/include/load.h" + #include int main () { @@ -8965,7 +9027,7 @@ ;; solaris) if test -f /etc/netconfig; then - if /usr/xpg4/bin/grep -q tcp6 /etc/netconfig; then + if $GREP -q tcp6 /etc/netconfig; then ipv6type=$i ipv6trylibc=yes fi @@ -9249,18 +9311,19 @@ # checks for library functions for ac_func in alarm accept4 setitimer getitimer bind_textdomain_codeset chown \ - clock confstr ctermid execv fchmod fchown fork fpathconf ftime ftruncate \ + clock confstr ctermid execv faccessat fchmod fchmodat fchown fchownat \ + fdopendir fork fpathconf fstatat ftime ftruncate futimesat \ gai_strerror getgroups getlogin getloadavg getpeername getpgid getpid \ getpriority getresuid getresgid getpwent getspnam getspent getsid getwd \ - initgroups kill killpg lchmod lchown lstat mbrtowc mkfifo mknod mktime \ - mremap nice pathconf pause plock poll pthread_init \ - putenv readlink realpath \ - select sem_open sem_timedwait sem_getvalue sem_unlink setegid seteuid \ + initgroups kill killpg lchmod lchown linkat lstat mbrtowc mkdirat mkfifo \ + mkfifoat mknod mknodat mktime mremap nice openat pathconf pause plock poll \ + pthread_init putenv readlink readlinkat realpath renameat \ + select sem_open sem_timedwait sem_getvalue sem_unlink sendfile setegid seteuid \ setgid \ - setlocale setregid setreuid setresuid setresgid setsid setpgid setpgrp setuid setvbuf \ - sigaction siginterrupt sigrelse snprintf strftime strlcpy \ + setlocale setregid setreuid setresuid setresgid setsid setpgid setpgrp setpriority setuid setvbuf \ + sigaction siginterrupt sigrelse snprintf strftime strlcpy symlinkat \ sysconf tcgetpgrp tcsetpgrp tempnam timegm times tmpfile tmpnam tmpnam_r \ - truncate uname unsetenv utimes waitpid wait3 wait4 \ + truncate uname unlinkat unsetenv utimensat utimes waitpid wait3 wait4 \ wcscoll wcsftime wcsxfrm _getpty do : as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh` @@ -11772,52 +11835,6 @@ LIBS_SAVE=$LIBS LIBS="$LIBS $LIBM" -# On FreeBSD 6.2, it appears that tanh(-0.) returns 0. instead of -# -0. on some architectures. 
-{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether tanh preserves the sign of zero" >&5 -$as_echo_n "checking whether tanh preserves the sign of zero... " >&6; } -if test "${ac_cv_tanh_preserves_zero_sign+set}" = set; then : - $as_echo_n "(cached) " >&6 -else - -if test "$cross_compiling" = yes; then : - ac_cv_tanh_preserves_zero_sign=no -else - cat confdefs.h - <<_ACEOF >conftest.$ac_ext -/* end confdefs.h. */ - -#include -#include -int main() { - /* return 0 if either negative zeros don't exist - on this platform or if negative zeros exist - and tanh(-0.) == -0. */ - if (atan2(0., -1.) == atan2(-0., -1.) || - atan2(tanh(-0.), -1.) == atan2(-0., -1.)) exit(0); - else exit(1); -} - -_ACEOF -if ac_fn_c_try_run "$LINENO"; then : - ac_cv_tanh_preserves_zero_sign=yes -else - ac_cv_tanh_preserves_zero_sign=no -fi -rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ - conftest.$ac_objext conftest.beam conftest.$ac_ext -fi - -fi - -{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_tanh_preserves_zero_sign" >&5 -$as_echo "$ac_cv_tanh_preserves_zero_sign" >&6; } -if test "$ac_cv_tanh_preserves_zero_sign" = yes -then - -$as_echo "#define TANH_PRESERVES_ZERO_SIGN 1" >>confdefs.h - -fi - for ac_func in acosh asinh atanh copysign erf erfc expm1 finite gamma do : as_ac_var=`$as_echo "ac_cv_func_$ac_func" | $as_tr_sh` @@ -11879,6 +11896,101 @@ _ACEOF +# On FreeBSD 6.2, it appears that tanh(-0.) returns 0. instead of +# -0. on some architectures. +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether tanh preserves the sign of zero" >&5 +$as_echo_n "checking whether tanh preserves the sign of zero... " >&6; } +if test "${ac_cv_tanh_preserves_zero_sign+set}" = set; then : + $as_echo_n "(cached) " >&6 +else + +if test "$cross_compiling" = yes; then : + ac_cv_tanh_preserves_zero_sign=no +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +#include +#include +int main() { + /* return 0 if either negative zeros don't exist + on this platform or if negative zeros exist + and tanh(-0.) == -0. */ + if (atan2(0., -1.) == atan2(-0., -1.) || + atan2(tanh(-0.), -1.) == atan2(-0., -1.)) exit(0); + else exit(1); +} + +_ACEOF +if ac_fn_c_try_run "$LINENO"; then : + ac_cv_tanh_preserves_zero_sign=yes +else + ac_cv_tanh_preserves_zero_sign=no +fi +rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ + conftest.$ac_objext conftest.beam conftest.$ac_ext +fi + +fi + +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_tanh_preserves_zero_sign" >&5 +$as_echo "$ac_cv_tanh_preserves_zero_sign" >&6; } +if test "$ac_cv_tanh_preserves_zero_sign" = yes +then + +$as_echo "#define TANH_PRESERVES_ZERO_SIGN 1" >>confdefs.h + +fi + +if test "$ac_cv_func_log1p" = yes +then + # On some versions of AIX, log1p(-0.) returns 0. instead of + # -0. See issue #9920. + { $as_echo "$as_me:${as_lineno-$LINENO}: checking whether log1p drops the sign of negative zero" >&5 +$as_echo_n "checking whether log1p drops the sign of negative zero... " >&6; } + if test "${ac_cv_log1p_drops_zero_sign+set}" = set; then : + $as_echo_n "(cached) " >&6 +else + + if test "$cross_compiling" = yes; then : + ac_cv_log1p_drops_zero_sign=no +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + + #include + #include + int main() { + /* Fail if the signs of log1p(-0.) and -0. can be + distinguished. */ + if (atan2(log1p(-0.), -1.) 
== atan2(-0., -1.)) + return 0; + else + return 1; + } + +_ACEOF +if ac_fn_c_try_run "$LINENO"; then : + ac_cv_log1p_drops_zero_sign=no +else + ac_cv_log1p_drops_zero_sign=yes +fi +rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ + conftest.$ac_objext conftest.beam conftest.$ac_ext +fi + +fi + + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_log1p_drops_zero_sign" >&5 +$as_echo "$ac_cv_log1p_drops_zero_sign" >&6; } +fi +if test "$ac_cv_log1p_drops_zero_sign" = yes +then + +$as_echo "#define LOG1P_DROPS_ZERO_SIGN 1" >>confdefs.h + +fi + LIBS=$LIBS_SAVE # For multiprocessing module, check that sem_open @@ -13681,6 +13793,12 @@ OSF*) as_fn_error "OSF* systems are deprecated unless somebody volunteers. Check http://bugs.python.org/issue8606" "$LINENO" 5 ;; esac +ac_fn_c_check_func "$LINENO" "pipe2" "ac_cv_func_pipe2" +if test "x$ac_cv_func_pipe2" = x""yes; then : + +$as_echo "#define HAVE_PIPE2 1" >>confdefs.h + +fi @@ -14213,7 +14331,7 @@ # report actual input values of CONFIG_FILES etc. instead of their # values after options handling. ac_log=" -This file was extended by python $as_me 3.2, which was +This file was extended by python $as_me 3.3, which was generated by GNU Autoconf 2.65. Invocation command line was CONFIG_FILES = $CONFIG_FILES @@ -14238,8 +14356,8 @@ cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 # Files that config.status was made for. -config_files="$ac_config_files" -config_headers="$ac_config_headers" +config_files="`echo $ac_config_files`" +config_headers="`echo $ac_config_headers`" _ACEOF @@ -14275,7 +14393,7 @@ cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" ac_cs_version="\\ -python config.status 3.2 +python config.status 3.3 configured by $0, generated by GNU Autoconf 2.65, with options \\"\$ac_cs_config\\" Modified: python/branches/pep-3151/configure.in ============================================================================== --- python/branches/pep-3151/configure.in (original) +++ python/branches/pep-3151/configure.in Sat Feb 26 08:16:32 2011 @@ -3,15 +3,8 @@ dnl *********************************************** # Set VERSION so we only need to edit in one place (i.e., here) -m4_define(PYTHON_VERSION, 3.2) +m4_define(PYTHON_VERSION, 3.3) -dnl Some m4 magic to ensure that the configure script is generated -dnl by the correct autoconf version. -m4_define([version_required], -[m4_if(m4_version_compare(m4_defn([m4_PACKAGE_VERSION]), [$1]), 0, - [], - [m4_fatal([Autoconf version $1 is required for Python], 63)]) -]) AC_PREREQ(2.65) AC_REVISION($Revision$) @@ -608,6 +601,7 @@ AC_SUBST(LDLIBRARY) AC_SUBST(DLLLIBRARY) AC_SUBST(BLDLIBRARY) +AC_SUBST(PY3LIBRARY) AC_SUBST(LDLIBRARYDIR) AC_SUBST(INSTSONAME) AC_SUBST(RUNSHARED) @@ -618,7 +612,7 @@ DLLLIBRARY='' LDLIBRARYDIR='' RUNSHARED='' -LDVERSION="$(VERSION)" +LDVERSION="$VERSION" # LINKCC is the command that links the python executable -- default is $(CC). # If CXX is set, and if it is needed to link a main function that was @@ -737,6 +731,10 @@ BLDLIBRARY='-Wl,-R,$(LIBDIR) -L. 
-lpython$(LDVERSION)' RUNSHARED=LD_LIBRARY_PATH=`pwd`:${LD_LIBRARY_PATH} INSTSONAME="$LDLIBRARY".$SOVERSION + if test "$with_pydebug" != yes + then + PY3LIBRARY=libpython3.so + fi ;; Linux*|GNU*|NetBSD*|FreeBSD*|DragonFly*) LDLIBRARY='libpython$(LDVERSION).so' @@ -748,6 +746,10 @@ ;; esac INSTSONAME="$LDLIBRARY".$SOVERSION + if test "$with_pydebug" != yes + then + PY3LIBRARY=libpython3.so + fi ;; hp*|HP*) case `uname -m` in @@ -1281,10 +1283,10 @@ unistd.h utime.h \ sys/audioio.h sys/bsdtty.h sys/epoll.h sys/event.h sys/file.h sys/loadavg.h \ sys/lock.h sys/mkdev.h sys/modem.h \ -sys/param.h sys/poll.h sys/select.h sys/socket.h sys/statvfs.h sys/stat.h \ -sys/termio.h sys/time.h \ -sys/times.h sys/types.h sys/un.h sys/utsname.h sys/wait.h pty.h libutil.h \ -sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \ +sys/param.h sys/poll.h sys/select.h sys/sendfile.h sys/socket.h sys/statvfs.h \ +sys/stat.h sys/termio.h sys/time.h \ +sys/times.h sys/types.h sys/uio.h sys/un.h sys/utsname.h sys/wait.h pty.h \ +libutil.h sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \ bluetooth/bluetooth.h linux/tipc.h spawn.h util.h) AC_HEADER_DIRENT AC_HEADER_MAJOR @@ -1367,6 +1369,12 @@ if test "$use_lfs" = "yes"; then # Two defines needed to enable largefile support on various platforms # These may affect some typedefs +case $ac_sys_system/$ac_sys_release in +AIX*) + AC_DEFINE(_LARGE_FILES, 1, + [This must be defined on AIX systems to enable large file support.]) + ;; +esac AC_DEFINE(_LARGEFILE_SOURCE, 1, [This must be defined on some systems to enable large file support.]) AC_DEFINE(_FILE_OFFSET_BITS, 64, @@ -1633,7 +1641,7 @@ then case $ac_sys_system/$ac_sys_release in AIX*) - BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:Modules/python.exp -L\$(srcdir)" + BLDSHARED="\$(srcdir)/Modules/ld_so_aix \$(CC) -bI:\$(srcdir)/Modules/python.exp" LDSHARED="\$(BINLIBDEST)/config/ld_so_aix \$(CC) -bI:\$(BINLIBDEST)/config/python.exp" ;; IRIX/5*) LDSHARED="ld -shared";; @@ -1740,8 +1748,8 @@ esac fi;; NetBSD*|DragonFly*) - LDSHARED="cc -shared" - LDCXXSHARED="c++ -shared";; + LDSHARED='$(CC) -shared' + LDCXXSHARED='$(CXX) -shared';; OpenUNIX*|UnixWare*) if test "$GCC" = "yes" ; then LDSHARED='$(CC) -shared' @@ -1883,6 +1891,7 @@ # checks for libraries +AC_CHECK_LIB(sendfile, sendfile) AC_CHECK_LIB(dl, dlopen) # Dynamic linking for SunOS/Solaris and SYSV AC_CHECK_LIB(dld, shl_load) # Dynamic linking for HP-UX @@ -1903,7 +1912,7 @@ case "$ac_sys_system" in AIX*) AC_MSG_CHECKING(for genuine AIX C++ extensions support) AC_LINK_IFELSE([ - AC_LANG_PROGRAM([[#include "/usr/lpp/xlC/include/load.h"]], + AC_LANG_PROGRAM([[#include ]], [[loadAndInit("", 0, "")]]) ],[ AC_DEFINE(AIX_GENUINE_CPLUSPLUS, 1, @@ -2340,7 +2349,7 @@ ;; solaris) if test -f /etc/netconfig; then - if /usr/xpg4/bin/grep -q tcp6 /etc/netconfig; then + if $GREP -q tcp6 /etc/netconfig; then ipv6type=$i ipv6trylibc=yes fi @@ -2525,18 +2534,19 @@ # checks for library functions AC_CHECK_FUNCS(alarm accept4 setitimer getitimer bind_textdomain_codeset chown \ - clock confstr ctermid execv fchmod fchown fork fpathconf ftime ftruncate \ + clock confstr ctermid execv faccessat fchmod fchmodat fchown fchownat \ + fdopendir fork fpathconf fstatat ftime ftruncate futimesat \ gai_strerror getgroups getlogin getloadavg getpeername getpgid getpid \ getpriority getresuid getresgid getpwent getspnam getspent getsid getwd \ - initgroups kill killpg lchmod lchown lstat mbrtowc mkfifo mknod mktime \ - mremap nice pathconf pause plock poll pthread_init \ - putenv 
readlink realpath \ - select sem_open sem_timedwait sem_getvalue sem_unlink setegid seteuid \ + initgroups kill killpg lchmod lchown linkat lstat mbrtowc mkdirat mkfifo \ + mkfifoat mknod mknodat mktime mremap nice openat pathconf pause plock poll \ + pthread_init putenv readlink readlinkat realpath renameat \ + select sem_open sem_timedwait sem_getvalue sem_unlink sendfile setegid seteuid \ setgid \ - setlocale setregid setreuid setresuid setresgid setsid setpgid setpgrp setuid setvbuf \ - sigaction siginterrupt sigrelse snprintf strftime strlcpy \ + setlocale setregid setreuid setresuid setresgid setsid setpgid setpgrp setpriority setuid setvbuf \ + sigaction siginterrupt sigrelse snprintf strftime strlcpy symlinkat \ sysconf tcgetpgrp tcsetpgrp tempnam timegm times tmpfile tmpnam tmpnam_r \ - truncate uname unsetenv utimes waitpid wait3 wait4 \ + truncate uname unlinkat unsetenv utimensat utimes waitpid wait3 wait4 \ wcscoll wcsftime wcsxfrm _getpty) # For some functions, having a definition is not sufficient, since @@ -3372,6 +3382,10 @@ LIBS_SAVE=$LIBS LIBS="$LIBS $LIBM" +AC_CHECK_FUNCS([acosh asinh atanh copysign erf erfc expm1 finite gamma]) +AC_CHECK_FUNCS([hypot lgamma log1p round tgamma]) +AC_CHECK_DECLS([isinf, isnan, isfinite], [], [], [[#include ]]) + # On FreeBSD 6.2, it appears that tanh(-0.) returns 0. instead of # -0. on some architectures. AC_MSG_CHECKING(whether tanh preserves the sign of zero) @@ -3398,9 +3412,34 @@ [Define if tanh(-0.) is -0., or if platform doesn't have signed zeros]) fi -AC_CHECK_FUNCS([acosh asinh atanh copysign erf erfc expm1 finite gamma]) -AC_CHECK_FUNCS([hypot lgamma log1p round tgamma]) -AC_CHECK_DECLS([isinf, isnan, isfinite], [], [], [[#include ]]) +if test "$ac_cv_func_log1p" = yes +then + # On some versions of AIX, log1p(-0.) returns 0. instead of + # -0. See issue #9920. + AC_MSG_CHECKING(whether log1p drops the sign of negative zero) + AC_CACHE_VAL(ac_cv_log1p_drops_zero_sign, [ + AC_RUN_IFELSE([AC_LANG_SOURCE([[ + #include + #include + int main() { + /* Fail if the signs of log1p(-0.) and -0. can be + distinguished. */ + if (atan2(log1p(-0.), -1.) == atan2(-0., -1.)) + return 0; + else + return 1; + } + ]])], + [ac_cv_log1p_drops_zero_sign=no], + [ac_cv_log1p_drops_zero_sign=yes], + [ac_cv_log1p_drops_zero_sign=no])]) + AC_MSG_RESULT($ac_cv_log1p_drops_zero_sign) +fi +if test "$ac_cv_log1p_drops_zero_sign" = yes +then + AC_DEFINE(LOG1P_DROPS_ZERO_SIGN, 1, + [Define if log1p(-0.) is 0. rather than -0.]) +fi LIBS=$LIBS_SAVE @@ -4206,7 +4245,7 @@ OSF*) AC_MSG_ERROR(OSF* systems are deprecated unless somebody volunteers. Check http://bugs.python.org/issue8606) ;; esac - +AC_CHECK_FUNC(pipe2, AC_DEFINE(HAVE_PIPE2, 1, [Define if the OS supports pipe2()]), ) AC_SUBST(THREADHEADERS) Modified: python/branches/pep-3151/pyconfig.h.in ============================================================================== --- python/branches/pep-3151/pyconfig.h.in (original) +++ python/branches/pep-3151/pyconfig.h.in Sat Feb 26 08:16:32 2011 @@ -205,21 +205,33 @@ /* Define to 1 if you have the `expm1' function. */ #undef HAVE_EXPM1 +/* Define to 1 if you have the `faccessat' function. */ +#undef HAVE_FACCESSAT + /* Define if you have the 'fchdir' function. */ #undef HAVE_FCHDIR /* Define to 1 if you have the `fchmod' function. */ #undef HAVE_FCHMOD +/* Define to 1 if you have the `fchmodat' function. */ +#undef HAVE_FCHMODAT + /* Define to 1 if you have the `fchown' function. */ #undef HAVE_FCHOWN +/* Define to 1 if you have the `fchownat' function. 
*/ +#undef HAVE_FCHOWNAT + /* Define to 1 if you have the header file. */ #undef HAVE_FCNTL_H /* Define if you have the 'fdatasync' function. */ #undef HAVE_FDATASYNC +/* Define to 1 if you have the `fdopendir' function. */ +#undef HAVE_FDOPENDIR + /* Define to 1 if you have the `finite' function. */ #undef HAVE_FINITE @@ -241,6 +253,9 @@ /* Define to 1 if you have the `fseeko' function. */ #undef HAVE_FSEEKO +/* Define to 1 if you have the `fstatat' function. */ +#undef HAVE_FSTATAT + /* Define to 1 if you have the `fstatvfs' function. */ #undef HAVE_FSTATVFS @@ -259,6 +274,9 @@ /* Define to 1 if you have the `ftruncate' function. */ #undef HAVE_FTRUNCATE +/* Define to 1 if you have the `futimesat' function. */ +#undef HAVE_FUTIMESAT + /* Define to 1 if you have the `gai_strerror' function. */ #undef HAVE_GAI_STRERROR @@ -431,6 +449,9 @@ /* Define if you have the 'link' function. */ #undef HAVE_LINK +/* Define to 1 if you have the `linkat' function. */ +#undef HAVE_LINKAT + /* Define to 1 if you have the header file. */ #undef HAVE_LINUX_NETLINK_H @@ -461,12 +482,21 @@ /* Define to 1 if you have the header file. */ #undef HAVE_MEMORY_H +/* Define to 1 if you have the `mkdirat' function. */ +#undef HAVE_MKDIRAT + /* Define to 1 if you have the `mkfifo' function. */ #undef HAVE_MKFIFO +/* Define to 1 if you have the `mkfifoat' function. */ +#undef HAVE_MKFIFOAT + /* Define to 1 if you have the `mknod' function. */ #undef HAVE_MKNOD +/* Define to 1 if you have the `mknodat' function. */ +#undef HAVE_MKNODAT + /* Define to 1 if you have the `mktime' function. */ #undef HAVE_MKTIME @@ -485,6 +515,9 @@ /* Define to 1 if you have the `nice' function. */ #undef HAVE_NICE +/* Define to 1 if you have the `openat' function. */ +#undef HAVE_OPENAT + /* Define to 1 if you have the `openpty' function. */ #undef HAVE_OPENPTY @@ -497,6 +530,15 @@ /* Define to 1 if you have the `pause' function. */ #undef HAVE_PAUSE +/* Define if the OS supports pipe2() */ +#undef HAVE_PIPE2 + +/* Define if the OS supports pipe2() */ +#undef HAVE_PIPE2 + +/* Define if the OS supports pipe2() */ +#undef HAVE_PIPE2 + /* Define to 1 if you have the `plock' function. */ #undef HAVE_PLOCK @@ -533,9 +575,15 @@ /* Define to 1 if you have the `readlink' function. */ #undef HAVE_READLINK +/* Define to 1 if you have the `readlinkat' function. */ +#undef HAVE_READLINKAT + /* Define to 1 if you have the `realpath' function. */ #undef HAVE_REALPATH +/* Define to 1 if you have the `renameat' function. */ +#undef HAVE_RENAMEAT + /* Define if you have readline 2.1 */ #undef HAVE_RL_CALLBACK @@ -599,6 +647,9 @@ /* Define to 1 if you have the `setpgrp' function. */ #undef HAVE_SETPGRP +/* Define to 1 if you have the `setpriority' function. */ +#undef HAVE_SETPRIORITY + /* Define to 1 if you have the `setregid' function. */ #undef HAVE_SETREGID @@ -718,6 +769,9 @@ /* Define if you have the 'symlink' function. */ #undef HAVE_SYMLINK +/* Define to 1 if you have the `symlinkat' function. */ +#undef HAVE_SYMLINKAT + /* Define to 1 if you have the `sysconf' function. */ #undef HAVE_SYSCONF @@ -771,6 +825,9 @@ /* Define to 1 if you have the header file. */ #undef HAVE_SYS_SELECT_H +/* Define to 1 if you have the header file. */ +#undef HAVE_SYS_SENDFILE_H + /* Define to 1 if you have the header file. */ #undef HAVE_SYS_SOCKET_H @@ -860,6 +917,9 @@ /* Define to 1 if you have the header file. */ #undef HAVE_UNISTD_H +/* Define to 1 if you have the `unlinkat' function. */ +#undef HAVE_UNLINKAT + /* Define to 1 if you have the `unsetenv' function. 
*/ #undef HAVE_UNSETENV @@ -871,6 +931,9 @@ /* Define to 1 if you have the header file. */ #undef HAVE_UTIL_H +/* Define to 1 if you have the `utimensat' function. */ +#undef HAVE_UTIMENSAT + /* Define to 1 if you have the `utimes' function. */ #undef HAVE_UTIMES @@ -911,6 +974,9 @@ /* Define if you are using Mach cthreads directly under /include */ #undef HURD_C_THREADS +/* Define if log1p(-0.) is 0. rather than -0. */ +#undef LOG1P_DROPS_ZERO_SIGN + /* Define if you are using Mach cthreads under mach / */ #undef MACH_C_THREADS @@ -1018,6 +1084,9 @@ /* The size of `size_t', as computed by sizeof. */ #undef SIZEOF_SIZE_T +/* Define to 1 if you have the `sendfile' function. */ +#undef HAVE_SENDFILE + /* The size of `time_t', as computed by sizeof. */ #undef SIZEOF_TIME_T @@ -1146,6 +1215,9 @@ /* This must be defined on some systems to enable large file support. */ #undef _LARGEFILE_SOURCE +/* This must be defined on AIX systems to enable large file support. */ +#undef _LARGE_FILES + /* Define to 1 if on MINIX. */ #undef _MINIX Deleted: python/branches/pep-3151/runtests.sh ============================================================================== --- python/branches/pep-3151/runtests.sh Sat Feb 26 08:16:32 2011 +++ (empty file) @@ -1,93 +0,0 @@ -#!/bin/bash - -HELP="Usage: ./runtests.py [-h] [-x] [flags] [tests] - -Runs each unit test independently, with output directed to a file in -OUT/.out. If no tests are given, all tests are run; otherwise, -only the specified tests are run, unless -x is also given, in which -case all tests *except* those given are run. - -Standard output shows the name of the tests run, with 'BAD' or -'SKIPPED' added if the test didn't produce a positive result. Also, -three files are created, named 'BAD', 'GOOD' and 'SKIPPED', to which -are written the names of the tests categorized by result. - -Flags (arguments starting with '-') are passed transparently to -regrtest.py, except for -x, which is processed here." - -# Choose the Python binary. -case `uname` in -Darwin) PYTHON=./python.exe;; -CYGWIN*) PYTHON=./python.exe;; -*) PYTHON=./python;; -esac - -PYTHON="$PYTHON -bb" - -# Unset PYTHONPATH, just to be sure. -unset PYTHONPATH - -# Create the output directory if necessary. -mkdir -p OUT - -# Empty the summary files. ->GOOD ->BAD ->SKIPPED - -# Process flags (transparently pass these on to regrtest.py) -FLAGS="" -EXCEPT="" -while : -do - case $1 in - -h|--h|-help|--help) echo "$HELP"; exit;; - --) FLAGS="$FLAGS $1"; shift; break;; - -x) EXCEPT="$1"; shift;; - -*) FLAGS="$FLAGS $1"; shift;; - *) break;; - esac -done - -# Compute the list of tests to run. -case "$#$EXCEPT" in -0) - TESTS=`(cd Lib/test; ls test_*.py | sed 's/\.py//')` - ;; -*-x) - PAT="^(`echo $@ | sed 's/\.py//g' | sed 's/ /|/g'`)$" - TESTS=`(cd Lib/test; ls test_*.py | sed 's/\.py//' | egrep -v "$PAT")` - ;; -*) - TESTS="$@" - ;; -esac - -# Run the tests. 
-for T in $TESTS -do - echo -n $T - if case $T in - *curses*) - echo - $PYTHON -E Lib/test/regrtest.py $FLAGS $T 2>OUT/$T.out - ;; - *) $PYTHON -E Lib/test/regrtest.py $FLAGS $T >OUT/$T.out 2>&1;; - esac - then - if grep -q "1 test skipped:" OUT/$T.out - then - echo " SKIPPED" - echo $T >>SKIPPED - else - echo - echo $T >>GOOD - fi - else - echo " BAD" - echo $T >>BAD - fi -done - -# Summarize results -wc -l BAD GOOD SKIPPED Modified: python/branches/pep-3151/setup.py ============================================================================== --- python/branches/pep-3151/setup.py (original) +++ python/branches/pep-3151/setup.py Sat Feb 26 08:16:32 2011 @@ -14,6 +14,7 @@ from distutils.command.build_ext import build_ext from distutils.command.install import install from distutils.command.install_lib import install_lib +from distutils.command.build_scripts import build_scripts from distutils.spawn import find_executable # Were we compiled --with-pydebug or with #define Py_DEBUG? @@ -27,11 +28,19 @@ _BUILDDIR_COOKIE = "pybuilddir.txt" def add_dir_to_list(dirlist, dir): - """Add the directory 'dir' to the list 'dirlist' (at the front) if + """Add the directory 'dir' to the list 'dirlist' (after any relative + directories) if: + 1) 'dir' is not already in 'dirlist' - 2) 'dir' actually exists, and is a directory.""" - if dir is not None and os.path.isdir(dir) and dir not in dirlist: - dirlist.insert(0, dir) + 2) 'dir' actually exists, and is a directory. + """ + if dir is None or not os.path.isdir(dir) or dir in dirlist: + return + for i, path in enumerate(dirlist): + if not os.path.isabs(path): + dirlist.insert(i + 1, dir) + return + dirlist.insert(0, dir) def macosx_sdk_root(): """ @@ -362,7 +371,9 @@ return sys.platform def detect_modules(self): - # Ensure that /usr/local is always used + # Ensure that /usr/local is always used, but the local build + # directories (i.e. '.' and 'Include') must be first. See issue + # 10520. add_dir_to_list(self.compiler.library_dirs, '/usr/local/lib') add_dir_to_list(self.compiler.include_dirs, '/usr/local/include') @@ -437,9 +448,10 @@ # This should work on any unixy platform ;-) # If the user has bothered specifying additional -I and -L flags # in OPT and LDFLAGS we might as well use them here. - # NOTE: using shlex.split would technically be more correct, but - # also gives a bootstrap problem. Let's hope nobody uses directories - # with whitespace in the name to store libraries. + # + # NOTE: using shlex.split would technically be more correct, but + # also gives a bootstrap problem. Let's hope nobody uses + # directories with whitespace in the name to store libraries. cflags, ldflags = sysconfig.get_config_vars( 'CFLAGS', 'LDFLAGS') for item in cflags.split(): @@ -624,7 +636,7 @@ libs = ['crypt'] else: libs = [] - exts.append( Extension('crypt', ['cryptmodule.c'], libraries=libs) ) + exts.append( Extension('_crypt', ['_cryptmodule.c'], libraries=libs) ) # CSV files exts.append( Extension('_csv', ['_csv.c']) ) @@ -1568,6 +1580,10 @@ ## # Uncomment these lines if you want to play with xxmodule.c ## ext = Extension('xx', ['xxmodule.c']) ## self.extensions.append(ext) + if 'd' not in sys.abiflags: + ext = Extension('xxlimited', ['xxlimited.c'], + define_macros=[('Py_LIMITED_API', 1)]) + self.extensions.append(ext) # XXX handle these, but how to detect? 
# *** Uncomment and edit for PIL (TkImaging) extension only: @@ -1777,6 +1793,25 @@ def is_chmod_supported(self): return hasattr(os, 'chmod') +class PyBuildScripts(build_scripts): + def copy_scripts(self): + outfiles, updated_files = build_scripts.copy_scripts(self) + fullversion = '-{0[0]}.{0[1]}'.format(sys.version_info) + minoronly = '.{0[1]}'.format(sys.version_info) + newoutfiles = [] + newupdated_files = [] + for filename in outfiles: + if filename.endswith('2to3'): + newfilename = filename + fullversion + else: + newfilename = filename + minoronly + log.info('renaming {} to {}'.format(filename, newfilename)) + os.rename(filename, newfilename) + newoutfiles.append(newfilename) + if filename in updated_files: + newupdated_files.append(newfilename) + return newoutfiles, newupdated_files + SUMMARY = """ Python is an interpreted, interactive, object-oriented programming language. It is often compared to Tcl, Perl, Scheme or Java. @@ -1822,12 +1857,17 @@ platforms = ["Many"], # Build info - cmdclass = {'build_ext':PyBuildExt, 'install':PyBuildInstall, - 'install_lib':PyBuildInstallLib}, + cmdclass = {'build_ext': PyBuildExt, + 'build_scripts': PyBuildScripts, + 'install': PyBuildInstall, + 'install_lib': PyBuildInstallLib}, # The struct module is defined here, because build_ext won't be # called unless there's at least one extension module defined. ext_modules=[Extension('_struct', ['_struct.c'])], + # If you change the scripts installed here, you also need to + # check the PyBuildScripts command above, and change the links + # created by the bininstall target in Makefile.pre.in scripts = ["Tools/scripts/pydoc3", "Tools/scripts/idle3", "Tools/scripts/2to3"] ) From python-checkins at python.org Sat Feb 26 08:18:23 2011 From: python-checkins at python.org (vinay.sajip) Date: Sat, 26 Feb 2011 08:18:23 +0100 (CET) Subject: [Python-checkins] r88636 - python/branches/py3k/Lib/test/test_logging.py Message-ID: <20110226071823.AF471EEA15@mail.python.org> Author: vinay.sajip Date: Sat Feb 26 08:18:22 2011 New Revision: 88636 Log: test_logging: Changed TimedRotatingFileHandler tests to use UTC time rather than local time. 
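The arithmetic behind the new expected values in the diff below: with ``utc=True`` the handler computes the rollover from ``time.gmtime()``, and at the epoch (a Thursday, 1970-01-01 00:00:00 UTC) the next ``MIDNIGHT`` rollover is a full day away, while ``W0`` (Monday) adds four more days on top of that. A minimal sketch (not part of the commit; the temporary file name is arbitrary) that reproduces those values:

    import logging.handlers
    import os
    import tempfile

    fn = os.path.join(tempfile.gettempdir(), 'rotation-demo.log')
    for when, expected in (('MIDNIGHT', 24 * 60 * 60),
                           ('W0', (4 * 24 + 24) * 60 * 60)):
        # delay=True keeps the handler from actually creating the file.
        rh = logging.handlers.TimedRotatingFileHandler(
            fn, when=when, interval=1, backupCount=0, utc=True, delay=True)
        assert rh.computeRollover(0.0) == expected, (when, rh.computeRollover(0.0))
        rh.close()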
Modified: python/branches/py3k/Lib/test/test_logging.py Modified: python/branches/py3k/Lib/test/test_logging.py ============================================================================== --- python/branches/py3k/Lib/test/test_logging.py (original) +++ python/branches/py3k/Lib/test/test_logging.py Sat Feb 26 08:18:22 2011 @@ -2044,13 +2044,13 @@ ('M', 60), ('H', 60 * 60), ('D', 60 * 60 * 24), - ('MIDNIGHT', 60 * 60 * 23), + ('MIDNIGHT', 60 * 60 * 24), # current time (epoch start) is a Thursday, W0 means Monday - ('W0', secs(days=4, hours=23)), + ('W0', secs(days=4, hours=24)), ): def test_compute_rollover(self, when=when, exp=exp): rh = logging.handlers.TimedRotatingFileHandler( - self.fn, when=when, interval=1, backupCount=0) + self.fn, when=when, interval=1, backupCount=0, utc=True) currentTime = 0.0 actual = rh.computeRollover(currentTime) if exp != actual: From python-checkins at python.org Sat Feb 26 08:18:36 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 08:18:36 +0100 (CET) Subject: [Python-checkins] r88637 - python/branches/pep-3151/TODO-3151.txt Message-ID: <20110226071836.5BFD6EEA2A@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 08:18:35 2011 New Revision: 88637 Log: Add a small todo Added: python/branches/pep-3151/TODO-3151.txt (contents, props changed) Added: python/branches/pep-3151/TODO-3151.txt ============================================================================== --- (empty file) +++ python/branches/pep-3151/TODO-3151.txt Sat Feb 26 08:18:35 2011 @@ -0,0 +1,6 @@ + +- rename all PyExc_WindowsError to PyExc_IOError +- rename all PyExc_OSError to PyExc_IOError +- rename all WindowsError to IOError +- rename all OSError to IOError +- LOAD_OLD_GLOBAL From python-checkins at python.org Sat Feb 26 09:17:28 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 09:17:28 +0100 (CET) Subject: [Python-checkins] r88638 - python/branches/empty/README Message-ID: <20110226081728.022CBEE98A@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 09:17:25 2011 New Revision: 88638 Log: Dummy commit to mostly-clean the misnamed branch checkouts on buildslaves Modified: python/branches/empty/README Modified: python/branches/empty/README ============================================================================== --- python/branches/empty/README (original) +++ python/branches/empty/README Sat Feb 26 09:17:25 2011 @@ -1,2 +1,3 @@ This is an empty directory which can be used to mostly-clean the local checkouts on the build slaves. + From python-checkins at python.org Sat Feb 26 09:45:20 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 09:45:20 +0100 (CET) Subject: [Python-checkins] r88639 - in python/branches/py3k: Lib/ctypes/test/test_loading.py Lib/ctypes/util.py Misc/ACKS Misc/NEWS Message-ID: <20110226084520.BE64BED56@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 09:45:20 2011 New Revision: 88639 Log: Issue #11258: Speed up ctypes.util.find_library() under Linux a lot. Patch by Jonas H. 
Modified: python/branches/py3k/Lib/ctypes/test/test_loading.py python/branches/py3k/Lib/ctypes/util.py python/branches/py3k/Misc/ACKS python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/ctypes/test/test_loading.py ============================================================================== --- python/branches/py3k/Lib/ctypes/test/test_loading.py (original) +++ python/branches/py3k/Lib/ctypes/test/test_loading.py Sat Feb 26 09:45:20 2011 @@ -102,5 +102,12 @@ # This is the real test: call the function via 'call_function' self.assertEqual(0, call_function(proc, (None,))) + if os.name != "nt": + def test_libc_exists(self): + # A basic test that the libc is found by find_library() + # XXX Can this fail on some non-Windows systems? + self.assertTrue(libc_name) + self.assertTrue(os.path.exists(libc_name)) + if __name__ == "__main__": unittest.main() Modified: python/branches/py3k/Lib/ctypes/util.py ============================================================================== --- python/branches/py3k/Lib/ctypes/util.py (original) +++ python/branches/py3k/Lib/ctypes/util.py Sat Feb 26 09:45:20 2011 @@ -203,14 +203,18 @@ abi_type = mach_map.get(machine, 'libc6') # XXX assuming GLIBC's ldconfig (with option -p) - expr = r'(\S+)\s+\((%s(?:, OS ABI:[^\)]*)?)\)[^/]*(/[^\(\)\s]*lib%s\.[^\(\)\s]*)' \ - % (abi_type, re.escape(name)) + name = 'lib%s' % name + pat = re.compile('\s*(/[^\(\)\s]*%s\.[^\(\)\s]*)' % re.escape(name)) with contextlib.closing(os.popen('LC_ALL=C LANG=C /sbin/ldconfig -p 2>/dev/null')) as f: - data = f.read() - res = re.search(expr, data) - if not res: - return None - return res.group(1) + for line in f: + if not '=>' in line: + continue + path = line.rsplit('=>', 1)[1] + if not name+'.' in path: + continue + res = pat.search(path) + if res: + return res.group(1) def find_library(name): return _findSoname_ldconfig(name) or _get_soname(_findLib_gcc(name)) Modified: python/branches/py3k/Misc/ACKS ============================================================================== --- python/branches/py3k/Misc/ACKS (original) +++ python/branches/py3k/Misc/ACKS Sat Feb 26 09:45:20 2011 @@ -329,6 +329,7 @@ Michael Guravage Lars Gust?bel Thomas G?ttler +Jonas H. Barry Haddow Paul ten Hagen Rasmus Hahn Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 26 09:45:20 2011 @@ -35,6 +35,9 @@ Library ------- +- Issue #11258: Speed up ctypes.util.find_library() under Linux a lot. Patch + by Jonas H. + - Issue #11297: Add collections.ChainMap(). - Issue #10755: Add the posix.fdlistdir() function. Patch by Ross Lagerwall. 
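The ``ctypes/util.py`` hunk above replaces a single large regular expression applied to the whole ``ldconfig -p`` output with a line-by-line scan; as the next message shows, r88640 reverts it because the optimisation changed behaviour and broke the buildbots. One visible difference in the diffs is that the new code returns the matched path portion of the line, where the old regex captured the leading soname token. A self-contained sketch of the line-by-line idea, not the ctypes.util code itself; the function name, the use of ``subprocess`` and the absence of any ABI/architecture filtering are simplifications for illustration:

    import re
    import subprocess

    def find_lib_path_sketch(name):
        # Look for the first "libNAME.*" entry printed by ldconfig -p.
        soname = 'lib%s.' % name
        pattern = re.compile(r'(/[^()\s]*%s[^()\s]*)' % re.escape(soname))
        try:
            out = subprocess.run(['/sbin/ldconfig', '-p'],
                                 capture_output=True, text=True,
                                 env={'LC_ALL': 'C', 'LANG': 'C'}).stdout
        except OSError:
            return None
        for line in out.splitlines():
            if '=>' not in line:
                continue
            path = line.rsplit('=>', 1)[1]   # keep only the filesystem path
            if soname not in path:
                continue
            match = pattern.search(path)
            if match:
                return match.group(1)
        return None

    # find_lib_path_sketch('c') typically yields something like
    # '/lib/x86_64-linux-gnu/libc.so.6' on a glibc-based system.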
From python-checkins at python.org Sat Feb 26 10:37:45 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 10:37:45 +0100 (CET) Subject: [Python-checkins] r88640 - in python/branches/py3k: Lib/ctypes/test/test_loading.py Lib/ctypes/util.py Misc/ACKS Misc/NEWS Message-ID: <20110226093745.94C5AEE986@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 10:37:45 2011 New Revision: 88640 Log: Revert r88639 (the optimization changes behaviour and breaks buildbots) Modified: python/branches/py3k/Lib/ctypes/test/test_loading.py python/branches/py3k/Lib/ctypes/util.py python/branches/py3k/Misc/ACKS python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/ctypes/test/test_loading.py ============================================================================== --- python/branches/py3k/Lib/ctypes/test/test_loading.py (original) +++ python/branches/py3k/Lib/ctypes/test/test_loading.py Sat Feb 26 10:37:45 2011 @@ -102,12 +102,5 @@ # This is the real test: call the function via 'call_function' self.assertEqual(0, call_function(proc, (None,))) - if os.name != "nt": - def test_libc_exists(self): - # A basic test that the libc is found by find_library() - # XXX Can this fail on some non-Windows systems? - self.assertTrue(libc_name) - self.assertTrue(os.path.exists(libc_name)) - if __name__ == "__main__": unittest.main() Modified: python/branches/py3k/Lib/ctypes/util.py ============================================================================== --- python/branches/py3k/Lib/ctypes/util.py (original) +++ python/branches/py3k/Lib/ctypes/util.py Sat Feb 26 10:37:45 2011 @@ -203,18 +203,14 @@ abi_type = mach_map.get(machine, 'libc6') # XXX assuming GLIBC's ldconfig (with option -p) - name = 'lib%s' % name - pat = re.compile('\s*(/[^\(\)\s]*%s\.[^\(\)\s]*)' % re.escape(name)) + expr = r'(\S+)\s+\((%s(?:, OS ABI:[^\)]*)?)\)[^/]*(/[^\(\)\s]*lib%s\.[^\(\)\s]*)' \ + % (abi_type, re.escape(name)) with contextlib.closing(os.popen('LC_ALL=C LANG=C /sbin/ldconfig -p 2>/dev/null')) as f: - for line in f: - if not '=>' in line: - continue - path = line.rsplit('=>', 1)[1] - if not name+'.' in path: - continue - res = pat.search(path) - if res: - return res.group(1) + data = f.read() + res = re.search(expr, data) + if not res: + return None + return res.group(1) def find_library(name): return _findSoname_ldconfig(name) or _get_soname(_findLib_gcc(name)) Modified: python/branches/py3k/Misc/ACKS ============================================================================== --- python/branches/py3k/Misc/ACKS (original) +++ python/branches/py3k/Misc/ACKS Sat Feb 26 10:37:45 2011 @@ -329,7 +329,6 @@ Michael Guravage Lars Gust?bel Thomas G?ttler -Jonas H. Barry Haddow Paul ten Hagen Rasmus Hahn Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 26 10:37:45 2011 @@ -35,9 +35,6 @@ Library ------- -- Issue #11258: Speed up ctypes.util.find_library() under Linux a lot. Patch - by Jonas H. - - Issue #11297: Add collections.ChainMap(). - Issue #10755: Add the posix.fdlistdir() function. Patch by Ross Lagerwall. From python-checkins at python.org Sat Feb 26 10:41:39 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 10:41:39 +0100 Subject: [Python-checkins] cpython: Spaces -> tab. 
Message-ID: martin.v.loewis pushed 8ff33af017ef to cpython: http://hg.python.org/cpython/rev/8ff33af017ef changeset: 68042:8ff33af017ef tag: tip parent: 68040:10cb0ba4fd71 user: Martin v. Löwis date: Sat Feb 26 10:40:56 2011 +0100 summary: Spaces -> tab. files: Lib/http/client.py diff --git a/Lib/http/client.py b/Lib/http/client.py --- a/Lib/http/client.py +++ b/Lib/http/client.py @@ -76,7 +76,7 @@ import warnings __all__ = ["HTTPResponse", "HTTPConnection", - "HTTPException", "NotConnected", "UnknownProtocol", + "HTTPException", "NotConnected", "UnknownProtocol", "UnknownTransferEncoding", "UnimplementedFileMode", "IncompleteRead", "InvalidURL", "ImproperConnectionState", "CannotSendRequest", "CannotSendHeader", "ResponseNotReady", -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 10:43:47 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 10:43:47 +0100 Subject: [Python-checkins] pymigr: Request whitespace checking hook. Message-ID: martin.v.loewis pushed 77c489b3889f to pymigr: http://hg.python.org/pymigr/rev/77c489b3889f changeset: 107:77c489b3889f tag: tip user: Martin v. Löwis date: Sat Feb 26 10:43:26 2011 +0100 summary: Request whitespace checking hook. files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -5,6 +5,7 @@ (issue links are done, rietveld integration pending) * resolve open issues in PEP 385 * (optionally) fight some more about workflow and decide the branch structure +* some hook should prevent pushing python files indented by tabs. Done ==== -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sat Feb 26 10:45:52 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 10:45:52 +0100 Subject: [Python-checkins] cpython (trunk): Start working on Python 2.8 :-) Message-ID: martin.v.loewis pushed 5ef18f50529a to cpython: http://hg.python.org/cpython/rev/5ef18f50529a changeset: 68043:5ef18f50529a branch: trunk tag: tip parent: 68041:41071f447b15 user: Martin v. Löwis date: Sat Feb 26 10:45:45 2011 +0100 summary: Start working on Python 2.8 :-) files: Misc/NEWS diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -4,6 +4,11 @@ (editors: check NEWS.help for information about editing NEWS using ReST.) +What's New in Python 2.8? +========================= + +*Release date: 2010-07-03* + What's New in Python 2.7? ========================= -- Repository URL: http://hg.python.org/cpython From martin at v.loewis.de Sat Feb 26 10:52:02 2011 From: martin at v.loewis.de (=?UTF-8?B?Ik1hcnRpbiB2LiBMw7Z3aXMi?=) Date: Sat, 26 Feb 2011 10:52:02 +0100 Subject: [Python-checkins] cpython (trunk): Close the "trunk" branch. In-Reply-To: References: Message-ID: <4D68CD42.5050005@v.loewis.de> > http://hg.python.org/cpython/rev/41071f447b15 > changeset: 68041:41071f447b15 > branch: trunk > tag: tip > parent: 62750:800f6c92c3ed > user: Georg Brandl > date: Sat Feb 26 07:48:21 2011 +0100 > summary: > Close the "trunk" branch. What's the effect of "closing" a branch in Mercurial? I can still commit to it, hg branches still shows it (but shows 3.2 as "inactive"???). How can I find out that the branch is closed? Regards, Martin From solipsis at pitrou.net Sat Feb 26 11:06:12 2011 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 26 Feb 2011 11:06:12 +0100 Subject: [Python-checkins] cpython (trunk): Close the "trunk" branch.
References: <4D68CD42.5050005@v.loewis.de> Message-ID: <20110226110612.2766c0da@pitrou.net> On Sat, 26 Feb 2011 10:52:02 +0100 "Martin v. L?wis" wrote: > > http://hg.python.org/cpython/rev/41071f447b15 > > changeset: 68041:41071f447b15 > > branch: trunk > > tag: tip > > parent: 62750:800f6c92c3ed > > user: Georg Brandl > > date: Sat Feb 26 07:48:21 2011 +0100 > > summary: > > Close the "trunk" branch. > > What's the effect of "closing" a branch in Mercurial? > I can still commit to it, hg branches still shows it > (but shows 3.2 as "inactive"???). How can I find out > that the branch is closed? Committing reopened it, otherwise the branch would have been hidden in "hg branches" (and all its heads in "hg heads"). A branch labeled "inactive" is a branch which is a strict ancestor of another one. Here, Georg did a merge of 3.2 into default, so all changesets of 3.2 are considered including in default. Regards Antoine. From python-checkins at python.org Sat Feb 26 11:13:56 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:13:56 +0100 Subject: [Python-checkins] cpython (2.0): Close branch 2.0 Message-ID: antoine.pitrou pushed 1bc67d5f2e24 to cpython: http://hg.python.org/cpython/rev/1bc67d5f2e24 changeset: 68044:1bc67d5f2e24 branch: 2.0 parent: 18214:dc0dfc9565cd user: Antoine Pitrou date: Sat Feb 26 11:12:49 2011 +0100 summary: Close branch 2.0 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 11:13:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:13:57 +0100 Subject: [Python-checkins] cpython (2.1): Close branch 2.1 Message-ID: antoine.pitrou pushed 4ca451b423ea to cpython: http://hg.python.org/cpython/rev/4ca451b423ea changeset: 68045:4ca451b423ea branch: 2.1 parent: 30171:06fcccf6eca8 user: Antoine Pitrou date: Sat Feb 26 11:13:01 2011 +0100 summary: Close branch 2.1 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 11:13:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:13:57 +0100 Subject: [Python-checkins] cpython (2.2): Close branch 2.2 Message-ID: antoine.pitrou pushed 77893592f199 to cpython: http://hg.python.org/cpython/rev/77893592f199 changeset: 68046:77893592f199 branch: 2.2 parent: 40444:d55ddc8c8501 user: Antoine Pitrou date: Sat Feb 26 11:13:08 2011 +0100 summary: Close branch 2.2 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 11:13:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:13:57 +0100 Subject: [Python-checkins] cpython (2.3): Close branch 2.3 Message-ID: antoine.pitrou pushed 01bdf9560ce9 to cpython: http://hg.python.org/cpython/rev/01bdf9560ce9 changeset: 68047:01bdf9560ce9 branch: 2.3 parent: 45731:a3d9a9730743 user: Antoine Pitrou date: Sat Feb 26 11:13:16 2011 +0100 summary: Close branch 2.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 11:13:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:13:57 +0100 Subject: [Python-checkins] cpython (2.4): Close branch 2.4 Message-ID: antoine.pitrou pushed 8c40ae4269e3 to cpython: http://hg.python.org/cpython/rev/8c40ae4269e3 changeset: 68048:8c40ae4269e3 branch: 2.4 parent: 58552:df72cac1899e user: Antoine Pitrou date: Sat Feb 26 11:13:23 2011 +0100 summary: Close branch 2.4 files: -- Repository URL: http://hg.python.org/cpython From 
python-checkins at python.org Sat Feb 26 11:13:57 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:13:57 +0100 Subject: [Python-checkins] cpython (3.0): Close branch 3.0 Message-ID: antoine.pitrou pushed 338a9e576330 to cpython: http://hg.python.org/cpython/rev/338a9e576330 changeset: 68049:338a9e576330 branch: 3.0 tag: tip parent: 60075:1d05144224fe user: Antoine Pitrou date: Sat Feb 26 11:13:41 2011 +0100 summary: Close branch 3.0 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 11:16:08 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:16:08 +0100 Subject: [Python-checkins] devguide (hg_transition): Add a FAQ entry about "inactive" branches Message-ID: antoine.pitrou pushed eda855685396 to devguide: http://hg.python.org/devguide/rev/eda855685396 changeset: 318:eda855685396 branch: hg_transition user: Antoine Pitrou date: Sat Feb 26 11:15:12 2011 +0100 summary: Add a FAQ entry about "inactive" branches files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -119,6 +119,19 @@ 2.0 18214:dc0dfc9565cd +Why are some branches marked "inactive"? +---------------------------------------- + +Assuming you get the following output:: + + $ hg branches + default 68042:8ff33af017ef + 3.2 68039:c17d7772c638 (inactive) + +This means all changesets in the "3.2" branch have been merged into the +"default" branch (or any other branch, if such exists). + + Which branch is currently checked out in my working copy? --------------------------------------------------------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 26 11:16:09 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:16:09 +0100 Subject: [Python-checkins] devguide (hg_transition): Trim example to match what we assume will be the final output Message-ID: antoine.pitrou pushed 1df9ad248418 to devguide: http://hg.python.org/devguide/rev/1df9ad248418 changeset: 319:1df9ad248418 branch: hg_transition tag: tip user: Antoine Pitrou date: Sat Feb 26 11:16:01 2011 +0100 summary: Trim example to match what we assume will be the final output files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -110,13 +110,6 @@ 3.1 67955:5be8b695ea86 2.6 67287:5e26a860eded 2.5 65464:e4ecac76e499 - trunk 62750:800f6c92c3ed - 3.0 60075:1d05144224fe - 2.4 58552:df72cac1899e - 2.3 45731:a3d9a9730743 - 2.2 40444:d55ddc8c8501 - 2.1 30171:06fcccf6eca8 - 2.0 18214:dc0dfc9565cd Why are some branches marked "inactive"? -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 26 11:30:10 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 11:30:10 +0100 Subject: [Python-checkins] devguide (hg_transition): Encourage people to explore the extensions, tips, configuration... Message-ID: antoine.pitrou pushed 23e42f74324a to devguide: http://hg.python.org/devguide/rev/23e42f74324a changeset: 320:23e42f74324a branch: hg_transition tag: tip user: Antoine Pitrou date: Sat Feb 26 11:30:08 2011 +0100 summary: Encourage people to explore the extensions, tips, configuration... files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -474,6 +474,28 @@ hg update +How come feature XYZ isn't available in Mercurial? +-------------------------------------------------- + +Mercurial comes with many bundled extensions which can be explicitly enabled. 
+You can get a list of them by typing ``hg help extensions``. Some of these +extensions, such as ``color``, can prettify output; others, such as ``fetch`` +or ``transplant``, add new Mercurial commands. + +There are also many `configuration options`_ to tweak various aspects of the +command line and other Mercurial behaviour; typing ``man hgrc`` displays +their documentation inside your terminal. + +In the end, please refer to the Mercurial `wiki`_, especially the pages about +`extensions`_ (including third-party ones) and the `tips and tricks`_. + + +.. _wiki: http://mercurial.selenic.com/wiki/ +.. _extensions: http://mercurial.selenic.com/wiki/UsingExtensions +.. _tips and tricks: http://mercurial.selenic.com/wiki/TipsAndTricks +.. _configuration options: http://www.selenic.com/mercurial/hgrc.5.html + + SSH ======= -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 26 13:50:47 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 26 Feb 2011 13:50:47 +0100 Subject: [Python-checkins] devguide (hg_transition): Add links to HTML versions of man pages Message-ID: eric.araujo pushed 3f9d973e51f4 to devguide: http://hg.python.org/devguide/rev/3f9d973e51f4 changeset: 321:3f9d973e51f4 branch: hg_transition tag: tip user: ?ric Araujo date: Sat Feb 26 13:50:32 2011 +0100 summary: Add links to HTML versions of man pages files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -9,7 +9,7 @@ Where can I learn about the version control system used, Mercurial (hg)? ------------------------------------------------------------------------------- -`Mercurial`_'s (also known as ``hg``) official web site is at +Mercurial_'s (also known as ``hg``) official web site is at http://mercurial.selenic.com/. A book on Mercurial published by `O'Reilly Media`_, `Mercurial: The Definitive Guide`_, is available for free online. @@ -19,13 +19,14 @@ hg help -The man page for ``hg`` provides a quick refresher on the details of +The `man page`_ for ``hg`` provides a quick refresher on the details of various commands, but doesn't provide any guidance on overall workflow. .. _Mercurial: http://mercurial.selenic.com/ .. _O'Reilly Media: http://www.oreilly.com/ .. _Mercurial\: The Definitive Guide: http://hgbook.red-bean.com/ +.. _man page: http://www.selenic.com/mercurial/hg.1.html What do I need to use Mercurial? @@ -483,13 +484,14 @@ or ``transplant``, add new Mercurial commands. There are also many `configuration options`_ to tweak various aspects of the -command line and other Mercurial behaviour; typing ``man hgrc`` displays +command line and other Mercurial behaviour; typing `man hgrc`_ displays their documentation inside your terminal. In the end, please refer to the Mercurial `wiki`_, especially the pages about `extensions`_ (including third-party ones) and the `tips and tricks`_. +.. _man hgrc: http://www.selenic.com/mercurial/hgrc.5.html .. _wiki: http://mercurial.selenic.com/wiki/ .. _extensions: http://mercurial.selenic.com/wiki/UsingExtensions .. 
_tips and tricks: http://mercurial.selenic.com/wiki/TipsAndTricks -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sat Feb 26 14:02:50 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 26 Feb 2011 14:02:50 +0100 (CET) Subject: [Python-checkins] r88641 - python/branches/release31-maint/Misc/maintainers.rst Message-ID: <20110226130250.34BCBEE9BA@mail.python.org> Author: eric.araujo Date: Sat Feb 26 14:02:50 2011 New Revision: 88641 Log: Final update and deprecation of maintainers.rst. This file has been removed from 3.2 and moved to the devguide. I think the safe choice for the stable branches is to keep it with a warning and link. Modified: python/branches/release31-maint/Misc/maintainers.rst Modified: python/branches/release31-maint/Misc/maintainers.rst ============================================================================== --- python/branches/release31-maint/Misc/maintainers.rst (original) +++ python/branches/release31-maint/Misc/maintainers.rst Sat Feb 26 14:02:50 2011 @@ -1,6 +1,11 @@ Maintainers Index ================= +.. warning:: + + This document is out of date and replaced by another version in the + developpers' guide at http://docs.python.org/devguide/experts. + This document has tables that list Python Modules, Tools, Platforms and Interest Areas and names for each item that indicate a maintainer or an expert in the field. This list is intended to be used by issue submitters, @@ -13,7 +18,7 @@ Unless a name is followed by a '*', you should never assign an issue to that person, only make them nosy. Names followed by a '*' may be assigned -issues involving the module or topic for which the name has a '*'. +issues involving the module or topic. The Platform and Interest Area tables list broader fields in which various people have expertise. These people can also be contacted for help, @@ -77,18 +82,18 @@ colorsys compileall configparser lukasz.langa -contextlib +contextlib ncoghlan copy alexandre.vassalotti copyreg alexandre.vassalotti cProfile -crypt +crypt jafo* csv skip.montanaro ctypes theller curses datetime belopolsky dbm decimal facundobatista, rhettinger, mark.dickinson -difflib tim_one +difflib tim_one (inactive) dis distutils tarek*, eric.araujo* doctest tim_one (inactive) @@ -145,7 +150,7 @@ multiprocessing jnoller netrc nis -nntplib +nntplib pitrou numbers operator optparse aronacher @@ -204,8 +209,8 @@ symtable benjamin.peterson sys sysconfig tarek -syslog jafo -tabnanny tim_one +syslog jafo* +tabnanny tim_one (inactive) tarfile lars.gustaebel telnetlib tempfile georg.brandl From python-checkins at python.org Sat Feb 26 14:05:20 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 26 Feb 2011 14:05:20 +0100 (CET) Subject: [Python-checkins] r88642 - python/branches/release27-maint/Misc/maintainers.rst Message-ID: <20110226130520.9D361EE9BA@mail.python.org> Author: eric.araujo Date: Sat Feb 26 14:05:20 2011 New Revision: 88642 Log: Final update and deprecation of maintainers.rst. This file has been removed from 3.2 and moved to the devguide. I think the safe choice for the stable branches is to keep it with a warning and link. 
Modified: python/branches/release27-maint/Misc/maintainers.rst Modified: python/branches/release27-maint/Misc/maintainers.rst ============================================================================== --- python/branches/release27-maint/Misc/maintainers.rst (original) +++ python/branches/release27-maint/Misc/maintainers.rst Sat Feb 26 14:05:20 2011 @@ -1,6 +1,11 @@ Maintainers Index ================= +.. warning:: + + This document is out of date and replaced by another version in the + developpers' guide at http://docs.python.org/devguide/experts. + This document has tables that list Python Modules, Tools, Platforms and Interest Areas and names for each item that indicate a maintainer or an expert in the field. This list is intended to be used by issue submitters, @@ -13,7 +18,7 @@ Unless a name is followed by a '*', you should never assign an issue to that person, only make them nosy. Names followed by a '*' may be assigned -issues involving the module or topic for which the name has a '*'. +issues involving the module or topic. The Platform and Interest Area tables list broader fields in which various people have expertise. These people can also be contacted for help, @@ -80,18 +85,18 @@ colorsys compileall ConfigParser lukasz.langa -contextlib +contextlib ncoghlan copy alexandre.vassalotti copy_reg alexandre.vassalotti cProfile -crypt +crypt jafo* csv skip.montanaro ctypes theller curses datetime belopolsky dbm decimal facundobatista, rhettinger, mark.dickinson -difflib tim_one +difflib tim_one (inactive) dis distutils tarek*, eric.araujo* doctest tim_one (inactive) @@ -150,7 +155,7 @@ multiprocessing jnoller netrc nis -nntplib +nntplib pitrou numbers operator optparse aronacher @@ -210,8 +215,8 @@ symtable benjamin.peterson sys sysconfig tarek -syslog jafo -tabnanny tim_one +syslog jafo* +tabnanny tim_one (inactive) tarfile lars.gustaebel telnetlib tempfile georg.brandl From python-checkins at python.org Sat Feb 26 14:38:36 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 14:38:36 +0100 (CET) Subject: [Python-checkins] r88643 - python/branches/py3k/Modules/posixmodule.c Message-ID: <20110226133836.12051EE99F@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 14:38:35 2011 New Revision: 88643 Log: Check error return from _parse_off_t(), and remove cruft from the 2->3 transition. Modified: python/branches/py3k/Modules/posixmodule.c Modified: python/branches/py3k/Modules/posixmodule.c ============================================================================== --- python/branches/py3k/Modules/posixmodule.c (original) +++ python/branches/py3k/Modules/posixmodule.c Sat Feb 26 14:38:35 2011 @@ -369,8 +369,7 @@ #if !defined(HAVE_LARGEFILE_SUPPORT) *((off_t*)addr) = PyLong_AsLong(arg); #else - *((off_t*)addr) = PyLong_Check(arg) ? PyLong_AsLongLong(arg) - : PyLong_AsLong(arg); + *((off_t*)addr) = PyLong_AsLongLong(arg); #endif if (PyErr_Occurred()) return 0; @@ -5772,8 +5771,7 @@ #if !defined(HAVE_LARGEFILE_SUPPORT) pos = PyLong_AsLong(posobj); #else - pos = PyLong_Check(posobj) ? 
- PyLong_AsLongLong(posobj) : PyLong_AsLong(posobj); + pos = PyLong_AsLongLong(posobj); #endif if (PyErr_Occurred()) return NULL; @@ -6030,7 +6028,8 @@ return Py_BuildValue("nO", ret, Py_None); } #endif - _parse_off_t(offobj, &offset); + if (!_parse_off_t(offobj, &offset)) + return NULL; Py_BEGIN_ALLOW_THREADS ret = sendfile(out, in, &offset, count); Py_END_ALLOW_THREADS From python-checkins at python.org Sat Feb 26 15:15:55 2011 From: python-checkins at python.org (vinay.sajip) Date: Sat, 26 Feb 2011 15:15:55 +0100 (CET) Subject: [Python-checkins] r88644 - in python/branches: py3k/Lib/logging/__init__.py release32-maint/Lib/logging/__init__.py Message-ID: <20110226141555.7D56CEE9BA@mail.python.org> Author: vinay.sajip Date: Sat Feb 26 15:15:48 2011 New Revision: 88644 Log: Issue #11330: asctime format bug fixed. Modified: python/branches/py3k/Lib/logging/__init__.py python/branches/release32-maint/Lib/logging/__init__.py Modified: python/branches/py3k/Lib/logging/__init__.py ============================================================================== --- python/branches/py3k/Lib/logging/__init__.py (original) +++ python/branches/py3k/Lib/logging/__init__.py Sat Feb 26 15:15:48 2011 @@ -360,12 +360,13 @@ default_format = '%(message)s' asctime_format = '%(asctime)s' + asctime_search = '%(asctime)' def __init__(self, fmt): self._fmt = fmt or self.default_format def usesTime(self): - return self._fmt.find(self.asctime_format) >= 0 + return self._fmt.find(self.asctime_search) >= 0 def format(self, record): return self._fmt % record.__dict__ @@ -373,6 +374,7 @@ class StrFormatStyle(PercentStyle): default_format = '{message}' asctime_format = '{asctime}' + asctime_search = '{asctime' def format(self, record): return self._fmt.format(**record.__dict__) @@ -381,6 +383,7 @@ class StringTemplateStyle(PercentStyle): default_format = '${message}' asctime_format = '${asctime}' + asctime_search = '{asctime' def __init__(self, fmt): self._fmt = fmt or self.default_format Modified: python/branches/release32-maint/Lib/logging/__init__.py ============================================================================== --- python/branches/release32-maint/Lib/logging/__init__.py (original) +++ python/branches/release32-maint/Lib/logging/__init__.py Sat Feb 26 15:15:48 2011 @@ -360,12 +360,13 @@ default_format = '%(message)s' asctime_format = '%(asctime)s' + asctime_search = '%(asctime)' def __init__(self, fmt): self._fmt = fmt or self.default_format def usesTime(self): - return self._fmt.find(self.asctime_format) >= 0 + return self._fmt.find(self.asctime_search) >= 0 def format(self, record): return self._fmt % record.__dict__ @@ -373,6 +374,7 @@ class StrFormatStyle(PercentStyle): default_format = '{message}' asctime_format = '{asctime}' + asctime_search = '{asctime' def format(self, record): return self._fmt.format(**record.__dict__) @@ -381,6 +383,7 @@ class StringTemplateStyle(PercentStyle): default_format = '${message}' asctime_format = '${asctime}' + asctime_search = '{asctime' def __init__(self, fmt): self._fmt = fmt or self.default_format From python-checkins at python.org Sat Feb 26 15:24:30 2011 From: python-checkins at python.org (vinay.sajip) Date: Sat, 26 Feb 2011 15:24:30 +0100 (CET) Subject: [Python-checkins] r88645 - in python/branches: py3k/Doc/howto/logging-cookbook.rst release32-maint/Doc/howto/logging-cookbook.rst Message-ID: <20110226142430.13F5BEE9C3@mail.python.org> Author: vinay.sajip Date: Sat Feb 26 15:24:29 2011 New Revision: 88645 Log: Issue #11331: fixed documentation in logging 
cookbook. Modified: python/branches/py3k/Doc/howto/logging-cookbook.rst python/branches/release32-maint/Doc/howto/logging-cookbook.rst Modified: python/branches/py3k/Doc/howto/logging-cookbook.rst ============================================================================== --- python/branches/py3k/Doc/howto/logging-cookbook.rst (original) +++ python/branches/py3k/Doc/howto/logging-cookbook.rst Sat Feb 26 15:24:29 2011 @@ -630,8 +630,6 @@ if __name__ == '__main__': levels = (logging.DEBUG, logging.INFO, logging.WARNING, logging.ERROR, logging.CRITICAL) - a1 = logging.LoggerAdapter(logging.getLogger('a.b.c'), - { 'ip' : '123.231.231.123', 'user' : 'sheila' }) logging.basicConfig(level=logging.DEBUG, format='%(asctime)-15s %(name)-5s %(levelname)-8s IP: %(ip)-15s User: %(user)-8s %(message)s') a1 = logging.getLogger('a.b.c') Modified: python/branches/release32-maint/Doc/howto/logging-cookbook.rst ============================================================================== --- python/branches/release32-maint/Doc/howto/logging-cookbook.rst (original) +++ python/branches/release32-maint/Doc/howto/logging-cookbook.rst Sat Feb 26 15:24:29 2011 @@ -630,8 +630,6 @@ if __name__ == '__main__': levels = (logging.DEBUG, logging.INFO, logging.WARNING, logging.ERROR, logging.CRITICAL) - a1 = logging.LoggerAdapter(logging.getLogger('a.b.c'), - { 'ip' : '123.231.231.123', 'user' : 'sheila' }) logging.basicConfig(level=logging.DEBUG, format='%(asctime)-15s %(name)-5s %(levelname)-8s IP: %(ip)-15s User: %(user)-8s %(message)s') a1 = logging.getLogger('a.b.c') From python-checkins at python.org Sat Feb 26 15:28:36 2011 From: python-checkins at python.org (vinay.sajip) Date: Sat, 26 Feb 2011 15:28:36 +0100 (CET) Subject: [Python-checkins] r88646 - in python/branches: py3k/Lib/logging/__init__.py release32-maint/Lib/logging/__init__.py Message-ID: <20110226142836.C8A62EE9C3@mail.python.org> Author: vinay.sajip Date: Sat Feb 26 15:28:36 2011 New Revision: 88646 Log: Removed typo. 
Modified: python/branches/py3k/Lib/logging/__init__.py python/branches/release32-maint/Lib/logging/__init__.py Modified: python/branches/py3k/Lib/logging/__init__.py ============================================================================== --- python/branches/py3k/Lib/logging/__init__.py (original) +++ python/branches/py3k/Lib/logging/__init__.py Sat Feb 26 15:28:36 2011 @@ -383,7 +383,7 @@ class StringTemplateStyle(PercentStyle): default_format = '${message}' asctime_format = '${asctime}' - asctime_search = '{asctime' + asctime_search = '${asctime' def __init__(self, fmt): self._fmt = fmt or self.default_format Modified: python/branches/release32-maint/Lib/logging/__init__.py ============================================================================== --- python/branches/release32-maint/Lib/logging/__init__.py (original) +++ python/branches/release32-maint/Lib/logging/__init__.py Sat Feb 26 15:28:36 2011 @@ -383,7 +383,7 @@ class StringTemplateStyle(PercentStyle): default_format = '${message}' asctime_format = '${asctime}' - asctime_search = '{asctime' + asctime_search = '${asctime' def __init__(self, fmt): self._fmt = fmt or self.default_format From python-checkins at python.org Sat Feb 26 15:29:24 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 15:29:24 +0100 (CET) Subject: [Python-checkins] r88647 - python/branches/py3k/Lib/test/test_os.py Message-ID: <20110226142924.AFA01EE9C3@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 15:29:24 2011 New Revision: 88647 Log: Issue #11323: fix sendfile tests under 64-bit Solaris. Modified: python/branches/py3k/Lib/test/test_os.py Modified: python/branches/py3k/Lib/test/test_os.py ============================================================================== --- python/branches/py3k/Lib/test/test_os.py (original) +++ python/branches/py3k/Lib/test/test_os.py Sat Feb 26 15:29:24 2011 @@ -1374,7 +1374,7 @@ @unittest.skipUnless(hasattr(os, 'sendfile'), "test needs os.sendfile()") class TestSendfile(unittest.TestCase): - DATA = b"12345abcde" * 1024 * 1024 # 10 Mb + DATA = b"12345abcde" * 16 * 1024 # 160 KB SUPPORT_HEADERS_TRAILERS = not sys.platform.startswith("linux") and \ not sys.platform.startswith("solaris") and \ not sys.platform.startswith("sunos") @@ -1432,7 +1432,7 @@ total_sent = 0 offset = 0 nbytes = 4096 - while 1: + while total_sent < len(self.DATA): sent = self.sendfile_wrapper(self.sockno, self.fileno, offset, nbytes) if sent == 0: break @@ -1445,14 +1445,15 @@ self.client.close() self.server.wait() data = self.server.handler_instance.get_data() - self.assertEqual(hash(data), hash(self.DATA)) + self.assertEqual(data, self.DATA) def test_send_at_certain_offset(self): # start sending a file at a certain offset total_sent = 0 - offset = len(self.DATA) / 2 + offset = len(self.DATA) // 2 + must_send = len(self.DATA) - offset nbytes = 4096 - while 1: + while total_sent < must_send: sent = self.sendfile_wrapper(self.sockno, self.fileno, offset, nbytes) if sent == 0: break @@ -1463,15 +1464,21 @@ self.client.close() self.server.wait() data = self.server.handler_instance.get_data() - expected = self.DATA[int(len(self.DATA) / 2):] + expected = self.DATA[len(self.DATA) // 2:] self.assertEqual(total_sent, len(expected)) - self.assertEqual(hash(data), hash(expected)) + self.assertEqual(data, expected) def test_offset_overflow(self): # specify an offset > file size offset = len(self.DATA) + 4096 - sent = os.sendfile(self.sockno, self.fileno, offset, 4096) - self.assertEqual(sent, 0) + try: + sent = 
os.sendfile(self.sockno, self.fileno, offset, 4096) + except OSError as e: + # Solaris can raise EINVAL if offset >= file length, ignore. + if e.errno != errno.EINVAL: + raise + else: + self.assertEqual(sent, 0) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() From eric at trueblade.com Sat Feb 26 15:25:34 2011 From: eric at trueblade.com (Eric Smith) Date: Sat, 26 Feb 2011 09:25:34 -0500 Subject: [Python-checkins] r88644 - in python/branches: py3k/Lib/logging/__init__.py release32-maint/Lib/logging/__init__.py In-Reply-To: <20110226141555.7D56CEE9BA@mail.python.org> References: <20110226141555.7D56CEE9BA@mail.python.org> Message-ID: <4D690D5E.2050401@trueblade.com> On 2/26/2011 9:15 AM, vinay.sajip wrote: > class StringTemplateStyle(PercentStyle): > default_format = '${message}' > asctime_format = '${asctime}' > + asctime_search = '{asctime' Not that it probably matters much, but shouldn't that be '${asctime' (with a leading $)? From python-checkins at python.org Sat Feb 26 15:48:42 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 26 Feb 2011 15:48:42 +0100 (CET) Subject: [Python-checkins] r88648 - python/branches/release31-maint/Misc/maintainers.rst Message-ID: <20110226144842.66B44EEA1C@mail.python.org> Author: eric.araujo Date: Sat Feb 26 15:48:42 2011 New Revision: 88648 Log: Typos Modified: python/branches/release31-maint/Misc/maintainers.rst Modified: python/branches/release31-maint/Misc/maintainers.rst ============================================================================== --- python/branches/release31-maint/Misc/maintainers.rst (original) +++ python/branches/release31-maint/Misc/maintainers.rst Sat Feb 26 15:48:42 2011 @@ -4,7 +4,7 @@ .. warning:: This document is out of date and replaced by another version in the - developpers' guide at http://docs.python.org/devguide/experts. + developer's guide at http://docs.python.org/devguide/experts This document has tables that list Python Modules, Tools, Platforms and Interest Areas and names for each item that indicate a maintainer or an From python-checkins at python.org Sat Feb 26 15:49:04 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 26 Feb 2011 15:49:04 +0100 (CET) Subject: [Python-checkins] r88649 - python/branches/release27-maint/Misc/maintainers.rst Message-ID: <20110226144904.365F6EEA35@mail.python.org> Author: eric.araujo Date: Sat Feb 26 15:49:04 2011 New Revision: 88649 Log: Typos Modified: python/branches/release27-maint/Misc/maintainers.rst Modified: python/branches/release27-maint/Misc/maintainers.rst ============================================================================== --- python/branches/release27-maint/Misc/maintainers.rst (original) +++ python/branches/release27-maint/Misc/maintainers.rst Sat Feb 26 15:49:04 2011 @@ -4,7 +4,7 @@ .. warning:: This document is out of date and replaced by another version in the - developpers' guide at http://docs.python.org/devguide/experts. 
+ developer's guide at http://docs.python.org/devguide/experts This document has tables that list Python Modules, Tools, Platforms and Interest Areas and names for each item that indicate a maintainer or an From python-checkins at python.org Sat Feb 26 15:57:23 2011 From: python-checkins at python.org (eric.araujo) Date: Sat, 26 Feb 2011 15:57:23 +0100 (CET) Subject: [Python-checkins] r88650 - in python/branches/py3k/Doc: bugs.rst faq/general.rst using/unix.rst using/windows.rst Message-ID: <20110226145723.30F12EEA3B@mail.python.org> Author: eric.araujo Date: Sat Feb 26 15:57:23 2011 New Revision: 88650 Log: Replace links to the old dev doc with links to the new devguide. Modified: python/branches/py3k/Doc/bugs.rst python/branches/py3k/Doc/faq/general.rst python/branches/py3k/Doc/using/unix.rst python/branches/py3k/Doc/using/windows.rst Modified: python/branches/py3k/Doc/bugs.rst ============================================================================== --- python/branches/py3k/Doc/bugs.rst (original) +++ python/branches/py3k/Doc/bugs.rst Sat Feb 26 15:57:23 2011 @@ -57,12 +57,14 @@ Each bug report will be assigned to a developer who will determine what needs to be done to correct the problem. You will receive an update each time action is -taken on the bug. See http://www.python.org/dev/workflow/ for a detailed -description of the issue workflow. +taken on the bug. .. seealso:: + `Python Developer's Guide `_ + Detailed description of the issue workflow and developers tools. + `How to Report Bugs Effectively `_ Article which goes into some detail about how to create a useful bug report. This describes what kind of information is useful and why it is useful. Modified: python/branches/py3k/Doc/faq/general.rst ============================================================================== --- python/branches/py3k/Doc/faq/general.rst (original) +++ python/branches/py3k/Doc/faq/general.rst Sat Feb 26 15:57:23 2011 @@ -166,7 +166,7 @@ .. XXX update link once the dev faq is relocated -Consult the `Developer FAQ `__ for more +Consult the `Developer FAQ `__ for more information on getting the source code and compiling it. @@ -224,7 +224,7 @@ .. XXX update link once the dev faq is relocated You can also access the development version of Python through Subversion. See -http://www.python.org/dev/faq/ for details. +http://docs.python.org/devguide/faq for details. How do I submit bug reports and patches for Python? @@ -242,7 +242,7 @@ .. XXX adapt link to dev guide For more information on how Python is developed, consult `the Python Developer's -Guide `_. +Guide `_. Are there any published articles about Python that I can reference? Modified: python/branches/py3k/Doc/using/unix.rst ============================================================================== --- python/branches/py3k/Doc/using/unix.rst (original) +++ python/branches/py3k/Doc/using/unix.rst Sat Feb 26 15:57:23 2011 @@ -66,7 +66,7 @@ If you want to compile CPython yourself, first thing you should do is get the `source `_. You can download either the latest release's source or just grab a fresh `checkout -`_. +`_. The build process consists the usual :: Modified: python/branches/py3k/Doc/using/windows.rst ============================================================================== --- python/branches/py3k/Doc/using/windows.rst (original) +++ python/branches/py3k/Doc/using/windows.rst Sat Feb 26 15:57:23 2011 @@ -290,7 +290,7 @@ If you want to compile CPython yourself, first thing you should do is get the `source `_. 
You can download either the latest release's source or just grab a fresh `checkout -`_. +`_. For Microsoft Visual C++, which is the compiler with which official Python releases are built, the source tree contains solutions/project files. View the From python-checkins at python.org Sat Feb 26 16:35:38 2011 From: python-checkins at python.org (vinay.sajip) Date: Sat, 26 Feb 2011 16:35:38 +0100 (CET) Subject: [Python-checkins] r88651 - python/branches/release32-maint/Lib/test/test_logging.py Message-ID: <20110226153538.86C66EE99F@mail.python.org> Author: vinay.sajip Date: Sat Feb 26 16:35:38 2011 New Revision: 88651 Log: Issue #9941: Fixed TimedRotatingHandler test issues. Modified: python/branches/release32-maint/Lib/test/test_logging.py Modified: python/branches/release32-maint/Lib/test/test_logging.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_logging.py (original) +++ python/branches/release32-maint/Lib/test/test_logging.py Sat Feb 26 16:35:38 2011 @@ -1,6 +1,6 @@ #!/usr/bin/env python # -# Copyright 2001-2010 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2011 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -18,7 +18,7 @@ """Test harness for the logging module. Run all tests. -Copyright (C) 2001-2010 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2011 Vinay Sajip. All Rights Reserved. """ import logging @@ -2044,13 +2044,43 @@ ('M', 60), ('H', 60 * 60), ('D', 60 * 60 * 24), - ('MIDNIGHT', 60 * 60 * 23), + ('MIDNIGHT', 60 * 60 * 24), # current time (epoch start) is a Thursday, W0 means Monday - ('W0', secs(days=4, hours=23)),): + ('W0', secs(days=4, hours=24)), + ): def test_compute_rollover(self, when=when, exp=exp): rh = logging.handlers.TimedRotatingFileHandler( - self.fn, when=when, interval=1, backupCount=0) - self.assertEqual(exp, rh.computeRollover(0.0)) + self.fn, when=when, interval=1, backupCount=0, utc=True) + currentTime = 0.0 + actual = rh.computeRollover(currentTime) + if exp != actual: + # Failures occur on some systems for MIDNIGHT and W0. 
+ # Print detailed calculation for MIDNIGHT so we can try to see + # what's going on + import time + if when == 'MIDNIGHT': + try: + if rh.utc: + t = time.gmtime(currentTime) + else: + t = time.localtime(currentTime) + currentHour = t[3] + currentMinute = t[4] + currentSecond = t[5] + # r is the number of seconds left between now and midnight + r = logging.handlers._MIDNIGHT - ((currentHour * 60 + + currentMinute) * 60 + + currentSecond) + result = currentTime + r + print('t: %s (%s)' % (t, rh.utc), file=sys.stderr) + print('currentHour: %s' % currentHour, file=sys.stderr) + print('currentMinute: %s' % currentMinute, file=sys.stderr) + print('currentSecond: %s' % currentSecond, file=sys.stderr) + print('r: %s' % r, file=sys.stderr) + print('result: %s' % result, file=sys.stderr) + except Exception: + print('exception in diagnostic code: %s' % sys.exc_info()[1], file=sys.stderr) + self.assertEqual(exp, actual) rh.close() setattr(TimedRotatingFileHandlerTest, "test_compute_rollover_%s" % when, test_compute_rollover) From python-checkins at python.org Sat Feb 26 16:58:05 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 16:58:05 +0100 (CET) Subject: [Python-checkins] r88652 - in python/branches/py3k: Lib/test/support.py Misc/NEWS Message-ID: <20110226155805.AB4C8EEA1A@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 16:58:05 2011 New Revision: 88652 Log: Issue #9931: Fix hangs in GUI tests under Windows in certain conditions. Patch by Hirokazu Yamamoto. Modified: python/branches/py3k/Lib/test/support.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/test/support.py ============================================================================== --- python/branches/py3k/Lib/test/support.py (original) +++ python/branches/py3k/Lib/test/support.py Sat Feb 26 16:58:05 2011 @@ -233,6 +233,36 @@ unlink(imp.cache_from_source(source, debug_override=True)) unlink(imp.cache_from_source(source, debug_override=False)) +# On some platforms, should not run gui test even if it is allowed +# in `use_resources'. +if sys.platform.startswith('win'): + import ctypes + import ctypes.wintypes + def _is_gui_available(): + UOI_FLAGS = 1 + WSF_VISIBLE = 0x0001 + class USEROBJECTFLAGS(ctypes.Structure): + _fields_ = [("fInherit", ctypes.wintypes.BOOL), + ("fReserved", ctypes.wintypes.BOOL), + ("dwFlags", ctypes.wintypes.DWORD)] + dll = ctypes.windll.user32 + h = dll.GetProcessWindowStation() + if not h: + raise ctypes.WinError() + uof = USEROBJECTFLAGS() + needed = ctypes.wintypes.DWORD() + res = dll.GetUserObjectInformationW(h, + UOI_FLAGS, + ctypes.byref(uof), + ctypes.sizeof(uof), + ctypes.byref(needed)) + if not res: + raise ctypes.WinError() + return bool(uof.dwFlags & WSF_VISIBLE) +else: + def _is_gui_available(): + return True + def is_resource_enabled(resource): """Test whether a resource is enabled. Known resources are set by regrtest.py.""" @@ -245,6 +275,8 @@ possibility of False being returned occurs when regrtest.py is executing. 
""" + if resource == 'gui' and not _is_gui_available(): + raise unittest.SkipTest("Cannot use the 'gui' resource") # see if the caller's module is __main__ - if so, treat as if # the resource was set if sys._getframe(1).f_globals.get("__name__") == "__main__": @@ -1045,6 +1077,8 @@ return obj def requires_resource(resource): + if resource == 'gui' and not _is_gui_available(): + return unittest.skip("resource 'gui' is not available") if is_resource_enabled(resource): return _id else: Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sat Feb 26 16:58:05 2011 @@ -105,6 +105,9 @@ Tests ----- +- Issue #9931: Fix hangs in GUI tests under Windows in certain conditions. + Patch by Hirokazu Yamamoto. + - Issue #10512: Properly close sockets under test.test_cgi. - Issue #10992: Make tests pass under coverage. From python-checkins at python.org Sat Feb 26 17:00:04 2011 From: python-checkins at python.org (vinay.sajip) Date: Sat, 26 Feb 2011 17:00:04 +0100 (CET) Subject: [Python-checkins] r88653 - python/branches/py3k/Lib/test/test_logging.py Message-ID: <20110226160004.77957EEA3B@mail.python.org> Author: vinay.sajip Date: Sat Feb 26 17:00:04 2011 New Revision: 88653 Log: Issue #11330: Added regression test. Modified: python/branches/py3k/Lib/test/test_logging.py Modified: python/branches/py3k/Lib/test/test_logging.py ============================================================================== --- python/branches/py3k/Lib/test/test_logging.py (original) +++ python/branches/py3k/Lib/test/test_logging.py Sat Feb 26 17:00:04 2011 @@ -1907,6 +1907,8 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('%(asctime)s') self.assertTrue(f.usesTime()) + f = logging.Formatter('%(asctime)-15s') + self.assertTrue(f.usesTime()) f = logging.Formatter('asctime') self.assertFalse(f.usesTime()) @@ -1920,6 +1922,10 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('{asctime}', style='{') self.assertTrue(f.usesTime()) + f = logging.Formatter('{asctime!s:15}', style='{') + self.assertTrue(f.usesTime()) + f = logging.Formatter('{asctime:15}', style='{') + self.assertTrue(f.usesTime()) f = logging.Formatter('asctime', style='{') self.assertFalse(f.usesTime()) @@ -1935,6 +1941,8 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('${asctime}', style='$') self.assertTrue(f.usesTime()) + f = logging.Formatter('${asctime', style='$') + self.assertFalse(f.usesTime()) f = logging.Formatter('$asctime', style='$') self.assertTrue(f.usesTime()) f = logging.Formatter('asctime', style='$') From martin at v.loewis.de Sat Feb 26 17:02:07 2011 From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Sat, 26 Feb 2011 17:02:07 +0100 Subject: [Python-checkins] cpython (trunk): Close the "trunk" branch. In-Reply-To: <20110226110612.2766c0da@pitrou.net> References: <4D68CD42.5050005@v.loewis.de> <20110226110612.2766c0da@pitrou.net> Message-ID: <4D6923FF.9050204@v.loewis.de> > Committing reopened it So what's the point of closing it, then? What effect does that achieve? 
Regards, Martin From python-checkins at python.org Sat Feb 26 17:06:03 2011 From: python-checkins at python.org (vinay.sajip) Date: Sat, 26 Feb 2011 17:06:03 +0100 (CET) Subject: [Python-checkins] r88654 - in python/branches: py3k/Lib/logging/__init__.py py3k/Lib/test/test_logging.py release32-maint/Lib/logging/__init__.py release32-maint/Lib/test/test_logging.py Message-ID: <20110226160603.354EEEEA1D@mail.python.org> Author: vinay.sajip Date: Sat Feb 26 17:06:02 2011 New Revision: 88654 Log: Issue #11330: Updated tests for correct asctime handling. Modified: python/branches/py3k/Lib/logging/__init__.py python/branches/py3k/Lib/test/test_logging.py python/branches/release32-maint/Lib/logging/__init__.py python/branches/release32-maint/Lib/test/test_logging.py Modified: python/branches/py3k/Lib/logging/__init__.py ============================================================================== --- python/branches/py3k/Lib/logging/__init__.py (original) +++ python/branches/py3k/Lib/logging/__init__.py Sat Feb 26 17:06:02 2011 @@ -383,7 +383,7 @@ class StringTemplateStyle(PercentStyle): default_format = '${message}' asctime_format = '${asctime}' - asctime_search = '${asctime' + asctime_search = '${asctime}' def __init__(self, fmt): self._fmt = fmt or self.default_format Modified: python/branches/py3k/Lib/test/test_logging.py ============================================================================== --- python/branches/py3k/Lib/test/test_logging.py (original) +++ python/branches/py3k/Lib/test/test_logging.py Sat Feb 26 17:06:02 2011 @@ -1,6 +1,6 @@ #!/usr/bin/env python # -# Copyright 2001-2010 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2011 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -18,7 +18,7 @@ """Test harness for the logging module. Run all tests. -Copyright (C) 2001-2010 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2011 Vinay Sajip. All Rights Reserved. 
""" import logging Modified: python/branches/release32-maint/Lib/logging/__init__.py ============================================================================== --- python/branches/release32-maint/Lib/logging/__init__.py (original) +++ python/branches/release32-maint/Lib/logging/__init__.py Sat Feb 26 17:06:02 2011 @@ -383,7 +383,7 @@ class StringTemplateStyle(PercentStyle): default_format = '${message}' asctime_format = '${asctime}' - asctime_search = '${asctime' + asctime_search = '${asctime}' def __init__(self, fmt): self._fmt = fmt or self.default_format Modified: python/branches/release32-maint/Lib/test/test_logging.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_logging.py (original) +++ python/branches/release32-maint/Lib/test/test_logging.py Sat Feb 26 17:06:02 2011 @@ -1907,6 +1907,8 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('%(asctime)s') self.assertTrue(f.usesTime()) + f = logging.Formatter('%(asctime)-15s') + self.assertTrue(f.usesTime()) f = logging.Formatter('asctime') self.assertFalse(f.usesTime()) @@ -1920,6 +1922,10 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('{asctime}', style='{') self.assertTrue(f.usesTime()) + f = logging.Formatter('{asctime!s:15}', style='{') + self.assertTrue(f.usesTime()) + f = logging.Formatter('{asctime:15}', style='{') + self.assertTrue(f.usesTime()) f = logging.Formatter('asctime', style='{') self.assertFalse(f.usesTime()) @@ -1935,6 +1941,8 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('${asctime}', style='$') self.assertTrue(f.usesTime()) + f = logging.Formatter('${asctime', style='$') + self.assertFalse(f.usesTime()) f = logging.Formatter('$asctime', style='$') self.assertTrue(f.usesTime()) f = logging.Formatter('asctime', style='$') @@ -2097,7 +2105,7 @@ LogRecordFactoryTest, ChildLoggerTest, QueueHandlerTest, RotatingFileHandlerTest, LastResortTest, - #TimedRotatingFileHandlerTest + TimedRotatingFileHandlerTest ) if __name__ == "__main__": From python-checkins at python.org Sat Feb 26 17:34:45 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 17:34:45 +0100 Subject: [Python-checkins] pymigr: Request addition of hook for closed branches. Message-ID: martin.v.loewis pushed 928a3edfce4f to pymigr: http://hg.python.org/pymigr/rev/928a3edfce4f changeset: 108:928a3edfce4f tag: tip user: Martin v. L?wis date: Sat Feb 26 17:34:41 2011 +0100 summary: Request addition of hook for closed branches. files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -6,6 +6,7 @@ * resolve open issues in PEP 385 * (optionally) fight some more about workflow and decide the branch structure * some hook should prevent pushing python files indented by tabs. +* some hook should prevent pushing to the 2.x trunk. Done ==== -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sat Feb 26 17:49:08 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 17:49:08 +0100 (CET) Subject: [Python-checkins] r88655 - in python/branches/release32-maint: Lib/test/support.py Misc/NEWS Message-ID: <20110226164908.AE397FE26@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 17:49:08 2011 New Revision: 88655 Log: Merged revisions 88652 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88652 | antoine.pitrou | 2011-02-26 16:58:05 +0100 (sam., 26 f?vr. 
2011) | 4 lines Issue #9931: Fix hangs in GUI tests under Windows in certain conditions. Patch by Hirokazu Yamamoto. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/support.py python/branches/release32-maint/Misc/NEWS Modified: python/branches/release32-maint/Lib/test/support.py ============================================================================== --- python/branches/release32-maint/Lib/test/support.py (original) +++ python/branches/release32-maint/Lib/test/support.py Sat Feb 26 17:49:08 2011 @@ -233,6 +233,36 @@ unlink(imp.cache_from_source(source, debug_override=True)) unlink(imp.cache_from_source(source, debug_override=False)) +# On some platforms, should not run gui test even if it is allowed +# in `use_resources'. +if sys.platform.startswith('win'): + import ctypes + import ctypes.wintypes + def _is_gui_available(): + UOI_FLAGS = 1 + WSF_VISIBLE = 0x0001 + class USEROBJECTFLAGS(ctypes.Structure): + _fields_ = [("fInherit", ctypes.wintypes.BOOL), + ("fReserved", ctypes.wintypes.BOOL), + ("dwFlags", ctypes.wintypes.DWORD)] + dll = ctypes.windll.user32 + h = dll.GetProcessWindowStation() + if not h: + raise ctypes.WinError() + uof = USEROBJECTFLAGS() + needed = ctypes.wintypes.DWORD() + res = dll.GetUserObjectInformationW(h, + UOI_FLAGS, + ctypes.byref(uof), + ctypes.sizeof(uof), + ctypes.byref(needed)) + if not res: + raise ctypes.WinError() + return bool(uof.dwFlags & WSF_VISIBLE) +else: + def _is_gui_available(): + return True + def is_resource_enabled(resource): """Test whether a resource is enabled. Known resources are set by regrtest.py.""" @@ -245,6 +275,8 @@ possibility of False being returned occurs when regrtest.py is executing. """ + if resource == 'gui' and not _is_gui_available(): + raise unittest.SkipTest("Cannot use the 'gui' resource") # see if the caller's module is __main__ - if so, treat as if # the resource was set if sys._getframe(1).f_globals.get("__name__") == "__main__": @@ -1063,6 +1095,8 @@ return obj def requires_resource(resource): + if resource == 'gui' and not _is_gui_available(): + return unittest.skip("resource 'gui' is not available") if is_resource_enabled(resource): return _id else: Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Sat Feb 26 17:49:08 2011 @@ -55,6 +55,9 @@ Tests ----- +- Issue #9931: Fix hangs in GUI tests under Windows in certain conditions. + Patch by Hirokazu Yamamoto. + - Issue #10826: Prevent sporadic failure in test_subprocess on Solaris due to open door files. From ncoghlan at gmail.com Sat Feb 26 17:57:23 2011 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 27 Feb 2011 02:57:23 +1000 Subject: [Python-checkins] r88601 - peps/trunk/pep-0385.txt In-Reply-To: <20110225191207.51301EE985@mail.python.org> References: <20110225191207.51301EE985@mail.python.org> Message-ID: On Sat, Feb 26, 2011 at 5:12 AM, antoine.pitrou wrote: > +In practice, most Mercurial users under Windows don't seem to have a need > +for the ``eol`` extension, though, and personal experience using a > +Linux-generated SVN checkout through a shared folder under Windows seems > +to confirm that modern tools work well without it. 
Feedback from the Mozilla and XEmacs folks at the time of that discussion suggested that EOL issues *were* a potential issue, and that having to revert unintentional commits of CRLF line endings was an annoyingly common problem. Cheers, Nick. -- Nick Coghlan?? |?? ncoghlan at gmail.com?? |?? Brisbane, Australia From python-checkins at python.org Sat Feb 26 18:45:41 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 18:45:41 +0100 Subject: [Python-checkins] merge in cpython: merged Message-ID: martin.v.loewis pushed fd43ce758b11 to cpython: http://hg.python.org/cpython/rev/fd43ce758b11 changeset: 68051:fd43ce758b11 tag: tip parent: 68050:ef10245dfe0c parent: 68042:8ff33af017ef user: Martin v. L?wis date: Sat Feb 26 18:45:13 2011 +0100 summary: merged files: diff --git a/Lib/http/client.py b/Lib/http/client.py --- a/Lib/http/client.py +++ b/Lib/http/client.py @@ -76,7 +76,7 @@ import warnings __all__ = ["HTTPResponse", "HTTPConnection", - "HTTPException", "NotConnected", "UnknownProtocol", + "HTTPException", "NotConnected", "UnknownProtocol", "UnknownTransferEncoding", "UnimplementedFileMode", "IncompleteRead", "InvalidURL", "ImproperConnectionState", "CannotSendRequest", "CannotSendHeader", "ResponseNotReady", -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 18:45:41 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 18:45:41 +0100 Subject: [Python-checkins] cpython: Add CR in individual line. Message-ID: martin.v.loewis pushed ef10245dfe0c to cpython: http://hg.python.org/cpython/rev/ef10245dfe0c changeset: 68050:ef10245dfe0c parent: 68040:10cb0ba4fd71 user: Martin v. L?wis date: Sat Feb 26 18:44:41 2011 +0100 summary: Add CR in individual line. files: Include/Python.h diff --git a/Include/Python.h b/Include/Python.h --- a/Include/Python.h +++ b/Include/Python.h @@ -4,7 +4,7 @@ /* Include nearly all Python header files */ -#include "patchlevel.h" +#include "patchlevel.h" #include "pyconfig.h" #include "pymacconfig.h" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 18:48:17 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 18:48:17 +0100 Subject: [Python-checkins] pymigr: Ask for hgeol-checking hook. Message-ID: martin.v.loewis pushed cafb761c4175 to pymigr: http://hg.python.org/pymigr/rev/cafb761c4175 changeset: 109:cafb761c4175 tag: tip user: Martin v. L?wis date: Sat Feb 26 18:48:13 2011 +0100 summary: Ask for hgeol-checking hook. files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -7,6 +7,7 @@ * (optionally) fight some more about workflow and decide the branch structure * some hook should prevent pushing python files indented by tabs. * some hook should prevent pushing to the 2.x trunk. +* some hook should prevent breaking EOL conventions. 
Done ==== -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sat Feb 26 18:52:50 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 18:52:50 +0100 (CET) Subject: [Python-checkins] r88656 - python/branches/py3k/Lib/test/test_os.py Message-ID: <20110226175250.9B071EE9B5@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 18:52:50 2011 New Revision: 88656 Log: Make sendfile tests more robust Modified: python/branches/py3k/Lib/test/test_os.py Modified: python/branches/py3k/Lib/test/test_os.py ============================================================================== --- python/branches/py3k/Lib/test/test_os.py (original) +++ python/branches/py3k/Lib/test/test_os.py Sat Feb 26 18:52:50 2011 @@ -1340,7 +1340,7 @@ def wait(self): # wait for handler connection to be closed, then stop the server - while not getattr(self.handler_instance, "closed", True): + while not getattr(self.handler_instance, "closed", False): time.sleep(0.001) self.stop() @@ -1442,9 +1442,11 @@ self.assertEqual(offset, total_sent) self.assertEqual(total_sent, len(self.DATA)) + self.client.shutdown(socket.SHUT_RDWR) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() + self.assertEqual(len(data), len(self.DATA)) self.assertEqual(data, self.DATA) def test_send_at_certain_offset(self): @@ -1461,11 +1463,13 @@ total_sent += sent self.assertTrue(sent <= nbytes) + self.client.shutdown(socket.SHUT_RDWR) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() expected = self.DATA[len(self.DATA) // 2:] self.assertEqual(total_sent, len(expected)) + self.assertEqual(len(data), len(expected)) self.assertEqual(data, expected) def test_offset_overflow(self): @@ -1479,6 +1483,7 @@ raise else: self.assertEqual(sent, 0) + self.client.shutdown(socket.SHUT_RDWR) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() From python-checkins at python.org Sat Feb 26 19:04:03 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 19:04:03 +0100 (CET) Subject: [Python-checkins] r88657 - peps/trunk/pep-0385.txt Message-ID: <20110226180403.218BEEE9BA@mail.python.org> Author: antoine.pitrou Date: Sat Feb 26 19:04:02 2011 New Revision: 88657 Log: Following Nick's comment, remove optimistic paragraph Modified: peps/trunk/pep-0385.txt Modified: peps/trunk/pep-0385.txt ============================================================================== --- peps/trunk/pep-0385.txt (original) +++ peps/trunk/pep-0385.txt Sat Feb 26 19:04:02 2011 @@ -285,11 +285,6 @@ information is kept in a versioned file called ``.hgeol``, and such a file has already been checked into the Subversion repository. -In practice, most Mercurial users under Windows don't seem to have a need -for the ``eol`` extension, though, and personal experience using a -Linux-generated SVN checkout through a shared folder under Windows seems -to confirm that modern tools work well without it. - A hook on the server side that turns down any changegroup or changeset introducing inconsistent newline data can still be implemented, if deemed necessary. 
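[Context for the r88656 diff above: the change makes the sendfile tests deterministic by calling shutdown(SHUT_RDWR) on the client socket before close(), so the receiving side reliably sees EOF before the test compares the received payload against what was sent. What follows is only a minimal, self-contained sketch of that shutdown-before-close pattern on a throwaway loopback connection; the helper names and payload size are invented for illustration and this is not the actual test_os.py code.]

    import socket
    import threading

    PAYLOAD = b"x" * 65536  # arbitrary payload size for the sketch

    def recv_until_eof(server_sock, received):
        # Accept one connection and read until the peer signals EOF.
        conn, _ = server_sock.accept()
        with conn:
            while True:
                chunk = conn.recv(4096)
                if not chunk:        # empty read: peer shut down its sending side
                    break
                received.extend(chunk)

    def main():
        srv = socket.socket()
        srv.bind(("127.0.0.1", 0))
        srv.listen(1)
        received = bytearray()
        t = threading.Thread(target=recv_until_eof, args=(srv, received))
        t.start()

        cli = socket.create_connection(srv.getsockname())
        cli.sendall(PAYLOAD)
        cli.shutdown(socket.SHUT_RDWR)   # make EOF explicit before closing
        cli.close()

        t.join()
        srv.close()
        # With the explicit shutdown the reader has seen EOF, so a strict
        # length/content comparison is safe here, as in the updated test.
        assert bytes(received) == PAYLOAD, (len(received), len(PAYLOAD))

    if __name__ == "__main__":
        main()
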
From python-checkins at python.org Sat Feb 26 19:05:28 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sat, 26 Feb 2011 19:05:28 +0100 Subject: [Python-checkins] cpython: improve license Message-ID: benjamin.peterson pushed 0873fb83f1e2 to cpython: http://hg.python.org/cpython/rev/0873fb83f1e2 changeset: 68052:0873fb83f1e2 tag: tip user: Benjamin Peterson date: Sat Feb 26 12:06:36 2011 -0600 summary: improve license files: LICENSE diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -1,19 +1,18 @@ Copyright (C) 2011 by Python authors and the Python Software Foundation -Permission is hereby granted, free of charge, to any person obtaining a copy -of this software and associated documentation files (the "Software"), to -deal in the Software without restriction, including without limitation the -rights to use, copy, modify, merge, publish, distribute, sublicense, and/or -sell copies of the Software, and to permit persons to whom the Software is -furnished to do so, subject to the following conditions: +In place of a legal mess, here is a poem: -The above copyright notice and this permission notice shall be included in -all copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING -FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS -IN THE SOFTWARE. +I met a traveller from an antique land +Who said: Two vast and trunkless legs of stone +Stand in the desert. Near them, on the sand, +Half sunk, a shattered visage lies, whose frown +And wrinkled lip, and sneer of cold command +Tell that its sculptor well those passions read +Which yet survive, stamped on these lifeless things, +The hand that mocked them and the heart that fed. +And on the pedestal these words appear: +"My name is Ozymandias, king of kings: +Look on my works, ye Mighty, and despair!" +Nothing beside remains. Round the decay +Of that colossal wreck, boundless and bare +The lone and level sands stretch far away. -- Repository URL: http://hg.python.org/cpython From barry at python.org Sat Feb 26 19:12:25 2011 From: barry at python.org (Barry Warsaw) Date: Sat, 26 Feb 2011 13:12:25 -0500 Subject: [Python-checkins] cpython: improve license In-Reply-To: References: Message-ID: <20110226131225.2f310fc4@limelight.wooz.org> Notice the subject line. Can we make commit messages contain the named branch that the change applies to? The 'cpython' in the header doesn't really tell me whether I should care about this diff or not. Say the change applied to 2.6 but I only care about Python 3. It would be nice if I could just delete this message without reading the body. I guess it's possible for change notifications to encompass multiple named branches though, right? I'm not sure what to do about that, but it seems like a less common use case. -Barry On Feb 26, 2011, at 07:05 PM, benjamin.peterson wrote: >benjamin.peterson pushed 0873fb83f1e2 to cpython: > >http://hg.python.org/cpython/rev/0873fb83f1e2 >changeset: 68052:0873fb83f1e2 >tag: tip >user: Benjamin Peterson >date: Sat Feb 26 12:06:36 2011 -0600 >summary: > improve license > >files: > LICENSE -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: not available URL: From python-checkins at python.org Sat Feb 26 19:17:28 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 19:17:28 +0100 Subject: [Python-checkins] cpython: Commit fixed .hgeol from SVN Message-ID: antoine.pitrou pushed 685cc8eb00ce to cpython: http://hg.python.org/cpython/rev/685cc8eb00ce changeset: 68053:685cc8eb00ce user: Antoine Pitrou date: Sat Feb 26 19:16:34 2011 +0100 summary: Commit fixed .hgeol from SVN files: .hgeol diff --git a/.hgeol b/.hgeol --- a/.hgeol +++ b/.hgeol @@ -1,16 +1,13 @@ [patterns] -** = native -**.bat = CRLF -**.def = CRLF -**.dsp = CRLF -**.dsw = CRLF -**.mak = CRLF -**.mk = CRLF -**.rc = CRLF -**.sln = CRLF -**.vcproj = CRLF -**.vsprops = CRLF +# Non human-editable files are binary + +**.dsp = BIN +**.dsw = BIN +**.mk = BIN +**.sln = BIN +**.vcproj = BIN +**.vsprops = BIN **.aif = BIN **.au = BIN @@ -31,6 +28,12 @@ Lib/email/test/data/msg_26.txt = BIN Lib/test/sndhdrdata/sndhdr.* = BIN +Lib/test/decimaltestdata/*.decTest = BIN + +# All other files (which presumably are human-editable) are "native". +# This must be the last rule! + +** = native [repository] -native = LF \ No newline at end of file +native = LF -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 19:17:28 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 19:17:28 +0100 Subject: [Python-checkins] cpython: Fix endings (courtesy of the eol extension) Message-ID: antoine.pitrou pushed ca16fffdb379 to cpython: http://hg.python.org/cpython/rev/ca16fffdb379 changeset: 68054:ca16fffdb379 tag: tip user: Antoine Pitrou date: Sat Feb 26 19:17:21 2011 +0100 summary: Fix endings (courtesy of the eol extension) files: Doc/make.bat PC/VS7.1/build_ssl.bat PC/VS8.0/build.bat PC/VS8.0/build_env.bat PC/VS8.0/build_pgo.bat PC/VS8.0/build_ssl.bat PC/VS8.0/env.bat PC/VS8.0/idle.bat PC/VS8.0/rt.bat PC/python3.def PC/python3.mak PC/python33gen.py PC/python33stub.def PC/python3dll.c Tools/buildbot/build-amd64.bat Tools/buildbot/build.bat Tools/buildbot/buildmsi.bat Tools/buildbot/clean-amd64.bat Tools/buildbot/clean.bat Tools/buildbot/external-amd64.bat Tools/buildbot/external-common.bat Tools/buildbot/external.bat Tools/buildbot/test-amd64.bat Tools/buildbot/test.bat Tools/unicode/genwincodecs.bat diff --git a/Doc/make.bat b/Doc/make.bat --- a/Doc/make.bat +++ b/Doc/make.bat @@ -1,59 +1,59 @@ -@@echo off -setlocal - -set SVNROOT=http://svn.python.org/projects -if "%PYTHON%" EQU "" set PYTHON=..\pcbuild\python -if "%HTMLHELP%" EQU "" set HTMLHELP=%ProgramFiles%\HTML Help Workshop\hhc.exe -if "%DISTVERSION%" EQU "" for /f "usebackq" %%v in (`%PYTHON% tools/sphinxext/patchlevel.py`) do set DISTVERSION=%%v - -if "%1" EQU "" goto help -if "%1" EQU "html" goto build -if "%1" EQU "htmlhelp" goto build -if "%1" EQU "latex" goto build -if "%1" EQU "text" goto build -if "%1" EQU "suspicious" goto build -if "%1" EQU "linkcheck" goto build -if "%1" EQU "changes" goto build -if "%1" EQU "checkout" goto checkout -if "%1" EQU "update" goto update - -:help -set this=%~n0 -echo HELP -echo. -echo %this% checkout -echo %this% update -echo %this% html -echo %this% htmlhelp -echo %this% latex -echo %this% text -echo %this% suspicious -echo %this% linkcheck -echo %this% changes -echo. 
-goto end - -:checkout -svn co %SVNROOT%/external/Sphinx-1.0.7/sphinx tools/sphinx -svn co %SVNROOT%/external/docutils-0.6/docutils tools/docutils -svn co %SVNROOT%/external/Jinja-2.3.1/jinja2 tools/jinja2 -svn co %SVNROOT%/external/Pygments-1.3.1/pygments tools/pygments -goto end - -:update -svn update tools/sphinx -svn update tools/docutils -svn update tools/jinja2 -svn update tools/pygments -goto end - -:build -if not exist build mkdir build -if not exist build\%1 mkdir build\%1 -if not exist build\doctrees mkdir build\doctrees -cmd /C %PYTHON% --version -cmd /C %PYTHON% tools\sphinx-build.py -b%1 -dbuild\doctrees . build\%* -if "%1" EQU "htmlhelp" "%HTMLHELP%" build\htmlhelp\python%DISTVERSION:.=%.hhp -goto end - -:end +@@echo off +setlocal + +set SVNROOT=http://svn.python.org/projects +if "%PYTHON%" EQU "" set PYTHON=..\pcbuild\python +if "%HTMLHELP%" EQU "" set HTMLHELP=%ProgramFiles%\HTML Help Workshop\hhc.exe +if "%DISTVERSION%" EQU "" for /f "usebackq" %%v in (`%PYTHON% tools/sphinxext/patchlevel.py`) do set DISTVERSION=%%v + +if "%1" EQU "" goto help +if "%1" EQU "html" goto build +if "%1" EQU "htmlhelp" goto build +if "%1" EQU "latex" goto build +if "%1" EQU "text" goto build +if "%1" EQU "suspicious" goto build +if "%1" EQU "linkcheck" goto build +if "%1" EQU "changes" goto build +if "%1" EQU "checkout" goto checkout +if "%1" EQU "update" goto update + +:help +set this=%~n0 +echo HELP +echo. +echo %this% checkout +echo %this% update +echo %this% html +echo %this% htmlhelp +echo %this% latex +echo %this% text +echo %this% suspicious +echo %this% linkcheck +echo %this% changes +echo. +goto end + +:checkout +svn co %SVNROOT%/external/Sphinx-1.0.7/sphinx tools/sphinx +svn co %SVNROOT%/external/docutils-0.6/docutils tools/docutils +svn co %SVNROOT%/external/Jinja-2.3.1/jinja2 tools/jinja2 +svn co %SVNROOT%/external/Pygments-1.3.1/pygments tools/pygments +goto end + +:update +svn update tools/sphinx +svn update tools/docutils +svn update tools/jinja2 +svn update tools/pygments +goto end + +:build +if not exist build mkdir build +if not exist build\%1 mkdir build\%1 +if not exist build\doctrees mkdir build\doctrees +cmd /C %PYTHON% --version +cmd /C %PYTHON% tools\sphinx-build.py -b%1 -dbuild\doctrees . build\%* +if "%1" EQU "htmlhelp" "%HTMLHELP%" build\htmlhelp\python%DISTVERSION:.=%.hhp +goto end + +:end diff --git a/PC/VS7.1/build_ssl.bat b/PC/VS7.1/build_ssl.bat --- a/PC/VS7.1/build_ssl.bat +++ b/PC/VS7.1/build_ssl.bat @@ -1,12 +1,12 @@ -if "%1" == "ReleaseAMD64" call "%MSSdk%\SetEnv" /XP64 /RETAIL - - at echo off -if not defined HOST_PYTHON ( - if %1 EQU Debug ( - set HOST_PYTHON=python_d.exe - ) ELSE ( - set HOST_PYTHON=python.exe - ) -) -%HOST_PYTHON% build_ssl.py %1 %2 - +if "%1" == "ReleaseAMD64" call "%MSSdk%\SetEnv" /XP64 /RETAIL + + at echo off +if not defined HOST_PYTHON ( + if %1 EQU Debug ( + set HOST_PYTHON=python_d.exe + ) ELSE ( + set HOST_PYTHON=python.exe + ) +) +%HOST_PYTHON% build_ssl.py %1 %2 + diff --git a/PC/VS8.0/build.bat b/PC/VS8.0/build.bat --- a/PC/VS8.0/build.bat +++ b/PC/VS8.0/build.bat @@ -1,17 +1,17 @@ - at echo off -rem A batch program to build or rebuild a particular configuration. -rem just for convenience. 
- -setlocal -set platf=Win32 -set conf=Release -set build=/build - -:CheckOpts -if "%1"=="-c" (set conf=%2) & shift & shift & goto CheckOpts -if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts -if "%1"=="-r" (set build=/rebuild) & shift & goto CheckOpts - -set cmd=devenv pcbuild.sln %build% "%conf%|%platf%" -echo %cmd% -%cmd% + at echo off +rem A batch program to build or rebuild a particular configuration. +rem just for convenience. + +setlocal +set platf=Win32 +set conf=Release +set build=/build + +:CheckOpts +if "%1"=="-c" (set conf=%2) & shift & shift & goto CheckOpts +if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts +if "%1"=="-r" (set build=/rebuild) & shift & goto CheckOpts + +set cmd=devenv pcbuild.sln %build% "%conf%|%platf%" +echo %cmd% +%cmd% diff --git a/PC/VS8.0/build_env.bat b/PC/VS8.0/build_env.bat --- a/PC/VS8.0/build_env.bat +++ b/PC/VS8.0/build_env.bat @@ -1,1 +1,1 @@ -@%comspec% /k env.bat %* +@%comspec% /k env.bat %* diff --git a/PC/VS8.0/build_pgo.bat b/PC/VS8.0/build_pgo.bat --- a/PC/VS8.0/build_pgo.bat +++ b/PC/VS8.0/build_pgo.bat @@ -1,41 +1,41 @@ - at echo off -rem A batch program to build PGO (Profile guided optimization) by first -rem building instrumented binaries, then running the testsuite, and -rem finally building the optimized code. -rem Note, after the first instrumented run, one can just keep on -rem building the PGUpdate configuration while developing. - -setlocal -set platf=Win32 - -rem use the performance testsuite. This is quick and simple -set job1=..\..\tools\pybench\pybench.py -n 1 -C 1 --with-gc -set path1=..\..\tools\pybench - -rem or the whole testsuite for more thorough testing -set job2=..\..\lib\test\regrtest.py -set path2=..\..\lib - -set job=%job1% -set clrpath=%path1% - -:CheckOpts -if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts -if "%1"=="-2" (set job=%job2%) & (set clrpath=%path2%) & shift & goto CheckOpts - -set PGI=%platf%-pgi -set PGO=%platf%-pgo - - at echo on -rem build the instrumented version -call build -p %platf% -c PGInstrument - -rem remove .pyc files, .pgc files and execute the job -%PGI%\python.exe rmpyc.py %clrpath% -del %PGI%\*.pgc -%PGI%\python.exe %job% - -rem finally build the optimized version -if exist %PGO% del /s /q %PGO% -call build -p %platf% -c PGUpdate - + at echo off +rem A batch program to build PGO (Profile guided optimization) by first +rem building instrumented binaries, then running the testsuite, and +rem finally building the optimized code. +rem Note, after the first instrumented run, one can just keep on +rem building the PGUpdate configuration while developing. + +setlocal +set platf=Win32 + +rem use the performance testsuite. 
This is quick and simple +set job1=..\..\tools\pybench\pybench.py -n 1 -C 1 --with-gc +set path1=..\..\tools\pybench + +rem or the whole testsuite for more thorough testing +set job2=..\..\lib\test\regrtest.py +set path2=..\..\lib + +set job=%job1% +set clrpath=%path1% + +:CheckOpts +if "%1"=="-p" (set platf=%2) & shift & shift & goto CheckOpts +if "%1"=="-2" (set job=%job2%) & (set clrpath=%path2%) & shift & goto CheckOpts + +set PGI=%platf%-pgi +set PGO=%platf%-pgo + + at echo on +rem build the instrumented version +call build -p %platf% -c PGInstrument + +rem remove .pyc files, .pgc files and execute the job +%PGI%\python.exe rmpyc.py %clrpath% +del %PGI%\*.pgc +%PGI%\python.exe %job% + +rem finally build the optimized version +if exist %PGO% del /s /q %PGO% +call build -p %platf% -c PGUpdate + diff --git a/PC/VS8.0/build_ssl.bat b/PC/VS8.0/build_ssl.bat --- a/PC/VS8.0/build_ssl.bat +++ b/PC/VS8.0/build_ssl.bat @@ -1,12 +1,12 @@ - at echo off -if not defined HOST_PYTHON ( - if %1 EQU Debug ( - set HOST_PYTHON=python_d.exe - if not exist python33_d.dll exit 1 - ) ELSE ( - set HOST_PYTHON=python.exe - if not exist python33.dll exit 1 - ) -) -%HOST_PYTHON% build_ssl.py %1 %2 %3 - + at echo off +if not defined HOST_PYTHON ( + if %1 EQU Debug ( + set HOST_PYTHON=python_d.exe + if not exist python33_d.dll exit 1 + ) ELSE ( + set HOST_PYTHON=python.exe + if not exist python33.dll exit 1 + ) +) +%HOST_PYTHON% build_ssl.py %1 %2 %3 + diff --git a/PC/VS8.0/env.bat b/PC/VS8.0/env.bat --- a/PC/VS8.0/env.bat +++ b/PC/VS8.0/env.bat @@ -1,5 +1,5 @@ - at echo off -set VS8=%ProgramFiles%\Microsoft Visual Studio 8 -echo Build environments: x86, ia64, amd64, x86_amd64, x86_ia64 -echo. -call "%VS8%\VC\vcvarsall.bat" %1 + at echo off +set VS8=%ProgramFiles%\Microsoft Visual Studio 8 +echo Build environments: x86, ia64, amd64, x86_amd64, x86_ia64 +echo. +call "%VS8%\VC\vcvarsall.bat" %1 diff --git a/PC/VS8.0/idle.bat b/PC/VS8.0/idle.bat --- a/PC/VS8.0/idle.bat +++ b/PC/VS8.0/idle.bat @@ -1,15 +1,15 @@ - at echo off -rem start idle -rem Usage: idle [-d] -rem -d Run Debug build (python_d.exe). Else release build. - -setlocal -set exe=python -PATH %PATH%;..\..\..\tcltk\bin - -if "%1"=="-d" (set exe=python_d) & shift - -set cmd=%exe% ../../Lib/idlelib/idle.py %1 %2 %3 %4 %5 %6 %7 %8 %9 - -echo on -%cmd% + at echo off +rem start idle +rem Usage: idle [-d] +rem -d Run Debug build (python_d.exe). Else release build. + +setlocal +set exe=python +PATH %PATH%;..\..\..\tcltk\bin + +if "%1"=="-d" (set exe=python_d) & shift + +set cmd=%exe% ../../Lib/idlelib/idle.py %1 %2 %3 %4 %5 %6 %7 %8 %9 + +echo on +%cmd% diff --git a/PC/VS8.0/rt.bat b/PC/VS8.0/rt.bat --- a/PC/VS8.0/rt.bat +++ b/PC/VS8.0/rt.bat @@ -1,52 +1,52 @@ - at echo off -rem Run Tests. Run the regression test suite. -rem Usage: rt [-d] [-O] [-q] regrtest_args -rem -d Run Debug build (python_d.exe). Else release build. -rem -O Run python.exe or python_d.exe (see -d) with -O. -rem -q "quick" -- normally the tests are run twice, the first time -rem after deleting all the .py[co] files reachable from Lib/. -rem -q runs the tests just once, and without deleting .py[co] files. -rem All leading instances of these switches are shifted off, and -rem whatever remains is passed to regrtest.py. For example, -rem rt -O -d -x test_thread -rem runs -rem python_d -O ../lib/test/regrtest.py -x test_thread -rem twice, and -rem rt -q -g test_binascii -rem runs -rem python_d ../lib/test/regrtest.py -g test_binascii -rem to generate the expected-output file for binascii quickly. 
-rem -rem Confusing: if you want to pass a comma-separated list, like -rem -u network,largefile -rem then you have to quote it on the rt line, like -rem rt -u "network,largefile" - -setlocal - -set exe=python -set qmode= -set dashO= -PATH %PATH%;%~dp0..\..\..\tcltk\bin - -:CheckOpts -if "%1"=="-O" (set dashO=-O) & shift & goto CheckOpts -if "%1"=="-q" (set qmode=yes) & shift & goto CheckOpts -if "%1"=="-d" (set exe=python_d) & shift & goto CheckOpts - -set cmd=%exe% %dashO% -E ../../lib/test/regrtest.py %1 %2 %3 %4 %5 %6 %7 %8 %9 -if defined qmode goto Qmode - -echo Deleting .pyc/.pyo files ... -%exe% rmpyc.py - -echo on -%cmd% - at echo off - -echo About to run again without deleting .pyc/.pyo first: -pause - -:Qmode -echo on -%cmd% + at echo off +rem Run Tests. Run the regression test suite. +rem Usage: rt [-d] [-O] [-q] regrtest_args +rem -d Run Debug build (python_d.exe). Else release build. +rem -O Run python.exe or python_d.exe (see -d) with -O. +rem -q "quick" -- normally the tests are run twice, the first time +rem after deleting all the .py[co] files reachable from Lib/. +rem -q runs the tests just once, and without deleting .py[co] files. +rem All leading instances of these switches are shifted off, and +rem whatever remains is passed to regrtest.py. For example, +rem rt -O -d -x test_thread +rem runs +rem python_d -O ../lib/test/regrtest.py -x test_thread +rem twice, and +rem rt -q -g test_binascii +rem runs +rem python_d ../lib/test/regrtest.py -g test_binascii +rem to generate the expected-output file for binascii quickly. +rem +rem Confusing: if you want to pass a comma-separated list, like +rem -u network,largefile +rem then you have to quote it on the rt line, like +rem rt -u "network,largefile" + +setlocal + +set exe=python +set qmode= +set dashO= +PATH %PATH%;%~dp0..\..\..\tcltk\bin + +:CheckOpts +if "%1"=="-O" (set dashO=-O) & shift & goto CheckOpts +if "%1"=="-q" (set qmode=yes) & shift & goto CheckOpts +if "%1"=="-d" (set exe=python_d) & shift & goto CheckOpts + +set cmd=%exe% %dashO% -E ../../lib/test/regrtest.py %1 %2 %3 %4 %5 %6 %7 %8 %9 +if defined qmode goto Qmode + +echo Deleting .pyc/.pyo files ... 
+%exe% rmpyc.py + +echo on +%cmd% + at echo off + +echo About to run again without deleting .pyc/.pyo first: +pause + +:Qmode +echo on +%cmd% diff --git a/PC/python3.def b/PC/python3.def --- a/PC/python3.def +++ b/PC/python3.def @@ -1,689 +1,689 @@ -LIBRARY "python3" -EXPORTS - PyArg_Parse=python33.PyArg_Parse - PyArg_ParseTuple=python33.PyArg_ParseTuple - PyArg_ParseTupleAndKeywords=python33.PyArg_ParseTupleAndKeywords - PyArg_UnpackTuple=python33.PyArg_UnpackTuple - PyArg_VaParse=python33.PyArg_VaParse - PyArg_VaParseTupleAndKeywords=python33.PyArg_VaParseTupleAndKeywords - PyArg_ValidateKeywordArguments=python33.PyArg_ValidateKeywordArguments - PyBaseObject_Type=python33.PyBaseObject_Type DATA - PyBool_FromLong=python33.PyBool_FromLong - PyBool_Type=python33.PyBool_Type DATA - PyByteArrayIter_Type=python33.PyByteArrayIter_Type DATA - PyByteArray_AsString=python33.PyByteArray_AsString - PyByteArray_Concat=python33.PyByteArray_Concat - PyByteArray_FromObject=python33.PyByteArray_FromObject - PyByteArray_FromStringAndSize=python33.PyByteArray_FromStringAndSize - PyByteArray_Resize=python33.PyByteArray_Resize - PyByteArray_Size=python33.PyByteArray_Size - PyByteArray_Type=python33.PyByteArray_Type DATA - PyBytesIter_Type=python33.PyBytesIter_Type DATA - PyBytes_AsString=python33.PyBytes_AsString - PyBytes_AsStringAndSize=python33.PyBytes_AsStringAndSize - PyBytes_Concat=python33.PyBytes_Concat - PyBytes_ConcatAndDel=python33.PyBytes_ConcatAndDel - PyBytes_DecodeEscape=python33.PyBytes_DecodeEscape - PyBytes_FromFormat=python33.PyBytes_FromFormat - PyBytes_FromFormatV=python33.PyBytes_FromFormatV - PyBytes_FromObject=python33.PyBytes_FromObject - PyBytes_FromString=python33.PyBytes_FromString - PyBytes_FromStringAndSize=python33.PyBytes_FromStringAndSize - PyBytes_Repr=python33.PyBytes_Repr - PyBytes_Size=python33.PyBytes_Size - PyBytes_Type=python33.PyBytes_Type DATA - PyCFunction_Call=python33.PyCFunction_Call - PyCFunction_ClearFreeList=python33.PyCFunction_ClearFreeList - PyCFunction_GetFlags=python33.PyCFunction_GetFlags - PyCFunction_GetFunction=python33.PyCFunction_GetFunction - PyCFunction_GetSelf=python33.PyCFunction_GetSelf - PyCFunction_NewEx=python33.PyCFunction_NewEx - PyCFunction_Type=python33.PyCFunction_Type DATA - PyCallIter_New=python33.PyCallIter_New - PyCallIter_Type=python33.PyCallIter_Type DATA - PyCallable_Check=python33.PyCallable_Check - PyCapsule_GetContext=python33.PyCapsule_GetContext - PyCapsule_GetDestructor=python33.PyCapsule_GetDestructor - PyCapsule_GetName=python33.PyCapsule_GetName - PyCapsule_GetPointer=python33.PyCapsule_GetPointer - PyCapsule_Import=python33.PyCapsule_Import - PyCapsule_IsValid=python33.PyCapsule_IsValid - PyCapsule_New=python33.PyCapsule_New - PyCapsule_SetContext=python33.PyCapsule_SetContext - PyCapsule_SetDestructor=python33.PyCapsule_SetDestructor - PyCapsule_SetName=python33.PyCapsule_SetName - PyCapsule_SetPointer=python33.PyCapsule_SetPointer - PyCapsule_Type=python33.PyCapsule_Type DATA - PyClassMethodDescr_Type=python33.PyClassMethodDescr_Type DATA - PyCodec_BackslashReplaceErrors=python33.PyCodec_BackslashReplaceErrors - PyCodec_Decode=python33.PyCodec_Decode - PyCodec_Decoder=python33.PyCodec_Decoder - PyCodec_Encode=python33.PyCodec_Encode - PyCodec_Encoder=python33.PyCodec_Encoder - PyCodec_IgnoreErrors=python33.PyCodec_IgnoreErrors - PyCodec_IncrementalDecoder=python33.PyCodec_IncrementalDecoder - PyCodec_IncrementalEncoder=python33.PyCodec_IncrementalEncoder - PyCodec_KnownEncoding=python33.PyCodec_KnownEncoding - 
PyCodec_LookupError=python33.PyCodec_LookupError - PyCodec_Register=python33.PyCodec_Register - PyCodec_RegisterError=python33.PyCodec_RegisterError - PyCodec_ReplaceErrors=python33.PyCodec_ReplaceErrors - PyCodec_StreamReader=python33.PyCodec_StreamReader - PyCodec_StreamWriter=python33.PyCodec_StreamWriter - PyCodec_StrictErrors=python33.PyCodec_StrictErrors - PyCodec_XMLCharRefReplaceErrors=python33.PyCodec_XMLCharRefReplaceErrors - PyComplex_FromDoubles=python33.PyComplex_FromDoubles - PyComplex_ImagAsDouble=python33.PyComplex_ImagAsDouble - PyComplex_RealAsDouble=python33.PyComplex_RealAsDouble - PyComplex_Type=python33.PyComplex_Type DATA - PyDescr_NewClassMethod=python33.PyDescr_NewClassMethod - PyDescr_NewGetSet=python33.PyDescr_NewGetSet - PyDescr_NewMember=python33.PyDescr_NewMember - PyDescr_NewMethod=python33.PyDescr_NewMethod - PyDictItems_Type=python33.PyDictItems_Type DATA - PyDictIterItem_Type=python33.PyDictIterItem_Type DATA - PyDictIterKey_Type=python33.PyDictIterKey_Type DATA - PyDictIterValue_Type=python33.PyDictIterValue_Type DATA - PyDictKeys_Type=python33.PyDictKeys_Type DATA - PyDictProxy_New=python33.PyDictProxy_New - PyDictProxy_Type=python33.PyDictProxy_Type DATA - PyDictValues_Type=python33.PyDictValues_Type DATA - PyDict_Clear=python33.PyDict_Clear - PyDict_Contains=python33.PyDict_Contains - PyDict_Copy=python33.PyDict_Copy - PyDict_DelItem=python33.PyDict_DelItem - PyDict_DelItemString=python33.PyDict_DelItemString - PyDict_GetItem=python33.PyDict_GetItem - PyDict_GetItemString=python33.PyDict_GetItemString - PyDict_GetItemWithError=python33.PyDict_GetItemWithError - PyDict_Items=python33.PyDict_Items - PyDict_Keys=python33.PyDict_Keys - PyDict_Merge=python33.PyDict_Merge - PyDict_MergeFromSeq2=python33.PyDict_MergeFromSeq2 - PyDict_New=python33.PyDict_New - PyDict_Next=python33.PyDict_Next - PyDict_SetItem=python33.PyDict_SetItem - PyDict_SetItemString=python33.PyDict_SetItemString - PyDict_Size=python33.PyDict_Size - PyDict_Type=python33.PyDict_Type DATA - PyDict_Update=python33.PyDict_Update - PyDict_Values=python33.PyDict_Values - PyEllipsis_Type=python33.PyEllipsis_Type DATA - PyEnum_Type=python33.PyEnum_Type DATA - PyErr_BadArgument=python33.PyErr_BadArgument - PyErr_BadInternalCall=python33.PyErr_BadInternalCall - PyErr_CheckSignals=python33.PyErr_CheckSignals - PyErr_Clear=python33.PyErr_Clear - PyErr_Display=python33.PyErr_Display - PyErr_ExceptionMatches=python33.PyErr_ExceptionMatches - PyErr_Fetch=python33.PyErr_Fetch - PyErr_Format=python33.PyErr_Format - PyErr_GivenExceptionMatches=python33.PyErr_GivenExceptionMatches - PyErr_NewException=python33.PyErr_NewException - PyErr_NewExceptionWithDoc=python33.PyErr_NewExceptionWithDoc - PyErr_NoMemory=python33.PyErr_NoMemory - PyErr_NormalizeException=python33.PyErr_NormalizeException - PyErr_Occurred=python33.PyErr_Occurred - PyErr_Print=python33.PyErr_Print - PyErr_PrintEx=python33.PyErr_PrintEx - PyErr_ProgramText=python33.PyErr_ProgramText - PyErr_Restore=python33.PyErr_Restore - PyErr_SetFromErrno=python33.PyErr_SetFromErrno - PyErr_SetFromErrnoWithFilename=python33.PyErr_SetFromErrnoWithFilename - PyErr_SetFromErrnoWithFilenameObject=python33.PyErr_SetFromErrnoWithFilenameObject - PyErr_SetInterrupt=python33.PyErr_SetInterrupt - PyErr_SetNone=python33.PyErr_SetNone - PyErr_SetObject=python33.PyErr_SetObject - PyErr_SetString=python33.PyErr_SetString - PyErr_SyntaxLocation=python33.PyErr_SyntaxLocation - PyErr_WarnEx=python33.PyErr_WarnEx - PyErr_WarnExplicit=python33.PyErr_WarnExplicit - 
PyErr_WarnFormat=python33.PyErr_WarnFormat - PyErr_WriteUnraisable=python33.PyErr_WriteUnraisable - PyEval_AcquireLock=python33.PyEval_AcquireLock - PyEval_AcquireThread=python33.PyEval_AcquireThread - PyEval_CallFunction=python33.PyEval_CallFunction - PyEval_CallMethod=python33.PyEval_CallMethod - PyEval_CallObjectWithKeywords=python33.PyEval_CallObjectWithKeywords - PyEval_EvalCode=python33.PyEval_EvalCode - PyEval_EvalCodeEx=python33.PyEval_EvalCodeEx - PyEval_EvalFrame=python33.PyEval_EvalFrame - PyEval_EvalFrameEx=python33.PyEval_EvalFrameEx - PyEval_GetBuiltins=python33.PyEval_GetBuiltins - PyEval_GetCallStats=python33.PyEval_GetCallStats - PyEval_GetFrame=python33.PyEval_GetFrame - PyEval_GetFuncDesc=python33.PyEval_GetFuncDesc - PyEval_GetFuncName=python33.PyEval_GetFuncName - PyEval_GetGlobals=python33.PyEval_GetGlobals - PyEval_GetLocals=python33.PyEval_GetLocals - PyEval_InitThreads=python33.PyEval_InitThreads - PyEval_ReInitThreads=python33.PyEval_ReInitThreads - PyEval_ReleaseLock=python33.PyEval_ReleaseLock - PyEval_ReleaseThread=python33.PyEval_ReleaseThread - PyEval_RestoreThread=python33.PyEval_RestoreThread - PyEval_SaveThread=python33.PyEval_SaveThread - PyEval_ThreadsInitialized=python33.PyEval_ThreadsInitialized - PyExc_ArithmeticError=python33.PyExc_ArithmeticError DATA - PyExc_AssertionError=python33.PyExc_AssertionError DATA - PyExc_AttributeError=python33.PyExc_AttributeError DATA - PyExc_BaseException=python33.PyExc_BaseException DATA - PyExc_BufferError=python33.PyExc_BufferError DATA - PyExc_BytesWarning=python33.PyExc_BytesWarning DATA - PyExc_DeprecationWarning=python33.PyExc_DeprecationWarning DATA - PyExc_EOFError=python33.PyExc_EOFError DATA - PyExc_EnvironmentError=python33.PyExc_EnvironmentError DATA - PyExc_Exception=python33.PyExc_Exception DATA - PyExc_FloatingPointError=python33.PyExc_FloatingPointError DATA - PyExc_FutureWarning=python33.PyExc_FutureWarning DATA - PyExc_GeneratorExit=python33.PyExc_GeneratorExit DATA - PyExc_IOError=python33.PyExc_IOError DATA - PyExc_ImportError=python33.PyExc_ImportError DATA - PyExc_ImportWarning=python33.PyExc_ImportWarning DATA - PyExc_IndentationError=python33.PyExc_IndentationError DATA - PyExc_IndexError=python33.PyExc_IndexError DATA - PyExc_KeyError=python33.PyExc_KeyError DATA - PyExc_KeyboardInterrupt=python33.PyExc_KeyboardInterrupt DATA - PyExc_LookupError=python33.PyExc_LookupError DATA - PyExc_MemoryError=python33.PyExc_MemoryError DATA - PyExc_MemoryErrorInst=python33.PyExc_MemoryErrorInst DATA - PyExc_NameError=python33.PyExc_NameError DATA - PyExc_NotImplementedError=python33.PyExc_NotImplementedError DATA - PyExc_OSError=python33.PyExc_OSError DATA - PyExc_OverflowError=python33.PyExc_OverflowError DATA - PyExc_PendingDeprecationWarning=python33.PyExc_PendingDeprecationWarning DATA - PyExc_RecursionErrorInst=python33.PyExc_RecursionErrorInst DATA - PyExc_ReferenceError=python33.PyExc_ReferenceError DATA - PyExc_RuntimeError=python33.PyExc_RuntimeError DATA - PyExc_RuntimeWarning=python33.PyExc_RuntimeWarning DATA - PyExc_StopIteration=python33.PyExc_StopIteration DATA - PyExc_SyntaxError=python33.PyExc_SyntaxError DATA - PyExc_SyntaxWarning=python33.PyExc_SyntaxWarning DATA - PyExc_SystemError=python33.PyExc_SystemError DATA - PyExc_SystemExit=python33.PyExc_SystemExit DATA - PyExc_TabError=python33.PyExc_TabError DATA - PyExc_TypeError=python33.PyExc_TypeError DATA - PyExc_UnboundLocalError=python33.PyExc_UnboundLocalError DATA - PyExc_UnicodeDecodeError=python33.PyExc_UnicodeDecodeError DATA - 
PyExc_UnicodeEncodeError=python33.PyExc_UnicodeEncodeError DATA - PyExc_UnicodeError=python33.PyExc_UnicodeError DATA - PyExc_UnicodeTranslateError=python33.PyExc_UnicodeTranslateError DATA - PyExc_UnicodeWarning=python33.PyExc_UnicodeWarning DATA - PyExc_UserWarning=python33.PyExc_UserWarning DATA - PyExc_ValueError=python33.PyExc_ValueError DATA - PyExc_Warning=python33.PyExc_Warning DATA - PyExc_ZeroDivisionError=python33.PyExc_ZeroDivisionError DATA - PyException_GetCause=python33.PyException_GetCause - PyException_GetContext=python33.PyException_GetContext - PyException_GetTraceback=python33.PyException_GetTraceback - PyException_SetCause=python33.PyException_SetCause - PyException_SetContext=python33.PyException_SetContext - PyException_SetTraceback=python33.PyException_SetTraceback - PyFile_FromFd=python33.PyFile_FromFd - PyFile_GetLine=python33.PyFile_GetLine - PyFile_WriteObject=python33.PyFile_WriteObject - PyFile_WriteString=python33.PyFile_WriteString - PyFilter_Type=python33.PyFilter_Type DATA - PyFloat_AsDouble=python33.PyFloat_AsDouble - PyFloat_FromDouble=python33.PyFloat_FromDouble - PyFloat_FromString=python33.PyFloat_FromString - PyFloat_GetInfo=python33.PyFloat_GetInfo - PyFloat_GetMax=python33.PyFloat_GetMax - PyFloat_GetMin=python33.PyFloat_GetMin - PyFloat_Type=python33.PyFloat_Type DATA - PyFrozenSet_New=python33.PyFrozenSet_New - PyFrozenSet_Type=python33.PyFrozenSet_Type DATA - PyGC_Collect=python33.PyGC_Collect - PyGILState_Ensure=python33.PyGILState_Ensure - PyGILState_GetThisThreadState=python33.PyGILState_GetThisThreadState - PyGILState_Release=python33.PyGILState_Release - PyGetSetDescr_Type=python33.PyGetSetDescr_Type DATA - PyImport_AddModule=python33.PyImport_AddModule - PyImport_AppendInittab=python33.PyImport_AppendInittab - PyImport_Cleanup=python33.PyImport_Cleanup - PyImport_ExecCodeModule=python33.PyImport_ExecCodeModule - PyImport_ExecCodeModuleEx=python33.PyImport_ExecCodeModuleEx - PyImport_ExecCodeModuleWithPathnames=python33.PyImport_ExecCodeModuleWithPathnames - PyImport_GetImporter=python33.PyImport_GetImporter - PyImport_GetMagicNumber=python33.PyImport_GetMagicNumber - PyImport_GetMagicTag=python33.PyImport_GetMagicTag - PyImport_GetModuleDict=python33.PyImport_GetModuleDict - PyImport_Import=python33.PyImport_Import - PyImport_ImportFrozenModule=python33.PyImport_ImportFrozenModule - PyImport_ImportModule=python33.PyImport_ImportModule - PyImport_ImportModuleLevel=python33.PyImport_ImportModuleLevel - PyImport_ImportModuleNoBlock=python33.PyImport_ImportModuleNoBlock - PyImport_ReloadModule=python33.PyImport_ReloadModule - PyInterpreterState_Clear=python33.PyInterpreterState_Clear - PyInterpreterState_Delete=python33.PyInterpreterState_Delete - PyInterpreterState_New=python33.PyInterpreterState_New - PyIter_Next=python33.PyIter_Next - PyListIter_Type=python33.PyListIter_Type DATA - PyListRevIter_Type=python33.PyListRevIter_Type DATA - PyList_Append=python33.PyList_Append - PyList_AsTuple=python33.PyList_AsTuple - PyList_GetItem=python33.PyList_GetItem - PyList_GetSlice=python33.PyList_GetSlice - PyList_Insert=python33.PyList_Insert - PyList_New=python33.PyList_New - PyList_Reverse=python33.PyList_Reverse - PyList_SetItem=python33.PyList_SetItem - PyList_SetSlice=python33.PyList_SetSlice - PyList_Size=python33.PyList_Size - PyList_Sort=python33.PyList_Sort - PyList_Type=python33.PyList_Type DATA - PyLongRangeIter_Type=python33.PyLongRangeIter_Type DATA - PyLong_AsDouble=python33.PyLong_AsDouble - PyLong_AsLong=python33.PyLong_AsLong - 
PyLong_AsLongAndOverflow=python33.PyLong_AsLongAndOverflow - PyLong_AsLongLong=python33.PyLong_AsLongLong - PyLong_AsLongLongAndOverflow=python33.PyLong_AsLongLongAndOverflow - PyLong_AsSize_t=python33.PyLong_AsSize_t - PyLong_AsSsize_t=python33.PyLong_AsSsize_t - PyLong_AsUnsignedLong=python33.PyLong_AsUnsignedLong - PyLong_AsUnsignedLongLong=python33.PyLong_AsUnsignedLongLong - PyLong_AsUnsignedLongLongMask=python33.PyLong_AsUnsignedLongLongMask - PyLong_AsUnsignedLongMask=python33.PyLong_AsUnsignedLongMask - PyLong_AsVoidPtr=python33.PyLong_AsVoidPtr - PyLong_FromDouble=python33.PyLong_FromDouble - PyLong_FromLong=python33.PyLong_FromLong - PyLong_FromLongLong=python33.PyLong_FromLongLong - PyLong_FromSize_t=python33.PyLong_FromSize_t - PyLong_FromSsize_t=python33.PyLong_FromSsize_t - PyLong_FromString=python33.PyLong_FromString - PyLong_FromUnsignedLong=python33.PyLong_FromUnsignedLong - PyLong_FromUnsignedLongLong=python33.PyLong_FromUnsignedLongLong - PyLong_FromVoidPtr=python33.PyLong_FromVoidPtr - PyLong_GetInfo=python33.PyLong_GetInfo - PyLong_Type=python33.PyLong_Type DATA - PyMap_Type=python33.PyMap_Type DATA - PyMapping_Check=python33.PyMapping_Check - PyMapping_GetItemString=python33.PyMapping_GetItemString - PyMapping_HasKey=python33.PyMapping_HasKey - PyMapping_HasKeyString=python33.PyMapping_HasKeyString - PyMapping_Items=python33.PyMapping_Items - PyMapping_Keys=python33.PyMapping_Keys - PyMapping_Length=python33.PyMapping_Length - PyMapping_SetItemString=python33.PyMapping_SetItemString - PyMapping_Size=python33.PyMapping_Size - PyMapping_Values=python33.PyMapping_Values - PyMem_Free=python33.PyMem_Free - PyMem_Malloc=python33.PyMem_Malloc - PyMem_Realloc=python33.PyMem_Realloc - PyMemberDescr_Type=python33.PyMemberDescr_Type DATA - PyMemoryView_FromObject=python33.PyMemoryView_FromObject - PyMemoryView_GetContiguous=python33.PyMemoryView_GetContiguous - PyMemoryView_Type=python33.PyMemoryView_Type DATA - PyMethodDescr_Type=python33.PyMethodDescr_Type DATA - PyModule_AddIntConstant=python33.PyModule_AddIntConstant - PyModule_AddObject=python33.PyModule_AddObject - PyModule_AddStringConstant=python33.PyModule_AddStringConstant - PyModule_Create2=python33.PyModule_Create2 - PyModule_GetDef=python33.PyModule_GetDef - PyModule_GetDict=python33.PyModule_GetDict - PyModule_GetFilename=python33.PyModule_GetFilename - PyModule_GetFilenameObject=python33.PyModule_GetFilenameObject - PyModule_GetName=python33.PyModule_GetName - PyModule_GetState=python33.PyModule_GetState - PyModule_New=python33.PyModule_New - PyModule_Type=python33.PyModule_Type DATA - PyNullImporter_Type=python33.PyNullImporter_Type DATA - PyNumber_Absolute=python33.PyNumber_Absolute - PyNumber_Add=python33.PyNumber_Add - PyNumber_And=python33.PyNumber_And - PyNumber_AsSsize_t=python33.PyNumber_AsSsize_t - PyNumber_Check=python33.PyNumber_Check - PyNumber_Divmod=python33.PyNumber_Divmod - PyNumber_Float=python33.PyNumber_Float - PyNumber_FloorDivide=python33.PyNumber_FloorDivide - PyNumber_InPlaceAdd=python33.PyNumber_InPlaceAdd - PyNumber_InPlaceAnd=python33.PyNumber_InPlaceAnd - PyNumber_InPlaceFloorDivide=python33.PyNumber_InPlaceFloorDivide - PyNumber_InPlaceLshift=python33.PyNumber_InPlaceLshift - PyNumber_InPlaceMultiply=python33.PyNumber_InPlaceMultiply - PyNumber_InPlaceOr=python33.PyNumber_InPlaceOr - PyNumber_InPlacePower=python33.PyNumber_InPlacePower - PyNumber_InPlaceRemainder=python33.PyNumber_InPlaceRemainder - PyNumber_InPlaceRshift=python33.PyNumber_InPlaceRshift - 
PyNumber_InPlaceSubtract=python33.PyNumber_InPlaceSubtract - PyNumber_InPlaceTrueDivide=python33.PyNumber_InPlaceTrueDivide - PyNumber_InPlaceXor=python33.PyNumber_InPlaceXor - PyNumber_Index=python33.PyNumber_Index - PyNumber_Invert=python33.PyNumber_Invert - PyNumber_Long=python33.PyNumber_Long - PyNumber_Lshift=python33.PyNumber_Lshift - PyNumber_Multiply=python33.PyNumber_Multiply - PyNumber_Negative=python33.PyNumber_Negative - PyNumber_Or=python33.PyNumber_Or - PyNumber_Positive=python33.PyNumber_Positive - PyNumber_Power=python33.PyNumber_Power - PyNumber_Remainder=python33.PyNumber_Remainder - PyNumber_Rshift=python33.PyNumber_Rshift - PyNumber_Subtract=python33.PyNumber_Subtract - PyNumber_ToBase=python33.PyNumber_ToBase - PyNumber_TrueDivide=python33.PyNumber_TrueDivide - PyNumber_Xor=python33.PyNumber_Xor - PyOS_AfterFork=python33.PyOS_AfterFork - PyOS_InitInterrupts=python33.PyOS_InitInterrupts - PyOS_InputHook=python33.PyOS_InputHook DATA - PyOS_InterruptOccurred=python33.PyOS_InterruptOccurred - PyOS_ReadlineFunctionPointer=python33.PyOS_ReadlineFunctionPointer DATA - PyOS_double_to_string=python33.PyOS_double_to_string - PyOS_getsig=python33.PyOS_getsig - PyOS_mystricmp=python33.PyOS_mystricmp - PyOS_mystrnicmp=python33.PyOS_mystrnicmp - PyOS_setsig=python33.PyOS_setsig - PyOS_snprintf=python33.PyOS_snprintf - PyOS_string_to_double=python33.PyOS_string_to_double - PyOS_strtol=python33.PyOS_strtol - PyOS_strtoul=python33.PyOS_strtoul - PyOS_vsnprintf=python33.PyOS_vsnprintf - PyObject_ASCII=python33.PyObject_ASCII - PyObject_AsCharBuffer=python33.PyObject_AsCharBuffer - PyObject_AsFileDescriptor=python33.PyObject_AsFileDescriptor - PyObject_AsReadBuffer=python33.PyObject_AsReadBuffer - PyObject_AsWriteBuffer=python33.PyObject_AsWriteBuffer - PyObject_Bytes=python33.PyObject_Bytes - PyObject_Call=python33.PyObject_Call - PyObject_CallFunction=python33.PyObject_CallFunction - PyObject_CallFunctionObjArgs=python33.PyObject_CallFunctionObjArgs - PyObject_CallMethod=python33.PyObject_CallMethod - PyObject_CallMethodObjArgs=python33.PyObject_CallMethodObjArgs - PyObject_CallObject=python33.PyObject_CallObject - PyObject_CheckReadBuffer=python33.PyObject_CheckReadBuffer - PyObject_ClearWeakRefs=python33.PyObject_ClearWeakRefs - PyObject_DelItem=python33.PyObject_DelItem - PyObject_DelItemString=python33.PyObject_DelItemString - PyObject_Dir=python33.PyObject_Dir - PyObject_Format=python33.PyObject_Format - PyObject_Free=python33.PyObject_Free - PyObject_GC_Del=python33.PyObject_GC_Del - PyObject_GC_Track=python33.PyObject_GC_Track - PyObject_GC_UnTrack=python33.PyObject_GC_UnTrack - PyObject_GenericGetAttr=python33.PyObject_GenericGetAttr - PyObject_GenericSetAttr=python33.PyObject_GenericSetAttr - PyObject_GetAttr=python33.PyObject_GetAttr - PyObject_GetAttrString=python33.PyObject_GetAttrString - PyObject_GetItem=python33.PyObject_GetItem - PyObject_GetIter=python33.PyObject_GetIter - PyObject_HasAttr=python33.PyObject_HasAttr - PyObject_HasAttrString=python33.PyObject_HasAttrString - PyObject_Hash=python33.PyObject_Hash - PyObject_HashNotImplemented=python33.PyObject_HashNotImplemented - PyObject_Init=python33.PyObject_Init - PyObject_InitVar=python33.PyObject_InitVar - PyObject_IsInstance=python33.PyObject_IsInstance - PyObject_IsSubclass=python33.PyObject_IsSubclass - PyObject_IsTrue=python33.PyObject_IsTrue - PyObject_Length=python33.PyObject_Length - PyObject_Malloc=python33.PyObject_Malloc - PyObject_Not=python33.PyObject_Not - PyObject_Realloc=python33.PyObject_Realloc - 
PyObject_Repr=python33.PyObject_Repr - PyObject_RichCompare=python33.PyObject_RichCompare - PyObject_RichCompareBool=python33.PyObject_RichCompareBool - PyObject_SelfIter=python33.PyObject_SelfIter - PyObject_SetAttr=python33.PyObject_SetAttr - PyObject_SetAttrString=python33.PyObject_SetAttrString - PyObject_SetItem=python33.PyObject_SetItem - PyObject_Size=python33.PyObject_Size - PyObject_Str=python33.PyObject_Str - PyObject_Type=python33.PyObject_Type DATA - PyParser_SimpleParseFileFlags=python33.PyParser_SimpleParseFileFlags - PyParser_SimpleParseStringFlags=python33.PyParser_SimpleParseStringFlags - PyProperty_Type=python33.PyProperty_Type DATA - PyRangeIter_Type=python33.PyRangeIter_Type DATA - PyRange_Type=python33.PyRange_Type DATA - PyReversed_Type=python33.PyReversed_Type DATA - PySeqIter_New=python33.PySeqIter_New - PySeqIter_Type=python33.PySeqIter_Type DATA - PySequence_Check=python33.PySequence_Check - PySequence_Concat=python33.PySequence_Concat - PySequence_Contains=python33.PySequence_Contains - PySequence_Count=python33.PySequence_Count - PySequence_DelItem=python33.PySequence_DelItem - PySequence_DelSlice=python33.PySequence_DelSlice - PySequence_Fast=python33.PySequence_Fast - PySequence_GetItem=python33.PySequence_GetItem - PySequence_GetSlice=python33.PySequence_GetSlice - PySequence_In=python33.PySequence_In - PySequence_InPlaceConcat=python33.PySequence_InPlaceConcat - PySequence_InPlaceRepeat=python33.PySequence_InPlaceRepeat - PySequence_Index=python33.PySequence_Index - PySequence_Length=python33.PySequence_Length - PySequence_List=python33.PySequence_List - PySequence_Repeat=python33.PySequence_Repeat - PySequence_SetItem=python33.PySequence_SetItem - PySequence_SetSlice=python33.PySequence_SetSlice - PySequence_Size=python33.PySequence_Size - PySequence_Tuple=python33.PySequence_Tuple - PySetIter_Type=python33.PySetIter_Type DATA - PySet_Add=python33.PySet_Add - PySet_Clear=python33.PySet_Clear - PySet_Contains=python33.PySet_Contains - PySet_Discard=python33.PySet_Discard - PySet_New=python33.PySet_New - PySet_Pop=python33.PySet_Pop - PySet_Size=python33.PySet_Size - PySet_Type=python33.PySet_Type DATA - PySlice_GetIndices=python33.PySlice_GetIndices - PySlice_GetIndicesEx=python33.PySlice_GetIndicesEx - PySlice_New=python33.PySlice_New - PySlice_Type=python33.PySlice_Type DATA - PySortWrapper_Type=python33.PySortWrapper_Type DATA - PyState_FindModule=python33.PyState_FindModule - PyStructSequence_GetItem=python33.PyStructSequence_GetItem - PyStructSequence_New=python33.PyStructSequence_New - PyStructSequence_NewType=python33.PyStructSequence_NewType - PyStructSequence_SetItem=python33.PyStructSequence_SetItem - PySuper_Type=python33.PySuper_Type DATA - PySys_AddWarnOption=python33.PySys_AddWarnOption - PySys_AddWarnOptionUnicode=python33.PySys_AddWarnOptionUnicode - PySys_FormatStderr=python33.PySys_FormatStderr - PySys_FormatStdout=python33.PySys_FormatStdout - PySys_GetObject=python33.PySys_GetObject - PySys_HasWarnOptions=python33.PySys_HasWarnOptions - PySys_ResetWarnOptions=python33.PySys_ResetWarnOptions - PySys_SetArgv=python33.PySys_SetArgv - PySys_SetArgvEx=python33.PySys_SetArgvEx - PySys_SetObject=python33.PySys_SetObject - PySys_SetPath=python33.PySys_SetPath - PySys_WriteStderr=python33.PySys_WriteStderr - PySys_WriteStdout=python33.PySys_WriteStdout - PyThreadState_Clear=python33.PyThreadState_Clear - PyThreadState_Delete=python33.PyThreadState_Delete - PyThreadState_DeleteCurrent=python33.PyThreadState_DeleteCurrent - 
PyThreadState_Get=python33.PyThreadState_Get - PyThreadState_GetDict=python33.PyThreadState_GetDict - PyThreadState_New=python33.PyThreadState_New - PyThreadState_SetAsyncExc=python33.PyThreadState_SetAsyncExc - PyThreadState_Swap=python33.PyThreadState_Swap - PyTraceBack_Here=python33.PyTraceBack_Here - PyTraceBack_Print=python33.PyTraceBack_Print - PyTraceBack_Type=python33.PyTraceBack_Type DATA - PyTupleIter_Type=python33.PyTupleIter_Type DATA - PyTuple_ClearFreeList=python33.PyTuple_ClearFreeList - PyTuple_GetItem=python33.PyTuple_GetItem - PyTuple_GetSlice=python33.PyTuple_GetSlice - PyTuple_New=python33.PyTuple_New - PyTuple_Pack=python33.PyTuple_Pack - PyTuple_SetItem=python33.PyTuple_SetItem - PyTuple_Size=python33.PyTuple_Size - PyTuple_Type=python33.PyTuple_Type DATA - PyType_ClearCache=python33.PyType_ClearCache - PyType_FromSpec=python33.PyType_FromSpec - PyType_GenericAlloc=python33.PyType_GenericAlloc - PyType_GenericNew=python33.PyType_GenericNew - PyType_GetFlags=python33.PyType_GetFlags - PyType_IsSubtype=python33.PyType_IsSubtype - PyType_Modified=python33.PyType_Modified - PyType_Ready=python33.PyType_Ready - PyType_Type=python33.PyType_Type DATA - PyUnicodeDecodeError_Create=python33.PyUnicodeDecodeError_Create - PyUnicodeDecodeError_GetEncoding=python33.PyUnicodeDecodeError_GetEncoding - PyUnicodeDecodeError_GetEnd=python33.PyUnicodeDecodeError_GetEnd - PyUnicodeDecodeError_GetObject=python33.PyUnicodeDecodeError_GetObject - PyUnicodeDecodeError_GetReason=python33.PyUnicodeDecodeError_GetReason - PyUnicodeDecodeError_GetStart=python33.PyUnicodeDecodeError_GetStart - PyUnicodeDecodeError_SetEnd=python33.PyUnicodeDecodeError_SetEnd - PyUnicodeDecodeError_SetReason=python33.PyUnicodeDecodeError_SetReason - PyUnicodeDecodeError_SetStart=python33.PyUnicodeDecodeError_SetStart - PyUnicodeEncodeError_GetEncoding=python33.PyUnicodeEncodeError_GetEncoding - PyUnicodeEncodeError_GetEnd=python33.PyUnicodeEncodeError_GetEnd - PyUnicodeEncodeError_GetObject=python33.PyUnicodeEncodeError_GetObject - PyUnicodeEncodeError_GetReason=python33.PyUnicodeEncodeError_GetReason - PyUnicodeEncodeError_GetStart=python33.PyUnicodeEncodeError_GetStart - PyUnicodeEncodeError_SetEnd=python33.PyUnicodeEncodeError_SetEnd - PyUnicodeEncodeError_SetReason=python33.PyUnicodeEncodeError_SetReason - PyUnicodeEncodeError_SetStart=python33.PyUnicodeEncodeError_SetStart - PyUnicodeIter_Type=python33.PyUnicodeIter_Type DATA - PyUnicodeTranslateError_GetEnd=python33.PyUnicodeTranslateError_GetEnd - PyUnicodeTranslateError_GetObject=python33.PyUnicodeTranslateError_GetObject - PyUnicodeTranslateError_GetReason=python33.PyUnicodeTranslateError_GetReason - PyUnicodeTranslateError_GetStart=python33.PyUnicodeTranslateError_GetStart - PyUnicodeTranslateError_SetEnd=python33.PyUnicodeTranslateError_SetEnd - PyUnicodeTranslateError_SetReason=python33.PyUnicodeTranslateError_SetReason - PyUnicodeTranslateError_SetStart=python33.PyUnicodeTranslateError_SetStart - PyUnicode_Append=python33.PyUnicodeUCS2_Append - PyUnicode_AppendAndDel=python33.PyUnicodeUCS2_AppendAndDel - PyUnicode_AsASCIIString=python33.PyUnicodeUCS2_AsASCIIString - PyUnicode_AsCharmapString=python33.PyUnicodeUCS2_AsCharmapString - PyUnicode_AsDecodedObject=python33.PyUnicodeUCS2_AsDecodedObject - PyUnicode_AsDecodedUnicode=python33.PyUnicodeUCS2_AsDecodedUnicode - PyUnicode_AsEncodedObject=python33.PyUnicodeUCS2_AsEncodedObject - PyUnicode_AsEncodedString=python33.PyUnicodeUCS2_AsEncodedString - 
PyUnicode_AsEncodedUnicode=python33.PyUnicodeUCS2_AsEncodedUnicode - PyUnicode_AsLatin1String=python33.PyUnicodeUCS2_AsLatin1String - PyUnicode_AsRawUnicodeEscapeString=python33.PyUnicodeUCS2_AsRawUnicodeEscapeString - PyUnicode_AsUTF16String=python33.PyUnicodeUCS2_AsUTF16String - PyUnicode_AsUTF32String=python33.PyUnicodeUCS2_AsUTF32String - PyUnicode_AsUTF8String=python33.PyUnicodeUCS2_AsUTF8String - PyUnicode_AsUnicodeEscapeString=python33.PyUnicodeUCS2_AsUnicodeEscapeString - PyUnicode_AsWideChar=python33.PyUnicodeUCS2_AsWideChar - PyUnicode_ClearFreelist=python33.PyUnicodeUCS2_ClearFreelist - PyUnicode_Compare=python33.PyUnicodeUCS2_Compare - PyUnicode_Concat=python33.PyUnicodeUCS2_Concat - PyUnicode_Contains=python33.PyUnicodeUCS2_Contains - PyUnicode_Count=python33.PyUnicodeUCS2_Count - PyUnicode_Decode=python33.PyUnicodeUCS2_Decode - PyUnicode_DecodeASCII=python33.PyUnicodeUCS2_DecodeASCII - PyUnicode_DecodeCharmap=python33.PyUnicodeUCS2_DecodeCharmap - PyUnicode_DecodeFSDefault=python33.PyUnicodeUCS2_DecodeFSDefault - PyUnicode_DecodeFSDefaultAndSize=python33.PyUnicodeUCS2_DecodeFSDefaultAndSize - PyUnicode_DecodeLatin1=python33.PyUnicodeUCS2_DecodeLatin1 - PyUnicode_DecodeRawUnicodeEscape=python33.PyUnicodeUCS2_DecodeRawUnicodeEscape - PyUnicode_DecodeUTF16=python33.PyUnicodeUCS2_DecodeUTF16 - PyUnicode_DecodeUTF16Stateful=python33.PyUnicodeUCS2_DecodeUTF16Stateful - PyUnicode_DecodeUTF32=python33.PyUnicodeUCS2_DecodeUTF32 - PyUnicode_DecodeUTF32Stateful=python33.PyUnicodeUCS2_DecodeUTF32Stateful - PyUnicode_DecodeUTF8=python33.PyUnicodeUCS2_DecodeUTF8 - PyUnicode_DecodeUTF8Stateful=python33.PyUnicodeUCS2_DecodeUTF8Stateful - PyUnicode_DecodeUnicodeEscape=python33.PyUnicodeUCS2_DecodeUnicodeEscape - PyUnicode_FSConverter=python33.PyUnicodeUCS2_FSConverter - PyUnicode_FSDecoder=python33.PyUnicodeUCS2_FSDecoder - PyUnicode_Find=python33.PyUnicodeUCS2_Find - PyUnicode_Format=python33.PyUnicodeUCS2_Format - PyUnicode_FromEncodedObject=python33.PyUnicodeUCS2_FromEncodedObject - PyUnicode_FromFormat=python33.PyUnicodeUCS2_FromFormat - PyUnicode_FromFormatV=python33.PyUnicodeUCS2_FromFormatV - PyUnicode_FromObject=python33.PyUnicodeUCS2_FromObject - PyUnicode_FromOrdinal=python33.PyUnicodeUCS2_FromOrdinal - PyUnicode_FromString=python33.PyUnicodeUCS2_FromString - PyUnicode_FromStringAndSize=python33.PyUnicodeUCS2_FromStringAndSize - PyUnicode_FromWideChar=python33.PyUnicodeUCS2_FromWideChar - PyUnicode_GetDefaultEncoding=python33.PyUnicodeUCS2_GetDefaultEncoding - PyUnicode_GetSize=python33.PyUnicodeUCS2_GetSize - PyUnicode_IsIdentifier=python33.PyUnicodeUCS2_IsIdentifier - PyUnicode_Join=python33.PyUnicodeUCS2_Join - PyUnicode_Partition=python33.PyUnicodeUCS2_Partition - PyUnicode_RPartition=python33.PyUnicodeUCS2_RPartition - PyUnicode_RSplit=python33.PyUnicodeUCS2_RSplit - PyUnicode_Replace=python33.PyUnicodeUCS2_Replace - PyUnicode_Resize=python33.PyUnicodeUCS2_Resize - PyUnicode_RichCompare=python33.PyUnicodeUCS2_RichCompare - PyUnicode_SetDefaultEncoding=python33.PyUnicodeUCS2_SetDefaultEncoding - PyUnicode_Split=python33.PyUnicodeUCS2_Split - PyUnicode_Splitlines=python33.PyUnicodeUCS2_Splitlines - PyUnicode_Tailmatch=python33.PyUnicodeUCS2_Tailmatch - PyUnicode_Translate=python33.PyUnicodeUCS2_Translate - PyUnicode_BuildEncodingMap=python33.PyUnicode_BuildEncodingMap - PyUnicode_CompareWithASCIIString=python33.PyUnicode_CompareWithASCIIString - PyUnicode_DecodeUTF7=python33.PyUnicode_DecodeUTF7 - PyUnicode_DecodeUTF7Stateful=python33.PyUnicode_DecodeUTF7Stateful - 
PyUnicode_EncodeFSDefault=python33.PyUnicode_EncodeFSDefault - PyUnicode_InternFromString=python33.PyUnicode_InternFromString - PyUnicode_InternImmortal=python33.PyUnicode_InternImmortal - PyUnicode_InternInPlace=python33.PyUnicode_InternInPlace - PyUnicode_Type=python33.PyUnicode_Type DATA - PyWeakref_GetObject=python33.PyWeakref_GetObject DATA - PyWeakref_NewProxy=python33.PyWeakref_NewProxy - PyWeakref_NewRef=python33.PyWeakref_NewRef - PyWrapperDescr_Type=python33.PyWrapperDescr_Type DATA - PyWrapper_New=python33.PyWrapper_New - PyZip_Type=python33.PyZip_Type DATA - Py_AddPendingCall=python33.Py_AddPendingCall - Py_AtExit=python33.Py_AtExit - Py_BuildValue=python33.Py_BuildValue - Py_CompileString=python33.Py_CompileString - Py_DecRef=python33.Py_DecRef - Py_EndInterpreter=python33.Py_EndInterpreter - Py_Exit=python33.Py_Exit - Py_FatalError=python33.Py_FatalError - Py_FileSystemDefaultEncoding=python33.Py_FileSystemDefaultEncoding DATA - Py_Finalize=python33.Py_Finalize - Py_GetBuildInfo=python33.Py_GetBuildInfo - Py_GetCompiler=python33.Py_GetCompiler - Py_GetCopyright=python33.Py_GetCopyright - Py_GetExecPrefix=python33.Py_GetExecPrefix - Py_GetPath=python33.Py_GetPath - Py_GetPlatform=python33.Py_GetPlatform - Py_GetPrefix=python33.Py_GetPrefix - Py_GetProgramFullPath=python33.Py_GetProgramFullPath - Py_GetProgramName=python33.Py_GetProgramName - Py_GetPythonHome=python33.Py_GetPythonHome - Py_GetRecursionLimit=python33.Py_GetRecursionLimit - Py_GetVersion=python33.Py_GetVersion - Py_HasFileSystemDefaultEncoding=python33.Py_HasFileSystemDefaultEncoding DATA - Py_IncRef=python33.Py_IncRef - Py_Initialize=python33.Py_Initialize - Py_InitializeEx=python33.Py_InitializeEx - Py_IsInitialized=python33.Py_IsInitialized - Py_Main=python33.Py_Main - Py_MakePendingCalls=python33.Py_MakePendingCalls - Py_NewInterpreter=python33.Py_NewInterpreter - Py_ReprEnter=python33.Py_ReprEnter - Py_ReprLeave=python33.Py_ReprLeave - Py_SetProgramName=python33.Py_SetProgramName - Py_SetPythonHome=python33.Py_SetPythonHome - Py_SetRecursionLimit=python33.Py_SetRecursionLimit - Py_SymtableString=python33.Py_SymtableString - Py_VaBuildValue=python33.Py_VaBuildValue - _PyErr_BadInternalCall=python33._PyErr_BadInternalCall - _PyObject_CallFunction_SizeT=python33._PyObject_CallFunction_SizeT - _PyObject_CallMethod_SizeT=python33._PyObject_CallMethod_SizeT - _PyObject_GC_Malloc=python33._PyObject_GC_Malloc - _PyObject_GC_New=python33._PyObject_GC_New - _PyObject_GC_NewVar=python33._PyObject_GC_NewVar - _PyObject_GC_Resize=python33._PyObject_GC_Resize - _PyObject_New=python33._PyObject_New - _PyObject_NewVar=python33._PyObject_NewVar - _PyState_AddModule=python33._PyState_AddModule - _PyThreadState_Init=python33._PyThreadState_Init - _PyThreadState_Prealloc=python33._PyThreadState_Prealloc - _PyTrash_delete_later=python33._PyTrash_delete_later DATA - _PyTrash_delete_nesting=python33._PyTrash_delete_nesting DATA - _PyTrash_deposit_object=python33._PyTrash_deposit_object - _PyTrash_destroy_chain=python33._PyTrash_destroy_chain - _PyWeakref_CallableProxyType=python33._PyWeakref_CallableProxyType DATA - _PyWeakref_ProxyType=python33._PyWeakref_ProxyType DATA - _PyWeakref_RefType=python33._PyWeakref_RefType DATA - _Py_BuildValue_SizeT=python33._Py_BuildValue_SizeT - _Py_CheckRecursionLimit=python33._Py_CheckRecursionLimit DATA - _Py_CheckRecursiveCall=python33._Py_CheckRecursiveCall - _Py_Dealloc=python33._Py_Dealloc - _Py_EllipsisObject=python33._Py_EllipsisObject DATA - _Py_FalseStruct=python33._Py_FalseStruct DATA - 
_Py_NoneStruct=python33._Py_NoneStruct DATA - _Py_NotImplementedStruct=python33._Py_NotImplementedStruct DATA - _Py_SwappedOp=python33._Py_SwappedOp DATA - _Py_TrueStruct=python33._Py_TrueStruct DATA - _Py_VaBuildValue_SizeT=python33._Py_VaBuildValue_SizeT +LIBRARY "python3" +EXPORTS + PyArg_Parse=python33.PyArg_Parse + PyArg_ParseTuple=python33.PyArg_ParseTuple + PyArg_ParseTupleAndKeywords=python33.PyArg_ParseTupleAndKeywords + PyArg_UnpackTuple=python33.PyArg_UnpackTuple + PyArg_VaParse=python33.PyArg_VaParse + PyArg_VaParseTupleAndKeywords=python33.PyArg_VaParseTupleAndKeywords + PyArg_ValidateKeywordArguments=python33.PyArg_ValidateKeywordArguments + PyBaseObject_Type=python33.PyBaseObject_Type DATA + PyBool_FromLong=python33.PyBool_FromLong + PyBool_Type=python33.PyBool_Type DATA + PyByteArrayIter_Type=python33.PyByteArrayIter_Type DATA + PyByteArray_AsString=python33.PyByteArray_AsString + PyByteArray_Concat=python33.PyByteArray_Concat + PyByteArray_FromObject=python33.PyByteArray_FromObject + PyByteArray_FromStringAndSize=python33.PyByteArray_FromStringAndSize + PyByteArray_Resize=python33.PyByteArray_Resize + PyByteArray_Size=python33.PyByteArray_Size + PyByteArray_Type=python33.PyByteArray_Type DATA + PyBytesIter_Type=python33.PyBytesIter_Type DATA + PyBytes_AsString=python33.PyBytes_AsString + PyBytes_AsStringAndSize=python33.PyBytes_AsStringAndSize + PyBytes_Concat=python33.PyBytes_Concat + PyBytes_ConcatAndDel=python33.PyBytes_ConcatAndDel + PyBytes_DecodeEscape=python33.PyBytes_DecodeEscape + PyBytes_FromFormat=python33.PyBytes_FromFormat + PyBytes_FromFormatV=python33.PyBytes_FromFormatV + PyBytes_FromObject=python33.PyBytes_FromObject + PyBytes_FromString=python33.PyBytes_FromString + PyBytes_FromStringAndSize=python33.PyBytes_FromStringAndSize + PyBytes_Repr=python33.PyBytes_Repr + PyBytes_Size=python33.PyBytes_Size + PyBytes_Type=python33.PyBytes_Type DATA + PyCFunction_Call=python33.PyCFunction_Call + PyCFunction_ClearFreeList=python33.PyCFunction_ClearFreeList + PyCFunction_GetFlags=python33.PyCFunction_GetFlags + PyCFunction_GetFunction=python33.PyCFunction_GetFunction + PyCFunction_GetSelf=python33.PyCFunction_GetSelf + PyCFunction_NewEx=python33.PyCFunction_NewEx + PyCFunction_Type=python33.PyCFunction_Type DATA + PyCallIter_New=python33.PyCallIter_New + PyCallIter_Type=python33.PyCallIter_Type DATA + PyCallable_Check=python33.PyCallable_Check + PyCapsule_GetContext=python33.PyCapsule_GetContext + PyCapsule_GetDestructor=python33.PyCapsule_GetDestructor + PyCapsule_GetName=python33.PyCapsule_GetName + PyCapsule_GetPointer=python33.PyCapsule_GetPointer + PyCapsule_Import=python33.PyCapsule_Import + PyCapsule_IsValid=python33.PyCapsule_IsValid + PyCapsule_New=python33.PyCapsule_New + PyCapsule_SetContext=python33.PyCapsule_SetContext + PyCapsule_SetDestructor=python33.PyCapsule_SetDestructor + PyCapsule_SetName=python33.PyCapsule_SetName + PyCapsule_SetPointer=python33.PyCapsule_SetPointer + PyCapsule_Type=python33.PyCapsule_Type DATA + PyClassMethodDescr_Type=python33.PyClassMethodDescr_Type DATA + PyCodec_BackslashReplaceErrors=python33.PyCodec_BackslashReplaceErrors + PyCodec_Decode=python33.PyCodec_Decode + PyCodec_Decoder=python33.PyCodec_Decoder + PyCodec_Encode=python33.PyCodec_Encode + PyCodec_Encoder=python33.PyCodec_Encoder + PyCodec_IgnoreErrors=python33.PyCodec_IgnoreErrors + PyCodec_IncrementalDecoder=python33.PyCodec_IncrementalDecoder + PyCodec_IncrementalEncoder=python33.PyCodec_IncrementalEncoder + PyCodec_KnownEncoding=python33.PyCodec_KnownEncoding + 
PyCodec_LookupError=python33.PyCodec_LookupError + PyCodec_Register=python33.PyCodec_Register + PyCodec_RegisterError=python33.PyCodec_RegisterError + PyCodec_ReplaceErrors=python33.PyCodec_ReplaceErrors + PyCodec_StreamReader=python33.PyCodec_StreamReader + PyCodec_StreamWriter=python33.PyCodec_StreamWriter + PyCodec_StrictErrors=python33.PyCodec_StrictErrors + PyCodec_XMLCharRefReplaceErrors=python33.PyCodec_XMLCharRefReplaceErrors + PyComplex_FromDoubles=python33.PyComplex_FromDoubles + PyComplex_ImagAsDouble=python33.PyComplex_ImagAsDouble + PyComplex_RealAsDouble=python33.PyComplex_RealAsDouble + PyComplex_Type=python33.PyComplex_Type DATA + PyDescr_NewClassMethod=python33.PyDescr_NewClassMethod + PyDescr_NewGetSet=python33.PyDescr_NewGetSet + PyDescr_NewMember=python33.PyDescr_NewMember + PyDescr_NewMethod=python33.PyDescr_NewMethod + PyDictItems_Type=python33.PyDictItems_Type DATA + PyDictIterItem_Type=python33.PyDictIterItem_Type DATA + PyDictIterKey_Type=python33.PyDictIterKey_Type DATA + PyDictIterValue_Type=python33.PyDictIterValue_Type DATA + PyDictKeys_Type=python33.PyDictKeys_Type DATA + PyDictProxy_New=python33.PyDictProxy_New + PyDictProxy_Type=python33.PyDictProxy_Type DATA + PyDictValues_Type=python33.PyDictValues_Type DATA + PyDict_Clear=python33.PyDict_Clear + PyDict_Contains=python33.PyDict_Contains + PyDict_Copy=python33.PyDict_Copy + PyDict_DelItem=python33.PyDict_DelItem + PyDict_DelItemString=python33.PyDict_DelItemString + PyDict_GetItem=python33.PyDict_GetItem + PyDict_GetItemString=python33.PyDict_GetItemString + PyDict_GetItemWithError=python33.PyDict_GetItemWithError + PyDict_Items=python33.PyDict_Items + PyDict_Keys=python33.PyDict_Keys + PyDict_Merge=python33.PyDict_Merge + PyDict_MergeFromSeq2=python33.PyDict_MergeFromSeq2 + PyDict_New=python33.PyDict_New + PyDict_Next=python33.PyDict_Next + PyDict_SetItem=python33.PyDict_SetItem + PyDict_SetItemString=python33.PyDict_SetItemString + PyDict_Size=python33.PyDict_Size + PyDict_Type=python33.PyDict_Type DATA + PyDict_Update=python33.PyDict_Update + PyDict_Values=python33.PyDict_Values + PyEllipsis_Type=python33.PyEllipsis_Type DATA + PyEnum_Type=python33.PyEnum_Type DATA + PyErr_BadArgument=python33.PyErr_BadArgument + PyErr_BadInternalCall=python33.PyErr_BadInternalCall + PyErr_CheckSignals=python33.PyErr_CheckSignals + PyErr_Clear=python33.PyErr_Clear + PyErr_Display=python33.PyErr_Display + PyErr_ExceptionMatches=python33.PyErr_ExceptionMatches + PyErr_Fetch=python33.PyErr_Fetch + PyErr_Format=python33.PyErr_Format + PyErr_GivenExceptionMatches=python33.PyErr_GivenExceptionMatches + PyErr_NewException=python33.PyErr_NewException + PyErr_NewExceptionWithDoc=python33.PyErr_NewExceptionWithDoc + PyErr_NoMemory=python33.PyErr_NoMemory + PyErr_NormalizeException=python33.PyErr_NormalizeException + PyErr_Occurred=python33.PyErr_Occurred + PyErr_Print=python33.PyErr_Print + PyErr_PrintEx=python33.PyErr_PrintEx + PyErr_ProgramText=python33.PyErr_ProgramText + PyErr_Restore=python33.PyErr_Restore + PyErr_SetFromErrno=python33.PyErr_SetFromErrno + PyErr_SetFromErrnoWithFilename=python33.PyErr_SetFromErrnoWithFilename + PyErr_SetFromErrnoWithFilenameObject=python33.PyErr_SetFromErrnoWithFilenameObject + PyErr_SetInterrupt=python33.PyErr_SetInterrupt + PyErr_SetNone=python33.PyErr_SetNone + PyErr_SetObject=python33.PyErr_SetObject + PyErr_SetString=python33.PyErr_SetString + PyErr_SyntaxLocation=python33.PyErr_SyntaxLocation + PyErr_WarnEx=python33.PyErr_WarnEx + PyErr_WarnExplicit=python33.PyErr_WarnExplicit + 
PyErr_WarnFormat=python33.PyErr_WarnFormat + PyErr_WriteUnraisable=python33.PyErr_WriteUnraisable + PyEval_AcquireLock=python33.PyEval_AcquireLock + PyEval_AcquireThread=python33.PyEval_AcquireThread + PyEval_CallFunction=python33.PyEval_CallFunction + PyEval_CallMethod=python33.PyEval_CallMethod + PyEval_CallObjectWithKeywords=python33.PyEval_CallObjectWithKeywords + PyEval_EvalCode=python33.PyEval_EvalCode + PyEval_EvalCodeEx=python33.PyEval_EvalCodeEx + PyEval_EvalFrame=python33.PyEval_EvalFrame + PyEval_EvalFrameEx=python33.PyEval_EvalFrameEx + PyEval_GetBuiltins=python33.PyEval_GetBuiltins + PyEval_GetCallStats=python33.PyEval_GetCallStats + PyEval_GetFrame=python33.PyEval_GetFrame + PyEval_GetFuncDesc=python33.PyEval_GetFuncDesc + PyEval_GetFuncName=python33.PyEval_GetFuncName + PyEval_GetGlobals=python33.PyEval_GetGlobals + PyEval_GetLocals=python33.PyEval_GetLocals + PyEval_InitThreads=python33.PyEval_InitThreads + PyEval_ReInitThreads=python33.PyEval_ReInitThreads + PyEval_ReleaseLock=python33.PyEval_ReleaseLock + PyEval_ReleaseThread=python33.PyEval_ReleaseThread + PyEval_RestoreThread=python33.PyEval_RestoreThread + PyEval_SaveThread=python33.PyEval_SaveThread + PyEval_ThreadsInitialized=python33.PyEval_ThreadsInitialized + PyExc_ArithmeticError=python33.PyExc_ArithmeticError DATA + PyExc_AssertionError=python33.PyExc_AssertionError DATA + PyExc_AttributeError=python33.PyExc_AttributeError DATA + PyExc_BaseException=python33.PyExc_BaseException DATA + PyExc_BufferError=python33.PyExc_BufferError DATA + PyExc_BytesWarning=python33.PyExc_BytesWarning DATA + PyExc_DeprecationWarning=python33.PyExc_DeprecationWarning DATA + PyExc_EOFError=python33.PyExc_EOFError DATA + PyExc_EnvironmentError=python33.PyExc_EnvironmentError DATA + PyExc_Exception=python33.PyExc_Exception DATA + PyExc_FloatingPointError=python33.PyExc_FloatingPointError DATA + PyExc_FutureWarning=python33.PyExc_FutureWarning DATA + PyExc_GeneratorExit=python33.PyExc_GeneratorExit DATA + PyExc_IOError=python33.PyExc_IOError DATA + PyExc_ImportError=python33.PyExc_ImportError DATA + PyExc_ImportWarning=python33.PyExc_ImportWarning DATA + PyExc_IndentationError=python33.PyExc_IndentationError DATA + PyExc_IndexError=python33.PyExc_IndexError DATA + PyExc_KeyError=python33.PyExc_KeyError DATA + PyExc_KeyboardInterrupt=python33.PyExc_KeyboardInterrupt DATA + PyExc_LookupError=python33.PyExc_LookupError DATA + PyExc_MemoryError=python33.PyExc_MemoryError DATA + PyExc_MemoryErrorInst=python33.PyExc_MemoryErrorInst DATA + PyExc_NameError=python33.PyExc_NameError DATA + PyExc_NotImplementedError=python33.PyExc_NotImplementedError DATA + PyExc_OSError=python33.PyExc_OSError DATA + PyExc_OverflowError=python33.PyExc_OverflowError DATA + PyExc_PendingDeprecationWarning=python33.PyExc_PendingDeprecationWarning DATA + PyExc_RecursionErrorInst=python33.PyExc_RecursionErrorInst DATA + PyExc_ReferenceError=python33.PyExc_ReferenceError DATA + PyExc_RuntimeError=python33.PyExc_RuntimeError DATA + PyExc_RuntimeWarning=python33.PyExc_RuntimeWarning DATA + PyExc_StopIteration=python33.PyExc_StopIteration DATA + PyExc_SyntaxError=python33.PyExc_SyntaxError DATA + PyExc_SyntaxWarning=python33.PyExc_SyntaxWarning DATA + PyExc_SystemError=python33.PyExc_SystemError DATA + PyExc_SystemExit=python33.PyExc_SystemExit DATA + PyExc_TabError=python33.PyExc_TabError DATA + PyExc_TypeError=python33.PyExc_TypeError DATA + PyExc_UnboundLocalError=python33.PyExc_UnboundLocalError DATA + PyExc_UnicodeDecodeError=python33.PyExc_UnicodeDecodeError DATA + 
PyExc_UnicodeEncodeError=python33.PyExc_UnicodeEncodeError DATA + PyExc_UnicodeError=python33.PyExc_UnicodeError DATA + PyExc_UnicodeTranslateError=python33.PyExc_UnicodeTranslateError DATA + PyExc_UnicodeWarning=python33.PyExc_UnicodeWarning DATA + PyExc_UserWarning=python33.PyExc_UserWarning DATA + PyExc_ValueError=python33.PyExc_ValueError DATA + PyExc_Warning=python33.PyExc_Warning DATA + PyExc_ZeroDivisionError=python33.PyExc_ZeroDivisionError DATA + PyException_GetCause=python33.PyException_GetCause + PyException_GetContext=python33.PyException_GetContext + PyException_GetTraceback=python33.PyException_GetTraceback + PyException_SetCause=python33.PyException_SetCause + PyException_SetContext=python33.PyException_SetContext + PyException_SetTraceback=python33.PyException_SetTraceback + PyFile_FromFd=python33.PyFile_FromFd + PyFile_GetLine=python33.PyFile_GetLine + PyFile_WriteObject=python33.PyFile_WriteObject + PyFile_WriteString=python33.PyFile_WriteString + PyFilter_Type=python33.PyFilter_Type DATA + PyFloat_AsDouble=python33.PyFloat_AsDouble + PyFloat_FromDouble=python33.PyFloat_FromDouble + PyFloat_FromString=python33.PyFloat_FromString + PyFloat_GetInfo=python33.PyFloat_GetInfo + PyFloat_GetMax=python33.PyFloat_GetMax + PyFloat_GetMin=python33.PyFloat_GetMin + PyFloat_Type=python33.PyFloat_Type DATA + PyFrozenSet_New=python33.PyFrozenSet_New + PyFrozenSet_Type=python33.PyFrozenSet_Type DATA + PyGC_Collect=python33.PyGC_Collect + PyGILState_Ensure=python33.PyGILState_Ensure + PyGILState_GetThisThreadState=python33.PyGILState_GetThisThreadState + PyGILState_Release=python33.PyGILState_Release + PyGetSetDescr_Type=python33.PyGetSetDescr_Type DATA + PyImport_AddModule=python33.PyImport_AddModule + PyImport_AppendInittab=python33.PyImport_AppendInittab + PyImport_Cleanup=python33.PyImport_Cleanup + PyImport_ExecCodeModule=python33.PyImport_ExecCodeModule + PyImport_ExecCodeModuleEx=python33.PyImport_ExecCodeModuleEx + PyImport_ExecCodeModuleWithPathnames=python33.PyImport_ExecCodeModuleWithPathnames + PyImport_GetImporter=python33.PyImport_GetImporter + PyImport_GetMagicNumber=python33.PyImport_GetMagicNumber + PyImport_GetMagicTag=python33.PyImport_GetMagicTag + PyImport_GetModuleDict=python33.PyImport_GetModuleDict + PyImport_Import=python33.PyImport_Import + PyImport_ImportFrozenModule=python33.PyImport_ImportFrozenModule + PyImport_ImportModule=python33.PyImport_ImportModule + PyImport_ImportModuleLevel=python33.PyImport_ImportModuleLevel + PyImport_ImportModuleNoBlock=python33.PyImport_ImportModuleNoBlock + PyImport_ReloadModule=python33.PyImport_ReloadModule + PyInterpreterState_Clear=python33.PyInterpreterState_Clear + PyInterpreterState_Delete=python33.PyInterpreterState_Delete + PyInterpreterState_New=python33.PyInterpreterState_New + PyIter_Next=python33.PyIter_Next + PyListIter_Type=python33.PyListIter_Type DATA + PyListRevIter_Type=python33.PyListRevIter_Type DATA + PyList_Append=python33.PyList_Append + PyList_AsTuple=python33.PyList_AsTuple + PyList_GetItem=python33.PyList_GetItem + PyList_GetSlice=python33.PyList_GetSlice + PyList_Insert=python33.PyList_Insert + PyList_New=python33.PyList_New + PyList_Reverse=python33.PyList_Reverse + PyList_SetItem=python33.PyList_SetItem + PyList_SetSlice=python33.PyList_SetSlice + PyList_Size=python33.PyList_Size + PyList_Sort=python33.PyList_Sort + PyList_Type=python33.PyList_Type DATA + PyLongRangeIter_Type=python33.PyLongRangeIter_Type DATA + PyLong_AsDouble=python33.PyLong_AsDouble + PyLong_AsLong=python33.PyLong_AsLong + 
PyLong_AsLongAndOverflow=python33.PyLong_AsLongAndOverflow + PyLong_AsLongLong=python33.PyLong_AsLongLong + PyLong_AsLongLongAndOverflow=python33.PyLong_AsLongLongAndOverflow + PyLong_AsSize_t=python33.PyLong_AsSize_t + PyLong_AsSsize_t=python33.PyLong_AsSsize_t + PyLong_AsUnsignedLong=python33.PyLong_AsUnsignedLong + PyLong_AsUnsignedLongLong=python33.PyLong_AsUnsignedLongLong + PyLong_AsUnsignedLongLongMask=python33.PyLong_AsUnsignedLongLongMask + PyLong_AsUnsignedLongMask=python33.PyLong_AsUnsignedLongMask + PyLong_AsVoidPtr=python33.PyLong_AsVoidPtr + PyLong_FromDouble=python33.PyLong_FromDouble + PyLong_FromLong=python33.PyLong_FromLong + PyLong_FromLongLong=python33.PyLong_FromLongLong + PyLong_FromSize_t=python33.PyLong_FromSize_t + PyLong_FromSsize_t=python33.PyLong_FromSsize_t + PyLong_FromString=python33.PyLong_FromString + PyLong_FromUnsignedLong=python33.PyLong_FromUnsignedLong + PyLong_FromUnsignedLongLong=python33.PyLong_FromUnsignedLongLong + PyLong_FromVoidPtr=python33.PyLong_FromVoidPtr + PyLong_GetInfo=python33.PyLong_GetInfo + PyLong_Type=python33.PyLong_Type DATA + PyMap_Type=python33.PyMap_Type DATA + PyMapping_Check=python33.PyMapping_Check + PyMapping_GetItemString=python33.PyMapping_GetItemString + PyMapping_HasKey=python33.PyMapping_HasKey + PyMapping_HasKeyString=python33.PyMapping_HasKeyString + PyMapping_Items=python33.PyMapping_Items + PyMapping_Keys=python33.PyMapping_Keys + PyMapping_Length=python33.PyMapping_Length + PyMapping_SetItemString=python33.PyMapping_SetItemString + PyMapping_Size=python33.PyMapping_Size + PyMapping_Values=python33.PyMapping_Values + PyMem_Free=python33.PyMem_Free + PyMem_Malloc=python33.PyMem_Malloc + PyMem_Realloc=python33.PyMem_Realloc + PyMemberDescr_Type=python33.PyMemberDescr_Type DATA + PyMemoryView_FromObject=python33.PyMemoryView_FromObject + PyMemoryView_GetContiguous=python33.PyMemoryView_GetContiguous + PyMemoryView_Type=python33.PyMemoryView_Type DATA + PyMethodDescr_Type=python33.PyMethodDescr_Type DATA + PyModule_AddIntConstant=python33.PyModule_AddIntConstant + PyModule_AddObject=python33.PyModule_AddObject + PyModule_AddStringConstant=python33.PyModule_AddStringConstant + PyModule_Create2=python33.PyModule_Create2 + PyModule_GetDef=python33.PyModule_GetDef + PyModule_GetDict=python33.PyModule_GetDict + PyModule_GetFilename=python33.PyModule_GetFilename + PyModule_GetFilenameObject=python33.PyModule_GetFilenameObject + PyModule_GetName=python33.PyModule_GetName + PyModule_GetState=python33.PyModule_GetState + PyModule_New=python33.PyModule_New + PyModule_Type=python33.PyModule_Type DATA + PyNullImporter_Type=python33.PyNullImporter_Type DATA + PyNumber_Absolute=python33.PyNumber_Absolute + PyNumber_Add=python33.PyNumber_Add + PyNumber_And=python33.PyNumber_And + PyNumber_AsSsize_t=python33.PyNumber_AsSsize_t + PyNumber_Check=python33.PyNumber_Check + PyNumber_Divmod=python33.PyNumber_Divmod + PyNumber_Float=python33.PyNumber_Float + PyNumber_FloorDivide=python33.PyNumber_FloorDivide + PyNumber_InPlaceAdd=python33.PyNumber_InPlaceAdd + PyNumber_InPlaceAnd=python33.PyNumber_InPlaceAnd + PyNumber_InPlaceFloorDivide=python33.PyNumber_InPlaceFloorDivide + PyNumber_InPlaceLshift=python33.PyNumber_InPlaceLshift + PyNumber_InPlaceMultiply=python33.PyNumber_InPlaceMultiply + PyNumber_InPlaceOr=python33.PyNumber_InPlaceOr + PyNumber_InPlacePower=python33.PyNumber_InPlacePower + PyNumber_InPlaceRemainder=python33.PyNumber_InPlaceRemainder + PyNumber_InPlaceRshift=python33.PyNumber_InPlaceRshift + 
PyNumber_InPlaceSubtract=python33.PyNumber_InPlaceSubtract + PyNumber_InPlaceTrueDivide=python33.PyNumber_InPlaceTrueDivide + PyNumber_InPlaceXor=python33.PyNumber_InPlaceXor + PyNumber_Index=python33.PyNumber_Index + PyNumber_Invert=python33.PyNumber_Invert + PyNumber_Long=python33.PyNumber_Long + PyNumber_Lshift=python33.PyNumber_Lshift + PyNumber_Multiply=python33.PyNumber_Multiply + PyNumber_Negative=python33.PyNumber_Negative + PyNumber_Or=python33.PyNumber_Or + PyNumber_Positive=python33.PyNumber_Positive + PyNumber_Power=python33.PyNumber_Power + PyNumber_Remainder=python33.PyNumber_Remainder + PyNumber_Rshift=python33.PyNumber_Rshift + PyNumber_Subtract=python33.PyNumber_Subtract + PyNumber_ToBase=python33.PyNumber_ToBase + PyNumber_TrueDivide=python33.PyNumber_TrueDivide + PyNumber_Xor=python33.PyNumber_Xor + PyOS_AfterFork=python33.PyOS_AfterFork + PyOS_InitInterrupts=python33.PyOS_InitInterrupts + PyOS_InputHook=python33.PyOS_InputHook DATA + PyOS_InterruptOccurred=python33.PyOS_InterruptOccurred + PyOS_ReadlineFunctionPointer=python33.PyOS_ReadlineFunctionPointer DATA + PyOS_double_to_string=python33.PyOS_double_to_string + PyOS_getsig=python33.PyOS_getsig + PyOS_mystricmp=python33.PyOS_mystricmp + PyOS_mystrnicmp=python33.PyOS_mystrnicmp + PyOS_setsig=python33.PyOS_setsig + PyOS_snprintf=python33.PyOS_snprintf + PyOS_string_to_double=python33.PyOS_string_to_double + PyOS_strtol=python33.PyOS_strtol + PyOS_strtoul=python33.PyOS_strtoul + PyOS_vsnprintf=python33.PyOS_vsnprintf + PyObject_ASCII=python33.PyObject_ASCII + PyObject_AsCharBuffer=python33.PyObject_AsCharBuffer + PyObject_AsFileDescriptor=python33.PyObject_AsFileDescriptor + PyObject_AsReadBuffer=python33.PyObject_AsReadBuffer + PyObject_AsWriteBuffer=python33.PyObject_AsWriteBuffer + PyObject_Bytes=python33.PyObject_Bytes + PyObject_Call=python33.PyObject_Call + PyObject_CallFunction=python33.PyObject_CallFunction + PyObject_CallFunctionObjArgs=python33.PyObject_CallFunctionObjArgs + PyObject_CallMethod=python33.PyObject_CallMethod + PyObject_CallMethodObjArgs=python33.PyObject_CallMethodObjArgs + PyObject_CallObject=python33.PyObject_CallObject + PyObject_CheckReadBuffer=python33.PyObject_CheckReadBuffer + PyObject_ClearWeakRefs=python33.PyObject_ClearWeakRefs + PyObject_DelItem=python33.PyObject_DelItem + PyObject_DelItemString=python33.PyObject_DelItemString + PyObject_Dir=python33.PyObject_Dir + PyObject_Format=python33.PyObject_Format + PyObject_Free=python33.PyObject_Free + PyObject_GC_Del=python33.PyObject_GC_Del + PyObject_GC_Track=python33.PyObject_GC_Track + PyObject_GC_UnTrack=python33.PyObject_GC_UnTrack + PyObject_GenericGetAttr=python33.PyObject_GenericGetAttr + PyObject_GenericSetAttr=python33.PyObject_GenericSetAttr + PyObject_GetAttr=python33.PyObject_GetAttr + PyObject_GetAttrString=python33.PyObject_GetAttrString + PyObject_GetItem=python33.PyObject_GetItem + PyObject_GetIter=python33.PyObject_GetIter + PyObject_HasAttr=python33.PyObject_HasAttr + PyObject_HasAttrString=python33.PyObject_HasAttrString + PyObject_Hash=python33.PyObject_Hash + PyObject_HashNotImplemented=python33.PyObject_HashNotImplemented + PyObject_Init=python33.PyObject_Init + PyObject_InitVar=python33.PyObject_InitVar + PyObject_IsInstance=python33.PyObject_IsInstance + PyObject_IsSubclass=python33.PyObject_IsSubclass + PyObject_IsTrue=python33.PyObject_IsTrue + PyObject_Length=python33.PyObject_Length + PyObject_Malloc=python33.PyObject_Malloc + PyObject_Not=python33.PyObject_Not + PyObject_Realloc=python33.PyObject_Realloc + 
PyObject_Repr=python33.PyObject_Repr + PyObject_RichCompare=python33.PyObject_RichCompare + PyObject_RichCompareBool=python33.PyObject_RichCompareBool + PyObject_SelfIter=python33.PyObject_SelfIter + PyObject_SetAttr=python33.PyObject_SetAttr + PyObject_SetAttrString=python33.PyObject_SetAttrString + PyObject_SetItem=python33.PyObject_SetItem + PyObject_Size=python33.PyObject_Size + PyObject_Str=python33.PyObject_Str + PyObject_Type=python33.PyObject_Type DATA + PyParser_SimpleParseFileFlags=python33.PyParser_SimpleParseFileFlags + PyParser_SimpleParseStringFlags=python33.PyParser_SimpleParseStringFlags + PyProperty_Type=python33.PyProperty_Type DATA + PyRangeIter_Type=python33.PyRangeIter_Type DATA + PyRange_Type=python33.PyRange_Type DATA + PyReversed_Type=python33.PyReversed_Type DATA + PySeqIter_New=python33.PySeqIter_New + PySeqIter_Type=python33.PySeqIter_Type DATA + PySequence_Check=python33.PySequence_Check + PySequence_Concat=python33.PySequence_Concat + PySequence_Contains=python33.PySequence_Contains + PySequence_Count=python33.PySequence_Count + PySequence_DelItem=python33.PySequence_DelItem + PySequence_DelSlice=python33.PySequence_DelSlice + PySequence_Fast=python33.PySequence_Fast + PySequence_GetItem=python33.PySequence_GetItem + PySequence_GetSlice=python33.PySequence_GetSlice + PySequence_In=python33.PySequence_In + PySequence_InPlaceConcat=python33.PySequence_InPlaceConcat + PySequence_InPlaceRepeat=python33.PySequence_InPlaceRepeat + PySequence_Index=python33.PySequence_Index + PySequence_Length=python33.PySequence_Length + PySequence_List=python33.PySequence_List + PySequence_Repeat=python33.PySequence_Repeat + PySequence_SetItem=python33.PySequence_SetItem + PySequence_SetSlice=python33.PySequence_SetSlice + PySequence_Size=python33.PySequence_Size + PySequence_Tuple=python33.PySequence_Tuple + PySetIter_Type=python33.PySetIter_Type DATA + PySet_Add=python33.PySet_Add + PySet_Clear=python33.PySet_Clear + PySet_Contains=python33.PySet_Contains + PySet_Discard=python33.PySet_Discard + PySet_New=python33.PySet_New + PySet_Pop=python33.PySet_Pop + PySet_Size=python33.PySet_Size + PySet_Type=python33.PySet_Type DATA + PySlice_GetIndices=python33.PySlice_GetIndices + PySlice_GetIndicesEx=python33.PySlice_GetIndicesEx + PySlice_New=python33.PySlice_New + PySlice_Type=python33.PySlice_Type DATA + PySortWrapper_Type=python33.PySortWrapper_Type DATA + PyState_FindModule=python33.PyState_FindModule + PyStructSequence_GetItem=python33.PyStructSequence_GetItem + PyStructSequence_New=python33.PyStructSequence_New + PyStructSequence_NewType=python33.PyStructSequence_NewType + PyStructSequence_SetItem=python33.PyStructSequence_SetItem + PySuper_Type=python33.PySuper_Type DATA + PySys_AddWarnOption=python33.PySys_AddWarnOption + PySys_AddWarnOptionUnicode=python33.PySys_AddWarnOptionUnicode + PySys_FormatStderr=python33.PySys_FormatStderr + PySys_FormatStdout=python33.PySys_FormatStdout + PySys_GetObject=python33.PySys_GetObject + PySys_HasWarnOptions=python33.PySys_HasWarnOptions + PySys_ResetWarnOptions=python33.PySys_ResetWarnOptions + PySys_SetArgv=python33.PySys_SetArgv + PySys_SetArgvEx=python33.PySys_SetArgvEx + PySys_SetObject=python33.PySys_SetObject + PySys_SetPath=python33.PySys_SetPath + PySys_WriteStderr=python33.PySys_WriteStderr + PySys_WriteStdout=python33.PySys_WriteStdout + PyThreadState_Clear=python33.PyThreadState_Clear + PyThreadState_Delete=python33.PyThreadState_Delete + PyThreadState_DeleteCurrent=python33.PyThreadState_DeleteCurrent + 
PyThreadState_Get=python33.PyThreadState_Get + PyThreadState_GetDict=python33.PyThreadState_GetDict + PyThreadState_New=python33.PyThreadState_New + PyThreadState_SetAsyncExc=python33.PyThreadState_SetAsyncExc + PyThreadState_Swap=python33.PyThreadState_Swap + PyTraceBack_Here=python33.PyTraceBack_Here + PyTraceBack_Print=python33.PyTraceBack_Print + PyTraceBack_Type=python33.PyTraceBack_Type DATA + PyTupleIter_Type=python33.PyTupleIter_Type DATA + PyTuple_ClearFreeList=python33.PyTuple_ClearFreeList + PyTuple_GetItem=python33.PyTuple_GetItem + PyTuple_GetSlice=python33.PyTuple_GetSlice + PyTuple_New=python33.PyTuple_New + PyTuple_Pack=python33.PyTuple_Pack + PyTuple_SetItem=python33.PyTuple_SetItem + PyTuple_Size=python33.PyTuple_Size + PyTuple_Type=python33.PyTuple_Type DATA + PyType_ClearCache=python33.PyType_ClearCache + PyType_FromSpec=python33.PyType_FromSpec + PyType_GenericAlloc=python33.PyType_GenericAlloc + PyType_GenericNew=python33.PyType_GenericNew + PyType_GetFlags=python33.PyType_GetFlags + PyType_IsSubtype=python33.PyType_IsSubtype + PyType_Modified=python33.PyType_Modified + PyType_Ready=python33.PyType_Ready + PyType_Type=python33.PyType_Type DATA + PyUnicodeDecodeError_Create=python33.PyUnicodeDecodeError_Create + PyUnicodeDecodeError_GetEncoding=python33.PyUnicodeDecodeError_GetEncoding + PyUnicodeDecodeError_GetEnd=python33.PyUnicodeDecodeError_GetEnd + PyUnicodeDecodeError_GetObject=python33.PyUnicodeDecodeError_GetObject + PyUnicodeDecodeError_GetReason=python33.PyUnicodeDecodeError_GetReason + PyUnicodeDecodeError_GetStart=python33.PyUnicodeDecodeError_GetStart + PyUnicodeDecodeError_SetEnd=python33.PyUnicodeDecodeError_SetEnd + PyUnicodeDecodeError_SetReason=python33.PyUnicodeDecodeError_SetReason + PyUnicodeDecodeError_SetStart=python33.PyUnicodeDecodeError_SetStart + PyUnicodeEncodeError_GetEncoding=python33.PyUnicodeEncodeError_GetEncoding + PyUnicodeEncodeError_GetEnd=python33.PyUnicodeEncodeError_GetEnd + PyUnicodeEncodeError_GetObject=python33.PyUnicodeEncodeError_GetObject + PyUnicodeEncodeError_GetReason=python33.PyUnicodeEncodeError_GetReason + PyUnicodeEncodeError_GetStart=python33.PyUnicodeEncodeError_GetStart + PyUnicodeEncodeError_SetEnd=python33.PyUnicodeEncodeError_SetEnd + PyUnicodeEncodeError_SetReason=python33.PyUnicodeEncodeError_SetReason + PyUnicodeEncodeError_SetStart=python33.PyUnicodeEncodeError_SetStart + PyUnicodeIter_Type=python33.PyUnicodeIter_Type DATA + PyUnicodeTranslateError_GetEnd=python33.PyUnicodeTranslateError_GetEnd + PyUnicodeTranslateError_GetObject=python33.PyUnicodeTranslateError_GetObject + PyUnicodeTranslateError_GetReason=python33.PyUnicodeTranslateError_GetReason + PyUnicodeTranslateError_GetStart=python33.PyUnicodeTranslateError_GetStart + PyUnicodeTranslateError_SetEnd=python33.PyUnicodeTranslateError_SetEnd + PyUnicodeTranslateError_SetReason=python33.PyUnicodeTranslateError_SetReason + PyUnicodeTranslateError_SetStart=python33.PyUnicodeTranslateError_SetStart + PyUnicode_Append=python33.PyUnicodeUCS2_Append + PyUnicode_AppendAndDel=python33.PyUnicodeUCS2_AppendAndDel + PyUnicode_AsASCIIString=python33.PyUnicodeUCS2_AsASCIIString + PyUnicode_AsCharmapString=python33.PyUnicodeUCS2_AsCharmapString + PyUnicode_AsDecodedObject=python33.PyUnicodeUCS2_AsDecodedObject + PyUnicode_AsDecodedUnicode=python33.PyUnicodeUCS2_AsDecodedUnicode + PyUnicode_AsEncodedObject=python33.PyUnicodeUCS2_AsEncodedObject + PyUnicode_AsEncodedString=python33.PyUnicodeUCS2_AsEncodedString + 
PyUnicode_AsEncodedUnicode=python33.PyUnicodeUCS2_AsEncodedUnicode + PyUnicode_AsLatin1String=python33.PyUnicodeUCS2_AsLatin1String + PyUnicode_AsRawUnicodeEscapeString=python33.PyUnicodeUCS2_AsRawUnicodeEscapeString + PyUnicode_AsUTF16String=python33.PyUnicodeUCS2_AsUTF16String + PyUnicode_AsUTF32String=python33.PyUnicodeUCS2_AsUTF32String + PyUnicode_AsUTF8String=python33.PyUnicodeUCS2_AsUTF8String + PyUnicode_AsUnicodeEscapeString=python33.PyUnicodeUCS2_AsUnicodeEscapeString + PyUnicode_AsWideChar=python33.PyUnicodeUCS2_AsWideChar + PyUnicode_ClearFreelist=python33.PyUnicodeUCS2_ClearFreelist + PyUnicode_Compare=python33.PyUnicodeUCS2_Compare + PyUnicode_Concat=python33.PyUnicodeUCS2_Concat + PyUnicode_Contains=python33.PyUnicodeUCS2_Contains + PyUnicode_Count=python33.PyUnicodeUCS2_Count + PyUnicode_Decode=python33.PyUnicodeUCS2_Decode + PyUnicode_DecodeASCII=python33.PyUnicodeUCS2_DecodeASCII + PyUnicode_DecodeCharmap=python33.PyUnicodeUCS2_DecodeCharmap + PyUnicode_DecodeFSDefault=python33.PyUnicodeUCS2_DecodeFSDefault + PyUnicode_DecodeFSDefaultAndSize=python33.PyUnicodeUCS2_DecodeFSDefaultAndSize + PyUnicode_DecodeLatin1=python33.PyUnicodeUCS2_DecodeLatin1 + PyUnicode_DecodeRawUnicodeEscape=python33.PyUnicodeUCS2_DecodeRawUnicodeEscape + PyUnicode_DecodeUTF16=python33.PyUnicodeUCS2_DecodeUTF16 + PyUnicode_DecodeUTF16Stateful=python33.PyUnicodeUCS2_DecodeUTF16Stateful + PyUnicode_DecodeUTF32=python33.PyUnicodeUCS2_DecodeUTF32 + PyUnicode_DecodeUTF32Stateful=python33.PyUnicodeUCS2_DecodeUTF32Stateful + PyUnicode_DecodeUTF8=python33.PyUnicodeUCS2_DecodeUTF8 + PyUnicode_DecodeUTF8Stateful=python33.PyUnicodeUCS2_DecodeUTF8Stateful + PyUnicode_DecodeUnicodeEscape=python33.PyUnicodeUCS2_DecodeUnicodeEscape + PyUnicode_FSConverter=python33.PyUnicodeUCS2_FSConverter + PyUnicode_FSDecoder=python33.PyUnicodeUCS2_FSDecoder + PyUnicode_Find=python33.PyUnicodeUCS2_Find + PyUnicode_Format=python33.PyUnicodeUCS2_Format + PyUnicode_FromEncodedObject=python33.PyUnicodeUCS2_FromEncodedObject + PyUnicode_FromFormat=python33.PyUnicodeUCS2_FromFormat + PyUnicode_FromFormatV=python33.PyUnicodeUCS2_FromFormatV + PyUnicode_FromObject=python33.PyUnicodeUCS2_FromObject + PyUnicode_FromOrdinal=python33.PyUnicodeUCS2_FromOrdinal + PyUnicode_FromString=python33.PyUnicodeUCS2_FromString + PyUnicode_FromStringAndSize=python33.PyUnicodeUCS2_FromStringAndSize + PyUnicode_FromWideChar=python33.PyUnicodeUCS2_FromWideChar + PyUnicode_GetDefaultEncoding=python33.PyUnicodeUCS2_GetDefaultEncoding + PyUnicode_GetSize=python33.PyUnicodeUCS2_GetSize + PyUnicode_IsIdentifier=python33.PyUnicodeUCS2_IsIdentifier + PyUnicode_Join=python33.PyUnicodeUCS2_Join + PyUnicode_Partition=python33.PyUnicodeUCS2_Partition + PyUnicode_RPartition=python33.PyUnicodeUCS2_RPartition + PyUnicode_RSplit=python33.PyUnicodeUCS2_RSplit + PyUnicode_Replace=python33.PyUnicodeUCS2_Replace + PyUnicode_Resize=python33.PyUnicodeUCS2_Resize + PyUnicode_RichCompare=python33.PyUnicodeUCS2_RichCompare + PyUnicode_SetDefaultEncoding=python33.PyUnicodeUCS2_SetDefaultEncoding + PyUnicode_Split=python33.PyUnicodeUCS2_Split + PyUnicode_Splitlines=python33.PyUnicodeUCS2_Splitlines + PyUnicode_Tailmatch=python33.PyUnicodeUCS2_Tailmatch + PyUnicode_Translate=python33.PyUnicodeUCS2_Translate + PyUnicode_BuildEncodingMap=python33.PyUnicode_BuildEncodingMap + PyUnicode_CompareWithASCIIString=python33.PyUnicode_CompareWithASCIIString + PyUnicode_DecodeUTF7=python33.PyUnicode_DecodeUTF7 + PyUnicode_DecodeUTF7Stateful=python33.PyUnicode_DecodeUTF7Stateful + 
PyUnicode_EncodeFSDefault=python33.PyUnicode_EncodeFSDefault + PyUnicode_InternFromString=python33.PyUnicode_InternFromString + PyUnicode_InternImmortal=python33.PyUnicode_InternImmortal + PyUnicode_InternInPlace=python33.PyUnicode_InternInPlace + PyUnicode_Type=python33.PyUnicode_Type DATA + PyWeakref_GetObject=python33.PyWeakref_GetObject DATA + PyWeakref_NewProxy=python33.PyWeakref_NewProxy + PyWeakref_NewRef=python33.PyWeakref_NewRef + PyWrapperDescr_Type=python33.PyWrapperDescr_Type DATA + PyWrapper_New=python33.PyWrapper_New + PyZip_Type=python33.PyZip_Type DATA + Py_AddPendingCall=python33.Py_AddPendingCall + Py_AtExit=python33.Py_AtExit + Py_BuildValue=python33.Py_BuildValue + Py_CompileString=python33.Py_CompileString + Py_DecRef=python33.Py_DecRef + Py_EndInterpreter=python33.Py_EndInterpreter + Py_Exit=python33.Py_Exit + Py_FatalError=python33.Py_FatalError + Py_FileSystemDefaultEncoding=python33.Py_FileSystemDefaultEncoding DATA + Py_Finalize=python33.Py_Finalize + Py_GetBuildInfo=python33.Py_GetBuildInfo + Py_GetCompiler=python33.Py_GetCompiler + Py_GetCopyright=python33.Py_GetCopyright + Py_GetExecPrefix=python33.Py_GetExecPrefix + Py_GetPath=python33.Py_GetPath + Py_GetPlatform=python33.Py_GetPlatform + Py_GetPrefix=python33.Py_GetPrefix + Py_GetProgramFullPath=python33.Py_GetProgramFullPath + Py_GetProgramName=python33.Py_GetProgramName + Py_GetPythonHome=python33.Py_GetPythonHome + Py_GetRecursionLimit=python33.Py_GetRecursionLimit + Py_GetVersion=python33.Py_GetVersion + Py_HasFileSystemDefaultEncoding=python33.Py_HasFileSystemDefaultEncoding DATA + Py_IncRef=python33.Py_IncRef + Py_Initialize=python33.Py_Initialize + Py_InitializeEx=python33.Py_InitializeEx + Py_IsInitialized=python33.Py_IsInitialized + Py_Main=python33.Py_Main + Py_MakePendingCalls=python33.Py_MakePendingCalls + Py_NewInterpreter=python33.Py_NewInterpreter + Py_ReprEnter=python33.Py_ReprEnter + Py_ReprLeave=python33.Py_ReprLeave + Py_SetProgramName=python33.Py_SetProgramName + Py_SetPythonHome=python33.Py_SetPythonHome + Py_SetRecursionLimit=python33.Py_SetRecursionLimit + Py_SymtableString=python33.Py_SymtableString + Py_VaBuildValue=python33.Py_VaBuildValue + _PyErr_BadInternalCall=python33._PyErr_BadInternalCall + _PyObject_CallFunction_SizeT=python33._PyObject_CallFunction_SizeT + _PyObject_CallMethod_SizeT=python33._PyObject_CallMethod_SizeT + _PyObject_GC_Malloc=python33._PyObject_GC_Malloc + _PyObject_GC_New=python33._PyObject_GC_New + _PyObject_GC_NewVar=python33._PyObject_GC_NewVar + _PyObject_GC_Resize=python33._PyObject_GC_Resize + _PyObject_New=python33._PyObject_New + _PyObject_NewVar=python33._PyObject_NewVar + _PyState_AddModule=python33._PyState_AddModule + _PyThreadState_Init=python33._PyThreadState_Init + _PyThreadState_Prealloc=python33._PyThreadState_Prealloc + _PyTrash_delete_later=python33._PyTrash_delete_later DATA + _PyTrash_delete_nesting=python33._PyTrash_delete_nesting DATA + _PyTrash_deposit_object=python33._PyTrash_deposit_object + _PyTrash_destroy_chain=python33._PyTrash_destroy_chain + _PyWeakref_CallableProxyType=python33._PyWeakref_CallableProxyType DATA + _PyWeakref_ProxyType=python33._PyWeakref_ProxyType DATA + _PyWeakref_RefType=python33._PyWeakref_RefType DATA + _Py_BuildValue_SizeT=python33._Py_BuildValue_SizeT + _Py_CheckRecursionLimit=python33._Py_CheckRecursionLimit DATA + _Py_CheckRecursiveCall=python33._Py_CheckRecursiveCall + _Py_Dealloc=python33._Py_Dealloc + _Py_EllipsisObject=python33._Py_EllipsisObject DATA + _Py_FalseStruct=python33._Py_FalseStruct DATA + 
_Py_NoneStruct=python33._Py_NoneStruct DATA
+ _Py_NotImplementedStruct=python33._Py_NotImplementedStruct DATA
+ _Py_SwappedOp=python33._Py_SwappedOp DATA
+ _Py_TrueStruct=python33._Py_TrueStruct DATA
+ _Py_VaBuildValue_SizeT=python33._Py_VaBuildValue_SizeT
diff --git a/PC/python3.mak b/PC/python3.mak
--- a/PC/python3.mak
+++ b/PC/python3.mak
@@ -1,10 +1,10 @@
-$(OutDir)python33.dll: python3.def $(OutDir)python33stub.lib
-	cl /LD /Fe$(OutDir)python3.dll python3dll.c python3.def $(OutDir)python33stub.lib
-
-$(OutDir)python33stub.lib: python33stub.def
-	lib /def:python33stub.def /out:$(OutDir)python33stub.lib /MACHINE:$(MACHINE)
-
-clean:
-	del $(OutDir)python3.dll $(OutDir)python3.lib $(OutDir)python33stub.lib $(OutDir)python3.exp $(OutDir)python33stub.exp
-
-rebuild: clean $(OutDir)python33.dll
+$(OutDir)python33.dll: python3.def $(OutDir)python33stub.lib
+	cl /LD /Fe$(OutDir)python3.dll python3dll.c python3.def $(OutDir)python33stub.lib
+
+$(OutDir)python33stub.lib: python33stub.def
+	lib /def:python33stub.def /out:$(OutDir)python33stub.lib /MACHINE:$(MACHINE)
+
+clean:
+	del $(OutDir)python3.dll $(OutDir)python3.lib $(OutDir)python33stub.lib $(OutDir)python3.exp $(OutDir)python33stub.exp
+
+rebuild: clean $(OutDir)python33.dll
diff --git a/PC/python33gen.py b/PC/python33gen.py
--- a/PC/python33gen.py
+++ b/PC/python33gen.py
@@ -1,25 +1,25 @@
-# Generate python33stub.def out of python3.def
-# The regular import library cannot be used,
-# since it doesn't provide the right symbols for
-# data forwarding
-out = open("python33stub.def", "w")
-out.write('LIBRARY "python33"\n')
-out.write('EXPORTS\n')
-
-inp = open("python3.def")
-inp.readline()
-line = inp.readline()
-assert line.strip()=='EXPORTS'
-
-for line in inp:
-    # SYM1=python33.SYM2[ DATA]
-    head, tail = line.split('.')
-    if 'DATA' in tail:
-        symbol, tail = tail.split(' ')
-    else:
-        symbol = tail.strip()
-    out.write(symbol+'\n')
-
-inp.close()
-out.close()
-
+# Generate python33stub.def out of python3.def
+# The regular import library cannot be used,
+# since it doesn't provide the right symbols for
+# data forwarding
+out = open("python33stub.def", "w")
+out.write('LIBRARY "python33"\n')
+out.write('EXPORTS\n')
+
+inp = open("python3.def")
+inp.readline()
+line = inp.readline()
+assert line.strip()=='EXPORTS'
+
+for line in inp:
+    # SYM1=python33.SYM2[ DATA]
+    head, tail = line.split('.')
+    if 'DATA' in tail:
+        symbol, tail = tail.split(' ')
+    else:
+        symbol = tail.strip()
+    out.write(symbol+'\n')
+
+inp.close()
+out.close()
+
diff --git a/PC/python33stub.def b/PC/python33stub.def
--- a/PC/python33stub.def
+++ b/PC/python33stub.def
@@ -1,689 +1,689 @@
-LIBRARY "python33"
-EXPORTS
-PyArg_Parse
-PyArg_ParseTuple
-PyArg_ParseTupleAndKeywords
-PyArg_UnpackTuple
-PyArg_VaParse
-PyArg_VaParseTupleAndKeywords
-PyArg_ValidateKeywordArguments
-PyBaseObject_Type
-PyBool_FromLong
-PyBool_Type
-PyByteArrayIter_Type
-PyByteArray_AsString
-PyByteArray_Concat
-PyByteArray_FromObject
-PyByteArray_FromStringAndSize
-PyByteArray_Resize
-PyByteArray_Size
-PyByteArray_Type
-PyBytesIter_Type
-PyBytes_AsString
-PyBytes_AsStringAndSize
-PyBytes_Concat
-PyBytes_ConcatAndDel
-PyBytes_DecodeEscape
-PyBytes_FromFormat
-PyBytes_FromFormatV
-PyBytes_FromObject
-PyBytes_FromString
-PyBytes_FromStringAndSize
-PyBytes_Repr
-PyBytes_Size
-PyBytes_Type
-PyCFunction_Call
-PyCFunction_ClearFreeList
-PyCFunction_GetFlags
-PyCFunction_GetFunction
-PyCFunction_GetSelf
-PyCFunction_NewEx
-PyCFunction_Type
-PyCallIter_New
-PyCallIter_Type
-PyCallable_Check
-PyCapsule_GetContext -PyCapsule_GetDestructor -PyCapsule_GetName -PyCapsule_GetPointer -PyCapsule_Import -PyCapsule_IsValid -PyCapsule_New -PyCapsule_SetContext -PyCapsule_SetDestructor -PyCapsule_SetName -PyCapsule_SetPointer -PyCapsule_Type -PyClassMethodDescr_Type -PyCodec_BackslashReplaceErrors -PyCodec_Decode -PyCodec_Decoder -PyCodec_Encode -PyCodec_Encoder -PyCodec_IgnoreErrors -PyCodec_IncrementalDecoder -PyCodec_IncrementalEncoder -PyCodec_KnownEncoding -PyCodec_LookupError -PyCodec_Register -PyCodec_RegisterError -PyCodec_ReplaceErrors -PyCodec_StreamReader -PyCodec_StreamWriter -PyCodec_StrictErrors -PyCodec_XMLCharRefReplaceErrors -PyComplex_FromDoubles -PyComplex_ImagAsDouble -PyComplex_RealAsDouble -PyComplex_Type -PyDescr_NewClassMethod -PyDescr_NewGetSet -PyDescr_NewMember -PyDescr_NewMethod -PyDictItems_Type -PyDictIterItem_Type -PyDictIterKey_Type -PyDictIterValue_Type -PyDictKeys_Type -PyDictProxy_New -PyDictProxy_Type -PyDictValues_Type -PyDict_Clear -PyDict_Contains -PyDict_Copy -PyDict_DelItem -PyDict_DelItemString -PyDict_GetItem -PyDict_GetItemString -PyDict_GetItemWithError -PyDict_Items -PyDict_Keys -PyDict_Merge -PyDict_MergeFromSeq2 -PyDict_New -PyDict_Next -PyDict_SetItem -PyDict_SetItemString -PyDict_Size -PyDict_Type -PyDict_Update -PyDict_Values -PyEllipsis_Type -PyEnum_Type -PyErr_BadArgument -PyErr_BadInternalCall -PyErr_CheckSignals -PyErr_Clear -PyErr_Display -PyErr_ExceptionMatches -PyErr_Fetch -PyErr_Format -PyErr_GivenExceptionMatches -PyErr_NewException -PyErr_NewExceptionWithDoc -PyErr_NoMemory -PyErr_NormalizeException -PyErr_Occurred -PyErr_Print -PyErr_PrintEx -PyErr_ProgramText -PyErr_Restore -PyErr_SetFromErrno -PyErr_SetFromErrnoWithFilename -PyErr_SetFromErrnoWithFilenameObject -PyErr_SetInterrupt -PyErr_SetNone -PyErr_SetObject -PyErr_SetString -PyErr_SyntaxLocation -PyErr_WarnEx -PyErr_WarnExplicit -PyErr_WarnFormat -PyErr_WriteUnraisable -PyEval_AcquireLock -PyEval_AcquireThread -PyEval_CallFunction -PyEval_CallMethod -PyEval_CallObjectWithKeywords -PyEval_EvalCode -PyEval_EvalCodeEx -PyEval_EvalFrame -PyEval_EvalFrameEx -PyEval_GetBuiltins -PyEval_GetCallStats -PyEval_GetFrame -PyEval_GetFuncDesc -PyEval_GetFuncName -PyEval_GetGlobals -PyEval_GetLocals -PyEval_InitThreads -PyEval_ReInitThreads -PyEval_ReleaseLock -PyEval_ReleaseThread -PyEval_RestoreThread -PyEval_SaveThread -PyEval_ThreadsInitialized -PyExc_ArithmeticError -PyExc_AssertionError -PyExc_AttributeError -PyExc_BaseException -PyExc_BufferError -PyExc_BytesWarning -PyExc_DeprecationWarning -PyExc_EOFError -PyExc_EnvironmentError -PyExc_Exception -PyExc_FloatingPointError -PyExc_FutureWarning -PyExc_GeneratorExit -PyExc_IOError -PyExc_ImportError -PyExc_ImportWarning -PyExc_IndentationError -PyExc_IndexError -PyExc_KeyError -PyExc_KeyboardInterrupt -PyExc_LookupError -PyExc_MemoryError -PyExc_MemoryErrorInst -PyExc_NameError -PyExc_NotImplementedError -PyExc_OSError -PyExc_OverflowError -PyExc_PendingDeprecationWarning -PyExc_RecursionErrorInst -PyExc_ReferenceError -PyExc_RuntimeError -PyExc_RuntimeWarning -PyExc_StopIteration -PyExc_SyntaxError -PyExc_SyntaxWarning -PyExc_SystemError -PyExc_SystemExit -PyExc_TabError -PyExc_TypeError -PyExc_UnboundLocalError -PyExc_UnicodeDecodeError -PyExc_UnicodeEncodeError -PyExc_UnicodeError -PyExc_UnicodeTranslateError -PyExc_UnicodeWarning -PyExc_UserWarning -PyExc_ValueError -PyExc_Warning -PyExc_ZeroDivisionError -PyException_GetCause -PyException_GetContext -PyException_GetTraceback -PyException_SetCause -PyException_SetContext 
-PyException_SetTraceback -PyFile_FromFd -PyFile_GetLine -PyFile_WriteObject -PyFile_WriteString -PyFilter_Type -PyFloat_AsDouble -PyFloat_FromDouble -PyFloat_FromString -PyFloat_GetInfo -PyFloat_GetMax -PyFloat_GetMin -PyFloat_Type -PyFrozenSet_New -PyFrozenSet_Type -PyGC_Collect -PyGILState_Ensure -PyGILState_GetThisThreadState -PyGILState_Release -PyGetSetDescr_Type -PyImport_AddModule -PyImport_AppendInittab -PyImport_Cleanup -PyImport_ExecCodeModule -PyImport_ExecCodeModuleEx -PyImport_ExecCodeModuleWithPathnames -PyImport_GetImporter -PyImport_GetMagicNumber -PyImport_GetMagicTag -PyImport_GetModuleDict -PyImport_Import -PyImport_ImportFrozenModule -PyImport_ImportModule -PyImport_ImportModuleLevel -PyImport_ImportModuleNoBlock -PyImport_ReloadModule -PyInterpreterState_Clear -PyInterpreterState_Delete -PyInterpreterState_New -PyIter_Next -PyListIter_Type -PyListRevIter_Type -PyList_Append -PyList_AsTuple -PyList_GetItem -PyList_GetSlice -PyList_Insert -PyList_New -PyList_Reverse -PyList_SetItem -PyList_SetSlice -PyList_Size -PyList_Sort -PyList_Type -PyLongRangeIter_Type -PyLong_AsDouble -PyLong_AsLong -PyLong_AsLongAndOverflow -PyLong_AsLongLong -PyLong_AsLongLongAndOverflow -PyLong_AsSize_t -PyLong_AsSsize_t -PyLong_AsUnsignedLong -PyLong_AsUnsignedLongLong -PyLong_AsUnsignedLongLongMask -PyLong_AsUnsignedLongMask -PyLong_AsVoidPtr -PyLong_FromDouble -PyLong_FromLong -PyLong_FromLongLong -PyLong_FromSize_t -PyLong_FromSsize_t -PyLong_FromString -PyLong_FromUnsignedLong -PyLong_FromUnsignedLongLong -PyLong_FromVoidPtr -PyLong_GetInfo -PyLong_Type -PyMap_Type -PyMapping_Check -PyMapping_GetItemString -PyMapping_HasKey -PyMapping_HasKeyString -PyMapping_Items -PyMapping_Keys -PyMapping_Length -PyMapping_SetItemString -PyMapping_Size -PyMapping_Values -PyMem_Free -PyMem_Malloc -PyMem_Realloc -PyMemberDescr_Type -PyMemoryView_FromObject -PyMemoryView_GetContiguous -PyMemoryView_Type -PyMethodDescr_Type -PyModule_AddIntConstant -PyModule_AddObject -PyModule_AddStringConstant -PyModule_Create2 -PyModule_GetDef -PyModule_GetDict -PyModule_GetFilename -PyModule_GetFilenameObject -PyModule_GetName -PyModule_GetState -PyModule_New -PyModule_Type -PyNullImporter_Type -PyNumber_Absolute -PyNumber_Add -PyNumber_And -PyNumber_AsSsize_t -PyNumber_Check -PyNumber_Divmod -PyNumber_Float -PyNumber_FloorDivide -PyNumber_InPlaceAdd -PyNumber_InPlaceAnd -PyNumber_InPlaceFloorDivide -PyNumber_InPlaceLshift -PyNumber_InPlaceMultiply -PyNumber_InPlaceOr -PyNumber_InPlacePower -PyNumber_InPlaceRemainder -PyNumber_InPlaceRshift -PyNumber_InPlaceSubtract -PyNumber_InPlaceTrueDivide -PyNumber_InPlaceXor -PyNumber_Index -PyNumber_Invert -PyNumber_Long -PyNumber_Lshift -PyNumber_Multiply -PyNumber_Negative -PyNumber_Or -PyNumber_Positive -PyNumber_Power -PyNumber_Remainder -PyNumber_Rshift -PyNumber_Subtract -PyNumber_ToBase -PyNumber_TrueDivide -PyNumber_Xor -PyOS_AfterFork -PyOS_InitInterrupts -PyOS_InputHook -PyOS_InterruptOccurred -PyOS_ReadlineFunctionPointer -PyOS_double_to_string -PyOS_getsig -PyOS_mystricmp -PyOS_mystrnicmp -PyOS_setsig -PyOS_snprintf -PyOS_string_to_double -PyOS_strtol -PyOS_strtoul -PyOS_vsnprintf -PyObject_ASCII -PyObject_AsCharBuffer -PyObject_AsFileDescriptor -PyObject_AsReadBuffer -PyObject_AsWriteBuffer -PyObject_Bytes -PyObject_Call -PyObject_CallFunction -PyObject_CallFunctionObjArgs -PyObject_CallMethod -PyObject_CallMethodObjArgs -PyObject_CallObject -PyObject_CheckReadBuffer -PyObject_ClearWeakRefs -PyObject_DelItem -PyObject_DelItemString -PyObject_Dir -PyObject_Format 
-PyObject_Free -PyObject_GC_Del -PyObject_GC_Track -PyObject_GC_UnTrack -PyObject_GenericGetAttr -PyObject_GenericSetAttr -PyObject_GetAttr -PyObject_GetAttrString -PyObject_GetItem -PyObject_GetIter -PyObject_HasAttr -PyObject_HasAttrString -PyObject_Hash -PyObject_HashNotImplemented -PyObject_Init -PyObject_InitVar -PyObject_IsInstance -PyObject_IsSubclass -PyObject_IsTrue -PyObject_Length -PyObject_Malloc -PyObject_Not -PyObject_Realloc -PyObject_Repr -PyObject_RichCompare -PyObject_RichCompareBool -PyObject_SelfIter -PyObject_SetAttr -PyObject_SetAttrString -PyObject_SetItem -PyObject_Size -PyObject_Str -PyObject_Type -PyParser_SimpleParseFileFlags -PyParser_SimpleParseStringFlags -PyProperty_Type -PyRangeIter_Type -PyRange_Type -PyReversed_Type -PySeqIter_New -PySeqIter_Type -PySequence_Check -PySequence_Concat -PySequence_Contains -PySequence_Count -PySequence_DelItem -PySequence_DelSlice -PySequence_Fast -PySequence_GetItem -PySequence_GetSlice -PySequence_In -PySequence_InPlaceConcat -PySequence_InPlaceRepeat -PySequence_Index -PySequence_Length -PySequence_List -PySequence_Repeat -PySequence_SetItem -PySequence_SetSlice -PySequence_Size -PySequence_Tuple -PySetIter_Type -PySet_Add -PySet_Clear -PySet_Contains -PySet_Discard -PySet_New -PySet_Pop -PySet_Size -PySet_Type -PySlice_GetIndices -PySlice_GetIndicesEx -PySlice_New -PySlice_Type -PySortWrapper_Type -PyState_FindModule -PyStructSequence_GetItem -PyStructSequence_New -PyStructSequence_NewType -PyStructSequence_SetItem -PySuper_Type -PySys_AddWarnOption -PySys_AddWarnOptionUnicode -PySys_FormatStderr -PySys_FormatStdout -PySys_GetObject -PySys_HasWarnOptions -PySys_ResetWarnOptions -PySys_SetArgv -PySys_SetArgvEx -PySys_SetObject -PySys_SetPath -PySys_WriteStderr -PySys_WriteStdout -PyThreadState_Clear -PyThreadState_Delete -PyThreadState_DeleteCurrent -PyThreadState_Get -PyThreadState_GetDict -PyThreadState_New -PyThreadState_SetAsyncExc -PyThreadState_Swap -PyTraceBack_Here -PyTraceBack_Print -PyTraceBack_Type -PyTupleIter_Type -PyTuple_ClearFreeList -PyTuple_GetItem -PyTuple_GetSlice -PyTuple_New -PyTuple_Pack -PyTuple_SetItem -PyTuple_Size -PyTuple_Type -PyType_ClearCache -PyType_FromSpec -PyType_GenericAlloc -PyType_GenericNew -PyType_GetFlags -PyType_IsSubtype -PyType_Modified -PyType_Ready -PyType_Type -PyUnicodeDecodeError_Create -PyUnicodeDecodeError_GetEncoding -PyUnicodeDecodeError_GetEnd -PyUnicodeDecodeError_GetObject -PyUnicodeDecodeError_GetReason -PyUnicodeDecodeError_GetStart -PyUnicodeDecodeError_SetEnd -PyUnicodeDecodeError_SetReason -PyUnicodeDecodeError_SetStart -PyUnicodeEncodeError_GetEncoding -PyUnicodeEncodeError_GetEnd -PyUnicodeEncodeError_GetObject -PyUnicodeEncodeError_GetReason -PyUnicodeEncodeError_GetStart -PyUnicodeEncodeError_SetEnd -PyUnicodeEncodeError_SetReason -PyUnicodeEncodeError_SetStart -PyUnicodeIter_Type -PyUnicodeTranslateError_GetEnd -PyUnicodeTranslateError_GetObject -PyUnicodeTranslateError_GetReason -PyUnicodeTranslateError_GetStart -PyUnicodeTranslateError_SetEnd -PyUnicodeTranslateError_SetReason -PyUnicodeTranslateError_SetStart -PyUnicodeUCS2_Append -PyUnicodeUCS2_AppendAndDel -PyUnicodeUCS2_AsASCIIString -PyUnicodeUCS2_AsCharmapString -PyUnicodeUCS2_AsDecodedObject -PyUnicodeUCS2_AsDecodedUnicode -PyUnicodeUCS2_AsEncodedObject -PyUnicodeUCS2_AsEncodedString -PyUnicodeUCS2_AsEncodedUnicode -PyUnicodeUCS2_AsLatin1String -PyUnicodeUCS2_AsRawUnicodeEscapeString -PyUnicodeUCS2_AsUTF16String -PyUnicodeUCS2_AsUTF32String -PyUnicodeUCS2_AsUTF8String 
-PyUnicodeUCS2_AsUnicodeEscapeString -PyUnicodeUCS2_AsWideChar -PyUnicodeUCS2_ClearFreelist -PyUnicodeUCS2_Compare -PyUnicodeUCS2_Concat -PyUnicodeUCS2_Contains -PyUnicodeUCS2_Count -PyUnicodeUCS2_Decode -PyUnicodeUCS2_DecodeASCII -PyUnicodeUCS2_DecodeCharmap -PyUnicodeUCS2_DecodeFSDefault -PyUnicodeUCS2_DecodeFSDefaultAndSize -PyUnicodeUCS2_DecodeLatin1 -PyUnicodeUCS2_DecodeRawUnicodeEscape -PyUnicodeUCS2_DecodeUTF16 -PyUnicodeUCS2_DecodeUTF16Stateful -PyUnicodeUCS2_DecodeUTF32 -PyUnicodeUCS2_DecodeUTF32Stateful -PyUnicodeUCS2_DecodeUTF8 -PyUnicodeUCS2_DecodeUTF8Stateful -PyUnicodeUCS2_DecodeUnicodeEscape -PyUnicodeUCS2_FSConverter -PyUnicodeUCS2_FSDecoder -PyUnicodeUCS2_Find -PyUnicodeUCS2_Format -PyUnicodeUCS2_FromEncodedObject -PyUnicodeUCS2_FromFormat -PyUnicodeUCS2_FromFormatV -PyUnicodeUCS2_FromObject -PyUnicodeUCS2_FromOrdinal -PyUnicodeUCS2_FromString -PyUnicodeUCS2_FromStringAndSize -PyUnicodeUCS2_FromWideChar -PyUnicodeUCS2_GetDefaultEncoding -PyUnicodeUCS2_GetSize -PyUnicodeUCS2_IsIdentifier -PyUnicodeUCS2_Join -PyUnicodeUCS2_Partition -PyUnicodeUCS2_RPartition -PyUnicodeUCS2_RSplit -PyUnicodeUCS2_Replace -PyUnicodeUCS2_Resize -PyUnicodeUCS2_RichCompare -PyUnicodeUCS2_SetDefaultEncoding -PyUnicodeUCS2_Split -PyUnicodeUCS2_Splitlines -PyUnicodeUCS2_Tailmatch -PyUnicodeUCS2_Translate -PyUnicode_BuildEncodingMap -PyUnicode_CompareWithASCIIString -PyUnicode_DecodeUTF7 -PyUnicode_DecodeUTF7Stateful -PyUnicode_EncodeFSDefault -PyUnicode_InternFromString -PyUnicode_InternImmortal -PyUnicode_InternInPlace -PyUnicode_Type -PyWeakref_GetObject -PyWeakref_NewProxy -PyWeakref_NewRef -PyWrapperDescr_Type -PyWrapper_New -PyZip_Type -Py_AddPendingCall -Py_AtExit -Py_BuildValue -Py_CompileString -Py_DecRef -Py_EndInterpreter -Py_Exit -Py_FatalError -Py_FileSystemDefaultEncoding -Py_Finalize -Py_GetBuildInfo -Py_GetCompiler -Py_GetCopyright -Py_GetExecPrefix -Py_GetPath -Py_GetPlatform -Py_GetPrefix -Py_GetProgramFullPath -Py_GetProgramName -Py_GetPythonHome -Py_GetRecursionLimit -Py_GetVersion -Py_HasFileSystemDefaultEncoding -Py_IncRef -Py_Initialize -Py_InitializeEx -Py_IsInitialized -Py_Main -Py_MakePendingCalls -Py_NewInterpreter -Py_ReprEnter -Py_ReprLeave -Py_SetProgramName -Py_SetPythonHome -Py_SetRecursionLimit -Py_SymtableString -Py_VaBuildValue -_PyErr_BadInternalCall -_PyObject_CallFunction_SizeT -_PyObject_CallMethod_SizeT -_PyObject_GC_Malloc -_PyObject_GC_New -_PyObject_GC_NewVar -_PyObject_GC_Resize -_PyObject_New -_PyObject_NewVar -_PyState_AddModule -_PyThreadState_Init -_PyThreadState_Prealloc -_PyTrash_delete_later -_PyTrash_delete_nesting -_PyTrash_deposit_object -_PyTrash_destroy_chain -_PyWeakref_CallableProxyType -_PyWeakref_ProxyType -_PyWeakref_RefType -_Py_BuildValue_SizeT -_Py_CheckRecursionLimit -_Py_CheckRecursiveCall -_Py_Dealloc -_Py_EllipsisObject -_Py_FalseStruct -_Py_NoneStruct -_Py_NotImplementedStruct -_Py_SwappedOp -_Py_TrueStruct -_Py_VaBuildValue_SizeT +LIBRARY "python33" +EXPORTS +PyArg_Parse +PyArg_ParseTuple +PyArg_ParseTupleAndKeywords +PyArg_UnpackTuple +PyArg_VaParse +PyArg_VaParseTupleAndKeywords +PyArg_ValidateKeywordArguments +PyBaseObject_Type +PyBool_FromLong +PyBool_Type +PyByteArrayIter_Type +PyByteArray_AsString +PyByteArray_Concat +PyByteArray_FromObject +PyByteArray_FromStringAndSize +PyByteArray_Resize +PyByteArray_Size +PyByteArray_Type +PyBytesIter_Type +PyBytes_AsString +PyBytes_AsStringAndSize +PyBytes_Concat +PyBytes_ConcatAndDel +PyBytes_DecodeEscape +PyBytes_FromFormat +PyBytes_FromFormatV +PyBytes_FromObject +PyBytes_FromString 
+PyBytes_FromStringAndSize +PyBytes_Repr +PyBytes_Size +PyBytes_Type +PyCFunction_Call +PyCFunction_ClearFreeList +PyCFunction_GetFlags +PyCFunction_GetFunction +PyCFunction_GetSelf +PyCFunction_NewEx +PyCFunction_Type +PyCallIter_New +PyCallIter_Type +PyCallable_Check +PyCapsule_GetContext +PyCapsule_GetDestructor +PyCapsule_GetName +PyCapsule_GetPointer +PyCapsule_Import +PyCapsule_IsValid +PyCapsule_New +PyCapsule_SetContext +PyCapsule_SetDestructor +PyCapsule_SetName +PyCapsule_SetPointer +PyCapsule_Type +PyClassMethodDescr_Type +PyCodec_BackslashReplaceErrors +PyCodec_Decode +PyCodec_Decoder +PyCodec_Encode +PyCodec_Encoder +PyCodec_IgnoreErrors +PyCodec_IncrementalDecoder +PyCodec_IncrementalEncoder +PyCodec_KnownEncoding +PyCodec_LookupError +PyCodec_Register +PyCodec_RegisterError +PyCodec_ReplaceErrors +PyCodec_StreamReader +PyCodec_StreamWriter +PyCodec_StrictErrors +PyCodec_XMLCharRefReplaceErrors +PyComplex_FromDoubles +PyComplex_ImagAsDouble +PyComplex_RealAsDouble +PyComplex_Type +PyDescr_NewClassMethod +PyDescr_NewGetSet +PyDescr_NewMember +PyDescr_NewMethod +PyDictItems_Type +PyDictIterItem_Type +PyDictIterKey_Type +PyDictIterValue_Type +PyDictKeys_Type +PyDictProxy_New +PyDictProxy_Type +PyDictValues_Type +PyDict_Clear +PyDict_Contains +PyDict_Copy +PyDict_DelItem +PyDict_DelItemString +PyDict_GetItem +PyDict_GetItemString +PyDict_GetItemWithError +PyDict_Items +PyDict_Keys +PyDict_Merge +PyDict_MergeFromSeq2 +PyDict_New +PyDict_Next +PyDict_SetItem +PyDict_SetItemString +PyDict_Size +PyDict_Type +PyDict_Update +PyDict_Values +PyEllipsis_Type +PyEnum_Type +PyErr_BadArgument +PyErr_BadInternalCall +PyErr_CheckSignals +PyErr_Clear +PyErr_Display +PyErr_ExceptionMatches +PyErr_Fetch +PyErr_Format +PyErr_GivenExceptionMatches +PyErr_NewException +PyErr_NewExceptionWithDoc +PyErr_NoMemory +PyErr_NormalizeException +PyErr_Occurred +PyErr_Print +PyErr_PrintEx +PyErr_ProgramText +PyErr_Restore +PyErr_SetFromErrno +PyErr_SetFromErrnoWithFilename +PyErr_SetFromErrnoWithFilenameObject +PyErr_SetInterrupt +PyErr_SetNone +PyErr_SetObject +PyErr_SetString +PyErr_SyntaxLocation +PyErr_WarnEx +PyErr_WarnExplicit +PyErr_WarnFormat +PyErr_WriteUnraisable +PyEval_AcquireLock +PyEval_AcquireThread +PyEval_CallFunction +PyEval_CallMethod +PyEval_CallObjectWithKeywords +PyEval_EvalCode +PyEval_EvalCodeEx +PyEval_EvalFrame +PyEval_EvalFrameEx +PyEval_GetBuiltins +PyEval_GetCallStats +PyEval_GetFrame +PyEval_GetFuncDesc +PyEval_GetFuncName +PyEval_GetGlobals +PyEval_GetLocals +PyEval_InitThreads +PyEval_ReInitThreads +PyEval_ReleaseLock +PyEval_ReleaseThread +PyEval_RestoreThread +PyEval_SaveThread +PyEval_ThreadsInitialized +PyExc_ArithmeticError +PyExc_AssertionError +PyExc_AttributeError +PyExc_BaseException +PyExc_BufferError +PyExc_BytesWarning +PyExc_DeprecationWarning +PyExc_EOFError +PyExc_EnvironmentError +PyExc_Exception +PyExc_FloatingPointError +PyExc_FutureWarning +PyExc_GeneratorExit +PyExc_IOError +PyExc_ImportError +PyExc_ImportWarning +PyExc_IndentationError +PyExc_IndexError +PyExc_KeyError +PyExc_KeyboardInterrupt +PyExc_LookupError +PyExc_MemoryError +PyExc_MemoryErrorInst +PyExc_NameError +PyExc_NotImplementedError +PyExc_OSError +PyExc_OverflowError +PyExc_PendingDeprecationWarning +PyExc_RecursionErrorInst +PyExc_ReferenceError +PyExc_RuntimeError +PyExc_RuntimeWarning +PyExc_StopIteration +PyExc_SyntaxError +PyExc_SyntaxWarning +PyExc_SystemError +PyExc_SystemExit +PyExc_TabError +PyExc_TypeError +PyExc_UnboundLocalError +PyExc_UnicodeDecodeError +PyExc_UnicodeEncodeError 
+PyExc_UnicodeError +PyExc_UnicodeTranslateError +PyExc_UnicodeWarning +PyExc_UserWarning +PyExc_ValueError +PyExc_Warning +PyExc_ZeroDivisionError +PyException_GetCause +PyException_GetContext +PyException_GetTraceback +PyException_SetCause +PyException_SetContext +PyException_SetTraceback +PyFile_FromFd +PyFile_GetLine +PyFile_WriteObject +PyFile_WriteString +PyFilter_Type +PyFloat_AsDouble +PyFloat_FromDouble +PyFloat_FromString +PyFloat_GetInfo +PyFloat_GetMax +PyFloat_GetMin +PyFloat_Type +PyFrozenSet_New +PyFrozenSet_Type +PyGC_Collect +PyGILState_Ensure +PyGILState_GetThisThreadState +PyGILState_Release +PyGetSetDescr_Type +PyImport_AddModule +PyImport_AppendInittab +PyImport_Cleanup +PyImport_ExecCodeModule +PyImport_ExecCodeModuleEx +PyImport_ExecCodeModuleWithPathnames +PyImport_GetImporter +PyImport_GetMagicNumber +PyImport_GetMagicTag +PyImport_GetModuleDict +PyImport_Import +PyImport_ImportFrozenModule +PyImport_ImportModule +PyImport_ImportModuleLevel +PyImport_ImportModuleNoBlock +PyImport_ReloadModule +PyInterpreterState_Clear +PyInterpreterState_Delete +PyInterpreterState_New +PyIter_Next +PyListIter_Type +PyListRevIter_Type +PyList_Append +PyList_AsTuple +PyList_GetItem +PyList_GetSlice +PyList_Insert +PyList_New +PyList_Reverse +PyList_SetItem +PyList_SetSlice +PyList_Size +PyList_Sort +PyList_Type +PyLongRangeIter_Type +PyLong_AsDouble +PyLong_AsLong +PyLong_AsLongAndOverflow +PyLong_AsLongLong +PyLong_AsLongLongAndOverflow +PyLong_AsSize_t +PyLong_AsSsize_t +PyLong_AsUnsignedLong +PyLong_AsUnsignedLongLong +PyLong_AsUnsignedLongLongMask +PyLong_AsUnsignedLongMask +PyLong_AsVoidPtr +PyLong_FromDouble +PyLong_FromLong +PyLong_FromLongLong +PyLong_FromSize_t +PyLong_FromSsize_t +PyLong_FromString +PyLong_FromUnsignedLong +PyLong_FromUnsignedLongLong +PyLong_FromVoidPtr +PyLong_GetInfo +PyLong_Type +PyMap_Type +PyMapping_Check +PyMapping_GetItemString +PyMapping_HasKey +PyMapping_HasKeyString +PyMapping_Items +PyMapping_Keys +PyMapping_Length +PyMapping_SetItemString +PyMapping_Size +PyMapping_Values +PyMem_Free +PyMem_Malloc +PyMem_Realloc +PyMemberDescr_Type +PyMemoryView_FromObject +PyMemoryView_GetContiguous +PyMemoryView_Type +PyMethodDescr_Type +PyModule_AddIntConstant +PyModule_AddObject +PyModule_AddStringConstant +PyModule_Create2 +PyModule_GetDef +PyModule_GetDict +PyModule_GetFilename +PyModule_GetFilenameObject +PyModule_GetName +PyModule_GetState +PyModule_New +PyModule_Type +PyNullImporter_Type +PyNumber_Absolute +PyNumber_Add +PyNumber_And +PyNumber_AsSsize_t +PyNumber_Check +PyNumber_Divmod +PyNumber_Float +PyNumber_FloorDivide +PyNumber_InPlaceAdd +PyNumber_InPlaceAnd +PyNumber_InPlaceFloorDivide +PyNumber_InPlaceLshift +PyNumber_InPlaceMultiply +PyNumber_InPlaceOr +PyNumber_InPlacePower +PyNumber_InPlaceRemainder +PyNumber_InPlaceRshift +PyNumber_InPlaceSubtract +PyNumber_InPlaceTrueDivide +PyNumber_InPlaceXor +PyNumber_Index +PyNumber_Invert +PyNumber_Long +PyNumber_Lshift +PyNumber_Multiply +PyNumber_Negative +PyNumber_Or +PyNumber_Positive +PyNumber_Power +PyNumber_Remainder +PyNumber_Rshift +PyNumber_Subtract +PyNumber_ToBase +PyNumber_TrueDivide +PyNumber_Xor +PyOS_AfterFork +PyOS_InitInterrupts +PyOS_InputHook +PyOS_InterruptOccurred +PyOS_ReadlineFunctionPointer +PyOS_double_to_string +PyOS_getsig +PyOS_mystricmp +PyOS_mystrnicmp +PyOS_setsig +PyOS_snprintf +PyOS_string_to_double +PyOS_strtol +PyOS_strtoul +PyOS_vsnprintf +PyObject_ASCII +PyObject_AsCharBuffer +PyObject_AsFileDescriptor +PyObject_AsReadBuffer +PyObject_AsWriteBuffer +PyObject_Bytes 
+PyObject_Call +PyObject_CallFunction +PyObject_CallFunctionObjArgs +PyObject_CallMethod +PyObject_CallMethodObjArgs +PyObject_CallObject +PyObject_CheckReadBuffer +PyObject_ClearWeakRefs +PyObject_DelItem +PyObject_DelItemString +PyObject_Dir +PyObject_Format +PyObject_Free +PyObject_GC_Del +PyObject_GC_Track +PyObject_GC_UnTrack +PyObject_GenericGetAttr +PyObject_GenericSetAttr +PyObject_GetAttr +PyObject_GetAttrString +PyObject_GetItem +PyObject_GetIter +PyObject_HasAttr +PyObject_HasAttrString +PyObject_Hash +PyObject_HashNotImplemented +PyObject_Init +PyObject_InitVar +PyObject_IsInstance +PyObject_IsSubclass +PyObject_IsTrue +PyObject_Length +PyObject_Malloc +PyObject_Not +PyObject_Realloc +PyObject_Repr +PyObject_RichCompare +PyObject_RichCompareBool +PyObject_SelfIter +PyObject_SetAttr +PyObject_SetAttrString +PyObject_SetItem +PyObject_Size +PyObject_Str +PyObject_Type +PyParser_SimpleParseFileFlags +PyParser_SimpleParseStringFlags +PyProperty_Type +PyRangeIter_Type +PyRange_Type +PyReversed_Type +PySeqIter_New +PySeqIter_Type +PySequence_Check +PySequence_Concat +PySequence_Contains +PySequence_Count +PySequence_DelItem +PySequence_DelSlice +PySequence_Fast +PySequence_GetItem +PySequence_GetSlice +PySequence_In +PySequence_InPlaceConcat +PySequence_InPlaceRepeat +PySequence_Index +PySequence_Length +PySequence_List +PySequence_Repeat +PySequence_SetItem +PySequence_SetSlice +PySequence_Size +PySequence_Tuple +PySetIter_Type +PySet_Add +PySet_Clear +PySet_Contains +PySet_Discard +PySet_New +PySet_Pop +PySet_Size +PySet_Type +PySlice_GetIndices +PySlice_GetIndicesEx +PySlice_New +PySlice_Type +PySortWrapper_Type +PyState_FindModule +PyStructSequence_GetItem +PyStructSequence_New +PyStructSequence_NewType +PyStructSequence_SetItem +PySuper_Type +PySys_AddWarnOption +PySys_AddWarnOptionUnicode +PySys_FormatStderr +PySys_FormatStdout +PySys_GetObject +PySys_HasWarnOptions +PySys_ResetWarnOptions +PySys_SetArgv +PySys_SetArgvEx +PySys_SetObject +PySys_SetPath +PySys_WriteStderr +PySys_WriteStdout +PyThreadState_Clear +PyThreadState_Delete +PyThreadState_DeleteCurrent +PyThreadState_Get +PyThreadState_GetDict +PyThreadState_New +PyThreadState_SetAsyncExc +PyThreadState_Swap +PyTraceBack_Here +PyTraceBack_Print +PyTraceBack_Type +PyTupleIter_Type +PyTuple_ClearFreeList +PyTuple_GetItem +PyTuple_GetSlice +PyTuple_New +PyTuple_Pack +PyTuple_SetItem +PyTuple_Size +PyTuple_Type +PyType_ClearCache +PyType_FromSpec +PyType_GenericAlloc +PyType_GenericNew +PyType_GetFlags +PyType_IsSubtype +PyType_Modified +PyType_Ready +PyType_Type +PyUnicodeDecodeError_Create +PyUnicodeDecodeError_GetEncoding +PyUnicodeDecodeError_GetEnd +PyUnicodeDecodeError_GetObject +PyUnicodeDecodeError_GetReason +PyUnicodeDecodeError_GetStart +PyUnicodeDecodeError_SetEnd +PyUnicodeDecodeError_SetReason +PyUnicodeDecodeError_SetStart +PyUnicodeEncodeError_GetEncoding +PyUnicodeEncodeError_GetEnd +PyUnicodeEncodeError_GetObject +PyUnicodeEncodeError_GetReason +PyUnicodeEncodeError_GetStart +PyUnicodeEncodeError_SetEnd +PyUnicodeEncodeError_SetReason +PyUnicodeEncodeError_SetStart +PyUnicodeIter_Type +PyUnicodeTranslateError_GetEnd +PyUnicodeTranslateError_GetObject +PyUnicodeTranslateError_GetReason +PyUnicodeTranslateError_GetStart +PyUnicodeTranslateError_SetEnd +PyUnicodeTranslateError_SetReason +PyUnicodeTranslateError_SetStart +PyUnicodeUCS2_Append +PyUnicodeUCS2_AppendAndDel +PyUnicodeUCS2_AsASCIIString +PyUnicodeUCS2_AsCharmapString +PyUnicodeUCS2_AsDecodedObject +PyUnicodeUCS2_AsDecodedUnicode 
+PyUnicodeUCS2_AsEncodedObject +PyUnicodeUCS2_AsEncodedString +PyUnicodeUCS2_AsEncodedUnicode +PyUnicodeUCS2_AsLatin1String +PyUnicodeUCS2_AsRawUnicodeEscapeString +PyUnicodeUCS2_AsUTF16String +PyUnicodeUCS2_AsUTF32String +PyUnicodeUCS2_AsUTF8String +PyUnicodeUCS2_AsUnicodeEscapeString +PyUnicodeUCS2_AsWideChar +PyUnicodeUCS2_ClearFreelist +PyUnicodeUCS2_Compare +PyUnicodeUCS2_Concat +PyUnicodeUCS2_Contains +PyUnicodeUCS2_Count +PyUnicodeUCS2_Decode +PyUnicodeUCS2_DecodeASCII +PyUnicodeUCS2_DecodeCharmap +PyUnicodeUCS2_DecodeFSDefault +PyUnicodeUCS2_DecodeFSDefaultAndSize +PyUnicodeUCS2_DecodeLatin1 +PyUnicodeUCS2_DecodeRawUnicodeEscape +PyUnicodeUCS2_DecodeUTF16 +PyUnicodeUCS2_DecodeUTF16Stateful +PyUnicodeUCS2_DecodeUTF32 +PyUnicodeUCS2_DecodeUTF32Stateful +PyUnicodeUCS2_DecodeUTF8 +PyUnicodeUCS2_DecodeUTF8Stateful +PyUnicodeUCS2_DecodeUnicodeEscape +PyUnicodeUCS2_FSConverter +PyUnicodeUCS2_FSDecoder +PyUnicodeUCS2_Find +PyUnicodeUCS2_Format +PyUnicodeUCS2_FromEncodedObject +PyUnicodeUCS2_FromFormat +PyUnicodeUCS2_FromFormatV +PyUnicodeUCS2_FromObject +PyUnicodeUCS2_FromOrdinal +PyUnicodeUCS2_FromString +PyUnicodeUCS2_FromStringAndSize +PyUnicodeUCS2_FromWideChar +PyUnicodeUCS2_GetDefaultEncoding +PyUnicodeUCS2_GetSize +PyUnicodeUCS2_IsIdentifier +PyUnicodeUCS2_Join +PyUnicodeUCS2_Partition +PyUnicodeUCS2_RPartition +PyUnicodeUCS2_RSplit +PyUnicodeUCS2_Replace +PyUnicodeUCS2_Resize +PyUnicodeUCS2_RichCompare +PyUnicodeUCS2_SetDefaultEncoding +PyUnicodeUCS2_Split +PyUnicodeUCS2_Splitlines +PyUnicodeUCS2_Tailmatch +PyUnicodeUCS2_Translate +PyUnicode_BuildEncodingMap +PyUnicode_CompareWithASCIIString +PyUnicode_DecodeUTF7 +PyUnicode_DecodeUTF7Stateful +PyUnicode_EncodeFSDefault +PyUnicode_InternFromString +PyUnicode_InternImmortal +PyUnicode_InternInPlace +PyUnicode_Type +PyWeakref_GetObject +PyWeakref_NewProxy +PyWeakref_NewRef +PyWrapperDescr_Type +PyWrapper_New +PyZip_Type +Py_AddPendingCall +Py_AtExit +Py_BuildValue +Py_CompileString +Py_DecRef +Py_EndInterpreter +Py_Exit +Py_FatalError +Py_FileSystemDefaultEncoding +Py_Finalize +Py_GetBuildInfo +Py_GetCompiler +Py_GetCopyright +Py_GetExecPrefix +Py_GetPath +Py_GetPlatform +Py_GetPrefix +Py_GetProgramFullPath +Py_GetProgramName +Py_GetPythonHome +Py_GetRecursionLimit +Py_GetVersion +Py_HasFileSystemDefaultEncoding +Py_IncRef +Py_Initialize +Py_InitializeEx +Py_IsInitialized +Py_Main +Py_MakePendingCalls +Py_NewInterpreter +Py_ReprEnter +Py_ReprLeave +Py_SetProgramName +Py_SetPythonHome +Py_SetRecursionLimit +Py_SymtableString +Py_VaBuildValue +_PyErr_BadInternalCall +_PyObject_CallFunction_SizeT +_PyObject_CallMethod_SizeT +_PyObject_GC_Malloc +_PyObject_GC_New +_PyObject_GC_NewVar +_PyObject_GC_Resize +_PyObject_New +_PyObject_NewVar +_PyState_AddModule +_PyThreadState_Init +_PyThreadState_Prealloc +_PyTrash_delete_later +_PyTrash_delete_nesting +_PyTrash_deposit_object +_PyTrash_destroy_chain +_PyWeakref_CallableProxyType +_PyWeakref_ProxyType +_PyWeakref_RefType +_Py_BuildValue_SizeT +_Py_CheckRecursionLimit +_Py_CheckRecursiveCall +_Py_Dealloc +_Py_EllipsisObject +_Py_FalseStruct +_Py_NoneStruct +_Py_NotImplementedStruct +_Py_SwappedOp +_Py_TrueStruct +_Py_VaBuildValue_SizeT diff --git a/PC/python3dll.c b/PC/python3dll.c --- a/PC/python3dll.c +++ b/PC/python3dll.c @@ -1,9 +1,9 @@ -#include - -BOOL WINAPI -DllMain(HINSTANCE hInstDLL, - DWORD fdwReason, - LPVOID lpReserved) -{ - return TRUE; +#include + +BOOL WINAPI +DllMain(HINSTANCE hInstDLL, + DWORD fdwReason, + LPVOID lpReserved) +{ + return TRUE; } \ No newline at end of file 
diff --git a/Tools/buildbot/build-amd64.bat b/Tools/buildbot/build-amd64.bat --- a/Tools/buildbot/build-amd64.bat +++ b/Tools/buildbot/build-amd64.bat @@ -1,6 +1,6 @@ - at rem Used by the buildbot "compile" step. -cmd /c Tools\buildbot\external-amd64.bat -call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 -cmd /c Tools\buildbot\clean-amd64.bat -vcbuild /useenv PCbuild\kill_python.vcproj "Debug|x64" && PCbuild\amd64\kill_python_d.exe -vcbuild PCbuild\pcbuild.sln "Debug|x64" + at rem Used by the buildbot "compile" step. +cmd /c Tools\buildbot\external-amd64.bat +call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 +cmd /c Tools\buildbot\clean-amd64.bat +vcbuild /useenv PCbuild\kill_python.vcproj "Debug|x64" && PCbuild\amd64\kill_python_d.exe +vcbuild PCbuild\pcbuild.sln "Debug|x64" diff --git a/Tools/buildbot/build.bat b/Tools/buildbot/build.bat --- a/Tools/buildbot/build.bat +++ b/Tools/buildbot/build.bat @@ -1,7 +1,7 @@ - at rem Used by the buildbot "compile" step. -cmd /c Tools\buildbot\external.bat -call "%VS90COMNTOOLS%vsvars32.bat" -cmd /c Tools\buildbot\clean.bat -vcbuild /useenv PCbuild\kill_python.vcproj "Debug|Win32" && PCbuild\kill_python_d.exe -vcbuild /useenv PCbuild\pcbuild.sln "Debug|Win32" - + at rem Used by the buildbot "compile" step. +cmd /c Tools\buildbot\external.bat +call "%VS90COMNTOOLS%vsvars32.bat" +cmd /c Tools\buildbot\clean.bat +vcbuild /useenv PCbuild\kill_python.vcproj "Debug|Win32" && PCbuild\kill_python_d.exe +vcbuild /useenv PCbuild\pcbuild.sln "Debug|Win32" + diff --git a/Tools/buildbot/buildmsi.bat b/Tools/buildbot/buildmsi.bat --- a/Tools/buildbot/buildmsi.bat +++ b/Tools/buildbot/buildmsi.bat @@ -1,21 +1,21 @@ - at rem Used by the buildbot "buildmsi" step. - -cmd /c Tools\buildbot\external.bat - at rem build release versions of things -call "%VS90COMNTOOLS%vsvars32.bat" - - at rem build Python -vcbuild /useenv PCbuild\pcbuild.sln "Release|Win32" - - at rem build the documentation -bash.exe -c 'cd Doc;make PYTHON=python2.5 update htmlhelp' -"%ProgramFiles%\HTML Help Workshop\hhc.exe" Doc\build\htmlhelp\python26a3.hhp - - at rem build the MSI file -cd PC -nmake /f icons.mak -cd ..\Tools\msi -del *.msi -nmake /f msisupport.mak -%HOST_PYTHON% msi.py - + at rem Used by the buildbot "buildmsi" step. + +cmd /c Tools\buildbot\external.bat + at rem build release versions of things +call "%VS90COMNTOOLS%vsvars32.bat" + + at rem build Python +vcbuild /useenv PCbuild\pcbuild.sln "Release|Win32" + + at rem build the documentation +bash.exe -c 'cd Doc;make PYTHON=python2.5 update htmlhelp' +"%ProgramFiles%\HTML Help Workshop\hhc.exe" Doc\build\htmlhelp\python26a3.hhp + + at rem build the MSI file +cd PC +nmake /f icons.mak +cd ..\Tools\msi +del *.msi +nmake /f msisupport.mak +%HOST_PYTHON% msi.py + diff --git a/Tools/buildbot/clean-amd64.bat b/Tools/buildbot/clean-amd64.bat --- a/Tools/buildbot/clean-amd64.bat +++ b/Tools/buildbot/clean-amd64.bat @@ -1,7 +1,7 @@ - at rem Used by the buildbot "clean" step. -call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 -cd PCbuild - at echo Deleting .pyc/.pyo files ... -del /s Lib\*.pyc Lib\*.pyo -vcbuild /clean pcbuild.sln "Release|x64" -vcbuild /clean pcbuild.sln "Debug|x64" + at rem Used by the buildbot "clean" step. +call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 +cd PCbuild + at echo Deleting .pyc/.pyo files ... 
+del /s Lib\*.pyc Lib\*.pyo +vcbuild /clean pcbuild.sln "Release|x64" +vcbuild /clean pcbuild.sln "Debug|x64" diff --git a/Tools/buildbot/clean.bat b/Tools/buildbot/clean.bat --- a/Tools/buildbot/clean.bat +++ b/Tools/buildbot/clean.bat @@ -1,7 +1,7 @@ - at rem Used by the buildbot "clean" step. -call "%VS90COMNTOOLS%vsvars32.bat" - at echo Deleting .pyc/.pyo files ... -del /s Lib\*.pyc Lib\*.pyo -cd PCbuild -vcbuild /clean pcbuild.sln "Release|Win32" -vcbuild /clean pcbuild.sln "Debug|Win32" + at rem Used by the buildbot "clean" step. +call "%VS90COMNTOOLS%vsvars32.bat" + at echo Deleting .pyc/.pyo files ... +del /s Lib\*.pyc Lib\*.pyo +cd PCbuild +vcbuild /clean pcbuild.sln "Release|Win32" +vcbuild /clean pcbuild.sln "Debug|Win32" diff --git a/Tools/buildbot/external-amd64.bat b/Tools/buildbot/external-amd64.bat --- a/Tools/buildbot/external-amd64.bat +++ b/Tools/buildbot/external-amd64.bat @@ -1,21 +1,21 @@ - at rem Fetches (and builds if necessary) external dependencies - - at rem Assume we start inside the Python source directory -call "Tools\buildbot\external-common.bat" -call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 - -if not exist tcltk64\bin\tcl85g.dll ( - cd tcl-8.5.9.0\win - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 install - cd ..\.. -) - -if not exist tcltk64\bin\tk85g.dll ( - cd tk-8.5.9.0\win - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 clean - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 all - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 install - cd ..\.. -) - + at rem Fetches (and builds if necessary) external dependencies + + at rem Assume we start inside the Python source directory +call "Tools\buildbot\external-common.bat" +call "%VS90COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 + +if not exist tcltk64\bin\tcl85g.dll ( + cd tcl-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 install + cd ..\.. +) + +if not exist tcltk64\bin\tk85g.dll ( + cd tk-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 clean + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 all + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.9.0 install + cd ..\.. +) + diff --git a/Tools/buildbot/external-common.bat b/Tools/buildbot/external-common.bat --- a/Tools/buildbot/external-common.bat +++ b/Tools/buildbot/external-common.bat @@ -1,43 +1,43 @@ - at rem Common file shared between external.bat and external-amd64.bat. Responsible for - at rem fetching external components into the root\.. buildbot directories. - -cd .. - at rem XXX: If you need to force the buildbots to start from a fresh environment, uncomment - at rem the following, check it in, then check it out, comment it out, then check it back in. 
- at rem if exist bzip2-1.0.5 rd /s/q bzip2-1.0.5 - at rem if exist tcltk rd /s/q tcltk - at rem if exist tcltk64 rd /s/q tcltk64 - at rem if exist tcl8.4.12 rd /s/q tcl8.4.12 - at rem if exist tcl8.4.16 rd /s/q tcl8.4.16 - at rem if exist tcl-8.4.18.1 rd /s/q tcl-8.4.18.1 - at rem if exist tk8.4.12 rd /s/q tk8.4.12 - at rem if exist tk8.4.16 rd /s/q tk8.4.16 - at rem if exist tk-8.4.18.1 rd /s/q tk-8.4.18.1 - at rem if exist db-4.4.20 rd /s/q db-4.4.20 - at rem if exist openssl-1.0.0a rd /s/q openssl-1.0.0a - at rem if exist sqlite-3.7.4 rd /s/q sqlite-3.7.4 - - at rem bzip -if not exist bzip2-1.0.5 ( - rd /s/q bzip2-1.0.3 - svn export http://svn.python.org/projects/external/bzip2-1.0.5 -) - - at rem Sleepycat db -if not exist db-4.4.20 svn export http://svn.python.org/projects/external/db-4.4.20-vs9 db-4.4.20 - - at rem OpenSSL -if not exist openssl-1.0.0a svn export http://svn.python.org/projects/external/openssl-1.0.0a - - at rem tcl/tk -if not exist tcl-8.5.9.0 ( - rd /s/q tcltk tcltk64 - svn export http://svn.python.org/projects/external/tcl-8.5.9.0 -) -if not exist tk-8.5.9.0 svn export http://svn.python.org/projects/external/tk-8.5.9.0 - - at rem sqlite3 -if not exist sqlite-3.7.4 ( - rd /s/q sqlite-source-3.6.21 - svn export http://svn.python.org/projects/external/sqlite-3.7.4 -) + at rem Common file shared between external.bat and external-amd64.bat. Responsible for + at rem fetching external components into the root\.. buildbot directories. + +cd .. + at rem XXX: If you need to force the buildbots to start from a fresh environment, uncomment + at rem the following, check it in, then check it out, comment it out, then check it back in. + at rem if exist bzip2-1.0.5 rd /s/q bzip2-1.0.5 + at rem if exist tcltk rd /s/q tcltk + at rem if exist tcltk64 rd /s/q tcltk64 + at rem if exist tcl8.4.12 rd /s/q tcl8.4.12 + at rem if exist tcl8.4.16 rd /s/q tcl8.4.16 + at rem if exist tcl-8.4.18.1 rd /s/q tcl-8.4.18.1 + at rem if exist tk8.4.12 rd /s/q tk8.4.12 + at rem if exist tk8.4.16 rd /s/q tk8.4.16 + at rem if exist tk-8.4.18.1 rd /s/q tk-8.4.18.1 + at rem if exist db-4.4.20 rd /s/q db-4.4.20 + at rem if exist openssl-1.0.0a rd /s/q openssl-1.0.0a + at rem if exist sqlite-3.7.4 rd /s/q sqlite-3.7.4 + + at rem bzip +if not exist bzip2-1.0.5 ( + rd /s/q bzip2-1.0.3 + svn export http://svn.python.org/projects/external/bzip2-1.0.5 +) + + at rem Sleepycat db +if not exist db-4.4.20 svn export http://svn.python.org/projects/external/db-4.4.20-vs9 db-4.4.20 + + at rem OpenSSL +if not exist openssl-1.0.0a svn export http://svn.python.org/projects/external/openssl-1.0.0a + + at rem tcl/tk +if not exist tcl-8.5.9.0 ( + rd /s/q tcltk tcltk64 + svn export http://svn.python.org/projects/external/tcl-8.5.9.0 +) +if not exist tk-8.5.9.0 svn export http://svn.python.org/projects/external/tk-8.5.9.0 + + at rem sqlite3 +if not exist sqlite-3.7.4 ( + rd /s/q sqlite-source-3.6.21 + svn export http://svn.python.org/projects/external/sqlite-3.7.4 +) diff --git a/Tools/buildbot/external.bat b/Tools/buildbot/external.bat --- a/Tools/buildbot/external.bat +++ b/Tools/buildbot/external.bat @@ -1,21 +1,21 @@ - at rem Fetches (and builds if necessary) external dependencies - - at rem Assume we start inside the Python source directory -call "Tools\buildbot\external-common.bat" -call "%VS90COMNTOOLS%\vsvars32.bat" - -if not exist tcltk\bin\tcl85g.dll ( - @rem all and install need to be separate invocations, otherwise nmakehlp is not found on install - cd tcl-8.5.9.0\win - nmake -f makefile.vc 
COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 INSTALLDIR=..\..\tcltk clean all - nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk install - cd ..\.. -) - -if not exist tcltk\bin\tk85g.dll ( - cd tk-8.5.9.0\win - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 clean - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 all - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 install - cd ..\.. -) + at rem Fetches (and builds if necessary) external dependencies + + at rem Assume we start inside the Python source directory +call "Tools\buildbot\external-common.bat" +call "%VS90COMNTOOLS%\vsvars32.bat" + +if not exist tcltk\bin\tcl85g.dll ( + @rem all and install need to be separate invocations, otherwise nmakehlp is not found on install + cd tcl-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 INSTALLDIR=..\..\tcltk clean all + nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk install + cd ..\.. +) + +if not exist tcltk\bin\tk85g.dll ( + cd tk-8.5.9.0\win + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 clean + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 all + nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.9.0 install + cd ..\.. +) diff --git a/Tools/buildbot/test-amd64.bat b/Tools/buildbot/test-amd64.bat --- a/Tools/buildbot/test-amd64.bat +++ b/Tools/buildbot/test-amd64.bat @@ -1,3 +1,3 @@ - at rem Used by the buildbot "test" step. -cd PCbuild -call rt.bat -q -d -x64 -uall -rw + at rem Used by the buildbot "test" step. +cd PCbuild +call rt.bat -q -d -x64 -uall -rw diff --git a/Tools/buildbot/test.bat b/Tools/buildbot/test.bat --- a/Tools/buildbot/test.bat +++ b/Tools/buildbot/test.bat @@ -1,4 +1,4 @@ - at rem Used by the buildbot "test" step. -cd PCbuild -call rt.bat -d -q -uall -rwW -n - + at rem Used by the buildbot "test" step. +cd PCbuild +call rt.bat -d -q -uall -rwW -n + diff --git a/Tools/unicode/genwincodecs.bat b/Tools/unicode/genwincodecs.bat --- a/Tools/unicode/genwincodecs.bat +++ b/Tools/unicode/genwincodecs.bat @@ -1,7 +1,7 @@ - at rem Recreate some python charmap codecs from the Windows function - at rem MultiByteToWideChar. - - at cd /d %~dp0 - at mkdir build - at rem Arabic DOS code page -c:\python30\python genwincodec.py 720 > build/cp720.py + at rem Recreate some python charmap codecs from the Windows function + at rem MultiByteToWideChar. + + at cd /d %~dp0 + at mkdir build + at rem Arabic DOS code page +c:\python30\python genwincodec.py 720 > build/cp720.py -- Repository URL: http://hg.python.org/cpython From martin at v.loewis.de Sat Feb 26 19:19:37 2011 From: martin at v.loewis.de (=?ISO-8859-15?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Sat, 26 Feb 2011 19:19:37 +0100 Subject: [Python-checkins] [Python-Dev] cpython: improve license In-Reply-To: <20110226131225.2f310fc4@limelight.wooz.org> References: <20110226131225.2f310fc4@limelight.wooz.org> Message-ID: <4D694439.5010004@v.loewis.de> Am 26.02.2011 19:12, schrieb Barry Warsaw: > Notice the subject line. Can we make commit messages contain the named branch > that the change applies to? If you don't want this request to be forgotten, add it to todo.txt in the pymigr repo. 
Regards, Martin From python-checkins at python.org Sat Feb 26 19:29:43 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 19:29:43 +0100 Subject: [Python-checkins] cpython: Some test in a bat file Message-ID: antoine.pitrou pushed cc83a2611f9b to cpython: http://hg.python.org/cpython/rev/cc83a2611f9b changeset: 68055:cc83a2611f9b tag: tip user: Antoine Pitrou date: Sat Feb 26 19:29:34 2011 +0100 summary: Some test in a bat file files: Tools/buildbot/clean.bat diff --git a/Tools/buildbot/clean.bat b/Tools/buildbot/clean.bat --- a/Tools/buildbot/clean.bat +++ b/Tools/buildbot/clean.bat @@ -3,5 +3,6 @@ @echo Deleting .pyc/.pyo files ... del /s Lib\*.pyc Lib\*.pyo cd PCbuild + vcbuild /clean pcbuild.sln "Release|Win32" vcbuild /clean pcbuild.sln "Debug|Win32" -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sat Feb 26 19:31:34 2011 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 26 Feb 2011 19:31:34 +0100 Subject: [Python-checkins] [Python-Dev] cpython: improve license References: <20110226131225.2f310fc4@limelight.wooz.org> <4D694439.5010004@v.loewis.de> Message-ID: <20110226193134.4e0f4bcf@pitrou.net> On Sat, 26 Feb 2011 19:19:37 +0100 "Martin v. L?wis" wrote: > Am 26.02.2011 19:12, schrieb Barry Warsaw: > > Notice the subject line. Can we make commit messages contain the named branch > > that the change applies to? > > If you don't want this request to be forgotten, add it to todo.txt in > the pymigr repo. No need to add it, the feature is already implemented. The branch name is in parentheses, except that it's omitted when it's the "default" branch. (see many previous hg commits on this list) Regards Antoine. From python-checkins at python.org Sat Feb 26 19:40:32 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 19:40:32 +0100 Subject: [Python-checkins] cpython: Try to commit some eols to a bat file Message-ID: antoine.pitrou pushed e57542471f8a to cpython: http://hg.python.org/cpython/rev/e57542471f8a changeset: 68056:e57542471f8a tag: tip user: Antoine Pitrou date: Sat Feb 26 19:32:24 2011 +0100 summary: Try to commit some eols to a bat file files: Tools/buildbot/clean.bat diff --git a/Tools/buildbot/clean.bat b/Tools/buildbot/clean.bat --- a/Tools/buildbot/clean.bat +++ b/Tools/buildbot/clean.bat @@ -1,8 +1,7 @@ - at rem Used by the buildbot "clean" step. -call "%VS90COMNTOOLS%vsvars32.bat" - at echo Deleting .pyc/.pyo files ... -del /s Lib\*.pyc Lib\*.pyo -cd PCbuild - -vcbuild /clean pcbuild.sln "Release|Win32" -vcbuild /clean pcbuild.sln "Debug|Win32" + at rem Used by the buildbot "clean" step. +call "%VS90COMNTOOLS%vsvars32.bat" + at echo Deleting .pyc/.pyo files ... 
+del /s Lib\*.pyc Lib\*.pyo +cd PCbuild +vcbuild /clean pcbuild.sln "Release|Win32" +vcbuild /clean pcbuild.sln "Debug|Win32" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 19:41:14 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 19:41:14 +0100 Subject: [Python-checkins] cpython: Backed out changeset e57542471f8a Message-ID: antoine.pitrou pushed 6da2ffc992f4 to cpython: http://hg.python.org/cpython/rev/6da2ffc992f4 changeset: 68057:6da2ffc992f4 tag: tip user: Antoine Pitrou date: Sat Feb 26 19:40:58 2011 +0100 summary: Backed out changeset e57542471f8a files: Tools/buildbot/clean.bat diff --git a/Tools/buildbot/clean.bat b/Tools/buildbot/clean.bat --- a/Tools/buildbot/clean.bat +++ b/Tools/buildbot/clean.bat @@ -1,7 +1,8 @@ - at rem Used by the buildbot "clean" step. -call "%VS90COMNTOOLS%vsvars32.bat" - at echo Deleting .pyc/.pyo files ... -del /s Lib\*.pyc Lib\*.pyo -cd PCbuild -vcbuild /clean pcbuild.sln "Release|Win32" -vcbuild /clean pcbuild.sln "Debug|Win32" + at rem Used by the buildbot "clean" step. +call "%VS90COMNTOOLS%vsvars32.bat" + at echo Deleting .pyc/.pyo files ... +del /s Lib\*.pyc Lib\*.pyo +cd PCbuild + +vcbuild /clean pcbuild.sln "Release|Win32" +vcbuild /clean pcbuild.sln "Debug|Win32" -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sat Feb 26 20:31:30 2011 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 26 Feb 2011 20:31:30 +0100 Subject: [Python-checkins] [Python-Dev] cpython: improve license References: <20110226131225.2f310fc4@limelight.wooz.org> <4D694439.5010004@v.loewis.de> <20110226193134.4e0f4bcf@pitrou.net> Message-ID: <20110226203130.1e75fdde@pitrou.net> On Sat, 26 Feb 2011 19:31:34 +0100 Antoine Pitrou wrote: > > No need to add it, the feature is already implemented. The branch name > is in parentheses, except that it's omitted when it's the "default" > branch. > (see many previous hg commits on this list) (note for python-dev readers: "this list" refers to python-checkins ;-)) From python-checkins at python.org Sat Feb 26 21:00:34 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 21:00:34 +0100 Subject: [Python-checkins] hooks: Tentative branch check hook Message-ID: antoine.pitrou pushed 2f27be828fa0 to hooks: http://hg.python.org/hooks/rev/2f27be828fa0 changeset: 29:2f27be828fa0 tag: tip user: Antoine Pitrou date: Sat Feb 26 21:00:31 2011 +0100 summary: Tentative branch check hook files: checkbranch.py diff --git a/checkbranch.py b/checkbranch.py new file mode 100644 --- /dev/null +++ b/checkbranch.py @@ -0,0 +1,23 @@ +""" +Mercurial hook to check that individual changesets don't happen on a +forbidden branch. + +To use the changeset hook in a local repository, include something like the +following in your hgrc file. 
+ +[hooks] +pretxncommit.checkbranch = python:/home/hg/repos/hooks/checkbranch.py:hook +""" + +from mercurial import util + + +def hook(ui, repo, node, **kwargs): + ctx = repo[node] + branch = ctx.branch() + if branch in ('trunk', 'legacy-trunk', + '2.0', '2.1', '2.2', '2.3', '2.4', '3.0'): + raise util.Abort('changeset %s on disallowed branch %r, ' + 'please strip your changeset and ' + 're-do it on another branch ' + % (ctx, branch)) -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Sat Feb 26 21:25:04 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 21:25:04 +0100 Subject: [Python-checkins] hooks: Fix checkbranch hook Message-ID: antoine.pitrou pushed 2f045ec0b6c6 to hooks: http://hg.python.org/hooks/rev/2f045ec0b6c6 changeset: 30:2f045ec0b6c6 tag: tip user: Antoine Pitrou date: Sat Feb 26 21:25:02 2011 +0100 summary: Fix checkbranch hook files: checkbranch.py diff --git a/checkbranch.py b/checkbranch.py --- a/checkbranch.py +++ b/checkbranch.py @@ -6,18 +6,29 @@ following in your hgrc file. [hooks] -pretxncommit.checkbranch = python:/home/hg/repos/hooks/checkbranch.py:hook +pretxnchangegroup.checkbranch = python:/home/hg/repos/hooks/checkbranch.py:hook """ +from mercurial.node import bin from mercurial import util def hook(ui, repo, node, **kwargs): - ctx = repo[node] - branch = ctx.branch() - if branch in ('trunk', 'legacy-trunk', - '2.0', '2.1', '2.2', '2.3', '2.4', '3.0'): - raise util.Abort('changeset %s on disallowed branch %r, ' - 'please strip your changeset and ' - 're-do it on another branch ' - % (ctx, branch)) + n = bin(node) + start = repo.changelog.rev(n) + end = len(repo.changelog) + failed = False + for rev in xrange(start, end): + n = repo.changelog.node(rev) + ctx = repo[n] + branch = ctx.branch() + if branch in ('trunk', 'legacy-trunk', + '2.0', '2.1', '2.2', '2.3', '2.4', '3.0'): + ui.warn(' - changeset %s on disallowed branch %r!\n' + % (ctx, branch)) + failed = True + if failed: + ui.warn('* Please strip the offending changeset(s)\n' + '* and re-do them, if needed, on another branch!\n') + return True + -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Sat Feb 26 21:28:46 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 21:28:46 +0100 Subject: [Python-checkins] cpython: test commit on non-forbidden branch Message-ID: antoine.pitrou pushed 4479573a8b69 to cpython: http://hg.python.org/cpython/rev/4479573a8b69 changeset: 68058:4479573a8b69 tag: tip user: Antoine Pitrou date: Sat Feb 26 21:28:41 2011 +0100 summary: test commit on non-forbidden branch files: LICENSE diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -9,7 +9,7 @@ And wrinkled lip, and sneer of cold command Tell that its sculptor well those passions read Which yet survive, stamped on these lifeless things, -The hand that mocked them and the heart that fed. +The void that mocked them and the heart that fed. And on the pedestal these words appear: "My name is Ozymandias, king of kings: Look on my works, ye Mighty, and despair!" 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 21:34:16 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 21:34:16 +0100 Subject: [Python-checkins] merge in pymigr: Merge Message-ID: antoine.pitrou pushed 026ab115475f to pymigr: http://hg.python.org/pymigr/rev/026ab115475f changeset: 111:026ab115475f parent: 110:1c89cd975318 parent: 109:cafb761c4175 user: Antoine Pitrou date: Sat Feb 26 21:31:21 2011 +0100 summary: Merge files: diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -5,6 +5,9 @@ (issue links are done, rietveld integration pending) * resolve open issues in PEP 385 * (optionally) fight some more about workflow and decide the branch structure +* some hook should prevent pushing python files indented by tabs. +* some hook should prevent pushing to the 2.x trunk. +* some hook should prevent breaking EOL conventions. Done ==== @@ -32,7 +35,9 @@ * set up automatic installation of changes to ssh keys, decide upon account managers -* adapt build identification for Windows build process +* adapt build identification for Windows build process (see + build-identification directory) +* adapt Python-ast.c version generation process Buildbot -------- -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sat Feb 26 21:34:16 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 21:34:16 +0100 Subject: [Python-checkins] pymigr: Add current / sample .hg/hgrc for the server Message-ID: antoine.pitrou pushed 1c89cd975318 to pymigr: http://hg.python.org/pymigr/rev/1c89cd975318 changeset: 110:1c89cd975318 parent: 105:ddaae6fa1dea user: Antoine Pitrou date: Sat Feb 26 21:31:11 2011 +0100 summary: Add current / sample .hg/hgrc for the server files: README.txt diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -67,3 +67,25 @@ $ hg bundle -a cpython.bundle +- Setup hooks and on other configuration on the public repository's .hg/hgrc: + + [web] + description = CPython repository + contact = python-dev at python.org + + [extensions] + # Needed for the hook to work, otherwise it will be a no-op + eol = + + [hooks] + + # Uses our modified version of eol.py (see http://mercurial.selenic.com/bts/issue2660) + pretxnchangegroup.eol = python:/home/hg/mercurial-1.7.5/hgext/eol.py:hook + #pretxnchangegroup.eol = python:hgext.eol.hook + + pretxnchangegroup.checkbranch = python:/home/hg/repos/hooks/checkbranch.py:hook + incoming.notify = python:/home/hg/repos/hooks/mail.py:incoming + + [mail] + notify = python-checkins at python.org + -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sat Feb 26 21:34:16 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 26 Feb 2011 21:34:16 +0100 Subject: [Python-checkins] pymigr: Update todo Message-ID: antoine.pitrou pushed 3a259f1c9553 to pymigr: http://hg.python.org/pymigr/rev/3a259f1c9553 changeset: 112:3a259f1c9553 tag: tip user: Antoine Pitrou date: Sat Feb 26 21:34:14 2011 +0100 summary: Update todo files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -1,13 +1,10 @@ Before final conversion (tentatively 2011-03-05) ================================================ -* HALF DONE investigate roundup changes required - (issue links are done, rietveld integration pending) * resolve open issues in PEP 385 * (optionally) fight some more about workflow and decide the branch structure * some hook should prevent pushing python files indented 
by tabs. -* some hook should prevent pushing to the 2.x trunk. -* some hook should prevent breaking EOL conventions. + -> done in checkwhitespace.py? Done ==== @@ -21,6 +18,12 @@ -> see svnrev.py in extensions repo * craft .hgeol and make sure we comply with it -> should be done, see py3k svn branch +* some hook should prevent pushing to the 2.x trunk. + -> should be done, see cpython test repo +* some hook should prevent breaking EOL conventions. + -> should be done, see cpython test repo +* roundup integration (issue links, changeset links) + -> should be done Optional/Low priority ===================== @@ -29,16 +32,18 @@ * check if we should speed up svn revision matching/lookups (since the lookup WSGI app resides permanently in a mod_wsgi process, building a cache at startup should be sufficient) +* investigate roundup + rietveld integration changes required After migration =============== -* set up automatic installation of changes to ssh keys, decide upon +* set up automatic (?) installation of changes to ssh keys, decide upon account managers * adapt build identification for Windows build process (see build-identification directory) * adapt Python-ast.c version generation process + Buildbot -------- -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sat Feb 26 22:32:17 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sat, 26 Feb 2011 22:32:17 +0100 (CET) Subject: [Python-checkins] r88658 - in python/branches/py3k: Doc/reference/executionmodel.rst Modules/posixmodule.c Message-ID: <20110226213217.0482EEE9EA@mail.python.org> Author: benjamin.peterson Date: Sat Feb 26 22:32:16 2011 New Revision: 88658 Log: this isn't true anymore Modified: python/branches/py3k/Doc/reference/executionmodel.rst python/branches/py3k/Modules/posixmodule.c Modified: python/branches/py3k/Doc/reference/executionmodel.rst ============================================================================== --- python/branches/py3k/Doc/reference/executionmodel.rst (original) +++ python/branches/py3k/Doc/reference/executionmodel.rst Sat Feb 26 22:32:16 2011 @@ -94,9 +94,7 @@ at the module level. A target occurring in a :keyword:`del` statement is also considered bound for -this purpose (though the actual semantics are to unbind the name). It is -illegal to unbind a name that is referenced by an enclosing scope; the compiler -will report a :exc:`SyntaxError`. +this purpose (though the actual semantics are to unbind the name). Each assignment or import statement occurs within a block defined by a class or function definition or at the module level (the top-level code block). 
Modified: python/branches/py3k/Modules/posixmodule.c ============================================================================== --- python/branches/py3k/Modules/posixmodule.c (original) +++ python/branches/py3k/Modules/posixmodule.c Sat Feb 26 22:32:16 2011 @@ -5935,9 +5935,8 @@ int flags = 0; sf.headers = NULL; sf.trailers = NULL; - static char *keywords[] = {"out", "in", - "offset", "count", - "headers", "trailers", "flags", NULL}; + static char *keywords[] = {"out", "in", "offset", "count", "headers", + "trailers", "flags", NULL}; #ifdef __APPLE__ if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iiO&O&|OOi:sendfile", From python-checkins at python.org Sat Feb 26 22:34:51 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sat, 26 Feb 2011 22:34:51 +0100 (CET) Subject: [Python-checkins] r88659 - in python/branches/release32-maint: Doc/reference/executionmodel.rst Message-ID: <20110226213451.BBC92EEA26@mail.python.org> Author: benjamin.peterson Date: Sat Feb 26 22:34:51 2011 New Revision: 88659 Log: Merged revisions 88658 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88658 | benjamin.peterson | 2011-02-26 15:32:16 -0600 (Sat, 26 Feb 2011) | 1 line this isn't true anymore ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Doc/reference/executionmodel.rst Modified: python/branches/release32-maint/Doc/reference/executionmodel.rst ============================================================================== --- python/branches/release32-maint/Doc/reference/executionmodel.rst (original) +++ python/branches/release32-maint/Doc/reference/executionmodel.rst Sat Feb 26 22:34:51 2011 @@ -94,9 +94,7 @@ at the module level. A target occurring in a :keyword:`del` statement is also considered bound for -this purpose (though the actual semantics are to unbind the name). It is -illegal to unbind a name that is referenced by an enclosing scope; the compiler -will report a :exc:`SyntaxError`. +this purpose (though the actual semantics are to unbind the name). Each assignment or import statement occurs within a block defined by a class or function definition or at the module level (the top-level code block). 
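
For context on the two checkins above: the sentence removed from executionmodel.rst
described a compile-time rule that no longer holds on these branches. Deleting a name
that a nested scope references is no longer rejected with a SyntaxError; if anything
goes wrong, it now happens at run time when the nested function actually looks the
name up. A minimal sketch of the new behaviour (illustrative only, not part of either
checkin; the exact error message may vary):

    def outer():
        x = 1
        def inner():
            return x       # 'x' is a free variable of inner()
        del x              # formerly rejected at compile time with SyntaxError
        return inner

    f = outer()
    try:
        f()                # the unbound free variable now surfaces here
    except NameError as exc:
        print(exc)         # e.g. free variable 'x' referenced before assignment
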
From python-checkins at python.org Sat Feb 26 22:35:16 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sat, 26 Feb 2011 22:35:16 +0100 (CET) Subject: [Python-checkins] r88660 - python/branches/py3k/Modules/posixmodule.c Message-ID: <20110226213516.A2368EEA1E@mail.python.org> Author: benjamin.peterson Date: Sat Feb 26 22:35:16 2011 New Revision: 88660 Log: revert accidental formatting change Modified: python/branches/py3k/Modules/posixmodule.c Modified: python/branches/py3k/Modules/posixmodule.c ============================================================================== --- python/branches/py3k/Modules/posixmodule.c (original) +++ python/branches/py3k/Modules/posixmodule.c Sat Feb 26 22:35:16 2011 @@ -5935,8 +5935,9 @@ int flags = 0; sf.headers = NULL; sf.trailers = NULL; - static char *keywords[] = {"out", "in", "offset", "count", "headers", - "trailers", "flags", NULL}; + static char *keywords[] = {"out", "in", + "offset", "count", + "headers", "trailers", "flags", NULL}; #ifdef __APPLE__ if (!PyArg_ParseTupleAndKeywords(args, kwdict, "iiO&O&|OOi:sendfile", From python-checkins at python.org Sat Feb 26 23:06:24 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sat, 26 Feb 2011 23:06:24 +0100 (CET) Subject: [Python-checkins] r88661 - in sandbox/trunk/2to3/lib2to3: patcomp.py pgen2/driver.py tests/test_parser.py Message-ID: <20110226220624.F02A1EEA09@mail.python.org> Author: benjamin.peterson Date: Sat Feb 26 23:06:24 2011 New Revision: 88661 Log: fix refactoring on formfeed characters #11250 This is because text.splitlines() is not the same as list(StringIO.StringIO(text)). Modified: sandbox/trunk/2to3/lib2to3/patcomp.py sandbox/trunk/2to3/lib2to3/pgen2/driver.py sandbox/trunk/2to3/lib2to3/tests/test_parser.py Modified: sandbox/trunk/2to3/lib2to3/patcomp.py ============================================================================== --- sandbox/trunk/2to3/lib2to3/patcomp.py (original) +++ sandbox/trunk/2to3/lib2to3/patcomp.py Sat Feb 26 23:06:24 2011 @@ -12,6 +12,7 @@ # Python imports import os +import StringIO # Fairly local imports from .pgen2 import driver, literals, token, tokenize, parse, grammar @@ -32,7 +33,7 @@ def tokenize_wrapper(input): """Tokenizes a string suppressing significant whitespace.""" skip = set((token.NEWLINE, token.INDENT, token.DEDENT)) - tokens = tokenize.generate_tokens(driver.generate_lines(input).next) + tokens = tokenize.generate_tokens(StringIO.StringIO(input).readline) for quintuple in tokens: type, value, start, end, line_text = quintuple if type not in skip: Modified: sandbox/trunk/2to3/lib2to3/pgen2/driver.py ============================================================================== --- sandbox/trunk/2to3/lib2to3/pgen2/driver.py (original) +++ sandbox/trunk/2to3/lib2to3/pgen2/driver.py Sat Feb 26 23:06:24 2011 @@ -19,6 +19,7 @@ import codecs import os import logging +import StringIO import sys # Pgen imports @@ -101,18 +102,10 @@ def parse_string(self, text, debug=False): """Parse a string and return the syntax tree.""" - tokens = tokenize.generate_tokens(generate_lines(text).next) + tokens = tokenize.generate_tokens(StringIO.StringIO(text).readline) return self.parse_tokens(tokens, debug) -def generate_lines(text): - """Generator that behaves like readline without using StringIO.""" - for line in text.splitlines(True): - yield line - while True: - yield "" - - def load_grammar(gt="Grammar.txt", gp=None, save=True, force=False, logger=None): """Load the grammar (maybe from a pickle).""" Modified: 
sandbox/trunk/2to3/lib2to3/tests/test_parser.py ============================================================================== --- sandbox/trunk/2to3/lib2to3/tests/test_parser.py (original) +++ sandbox/trunk/2to3/lib2to3/tests/test_parser.py Sat Feb 26 23:06:24 2011 @@ -19,6 +19,16 @@ # Local imports from lib2to3.pgen2 import tokenize from ..pgen2.parse import ParseError +from lib2to3.pygram import python_symbols as syms + + +class TestDriver(support.TestCase): + + def test_formfeed(self): + s = """print 1\n\x0Cprint 2\n""" + t = driver.parse_string(s) + self.assertEqual(t.children[0].children[0].type, syms.print_stmt) + self.assertEqual(t.children[1].children[0].type, syms.print_stmt) class GrammarTest(support.TestCase): From python-checkins at python.org Sat Feb 26 23:11:02 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sat, 26 Feb 2011 23:11:02 +0100 (CET) Subject: [Python-checkins] r88662 - in python/branches/release27-maint/Lib/lib2to3: __main__.py patcomp.py pgen2/driver.py tests/test_parser.py Message-ID: <20110226221102.B1169EE99F@mail.python.org> Author: benjamin.peterson Date: Sat Feb 26 23:11:02 2011 New Revision: 88662 Log: Merged revisions 88535,88661 via svnmerge from svn+ssh://pythondev at svn.python.org/sandbox/trunk/2to3/lib2to3 ........ r88535 | brett.cannon | 2011-02-23 13:46:46 -0600 (Wed, 23 Feb 2011) | 1 line Add lib2to3.__main__ for easy testing from the console. ........ r88661 | benjamin.peterson | 2011-02-26 16:06:24 -0600 (Sat, 26 Feb 2011) | 6 lines fix refactoring on formfeed characters #11250 This is because text.splitlines() is not the same as list(StringIO.StringIO(text)). ........ Added: python/branches/release27-maint/Lib/lib2to3/__main__.py - copied unchanged from r88661, /sandbox/trunk/2to3/lib2to3/__main__.py Modified: python/branches/release27-maint/Lib/lib2to3/ (props changed) python/branches/release27-maint/Lib/lib2to3/patcomp.py python/branches/release27-maint/Lib/lib2to3/pgen2/driver.py python/branches/release27-maint/Lib/lib2to3/tests/test_parser.py Modified: python/branches/release27-maint/Lib/lib2to3/patcomp.py ============================================================================== --- python/branches/release27-maint/Lib/lib2to3/patcomp.py (original) +++ python/branches/release27-maint/Lib/lib2to3/patcomp.py Sat Feb 26 23:11:02 2011 @@ -12,6 +12,7 @@ # Python imports import os +import StringIO # Fairly local imports from .pgen2 import driver, literals, token, tokenize, parse, grammar @@ -32,7 +33,7 @@ def tokenize_wrapper(input): """Tokenizes a string suppressing significant whitespace.""" skip = set((token.NEWLINE, token.INDENT, token.DEDENT)) - tokens = tokenize.generate_tokens(driver.generate_lines(input).next) + tokens = tokenize.generate_tokens(StringIO.StringIO(input).readline) for quintuple in tokens: type, value, start, end, line_text = quintuple if type not in skip: Modified: python/branches/release27-maint/Lib/lib2to3/pgen2/driver.py ============================================================================== --- python/branches/release27-maint/Lib/lib2to3/pgen2/driver.py (original) +++ python/branches/release27-maint/Lib/lib2to3/pgen2/driver.py Sat Feb 26 23:11:02 2011 @@ -19,6 +19,7 @@ import codecs import os import logging +import StringIO import sys # Pgen imports @@ -101,18 +102,10 @@ def parse_string(self, text, debug=False): """Parse a string and return the syntax tree.""" - tokens = tokenize.generate_tokens(generate_lines(text).next) + tokens = 
tokenize.generate_tokens(StringIO.StringIO(text).readline) return self.parse_tokens(tokens, debug) -def generate_lines(text): - """Generator that behaves like readline without using StringIO.""" - for line in text.splitlines(True): - yield line - while True: - yield "" - - def load_grammar(gt="Grammar.txt", gp=None, save=True, force=False, logger=None): """Load the grammar (maybe from a pickle).""" Modified: python/branches/release27-maint/Lib/lib2to3/tests/test_parser.py ============================================================================== --- python/branches/release27-maint/Lib/lib2to3/tests/test_parser.py (original) +++ python/branches/release27-maint/Lib/lib2to3/tests/test_parser.py Sat Feb 26 23:11:02 2011 @@ -19,6 +19,16 @@ # Local imports from lib2to3.pgen2 import tokenize from ..pgen2.parse import ParseError +from lib2to3.pygram import python_symbols as syms + + +class TestDriver(support.TestCase): + + def test_formfeed(self): + s = """print 1\n\x0Cprint 2\n""" + t = driver.parse_string(s) + self.assertEqual(t.children[0].children[0].type, syms.print_stmt) + self.assertEqual(t.children[1].children[0].type, syms.print_stmt) class GrammarTest(support.TestCase): From python-checkins at python.org Sat Feb 26 23:12:10 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sat, 26 Feb 2011 23:12:10 +0100 (CET) Subject: [Python-checkins] r88663 - in python/branches/py3k/Lib/lib2to3: patcomp.py pgen2/driver.py tests/test_parser.py Message-ID: <20110226221210.7E6FEEE9BA@mail.python.org> Author: benjamin.peterson Date: Sat Feb 26 23:12:10 2011 New Revision: 88663 Log: Merged revisions 88661 via svnmerge from svn+ssh://pythondev at svn.python.org/sandbox/trunk/2to3/lib2to3 ........ r88661 | benjamin.peterson | 2011-02-26 16:06:24 -0600 (Sat, 26 Feb 2011) | 6 lines fix refactoring on formfeed characters #11250 This is because text.splitlines() is not the same as list(StringIO.StringIO(text)). ........ 
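
The rationale quoted in the log above is easy to see in isolation: str.splitlines()
treats the form feed character as a line boundary, while reading the same text line
by line from a StringIO splits only on newlines, which is what the tokenizer expects.
A small sketch of the difference (Python 3, matching the py3k change below; the
printed values are what I would expect, not output captured from the checkin):

    import io

    s = "print 1\n\x0cprint 2\n"

    # splitlines() also breaks on the form feed ...
    print(s.splitlines(True))    # ['print 1\n', '\x0c', 'print 2\n']

    # ... while line iteration over a StringIO breaks only on '\n'.
    print(list(io.StringIO(s)))  # ['print 1\n', '\x0cprint 2\n']
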
Modified: python/branches/py3k/Lib/lib2to3/ (props changed) python/branches/py3k/Lib/lib2to3/patcomp.py python/branches/py3k/Lib/lib2to3/pgen2/driver.py python/branches/py3k/Lib/lib2to3/tests/test_parser.py Modified: python/branches/py3k/Lib/lib2to3/patcomp.py ============================================================================== --- python/branches/py3k/Lib/lib2to3/patcomp.py (original) +++ python/branches/py3k/Lib/lib2to3/patcomp.py Sat Feb 26 23:12:10 2011 @@ -11,6 +11,7 @@ __author__ = "Guido van Rossum " # Python imports +import io import os # Fairly local imports @@ -32,7 +33,7 @@ def tokenize_wrapper(input): """Tokenizes a string suppressing significant whitespace.""" skip = set((token.NEWLINE, token.INDENT, token.DEDENT)) - tokens = tokenize.generate_tokens(driver.generate_lines(input).__next__) + tokens = tokenize.generate_tokens(io.StringIO(input).readline) for quintuple in tokens: type, value, start, end, line_text = quintuple if type not in skip: Modified: python/branches/py3k/Lib/lib2to3/pgen2/driver.py ============================================================================== --- python/branches/py3k/Lib/lib2to3/pgen2/driver.py (original) +++ python/branches/py3k/Lib/lib2to3/pgen2/driver.py Sat Feb 26 23:12:10 2011 @@ -17,6 +17,7 @@ # Python imports import codecs +import io import os import logging import sys @@ -101,18 +102,10 @@ def parse_string(self, text, debug=False): """Parse a string and return the syntax tree.""" - tokens = tokenize.generate_tokens(generate_lines(text).__next__) + tokens = tokenize.generate_tokens(io.StringIO(text).readline) return self.parse_tokens(tokens, debug) -def generate_lines(text): - """Generator that behaves like readline without using StringIO.""" - for line in text.splitlines(True): - yield line - while True: - yield "" - - def load_grammar(gt="Grammar.txt", gp=None, save=True, force=False, logger=None): """Load the grammar (maybe from a pickle).""" Modified: python/branches/py3k/Lib/lib2to3/tests/test_parser.py ============================================================================== --- python/branches/py3k/Lib/lib2to3/tests/test_parser.py (original) +++ python/branches/py3k/Lib/lib2to3/tests/test_parser.py Sat Feb 26 23:12:10 2011 @@ -18,6 +18,16 @@ # Local imports from lib2to3.pgen2 import tokenize from ..pgen2.parse import ParseError +from lib2to3.pygram import python_symbols as syms + + +class TestDriver(support.TestCase): + + def test_formfeed(self): + s = """print 1\n\x0Cprint 2\n""" + t = driver.parse_string(s) + self.assertEqual(t.children[0].children[0].type, syms.print_stmt) + self.assertEqual(t.children[1].children[0].type, syms.print_stmt) class GrammarTest(support.TestCase): From python-checkins at python.org Sat Feb 26 23:15:36 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 23:15:36 +0100 Subject: [Python-checkins] merge in cpython: Merged Message-ID: martin.v.loewis pushed 879109e477ec to cpython: http://hg.python.org/cpython/rev/879109e477ec changeset: 68060:879109e477ec tag: tip parent: 68058:4479573a8b69 parent: 68059:3e6eafb66c96 user: Martin v. 
L?wis date: Sat Feb 26 23:14:38 2011 +0100 summary: Merged files: diff --git a/Include/Python.h b/Include/Python.h --- a/Include/Python.h +++ b/Include/Python.h @@ -4,7 +4,7 @@ /* Include nearly all Python header files */ -#include "patchlevel.h" +#include "patchlevel.h" #include "pyconfig.h" #include "pymacconfig.h" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 23:15:36 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 23:15:36 +0100 Subject: [Python-checkins] cpython: Drop extra CR Message-ID: martin.v.loewis pushed 3e6eafb66c96 to cpython: http://hg.python.org/cpython/rev/3e6eafb66c96 changeset: 68059:3e6eafb66c96 parent: 68057:6da2ffc992f4 user: Martin v. L?wis date: Sat Feb 26 23:00:49 2011 +0100 summary: Drop extra CR files: Include/Python.h diff --git a/Include/Python.h b/Include/Python.h --- a/Include/Python.h +++ b/Include/Python.h @@ -4,7 +4,7 @@ /* Include nearly all Python header files */ -#include "patchlevel.h" +#include "patchlevel.h" #include "pyconfig.h" #include "pymacconfig.h" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 23:20:51 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 23:20:51 +0100 Subject: [Python-checkins] cpython: Change all line endings. Message-ID: martin.v.loewis pushed dbfb2158b657 to cpython: http://hg.python.org/cpython/rev/dbfb2158b657 changeset: 68061:dbfb2158b657 parent: 68059:3e6eafb66c96 user: Martin v. L?wis date: Sat Feb 26 23:07:08 2011 +0100 summary: Change all line endings. files: Include/asdl.h diff --git a/Include/asdl.h b/Include/asdl.h --- a/Include/asdl.h +++ b/Include/asdl.h @@ -1,41 +1,42 @@ -#ifndef Py_ASDL_H -#define Py_ASDL_H - -typedef PyObject * identifier; -typedef PyObject * string; -typedef PyObject * object; - -/* It would be nice if the code generated by asdl_c.py was completely - independent of Python, but it is a goal the requires too much work - at this stage. So, for example, I'll represent identifiers as - interned Python strings. -*/ - -/* XXX A sequence should be typed so that its use can be typechecked. */ - -typedef struct { - int size; - void *elements[1]; -} asdl_seq; - -typedef struct { - int size; - int elements[1]; -} asdl_int_seq; - -asdl_seq *asdl_seq_new(int size, PyArena *arena); -asdl_int_seq *asdl_int_seq_new(int size, PyArena *arena); - -#define asdl_seq_GET(S, I) (S)->elements[(I)] -#define asdl_seq_LEN(S) ((S) == NULL ? 0 : (S)->size) -#ifdef Py_DEBUG -#define asdl_seq_SET(S, I, V) { \ - int _asdl_i = (I); \ - assert((S) && _asdl_i < (S)->size); \ - (S)->elements[_asdl_i] = (V); \ -} -#else -#define asdl_seq_SET(S, I, V) (S)->elements[I] = (V) -#endif - -#endif /* !Py_ASDL_H */ +#ifndef Py_ASDL_H +#define Py_ASDL_H + +typedef PyObject * identifier; +typedef PyObject * string; +typedef PyObject * object; +/* Some new comment. What line ending is used? */ + +/* It would be nice if the code generated by asdl_c.py was completely + independent of Python, but it is a goal the requires too much work + at this stage. So, for example, I'll represent identifiers as + interned Python strings. +*/ + +/* XXX A sequence should be typed so that its use can be typechecked. 
*/ + +typedef struct { + int size; + void *elements[1]; +} asdl_seq; + +typedef struct { + int size; + int elements[1]; +} asdl_int_seq; + +asdl_seq *asdl_seq_new(int size, PyArena *arena); +asdl_int_seq *asdl_int_seq_new(int size, PyArena *arena); + +#define asdl_seq_GET(S, I) (S)->elements[(I)] +#define asdl_seq_LEN(S) ((S) == NULL ? 0 : (S)->size) +#ifdef Py_DEBUG +#define asdl_seq_SET(S, I, V) { \ + int _asdl_i = (I); \ + assert((S) && _asdl_i < (S)->size); \ + (S)->elements[_asdl_i] = (V); \ +} +#else +#define asdl_seq_SET(S, I, V) (S)->elements[I] = (V) +#endif + +#endif /* !Py_ASDL_H */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 23:20:51 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 23:20:51 +0100 Subject: [Python-checkins] merge in cpython: Merged Message-ID: martin.v.loewis pushed cd743c54fa9c to cpython: http://hg.python.org/cpython/rev/cd743c54fa9c changeset: 68062:cd743c54fa9c parent: 68060:879109e477ec parent: 68061:dbfb2158b657 user: Martin v. L?wis date: Sat Feb 26 23:16:09 2011 +0100 summary: Merged files: diff --git a/Include/asdl.h b/Include/asdl.h --- a/Include/asdl.h +++ b/Include/asdl.h @@ -1,41 +1,42 @@ -#ifndef Py_ASDL_H -#define Py_ASDL_H - -typedef PyObject * identifier; -typedef PyObject * string; -typedef PyObject * object; - -/* It would be nice if the code generated by asdl_c.py was completely - independent of Python, but it is a goal the requires too much work - at this stage. So, for example, I'll represent identifiers as - interned Python strings. -*/ - -/* XXX A sequence should be typed so that its use can be typechecked. */ - -typedef struct { - int size; - void *elements[1]; -} asdl_seq; - -typedef struct { - int size; - int elements[1]; -} asdl_int_seq; - -asdl_seq *asdl_seq_new(int size, PyArena *arena); -asdl_int_seq *asdl_int_seq_new(int size, PyArena *arena); - -#define asdl_seq_GET(S, I) (S)->elements[(I)] -#define asdl_seq_LEN(S) ((S) == NULL ? 0 : (S)->size) -#ifdef Py_DEBUG -#define asdl_seq_SET(S, I, V) { \ - int _asdl_i = (I); \ - assert((S) && _asdl_i < (S)->size); \ - (S)->elements[_asdl_i] = (V); \ -} -#else -#define asdl_seq_SET(S, I, V) (S)->elements[I] = (V) -#endif - -#endif /* !Py_ASDL_H */ +#ifndef Py_ASDL_H +#define Py_ASDL_H + +typedef PyObject * identifier; +typedef PyObject * string; +typedef PyObject * object; +/* Some new comment. What line ending is used? */ + +/* It would be nice if the code generated by asdl_c.py was completely + independent of Python, but it is a goal the requires too much work + at this stage. So, for example, I'll represent identifiers as + interned Python strings. +*/ + +/* XXX A sequence should be typed so that its use can be typechecked. */ + +typedef struct { + int size; + void *elements[1]; +} asdl_seq; + +typedef struct { + int size; + int elements[1]; +} asdl_int_seq; + +asdl_seq *asdl_seq_new(int size, PyArena *arena); +asdl_int_seq *asdl_int_seq_new(int size, PyArena *arena); + +#define asdl_seq_GET(S, I) (S)->elements[(I)] +#define asdl_seq_LEN(S) ((S) == NULL ? 
0 : (S)->size) +#ifdef Py_DEBUG +#define asdl_seq_SET(S, I, V) { \ + int _asdl_i = (I); \ + assert((S) && _asdl_i < (S)->size); \ + (S)->elements[_asdl_i] = (V); \ +} +#else +#define asdl_seq_SET(S, I, V) (S)->elements[I] = (V) +#endif + +#endif /* !Py_ASDL_H */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Feb 26 23:20:51 2011 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 26 Feb 2011 23:20:51 +0100 Subject: [Python-checkins] cpython: Revert line endings to Unix. Message-ID: martin.v.loewis pushed 8cdddac5ad28 to cpython: http://hg.python.org/cpython/rev/8cdddac5ad28 changeset: 68063:8cdddac5ad28 tag: tip user: Martin v. L?wis date: Sat Feb 26 23:20:37 2011 +0100 summary: Revert line endings to Unix. files: Include/asdl.h diff --git a/Include/asdl.h b/Include/asdl.h --- a/Include/asdl.h +++ b/Include/asdl.h @@ -1,42 +1,42 @@ -#ifndef Py_ASDL_H -#define Py_ASDL_H - -typedef PyObject * identifier; -typedef PyObject * string; -typedef PyObject * object; -/* Some new comment. What line ending is used? */ - -/* It would be nice if the code generated by asdl_c.py was completely - independent of Python, but it is a goal the requires too much work - at this stage. So, for example, I'll represent identifiers as - interned Python strings. -*/ - -/* XXX A sequence should be typed so that its use can be typechecked. */ - -typedef struct { - int size; - void *elements[1]; -} asdl_seq; - -typedef struct { - int size; - int elements[1]; -} asdl_int_seq; - -asdl_seq *asdl_seq_new(int size, PyArena *arena); -asdl_int_seq *asdl_int_seq_new(int size, PyArena *arena); - -#define asdl_seq_GET(S, I) (S)->elements[(I)] -#define asdl_seq_LEN(S) ((S) == NULL ? 0 : (S)->size) -#ifdef Py_DEBUG -#define asdl_seq_SET(S, I, V) { \ - int _asdl_i = (I); \ - assert((S) && _asdl_i < (S)->size); \ - (S)->elements[_asdl_i] = (V); \ -} -#else -#define asdl_seq_SET(S, I, V) (S)->elements[I] = (V) -#endif - -#endif /* !Py_ASDL_H */ +#ifndef Py_ASDL_H +#define Py_ASDL_H + +typedef PyObject * identifier; +typedef PyObject * string; +typedef PyObject * object; +/* Some new comment. What line ending is used? */ + +/* It would be nice if the code generated by asdl_c.py was completely + independent of Python, but it is a goal the requires too much work + at this stage. So, for example, I'll represent identifiers as + interned Python strings. +*/ + +/* XXX A sequence should be typed so that its use can be typechecked. */ + +typedef struct { + int size; + void *elements[1]; +} asdl_seq; + +typedef struct { + int size; + int elements[1]; +} asdl_int_seq; + +asdl_seq *asdl_seq_new(int size, PyArena *arena); +asdl_int_seq *asdl_int_seq_new(int size, PyArena *arena); + +#define asdl_seq_GET(S, I) (S)->elements[(I)] +#define asdl_seq_LEN(S) ((S) == NULL ? 
0 : (S)->size) +#ifdef Py_DEBUG +#define asdl_seq_SET(S, I, V) { \ + int _asdl_i = (I); \ + assert((S) && _asdl_i < (S)->size); \ + (S)->elements[_asdl_i] = (V); \ +} +#else +#define asdl_seq_SET(S, I, V) (S)->elements[I] = (V) +#endif + +#endif /* !Py_ASDL_H */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Feb 27 00:24:07 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 00:24:07 +0100 (CET) Subject: [Python-checkins] r88664 - in python/branches/py3k: Lib/ssl.py Lib/test/test_ssl.py Misc/NEWS Message-ID: <20110226232407.2E5EEEEA03@mail.python.org> Author: antoine.pitrou Date: Sun Feb 27 00:24:06 2011 New Revision: 88664 Log: Issue #11326: Add the missing connect_ex() implementation for SSL sockets, and make it work for non-blocking connects. Modified: python/branches/py3k/Lib/ssl.py python/branches/py3k/Lib/test/test_ssl.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Lib/ssl.py ============================================================================== --- python/branches/py3k/Lib/ssl.py (original) +++ python/branches/py3k/Lib/ssl.py Sun Feb 27 00:24:06 2011 @@ -237,6 +237,7 @@ self._closed = False self._sslobj = None + self._connected = connected if connected: # create the SSL object try: @@ -430,23 +431,36 @@ finally: self.settimeout(timeout) - def connect(self, addr): - """Connects to remote ADDR, and then wraps the connection in - an SSL channel.""" + def _real_connect(self, addr, return_errno): if self.server_side: raise ValueError("can't connect in server-side mode") # Here we assume that the socket is client-side, and not # connected at the time of the call. We connect it, then wrap it. - if self._sslobj: + if self._connected: raise ValueError("attempt to connect already-connected SSLSocket!") - socket.connect(self, addr) self._sslobj = self.context._wrap_socket(self, False, self.server_hostname) try: + socket.connect(self, addr) if self.do_handshake_on_connect: self.do_handshake() - except: - self._sslobj = None - raise + except socket_error as e: + if return_errno: + return e.errno + else: + self._sslobj = None + raise e + self._connected = True + return 0 + + def connect(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + self._real_connect(addr, False) + + def connect_ex(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + return self._real_connect(addr, True) def accept(self): """Accepts a new connection from a remote client, and returns Modified: python/branches/py3k/Lib/test/test_ssl.py ============================================================================== --- python/branches/py3k/Lib/test/test_ssl.py (original) +++ python/branches/py3k/Lib/test/test_ssl.py Sun Feb 27 00:24:06 2011 @@ -451,6 +451,49 @@ finally: s.close() + def test_connect_ex(self): + # Issue #11326: check connect_ex() implementation + with support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT) + try: + self.assertEqual(0, s.connect_ex(("svn.python.org", 443))) + self.assertTrue(s.getpeercert()) + finally: + s.close() + + def test_non_blocking_connect_ex(self): + # Issue #11326: non-blocking connect_ex() should allow handshake + # to proceed after the socket gets ready. 
+ with support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT, + do_handshake_on_connect=False) + try: + s.setblocking(False) + rc = s.connect_ex(('svn.python.org', 443)) + self.assertIn(rc, (0, errno.EINPROGRESS)) + # Wait for connect to finish + select.select([], [s], [], 5.0) + # Non-blocking handshake + while True: + try: + s.do_handshake() + break + except ssl.SSLError as err: + if err.args[0] == ssl.SSL_ERROR_WANT_READ: + select.select([s], [], [], 5.0) + elif err.args[0] == ssl.SSL_ERROR_WANT_WRITE: + select.select([], [s], [], 5.0) + else: + raise + # SSL established + self.assertTrue(s.getpeercert()) + finally: + s.close() + def test_connect_with_context(self): with support.transient_internet("svn.python.org"): # Same as test_connect, but with a separately created context Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Sun Feb 27 00:24:06 2011 @@ -35,6 +35,9 @@ Library ------- +- Issue #11326: Add the missing connect_ex() implementation for SSL sockets, + and make it work for non-blocking connects. + - Issue #11297: Add collections.ChainMap(). - Issue #10755: Add the posix.fdlistdir() function. Patch by Ross Lagerwall. From python-checkins at python.org Sun Feb 27 00:25:34 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 00:25:34 +0100 (CET) Subject: [Python-checkins] r88665 - in python/branches/release32-maint: Lib/ssl.py Lib/test/test_ssl.py Misc/NEWS Message-ID: <20110226232534.7B3C2EE9C3@mail.python.org> Author: antoine.pitrou Date: Sun Feb 27 00:25:34 2011 New Revision: 88665 Log: Merged revisions 88664 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88664 | antoine.pitrou | 2011-02-27 00:24:06 +0100 (dim., 27 f?vr. 2011) | 4 lines Issue #11326: Add the missing connect_ex() implementation for SSL sockets, and make it work for non-blocking connects. ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/ssl.py python/branches/release32-maint/Lib/test/test_ssl.py python/branches/release32-maint/Misc/NEWS Modified: python/branches/release32-maint/Lib/ssl.py ============================================================================== --- python/branches/release32-maint/Lib/ssl.py (original) +++ python/branches/release32-maint/Lib/ssl.py Sun Feb 27 00:25:34 2011 @@ -237,6 +237,7 @@ self._closed = False self._sslobj = None + self._connected = connected if connected: # create the SSL object try: @@ -430,23 +431,36 @@ finally: self.settimeout(timeout) - def connect(self, addr): - """Connects to remote ADDR, and then wraps the connection in - an SSL channel.""" + def _real_connect(self, addr, return_errno): if self.server_side: raise ValueError("can't connect in server-side mode") # Here we assume that the socket is client-side, and not # connected at the time of the call. We connect it, then wrap it. 
- if self._sslobj: + if self._connected: raise ValueError("attempt to connect already-connected SSLSocket!") - socket.connect(self, addr) self._sslobj = self.context._wrap_socket(self, False, self.server_hostname) try: + socket.connect(self, addr) if self.do_handshake_on_connect: self.do_handshake() - except: - self._sslobj = None - raise + except socket_error as e: + if return_errno: + return e.errno + else: + self._sslobj = None + raise e + self._connected = True + return 0 + + def connect(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + self._real_connect(addr, False) + + def connect_ex(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + return self._real_connect(addr, True) def accept(self): """Accepts a new connection from a remote client, and returns Modified: python/branches/release32-maint/Lib/test/test_ssl.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_ssl.py (original) +++ python/branches/release32-maint/Lib/test/test_ssl.py Sun Feb 27 00:25:34 2011 @@ -451,6 +451,49 @@ finally: s.close() + def test_connect_ex(self): + # Issue #11326: check connect_ex() implementation + with support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT) + try: + self.assertEqual(0, s.connect_ex(("svn.python.org", 443))) + self.assertTrue(s.getpeercert()) + finally: + s.close() + + def test_non_blocking_connect_ex(self): + # Issue #11326: non-blocking connect_ex() should allow handshake + # to proceed after the socket gets ready. + with support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT, + do_handshake_on_connect=False) + try: + s.setblocking(False) + rc = s.connect_ex(('svn.python.org', 443)) + self.assertIn(rc, (0, errno.EINPROGRESS)) + # Wait for connect to finish + select.select([], [s], [], 5.0) + # Non-blocking handshake + while True: + try: + s.do_handshake() + break + except ssl.SSLError as err: + if err.args[0] == ssl.SSL_ERROR_WANT_READ: + select.select([s], [], [], 5.0) + elif err.args[0] == ssl.SSL_ERROR_WANT_WRITE: + select.select([], [s], [], 5.0) + else: + raise + # SSL established + self.assertTrue(s.getpeercert()) + finally: + s.close() + def test_connect_with_context(self): with support.transient_internet("svn.python.org"): # Same as test_connect, but with a separately created context Modified: python/branches/release32-maint/Misc/NEWS ============================================================================== --- python/branches/release32-maint/Misc/NEWS (original) +++ python/branches/release32-maint/Misc/NEWS Sun Feb 27 00:25:34 2011 @@ -24,6 +24,9 @@ Library ------- +- Issue #11326: Add the missing connect_ex() implementation for SSL sockets, + and make it work for non-blocking connects. + - Issue #7322: Trying to read from a socket's file-like object after a timeout occurred now raises an error instead of silently losing data. 
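For readers skimming the ssl.py diff above: the new method follows the contract of plain socket.connect_ex(), i.e. connect() keeps raising socket.error on failure while connect_ex() hands the error number back as a return value. A small sketch of that difference, not taken from the commit (the loopback address is an assumption and the port is assumed to be closed)::

    import errno
    import socket
    import ssl

    ADDR = ("127.0.0.1", 50007)       # assumed: nothing is listening here

    s = ssl.wrap_socket(socket.socket(socket.AF_INET))
    rc = s.connect_ex(ADDR)           # no exception; the errno is returned
    print(rc == errno.ECONNREFUSED)
    s.close()

    s = ssl.wrap_socket(socket.socket(socket.AF_INET))
    try:
        s.connect(ADDR)               # the existing API still raises
    except socket.error as exc:
        print(exc.errno == errno.ECONNREFUSED)
    finally:
        s.close()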
From python-checkins at python.org Sun Feb 27 00:35:27 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 00:35:27 +0100 (CET) Subject: [Python-checkins] r88666 - in python/branches/release27-maint: Lib/ssl.py Lib/test/test_ssl.py Misc/NEWS Message-ID: <20110226233527.66A6BEEA2C@mail.python.org> Author: antoine.pitrou Date: Sun Feb 27 00:35:27 2011 New Revision: 88666 Log: Merged revisions 88664 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88664 | antoine.pitrou | 2011-02-27 00:24:06 +0100 (dim., 27 f?vr. 2011) | 4 lines Issue #11326: Add the missing connect_ex() implementation for SSL sockets, and make it work for non-blocking connects. ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Lib/ssl.py python/branches/release27-maint/Lib/test/test_ssl.py python/branches/release27-maint/Misc/NEWS Modified: python/branches/release27-maint/Lib/ssl.py ============================================================================== --- python/branches/release27-maint/Lib/ssl.py (original) +++ python/branches/release27-maint/Lib/ssl.py Sun Feb 27 00:35:27 2011 @@ -110,9 +110,11 @@ if e.errno != errno.ENOTCONN: raise # no, no connection yet + self._connected = False self._sslobj = None else: # yes, create the SSL object + self._connected = True self._sslobj = _ssl.sslwrap(self._sock, server_side, keyfile, certfile, cert_reqs, ssl_version, ca_certs, @@ -282,21 +284,36 @@ self._sslobj.do_handshake() - def connect(self, addr): - - """Connects to remote ADDR, and then wraps the connection in - an SSL channel.""" - + def _real_connect(self, addr, return_errno): # Here we assume that the socket is client-side, and not # connected at the time of the call. We connect it, then wrap it. 
- if self._sslobj: + if self._connected: raise ValueError("attempt to connect already-connected SSLSocket!") - socket.connect(self, addr) self._sslobj = _ssl.sslwrap(self._sock, False, self.keyfile, self.certfile, self.cert_reqs, self.ssl_version, self.ca_certs, self.ciphers) - if self.do_handshake_on_connect: - self.do_handshake() + try: + socket.connect(self, addr) + if self.do_handshake_on_connect: + self.do_handshake() + except socket_error as e: + if return_errno: + return e.errno + else: + self._sslobj = None + raise e + self._connected = True + return 0 + + def connect(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + self._real_connect(addr, False) + + def connect_ex(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + return self._real_connect(addr, True) def accept(self): Modified: python/branches/release27-maint/Lib/test/test_ssl.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_ssl.py (original) +++ python/branches/release27-maint/Lib/test/test_ssl.py Sun Feb 27 00:35:27 2011 @@ -225,6 +225,49 @@ finally: s.close() + def test_connect_ex(self): + # Issue #11326: check connect_ex() implementation + with test_support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT) + try: + self.assertEqual(0, s.connect_ex(("svn.python.org", 443))) + self.assertTrue(s.getpeercert()) + finally: + s.close() + + def test_non_blocking_connect_ex(self): + # Issue #11326: non-blocking connect_ex() should allow handshake + # to proceed after the socket gets ready. + with test_support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT, + do_handshake_on_connect=False) + try: + s.setblocking(False) + rc = s.connect_ex(('svn.python.org', 443)) + self.assertIn(rc, (0, errno.EINPROGRESS)) + # Wait for connect to finish + select.select([], [s], [], 5.0) + # Non-blocking handshake + while True: + try: + s.do_handshake() + break + except ssl.SSLError as err: + if err.args[0] == ssl.SSL_ERROR_WANT_READ: + select.select([s], [], [], 5.0) + elif err.args[0] == ssl.SSL_ERROR_WANT_WRITE: + select.select([], [s], [], 5.0) + else: + raise + # SSL established + self.assertTrue(s.getpeercert()) + finally: + s.close() + @unittest.skipIf(os.name == "nt", "Can't use a socket as a file under Windows") def test_makefile_close(self): # Issue #5238: creating a file-like object with makefile() shouldn't Modified: python/branches/release27-maint/Misc/NEWS ============================================================================== --- python/branches/release27-maint/Misc/NEWS (original) +++ python/branches/release27-maint/Misc/NEWS Sun Feb 27 00:35:27 2011 @@ -37,6 +37,9 @@ Library ------- +- Issue #11326: Add the missing connect_ex() implementation for SSL sockets, + and make it work for non-blocking connects. + - Issue #10956: Buffered I/O classes retry reading or writing after a signal has arrived and the handler returned successfully. 
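The non-blocking half of the change is what the new test_non_blocking_connect_ex exercises on each branch. Stripped of the test scaffolding, the calling pattern looks roughly like the following sketch (host, port and timeouts are copied from the tests; certificate verification is omitted here for brevity)::

    import errno
    import select
    import socket
    import ssl

    s = ssl.wrap_socket(socket.socket(socket.AF_INET),
                        do_handshake_on_connect=False)
    s.setblocking(False)

    rc = s.connect_ex(("svn.python.org", 443))
    assert rc in (0, errno.EINPROGRESS)

    # Wait for the TCP connect to finish, then drive the handshake by hand.
    select.select([], [s], [], 5.0)
    while True:
        try:
            s.do_handshake()
            break
        except ssl.SSLError as err:
            if err.args[0] == ssl.SSL_ERROR_WANT_READ:
                select.select([s], [], [], 5.0)
            elif err.args[0] == ssl.SSL_ERROR_WANT_WRITE:
                select.select([], [s], [], 5.0)
            else:
                raise

    print(s.cipher())                 # handshake done, SSL channel established
    s.close()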
From python-checkins at python.org Sun Feb 27 04:16:58 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:16:58 +0100 Subject: [Python-checkins] devguide: Update module name Message-ID: eric.araujo pushed c6a6c206c8cc to devguide: http://hg.python.org/devguide/rev/c6a6c206c8cc changeset: 322:c6a6c206c8cc parent: 312:e5da5c96d7f0 user: ?ric Araujo date: Sun Feb 27 02:38:30 2011 +0100 summary: Update module name files: experts.rst diff --git a/experts.rst b/experts.rst --- a/experts.rst +++ b/experts.rst @@ -77,7 +77,7 @@ codecs lemburg, doerwalter codeop collections rhettinger -collections._abcoll rhettinger, stutzbach +collections.abc rhettinger, stutzbach colorsys compileall concurrent.futures brian.quinlan -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:03 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:03 +0100 Subject: [Python-checkins] devguide: Add missing tracker stage and keyword Message-ID: eric.araujo pushed 7f7028199459 to devguide: http://hg.python.org/devguide/rev/7f7028199459 changeset: 323:7f7028199459 user: ?ric Araujo date: Sun Feb 27 02:41:35 2011 +0100 summary: Add missing tracker stage and keyword files: triaging.rst diff --git a/triaging.rst b/triaging.rst --- a/triaging.rst +++ b/triaging.rst @@ -30,6 +30,9 @@ What is next to advance the issue forward. The *stage* needn't be set until it is clear that the issue warrants fixing. +* test needed + The bug reporter should post a script or instructions to let a triager or + developper reproduce the issue. * needs patch The issue lacks a patch to solve the problem (i.e. fixing the bug, or adding the requested improvement). @@ -99,6 +102,8 @@ The patch attached to the issue is in need of a review. * patch There is a patch attached to the issue. +* 3.2regression + The issue is a regression in 3.2. Nosy List ''''''''' -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:03 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:03 +0100 Subject: [Python-checkins] devguide: A few typos Message-ID: eric.araujo pushed cdf8bd5571e3 to devguide: http://hg.python.org/devguide/rev/cdf8bd5571e3 changeset: 324:cdf8bd5571e3 user: ?ric Araujo date: Sun Feb 27 02:43:41 2011 +0100 summary: A few typos files: faq.rst grammar.rst triaging.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -117,7 +117,7 @@ svn commit [PATH] -Although ``[PATH]`` is optional, if PATH is omitted all changes +Although ``PATH`` is optional, if PATH is omitted all changes in your local copy will be committed to the repository. **DO NOT USE THIS!!!** You should specify the specific files to be committed unless you are *absolutely* positive that @@ -392,7 +392,7 @@ ssh-keygen -t rsa -This will generate a two files; your public key and your private key. Your +This will generate two files; your public key and your private key. Your public key is the file ending in ``.pub``. Windows diff --git a/grammar.rst b/grammar.rst --- a/grammar.rst +++ b/grammar.rst @@ -62,6 +62,6 @@ * After everything's been checked in, you're likely to see a new change to Python/Python-ast.c. This is because this - (generated) file contains the SVN version of the source from + (generated) file contains the svn version of the source from which it was generated. There's no way to avoid this; you just have to submit this file separately. 
diff --git a/triaging.rst b/triaging.rst --- a/triaging.rst +++ b/triaging.rst @@ -20,10 +20,10 @@ '''' Describes the type of issue. If something does not fit within any -specific type then simply do not set it. *"Crash"* is for hard crashes of +specific type then simply do not set it. *"crash"* is for hard crashes of the Python interpreter -- possibly with a core dump or a Windows error box -- and not erroneous exits because of an unhandled exception (the latter fall under -the *"behaviour"* category). +the *"behavior"* category). Stage ''''' @@ -188,7 +188,7 @@ * ``#``, ``issue``, ``issue `` links to the tracker issue ````. * ``msg`` links to the tracker message ````. -* ``r``, ``rev``, ``revision `` links to the VCS +* ``r``, ``rev``, ``revision `` links to the Subversion revision ````. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:04 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:04 +0100 Subject: [Python-checkins] devguide: Edit two links. Message-ID: eric.araujo pushed f5896ba61c7b to devguide: http://hg.python.org/devguide/rev/f5896ba61c7b changeset: 325:f5896ba61c7b user: ?ric Araujo date: Sun Feb 27 02:59:25 2011 +0100 summary: Edit two links. The text in the second link seemed awkward to me. files: docquality.rst setup.rst diff --git a/docquality.rst b/docquality.rst --- a/docquality.rst +++ b/docquality.rst @@ -32,7 +32,7 @@ `_ which discusses the documentation toolchain, projects, standards, etc. -.. _Documenting Python: http://docs.python.org/py3k/documenting/index.html +.. _Documenting Python: http://docs.python.org/dev/documenting/ Helping with issues filed on the issue tracker diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -14,8 +14,8 @@ Version Control Setup --------------------- -CPython is developed using `Subversion (commonly abbreviated SVN) -`_. +CPython is developed using `Subversion `_ +(commonly abbreviated svn, after the program name). It is easily available for common Unix systems by way of the standard package manager; under Windows, you might want to use the `TortoiseSVN `_ graphical client. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:05 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:05 +0100 Subject: [Python-checkins] devguide: Minor markup fixes. Message-ID: eric.araujo pushed 47b68547a383 to devguide: http://hg.python.org/devguide/rev/47b68547a383 changeset: 326:47b68547a383 user: ?ric Araujo date: Sun Feb 27 03:06:32 2011 +0100 summary: Minor markup fixes. - Don?t mix lists and definition lists - Don?t create gratuitous blockquotes - Remove one mention of Misc/developers.txt, moved to the devguide - Use built-in PEP role - Use compact table markup files: compiler.rst coredev.rst developers.rst emacs.rst experts.rst faq.rst triaging.rst diff --git a/compiler.rst b/compiler.rst --- a/compiler.rst +++ b/compiler.rst @@ -47,21 +47,21 @@ Querying data from the node structs can be done with the following macros (which are all defined in Include/token.h): -- ``CHILD(node *, int)`` +``CHILD(node *, int)`` Returns the nth child of the node using zero-offset indexing -- ``RCHILD(node *, int)`` +``RCHILD(node *, int)`` Returns the nth child of the node from the right side; use negative numbers! 
-- ``NCH(node *)`` +``NCH(node *)`` Number of children the node has -- ``STR(node *)`` +``STR(node *)`` String representation of the node; e.g., will return ``:`` for a COLON token -- ``TYPE(node *)`` +``TYPE(node *)`` The type of node as specified in ``Include/graminit.h`` -- ``REQ(node *, TYPE)`` +``REQ(node *, TYPE)`` Assert that the node is the type that is expected -- ``LINENO(node *)`` +``LINENO(node *)`` retrieve the line number of the source code that led to the creation of the parse rule; defined in Python/ast.c @@ -220,13 +220,13 @@ Function and macros for creating and using ``asdl_seq *`` types as found in Python/asdl.c and Include/asdl.h: -- ``asdl_seq_new()`` +``asdl_seq_new()`` Allocate memory for an asdl_seq for the specified length -- ``asdl_seq_GET()`` +``asdl_seq_GET()`` Get item held at a specific position in an asdl_seq -- ``asdl_seq_SET()`` +``asdl_seq_SET()`` Set a specific index in an asdl_seq to the specified value -- ``asdl_seq_LEN(asdl_seq *)`` +``asdl_seq_LEN(asdl_seq *)`` Return the length of an asdl_seq If you are working with statements, you must also worry about keeping @@ -315,23 +315,23 @@ Emission of bytecode is handled by the following macros: -- ``ADDOP()`` +``ADDOP()`` add a specified opcode -- ``ADDOP_I()`` +``ADDOP_I()`` add an opcode that takes an argument -- ``ADDOP_O(struct compiler *c, int op, PyObject *type, PyObject *obj)`` +``ADDOP_O(struct compiler *c, int op, PyObject *type, PyObject *obj)`` add an opcode with the proper argument based on the position of the specified PyObject in PyObject sequence object, but with no handling of mangled names; used for when you need to do named lookups of objects such as globals, consts, or parameters where name mangling is not possible and the scope of the name is known -- ``ADDOP_NAME()`` +``ADDOP_NAME()`` just like ADDOP_O, but name mangling is also handled; used for attribute loading or importing based on name -- ``ADDOP_JABS()`` +``ADDOP_JABS()`` create an absolute jump to a basic block -- ``ADDOP_JREL()`` +``ADDOP_JREL()`` create a relative jump to a basic block Several helper functions that will emit bytecode and are named @@ -348,11 +348,11 @@ creation of basic blocks must be done. Below are the macros and functions used for managing basic blocks: -- ``NEW_BLOCK()`` +``NEW_BLOCK()`` create block and set it as current -- ``NEXT_BLOCK()`` +``NEXT_BLOCK()`` basically NEW_BLOCK() plus jump from current block -- ``compiler_new_block()`` +``compiler_new_block()`` create a block but don't use it (used for generating jumps) Once the CFG is created, it must be flattened and then final emission of @@ -417,23 +417,23 @@ + Parser/ - - Python.asdl + Python.asdl ASDL syntax file - - asdl.py + asdl.py "An implementation of the Zephyr Abstract Syntax Definition Language." Uses SPARK_ to parse the ASDL files. - - asdl_c.py + asdl_c.py "Generate C code from an ASDL description." Generates Python/Python-ast.c and Include/Python-ast.h . - - spark.py + spark.py SPARK_ parser generator + Python/ - - Python-ast.c + Python-ast.c Creates C structs corresponding to the ASDL types. Also contains code for marshaling AST nodes (core ASDL types have marshaling code in asdl.c). "File automatically generated by @@ -441,76 +441,76 @@ after every grammar change is committed since the __version__ value is set to the latest grammar change revision number. - - asdl.c + asdl.c Contains code to handle the ASDL sequence type. Also has code to handle marshalling the core ASDL types, such as number and identifier. 
used by Python-ast.c for marshaling AST nodes. - - ast.c + ast.c Converts Python's parse tree into the abstract syntax tree. - - ceval.c + ceval.c Executes byte code (aka, eval loop). - - compile.c + compile.c Emits bytecode based on the AST. - - symtable.c + symtable.c Generates a symbol table from AST. - - pyarena.c + pyarena.c Implementation of the arena memory manager. - - import.c + import.c Home of the magic number (named ``MAGIC``) for bytecode versioning + Include/ - - Python-ast.h + Python-ast.h Contains the actual definitions of the C structs as generated by Python/Python-ast.c . "Automatically generated by Parser/asdl_c.py". - - asdl.h + asdl.h Header for the corresponding Python/ast.c . - - ast.h + ast.h Declares PyAST_FromNode() external (from Python/ast.c). - - code.h + code.h Header file for Objects/codeobject.c; contains definition of PyCodeObject. - - symtable.h + symtable.h Header for Python/symtable.c . struct symtable and PySTEntryObject are defined here. - - pyarena.h + pyarena.h Header file for the corresponding Python/pyarena.c . - - opcode.h + opcode.h Master list of bytecode; if this file is modified you must modify several other files accordingly (see "`Introducing New Bytecode`_") + Objects/ - - codeobject.c + codeobject.c Contains PyCodeObject-related code (originally in Python/compile.c). + Lib/ - - opcode.py + opcode.py One of the files that must be modified if Include/opcode.h is. - - compiler/ + compiler/ - * pyassem.py + pyassem.py One of the files that must be modified if Include/opcode.h is changed. - * pycodegen.py + pycodegen.py One of the files that must be modified if Include/opcode.h is changed. diff --git a/coredev.rst b/coredev.rst --- a/coredev.rst +++ b/coredev.rst @@ -77,7 +77,7 @@ c:\path\to\putty\plink.exe pythondev at svn.python.org -An entry in the ``Misc/developers.txt`` file should also be entered for you. +An entry in the :ref:`developers` should also be entered for you. Typically the person who sponsored your application to become a core developer makes sure an entry is created for you. diff --git a/developers.rst b/developers.rst --- a/developers.rst +++ b/developers.rst @@ -259,8 +259,7 @@ - George Yoshida (SF name "quiver") added to the SourceForge Python project 14 Apr 2006, by Tim Peters, as a tracker admin. See contemporaneous python-checkins thread with the unlikely Subject: - - r45329 - python/trunk/Doc/whatsnew/whatsnew25.tex + r45329 - python/trunk/Doc/whatsnew/whatsnew25.tex - Ronald Oussoren was given SVN access on 3 Mar 2006 by NCN, for Mac related work. @@ -297,8 +296,7 @@ - Eric S. Raymond was made a developer on 2 Jul 2000 by TGP, for general library work. His request is archived here: - - http://mail.python.org/pipermail/python-dev/2000-July/005314.html + http://mail.python.org/pipermail/python-dev/2000-July/005314.html Permissions Dropped on Request diff --git a/emacs.rst b/emacs.rst --- a/emacs.rst +++ b/emacs.rst @@ -6,8 +6,7 @@ If you want to edit Python code in Emacs, you should download python-mode.el and install it somewhere on your load-path. See the project page to download: - - https://launchpad.net/python-mode +https://launchpad.net/python-mode While Emacs comes with a python.el file, it is not recommended. python-mode.el is maintained by core Python developers and is generally @@ -20,8 +19,7 @@ For more information and bug reporting, see the above project page. 
For help, development, or discussions, see the python-mode mailing list: - - http://mail.python.org/mailman/listinfo/python-mode +http://mail.python.org/mailman/listinfo/python-mode .. diff --git a/experts.rst b/experts.rst --- a/experts.rst +++ b/experts.rst @@ -35,12 +35,9 @@ by non-committers to find responsible parties, and by committers who do not feel qualified to make a decision in a particular context. -See also `PEP 291`_ and `PEP 360`_ for information about certain modules +See also :PEP:`291` and :PEP:`360` for information about certain modules with special rules. -.. _`PEP 291`: http://www.python.org/dev/peps/pep-0291/ -.. _`PEP 360`: http://www.python.org/dev/peps/pep-0360/ - Stdlib ------ diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -156,11 +156,8 @@ = =========================== A Scheduled to be added - D Scheduled to be deleted - M Modified locally - ? Not under version control = =========================== diff --git a/triaging.rst b/triaging.rst --- a/triaging.rst +++ b/triaging.rst @@ -18,7 +18,6 @@ Type '''' - Describes the type of issue. If something does not fit within any specific type then simply do not set it. *"crash"* is for hard crashes of the Python interpreter -- possibly with a core dump or a Windows error box -- @@ -30,20 +29,20 @@ What is next to advance the issue forward. The *stage* needn't be set until it is clear that the issue warrants fixing. -* test needed +test needed The bug reporter should post a script or instructions to let a triager or developper reproduce the issue. -* needs patch +needs patch The issue lacks a patch to solve the problem (i.e. fixing the bug, or adding the requested improvement). -* patch review +patch review There is a patch, but it needs reviewing or is in the process of being reviewed. This can be done by any triager as well as a core developer. -* commit review +commit review A triager performed a patch review and it looks good to them, but a core developer needs to commit the patch (and do a quick once-over to make sure nothing was overlooked). -* committed/rejected +committed/rejected The issue is considered closed and dealt with. Components @@ -62,20 +61,20 @@ '''''''' How important is this issue? -* low +low This is for low-impact bugs, or feature requests of little utility. -* normal +normal The default value for most issues, which deserve fixing but without any urgency to do so. -* high +high Make some effort to fix the issue before the next final release. -* critical +critical This issue should definitely be fixed before the next final release. -* deferred blocker +deferred blocker The issue will not hold up the next release, but will be promoted to a release blocker for the following release, e.g., won't block the next release of a1 but will block a2. -* release blocker +release blocker The issue must be fixed before *any* release is made, e.g., will block the next release even if it is an alpha release. @@ -88,21 +87,21 @@ '''''''' Various flags about the issue. Multiple values are possible. -* after moratorium +after moratorium The issue is in regards to a language change which is not allowed during the `language moratorium`_. -* buildbot +buildbot A buildbot triggered the issue being reported. -* easy +easy Fixing the issue should not take longer than a day for someone new to contributing to Python to solve. -* gsoc +gsoc The issue would fit as, or is related to, GSoC_. -* needs review +needs review The patch attached to the issue is in need of a review. 
-* patch +patch There is a patch attached to the issue. -* 3.2regression +3.2regression The issue is a regression in 3.2. Nosy List @@ -131,52 +130,52 @@ Status '''''' -* open +open Issue is not resolved. -* languishing +languishing The issue has no clear solution , e.g., no agreement on a technical solution or if it is even a problem worth fixing. -* pending +pending The issue is blocked until someone (often times the :abbr:`OP (original poster)`) provides some critical info; the issue is automatically closed after a set amount of time if no reply comes in. Useful for when someone reports a bug that lacks enough information to be reproduced and thus should be closed if the lacking info is never provided. and thus the issue is worthless without the needed info being provided. -* closed +closed The issue has been resolved (somehow). Resolution '''''''''' Why the issue is in its current state (not usually used for "open"). -* accepted +accepted Submitted patch was applied, still needs verifying (for example by watching the `buildbots `_) that everything went fine. Then the resolution will turn to *fixed* and the status to *closed*. -* duplicate +duplicate Duplicate of another issue; should have the Superseder field filled out. -* fixed +fixed A fix for the issue was committed. -* invalid +invalid For some reason the issue is invalid (e.g. the perceived problem is not a bug in Python). -* later +later Issue is to be worked on at a later date. -* out of date +out of date The issue has already been fixed, or the problem doesn't exist anymore for other reasons. -* postponed +postponed Issue will not be worked on at the moment. -* rejected +rejected Issue was rejected (especially for feature requests). -* remind +remind The issue is acting as a reminder for someone. -* wont fix +wont fix Issue will not be fixed, typically because it would cause a backwards-compatibility problem. -* works for me +works for me Bug cannot be reproduced. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:05 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:05 +0100 Subject: [Python-checkins] devguide: Use component instead of assigned-to, the former lists more bugs Message-ID: eric.araujo pushed 974ed7a43003 to devguide: http://hg.python.org/devguide/rev/974ed7a43003 changeset: 327:974ed7a43003 user: ?ric Araujo date: Sat Feb 26 17:46:38 2011 +0100 summary: Use component instead of assigned-to, the former lists more bugs files: docquality.rst diff --git a/docquality.rst b/docquality.rst --- a/docquality.rst +++ b/docquality.rst @@ -38,7 +38,7 @@ Helping with issues filed on the issue tracker ---------------------------------------------- -If you look at `issues assigned to docs at python`_ on the `issue tracker`_, you +If you look at `documentation issues`_ on the `issue tracker`_, you will find various documentation problems that need work. Issues vary from typos, to unclear documentation, to something completely lacking documentation. @@ -50,9 +50,7 @@ to forget or lose interest). .. _issue tracker: http://bugs.python.org -.. _issues assigned to docs at python: http://bugs.python.org/issue?%40sort0=activity&%40sortdir0=on&%40sort1=creation&%40sortdir1=on&%40group0=priority&%40group1=&%40columns=title%2Cid%2Cactivity%2Cstatus&%40filter=assignee%2Cstatus&status=1&assignee=12260&%40pagesize=50&%40startwith=0 - - +.. 
_documentation issues: http://bugs.python.org/issue?%40search_text=&ignore=file%3Acontent&title=&%40columns=title&id=&%40columns=id&stage=&creation=&creator=&activity=&%40columns=activity&%40sort=activity&actor=&nosy=&type=&components=4&versions=&dependencies=&assignee=&keywords=&priority=&%40group=priority&status=1&%40columns=status&resolution=&nosy_count=&message_count=&%40pagesize=50&%40startwith=0&%40queryname=&%40old-queryname=&%40action=search Proofreading -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:06 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:06 +0100 Subject: [Python-checkins] merge in devguide (hg_transition): Merge default Message-ID: eric.araujo pushed 1115e3c40a70 to devguide: http://hg.python.org/devguide/rev/1115e3c40a70 changeset: 328:1115e3c40a70 branch: hg_transition parent: 321:3f9d973e51f4 parent: 327:974ed7a43003 user: ?ric Araujo date: Sun Feb 27 03:34:08 2011 +0100 summary: Merge default files: coredev.rst faq.rst setup.rst diff --git a/compiler.rst b/compiler.rst --- a/compiler.rst +++ b/compiler.rst @@ -47,21 +47,21 @@ Querying data from the node structs can be done with the following macros (which are all defined in Include/token.h): -- ``CHILD(node *, int)`` +``CHILD(node *, int)`` Returns the nth child of the node using zero-offset indexing -- ``RCHILD(node *, int)`` +``RCHILD(node *, int)`` Returns the nth child of the node from the right side; use negative numbers! -- ``NCH(node *)`` +``NCH(node *)`` Number of children the node has -- ``STR(node *)`` +``STR(node *)`` String representation of the node; e.g., will return ``:`` for a COLON token -- ``TYPE(node *)`` +``TYPE(node *)`` The type of node as specified in ``Include/graminit.h`` -- ``REQ(node *, TYPE)`` +``REQ(node *, TYPE)`` Assert that the node is the type that is expected -- ``LINENO(node *)`` +``LINENO(node *)`` retrieve the line number of the source code that led to the creation of the parse rule; defined in Python/ast.c @@ -220,13 +220,13 @@ Function and macros for creating and using ``asdl_seq *`` types as found in Python/asdl.c and Include/asdl.h: -- ``asdl_seq_new()`` +``asdl_seq_new()`` Allocate memory for an asdl_seq for the specified length -- ``asdl_seq_GET()`` +``asdl_seq_GET()`` Get item held at a specific position in an asdl_seq -- ``asdl_seq_SET()`` +``asdl_seq_SET()`` Set a specific index in an asdl_seq to the specified value -- ``asdl_seq_LEN(asdl_seq *)`` +``asdl_seq_LEN(asdl_seq *)`` Return the length of an asdl_seq If you are working with statements, you must also worry about keeping @@ -315,23 +315,23 @@ Emission of bytecode is handled by the following macros: -- ``ADDOP()`` +``ADDOP()`` add a specified opcode -- ``ADDOP_I()`` +``ADDOP_I()`` add an opcode that takes an argument -- ``ADDOP_O(struct compiler *c, int op, PyObject *type, PyObject *obj)`` +``ADDOP_O(struct compiler *c, int op, PyObject *type, PyObject *obj)`` add an opcode with the proper argument based on the position of the specified PyObject in PyObject sequence object, but with no handling of mangled names; used for when you need to do named lookups of objects such as globals, consts, or parameters where name mangling is not possible and the scope of the name is known -- ``ADDOP_NAME()`` +``ADDOP_NAME()`` just like ADDOP_O, but name mangling is also handled; used for attribute loading or importing based on name -- ``ADDOP_JABS()`` +``ADDOP_JABS()`` create an absolute jump to a basic block -- ``ADDOP_JREL()`` 
+``ADDOP_JREL()`` create a relative jump to a basic block Several helper functions that will emit bytecode and are named @@ -348,11 +348,11 @@ creation of basic blocks must be done. Below are the macros and functions used for managing basic blocks: -- ``NEW_BLOCK()`` +``NEW_BLOCK()`` create block and set it as current -- ``NEXT_BLOCK()`` +``NEXT_BLOCK()`` basically NEW_BLOCK() plus jump from current block -- ``compiler_new_block()`` +``compiler_new_block()`` create a block but don't use it (used for generating jumps) Once the CFG is created, it must be flattened and then final emission of @@ -417,23 +417,23 @@ + Parser/ - - Python.asdl + Python.asdl ASDL syntax file - - asdl.py + asdl.py "An implementation of the Zephyr Abstract Syntax Definition Language." Uses SPARK_ to parse the ASDL files. - - asdl_c.py + asdl_c.py "Generate C code from an ASDL description." Generates Python/Python-ast.c and Include/Python-ast.h . - - spark.py + spark.py SPARK_ parser generator + Python/ - - Python-ast.c + Python-ast.c Creates C structs corresponding to the ASDL types. Also contains code for marshaling AST nodes (core ASDL types have marshaling code in asdl.c). "File automatically generated by @@ -441,76 +441,76 @@ after every grammar change is committed since the __version__ value is set to the latest grammar change revision number. - - asdl.c + asdl.c Contains code to handle the ASDL sequence type. Also has code to handle marshalling the core ASDL types, such as number and identifier. used by Python-ast.c for marshaling AST nodes. - - ast.c + ast.c Converts Python's parse tree into the abstract syntax tree. - - ceval.c + ceval.c Executes byte code (aka, eval loop). - - compile.c + compile.c Emits bytecode based on the AST. - - symtable.c + symtable.c Generates a symbol table from AST. - - pyarena.c + pyarena.c Implementation of the arena memory manager. - - import.c + import.c Home of the magic number (named ``MAGIC``) for bytecode versioning + Include/ - - Python-ast.h + Python-ast.h Contains the actual definitions of the C structs as generated by Python/Python-ast.c . "Automatically generated by Parser/asdl_c.py". - - asdl.h + asdl.h Header for the corresponding Python/ast.c . - - ast.h + ast.h Declares PyAST_FromNode() external (from Python/ast.c). - - code.h + code.h Header file for Objects/codeobject.c; contains definition of PyCodeObject. - - symtable.h + symtable.h Header for Python/symtable.c . struct symtable and PySTEntryObject are defined here. - - pyarena.h + pyarena.h Header file for the corresponding Python/pyarena.c . - - opcode.h + opcode.h Master list of bytecode; if this file is modified you must modify several other files accordingly (see "`Introducing New Bytecode`_") + Objects/ - - codeobject.c + codeobject.c Contains PyCodeObject-related code (originally in Python/compile.c). + Lib/ - - opcode.py + opcode.py One of the files that must be modified if Include/opcode.h is. - - compiler/ + compiler/ - * pyassem.py + pyassem.py One of the files that must be modified if Include/opcode.h is changed. - * pycodegen.py + pycodegen.py One of the files that must be modified if Include/opcode.h is changed. diff --git a/coredev.rst b/coredev.rst --- a/coredev.rst +++ b/coredev.rst @@ -74,7 +74,7 @@ hg clone ssh://hg at hg.python.org/test/ hgtest -An entry in the ``Misc/developers.txt`` file should also be entered for you. +An entry in the :ref:`developers` should also be entered for you. 
Typically the person who sponsored your application to become a core developer makes sure an entry is created for you. diff --git a/developers.rst b/developers.rst --- a/developers.rst +++ b/developers.rst @@ -259,8 +259,7 @@ - George Yoshida (SF name "quiver") added to the SourceForge Python project 14 Apr 2006, by Tim Peters, as a tracker admin. See contemporaneous python-checkins thread with the unlikely Subject: - - r45329 - python/trunk/Doc/whatsnew/whatsnew25.tex + r45329 - python/trunk/Doc/whatsnew/whatsnew25.tex - Ronald Oussoren was given SVN access on 3 Mar 2006 by NCN, for Mac related work. @@ -297,8 +296,7 @@ - Eric S. Raymond was made a developer on 2 Jul 2000 by TGP, for general library work. His request is archived here: - - http://mail.python.org/pipermail/python-dev/2000-July/005314.html + http://mail.python.org/pipermail/python-dev/2000-July/005314.html Permissions Dropped on Request diff --git a/docquality.rst b/docquality.rst --- a/docquality.rst +++ b/docquality.rst @@ -32,13 +32,13 @@ `_ which discusses the documentation toolchain, projects, standards, etc. -.. _Documenting Python: http://docs.python.org/py3k/documenting/index.html +.. _Documenting Python: http://docs.python.org/dev/documenting/ Helping with issues filed on the issue tracker ---------------------------------------------- -If you look at `issues assigned to docs at python`_ on the `issue tracker`_, you +If you look at `documentation issues`_ on the `issue tracker`_, you will find various documentation problems that need work. Issues vary from typos, to unclear documentation, to something completely lacking documentation. @@ -50,9 +50,7 @@ to forget or lose interest). .. _issue tracker: http://bugs.python.org -.. _issues assigned to docs at python: http://bugs.python.org/issue?%40sort0=activity&%40sortdir0=on&%40sort1=creation&%40sortdir1=on&%40group0=priority&%40group1=&%40columns=title%2Cid%2Cactivity%2Cstatus&%40filter=assignee%2Cstatus&status=1&assignee=12260&%40pagesize=50&%40startwith=0 - - +.. _documentation issues: http://bugs.python.org/issue?%40search_text=&ignore=file%3Acontent&title=&%40columns=title&id=&%40columns=id&stage=&creation=&creator=&activity=&%40columns=activity&%40sort=activity&actor=&nosy=&type=&components=4&versions=&dependencies=&assignee=&keywords=&priority=&%40group=priority&status=1&%40columns=status&resolution=&nosy_count=&message_count=&%40pagesize=50&%40startwith=0&%40queryname=&%40old-queryname=&%40action=search Proofreading diff --git a/emacs.rst b/emacs.rst --- a/emacs.rst +++ b/emacs.rst @@ -6,8 +6,7 @@ If you want to edit Python code in Emacs, you should download python-mode.el and install it somewhere on your load-path. See the project page to download: - - https://launchpad.net/python-mode +https://launchpad.net/python-mode While Emacs comes with a python.el file, it is not recommended. python-mode.el is maintained by core Python developers and is generally @@ -20,8 +19,7 @@ For more information and bug reporting, see the above project page. For help, development, or discussions, see the python-mode mailing list: - - http://mail.python.org/mailman/listinfo/python-mode +http://mail.python.org/mailman/listinfo/python-mode .. diff --git a/experts.rst b/experts.rst --- a/experts.rst +++ b/experts.rst @@ -35,12 +35,9 @@ by non-committers to find responsible parties, and by committers who do not feel qualified to make a decision in a particular context. 
-See also `PEP 291`_ and `PEP 360`_ for information about certain modules +See also :PEP:`291` and :PEP:`360` for information about certain modules with special rules. -.. _`PEP 291`: http://www.python.org/dev/peps/pep-0291/ -.. _`PEP 360`: http://www.python.org/dev/peps/pep-0360/ - Stdlib ------ @@ -77,7 +74,7 @@ codecs lemburg, doerwalter codeop collections rhettinger -collections._abcoll rhettinger, stutzbach +collections.abc rhettinger, stutzbach colorsys compileall concurrent.futures brian.quinlan diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -326,7 +326,7 @@ hg commit [PATH] -``[PATH]`` is optional: if it is omitted, all changes in your working copy +``PATH`` is optional: if it is omitted, all changes in your working copy will be committed to the local repository. When you commit, be sure that all changes are desired by :ref:`reviewing them first `; also, when making commits that you intend to push to public repositories, @@ -379,11 +379,8 @@ = =========================== A Scheduled to be added - R Scheduled to be removed - M Modified locally - ? Not under version control = =========================== @@ -392,7 +389,7 @@ hg diff -How do I revert a file I have modified back to the version in the respository? +How do I revert a file I have modified back to the version in the repository? ------------------------------------------------------------------------------- Running:: @@ -514,7 +511,7 @@ ssh-keygen -t rsa -This will generate a two files; your public key and your private key. Your +This will generate two files; your public key and your private key. Your public key is the file ending in ``.pub``. Windows diff --git a/grammar.rst b/grammar.rst --- a/grammar.rst +++ b/grammar.rst @@ -62,6 +62,6 @@ * After everything's been checked in, you're likely to see a new change to Python/Python-ast.c. This is because this - (generated) file contains the SVN version of the source from + (generated) file contains the svn version of the source from which it was generated. There's no way to avoid this; you just have to submit this file separately. diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -14,8 +14,8 @@ Version Control Setup --------------------- -CPython is developed using `Mercurial (commonly abbreviated 'hg') -`_. +CPython is developed using `Mercurial `_ +(commonly abbreviated hg, after the program name). It is easily available for common Unix systems by way of the standard package manager; under Windows, you might want to use the `TortoiseHg `_ graphical client. diff --git a/triaging.rst b/triaging.rst --- a/triaging.rst +++ b/triaging.rst @@ -18,29 +18,31 @@ Type '''' - Describes the type of issue. If something does not fit within any -specific type then simply do not set it. *"Crash"* is for hard crashes of +specific type then simply do not set it. *"crash"* is for hard crashes of the Python interpreter -- possibly with a core dump or a Windows error box -- and not erroneous exits because of an unhandled exception (the latter fall under -the *"behaviour"* category). +the *"behavior"* category). Stage ''''' What is next to advance the issue forward. The *stage* needn't be set until it is clear that the issue warrants fixing. -* needs patch +test needed + The bug reporter should post a script or instructions to let a triager or + developper reproduce the issue. +needs patch The issue lacks a patch to solve the problem (i.e. fixing the bug, or adding the requested improvement). 
-* patch review +patch review There is a patch, but it needs reviewing or is in the process of being reviewed. This can be done by any triager as well as a core developer. -* commit review +commit review A triager performed a patch review and it looks good to them, but a core developer needs to commit the patch (and do a quick once-over to make sure nothing was overlooked). -* committed/rejected +committed/rejected The issue is considered closed and dealt with. Components @@ -59,20 +61,20 @@ '''''''' How important is this issue? -* low +low This is for low-impact bugs, or feature requests of little utility. -* normal +normal The default value for most issues, which deserve fixing but without any urgency to do so. -* high +high Make some effort to fix the issue before the next final release. -* critical +critical This issue should definitely be fixed before the next final release. -* deferred blocker +deferred blocker The issue will not hold up the next release, but will be promoted to a release blocker for the following release, e.g., won't block the next release of a1 but will block a2. -* release blocker +release blocker The issue must be fixed before *any* release is made, e.g., will block the next release even if it is an alpha release. @@ -85,20 +87,22 @@ '''''''' Various flags about the issue. Multiple values are possible. -* after moratorium +after moratorium The issue is in regards to a language change which is not allowed during the `language moratorium`_. -* buildbot +buildbot A buildbot triggered the issue being reported. -* easy +easy Fixing the issue should not take longer than a day for someone new to contributing to Python to solve. -* gsoc +gsoc The issue would fit as, or is related to, GSoC_. -* needs review +needs review The patch attached to the issue is in need of a review. -* patch +patch There is a patch attached to the issue. +3.2regression + The issue is a regression in 3.2. Nosy List ''''''''' @@ -126,52 +130,52 @@ Status '''''' -* open +open Issue is not resolved. -* languishing +languishing The issue has no clear solution , e.g., no agreement on a technical solution or if it is even a problem worth fixing. -* pending +pending The issue is blocked until someone (often times the :abbr:`OP (original poster)`) provides some critical info; the issue is automatically closed after a set amount of time if no reply comes in. Useful for when someone reports a bug that lacks enough information to be reproduced and thus should be closed if the lacking info is never provided. and thus the issue is worthless without the needed info being provided. -* closed +closed The issue has been resolved (somehow). Resolution '''''''''' Why the issue is in its current state (not usually used for "open"). -* accepted +accepted Submitted patch was applied, still needs verifying (for example by watching the `buildbots `_) that everything went fine. Then the resolution will turn to *fixed* and the status to *closed*. -* duplicate +duplicate Duplicate of another issue; should have the Superseder field filled out. -* fixed +fixed A fix for the issue was committed. -* invalid +invalid For some reason the issue is invalid (e.g. the perceived problem is not a bug in Python). -* later +later Issue is to be worked on at a later date. -* out of date +out of date The issue has already been fixed, or the problem doesn't exist anymore for other reasons. -* postponed +postponed Issue will not be worked on at the moment. -* rejected +rejected Issue was rejected (especially for feature requests). 
-* remind +remind The issue is acting as a reminder for someone. -* wont fix +wont fix Issue will not be fixed, typically because it would cause a backwards-compatibility problem. -* works for me +works for me Bug cannot be reproduced. @@ -183,7 +187,7 @@ * ``#``, ``issue``, ``issue `` links to the tracker issue ````. * ``msg`` links to the tracker message ````. -* ``r``, ``rev``, ``revision `` links to the VCS +* ``r``, ``rev``, ``revision `` links to the Subversion revision ````. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:06 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:06 +0100 Subject: [Python-checkins] devguide (hg_transition): Advertise hg import over patch. Message-ID: eric.araujo pushed fe7bcebd13c9 to devguide: http://hg.python.org/devguide/rev/fe7bcebd13c9 changeset: 329:fe7bcebd13c9 branch: hg_transition user: Éric Araujo date: Sat Feb 26 17:28:32 2011 +0100 summary: Advertise hg import over patch. hg import understands the extended git diff format, which supports renames, changes to the executable bit and changes in binary files. patch doesn't do anything useful with that information, and also requires downloading and setup on Windows. files: committing.rst faq.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -103,7 +103,7 @@ Python 3.2:: hg update 3.2 - patch -p1 < patch.diff + hg import --no-commit patch.diff # Compile; run the test suite hg commit diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -263,18 +263,15 @@ If you want to try out or review a patch generated using Mercurial, do:: - patch -p1 < somework.patch + hg import --no-commit somework.patch This will apply the changes in your working copy without committing them. -If the patch was not created by hg (i.e., a patch created by SVN and thus lacking -any ``a``/``b`` directory prefixes in the patch), use ``-p0`` instead of ``-p1``. +If the patch was not created by Mercurial (for example, a patch created by +Subversion and thus lacking any ``a``/``b`` directory prefixes in the patch), +add ``-p0`` to the above command. -.. note:: The ``patch`` program is not available by default under Windows. - You can find it `here `_, - courtesy of the `GnuWin32 `_ project. - Also, you may find it necessary to add the "``--binary``" option when trying - to apply Unix-generated patches under Windows. - +You can also use the ``patch`` program, but be aware that it does not +understand the `extended diff format`_ used by Mercurial. If you want to work on the patch using mq_ (Mercurial Queues), type instead:: @@ -291,6 +288,7 @@ hg qdelete somework.patch +.. _extended diff format: http://www.selenic.com/mercurial/hg.1.html#diffs ..
_mq: http://mercurial.selenic.com/wiki/MqExtension -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:07 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:07 +0100 Subject: [Python-checkins] devguide (hg_transition): Mention the collapse extension, simpler than MQ Message-ID: eric.araujo pushed 20beef07677f to devguide: http://hg.python.org/devguide/rev/20beef07677f changeset: 330:20beef07677f branch: hg_transition user: Éric Araujo date: Sat Feb 26 17:29:31 2011 +0100 summary: Mention the collapse extension, simpler than MQ files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -44,7 +44,8 @@ the result to the main repository. The reason is that we don't want the history to be full of intermediate commits recording the private history of the person working on a patch. If you are using the rebase_ extension, - consider adding the ``--collapse`` option to ``hg rebase``. + consider adding the ``--collapse`` option to ``hg rebase``. The collapse_ + extension is another choice. Because of these constraints, it can be practical to use other approaches such as mq_ (Mercurial Queues), in order to maintain patches in a single @@ -54,6 +55,7 @@ .. _Mercurial: http://www.hg-scm.org/ .. _mq: http://mercurial.selenic.com/wiki/MqExtension .. _rebase: http://mercurial.selenic.com/wiki/RebaseExtension +.. _collapse: http://mercurial.selenic.com/wiki/CollapseExtension Handling Other's Code -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:07 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:07 +0100 Subject: [Python-checkins] devguide (hg_transition): patchcheck does work Message-ID: eric.araujo pushed f325d743c385 to devguide: http://hg.python.org/devguide/rev/f325d743c385 changeset: 331:f325d743c385 branch: hg_transition user: Éric Araujo date: Sat Feb 26 17:30:51 2011 +0100 summary: patchcheck does work files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -114,15 +114,13 @@ Generation '''''''''' -.. XXX [commented out] make patchcheck doesn't work with non-SVN workflow +To perform a quick sanity check on your patch, you can run:: - To perform a quick sanity check on your patch, you can run:: + make patchcheck - make patchcheck - - This will check and/or fix various common things people forget to do for - patches, such as adding any new files needing for the patch to work (do not - that not all checks apply to non-core developers). +This will check and/or fix various common things people forget to do for +patches, such as adding any new files needing for the patch to work (do not +that not all checks apply to non-core developers). Assume you are using the :ref:`mq approach ` suggested earlier, first check that all your local changes have been recorded (using -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:08 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:08 +0100 Subject: [Python-checkins] devguide (hg_transition): Be helpful: add a pointer to more help, use full command name. Message-ID: eric.araujo pushed ee4ff7460b6f to devguide: http://hg.python.org/devguide/rev/ee4ff7460b6f changeset: 332:ee4ff7460b6f branch: hg_transition user: Éric Araujo date: Sat Feb 26 17:40:30 2011 +0100 summary: Be helpful: add a pointer to more help, use full command name.
Not everyone is used to UNIX compression :) files: faq.rst patch.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -352,7 +352,7 @@ Specify the path to be removed with:: - hg rm PATH + hg remove PATH This will remove the file or the directory from your working copy; you will have to :ref:`commit your changes ` for the removal to be recorded diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -129,7 +129,8 @@ hg qdiff > mywork.patch If you are using another approach, you probably need to find out the right -invocation of ``hg diff`` for your purposes. Just please make sure that you +invocation of ``hg diff`` for your purposes; see ``hg help diff`` and ``hg +help revisions``. Just please make sure that you generate a **single, condensed** patch rather than a series of several changesets. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:08 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:08 +0100 Subject: [Python-checkins] devguide (hg_transition): Add section about minimal Mercurial configuration Message-ID: eric.araujo pushed 0a4b6a217350 to devguide: http://hg.python.org/devguide/rev/0a4b6a217350 changeset: 333:0a4b6a217350 branch: hg_transition user: Éric Araujo date: Sat Feb 26 18:50:44 2011 +0100 summary: Add section about minimal Mercurial configuration files: setup.rst diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -21,6 +21,28 @@ `_ graphical client. +Minimal Configuration +''''''''''''''''''''' + +To use ``hg`` on the command line, you should set up some basic options in +your `configuration file`_ (``~/.hgrc`` on POSIX, +``%USERPROFILE%\Mercurial.ini`` on Windows). :: + + [ui] + username = Your Name + + [diff] + git = on + +The first setting defines the name that will be used when you :ref:`commit +` changes. The second setting enables an `extended diff format`_ +which is more useful than the standard unified diff format. + +Note that TortoiseHg has a graphical settings dialog. + +.. _configuration file: http://www.selenic.com/mercurial/hgrc.5.html#files +.. _extended diff format: http://www.selenic.com/mercurial/hg.1.html#diffs + .. _checkout: Getting the Source Code -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 04:17:09 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 04:17:09 +0100 Subject: [Python-checkins] devguide (hg_transition): Minor edits and typo fixes. Message-ID: eric.araujo pushed 140b84af4344 to devguide: http://hg.python.org/devguide/rev/140b84af4344 changeset: 334:140b84af4344 branch: hg_transition tag: tip user: Éric Araujo date: Sun Feb 27 04:08:47 2011 +0100 summary: Minor edits and typo fixes. - Move a link target after its use - Add a todo about tracker markup - Remove one XXX that was in a warning block, not a comment files: committing.rst faq.rst grammar.rst triaging.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -144,7 +144,8 @@ Porting Between Major Versions '''''''''''''''''''''''''''''' -.. warning:: XXX transplant always commits automatically. This breaks the +.. warning:: + transplant always commits automatically. This breaks the "run the test suite before committing" rule. We could advocate using "hg qimport -r tip -P" afterwards but that would add another level of complexity.
@@ -161,7 +162,7 @@ Differences with ``svnmerge`` ''''''''''''''''''''''''''''' -If you are coming from SVN, you might be surprised by how Mercurial works. +If you are coming from Subversion, you might be surprised by how Mercurial works. Despite its name, ``svnmerge`` is different from ``hg merge``: while ``svnmerge`` allows to cherrypick individual revisions, ``hg merge`` can only merge whole lines of development in the repository's :abbr:`DAG (directed acyclic graph)`. diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -32,8 +32,6 @@ What do I need to use Mercurial? ------------------------------------------------------------------------------- -.. _download Mercurial: http://mercurial.selenic.com/downloads/ - UNIX ''''''''''''''''''' @@ -46,6 +44,7 @@ are typically available either online or through the platform's package management system. +.. _download Mercurial: http://mercurial.selenic.com/downloads/ .. _OpenSSH: http://www.openssh.org/ @@ -72,7 +71,6 @@ If your private key is in OpenSSH format, you must first convert it to PuTTY format by loading it into `PuTTYgen`_. - .. _download TortoiseHg: http://tortoisehg.bitbucket.org/download/index.html diff --git a/grammar.rst b/grammar.rst --- a/grammar.rst +++ b/grammar.rst @@ -62,6 +62,6 @@ * After everything's been checked in, you're likely to see a new change to Python/Python-ast.c. This is because this - (generated) file contains the svn version of the source from + (generated) file contains the hg version of the source from which it was generated. There's no way to avoid this; you just have to submit this file separately. diff --git a/triaging.rst b/triaging.rst --- a/triaging.rst +++ b/triaging.rst @@ -190,6 +190,10 @@ * ``r``, ``rev``, ``revision `` links to the Subversion revision ````. +.. TODO update that last item to the format choosed for Mercurial: [CSET] + - check that [CSET] works + - check that rNUMBER links to the hglookup application + Reporting Issues About the Tracker ---------------------------------- -- Repository URL: http://hg.python.org/devguide From solipsis at pitrou.net Sun Feb 27 05:08:10 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 27 Feb 2011 05:08:10 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88664): sum=0 Message-ID: py3k results for svn r88664 (hg cset 68b6d6df226f) -------------------------------------------------- test_pydoc leaked [323, 0, 0] references, sum=323 test_pyexpat leaked [0, -323, 0] references, sum=-323 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogZVqDGA', '-x'] From python-checkins at python.org Sun Feb 27 06:57:51 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 06:57:51 +0100 (CET) Subject: [Python-checkins] r88667 - peps/trunk/pep-0380.txt Message-ID: <20110227055751.C5FC1EE98A@mail.python.org> Author: brett.cannon Date: Sun Feb 27 06:57:51 2011 New Revision: 88667 Log: Indent a literal block some more so that it is picked up as a literal block and not just as a paragraph in a numbered list. Modified: peps/trunk/pep-0380.txt Modified: peps/trunk/pep-0380.txt ============================================================================== --- peps/trunk/pep-0380.txt (original) +++ peps/trunk/pep-0380.txt Sun Feb 27 06:57:51 2011 @@ -182,14 +182,14 @@ 3. 
The StopIteration exception behaves as though defined thusly:: - class StopIteration(Exception): + class StopIteration(Exception): - def __init__(self, *args): - if len(args) > 0: - self.value = args[0] - else: - self.value = None - Exception.__init__(self, *args) + def __init__(self, *args): + if len(args) > 0: + self.value = args[0] + else: + self.value = None + Exception.__init__(self, *args) Rationale From python-checkins at python.org Sun Feb 27 07:31:29 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 07:31:29 +0100 Subject: [Python-checkins] devguide: Minor markup change. Message-ID: brett.cannon pushed cdf69cbca33a to devguide: http://hg.python.org/devguide/rev/cdf69cbca33a changeset: 335:cdf69cbca33a parent: 327:974ed7a43003 user: Brett Cannon date: Sat Feb 26 22:29:50 2011 -0800 summary: Minor markup change. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -33,9 +33,8 @@ Finally, you can simply run the entire test suite yourself with coverage turned on and see what modules need help. This has the drawback of running the entire -test suite under coverage measuring which -takes some time to complete, but you will -have an accurate, up-to-date notion of what modules need the most work. +test suite under coverage measuring which takes some time to complete, but you +will have an accurate, up-to-date notion of what modules need the most work. Do make sure, though, that for any module you do decide to work on that you run coverage for just that module. This will make sure you know how good the -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 07:31:31 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 07:31:31 +0100 Subject: [Python-checkins] devguide: List common gotchas people might come across when doing coverage work. Message-ID: brett.cannon pushed 00260cf660c8 to devguide: http://hg.python.org/devguide/rev/00260cf660c8 changeset: 336:00260cf660c8 user: Brett Cannon date: Sat Feb 26 22:30:48 2011 -0800 summary: List common gotchas people might come across when doing coverage work. Currently this includes coverage for modules which are already imported (i.e., bootstrap-required modules) and matching the pre-existing testing style for a module and leaning towards whitebox testing when in doubt. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -42,6 +42,24 @@ implicit testing by other code that happens to use the module. +Common Gotchas +"""""""""""""" + +Please realize that coverage reports on modules already imported before coverage +data starts to be recorded will be wrong. Typically you can tell a module falls +into this category by the coverage report saying that global statements that +would obviously be executed upon import have gone unexecuted while local +statements have been covered. In these instances you can ignore the global +statement coverage and simply focus on the local statement coverage. + +When writing new tests to increase coverage, do take note of the style of tests +already provided for a module (e.g., whitebox, blackbox, etc.). As +some modules are primarily maintained by a single core developer they may have +a specific preference as to what kind of test is used (e.g., whitebox) and +prefer that other types of tests not be used (e.g., blackbox). When in doubt, +stick with whitebox testing in order to properly exercise the code. 
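As a minimal illustration of the whitebox/blackbox distinction mentioned above (a hypothetical module and tests, not taken from the devguide), a whitebox test reaches into a private helper while a blackbox test only exercises the public behaviour::

    import unittest

    def _split_words(text):
        # Private helper: a whitebox test may target this directly.
        return text.split()

    def word_count(text):
        # Public API: a blackbox test checks only this observable behaviour.
        return len(_split_words(text))

    class BlackBoxTest(unittest.TestCase):
        def test_word_count(self):
            self.assertEqual(word_count("spam and eggs"), 3)

    class WhiteBoxTest(unittest.TestCase):
        def test_split_words_helper(self):
            self.assertEqual(_split_words("spam and eggs"), ["spam", "and", "eggs"])

    if __name__ == "__main__":
        unittest.main()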
+ + Measuring Coverage """""""""""""""""" -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 07:31:31 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 07:31:31 +0100 Subject: [Python-checkins] devguide: Minor grammar issue. Message-ID: brett.cannon pushed 7ce1bf0f6f5d to devguide: http://hg.python.org/devguide/rev/7ce1bf0f6f5d changeset: 337:7ce1bf0f6f5d user: Brett Cannon date: Sat Feb 26 22:30:57 2011 -0800 summary: Minor grammar issue. files: coverage.rst diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -147,7 +147,7 @@ goal to strive for, is a secondary goal to getting 100% line coverage for the entire stdlib (for now). -If you decide to want to try to improve branch coverage, simply add the +If you decide you want to try to improve branch coverage, simply add the ``--branch`` flag to your coverage run:: ./python -m coverage run --pylib --branch -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 07:31:32 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 07:31:32 +0100 Subject: [Python-checkins] merge in devguide (hg_transition): Merged from default Message-ID: brett.cannon pushed c500a1efc984 to devguide: http://hg.python.org/devguide/rev/c500a1efc984 changeset: 338:c500a1efc984 branch: hg_transition tag: tip parent: 334:140b84af4344 parent: 337:7ce1bf0f6f5d user: Brett Cannon date: Sat Feb 26 22:31:17 2011 -0800 summary: Merged from default files: diff --git a/coverage.rst b/coverage.rst --- a/coverage.rst +++ b/coverage.rst @@ -33,9 +33,8 @@ Finally, you can simply run the entire test suite yourself with coverage turned on and see what modules need help. This has the drawback of running the entire -test suite under coverage measuring which -takes some time to complete, but you will -have an accurate, up-to-date notion of what modules need the most work. +test suite under coverage measuring which takes some time to complete, but you +will have an accurate, up-to-date notion of what modules need the most work. Do make sure, though, that for any module you do decide to work on that you run coverage for just that module. This will make sure you know how good the @@ -43,6 +42,24 @@ implicit testing by other code that happens to use the module. +Common Gotchas +"""""""""""""" + +Please realize that coverage reports on modules already imported before coverage +data starts to be recorded will be wrong. Typically you can tell a module falls +into this category by the coverage report saying that global statements that +would obviously be executed upon import have gone unexecuted while local +statements have been covered. In these instances you can ignore the global +statement coverage and simply focus on the local statement coverage. + +When writing new tests to increase coverage, do take note of the style of tests +already provided for a module (e.g., whitebox, blackbox, etc.). As +some modules are primarily maintained by a single core developer they may have +a specific preference as to what kind of test is used (e.g., whitebox) and +prefer that other types of tests not be used (e.g., blackbox). When in doubt, +stick with whitebox testing in order to properly exercise the code. + + Measuring Coverage """""""""""""""""" @@ -130,7 +147,7 @@ goal to strive for, is a secondary goal to getting 100% line coverage for the entire stdlib (for now). 
-If you decide to want to try to improve branch coverage, simply add the +If you decide you want to try to improve branch coverage, simply add the ``--branch`` flag to your coverage run:: ./python -m coverage run --pylib --branch -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 15:22:40 2011 From: python-checkins at python.org (eric.araujo) Date: Sun, 27 Feb 2011 15:22:40 +0100 Subject: [Python-checkins] devguide (hg_transition): Typo spotted by Sandro Tosi Message-ID: eric.araujo pushed c4f0c92018f1 to devguide: http://hg.python.org/devguide/rev/c4f0c92018f1 changeset: 339:c4f0c92018f1 branch: hg_transition tag: tip user: Éric Araujo date: Sun Feb 27 15:22:24 2011 +0100 summary: Typo spotted by Sandro Tosi files: patch.rst diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -119,7 +119,7 @@ make patchcheck This will check and/or fix various common things people forget to do for -patches, such as adding any new files needing for the patch to work (do not +patches, such as adding any new files needing for the patch to work (note that not all checks apply to non-core developers). Assume you are using the :ref:`mq approach ` suggested earlier, -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 16:06:44 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 27 Feb 2011 16:06:44 +0100 (CET) Subject: [Python-checkins] r88668 - python/branches/py3k/Doc/library/json.rst Message-ID: <20110227150644.D494DEEA2F@mail.python.org> Author: benjamin.peterson Date: Sun Feb 27 16:06:44 2011 New Revision: 88668 Log: make this a link #11345 Modified: python/branches/py3k/Doc/library/json.rst Modified: python/branches/py3k/Doc/library/json.rst ============================================================================== --- python/branches/py3k/Doc/library/json.rst (original) +++ python/branches/py3k/Doc/library/json.rst Sun Feb 27 16:06:44 2011 @@ -6,7 +6,7 @@ .. moduleauthor:: Bob Ippolito .. sectionauthor:: Bob Ippolito -JSON (JavaScript Object Notation) is a subset of JavaScript +`JSON (JavaScript Object Notation) `_ is a subset of JavaScript syntax (ECMA-262 3rd edition) used as a lightweight data interchange format. :mod:`json` exposes an API familiar to users of the standard library From python-checkins at python.org Sun Feb 27 16:09:14 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 27 Feb 2011 16:09:14 +0100 (CET) Subject: [Python-checkins] r88669 - in python/branches/release27-maint: Doc/library/json.rst Message-ID: <20110227150914.7CC46EEA77@mail.python.org> Author: benjamin.peterson Date: Sun Feb 27 16:09:14 2011 New Revision: 88669 Log: Merged revisions 88668 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88668 | benjamin.peterson | 2011-02-27 09:06:44 -0600 (Sun, 27 Feb 2011) | 1 line make this a link #11345 ........ Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Doc/library/json.rst Modified: python/branches/release27-maint/Doc/library/json.rst ============================================================================== --- python/branches/release27-maint/Doc/library/json.rst (original) +++ python/branches/release27-maint/Doc/library/json.rst Sun Feb 27 16:09:14 2011 @@ -7,7 +7,7 @@ .. sectionauthor:: Bob Ippolito ..
versionadded:: 2.6 -JSON (JavaScript Object Notation) is a subset of JavaScript +`JSON (JavaScript Object Notation) `_ is a subset of JavaScript syntax (ECMA-262 3rd edition) used as a lightweight data interchange format. :mod:`json` exposes an API familiar to users of the standard library From python-checkins at python.org Sun Feb 27 16:15:06 2011 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 27 Feb 2011 16:15:06 +0100 (CET) Subject: [Python-checkins] r88670 - in python/branches/release32-maint: Doc/library/json.rst Message-ID: <20110227151506.A8BDAEEA2F@mail.python.org> Author: benjamin.peterson Date: Sun Feb 27 16:15:06 2011 New Revision: 88670 Log: Merged revisions 88668 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88668 | benjamin.peterson | 2011-02-27 09:06:44 -0600 (Sun, 27 Feb 2011) | 1 line make this a link #11345 ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Doc/library/json.rst Modified: python/branches/release32-maint/Doc/library/json.rst ============================================================================== --- python/branches/release32-maint/Doc/library/json.rst (original) +++ python/branches/release32-maint/Doc/library/json.rst Sun Feb 27 16:15:06 2011 @@ -6,7 +6,7 @@ .. moduleauthor:: Bob Ippolito .. sectionauthor:: Bob Ippolito -JSON (JavaScript Object Notation) is a subset of JavaScript +`JSON (JavaScript Object Notation) `_ is a subset of JavaScript syntax (ECMA-262 3rd edition) used as a lightweight data interchange format. :mod:`json` exposes an API familiar to users of the standard library From python-checkins at python.org Sun Feb 27 16:37:02 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 16:37:02 +0100 Subject: [Python-checkins] pymigr: Optionally check all changesets in the eol hook Message-ID: antoine.pitrou pushed 88ab2b24ca7a to pymigr: http://hg.python.org/pymigr/rev/88ab2b24ca7a changeset: 113:88ab2b24ca7a tag: tip user: Antoine Pitrou date: Sun Feb 27 16:36:50 2011 +0100 summary: Optionally check all changesets in the eol hook files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -28,6 +28,9 @@ Optional/Low priority ===================== +* adapt eol hook to check each changeset (rather than the changegroup's result). + It's stricter but prevents from polluting the history with whitespace fix + commits. * find a nice way of modifying the templates (possibly involving jinja) * check if we should speed up svn revision matching/lookups (since the lookup WSGI app resides permanently in a mod_wsgi process, building a @@ -54,7 +57,7 @@ * According to Martin, obsolete categories (such as ``hg-XXX`` when it gets renamed to ``XXX``) should be redirected to the ``empty`` SVN branch and - a build forced there (by pushing a dummy changeset to the empty branch?) + a build forced there (by pushing a dummy changeset to the empty branch) so as to release disk space on the build slaves for these slaves. * Later, add a new buildslave category ("custom"?) 
receiving no changesets, -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sun Feb 27 16:44:13 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 16:44:13 +0100 (CET) Subject: [Python-checkins] r88671 - python/branches/py3k/Lib/test/test_ssl.py Message-ID: <20110227154413.10F88EE98A@mail.python.org> Author: antoine.pitrou Date: Sun Feb 27 16:44:12 2011 New Revision: 88671 Log: Follow up to r88664: non-blocking connect-ex() can return EWOULDBLOCK under Windows Modified: python/branches/py3k/Lib/test/test_ssl.py Modified: python/branches/py3k/Lib/test/test_ssl.py ============================================================================== --- python/branches/py3k/Lib/test/test_ssl.py (original) +++ python/branches/py3k/Lib/test/test_ssl.py Sun Feb 27 16:44:12 2011 @@ -474,7 +474,8 @@ try: s.setblocking(False) rc = s.connect_ex(('svn.python.org', 443)) - self.assertIn(rc, (0, errno.EINPROGRESS)) + # EWOULDBLOCK under Windows, EINPROGRESS elsewhere + self.assertIn(rc, (0, errno.EINPROGRESS, errno.EWOULDBLOCK)) # Wait for connect to finish select.select([], [s], [], 5.0) # Non-blocking handshake From python-checkins at python.org Sun Feb 27 16:45:16 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 16:45:16 +0100 (CET) Subject: [Python-checkins] r88672 - in python/branches/release32-maint: Lib/test/test_ssl.py Message-ID: <20110227154516.895D0EE98A@mail.python.org> Author: antoine.pitrou Date: Sun Feb 27 16:45:16 2011 New Revision: 88672 Log: Merged revisions 88671 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88671 | antoine.pitrou | 2011-02-27 16:44:12 +0100 (dim., 27 févr. 2011) | 3 lines Follow up to r88664: non-blocking connect-ex() can return EWOULDBLOCK under Windows ........ Modified: python/branches/release32-maint/ (props changed) python/branches/release32-maint/Lib/test/test_ssl.py Modified: python/branches/release32-maint/Lib/test/test_ssl.py ============================================================================== --- python/branches/release32-maint/Lib/test/test_ssl.py (original) +++ python/branches/release32-maint/Lib/test/test_ssl.py Sun Feb 27 16:45:16 2011 @@ -474,7 +474,8 @@ try: s.setblocking(False) rc = s.connect_ex(('svn.python.org', 443)) - self.assertIn(rc, (0, errno.EINPROGRESS)) + # EWOULDBLOCK under Windows, EINPROGRESS elsewhere + self.assertIn(rc, (0, errno.EINPROGRESS, errno.EWOULDBLOCK)) # Wait for connect to finish select.select([], [s], [], 5.0) # Non-blocking handshake From python-checkins at python.org Sun Feb 27 16:45:23 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 16:45:23 +0100 (CET) Subject: [Python-checkins] r88673 - in python/branches/release27-maint: Lib/test/test_ssl.py Message-ID: <20110227154523.0FD10EEA87@mail.python.org> Author: antoine.pitrou Date: Sun Feb 27 16:45:22 2011 New Revision: 88673 Log: Merged revisions 88671 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ........ r88671 | antoine.pitrou | 2011-02-27 16:44:12 +0100 (dim., 27 févr. 2011) | 3 lines Follow up to r88664: non-blocking connect-ex() can return EWOULDBLOCK under Windows ........
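The pattern the changed test exercises looks roughly like this as a standalone sketch (not the test itself; the host and timeout are taken from the test, and the connection may fail on machines without network access)::

    import errno
    import select
    import socket

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setblocking(False)
    # While the connection is still being set up, connect_ex() reports
    # EINPROGRESS on POSIX but EWOULDBLOCK under Windows, hence the
    # widened assertion in the test.
    rc = s.connect_ex(('svn.python.org', 443))
    assert rc in (0, errno.EINPROGRESS, errno.EWOULDBLOCK)
    # Wait up to 5 seconds for the connection attempt to finish.
    select.select([], [s], [], 5.0)
    s.close()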
Modified: python/branches/release27-maint/ (props changed) python/branches/release27-maint/Lib/test/test_ssl.py Modified: python/branches/release27-maint/Lib/test/test_ssl.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_ssl.py (original) +++ python/branches/release27-maint/Lib/test/test_ssl.py Sun Feb 27 16:45:22 2011 @@ -248,7 +248,8 @@ try: s.setblocking(False) rc = s.connect_ex(('svn.python.org', 443)) - self.assertIn(rc, (0, errno.EINPROGRESS)) + # EWOULDBLOCK under Windows, EINPROGRESS elsewhere + self.assertIn(rc, (0, errno.EINPROGRESS, errno.EWOULDBLOCK)) # Wait for connect to finish select.select([], [s], [], 5.0) # Non-blocking handshake From python-checkins at python.org Sun Feb 27 17:21:08 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 17:21:08 +0100 Subject: [Python-checkins] cpython (3.2): test commit on 3.2 Message-ID: antoine.pitrou pushed d6404ebb622d to cpython: http://hg.python.org/cpython/rev/d6404ebb622d changeset: 68064:d6404ebb622d branch: 3.2 parent: 68039:c17d7772c638 user: Antoine Pitrou date: Sun Feb 27 17:20:07 2011 +0100 summary: test commit on 3.2 files: LICENSE diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -1,7 +1,7 @@ A. HISTORY OF THE SOFTWARE ========================== -Python was created in the early 1990s by Guido van Rossum at Stichting +Python was created in the early 1990s by Larry Wall at Stichting Mathematisch Centrum (CWI, see http://www.cwi.nl) in the Netherlands as a successor of a language called ABC. Guido remains Python's principal author, although it includes many contributions from others. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Feb 27 17:21:08 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 17:21:08 +0100 Subject: [Python-checkins] merge in cpython: Merge 3.2 Message-ID: antoine.pitrou pushed a072e8b6d701 to cpython: http://hg.python.org/cpython/rev/a072e8b6d701 changeset: 68065:a072e8b6d701 tag: tip parent: 68063:8cdddac5ad28 parent: 68064:d6404ebb622d user: Antoine Pitrou date: Sun Feb 27 17:21:03 2011 +0100 summary: Merge 3.2 files: LICENSE diff --git a/LICENSE b/LICENSE --- a/LICENSE +++ b/LICENSE @@ -1,4 +1,4 @@ -Copyright (C) 2011 by Python authors and the Python Software Foundation +Copyright (C) 2011 by Python authors and Larry Wall In place of a legal mess, here is a poem: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Feb 27 17:21:30 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 17:21:30 +0100 Subject: [Python-checkins] hooks: Whitelist branches in the branch check Message-ID: antoine.pitrou pushed 60f1cc0228bf to hooks: http://hg.python.org/hooks/rev/60f1cc0228bf changeset: 31:60f1cc0228bf tag: tip user: Antoine Pitrou date: Sun Feb 27 17:21:27 2011 +0100 summary: Whitelist branches in the branch check files: checkbranch.py diff --git a/checkbranch.py b/checkbranch.py --- a/checkbranch.py +++ b/checkbranch.py @@ -22,8 +22,7 @@ n = repo.changelog.node(rev) ctx = repo[n] branch = ctx.branch() - if branch in ('trunk', 'legacy-trunk', - '2.0', '2.1', '2.2', '2.3', '2.4', '3.0'): + if branch not in ('default', '3.2', '3.1', '2.7', '2.6', '2.5'): ui.warn(' - changeset %s on disallowed branch %r!\n' % (ctx, branch)) failed = True -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Sun Feb 27 19:19:14 2011 From: 
python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 19:19:14 +0100 Subject: [Python-checkins] pymigr: Adapt the README for the new allcsethook in the improved eol ext Message-ID: antoine.pitrou pushed 5849cdf06d1b to pymigr: http://hg.python.org/pymigr/rev/5849cdf06d1b changeset: 115:5849cdf06d1b tag: tip user: Antoine Pitrou date: Sun Feb 27 19:19:11 2011 +0100 summary: Adapt the README for the new allcsethook in the improved eol ext files: README.txt diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -79,8 +79,10 @@ [hooks] - # Uses our modified version of eol.py (see http://mercurial.selenic.com/bts/issue2660) - pretxnchangegroup.eol = python:/home/hg/mercurial-1.7.5/hgext/eol.py:hook + # Uses our modified version of eol.py (see + # http://mercurial.selenic.com/bts/issue2660 and + # http://mercurial.selenic.com/bts/issue2665) + pretxnchangegroup.eol = python:/home/hg/mercurial-1.7.5/hgext/eol.py:allcsethook #pretxnchangegroup.eol = python:hgext.eol.hook pretxnchangegroup.checkbranch = python:/home/hg/repos/hooks/checkbranch.py:hook -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sun Feb 27 19:19:14 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 19:19:14 +0100 Subject: [Python-checkins] pymigr: The stricter eol hook is now in power Message-ID: antoine.pitrou pushed d3385138d6c6 to pymigr: http://hg.python.org/pymigr/rev/d3385138d6c6 changeset: 114:d3385138d6c6 user: Antoine Pitrou date: Sun Feb 27 19:17:49 2011 +0100 summary: The stricter eol hook is now in power files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -24,13 +24,14 @@ -> should be done, see cpython test repo * roundup integration (issue links, changeset links) -> should be done +* adapt eol hook to check each changeset (rather than the changegroup's result). + It's stricter but prevents from polluting the history with whitespace fix + commits. + -> done Optional/Low priority ===================== -* adapt eol hook to check each changeset (rather than the changegroup's result). - It's stricter but prevents from polluting the history with whitespace fix - commits. * find a nice way of modifying the templates (possibly involving jinja) * check if we should speed up svn revision matching/lookups (since the lookup WSGI app resides permanently in a mod_wsgi process, building a -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Sun Feb 27 19:42:34 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 19:42:34 +0100 Subject: [Python-checkins] devguide (hg_transition): Move the minimal configuration to the committer docs (since it's Message-ID: antoine.pitrou pushed e34f8ae6967e to devguide: http://hg.python.org/devguide/rev/e34f8ae6967e changeset: 340:e34f8ae6967e branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 27 19:40:29 2011 +0100 summary: Move the minimal configuration to the committer docs (since it's really necessary there), and expand it to include the eol extension under Windows files: committing.rst setup.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -58,6 +58,44 @@ .. _collapse: http://mercurial.selenic.com/wiki/CollapseExtension +Minimal Configuration +--------------------- + +To use Mercurial as a committer (both of your and others' patches), you should +set up some basic options in your `configuration file`_. 
Under Windows, +TortoiseHg has a graphical settings dialog for most options, meaning you +don't need to edit the file directly (it is still available in +``%USERPROFILE%\Mercurial.ini``). Under other platforms, you must edit +``~/.hgrc``. + +Here are the minimal options you need to activate: + +* your *username*: this setting defines the name that will be used when you + :ref:`commit ` changes. The usual convention is to also include + an e-mail contact address in there:: + + [ui] + username = Your Name + +* *extended diffing*: this setting enables an `extended diff format`_ + which is more useful than the standard unified diff format as it includes + metadata about file copies and permission bits:: + + [diff] + git = on + +Under Windows, you should also enable the `eol extension`_, which will +fix any Windows-specific line endings your text editor might insert when you +create or modify versioned files. The public repository has a hook which +will reject all changesets having the wrong line endings, so enabling this +extension on your local computer is in your best interest. + + +.. _configuration file: http://www.selenic.com/mercurial/hgrc.5.html#files +.. _extended diff format: http://www.selenic.com/mercurial/hg.1.html#diffs +.. _eol extension: http://mercurial.selenic.com/wiki/EolExtension + + Handling Other's Code --------------------- diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -21,28 +21,6 @@ `_ graphical client. -Minimal Configuration -''''''''''''''''''''' - -To use ``hg`` on the command line, you should set up some basic options in -your `configuration file`_ (``~/.hgrc`` on POSIX, -``%USERPROFILE%\Mercurial.ini`` on Windows). :: - - [ui] - username = Your Name - - [diff] - git = on - -The first setting defines the name that will be used when you :ref:`commit -` changes. The second setting enables an `extended diff format`_ -which is more useful than the standard unified diff format. - -Note that TortoiseHg has a graphical settings dialog. - -.. _configuration file: http://www.selenic.com/mercurial/hgrc.5.html#files -.. _extended diff format: http://www.selenic.com/mercurial/hg.1.html#diffs - .. _checkout: Getting the Source Code -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 20:12:11 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 20:12:11 +0100 Subject: [Python-checkins] devguide (hg_transition): Remove XXX. mq isn't really useful in the context of making the final Message-ID: antoine.pitrou pushed 6a9f6a140c5c to devguide: http://hg.python.org/devguide/rev/6a9f6a140c5c changeset: 341:6a9f6a140c5c branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 27 20:12:06 2011 +0100 summary: Remove XXX. mq isn't really useful in the context of making the final commit of a reviewed patch. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -133,10 +133,6 @@ Porting Within a Major Version '''''''''''''''''''''''''''''' -.. note:: - XXX Update to using hg qimport if that ends up being the way non-core - developers are told to go. - Assume that Python 3.3 is the current in-development version of Python and that you have a patch that should also be applied to Python 3.2. 
To properly port the patch to both versions of Python, you should first apply the patch to -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 20:25:07 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 20:25:07 +0100 Subject: [Python-checkins] devguide (hg_transition): Drop the "-s" option to hg transplant and improve wording Message-ID: antoine.pitrou pushed 15e302905a5b to devguide: http://hg.python.org/devguide/rev/15e302905a5b changeset: 342:15e302905a5b branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 27 20:25:04 2011 +0100 summary: Drop the "-s" option to hg transplant and improve wording files: committing.rst faq.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -184,11 +184,12 @@ "hg qimport -r tip -P" afterwards but that would add another level of complexity. -To move a patch between, e.g., Python 3.2 and 2.7, use the `transplant -extension`_. Assuming you committed in Python 2.7 first, to pull changeset -``a7df1a869e4a`` into Python 3.2, do:: +To port a patch from, e.g., Python 3.2 to 2.7, you can use the `transplant +extension`_. Assuming you first committed your changes as changeset +``a7df1a869e4a`` in the 3.2 branch and have now :ref:`updated +` your working copy to the 2.7 branch, do:: - hg transplant -s a7df1a869e4a + hg transplant a7df1a869e4a # Compile; run the test suite hg push diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -142,6 +142,8 @@ update: (current) +.. _hg-switch-branches: + How do I switch between branches inside my working copy? -------------------------------------------------------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 20:34:03 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 20:34:03 +0100 Subject: [Python-checkins] devguide (hg_transition): Separate "hg pull" and "hg update" entries Message-ID: antoine.pitrou pushed 6fcbafdf681f to devguide: http://hg.python.org/devguide/rev/6fcbafdf681f changeset: 343:6fcbafdf681f branch: hg_transition tag: tip user: Antoine Pitrou date: Sun Feb 27 20:34:00 2011 +0100 summary: Separate "hg pull" and "hg update" entries files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -124,6 +124,8 @@ "default" branch (or any other branch, if such exists). +.. _hg-current-branch: + Which branch is currently checked out in my working copy? --------------------------------------------------------- @@ -243,17 +245,32 @@ Run:: - hg pull - hg update + hg pull -from the directory you wish to update. The first command retrieves any -changes from the specified remote repository and merges them into the local -repository. The second commands updates the current directory and all its -subdirectories from the local repository. +from the repository you wish to pull the latest changes into. Most of the +time, that repository is a clone of the repository you want to pull from, +so you can simply type:: -You can combine the two commands in one by using:: + hg pull - hg pull -u +This doesn't update your working copy, though. See below: + + +How do I update my working copy with the latest changes? +-------------------------------------------------------- + +Do:: + + hg update + +This will update your working copy with the latest changes on the +:ref:`current branch `. If you had :ref:`uncommitted +changes ` in your working copy, they will be merged in. 
+ +If you find yourself typing often ``hg pull`` followed by ``hg update``, +be aware that you can combine them in a single command:: + + hg pull -u .. _hg-local-workflow: -- Repository URL: http://hg.python.org/devguide From tjreedy at udel.edu Sun Feb 27 20:02:30 2011 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 27 Feb 2011 14:02:30 -0500 Subject: [Python-checkins] devguide (hg_transition): Typo spotted by Sandro Tosi In-Reply-To: References: Message-ID: <4D6A9FC6.7050809@udel.edu> > This will check and/or fix various common things people forget to do for > -patches, such as adding any new files needing for the patch to work (do not > +patches, such as adding any new files needing for the patch to work (note /needing/needed/ From python-checkins at python.org Sun Feb 27 21:21:44 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 21:21:44 +0100 Subject: [Python-checkins] devguide: Fix the copyright notice. Message-ID: brett.cannon pushed e8ba3be88996 to devguide: http://hg.python.org/devguide/rev/e8ba3be88996 changeset: 344:e8ba3be88996 parent: 337:7ce1bf0f6f5d user: Brett Cannon date: Sun Feb 27 12:20:51 2011 -0800 summary: Fix the copyright notice. files: conf.py diff --git a/conf.py b/conf.py --- a/conf.py +++ b/conf.py @@ -43,7 +43,7 @@ # General information about the project. project = u'Python Developer\'s Guide' -copyright = u'2011, Brett Cannon' +copyright = u'2011, Python Software Foundation' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 21:21:47 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 21:21:47 +0100 Subject: [Python-checkins] merge in devguide (hg_transition): Merged default Message-ID: brett.cannon pushed 4a1997d6cf53 to devguide: http://hg.python.org/devguide/rev/4a1997d6cf53 changeset: 345:4a1997d6cf53 branch: hg_transition parent: 338:c500a1efc984 parent: 344:e8ba3be88996 user: Brett Cannon date: Sun Feb 27 12:21:13 2011 -0800 summary: Merged default files: diff --git a/conf.py b/conf.py --- a/conf.py +++ b/conf.py @@ -43,7 +43,7 @@ # General information about the project. project = u'Python Developer\'s Guide' -copyright = u'2011, Brett Cannon' +copyright = u'2011, Python Software Foundation' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 21:21:47 2011 From: python-checkins at python.org (brett.cannon) Date: Sun, 27 Feb 2011 21:21:47 +0100 Subject: [Python-checkins] merge in devguide (hg_transition): Merge Message-ID: brett.cannon pushed a478c5a5a7b5 to devguide: http://hg.python.org/devguide/rev/a478c5a5a7b5 changeset: 346:a478c5a5a7b5 branch: hg_transition tag: tip parent: 345:4a1997d6cf53 parent: 343:6fcbafdf681f user: Brett Cannon date: Sun Feb 27 12:21:39 2011 -0800 summary: Merge files: diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -58,6 +58,44 @@ .. _collapse: http://mercurial.selenic.com/wiki/CollapseExtension +Minimal Configuration +--------------------- + +To use Mercurial as a committer (both of your and others' patches), you should +set up some basic options in your `configuration file`_. 
Under Windows, +TortoiseHg has a graphical settings dialog for most options, meaning you +don't need to edit the file directly (it is still available in +``%USERPROFILE%\Mercurial.ini``). Under other platforms, you must edit +``~/.hgrc``. + +Here are the minimal options you need to activate: + +* your *username*: this setting defines the name that will be used when you + :ref:`commit ` changes. The usual convention is to also include + an e-mail contact address in there:: + + [ui] + username = Your Name + +* *extended diffing*: this setting enables an `extended diff format`_ + which is more useful than the standard unified diff format as it includes + metadata about file copies and permission bits:: + + [diff] + git = on + +Under Windows, you should also enable the `eol extension`_, which will +fix any Windows-specific line endings your text editor might insert when you +create or modify versioned files. The public repository has a hook which +will reject all changesets having the wrong line endings, so enabling this +extension on your local computer is in your best interest. + + +.. _configuration file: http://www.selenic.com/mercurial/hgrc.5.html#files +.. _extended diff format: http://www.selenic.com/mercurial/hg.1.html#diffs +.. _eol extension: http://mercurial.selenic.com/wiki/EolExtension + + Handling Other's Code --------------------- @@ -95,10 +133,6 @@ Porting Within a Major Version '''''''''''''''''''''''''''''' -.. note:: - XXX Update to using hg qimport if that ends up being the way non-core - developers are told to go. - Assume that Python 3.3 is the current in-development version of Python and that you have a patch that should also be applied to Python 3.2. To properly port the patch to both versions of Python, you should first apply the patch to @@ -150,11 +184,12 @@ "hg qimport -r tip -P" afterwards but that would add another level of complexity. -To move a patch between, e.g., Python 3.2 and 2.7, use the `transplant -extension`_. Assuming you committed in Python 2.7 first, to pull changeset -``a7df1a869e4a`` into Python 3.2, do:: +To port a patch from, e.g., Python 3.2 to 2.7, you can use the `transplant +extension`_. Assuming you first committed your changes as changeset +``a7df1a869e4a`` in the 3.2 branch and have now :ref:`updated +` your working copy to the 2.7 branch, do:: - hg transplant -s a7df1a869e4a + hg transplant a7df1a869e4a # Compile; run the test suite hg push diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -124,6 +124,8 @@ "default" branch (or any other branch, if such exists). +.. _hg-current-branch: + Which branch is currently checked out in my working copy? --------------------------------------------------------- @@ -142,6 +144,8 @@ update: (current) +.. _hg-switch-branches: + How do I switch between branches inside my working copy? -------------------------------------------------------- @@ -241,17 +245,32 @@ Run:: - hg pull - hg update + hg pull -from the directory you wish to update. The first command retrieves any -changes from the specified remote repository and merges them into the local -repository. The second commands updates the current directory and all its -subdirectories from the local repository. +from the repository you wish to pull the latest changes into. Most of the +time, that repository is a clone of the repository you want to pull from, +so you can simply type:: -You can combine the two commands in one by using:: + hg pull - hg pull -u +This doesn't update your working copy, though. 
See below: + + +How do I update my working copy with the latest changes? +-------------------------------------------------------- + +Do:: + + hg update + +This will update your working copy with the latest changes on the +:ref:`current branch `. If you had :ref:`uncommitted +changes ` in your working copy, they will be merged in. + +If you find yourself typing often ``hg pull`` followed by ``hg update``, +be aware that you can combine them in a single command:: + + hg pull -u .. _hg-local-workflow: diff --git a/patch.rst b/patch.rst --- a/patch.rst +++ b/patch.rst @@ -119,7 +119,7 @@ make patchcheck This will check and/or fix various common things people forget to do for -patches, such as adding any new files needing for the patch to work (do not +patches, such as adding any new files needing for the patch to work (note that not all checks apply to non-core developers). Assume you are using the :ref:`mq approach ` suggested earlier, diff --git a/setup.rst b/setup.rst --- a/setup.rst +++ b/setup.rst @@ -21,28 +21,6 @@ `_ graphical client. -Minimal Configuration -''''''''''''''''''''' - -To use ``hg`` on the command line, you should set up some basic options in -your `configuration file`_ (``~/.hgrc`` on POSIX, -``%USERPROFILE%\Mercurial.ini`` on Windows). :: - - [ui] - username = Your Name - - [diff] - git = on - -The first setting defines the name that will be used when you :ref:`commit -` changes. The second setting enables an `extended diff format`_ -which is more useful than the standard unified diff format. - -Note that TortoiseHg has a graphical settings dialog. - -.. _configuration file: http://www.selenic.com/mercurial/hgrc.5.html#files -.. _extended diff format: http://www.selenic.com/mercurial/hg.1.html#diffs - .. _checkout: Getting the Source Code -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Sun Feb 27 23:23:28 2011 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 27 Feb 2011 23:23:28 +0100 Subject: [Python-checkins] pymigr: Mention the checkheads hook Message-ID: antoine.pitrou pushed c2ef8c7fbe6d to pymigr: http://hg.python.org/pymigr/rev/c2ef8c7fbe6d changeset: 116:c2ef8c7fbe6d tag: tip user: Antoine Pitrou date: Sun Feb 27 23:23:23 2011 +0100 summary: Mention the checkheads hook files: README.txt diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -85,6 +85,7 @@ pretxnchangegroup.eol = python:/home/hg/mercurial-1.7.5/hgext/eol.py:allcsethook #pretxnchangegroup.eol = python:hgext.eol.hook + pretxnchangegroup.checkheads = python:/home/hg/repos/hooks/checkheads.py:hook pretxnchangegroup.checkbranch = python:/home/hg/repos/hooks/checkbranch.py:hook incoming.notify = python:/home/hg/repos/hooks/mail.py:incoming -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Mon Feb 28 02:42:29 2011 From: python-checkins at python.org (ezio.melotti) Date: Mon, 28 Feb 2011 02:42:29 +0100 (CET) Subject: [Python-checkins] r88674 - python/branches/release27-maint/Lib/test/test_unicode.py Message-ID: <20110228014229.D40FCEEA47@mail.python.org> Author: ezio.melotti Date: Mon Feb 28 02:42:29 2011 New Revision: 88674 Log: Python 2 can encode/decode surrogates to utf-8. Add a test for this. 
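A standalone Python 2 sketch of the round trip the new test asserts, using the first byte sequence the test generates (illustration only, not part of the committed test)::

    # In Python 2 a lone surrogate survives a UTF-8 round trip;
    # Python 3's utf-8 codec rejects such code points by default.
    encoded = '\xed\xa0\x80'             # byte sequence paired with U+D800 in the test
    surrogate = encoded.decode('utf-8')
    assert surrogate == u'\ud800'
    assert surrogate.encode('utf-8') == encoded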
Modified: python/branches/release27-maint/Lib/test/test_unicode.py Modified: python/branches/release27-maint/Lib/test/test_unicode.py ============================================================================== --- python/branches/release27-maint/Lib/test/test_unicode.py (original) +++ python/branches/release27-maint/Lib/test/test_unicode.py Mon Feb 28 02:42:29 2011 @@ -667,11 +667,17 @@ # see http://www.unicode.org/versions/Unicode5.2.0/ch03.pdf # (table 3-7) and http://www.rfc-editor.org/rfc/rfc3629.txt #for cb in map(chr, range(0xA0, 0xC0)): - #sys.__stdout__.write('\\xED\\x%02x\\x80\n' % ord(cb)) #self.assertRaises(UnicodeDecodeError, #('\xED'+cb+'\x80').decode, 'utf-8') #self.assertRaises(UnicodeDecodeError, #('\xED'+cb+'\xBF').decode, 'utf-8') + # but since they are valid on Python 2 add a test for that: + for cb, surrogate in zip(map(chr, range(0xA0, 0xC0)), + map(unichr, range(0xd800, 0xe000, 64))): + encoded = '\xED'+cb+'\x80' + self.assertEqual(encoded.decode('utf-8'), surrogate) + self.assertEqual(surrogate.encode('utf-8'), encoded) + for cb in map(chr, range(0x80, 0x90)): self.assertRaises(UnicodeDecodeError, ('\xF0'+cb+'\x80\x80').decode, 'utf-8') From solipsis at pitrou.net Mon Feb 28 05:08:25 2011 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 28 Feb 2011 05:08:25 +0100 Subject: [Python-checkins] Daily py3k reference leaks (r88671): sum=0 Message-ID: py3k results for svn r88671 (hg cset 271057c7c6f3) -------------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/py3k/refleaks/reflogMkddHS', '-x'] From python-checkins at python.org Mon Feb 28 08:42:20 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 28 Feb 2011 08:42:20 +0100 Subject: [Python-checkins] hooks: Clarify. Message-ID: georg.brandl pushed 41d7d7c2818c to hooks: http://hg.python.org/hooks/rev/41d7d7c2818c changeset: 33:41d7d7c2818c tag: tip user: Georg Brandl date: Mon Feb 28 08:41:18 2011 +0100 summary: Clarify. files: checkbranch.py checkheads.py diff --git a/checkbranch.py b/checkbranch.py --- a/checkbranch.py +++ b/checkbranch.py @@ -3,7 +3,7 @@ forbidden branch. To use the changeset hook in a local repository, include something like the -following in your hgrc file. +following in its hgrc file. [hooks] pretxnchangegroup.checkbranch = python:/home/hg/repos/hooks/checkbranch.py:hook diff --git a/checkheads.py b/checkheads.py --- a/checkheads.py +++ b/checkheads.py @@ -3,7 +3,7 @@ doesn't create new heads on any branch. To use the changeset hook in a local repository, include something like the -following in your hgrc file. +following in its hgrc file. [hooks] pretxnchangegroup.checkheads = python:/home/hg/repos/hooks/checkheads.py:hook -- Repository URL: http://hg.python.org/hooks From python-checkins at python.org Mon Feb 28 08:51:45 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 28 Feb 2011 08:51:45 +0100 Subject: [Python-checkins] cpython: Commit some end-of-line whitespace. Message-ID: georg.brandl pushed 5e39fb8635b5 to cpython: http://hg.python.org/cpython/rev/5e39fb8635b5 changeset: 68066:5e39fb8635b5 user: Georg Brandl date: Mon Feb 28 08:44:26 2011 +0100 summary: Commit some end-of-line whitespace. files: Lib/ast.py diff --git a/Lib/ast.py b/Lib/ast.py --- a/Lib/ast.py +++ b/Lib/ast.py @@ -29,7 +29,7 @@ def parse(source, filename='', mode='exec'): - """ + """ Parse the source into an AST node. Equivalent to compile(source, filename, mode, PyCF_ONLY_AST). 
""" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Feb 28 08:51:45 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 28 Feb 2011 08:51:45 +0100 Subject: [Python-checkins] cpython: Revert whitespace change. Message-ID: georg.brandl pushed 590870877f85 to cpython: http://hg.python.org/cpython/rev/590870877f85 changeset: 68067:590870877f85 tag: tip user: Georg Brandl date: Mon Feb 28 08:50:40 2011 +0100 summary: Revert whitespace change. files: Lib/ast.py diff --git a/Lib/ast.py b/Lib/ast.py --- a/Lib/ast.py +++ b/Lib/ast.py @@ -29,7 +29,7 @@ def parse(source, filename='', mode='exec'): - """ + """ Parse the source into an AST node. Equivalent to compile(source, filename, mode, PyCF_ONLY_AST). """ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Feb 28 08:54:55 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 28 Feb 2011 08:54:55 +0100 Subject: [Python-checkins] pymigr: Update open issues: there are no more open issues in PEP 385. Branch structure Message-ID: georg.brandl pushed 829967ac436e to pymigr: http://hg.python.org/pymigr/rev/829967ac436e changeset: 117:829967ac436e tag: tip user: Georg Brandl date: Mon Feb 28 08:53:53 2011 +0100 summary: Update open issues: there are no more open issues in PEP 385. Branch structure should be pretty much decided by now, workflow is by individual preference. files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -1,11 +1,3 @@ -Before final conversion (tentatively 2011-03-05) -================================================ - -* resolve open issues in PEP 385 -* (optionally) fight some more about workflow and decide the branch structure -* some hook should prevent pushing python files indented by tabs. - -> done in checkwhitespace.py? - Done ==== @@ -28,15 +20,17 @@ It's stricter but prevents from polluting the history with whitespace fix commits. -> done +* some hook should prevent pushing python files indented by tabs. + -> done in checkwhitespace.py Optional/Low priority ===================== * find a nice way of modifying the templates (possibly involving jinja) * check if we should speed up svn revision matching/lookups (since the - lookup WSGI app resides permanently in a mod_wsgi process, building a - cache at startup should be sufficient) -* investigate roundup + rietveld integration changes required + lookup WSGI app can reside permanently in a mod_wsgi process, building + a cache at startup should be sufficient) +* investigate roundup-rietveld integration changes required After migration =============== -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Mon Feb 28 08:59:57 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 28 Feb 2011 08:59:57 +0100 Subject: [Python-checkins] pymigr: Add some things to do to the new repo after conversion. Message-ID: georg.brandl pushed 7509f87b49c3 to pymigr: http://hg.python.org/pymigr/rev/7509f87b49c3 changeset: 118:7509f87b49c3 tag: tip user: Georg Brandl date: Mon Feb 28 08:58:55 2011 +0100 summary: Add some things to do to the new repo after conversion. files: todo.txt diff --git a/todo.txt b/todo.txt --- a/todo.txt +++ b/todo.txt @@ -23,6 +23,17 @@ * some hook should prevent pushing python files indented by tabs. 
-> done in checkwhitespace.py +Checklist for the converted repo +================================ + +* close unused branches +* add .hgeol in all active branches +* commit wrong line-endings once +* dummy-merge maintenance branches +* clean up tags on all branches, correct pre-2.4 tags that point to + nothing +* install same hooks as in test repo + Optional/Low priority ===================== -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Mon Feb 28 09:30:18 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 28 Feb 2011 09:30:18 +0100 Subject: [Python-checkins] cpython (2.7): Mysterious update to installer README. Message-ID: ned.deily pushed 41c92cafef61 to cpython: http://hg.python.org/cpython/rev/41c92cafef61 changeset: 68068:41c92cafef61 branch: 2.7 tag: tip parent: 68030:9619d21d8198 user: Ned Deily date: Mon Feb 28 00:29:57 2011 -0800 summary: Mysterious update to installer README. files: Mac/BuildScript/README.txt diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -3,7 +3,7 @@ The ``build-install.py`` script creates Python distributions, including certain third-party libraries as necessary. It builds a complete -framework-based Python out-of-tree, installs it in a funny place with +framework-based Python out-of-tree, installs it in a mysterious place with $DESTROOT, massages that installation to remove .pyc files and such, creates an Installer package from the installation plus other files in ``resources`` and ``scripts`` and placed that on a ``.dmg`` disk image. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Feb 28 09:40:50 2011 From: python-checkins at python.org (ned.deily) Date: Mon, 28 Feb 2011 09:40:50 +0100 Subject: [Python-checkins] cpython: Mysterious update to installer README. Message-ID: ned.deily pushed cbad338b10cd to cpython: http://hg.python.org/cpython/rev/cbad338b10cd changeset: 68069:cbad338b10cd tag: tip parent: 68067:590870877f85 user: Ned Deily date: Mon Feb 28 00:29:57 2011 -0800 summary: Mysterious update to installer README. files: Mac/BuildScript/README.txt diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -3,7 +3,7 @@ The ``build-install.py`` script creates Python distributions, including certain third-party libraries as necessary. It builds a complete -framework-based Python out-of-tree, installs it in a funny place with +framework-based Python out-of-tree, installs it in a mysterious place with $DESTROOT, massages that installation to remove .pyc files and such, creates an Installer package from the installation plus other files in ``resources`` and ``scripts`` and placed that on a ``.dmg`` disk image. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Feb 28 18:59:41 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 18:59:41 +0100 Subject: [Python-checkins] pymigr: legacy-trunk it shall be Message-ID: antoine.pitrou pushed f12144b4e50c to pymigr: http://hg.python.org/pymigr/rev/f12144b4e50c changeset: 119:f12144b4e50c tag: tip user: Antoine Pitrou date: Mon Feb 28 18:59:38 2011 +0100 summary: legacy-trunk it shall be files: branchmap.txt diff --git a/branchmap.txt b/branchmap.txt --- a/branchmap.txt +++ b/branchmap.txt @@ -11,121 +11,121 @@ release22-maint = 2.2 release21-maint = 2.1 release20-maint = 2.0 -siginterrupt-reset-issue8354 = trunk -asyncore-tests-issue8490 = trunk -signalfd-issue8407 = trunk -tarek_sysconfig = trunk -pep370 = trunk -tk_and_idle_maintenance = trunk -multiprocessing-autoconf = trunk -tlee-ast-optimize = trunk -empty = trunk -tnelson-trunk-bsddb-47-upgrade = trunk -ctypes-branch = trunk -decimal-branch = trunk -bcannon-objcap = trunk -amk-mailbox = trunk -twouters-dictviews-backport = trunk -bcannon-sandboxing = trunk -theller_modulefinder = trunk -hoxworth-stdlib_logging-soc = trunk -tim-exc_sanity = trunk -IDLE-syntax-branch = trunk -jim-doctest = trunk -release23-branch = trunk -jim-modulator = trunk -indexing-cleanup-branch = trunk -r23c1-branch = trunk -r23b2-branch = trunk -anthony-parser-branch = trunk -r23b1-branch = trunk -idlefork-merge-branch = trunk -getargs_mask_mods = trunk -cache-attr-branch = trunk -folding-reimpl-branch = trunk -r23a2-branch = trunk -bsddb-bsddb3-schizo-branch = trunk -r23a1-branch = trunk -py-cvs-vendor-branch = trunk -DS_RPC_BRANCH = trunk -SourceForge = trunk -release22-branch = trunk -r22rc1-branch = trunk -r22b2-branch = trunk -r22b1-branch = trunk -r22a4-branch = trunk -r22a3-branch = trunk -r22a2-branch = trunk -r161-branch = trunk -cnri-16-start = trunk -universal-33 = trunk -None = trunk -avendor = trunk -Distutils_0_1_3-branch = trunk -release152p1-patches = trunk -string_methods = trunk -PYIDE = trunk -OSAM = trunk -PYTHONSCRIPT = trunk -BBPY = trunk -jar = trunk -alpha100 = trunk -unlabeled-2.36.4 = trunk -unlabeled-2.1.4 = trunk -unlabeled-2.25.4 = trunk -branch_libffi-3_0_10-win = trunk -iter-branch = trunk -gen-branch = trunk -descr-branch = trunk -ast-branch = trunk -ast-objects = trunk -ast-arena = trunk -benjaminp-testing = trunk -fix-test-ftplib = trunk -trunk-math = trunk -okkoto-sizeof = trunk -trunk-bytearray = trunk -libffi3-branch = trunk -../ctypes-branch = trunk -pep302_phase2 = trunk -py3k-buffer = trunk -unlabeled-2.9.2 = trunk -unlabeled-1.5.4 = trunk -unlabeled-1.1.2 = trunk -unlabeled-1.1.1 = trunk -unlabeled-2.9.4 = trunk -unlabeled-2.10.2 = trunk -unlabeled-2.1.2 = trunk -unlabeled-2.108.2 = trunk -unlabeled-2.36.2 = trunk -unlabeled-2.54.2 = trunk -unlabeled-1.3.2 = trunk -unlabeled-1.23.4 = trunk -unlabeled-2.25.2 = trunk -unlabeled-1.2.2 = trunk -unlabeled-1.5.2 = trunk -unlabeled-1.98.2 = trunk -unlabeled-2.16.2 = trunk -unlabeled-2.3.2 = trunk -unlabeled-1.9.2 = trunk -unlabeled-1.8.2 = trunk -aimacintyre-sf1454481 = trunk -tim-current_frames = trunk -bippolito-newstruct = trunk -runar-longslice-branch = trunk -steve-notracing = trunk -rjones-funccall = trunk -sreifschneider-newnewexcept = trunk -tim-doctest-branch = trunk -blais-bytebuf = trunk -../bippolito-newstruct = trunk -rjones-prealloc = trunk -sreifschneider-64ints = trunk -stdlib-cleanup = trunk -ssize_t = trunk -sqlite-integration = trunk -tim-obmalloc = trunk 
-release27-maint-ttk-debug-on-xp5 = trunk +siginterrupt-reset-issue8354 = legacy-trunk +asyncore-tests-issue8490 = legacy-trunk +signalfd-issue8407 = legacy-trunk +tarek_sysconfig = legacy-trunk +pep370 = legacy-trunk +tk_and_idle_maintenance = legacy-trunk +multiprocessing-autoconf = legacy-trunk +tlee-ast-optimize = legacy-trunk +empty = legacy-trunk +tnelson-legacy-trunk-bsddb-47-upgrade = legacy-trunk +ctypes-branch = legacy-trunk +decimal-branch = legacy-trunk +bcannon-objcap = legacy-trunk +amk-mailbox = legacy-trunk +twouters-dictviews-backport = legacy-trunk +bcannon-sandboxing = legacy-trunk +theller_modulefinder = legacy-trunk +hoxworth-stdlib_logging-soc = legacy-trunk +tim-exc_sanity = legacy-trunk +IDLE-syntax-branch = legacy-trunk +jim-doctest = legacy-trunk +release23-branch = legacy-trunk +jim-modulator = legacy-trunk +indexing-cleanup-branch = legacy-trunk +r23c1-branch = legacy-trunk +r23b2-branch = legacy-trunk +anthony-parser-branch = legacy-trunk +r23b1-branch = legacy-trunk +idlefork-merge-branch = legacy-trunk +getargs_mask_mods = legacy-trunk +cache-attr-branch = legacy-trunk +folding-reimpl-branch = legacy-trunk +r23a2-branch = legacy-trunk +bsddb-bsddb3-schizo-branch = legacy-trunk +r23a1-branch = legacy-trunk +py-cvs-vendor-branch = legacy-trunk +DS_RPC_BRANCH = legacy-trunk +SourceForge = legacy-trunk +release22-branch = legacy-trunk +r22rc1-branch = legacy-trunk +r22b2-branch = legacy-trunk +r22b1-branch = legacy-trunk +r22a4-branch = legacy-trunk +r22a3-branch = legacy-trunk +r22a2-branch = legacy-trunk +r161-branch = legacy-trunk +cnri-16-start = legacy-trunk +universal-33 = legacy-trunk +None = legacy-trunk +avendor = legacy-trunk +Distutils_0_1_3-branch = legacy-trunk +release152p1-patches = legacy-trunk +string_methods = legacy-trunk +PYIDE = legacy-trunk +OSAM = legacy-trunk +PYTHONSCRIPT = legacy-trunk +BBPY = legacy-trunk +jar = legacy-trunk +alpha100 = legacy-trunk +unlabeled-2.36.4 = legacy-trunk +unlabeled-2.1.4 = legacy-trunk +unlabeled-2.25.4 = legacy-trunk +branch_libffi-3_0_10-win = legacy-trunk +iter-branch = legacy-trunk +gen-branch = legacy-trunk +descr-branch = legacy-trunk +ast-branch = legacy-trunk +ast-objects = legacy-trunk +ast-arena = legacy-trunk +benjaminp-testing = legacy-trunk +fix-test-ftplib = legacy-trunk +trunk-math = legacy-trunk +okkoto-sizeof = legacy-trunk +trunk-bytearray = legacy-trunk +libffi3-branch = legacy-trunk +../ctypes-branch = legacy-trunk +pep302_phase2 = legacy-trunk +py3k-buffer = legacy-trunk +unlabeled-2.9.2 = legacy-trunk +unlabeled-1.5.4 = legacy-trunk +unlabeled-1.1.2 = legacy-trunk +unlabeled-1.1.1 = legacy-trunk +unlabeled-2.9.4 = legacy-trunk +unlabeled-2.10.2 = legacy-trunk +unlabeled-2.1.2 = legacy-trunk +unlabeled-2.108.2 = legacy-trunk +unlabeled-2.36.2 = legacy-trunk +unlabeled-2.54.2 = legacy-trunk +unlabeled-1.3.2 = legacy-trunk +unlabeled-1.23.4 = legacy-trunk +unlabeled-2.25.2 = legacy-trunk +unlabeled-1.2.2 = legacy-trunk +unlabeled-1.5.2 = legacy-trunk +unlabeled-1.98.2 = legacy-trunk +unlabeled-2.16.2 = legacy-trunk +unlabeled-2.3.2 = legacy-trunk +unlabeled-1.9.2 = legacy-trunk +unlabeled-1.8.2 = legacy-trunk +aimacintyre-sf1454481 = legacy-trunk +tim-current_frames = legacy-trunk +bippolito-newstruct = legacy-trunk +runar-longslice-branch = legacy-trunk +steve-notracing = legacy-trunk +rjones-funccall = legacy-trunk +sreifschneider-newnewexcept = legacy-trunk +tim-doctest-branch = legacy-trunk +blais-bytebuf = legacy-trunk +../bippolito-newstruct = legacy-trunk +rjones-prealloc = 
legacy-trunk +sreifschneider-64ints = legacy-trunk +stdlib-cleanup = legacy-trunk +ssize_t = legacy-trunk +sqlite-integration = legacy-trunk +tim-obmalloc = legacy-trunk +release27-maint-ttk-debug-on-xp5 = legacy-trunk test_subprocess_10826 = default py3k-signalfd-issue8407 = default sslopts-4870 = default -- Repository URL: http://hg.python.org/pymigr From python-checkins at python.org Mon Feb 28 19:08:11 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 19:08:11 +0100 (CET) Subject: [Python-checkins] r88675 - in python/branches/pep-3151: Doc/bugs.rst Doc/faq/general.rst Doc/howto/logging-cookbook.rst Doc/library/json.rst Doc/reference/executionmodel.rst Doc/using/unix.rst Doc/using/windows.rst Lib/collections/__init__.py Lib/lib2to3 Lib/lib2to3/patcomp.py Lib/lib2to3/pgen2/driver.py Lib/lib2to3/tests/test_parser.py Lib/logging/__init__.py Lib/ssl.py Lib/test/support.py Lib/test/test_collections.py Lib/test/test_logging.py Lib/test/test_os.py Lib/test/test_ssl.py Misc/NEWS Modules/posixmodule.c Objects/typeslots.inc Message-ID: <20110228180811.0091FEE984@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 19:08:07 2011 New Revision: 88675 Log: Merged revisions 88633,88636,88639-88640,88643-88647,88650,88652-88654,88656,88658,88660,88663-88664,88668,88671 via svnmerge from svn+ssh://pythondev at svn.python.org/python/branches/py3k ................ r88633 | raymond.hettinger | 2011-02-26 07:53:58 +0100 (sam., 26 f?vr. 2011) | 1 line Add __bool__ method. Add tests. Fix-up broken test. ................ r88636 | vinay.sajip | 2011-02-26 08:18:22 +0100 (sam., 26 f?vr. 2011) | 1 line test_logging: Changed TimedRotatingFileHandler tests to use UTC time rather than local time. ................ r88639 | antoine.pitrou | 2011-02-26 09:45:20 +0100 (sam., 26 f?vr. 2011) | 4 lines Issue #11258: Speed up ctypes.util.find_library() under Linux a lot. Patch by Jonas H. ................ r88640 | antoine.pitrou | 2011-02-26 10:37:45 +0100 (sam., 26 f?vr. 2011) | 3 lines Revert r88639 (the optimization changes behaviour and breaks buildbots) ................ r88643 | antoine.pitrou | 2011-02-26 14:38:35 +0100 (sam., 26 f?vr. 2011) | 3 lines Check error return from _parse_off_t(), and remove cruft from the 2->3 transition. ................ r88644 | vinay.sajip | 2011-02-26 15:15:48 +0100 (sam., 26 f?vr. 2011) | 1 line Issue #11330: asctime format bug fixed. ................ r88645 | vinay.sajip | 2011-02-26 15:24:29 +0100 (sam., 26 f?vr. 2011) | 1 line Issue #11331: fixed documentation in logging cookbook. ................ r88646 | vinay.sajip | 2011-02-26 15:28:36 +0100 (sam., 26 f?vr. 2011) | 1 line Removed typo. ................ r88647 | antoine.pitrou | 2011-02-26 15:29:24 +0100 (sam., 26 f?vr. 2011) | 3 lines Issue #11323: fix sendfile tests under 64-bit Solaris. ................ r88650 | eric.araujo | 2011-02-26 15:57:23 +0100 (sam., 26 f?vr. 2011) | 2 lines Replace links to the old dev doc with links to the new devguide. ................ r88652 | antoine.pitrou | 2011-02-26 16:58:05 +0100 (sam., 26 f?vr. 2011) | 4 lines Issue #9931: Fix hangs in GUI tests under Windows in certain conditions. Patch by Hirokazu Yamamoto. ................ r88653 | vinay.sajip | 2011-02-26 17:00:04 +0100 (sam., 26 f?vr. 2011) | 1 line Issue #11330: Added regression test. ................ r88654 | vinay.sajip | 2011-02-26 17:06:02 +0100 (sam., 26 f?vr. 2011) | 1 line Issue #11330: Updated tests for correct asctime handling. ................ 
r88656 | antoine.pitrou | 2011-02-26 18:52:50 +0100 (sam., 26 f?vr. 2011) | 3 lines Make sendfile tests more robust ................ r88658 | benjamin.peterson | 2011-02-26 22:32:16 +0100 (sam., 26 f?vr. 2011) | 1 line this isn't true anymore ................ r88660 | benjamin.peterson | 2011-02-26 22:35:16 +0100 (sam., 26 f?vr. 2011) | 1 line revert accidental formatting change ................ r88663 | benjamin.peterson | 2011-02-26 23:12:10 +0100 (sam., 26 f?vr. 2011) | 12 lines Merged revisions 88661 via svnmerge from svn+ssh://pythondev at svn.python.org/sandbox/trunk/2to3/lib2to3 ........ r88661 | benjamin.peterson | 2011-02-26 16:06:24 -0600 (Sat, 26 Feb 2011) | 6 lines fix refactoring on formfeed characters #11250 This is because text.splitlines() is not the same as list(StringIO.StringIO(text)). ........ ................ r88664 | antoine.pitrou | 2011-02-27 00:24:06 +0100 (dim., 27 f?vr. 2011) | 4 lines Issue #11326: Add the missing connect_ex() implementation for SSL sockets, and make it work for non-blocking connects. ................ r88668 | benjamin.peterson | 2011-02-27 16:06:44 +0100 (dim., 27 f?vr. 2011) | 1 line make this a link #11345 ................ r88671 | antoine.pitrou | 2011-02-27 16:44:12 +0100 (dim., 27 f?vr. 2011) | 3 lines Follow up to r88664: non-blocking connect-ex() can return EWOULDBLOCK under Windows ................ Modified: python/branches/pep-3151/ (props changed) python/branches/pep-3151/Doc/bugs.rst python/branches/pep-3151/Doc/faq/general.rst python/branches/pep-3151/Doc/howto/logging-cookbook.rst python/branches/pep-3151/Doc/library/json.rst python/branches/pep-3151/Doc/reference/executionmodel.rst python/branches/pep-3151/Doc/using/unix.rst python/branches/pep-3151/Doc/using/windows.rst python/branches/pep-3151/Lib/collections/__init__.py python/branches/pep-3151/Lib/lib2to3/ (props changed) python/branches/pep-3151/Lib/lib2to3/patcomp.py python/branches/pep-3151/Lib/lib2to3/pgen2/driver.py python/branches/pep-3151/Lib/lib2to3/tests/test_parser.py python/branches/pep-3151/Lib/logging/__init__.py python/branches/pep-3151/Lib/ssl.py python/branches/pep-3151/Lib/test/support.py python/branches/pep-3151/Lib/test/test_collections.py python/branches/pep-3151/Lib/test/test_logging.py python/branches/pep-3151/Lib/test/test_os.py python/branches/pep-3151/Lib/test/test_ssl.py python/branches/pep-3151/Misc/NEWS python/branches/pep-3151/Modules/posixmodule.c python/branches/pep-3151/Objects/typeslots.inc Modified: python/branches/pep-3151/Doc/bugs.rst ============================================================================== --- python/branches/pep-3151/Doc/bugs.rst (original) +++ python/branches/pep-3151/Doc/bugs.rst Mon Feb 28 19:08:07 2011 @@ -57,12 +57,14 @@ Each bug report will be assigned to a developer who will determine what needs to be done to correct the problem. You will receive an update each time action is -taken on the bug. See http://www.python.org/dev/workflow/ for a detailed -description of the issue workflow. +taken on the bug. .. seealso:: + `Python Developer's Guide `_ + Detailed description of the issue workflow and developers tools. + `How to Report Bugs Effectively `_ Article which goes into some detail about how to create a useful bug report. This describes what kind of information is useful and why it is useful. 
Modified: python/branches/pep-3151/Doc/faq/general.rst ============================================================================== --- python/branches/pep-3151/Doc/faq/general.rst (original) +++ python/branches/pep-3151/Doc/faq/general.rst Mon Feb 28 19:08:07 2011 @@ -166,7 +166,7 @@ .. XXX update link once the dev faq is relocated -Consult the `Developer FAQ `__ for more +Consult the `Developer FAQ `__ for more information on getting the source code and compiling it. @@ -224,7 +224,7 @@ .. XXX update link once the dev faq is relocated You can also access the development version of Python through Subversion. See -http://www.python.org/dev/faq/ for details. +http://docs.python.org/devguide/faq for details. How do I submit bug reports and patches for Python? @@ -242,7 +242,7 @@ .. XXX adapt link to dev guide For more information on how Python is developed, consult `the Python Developer's -Guide `_. +Guide `_. Are there any published articles about Python that I can reference? Modified: python/branches/pep-3151/Doc/howto/logging-cookbook.rst ============================================================================== --- python/branches/pep-3151/Doc/howto/logging-cookbook.rst (original) +++ python/branches/pep-3151/Doc/howto/logging-cookbook.rst Mon Feb 28 19:08:07 2011 @@ -630,8 +630,6 @@ if __name__ == '__main__': levels = (logging.DEBUG, logging.INFO, logging.WARNING, logging.ERROR, logging.CRITICAL) - a1 = logging.LoggerAdapter(logging.getLogger('a.b.c'), - { 'ip' : '123.231.231.123', 'user' : 'sheila' }) logging.basicConfig(level=logging.DEBUG, format='%(asctime)-15s %(name)-5s %(levelname)-8s IP: %(ip)-15s User: %(user)-8s %(message)s') a1 = logging.getLogger('a.b.c') Modified: python/branches/pep-3151/Doc/library/json.rst ============================================================================== --- python/branches/pep-3151/Doc/library/json.rst (original) +++ python/branches/pep-3151/Doc/library/json.rst Mon Feb 28 19:08:07 2011 @@ -6,7 +6,7 @@ .. moduleauthor:: Bob Ippolito .. sectionauthor:: Bob Ippolito -JSON (JavaScript Object Notation) is a subset of JavaScript +`JSON (JavaScript Object Notation) `_ is a subset of JavaScript syntax (ECMA-262 3rd edition) used as a lightweight data interchange format. :mod:`json` exposes an API familiar to users of the standard library Modified: python/branches/pep-3151/Doc/reference/executionmodel.rst ============================================================================== --- python/branches/pep-3151/Doc/reference/executionmodel.rst (original) +++ python/branches/pep-3151/Doc/reference/executionmodel.rst Mon Feb 28 19:08:07 2011 @@ -94,9 +94,7 @@ at the module level. A target occurring in a :keyword:`del` statement is also considered bound for -this purpose (though the actual semantics are to unbind the name). It is -illegal to unbind a name that is referenced by an enclosing scope; the compiler -will report a :exc:`SyntaxError`. +this purpose (though the actual semantics are to unbind the name). Each assignment or import statement occurs within a block defined by a class or function definition or at the module level (the top-level code block). Modified: python/branches/pep-3151/Doc/using/unix.rst ============================================================================== --- python/branches/pep-3151/Doc/using/unix.rst (original) +++ python/branches/pep-3151/Doc/using/unix.rst Mon Feb 28 19:08:07 2011 @@ -66,7 +66,7 @@ If you want to compile CPython yourself, first thing you should do is get the `source `_. 
You can download either the latest release's source or just grab a fresh `checkout -`_. +`_. The build process consists the usual :: Modified: python/branches/pep-3151/Doc/using/windows.rst ============================================================================== --- python/branches/pep-3151/Doc/using/windows.rst (original) +++ python/branches/pep-3151/Doc/using/windows.rst Mon Feb 28 19:08:07 2011 @@ -290,7 +290,7 @@ If you want to compile CPython yourself, first thing you should do is get the `source `_. You can download either the latest release's source or just grab a fresh `checkout -`_. +`_. For Microsoft Visual C++, which is the compiler with which official Python releases are built, the source tree contains solutions/project files. View the Modified: python/branches/pep-3151/Lib/collections/__init__.py ============================================================================== --- python/branches/pep-3151/Lib/collections/__init__.py (original) +++ python/branches/pep-3151/Lib/collections/__init__.py Mon Feb 28 19:08:07 2011 @@ -679,6 +679,9 @@ def __contains__(self, key): return any(key in m for m in self.maps) + def __bool__(self): + return any(self.maps) + @_recursive_repr() def __repr__(self): return '{0.__class__.__name__}({1})'.format( Modified: python/branches/pep-3151/Lib/lib2to3/patcomp.py ============================================================================== --- python/branches/pep-3151/Lib/lib2to3/patcomp.py (original) +++ python/branches/pep-3151/Lib/lib2to3/patcomp.py Mon Feb 28 19:08:07 2011 @@ -11,6 +11,7 @@ __author__ = "Guido van Rossum " # Python imports +import io import os # Fairly local imports @@ -32,7 +33,7 @@ def tokenize_wrapper(input): """Tokenizes a string suppressing significant whitespace.""" skip = set((token.NEWLINE, token.INDENT, token.DEDENT)) - tokens = tokenize.generate_tokens(driver.generate_lines(input).__next__) + tokens = tokenize.generate_tokens(io.StringIO(input).readline) for quintuple in tokens: type, value, start, end, line_text = quintuple if type not in skip: Modified: python/branches/pep-3151/Lib/lib2to3/pgen2/driver.py ============================================================================== --- python/branches/pep-3151/Lib/lib2to3/pgen2/driver.py (original) +++ python/branches/pep-3151/Lib/lib2to3/pgen2/driver.py Mon Feb 28 19:08:07 2011 @@ -17,6 +17,7 @@ # Python imports import codecs +import io import os import logging import sys @@ -101,18 +102,10 @@ def parse_string(self, text, debug=False): """Parse a string and return the syntax tree.""" - tokens = tokenize.generate_tokens(generate_lines(text).__next__) + tokens = tokenize.generate_tokens(io.StringIO(text).readline) return self.parse_tokens(tokens, debug) -def generate_lines(text): - """Generator that behaves like readline without using StringIO.""" - for line in text.splitlines(True): - yield line - while True: - yield "" - - def load_grammar(gt="Grammar.txt", gp=None, save=True, force=False, logger=None): """Load the grammar (maybe from a pickle).""" Modified: python/branches/pep-3151/Lib/lib2to3/tests/test_parser.py ============================================================================== --- python/branches/pep-3151/Lib/lib2to3/tests/test_parser.py (original) +++ python/branches/pep-3151/Lib/lib2to3/tests/test_parser.py Mon Feb 28 19:08:07 2011 @@ -18,6 +18,16 @@ # Local imports from lib2to3.pgen2 import tokenize from ..pgen2.parse import ParseError +from lib2to3.pygram import python_symbols as syms + + +class TestDriver(support.TestCase): + + def 
test_formfeed(self): + s = """print 1\n\x0Cprint 2\n""" + t = driver.parse_string(s) + self.assertEqual(t.children[0].children[0].type, syms.print_stmt) + self.assertEqual(t.children[1].children[0].type, syms.print_stmt) class GrammarTest(support.TestCase): Modified: python/branches/pep-3151/Lib/logging/__init__.py ============================================================================== --- python/branches/pep-3151/Lib/logging/__init__.py (original) +++ python/branches/pep-3151/Lib/logging/__init__.py Mon Feb 28 19:08:07 2011 @@ -360,12 +360,13 @@ default_format = '%(message)s' asctime_format = '%(asctime)s' + asctime_search = '%(asctime)' def __init__(self, fmt): self._fmt = fmt or self.default_format def usesTime(self): - return self._fmt.find(self.asctime_format) >= 0 + return self._fmt.find(self.asctime_search) >= 0 def format(self, record): return self._fmt % record.__dict__ @@ -373,6 +374,7 @@ class StrFormatStyle(PercentStyle): default_format = '{message}' asctime_format = '{asctime}' + asctime_search = '{asctime' def format(self, record): return self._fmt.format(**record.__dict__) @@ -381,6 +383,7 @@ class StringTemplateStyle(PercentStyle): default_format = '${message}' asctime_format = '${asctime}' + asctime_search = '${asctime}' def __init__(self, fmt): self._fmt = fmt or self.default_format Modified: python/branches/pep-3151/Lib/ssl.py ============================================================================== --- python/branches/pep-3151/Lib/ssl.py (original) +++ python/branches/pep-3151/Lib/ssl.py Mon Feb 28 19:08:07 2011 @@ -237,6 +237,7 @@ self._closed = False self._sslobj = None + self._connected = connected if connected: # create the SSL object try: @@ -430,23 +431,36 @@ finally: self.settimeout(timeout) - def connect(self, addr): - """Connects to remote ADDR, and then wraps the connection in - an SSL channel.""" + def _real_connect(self, addr, return_errno): if self.server_side: raise ValueError("can't connect in server-side mode") # Here we assume that the socket is client-side, and not # connected at the time of the call. We connect it, then wrap it. - if self._sslobj: + if self._connected: raise ValueError("attempt to connect already-connected SSLSocket!") - socket.connect(self, addr) self._sslobj = self.context._wrap_socket(self, False, self.server_hostname) try: + socket.connect(self, addr) if self.do_handshake_on_connect: self.do_handshake() - except: - self._sslobj = None - raise + except socket_error as e: + if return_errno: + return e.errno + else: + self._sslobj = None + raise e + self._connected = True + return 0 + + def connect(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + self._real_connect(addr, False) + + def connect_ex(self, addr): + """Connects to remote ADDR, and then wraps the connection in + an SSL channel.""" + return self._real_connect(addr, True) def accept(self): """Accepts a new connection from a remote client, and returns Modified: python/branches/pep-3151/Lib/test/support.py ============================================================================== --- python/branches/pep-3151/Lib/test/support.py (original) +++ python/branches/pep-3151/Lib/test/support.py Mon Feb 28 19:08:07 2011 @@ -233,6 +233,36 @@ unlink(imp.cache_from_source(source, debug_override=True)) unlink(imp.cache_from_source(source, debug_override=False)) +# On some platforms, should not run gui test even if it is allowed +# in `use_resources'. 
+if sys.platform.startswith('win'): + import ctypes + import ctypes.wintypes + def _is_gui_available(): + UOI_FLAGS = 1 + WSF_VISIBLE = 0x0001 + class USEROBJECTFLAGS(ctypes.Structure): + _fields_ = [("fInherit", ctypes.wintypes.BOOL), + ("fReserved", ctypes.wintypes.BOOL), + ("dwFlags", ctypes.wintypes.DWORD)] + dll = ctypes.windll.user32 + h = dll.GetProcessWindowStation() + if not h: + raise ctypes.WinError() + uof = USEROBJECTFLAGS() + needed = ctypes.wintypes.DWORD() + res = dll.GetUserObjectInformationW(h, + UOI_FLAGS, + ctypes.byref(uof), + ctypes.sizeof(uof), + ctypes.byref(needed)) + if not res: + raise ctypes.WinError() + return bool(uof.dwFlags & WSF_VISIBLE) +else: + def _is_gui_available(): + return True + def is_resource_enabled(resource): """Test whether a resource is enabled. Known resources are set by regrtest.py.""" @@ -245,6 +275,8 @@ possibility of False being returned occurs when regrtest.py is executing. """ + if resource == 'gui' and not _is_gui_available(): + raise unittest.SkipTest("Cannot use the 'gui' resource") # see if the caller's module is __main__ - if so, treat as if # the resource was set if sys._getframe(1).f_globals.get("__name__") == "__main__": @@ -1045,6 +1077,8 @@ return obj def requires_resource(resource): + if resource == 'gui' and not _is_gui_available(): + return unittest.skip("resource 'gui' is not available") if is_resource_enabled(resource): return _id else: Modified: python/branches/pep-3151/Lib/test/test_collections.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_collections.py (original) +++ python/branches/pep-3151/Lib/test/test_collections.py Mon Feb 28 19:08:07 2011 @@ -72,17 +72,23 @@ for m1, m2 in zip(d.maps, e.maps): self.assertIsNot(m1, m2, e) - d.new_child() - d['b'] = 5 - self.assertEqual(d.maps, [{'b': 5}, {'c':30}, {'a':1, 'b':2}]) - self.assertEqual(d.parents.maps, [{'c':30}, {'a':1, 'b':2}]) # check parents - self.assertEqual(d['b'], 5) # find first in chain - self.assertEqual(d.parents['b'], 2) # look beyond maps[0] + f = d.new_child() + f['b'] = 5 + self.assertEqual(f.maps, [{'b': 5}, {'c':30}, {'a':1, 'b':2}]) + self.assertEqual(f.parents.maps, [{'c':30}, {'a':1, 'b':2}]) # check parents + self.assertEqual(f['b'], 5) # find first in chain + self.assertEqual(f.parents['b'], 2) # look beyond maps[0] def test_contructor(self): - self.assertEqual(ChainedContext().maps, [{}]) # no-args --> one new dict + self.assertEqual(ChainMap().maps, [{}]) # no-args --> one new dict self.assertEqual(ChainMap({1:2}).maps, [{1:2}]) # 1 arg --> list + def test_bool(self): + self.assertFalse(ChainMap()) + self.assertFalse(ChainMap({}, {})) + self.assertTrue(ChainMap({1:2}, {})) + self.assertTrue(ChainMap({}, {1:2})) + def test_missing(self): class DefaultChainMap(ChainMap): def __missing__(self, key): @@ -1182,7 +1188,7 @@ def test_main(verbose=None): NamedTupleDocs = doctest.DocTestSuite(module=collections) test_classes = [TestNamedTuple, NamedTupleDocs, TestOneTrickPonyABCs, - TestCollectionABCs, TestCounter, + TestCollectionABCs, TestCounter, TestChainMap, TestOrderedDict, GeneralMappingTests, SubclassMappingTests] support.run_unittest(*test_classes) support.run_doctest(collections, verbose) Modified: python/branches/pep-3151/Lib/test/test_logging.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_logging.py (original) +++ python/branches/pep-3151/Lib/test/test_logging.py Mon Feb 28 19:08:07 
2011 @@ -1,6 +1,6 @@ #!/usr/bin/env python # -# Copyright 2001-2010 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2011 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -18,7 +18,7 @@ """Test harness for the logging module. Run all tests. -Copyright (C) 2001-2010 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2011 Vinay Sajip. All Rights Reserved. """ import logging @@ -1907,6 +1907,8 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('%(asctime)s') self.assertTrue(f.usesTime()) + f = logging.Formatter('%(asctime)-15s') + self.assertTrue(f.usesTime()) f = logging.Formatter('asctime') self.assertFalse(f.usesTime()) @@ -1920,6 +1922,10 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('{asctime}', style='{') self.assertTrue(f.usesTime()) + f = logging.Formatter('{asctime!s:15}', style='{') + self.assertTrue(f.usesTime()) + f = logging.Formatter('{asctime:15}', style='{') + self.assertTrue(f.usesTime()) f = logging.Formatter('asctime', style='{') self.assertFalse(f.usesTime()) @@ -1935,6 +1941,8 @@ self.assertFalse(f.usesTime()) f = logging.Formatter('${asctime}', style='$') self.assertTrue(f.usesTime()) + f = logging.Formatter('${asctime', style='$') + self.assertFalse(f.usesTime()) f = logging.Formatter('$asctime', style='$') self.assertTrue(f.usesTime()) f = logging.Formatter('asctime', style='$') @@ -2044,13 +2052,13 @@ ('M', 60), ('H', 60 * 60), ('D', 60 * 60 * 24), - ('MIDNIGHT', 60 * 60 * 23), + ('MIDNIGHT', 60 * 60 * 24), # current time (epoch start) is a Thursday, W0 means Monday - ('W0', secs(days=4, hours=23)), + ('W0', secs(days=4, hours=24)), ): def test_compute_rollover(self, when=when, exp=exp): rh = logging.handlers.TimedRotatingFileHandler( - self.fn, when=when, interval=1, backupCount=0) + self.fn, when=when, interval=1, backupCount=0, utc=True) currentTime = 0.0 actual = rh.computeRollover(currentTime) if exp != actual: Modified: python/branches/pep-3151/Lib/test/test_os.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_os.py (original) +++ python/branches/pep-3151/Lib/test/test_os.py Mon Feb 28 19:08:07 2011 @@ -1340,7 +1340,7 @@ def wait(self): # wait for handler connection to be closed, then stop the server - while not getattr(self.handler_instance, "closed", True): + while not getattr(self.handler_instance, "closed", False): time.sleep(0.001) self.stop() @@ -1374,7 +1374,7 @@ @unittest.skipUnless(hasattr(os, 'sendfile'), "test needs os.sendfile()") class TestSendfile(unittest.TestCase): - DATA = b"12345abcde" * 1024 * 1024 # 10 Mb + DATA = b"12345abcde" * 16 * 1024 # 160 KB SUPPORT_HEADERS_TRAILERS = not sys.platform.startswith("linux") and \ not sys.platform.startswith("solaris") and \ not sys.platform.startswith("sunos") @@ -1432,7 +1432,7 @@ total_sent = 0 offset = 0 nbytes = 4096 - while 1: + while total_sent < len(self.DATA): sent = self.sendfile_wrapper(self.sockno, self.fileno, offset, nbytes) if sent == 0: break @@ -1442,17 +1442,20 @@ self.assertEqual(offset, total_sent) self.assertEqual(total_sent, len(self.DATA)) + self.client.shutdown(socket.SHUT_RDWR) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() - self.assertEqual(hash(data), hash(self.DATA)) + self.assertEqual(len(data), len(self.DATA)) + self.assertEqual(data, self.DATA) def test_send_at_certain_offset(self): # start sending a file at a 
certain offset total_sent = 0 - offset = len(self.DATA) / 2 + offset = len(self.DATA) // 2 + must_send = len(self.DATA) - offset nbytes = 4096 - while 1: + while total_sent < must_send: sent = self.sendfile_wrapper(self.sockno, self.fileno, offset, nbytes) if sent == 0: break @@ -1460,18 +1463,27 @@ total_sent += sent self.assertTrue(sent <= nbytes) + self.client.shutdown(socket.SHUT_RDWR) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() - expected = self.DATA[int(len(self.DATA) / 2):] + expected = self.DATA[len(self.DATA) // 2:] self.assertEqual(total_sent, len(expected)) - self.assertEqual(hash(data), hash(expected)) + self.assertEqual(len(data), len(expected)) + self.assertEqual(data, expected) def test_offset_overflow(self): # specify an offset > file size offset = len(self.DATA) + 4096 - sent = os.sendfile(self.sockno, self.fileno, offset, 4096) - self.assertEqual(sent, 0) + try: + sent = os.sendfile(self.sockno, self.fileno, offset, 4096) + except OSError as e: + # Solaris can raise EINVAL if offset >= file length, ignore. + if e.errno != errno.EINVAL: + raise + else: + self.assertEqual(sent, 0) + self.client.shutdown(socket.SHUT_RDWR) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() Modified: python/branches/pep-3151/Lib/test/test_ssl.py ============================================================================== --- python/branches/pep-3151/Lib/test/test_ssl.py (original) +++ python/branches/pep-3151/Lib/test/test_ssl.py Mon Feb 28 19:08:07 2011 @@ -451,6 +451,50 @@ finally: s.close() + def test_connect_ex(self): + # Issue #11326: check connect_ex() implementation + with support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT) + try: + self.assertEqual(0, s.connect_ex(("svn.python.org", 443))) + self.assertTrue(s.getpeercert()) + finally: + s.close() + + def test_non_blocking_connect_ex(self): + # Issue #11326: non-blocking connect_ex() should allow handshake + # to proceed after the socket gets ready. + with support.transient_internet("svn.python.org"): + s = ssl.wrap_socket(socket.socket(socket.AF_INET), + cert_reqs=ssl.CERT_REQUIRED, + ca_certs=SVN_PYTHON_ORG_ROOT_CERT, + do_handshake_on_connect=False) + try: + s.setblocking(False) + rc = s.connect_ex(('svn.python.org', 443)) + # EWOULDBLOCK under Windows, EINPROGRESS elsewhere + self.assertIn(rc, (0, errno.EINPROGRESS, errno.EWOULDBLOCK)) + # Wait for connect to finish + select.select([], [s], [], 5.0) + # Non-blocking handshake + while True: + try: + s.do_handshake() + break + except ssl.SSLError as err: + if err.args[0] == ssl.SSL_ERROR_WANT_READ: + select.select([s], [], [], 5.0) + elif err.args[0] == ssl.SSL_ERROR_WANT_WRITE: + select.select([], [s], [], 5.0) + else: + raise + # SSL established + self.assertTrue(s.getpeercert()) + finally: + s.close() + def test_connect_with_context(self): with support.transient_internet("svn.python.org"): # Same as test_connect, but with a separately created context Modified: python/branches/pep-3151/Misc/NEWS ============================================================================== --- python/branches/pep-3151/Misc/NEWS (original) +++ python/branches/pep-3151/Misc/NEWS Mon Feb 28 19:08:07 2011 @@ -35,6 +35,9 @@ Library ------- +- Issue #11326: Add the missing connect_ex() implementation for SSL sockets, + and make it work for non-blocking connects. + - Issue #11297: Add collections.ChainMap(). 
- Issue #10755: Add the posix.fdlistdir() function. Patch by Ross Lagerwall. @@ -105,6 +108,9 @@ Tests ----- +- Issue #9931: Fix hangs in GUI tests under Windows in certain conditions. + Patch by Hirokazu Yamamoto. + - Issue #10512: Properly close sockets under test.test_cgi. - Issue #10992: Make tests pass under coverage. Modified: python/branches/pep-3151/Modules/posixmodule.c ============================================================================== --- python/branches/pep-3151/Modules/posixmodule.c (original) +++ python/branches/pep-3151/Modules/posixmodule.c Mon Feb 28 19:08:07 2011 @@ -369,8 +369,7 @@ #if !defined(HAVE_LARGEFILE_SUPPORT) *((off_t*)addr) = PyLong_AsLong(arg); #else - *((off_t*)addr) = PyLong_Check(arg) ? PyLong_AsLongLong(arg) - : PyLong_AsLong(arg); + *((off_t*)addr) = PyLong_AsLongLong(arg); #endif if (PyErr_Occurred()) return 0; @@ -5772,8 +5771,7 @@ #if !defined(HAVE_LARGEFILE_SUPPORT) pos = PyLong_AsLong(posobj); #else - pos = PyLong_Check(posobj) ? - PyLong_AsLongLong(posobj) : PyLong_AsLong(posobj); + pos = PyLong_AsLongLong(posobj); #endif if (PyErr_Occurred()) return NULL; @@ -6030,7 +6028,8 @@ return Py_BuildValue("nO", ret, Py_None); } #endif - _parse_off_t(offobj, &offset); + if (!_parse_off_t(offobj, &offset)) + return NULL; Py_BEGIN_ALLOW_THREADS ret = sendfile(out, in, &offset, count); Py_END_ALLOW_THREADS Modified: python/branches/pep-3151/Objects/typeslots.inc ============================================================================== --- python/branches/pep-3151/Objects/typeslots.inc (original) +++ python/branches/pep-3151/Objects/typeslots.inc Mon Feb 28 19:08:07 2011 @@ -1,4 +1,4 @@ -/* Generated by typeslots.py $Revision: 87806 $ */ +/* Generated by typeslots.py $Revision: 88635 $ */ 0, 0, offsetof(PyHeapTypeObject, as_mapping.mp_ass_subscript), From python-checkins at python.org Mon Feb 28 19:22:36 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 19:22:36 +0100 (CET) Subject: [Python-checkins] r88676 - peps/trunk/pep-0385.txt Message-ID: <20110228182236.5F675EE982@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 19:22:36 2011 New Revision: 88676 Log: Update PEP 385 with latest hooks work Modified: peps/trunk/pep-0385.txt Modified: peps/trunk/pep-0385.txt ============================================================================== --- peps/trunk/pep-0385.txt (original) +++ peps/trunk/pep-0385.txt Mon Feb 28 19:22:36 2011 @@ -262,7 +262,22 @@ on every build slave for the branch in which the changeset occurs. The `hooks repository`_ contains ports of these server-side hooks to -Mercurial. One additional hook could be beneficial: +Mercurial, as well as a couple additional ones: + +* check branch heads: a hook to reject pushes which create a new head on + an existing branch. The pusher then has to merge the superfetatory heads + and try pushing again. + +* check branches: a hook to reject all changesets not on an allowed named + branch. This hook's whitelist will have to be updated when we want to + create new maintenance branches. + +* check line endings: a hook, based on the `eol extension`_, to reject all + changesets committing files with the wrong line endings. The commits then + have to be stripped and redone, possibly with the `eol extension`_ enabled + on the comitter's computer. 
+ +One additional hook could be beneficial: * check contributors: in the current setup, all changesets bear the username of committers, who must have signed the contributor @@ -285,9 +300,8 @@ information is kept in a versioned file called ``.hgeol``, and such a file has already been checked into the Subversion repository. -A hook on the server side that turns down any changegroup or changeset -introducing inconsistent newline data can still be implemented, if -deemed necessary. +A hook also exists on the server side to reject any changeset +introducing inconsistent newline data (see above). .. _eol extension: http://mercurial.selenic.com/wiki/EolExtension .. _win32text extension: http://mercurial.selenic.com/wiki/Win32TextExtension From python-checkins at python.org Mon Feb 28 19:26:58 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 19:26:58 +0100 (CET) Subject: [Python-checkins] r88677 - peps/trunk/pep-0385.txt Message-ID: <20110228182658.704E1EE984@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 19:26:58 2011 New Revision: 88677 Log: trunk is now legacy-trunk Modified: peps/trunk/pep-0385.txt Modified: peps/trunk/pep-0385.txt ============================================================================== --- peps/trunk/pep-0385.txt (original) +++ peps/trunk/pep-0385.txt Mon Feb 28 19:26:58 2011 @@ -111,8 +111,8 @@ The ``default`` branch in that repo is what is known as ``py3k`` in Subversion, while the Subversion trunk lives on with the branch name - ``trunk``; however in Mercurial this branch will be closed. Release - branches are named after their major.minor version, e.g. ``3.2``. + ``legacy-trunk``; however in Mercurial this branch will be closed. + Release branches are named after their major.minor version, e.g. ``3.2``. * A repository with the full, unedited conversion of the Subversion repository (actually, its /python subdirectory) -- this is called From python-checkins at python.org Mon Feb 28 19:32:14 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 19:32:14 +0100 (CET) Subject: [Python-checkins] r88678 - peps/trunk/pep-0385.txt Message-ID: <20110228183214.F0E09EE982@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 19:32:14 2011 New Revision: 88678 Log: Fix tags in example, and mention latesttag and latesttagdistance Modified: peps/trunk/pep-0385.txt Modified: peps/trunk/pep-0385.txt ============================================================================== --- peps/trunk/pep-0385.txt (original) +++ peps/trunk/pep-0385.txt Mon Feb 28 19:32:14 2011 @@ -446,18 +446,29 @@ ('tip' doesn't count), and uses the branch name otherwise. sys.subversion becomes -* ('CPython', '2.6.2', 'dd3ebf81af43') +* ('CPython', 'v2.6.2', 'dd3ebf81af43') * ('CPython', 'default', 'af694c6a888c+') and the build info string becomes -* '2.6.2:dd3ebf81af43, Jun 2 2009, 09:58:33' +* 'v2.6.2:dd3ebf81af43, Jun 2 2009, 09:58:33' * 'default:af694c6a888c+, Jun 2 2009, 01:24:14' This reflects that the default branch in hg is called 'default' instead of Subversion's 'trunk', and reflects the proposed new tag format. 
+Mercurial also allows to find out the latest tag and the number of +changesets separating the current changeset from that tag, allowing for +a descriptive version string:: + + $ hg parent --template "{latesttag}+{latesttagdistance}-{node|short}\n" + v3.2+37-4b5d0d260e72 + $ hg up 2.7 + 3316 files updated, 0 files merged, 379 files removed, 0 files unresolved + $ hg parent --template "{latesttag}+{latesttagdistance}-{node|short}\n" + v2.7.1+216-9619d21d8198 + Footnotes ========= From python-checkins at python.org Mon Feb 28 20:19:52 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Mon, 28 Feb 2011 20:19:52 +0100 (CET) Subject: [Python-checkins] r88679 - in python/branches/py3k: Doc/library/ftplib.rst Lib/ftplib.py Lib/test/test_ftplib.py Misc/NEWS Message-ID: <20110228191952.15A3DEE9B2@mail.python.org> Author: giampaolo.rodola Date: Mon Feb 28 20:19:51 2011 New Revision: 88679 Log: Fix issue 8594: adds a source_address parameter to ftplib module. Modified: python/branches/py3k/Doc/library/ftplib.rst python/branches/py3k/Lib/ftplib.py python/branches/py3k/Lib/test/test_ftplib.py python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Doc/library/ftplib.rst ============================================================================== --- python/branches/py3k/Doc/library/ftplib.rst (original) +++ python/branches/py3k/Doc/library/ftplib.rst Mon Feb 28 20:19:51 2011 @@ -40,7 +40,7 @@ The module defines the following items: -.. class:: FTP(host='', user='', passwd='', acct=''[, timeout]) +.. class:: FTP(host='', user='', passwd='', acct='', timeout=None, source_address=None) Return a new instance of the :class:`FTP` class. When *host* is given, the method call ``connect(host)`` is made. When *user* is given, additionally @@ -48,7 +48,8 @@ *acct* default to the empty string when not given). The optional *timeout* parameter specifies a timeout in seconds for blocking operations like the connection attempt (if is not specified, the global default timeout setting - will be used). + will be used). *source_address* is a 2-tuple ``(host, port)`` for the socket + to bind to as its source address before connecting. :class:`FTP` class supports the :keyword:`with` statement. Here is a sample on how using it: @@ -68,8 +69,11 @@ .. versionchanged:: 3.2 Support for the :keyword:`with` statement was added. + .. versionchanged:: 3.3 + *source_address* parameter was added. -.. class:: FTP_TLS(host='', user='', passwd='', acct='', [keyfile[, certfile[, context[, timeout]]]]) + +.. class:: FTP_TLS(host='', user='', passwd='', acct='', keyfile=None, certfile=None, context=None, timeout=None, source_address=None) A :class:`FTP` subclass which adds TLS support to FTP as described in :rfc:`4217`. @@ -80,10 +84,15 @@ private key and certificate chain file name for the SSL connection. *context* parameter is a :class:`ssl.SSLContext` object which allows bundling SSL configuration options, certificates and private keys into a - single (potentially long-lived) structure. + single (potentially long-lived) structure. *source_address* is a 2-tuple + ``(host, port)`` for the socket to bind to as its source address before + connecting. .. versionadded:: 3.2 + .. versionchanged:: 3.3 + *source_address* parameter was added. + Here's a sample session using the :class:`FTP_TLS` class: >>> from ftplib import FTP_TLS @@ -174,7 +183,7 @@ debugging output, logging each line sent and received on the control connection. -.. method:: FTP.connect(host='', port=0[, timeout]) +.. 
method:: FTP.connect(host='', port=0, timeout=None, source_address=None) Connect to the given host and port. The default port number is ``21``, as specified by the FTP protocol specification. It is rarely needed to specify a @@ -182,10 +191,14 @@ instance; it should not be called at all if a host was given when the instance was created. All other methods can only be used after a connection has been made. - The optional *timeout* parameter specifies a timeout in seconds for the connection attempt. If no *timeout* is passed, the global default timeout setting will be used. + *source_address* is a 2-tuple ``(host, port)`` for the socket to bind to as + its source address before connecting. + + .. versionchanged:: 3.3 + *source_address* parameter was added. .. method:: FTP.getwelcome() Modified: python/branches/py3k/Lib/ftplib.py ============================================================================== --- python/branches/py3k/Lib/ftplib.py (original) +++ python/branches/py3k/Lib/ftplib.py Mon Feb 28 20:19:51 2011 @@ -107,7 +107,8 @@ # Optional arguments are host (for connect()), # and user, passwd, acct (for login()) def __init__(self, host='', user='', passwd='', acct='', - timeout=_GLOBAL_DEFAULT_TIMEOUT): + timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): + self.source_address = source_address self.timeout = timeout if host: self.connect(host) @@ -128,10 +129,12 @@ if self.sock is not None: self.close() - def connect(self, host='', port=0, timeout=-999): + def connect(self, host='', port=0, timeout=-999, source_address=None): '''Connect to host. Arguments are: - host: hostname to connect to (string, default previous host) - port: port to connect to (integer, default previous port) + - source_address: a 2-tuple (host, port) for the socket to bind + to as its source address before connecting. 
''' if host != '': self.host = host @@ -139,7 +142,10 @@ self.port = port if timeout != -999: self.timeout = timeout - self.sock = socket.create_connection((self.host, self.port), self.timeout) + if source_address is not None: + self.source_address = source_address + self.sock = socket.create_connection((self.host, self.port), self.timeout, + source_address=self.source_address) self.af = self.sock.family self.file = self.sock.makefile('r', encoding=self.encoding) self.welcome = self.getresp() @@ -334,7 +340,8 @@ size = None if self.passiveserver: host, port = self.makepasv() - conn = socket.create_connection((host, port), self.timeout) + conn = socket.create_connection((host, port), self.timeout, + source_address=self.source_address) if rest is not None: self.sendcmd("REST %s" % rest) resp = self.sendcmd(cmd) @@ -637,7 +644,7 @@ def __init__(self, host='', user='', passwd='', acct='', keyfile=None, certfile=None, context=None, - timeout=_GLOBAL_DEFAULT_TIMEOUT): + timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): if context is not None and keyfile is not None: raise ValueError("context and keyfile arguments are mutually " "exclusive") @@ -648,7 +655,7 @@ self.certfile = certfile self.context = context self._prot_p = False - FTP.__init__(self, host, user, passwd, acct, timeout) + FTP.__init__(self, host, user, passwd, acct, timeout, source_address) def login(self, user='', passwd='', acct='', secure=True): if secure and not isinstance(self.sock, ssl.SSLSocket): Modified: python/branches/py3k/Lib/test/test_ftplib.py ============================================================================== --- python/branches/py3k/Lib/test/test_ftplib.py (original) +++ python/branches/py3k/Lib/test/test_ftplib.py Mon Feb 28 20:19:51 2011 @@ -608,6 +608,20 @@ self.assertEqual(self.server.handler_instance.last_received_cmd, 'quit') self.assertFalse(is_client_connected()) + def test_source_address(self): + self.client.quit() + port = support.find_unused_port() + self.client.connect(self.server.host, self.server.port, + source_address=(HOST, port)) + self.assertEqual(self.client.sock.getsockname()[1], port) + self.client.quit() + + def test_source_address_passive_connection(self): + port = support.find_unused_port() + self.client.source_address = (HOST, port) + sock = self.client.transfercmd('list') + self.assertEqual(sock.getsockname()[1], port) + def test_parse257(self): self.assertEqual(ftplib.parse257('257 "/foo/bar"'), '/foo/bar') self.assertEqual(ftplib.parse257('257 "/foo/bar" created'), '/foo/bar') Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 28 20:19:51 2011 @@ -35,6 +35,9 @@ Library ------- +- Issue 8594: ftplib now provides a source_address parameter to specify which + (address, port) to bind to before connecting. + - Issue #11326: Add the missing connect_ex() implementation for SSL sockets, and make it work for non-blocking connects. From python-checkins at python.org Mon Feb 28 20:27:16 2011 From: python-checkins at python.org (giampaolo.rodola) Date: Mon, 28 Feb 2011 20:27:16 +0100 (CET) Subject: [Python-checkins] r88680 - python/branches/py3k/Lib/test/test_os.py Message-ID: <20110228192716.48DAEEEA33@mail.python.org> Author: giampaolo.rodola Date: Mon Feb 28 20:27:16 2011 New Revision: 88680 Log: Issue 11348: skip os.setpriority() test if current nice level is >= 19. 
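As an illustrative sketch of why the skip is needed (assuming a Unix platform
where nice values top out at +19; the actual fix is in the diff below): once a
process already runs at the maximum niceness, asking for ``base + 1`` is
silently capped, so the previous equality assertion cannot hold::

    import os

    # Illustrative only -- exact clamping behaviour is platform-dependent.
    base = os.getpriority(os.PRIO_PROCESS, os.getpid())
    os.setpriority(os.PRIO_PROCESS, os.getpid(), base + 1)
    new = os.getpriority(os.PRIO_PROCESS, os.getpid())
    # When base is already 19, new stays at 19 rather than becoming base + 1,
    # which is why the test now skips itself instead of asserting equality.
    print(base, new)
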
Modified: python/branches/py3k/Lib/test/test_os.py Modified: python/branches/py3k/Lib/test/test_os.py ============================================================================== --- python/branches/py3k/Lib/test/test_os.py (original) +++ python/branches/py3k/Lib/test/test_os.py Mon Feb 28 20:27:16 2011 @@ -1274,10 +1274,16 @@ """Tests for os.getpriority() and os.setpriority().""" def test_set_get_priority(self): + base = os.getpriority(os.PRIO_PROCESS, os.getpid()) os.setpriority(os.PRIO_PROCESS, os.getpid(), base + 1) try: - self.assertEqual(os.getpriority(os.PRIO_PROCESS, os.getpid()), base + 1) + new_prio = os.getpriority(os.PRIO_PROCESS, os.getpid()) + if base >= 19 and new_prio <= 19: + raise unittest.SkipTest( + "unable to reliably test setpriority at current nice level of %s" % base) + else: + self.assertEqual(new_prio, base + 1) finally: try: os.setpriority(os.PRIO_PROCESS, os.getpid(), base) From tjreedy at udel.edu Mon Feb 28 19:36:11 2011 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 28 Feb 2011 13:36:11 -0500 Subject: [Python-checkins] r88676 - peps/trunk/pep-0385.txt In-Reply-To: <20110228182236.5F675EE982@mail.python.org> References: <20110228182236.5F675EE982@mail.python.org> Message-ID: <4D6BEB1B.1040004@udel.edu> > + an existing branch. The pusher then has to merge the superfetatory heads 'superfetatory'? I have no idea of what this is, neither does merriam-webster.com ;-). From python-checkins at python.org Mon Feb 28 20:52:37 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 20:52:37 +0100 Subject: [Python-checkins] devguide (hg_transition): Add a section explaining how to do long-term development of features Message-ID: antoine.pitrou pushed 766105be37f9 to devguide: http://hg.python.org/devguide/rev/766105be37f9 changeset: 347:766105be37f9 branch: hg_transition user: Antoine Pitrou date: Mon Feb 28 20:47:22 2011 +0100 summary: Add a section explaining how to do long-term development of features in a public repository. files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -217,8 +217,72 @@ .. _transplant extension: http://mercurial.selenic.com/wiki/TransplantExtension - .. seealso:: `Merging work `_, in `Mercurial: The Definitive Guide `_. + + +Long-term development of features +--------------------------------- + +If you want to work on a feature long-term (perhaps you're implementing a +PEP, or even removing the GIL), you will want to publish your work somewhere. +We then recommend that you maintain it in a dedicated repository. + +First create a public (empty) repository on hg.python.org:: + + $ hg init ssh://hg at hg.python.org/features/mywork + +And do a local clone of that repository on your disk:: + + $ hg clone ssh://hg at hg.python.org/features/mywork + $ cd mywork + +There, pull all the contents from the main repository, either from a local +clone:: + + $ hg pull ../cpython + $ hg update + +or directly from the network (which is of course slower):: + + $ hg pull http://hg.python.org/cpython + $ hg update + +It is recommended that you create a new named branch for your work, so as +to easily track changes. 
That named branch will exist in your feature +repository, but not in the main repository:: + + $ hg branch mywork + $ hg commit -m "Creating branch mywork" + +You can now work on your feature, commit changes as you will, and push them +when desired:: + + $ hg push + +When you push them, they will land in the public repository at +``ssh://hg at hg.python.org/features/mywork`` (or +``http://hg.python.org/features/mywork`` for the read-only URL). + +When you want to synchronize your changes, you can pull from the main +repository:: + + $ hg pull ../cpython + +or from the network:: + + $ hg pull http://hg.python.org/cpython + +and merge all new changes from branch ``default`` to branch ``mywork``:: + + $ hg branch + mywork + $ hg merge default + + +.. XXX: since the initial "hg push" can be quite long on asymmetric + connections, we could offer a way for people to make a remote-to-remote + clone (like SVN allows creating branches by remote copying). + hg currently doesn't support that. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 28 20:52:38 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 20:52:38 +0100 Subject: [Python-checkins] devguide (hg_transition): Explain how to obtain a patch Message-ID: antoine.pitrou pushed 2d5e1a5058a3 to devguide: http://hg.python.org/devguide/rev/2d5e1a5058a3 changeset: 348:2d5e1a5058a3 branch: hg_transition tag: tip user: Antoine Pitrou date: Mon Feb 28 20:52:28 2011 +0100 summary: Explain how to obtain a patch files: committing.rst diff --git a/committing.rst b/committing.rst --- a/committing.rst +++ b/committing.rst @@ -286,3 +286,21 @@ connections, we could offer a way for people to make a remote-to-remote clone (like SVN allows creating branches by remote copying). hg currently doesn't support that. + + +Uploading a patch for review +'''''''''''''''''''''''''''' + +In this scheme, your work will probably consist of many commits (some of +them merges). If you want to upload a patch for review somewhere, you need +a single agregate patch. This is where having a dedicated named branch +``mywork`` gets handy. + +First ensure that you have pulled *and merged* all changes from the main +repository, as explained above. Then, assuming your :ref:`currently checked +out branch ` is still ``mywork``, simply do:: + + $ hg diff -r default > mywork.patch + +This will write to ``mywork.patch`` all the changes between ``default`` and +``mywork``. -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 28 21:03:39 2011 From: python-checkins at python.org (georg.brandl) Date: Mon, 28 Feb 2011 21:03:39 +0100 Subject: [Python-checkins] devguide (hg_transition): Add section about graphlog. Message-ID: georg.brandl pushed c7d1d29b0a3b to devguide: http://hg.python.org/devguide/rev/c7d1d29b0a3b changeset: 349:c7d1d29b0a3b branch: hg_transition tag: tip user: Georg Brandl date: Mon Feb 28 21:02:00 2011 +0100 summary: Add section about graphlog. files: faq.rst diff --git a/faq.rst b/faq.rst --- a/faq.rst +++ b/faq.rst @@ -456,6 +456,22 @@ hg log -vp -r +How can I see the changeset graph in my repository? +--------------------------------------------------- + +In Mercurial repositories, changesets don't form a simple list, but rather +a graph: every changeset has one or two parents (it's called a merge changeset +in the latter case), and can have any number of children. + +The graphlog_ extension is very useful for examining the structure of the +changeset graph. 
It is bundled with Mercurial. + +Graphical tools, such as TortoiseHG, will display the changeset graph +by default. + +.. _graphlog: http://mercurial.selenic.com/wiki/GraphlogExtension + + How do I undo the changes made in a recent commit? ------------------------------------------------------------------------------- -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Mon Feb 28 22:05:27 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 22:05:27 +0100 (CET) Subject: [Python-checkins] r88681 - peps/trunk/pep-0385.txt Message-ID: <20110228210527.9F67EEE9AE@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 22:05:27 2011 New Revision: 88681 Log: Plain English Modified: peps/trunk/pep-0385.txt Modified: peps/trunk/pep-0385.txt ============================================================================== --- peps/trunk/pep-0385.txt (original) +++ peps/trunk/pep-0385.txt Mon Feb 28 22:05:27 2011 @@ -265,7 +265,7 @@ Mercurial, as well as a couple additional ones: * check branch heads: a hook to reject pushes which create a new head on - an existing branch. The pusher then has to merge the superfetatory heads + an existing branch. The pusher then has to merge the excess heads and try pushing again. * check branches: a hook to reject all changesets not on an allowed named From python-checkins at python.org Mon Feb 28 23:03:34 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 23:03:34 +0100 (CET) Subject: [Python-checkins] r88682 - in python/branches/py3k: Doc/library/_thread.rst Lib/test/test_threading.py Misc/NEWS Modules/_threadmodule.c Message-ID: <20110228220334.D0C5BEEA4B@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 23:03:34 2011 New Revision: 88682 Log: Issue #11140: Lock.release() now raises a RuntimeError when attempting to release an unacquired lock, as claimed in the threading documentation. The _thread.error exception is now an alias of RuntimeError. Modified: python/branches/py3k/Doc/library/_thread.rst python/branches/py3k/Lib/test/test_threading.py python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/_threadmodule.c Modified: python/branches/py3k/Doc/library/_thread.rst ============================================================================== --- python/branches/py3k/Doc/library/_thread.rst (original) +++ python/branches/py3k/Doc/library/_thread.rst Mon Feb 28 23:03:34 2011 @@ -35,6 +35,9 @@ Raised on thread-specific errors. + .. versionchanged:: 3.3 + This is now a synonym of the built-in :exc:`RuntimeError`. + .. 
data:: LockType Modified: python/branches/py3k/Lib/test/test_threading.py ============================================================================== --- python/branches/py3k/Lib/test/test_threading.py (original) +++ python/branches/py3k/Lib/test/test_threading.py Mon Feb 28 23:03:34 2011 @@ -685,6 +685,10 @@ thread.start() self.assertRaises(RuntimeError, setattr, thread, "daemon", True) + def test_releasing_unacquired_lock(self): + lock = threading.Lock() + self.assertRaises(RuntimeError, lock.release) + class LockTests(lock_tests.LockTests): locktype = staticmethod(threading.Lock) Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 28 23:03:34 2011 @@ -35,6 +35,10 @@ Library ------- +- Issue #11140: Lock.release() now raises a RuntimeError when attempting + to release an unacquired lock, as claimed in the threading documentation. + The _thread.error exception is now an alias of RuntimeError. + - Issue 8594: ftplib now provides a source_address parameter to specify which (address, port) to bind to before connecting. Modified: python/branches/py3k/Modules/_threadmodule.c ============================================================================== --- python/branches/py3k/Modules/_threadmodule.c (original) +++ python/branches/py3k/Modules/_threadmodule.c Mon Feb 28 23:03:34 2011 @@ -1308,7 +1308,9 @@ /* Add a symbolic constant */ d = PyModule_GetDict(m); - ThreadError = PyErr_NewException("_thread.error", NULL, NULL); + ThreadError = PyExc_RuntimeError; + Py_INCREF(ThreadError); + PyDict_SetItemString(d, "error", ThreadError); Locktype.tp_doc = lock_doc; Py_INCREF(&Locktype); From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Ignore everything that is auto-generated. Message-ID: brett.cannon pushed 7ae1f083d4dd to devinabox: http://hg.python.org/devinabox/rev/7ae1f083d4dd changeset: 0:7ae1f083d4dd user: Brett Cannon date: Fri Feb 25 16:36:15 2011 -0800 summary: Ignore everything that is auto-generated. files: .hgignore diff --git a/.hgignore b/.hgignore new file mode 100644 --- /dev/null +++ b/.hgignore @@ -0,0 +1,8 @@ +syntax: glob + +coveragepy +cpython +devguide +Mercurial +peps +Visual C++ Express -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Get basic creation of the Box working. Message-ID: brett.cannon pushed 77a0bf0169cd to devinabox: http://hg.python.org/devinabox/rev/77a0bf0169cd changeset: 1:77a0bf0169cd user: Brett Cannon date: Fri Feb 25 16:36:39 2011 -0800 summary: Get basic creation of the Box working. files: box.py diff --git a/box.py b/box.py new file mode 100644 --- /dev/null +++ b/box.py @@ -0,0 +1,302 @@ +"""Python-Dev In a Box: (almost) everything you need to contribute to (C)Python +in under 700 MB. + +This script will create a directory which will end up containing (or remind you +to download) everything you need to contribute to (C)Python's development (sans +a compiler): + + * Mercurial: source download + Hg is Python's VCS (Version Control System). + + * TortoiseHg: Windows 32/64 + For ease-of-use for Windows users. 
+ + * Visual C++ Express: English Web installer + So Windows users can compile CPython. + OS X users should install XCode (http://developer.apple.com/) and + optionally Homebrew (http://mxcl.github.com/homebrew/) to install + LLVM/clang. + Linux user should install gcc or clang using their distribution's + package managers. + + * Python Developer's Guide + "The devguide"; documentation on how to contribute to Python. + + * Python Enhancement Proposals + Also known as PEPs. This is included as reference material, especially + for PEPs 7 & 8 (the C and Python style guides, respectively). + + * CPython + The included repository clone has branches for all versions of Python + either under development or maintenance. + + * coverage.py: cloned repository + For measuring the coverage of Python's test suite. Includes a + cloned repository instead of the latest release as cutting-edge support + is occasionally needed to support the in-development version of Python. + +Once the requisite code has been checked out, various optional steps can be +performed to make the lives of users easier: + + * Update all cloned repositories (devguide, PEPs, CPython, coverage.py) + + * Build Python's documentation (will also lead to the download of the + required software; requires Python to already be installed on the user's + machine) + + * Build the devguide (requires Python's docs to be built so as to have a + copy of Sphinx available to use) + + * Build the PEPs (requires Python to be installed) + + * Generate a coverage report (requires that CPython be built) + + +This script is assumed to be run by a built version of the in-development +version of CPython. + +""" +# XXX Script to build CPython? On Windows should just open devguide to proper +# page? +# XXX Vim files? Emacs files? +# XXX README or just the docstring for this script? +# XXX Script to run thorough test suite? 
+ +import abc +import contextlib +from distutils.version import LooseVersion as Version +import os +import os.path +import subprocess +import urllib.request +import urllib.parse +import webbrowser +import xmlrpc.client + + + at contextlib.contextmanager +def change_cwd(directory): + cwd = os.getcwd() + os.chdir(directory) + try: + yield + finally: + os.chdir(cwd) + + +class Provider(metaclass=abc.ABCMeta): + + """Base class for items to be provided.""" + + @abc.abstractproperty + def directory(self): + """Directory to put everything.""" + raise NotImplementedError + + def _prompt(self, message): + """Prompt the user to perform an action, waiting for a response.""" + input("{} [press Enter when done]".format(message)) + + @abc.abstractmethod + def create(self): + """Create what is to be provided.""" + raise NotImplementedError + + def update(self): + """Update what is provided.""" + pass + + +class HgProvider(Provider): + + """Provide an Hg repository.""" + + @abc.abstractproperty + def url(self): + """Location of the repository.""" + raise NotImplementedError + + def create(self): + """Clone an Hg repository to 'directory'.""" + subprocess.check_call(['hg', 'clone', self.url, self.directory]) + + def update(self): + """Update the Hg clone in 'directory'.""" + with change_cwd(self.directory): + subprocess.check_call(['hg', 'pull']) + subprocess.check_call(['hg', 'update']) + + +class SvnProvider(Provider): + + """Provide an svn checkout.""" + + @abc.abstractproperty + def url(self): + """Location of the repository.""" + raise NotImplementedError + + def create(self): + """Check out the svn repository to 'directory'.""" + subprocess.check_call(['svn', 'checkout', self.url, self.directory]) + + def update(self): + """Update the svn checkout in 'directory'.""" + subprocess.check_call(['svn', 'update', self.directory]) + + +class Visual_Studio_Express(Provider): + + """Provide the Web installer for Visual C++ Express.""" + + size = (4, None) + directory = 'Visual C++ Express' + + def create(self): + """Bring up a browser window to download the release.""" + try: + os.mkdir(self.directory) + except OSError: + pass + url = 'http://www.microsoft.com/express/Downloads/' + try: + webbrowser.open(url) + except webbrowser.Error: + pass + self._prompt('download Visual C++ Express at {}'.format(url)) + + +class CoveragePy(HgProvider): + + """Cloned repository for coverage.py.""" + + url = 'https://brettsky at bitbucket.org/ned/coveragepy' + directory = 'coveragepy' + size = (5, None) # XXX coverage report for CPython + + # XXX script to run coverage tests? 
+ # XXX 'coverage' runs coverage tests + + +class Mercurial(Provider): + + """Provide Mercurial (source release) and TortoiseHg (for Windows).""" + + directory = 'Mercurial' + size = (47, None) # Includes TortoiseHg for 32/64-bit Windows + + def _download_url(self): + """Discover the URL to download Mercurial from.""" + pypi = xmlrpc.client.ServerProxy('http://pypi.python.org/pypi') + hg_versions = pypi.package_releases('Mercurial') + hg_versions.sort(key=Version) + latest_release = hg_versions[-1] + # Mercurial keeps releases on their servers + release_data = pypi.release_data('Mercurial', latest_release) + try: + return release_data['download_url'] + except KeyError: + # XXX + pass + + def _url_filename(self, url): + """Find the filename from the URL.""" + return os.path.split(urllib.parse.urlparse(url).path)[1] + + def _create_mercurial(self): + """Download the latest source release of Mercurial.""" + file_url = self._download_url() + file_name = self._url_filename(file_url) + with urllib.request.urlopen(file_url) as url_file: + with open(os.path.join(self.directory, file_name), 'wb') as file: + file.write(url_file.read()) + + def _create_tortoisehg(self): + """Open a web page to the TortoiseHg download page.""" + url = 'http://tortoisehg.bitbucket.org/download/' + try: + webbrowser.open(url) + except webbrowser.Error: + pass + self._prompt('Download TortoiseHg from {}'.format(url)) + + def create(self): + """Fetch the latest source distribution for Mercurial.""" + try: + os.mkdir(self.directory) + except OSError: + pass + self._create_mercurial() + self._create_tortoisehg() + + + +class Devguide(HgProvider): + + """Clone the Python developer's guide.""" + + size = (1, 4) + + url = 'http://hg.python.org/devguide' + directory = 'devguide' + + # XXX build (and symlink the index); use Sphinx from docs + + +class PEPs(SvnProvider): + + """Checkout the Python Enhancement Proposals.""" + + url = 'http://svn.python.org/projects/peps/trunk/' + directory = 'peps' + size = (14, 20) + + def build(self): + """Build the PEPs and symlink PEP 0.""" + with change_cwd(self.directory): + subprocess.check_call(['make']) + os.symlink(os.path.join(self.directory, 'pep-0000.html'), 'peps.html') + + +class CPython(HgProvider): + + """Clone CPython (and requisite tools to build the documentation).""" + + url = 'http://hg.python.org/cpython' + directory = 'cpython' + size = (245, # Includes stuff required to build docs/ docs built + 325) # Only docs are built + + def create(self): + """Clone CPython and get the necessary tools to build the + documentation.""" + super().create() + with change_cwd(os.path.join(self.directory, 'Doc')): + subprocess.check_call(['make', 'checkout']) + + # XXX build docs (and symlink) + # XXX script to build CPython? multiprocessing.cpu_count() + # XXX script to run unit tests? 
multiprocessing.cpu_count() + + + +if __name__ == '__main__': + import argparse + parser = arparse.ArgumentParser(prog='Python-Dev In a Box') + subparsers = parser.add_subparsers() # XXX help + parser_create = subparsers.add_parser('create') # XXX help + # XXX --all option + # XXX --basic option (everything that doesn't require a Web browser) + # XXX --miniumum option (cpython, devguide, peps) + # XXX --build option + # XXX --with-coverage option + parser_update = subparsers.add_parser('update') # XXX help + # XXX also run build (--with-coverage) + + + + for provider in (CPython, Devguide, PEPs, Mercurial, CoveragePy, + Visual_Studio_Express,): + print('Creating', provider.__name__.replace('_', ' ')) + provider().create() + print() -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Scrap the idea of Vim and Emacs config files. That's just asking for trouble. Message-ID: brett.cannon pushed d28bc8018e78 to devinabox: http://hg.python.org/devinabox/rev/d28bc8018e78 changeset: 2:d28bc8018e78 user: Brett Cannon date: Fri Feb 25 16:38:41 2011 -0800 summary: Scrap the idea of Vim and Emacs config files. That's just asking for trouble. files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -58,7 +58,6 @@ """ # XXX Script to build CPython? On Windows should just open devguide to proper # page? -# XXX Vim files? Emacs files? # XXX README or just the docstring for this script? # XXX Script to run thorough test suite? -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Ignore more files. Message-ID: brett.cannon pushed c29bac1b876f to devinabox: http://hg.python.org/devinabox/rev/c29bac1b876f changeset: 3:c29bac1b876f user: Brett Cannon date: Fri Feb 25 16:49:22 2011 -0800 summary: Ignore more files. files: .hgignore diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -6,3 +6,7 @@ Mercurial peps Visual C++ Express +.*.swp +*.pyc +*.pyo +__pycache__ -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Re-org some comments. Message-ID: brett.cannon pushed 1d912fdfa240 to devinabox: http://hg.python.org/devinabox/rev/1d912fdfa240 changeset: 4:1d912fdfa240 user: Brett Cannon date: Fri Feb 25 16:49:32 2011 -0800 summary: Re-org some comments. files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -56,8 +56,6 @@ version of CPython. """ -# XXX Script to build CPython? On Windows should just open devguide to proper -# page? # XXX README or just the docstring for this script? # XXX Script to run thorough test suite? @@ -274,7 +272,7 @@ subprocess.check_call(['make', 'checkout']) # XXX build docs (and symlink) - # XXX script to build CPython? multiprocessing.cpu_count() + # XXX script to build CPython? multiprocessing.cpu_count(). Windows? # XXX script to run unit tests? 
multiprocessing.cpu_count() -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Try to support Windows users building the docs. Message-ID: brett.cannon pushed 897f82506708 to devinabox: http://hg.python.org/devinabox/rev/897f82506708 changeset: 6:897f82506708 user: Brett Cannon date: Fri Feb 25 17:12:37 2011 -0800 summary: Try to support Windows users building the docs. files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -286,8 +286,9 @@ subprocess.check_call(['make', 'checkout']) def build(self): + cmd = 'make' if sys.platform != 'win32' else 'make.bat' with change_cwd(os.path.join(self.directory, 'Doc'): - subprocess.check_call(['make', 'html']) + subprocess.check_call([cmd, 'html']) if __name__ == '__main__': -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Have commands run quietly. Message-ID: brett.cannon pushed e9bb550d8cd2 to devinabox: http://hg.python.org/devinabox/rev/e9bb550d8cd2 changeset: 7:e9bb550d8cd2 user: Brett Cannon date: Fri Feb 25 17:32:06 2011 -0800 summary: Have commands run quietly. files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -116,13 +116,13 @@ def create(self): """Clone an Hg repository to 'directory'.""" - subprocess.check_call(['hg', 'clone', self.url, self.directory]) + subprocess.check_call(['hg', '-q', 'clone', self.url, self.directory]) def update(self): """Update the Hg clone in 'directory'.""" with change_cwd(self.directory): - subprocess.check_call(['hg', 'pull']) - subprocess.check_call(['hg', 'update']) + subprocess.check_call(['hg', '-q', 'pull']) + subprocess.check_call(['hg', '-q', 'update']) class SvnProvider(Provider): @@ -136,11 +136,12 @@ def create(self): """Check out the svn repository to 'directory'.""" - subprocess.check_call(['svn', 'checkout', self.url, self.directory]) + subprocess.check_call(['svn', 'checkout', '-q', + self.url, self.directory]) def update(self): """Update the svn checkout in 'directory'.""" - subprocess.check_call(['svn', 'update', self.directory]) + subprocess.check_call(['svn', 'update', '-q', self.directory]) class Visual_Studio_Express(Provider): @@ -300,9 +301,8 @@ # XXX --basic option (everything that doesn't require a Web browser) # XXX --miniumum option (cpython, devguide, peps) # XXX --build option - # XXX --with-coverage option parser_update = subparsers.add_parser('update') # XXX help - # XXX also run build (--with-coverage) + # XXX also run build -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Code up more build rules for documentation. Message-ID: brett.cannon pushed fa02baedf1d6 to devinabox: http://hg.python.org/devinabox/rev/fa02baedf1d6 changeset: 5:fa02baedf1d6 user: Brett Cannon date: Fri Feb 25 17:08:51 2011 -0800 summary: Code up more build rules for documentation. files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -57,7 +57,8 @@ """ # XXX README or just the docstring for this script? -# XXX Script to run thorough test suite? +# XXX script to build CPython? 
multiprocessing.cpu_count(). Windows? +# XXX script to run unit tests? multiprocessing.cpu_count() import abc import contextlib @@ -237,7 +238,20 @@ url = 'http://hg.python.org/devguide' directory = 'devguide' - # XXX build (and symlink the index); use Sphinx from docs + def build(self): + """Build the devguide and symlink its index page.""" + # Grab Sphinx from cpython/Doc/tools/ + tools_directory = os.path.join(CPython.directory, 'Doc', 'tools') + orig_pythonpath = os.environ['PYTHONPATH'] + os.environ['PYTHONPATH'] = os.path.abspath(tools_directory) + try: + with change_cwd(self.directory): + subprocess.check_call(['make', 'html']) + finally: + os.environ['PYTHONPATH'] = orig_pythonpath + index_path = os.path.join(self.directory, '_build', 'html', + 'index.html') + os.symlink(index_path, 'devguide.html') class PEPs(SvnProvider): @@ -271,10 +285,9 @@ with change_cwd(os.path.join(self.directory, 'Doc')): subprocess.check_call(['make', 'checkout']) - # XXX build docs (and symlink) - # XXX script to build CPython? multiprocessing.cpu_count(). Windows? - # XXX script to run unit tests? multiprocessing.cpu_count() - + def build(self): + with change_cwd(os.path.join(self.directory, 'Doc'): + subprocess.check_call(['make', 'html']) if __name__ == '__main__': -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Add a Python script which will run the test suite in the most rigorous way Message-ID: brett.cannon pushed 3e5a61adb41d to devinabox: http://hg.python.org/devinabox/rev/3e5a61adb41d changeset: 8:3e5a61adb41d user: Brett Cannon date: Fri Feb 25 17:35:37 2011 -0800 summary: Add a Python script which will run the test suite in the most rigorous way possible. files: run_tests.py diff --git a/run_tests.py b/run_tests.py new file mode 100644 --- /dev/null +++ b/run_tests.py @@ -0,0 +1,26 @@ +#!/usr/bin/env python +"""Run CPython's test suite in the most rigorous way possible.""" +import multiprocessing +import os +import subprocess +import sys + + +directory = 'cpython' +cmd = os.path.join(directory, 'python') +# UNIX +if not os.path.isfile(cmd): + # OS X + cmd += '.exe' + if not os.path.isfile(cmd): + # 32-bit Windows + cmd = os.path.join(directory, 'PCBuild', 'python_d.exe') + if not os.path.isfile(cmd): + # 64-bit Windows + cmd = os.path.join(directory, 'PCBuild', 'AMD64', 'python_d.exe') + if not os.path.isfile(cmd): + print('CPython is not built') + sys.exit(1) + +subprocess.call([cmd, '-W', 'default', '-bb', '-E', '-m', 'test', '-r', '-w', + '-u', 'all', '-j', str(multiprocessing.cpu_count())]) -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Update build and test running scripts so they are executable. Message-ID: brett.cannon pushed a1e5643bdd1d to devinabox: http://hg.python.org/devinabox/rev/a1e5643bdd1d changeset: 10:a1e5643bdd1d user: Brett Cannon date: Fri Feb 25 17:45:38 2011 -0800 summary: Update build and test running scripts so they are executable. 
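The change that follows is purely a permission change: both helper scripts gain the executable bit (mode 100644 becomes 100755) so they can be invoked directly instead of being passed to ``python``. Done from Python rather than ``chmod +x``, that is roughly::

    import os
    import stat

    for script in ('build_cpython.py', 'run_tests.py'):
        mode = os.stat(script).st_mode
        os.chmod(script, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)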
files: build_cpython.py run_tests.py diff --git a/build_cpython.py b/build_cpython.py old mode 100644 new mode 100755 diff --git a/run_tests.py b/run_tests.py old mode 100644 new mode 100755 -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Get the argument parser written (but not wired up). Message-ID: brett.cannon pushed 36aab8aa3dea to devinabox: http://hg.python.org/devinabox/rev/36aab8aa3dea changeset: 11:36aab8aa3dea user: Brett Cannon date: Fri Feb 25 18:13:18 2011 -0800 summary: Get the argument parser written (but not wired up). files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -5,12 +5,9 @@ to download) everything you need to contribute to (C)Python's development (sans a compiler): - * Mercurial: source download + * Mercurial: source download & TortoiseHG for 32/64-bit Windows Hg is Python's VCS (Version Control System). - * TortoiseHg: Windows 32/64 - For ease-of-use for Windows users. - * Visual C++ Express: English Web installer So Windows users can compile CPython. OS X users should install XCode (http://developer.apple.com/) and @@ -19,22 +16,22 @@ Linux user should install gcc or clang using their distribution's package managers. - * Python Developer's Guide - "The devguide"; documentation on how to contribute to Python. + * coverage.py: cloned repository + For measuring the coverage of Python's test suite. Includes a + cloned repository instead of the latest release as cutting-edge support + is occasionally needed to support the in-development version of Python. * Python Enhancement Proposals Also known as PEPs. This is included as reference material, especially for PEPs 7 & 8 (the C and Python style guides, respectively). + * Python Developer's Guide + "The devguide"; documentation on how to contribute to Python. + * CPython The included repository clone has branches for all versions of Python either under development or maintenance. - * coverage.py: cloned repository - For measuring the coverage of Python's test suite. Includes a - cloned repository instead of the latest release as cutting-edge support - is occasionally needed to support the in-development version of Python. 
- Once the requisite code has been checked out, various optional steps can be performed to make the lives of users easier: @@ -288,26 +285,29 @@ def build(self): cmd = 'make' if sys.platform != 'win32' else 'make.bat' - with change_cwd(os.path.join(self.directory, 'Doc'): + with change_cwd(os.path.join(self.directory, 'Doc')): subprocess.check_call([cmd, 'html']) if __name__ == '__main__': import argparse - parser = arparse.ArgumentParser(prog='Python-Dev In a Box') + + all_providers = (CPython, Devguide, PEPs, CoveragePy, Mercurial, + Visual_Studio_Express) + parser = argparse.ArgumentParser(prog='Python-Dev In a Box') subparsers = parser.add_subparsers() # XXX help - parser_create = subparsers.add_parser('create') # XXX help - # XXX --all option - # XXX --basic option (everything that doesn't require a Web browser) - # XXX --miniumum option (cpython, devguide, peps) + parser_create = subparsers.add_parser('create', + help='Create a %(prog)s') + parser_create.add_argument('--build', action='store_true', default=False) + group = parser_create.add_mutually_exclusive_group() + group.add_argument('--all', dest='providers', action='store_const', + const=all_providers, + help='Provide everything (the default)') + group.add_argument('--basic', dest='providers', action='store_const', + const=(CPython, Devguide, PEPs, CoveragePy), + help='Provide the basics people probably are lacking') + group.add_argument('--minimum', dest='providers', action='store_const', + const=(CPython, Devguide, PEPs), + help='Provide the bare minimum to be productive') # XXX --build option - parser_update = subparsers.add_parser('update') # XXX help - # XXX also run build - - - - for provider in (CPython, Devguide, PEPs, Mercurial, CoveragePy, - Visual_Studio_Express,): - print('Creating', provider.__name__.replace('_', ' ')) - provider().create() - print() + # XXX parser_update = subparsers.add_parser('update', help='Update the %(prog)s') # XXX also run build -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:50 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:50 +0100 Subject: [Python-checkins] devinabox: Add a script which will build CPython. Message-ID: brett.cannon pushed 5bbb3ca59d71 to devinabox: http://hg.python.org/devinabox/rev/5bbb3ca59d71 changeset: 9:5bbb3ca59d71 user: Brett Cannon date: Fri Feb 25 17:44:56 2011 -0800 summary: Add a script which will build CPython. files: build_cpython.py diff --git a/build_cpython.py b/build_cpython.py new file mode 100644 --- /dev/null +++ b/build_cpython.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python +"""Build CPython""" +import multiprocessing +import os +import subprocess +import sys + +if sys.platform == 'win32': + print("See the devguide's Getting Set Up guide for building under Windows") + +directory = 'cpython' +cwd = os.getcwd() +os.chdir(directory) +try: + if os.path.isfile('Makefile'): + print('Makefile already exists; skipping ./configure') + else: + subprocess.check_call(['./configure', '--prefix=/dev/null', + '--with-pydebug']) + make_cmd = ['make', '-s', '-j', str(multiprocessing.cpu_count())] + subprocess.call(make_cmd) +finally: + os.chdir(cwd) -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Clean up the docstring. 
Message-ID: brett.cannon pushed cfcf7b5e2e43 to devinabox: http://hg.python.org/devinabox/rev/cfcf7b5e2e43 changeset: 12:cfcf7b5e2e43 user: Brett Cannon date: Fri Feb 25 18:25:43 2011 -0800 summary: Clean up the docstring. files: box.py diff --git a/box.py b/box.py old mode 100644 new mode 100755 --- a/box.py +++ b/box.py @@ -1,56 +1,17 @@ +#!/usr/bin/env python3 """Python-Dev In a Box: (almost) everything you need to contribute to (C)Python in under 700 MB. This script will create a directory which will end up containing (or remind you to download) everything you need to contribute to (C)Python's development (sans -a compiler): +a C compiler) through the ``create`` command. You can also "build" what is +provided in the Box to save users some hassle (e.g., build the documentation). - * Mercurial: source download & TortoiseHG for 32/64-bit Windows - Hg is Python's VCS (Version Control System). +Once a Box has been created, users of it can update what it contains (e.g., +update the repository of CPython). - * Visual C++ Express: English Web installer - So Windows users can compile CPython. - OS X users should install XCode (http://developer.apple.com/) and - optionally Homebrew (http://mxcl.github.com/homebrew/) to install - LLVM/clang. - Linux user should install gcc or clang using their distribution's - package managers. - - * coverage.py: cloned repository - For measuring the coverage of Python's test suite. Includes a - cloned repository instead of the latest release as cutting-edge support - is occasionally needed to support the in-development version of Python. - - * Python Enhancement Proposals - Also known as PEPs. This is included as reference material, especially - for PEPs 7 & 8 (the C and Python style guides, respectively). - - * Python Developer's Guide - "The devguide"; documentation on how to contribute to Python. - - * CPython - The included repository clone has branches for all versions of Python - either under development or maintenance. - -Once the requisite code has been checked out, various optional steps can be -performed to make the lives of users easier: - - * Update all cloned repositories (devguide, PEPs, CPython, coverage.py) - - * Build Python's documentation (will also lead to the download of the - required software; requires Python to already be installed on the user's - machine) - - * Build the devguide (requires Python's docs to be built so as to have a - copy of Sphinx available to use) - - * Build the PEPs (requires Python to be installed) - - * Generate a coverage report (requires that CPython be built) - - -This script is assumed to be run by a built version of the in-development -version of CPython. +There are also some scripts provide along side this one to help get people +started. See the README for more information. """ # XXX README or just the docstring for this script? 
@@ -143,7 +104,7 @@ class Visual_Studio_Express(Provider): - """Provide the Web installer for Visual C++ Express.""" + """The Web installer for Visual C++ Express.""" size = (4, None) directory = 'Visual C++ Express' @@ -164,7 +125,8 @@ class CoveragePy(HgProvider): - """Cloned repository for coverage.py.""" + """Cloned repository for coverage.py so you can generate coverage report + for the stdlib.""" url = 'https://brettsky at bitbucket.org/ned/coveragepy' directory = 'coveragepy' @@ -176,7 +138,8 @@ class Mercurial(Provider): - """Provide Mercurial (source release) and TortoiseHg (for Windows).""" + """Provide Mercurial (source release) and TortoiseHg (for Windows) so you + can update CPython's repository.""" directory = 'Mercurial' size = (47, None) # Includes TortoiseHg for 32/64-bit Windows @@ -229,7 +192,7 @@ class Devguide(HgProvider): - """Clone the Python developer's guide.""" + """Clone of the Python developer's guide so you know what to do.""" size = (1, 4) @@ -254,7 +217,8 @@ class PEPs(SvnProvider): - """Checkout the Python Enhancement Proposals.""" + """Checkout the Python Enhancement Proposals so you have the style guides + (PEPs 7 & 8) along with all other PEPs for reference.""" url = 'http://svn.python.org/projects/peps/trunk/' directory = 'peps' @@ -269,7 +233,7 @@ class CPython(HgProvider): - """Clone CPython (and requisite tools to build the documentation).""" + """Clone of CPython (and requisite tools to build the documentation).""" url = 'http://hg.python.org/cpython' directory = 'cpython' @@ -290,7 +254,7 @@ if __name__ == '__main__': - import argparse + import argparse # XXX snag from CPython repo if missing all_providers = (CPython, Devguide, PEPs, CoveragePy, Mercurial, Visual_Studio_Express) -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Modularize build_cpython.py. Message-ID: brett.cannon pushed 2396229f0a74 to devinabox: http://hg.python.org/devinabox/rev/2396229f0a74 changeset: 13:2396229f0a74 user: Brett Cannon date: Sat Feb 26 13:04:19 2011 -0800 summary: Modularize build_cpython.py. 
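Wrapping the build steps in a ``main()`` function lets other helper scripts reuse them through a plain import instead of spawning a second process; the ``run_coverage.py`` script added later in this push does exactly that. A minimal sketch of that kind of reuse::

    import build_cpython

    # ./configure is skipped automatically when a Makefile already exists,
    # so calling this repeatedly only re-runs make.
    build_cpython.main()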
files: build_cpython.py diff --git a/build_cpython.py b/build_cpython.py --- a/build_cpython.py +++ b/build_cpython.py @@ -5,19 +5,24 @@ import subprocess import sys -if sys.platform == 'win32': - print("See the devguide's Getting Set Up guide for building under Windows") +def main(): + if sys.platform == 'win32': + print("See the devguide's Getting Set Up guide for building under " + "Windows") -directory = 'cpython' -cwd = os.getcwd() -os.chdir(directory) -try: - if os.path.isfile('Makefile'): - print('Makefile already exists; skipping ./configure') - else: - subprocess.check_call(['./configure', '--prefix=/dev/null', - '--with-pydebug']) - make_cmd = ['make', '-s', '-j', str(multiprocessing.cpu_count())] - subprocess.call(make_cmd) -finally: - os.chdir(cwd) + directory = 'cpython' + cwd = os.getcwd() + os.chdir(directory) + try: + if os.path.isfile('Makefile'): + print('Makefile already exists; skipping ./configure') + else: + subprocess.check_call(['./configure', '--prefix=/dev/null', + '--with-pydebug']) + make_cmd = ['make', '-s', '-j', str(multiprocessing.cpu_count())] + subprocess.call(make_cmd) + finally: + os.chdir(cwd) + +if __name__ == '__main__': + main() -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Actually return the path to the executable. Message-ID: brett.cannon pushed 72f72ce682f7 to devinabox: http://hg.python.org/devinabox/rev/72f72ce682f7 changeset: 15:72f72ce682f7 user: Brett Cannon date: Sat Feb 26 13:22:50 2011 -0800 summary: Actually return the path to the executable. files: run_tests.py diff --git a/run_tests.py b/run_tests.py --- a/run_tests.py +++ b/run_tests.py @@ -21,6 +21,7 @@ cmd = os.path.join(directory, 'PCBuild', 'AMD64', 'python_d.exe') if not os.path.isfile(cmd): return None + return os.path.abspath(cmd) if __name__ == '__main__': cmd = executable() -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Refactor run_tests.py. Message-ID: brett.cannon pushed 1c2795eb8298 to devinabox: http://hg.python.org/devinabox/rev/1c2795eb8298 changeset: 14:1c2795eb8298 user: Brett Cannon date: Sat Feb 26 13:04:41 2011 -0800 summary: Refactor run_tests.py. 
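Factoring the interpreter lookup into ``executable()`` (which, per the companion changeset, returns an absolute path) means other tools can find the freshly built python without repeating the platform-specific checks. Roughly::

    import subprocess

    import run_tests

    python = run_tests.executable()   # None if CPython has not been built
    if python is None:
        raise SystemExit('CPython is not built')
    subprocess.check_call([python, '-c', 'import sys; print(sys.version)'])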
files: run_tests.py diff --git a/run_tests.py b/run_tests.py --- a/run_tests.py +++ b/run_tests.py @@ -6,21 +6,27 @@ import sys -directory = 'cpython' -cmd = os.path.join(directory, 'python') -# UNIX -if not os.path.isfile(cmd): - # OS X - cmd += '.exe' +def executable(): + directory = 'cpython' + cmd = os.path.join(directory, 'python') + # UNIX if not os.path.isfile(cmd): - # 32-bit Windows - cmd = os.path.join(directory, 'PCBuild', 'python_d.exe') + # OS X + cmd += '.exe' if not os.path.isfile(cmd): - # 64-bit Windows - cmd = os.path.join(directory, 'PCBuild', 'AMD64', 'python_d.exe') + # 32-bit Windows + cmd = os.path.join(directory, 'PCBuild', 'python_d.exe') if not os.path.isfile(cmd): - print('CPython is not built') - sys.exit(1) + # 64-bit Windows + cmd = os.path.join(directory, 'PCBuild', 'AMD64', 'python_d.exe') + if not os.path.isfile(cmd): + return None -subprocess.call([cmd, '-W', 'default', '-bb', '-E', '-m', 'test', '-r', '-w', - '-u', 'all', '-j', str(multiprocessing.cpu_count())]) +if __name__ == '__main__': + cmd = executable() + if cmd is None: + print('CPython is not built') + sys.exit(1) + subprocess.call([cmd, '-W', 'default', '-bb', '-E', '-m', 'test', '-r', + '-w', '-u', 'all', '-j', + str(multiprocessing.cpu_count())]) -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Ignore .coverage. Message-ID: brett.cannon pushed 50f2f565e39c to devinabox: http://hg.python.org/devinabox/rev/50f2f565e39c changeset: 17:50f2f565e39c user: Brett Cannon date: Sat Feb 26 14:36:32 2011 -0800 summary: Ignore .coverage. files: .hgignore diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -8,6 +8,7 @@ peps Visual C++ Express # Generated +.coverage coverage_report build coverage.html -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Skip auto-generated stuff. Message-ID: brett.cannon pushed 5975eb8ce9e8 to devinabox: http://hg.python.org/devinabox/rev/5975eb8ce9e8 changeset: 16:5975eb8ce9e8 user: Brett Cannon date: Sat Feb 26 14:36:02 2011 -0800 summary: Skip auto-generated stuff. files: .hgignore diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -1,11 +1,20 @@ syntax: glob +# Repositories/files coveragepy cpython devguide Mercurial peps Visual C++ Express +# Generated +coverage_report +build +coverage.html +devguide.html +peps.html +python_docs.html +# Misc .*.swp *.pyc *.pyo -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Script to easily execute coverage.py (requires __main__.py in coveragepy). Message-ID: brett.cannon pushed 5c66d683172c to devinabox: http://hg.python.org/devinabox/rev/5c66d683172c changeset: 19:5c66d683172c user: Brett Cannon date: Sat Feb 26 14:37:11 2011 -0800 summary: Script to easily execute coverage.py (requires __main__.py in coveragepy). 
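The ``__main__.py`` note refers to how the script below invokes coverage: ``python coveragepy run ...`` executes ``coveragepy/__main__.py`` with that directory prepended to ``sys.path``. If the clone does not already ship such a file, a small shim has to be dropped in by hand; a minimal sketch of one, assuming coverage's usual command-line entry point ``coverage.cmdline.main``::

    # coveragepy/__main__.py -- hypothetical shim so "python coveragepy ..." works
    import sys

    from coverage.cmdline import main

    sys.exit(main())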
files: run_coverage.py diff --git a/run_coverage.py b/run_coverage.py new file mode 100755 --- /dev/null +++ b/run_coverage.py @@ -0,0 +1,24 @@ +#!/usr/bin/env python +"""Create a coverage report for CPython""" +import subprocess +import os +import sys +import build_cpython +import run_tests + +build_cpython.main() +print(os.getcwd()) +executable = run_tests.executable() +if not executable: + print('no CPython executable found') + sys.exit(1) + +print('Running coverage ...') +subprocess.check_call([executable, 'coveragepy', 'run', '--pylib', + os.path.join('cpython', 'Lib', 'test', 'regrtest.py'), + ]) +print('Generating report ...') +subprocess.call([executable, 'coveragepy', 'html', '-i', '--omit', + '"*/test/*,*/tests/*"', '-d', 'coverage_report']) +print('Creating symlink ...') +os.symlink(os.path.join('coverage_report', 'index.html'), 'coverage.html') -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Ignore the auto-generated README. Message-ID: brett.cannon pushed 761007ae2a64 to devinabox: http://hg.python.org/devinabox/rev/761007ae2a64 changeset: 21:761007ae2a64 user: Brett Cannon date: Sat Feb 26 16:39:17 2011 -0800 summary: Ignore the auto-generated README. files: .hgignore diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -15,6 +15,7 @@ devguide.html peps.html python_docs.html +README # Misc .*.swp *.pyc -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Simply the UX. Message-ID: brett.cannon pushed fdf18450ac82 to devinabox: http://hg.python.org/devinabox/rev/fdf18450ac82 changeset: 20:fdf18450ac82 user: Brett Cannon date: Sat Feb 26 16:32:46 2011 -0800 summary: Simply the UX. Assume that box.py is only used by creators of a Box. Allow the user to personally specify through prompting what they do and do not want to add to the Box. Also always build what is provided. files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -2,24 +2,24 @@ """Python-Dev In a Box: (almost) everything you need to contribute to (C)Python in under 700 MB. -This script will create a directory which will end up containing (or remind you -to download) everything you need to contribute to (C)Python's development (sans -a C compiler) through the ``create`` command. You can also "build" what is -provided in the Box to save users some hassle (e.g., build the documentation). +This script will clone, checkout, download, or ask you to download everything +you need to contribute to (C)Python's development short of a C compiler. It +will also "build" everything so that users do not need to do **everything** +from scratch. This also allows for easy offline use. -Once a Box has been created, users of it can update what it contains (e.g., -update the repository of CPython). - -There are also some scripts provide along side this one to help get people -started. See the README for more information. +There are also some scripts provided along side this one to help in executing +common tasks. 
""" import abc import contextlib +import datetime from distutils.version import LooseVersion as Version +import operator import os import os.path import subprocess +import sys import urllib.request import urllib.parse import webbrowser @@ -54,6 +54,9 @@ """Create what is to be provided.""" raise NotImplementedError + def build(self): + """Optional step to "build" something.""" + def update(self): """Update what is provided.""" pass @@ -98,11 +101,11 @@ subprocess.check_call(['svn', 'update', '-q', self.directory]) -class Visual_Studio_Express(Provider): +class VisualCPPExpress(Provider): - """The Web installer for Visual C++ Express.""" + """The Web installer for Visual C++ Express""" - size = (4, None) + size = 4 directory = 'Visual C++ Express' def create(self): @@ -116,28 +119,29 @@ webbrowser.open(url) except webbrowser.Error: pass - self._prompt('download Visual C++ Express at {}'.format(url)) + self._prompt('download Visual C++ Express at {} and put in {}'.format( + url, self.directory)) +VisualCPPExpress.__name__ = 'Visual C++ Express' class CoveragePy(HgProvider): - """Cloned repository for coverage.py so you can generate coverage report - for the stdlib.""" + """Cloned repository of coverage.py (WARNING: building takes a while)""" url = 'https://brettsky at bitbucket.org/ned/coveragepy' directory = 'coveragepy' - size = (5, None) # XXX coverage report for CPython + size = 0 # XXX coverage report for CPython # XXX build runs coverage tests +CoveragePy.__name__ = 'coverage.py' class Mercurial(Provider): - """Provide Mercurial (source release) and TortoiseHg (for Windows) so you - can update CPython's repository.""" + """Source release of Mercurial along with TortoiseHg""" directory = 'Mercurial' - size = (47, None) # Includes TortoiseHg for 32/64-bit Windows + size = 47 # Includes TortoiseHg for 32/64-bit Windows def _download_url(self): """Discover the URL to download Mercurial from.""" @@ -172,7 +176,8 @@ webbrowser.open(url) except webbrowser.Error: pass - self._prompt('Download TortoiseHg from {}'.format(url)) + self._prompt('Download TortoiseHg from {} and put in {}'.format(url, + self.directory)) def create(self): """Fetch the latest source distribution for Mercurial.""" @@ -184,12 +189,26 @@ self._create_tortoisehg() +class PEPs(SvnProvider): + + """Checkout of the Python Enhancement Proposals (for PEPs 7 & 8)""" + + url = 'http://svn.python.org/projects/peps/trunk/' + directory = 'peps' + size = 20 + + def build(self): + """Build the PEPs and symlink PEP 0.""" + with change_cwd(self.directory): + subprocess.check_call(['make']) + os.symlink(os.path.join(self.directory, 'pep-0000.html'), 'peps.html') + class Devguide(HgProvider): - """Clone of the Python developer's guide so you know what to do.""" + """Clone of the Python Developer's Guide""" - size = (1, 4) + size = 4 url = 'http://hg.python.org/devguide' directory = 'devguide' @@ -210,30 +229,13 @@ os.symlink(index_path, 'devguide.html') -class PEPs(SvnProvider): - - """Checkout the Python Enhancement Proposals so you have the style guides - (PEPs 7 & 8) along with all other PEPs for reference.""" - - url = 'http://svn.python.org/projects/peps/trunk/' - directory = 'peps' - size = (14, 20) - - def build(self): - """Build the PEPs and symlink PEP 0.""" - with change_cwd(self.directory): - subprocess.check_call(['make']) - os.symlink(os.path.join(self.directory, 'pep-0000.html'), 'peps.html') - - class CPython(HgProvider): - """Clone of CPython (and requisite tools to build the documentation).""" + """Clone of CPython (and 
requisite tools to build the documentation)""" url = 'http://hg.python.org/cpython' directory = 'cpython' - size = (245, # Includes stuff required to build docs/ docs built - 325) # Only docs are built + size = 325 # Only docs are built def create(self): """Clone CPython and get the necessary tools to build the @@ -243,31 +245,49 @@ subprocess.check_call(['make', 'checkout']) def build(self): + """Build CPython's documentation. + + CPython itself is not built as one will most likely not want to + distribute that on a CD. The build_cpython.py script can be used by the + Box user for building CPython. + + """ cmd = 'make' if sys.platform != 'win32' else 'make.bat' with change_cwd(os.path.join(self.directory, 'Doc')): subprocess.check_call([cmd, 'html']) - # XXX symlink to python_docs.html + os.symlink(os.path.join(self.directory, 'Doc', 'html', 'index.html'), + 'python_docs.html') if __name__ == '__main__': - import argparse # XXX snag from CPython repo if missing - all_providers = (CPython, Devguide, PEPs, CoveragePy, Mercurial, - Visual_Studio_Express) - parser = argparse.ArgumentParser(prog='Python-Dev In a Box') - subparsers = parser.add_subparsers() # XXX help - parser_create = subparsers.add_parser('create', - help='Create a %(prog)s') - parser_create.add_argument('--build', action='store_true', default=False) - group = parser_create.add_mutually_exclusive_group() - group.add_argument('--all', dest='providers', action='store_const', - const=all_providers, - help='Provide everything (the default)') - group.add_argument('--basic', dest='providers', action='store_const', - const=(CPython, Devguide, PEPs, CoveragePy), - help='Provide the basics people probably are lacking') - group.add_argument('--minimum', dest='providers', action='store_const', - const=(CPython, Devguide, PEPs), - help='Provide the bare minimum to be productive') - # XXX --build option - # XXX parser_update = subparsers.add_parser('update', help='Update the %(prog)s') # XXX also run build + VisualCPPExpress) + print(__doc__) + print('Please choose what to provide [y/n]:\n') + desired_providers = [] + for provider in all_providers: + docstring = provider.__doc__#.replace('\n', ' ') + msg = '{} ({} MB built)? '.format(docstring, provider.size) + response = input(msg) + if response in ('Y', 'y'): + desired_providers.append(provider) + print() + getting = ', '.join(map(operator.attrgetter('__name__'), + desired_providers)) + print('Getting {}'.format(getting)) + total_size = sum(map(operator.attrgetter('size'), desired_providers)) + msg = 'The requested Box will be about {} MB. OK [y/n]? '.format(total_size) + response = input(msg) + if response not in ('Y', 'y'): + sys.exit(0) + else: + for provider in desired_providers: + ins = provider() + print('Fetching {} ...'.format(provider.__name__)) + ins.create() + print('Building {} ...'.format(provider.__name__)) + ins.build() + with open('README', 'w') as file: + header = 'Python-Dev In a Box: created on {}\n' + file.write(header.format(datetime.date.today())) + sys.exit(0) -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Update some XXX comments. Message-ID: brett.cannon pushed f22b4803ab4a to devinabox: http://hg.python.org/devinabox/rev/f22b4803ab4a changeset: 18:f22b4803ab4a user: Brett Cannon date: Sat Feb 26 14:36:41 2011 -0800 summary: Update some XXX comments. 
files: box.py diff --git a/box.py b/box.py --- a/box.py +++ b/box.py @@ -14,10 +14,6 @@ started. See the README for more information. """ -# XXX README or just the docstring for this script? -# XXX script to build CPython? multiprocessing.cpu_count(). Windows? -# XXX script to run unit tests? multiprocessing.cpu_count() - import abc import contextlib from distutils.version import LooseVersion as Version @@ -132,8 +128,7 @@ directory = 'coveragepy' size = (5, None) # XXX coverage report for CPython - # XXX script to run coverage tests? - # XXX 'coverage' runs coverage tests + # XXX build runs coverage tests class Mercurial(Provider): @@ -251,6 +246,7 @@ cmd = 'make' if sys.platform != 'win32' else 'make.bat' with change_cwd(os.path.join(self.directory, 'Doc')): subprocess.check_call([cmd, 'html']) + # XXX symlink to python_docs.html if __name__ == '__main__': -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: No longer symlinking most pages since they don't transition well. Message-ID: brett.cannon pushed 254b26b740ec to devinabox: http://hg.python.org/devinabox/rev/254b26b740ec changeset: 22:254b26b740ec user: Brett Cannon date: Sat Feb 26 17:14:36 2011 -0800 summary: No longer symlinking most pages since they don't transition well. files: .hgignore diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -12,9 +12,6 @@ coverage_report build coverage.html -devguide.html -peps.html -python_docs.html README # Misc .*.swp -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Rename some files. Message-ID: brett.cannon pushed c7bafcc52be7 to devinabox: http://hg.python.org/devinabox/rev/c7bafcc52be7 changeset: 24:c7bafcc52be7 user: Brett Cannon date: Sat Feb 26 18:00:46 2011 -0800 summary: Rename some files. files: box.py make_a_box.py run_all_tests.py run_tests.py diff --git a/box.py b/make_a_box.py rename from box.py rename to make_a_box.py --- a/box.py +++ b/make_a_box.py @@ -1,6 +1,6 @@ #!/usr/bin/env python3 -"""Python-Dev In a Box: (almost) everything you need to contribute to (C)Python -in under 700 MB. +"""Python-Dev In a Box: everything you need to contribute to Python in under +700 MB. This script will clone, checkout, download, or ask you to download everything you need to contribute to (C)Python's development short of a C compiler. 
It @@ -24,6 +24,15 @@ import urllib.parse import webbrowser import xmlrpc.client +import run_coverage + + +def rename(new_name): + """Decorator to rename an object that defines __name__.""" + def do_rename(ob): + ob.__name__ = new_name + return ob + return do_rename @contextlib.contextmanager @@ -73,13 +82,7 @@ def create(self): """Clone an Hg repository to 'directory'.""" - subprocess.check_call(['hg', '-q', 'clone', self.url, self.directory]) - - def update(self): - """Update the Hg clone in 'directory'.""" - with change_cwd(self.directory): - subprocess.check_call(['hg', '-q', 'pull']) - subprocess.check_call(['hg', '-q', 'update']) + subprocess.check_call(['hg', 'clone', self.url, self.directory]) class SvnProvider(Provider): @@ -93,14 +96,10 @@ def create(self): """Check out the svn repository to 'directory'.""" - subprocess.check_call(['svn', 'checkout', '-q', - self.url, self.directory]) + subprocess.check_call(['svn', 'checkout', self.url, self.directory]) - def update(self): - """Update the svn checkout in 'directory'.""" - subprocess.check_call(['svn', 'update', '-q', self.directory]) - + at rename('Visual C++ Express') class VisualCPPExpress(Provider): """The Web installer for Visual C++ Express""" @@ -121,9 +120,9 @@ pass self._prompt('download Visual C++ Express at {} and put in {}'.format( url, self.directory)) -VisualCPPExpress.__name__ = 'Visual C++ Express' + at rename('coverage.py') class CoveragePy(HgProvider): """Cloned repository of coverage.py (WARNING: building takes a while)""" @@ -132,8 +131,13 @@ directory = 'coveragepy' size = 0 # XXX coverage report for CPython - # XXX build runs coverage tests -CoveragePy.__name__ = 'coverage.py' + def build(self): + """Run coverage over CPython.""" + # XXX build python + # XXX run coverage + # XXX ``make distclean`` + # XXX generate html + run_coverage.main() class Mercurial(Provider): @@ -154,8 +158,9 @@ try: return release_data['download_url'] except KeyError: - # XXX - pass + print('Mercurial has changed how it releases software on PyPI; ' + 'please report this to bugs.python.org') + sys.exit(1) def _url_filename(self, url): """Find the filename from the URL.""" @@ -198,10 +203,9 @@ size = 20 def build(self): - """Build the PEPs and symlink PEP 0.""" + """Build the PEPs.""" with change_cwd(self.directory): subprocess.check_call(['make']) - os.symlink(os.path.join(self.directory, 'pep-0000.html'), 'peps.html') class Devguide(HgProvider): @@ -214,19 +218,19 @@ directory = 'devguide' def build(self): - """Build the devguide and symlink its index page.""" + """Build the devguide using Sphinx from CPython's docs.""" # Grab Sphinx from cpython/Doc/tools/ tools_directory = os.path.join(CPython.directory, 'Doc', 'tools') - orig_pythonpath = os.environ['PYTHONPATH'] + orig_pythonpath = os.environ.get('PYTHONPATH') os.environ['PYTHONPATH'] = os.path.abspath(tools_directory) try: with change_cwd(self.directory): subprocess.check_call(['make', 'html']) finally: - os.environ['PYTHONPATH'] = orig_pythonpath + if orig_pythonpath: + os.environ['PYTHONPATH'] = orig_pythonpath index_path = os.path.join(self.directory, '_build', 'html', 'index.html') - os.symlink(index_path, 'devguide.html') class CPython(HgProvider): @@ -235,13 +239,14 @@ url = 'http://hg.python.org/cpython' directory = 'cpython' - size = 325 # Only docs are built + size = 330 # Only docs are built def create(self): """Clone CPython and get the necessary tools to build the documentation.""" super().create() with change_cwd(os.path.join(self.directory, 'Doc')): + # XXX 
Windows? subprocess.check_call(['make', 'checkout']) def build(self): @@ -255,8 +260,6 @@ cmd = 'make' if sys.platform != 'win32' else 'make.bat' with change_cwd(os.path.join(self.directory, 'Doc')): subprocess.check_call([cmd, 'html']) - os.symlink(os.path.join(self.directory, 'Doc', 'html', 'index.html'), - 'python_docs.html') if __name__ == '__main__': @@ -276,7 +279,7 @@ desired_providers)) print('Getting {}'.format(getting)) total_size = sum(map(operator.attrgetter('size'), desired_providers)) - msg = 'The requested Box will be about {} MB. OK [y/n]? '.format(total_size) + msg = 'The requested Box will be about {} MB. OK? [y/n] '.format(total_size) response = input(msg) if response not in ('Y', 'y'): sys.exit(0) diff --git a/run_tests.py b/run_all_tests.py rename from run_tests.py rename to run_all_tests.py -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Drop the coverage running script; it's so slow only people making a Box should Message-ID: brett.cannon pushed 4314d3836974 to devinabox: http://hg.python.org/devinabox/rev/4314d3836974 changeset: 25:4314d3836974 user: Brett Cannon date: Sat Feb 26 18:09:20 2011 -0800 summary: Drop the coverage running script; it's so slow only people making a Box should run it. files: run_coverage.py diff --git a/run_coverage.py b/run_coverage.py deleted file mode 100755 --- a/run_coverage.py +++ /dev/null @@ -1,29 +0,0 @@ -#!/usr/bin/env python -"""Create a coverage report for CPython""" -import subprocess -import os -import sys -import build_cpython -import run_tests - - -def main(): - build_cpython.main() - executable = run_tests.executable() - if not executable: - print('no CPython executable found') - sys.exit(1) - - print('Running coverage ...') - subprocess.check_call([executable, 'coveragepy', 'run', '--pylib', - os.path.join('cpython', 'Lib', 'test', 'regrtest.py'), - ]) - print('Generating report ...') - subprocess.call([executable, 'coveragepy', 'html', '-i', '--omit', - '"*/test/*,*/tests/*"', '-d', 'coverage_report']) - print('Creating symlink ...') - os.symlink(os.path.join('coverage_report', 'index.html'), 'coverage.html') - - -if __name__ == '__main__': - main() -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Move run_coverage.py functionality into make_a_box.py. Message-ID: brett.cannon pushed bd04d72f7257 to devinabox: http://hg.python.org/devinabox/rev/bd04d72f7257 changeset: 26:bd04d72f7257 user: Brett Cannon date: Sat Feb 26 18:12:27 2011 -0800 summary: Move run_coverage.py functionality into make_a_box.py. 
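The build() method in the diff that follows folds the old run_coverage.py steps into the CoveragePy provider: build CPython, run the stdlib test suite under coverage.py, then render an HTML report. As a rough standalone equivalent (the paths and the presence of a __main__.py in the coveragepy checkout are assumptions about the Box layout):

    import os
    import subprocess

    python = os.path.join('cpython', 'python')  # the freshly built interpreter
    regrtest = os.path.join('cpython', 'Lib', 'test', 'regrtest.py')

    # Trace the whole test run, including the standard library itself.
    subprocess.check_call([python, 'coveragepy', 'run', '--pylib', regrtest])

    # Write an HTML report, ignoring measurement errors and omitting the
    # test packages themselves.
    subprocess.check_call([python, 'coveragepy', 'html', '-i',
                           '--omit', '*/test/*,*/tests/*',
                           '-d', 'coverage_report'])
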
files: make_a_box.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -24,7 +24,8 @@ import urllib.parse import webbrowser import xmlrpc.client -import run_coverage +import build_cpython +import run_all_tests def rename(new_name): @@ -133,11 +134,25 @@ def build(self): """Run coverage over CPython.""" - # XXX build python - # XXX run coverage - # XXX ``make distclean`` - # XXX generate html - run_coverage.main() + # Build Python + build_cpython.main() + # Run coverage + executable = run_all_tests.executable() + if not executable: + print('No CPython executable found') + sys.exit(1) + print('Running coverage ...') + regrest_path = os.path.join(CPython.directory, 'Lib', 'test', + 'regrtest.py') + subprocess.check_call([executable, self.directory, 'run', '--pylib', + regrtest_path]) + # ``make distclean`` as you don't want to distribute your own build + with change_cwd(CPython.directory): + subprocess.check_call(['make', 'distclean']) + # Generate the HTML report + print('Generating report ...') + subprocess.call([executable, 'coveragepy', 'html', '-i', '--omit', + '"*/test/*,*/tests/*"', '-d', 'coverage_report']) class Mercurial(Provider): -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Refactor run_coverage.py. Message-ID: brett.cannon pushed 3efec08c0767 to devinabox: http://hg.python.org/devinabox/rev/3efec08c0767 changeset: 23:3efec08c0767 user: Brett Cannon date: Sat Feb 26 18:00:35 2011 -0800 summary: Refactor run_coverage.py. files: run_coverage.py diff --git a/run_coverage.py b/run_coverage.py --- a/run_coverage.py +++ b/run_coverage.py @@ -6,19 +6,24 @@ import build_cpython import run_tests -build_cpython.main() -print(os.getcwd()) -executable = run_tests.executable() -if not executable: - print('no CPython executable found') - sys.exit(1) -print('Running coverage ...') -subprocess.check_call([executable, 'coveragepy', 'run', '--pylib', - os.path.join('cpython', 'Lib', 'test', 'regrtest.py'), - ]) -print('Generating report ...') -subprocess.call([executable, 'coveragepy', 'html', '-i', '--omit', - '"*/test/*,*/tests/*"', '-d', 'coverage_report']) -print('Creating symlink ...') -os.symlink(os.path.join('coverage_report', 'index.html'), 'coverage.html') +def main(): + build_cpython.main() + executable = run_tests.executable() + if not executable: + print('no CPython executable found') + sys.exit(1) + + print('Running coverage ...') + subprocess.check_call([executable, 'coveragepy', 'run', '--pylib', + os.path.join('cpython', 'Lib', 'test', 'regrtest.py'), + ]) + print('Generating report ...') + subprocess.call([executable, 'coveragepy', 'html', '-i', '--omit', + '"*/test/*,*/tests/*"', '-d', 'coverage_report']) + print('Creating symlink ...') + os.symlink(os.path.join('coverage_report', 'index.html'), 'coverage.html') + + +if __name__ == '__main__': + main() -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Measure the size of coverage.py + coverage report. 
Message-ID: brett.cannon pushed 4dfc931cab95 to devinabox: http://hg.python.org/devinabox/rev/4dfc931cab95 changeset: 27:4dfc931cab95 user: Brett Cannon date: Sat Feb 26 21:10:09 2011 -0800 summary: Measure the size of coverage.py + coverage report. files: make_a_box.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -130,7 +130,7 @@ url = 'https://brettsky at bitbucket.org/ned/coveragepy' directory = 'coveragepy' - size = 0 # XXX coverage report for CPython + size = 133 # Includes the coverage report def build(self): """Run coverage over CPython.""" -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Fix code from the shifting of executable(). Message-ID: brett.cannon pushed cf80aee7de97 to devinabox: http://hg.python.org/devinabox/rev/cf80aee7de97 changeset: 29:cf80aee7de97 user: Brett Cannon date: Sat Feb 26 21:16:31 2011 -0800 summary: Fix code from the shifting of executable(). files: build_cpython.py run_all_tests.py diff --git a/build_cpython.py b/build_cpython.py --- a/build_cpython.py +++ b/build_cpython.py @@ -42,7 +42,7 @@ subprocess.call(make_cmd) finally: os.chdir(cwd) - return executable + return executable() if __name__ == '__main__': if not main(): diff --git a/run_all_tests.py b/run_all_tests.py --- a/run_all_tests.py +++ b/run_all_tests.py @@ -7,7 +7,7 @@ def main(): - cmd = executable() + cmd = build_cpython.main() if cmd is None: print('CPython is not built') sys.exit(1) -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Properly clean up the 'build' directory from test runs. Message-ID: brett.cannon pushed 921434ac1129 to devinabox: http://hg.python.org/devinabox/rev/921434ac1129 changeset: 31:921434ac1129 user: Brett Cannon date: Sat Feb 26 21:34:10 2011 -0800 summary: Properly clean up the 'build' directory from test runs. 
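The fix below guarantees that the scratch 'build' directory left by the test run is removed even when the subprocess fails, by moving the cleanup into a finally clause. One caveat: os.rmdir() only removes an empty directory, so if a run leaves files behind, a more forgiving cleanup along these lines may be needed (a sketch, not the devinabox code):

    import shutil
    import subprocess

    def run_with_cleanup(cmd, junk_dir='build'):
        """Run 'cmd' and always remove 'junk_dir' afterwards."""
        try:
            subprocess.check_call(cmd)
        finally:
            # shutil.rmtree() copes with a non-empty directory; ignore_errors
            # keeps a missing directory from masking the original exception.
            shutil.rmtree(junk_dir, ignore_errors=True)
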
files: make_a_box.py run_all_tests.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -143,10 +143,12 @@ print('Running coverage ...') regrtest_path = os.path.join(CPython.directory, 'Lib', 'test', 'regrtest.py') - subprocess.check_call([executable, self.directory, 'run', '--pylib', - regrtest_path]) - # Clean up from the test run - os.rmdir('build') + try: + subprocess.check_call([executable, self.directory, 'run', '--pylib', + regrtest_path]) + finally: + # Clean up from the test run + os.rmdir('build') # Generate the HTML report print('Generating report ...') subprocess.call([executable, 'coveragepy', 'html', '-i', '--omit', diff --git a/run_all_tests.py b/run_all_tests.py --- a/run_all_tests.py +++ b/run_all_tests.py @@ -11,9 +11,12 @@ if cmd is None: print('CPython is not built') sys.exit(1) - subprocess.call([cmd, '-W', 'default', '-bb', '-E', '-m', 'test', '-r', - '-w', '-u', 'all', '-j', - str(multiprocessing.cpu_count())]) + try: + subprocess.call([cmd, '-W', 'default', '-bb', '-E', '-m', 'test', '-r', + '-w', '-u', 'all', '-j', + str(multiprocessing.cpu_count())]) + finally: + os.rmdir('build') if __name__ == '__main__': -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Fix some issues with building coverage.py. Message-ID: brett.cannon pushed 3fd40a5f2e0e to devinabox: http://hg.python.org/devinabox/rev/3fd40a5f2e0e changeset: 30:3fd40a5f2e0e user: Brett Cannon date: Sat Feb 26 21:30:11 2011 -0800 summary: Fix some issues with building coverage.py. files: make_a_box.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -126,7 +126,7 @@ @rename('coverage.py') class CoveragePy(HgProvider): - """Cloned repository of coverage.py (WARNING: building takes a while)""" + """Clone of coverage.py (WARNING: building takes a while)""" url = 'https://brettsky at bitbucket.org/ned/coveragepy' directory = 'coveragepy' @@ -141,17 +141,20 @@ print('No CPython executable found') sys.exit(1) print('Running coverage ...') - regrest_path = os.path.join(CPython.directory, 'Lib', 'test', + regrtest_path = os.path.join(CPython.directory, 'Lib', 'test', 'regrtest.py') subprocess.check_call([executable, self.directory, 'run', '--pylib', regrtest_path]) - # ``make distclean`` as you don't want to distribute your own build - with change_cwd(CPython.directory): - subprocess.check_call(['make', 'distclean']) + # Clean up from the test run + os.rmdir('build') # Generate the HTML report print('Generating report ...') subprocess.call([executable, 'coveragepy', 'html', '-i', '--omit', '"*/test/*,*/tests/*"', '-d', 'coverage_report']) + # ``make distclean`` as you don't want to distribute your own build + print('Cleaning up the CPython build ...') + with change_cwd(CPython.directory): + subprocess.check_call(['make', 'distclean']) class Mercurial(Provider): -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Re-arrange executable discovery code. 
Message-ID: brett.cannon pushed 817d54b6da18 to devinabox: http://hg.python.org/devinabox/rev/817d54b6da18 changeset: 28:817d54b6da18 user: Brett Cannon date: Sat Feb 26 21:13:55 2011 -0800 summary: Re-arrange executable discovery code. files: build_cpython.py make_a_box.py run_all_tests.py diff --git a/build_cpython.py b/build_cpython.py --- a/build_cpython.py +++ b/build_cpython.py @@ -5,6 +5,25 @@ import subprocess import sys + +def executable(): + directory = 'cpython' + cmd = os.path.join(directory, 'python') + # UNIX + if not os.path.isfile(cmd): + # OS X + cmd += '.exe' + if not os.path.isfile(cmd): + # 32-bit Windows + cmd = os.path.join(directory, 'PCBuild', 'python_d.exe') + if not os.path.isfile(cmd): + # 64-bit Windows + cmd = os.path.join(directory, 'PCBuild', 'AMD64', 'python_d.exe') + if not os.path.isfile(cmd): + return None + return os.path.abspath(cmd) + + def main(): if sys.platform == 'win32': print("See the devguide's Getting Set Up guide for building under " @@ -23,6 +42,9 @@ subprocess.call(make_cmd) finally: os.chdir(cwd) + return executable if __name__ == '__main__': - main() + if not main(): + print('No executable found') + sys.exit(1) diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -25,7 +25,6 @@ import webbrowser import xmlrpc.client import build_cpython -import run_all_tests def rename(new_name): @@ -123,6 +122,7 @@ url, self.directory)) +# XXX test @rename('coverage.py') class CoveragePy(HgProvider): @@ -135,9 +135,8 @@ def build(self): """Run coverage over CPython.""" # Build Python - build_cpython.main() + executable = build_cpython.main() # Run coverage - executable = run_all_tests.executable() if not executable: print('No CPython executable found') sys.exit(1) diff --git a/run_all_tests.py b/run_all_tests.py --- a/run_all_tests.py +++ b/run_all_tests.py @@ -1,29 +1,12 @@ #!/usr/bin/env python """Run CPython's test suite in the most rigorous way possible.""" import multiprocessing -import os import subprocess import sys +import build_cpython -def executable(): - directory = 'cpython' - cmd = os.path.join(directory, 'python') - # UNIX - if not os.path.isfile(cmd): - # OS X - cmd += '.exe' - if not os.path.isfile(cmd): - # 32-bit Windows - cmd = os.path.join(directory, 'PCBuild', 'python_d.exe') - if not os.path.isfile(cmd): - # 64-bit Windows - cmd = os.path.join(directory, 'PCBuild', 'AMD64', 'python_d.exe') - if not os.path.isfile(cmd): - return None - return os.path.abspath(cmd) - -if __name__ == '__main__': +def main(): cmd = executable() if cmd is None: print('CPython is not built') @@ -31,3 +14,7 @@ subprocess.call([cmd, '-W', 'default', '-bb', '-E', '-m', 'test', '-r', '-w', '-u', 'all', '-j', str(multiprocessing.cpu_count())]) + + +if __name__ == '__main__': + main() -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Tweak the base class. Message-ID: brett.cannon pushed fb4d6ba4ed64 to devinabox: http://hg.python.org/devinabox/rev/fb4d6ba4ed64 changeset: 33:fb4d6ba4ed64 user: Brett Cannon date: Sat Feb 26 21:52:49 2011 -0800 summary: Tweak the base class. 
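The base-class tweak below makes size abstract alongside directory and drops the now-unused update() hook, so a concrete provider only has to supply a few class attributes plus create(). A hypothetical example (the tool and its URL are invented, and the import assumes make_a_box.py is importable from the Box directory, since its interactive code is guarded by an __name__ check):

    from make_a_box import HgProvider  # assumed importable for illustration

    class SomeTool(HgProvider):

        """Clone of a hypothetical tool (illustration only)"""

        url = 'http://hg.example.org/sometool'  # HgProvider.create() clones this
        directory = 'sometool'                  # satisfies abstract 'directory'
        size = 10                               # satisfies the new abstract 'size'

        # create() is inherited from HgProvider; build() keeps the base no-op.
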
files: make_a_box.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -54,6 +54,12 @@ """Directory to put everything.""" raise NotImplementedError + @abc.abstractproperty + def size(self): + """Roughly how big will the thing being provided be.""" + + # docs points to any documentation index for the provider + def _prompt(self, message): """Prompt the user to perform an action, waiting for a response.""" input("{} [press Enter when done]".format(message)) @@ -66,10 +72,6 @@ def build(self): """Optional step to "build" something.""" - def update(self): - """Update what is provided.""" - pass - class HgProvider(Provider): -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Remove a dead comment. Message-ID: brett.cannon pushed 512f981bf7b7 to devinabox: http://hg.python.org/devinabox/rev/512f981bf7b7 changeset: 34:512f981bf7b7 user: Brett Cannon date: Sun Feb 27 13:03:27 2011 -0800 summary: Remove a dead comment. files: make_a_box.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -56,9 +56,8 @@ @abc.abstractproperty def size(self): - """Roughly how big will the thing being provided be.""" - - # docs points to any documentation index for the provider + """Size of what is provided (built and everything).""" + raise NotImplementedError def _prompt(self, message): """Prompt the user to perform an action, waiting for a response.""" @@ -124,7 +123,6 @@ url, self.directory)) -# XXX test @rename('coverage.py') class CoveragePy(HgProvider): -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: Note about what is required for coverage.py to work. Message-ID: brett.cannon pushed aa36c83dbbc9 to devinabox: http://hg.python.org/devinabox/rev/aa36c83dbbc9 changeset: 35:aa36c83dbbc9 tag: tip user: Brett Cannon date: Mon Feb 28 14:03:41 2011 -0800 summary: Note about what is required for coverage.py to work. files: make_a_box.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -133,6 +133,7 @@ size = 133 # Includes the coverage report docs = os.path.join('coverage_report', 'index.html') + # XXX still requires __main__.py in coveragepy checkout def build(self): """Run coverage over CPython.""" # Build Python -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:03:51 2011 From: python-checkins at python.org (brett.cannon) Date: Mon, 28 Feb 2011 23:03:51 +0100 Subject: [Python-checkins] devinabox: List in the generated README where built documentation exists. Message-ID: brett.cannon pushed 254340e39c79 to devinabox: http://hg.python.org/devinabox/rev/254340e39c79 changeset: 32:254340e39c79 user: Brett Cannon date: Sat Feb 26 21:50:53 2011 -0800 summary: List in the generated README where built documentation exists. 
files: make_a_box.py diff --git a/make_a_box.py b/make_a_box.py --- a/make_a_box.py +++ b/make_a_box.py @@ -131,6 +131,7 @@ url = 'https://brettsky at bitbucket.org/ned/coveragepy' directory = 'coveragepy' size = 133 # Includes the coverage report + docs = os.path.join('coverage_report', 'index.html') def build(self): """Run coverage over CPython.""" @@ -220,6 +221,7 @@ url = 'http://svn.python.org/projects/peps/trunk/' directory = 'peps' size = 20 + docs = os.path.join(directory, 'pep-0000.html') def build(self): """Build the PEPs.""" @@ -235,6 +237,7 @@ url = 'http://hg.python.org/devguide' directory = 'devguide' + docs = os.path.join(directory, '_build', 'html', 'index.html') def build(self): """Build the devguide using Sphinx from CPython's docs.""" @@ -259,6 +262,7 @@ url = 'http://hg.python.org/cpython' directory = 'cpython' size = 330 # Only docs are built + docs = os.path.join(directory, 'Doc', 'build', 'html', 'index.html') def create(self): """Clone CPython and get the necessary tools to build the @@ -303,6 +307,7 @@ if response not in ('Y', 'y'): sys.exit(0) else: + print() for provider in desired_providers: ins = provider() print('Fetching {} ...'.format(provider.__name__)) @@ -312,4 +317,9 @@ with open('README', 'w') as file: header = 'Python-Dev In a Box: created on {}\n' file.write(header.format(datetime.date.today())) + file.write('\n') + file.write('Documentation indices can be found at:\n') + for provider in desired_providers: + if hasattr(provider, 'docs'): + file.write(' {}\n'.format(provider.docs)) sys.exit(0) -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Mon Feb 28 23:04:51 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 23:04:51 +0100 (CET) Subject: [Python-checkins] r88683 - python/branches/py3k/Doc/library/threading.rst Message-ID: <20110228220451.81D53EEA93@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 23:04:51 2011 New Revision: 88683 Log: No need to put this at top Modified: python/branches/py3k/Doc/library/threading.rst Modified: python/branches/py3k/Doc/library/threading.rst ============================================================================== --- python/branches/py3k/Doc/library/threading.rst (original) +++ python/branches/py3k/Doc/library/threading.rst Mon Feb 28 23:04:51 2011 @@ -20,17 +20,6 @@ methods and functions in this module in the Python 2.x series are still supported by this module. -.. impl-detail:: - - Due to the :term:`Global Interpreter Lock`, in CPython only one thread - can execute Python code at once (even though certain performance-oriented - libraries might overcome this limitation). - If you want your application to make better of use of the computational - resources of multi-core machines, you are advised to use - :mod:`multiprocessing` or :class:`concurrent.futures.ProcessPoolExecutor`. - However, threading is still an appropriate model if you want to run - multiple I/O-bound tasks simultaneously. - This module defines the following functions and objects: @@ -374,6 +363,18 @@ property instead. +.. impl-detail:: + + Due to the :term:`Global Interpreter Lock`, in CPython only one thread + can execute Python code at once (even though certain performance-oriented + libraries might overcome this limitation). + If you want your application to make better of use of the computational + resources of multi-core machines, you are advised to use + :mod:`multiprocessing` or :class:`concurrent.futures.ProcessPoolExecutor`. 
+ However, threading is still an appropriate model if you want to run + multiple I/O-bound tasks simultaneously. + + .. _lock-objects: Lock Objects From python-checkins at python.org Mon Feb 28 23:06:48 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 23:06:48 +0100 (CET) Subject: [Python-checkins] r88684 - in python/branches/py3k/Misc: ACKS NEWS Message-ID: <20110228220648.57082EE99D@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 23:06:48 2011 New Revision: 88684 Log: Add credit for r88682. Modified: python/branches/py3k/Misc/ACKS python/branches/py3k/Misc/NEWS Modified: python/branches/py3k/Misc/ACKS ============================================================================== --- python/branches/py3k/Misc/ACKS (original) +++ python/branches/py3k/Misc/ACKS Mon Feb 28 23:06:48 2011 @@ -326,6 +326,7 @@ Fabian Groffen Eric Groo Dag Gruneau +Filip Gruszczy?ski Michael Guravage Lars Gust?bel Thomas G?ttler Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 28 23:06:48 2011 @@ -37,7 +37,8 @@ - Issue #11140: Lock.release() now raises a RuntimeError when attempting to release an unacquired lock, as claimed in the threading documentation. - The _thread.error exception is now an alias of RuntimeError. + The _thread.error exception is now an alias of RuntimeError. Patch by + Filip Gruszczy?ski. - Issue 8594: ftplib now provides a source_address parameter to specify which (address, port) to bind to before connecting. From python-checkins at python.org Mon Feb 28 23:25:22 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 23:25:22 +0100 (CET) Subject: [Python-checkins] r88685 - in python/branches/py3k: Doc/library/socket.rst Lib/test/test_socket.py Misc/NEWS Modules/socketmodule.c configure configure.in pyconfig.h.in Message-ID: <20110228222522.6E251EEA5F@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 23:25:22 2011 New Revision: 88685 Log: Issue #10866: Add socket.sethostname(). Initial patch by Ross Lagerwall. Modified: python/branches/py3k/Doc/library/socket.rst python/branches/py3k/Lib/test/test_socket.py python/branches/py3k/Misc/NEWS python/branches/py3k/Modules/socketmodule.c python/branches/py3k/configure python/branches/py3k/configure.in python/branches/py3k/pyconfig.h.in Modified: python/branches/py3k/Doc/library/socket.rst ============================================================================== --- python/branches/py3k/Doc/library/socket.rst (original) +++ python/branches/py3k/Doc/library/socket.rst Mon Feb 28 23:25:22 2011 @@ -521,6 +521,16 @@ meanings. +.. function:: sethostname(name) + + Set the machine's hostname to *name*. This will raise a + :exc:`socket.error` if you don't have enough rights. + + Availability: Unix. + + .. versionadded:: 3.3 + + .. data:: SocketType This is a Python type object that represents the socket object type. It is the Modified: python/branches/py3k/Lib/test/test_socket.py ============================================================================== --- python/branches/py3k/Lib/test/test_socket.py (original) +++ python/branches/py3k/Lib/test/test_socket.py Mon Feb 28 23:25:22 2011 @@ -325,6 +325,26 @@ if not fqhn in all_host_names: self.fail("Error testing host resolution mechanisms. 
(fqdn: %s, all: %s)" % (fqhn, repr(all_host_names))) + @unittest.skipUnless(hasattr(socket, 'sethostname'), "test needs socket.sethostname()") + @unittest.skipUnless(hasattr(socket, 'gethostname'), "test needs socket.gethostname()") + def test_sethostname(self): + oldhn = socket.gethostname() + try: + socket.sethostname('new') + except socket.error as e: + if e.errno == errno.EPERM: + self.skipTest("test should be run as root") + else: + raise + try: + # running test as root! + self.assertEqual(socket.gethostname(), 'new') + # Should work with bytes objects too + socket.sethostname(b'bar') + self.assertEqual(socket.gethostname(), 'bar') + finally: + socket.sethostname(oldhn) + def testRefCountGetNameInfo(self): # Testing reference count for getnameinfo if hasattr(sys, "getrefcount"): Modified: python/branches/py3k/Misc/NEWS ============================================================================== --- python/branches/py3k/Misc/NEWS (original) +++ python/branches/py3k/Misc/NEWS Mon Feb 28 23:25:22 2011 @@ -35,6 +35,8 @@ Library ------- +- Issue #10866: Add socket.sethostname(). Initial patch by Ross Lagerwall. + - Issue #11140: Lock.release() now raises a RuntimeError when attempting to release an unacquired lock, as claimed in the threading documentation. The _thread.error exception is now an alias of RuntimeError. Patch by Modified: python/branches/py3k/Modules/socketmodule.c ============================================================================== --- python/branches/py3k/Modules/socketmodule.c (original) +++ python/branches/py3k/Modules/socketmodule.c Mon Feb 28 23:25:22 2011 @@ -3135,6 +3135,37 @@ \n\ Return the current host name."); +#ifdef HAVE_SETHOSTNAME +PyDoc_STRVAR(sethostname_doc, +"sethostname(name)\n\n\ +Sets the hostname to name."); + +static PyObject * +socket_sethostname(PyObject *self, PyObject *args) +{ + PyObject *hnobj; + Py_buffer buf; + int res, flag = 0; + + if (!PyArg_ParseTuple(args, "S:sethostname", &hnobj)) { + PyErr_Clear(); + if (!PyArg_ParseTuple(args, "O&:sethostname", + PyUnicode_FSConverter, &hnobj)) + return NULL; + flag = 1; + } + res = PyObject_GetBuffer(hnobj, &buf, PyBUF_SIMPLE); + if (!res) { + res = sethostname(buf.buf, buf.len); + PyBuffer_Release(&buf); + } + if (flag) + Py_DECREF(hnobj); + if (res) + return set_error(); + Py_RETURN_NONE; +} +#endif /* Python interface to gethostbyname(name). */ @@ -4233,6 +4264,10 @@ METH_VARARGS, gethostbyaddr_doc}, {"gethostname", socket_gethostname, METH_NOARGS, gethostname_doc}, +#ifdef HAVE_SETHOSTNAME + {"sethostname", socket_sethostname, + METH_VARARGS, sethostname_doc}, +#endif {"getservbyname", socket_getservbyname, METH_VARARGS, getservbyname_doc}, {"getservbyport", socket_getservbyport, Modified: python/branches/py3k/configure ============================================================================== --- python/branches/py3k/configure (original) +++ python/branches/py3k/configure Mon Feb 28 23:25:22 2011 @@ -1,5 +1,5 @@ #! /bin/sh -# From configure.in Revision: 88624 . +# From configure.in Revision: 88625 . # Guess values for system-dependent variables and create Makefiles. # Generated by GNU Autoconf 2.65 for python 3.3. 
# @@ -9319,7 +9319,7 @@ mkfifoat mknod mknodat mktime mremap nice openat pathconf pause plock poll \ pthread_init putenv readlink readlinkat realpath renameat \ select sem_open sem_timedwait sem_getvalue sem_unlink sendfile setegid seteuid \ - setgid \ + setgid sethostname \ setlocale setregid setreuid setresuid setresgid setsid setpgid setpgrp setpriority setuid setvbuf \ sigaction siginterrupt sigrelse snprintf strftime strlcpy symlinkat \ sysconf tcgetpgrp tcsetpgrp tempnam timegm times tmpfile tmpnam tmpnam_r \ Modified: python/branches/py3k/configure.in ============================================================================== --- python/branches/py3k/configure.in (original) +++ python/branches/py3k/configure.in Mon Feb 28 23:25:22 2011 @@ -2542,7 +2542,7 @@ mkfifoat mknod mknodat mktime mremap nice openat pathconf pause plock poll \ pthread_init putenv readlink readlinkat realpath renameat \ select sem_open sem_timedwait sem_getvalue sem_unlink sendfile setegid seteuid \ - setgid \ + setgid sethostname \ setlocale setregid setreuid setresuid setresgid setsid setpgid setpgrp setpriority setuid setvbuf \ sigaction siginterrupt sigrelse snprintf strftime strlcpy symlinkat \ sysconf tcgetpgrp tcsetpgrp tempnam timegm times tmpfile tmpnam tmpnam_r \ Modified: python/branches/py3k/pyconfig.h.in ============================================================================== --- python/branches/py3k/pyconfig.h.in (original) +++ python/branches/py3k/pyconfig.h.in Mon Feb 28 23:25:22 2011 @@ -443,6 +443,9 @@ /* Define to 1 if you have the `resolv' library (-lresolv). */ #undef HAVE_LIBRESOLV +/* Define to 1 if you have the `sendfile' library (-lsendfile). */ +#undef HAVE_LIBSENDFILE + /* Define to 1 if you have the header file. */ #undef HAVE_LIBUTIL_H @@ -533,12 +536,6 @@ /* Define if the OS supports pipe2() */ #undef HAVE_PIPE2 -/* Define if the OS supports pipe2() */ -#undef HAVE_PIPE2 - -/* Define if the OS supports pipe2() */ -#undef HAVE_PIPE2 - /* Define to 1 if you have the `plock' function. */ #undef HAVE_PLOCK @@ -623,6 +620,9 @@ /* Define to 1 if you have the `sem_unlink' function. */ #undef HAVE_SEM_UNLINK +/* Define to 1 if you have the `sendfile' function. */ +#undef HAVE_SENDFILE + /* Define to 1 if you have the `setegid' function. */ #undef HAVE_SETEGID @@ -635,6 +635,9 @@ /* Define if you have the 'setgroups' function. */ #undef HAVE_SETGROUPS +/* Define to 1 if you have the `sethostname' function. */ +#undef HAVE_SETHOSTNAME + /* Define to 1 if you have the `setitimer' function. */ #undef HAVE_SETITIMER @@ -849,6 +852,9 @@ /* Define to 1 if you have the header file. */ #undef HAVE_SYS_TYPES_H +/* Define to 1 if you have the header file. */ +#undef HAVE_SYS_UIO_H + /* Define to 1 if you have the header file. */ #undef HAVE_SYS_UN_H @@ -1084,9 +1090,6 @@ /* The size of `size_t', as computed by sizeof. */ #undef SIZEOF_SIZE_T -/* Define to 1 if you have the `sendfile' function. */ -#undef HAVE_SENDFILE - /* The size of `time_t', as computed by sizeof. */ #undef SIZEOF_TIME_T From python-checkins at python.org Mon Feb 28 23:38:08 2011 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 28 Feb 2011 23:38:08 +0100 (CET) Subject: [Python-checkins] r88686 - python/branches/py3k/Doc/library/socket.rst Message-ID: <20110228223808.10811EEA54@mail.python.org> Author: antoine.pitrou Date: Mon Feb 28 23:38:07 2011 New Revision: 88686 Log: Recommend inspecting the errno attribute of socket.error objects, and improve wording. 
Modified: python/branches/py3k/Doc/library/socket.rst Modified: python/branches/py3k/Doc/library/socket.rst ============================================================================== --- python/branches/py3k/Doc/library/socket.rst (original) +++ python/branches/py3k/Doc/library/socket.rst Mon Feb 28 23:38:07 2011 @@ -117,39 +117,44 @@ .. index:: module: errno - This exception is raised for socket-related errors. The accompanying value is - either a string telling what went wrong or a pair ``(errno, string)`` - representing an error returned by a system call, similar to the value - accompanying :exc:`os.error`. See the module :mod:`errno`, which contains names - for the error codes defined by the underlying operating system. + A subclass of :exc:`IOError`, this exception is raised for socket-related + errors. It is recommended that you inspect its ``errno`` attribute to + discriminate between different kinds of errors. + + .. seealso:: + The :mod:`errno` module contains symbolic names for the error codes + defined by the underlying operating system. .. exception:: herror - This exception is raised for address-related errors, i.e. for functions that use - *h_errno* in the C API, including :func:`gethostbyname_ex` and - :func:`gethostbyaddr`. - - The accompanying value is a pair ``(h_errno, string)`` representing an error - returned by a library call. *string* represents the description of *h_errno*, as - returned by the :c:func:`hstrerror` C function. + A subclass of :exc:`socket.error`, this exception is raised for + address-related errors, i.e. for functions that use *h_errno* in the POSIX + C API, including :func:`gethostbyname_ex` and :func:`gethostbyaddr`. + The accompanying value is a pair ``(h_errno, string)`` representing an + error returned by a library call. *h_errno* is a numeric value, while + *string* represents the description of *h_errno*, as returned by the + :c:func:`hstrerror` C function. .. exception:: gaierror - This exception is raised for address-related errors, for :func:`getaddrinfo` and - :func:`getnameinfo`. The accompanying value is a pair ``(error, string)`` - representing an error returned by a library call. *string* represents the - description of *error*, as returned by the :c:func:`gai_strerror` C function. The - *error* value will match one of the :const:`EAI_\*` constants defined in this - module. + A subclass of :exc:`socket.error`, this exception is raised for + address-related errors by :func:`getaddrinfo` and :func:`getnameinfo`. + The accompanying value is a pair ``(error, string)`` representing an error + returned by a library call. *string* represents the description of + *error*, as returned by the :c:func:`gai_strerror` C function. The + numeric *error* value will match one of the :const:`EAI_\*` constants + defined in this module. .. exception:: timeout - This exception is raised when a timeout occurs on a socket which has had - timeouts enabled via a prior call to :meth:`~socket.settimeout`. The - accompanying value is a string whose value is currently always "timed out". + A subclass of :exc:`socket.error`, this exception is raised when a timeout + occurs on a socket which has had timeouts enabled via a prior call to + :meth:`~socket.settimeout` (or implicitly through + :func:`~socket.setdefaulttimeout`). The accompanying value is a string + whose value is currently always "timed out". .. data:: AF_UNIX
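In practice the rewritten descriptions boil down to: catch socket.error (or one of its subclasses) and branch on its errno attribute instead of parsing message strings. A short illustration (the host and the specific error code are arbitrary examples):

    import errno
    import socket

    try:
        conn = socket.create_connection(('www.python.org', 80), timeout=5)
    except socket.timeout:
        print('connection attempt timed out')   # subclass of socket.error
    except socket.error as exc:
        if exc.errno == errno.ECONNREFUSED:
            print('nothing listening on that port')
        else:
            raise                               # unexpected; let it propagate
    else:
        conn.close()
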