From webhook-mailer at python.org Sat Jan 1 12:50:04 2022 From: webhook-mailer at python.org (rhettinger) Date: Sat, 01 Jan 2022 17:50:04 -0000 Subject: [Python-checkins] bpo-46079: Replace external link that is down for maintenance. (GH-30315) Message-ID: https://github.com/python/cpython/commit/ac4eea21722d9ed1604c9c30a8d29ae2ce74f1b2 commit: ac4eea21722d9ed1604c9c30a8d29ae2ce74f1b2 branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-01T09:49:55-08:00 summary: bpo-46079: Replace external link that is down for maintenance. (GH-30315) files: M Doc/tutorial/stdlib.rst diff --git a/Doc/tutorial/stdlib.rst b/Doc/tutorial/stdlib.rst index bad750424c8e8..d90dc51c71927 100644 --- a/Doc/tutorial/stdlib.rst +++ b/Doc/tutorial/stdlib.rst @@ -178,13 +178,13 @@ protocols. Two of the simplest are :mod:`urllib.request` for retrieving data from URLs and :mod:`smtplib` for sending mail:: >>> from urllib.request import urlopen - >>> with urlopen('http://tycho.usno.navy.mil/cgi-bin/timer.pl') as response: + >>> with urlopen('http://worldtimeapi.org/api/timezone/etc/UTC.txt') as response: ... for line in response: - ... line = line.decode('utf-8') # Decoding the binary data to text. - ... if 'EST' in line or 'EDT' in line: # look for Eastern Time - ... print(line) - -
Nov. 25, 09:43:32 PM EST + ... line = line.decode() # Convert bytes to a str + ... if line.startswith('datetime'): + ... print(line.rstrip()) # Remove trailing newline + ... + datetime: 2022-01-01T01:36:47.689215+00:00 >>> import smtplib >>> server = smtplib.SMTP('localhost') From webhook-mailer at python.org Sat Jan 1 13:13:13 2022 From: webhook-mailer at python.org (rhettinger) Date: Sat, 01 Jan 2022 18:13:13 -0000 Subject: [Python-checkins] bpo-46079: Replace external link that is down for maintenance. (GH-30315) (GH-30328) Message-ID: https://github.com/python/cpython/commit/2bd73546959619b2519a7a830b3aaf190abeaf78 commit: 2bd73546959619b2519a7a830b3aaf190abeaf78 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: rhettinger date: 2022-01-01T10:12:59-08:00 summary: bpo-46079: Replace external link that is down for maintenance. (GH-30315) (GH-30328) files: M Doc/tutorial/stdlib.rst diff --git a/Doc/tutorial/stdlib.rst b/Doc/tutorial/stdlib.rst index f33265cd2b0eb..ac16160b23439 100644 --- a/Doc/tutorial/stdlib.rst +++ b/Doc/tutorial/stdlib.rst @@ -178,13 +178,13 @@ protocols. Two of the simplest are :mod:`urllib.request` for retrieving data from URLs and :mod:`smtplib` for sending mail:: >>> from urllib.request import urlopen - >>> with urlopen('http://tycho.usno.navy.mil/cgi-bin/timer.pl') as response: + >>> with urlopen('http://worldtimeapi.org/api/timezone/etc/UTC.txt') as response: ... for line in response: - ... line = line.decode('utf-8') # Decoding the binary data to text. - ... if 'EST' in line or 'EDT' in line: # look for Eastern Time - ... print(line) - -
Nov. 25, 09:43:32 PM EST + ... line = line.decode() # Convert bytes to a str + ... if line.startswith('datetime'): + ... print(line.rstrip()) # Remove trailing newline + ... + datetime: 2022-01-01T01:36:47.689215+00:00 >>> import smtplib >>> server = smtplib.SMTP('localhost') From webhook-mailer at python.org Sat Jan 1 13:13:35 2022 From: webhook-mailer at python.org (rhettinger) Date: Sat, 01 Jan 2022 18:13:35 -0000 Subject: [Python-checkins] bpo-46079: Replace external link that is down for maintenance. (GH-30315) (GH-30329) Message-ID: https://github.com/python/cpython/commit/72ffcb02f3ea6efcd3afe368996dc3ee89701898 commit: 72ffcb02f3ea6efcd3afe368996dc3ee89701898 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: rhettinger date: 2022-01-01T10:13:31-08:00 summary: bpo-46079: Replace external link that is down for maintenance. (GH-30315) (GH-30329) files: M Doc/tutorial/stdlib.rst diff --git a/Doc/tutorial/stdlib.rst b/Doc/tutorial/stdlib.rst index a52653b94a325..ab64ca6d400b8 100644 --- a/Doc/tutorial/stdlib.rst +++ b/Doc/tutorial/stdlib.rst @@ -178,13 +178,13 @@ protocols. Two of the simplest are :mod:`urllib.request` for retrieving data from URLs and :mod:`smtplib` for sending mail:: >>> from urllib.request import urlopen - >>> with urlopen('http://tycho.usno.navy.mil/cgi-bin/timer.pl') as response: + >>> with urlopen('http://worldtimeapi.org/api/timezone/etc/UTC.txt') as response: ... for line in response: - ... line = line.decode('utf-8') # Decoding the binary data to text. - ... if 'EST' in line or 'EDT' in line: # look for Eastern Time - ... print(line) - -
Nov. 25, 09:43:32 PM EST + ... line = line.decode() # Convert bytes to a str + ... if line.startswith('datetime'): + ... print(line.rstrip()) # Remove trailing newline + ... + datetime: 2022-01-01T01:36:47.689215+00:00 >>> import smtplib >>> server = smtplib.SMTP('localhost') From webhook-mailer at python.org Sat Jan 1 13:37:50 2022 From: webhook-mailer at python.org (rhettinger) Date: Sat, 01 Jan 2022 18:37:50 -0000 Subject: [Python-checkins] bpo-46095: Improve SeqIter documentation. (GH-30316) Message-ID: https://github.com/python/cpython/commit/a09bc3a404befca197b5d9959a9c62110ee61d77 commit: a09bc3a404befca197b5d9959a9c62110ee61d77 branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-01T10:37:26-08:00 summary: bpo-46095: Improve SeqIter documentation. (GH-30316) files: M Doc/library/stdtypes.rst M Doc/reference/compound_stmts.rst diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index 9d80661fdb821..dc423bfbb7f55 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -959,6 +959,16 @@ This means that to compare equal, every element must compare equal and the two sequences must be of the same type and have the same length. (For full details see :ref:`comparisons` in the language reference.) +.. index:: + single: loop; over mutable sequence + single: mutable sequence; loop over + +Forward and reversed iterators over mutable sequences access values using an +index. That index will continue to march forward (or backward) even if the +underlying sequence is mutated. The iterator terminates only when an +:exc:`IndexError` or a :exc:`StopIteration` is encountered (or when the index +drops below zero). + Notes: (1) diff --git a/Doc/reference/compound_stmts.rst b/Doc/reference/compound_stmts.rst index cf8ad1787b291..03fc2cb962791 100644 --- a/Doc/reference/compound_stmts.rst +++ b/Doc/reference/compound_stmts.rst @@ -196,27 +196,6 @@ the built-in function :func:`range` returns an iterator of integers suitable to emulate the effect of Pascal's ``for i := a to b do``; e.g., ``list(range(3))`` returns the list ``[0, 1, 2]``. -.. note:: - - .. index:: - single: loop; over mutable sequence - single: mutable sequence; loop over - - There is a subtlety when the sequence is being modified by the loop (this can - only occur for mutable sequences, e.g. lists). An internal counter is used - to keep track of which item is used next, and this is incremented on each - iteration. When this counter has reached the length of the sequence the loop - terminates. This means that if the suite deletes the current (or a previous) - item from the sequence, the next item will be skipped (since it gets the - index of the current item which has already been treated). Likewise, if the - suite inserts an item in the sequence before the current item, the current - item will be treated again the next time through the loop. This can lead to - nasty bugs that can be avoided by making a temporary copy using a slice of - the whole sequence, e.g., :: - - for x in a[:]: - if x < 0: a.remove(x) - .. _try: .. _except: From webhook-mailer at python.org Sat Jan 1 14:12:52 2022 From: webhook-mailer at python.org (rhettinger) Date: Sat, 01 Jan 2022 19:12:52 -0000 Subject: [Python-checkins] bpo-46095: Improve SeqIter documentation. 
(GH-30316) (GH-30330) Message-ID: https://github.com/python/cpython/commit/e9783d6434f28dfb0b531c6760f7642fc7ede278 commit: e9783d6434f28dfb0b531c6760f7642fc7ede278 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: rhettinger date: 2022-01-01T11:12:43-08:00 summary: bpo-46095: Improve SeqIter documentation. (GH-30316) (GH-30330) files: M Doc/library/stdtypes.rst M Doc/reference/compound_stmts.rst diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index 101bbca7be8b2..8fa252b04d706 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -921,6 +921,16 @@ This means that to compare equal, every element must compare equal and the two sequences must be of the same type and have the same length. (For full details see :ref:`comparisons` in the language reference.) +.. index:: + single: loop; over mutable sequence + single: mutable sequence; loop over + +Forward and reversed iterators over mutable sequences access values using an +index. That index will continue to march forward (or backward) even if the +underlying sequence is mutated. The iterator terminates only when an +:exc:`IndexError` or a :exc:`StopIteration` is encountered (or when the index +drops below zero). + Notes: (1) diff --git a/Doc/reference/compound_stmts.rst b/Doc/reference/compound_stmts.rst index 63d885deae93f..7f37bb4fdf9c9 100644 --- a/Doc/reference/compound_stmts.rst +++ b/Doc/reference/compound_stmts.rst @@ -196,27 +196,6 @@ the built-in function :func:`range` returns an iterator of integers suitable to emulate the effect of Pascal's ``for i := a to b do``; e.g., ``list(range(3))`` returns the list ``[0, 1, 2]``. -.. note:: - - .. index:: - single: loop; over mutable sequence - single: mutable sequence; loop over - - There is a subtlety when the sequence is being modified by the loop (this can - only occur for mutable sequences, e.g. lists). An internal counter is used - to keep track of which item is used next, and this is incremented on each - iteration. When this counter has reached the length of the sequence the loop - terminates. This means that if the suite deletes the current (or a previous) - item from the sequence, the next item will be skipped (since it gets the - index of the current item which has already been treated). Likewise, if the - suite inserts an item in the sequence before the current item, the current - item will be treated again the next time through the loop. This can lead to - nasty bugs that can be avoided by making a temporary copy using a slice of - the whole sequence, e.g., :: - - for x in a[:]: - if x < 0: a.remove(x) - .. _try: .. _except: From webhook-mailer at python.org Sun Jan 2 04:34:38 2022 From: webhook-mailer at python.org (iritkatriel) Date: Sun, 02 Jan 2022 09:34:38 -0000 Subject: [Python-checkins] bpo-45615: Add missing test for printing traceback for non-exception. Fix traceback.py (GH-30091) Message-ID: https://github.com/python/cpython/commit/a82baed0e9e61c0d8dc5c12fc08de7fc172c1a38 commit: a82baed0e9e61c0d8dc5c12fc08de7fc172c1a38 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-02T09:34:03Z summary: bpo-45615: Add missing test for printing traceback for non-exception. 
Fix traceback.py (GH-30091) files: A Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst M Lib/test/test_traceback.py M Lib/traceback.py M Modules/_testcapimodule.c diff --git a/Lib/test/test_traceback.py b/Lib/test/test_traceback.py index 97bd9bae1d58e..a0e4656d3d9ea 100644 --- a/Lib/test/test_traceback.py +++ b/Lib/test/test_traceback.py @@ -1060,6 +1060,22 @@ def test_exception_group_deep_recursion_traceback(self): self.assertIn('ExceptionGroup', output) self.assertLessEqual(output.count('ExceptionGroup'), LIMIT) + @cpython_only + def test_print_exception_bad_type_capi(self): + from _testcapi import exception_print + with captured_output("stderr") as stderr: + exception_print(42) + self.assertEqual( + stderr.getvalue(), + ('TypeError: print_exception(): ' + 'Exception expected for value, int found\n') + ) + + def test_print_exception_bad_type_python(self): + msg = "Exception expected for value, int found" + with self.assertRaisesRegex(TypeError, msg): + traceback.print_exception(42) + cause_message = ( "\nThe above exception was the direct cause " diff --git a/Lib/traceback.py b/Lib/traceback.py index b244750fd016e..05f1fffef0d3b 100644 --- a/Lib/traceback.py +++ b/Lib/traceback.py @@ -98,7 +98,11 @@ def _parse_value_tb(exc, value, tb): raise ValueError("Both or neither of value and tb must be given") if value is tb is _sentinel: if exc is not None: - return exc, exc.__traceback__ + if isinstance(exc, BaseException): + return exc, exc.__traceback__ + + raise TypeError(f'Exception expected for value, ' + f'{type(exc).__name__} found') else: return None, None return value, tb diff --git a/Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst b/Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst new file mode 100644 index 0000000000000..f8cd911ea6365 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst @@ -0,0 +1 @@ +Functions in the :mod:`traceback` module raise :exc:`TypeError` rather than :exc:`AttributeError` when an exception argument is not of type :exc:`BaseException`. 
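
As a quick illustration of the behaviour change above (a sketch run against the patched traceback module, not part of the commit itself):

    import traceback

    # Before GH-30091 a non-exception argument failed with an incidental
    # AttributeError; with this change it raises a deliberate TypeError.
    try:
        traceback.print_exception(42)
    except TypeError as err:
        print(err)   # Exception expected for value, int found
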
\ No newline at end of file diff --git a/Modules/_testcapimodule.c b/Modules/_testcapimodule.c index 6116365b2c0f7..be40d68b40b17 100644 --- a/Modules/_testcapimodule.c +++ b/Modules/_testcapimodule.c @@ -3513,17 +3513,17 @@ static PyObject * exception_print(PyObject *self, PyObject *args) { PyObject *value; - PyObject *tb; + PyObject *tb = NULL; if (!PyArg_ParseTuple(args, "O:exception_print", - &value)) - return NULL; - if (!PyExceptionInstance_Check(value)) { - PyErr_Format(PyExc_TypeError, "an exception instance is required"); + &value)) { return NULL; } - tb = PyException_GetTraceback(value); + if (PyExceptionInstance_Check(value)) { + tb = PyException_GetTraceback(value); + } + PyErr_Display((PyObject *) Py_TYPE(value), value, tb); Py_XDECREF(tb); From webhook-mailer at python.org Sun Jan 2 11:52:05 2022 From: webhook-mailer at python.org (Mariatta) Date: Sun, 02 Jan 2022 16:52:05 -0000 Subject: [Python-checkins] bpo-45903: Fix typo in What's New: Signature.from_builtin is removed (GH-29813) Message-ID: https://github.com/python/cpython/commit/7a8796dc67d691e43eed69969e7706fefe0f16e9 commit: 7a8796dc67d691e43eed69969e7706fefe0f16e9 branch: main author: Hugo van Kemenade committer: Mariatta date: 2022-01-02T08:51:56-08:00 summary: bpo-45903: Fix typo in What's New: Signature.from_builtin is removed (GH-29813) files: M Doc/whatsnew/3.11.rst diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 5389ce8b258cf..faa63a93895a2 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -528,7 +528,7 @@ Removed use the :func:`inspect.signature` function and :class:`Signature` object directly. - * the undocumented ``Signature.from_callable`` and ``Signature.from_function`` + * the undocumented ``Signature.from_builtin`` and ``Signature.from_function`` functions, deprecated since Python 3.5; use the :meth:`Signature.from_callable() ` method instead. From webhook-mailer at python.org Sun Jan 2 13:33:29 2022 From: webhook-mailer at python.org (merwok) Date: Sun, 02 Jan 2022 18:33:29 -0000 Subject: [Python-checkins] bpo-46196: document method cmd.Cmd.columnize (#30303) Message-ID: https://github.com/python/cpython/commit/ce4d25f3cd0a1c6e65b64015140fb5e1397c8ac5 commit: ce4d25f3cd0a1c6e65b64015140fb5e1397c8ac5 branch: main author: Nikita Sobolev committer: merwok date: 2022-01-02T13:33:20-05:00 summary: bpo-46196: document method cmd.Cmd.columnize (#30303) The method is already written and tested, now it's officially public. files: A Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst M Doc/library/cmd.rst diff --git a/Doc/library/cmd.rst b/Doc/library/cmd.rst index d57edb7eb1698..fd5df96dfd0b3 100644 --- a/Doc/library/cmd.rst +++ b/Doc/library/cmd.rst @@ -121,6 +121,13 @@ A :class:`Cmd` instance has the following methods: :meth:`complete_\*` method is available. By default, it returns an empty list. +.. method:: Cmd.columnize(list, displaywidth=80) + + Method called to display a list of strings as a compact set of columns. + Each column is only as wide as necessary. + Columns are separated by two spaces for readability. + + .. 
method:: Cmd.precmd(line) Hook method executed just before the command line *line* is interpreted, but diff --git a/Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst b/Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst new file mode 100644 index 0000000000000..f14ada607522e --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst @@ -0,0 +1 @@ +Document method :meth:`cmd.Cmd.columnize`. From webhook-mailer at python.org Sun Jan 2 14:18:36 2022 From: webhook-mailer at python.org (tim-one) Date: Sun, 02 Jan 2022 19:18:36 -0000 Subject: [Python-checkins] bpo-46218: Change long_pow() to sliding window algorithm (GH-30319) Message-ID: https://github.com/python/cpython/commit/863729e9c6f599286f98ec37c8716e982c4ca9dd commit: 863729e9c6f599286f98ec37c8716e982c4ca9dd branch: main author: Tim Peters committer: tim-one date: 2022-01-02T13:18:20-06:00 summary: bpo-46218: Change long_pow() to sliding window algorithm (GH-30319) * bpo-46218: Change long_pow() to sliding window algorithm The primary motivation is to eliminate long_pow's reliance on that the number of bits in a long "digit" is a multiple of 5. Now it no longer cares how many bits are in a digit. But the sliding window approach also allows cutting the precomputed table of small powers in half, which reduces initialization overhead enough that the approach pays off for smaller exponents too. Depending on exponent bit patterns, a sliding window may also be able to save some bigint multiplies (sometimes when at least 5 consecutive exponent bits are 0, regardless of their starting bit position modulo 5). Note: boosting the window width to 6 didn't work well overall. It give marginal speed improvements for huge exponents, but the increased overhead (the small-power table needs twice as many entries) made it a loss for smaller exponents. Co-authored-by: Oleg Iarygin files: M Include/cpython/longintrepr.h M Lib/test/test_pow.py M Objects/longobject.c diff --git a/Include/cpython/longintrepr.h b/Include/cpython/longintrepr.h index ff4155f9656de..68dbf9c4382dc 100644 --- a/Include/cpython/longintrepr.h +++ b/Include/cpython/longintrepr.h @@ -21,8 +21,6 @@ extern "C" { PyLong_SHIFT. The majority of the code doesn't care about the precise value of PyLong_SHIFT, but there are some notable exceptions: - - long_pow() requires that PyLong_SHIFT be divisible by 5 - - PyLong_{As,From}ByteArray require that PyLong_SHIFT be at least 8 - long_hash() requires that PyLong_SHIFT is *strictly* less than the number @@ -63,10 +61,6 @@ typedef long stwodigits; /* signed variant of twodigits */ #define PyLong_BASE ((digit)1 << PyLong_SHIFT) #define PyLong_MASK ((digit)(PyLong_BASE - 1)) -#if PyLong_SHIFT % 5 != 0 -#error "longobject.c requires that PyLong_SHIFT be divisible by 5" -#endif - /* Long integer representation. 
The absolute value of a number is equal to SUM(for i=0 through abs(ob_size)-1) ob_digit[i] * 2**(SHIFT*i) diff --git a/Lib/test/test_pow.py b/Lib/test/test_pow.py index 660ff80bbf522..5cea9ceb20f5c 100644 --- a/Lib/test/test_pow.py +++ b/Lib/test/test_pow.py @@ -93,6 +93,28 @@ def test_other(self): pow(int(i),j,k) ) + def test_big_exp(self): + import random + self.assertEqual(pow(2, 50000), 1 << 50000) + # Randomized modular tests, checking the identities + # a**(b1 + b2) == a**b1 * a**b2 + # a**(b1 * b2) == (a**b1)**b2 + prime = 1000000000039 # for speed, relatively small prime modulus + for i in range(10): + a = random.randrange(1000, 1000000) + bpower = random.randrange(1000, 50000) + b = random.randrange(1 << (bpower - 1), 1 << bpower) + b1 = random.randrange(1, b) + b2 = b - b1 + got1 = pow(a, b, prime) + got2 = pow(a, b1, prime) * pow(a, b2, prime) % prime + if got1 != got2: + self.fail(f"{a=:x} {b1=:x} {b2=:x} {got1=:x} {got2=:x}") + got3 = pow(a, b1 * b2, prime) + got4 = pow(pow(a, b1, prime), b2, prime) + if got3 != got4: + self.fail(f"{a=:x} {b1=:x} {b2=:x} {got3=:x} {got4=:x}") + def test_bug643260(self): class TestRpow: def __rpow__(self, other): diff --git a/Objects/longobject.c b/Objects/longobject.c index 09ae9455c5b26..b5648fca7dc5c 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -74,12 +74,34 @@ maybe_small_long(PyLongObject *v) #define KARATSUBA_CUTOFF 70 #define KARATSUBA_SQUARE_CUTOFF (2 * KARATSUBA_CUTOFF) -/* For exponentiation, use the binary left-to-right algorithm - * unless the exponent contains more than FIVEARY_CUTOFF digits. - * In that case, do 5 bits at a time. The potential drawback is that - * a table of 2**5 intermediate results is computed. +/* For exponentiation, use the binary left-to-right algorithm unless the + ^ exponent contains more than HUGE_EXP_CUTOFF bits. In that case, do + * (no more than) EXP_WINDOW_SIZE bits at a time. The potential drawback is + * that a table of 2**(EXP_WINDOW_SIZE - 1) intermediate results is + * precomputed. */ -#define FIVEARY_CUTOFF 8 +#define EXP_WINDOW_SIZE 5 +#define EXP_TABLE_LEN (1 << (EXP_WINDOW_SIZE - 1)) +/* Suppose the exponent has bit length e. All ways of doing this + * need e squarings. The binary method also needs a multiply for + * each bit set. In a k-ary method with window width w, a multiply + * for each non-zero window, so at worst (and likely!) + * ceiling(e/w). The k-ary sliding window method has the same + * worst case, but the window slides so it can sometimes skip + * over an all-zero window that the fixed-window method can't + * exploit. In addition, the windowing methods need multiplies + * to precompute a table of small powers. + * + * For the sliding window method with width 5, 16 precomputation + * multiplies are needed. Assuming about half the exponent bits + * are set, then, the binary method needs about e/2 extra mults + * and the window method about 16 + e/5. + * + * The latter is smaller for e > 53 1/3. We don't have direct + * access to the bit length, though, so call it 60, which is a + * multiple of a long digit's max bit length (15 or 30 so far). 
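
As an aside to the cost analysis quoted above, the "53 1/3" crossover is just the point where the two estimates meet (this is only the arithmetic, not part of the patch):

    # binary method: ~e/2 extra multiplies; 5-bit window: ~16 + e/5
    # crossover where the window method wins:  e/2 == 16 + e/5
    e = 16 / (1/2 - 1/5)
    print(e)   # 53.33..., rounded up to 60, a multiple of both 15 and 30
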
+ */ +#define HUGE_EXP_CUTOFF 60 #define SIGCHECK(PyTryBlock) \ do { \ @@ -4172,14 +4194,15 @@ long_pow(PyObject *v, PyObject *w, PyObject *x) int negativeOutput = 0; /* if x<0 return negative output */ PyLongObject *z = NULL; /* accumulated result */ - Py_ssize_t i, j, k; /* counters */ + Py_ssize_t i, j; /* counters */ PyLongObject *temp = NULL; + PyLongObject *a2 = NULL; /* may temporarily hold a**2 % c */ - /* 5-ary values. If the exponent is large enough, table is - * precomputed so that table[i] == a**i % c for i in range(32). + /* k-ary values. If the exponent is large enough, table is + * precomputed so that table[i] == a**(2*i+1) % c for i in + * range(EXP_TABLE_LEN). */ - PyLongObject *table[32] = {0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, - 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0}; + PyLongObject *table[EXP_TABLE_LEN] = {0}; /* a, b, c = v, w, x */ CHECK_BINOP(v, w); @@ -4332,7 +4355,7 @@ long_pow(PyObject *v, PyObject *w, PyObject *x) } /* else bi is 0, and z==1 is correct */ } - else if (i <= FIVEARY_CUTOFF) { + else if (i <= HUGE_EXP_CUTOFF / PyLong_SHIFT ) { /* Left-to-right binary exponentiation (HAC Algorithm 14.79) */ /* http://www.cacr.math.uwaterloo.ca/hac/about/chap14.pdf */ @@ -4366,23 +4389,59 @@ long_pow(PyObject *v, PyObject *w, PyObject *x) } } else { - /* Left-to-right 5-ary exponentiation (HAC Algorithm 14.82) */ - Py_INCREF(z); /* still holds 1L */ - table[0] = z; - for (i = 1; i < 32; ++i) - MULT(table[i-1], a, table[i]); + /* Left-to-right k-ary sliding window exponentiation + * (Handbook of Applied Cryptography (HAC) Algorithm 14.85) + */ + Py_INCREF(a); + table[0] = a; + MULT(a, a, a2); + /* table[i] == a**(2*i + 1) % c */ + for (i = 1; i < EXP_TABLE_LEN; ++i) + MULT(table[i-1], a2, table[i]); + Py_CLEAR(a2); + + /* Repeatedly extract the next (no more than) EXP_WINDOW_SIZE bits + * into `pending`, starting with the next 1 bit. The current bit + * length of `pending` is `blen`. 
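
To make the window mechanics described above concrete, here is a rough Python model of left-to-right sliding-window modular exponentiation (an illustration of the algorithm only; it mirrors the shape of the C code in this patch, not its details):

    def pow_sliding_window(a, b, c, width=5):
        # Compute a**b % c with a k-ary sliding window (HAC Algorithm 14.85).
        assert b > 0 and c > 1
        a %= c
        a2 = a * a % c
        table = [a]                      # table[i] == a**(2*i + 1) % c
        for _ in range((1 << (width - 1)) - 1):
            table.append(table[-1] * a2 % c)
        z = 1
        bits = bin(b)[2:]
        i = 0
        while i < len(bits):
            if bits[i] == '0':           # absorb runs of zero bits: just square
                z = z * z % c
                i += 1
                continue
            window = bits[i:i + width]   # up to `width` bits, starting at a 1 bit
            while window[-1] == '0':     # slide: make the window end on a 1 bit
                window = window[:-1]
            for _ in window:             # one squaring per bit consumed
                z = z * z % c
            z = z * table[int(window, 2) >> 1] % c   # multiply by the odd power
            i += len(window)
        return z

    assert pow_sliding_window(7, 123456789, 1000000007) == pow(7, 123456789, 1000000007)
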
+ */ + int pending = 0, blen = 0; +#define ABSORB_PENDING do { \ + int ntz = 0; /* number of trailing zeroes in `pending` */ \ + assert(pending && blen); \ + assert(pending >> (blen - 1)); \ + assert(pending >> blen == 0); \ + while ((pending & 1) == 0) { \ + ++ntz; \ + pending >>= 1; \ + } \ + assert(ntz < blen); \ + blen -= ntz; \ + do { \ + MULT(z, z, z); \ + } while (--blen); \ + MULT(z, table[pending >> 1], z); \ + while (ntz-- > 0) \ + MULT(z, z, z); \ + assert(blen == 0); \ + pending = 0; \ + } while(0) for (i = Py_SIZE(b) - 1; i >= 0; --i) { const digit bi = b->ob_digit[i]; - - for (j = PyLong_SHIFT - 5; j >= 0; j -= 5) { - const int index = (bi >> j) & 0x1f; - for (k = 0; k < 5; ++k) + for (j = PyLong_SHIFT - 1; j >= 0; --j) { + const int bit = (bi >> j) & 1; + pending = (pending << 1) | bit; + if (pending) { + ++blen; + if (blen == EXP_WINDOW_SIZE) + ABSORB_PENDING; + } + else /* absorb strings of 0 bits */ MULT(z, z, z); - if (index) - MULT(z, table[index], z); } } + if (pending) + ABSORB_PENDING; } if (negativeOutput && (Py_SIZE(z) != 0)) { @@ -4399,13 +4458,14 @@ long_pow(PyObject *v, PyObject *w, PyObject *x) Py_CLEAR(z); /* fall through */ Done: - if (Py_SIZE(b) > FIVEARY_CUTOFF) { - for (i = 0; i < 32; ++i) + if (Py_SIZE(b) > HUGE_EXP_CUTOFF / PyLong_SHIFT) { + for (i = 0; i < EXP_TABLE_LEN; ++i) Py_XDECREF(table[i]); } Py_DECREF(a); Py_DECREF(b); Py_XDECREF(c); + Py_XDECREF(a2); Py_XDECREF(temp); return (PyObject *)z; } From webhook-mailer at python.org Sun Jan 2 15:08:53 2022 From: webhook-mailer at python.org (miss-islington) Date: Sun, 02 Jan 2022 20:08:53 -0000 Subject: [Python-checkins] Update copyright year to 2022. (GH-30335) Message-ID: https://github.com/python/cpython/commit/ba00f0d93a4aea85ae8089f139856a7c450584d7 commit: ba00f0d93a4aea85ae8089f139856a7c450584d7 branch: main author: Benjamin Peterson committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-02T12:08:48-08:00 summary: Update copyright year to 2022. (GH-30335) Automerge-Triggered-By: GH:benjaminp files: M Doc/copyright.rst M Doc/license.rst M LICENSE M Mac/IDLE/IDLE.app/Contents/Info.plist M Mac/PythonLauncher/Info.plist.in M Mac/Resources/app/Info.plist.in M PC/python_ver_rc.h M Python/getcopyright.c M README.rst diff --git a/Doc/copyright.rst b/Doc/copyright.rst index 4191c0bb63a2c..e64a49328b472 100644 --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Copyright Python and this documentation is: -Copyright ? 2001-2021 Python Software Foundation. All rights reserved. +Copyright ? 2001-2022 Python Software Foundation. All rights reserved. Copyright ? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst index cd03411d6c94a..e0ca5f2662dc1 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -100,7 +100,7 @@ PSF LICENSE AGREEMENT FOR PYTHON |release| analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ? 2001-2021 Python Software Foundation; All Rights + copyright, i.e., "Copyright ? 2001-2022 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. 
diff --git a/LICENSE b/LICENSE index 55cb8d37e5219..02a5145f0e385 100644 --- a/LICENSE +++ b/LICENSE @@ -84,7 +84,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021 Python Software Foundation; +2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist index f6b5cfe8d5451..d197c77ed4b1a 100644 --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2021 Python Software Foundation + %version%, ? 2001-2022 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 3d8bc3e4154ee..70f215d07249b 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable Python Launcher CFBundleGetInfoString - %VERSION%, ? 2001-2021 Python Software Foundation + %VERSION%, ? 2001-2022 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in index 2c801332332b3..84843b734e3d6 100644 --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2001-2021 Python Software Foundation. + %version%, (c) 2001-2022 Python Software Foundation. CFBundleName Python CFBundlePackageType diff --git a/PC/python_ver_rc.h b/PC/python_ver_rc.h index 90fc6ba1a1460..e6c1d24370415 100644 --- a/PC/python_ver_rc.h +++ b/PC/python_ver_rc.h @@ -5,7 +5,7 @@ #include "winver.h" #define PYTHON_COMPANY "Python Software Foundation" -#define PYTHON_COPYRIGHT "Copyright \xA9 2001-2021 Python Software Foundation. Copyright \xA9 2000 BeOpen.com. Copyright \xA9 1995-2001 CNRI. Copyright \xA9 1991-1995 SMC." +#define PYTHON_COPYRIGHT "Copyright \xA9 2001-2022 Python Software Foundation. Copyright \xA9 2000 BeOpen.com. Copyright \xA9 1995-2001 CNRI. Copyright \xA9 1991-1995 SMC." #define MS_WINDOWS #include "modsupport.h" diff --git a/Python/getcopyright.c b/Python/getcopyright.c index 7fdeb314d5261..88d1d0536253a 100644 --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2021 Python Software Foundation.\n\ +Copyright (c) 2001-2022 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README.rst b/README.rst index d4b6621a80554..ff9d7858fd5bf 100644 --- a/README.rst +++ b/README.rst @@ -14,7 +14,7 @@ This is Python version 3.11.0 alpha 3 :target: https://discuss.python.org/ -Copyright (c) 2001-2021 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2022 Python Software Foundation. All rights reserved. See the end of this file for further copyright and license information. @@ -243,7 +243,7 @@ See :pep:`664` for Python 3.11 release details. 
Copyright and License Information --------------------------------- -Copyright (c) 2001-2021 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2022 Python Software Foundation. All rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. From webhook-mailer at python.org Sun Jan 2 15:16:33 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 02 Jan 2022 20:16:33 -0000 Subject: [Python-checkins] argparse docs: prog default is the basename of argv[0] (GH-30298) Message-ID: https://github.com/python/cpython/commit/8e75c6b49b7cb8515b917f01b32ece8c8ea2c0a0 commit: 8e75c6b49b7cb8515b917f01b32ece8c8ea2c0a0 branch: main author: Jade Lovelace committer: rhettinger date: 2022-01-02T12:16:25-08:00 summary: argparse docs: prog default is the basename of argv[0] (GH-30298) files: M Doc/library/argparse.rst M Lib/argparse.py diff --git a/Doc/library/argparse.rst b/Doc/library/argparse.rst index 80c382a981b8d..e050d6298b6ff 100644 --- a/Doc/library/argparse.rst +++ b/Doc/library/argparse.rst @@ -148,7 +148,8 @@ ArgumentParser objects as keyword arguments. Each parameter has its own more detailed description below, but in short they are: - * prog_ - The name of the program (default: ``sys.argv[0]``) + * prog_ - The name of the program (default: + ``os.path.basename(sys.argv[0])``) * usage_ - The string describing the program usage (default: generated from arguments added to parser) diff --git a/Lib/argparse.py b/Lib/argparse.py index de95eedbee0ee..1529d9e768737 100644 --- a/Lib/argparse.py +++ b/Lib/argparse.py @@ -1691,7 +1691,8 @@ class ArgumentParser(_AttributeHolder, _ActionsContainer): """Object for parsing command line strings into Python objects. Keyword Arguments: - - prog -- The name of the program (default: sys.argv[0]) + - prog -- The name of the program (default: + ``os.path.basename(sys.argv[0])``) - usage -- A usage message (default: auto-generated from arguments) - description -- A description of what the program does - epilog -- Text following the argument descriptions From webhook-mailer at python.org Sun Jan 2 15:34:45 2022 From: webhook-mailer at python.org (miss-islington) Date: Sun, 02 Jan 2022 20:34:45 -0000 Subject: [Python-checkins] Update copyright year to 2022. (GH-30335) Message-ID: https://github.com/python/cpython/commit/17c858e3318638f01c163aa92d9c990ae03ca214 commit: 17c858e3318638f01c163aa92d9c990ae03ca214 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-02T12:34:36-08:00 summary: Update copyright year to 2022. (GH-30335) Automerge-Triggered-By: GH:benjaminp (cherry picked from commit ba00f0d93a4aea85ae8089f139856a7c450584d7) Co-authored-by: Benjamin Peterson files: M Doc/copyright.rst M Doc/license.rst M LICENSE M Mac/IDLE/IDLE.app/Contents/Info.plist M Mac/PythonLauncher/Info.plist.in M Mac/Resources/app/Info.plist.in M PC/python_ver_rc.h M Python/getcopyright.c M README.rst diff --git a/Doc/copyright.rst b/Doc/copyright.rst index 4191c0bb63a2c..e64a49328b472 100644 --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Copyright Python and this documentation is: -Copyright ? 2001-2021 Python Software Foundation. All rights reserved. +Copyright ? 2001-2022 Python Software Foundation. All rights reserved. Copyright ? 2000 BeOpen.com. All rights reserved. 
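
Circling back to the argparse documentation tweak in GH-30298 above, a small sketch of what that default means in practice (the script name here is made up for illustration):

    # saved as greet.py and invoked as:  python /some/long/path/greet.py
    import argparse

    parser = argparse.ArgumentParser()   # no prog= given
    print(parser.prog)                   # -> 'greet.py' (basename only, not the full path)
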
diff --git a/Doc/license.rst b/Doc/license.rst index 4ab04f34dd199..842cf03a31c65 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -100,7 +100,7 @@ PSF LICENSE AGREEMENT FOR PYTHON |release| analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ? 2001-2021 Python Software Foundation; All Rights + copyright, i.e., "Copyright ? 2001-2022 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. diff --git a/LICENSE b/LICENSE index 473861da1be7c..739c90c284001 100644 --- a/LICENSE +++ b/LICENSE @@ -84,7 +84,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021 Python Software Foundation; +2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist index f6b5cfe8d5451..d197c77ed4b1a 100644 --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2021 Python Software Foundation + %version%, ? 2001-2022 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 3d8bc3e4154ee..70f215d07249b 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable Python Launcher CFBundleGetInfoString - %VERSION%, ? 2001-2021 Python Software Foundation + %VERSION%, ? 2001-2022 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in index 2c801332332b3..84843b734e3d6 100644 --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2001-2021 Python Software Foundation. + %version%, (c) 2001-2022 Python Software Foundation. CFBundleName Python CFBundlePackageType diff --git a/PC/python_ver_rc.h b/PC/python_ver_rc.h index 90fc6ba1a1460..e6c1d24370415 100644 --- a/PC/python_ver_rc.h +++ b/PC/python_ver_rc.h @@ -5,7 +5,7 @@ #include "winver.h" #define PYTHON_COMPANY "Python Software Foundation" -#define PYTHON_COPYRIGHT "Copyright \xA9 2001-2021 Python Software Foundation. Copyright \xA9 2000 BeOpen.com. Copyright \xA9 1995-2001 CNRI. Copyright \xA9 1991-1995 SMC." +#define PYTHON_COPYRIGHT "Copyright \xA9 2001-2022 Python Software Foundation. Copyright \xA9 2000 BeOpen.com. Copyright \xA9 1995-2001 CNRI. Copyright \xA9 1991-1995 SMC." 
#define MS_WINDOWS #include "modsupport.h" diff --git a/Python/getcopyright.c b/Python/getcopyright.c index 7fdeb314d5261..88d1d0536253a 100644 --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2021 Python Software Foundation.\n\ +Copyright (c) 2001-2022 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README.rst b/README.rst index b808463b71271..9fb0d243b4986 100644 --- a/README.rst +++ b/README.rst @@ -22,7 +22,7 @@ This is Python version 3.9.9 :target: https://discuss.python.org/ -Copyright (c) 2001-2021 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2022 Python Software Foundation. All rights reserved. See the end of this file for further copyright and license information. @@ -251,7 +251,7 @@ See :pep:`596` for Python 3.9 release details. Copyright and License Information --------------------------------- -Copyright (c) 2001-2021 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2022 Python Software Foundation. All rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. From webhook-mailer at python.org Sun Jan 2 16:13:12 2022 From: webhook-mailer at python.org (miss-islington) Date: Sun, 02 Jan 2022 21:13:12 -0000 Subject: [Python-checkins] [3.10] Update copyright year to 2022. (GH-30335) (GH-30336) Message-ID: https://github.com/python/cpython/commit/35955e4adec4dd09127af93f9413d46889a3c475 commit: 35955e4adec4dd09127af93f9413d46889a3c475 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-02T13:13:04-08:00 summary: [3.10] Update copyright year to 2022. (GH-30335) (GH-30336) Automerge-Triggered-By: GH:benjaminp (cherry picked from commit ba00f0d93a4aea85ae8089f139856a7c450584d7) Co-authored-by: Benjamin Peterson files: M Doc/copyright.rst M Doc/license.rst M LICENSE M Mac/IDLE/IDLE.app/Contents/Info.plist M Mac/PythonLauncher/Info.plist.in M Mac/Resources/app/Info.plist.in M PC/python_ver_rc.h M Python/getcopyright.c M README.rst diff --git a/Doc/copyright.rst b/Doc/copyright.rst index 4191c0bb63a2c..e64a49328b472 100644 --- a/Doc/copyright.rst +++ b/Doc/copyright.rst @@ -4,7 +4,7 @@ Copyright Python and this documentation is: -Copyright ? 2001-2021 Python Software Foundation. All rights reserved. +Copyright ? 2001-2022 Python Software Foundation. All rights reserved. Copyright ? 2000 BeOpen.com. All rights reserved. diff --git a/Doc/license.rst b/Doc/license.rst index cd03411d6c94a..e0ca5f2662dc1 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -100,7 +100,7 @@ PSF LICENSE AGREEMENT FOR PYTHON |release| analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python |release| alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of - copyright, i.e., "Copyright ? 2001-2021 Python Software Foundation; All Rights + copyright, i.e., "Copyright ? 2001-2022 Python Software Foundation; All Rights Reserved" are retained in Python |release| alone or in any derivative version prepared by Licensee. 
diff --git a/LICENSE b/LICENSE index 55cb8d37e5219..02a5145f0e385 100644 --- a/LICENSE +++ b/LICENSE @@ -84,7 +84,7 @@ analyze, test, perform and/or display publicly, prepare derivative works, distribute, and otherwise use Python alone or in any derivative version, provided, however, that PSF's License Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021 Python Software Foundation; +2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Python Software Foundation; All Rights Reserved" are retained in Python alone or in any derivative version prepared by Licensee. diff --git a/Mac/IDLE/IDLE.app/Contents/Info.plist b/Mac/IDLE/IDLE.app/Contents/Info.plist index f6b5cfe8d5451..d197c77ed4b1a 100644 --- a/Mac/IDLE/IDLE.app/Contents/Info.plist +++ b/Mac/IDLE/IDLE.app/Contents/Info.plist @@ -36,7 +36,7 @@ CFBundleExecutable IDLE CFBundleGetInfoString - %version%, ? 2001-2021 Python Software Foundation + %version%, ? 2001-2022 Python Software Foundation CFBundleIconFile IDLE.icns CFBundleIdentifier diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 3d8bc3e4154ee..70f215d07249b 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -40,7 +40,7 @@ CFBundleExecutable Python Launcher CFBundleGetInfoString - %VERSION%, ? 2001-2021 Python Software Foundation + %VERSION%, ? 2001-2022 Python Software Foundation CFBundleIconFile PythonLauncher.icns CFBundleIdentifier diff --git a/Mac/Resources/app/Info.plist.in b/Mac/Resources/app/Info.plist.in index 2c801332332b3..84843b734e3d6 100644 --- a/Mac/Resources/app/Info.plist.in +++ b/Mac/Resources/app/Info.plist.in @@ -37,7 +37,7 @@ CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString - %version%, (c) 2001-2021 Python Software Foundation. + %version%, (c) 2001-2022 Python Software Foundation. CFBundleName Python CFBundlePackageType diff --git a/PC/python_ver_rc.h b/PC/python_ver_rc.h index 90fc6ba1a1460..e6c1d24370415 100644 --- a/PC/python_ver_rc.h +++ b/PC/python_ver_rc.h @@ -5,7 +5,7 @@ #include "winver.h" #define PYTHON_COMPANY "Python Software Foundation" -#define PYTHON_COPYRIGHT "Copyright \xA9 2001-2021 Python Software Foundation. Copyright \xA9 2000 BeOpen.com. Copyright \xA9 1995-2001 CNRI. Copyright \xA9 1991-1995 SMC." +#define PYTHON_COPYRIGHT "Copyright \xA9 2001-2022 Python Software Foundation. Copyright \xA9 2000 BeOpen.com. Copyright \xA9 1995-2001 CNRI. Copyright \xA9 1991-1995 SMC." #define MS_WINDOWS #include "modsupport.h" diff --git a/Python/getcopyright.c b/Python/getcopyright.c index 7fdeb314d5261..88d1d0536253a 100644 --- a/Python/getcopyright.c +++ b/Python/getcopyright.c @@ -4,7 +4,7 @@ static const char cprt[] = "\ -Copyright (c) 2001-2021 Python Software Foundation.\n\ +Copyright (c) 2001-2022 Python Software Foundation.\n\ All Rights Reserved.\n\ \n\ Copyright (c) 2000 BeOpen.com.\n\ diff --git a/README.rst b/README.rst index d2189e5c361e6..d98a2ad9a3dae 100644 --- a/README.rst +++ b/README.rst @@ -18,7 +18,7 @@ This is Python version 3.10.1 :target: https://discuss.python.org/ -Copyright (c) 2001-2021 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2022 Python Software Foundation. All rights reserved. See the end of this file for further copyright and license information. @@ -247,7 +247,7 @@ See :pep:`619` for Python 3.10 release details. 
Copyright and License Information --------------------------------- -Copyright (c) 2001-2021 Python Software Foundation. All rights reserved. +Copyright (c) 2001-2022 Python Software Foundation. All rights reserved. Copyright (c) 2000 BeOpen.com. All rights reserved. From webhook-mailer at python.org Sun Jan 2 16:29:54 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 02 Jan 2022 21:29:54 -0000 Subject: [Python-checkins] argparse docs: prog default is the basename of argv[0] (GH-30298) (GH-30339) Message-ID: https://github.com/python/cpython/commit/74af713538463c9881e27b58bc4dbd67712c53f8 commit: 74af713538463c9881e27b58bc4dbd67712c53f8 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: rhettinger date: 2022-01-02T13:29:35-08:00 summary: argparse docs: prog default is the basename of argv[0] (GH-30298) (GH-30339) files: M Doc/library/argparse.rst M Lib/argparse.py diff --git a/Doc/library/argparse.rst b/Doc/library/argparse.rst index a056826774c5e..d853d2afbe372 100644 --- a/Doc/library/argparse.rst +++ b/Doc/library/argparse.rst @@ -148,7 +148,8 @@ ArgumentParser objects as keyword arguments. Each parameter has its own more detailed description below, but in short they are: - * prog_ - The name of the program (default: ``sys.argv[0]``) + * prog_ - The name of the program (default: + ``os.path.basename(sys.argv[0])``) * usage_ - The string describing the program usage (default: generated from arguments added to parser) diff --git a/Lib/argparse.py b/Lib/argparse.py index b2db312b8fdfd..e177e4fe034d3 100644 --- a/Lib/argparse.py +++ b/Lib/argparse.py @@ -1672,7 +1672,8 @@ class ArgumentParser(_AttributeHolder, _ActionsContainer): """Object for parsing command line strings into Python objects. Keyword Arguments: - - prog -- The name of the program (default: sys.argv[0]) + - prog -- The name of the program (default: + ``os.path.basename(sys.argv[0])``) - usage -- A usage message (default: auto-generated from arguments) - description -- A description of what the program does - epilog -- Text following the argument descriptions From webhook-mailer at python.org Sun Jan 2 18:00:17 2022 From: webhook-mailer at python.org (ned-deily) Date: Sun, 02 Jan 2022 23:00:17 -0000 Subject: [Python-checkins] bpo-41028: use generic version links in Docs index. Message-ID: https://github.com/python/cpython/commit/811f65ba263140b6ba28151246b52efe149a6382 commit: 811f65ba263140b6ba28151246b52efe149a6382 branch: 3.7 author: Ned Deily committer: ned-deily date: 2022-01-02T17:55:22-05:00 summary: bpo-41028: use generic version links in Docs index. files: M Doc/tools/static/switchers.js M Doc/tools/templates/indexsidebar.html diff --git a/Doc/tools/static/switchers.js b/Doc/tools/static/switchers.js index 1a1c7d0fa57e2..e6949ab842dd5 100644 --- a/Doc/tools/static/switchers.js +++ b/Doc/tools/static/switchers.js @@ -10,8 +10,9 @@ '(?:release/\\d.\\d[\\x\\d\\.]*)']; var all_versions = { - '3.10': 'dev (3.10)', - '3.9': 'pre (3.9)', + '3.11': 'dev (3.11)', + '3.10': '3.10', + '3.9': '3.9', '3.8': '3.8', '3.7': '3.7', '3.6': '3.6', diff --git a/Doc/tools/templates/indexsidebar.html b/Doc/tools/templates/indexsidebar.html index 2ef90f094ddaa..f7bf6d8e49117 100644 --- a/Doc/tools/templates/indexsidebar.html +++ b/Doc/tools/templates/indexsidebar.html @@ -2,12 +2,8 @@

{% trans %}Download{% endtrans %}

{% trans %}Download these documents{% endtrans %}

{% trans %}Docs by version{% endtrans %}

@@ -18,4 +14,5 @@

{% trans %}Other resources{% endtrans %}

  • {% trans %}Beginner's Guide{% endtrans %}
  • {% trans %}Book List{% endtrans %}
  • {% trans %}Audio/Visual Talks{% endtrans %}
  • +
  • {% trans %}Python Developer’s Guide{% endtrans %}
  • From webhook-mailer at python.org Sun Jan 2 18:04:43 2022 From: webhook-mailer at python.org (ned-deily) Date: Sun, 02 Jan 2022 23:04:43 -0000 Subject: [Python-checkins] bpo-41028: Doc: Move switchers to docsbuild-scripts. (GH-20969) (GH-30343) Message-ID: https://github.com/python/cpython/commit/b28b0222e360669463ffe31d27c1fd374361cb23 commit: b28b0222e360669463ffe31d27c1fd374361cb23 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: ned-deily date: 2022-01-02T18:04:33-05:00 summary: bpo-41028: Doc: Move switchers to docsbuild-scripts. (GH-20969) (GH-30343) (cherry picked from commit ee2549c2ba8bae00f2b2fea8a39c6dfbd1d06520) Co-authored-by: Julien Palard files: A Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst D Doc/tools/static/switchers.js M Doc/Makefile M Doc/tools/templates/dummy.html M Doc/tools/templates/indexsidebar.html M Doc/tools/templates/layout.html diff --git a/Doc/Makefile b/Doc/Makefile index 57763fc0e103d..3cf2040f40087 100644 --- a/Doc/Makefile +++ b/Doc/Makefile @@ -223,12 +223,12 @@ serve: # for development releases: always build autobuild-dev: - make dist SPHINXOPTS='$(SPHINXOPTS) -Ea -A daily=1 -A switchers=1' + make dist SPHINXOPTS='$(SPHINXOPTS) -Ea -A daily=1' -make suspicious # for quick rebuilds (HTML only) autobuild-dev-html: - make html SPHINXOPTS='$(SPHINXOPTS) -Ea -A daily=1 -A switchers=1' + make html SPHINXOPTS='$(SPHINXOPTS) -Ea -A daily=1' # for stable releases: only build if not in pre-release stage (alpha, beta) # release candidate downloads are okay, since the stable tree can be in that stage diff --git a/Doc/tools/static/switchers.js b/Doc/tools/static/switchers.js deleted file mode 100644 index 1a1c7d0fa57e2..0000000000000 --- a/Doc/tools/static/switchers.js +++ /dev/null @@ -1,156 +0,0 @@ -(function() { - 'use strict'; - - // Parses versions in URL segments like: - // "3", "dev", "release/2.7" or "3.6rc2" - var version_regexs = [ - '(?:\\d)', - '(?:\\d\\.\\d[\\w\\d\\.]*)', - '(?:dev)', - '(?:release/\\d.\\d[\\x\\d\\.]*)']; - - var all_versions = { - '3.10': 'dev (3.10)', - '3.9': 'pre (3.9)', - '3.8': '3.8', - '3.7': '3.7', - '3.6': '3.6', - '2.7': '2.7', - }; - - var all_languages = { - 'en': 'English', - 'fr': 'French', - 'ja': 'Japanese', - 'ko': 'Korean', - 'pt-br': 'Brazilian Portuguese', - 'zh-cn': 'Simplified Chinese', - }; - - function build_version_select(current_version, current_release) { - var buf = [''); - return buf.join(''); - } - - function build_language_select(current_language) { - var buf = [''); - return buf.join(''); - } - - function navigate_to_first_existing(urls) { - // Navigate to the first existing URL in urls. 
- var url = urls.shift(); - if (urls.length == 0) { - window.location.href = url; - return; - } - $.ajax({ - url: url, - success: function() { - window.location.href = url; - }, - error: function() { - navigate_to_first_existing(urls); - } - }); - } - - function on_version_switch() { - var selected_version = $(this).children('option:selected').attr('value') + '/'; - var url = window.location.href; - var current_language = language_segment_from_url(url); - var current_version = version_segment_in_url(url); - var new_url = url.replace('.org/' + current_language + current_version, - '.org/' + current_language + selected_version); - if (new_url != url) { - navigate_to_first_existing([ - new_url, - url.replace('.org/' + current_language + current_version, - '.org/' + selected_version), - 'https://docs.python.org/' + current_language + selected_version, - 'https://docs.python.org/' + selected_version, - 'https://docs.python.org/' - ]); - } - } - - function on_language_switch() { - var selected_language = $(this).children('option:selected').attr('value') + '/'; - var url = window.location.href; - var current_language = language_segment_from_url(url); - var current_version = version_segment_in_url(url); - if (selected_language == 'en/') // Special 'default' case for english. - selected_language = ''; - var new_url = url.replace('.org/' + current_language + current_version, - '.org/' + selected_language + current_version); - if (new_url != url) { - navigate_to_first_existing([ - new_url, - 'https://docs.python.org/' - ]); - } - } - - // Returns the path segment of the language as a string, like 'fr/' - // or '' if not found. - function language_segment_from_url(url) { - var language_regexp = '\.org/([a-z]{2}(?:-[a-z]{2})?/)'; - var match = url.match(language_regexp); - if (match !== null) - return match[1]; - return ''; - } - - // Returns the path segment of the version as a string, like '3.6/' - // or '' if not found. 
- function version_segment_in_url(url) { - var language_segment = '(?:[a-z]{2}(?:-[a-z]{2})?/)'; - var version_segment = '(?:(?:' + version_regexs.join('|') + ')/)'; - var version_regexp = '\\.org/' + language_segment + '?(' + version_segment + ')'; - var match = url.match(version_regexp); - if (match !== null) - return match[1]; - return '' - } - - $(document).ready(function() { - var release = DOCUMENTATION_OPTIONS.VERSION; - var language_segment = language_segment_from_url(window.location.href); - var current_language = language_segment.replace(/\/+$/g, '') || 'en'; - var version = release.substr(0, 3); - var version_select = build_version_select(version, release); - - $('.version_switcher_placeholder').html(version_select); - $('.version_switcher_placeholder select').bind('change', on_version_switch); - - var language_select = build_language_select(current_language); - - $('.language_switcher_placeholder').html(language_select); - $('.language_switcher_placeholder select').bind('change', on_language_switch); - }); -})(); diff --git a/Doc/tools/templates/dummy.html b/Doc/tools/templates/dummy.html index 68ae3ad148ec2..3438b44377fcb 100644 --- a/Doc/tools/templates/dummy.html +++ b/Doc/tools/templates/dummy.html @@ -6,3 +6,12 @@ {% trans %}CPython implementation detail:{% endtrans %} {% trans %}Deprecated since version {deprecated}, will be removed in version {removed}{% endtrans %} {% trans %}Deprecated since version {deprecated}, removed in version {removed}{% endtrans %} + + +In docsbuild-scripts, when rewriting indexsidebar.html with actual versions: + +{% trans %}in development{% endtrans %} +{% trans %}pre-release{% endtrans %} +{% trans %}stable{% endtrans %} +{% trans %}security-fixes{% endtrans %} +{% trans %}EOL{% endtrans %} diff --git a/Doc/tools/templates/indexsidebar.html b/Doc/tools/templates/indexsidebar.html index 1c1cb5484a4f6..f7bf6d8e49117 100644 --- a/Doc/tools/templates/indexsidebar.html +++ b/Doc/tools/templates/indexsidebar.html @@ -2,12 +2,8 @@

    {% trans %}Download{% endtrans %}

    {% trans %}Download these documents{% endtrans %}

    {% trans %}Docs by version{% endtrans %}

    diff --git a/Doc/tools/templates/layout.html b/Doc/tools/templates/layout.html index 17592d74a4eb5..98ccf4224804b 100644 --- a/Doc/tools/templates/layout.html +++ b/Doc/tools/templates/layout.html @@ -12,22 +12,14 @@ {% block rootrellink %} {{ super() }} -
  • - {%- if switchers is defined %} - {{ language or 'en' }} - {{ release }} - {% trans %}Documentation {% endtrans %}{{ reldelim1 }} - {%- else %} +
  • {{ shorttitle }}{{ reldelim1 }} - {%- endif %}
  • {% endblock %} {% block extrahead %} {% if builder != "htmlhelp" %} - {% if switchers is defined and not embedded %} - {% endif %} {% if pagename == 'whatsnew/changelog' and not embedded %} {% endif %} {% endif %} diff --git a/Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst b/Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst new file mode 100644 index 0000000000000..5fc4155b55346 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst @@ -0,0 +1,2 @@ +Language and version switchers, previously maintained in every cpython +branches, are now handled by docsbuild-script. From webhook-mailer at python.org Sun Jan 2 18:22:47 2022 From: webhook-mailer at python.org (iritkatriel) Date: Sun, 02 Jan 2022 23:22:47 -0000 Subject: [Python-checkins] bpo-46219, 46221: simplify except* implementation following exc_info changes. Move helpers to exceptions.c. Do not assume that exception groups are truthy. (GH-30289) Message-ID: https://github.com/python/cpython/commit/65e7c1f90e9136fc61f4af029b065d9f6c5664c3 commit: 65e7c1f90e9136fc61f4af029b065d9f6c5664c3 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-02T23:22:42Z summary: bpo-46219, 46221: simplify except* implementation following exc_info changes. Move helpers to exceptions.c. Do not assume that exception groups are truthy. (GH-30289) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst M .github/CODEOWNERS M Doc/library/dis.rst M Include/internal/pycore_pyerrors.h M Lib/importlib/_bootstrap_external.py M Lib/test/test_except_star.py M Objects/exceptions.c M Python/ceval.c M Python/compile.c diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index ce5121e7ac8f8..f484664f7b712 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -21,6 +21,14 @@ Python/ceval.c @markshannon Python/compile.c @markshannon Python/ast_opt.c @isidentical +# Exceptions +Lib/traceback.py @iritkatriel +Lib/test/test_except*.py @iritkatriel +Lib/test/test_traceback.py @iritkatriel +Objects/exceptions.c @iritkatriel +Python/traceback.c @iritkatriel +Python/pythonrun.c @iritkatriel + # Hashing **/*hashlib* @python/crypto-team @tiran **/*pyhash* @python/crypto-team @tiran diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index ffade3c9bfe7c..14de191265cf2 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -911,8 +911,8 @@ All of the following opcodes use their arguments. Combines the raised and reraised exceptions list from TOS, into an exception group to propagate from a try-except* block. Uses the original exception group from TOS1 to reconstruct the structure of reraised exceptions. Pops - two items from the stack and pushes 0 (for lasti, which is unused) followed - by the exception to reraise or ``None`` if there isn't one. + two items from the stack and pushes the exception to reraise or ``None`` + if there isn't one. .. 
versionadded:: 3.11 diff --git a/Include/internal/pycore_pyerrors.h b/Include/internal/pycore_pyerrors.h index b9fc36cf06760..f375337a405bb 100644 --- a/Include/internal/pycore_pyerrors.h +++ b/Include/internal/pycore_pyerrors.h @@ -87,9 +87,9 @@ PyAPI_FUNC(PyObject *) _PyExc_CreateExceptionGroup( const char *msg, PyObject *excs); -PyAPI_FUNC(PyObject *) _PyExc_ExceptionGroupProjection( - PyObject *left, - PyObject *right); +PyAPI_FUNC(PyObject *) _PyExc_PrepReraiseStar( + PyObject *orig, + PyObject *excs); PyAPI_FUNC(int) _PyErr_CheckSignalsTstate(PyThreadState *tstate); diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 5ead6caf9f3c7..095c1274bebaf 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -375,6 +375,7 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3467 (Change CALL_xxx opcodes) # Python 3.11a4 3468 (Add SEND opcode) # Python 3.11a4 3469 (bpo-45711: remove type, traceback from exc_info) +# Python 3.11a4 3470 (bpo-46221: PREP_RERAISE_STAR no longer pushes lasti) # # MAGIC must change whenever the bytecode emitted by the compiler may no @@ -384,7 +385,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. -MAGIC_NUMBER = (3469).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3470).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/test/test_except_star.py b/Lib/test/test_except_star.py index b03de9c1de32e..490b159e3b71b 100644 --- a/Lib/test/test_except_star.py +++ b/Lib/test/test_except_star.py @@ -952,6 +952,34 @@ def derive(self, excs): self.assertEqual(teg.code, 42) self.assertEqual(teg.exceptions[0].code, 101) + def test_falsy_exception_group_subclass(self): + class FalsyEG(ExceptionGroup): + def __bool__(self): + return False + + def derive(self, excs): + return FalsyEG(self.message, excs) + + try: + try: + raise FalsyEG("eg", [TypeError(1), ValueError(2)]) + except *TypeError as e: + tes = e + raise + except *ValueError as e: + ves = e + pass + except Exception as e: + exc = e + + for e in [tes, ves, exc]: + self.assertFalse(e) + self.assertIsInstance(e, FalsyEG) + + self.assertExceptionIsLike(exc, FalsyEG("eg", [TypeError(1)])) + self.assertExceptionIsLike(tes, FalsyEG("eg", [TypeError(1)])) + self.assertExceptionIsLike(ves, FalsyEG("eg", [ValueError(2)])) + class TestExceptStarCleanup(ExceptStarTest): def test_exc_info_restored(self): diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst new file mode 100644 index 0000000000000..0cb3e90a28d75 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst @@ -0,0 +1 @@ +:opcode:`PREP_RERAISE_STAR` no longer pushes ``lasti`` to the stack. diff --git a/Objects/exceptions.c b/Objects/exceptions.c index d82340b8e78ab..403d2d4a3fddf 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -1207,8 +1207,8 @@ collect_exception_group_leaves(PyObject *exc, PyObject *leaves) * of eg which contains all leaf exceptions that are contained * in any exception group in keep. 
*/ -PyObject * -_PyExc_ExceptionGroupProjection(PyObject *eg, PyObject *keep) +static PyObject * +exception_group_projection(PyObject *eg, PyObject *keep) { assert(_PyBaseExceptionGroup_Check(eg)); assert(PyList_CheckExact(keep)); @@ -1245,6 +1245,122 @@ _PyExc_ExceptionGroupProjection(PyObject *eg, PyObject *keep) return result; } +static bool +is_same_exception_metadata(PyObject *exc1, PyObject *exc2) +{ + assert(PyExceptionInstance_Check(exc1)); + assert(PyExceptionInstance_Check(exc2)); + + PyBaseExceptionObject *e1 = (PyBaseExceptionObject *)exc1; + PyBaseExceptionObject *e2 = (PyBaseExceptionObject *)exc2; + + return (e1->note == e2->note && + e1->traceback == e2->traceback && + e1->cause == e2->cause && + e1->context == e2->context); +} + +/* + This function is used by the interpreter to calculate + the exception group to be raised at the end of a + try-except* construct. + + orig: the original except that was caught. + excs: a list of exceptions that were raised/reraised + in the except* clauses. + + Calculates an exception group to raise. It contains + all exceptions in excs, where those that were reraised + have same nesting structure as in orig, and those that + were raised (if any) are added as siblings in a new EG. + + Returns NULL and sets an exception on failure. +*/ +PyObject * +_PyExc_PrepReraiseStar(PyObject *orig, PyObject *excs) +{ + assert(PyExceptionInstance_Check(orig)); + assert(PyList_Check(excs)); + + Py_ssize_t numexcs = PyList_GET_SIZE(excs); + + if (numexcs == 0) { + return Py_NewRef(Py_None); + } + + if (!_PyBaseExceptionGroup_Check(orig)) { + /* a naked exception was caught and wrapped. Only one except* clause + * could have executed,so there is at most one exception to raise. + */ + + assert(numexcs == 1 || (numexcs == 2 && PyList_GET_ITEM(excs, 1) == Py_None)); + + PyObject *e = PyList_GET_ITEM(excs, 0); + assert(e != NULL); + return Py_NewRef(e); + } + + PyObject *raised_list = PyList_New(0); + if (raised_list == NULL) { + return NULL; + } + PyObject* reraised_list = PyList_New(0); + if (reraised_list == NULL) { + Py_DECREF(raised_list); + return NULL; + } + + /* Now we are holding refs to raised_list and reraised_list */ + + PyObject *result = NULL; + + /* Split excs into raised and reraised by comparing metadata with orig */ + for (Py_ssize_t i = 0; i < numexcs; i++) { + PyObject *e = PyList_GET_ITEM(excs, i); + assert(e != NULL); + if (Py_IsNone(e)) { + continue; + } + bool is_reraise = is_same_exception_metadata(e, orig); + PyObject *append_list = is_reraise ? 
reraised_list : raised_list; + if (PyList_Append(append_list, e) < 0) { + goto done; + } + } + + PyObject *reraised_eg = exception_group_projection(orig, reraised_list); + if (reraised_eg == NULL) { + goto done; + } + + if (!Py_IsNone(reraised_eg)) { + assert(is_same_exception_metadata(reraised_eg, orig)); + } + Py_ssize_t num_raised = PyList_GET_SIZE(raised_list); + if (num_raised == 0) { + result = reraised_eg; + } + else if (num_raised > 0) { + int res = 0; + if (!Py_IsNone(reraised_eg)) { + res = PyList_Append(raised_list, reraised_eg); + } + Py_DECREF(reraised_eg); + if (res < 0) { + goto done; + } + result = _PyExc_CreateExceptionGroup("", raised_list); + if (result == NULL) { + goto done; + } + } + +done: + Py_XDECREF(raised_list); + Py_XDECREF(reraised_list); + return result; +} + static PyMemberDef BaseExceptionGroup_members[] = { {"message", T_OBJECT, offsetof(PyBaseExceptionGroupObject, msg), READONLY, PyDoc_STR("exception message")}, diff --git a/Python/ceval.c b/Python/ceval.c index 9976bdeffbe96..43925e6db269c 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1092,7 +1092,6 @@ match_class(PyThreadState *tstate, PyObject *subject, PyObject *type, static int do_raise(PyThreadState *tstate, PyObject *exc, PyObject *cause); -static PyObject *do_reraise_star(PyObject *excs, PyObject *orig); static int exception_group_match( PyObject* exc_value, PyObject *match_type, PyObject **match, PyObject **rest); @@ -2777,7 +2776,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr assert(PyList_Check(excs)); PyObject *orig = POP(); - PyObject *val = do_reraise_star(excs, orig); + PyObject *val = _PyExc_PrepReraiseStar(orig, excs); Py_DECREF(excs); Py_DECREF(orig); @@ -2785,8 +2784,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr goto error; } - PyObject *lasti_unused = Py_NewRef(_PyLong_GetZero()); - PUSH(lasti_unused); PUSH(val); DISPATCH(); } @@ -6313,134 +6310,6 @@ exception_group_match(PyObject* exc_value, PyObject *match_type, return 0; } -/* Logic for the final raise/reraise of a try-except* contruct - (too complicated for inlining). -*/ - -static bool -is_same_exception_metadata(PyObject *exc1, PyObject *exc2) -{ - assert(PyExceptionInstance_Check(exc1)); - assert(PyExceptionInstance_Check(exc2)); - - PyObject *tb1 = PyException_GetTraceback(exc1); - PyObject *ctx1 = PyException_GetContext(exc1); - PyObject *cause1 = PyException_GetCause(exc1); - PyObject *tb2 = PyException_GetTraceback(exc2); - PyObject *ctx2 = PyException_GetContext(exc2); - PyObject *cause2 = PyException_GetCause(exc2); - - bool result = (Py_Is(tb1, tb2) && - Py_Is(ctx1, ctx2) && - Py_Is(cause1, cause2)); - - Py_XDECREF(tb1); - Py_XDECREF(ctx1); - Py_XDECREF(cause1); - Py_XDECREF(tb2); - Py_XDECREF(ctx2); - Py_XDECREF(cause2); - return result; -} - -/* - excs: a list of exceptions to raise/reraise - orig: the original except that was caught - - Calculates an exception group to raise. It contains - all exceptions in excs, where those that were reraised - have same nesting structure as in orig, and those that - were raised (if any) are added as siblings in a new EG. - - Returns NULL and sets an exception on failure. 
-*/ -static PyObject * -do_reraise_star(PyObject *excs, PyObject *orig) -{ - assert(PyList_Check(excs)); - assert(PyExceptionInstance_Check(orig)); - - Py_ssize_t numexcs = PyList_GET_SIZE(excs); - - if (numexcs == 0) { - return Py_NewRef(Py_None); - } - - if (!_PyBaseExceptionGroup_Check(orig)) { - /* a naked exception was caught and wrapped. Only one except* clause - * could have executed,so there is at most one exception to raise. - */ - - assert(numexcs == 1 || (numexcs == 2 && PyList_GET_ITEM(excs, 1) == Py_None)); - - PyObject *e = PyList_GET_ITEM(excs, 0); - assert(e != NULL); - return Py_NewRef(e); - } - - - PyObject *raised_list = PyList_New(0); - if (raised_list == NULL) { - return NULL; - } - PyObject* reraised_list = PyList_New(0); - if (reraised_list == NULL) { - Py_DECREF(raised_list); - return NULL; - } - - /* Now we are holding refs to raised_list and reraised_list */ - - PyObject *result = NULL; - - /* Split excs into raised and reraised by comparing metadata with orig */ - for (Py_ssize_t i = 0; i < numexcs; i++) { - PyObject *e = PyList_GET_ITEM(excs, i); - assert(e != NULL); - if (Py_IsNone(e)) { - continue; - } - bool is_reraise = is_same_exception_metadata(e, orig); - PyObject *append_list = is_reraise ? reraised_list : raised_list; - if (PyList_Append(append_list, e) < 0) { - goto done; - } - } - - PyObject *reraised_eg = _PyExc_ExceptionGroupProjection(orig, reraised_list); - if (reraised_eg == NULL) { - goto done; - } - - if (!Py_IsNone(reraised_eg)) { - assert(is_same_exception_metadata(reraised_eg, orig)); - } - - Py_ssize_t num_raised = PyList_GET_SIZE(raised_list); - if (num_raised == 0) { - result = reraised_eg; - } - else if (num_raised > 0) { - int res = 0; - if (!Py_IsNone(reraised_eg)) { - res = PyList_Append(raised_list, reraised_eg); - } - Py_DECREF(reraised_eg); - if (res < 0) { - goto done; - } - result = _PyExc_CreateExceptionGroup("", raised_list); - if (result == NULL) { - goto done; - } - } - -done: - Py_XDECREF(raised_list); - Py_XDECREF(reraised_list); - return result; -} - /* Iterate v argcnt times and store the results on the stack (via decreasing sp). Return 1 for success, 0 if error. diff --git a/Python/compile.c b/Python/compile.c index 8e90b1ab37037..48250b5dba973 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -1134,7 +1134,7 @@ stack_effect(int opcode, int oparg, int jump) return jump ? 
1 : 0; case PREP_RERAISE_STAR: - return 0; + return -1; case RERAISE: return -1; case PUSH_EXC_INFO: @@ -3488,12 +3488,16 @@ compiler_try_except(struct compiler *c, stmt_ty s) [orig, res, rest] Ln+1: LIST_APPEND 1 ) add unhandled exc to res (could be None) [orig, res] PREP_RERAISE_STAR - [i, exc] POP_JUMP_IF_TRUE RER - [i, exc] POP - [i] POP + [exc] DUP_TOP + [exc, exc] LOAD_CONST None + [exc, exc, None] COMPARE_IS + [exc, is_none] POP_JUMP_IF_FALSE RER + [exc] POP_TOP [] JUMP_FORWARD L0 - [i, exc] RER: POP_EXCEPT_AND_RERAISE + [exc] RER: ROT_TWO + [exc, prev_exc_info] POP_EXCEPT + [exc] RERAISE 0 [] L0: */ @@ -3657,18 +3661,21 @@ compiler_try_star_except(struct compiler *c, stmt_ty s) compiler_use_next_block(c, reraise_star); ADDOP(c, PREP_RERAISE_STAR); ADDOP(c, DUP_TOP); - ADDOP_JUMP(c, POP_JUMP_IF_TRUE, reraise); + ADDOP_LOAD_CONST(c, Py_None); + ADDOP_COMPARE(c, Is); + ADDOP_JUMP(c, POP_JUMP_IF_FALSE, reraise); NEXT_BLOCK(c); - /* Nothing to reraise - pop it */ - ADDOP(c, POP_TOP); + /* Nothing to reraise */ ADDOP(c, POP_TOP); ADDOP(c, POP_BLOCK); ADDOP(c, POP_EXCEPT); ADDOP_JUMP(c, JUMP_FORWARD, end); compiler_use_next_block(c, reraise); ADDOP(c, POP_BLOCK); - ADDOP(c, POP_EXCEPT_AND_RERAISE); + ADDOP(c, ROT_TWO); + ADDOP(c, POP_EXCEPT); + ADDOP_I(c, RERAISE, 0); compiler_use_next_block(c, cleanup); ADDOP(c, POP_EXCEPT_AND_RERAISE); compiler_use_next_block(c, orelse); From webhook-mailer at python.org Sun Jan 2 18:31:03 2022 From: webhook-mailer at python.org (miss-islington) Date: Sun, 02 Jan 2022 23:31:03 -0000 Subject: [Python-checkins] bpo-46229: remove `CODE_OF_CONDUCT.md` to use org default (GH-30342) Message-ID: https://github.com/python/cpython/commit/fedefa67350c72dde121f68cbe7aa70face6805e commit: fedefa67350c72dde121f68cbe7aa70face6805e branch: main author: Nikita Sobolev committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-02T15:30:54-08:00 summary: bpo-46229: remove `CODE_OF_CONDUCT.md` to use org default (GH-30342) Automerge-Triggered-By: GH:Mariatta files: D CODE_OF_CONDUCT.md diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md deleted file mode 100644 index c5f24abe2ca66..0000000000000 --- a/CODE_OF_CONDUCT.md +++ /dev/null @@ -1,12 +0,0 @@ -# Code of Conduct - -Please note that all interactions on -[Python Software Foundation](https://www.python.org/psf-landing/)-supported -infrastructure is [covered](https://www.python.org/psf/records/board/minutes/2014-01-06/#management-of-the-psfs-web-properties) -by the [PSF Code of Conduct](https://www.python.org/psf/codeofconduct/), -which includes all the infrastructure used in the development of Python itself -(e.g. mailing lists, issue trackers, GitHub, etc.). - -In general, this means that everyone is expected to be **open**, **considerate**, and -**respectful** of others no matter what their position is within the project. 
- From webhook-mailer at python.org Mon Jan 3 01:20:15 2022 From: webhook-mailer at python.org (ned-deily) Date: Mon, 03 Jan 2022 06:20:15 -0000 Subject: [Python-checkins] bpo-40477: macOS Python Launcher app fixes for recent macOS releases (GH-30348) Message-ID: https://github.com/python/cpython/commit/549e62827262264cda30455e10e315602129da72 commit: 549e62827262264cda30455e10e315602129da72 branch: main author: Ned Deily committer: ned-deily date: 2022-01-03T01:19:59-05:00 summary: bpo-40477: macOS Python Launcher app fixes for recent macOS releases (GH-30348) This change solves two problems encountered by users of the macOS Python Launcher app on recent macOS releases (10.14+): - The launcher app was no longer able to launch the macOS Terminal.app to run a script. - Even if Terminal.app was already launched, the launcher app was unable to send an Apple Event to Terminal.app to open and run Python with the desired .py file. files: A Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst M Mac/PythonLauncher/Info.plist.in M Mac/PythonLauncher/doscript.m diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 70f215d07249b..dec0a2eaf5c56 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -3,7 +3,7 @@ CFBundleDevelopmentRegion - English + en CFBundleDocumentTypes @@ -39,6 +39,8 @@ CFBundleExecutable Python Launcher + NSHumanReadableCopyright + Copyright ? 2001-2022 Python Software Foundation CFBundleGetInfoString %VERSION%, ? 2001-2022 Python Software Foundation CFBundleIconFile @@ -61,5 +63,7 @@ MainMenu NSPrincipalClass NSApplication + NSAppleEventsUsageDescription + Python Launcher uses Apple events to launch your Python script in a Terminal window. diff --git a/Mac/PythonLauncher/doscript.m b/Mac/PythonLauncher/doscript.m index cbb783ba3e893..f07326bce46fb 100644 --- a/Mac/PythonLauncher/doscript.m +++ b/Mac/PythonLauncher/doscript.m @@ -19,7 +19,7 @@ AEDesc desc; OSStatus err; - [[NSWorkspace sharedWorkspace] launchApplication:@"/Applications/Utilities/Terminal.app/"]; + [[NSWorkspace sharedWorkspace] launchApplication:@"Terminal.app"]; // Build event err = AEBuildAppleEvent(kAECoreSuite, kAEDoScript, diff --git a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst new file mode 100644 index 0000000000000..fc953b85dcc2a --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst @@ -0,0 +1,2 @@ +The Python Launcher app for macOS now properly launches scripts and, if +necessary, the Terminal app when running on recent macOS releases. From webhook-mailer at python.org Mon Jan 3 01:44:43 2022 From: webhook-mailer at python.org (ned-deily) Date: Mon, 03 Jan 2022 06:44:43 -0000 Subject: [Python-checkins] bpo-40477: macOS Python Launcher app fixes for recent macOS releases (GH-30348) (GH-30349) Message-ID: https://github.com/python/cpython/commit/b312794de0f78da15593d059f09b4071d95c0d0e commit: b312794de0f78da15593d059f09b4071d95c0d0e branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: ned-deily date: 2022-01-03T01:44:35-05:00 summary: bpo-40477: macOS Python Launcher app fixes for recent macOS releases (GH-30348) (GH-30349) This change solves two problems encountered by users of the macOS Python Launcher app on recent macOS releases (10.14+): - The launcher app was no longer able to launch the macOS Terminal.app to run a script. 
- Even if Terminal.app was already launched, the launcher app was unable to send an Apple Event to Terminal.app to open and run Python with the desired .py file. (cherry picked from commit 549e62827262264cda30455e10e315602129da72) Co-authored-by: Ned Deily files: A Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst M Mac/PythonLauncher/Info.plist.in M Mac/PythonLauncher/doscript.m diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 70f215d07249b..dec0a2eaf5c56 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -3,7 +3,7 @@ CFBundleDevelopmentRegion - English + en CFBundleDocumentTypes @@ -39,6 +39,8 @@ CFBundleExecutable Python Launcher + NSHumanReadableCopyright + Copyright ? 2001-2022 Python Software Foundation CFBundleGetInfoString %VERSION%, ? 2001-2022 Python Software Foundation CFBundleIconFile @@ -61,5 +63,7 @@ MainMenu NSPrincipalClass NSApplication + NSAppleEventsUsageDescription + Python Launcher uses Apple events to launch your Python script in a Terminal window. diff --git a/Mac/PythonLauncher/doscript.m b/Mac/PythonLauncher/doscript.m index cbb783ba3e893..f07326bce46fb 100644 --- a/Mac/PythonLauncher/doscript.m +++ b/Mac/PythonLauncher/doscript.m @@ -19,7 +19,7 @@ AEDesc desc; OSStatus err; - [[NSWorkspace sharedWorkspace] launchApplication:@"/Applications/Utilities/Terminal.app/"]; + [[NSWorkspace sharedWorkspace] launchApplication:@"Terminal.app"]; // Build event err = AEBuildAppleEvent(kAECoreSuite, kAEDoScript, diff --git a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst new file mode 100644 index 0000000000000..fc953b85dcc2a --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst @@ -0,0 +1,2 @@ +The Python Launcher app for macOS now properly launches scripts and, if +necessary, the Terminal app when running on recent macOS releases. From webhook-mailer at python.org Mon Jan 3 01:48:30 2022 From: webhook-mailer at python.org (miss-islington) Date: Mon, 03 Jan 2022 06:48:30 -0000 Subject: [Python-checkins] bpo-40477: macOS Python Launcher app fixes for recent macOS releases (GH-30348) Message-ID: https://github.com/python/cpython/commit/50da397be4f71e5c12759281446b06ce14b6a5c4 commit: 50da397be4f71e5c12759281446b06ce14b6a5c4 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-02T22:48:21-08:00 summary: bpo-40477: macOS Python Launcher app fixes for recent macOS releases (GH-30348) This change solves two problems encountered by users of the macOS Python Launcher app on recent macOS releases (10.14+): - The launcher app was no longer able to launch the macOS Terminal.app to run a script. - Even if Terminal.app was already launched, the launcher app was unable to send an Apple Event to Terminal.app to open and run Python with the desired .py file. 
(cherry picked from commit 549e62827262264cda30455e10e315602129da72) Co-authored-by: Ned Deily files: A Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst M Mac/PythonLauncher/Info.plist.in M Mac/PythonLauncher/doscript.m diff --git a/Mac/PythonLauncher/Info.plist.in b/Mac/PythonLauncher/Info.plist.in index 70f215d07249b..dec0a2eaf5c56 100644 --- a/Mac/PythonLauncher/Info.plist.in +++ b/Mac/PythonLauncher/Info.plist.in @@ -3,7 +3,7 @@ CFBundleDevelopmentRegion - English + en CFBundleDocumentTypes @@ -39,6 +39,8 @@ CFBundleExecutable Python Launcher + NSHumanReadableCopyright + Copyright © 2001-2022 Python Software Foundation CFBundleGetInfoString %VERSION%, © 2001-2022 Python Software Foundation CFBundleIconFile @@ -61,5 +63,7 @@ MainMenu NSPrincipalClass NSApplication + NSAppleEventsUsageDescription + Python Launcher uses Apple events to launch your Python script in a Terminal window. diff --git a/Mac/PythonLauncher/doscript.m b/Mac/PythonLauncher/doscript.m index cbb783ba3e893..f07326bce46fb 100644 --- a/Mac/PythonLauncher/doscript.m +++ b/Mac/PythonLauncher/doscript.m @@ -19,7 +19,7 @@ AEDesc desc; OSStatus err; - [[NSWorkspace sharedWorkspace] launchApplication:@"/Applications/Utilities/Terminal.app/"]; + [[NSWorkspace sharedWorkspace] launchApplication:@"Terminal.app"]; // Build event err = AEBuildAppleEvent(kAECoreSuite, kAEDoScript, diff --git a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst new file mode 100644 index 0000000000000..fc953b85dcc2a --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst @@ -0,0 +1,2 @@ +The Python Launcher app for macOS now properly launches scripts and, if +necessary, the Terminal app when running on recent macOS releases. From webhook-mailer at python.org Mon Jan 3 08:01:10 2022 From: webhook-mailer at python.org (asvetlov) Date: Mon, 03 Jan 2022 13:01:10 -0000 Subject: [Python-checkins] bpo-46222: posixmodule sendfile FreeBSD's constants updates. (GH-30327) Message-ID: https://github.com/python/cpython/commit/c960b191b8999a9455bb4b2c50dc224d06fee80c commit: c960b191b8999a9455bb4b2c50dc224d06fee80c branch: main author: David CARLIER committer: asvetlov date: 2022-01-03T15:01:04+02:00 summary: bpo-46222: posixmodule sendfile FreeBSD's constants updates. (GH-30327) * posixmodule sendfile FreeBSD's constants updates. * 📜🤖 Added by blurb_it. Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst M Doc/library/os.rst M Modules/posixmodule.c diff --git a/Doc/library/os.rst b/Doc/library/os.rst index 8e11c693c7c2e..3e8fc54485e20 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -1429,6 +1429,15 @@ or `the MSDN `_ on Windo .. versionadded:: 3.3 +.. data:: SF_NOCACHE + + Parameter to the :func:`sendfile` function, if the implementation supports + it. The data won't be cached in the virtual memory and will be freed afterwards. + + .. availability:: Unix. + + .. versionadded:: 3.11 + .. function:: splice(src, dst, count, offset_src=None, offset_dst=None) diff --git a/Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst b/Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst new file mode 100644 index 0000000000000..1fe28792529d0 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst @@ -0,0 +1 @@ +Adding ``SF_NOCACHE`` sendfile constant for FreeBSD for the posixmodule.
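A rough usage sketch of the documentation added above: it assumes FreeBSD with Python 3.11 or newer (the only configuration where os.SF_NOCACHE and the flags argument of os.sendfile() are expected to be available), plus a hypothetical already-connected stream socket named conn and an existing file payload.bin; it is an illustration of the new constant, not part of the committed change:

    import os

    # Illustrative sketch only. Assumes FreeBSD and Python 3.11+, where
    # os.SF_NOCACHE and the `flags` argument of os.sendfile() exist.
    # `conn` is assumed to be an already-connected SOCK_STREAM socket and
    # payload.bin is assumed to exist.
    with open("payload.bin", "rb") as f:
        size = os.fstat(f.fileno()).st_size
        offset = 0
        while offset < size:
            # SF_NOCACHE asks the kernel not to keep the sent pages cached.
            offset += os.sendfile(conn.fileno(), f.fileno(), offset,
                                  size - offset, flags=os.SF_NOCACHE)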
\ No newline at end of file diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index b3a5757a8221d..21adf806a4e85 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -15170,12 +15170,16 @@ all_ins(PyObject *m) #ifdef SF_NODISKIO if (PyModule_AddIntMacro(m, SF_NODISKIO)) return -1; #endif + /* is obsolete since the 11.x release */ #ifdef SF_MNOWAIT if (PyModule_AddIntMacro(m, SF_MNOWAIT)) return -1; #endif #ifdef SF_SYNC if (PyModule_AddIntMacro(m, SF_SYNC)) return -1; #endif +#ifdef SF_NOCACHE + if (PyModule_AddIntMacro(m, SF_NOCACHE)) return -1; +#endif /* constants for posix_fadvise */ #ifdef POSIX_FADV_NORMAL From webhook-mailer at python.org Mon Jan 3 13:29:24 2022 From: webhook-mailer at python.org (pablogsal) Date: Mon, 03 Jan 2022 18:29:24 -0000 Subject: [Python-checkins] Revert "bpo-46110: Add a recursion check to avoid stack overflow in the PEG parser (GH-30177)" (GH-30363) Message-ID: https://github.com/python/cpython/commit/9d35dedc5e7e940b639230fba93c763bd9f19c09 commit: 9d35dedc5e7e940b639230fba93c763bd9f19c09 branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-03T18:29:18Z summary: Revert "bpo-46110: Add a recursion check to avoid stack overflow in the PEG parser (GH-30177)" (GH-30363) This reverts commit e9898bf153d26059261ffef11f7643ae991e2a4c temporarily as we want to confirm if this commit is the cause of a slowdown at startup time. files: D Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst M Lib/test/test_syntax.py M Parser/parser.c M Parser/pegen.c M Tools/peg_generator/pegen/c_generator.py diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index c95bc15e7273d..6286529d2734e 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -1729,14 +1729,6 @@ def test_syntax_error_on_deeply_nested_blocks(self): """ self._check_error(source, "too many statically nested blocks") - @support.cpython_only - def test_error_on_parser_stack_overflow(self): - source = "-" * 100000 + "4" - for mode in ["exec", "eval", "single"]: - with self.subTest(mode=mode): - with self.assertRaises(MemoryError): - compile(source, "", mode) - def load_tests(loader, tests, pattern): tests.addTest(doctest.DocTestSuite()) diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst deleted file mode 100644 index 593d2855972c4..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add a maximum recursion check to the PEG parser to avoid stack overflow. -Patch by Pablo Galindo diff --git a/Parser/parser.c b/Parser/parser.c index 07a04c917430c..4d576aa781542 100644 --- a/Parser/parser.c +++ b/Parser/parser.c @@ -6,8 +6,6 @@ #else # define D(x) #endif - -# define MAXSTACK 6000 static const int n_keyword_lists = 9; static KeywordToken *reserved_keywords[] = { (KeywordToken[]) {{NULL, -1}}, @@ -970,19 +968,16 @@ static asdl_seq *_loop1_222_rule(Parser *p); static mod_ty file_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // statements? $ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> file[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statements? 
$")); @@ -998,7 +993,7 @@ file_rule(Parser *p) _res = _PyPegen_make_module ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1009,7 +1004,7 @@ file_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1017,19 +1012,16 @@ file_rule(Parser *p) static mod_ty interactive_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // statement_newline if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> interactive[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statement_newline")); @@ -1042,7 +1034,7 @@ interactive_rule(Parser *p) _res = _PyAST_Interactive ( a , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1053,7 +1045,7 @@ interactive_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1061,19 +1053,16 @@ interactive_rule(Parser *p) static mod_ty eval_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // expressions NEWLINE* $ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> eval[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions NEWLINE* $")); @@ -1092,7 +1081,7 @@ eval_rule(Parser *p) _res = _PyAST_Expression ( a , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1103,7 +1092,7 @@ eval_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1111,19 +1100,16 @@ eval_rule(Parser *p) static mod_ty func_type_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // '(' type_expressions? ')' '->' expression NEWLINE* $ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> func_type[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' type_expressions? 
')' '->' expression NEWLINE* $")); @@ -1154,7 +1140,7 @@ func_type_rule(Parser *p) _res = _PyAST_FunctionType ( a , b , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1165,7 +1151,7 @@ func_type_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1173,19 +1159,16 @@ func_type_rule(Parser *p) static expr_ty fstring_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> fstring[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -1204,7 +1187,7 @@ fstring_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1212,19 +1195,16 @@ fstring_rule(Parser *p) static asdl_stmt_seq* statements_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; { // statement+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> statements[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statement+")); @@ -1237,7 +1217,7 @@ statements_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_seq_flatten ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1248,7 +1228,7 @@ statements_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1256,19 +1236,16 @@ statements_rule(Parser *p) static asdl_stmt_seq* statement_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; { // compound_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> statement[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "compound_stmt")); @@ -1281,7 +1258,7 @@ statement_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1292,7 +1269,7 @@ statement_rule(Parser *p) } { // simple_stmts if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> statement[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmts")); @@ -1305,7 +1282,7 @@ statement_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1316,7 +1293,7 @@ statement_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1324,19 +1301,16 @@ statement_rule(Parser *p) static asdl_stmt_seq* statement_newline_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } 
int _start_lineno = p->tokens[_mark]->lineno; @@ -1345,7 +1319,7 @@ statement_newline_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // compound_stmt NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "compound_stmt NEWLINE")); @@ -1361,7 +1335,7 @@ statement_newline_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1372,7 +1346,7 @@ statement_newline_rule(Parser *p) } { // simple_stmts if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmts")); @@ -1391,7 +1365,7 @@ statement_newline_rule(Parser *p) } { // NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -1403,7 +1377,7 @@ statement_newline_rule(Parser *p) D(fprintf(stderr, "%*c+ statement_newline[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NEWLINE")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -1413,7 +1387,7 @@ statement_newline_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , CHECK ( stmt_ty , _PyAST_Pass ( EXTRA ) ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1424,7 +1398,7 @@ statement_newline_rule(Parser *p) } { // $ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "$")); @@ -1437,7 +1411,7 @@ statement_newline_rule(Parser *p) _res = _PyPegen_interactive_exit ( p ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1448,7 +1422,7 @@ statement_newline_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1456,19 +1430,16 @@ statement_newline_rule(Parser *p) static asdl_stmt_seq* simple_stmts_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; { // simple_stmt !';' NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmts[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmt !';' NEWLINE")); @@ -1486,7 +1457,7 @@ simple_stmts_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1497,7 +1468,7 @@ simple_stmts_rule(Parser *p) } { // ';'.simple_stmt+ ';'? NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmts[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "';'.simple_stmt+ ';'? 
NEWLINE")); @@ -1517,7 +1488,7 @@ simple_stmts_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1528,7 +1499,7 @@ simple_stmts_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -1549,23 +1520,20 @@ simple_stmts_rule(Parser *p) static stmt_ty simple_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; if (_PyPegen_is_memoized(p, simple_stmt_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -1574,7 +1542,7 @@ simple_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // assignment if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment")); @@ -1593,7 +1561,7 @@ simple_stmt_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -1605,7 +1573,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expressions")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -1615,7 +1583,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Expr ( e , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1626,7 +1594,7 @@ simple_stmt_rule(Parser *p) } { // &'return' return_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'return' return_stmt")); @@ -1647,7 +1615,7 @@ simple_stmt_rule(Parser *p) } { // &('import' | 'from') import_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('import' | 'from') import_stmt")); @@ -1668,7 +1636,7 @@ simple_stmt_rule(Parser *p) } { // &'raise' raise_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'raise' raise_stmt")); @@ -1689,7 +1657,7 @@ simple_stmt_rule(Parser *p) } { // 'pass' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'pass'")); @@ -1701,7 +1669,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'pass'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -1711,7 +1679,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Pass ( EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1722,7 +1690,7 @@ 
simple_stmt_rule(Parser *p) } { // &'del' del_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'del' del_stmt")); @@ -1743,7 +1711,7 @@ simple_stmt_rule(Parser *p) } { // &'yield' yield_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'yield' yield_stmt")); @@ -1764,7 +1732,7 @@ simple_stmt_rule(Parser *p) } { // &'assert' assert_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'assert' assert_stmt")); @@ -1785,7 +1753,7 @@ simple_stmt_rule(Parser *p) } { // 'break' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'break'")); @@ -1797,7 +1765,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'break'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -1807,7 +1775,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Break ( EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1818,7 +1786,7 @@ simple_stmt_rule(Parser *p) } { // 'continue' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'continue'")); @@ -1830,7 +1798,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'continue'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -1840,7 +1808,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Continue ( EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -1851,7 +1819,7 @@ simple_stmt_rule(Parser *p) } { // &'global' global_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'global' global_stmt")); @@ -1872,7 +1840,7 @@ simple_stmt_rule(Parser *p) } { // &'nonlocal' nonlocal_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'nonlocal' nonlocal_stmt")); @@ -1894,7 +1862,7 @@ simple_stmt_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, simple_stmt_type, _res); - p->level--; + D(p->level--); return _res; } @@ -1910,19 +1878,16 @@ simple_stmt_rule(Parser *p) static stmt_ty compound_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // &('def' | '@' | ASYNC) function_def if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('def' | '@' | ASYNC) function_def")); @@ -1943,7 +1908,7 @@ 
compound_stmt_rule(Parser *p) } { // &'if' if_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'if' if_stmt")); @@ -1964,7 +1929,7 @@ compound_stmt_rule(Parser *p) } { // &('class' | '@') class_def if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('class' | '@') class_def")); @@ -1985,7 +1950,7 @@ compound_stmt_rule(Parser *p) } { // &('with' | ASYNC) with_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('with' | ASYNC) with_stmt")); @@ -2006,7 +1971,7 @@ compound_stmt_rule(Parser *p) } { // &('for' | ASYNC) for_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('for' | ASYNC) for_stmt")); @@ -2027,7 +1992,7 @@ compound_stmt_rule(Parser *p) } { // &'try' try_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'try' try_stmt")); @@ -2048,7 +2013,7 @@ compound_stmt_rule(Parser *p) } { // &'while' while_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'while' while_stmt")); @@ -2069,7 +2034,7 @@ compound_stmt_rule(Parser *p) } { // match_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "match_stmt")); @@ -2088,7 +2053,7 @@ compound_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -2101,19 +2066,16 @@ compound_stmt_rule(Parser *p) static stmt_ty assignment_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2122,7 +2084,7 @@ assignment_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME ':' expression ['=' annotated_rhs] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME ':' expression ['=' annotated_rhs]")); @@ -2143,7 +2105,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME ':' expression ['=' annotated_rhs]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2153,7 +2115,7 @@ assignment_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 6 , "Variable annotation syntax is" , _PyAST_AnnAssign ( CHECK ( expr_ty , _PyPegen_set_expr_context ( p , a , Store ) ) , b , c , 1 , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2164,7 +2126,7 @@ assignment_rule(Parser *p) } { // ('(' single_target ')' | 
single_subscript_attribute_target) ':' expression ['=' annotated_rhs] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('(' single_target ')' | single_subscript_attribute_target) ':' expression ['=' annotated_rhs]")); @@ -2185,7 +2147,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "('(' single_target ')' | single_subscript_attribute_target) ':' expression ['=' annotated_rhs]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2195,7 +2157,7 @@ assignment_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 6 , "Variable annotations syntax is" , _PyAST_AnnAssign ( a , b , c , 0 , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2206,7 +2168,7 @@ assignment_rule(Parser *p) } { // ((star_targets '='))+ (yield_expr | star_expressions) !'=' TYPE_COMMENT? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((star_targets '='))+ (yield_expr | star_expressions) !'=' TYPE_COMMENT?")); @@ -2226,7 +2188,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "((star_targets '='))+ (yield_expr | star_expressions) !'=' TYPE_COMMENT?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2236,7 +2198,7 @@ assignment_rule(Parser *p) _res = _PyAST_Assign ( a , b , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2247,7 +2209,7 @@ assignment_rule(Parser *p) } { // single_target augassign ~ (yield_expr | star_expressions) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "single_target augassign ~ (yield_expr | star_expressions)")); @@ -2268,7 +2230,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "single_target augassign ~ (yield_expr | star_expressions)")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2278,7 +2240,7 @@ assignment_rule(Parser *p) _res = _PyAST_AugAssign ( a , b -> kind , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2287,13 +2249,13 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c%s assignment[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "single_target augassign ~ (yield_expr | star_expressions)")); if (_cut_var) { - p->level--; + D(p->level--); return NULL; } } if (p->call_invalid_rules) { // invalid_assignment if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_assignment")); @@ -2312,7 +2274,7 @@ assignment_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -2320,19 +2282,16 @@ assignment_rule(Parser *p) static expr_ty annotated_rhs_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> annotated_rhs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -2351,7 +2310,7 @@ annotated_rhs_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> annotated_rhs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -2370,7 +2329,7 @@ annotated_rhs_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -2391,19 +2350,16 @@ annotated_rhs_rule(Parser *p) static AugOperator* augassign_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } AugOperator* _res = NULL; int _mark = p->mark; { // '+=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+='")); @@ -2416,7 +2372,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Add ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2427,7 +2383,7 @@ augassign_rule(Parser *p) } { // '-=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-='")); @@ -2440,7 +2396,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Sub ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2451,7 +2407,7 @@ augassign_rule(Parser *p) } { // '*=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*='")); @@ -2464,7 +2420,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Mult ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2475,7 +2431,7 @@ augassign_rule(Parser *p) } { // '@=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@='")); @@ -2488,7 +2444,7 @@ augassign_rule(Parser *p) _res = CHECK_VERSION ( AugOperator* , 5 , "The '@' operator is" , _PyPegen_augoperator ( p , MatMult ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2499,7 +2455,7 @@ augassign_rule(Parser *p) } { // '/=' if (p->error_indicator) { - p->level--; + D(p->level--); return 
NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'/='")); @@ -2512,7 +2468,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Div ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2523,7 +2479,7 @@ augassign_rule(Parser *p) } { // '%=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'%='")); @@ -2536,7 +2492,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Mod ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2547,7 +2503,7 @@ augassign_rule(Parser *p) } { // '&=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'&='")); @@ -2560,7 +2516,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , BitAnd ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2571,7 +2527,7 @@ augassign_rule(Parser *p) } { // '|=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'|='")); @@ -2584,7 +2540,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , BitOr ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2595,7 +2551,7 @@ augassign_rule(Parser *p) } { // '^=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'^='")); @@ -2608,7 +2564,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , BitXor ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2619,7 +2575,7 @@ augassign_rule(Parser *p) } { // '<<=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'<<='")); @@ -2632,7 +2588,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , LShift ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2643,7 +2599,7 @@ augassign_rule(Parser *p) } { // '>>=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'>>='")); @@ -2656,7 +2612,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , RShift ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2667,7 +2623,7 @@ augassign_rule(Parser *p) } { // '**=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**='")); @@ -2680,7 +2636,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Pow ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2691,7 +2647,7 @@ augassign_rule(Parser *p) } { // '//=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "'//='")); @@ -2704,7 +2660,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , FloorDiv ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2715,7 +2671,7 @@ augassign_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -2723,19 +2679,16 @@ augassign_rule(Parser *p) static stmt_ty return_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2744,7 +2697,7 @@ return_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'return' star_expressions? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> return_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'return' star_expressions?")); @@ -2759,7 +2712,7 @@ return_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ return_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'return' star_expressions?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2769,7 +2722,7 @@ return_stmt_rule(Parser *p) _res = _PyAST_Return ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2780,7 +2733,7 @@ return_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -2788,19 +2741,16 @@ return_stmt_rule(Parser *p) static stmt_ty raise_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2809,7 +2759,7 @@ raise_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'raise' expression ['from' expression] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> raise_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'raise' expression ['from' expression]")); @@ -2827,7 +2777,7 @@ raise_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ raise_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'raise' expression ['from' expression]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2837,7 +2787,7 @@ raise_stmt_rule(Parser *p) _res = _PyAST_Raise ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2848,7 +2798,7 @@ raise_stmt_rule(Parser *p) } { // 'raise' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> raise_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'raise'")); @@ -2860,7 +2810,7 @@ raise_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ raise_stmt[%d-%d]: %s succeeded!\n", 
p->level, ' ', _mark, p->mark, "'raise'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2870,7 +2820,7 @@ raise_stmt_rule(Parser *p) _res = _PyAST_Raise ( NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2881,7 +2831,7 @@ raise_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -2889,19 +2839,16 @@ raise_stmt_rule(Parser *p) static stmt_ty global_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2910,7 +2857,7 @@ global_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'global' ','.NAME+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> global_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'global' ','.NAME+")); @@ -2925,7 +2872,7 @@ global_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ global_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'global' ','.NAME+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -2935,7 +2882,7 @@ global_stmt_rule(Parser *p) _res = _PyAST_Global ( CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -2946,7 +2893,7 @@ global_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -2954,19 +2901,16 @@ global_stmt_rule(Parser *p) static stmt_ty nonlocal_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2975,7 +2919,7 @@ nonlocal_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'nonlocal' ','.NAME+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> nonlocal_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'nonlocal' ','.NAME+")); @@ -2990,7 +2934,7 @@ nonlocal_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ nonlocal_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'nonlocal' ','.NAME+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3000,7 +2944,7 @@ nonlocal_stmt_rule(Parser *p) _res = _PyAST_Nonlocal ( CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3011,7 +2955,7 @@ nonlocal_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + 
D(p->level--); return _res; } @@ -3019,19 +2963,16 @@ nonlocal_stmt_rule(Parser *p) static stmt_ty del_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3040,7 +2981,7 @@ del_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'del' del_targets &(';' | NEWLINE) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'del' del_targets &(';' | NEWLINE)")); @@ -3057,7 +2998,7 @@ del_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ del_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'del' del_targets &(';' | NEWLINE)")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3067,7 +3008,7 @@ del_stmt_rule(Parser *p) _res = _PyAST_Delete ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3078,7 +3019,7 @@ del_stmt_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_del_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_del_stmt")); @@ -3097,7 +3038,7 @@ del_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3105,19 +3046,16 @@ del_stmt_rule(Parser *p) static stmt_ty yield_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3126,7 +3064,7 @@ yield_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // yield_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> yield_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -3138,7 +3076,7 @@ yield_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ yield_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "yield_expr")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3148,7 +3086,7 @@ yield_stmt_rule(Parser *p) _res = _PyAST_Expr ( y , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3159,7 +3097,7 @@ yield_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3167,19 +3105,16 @@ yield_stmt_rule(Parser *p) static stmt_ty assert_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { 
p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3188,7 +3123,7 @@ assert_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'assert' expression [',' expression] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> assert_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'assert' expression [',' expression]")); @@ -3206,7 +3141,7 @@ assert_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ assert_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'assert' expression [',' expression]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3216,7 +3151,7 @@ assert_stmt_rule(Parser *p) _res = _PyAST_Assert ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3227,7 +3162,7 @@ assert_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3235,19 +3170,16 @@ assert_stmt_rule(Parser *p) static stmt_ty import_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // import_name if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_name")); @@ -3266,7 +3198,7 @@ import_stmt_rule(Parser *p) } { // import_from if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from")); @@ -3285,7 +3217,7 @@ import_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3293,19 +3225,16 @@ import_stmt_rule(Parser *p) static stmt_ty import_name_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3314,7 +3243,7 @@ import_name_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'import' dotted_as_names if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'import' dotted_as_names")); @@ -3329,7 +3258,7 @@ import_name_rule(Parser *p) D(fprintf(stderr, "%*c+ import_name[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'import' dotted_as_names")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3339,7 +3268,7 @@ import_name_rule(Parser *p) _res = _PyAST_Import ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3350,7 +3279,7 @@ import_name_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3360,19 +3289,16 @@ import_name_rule(Parser *p) static stmt_ty import_from_rule(Parser *p) { - 
if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3381,7 +3307,7 @@ import_from_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'from' (('.' | '...'))* dotted_name 'import' import_from_targets if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))* dotted_name 'import' import_from_targets")); @@ -3405,7 +3331,7 @@ import_from_rule(Parser *p) D(fprintf(stderr, "%*c+ import_from[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))* dotted_name 'import' import_from_targets")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3415,7 +3341,7 @@ import_from_rule(Parser *p) _res = _PyAST_ImportFrom ( b -> v . Name . id , c , _PyPegen_seq_count_dots ( a ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3426,7 +3352,7 @@ import_from_rule(Parser *p) } { // 'from' (('.' | '...'))+ 'import' import_from_targets if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))+ 'import' import_from_targets")); @@ -3447,7 +3373,7 @@ import_from_rule(Parser *p) D(fprintf(stderr, "%*c+ import_from[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))+ 'import' import_from_targets")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3457,7 +3383,7 @@ import_from_rule(Parser *p) _res = _PyAST_ImportFrom ( NULL , b , _PyPegen_seq_count_dots ( a ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3468,7 +3394,7 @@ import_from_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3480,19 +3406,16 @@ import_from_rule(Parser *p) static asdl_alias_seq* import_from_targets_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_alias_seq* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3501,7 +3424,7 @@ import_from_targets_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '(' import_from_as_names ','? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' import_from_as_names ','? 
')'")); @@ -3524,7 +3447,7 @@ import_from_targets_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3535,7 +3458,7 @@ import_from_targets_rule(Parser *p) } { // import_from_as_names !',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from_as_names !','")); @@ -3556,7 +3479,7 @@ import_from_targets_rule(Parser *p) } { // '*' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*'")); @@ -3568,7 +3491,7 @@ import_from_targets_rule(Parser *p) D(fprintf(stderr, "%*c+ import_from_targets[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3578,7 +3501,7 @@ import_from_targets_rule(Parser *p) _res = ( asdl_alias_seq* ) _PyPegen_singleton_seq ( p , CHECK ( alias_ty , _PyPegen_alias_for_star ( p , EXTRA ) ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3589,7 +3512,7 @@ import_from_targets_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_import_from_targets if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_import_from_targets")); @@ -3608,7 +3531,7 @@ import_from_targets_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3616,19 +3539,16 @@ import_from_targets_rule(Parser *p) static asdl_alias_seq* import_from_as_names_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_alias_seq* _res = NULL; int _mark = p->mark; { // ','.import_from_as_name+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from_as_names[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.import_from_as_name+")); @@ -3641,7 +3561,7 @@ import_from_as_names_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3652,7 +3572,7 @@ import_from_as_names_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3660,19 +3580,16 @@ import_from_as_names_rule(Parser *p) static alias_ty import_from_as_name_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } alias_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3681,7 +3598,7 @@ import_from_as_name_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME ['as' NAME] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> import_from_as_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME ['as' NAME]")); @@ -3696,7 +3613,7 @@ import_from_as_name_rule(Parser *p) 
D(fprintf(stderr, "%*c+ import_from_as_name[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME ['as' NAME]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3706,7 +3623,7 @@ import_from_as_name_rule(Parser *p) _res = _PyAST_alias ( a -> v . Name . id , ( b ) ? ( ( expr_ty ) b ) -> v . Name . id : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3717,7 +3634,7 @@ import_from_as_name_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3725,19 +3642,16 @@ import_from_as_name_rule(Parser *p) static asdl_alias_seq* dotted_as_names_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_alias_seq* _res = NULL; int _mark = p->mark; { // ','.dotted_as_name+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dotted_as_names[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.dotted_as_name+")); @@ -3750,7 +3664,7 @@ dotted_as_names_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3761,7 +3675,7 @@ dotted_as_names_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3769,19 +3683,16 @@ dotted_as_names_rule(Parser *p) static alias_ty dotted_as_name_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } alias_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3790,7 +3701,7 @@ dotted_as_name_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // dotted_name ['as' NAME] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dotted_as_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dotted_name ['as' NAME]")); @@ -3805,7 +3716,7 @@ dotted_as_name_rule(Parser *p) D(fprintf(stderr, "%*c+ dotted_as_name[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "dotted_name ['as' NAME]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -3815,7 +3726,7 @@ dotted_as_name_rule(Parser *p) _res = _PyAST_alias ( a -> v . Name . id , ( b ) ? ( ( expr_ty ) b ) -> v . Name . 
id : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3826,7 +3737,7 @@ dotted_as_name_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3836,13 +3747,10 @@ static expr_ty dotted_name_raw(Parser *); static expr_ty dotted_name_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, dotted_name_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -3850,42 +3758,37 @@ dotted_name_rule(Parser *p) while (1) { int tmpvar_0 = _PyPegen_update_memo(p, _mark, dotted_name_type, _res); if (tmpvar_0) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = dotted_name_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty dotted_name_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // dotted_name '.' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dotted_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dotted_name '.' NAME")); @@ -3904,7 +3807,7 @@ dotted_name_raw(Parser *p) _res = _PyPegen_join_names_with_dot ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3915,7 +3818,7 @@ dotted_name_raw(Parser *p) } { // NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dotted_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -3934,7 +3837,7 @@ dotted_name_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -3942,23 +3845,20 @@ dotted_name_raw(Parser *p) static asdl_stmt_seq* block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_stmt_seq* _res = NULL; if (_PyPegen_is_memoized(p, block_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; { // NEWLINE INDENT statements DEDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE INDENT statements DEDENT")); @@ -3980,7 +3880,7 @@ block_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -3991,7 +3891,7 @@ block_rule(Parser *p) } { // simple_stmts if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmts")); @@ -4010,7 +3910,7 @@ block_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_block")); @@ -4030,7 +3930,7 @@ block_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, 
block_type, _res); - p->level--; + D(p->level--); return _res; } @@ -4038,19 +3938,16 @@ block_rule(Parser *p) static asdl_expr_seq* decorators_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // (('@' named_expression NEWLINE))+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> decorators[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(('@' named_expression NEWLINE))+")); @@ -4063,7 +3960,7 @@ decorators_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4074,7 +3971,7 @@ decorators_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4082,19 +3979,16 @@ decorators_rule(Parser *p) static stmt_ty class_def_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // decorators class_def_raw if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "decorators class_def_raw")); @@ -4110,7 +4004,7 @@ class_def_rule(Parser *p) _res = _PyPegen_class_def_decorators ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4121,7 +4015,7 @@ class_def_rule(Parser *p) } { // class_def_raw if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "class_def_raw")); @@ -4140,7 +4034,7 @@ class_def_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4148,19 +4042,16 @@ class_def_rule(Parser *p) static stmt_ty class_def_raw_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -4169,7 +4060,7 @@ class_def_raw_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_class_def_raw if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_class_def_raw")); @@ -4188,7 +4079,7 @@ class_def_raw_rule(Parser *p) } { // 'class' NAME ['(' arguments? ')'] &&':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] &&':' block")); @@ -4212,7 +4103,7 @@ class_def_raw_rule(Parser *p) D(fprintf(stderr, "%*c+ class_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? 
')'] &&':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -4222,7 +4113,7 @@ class_def_raw_rule(Parser *p) _res = _PyAST_ClassDef ( a -> v . Name . id , ( b ) ? ( ( expr_ty ) b ) -> v . Call . args : NULL , ( b ) ? ( ( expr_ty ) b ) -> v . Call . keywords : NULL , c , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4233,7 +4124,7 @@ class_def_raw_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4241,19 +4132,16 @@ class_def_raw_rule(Parser *p) static stmt_ty function_def_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // decorators function_def_raw if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> function_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "decorators function_def_raw")); @@ -4269,7 +4157,7 @@ function_def_rule(Parser *p) _res = _PyPegen_function_def_decorators ( p , d , f ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4280,7 +4168,7 @@ function_def_rule(Parser *p) } { // function_def_raw if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> function_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "function_def_raw")); @@ -4299,7 +4187,7 @@ function_def_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4310,19 +4198,16 @@ function_def_rule(Parser *p) static stmt_ty function_def_raw_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -4331,7 +4216,7 @@ function_def_raw_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_def_raw if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> function_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_def_raw")); @@ -4350,7 +4235,7 @@ function_def_raw_rule(Parser *p) } { // 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> function_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); @@ -4386,7 +4271,7 @@ function_def_raw_rule(Parser *p) D(fprintf(stderr, "%*c+ function_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -4396,7 +4281,7 @@ function_def_raw_rule(Parser *p) _res = _PyAST_FunctionDef ( n -> v . Name . id , ( params ) ? 
params : CHECK ( arguments_ty , _PyPegen_empty_arguments ( p ) ) , b , NULL , a , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4407,7 +4292,7 @@ function_def_raw_rule(Parser *p) } { // ASYNC 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> function_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); @@ -4446,7 +4331,7 @@ function_def_raw_rule(Parser *p) D(fprintf(stderr, "%*c+ function_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -4456,7 +4341,7 @@ function_def_raw_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async functions are" , _PyAST_AsyncFunctionDef ( n -> v . Name . id , ( params ) ? params : CHECK ( arguments_ty , _PyPegen_empty_arguments ( p ) ) , b , NULL , a , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4467,7 +4352,7 @@ function_def_raw_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4475,19 +4360,16 @@ function_def_raw_rule(Parser *p) static arguments_ty params_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arguments_ty _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_parameters if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_parameters")); @@ -4506,7 +4388,7 @@ params_rule(Parser *p) } { // parameters if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "parameters")); @@ -4525,7 +4407,7 @@ params_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4538,19 +4420,16 @@ params_rule(Parser *p) static arguments_ty parameters_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arguments_ty _res = NULL; int _mark = p->mark; { // slash_no_default param_no_default* param_with_default* star_etc? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_no_default param_no_default* param_with_default* star_etc?")); @@ -4572,7 +4451,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , a , NULL , b , c , d ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4583,7 +4462,7 @@ parameters_rule(Parser *p) } { // slash_with_default param_with_default* star_etc? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default param_with_default* star_etc?")); @@ -4602,7 +4481,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , a , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4613,7 +4492,7 @@ parameters_rule(Parser *p) } { // param_no_default+ param_with_default* star_etc? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default+ param_with_default* star_etc?")); @@ -4632,7 +4511,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4643,7 +4522,7 @@ parameters_rule(Parser *p) } { // param_with_default+ star_etc? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default+ star_etc?")); @@ -4659,7 +4538,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4670,7 +4549,7 @@ parameters_rule(Parser *p) } { // star_etc if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_etc")); @@ -4683,7 +4562,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4694,7 +4573,7 @@ parameters_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4702,19 +4581,16 @@ parameters_rule(Parser *p) static asdl_arg_seq* slash_no_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_arg_seq* _res = NULL; int _mark = p->mark; { // param_no_default+ '/' ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default+ '/' ','")); @@ -4733,7 +4609,7 @@ slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4744,7 +4620,7 @@ slash_no_default_rule(Parser *p) } { // param_no_default+ '/' &')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default+ '/' &')'")); @@ -4762,7 +4638,7 @@ slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4773,7 +4649,7 @@ slash_no_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4783,19 +4659,16 @@ slash_no_default_rule(Parser *p) static SlashWithDefault* slash_with_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) 
{ - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } SlashWithDefault* _res = NULL; int _mark = p->mark; { // param_no_default* param_with_default+ '/' ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* param_with_default+ '/' ','")); @@ -4817,7 +4690,7 @@ slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4828,7 +4701,7 @@ slash_with_default_rule(Parser *p) } { // param_no_default* param_with_default+ '/' &')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* param_with_default+ '/' &')'")); @@ -4849,7 +4722,7 @@ slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4860,7 +4733,7 @@ slash_with_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -4872,19 +4745,16 @@ slash_with_default_rule(Parser *p) static StarEtc* star_etc_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } StarEtc* _res = NULL; int _mark = p->mark; { // '*' param_no_default param_maybe_default* kwds? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' param_no_default param_maybe_default* kwds?")); @@ -4906,7 +4776,7 @@ star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4917,7 +4787,7 @@ star_etc_rule(Parser *p) } { // '*' ',' param_maybe_default+ kwds? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' ',' param_maybe_default+ kwds?")); @@ -4939,7 +4809,7 @@ star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4950,7 +4820,7 @@ star_etc_rule(Parser *p) } { // kwds if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwds")); @@ -4963,7 +4833,7 @@ star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -4974,7 +4844,7 @@ star_etc_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_star_etc if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_star_etc")); @@ -4993,7 +4863,7 @@ star_etc_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5001,19 +4871,16 @@ star_etc_rule(Parser *p) static arg_ty kwds_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // '**' param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwds[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' param_no_default")); @@ -5029,7 +4896,7 @@ kwds_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5040,7 +4907,7 @@ kwds_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5048,19 +4915,16 @@ kwds_rule(Parser *p) static arg_ty param_no_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // param ',' TYPE_COMMENT? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param ',' TYPE_COMMENT?")); @@ -5079,7 +4943,7 @@ param_no_default_rule(Parser *p) _res = _PyPegen_add_type_comment_to_arg ( p , a , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5090,7 +4954,7 @@ param_no_default_rule(Parser *p) } { // param TYPE_COMMENT? &')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param TYPE_COMMENT? 
&')'")); @@ -5108,7 +4972,7 @@ param_no_default_rule(Parser *p) _res = _PyPegen_add_type_comment_to_arg ( p , a , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5119,7 +4983,7 @@ param_no_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5127,19 +4991,16 @@ param_no_default_rule(Parser *p) static NameDefaultPair* param_with_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // param default ',' TYPE_COMMENT? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default ',' TYPE_COMMENT?")); @@ -5161,7 +5022,7 @@ param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5172,7 +5033,7 @@ param_with_default_rule(Parser *p) } { // param default TYPE_COMMENT? &')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default TYPE_COMMENT? &')'")); @@ -5193,7 +5054,7 @@ param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5204,7 +5065,7 @@ param_with_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5214,19 +5075,16 @@ param_with_default_rule(Parser *p) static NameDefaultPair* param_maybe_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // param default? ',' TYPE_COMMENT? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default? ',' TYPE_COMMENT?")); @@ -5248,7 +5106,7 @@ param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5259,7 +5117,7 @@ param_maybe_default_rule(Parser *p) } { // param default? TYPE_COMMENT? &')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default? TYPE_COMMENT? 
&')'")); @@ -5280,7 +5138,7 @@ param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5291,7 +5149,7 @@ param_maybe_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5299,19 +5157,16 @@ param_maybe_default_rule(Parser *p) static arg_ty param_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arg_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5320,7 +5175,7 @@ param_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME annotation? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> param[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME annotation?")); @@ -5335,7 +5190,7 @@ param_rule(Parser *p) D(fprintf(stderr, "%*c+ param[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME annotation?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -5345,7 +5200,7 @@ param_rule(Parser *p) _res = _PyAST_arg ( a -> v . Name . id , b , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5356,7 +5211,7 @@ param_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5364,19 +5219,16 @@ param_rule(Parser *p) static expr_ty annotation_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // ':' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> annotation[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':' expression")); @@ -5392,7 +5244,7 @@ annotation_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5403,7 +5255,7 @@ annotation_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5411,19 +5263,16 @@ annotation_rule(Parser *p) static expr_ty default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // '=' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'=' expression")); @@ -5439,7 +5288,7 @@ default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5450,7 +5299,7 @@ default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5461,19 +5310,16 @@ default_rule(Parser *p) static stmt_ty if_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + 
D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5482,7 +5328,7 @@ if_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_if_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_if_stmt")); @@ -5501,7 +5347,7 @@ if_stmt_rule(Parser *p) } { // 'if' named_expression ':' block elif_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block elif_stmt")); @@ -5525,7 +5371,7 @@ if_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ if_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block elif_stmt")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -5535,7 +5381,7 @@ if_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , CHECK ( asdl_stmt_seq* , _PyPegen_singleton_seq ( p , c ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5546,7 +5392,7 @@ if_stmt_rule(Parser *p) } { // 'if' named_expression ':' block else_block? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block else_block?")); @@ -5570,7 +5416,7 @@ if_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ if_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -5580,7 +5426,7 @@ if_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5591,7 +5437,7 @@ if_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5602,19 +5448,16 @@ if_stmt_rule(Parser *p) static stmt_ty elif_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5623,7 +5466,7 @@ elif_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_elif_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_elif_stmt")); @@ -5642,7 +5485,7 @@ elif_stmt_rule(Parser *p) } { // 'elif' named_expression ':' block elif_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'elif' 
named_expression ':' block elif_stmt")); @@ -5666,7 +5509,7 @@ elif_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ elif_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' block elif_stmt")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -5676,7 +5519,7 @@ elif_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , CHECK ( asdl_stmt_seq* , _PyPegen_singleton_seq ( p , c ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5687,7 +5530,7 @@ elif_stmt_rule(Parser *p) } { // 'elif' named_expression ':' block else_block? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' block else_block?")); @@ -5711,7 +5554,7 @@ elif_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ elif_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -5721,7 +5564,7 @@ elif_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5732,7 +5575,7 @@ elif_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5740,19 +5583,16 @@ elif_stmt_rule(Parser *p) static asdl_stmt_seq* else_block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_else_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> else_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_else_stmt")); @@ -5771,7 +5611,7 @@ else_block_rule(Parser *p) } { // 'else' &&':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> else_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else' &&':' block")); @@ -5790,7 +5630,7 @@ else_block_rule(Parser *p) _res = b; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5801,7 +5641,7 @@ else_block_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5809,19 +5649,16 @@ else_block_rule(Parser *p) static stmt_ty while_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5830,7 +5667,7 @@ while_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_while_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_while_stmt")); @@ 
-5849,7 +5686,7 @@ while_stmt_rule(Parser *p) } { // 'while' named_expression ':' block else_block? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'while' named_expression ':' block else_block?")); @@ -5873,7 +5710,7 @@ while_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ while_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'while' named_expression ':' block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -5883,7 +5720,7 @@ while_stmt_rule(Parser *p) _res = _PyAST_While ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -5894,7 +5731,7 @@ while_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -5906,19 +5743,16 @@ while_stmt_rule(Parser *p) static stmt_ty for_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5927,7 +5761,7 @@ for_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_for_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_for_stmt")); @@ -5946,7 +5780,7 @@ for_stmt_rule(Parser *p) } { // 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); @@ -5982,7 +5816,7 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -5992,7 +5826,7 @@ for_stmt_rule(Parser *p) _res = _PyAST_For ( t , ex , b , el , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6001,13 +5835,13 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c%s for_stmt[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); if (_cut_var) { - p->level--; + D(p->level--); return NULL; } } { // ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? 
block else_block?")); @@ -6046,7 +5880,7 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6056,7 +5890,7 @@ for_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async for loops are" , _PyAST_AsyncFor ( t , ex , b , el , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6065,13 +5899,13 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c%s for_stmt[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); if (_cut_var) { - p->level--; + D(p->level--); return NULL; } } if (p->call_invalid_rules) { // invalid_for_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_for_target")); @@ -6090,7 +5924,7 @@ for_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -6104,19 +5938,16 @@ for_stmt_rule(Parser *p) static stmt_ty with_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6125,7 +5956,7 @@ with_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_with_stmt_indent if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_with_stmt_indent")); @@ -6144,7 +5975,7 @@ with_stmt_rule(Parser *p) } { // 'with' '(' ','.with_item+ ','? ')' ':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'with' '(' ','.with_item+ ','? ')' ':' block")); @@ -6175,7 +6006,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'with' '(' ','.with_item+ ','? ')' ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6185,7 +6016,7 @@ with_stmt_rule(Parser *p) _res = _PyAST_With ( a , b , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6196,7 +6027,7 @@ with_stmt_rule(Parser *p) } { // 'with' ','.with_item+ ':' TYPE_COMMENT? block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'with' ','.with_item+ ':' TYPE_COMMENT? 
block")); @@ -6220,7 +6051,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'with' ','.with_item+ ':' TYPE_COMMENT? block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6230,7 +6061,7 @@ with_stmt_rule(Parser *p) _res = _PyAST_With ( a , b , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6241,7 +6072,7 @@ with_stmt_rule(Parser *p) } { // ASYNC 'with' '(' ','.with_item+ ','? ')' ':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' '(' ','.with_item+ ','? ')' ':' block")); @@ -6275,7 +6106,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' '(' ','.with_item+ ','? ')' ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6285,7 +6116,7 @@ with_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async with statements are" , _PyAST_AsyncWith ( a , b , NULL , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6296,7 +6127,7 @@ with_stmt_rule(Parser *p) } { // ASYNC 'with' ','.with_item+ ':' TYPE_COMMENT? block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' ','.with_item+ ':' TYPE_COMMENT? block")); @@ -6323,7 +6154,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' ','.with_item+ ':' TYPE_COMMENT? 
block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6333,7 +6164,7 @@ with_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async with statements are" , _PyAST_AsyncWith ( a , b , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6344,7 +6175,7 @@ with_stmt_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_with_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_with_stmt")); @@ -6363,7 +6194,7 @@ with_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -6374,19 +6205,16 @@ with_stmt_rule(Parser *p) static withitem_ty with_item_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } withitem_ty _res = NULL; int _mark = p->mark; { // expression 'as' star_target &(',' | ')' | ':') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression 'as' star_target &(',' | ')' | ':')")); @@ -6407,7 +6235,7 @@ with_item_rule(Parser *p) _res = _PyAST_withitem ( e , t , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6418,7 +6246,7 @@ with_item_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_with_item if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_with_item")); @@ -6437,7 +6265,7 @@ with_item_rule(Parser *p) } { // expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression")); @@ -6450,7 +6278,7 @@ with_item_rule(Parser *p) _res = _PyAST_withitem ( e , NULL , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6461,7 +6289,7 @@ with_item_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -6473,19 +6301,16 @@ with_item_rule(Parser *p) static stmt_ty try_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6494,7 +6319,7 @@ try_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_try_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_try_stmt")); @@ -6513,7 +6338,7 @@ try_stmt_rule(Parser *p) } { // 'try' &&':' block finally_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' &&':' block 
finally_block")); @@ -6534,7 +6359,7 @@ try_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ try_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'try' &&':' block finally_block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6544,7 +6369,7 @@ try_stmt_rule(Parser *p) _res = _PyAST_Try ( b , NULL , NULL , f , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6555,7 +6380,7 @@ try_stmt_rule(Parser *p) } { // 'try' &&':' block except_block+ else_block? finally_block? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_block+ else_block? finally_block?")); @@ -6582,7 +6407,7 @@ try_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ try_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_block+ else_block? finally_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6592,7 +6417,7 @@ try_stmt_rule(Parser *p) _res = _PyAST_Try ( b , ex , el , f , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6603,7 +6428,7 @@ try_stmt_rule(Parser *p) } { // 'try' &&':' block except_star_block+ else_block? finally_block? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_star_block+ else_block? finally_block?")); @@ -6630,7 +6455,7 @@ try_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ try_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_star_block+ else_block? 
finally_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6640,7 +6465,7 @@ try_stmt_rule(Parser *p) _res = _PyAST_TryStar ( b , ex , el , f , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6651,7 +6476,7 @@ try_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -6663,19 +6488,16 @@ try_stmt_rule(Parser *p) static excepthandler_ty except_block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } excepthandler_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6684,7 +6506,7 @@ except_block_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_except_stmt_indent if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_stmt_indent")); @@ -6703,7 +6525,7 @@ except_block_rule(Parser *p) } { // 'except' expression ['as' NAME] ':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' expression ['as' NAME] ':' block")); @@ -6727,7 +6549,7 @@ except_block_rule(Parser *p) D(fprintf(stderr, "%*c+ except_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except' expression ['as' NAME] ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6737,7 +6559,7 @@ except_block_rule(Parser *p) _res = _PyAST_ExceptHandler ( e , ( t ) ? ( ( expr_ty ) t ) -> v . Name . 
id : NULL , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6748,7 +6570,7 @@ except_block_rule(Parser *p) } { // 'except' ':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' ':' block")); @@ -6766,7 +6588,7 @@ except_block_rule(Parser *p) D(fprintf(stderr, "%*c+ except_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except' ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6776,7 +6598,7 @@ except_block_rule(Parser *p) _res = _PyAST_ExceptHandler ( NULL , NULL , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6787,7 +6609,7 @@ except_block_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_except_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_stmt")); @@ -6806,7 +6628,7 @@ except_block_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -6817,19 +6639,16 @@ except_block_rule(Parser *p) static excepthandler_ty except_star_block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } excepthandler_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6838,7 +6657,7 @@ except_star_block_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_except_star_stmt_indent if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> except_star_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_star_stmt_indent")); @@ -6857,7 +6676,7 @@ except_star_block_rule(Parser *p) } { // 'except' '*' expression ['as' NAME] ':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> except_star_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*' expression ['as' NAME] ':' block")); @@ -6884,7 +6703,7 @@ except_star_block_rule(Parser *p) D(fprintf(stderr, "%*c+ except_star_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except' '*' expression ['as' NAME] ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -6894,7 +6713,7 @@ except_star_block_rule(Parser *p) _res = _PyAST_ExceptHandler ( e , ( t ) ? ( ( expr_ty ) t ) -> v . Name . 
id : NULL , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6905,7 +6724,7 @@ except_star_block_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_except_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> except_star_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_stmt")); @@ -6924,7 +6743,7 @@ except_star_block_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -6932,19 +6751,16 @@ except_star_block_rule(Parser *p) static asdl_stmt_seq* finally_block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_finally_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> finally_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_finally_stmt")); @@ -6963,7 +6779,7 @@ finally_block_rule(Parser *p) } { // 'finally' &&':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> finally_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally' &&':' block")); @@ -6982,7 +6798,7 @@ finally_block_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -6993,7 +6809,7 @@ finally_block_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7003,19 +6819,16 @@ finally_block_rule(Parser *p) static stmt_ty match_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7024,7 +6837,7 @@ match_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // "match" subject_expr ':' NEWLINE INDENT case_block+ DEDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr ':' NEWLINE INDENT case_block+ DEDENT")); @@ -7054,7 +6867,7 @@ match_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ match_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr ':' NEWLINE INDENT case_block+ DEDENT")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7064,7 +6877,7 @@ match_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 10 , "Pattern matching is" , _PyAST_Match ( subject , cases , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7075,7 +6888,7 @@ match_stmt_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_match_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_match_stmt")); @@ -7094,7 +6907,7 @@ match_stmt_rule(Parser *p) } _res = 
NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7102,19 +6915,16 @@ match_stmt_rule(Parser *p) static expr_ty subject_expr_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7123,7 +6933,7 @@ subject_expr_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // star_named_expression ',' star_named_expressions? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> subject_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions?")); @@ -7141,7 +6951,7 @@ subject_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ subject_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7151,7 +6961,7 @@ subject_expr_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , value , values ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7162,7 +6972,7 @@ subject_expr_rule(Parser *p) } { // named_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> subject_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -7181,7 +6991,7 @@ subject_expr_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7189,19 +6999,16 @@ subject_expr_rule(Parser *p) static match_case_ty case_block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } match_case_ty _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_case_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_case_block")); @@ -7220,7 +7027,7 @@ case_block_rule(Parser *p) } { // "case" patterns guard? ':' block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? 
':' block")); @@ -7245,7 +7052,7 @@ case_block_rule(Parser *p) _res = _PyAST_match_case ( pattern , guard , body , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7256,7 +7063,7 @@ case_block_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7264,19 +7071,16 @@ case_block_rule(Parser *p) static expr_ty guard_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // 'if' named_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> guard[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression")); @@ -7292,7 +7096,7 @@ guard_rule(Parser *p) _res = guard; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7303,7 +7107,7 @@ guard_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7311,19 +7115,16 @@ guard_rule(Parser *p) static pattern_ty patterns_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7332,7 +7133,7 @@ patterns_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // open_sequence_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "open_sequence_pattern")); @@ -7344,7 +7145,7 @@ patterns_rule(Parser *p) D(fprintf(stderr, "%*c+ patterns[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "open_sequence_pattern")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7354,7 +7155,7 @@ patterns_rule(Parser *p) _res = _PyAST_MatchSequence ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7365,7 +7166,7 @@ patterns_rule(Parser *p) } { // pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern")); @@ -7384,7 +7185,7 @@ patterns_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7392,19 +7193,16 @@ patterns_rule(Parser *p) static pattern_ty pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // as_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "as_pattern")); @@ -7423,7 +7221,7 @@ pattern_rule(Parser *p) } { // or_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern")); 
@@ -7442,7 +7240,7 @@ pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7450,19 +7248,16 @@ pattern_rule(Parser *p) static pattern_ty as_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7471,7 +7266,7 @@ as_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // or_pattern 'as' pattern_capture_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' pattern_capture_target")); @@ -7489,7 +7284,7 @@ as_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ as_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' pattern_capture_target")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7499,7 +7294,7 @@ as_pattern_rule(Parser *p) _res = _PyAST_MatchAs ( pattern , target -> v . Name . id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7510,7 +7305,7 @@ as_pattern_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_as_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_as_pattern")); @@ -7529,7 +7324,7 @@ as_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7537,19 +7332,16 @@ as_pattern_rule(Parser *p) static pattern_ty or_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7558,7 +7350,7 @@ or_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '|'.closed_pattern+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> or_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'|'.closed_pattern+")); @@ -7570,7 +7362,7 @@ or_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ or_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'|'.closed_pattern+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7580,7 +7372,7 @@ or_pattern_rule(Parser *p) _res = asdl_seq_LEN ( patterns ) == 1 ? 
asdl_seq_GET ( patterns , 0 ) : _PyAST_MatchOr ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7591,7 +7383,7 @@ or_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7607,19 +7399,16 @@ or_pattern_rule(Parser *p) static pattern_ty closed_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // literal_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "literal_pattern")); @@ -7638,7 +7427,7 @@ closed_pattern_rule(Parser *p) } { // capture_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "capture_pattern")); @@ -7657,7 +7446,7 @@ closed_pattern_rule(Parser *p) } { // wildcard_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "wildcard_pattern")); @@ -7676,7 +7465,7 @@ closed_pattern_rule(Parser *p) } { // value_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "value_pattern")); @@ -7695,7 +7484,7 @@ closed_pattern_rule(Parser *p) } { // group_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "group_pattern")); @@ -7714,7 +7503,7 @@ closed_pattern_rule(Parser *p) } { // sequence_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sequence_pattern")); @@ -7733,7 +7522,7 @@ closed_pattern_rule(Parser *p) } { // mapping_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "mapping_pattern")); @@ -7752,7 +7541,7 @@ closed_pattern_rule(Parser *p) } { // class_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "class_pattern")); @@ -7771,7 +7560,7 @@ closed_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -7785,19 +7574,16 @@ closed_pattern_rule(Parser *p) static pattern_ty literal_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7806,7 +7592,7 @@ literal_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // signed_number !('+' | '-') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_number !('+' | '-')")); @@ -7820,7 
+7606,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "signed_number !('+' | '-')")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7830,7 +7616,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( value , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7841,7 +7627,7 @@ literal_pattern_rule(Parser *p) } { // complex_number if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "complex_number")); @@ -7853,7 +7639,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "complex_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7863,7 +7649,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( value , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7874,7 +7660,7 @@ literal_pattern_rule(Parser *p) } { // strings if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "strings")); @@ -7886,7 +7672,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "strings")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7896,7 +7682,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( value , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7907,7 +7693,7 @@ literal_pattern_rule(Parser *p) } { // 'None' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -7919,7 +7705,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7929,7 +7715,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchSingleton ( Py_None , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7940,7 +7726,7 @@ literal_pattern_rule(Parser *p) } { // 'True' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -7952,7 +7738,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7962,7 +7748,7 @@ 
literal_pattern_rule(Parser *p) _res = _PyAST_MatchSingleton ( Py_True , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -7973,7 +7759,7 @@ literal_pattern_rule(Parser *p) } { // 'False' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -7985,7 +7771,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -7995,7 +7781,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchSingleton ( Py_False , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8006,7 +7792,7 @@ literal_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8020,19 +7806,16 @@ literal_pattern_rule(Parser *p) static expr_ty literal_expr_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8041,7 +7824,7 @@ literal_expr_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // signed_number !('+' | '-') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_number !('+' | '-')")); @@ -8062,7 +7845,7 @@ literal_expr_rule(Parser *p) } { // complex_number if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "complex_number")); @@ -8081,7 +7864,7 @@ literal_expr_rule(Parser *p) } { // strings if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "strings")); @@ -8100,7 +7883,7 @@ literal_expr_rule(Parser *p) } { // 'None' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -8112,7 +7895,7 @@ literal_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8122,7 +7905,7 @@ literal_expr_rule(Parser *p) _res = _PyAST_Constant ( Py_None , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8133,7 +7916,7 @@ literal_expr_rule(Parser *p) } { // 'True' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -8145,7 +7928,7 @@ literal_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_expr[%d-%d]: %s succeeded!\n", p->level, ' ', 
_mark, p->mark, "'True'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8155,7 +7938,7 @@ literal_expr_rule(Parser *p) _res = _PyAST_Constant ( Py_True , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8166,7 +7949,7 @@ literal_expr_rule(Parser *p) } { // 'False' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -8178,7 +7961,7 @@ literal_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8188,7 +7971,7 @@ literal_expr_rule(Parser *p) _res = _PyAST_Constant ( Py_False , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8199,7 +7982,7 @@ literal_expr_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8209,19 +7992,16 @@ literal_expr_rule(Parser *p) static expr_ty complex_number_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8230,7 +8010,7 @@ complex_number_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // signed_real_number '+' imaginary_number if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> complex_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_real_number '+' imaginary_number")); @@ -8248,7 +8028,7 @@ complex_number_rule(Parser *p) D(fprintf(stderr, "%*c+ complex_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "signed_real_number '+' imaginary_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8258,7 +8038,7 @@ complex_number_rule(Parser *p) _res = _PyAST_BinOp ( real , Add , imag , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8269,7 +8049,7 @@ complex_number_rule(Parser *p) } { // signed_real_number '-' imaginary_number if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> complex_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_real_number '-' imaginary_number")); @@ -8287,7 +8067,7 @@ complex_number_rule(Parser *p) D(fprintf(stderr, "%*c+ complex_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "signed_real_number '-' imaginary_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8297,7 +8077,7 @@ complex_number_rule(Parser *p) _res = _PyAST_BinOp ( real , Sub , imag , EXTRA ); if (_res == NULL && PyErr_Occurred()) { 
p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8308,7 +8088,7 @@ complex_number_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8316,19 +8096,16 @@ complex_number_rule(Parser *p) static expr_ty signed_number_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8337,7 +8114,7 @@ signed_number_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NUMBER if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> signed_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -8356,7 +8133,7 @@ signed_number_rule(Parser *p) } { // '-' NUMBER if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> signed_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-' NUMBER")); @@ -8371,7 +8148,7 @@ signed_number_rule(Parser *p) D(fprintf(stderr, "%*c+ signed_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'-' NUMBER")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8381,7 +8158,7 @@ signed_number_rule(Parser *p) _res = _PyAST_UnaryOp ( USub , number , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8392,7 +8169,7 @@ signed_number_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8400,19 +8177,16 @@ signed_number_rule(Parser *p) static expr_ty signed_real_number_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8421,7 +8195,7 @@ signed_real_number_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // real_number if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> signed_real_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "real_number")); @@ -8440,7 +8214,7 @@ signed_real_number_rule(Parser *p) } { // '-' real_number if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> signed_real_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-' real_number")); @@ -8455,7 +8229,7 @@ signed_real_number_rule(Parser *p) D(fprintf(stderr, "%*c+ signed_real_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'-' real_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8465,7 +8239,7 @@ signed_real_number_rule(Parser *p) _res = _PyAST_UnaryOp ( USub , real , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8476,7 +8250,7 @@ 
signed_real_number_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8484,19 +8258,16 @@ signed_real_number_rule(Parser *p) static expr_ty real_number_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // NUMBER if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> real_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -8509,7 +8280,7 @@ real_number_rule(Parser *p) _res = _PyPegen_ensure_real ( p , real ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8520,7 +8291,7 @@ real_number_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8528,19 +8299,16 @@ real_number_rule(Parser *p) static expr_ty imaginary_number_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // NUMBER if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> imaginary_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -8553,7 +8321,7 @@ imaginary_number_rule(Parser *p) _res = _PyPegen_ensure_imaginary ( p , imag ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8564,7 +8332,7 @@ imaginary_number_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8572,19 +8340,16 @@ imaginary_number_rule(Parser *p) static pattern_ty capture_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8593,7 +8358,7 @@ capture_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // pattern_capture_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> capture_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern_capture_target")); @@ -8605,7 +8370,7 @@ capture_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ capture_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "pattern_capture_target")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8615,7 +8380,7 @@ capture_pattern_rule(Parser *p) _res = _PyAST_MatchAs ( NULL , target -> v . Name . 
id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8626,7 +8391,7 @@ capture_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8634,19 +8399,16 @@ capture_pattern_rule(Parser *p) static expr_ty pattern_capture_target_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // !"_" NAME !('.' | '(' | '=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> pattern_capture_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!\"_\" NAME !('.' | '(' | '=')")); @@ -8663,7 +8425,7 @@ pattern_capture_target_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , name , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8674,7 +8436,7 @@ pattern_capture_target_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8682,19 +8444,16 @@ pattern_capture_target_rule(Parser *p) static pattern_ty wildcard_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8703,7 +8462,7 @@ wildcard_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // "_" if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> wildcard_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"_\"")); @@ -8715,7 +8474,7 @@ wildcard_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ wildcard_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"_\"")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8725,7 +8484,7 @@ wildcard_pattern_rule(Parser *p) _res = _PyAST_MatchAs ( NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8736,7 +8495,7 @@ wildcard_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8744,19 +8503,16 @@ wildcard_pattern_rule(Parser *p) static pattern_ty value_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8765,7 +8521,7 @@ value_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // attr !('.' | '(' | '=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> value_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "attr !('.' 
| '(' | '=')")); @@ -8779,7 +8535,7 @@ value_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ value_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "attr !('.' | '(' | '=')")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8789,7 +8545,7 @@ value_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( attr , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8800,7 +8556,7 @@ value_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8810,13 +8566,10 @@ static expr_ty attr_raw(Parser *); static expr_ty attr_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, attr_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -8824,42 +8577,37 @@ attr_rule(Parser *p) while (1) { int tmpvar_1 = _PyPegen_update_memo(p, _mark, attr_type, _res); if (tmpvar_1) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = attr_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty attr_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8868,7 +8616,7 @@ attr_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // name_or_attr '.' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> attr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '.' NAME")); @@ -8886,7 +8634,7 @@ attr_raw(Parser *p) D(fprintf(stderr, "%*c+ attr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '.' NAME")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -8896,7 +8644,7 @@ attr_raw(Parser *p) _res = _PyAST_Attribute ( value , attr -> v . Name . 
id , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -8907,7 +8655,7 @@ attr_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8916,19 +8664,16 @@ attr_raw(Parser *p) static expr_ty name_or_attr_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // attr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> name_or_attr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "attr")); @@ -8947,7 +8692,7 @@ name_or_attr_rule(Parser *p) } { // NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> name_or_attr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -8966,7 +8711,7 @@ name_or_attr_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -8974,19 +8719,16 @@ name_or_attr_rule(Parser *p) static pattern_ty group_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // '(' pattern ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> group_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' pattern ')'")); @@ -9005,7 +8747,7 @@ group_pattern_rule(Parser *p) _res = pattern; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9016,7 +8758,7 @@ group_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9024,19 +8766,16 @@ group_pattern_rule(Parser *p) static pattern_ty sequence_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9045,7 +8784,7 @@ sequence_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '[' maybe_sequence_pattern? ']' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' maybe_sequence_pattern? ']'")); @@ -9063,7 +8802,7 @@ sequence_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ sequence_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' maybe_sequence_pattern? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9073,7 +8812,7 @@ sequence_pattern_rule(Parser *p) _res = _PyAST_MatchSequence ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9084,7 +8823,7 @@ sequence_pattern_rule(Parser *p) } { // '(' open_sequence_pattern? 
')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' open_sequence_pattern? ')'")); @@ -9102,7 +8841,7 @@ sequence_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ sequence_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' open_sequence_pattern? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9112,7 +8851,7 @@ sequence_pattern_rule(Parser *p) _res = _PyAST_MatchSequence ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9123,7 +8862,7 @@ sequence_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9131,19 +8870,16 @@ sequence_pattern_rule(Parser *p) static asdl_seq* open_sequence_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // maybe_star_pattern ',' maybe_sequence_pattern? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> open_sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "maybe_star_pattern ',' maybe_sequence_pattern?")); @@ -9162,7 +8898,7 @@ open_sequence_pattern_rule(Parser *p) _res = _PyPegen_seq_insert_in_front ( p , pattern , patterns ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9173,7 +8909,7 @@ open_sequence_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9181,19 +8917,16 @@ open_sequence_pattern_rule(Parser *p) static asdl_seq* maybe_sequence_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.maybe_star_pattern+ ','? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> maybe_sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.maybe_star_pattern+ ','?")); @@ -9210,7 +8943,7 @@ maybe_sequence_pattern_rule(Parser *p) _res = patterns; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9221,7 +8954,7 @@ maybe_sequence_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9229,19 +8962,16 @@ maybe_sequence_pattern_rule(Parser *p) static pattern_ty maybe_star_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // star_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> maybe_star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_pattern")); @@ -9260,7 +8990,7 @@ maybe_star_pattern_rule(Parser *p) } { // pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> maybe_star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern")); @@ -9279,7 +9009,7 @@ maybe_star_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9287,19 +9017,16 @@ maybe_star_pattern_rule(Parser *p) static pattern_ty star_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9308,7 +9035,7 @@ star_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' pattern_capture_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' pattern_capture_target")); @@ -9323,7 +9050,7 @@ star_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ star_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' pattern_capture_target")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9333,7 +9060,7 @@ star_pattern_rule(Parser *p) _res = _PyAST_MatchStar ( target -> v . Name . 
id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9344,7 +9071,7 @@ star_pattern_rule(Parser *p) } { // '*' wildcard_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' wildcard_pattern")); @@ -9359,7 +9086,7 @@ star_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ star_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' wildcard_pattern")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9369,7 +9096,7 @@ star_pattern_rule(Parser *p) _res = _PyAST_MatchStar ( NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9380,7 +9107,7 @@ star_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9392,19 +9119,16 @@ star_pattern_rule(Parser *p) static pattern_ty mapping_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9413,7 +9137,7 @@ mapping_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' '}'")); @@ -9428,7 +9152,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9438,7 +9162,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( NULL , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9449,7 +9173,7 @@ mapping_pattern_rule(Parser *p) } { // '{' double_star_pattern ','? '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' double_star_pattern ','? '}'")); @@ -9471,7 +9195,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' double_star_pattern ','? '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9481,7 +9205,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( NULL , NULL , rest -> v . Name . id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9492,7 +9216,7 @@ mapping_pattern_rule(Parser *p) } { // '{' items_pattern ',' double_star_pattern ','? 
'}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ',' double_star_pattern ','? '}'")); @@ -9520,7 +9244,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ',' double_star_pattern ','? '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9530,7 +9254,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , items ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , items ) ) , rest -> v . Name . id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9541,7 +9265,7 @@ mapping_pattern_rule(Parser *p) } { // '{' items_pattern ','? '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ','? '}'")); @@ -9563,7 +9287,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ','? '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9573,7 +9297,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , items ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , items ) ) , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9584,7 +9308,7 @@ mapping_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9592,19 +9316,16 @@ mapping_pattern_rule(Parser *p) static asdl_seq* items_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.key_value_pattern+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> items_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.key_value_pattern+")); @@ -9623,7 +9344,7 @@ items_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9631,19 +9352,16 @@ items_pattern_rule(Parser *p) static KeyPatternPair* key_value_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } KeyPatternPair* _res = NULL; int _mark = p->mark; { // (literal_expr | attr) ':' pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> key_value_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(literal_expr | attr) ':' pattern")); @@ -9662,7 +9380,7 @@ key_value_pattern_rule(Parser *p) _res = _PyPegen_key_pattern_pair ( p , key , pattern ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9673,7 +9391,7 @@ 
key_value_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9681,19 +9399,16 @@ key_value_pattern_rule(Parser *p) static expr_ty double_star_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // '**' pattern_capture_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> double_star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' pattern_capture_target")); @@ -9709,7 +9424,7 @@ double_star_pattern_rule(Parser *p) _res = target; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9720,7 +9435,7 @@ double_star_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9733,19 +9448,16 @@ double_star_pattern_rule(Parser *p) static pattern_ty class_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9754,7 +9466,7 @@ class_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // name_or_attr '(' ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' ')'")); @@ -9772,7 +9484,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9782,7 +9494,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , NULL , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9793,7 +9505,7 @@ class_pattern_rule(Parser *p) } { // name_or_attr '(' positional_patterns ','? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ','? ')'")); @@ -9818,7 +9530,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ','? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9828,7 +9540,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , patterns , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9839,7 +9551,7 @@ class_pattern_rule(Parser *p) } { // name_or_attr '(' keyword_patterns ','? 
')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' keyword_patterns ','? ')'")); @@ -9864,7 +9576,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' keyword_patterns ','? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9874,7 +9586,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , NULL , CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , keywords ) ) ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , keywords ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9885,7 +9597,7 @@ class_pattern_rule(Parser *p) } { // name_or_attr '(' positional_patterns ',' keyword_patterns ','? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ',' keyword_patterns ','? ')'")); @@ -9916,7 +9628,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ',' keyword_patterns ','? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -9926,7 +9638,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , patterns , CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , keywords ) ) ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , keywords ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -9937,7 +9649,7 @@ class_pattern_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_class_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_class_pattern")); @@ -9956,7 +9668,7 @@ class_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -9964,19 +9676,16 @@ class_pattern_rule(Parser *p) static asdl_pattern_seq* positional_patterns_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_pattern_seq* _res = NULL; int _mark = p->mark; { // ','.pattern+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> positional_patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.pattern+")); @@ -9989,7 +9698,7 @@ positional_patterns_rule(Parser *p) _res = args; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10000,7 +9709,7 @@ positional_patterns_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10008,19 +9717,16 @@ positional_patterns_rule(Parser *p) static asdl_seq* keyword_patterns_rule(Parser *p) { - 
if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.keyword_pattern+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> keyword_patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.keyword_pattern+")); @@ -10039,7 +9745,7 @@ keyword_patterns_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10047,19 +9753,16 @@ keyword_patterns_rule(Parser *p) static KeyPatternPair* keyword_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } KeyPatternPair* _res = NULL; int _mark = p->mark; { // NAME '=' pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> keyword_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' pattern")); @@ -10078,7 +9781,7 @@ keyword_pattern_rule(Parser *p) _res = _PyPegen_key_pattern_pair ( p , arg , value ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10089,7 +9792,7 @@ keyword_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10097,19 +9800,16 @@ keyword_pattern_rule(Parser *p) static expr_ty expressions_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10118,7 +9818,7 @@ expressions_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // expression ((',' expression))+ ','? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ((',' expression))+ ','?")); @@ -10137,7 +9837,7 @@ expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ((',' expression))+ ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10147,7 +9847,7 @@ expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10158,7 +9858,7 @@ expressions_rule(Parser *p) } { // expression ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ','")); @@ -10173,7 +9873,7 @@ expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ','")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10183,7 +9883,7 @@ expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_singleton_seq ( p , a ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10194,7 +9894,7 @@ expressions_rule(Parser *p) } { // expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression")); @@ -10213,7 +9913,7 @@ expressions_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10226,23 +9926,20 @@ expressions_rule(Parser *p) static expr_ty expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, expression_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10251,7 +9948,7 @@ expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_expression")); @@ -10270,7 +9967,7 @@ expression_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_legacy_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_legacy_expression")); @@ -10289,7 +9986,7 @@ expression_rule(Parser *p) } { // disjunction 'if' disjunction 'else' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); @@ 
-10313,7 +10010,7 @@ expression_rule(Parser *p) D(fprintf(stderr, "%*c+ expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10323,7 +10020,7 @@ expression_rule(Parser *p) _res = _PyAST_IfExp ( b , a , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10334,7 +10031,7 @@ expression_rule(Parser *p) } { // disjunction if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction")); @@ -10353,7 +10050,7 @@ expression_rule(Parser *p) } { // lambdef if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambdef")); @@ -10373,7 +10070,7 @@ expression_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, expression_type, _res); - p->level--; + D(p->level--); return _res; } @@ -10381,19 +10078,16 @@ expression_rule(Parser *p) static expr_ty yield_expr_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10402,7 +10096,7 @@ yield_expr_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'yield' 'from' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> yield_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'yield' 'from' expression")); @@ -10420,7 +10114,7 @@ yield_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ yield_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'yield' 'from' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10430,7 +10124,7 @@ yield_expr_rule(Parser *p) _res = _PyAST_YieldFrom ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10441,7 +10135,7 @@ yield_expr_rule(Parser *p) } { // 'yield' star_expressions? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> yield_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'yield' star_expressions?")); @@ -10456,7 +10150,7 @@ yield_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ yield_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'yield' star_expressions?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10466,7 +10160,7 @@ yield_expr_rule(Parser *p) _res = _PyAST_Yield ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10477,7 +10171,7 @@ yield_expr_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10488,19 +10182,16 @@ yield_expr_rule(Parser *p) static expr_ty star_expressions_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10509,7 +10200,7 @@ star_expressions_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // star_expression ((',' star_expression))+ ','? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expression ((',' star_expression))+ ','?")); @@ -10528,7 +10219,7 @@ star_expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ star_expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expression ((',' star_expression))+ ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10538,7 +10229,7 @@ star_expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10549,7 +10240,7 @@ star_expressions_rule(Parser *p) } { // star_expression ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expression ','")); @@ -10564,7 +10255,7 @@ star_expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ star_expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expression ','")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10574,7 +10265,7 @@ star_expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_singleton_seq ( p , a ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10585,7 +10276,7 @@ star_expressions_rule(Parser *p) } { // star_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expression")); @@ -10604,7 +10295,7 @@ 
star_expressions_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10612,23 +10303,20 @@ star_expressions_rule(Parser *p) static expr_ty star_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, star_expression_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10637,7 +10325,7 @@ star_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); @@ -10652,7 +10340,7 @@ star_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ star_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10662,7 +10350,7 @@ star_expression_rule(Parser *p) _res = _PyAST_Starred ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10673,7 +10361,7 @@ star_expression_rule(Parser *p) } { // expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression")); @@ -10693,7 +10381,7 @@ star_expression_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, star_expression_type, _res); - p->level--; + D(p->level--); return _res; } @@ -10701,19 +10389,16 @@ star_expression_rule(Parser *p) static asdl_expr_seq* star_named_expressions_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.star_named_expression+ ','? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_named_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.star_named_expression+ ','?")); @@ -10730,7 +10415,7 @@ star_named_expressions_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10741,7 +10426,7 @@ star_named_expressions_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10749,19 +10434,16 @@ star_named_expressions_rule(Parser *p) static expr_ty star_named_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10770,7 +10452,7 @@ star_named_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); @@ -10785,7 +10467,7 @@ star_named_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ star_named_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10795,7 +10477,7 @@ star_named_expression_rule(Parser *p) _res = _PyAST_Starred ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10806,7 +10488,7 @@ star_named_expression_rule(Parser *p) } { // named_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -10825,7 +10507,7 @@ star_named_expression_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10833,19 +10515,16 @@ star_named_expression_rule(Parser *p) static expr_ty assignment_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10854,7 +10533,7 @@ assignment_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME ':=' ~ expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> assignment_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME ':=' ~ expression")); @@ -10875,7 +10554,7 @@ assignment_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME ':=' ~ expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -10885,7 +10564,7 @@ 
assignment_expression_rule(Parser *p) _res = _PyAST_NamedExpr ( CHECK ( expr_ty , _PyPegen_set_expr_context ( p , a , Store ) ) , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -10894,13 +10573,13 @@ assignment_expression_rule(Parser *p) D(fprintf(stderr, "%*c%s assignment_expression[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "NAME ':=' ~ expression")); if (_cut_var) { - p->level--; + D(p->level--); return NULL; } } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10908,19 +10587,16 @@ assignment_expression_rule(Parser *p) static expr_ty named_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // assignment_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); @@ -10939,7 +10615,7 @@ named_expression_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_named_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_named_expression")); @@ -10958,7 +10634,7 @@ named_expression_rule(Parser *p) } { // expression !':=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); @@ -10979,7 +10655,7 @@ named_expression_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -10987,23 +10663,20 @@ named_expression_rule(Parser *p) static expr_ty disjunction_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, disjunction_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -11012,7 +10685,7 @@ disjunction_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // conjunction (('or' conjunction))+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> disjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "conjunction (('or' conjunction))+")); @@ -11027,7 +10700,7 @@ disjunction_rule(Parser *p) D(fprintf(stderr, "%*c+ disjunction[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "conjunction (('or' conjunction))+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -11037,7 +10710,7 @@ disjunction_rule(Parser *p) _res = _PyAST_BoolOp ( Or , CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11048,7 +10721,7 @@ disjunction_rule(Parser *p) } { // conjunction if (p->error_indicator) { - p->level--; + D(p->level--); return 
NULL; } D(fprintf(stderr, "%*c> disjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "conjunction")); @@ -11068,7 +10741,7 @@ disjunction_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, disjunction_type, _res); - p->level--; + D(p->level--); return _res; } @@ -11076,23 +10749,20 @@ disjunction_rule(Parser *p) static expr_ty conjunction_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, conjunction_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -11101,7 +10771,7 @@ conjunction_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // inversion (('and' inversion))+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> conjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "inversion (('and' inversion))+")); @@ -11116,7 +10786,7 @@ conjunction_rule(Parser *p) D(fprintf(stderr, "%*c+ conjunction[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "inversion (('and' inversion))+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -11126,7 +10796,7 @@ conjunction_rule(Parser *p) _res = _PyAST_BoolOp ( And , CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11137,7 +10807,7 @@ conjunction_rule(Parser *p) } { // inversion if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> conjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "inversion")); @@ -11157,7 +10827,7 @@ conjunction_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, conjunction_type, _res); - p->level--; + D(p->level--); return _res; } @@ -11165,23 +10835,20 @@ conjunction_rule(Parser *p) static expr_ty inversion_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, inversion_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -11190,7 +10857,7 @@ inversion_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'not' inversion if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> inversion[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'not' inversion")); @@ -11205,7 +10872,7 @@ inversion_rule(Parser *p) D(fprintf(stderr, "%*c+ inversion[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'not' inversion")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -11215,7 +10882,7 @@ inversion_rule(Parser *p) _res = _PyAST_UnaryOp ( Not , a , EXTRA ); if (_res == 
NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11226,7 +10893,7 @@ inversion_rule(Parser *p) } { // comparison if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> inversion[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "comparison")); @@ -11246,7 +10913,7 @@ inversion_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, inversion_type, _res); - p->level--; + D(p->level--); return _res; } @@ -11254,19 +10921,16 @@ inversion_rule(Parser *p) static expr_ty comparison_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -11275,7 +10939,7 @@ comparison_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_or compare_op_bitwise_or_pair+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> comparison[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_or compare_op_bitwise_or_pair+")); @@ -11290,7 +10954,7 @@ comparison_rule(Parser *p) D(fprintf(stderr, "%*c+ comparison[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_or compare_op_bitwise_or_pair+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -11300,7 +10964,7 @@ comparison_rule(Parser *p) _res = _PyAST_Compare ( a , CHECK ( asdl_int_seq* , _PyPegen_get_cmpops ( p , b ) ) , CHECK ( asdl_expr_seq* , _PyPegen_get_exprs ( p , b ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11311,7 +10975,7 @@ comparison_rule(Parser *p) } { // bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> comparison[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_or")); @@ -11330,7 +10994,7 @@ comparison_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11348,19 +11012,16 @@ comparison_rule(Parser *p) static CmpopExprPair* compare_op_bitwise_or_pair_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // eq_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "eq_bitwise_or")); @@ -11379,7 +11040,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // noteq_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "noteq_bitwise_or")); @@ -11398,7 +11059,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // lte_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lte_bitwise_or")); @@ -11417,7 +11078,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { 
// lt_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lt_bitwise_or")); @@ -11436,7 +11097,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // gte_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "gte_bitwise_or")); @@ -11455,7 +11116,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // gt_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "gt_bitwise_or")); @@ -11474,7 +11135,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // notin_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "notin_bitwise_or")); @@ -11493,7 +11154,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // in_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "in_bitwise_or")); @@ -11512,7 +11173,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // isnot_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "isnot_bitwise_or")); @@ -11531,7 +11192,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // is_bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "is_bitwise_or")); @@ -11550,7 +11211,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11558,19 +11219,16 @@ compare_op_bitwise_or_pair_rule(Parser *p) static CmpopExprPair* eq_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '==' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> eq_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'==' bitwise_or")); @@ -11586,7 +11244,7 @@ eq_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Eq , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11597,7 +11255,7 @@ eq_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11605,19 +11263,16 @@ eq_bitwise_or_rule(Parser *p) static CmpopExprPair* noteq_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // ('!=') bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> noteq_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('!=') bitwise_or")); @@ -11633,7 +11288,7 @@ noteq_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , NotEq , a ); if (_res == 
NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11644,7 +11299,7 @@ noteq_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11652,19 +11307,16 @@ noteq_bitwise_or_rule(Parser *p) static CmpopExprPair* lte_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '<=' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lte_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'<=' bitwise_or")); @@ -11680,7 +11332,7 @@ lte_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , LtE , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11691,7 +11343,7 @@ lte_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11699,19 +11351,16 @@ lte_bitwise_or_rule(Parser *p) static CmpopExprPair* lt_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '<' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lt_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'<' bitwise_or")); @@ -11727,7 +11376,7 @@ lt_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Lt , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11738,7 +11387,7 @@ lt_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11746,19 +11395,16 @@ lt_bitwise_or_rule(Parser *p) static CmpopExprPair* gte_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '>=' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> gte_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'>=' bitwise_or")); @@ -11774,7 +11420,7 @@ gte_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , GtE , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11785,7 +11431,7 @@ gte_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11793,19 +11439,16 @@ gte_bitwise_or_rule(Parser *p) static CmpopExprPair* gt_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '>' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> gt_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'>' bitwise_or")); @@ -11821,7 +11464,7 @@ gt_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Gt , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; 
- p->level--; + D(p->level--); return NULL; } goto done; @@ -11832,7 +11475,7 @@ gt_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11840,19 +11483,16 @@ gt_bitwise_or_rule(Parser *p) static CmpopExprPair* notin_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'not' 'in' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> notin_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'not' 'in' bitwise_or")); @@ -11871,7 +11511,7 @@ notin_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , NotIn , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11882,7 +11522,7 @@ notin_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11890,19 +11530,16 @@ notin_bitwise_or_rule(Parser *p) static CmpopExprPair* in_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'in' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> in_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'in' bitwise_or")); @@ -11918,7 +11555,7 @@ in_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , In , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11929,7 +11566,7 @@ in_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11937,19 +11574,16 @@ in_bitwise_or_rule(Parser *p) static CmpopExprPair* isnot_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'is' 'not' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> isnot_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'is' 'not' bitwise_or")); @@ -11968,7 +11602,7 @@ isnot_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , IsNot , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -11979,7 +11613,7 @@ isnot_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -11987,19 +11621,16 @@ isnot_bitwise_or_rule(Parser *p) static CmpopExprPair* is_bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'is' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> is_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'is' bitwise_or")); @@ -12015,7 +11646,7 @@ is_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Is , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - 
p->level--; + D(p->level--); return NULL; } goto done; @@ -12026,7 +11657,7 @@ is_bitwise_or_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -12036,13 +11667,10 @@ static expr_ty bitwise_or_raw(Parser *); static expr_ty bitwise_or_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, bitwise_or_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -12050,42 +11678,37 @@ bitwise_or_rule(Parser *p) while (1) { int tmpvar_2 = _PyPegen_update_memo(p, _mark, bitwise_or_type, _res); if (tmpvar_2) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = bitwise_or_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty bitwise_or_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12094,7 +11717,7 @@ bitwise_or_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_or '|' bitwise_xor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_or '|' bitwise_xor")); @@ -12112,7 +11735,7 @@ bitwise_or_raw(Parser *p) D(fprintf(stderr, "%*c+ bitwise_or[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_or '|' bitwise_xor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12122,7 +11745,7 @@ bitwise_or_raw(Parser *p) _res = _PyAST_BinOp ( a , BitOr , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12133,7 +11756,7 @@ bitwise_or_raw(Parser *p) } { // bitwise_xor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_xor")); @@ -12152,7 +11775,7 @@ bitwise_or_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -12162,13 +11785,10 @@ static expr_ty bitwise_xor_raw(Parser *); static expr_ty bitwise_xor_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, bitwise_xor_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -12176,42 +11796,37 @@ bitwise_xor_rule(Parser *p) while (1) { int tmpvar_3 = _PyPegen_update_memo(p, _mark, bitwise_xor_type, _res); if (tmpvar_3) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = bitwise_xor_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + 
D(p->level--); return _res; } static expr_ty bitwise_xor_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12220,7 +11835,7 @@ bitwise_xor_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_xor '^' bitwise_and if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> bitwise_xor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_xor '^' bitwise_and")); @@ -12238,7 +11853,7 @@ bitwise_xor_raw(Parser *p) D(fprintf(stderr, "%*c+ bitwise_xor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_xor '^' bitwise_and")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12248,7 +11863,7 @@ bitwise_xor_raw(Parser *p) _res = _PyAST_BinOp ( a , BitXor , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12259,7 +11874,7 @@ bitwise_xor_raw(Parser *p) } { // bitwise_and if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> bitwise_xor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_and")); @@ -12278,7 +11893,7 @@ bitwise_xor_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -12288,13 +11903,10 @@ static expr_ty bitwise_and_raw(Parser *); static expr_ty bitwise_and_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, bitwise_and_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -12302,42 +11914,37 @@ bitwise_and_rule(Parser *p) while (1) { int tmpvar_4 = _PyPegen_update_memo(p, _mark, bitwise_and_type, _res); if (tmpvar_4) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = bitwise_and_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty bitwise_and_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12346,7 +11953,7 @@ bitwise_and_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_and '&' shift_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> bitwise_and[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_and '&' shift_expr")); @@ -12364,7 +11971,7 @@ bitwise_and_raw(Parser *p) D(fprintf(stderr, "%*c+ bitwise_and[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_and '&' shift_expr")); Token *_token = 
_PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12374,7 +11981,7 @@ bitwise_and_raw(Parser *p) _res = _PyAST_BinOp ( a , BitAnd , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12385,7 +11992,7 @@ bitwise_and_raw(Parser *p) } { // shift_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> bitwise_and[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "shift_expr")); @@ -12404,7 +12011,7 @@ bitwise_and_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -12414,13 +12021,10 @@ static expr_ty shift_expr_raw(Parser *); static expr_ty shift_expr_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, shift_expr_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -12428,42 +12032,37 @@ shift_expr_rule(Parser *p) while (1) { int tmpvar_5 = _PyPegen_update_memo(p, _mark, shift_expr_type, _res); if (tmpvar_5) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = shift_expr_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty shift_expr_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12472,7 +12071,7 @@ shift_expr_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // shift_expr '<<' sum if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> shift_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "shift_expr '<<' sum")); @@ -12490,7 +12089,7 @@ shift_expr_raw(Parser *p) D(fprintf(stderr, "%*c+ shift_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "shift_expr '<<' sum")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12500,7 +12099,7 @@ shift_expr_raw(Parser *p) _res = _PyAST_BinOp ( a , LShift , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12511,7 +12110,7 @@ shift_expr_raw(Parser *p) } { // shift_expr '>>' sum if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> shift_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "shift_expr '>>' sum")); @@ -12529,7 +12128,7 @@ shift_expr_raw(Parser *p) D(fprintf(stderr, "%*c+ shift_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "shift_expr '>>' sum")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12539,7 +12138,7 @@ shift_expr_raw(Parser *p) _res = _PyAST_BinOp ( a , RShift , 
b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12550,7 +12149,7 @@ shift_expr_raw(Parser *p) } { // sum if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> shift_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sum")); @@ -12569,7 +12168,7 @@ shift_expr_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -12579,13 +12178,10 @@ static expr_ty sum_raw(Parser *); static expr_ty sum_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, sum_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -12593,42 +12189,37 @@ sum_rule(Parser *p) while (1) { int tmpvar_6 = _PyPegen_update_memo(p, _mark, sum_type, _res); if (tmpvar_6) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = sum_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty sum_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12637,7 +12228,7 @@ sum_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // sum '+' term if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> sum[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sum '+' term")); @@ -12655,7 +12246,7 @@ sum_raw(Parser *p) D(fprintf(stderr, "%*c+ sum[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "sum '+' term")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12665,7 +12256,7 @@ sum_raw(Parser *p) _res = _PyAST_BinOp ( a , Add , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12676,7 +12267,7 @@ sum_raw(Parser *p) } { // sum '-' term if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> sum[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sum '-' term")); @@ -12694,7 +12285,7 @@ sum_raw(Parser *p) D(fprintf(stderr, "%*c+ sum[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "sum '-' term")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12704,7 +12295,7 @@ sum_raw(Parser *p) _res = _PyAST_BinOp ( a , Sub , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12715,7 +12306,7 @@ sum_raw(Parser *p) } { // term if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> sum[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term")); @@ -12734,7 +12325,7 @@ sum_raw(Parser *p) } _res = NULL; done: - 
p->level--; + D(p->level--); return _res; } @@ -12750,13 +12341,10 @@ static expr_ty term_raw(Parser *); static expr_ty term_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, term_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -12764,42 +12352,37 @@ term_rule(Parser *p) while (1) { int tmpvar_7 = _PyPegen_update_memo(p, _mark, term_type, _res); if (tmpvar_7) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = term_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty term_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12808,7 +12391,7 @@ term_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // term '*' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '*' factor")); @@ -12826,7 +12409,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '*' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12836,7 +12419,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , Mult , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12847,7 +12430,7 @@ term_raw(Parser *p) } { // term '/' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '/' factor")); @@ -12865,7 +12448,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '/' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12875,7 +12458,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , Div , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12886,7 +12469,7 @@ term_raw(Parser *p) } { // term '//' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '//' factor")); @@ -12904,7 +12487,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '//' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12914,7 +12497,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , FloorDiv , b , EXTRA ); if (_res == NULL && 
PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12925,7 +12508,7 @@ term_raw(Parser *p) } { // term '%' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '%' factor")); @@ -12943,7 +12526,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '%' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12953,7 +12536,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , Mod , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -12964,7 +12547,7 @@ term_raw(Parser *p) } { // term '@' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '@' factor")); @@ -12982,7 +12565,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '@' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -12992,7 +12575,7 @@ term_raw(Parser *p) _res = CHECK_VERSION ( expr_ty , 5 , "The '@' operator is" , _PyAST_BinOp ( a , MatMult , b , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13003,7 +12586,7 @@ term_raw(Parser *p) } { // factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "factor")); @@ -13022,7 +12605,7 @@ term_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -13030,23 +12613,20 @@ term_raw(Parser *p) static expr_ty factor_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, factor_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13055,7 +12635,7 @@ factor_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '+' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+' factor")); @@ -13070,7 +12650,7 @@ factor_rule(Parser *p) D(fprintf(stderr, "%*c+ factor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'+' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13080,7 +12660,7 @@ factor_rule(Parser *p) _res = _PyAST_UnaryOp ( UAdd , a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13091,7 +12671,7 @@ factor_rule(Parser *p) } { // '-' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> 
factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-' factor")); @@ -13106,7 +12686,7 @@ factor_rule(Parser *p) D(fprintf(stderr, "%*c+ factor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'-' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13116,7 +12696,7 @@ factor_rule(Parser *p) _res = _PyAST_UnaryOp ( USub , a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13127,7 +12707,7 @@ factor_rule(Parser *p) } { // '~' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'~' factor")); @@ -13142,7 +12722,7 @@ factor_rule(Parser *p) D(fprintf(stderr, "%*c+ factor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'~' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13152,7 +12732,7 @@ factor_rule(Parser *p) _res = _PyAST_UnaryOp ( Invert , a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13163,7 +12743,7 @@ factor_rule(Parser *p) } { // power if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "power")); @@ -13183,7 +12763,7 @@ factor_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, factor_type, _res); - p->level--; + D(p->level--); return _res; } @@ -13191,19 +12771,16 @@ factor_rule(Parser *p) static expr_ty power_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13212,7 +12789,7 @@ power_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // await_primary '**' factor if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> power[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "await_primary '**' factor")); @@ -13230,7 +12807,7 @@ power_rule(Parser *p) D(fprintf(stderr, "%*c+ power[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "await_primary '**' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13240,7 +12817,7 @@ power_rule(Parser *p) _res = _PyAST_BinOp ( a , Pow , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13251,7 +12828,7 @@ power_rule(Parser *p) } { // await_primary if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> power[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "await_primary")); @@ -13270,7 +12847,7 @@ power_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -13278,23 +12855,20 @@ power_rule(Parser *p) static expr_ty await_primary_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - 
p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, await_primary_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13303,7 +12877,7 @@ await_primary_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // AWAIT primary if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> await_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "AWAIT primary")); @@ -13318,7 +12892,7 @@ await_primary_rule(Parser *p) D(fprintf(stderr, "%*c+ await_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "AWAIT primary")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13328,7 +12902,7 @@ await_primary_rule(Parser *p) _res = CHECK_VERSION ( expr_ty , 5 , "Await expressions are" , _PyAST_Await ( a , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13339,7 +12913,7 @@ await_primary_rule(Parser *p) } { // primary if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> await_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary")); @@ -13359,7 +12933,7 @@ await_primary_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, await_primary_type, _res); - p->level--; + D(p->level--); return _res; } @@ -13374,13 +12948,10 @@ static expr_ty primary_raw(Parser *); static expr_ty primary_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, primary_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -13388,42 +12959,37 @@ primary_rule(Parser *p) while (1) { int tmpvar_8 = _PyPegen_update_memo(p, _mark, primary_type, _res); if (tmpvar_8) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = primary_raw(p); p->in_raw_rule--; - if (p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty primary_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13432,7 +12998,7 @@ primary_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // primary '.' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary '.' NAME")); @@ -13450,7 +13016,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary '.' 
NAME")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13460,7 +13026,7 @@ primary_raw(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13471,7 +13037,7 @@ primary_raw(Parser *p) } { // primary genexp if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary genexp")); @@ -13486,7 +13052,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary genexp")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13496,7 +13062,7 @@ primary_raw(Parser *p) _res = _PyAST_Call ( a , CHECK ( asdl_expr_seq* , ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , b ) ) , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13507,7 +13073,7 @@ primary_raw(Parser *p) } { // primary '(' arguments? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary '(' arguments? ')'")); @@ -13528,7 +13094,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary '(' arguments? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13538,7 +13104,7 @@ primary_raw(Parser *p) _res = _PyAST_Call ( a , ( b ) ? ( ( expr_ty ) b ) -> v . Call . args : NULL , ( b ) ? ( ( expr_ty ) b ) -> v . Call . 
keywords : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13549,7 +13115,7 @@ primary_raw(Parser *p) } { // primary '[' slices ']' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary '[' slices ']'")); @@ -13570,7 +13136,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary '[' slices ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13580,7 +13146,7 @@ primary_raw(Parser *p) _res = _PyAST_Subscript ( a , b , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13591,7 +13157,7 @@ primary_raw(Parser *p) } { // atom if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "atom")); @@ -13610,7 +13176,7 @@ primary_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -13618,19 +13184,16 @@ primary_raw(Parser *p) static expr_ty slices_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13639,7 +13202,7 @@ slices_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // slice !',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slices[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice !','")); @@ -13654,7 +13217,7 @@ slices_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13665,7 +13228,7 @@ slices_rule(Parser *p) } { // ','.slice+ ','? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slices[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.slice+ ','?")); @@ -13681,7 +13244,7 @@ slices_rule(Parser *p) D(fprintf(stderr, "%*c+ slices[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','.slice+ ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13691,7 +13254,7 @@ slices_rule(Parser *p) _res = _PyAST_Tuple ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13702,7 +13265,7 @@ slices_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -13710,19 +13273,16 @@ slices_rule(Parser *p) static expr_ty slice_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13731,7 +13291,7 @@ slice_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // expression? ':' expression? [':' expression?] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slice[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression? ':' expression? [':' expression?]")); @@ -13752,7 +13312,7 @@ slice_rule(Parser *p) D(fprintf(stderr, "%*c+ slice[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression? ':' expression? 
[':' expression?]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13762,7 +13322,7 @@ slice_rule(Parser *p) _res = _PyAST_Slice ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13773,7 +13333,7 @@ slice_rule(Parser *p) } { // named_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> slice[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -13786,7 +13346,7 @@ slice_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13797,7 +13357,7 @@ slice_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -13815,19 +13375,16 @@ slice_rule(Parser *p) static expr_ty atom_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13836,7 +13393,7 @@ atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -13855,7 +13412,7 @@ atom_rule(Parser *p) } { // 'True' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -13867,7 +13424,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13877,7 +13434,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_True , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13888,7 +13445,7 @@ atom_rule(Parser *p) } { // 'False' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -13900,7 +13457,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13910,7 +13467,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_False , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13921,7 +13478,7 @@ atom_rule(Parser *p) } { // 'None' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -13933,7 +13490,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); Token *_token = 
_PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -13943,7 +13500,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_None , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -13954,7 +13511,7 @@ atom_rule(Parser *p) } { // &STRING strings if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&STRING strings")); @@ -13975,7 +13532,7 @@ atom_rule(Parser *p) } { // NUMBER if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -13994,7 +13551,7 @@ atom_rule(Parser *p) } { // &'(' (tuple | group | genexp) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'(' (tuple | group | genexp)")); @@ -14015,7 +13572,7 @@ atom_rule(Parser *p) } { // &'[' (list | listcomp) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'[' (list | listcomp)")); @@ -14036,7 +13593,7 @@ atom_rule(Parser *p) } { // &'{' (dict | set | dictcomp | setcomp) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'{' (dict | set | dictcomp | setcomp)")); @@ -14057,7 +13614,7 @@ atom_rule(Parser *p) } { // '...' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); @@ -14069,7 +13626,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -14079,7 +13636,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_Ellipsis , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14090,7 +13647,7 @@ atom_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14098,19 +13655,16 @@ atom_rule(Parser *p) static expr_ty group_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // '(' (yield_expr | named_expression) ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' (yield_expr | named_expression) ')'")); @@ -14129,7 +13683,7 @@ group_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14140,7 +13694,7 @@ group_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_group if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_group")); @@ -14159,7 +13713,7 @@ group_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return 
_res; } @@ -14167,19 +13721,16 @@ group_rule(Parser *p) static expr_ty lambdef_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -14188,7 +13739,7 @@ lambdef_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'lambda' lambda_params? ':' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambdef[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'lambda' lambda_params? ':' expression")); @@ -14209,7 +13760,7 @@ lambdef_rule(Parser *p) D(fprintf(stderr, "%*c+ lambdef[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'lambda' lambda_params? ':' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -14219,7 +13770,7 @@ lambdef_rule(Parser *p) _res = _PyAST_Lambda ( ( a ) ? a : CHECK ( arguments_ty , _PyPegen_empty_arguments ( p ) ) , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14230,7 +13781,7 @@ lambdef_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14238,19 +13789,16 @@ lambdef_rule(Parser *p) static arguments_ty lambda_params_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arguments_ty _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_lambda_parameters if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_lambda_parameters")); @@ -14269,7 +13817,7 @@ lambda_params_rule(Parser *p) } { // lambda_parameters if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_parameters")); @@ -14288,7 +13836,7 @@ lambda_params_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14301,19 +13849,16 @@ lambda_params_rule(Parser *p) static arguments_ty lambda_parameters_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arguments_ty _res = NULL; int _mark = p->mark; { // lambda_slash_no_default lambda_param_no_default* lambda_param_with_default* lambda_star_etc? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default lambda_param_no_default* lambda_param_with_default* lambda_star_etc?")); @@ -14335,7 +13880,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , a , NULL , b , c , d ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14346,7 +13891,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_slash_with_default lambda_param_with_default* lambda_star_etc? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default lambda_param_with_default* lambda_star_etc?")); @@ -14365,7 +13910,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , a , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14376,7 +13921,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_param_no_default+ lambda_param_with_default* lambda_star_etc? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default+ lambda_param_with_default* lambda_star_etc?")); @@ -14395,7 +13940,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14406,7 +13951,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_param_with_default+ lambda_star_etc? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default+ lambda_star_etc?")); @@ -14422,7 +13967,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14433,7 +13978,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_star_etc if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_star_etc")); @@ -14446,7 +13991,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14457,7 +14002,7 @@ lambda_parameters_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14467,19 +14012,16 @@ lambda_parameters_rule(Parser *p) static asdl_arg_seq* lambda_slash_no_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_arg_seq* _res = NULL; int _mark = p->mark; { // lambda_param_no_default+ '/' ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default+ '/' ','")); @@ -14498,7 +14040,7 @@ lambda_slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14509,7 +14051,7 @@ lambda_slash_no_default_rule(Parser *p) } { // lambda_param_no_default+ '/' &':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default+ '/' &':'")); @@ -14527,7 +14069,7 @@ lambda_slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14538,7 +14080,7 @@ lambda_slash_no_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14548,19 +14090,16 @@ lambda_slash_no_default_rule(Parser *p) static SlashWithDefault* lambda_slash_with_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } SlashWithDefault* _res = NULL; int _mark = p->mark; { // lambda_param_no_default* lambda_param_with_default+ '/' ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* lambda_param_with_default+ '/' ','")); @@ -14582,7 +14121,7 @@ lambda_slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14593,7 +14132,7 @@ lambda_slash_with_default_rule(Parser *p) } { // lambda_param_no_default* lambda_param_with_default+ '/' &':' if 
(p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* lambda_param_with_default+ '/' &':'")); @@ -14614,7 +14153,7 @@ lambda_slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14625,7 +14164,7 @@ lambda_slash_with_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14637,19 +14176,16 @@ lambda_slash_with_default_rule(Parser *p) static StarEtc* lambda_star_etc_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } StarEtc* _res = NULL; int _mark = p->mark; { // '*' lambda_param_no_default lambda_param_maybe_default* lambda_kwds? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' lambda_param_no_default lambda_param_maybe_default* lambda_kwds?")); @@ -14671,7 +14207,7 @@ lambda_star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14682,7 +14218,7 @@ lambda_star_etc_rule(Parser *p) } { // '*' ',' lambda_param_maybe_default+ lambda_kwds? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' ',' lambda_param_maybe_default+ lambda_kwds?")); @@ -14704,7 +14240,7 @@ lambda_star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14715,7 +14251,7 @@ lambda_star_etc_rule(Parser *p) } { // lambda_kwds if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_kwds")); @@ -14728,7 +14264,7 @@ lambda_star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14739,7 +14275,7 @@ lambda_star_etc_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_lambda_star_etc if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_lambda_star_etc")); @@ -14758,7 +14294,7 @@ lambda_star_etc_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14766,19 +14302,16 @@ lambda_star_etc_rule(Parser *p) static arg_ty lambda_kwds_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // '**' lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_kwds[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' lambda_param_no_default")); @@ -14794,7 +14327,7 @@ lambda_kwds_rule(Parser *p) _res = a; if (_res == NULL && 
PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14805,7 +14338,7 @@ lambda_kwds_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14813,19 +14346,16 @@ lambda_kwds_rule(Parser *p) static arg_ty lambda_param_no_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // lambda_param ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param ','")); @@ -14841,7 +14371,7 @@ lambda_param_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14852,7 +14382,7 @@ lambda_param_no_default_rule(Parser *p) } { // lambda_param &':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param &':'")); @@ -14867,7 +14397,7 @@ lambda_param_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14878,7 +14408,7 @@ lambda_param_no_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14886,19 +14416,16 @@ lambda_param_no_default_rule(Parser *p) static NameDefaultPair* lambda_param_with_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // lambda_param default ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default ','")); @@ -14917,7 +14444,7 @@ lambda_param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14928,7 +14455,7 @@ lambda_param_with_default_rule(Parser *p) } { // lambda_param default &':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default &':'")); @@ -14946,7 +14473,7 @@ lambda_param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -14957,7 +14484,7 @@ lambda_param_with_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -14965,19 +14492,16 @@ lambda_param_with_default_rule(Parser *p) static NameDefaultPair* lambda_param_maybe_default_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // lambda_param default? 
',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default? ','")); @@ -14996,7 +14520,7 @@ lambda_param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15007,7 +14531,7 @@ lambda_param_maybe_default_rule(Parser *p) } { // lambda_param default? &':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default? &':'")); @@ -15025,7 +14549,7 @@ lambda_param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15036,7 +14560,7 @@ lambda_param_maybe_default_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15044,19 +14568,16 @@ lambda_param_maybe_default_rule(Parser *p) static arg_ty lambda_param_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } arg_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15065,7 +14586,7 @@ lambda_param_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> lambda_param[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -15077,7 +14598,7 @@ lambda_param_rule(Parser *p) D(fprintf(stderr, "%*c+ lambda_param[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -15087,7 +14608,7 @@ lambda_param_rule(Parser *p) _res = _PyAST_arg ( a -> v . Name . 
id , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15098,7 +14619,7 @@ lambda_param_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15106,23 +14627,20 @@ lambda_param_rule(Parser *p) static expr_ty strings_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, strings_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; { // STRING+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> strings[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "STRING+")); @@ -15135,7 +14653,7 @@ strings_rule(Parser *p) _res = _PyPegen_concatenate_strings ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15147,7 +14665,7 @@ strings_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, strings_type, _res); - p->level--; + D(p->level--); return _res; } @@ -15155,19 +14673,16 @@ strings_rule(Parser *p) static expr_ty list_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15176,7 +14691,7 @@ list_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '[' star_named_expressions? ']' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> list[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' star_named_expressions? ']'")); @@ -15194,7 +14709,7 @@ list_rule(Parser *p) D(fprintf(stderr, "%*c+ list[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' star_named_expressions? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -15204,7 +14719,7 @@ list_rule(Parser *p) _res = _PyAST_List ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15215,7 +14730,7 @@ list_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15223,19 +14738,16 @@ list_rule(Parser *p) static expr_ty tuple_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15244,7 +14756,7 @@ tuple_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '(' [star_named_expression ',' star_named_expressions?] ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> tuple[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' [star_named_expression ',' star_named_expressions?] 
')'")); @@ -15262,7 +14774,7 @@ tuple_rule(Parser *p) D(fprintf(stderr, "%*c+ tuple[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' [star_named_expression ',' star_named_expressions?] ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -15272,7 +14784,7 @@ tuple_rule(Parser *p) _res = _PyAST_Tuple ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15283,7 +14795,7 @@ tuple_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15291,19 +14803,16 @@ tuple_rule(Parser *p) static expr_ty set_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15312,7 +14821,7 @@ set_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' star_named_expressions '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> set[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' star_named_expressions '}'")); @@ -15330,7 +14839,7 @@ set_rule(Parser *p) D(fprintf(stderr, "%*c+ set[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' star_named_expressions '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -15340,7 +14849,7 @@ set_rule(Parser *p) _res = _PyAST_Set ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15351,7 +14860,7 @@ set_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15359,19 +14868,16 @@ set_rule(Parser *p) static expr_ty dict_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15380,7 +14886,7 @@ dict_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' double_starred_kvpairs? '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dict[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' double_starred_kvpairs? '}'")); @@ -15398,7 +14904,7 @@ dict_rule(Parser *p) D(fprintf(stderr, "%*c+ dict[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' double_starred_kvpairs? 
'}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -15408,7 +14914,7 @@ dict_rule(Parser *p) _res = _PyAST_Dict ( CHECK ( asdl_expr_seq* , _PyPegen_get_keys ( p , a ) ) , CHECK ( asdl_expr_seq* , _PyPegen_get_values ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15419,7 +14925,7 @@ dict_rule(Parser *p) } { // '{' invalid_double_starred_kvpairs '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dict[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' invalid_double_starred_kvpairs '}'")); @@ -15444,7 +14950,7 @@ dict_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15452,19 +14958,16 @@ dict_rule(Parser *p) static asdl_seq* double_starred_kvpairs_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.double_starred_kvpair+ ','? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ','?")); @@ -15481,7 +14984,7 @@ double_starred_kvpairs_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15492,7 +14995,7 @@ double_starred_kvpairs_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15500,19 +15003,16 @@ double_starred_kvpairs_rule(Parser *p) static KeyValuePair* double_starred_kvpair_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } KeyValuePair* _res = NULL; int _mark = p->mark; { // '**' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> double_starred_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' bitwise_or")); @@ -15528,7 +15028,7 @@ double_starred_kvpair_rule(Parser *p) _res = _PyPegen_key_value_pair ( p , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15539,7 +15039,7 @@ double_starred_kvpair_rule(Parser *p) } { // kvpair if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> double_starred_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kvpair")); @@ -15558,7 +15058,7 @@ double_starred_kvpair_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15566,19 +15066,16 @@ double_starred_kvpair_rule(Parser *p) static KeyValuePair* kvpair_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } KeyValuePair* _res = NULL; int _mark = p->mark; { // expression ':' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' expression")); @@ -15597,7 +15094,7 @@ kvpair_rule(Parser *p) _res = _PyPegen_key_value_pair ( p , a , 
b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15608,7 +15105,7 @@ kvpair_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15616,19 +15113,16 @@ kvpair_rule(Parser *p) static asdl_comprehension_seq* for_if_clauses_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_comprehension_seq* _res = NULL; int _mark = p->mark; { // for_if_clause+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_if_clauses[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "for_if_clause+")); @@ -15641,7 +15135,7 @@ for_if_clauses_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15652,7 +15146,7 @@ for_if_clauses_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15663,19 +15157,16 @@ for_if_clauses_rule(Parser *p) static comprehension_ty for_if_clause_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } comprehension_ty _res = NULL; int _mark = p->mark; { // ASYNC 'for' star_targets 'in' ~ disjunction (('if' disjunction))* if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_if_clause[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); @@ -15706,7 +15197,7 @@ for_if_clause_rule(Parser *p) _res = CHECK_VERSION ( comprehension_ty , 6 , "Async comprehensions are" , _PyAST_comprehension ( a , b , c , 1 , p -> arena ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15715,13 +15206,13 @@ for_if_clause_rule(Parser *p) D(fprintf(stderr, "%*c%s for_if_clause[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); if (_cut_var) { - p->level--; + D(p->level--); return NULL; } } { // 'for' star_targets 'in' ~ disjunction (('if' disjunction))* if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_if_clause[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); @@ -15749,7 +15240,7 @@ for_if_clause_rule(Parser *p) _res = _PyAST_comprehension ( a , b , c , 0 , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15758,13 +15249,13 @@ for_if_clause_rule(Parser *p) D(fprintf(stderr, "%*c%s for_if_clause[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); if (_cut_var) { - p->level--; + D(p->level--); return NULL; } } if (p->call_invalid_rules) { // invalid_for_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> for_if_clause[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_for_target")); @@ -15783,7 +15274,7 @@ for_if_clause_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15791,19 +15282,16 @@ for_if_clause_rule(Parser *p) static expr_ty listcomp_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15812,7 +15300,7 @@ listcomp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '[' named_expression for_if_clauses ']' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> listcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' named_expression for_if_clauses ']'")); @@ -15833,7 +15321,7 @@ listcomp_rule(Parser *p) D(fprintf(stderr, "%*c+ listcomp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' named_expression for_if_clauses ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -15843,7 +15331,7 @@ listcomp_rule(Parser *p) _res = _PyAST_ListComp ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15854,7 +15342,7 @@ listcomp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_comprehension if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> listcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_comprehension")); @@ -15873,7 +15361,7 @@ listcomp_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15881,19 +15369,16 @@ listcomp_rule(Parser *p) static expr_ty setcomp_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15902,7 +15387,7 @@ setcomp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' named_expression for_if_clauses '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> setcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' named_expression for_if_clauses '}'")); @@ -15923,7 +15408,7 @@ setcomp_rule(Parser *p) D(fprintf(stderr, "%*c+ setcomp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' named_expression for_if_clauses '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -15933,7 +15418,7 @@ setcomp_rule(Parser *p) _res = _PyAST_SetComp ( a , b , EXTRA ); if 
(_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -15944,7 +15429,7 @@ setcomp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_comprehension if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> setcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_comprehension")); @@ -15963,7 +15448,7 @@ setcomp_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -15973,19 +15458,16 @@ setcomp_rule(Parser *p) static expr_ty genexp_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15994,7 +15476,7 @@ genexp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '(' (assignment_expression | expression !':=') for_if_clauses ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> genexp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' (assignment_expression | expression !':=') for_if_clauses ')'")); @@ -16015,7 +15497,7 @@ genexp_rule(Parser *p) D(fprintf(stderr, "%*c+ genexp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' (assignment_expression | expression !':=') for_if_clauses ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16025,7 +15507,7 @@ genexp_rule(Parser *p) _res = _PyAST_GeneratorExp ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16036,7 +15518,7 @@ genexp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_comprehension if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> genexp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_comprehension")); @@ -16055,7 +15537,7 @@ genexp_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16063,19 +15545,16 @@ genexp_rule(Parser *p) static expr_ty dictcomp_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16084,7 +15563,7 @@ dictcomp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' kvpair for_if_clauses '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dictcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' kvpair for_if_clauses '}'")); @@ -16105,7 +15584,7 @@ dictcomp_rule(Parser *p) D(fprintf(stderr, "%*c+ dictcomp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' kvpair for_if_clauses '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16115,7 +15594,7 @@ dictcomp_rule(Parser *p) _res 
= _PyAST_DictComp ( a -> key , a -> value , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16126,7 +15605,7 @@ dictcomp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_dict_comprehension if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> dictcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_dict_comprehension")); @@ -16145,7 +15624,7 @@ dictcomp_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16153,23 +15632,20 @@ dictcomp_rule(Parser *p) static expr_ty arguments_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, arguments_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; { // args ','? &')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ','? &')'")); @@ -16188,7 +15664,7 @@ arguments_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16199,7 +15675,7 @@ arguments_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_arguments if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_arguments")); @@ -16219,7 +15695,7 @@ arguments_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, arguments_type, _res); - p->level--; + D(p->level--); return _res; } @@ -16229,19 +15705,16 @@ arguments_rule(Parser *p) static expr_ty args_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16250,7 +15723,7 @@ args_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // ','.(starred_expression | (assignment_expression | expression !':=') !'=')+ [',' kwargs] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> args[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.(starred_expression | (assignment_expression | expression !':=') !'=')+ [',' kwargs]")); @@ -16265,7 +15738,7 @@ args_rule(Parser *p) D(fprintf(stderr, "%*c+ args[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','.(starred_expression | (assignment_expression | expression !':=') !'=')+ [',' kwargs]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16275,7 +15748,7 @@ args_rule(Parser *p) _res = _PyPegen_collect_call_seqs ( p , a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16286,7 +15759,7 @@ args_rule(Parser *p) } { // kwargs if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> args[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwargs")); 
@@ -16298,7 +15771,7 @@ args_rule(Parser *p) D(fprintf(stderr, "%*c+ args[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "kwargs")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16308,7 +15781,7 @@ args_rule(Parser *p) _res = _PyAST_Call ( _PyPegen_dummy_name ( p ) , CHECK_NULL_ALLOWED ( asdl_expr_seq* , _PyPegen_seq_extract_starred_exprs ( p , a ) ) , CHECK_NULL_ALLOWED ( asdl_keyword_seq* , _PyPegen_seq_delete_starred_exprs ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16319,7 +15792,7 @@ args_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16330,19 +15803,16 @@ args_rule(Parser *p) static asdl_seq* kwargs_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.kwarg_or_starred+ ',' ','.kwarg_or_double_starred+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwargs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.kwarg_or_starred+ ',' ','.kwarg_or_double_starred+")); @@ -16361,7 +15831,7 @@ kwargs_rule(Parser *p) _res = _PyPegen_join_sequences ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16372,7 +15842,7 @@ kwargs_rule(Parser *p) } { // ','.kwarg_or_starred+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwargs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.kwarg_or_starred+")); @@ -16391,7 +15861,7 @@ kwargs_rule(Parser *p) } { // ','.kwarg_or_double_starred+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwargs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.kwarg_or_double_starred+")); @@ -16410,7 +15880,7 @@ kwargs_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16418,19 +15888,16 @@ kwargs_rule(Parser *p) static expr_ty starred_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16439,7 +15906,7 @@ starred_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> starred_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' expression")); @@ -16454,7 +15921,7 @@ starred_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ starred_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16464,7 +15931,7 @@ starred_expression_rule(Parser *p) _res = _PyAST_Starred ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - 
p->level--; + D(p->level--); return NULL; } goto done; @@ -16475,7 +15942,7 @@ starred_expression_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16483,19 +15950,16 @@ starred_expression_rule(Parser *p) static KeywordOrStarred* kwarg_or_starred_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } KeywordOrStarred* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16504,7 +15968,7 @@ kwarg_or_starred_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_kwarg if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwarg_or_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_kwarg")); @@ -16523,7 +15987,7 @@ kwarg_or_starred_rule(Parser *p) } { // NAME '=' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwarg_or_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); @@ -16541,7 +16005,7 @@ kwarg_or_starred_rule(Parser *p) D(fprintf(stderr, "%*c+ kwarg_or_starred[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16551,7 +16015,7 @@ kwarg_or_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , CHECK ( keyword_ty , _PyAST_keyword ( a -> v . Name . 
id , b , EXTRA ) ) , 1 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16562,7 +16026,7 @@ kwarg_or_starred_rule(Parser *p) } { // starred_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwarg_or_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); @@ -16575,7 +16039,7 @@ kwarg_or_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , a , 0 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16586,7 +16050,7 @@ kwarg_or_starred_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16594,19 +16058,16 @@ kwarg_or_starred_rule(Parser *p) static KeywordOrStarred* kwarg_or_double_starred_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } KeywordOrStarred* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16615,7 +16076,7 @@ kwarg_or_double_starred_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_kwarg if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwarg_or_double_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_kwarg")); @@ -16634,7 +16095,7 @@ kwarg_or_double_starred_rule(Parser *p) } { // NAME '=' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwarg_or_double_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); @@ -16652,7 +16113,7 @@ kwarg_or_double_starred_rule(Parser *p) D(fprintf(stderr, "%*c+ kwarg_or_double_starred[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16662,7 +16123,7 @@ kwarg_or_double_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , CHECK ( keyword_ty , _PyAST_keyword ( a -> v . Name . 
id , b , EXTRA ) ) , 1 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16673,7 +16134,7 @@ kwarg_or_double_starred_rule(Parser *p) } { // '**' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> kwarg_or_double_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' expression")); @@ -16688,7 +16149,7 @@ kwarg_or_double_starred_rule(Parser *p) D(fprintf(stderr, "%*c+ kwarg_or_double_starred[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16698,7 +16159,7 @@ kwarg_or_double_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , CHECK ( keyword_ty , _PyAST_keyword ( NULL , a , EXTRA ) ) , 1 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16709,7 +16170,7 @@ kwarg_or_double_starred_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16717,19 +16178,16 @@ kwarg_or_double_starred_rule(Parser *p) static expr_ty star_targets_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16738,7 +16196,7 @@ star_targets_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // star_target !',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target !','")); @@ -16753,7 +16211,7 @@ star_targets_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16764,7 +16222,7 @@ star_targets_rule(Parser *p) } { // star_target ((',' star_target))* ','? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target ((',' star_target))* ','?")); @@ -16783,7 +16241,7 @@ star_targets_rule(Parser *p) D(fprintf(stderr, "%*c+ star_targets[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_target ((',' star_target))* ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16793,7 +16251,7 @@ star_targets_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16804,7 +16262,7 @@ star_targets_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16812,19 +16270,16 @@ star_targets_rule(Parser *p) static asdl_expr_seq* star_targets_list_seq_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.star_target+ ','? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_targets_list_seq[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.star_target+ ','?")); @@ -16841,7 +16296,7 @@ star_targets_list_seq_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16852,7 +16307,7 @@ star_targets_list_seq_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16860,19 +16315,16 @@ star_targets_list_seq_rule(Parser *p) static asdl_expr_seq* star_targets_tuple_seq_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // star_target ((',' star_target))+ ','? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_targets_tuple_seq[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target ((',' star_target))+ ','?")); @@ -16892,7 +16344,7 @@ star_targets_tuple_seq_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_insert_in_front ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16903,7 +16355,7 @@ star_targets_tuple_seq_rule(Parser *p) } { // star_target ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_targets_tuple_seq[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target ','")); @@ -16919,7 +16371,7 @@ star_targets_tuple_seq_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16930,7 +16382,7 @@ star_targets_tuple_seq_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -16938,23 +16390,20 @@ star_targets_tuple_seq_rule(Parser *p) static expr_ty star_target_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, star_target_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16963,7 +16412,7 @@ star_target_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' (!'*' star_target) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (!'*' star_target)")); @@ -16978,7 +16427,7 @@ star_target_rule(Parser *p) D(fprintf(stderr, "%*c+ star_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' (!'*' star_target)")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -16988,7 +16437,7 @@ star_target_rule(Parser *p) _res = _PyAST_Starred ( CHECK ( expr_ty , _PyPegen_set_expr_context ( p , a , Store ) ) , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -16999,7 +16448,7 @@ star_target_rule(Parser *p) } { // target_with_star_atom if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "target_with_star_atom")); @@ -17019,7 +16468,7 @@ star_target_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, star_target_type, _res); - p->level--; + D(p->level--); return _res; } @@ -17030,23 +16479,20 @@ star_target_rule(Parser *p) static expr_ty target_with_star_atom_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, target_with_star_atom_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && 
_PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17055,7 +16501,7 @@ target_with_star_atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME !t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> target_with_star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); @@ -17075,7 +16521,7 @@ target_with_star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ target_with_star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17085,7 +16531,7 @@ target_with_star_atom_rule(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17096,7 +16542,7 @@ target_with_star_atom_rule(Parser *p) } { // t_primary '[' slices ']' !t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> target_with_star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); @@ -17119,7 +16565,7 @@ target_with_star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ target_with_star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17129,7 +16575,7 @@ target_with_star_atom_rule(Parser *p) _res = _PyAST_Subscript ( a , b , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17140,7 +16586,7 @@ target_with_star_atom_rule(Parser *p) } { // star_atom if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> target_with_star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_atom")); @@ -17160,7 +16606,7 @@ target_with_star_atom_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, target_with_star_atom_type, _res); - p->level--; + D(p->level--); return _res; } @@ -17172,19 +16618,16 @@ target_with_star_atom_rule(Parser *p) static expr_ty star_atom_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17193,7 +16636,7 @@ star_atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -17206,7 +16649,7 @@ star_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17217,7 +16660,7 @@ star_atom_rule(Parser *p) } { // '(' 
target_with_star_atom ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' target_with_star_atom ')'")); @@ -17236,7 +16679,7 @@ star_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17247,7 +16690,7 @@ star_atom_rule(Parser *p) } { // '(' star_targets_tuple_seq? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' star_targets_tuple_seq? ')'")); @@ -17265,7 +16708,7 @@ star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' star_targets_tuple_seq? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17275,7 +16718,7 @@ star_atom_rule(Parser *p) _res = _PyAST_Tuple ( a , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17286,7 +16729,7 @@ star_atom_rule(Parser *p) } { // '[' star_targets_list_seq? ']' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' star_targets_list_seq? ']'")); @@ -17304,7 +16747,7 @@ star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' star_targets_list_seq? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17314,7 +16757,7 @@ star_atom_rule(Parser *p) _res = _PyAST_List ( a , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17325,7 +16768,7 @@ star_atom_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -17333,19 +16776,16 @@ star_atom_rule(Parser *p) static expr_ty single_target_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // single_subscript_attribute_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> single_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "single_subscript_attribute_target")); @@ -17364,7 +16804,7 @@ single_target_rule(Parser *p) } { // NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> single_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -17377,7 +16817,7 @@ single_target_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17388,7 +16828,7 @@ single_target_rule(Parser *p) } { // '(' single_target ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> single_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' single_target ')'")); @@ -17407,7 +16847,7 @@ single_target_rule(Parser *p) _res = a; 
if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17418,7 +16858,7 @@ single_target_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -17428,19 +16868,16 @@ single_target_rule(Parser *p) static expr_ty single_subscript_attribute_target_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17449,7 +16886,7 @@ single_subscript_attribute_target_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME !t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> single_subscript_attribute_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); @@ -17469,7 +16906,7 @@ single_subscript_attribute_target_rule(Parser *p) D(fprintf(stderr, "%*c+ single_subscript_attribute_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17479,7 +16916,7 @@ single_subscript_attribute_target_rule(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17490,7 +16927,7 @@ single_subscript_attribute_target_rule(Parser *p) } { // t_primary '[' slices ']' !t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> single_subscript_attribute_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); @@ -17513,7 +16950,7 @@ single_subscript_attribute_target_rule(Parser *p) D(fprintf(stderr, "%*c+ single_subscript_attribute_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17523,7 +16960,7 @@ single_subscript_attribute_target_rule(Parser *p) _res = _PyAST_Subscript ( a , b , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17534,7 +16971,7 @@ single_subscript_attribute_target_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -17549,13 +16986,10 @@ static expr_ty t_primary_raw(Parser *); static expr_ty t_primary_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); expr_ty _res = NULL; if (_PyPegen_is_memoized(p, t_primary_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; @@ -17563,42 +16997,37 @@ t_primary_rule(Parser *p) while (1) { int tmpvar_9 = _PyPegen_update_memo(p, _mark, t_primary_type, _res); if (tmpvar_9) { - p->level--; + D(p->level--); return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = t_primary_raw(p); p->in_raw_rule--; - if 
(p->error_indicator) { - p->level--; + if (p->error_indicator) return NULL; - } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - p->level--; + D(p->level--); return _res; } static expr_ty t_primary_raw(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17607,7 +17036,7 @@ t_primary_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME &t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME &t_lookahead")); @@ -17627,7 +17056,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17637,7 +17066,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17648,7 +17077,7 @@ t_primary_raw(Parser *p) } { // t_primary '[' slices ']' &t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' &t_lookahead")); @@ -17671,7 +17100,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17681,7 +17110,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Subscript ( a , b , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17692,7 +17121,7 @@ t_primary_raw(Parser *p) } { // t_primary genexp &t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary genexp &t_lookahead")); @@ -17709,7 +17138,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary genexp &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17719,7 +17148,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Call ( a , CHECK ( asdl_expr_seq* , ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , b ) ) , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17730,7 +17159,7 @@ t_primary_raw(Parser *p) } { // t_primary '(' arguments? 
')' &t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '(' arguments? ')' &t_lookahead")); @@ -17753,7 +17182,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '(' arguments? ')' &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17763,7 +17192,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Call ( a , ( b ) ? ( ( expr_ty ) b ) -> v . Call . args : NULL , ( b ) ? ( ( expr_ty ) b ) -> v . Call . keywords : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17774,7 +17203,7 @@ t_primary_raw(Parser *p) } { // atom &t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "atom &t_lookahead")); @@ -17789,7 +17218,7 @@ t_primary_raw(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17800,7 +17229,7 @@ t_primary_raw(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -17808,19 +17237,16 @@ t_primary_raw(Parser *p) static void * t_lookahead_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_lookahead[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -17839,7 +17265,7 @@ t_lookahead_rule(Parser *p) } { // '[' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_lookahead[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -17858,7 +17284,7 @@ t_lookahead_rule(Parser *p) } { // '.' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> t_lookahead[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -17877,7 +17303,7 @@ t_lookahead_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -17885,19 +17311,16 @@ t_lookahead_rule(Parser *p) static asdl_expr_seq* del_targets_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.del_target+ ','? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.del_target+ ','?")); @@ -17914,7 +17337,7 @@ del_targets_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -17925,7 +17348,7 @@ del_targets_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -17936,23 +17359,20 @@ del_targets_rule(Parser *p) static expr_ty del_target_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, del_target_type, &_res)) { - p->level--; + D(p->level--); return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17961,7 +17381,7 @@ del_target_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME !t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); @@ -17981,7 +17401,7 @@ del_target_rule(Parser *p) D(fprintf(stderr, "%*c+ del_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -17991,7 +17411,7 @@ del_target_rule(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . 
id , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18002,7 +17422,7 @@ del_target_rule(Parser *p) } { // t_primary '[' slices ']' !t_lookahead if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); @@ -18025,7 +17445,7 @@ del_target_rule(Parser *p) D(fprintf(stderr, "%*c+ del_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -18035,7 +17455,7 @@ del_target_rule(Parser *p) _res = _PyAST_Subscript ( a , b , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18046,7 +17466,7 @@ del_target_rule(Parser *p) } { // del_t_atom if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "del_t_atom")); @@ -18066,7 +17486,7 @@ del_target_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, del_target_type, _res); - p->level--; + D(p->level--); return _res; } @@ -18074,19 +17494,16 @@ del_target_rule(Parser *p) static expr_ty del_t_atom_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -18095,7 +17512,7 @@ del_t_atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -18108,7 +17525,7 @@ del_t_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Del ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18119,7 +17536,7 @@ del_t_atom_rule(Parser *p) } { // '(' del_target ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' del_target ')'")); @@ -18138,7 +17555,7 @@ del_t_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Del ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18149,7 +17566,7 @@ del_t_atom_rule(Parser *p) } { // '(' del_targets? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' del_targets? ')'")); @@ -18167,7 +17584,7 @@ del_t_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ del_t_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' del_targets? 
')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -18177,7 +17594,7 @@ del_t_atom_rule(Parser *p) _res = _PyAST_Tuple ( a , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18188,7 +17605,7 @@ del_t_atom_rule(Parser *p) } { // '[' del_targets? ']' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' del_targets? ']'")); @@ -18206,7 +17623,7 @@ del_t_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ del_t_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' del_targets? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -18216,7 +17633,7 @@ del_t_atom_rule(Parser *p) _res = _PyAST_List ( a , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18227,7 +17644,7 @@ del_t_atom_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -18242,19 +17659,16 @@ del_t_atom_rule(Parser *p) static asdl_expr_seq* type_expressions_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.expression+ ',' '*' expression ',' '**' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+ ',' '*' expression ',' '**' expression")); @@ -18285,7 +17699,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , CHECK ( asdl_seq* , _PyPegen_seq_append_to_end ( p , a , b ) ) , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18296,7 +17710,7 @@ type_expressions_rule(Parser *p) } { // ','.expression+ ',' '*' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+ ',' '*' expression")); @@ -18318,7 +17732,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18329,7 +17743,7 @@ type_expressions_rule(Parser *p) } { // ','.expression+ ',' '**' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+ ',' '**' expression")); @@ -18351,7 +17765,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18362,7 +17776,7 @@ type_expressions_rule(Parser *p) } { // '*' expression ',' '**' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' 
', _mark, p->mark, "'*' expression ',' '**' expression")); @@ -18387,7 +17801,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , CHECK ( asdl_seq* , _PyPegen_singleton_seq ( p , a ) ) , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18398,7 +17812,7 @@ type_expressions_rule(Parser *p) } { // '*' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' expression")); @@ -18414,7 +17828,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18425,7 +17839,7 @@ type_expressions_rule(Parser *p) } { // '**' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' expression")); @@ -18441,7 +17855,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18452,7 +17866,7 @@ type_expressions_rule(Parser *p) } { // ','.expression+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+")); @@ -18465,7 +17879,7 @@ type_expressions_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18476,7 +17890,7 @@ type_expressions_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -18487,19 +17901,16 @@ type_expressions_rule(Parser *p) static Token* func_type_comment_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } Token* _res = NULL; int _mark = p->mark; { // NEWLINE TYPE_COMMENT &(NEWLINE INDENT) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> func_type_comment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE TYPE_COMMENT &(NEWLINE INDENT)")); @@ -18517,7 +17928,7 @@ func_type_comment_rule(Parser *p) _res = t; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18528,7 +17939,7 @@ func_type_comment_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_double_type_comments if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> func_type_comment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_double_type_comments")); @@ -18547,7 +17958,7 @@ func_type_comment_rule(Parser *p) } { // TYPE_COMMENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> func_type_comment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "TYPE_COMMENT")); @@ -18566,7 +17977,7 @@ func_type_comment_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -18580,19 +17991,16 @@ func_type_comment_rule(Parser *p) static void * invalid_arguments_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - 
PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // args ',' '*' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ',' '*'")); @@ -18611,7 +18019,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "iterable argument unpacking follows keyword argument unpacking" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18622,7 +18030,7 @@ invalid_arguments_rule(Parser *p) } { // expression for_if_clauses ',' [args | expression for_if_clauses] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses ',' [args | expression for_if_clauses]")); @@ -18645,7 +18053,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , PyPegen_last_item ( b , comprehension_ty ) -> target , "Generator expression must be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18656,7 +18064,7 @@ invalid_arguments_rule(Parser *p) } { // NAME '=' expression for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression for_if_clauses")); @@ -18678,7 +18086,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Maybe you meant '==' or ':=' instead of '='?" 
); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18689,7 +18097,7 @@ invalid_arguments_rule(Parser *p) } { // args for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args for_if_clauses")); @@ -18705,7 +18113,7 @@ invalid_arguments_rule(Parser *p) _res = _PyPegen_nonparen_genexp_in_call ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18716,7 +18124,7 @@ invalid_arguments_rule(Parser *p) } { // args ',' expression for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ',' expression for_if_clauses")); @@ -18738,7 +18146,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , asdl_seq_GET ( b , b -> size - 1 ) -> target , "Generator expression must be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18749,7 +18157,7 @@ invalid_arguments_rule(Parser *p) } { // args ',' args if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ',' args")); @@ -18768,7 +18176,7 @@ invalid_arguments_rule(Parser *p) _res = _PyPegen_arguments_parsing_error ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18779,7 +18187,7 @@ invalid_arguments_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -18790,19 +18198,16 @@ invalid_arguments_rule(Parser *p) static void * invalid_kwarg_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ('True' | 'False' | 'None') '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_kwarg[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('True' | 'False' | 'None') '='")); @@ -18818,7 +18223,7 @@ invalid_kwarg_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "cannot assign to %s" , PyBytes_AS_STRING ( a -> bytes ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18829,7 +18234,7 @@ invalid_kwarg_rule(Parser *p) } { // NAME '=' expression for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_kwarg[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression for_if_clauses")); @@ -18851,7 +18256,7 @@ invalid_kwarg_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Maybe you meant '==' or ':=' instead of '='?" 
); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18862,7 +18267,7 @@ invalid_kwarg_rule(Parser *p) } { // !(NAME '=') expression '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_kwarg[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!(NAME '=') expression '='")); @@ -18880,7 +18285,7 @@ invalid_kwarg_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "expression cannot contain assignment, perhaps you meant \"==\"?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18891,7 +18296,7 @@ invalid_kwarg_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -18902,19 +18307,16 @@ invalid_kwarg_rule(Parser *p) static expr_ty expression_without_invalid_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -18923,7 +18325,7 @@ expression_without_invalid_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // disjunction 'if' disjunction 'else' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression_without_invalid[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); @@ -18947,7 +18349,7 @@ expression_without_invalid_rule(Parser *p) D(fprintf(stderr, "%*c+ expression_without_invalid[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - p->level--; + D(p->level--); return NULL; } int _end_lineno = _token->end_lineno; @@ -18957,7 +18359,7 @@ expression_without_invalid_rule(Parser *p) _res = _PyAST_IfExp ( b , a , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -18968,7 +18370,7 @@ expression_without_invalid_rule(Parser *p) } { // disjunction if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression_without_invalid[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction")); @@ -18987,7 +18389,7 @@ expression_without_invalid_rule(Parser *p) } { // lambdef if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> expression_without_invalid[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambdef")); @@ -19006,7 +18408,7 @@ expression_without_invalid_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19014,19 +18416,16 @@ expression_without_invalid_rule(Parser *p) static void * invalid_legacy_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // NAME !'(' star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_legacy_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME !'(' 
star_expressions")); @@ -19044,7 +18443,7 @@ invalid_legacy_expression_rule(Parser *p) _res = _PyPegen_check_legacy_stmt ( p , a ) ? RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "Missing parentheses in call to '%U'. Did you mean %U(...)?" , a -> v . Name . id , a -> v . Name . id ) : NULL; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19055,7 +18454,7 @@ invalid_legacy_expression_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19065,19 +18464,16 @@ invalid_legacy_expression_rule(Parser *p) static void * invalid_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // !(NAME STRING | SOFT_KEYWORD) disjunction expression_without_invalid if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!(NAME STRING | SOFT_KEYWORD) disjunction expression_without_invalid")); @@ -19095,7 +18491,7 @@ invalid_expression_rule(Parser *p) _res = _PyPegen_check_legacy_stmt ( p , a ) ? NULL : p -> tokens [p -> mark - 1] -> level == 0 ? NULL : RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Perhaps you forgot a comma?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19106,7 +18502,7 @@ invalid_expression_rule(Parser *p) } { // disjunction 'if' disjunction !('else' | ':') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction !('else' | ':')")); @@ -19127,7 +18523,7 @@ invalid_expression_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "expected 'else' after 'if' expression" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19138,7 +18534,7 @@ invalid_expression_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19149,19 +18545,16 @@ invalid_expression_rule(Parser *p) static void * invalid_named_expression_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // expression ':=' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':=' expression")); @@ -19180,7 +18573,7 @@ invalid_named_expression_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use assignment expressions with %s" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19191,7 +18584,7 @@ invalid_named_expression_rule(Parser *p) } { // NAME '=' bitwise_or !('=' | ':=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' bitwise_or !('=' | ':=')")); @@ -19212,7 +18605,7 @@ invalid_named_expression_rule(Parser *p) _res = p -> in_raw_rule ? 
NULL : RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Maybe you meant '==' or ':=' instead of '='?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19223,7 +18616,7 @@ invalid_named_expression_rule(Parser *p) } { // !(list | tuple | genexp | 'True' | 'None' | 'False') bitwise_or '=' bitwise_or !('=' | ':=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!(list | tuple | genexp | 'True' | 'None' | 'False') bitwise_or '=' bitwise_or !('=' | ':=')")); @@ -19246,7 +18639,7 @@ invalid_named_expression_rule(Parser *p) _res = p -> in_raw_rule ? NULL : RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot assign to %s here. Maybe you meant '==' instead of '='?" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19257,7 +18650,7 @@ invalid_named_expression_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19271,19 +18664,16 @@ invalid_named_expression_rule(Parser *p) static void * invalid_assignment_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // invalid_ann_assign_target ':' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_ann_assign_target ':' expression")); @@ -19302,7 +18692,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "only single target (not %s) can be annotated" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19313,7 +18703,7 @@ invalid_assignment_rule(Parser *p) } { // star_named_expression ',' star_named_expressions* ':' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions* ':' expression")); @@ -19338,7 +18728,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "only single target (not tuple) can be annotated" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19349,7 +18739,7 @@ invalid_assignment_rule(Parser *p) } { // expression ':' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' expression")); @@ -19368,7 +18758,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "illegal target for annotation" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19379,7 +18769,7 @@ invalid_assignment_rule(Parser *p) } { // ((star_targets '='))* star_expressions '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((star_targets '='))* star_expressions '='")); @@ 
-19398,7 +18788,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( STAR_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19409,7 +18799,7 @@ invalid_assignment_rule(Parser *p) } { // ((star_targets '='))* yield_expr '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((star_targets '='))* yield_expr '='")); @@ -19428,7 +18818,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "assignment to yield expression not possible" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19439,7 +18829,7 @@ invalid_assignment_rule(Parser *p) } { // star_expressions augassign (yield_expr | star_expressions) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions augassign (yield_expr | star_expressions)")); @@ -19458,7 +18848,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "'%s' is an illegal expression for augmented assignment" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19469,7 +18859,7 @@ invalid_assignment_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19477,19 +18867,16 @@ invalid_assignment_rule(Parser *p) static expr_ty invalid_ann_assign_target_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // list if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_ann_assign_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); @@ -19508,7 +18895,7 @@ invalid_ann_assign_target_rule(Parser *p) } { // tuple if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_ann_assign_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); @@ -19527,7 +18914,7 @@ invalid_ann_assign_target_rule(Parser *p) } { // '(' invalid_ann_assign_target ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_ann_assign_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' invalid_ann_assign_target ')'")); @@ -19546,7 +18933,7 @@ invalid_ann_assign_target_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19557,7 +18944,7 @@ invalid_ann_assign_target_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19565,19 +18952,16 @@ invalid_ann_assign_target_rule(Parser *p) static void * invalid_del_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'del' star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_del_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, 
"'del' star_expressions")); @@ -19593,7 +18977,7 @@ invalid_del_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( DEL_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19604,7 +18988,7 @@ invalid_del_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19612,19 +18996,16 @@ invalid_del_stmt_rule(Parser *p) static void * invalid_block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE !INDENT")); @@ -19639,7 +19020,7 @@ invalid_block_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19650,7 +19031,7 @@ invalid_block_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19661,19 +19042,16 @@ invalid_block_rule(Parser *p) static void * invalid_comprehension_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ('[' | '(' | '{') starred_expression for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '(' | '{') starred_expression for_if_clauses")); @@ -19692,7 +19070,7 @@ invalid_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "iterable unpacking cannot be used in comprehension" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19703,7 +19081,7 @@ invalid_comprehension_rule(Parser *p) } { // ('[' | '{') star_named_expression ',' star_named_expressions for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '{') star_named_expression ',' star_named_expressions for_if_clauses")); @@ -19728,7 +19106,7 @@ invalid_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , PyPegen_last_item ( b , expr_ty ) , "did you forget parentheses around the comprehension target?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19739,7 +19117,7 @@ invalid_comprehension_rule(Parser *p) } { // ('[' | '{') star_named_expression ',' for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '{') star_named_expression ',' for_if_clauses")); @@ -19761,7 +19139,7 @@ invalid_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "did you forget parentheses around the comprehension target?" 
); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19772,7 +19150,7 @@ invalid_comprehension_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19780,19 +19158,16 @@ invalid_comprehension_rule(Parser *p) static void * invalid_dict_comprehension_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '{' '**' bitwise_or for_if_clauses '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_dict_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' '**' bitwise_or for_if_clauses '}'")); @@ -19817,7 +19192,7 @@ invalid_dict_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "dict unpacking cannot be used in dict comprehension" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19828,7 +19203,7 @@ invalid_dict_comprehension_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19838,19 +19213,16 @@ invalid_dict_comprehension_rule(Parser *p) static void * invalid_parameters_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // param_no_default* invalid_parameters_helper param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* invalid_parameters_helper param_no_default")); @@ -19869,7 +19241,7 @@ invalid_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "non-default argument follows default argument" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19880,7 +19252,7 @@ invalid_parameters_rule(Parser *p) } { // param_no_default* '(' param_no_default+ ','? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* '(' param_no_default+ ','? 
')'")); @@ -19906,7 +19278,7 @@ invalid_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "Function parameters cannot be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19917,7 +19289,7 @@ invalid_parameters_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19925,19 +19297,16 @@ invalid_parameters_rule(Parser *p) static void * invalid_parameters_helper_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // slash_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default")); @@ -19950,7 +19319,7 @@ invalid_parameters_helper_rule(Parser *p) _res = _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -19961,7 +19330,7 @@ invalid_parameters_helper_rule(Parser *p) } { // param_with_default+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default+")); @@ -19980,7 +19349,7 @@ invalid_parameters_helper_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -19990,19 +19359,16 @@ invalid_parameters_helper_rule(Parser *p) static void * invalid_lambda_parameters_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // lambda_param_no_default* invalid_lambda_parameters_helper lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* invalid_lambda_parameters_helper lambda_param_no_default")); @@ -20021,7 +19387,7 @@ invalid_lambda_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "non-default argument follows default argument" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20032,7 +19398,7 @@ invalid_lambda_parameters_rule(Parser *p) } { // lambda_param_no_default* '(' ','.lambda_param+ ','? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* '(' ','.lambda_param+ ','? 
')'")); @@ -20058,7 +19424,7 @@ invalid_lambda_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "Lambda expression parameters cannot be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20069,7 +19435,7 @@ invalid_lambda_parameters_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20079,19 +19445,16 @@ invalid_lambda_parameters_rule(Parser *p) static void * invalid_lambda_parameters_helper_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // lambda_slash_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); @@ -20104,7 +19467,7 @@ invalid_lambda_parameters_helper_rule(Parser *p) _res = _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20115,7 +19478,7 @@ invalid_lambda_parameters_helper_rule(Parser *p) } { // lambda_param_with_default+ if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default+")); @@ -20134,7 +19497,7 @@ invalid_lambda_parameters_helper_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20142,19 +19505,16 @@ invalid_lambda_parameters_helper_rule(Parser *p) static void * invalid_star_etc_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '*' (')' | ',' (')' | '**')) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (')' | ',' (')' | '**'))")); @@ -20170,7 +19530,7 @@ invalid_star_etc_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "named arguments must follow bare *" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20181,7 +19541,7 @@ invalid_star_etc_rule(Parser *p) } { // '*' ',' TYPE_COMMENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' ',' TYPE_COMMENT")); @@ -20200,7 +19560,7 @@ invalid_star_etc_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "bare * has associated type comment" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20211,7 +19571,7 @@ invalid_star_etc_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20219,19 +19579,16 @@ invalid_star_etc_rule(Parser *p) static void * invalid_lambda_star_etc_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '*' (':' | ',' (':' | '**')) if 
(p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (':' | ',' (':' | '**'))")); @@ -20247,7 +19604,7 @@ invalid_lambda_star_etc_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "named arguments must follow bare *" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20258,7 +19615,7 @@ invalid_lambda_star_etc_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20266,19 +19623,16 @@ invalid_lambda_star_etc_rule(Parser *p) static void * invalid_double_type_comments_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // TYPE_COMMENT NEWLINE TYPE_COMMENT NEWLINE INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_double_type_comments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "TYPE_COMMENT NEWLINE TYPE_COMMENT NEWLINE INDENT")); @@ -20303,7 +19657,7 @@ invalid_double_type_comments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "Cannot have two type comments on def" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20314,7 +19668,7 @@ invalid_double_type_comments_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20322,19 +19676,16 @@ invalid_double_type_comments_rule(Parser *p) static void * invalid_with_item_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // expression 'as' expression &(',' | ')' | ':') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression 'as' expression &(',' | ')' | ':')")); @@ -20355,7 +19706,7 @@ invalid_with_item_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( STAR_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20366,7 +19717,7 @@ invalid_with_item_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20374,19 +19725,16 @@ invalid_with_item_rule(Parser *p) static void * invalid_for_target_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'for' star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_for_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 
'for' star_expressions")); @@ -20406,7 +19754,7 @@ invalid_for_target_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( FOR_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20417,7 +19765,7 @@ invalid_for_target_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20425,19 +19773,16 @@ invalid_for_target_rule(Parser *p) static void * invalid_group_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' starred_expression ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' starred_expression ')'")); @@ -20456,7 +19801,7 @@ invalid_group_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use starred expression here" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20467,7 +19812,7 @@ invalid_group_rule(Parser *p) } { // '(' '**' expression ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' '**' expression ')'")); @@ -20489,7 +19834,7 @@ invalid_group_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use double starred expression here" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20500,7 +19845,7 @@ invalid_group_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20508,19 +19853,16 @@ invalid_group_rule(Parser *p) static void * invalid_import_from_targets_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // import_from_as_names ',' NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from_as_names ',' NEWLINE")); @@ -20539,7 +19881,7 @@ invalid_import_from_targets_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "trailing comma not allowed without surrounding parentheses" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20550,7 +19892,7 @@ invalid_import_from_targets_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20560,19 +19902,16 @@ invalid_import_from_targets_rule(Parser *p) static void * invalid_with_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'with' ','.(expression ['as' star_target])+ &&':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ &&':'")); @@ -20601,7 +19940,7 @@ invalid_with_stmt_rule(Parser *p) } { // ASYNC? 
'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':'")); @@ -20640,7 +19979,7 @@ invalid_with_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20650,19 +19989,16 @@ invalid_with_stmt_rule(Parser *p) static void * invalid_with_stmt_indent_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'with' ','.(expression ['as' star_target])+ ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ ':' NEWLINE !INDENT")); @@ -20690,7 +20026,7 @@ invalid_with_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'with' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20701,7 +20037,7 @@ invalid_with_stmt_indent_rule(Parser *p) } { // ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' ':' NEWLINE !INDENT")); @@ -20739,7 +20075,7 @@ invalid_with_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'with' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20750,7 +20086,7 @@ invalid_with_stmt_indent_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20761,19 +20097,16 @@ invalid_with_stmt_indent_rule(Parser *p) static void * invalid_try_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'try' ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' ':' NEWLINE !INDENT")); @@ -20794,7 +20127,7 @@ invalid_try_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'try' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20805,7 +20138,7 @@ invalid_try_stmt_rule(Parser *p) } { // 'try' ':' block !('except' | 'finally') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' ':' block !('except' | 'finally')")); @@ -20826,7 +20159,7 @@ invalid_try_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected 'except' or 'finally' block" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; 
- p->level--; + D(p->level--); return NULL; } goto done; @@ -20837,7 +20170,7 @@ invalid_try_stmt_rule(Parser *p) } { // 'try' ':' block* ((except_block+ except_star_block) | (except_star_block+ except_block)) block* if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' ':' block* ((except_block+ except_star_block) | (except_star_block+ except_block)) block*")); @@ -20862,7 +20195,7 @@ invalid_try_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "cannot have both 'except' and 'except*' on the same 'try'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20873,7 +20206,7 @@ invalid_try_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -20885,19 +20218,16 @@ invalid_try_stmt_rule(Parser *p) static void * invalid_except_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' '*'? expression ',' expressions ['as' NAME] ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*'? expression ',' expressions ['as' NAME] ':'")); @@ -20930,7 +20260,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_STARTING_FROM ( a , "multiple exception types must be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20941,7 +20271,7 @@ invalid_except_stmt_rule(Parser *p) } { // 'except' '*'? expression ['as' NAME] NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*'? 
expression ['as' NAME] NEWLINE")); @@ -20968,7 +20298,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -20979,7 +20309,7 @@ invalid_except_stmt_rule(Parser *p) } { // 'except' NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' NEWLINE")); @@ -20995,7 +20325,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21006,7 +20336,7 @@ invalid_except_stmt_rule(Parser *p) } { // 'except' '*' (NEWLINE | ':') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*' (NEWLINE | ':')")); @@ -21025,7 +20355,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected one or more exception types" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21036,7 +20366,7 @@ invalid_except_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21044,19 +20374,16 @@ invalid_except_stmt_rule(Parser *p) static void * invalid_finally_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'finally' ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_finally_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally' ':' NEWLINE !INDENT")); @@ -21077,7 +20404,7 @@ invalid_finally_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'finally' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21088,7 +20415,7 @@ invalid_finally_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21098,19 +20425,16 @@ invalid_finally_stmt_rule(Parser *p) static void * invalid_except_stmt_indent_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' expression ['as' NAME] ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' expression ['as' NAME] ':' NEWLINE !INDENT")); @@ -21138,7 +20462,7 @@ invalid_except_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'except' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21149,7 +20473,7 @@ invalid_except_stmt_indent_rule(Parser *p) } { // 'except' ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> 
invalid_except_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' ':' NEWLINE !INDENT")); @@ -21170,7 +20494,7 @@ invalid_except_stmt_indent_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected an indented block after except statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21181,7 +20505,7 @@ invalid_except_stmt_indent_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21190,19 +20514,16 @@ invalid_except_stmt_indent_rule(Parser *p) static void * invalid_except_star_stmt_indent_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' '*' expression ['as' NAME] ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_except_star_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*' expression ['as' NAME] ':' NEWLINE !INDENT")); @@ -21233,7 +20554,7 @@ invalid_except_star_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'except*' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21244,7 +20565,7 @@ invalid_except_star_stmt_indent_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21254,19 +20575,16 @@ invalid_except_star_stmt_indent_rule(Parser *p) static void * invalid_match_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // "match" subject_expr !':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr !':'")); @@ -21284,7 +20602,7 @@ invalid_match_stmt_rule(Parser *p) _res = CHECK_VERSION ( void* , 10 , "Pattern matching is" , RAISE_SYNTAX_ERROR ( "expected ':'" ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21295,7 +20613,7 @@ invalid_match_stmt_rule(Parser *p) } { // "match" subject_expr ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr ':' NEWLINE !INDENT")); @@ -21319,7 +20637,7 @@ invalid_match_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'match' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21330,7 +20648,7 @@ invalid_match_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21340,19 +20658,16 @@ invalid_match_stmt_rule(Parser *p) static void * invalid_case_block_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // "case" patterns 
guard? !':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? !':'")); @@ -21374,7 +20689,7 @@ invalid_case_block_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21385,7 +20700,7 @@ invalid_case_block_rule(Parser *p) } { // "case" patterns guard? ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? ':' NEWLINE !INDENT")); @@ -21413,7 +20728,7 @@ invalid_case_block_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'case' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21424,7 +20739,7 @@ invalid_case_block_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21432,19 +20747,16 @@ invalid_case_block_rule(Parser *p) static void * invalid_as_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // or_pattern 'as' "_" if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' \"_\"")); @@ -21463,7 +20775,7 @@ invalid_as_pattern_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use '_' as a target" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21474,7 +20786,7 @@ invalid_as_pattern_rule(Parser *p) } { // or_pattern 'as' !NAME expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' !NAME expression")); @@ -21495,7 +20807,7 @@ invalid_as_pattern_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "invalid pattern target" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21506,7 +20818,7 @@ invalid_as_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21514,19 +20826,16 @@ invalid_as_pattern_rule(Parser *p) static void * invalid_class_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // name_or_attr '(' invalid_class_argument_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' invalid_class_argument_pattern")); @@ -21545,7 +20854,7 @@ invalid_class_pattern_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( PyPegen_first_item ( a , pattern_ty ) , PyPegen_last_item ( a , pattern_ty ) , "positional patterns follow keyword patterns" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + 
D(p->level--); return NULL; } goto done; @@ -21556,7 +20865,7 @@ invalid_class_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21565,19 +20874,16 @@ invalid_class_pattern_rule(Parser *p) static asdl_pattern_seq* invalid_class_argument_pattern_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_pattern_seq* _res = NULL; int _mark = p->mark; { // [positional_patterns ','] keyword_patterns ',' positional_patterns if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_class_argument_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "[positional_patterns ','] keyword_patterns ',' positional_patterns")); @@ -21600,7 +20906,7 @@ invalid_class_argument_pattern_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21611,7 +20917,7 @@ invalid_class_argument_pattern_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21621,19 +20927,16 @@ invalid_class_argument_pattern_rule(Parser *p) static void * invalid_if_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'if' named_expression NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression NEWLINE")); @@ -21652,7 +20955,7 @@ invalid_if_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21663,7 +20966,7 @@ invalid_if_stmt_rule(Parser *p) } { // 'if' named_expression ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' NEWLINE !INDENT")); @@ -21687,7 +20990,7 @@ invalid_if_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'if' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21698,7 +21001,7 @@ invalid_if_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21708,19 +21011,16 @@ invalid_if_stmt_rule(Parser *p) static void * invalid_elif_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'elif' named_expression NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'elif' named_expression NEWLINE")); @@ -21739,7 +21039,7 @@ invalid_elif_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21750,7 +21050,7 @@ invalid_elif_stmt_rule(Parser *p) } { // 'elif' 
named_expression ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' NEWLINE !INDENT")); @@ -21774,7 +21074,7 @@ invalid_elif_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'elif' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21785,7 +21085,7 @@ invalid_elif_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21793,19 +21093,16 @@ invalid_elif_stmt_rule(Parser *p) static void * invalid_else_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'else' ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_else_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else' ':' NEWLINE !INDENT")); @@ -21826,7 +21123,7 @@ invalid_else_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'else' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21837,7 +21134,7 @@ invalid_else_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21847,19 +21144,16 @@ invalid_else_stmt_rule(Parser *p) static void * invalid_while_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'while' named_expression NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'while' named_expression NEWLINE")); @@ -21878,7 +21172,7 @@ invalid_while_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21889,7 +21183,7 @@ invalid_while_stmt_rule(Parser *p) } { // 'while' named_expression ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'while' named_expression ':' NEWLINE !INDENT")); @@ -21913,7 +21207,7 @@ invalid_while_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'while' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21924,7 +21218,7 @@ invalid_while_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21932,19 +21226,16 @@ invalid_while_stmt_rule(Parser *p) static void * invalid_for_stmt_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 
'for' star_targets 'in' star_expressions ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'for' star_targets 'in' star_expressions ':' NEWLINE !INDENT")); @@ -21978,7 +21269,7 @@ invalid_for_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'for' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -21989,7 +21280,7 @@ invalid_for_stmt_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -21998,19 +21289,16 @@ invalid_for_stmt_rule(Parser *p) static void * invalid_def_raw_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'def' NAME '(' params? ')' ['->' expression] ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'def' NAME '(' params? ')' ['->' expression] ':' NEWLINE !INDENT")); @@ -22052,7 +21340,7 @@ invalid_def_raw_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after function definition on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -22063,7 +21351,7 @@ invalid_def_raw_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22071,19 +21359,16 @@ invalid_def_raw_rule(Parser *p) static void * invalid_class_def_raw_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'class' NAME ['(' arguments? ')'] ':' NEWLINE !INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? 
')'] ':' NEWLINE !INDENT")); @@ -22111,7 +21396,7 @@ invalid_class_def_raw_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after class definition on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -22122,7 +21407,7 @@ invalid_class_def_raw_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22133,19 +21418,16 @@ invalid_class_def_raw_rule(Parser *p) static void * invalid_double_starred_kvpairs_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ','.double_starred_kvpair+ ',' invalid_kvpair if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ',' invalid_kvpair")); @@ -22170,7 +21452,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) } { // expression ':' '*' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' '*' bitwise_or")); @@ -22192,7 +21474,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_STARTING_FROM ( a , "cannot use a starred expression in a dictionary value" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -22203,7 +21485,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) } { // expression ':' &('}' | ',') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' &('}' | ',')")); @@ -22221,7 +21503,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "expression expected after dictionary key and ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -22232,7 +21514,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22240,19 +21522,16 @@ invalid_double_starred_kvpairs_rule(Parser *p) static void * invalid_kvpair_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // expression !(':') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !(':')")); @@ -22267,7 +21546,7 @@ invalid_kvpair_rule(Parser *p) _res = RAISE_ERROR_KNOWN_LOCATION ( p , PyExc_SyntaxError , a -> lineno , a -> end_col_offset - 1 , a -> end_lineno , - 1 , "':' expected after dictionary key" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -22278,7 +21557,7 @@ invalid_kvpair_rule(Parser *p) } { // expression ':' '*' bitwise_or if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, 
"expression ':' '*' bitwise_or")); @@ -22300,7 +21579,7 @@ invalid_kvpair_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_STARTING_FROM ( a , "cannot use a starred expression in a dictionary value" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -22311,7 +21590,7 @@ invalid_kvpair_rule(Parser *p) } { // expression ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> invalid_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':'")); @@ -22327,7 +21606,7 @@ invalid_kvpair_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "expression expected after dictionary key and ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -22338,7 +21617,7 @@ invalid_kvpair_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22346,12 +21625,9 @@ invalid_kvpair_rule(Parser *p) static asdl_seq * _loop0_1_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -22361,14 +21637,14 @@ _loop0_1_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_1[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -22384,7 +21660,7 @@ _loop0_1_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -22401,13 +21677,13 @@ _loop0_1_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_1_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -22415,12 +21691,9 @@ _loop0_1_rule(Parser *p) static asdl_seq * _loop0_2_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -22430,14 +21703,14 @@ _loop0_2_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_2[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -22453,7 +21726,7 @@ _loop0_2_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -22470,13 +21743,13 @@ _loop0_2_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_2_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -22484,12 +21757,9 @@ _loop0_2_rule(Parser *p) static asdl_seq * _loop1_3_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - 
p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -22499,14 +21769,14 @@ _loop1_3_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // statement if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_3[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statement")); @@ -22522,7 +21792,7 @@ _loop1_3_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -22536,7 +21806,7 @@ _loop1_3_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -22544,13 +21814,13 @@ _loop1_3_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_3_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -22558,12 +21828,9 @@ _loop1_3_rule(Parser *p) static asdl_seq * _loop0_5_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -22573,14 +21840,14 @@ _loop0_5_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ';' simple_stmt if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_5[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "';' simple_stmt")); @@ -22596,7 +21863,7 @@ _loop0_5_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -22605,7 +21872,7 @@ _loop0_5_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -22622,13 +21889,13 @@ _loop0_5_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_5_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -22636,19 +21903,16 @@ _loop0_5_rule(Parser *p) static asdl_seq * _gather_4_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // simple_stmt _loop0_5 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_4[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmt _loop0_5")); @@ -22670,7 +21934,7 @@ _gather_4_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22678,19 +21942,16 @@ _gather_4_rule(Parser *p) static void * _tmp_6_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator 
= 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'import' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_6[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'import'")); @@ -22709,7 +21970,7 @@ _tmp_6_rule(Parser *p) } { // 'from' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_6[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from'")); @@ -22728,7 +21989,7 @@ _tmp_6_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22736,19 +21997,16 @@ _tmp_6_rule(Parser *p) static void * _tmp_7_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'def' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_7[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'def'")); @@ -22767,7 +22025,7 @@ _tmp_7_rule(Parser *p) } { // '@' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_7[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@'")); @@ -22786,7 +22044,7 @@ _tmp_7_rule(Parser *p) } { // ASYNC if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_7[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC")); @@ -22805,7 +22063,7 @@ _tmp_7_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22813,19 +22071,16 @@ _tmp_7_rule(Parser *p) static void * _tmp_8_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'class' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_8[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class'")); @@ -22844,7 +22099,7 @@ _tmp_8_rule(Parser *p) } { // '@' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_8[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@'")); @@ -22863,7 +22118,7 @@ _tmp_8_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22871,19 +22126,16 @@ _tmp_8_rule(Parser *p) static void * _tmp_9_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'with' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_9[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'with'")); @@ -22902,7 +22154,7 @@ _tmp_9_rule(Parser *p) } { // ASYNC if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_9[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC")); @@ -22921,7 +22173,7 @@ _tmp_9_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22929,19 +22181,16 @@ _tmp_9_rule(Parser *p) static void * _tmp_10_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); 
return NULL; } void * _res = NULL; int _mark = p->mark; { // 'for' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_10[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for'")); @@ -22960,7 +22209,7 @@ _tmp_10_rule(Parser *p) } { // ASYNC if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_10[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC")); @@ -22979,7 +22228,7 @@ _tmp_10_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -22987,19 +22236,16 @@ _tmp_10_rule(Parser *p) static void * _tmp_11_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' annotated_rhs if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_11[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'=' annotated_rhs")); @@ -23015,7 +22261,7 @@ _tmp_11_rule(Parser *p) _res = d; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -23026,7 +22272,7 @@ _tmp_11_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23034,19 +22280,16 @@ _tmp_11_rule(Parser *p) static void * _tmp_12_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' single_target ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_12[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' single_target ')'")); @@ -23065,7 +22308,7 @@ _tmp_12_rule(Parser *p) _res = b; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -23076,7 +22319,7 @@ _tmp_12_rule(Parser *p) } { // single_subscript_attribute_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_12[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "single_subscript_attribute_target")); @@ -23095,7 +22338,7 @@ _tmp_12_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23103,19 +22346,16 @@ _tmp_12_rule(Parser *p) static void * _tmp_13_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' annotated_rhs if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_13[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'=' annotated_rhs")); @@ -23131,7 +22371,7 @@ _tmp_13_rule(Parser *p) _res = d; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -23142,7 +22382,7 @@ _tmp_13_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23150,12 +22390,9 @@ _tmp_13_rule(Parser *p) static asdl_seq * _loop1_14_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -23165,14 +22402,14 @@ 
_loop1_14_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (star_targets '=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_14[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); @@ -23188,7 +22425,7 @@ _loop1_14_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -23202,7 +22439,7 @@ _loop1_14_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -23210,13 +22447,13 @@ _loop1_14_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_14_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -23224,19 +22461,16 @@ _loop1_14_rule(Parser *p) static void * _tmp_15_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_15[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -23255,7 +22489,7 @@ _tmp_15_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_15[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -23274,7 +22508,7 @@ _tmp_15_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23282,19 +22516,16 @@ _tmp_15_rule(Parser *p) static void * _tmp_16_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_16[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -23313,7 +22544,7 @@ _tmp_16_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_16[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -23332,7 +22563,7 @@ _tmp_16_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23340,19 +22571,16 @@ _tmp_16_rule(Parser *p) static void * _tmp_17_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'from' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_17[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from' expression")); @@ -23368,7 +22596,7 @@ _tmp_17_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -23379,7 +22607,7 @@ 
_tmp_17_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23387,12 +22615,9 @@ _tmp_17_rule(Parser *p) static asdl_seq * _loop0_19_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -23402,14 +22627,14 @@ _loop0_19_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_19[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' NAME")); @@ -23425,7 +22650,7 @@ _loop0_19_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -23434,7 +22659,7 @@ _loop0_19_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -23451,13 +22676,13 @@ _loop0_19_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_19_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -23465,19 +22690,16 @@ _loop0_19_rule(Parser *p) static asdl_seq * _gather_18_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // NAME _loop0_19 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_18[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME _loop0_19")); @@ -23499,7 +22721,7 @@ _gather_18_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23507,12 +22729,9 @@ _gather_18_rule(Parser *p) static asdl_seq * _loop0_21_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -23522,14 +22741,14 @@ _loop0_21_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_21[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' NAME")); @@ -23545,7 +22764,7 @@ _loop0_21_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -23554,7 +22773,7 @@ _loop0_21_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -23571,13 +22790,13 @@ _loop0_21_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, 
_loop0_21_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -23585,19 +22804,16 @@ _loop0_21_rule(Parser *p) static asdl_seq * _gather_20_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // NAME _loop0_21 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_20[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME _loop0_21")); @@ -23619,7 +22835,7 @@ _gather_20_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23627,19 +22843,16 @@ _gather_20_rule(Parser *p) static void * _tmp_22_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ';' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_22[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "';'")); @@ -23658,7 +22871,7 @@ _tmp_22_rule(Parser *p) } { // NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_22[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -23677,7 +22890,7 @@ _tmp_22_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23685,19 +22898,16 @@ _tmp_22_rule(Parser *p) static void * _tmp_23_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_23[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -23713,7 +22923,7 @@ _tmp_23_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -23724,7 +22934,7 @@ _tmp_23_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23732,12 +22942,9 @@ _tmp_23_rule(Parser *p) static asdl_seq * _loop0_24_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -23747,14 +22954,14 @@ _loop0_24_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('.' | '...') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_24[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' 
| '...')")); @@ -23770,7 +22977,7 @@ _loop0_24_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -23787,13 +22994,13 @@ _loop0_24_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_24_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -23801,12 +23008,9 @@ _loop0_24_rule(Parser *p) static asdl_seq * _loop1_25_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -23816,14 +23020,14 @@ _loop1_25_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('.' | '...') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_25[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' | '...')")); @@ -23839,7 +23043,7 @@ _loop1_25_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -23853,7 +23057,7 @@ _loop1_25_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -23861,13 +23065,13 @@ _loop1_25_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_25_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -23875,12 +23079,9 @@ _loop1_25_rule(Parser *p) static asdl_seq * _loop0_27_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -23890,14 +23091,14 @@ _loop0_27_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' import_from_as_name if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_27[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' import_from_as_name")); @@ -23913,7 +23114,7 @@ _loop0_27_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -23922,7 +23123,7 @@ _loop0_27_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -23939,13 +23140,13 @@ _loop0_27_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_27_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -23953,19 +23154,16 @@ 
_loop0_27_rule(Parser *p) static asdl_seq * _gather_26_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // import_from_as_name _loop0_27 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_26[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from_as_name _loop0_27")); @@ -23987,7 +23185,7 @@ _gather_26_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -23995,19 +23193,16 @@ _gather_26_rule(Parser *p) static void * _tmp_28_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_28[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -24023,7 +23218,7 @@ _tmp_28_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -24034,7 +23229,7 @@ _tmp_28_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -24042,12 +23237,9 @@ _tmp_28_rule(Parser *p) static asdl_seq * _loop0_30_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24057,14 +23249,14 @@ _loop0_30_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' dotted_as_name if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_30[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' dotted_as_name")); @@ -24080,7 +23272,7 @@ _loop0_30_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -24089,7 +23281,7 @@ _loop0_30_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24106,13 +23298,13 @@ _loop0_30_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_30_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24120,19 +23312,16 @@ _loop0_30_rule(Parser *p) static asdl_seq * _gather_29_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // dotted_as_name _loop0_30 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_29[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dotted_as_name _loop0_30")); @@ -24154,7 +23343,7 @@ _gather_29_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -24162,19 +23351,16 @@ 
_gather_29_rule(Parser *p) static void * _tmp_31_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_31[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -24190,7 +23376,7 @@ _tmp_31_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -24201,7 +23387,7 @@ _tmp_31_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -24209,12 +23395,9 @@ _tmp_31_rule(Parser *p) static asdl_seq * _loop1_32_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24224,14 +23407,14 @@ _loop1_32_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('@' named_expression NEWLINE) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_32[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('@' named_expression NEWLINE)")); @@ -24247,7 +23430,7 @@ _loop1_32_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24261,7 +23444,7 @@ _loop1_32_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24269,13 +23452,13 @@ _loop1_32_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_32_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24283,19 +23466,16 @@ _loop1_32_rule(Parser *p) static void * _tmp_33_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' arguments? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_33[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? 
')'")); @@ -24314,7 +23494,7 @@ _tmp_33_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -24325,7 +23505,7 @@ _tmp_33_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -24333,19 +23513,16 @@ _tmp_33_rule(Parser *p) static void * _tmp_34_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '->' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_34[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); @@ -24361,7 +23538,7 @@ _tmp_34_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -24372,7 +23549,7 @@ _tmp_34_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -24380,19 +23557,16 @@ _tmp_34_rule(Parser *p) static void * _tmp_35_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '->' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_35[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); @@ -24408,7 +23582,7 @@ _tmp_35_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -24419,7 +23593,7 @@ _tmp_35_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -24427,12 +23601,9 @@ _tmp_35_rule(Parser *p) static asdl_seq * _loop0_36_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24442,14 +23613,14 @@ _loop0_36_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_36[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24465,7 +23636,7 @@ _loop0_36_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24482,13 +23653,13 @@ _loop0_36_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_36_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24496,12 +23667,9 @@ _loop0_36_rule(Parser *p) static asdl_seq * _loop0_37_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24511,14 +23679,14 @@ _loop0_37_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); 
return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_37[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -24534,7 +23702,7 @@ _loop0_37_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24551,13 +23719,13 @@ _loop0_37_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_37_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24565,12 +23733,9 @@ _loop0_37_rule(Parser *p) static asdl_seq * _loop0_38_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24580,14 +23745,14 @@ _loop0_38_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_38[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -24603,7 +23768,7 @@ _loop0_38_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24620,13 +23785,13 @@ _loop0_38_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_38_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24634,12 +23799,9 @@ _loop0_38_rule(Parser *p) static asdl_seq * _loop1_39_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24649,14 +23811,14 @@ _loop1_39_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_39[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24672,7 +23834,7 @@ _loop1_39_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24686,7 +23848,7 @@ _loop1_39_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24694,13 +23856,13 @@ _loop1_39_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_39_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ 
-24708,12 +23870,9 @@ _loop1_39_rule(Parser *p) static asdl_seq * _loop0_40_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24723,14 +23882,14 @@ _loop0_40_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_40[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -24746,7 +23905,7 @@ _loop0_40_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24763,13 +23922,13 @@ _loop0_40_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_40_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24777,12 +23936,9 @@ _loop0_40_rule(Parser *p) static asdl_seq * _loop1_41_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24792,14 +23948,14 @@ _loop1_41_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_41[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -24815,7 +23971,7 @@ _loop1_41_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24829,7 +23985,7 @@ _loop1_41_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24837,13 +23993,13 @@ _loop1_41_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_41_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24851,12 +24007,9 @@ _loop1_41_rule(Parser *p) static asdl_seq * _loop1_42_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24866,14 +24019,14 @@ _loop1_42_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_42[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24889,7 +24042,7 @@ _loop1_42_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + 
D(p->level--); return NULL; } _children = _new_children; @@ -24903,7 +24056,7 @@ _loop1_42_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24911,13 +24064,13 @@ _loop1_42_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_42_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24925,12 +24078,9 @@ _loop1_42_rule(Parser *p) static asdl_seq * _loop1_43_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -24940,14 +24090,14 @@ _loop1_43_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_43[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24963,7 +24113,7 @@ _loop1_43_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -24977,7 +24127,7 @@ _loop1_43_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24985,13 +24135,13 @@ _loop1_43_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_43_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -24999,12 +24149,9 @@ _loop1_43_rule(Parser *p) static asdl_seq * _loop0_44_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25014,14 +24161,14 @@ _loop0_44_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_44[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -25037,7 +24184,7 @@ _loop0_44_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25054,13 +24201,13 @@ _loop0_44_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_44_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25068,12 +24215,9 @@ _loop0_44_rule(Parser *p) static asdl_seq * _loop1_45_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + 
D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25083,14 +24227,14 @@ _loop1_45_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_45[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -25106,7 +24250,7 @@ _loop1_45_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25120,7 +24264,7 @@ _loop1_45_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -25128,13 +24272,13 @@ _loop1_45_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_45_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25142,12 +24286,9 @@ _loop1_45_rule(Parser *p) static asdl_seq * _loop0_46_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25157,14 +24298,14 @@ _loop0_46_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_46[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -25180,7 +24321,7 @@ _loop0_46_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25197,13 +24338,13 @@ _loop0_46_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_46_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25211,12 +24352,9 @@ _loop0_46_rule(Parser *p) static asdl_seq * _loop1_47_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25226,14 +24364,14 @@ _loop1_47_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_47[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -25249,7 +24387,7 @@ _loop1_47_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25263,7 +24401,7 @@ _loop1_47_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - 
p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -25271,13 +24409,13 @@ _loop1_47_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_47_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25285,12 +24423,9 @@ _loop1_47_rule(Parser *p) static asdl_seq * _loop0_48_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25300,14 +24435,14 @@ _loop0_48_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_maybe_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_48[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); @@ -25323,7 +24458,7 @@ _loop0_48_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25340,13 +24475,13 @@ _loop0_48_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_48_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25354,12 +24489,9 @@ _loop0_48_rule(Parser *p) static asdl_seq * _loop1_49_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25369,14 +24501,14 @@ _loop1_49_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_maybe_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_49[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); @@ -25392,7 +24524,7 @@ _loop1_49_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25406,7 +24538,7 @@ _loop1_49_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -25414,13 +24546,13 @@ _loop1_49_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_49_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25428,12 +24560,9 @@ _loop1_49_rule(Parser *p) static asdl_seq * _loop0_51_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25443,14 +24572,14 @@ _loop0_51_rule(Parser *p) if 
(!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_51[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -25466,7 +24595,7 @@ _loop0_51_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -25475,7 +24604,7 @@ _loop0_51_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25492,13 +24621,13 @@ _loop0_51_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_51_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25506,19 +24635,16 @@ _loop0_51_rule(Parser *p) static asdl_seq * _gather_50_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_51 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_50[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_51")); @@ -25540,7 +24666,7 @@ _gather_50_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -25548,12 +24674,9 @@ _gather_50_rule(Parser *p) static asdl_seq * _loop0_53_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25563,14 +24686,14 @@ _loop0_53_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_53[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -25586,7 +24709,7 @@ _loop0_53_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -25595,7 +24718,7 @@ _loop0_53_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25612,13 +24735,13 @@ _loop0_53_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_53_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25626,19 +24749,16 @@ _loop0_53_rule(Parser *p) static asdl_seq * _gather_52_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_53 if 
(p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_52[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_53")); @@ -25660,7 +24780,7 @@ _gather_52_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -25668,12 +24788,9 @@ _gather_52_rule(Parser *p) static asdl_seq * _loop0_55_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25683,14 +24800,14 @@ _loop0_55_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_55[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -25706,7 +24823,7 @@ _loop0_55_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -25715,7 +24832,7 @@ _loop0_55_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25732,13 +24849,13 @@ _loop0_55_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_55_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25746,19 +24863,16 @@ _loop0_55_rule(Parser *p) static asdl_seq * _gather_54_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_55 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_54[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_55")); @@ -25780,7 +24894,7 @@ _gather_54_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -25788,12 +24902,9 @@ _gather_54_rule(Parser *p) static asdl_seq * _loop0_57_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -25803,14 +24914,14 @@ _loop0_57_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_57[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -25826,7 +24937,7 @@ _loop0_57_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -25835,7 +24946,7 @@ _loop0_57_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -25852,13 +24963,13 @@ _loop0_57_rule(Parser *p) PyMem_Free(_children); 
p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_57_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -25866,19 +24977,16 @@ _loop0_57_rule(Parser *p) static asdl_seq * _gather_56_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_57 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_56[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_57")); @@ -25900,7 +25008,7 @@ _gather_56_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -25908,19 +25016,16 @@ _gather_56_rule(Parser *p) static void * _tmp_58_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_58[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); @@ -25939,7 +25044,7 @@ _tmp_58_rule(Parser *p) } { // ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_58[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -25958,7 +25063,7 @@ _tmp_58_rule(Parser *p) } { // ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_58[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -25977,7 +25082,7 @@ _tmp_58_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -25985,12 +25090,9 @@ _tmp_58_rule(Parser *p) static asdl_seq * _loop1_59_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -26000,14 +25102,14 @@ _loop1_59_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_59[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); @@ -26023,7 +25125,7 @@ _loop1_59_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -26037,7 +25139,7 @@ _loop1_59_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26045,13 +25147,13 @@ _loop1_59_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_59_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -26059,12 +25161,9 @@ _loop1_59_rule(Parser *p) static asdl_seq * _loop1_60_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - 
p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -26074,14 +25173,14 @@ _loop1_60_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_star_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_60[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); @@ -26097,7 +25196,7 @@ _loop1_60_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -26111,7 +25210,7 @@ _loop1_60_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26119,13 +25218,13 @@ _loop1_60_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_60_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -26133,19 +25232,16 @@ _loop1_60_rule(Parser *p) static void * _tmp_61_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_61[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -26161,7 +25257,7 @@ _tmp_61_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -26172,7 +25268,7 @@ _tmp_61_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26180,19 +25276,16 @@ _tmp_61_rule(Parser *p) static void * _tmp_62_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_62[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -26208,7 +25301,7 @@ _tmp_62_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -26219,7 +25312,7 @@ _tmp_62_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26227,12 +25320,9 @@ _tmp_62_rule(Parser *p) static asdl_seq * _loop1_63_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -26242,14 +25332,14 @@ _loop1_63_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // case_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_63[%d-%d]: %s\n", p->level, ' ', _mark, 
p->mark, "case_block")); @@ -26265,7 +25355,7 @@ _loop1_63_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -26279,7 +25369,7 @@ _loop1_63_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26287,13 +25377,13 @@ _loop1_63_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_63_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -26301,12 +25391,9 @@ _loop1_63_rule(Parser *p) static asdl_seq * _loop0_65_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -26316,14 +25403,14 @@ _loop0_65_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // '|' closed_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_65[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'|' closed_pattern")); @@ -26339,7 +25426,7 @@ _loop0_65_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -26348,7 +25435,7 @@ _loop0_65_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -26365,13 +25452,13 @@ _loop0_65_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_65_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -26379,19 +25466,16 @@ _loop0_65_rule(Parser *p) static asdl_seq * _gather_64_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // closed_pattern _loop0_65 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_64[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "closed_pattern _loop0_65")); @@ -26413,7 +25497,7 @@ _gather_64_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26421,19 +25505,16 @@ _gather_64_rule(Parser *p) static void * _tmp_66_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '+' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_66[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+'")); @@ -26452,7 +25533,7 @@ _tmp_66_rule(Parser *p) } { // '-' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> 
_tmp_66[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-'")); @@ -26471,7 +25552,7 @@ _tmp_66_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26479,19 +25560,16 @@ _tmp_66_rule(Parser *p) static void * _tmp_67_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '+' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_67[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+'")); @@ -26510,7 +25588,7 @@ _tmp_67_rule(Parser *p) } { // '-' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_67[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-'")); @@ -26529,7 +25607,7 @@ _tmp_67_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26537,19 +25615,16 @@ _tmp_67_rule(Parser *p) static void * _tmp_68_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_68[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -26568,7 +25643,7 @@ _tmp_68_rule(Parser *p) } { // '(' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_68[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -26587,7 +25662,7 @@ _tmp_68_rule(Parser *p) } { // '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_68[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -26606,7 +25681,7 @@ _tmp_68_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26614,19 +25689,16 @@ _tmp_68_rule(Parser *p) static void * _tmp_69_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_69[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -26645,7 +25717,7 @@ _tmp_69_rule(Parser *p) } { // '(' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_69[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -26664,7 +25736,7 @@ _tmp_69_rule(Parser *p) } { // '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_69[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -26683,7 +25755,7 @@ _tmp_69_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26691,12 +25763,9 @@ _tmp_69_rule(Parser *p) static asdl_seq * _loop0_71_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -26706,14 +25775,14 @@ _loop0_71_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' maybe_star_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_71[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' maybe_star_pattern")); @@ -26729,7 +25798,7 @@ _loop0_71_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -26738,7 +25807,7 @@ _loop0_71_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -26755,13 +25824,13 @@ _loop0_71_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_71_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -26769,19 +25838,16 @@ _loop0_71_rule(Parser *p) static asdl_seq * _gather_70_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // maybe_star_pattern _loop0_71 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_70[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "maybe_star_pattern _loop0_71")); @@ -26803,7 +25869,7 @@ _gather_70_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26811,12 +25877,9 @@ _gather_70_rule(Parser *p) static asdl_seq * _loop0_73_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -26826,14 +25889,14 @@ _loop0_73_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' key_value_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_73[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' key_value_pattern")); @@ -26849,7 
+25912,7 @@ _loop0_73_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -26858,7 +25921,7 @@ _loop0_73_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -26875,13 +25938,13 @@ _loop0_73_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_73_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -26889,19 +25952,16 @@ _loop0_73_rule(Parser *p) static asdl_seq * _gather_72_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // key_value_pattern _loop0_73 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_72[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "key_value_pattern _loop0_73")); @@ -26923,7 +25983,7 @@ _gather_72_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26931,19 +25991,16 @@ _gather_72_rule(Parser *p) static void * _tmp_74_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // literal_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_74[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "literal_expr")); @@ -26962,7 +26019,7 @@ _tmp_74_rule(Parser *p) } { // attr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_74[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "attr")); @@ -26981,7 +26038,7 @@ _tmp_74_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -26989,12 +26046,9 @@ _tmp_74_rule(Parser *p) static asdl_seq * _loop0_76_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27004,14 +26058,14 @@ _loop0_76_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_76[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' pattern")); @@ -27027,7 +26081,7 @@ _loop0_76_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -27036,7 +26090,7 @@ _loop0_76_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27053,13 +26107,13 @@ _loop0_76_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, 
_children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_76_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27067,19 +26121,16 @@ _loop0_76_rule(Parser *p) static asdl_seq * _gather_75_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // pattern _loop0_76 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_75[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern _loop0_76")); @@ -27101,7 +26152,7 @@ _gather_75_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -27109,12 +26160,9 @@ _gather_75_rule(Parser *p) static asdl_seq * _loop0_78_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27124,14 +26172,14 @@ _loop0_78_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' keyword_pattern if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_78[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' keyword_pattern")); @@ -27147,7 +26195,7 @@ _loop0_78_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -27156,7 +26204,7 @@ _loop0_78_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27173,13 +26221,13 @@ _loop0_78_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_78_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27187,19 +26235,16 @@ _loop0_78_rule(Parser *p) static asdl_seq * _gather_77_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // keyword_pattern _loop0_78 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_77[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "keyword_pattern _loop0_78")); @@ -27221,7 +26266,7 @@ _gather_77_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -27229,12 +26274,9 @@ _gather_77_rule(Parser *p) static asdl_seq * _loop1_79_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27244,14 +26286,14 @@ _loop1_79_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' expression) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_79[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "(',' expression)")); @@ -27267,7 +26309,7 @@ _loop1_79_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27281,7 +26323,7 @@ _loop1_79_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27289,13 +26331,13 @@ _loop1_79_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_79_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27303,12 +26345,9 @@ _loop1_79_rule(Parser *p) static asdl_seq * _loop1_80_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27318,14 +26357,14 @@ _loop1_80_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' star_expression) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_80[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_expression)")); @@ -27341,7 +26380,7 @@ _loop1_80_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27355,7 +26394,7 @@ _loop1_80_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27363,13 +26402,13 @@ _loop1_80_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_80_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27377,12 +26416,9 @@ _loop1_80_rule(Parser *p) static asdl_seq * _loop0_82_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27392,14 +26428,14 @@ _loop0_82_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' star_named_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_82[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_named_expression")); @@ -27415,7 +26451,7 @@ _loop0_82_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -27424,7 +26460,7 @@ _loop0_82_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27441,13 +26477,13 @@ _loop0_82_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + 
D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_82_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27455,19 +26491,16 @@ _loop0_82_rule(Parser *p) static asdl_seq * _gather_81_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // star_named_expression _loop0_82 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_81[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression _loop0_82")); @@ -27489,7 +26522,7 @@ _gather_81_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -27497,12 +26530,9 @@ _gather_81_rule(Parser *p) static asdl_seq * _loop1_83_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27512,14 +26542,14 @@ _loop1_83_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('or' conjunction) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_83[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('or' conjunction)")); @@ -27535,7 +26565,7 @@ _loop1_83_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27549,7 +26579,7 @@ _loop1_83_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27557,13 +26587,13 @@ _loop1_83_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_83_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27571,12 +26601,9 @@ _loop1_83_rule(Parser *p) static asdl_seq * _loop1_84_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27586,14 +26613,14 @@ _loop1_84_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('and' inversion) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_84[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('and' inversion)")); @@ -27609,7 +26636,7 @@ _loop1_84_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27623,7 +26650,7 @@ _loop1_84_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27631,13 +26658,13 @@ _loop1_84_rule(Parser *p) 
PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_84_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27645,12 +26672,9 @@ _loop1_84_rule(Parser *p) static asdl_seq * _loop1_85_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27660,14 +26684,14 @@ _loop1_85_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // compare_op_bitwise_or_pair if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_85[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "compare_op_bitwise_or_pair")); @@ -27683,7 +26707,7 @@ _loop1_85_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27697,7 +26721,7 @@ _loop1_85_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27705,13 +26729,13 @@ _loop1_85_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_85_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27719,19 +26743,16 @@ _loop1_85_rule(Parser *p) static void * _tmp_86_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '!=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_86[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'!='")); @@ -27744,7 +26765,7 @@ _tmp_86_rule(Parser *p) _res = _PyPegen_check_barry_as_flufl ( p , tok ) ? 
NULL : tok; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -27755,7 +26776,7 @@ _tmp_86_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -27763,12 +26784,9 @@ _tmp_86_rule(Parser *p) static asdl_seq * _loop0_88_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -27778,14 +26796,14 @@ _loop0_88_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' slice if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_88[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' slice")); @@ -27801,7 +26819,7 @@ _loop0_88_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -27810,7 +26828,7 @@ _loop0_88_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -27827,13 +26845,13 @@ _loop0_88_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_88_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -27841,19 +26859,16 @@ _loop0_88_rule(Parser *p) static asdl_seq * _gather_87_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // slice _loop0_88 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_87[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice _loop0_88")); @@ -27875,7 +26890,7 @@ _gather_87_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -27883,19 +26898,16 @@ _gather_87_rule(Parser *p) static void * _tmp_89_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ':' expression? 
if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_89[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':' expression?")); @@ -27911,7 +26923,7 @@ _tmp_89_rule(Parser *p) _res = d; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -27922,7 +26934,7 @@ _tmp_89_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -27930,19 +26942,16 @@ _tmp_89_rule(Parser *p) static void * _tmp_90_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // tuple if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_90[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); @@ -27961,7 +26970,7 @@ _tmp_90_rule(Parser *p) } { // group if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_90[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "group")); @@ -27980,7 +26989,7 @@ _tmp_90_rule(Parser *p) } { // genexp if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_90[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "genexp")); @@ -27999,7 +27008,7 @@ _tmp_90_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -28007,19 +27016,16 @@ _tmp_90_rule(Parser *p) static void * _tmp_91_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // list if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_91[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); @@ -28038,7 +27044,7 @@ _tmp_91_rule(Parser *p) } { // listcomp if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_91[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "listcomp")); @@ -28057,7 +27063,7 @@ _tmp_91_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -28065,19 +27071,16 @@ _tmp_91_rule(Parser *p) static void * _tmp_92_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // dict if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dict")); @@ -28096,7 +27099,7 @@ _tmp_92_rule(Parser *p) } { // set if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "set")); @@ -28115,7 +27118,7 @@ _tmp_92_rule(Parser *p) } { // dictcomp if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dictcomp")); @@ -28134,7 +27137,7 @@ _tmp_92_rule(Parser *p) } { // setcomp if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "setcomp")); @@ -28153,7 +27156,7 @@ _tmp_92_rule(Parser *p) } _res = NULL; done: - p->level--; + 
D(p->level--); return _res; } @@ -28161,19 +27164,16 @@ _tmp_92_rule(Parser *p) static void * _tmp_93_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_93[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -28192,7 +27192,7 @@ _tmp_93_rule(Parser *p) } { // named_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_93[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -28211,7 +27211,7 @@ _tmp_93_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -28219,12 +27219,9 @@ _tmp_93_rule(Parser *p) static asdl_seq * _loop0_94_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28234,14 +27231,14 @@ _loop0_94_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_94[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -28257,7 +27254,7 @@ _loop0_94_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28274,13 +27271,13 @@ _loop0_94_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_94_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28288,12 +27285,9 @@ _loop0_94_rule(Parser *p) static asdl_seq * _loop0_95_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28303,14 +27297,14 @@ _loop0_95_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_95[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -28326,7 +27320,7 @@ _loop0_95_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28343,13 +27337,13 @@ _loop0_95_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_95_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28357,12 +27351,9 @@ _loop0_95_rule(Parser *p) static asdl_seq * _loop0_96_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; 
- PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28372,14 +27363,14 @@ _loop0_96_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_96[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -28395,7 +27386,7 @@ _loop0_96_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28412,13 +27403,13 @@ _loop0_96_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_96_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28426,12 +27417,9 @@ _loop0_96_rule(Parser *p) static asdl_seq * _loop1_97_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28441,14 +27429,14 @@ _loop1_97_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_97[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -28464,7 +27452,7 @@ _loop1_97_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28478,7 +27466,7 @@ _loop1_97_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28486,13 +27474,13 @@ _loop1_97_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_97_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28500,12 +27488,9 @@ _loop1_97_rule(Parser *p) static asdl_seq * _loop0_98_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28515,14 +27500,14 @@ _loop0_98_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_98[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -28538,7 +27523,7 @@ _loop0_98_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28555,13 +27540,13 @@ _loop0_98_rule(Parser *p) 
PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_98_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28569,12 +27554,9 @@ _loop0_98_rule(Parser *p) static asdl_seq * _loop1_99_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28584,14 +27566,14 @@ _loop1_99_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_99[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -28607,7 +27589,7 @@ _loop1_99_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28621,7 +27603,7 @@ _loop1_99_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28629,13 +27611,13 @@ _loop1_99_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_99_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28643,12 +27625,9 @@ _loop1_99_rule(Parser *p) static asdl_seq * _loop1_100_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28658,14 +27637,14 @@ _loop1_100_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_100[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -28681,7 +27660,7 @@ _loop1_100_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28695,7 +27674,7 @@ _loop1_100_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28703,13 +27682,13 @@ _loop1_100_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_100_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28717,12 +27696,9 @@ _loop1_100_rule(Parser *p) static asdl_seq * _loop1_101_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + 
D(p->level--); return NULL; } void *_res = NULL; @@ -28732,14 +27708,14 @@ _loop1_101_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_101[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -28755,7 +27731,7 @@ _loop1_101_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28769,7 +27745,7 @@ _loop1_101_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28777,13 +27753,13 @@ _loop1_101_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_101_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28791,12 +27767,9 @@ _loop1_101_rule(Parser *p) static asdl_seq * _loop0_102_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28806,14 +27779,14 @@ _loop0_102_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_102[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -28829,7 +27802,7 @@ _loop0_102_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28846,13 +27819,13 @@ _loop0_102_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_102_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28860,12 +27833,9 @@ _loop0_102_rule(Parser *p) static asdl_seq * _loop1_103_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28875,14 +27845,14 @@ _loop1_103_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_103[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -28898,7 +27868,7 @@ _loop1_103_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28912,7 +27882,7 @@ _loop1_103_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - 
p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28920,13 +27890,13 @@ _loop1_103_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_103_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -28934,12 +27904,9 @@ _loop1_103_rule(Parser *p) static asdl_seq * _loop0_104_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -28949,14 +27916,14 @@ _loop0_104_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_104[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -28972,7 +27939,7 @@ _loop0_104_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -28989,13 +27956,13 @@ _loop0_104_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_104_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29003,12 +27970,9 @@ _loop0_104_rule(Parser *p) static asdl_seq * _loop1_105_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29018,14 +27982,14 @@ _loop1_105_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_105[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -29041,7 +28005,7 @@ _loop1_105_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29055,7 +28019,7 @@ _loop1_105_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -29063,13 +28027,13 @@ _loop1_105_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_105_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29077,12 +28041,9 @@ _loop1_105_rule(Parser *p) static asdl_seq * _loop0_106_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29092,14 
+28053,14 @@ _loop0_106_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_maybe_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_106[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); @@ -29115,7 +28076,7 @@ _loop0_106_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29132,13 +28093,13 @@ _loop0_106_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_106_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29146,12 +28107,9 @@ _loop0_106_rule(Parser *p) static asdl_seq * _loop1_107_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29161,14 +28119,14 @@ _loop1_107_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_maybe_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_107[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); @@ -29184,7 +28142,7 @@ _loop1_107_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29198,7 +28156,7 @@ _loop1_107_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -29206,13 +28164,13 @@ _loop1_107_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_107_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29220,12 +28178,9 @@ _loop1_107_rule(Parser *p) static asdl_seq * _loop1_108_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29235,14 +28190,14 @@ _loop1_108_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // STRING if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_108[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "STRING")); @@ -29258,7 +28213,7 @@ _loop1_108_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29272,7 +28227,7 @@ _loop1_108_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = 
(asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -29280,13 +28235,13 @@ _loop1_108_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_108_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29294,19 +28249,16 @@ _loop1_108_rule(Parser *p) static void * _tmp_109_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // star_named_expression ',' star_named_expressions? if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_109[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions?")); @@ -29325,7 +28277,7 @@ _tmp_109_rule(Parser *p) _res = _PyPegen_seq_insert_in_front ( p , y , z ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -29336,7 +28288,7 @@ _tmp_109_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -29344,12 +28296,9 @@ _tmp_109_rule(Parser *p) static asdl_seq * _loop0_111_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29359,14 +28308,14 @@ _loop0_111_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' double_starred_kvpair if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_111[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); @@ -29382,7 +28331,7 @@ _loop0_111_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -29391,7 +28340,7 @@ _loop0_111_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29408,13 +28357,13 @@ _loop0_111_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_111_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29422,19 +28371,16 @@ _loop0_111_rule(Parser *p) static asdl_seq * _gather_110_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // double_starred_kvpair _loop0_111 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_110[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_111")); @@ -29456,7 +28402,7 @@ _gather_110_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -29464,12 +28410,9 @@ _gather_110_rule(Parser *p) static asdl_seq * 
_loop1_112_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29479,14 +28422,14 @@ _loop1_112_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // for_if_clause if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_112[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "for_if_clause")); @@ -29502,7 +28445,7 @@ _loop1_112_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29516,7 +28459,7 @@ _loop1_112_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -29524,13 +28467,13 @@ _loop1_112_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_112_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29538,12 +28481,9 @@ _loop1_112_rule(Parser *p) static asdl_seq * _loop0_113_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29553,14 +28493,14 @@ _loop0_113_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('if' disjunction) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_113[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); @@ -29576,7 +28516,7 @@ _loop0_113_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29593,13 +28533,13 @@ _loop0_113_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_113_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29607,12 +28547,9 @@ _loop0_113_rule(Parser *p) static asdl_seq * _loop0_114_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29622,14 +28559,14 @@ _loop0_114_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('if' disjunction) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_114[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); @@ -29645,7 +28582,7 @@ _loop0_114_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = 
_new_children; @@ -29662,13 +28599,13 @@ _loop0_114_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_114_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29676,19 +28613,16 @@ _loop0_114_rule(Parser *p) static void * _tmp_115_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // assignment_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_115[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); @@ -29707,7 +28641,7 @@ _tmp_115_rule(Parser *p) } { // expression !':=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_115[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); @@ -29728,7 +28662,7 @@ _tmp_115_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -29736,12 +28670,9 @@ _tmp_115_rule(Parser *p) static asdl_seq * _loop0_117_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29751,14 +28682,14 @@ _loop0_117_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (starred_expression | (assignment_expression | expression !':=') !'=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_117[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (starred_expression | (assignment_expression | expression !':=') !'=')")); @@ -29774,7 +28705,7 @@ _loop0_117_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -29783,7 +28714,7 @@ _loop0_117_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29800,13 +28731,13 @@ _loop0_117_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_117_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29815,19 +28746,16 @@ _loop0_117_rule(Parser *p) static asdl_seq * _gather_116_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (starred_expression | (assignment_expression | expression !':=') !'=') _loop0_117 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_116[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(starred_expression | (assignment_expression | expression !':=') !'=') _loop0_117")); @@ -29849,7 +28777,7 @@ _gather_116_rule(Parser *p) } _res = NULL; 
done: - p->level--; + D(p->level--); return _res; } @@ -29857,19 +28785,16 @@ _gather_116_rule(Parser *p) static void * _tmp_118_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' kwargs if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_118[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwargs")); @@ -29885,7 +28810,7 @@ _tmp_118_rule(Parser *p) _res = k; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -29896,7 +28821,7 @@ _tmp_118_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -29904,12 +28829,9 @@ _tmp_118_rule(Parser *p) static asdl_seq * _loop0_120_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -29919,14 +28841,14 @@ _loop0_120_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_starred if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_120[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_starred")); @@ -29942,7 +28864,7 @@ _loop0_120_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -29951,7 +28873,7 @@ _loop0_120_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -29968,13 +28890,13 @@ _loop0_120_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_120_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -29982,19 +28904,16 @@ _loop0_120_rule(Parser *p) static asdl_seq * _gather_119_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_starred _loop0_120 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_119[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwarg_or_starred _loop0_120")); @@ -30016,7 +28935,7 @@ _gather_119_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30024,12 +28943,9 @@ _gather_119_rule(Parser *p) static asdl_seq * _loop0_122_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30039,14 +28955,14 @@ _loop0_122_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_double_starred if (p->error_indicator) { - p->level--; + 
D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_122[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_double_starred")); @@ -30062,7 +28978,7 @@ _loop0_122_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -30071,7 +28987,7 @@ _loop0_122_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30088,13 +29004,13 @@ _loop0_122_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_122_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -30102,19 +29018,16 @@ _loop0_122_rule(Parser *p) static asdl_seq * _gather_121_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_double_starred _loop0_122 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_121[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwarg_or_double_starred _loop0_122")); @@ -30136,7 +29049,7 @@ _gather_121_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30144,12 +29057,9 @@ _gather_121_rule(Parser *p) static asdl_seq * _loop0_124_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30159,14 +29069,14 @@ _loop0_124_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_starred if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_124[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_starred")); @@ -30182,7 +29092,7 @@ _loop0_124_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -30191,7 +29101,7 @@ _loop0_124_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30208,13 +29118,13 @@ _loop0_124_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_124_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -30222,19 +29132,16 @@ _loop0_124_rule(Parser *p) static asdl_seq * _gather_123_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_starred _loop0_124 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_123[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "kwarg_or_starred _loop0_124")); @@ -30256,7 +29163,7 @@ _gather_123_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30264,12 +29171,9 @@ _gather_123_rule(Parser *p) static asdl_seq * _loop0_126_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30279,14 +29183,14 @@ _loop0_126_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_double_starred if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_126[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_double_starred")); @@ -30302,7 +29206,7 @@ _loop0_126_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -30311,7 +29215,7 @@ _loop0_126_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30328,13 +29232,13 @@ _loop0_126_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_126_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -30342,19 +29246,16 @@ _loop0_126_rule(Parser *p) static asdl_seq * _gather_125_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_double_starred _loop0_126 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_125[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwarg_or_double_starred _loop0_126")); @@ -30376,7 +29277,7 @@ _gather_125_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30384,12 +29285,9 @@ _gather_125_rule(Parser *p) static asdl_seq * _loop0_127_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30399,14 +29297,14 @@ _loop0_127_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' star_target) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_127[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); @@ -30422,7 +29320,7 @@ _loop0_127_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30439,13 +29337,13 @@ _loop0_127_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_127_type, _seq); - 
p->level--; + D(p->level--); return _seq; } @@ -30453,12 +29351,9 @@ _loop0_127_rule(Parser *p) static asdl_seq * _loop0_129_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30468,14 +29363,14 @@ _loop0_129_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_129[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); @@ -30491,7 +29386,7 @@ _loop0_129_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -30500,7 +29395,7 @@ _loop0_129_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30517,13 +29412,13 @@ _loop0_129_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_129_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -30531,19 +29426,16 @@ _loop0_129_rule(Parser *p) static asdl_seq * _gather_128_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // star_target _loop0_129 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_128[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target _loop0_129")); @@ -30565,7 +29457,7 @@ _gather_128_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30573,12 +29465,9 @@ _gather_128_rule(Parser *p) static asdl_seq * _loop1_130_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30588,14 +29477,14 @@ _loop1_130_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' star_target) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_130[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); @@ -30611,7 +29500,7 @@ _loop1_130_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30625,7 +29514,7 @@ _loop1_130_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -30633,13 +29522,13 @@ _loop1_130_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); 
_PyPegen_insert_memo(p, _start_mark, _loop1_130_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -30647,19 +29536,16 @@ _loop1_130_rule(Parser *p) static void * _tmp_131_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // !'*' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_131[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!'*' star_target")); @@ -30680,7 +29566,7 @@ _tmp_131_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30688,12 +29574,9 @@ _tmp_131_rule(Parser *p) static asdl_seq * _loop0_133_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30703,14 +29586,14 @@ _loop0_133_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' del_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_133[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' del_target")); @@ -30726,7 +29609,7 @@ _loop0_133_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -30735,7 +29618,7 @@ _loop0_133_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30752,13 +29635,13 @@ _loop0_133_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_133_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -30766,19 +29649,16 @@ _loop0_133_rule(Parser *p) static asdl_seq * _gather_132_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // del_target _loop0_133 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_132[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "del_target _loop0_133")); @@ -30800,7 +29680,7 @@ _gather_132_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30808,12 +29688,9 @@ _gather_132_rule(Parser *p) static asdl_seq * _loop0_135_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30823,14 +29700,14 @@ _loop0_135_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_135[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -30846,7 +29723,7 @@ 
_loop0_135_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -30855,7 +29732,7 @@ _loop0_135_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30872,13 +29749,13 @@ _loop0_135_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_135_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -30886,19 +29763,16 @@ _loop0_135_rule(Parser *p) static asdl_seq * _gather_134_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_135 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_134[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_135")); @@ -30920,7 +29794,7 @@ _gather_134_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -30928,12 +29802,9 @@ _gather_134_rule(Parser *p) static asdl_seq * _loop0_137_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -30943,14 +29814,14 @@ _loop0_137_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_137[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -30966,7 +29837,7 @@ _loop0_137_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -30975,7 +29846,7 @@ _loop0_137_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -30992,13 +29863,13 @@ _loop0_137_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_137_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -31006,19 +29877,16 @@ _loop0_137_rule(Parser *p) static asdl_seq * _gather_136_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_137 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_136[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_137")); @@ -31040,7 +29908,7 @@ _gather_136_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31048,12 +29916,9 @@ _gather_136_rule(Parser 
*p) static asdl_seq * _loop0_139_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -31063,14 +29928,14 @@ _loop0_139_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_139[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -31086,7 +29951,7 @@ _loop0_139_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -31095,7 +29960,7 @@ _loop0_139_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -31112,13 +29977,13 @@ _loop0_139_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_139_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -31126,19 +29991,16 @@ _loop0_139_rule(Parser *p) static asdl_seq * _gather_138_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_139 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_138[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_139")); @@ -31160,7 +30022,7 @@ _gather_138_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31168,12 +30030,9 @@ _gather_138_rule(Parser *p) static asdl_seq * _loop0_141_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -31183,14 +30042,14 @@ _loop0_141_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_141[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -31206,7 +30065,7 @@ _loop0_141_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -31215,7 +30074,7 @@ _loop0_141_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -31232,13 +30091,13 @@ _loop0_141_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_141_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -31246,19 +30105,16 
@@ _loop0_141_rule(Parser *p) static asdl_seq * _gather_140_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_141 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_140[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_141")); @@ -31280,7 +30136,7 @@ _gather_140_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31288,19 +30144,16 @@ _gather_140_rule(Parser *p) static void * _tmp_142_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // NEWLINE INDENT if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_142[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE INDENT")); @@ -31322,7 +30175,7 @@ _tmp_142_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31330,19 +30183,16 @@ _tmp_142_rule(Parser *p) static void * _tmp_143_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // args if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_143[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args")); @@ -31361,7 +30211,7 @@ _tmp_143_rule(Parser *p) } { // expression for_if_clauses if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_143[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses")); @@ -31383,7 +30233,7 @@ _tmp_143_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31391,19 +30241,16 @@ _tmp_143_rule(Parser *p) static void * _tmp_144_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'True' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -31422,7 +30269,7 @@ _tmp_144_rule(Parser *p) } { // 'False' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -31441,7 +30288,7 @@ _tmp_144_rule(Parser *p) } { // 'None' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -31460,7 +30307,7 @@ _tmp_144_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31468,19 +30315,16 @@ _tmp_144_rule(Parser *p) static void * _tmp_145_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // NAME '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> 
_tmp_145[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '='")); @@ -31502,7 +30346,7 @@ _tmp_145_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31510,19 +30354,16 @@ _tmp_145_rule(Parser *p) static void * _tmp_146_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // NAME STRING if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME STRING")); @@ -31544,7 +30385,7 @@ _tmp_146_rule(Parser *p) } { // SOFT_KEYWORD if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "SOFT_KEYWORD")); @@ -31563,7 +30404,7 @@ _tmp_146_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31571,19 +30412,16 @@ _tmp_146_rule(Parser *p) static void * _tmp_147_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'else' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_147[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else'")); @@ -31602,7 +30440,7 @@ _tmp_147_rule(Parser *p) } { // ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_147[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -31621,7 +30459,7 @@ _tmp_147_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31629,19 +30467,16 @@ _tmp_147_rule(Parser *p) static void * _tmp_148_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -31660,7 +30495,7 @@ _tmp_148_rule(Parser *p) } { // ':=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); @@ -31679,7 +30514,7 @@ _tmp_148_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31687,19 +30522,16 @@ _tmp_148_rule(Parser *p) static void * _tmp_149_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // list if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); @@ -31718,7 +30550,7 @@ _tmp_149_rule(Parser *p) } { // tuple if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); @@ -31737,7 +30569,7 @@ _tmp_149_rule(Parser *p) } { // genexp if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "genexp")); @@ -31756,7 +30588,7 @@ _tmp_149_rule(Parser *p) } { // 'True' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -31775,7 +30607,7 @@ _tmp_149_rule(Parser *p) } { // 'None' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -31794,7 +30626,7 @@ _tmp_149_rule(Parser *p) } { // 'False' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -31813,7 +30645,7 @@ _tmp_149_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31821,19 +30653,16 @@ _tmp_149_rule(Parser *p) static void * _tmp_150_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -31852,7 +30681,7 @@ _tmp_150_rule(Parser *p) } { // ':=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); @@ -31871,7 +30700,7 @@ _tmp_150_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -31879,12 +30708,9 @@ _tmp_150_rule(Parser *p) static asdl_seq * _loop0_151_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -31894,14 +30720,14 @@ _loop0_151_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // star_named_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expressions")); @@ -31917,7 +30743,7 @@ _loop0_151_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -31934,13 +30760,13 @@ _loop0_151_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_151_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -31948,12 +30774,9 @@ _loop0_151_rule(Parser *p) static asdl_seq * _loop0_152_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -31963,14 +30786,14 @@ _loop0_152_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (star_targets '=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_152[%d-%d]: %s\n", 
p->level, ' ', _mark, p->mark, "(star_targets '=')")); @@ -31986,7 +30809,7 @@ _loop0_152_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32003,13 +30826,13 @@ _loop0_152_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_152_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32017,12 +30840,9 @@ _loop0_152_rule(Parser *p) static asdl_seq * _loop0_153_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32032,14 +30852,14 @@ _loop0_153_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (star_targets '=') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_153[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); @@ -32055,7 +30875,7 @@ _loop0_153_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32072,13 +30892,13 @@ _loop0_153_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_153_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32086,19 +30906,16 @@ _loop0_153_rule(Parser *p) static void * _tmp_154_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_154[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -32117,7 +30934,7 @@ _tmp_154_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_154[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -32136,7 +30953,7 @@ _tmp_154_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -32144,19 +30961,16 @@ _tmp_154_rule(Parser *p) static void * _tmp_155_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '[' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -32175,7 +30989,7 @@ _tmp_155_rule(Parser *p) } { // '(' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -32194,7 +31008,7 @@ _tmp_155_rule(Parser *p) } { // '{' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } 
D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); @@ -32213,7 +31027,7 @@ _tmp_155_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -32221,19 +31035,16 @@ _tmp_155_rule(Parser *p) static void * _tmp_156_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '[' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -32252,7 +31063,7 @@ _tmp_156_rule(Parser *p) } { // '{' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); @@ -32271,7 +31082,7 @@ _tmp_156_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -32279,19 +31090,16 @@ _tmp_156_rule(Parser *p) static void * _tmp_157_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '[' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -32310,7 +31118,7 @@ _tmp_157_rule(Parser *p) } { // '{' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); @@ -32329,7 +31137,7 @@ _tmp_157_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -32337,12 +31145,9 @@ _tmp_157_rule(Parser *p) static asdl_seq * _loop0_158_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32352,14 +31157,14 @@ _loop0_158_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_158[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -32375,7 +31180,7 @@ _loop0_158_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32392,13 +31197,13 @@ _loop0_158_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_158_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32406,12 +31211,9 @@ _loop0_158_rule(Parser *p) static asdl_seq * _loop0_159_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32421,14 +31223,14 @@ _loop0_159_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 
1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_159[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -32444,7 +31246,7 @@ _loop0_159_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32461,13 +31263,13 @@ _loop0_159_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_159_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32475,12 +31277,9 @@ _loop0_159_rule(Parser *p) static asdl_seq * _loop1_160_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32490,14 +31289,14 @@ _loop1_160_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_160[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -32513,7 +31312,7 @@ _loop1_160_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32527,7 +31326,7 @@ _loop1_160_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -32535,13 +31334,13 @@ _loop1_160_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_160_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32549,12 +31348,9 @@ _loop1_160_rule(Parser *p) static asdl_seq * _loop1_161_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32564,14 +31360,14 @@ _loop1_161_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_161[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -32587,7 +31383,7 @@ _loop1_161_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32601,7 +31397,7 @@ _loop1_161_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -32609,13 +31405,13 @@ _loop1_161_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; 
i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_161_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32623,12 +31419,9 @@ _loop1_161_rule(Parser *p) static asdl_seq * _loop0_162_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32638,14 +31431,14 @@ _loop0_162_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_162[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -32661,7 +31454,7 @@ _loop0_162_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32678,13 +31471,13 @@ _loop0_162_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_162_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32692,12 +31485,9 @@ _loop0_162_rule(Parser *p) static asdl_seq * _loop0_163_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32707,14 +31497,14 @@ _loop0_163_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_163[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -32730,7 +31520,7 @@ _loop0_163_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32747,13 +31537,13 @@ _loop0_163_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_163_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32761,12 +31551,9 @@ _loop0_163_rule(Parser *p) static asdl_seq * _loop0_165_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32776,14 +31563,14 @@ _loop0_165_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' lambda_param if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_165[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' lambda_param")); @@ -32799,7 +31586,7 @@ _loop0_165_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; 
PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -32808,7 +31595,7 @@ _loop0_165_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32825,13 +31612,13 @@ _loop0_165_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_165_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32839,19 +31626,16 @@ _loop0_165_rule(Parser *p) static asdl_seq * _gather_164_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // lambda_param _loop0_165 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_164[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param _loop0_165")); @@ -32873,7 +31657,7 @@ _gather_164_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -32881,12 +31665,9 @@ _gather_164_rule(Parser *p) static asdl_seq * _loop1_166_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -32896,14 +31677,14 @@ _loop1_166_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_166[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -32919,7 +31700,7 @@ _loop1_166_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -32933,7 +31714,7 @@ _loop1_166_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -32941,13 +31722,13 @@ _loop1_166_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_166_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -32955,19 +31736,16 @@ _loop1_166_rule(Parser *p) static void * _tmp_167_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_167[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -32986,7 +31764,7 @@ _tmp_167_rule(Parser *p) } { // ',' (')' | '**') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_167[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); @@ -33008,7 +31786,7 @@ 
_tmp_167_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33016,19 +31794,16 @@ _tmp_167_rule(Parser *p) static void * _tmp_168_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_168[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -33047,7 +31822,7 @@ _tmp_168_rule(Parser *p) } { // ',' (':' | '**') if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_168[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); @@ -33069,7 +31844,7 @@ _tmp_168_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33077,19 +31852,16 @@ _tmp_168_rule(Parser *p) static void * _tmp_169_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); @@ -33108,7 +31880,7 @@ _tmp_169_rule(Parser *p) } { // ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -33127,7 +31899,7 @@ _tmp_169_rule(Parser *p) } { // ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -33146,7 +31918,7 @@ _tmp_169_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33154,12 +31926,9 @@ _tmp_169_rule(Parser *p) static asdl_seq * _loop0_171_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -33169,14 +31938,14 @@ _loop0_171_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expression ['as' star_target]) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_171[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); @@ -33192,7 +31961,7 @@ _loop0_171_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -33201,7 +31970,7 @@ _loop0_171_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -33218,13 +31987,13 @@ _loop0_171_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_171_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -33232,19 +32001,16 @@ _loop0_171_rule(Parser *p) static asdl_seq * 
_gather_170_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expression ['as' star_target]) _loop0_171 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_170[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_171")); @@ -33266,7 +32032,7 @@ _gather_170_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33274,12 +32040,9 @@ _gather_170_rule(Parser *p) static asdl_seq * _loop0_173_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -33289,14 +32052,14 @@ _loop0_173_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expressions ['as' star_target]) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_173[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); @@ -33312,7 +32075,7 @@ _loop0_173_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -33321,7 +32084,7 @@ _loop0_173_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -33338,13 +32101,13 @@ _loop0_173_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_173_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -33352,19 +32115,16 @@ _loop0_173_rule(Parser *p) static asdl_seq * _gather_172_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expressions ['as' star_target]) _loop0_173 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_172[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_173")); @@ -33386,7 +32146,7 @@ _gather_172_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33394,12 +32154,9 @@ _gather_172_rule(Parser *p) static asdl_seq * _loop0_175_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -33409,14 +32166,14 @@ _loop0_175_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expression ['as' star_target]) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_175[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); 
@@ -33432,7 +32189,7 @@ _loop0_175_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -33441,7 +32198,7 @@ _loop0_175_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -33458,13 +32215,13 @@ _loop0_175_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_175_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -33472,19 +32229,16 @@ _loop0_175_rule(Parser *p) static asdl_seq * _gather_174_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expression ['as' star_target]) _loop0_175 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_174[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_175")); @@ -33506,7 +32260,7 @@ _gather_174_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33514,12 +32268,9 @@ _gather_174_rule(Parser *p) static asdl_seq * _loop0_177_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -33529,14 +32280,14 @@ _loop0_177_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expressions ['as' star_target]) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_177[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); @@ -33552,7 +32303,7 @@ _loop0_177_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -33561,7 +32312,7 @@ _loop0_177_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -33578,13 +32329,13 @@ _loop0_177_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_177_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -33592,19 +32343,16 @@ _loop0_177_rule(Parser *p) static asdl_seq * _gather_176_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expressions ['as' star_target]) _loop0_177 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_176[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_177")); @@ -33626,7 
+32374,7 @@ _gather_176_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33634,19 +32382,16 @@ _gather_176_rule(Parser *p) static void * _tmp_178_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_178[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except'")); @@ -33665,7 +32410,7 @@ _tmp_178_rule(Parser *p) } { // 'finally' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_178[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally'")); @@ -33684,7 +32429,7 @@ _tmp_178_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33692,12 +32437,9 @@ _tmp_178_rule(Parser *p) static asdl_seq * _loop0_179_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -33707,14 +32449,14 @@ _loop0_179_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_179[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); @@ -33730,7 +32472,7 @@ _loop0_179_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -33747,13 +32489,13 @@ _loop0_179_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_179_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -33761,19 +32503,16 @@ _loop0_179_rule(Parser *p) static void * _tmp_180_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // (except_block+ except_star_block) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_180[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(except_block+ except_star_block)")); @@ -33792,7 +32531,7 @@ _tmp_180_rule(Parser *p) } { // (except_star_block+ except_block) if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_180[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(except_star_block+ except_block)")); @@ -33811,7 +32550,7 @@ _tmp_180_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33819,12 +32558,9 @@ _tmp_180_rule(Parser *p) static asdl_seq * _loop0_181_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -33834,14 +32570,14 @@ _loop0_181_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } 
Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_181[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); @@ -33857,7 +32593,7 @@ _loop0_181_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -33874,13 +32610,13 @@ _loop0_181_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_181_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -33888,19 +32624,16 @@ _loop0_181_rule(Parser *p) static void * _tmp_182_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_182[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -33922,7 +32655,7 @@ _tmp_182_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33930,19 +32663,16 @@ _tmp_182_rule(Parser *p) static void * _tmp_183_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_183[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -33964,7 +32694,7 @@ _tmp_183_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -33972,19 +32702,16 @@ _tmp_183_rule(Parser *p) static void * _tmp_184_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_184[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -34003,7 +32730,7 @@ _tmp_184_rule(Parser *p) } { // ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_184[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -34022,7 +32749,7 @@ _tmp_184_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34030,19 +32757,16 @@ _tmp_184_rule(Parser *p) static void * _tmp_185_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_185[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -34064,7 +32788,7 @@ _tmp_185_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34072,19 +32796,16 @@ _tmp_185_rule(Parser *p) static void * _tmp_186_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 
1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_186[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -34106,7 +32827,7 @@ _tmp_186_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34114,19 +32835,16 @@ _tmp_186_rule(Parser *p) static void * _tmp_187_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // positional_patterns ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_187[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "positional_patterns ','")); @@ -34148,7 +32866,7 @@ _tmp_187_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34156,19 +32874,16 @@ _tmp_187_rule(Parser *p) static void * _tmp_188_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '->' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_188[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); @@ -34190,7 +32905,7 @@ _tmp_188_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34198,19 +32913,16 @@ _tmp_188_rule(Parser *p) static void * _tmp_189_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' arguments? ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_189[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? 
')'")); @@ -34236,7 +32948,7 @@ _tmp_189_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34244,12 +32956,9 @@ _tmp_189_rule(Parser *p) static asdl_seq * _loop0_191_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -34259,14 +32968,14 @@ _loop0_191_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' double_starred_kvpair if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop0_191[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); @@ -34282,7 +32991,7 @@ _loop0_191_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } if (_n == _children_capacity) { @@ -34291,7 +33000,7 @@ _loop0_191_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -34308,13 +33017,13 @@ _loop0_191_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_191_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -34322,19 +33031,16 @@ _loop0_191_rule(Parser *p) static asdl_seq * _gather_190_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // double_starred_kvpair _loop0_191 if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _gather_190[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_191")); @@ -34356,7 +33062,7 @@ _gather_190_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34364,19 +33070,16 @@ _gather_190_rule(Parser *p) static void * _tmp_192_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '}' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); @@ -34395,7 +33098,7 @@ _tmp_192_rule(Parser *p) } { // ',' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); @@ -34414,7 +33117,7 @@ _tmp_192_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34422,19 +33125,16 @@ _tmp_192_rule(Parser *p) static void * _tmp_193_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // star_targets '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "star_targets '='")); @@ -34450,7 +33150,7 @@ _tmp_193_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34461,7 +33161,7 @@ _tmp_193_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34469,19 +33169,16 @@ _tmp_193_rule(Parser *p) static void * _tmp_194_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -34500,7 +33197,7 @@ _tmp_194_rule(Parser *p) } { // '...' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); @@ -34519,7 +33216,7 @@ _tmp_194_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34527,19 +33224,16 @@ _tmp_194_rule(Parser *p) static void * _tmp_195_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_195[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -34558,7 +33252,7 @@ _tmp_195_rule(Parser *p) } { // '...' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_195[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); @@ -34577,7 +33271,7 @@ _tmp_195_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34585,19 +33279,16 @@ _tmp_195_rule(Parser *p) static void * _tmp_196_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // '@' named_expression NEWLINE if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_196[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); @@ -34616,7 +33307,7 @@ _tmp_196_rule(Parser *p) _res = f; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34627,7 +33318,7 @@ _tmp_196_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34635,19 +33326,16 @@ _tmp_196_rule(Parser *p) static void * _tmp_197_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_197[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -34663,7 +33351,7 @@ _tmp_197_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34674,7 +33362,7 @@ _tmp_197_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); 
return _res; } @@ -34682,19 +33370,16 @@ _tmp_197_rule(Parser *p) static void * _tmp_198_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' star_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_198[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_expression")); @@ -34710,7 +33395,7 @@ _tmp_198_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34721,7 +33406,7 @@ _tmp_198_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34729,19 +33414,16 @@ _tmp_198_rule(Parser *p) static void * _tmp_199_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'or' conjunction if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_199[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); @@ -34757,7 +33439,7 @@ _tmp_199_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34768,7 +33450,7 @@ _tmp_199_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34776,19 +33458,16 @@ _tmp_199_rule(Parser *p) static void * _tmp_200_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'and' inversion if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_200[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'and' inversion")); @@ -34804,7 +33483,7 @@ _tmp_200_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34815,7 +33494,7 @@ _tmp_200_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34823,19 +33502,16 @@ _tmp_200_rule(Parser *p) static void * _tmp_201_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'if' disjunction if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_201[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); @@ -34851,7 +33527,7 @@ _tmp_201_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34862,7 +33538,7 @@ _tmp_201_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34870,19 +33546,16 @@ _tmp_201_rule(Parser *p) static void * _tmp_202_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'if' disjunction if (p->error_indicator) { - 
p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_202[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); @@ -34898,7 +33571,7 @@ _tmp_202_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -34909,7 +33582,7 @@ _tmp_202_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34917,19 +33590,16 @@ _tmp_202_rule(Parser *p) static void * _tmp_203_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // starred_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_203[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); @@ -34948,7 +33618,7 @@ _tmp_203_rule(Parser *p) } { // (assignment_expression | expression !':=') !'=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_203[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); @@ -34969,7 +33639,7 @@ _tmp_203_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -34977,19 +33647,16 @@ _tmp_203_rule(Parser *p) static void * _tmp_204_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_204[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); @@ -35005,7 +33672,7 @@ _tmp_204_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -35016,7 +33683,7 @@ _tmp_204_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35024,19 +33691,16 @@ _tmp_204_rule(Parser *p) static void * _tmp_205_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_205[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); @@ -35052,7 +33716,7 @@ _tmp_205_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - p->level--; + D(p->level--); return NULL; } goto done; @@ -35063,7 +33727,7 @@ _tmp_205_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35071,19 +33735,16 @@ _tmp_205_rule(Parser *p) static void * _tmp_206_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // star_targets '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_206[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); @@ -35105,7 +33766,7 @@ _tmp_206_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); 
return _res; } @@ -35113,19 +33774,16 @@ _tmp_206_rule(Parser *p) static void * _tmp_207_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // star_targets '=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_207[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); @@ -35147,7 +33805,7 @@ _tmp_207_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35155,19 +33813,16 @@ _tmp_207_rule(Parser *p) static void * _tmp_208_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ')' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_208[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -35186,7 +33841,7 @@ _tmp_208_rule(Parser *p) } { // '**' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_208[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); @@ -35205,7 +33860,7 @@ _tmp_208_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35213,19 +33868,16 @@ _tmp_208_rule(Parser *p) static void * _tmp_209_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // ':' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_209[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -35244,7 +33896,7 @@ _tmp_209_rule(Parser *p) } { // '**' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_209[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); @@ -35263,7 +33915,7 @@ _tmp_209_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35271,19 +33923,16 @@ _tmp_209_rule(Parser *p) static void * _tmp_210_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // expression ['as' star_target] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_210[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); @@ -35306,7 +33955,7 @@ _tmp_210_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35314,19 +33963,16 @@ _tmp_210_rule(Parser *p) static void * _tmp_211_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // expressions ['as' star_target] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_211[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); @@ -35349,7 +33995,7 @@ _tmp_211_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35357,19 
+34003,16 @@ _tmp_211_rule(Parser *p) static void * _tmp_212_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // expression ['as' star_target] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_212[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); @@ -35392,7 +34035,7 @@ _tmp_212_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35400,19 +34043,16 @@ _tmp_212_rule(Parser *p) static void * _tmp_213_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // expressions ['as' star_target] if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_213[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); @@ -35435,7 +34075,7 @@ _tmp_213_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35443,19 +34083,16 @@ _tmp_213_rule(Parser *p) static void * _tmp_214_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // except_block+ except_star_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_214[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block+ except_star_block")); @@ -35477,7 +34114,7 @@ _tmp_214_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35485,19 +34122,16 @@ _tmp_214_rule(Parser *p) static void * _tmp_215_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // except_star_block+ except_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_215[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block+ except_block")); @@ -35519,7 +34153,7 @@ _tmp_215_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35527,19 +34161,16 @@ _tmp_215_rule(Parser *p) static void * _tmp_216_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // assignment_expression if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); @@ -35558,7 +34189,7 @@ _tmp_216_rule(Parser *p) } { // expression !':=' if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); @@ -35579,7 +34210,7 @@ _tmp_216_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35587,19 +34218,16 @@ _tmp_216_rule(Parser *p) static void * _tmp_217_rule(Parser *p) { - if (p->level++ == MAXSTACK) 
{ - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_217[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -35621,7 +34249,7 @@ _tmp_217_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35629,19 +34257,16 @@ _tmp_217_rule(Parser *p) static void * _tmp_218_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_218[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -35663,7 +34288,7 @@ _tmp_218_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35671,19 +34296,16 @@ _tmp_218_rule(Parser *p) static void * _tmp_219_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -35705,7 +34327,7 @@ _tmp_219_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35713,19 +34335,16 @@ _tmp_219_rule(Parser *p) static void * _tmp_220_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _tmp_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -35747,7 +34366,7 @@ _tmp_220_rule(Parser *p) } _res = NULL; done: - p->level--; + D(p->level--); return _res; } @@ -35755,12 +34374,9 @@ _tmp_220_rule(Parser *p) static asdl_seq * _loop1_221_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -35770,14 +34386,14 @@ _loop1_221_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); @@ -35793,7 +34409,7 @@ _loop1_221_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -35807,7 +34423,7 @@ _loop1_221_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -35815,13 +34431,13 @@ _loop1_221_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; 
PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_221_type, _seq); - p->level--; + D(p->level--); return _seq; } @@ -35829,12 +34445,9 @@ _loop1_221_rule(Parser *p) static asdl_seq * _loop1_222_rule(Parser *p) { - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } + D(p->level++); if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } void *_res = NULL; @@ -35844,14 +34457,14 @@ _loop1_222_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_star_block if (p->error_indicator) { - p->level--; + D(p->level--); return NULL; } D(fprintf(stderr, "%*c> _loop1_222[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); @@ -35867,7 +34480,7 @@ _loop1_222_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } _children = _new_children; @@ -35881,7 +34494,7 @@ _loop1_222_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - p->level--; + D(p->level--); return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -35889,13 +34502,13 @@ _loop1_222_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - p->level--; + D(p->level--); return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_222_type, _seq); - p->level--; + D(p->level--); return _seq; } diff --git a/Parser/pegen.c b/Parser/pegen.c index cfea1c87199b2..870085e7285e3 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -815,7 +815,6 @@ void * _PyPegen_run_parser(Parser *p) { void *res = _PyPegen_parse(p); - assert(p->level == 0); if (res == NULL) { if (PyErr_Occurred() && !PyErr_ExceptionMatches(PyExc_SyntaxError)) { return NULL; diff --git a/Tools/peg_generator/pegen/c_generator.py b/Tools/peg_generator/pegen/c_generator.py index ee255c8016386..9cfbf38b40a77 100644 --- a/Tools/peg_generator/pegen/c_generator.py +++ b/Tools/peg_generator/pegen/c_generator.py @@ -37,8 +37,6 @@ # define D(x) #endif -# define MAXSTACK 6000 - """ @@ -366,14 +364,10 @@ def __init__( self.skip_actions = skip_actions def add_level(self) -> None: - self.print("if (p->level++ == MAXSTACK) {") - with self.indent(): - self.print("p->error_indicator = 1;") - self.print("PyErr_NoMemory();") - self.print("}") + self.print("D(p->level++);") def remove_level(self) -> None: - self.print("p->level--;") + self.print("D(p->level--);") def add_return(self, ret_val: str) -> None: self.remove_level() @@ -550,10 +544,9 @@ def _set_up_rule_memoization(self, node: Rule, result_type: str) -> None: self.print("p->in_raw_rule++;") self.print(f"void *_raw = {node.name}_raw(p);") self.print("p->in_raw_rule--;") - self.print("if (p->error_indicator) {") + self.print("if (p->error_indicator)") with self.indent(): - self.add_return("NULL") - self.print("}") + self.print("return NULL;") self.print("if (_raw == NULL || p->mark <= _resmark)") with self.indent(): self.print("break;") From webhook-mailer at python.org Mon Jan 3 14:02:48 2022 From: webhook-mailer at python.org (pablogsal) Date: Mon, 03 Jan 2022 19:02:48 -0000 Subject: [Python-checkins] bpo-44092: Don't reset statements/cursors before rollback 
 (GH-26026)
Message-ID:

https://github.com/python/cpython/commit/9d6a239a34a66e16188d76c23a3a770515ca44ca

commit: 9d6a239a34a66e16188d76c23a3a770515ca44ca
branch: main
author: Erlend Egeberg Aasland
committer: pablogsal
date: 2022-01-03T19:02:39Z
summary:

bpo-44092: Don't reset statements/cursors before rollback (GH-26026)

In SQLite versions pre 3.7.11, pending statements would block a rollback.
This is no longer the case, so remove the workaround.

files:
A Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst
M Doc/whatsnew/3.11.rst
M Lib/test/test_sqlite3/test_regression.py
M Lib/test/test_sqlite3/test_transactions.py
M Modules/_sqlite/connection.c

diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst
index faa63a93895a2..4ddca744720f5 100644
--- a/Doc/whatsnew/3.11.rst
+++ b/Doc/whatsnew/3.11.rst
@@ -302,6 +302,11 @@ sys
   (Contributed by Irit Katriel in :issue:`45711`.)

+* Fetch across rollback no longer raises :exc:`~sqlite3.InterfaceError`.
+  Instead we leave it to the SQLite library to handle these cases.
+  (Contributed by Erlend E. Aasland in :issue:`44092`.)
+
+
 threading
 ---------

diff --git a/Lib/test/test_sqlite3/test_regression.py b/Lib/test/test_sqlite3/test_regression.py
index eb34069d3e554..b527053039b8a 100644
--- a/Lib/test/test_sqlite3/test_regression.py
+++ b/Lib/test/test_sqlite3/test_regression.py
@@ -231,28 +231,6 @@ def __init__(self, name):
         with self.assertRaises(sqlite.ProgrammingError):
             cur = con.cursor()

-    def test_cursor_registration(self):
-        """
-        Verifies that subclassed cursor classes are correctly registered with
-        the connection object, too. (fetch-across-rollback problem)
-        """
-        class Connection(sqlite.Connection):
-            def cursor(self):
-                return Cursor(self)
-
-        class Cursor(sqlite.Cursor):
-            def __init__(self, con):
-                sqlite.Cursor.__init__(self, con)
-
-        con = Connection(":memory:")
-        cur = con.cursor()
-        cur.execute("create table foo(x)")
-        cur.executemany("insert into foo(x) values (?)", [(3,), (4,), (5,)])
-        cur.execute("select x from foo")
-        con.rollback()
-        with self.assertRaises(sqlite.InterfaceError):
-            cur.fetchall()
-
     def test_auto_commit(self):
         """
         Verifies that creating a connection in autocommit mode works.

diff --git a/Lib/test/test_sqlite3/test_transactions.py b/Lib/test/test_sqlite3/test_transactions.py
index 3efa2c1e604ff..55cf8f1fdfce3 100644
--- a/Lib/test/test_sqlite3/test_transactions.py
+++ b/Lib/test/test_sqlite3/test_transactions.py
@@ -131,10 +131,7 @@ def test_locking(self):
         self.con1.commit()

     def test_rollback_cursor_consistency(self):
-        """
-        Checks if cursors on the connection are set into a "reset" state
-        when a rollback is done on the connection.
- """ + """Check that cursors behave correctly after rollback.""" con = sqlite.connect(":memory:") cur = con.cursor() cur.execute("create table test(x)") @@ -142,8 +139,44 @@ def test_rollback_cursor_consistency(self): cur.execute("select 1 union select 2 union select 3") con.rollback() - with self.assertRaises(sqlite.InterfaceError): - cur.fetchall() + self.assertEqual(cur.fetchall(), [(1,), (2,), (3,)]) + + +class RollbackTests(unittest.TestCase): + """bpo-44092: sqlite3 now leaves it to SQLite to resolve rollback issues""" + + def setUp(self): + self.con = sqlite.connect(":memory:") + self.cur1 = self.con.cursor() + self.cur2 = self.con.cursor() + with self.con: + self.con.execute("create table t(c)"); + self.con.executemany("insert into t values(?)", [(0,), (1,), (2,)]) + self.cur1.execute("begin transaction") + select = "select c from t" + self.cur1.execute(select) + self.con.rollback() + self.res = self.cur2.execute(select) # Reusing stmt from cache + + def tearDown(self): + self.con.close() + + def _check_rows(self): + for i, row in enumerate(self.res): + self.assertEqual(row[0], i) + + def test_no_duplicate_rows_after_rollback_del_cursor(self): + del self.cur1 + self._check_rows() + + def test_no_duplicate_rows_after_rollback_close_cursor(self): + self.cur1.close() + self._check_rows() + + def test_no_duplicate_rows_after_rollback_new_query(self): + self.cur1.execute("select c from t where c = 1") + self._check_rows() + class SpecialCommandTests(unittest.TestCase): diff --git a/Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst b/Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst new file mode 100644 index 0000000000000..67777817ed550 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst @@ -0,0 +1,3 @@ +Fetch across rollback no longer raises :exc:`~sqlite3.InterfaceError`. Instead +we leave it to the SQLite library to handle these cases. +Patch by Erlend E. Aasland. 
diff --git a/Modules/_sqlite/connection.c b/Modules/_sqlite/connection.c
index 4f0baa649e1d0..02f4ac46b7c35 100644
--- a/Modules/_sqlite/connection.c
+++ b/Modules/_sqlite/connection.c
@@ -274,28 +274,6 @@ pysqlite_connection_init_impl(pysqlite_Connection *self,
     return 0;
 }

-static void
-pysqlite_do_all_statements(pysqlite_Connection *self)
-{
-    // Reset all statements
-    sqlite3_stmt *stmt = NULL;
-    while ((stmt = sqlite3_next_stmt(self->db, stmt))) {
-        if (sqlite3_stmt_busy(stmt)) {
-            (void)sqlite3_reset(stmt);
-        }
-    }
-
-    // Reset all cursors
-    for (int i = 0; i < PyList_Size(self->cursors); i++) {
-        PyObject *weakref = PyList_GetItem(self->cursors, i);
-        PyObject *object = PyWeakref_GetObject(weakref);
-        if (object != Py_None) {
-            pysqlite_Cursor *cursor = (pysqlite_Cursor *)object;
-            cursor->reset = 1;
-        }
-    }
-}
-
 #define VISIT_CALLBACK_CONTEXT(ctx) \
     do { \
         if (ctx) { \
@@ -549,8 +527,6 @@ pysqlite_connection_rollback_impl(pysqlite_Connection *self)
     }

     if (!sqlite3_get_autocommit(self->db)) {
-        pysqlite_do_all_statements(self);
-
         int rc;

         Py_BEGIN_ALLOW_THREADS

From webhook-mailer at python.org  Mon Jan  3 14:54:16 2022
From: webhook-mailer at python.org (pablogsal)
Date: Mon, 03 Jan 2022 19:54:16 -0000
Subject: [Python-checkins] bpo-46110: Restore commit
 e9898bf153d26059261ffef11f7643ae991e2a4c
Message-ID:

https://github.com/python/cpython/commit/dd6c35761a4cd417e126a2d51dd0b89c8a30e5de

commit: dd6c35761a4cd417e126a2d51dd0b89c8a30e5de
branch: main
author: Pablo Galindo Salgado
committer: pablogsal
date: 2022-01-03T19:54:06Z
summary:

bpo-46110: Restore commit e9898bf153d26059261ffef11f7643ae991e2a4c

This restores commit e9898bf153d26059261ffef11f7643ae991e2a4c .

files:
A Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst
M Lib/test/test_syntax.py
M Parser/parser.c
M Parser/pegen.c
M Tools/peg_generator/pegen/c_generator.py

diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py
index 6286529d2734e..c95bc15e7273d 100644
--- a/Lib/test/test_syntax.py
+++ b/Lib/test/test_syntax.py
@@ -1729,6 +1729,14 @@ def test_syntax_error_on_deeply_nested_blocks(self):
         """
         self._check_error(source, "too many statically nested blocks")

+    @support.cpython_only
+    def test_error_on_parser_stack_overflow(self):
+        source = "-" * 100000 + "4"
+        for mode in ["exec", "eval", "single"]:
+            with self.subTest(mode=mode):
+                with self.assertRaises(MemoryError):
+                    compile(source, "", mode)
+
 def load_tests(loader, tests, pattern):
     tests.addTest(doctest.DocTestSuite())

diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst
new file mode 100644
index 0000000000000..593d2855972c4
--- /dev/null
+++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst
@@ -0,0 +1,2 @@
+Add a maximum recursion check to the PEG parser to avoid stack overflow.
+Patch by Pablo Galindo diff --git a/Parser/parser.c b/Parser/parser.c index 4d576aa781542..07a04c917430c 100644 --- a/Parser/parser.c +++ b/Parser/parser.c @@ -6,6 +6,8 @@ #else # define D(x) #endif + +# define MAXSTACK 6000 static const int n_keyword_lists = 9; static KeywordToken *reserved_keywords[] = { (KeywordToken[]) {{NULL, -1}}, @@ -968,16 +970,19 @@ static asdl_seq *_loop1_222_rule(Parser *p); static mod_ty file_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // statements? $ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> file[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statements? $")); @@ -993,7 +998,7 @@ file_rule(Parser *p) _res = _PyPegen_make_module ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1004,7 +1009,7 @@ file_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1012,16 +1017,19 @@ file_rule(Parser *p) static mod_ty interactive_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // statement_newline if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> interactive[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statement_newline")); @@ -1034,7 +1042,7 @@ interactive_rule(Parser *p) _res = _PyAST_Interactive ( a , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1045,7 +1053,7 @@ interactive_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1053,16 +1061,19 @@ interactive_rule(Parser *p) static mod_ty eval_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // expressions NEWLINE* $ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> eval[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions NEWLINE* $")); @@ -1081,7 +1092,7 @@ eval_rule(Parser *p) _res = _PyAST_Expression ( a , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1092,7 +1103,7 @@ eval_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1100,16 +1111,19 @@ eval_rule(Parser *p) static mod_ty func_type_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } mod_ty _res = NULL; int _mark = p->mark; { // '(' type_expressions? ')' '->' expression NEWLINE* $ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> func_type[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' type_expressions? 
')' '->' expression NEWLINE* $")); @@ -1140,7 +1154,7 @@ func_type_rule(Parser *p) _res = _PyAST_FunctionType ( a , b , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1151,7 +1165,7 @@ func_type_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1159,16 +1173,19 @@ func_type_rule(Parser *p) static expr_ty fstring_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> fstring[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -1187,7 +1204,7 @@ fstring_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1195,16 +1212,19 @@ fstring_rule(Parser *p) static asdl_stmt_seq* statements_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; { // statement+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> statements[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statement+")); @@ -1217,7 +1237,7 @@ statements_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_seq_flatten ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1228,7 +1248,7 @@ statements_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1236,16 +1256,19 @@ statements_rule(Parser *p) static asdl_stmt_seq* statement_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; { // compound_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> statement[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "compound_stmt")); @@ -1258,7 +1281,7 @@ statement_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1269,7 +1292,7 @@ statement_rule(Parser *p) } { // simple_stmts if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> statement[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmts")); @@ -1282,7 +1305,7 @@ statement_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1293,7 +1316,7 @@ statement_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1301,16 +1324,19 @@ statement_rule(Parser *p) static asdl_stmt_seq* statement_newline_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } 
int _start_lineno = p->tokens[_mark]->lineno; @@ -1319,7 +1345,7 @@ statement_newline_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // compound_stmt NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "compound_stmt NEWLINE")); @@ -1335,7 +1361,7 @@ statement_newline_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1346,7 +1372,7 @@ statement_newline_rule(Parser *p) } { // simple_stmts if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmts")); @@ -1365,7 +1391,7 @@ statement_newline_rule(Parser *p) } { // NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -1377,7 +1403,7 @@ statement_newline_rule(Parser *p) D(fprintf(stderr, "%*c+ statement_newline[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NEWLINE")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -1387,7 +1413,7 @@ statement_newline_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , CHECK ( stmt_ty , _PyAST_Pass ( EXTRA ) ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1398,7 +1424,7 @@ statement_newline_rule(Parser *p) } { // $ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> statement_newline[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "$")); @@ -1411,7 +1437,7 @@ statement_newline_rule(Parser *p) _res = _PyPegen_interactive_exit ( p ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1422,7 +1448,7 @@ statement_newline_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1430,16 +1456,19 @@ statement_newline_rule(Parser *p) static asdl_stmt_seq* simple_stmts_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; { // simple_stmt !';' NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmts[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmt !';' NEWLINE")); @@ -1457,7 +1486,7 @@ simple_stmts_rule(Parser *p) _res = ( asdl_stmt_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1468,7 +1497,7 @@ simple_stmts_rule(Parser *p) } { // ';'.simple_stmt+ ';'? NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmts[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "';'.simple_stmt+ ';'? 
NEWLINE")); @@ -1488,7 +1517,7 @@ simple_stmts_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1499,7 +1528,7 @@ simple_stmts_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -1520,20 +1549,23 @@ simple_stmts_rule(Parser *p) static stmt_ty simple_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; if (_PyPegen_is_memoized(p, simple_stmt_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -1542,7 +1574,7 @@ simple_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // assignment if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment")); @@ -1561,7 +1593,7 @@ simple_stmt_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -1573,7 +1605,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expressions")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -1583,7 +1615,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Expr ( e , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1594,7 +1626,7 @@ simple_stmt_rule(Parser *p) } { // &'return' return_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'return' return_stmt")); @@ -1615,7 +1647,7 @@ simple_stmt_rule(Parser *p) } { // &('import' | 'from') import_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('import' | 'from') import_stmt")); @@ -1636,7 +1668,7 @@ simple_stmt_rule(Parser *p) } { // &'raise' raise_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'raise' raise_stmt")); @@ -1657,7 +1689,7 @@ simple_stmt_rule(Parser *p) } { // 'pass' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'pass'")); @@ -1669,7 +1701,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'pass'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -1679,7 +1711,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Pass ( EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1690,7 +1722,7 @@ 
simple_stmt_rule(Parser *p) } { // &'del' del_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'del' del_stmt")); @@ -1711,7 +1743,7 @@ simple_stmt_rule(Parser *p) } { // &'yield' yield_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'yield' yield_stmt")); @@ -1732,7 +1764,7 @@ simple_stmt_rule(Parser *p) } { // &'assert' assert_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'assert' assert_stmt")); @@ -1753,7 +1785,7 @@ simple_stmt_rule(Parser *p) } { // 'break' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'break'")); @@ -1765,7 +1797,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'break'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -1775,7 +1807,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Break ( EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1786,7 +1818,7 @@ simple_stmt_rule(Parser *p) } { // 'continue' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'continue'")); @@ -1798,7 +1830,7 @@ simple_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ simple_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'continue'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -1808,7 +1840,7 @@ simple_stmt_rule(Parser *p) _res = _PyAST_Continue ( EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -1819,7 +1851,7 @@ simple_stmt_rule(Parser *p) } { // &'global' global_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'global' global_stmt")); @@ -1840,7 +1872,7 @@ simple_stmt_rule(Parser *p) } { // &'nonlocal' nonlocal_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> simple_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'nonlocal' nonlocal_stmt")); @@ -1862,7 +1894,7 @@ simple_stmt_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, simple_stmt_type, _res); - D(p->level--); + p->level--; return _res; } @@ -1878,16 +1910,19 @@ simple_stmt_rule(Parser *p) static stmt_ty compound_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // &('def' | '@' | ASYNC) function_def if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('def' | '@' | ASYNC) function_def")); @@ -1908,7 +1943,7 @@ 
compound_stmt_rule(Parser *p) } { // &'if' if_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'if' if_stmt")); @@ -1929,7 +1964,7 @@ compound_stmt_rule(Parser *p) } { // &('class' | '@') class_def if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('class' | '@') class_def")); @@ -1950,7 +1985,7 @@ compound_stmt_rule(Parser *p) } { // &('with' | ASYNC) with_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('with' | ASYNC) with_stmt")); @@ -1971,7 +2006,7 @@ compound_stmt_rule(Parser *p) } { // &('for' | ASYNC) for_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&('for' | ASYNC) for_stmt")); @@ -1992,7 +2027,7 @@ compound_stmt_rule(Parser *p) } { // &'try' try_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'try' try_stmt")); @@ -2013,7 +2048,7 @@ compound_stmt_rule(Parser *p) } { // &'while' while_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'while' while_stmt")); @@ -2034,7 +2069,7 @@ compound_stmt_rule(Parser *p) } { // match_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compound_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "match_stmt")); @@ -2053,7 +2088,7 @@ compound_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -2066,16 +2101,19 @@ compound_stmt_rule(Parser *p) static stmt_ty assignment_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2084,7 +2122,7 @@ assignment_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME ':' expression ['=' annotated_rhs] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME ':' expression ['=' annotated_rhs]")); @@ -2105,7 +2143,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME ':' expression ['=' annotated_rhs]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2115,7 +2153,7 @@ assignment_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 6 , "Variable annotation syntax is" , _PyAST_AnnAssign ( CHECK ( expr_ty , _PyPegen_set_expr_context ( p , a , Store ) ) , b , c , 1 , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2126,7 +2164,7 @@ assignment_rule(Parser *p) } { // ('(' single_target ')' | 
single_subscript_attribute_target) ':' expression ['=' annotated_rhs] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('(' single_target ')' | single_subscript_attribute_target) ':' expression ['=' annotated_rhs]")); @@ -2147,7 +2185,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "('(' single_target ')' | single_subscript_attribute_target) ':' expression ['=' annotated_rhs]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2157,7 +2195,7 @@ assignment_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 6 , "Variable annotations syntax is" , _PyAST_AnnAssign ( a , b , c , 0 , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2168,7 +2206,7 @@ assignment_rule(Parser *p) } { // ((star_targets '='))+ (yield_expr | star_expressions) !'=' TYPE_COMMENT? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((star_targets '='))+ (yield_expr | star_expressions) !'=' TYPE_COMMENT?")); @@ -2188,7 +2226,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "((star_targets '='))+ (yield_expr | star_expressions) !'=' TYPE_COMMENT?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2198,7 +2236,7 @@ assignment_rule(Parser *p) _res = _PyAST_Assign ( a , b , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2209,7 +2247,7 @@ assignment_rule(Parser *p) } { // single_target augassign ~ (yield_expr | star_expressions) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "single_target augassign ~ (yield_expr | star_expressions)")); @@ -2230,7 +2268,7 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "single_target augassign ~ (yield_expr | star_expressions)")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2240,7 +2278,7 @@ assignment_rule(Parser *p) _res = _PyAST_AugAssign ( a , b -> kind , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2249,13 +2287,13 @@ assignment_rule(Parser *p) D(fprintf(stderr, "%*c%s assignment[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "single_target augassign ~ (yield_expr | star_expressions)")); if (_cut_var) { - D(p->level--); + p->level--; return NULL; } } if (p->call_invalid_rules) { // invalid_assignment if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_assignment")); @@ -2274,7 +2312,7 @@ assignment_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -2282,16 +2320,19 @@ assignment_rule(Parser *p) static expr_ty annotated_rhs_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> annotated_rhs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -2310,7 +2351,7 @@ annotated_rhs_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> annotated_rhs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -2329,7 +2370,7 @@ annotated_rhs_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -2350,16 +2391,19 @@ annotated_rhs_rule(Parser *p) static AugOperator* augassign_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } AugOperator* _res = NULL; int _mark = p->mark; { // '+=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+='")); @@ -2372,7 +2416,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Add ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2383,7 +2427,7 @@ augassign_rule(Parser *p) } { // '-=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-='")); @@ -2396,7 +2440,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Sub ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2407,7 +2451,7 @@ augassign_rule(Parser *p) } { // '*=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*='")); @@ -2420,7 +2464,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Mult ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2431,7 +2475,7 @@ augassign_rule(Parser *p) } { // '@=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@='")); @@ -2444,7 +2488,7 @@ augassign_rule(Parser *p) _res = CHECK_VERSION ( AugOperator* , 5 , "The '@' operator is" , _PyPegen_augoperator ( p , MatMult ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2455,7 +2499,7 @@ augassign_rule(Parser *p) } { // '/=' if (p->error_indicator) { - D(p->level--); + p->level--; return 
NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'/='")); @@ -2468,7 +2512,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Div ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2479,7 +2523,7 @@ augassign_rule(Parser *p) } { // '%=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'%='")); @@ -2492,7 +2536,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Mod ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2503,7 +2547,7 @@ augassign_rule(Parser *p) } { // '&=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'&='")); @@ -2516,7 +2560,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , BitAnd ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2527,7 +2571,7 @@ augassign_rule(Parser *p) } { // '|=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'|='")); @@ -2540,7 +2584,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , BitOr ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2551,7 +2595,7 @@ augassign_rule(Parser *p) } { // '^=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'^='")); @@ -2564,7 +2608,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , BitXor ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2575,7 +2619,7 @@ augassign_rule(Parser *p) } { // '<<=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'<<='")); @@ -2588,7 +2632,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , LShift ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2599,7 +2643,7 @@ augassign_rule(Parser *p) } { // '>>=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'>>='")); @@ -2612,7 +2656,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , RShift ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2623,7 +2667,7 @@ augassign_rule(Parser *p) } { // '**=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**='")); @@ -2636,7 +2680,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , Pow ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2647,7 +2691,7 @@ augassign_rule(Parser *p) } { // '//=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> augassign[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "'//='")); @@ -2660,7 +2704,7 @@ augassign_rule(Parser *p) _res = _PyPegen_augoperator ( p , FloorDiv ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2671,7 +2715,7 @@ augassign_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -2679,16 +2723,19 @@ augassign_rule(Parser *p) static stmt_ty return_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2697,7 +2744,7 @@ return_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'return' star_expressions? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> return_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'return' star_expressions?")); @@ -2712,7 +2759,7 @@ return_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ return_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'return' star_expressions?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2722,7 +2769,7 @@ return_stmt_rule(Parser *p) _res = _PyAST_Return ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2733,7 +2780,7 @@ return_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -2741,16 +2788,19 @@ return_stmt_rule(Parser *p) static stmt_ty raise_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2759,7 +2809,7 @@ raise_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'raise' expression ['from' expression] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> raise_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'raise' expression ['from' expression]")); @@ -2777,7 +2827,7 @@ raise_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ raise_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'raise' expression ['from' expression]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2787,7 +2837,7 @@ raise_stmt_rule(Parser *p) _res = _PyAST_Raise ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2798,7 +2848,7 @@ raise_stmt_rule(Parser *p) } { // 'raise' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> raise_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'raise'")); @@ -2810,7 +2860,7 @@ raise_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ raise_stmt[%d-%d]: %s succeeded!\n", 
p->level, ' ', _mark, p->mark, "'raise'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2820,7 +2870,7 @@ raise_stmt_rule(Parser *p) _res = _PyAST_Raise ( NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2831,7 +2881,7 @@ raise_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -2839,16 +2889,19 @@ raise_stmt_rule(Parser *p) static stmt_ty global_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2857,7 +2910,7 @@ global_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'global' ','.NAME+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> global_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'global' ','.NAME+")); @@ -2872,7 +2925,7 @@ global_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ global_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'global' ','.NAME+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2882,7 +2935,7 @@ global_stmt_rule(Parser *p) _res = _PyAST_Global ( CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2893,7 +2946,7 @@ global_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -2901,16 +2954,19 @@ global_stmt_rule(Parser *p) static stmt_ty nonlocal_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2919,7 +2975,7 @@ nonlocal_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'nonlocal' ','.NAME+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> nonlocal_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'nonlocal' ','.NAME+")); @@ -2934,7 +2990,7 @@ nonlocal_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ nonlocal_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'nonlocal' ','.NAME+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -2944,7 +3000,7 @@ nonlocal_stmt_rule(Parser *p) _res = _PyAST_Nonlocal ( CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -2955,7 +3011,7 @@ nonlocal_stmt_rule(Parser *p) } _res = NULL; done: - 
D(p->level--); + p->level--; return _res; } @@ -2963,16 +3019,19 @@ nonlocal_stmt_rule(Parser *p) static stmt_ty del_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -2981,7 +3040,7 @@ del_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'del' del_targets &(';' | NEWLINE) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'del' del_targets &(';' | NEWLINE)")); @@ -2998,7 +3057,7 @@ del_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ del_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'del' del_targets &(';' | NEWLINE)")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3008,7 +3067,7 @@ del_stmt_rule(Parser *p) _res = _PyAST_Delete ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3019,7 +3078,7 @@ del_stmt_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_del_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_del_stmt")); @@ -3038,7 +3097,7 @@ del_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3046,16 +3105,19 @@ del_stmt_rule(Parser *p) static stmt_ty yield_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3064,7 +3126,7 @@ yield_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // yield_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> yield_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -3076,7 +3138,7 @@ yield_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ yield_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "yield_expr")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3086,7 +3148,7 @@ yield_stmt_rule(Parser *p) _res = _PyAST_Expr ( y , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3097,7 +3159,7 @@ yield_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3105,16 +3167,19 @@ yield_stmt_rule(Parser *p) static stmt_ty assert_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && 
_PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3123,7 +3188,7 @@ assert_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'assert' expression [',' expression] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> assert_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'assert' expression [',' expression]")); @@ -3141,7 +3206,7 @@ assert_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ assert_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'assert' expression [',' expression]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3151,7 +3216,7 @@ assert_stmt_rule(Parser *p) _res = _PyAST_Assert ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3162,7 +3227,7 @@ assert_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3170,16 +3235,19 @@ assert_stmt_rule(Parser *p) static stmt_ty import_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // import_name if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_name")); @@ -3198,7 +3266,7 @@ import_stmt_rule(Parser *p) } { // import_from if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from")); @@ -3217,7 +3285,7 @@ import_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3225,16 +3293,19 @@ import_stmt_rule(Parser *p) static stmt_ty import_name_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3243,7 +3314,7 @@ import_name_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'import' dotted_as_names if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'import' dotted_as_names")); @@ -3258,7 +3329,7 @@ import_name_rule(Parser *p) D(fprintf(stderr, "%*c+ import_name[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'import' dotted_as_names")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3268,7 +3339,7 @@ import_name_rule(Parser *p) _res = _PyAST_Import ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3279,7 +3350,7 @@ import_name_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3289,16 +3360,19 @@ import_name_rule(Parser *p) static stmt_ty 
import_from_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3307,7 +3381,7 @@ import_from_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'from' (('.' | '...'))* dotted_name 'import' import_from_targets if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))* dotted_name 'import' import_from_targets")); @@ -3331,7 +3405,7 @@ import_from_rule(Parser *p) D(fprintf(stderr, "%*c+ import_from[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))* dotted_name 'import' import_from_targets")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3341,7 +3415,7 @@ import_from_rule(Parser *p) _res = _PyAST_ImportFrom ( b -> v . Name . id , c , _PyPegen_seq_count_dots ( a ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3352,7 +3426,7 @@ import_from_rule(Parser *p) } { // 'from' (('.' | '...'))+ 'import' import_from_targets if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))+ 'import' import_from_targets")); @@ -3373,7 +3447,7 @@ import_from_rule(Parser *p) D(fprintf(stderr, "%*c+ import_from[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'from' (('.' | '...'))+ 'import' import_from_targets")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3383,7 +3457,7 @@ import_from_rule(Parser *p) _res = _PyAST_ImportFrom ( NULL , b , _PyPegen_seq_count_dots ( a ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3394,7 +3468,7 @@ import_from_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3406,16 +3480,19 @@ import_from_rule(Parser *p) static asdl_alias_seq* import_from_targets_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_alias_seq* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3424,7 +3501,7 @@ import_from_targets_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '(' import_from_as_names ','? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' import_from_as_names ','? 
')'")); @@ -3447,7 +3524,7 @@ import_from_targets_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3458,7 +3535,7 @@ import_from_targets_rule(Parser *p) } { // import_from_as_names !',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from_as_names !','")); @@ -3479,7 +3556,7 @@ import_from_targets_rule(Parser *p) } { // '*' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*'")); @@ -3491,7 +3568,7 @@ import_from_targets_rule(Parser *p) D(fprintf(stderr, "%*c+ import_from_targets[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3501,7 +3578,7 @@ import_from_targets_rule(Parser *p) _res = ( asdl_alias_seq* ) _PyPegen_singleton_seq ( p , CHECK ( alias_ty , _PyPegen_alias_for_star ( p , EXTRA ) ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3512,7 +3589,7 @@ import_from_targets_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_import_from_targets if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_import_from_targets")); @@ -3531,7 +3608,7 @@ import_from_targets_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3539,16 +3616,19 @@ import_from_targets_rule(Parser *p) static asdl_alias_seq* import_from_as_names_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_alias_seq* _res = NULL; int _mark = p->mark; { // ','.import_from_as_name+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from_as_names[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.import_from_as_name+")); @@ -3561,7 +3641,7 @@ import_from_as_names_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3572,7 +3652,7 @@ import_from_as_names_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3580,16 +3660,19 @@ import_from_as_names_rule(Parser *p) static alias_ty import_from_as_name_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } alias_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3598,7 +3681,7 @@ import_from_as_name_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME ['as' NAME] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> import_from_as_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME ['as' NAME]")); @@ -3613,7 +3696,7 @@ import_from_as_name_rule(Parser *p) 
D(fprintf(stderr, "%*c+ import_from_as_name[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME ['as' NAME]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3623,7 +3706,7 @@ import_from_as_name_rule(Parser *p) _res = _PyAST_alias ( a -> v . Name . id , ( b ) ? ( ( expr_ty ) b ) -> v . Name . id : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3634,7 +3717,7 @@ import_from_as_name_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3642,16 +3725,19 @@ import_from_as_name_rule(Parser *p) static asdl_alias_seq* dotted_as_names_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_alias_seq* _res = NULL; int _mark = p->mark; { // ','.dotted_as_name+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dotted_as_names[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.dotted_as_name+")); @@ -3664,7 +3750,7 @@ dotted_as_names_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3675,7 +3761,7 @@ dotted_as_names_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3683,16 +3769,19 @@ dotted_as_names_rule(Parser *p) static alias_ty dotted_as_name_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } alias_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -3701,7 +3790,7 @@ dotted_as_name_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // dotted_name ['as' NAME] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dotted_as_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dotted_name ['as' NAME]")); @@ -3716,7 +3805,7 @@ dotted_as_name_rule(Parser *p) D(fprintf(stderr, "%*c+ dotted_as_name[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "dotted_name ['as' NAME]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -3726,7 +3815,7 @@ dotted_as_name_rule(Parser *p) _res = _PyAST_alias ( a -> v . Name . id , ( b ) ? ( ( expr_ty ) b ) -> v . Name . 
id : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3737,7 +3826,7 @@ dotted_as_name_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3747,10 +3836,13 @@ static expr_ty dotted_name_raw(Parser *); static expr_ty dotted_name_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, dotted_name_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -3758,37 +3850,42 @@ dotted_name_rule(Parser *p) while (1) { int tmpvar_0 = _PyPegen_update_memo(p, _mark, dotted_name_type, _res); if (tmpvar_0) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = dotted_name_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty dotted_name_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // dotted_name '.' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dotted_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dotted_name '.' NAME")); @@ -3807,7 +3904,7 @@ dotted_name_raw(Parser *p) _res = _PyPegen_join_names_with_dot ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3818,7 +3915,7 @@ dotted_name_raw(Parser *p) } { // NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dotted_name[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -3837,7 +3934,7 @@ dotted_name_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3845,20 +3942,23 @@ dotted_name_raw(Parser *p) static asdl_stmt_seq* block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_stmt_seq* _res = NULL; if (_PyPegen_is_memoized(p, block_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; { // NEWLINE INDENT statements DEDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE INDENT statements DEDENT")); @@ -3880,7 +3980,7 @@ block_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3891,7 +3991,7 @@ block_rule(Parser *p) } { // simple_stmts if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmts")); @@ -3910,7 +4010,7 @@ block_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_block")); @@ -3930,7 +4030,7 @@ block_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, 
block_type, _res); - D(p->level--); + p->level--; return _res; } @@ -3938,16 +4038,19 @@ block_rule(Parser *p) static asdl_expr_seq* decorators_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // (('@' named_expression NEWLINE))+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> decorators[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(('@' named_expression NEWLINE))+")); @@ -3960,7 +4063,7 @@ decorators_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -3971,7 +4074,7 @@ decorators_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -3979,16 +4082,19 @@ decorators_rule(Parser *p) static stmt_ty class_def_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // decorators class_def_raw if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "decorators class_def_raw")); @@ -4004,7 +4110,7 @@ class_def_rule(Parser *p) _res = _PyPegen_class_def_decorators ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4015,7 +4121,7 @@ class_def_rule(Parser *p) } { // class_def_raw if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "class_def_raw")); @@ -4034,7 +4140,7 @@ class_def_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4042,16 +4148,19 @@ class_def_rule(Parser *p) static stmt_ty class_def_raw_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -4060,7 +4169,7 @@ class_def_raw_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_class_def_raw if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_class_def_raw")); @@ -4079,7 +4188,7 @@ class_def_raw_rule(Parser *p) } { // 'class' NAME ['(' arguments? ')'] &&':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] &&':' block")); @@ -4103,7 +4212,7 @@ class_def_raw_rule(Parser *p) D(fprintf(stderr, "%*c+ class_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? 
')'] &&':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -4113,7 +4222,7 @@ class_def_raw_rule(Parser *p) _res = _PyAST_ClassDef ( a -> v . Name . id , ( b ) ? ( ( expr_ty ) b ) -> v . Call . args : NULL , ( b ) ? ( ( expr_ty ) b ) -> v . Call . keywords : NULL , c , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4124,7 +4233,7 @@ class_def_raw_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4132,16 +4241,19 @@ class_def_raw_rule(Parser *p) static stmt_ty function_def_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; { // decorators function_def_raw if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> function_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "decorators function_def_raw")); @@ -4157,7 +4269,7 @@ function_def_rule(Parser *p) _res = _PyPegen_function_def_decorators ( p , d , f ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4168,7 +4280,7 @@ function_def_rule(Parser *p) } { // function_def_raw if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> function_def[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "function_def_raw")); @@ -4187,7 +4299,7 @@ function_def_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4198,16 +4310,19 @@ function_def_rule(Parser *p) static stmt_ty function_def_raw_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -4216,7 +4331,7 @@ function_def_raw_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_def_raw if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> function_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_def_raw")); @@ -4235,7 +4350,7 @@ function_def_raw_rule(Parser *p) } { // 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> function_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); @@ -4271,7 +4386,7 @@ function_def_raw_rule(Parser *p) D(fprintf(stderr, "%*c+ function_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -4281,7 +4396,7 @@ function_def_raw_rule(Parser *p) _res = _PyAST_FunctionDef ( n -> v . Name . id , ( params ) ? 
params : CHECK ( arguments_ty , _PyPegen_empty_arguments ( p ) ) , b , NULL , a , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4292,7 +4407,7 @@ function_def_raw_rule(Parser *p) } { // ASYNC 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> function_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); @@ -4331,7 +4446,7 @@ function_def_raw_rule(Parser *p) D(fprintf(stderr, "%*c+ function_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'def' NAME &&'(' params? ')' ['->' expression] &&':' func_type_comment? block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -4341,7 +4456,7 @@ function_def_raw_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async functions are" , _PyAST_AsyncFunctionDef ( n -> v . Name . id , ( params ) ? params : CHECK ( arguments_ty , _PyPegen_empty_arguments ( p ) ) , b , NULL , a , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4352,7 +4467,7 @@ function_def_raw_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4360,16 +4475,19 @@ function_def_raw_rule(Parser *p) static arguments_ty params_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arguments_ty _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_parameters if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_parameters")); @@ -4388,7 +4506,7 @@ params_rule(Parser *p) } { // parameters if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "parameters")); @@ -4407,7 +4525,7 @@ params_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4420,16 +4538,19 @@ params_rule(Parser *p) static arguments_ty parameters_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arguments_ty _res = NULL; int _mark = p->mark; { // slash_no_default param_no_default* param_with_default* star_etc? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_no_default param_no_default* param_with_default* star_etc?")); @@ -4451,7 +4572,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , a , NULL , b , c , d ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4462,7 +4583,7 @@ parameters_rule(Parser *p) } { // slash_with_default param_with_default* star_etc? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default param_with_default* star_etc?")); @@ -4481,7 +4602,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , a , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4492,7 +4613,7 @@ parameters_rule(Parser *p) } { // param_no_default+ param_with_default* star_etc? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default+ param_with_default* star_etc?")); @@ -4511,7 +4632,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4522,7 +4643,7 @@ parameters_rule(Parser *p) } { // param_with_default+ star_etc? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default+ star_etc?")); @@ -4538,7 +4659,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4549,7 +4670,7 @@ parameters_rule(Parser *p) } { // star_etc if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_etc")); @@ -4562,7 +4683,7 @@ parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4573,7 +4694,7 @@ parameters_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4581,16 +4702,19 @@ parameters_rule(Parser *p) static asdl_arg_seq* slash_no_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_arg_seq* _res = NULL; int _mark = p->mark; { // param_no_default+ '/' ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default+ '/' ','")); @@ -4609,7 +4733,7 @@ slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4620,7 +4744,7 @@ slash_no_default_rule(Parser *p) } { // param_no_default+ '/' &')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default+ '/' &')'")); @@ -4638,7 +4762,7 @@ slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4649,7 +4773,7 @@ slash_no_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4659,16 +4783,19 @@ slash_no_default_rule(Parser *p) static SlashWithDefault* slash_with_default_rule(Parser *p) { - D(p->level++); + if 
(p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } SlashWithDefault* _res = NULL; int _mark = p->mark; { // param_no_default* param_with_default+ '/' ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* param_with_default+ '/' ','")); @@ -4690,7 +4817,7 @@ slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4701,7 +4828,7 @@ slash_with_default_rule(Parser *p) } { // param_no_default* param_with_default+ '/' &')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* param_with_default+ '/' &')'")); @@ -4722,7 +4849,7 @@ slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4733,7 +4860,7 @@ slash_with_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4745,16 +4872,19 @@ slash_with_default_rule(Parser *p) static StarEtc* star_etc_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } StarEtc* _res = NULL; int _mark = p->mark; { // '*' param_no_default param_maybe_default* kwds? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' param_no_default param_maybe_default* kwds?")); @@ -4776,7 +4906,7 @@ star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4787,7 +4917,7 @@ star_etc_rule(Parser *p) } { // '*' ',' param_maybe_default+ kwds? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' ',' param_maybe_default+ kwds?")); @@ -4809,7 +4939,7 @@ star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4820,7 +4950,7 @@ star_etc_rule(Parser *p) } { // kwds if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwds")); @@ -4833,7 +4963,7 @@ star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4844,7 +4974,7 @@ star_etc_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_star_etc if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_star_etc")); @@ -4863,7 +4993,7 @@ star_etc_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4871,16 +5001,19 @@ star_etc_rule(Parser *p) static arg_ty kwds_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // '**' param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwds[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' param_no_default")); @@ -4896,7 +5029,7 @@ kwds_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4907,7 +5040,7 @@ kwds_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4915,16 +5048,19 @@ kwds_rule(Parser *p) static arg_ty param_no_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // param ',' TYPE_COMMENT? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param ',' TYPE_COMMENT?")); @@ -4943,7 +5079,7 @@ param_no_default_rule(Parser *p) _res = _PyPegen_add_type_comment_to_arg ( p , a , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4954,7 +5090,7 @@ param_no_default_rule(Parser *p) } { // param TYPE_COMMENT? &')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param TYPE_COMMENT? 
&')'")); @@ -4972,7 +5108,7 @@ param_no_default_rule(Parser *p) _res = _PyPegen_add_type_comment_to_arg ( p , a , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -4983,7 +5119,7 @@ param_no_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -4991,16 +5127,19 @@ param_no_default_rule(Parser *p) static NameDefaultPair* param_with_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // param default ',' TYPE_COMMENT? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default ',' TYPE_COMMENT?")); @@ -5022,7 +5161,7 @@ param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5033,7 +5172,7 @@ param_with_default_rule(Parser *p) } { // param default TYPE_COMMENT? &')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default TYPE_COMMENT? &')'")); @@ -5054,7 +5193,7 @@ param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5065,7 +5204,7 @@ param_with_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5075,16 +5214,19 @@ param_with_default_rule(Parser *p) static NameDefaultPair* param_maybe_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // param default? ',' TYPE_COMMENT? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default? ',' TYPE_COMMENT?")); @@ -5106,7 +5248,7 @@ param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5117,7 +5259,7 @@ param_maybe_default_rule(Parser *p) } { // param default? TYPE_COMMENT? &')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param default? TYPE_COMMENT? 
&')'")); @@ -5138,7 +5280,7 @@ param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , tc ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5149,7 +5291,7 @@ param_maybe_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5157,16 +5299,19 @@ param_maybe_default_rule(Parser *p) static arg_ty param_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arg_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5175,7 +5320,7 @@ param_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME annotation? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> param[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME annotation?")); @@ -5190,7 +5335,7 @@ param_rule(Parser *p) D(fprintf(stderr, "%*c+ param[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME annotation?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5200,7 +5345,7 @@ param_rule(Parser *p) _res = _PyAST_arg ( a -> v . Name . id , b , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5211,7 +5356,7 @@ param_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5219,16 +5364,19 @@ param_rule(Parser *p) static expr_ty annotation_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // ':' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> annotation[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':' expression")); @@ -5244,7 +5392,7 @@ annotation_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5255,7 +5403,7 @@ annotation_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5263,16 +5411,19 @@ annotation_rule(Parser *p) static expr_ty default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // '=' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'=' expression")); @@ -5288,7 +5439,7 @@ default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5299,7 +5450,7 @@ default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5310,16 +5461,19 @@ default_rule(Parser *p) static stmt_ty if_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + 
PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5328,7 +5482,7 @@ if_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_if_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_if_stmt")); @@ -5347,7 +5501,7 @@ if_stmt_rule(Parser *p) } { // 'if' named_expression ':' block elif_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block elif_stmt")); @@ -5371,7 +5525,7 @@ if_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ if_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block elif_stmt")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5381,7 +5535,7 @@ if_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , CHECK ( asdl_stmt_seq* , _PyPegen_singleton_seq ( p , c ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5392,7 +5546,7 @@ if_stmt_rule(Parser *p) } { // 'if' named_expression ':' block else_block? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block else_block?")); @@ -5416,7 +5570,7 @@ if_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ if_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5426,7 +5580,7 @@ if_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5437,7 +5591,7 @@ if_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5448,16 +5602,19 @@ if_stmt_rule(Parser *p) static stmt_ty elif_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5466,7 +5623,7 @@ elif_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_elif_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_elif_stmt")); @@ -5485,7 +5642,7 @@ elif_stmt_rule(Parser *p) } { // 'elif' named_expression ':' block elif_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, 
"'elif' named_expression ':' block elif_stmt")); @@ -5509,7 +5666,7 @@ elif_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ elif_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' block elif_stmt")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5519,7 +5676,7 @@ elif_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , CHECK ( asdl_stmt_seq* , _PyPegen_singleton_seq ( p , c ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5530,7 +5687,7 @@ elif_stmt_rule(Parser *p) } { // 'elif' named_expression ':' block else_block? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' block else_block?")); @@ -5554,7 +5711,7 @@ elif_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ elif_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5564,7 +5721,7 @@ elif_stmt_rule(Parser *p) _res = _PyAST_If ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5575,7 +5732,7 @@ elif_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5583,16 +5740,19 @@ elif_stmt_rule(Parser *p) static asdl_stmt_seq* else_block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_else_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> else_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_else_stmt")); @@ -5611,7 +5771,7 @@ else_block_rule(Parser *p) } { // 'else' &&':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> else_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else' &&':' block")); @@ -5630,7 +5790,7 @@ else_block_rule(Parser *p) _res = b; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5641,7 +5801,7 @@ else_block_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5649,16 +5809,19 @@ else_block_rule(Parser *p) static stmt_ty while_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5667,7 +5830,7 @@ while_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_while_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, 
"invalid_while_stmt")); @@ -5686,7 +5849,7 @@ while_stmt_rule(Parser *p) } { // 'while' named_expression ':' block else_block? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'while' named_expression ':' block else_block?")); @@ -5710,7 +5873,7 @@ while_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ while_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'while' named_expression ':' block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5720,7 +5883,7 @@ while_stmt_rule(Parser *p) _res = _PyAST_While ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5731,7 +5894,7 @@ while_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5743,16 +5906,19 @@ while_stmt_rule(Parser *p) static stmt_ty for_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5761,7 +5927,7 @@ for_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_for_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_for_stmt")); @@ -5780,7 +5946,7 @@ for_stmt_rule(Parser *p) } { // 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); @@ -5816,7 +5982,7 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5826,7 +5992,7 @@ for_stmt_rule(Parser *p) _res = _PyAST_For ( t , ex , b , el , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5835,13 +6001,13 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c%s for_stmt[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); if (_cut_var) { - D(p->level--); + p->level--; return NULL; } } { // ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? 
block else_block?")); @@ -5880,7 +6046,7 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -5890,7 +6056,7 @@ for_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async for loops are" , _PyAST_AsyncFor ( t , ex , b , el , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -5899,13 +6065,13 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c%s for_stmt[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); if (_cut_var) { - D(p->level--); + p->level--; return NULL; } } if (p->call_invalid_rules) { // invalid_for_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_for_target")); @@ -5924,7 +6090,7 @@ for_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -5938,16 +6104,19 @@ for_stmt_rule(Parser *p) static stmt_ty with_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -5956,7 +6125,7 @@ with_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_with_stmt_indent if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_with_stmt_indent")); @@ -5975,7 +6144,7 @@ with_stmt_rule(Parser *p) } { // 'with' '(' ','.with_item+ ','? ')' ':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'with' '(' ','.with_item+ ','? ')' ':' block")); @@ -6006,7 +6175,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'with' '(' ','.with_item+ ','? ')' ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6016,7 +6185,7 @@ with_stmt_rule(Parser *p) _res = _PyAST_With ( a , b , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6027,7 +6196,7 @@ with_stmt_rule(Parser *p) } { // 'with' ','.with_item+ ':' TYPE_COMMENT? block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'with' ','.with_item+ ':' TYPE_COMMENT? 
block")); @@ -6051,7 +6220,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'with' ','.with_item+ ':' TYPE_COMMENT? block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6061,7 +6230,7 @@ with_stmt_rule(Parser *p) _res = _PyAST_With ( a , b , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6072,7 +6241,7 @@ with_stmt_rule(Parser *p) } { // ASYNC 'with' '(' ','.with_item+ ','? ')' ':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' '(' ','.with_item+ ','? ')' ':' block")); @@ -6106,7 +6275,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' '(' ','.with_item+ ','? ')' ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6116,7 +6285,7 @@ with_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async with statements are" , _PyAST_AsyncWith ( a , b , NULL , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6127,7 +6296,7 @@ with_stmt_rule(Parser *p) } { // ASYNC 'with' ','.with_item+ ':' TYPE_COMMENT? block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' ','.with_item+ ':' TYPE_COMMENT? block")); @@ -6154,7 +6323,7 @@ with_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'with' ','.with_item+ ':' TYPE_COMMENT? 
block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6164,7 +6333,7 @@ with_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 5 , "Async with statements are" , _PyAST_AsyncWith ( a , b , NEW_TYPE_COMMENT ( p , tc ) , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6175,7 +6344,7 @@ with_stmt_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_with_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_with_stmt")); @@ -6194,7 +6363,7 @@ with_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6205,16 +6374,19 @@ with_stmt_rule(Parser *p) static withitem_ty with_item_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } withitem_ty _res = NULL; int _mark = p->mark; { // expression 'as' star_target &(',' | ')' | ':') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression 'as' star_target &(',' | ')' | ':')")); @@ -6235,7 +6407,7 @@ with_item_rule(Parser *p) _res = _PyAST_withitem ( e , t , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6246,7 +6418,7 @@ with_item_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_with_item if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_with_item")); @@ -6265,7 +6437,7 @@ with_item_rule(Parser *p) } { // expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression")); @@ -6278,7 +6450,7 @@ with_item_rule(Parser *p) _res = _PyAST_withitem ( e , NULL , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6289,7 +6461,7 @@ with_item_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6301,16 +6473,19 @@ with_item_rule(Parser *p) static stmt_ty try_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6319,7 +6494,7 @@ try_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_try_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_try_stmt")); @@ -6338,7 +6513,7 @@ try_stmt_rule(Parser *p) } { // 'try' &&':' block finally_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' &&':' block 
finally_block")); @@ -6359,7 +6534,7 @@ try_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ try_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'try' &&':' block finally_block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6369,7 +6544,7 @@ try_stmt_rule(Parser *p) _res = _PyAST_Try ( b , NULL , NULL , f , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6380,7 +6555,7 @@ try_stmt_rule(Parser *p) } { // 'try' &&':' block except_block+ else_block? finally_block? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_block+ else_block? finally_block?")); @@ -6407,7 +6582,7 @@ try_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ try_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_block+ else_block? finally_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6417,7 +6592,7 @@ try_stmt_rule(Parser *p) _res = _PyAST_Try ( b , ex , el , f , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6428,7 +6603,7 @@ try_stmt_rule(Parser *p) } { // 'try' &&':' block except_star_block+ else_block? finally_block? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_star_block+ else_block? finally_block?")); @@ -6455,7 +6630,7 @@ try_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ try_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'try' &&':' block except_star_block+ else_block? 
finally_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6465,7 +6640,7 @@ try_stmt_rule(Parser *p) _res = _PyAST_TryStar ( b , ex , el , f , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6476,7 +6651,7 @@ try_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6488,16 +6663,19 @@ try_stmt_rule(Parser *p) static excepthandler_ty except_block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } excepthandler_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6506,7 +6684,7 @@ except_block_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_except_stmt_indent if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_stmt_indent")); @@ -6525,7 +6703,7 @@ except_block_rule(Parser *p) } { // 'except' expression ['as' NAME] ':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' expression ['as' NAME] ':' block")); @@ -6549,7 +6727,7 @@ except_block_rule(Parser *p) D(fprintf(stderr, "%*c+ except_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except' expression ['as' NAME] ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6559,7 +6737,7 @@ except_block_rule(Parser *p) _res = _PyAST_ExceptHandler ( e , ( t ) ? ( ( expr_ty ) t ) -> v . Name . 
id : NULL , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6570,7 +6748,7 @@ except_block_rule(Parser *p) } { // 'except' ':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' ':' block")); @@ -6588,7 +6766,7 @@ except_block_rule(Parser *p) D(fprintf(stderr, "%*c+ except_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except' ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6598,7 +6776,7 @@ except_block_rule(Parser *p) _res = _PyAST_ExceptHandler ( NULL , NULL , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6609,7 +6787,7 @@ except_block_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_except_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> except_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_stmt")); @@ -6628,7 +6806,7 @@ except_block_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6639,16 +6817,19 @@ except_block_rule(Parser *p) static excepthandler_ty except_star_block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } excepthandler_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6657,7 +6838,7 @@ except_star_block_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_except_star_stmt_indent if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> except_star_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_star_stmt_indent")); @@ -6676,7 +6857,7 @@ except_star_block_rule(Parser *p) } { // 'except' '*' expression ['as' NAME] ':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> except_star_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*' expression ['as' NAME] ':' block")); @@ -6703,7 +6884,7 @@ except_star_block_rule(Parser *p) D(fprintf(stderr, "%*c+ except_star_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except' '*' expression ['as' NAME] ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6713,7 +6894,7 @@ except_star_block_rule(Parser *p) _res = _PyAST_ExceptHandler ( e , ( t ) ? ( ( expr_ty ) t ) -> v . Name . 
id : NULL , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6724,7 +6905,7 @@ except_star_block_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_except_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> except_star_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_except_stmt")); @@ -6743,7 +6924,7 @@ except_star_block_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6751,16 +6932,19 @@ except_star_block_rule(Parser *p) static asdl_stmt_seq* finally_block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_stmt_seq* _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_finally_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> finally_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_finally_stmt")); @@ -6779,7 +6963,7 @@ finally_block_rule(Parser *p) } { // 'finally' &&':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> finally_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally' &&':' block")); @@ -6798,7 +6982,7 @@ finally_block_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6809,7 +6993,7 @@ finally_block_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6819,16 +7003,19 @@ finally_block_rule(Parser *p) static stmt_ty match_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } stmt_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6837,7 +7024,7 @@ match_stmt_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // "match" subject_expr ':' NEWLINE INDENT case_block+ DEDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr ':' NEWLINE INDENT case_block+ DEDENT")); @@ -6867,7 +7054,7 @@ match_stmt_rule(Parser *p) D(fprintf(stderr, "%*c+ match_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr ':' NEWLINE INDENT case_block+ DEDENT")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6877,7 +7064,7 @@ match_stmt_rule(Parser *p) _res = CHECK_VERSION ( stmt_ty , 10 , "Pattern matching is" , _PyAST_Match ( subject , cases , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6888,7 +7075,7 @@ match_stmt_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_match_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_match_stmt")); @@ -6907,7 +7094,7 @@ match_stmt_rule(Parser *p) } _res = 
NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6915,16 +7102,19 @@ match_stmt_rule(Parser *p) static expr_ty subject_expr_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -6933,7 +7123,7 @@ subject_expr_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // star_named_expression ',' star_named_expressions? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> subject_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions?")); @@ -6951,7 +7141,7 @@ subject_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ subject_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -6961,7 +7151,7 @@ subject_expr_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , value , values ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -6972,7 +7162,7 @@ subject_expr_rule(Parser *p) } { // named_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> subject_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -6991,7 +7181,7 @@ subject_expr_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -6999,16 +7189,19 @@ subject_expr_rule(Parser *p) static match_case_ty case_block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } match_case_ty _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_case_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_case_block")); @@ -7027,7 +7220,7 @@ case_block_rule(Parser *p) } { // "case" patterns guard? ':' block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? 
':' block")); @@ -7052,7 +7245,7 @@ case_block_rule(Parser *p) _res = _PyAST_match_case ( pattern , guard , body , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7063,7 +7256,7 @@ case_block_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7071,16 +7264,19 @@ case_block_rule(Parser *p) static expr_ty guard_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // 'if' named_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> guard[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression")); @@ -7096,7 +7292,7 @@ guard_rule(Parser *p) _res = guard; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7107,7 +7303,7 @@ guard_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7115,16 +7311,19 @@ guard_rule(Parser *p) static pattern_ty patterns_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7133,7 +7332,7 @@ patterns_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // open_sequence_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "open_sequence_pattern")); @@ -7145,7 +7344,7 @@ patterns_rule(Parser *p) D(fprintf(stderr, "%*c+ patterns[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "open_sequence_pattern")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7155,7 +7354,7 @@ patterns_rule(Parser *p) _res = _PyAST_MatchSequence ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7166,7 +7365,7 @@ patterns_rule(Parser *p) } { // pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern")); @@ -7185,7 +7384,7 @@ patterns_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7193,16 +7392,19 @@ patterns_rule(Parser *p) static pattern_ty pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // as_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "as_pattern")); @@ -7221,7 +7423,7 @@ pattern_rule(Parser *p) } { // or_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern")); 
@@ -7240,7 +7442,7 @@ pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7248,16 +7450,19 @@ pattern_rule(Parser *p) static pattern_ty as_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7266,7 +7471,7 @@ as_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // or_pattern 'as' pattern_capture_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' pattern_capture_target")); @@ -7284,7 +7489,7 @@ as_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ as_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' pattern_capture_target")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7294,7 +7499,7 @@ as_pattern_rule(Parser *p) _res = _PyAST_MatchAs ( pattern , target -> v . Name . id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7305,7 +7510,7 @@ as_pattern_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_as_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_as_pattern")); @@ -7324,7 +7529,7 @@ as_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7332,16 +7537,19 @@ as_pattern_rule(Parser *p) static pattern_ty or_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7350,7 +7558,7 @@ or_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '|'.closed_pattern+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> or_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'|'.closed_pattern+")); @@ -7362,7 +7570,7 @@ or_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ or_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'|'.closed_pattern+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7372,7 +7580,7 @@ or_pattern_rule(Parser *p) _res = asdl_seq_LEN ( patterns ) == 1 ? 
asdl_seq_GET ( patterns , 0 ) : _PyAST_MatchOr ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7383,7 +7591,7 @@ or_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7399,16 +7607,19 @@ or_pattern_rule(Parser *p) static pattern_ty closed_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // literal_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "literal_pattern")); @@ -7427,7 +7638,7 @@ closed_pattern_rule(Parser *p) } { // capture_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "capture_pattern")); @@ -7446,7 +7657,7 @@ closed_pattern_rule(Parser *p) } { // wildcard_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "wildcard_pattern")); @@ -7465,7 +7676,7 @@ closed_pattern_rule(Parser *p) } { // value_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "value_pattern")); @@ -7484,7 +7695,7 @@ closed_pattern_rule(Parser *p) } { // group_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "group_pattern")); @@ -7503,7 +7714,7 @@ closed_pattern_rule(Parser *p) } { // sequence_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sequence_pattern")); @@ -7522,7 +7733,7 @@ closed_pattern_rule(Parser *p) } { // mapping_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "mapping_pattern")); @@ -7541,7 +7752,7 @@ closed_pattern_rule(Parser *p) } { // class_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> closed_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "class_pattern")); @@ -7560,7 +7771,7 @@ closed_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7574,16 +7785,19 @@ closed_pattern_rule(Parser *p) static pattern_ty literal_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7592,7 +7806,7 @@ literal_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // signed_number !('+' | '-') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_number !('+' | '-')")); @@ -7606,7 
+7820,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "signed_number !('+' | '-')")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7616,7 +7830,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( value , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7627,7 +7841,7 @@ literal_pattern_rule(Parser *p) } { // complex_number if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "complex_number")); @@ -7639,7 +7853,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "complex_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7649,7 +7863,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( value , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7660,7 +7874,7 @@ literal_pattern_rule(Parser *p) } { // strings if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "strings")); @@ -7672,7 +7886,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "strings")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7682,7 +7896,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( value , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7693,7 +7907,7 @@ literal_pattern_rule(Parser *p) } { // 'None' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -7705,7 +7919,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7715,7 +7929,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchSingleton ( Py_None , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7726,7 +7940,7 @@ literal_pattern_rule(Parser *p) } { // 'True' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -7738,7 +7952,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7748,7 +7962,7 @@ 
literal_pattern_rule(Parser *p) _res = _PyAST_MatchSingleton ( Py_True , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7759,7 +7973,7 @@ literal_pattern_rule(Parser *p) } { // 'False' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -7771,7 +7985,7 @@ literal_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7781,7 +7995,7 @@ literal_pattern_rule(Parser *p) _res = _PyAST_MatchSingleton ( Py_False , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7792,7 +8006,7 @@ literal_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7806,16 +8020,19 @@ literal_pattern_rule(Parser *p) static expr_ty literal_expr_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -7824,7 +8041,7 @@ literal_expr_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // signed_number !('+' | '-') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_number !('+' | '-')")); @@ -7845,7 +8062,7 @@ literal_expr_rule(Parser *p) } { // complex_number if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "complex_number")); @@ -7864,7 +8081,7 @@ literal_expr_rule(Parser *p) } { // strings if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "strings")); @@ -7883,7 +8100,7 @@ literal_expr_rule(Parser *p) } { // 'None' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -7895,7 +8112,7 @@ literal_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7905,7 +8122,7 @@ literal_expr_rule(Parser *p) _res = _PyAST_Constant ( Py_None , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7916,7 +8133,7 @@ literal_expr_rule(Parser *p) } { // 'True' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -7928,7 +8145,7 @@ literal_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_expr[%d-%d]: %s succeeded!\n", p->level, ' ', 
_mark, p->mark, "'True'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7938,7 +8155,7 @@ literal_expr_rule(Parser *p) _res = _PyAST_Constant ( Py_True , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7949,7 +8166,7 @@ literal_expr_rule(Parser *p) } { // 'False' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> literal_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -7961,7 +8178,7 @@ literal_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ literal_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -7971,7 +8188,7 @@ literal_expr_rule(Parser *p) _res = _PyAST_Constant ( Py_False , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -7982,7 +8199,7 @@ literal_expr_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -7992,16 +8209,19 @@ literal_expr_rule(Parser *p) static expr_ty complex_number_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8010,7 +8230,7 @@ complex_number_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // signed_real_number '+' imaginary_number if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> complex_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_real_number '+' imaginary_number")); @@ -8028,7 +8248,7 @@ complex_number_rule(Parser *p) D(fprintf(stderr, "%*c+ complex_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "signed_real_number '+' imaginary_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8038,7 +8258,7 @@ complex_number_rule(Parser *p) _res = _PyAST_BinOp ( real , Add , imag , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8049,7 +8269,7 @@ complex_number_rule(Parser *p) } { // signed_real_number '-' imaginary_number if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> complex_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "signed_real_number '-' imaginary_number")); @@ -8067,7 +8287,7 @@ complex_number_rule(Parser *p) D(fprintf(stderr, "%*c+ complex_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "signed_real_number '-' imaginary_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8077,7 +8297,7 @@ complex_number_rule(Parser *p) _res = _PyAST_BinOp ( real , Sub , imag , EXTRA ); if (_res == NULL && PyErr_Occurred()) { 
p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8088,7 +8308,7 @@ complex_number_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8096,16 +8316,19 @@ complex_number_rule(Parser *p) static expr_ty signed_number_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8114,7 +8337,7 @@ signed_number_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NUMBER if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> signed_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -8133,7 +8356,7 @@ signed_number_rule(Parser *p) } { // '-' NUMBER if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> signed_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-' NUMBER")); @@ -8148,7 +8371,7 @@ signed_number_rule(Parser *p) D(fprintf(stderr, "%*c+ signed_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'-' NUMBER")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8158,7 +8381,7 @@ signed_number_rule(Parser *p) _res = _PyAST_UnaryOp ( USub , number , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8169,7 +8392,7 @@ signed_number_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8177,16 +8400,19 @@ signed_number_rule(Parser *p) static expr_ty signed_real_number_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8195,7 +8421,7 @@ signed_real_number_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // real_number if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> signed_real_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "real_number")); @@ -8214,7 +8440,7 @@ signed_real_number_rule(Parser *p) } { // '-' real_number if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> signed_real_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-' real_number")); @@ -8229,7 +8455,7 @@ signed_real_number_rule(Parser *p) D(fprintf(stderr, "%*c+ signed_real_number[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'-' real_number")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8239,7 +8465,7 @@ signed_real_number_rule(Parser *p) _res = _PyAST_UnaryOp ( USub , real , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8250,7 +8476,7 @@ 
signed_real_number_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8258,16 +8484,19 @@ signed_real_number_rule(Parser *p) static expr_ty real_number_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // NUMBER if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> real_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -8280,7 +8509,7 @@ real_number_rule(Parser *p) _res = _PyPegen_ensure_real ( p , real ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8291,7 +8520,7 @@ real_number_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8299,16 +8528,19 @@ real_number_rule(Parser *p) static expr_ty imaginary_number_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // NUMBER if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> imaginary_number[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -8321,7 +8553,7 @@ imaginary_number_rule(Parser *p) _res = _PyPegen_ensure_imaginary ( p , imag ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8332,7 +8564,7 @@ imaginary_number_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8340,16 +8572,19 @@ imaginary_number_rule(Parser *p) static pattern_ty capture_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8358,7 +8593,7 @@ capture_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // pattern_capture_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> capture_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern_capture_target")); @@ -8370,7 +8605,7 @@ capture_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ capture_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "pattern_capture_target")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8380,7 +8615,7 @@ capture_pattern_rule(Parser *p) _res = _PyAST_MatchAs ( NULL , target -> v . Name . 
id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8391,7 +8626,7 @@ capture_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8399,16 +8634,19 @@ capture_pattern_rule(Parser *p) static expr_ty pattern_capture_target_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // !"_" NAME !('.' | '(' | '=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> pattern_capture_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!\"_\" NAME !('.' | '(' | '=')")); @@ -8425,7 +8663,7 @@ pattern_capture_target_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , name , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8436,7 +8674,7 @@ pattern_capture_target_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8444,16 +8682,19 @@ pattern_capture_target_rule(Parser *p) static pattern_ty wildcard_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8462,7 +8703,7 @@ wildcard_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // "_" if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> wildcard_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"_\"")); @@ -8474,7 +8715,7 @@ wildcard_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ wildcard_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"_\"")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8484,7 +8725,7 @@ wildcard_pattern_rule(Parser *p) _res = _PyAST_MatchAs ( NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8495,7 +8736,7 @@ wildcard_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8503,16 +8744,19 @@ wildcard_pattern_rule(Parser *p) static pattern_ty value_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8521,7 +8765,7 @@ value_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // attr !('.' | '(' | '=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> value_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "attr !('.' 
| '(' | '=')")); @@ -8535,7 +8779,7 @@ value_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ value_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "attr !('.' | '(' | '=')")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8545,7 +8789,7 @@ value_pattern_rule(Parser *p) _res = _PyAST_MatchValue ( attr , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8556,7 +8800,7 @@ value_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8566,10 +8810,13 @@ static expr_ty attr_raw(Parser *); static expr_ty attr_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, attr_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -8577,37 +8824,42 @@ attr_rule(Parser *p) while (1) { int tmpvar_1 = _PyPegen_update_memo(p, _mark, attr_type, _res); if (tmpvar_1) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = attr_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty attr_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8616,7 +8868,7 @@ attr_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // name_or_attr '.' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> attr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '.' NAME")); @@ -8634,7 +8886,7 @@ attr_raw(Parser *p) D(fprintf(stderr, "%*c+ attr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '.' NAME")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8644,7 +8896,7 @@ attr_raw(Parser *p) _res = _PyAST_Attribute ( value , attr -> v . Name . 
id , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8655,7 +8907,7 @@ attr_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8664,16 +8916,19 @@ attr_raw(Parser *p) static expr_ty name_or_attr_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // attr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> name_or_attr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "attr")); @@ -8692,7 +8947,7 @@ name_or_attr_rule(Parser *p) } { // NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> name_or_attr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -8711,7 +8966,7 @@ name_or_attr_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8719,16 +8974,19 @@ name_or_attr_rule(Parser *p) static pattern_ty group_pattern_rule(Parser *p) { - D(p->level++); - if (p->error_indicator) { - D(p->level--); - return NULL; + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // '(' pattern ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> group_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' pattern ')'")); @@ -8747,7 +9005,7 @@ group_pattern_rule(Parser *p) _res = pattern; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8758,7 +9016,7 @@ group_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8766,16 +9024,19 @@ group_pattern_rule(Parser *p) static pattern_ty sequence_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -8784,7 +9045,7 @@ sequence_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '[' maybe_sequence_pattern? ']' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' maybe_sequence_pattern? ']'")); @@ -8802,7 +9063,7 @@ sequence_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ sequence_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' maybe_sequence_pattern? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8812,7 +9073,7 @@ sequence_pattern_rule(Parser *p) _res = _PyAST_MatchSequence ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8823,7 +9084,7 @@ sequence_pattern_rule(Parser *p) } { // '(' open_sequence_pattern? 
')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' open_sequence_pattern? ')'")); @@ -8841,7 +9102,7 @@ sequence_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ sequence_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' open_sequence_pattern? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -8851,7 +9112,7 @@ sequence_pattern_rule(Parser *p) _res = _PyAST_MatchSequence ( patterns , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8862,7 +9123,7 @@ sequence_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8870,16 +9131,19 @@ sequence_pattern_rule(Parser *p) static asdl_seq* open_sequence_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // maybe_star_pattern ',' maybe_sequence_pattern? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> open_sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "maybe_star_pattern ',' maybe_sequence_pattern?")); @@ -8898,7 +9162,7 @@ open_sequence_pattern_rule(Parser *p) _res = _PyPegen_seq_insert_in_front ( p , pattern , patterns ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8909,7 +9173,7 @@ open_sequence_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8917,16 +9181,19 @@ open_sequence_pattern_rule(Parser *p) static asdl_seq* maybe_sequence_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.maybe_star_pattern+ ','? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> maybe_sequence_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.maybe_star_pattern+ ','?")); @@ -8943,7 +9210,7 @@ maybe_sequence_pattern_rule(Parser *p) _res = patterns; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -8954,7 +9221,7 @@ maybe_sequence_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -8962,16 +9229,19 @@ maybe_sequence_pattern_rule(Parser *p) static pattern_ty maybe_star_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; { // star_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> maybe_star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_pattern")); @@ -8990,7 +9260,7 @@ maybe_star_pattern_rule(Parser *p) } { // pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> maybe_star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern")); @@ -9009,7 +9279,7 @@ maybe_star_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9017,16 +9287,19 @@ maybe_star_pattern_rule(Parser *p) static pattern_ty star_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9035,7 +9308,7 @@ star_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' pattern_capture_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' pattern_capture_target")); @@ -9050,7 +9323,7 @@ star_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ star_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' pattern_capture_target")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9060,7 +9333,7 @@ star_pattern_rule(Parser *p) _res = _PyAST_MatchStar ( target -> v . Name . 
id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9071,7 +9344,7 @@ star_pattern_rule(Parser *p) } { // '*' wildcard_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' wildcard_pattern")); @@ -9086,7 +9359,7 @@ star_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ star_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' wildcard_pattern")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9096,7 +9369,7 @@ star_pattern_rule(Parser *p) _res = _PyAST_MatchStar ( NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9107,7 +9380,7 @@ star_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9119,16 +9392,19 @@ star_pattern_rule(Parser *p) static pattern_ty mapping_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9137,7 +9413,7 @@ mapping_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' '}'")); @@ -9152,7 +9428,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9162,7 +9438,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( NULL , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9173,7 +9449,7 @@ mapping_pattern_rule(Parser *p) } { // '{' double_star_pattern ','? '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' double_star_pattern ','? '}'")); @@ -9195,7 +9471,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' double_star_pattern ','? '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9205,7 +9481,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( NULL , NULL , rest -> v . Name . id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9216,7 +9492,7 @@ mapping_pattern_rule(Parser *p) } { // '{' items_pattern ',' double_star_pattern ','? 
'}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ',' double_star_pattern ','? '}'")); @@ -9244,7 +9520,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ',' double_star_pattern ','? '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9254,7 +9530,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , items ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , items ) ) , rest -> v . Name . id , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9265,7 +9541,7 @@ mapping_pattern_rule(Parser *p) } { // '{' items_pattern ','? '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> mapping_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ','? '}'")); @@ -9287,7 +9563,7 @@ mapping_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ mapping_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' items_pattern ','? '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9297,7 +9573,7 @@ mapping_pattern_rule(Parser *p) _res = _PyAST_MatchMapping ( CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , items ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , items ) ) , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9308,7 +9584,7 @@ mapping_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9316,16 +9592,19 @@ mapping_pattern_rule(Parser *p) static asdl_seq* items_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.key_value_pattern+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> items_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.key_value_pattern+")); @@ -9344,7 +9623,7 @@ items_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9352,16 +9631,19 @@ items_pattern_rule(Parser *p) static KeyPatternPair* key_value_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } KeyPatternPair* _res = NULL; int _mark = p->mark; { // (literal_expr | attr) ':' pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> key_value_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(literal_expr | attr) ':' pattern")); @@ -9380,7 +9662,7 @@ key_value_pattern_rule(Parser *p) _res = _PyPegen_key_pattern_pair ( p , key , pattern ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9391,7 +9673,7 @@ 
key_value_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9399,16 +9681,19 @@ key_value_pattern_rule(Parser *p) static expr_ty double_star_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // '**' pattern_capture_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> double_star_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' pattern_capture_target")); @@ -9424,7 +9709,7 @@ double_star_pattern_rule(Parser *p) _res = target; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9435,7 +9720,7 @@ double_star_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9448,16 +9733,19 @@ double_star_pattern_rule(Parser *p) static pattern_ty class_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } pattern_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9466,7 +9754,7 @@ class_pattern_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // name_or_attr '(' ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' ')'")); @@ -9484,7 +9772,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9494,7 +9782,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , NULL , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9505,7 +9793,7 @@ class_pattern_rule(Parser *p) } { // name_or_attr '(' positional_patterns ','? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ','? ')'")); @@ -9530,7 +9818,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ','? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9540,7 +9828,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , patterns , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9551,7 +9839,7 @@ class_pattern_rule(Parser *p) } { // name_or_attr '(' keyword_patterns ','? 
')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' keyword_patterns ','? ')'")); @@ -9576,7 +9864,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' keyword_patterns ','? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9586,7 +9874,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , NULL , CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , keywords ) ) ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , keywords ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9597,7 +9885,7 @@ class_pattern_rule(Parser *p) } { // name_or_attr '(' positional_patterns ',' keyword_patterns ','? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ',' keyword_patterns ','? ')'")); @@ -9628,7 +9916,7 @@ class_pattern_rule(Parser *p) D(fprintf(stderr, "%*c+ class_pattern[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' positional_patterns ',' keyword_patterns ','? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9638,7 +9926,7 @@ class_pattern_rule(Parser *p) _res = _PyAST_MatchClass ( cls , patterns , CHECK ( asdl_identifier_seq* , _PyPegen_map_names_to_ids ( p , CHECK ( asdl_expr_seq* , _PyPegen_get_pattern_keys ( p , keywords ) ) ) ) , CHECK ( asdl_pattern_seq* , _PyPegen_get_patterns ( p , keywords ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9649,7 +9937,7 @@ class_pattern_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_class_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_class_pattern")); @@ -9668,7 +9956,7 @@ class_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9676,16 +9964,19 @@ class_pattern_rule(Parser *p) static asdl_pattern_seq* positional_patterns_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_pattern_seq* _res = NULL; int _mark = p->mark; { // ','.pattern+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> positional_patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.pattern+")); @@ -9698,7 +9989,7 @@ positional_patterns_rule(Parser *p) _res = args; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9709,7 +10000,7 @@ positional_patterns_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9717,16 +10008,19 @@ positional_patterns_rule(Parser *p) static asdl_seq* keyword_patterns_rule(Parser *p) { - 
D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.keyword_pattern+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> keyword_patterns[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.keyword_pattern+")); @@ -9745,7 +10039,7 @@ keyword_patterns_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9753,16 +10047,19 @@ keyword_patterns_rule(Parser *p) static KeyPatternPair* keyword_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } KeyPatternPair* _res = NULL; int _mark = p->mark; { // NAME '=' pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> keyword_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' pattern")); @@ -9781,7 +10078,7 @@ keyword_pattern_rule(Parser *p) _res = _PyPegen_key_pattern_pair ( p , arg , value ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9792,7 +10089,7 @@ keyword_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9800,16 +10097,19 @@ keyword_pattern_rule(Parser *p) static expr_ty expressions_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9818,7 +10118,7 @@ expressions_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // expression ((',' expression))+ ','? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ((',' expression))+ ','?")); @@ -9837,7 +10137,7 @@ expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ((',' expression))+ ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9847,7 +10147,7 @@ expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9858,7 +10158,7 @@ expressions_rule(Parser *p) } { // expression ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ','")); @@ -9873,7 +10173,7 @@ expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ','")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -9883,7 +10183,7 @@ expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_singleton_seq ( p , a ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -9894,7 +10194,7 @@ expressions_rule(Parser *p) } { // expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression")); @@ -9913,7 +10213,7 @@ expressions_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -9926,20 +10226,23 @@ expressions_rule(Parser *p) static expr_ty expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, expression_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -9948,7 +10251,7 @@ expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_expression")); @@ -9967,7 +10270,7 @@ expression_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_legacy_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_legacy_expression")); @@ -9986,7 +10289,7 @@ expression_rule(Parser *p) } { // disjunction 'if' disjunction 'else' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); @@ 
-10010,7 +10313,7 @@ expression_rule(Parser *p) D(fprintf(stderr, "%*c+ expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10020,7 +10323,7 @@ expression_rule(Parser *p) _res = _PyAST_IfExp ( b , a , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10031,7 +10334,7 @@ expression_rule(Parser *p) } { // disjunction if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction")); @@ -10050,7 +10353,7 @@ expression_rule(Parser *p) } { // lambdef if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambdef")); @@ -10070,7 +10373,7 @@ expression_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, expression_type, _res); - D(p->level--); + p->level--; return _res; } @@ -10078,16 +10381,19 @@ expression_rule(Parser *p) static expr_ty yield_expr_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10096,7 +10402,7 @@ yield_expr_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'yield' 'from' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> yield_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'yield' 'from' expression")); @@ -10114,7 +10420,7 @@ yield_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ yield_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'yield' 'from' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10124,7 +10430,7 @@ yield_expr_rule(Parser *p) _res = _PyAST_YieldFrom ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10135,7 +10441,7 @@ yield_expr_rule(Parser *p) } { // 'yield' star_expressions? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> yield_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'yield' star_expressions?")); @@ -10150,7 +10456,7 @@ yield_expr_rule(Parser *p) D(fprintf(stderr, "%*c+ yield_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'yield' star_expressions?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10160,7 +10466,7 @@ yield_expr_rule(Parser *p) _res = _PyAST_Yield ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10171,7 +10477,7 @@ yield_expr_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -10182,16 +10488,19 @@ yield_expr_rule(Parser *p) static expr_ty star_expressions_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10200,7 +10509,7 @@ star_expressions_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // star_expression ((',' star_expression))+ ','? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expression ((',' star_expression))+ ','?")); @@ -10219,7 +10528,7 @@ star_expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ star_expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expression ((',' star_expression))+ ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10229,7 +10538,7 @@ star_expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10240,7 +10549,7 @@ star_expressions_rule(Parser *p) } { // star_expression ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expression ','")); @@ -10255,7 +10564,7 @@ star_expressions_rule(Parser *p) D(fprintf(stderr, "%*c+ star_expressions[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expression ','")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10265,7 +10574,7 @@ star_expressions_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_singleton_seq ( p , a ) ) , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10276,7 +10585,7 @@ star_expressions_rule(Parser *p) } { // star_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expression")); @@ -10295,7 +10604,7 @@ 
star_expressions_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -10303,20 +10612,23 @@ star_expressions_rule(Parser *p) static expr_ty star_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, star_expression_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10325,7 +10637,7 @@ star_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); @@ -10340,7 +10652,7 @@ star_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ star_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10350,7 +10662,7 @@ star_expression_rule(Parser *p) _res = _PyAST_Starred ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10361,7 +10673,7 @@ star_expression_rule(Parser *p) } { // expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression")); @@ -10381,7 +10693,7 @@ star_expression_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, star_expression_type, _res); - D(p->level--); + p->level--; return _res; } @@ -10389,16 +10701,19 @@ star_expression_rule(Parser *p) static asdl_expr_seq* star_named_expressions_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.star_named_expression+ ','? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_named_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.star_named_expression+ ','?")); @@ -10415,7 +10730,7 @@ star_named_expressions_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10426,7 +10741,7 @@ star_named_expressions_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -10434,16 +10749,19 @@ star_named_expressions_rule(Parser *p) static expr_ty star_named_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10452,7 +10770,7 @@ star_named_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); @@ -10467,7 +10785,7 @@ star_named_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ star_named_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' bitwise_or")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10477,7 +10795,7 @@ star_named_expression_rule(Parser *p) _res = _PyAST_Starred ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10488,7 +10806,7 @@ star_named_expression_rule(Parser *p) } { // named_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -10507,7 +10825,7 @@ star_named_expression_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -10515,16 +10833,19 @@ star_named_expression_rule(Parser *p) static expr_ty assignment_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10533,7 +10854,7 @@ assignment_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME ':=' ~ expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> assignment_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME ':=' ~ expression")); @@ -10554,7 +10875,7 @@ assignment_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ assignment_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME ':=' ~ expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10564,7 +10885,7 @@ 
assignment_expression_rule(Parser *p) _res = _PyAST_NamedExpr ( CHECK ( expr_ty , _PyPegen_set_expr_context ( p , a , Store ) ) , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10573,13 +10894,13 @@ assignment_expression_rule(Parser *p) D(fprintf(stderr, "%*c%s assignment_expression[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "NAME ':=' ~ expression")); if (_cut_var) { - D(p->level--); + p->level--; return NULL; } } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -10587,16 +10908,19 @@ assignment_expression_rule(Parser *p) static expr_ty named_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // assignment_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); @@ -10615,7 +10939,7 @@ named_expression_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_named_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_named_expression")); @@ -10634,7 +10958,7 @@ named_expression_rule(Parser *p) } { // expression !':=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); @@ -10655,7 +10979,7 @@ named_expression_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -10663,20 +10987,23 @@ named_expression_rule(Parser *p) static expr_ty disjunction_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, disjunction_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10685,7 +11012,7 @@ disjunction_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // conjunction (('or' conjunction))+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> disjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "conjunction (('or' conjunction))+")); @@ -10700,7 +11027,7 @@ disjunction_rule(Parser *p) D(fprintf(stderr, "%*c+ disjunction[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "conjunction (('or' conjunction))+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10710,7 +11037,7 @@ disjunction_rule(Parser *p) _res = _PyAST_BoolOp ( Or , CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10721,7 +11048,7 @@ disjunction_rule(Parser *p) } { // conjunction if (p->error_indicator) { - D(p->level--); + p->level--; return 
NULL; } D(fprintf(stderr, "%*c> disjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "conjunction")); @@ -10741,7 +11068,7 @@ disjunction_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, disjunction_type, _res); - D(p->level--); + p->level--; return _res; } @@ -10749,20 +11076,23 @@ disjunction_rule(Parser *p) static expr_ty conjunction_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, conjunction_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10771,7 +11101,7 @@ conjunction_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // inversion (('and' inversion))+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> conjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "inversion (('and' inversion))+")); @@ -10786,7 +11116,7 @@ conjunction_rule(Parser *p) D(fprintf(stderr, "%*c+ conjunction[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "inversion (('and' inversion))+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10796,7 +11126,7 @@ conjunction_rule(Parser *p) _res = _PyAST_BoolOp ( And , CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10807,7 +11137,7 @@ conjunction_rule(Parser *p) } { // inversion if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> conjunction[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "inversion")); @@ -10827,7 +11157,7 @@ conjunction_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, conjunction_type, _res); - D(p->level--); + p->level--; return _res; } @@ -10835,20 +11165,23 @@ conjunction_rule(Parser *p) static expr_ty inversion_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, inversion_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10857,7 +11190,7 @@ inversion_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'not' inversion if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> inversion[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'not' inversion")); @@ -10872,7 +11205,7 @@ inversion_rule(Parser *p) D(fprintf(stderr, "%*c+ inversion[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'not' inversion")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10882,7 +11215,7 @@ inversion_rule(Parser *p) _res = _PyAST_UnaryOp ( Not , a , EXTRA ); if (_res == 
NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10893,7 +11226,7 @@ inversion_rule(Parser *p) } { // comparison if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> inversion[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "comparison")); @@ -10913,7 +11246,7 @@ inversion_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, inversion_type, _res); - D(p->level--); + p->level--; return _res; } @@ -10921,16 +11254,19 @@ inversion_rule(Parser *p) static expr_ty comparison_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -10939,7 +11275,7 @@ comparison_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_or compare_op_bitwise_or_pair+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> comparison[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_or compare_op_bitwise_or_pair+")); @@ -10954,7 +11290,7 @@ comparison_rule(Parser *p) D(fprintf(stderr, "%*c+ comparison[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_or compare_op_bitwise_or_pair+")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -10964,7 +11300,7 @@ comparison_rule(Parser *p) _res = _PyAST_Compare ( a , CHECK ( asdl_int_seq* , _PyPegen_get_cmpops ( p , b ) ) , CHECK ( asdl_expr_seq* , _PyPegen_get_exprs ( p , b ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -10975,7 +11311,7 @@ comparison_rule(Parser *p) } { // bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> comparison[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_or")); @@ -10994,7 +11330,7 @@ comparison_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11012,16 +11348,19 @@ comparison_rule(Parser *p) static CmpopExprPair* compare_op_bitwise_or_pair_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // eq_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "eq_bitwise_or")); @@ -11040,7 +11379,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // noteq_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "noteq_bitwise_or")); @@ -11059,7 +11398,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // lte_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lte_bitwise_or")); @@ -11078,7 +11417,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { 
// lt_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lt_bitwise_or")); @@ -11097,7 +11436,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // gte_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "gte_bitwise_or")); @@ -11116,7 +11455,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // gt_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "gt_bitwise_or")); @@ -11135,7 +11474,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // notin_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "notin_bitwise_or")); @@ -11154,7 +11493,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // in_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "in_bitwise_or")); @@ -11173,7 +11512,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // isnot_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "isnot_bitwise_or")); @@ -11192,7 +11531,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } { // is_bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> compare_op_bitwise_or_pair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "is_bitwise_or")); @@ -11211,7 +11550,7 @@ compare_op_bitwise_or_pair_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11219,16 +11558,19 @@ compare_op_bitwise_or_pair_rule(Parser *p) static CmpopExprPair* eq_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '==' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> eq_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'==' bitwise_or")); @@ -11244,7 +11586,7 @@ eq_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Eq , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11255,7 +11597,7 @@ eq_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11263,16 +11605,19 @@ eq_bitwise_or_rule(Parser *p) static CmpopExprPair* noteq_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // ('!=') bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> noteq_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('!=') bitwise_or")); @@ -11288,7 +11633,7 @@ noteq_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , NotEq , a ); if (_res == 
NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11299,7 +11644,7 @@ noteq_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11307,16 +11652,19 @@ noteq_bitwise_or_rule(Parser *p) static CmpopExprPair* lte_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '<=' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lte_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'<=' bitwise_or")); @@ -11332,7 +11680,7 @@ lte_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , LtE , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11343,7 +11691,7 @@ lte_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11351,16 +11699,19 @@ lte_bitwise_or_rule(Parser *p) static CmpopExprPair* lt_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '<' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lt_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'<' bitwise_or")); @@ -11376,7 +11727,7 @@ lt_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Lt , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11387,7 +11738,7 @@ lt_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11395,16 +11746,19 @@ lt_bitwise_or_rule(Parser *p) static CmpopExprPair* gte_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '>=' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> gte_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'>=' bitwise_or")); @@ -11420,7 +11774,7 @@ gte_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , GtE , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11431,7 +11785,7 @@ gte_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11439,16 +11793,19 @@ gte_bitwise_or_rule(Parser *p) static CmpopExprPair* gt_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // '>' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> gt_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'>' bitwise_or")); @@ -11464,7 +11821,7 @@ gt_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Gt , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; 
- D(p->level--); + p->level--; return NULL; } goto done; @@ -11475,7 +11832,7 @@ gt_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11483,16 +11840,19 @@ gt_bitwise_or_rule(Parser *p) static CmpopExprPair* notin_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'not' 'in' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> notin_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'not' 'in' bitwise_or")); @@ -11511,7 +11871,7 @@ notin_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , NotIn , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11522,7 +11882,7 @@ notin_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11530,16 +11890,19 @@ notin_bitwise_or_rule(Parser *p) static CmpopExprPair* in_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'in' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> in_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'in' bitwise_or")); @@ -11555,7 +11918,7 @@ in_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , In , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11566,7 +11929,7 @@ in_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11574,16 +11937,19 @@ in_bitwise_or_rule(Parser *p) static CmpopExprPair* isnot_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'is' 'not' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> isnot_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'is' 'not' bitwise_or")); @@ -11602,7 +11968,7 @@ isnot_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , IsNot , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11613,7 +11979,7 @@ isnot_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11621,16 +11987,19 @@ isnot_bitwise_or_rule(Parser *p) static CmpopExprPair* is_bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } CmpopExprPair* _res = NULL; int _mark = p->mark; { // 'is' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> is_bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'is' bitwise_or")); @@ -11646,7 +12015,7 @@ is_bitwise_or_rule(Parser *p) _res = _PyPegen_cmpop_expr_pair ( p , Is , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - 
D(p->level--); + p->level--; return NULL; } goto done; @@ -11657,7 +12026,7 @@ is_bitwise_or_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11667,10 +12036,13 @@ static expr_ty bitwise_or_raw(Parser *); static expr_ty bitwise_or_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, bitwise_or_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -11678,37 +12050,42 @@ bitwise_or_rule(Parser *p) while (1) { int tmpvar_2 = _PyPegen_update_memo(p, _mark, bitwise_or_type, _res); if (tmpvar_2) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = bitwise_or_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty bitwise_or_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -11717,7 +12094,7 @@ bitwise_or_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_or '|' bitwise_xor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_or '|' bitwise_xor")); @@ -11735,7 +12112,7 @@ bitwise_or_raw(Parser *p) D(fprintf(stderr, "%*c+ bitwise_or[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_or '|' bitwise_xor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -11745,7 +12122,7 @@ bitwise_or_raw(Parser *p) _res = _PyAST_BinOp ( a , BitOr , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11756,7 +12133,7 @@ bitwise_or_raw(Parser *p) } { // bitwise_xor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> bitwise_or[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_xor")); @@ -11775,7 +12152,7 @@ bitwise_or_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11785,10 +12162,13 @@ static expr_ty bitwise_xor_raw(Parser *); static expr_ty bitwise_xor_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, bitwise_xor_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -11796,37 +12176,42 @@ bitwise_xor_rule(Parser *p) while (1) { int tmpvar_3 = _PyPegen_update_memo(p, _mark, bitwise_xor_type, _res); if (tmpvar_3) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = bitwise_xor_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); 
+ p->level--; return _res; } static expr_ty bitwise_xor_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -11835,7 +12220,7 @@ bitwise_xor_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_xor '^' bitwise_and if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> bitwise_xor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_xor '^' bitwise_and")); @@ -11853,7 +12238,7 @@ bitwise_xor_raw(Parser *p) D(fprintf(stderr, "%*c+ bitwise_xor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_xor '^' bitwise_and")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -11863,7 +12248,7 @@ bitwise_xor_raw(Parser *p) _res = _PyAST_BinOp ( a , BitXor , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11874,7 +12259,7 @@ bitwise_xor_raw(Parser *p) } { // bitwise_and if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> bitwise_xor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_and")); @@ -11893,7 +12278,7 @@ bitwise_xor_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -11903,10 +12288,13 @@ static expr_ty bitwise_and_raw(Parser *); static expr_ty bitwise_and_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, bitwise_and_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -11914,37 +12302,42 @@ bitwise_and_rule(Parser *p) while (1) { int tmpvar_4 = _PyPegen_update_memo(p, _mark, bitwise_and_type, _res); if (tmpvar_4) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = bitwise_and_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty bitwise_and_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -11953,7 +12346,7 @@ bitwise_and_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // bitwise_and '&' shift_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> bitwise_and[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "bitwise_and '&' shift_expr")); @@ -11971,7 +12364,7 @@ bitwise_and_raw(Parser *p) D(fprintf(stderr, "%*c+ bitwise_and[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "bitwise_and '&' shift_expr")); Token *_token = 
_PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -11981,7 +12374,7 @@ bitwise_and_raw(Parser *p) _res = _PyAST_BinOp ( a , BitAnd , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -11992,7 +12385,7 @@ bitwise_and_raw(Parser *p) } { // shift_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> bitwise_and[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "shift_expr")); @@ -12011,7 +12404,7 @@ bitwise_and_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -12021,10 +12414,13 @@ static expr_ty shift_expr_raw(Parser *); static expr_ty shift_expr_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, shift_expr_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -12032,37 +12428,42 @@ shift_expr_rule(Parser *p) while (1) { int tmpvar_5 = _PyPegen_update_memo(p, _mark, shift_expr_type, _res); if (tmpvar_5) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = shift_expr_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty shift_expr_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12071,7 +12472,7 @@ shift_expr_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // shift_expr '<<' sum if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> shift_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "shift_expr '<<' sum")); @@ -12089,7 +12490,7 @@ shift_expr_raw(Parser *p) D(fprintf(stderr, "%*c+ shift_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "shift_expr '<<' sum")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12099,7 +12500,7 @@ shift_expr_raw(Parser *p) _res = _PyAST_BinOp ( a , LShift , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12110,7 +12511,7 @@ shift_expr_raw(Parser *p) } { // shift_expr '>>' sum if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> shift_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "shift_expr '>>' sum")); @@ -12128,7 +12529,7 @@ shift_expr_raw(Parser *p) D(fprintf(stderr, "%*c+ shift_expr[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "shift_expr '>>' sum")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12138,7 +12539,7 @@ shift_expr_raw(Parser *p) _res = _PyAST_BinOp ( a , RShift , 
b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12149,7 +12550,7 @@ shift_expr_raw(Parser *p) } { // sum if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> shift_expr[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sum")); @@ -12168,7 +12569,7 @@ shift_expr_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -12178,10 +12579,13 @@ static expr_ty sum_raw(Parser *); static expr_ty sum_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, sum_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -12189,37 +12593,42 @@ sum_rule(Parser *p) while (1) { int tmpvar_6 = _PyPegen_update_memo(p, _mark, sum_type, _res); if (tmpvar_6) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = sum_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty sum_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12228,7 +12637,7 @@ sum_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // sum '+' term if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> sum[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sum '+' term")); @@ -12246,7 +12655,7 @@ sum_raw(Parser *p) D(fprintf(stderr, "%*c+ sum[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "sum '+' term")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12256,7 +12665,7 @@ sum_raw(Parser *p) _res = _PyAST_BinOp ( a , Add , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12267,7 +12676,7 @@ sum_raw(Parser *p) } { // sum '-' term if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> sum[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "sum '-' term")); @@ -12285,7 +12694,7 @@ sum_raw(Parser *p) D(fprintf(stderr, "%*c+ sum[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "sum '-' term")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12295,7 +12704,7 @@ sum_raw(Parser *p) _res = _PyAST_BinOp ( a , Sub , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12306,7 +12715,7 @@ sum_raw(Parser *p) } { // term if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> sum[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term")); @@ -12325,7 +12734,7 @@ sum_raw(Parser *p) } _res = NULL; done: - 
D(p->level--); + p->level--; return _res; } @@ -12341,10 +12750,13 @@ static expr_ty term_raw(Parser *); static expr_ty term_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, term_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -12352,37 +12764,42 @@ term_rule(Parser *p) while (1) { int tmpvar_7 = _PyPegen_update_memo(p, _mark, term_type, _res); if (tmpvar_7) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = term_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty term_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12391,7 +12808,7 @@ term_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // term '*' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '*' factor")); @@ -12409,7 +12826,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '*' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12419,7 +12836,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , Mult , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12430,7 +12847,7 @@ term_raw(Parser *p) } { // term '/' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '/' factor")); @@ -12448,7 +12865,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '/' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12458,7 +12875,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , Div , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12469,7 +12886,7 @@ term_raw(Parser *p) } { // term '//' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '//' factor")); @@ -12487,7 +12904,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '//' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12497,7 +12914,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , FloorDiv , b , EXTRA ); if (_res == NULL && 
PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12508,7 +12925,7 @@ term_raw(Parser *p) } { // term '%' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '%' factor")); @@ -12526,7 +12943,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '%' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12536,7 +12953,7 @@ term_raw(Parser *p) _res = _PyAST_BinOp ( a , Mod , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12547,7 +12964,7 @@ term_raw(Parser *p) } { // term '@' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "term '@' factor")); @@ -12565,7 +12982,7 @@ term_raw(Parser *p) D(fprintf(stderr, "%*c+ term[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "term '@' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12575,7 +12992,7 @@ term_raw(Parser *p) _res = CHECK_VERSION ( expr_ty , 5 , "The '@' operator is" , _PyAST_BinOp ( a , MatMult , b , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12586,7 +13003,7 @@ term_raw(Parser *p) } { // factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> term[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "factor")); @@ -12605,7 +13022,7 @@ term_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -12613,20 +13030,23 @@ term_raw(Parser *p) static expr_ty factor_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, factor_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12635,7 +13055,7 @@ factor_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '+' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+' factor")); @@ -12650,7 +13070,7 @@ factor_rule(Parser *p) D(fprintf(stderr, "%*c+ factor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'+' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12660,7 +13080,7 @@ factor_rule(Parser *p) _res = _PyAST_UnaryOp ( UAdd , a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12671,7 +13091,7 @@ factor_rule(Parser *p) } { // '-' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> 
factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-' factor")); @@ -12686,7 +13106,7 @@ factor_rule(Parser *p) D(fprintf(stderr, "%*c+ factor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'-' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12696,7 +13116,7 @@ factor_rule(Parser *p) _res = _PyAST_UnaryOp ( USub , a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12707,7 +13127,7 @@ factor_rule(Parser *p) } { // '~' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'~' factor")); @@ -12722,7 +13142,7 @@ factor_rule(Parser *p) D(fprintf(stderr, "%*c+ factor[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'~' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12732,7 +13152,7 @@ factor_rule(Parser *p) _res = _PyAST_UnaryOp ( Invert , a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12743,7 +13163,7 @@ factor_rule(Parser *p) } { // power if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> factor[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "power")); @@ -12763,7 +13183,7 @@ factor_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, factor_type, _res); - D(p->level--); + p->level--; return _res; } @@ -12771,16 +13191,19 @@ factor_rule(Parser *p) static expr_ty power_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12789,7 +13212,7 @@ power_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // await_primary '**' factor if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> power[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "await_primary '**' factor")); @@ -12807,7 +13230,7 @@ power_rule(Parser *p) D(fprintf(stderr, "%*c+ power[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "await_primary '**' factor")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12817,7 +13240,7 @@ power_rule(Parser *p) _res = _PyAST_BinOp ( a , Pow , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12828,7 +13251,7 @@ power_rule(Parser *p) } { // await_primary if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> power[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "await_primary")); @@ -12847,7 +13270,7 @@ power_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -12855,20 +13278,23 @@ power_rule(Parser *p) static expr_ty await_primary_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) 
{ + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, await_primary_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12877,7 +13303,7 @@ await_primary_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // AWAIT primary if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> await_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "AWAIT primary")); @@ -12892,7 +13318,7 @@ await_primary_rule(Parser *p) D(fprintf(stderr, "%*c+ await_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "AWAIT primary")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -12902,7 +13328,7 @@ await_primary_rule(Parser *p) _res = CHECK_VERSION ( expr_ty , 5 , "Await expressions are" , _PyAST_Await ( a , EXTRA ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -12913,7 +13339,7 @@ await_primary_rule(Parser *p) } { // primary if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> await_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary")); @@ -12933,7 +13359,7 @@ await_primary_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, await_primary_type, _res); - D(p->level--); + p->level--; return _res; } @@ -12948,10 +13374,13 @@ static expr_ty primary_raw(Parser *); static expr_ty primary_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, primary_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -12959,37 +13388,42 @@ primary_rule(Parser *p) while (1) { int tmpvar_8 = _PyPegen_update_memo(p, _mark, primary_type, _res); if (tmpvar_8) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = primary_raw(p); p->in_raw_rule--; - if (p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty primary_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -12998,7 +13432,7 @@ primary_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // primary '.' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary '.' NAME")); @@ -13016,7 +13450,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary '.' 
NAME")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13026,7 +13460,7 @@ primary_raw(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13037,7 +13471,7 @@ primary_raw(Parser *p) } { // primary genexp if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary genexp")); @@ -13052,7 +13486,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary genexp")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13062,7 +13496,7 @@ primary_raw(Parser *p) _res = _PyAST_Call ( a , CHECK ( asdl_expr_seq* , ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , b ) ) , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13073,7 +13507,7 @@ primary_raw(Parser *p) } { // primary '(' arguments? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary '(' arguments? ')'")); @@ -13094,7 +13528,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary '(' arguments? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13104,7 +13538,7 @@ primary_raw(Parser *p) _res = _PyAST_Call ( a , ( b ) ? ( ( expr_ty ) b ) -> v . Call . args : NULL , ( b ) ? ( ( expr_ty ) b ) -> v . Call . 
keywords : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13115,7 +13549,7 @@ primary_raw(Parser *p) } { // primary '[' slices ']' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "primary '[' slices ']'")); @@ -13136,7 +13570,7 @@ primary_raw(Parser *p) D(fprintf(stderr, "%*c+ primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "primary '[' slices ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13146,7 +13580,7 @@ primary_raw(Parser *p) _res = _PyAST_Subscript ( a , b , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13157,7 +13591,7 @@ primary_raw(Parser *p) } { // atom if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "atom")); @@ -13176,7 +13610,7 @@ primary_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -13184,16 +13618,19 @@ primary_raw(Parser *p) static expr_ty slices_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13202,7 +13639,7 @@ slices_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // slice !',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slices[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice !','")); @@ -13217,7 +13654,7 @@ slices_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13228,7 +13665,7 @@ slices_rule(Parser *p) } { // ','.slice+ ','? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slices[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.slice+ ','?")); @@ -13244,7 +13681,7 @@ slices_rule(Parser *p) D(fprintf(stderr, "%*c+ slices[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','.slice+ ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13254,7 +13691,7 @@ slices_rule(Parser *p) _res = _PyAST_Tuple ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13265,7 +13702,7 @@ slices_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -13273,16 +13710,19 @@ slices_rule(Parser *p) static expr_ty slice_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13291,7 +13731,7 @@ slice_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // expression? ':' expression? [':' expression?] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slice[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression? ':' expression? [':' expression?]")); @@ -13312,7 +13752,7 @@ slice_rule(Parser *p) D(fprintf(stderr, "%*c+ slice[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression? ':' expression? 
[':' expression?]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13322,7 +13762,7 @@ slice_rule(Parser *p) _res = _PyAST_Slice ( a , b , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13333,7 +13773,7 @@ slice_rule(Parser *p) } { // named_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> slice[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -13346,7 +13786,7 @@ slice_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13357,7 +13797,7 @@ slice_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -13375,16 +13815,19 @@ slice_rule(Parser *p) static expr_ty atom_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13393,7 +13836,7 @@ atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -13412,7 +13855,7 @@ atom_rule(Parser *p) } { // 'True' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -13424,7 +13867,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13434,7 +13877,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_True , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13445,7 +13888,7 @@ atom_rule(Parser *p) } { // 'False' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -13457,7 +13900,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13467,7 +13910,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_False , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13478,7 +13921,7 @@ atom_rule(Parser *p) } { // 'None' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -13490,7 +13933,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); Token *_token = 
_PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13500,7 +13943,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_None , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13511,7 +13954,7 @@ atom_rule(Parser *p) } { // &STRING strings if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&STRING strings")); @@ -13532,7 +13975,7 @@ atom_rule(Parser *p) } { // NUMBER if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NUMBER")); @@ -13551,7 +13994,7 @@ atom_rule(Parser *p) } { // &'(' (tuple | group | genexp) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'(' (tuple | group | genexp)")); @@ -13572,7 +14015,7 @@ atom_rule(Parser *p) } { // &'[' (list | listcomp) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'[' (list | listcomp)")); @@ -13593,7 +14036,7 @@ atom_rule(Parser *p) } { // &'{' (dict | set | dictcomp | setcomp) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "&'{' (dict | set | dictcomp | setcomp)")); @@ -13614,7 +14057,7 @@ atom_rule(Parser *p) } { // '...' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); @@ -13626,7 +14069,7 @@ atom_rule(Parser *p) D(fprintf(stderr, "%*c+ atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13636,7 +14079,7 @@ atom_rule(Parser *p) _res = _PyAST_Constant ( Py_Ellipsis , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13647,7 +14090,7 @@ atom_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -13655,16 +14098,19 @@ atom_rule(Parser *p) static expr_ty group_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // '(' (yield_expr | named_expression) ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' (yield_expr | named_expression) ')'")); @@ -13683,7 +14129,7 @@ group_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13694,7 +14140,7 @@ group_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_group if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_group")); @@ -13713,7 +14159,7 @@ group_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return 
_res; } @@ -13721,16 +14167,19 @@ group_rule(Parser *p) static expr_ty lambdef_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -13739,7 +14188,7 @@ lambdef_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // 'lambda' lambda_params? ':' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambdef[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'lambda' lambda_params? ':' expression")); @@ -13760,7 +14209,7 @@ lambdef_rule(Parser *p) D(fprintf(stderr, "%*c+ lambdef[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'lambda' lambda_params? ':' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -13770,7 +14219,7 @@ lambdef_rule(Parser *p) _res = _PyAST_Lambda ( ( a ) ? a : CHECK ( arguments_ty , _PyPegen_empty_arguments ( p ) ) , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13781,7 +14230,7 @@ lambdef_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -13789,16 +14238,19 @@ lambdef_rule(Parser *p) static arguments_ty lambda_params_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arguments_ty _res = NULL; int _mark = p->mark; if (p->call_invalid_rules) { // invalid_lambda_parameters if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_lambda_parameters")); @@ -13817,7 +14269,7 @@ lambda_params_rule(Parser *p) } { // lambda_parameters if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_params[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_parameters")); @@ -13836,7 +14288,7 @@ lambda_params_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -13849,16 +14301,19 @@ lambda_params_rule(Parser *p) static arguments_ty lambda_parameters_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arguments_ty _res = NULL; int _mark = p->mark; { // lambda_slash_no_default lambda_param_no_default* lambda_param_with_default* lambda_star_etc? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default lambda_param_no_default* lambda_param_with_default* lambda_star_etc?")); @@ -13880,7 +14335,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , a , NULL , b , c , d ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13891,7 +14346,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_slash_with_default lambda_param_with_default* lambda_star_etc? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default lambda_param_with_default* lambda_star_etc?")); @@ -13910,7 +14365,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , a , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13921,7 +14376,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_param_no_default+ lambda_param_with_default* lambda_star_etc? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default+ lambda_param_with_default* lambda_star_etc?")); @@ -13940,7 +14395,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13951,7 +14406,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_param_with_default+ lambda_star_etc? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default+ lambda_star_etc?")); @@ -13967,7 +14422,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -13978,7 +14433,7 @@ lambda_parameters_rule(Parser *p) } { // lambda_star_etc if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_star_etc")); @@ -13991,7 +14446,7 @@ lambda_parameters_rule(Parser *p) _res = _PyPegen_make_arguments ( p , NULL , NULL , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14002,7 +14457,7 @@ lambda_parameters_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14012,16 +14467,19 @@ lambda_parameters_rule(Parser *p) static asdl_arg_seq* lambda_slash_no_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_arg_seq* _res = NULL; int _mark = p->mark; { // lambda_param_no_default+ '/' ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default+ '/' ','")); @@ -14040,7 +14498,7 @@ lambda_slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14051,7 +14509,7 @@ lambda_slash_no_default_rule(Parser *p) } { // lambda_param_no_default+ '/' &':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_slash_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default+ '/' &':'")); @@ -14069,7 +14527,7 @@ lambda_slash_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14080,7 +14538,7 @@ lambda_slash_no_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14090,16 +14548,19 @@ lambda_slash_no_default_rule(Parser *p) static SlashWithDefault* lambda_slash_with_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } SlashWithDefault* _res = NULL; int _mark = p->mark; { // lambda_param_no_default* lambda_param_with_default+ '/' ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* lambda_param_with_default+ '/' ','")); @@ -14121,7 +14582,7 @@ lambda_slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14132,7 +14593,7 @@ lambda_slash_with_default_rule(Parser *p) } { // lambda_param_no_default* lambda_param_with_default+ '/' &':' if 
(p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_slash_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* lambda_param_with_default+ '/' &':'")); @@ -14153,7 +14614,7 @@ lambda_slash_with_default_rule(Parser *p) _res = _PyPegen_slash_with_default ( p , ( asdl_arg_seq* ) a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14164,7 +14625,7 @@ lambda_slash_with_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14176,16 +14637,19 @@ lambda_slash_with_default_rule(Parser *p) static StarEtc* lambda_star_etc_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } StarEtc* _res = NULL; int _mark = p->mark; { // '*' lambda_param_no_default lambda_param_maybe_default* lambda_kwds? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' lambda_param_no_default lambda_param_maybe_default* lambda_kwds?")); @@ -14207,7 +14671,7 @@ lambda_star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , a , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14218,7 +14682,7 @@ lambda_star_etc_rule(Parser *p) } { // '*' ',' lambda_param_maybe_default+ lambda_kwds? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' ',' lambda_param_maybe_default+ lambda_kwds?")); @@ -14240,7 +14704,7 @@ lambda_star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , b , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14251,7 +14715,7 @@ lambda_star_etc_rule(Parser *p) } { // lambda_kwds if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_kwds")); @@ -14264,7 +14728,7 @@ lambda_star_etc_rule(Parser *p) _res = _PyPegen_star_etc ( p , NULL , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14275,7 +14739,7 @@ lambda_star_etc_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_lambda_star_etc if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_lambda_star_etc")); @@ -14294,7 +14758,7 @@ lambda_star_etc_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14302,16 +14766,19 @@ lambda_star_etc_rule(Parser *p) static arg_ty lambda_kwds_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // '**' lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_kwds[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' lambda_param_no_default")); @@ -14327,7 +14794,7 @@ lambda_kwds_rule(Parser *p) _res = a; if (_res == NULL && 
PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14338,7 +14805,7 @@ lambda_kwds_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14346,16 +14813,19 @@ lambda_kwds_rule(Parser *p) static arg_ty lambda_param_no_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arg_ty _res = NULL; int _mark = p->mark; { // lambda_param ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param ','")); @@ -14371,7 +14841,7 @@ lambda_param_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14382,7 +14852,7 @@ lambda_param_no_default_rule(Parser *p) } { // lambda_param &':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_param_no_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param &':'")); @@ -14397,7 +14867,7 @@ lambda_param_no_default_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14408,7 +14878,7 @@ lambda_param_no_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14416,16 +14886,19 @@ lambda_param_no_default_rule(Parser *p) static NameDefaultPair* lambda_param_with_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // lambda_param default ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default ','")); @@ -14444,7 +14917,7 @@ lambda_param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14455,7 +14928,7 @@ lambda_param_with_default_rule(Parser *p) } { // lambda_param default &':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_param_with_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default &':'")); @@ -14473,7 +14946,7 @@ lambda_param_with_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14484,7 +14957,7 @@ lambda_param_with_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14492,16 +14965,19 @@ lambda_param_with_default_rule(Parser *p) static NameDefaultPair* lambda_param_maybe_default_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } NameDefaultPair* _res = NULL; int _mark = p->mark; { // lambda_param default? 
',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default? ','")); @@ -14520,7 +14996,7 @@ lambda_param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14531,7 +15007,7 @@ lambda_param_maybe_default_rule(Parser *p) } { // lambda_param default? &':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_param_maybe_default[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param default? &':'")); @@ -14549,7 +15025,7 @@ lambda_param_maybe_default_rule(Parser *p) _res = _PyPegen_name_default_pair ( p , a , c , NULL ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14560,7 +15036,7 @@ lambda_param_maybe_default_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14568,16 +15044,19 @@ lambda_param_maybe_default_rule(Parser *p) static arg_ty lambda_param_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } arg_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -14586,7 +15065,7 @@ lambda_param_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> lambda_param[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -14598,7 +15077,7 @@ lambda_param_rule(Parser *p) D(fprintf(stderr, "%*c+ lambda_param[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -14608,7 +15087,7 @@ lambda_param_rule(Parser *p) _res = _PyAST_arg ( a -> v . Name . 
id , NULL , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14619,7 +15098,7 @@ lambda_param_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14627,20 +15106,23 @@ lambda_param_rule(Parser *p) static expr_ty strings_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, strings_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; { // STRING+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> strings[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "STRING+")); @@ -14653,7 +15135,7 @@ strings_rule(Parser *p) _res = _PyPegen_concatenate_strings ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14665,7 +15147,7 @@ strings_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, strings_type, _res); - D(p->level--); + p->level--; return _res; } @@ -14673,16 +15155,19 @@ strings_rule(Parser *p) static expr_ty list_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -14691,7 +15176,7 @@ list_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '[' star_named_expressions? ']' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> list[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' star_named_expressions? ']'")); @@ -14709,7 +15194,7 @@ list_rule(Parser *p) D(fprintf(stderr, "%*c+ list[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' star_named_expressions? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -14719,7 +15204,7 @@ list_rule(Parser *p) _res = _PyAST_List ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14730,7 +15215,7 @@ list_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14738,16 +15223,19 @@ list_rule(Parser *p) static expr_ty tuple_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -14756,7 +15244,7 @@ tuple_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '(' [star_named_expression ',' star_named_expressions?] ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> tuple[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' [star_named_expression ',' star_named_expressions?] 
')'")); @@ -14774,7 +15262,7 @@ tuple_rule(Parser *p) D(fprintf(stderr, "%*c+ tuple[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' [star_named_expression ',' star_named_expressions?] ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -14784,7 +15272,7 @@ tuple_rule(Parser *p) _res = _PyAST_Tuple ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14795,7 +15283,7 @@ tuple_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14803,16 +15291,19 @@ tuple_rule(Parser *p) static expr_ty set_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -14821,7 +15312,7 @@ set_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' star_named_expressions '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> set[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' star_named_expressions '}'")); @@ -14839,7 +15330,7 @@ set_rule(Parser *p) D(fprintf(stderr, "%*c+ set[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' star_named_expressions '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -14849,7 +15340,7 @@ set_rule(Parser *p) _res = _PyAST_Set ( a , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14860,7 +15351,7 @@ set_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14868,16 +15359,19 @@ set_rule(Parser *p) static expr_ty dict_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -14886,7 +15380,7 @@ dict_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' double_starred_kvpairs? '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dict[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' double_starred_kvpairs? '}'")); @@ -14904,7 +15398,7 @@ dict_rule(Parser *p) D(fprintf(stderr, "%*c+ dict[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' double_starred_kvpairs? 
'}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -14914,7 +15408,7 @@ dict_rule(Parser *p) _res = _PyAST_Dict ( CHECK ( asdl_expr_seq* , _PyPegen_get_keys ( p , a ) ) , CHECK ( asdl_expr_seq* , _PyPegen_get_values ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14925,7 +15419,7 @@ dict_rule(Parser *p) } { // '{' invalid_double_starred_kvpairs '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dict[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' invalid_double_starred_kvpairs '}'")); @@ -14950,7 +15444,7 @@ dict_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -14958,16 +15452,19 @@ dict_rule(Parser *p) static asdl_seq* double_starred_kvpairs_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.double_starred_kvpair+ ','? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ','?")); @@ -14984,7 +15481,7 @@ double_starred_kvpairs_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -14995,7 +15492,7 @@ double_starred_kvpairs_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15003,16 +15500,19 @@ double_starred_kvpairs_rule(Parser *p) static KeyValuePair* double_starred_kvpair_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } KeyValuePair* _res = NULL; int _mark = p->mark; { // '**' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> double_starred_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' bitwise_or")); @@ -15028,7 +15528,7 @@ double_starred_kvpair_rule(Parser *p) _res = _PyPegen_key_value_pair ( p , NULL , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15039,7 +15539,7 @@ double_starred_kvpair_rule(Parser *p) } { // kvpair if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> double_starred_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kvpair")); @@ -15058,7 +15558,7 @@ double_starred_kvpair_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15066,16 +15566,19 @@ double_starred_kvpair_rule(Parser *p) static KeyValuePair* kvpair_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } KeyValuePair* _res = NULL; int _mark = p->mark; { // expression ':' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' expression")); @@ -15094,7 +15597,7 @@ kvpair_rule(Parser *p) _res = _PyPegen_key_value_pair ( p , a , 
b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15105,7 +15608,7 @@ kvpair_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15113,16 +15616,19 @@ kvpair_rule(Parser *p) static asdl_comprehension_seq* for_if_clauses_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_comprehension_seq* _res = NULL; int _mark = p->mark; { // for_if_clause+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_if_clauses[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "for_if_clause+")); @@ -15135,7 +15641,7 @@ for_if_clauses_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15146,7 +15652,7 @@ for_if_clauses_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15157,16 +15663,19 @@ for_if_clauses_rule(Parser *p) static comprehension_ty for_if_clause_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } comprehension_ty _res = NULL; int _mark = p->mark; { // ASYNC 'for' star_targets 'in' ~ disjunction (('if' disjunction))* if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_if_clause[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); @@ -15197,7 +15706,7 @@ for_if_clause_rule(Parser *p) _res = CHECK_VERSION ( comprehension_ty , 6 , "Async comprehensions are" , _PyAST_comprehension ( a , b , c , 1 , p -> arena ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15206,13 +15715,13 @@ for_if_clause_rule(Parser *p) D(fprintf(stderr, "%*c%s for_if_clause[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); if (_cut_var) { - D(p->level--); + p->level--; return NULL; } } { // 'for' star_targets 'in' ~ disjunction (('if' disjunction))* if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_if_clause[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); @@ -15240,7 +15749,7 @@ for_if_clause_rule(Parser *p) _res = _PyAST_comprehension ( a , b , c , 0 , p -> arena ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15249,13 +15758,13 @@ for_if_clause_rule(Parser *p) D(fprintf(stderr, "%*c%s for_if_clause[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'for' star_targets 'in' ~ disjunction (('if' disjunction))*")); if (_cut_var) { - D(p->level--); + p->level--; return NULL; } } if (p->call_invalid_rules) { // invalid_for_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> for_if_clause[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_for_target")); @@ -15274,7 +15783,7 @@ for_if_clause_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15282,16 +15791,19 @@ for_if_clause_rule(Parser *p) static expr_ty listcomp_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15300,7 +15812,7 @@ listcomp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '[' named_expression for_if_clauses ']' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> listcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' named_expression for_if_clauses ']'")); @@ -15321,7 +15833,7 @@ listcomp_rule(Parser *p) D(fprintf(stderr, "%*c+ listcomp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' named_expression for_if_clauses ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -15331,7 +15843,7 @@ listcomp_rule(Parser *p) _res = _PyAST_ListComp ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15342,7 +15854,7 @@ listcomp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_comprehension if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> listcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_comprehension")); @@ -15361,7 +15873,7 @@ listcomp_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15369,16 +15881,19 @@ listcomp_rule(Parser *p) static expr_ty setcomp_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15387,7 +15902,7 @@ setcomp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' named_expression for_if_clauses '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> setcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' named_expression for_if_clauses '}'")); @@ -15408,7 +15923,7 @@ setcomp_rule(Parser *p) D(fprintf(stderr, "%*c+ setcomp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' named_expression for_if_clauses '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -15418,7 +15933,7 @@ setcomp_rule(Parser *p) _res = _PyAST_SetComp ( a , b , EXTRA ); if 
(_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15429,7 +15944,7 @@ setcomp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_comprehension if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> setcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_comprehension")); @@ -15448,7 +15963,7 @@ setcomp_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15458,16 +15973,19 @@ setcomp_rule(Parser *p) static expr_ty genexp_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15476,7 +15994,7 @@ genexp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '(' (assignment_expression | expression !':=') for_if_clauses ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> genexp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' (assignment_expression | expression !':=') for_if_clauses ')'")); @@ -15497,7 +16015,7 @@ genexp_rule(Parser *p) D(fprintf(stderr, "%*c+ genexp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' (assignment_expression | expression !':=') for_if_clauses ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -15507,7 +16025,7 @@ genexp_rule(Parser *p) _res = _PyAST_GeneratorExp ( a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15518,7 +16036,7 @@ genexp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_comprehension if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> genexp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_comprehension")); @@ -15537,7 +16055,7 @@ genexp_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15545,16 +16063,19 @@ genexp_rule(Parser *p) static expr_ty dictcomp_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15563,7 +16084,7 @@ dictcomp_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '{' kvpair for_if_clauses '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dictcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' kvpair for_if_clauses '}'")); @@ -15584,7 +16105,7 @@ dictcomp_rule(Parser *p) D(fprintf(stderr, "%*c+ dictcomp[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{' kvpair for_if_clauses '}'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -15594,7 +16115,7 @@ dictcomp_rule(Parser *p) _res 
= _PyAST_DictComp ( a -> key , a -> value , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15605,7 +16126,7 @@ dictcomp_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_dict_comprehension if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> dictcomp[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_dict_comprehension")); @@ -15624,7 +16145,7 @@ dictcomp_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15632,20 +16153,23 @@ dictcomp_rule(Parser *p) static expr_ty arguments_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, arguments_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; { // args ','? &')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ','? &')'")); @@ -15664,7 +16188,7 @@ arguments_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15675,7 +16199,7 @@ arguments_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_arguments if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_arguments")); @@ -15695,7 +16219,7 @@ arguments_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, arguments_type, _res); - D(p->level--); + p->level--; return _res; } @@ -15705,16 +16229,19 @@ arguments_rule(Parser *p) static expr_ty args_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15723,7 +16250,7 @@ args_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // ','.(starred_expression | (assignment_expression | expression !':=') !'=')+ [',' kwargs] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> args[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.(starred_expression | (assignment_expression | expression !':=') !'=')+ [',' kwargs]")); @@ -15738,7 +16265,7 @@ args_rule(Parser *p) D(fprintf(stderr, "%*c+ args[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','.(starred_expression | (assignment_expression | expression !':=') !'=')+ [',' kwargs]")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -15748,7 +16275,7 @@ args_rule(Parser *p) _res = _PyPegen_collect_call_seqs ( p , a , b , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15759,7 +16286,7 @@ args_rule(Parser *p) } { // kwargs if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> args[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwargs")); 
@@ -15771,7 +16298,7 @@ args_rule(Parser *p) D(fprintf(stderr, "%*c+ args[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "kwargs")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -15781,7 +16308,7 @@ args_rule(Parser *p) _res = _PyAST_Call ( _PyPegen_dummy_name ( p ) , CHECK_NULL_ALLOWED ( asdl_expr_seq* , _PyPegen_seq_extract_starred_exprs ( p , a ) ) , CHECK_NULL_ALLOWED ( asdl_keyword_seq* , _PyPegen_seq_delete_starred_exprs ( p , a ) ) , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15792,7 +16319,7 @@ args_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15803,16 +16330,19 @@ args_rule(Parser *p) static asdl_seq* kwargs_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq* _res = NULL; int _mark = p->mark; { // ','.kwarg_or_starred+ ',' ','.kwarg_or_double_starred+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwargs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.kwarg_or_starred+ ',' ','.kwarg_or_double_starred+")); @@ -15831,7 +16361,7 @@ kwargs_rule(Parser *p) _res = _PyPegen_join_sequences ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -15842,7 +16372,7 @@ kwargs_rule(Parser *p) } { // ','.kwarg_or_starred+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwargs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.kwarg_or_starred+")); @@ -15861,7 +16391,7 @@ kwargs_rule(Parser *p) } { // ','.kwarg_or_double_starred+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwargs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.kwarg_or_double_starred+")); @@ -15880,7 +16410,7 @@ kwargs_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15888,16 +16418,19 @@ kwargs_rule(Parser *p) static expr_ty starred_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15906,7 +16439,7 @@ starred_expression_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> starred_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' expression")); @@ -15921,7 +16454,7 @@ starred_expression_rule(Parser *p) D(fprintf(stderr, "%*c+ starred_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -15931,7 +16464,7 @@ starred_expression_rule(Parser *p) _res = _PyAST_Starred ( a , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - 
D(p->level--); + p->level--; return NULL; } goto done; @@ -15942,7 +16475,7 @@ starred_expression_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -15950,16 +16483,19 @@ starred_expression_rule(Parser *p) static KeywordOrStarred* kwarg_or_starred_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } KeywordOrStarred* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -15968,7 +16504,7 @@ kwarg_or_starred_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_kwarg if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwarg_or_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_kwarg")); @@ -15987,7 +16523,7 @@ kwarg_or_starred_rule(Parser *p) } { // NAME '=' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwarg_or_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); @@ -16005,7 +16541,7 @@ kwarg_or_starred_rule(Parser *p) D(fprintf(stderr, "%*c+ kwarg_or_starred[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16015,7 +16551,7 @@ kwarg_or_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , CHECK ( keyword_ty , _PyAST_keyword ( a -> v . Name . 
id , b , EXTRA ) ) , 1 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16026,7 +16562,7 @@ kwarg_or_starred_rule(Parser *p) } { // starred_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwarg_or_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); @@ -16039,7 +16575,7 @@ kwarg_or_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , a , 0 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16050,7 +16586,7 @@ kwarg_or_starred_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16058,16 +16594,19 @@ kwarg_or_starred_rule(Parser *p) static KeywordOrStarred* kwarg_or_double_starred_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } KeywordOrStarred* _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16076,7 +16615,7 @@ kwarg_or_double_starred_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro if (p->call_invalid_rules) { // invalid_kwarg if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwarg_or_double_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_kwarg")); @@ -16095,7 +16634,7 @@ kwarg_or_double_starred_rule(Parser *p) } { // NAME '=' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwarg_or_double_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); @@ -16113,7 +16652,7 @@ kwarg_or_double_starred_rule(Parser *p) D(fprintf(stderr, "%*c+ kwarg_or_double_starred[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME '=' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16123,7 +16662,7 @@ kwarg_or_double_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , CHECK ( keyword_ty , _PyAST_keyword ( a -> v . Name . 
id , b , EXTRA ) ) , 1 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16134,7 +16673,7 @@ kwarg_or_double_starred_rule(Parser *p) } { // '**' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> kwarg_or_double_starred[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' expression")); @@ -16149,7 +16688,7 @@ kwarg_or_double_starred_rule(Parser *p) D(fprintf(stderr, "%*c+ kwarg_or_double_starred[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16159,7 +16698,7 @@ kwarg_or_double_starred_rule(Parser *p) _res = _PyPegen_keyword_or_starred ( p , CHECK ( keyword_ty , _PyAST_keyword ( NULL , a , EXTRA ) ) , 1 ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16170,7 +16709,7 @@ kwarg_or_double_starred_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16178,16 +16717,19 @@ kwarg_or_double_starred_rule(Parser *p) static expr_ty star_targets_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16196,7 +16738,7 @@ star_targets_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // star_target !',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target !','")); @@ -16211,7 +16753,7 @@ star_targets_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16222,7 +16764,7 @@ star_targets_rule(Parser *p) } { // star_target ((',' star_target))* ','? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target ((',' star_target))* ','?")); @@ -16241,7 +16783,7 @@ star_targets_rule(Parser *p) D(fprintf(stderr, "%*c+ star_targets[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_target ((',' star_target))* ','?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16251,7 +16793,7 @@ star_targets_rule(Parser *p) _res = _PyAST_Tuple ( CHECK ( asdl_expr_seq* , _PyPegen_seq_insert_in_front ( p , a , b ) ) , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16262,7 +16804,7 @@ star_targets_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16270,16 +16812,19 @@ star_targets_rule(Parser *p) static asdl_expr_seq* star_targets_list_seq_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.star_target+ ','? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_targets_list_seq[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.star_target+ ','?")); @@ -16296,7 +16841,7 @@ star_targets_list_seq_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16307,7 +16852,7 @@ star_targets_list_seq_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16315,16 +16860,19 @@ star_targets_list_seq_rule(Parser *p) static asdl_expr_seq* star_targets_tuple_seq_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // star_target ((',' star_target))+ ','? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_targets_tuple_seq[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target ((',' star_target))+ ','?")); @@ -16344,7 +16892,7 @@ star_targets_tuple_seq_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_insert_in_front ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16355,7 +16903,7 @@ star_targets_tuple_seq_rule(Parser *p) } { // star_target ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_targets_tuple_seq[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target ','")); @@ -16371,7 +16919,7 @@ star_targets_tuple_seq_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16382,7 +16930,7 @@ star_targets_tuple_seq_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16390,20 +16938,23 @@ star_targets_tuple_seq_rule(Parser *p) static expr_ty star_target_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, star_target_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16412,7 +16963,7 @@ star_target_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // '*' (!'*' star_target) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (!'*' star_target)")); @@ -16427,7 +16978,7 @@ star_target_rule(Parser *p) D(fprintf(stderr, "%*c+ star_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' (!'*' star_target)")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16437,7 +16988,7 @@ star_target_rule(Parser *p) _res = _PyAST_Starred ( CHECK ( expr_ty , _PyPegen_set_expr_context ( p , a , Store ) ) , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16448,7 +16999,7 @@ star_target_rule(Parser *p) } { // target_with_star_atom if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "target_with_star_atom")); @@ -16468,7 +17019,7 @@ star_target_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, star_target_type, _res); - D(p->level--); + p->level--; return _res; } @@ -16479,20 +17030,23 @@ star_target_rule(Parser *p) static expr_ty target_with_star_atom_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, target_with_star_atom_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && 
_PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16501,7 +17055,7 @@ target_with_star_atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME !t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> target_with_star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); @@ -16521,7 +17075,7 @@ target_with_star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ target_with_star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16531,7 +17085,7 @@ target_with_star_atom_rule(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16542,7 +17096,7 @@ target_with_star_atom_rule(Parser *p) } { // t_primary '[' slices ']' !t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> target_with_star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); @@ -16565,7 +17119,7 @@ target_with_star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ target_with_star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16575,7 +17129,7 @@ target_with_star_atom_rule(Parser *p) _res = _PyAST_Subscript ( a , b , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16586,7 +17140,7 @@ target_with_star_atom_rule(Parser *p) } { // star_atom if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> target_with_star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_atom")); @@ -16606,7 +17160,7 @@ target_with_star_atom_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, target_with_star_atom_type, _res); - D(p->level--); + p->level--; return _res; } @@ -16618,16 +17172,19 @@ target_with_star_atom_rule(Parser *p) static expr_ty star_atom_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16636,7 +17193,7 @@ star_atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -16649,7 +17206,7 @@ star_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16660,7 +17217,7 @@ star_atom_rule(Parser *p) } { // '(' 
target_with_star_atom ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' target_with_star_atom ')'")); @@ -16679,7 +17236,7 @@ star_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16690,7 +17247,7 @@ star_atom_rule(Parser *p) } { // '(' star_targets_tuple_seq? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' star_targets_tuple_seq? ')'")); @@ -16708,7 +17265,7 @@ star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' star_targets_tuple_seq? ')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16718,7 +17275,7 @@ star_atom_rule(Parser *p) _res = _PyAST_Tuple ( a , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16729,7 +17286,7 @@ star_atom_rule(Parser *p) } { // '[' star_targets_list_seq? ']' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> star_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' star_targets_list_seq? ']'")); @@ -16747,7 +17304,7 @@ star_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ star_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' star_targets_list_seq? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16757,7 +17314,7 @@ star_atom_rule(Parser *p) _res = _PyAST_List ( a , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16768,7 +17325,7 @@ star_atom_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16776,16 +17333,19 @@ star_atom_rule(Parser *p) static expr_ty single_target_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // single_subscript_attribute_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> single_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "single_subscript_attribute_target")); @@ -16804,7 +17364,7 @@ single_target_rule(Parser *p) } { // NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> single_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -16817,7 +17377,7 @@ single_target_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Store ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16828,7 +17388,7 @@ single_target_rule(Parser *p) } { // '(' single_target ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> single_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' single_target ')'")); @@ -16847,7 +17407,7 @@ single_target_rule(Parser *p) _res = a; 
if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16858,7 +17418,7 @@ single_target_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16868,16 +17428,19 @@ single_target_rule(Parser *p) static expr_ty single_subscript_attribute_target_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -16886,7 +17449,7 @@ single_subscript_attribute_target_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME !t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> single_subscript_attribute_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); @@ -16906,7 +17469,7 @@ single_subscript_attribute_target_rule(Parser *p) D(fprintf(stderr, "%*c+ single_subscript_attribute_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16916,7 +17479,7 @@ single_subscript_attribute_target_rule(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16927,7 +17490,7 @@ single_subscript_attribute_target_rule(Parser *p) } { // t_primary '[' slices ']' !t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> single_subscript_attribute_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); @@ -16950,7 +17513,7 @@ single_subscript_attribute_target_rule(Parser *p) D(fprintf(stderr, "%*c+ single_subscript_attribute_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -16960,7 +17523,7 @@ single_subscript_attribute_target_rule(Parser *p) _res = _PyAST_Subscript ( a , b , Store , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -16971,7 +17534,7 @@ single_subscript_attribute_target_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -16986,10 +17549,13 @@ static expr_ty t_primary_raw(Parser *); static expr_ty t_primary_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, t_primary_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; @@ -16997,37 +17563,42 @@ t_primary_rule(Parser *p) while (1) { int tmpvar_9 = _PyPegen_update_memo(p, _mark, t_primary_type, _res); if (tmpvar_9) { - D(p->level--); + p->level--; return _res; } p->mark = _mark; p->in_raw_rule++; void *_raw = t_primary_raw(p); p->in_raw_rule--; - if 
(p->error_indicator) + if (p->error_indicator) { + p->level--; return NULL; + } if (_raw == NULL || p->mark <= _resmark) break; _resmark = p->mark; _res = _raw; } p->mark = _resmark; - D(p->level--); + p->level--; return _res; } static expr_ty t_primary_raw(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17036,7 +17607,7 @@ t_primary_raw(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME &t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME &t_lookahead")); @@ -17056,7 +17627,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17066,7 +17637,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . id , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17077,7 +17648,7 @@ t_primary_raw(Parser *p) } { // t_primary '[' slices ']' &t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' &t_lookahead")); @@ -17100,7 +17671,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17110,7 +17681,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Subscript ( a , b , Load , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17121,7 +17692,7 @@ t_primary_raw(Parser *p) } { // t_primary genexp &t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary genexp &t_lookahead")); @@ -17138,7 +17709,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary genexp &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17148,7 +17719,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Call ( a , CHECK ( asdl_expr_seq* , ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , b ) ) , NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17159,7 +17730,7 @@ t_primary_raw(Parser *p) } { // t_primary '(' arguments? 
')' &t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '(' arguments? ')' &t_lookahead")); @@ -17182,7 +17753,7 @@ t_primary_raw(Parser *p) D(fprintf(stderr, "%*c+ t_primary[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '(' arguments? ')' &t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17192,7 +17763,7 @@ t_primary_raw(Parser *p) _res = _PyAST_Call ( a , ( b ) ? ( ( expr_ty ) b ) -> v . Call . args : NULL , ( b ) ? ( ( expr_ty ) b ) -> v . Call . keywords : NULL , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17203,7 +17774,7 @@ t_primary_raw(Parser *p) } { // atom &t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_primary[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "atom &t_lookahead")); @@ -17218,7 +17789,7 @@ t_primary_raw(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17229,7 +17800,7 @@ t_primary_raw(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -17237,16 +17808,19 @@ t_primary_raw(Parser *p) static void * t_lookahead_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_lookahead[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -17265,7 +17839,7 @@ t_lookahead_rule(Parser *p) } { // '[' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_lookahead[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -17284,7 +17858,7 @@ t_lookahead_rule(Parser *p) } { // '.' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> t_lookahead[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -17303,7 +17877,7 @@ t_lookahead_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -17311,16 +17885,19 @@ t_lookahead_rule(Parser *p) static asdl_expr_seq* del_targets_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.del_target+ ','? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.del_target+ ','?")); @@ -17337,7 +17914,7 @@ del_targets_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17348,7 +17925,7 @@ del_targets_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -17359,20 +17936,23 @@ del_targets_rule(Parser *p) static expr_ty del_target_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; if (_PyPegen_is_memoized(p, del_target_type, &_res)) { - D(p->level--); + p->level--; return _res; } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17381,7 +17961,7 @@ del_target_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // t_primary '.' NAME !t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); @@ -17401,7 +17981,7 @@ del_target_rule(Parser *p) D(fprintf(stderr, "%*c+ del_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '.' NAME !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17411,7 +17991,7 @@ del_target_rule(Parser *p) _res = _PyAST_Attribute ( a , b -> v . Name . 
id , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17422,7 +18002,7 @@ del_target_rule(Parser *p) } { // t_primary '[' slices ']' !t_lookahead if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); @@ -17445,7 +18025,7 @@ del_target_rule(Parser *p) D(fprintf(stderr, "%*c+ del_target[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "t_primary '[' slices ']' !t_lookahead")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17455,7 +18035,7 @@ del_target_rule(Parser *p) _res = _PyAST_Subscript ( a , b , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17466,7 +18046,7 @@ del_target_rule(Parser *p) } { // del_t_atom if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "del_t_atom")); @@ -17486,7 +18066,7 @@ del_target_rule(Parser *p) _res = NULL; done: _PyPegen_insert_memo(p, _mark, del_target_type, _res); - D(p->level--); + p->level--; return _res; } @@ -17494,16 +18074,19 @@ del_target_rule(Parser *p) static expr_ty del_t_atom_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -17512,7 +18095,7 @@ del_t_atom_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME")); @@ -17525,7 +18108,7 @@ del_t_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Del ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17536,7 +18119,7 @@ del_t_atom_rule(Parser *p) } { // '(' del_target ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' del_target ')'")); @@ -17555,7 +18138,7 @@ del_t_atom_rule(Parser *p) _res = _PyPegen_set_expr_context ( p , a , Del ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17566,7 +18149,7 @@ del_t_atom_rule(Parser *p) } { // '(' del_targets? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' del_targets? ')'")); @@ -17584,7 +18167,7 @@ del_t_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ del_t_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' del_targets? 
')'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17594,7 +18177,7 @@ del_t_atom_rule(Parser *p) _res = _PyAST_Tuple ( a , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17605,7 +18188,7 @@ del_t_atom_rule(Parser *p) } { // '[' del_targets? ']' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> del_t_atom[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'[' del_targets? ']'")); @@ -17623,7 +18206,7 @@ del_t_atom_rule(Parser *p) D(fprintf(stderr, "%*c+ del_t_atom[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'[' del_targets? ']'")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -17633,7 +18216,7 @@ del_t_atom_rule(Parser *p) _res = _PyAST_List ( a , Del , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17644,7 +18227,7 @@ del_t_atom_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -17659,16 +18242,19 @@ del_t_atom_rule(Parser *p) static asdl_expr_seq* type_expressions_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_expr_seq* _res = NULL; int _mark = p->mark; { // ','.expression+ ',' '*' expression ',' '**' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+ ',' '*' expression ',' '**' expression")); @@ -17699,7 +18285,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , CHECK ( asdl_seq* , _PyPegen_seq_append_to_end ( p , a , b ) ) , c ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17710,7 +18296,7 @@ type_expressions_rule(Parser *p) } { // ','.expression+ ',' '*' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+ ',' '*' expression")); @@ -17732,7 +18318,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17743,7 +18329,7 @@ type_expressions_rule(Parser *p) } { // ','.expression+ ',' '**' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+ ',' '**' expression")); @@ -17765,7 +18351,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17776,7 +18362,7 @@ type_expressions_rule(Parser *p) } { // '*' expression ',' '**' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' 
', _mark, p->mark, "'*' expression ',' '**' expression")); @@ -17801,7 +18387,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_seq_append_to_end ( p , CHECK ( asdl_seq* , _PyPegen_singleton_seq ( p , a ) ) , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17812,7 +18398,7 @@ type_expressions_rule(Parser *p) } { // '*' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' expression")); @@ -17828,7 +18414,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17839,7 +18425,7 @@ type_expressions_rule(Parser *p) } { // '**' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**' expression")); @@ -17855,7 +18441,7 @@ type_expressions_rule(Parser *p) _res = ( asdl_expr_seq* ) _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17866,7 +18452,7 @@ type_expressions_rule(Parser *p) } { // ','.expression+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> type_expressions[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.expression+")); @@ -17879,7 +18465,7 @@ type_expressions_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17890,7 +18476,7 @@ type_expressions_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -17901,16 +18487,19 @@ type_expressions_rule(Parser *p) static Token* func_type_comment_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } Token* _res = NULL; int _mark = p->mark; { // NEWLINE TYPE_COMMENT &(NEWLINE INDENT) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> func_type_comment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE TYPE_COMMENT &(NEWLINE INDENT)")); @@ -17928,7 +18517,7 @@ func_type_comment_rule(Parser *p) _res = t; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -17939,7 +18528,7 @@ func_type_comment_rule(Parser *p) } if (p->call_invalid_rules) { // invalid_double_type_comments if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> func_type_comment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_double_type_comments")); @@ -17958,7 +18547,7 @@ func_type_comment_rule(Parser *p) } { // TYPE_COMMENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> func_type_comment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "TYPE_COMMENT")); @@ -17977,7 +18566,7 @@ func_type_comment_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -17991,16 +18580,19 @@ func_type_comment_rule(Parser *p) static void * invalid_arguments_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + 
p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // args ',' '*' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ',' '*'")); @@ -18019,7 +18611,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "iterable argument unpacking follows keyword argument unpacking" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18030,7 +18622,7 @@ invalid_arguments_rule(Parser *p) } { // expression for_if_clauses ',' [args | expression for_if_clauses] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses ',' [args | expression for_if_clauses]")); @@ -18053,7 +18645,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , PyPegen_last_item ( b , comprehension_ty ) -> target , "Generator expression must be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18064,7 +18656,7 @@ invalid_arguments_rule(Parser *p) } { // NAME '=' expression for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression for_if_clauses")); @@ -18086,7 +18678,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Maybe you meant '==' or ':=' instead of '='?" 
); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18097,7 +18689,7 @@ invalid_arguments_rule(Parser *p) } { // args for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args for_if_clauses")); @@ -18113,7 +18705,7 @@ invalid_arguments_rule(Parser *p) _res = _PyPegen_nonparen_genexp_in_call ( p , a , b ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18124,7 +18716,7 @@ invalid_arguments_rule(Parser *p) } { // args ',' expression for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ',' expression for_if_clauses")); @@ -18146,7 +18738,7 @@ invalid_arguments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , asdl_seq_GET ( b , b -> size - 1 ) -> target , "Generator expression must be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18157,7 +18749,7 @@ invalid_arguments_rule(Parser *p) } { // args ',' args if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ',' args")); @@ -18176,7 +18768,7 @@ invalid_arguments_rule(Parser *p) _res = _PyPegen_arguments_parsing_error ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18187,7 +18779,7 @@ invalid_arguments_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18198,16 +18790,19 @@ invalid_arguments_rule(Parser *p) static void * invalid_kwarg_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ('True' | 'False' | 'None') '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_kwarg[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('True' | 'False' | 'None') '='")); @@ -18223,7 +18818,7 @@ invalid_kwarg_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "cannot assign to %s" , PyBytes_AS_STRING ( a -> bytes ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18234,7 +18829,7 @@ invalid_kwarg_rule(Parser *p) } { // NAME '=' expression for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_kwarg[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' expression for_if_clauses")); @@ -18256,7 +18851,7 @@ invalid_kwarg_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Maybe you meant '==' or ':=' instead of '='?" 
); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18267,7 +18862,7 @@ invalid_kwarg_rule(Parser *p) } { // !(NAME '=') expression '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_kwarg[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!(NAME '=') expression '='")); @@ -18285,7 +18880,7 @@ invalid_kwarg_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "expression cannot contain assignment, perhaps you meant \"==\"?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18296,7 +18891,7 @@ invalid_kwarg_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18307,16 +18902,19 @@ invalid_kwarg_rule(Parser *p) static expr_ty expression_without_invalid_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } int _start_lineno = p->tokens[_mark]->lineno; @@ -18325,7 +18923,7 @@ expression_without_invalid_rule(Parser *p) UNUSED(_start_col_offset); // Only used by EXTRA macro { // disjunction 'if' disjunction 'else' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression_without_invalid[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); @@ -18349,7 +18947,7 @@ expression_without_invalid_rule(Parser *p) D(fprintf(stderr, "%*c+ expression_without_invalid[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction 'else' expression")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { - D(p->level--); + p->level--; return NULL; } int _end_lineno = _token->end_lineno; @@ -18359,7 +18957,7 @@ expression_without_invalid_rule(Parser *p) _res = _PyAST_IfExp ( b , a , c , EXTRA ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18370,7 +18968,7 @@ expression_without_invalid_rule(Parser *p) } { // disjunction if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression_without_invalid[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction")); @@ -18389,7 +18987,7 @@ expression_without_invalid_rule(Parser *p) } { // lambdef if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> expression_without_invalid[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambdef")); @@ -18408,7 +19006,7 @@ expression_without_invalid_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18416,16 +19014,19 @@ expression_without_invalid_rule(Parser *p) static void * invalid_legacy_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // NAME !'(' star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_legacy_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME !'(' 
star_expressions")); @@ -18443,7 +19044,7 @@ invalid_legacy_expression_rule(Parser *p) _res = _PyPegen_check_legacy_stmt ( p , a ) ? RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "Missing parentheses in call to '%U'. Did you mean %U(...)?" , a -> v . Name . id , a -> v . Name . id ) : NULL; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18454,7 +19055,7 @@ invalid_legacy_expression_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18464,16 +19065,19 @@ invalid_legacy_expression_rule(Parser *p) static void * invalid_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // !(NAME STRING | SOFT_KEYWORD) disjunction expression_without_invalid if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!(NAME STRING | SOFT_KEYWORD) disjunction expression_without_invalid")); @@ -18491,7 +19095,7 @@ invalid_expression_rule(Parser *p) _res = _PyPegen_check_legacy_stmt ( p , a ) ? NULL : p -> tokens [p -> mark - 1] -> level == 0 ? NULL : RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Perhaps you forgot a comma?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18502,7 +19106,7 @@ invalid_expression_rule(Parser *p) } { // disjunction 'if' disjunction !('else' | ':') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction !('else' | ':')")); @@ -18523,7 +19127,7 @@ invalid_expression_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "expected 'else' after 'if' expression" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18534,7 +19138,7 @@ invalid_expression_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18545,16 +19149,19 @@ invalid_expression_rule(Parser *p) static void * invalid_named_expression_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // expression ':=' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':=' expression")); @@ -18573,7 +19180,7 @@ invalid_named_expression_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use assignment expressions with %s" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18584,7 +19191,7 @@ invalid_named_expression_rule(Parser *p) } { // NAME '=' bitwise_or !('=' | ':=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '=' bitwise_or !('=' | ':=')")); @@ -18605,7 +19212,7 @@ invalid_named_expression_rule(Parser *p) _res = p -> in_raw_rule ? 
NULL : RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "invalid syntax. Maybe you meant '==' or ':=' instead of '='?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18616,7 +19223,7 @@ invalid_named_expression_rule(Parser *p) } { // !(list | tuple | genexp | 'True' | 'None' | 'False') bitwise_or '=' bitwise_or !('=' | ':=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_named_expression[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!(list | tuple | genexp | 'True' | 'None' | 'False') bitwise_or '=' bitwise_or !('=' | ':=')")); @@ -18639,7 +19246,7 @@ invalid_named_expression_rule(Parser *p) _res = p -> in_raw_rule ? NULL : RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot assign to %s here. Maybe you meant '==' instead of '='?" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18650,7 +19257,7 @@ invalid_named_expression_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18664,16 +19271,19 @@ invalid_named_expression_rule(Parser *p) static void * invalid_assignment_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // invalid_ann_assign_target ':' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "invalid_ann_assign_target ':' expression")); @@ -18692,7 +19302,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "only single target (not %s) can be annotated" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18703,7 +19313,7 @@ invalid_assignment_rule(Parser *p) } { // star_named_expression ',' star_named_expressions* ':' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions* ':' expression")); @@ -18728,7 +19338,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "only single target (not tuple) can be annotated" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18739,7 +19349,7 @@ invalid_assignment_rule(Parser *p) } { // expression ':' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' expression")); @@ -18758,7 +19368,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "illegal target for annotation" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18769,7 +19379,7 @@ invalid_assignment_rule(Parser *p) } { // ((star_targets '='))* star_expressions '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((star_targets '='))* star_expressions '='")); @@ 
-18788,7 +19398,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( STAR_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18799,7 +19409,7 @@ invalid_assignment_rule(Parser *p) } { // ((star_targets '='))* yield_expr '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((star_targets '='))* yield_expr '='")); @@ -18818,7 +19428,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "assignment to yield expression not possible" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18829,7 +19439,7 @@ invalid_assignment_rule(Parser *p) } { // star_expressions augassign (yield_expr | star_expressions) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions augassign (yield_expr | star_expressions)")); @@ -18848,7 +19458,7 @@ invalid_assignment_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "'%s' is an illegal expression for augmented assignment" , _PyPegen_get_expr_name ( a ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18859,7 +19469,7 @@ invalid_assignment_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18867,16 +19477,19 @@ invalid_assignment_rule(Parser *p) static expr_ty invalid_ann_assign_target_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } expr_ty _res = NULL; int _mark = p->mark; { // list if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_ann_assign_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); @@ -18895,7 +19508,7 @@ invalid_ann_assign_target_rule(Parser *p) } { // tuple if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_ann_assign_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); @@ -18914,7 +19527,7 @@ invalid_ann_assign_target_rule(Parser *p) } { // '(' invalid_ann_assign_target ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_ann_assign_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' invalid_ann_assign_target ')'")); @@ -18933,7 +19546,7 @@ invalid_ann_assign_target_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18944,7 +19557,7 @@ invalid_ann_assign_target_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18952,16 +19565,19 @@ invalid_ann_assign_target_rule(Parser *p) static void * invalid_del_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'del' star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_del_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, 
"'del' star_expressions")); @@ -18977,7 +19593,7 @@ invalid_del_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( DEL_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -18988,7 +19604,7 @@ invalid_del_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -18996,16 +19612,19 @@ invalid_del_stmt_rule(Parser *p) static void * invalid_block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE !INDENT")); @@ -19020,7 +19639,7 @@ invalid_block_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19031,7 +19650,7 @@ invalid_block_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19042,16 +19661,19 @@ invalid_block_rule(Parser *p) static void * invalid_comprehension_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ('[' | '(' | '{') starred_expression for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '(' | '{') starred_expression for_if_clauses")); @@ -19070,7 +19692,7 @@ invalid_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "iterable unpacking cannot be used in comprehension" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19081,7 +19703,7 @@ invalid_comprehension_rule(Parser *p) } { // ('[' | '{') star_named_expression ',' star_named_expressions for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '{') star_named_expression ',' star_named_expressions for_if_clauses")); @@ -19106,7 +19728,7 @@ invalid_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , PyPegen_last_item ( b , expr_ty ) , "did you forget parentheses around the comprehension target?" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19117,7 +19739,7 @@ invalid_comprehension_rule(Parser *p) } { // ('[' | '{') star_named_expression ',' for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '{') star_named_expression ',' for_if_clauses")); @@ -19139,7 +19761,7 @@ invalid_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "did you forget parentheses around the comprehension target?" 
); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19150,7 +19772,7 @@ invalid_comprehension_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19158,16 +19780,19 @@ invalid_comprehension_rule(Parser *p) static void * invalid_dict_comprehension_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '{' '**' bitwise_or for_if_clauses '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_dict_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{' '**' bitwise_or for_if_clauses '}'")); @@ -19192,7 +19817,7 @@ invalid_dict_comprehension_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "dict unpacking cannot be used in dict comprehension" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19203,7 +19828,7 @@ invalid_dict_comprehension_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19213,16 +19838,19 @@ invalid_dict_comprehension_rule(Parser *p) static void * invalid_parameters_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // param_no_default* invalid_parameters_helper param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* invalid_parameters_helper param_no_default")); @@ -19241,7 +19869,7 @@ invalid_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "non-default argument follows default argument" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19252,7 +19880,7 @@ invalid_parameters_rule(Parser *p) } { // param_no_default* '(' param_no_default+ ','? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* '(' param_no_default+ ','? 
')'")); @@ -19278,7 +19906,7 @@ invalid_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "Function parameters cannot be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19289,7 +19917,7 @@ invalid_parameters_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19297,16 +19925,19 @@ invalid_parameters_rule(Parser *p) static void * invalid_parameters_helper_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // slash_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default")); @@ -19319,7 +19950,7 @@ invalid_parameters_helper_rule(Parser *p) _res = _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19330,7 +19961,7 @@ invalid_parameters_helper_rule(Parser *p) } { // param_with_default+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default+")); @@ -19349,7 +19980,7 @@ invalid_parameters_helper_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19359,16 +19990,19 @@ invalid_parameters_helper_rule(Parser *p) static void * invalid_lambda_parameters_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // lambda_param_no_default* invalid_lambda_parameters_helper lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* invalid_lambda_parameters_helper lambda_param_no_default")); @@ -19387,7 +20021,7 @@ invalid_lambda_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "non-default argument follows default argument" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19398,7 +20032,7 @@ invalid_lambda_parameters_rule(Parser *p) } { // lambda_param_no_default* '(' ','.lambda_param+ ','? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* '(' ','.lambda_param+ ','? 
')'")); @@ -19424,7 +20058,7 @@ invalid_lambda_parameters_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( a , b , "Lambda expression parameters cannot be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19435,7 +20069,7 @@ invalid_lambda_parameters_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19445,16 +20079,19 @@ invalid_lambda_parameters_rule(Parser *p) static void * invalid_lambda_parameters_helper_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // lambda_slash_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); @@ -19467,7 +20104,7 @@ invalid_lambda_parameters_helper_rule(Parser *p) _res = _PyPegen_singleton_seq ( p , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19478,7 +20115,7 @@ invalid_lambda_parameters_helper_rule(Parser *p) } { // lambda_param_with_default+ if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default+")); @@ -19497,7 +20134,7 @@ invalid_lambda_parameters_helper_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19505,16 +20142,19 @@ invalid_lambda_parameters_helper_rule(Parser *p) static void * invalid_star_etc_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '*' (')' | ',' (')' | '**')) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (')' | ',' (')' | '**'))")); @@ -19530,7 +20170,7 @@ invalid_star_etc_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "named arguments must follow bare *" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19541,7 +20181,7 @@ invalid_star_etc_rule(Parser *p) } { // '*' ',' TYPE_COMMENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' ',' TYPE_COMMENT")); @@ -19560,7 +20200,7 @@ invalid_star_etc_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "bare * has associated type comment" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19571,7 +20211,7 @@ invalid_star_etc_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19579,16 +20219,19 @@ invalid_star_etc_rule(Parser *p) static void * invalid_lambda_star_etc_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '*' (':' | ',' (':' | '**')) if 
(p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (':' | ',' (':' | '**'))")); @@ -19604,7 +20247,7 @@ invalid_lambda_star_etc_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "named arguments must follow bare *" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19615,7 +20258,7 @@ invalid_lambda_star_etc_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19623,16 +20266,19 @@ invalid_lambda_star_etc_rule(Parser *p) static void * invalid_double_type_comments_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // TYPE_COMMENT NEWLINE TYPE_COMMENT NEWLINE INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_double_type_comments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "TYPE_COMMENT NEWLINE TYPE_COMMENT NEWLINE INDENT")); @@ -19657,7 +20303,7 @@ invalid_double_type_comments_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "Cannot have two type comments on def" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19668,7 +20314,7 @@ invalid_double_type_comments_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19676,16 +20322,19 @@ invalid_double_type_comments_rule(Parser *p) static void * invalid_with_item_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // expression 'as' expression &(',' | ')' | ':') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_with_item[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression 'as' expression &(',' | ')' | ':')")); @@ -19706,7 +20355,7 @@ invalid_with_item_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( STAR_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19717,7 +20366,7 @@ invalid_with_item_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19725,16 +20374,19 @@ invalid_with_item_rule(Parser *p) static void * invalid_for_target_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'for' star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_for_target[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 
'for' star_expressions")); @@ -19754,7 +20406,7 @@ invalid_for_target_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_INVALID_TARGET ( FOR_TARGETS , a ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19765,7 +20417,7 @@ invalid_for_target_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19773,16 +20425,19 @@ invalid_for_target_rule(Parser *p) static void * invalid_group_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' starred_expression ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' starred_expression ')'")); @@ -19801,7 +20456,7 @@ invalid_group_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use starred expression here" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19812,7 +20467,7 @@ invalid_group_rule(Parser *p) } { // '(' '**' expression ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_group[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' '**' expression ')'")); @@ -19834,7 +20489,7 @@ invalid_group_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use double starred expression here" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19845,7 +20500,7 @@ invalid_group_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19853,16 +20508,19 @@ invalid_group_rule(Parser *p) static void * invalid_import_from_targets_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // import_from_as_names ',' NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_import_from_targets[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from_as_names ',' NEWLINE")); @@ -19881,7 +20539,7 @@ invalid_import_from_targets_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "trailing comma not allowed without surrounding parentheses" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -19892,7 +20550,7 @@ invalid_import_from_targets_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19902,16 +20560,19 @@ invalid_import_from_targets_rule(Parser *p) static void * invalid_with_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'with' ','.(expression ['as' star_target])+ &&':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ &&':'")); @@ -19940,7 +20601,7 @@ invalid_with_stmt_rule(Parser *p) } { // ASYNC? 
'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':'")); @@ -19979,7 +20640,7 @@ invalid_with_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -19989,16 +20650,19 @@ invalid_with_stmt_rule(Parser *p) static void * invalid_with_stmt_indent_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'with' ','.(expression ['as' star_target])+ ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ ':' NEWLINE !INDENT")); @@ -20026,7 +20690,7 @@ invalid_with_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'with' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20037,7 +20701,7 @@ invalid_with_stmt_indent_rule(Parser *p) } { // ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' ':' NEWLINE !INDENT")); @@ -20075,7 +20739,7 @@ invalid_with_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'with' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20086,7 +20750,7 @@ invalid_with_stmt_indent_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20097,16 +20761,19 @@ invalid_with_stmt_indent_rule(Parser *p) static void * invalid_try_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'try' ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' ':' NEWLINE !INDENT")); @@ -20127,7 +20794,7 @@ invalid_try_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'try' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20138,7 +20805,7 @@ invalid_try_stmt_rule(Parser *p) } { // 'try' ':' block !('except' | 'finally') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' ':' block !('except' | 'finally')")); @@ -20159,7 +20826,7 @@ invalid_try_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected 'except' or 'finally' block" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; 
- D(p->level--); + p->level--; return NULL; } goto done; @@ -20170,7 +20837,7 @@ invalid_try_stmt_rule(Parser *p) } { // 'try' ':' block* ((except_block+ except_star_block) | (except_star_block+ except_block)) block* if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_try_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'try' ':' block* ((except_block+ except_star_block) | (except_star_block+ except_block)) block*")); @@ -20195,7 +20862,7 @@ invalid_try_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "cannot have both 'except' and 'except*' on the same 'try'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20206,7 +20873,7 @@ invalid_try_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20218,16 +20885,19 @@ invalid_try_stmt_rule(Parser *p) static void * invalid_except_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' '*'? expression ',' expressions ['as' NAME] ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*'? expression ',' expressions ['as' NAME] ':'")); @@ -20260,7 +20930,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_STARTING_FROM ( a , "multiple exception types must be parenthesized" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20271,7 +20941,7 @@ invalid_except_stmt_rule(Parser *p) } { // 'except' '*'? expression ['as' NAME] NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*'? 
expression ['as' NAME] NEWLINE")); @@ -20298,7 +20968,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20309,7 +20979,7 @@ invalid_except_stmt_rule(Parser *p) } { // 'except' NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' NEWLINE")); @@ -20325,7 +20995,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20336,7 +21006,7 @@ invalid_except_stmt_rule(Parser *p) } { // 'except' '*' (NEWLINE | ':') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*' (NEWLINE | ':')")); @@ -20355,7 +21025,7 @@ invalid_except_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected one or more exception types" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20366,7 +21036,7 @@ invalid_except_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20374,16 +21044,19 @@ invalid_except_stmt_rule(Parser *p) static void * invalid_finally_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'finally' ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_finally_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally' ':' NEWLINE !INDENT")); @@ -20404,7 +21077,7 @@ invalid_finally_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'finally' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20415,7 +21088,7 @@ invalid_finally_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20425,16 +21098,19 @@ invalid_finally_stmt_rule(Parser *p) static void * invalid_except_stmt_indent_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' expression ['as' NAME] ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_except_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' expression ['as' NAME] ':' NEWLINE !INDENT")); @@ -20462,7 +21138,7 @@ invalid_except_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'except' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20473,7 +21149,7 @@ invalid_except_stmt_indent_rule(Parser *p) } { // 'except' ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> 
invalid_except_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' ':' NEWLINE !INDENT")); @@ -20494,7 +21170,7 @@ invalid_except_stmt_indent_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected an indented block after except statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20505,7 +21181,7 @@ invalid_except_stmt_indent_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20514,16 +21190,19 @@ invalid_except_stmt_indent_rule(Parser *p) static void * invalid_except_star_stmt_indent_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' '*' expression ['as' NAME] ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_except_star_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*' expression ['as' NAME] ':' NEWLINE !INDENT")); @@ -20554,7 +21233,7 @@ invalid_except_star_stmt_indent_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'except*' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20565,7 +21244,7 @@ invalid_except_star_stmt_indent_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20575,16 +21254,19 @@ invalid_except_star_stmt_indent_rule(Parser *p) static void * invalid_match_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // "match" subject_expr !':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr !':'")); @@ -20602,7 +21284,7 @@ invalid_match_stmt_rule(Parser *p) _res = CHECK_VERSION ( void* , 10 , "Pattern matching is" , RAISE_SYNTAX_ERROR ( "expected ':'" ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20613,7 +21295,7 @@ invalid_match_stmt_rule(Parser *p) } { // "match" subject_expr ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr ':' NEWLINE !INDENT")); @@ -20637,7 +21319,7 @@ invalid_match_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'match' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20648,7 +21330,7 @@ invalid_match_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20658,16 +21340,19 @@ invalid_match_stmt_rule(Parser *p) static void * invalid_case_block_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // "case" patterns 
guard? !':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? !':'")); @@ -20689,7 +21374,7 @@ invalid_case_block_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20700,7 +21385,7 @@ invalid_case_block_rule(Parser *p) } { // "case" patterns guard? ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? ':' NEWLINE !INDENT")); @@ -20728,7 +21413,7 @@ invalid_case_block_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'case' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20739,7 +21424,7 @@ invalid_case_block_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20747,16 +21432,19 @@ invalid_case_block_rule(Parser *p) static void * invalid_as_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // or_pattern 'as' "_" if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' \"_\"")); @@ -20775,7 +21463,7 @@ invalid_as_pattern_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "cannot use '_' as a target" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20786,7 +21474,7 @@ invalid_as_pattern_rule(Parser *p) } { // or_pattern 'as' !NAME expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_as_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "or_pattern 'as' !NAME expression")); @@ -20807,7 +21495,7 @@ invalid_as_pattern_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "invalid pattern target" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20818,7 +21506,7 @@ invalid_as_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20826,16 +21514,19 @@ invalid_as_pattern_rule(Parser *p) static void * invalid_class_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // name_or_attr '(' invalid_class_argument_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_class_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "name_or_attr '(' invalid_class_argument_pattern")); @@ -20854,7 +21545,7 @@ invalid_class_pattern_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_RANGE ( PyPegen_first_item ( a , pattern_ty ) , PyPegen_last_item ( a , pattern_ty ) , "positional patterns follow keyword patterns" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - 
D(p->level--); + p->level--; return NULL; } goto done; @@ -20865,7 +21556,7 @@ invalid_class_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20874,16 +21565,19 @@ invalid_class_pattern_rule(Parser *p) static asdl_pattern_seq* invalid_class_argument_pattern_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_pattern_seq* _res = NULL; int _mark = p->mark; { // [positional_patterns ','] keyword_patterns ',' positional_patterns if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_class_argument_pattern[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "[positional_patterns ','] keyword_patterns ',' positional_patterns")); @@ -20906,7 +21600,7 @@ invalid_class_argument_pattern_rule(Parser *p) _res = a; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20917,7 +21611,7 @@ invalid_class_argument_pattern_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -20927,16 +21621,19 @@ invalid_class_argument_pattern_rule(Parser *p) static void * invalid_if_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'if' named_expression NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression NEWLINE")); @@ -20955,7 +21652,7 @@ invalid_if_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -20966,7 +21663,7 @@ invalid_if_stmt_rule(Parser *p) } { // 'if' named_expression ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_if_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' named_expression ':' NEWLINE !INDENT")); @@ -20990,7 +21687,7 @@ invalid_if_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'if' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21001,7 +21698,7 @@ invalid_if_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21011,16 +21708,19 @@ invalid_if_stmt_rule(Parser *p) static void * invalid_elif_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'elif' named_expression NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'elif' named_expression NEWLINE")); @@ -21039,7 +21739,7 @@ invalid_elif_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21050,7 +21750,7 @@ invalid_elif_stmt_rule(Parser *p) } { // 
'elif' named_expression ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_elif_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'elif' named_expression ':' NEWLINE !INDENT")); @@ -21074,7 +21774,7 @@ invalid_elif_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'elif' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21085,7 +21785,7 @@ invalid_elif_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21093,16 +21793,19 @@ invalid_elif_stmt_rule(Parser *p) static void * invalid_else_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'else' ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_else_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else' ':' NEWLINE !INDENT")); @@ -21123,7 +21826,7 @@ invalid_else_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'else' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21134,7 +21837,7 @@ invalid_else_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21144,16 +21847,19 @@ invalid_else_stmt_rule(Parser *p) static void * invalid_while_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'while' named_expression NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'while' named_expression NEWLINE")); @@ -21172,7 +21878,7 @@ invalid_while_stmt_rule(Parser *p) _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21183,7 +21889,7 @@ invalid_while_stmt_rule(Parser *p) } { // 'while' named_expression ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_while_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'while' named_expression ':' NEWLINE !INDENT")); @@ -21207,7 +21913,7 @@ invalid_while_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'while' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21218,7 +21924,7 @@ invalid_while_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21226,16 +21932,19 @@ invalid_while_stmt_rule(Parser *p) static void * invalid_for_stmt_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 
'for' star_targets 'in' star_expressions ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'for' star_targets 'in' star_expressions ':' NEWLINE !INDENT")); @@ -21269,7 +21978,7 @@ invalid_for_stmt_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after 'for' statement on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21280,7 +21989,7 @@ invalid_for_stmt_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21289,16 +21998,19 @@ invalid_for_stmt_rule(Parser *p) static void * invalid_def_raw_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ASYNC? 'def' NAME '(' params? ')' ['->' expression] ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'def' NAME '(' params? ')' ['->' expression] ':' NEWLINE !INDENT")); @@ -21340,7 +22052,7 @@ invalid_def_raw_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after function definition on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21351,7 +22063,7 @@ invalid_def_raw_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21359,16 +22071,19 @@ invalid_def_raw_rule(Parser *p) static void * invalid_class_def_raw_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'class' NAME ['(' arguments? ')'] ':' NEWLINE !INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? 
')'] ':' NEWLINE !INDENT")); @@ -21396,7 +22111,7 @@ invalid_class_def_raw_rule(Parser *p) _res = RAISE_INDENTATION_ERROR ( "expected an indented block after class definition on line %d" , a -> lineno ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21407,7 +22122,7 @@ invalid_class_def_raw_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21418,16 +22133,19 @@ invalid_class_def_raw_rule(Parser *p) static void * invalid_double_starred_kvpairs_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ','.double_starred_kvpair+ ',' invalid_kvpair if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ',' invalid_kvpair")); @@ -21452,7 +22170,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) } { // expression ':' '*' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' '*' bitwise_or")); @@ -21474,7 +22192,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_STARTING_FROM ( a , "cannot use a starred expression in a dictionary value" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21485,7 +22203,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) } { // expression ':' &('}' | ',') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':' &('}' | ',')")); @@ -21503,7 +22221,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "expression expected after dictionary key and ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21514,7 +22232,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21522,16 +22240,19 @@ invalid_double_starred_kvpairs_rule(Parser *p) static void * invalid_kvpair_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // expression !(':') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !(':')")); @@ -21546,7 +22267,7 @@ invalid_kvpair_rule(Parser *p) _res = RAISE_ERROR_KNOWN_LOCATION ( p , PyExc_SyntaxError , a -> lineno , a -> end_col_offset - 1 , a -> end_lineno , - 1 , "':' expected after dictionary key" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21557,7 +22278,7 @@ invalid_kvpair_rule(Parser *p) } { // expression ':' '*' bitwise_or if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, 
"expression ':' '*' bitwise_or")); @@ -21579,7 +22300,7 @@ invalid_kvpair_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_STARTING_FROM ( a , "cannot use a starred expression in a dictionary value" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21590,7 +22311,7 @@ invalid_kvpair_rule(Parser *p) } { // expression ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> invalid_kvpair[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ':'")); @@ -21606,7 +22327,7 @@ invalid_kvpair_rule(Parser *p) _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "expression expected after dictionary key and ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -21617,7 +22338,7 @@ invalid_kvpair_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21625,9 +22346,12 @@ invalid_kvpair_rule(Parser *p) static asdl_seq * _loop0_1_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -21637,14 +22361,14 @@ _loop0_1_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_1[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -21660,7 +22384,7 @@ _loop0_1_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -21677,13 +22401,13 @@ _loop0_1_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_1_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -21691,9 +22415,12 @@ _loop0_1_rule(Parser *p) static asdl_seq * _loop0_2_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -21703,14 +22430,14 @@ _loop0_2_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_2[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -21726,7 +22453,7 @@ _loop0_2_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -21743,13 +22470,13 @@ _loop0_2_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_2_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -21757,9 +22484,12 @@ _loop0_2_rule(Parser *p) static asdl_seq * _loop1_3_rule(Parser *p) { - D(p->level++); + if (p->level++ == 
MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -21769,14 +22499,14 @@ _loop1_3_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // statement if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_3[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "statement")); @@ -21792,7 +22522,7 @@ _loop1_3_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -21806,7 +22536,7 @@ _loop1_3_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -21814,13 +22544,13 @@ _loop1_3_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_3_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -21828,9 +22558,12 @@ _loop1_3_rule(Parser *p) static asdl_seq * _loop0_5_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -21840,14 +22573,14 @@ _loop0_5_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ';' simple_stmt if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_5[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "';' simple_stmt")); @@ -21863,7 +22596,7 @@ _loop0_5_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -21872,7 +22605,7 @@ _loop0_5_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -21889,13 +22622,13 @@ _loop0_5_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_5_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -21903,16 +22636,19 @@ _loop0_5_rule(Parser *p) static asdl_seq * _gather_4_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // simple_stmt _loop0_5 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_4[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "simple_stmt _loop0_5")); @@ -21934,7 +22670,7 @@ _gather_4_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21942,16 +22678,19 @@ _gather_4_rule(Parser *p) static void * _tmp_6_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + 
p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'import' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_6[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'import'")); @@ -21970,7 +22709,7 @@ _tmp_6_rule(Parser *p) } { // 'from' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_6[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from'")); @@ -21989,7 +22728,7 @@ _tmp_6_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -21997,16 +22736,19 @@ _tmp_6_rule(Parser *p) static void * _tmp_7_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'def' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_7[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'def'")); @@ -22025,7 +22767,7 @@ _tmp_7_rule(Parser *p) } { // '@' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_7[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@'")); @@ -22044,7 +22786,7 @@ _tmp_7_rule(Parser *p) } { // ASYNC if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_7[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC")); @@ -22063,7 +22805,7 @@ _tmp_7_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22071,16 +22813,19 @@ _tmp_7_rule(Parser *p) static void * _tmp_8_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'class' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_8[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class'")); @@ -22099,7 +22844,7 @@ _tmp_8_rule(Parser *p) } { // '@' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_8[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@'")); @@ -22118,7 +22863,7 @@ _tmp_8_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22126,16 +22871,19 @@ _tmp_8_rule(Parser *p) static void * _tmp_9_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'with' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_9[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'with'")); @@ -22154,7 +22902,7 @@ _tmp_9_rule(Parser *p) } { // ASYNC if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_9[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC")); @@ -22173,7 +22921,7 @@ _tmp_9_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22181,16 +22929,19 @@ _tmp_9_rule(Parser *p) static void * _tmp_10_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; 
return NULL; } void * _res = NULL; int _mark = p->mark; { // 'for' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_10[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for'")); @@ -22209,7 +22960,7 @@ _tmp_10_rule(Parser *p) } { // ASYNC if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_10[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC")); @@ -22228,7 +22979,7 @@ _tmp_10_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22236,16 +22987,19 @@ _tmp_10_rule(Parser *p) static void * _tmp_11_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' annotated_rhs if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_11[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'=' annotated_rhs")); @@ -22261,7 +23015,7 @@ _tmp_11_rule(Parser *p) _res = d; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -22272,7 +23026,7 @@ _tmp_11_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22280,16 +23034,19 @@ _tmp_11_rule(Parser *p) static void * _tmp_12_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' single_target ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_12[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' single_target ')'")); @@ -22308,7 +23065,7 @@ _tmp_12_rule(Parser *p) _res = b; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -22319,7 +23076,7 @@ _tmp_12_rule(Parser *p) } { // single_subscript_attribute_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_12[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "single_subscript_attribute_target")); @@ -22338,7 +23095,7 @@ _tmp_12_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22346,16 +23103,19 @@ _tmp_12_rule(Parser *p) static void * _tmp_13_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' annotated_rhs if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_13[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'=' annotated_rhs")); @@ -22371,7 +23131,7 @@ _tmp_13_rule(Parser *p) _res = d; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -22382,7 +23142,7 @@ _tmp_13_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22390,9 +23150,12 @@ _tmp_13_rule(Parser *p) static asdl_seq * _loop1_14_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -22402,14 +23165,14 @@ 
_loop1_14_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (star_targets '=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_14[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); @@ -22425,7 +23188,7 @@ _loop1_14_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -22439,7 +23202,7 @@ _loop1_14_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -22447,13 +23210,13 @@ _loop1_14_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_14_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -22461,16 +23224,19 @@ _loop1_14_rule(Parser *p) static void * _tmp_15_rule(Parser *p) { - D(p->level++); - if (p->error_indicator) { - D(p->level--); - return NULL; + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_15[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -22489,7 +23255,7 @@ _tmp_15_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_15[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -22508,7 +23274,7 @@ _tmp_15_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22516,16 +23282,19 @@ _tmp_15_rule(Parser *p) static void * _tmp_16_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_16[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -22544,7 +23313,7 @@ _tmp_16_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_16[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -22563,7 +23332,7 @@ _tmp_16_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22571,16 +23340,19 @@ _tmp_16_rule(Parser *p) static void * _tmp_17_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'from' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_17[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'from' expression")); @@ -22596,7 +23368,7 @@ _tmp_17_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; 
return NULL; } goto done; @@ -22607,7 +23379,7 @@ _tmp_17_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22615,9 +23387,12 @@ _tmp_17_rule(Parser *p) static asdl_seq * _loop0_19_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -22627,14 +23402,14 @@ _loop0_19_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_19[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' NAME")); @@ -22650,7 +23425,7 @@ _loop0_19_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -22659,7 +23434,7 @@ _loop0_19_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -22676,13 +23451,13 @@ _loop0_19_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_19_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -22690,16 +23465,19 @@ _loop0_19_rule(Parser *p) static asdl_seq * _gather_18_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // NAME _loop0_19 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_18[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME _loop0_19")); @@ -22721,7 +23499,7 @@ _gather_18_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22729,9 +23507,12 @@ _gather_18_rule(Parser *p) static asdl_seq * _loop0_21_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -22741,14 +23522,14 @@ _loop0_21_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_21[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' NAME")); @@ -22764,7 +23545,7 @@ _loop0_21_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -22773,7 +23554,7 @@ _loop0_21_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -22790,13 +23571,13 @@ _loop0_21_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); 
_PyPegen_insert_memo(p, _start_mark, _loop0_21_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -22804,16 +23585,19 @@ _loop0_21_rule(Parser *p) static asdl_seq * _gather_20_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // NAME _loop0_21 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_20[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME _loop0_21")); @@ -22835,7 +23619,7 @@ _gather_20_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22843,16 +23627,19 @@ _gather_20_rule(Parser *p) static void * _tmp_22_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ';' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_22[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "';'")); @@ -22871,7 +23658,7 @@ _tmp_22_rule(Parser *p) } { // NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_22[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -22890,7 +23677,7 @@ _tmp_22_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22898,16 +23685,19 @@ _tmp_22_rule(Parser *p) static void * _tmp_23_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_23[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -22923,7 +23713,7 @@ _tmp_23_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -22934,7 +23724,7 @@ _tmp_23_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -22942,9 +23732,12 @@ _tmp_23_rule(Parser *p) static asdl_seq * _loop0_24_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -22954,14 +23747,14 @@ _loop0_24_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('.' | '...') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_24[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' 
| '...')")); @@ -22977,7 +23770,7 @@ _loop0_24_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -22994,13 +23787,13 @@ _loop0_24_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_24_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23008,9 +23801,12 @@ _loop0_24_rule(Parser *p) static asdl_seq * _loop1_25_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23020,14 +23816,14 @@ _loop1_25_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('.' | '...') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_25[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' | '...')")); @@ -23043,7 +23839,7 @@ _loop1_25_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23057,7 +23853,7 @@ _loop1_25_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -23065,13 +23861,13 @@ _loop1_25_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_25_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23079,9 +23875,12 @@ _loop1_25_rule(Parser *p) static asdl_seq * _loop0_27_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23091,14 +23890,14 @@ _loop0_27_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' import_from_as_name if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_27[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' import_from_as_name")); @@ -23114,7 +23913,7 @@ _loop0_27_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -23123,7 +23922,7 @@ _loop0_27_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23140,13 +23939,13 @@ _loop0_27_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_27_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23154,16 +23953,19 @@ 
_loop0_27_rule(Parser *p) static asdl_seq * _gather_26_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // import_from_as_name _loop0_27 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_26[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "import_from_as_name _loop0_27")); @@ -23185,7 +23987,7 @@ _gather_26_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -23193,16 +23995,19 @@ _gather_26_rule(Parser *p) static void * _tmp_28_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_28[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -23218,7 +24023,7 @@ _tmp_28_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -23229,7 +24034,7 @@ _tmp_28_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -23237,9 +24042,12 @@ _tmp_28_rule(Parser *p) static asdl_seq * _loop0_30_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23249,14 +24057,14 @@ _loop0_30_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' dotted_as_name if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_30[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' dotted_as_name")); @@ -23272,7 +24080,7 @@ _loop0_30_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -23281,7 +24089,7 @@ _loop0_30_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23298,13 +24106,13 @@ _loop0_30_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_30_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23312,16 +24120,19 @@ _loop0_30_rule(Parser *p) static asdl_seq * _gather_29_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // dotted_as_name _loop0_30 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_29[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dotted_as_name _loop0_30")); @@ -23343,7 +24154,7 @@ _gather_29_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -23351,16 +24162,19 @@ 
_gather_29_rule(Parser *p) static void * _tmp_31_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_31[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -23376,7 +24190,7 @@ _tmp_31_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -23387,7 +24201,7 @@ _tmp_31_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -23395,9 +24209,12 @@ _tmp_31_rule(Parser *p) static asdl_seq * _loop1_32_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23407,14 +24224,14 @@ _loop1_32_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('@' named_expression NEWLINE) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_32[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('@' named_expression NEWLINE)")); @@ -23430,7 +24247,7 @@ _loop1_32_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23444,7 +24261,7 @@ _loop1_32_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -23452,13 +24269,13 @@ _loop1_32_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_32_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23466,16 +24283,19 @@ _loop1_32_rule(Parser *p) static void * _tmp_33_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' arguments? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_33[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? 
')'")); @@ -23494,7 +24314,7 @@ _tmp_33_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -23505,7 +24325,7 @@ _tmp_33_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -23513,16 +24333,19 @@ _tmp_33_rule(Parser *p) static void * _tmp_34_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '->' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_34[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); @@ -23538,7 +24361,7 @@ _tmp_34_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -23549,7 +24372,7 @@ _tmp_34_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -23557,16 +24380,19 @@ _tmp_34_rule(Parser *p) static void * _tmp_35_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '->' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_35[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); @@ -23582,7 +24408,7 @@ _tmp_35_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -23593,7 +24419,7 @@ _tmp_35_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -23601,9 +24427,12 @@ _tmp_35_rule(Parser *p) static asdl_seq * _loop0_36_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23613,14 +24442,14 @@ _loop0_36_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_36[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -23636,7 +24465,7 @@ _loop0_36_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23653,13 +24482,13 @@ _loop0_36_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_36_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23667,9 +24496,12 @@ _loop0_36_rule(Parser *p) static asdl_seq * _loop0_37_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23679,14 +24511,14 @@ _loop0_37_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; 
return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_37[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -23702,7 +24534,7 @@ _loop0_37_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23719,13 +24551,13 @@ _loop0_37_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_37_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23733,9 +24565,12 @@ _loop0_37_rule(Parser *p) static asdl_seq * _loop0_38_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23745,14 +24580,14 @@ _loop0_38_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_38[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -23768,7 +24603,7 @@ _loop0_38_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23785,13 +24620,13 @@ _loop0_38_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_38_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23799,9 +24634,12 @@ _loop0_38_rule(Parser *p) static asdl_seq * _loop1_39_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23811,14 +24649,14 @@ _loop1_39_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_39[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -23834,7 +24672,7 @@ _loop1_39_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23848,7 +24686,7 @@ _loop1_39_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -23856,13 +24694,13 @@ _loop1_39_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_39_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ 
-23870,9 +24708,12 @@ _loop1_39_rule(Parser *p) static asdl_seq * _loop0_40_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23882,14 +24723,14 @@ _loop0_40_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_40[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -23905,7 +24746,7 @@ _loop0_40_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23922,13 +24763,13 @@ _loop0_40_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_40_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -23936,9 +24777,12 @@ _loop0_40_rule(Parser *p) static asdl_seq * _loop1_41_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -23948,14 +24792,14 @@ _loop1_41_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_41[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -23971,7 +24815,7 @@ _loop1_41_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -23985,7 +24829,7 @@ _loop1_41_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -23993,13 +24837,13 @@ _loop1_41_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_41_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24007,9 +24851,12 @@ _loop1_41_rule(Parser *p) static asdl_seq * _loop1_42_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24019,14 +24866,14 @@ _loop1_42_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_42[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24042,7 +24889,7 @@ _loop1_42_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + 
p->level--; return NULL; } _children = _new_children; @@ -24056,7 +24903,7 @@ _loop1_42_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24064,13 +24911,13 @@ _loop1_42_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_42_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24078,9 +24925,12 @@ _loop1_42_rule(Parser *p) static asdl_seq * _loop1_43_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24090,14 +24940,14 @@ _loop1_43_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_43[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24113,7 +24963,7 @@ _loop1_43_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24127,7 +24977,7 @@ _loop1_43_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24135,13 +24985,13 @@ _loop1_43_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_43_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24149,9 +24999,12 @@ _loop1_43_rule(Parser *p) static asdl_seq * _loop0_44_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24161,14 +25014,14 @@ _loop0_44_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_44[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24184,7 +25037,7 @@ _loop0_44_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24201,13 +25054,13 @@ _loop0_44_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_44_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24215,9 +25068,12 @@ _loop0_44_rule(Parser *p) static asdl_seq * _loop1_45_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + 
PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24227,14 +25083,14 @@ _loop1_45_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_45[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -24250,7 +25106,7 @@ _loop1_45_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24264,7 +25120,7 @@ _loop1_45_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24272,13 +25128,13 @@ _loop1_45_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_45_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24286,9 +25142,12 @@ _loop1_45_rule(Parser *p) static asdl_seq * _loop0_46_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24298,14 +25157,14 @@ _loop0_46_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_46[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -24321,7 +25180,7 @@ _loop0_46_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24338,13 +25197,13 @@ _loop0_46_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_46_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24352,9 +25211,12 @@ _loop0_46_rule(Parser *p) static asdl_seq * _loop1_47_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24364,14 +25226,14 @@ _loop1_47_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_47[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -24387,7 +25249,7 @@ _loop1_47_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24401,7 +25263,7 @@ _loop1_47_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - 
D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24409,13 +25271,13 @@ _loop1_47_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_47_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24423,9 +25285,12 @@ _loop1_47_rule(Parser *p) static asdl_seq * _loop0_48_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24435,14 +25300,14 @@ _loop0_48_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_maybe_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_48[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); @@ -24458,7 +25323,7 @@ _loop0_48_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24475,13 +25340,13 @@ _loop0_48_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_48_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24489,9 +25354,12 @@ _loop0_48_rule(Parser *p) static asdl_seq * _loop1_49_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24501,14 +25369,14 @@ _loop1_49_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_maybe_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_49[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); @@ -24524,7 +25392,7 @@ _loop1_49_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24538,7 +25406,7 @@ _loop1_49_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -24546,13 +25414,13 @@ _loop1_49_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_49_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24560,9 +25428,12 @@ _loop1_49_rule(Parser *p) static asdl_seq * _loop0_51_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24572,14 +25443,14 @@ _loop0_51_rule(Parser *p) if 
(!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_51[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -24595,7 +25466,7 @@ _loop0_51_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -24604,7 +25475,7 @@ _loop0_51_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24621,13 +25492,13 @@ _loop0_51_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_51_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24635,16 +25506,19 @@ _loop0_51_rule(Parser *p) static asdl_seq * _gather_50_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_51 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_50[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_51")); @@ -24666,7 +25540,7 @@ _gather_50_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -24674,9 +25548,12 @@ _gather_50_rule(Parser *p) static asdl_seq * _loop0_53_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24686,14 +25563,14 @@ _loop0_53_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_53[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -24709,7 +25586,7 @@ _loop0_53_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -24718,7 +25595,7 @@ _loop0_53_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24735,13 +25612,13 @@ _loop0_53_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_53_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24749,16 +25626,19 @@ _loop0_53_rule(Parser *p) static asdl_seq * _gather_52_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_53 if 
(p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_52[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_53")); @@ -24780,7 +25660,7 @@ _gather_52_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -24788,9 +25668,12 @@ _gather_52_rule(Parser *p) static asdl_seq * _loop0_55_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24800,14 +25683,14 @@ _loop0_55_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_55[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -24823,7 +25706,7 @@ _loop0_55_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -24832,7 +25715,7 @@ _loop0_55_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24849,13 +25732,13 @@ _loop0_55_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_55_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24863,16 +25746,19 @@ _loop0_55_rule(Parser *p) static asdl_seq * _gather_54_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_55 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_54[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_55")); @@ -24894,7 +25780,7 @@ _gather_54_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -24902,9 +25788,12 @@ _gather_54_rule(Parser *p) static asdl_seq * _loop0_57_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -24914,14 +25803,14 @@ _loop0_57_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' with_item if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_57[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' with_item")); @@ -24937,7 +25826,7 @@ _loop0_57_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -24946,7 +25835,7 @@ _loop0_57_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -24963,13 +25852,13 @@ _loop0_57_rule(Parser *p) PyMem_Free(_children); 
p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_57_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -24977,16 +25866,19 @@ _loop0_57_rule(Parser *p) static asdl_seq * _gather_56_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // with_item _loop0_57 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_56[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "with_item _loop0_57")); @@ -25008,7 +25900,7 @@ _gather_56_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25016,16 +25908,19 @@ _gather_56_rule(Parser *p) static void * _tmp_58_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_58[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); @@ -25044,7 +25939,7 @@ _tmp_58_rule(Parser *p) } { // ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_58[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -25063,7 +25958,7 @@ _tmp_58_rule(Parser *p) } { // ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_58[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -25082,7 +25977,7 @@ _tmp_58_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25090,9 +25985,12 @@ _tmp_58_rule(Parser *p) static asdl_seq * _loop1_59_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -25102,14 +26000,14 @@ _loop1_59_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_59[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); @@ -25125,7 +26023,7 @@ _loop1_59_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -25139,7 +26037,7 @@ _loop1_59_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -25147,13 +26045,13 @@ _loop1_59_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_59_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -25161,9 +26059,12 @@ _loop1_59_rule(Parser *p) static asdl_seq * _loop1_60_rule(Parser *p) { - D(p->level++); + if (p->level++ == 
MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -25173,14 +26074,14 @@ _loop1_60_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_star_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_60[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); @@ -25196,7 +26097,7 @@ _loop1_60_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -25210,7 +26111,7 @@ _loop1_60_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -25218,13 +26119,13 @@ _loop1_60_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_60_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -25232,16 +26133,19 @@ _loop1_60_rule(Parser *p) static void * _tmp_61_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_61[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -25257,7 +26161,7 @@ _tmp_61_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -25268,7 +26172,7 @@ _tmp_61_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25276,16 +26180,19 @@ _tmp_61_rule(Parser *p) static void * _tmp_62_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_62[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -25301,7 +26208,7 @@ _tmp_62_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -25312,7 +26219,7 @@ _tmp_62_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25320,9 +26227,12 @@ _tmp_62_rule(Parser *p) static asdl_seq * _loop1_63_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -25332,14 +26242,14 @@ _loop1_63_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // case_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_63[%d-%d]: %s\n", p->level, ' ', _mark, 
p->mark, "case_block")); @@ -25355,7 +26265,7 @@ _loop1_63_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -25369,7 +26279,7 @@ _loop1_63_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -25377,13 +26287,13 @@ _loop1_63_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_63_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -25391,9 +26301,12 @@ _loop1_63_rule(Parser *p) static asdl_seq * _loop0_65_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -25403,14 +26316,14 @@ _loop0_65_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // '|' closed_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_65[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'|' closed_pattern")); @@ -25426,7 +26339,7 @@ _loop0_65_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -25435,7 +26348,7 @@ _loop0_65_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -25452,13 +26365,13 @@ _loop0_65_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_65_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -25466,16 +26379,19 @@ _loop0_65_rule(Parser *p) static asdl_seq * _gather_64_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // closed_pattern _loop0_65 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_64[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "closed_pattern _loop0_65")); @@ -25497,7 +26413,7 @@ _gather_64_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25505,16 +26421,19 @@ _gather_64_rule(Parser *p) static void * _tmp_66_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '+' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_66[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+'")); @@ -25533,7 +26452,7 @@ _tmp_66_rule(Parser *p) } { // '-' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> 
_tmp_66[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-'")); @@ -25552,7 +26471,7 @@ _tmp_66_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25560,16 +26479,19 @@ _tmp_66_rule(Parser *p) static void * _tmp_67_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '+' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_67[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'+'")); @@ -25588,7 +26510,7 @@ _tmp_67_rule(Parser *p) } { // '-' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_67[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'-'")); @@ -25607,7 +26529,7 @@ _tmp_67_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25615,16 +26537,19 @@ _tmp_67_rule(Parser *p) static void * _tmp_68_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_68[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -25643,7 +26568,7 @@ _tmp_68_rule(Parser *p) } { // '(' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_68[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -25662,7 +26587,7 @@ _tmp_68_rule(Parser *p) } { // '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_68[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -25681,7 +26606,7 @@ _tmp_68_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25689,16 +26614,19 @@ _tmp_68_rule(Parser *p) static void * _tmp_69_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_69[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -25717,7 +26645,7 @@ _tmp_69_rule(Parser *p) } { // '(' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_69[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -25736,7 +26664,7 @@ _tmp_69_rule(Parser *p) } { // '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_69[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -25755,7 +26683,7 @@ _tmp_69_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25763,9 +26691,12 @@ _tmp_69_rule(Parser *p) static asdl_seq * _loop0_71_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -25775,14 +26706,14 @@ _loop0_71_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' maybe_star_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_71[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' maybe_star_pattern")); @@ -25798,7 +26729,7 @@ _loop0_71_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -25807,7 +26738,7 @@ _loop0_71_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -25824,13 +26755,13 @@ _loop0_71_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_71_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -25838,16 +26769,19 @@ _loop0_71_rule(Parser *p) static asdl_seq * _gather_70_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // maybe_star_pattern _loop0_71 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_70[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "maybe_star_pattern _loop0_71")); @@ -25869,7 +26803,7 @@ _gather_70_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25877,9 +26811,12 @@ _gather_70_rule(Parser *p) static asdl_seq * _loop0_73_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -25889,14 +26826,14 @@ _loop0_73_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' key_value_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_73[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' key_value_pattern")); @@ -25912,7 
+26849,7 @@ _loop0_73_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -25921,7 +26858,7 @@ _loop0_73_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -25938,13 +26875,13 @@ _loop0_73_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_73_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -25952,16 +26889,19 @@ _loop0_73_rule(Parser *p) static asdl_seq * _gather_72_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // key_value_pattern _loop0_73 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_72[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "key_value_pattern _loop0_73")); @@ -25983,7 +26923,7 @@ _gather_72_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -25991,16 +26931,19 @@ _gather_72_rule(Parser *p) static void * _tmp_74_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // literal_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_74[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "literal_expr")); @@ -26019,7 +26962,7 @@ _tmp_74_rule(Parser *p) } { // attr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_74[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "attr")); @@ -26038,7 +26981,7 @@ _tmp_74_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -26046,9 +26989,12 @@ _tmp_74_rule(Parser *p) static asdl_seq * _loop0_76_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26058,14 +27004,14 @@ _loop0_76_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_76[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' pattern")); @@ -26081,7 +27027,7 @@ _loop0_76_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -26090,7 +27036,7 @@ _loop0_76_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26107,13 +27053,13 @@ _loop0_76_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, 
_children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_76_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26121,16 +27067,19 @@ _loop0_76_rule(Parser *p) static asdl_seq * _gather_75_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // pattern _loop0_76 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_75[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "pattern _loop0_76")); @@ -26152,7 +27101,7 @@ _gather_75_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -26160,9 +27109,12 @@ _gather_75_rule(Parser *p) static asdl_seq * _loop0_78_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26172,14 +27124,14 @@ _loop0_78_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' keyword_pattern if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_78[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' keyword_pattern")); @@ -26195,7 +27147,7 @@ _loop0_78_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -26204,7 +27156,7 @@ _loop0_78_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26221,13 +27173,13 @@ _loop0_78_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_78_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26235,16 +27187,19 @@ _loop0_78_rule(Parser *p) static asdl_seq * _gather_77_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // keyword_pattern _loop0_78 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_77[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "keyword_pattern _loop0_78")); @@ -26266,7 +27221,7 @@ _gather_77_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -26274,9 +27229,12 @@ _gather_77_rule(Parser *p) static asdl_seq * _loop1_79_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26286,14 +27244,14 @@ _loop1_79_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' expression) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_79[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "(',' expression)")); @@ -26309,7 +27267,7 @@ _loop1_79_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26323,7 +27281,7 @@ _loop1_79_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26331,13 +27289,13 @@ _loop1_79_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_79_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26345,9 +27303,12 @@ _loop1_79_rule(Parser *p) static asdl_seq * _loop1_80_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26357,14 +27318,14 @@ _loop1_80_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' star_expression) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_80[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_expression)")); @@ -26380,7 +27341,7 @@ _loop1_80_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26394,7 +27355,7 @@ _loop1_80_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26402,13 +27363,13 @@ _loop1_80_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_80_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26416,9 +27377,12 @@ _loop1_80_rule(Parser *p) static asdl_seq * _loop0_82_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26428,14 +27392,14 @@ _loop0_82_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' star_named_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_82[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_named_expression")); @@ -26451,7 +27415,7 @@ _loop0_82_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -26460,7 +27424,7 @@ _loop0_82_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26477,13 +27441,13 @@ _loop0_82_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + 
p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_82_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26491,16 +27455,19 @@ _loop0_82_rule(Parser *p) static asdl_seq * _gather_81_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // star_named_expression _loop0_82 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_81[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression _loop0_82")); @@ -26522,7 +27489,7 @@ _gather_81_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -26530,9 +27497,12 @@ _gather_81_rule(Parser *p) static asdl_seq * _loop1_83_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26542,14 +27512,14 @@ _loop1_83_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('or' conjunction) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_83[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('or' conjunction)")); @@ -26565,7 +27535,7 @@ _loop1_83_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26579,7 +27549,7 @@ _loop1_83_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26587,13 +27557,13 @@ _loop1_83_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_83_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26601,9 +27571,12 @@ _loop1_83_rule(Parser *p) static asdl_seq * _loop1_84_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26613,14 +27586,14 @@ _loop1_84_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('and' inversion) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_84[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('and' inversion)")); @@ -26636,7 +27609,7 @@ _loop1_84_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26650,7 +27623,7 @@ _loop1_84_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26658,13 +27631,13 @@ _loop1_84_rule(Parser *p) PyMem_Free(_children); 
p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_84_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26672,9 +27645,12 @@ _loop1_84_rule(Parser *p) static asdl_seq * _loop1_85_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26684,14 +27660,14 @@ _loop1_85_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // compare_op_bitwise_or_pair if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_85[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "compare_op_bitwise_or_pair")); @@ -26707,7 +27683,7 @@ _loop1_85_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26721,7 +27697,7 @@ _loop1_85_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -26729,13 +27705,13 @@ _loop1_85_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_85_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26743,16 +27719,19 @@ _loop1_85_rule(Parser *p) static void * _tmp_86_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '!=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_86[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'!='")); @@ -26765,7 +27744,7 @@ _tmp_86_rule(Parser *p) _res = _PyPegen_check_barry_as_flufl ( p , tok ) ? 
NULL : tok; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -26776,7 +27755,7 @@ _tmp_86_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -26784,9 +27763,12 @@ _tmp_86_rule(Parser *p) static asdl_seq * _loop0_88_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -26796,14 +27778,14 @@ _loop0_88_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' slice if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_88[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' slice")); @@ -26819,7 +27801,7 @@ _loop0_88_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -26828,7 +27810,7 @@ _loop0_88_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -26845,13 +27827,13 @@ _loop0_88_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_88_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -26859,16 +27841,19 @@ _loop0_88_rule(Parser *p) static asdl_seq * _gather_87_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // slice _loop0_88 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_87[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice _loop0_88")); @@ -26890,7 +27875,7 @@ _gather_87_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -26898,16 +27883,19 @@ _gather_87_rule(Parser *p) static void * _tmp_89_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ':' expression? 
if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_89[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':' expression?")); @@ -26923,7 +27911,7 @@ _tmp_89_rule(Parser *p) _res = d; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -26934,7 +27922,7 @@ _tmp_89_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -26942,16 +27930,19 @@ _tmp_89_rule(Parser *p) static void * _tmp_90_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // tuple if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_90[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); @@ -26970,7 +27961,7 @@ _tmp_90_rule(Parser *p) } { // group if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_90[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "group")); @@ -26989,7 +27980,7 @@ _tmp_90_rule(Parser *p) } { // genexp if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_90[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "genexp")); @@ -27008,7 +27999,7 @@ _tmp_90_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -27016,16 +28007,19 @@ _tmp_90_rule(Parser *p) static void * _tmp_91_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // list if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_91[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); @@ -27044,7 +28038,7 @@ _tmp_91_rule(Parser *p) } { // listcomp if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_91[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "listcomp")); @@ -27063,7 +28057,7 @@ _tmp_91_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -27071,16 +28065,19 @@ _tmp_91_rule(Parser *p) static void * _tmp_92_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // dict if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dict")); @@ -27099,7 +28096,7 @@ _tmp_92_rule(Parser *p) } { // set if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "set")); @@ -27118,7 +28115,7 @@ _tmp_92_rule(Parser *p) } { // dictcomp if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "dictcomp")); @@ -27137,7 +28134,7 @@ _tmp_92_rule(Parser *p) } { // setcomp if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_92[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "setcomp")); @@ -27156,7 +28153,7 @@ _tmp_92_rule(Parser *p) } _res = NULL; done: - D(p->level--); + 
p->level--; return _res; } @@ -27164,16 +28161,19 @@ _tmp_92_rule(Parser *p) static void * _tmp_93_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_93[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -27192,7 +28192,7 @@ _tmp_93_rule(Parser *p) } { // named_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_93[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "named_expression")); @@ -27211,7 +28211,7 @@ _tmp_93_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -27219,9 +28219,12 @@ _tmp_93_rule(Parser *p) static asdl_seq * _loop0_94_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27231,14 +28234,14 @@ _loop0_94_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_94[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -27254,7 +28257,7 @@ _loop0_94_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27271,13 +28274,13 @@ _loop0_94_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_94_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27285,9 +28288,12 @@ _loop0_94_rule(Parser *p) static asdl_seq * _loop0_95_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27297,14 +28303,14 @@ _loop0_95_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_95[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -27320,7 +28326,7 @@ _loop0_95_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27337,13 +28343,13 @@ _loop0_95_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_95_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27351,9 +28357,12 @@ _loop0_95_rule(Parser *p) static asdl_seq * _loop0_96_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + 
p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27363,14 +28372,14 @@ _loop0_96_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_96[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -27386,7 +28395,7 @@ _loop0_96_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27403,13 +28412,13 @@ _loop0_96_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_96_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27417,9 +28426,12 @@ _loop0_96_rule(Parser *p) static asdl_seq * _loop1_97_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27429,14 +28441,14 @@ _loop1_97_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_97[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -27452,7 +28464,7 @@ _loop1_97_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27466,7 +28478,7 @@ _loop1_97_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27474,13 +28486,13 @@ _loop1_97_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_97_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27488,9 +28500,12 @@ _loop1_97_rule(Parser *p) static asdl_seq * _loop0_98_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27500,14 +28515,14 @@ _loop0_98_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_98[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -27523,7 +28538,7 @@ _loop0_98_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27540,13 +28555,13 @@ _loop0_98_rule(Parser 
*p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_98_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27554,10 +28569,13 @@ _loop0_98_rule(Parser *p) static asdl_seq * _loop1_99_rule(Parser *p) { - D(p->level++); - if (p->error_indicator) { - D(p->level--); - return NULL; + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; } void *_res = NULL; int _mark = p->mark; @@ -27566,14 +28584,14 @@ _loop1_99_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_99[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -27589,7 +28607,7 @@ _loop1_99_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27603,7 +28621,7 @@ _loop1_99_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27611,13 +28629,13 @@ _loop1_99_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_99_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27625,9 +28643,12 @@ _loop1_99_rule(Parser *p) static asdl_seq * _loop1_100_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27637,14 +28658,14 @@ _loop1_100_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_100[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -27660,7 +28681,7 @@ _loop1_100_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27674,7 +28695,7 @@ _loop1_100_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27682,13 +28703,13 @@ _loop1_100_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_100_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27696,9 +28717,12 @@ _loop1_100_rule(Parser *p) static asdl_seq * _loop1_101_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + 
PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27708,14 +28732,14 @@ _loop1_101_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_101[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -27731,7 +28755,7 @@ _loop1_101_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27745,7 +28769,7 @@ _loop1_101_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27753,13 +28777,13 @@ _loop1_101_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_101_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27767,9 +28791,12 @@ _loop1_101_rule(Parser *p) static asdl_seq * _loop0_102_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27779,14 +28806,14 @@ _loop0_102_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_102[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -27802,7 +28829,7 @@ _loop0_102_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27819,13 +28846,13 @@ _loop0_102_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_102_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27833,9 +28860,12 @@ _loop0_102_rule(Parser *p) static asdl_seq * _loop1_103_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27845,14 +28875,14 @@ _loop1_103_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_103[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -27868,7 +28898,7 @@ _loop1_103_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27882,7 +28912,7 @@ _loop1_103_rule(Parser *p) } if (_n 
== 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -27890,13 +28920,13 @@ _loop1_103_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_103_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27904,9 +28934,12 @@ _loop1_103_rule(Parser *p) static asdl_seq * _loop0_104_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27916,14 +28949,14 @@ _loop0_104_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_104[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -27939,7 +28972,7 @@ _loop0_104_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -27956,13 +28989,13 @@ _loop0_104_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_104_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -27970,9 +29003,12 @@ _loop0_104_rule(Parser *p) static asdl_seq * _loop1_105_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -27982,14 +29018,14 @@ _loop1_105_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_105[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -28005,7 +29041,7 @@ _loop1_105_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28019,7 +29055,7 @@ _loop1_105_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28027,13 +29063,13 @@ _loop1_105_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_105_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28041,9 +29077,12 @@ _loop1_105_rule(Parser *p) static asdl_seq * _loop0_106_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + 
p->level--; return NULL; } void *_res = NULL; @@ -28053,14 +29092,14 @@ _loop0_106_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_maybe_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_106[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); @@ -28076,7 +29115,7 @@ _loop0_106_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28093,13 +29132,13 @@ _loop0_106_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_106_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28107,9 +29146,12 @@ _loop0_106_rule(Parser *p) static asdl_seq * _loop1_107_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28119,14 +29161,14 @@ _loop1_107_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_maybe_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_107[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); @@ -28142,7 +29184,7 @@ _loop1_107_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28156,7 +29198,7 @@ _loop1_107_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28164,13 +29206,13 @@ _loop1_107_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_107_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28178,9 +29220,12 @@ _loop1_107_rule(Parser *p) static asdl_seq * _loop1_108_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28190,14 +29235,14 @@ _loop1_108_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // STRING if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_108[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "STRING")); @@ -28213,7 +29258,7 @@ _loop1_108_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28227,7 +29272,7 @@ _loop1_108_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return 
NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28235,13 +29280,13 @@ _loop1_108_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_108_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28249,16 +29294,19 @@ _loop1_108_rule(Parser *p) static void * _tmp_109_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // star_named_expression ',' star_named_expressions? if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_109[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions?")); @@ -28277,7 +29325,7 @@ _tmp_109_rule(Parser *p) _res = _PyPegen_seq_insert_in_front ( p , y , z ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -28288,7 +29336,7 @@ _tmp_109_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -28296,9 +29344,12 @@ _tmp_109_rule(Parser *p) static asdl_seq * _loop0_111_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28308,14 +29359,14 @@ _loop0_111_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' double_starred_kvpair if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_111[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); @@ -28331,7 +29382,7 @@ _loop0_111_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -28340,7 +29391,7 @@ _loop0_111_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28357,13 +29408,13 @@ _loop0_111_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_111_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28371,16 +29422,19 @@ _loop0_111_rule(Parser *p) static asdl_seq * _gather_110_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // double_starred_kvpair _loop0_111 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_110[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_111")); @@ -28402,7 +29456,7 @@ _gather_110_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -28410,9 +29464,12 @@ _gather_110_rule(Parser *p) 
static asdl_seq * _loop1_112_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28422,14 +29479,14 @@ _loop1_112_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // for_if_clause if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_112[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "for_if_clause")); @@ -28445,7 +29502,7 @@ _loop1_112_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28459,7 +29516,7 @@ _loop1_112_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -28467,13 +29524,13 @@ _loop1_112_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_112_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28481,9 +29538,12 @@ _loop1_112_rule(Parser *p) static asdl_seq * _loop0_113_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28493,14 +29553,14 @@ _loop0_113_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('if' disjunction) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_113[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); @@ -28516,7 +29576,7 @@ _loop0_113_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28533,13 +29593,13 @@ _loop0_113_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_113_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28547,9 +29607,12 @@ _loop0_113_rule(Parser *p) static asdl_seq * _loop0_114_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28559,14 +29622,14 @@ _loop0_114_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ('if' disjunction) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_114[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); @@ -28582,7 +29645,7 @@ _loop0_114_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } 
_children = _new_children; @@ -28599,13 +29662,13 @@ _loop0_114_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_114_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28613,16 +29676,19 @@ _loop0_114_rule(Parser *p) static void * _tmp_115_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // assignment_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_115[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); @@ -28641,7 +29707,7 @@ _tmp_115_rule(Parser *p) } { // expression !':=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_115[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); @@ -28662,7 +29728,7 @@ _tmp_115_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -28670,9 +29736,12 @@ _tmp_115_rule(Parser *p) static asdl_seq * _loop0_117_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28682,14 +29751,14 @@ _loop0_117_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (starred_expression | (assignment_expression | expression !':=') !'=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_117[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (starred_expression | (assignment_expression | expression !':=') !'=')")); @@ -28705,7 +29774,7 @@ _loop0_117_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -28714,7 +29783,7 @@ _loop0_117_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28731,13 +29800,13 @@ _loop0_117_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_117_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28746,16 +29815,19 @@ _loop0_117_rule(Parser *p) static asdl_seq * _gather_116_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (starred_expression | (assignment_expression | expression !':=') !'=') _loop0_117 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_116[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(starred_expression | (assignment_expression | expression !':=') !'=') _loop0_117")); @@ -28777,7 +29849,7 @@ _gather_116_rule(Parser *p) } 
_res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -28785,16 +29857,19 @@ _gather_116_rule(Parser *p) static void * _tmp_118_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' kwargs if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_118[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwargs")); @@ -28810,7 +29885,7 @@ _tmp_118_rule(Parser *p) _res = k; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -28821,7 +29896,7 @@ _tmp_118_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -28829,9 +29904,12 @@ _tmp_118_rule(Parser *p) static asdl_seq * _loop0_120_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28841,14 +29919,14 @@ _loop0_120_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_starred if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_120[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_starred")); @@ -28864,7 +29942,7 @@ _loop0_120_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -28873,7 +29951,7 @@ _loop0_120_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -28890,13 +29968,13 @@ _loop0_120_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_120_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -28904,16 +29982,19 @@ _loop0_120_rule(Parser *p) static asdl_seq * _gather_119_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_starred _loop0_120 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_119[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwarg_or_starred _loop0_120")); @@ -28935,7 +30016,7 @@ _gather_119_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -28943,9 +30024,12 @@ _gather_119_rule(Parser *p) static asdl_seq * _loop0_122_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -28955,14 +30039,14 @@ _loop0_122_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_double_starred if (p->error_indicator) { - 
D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_122[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_double_starred")); @@ -28978,7 +30062,7 @@ _loop0_122_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -28987,7 +30071,7 @@ _loop0_122_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29004,13 +30088,13 @@ _loop0_122_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_122_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29018,16 +30102,19 @@ _loop0_122_rule(Parser *p) static asdl_seq * _gather_121_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_double_starred _loop0_122 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_121[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwarg_or_double_starred _loop0_122")); @@ -29049,7 +30136,7 @@ _gather_121_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29057,9 +30144,12 @@ _gather_121_rule(Parser *p) static asdl_seq * _loop0_124_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29069,14 +30159,14 @@ _loop0_124_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_starred if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_124[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_starred")); @@ -29092,7 +30182,7 @@ _loop0_124_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -29101,7 +30191,7 @@ _loop0_124_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29118,13 +30208,13 @@ _loop0_124_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_124_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29132,16 +30222,19 @@ _loop0_124_rule(Parser *p) static asdl_seq * _gather_123_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_starred _loop0_124 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_123[%d-%d]: %s\n", 
p->level, ' ', _mark, p->mark, "kwarg_or_starred _loop0_124")); @@ -29163,7 +30256,7 @@ _gather_123_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29171,9 +30264,12 @@ _gather_123_rule(Parser *p) static asdl_seq * _loop0_126_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29183,14 +30279,14 @@ _loop0_126_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' kwarg_or_double_starred if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_126[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' kwarg_or_double_starred")); @@ -29206,7 +30302,7 @@ _loop0_126_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -29215,7 +30311,7 @@ _loop0_126_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29232,13 +30328,13 @@ _loop0_126_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_126_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29246,16 +30342,19 @@ _loop0_126_rule(Parser *p) static asdl_seq * _gather_125_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // kwarg_or_double_starred _loop0_126 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_125[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwarg_or_double_starred _loop0_126")); @@ -29277,7 +30376,7 @@ _gather_125_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29285,9 +30384,12 @@ _gather_125_rule(Parser *p) static asdl_seq * _loop0_127_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29297,14 +30399,14 @@ _loop0_127_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' star_target) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_127[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); @@ -29320,7 +30422,7 @@ _loop0_127_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29337,13 +30439,13 @@ _loop0_127_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_127_type, 
_seq); - D(p->level--); + p->level--; return _seq; } @@ -29351,9 +30453,12 @@ _loop0_127_rule(Parser *p) static asdl_seq * _loop0_129_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29363,14 +30468,14 @@ _loop0_129_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_129[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); @@ -29386,7 +30491,7 @@ _loop0_129_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -29395,7 +30500,7 @@ _loop0_129_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29412,13 +30517,13 @@ _loop0_129_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_129_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29426,16 +30531,19 @@ _loop0_129_rule(Parser *p) static asdl_seq * _gather_128_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // star_target _loop0_129 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_128[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_target _loop0_129")); @@ -29457,7 +30565,7 @@ _gather_128_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29465,9 +30573,12 @@ _gather_128_rule(Parser *p) static asdl_seq * _loop1_130_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29477,14 +30588,14 @@ _loop1_130_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (',' star_target) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_130[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); @@ -29500,7 +30611,7 @@ _loop1_130_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29514,7 +30625,7 @@ _loop1_130_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -29522,13 +30633,13 @@ _loop1_130_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); 
_PyPegen_insert_memo(p, _start_mark, _loop1_130_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29536,16 +30647,19 @@ _loop1_130_rule(Parser *p) static void * _tmp_131_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // !'*' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_131[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "!'*' star_target")); @@ -29566,7 +30680,7 @@ _tmp_131_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29574,9 +30688,12 @@ _tmp_131_rule(Parser *p) static asdl_seq * _loop0_133_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29586,14 +30703,14 @@ _loop0_133_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' del_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_133[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' del_target")); @@ -29609,7 +30726,7 @@ _loop0_133_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -29618,7 +30735,7 @@ _loop0_133_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29635,13 +30752,13 @@ _loop0_133_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_133_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29649,16 +30766,19 @@ _loop0_133_rule(Parser *p) static asdl_seq * _gather_132_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // del_target _loop0_133 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_132[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "del_target _loop0_133")); @@ -29680,7 +30800,7 @@ _gather_132_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29688,9 +30808,12 @@ _gather_132_rule(Parser *p) static asdl_seq * _loop0_135_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29700,14 +30823,14 @@ _loop0_135_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_135[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -29723,7 +30846,7 @@ 
_loop0_135_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -29732,7 +30855,7 @@ _loop0_135_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29749,13 +30872,13 @@ _loop0_135_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_135_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29763,16 +30886,19 @@ _loop0_135_rule(Parser *p) static asdl_seq * _gather_134_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_135 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_134[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_135")); @@ -29794,7 +30920,7 @@ _gather_134_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29802,9 +30928,12 @@ _gather_134_rule(Parser *p) static asdl_seq * _loop0_137_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29814,14 +30943,14 @@ _loop0_137_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_137[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -29837,7 +30966,7 @@ _loop0_137_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -29846,7 +30975,7 @@ _loop0_137_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29863,13 +30992,13 @@ _loop0_137_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_137_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29877,16 +31006,19 @@ _loop0_137_rule(Parser *p) static asdl_seq * _gather_136_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_137 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_136[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_137")); @@ -29908,7 +31040,7 @@ _gather_136_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -29916,9 +31048,12 @@ _gather_136_rule(Parser 
*p) static asdl_seq * _loop0_139_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -29928,14 +31063,14 @@ _loop0_139_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_139[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -29951,7 +31086,7 @@ _loop0_139_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -29960,7 +31095,7 @@ _loop0_139_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -29977,13 +31112,13 @@ _loop0_139_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_139_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -29991,16 +31126,19 @@ _loop0_139_rule(Parser *p) static asdl_seq * _gather_138_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_139 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_138[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_139")); @@ -30022,7 +31160,7 @@ _gather_138_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30030,9 +31168,12 @@ _gather_138_rule(Parser *p) static asdl_seq * _loop0_141_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -30042,14 +31183,14 @@ _loop0_141_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_141[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -30065,7 +31206,7 @@ _loop0_141_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -30074,7 +31215,7 @@ _loop0_141_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -30091,13 +31232,13 @@ _loop0_141_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_141_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -30105,16 +31246,19 
@@ _loop0_141_rule(Parser *p) static asdl_seq * _gather_140_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // expression _loop0_141 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_140[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression _loop0_141")); @@ -30136,7 +31280,7 @@ _gather_140_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30144,16 +31288,19 @@ _gather_140_rule(Parser *p) static void * _tmp_142_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // NEWLINE INDENT if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_142[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE INDENT")); @@ -30175,7 +31322,7 @@ _tmp_142_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30183,16 +31330,19 @@ _tmp_142_rule(Parser *p) static void * _tmp_143_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // args if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_143[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args")); @@ -30211,7 +31361,7 @@ _tmp_143_rule(Parser *p) } { // expression for_if_clauses if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_143[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses")); @@ -30233,7 +31383,7 @@ _tmp_143_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30241,16 +31391,19 @@ _tmp_143_rule(Parser *p) static void * _tmp_144_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'True' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -30269,7 +31422,7 @@ _tmp_144_rule(Parser *p) } { // 'False' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -30288,7 +31441,7 @@ _tmp_144_rule(Parser *p) } { // 'None' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -30307,7 +31460,7 @@ _tmp_144_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30315,16 +31468,19 @@ _tmp_144_rule(Parser *p) static void * _tmp_145_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // NAME '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> 
_tmp_145[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '='")); @@ -30346,7 +31502,7 @@ _tmp_145_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30354,16 +31510,19 @@ _tmp_145_rule(Parser *p) static void * _tmp_146_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // NAME STRING if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME STRING")); @@ -30385,7 +31544,7 @@ _tmp_146_rule(Parser *p) } { // SOFT_KEYWORD if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "SOFT_KEYWORD")); @@ -30404,7 +31563,7 @@ _tmp_146_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30412,16 +31571,19 @@ _tmp_146_rule(Parser *p) static void * _tmp_147_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'else' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_147[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else'")); @@ -30440,7 +31602,7 @@ _tmp_147_rule(Parser *p) } { // ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_147[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -30459,7 +31621,7 @@ _tmp_147_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30467,16 +31629,19 @@ _tmp_147_rule(Parser *p) static void * _tmp_148_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -30495,7 +31660,7 @@ _tmp_148_rule(Parser *p) } { // ':=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); @@ -30514,7 +31679,7 @@ _tmp_148_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30522,16 +31687,19 @@ _tmp_148_rule(Parser *p) static void * _tmp_149_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // list if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); @@ -30550,7 +31718,7 @@ _tmp_149_rule(Parser *p) } { // tuple if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); @@ -30569,7 +31737,7 @@ _tmp_149_rule(Parser *p) } { // genexp if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "genexp")); @@ -30588,7 +31756,7 @@ _tmp_149_rule(Parser *p) } { // 'True' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); @@ -30607,7 +31775,7 @@ _tmp_149_rule(Parser *p) } { // 'None' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); @@ -30626,7 +31794,7 @@ _tmp_149_rule(Parser *p) } { // 'False' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); @@ -30645,7 +31813,7 @@ _tmp_149_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30653,16 +31821,19 @@ _tmp_149_rule(Parser *p) static void * _tmp_150_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); @@ -30681,7 +31852,7 @@ _tmp_150_rule(Parser *p) } { // ':=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); @@ -30700,7 +31871,7 @@ _tmp_150_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30708,9 +31879,12 @@ _tmp_150_rule(Parser *p) static asdl_seq * _loop0_151_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -30720,14 +31894,14 @@ _loop0_151_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // star_named_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expressions")); @@ -30743,7 +31917,7 @@ _loop0_151_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -30760,13 +31934,13 @@ _loop0_151_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_151_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -30774,9 +31948,12 @@ _loop0_151_rule(Parser *p) static asdl_seq * _loop0_152_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -30786,14 +31963,14 @@ _loop0_152_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (star_targets '=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_152[%d-%d]: %s\n", 
p->level, ' ', _mark, p->mark, "(star_targets '=')")); @@ -30809,7 +31986,7 @@ _loop0_152_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -30826,13 +32003,13 @@ _loop0_152_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_152_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -30840,9 +32017,12 @@ _loop0_152_rule(Parser *p) static asdl_seq * _loop0_153_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -30852,14 +32032,14 @@ _loop0_153_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // (star_targets '=') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_153[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); @@ -30875,7 +32055,7 @@ _loop0_153_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -30892,13 +32072,13 @@ _loop0_153_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_153_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -30906,16 +32086,19 @@ _loop0_153_rule(Parser *p) static void * _tmp_154_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // yield_expr if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_154[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); @@ -30934,7 +32117,7 @@ _tmp_154_rule(Parser *p) } { // star_expressions if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_154[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); @@ -30953,7 +32136,7 @@ _tmp_154_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -30961,16 +32144,19 @@ _tmp_154_rule(Parser *p) static void * _tmp_155_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '[' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -30989,7 +32175,7 @@ _tmp_155_rule(Parser *p) } { // '(' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); @@ -31008,7 +32194,7 @@ _tmp_155_rule(Parser *p) } { // '{' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } 
D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); @@ -31027,7 +32213,7 @@ _tmp_155_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -31035,16 +32221,19 @@ _tmp_155_rule(Parser *p) static void * _tmp_156_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '[' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -31063,7 +32252,7 @@ _tmp_156_rule(Parser *p) } { // '{' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); @@ -31082,7 +32271,7 @@ _tmp_156_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -31090,16 +32279,19 @@ _tmp_156_rule(Parser *p) static void * _tmp_157_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '[' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); @@ -31118,7 +32310,7 @@ _tmp_157_rule(Parser *p) } { // '{' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); @@ -31137,7 +32329,7 @@ _tmp_157_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -31145,9 +32337,12 @@ _tmp_157_rule(Parser *p) static asdl_seq * _loop0_158_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31157,14 +32352,14 @@ _loop0_158_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_158[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -31180,7 +32375,7 @@ _loop0_158_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31197,13 +32392,13 @@ _loop0_158_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_158_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31211,9 +32406,12 @@ _loop0_158_rule(Parser *p) static asdl_seq * _loop0_159_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31223,14 +32421,14 @@ _loop0_159_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 
1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_159[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -31246,7 +32444,7 @@ _loop0_159_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31263,13 +32461,13 @@ _loop0_159_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_159_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31277,9 +32475,12 @@ _loop0_159_rule(Parser *p) static asdl_seq * _loop1_160_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31289,14 +32490,14 @@ _loop1_160_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_160[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); @@ -31312,7 +32513,7 @@ _loop1_160_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31326,7 +32527,7 @@ _loop1_160_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -31334,13 +32535,13 @@ _loop1_160_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_160_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31348,9 +32549,12 @@ _loop1_160_rule(Parser *p) static asdl_seq * _loop1_161_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31360,14 +32564,14 @@ _loop1_161_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_161[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); @@ -31383,7 +32587,7 @@ _loop1_161_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31397,7 +32601,7 @@ _loop1_161_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -31405,13 +32609,13 @@ _loop1_161_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; 
i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_161_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31419,9 +32623,12 @@ _loop1_161_rule(Parser *p) static asdl_seq * _loop0_162_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31431,14 +32638,14 @@ _loop0_162_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_162[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -31454,7 +32661,7 @@ _loop0_162_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31471,13 +32678,13 @@ _loop0_162_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_162_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31485,9 +32692,12 @@ _loop0_162_rule(Parser *p) static asdl_seq * _loop0_163_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31497,14 +32707,14 @@ _loop0_163_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_no_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_163[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); @@ -31520,7 +32730,7 @@ _loop0_163_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31537,13 +32747,13 @@ _loop0_163_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_163_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31551,9 +32761,12 @@ _loop0_163_rule(Parser *p) static asdl_seq * _loop0_165_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31563,14 +32776,14 @@ _loop0_165_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' lambda_param if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_165[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' lambda_param")); @@ -31586,7 +32799,7 @@ _loop0_165_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; 
PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -31595,7 +32808,7 @@ _loop0_165_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31612,13 +32825,13 @@ _loop0_165_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_165_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31626,16 +32839,19 @@ _loop0_165_rule(Parser *p) static asdl_seq * _gather_164_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // lambda_param _loop0_165 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_164[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param _loop0_165")); @@ -31657,7 +32873,7 @@ _gather_164_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -31665,9 +32881,12 @@ _gather_164_rule(Parser *p) static asdl_seq * _loop1_166_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31677,14 +32896,14 @@ _loop1_166_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // lambda_param_with_default if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_166[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); @@ -31700,7 +32919,7 @@ _loop1_166_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31714,7 +32933,7 @@ _loop1_166_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -31722,13 +32941,13 @@ _loop1_166_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_166_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -31736,16 +32955,19 @@ _loop1_166_rule(Parser *p) static void * _tmp_167_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_167[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -31764,7 +32986,7 @@ _tmp_167_rule(Parser *p) } { // ',' (')' | '**') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_167[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); @@ -31786,7 +33008,7 @@ 
_tmp_167_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -31794,16 +33016,19 @@ _tmp_167_rule(Parser *p) static void * _tmp_168_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_168[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -31822,7 +33047,7 @@ _tmp_168_rule(Parser *p) } { // ',' (':' | '**') if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_168[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); @@ -31844,7 +33069,7 @@ _tmp_168_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -31852,16 +33077,19 @@ _tmp_168_rule(Parser *p) static void * _tmp_169_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); @@ -31880,7 +33108,7 @@ _tmp_169_rule(Parser *p) } { // ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -31899,7 +33127,7 @@ _tmp_169_rule(Parser *p) } { // ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -31918,7 +33146,7 @@ _tmp_169_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -31926,9 +33154,12 @@ _tmp_169_rule(Parser *p) static asdl_seq * _loop0_171_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -31938,14 +33169,14 @@ _loop0_171_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expression ['as' star_target]) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_171[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); @@ -31961,7 +33192,7 @@ _loop0_171_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -31970,7 +33201,7 @@ _loop0_171_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -31987,13 +33218,13 @@ _loop0_171_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_171_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -32001,16 +33232,19 @@ _loop0_171_rule(Parser *p) static asdl_seq * 
_gather_170_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expression ['as' star_target]) _loop0_171 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_170[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_171")); @@ -32032,7 +33266,7 @@ _gather_170_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32040,9 +33274,12 @@ _gather_170_rule(Parser *p) static asdl_seq * _loop0_173_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -32052,14 +33289,14 @@ _loop0_173_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expressions ['as' star_target]) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_173[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); @@ -32075,7 +33312,7 @@ _loop0_173_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -32084,7 +33321,7 @@ _loop0_173_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -32101,13 +33338,13 @@ _loop0_173_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_173_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -32115,16 +33352,19 @@ _loop0_173_rule(Parser *p) static asdl_seq * _gather_172_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expressions ['as' star_target]) _loop0_173 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_172[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_173")); @@ -32146,7 +33386,7 @@ _gather_172_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32154,9 +33394,12 @@ _gather_172_rule(Parser *p) static asdl_seq * _loop0_175_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -32166,14 +33409,14 @@ _loop0_175_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expression ['as' star_target]) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_175[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); 
@@ -32189,7 +33432,7 @@ _loop0_175_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -32198,7 +33441,7 @@ _loop0_175_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -32215,13 +33458,13 @@ _loop0_175_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_175_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -32229,16 +33472,19 @@ _loop0_175_rule(Parser *p) static asdl_seq * _gather_174_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expression ['as' star_target]) _loop0_175 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_174[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_175")); @@ -32260,7 +33506,7 @@ _gather_174_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32268,9 +33514,12 @@ _gather_174_rule(Parser *p) static asdl_seq * _loop0_177_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -32280,14 +33529,14 @@ _loop0_177_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' (expressions ['as' star_target]) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_177[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); @@ -32303,7 +33552,7 @@ _loop0_177_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -32312,7 +33561,7 @@ _loop0_177_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -32329,13 +33578,13 @@ _loop0_177_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_177_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -32343,16 +33592,19 @@ _loop0_177_rule(Parser *p) static asdl_seq * _gather_176_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // (expressions ['as' star_target]) _loop0_177 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_176[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_177")); @@ -32374,7 
+33626,7 @@ _gather_176_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32382,16 +33634,19 @@ _gather_176_rule(Parser *p) static void * _tmp_178_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'except' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_178[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except'")); @@ -32410,7 +33665,7 @@ _tmp_178_rule(Parser *p) } { // 'finally' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_178[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally'")); @@ -32429,7 +33684,7 @@ _tmp_178_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32437,9 +33692,12 @@ _tmp_178_rule(Parser *p) static asdl_seq * _loop0_179_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -32449,14 +33707,14 @@ _loop0_179_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_179[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); @@ -32472,7 +33730,7 @@ _loop0_179_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -32489,13 +33747,13 @@ _loop0_179_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_179_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -32503,16 +33761,19 @@ _loop0_179_rule(Parser *p) static void * _tmp_180_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // (except_block+ except_star_block) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_180[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(except_block+ except_star_block)")); @@ -32531,7 +33792,7 @@ _tmp_180_rule(Parser *p) } { // (except_star_block+ except_block) if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_180[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(except_star_block+ except_block)")); @@ -32550,7 +33811,7 @@ _tmp_180_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32558,9 +33819,12 @@ _tmp_180_rule(Parser *p) static asdl_seq * _loop0_181_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -32570,14 +33834,14 @@ _loop0_181_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } 
Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_181[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); @@ -32593,7 +33857,7 @@ _loop0_181_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -32610,13 +33874,13 @@ _loop0_181_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_181_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -32624,16 +33888,19 @@ _loop0_181_rule(Parser *p) static void * _tmp_182_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_182[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -32655,7 +33922,7 @@ _tmp_182_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32663,16 +33930,19 @@ _tmp_182_rule(Parser *p) static void * _tmp_183_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_183[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -32694,7 +33964,7 @@ _tmp_183_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32702,16 +33972,19 @@ _tmp_183_rule(Parser *p) static void * _tmp_184_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_184[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); @@ -32730,7 +34003,7 @@ _tmp_184_rule(Parser *p) } { // ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_184[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -32749,7 +34022,7 @@ _tmp_184_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32757,16 +34030,19 @@ _tmp_184_rule(Parser *p) static void * _tmp_185_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_185[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -32788,7 +34064,7 @@ _tmp_185_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32796,16 +34072,19 @@ _tmp_185_rule(Parser *p) static void * _tmp_186_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + 
p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' NAME if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_186[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); @@ -32827,7 +34106,7 @@ _tmp_186_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32835,16 +34114,19 @@ _tmp_186_rule(Parser *p) static void * _tmp_187_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // positional_patterns ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_187[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "positional_patterns ','")); @@ -32866,7 +34148,7 @@ _tmp_187_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32874,16 +34156,19 @@ _tmp_187_rule(Parser *p) static void * _tmp_188_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '->' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_188[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); @@ -32905,7 +34190,7 @@ _tmp_188_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32913,16 +34198,19 @@ _tmp_188_rule(Parser *p) static void * _tmp_189_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '(' arguments? ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_189[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? 
')'")); @@ -32948,7 +34236,7 @@ _tmp_189_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -32956,9 +34244,12 @@ _tmp_189_rule(Parser *p) static asdl_seq * _loop0_191_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -32968,14 +34259,14 @@ _loop0_191_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // ',' double_starred_kvpair if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop0_191[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); @@ -32991,7 +34282,7 @@ _loop0_191_rule(Parser *p) if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } if (_n == _children_capacity) { @@ -33000,7 +34291,7 @@ _loop0_191_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -33017,13 +34308,13 @@ _loop0_191_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop0_191_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -33031,16 +34322,19 @@ _loop0_191_rule(Parser *p) static asdl_seq * _gather_190_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } asdl_seq * _res = NULL; int _mark = p->mark; { // double_starred_kvpair _loop0_191 if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _gather_190[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_191")); @@ -33062,7 +34356,7 @@ _gather_190_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33070,16 +34364,19 @@ _gather_190_rule(Parser *p) static void * _tmp_192_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '}' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); @@ -33098,7 +34395,7 @@ _tmp_192_rule(Parser *p) } { // ',' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); @@ -33117,7 +34414,7 @@ _tmp_192_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33125,16 +34422,19 @@ _tmp_192_rule(Parser *p) static void * _tmp_193_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // star_targets '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', 
_mark, p->mark, "star_targets '='")); @@ -33150,7 +34450,7 @@ _tmp_193_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33161,7 +34461,7 @@ _tmp_193_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33169,16 +34469,19 @@ _tmp_193_rule(Parser *p) static void * _tmp_194_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -33197,7 +34500,7 @@ _tmp_194_rule(Parser *p) } { // '...' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); @@ -33216,7 +34519,7 @@ _tmp_194_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33224,16 +34527,19 @@ _tmp_194_rule(Parser *p) static void * _tmp_195_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '.' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_195[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); @@ -33252,7 +34558,7 @@ _tmp_195_rule(Parser *p) } { // '...' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_195[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); @@ -33271,7 +34577,7 @@ _tmp_195_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33279,16 +34585,19 @@ _tmp_195_rule(Parser *p) static void * _tmp_196_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // '@' named_expression NEWLINE if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_196[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); @@ -33307,7 +34616,7 @@ _tmp_196_rule(Parser *p) _res = f; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33318,7 +34627,7 @@ _tmp_196_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33326,16 +34635,19 @@ _tmp_196_rule(Parser *p) static void * _tmp_197_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_197[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); @@ -33351,7 +34663,7 @@ _tmp_197_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33362,7 +34674,7 @@ _tmp_197_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; 
return _res; } @@ -33370,16 +34682,19 @@ _tmp_197_rule(Parser *p) static void * _tmp_198_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' star_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_198[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_expression")); @@ -33395,7 +34710,7 @@ _tmp_198_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33406,7 +34721,7 @@ _tmp_198_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33414,16 +34729,19 @@ _tmp_198_rule(Parser *p) static void * _tmp_199_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'or' conjunction if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_199[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); @@ -33439,7 +34757,7 @@ _tmp_199_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33450,7 +34768,7 @@ _tmp_199_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33458,16 +34776,19 @@ _tmp_199_rule(Parser *p) static void * _tmp_200_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'and' inversion if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_200[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'and' inversion")); @@ -33483,7 +34804,7 @@ _tmp_200_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33494,7 +34815,7 @@ _tmp_200_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33502,16 +34823,19 @@ _tmp_200_rule(Parser *p) static void * _tmp_201_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'if' disjunction if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_201[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); @@ -33527,7 +34851,7 @@ _tmp_201_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33538,7 +34862,7 @@ _tmp_201_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33546,16 +34870,19 @@ _tmp_201_rule(Parser *p) static void * _tmp_202_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'if' disjunction if (p->error_indicator) { - 
D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_202[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); @@ -33571,7 +34898,7 @@ _tmp_202_rule(Parser *p) _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33582,7 +34909,7 @@ _tmp_202_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33590,16 +34917,19 @@ _tmp_202_rule(Parser *p) static void * _tmp_203_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // starred_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_203[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); @@ -33618,7 +34948,7 @@ _tmp_203_rule(Parser *p) } { // (assignment_expression | expression !':=') !'=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_203[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); @@ -33639,7 +34969,7 @@ _tmp_203_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33647,16 +34977,19 @@ _tmp_203_rule(Parser *p) static void * _tmp_204_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_204[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); @@ -33672,7 +35005,7 @@ _tmp_204_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33683,7 +35016,7 @@ _tmp_204_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33691,16 +35024,19 @@ _tmp_204_rule(Parser *p) static void * _tmp_205_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ',' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_205[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); @@ -33716,7 +35052,7 @@ _tmp_205_rule(Parser *p) _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; - D(p->level--); + p->level--; return NULL; } goto done; @@ -33727,7 +35063,7 @@ _tmp_205_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33735,16 +35071,19 @@ _tmp_205_rule(Parser *p) static void * _tmp_206_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // star_targets '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_206[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); @@ -33766,7 +35105,7 @@ _tmp_206_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; 
return _res; } @@ -33774,16 +35113,19 @@ _tmp_206_rule(Parser *p) static void * _tmp_207_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // star_targets '=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_207[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); @@ -33805,7 +35147,7 @@ _tmp_207_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33813,16 +35155,19 @@ _tmp_207_rule(Parser *p) static void * _tmp_208_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ')' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_208[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); @@ -33841,7 +35186,7 @@ _tmp_208_rule(Parser *p) } { // '**' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_208[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); @@ -33860,7 +35205,7 @@ _tmp_208_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33868,16 +35213,19 @@ _tmp_208_rule(Parser *p) static void * _tmp_209_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // ':' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_209[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); @@ -33896,7 +35244,7 @@ _tmp_209_rule(Parser *p) } { // '**' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_209[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); @@ -33915,7 +35263,7 @@ _tmp_209_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33923,16 +35271,19 @@ _tmp_209_rule(Parser *p) static void * _tmp_210_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // expression ['as' star_target] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_210[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); @@ -33955,7 +35306,7 @@ _tmp_210_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -33963,16 +35314,19 @@ _tmp_210_rule(Parser *p) static void * _tmp_211_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // expressions ['as' star_target] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_211[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); @@ -33995,7 +35349,7 @@ _tmp_211_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34003,16 
+35357,19 @@ _tmp_211_rule(Parser *p) static void * _tmp_212_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // expression ['as' star_target] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_212[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); @@ -34035,7 +35392,7 @@ _tmp_212_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34043,16 +35400,19 @@ _tmp_212_rule(Parser *p) static void * _tmp_213_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // expressions ['as' star_target] if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_213[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); @@ -34075,7 +35435,7 @@ _tmp_213_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34083,16 +35443,19 @@ _tmp_213_rule(Parser *p) static void * _tmp_214_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // except_block+ except_star_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_214[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block+ except_star_block")); @@ -34114,7 +35477,7 @@ _tmp_214_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34122,16 +35485,19 @@ _tmp_214_rule(Parser *p) static void * _tmp_215_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // except_star_block+ except_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_215[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block+ except_block")); @@ -34153,7 +35519,7 @@ _tmp_215_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34161,16 +35527,19 @@ _tmp_215_rule(Parser *p) static void * _tmp_216_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // assignment_expression if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); @@ -34189,7 +35558,7 @@ _tmp_216_rule(Parser *p) } { // expression !':=' if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); @@ -34210,7 +35579,7 @@ _tmp_216_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34218,16 +35587,19 @@ _tmp_216_rule(Parser *p) static void * _tmp_217_rule(Parser *p) { - D(p->level++); + if 
(p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_217[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -34249,7 +35621,7 @@ _tmp_217_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34257,16 +35629,19 @@ _tmp_217_rule(Parser *p) static void * _tmp_218_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_218[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -34288,7 +35663,7 @@ _tmp_218_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34296,16 +35671,19 @@ _tmp_218_rule(Parser *p) static void * _tmp_219_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -34327,7 +35705,7 @@ _tmp_219_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34335,16 +35713,19 @@ _tmp_219_rule(Parser *p) static void * _tmp_220_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void * _res = NULL; int _mark = p->mark; { // 'as' star_target if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _tmp_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); @@ -34366,7 +35747,7 @@ _tmp_220_rule(Parser *p) } _res = NULL; done: - D(p->level--); + p->level--; return _res; } @@ -34374,9 +35755,12 @@ _tmp_220_rule(Parser *p) static asdl_seq * _loop1_221_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -34386,14 +35770,14 @@ _loop1_221_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); @@ -34409,7 +35793,7 @@ _loop1_221_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -34423,7 +35807,7 @@ _loop1_221_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -34431,13 +35815,13 @@ _loop1_221_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; 
PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_221_type, _seq); - D(p->level--); + p->level--; return _seq; } @@ -34445,9 +35829,12 @@ _loop1_221_rule(Parser *p) static asdl_seq * _loop1_222_rule(Parser *p) { - D(p->level++); + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } void *_res = NULL; @@ -34457,14 +35844,14 @@ _loop1_222_rule(Parser *p) if (!_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } Py_ssize_t _children_capacity = 1; Py_ssize_t _n = 0; { // except_star_block if (p->error_indicator) { - D(p->level--); + p->level--; return NULL; } D(fprintf(stderr, "%*c> _loop1_222[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); @@ -34480,7 +35867,7 @@ _loop1_222_rule(Parser *p) if (!_new_children) { p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } _children = _new_children; @@ -34494,7 +35881,7 @@ _loop1_222_rule(Parser *p) } if (_n == 0 || p->error_indicator) { PyMem_Free(_children); - D(p->level--); + p->level--; return NULL; } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -34502,13 +35889,13 @@ _loop1_222_rule(Parser *p) PyMem_Free(_children); p->error_indicator = 1; PyErr_NoMemory(); - D(p->level--); + p->level--; return NULL; } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); _PyPegen_insert_memo(p, _start_mark, _loop1_222_type, _seq); - D(p->level--); + p->level--; return _seq; } diff --git a/Parser/pegen.c b/Parser/pegen.c index 870085e7285e3..cfea1c87199b2 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -815,6 +815,7 @@ void * _PyPegen_run_parser(Parser *p) { void *res = _PyPegen_parse(p); + assert(p->level == 0); if (res == NULL) { if (PyErr_Occurred() && !PyErr_ExceptionMatches(PyExc_SyntaxError)) { return NULL; diff --git a/Tools/peg_generator/pegen/c_generator.py b/Tools/peg_generator/pegen/c_generator.py index 9cfbf38b40a77..ee255c8016386 100644 --- a/Tools/peg_generator/pegen/c_generator.py +++ b/Tools/peg_generator/pegen/c_generator.py @@ -37,6 +37,8 @@ # define D(x) #endif +# define MAXSTACK 6000 + """ @@ -364,10 +366,14 @@ def __init__( self.skip_actions = skip_actions def add_level(self) -> None: - self.print("D(p->level++);") + self.print("if (p->level++ == MAXSTACK) {") + with self.indent(): + self.print("p->error_indicator = 1;") + self.print("PyErr_NoMemory();") + self.print("}") def remove_level(self) -> None: - self.print("D(p->level--);") + self.print("p->level--;") def add_return(self, ret_val: str) -> None: self.remove_level() @@ -544,9 +550,10 @@ def _set_up_rule_memoization(self, node: Rule, result_type: str) -> None: self.print("p->in_raw_rule++;") self.print(f"void *_raw = {node.name}_raw(p);") self.print("p->in_raw_rule--;") - self.print("if (p->error_indicator)") + self.print("if (p->error_indicator) {") with self.indent(): - self.print("return NULL;") + self.add_return("NULL") + self.print("}") self.print("if (_raw == NULL || p->mark <= _resmark)") with self.indent(): self.print("break;") From webhook-mailer at python.org Mon Jan 3 15:10:19 2022 From: webhook-mailer at python.org (iritkatriel) Date: Mon, 03 Jan 2022 20:10:19 -0000 Subject: [Python-checkins] bpo-34931: [doc] clarify behavior of os.path.splitext() on 
paths with multiple leading periods (GH-30347) Message-ID: https://github.com/python/cpython/commit/51700bf08b0dd4baf998440b2ebfaa488a2855ba commit: 51700bf08b0dd4baf998440b2ebfaa488a2855ba branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-03T20:10:07Z summary: bpo-34931: [doc] clarify behavior of os.path.splitext() on paths with multiple leading periods (GH-30347) files: M Doc/library/os.path.rst diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst index a66b3c5a3a990..6b15a113f5450 100644 --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -501,11 +501,16 @@ the :mod:`glob` module.) >>> splitext('foo.bar.exe') ('foo.bar', '.exe') + >>> splitext('/foo/bar.exe') + ('/foo/bar', '.exe') - Leading periods on the basename are ignored:: + Leading periods of the last component of the path are considered to + be part of the root:: >>> splitext('.cshrc') ('.cshrc', '') + >>> splitext('/foo/....jpg') + ('/foo/....jpg', '') .. versionchanged:: 3.6 Accepts a :term:`path-like object`. From webhook-mailer at python.org Mon Jan 3 15:36:50 2022 From: webhook-mailer at python.org (iritkatriel) Date: Mon, 03 Jan 2022 20:36:50 -0000 Subject: [Python-checkins] bpo-34931: [doc] clarify behavior of os.path.splitext() on paths with multiple leading periods (GH-30347) (GH-30368) Message-ID: https://github.com/python/cpython/commit/8184a613b93d54416b954e667951cdf3d069cc13 commit: 8184a613b93d54416b954e667951cdf3d069cc13 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-03T20:36:41Z summary: bpo-34931: [doc] clarify behavior of os.path.splitext() on paths with multiple leading periods (GH-30347) (GH-30368) (cherry picked from commit 51700bf08b0dd4baf998440b2ebfaa488a2855ba) Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> files: M Doc/library/os.path.rst diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst index a66b3c5a3a990..6b15a113f5450 100644 --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -501,11 +501,16 @@ the :mod:`glob` module.) >>> splitext('foo.bar.exe') ('foo.bar', '.exe') + >>> splitext('/foo/bar.exe') + ('/foo/bar', '.exe') - Leading periods on the basename are ignored:: + Leading periods of the last component of the path are considered to + be part of the root:: >>> splitext('.cshrc') ('.cshrc', '') + >>> splitext('/foo/....jpg') + ('/foo/....jpg', '') .. versionchanged:: 3.6 Accepts a :term:`path-like object`. 
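A few extra cases, not part of the patch above, illustrate the same rule: only the final path component is examined, and a leading period there is not treated as an extension. This is a rough sketch against a typical posixpath implementation:

    >>> from os.path import splitext
    >>> splitext('foo.tar.gz')            # only the last suffix is split off
    ('foo.tar', '.gz')
    >>> splitext('archive/.hidden')       # leading period of the last component
    ('archive/.hidden', '')
    >>> splitext('archive/dir.d/file')    # periods in parent components are ignored
    ('archive/dir.d/file', '')
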
From webhook-mailer at python.org Mon Jan 3 15:39:13 2022 From: webhook-mailer at python.org (iritkatriel) Date: Mon, 03 Jan 2022 20:39:13 -0000 Subject: [Python-checkins] bpo-34931: [doc] clarify behavior of os.path.splitext() on paths with multiple leading periods (GH-30347) (GH-30369) Message-ID: https://github.com/python/cpython/commit/4a792ca95c1a994b07d18fe06e2104d5b1e0b796 commit: 4a792ca95c1a994b07d18fe06e2104d5b1e0b796 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-03T20:39:02Z summary: bpo-34931: [doc] clarify behavior of os.path.splitext() on paths with multiple leading periods (GH-30347) (GH-30369) (cherry picked from commit 51700bf08b0dd4baf998440b2ebfaa488a2855ba) Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> files: M Doc/library/os.path.rst diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst index 729649200209b..2fb6ea9da8e09 100644 --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -489,11 +489,16 @@ the :mod:`glob` module.) >>> splitext('foo.bar.exe') ('foo.bar', '.exe') + >>> splitext('/foo/bar.exe') + ('/foo/bar', '.exe') - Leading periods on the basename are ignored:: + Leading periods of the last component of the path are considered to + be part of the root:: >>> splitext('.cshrc') ('.cshrc', '') + >>> splitext('/foo/....jpg') + ('/foo/....jpg', '') .. versionchanged:: 3.6 Accepts a :term:`path-like object`. From webhook-mailer at python.org Mon Jan 3 17:26:31 2022 From: webhook-mailer at python.org (rhettinger) Date: Mon, 03 Jan 2022 22:26:31 -0000 Subject: [Python-checkins] Add doctest and improve readability for move_to_end() example. (#30370) Message-ID: https://github.com/python/cpython/commit/770f43d47e8e15747f4f3884992a344f3b547c67 commit: 770f43d47e8e15747f4f3884992a344f3b547c67 branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-03T14:26:08-08:00 summary: Add doctest and improve readability for move_to_end() example. (#30370) files: M Doc/library/collections.rst diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst index 8bf3cb6cb12da..b8a717d883c09 100644 --- a/Doc/library/collections.rst +++ b/Doc/library/collections.rst @@ -1120,14 +1120,16 @@ Some differences from :class:`dict` still remain: Move an existing *key* to either end of an ordered dictionary. The item is moved to the right end if *last* is true (the default) or to the beginning if *last* is false. Raises :exc:`KeyError` if the *key* does - not exist:: + not exist: + + .. doctest:: >>> d = OrderedDict.fromkeys('abcde') >>> d.move_to_end('b') - >>> ''.join(d.keys()) + >>> ''.join(d) 'acdeb' >>> d.move_to_end('b', last=False) - >>> ''.join(d.keys()) + >>> ''.join(d) 'bacde' .. 
versionadded:: 3.2 From webhook-mailer at python.org Mon Jan 3 17:52:17 2022 From: webhook-mailer at python.org (miss-islington) Date: Mon, 03 Jan 2022 22:52:17 -0000 Subject: [Python-checkins] bpo-34538: Remove Exception subclassing from tutorial (GH-30361) Message-ID: https://github.com/python/cpython/commit/2db56130631255ca2eb504519430fb2f1fe789e9 commit: 2db56130631255ca2eb504519430fb2f1fe789e9 branch: main author: Hugo van Kemenade committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-03T14:52:09-08:00 summary: bpo-34538: Remove Exception subclassing from tutorial (GH-30361) Remove the bit about subclassing exceptions. Documentation PR can skip the NEWS label. Automerge-Triggered-By: GH:iritkatriel files: M Doc/tutorial/errors.rst diff --git a/Doc/tutorial/errors.rst b/Doc/tutorial/errors.rst index f2490d65db5d4..3f09db2104068 100644 --- a/Doc/tutorial/errors.rst +++ b/Doc/tutorial/errors.rst @@ -329,41 +329,7 @@ be derived from the :exc:`Exception` class, either directly or indirectly. Exception classes can be defined which do anything any other class can do, but are usually kept simple, often only offering a number of attributes that allow -information about the error to be extracted by handlers for the exception. When -creating a module that can raise several distinct errors, a common practice is -to create a base class for exceptions defined by that module, and subclass that -to create specific exception classes for different error conditions:: - - class Error(Exception): - """Base class for exceptions in this module.""" - pass - - class InputError(Error): - """Exception raised for errors in the input. - - Attributes: - expression -- input expression in which the error occurred - message -- explanation of the error - """ - - def __init__(self, expression, message): - self.expression = expression - self.message = message - - class TransitionError(Error): - """Raised when an operation attempts a state transition that's not - allowed. - - Attributes: - previous -- state at beginning of transition - next -- attempted new state - message -- explanation of why the specific transition is not allowed - """ - - def __init__(self, previous, next, message): - self.previous = previous - self.next = next - self.message = message +information about the error to be extracted by handlers for the exception. Most exceptions are defined with names that end in "Error", similar to the naming of the standard exceptions. From webhook-mailer at python.org Mon Jan 3 18:10:24 2022 From: webhook-mailer at python.org (miss-islington) Date: Mon, 03 Jan 2022 23:10:24 -0000 Subject: [Python-checkins] bpo-34538: Remove Exception subclassing from tutorial (GH-30361) Message-ID: https://github.com/python/cpython/commit/0b3c3cbbaf2967cc17531d65ece0969b0d2a2079 commit: 0b3c3cbbaf2967cc17531d65ece0969b0d2a2079 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-03T15:10:20-08:00 summary: bpo-34538: Remove Exception subclassing from tutorial (GH-30361) Remove the bit about subclassing exceptions. Documentation PR can skip the NEWS label. 
Automerge-Triggered-By: GH:iritkatriel (cherry picked from commit 2db56130631255ca2eb504519430fb2f1fe789e9) Co-authored-by: Hugo van Kemenade files: M Doc/tutorial/errors.rst diff --git a/Doc/tutorial/errors.rst b/Doc/tutorial/errors.rst index f2490d65db5d4..3f09db2104068 100644 --- a/Doc/tutorial/errors.rst +++ b/Doc/tutorial/errors.rst @@ -329,41 +329,7 @@ be derived from the :exc:`Exception` class, either directly or indirectly. Exception classes can be defined which do anything any other class can do, but are usually kept simple, often only offering a number of attributes that allow -information about the error to be extracted by handlers for the exception. When -creating a module that can raise several distinct errors, a common practice is -to create a base class for exceptions defined by that module, and subclass that -to create specific exception classes for different error conditions:: - - class Error(Exception): - """Base class for exceptions in this module.""" - pass - - class InputError(Error): - """Exception raised for errors in the input. - - Attributes: - expression -- input expression in which the error occurred - message -- explanation of the error - """ - - def __init__(self, expression, message): - self.expression = expression - self.message = message - - class TransitionError(Error): - """Raised when an operation attempts a state transition that's not - allowed. - - Attributes: - previous -- state at beginning of transition - next -- attempted new state - message -- explanation of why the specific transition is not allowed - """ - - def __init__(self, previous, next, message): - self.previous = previous - self.next = next - self.message = message +information about the error to be extracted by handlers for the exception. Most exceptions are defined with names that end in "Error", similar to the naming of the standard exceptions. From webhook-mailer at python.org Mon Jan 3 18:19:33 2022 From: webhook-mailer at python.org (miss-islington) Date: Mon, 03 Jan 2022 23:19:33 -0000 Subject: [Python-checkins] bpo-34538: Remove Exception subclassing from tutorial (GH-30361) Message-ID: https://github.com/python/cpython/commit/4affb996ce6353dd029ece0c7d36f7c7c0af2de3 commit: 4affb996ce6353dd029ece0c7d36f7c7c0af2de3 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-03T15:19:29-08:00 summary: bpo-34538: Remove Exception subclassing from tutorial (GH-30361) Remove the bit about subclassing exceptions. Documentation PR can skip the NEWS label. Automerge-Triggered-By: GH:iritkatriel (cherry picked from commit 2db56130631255ca2eb504519430fb2f1fe789e9) Co-authored-by: Hugo van Kemenade files: M Doc/tutorial/errors.rst diff --git a/Doc/tutorial/errors.rst b/Doc/tutorial/errors.rst index 4e2ed1d10a65c..542593ce731e4 100644 --- a/Doc/tutorial/errors.rst +++ b/Doc/tutorial/errors.rst @@ -326,41 +326,7 @@ be derived from the :exc:`Exception` class, either directly or indirectly. Exception classes can be defined which do anything any other class can do, but are usually kept simple, often only offering a number of attributes that allow -information about the error to be extracted by handlers for the exception. 
When -creating a module that can raise several distinct errors, a common practice is -to create a base class for exceptions defined by that module, and subclass that -to create specific exception classes for different error conditions:: - - class Error(Exception): - """Base class for exceptions in this module.""" - pass - - class InputError(Error): - """Exception raised for errors in the input. - - Attributes: - expression -- input expression in which the error occurred - message -- explanation of the error - """ - - def __init__(self, expression, message): - self.expression = expression - self.message = message - - class TransitionError(Error): - """Raised when an operation attempts a state transition that's not - allowed. - - Attributes: - previous -- state at beginning of transition - next -- attempted new state - message -- explanation of why the specific transition is not allowed - """ - - def __init__(self, previous, next, message): - self.previous = previous - self.next = next - self.message = message +information about the error to be extracted by handlers for the exception. Most exceptions are defined with names that end in "Error", similar to the naming of the standard exceptions. From webhook-mailer at python.org Mon Jan 3 18:47:21 2022 From: webhook-mailer at python.org (miss-islington) Date: Mon, 03 Jan 2022 23:47:21 -0000 Subject: [Python-checkins] bpo-44092: Remove unused member `reset` from `sqlite3.Cursor` (GH-30377) Message-ID: https://github.com/python/cpython/commit/f1a58441eea6b7788c64d03a80ea35996301e550 commit: f1a58441eea6b7788c64d03a80ea35996301e550 branch: main author: Erlend Egeberg Aasland committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-03T15:47:16-08:00 summary: bpo-44092: Remove unused member `reset` from `sqlite3.Cursor` (GH-30377) Automerge-Triggered-By: GH:pablogsal files: M Modules/_sqlite/cursor.c M Modules/_sqlite/cursor.h diff --git a/Modules/_sqlite/cursor.c b/Modules/_sqlite/cursor.c index e475d933b5315..2729a85f3195d 100644 --- a/Modules/_sqlite/cursor.c +++ b/Modules/_sqlite/cursor.c @@ -35,8 +35,6 @@ class _sqlite3.Cursor "pysqlite_Cursor *" "clinic_state()->CursorType" [clinic start generated code]*/ /*[clinic end generated code: output=da39a3ee5e6b4b0d input=3c5b8115c5cf30f1]*/ -static const char errmsg_fetch_across_rollback[] = "Cursor needed to be reset because of commit/rollback and can no longer be fetched from."; - /*[clinic input] _sqlite3.Cursor.__init__ as pysqlite_cursor_init @@ -63,8 +61,6 @@ pysqlite_cursor_init_impl(pysqlite_Cursor *self, self->arraysize = 1; self->closed = 0; - self->reset = 0; - self->rowcount = -1L; Py_INCREF(Py_None); @@ -273,12 +269,6 @@ _pysqlite_fetch_one_row(pysqlite_Cursor* self) const char* colname; PyObject* error_msg; - if (self->reset) { - PyObject *exc = self->connection->InterfaceError; - PyErr_SetString(exc, errmsg_fetch_across_rollback); - return NULL; - } - Py_BEGIN_ALLOW_THREADS numcols = sqlite3_data_count(self->statement->st); Py_END_ALLOW_THREADS @@ -482,7 +472,6 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation } self->locked = 1; - self->reset = 0; if (multiple) { if (PyIter_Check(second_argument)) { @@ -731,8 +720,6 @@ pysqlite_cursor_executescript_impl(pysqlite_Cursor *self, return NULL; } - self->reset = 0; - size_t sql_len = strlen(sql_script); int max_length = sqlite3_limit(self->connection->db, SQLITE_LIMIT_SQL_LENGTH, -1); @@ -797,12 +784,6 @@ pysqlite_cursor_iternext(pysqlite_Cursor *self) return NULL; } - 
if (self->reset) { - PyObject *exc = self->connection->InterfaceError; - PyErr_SetString(exc, errmsg_fetch_across_rollback); - return NULL; - } - if (self->statement == NULL) { return NULL; } diff --git a/Modules/_sqlite/cursor.h b/Modules/_sqlite/cursor.h index d26d20a9fc5ea..0bcdddc3e2959 100644 --- a/Modules/_sqlite/cursor.h +++ b/Modules/_sqlite/cursor.h @@ -42,7 +42,6 @@ typedef struct PyObject* row_factory; pysqlite_Statement* statement; int closed; - int reset; int locked; int initialized; From webhook-mailer at python.org Mon Jan 3 21:41:29 2022 From: webhook-mailer at python.org (tim-one) Date: Tue, 04 Jan 2022 02:41:29 -0000 Subject: [Python-checkins] bpo-46233: Minor speedup for bigint squaring (GH-30345) Message-ID: https://github.com/python/cpython/commit/3aa5242b54b0627293d95cfb4a26b2f917f667be commit: 3aa5242b54b0627293d95cfb4a26b2f917f667be branch: main author: Tim Peters committer: tim-one date: 2022-01-03T20:41:16-06:00 summary: bpo-46233: Minor speedup for bigint squaring (GH-30345) x_mul()'s squaring code can do some redundant and/or useless work at the end of each digit pass. A more careful analysis of worst-case carries at various digit positions allows making that code leaner. files: M Lib/test/test_long.py M Objects/longobject.c diff --git a/Lib/test/test_long.py b/Lib/test/test_long.py index 3c8e9e22e17a1..f2a622b5868f0 100644 --- a/Lib/test/test_long.py +++ b/Lib/test/test_long.py @@ -1502,6 +1502,17 @@ class myint(int): self.assertEqual(type(numerator), int) self.assertEqual(type(denominator), int) + def test_square(self): + # Multiplication makes a special case of multiplying an int with + # itself, using a special, faster algorithm. This test is mostly + # to ensure that no asserts in the implementation trigger, in + # cases with a maximal amount of carries. + for bitlen in range(1, 400): + n = (1 << bitlen) - 1 # solid string of 1 bits + with self.subTest(bitlen=bitlen, n=n): + # (2**i - 1)**2 = 2**(2*i) - 2*2**i + 1 + self.assertEqual(n**2, + (1 << (2 * bitlen)) - (1 << (bitlen + 1)) + 1) if __name__ == "__main__": unittest.main() diff --git a/Objects/longobject.c b/Objects/longobject.c index b5648fca7dc5c..2db8701a841a9 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -3237,12 +3237,12 @@ x_mul(PyLongObject *a, PyLongObject *b) * via exploiting that each entry in the multiplication * pyramid appears twice (except for the size_a squares). */ + digit *paend = a->ob_digit + size_a; for (i = 0; i < size_a; ++i) { twodigits carry; twodigits f = a->ob_digit[i]; digit *pz = z->ob_digit + (i << 1); digit *pa = a->ob_digit + i + 1; - digit *paend = a->ob_digit + size_a; SIGCHECK({ Py_DECREF(z); @@ -3265,13 +3265,27 @@ x_mul(PyLongObject *a, PyLongObject *b) assert(carry <= (PyLong_MASK << 1)); } if (carry) { + /* See comment below. pz points at the highest possible + * carry position from the last outer loop iteration, so + * *pz is at most 1. + */ + assert(*pz <= 1); carry += *pz; - *pz++ = (digit)(carry & PyLong_MASK); + *pz = (digit)(carry & PyLong_MASK); carry >>= PyLong_SHIFT; + if (carry) { + /* If there's still a carry, it must be into a position + * that still holds a 0. Where the base + ^ B is 1 << PyLong_SHIFT, the last add was of a carry no + * more than 2*B - 2 to a stored digit no more than 1. + * So the sum was no more than 2*B - 1, so the current + * carry no more than floor((2*B - 1)/B) = 1. 
+ */ + assert(carry == 1); + assert(pz[1] == 0); + pz[1] = (digit)carry; + } } - if (carry) - *pz += (digit)(carry & PyLong_MASK); - assert((carry >> PyLong_SHIFT) == 0); } } else { /* a is not the same as b -- gradeschool int mult */ From webhook-mailer at python.org Tue Jan 4 00:55:46 2022 From: webhook-mailer at python.org (rhettinger) Date: Tue, 04 Jan 2022 05:55:46 -0000 Subject: [Python-checkins] Add doctest and improve readability for move_to_end() example. (GH-30370) (GH-30373) Message-ID: https://github.com/python/cpython/commit/685b6285b9a7109c2c6dca04f32a585445dd0f04 commit: 685b6285b9a7109c2c6dca04f32a585445dd0f04 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: rhettinger date: 2022-01-03T21:55:38-08:00 summary: Add doctest and improve readability for move_to_end() example. (GH-30370) (GH-30373) files: M Doc/library/collections.rst diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst index 8bf3cb6cb12da..b8a717d883c09 100644 --- a/Doc/library/collections.rst +++ b/Doc/library/collections.rst @@ -1120,14 +1120,16 @@ Some differences from :class:`dict` still remain: Move an existing *key* to either end of an ordered dictionary. The item is moved to the right end if *last* is true (the default) or to the beginning if *last* is false. Raises :exc:`KeyError` if the *key* does - not exist:: + not exist: + + .. doctest:: >>> d = OrderedDict.fromkeys('abcde') >>> d.move_to_end('b') - >>> ''.join(d.keys()) + >>> ''.join(d) 'acdeb' >>> d.move_to_end('b', last=False) - >>> ''.join(d.keys()) + >>> ''.join(d) 'bacde' .. versionadded:: 3.2 From webhook-mailer at python.org Tue Jan 4 03:42:35 2022 From: webhook-mailer at python.org (asvetlov) Date: Tue, 04 Jan 2022 08:42:35 -0000 Subject: [Python-checkins] bpo-46238: reuse `_winapi` constants in `asyncio.windows_events` (GH-30352) Message-ID: https://github.com/python/cpython/commit/1b111338cfe7840feea95e30ea8124063c450c65 commit: 1b111338cfe7840feea95e30ea8124063c450c65 branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-04T10:42:19+02:00 summary: bpo-46238: reuse `_winapi` constants in `asyncio.windows_events` (GH-30352) files: A Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 5e7cd795895d6..8c3d73705e3c8 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -28,8 +28,8 @@ ) -NULL = 0 -INFINITE = 0xffffffff +NULL = _winapi.NULL +INFINITE = _winapi.INFINITE ERROR_CONNECTION_REFUSED = 1225 ERROR_CONNECTION_ABORTED = 1236 @@ -405,7 +405,7 @@ async def _make_subprocess_transport(self, protocol, args, shell, class IocpProactor: """Proactor implementation using IOCP.""" - def __init__(self, concurrency=0xffffffff): + def __init__(self, concurrency=INFINITE): self._loop = None self._results = [] self._iocp = _overlapped.CreateIoCompletionPort( diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst new file mode 100644 index 0000000000000..1617b0ed0538a --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst @@ -0,0 +1 @@ +Reuse ``_winapi`` constants in ``asyncio.windows_events``. 
From webhook-mailer at python.org Tue Jan 4 03:44:38 2022 From: webhook-mailer at python.org (asvetlov) Date: Tue, 04 Jan 2022 08:44:38 -0000 Subject: [Python-checkins] bpo-46239: improve error message when importing `asyncio.windows_events` (GH-30353) Message-ID: https://github.com/python/cpython/commit/5a2a65096c3ec2d37f33615f2a420d2ffcabecf2 commit: 5a2a65096c3ec2d37f33615f2a420d2ffcabecf2 branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-04T10:44:26+02:00 summary: bpo-46239: improve error message when importing `asyncio.windows_events` (GH-30353) files: A Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 8c3d73705e3c8..427d4624ad076 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -1,5 +1,10 @@ """Selector and proactor event loops for Windows.""" +import sys + +if sys.platform != 'win32': # pragma: no cover + raise ImportError('win32 only') + import _overlapped import _winapi import errno diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst new file mode 100644 index 0000000000000..202febf84fd10 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst @@ -0,0 +1,2 @@ +Improve error message when importing :mod:`asyncio.windows_events` on +non-Windows. From webhook-mailer at python.org Tue Jan 4 04:14:17 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 04 Jan 2022 09:14:17 -0000 Subject: [Python-checkins] bpo-46239: improve error message when importing `asyncio.windows_events` (GH-30353) Message-ID: https://github.com/python/cpython/commit/86d1b8c13fcaf8a974cf2ae23b31fe87dfdb6267 commit: 86d1b8c13fcaf8a974cf2ae23b31fe87dfdb6267 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-04T01:13:56-08:00 summary: bpo-46239: improve error message when importing `asyncio.windows_events` (GH-30353) (cherry picked from commit 5a2a65096c3ec2d37f33615f2a420d2ffcabecf2) Co-authored-by: Nikita Sobolev files: A Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 5e7cd795895d6..da81ab435b9a6 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -1,5 +1,10 @@ """Selector and proactor event loops for Windows.""" +import sys + +if sys.platform != 'win32': # pragma: no cover + raise ImportError('win32 only') + import _overlapped import _winapi import errno diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst new file mode 100644 index 0000000000000..202febf84fd10 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst @@ -0,0 +1,2 @@ +Improve error message when importing :mod:`asyncio.windows_events` on +non-Windows. 
From webhook-mailer at python.org Tue Jan 4 04:22:47 2022 From: webhook-mailer at python.org (asvetlov) Date: Tue, 04 Jan 2022 09:22:47 -0000 Subject: [Python-checkins] bpo-46239: improve error message when importing `asyncio.windows_events` (GH-30353) (#30388) Message-ID: https://github.com/python/cpython/commit/cf48a148190a6ccadc144cab2e2046e95c20fb57 commit: cf48a148190a6ccadc144cab2e2046e95c20fb57 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-04T11:22:26+02:00 summary: bpo-46239: improve error message when importing `asyncio.windows_events` (GH-30353) (#30388) (cherry picked from commit 5a2a65096c3ec2d37f33615f2a420d2ffcabecf2) Co-authored-by: Nikita Sobolev Co-authored-by: Nikita Sobolev files: A Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 5e7cd795895d6..da81ab435b9a6 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -1,5 +1,10 @@ """Selector and proactor event loops for Windows.""" +import sys + +if sys.platform != 'win32': # pragma: no cover + raise ImportError('win32 only') + import _overlapped import _winapi import errno diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst new file mode 100644 index 0000000000000..202febf84fd10 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst @@ -0,0 +1,2 @@ +Improve error message when importing :mod:`asyncio.windows_events` on +non-Windows. From webhook-mailer at python.org Tue Jan 4 04:26:09 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 04 Jan 2022 09:26:09 -0000 Subject: [Python-checkins] Update old-style strings to f-strings (GH-30384) Message-ID: https://github.com/python/cpython/commit/bef48837e79712868c096ef4f4692dbf1746b6d1 commit: bef48837e79712868c096ef4f4692dbf1746b6d1 branch: main author: David Gilbertson committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-04T01:25:56-08:00 summary: Update old-style strings to f-strings (GH-30384) Let me know if this sort of change is unwanted... files: M Doc/includes/minidom-example.py diff --git a/Doc/includes/minidom-example.py b/Doc/includes/minidom-example.py index 5ee7682c19271..3b9e9ee2db291 100644 --- a/Doc/includes/minidom-example.py +++ b/Doc/includes/minidom-example.py @@ -42,10 +42,10 @@ def handleSlide(slide): handlePoints(slide.getElementsByTagName("point")) def handleSlideshowTitle(title): - print("%s" % getText(title.childNodes)) + print(f"{getText(title.childNodes)}") def handleSlideTitle(title): - print("

<h2>%s</h2>" % getText(title.childNodes)) + print(f"<h2>{getText(title.childNodes)}</h2>") def handlePoints(points): print("<ul>") @@ -54,11 +54,11 @@ def handlePoints(points): print("</ul>") def handlePoint(point): - print("<li>%s</li>" % getText(point.childNodes)) + print(f"<li>{getText(point.childNodes)}</li>") def handleToc(slides): for slide in slides: title = slide.getElementsByTagName("title")[0] - print("<p>%s</p>" % getText(title.childNodes)) + print(f"<p>{getText(title.childNodes)}</p>
    ") handleSlideshow(dom) From webhook-mailer at python.org Tue Jan 4 04:36:39 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 04 Jan 2022 09:36:39 -0000 Subject: [Python-checkins] bpo-44092: Move What's New entry to where it belongs (GH-30381) Message-ID: https://github.com/python/cpython/commit/a09062c267a94200ad299f779429fea1b571ee35 commit: a09062c267a94200ad299f779429fea1b571ee35 branch: main author: Erlend Egeberg Aasland committer: pablogsal date: 2022-01-04T09:36:30Z summary: bpo-44092: Move What's New entry to where it belongs (GH-30381) files: M Doc/whatsnew/3.11.rst diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 4ddca744720f5..be6cb158a8049 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -291,6 +291,10 @@ sqlite3 experience. (Contributed by Erlend E. Aasland in :issue:`45828`.) +* Fetch across rollback no longer raises :exc:`~sqlite3.InterfaceError`. + Instead we leave it to the SQLite library to handle these cases. + (Contributed by Erlend E. Aasland in :issue:`44092`.) + sys --- @@ -302,11 +306,6 @@ sys (Contributed by Irit Katriel in :issue:`45711`.) -* Fetch across rollback no longer raises :exc:`~sqlite3.InterfaceError`. - Instead we leave it to the SQLite library to handle these cases. - (Contributed by Erlend E. Aasland in :issue:`44092`.) - - threading --------- From webhook-mailer at python.org Tue Jan 4 05:37:16 2022 From: webhook-mailer at python.org (markshannon) Date: Tue, 04 Jan 2022 10:37:16 -0000 Subject: [Python-checkins] bpo-46202: Remove opcode POP_EXCEPT_AND_RERAISE (GH-30302) Message-ID: https://github.com/python/cpython/commit/a94461d7189d7f1147ab304a332c8684263dc17e commit: a94461d7189d7f1147ab304a332c8684263dc17e branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: markshannon date: 2022-01-04T10:37:12Z summary: bpo-46202: Remove opcode POP_EXCEPT_AND_RERAISE (GH-30302) * bpo-46202: remove opcode POP_EXCEPT_AND_RERAISE * do not assume that an exception group is truthy files: A Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst M Doc/library/dis.rst M Include/opcode.h M Lib/importlib/_bootstrap_external.py M Lib/opcode.py M Lib/test/test_code.py M Lib/test/test_dis.py M Objects/frameobject.c M Python/ceval.c M Python/compile.c M Python/opcode_targets.h diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index 14de191265cf2..87ec584789d31 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -603,16 +603,6 @@ iterations of the loop. The ``__exit__`` function is in position 4 of the stack rather than 7. Exception representation on the stack now consist of one, not three, items. -.. opcode:: POP_EXCEPT_AND_RERAISE - - Pops the exception currently on top of the stack. Pops the integer value on top - of the stack and sets the ``f_lasti`` attribute of the frame with that value. - Then pops the next exception from the stack uses it to restore the current exception. - Finally it re-raises the originally popped exception. - Used in exception handler cleanup. - - .. versionadded:: 3.11 - .. 
opcode:: LOAD_ASSERTION_ERROR diff --git a/Include/opcode.h b/Include/opcode.h index 05565267941fd..ef334de601fee 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -24,7 +24,6 @@ extern "C" { #define MATCH_SEQUENCE 32 #define MATCH_KEYS 33 #define PUSH_EXC_INFO 35 -#define POP_EXCEPT_AND_RERAISE 37 #define WITH_EXCEPT_START 49 #define GET_AITER 50 #define GET_ANEXT 51 @@ -132,42 +131,42 @@ extern "C" { #define BINARY_SUBSCR_TUPLE_INT 29 #define BINARY_SUBSCR_DICT 34 #define STORE_SUBSCR_ADAPTIVE 36 -#define STORE_SUBSCR_LIST_INT 38 -#define STORE_SUBSCR_DICT 39 -#define CALL_NO_KW_ADAPTIVE 40 -#define CALL_NO_KW_BUILTIN_O 41 -#define CALL_NO_KW_BUILTIN_FAST 42 -#define CALL_NO_KW_LEN 43 -#define CALL_NO_KW_ISINSTANCE 44 -#define CALL_NO_KW_PY_SIMPLE 45 -#define CALL_NO_KW_LIST_APPEND 46 -#define CALL_NO_KW_METHOD_DESCRIPTOR_O 47 -#define CALL_NO_KW_TYPE_1 48 -#define CALL_NO_KW_BUILTIN_CLASS_1 55 -#define CALL_NO_KW_METHOD_DESCRIPTOR_FAST 56 -#define JUMP_ABSOLUTE_QUICK 57 -#define LOAD_ATTR_ADAPTIVE 58 -#define LOAD_ATTR_INSTANCE_VALUE 59 -#define LOAD_ATTR_WITH_HINT 62 -#define LOAD_ATTR_SLOT 63 -#define LOAD_ATTR_MODULE 64 -#define LOAD_GLOBAL_ADAPTIVE 65 -#define LOAD_GLOBAL_MODULE 66 -#define LOAD_GLOBAL_BUILTIN 67 -#define LOAD_METHOD_ADAPTIVE 72 -#define LOAD_METHOD_CACHED 75 -#define LOAD_METHOD_CLASS 76 -#define LOAD_METHOD_MODULE 77 -#define LOAD_METHOD_NO_DICT 78 -#define STORE_ATTR_ADAPTIVE 79 -#define STORE_ATTR_INSTANCE_VALUE 80 -#define STORE_ATTR_SLOT 81 -#define STORE_ATTR_WITH_HINT 87 -#define LOAD_FAST__LOAD_FAST 128 -#define STORE_FAST__LOAD_FAST 131 -#define LOAD_FAST__LOAD_CONST 134 -#define LOAD_CONST__LOAD_FAST 140 -#define STORE_FAST__STORE_FAST 141 +#define STORE_SUBSCR_LIST_INT 37 +#define STORE_SUBSCR_DICT 38 +#define CALL_NO_KW_ADAPTIVE 39 +#define CALL_NO_KW_BUILTIN_O 40 +#define CALL_NO_KW_BUILTIN_FAST 41 +#define CALL_NO_KW_LEN 42 +#define CALL_NO_KW_ISINSTANCE 43 +#define CALL_NO_KW_PY_SIMPLE 44 +#define CALL_NO_KW_LIST_APPEND 45 +#define CALL_NO_KW_METHOD_DESCRIPTOR_O 46 +#define CALL_NO_KW_TYPE_1 47 +#define CALL_NO_KW_BUILTIN_CLASS_1 48 +#define CALL_NO_KW_METHOD_DESCRIPTOR_FAST 55 +#define JUMP_ABSOLUTE_QUICK 56 +#define LOAD_ATTR_ADAPTIVE 57 +#define LOAD_ATTR_INSTANCE_VALUE 58 +#define LOAD_ATTR_WITH_HINT 59 +#define LOAD_ATTR_SLOT 62 +#define LOAD_ATTR_MODULE 63 +#define LOAD_GLOBAL_ADAPTIVE 64 +#define LOAD_GLOBAL_MODULE 65 +#define LOAD_GLOBAL_BUILTIN 66 +#define LOAD_METHOD_ADAPTIVE 67 +#define LOAD_METHOD_CACHED 72 +#define LOAD_METHOD_CLASS 75 +#define LOAD_METHOD_MODULE 76 +#define LOAD_METHOD_NO_DICT 77 +#define STORE_ATTR_ADAPTIVE 78 +#define STORE_ATTR_INSTANCE_VALUE 79 +#define STORE_ATTR_SLOT 80 +#define STORE_ATTR_WITH_HINT 81 +#define LOAD_FAST__LOAD_FAST 87 +#define STORE_FAST__LOAD_FAST 128 +#define LOAD_FAST__LOAD_CONST 131 +#define LOAD_CONST__LOAD_FAST 134 +#define STORE_FAST__STORE_FAST 140 #define DO_TRACING 255 #ifdef NEED_OPCODE_JUMP_TABLES static uint32_t _PyOpcode_RelativeJump[8] = { diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 095c1274bebaf..29324664cea86 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -376,6 +376,7 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3468 (Add SEND opcode) # Python 3.11a4 3469 (bpo-45711: remove type, traceback from exc_info) # Python 3.11a4 3470 (bpo-46221: PREP_RERAISE_STAR no longer pushes lasti) +# Python 3.11a4 3471 (bpo-46202: remove pop POP_EXCEPT_AND_RERAISE) # # MAGIC 
must change whenever the bytecode emitted by the compiler may no @@ -385,7 +386,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. -MAGIC_NUMBER = (3470).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3471).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/opcode.py b/Lib/opcode.py index e654a1088b7ea..9bbff182f08fd 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -77,8 +77,6 @@ def jabs_op(name, op): def_op('PUSH_EXC_INFO', 35) -def_op('POP_EXCEPT_AND_RERAISE', 37) - def_op('WITH_EXCEPT_START', 49) def_op('GET_AITER', 50) def_op('GET_ANEXT', 51) diff --git a/Lib/test/test_code.py b/Lib/test/test_code.py index b42213bde0744..88f6c782a68e4 100644 --- a/Lib/test/test_code.py +++ b/Lib/test/test_code.py @@ -383,7 +383,9 @@ def test_co_positions_artificial_instructions(self): ("STORE_NAME", "e"), # XX: we know the location for this ("DELETE_NAME", "e"), ("RERAISE", 1), - ("POP_EXCEPT_AND_RERAISE", None) + ("COPY", 3), + ("POP_EXCEPT", None), + ("RERAISE", 1) ] ) diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py index 93b24da317575..7857458e240a5 100644 --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -329,7 +329,9 @@ def bug42562(): 46 RERAISE 1 %3d >> 48 RERAISE 0 - >> 50 POP_EXCEPT_AND_RERAISE + >> 50 COPY 3 + 52 POP_EXCEPT + 54 RERAISE 1 ExceptionTable: 2 to 8 -> 14 [0] 14 to 20 -> 50 [1] lasti @@ -390,7 +392,9 @@ def _tryfinallyconst(b): 16 CALL_NO_KW 0 18 POP_TOP 20 RERAISE 0 - >> 22 POP_EXCEPT_AND_RERAISE + >> 22 COPY 3 + 24 POP_EXCEPT + 26 RERAISE 1 ExceptionTable: 2 to 2 -> 12 [0] 12 to 20 -> 22 [1] lasti @@ -414,7 +418,9 @@ def _tryfinallyconst(b): 18 CALL_NO_KW 0 20 POP_TOP 22 RERAISE 0 - >> 24 POP_EXCEPT_AND_RERAISE + >> 24 COPY 3 + 26 POP_EXCEPT + 28 RERAISE 1 ExceptionTable: 14 to 22 -> 24 [1] lasti """ % (_tryfinallyconst.__code__.co_firstlineno + 1, @@ -1105,7 +1111,7 @@ def _prepare_test_cases(): Instruction(opname='LOAD_CONST', opcode=100, arg=7, argval=0, argrepr='0', offset=108, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='BINARY_OP', opcode=122, arg=11, argval=11, argrepr='/', offset=110, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=112, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=12, argval=140, argrepr='to 140', offset=114, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=14, argval=144, argrepr='to 144', offset=114, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=116, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_GLOBAL', opcode=116, arg=2, argval='ZeroDivisionError', argrepr='ZeroDivisionError', offset=118, starts_line=22, is_jump_target=False, positions=None), Instruction(opname='JUMP_IF_NOT_EXC_MATCH', opcode=121, arg=68, argval=136, argrepr='to 136', offset=120, starts_line=None, is_jump_target=False, positions=None), @@ -1115,52 +1121,57 @@ def _prepare_test_cases(): Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=128, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, 
argval=None, argrepr='', offset=130, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=132, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=30, argval=196, argrepr='to 196', offset=134, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=34, argval=204, argrepr='to 204', offset=134, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=136, starts_line=22, is_jump_target=True, positions=None), - Instruction(opname='POP_EXCEPT_AND_RERAISE', opcode=37, arg=None, argval=None, argrepr='', offset=138, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=140, starts_line=25, is_jump_target=True, positions=None), - Instruction(opname='BEFORE_WITH', opcode=53, arg=None, argval=None, argrepr='', offset=142, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=1, argval='dodgy', argrepr='dodgy', offset=144, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=146, starts_line=26, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=9, argval='Never reach this', argrepr="'Never reach this'", offset=148, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=150, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=152, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=154, starts_line=25, is_jump_target=False, positions=None), - Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=156, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=158, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=3, argval=3, argrepr='', offset=160, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=162, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=9, argval=184, argrepr='to 184', offset=164, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=166, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='WITH_EXCEPT_START', opcode=49, arg=None, argval=None, argrepr='', offset=168, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_IF_TRUE', opcode=115, arg=88, argval=176, argrepr='to 176', offset=170, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=2, argval=2, argrepr='', offset=172, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT_AND_RERAISE', opcode=37, arg=None, argval=None, argrepr='', offset=174, starts_line=None, 
is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=176, starts_line=None, is_jump_target=True, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=178, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=180, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=182, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=184, starts_line=28, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=186, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=188, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=138, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=140, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=142, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=144, starts_line=25, is_jump_target=True, positions=None), + Instruction(opname='BEFORE_WITH', opcode=53, arg=None, argval=None, argrepr='', offset=146, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=1, argval='dodgy', argrepr='dodgy', offset=148, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=150, starts_line=26, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=9, argval='Never reach this', argrepr="'Never reach this'", offset=152, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=154, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=156, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=158, starts_line=25, is_jump_target=False, positions=None), + Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=160, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=162, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=3, argval=3, argrepr='', offset=164, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=166, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=11, argval=192, argrepr='to 192', offset=168, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', 
opcode=35, arg=None, argval=None, argrepr='', offset=170, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='WITH_EXCEPT_START', opcode=49, arg=None, argval=None, argrepr='', offset=172, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_IF_TRUE', opcode=115, arg=92, argval=184, argrepr='to 184', offset=174, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=2, argval=2, argrepr='', offset=176, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=178, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=180, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=182, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=184, starts_line=None, is_jump_target=True, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=186, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=188, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=190, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=192, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=194, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=196, starts_line=23, is_jump_target=True, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=198, starts_line=28, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=200, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=202, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=204, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=206, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=208, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=210, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=212, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=214, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=216, 
starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=218, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=220, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT_AND_RERAISE', opcode=37, arg=None, argval=None, argrepr='', offset=222, starts_line=None, is_jump_target=False, positions=None), -] + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=192, starts_line=28, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=194, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=196, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=198, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=200, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=202, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=204, starts_line=23, is_jump_target=True, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=206, starts_line=28, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=208, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=210, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=212, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=214, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=216, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=218, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=220, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=222, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=224, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=226, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=228, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=230, starts_line=None, 
is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=232, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=234, starts_line=None, is_jump_target=False, positions=None)] # One last piece of inspect fodder to check the default line number handling def simple(): pass diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst new file mode 100644 index 0000000000000..ee0a9038837de --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst @@ -0,0 +1,2 @@ +Remove :opcode:`POP_EXCEPT_AND_RERAISE` and replace it by an equivalent +sequence of other opcodes. diff --git a/Objects/frameobject.c b/Objects/frameobject.c index fc62713aa241a..078fcfc6cf607 100644 --- a/Objects/frameobject.c +++ b/Objects/frameobject.c @@ -289,7 +289,6 @@ mark_stacks(PyCodeObject *code_obj, int len) case RETURN_VALUE: case RAISE_VARARGS: case RERAISE: - case POP_EXCEPT_AND_RERAISE: /* End of block */ break; case GEN_START: diff --git a/Python/ceval.c b/Python/ceval.c index 43925e6db269c..81bea44465dc7 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -2725,31 +2725,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr DISPATCH(); } - TARGET(POP_EXCEPT_AND_RERAISE) { - PyObject *lasti = PEEK(2); - if (PyLong_Check(lasti)) { - frame->f_lasti = PyLong_AsLong(lasti); - assert(!_PyErr_Occurred(tstate)); - } - else { - _PyErr_SetString(tstate, PyExc_SystemError, "lasti is not an int"); - goto error; - } - PyObject *value = POP(); - assert(value); - assert(PyExceptionInstance_Check(value)); - PyObject *type = Py_NewRef(PyExceptionInstance_Class(value)); - PyObject *traceback = PyException_GetTraceback(value); - Py_DECREF(POP()); /* lasti */ - _PyErr_Restore(tstate, type, value, traceback); - - _PyErr_StackItem *exc_info = tstate->exc_info; - value = exc_info->exc_value; - exc_info->exc_value = POP(); - Py_XDECREF(value); - goto exception_unwind; - } - TARGET(RERAISE) { if (oparg) { PyObject *lasti = PEEK(oparg + 1); diff --git a/Python/compile.c b/Python/compile.c index 48250b5dba973..9d3752936266c 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -1049,8 +1049,6 @@ stack_effect(int opcode, int oparg, int jump) return 0; case POP_EXCEPT: return -1; - case POP_EXCEPT_AND_RERAISE: - return -3; case STORE_NAME: return -1; @@ -1669,6 +1667,9 @@ compiler_addop_j_noline(struct compiler *c, int opcode, basicblock *b) #define ADD_YIELD_FROM(C) \ RETURN_IF_FALSE(compiler_add_yield_from((C))) +#define POP_EXCEPT_AND_RERAISE(C) \ + RETURN_IF_FALSE(compiler_pop_except_and_reraise((C))) + #define VISIT(C, TYPE, V) {\ if (!compiler_visit_ ## TYPE((C), (V))) \ return 0; \ @@ -1839,6 +1840,22 @@ compiler_add_yield_from(struct compiler *c) return 1; } +static int +compiler_pop_except_and_reraise(struct compiler *c) +{ + /* Stack contents + * [exc_info, lasti, exc] COPY 3 + * [exc_info, lasti, exc, exc_info] POP_EXCEPT + * [exc_info, lasti, exc] RERAISE 1 + * (exception_unwind clears the stack) + */ + + ADDOP_I(c, COPY, 3); + ADDOP(c, POP_EXCEPT); + ADDOP_I(c, RERAISE, 1); + return 1; +} + /* Unwind a frame block. If preserve_tos is true, the TOS before * popping the blocks will be restored afterwards, unless another * return, break or continue is found. 
In which case, the TOS will @@ -3235,7 +3252,7 @@ compiler_try_finally(struct compiler *c, stmt_ty s) compiler_pop_fblock(c, FINALLY_END, end); ADDOP_I(c, RERAISE, 0); compiler_use_next_block(c, cleanup); - ADDOP(c, POP_EXCEPT_AND_RERAISE); + POP_EXCEPT_AND_RERAISE(c); compiler_use_next_block(c, exit); return 1; } @@ -3290,7 +3307,7 @@ compiler_try_star_finally(struct compiler *c, stmt_ty s) compiler_pop_fblock(c, FINALLY_END, end); ADDOP_I(c, RERAISE, 0); compiler_use_next_block(c, cleanup); - ADDOP(c, POP_EXCEPT_AND_RERAISE); + POP_EXCEPT_AND_RERAISE(c); compiler_use_next_block(c, exit); return 1; } @@ -3446,7 +3463,7 @@ compiler_try_except(struct compiler *c, stmt_ty s) compiler_pop_fblock(c, EXCEPTION_HANDLER, NULL); ADDOP_I(c, RERAISE, 0); compiler_use_next_block(c, cleanup); - ADDOP(c, POP_EXCEPT_AND_RERAISE); + POP_EXCEPT_AND_RERAISE(c); compiler_use_next_block(c, orelse); VISIT_SEQ(c, stmt, s->v.Try.orelse); ADDOP_JUMP(c, JUMP_FORWARD, end); @@ -3497,7 +3514,7 @@ compiler_try_except(struct compiler *c, stmt_ty s) [exc] RER: ROT_TWO [exc, prev_exc_info] POP_EXCEPT - [exc] RERAISE 0 + [exc] RERAISE 0 [] L0: */ @@ -3677,7 +3694,7 @@ compiler_try_star_except(struct compiler *c, stmt_ty s) ADDOP(c, POP_EXCEPT); ADDOP_I(c, RERAISE, 0); compiler_use_next_block(c, cleanup); - ADDOP(c, POP_EXCEPT_AND_RERAISE); + POP_EXCEPT_AND_RERAISE(c); compiler_use_next_block(c, orelse); VISIT_SEQ(c, stmt, s->v.TryStar.orelse); ADDOP_JUMP(c, JUMP_FORWARD, end); @@ -5429,7 +5446,7 @@ compiler_with_except_finish(struct compiler *c, basicblock * cleanup) { NEXT_BLOCK(c); ADDOP_I(c, RERAISE, 2); compiler_use_next_block(c, cleanup); - ADDOP(c, POP_EXCEPT_AND_RERAISE); + POP_EXCEPT_AND_RERAISE(c); compiler_use_next_block(c, exit); ADDOP(c, POP_TOP); /* exc_value */ ADDOP(c, POP_BLOCK); @@ -7032,8 +7049,7 @@ stackdepth(struct compiler *c) instr->i_opcode == JUMP_FORWARD || instr->i_opcode == RETURN_VALUE || instr->i_opcode == RAISE_VARARGS || - instr->i_opcode == RERAISE || - instr->i_opcode == POP_EXCEPT_AND_RERAISE) + instr->i_opcode == RERAISE) { /* remaining code is dead */ next = NULL; @@ -8756,7 +8772,6 @@ normalize_basic_block(basicblock *bb) { case RETURN_VALUE: case RAISE_VARARGS: case RERAISE: - case POP_EXCEPT_AND_RERAISE: bb->b_exit = 1; bb->b_nofallthrough = 1; break; diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index 3ee0b9c7a904c..a8f1398bfa66d 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -36,7 +36,6 @@ static void *opcode_targets[256] = { &&TARGET_BINARY_SUBSCR_DICT, &&TARGET_PUSH_EXC_INFO, &&TARGET_STORE_SUBSCR_ADAPTIVE, - &&TARGET_POP_EXCEPT_AND_RERAISE, &&TARGET_STORE_SUBSCR_LIST_INT, &&TARGET_STORE_SUBSCR_DICT, &&TARGET_CALL_NO_KW_ADAPTIVE, @@ -48,45 +47,46 @@ static void *opcode_targets[256] = { &&TARGET_CALL_NO_KW_LIST_APPEND, &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_O, &&TARGET_CALL_NO_KW_TYPE_1, + &&TARGET_CALL_NO_KW_BUILTIN_CLASS_1, &&TARGET_WITH_EXCEPT_START, &&TARGET_GET_AITER, &&TARGET_GET_ANEXT, &&TARGET_BEFORE_ASYNC_WITH, &&TARGET_BEFORE_WITH, &&TARGET_END_ASYNC_FOR, - &&TARGET_CALL_NO_KW_BUILTIN_CLASS_1, &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_FAST, &&TARGET_JUMP_ABSOLUTE_QUICK, &&TARGET_LOAD_ATTR_ADAPTIVE, &&TARGET_LOAD_ATTR_INSTANCE_VALUE, + &&TARGET_LOAD_ATTR_WITH_HINT, &&TARGET_STORE_SUBSCR, &&TARGET_DELETE_SUBSCR, - &&TARGET_LOAD_ATTR_WITH_HINT, &&TARGET_LOAD_ATTR_SLOT, &&TARGET_LOAD_ATTR_MODULE, &&TARGET_LOAD_GLOBAL_ADAPTIVE, &&TARGET_LOAD_GLOBAL_MODULE, &&TARGET_LOAD_GLOBAL_BUILTIN, + &&TARGET_LOAD_METHOD_ADAPTIVE, &&TARGET_GET_ITER, 
&&TARGET_GET_YIELD_FROM_ITER, &&TARGET_PRINT_EXPR, &&TARGET_LOAD_BUILD_CLASS, - &&TARGET_LOAD_METHOD_ADAPTIVE, + &&TARGET_LOAD_METHOD_CACHED, &&TARGET_GET_AWAITABLE, &&TARGET_LOAD_ASSERTION_ERROR, - &&TARGET_LOAD_METHOD_CACHED, &&TARGET_LOAD_METHOD_CLASS, &&TARGET_LOAD_METHOD_MODULE, &&TARGET_LOAD_METHOD_NO_DICT, &&TARGET_STORE_ATTR_ADAPTIVE, &&TARGET_STORE_ATTR_INSTANCE_VALUE, &&TARGET_STORE_ATTR_SLOT, + &&TARGET_STORE_ATTR_WITH_HINT, &&TARGET_LIST_TO_TUPLE, &&TARGET_RETURN_VALUE, &&TARGET_IMPORT_STAR, &&TARGET_SETUP_ANNOTATIONS, &&TARGET_YIELD_VALUE, - &&TARGET_STORE_ATTR_WITH_HINT, + &&TARGET_LOAD_FAST__LOAD_FAST, &&TARGET_PREP_RERAISE_STAR, &&TARGET_POP_EXCEPT, &&TARGET_STORE_NAME, @@ -127,20 +127,20 @@ static void *opcode_targets[256] = { &&TARGET_STORE_FAST, &&TARGET_DELETE_FAST, &&TARGET_JUMP_IF_NOT_EG_MATCH, - &&TARGET_LOAD_FAST__LOAD_FAST, + &&TARGET_STORE_FAST__LOAD_FAST, &&TARGET_GEN_START, &&TARGET_RAISE_VARARGS, - &&TARGET_STORE_FAST__LOAD_FAST, + &&TARGET_LOAD_FAST__LOAD_CONST, &&TARGET_MAKE_FUNCTION, &&TARGET_BUILD_SLICE, - &&TARGET_LOAD_FAST__LOAD_CONST, + &&TARGET_LOAD_CONST__LOAD_FAST, &&TARGET_MAKE_CELL, &&TARGET_LOAD_CLOSURE, &&TARGET_LOAD_DEREF, &&TARGET_STORE_DEREF, &&TARGET_DELETE_DEREF, - &&TARGET_LOAD_CONST__LOAD_FAST, &&TARGET_STORE_FAST__STORE_FAST, + &&_unknown_opcode, &&TARGET_CALL_FUNCTION_EX, &&_unknown_opcode, &&TARGET_EXTENDED_ARG, From webhook-mailer at python.org Tue Jan 4 05:41:31 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 04 Jan 2022 10:41:31 -0000 Subject: [Python-checkins] bpo-46240: Correct the error for unclosed parentheses when the tokenizer is not finished (GH-30378) Message-ID: https://github.com/python/cpython/commit/70f415fb8b632247e28d87998642317ca7a652ae commit: 70f415fb8b632247e28d87998642317ca7a652ae branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-04T10:41:22Z summary: bpo-46240: Correct the error for unclosed parentheses when the tokenizer is not finished (GH-30378) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst M Lib/test/test_exceptions.py M Lib/test/test_syntax.py M Parser/pegen_errors.c diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index 3e7808c449955..c04b57f5630ab 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -227,7 +227,7 @@ def testSyntaxErrorOffset(self): check('x = "a', 1, 5) check('lambda x: x = 2', 1, 1) check('f{a + b + c}', 1, 2) - check('[file for str(file) in []\n])', 1, 11) + check('[file for str(file) in []\n]', 1, 11) check('a = ? hello ? ? 
world ?', 1, 5) check('[\nfile\nfor str(file)\nin\n[]\n]', 3, 5) check('[file for\n str(file) in []]', 2, 2) diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index c95bc15e7273d..968d34809ce43 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -1663,6 +1663,9 @@ def test_error_parenthesis(self): for paren in "([{": self._check_error(paren + "1 + 2", f"\\{paren}' was never closed") + for paren in "([{": + self._check_error(f"a = {paren} 1, 2, 3\nb=3", f"\\{paren}' was never closed") + for paren in ")]}": self._check_error(paren + "1 + 2", f"unmatched '\\{paren}'") diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst new file mode 100644 index 0000000000000..a7702ebafbd46 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst @@ -0,0 +1,3 @@ +Correct the error message for unclosed parentheses when the tokenizer +doesn't reach the end of the source when the error is reported. Patch by +Pablo Galindo diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index 93057d151db38..f07d9d8a34df7 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -388,7 +388,8 @@ _Pypegen_set_syntax_error(Parser* p, Token* last_token) { if (PyErr_Occurred()) { // Prioritize tokenizer errors to custom syntax errors raised // on the second phase only if the errors come from the parser. - if (p->tok->done == E_DONE && PyErr_ExceptionMatches(PyExc_SyntaxError)) { + int is_tok_ok = (p->tok->done == E_DONE || p->tok->done == E_OK); + if (is_tok_ok && PyErr_ExceptionMatches(PyExc_SyntaxError)) { _PyPegen_tokenize_full_source_to_check_for_errors(p); } // Propagate the existing syntax error. 
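A minimal sketch of the behaviour pinned down by the new test case, assuming an interpreter built with this fix (the source string mirrors the test input, and the expected message comes from the test's assertion)::

    src = "a = ( 1, 2, 3\nb=3"
    try:
        compile(src, "<example>", "exec")
    except SyntaxError as err:
        print(err.msg)  # expected to read: '(' was never closed
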
From webhook-mailer at python.org Tue Jan 4 05:42:19 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 04 Jan 2022 10:42:19 -0000 Subject: [Python-checkins] bpo-46231: Remove invalid_* rules preceded by more tokens from the grammar docs (GH-30341) Message-ID: https://github.com/python/cpython/commit/e09d94a140a5f6903017da9b6ac752ba041d69da commit: e09d94a140a5f6903017da9b6ac752ba041d69da branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-04T10:42:15Z summary: bpo-46231: Remove invalid_* rules preceded by more tokens from the grammar docs (GH-30341) files: M Doc/tools/extensions/peg_highlight.py diff --git a/Doc/tools/extensions/peg_highlight.py b/Doc/tools/extensions/peg_highlight.py index 42101be10ea9b..27f54cdf593c8 100644 --- a/Doc/tools/extensions/peg_highlight.py +++ b/Doc/tools/extensions/peg_highlight.py @@ -56,8 +56,8 @@ class PEGLexer(RegexLexer): (_name + _text_ws + r"(\[[\w\d_\*]+?\])" + _text_ws + "(=)", bygroups(None, None, None, None, None),), ], "invalids": [ - (r"^(\s+\|\s+invalid_\w+\s*\n)", bygroups(None)), - (r"^(\s+\|\s+incorrect_\w+\s*\n)", bygroups(None)), + (r"^(\s+\|\s+.*invalid_\w+.*\n)", bygroups(None)), + (r"^(\s+\|\s+.*incorrect_\w+.*\n)", bygroups(None)), (r"^(#.*invalid syntax.*(?:.|\n)*)", bygroups(None),), ], "root": [ From webhook-mailer at python.org Tue Jan 4 06:03:52 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 04 Jan 2022 11:03:52 -0000 Subject: [Python-checkins] bpo-46231: Remove invalid_* rules preceded by more tokens from the grammar docs (GH-30341) Message-ID: https://github.com/python/cpython/commit/743394f2811796b30b618d4cb6dd582715f8638c commit: 743394f2811796b30b618d4cb6dd582715f8638c branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-04T03:03:46-08:00 summary: bpo-46231: Remove invalid_* rules preceded by more tokens from the grammar docs (GH-30341) (cherry picked from commit e09d94a140a5f6903017da9b6ac752ba041d69da) Co-authored-by: Pablo Galindo Salgado files: M Doc/tools/extensions/peg_highlight.py diff --git a/Doc/tools/extensions/peg_highlight.py b/Doc/tools/extensions/peg_highlight.py index 42101be10ea9b..27f54cdf593c8 100644 --- a/Doc/tools/extensions/peg_highlight.py +++ b/Doc/tools/extensions/peg_highlight.py @@ -56,8 +56,8 @@ class PEGLexer(RegexLexer): (_name + _text_ws + r"(\[[\w\d_\*]+?\])" + _text_ws + "(=)", bygroups(None, None, None, None, None),), ], "invalids": [ - (r"^(\s+\|\s+invalid_\w+\s*\n)", bygroups(None)), - (r"^(\s+\|\s+incorrect_\w+\s*\n)", bygroups(None)), + (r"^(\s+\|\s+.*invalid_\w+.*\n)", bygroups(None)), + (r"^(\s+\|\s+.*incorrect_\w+.*\n)", bygroups(None)), (r"^(#.*invalid syntax.*(?:.|\n)*)", bygroups(None),), ], "root": [ From webhook-mailer at python.org Tue Jan 4 06:35:06 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 04 Jan 2022 11:35:06 -0000 Subject: [Python-checkins] bpo-33252: Document that ResourceWarning is ignored by default (GH-30358) Message-ID: https://github.com/python/cpython/commit/b949845b36b999185ed2bdf8a04dca1da39f3002 commit: b949845b36b999185ed2bdf8a04dca1da39f3002 branch: main author: Hugo van Kemenade committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-04T03:34:31-08:00 summary: bpo-33252: Document that ResourceWarning is ignored by default (GH-30358) `ResourceWarning` is ignored by default. 
Document this behaviour, for consistency with others in this table such as `DeprecationWarning`. Documentation PR can skip NEWS file. Automerge-Triggered-By: GH:iritkatriel files: M Doc/library/warnings.rst diff --git a/Doc/library/warnings.rst b/Doc/library/warnings.rst index fe11aabbcbdd6..289b28229e1a0 100644 --- a/Doc/library/warnings.rst +++ b/Doc/library/warnings.rst @@ -105,7 +105,7 @@ The following warnings category classes are currently defined: | | :class:`bytes` and :class:`bytearray`. | +----------------------------------+-----------------------------------------------+ | :exc:`ResourceWarning` | Base category for warnings related to | -| | resource usage. | +| | resource usage (ignored by default). | +----------------------------------+-----------------------------------------------+ .. versionchanged:: 3.7 From webhook-mailer at python.org Tue Jan 4 07:03:44 2022 From: webhook-mailer at python.org (iritkatriel) Date: Tue, 04 Jan 2022 12:03:44 -0000 Subject: [Python-checkins] bpo-33252: Document that ResourceWarning is ignored by default (GH-30358) (GH-30395) Message-ID: https://github.com/python/cpython/commit/01b12942d0ba2fd3c2efdfb796e8816efc607ee7 commit: 01b12942d0ba2fd3c2efdfb796e8816efc607ee7 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-04T12:03:38Z summary: bpo-33252: Document that ResourceWarning is ignored by default (GH-30358) (GH-30395) `ResourceWarning` is ignored by default. Document this behaviour, for consistency with others in this table such as `DeprecationWarning`. Documentation PR can skip NEWS file. Automerge-Triggered-By: GH:iritkatriel (cherry picked from commit b949845b36b999185ed2bdf8a04dca1da39f3002) Co-authored-by: Hugo van Kemenade Co-authored-by: Hugo van Kemenade files: M Doc/library/warnings.rst diff --git a/Doc/library/warnings.rst b/Doc/library/warnings.rst index fe11aabbcbdd6..289b28229e1a0 100644 --- a/Doc/library/warnings.rst +++ b/Doc/library/warnings.rst @@ -105,7 +105,7 @@ The following warnings category classes are currently defined: | | :class:`bytes` and :class:`bytearray`. | +----------------------------------+-----------------------------------------------+ | :exc:`ResourceWarning` | Base category for warnings related to | -| | resource usage. | +| | resource usage (ignored by default). | +----------------------------------+-----------------------------------------------+ .. versionchanged:: 3.7 From webhook-mailer at python.org Tue Jan 4 07:04:03 2022 From: webhook-mailer at python.org (iritkatriel) Date: Tue, 04 Jan 2022 12:04:03 -0000 Subject: [Python-checkins] bpo-33252: Document that ResourceWarning is ignored by default (GH-30358) (GH-30396) Message-ID: https://github.com/python/cpython/commit/8f082e2bf43c1367e30d00874267dd25f7256cd0 commit: 8f082e2bf43c1367e30d00874267dd25f7256cd0 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-04T12:03:58Z summary: bpo-33252: Document that ResourceWarning is ignored by default (GH-30358) (GH-30396) `ResourceWarning` is ignored by default. Document this behaviour, for consistency with others in this table such as `DeprecationWarning`. Documentation PR can skip NEWS file. 
Automerge-Triggered-By: GH:iritkatriel (cherry picked from commit b949845b36b999185ed2bdf8a04dca1da39f3002) Co-authored-by: Hugo van Kemenade Co-authored-by: Hugo van Kemenade files: M Doc/library/warnings.rst diff --git a/Doc/library/warnings.rst b/Doc/library/warnings.rst index 9c1743cad23cb..4f05e1460bbab 100644 --- a/Doc/library/warnings.rst +++ b/Doc/library/warnings.rst @@ -105,7 +105,7 @@ The following warnings category classes are currently defined: | | :class:`bytes` and :class:`bytearray`. | +----------------------------------+-----------------------------------------------+ | :exc:`ResourceWarning` | Base category for warnings related to | -| | resource usage. | +| | resource usage (ignored by default). | +----------------------------------+-----------------------------------------------+ .. versionchanged:: 3.7 From webhook-mailer at python.org Tue Jan 4 08:36:46 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 04 Jan 2022 13:36:46 -0000 Subject: [Python-checkins] =?utf-8?q?bpo-20369=3A_concurrent=2Efutures=2E?= =?utf-8?q?wait=28=29_now_deduplicates_futures_given_a=E2=80=A6_=28GH-3016?= =?utf-8?q?8=29?= Message-ID: https://github.com/python/cpython/commit/7d7817cf0f826e566d8370a0e974bbfed6611d91 commit: 7d7817cf0f826e566d8370a0e974bbfed6611d91 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: pablogsal date: 2022-01-04T13:36:13Z summary: bpo-20369: concurrent.futures.wait() now deduplicates futures given a? (GH-30168) * bpo-20369: concurrent.futures.wait() now deduplicates futures given as arg. * ?? Added by blurb_it. Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst M Doc/library/concurrent.futures.rst M Lib/concurrent/futures/_base.py M Lib/test/test_concurrent_futures.py diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index c9f6aa1f2637c..0432fcdfa23e1 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -444,7 +444,8 @@ Module Functions .. function:: wait(fs, timeout=None, return_when=ALL_COMPLETED) Wait for the :class:`Future` instances (possibly created by different - :class:`Executor` instances) given by *fs* to complete. Returns a named + :class:`Executor` instances) given by *fs* to complete. Duplicate futures + given to *fs* are removed and will be returned only once. Returns a named 2-tuple of sets. The first set, named ``done``, contains the futures that completed (finished or cancelled futures) before the wait completed. The second set, named ``not_done``, contains the futures that did not complete diff --git a/Lib/concurrent/futures/_base.py b/Lib/concurrent/futures/_base.py index b0337399e5f25..c5912c24a1c20 100644 --- a/Lib/concurrent/futures/_base.py +++ b/Lib/concurrent/futures/_base.py @@ -282,13 +282,14 @@ def wait(fs, timeout=None, return_when=ALL_COMPLETED): A named 2-tuple of sets. The first set, named 'done', contains the futures that completed (is finished or cancelled) before the wait completed. The second set, named 'not_done', contains uncompleted - futures. + futures. Duplicate futures given to *fs* are removed and will be + returned only once. 
""" + fs = set(fs) with _AcquireFutures(fs): - done = set(f for f in fs - if f._state in [CANCELLED_AND_NOTIFIED, FINISHED]) - not_done = set(fs) - done - + done = {f for f in fs + if f._state in [CANCELLED_AND_NOTIFIED, FINISHED]} + not_done = fs - done if (return_when == FIRST_COMPLETED) and done: return DoneAndNotDoneFutures(done, not_done) elif (return_when == FIRST_EXCEPTION) and done: @@ -307,7 +308,7 @@ def wait(fs, timeout=None, return_when=ALL_COMPLETED): f._waiters.remove(waiter) done.update(waiter.finished_futures) - return DoneAndNotDoneFutures(done, set(fs) - done) + return DoneAndNotDoneFutures(done, fs - done) class Future(object): """Represents the result of an asynchronous computation.""" diff --git a/Lib/test/test_concurrent_futures.py b/Lib/test/test_concurrent_futures.py index bbb6aa1eef81f..71c88a3cadd25 100644 --- a/Lib/test/test_concurrent_futures.py +++ b/Lib/test/test_concurrent_futures.py @@ -578,6 +578,14 @@ def test_shutdown_no_wait(self): class WaitTests: + def test_20369(self): + # See https://bugs.python.org/issue20369 + future = self.executor.submit(time.sleep, 1.5) + done, not_done = futures.wait([future, future], + return_when=futures.ALL_COMPLETED) + self.assertEqual({future}, done) + self.assertEqual(set(), not_done) + def test_first_completed(self): future1 = self.executor.submit(mul, 21, 2) diff --git a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst new file mode 100644 index 0000000000000..cc5cd0067e61f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst @@ -0,0 +1 @@ +:func:`concurrent.futures.wait` no longer blocks forever when given duplicate Futures. Patch by Kumar Aditya. From webhook-mailer at python.org Tue Jan 4 09:27:41 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 04 Jan 2022 14:27:41 -0000 Subject: [Python-checkins] =?utf-8?q?bpo-20369=3A_concurrent=2Efutures=2E?= =?utf-8?q?wait=28=29_now_deduplicates_futures_given_a=E2=80=A6_=28GH-3016?= =?utf-8?q?8=29?= Message-ID: https://github.com/python/cpython/commit/9a9061d1ca7e28dc2b7e326153e933872c7cd452 commit: 9a9061d1ca7e28dc2b7e326153e933872c7cd452 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-04T06:27:26-08:00 summary: bpo-20369: concurrent.futures.wait() now deduplicates futures given a? (GH-30168) * bpo-20369: concurrent.futures.wait() now deduplicates futures given as arg. * ?? Added by blurb_it. Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> (cherry picked from commit 7d7817cf0f826e566d8370a0e974bbfed6611d91) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst M Doc/library/concurrent.futures.rst M Lib/concurrent/futures/_base.py M Lib/test/test_concurrent_futures.py diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index 897efc2f54442..f62b5e3546304 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -435,7 +435,8 @@ Module Functions .. function:: wait(fs, timeout=None, return_when=ALL_COMPLETED) Wait for the :class:`Future` instances (possibly created by different - :class:`Executor` instances) given by *fs* to complete. Returns a named + :class:`Executor` instances) given by *fs* to complete. 
Duplicate futures + given to *fs* are removed and will be returned only once. Returns a named 2-tuple of sets. The first set, named ``done``, contains the futures that completed (finished or cancelled futures) before the wait completed. The second set, named ``not_done``, contains the futures that did not complete diff --git a/Lib/concurrent/futures/_base.py b/Lib/concurrent/futures/_base.py index 6095026cb278b..5c00f2edbe548 100644 --- a/Lib/concurrent/futures/_base.py +++ b/Lib/concurrent/futures/_base.py @@ -284,13 +284,14 @@ def wait(fs, timeout=None, return_when=ALL_COMPLETED): A named 2-tuple of sets. The first set, named 'done', contains the futures that completed (is finished or cancelled) before the wait completed. The second set, named 'not_done', contains uncompleted - futures. + futures. Duplicate futures given to *fs* are removed and will be + returned only once. """ + fs = set(fs) with _AcquireFutures(fs): - done = set(f for f in fs - if f._state in [CANCELLED_AND_NOTIFIED, FINISHED]) - not_done = set(fs) - done - + done = {f for f in fs + if f._state in [CANCELLED_AND_NOTIFIED, FINISHED]} + not_done = fs - done if (return_when == FIRST_COMPLETED) and done: return DoneAndNotDoneFutures(done, not_done) elif (return_when == FIRST_EXCEPTION) and done: @@ -309,7 +310,7 @@ def wait(fs, timeout=None, return_when=ALL_COMPLETED): f._waiters.remove(waiter) done.update(waiter.finished_futures) - return DoneAndNotDoneFutures(done, set(fs) - done) + return DoneAndNotDoneFutures(done, fs - done) class Future(object): """Represents the result of an asynchronous computation.""" diff --git a/Lib/test/test_concurrent_futures.py b/Lib/test/test_concurrent_futures.py index 48d56d9fdcb1e..d693fb4ee199b 100644 --- a/Lib/test/test_concurrent_futures.py +++ b/Lib/test/test_concurrent_futures.py @@ -564,6 +564,14 @@ def test_shutdown_no_wait(self): class WaitTests: + def test_20369(self): + # See https://bugs.python.org/issue20369 + future = self.executor.submit(time.sleep, 1.5) + done, not_done = futures.wait([future, future], + return_when=futures.ALL_COMPLETED) + self.assertEqual({future}, done) + self.assertEqual(set(), not_done) + def test_first_completed(self): future1 = self.executor.submit(mul, 21, 2) diff --git a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst new file mode 100644 index 0000000000000..cc5cd0067e61f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst @@ -0,0 +1 @@ +:func:`concurrent.futures.wait` no longer blocks forever when given duplicate Futures. Patch by Kumar Aditya. From webhook-mailer at python.org Tue Jan 4 09:27:41 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 04 Jan 2022 14:27:41 -0000 Subject: [Python-checkins] =?utf-8?q?bpo-20369=3A_concurrent=2Efutures=2E?= =?utf-8?q?wait=28=29_now_deduplicates_futures_given_a=E2=80=A6_=28GH-3016?= =?utf-8?q?8=29?= Message-ID: https://github.com/python/cpython/commit/ba124672d7bf490bea2930a3e8371823db5d4cae commit: ba124672d7bf490bea2930a3e8371823db5d4cae branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-04T06:27:13-08:00 summary: bpo-20369: concurrent.futures.wait() now deduplicates futures given a? (GH-30168) * bpo-20369: concurrent.futures.wait() now deduplicates futures given as arg. * ?? Added by blurb_it. 
Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> (cherry picked from commit 7d7817cf0f826e566d8370a0e974bbfed6611d91) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst M Doc/library/concurrent.futures.rst M Lib/concurrent/futures/_base.py M Lib/test/test_concurrent_futures.py diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index 897efc2f54442..f62b5e3546304 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -435,7 +435,8 @@ Module Functions .. function:: wait(fs, timeout=None, return_when=ALL_COMPLETED) Wait for the :class:`Future` instances (possibly created by different - :class:`Executor` instances) given by *fs* to complete. Returns a named + :class:`Executor` instances) given by *fs* to complete. Duplicate futures + given to *fs* are removed and will be returned only once. Returns a named 2-tuple of sets. The first set, named ``done``, contains the futures that completed (finished or cancelled futures) before the wait completed. The second set, named ``not_done``, contains the futures that did not complete diff --git a/Lib/concurrent/futures/_base.py b/Lib/concurrent/futures/_base.py index 6095026cb278b..5c00f2edbe548 100644 --- a/Lib/concurrent/futures/_base.py +++ b/Lib/concurrent/futures/_base.py @@ -284,13 +284,14 @@ def wait(fs, timeout=None, return_when=ALL_COMPLETED): A named 2-tuple of sets. The first set, named 'done', contains the futures that completed (is finished or cancelled) before the wait completed. The second set, named 'not_done', contains uncompleted - futures. + futures. Duplicate futures given to *fs* are removed and will be + returned only once. 
""" + fs = set(fs) with _AcquireFutures(fs): - done = set(f for f in fs - if f._state in [CANCELLED_AND_NOTIFIED, FINISHED]) - not_done = set(fs) - done - + done = {f for f in fs + if f._state in [CANCELLED_AND_NOTIFIED, FINISHED]} + not_done = fs - done if (return_when == FIRST_COMPLETED) and done: return DoneAndNotDoneFutures(done, not_done) elif (return_when == FIRST_EXCEPTION) and done: @@ -309,7 +310,7 @@ def wait(fs, timeout=None, return_when=ALL_COMPLETED): f._waiters.remove(waiter) done.update(waiter.finished_futures) - return DoneAndNotDoneFutures(done, set(fs) - done) + return DoneAndNotDoneFutures(done, fs - done) class Future(object): """Represents the result of an asynchronous computation.""" diff --git a/Lib/test/test_concurrent_futures.py b/Lib/test/test_concurrent_futures.py index 84209ca2520b8..29e041deeca57 100644 --- a/Lib/test/test_concurrent_futures.py +++ b/Lib/test/test_concurrent_futures.py @@ -579,6 +579,14 @@ def test_shutdown_no_wait(self): class WaitTests: + def test_20369(self): + # See https://bugs.python.org/issue20369 + future = self.executor.submit(time.sleep, 1.5) + done, not_done = futures.wait([future, future], + return_when=futures.ALL_COMPLETED) + self.assertEqual({future}, done) + self.assertEqual(set(), not_done) + def test_first_completed(self): future1 = self.executor.submit(mul, 21, 2) diff --git a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst new file mode 100644 index 0000000000000..cc5cd0067e61f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst @@ -0,0 +1 @@ +:func:`concurrent.futures.wait` no longer blocks forever when given duplicate Futures. Patch by Kumar Aditya. From webhook-mailer at python.org Tue Jan 4 13:05:29 2022 From: webhook-mailer at python.org (markshannon) Date: Tue, 04 Jan 2022 18:05:29 -0000 Subject: [Python-checkins] bpo-45609: More specialization stats for STORE_SUBSCR (GH-30193) Message-ID: https://github.com/python/cpython/commit/7537f6008704b20e2d04a7ef1c0cfa34121cc5eb commit: 7537f6008704b20e2d04a7ef1c0cfa34121cc5eb branch: main author: Dennis Sweeney <36520290+sweeneyde at users.noreply.github.com> committer: markshannon date: 2022-01-04T18:05:09Z summary: bpo-45609: More specialization stats for STORE_SUBSCR (GH-30193) files: M Python/specialize.c M Tools/scripts/summarize_stats.py diff --git a/Python/specialize.c b/Python/specialize.c index 8991fa94f8e36..2da9e0f29b7a4 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -486,6 +486,13 @@ initial_counter_value(void) { #define SPEC_FAIL_BUFFER_SLICE 16 #define SPEC_FAIL_SEQUENCE_INT 17 +/* Store subscr */ +#define SPEC_FAIL_BYTEARRAY_INT 18 +#define SPEC_FAIL_BYTEARRAY_SLICE 19 +#define SPEC_FAIL_PY_SIMPLE 20 +#define SPEC_FAIL_PY_OTHER 21 +#define SPEC_FAIL_DICT_SUBCLASS_NO_OVERRIDE 22 + /* Binary add */ #define SPEC_FAIL_NON_FUNCTION_SCOPE 11 @@ -1253,15 +1260,73 @@ _Py_Specialize_StoreSubscr(PyObject *container, PyObject *sub, _Py_CODEUNIT *ins goto fail; } } - else if (container_type == &PyDict_Type) { + if (container_type == &PyDict_Type) { *instr = _Py_MAKECODEUNIT(STORE_SUBSCR_DICT, initial_counter_value()); goto success; } - else { - SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_OTHER); +#ifdef Py_STATS + PyMappingMethods *as_mapping = container_type->tp_as_mapping; + if (as_mapping && (as_mapping->mp_ass_subscript + == PyDict_Type.tp_as_mapping->mp_ass_subscript)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, 
SPEC_FAIL_DICT_SUBCLASS_NO_OVERRIDE); goto fail; } + if (PyObject_CheckBuffer(container)) { + if (PyLong_CheckExact(sub) && (((size_t)Py_SIZE(sub)) > 1)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_OUT_OF_RANGE); + } + else if (strcmp(container_type->tp_name, "array.array") == 0) { + if (PyLong_CheckExact(sub)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_ARRAY_INT); + } + else if (PySlice_Check(sub)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_ARRAY_SLICE); + } + else { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_OTHER); + } + } + else if (PyByteArray_CheckExact(container)) { + if (PyLong_CheckExact(sub)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_BYTEARRAY_INT); + } + else if (PySlice_Check(sub)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_BYTEARRAY_SLICE); + } + else { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_OTHER); + } + } + else { + if (PyLong_CheckExact(sub)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_BUFFER_INT); + } + else if (PySlice_Check(sub)) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_BUFFER_SLICE); + } + else { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_OTHER); + } + } + goto fail; + } + _Py_IDENTIFIER(__setitem__); + PyObject *descriptor = _PyType_LookupId(container_type, &PyId___setitem__); + if (descriptor && Py_TYPE(descriptor) == &PyFunction_Type) { + PyFunctionObject *func = (PyFunctionObject *)descriptor; + PyCodeObject *code = (PyCodeObject *)func->func_code; + int kind = function_kind(code); + if (kind == SIMPLE_FUNCTION) { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_PY_SIMPLE); + } + else { + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_PY_OTHER); + } + goto fail; + } +#endif + SPECIALIZATION_FAIL(STORE_SUBSCR, SPEC_FAIL_OTHER); fail: STAT_INC(STORE_SUBSCR, failure); assert(!PyErr_Occurred()); diff --git a/Tools/scripts/summarize_stats.py b/Tools/scripts/summarize_stats.py index a5a8e93c17392..3a77125035a32 100644 --- a/Tools/scripts/summarize_stats.py +++ b/Tools/scripts/summarize_stats.py @@ -28,7 +28,7 @@ def print_specialization_stats(name, family_stats): if "specialization.deferred" not in family_stats: return - total = sum(family_stats[kind] for kind in TOTAL) + total = sum(family_stats.get(kind, 0) for kind in TOTAL) if total == 0: return print(name+":") @@ -44,7 +44,7 @@ def print_specialization_stats(name, family_stats): for key in ("specialization.success", "specialization.failure"): label = key[len("specialization."):] print(f" {label}:{family_stats.get(key, 0):>12}") - total_failures = family_stats["specialization.failure"] + total_failures = family_stats.get("specialization.failure", 0) failure_kinds = [ 0 ] * 30 for key in family_stats: if not key.startswith("specialization.failure_kind"): From webhook-mailer at python.org Tue Jan 4 13:48:13 2022 From: webhook-mailer at python.org (Mariatta) Date: Tue, 04 Jan 2022 18:48:13 -0000 Subject: [Python-checkins] Fix missing ", " in the documentation of Executor Objects (GH-30404) Message-ID: https://github.com/python/cpython/commit/f404e26d749c85eef7b5be836375260855050ee3 commit: f404e26d749c85eef7b5be836375260855050ee3 branch: main author: Philipp Cla?en committer: Mariatta date: 2022-01-04T10:48:04-08:00 summary: Fix missing "," in the documentation of Executor Objects (GH-30404) files: M Doc/library/concurrent.futures.rst diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index 0432fcdfa23e1..959280833997e 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -30,7 +30,7 @@ Executor Objects .. 
method:: submit(fn, /, *args, **kwargs) - Schedules the callable, *fn*, to be executed as ``fn(*args **kwargs)`` + Schedules the callable, *fn*, to be executed as ``fn(*args, **kwargs)`` and returns a :class:`Future` object representing the execution of the callable. :: From webhook-mailer at python.org Tue Jan 4 14:16:06 2022 From: webhook-mailer at python.org (Mariatta) Date: Tue, 04 Jan 2022 19:16:06 -0000 Subject: [Python-checkins] Fix missing ", " in the documentation of Executor Objects (GH-30404) Message-ID: https://github.com/python/cpython/commit/f902d88be3aa42e03119b15469493e5cc816b784 commit: f902d88be3aa42e03119b15469493e5cc816b784 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Mariatta date: 2022-01-04T11:15:56-08:00 summary: Fix missing "," in the documentation of Executor Objects (GH-30404) (cherry picked from commit f404e26d749c85eef7b5be836375260855050ee3) Co-authored-by: Philipp Cla?en Co-authored-by: Philipp Cla?en files: M Doc/library/concurrent.futures.rst diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index f62b5e3546304..70a17a23119c1 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -30,7 +30,7 @@ Executor Objects .. method:: submit(fn, /, *args, **kwargs) - Schedules the callable, *fn*, to be executed as ``fn(*args **kwargs)`` + Schedules the callable, *fn*, to be executed as ``fn(*args, **kwargs)`` and returns a :class:`Future` object representing the execution of the callable. :: From webhook-mailer at python.org Tue Jan 4 14:17:21 2022 From: webhook-mailer at python.org (Mariatta) Date: Tue, 04 Jan 2022 19:17:21 -0000 Subject: [Python-checkins] Fix missing ", " in the documentation of Executor Objects (GH-30404) Message-ID: https://github.com/python/cpython/commit/289a32baf74cad88c374a5e6b7cbf87b3f5af58c commit: 289a32baf74cad88c374a5e6b7cbf87b3f5af58c branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: Mariatta date: 2022-01-04T11:17:16-08:00 summary: Fix missing "," in the documentation of Executor Objects (GH-30404) (cherry picked from commit f404e26d749c85eef7b5be836375260855050ee3) Co-authored-by: Philipp Cla?en Co-authored-by: Philipp Cla?en files: M Doc/library/concurrent.futures.rst diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index f62b5e3546304..70a17a23119c1 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -30,7 +30,7 @@ Executor Objects .. method:: submit(fn, /, *args, **kwargs) - Schedules the callable, *fn*, to be executed as ``fn(*args **kwargs)`` + Schedules the callable, *fn*, to be executed as ``fn(*args, **kwargs)`` and returns a :class:`Future` object representing the execution of the callable. 
:: From webhook-mailer at python.org Tue Jan 4 14:38:37 2022 From: webhook-mailer at python.org (brandtbucher) Date: Tue, 04 Jan 2022 19:38:37 -0000 Subject: [Python-checkins] bpo-46009: Remove GEN_START (GH-30367) Message-ID: https://github.com/python/cpython/commit/31e43cbe5f01cdd5b5ab330ec3040920e8b61a91 commit: 31e43cbe5f01cdd5b5ab330ec3040920e8b61a91 branch: main author: Brandt Bucher committer: brandtbucher date: 2022-01-04T11:38:32-08:00 summary: bpo-46009: Remove GEN_START (GH-30367) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst M Doc/library/dis.rst M Doc/whatsnew/3.11.rst M Include/opcode.h M Lib/importlib/_bootstrap_external.py M Lib/opcode.py M Objects/frameobject.c M Python/ceval.c M Python/compile.c M Python/opcode_targets.h diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index 87ec584789d31..8490a09669656 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -1175,14 +1175,6 @@ All of the following opcodes use their arguments. Previously, this instruction also pushed a boolean value indicating success (``True``) or failure (``False``). -.. opcode:: GEN_START (kind) - - Pops TOS. The ``kind`` operand corresponds to the type of generator or - coroutine. The legal kinds are 0 for generator, 1 for coroutine, - and 2 for async generator. - - .. versionadded:: 3.10 - .. opcode:: ROT_N (count) diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index be6cb158a8049..6794e828a7252 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -388,7 +388,7 @@ CPython bytecode changes This decouples the argument shifting for methods from the handling of keyword arguments and allows better specialization of calls. -* Removed ``COPY_DICT_WITHOUT_KEYS``. +* Removed ``COPY_DICT_WITHOUT_KEYS`` and ``GEN_START``. * :opcode:`MATCH_CLASS` and :opcode:`MATCH_KEYS` no longer push an additional boolean value indicating whether the match succeeded or failed. 
Instead, they diff --git a/Include/opcode.h b/Include/opcode.h index ef334de601fee..1af54943e1ca5 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -84,7 +84,6 @@ extern "C" { #define STORE_FAST 125 #define DELETE_FAST 126 #define JUMP_IF_NOT_EG_MATCH 127 -#define GEN_START 129 #define RAISE_VARARGS 130 #define MAKE_FUNCTION 132 #define BUILD_SLICE 133 @@ -164,9 +163,9 @@ extern "C" { #define STORE_ATTR_WITH_HINT 81 #define LOAD_FAST__LOAD_FAST 87 #define STORE_FAST__LOAD_FAST 128 -#define LOAD_FAST__LOAD_CONST 131 -#define LOAD_CONST__LOAD_FAST 134 -#define STORE_FAST__STORE_FAST 140 +#define LOAD_FAST__LOAD_CONST 129 +#define LOAD_CONST__LOAD_FAST 131 +#define STORE_FAST__STORE_FAST 134 #define DO_TRACING 255 #ifdef NEED_OPCODE_JUMP_TABLES static uint32_t _PyOpcode_RelativeJump[8] = { diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 29324664cea86..872d6d96a74b4 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -377,6 +377,7 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3469 (bpo-45711: remove type, traceback from exc_info) # Python 3.11a4 3470 (bpo-46221: PREP_RERAISE_STAR no longer pushes lasti) # Python 3.11a4 3471 (bpo-46202: remove pop POP_EXCEPT_AND_RERAISE) +# Python 3.11a4 3472 (bpo-46009: replace GEN_START with POP_TOP) # # MAGIC must change whenever the bytecode emitted by the compiler may no @@ -386,7 +387,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. -MAGIC_NUMBER = (3471).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3472).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/opcode.py b/Lib/opcode.py index 9bbff182f08fd..f99e20ad0a670 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -151,7 +151,6 @@ def jabs_op(name, op): jabs_op('JUMP_IF_NOT_EG_MATCH', 127) -def_op('GEN_START', 129) # Kind of generator/coroutine def_op('RAISE_VARARGS', 130) # Number of raise arguments (1, 2, or 3) def_op('MAKE_FUNCTION', 132) # Flags diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst new file mode 100644 index 0000000000000..1ffcc766725e6 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst @@ -0,0 +1 @@ +Remove the ``GEN_START`` opcode. 
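(A quick, hypothetical way to observe this change from Python code, not part of the patch itself: inspect a generator's bytecode with the dis module and confirm that the prologue no longer emits GEN_START.)

    import dis

    def gen():
        yield 1

    # List the opcode names emitted for the generator's code object.
    ops = [ins.opname for ins in dis.get_instructions(gen)]
    print(ops[:3])               # prologue opcodes for the running build
    print("GEN_START" in ops)    # expected to be False once GEN_START is removed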
diff --git a/Objects/frameobject.c b/Objects/frameobject.c index 078fcfc6cf607..4dd2183040dac 100644 --- a/Objects/frameobject.c +++ b/Objects/frameobject.c @@ -192,6 +192,11 @@ mark_stacks(PyCodeObject *code_obj, int len) stacks[i] = UNINITIALIZED; } stacks[0] = 0; + if (code_obj->co_flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) + { + // Generators get sent None while starting: + stacks[0] = push_value(stacks[0], Object); + } int todo = 1; while (todo) { todo = 0; @@ -291,9 +296,6 @@ mark_stacks(PyCodeObject *code_obj, int len) case RERAISE: /* End of block */ break; - case GEN_START: - stacks[i+1] = next_stack; - break; default: { int delta = PyCompile_OpcodeStackEffect(opcode, _Py_OPARG(code[i])); diff --git a/Python/ceval.c b/Python/ceval.c index 81bea44465dc7..953876f6226b9 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -2709,14 +2709,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr return retval; } - TARGET(GEN_START) { - PyObject *none = POP(); - assert(none == Py_None); - assert(oparg < 3); - Py_DECREF(none); - DISPATCH(); - } - TARGET(POP_EXCEPT) { _PyErr_StackItem *exc_info = tstate->exc_info; PyObject *value = exc_info->exc_value; diff --git a/Python/compile.c b/Python/compile.c index 9d3752936266c..3a390751fe2d2 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -1208,8 +1208,6 @@ stack_effect(int opcode, int oparg, int jump) return 1; case LIST_TO_TUPLE: return 0; - case GEN_START: - return -1; case LIST_EXTEND: case SET_UPDATE: case DICT_MERGE: @@ -8028,27 +8026,16 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, /* Add the generator prefix instructions. */ if (flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) { - int kind; - if (flags & CO_COROUTINE) { - kind = 1; - } - else if (flags & CO_ASYNC_GENERATOR) { - kind = 2; - } - else { - kind = 0; - } - - struct instr gen_start = { - .i_opcode = GEN_START, - .i_oparg = kind, + struct instr pop_top = { + .i_opcode = POP_TOP, + .i_oparg = 0, .i_lineno = -1, .i_col_offset = -1, .i_end_lineno = -1, .i_end_col_offset = -1, .i_target = NULL, }; - if (insert_instruction(entryblock, 0, &gen_start) < 0) { + if (insert_instruction(entryblock, 0, &pop_top) < 0) { return -1; } } diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index a8f1398bfa66d..e9f1a483b970c 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -128,18 +128,18 @@ static void *opcode_targets[256] = { &&TARGET_DELETE_FAST, &&TARGET_JUMP_IF_NOT_EG_MATCH, &&TARGET_STORE_FAST__LOAD_FAST, - &&TARGET_GEN_START, - &&TARGET_RAISE_VARARGS, &&TARGET_LOAD_FAST__LOAD_CONST, + &&TARGET_RAISE_VARARGS, + &&TARGET_LOAD_CONST__LOAD_FAST, &&TARGET_MAKE_FUNCTION, &&TARGET_BUILD_SLICE, - &&TARGET_LOAD_CONST__LOAD_FAST, + &&TARGET_STORE_FAST__STORE_FAST, &&TARGET_MAKE_CELL, &&TARGET_LOAD_CLOSURE, &&TARGET_LOAD_DEREF, &&TARGET_STORE_DEREF, &&TARGET_DELETE_DEREF, - &&TARGET_STORE_FAST__STORE_FAST, + &&_unknown_opcode, &&_unknown_opcode, &&TARGET_CALL_FUNCTION_EX, &&_unknown_opcode, From webhook-mailer at python.org Tue Jan 4 19:11:10 2022 From: webhook-mailer at python.org (ethanfurman) Date: Wed, 05 Jan 2022 00:11:10 -0000 Subject: [Python-checkins] bpo-46262: [Enum] test error path in `Flag._missing_` (GH-30408) Message-ID: https://github.com/python/cpython/commit/91bc6f9615eabb10090e2e4f0fe5913885a29c8c commit: 91bc6f9615eabb10090e2e4f0fe5913885a29c8c branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-04T16:11:06-08:00 summary: bpo-46262: [Enum] 
test error path in `Flag._missing_` (GH-30408) add tests that exercise the `_missing_` error path for `Flag` and `IntFlag` Co-authored-by: Alex Waygood Co-authored-by: Ethan Furman files: A Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst M Lib/test/test_enum.py diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index eecb9fd4835c4..51a31e5ebf807 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -3414,6 +3414,19 @@ class NeverEnum(WhereEnum): self.assertFalse(NeverEnum.__dict__.get('_test1', False)) self.assertFalse(NeverEnum.__dict__.get('_test2', False)) + def test_default_missing(self): + with self.assertRaisesRegex( + ValueError, + "'RED' is not a valid TestFlag.Color", + ) as ctx: + self.Color('RED') + self.assertIs(ctx.exception.__context__, None) + + P = Flag('P', 'X Y') + with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: + P('X') + self.assertIs(ctx.exception.__context__, None) + class TestIntFlag(unittest.TestCase): """Tests of the IntFlags.""" @@ -3975,6 +3988,19 @@ def cycle_enum(): 'at least one thread failed while creating composite members') self.assertEqual(256, len(seen), 'too many composite members created') + def test_default_missing(self): + with self.assertRaisesRegex( + ValueError, + "'RED' is not a valid TestIntFlag.Color", + ) as ctx: + self.Color('RED') + self.assertIs(ctx.exception.__context__, None) + + P = IntFlag('P', 'X Y') + with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: + P('X') + self.assertIs(ctx.exception.__context__, None) + class TestEmptyAndNonLatinStrings(unittest.TestCase): diff --git a/Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst b/Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst new file mode 100644 index 0000000000000..456d1359e4732 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst @@ -0,0 +1 @@ +Cover ``ValueError`` path in tests for :meth:`enum.Flag._missing_`. From webhook-mailer at python.org Wed Jan 5 04:54:26 2022 From: webhook-mailer at python.org (tiran) Date: Wed, 05 Jan 2022 09:54:26 -0000 Subject: [Python-checkins] bpo-46263: Don't use MULTIARCH on FreeBSD (#30410) Message-ID: https://github.com/python/cpython/commit/cae55542d23e606dde9819d5dadd7430085fcc77 commit: cae55542d23e606dde9819d5dadd7430085fcc77 branch: main author: Christian Heimes committer: tiran date: 2022-01-05T10:54:17+01:00 summary: bpo-46263: Don't use MULTIARCH on FreeBSD (#30410) files: A Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst M configure M configure.ac diff --git a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst new file mode 100644 index 0000000000000..3a575ed7f556b --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst @@ -0,0 +1 @@ +``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. diff --git a/configure b/configure index eca63518dbca3..b99b9f5e8d1fc 100755 --- a/configure +++ b/configure @@ -6091,10 +6091,20 @@ $as_echo "none" >&6; } fi rm -f conftest.c conftest.out -if test x$PLATFORM_TRIPLET != xdarwin; then - MULTIARCH=$($CC --print-multiarch 2>/dev/null) -fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for multiarch" >&5 +$as_echo_n "checking for multiarch... 
" >&6; } +case $ac_sys_system in #( + Darwin*) : + MULTIARCH="" ;; #( + FreeBSD*) : + MULTIARCH="" ;; #( + *) : + MULTIARCH=$($CC --print-multiarch 2>/dev/null) + ;; +esac +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $MULTIARCH" >&5 +$as_echo "$MULTIARCH" >&6; } if test x$PLATFORM_TRIPLET != x && test x$MULTIARCH != x; then if test x$PLATFORM_TRIPLET != x$MULTIARCH; then @@ -6104,6 +6114,7 @@ elif test x$PLATFORM_TRIPLET != x && test x$MULTIARCH = x; then MULTIARCH=$PLATFORM_TRIPLET fi + if test x$MULTIARCH != x; then MULTIARCH_CPPFLAGS="-DMULTIARCH=\\\"$MULTIARCH\\\"" fi diff --git a/configure.ac b/configure.ac index 050b907ac8624..d9bdf9f8c66e9 100644 --- a/configure.ac +++ b/configure.ac @@ -987,10 +987,14 @@ else fi rm -f conftest.c conftest.out -if test x$PLATFORM_TRIPLET != xdarwin; then - MULTIARCH=$($CC --print-multiarch 2>/dev/null) -fi -AC_SUBST(MULTIARCH) +AC_MSG_CHECKING([for multiarch]) +AS_CASE([$ac_sys_system], + [Darwin*], [MULTIARCH=""], + [FreeBSD*], [MULTIARCH=""], + [MULTIARCH=$($CC --print-multiarch 2>/dev/null)] +) +AC_SUBST([MULTIARCH]) +AC_MSG_RESULT([$MULTIARCH]) if test x$PLATFORM_TRIPLET != x && test x$MULTIARCH != x; then if test x$PLATFORM_TRIPLET != x$MULTIARCH; then @@ -1000,6 +1004,7 @@ elif test x$PLATFORM_TRIPLET != x && test x$MULTIARCH = x; then MULTIARCH=$PLATFORM_TRIPLET fi AC_SUBST(PLATFORM_TRIPLET) + if test x$MULTIARCH != x; then MULTIARCH_CPPFLAGS="-DMULTIARCH=\\\"$MULTIARCH\\\"" fi From webhook-mailer at python.org Wed Jan 5 05:17:46 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 05 Jan 2022 10:17:46 -0000 Subject: [Python-checkins] bpo-46263: Don't use MULTIARCH on FreeBSD (GH-30410) Message-ID: https://github.com/python/cpython/commit/7e951f356ec76a5a5fdb851d71df5d120014bf3f commit: 7e951f356ec76a5a5fdb851d71df5d120014bf3f branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-05T02:17:39-08:00 summary: bpo-46263: Don't use MULTIARCH on FreeBSD (GH-30410) (cherry picked from commit cae55542d23e606dde9819d5dadd7430085fcc77) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst M configure M configure.ac diff --git a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst new file mode 100644 index 0000000000000..3a575ed7f556b --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst @@ -0,0 +1 @@ +``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. diff --git a/configure b/configure index 2e7e0b7e0a75e..0e97c5228df10 100755 --- a/configure +++ b/configure @@ -5384,10 +5384,20 @@ $as_echo "none" >&6; } fi rm -f conftest.c conftest.out -if test x$PLATFORM_TRIPLET != xdarwin; then - MULTIARCH=$($CC --print-multiarch 2>/dev/null) -fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for multiarch" >&5 +$as_echo_n "checking for multiarch... 
" >&6; } +case $ac_sys_system in #( + Darwin*) : + MULTIARCH="" ;; #( + FreeBSD*) : + MULTIARCH="" ;; #( + *) : + MULTIARCH=$($CC --print-multiarch 2>/dev/null) + ;; +esac +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $MULTIARCH" >&5 +$as_echo "$MULTIARCH" >&6; } if test x$PLATFORM_TRIPLET != x && test x$MULTIARCH != x; then if test x$PLATFORM_TRIPLET != x$MULTIARCH; then @@ -5397,6 +5407,7 @@ elif test x$PLATFORM_TRIPLET != x && test x$MULTIARCH = x; then MULTIARCH=$PLATFORM_TRIPLET fi + if test x$MULTIARCH != x; then MULTIARCH_CPPFLAGS="-DMULTIARCH=\\\"$MULTIARCH\\\"" fi diff --git a/configure.ac b/configure.ac index 0c06914147854..9151059f8946f 100644 --- a/configure.ac +++ b/configure.ac @@ -872,10 +872,14 @@ else fi rm -f conftest.c conftest.out -if test x$PLATFORM_TRIPLET != xdarwin; then - MULTIARCH=$($CC --print-multiarch 2>/dev/null) -fi -AC_SUBST(MULTIARCH) +AC_MSG_CHECKING([for multiarch]) +AS_CASE([$ac_sys_system], + [Darwin*], [MULTIARCH=""], + [FreeBSD*], [MULTIARCH=""], + [MULTIARCH=$($CC --print-multiarch 2>/dev/null)] +) +AC_SUBST([MULTIARCH]) +AC_MSG_RESULT([$MULTIARCH]) if test x$PLATFORM_TRIPLET != x && test x$MULTIARCH != x; then if test x$PLATFORM_TRIPLET != x$MULTIARCH; then @@ -885,6 +889,7 @@ elif test x$PLATFORM_TRIPLET != x && test x$MULTIARCH = x; then MULTIARCH=$PLATFORM_TRIPLET fi AC_SUBST(PLATFORM_TRIPLET) + if test x$MULTIARCH != x; then MULTIARCH_CPPFLAGS="-DMULTIARCH=\\\"$MULTIARCH\\\"" fi From webhook-mailer at python.org Wed Jan 5 05:20:13 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 05 Jan 2022 10:20:13 -0000 Subject: [Python-checkins] bpo-46263: Don't use MULTIARCH on FreeBSD (GH-30410) Message-ID: https://github.com/python/cpython/commit/64199e9235275a795097ee0c53b2c560e21c70d0 commit: 64199e9235275a795097ee0c53b2c560e21c70d0 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-05T02:20:09-08:00 summary: bpo-46263: Don't use MULTIARCH on FreeBSD (GH-30410) (cherry picked from commit cae55542d23e606dde9819d5dadd7430085fcc77) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst M configure M configure.ac diff --git a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst new file mode 100644 index 0000000000000..3a575ed7f556b --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst @@ -0,0 +1 @@ +``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. diff --git a/configure b/configure index 1b1caf84de7d5..d078887b2f485 100755 --- a/configure +++ b/configure @@ -5366,10 +5366,20 @@ $as_echo "none" >&6; } fi rm -f conftest.c conftest.out -if test x$PLATFORM_TRIPLET != xdarwin; then - MULTIARCH=$($CC --print-multiarch 2>/dev/null) -fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for multiarch" >&5 +$as_echo_n "checking for multiarch... 
" >&6; } +case $ac_sys_system in #( + Darwin*) : + MULTIARCH="" ;; #( + FreeBSD*) : + MULTIARCH="" ;; #( + *) : + MULTIARCH=$($CC --print-multiarch 2>/dev/null) + ;; +esac +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $MULTIARCH" >&5 +$as_echo "$MULTIARCH" >&6; } if test x$PLATFORM_TRIPLET != x && test x$MULTIARCH != x; then if test x$PLATFORM_TRIPLET != x$MULTIARCH; then @@ -5379,6 +5389,7 @@ elif test x$PLATFORM_TRIPLET != x && test x$MULTIARCH = x; then MULTIARCH=$PLATFORM_TRIPLET fi + if test x$MULTIARCH != x; then MULTIARCH_CPPFLAGS="-DMULTIARCH=\\\"$MULTIARCH\\\"" fi diff --git a/configure.ac b/configure.ac index 823252be69425..431d66197bc7f 100644 --- a/configure.ac +++ b/configure.ac @@ -882,10 +882,14 @@ else fi rm -f conftest.c conftest.out -if test x$PLATFORM_TRIPLET != xdarwin; then - MULTIARCH=$($CC --print-multiarch 2>/dev/null) -fi -AC_SUBST(MULTIARCH) +AC_MSG_CHECKING([for multiarch]) +AS_CASE([$ac_sys_system], + [Darwin*], [MULTIARCH=""], + [FreeBSD*], [MULTIARCH=""], + [MULTIARCH=$($CC --print-multiarch 2>/dev/null)] +) +AC_SUBST([MULTIARCH]) +AC_MSG_RESULT([$MULTIARCH]) if test x$PLATFORM_TRIPLET != x && test x$MULTIARCH != x; then if test x$PLATFORM_TRIPLET != x$MULTIARCH; then @@ -895,6 +899,7 @@ elif test x$PLATFORM_TRIPLET != x && test x$MULTIARCH = x; then MULTIARCH=$PLATFORM_TRIPLET fi AC_SUBST(PLATFORM_TRIPLET) + if test x$MULTIARCH != x; then MULTIARCH_CPPFLAGS="-DMULTIARCH=\\\"$MULTIARCH\\\"" fi From webhook-mailer at python.org Wed Jan 5 06:30:34 2022 From: webhook-mailer at python.org (markshannon) Date: Wed, 05 Jan 2022 11:30:34 -0000 Subject: [Python-checkins] bpo-45256: Don't track the exact depth of each `InterpreterFrame` (GH-30372) Message-ID: https://github.com/python/cpython/commit/332e6b972567debfa9d8f3f9a4a966c7ad15eec9 commit: 332e6b972567debfa9d8f3f9a4a966c7ad15eec9 branch: main author: Brandt Bucher committer: markshannon date: 2022-01-05T11:30:26Z summary: bpo-45256: Don't track the exact depth of each `InterpreterFrame` (GH-30372) files: M Include/internal/pycore_frame.h M Lib/test/test_sys.py M Python/ceval.c M Tools/gdb/libpython.py diff --git a/Include/internal/pycore_frame.h b/Include/internal/pycore_frame.h index 8eca39d1ab250..42df51f635615 100644 --- a/Include/internal/pycore_frame.h +++ b/Include/internal/pycore_frame.h @@ -4,6 +4,8 @@ extern "C" { #endif +#include + /* runtime lifecycle */ @@ -44,7 +46,7 @@ typedef struct _interpreter_frame { int f_lasti; /* Last instruction if called */ int stacktop; /* Offset of TOS from localsplus */ PyFrameState f_state; /* What state the frame is in */ - int depth; /* Depth of the frame in a ceval loop */ + bool is_entry; // Whether this is the "root" frame for the current CFrame. 
PyObject *localsplus[1]; } InterpreterFrame; @@ -101,7 +103,7 @@ _PyFrame_InitializeSpecials( frame->generator = NULL; frame->f_lasti = -1; frame->f_state = FRAME_CREATED; - frame->depth = 0; + frame->is_entry = false; } /* Gets the pointer to the locals array diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py index 96075cf3b3473..38771d427da7b 100644 --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -1323,7 +1323,7 @@ class C(object): pass def func(): return sys._getframe() x = func() - check(x, size('3Pi3c8P2iciP')) + check(x, size('3Pi3c8P2ic?P')) # function def func(): pass check(func, size('14Pi')) @@ -1340,7 +1340,7 @@ def bar(cls): check(bar, size('PP')) # generator def get_gen(): yield 1 - check(get_gen(), size('P2P4P4c8P2iciP')) + check(get_gen(), size('P2P4P4c8P2ic?P')) # iterator check(iter('abc'), size('lP')) # callable-iterator diff --git a/Python/ceval.c b/Python/ceval.c index 953876f6226b9..b4ac9ec848f77 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1683,7 +1683,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr cframe.previous = prev_cframe; tstate->cframe = &cframe; - assert(frame->depth == 0); + frame->is_entry = true; /* Push frame */ frame->previous = prev_cframe->current_frame; cframe.current_frame = frame; @@ -2310,7 +2310,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr _PyFrame_SetStackPointer(frame, stack_pointer); new_frame->previous = frame; frame = cframe.current_frame = new_frame; - new_frame->depth = frame->depth + 1; goto start_frame; } @@ -2475,7 +2474,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr TRACE_FUNCTION_EXIT(); DTRACE_FUNCTION_EXIT(); _Py_LeaveRecursiveCall(tstate); - if (frame->depth) { + if (!frame->is_entry) { frame = cframe.current_frame = pop_frame(tstate, frame); _PyFrame_StackPush(frame, retval); goto resume_frame; @@ -2625,7 +2624,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } TARGET(SEND) { - assert(frame->depth == 0); + assert(frame->is_entry); assert(STACK_LEVEL() >= 2); PyObject *v = POP(); PyObject *receiver = TOP(); @@ -2684,7 +2683,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } TARGET(YIELD_VALUE) { - assert(frame->depth == 0); + assert(frame->is_entry); PyObject *retval = POP(); if (frame->f_code->co_flags & CO_ASYNC_GENERATOR) { @@ -4612,7 +4611,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr _PyFrame_SetStackPointer(frame, stack_pointer); new_frame->previous = frame; cframe.current_frame = frame = new_frame; - new_frame->depth = frame->depth + 1; goto start_frame; } } @@ -4706,7 +4704,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr _PyFrame_SetStackPointer(frame, stack_pointer); new_frame->previous = frame; frame = cframe.current_frame = new_frame; - new_frame->depth = frame->depth + 1; goto start_frame; } @@ -5382,7 +5379,7 @@ MISS_WITH_OPARG_COUNTER(STORE_SUBSCR) exit_unwind: assert(_PyErr_Occurred(tstate)); _Py_LeaveRecursiveCall(tstate); - if (frame->depth == 0) { + if (frame->is_entry) { /* Restore previous cframe and exit */ tstate->cframe = cframe.previous; tstate->cframe->use_tracing = cframe.use_tracing; diff --git a/Tools/gdb/libpython.py b/Tools/gdb/libpython.py index a0a95e3fc63cb..e3d73bce6cfe5 100755 --- a/Tools/gdb/libpython.py +++ b/Tools/gdb/libpython.py @@ -1044,8 +1044,8 @@ def _f_nlocalsplus(self): def _f_lasti(self): return 
self._f_special("f_lasti", int_from_int) - def depth(self): - return self._f_special("depth", int_from_int) + def is_entry(self): + return self._f_special("is_entry", bool) def previous(self): return self._f_special("previous", PyFramePtr) @@ -1860,7 +1860,7 @@ def print_summary(self): line = interp_frame.current_line() if line is not None: sys.stdout.write(' %s\n' % line.strip()) - if interp_frame.depth() == 0: + if interp_frame.is_entry(): break else: sys.stdout.write('#%i (unable to read python frame information)\n' % self.get_index()) @@ -1883,7 +1883,7 @@ def print_traceback(self): line = interp_frame.current_line() if line is not None: sys.stdout.write(' %s\n' % line.strip()) - if interp_frame.depth() == 0: + if interp_frame.is_entry(): break else: sys.stdout.write(' (unable to read python frame information)\n') @@ -2147,7 +2147,7 @@ def invoke(self, args, from_tty): % (pyop_name.proxyval(set()), pyop_value.get_truncated_repr(MAX_OUTPUT_LEN))) - if pyop_frame.depth() == 0: + if pyop_frame.is_entry(): break pyop_frame = pyop_frame.previous() From webhook-mailer at python.org Wed Jan 5 06:53:31 2022 From: webhook-mailer at python.org (pablogsal) Date: Wed, 05 Jan 2022 11:53:31 -0000 Subject: [Python-checkins] bpo-43137: Revert "webbrowser: Don't run gvfs-open on GNOME" (GH-30417) Message-ID: https://github.com/python/cpython/commit/dd50316e458d7c3284f8948b0606d8aa91ab855d commit: dd50316e458d7c3284f8948b0606d8aa91ab855d branch: main author: Simon McVittie committer: pablogsal date: 2022-01-05T11:53:23Z summary: bpo-43137: Revert "webbrowser: Don't run gvfs-open on GNOME" (GH-30417) gvfs-open was deprecated in 2015 and removed in 2018, but its replacement, gio(1), is not available in Ubuntu 16.04, which is apparently still supported by CPython upstream even though it is considered to be EOL by Ubuntu developers. Signed-off-by: Simon McVittie files: M Lib/webbrowser.py diff --git a/Lib/webbrowser.py b/Lib/webbrowser.py index 02d2036906178..44974d433b469 100755 --- a/Lib/webbrowser.py +++ b/Lib/webbrowser.py @@ -467,6 +467,10 @@ def register_X_browsers(): if shutil.which("gio"): register("gio", None, BackgroundBrowser(["gio", "open", "--", "%s"])) + # Equivalent of gio open before 2015 + if "GNOME_DESKTOP_SESSION_ID" in os.environ and shutil.which("gvfs-open"): + register("gvfs-open", None, BackgroundBrowser("gvfs-open")) + # The default KDE browser if "KDE_FULL_SESSION" in os.environ and shutil.which("kfmclient"): register("kfmclient", Konqueror, Konqueror("kfmclient")) From webhook-mailer at python.org Wed Jan 5 07:25:59 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 05 Jan 2022 12:25:59 -0000 Subject: [Python-checkins] bpo-46236: Fix PyFunction_GetAnnotations() returned tuple. (GH-30409) Message-ID: https://github.com/python/cpython/commit/46e4c257e7c26c813620232135781e6c53fe8d4d commit: 46e4c257e7c26c813620232135781e6c53fe8d4d branch: main author: Inada Naoki committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-05T04:25:54-08:00 summary: bpo-46236: Fix PyFunction_GetAnnotations() returned tuple. 
(GH-30409) Automerge-Triggered-By: GH:pablogsal files: A Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst M Objects/funcobject.c diff --git a/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst b/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst new file mode 100644 index 0000000000000..61906584a16a3 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst @@ -0,0 +1 @@ +Fix a bug in :c:func:`PyFunction_GetAnnotations` that caused it to return a ``tuple`` instead of a ``dict``. diff --git a/Objects/funcobject.c b/Objects/funcobject.c index 7891e4f3122b1..a8f006da8ad33 100644 --- a/Objects/funcobject.c +++ b/Objects/funcobject.c @@ -274,6 +274,37 @@ PyFunction_SetClosure(PyObject *op, PyObject *closure) return 0; } +static PyObject * +func_get_annotation_dict(PyFunctionObject *op) +{ + if (op->func_annotations == NULL) { + return NULL; + } + if (PyTuple_CheckExact(op->func_annotations)) { + PyObject *ann_tuple = op->func_annotations; + PyObject *ann_dict = PyDict_New(); + if (ann_dict == NULL) { + return NULL; + } + + assert(PyTuple_GET_SIZE(ann_tuple) % 2 == 0); + + for (Py_ssize_t i = 0; i < PyTuple_GET_SIZE(ann_tuple); i += 2) { + int err = PyDict_SetItem(ann_dict, + PyTuple_GET_ITEM(ann_tuple, i), + PyTuple_GET_ITEM(ann_tuple, i + 1)); + + if (err < 0) { + return NULL; + } + } + Py_SETREF(op->func_annotations, ann_dict); + } + Py_INCREF(op->func_annotations); + assert(PyDict_Check(op->func_annotations)); + return op->func_annotations; +} + PyObject * PyFunction_GetAnnotations(PyObject *op) { @@ -281,7 +312,7 @@ PyFunction_GetAnnotations(PyObject *op) PyErr_BadInternalCall(); return NULL; } - return ((PyFunctionObject *) op) -> func_annotations; + return func_get_annotation_dict((PyFunctionObject *)op); } int @@ -501,27 +532,7 @@ func_get_annotations(PyFunctionObject *op, void *Py_UNUSED(ignored)) if (op->func_annotations == NULL) return NULL; } - if (PyTuple_CheckExact(op->func_annotations)) { - PyObject *ann_tuple = op->func_annotations; - PyObject *ann_dict = PyDict_New(); - if (ann_dict == NULL) { - return NULL; - } - - assert(PyTuple_GET_SIZE(ann_tuple) % 2 == 0); - - for (Py_ssize_t i = 0; i < PyTuple_GET_SIZE(ann_tuple); i += 2) { - int err = PyDict_SetItem(ann_dict, - PyTuple_GET_ITEM(ann_tuple, i), - PyTuple_GET_ITEM(ann_tuple, i + 1)); - - if (err < 0) - return NULL; - } - Py_SETREF(op->func_annotations, ann_dict); - } - Py_INCREF(op->func_annotations); - return op->func_annotations; + return func_get_annotation_dict(op); } static int From webhook-mailer at python.org Wed Jan 5 08:12:27 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 05 Jan 2022 13:12:27 -0000 Subject: [Python-checkins] bpo-46236: Fix PyFunction_GetAnnotations() returned tuple. (GH-30409) Message-ID: https://github.com/python/cpython/commit/da8be157f4e275c4c32b9199f1466ed7e52f62cf commit: da8be157f4e275c4c32b9199f1466ed7e52f62cf branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-05T05:12:21-08:00 summary: bpo-46236: Fix PyFunction_GetAnnotations() returned tuple. 
(GH-30409) Automerge-Triggered-By: GH:pablogsal (cherry picked from commit 46e4c257e7c26c813620232135781e6c53fe8d4d) Co-authored-by: Inada Naoki files: A Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst M Objects/funcobject.c diff --git a/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst b/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst new file mode 100644 index 0000000000000..61906584a16a3 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst @@ -0,0 +1 @@ +Fix a bug in :c:func:`PyFunction_GetAnnotations` that caused it to return a ``tuple`` instead of a ``dict``. diff --git a/Objects/funcobject.c b/Objects/funcobject.c index da648b7a0ebad..801478ade22f6 100644 --- a/Objects/funcobject.c +++ b/Objects/funcobject.c @@ -221,6 +221,37 @@ PyFunction_SetClosure(PyObject *op, PyObject *closure) return 0; } +static PyObject * +func_get_annotation_dict(PyFunctionObject *op) +{ + if (op->func_annotations == NULL) { + return NULL; + } + if (PyTuple_CheckExact(op->func_annotations)) { + PyObject *ann_tuple = op->func_annotations; + PyObject *ann_dict = PyDict_New(); + if (ann_dict == NULL) { + return NULL; + } + + assert(PyTuple_GET_SIZE(ann_tuple) % 2 == 0); + + for (Py_ssize_t i = 0; i < PyTuple_GET_SIZE(ann_tuple); i += 2) { + int err = PyDict_SetItem(ann_dict, + PyTuple_GET_ITEM(ann_tuple, i), + PyTuple_GET_ITEM(ann_tuple, i + 1)); + + if (err < 0) { + return NULL; + } + } + Py_SETREF(op->func_annotations, ann_dict); + } + Py_INCREF(op->func_annotations); + assert(PyDict_Check(op->func_annotations)); + return op->func_annotations; +} + PyObject * PyFunction_GetAnnotations(PyObject *op) { @@ -228,7 +259,7 @@ PyFunction_GetAnnotations(PyObject *op) PyErr_BadInternalCall(); return NULL; } - return ((PyFunctionObject *) op) -> func_annotations; + return func_get_annotation_dict((PyFunctionObject *)op); } int @@ -443,27 +474,7 @@ func_get_annotations(PyFunctionObject *op, void *Py_UNUSED(ignored)) if (op->func_annotations == NULL) return NULL; } - if (PyTuple_CheckExact(op->func_annotations)) { - PyObject *ann_tuple = op->func_annotations; - PyObject *ann_dict = PyDict_New(); - if (ann_dict == NULL) { - return NULL; - } - - assert(PyTuple_GET_SIZE(ann_tuple) % 2 == 0); - - for (Py_ssize_t i = 0; i < PyTuple_GET_SIZE(ann_tuple); i += 2) { - int err = PyDict_SetItem(ann_dict, - PyTuple_GET_ITEM(ann_tuple, i), - PyTuple_GET_ITEM(ann_tuple, i + 1)); - - if (err < 0) - return NULL; - } - Py_SETREF(op->func_annotations, ann_dict); - } - Py_INCREF(op->func_annotations); - return op->func_annotations; + return func_get_annotation_dict(op); } static int From webhook-mailer at python.org Wed Jan 5 10:39:20 2022 From: webhook-mailer at python.org (rhettinger) Date: Wed, 05 Jan 2022 15:39:20 -0000 Subject: [Python-checkins] bpo-46257: Convert statistics._ss() to a single pass algorithm (GH-30403) Message-ID: https://github.com/python/cpython/commit/43aac29cbbb8a963a22c334b5b795d1e43417d6b commit: 43aac29cbbb8a963a22c334b5b795d1e43417d6b branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-05T09:39:10-06:00 summary: bpo-46257: Convert statistics._ss() to a single pass algorithm (GH-30403) files: A Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst M Lib/statistics.py diff --git a/Lib/statistics.py b/Lib/statistics.py index c104571d39053..eef2453bc7394 100644 --- a/Lib/statistics.py +++ b/Lib/statistics.py @@ -138,7 +138,7 @@ from bisect import bisect_left, bisect_right from math import hypot, sqrt, 
fabs, exp, erf, tau, log, fsum from operator import mul -from collections import Counter, namedtuple +from collections import Counter, namedtuple, defaultdict _SQRT2 = sqrt(2.0) @@ -202,6 +202,43 @@ def _sum(data): return (T, total, count) +def _ss(data, c=None): + """Return sum of square deviations of sequence data. + + If ``c`` is None, the mean is calculated in one pass, and the deviations + from the mean are calculated in a second pass. Otherwise, deviations are + calculated from ``c`` as given. Use the second case with care, as it can + lead to garbage results. + """ + if c is not None: + T, total, count = _sum((d := x - c) * d for x in data) + return (T, total, count) + count = 0 + sx_partials = defaultdict(int) + sxx_partials = defaultdict(int) + T = int + for typ, values in groupby(data, type): + T = _coerce(T, typ) # or raise TypeError + for n, d in map(_exact_ratio, values): + count += 1 + sx_partials[d] += n + sxx_partials[d] += n * n + if not count: + total = Fraction(0) + elif None in sx_partials: + # The sum will be a NAN or INF. We can ignore all the finite + # partials, and just look at this special one. + total = sx_partials[None] + assert not _isfinite(total) + else: + sx = sum(Fraction(n, d) for d, n in sx_partials.items()) + sxx = sum(Fraction(n, d*d) for d, n in sxx_partials.items()) + # This formula has poor numeric properties for floats, + # but with fractions it is exact. + total = (count * sxx - sx * sx) / count + return (T, total, count) + + def _isfinite(x): try: return x.is_finite() # Likely a Decimal. @@ -399,13 +436,9 @@ def mean(data): If ``data`` is empty, StatisticsError will be raised. """ - if iter(data) is data: - data = list(data) - n = len(data) + T, total, n = _sum(data) if n < 1: raise StatisticsError('mean requires at least one data point') - T, total, count = _sum(data) - assert count == n return _convert(total / n, T) @@ -776,41 +809,6 @@ def quantiles(data, *, n=4, method='exclusive'): # See http://mathworld.wolfram.com/Variance.html # http://mathworld.wolfram.com/SampleVariance.html -# http://en.wikipedia.org/wiki/Algorithms_for_calculating_variance -# -# Under no circumstances use the so-called "computational formula for -# variance", as that is only suitable for hand calculations with a small -# amount of low-precision data. It has terrible numeric properties. -# -# See a comparison of three computational methods here: -# http://www.johndcook.com/blog/2008/09/26/comparing-three-methods-of-computing-standard-deviation/ - -def _ss(data, c=None): - """Return sum of square deviations of sequence data. - - If ``c`` is None, the mean is calculated in one pass, and the deviations - from the mean are calculated in a second pass. Otherwise, deviations are - calculated from ``c`` as given. Use the second case with care, as it can - lead to garbage results. - """ - if c is not None: - T, total, count = _sum((d := x - c) * d for x in data) - return (T, total) - T, total, count = _sum(data) - mean_n, mean_d = (total / count).as_integer_ratio() - partials = Counter() - for n, d in map(_exact_ratio, data): - diff_n = n * mean_d - d * mean_n - diff_d = d * mean_d - partials[diff_d * diff_d] += diff_n * diff_n - if None in partials: - # The sum will be a NAN or INF. We can ignore all the finite - # partials, and just look at this special one. 
- total = partials[None] - assert not _isfinite(total) - else: - total = sum(Fraction(n, d) for d, n in partials.items()) - return (T, total) def variance(data, xbar=None): @@ -851,12 +849,9 @@ def variance(data, xbar=None): Fraction(67, 108) """ - if iter(data) is data: - data = list(data) - n = len(data) + T, ss, n = _ss(data, xbar) if n < 2: raise StatisticsError('variance requires at least two data points') - T, ss = _ss(data, xbar) return _convert(ss / (n - 1), T) @@ -895,12 +890,9 @@ def pvariance(data, mu=None): Fraction(13, 72) """ - if iter(data) is data: - data = list(data) - n = len(data) + T, ss, n = _ss(data, mu) if n < 1: raise StatisticsError('pvariance requires at least one data point') - T, ss = _ss(data, mu) return _convert(ss / n, T) @@ -913,12 +905,9 @@ def stdev(data, xbar=None): 1.0810874155219827 """ - if iter(data) is data: - data = list(data) - n = len(data) + T, ss, n = _ss(data, xbar) if n < 2: raise StatisticsError('stdev requires at least two data points') - T, ss = _ss(data, xbar) mss = ss / (n - 1) if issubclass(T, Decimal): return _decimal_sqrt_of_frac(mss.numerator, mss.denominator) @@ -934,12 +923,9 @@ def pstdev(data, mu=None): 0.986893273527251 """ - if iter(data) is data: - data = list(data) - n = len(data) + T, ss, n = _ss(data, mu) if n < 1: raise StatisticsError('pstdev requires at least one data point') - T, ss = _ss(data, mu) mss = ss / n if issubclass(T, Decimal): return _decimal_sqrt_of_frac(mss.numerator, mss.denominator) diff --git a/Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst b/Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst new file mode 100644 index 0000000000000..72ae56ec412a6 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst @@ -0,0 +1,4 @@ +Optimized the mean, variance, and stdev functions in the statistics module. +If the input is an iterator, it is consumed in a single pass rather than +eating memory by conversion to a list. The single pass algorithm is about +twice as fast as the previous two pass code. 
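For context, the new single-pass _ss() relies on the algebraic identity sum((x - mean)**2) == (n*sum(x*x) - sum(x)**2) / n, evaluated with exact fractions, so the usual numeric objections to the "computational formula" for variance do not apply here. A minimal sketch of that idea (not the stdlib implementation itself):

    from fractions import Fraction

    data = [2.5, 1.25, 3.75, 0.5]
    n = len(data)
    sx = sum(Fraction(x) for x in data)                  # running sum
    sxx = sum(Fraction(x) * Fraction(x) for x in data)   # running sum of squares
    ss_one_pass = (n * sxx - sx * sx) / n                # sum of squared deviations

    mean = sx / n
    ss_two_pass = sum((Fraction(x) - mean) ** 2 for x in data)
    assert ss_one_pass == ss_two_pass                    # exact rationals, so the identity holds

A practical consequence, per the NEWS entry above, is that a one-shot iterator such as statistics.variance(iter(data)) now works without the data first being copied into a list.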
From webhook-mailer at python.org Wed Jan 5 12:06:08 2022 From: webhook-mailer at python.org (ethanfurman) Date: Wed, 05 Jan 2022 17:06:08 -0000 Subject: [Python-checkins] bpo-46269: [Enum] remove special-casing of `__new__` in `EnumType.__dir__` (GH-30421) Message-ID: https://github.com/python/cpython/commit/817a6bc9f7b802511c4d42273a621c556a48870b commit: 817a6bc9f7b802511c4d42273a621c556a48870b branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-05T09:06:02-08:00 summary: bpo-46269: [Enum] remove special-casing of `__new__` in `EnumType.__dir__` (GH-30421) files: A Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst M Lib/enum.py diff --git a/Lib/enum.py b/Lib/enum.py index 8efc38c3d78db..86928b4f79f0b 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -652,10 +652,6 @@ def __dir__(self): # if and only if they have been user-overridden enum_dunders = set(filter(_is_dunder, enum_dict)) - # special-case __new__ - if self.__new__ is not first_enum_base.__new__: - add_to_dir('__new__') - for cls in mro: # Ignore any classes defined in this module if cls is object or is_from_this_module(cls): diff --git a/Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst b/Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst new file mode 100644 index 0000000000000..5d3687aaddfea --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst @@ -0,0 +1 @@ +Remove special-casing of ``__new__`` in :meth:`enum.Enum.__dir__`. From webhook-mailer at python.org Wed Jan 5 15:21:08 2022 From: webhook-mailer at python.org (rhettinger) Date: Wed, 05 Jan 2022 20:21:08 -0000 Subject: [Python-checkins] bpo-46266: Add calendar day of week constants to __all__ (GH-30412) Message-ID: https://github.com/python/cpython/commit/e5894ca8fd05e6a6df1033025b9093b68baa718d commit: e5894ca8fd05e6a6df1033025b9093b68baa718d branch: main author: Nikita Sobolev committer: rhettinger date: 2022-01-05T14:21:04-06:00 summary: bpo-46266: Add calendar day of week constants to __all__ (GH-30412) files: A Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst M Doc/library/calendar.rst M Lib/calendar.py M Lib/test/test_calendar.py diff --git a/Doc/library/calendar.rst b/Doc/library/calendar.rst index 6050ff5607a27..b667c42e708fa 100644 --- a/Doc/library/calendar.rst +++ b/Doc/library/calendar.rst @@ -31,7 +31,7 @@ interpreted as prescribed by the ISO 8601 standard. Year 0 is 1 BC, year -1 is .. class:: Calendar(firstweekday=0) Creates a :class:`Calendar` object. *firstweekday* is an integer specifying the - first day of the week. ``0`` is Monday (the default), ``6`` is Sunday. + first day of the week. :const:`MONDAY` is ``0`` (the default), :const:`SUNDAY` is ``6``. A :class:`Calendar` object provides several methods that can be used for preparing the calendar data for formatting. This class doesn't do any formatting @@ -406,6 +406,15 @@ The :mod:`calendar` module exports the following data attributes: locale. This follows normal convention of January being month number 1, so it has a length of 13 and ``month_abbr[0]`` is the empty string. +.. data:: MONDAY + TUESDAY + WEDNESDAY + THURSDAY + FRIDAY + SATURDAY + SUNDAY + + Aliases for day numbers, where ``MONDAY`` is ``0`` and ``SUNDAY`` is ``6``. .. 
seealso:: diff --git a/Lib/calendar.py b/Lib/calendar.py index 663bb946b0d11..06c65a80cd80f 100644 --- a/Lib/calendar.py +++ b/Lib/calendar.py @@ -15,7 +15,9 @@ "monthcalendar", "prmonth", "month", "prcal", "calendar", "timegm", "month_name", "month_abbr", "day_name", "day_abbr", "Calendar", "TextCalendar", "HTMLCalendar", "LocaleTextCalendar", - "LocaleHTMLCalendar", "weekheader"] + "LocaleHTMLCalendar", "weekheader", + "MONDAY", "TUESDAY", "WEDNESDAY", "THURSDAY", "FRIDAY", + "SATURDAY", "SUNDAY"] # Exception raised for bad input (with string parameter for details) error = ValueError diff --git a/Lib/test/test_calendar.py b/Lib/test/test_calendar.py index c641e8c418318..39094ad6fd9ab 100644 --- a/Lib/test/test_calendar.py +++ b/Lib/test/test_calendar.py @@ -935,8 +935,7 @@ def test_html_output_year_css(self): class MiscTestCase(unittest.TestCase): def test__all__(self): not_exported = { - 'mdays', 'January', 'February', 'EPOCH', 'MONDAY', 'TUESDAY', - 'WEDNESDAY', 'THURSDAY', 'FRIDAY', 'SATURDAY', 'SUNDAY', + 'mdays', 'January', 'February', 'EPOCH', 'different_locale', 'c', 'prweek', 'week', 'format', 'formatstring', 'main', 'monthlen', 'prevmonth', 'nextmonth'} support.check__all__(self, calendar, not_exported=not_exported) diff --git a/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst b/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst new file mode 100644 index 0000000000000..354dcb0106595 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst @@ -0,0 +1,4 @@ +Improve day constants in :mod:`calendar`. + +Now all constants (`MONDAY` ... `SUNDAY`) are documented, tested, and added +to ``__all__``. From webhook-mailer at python.org Thu Jan 6 02:53:53 2022 From: webhook-mailer at python.org (vstinner) Date: Thu, 06 Jan 2022 07:53:53 -0000 Subject: [Python-checkins] bpo-46006: Revert "bpo-40521: Per-interpreter interned strings (GH-20085)" (GH-30422) Message-ID: https://github.com/python/cpython/commit/35d6540c904ef07b8602ff014e520603f84b5886 commit: 35d6540c904ef07b8602ff014e520603f84b5886 branch: main author: Victor Stinner committer: vstinner date: 2022-01-06T08:53:44+01:00 summary: bpo-46006: Revert "bpo-40521: Per-interpreter interned strings (GH-20085)" (GH-30422) This reverts commit ea251806b8dffff11b30d2182af1e589caf88acf. Keep "assert(interned == NULL);" in _PyUnicode_Fini(), but only for the main interpreter. Keep _PyUnicode_ClearInterned() changes avoiding the creation of a temporary Python list object. files: A Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst M Include/internal/pycore_unicodeobject.h M Objects/typeobject.c M Objects/unicodeobject.c diff --git a/Include/internal/pycore_unicodeobject.h b/Include/internal/pycore_unicodeobject.h index c50c42011a934..97d11aeb8201c 100644 --- a/Include/internal/pycore_unicodeobject.h +++ b/Include/internal/pycore_unicodeobject.h @@ -48,21 +48,11 @@ struct _Py_unicode_state { PyObject *latin1[256]; struct _Py_unicode_fs_codec fs_codec; - /* This dictionary holds all interned unicode strings. Note that references - to strings in this dictionary are *not* counted in the string's ob_refcnt. - When the interned string reaches a refcnt of 0 the string deallocation - function will delete the reference from this dictionary. - - Another way to look at this is that to say that the actual reference - count of a string is: s->ob_refcnt + (s->state ? 
2 : 0) - */ - PyObject *interned; - // Unicode identifiers (_Py_Identifier): see _PyUnicode_FromId() struct _Py_unicode_ids ids; }; -extern void _PyUnicode_ClearInterned(PyInterpreterState *); +extern void _PyUnicode_ClearInterned(PyInterpreterState *interp); #ifdef __cplusplus diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst new file mode 100644 index 0000000000000..3acd2b09390a8 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst @@ -0,0 +1,5 @@ +Fix a regression when a type method like ``__init__()`` is modified in a +subinterpreter. Fix a regression in ``_PyUnicode_EqualToASCIIId()`` and type +``update_slot()``. Revert the change which made the Unicode dictionary of +interned strings compatible with subinterpreters: the internal interned +dictionary is shared again by all interpreters. Patch by Victor Stinner. diff --git a/Objects/typeobject.c b/Objects/typeobject.c index af35180cdb983..cbf806b074b9f 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -54,6 +54,11 @@ typedef struct PySlot_Offset { } PySlot_Offset; +/* bpo-40521: Interned strings are shared by all subinterpreters */ +#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS +# define INTERN_NAME_STRINGS +#endif + /* alphabetical order */ _Py_IDENTIFIER(__abstractmethods__); _Py_IDENTIFIER(__annotations__); @@ -4028,6 +4033,7 @@ type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) if (name == NULL) return -1; } +#ifdef INTERN_NAME_STRINGS if (!PyUnicode_CHECK_INTERNED(name)) { PyUnicode_InternInPlace(&name); if (!PyUnicode_CHECK_INTERNED(name)) { @@ -4037,6 +4043,7 @@ type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) return -1; } } +#endif } else { /* Will fail in _PyObject_GenericSetAttrWithDict. */ @@ -8424,10 +8431,17 @@ _PyTypes_InitSlotDefs(void) for (slotdef *p = slotdefs; p->name; p++) { /* Slots must be ordered by their offset in the PyHeapTypeObject. */ assert(!p[1].name || p->offset <= p[1].offset); +#ifdef INTERN_NAME_STRINGS p->name_strobj = PyUnicode_InternFromString(p->name); if (!p->name_strobj || !PyUnicode_CHECK_INTERNED(p->name_strobj)) { return _PyStatus_NO_MEMORY(); } +#else + p->name_strobj = PyUnicode_FromString(p->name); + if (!p->name_strobj) { + return _PyStatus_NO_MEMORY(); + } +#endif } slotdefs_initialized = 1; return _PyStatus_OK(); @@ -8452,16 +8466,24 @@ update_slot(PyTypeObject *type, PyObject *name) int offset; assert(PyUnicode_CheckExact(name)); +#ifdef INTERN_NAME_STRINGS assert(PyUnicode_CHECK_INTERNED(name)); +#endif assert(slotdefs_initialized); pp = ptrs; for (p = slotdefs; p->name; p++) { assert(PyUnicode_CheckExact(p->name_strobj)); assert(PyUnicode_CheckExact(name)); +#ifdef INTERN_NAME_STRINGS if (p->name_strobj == name) { *pp++ = p; } +#else + if (p->name_strobj == name || _PyUnicode_EQ(p->name_strobj, name)) { + *pp++ = p; + } +#endif } *pp = NULL; for (pp = ptrs; *pp; pp++) { diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index 14449bce70839..31b8710defbea 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -214,6 +214,22 @@ extern "C" { # define OVERALLOCATE_FACTOR 4 #endif +/* bpo-40521: Interned strings are shared by all interpreters. */ +#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS +# define INTERNED_STRINGS +#endif + +/* This dictionary holds all interned unicode strings. 
Note that references + to strings in this dictionary are *not* counted in the string's ob_refcnt. + When the interned string reaches a refcnt of 0 the string deallocation + function will delete the reference from this dictionary. + + Another way to look at this is that to say that the actual reference + count of a string is: s->ob_refcnt + (s->state ? 2 : 0) +*/ +#ifdef INTERNED_STRINGS +static PyObject *interned = NULL; +#endif /* Forward declaration */ static inline int @@ -1950,7 +1966,7 @@ unicode_dealloc(PyObject *unicode) case SSTATE_INTERNED_MORTAL: { - struct _Py_unicode_state *state = get_unicode_state(); +#ifdef INTERNED_STRINGS /* Revive the dead object temporarily. PyDict_DelItem() removes two references (key and value) which were ignored by PyUnicode_InternInPlace(). Use refcnt=3 rather than refcnt=2 @@ -1958,12 +1974,13 @@ unicode_dealloc(PyObject *unicode) PyDict_DelItem(). */ assert(Py_REFCNT(unicode) == 0); Py_SET_REFCNT(unicode, 3); - if (PyDict_DelItem(state->interned, unicode) != 0) { + if (PyDict_DelItem(interned, unicode) != 0) { _PyErr_WriteUnraisableMsg("deletion of interned string failed", NULL); } assert(Py_REFCNT(unicode) == 1); Py_SET_REFCNT(unicode, 0); +#endif break; } @@ -11342,11 +11359,13 @@ _PyUnicode_EqualToASCIIId(PyObject *left, _Py_Identifier *right) if (PyUnicode_CHECK_INTERNED(left)) return 0; +#ifdef INTERNED_STRINGS assert(_PyUnicode_HASH(right_uni) != -1); Py_hash_t hash = _PyUnicode_HASH(left); if (hash != -1 && hash != _PyUnicode_HASH(right_uni)) { return 0; } +#endif return unicode_compare_eq(left, right_uni); } @@ -15591,21 +15610,21 @@ PyUnicode_InternInPlace(PyObject **p) return; } +#ifdef INTERNED_STRINGS if (PyUnicode_READY(s) == -1) { PyErr_Clear(); return; } - struct _Py_unicode_state *state = get_unicode_state(); - if (state->interned == NULL) { - state->interned = PyDict_New(); - if (state->interned == NULL) { + if (interned == NULL) { + interned = PyDict_New(); + if (interned == NULL) { PyErr_Clear(); /* Don't leave an exception */ return; } } - PyObject *t = PyDict_SetDefault(state->interned, s, s); + PyObject *t = PyDict_SetDefault(interned, s, s); if (t == NULL) { PyErr_Clear(); return; @@ -15622,9 +15641,13 @@ PyUnicode_InternInPlace(PyObject **p) this. */ Py_SET_REFCNT(s, Py_REFCNT(s) - 2); _PyUnicode_STATE(s).interned = SSTATE_INTERNED_MORTAL; +#else + // PyDict expects that interned strings have their hash + // (PyASCIIObject.hash) already computed. 
+ (void)unicode_hash(s); +#endif } - void PyUnicode_InternImmortal(PyObject **p) { @@ -15658,11 +15681,15 @@ PyUnicode_InternFromString(const char *cp) void _PyUnicode_ClearInterned(PyInterpreterState *interp) { - struct _Py_unicode_state *state = &interp->unicode; - if (state->interned == NULL) { + if (!_Py_IsMainInterpreter(interp)) { + // interned dict is shared by all interpreters return; } - assert(PyDict_CheckExact(state->interned)); + + if (interned == NULL) { + return; + } + assert(PyDict_CheckExact(interned)); /* Interned unicode strings are not forcibly deallocated; rather, we give them their stolen references back, and then clear and DECREF the @@ -15670,13 +15697,13 @@ _PyUnicode_ClearInterned(PyInterpreterState *interp) #ifdef INTERNED_STATS fprintf(stderr, "releasing %zd interned strings\n", - PyDict_GET_SIZE(state->interned)); + PyDict_GET_SIZE(interned)); Py_ssize_t immortal_size = 0, mortal_size = 0; #endif Py_ssize_t pos = 0; PyObject *s, *ignored_value; - while (PyDict_Next(state->interned, &pos, &s, &ignored_value)) { + while (PyDict_Next(interned, &pos, &s, &ignored_value)) { assert(PyUnicode_IS_READY(s)); switch (PyUnicode_CHECK_INTERNED(s)) { @@ -15707,8 +15734,8 @@ _PyUnicode_ClearInterned(PyInterpreterState *interp) mortal_size, immortal_size); #endif - PyDict_Clear(state->interned); - Py_CLEAR(state->interned); + PyDict_Clear(interned); + Py_CLEAR(interned); } @@ -16079,8 +16106,7 @@ _PyUnicode_EnableLegacyWindowsFSEncoding(void) static inline int unicode_is_finalizing(void) { - struct _Py_unicode_state *state = get_unicode_state(); - return (state->interned == NULL); + return (interned == NULL); } #endif @@ -16090,8 +16116,10 @@ _PyUnicode_Fini(PyInterpreterState *interp) { struct _Py_unicode_state *state = &interp->unicode; - // _PyUnicode_ClearInterned() must be called before - assert(state->interned == NULL); + if (_Py_IsMainInterpreter(interp)) { + // _PyUnicode_ClearInterned() must be called before _PyUnicode_Fini() + assert(interned == NULL); + } _PyUnicode_FiniEncodings(&state->fs_codec); From webhook-mailer at python.org Thu Jan 6 06:38:40 2022 From: webhook-mailer at python.org (markshannon) Date: Thu, 06 Jan 2022 11:38:40 -0000 Subject: [Python-checkins] bpo-46031: add POP_JUMP_IF_NOT_NONE and POP_JUMP_IF_NONE (GH-30019) Message-ID: https://github.com/python/cpython/commit/3db762db72cc0da938614b1e414abb1e12ca4094 commit: 3db762db72cc0da938614b1e414abb1e12ca4094 branch: main author: penguin_wwy <940375606 at qq.com> committer: markshannon date: 2022-01-06T11:38:35Z summary: bpo-46031: add POP_JUMP_IF_NOT_NONE and POP_JUMP_IF_NONE (GH-30019) files: A Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst M Doc/library/dis.rst M Doc/whatsnew/3.11.rst M Include/opcode.h M Lib/importlib/_bootstrap_external.py M Lib/opcode.py M Python/ceval.c M Python/compile.c M Python/opcode_targets.h diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index 8490a09669656..7afa62f95bc64 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -896,6 +896,20 @@ All of the following opcodes use their arguments. .. versionadded:: 3.11 +.. opcode:: POP_JUMP_IF_NOT_NONE (target) + + If TOS is not none, sets the bytecode counter to *target*. TOS is popped. + + .. versionadded:: 3.11 + + +.. opcode:: POP_JUMP_IF_NONE (target) + + If TOS is none, sets the bytecode counter to *target*. TOS is popped. + + .. versionadded:: 3.11 + + .. 
opcode:: PREP_RERAISE_STAR Combines the raised and reraised exceptions list from TOS, into an exception diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 6794e828a7252..98ff2d44a811b 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -398,6 +398,9 @@ CPython bytecode changes * Added :opcode:`COPY`, which pushes the *i*-th item to the top of the stack. The item is not removed from its original location. +* Add :opcode:`POP_JUMP_IF_NOT_NONE` and :opcode:`POP_JUMP_IF_NONE` opcodes to + speed up conditional jumps. + * :opcode:`JUMP_IF_NOT_EXC_MATCH` no longer pops the active exception. diff --git a/Include/opcode.h b/Include/opcode.h index 1af54943e1ca5..e4deeec932cea 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -84,6 +84,8 @@ extern "C" { #define STORE_FAST 125 #define DELETE_FAST 126 #define JUMP_IF_NOT_EG_MATCH 127 +#define POP_JUMP_IF_NOT_NONE 128 +#define POP_JUMP_IF_NONE 129 #define RAISE_VARARGS 130 #define MAKE_FUNCTION 132 #define BUILD_SLICE 133 @@ -162,10 +164,10 @@ extern "C" { #define STORE_ATTR_SLOT 80 #define STORE_ATTR_WITH_HINT 81 #define LOAD_FAST__LOAD_FAST 87 -#define STORE_FAST__LOAD_FAST 128 -#define LOAD_FAST__LOAD_CONST 129 -#define LOAD_CONST__LOAD_FAST 131 -#define STORE_FAST__STORE_FAST 134 +#define STORE_FAST__LOAD_FAST 131 +#define LOAD_FAST__LOAD_CONST 134 +#define LOAD_CONST__LOAD_FAST 140 +#define STORE_FAST__STORE_FAST 141 #define DO_TRACING 255 #ifdef NEED_OPCODE_JUMP_TABLES static uint32_t _PyOpcode_RelativeJump[8] = { @@ -183,7 +185,7 @@ static uint32_t _PyOpcode_Jump[8] = { 0U, 536870912U, 2316288000U, - 0U, + 3U, 0U, 0U, 0U, diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 872d6d96a74b4..8e21be5916d31 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -378,6 +378,7 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3470 (bpo-46221: PREP_RERAISE_STAR no longer pushes lasti) # Python 3.11a4 3471 (bpo-46202: remove pop POP_EXCEPT_AND_RERAISE) # Python 3.11a4 3472 (bpo-46009: replace GEN_START with POP_TOP) +# Python 3.11a4 3473 (Add POP_JUMP_IF_NOT_NONE/POP_JUMP_IF_NONE opcodes) # # MAGIC must change whenever the bytecode emitted by the compiler may no @@ -387,7 +388,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. 
-MAGIC_NUMBER = (3472).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3473).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/opcode.py b/Lib/opcode.py index f99e20ad0a670..6030743b35c5b 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -148,9 +148,9 @@ def jabs_op(name, op): haslocal.append(125) def_op('DELETE_FAST', 126) # Local variable number haslocal.append(126) - jabs_op('JUMP_IF_NOT_EG_MATCH', 127) - +jabs_op('POP_JUMP_IF_NOT_NONE', 128) +jabs_op('POP_JUMP_IF_NONE', 129) def_op('RAISE_VARARGS', 130) # Number of raise arguments (1, 2, or 3) def_op('MAKE_FUNCTION', 132) # Flags diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst new file mode 100644 index 0000000000000..65c8b38cf8acc --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst @@ -0,0 +1 @@ +Add :opcode:`POP_JUMP_IF_NOT_NONE` and :opcode:`POP_JUMP_IF_NONE` opcodes to speed up conditional jumps. \ No newline at end of file diff --git a/Python/ceval.c b/Python/ceval.c index b4ac9ec848f77..86d834cd3a67f 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -4049,6 +4049,30 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr DISPATCH(); } + TARGET(POP_JUMP_IF_NOT_NONE) { + PyObject *value = POP(); + if (!Py_IsNone(value)) { + Py_DECREF(value); + JUMPTO(oparg); + CHECK_EVAL_BREAKER(); + DISPATCH(); + } + Py_DECREF(value); + DISPATCH(); + } + + TARGET(POP_JUMP_IF_NONE) { + PyObject *value = POP(); + if (Py_IsNone(value)) { + Py_DECREF(value); + JUMPTO(oparg); + CHECK_EVAL_BREAKER(); + DISPATCH(); + } + Py_DECREF(value); + DISPATCH(); + } + TARGET(JUMP_IF_FALSE_OR_POP) { PyObject *cond = TOP(); int err; diff --git a/Python/compile.c b/Python/compile.c index 3a390751fe2d2..625a07bd39675 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -1110,6 +1110,8 @@ stack_effect(int opcode, int oparg, int jump) case POP_JUMP_IF_FALSE: case POP_JUMP_IF_TRUE: + case POP_JUMP_IF_NONE: + case POP_JUMP_IF_NOT_NONE: return -1; case LOAD_GLOBAL: @@ -8519,6 +8521,21 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) bb->b_instr[i+1].i_opcode = NOP; } break; + case IS_OP: + cnt = get_const_value(inst->i_opcode, oparg, consts); + if (cnt == NULL) { + goto error; + } + int jump_op = i+2 < bb->b_iused ? bb->b_instr[i+2].i_opcode : 0; + if (Py_IsNone(cnt) && (jump_op == POP_JUMP_IF_FALSE || jump_op == POP_JUMP_IF_TRUE)) { + unsigned char nextarg = bb->b_instr[i+1].i_oparg; + inst->i_opcode = NOP; + bb->b_instr[i+1].i_opcode = NOP; + bb->b_instr[i+2].i_opcode = nextarg ^ (jump_op == POP_JUMP_IF_FALSE) ? 
+ POP_JUMP_IF_NOT_NONE : POP_JUMP_IF_NONE; + } + Py_DECREF(cnt); + break; } break; } @@ -8611,6 +8628,14 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) break; } break; + case POP_JUMP_IF_NOT_NONE: + case POP_JUMP_IF_NONE: + switch (target->i_opcode) { + case JUMP_ABSOLUTE: + case JUMP_FORWARD: + i -= jump_thread(inst, target, inst->i_opcode); + } + break; case POP_JUMP_IF_FALSE: switch (target->i_opcode) { case JUMP_ABSOLUTE: @@ -8766,6 +8791,8 @@ normalize_basic_block(basicblock *bb) { case JUMP_FORWARD: bb->b_nofallthrough = 1; /* fall through */ + case POP_JUMP_IF_NOT_NONE: + case POP_JUMP_IF_NONE: case POP_JUMP_IF_FALSE: case POP_JUMP_IF_TRUE: case JUMP_IF_FALSE_OR_POP: diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index e9f1a483b970c..7ba45666ed061 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -127,20 +127,20 @@ static void *opcode_targets[256] = { &&TARGET_STORE_FAST, &&TARGET_DELETE_FAST, &&TARGET_JUMP_IF_NOT_EG_MATCH, - &&TARGET_STORE_FAST__LOAD_FAST, - &&TARGET_LOAD_FAST__LOAD_CONST, + &&TARGET_POP_JUMP_IF_NOT_NONE, + &&TARGET_POP_JUMP_IF_NONE, &&TARGET_RAISE_VARARGS, - &&TARGET_LOAD_CONST__LOAD_FAST, + &&TARGET_STORE_FAST__LOAD_FAST, &&TARGET_MAKE_FUNCTION, &&TARGET_BUILD_SLICE, - &&TARGET_STORE_FAST__STORE_FAST, + &&TARGET_LOAD_FAST__LOAD_CONST, &&TARGET_MAKE_CELL, &&TARGET_LOAD_CLOSURE, &&TARGET_LOAD_DEREF, &&TARGET_STORE_DEREF, &&TARGET_DELETE_DEREF, - &&_unknown_opcode, - &&_unknown_opcode, + &&TARGET_LOAD_CONST__LOAD_FAST, + &&TARGET_STORE_FAST__STORE_FAST, &&TARGET_CALL_FUNCTION_EX, &&_unknown_opcode, &&TARGET_EXTENDED_ARG, From webhook-mailer at python.org Thu Jan 6 07:31:37 2022 From: webhook-mailer at python.org (asvetlov) Date: Thu, 06 Jan 2022 12:31:37 -0000 Subject: [Python-checkins] Reflect 'context' arg in 'AbstractEventLoop.call_*()' methods (GH-30427) Message-ID: https://github.com/python/cpython/commit/3e43fac2503afe219336742b150b3ef6e470686f commit: 3e43fac2503afe219336742b150b3ef6e470686f branch: main author: Andrew Svetlov committer: asvetlov date: 2022-01-06T14:31:32+02:00 summary: Reflect 'context' arg in 'AbstractEventLoop.call_*()' methods (GH-30427) files: A Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst M Lib/asyncio/events.py diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py index d91fe8db2b020..831c19cf0ec68 100644 --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -257,13 +257,13 @@ def _timer_handle_cancelled(self, handle): """Notification that a TimerHandle has been cancelled.""" raise NotImplementedError - def call_soon(self, callback, *args): + def call_soon(self, callback, *args, context=None): return self.call_later(0, callback, *args) - def call_later(self, delay, callback, *args): + def call_later(self, delay, callback, *args, context=None): raise NotImplementedError - def call_at(self, when, callback, *args): + def call_at(self, when, callback, *args, cotext=None): raise NotImplementedError def time(self): @@ -279,7 +279,7 @@ def create_task(self, coro, *, name=None): # Methods for interacting with threads. 
- def call_soon_threadsafe(self, callback, *args): + def call_soon_threadsafe(self, callback, *args, context=None): raise NotImplementedError def run_in_executor(self, executor, func, *args): diff --git a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst new file mode 100644 index 0000000000000..40849044cf1c8 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst @@ -0,0 +1,2 @@ +Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop +implementations already support it. From webhook-mailer at python.org Thu Jan 6 08:03:16 2022 From: webhook-mailer at python.org (asvetlov) Date: Thu, 06 Jan 2022 13:03:16 -0000 Subject: [Python-checkins] Reflect 'context' arg in 'AbstractEventLoop.call_*()' methods (GH-30427) (#30429) Message-ID: https://github.com/python/cpython/commit/0aa8bbfe1e42349acff3854bfd9f0b3488dd1b98 commit: 0aa8bbfe1e42349acff3854bfd9f0b3488dd1b98 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-06T15:03:11+02:00 summary: Reflect 'context' arg in 'AbstractEventLoop.call_*()' methods (GH-30427) (#30429) (cherry picked from commit 3e43fac2503afe219336742b150b3ef6e470686f) Co-authored-by: Andrew Svetlov Co-authored-by: Andrew Svetlov files: A Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst M Lib/asyncio/events.py diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py index c8e4b407b476d..db7720abcfede 100644 --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -258,13 +258,13 @@ def _timer_handle_cancelled(self, handle): """Notification that a TimerHandle has been cancelled.""" raise NotImplementedError - def call_soon(self, callback, *args): + def call_soon(self, callback, *args, context=None): return self.call_later(0, callback, *args) - def call_later(self, delay, callback, *args): + def call_later(self, delay, callback, *args, context=None): raise NotImplementedError - def call_at(self, when, callback, *args): + def call_at(self, when, callback, *args, cotext=None): raise NotImplementedError def time(self): @@ -280,7 +280,7 @@ def create_task(self, coro, *, name=None): # Methods for interacting with threads. - def call_soon_threadsafe(self, callback, *args): + def call_soon_threadsafe(self, callback, *args, context=None): raise NotImplementedError def run_in_executor(self, executor, func, *args): diff --git a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst new file mode 100644 index 0000000000000..40849044cf1c8 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst @@ -0,0 +1,2 @@ +Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop +implementations already support it. 
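The ``context`` keyword that these signatures now spell out is the PEP 567
``contextvars`` hook which the concrete event loops already accepted. Below is a
minimal sketch of what passing ``context=`` buys a caller; it is not taken from
any of the patches above, and the names ``request_id`` and ``report`` are purely
illustrative:

    import asyncio
    import contextvars

    request_id = contextvars.ContextVar("request_id", default="unset")

    def report():
        # Runs in whichever Context the loop was told to use for this callback.
        print("request_id =", request_id.get())

    async def main():
        loop = asyncio.get_running_loop()

        request_id.set("abc123")
        snapshot = contextvars.copy_context()    # Context holding "abc123"
        request_id.set("xyz789")

        loop.call_soon(report)                    # runs in the current context, sees "xyz789"
        loop.call_soon(report, context=snapshot)  # runs in the snapshot, sees "abc123"
        await asyncio.sleep(0)                    # let the scheduled callbacks run

    asyncio.run(main())

Both forms were already accepted by the concrete loop implementations; the change
in these commits only makes the abstract ``AbstractEventLoop`` signatures agree
with them.
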
From webhook-mailer at python.org Thu Jan 6 08:04:09 2022 From: webhook-mailer at python.org (asvetlov) Date: Thu, 06 Jan 2022 13:04:09 -0000 Subject: [Python-checkins] Reflect 'context' arg in 'AbstractEventLoop.call_*()' methods (GH-30427) (GH-30428) Message-ID: https://github.com/python/cpython/commit/8670fbe4d2503ab9d3467d859fa504d1dd6c6eec commit: 8670fbe4d2503ab9d3467d859fa504d1dd6c6eec branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-06T15:04:05+02:00 summary: Reflect 'context' arg in 'AbstractEventLoop.call_*()' methods (GH-30427) (GH-30428) (cherry picked from commit 3e43fac2503afe219336742b150b3ef6e470686f) Co-authored-by: Andrew Svetlov Co-authored-by: Andrew Svetlov files: A Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst M Lib/asyncio/events.py diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py index 7abaaca2d2b28..58236059f7e22 100644 --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -258,13 +258,13 @@ def _timer_handle_cancelled(self, handle): """Notification that a TimerHandle has been cancelled.""" raise NotImplementedError - def call_soon(self, callback, *args): + def call_soon(self, callback, *args, context=None): return self.call_later(0, callback, *args) - def call_later(self, delay, callback, *args): + def call_later(self, delay, callback, *args, context=None): raise NotImplementedError - def call_at(self, when, callback, *args): + def call_at(self, when, callback, *args, cotext=None): raise NotImplementedError def time(self): @@ -280,7 +280,7 @@ def create_task(self, coro, *, name=None): # Methods for interacting with threads. - def call_soon_threadsafe(self, callback, *args): + def call_soon_threadsafe(self, callback, *args, context=None): raise NotImplementedError def run_in_executor(self, executor, func, *args): diff --git a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst new file mode 100644 index 0000000000000..40849044cf1c8 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst @@ -0,0 +1,2 @@ +Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop +implementations already support it. From webhook-mailer at python.org Thu Jan 6 08:09:33 2022 From: webhook-mailer at python.org (markshannon) Date: Thu, 06 Jan 2022 13:09:33 -0000 Subject: [Python-checkins] bpo-45923: Handle call events in bytecode (GH-30364) Message-ID: https://github.com/python/cpython/commit/e028ae99ecee671c0e8a3eabb829b5b2acfc4441 commit: e028ae99ecee671c0e8a3eabb829b5b2acfc4441 branch: main author: Mark Shannon committer: markshannon date: 2022-01-06T13:09:25Z summary: bpo-45923: Handle call events in bytecode (GH-30364) * Add a RESUME instruction to handle "call" events. files: A Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst M Doc/library/dis.rst M Include/opcode.h M Lib/importlib/_bootstrap_external.py M Lib/opcode.py M Lib/test/test_code.py M Lib/test/test_compile.py M Lib/test/test_dis.py M PC/launcher.c M Programs/test_frozenmain.h M Python/ceval.c M Python/compile.c M Python/opcode_targets.h diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index 7afa62f95bc64..6bbe4ecbe8a1f 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -1206,6 +1206,20 @@ All of the following opcodes use their arguments. .. versionadded:: 3.11 +.. opcode:: RESUME (where) + + A no-op. 
Performs internal tracing, debugging and optimization checks. + + The ``where`` operand marks where the ``RESUME`` occurs: + + * ``0`` The start of a function + * ``1`` After a ``yield`` expression + * ``2`` After a ``yield from`` expression + * ``3`` After an ``await`` expression + + .. versionadded:: 3.11 + + .. opcode:: HAVE_ARGUMENT This is not really an opcode. It identifies the dividing line between diff --git a/Include/opcode.h b/Include/opcode.h index e4deeec932cea..5cc885597ac35 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -101,6 +101,7 @@ extern "C" { #define MAP_ADD 147 #define LOAD_CLASSDEREF 148 #define COPY_FREE_VARS 149 +#define RESUME 151 #define MATCH_CLASS 152 #define FORMAT_VALUE 155 #define BUILD_CONST_KEY_MAP 156 diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 8e21be5916d31..5aea0c4f92477 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -379,16 +379,21 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3471 (bpo-46202: remove pop POP_EXCEPT_AND_RERAISE) # Python 3.11a4 3472 (bpo-46009: replace GEN_START with POP_TOP) # Python 3.11a4 3473 (Add POP_JUMP_IF_NOT_NONE/POP_JUMP_IF_NONE opcodes) +# Python 3.11a4 3474 (Add RESUME opcode) + +# Python 3.12 will start with magic number 3500 # # MAGIC must change whenever the bytecode emitted by the compiler may no # longer be understood by older implementations of the eval loop (usually # due to the addition of new opcodes). # +# Starting with Python 3.11, Python 3.n starts with magic number 2900+50n. +# # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. -MAGIC_NUMBER = (3473).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3474).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/opcode.py b/Lib/opcode.py index 6030743b35c5b..7f39a7bfe2e8c 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -178,6 +178,7 @@ def jabs_op(name, op): hasfree.append(148) def_op('COPY_FREE_VARS', 149) +def_op('RESUME', 151) def_op('MATCH_CLASS', 152) def_op('FORMAT_VALUE', 155) diff --git a/Lib/test/test_code.py b/Lib/test/test_code.py index 88f6c782a68e4..9319f200e34fb 100644 --- a/Lib/test/test_code.py +++ b/Lib/test/test_code.py @@ -367,7 +367,7 @@ def test_co_positions_artificial_instructions(self): # get assigned the first_lineno but they don't have other positions. # There is no easy way of inferring them at that stage, so for now # we don't support it. 
- self.assertTrue(positions.count(None) in [0, 4]) + self.assertIn(positions.count(None), [0, 3, 4]) if not any(positions): artificial_instructions.append(instr) @@ -378,6 +378,7 @@ def test_co_positions_artificial_instructions(self): for instruction in artificial_instructions ], [ + ('RESUME', 0), ("PUSH_EXC_INFO", None), ("LOAD_CONST", None), # artificial 'None' ("STORE_NAME", "e"), # XX: we know the location for this @@ -419,7 +420,9 @@ def test_co_positions_empty_linetable(self): def func(): x = 1 new_code = func.__code__.replace(co_linetable=b'') - for line, end_line, column, end_column in new_code.co_positions(): + positions = new_code.co_positions() + next(positions) # Skip RESUME at start + for line, end_line, column, end_column in positions: self.assertIsNone(line) self.assertEqual(end_line, new_code.co_firstlineno + 1) @@ -428,7 +431,9 @@ def test_co_positions_empty_endlinetable(self): def func(): x = 1 new_code = func.__code__.replace(co_endlinetable=b'') - for line, end_line, column, end_column in new_code.co_positions(): + positions = new_code.co_positions() + next(positions) # Skip RESUME at start + for line, end_line, column, end_column in positions: self.assertEqual(line, new_code.co_firstlineno + 1) self.assertIsNone(end_line) @@ -437,7 +442,9 @@ def test_co_positions_empty_columntable(self): def func(): x = 1 new_code = func.__code__.replace(co_columntable=b'') - for line, end_line, column, end_column in new_code.co_positions(): + positions = new_code.co_positions() + next(positions) # Skip RESUME at start + for line, end_line, column, end_column in positions: self.assertEqual(line, new_code.co_firstlineno + 1) self.assertEqual(end_line, new_code.co_firstlineno + 1) self.assertIsNone(column) diff --git a/Lib/test/test_compile.py b/Lib/test/test_compile.py index 11615b32232e8..e237156c75f8b 100644 --- a/Lib/test/test_compile.py +++ b/Lib/test/test_compile.py @@ -158,7 +158,7 @@ def test_leading_newlines(self): s256 = "".join(["\n"] * 256 + ["spam"]) co = compile(s256, 'fn', 'exec') self.assertEqual(co.co_firstlineno, 1) - self.assertEqual(list(co.co_lines()), [(0, 8, 257)]) + self.assertEqual(list(co.co_lines()), [(0, 2, None), (2, 10, 257)]) def test_literals_with_leading_zeroes(self): for arg in ["077787", "0xj", "0x.", "0e", "090000000000000", @@ -759,7 +759,7 @@ def unused_block_while_else(): for func in funcs: opcodes = list(dis.get_instructions(func)) - self.assertLessEqual(len(opcodes), 3) + self.assertLessEqual(len(opcodes), 4) self.assertEqual('LOAD_CONST', opcodes[-2].opname) self.assertEqual(None, opcodes[-2].argval) self.assertEqual('RETURN_VALUE', opcodes[-1].opname) @@ -778,10 +778,10 @@ def continue_in_while(): # Check that we did not raise but we also don't generate bytecode for func in funcs: opcodes = list(dis.get_instructions(func)) - self.assertEqual(2, len(opcodes)) - self.assertEqual('LOAD_CONST', opcodes[0].opname) - self.assertEqual(None, opcodes[0].argval) - self.assertEqual('RETURN_VALUE', opcodes[1].opname) + self.assertEqual(3, len(opcodes)) + self.assertEqual('LOAD_CONST', opcodes[1].opname) + self.assertEqual(None, opcodes[1].argval) + self.assertEqual('RETURN_VALUE', opcodes[2].opname) def test_consts_in_conditionals(self): def and_true(x): @@ -802,9 +802,9 @@ def or_false(x): for func in funcs: with self.subTest(func=func): opcodes = list(dis.get_instructions(func)) - self.assertEqual(2, len(opcodes)) - self.assertIn('LOAD_', opcodes[0].opname) - self.assertEqual('RETURN_VALUE', opcodes[1].opname) + self.assertLessEqual(len(opcodes), 3) + 
self.assertIn('LOAD_', opcodes[-2].opname) + self.assertEqual('RETURN_VALUE', opcodes[-1].opname) def test_imported_load_method(self): sources = [ @@ -906,7 +906,7 @@ def load_attr(): o. a ) - load_attr_lines = [ 2, 3, 1 ] + load_attr_lines = [ 0, 2, 3, 1 ] def load_method(): return ( @@ -915,7 +915,7 @@ def load_method(): 0 ) ) - load_method_lines = [ 2, 3, 4, 3, 1 ] + load_method_lines = [ 0, 2, 3, 4, 3, 1 ] def store_attr(): ( @@ -924,7 +924,7 @@ def store_attr(): ) = ( v ) - store_attr_lines = [ 5, 2, 3 ] + store_attr_lines = [ 0, 5, 2, 3 ] def aug_store_attr(): ( @@ -933,7 +933,7 @@ def aug_store_attr(): ) += ( v ) - aug_store_attr_lines = [ 2, 3, 5, 1, 3 ] + aug_store_attr_lines = [ 0, 2, 3, 5, 1, 3 ] funcs = [ load_attr, load_method, store_attr, aug_store_attr] func_lines = [ load_attr_lines, load_method_lines, @@ -942,7 +942,8 @@ def aug_store_attr(): for func, lines in zip(funcs, func_lines, strict=True): with self.subTest(func=func): code_lines = [ line-func.__code__.co_firstlineno - for (_, _, line) in func.__code__.co_lines() ] + for (_, _, line) in func.__code__.co_lines() + if line is not None ] self.assertEqual(lines, code_lines) def test_line_number_genexp(self): @@ -966,7 +967,7 @@ async def test(aseq): async for i in aseq: body - expected_lines = [None, 1, 2, 1] + expected_lines = [None, 0, 1, 2, 1] code_lines = [ None if line is None else line-test.__code__.co_firstlineno for (_, _, line) in test.__code__.co_lines() ] self.assertEqual(expected_lines, code_lines) diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py index 7857458e240a5..c4473a4c261ad 100644 --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -38,43 +38,50 @@ def cm(cls, x): cls.x = x == 1 dis_c_instance_method = """\ -%3d 0 LOAD_FAST 1 (x) - 2 LOAD_CONST 1 (1) - 4 COMPARE_OP 2 (==) - 6 LOAD_FAST 0 (self) - 8 STORE_ATTR 0 (x) - 10 LOAD_CONST 0 (None) - 12 RETURN_VALUE -""" % (_C.__init__.__code__.co_firstlineno + 1,) +%3d 0 RESUME 0 + +%3d 2 LOAD_FAST 1 (x) + 4 LOAD_CONST 1 (1) + 6 COMPARE_OP 2 (==) + 8 LOAD_FAST 0 (self) + 10 STORE_ATTR 0 (x) + 12 LOAD_CONST 0 (None) + 14 RETURN_VALUE +""" % (_C.__init__.__code__.co_firstlineno, _C.__init__.__code__.co_firstlineno + 1,) dis_c_instance_method_bytes = """\ - 0 LOAD_FAST 1 - 2 LOAD_CONST 1 - 4 COMPARE_OP 2 (==) - 6 LOAD_FAST 0 - 8 STORE_ATTR 0 - 10 LOAD_CONST 0 - 12 RETURN_VALUE + 0 RESUME 0 + 2 LOAD_FAST 1 + 4 LOAD_CONST 1 + 6 COMPARE_OP 2 (==) + 8 LOAD_FAST 0 + 10 STORE_ATTR 0 + 12 LOAD_CONST 0 + 14 RETURN_VALUE """ dis_c_class_method = """\ -%3d 0 LOAD_FAST 1 (x) - 2 LOAD_CONST 1 (1) - 4 COMPARE_OP 2 (==) - 6 LOAD_FAST 0 (cls) - 8 STORE_ATTR 0 (x) - 10 LOAD_CONST 0 (None) - 12 RETURN_VALUE -""" % (_C.cm.__code__.co_firstlineno + 2,) +%3d 0 RESUME 0 + +%3d 2 LOAD_FAST 1 (x) + 4 LOAD_CONST 1 (1) + 6 COMPARE_OP 2 (==) + 8 LOAD_FAST 0 (cls) + 10 STORE_ATTR 0 (x) + 12 LOAD_CONST 0 (None) + 14 RETURN_VALUE +""" % (_C.cm.__code__.co_firstlineno, _C.cm.__code__.co_firstlineno + 2,) dis_c_static_method = """\ -%3d 0 LOAD_FAST 0 (x) - 2 LOAD_CONST 1 (1) - 4 COMPARE_OP 2 (==) - 6 STORE_FAST 0 (x) - 8 LOAD_CONST 0 (None) - 10 RETURN_VALUE -""" % (_C.sm.__code__.co_firstlineno + 2,) +%3d 0 RESUME 0 + +%3d 2 LOAD_FAST 0 (x) + 4 LOAD_CONST 1 (1) + 6 COMPARE_OP 2 (==) + 8 STORE_FAST 0 (x) + 10 LOAD_CONST 0 (None) + 12 RETURN_VALUE +""" % (_C.sm.__code__.co_firstlineno, _C.sm.__code__.co_firstlineno + 2,) # Class disassembling info has an extra newline at end. 
dis_c = """\ @@ -93,24 +100,28 @@ def _f(a): return 1 dis_f = """\ -%3d 0 LOAD_GLOBAL 0 (print) - 2 LOAD_FAST 0 (a) - 4 CALL_NO_KW 1 - 6 POP_TOP - -%3d 8 LOAD_CONST 1 (1) - 10 RETURN_VALUE -""" % (_f.__code__.co_firstlineno + 1, +%3d 0 RESUME 0 + +%3d 2 LOAD_GLOBAL 0 (print) + 4 LOAD_FAST 0 (a) + 6 CALL_NO_KW 1 + 8 POP_TOP + +%3d 10 LOAD_CONST 1 (1) + 12 RETURN_VALUE +""" % (_f.__code__.co_firstlineno, + _f.__code__.co_firstlineno + 1, _f.__code__.co_firstlineno + 2) dis_f_co_code = """\ - 0 LOAD_GLOBAL 0 - 2 LOAD_FAST 0 - 4 CALL_NO_KW 1 - 6 POP_TOP - 8 LOAD_CONST 1 - 10 RETURN_VALUE + 0 RESUME 0 + 2 LOAD_GLOBAL 0 + 4 LOAD_FAST 0 + 6 CALL_NO_KW 1 + 8 POP_TOP + 10 LOAD_CONST 1 + 12 RETURN_VALUE """ @@ -120,21 +131,24 @@ def bug708901(): pass dis_bug708901 = """\ -%3d 0 LOAD_GLOBAL 0 (range) - 2 LOAD_CONST 1 (1) +%3d 0 RESUME 0 + +%3d 2 LOAD_GLOBAL 0 (range) + 4 LOAD_CONST 1 (1) -%3d 4 LOAD_CONST 2 (10) +%3d 6 LOAD_CONST 2 (10) -%3d 6 CALL_NO_KW 2 - 8 GET_ITER - >> 10 FOR_ITER 2 (to 16) - 12 STORE_FAST 0 (res) +%3d 8 CALL_NO_KW 2 + 10 GET_ITER + >> 12 FOR_ITER 2 (to 18) + 14 STORE_FAST 0 (res) -%3d 14 JUMP_ABSOLUTE 5 (to 10) +%3d 16 JUMP_ABSOLUTE 6 (to 12) -%3d >> 16 LOAD_CONST 0 (None) - 18 RETURN_VALUE -""" % (bug708901.__code__.co_firstlineno + 1, +%3d >> 18 LOAD_CONST 0 (None) + 20 RETURN_VALUE +""" % (bug708901.__code__.co_firstlineno, + bug708901.__code__.co_firstlineno + 1, bug708901.__code__.co_firstlineno + 2, bug708901.__code__.co_firstlineno + 1, bug708901.__code__.co_firstlineno + 3, @@ -147,19 +161,22 @@ def bug1333982(x=[]): pass dis_bug1333982 = """\ -%3d 0 LOAD_ASSERTION_ERROR - 2 LOAD_CONST 2 ( at 0x..., file "%s", line %d>) - 4 MAKE_FUNCTION 0 - 6 LOAD_FAST 0 (x) - 8 GET_ITER - 10 CALL_NO_KW 1 - -%3d 12 LOAD_CONST 3 (1) - -%3d 14 BINARY_OP 0 (+) - 16 CALL_NO_KW 1 - 18 RAISE_VARARGS 1 -""" % (bug1333982.__code__.co_firstlineno + 1, +%3d 0 RESUME 0 + +%3d 2 LOAD_ASSERTION_ERROR + 4 LOAD_CONST 2 ( at 0x..., file "%s", line %d>) + 6 MAKE_FUNCTION 0 + 8 LOAD_FAST 0 (x) + 10 GET_ITER + 12 CALL_NO_KW 1 + +%3d 14 LOAD_CONST 3 (1) + +%3d 16 BINARY_OP 0 (+) + 18 CALL_NO_KW 1 + 20 RAISE_VARARGS 1 +""" % (bug1333982.__code__.co_firstlineno, + bug1333982.__code__.co_firstlineno + 1, __file__, bug1333982.__code__.co_firstlineno + 1, bug1333982.__code__.co_firstlineno + 2, @@ -175,8 +192,9 @@ def bug42562(): dis_bug42562 = """\ - 0 LOAD_CONST 0 (None) - 2 RETURN_VALUE + 0 RESUME 0 + 2 LOAD_CONST 0 (None) + 4 RETURN_VALUE """ # Extended arg followed by NOP @@ -197,48 +215,58 @@ def bug42562(): """ _BIG_LINENO_FORMAT = """\ -%3d 0 LOAD_GLOBAL 0 (spam) - 2 POP_TOP - 4 LOAD_CONST 0 (None) - 6 RETURN_VALUE + 1 0 RESUME 0 + +%3d 2 LOAD_GLOBAL 0 (spam) + 4 POP_TOP + 6 LOAD_CONST 0 (None) + 8 RETURN_VALUE """ _BIG_LINENO_FORMAT2 = """\ -%4d 0 LOAD_GLOBAL 0 (spam) - 2 POP_TOP - 4 LOAD_CONST 0 (None) - 6 RETURN_VALUE + 1 0 RESUME 0 + +%4d 2 LOAD_GLOBAL 0 (spam) + 4 POP_TOP + 6 LOAD_CONST 0 (None) + 8 RETURN_VALUE """ dis_module_expected_results = """\ Disassembly of f: - 4 0 LOAD_CONST 0 (None) - 2 RETURN_VALUE + 4 0 RESUME 0 + 2 LOAD_CONST 0 (None) + 4 RETURN_VALUE Disassembly of g: - 5 0 LOAD_CONST 0 (None) - 2 RETURN_VALUE + 5 0 RESUME 0 + 2 LOAD_CONST 0 (None) + 4 RETURN_VALUE """ expr_str = "x + 1" dis_expr_str = """\ - 1 0 LOAD_NAME 0 (x) - 2 LOAD_CONST 0 (1) - 4 BINARY_OP 0 (+) - 6 RETURN_VALUE + 0 RESUME 0 + + 1 2 LOAD_NAME 0 (x) + 4 LOAD_CONST 0 (1) + 6 BINARY_OP 0 (+) + 8 RETURN_VALUE """ simple_stmt_str = "x = x + 1" dis_simple_stmt_str = """\ - 1 0 LOAD_NAME 0 (x) - 2 LOAD_CONST 0 (1) - 
4 BINARY_OP 0 (+) - 6 STORE_NAME 0 (x) - 8 LOAD_CONST 1 (None) - 10 RETURN_VALUE + 0 RESUME 0 + + 1 2 LOAD_NAME 0 (x) + 4 LOAD_CONST 0 (1) + 6 BINARY_OP 0 (+) + 8 STORE_NAME 0 (x) + 10 LOAD_CONST 1 (None) + 12 RETURN_VALUE """ annot_stmt_str = """\ @@ -250,31 +278,33 @@ def bug42562(): # leading newline is for a reason (tests lineno) dis_annot_stmt_str = """\ - 2 0 SETUP_ANNOTATIONS - 2 LOAD_CONST 0 (1) - 4 STORE_NAME 0 (x) - 6 LOAD_NAME 1 (int) - 8 LOAD_NAME 2 (__annotations__) - 10 LOAD_CONST 1 ('x') - 12 STORE_SUBSCR + 0 RESUME 0 - 3 14 LOAD_NAME 3 (fun) - 16 LOAD_CONST 0 (1) - 18 CALL_NO_KW 1 - 20 LOAD_NAME 2 (__annotations__) - 22 LOAD_CONST 2 ('y') - 24 STORE_SUBSCR - - 4 26 LOAD_CONST 0 (1) - 28 LOAD_NAME 4 (lst) - 30 LOAD_NAME 3 (fun) - 32 LOAD_CONST 3 (0) - 34 CALL_NO_KW 1 - 36 STORE_SUBSCR - 38 LOAD_NAME 1 (int) - 40 POP_TOP - 42 LOAD_CONST 4 (None) - 44 RETURN_VALUE + 2 2 SETUP_ANNOTATIONS + 4 LOAD_CONST 0 (1) + 6 STORE_NAME 0 (x) + 8 LOAD_NAME 1 (int) + 10 LOAD_NAME 2 (__annotations__) + 12 LOAD_CONST 1 ('x') + 14 STORE_SUBSCR + + 3 16 LOAD_NAME 3 (fun) + 18 LOAD_CONST 0 (1) + 20 CALL_NO_KW 1 + 22 LOAD_NAME 2 (__annotations__) + 24 LOAD_CONST 2 ('y') + 26 STORE_SUBSCR + + 4 28 LOAD_CONST 0 (1) + 30 LOAD_NAME 4 (lst) + 32 LOAD_NAME 3 (fun) + 34 LOAD_CONST 3 (0) + 36 CALL_NO_KW 1 + 38 STORE_SUBSCR + 40 LOAD_NAME 1 (int) + 42 POP_TOP + 44 LOAD_CONST 4 (None) + 46 RETURN_VALUE """ compound_stmt_str = """\ @@ -284,60 +314,65 @@ def bug42562(): # Trailing newline has been deliberately omitted dis_compound_stmt_str = """\ - 1 0 LOAD_CONST 0 (0) - 2 STORE_NAME 0 (x) + 0 RESUME 0 + + 1 2 LOAD_CONST 0 (0) + 4 STORE_NAME 0 (x) - 2 4 NOP + 2 6 NOP - 3 >> 6 LOAD_NAME 0 (x) - 8 LOAD_CONST 1 (1) - 10 BINARY_OP 13 (+=) - 12 STORE_NAME 0 (x) + 3 >> 8 LOAD_NAME 0 (x) + 10 LOAD_CONST 1 (1) + 12 BINARY_OP 13 (+=) + 14 STORE_NAME 0 (x) - 2 14 JUMP_ABSOLUTE 3 (to 6) + 2 16 JUMP_ABSOLUTE 4 (to 8) """ dis_traceback = """\ -%3d 0 NOP - -%3d 2 LOAD_CONST 1 (1) - 4 LOAD_CONST 2 (0) - --> 6 BINARY_OP 11 (/) - 8 POP_TOP +%3d 0 RESUME 0 -%3d 10 LOAD_FAST 1 (tb) - 12 RETURN_VALUE - >> 14 PUSH_EXC_INFO +%3d 2 NOP -%3d 16 LOAD_GLOBAL 0 (Exception) - 18 JUMP_IF_NOT_EXC_MATCH 24 (to 48) - 20 STORE_FAST 0 (e) +%3d 4 LOAD_CONST 1 (1) + 6 LOAD_CONST 2 (0) + --> 8 BINARY_OP 11 (/) + 10 POP_TOP -%3d 22 LOAD_FAST 0 (e) - 24 LOAD_ATTR 1 (__traceback__) - 26 STORE_FAST 1 (tb) - 28 POP_EXCEPT - 30 LOAD_CONST 0 (None) - 32 STORE_FAST 0 (e) - 34 DELETE_FAST 0 (e) - -%3d 36 LOAD_FAST 1 (tb) - 38 RETURN_VALUE - >> 40 LOAD_CONST 0 (None) - 42 STORE_FAST 0 (e) - 44 DELETE_FAST 0 (e) - 46 RERAISE 1 - -%3d >> 48 RERAISE 0 - >> 50 COPY 3 - 52 POP_EXCEPT - 54 RERAISE 1 +%3d 12 LOAD_FAST 1 (tb) + 14 RETURN_VALUE + >> 16 PUSH_EXC_INFO + +%3d 18 LOAD_GLOBAL 0 (Exception) + 20 JUMP_IF_NOT_EXC_MATCH 25 (to 50) + 22 STORE_FAST 0 (e) + +%3d 24 LOAD_FAST 0 (e) + 26 LOAD_ATTR 1 (__traceback__) + 28 STORE_FAST 1 (tb) + 30 POP_EXCEPT + 32 LOAD_CONST 0 (None) + 34 STORE_FAST 0 (e) + 36 DELETE_FAST 0 (e) + +%3d 38 LOAD_FAST 1 (tb) + 40 RETURN_VALUE + >> 42 LOAD_CONST 0 (None) + 44 STORE_FAST 0 (e) + 46 DELETE_FAST 0 (e) + 48 RERAISE 1 + +%3d >> 50 RERAISE 0 + >> 52 COPY 3 + 54 POP_EXCEPT + 56 RERAISE 1 ExceptionTable: - 2 to 8 -> 14 [0] - 14 to 20 -> 50 [1] lasti - 22 to 26 -> 40 [1] lasti - 40 to 48 -> 50 [1] lasti -""" % (TRACEBACK_CODE.co_firstlineno + 1, + 4 to 10 -> 16 [0] + 16 to 22 -> 52 [1] lasti + 24 to 28 -> 42 [1] lasti + 42 to 50 -> 52 [1] lasti +""" % (TRACEBACK_CODE.co_firstlineno, + TRACEBACK_CODE.co_firstlineno + 1, 
TRACEBACK_CODE.co_firstlineno + 2, TRACEBACK_CODE.co_firstlineno + 5, TRACEBACK_CODE.co_firstlineno + 3, @@ -349,22 +384,24 @@ def _fstring(a, b, c, d): return f'{a} {b:4} {c!r} {d!r:4}' dis_fstring = """\ -%3d 0 LOAD_FAST 0 (a) - 2 FORMAT_VALUE 0 - 4 LOAD_CONST 1 (' ') - 6 LOAD_FAST 1 (b) - 8 LOAD_CONST 2 ('4') - 10 FORMAT_VALUE 4 (with format) - 12 LOAD_CONST 1 (' ') - 14 LOAD_FAST 2 (c) - 16 FORMAT_VALUE 2 (repr) - 18 LOAD_CONST 1 (' ') - 20 LOAD_FAST 3 (d) - 22 LOAD_CONST 2 ('4') - 24 FORMAT_VALUE 6 (repr, with format) - 26 BUILD_STRING 7 - 28 RETURN_VALUE -""" % (_fstring.__code__.co_firstlineno + 1,) +%3d 0 RESUME 0 + +%3d 2 LOAD_FAST 0 (a) + 4 FORMAT_VALUE 0 + 6 LOAD_CONST 1 (' ') + 8 LOAD_FAST 1 (b) + 10 LOAD_CONST 2 ('4') + 12 FORMAT_VALUE 4 (with format) + 14 LOAD_CONST 1 (' ') + 16 LOAD_FAST 2 (c) + 18 FORMAT_VALUE 2 (repr) + 20 LOAD_CONST 1 (' ') + 22 LOAD_FAST 3 (d) + 24 LOAD_CONST 2 ('4') + 26 FORMAT_VALUE 6 (repr, with format) + 28 BUILD_STRING 7 + 30 RETURN_VALUE +""" % (_fstring.__code__.co_firstlineno, _fstring.__code__.co_firstlineno + 1) def _tryfinally(a, b): try: @@ -379,42 +416,18 @@ def _tryfinallyconst(b): b() dis_tryfinally = """\ -%3d 0 NOP - -%3d 2 LOAD_FAST 0 (a) - -%3d 4 LOAD_FAST 1 (b) - 6 CALL_NO_KW 0 - 8 POP_TOP - 10 RETURN_VALUE - >> 12 PUSH_EXC_INFO - 14 LOAD_FAST 1 (b) - 16 CALL_NO_KW 0 - 18 POP_TOP - 20 RERAISE 0 - >> 22 COPY 3 - 24 POP_EXCEPT - 26 RERAISE 1 -ExceptionTable: - 2 to 2 -> 12 [0] - 12 to 20 -> 22 [1] lasti -""" % (_tryfinally.__code__.co_firstlineno + 1, - _tryfinally.__code__.co_firstlineno + 2, - _tryfinally.__code__.co_firstlineno + 4, - ) - -dis_tryfinallyconst = """\ -%3d 0 NOP +%3d 0 RESUME 0 %3d 2 NOP -%3d 4 LOAD_FAST 0 (b) - 6 CALL_NO_KW 0 - 8 POP_TOP - 10 LOAD_CONST 1 (1) +%3d 4 LOAD_FAST 0 (a) + +%3d 6 LOAD_FAST 1 (b) + 8 CALL_NO_KW 0 + 10 POP_TOP 12 RETURN_VALUE - 14 PUSH_EXC_INFO - 16 LOAD_FAST 0 (b) + >> 14 PUSH_EXC_INFO + 16 LOAD_FAST 1 (b) 18 CALL_NO_KW 0 20 POP_TOP 22 RERAISE 0 @@ -422,8 +435,38 @@ def _tryfinallyconst(b): 26 POP_EXCEPT 28 RERAISE 1 ExceptionTable: + 4 to 4 -> 14 [0] 14 to 22 -> 24 [1] lasti -""" % (_tryfinallyconst.__code__.co_firstlineno + 1, +""" % (_tryfinally.__code__.co_firstlineno, + _tryfinally.__code__.co_firstlineno + 1, + _tryfinally.__code__.co_firstlineno + 2, + _tryfinally.__code__.co_firstlineno + 4, + ) + +dis_tryfinallyconst = """\ +%3d 0 RESUME 0 + +%3d 2 NOP + +%3d 4 NOP + +%3d 6 LOAD_FAST 0 (b) + 8 CALL_NO_KW 0 + 10 POP_TOP + 12 LOAD_CONST 1 (1) + 14 RETURN_VALUE + 16 PUSH_EXC_INFO + 18 LOAD_FAST 0 (b) + 20 CALL_NO_KW 0 + 22 POP_TOP + 24 RERAISE 0 + >> 26 COPY 3 + 28 POP_EXCEPT + 30 RERAISE 1 +ExceptionTable: + 16 to 24 -> 26 [1] lasti +""" % (_tryfinallyconst.__code__.co_firstlineno, + _tryfinallyconst.__code__.co_firstlineno + 1, _tryfinallyconst.__code__.co_firstlineno + 2, _tryfinallyconst.__code__.co_firstlineno + 4, ) @@ -447,15 +490,18 @@ def foo(x): dis_nested_0 = """\ 0 MAKE_CELL 0 (y) -%3d 2 LOAD_CLOSURE 0 (y) - 4 BUILD_TUPLE 1 - 6 LOAD_CONST 1 () - 8 MAKE_FUNCTION 8 (closure) - 10 STORE_FAST 1 (foo) +%3d 2 RESUME 0 -%3d 12 LOAD_FAST 1 (foo) - 14 RETURN_VALUE -""" % (_h.__code__.co_firstlineno + 1, +%3d 4 LOAD_CLOSURE 0 (y) + 6 BUILD_TUPLE 1 + 8 LOAD_CONST 1 () + 10 MAKE_FUNCTION 8 (closure) + 12 STORE_FAST 1 (foo) + +%3d 14 LOAD_FAST 1 (foo) + 16 RETURN_VALUE +""" % (_h.__code__.co_firstlineno, + _h.__code__.co_firstlineno + 1, __file__, _h.__code__.co_firstlineno + 1, _h.__code__.co_firstlineno + 4, @@ -466,17 +512,20 @@ def foo(x): 0 COPY_FREE_VARS 1 2 MAKE_CELL 0 (x) -%3d 4 
LOAD_CLOSURE 0 (x) - 6 BUILD_TUPLE 1 - 8 LOAD_CONST 1 ( at 0x..., file "%s", line %d>) - 10 MAKE_FUNCTION 8 (closure) - 12 LOAD_DEREF 1 (y) - 14 GET_ITER - 16 CALL_NO_KW 1 - 18 RETURN_VALUE +%3d 4 RESUME 0 + +%3d 6 LOAD_CLOSURE 0 (x) + 8 BUILD_TUPLE 1 + 10 LOAD_CONST 1 ( at 0x..., file "%s", line %d>) + 12 MAKE_FUNCTION 8 (closure) + 14 LOAD_DEREF 1 (y) + 16 GET_ITER + 18 CALL_NO_KW 1 + 20 RETURN_VALUE """ % (dis_nested_0, __file__, _h.__code__.co_firstlineno + 1, + _h.__code__.co_firstlineno + 1, _h.__code__.co_firstlineno + 3, __file__, _h.__code__.co_firstlineno + 3, @@ -486,16 +535,17 @@ def foo(x): Disassembly of at 0x..., file "%s", line %d>: 0 COPY_FREE_VARS 1 -%3d 2 BUILD_LIST 0 - 4 LOAD_FAST 0 (.0) - >> 6 FOR_ITER 6 (to 20) - 8 STORE_FAST 1 (z) - 10 LOAD_DEREF 2 (x) - 12 LOAD_FAST 1 (z) - 14 BINARY_OP 0 (+) - 16 LIST_APPEND 2 - 18 JUMP_ABSOLUTE 3 (to 6) - >> 20 RETURN_VALUE +%3d 2 RESUME 0 + 4 BUILD_LIST 0 + 6 LOAD_FAST 0 (.0) + >> 8 FOR_ITER 6 (to 22) + 10 STORE_FAST 1 (z) + 12 LOAD_DEREF 2 (x) + 14 LOAD_FAST 1 (z) + 16 BINARY_OP 0 (+) + 18 LIST_APPEND 2 + 20 JUMP_ABSOLUTE 4 (to 8) + >> 22 RETURN_VALUE """ % (dis_nested_1, __file__, _h.__code__.co_firstlineno + 3, @@ -524,6 +574,7 @@ def strip_addresses(self, text): return re.sub(r'\b0x[0-9A-Fa-f]+\b', '0x...', text) def do_disassembly_test(self, func, expected): + self.maxDiff = None got = self.get_disassembly(func, depth=0) if got != expected: got = self.strip_addresses(got) @@ -599,6 +650,7 @@ def func(count): self.do_disassembly_test(dis_module, dis_module_expected_results) def test_big_offsets(self): + self.maxDiff = None def func(count): namespace = {} func = "def foo(x):\n " + ";".join(["x = x + 1"] * count) + "\n return x" @@ -607,23 +659,27 @@ def func(count): def expected(count, w): s = ['''\ + 1 %*d RESUME 0 + +''' % (w, 0)] + s += ['''\ %*d LOAD_FAST 0 (x) %*d LOAD_CONST 1 (1) %*d BINARY_OP 0 (+) %*d STORE_FAST 0 (x) -''' % (w, 8*i, w, 8*i + 2, w, 8*i + 4, w, 8*i + 6) +''' % (w, 8*i + 2, w, 8*i + 4, w, 8*i + 6, w, 8*i + 8) for i in range(count)] s += ['''\ 3 %*d LOAD_FAST 0 (x) %*d RETURN_VALUE -''' % (w, 8*count, w, 8*count + 2)] - s[0] = ' 2' + s[0][3:] +''' % (w, 8*count + 2, w, 8*count + 4)] + s[1] = ' 2' + s[1][3:] return ''.join(s) for i in range(1, 5): self.do_disassembly_test(func(i), expected(i, 4)) - self.do_disassembly_test(func(1249), expected(1249, 4)) + self.do_disassembly_test(func(1248), expected(1248, 4)) self.do_disassembly_test(func(1250), expected(1250, 5)) def test_disassemble_str(self): @@ -683,6 +739,7 @@ def test_dis_none(self): self.assertRaises(RuntimeError, dis.dis, None) def test_dis_traceback(self): + self.maxDiff = None try: del sys.last_traceback except AttributeError: @@ -993,191 +1050,197 @@ def _prepare_test_cases(): expected_opinfo_outer = [ Instruction(opname='MAKE_CELL', opcode=135, arg=0, argval='a', argrepr='a', offset=0, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='MAKE_CELL', opcode=135, arg=1, argval='b', argrepr='b', offset=2, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=7, argval=(3, 4), argrepr='(3, 4)', offset=4, starts_line=2, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CLOSURE', opcode=136, arg=0, argval='a', argrepr='a', offset=6, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CLOSURE', opcode=136, arg=1, argval='b', argrepr='b', offset=8, starts_line=None, is_jump_target=False, positions=None), - 
Instruction(opname='BUILD_TUPLE', opcode=102, arg=2, argval=2, argrepr='', offset=10, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=code_object_f, argrepr=repr(code_object_f), offset=12, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='MAKE_FUNCTION', opcode=132, arg=9, argval=9, argrepr='defaults, closure', offset=14, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=2, argval='f', argrepr='f', offset=16, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='print', argrepr='print', offset=18, starts_line=7, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=0, argval='a', argrepr='a', offset=20, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=1, argval='b', argrepr='b', offset=22, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval='', argrepr="''", offset=24, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=26, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='BUILD_LIST', opcode=103, arg=0, argval=0, argrepr='', offset=28, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='BUILD_MAP', opcode=105, arg=0, argval=0, argrepr='', offset=30, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=6, argval='Hello world!', argrepr="'Hello world!'", offset=32, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=7, argval=7, argrepr='', offset=34, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=36, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='f', argrepr='f', offset=38, starts_line=8, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=40, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RESUME', opcode=151, arg=0, argval=0, argrepr='', offset=4, starts_line=1, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=7, argval=(3, 4), argrepr='(3, 4)', offset=6, starts_line=2, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CLOSURE', opcode=136, arg=0, argval='a', argrepr='a', offset=8, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CLOSURE', opcode=136, arg=1, argval='b', argrepr='b', offset=10, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BUILD_TUPLE', opcode=102, arg=2, argval=2, argrepr='', offset=12, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=code_object_f, argrepr=repr(code_object_f), offset=14, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='MAKE_FUNCTION', opcode=132, arg=9, argval=9, argrepr='defaults, closure', offset=16, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=2, argval='f', argrepr='f', 
offset=18, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='print', argrepr='print', offset=20, starts_line=7, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=0, argval='a', argrepr='a', offset=22, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=1, argval='b', argrepr='b', offset=24, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval='', argrepr="''", offset=26, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=28, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BUILD_LIST', opcode=103, arg=0, argval=0, argrepr='', offset=30, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BUILD_MAP', opcode=105, arg=0, argval=0, argrepr='', offset=32, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=6, argval='Hello world!', argrepr="'Hello world!'", offset=34, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=7, argval=7, argrepr='', offset=36, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=38, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='f', argrepr='f', offset=40, starts_line=8, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=42, starts_line=None, is_jump_target=False, positions=None), ] expected_opinfo_f = [ Instruction(opname='COPY_FREE_VARS', opcode=149, arg=2, argval=2, argrepr='', offset=0, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='MAKE_CELL', opcode=135, arg=0, argval='c', argrepr='c', offset=2, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='MAKE_CELL', opcode=135, arg=1, argval='d', argrepr='d', offset=4, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval=(5, 6), argrepr='(5, 6)', offset=6, starts_line=3, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CLOSURE', opcode=136, arg=3, argval='a', argrepr='a', offset=8, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CLOSURE', opcode=136, arg=4, argval='b', argrepr='b', offset=10, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CLOSURE', opcode=136, arg=0, argval='c', argrepr='c', offset=12, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CLOSURE', opcode=136, arg=1, argval='d', argrepr='d', offset=14, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='BUILD_TUPLE', opcode=102, arg=4, argval=4, argrepr='', offset=16, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=code_object_inner, argrepr=repr(code_object_inner), offset=18, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='MAKE_FUNCTION', opcode=132, arg=9, argval=9, argrepr='defaults, closure', offset=20, starts_line=None, is_jump_target=False, positions=None), - 
Instruction(opname='STORE_FAST', opcode=125, arg=2, argval='inner', argrepr='inner', offset=22, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='print', argrepr='print', offset=24, starts_line=5, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=3, argval='a', argrepr='a', offset=26, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=4, argval='b', argrepr='b', offset=28, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=0, argval='c', argrepr='c', offset=30, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=1, argval='d', argrepr='d', offset=32, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=4, argval=4, argrepr='', offset=34, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=36, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='inner', argrepr='inner', offset=38, starts_line=6, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=40, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RESUME', opcode=151, arg=0, argval=0, argrepr='', offset=6, starts_line=2, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval=(5, 6), argrepr='(5, 6)', offset=8, starts_line=3, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CLOSURE', opcode=136, arg=3, argval='a', argrepr='a', offset=10, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CLOSURE', opcode=136, arg=4, argval='b', argrepr='b', offset=12, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CLOSURE', opcode=136, arg=0, argval='c', argrepr='c', offset=14, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CLOSURE', opcode=136, arg=1, argval='d', argrepr='d', offset=16, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BUILD_TUPLE', opcode=102, arg=4, argval=4, argrepr='', offset=18, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=code_object_inner, argrepr=repr(code_object_inner), offset=20, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='MAKE_FUNCTION', opcode=132, arg=9, argval=9, argrepr='defaults, closure', offset=22, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=2, argval='inner', argrepr='inner', offset=24, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='print', argrepr='print', offset=26, starts_line=5, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=3, argval='a', argrepr='a', offset=28, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=4, argval='b', argrepr='b', offset=30, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=0, argval='c', argrepr='c', offset=32, 
starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=1, argval='d', argrepr='d', offset=34, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=4, argval=4, argrepr='', offset=36, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=38, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='inner', argrepr='inner', offset=40, starts_line=6, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=42, starts_line=None, is_jump_target=False, positions=None), ] expected_opinfo_inner = [ Instruction(opname='COPY_FREE_VARS', opcode=149, arg=4, argval=4, argrepr='', offset=0, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='print', argrepr='print', offset=2, starts_line=4, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=2, argval='a', argrepr='a', offset=4, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=3, argval='b', argrepr='b', offset=6, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=4, argval='c', argrepr='c', offset=8, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_DEREF', opcode=137, arg=5, argval='d', argrepr='d', offset=10, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='e', argrepr='e', offset=12, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=1, argval='f', argrepr='f', offset=14, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=6, argval=6, argrepr='', offset=16, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=18, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=20, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=22, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RESUME', opcode=151, arg=0, argval=0, argrepr='', offset=2, starts_line=3, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='print', argrepr='print', offset=4, starts_line=4, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=2, argval='a', argrepr='a', offset=6, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=3, argval='b', argrepr='b', offset=8, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=4, argval='c', argrepr='c', offset=10, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_DEREF', opcode=137, arg=5, argval='d', argrepr='d', offset=12, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='e', argrepr='e', offset=14, 
starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=1, argval='f', argrepr='f', offset=16, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=6, argval=6, argrepr='', offset=18, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=20, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=22, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=24, starts_line=None, is_jump_target=False, positions=None), ] expected_opinfo_jumpy = [ - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='range', argrepr='range', offset=0, starts_line=3, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=1, argval=10, argrepr='10', offset=2, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=4, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='GET_ITER', opcode=68, arg=None, argval=None, argrepr='', offset=6, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='FOR_ITER', opcode=93, arg=17, argval=44, argrepr='to 44', offset=8, starts_line=None, is_jump_target=True, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', argrepr='i', offset=10, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=12, starts_line=4, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=14, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=16, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=18, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=20, starts_line=5, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=2, argval=4, argrepr='4', offset=22, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=24, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=15, argval=30, argrepr='to 30', offset=26, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_ABSOLUTE', opcode=113, arg=4, argval=8, argrepr='to 8', offset=28, starts_line=6, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=30, starts_line=7, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=32, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=34, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=21, argval=42, argrepr='to 42', 
offset=36, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=38, starts_line=8, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=5, argval=52, argrepr='to 52', offset=40, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_ABSOLUTE', opcode=113, arg=4, argval=8, argrepr='to 8', offset=42, starts_line=7, is_jump_target=True, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=44, starts_line=10, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval='I can haz else clause?', argrepr="'I can haz else clause?'", offset=46, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=48, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=50, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=52, starts_line=11, is_jump_target=True, positions=None), - Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=48, argval=96, argrepr='to 96', offset=54, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=56, starts_line=12, is_jump_target=True, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=58, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=60, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=62, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=64, starts_line=13, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=66, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='BINARY_OP', opcode=122, arg=23, argval=23, argrepr='-=', offset=68, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', argrepr='i', offset=70, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=72, starts_line=14, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=74, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=76, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=41, argval=82, argrepr='to 82', offset=78, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_ABSOLUTE', opcode=113, arg=26, argval=52, argrepr='to 52', offset=80, starts_line=15, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=82, starts_line=16, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, 
arg=2, argval=4, argrepr='4', offset=84, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=86, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=46, argval=92, argrepr='to 92', offset=88, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=6, argval=104, argrepr='to 104', offset=90, starts_line=17, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=92, starts_line=11, is_jump_target=True, positions=None), - Instruction(opname='POP_JUMP_IF_TRUE', opcode=115, arg=28, argval=56, argrepr='to 56', offset=94, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=96, starts_line=19, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=6, argval='Who let lolcatz into this test suite?', argrepr="'Who let lolcatz into this test suite?'", offset=98, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=100, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=102, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=104, starts_line=20, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=106, starts_line=21, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=7, argval=0, argrepr='0', offset=108, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='BINARY_OP', opcode=122, arg=11, argval=11, argrepr='/', offset=110, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=112, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=14, argval=144, argrepr='to 144', offset=114, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=116, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=2, argval='ZeroDivisionError', argrepr='ZeroDivisionError', offset=118, starts_line=22, is_jump_target=False, positions=None), - Instruction(opname='JUMP_IF_NOT_EXC_MATCH', opcode=121, arg=68, argval=136, argrepr='to 136', offset=120, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=122, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=124, starts_line=23, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=8, argval='Here we go, here we go, here we go...', argrepr="'Here we go, here we go, here we go...'", offset=126, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=128, starts_line=None, is_jump_target=False, 
positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=130, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=132, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=34, argval=204, argrepr='to 204', offset=134, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=136, starts_line=22, is_jump_target=True, positions=None), - Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=138, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=140, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=142, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=144, starts_line=25, is_jump_target=True, positions=None), - Instruction(opname='BEFORE_WITH', opcode=53, arg=None, argval=None, argrepr='', offset=146, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=1, argval='dodgy', argrepr='dodgy', offset=148, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=150, starts_line=26, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=9, argval='Never reach this', argrepr="'Never reach this'", offset=152, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=154, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=156, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=158, starts_line=25, is_jump_target=False, positions=None), - Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=160, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RESUME', opcode=151, arg=0, argval=0, argrepr='', offset=0, starts_line=1, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=0, argval='range', argrepr='range', offset=2, starts_line=3, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=1, argval=10, argrepr='10', offset=4, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=6, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='GET_ITER', opcode=68, arg=None, argval=None, argrepr='', offset=8, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='FOR_ITER', opcode=93, arg=17, argval=46, argrepr='to 46', offset=10, starts_line=None, is_jump_target=True, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', argrepr='i', offset=12, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=14, starts_line=4, 
is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=16, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=18, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=20, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=22, starts_line=5, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=2, argval=4, argrepr='4', offset=24, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=26, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=16, argval=32, argrepr='to 32', offset=28, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_ABSOLUTE', opcode=113, arg=5, argval=10, argrepr='to 10', offset=30, starts_line=6, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=32, starts_line=7, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=34, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=36, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=22, argval=44, argrepr='to 44', offset=38, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=40, starts_line=8, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=5, argval=54, argrepr='to 54', offset=42, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_ABSOLUTE', opcode=113, arg=5, argval=10, argrepr='to 10', offset=44, starts_line=7, is_jump_target=True, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=46, starts_line=10, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval='I can haz else clause?', argrepr="'I can haz else clause?'", offset=48, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=50, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=52, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=54, starts_line=11, is_jump_target=True, positions=None), + Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=49, argval=98, argrepr='to 98', offset=56, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=58, starts_line=12, is_jump_target=True, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=60, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, 
argrepr='', offset=62, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=64, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=66, starts_line=13, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=68, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BINARY_OP', opcode=122, arg=23, argval=23, argrepr='-=', offset=70, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', argrepr='i', offset=72, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=74, starts_line=14, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=76, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=78, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=42, argval=84, argrepr='to 84', offset=80, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_ABSOLUTE', opcode=113, arg=27, argval=54, argrepr='to 54', offset=82, starts_line=15, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=84, starts_line=16, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=2, argval=4, argrepr='4', offset=86, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=88, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_IF_FALSE', opcode=114, arg=47, argval=94, argrepr='to 94', offset=90, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=6, argval=106, argrepr='to 106', offset=92, starts_line=17, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=94, starts_line=11, is_jump_target=True, positions=None), + Instruction(opname='POP_JUMP_IF_TRUE', opcode=115, arg=29, argval=58, argrepr='to 58', offset=96, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=98, starts_line=19, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=6, argval='Who let lolcatz into this test suite?', argrepr="'Who let lolcatz into this test suite?'", offset=100, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=102, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=104, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=106, starts_line=20, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=108, starts_line=21, is_jump_target=False, positions=None), + 
Instruction(opname='LOAD_CONST', opcode=100, arg=7, argval=0, argrepr='0', offset=110, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BINARY_OP', opcode=122, arg=11, argval=11, argrepr='/', offset=112, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=114, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=14, argval=146, argrepr='to 146', offset=116, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=118, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=2, argval='ZeroDivisionError', argrepr='ZeroDivisionError', offset=120, starts_line=22, is_jump_target=False, positions=None), + Instruction(opname='JUMP_IF_NOT_EXC_MATCH', opcode=121, arg=69, argval=138, argrepr='to 138', offset=122, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=124, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=126, starts_line=23, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=8, argval='Here we go, here we go, here we go...', argrepr="'Here we go, here we go, here we go...'", offset=128, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=130, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=132, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=134, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=34, argval=206, argrepr='to 206', offset=136, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=138, starts_line=22, is_jump_target=True, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=140, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=142, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=144, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=146, starts_line=25, is_jump_target=True, positions=None), + Instruction(opname='BEFORE_WITH', opcode=53, arg=None, argval=None, argrepr='', offset=148, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=1, argval='dodgy', argrepr='dodgy', offset=150, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=152, starts_line=26, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=9, argval='Never reach this', argrepr="'Never reach this'", offset=154, starts_line=None, is_jump_target=False, 
positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=156, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=158, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=160, starts_line=25, is_jump_target=False, positions=None), Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=162, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=3, argval=3, argrepr='', offset=164, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=166, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=11, argval=192, argrepr='to 192', offset=168, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=170, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='WITH_EXCEPT_START', opcode=49, arg=None, argval=None, argrepr='', offset=172, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_IF_TRUE', opcode=115, arg=92, argval=184, argrepr='to 184', offset=174, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=2, argval=2, argrepr='', offset=176, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=178, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=180, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=182, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=184, starts_line=None, is_jump_target=True, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=186, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=188, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=164, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=3, argval=3, argrepr='', offset=166, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=168, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=11, argval=194, argrepr='to 194', offset=170, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=172, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='WITH_EXCEPT_START', opcode=49, arg=None, argval=None, argrepr='', offset=174, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_IF_TRUE', opcode=115, arg=93, argval=186, argrepr='to 186', offset=176, 
starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=2, argval=2, argrepr='', offset=178, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=180, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=182, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=184, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=186, starts_line=None, is_jump_target=True, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=188, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=190, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=192, starts_line=28, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=194, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=196, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=198, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=200, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=202, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=204, starts_line=23, is_jump_target=True, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=206, starts_line=28, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=208, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=210, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=212, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=214, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=216, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=218, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=220, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=222, starts_line=None, 
is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=224, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=226, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=228, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=230, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=232, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=234, starts_line=None, is_jump_target=False, positions=None)] + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=192, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=194, starts_line=28, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=196, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=198, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=200, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=202, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=204, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=206, starts_line=23, is_jump_target=True, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=208, starts_line=28, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=210, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=212, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=214, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=216, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=218, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=220, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=222, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=224, starts_line=None, is_jump_target=False, positions=None), 
+ Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=226, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=228, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=230, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=232, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=234, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=236, starts_line=None, is_jump_target=False, positions=None), +] # One last piece of inspect fodder to check the default line number handling def simple(): pass expected_opinfo_simple = [ - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=0, starts_line=simple.__code__.co_firstlineno, is_jump_target=False), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=2, starts_line=None, is_jump_target=False) + Instruction(opname='RESUME', opcode=151, arg=0, argval=0, argrepr='', offset=0, starts_line=simple.__code__.co_firstlineno, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=2, starts_line=None, is_jump_target=False), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=4, starts_line=None, is_jump_target=False) ] @@ -1230,6 +1293,7 @@ def test_co_positions(self): for instr in dis.get_instructions(code) ] expected = [ + (None, None, None, None), (1, 1, 0, 1), (2, 2, 2, 3), (2, 2, 5, 6), @@ -1249,6 +1313,8 @@ def test_co_positions_missing_info(self): for instruction in actual: with self.subTest(instruction=instruction): start_line, end_line, start_offset, end_offset = instruction.positions + if instruction.opname == "RESUME": + continue assert start_line == 1 assert end_line == 1 assert start_offset is None @@ -1259,6 +1325,8 @@ def test_co_positions_missing_info(self): for instruction in actual: with self.subTest(instruction=instruction): start_line, end_line, start_offset, end_offset = instruction.positions + if instruction.opname == "RESUME": + continue assert start_line == 1 assert end_line is None assert start_offset is not None @@ -1398,6 +1466,7 @@ def test_distb_empty(self): dis.distb() def test_distb_last_traceback(self): + self.maxDiff = None # We need to have an existing last traceback in `sys`: tb = get_tb() sys.last_traceback = tb @@ -1405,6 +1474,7 @@ def test_distb_last_traceback(self): self.assertEqual(self.get_disassembly(None), dis_traceback) def test_distb_explicit_arg(self): + self.maxDiff = None tb = get_tb() self.assertEqual(self.get_disassembly(tb), dis_traceback) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst new file mode 100644 index 0000000000000..967f6db1236b7 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst @@ -0,0 +1,3 @@ +Add RESUME opcode. This is a logical no-op. It is emitted by the compiler +anywhere a Python function can be entered. It is used by the interpreter to +perform tracing and optimizer checks. 
diff --git a/PC/launcher.c b/PC/launcher.c index d2e6462f13e31..de7abeb4e86ab 100644 --- a/PC/launcher.c +++ b/PC/launcher.c @@ -1268,7 +1268,9 @@ static PYC_MAGIC magic_values[] = { { 3400, 3419, L"3.8" }, { 3420, 3429, L"3.9" }, { 3430, 3449, L"3.10" }, - { 3450, 3469, L"3.11" }, + /* Allow 50 magic numbers per version from here on */ + { 3450, 3499, L"3.11" }, + { 3500, 3549, L"3.12" }, { 0 } }; diff --git a/Programs/test_frozenmain.h b/Programs/test_frozenmain.h index 45ca31e1b5e95..4052fc6c92632 100644 --- a/Programs/test_frozenmain.h +++ b/Programs/test_frozenmain.h @@ -1,35 +1,35 @@ // Auto-generated by Programs/freeze_test_frozenmain.py unsigned char M_test_frozenmain[] = { 227,0,0,0,0,0,0,0,0,0,0,0,0,7,0,0, - 0,0,0,0,0,115,86,0,0,0,100,0,100,1,108,0, - 90,0,100,0,100,1,108,1,90,1,101,2,100,2,169,1, - 1,0,101,2,100,3,101,0,106,3,169,2,1,0,101,1, - 106,4,169,0,100,4,25,0,90,5,100,5,68,0,93,14, - 90,6,101,2,100,6,101,6,155,0,100,7,101,5,101,6, - 25,0,155,0,157,4,169,1,1,0,113,26,100,1,83,0, - 41,8,233,0,0,0,0,78,122,18,70,114,111,122,101,110, - 32,72,101,108,108,111,32,87,111,114,108,100,122,8,115,121, - 115,46,97,114,103,118,218,6,99,111,110,102,105,103,41,5, - 90,12,112,114,111,103,114,97,109,95,110,97,109,101,218,10, - 101,120,101,99,117,116,97,98,108,101,90,15,117,115,101,95, - 101,110,118,105,114,111,110,109,101,110,116,90,17,99,111,110, - 102,105,103,117,114,101,95,99,95,115,116,100,105,111,90,14, - 98,117,102,102,101,114,101,100,95,115,116,100,105,111,122,7, - 99,111,110,102,105,103,32,122,2,58,32,41,7,218,3,115, - 121,115,90,17,95,116,101,115,116,105,110,116,101,114,110,97, - 108,99,97,112,105,218,5,112,114,105,110,116,218,4,97,114, - 103,118,90,11,103,101,116,95,99,111,110,102,105,103,115,114, - 2,0,0,0,218,3,107,101,121,169,0,243,0,0,0,0, - 250,18,116,101,115,116,95,102,114,111,122,101,110,109,97,105, - 110,46,112,121,218,8,60,109,111,100,117,108,101,62,114,11, - 0,0,0,1,0,0,0,115,16,0,0,0,8,3,8,1, - 8,2,12,1,12,1,8,1,26,7,4,249,115,18,0,0, - 0,8,3,8,1,8,2,12,1,12,1,2,7,4,1,2, - 249,30,7,115,86,0,0,0,1,11,1,11,1,11,1,11, - 1,25,1,25,1,25,1,25,1,6,7,27,1,28,1,28, - 1,6,7,17,19,22,19,27,1,28,1,28,10,27,10,39, - 10,41,42,50,10,51,1,7,12,2,1,42,1,42,5,8, - 5,10,11,41,21,24,11,41,11,41,28,34,35,38,28,39, - 11,41,11,41,5,42,5,42,5,42,1,42,1,42,114,9, - 0,0,0, + 0,0,0,0,0,115,88,0,0,0,151,0,100,0,100,1, + 108,0,90,0,100,0,100,1,108,1,90,1,101,2,100,2, + 169,1,1,0,101,2,100,3,101,0,106,3,169,2,1,0, + 101,1,106,4,169,0,100,4,25,0,90,5,100,5,68,0, + 93,14,90,6,101,2,100,6,101,6,155,0,100,7,101,5, + 101,6,25,0,155,0,157,4,169,1,1,0,113,27,100,1, + 83,0,41,8,233,0,0,0,0,78,122,18,70,114,111,122, + 101,110,32,72,101,108,108,111,32,87,111,114,108,100,122,8, + 115,121,115,46,97,114,103,118,218,6,99,111,110,102,105,103, + 41,5,90,12,112,114,111,103,114,97,109,95,110,97,109,101, + 218,10,101,120,101,99,117,116,97,98,108,101,90,15,117,115, + 101,95,101,110,118,105,114,111,110,109,101,110,116,90,17,99, + 111,110,102,105,103,117,114,101,95,99,95,115,116,100,105,111, + 90,14,98,117,102,102,101,114,101,100,95,115,116,100,105,111, + 122,7,99,111,110,102,105,103,32,122,2,58,32,41,7,218, + 3,115,121,115,90,17,95,116,101,115,116,105,110,116,101,114, + 110,97,108,99,97,112,105,218,5,112,114,105,110,116,218,4, + 97,114,103,118,90,11,103,101,116,95,99,111,110,102,105,103, + 115,114,2,0,0,0,218,3,107,101,121,169,0,243,0,0, + 0,0,250,18,116,101,115,116,95,102,114,111,122,101,110,109, + 97,105,110,46,112,121,218,8,60,109,111,100,117,108,101,62, + 114,11,0,0,0,1,0,0,0,115,18,0,0,0,2,128, + 
8,3,8,1,8,2,12,1,12,1,8,1,26,7,4,249, + 115,20,0,0,0,2,128,8,3,8,1,8,2,12,1,12, + 1,2,7,4,1,2,249,30,7,115,88,0,0,0,0,0, + 1,11,1,11,1,11,1,11,1,25,1,25,1,25,1,25, + 1,6,7,27,1,28,1,28,1,6,7,17,19,22,19,27, + 1,28,1,28,10,27,10,39,10,41,42,50,10,51,1,7, + 12,2,1,42,1,42,5,8,5,10,11,41,21,24,11,41, + 11,41,28,34,35,38,28,39,11,41,11,41,5,42,5,42, + 5,42,1,42,1,42,114,9,0,0,0, }; diff --git a/Python/ceval.c b/Python/ceval.c index 86d834cd3a67f..be26ffd822c13 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1546,6 +1546,17 @@ eval_frame_handle_pending(PyThreadState *tstate) #define TRACE_FUNCTION_ENTRY() \ if (cframe.use_tracing) { \ + _PyFrame_SetStackPointer(frame, stack_pointer); \ + int err = trace_function_entry(tstate, frame); \ + stack_pointer = _PyFrame_GetStackPointer(frame); \ + if (err) { \ + goto error; \ + } \ + } + +#define TRACE_FUNCTION_THROW_ENTRY() \ + if (cframe.use_tracing) { \ + assert(frame->stacktop >= 0); \ if (trace_function_entry(tstate, frame)) { \ goto exit_unwind; \ } \ @@ -1694,7 +1705,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr tstate->recursion_remaining--; goto exit_unwind; } - TRACE_FUNCTION_ENTRY(); + TRACE_FUNCTION_THROW_ENTRY(); DTRACE_FUNCTION_ENTRY(); goto resume_with_error; } @@ -1734,17 +1745,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr goto exit_unwind; } - assert(tstate->cframe == &cframe); - assert(frame == cframe.current_frame); - - TRACE_FUNCTION_ENTRY(); - DTRACE_FUNCTION_ENTRY(); - - if (_Py_IncrementCountAndMaybeQuicken(frame->f_code) < 0) { - goto exit_unwind; - } - frame->f_state = FRAME_EXECUTING; - resume_frame: SET_LOCALS_FROM_FRAME(); @@ -1825,6 +1825,24 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr DISPATCH(); } + TARGET(RESUME) { + assert(tstate->cframe == &cframe); + assert(frame == cframe.current_frame); + + int err = _Py_IncrementCountAndMaybeQuicken(frame->f_code); + if (err) { + if (err < 0) { + goto error; + } + /* Update first_instr and next_instr to point to newly quickened code */ + int nexti = INSTR_OFFSET(); + first_instr = frame->f_code->co_firstinstr; + next_instr = first_instr + nexti; + } + frame->f_state = FRAME_EXECUTING; + DISPATCH(); + } + TARGET(LOAD_CLOSURE) { /* We keep LOAD_CLOSURE so that the bytecode stays more readable. 
*/ PyObject *value = GETLOCAL(oparg); @@ -3134,7 +3152,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr PyObject *initial = GETLOCAL(oparg); PyObject *cell = PyCell_New(initial); if (cell == NULL) { - goto error; + goto resume_with_error; } SETLOCAL(oparg, cell); DISPATCH(); @@ -5209,33 +5227,40 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr int instr_prev = skip_backwards_over_extended_args(frame->f_code, frame->f_lasti); frame->f_lasti = INSTR_OFFSET(); TRACING_NEXTOPARG(); - if (PyDTrace_LINE_ENABLED()) { - maybe_dtrace_line(frame, &tstate->trace_info, instr_prev); + if (opcode == RESUME) { + /* Call tracing */ + TRACE_FUNCTION_ENTRY(); + DTRACE_FUNCTION_ENTRY(); } - /* line-by-line tracing support */ - - if (cframe.use_tracing && - tstate->c_tracefunc != NULL && !tstate->tracing) { - int err; - /* see maybe_call_line_trace() - for expository comments */ - _PyFrame_SetStackPointer(frame, stack_pointer); - - err = maybe_call_line_trace(tstate->c_tracefunc, - tstate->c_traceobj, - tstate, frame, instr_prev); - if (err) { - /* trace function raised an exception */ - next_instr++; - goto error; + else { + /* line-by-line tracing support */ + if (PyDTrace_LINE_ENABLED()) { + maybe_dtrace_line(frame, &tstate->trace_info, instr_prev); } - /* Reload possibly changed frame fields */ - JUMPTO(frame->f_lasti); - stack_pointer = _PyFrame_GetStackPointer(frame); - frame->stacktop = -1; - TRACING_NEXTOPARG(); + if (cframe.use_tracing && + tstate->c_tracefunc != NULL && !tstate->tracing) { + int err; + /* see maybe_call_line_trace() + for expository comments */ + _PyFrame_SetStackPointer(frame, stack_pointer); + + err = maybe_call_line_trace(tstate->c_tracefunc, + tstate->c_traceobj, + tstate, frame, instr_prev); + if (err) { + /* trace function raised an exception */ + next_instr++; + goto error; + } + /* Reload possibly changed frame fields */ + JUMPTO(frame->f_lasti); + + stack_pointer = _PyFrame_GetStackPointer(frame); + frame->stacktop = -1; + } } + TRACING_NEXTOPARG(); PRE_DISPATCH_GOTO(); DISPATCH_GOTO(); } @@ -6046,6 +6071,7 @@ _PyEval_Vector(PyThreadState *tstate, PyFunctionObject *func, return NULL; } PyObject *retval = _PyEval_EvalFrame(tstate, frame, 0); + assert(frame->stacktop >= 0); assert(_PyFrame_GetStackPointer(frame) == _PyFrame_Stackbase(frame)); _PyEvalFrameClearAndPop(tstate, frame); return retval; @@ -6492,13 +6518,9 @@ call_trace(Py_tracefunc func, PyObject *obj, if (f == NULL) { return -1; } - if (frame->f_lasti < 0) { - f->f_lineno = frame->f_code->co_firstlineno; - } - else { - initialize_trace_info(&tstate->trace_info, frame); - f->f_lineno = _PyCode_CheckLineNumber(frame->f_lasti*sizeof(_Py_CODEUNIT), &tstate->trace_info.bounds); - } + assert (frame->f_lasti >= 0); + initialize_trace_info(&tstate->trace_info, frame); + f->f_lineno = _PyCode_CheckLineNumber(frame->f_lasti*sizeof(_Py_CODEUNIT), &tstate->trace_info.bounds); result = func(obj, f, what, arg); f->f_lineno = 0; _PyThreadState_ResumeTracing(tstate); @@ -6534,7 +6556,14 @@ maybe_call_line_trace(Py_tracefunc func, PyObject *obj, then call the trace function if we're tracing source lines. 
*/ initialize_trace_info(&tstate->trace_info, frame); - int lastline = _PyCode_CheckLineNumber(instr_prev*sizeof(_Py_CODEUNIT), &tstate->trace_info.bounds); + _Py_CODEUNIT prev = ((_Py_CODEUNIT *)PyBytes_AS_STRING(frame->f_code->co_code))[instr_prev]; + int lastline; + if (_Py_OPCODE(prev) == RESUME && _Py_OPARG(prev) == 0) { + lastline = -1; + } + else { + lastline = _PyCode_CheckLineNumber(instr_prev*sizeof(_Py_CODEUNIT), &tstate->trace_info.bounds); + } int line = _PyCode_CheckLineNumber(frame->f_lasti*sizeof(_Py_CODEUNIT), &tstate->trace_info.bounds); PyFrameObject *f = _PyFrame_GetFrameObject(frame); if (f == NULL) { diff --git a/Python/compile.c b/Python/compile.c index 625a07bd39675..62f37ca452632 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -689,9 +689,9 @@ compiler_enter_scope(struct compiler *c, identifier name, u->u_blocks = NULL; u->u_nfblocks = 0; u->u_firstlineno = lineno; - u->u_lineno = 0; + u->u_lineno = lineno; u->u_col_offset = 0; - u->u_end_lineno = 0; + u->u_end_lineno = lineno; u->u_end_col_offset = 0; u->u_consts = PyDict_New(); if (!u->u_consts) { @@ -995,6 +995,7 @@ stack_effect(int opcode, int oparg, int jump) switch (opcode) { case NOP: case EXTENDED_ARG: + case RESUME: return 0; /* Stack manipulation */ @@ -1664,8 +1665,8 @@ compiler_addop_j_noline(struct compiler *c, int opcode, basicblock *b) the ASDL name to synthesize the name of the C type and the visit function. */ -#define ADD_YIELD_FROM(C) \ - RETURN_IF_FALSE(compiler_add_yield_from((C))) +#define ADD_YIELD_FROM(C, await) \ + RETURN_IF_FALSE(compiler_add_yield_from((C), (await))) #define POP_EXCEPT_AND_RERAISE(C) \ RETURN_IF_FALSE(compiler_pop_except_and_reraise((C))) @@ -1823,18 +1824,19 @@ compiler_call_exit_with_nones(struct compiler *c) { } static int -compiler_add_yield_from(struct compiler *c) +compiler_add_yield_from(struct compiler *c, int await) { - basicblock *start, *jump, *exit; + basicblock *start, *resume, *exit; start = compiler_new_block(c); - jump = compiler_new_block(c); + resume = compiler_new_block(c); exit = compiler_new_block(c); - if (start == NULL || jump == NULL || exit == NULL) { + if (start == NULL || resume == NULL || exit == NULL) { return 0; } compiler_use_next_block(c, start); ADDOP_JUMP(c, SEND, exit); - compiler_use_next_block(c, jump); + compiler_use_next_block(c, resume); + ADDOP_I(c, RESUME, await ? 3 : 2); ADDOP_JUMP(c, JUMP_ABSOLUTE, start); compiler_use_next_block(c, exit); return 1; @@ -1928,7 +1930,7 @@ compiler_unwind_fblock(struct compiler *c, struct fblockinfo *info, if (info->fb_type == ASYNC_WITH) { ADDOP(c, GET_AWAITABLE); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 1); } ADDOP(c, POP_TOP); /* The exit block should appear to execute after the @@ -2047,9 +2049,11 @@ compiler_mod(struct compiler *c, mod_ty mod) if (module == NULL) { return 0; } - /* Use 0 for firstlineno initially, will fixup in assemble(). */ if (!compiler_enter_scope(c, module, COMPILER_SCOPE_MODULE, mod, 1)) return NULL; + c->u->u_lineno = -1; + ADDOP_I(c, RESUME, 0); + c->u->u_lineno = 1; switch (mod->kind) { case Module_kind: if (!compiler_body(c, mod->v.Module.body)) { @@ -2504,6 +2508,7 @@ compiler_function(struct compiler *c, stmt_ty s, int is_async) if (!compiler_enter_scope(c, name, scope_type, (void *)s, firstlineno)) { return 0; } + ADDOP_I(c, RESUME, 0); /* if not -OO mode, add docstring */ if (c->c_optimize < 2) { @@ -2573,8 +2578,10 @@ compiler_class(struct compiler *c, stmt_ty s) /* 1. 
compile the class body into a code object */ if (!compiler_enter_scope(c, s->v.ClassDef.name, - COMPILER_SCOPE_CLASS, (void *)s, firstlineno)) + COMPILER_SCOPE_CLASS, (void *)s, firstlineno)) { return 0; + } + ADDOP_I(c, RESUME, 0); /* this block represents what we do in the new scope */ { /* use the class name for name mangling */ @@ -2907,11 +2914,13 @@ compiler_lambda(struct compiler *c, expr_ty e) if (funcflags == -1) { return 0; } + ADDOP_I(c, RESUME, 0); if (!compiler_enter_scope(c, name, COMPILER_SCOPE_LAMBDA, (void *)e, e->lineno)) return 0; + ADDOP_I(c, RESUME, 0); /* Make None the first constant, so the lambda can't have a docstring. */ if (compiler_add_const(c, Py_None) < 0) @@ -3041,7 +3050,7 @@ compiler_async_for(struct compiler *c, stmt_ty s) ADDOP_JUMP(c, SETUP_FINALLY, except); ADDOP(c, GET_ANEXT); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 1); ADDOP(c, POP_BLOCK); /* for SETUP_FINALLY */ /* Success block for __anext__ */ @@ -5135,6 +5144,7 @@ compiler_sync_comprehension_generator(struct compiler *c, case COMP_GENEXP: VISIT(c, expr, elt); ADDOP(c, YIELD_VALUE); + ADDOP_I(c, RESUME, 1); ADDOP(c, POP_TOP); break; case COMP_LISTCOMP: @@ -5207,7 +5217,7 @@ compiler_async_comprehension_generator(struct compiler *c, ADDOP_JUMP(c, SETUP_FINALLY, except); ADDOP(c, GET_ANEXT); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 1); ADDOP(c, POP_BLOCK); VISIT(c, expr, gen->target); @@ -5233,6 +5243,7 @@ compiler_async_comprehension_generator(struct compiler *c, case COMP_GENEXP: VISIT(c, expr, elt); ADDOP(c, YIELD_VALUE); + ADDOP_I(c, RESUME, 1); ADDOP(c, POP_TOP); break; case COMP_LISTCOMP: @@ -5285,6 +5296,7 @@ compiler_comprehension(struct compiler *c, expr_ty e, int type, { goto error; } + ADDOP_I(c, RESUME, 0); SET_LOC(c, e); is_async_generator = c->u->u_ste->ste_coroutine; @@ -5357,7 +5369,7 @@ compiler_comprehension(struct compiler *c, expr_ty e, int type, if (is_async_generator && type != COMP_GENEXP) { ADDOP(c, GET_AWAITABLE); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 1); } return 1; @@ -5506,7 +5518,7 @@ compiler_async_with(struct compiler *c, stmt_ty s, int pos) ADDOP(c, BEFORE_ASYNC_WITH); ADDOP(c, GET_AWAITABLE); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 1); ADDOP_JUMP(c, SETUP_WITH, final); @@ -5543,7 +5555,7 @@ compiler_async_with(struct compiler *c, stmt_ty s, int pos) return 0; ADDOP(c, GET_AWAITABLE); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 1); ADDOP(c, POP_TOP); @@ -5557,7 +5569,7 @@ compiler_async_with(struct compiler *c, stmt_ty s, int pos) ADDOP(c, WITH_EXCEPT_START); ADDOP(c, GET_AWAITABLE); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 1); compiler_with_except_finish(c, cleanup); compiler_use_next_block(c, exit); @@ -5703,6 +5715,7 @@ compiler_visit_expr1(struct compiler *c, expr_ty e) ADDOP_LOAD_CONST(c, Py_None); } ADDOP(c, YIELD_VALUE); + ADDOP_I(c, RESUME, 1); break; case YieldFrom_kind: if (c->u->u_ste->ste_type != FunctionBlock) @@ -5714,7 +5727,7 @@ compiler_visit_expr1(struct compiler *c, expr_ty e) VISIT(c, expr, e->v.YieldFrom.value); ADDOP(c, GET_YIELD_FROM_ITER); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 0); break; case Await_kind: if (!IS_TOP_LEVEL_AWAIT(c)){ @@ -5731,7 +5744,7 @@ compiler_visit_expr1(struct compiler *c, expr_ty e) VISIT(c, expr, e->v.Await.value); ADDOP(c, GET_AWAITABLE); ADDOP_LOAD_CONST(c, Py_None); - ADD_YIELD_FROM(c); + ADD_YIELD_FROM(c, 
1); break; case Compare_kind: return compiler_compare(c, e); @@ -7987,6 +8000,7 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, if (flags < 0) { return -1; } + assert(c->u->u_firstlineno > 0); /* Set up cells for any variable that escapes, to be put in a closure. */ const int ncellvars = (int)PyDict_GET_SIZE(c->u->u_cellvars); @@ -8191,17 +8205,19 @@ assemble(struct compiler *c, int addNone) goto error; } - // This must be called before fix_cell_offsets(). - if (insert_prefix_instructions(c, entryblock, cellfixedoffsets, nfreevars)) { - goto error; - } - /* Set firstlineno if it wasn't explicitly set. */ if (!c->u->u_firstlineno) { - if (entryblock->b_instr && entryblock->b_instr->i_lineno) + if (entryblock->b_instr && entryblock->b_instr->i_lineno) { c->u->u_firstlineno = entryblock->b_instr->i_lineno; - else + } + else { c->u->u_firstlineno = 1; + } + } + + // This must be called before fix_cell_offsets(). + if (insert_prefix_instructions(c, entryblock, cellfixedoffsets, nfreevars)) { + goto error; } if (!assemble_init(&a, nblocks, c->u->u_firstlineno)) diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index 7ba45666ed061..c78425ff9bb64 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -150,7 +150,7 @@ static void *opcode_targets[256] = { &&TARGET_LOAD_CLASSDEREF, &&TARGET_COPY_FREE_VARS, &&_unknown_opcode, - &&_unknown_opcode, + &&TARGET_RESUME, &&TARGET_MATCH_CLASS, &&_unknown_opcode, &&_unknown_opcode, From webhook-mailer at python.org Thu Jan 6 08:21:44 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 13:21:44 -0000 Subject: [Python-checkins] bpo-46278: fix typo introduced in GH-30427 (GH-30430) Message-ID: https://github.com/python/cpython/commit/b50e5e916a05df65ab6a255af7624b751e0fe9d1 commit: b50e5e916a05df65ab6a255af7624b751e0fe9d1 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T05:21:24-08:00 summary: bpo-46278: fix typo introduced in GH-30427 (GH-30430) Automerge-Triggered-By: GH:asvetlov files: M Lib/asyncio/events.py diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py index 831c19cf0ec68..e3c55b22aace2 100644 --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -258,12 +258,12 @@ def _timer_handle_cancelled(self, handle): raise NotImplementedError def call_soon(self, callback, *args, context=None): - return self.call_later(0, callback, *args) + return self.call_later(0, callback, *args, context=context) def call_later(self, delay, callback, *args, context=None): raise NotImplementedError - def call_at(self, when, callback, *args, cotext=None): + def call_at(self, when, callback, *args, context=None): raise NotImplementedError def time(self): From webhook-mailer at python.org Thu Jan 6 08:51:48 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 13:51:48 -0000 Subject: [Python-checkins] bpo-46278: fix typo introduced in GH-30427 (GH-30430) Message-ID: https://github.com/python/cpython/commit/cb0683128b8f413e0f16293752014902d4de4984 commit: cb0683128b8f413e0f16293752014902d4de4984 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T05:51:44-08:00 summary: bpo-46278: fix typo introduced in GH-30427 (GH-30430) Automerge-Triggered-By: GH:asvetlov (cherry picked from commit 
b50e5e916a05df65ab6a255af7624b751e0fe9d1) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: M Lib/asyncio/events.py diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py index db7720abcfede..413ff2aaa6da6 100644 --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -259,12 +259,12 @@ def _timer_handle_cancelled(self, handle): raise NotImplementedError def call_soon(self, callback, *args, context=None): - return self.call_later(0, callback, *args) + return self.call_later(0, callback, *args, context=context) def call_later(self, delay, callback, *args, context=None): raise NotImplementedError - def call_at(self, when, callback, *args, cotext=None): + def call_at(self, when, callback, *args, context=None): raise NotImplementedError def time(self): From webhook-mailer at python.org Thu Jan 6 09:44:31 2022 From: webhook-mailer at python.org (asvetlov) Date: Thu, 06 Jan 2022 14:44:31 -0000 Subject: [Python-checkins] bpo-46278: fix typo introduced in GH-30427 (GH-30430) (GH-30431) Message-ID: https://github.com/python/cpython/commit/861a9aaf0f517623c58ca4eb5588804b2632fcba commit: 861a9aaf0f517623c58ca4eb5588804b2632fcba branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-06T16:44:22+02:00 summary: bpo-46278: fix typo introduced in GH-30427 (GH-30430) (GH-30431) Automerge-Triggered-By: GH:asvetlov (cherry picked from commit b50e5e916a05df65ab6a255af7624b751e0fe9d1) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: M Lib/asyncio/events.py diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py index 58236059f7e22..5ab1acc41bf31 100644 --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -259,12 +259,12 @@ def _timer_handle_cancelled(self, handle): raise NotImplementedError def call_soon(self, callback, *args, context=None): - return self.call_later(0, callback, *args) + return self.call_later(0, callback, *args, context=context) def call_later(self, delay, callback, *args, context=None): raise NotImplementedError - def call_at(self, when, callback, *args, cotext=None): + def call_at(self, when, callback, *args, context=None): raise NotImplementedError def time(self): From webhook-mailer at python.org Thu Jan 6 10:12:37 2022 From: webhook-mailer at python.org (vstinner) Date: Thu, 06 Jan 2022 15:12:37 -0000 Subject: [Python-checkins] [3.10] bpo-46006: Revert "bpo-40521: Per-interpreter interned strings (GH-20085)" (GH-30422) (GH-30425) Message-ID: https://github.com/python/cpython/commit/72c260cf0c71eb01eb13100b751e9d5007d00b70 commit: 72c260cf0c71eb01eb13100b751e9d5007d00b70 branch: 3.10 author: Victor Stinner committer: vstinner date: 2022-01-06T16:12:28+01:00 summary: [3.10] bpo-46006: Revert "bpo-40521: Per-interpreter interned strings (GH-20085)" (GH-30422) (GH-30425) This reverts commit ea251806b8dffff11b30d2182af1e589caf88acf. Keep "assert(interned == NULL);" in _PyUnicode_Fini(), but only for the main interpreter. Keep _PyUnicode_ClearInterned() changes avoiding the creation of a temporary Python list object. Leave the PyInterpreterState structure unchanged to keep the ABI backward compatibility with Python 3.10.0: rename the "interned" member to "unused_interned". 
(cherry picked from commit 35d6540c904ef07b8602ff014e520603f84b5886) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst M Include/internal/pycore_interp.h M Objects/typeobject.c M Objects/unicodeobject.c diff --git a/Include/internal/pycore_interp.h b/Include/internal/pycore_interp.h index bfd082b588256..4307b61ca36aa 100644 --- a/Include/internal/pycore_interp.h +++ b/Include/internal/pycore_interp.h @@ -71,15 +71,9 @@ struct _Py_unicode_state { PyObject *latin1[256]; struct _Py_unicode_fs_codec fs_codec; - /* This dictionary holds all interned unicode strings. Note that references - to strings in this dictionary are *not* counted in the string's ob_refcnt. - When the interned string reaches a refcnt of 0 the string deallocation - function will delete the reference from this dictionary. - - Another way to look at this is that to say that the actual reference - count of a string is: s->ob_refcnt + (s->state ? 2 : 0) - */ - PyObject *interned; + // Unused member kept for ABI backward compatibility with Python 3.10.0: + // see bpo-46006. + PyObject *unused_interned; // Unicode identifiers (_Py_Identifier): see _PyUnicode_FromId() struct _Py_unicode_ids ids; diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst new file mode 100644 index 0000000000000..3acd2b09390a8 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst @@ -0,0 +1,5 @@ +Fix a regression when a type method like ``__init__()`` is modified in a +subinterpreter. Fix a regression in ``_PyUnicode_EqualToASCIIId()`` and type +``update_slot()``. Revert the change which made the Unicode dictionary of +interned strings compatible with subinterpreters: the internal interned +dictionary is shared again by all interpreters. Patch by Victor Stinner. diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 02046e5f2ebef..b23e36a420fa0 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -50,6 +50,11 @@ typedef struct PySlot_Offset { } PySlot_Offset; +/* bpo-40521: Interned strings are shared by all subinterpreters */ +#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS +# define INTERN_NAME_STRINGS +#endif + /* alphabetical order */ _Py_IDENTIFIER(__abstractmethods__); _Py_IDENTIFIER(__annotations__); @@ -3988,6 +3993,7 @@ type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) if (name == NULL) return -1; } +#ifdef INTERN_NAME_STRINGS if (!PyUnicode_CHECK_INTERNED(name)) { PyUnicode_InternInPlace(&name); if (!PyUnicode_CHECK_INTERNED(name)) { @@ -3997,6 +4003,7 @@ type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) return -1; } } +#endif } else { /* Will fail in _PyObject_GenericSetAttrWithDict. */ @@ -8344,10 +8351,17 @@ _PyTypes_InitSlotDefs(void) for (slotdef *p = slotdefs; p->name; p++) { /* Slots must be ordered by their offset in the PyHeapTypeObject. 
*/ assert(!p[1].name || p->offset <= p[1].offset); +#ifdef INTERN_NAME_STRINGS p->name_strobj = PyUnicode_InternFromString(p->name); if (!p->name_strobj || !PyUnicode_CHECK_INTERNED(p->name_strobj)) { return _PyStatus_NO_MEMORY(); } +#else + p->name_strobj = PyUnicode_FromString(p->name); + if (!p->name_strobj) { + return _PyStatus_NO_MEMORY(); + } +#endif } slotdefs_initialized = 1; return _PyStatus_OK(); @@ -8372,16 +8386,24 @@ update_slot(PyTypeObject *type, PyObject *name) int offset; assert(PyUnicode_CheckExact(name)); +#ifdef INTERN_NAME_STRINGS assert(PyUnicode_CHECK_INTERNED(name)); +#endif assert(slotdefs_initialized); pp = ptrs; for (p = slotdefs; p->name; p++) { assert(PyUnicode_CheckExact(p->name_strobj)); assert(PyUnicode_CheckExact(name)); +#ifdef INTERN_NAME_STRINGS if (p->name_strobj == name) { *pp++ = p; } +#else + if (p->name_strobj == name || _PyUnicode_EQ(p->name_strobj, name)) { + *pp++ = p; + } +#endif } *pp = NULL; for (pp = ptrs; *pp; pp++) { diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index c72871074b3eb..077cf8d7f4560 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -211,6 +211,22 @@ extern "C" { # define OVERALLOCATE_FACTOR 4 #endif +/* bpo-40521: Interned strings are shared by all interpreters. */ +#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS +# define INTERNED_STRINGS +#endif + +/* This dictionary holds all interned unicode strings. Note that references + to strings in this dictionary are *not* counted in the string's ob_refcnt. + When the interned string reaches a refcnt of 0 the string deallocation + function will delete the reference from this dictionary. + + Another way to look at this is that to say that the actual reference + count of a string is: s->ob_refcnt + (s->state ? 2 : 0) +*/ +#ifdef INTERNED_STRINGS +static PyObject *interned = NULL; +#endif static struct _Py_unicode_state* get_unicode_state(void) @@ -1936,7 +1952,7 @@ unicode_dealloc(PyObject *unicode) case SSTATE_INTERNED_MORTAL: { - struct _Py_unicode_state *state = get_unicode_state(); +#ifdef INTERNED_STRINGS /* Revive the dead object temporarily. PyDict_DelItem() removes two references (key and value) which were ignored by PyUnicode_InternInPlace(). Use refcnt=3 rather than refcnt=2 @@ -1944,12 +1960,13 @@ unicode_dealloc(PyObject *unicode) PyDict_DelItem(). 
*/ assert(Py_REFCNT(unicode) == 0); Py_SET_REFCNT(unicode, 3); - if (PyDict_DelItem(state->interned, unicode) != 0) { + if (PyDict_DelItem(interned, unicode) != 0) { _PyErr_WriteUnraisableMsg("deletion of interned string failed", NULL); } assert(Py_REFCNT(unicode) == 1); Py_SET_REFCNT(unicode, 0); +#endif break; } @@ -11600,11 +11617,13 @@ _PyUnicode_EqualToASCIIId(PyObject *left, _Py_Identifier *right) if (PyUnicode_CHECK_INTERNED(left)) return 0; +#ifdef INTERNED_STRINGS assert(_PyUnicode_HASH(right_uni) != -1); Py_hash_t hash = _PyUnicode_HASH(left); if (hash != -1 && hash != _PyUnicode_HASH(right_uni)) { return 0; } +#endif return unicode_compare_eq(left, right_uni); } @@ -15833,21 +15852,21 @@ PyUnicode_InternInPlace(PyObject **p) return; } +#ifdef INTERNED_STRINGS if (PyUnicode_READY(s) == -1) { PyErr_Clear(); return; } - struct _Py_unicode_state *state = get_unicode_state(); - if (state->interned == NULL) { - state->interned = PyDict_New(); - if (state->interned == NULL) { + if (interned == NULL) { + interned = PyDict_New(); + if (interned == NULL) { PyErr_Clear(); /* Don't leave an exception */ return; } } - PyObject *t = PyDict_SetDefault(state->interned, s, s); + PyObject *t = PyDict_SetDefault(interned, s, s); if (t == NULL) { PyErr_Clear(); return; @@ -15864,9 +15883,13 @@ PyUnicode_InternInPlace(PyObject **p) this. */ Py_SET_REFCNT(s, Py_REFCNT(s) - 2); _PyUnicode_STATE(s).interned = SSTATE_INTERNED_MORTAL; +#else + // PyDict expects that interned strings have their hash + // (PyASCIIObject.hash) already computed. + (void)unicode_hash(s); +#endif } - void PyUnicode_InternImmortal(PyObject **p) { @@ -15900,11 +15923,15 @@ PyUnicode_InternFromString(const char *cp) void _PyUnicode_ClearInterned(PyInterpreterState *interp) { - struct _Py_unicode_state *state = &interp->unicode; - if (state->interned == NULL) { + if (!_Py_IsMainInterpreter(interp)) { + // interned dict is shared by all interpreters + return; + } + + if (interned == NULL) { return; } - assert(PyDict_CheckExact(state->interned)); + assert(PyDict_CheckExact(interned)); /* Interned unicode strings are not forcibly deallocated; rather, we give them their stolen references back, and then clear and DECREF the @@ -15912,13 +15939,13 @@ _PyUnicode_ClearInterned(PyInterpreterState *interp) #ifdef INTERNED_STATS fprintf(stderr, "releasing %zd interned strings\n", - PyDict_GET_SIZE(state->interned)); + PyDict_GET_SIZE(interned)); Py_ssize_t immortal_size = 0, mortal_size = 0; #endif Py_ssize_t pos = 0; PyObject *s, *ignored_value; - while (PyDict_Next(state->interned, &pos, &s, &ignored_value)) { + while (PyDict_Next(interned, &pos, &s, &ignored_value)) { assert(PyUnicode_IS_READY(s)); switch (PyUnicode_CHECK_INTERNED(s)) { @@ -15949,8 +15976,8 @@ _PyUnicode_ClearInterned(PyInterpreterState *interp) mortal_size, immortal_size); #endif - PyDict_Clear(state->interned); - Py_CLEAR(state->interned); + PyDict_Clear(interned); + Py_CLEAR(interned); } @@ -16322,8 +16349,10 @@ _PyUnicode_Fini(PyInterpreterState *interp) { struct _Py_unicode_state *state = &interp->unicode; - // _PyUnicode_ClearInterned() must be called before - assert(state->interned == NULL); + if (_Py_IsMainInterpreter(interp)) { + // _PyUnicode_ClearInterned() must be called before _PyUnicode_Fini() + assert(interned == NULL); + } _PyUnicode_FiniEncodings(&state->fs_codec); From webhook-mailer at python.org Thu Jan 6 10:14:57 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 15:14:57 -0000 Subject: [Python-checkins] bpo-46263: FreeBSD 
14.0 jemalloc workaround for junk bytes of freed memory (GH-30434) Message-ID: https://github.com/python/cpython/commit/a4aa52dc2801d25b6343fe2ef8de7f40ea3bc883 commit: a4aa52dc2801d25b6343fe2ef8de7f40ea3bc883 branch: main author: Christian Heimes committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T07:14:49-08:00 summary: bpo-46263: FreeBSD 14.0 jemalloc workaround for junk bytes of freed memory (GH-30434) Automerge-Triggered-By: GH:tiran files: A Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst M Lib/test/test_capi.py diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index ecf3aa34ede7c..e246c36de01cd 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -857,8 +857,13 @@ class PyMemDebugTests(unittest.TestCase): def check(self, code): with support.SuppressCrashReport(): - out = assert_python_failure('-c', code, - PYTHONMALLOC=self.PYTHONMALLOC) + out = assert_python_failure( + '-c', code, + PYTHONMALLOC=self.PYTHONMALLOC, + # FreeBSD: instruct jemalloc to not fill freed() memory + # with junk byte 0x5a, see JEMALLOC(3) + MALLOC_CONF="junk:false", + ) stderr = out.err return stderr.decode('ascii', 'replace') diff --git a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst new file mode 100644 index 0000000000000..0334af4e3cbe8 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst @@ -0,0 +1,2 @@ +Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory +with junk byte. From webhook-mailer at python.org Thu Jan 6 10:37:06 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 15:37:06 -0000 Subject: [Python-checkins] bpo-46263: FreeBSD 14.0 jemalloc workaround for junk bytes of freed memory (GH-30434) Message-ID: https://github.com/python/cpython/commit/b951dec4418326c06d493020dadedf85a3b29a82 commit: b951dec4418326c06d493020dadedf85a3b29a82 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T07:36:47-08:00 summary: bpo-46263: FreeBSD 14.0 jemalloc workaround for junk bytes of freed memory (GH-30434) Automerge-Triggered-By: GH:tiran (cherry picked from commit a4aa52dc2801d25b6343fe2ef8de7f40ea3bc883) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst M Lib/test/test_capi.py diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index b5cb3ad0bc07a..6453f760a846e 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -850,8 +850,13 @@ class PyMemDebugTests(unittest.TestCase): def check(self, code): with support.SuppressCrashReport(): - out = assert_python_failure('-c', code, - PYTHONMALLOC=self.PYTHONMALLOC) + out = assert_python_failure( + '-c', code, + PYTHONMALLOC=self.PYTHONMALLOC, + # FreeBSD: instruct jemalloc to not fill freed() memory + # with junk byte 0x5a, see JEMALLOC(3) + MALLOC_CONF="junk:false", + ) stderr = out.err return stderr.decode('ascii', 'replace') diff --git a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst new file mode 100644 index 0000000000000..0334af4e3cbe8 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst @@ -0,0 +1,2 @@ +Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory +with junk 
byte. From webhook-mailer at python.org Thu Jan 6 10:52:25 2022 From: webhook-mailer at python.org (tiran) Date: Thu, 06 Jan 2022 15:52:25 -0000 Subject: [Python-checkins] [3.9] bpo-46263: FreeBSD 14.0 jemalloc workaround for junk bytes of freed memory (GH-30434) (GH-30437) Message-ID: https://github.com/python/cpython/commit/b259015c1079744dbe3d58bd4c27757a6df9d1c6 commit: b259015c1079744dbe3d58bd4c27757a6df9d1c6 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: tiran date: 2022-01-06T16:52:21+01:00 summary: [3.9] bpo-46263: FreeBSD 14.0 jemalloc workaround for junk bytes of freed memory (GH-30434) (GH-30437) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst M Lib/test/test_capi.py diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 87f327414b54e..72c648f54d8b7 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -763,8 +763,13 @@ class PyMemDebugTests(unittest.TestCase): def check(self, code): with support.SuppressCrashReport(): - out = assert_python_failure('-c', code, - PYTHONMALLOC=self.PYTHONMALLOC) + out = assert_python_failure( + '-c', code, + PYTHONMALLOC=self.PYTHONMALLOC, + # FreeBSD: instruct jemalloc to not fill freed() memory + # with junk byte 0x5a, see JEMALLOC(3) + MALLOC_CONF="junk:false", + ) stderr = out.err return stderr.decode('ascii', 'replace') diff --git a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst new file mode 100644 index 0000000000000..0334af4e3cbe8 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst @@ -0,0 +1,2 @@ +Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory +with junk byte. From webhook-mailer at python.org Thu Jan 6 13:56:17 2022 From: webhook-mailer at python.org (ericsnowcurrently) Date: Thu, 06 Jan 2022 18:56:17 -0000 Subject: [Python-checkins] bpo-46263: Do not ever expect "use_frozen_modules" to be -1. (gh-30438) Message-ID: https://github.com/python/cpython/commit/68c76d9766cccb5fd992b0ac4b39645d9665dbe2 commit: 68c76d9766cccb5fd992b0ac4b39645d9665dbe2 branch: main author: Eric Snow committer: ericsnowcurrently date: 2022-01-06T11:56:13-07:00 summary: bpo-46263: Do not ever expect "use_frozen_modules" to be -1. (gh-30438) The condition is no longer valid. This should resolve the buildbot failure on FreeBSD. 
https://bugs.python.org/issue46263 files: A Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst M Lib/test/test_embed.py diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index 879952413374ec..dd43669ba96741 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -1247,8 +1247,6 @@ def test_init_setpythonhome(self): 'stdlib_dir': stdlib, } self.default_program_name(config) - if not config['executable']: - config['use_frozen_modules'] = -1 env = {'TESTHOME': home, 'PYTHONPATH': paths_str} self.check_all_configs("test_init_setpythonhome", config, api=API_COMPAT, env=env) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst new file mode 100644 index 00000000000000..fdcfe50a84aa13 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst @@ -0,0 +1,2 @@ +We always expect the "use_frozen_modules" config to be set, now that +getpath.c was rewritten in pure Python and the logic improved. From webhook-mailer at python.org Thu Jan 6 14:05:45 2022 From: webhook-mailer at python.org (iritkatriel) Date: Thu, 06 Jan 2022 19:05:45 -0000 Subject: [Python-checkins] bpo-45292: [PEP-654] exception groups and except* documentation (GH-30158) Message-ID: https://github.com/python/cpython/commit/9925e70e4811841556747a77acd89c1a70bf344a commit: 9925e70e4811841556747a77acd89c1a70bf344a branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-06T19:05:34Z summary: bpo-45292: [PEP-654] exception groups and except* documentation (GH-30158) files: M Doc/library/exceptions.rst M Doc/reference/compound_stmts.rst M Doc/tutorial/errors.rst diff --git a/Doc/library/exceptions.rst b/Doc/library/exceptions.rst index 12d7d8abb2650..f90b6761154af 100644 --- a/Doc/library/exceptions.rst +++ b/Doc/library/exceptions.rst @@ -851,6 +851,78 @@ The following exceptions are used as warning categories; see the .. versionadded:: 3.2 +Exception groups +---------------- + +The following are used when it is necessary to raise multiple unrelated +exceptions. They are part of the exception hierarchy so they can be +handled with :keyword:`except` like all other exceptions. In addition, +they are recognised by :keyword:`except*`, which matches +their subgroups based on the types of the contained exceptions. + +.. exception:: ExceptionGroup(msg, excs) +.. exception:: BaseExceptionGroup(msg, excs) + + Both of these exception types wrap the exceptions in the sequence ``excs``. + The ``msg`` parameter must be a string. The difference between the two + classes is that :exc:`BaseExceptionGroup` extends :exc:`BaseException` and + it can wrap any exception, while :exc:`ExceptionGroup` extends :exc:`Exception` + and it can only wrap subclasses of :exc:`Exception`. This design is so that + ``except Exception`` catches an :exc:`ExceptionGroup` but not + :exc:`BaseExceptionGroup`. + + The :exc:`BaseExceptionGroup` constructor returns an :exc:`ExceptionGroup` + rather than a :exc:`BaseExceptionGroup` if all contained exceptions are + :exc:`Exception` instances, so it can be used to make the selection + automatic. The :exc:`ExceptionGroup` constructor, on the other hand, + raises a :exc:`TypeError` if any contained exception is not an + :exc:`Exception` subclass. + + .. 
method:: subgroup(condition) + + Returns an exception group that contains only the exceptions from the + current group that match *condition*, or ``None`` if the result is empty. + + The condition can be either a function that accepts an exception and returns + true for those that should be in the subgroup, or it can be an exception type + or a tuple of exception types, which is used to check for a match using the + same check that is used in an ``except`` clause. + + The nesting structure of the current exception is preserved in the result, + as are the values of its :attr:`message`, :attr:`__traceback__`, + :attr:`__cause__`, :attr:`__context__` and :attr:`__note__` fields. + Empty nested groups are omitted from the result. + + The condition is checked for all exceptions in the nested exception group, + including the top-level and any nested exception groups. If the condition is + true for such an exception group, it is included in the result in full. + + .. method:: split(condition) + + Like :meth:`subgroup`, but returns the pair ``(match, rest)`` where ``match`` + is ``subgroup(condition)`` and ``rest`` is the remaining non-matching + part. + + .. method:: derive(excs) + + Returns an exception group with the same :attr:`message`, + :attr:`__traceback__`, :attr:`__cause__`, :attr:`__context__` + and :attr:`__note__` but which wraps the exceptions in ``excs``. + + This method is used by :meth:`subgroup` and :meth:`split`. A + subclass needs to override it in order to make :meth:`subgroup` + and :meth:`split` return instances of the subclass rather + than :exc:`ExceptionGroup`. :: + + >>> class MyGroup(ExceptionGroup): + ... def derive(self, exc): + ... return MyGroup(self.message, exc) + ... + >>> MyGroup("eg", [ValueError(1), TypeError(2)]).split(TypeError) + (MyGroup('eg', [TypeError(2)]), MyGroup('eg', [ValueError(1)])) + + .. versionadded:: 3.11 + Exception hierarchy ------------------- diff --git a/Doc/reference/compound_stmts.rst b/Doc/reference/compound_stmts.rst index 03fc2cb962791..473f977a4cd9d 100644 --- a/Doc/reference/compound_stmts.rst +++ b/Doc/reference/compound_stmts.rst @@ -199,6 +199,7 @@ returns the list ``[0, 1, 2]``. .. _try: .. _except: +.. _except_star: .. _finally: The :keyword:`!try` statement @@ -216,12 +217,16 @@ The :keyword:`try` statement specifies exception handlers and/or cleanup code for a group of statements: .. productionlist:: python-grammar - try_stmt: `try1_stmt` | `try2_stmt` + try_stmt: `try1_stmt` | `try2_stmt` | `try3_stmt` try1_stmt: "try" ":" `suite` : ("except" [`expression` ["as" `identifier`]] ":" `suite`)+ : ["else" ":" `suite`] : ["finally" ":" `suite`] try2_stmt: "try" ":" `suite` + : ("except" "*" `expression` ["as" `identifier`] ":" `suite`)+ + : ["else" ":" `suite`] + : ["finally" ":" `suite`] + try3_stmt: "try" ":" `suite` : "finally" ":" `suite` @@ -304,6 +309,47 @@ when leaving an exception handler:: >>> print(sys.exc_info()) (None, None, None) +.. index:: + keyword: except_star + +The :keyword:`except*` clause(s) are used for handling +:exc:`ExceptionGroup`s. The exception type for matching is interpreted as in +the case of :keyword:`except`, but in the case of exception groups we can have +partial matches when the type matches some of the exceptions in the group. +This means that multiple except* clauses can execute, each handling part of +the exception group. Each clause executes once and handles an exception group +of all matching exceptions. 
Each exception in the group is handled by at most +one except* clause, the first that matches it. :: + + >>> try: + ... raise ExceptionGroup("eg", + ... [ValueError(1), TypeError(2), OSError(3), OSError(4)]) + ... except* TypeError as e: + ... print(f'caught {type(e)} with nested {e.exceptions}') + ... except* OSError as e: + ... print(f'caught {type(e)} with nested {e.exceptions}') + ... + caught with nested (TypeError(2),) + caught with nested (OSError(3), OSError(4)) + + Exception Group Traceback (most recent call last): + | File "", line 2, in + | ExceptionGroup: eg + +-+---------------- 1 ---------------- + | ValueError: 1 + +------------------------------------ + >>> + + Any remaining exceptions that were not handled by any except* clause + are re-raised at the end, combined into an exception group along with + all exceptions that were raised from within except* clauses. + + An except* clause must have a matching type, and this type cannot be a + subclass of :exc:`BaseExceptionGroup`. It is not possible to mix except + and except* in the same :keyword:`try`. :keyword:`break`, + :keyword:`continue` and :keyword:`return` cannot appear in an except* + clause. + + .. index:: keyword: else statement: return diff --git a/Doc/tutorial/errors.rst b/Doc/tutorial/errors.rst index 3f09db2104068..ad1ef841bffc4 100644 --- a/Doc/tutorial/errors.rst +++ b/Doc/tutorial/errors.rst @@ -462,3 +462,92 @@ used in a way that ensures they are always cleaned up promptly and correctly. :: After the statement is executed, the file *f* is always closed, even if a problem was encountered while processing the lines. Objects which, like files, provide predefined clean-up actions will indicate this in their documentation. + + +.. _tut-exception-groups: + +Raising and Handling Multiple Unrelated Exceptions +================================================== + +There are situations where it is necessary to report several exceptions that +have occurred. This it often the case in concurrency frameworks, when several +tasks may have failed in parallel, but there are also other use cases where +it is desirable to continue execution and collect multiple errors rather than +raise the first exception. + +The builtin :exc:`ExceptionGroup` wraps a list of exception instances so +that they can be raised together. It is an exception itself, so it can be +caught like any other exception. :: + + >>> def f(): + ... excs = [OSError('error 1'), SystemError('error 2')] + ... raise ExceptionGroup('there were problems', excs) + ... + >>> f() + + Exception Group Traceback (most recent call last): + | File "", line 1, in + | File "", line 3, in f + | ExceptionGroup: there were problems + +-+---------------- 1 ---------------- + | OSError: error 1 + +---------------- 2 ---------------- + | SystemError: error 2 + +------------------------------------ + >>> try: + ... f() + ... except Exception as e: + ... print(f'caught {type(e)}: e') + ... + caught : e + >>> + +By using ``except*`` instead of ``except``, we can selectively +handle only the exceptions in the group that match a certain +type. In the following example, which shows a nested exception +group, each ``except*`` clause extracts from the group exceptions +of a certain type while letting all other exceptions propagate to +other clauses and eventually to be reraised. :: + + >>> def f(): + ... raise ExceptionGroup("group1", + ... [OSError(1), + ... SystemError(2), + ... ExceptionGroup("group2", + ... [OSError(3), RecursionError(4)])]) + ... + >>> try: + ... f() + ... 
except* OSError as e: + ... print("There were OSErrors") + ... except* SystemError as e: + ... print("There were SystemErrors") + ... + There were OSErrors + There were SystemErrors + + Exception Group Traceback (most recent call last): + | File "", line 2, in + | File "", line 2, in f + | ExceptionGroup: group1 + +-+---------------- 1 ---------------- + | ExceptionGroup: group2 + +-+---------------- 1 ---------------- + | RecursionError: 4 + +------------------------------------ + >>> + +Note that the exceptions nested in an exception group must be instances, +not types. This is because in practice the exceptions would typically +be ones that have already been raised and caught by the program, along +the following pattern:: + + >>> excs = [] + ... for test in tests: + ... try: + ... test.run() + ... except Exception as e: + ... excs.append(e) + ... + >>> if excs: + ... raise ExceptionGroup("Test Failures", excs) + ... + From webhook-mailer at python.org Thu Jan 6 14:13:29 2022 From: webhook-mailer at python.org (zooba) Date: Thu, 06 Jan 2022 19:13:29 -0000 Subject: [Python-checkins] bpo-46208: Fix normalization of relative paths in _Py_normpath()/os.path.normpath (GH-30362) Message-ID: https://github.com/python/cpython/commit/9c5fa9c97c5c5336e60e4ae7a2e6e3f67acedfc7 commit: 9c5fa9c97c5c5336e60e4ae7a2e6e3f67acedfc7 branch: main author: neonene <53406459+neonene at users.noreply.github.com> committer: zooba date: 2022-01-06T19:13:10Z summary: bpo-46208: Fix normalization of relative paths in _Py_normpath()/os.path.normpath (GH-30362) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst M Lib/test/test_ntpath.py M Lib/test/test_posixpath.py M Python/fileutils.c diff --git a/Lib/test/test_ntpath.py b/Lib/test/test_ntpath.py index a8d87e53db9e9..cc29881049224 100644 --- a/Lib/test/test_ntpath.py +++ b/Lib/test/test_ntpath.py @@ -235,6 +235,15 @@ def test_normpath(self): tester("ntpath.normpath('\\\\.\\NUL')", r'\\.\NUL') tester("ntpath.normpath('\\\\?\\D:/XY\\Z')", r'\\?\D:/XY\Z') + tester("ntpath.normpath('handbook/../../Tests/image.png')", r'..\Tests\image.png') + tester("ntpath.normpath('handbook/../../../Tests/image.png')", r'..\..\Tests\image.png') + tester("ntpath.normpath('handbook///../a/.././../b/c')", r'..\b\c') + tester("ntpath.normpath('handbook/a/../..///../../b/c')", r'..\..\b\c') + + tester("ntpath.normpath('//server/share/..')" , '\\\\server\\share\\') + tester("ntpath.normpath('//server/share/../')" , '\\\\server\\share\\') + tester("ntpath.normpath('//server/share/../..')", '\\\\server\\share\\') + tester("ntpath.normpath('//server/share/../../')", '\\\\server\\share\\') def test_realpath_curdir(self): expected = ntpath.normpath(os.getcwd()) diff --git a/Lib/test/test_posixpath.py b/Lib/test/test_posixpath.py index e4d8384ef0b4b..5fc4205beb125 100644 --- a/Lib/test/test_posixpath.py +++ b/Lib/test/test_posixpath.py @@ -329,13 +329,30 @@ def test_expanduser_pwd(self): ("/..", "/"), ("/../", "/"), ("/..//", "/"), + ("//.", "//"), ("//..", "//"), + ("//...", "//..."), + ("//../foo", "//foo"), + ("//../../foo", "//foo"), ("/../foo", "/foo"), ("/../../foo", "/foo"), ("/../foo/../", "/"), ("/../foo/../bar", "/bar"), ("/../../foo/../bar/./baz/boom/..", "/bar/baz"), ("/../../foo/../bar/./baz/boom/.", "/bar/baz/boom"), + ("foo/../bar/baz", "bar/baz"), + ("foo/../../bar/baz", "../bar/baz"), + ("foo/../../../bar/baz", "../../bar/baz"), + ("foo///../bar/.././../baz/boom", "../baz/boom"), + ("foo/bar/../..///../../baz/boom", "../../baz/boom"), + ("/foo/..", 
"/"), + ("/foo/../..", "/"), + ("//foo/..", "//"), + ("//foo/../..", "//"), + ("///foo/..", "/"), + ("///foo/../..", "/"), + ("////foo/..", "/"), + ("/////foo/..", "/"), ] def test_normpath(self): diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst new file mode 100644 index 0000000000000..92025a02d5b25 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst @@ -0,0 +1 @@ +Fix the regression of os.path.normpath("A/../../B") not returning expected "../B" but "B". \ No newline at end of file diff --git a/Python/fileutils.c b/Python/fileutils.c index cae6b75b6ae9b..d570be5577ae9 100644 --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -2218,11 +2218,11 @@ _Py_normpath(wchar_t *path, Py_ssize_t size) if (!path[0] || size == 0) { return path; } - wchar_t lastC = L'\0'; - wchar_t *p1 = path; wchar_t *pEnd = size >= 0 ? &path[size] : NULL; - wchar_t *p2 = path; - wchar_t *minP2 = path; + wchar_t *p1 = path; // sequentially scanned address in the path + wchar_t *p2 = path; // destination of a scanned character to be ljusted + wchar_t *minP2 = path; // the beginning of the destination range + wchar_t lastC = L'\0'; // the last ljusted character, p2[-1] in most cases #define IS_END(x) (pEnd ? (x) == pEnd : !*(x)) #ifdef ALTSEP @@ -2264,14 +2264,18 @@ _Py_normpath(wchar_t *path, Py_ssize_t size) *p2++ = lastC = *p1; } } - minP2 = p2; + if (sepCount) { + minP2 = p2; // Invalid path + } else { + minP2 = p2 - 1; // Absolute path has SEP at minP2 + } } #else // Skip past two leading SEPs else if (IS_SEP(&p1[0]) && IS_SEP(&p1[1]) && !IS_SEP(&p1[2])) { *p2++ = *p1++; *p2++ = *p1++; - minP2 = p2; + minP2 = p2 - 1; // Absolute path has SEP at minP2 lastC = SEP; } #endif /* MS_WINDOWS */ @@ -2292,8 +2296,11 @@ _Py_normpath(wchar_t *path, Py_ssize_t size) wchar_t *p3 = p2; while (p3 != minP2 && *--p3 == SEP) { } while (p3 != minP2 && *(p3 - 1) != SEP) { --p3; } - if (p3[0] == L'.' && p3[1] == L'.' && IS_SEP(&p3[2])) { - // Previous segment is also ../, so append instead + if (p2 == minP2 + || (p3[0] == L'.' && p3[1] == L'.' && IS_SEP(&p3[2]))) + { + // Previous segment is also ../, so append instead. + // Relative path does not absorb ../ at minP2 as well. 
*p2++ = L'.'; *p2++ = L'.'; lastC = L'.'; @@ -2314,7 +2321,7 @@ _Py_normpath(wchar_t *path, Py_ssize_t size) } } else { *p2++ = lastC = c; - } + } } *p2 = L'\0'; if (p2 != minP2) { From webhook-mailer at python.org Thu Jan 6 14:43:26 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 19:43:26 -0000 Subject: [Python-checkins] bpo-46286: use the new POP_JUMP_IF_NOT_NONE opcode to simplify except* (GH-30439) Message-ID: https://github.com/python/cpython/commit/16dfabf75cd0786781bcd8ded6a12591fb893d68 commit: 16dfabf75cd0786781bcd8ded6a12591fb893d68 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T11:43:16-08:00 summary: bpo-46286: use the new POP_JUMP_IF_NOT_NONE opcode to simplify except* (GH-30439) Automerge-Triggered-By: GH:iritkatriel files: M Python/compile.c diff --git a/Python/compile.c b/Python/compile.c index 62f37ca452632..643a5e507712c 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -3515,9 +3515,7 @@ compiler_try_except(struct compiler *c, stmt_ty s) [orig, res] PREP_RERAISE_STAR [exc] DUP_TOP - [exc, exc] LOAD_CONST None - [exc, exc, None] COMPARE_IS - [exc, is_none] POP_JUMP_IF_FALSE RER + [exc, exc] POP_JUMP_IF_NOT_NONE RER [exc] POP_TOP [] JUMP_FORWARD L0 @@ -3687,9 +3685,7 @@ compiler_try_star_except(struct compiler *c, stmt_ty s) compiler_use_next_block(c, reraise_star); ADDOP(c, PREP_RERAISE_STAR); ADDOP(c, DUP_TOP); - ADDOP_LOAD_CONST(c, Py_None); - ADDOP_COMPARE(c, Is); - ADDOP_JUMP(c, POP_JUMP_IF_FALSE, reraise); + ADDOP_JUMP(c, POP_JUMP_IF_NOT_NONE, reraise); NEXT_BLOCK(c); /* Nothing to reraise */ From webhook-mailer at python.org Thu Jan 6 14:49:13 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 19:49:13 -0000 Subject: [Python-checkins] bpo-46263: Fix second location that needs MALLOC_CONF on FreeBSD (GH-30440) Message-ID: https://github.com/python/cpython/commit/c9137d4b638c0699b904011cafe68895d28dd80b commit: c9137d4b638c0699b904011cafe68895d28dd80b branch: main author: Christian Heimes committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T11:49:03-08:00 summary: bpo-46263: Fix second location that needs MALLOC_CONF on FreeBSD (GH-30440) Automerge-Triggered-By: GH:tiran files: M Lib/test/test_capi.py diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index e246c36de01cd..99263bff091be 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -933,7 +933,11 @@ def check_pyobject_is_freed(self, func_name): except _testcapi.error: os._exit(1) ''') - assert_python_ok('-c', code, PYTHONMALLOC=self.PYTHONMALLOC) + assert_python_ok( + '-c', code, + PYTHONMALLOC=self.PYTHONMALLOC, + MALLOC_CONF="junk:false", + ) def test_pyobject_null_is_freed(self): self.check_pyobject_is_freed('check_pyobject_null_is_freed') From webhook-mailer at python.org Thu Jan 6 15:12:17 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 20:12:17 -0000 Subject: [Python-checkins] bpo-46263: Fix second location that needs MALLOC_CONF on FreeBSD (GH-30440) Message-ID: https://github.com/python/cpython/commit/3cb64ede57bba260228864150123014fc682167e commit: 3cb64ede57bba260228864150123014fc682167e branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T12:12:04-08:00 summary: 
bpo-46263: Fix second location that needs MALLOC_CONF on FreeBSD (GH-30440) Automerge-Triggered-By: GH:tiran (cherry picked from commit c9137d4b638c0699b904011cafe68895d28dd80b) Co-authored-by: Christian Heimes files: M Lib/test/test_capi.py diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 6453f760a846e..ccb9d5383d6be 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -926,7 +926,11 @@ def check_pyobject_is_freed(self, func_name): except _testcapi.error: os._exit(1) ''') - assert_python_ok('-c', code, PYTHONMALLOC=self.PYTHONMALLOC) + assert_python_ok( + '-c', code, + PYTHONMALLOC=self.PYTHONMALLOC, + MALLOC_CONF="junk:false", + ) def test_pyobject_null_is_freed(self): self.check_pyobject_is_freed('check_pyobject_null_is_freed') From webhook-mailer at python.org Thu Jan 6 15:16:03 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 06 Jan 2022 20:16:03 -0000 Subject: [Python-checkins] bpo-46263: Fix second location that needs MALLOC_CONF on FreeBSD (GH-30440) Message-ID: https://github.com/python/cpython/commit/b98730c5165467c6053e7274ae094d733aea804d commit: b98730c5165467c6053e7274ae094d733aea804d branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-06T12:15:55-08:00 summary: bpo-46263: Fix second location that needs MALLOC_CONF on FreeBSD (GH-30440) Automerge-Triggered-By: GH:tiran (cherry picked from commit c9137d4b638c0699b904011cafe68895d28dd80b) Co-authored-by: Christian Heimes files: M Lib/test/test_capi.py diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 72c648f54d8b7..39b1b64651163 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -839,7 +839,11 @@ def check_pyobject_is_freed(self, func_name): except _testcapi.error: os._exit(1) ''') - assert_python_ok('-c', code, PYTHONMALLOC=self.PYTHONMALLOC) + assert_python_ok( + '-c', code, + PYTHONMALLOC=self.PYTHONMALLOC, + MALLOC_CONF="junk:false", + ) def test_pyobject_null_is_freed(self): self.check_pyobject_is_freed('check_pyobject_null_is_freed') From webhook-mailer at python.org Thu Jan 6 17:35:25 2022 From: webhook-mailer at python.org (vsajip) Date: Thu, 06 Jan 2022 22:35:25 -0000 Subject: [Python-checkins] =?utf-8?q?bpo-46251=3A_Add_=27Security_Conside?= =?utf-8?q?rations=27_section_to_logging_configura=E2=80=A6_=28GH-30411=29?= Message-ID: https://github.com/python/cpython/commit/46c7a6566bca2e974a89c90c35ed1c498d9d3b02 commit: 46c7a6566bca2e974a89c90c35ed1c498d9d3b02 branch: main author: Vinay Sajip committer: vsajip date: 2022-01-06T22:35:08Z summary: bpo-46251: Add 'Security Considerations' section to logging configura? (GH-30411) files: M Doc/library/logging.config.rst diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index 5a3e686802ea8..a1b8dc755ba6b 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -191,6 +191,20 @@ in :mod:`logging` itself) and defining handlers which are declared either in :func:`listen`. +Security considerations +^^^^^^^^^^^^^^^^^^^^^^^ + +The logging configuration functionality tries to offer convenience, and in part this +is done by offering the ability to convert text in configuration files into Python +objects used in logging configuration - for example, as described in +:ref:`logging-config-dict-userdef`. 
However, these same mechanisms (importing +callables from user-defined modules and calling them with parameters from the +configuration) could be used to invoke any code you like, and for this reason you +should treat configuration files from untrusted sources with *extreme caution* and +satisfy yourself that nothing bad can happen if you load them, before actually loading +them. + + .. _logging-config-dictschema: Configuration dictionary schema From webhook-mailer at python.org Thu Jan 6 18:18:47 2022 From: webhook-mailer at python.org (vsajip) Date: Thu, 06 Jan 2022 23:18:47 -0000 Subject: [Python-checkins] =?utf-8?b?WzMuOV0gYnBvLTQ2MjUxOiBBZGQgJ1NlY3Vy?= =?utf-8?q?ity_Considerations=27_section_to_logging_configura=E2=80=A6_=28?= =?utf-8?q?GH-30411=29_=28GH-30448=29?= Message-ID: https://github.com/python/cpython/commit/188fbdee0d6721a948eabb81cdcacac371614793 commit: 188fbdee0d6721a948eabb81cdcacac371614793 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vsajip date: 2022-01-06T23:18:18Z summary: [3.9] bpo-46251: Add 'Security Considerations' section to logging configura? (GH-30411) (GH-30448) files: M Doc/library/logging.config.rst diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index d4dc585351bab..afc32e64bc798 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -186,6 +186,20 @@ in :mod:`logging` itself) and defining handlers which are declared either in :func:`listen`. +Security considerations +^^^^^^^^^^^^^^^^^^^^^^^ + +The logging configuration functionality tries to offer convenience, and in part this +is done by offering the ability to convert text in configuration files into Python +objects used in logging configuration - for example, as described in +:ref:`logging-config-dict-userdef`. However, these same mechanisms (importing +callables from user-defined modules and calling them with parameters from the +configuration) could be used to invoke any code you like, and for this reason you +should treat configuration files from untrusted sources with *extreme caution* and +satisfy yourself that nothing bad can happen if you load them, before actually loading +them. + + .. _logging-config-dictschema: Configuration dictionary schema From webhook-mailer at python.org Thu Jan 6 18:18:47 2022 From: webhook-mailer at python.org (vsajip) Date: Thu, 06 Jan 2022 23:18:47 -0000 Subject: [Python-checkins] =?utf-8?b?WzMuMTBdIGJwby00NjI1MTogQWRkICdTZWN1?= =?utf-8?q?rity_Considerations=27_section_to_logging_configura=E2=80=A6_?= =?utf-8?b?KEdILTMwNDExKSAoR0gtMzA0NDcp?= Message-ID: https://github.com/python/cpython/commit/db60ed1170a02189a4fd4b7574e0722dd22c658b commit: db60ed1170a02189a4fd4b7574e0722dd22c658b branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vsajip date: 2022-01-06T23:18:41Z summary: [3.10] bpo-46251: Add 'Security Considerations' section to logging configura? (GH-30411) (GH-30447) files: M Doc/library/logging.config.rst diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index 5a3e686802ea8..a1b8dc755ba6b 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -191,6 +191,20 @@ in :mod:`logging` itself) and defining handlers which are declared either in :func:`listen`. 
+Security considerations +^^^^^^^^^^^^^^^^^^^^^^^ + +The logging configuration functionality tries to offer convenience, and in part this +is done by offering the ability to convert text in configuration files into Python +objects used in logging configuration - for example, as described in +:ref:`logging-config-dict-userdef`. However, these same mechanisms (importing +callables from user-defined modules and calling them with parameters from the +configuration) could be used to invoke any code you like, and for this reason you +should treat configuration files from untrusted sources with *extreme caution* and +satisfy yourself that nothing bad can happen if you load them, before actually loading +them. + + .. _logging-config-dictschema: Configuration dictionary schema From webhook-mailer at python.org Fri Jan 7 01:51:02 2022 From: webhook-mailer at python.org (vsajip) Date: Fri, 07 Jan 2022 06:51:02 -0000 Subject: [Python-checkins] bpo-41011: venv -- add more variables to pyvenv.cfg (GH-30382) Message-ID: https://github.com/python/cpython/commit/f4e325c21d6d9c2bf70224dc69d707b226f87872 commit: f4e325c21d6d9c2bf70224dc69d707b226f87872 branch: main author: andrei kulakov committer: vsajip date: 2022-01-07T06:50:30Z summary: bpo-41011: venv -- add more variables to pyvenv.cfg (GH-30382) files: A Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst M Lib/test/test_venv.py M Lib/venv/__init__.py diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py index de714de2a2049..ca37abcf79854 100644 --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -19,7 +19,7 @@ from test.support.os_helper import (can_symlink, EnvironmentVarGuard, rmtree) import unittest import venv -from unittest.mock import patch +from unittest.mock import patch, Mock try: import ctypes @@ -114,6 +114,11 @@ def test_defaults(self): executable = sys._base_executable path = os.path.dirname(executable) self.assertIn('home = %s' % path, data) + self.assertIn('executable = %s' % + os.path.realpath(sys.executable), data) + copies = '' if os.name=='nt' else ' --copies' + cmd = f'command = {sys.executable} -m venv{copies} --without-pip {self.env_dir}' + self.assertIn(cmd, data) fn = self.get_env_file(self.bindir, self.exe) if not os.path.exists(fn): # diagnostics for Windows buildbot failures bd = self.get_env_file(self.bindir) @@ -121,6 +126,37 @@ def test_defaults(self): print(' %r' % os.listdir(bd)) self.assertTrue(os.path.exists(fn), 'File %r should exist.' 
% fn) + def test_config_file_command_key(self): + attrs = [ + (None, None), + ('symlinks', '--copies'), + ('with_pip', '--without-pip'), + ('system_site_packages', '--system-site-packages'), + ('clear', '--clear'), + ('upgrade', '--upgrade'), + ('upgrade_deps', '--upgrade-deps'), + ('prompt', '--prompt'), + ] + for attr, opt in attrs: + rmtree(self.env_dir) + if not attr: + b = venv.EnvBuilder() + else: + b = venv.EnvBuilder( + **{attr: False if attr in ('with_pip', 'symlinks') else True}) + b.upgrade_dependencies = Mock() # avoid pip command to upgrade deps + b._setup_pip = Mock() # avoid pip setup + self.run_with_capture(b.create, self.env_dir) + data = self.get_text_file_contents('pyvenv.cfg') + if not attr: + for opt in ('--system-site-packages', '--clear', '--upgrade', + '--upgrade-deps', '--prompt'): + self.assertNotRegex(data, rf'command = .* {opt}') + elif os.name=='nt' and attr=='symlinks': + pass + else: + self.assertRegex(data, rf'command = .* {opt}') + def test_prompt(self): env_name = os.path.split(self.env_dir)[1] diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py index 6f1af294ae63e..b90765074c36d 100644 --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -51,6 +51,7 @@ def __init__(self, system_site_packages=False, clear=False, self.symlinks = symlinks self.upgrade = upgrade self.with_pip = with_pip + self.orig_prompt = prompt if prompt == '.': # see bpo-38901 prompt = os.path.basename(os.getcwd()) self.prompt = prompt @@ -178,6 +179,29 @@ def create_configuration(self, context): f.write('version = %d.%d.%d\n' % sys.version_info[:3]) if self.prompt is not None: f.write(f'prompt = {self.prompt!r}\n') + f.write('executable = %s\n' % os.path.realpath(sys.executable)) + args = [] + nt = os.name == 'nt' + if nt and self.symlinks: + args.append('--symlinks') + if not nt and not self.symlinks: + args.append('--copies') + if not self.with_pip: + args.append('--without-pip') + if self.system_site_packages: + args.append('--system-site-packages') + if self.clear: + args.append('--clear') + if self.upgrade: + args.append('--upgrade') + if self.upgrade_deps: + args.append('--upgrade-deps') + if self.orig_prompt is not None: + args.append(f'--prompt="{self.orig_prompt}"') + + args.append(context.env_dir) + args = ' '.join(args) + f.write(f'command = {sys.executable} -m venv {args}\n') if os.name != 'nt': def symlink_or_copy(self, src, dst, relative_symlinks_ok=False): diff --git a/Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst b/Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst new file mode 100644 index 0000000000000..1b1fa5d376527 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst @@ -0,0 +1,3 @@ +Added two new variables to *pyvenv.cfg* which is generated by :mod:`venv` +module: *executable* for the executable and *command* for the command line used +to create the environment. 
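As a quick illustration of the venv change above (not part of the commit itself), the sketch below creates an environment with venv.EnvBuilder and prints the two pyvenv.cfg entries added by bpo-41011; the directory name "demo-env" and the use of with_pip=False are illustrative choices only:

    import venv
    from pathlib import Path

    env_dir = Path("demo-env")
    # Skip pip installation to keep the example fast and offline.
    venv.EnvBuilder(with_pip=False).create(env_dir)

    # With this change, pyvenv.cfg also records the interpreter used and the
    # command line that created the environment.
    for line in (env_dir / "pyvenv.cfg").read_text().splitlines():
        key = line.split("=", 1)[0].strip()
        if key in ("executable", "command"):
            print(line)
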
From webhook-mailer at python.org Fri Jan 7 03:15:29 2022 From: webhook-mailer at python.org (tiran) Date: Fri, 07 Jan 2022 08:15:29 -0000 Subject: [Python-checkins] bpo-45723: Fix detection of epoll (#30449) Message-ID: https://github.com/python/cpython/commit/994f90c0772612780361e1dc5fa5223dce22f70a commit: 994f90c0772612780361e1dc5fa5223dce22f70a branch: main author: Christian Heimes committer: tiran date: 2022-01-07T09:15:20+01:00 summary: bpo-45723: Fix detection of epoll (#30449) files: A Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst M configure M configure.ac M pyconfig.h.in diff --git a/Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst b/Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst new file mode 100644 index 0000000000000..ca923b2f81f13 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst @@ -0,0 +1 @@ +Fixed a regression in ``configure`` check for :func:`select.epoll`. diff --git a/configure b/configure index b99b9f5e8d1fc..9e7090c7906dd 100755 --- a/configure +++ b/configure @@ -13850,9 +13850,9 @@ fi - { $as_echo "$as_me:${as_lineno-$LINENO}: checking for epoll" >&5 -$as_echo_n "checking for epoll... " >&6; } -if ${ac_cv_func_epoll+:} false; then : + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for epoll_create" >&5 +$as_echo_n "checking for epoll_create... " >&6; } +if ${ac_cv_func_epoll_create+:} false; then : $as_echo_n "(cached) " >&6 else cat confdefs.h - <<_ACEOF >conftest.$ac_ext @@ -13861,22 +13861,22 @@ else int main () { -void *x=epoll +void *x=epoll_create ; return 0; } _ACEOF if ac_fn_c_try_compile "$LINENO"; then : - ac_cv_func_epoll=yes + ac_cv_func_epoll_create=yes else - ac_cv_func_epoll=no + ac_cv_func_epoll_create=no fi rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext fi -{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_func_epoll" >&5 -$as_echo "$ac_cv_func_epoll" >&6; } - if test "x$ac_cv_func_epoll" = xyes; then : +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_func_epoll_create" >&5 +$as_echo "$ac_cv_func_epoll_create" >&6; } + if test "x$ac_cv_func_epoll_create" = xyes; then : $as_echo "#define HAVE_EPOLL 1" >>confdefs.h diff --git a/configure.ac b/configure.ac index d9bdf9f8c66e9..ff3163f921ae2 100644 --- a/configure.ac +++ b/configure.ac @@ -4089,7 +4089,7 @@ PY_CHECK_FUNC([symlink], [#include ]) PY_CHECK_FUNC([fchdir], [#include ]) PY_CHECK_FUNC([fsync], [#include ]) PY_CHECK_FUNC([fdatasync], [#include ]) -PY_CHECK_FUNC([epoll], [#include ]) +PY_CHECK_FUNC([epoll_create], [#include ], [HAVE_EPOLL]) PY_CHECK_FUNC([epoll_create1], [#include ]) PY_CHECK_FUNC([kqueue],[ #include diff --git a/pyconfig.h.in b/pyconfig.h.in index e6e81654699d8..f496b771999d9 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -293,7 +293,7 @@ /* Define to 1 if you have the header file. */ #undef HAVE_ENDIAN_H -/* Define if you have the 'epoll' function. */ +/* Define if you have the 'epoll_create' function. */ #undef HAVE_EPOLL /* Define if you have the 'epoll_create1' function. 
*/ From webhook-mailer at python.org Fri Jan 7 09:08:46 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 07 Jan 2022 14:08:46 -0000 Subject: [Python-checkins] bpo-46070: Fix asyncio initialisation guard (GH-30423) Message-ID: https://github.com/python/cpython/commit/b127e70a8a682fe869c22ce04c379bd85a00db67 commit: b127e70a8a682fe869c22ce04c379bd85a00db67 branch: main author: Erlend Egeberg Aasland committer: vstinner date: 2022-01-07T15:08:19+01:00 summary: bpo-46070: Fix asyncio initialisation guard (GH-30423) If init flag is set, exit successfully immediately. If not, only set the flag after successful initialization. files: A Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst M Modules/_asynciomodule.c diff --git a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst new file mode 100644 index 0000000000000..0fedc9dfb8fb1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst @@ -0,0 +1,2 @@ +Fix possible segfault when importing the :mod:`asyncio` module from +different sub-interpreters in parallel. Patch by Erlend E. Aasland. diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index 978a1fdd0d852..b08a7d1c024c3 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -3318,17 +3318,14 @@ static int module_init(void) { PyObject *module = NULL; + if (module_initialized) { + return 0; + } asyncio_mod = PyImport_ImportModule("asyncio"); if (asyncio_mod == NULL) { goto fail; } - if (module_initialized != 0) { - return 0; - } - else { - module_initialized = 1; - } current_tasks = PyDict_New(); if (current_tasks == NULL) { @@ -3389,6 +3386,7 @@ module_init(void) goto fail; } + module_initialized = 1; Py_DECREF(module); return 0; From webhook-mailer at python.org Fri Jan 7 09:35:20 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 14:35:20 -0000 Subject: [Python-checkins] bpo-46070: Fix asyncio initialisation guard (GH-30423) Message-ID: https://github.com/python/cpython/commit/9d18045804f6db8224be14f7a618b77977f90144 commit: 9d18045804f6db8224be14f7a618b77977f90144 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T06:35:15-08:00 summary: bpo-46070: Fix asyncio initialisation guard (GH-30423) If init flag is set, exit successfully immediately. If not, only set the flag after successful initialization. (cherry picked from commit b127e70a8a682fe869c22ce04c379bd85a00db67) Co-authored-by: Erlend Egeberg Aasland files: A Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst M Modules/_asynciomodule.c diff --git a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst new file mode 100644 index 0000000000000..0fedc9dfb8fb1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst @@ -0,0 +1,2 @@ +Fix possible segfault when importing the :mod:`asyncio` module from +different sub-interpreters in parallel. Patch by Erlend E. Aasland. 
diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index 56079b0277d1a..befec9a8342ef 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -3309,17 +3309,14 @@ static int module_init(void) { PyObject *module = NULL; + if (module_initialized) { + return 0; + } asyncio_mod = PyImport_ImportModule("asyncio"); if (asyncio_mod == NULL) { goto fail; } - if (module_initialized != 0) { - return 0; - } - else { - module_initialized = 1; - } current_tasks = PyDict_New(); if (current_tasks == NULL) { @@ -3380,6 +3377,7 @@ module_init(void) goto fail; } + module_initialized = 1; Py_DECREF(module); return 0; From webhook-mailer at python.org Fri Jan 7 09:36:11 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 14:36:11 -0000 Subject: [Python-checkins] bpo-46070: Fix asyncio initialisation guard (GH-30423) Message-ID: https://github.com/python/cpython/commit/4d2cfd354969590ba8e0af0447fd84f8b5e61952 commit: 4d2cfd354969590ba8e0af0447fd84f8b5e61952 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T06:36:07-08:00 summary: bpo-46070: Fix asyncio initialisation guard (GH-30423) If init flag is set, exit successfully immediately. If not, only set the flag after successful initialization. (cherry picked from commit b127e70a8a682fe869c22ce04c379bd85a00db67) Co-authored-by: Erlend Egeberg Aasland files: A Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst M Modules/_asynciomodule.c diff --git a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst new file mode 100644 index 0000000000000..0fedc9dfb8fb1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst @@ -0,0 +1,2 @@ +Fix possible segfault when importing the :mod:`asyncio` module from +different sub-interpreters in parallel. Patch by Erlend E. Aasland. 
diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index 4457d7bd49e23..a1421cf5dbec8 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -3328,17 +3328,14 @@ static int module_init(void) { PyObject *module = NULL; + if (module_initialized) { + return 0; + } asyncio_mod = PyImport_ImportModule("asyncio"); if (asyncio_mod == NULL) { goto fail; } - if (module_initialized != 0) { - return 0; - } - else { - module_initialized = 1; - } current_tasks = PyDict_New(); if (current_tasks == NULL) { @@ -3399,6 +3396,7 @@ module_init(void) goto fail; } + module_initialized = 1; Py_DECREF(module); return 0; From webhook-mailer at python.org Fri Jan 7 13:28:19 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 18:28:19 -0000 Subject: [Python-checkins] bpo-46216: remove spurious link to os.system() from os.time() documentation (GH-30326) Message-ID: https://github.com/python/cpython/commit/9b7aa6a9d678ba798c57fa5bbc800014dfe4fb91 commit: 9b7aa6a9d678ba798c57fa5bbc800014dfe4fb91 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T10:28:08-08:00 summary: bpo-46216: remove spurious link to os.system() from os.time() documentation (GH-30326) Automerge-Triggered-By: GH:iritkatriel files: M Doc/library/os.rst diff --git a/Doc/library/os.rst b/Doc/library/os.rst index 3e8fc54485e20..eb3035344455f 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -4254,20 +4254,20 @@ written in Python, such as a mail server's external command delivery program. Returns the current global process times. The return value is an object with five attributes: - * :attr:`user` - user time - * :attr:`system` - system time - * :attr:`children_user` - user time of all child processes - * :attr:`children_system` - system time of all child processes - * :attr:`elapsed` - elapsed real time since a fixed point in the past + * :attr:`!user` - user time + * :attr:`!system` - system time + * :attr:`!children_user` - user time of all child processes + * :attr:`!children_system` - system time of all child processes + * :attr:`!elapsed` - elapsed real time since a fixed point in the past For backwards compatibility, this object also behaves like a five-tuple - containing :attr:`user`, :attr:`system`, :attr:`children_user`, - :attr:`children_system`, and :attr:`elapsed` in that order. + containing :attr:`!user`, :attr:`!system`, :attr:`!children_user`, + :attr:`!children_system`, and :attr:`!elapsed` in that order. See the Unix manual page :manpage:`times(2)` and :manpage:`times(3)` manual page on Unix or `the GetProcessTimes MSDN `_ - on Windows. On Windows, only :attr:`user` and :attr:`system` are known; the other attributes are zero. + on Windows. On Windows, only :attr:`!user` and :attr:`!system` are known; the other attributes are zero. .. availability:: Unix, Windows. 
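A minimal usage sketch (not part of the commit above) of the os.times() attributes that the revised documentation lists; the printed values are machine-dependent:

    import os

    t = os.times()
    # The five documented attributes.
    print(t.user, t.system, t.children_user, t.children_system, t.elapsed)

    # For backwards compatibility, the result also unpacks like a five-tuple
    # in the same order as the attributes above.
    user, system, children_user, children_system, elapsed = t
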
From webhook-mailer at python.org Fri Jan 7 13:39:57 2022 From: webhook-mailer at python.org (iritkatriel) Date: Fri, 07 Jan 2022 18:39:57 -0000 Subject: [Python-checkins] bpo-24650: Use full term "generator function" in yield expressions docs (GH-24663) Message-ID: https://github.com/python/cpython/commit/273cb8e7577d143830404f6779946a0bedb58758 commit: 273cb8e7577d143830404f6779946a0bedb58758 branch: main author: Jacob Walls committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-07T18:39:40Z summary: bpo-24650: Use full term "generator function" in yield expressions docs (GH-24663) files: M Doc/reference/expressions.rst diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index 991f2d717a0a6..0a6651ea5ed67 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -427,9 +427,9 @@ Yield expressions The yield expression is used when defining a :term:`generator` function or an :term:`asynchronous generator` function and thus can only be used in the body of a function definition. Using a yield -expression in a function's body causes that function to be a generator, +expression in a function's body causes that function to be a generator function, and using it in an :keyword:`async def` function's body causes that -coroutine function to be an asynchronous generator. For example:: +coroutine function to be an asynchronous generator function. For example:: def gen(): # defines a generator function yield 123 From webhook-mailer at python.org Fri Jan 7 13:50:18 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 18:50:18 -0000 Subject: [Python-checkins] bpo-46216: remove spurious link to os.system() from os.time() documentation (GH-30326) Message-ID: https://github.com/python/cpython/commit/6630952cf0955d205e48874a53b69868c97b8875 commit: 6630952cf0955d205e48874a53b69868c97b8875 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T10:50:09-08:00 summary: bpo-46216: remove spurious link to os.system() from os.time() documentation (GH-30326) Automerge-Triggered-By: GH:iritkatriel (cherry picked from commit 9b7aa6a9d678ba798c57fa5bbc800014dfe4fb91) Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> files: M Doc/library/os.rst diff --git a/Doc/library/os.rst b/Doc/library/os.rst index 39d7e40dd915c..629a32f1b63e7 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -4244,20 +4244,20 @@ written in Python, such as a mail server's external command delivery program. Returns the current global process times. The return value is an object with five attributes: - * :attr:`user` - user time - * :attr:`system` - system time - * :attr:`children_user` - user time of all child processes - * :attr:`children_system` - system time of all child processes - * :attr:`elapsed` - elapsed real time since a fixed point in the past + * :attr:`!user` - user time + * :attr:`!system` - system time + * :attr:`!children_user` - user time of all child processes + * :attr:`!children_system` - system time of all child processes + * :attr:`!elapsed` - elapsed real time since a fixed point in the past For backwards compatibility, this object also behaves like a five-tuple - containing :attr:`user`, :attr:`system`, :attr:`children_user`, - :attr:`children_system`, and :attr:`elapsed` in that order. 
+ containing :attr:`!user`, :attr:`!system`, :attr:`!children_user`, + :attr:`!children_system`, and :attr:`!elapsed` in that order. See the Unix manual page :manpage:`times(2)` and :manpage:`times(3)` manual page on Unix or `the GetProcessTimes MSDN `_ - on Windows. On Windows, only :attr:`user` and :attr:`system` are known; the other attributes are zero. + on Windows. On Windows, only :attr:`!user` and :attr:`!system` are known; the other attributes are zero. .. availability:: Unix, Windows. From webhook-mailer at python.org Fri Jan 7 14:00:49 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 19:00:49 -0000 Subject: [Python-checkins] bpo-24650: Use full term "generator function" in yield expressions docs (GH-24663) Message-ID: https://github.com/python/cpython/commit/8bc68140cbe8230cf048bc04faf927c1413066d1 commit: 8bc68140cbe8230cf048bc04faf927c1413066d1 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T11:00:45-08:00 summary: bpo-24650: Use full term "generator function" in yield expressions docs (GH-24663) (cherry picked from commit 273cb8e7577d143830404f6779946a0bedb58758) Co-authored-by: Jacob Walls files: M Doc/reference/expressions.rst diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index 5333de911c2c8..606e773456032 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -422,9 +422,9 @@ Yield expressions The yield expression is used when defining a :term:`generator` function or an :term:`asynchronous generator` function and thus can only be used in the body of a function definition. Using a yield -expression in a function's body causes that function to be a generator, +expression in a function's body causes that function to be a generator function, and using it in an :keyword:`async def` function's body causes that -coroutine function to be an asynchronous generator. For example:: +coroutine function to be an asynchronous generator function. For example:: def gen(): # defines a generator function yield 123 From webhook-mailer at python.org Fri Jan 7 14:01:31 2022 From: webhook-mailer at python.org (iritkatriel) Date: Fri, 07 Jan 2022 19:01:31 -0000 Subject: [Python-checkins] bpo-46216: remove spurious link to os.system() from os.time() documentation (GH-30326) (GH-30460) Message-ID: https://github.com/python/cpython/commit/c55616cf936bd8497d9ff321c03f7379f5f78a0e commit: c55616cf936bd8497d9ff321c03f7379f5f78a0e branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-07T19:01:26Z summary: bpo-46216: remove spurious link to os.system() from os.time() documentation (GH-30326) (GH-30460) Automerge-Triggered-By: GH:iritkatriel (cherry picked from commit 9b7aa6a9d678ba798c57fa5bbc800014dfe4fb91) Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> files: M Doc/library/os.rst diff --git a/Doc/library/os.rst b/Doc/library/os.rst index d4cc296fbf89a..f8d567af48466 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -3998,20 +3998,20 @@ written in Python, such as a mail server's external command delivery program. Returns the current global process times. 
The return value is an object with five attributes: - * :attr:`user` - user time - * :attr:`system` - system time - * :attr:`children_user` - user time of all child processes - * :attr:`children_system` - system time of all child processes - * :attr:`elapsed` - elapsed real time since a fixed point in the past + * :attr:`!user` - user time + * :attr:`!system` - system time + * :attr:`!children_user` - user time of all child processes + * :attr:`!children_system` - system time of all child processes + * :attr:`!elapsed` - elapsed real time since a fixed point in the past For backwards compatibility, this object also behaves like a five-tuple - containing :attr:`user`, :attr:`system`, :attr:`children_user`, - :attr:`children_system`, and :attr:`elapsed` in that order. + containing :attr:`!user`, :attr:`!system`, :attr:`!children_user`, + :attr:`!children_system`, and :attr:`!elapsed` in that order. See the Unix manual page :manpage:`times(2)` and :manpage:`times(3)` manual page on Unix or `the GetProcessTimes MSDN `_ - on Windows. On Windows, only :attr:`user` and :attr:`system` are known; the other attributes are zero. + on Windows. On Windows, only :attr:`!user` and :attr:`!system` are known; the other attributes are zero. .. availability:: Unix, Windows. From webhook-mailer at python.org Fri Jan 7 14:01:38 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 19:01:38 -0000 Subject: [Python-checkins] [3.10] bpo-24650: Use full term "generator function" in yield expressions docs (GH-24663) (GH-30461) Message-ID: https://github.com/python/cpython/commit/75a1865d1ce352909ad9a30d001486bbd7d3ed75 commit: 75a1865d1ce352909ad9a30d001486bbd7d3ed75 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T11:01:33-08:00 summary: [3.10] bpo-24650: Use full term "generator function" in yield expressions docs (GH-24663) (GH-30461) (cherry picked from commit 273cb8e7577d143830404f6779946a0bedb58758) Co-authored-by: Jacob Walls Automerge-Triggered-By: GH:iritkatriel files: M Doc/reference/expressions.rst diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index c3f58fc841616..d4aae29725fb4 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -422,9 +422,9 @@ Yield expressions The yield expression is used when defining a :term:`generator` function or an :term:`asynchronous generator` function and thus can only be used in the body of a function definition. Using a yield -expression in a function's body causes that function to be a generator, +expression in a function's body causes that function to be a generator function, and using it in an :keyword:`async def` function's body causes that -coroutine function to be an asynchronous generator. For example:: +coroutine function to be an asynchronous generator function. 
For example:: def gen(): # defines a generator function yield 123 From webhook-mailer at python.org Fri Jan 7 14:41:40 2022 From: webhook-mailer at python.org (iritkatriel) Date: Fri, 07 Jan 2022 19:41:40 -0000 Subject: [Python-checkins] bpo-28546: [doc] Clarify setting pdb breakpoints (GH-30360) Message-ID: https://github.com/python/cpython/commit/6d07a9fb7cb31433c376a1aa20ea32001da0a418 commit: 6d07a9fb7cb31433c376a1aa20ea32001da0a418 branch: main author: Hugo van Kemenade committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-07T19:41:23Z summary: bpo-28546: [doc] Clarify setting pdb breakpoints (GH-30360) Co-authored-by: Ian Kelling files: M Doc/library/pdb.rst diff --git a/Doc/library/pdb.rst b/Doc/library/pdb.rst index 6d1dba1bf2eb0..ca59576336bf8 100644 --- a/Doc/library/pdb.rst +++ b/Doc/library/pdb.rst @@ -67,14 +67,13 @@ useful than quitting the debugger upon program's exit. before the first line of the module. -The typical usage to break into the debugger from a running program is to -insert :: +The typical usage to break into the debugger is to insert:: import pdb; pdb.set_trace() -at the location you want to break into the debugger. You can then step through -the code following this statement, and continue running without the debugger -using the :pdbcmd:`continue` command. +at the location you want to break into the debugger, and then run the program. +You can then step through the code following this statement, and continue +running without the debugger using the :pdbcmd:`continue` command. .. versionadded:: 3.7 The built-in :func:`breakpoint()`, when called with defaults, can be used From webhook-mailer at python.org Fri Jan 7 15:03:41 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 20:03:41 -0000 Subject: [Python-checkins] bpo-28546: [doc] Clarify setting pdb breakpoints (GH-30360) Message-ID: https://github.com/python/cpython/commit/ed2656a7d313dd9fa3a963d8b4dca4438cf73bc9 commit: ed2656a7d313dd9fa3a963d8b4dca4438cf73bc9 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T12:03:36-08:00 summary: bpo-28546: [doc] Clarify setting pdb breakpoints (GH-30360) Co-authored-by: Ian Kelling (cherry picked from commit 6d07a9fb7cb31433c376a1aa20ea32001da0a418) Co-authored-by: Hugo van Kemenade files: M Doc/library/pdb.rst diff --git a/Doc/library/pdb.rst b/Doc/library/pdb.rst index ed1e9712c0e3d..13e1a1993692d 100644 --- a/Doc/library/pdb.rst +++ b/Doc/library/pdb.rst @@ -67,14 +67,13 @@ useful than quitting the debugger upon program's exit. before the first line of the module. -The typical usage to break into the debugger from a running program is to -insert :: +The typical usage to break into the debugger is to insert:: import pdb; pdb.set_trace() -at the location you want to break into the debugger. You can then step through -the code following this statement, and continue running without the debugger -using the :pdbcmd:`continue` command. +at the location you want to break into the debugger, and then run the program. +You can then step through the code following this statement, and continue +running without the debugger using the :pdbcmd:`continue` command. .. 
versionadded:: 3.7 The built-in :func:`breakpoint()`, when called with defaults, can be used From webhook-mailer at python.org Fri Jan 7 15:06:52 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 20:06:52 -0000 Subject: [Python-checkins] bpo-28546: [doc] Clarify setting pdb breakpoints (GH-30360) Message-ID: https://github.com/python/cpython/commit/a74eb5465582dd6e194a84ce4c66c12453373304 commit: a74eb5465582dd6e194a84ce4c66c12453373304 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T12:06:43-08:00 summary: bpo-28546: [doc] Clarify setting pdb breakpoints (GH-30360) Co-authored-by: Ian Kelling (cherry picked from commit 6d07a9fb7cb31433c376a1aa20ea32001da0a418) Co-authored-by: Hugo van Kemenade files: M Doc/library/pdb.rst diff --git a/Doc/library/pdb.rst b/Doc/library/pdb.rst index ed1e9712c0e3d..13e1a1993692d 100644 --- a/Doc/library/pdb.rst +++ b/Doc/library/pdb.rst @@ -67,14 +67,13 @@ useful than quitting the debugger upon program's exit. before the first line of the module. -The typical usage to break into the debugger from a running program is to -insert :: +The typical usage to break into the debugger is to insert:: import pdb; pdb.set_trace() -at the location you want to break into the debugger. You can then step through -the code following this statement, and continue running without the debugger -using the :pdbcmd:`continue` command. +at the location you want to break into the debugger, and then run the program. +You can then step through the code following this statement, and continue +running without the debugger using the :pdbcmd:`continue` command. .. versionadded:: 3.7 The built-in :func:`breakpoint()`, when called with defaults, can be used From webhook-mailer at python.org Fri Jan 7 16:05:37 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 21:05:37 -0000 Subject: [Python-checkins] bpo-46289: Make conversion of FormattedValue not optional on ASDL (GH-30467) Message-ID: https://github.com/python/cpython/commit/d382f7ee0b98e4ab6ade9384268f25c06be462ad commit: d382f7ee0b98e4ab6ade9384268f25c06be462ad branch: main author: Batuhan Taskaya committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T13:05:28-08:00 summary: bpo-46289: Make conversion of FormattedValue not optional on ASDL (GH-30467) Automerge-Triggered-By: GH:isidentical files: A Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst M Parser/Python.asdl M Python/Python-ast.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst new file mode 100644 index 0000000000000..816ff585f14e6 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst @@ -0,0 +1,2 @@ +ASDL declaration of ``FormattedValue`` has changed to reflect ``conversion`` +field is not optional. diff --git a/Parser/Python.asdl b/Parser/Python.asdl index 4a61bda701b47..e9423a7c984f2 100644 --- a/Parser/Python.asdl +++ b/Parser/Python.asdl @@ -75,7 +75,7 @@ module Python -- x < 4 < 3 and (x < 4) < 3 | Compare(expr left, cmpop* ops, expr* comparators) | Call(expr func, expr* args, keyword* keywords) - | FormattedValue(expr value, int? conversion, expr? format_spec) + | FormattedValue(expr value, int conversion, expr? 
format_spec) | JoinedStr(expr* values) | Constant(constant value, string? kind) diff --git a/Python/Python-ast.c b/Python/Python-ast.c index 167018482077d..da79463375a1a 100644 --- a/Python/Python-ast.c +++ b/Python/Python-ast.c @@ -1324,7 +1324,7 @@ init_types(struct ast_state *state) " | YieldFrom(expr value)\n" " | Compare(expr left, cmpop* ops, expr* comparators)\n" " | Call(expr func, expr* args, keyword* keywords)\n" - " | FormattedValue(expr value, int? conversion, expr? format_spec)\n" + " | FormattedValue(expr value, int conversion, expr? format_spec)\n" " | JoinedStr(expr* values)\n" " | Constant(constant value, string? kind)\n" " | Attribute(expr value, identifier attr, expr_context ctx)\n" @@ -1414,11 +1414,8 @@ init_types(struct ast_state *state) state->FormattedValue_type = make_type(state, "FormattedValue", state->expr_type, FormattedValue_fields, 3, - "FormattedValue(expr value, int? conversion, expr? format_spec)"); + "FormattedValue(expr value, int conversion, expr? format_spec)"); if (!state->FormattedValue_type) return 0; - if (PyObject_SetAttr(state->FormattedValue_type, state->conversion, - Py_None) == -1) - return 0; if (PyObject_SetAttr(state->FormattedValue_type, state->format_spec, Py_None) == -1) return 0; @@ -9249,9 +9246,9 @@ obj2ast_expr(struct ast_state *state, PyObject* obj, expr_ty* out, PyArena* if (_PyObject_LookupAttr(obj, state->conversion, &tmp) < 0) { return 1; } - if (tmp == NULL || tmp == Py_None) { - Py_CLEAR(tmp); - conversion = 0; + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"conversion\" missing from FormattedValue"); + return 1; } else { int res; From webhook-mailer at python.org Fri Jan 7 16:44:28 2022 From: webhook-mailer at python.org (ethanfurman) Date: Fri, 07 Jan 2022 21:44:28 -0000 Subject: [Python-checkins] bpo-46296: [Enum] add a test for missing `value` recovery (GH-30458) Message-ID: https://github.com/python/cpython/commit/74d1663580d1914bd110c3ab7282451f5e2cd2b5 commit: 74d1663580d1914bd110c3ab7282451f5e2cd2b5 branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-07T13:44:21-08:00 summary: bpo-46296: [Enum] add a test for missing `value` recovery (GH-30458) In `__set_name__` there is a check for the `_value_` attribute and an attempt to add it if missing; this adds a test to cover the case for simple enums with a custom `__new__` method. files: A Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst M Lib/test/test_enum.py diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 51a31e5ebf807..2b3eac56865b1 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -1022,6 +1022,16 @@ def repr(self): class Huh(MyStr, MyInt, Enum): One = 1 + def test_value_auto_assign(self): + class Some(Enum): + def __new__(cls, val): + return object.__new__(cls) + x = 1 + y = 2 + + self.assertEqual(Some.x.value, 1) + self.assertEqual(Some.y.value, 2) + def test_hash(self): Season = self.Season dates = {} diff --git a/Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst b/Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst new file mode 100644 index 0000000000000..9e0d470e269cb --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst @@ -0,0 +1,2 @@ +Add a test case for :mod:`enum` +with ``_use_args_ == True`` and ``_member_type_ == object``. 
From webhook-mailer at python.org Fri Jan 7 17:15:33 2022 From: webhook-mailer at python.org (vsajip) Date: Fri, 07 Jan 2022 22:15:33 -0000 Subject: [Python-checkins] [3.10] bpo-42378: fixed log truncation on logging shutdown (GH-27310) (GH-30468) Message-ID: https://github.com/python/cpython/commit/e35430bec528dfb1a653cd457ea58b5a08543632 commit: e35430bec528dfb1a653cd457ea58b5a08543632 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vsajip date: 2022-01-07T22:15:25Z summary: [3.10] bpo-42378: fixed log truncation on logging shutdown (GH-27310) (GH-30468) Co-authored-by: andrei kulakov files: A Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst M Doc/library/logging.handlers.rst M Lib/logging/__init__.py M Lib/test/test_logging.py diff --git a/Doc/library/logging.handlers.rst b/Doc/library/logging.handlers.rst index a664efdc62526..a5b181ee612d5 100644 --- a/Doc/library/logging.handlers.rst +++ b/Doc/library/logging.handlers.rst @@ -117,6 +117,9 @@ sends logging output to a disk file. It inherits the output functionality from Outputs the record to the file. + Note that if the file was closed due to logging shutdown at exit and the file + mode is 'w', the record will not be emitted (see :issue:`42378`). + .. _null-handler: diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py index 555f598de7a92..19bd2bc20b250 100644 --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -878,6 +878,7 @@ def __init__(self, level=NOTSET): self._name = None self.level = _checkLevel(level) self.formatter = None + self._closed = False # Add the handler to the global _handlerList (for cleanup on shutdown) _addHandlerRef(self) self.createLock() @@ -996,6 +997,7 @@ def close(self): #get the module data lock, as we're updating a shared structure. _acquireLock() try: #unlikely to raise an exception, but you never know... + self._closed = True if self._name and self._name in _handlers: del _handlers[self._name] finally: @@ -1184,6 +1186,8 @@ def close(self): finally: # Issue #19523: call unconditionally to # prevent a handler leak when delay is set + # Also see Issue #42378: we also rely on + # self._closed being set to True there StreamHandler.close(self) finally: self.release() @@ -1203,10 +1207,15 @@ def emit(self, record): If the stream was not opened because 'delay' was specified in the constructor, open it before calling the superclass's emit. + + If stream is not open, current mode is 'w' and `_closed=True`, record + will not be emitted (see Issue #42378). 
""" if self.stream is None: - self.stream = self._open() - StreamHandler.emit(self, record) + if self.mode != 'w' or not self._closed: + self.stream = self._open() + if self.stream: + StreamHandler.emit(self, record) def __repr__(self): level = getLevelName(self.level) diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index 40411b4488483..03d0319306a48 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -5168,6 +5168,9 @@ def assertLogFile(self, filename): msg="Log file %r does not exist" % filename) self.rmfiles.append(filename) + def next_rec(self): + return logging.LogRecord('n', logging.DEBUG, 'p', 1, + self.next_message(), None, None, None) class FileHandlerTest(BaseFileTest): def test_delay(self): @@ -5180,11 +5183,18 @@ def test_delay(self): self.assertTrue(os.path.exists(self.fn)) fh.close() -class RotatingFileHandlerTest(BaseFileTest): - def next_rec(self): - return logging.LogRecord('n', logging.DEBUG, 'p', 1, - self.next_message(), None, None, None) + def test_emit_after_closing_in_write_mode(self): + # Issue #42378 + os.unlink(self.fn) + fh = logging.FileHandler(self.fn, encoding='utf-8', mode='w') + fh.setFormatter(logging.Formatter('%(message)s')) + fh.emit(self.next_rec()) # '1' + fh.close() + fh.emit(self.next_rec()) # '2' + with open(self.fn) as fp: + self.assertEqual(fp.read().strip(), '1') +class RotatingFileHandlerTest(BaseFileTest): def test_should_not_rollover(self): # If maxbytes is zero rollover never occurs rh = logging.handlers.RotatingFileHandler( diff --git a/Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst b/Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst new file mode 100644 index 0000000000000..90c3961dc87d8 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst @@ -0,0 +1,4 @@ +Fixes the issue with log file being overwritten when +:class:`logging.FileHandler` is used in :mod:`atexit` with *filemode* set to +``'w'``. Note this will cause the message in *atexit* not being logged if +the log stream is already closed due to shutdown of logging. 
From webhook-mailer at python.org Fri Jan 7 17:26:10 2022 From: webhook-mailer at python.org (zooba) Date: Fri, 07 Jan 2022 22:26:10 -0000 Subject: [Python-checkins] bpo-46297: Fix interpreter crash on startup with multiple PythonPaths set in registry (GH-30466) Message-ID: https://github.com/python/cpython/commit/c9dc1f491e8edb0bc433cde73190a3015d226891 commit: c9dc1f491e8edb0bc433cde73190a3015d226891 branch: main author: Daniel committer: zooba date: 2022-01-07T22:26:00Z summary: bpo-46297: Fix interpreter crash on startup with multiple PythonPaths set in registry (GH-30466) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst M Lib/test/test_getpath.py M Misc/ACKS M Modules/getpath.py diff --git a/Lib/test/test_getpath.py b/Lib/test/test_getpath.py index 232b6805284354..1a336a4abcafdb 100644 --- a/Lib/test/test_getpath.py +++ b/Lib/test/test_getpath.py @@ -734,12 +734,15 @@ def EnumKey(self, hkey, i): return n.removeprefix(prefix) raise OSError("end of enumeration") - def QueryValue(self, hkey): + def QueryValue(self, hkey, subkey): if verbose: - print(f"QueryValue({hkey})") + print(f"QueryValue({hkey}, {subkey})") hkey = hkey.casefold() if hkey not in self.open: raise RuntimeError("key is not open") + if subkey: + subkey = subkey.casefold() + hkey = f'{hkey}\\{subkey}' try: return self.keys[hkey] except KeyError: diff --git a/Misc/ACKS b/Misc/ACKS index 8baaf7304b603c..7f2e94dfa615f1 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -400,6 +400,7 @@ Lars Damerow Evan Dandrea Eric Daniel Scott David Daniels +Derzsi Dániel Lawrence D'Anna Ben Darnell Kushal Das diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst new file mode 100644 index 00000000000000..558d2392d61029 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst @@ -0,0 +1,2 @@ +Fixed an interpreter crash on bootup with multiple PythonPaths set in +the Windows registry. Patch by Derzsi Dániel. diff --git a/Modules/getpath.py b/Modules/getpath.py index 37d2ea03b0bbd2..6f2e0385577221 100644 --- a/Modules/getpath.py +++ b/Modules/getpath.py @@ -127,7 +127,7 @@ # checked by looking for the BUILDDIR_TXT file, which contains the # relative path to the platlib dir. The executable_dir value is # derived from joining the VPATH preprocessor variable to the -# directory containing pybuilddir.txt. If it is not found, the +# directory containing pybuilddir.txt. If it is not found, the # BUILD_LANDMARK file is found, which is part of the source tree. # prefix is then found by searching up for a file that should only # exist in the source tree, and the stdlib dir is set to prefix/Lib. 
@@ -642,19 +642,12 @@ def search_up(prefix, *landmarks, test=isfile): i = 0 while True: try: - keyname = winreg.EnumKey(key, i) - subkey = winreg.OpenKeyEx(key, keyname) - if not subkey: - continue - try: - v = winreg.QueryValue(subkey) - finally: - winreg.CloseKey(subkey) - if isinstance(v, str): - pythonpath.append(v) - i += 1 + v = winreg.QueryValue(key, winreg.EnumKey(key, i)) except OSError: break + if isinstance(v, str): + pythonpath.append(v) + i += 1 finally: winreg.CloseKey(key) except OSError: From webhook-mailer at python.org Fri Jan 7 17:30:26 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 07 Jan 2022 22:30:26 -0000 Subject: [Python-checkins] bpo-46289: Make conversion of FormattedValue not optional on ASDL (GH-30467) Message-ID: https://github.com/python/cpython/commit/bea3f42bb7c360921f864949ef7472a7ecb02cd3 commit: bea3f42bb7c360921f864949ef7472a7ecb02cd3 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T14:30:18-08:00 summary: bpo-46289: Make conversion of FormattedValue not optional on ASDL (GH-30467) Automerge-Triggered-By: GH:isidentical (cherry picked from commit d382f7ee0b98e4ab6ade9384268f25c06be462ad) Co-authored-by: Batuhan Taskaya files: A Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst M Parser/Python.asdl M Python/Python-ast.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst new file mode 100644 index 0000000000000..816ff585f14e6 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst @@ -0,0 +1,2 @@ +ASDL declaration of ``FormattedValue`` has changed to reflect ``conversion`` +field is not optional. diff --git a/Parser/Python.asdl b/Parser/Python.asdl index 85225fc88c5a5..32fdc01a7e0e6 100644 --- a/Parser/Python.asdl +++ b/Parser/Python.asdl @@ -74,7 +74,7 @@ module Python -- x < 4 < 3 and (x < 4) < 3 | Compare(expr left, cmpop* ops, expr* comparators) | Call(expr func, expr* args, keyword* keywords) - | FormattedValue(expr value, int? conversion, expr? format_spec) + | FormattedValue(expr value, int conversion, expr? format_spec) | JoinedStr(expr* values) | Constant(constant value, string? kind) diff --git a/Python/Python-ast.c b/Python/Python-ast.c index ce6e6a93ea70f..2f84cad7749dd 100644 --- a/Python/Python-ast.c +++ b/Python/Python-ast.c @@ -1312,7 +1312,7 @@ init_types(struct ast_state *state) " | YieldFrom(expr value)\n" " | Compare(expr left, cmpop* ops, expr* comparators)\n" " | Call(expr func, expr* args, keyword* keywords)\n" - " | FormattedValue(expr value, int? conversion, expr? format_spec)\n" + " | FormattedValue(expr value, int conversion, expr? format_spec)\n" " | JoinedStr(expr* values)\n" " | Constant(constant value, string? kind)\n" " | Attribute(expr value, identifier attr, expr_context ctx)\n" @@ -1402,11 +1402,8 @@ init_types(struct ast_state *state) state->FormattedValue_type = make_type(state, "FormattedValue", state->expr_type, FormattedValue_fields, 3, - "FormattedValue(expr value, int? conversion, expr? format_spec)"); + "FormattedValue(expr value, int conversion, expr? 
format_spec)"); if (!state->FormattedValue_type) return 0; - if (PyObject_SetAttr(state->FormattedValue_type, state->conversion, - Py_None) == -1) - return 0; if (PyObject_SetAttr(state->FormattedValue_type, state->format_spec, Py_None) == -1) return 0; @@ -9023,9 +9020,9 @@ obj2ast_expr(struct ast_state *state, PyObject* obj, expr_ty* out, PyArena* if (_PyObject_LookupAttr(obj, state->conversion, &tmp) < 0) { return 1; } - if (tmp == NULL || tmp == Py_None) { - Py_CLEAR(tmp); - conversion = 0; + if (tmp == NULL) { + PyErr_SetString(PyExc_TypeError, "required field \"conversion\" missing from FormattedValue"); + return 1; } else { int res; From webhook-mailer at python.org Fri Jan 7 19:07:01 2022 From: webhook-mailer at python.org (zooba) Date: Sat, 08 Jan 2022 00:07:01 -0000 Subject: [Python-checkins] bpo-46217: Revert use of Windows constant that is newer than what we support (GH-30473) Message-ID: https://github.com/python/cpython/commit/d81182b8ec3b1593daf241d44757a9fa68fd14cc commit: d81182b8ec3b1593daf241d44757a9fa68fd14cc branch: main author: Steve Dower committer: zooba date: 2022-01-08T00:06:53Z summary: bpo-46217: Revert use of Windows constant that is newer than what we support (GH-30473) files: A Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst M Python/fileutils.c diff --git a/Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst b/Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst new file mode 100644 index 0000000000000..78b3cd01a03f6 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst @@ -0,0 +1,2 @@ +Removed parameter that is unsupported on Windows 8.1 and early Windows 10 +and may have caused build or runtime failures. diff --git a/Python/fileutils.c b/Python/fileutils.c index d570be5577ae9..151c6feb2ebe1 100644 --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -2129,7 +2129,7 @@ join_relfile(wchar_t *buffer, size_t bufsize, { #ifdef MS_WINDOWS if (FAILED(PathCchCombineEx(buffer, bufsize, dirname, relfile, - PATHCCH_ALLOW_LONG_PATHS | PATHCCH_FORCE_ENABLE_LONG_NAME_PROCESS))) { + PATHCCH_ALLOW_LONG_PATHS))) { return -1; } #else From webhook-mailer at python.org Fri Jan 7 19:23:48 2022 From: webhook-mailer at python.org (pablogsal) Date: Sat, 08 Jan 2022 00:23:48 -0000 Subject: [Python-checkins] bpo-46237: Fix the line number of tokenizer errors inside f-strings (GH-30463) Message-ID: https://github.com/python/cpython/commit/6fa8b2ceee38187b0ae96aee12fe4f0a5c8a2ce7 commit: 6fa8b2ceee38187b0ae96aee12fe4f0a5c8a2ce7 branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-08T00:23:40Z summary: bpo-46237: Fix the line number of tokenizer errors inside f-strings (GH-30463) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst M Lib/test/test_exceptions.py M Parser/pegen.c M Parser/string_parser.c diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index c04b57f5630ab..e4d685f4154ed 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -268,6 +268,18 @@ def baz(): check("(1+)", 1, 4) check("[interesting\nfoo()\n", 1, 1) check(b"\xef\xbb\xbf#coding: utf8\nprint('\xe6\x88\x91')\n", 0, -1) + check("""f''' + { + (123_a) + }'''""", 3, 17) + check("""f''' + { + f\"\"\" + { + (123_a) + } + \"\"\" + }'''""", 5, 17) # Errors thrown by symtable.c check('x = [(yield i) for i in range(3)]', 1, 7) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst b/Misc/NEWS.d/next/Core and 
Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst new file mode 100644 index 0000000000000..931a2603293c3 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst @@ -0,0 +1,2 @@ +Fix the line number of tokenizer errors inside f-strings. Patch by Pablo +Galindo. diff --git a/Parser/pegen.c b/Parser/pegen.c index cfea1c87199b2..470c2cbd7438b 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -179,10 +179,10 @@ initialize_token(Parser *p, Token *token, const char *start, const char *end, in int col_offset = (start != NULL && start >= line_start) ? (int)(start - line_start) : -1; int end_col_offset = (end != NULL && end >= p->tok->line_start) ? (int)(end - p->tok->line_start) : -1; - token->lineno = p->starting_lineno + lineno; - token->col_offset = p->tok->lineno == 1 ? p->starting_col_offset + col_offset : col_offset; - token->end_lineno = p->starting_lineno + end_lineno; - token->end_col_offset = p->tok->lineno == 1 ? p->starting_col_offset + end_col_offset : end_col_offset; + token->lineno = lineno; + token->col_offset = p->tok->lineno == p->starting_lineno ? p->starting_col_offset + col_offset : col_offset; + token->end_lineno = end_lineno; + token->end_col_offset = p->tok->lineno == p->starting_lineno ? p->starting_col_offset + end_col_offset : end_col_offset; p->fill += 1; diff --git a/Parser/string_parser.c b/Parser/string_parser.c index c6fe99c885d69..57d9b9ed3fdbb 100644 --- a/Parser/string_parser.c +++ b/Parser/string_parser.c @@ -392,11 +392,14 @@ fstring_compile_expr(Parser *p, const char *expr_start, const char *expr_end, return NULL; } Py_INCREF(p->tok->filename); + tok->filename = p->tok->filename; + tok->lineno = t->lineno + lines - 1; Parser *p2 = _PyPegen_Parser_New(tok, Py_fstring_input, p->flags, p->feature_version, NULL, p->arena); - p2->starting_lineno = t->lineno + lines - 1; + + p2->starting_lineno = t->lineno + lines; p2->starting_col_offset = t->col_offset + cols; expr = _PyPegen_run_parser(p2); From webhook-mailer at python.org Fri Jan 7 22:48:21 2022 From: webhook-mailer at python.org (tim-one) Date: Sat, 08 Jan 2022 03:48:21 -0000 Subject: [Python-checkins] bpo-46235: Do all ref-counting at once during list/tuple multiplication (GH-30346) Message-ID: https://github.com/python/cpython/commit/ad1d5908ada171eff768291371a80022bfad4f04 commit: ad1d5908ada171eff768291371a80022bfad4f04 branch: main author: Dennis Sweeney <36520290+sweeneyde at users.noreply.github.com> committer: tim-one date: 2022-01-07T21:47:58-06:00 summary: bpo-46235: Do all ref-counting at once during list/tuple multiplication (GH-30346) When multiplying lists and tuples by `n`, increment each element's refcount, by `n`, just once. Saves `n-1` increments per element, and allows for a leaner & faster copying loop. Code by sweeneyde (Dennis Sweeney). files: A Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst M Objects/listobject.c M Objects/tupleobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst new file mode 100644 index 0000000000000..9115c9d70a331 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst @@ -0,0 +1 @@ +Certain sequence multiplication operations like ``[0] * 1_000`` are now faster due to reference-counting optimizations. Patch by Dennis Sweeney. 
\ No newline at end of file diff --git a/Objects/listobject.c b/Objects/listobject.c index e7023fb9eb1d2..29f5d70f1dbd3 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -553,11 +553,8 @@ list_concat(PyListObject *a, PyObject *bb) static PyObject * list_repeat(PyListObject *a, Py_ssize_t n) { - Py_ssize_t i, j; Py_ssize_t size; PyListObject *np; - PyObject **p, **items; - PyObject *elem; if (n < 0) n = 0; if (n > 0 && Py_SIZE(a) > PY_SSIZE_T_MAX / n) @@ -568,24 +565,32 @@ list_repeat(PyListObject *a, Py_ssize_t n) np = (PyListObject *) list_new_prealloc(size); if (np == NULL) return NULL; - + PyObject **dest = np->ob_item; + PyObject **dest_end = dest + size; if (Py_SIZE(a) == 1) { - items = np->ob_item; - elem = a->ob_item[0]; - for (i = 0; i < n; i++) { - items[i] = elem; - Py_INCREF(elem); + PyObject *elem = a->ob_item[0]; + Py_SET_REFCNT(elem, Py_REFCNT(elem) + n); +#ifdef Py_REF_DEBUG + _Py_RefTotal += n; +#endif + while (dest < dest_end) { + *dest++ = elem; } } else { - p = np->ob_item; - items = a->ob_item; - for (i = 0; i < n; i++) { - for (j = 0; j < Py_SIZE(a); j++) { - *p = items[j]; - Py_INCREF(*p); - p++; - } + PyObject **src = a->ob_item; + PyObject **src_end = src + Py_SIZE(a); + while (src < src_end) { + Py_SET_REFCNT(*src, Py_REFCNT(*src) + n); +#ifdef Py_REF_DEBUG + _Py_RefTotal += n; +#endif + *dest++ = *src++; + } + // Now src chases after dest in the same buffer + src = np->ob_item; + while (dest < dest_end) { + *dest++ = *src++; } } Py_SET_SIZE(np, size); diff --git a/Objects/tupleobject.c b/Objects/tupleobject.c index cb34c5eb15e4c..2051c1812efe2 100644 --- a/Objects/tupleobject.c +++ b/Objects/tupleobject.c @@ -589,10 +589,8 @@ tupleconcat(PyTupleObject *a, PyObject *bb) static PyObject * tuplerepeat(PyTupleObject *a, Py_ssize_t n) { - Py_ssize_t i, j; Py_ssize_t size; PyTupleObject *np; - PyObject **p, **items; if (Py_SIZE(a) == 0 || n == 1) { if (PyTuple_CheckExact(a)) { /* Since tuples are immutable, we can return a shared @@ -610,13 +608,32 @@ tuplerepeat(PyTupleObject *a, Py_ssize_t n) np = tuple_alloc(size); if (np == NULL) return NULL; - p = np->ob_item; - items = a->ob_item; - for (i = 0; i < n; i++) { - for (j = 0; j < Py_SIZE(a); j++) { - *p = items[j]; - Py_INCREF(*p); - p++; + PyObject **dest = np->ob_item; + PyObject **dest_end = dest + size; + if (Py_SIZE(a) == 1) { + PyObject *elem = a->ob_item[0]; + Py_SET_REFCNT(elem, Py_REFCNT(elem) + n); +#ifdef Py_REF_DEBUG + _Py_RefTotal += n; +#endif + while (dest < dest_end) { + *dest++ = elem; + } + } + else { + PyObject **src = a->ob_item; + PyObject **src_end = src + Py_SIZE(a); + while (src < src_end) { + Py_SET_REFCNT(*src, Py_REFCNT(*src) + n); +#ifdef Py_REF_DEBUG + _Py_RefTotal += n; +#endif + *dest++ = *src++; + } + // Now src chases after dest in the same buffer + src = np->ob_item; + while (dest < dest_end) { + *dest++ = *src++; } } _PyObject_GC_TRACK(np); From webhook-mailer at python.org Fri Jan 7 22:51:56 2022 From: webhook-mailer at python.org (corona10) Date: Sat, 08 Jan 2022 03:51:56 -0000 Subject: [Python-checkins] bpo-46299: improve `test_descr.py` with stricter error handling (GH-30471) Message-ID: https://github.com/python/cpython/commit/e63066cfed27511c9b786d61761f87f7a532571a commit: e63066cfed27511c9b786d61761f87f7a532571a branch: main author: Nikita Sobolev committer: corona10 date: 2022-01-08T12:51:51+09:00 summary: bpo-46299: improve `test_descr.py` with stricter error handling (GH-30471) files: M Lib/test/test_descr.py diff --git a/Lib/test/test_descr.py 
b/Lib/test/test_descr.py index a4131bec602ea..e8ecacb5c4d76 100644 --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -2545,10 +2545,8 @@ def getdict(self): m2instance.b = 2 m2instance.a = 1 self.assertEqual(m2instance.__dict__, "Not a dict!") - try: + with self.assertRaises(TypeError): dir(m2instance) - except TypeError: - pass # Two essentially featureless objects, (Ellipsis just inherits stuff # from object. @@ -4066,7 +4064,7 @@ class D(C): except TypeError: pass else: - assert 0, "best_base calculation found wanting" + self.fail("best_base calculation found wanting") def test_unsubclassable_types(self): with self.assertRaises(TypeError): @@ -4452,6 +4450,8 @@ def __getattr__(self, attr): print("Oops!") except RuntimeError: pass + else: + self.fail("Didn't raise RuntimeError") finally: sys.stdout = test_stdout From webhook-mailer at python.org Sat Jan 8 00:26:23 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 08 Jan 2022 05:26:23 -0000 Subject: [Python-checkins] bpo-46299: improve `test_descr.py` with stricter error handling (GH-30471) Message-ID: https://github.com/python/cpython/commit/566d70a8d1c1afb8e770068f1686f762a1e343b9 commit: 566d70a8d1c1afb8e770068f1686f762a1e343b9 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T21:26:11-08:00 summary: bpo-46299: improve `test_descr.py` with stricter error handling (GH-30471) (cherry picked from commit e63066cfed27511c9b786d61761f87f7a532571a) Co-authored-by: Nikita Sobolev files: M Lib/test/test_descr.py diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py index 0e6a784c6c14b..e5e9f4939699e 100644 --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -2531,10 +2531,8 @@ def getdict(self): m2instance.b = 2 m2instance.a = 1 self.assertEqual(m2instance.__dict__, "Not a dict!") - try: + with self.assertRaises(TypeError): dir(m2instance) - except TypeError: - pass # Two essentially featureless objects, (Ellipsis just inherits stuff # from object. 
@@ -4013,7 +4011,7 @@ class D(C): except TypeError: pass else: - assert 0, "best_base calculation found wanting" + self.fail("best_base calculation found wanting") def test_unsubclassable_types(self): with self.assertRaises(TypeError): @@ -4399,6 +4397,8 @@ def __getattr__(self, attr): print("Oops!") except RuntimeError: pass + else: + self.fail("Didn't raise RuntimeError") finally: sys.stdout = test_stdout From webhook-mailer at python.org Sat Jan 8 00:26:23 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 08 Jan 2022 05:26:23 -0000 Subject: [Python-checkins] bpo-46299: improve `test_descr.py` with stricter error handling (GH-30471) Message-ID: https://github.com/python/cpython/commit/d2245cf190c36a6d74fe947bf133ce09d3313a6f commit: d2245cf190c36a6d74fe947bf133ce09d3313a6f branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-07T21:26:01-08:00 summary: bpo-46299: improve `test_descr.py` with stricter error handling (GH-30471) (cherry picked from commit e63066cfed27511c9b786d61761f87f7a532571a) Co-authored-by: Nikita Sobolev files: M Lib/test/test_descr.py diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py index 3df69ba11407a..f3dd1b32e2afa 100644 --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -2545,10 +2545,8 @@ def getdict(self): m2instance.b = 2 m2instance.a = 1 self.assertEqual(m2instance.__dict__, "Not a dict!") - try: + with self.assertRaises(TypeError): dir(m2instance) - except TypeError: - pass # Two essentially featureless objects, (Ellipsis just inherits stuff # from object. @@ -4062,7 +4060,7 @@ class D(C): except TypeError: pass else: - assert 0, "best_base calculation found wanting" + self.fail("best_base calculation found wanting") def test_unsubclassable_types(self): with self.assertRaises(TypeError): @@ -4448,6 +4446,8 @@ def __getattr__(self, attr): print("Oops!") except RuntimeError: pass + else: + self.fail("Didn't raise RuntimeError") finally: sys.stdout = test_stdout From webhook-mailer at python.org Sat Jan 8 03:14:52 2022 From: webhook-mailer at python.org (corona10) Date: Sat, 08 Jan 2022 08:14:52 -0000 Subject: [Python-checkins] bpo-46299: Improve test_descr (GH-30475) Message-ID: https://github.com/python/cpython/commit/45d44b950f1dab0ef90d0a8f4fa75ffaae71500b commit: 45d44b950f1dab0ef90d0a8f4fa75ffaae71500b branch: main author: Dong-hee Na committer: corona10 date: 2022-01-08T17:14:40+09:00 summary: bpo-46299: Improve test_descr (GH-30475) files: M Lib/test/test_descr.py diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py index e8ecacb5c4d76..707c93140e251 100644 --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -13,6 +13,7 @@ import weakref from copy import deepcopy +from contextlib import redirect_stdout from test import support try: @@ -1445,12 +1446,9 @@ def mysetattr(self, name, value): raise AttributeError return object.__setattr__(self, name, value) C.__setattr__ = mysetattr - try: + with self.assertRaises(AttributeError): a.spam = "not spam" - except AttributeError: - pass - else: - self.fail("expected AttributeError") + self.assertEqual(a.spam, "spam") class D(C): pass @@ -2431,12 +2429,8 @@ def test_dict_constructors(self): else: self.fail("no TypeError from dict(%r)" % badarg) - try: + with self.assertRaises(TypeError): dict({}, {}) - except TypeError: - pass - else: - self.fail("no TypeError from dict({}, {})") class Mapping: # Lacks a .keys() method; will 
be added later. @@ -3589,12 +3583,8 @@ class A(object): pass A.__call__ = A() - try: + with self.assertRaises(RecursionError): A()() - except RecursionError: - pass - else: - self.fail("Recursion limit should have been reached for __call__()") def test_delete_hook(self): # Testing __del__ hook... @@ -4440,20 +4430,14 @@ def test_wrapper_segfault(self): def test_file_fault(self): # Testing sys.stdout is changed in getattr... - test_stdout = sys.stdout class StdoutGuard: def __getattr__(self, attr): sys.stdout = sys.__stdout__ - raise RuntimeError("Premature access to sys.stdout.%s" % attr) - sys.stdout = StdoutGuard() - try: - print("Oops!") - except RuntimeError: - pass - else: - self.fail("Didn't raise RuntimeError") - finally: - sys.stdout = test_stdout + raise RuntimeError(f"Premature access to sys.stdout.{attr}") + + with redirect_stdout(StdoutGuard()): + with self.assertRaises(RuntimeError): + print("Oops!") def test_vicious_descriptor_nonsense(self): # Testing vicious_descriptor_nonsense... From webhook-mailer at python.org Sat Jan 8 05:56:42 2022 From: webhook-mailer at python.org (taleinat) Date: Sat, 08 Jan 2022 10:56:42 -0000 Subject: [Python-checkins] bpo-46290: Fix parameter names in dataclasses docs (GH-30450) Message-ID: https://github.com/python/cpython/commit/ef5376e69e72fa922d7f1b3df47b99d3576f9df1 commit: ef5376e69e72fa922d7f1b3df47b99d3576f9df1 branch: main author: Zsolt Dollenstein committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-08T12:56:35+02:00 summary: bpo-46290: Fix parameter names in dataclasses docs (GH-30450) files: M Doc/library/dataclasses.rst diff --git a/Doc/library/dataclasses.rst b/Doc/library/dataclasses.rst index 6bc493c957753..0bac594f05617 100644 --- a/Doc/library/dataclasses.rst +++ b/Doc/library/dataclasses.rst @@ -319,9 +319,9 @@ Module contents Raises :exc:`TypeError` if not passed a dataclass or instance of one. Does not return pseudo-fields which are ``ClassVar`` or ``InitVar``. -.. function:: asdict(instance, *, dict_factory=dict) +.. function:: asdict(obj, *, dict_factory=dict) - Converts the dataclass ``instance`` to a dict (by using the + Converts the dataclass ``obj`` to a dict (by using the factory function ``dict_factory``). Each dataclass is converted to a dict of its fields, as ``name: value`` pairs. dataclasses, dicts, lists, and tuples are recursed into. Other objects are copied with @@ -346,14 +346,14 @@ Module contents To create a shallow copy, the following workaround may be used:: - dict((field.name, getattr(instance, field.name)) for field in fields(instance)) + dict((field.name, getattr(obj, field.name)) for field in fields(obj)) - :func:`asdict` raises :exc:`TypeError` if ``instance`` is not a dataclass + :func:`asdict` raises :exc:`TypeError` if ``obj`` is not a dataclass instance. -.. function:: astuple(instance, *, tuple_factory=tuple) +.. function:: astuple(obj, *, tuple_factory=tuple) - Converts the dataclass ``instance`` to a tuple (by using the + Converts the dataclass ``obj`` to a tuple (by using the factory function ``tuple_factory``). Each dataclass is converted to a tuple of its field values. dataclasses, dicts, lists, and tuples are recursed into. 
Other objects are copied with @@ -366,9 +366,9 @@ Module contents To create a shallow copy, the following workaround may be used:: - tuple(getattr(instance, field.name) for field in dataclasses.fields(instance)) + tuple(getattr(obj, field.name) for field in dataclasses.fields(obj)) - :func:`astuple` raises :exc:`TypeError` if ``instance`` is not a dataclass + :func:`astuple` raises :exc:`TypeError` if ``obj`` is not a dataclass instance. .. function:: make_dataclass(cls_name, fields, *, bases=(), namespace=None, init=True, repr=True, eq=True, order=False, unsafe_hash=False, frozen=False, match_args=True, kw_only=False, slots=False) @@ -406,10 +406,10 @@ Module contents def add_one(self): return self.x + 1 -.. function:: replace(instance, /, **changes) +.. function:: replace(obj, /, **changes) - Creates a new object of the same type as ``instance``, replacing - fields with values from ``changes``. If ``instance`` is not a Data + Creates a new object of the same type as ``obj``, replacing + fields with values from ``changes``. If ``obj`` is not a Data Class, raises :exc:`TypeError`. If values in ``changes`` do not specify fields, raises :exc:`TypeError`. @@ -434,7 +434,7 @@ Module contents ``replace()`` (or similarly named) method which handles instance copying. -.. function:: is_dataclass(class_or_instance) +.. function:: is_dataclass(obj) Return ``True`` if its parameter is a dataclass or an instance of one, otherwise return ``False``. From webhook-mailer at python.org Sat Jan 8 10:09:46 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 08 Jan 2022 15:09:46 -0000 Subject: [Python-checkins] bpo-46290: Fix parameter names in dataclasses docs (GH-30450) Message-ID: https://github.com/python/cpython/commit/8bef658668bac923166ae160c79720aed5f3b712 commit: 8bef658668bac923166ae160c79720aed5f3b712 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-08T07:09:40-08:00 summary: bpo-46290: Fix parameter names in dataclasses docs (GH-30450) (cherry picked from commit ef5376e69e72fa922d7f1b3df47b99d3576f9df1) Co-authored-by: Zsolt Dollenstein files: M Doc/library/dataclasses.rst diff --git a/Doc/library/dataclasses.rst b/Doc/library/dataclasses.rst index 7f9ffcb6137b4..6a9863cf56a6d 100644 --- a/Doc/library/dataclasses.rst +++ b/Doc/library/dataclasses.rst @@ -319,9 +319,9 @@ Module contents Raises :exc:`TypeError` if not passed a dataclass or instance of one. Does not return pseudo-fields which are ``ClassVar`` or ``InitVar``. -.. function:: asdict(instance, *, dict_factory=dict) +.. function:: asdict(obj, *, dict_factory=dict) - Converts the dataclass ``instance`` to a dict (by using the + Converts the dataclass ``obj`` to a dict (by using the factory function ``dict_factory``). Each dataclass is converted to a dict of its fields, as ``name: value`` pairs. dataclasses, dicts, lists, and tuples are recursed into. Other objects are copied with @@ -346,14 +346,14 @@ Module contents To create a shallow copy, the following workaround may be used:: - dict((field.name, getattr(instance, field.name)) for field in fields(instance)) + dict((field.name, getattr(obj, field.name)) for field in fields(obj)) - :func:`asdict` raises :exc:`TypeError` if ``instance`` is not a dataclass + :func:`asdict` raises :exc:`TypeError` if ``obj`` is not a dataclass instance. -.. function:: astuple(instance, *, tuple_factory=tuple) +.. 
function:: astuple(obj, *, tuple_factory=tuple) - Converts the dataclass ``instance`` to a tuple (by using the + Converts the dataclass ``obj`` to a tuple (by using the factory function ``tuple_factory``). Each dataclass is converted to a tuple of its field values. dataclasses, dicts, lists, and tuples are recursed into. Other objects are copied with @@ -366,9 +366,9 @@ Module contents To create a shallow copy, the following workaround may be used:: - tuple(getattr(instance, field.name) for field in dataclasses.fields(instance)) + tuple(getattr(obj, field.name) for field in dataclasses.fields(obj)) - :func:`astuple` raises :exc:`TypeError` if ``instance`` is not a dataclass + :func:`astuple` raises :exc:`TypeError` if ``obj`` is not a dataclass instance. .. function:: make_dataclass(cls_name, fields, *, bases=(), namespace=None, init=True, repr=True, eq=True, order=False, unsafe_hash=False, frozen=False, match_args=True, kw_only=False, slots=False) @@ -406,10 +406,10 @@ Module contents def add_one(self): return self.x + 1 -.. function:: replace(instance, /, **changes) +.. function:: replace(obj, /, **changes) - Creates a new object of the same type as ``instance``, replacing - fields with values from ``changes``. If ``instance`` is not a Data + Creates a new object of the same type as ``obj``, replacing + fields with values from ``changes``. If ``obj`` is not a Data Class, raises :exc:`TypeError`. If values in ``changes`` do not specify fields, raises :exc:`TypeError`. @@ -434,7 +434,7 @@ Module contents ``replace()`` (or similarly named) method which handles instance copying. -.. function:: is_dataclass(class_or_instance) +.. function:: is_dataclass(obj) Return ``True`` if its parameter is a dataclass or an instance of one, otherwise return ``False``. From webhook-mailer at python.org Sat Jan 8 10:10:02 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 08 Jan 2022 15:10:02 -0000 Subject: [Python-checkins] bpo-46290: Fix parameter names in dataclasses docs (GH-30450) Message-ID: https://github.com/python/cpython/commit/cd95033d9c27d6c541dfd774b4c5eab4a70a23ab commit: cd95033d9c27d6c541dfd774b4c5eab4a70a23ab branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-08T07:09:58-08:00 summary: bpo-46290: Fix parameter names in dataclasses docs (GH-30450) (cherry picked from commit ef5376e69e72fa922d7f1b3df47b99d3576f9df1) Co-authored-by: Zsolt Dollenstein files: M Doc/library/dataclasses.rst diff --git a/Doc/library/dataclasses.rst b/Doc/library/dataclasses.rst index 3a943e05309d9..808e67b1f4e9a 100644 --- a/Doc/library/dataclasses.rst +++ b/Doc/library/dataclasses.rst @@ -287,9 +287,9 @@ Module-level decorators, classes, and functions Raises :exc:`TypeError` if not passed a dataclass or instance of one. Does not return pseudo-fields which are ``ClassVar`` or ``InitVar``. -.. function:: asdict(instance, *, dict_factory=dict) +.. function:: asdict(obj, *, dict_factory=dict) - Converts the dataclass ``instance`` to a dict (by using the + Converts the dataclass ``obj`` to a dict (by using the factory function ``dict_factory``). Each dataclass is converted to a dict of its fields, as ``name: value`` pairs. dataclasses, dicts, lists, and tuples are recursed into. 
Other objects are copied with @@ -314,14 +314,14 @@ Module-level decorators, classes, and functions To create a shallow copy, the following workaround may be used:: - dict((field.name, getattr(instance, field.name)) for field in fields(instance)) + dict((field.name, getattr(obj, field.name)) for field in fields(obj)) - :func:`asdict` raises :exc:`TypeError` if ``instance`` is not a dataclass + :func:`asdict` raises :exc:`TypeError` if ``obj`` is not a dataclass instance. -.. function:: astuple(instance, *, tuple_factory=tuple) +.. function:: astuple(obj, *, tuple_factory=tuple) - Converts the dataclass ``instance`` to a tuple (by using the + Converts the dataclass ``obj`` to a tuple (by using the factory function ``tuple_factory``). Each dataclass is converted to a tuple of its field values. dataclasses, dicts, lists, and tuples are recursed into. Other objects are copied with @@ -334,9 +334,9 @@ Module-level decorators, classes, and functions To create a shallow copy, the following workaround may be used:: - tuple(getattr(instance, field.name) for field in dataclasses.fields(instance)) + tuple(getattr(obj, field.name) for field in dataclasses.fields(obj)) - :func:`astuple` raises :exc:`TypeError` if ``instance`` is not a dataclass + :func:`astuple` raises :exc:`TypeError` if ``obj`` is not a dataclass instance. .. function:: make_dataclass(cls_name, fields, *, bases=(), namespace=None, init=True, repr=True, eq=True, order=False, unsafe_hash=False, frozen=False) @@ -373,10 +373,10 @@ Module-level decorators, classes, and functions def add_one(self): return self.x + 1 -.. function:: replace(instance, /, **changes) +.. function:: replace(obj, /, **changes) - Creates a new object of the same type as ``instance``, replacing - fields with values from ``changes``. If ``instance`` is not a Data + Creates a new object of the same type as ``obj``, replacing + fields with values from ``changes``. If ``obj`` is not a Data Class, raises :exc:`TypeError`. If values in ``changes`` do not specify fields, raises :exc:`TypeError`. @@ -401,7 +401,7 @@ Module-level decorators, classes, and functions ``replace()`` (or similarly named) method which handles instance copying. -.. function:: is_dataclass(class_or_instance) +.. function:: is_dataclass(obj) Return ``True`` if its parameter is a dataclass or an instance of one, otherwise return ``False``. From webhook-mailer at python.org Sat Jan 8 14:17:35 2022 From: webhook-mailer at python.org (taleinat) Date: Sat, 08 Jan 2022 19:17:35 -0000 Subject: [Python-checkins] bpo-46261: Update `sqlite3.Cursor.lastrowid` docs (GH-30407) Message-ID: https://github.com/python/cpython/commit/b6aa38f1ca79600f2ab46ac114ff36461a19c4a3 commit: b6aa38f1ca79600f2ab46ac114ff36461a19c4a3 branch: main author: Erlend Egeberg Aasland committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-08T21:17:09+02:00 summary: bpo-46261: Update `sqlite3.Cursor.lastrowid` docs (GH-30407) files: M Doc/library/sqlite3.rst diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index fb38182370ea9..415a5f9a92902 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -845,14 +845,15 @@ Cursor Objects .. attribute:: lastrowid - This read-only attribute provides the rowid of the last modified row. It is - only set if you issued an ``INSERT`` or a ``REPLACE`` statement using the - :meth:`execute` method. For operations other than ``INSERT`` or - ``REPLACE`` or when :meth:`executemany` is called, :attr:`lastrowid` is - set to :const:`None`. 
- - If the ``INSERT`` or ``REPLACE`` statement failed to insert the previous - successful rowid is returned. + This read-only attribute provides the row id of the last inserted row. It + is only updated after successful ``INSERT`` or ``REPLACE`` statements + using the :meth:`execute` method. For other statements, after + :meth:`executemany` or :meth:`executescript`, or if the insertion failed, + the value of ``lastrowid`` is left unchanged. The initial value of + ``lastrowid`` is :const:`None`. + + .. note:: + Inserts into ``WITHOUT ROWID`` tables are not recorded. .. versionchanged:: 3.6 Added support for the ``REPLACE`` statement. From webhook-mailer at python.org Sat Jan 8 14:43:50 2022 From: webhook-mailer at python.org (ethanfurman) Date: Sat, 08 Jan 2022 19:43:50 -0000 Subject: [Python-checkins] bpo-46301: [Enum] test uncomparable values in `_convert_` (GH-30472) Message-ID: https://github.com/python/cpython/commit/8d59d2563b914b7208779834895c080c70cd94dd commit: 8d59d2563b914b7208779834895c080c70cd94dd branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-08T11:43:42-08:00 summary: bpo-46301: [Enum] test uncomparable values in `_convert_` (GH-30472) add tests that cover different types, and same non-comparable types files: M Lib/test/test_enum.py diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 2b3eac56865b1..7e919fb9b4263 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -4440,6 +4440,15 @@ def test__all__(self): CONVERT_STRING_TEST_NAME_E = 5 CONVERT_STRING_TEST_NAME_F = 5 +# We also need values that cannot be compared: +UNCOMPARABLE_A = 5 +UNCOMPARABLE_C = (9, 1) # naming order is broken on purpose +UNCOMPARABLE_B = 'value' + +COMPLEX_C = 1j +COMPLEX_A = 2j +COMPLEX_B = 3j + class TestIntEnumConvert(unittest.TestCase): def setUp(self): # Reset the module-level test variables to their original integer @@ -4477,6 +4486,32 @@ def test_convert(self): and name not in dir(IntEnum)], [], msg='Names other than CONVERT_TEST_* found.') + def test_convert_uncomparable(self): + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('UNCOMPARABLE_'), + ) + + # Should be ordered by `name` only: + self.assertEqual( + list(uncomp), + [uncomp.UNCOMPARABLE_A, uncomp.UNCOMPARABLE_B, uncomp.UNCOMPARABLE_C], + ) + + def test_convert_complex(self): + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('COMPLEX_'), + ) + + # Should be ordered by `name` only: + self.assertEqual( + list(uncomp), + [uncomp.COMPLEX_A, uncomp.COMPLEX_B, uncomp.COMPLEX_C], + ) + @unittest.skipUnless(python_version == (3, 8), '_convert was deprecated in 3.8') def test_convert_warn(self): From webhook-mailer at python.org Sat Jan 8 15:05:52 2022 From: webhook-mailer at python.org (taleinat) Date: Sat, 08 Jan 2022 20:05:52 -0000 Subject: [Python-checkins] bpo-46261: Update `sqlite3.Cursor.lastrowid` docs (GH-30407) Message-ID: https://github.com/python/cpython/commit/987fba102e909229dd2aa1a6115aa28d514c1818 commit: 987fba102e909229dd2aa1a6115aa28d514c1818 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-08T22:05:43+02:00 summary: bpo-46261: Update `sqlite3.Cursor.lastrowid` docs (GH-30407) files: M Doc/library/sqlite3.rst diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index f3964b9aa23ae..1c3bde3b914d0 100644 --- a/Doc/library/sqlite3.rst +++ 
b/Doc/library/sqlite3.rst @@ -766,14 +766,15 @@ Cursor Objects .. attribute:: lastrowid - This read-only attribute provides the rowid of the last modified row. It is - only set if you issued an ``INSERT`` or a ``REPLACE`` statement using the - :meth:`execute` method. For operations other than ``INSERT`` or - ``REPLACE`` or when :meth:`executemany` is called, :attr:`lastrowid` is - set to :const:`None`. - - If the ``INSERT`` or ``REPLACE`` statement failed to insert the previous - successful rowid is returned. + This read-only attribute provides the row id of the last inserted row. It + is only updated after successful ``INSERT`` or ``REPLACE`` statements + using the :meth:`execute` method. For other statements, after + :meth:`executemany` or :meth:`executescript`, or if the insertion failed, + the value of ``lastrowid`` is left unchanged. The initial value of + ``lastrowid`` is :const:`None`. + + .. note:: + Inserts into ``WITHOUT ROWID`` tables are not recorded. .. versionchanged:: 3.6 Added support for the ``REPLACE`` statement. From webhook-mailer at python.org Sat Jan 8 15:06:18 2022 From: webhook-mailer at python.org (taleinat) Date: Sat, 08 Jan 2022 20:06:18 -0000 Subject: [Python-checkins] [3.9] bpo-46261: Update `sqlite3.Cursor.lastrowid` docs (GH-30407) Message-ID: https://github.com/python/cpython/commit/b29aa71090e4dd34900660ecca8bb62667440f41 commit: b29aa71090e4dd34900660ecca8bb62667440f41 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-08T22:06:14+02:00 summary: [3.9] bpo-46261: Update `sqlite3.Cursor.lastrowid` docs (GH-30407) files: M Doc/library/sqlite3.rst diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index 3f4f8536c1c6b..0ffb8ff0b969c 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -756,14 +756,15 @@ Cursor Objects .. attribute:: lastrowid - This read-only attribute provides the rowid of the last modified row. It is - only set if you issued an ``INSERT`` or a ``REPLACE`` statement using the - :meth:`execute` method. For operations other than ``INSERT`` or - ``REPLACE`` or when :meth:`executemany` is called, :attr:`lastrowid` is - set to :const:`None`. - - If the ``INSERT`` or ``REPLACE`` statement failed to insert the previous - successful rowid is returned. + This read-only attribute provides the row id of the last inserted row. It + is only updated after successful ``INSERT`` or ``REPLACE`` statements + using the :meth:`execute` method. For other statements, after + :meth:`executemany` or :meth:`executescript`, or if the insertion failed, + the value of ``lastrowid`` is left unchanged. The initial value of + ``lastrowid`` is :const:`None`. + + .. note:: + Inserts into ``WITHOUT ROWID`` tables are not recorded. .. versionchanged:: 3.6 Added support for the ``REPLACE`` statement. From webhook-mailer at python.org Sat Jan 8 15:13:50 2022 From: webhook-mailer at python.org (ericvsmith) Date: Sat, 08 Jan 2022 20:13:50 -0000 Subject: [Python-checkins] bpo-46306: simplify `CodeType` attribute access in `doctest.py` (GH-30481) Message-ID: https://github.com/python/cpython/commit/0fc58c1e051026baff4919d8519ce2aabe3b2ba1 commit: 0fc58c1e051026baff4919d8519ce2aabe3b2ba1 branch: main author: Nikita Sobolev committer: ericvsmith date: 2022-01-08T15:13:42-05:00 summary: bpo-46306: simplify `CodeType` attribute access in `doctest.py` (GH-30481) Assume co_firstlineno always exists on types.CodeType objects. 
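As a minimal illustration of that assumption -- every code object carries ``co_firstlineno``, so no ``getattr`` fallback is needed -- the following stand-alone snippet can be run directly (the function name ``example`` is an invented placeholder, not part of the patch)::

    import types

    def example():
        """Docstring used only to give the code object something to point at."""

    code = example.__code__
    assert isinstance(code, types.CodeType)
    # co_firstlineno is a required attribute of every code object,
    # which is why a plain attribute access suffices here.
    print(code.co_firstlineno)   # line number where `def example` appears
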
Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst M Lib/doctest.py diff --git a/Lib/doctest.py b/Lib/doctest.py index b27cbdfed46ff..4735b59852685 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1113,7 +1113,7 @@ def _find_lineno(self, obj, source_lines): if inspect.istraceback(obj): obj = obj.tb_frame if inspect.isframe(obj): obj = obj.f_code if inspect.iscode(obj): - lineno = getattr(obj, 'co_firstlineno', None)-1 + lineno = obj.co_firstlineno - 1 # Find the line number where the docstring starts. Assume # that it's the first line that begins with a quote mark. diff --git a/Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst b/Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst new file mode 100644 index 0000000000000..02943c95a7d79 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst @@ -0,0 +1,2 @@ +Assume that :class:`types.CodeType` always has :attr:`types.CodeType.co_firstlineno` in +:mod:`doctest`. From webhook-mailer at python.org Sat Jan 8 19:54:26 2022 From: webhook-mailer at python.org (ned-deily) Date: Sun, 09 Jan 2022 00:54:26 -0000 Subject: [Python-checkins] bpo-46308: Fix unportable test(1) operator in configure script (GH-30490) Message-ID: https://github.com/python/cpython/commit/3d11c1b8b49800c5c4c295953cc3abf577f6065a commit: 3d11c1b8b49800c5c4c295953cc3abf577f6065a branch: main author: Thomas Klausner committer: ned-deily date: 2022-01-08T19:54:13-05:00 summary: bpo-46308: Fix unportable test(1) operator in configure script (GH-30490) files: M configure M configure.ac diff --git a/configure b/configure index 9e7090c7906dd..9712446d24c11 100755 --- a/configure +++ b/configure @@ -10315,7 +10315,7 @@ then # small for the default recursion limit. Increase the stack size # to ensure that tests don't crash stack_size="1000000" # 16 MB - if test "$with_ubsan" == "yes" + if test "$with_ubsan" = "yes" then # Undefined behavior sanitizer requires an even deeper stack stack_size="4000000" # 64 MB diff --git a/configure.ac b/configure.ac index ff3163f921ae2..1720b9bfbee37 100644 --- a/configure.ac +++ b/configure.ac @@ -2861,7 +2861,7 @@ then # small for the default recursion limit. Increase the stack size # to ensure that tests don't crash stack_size="1000000" # 16 MB - if test "$with_ubsan" == "yes" + if test "$with_ubsan" = "yes" then # Undefined behavior sanitizer requires an even deeper stack stack_size="4000000" # 64 MB From webhook-mailer at python.org Sat Jan 8 20:08:29 2022 From: webhook-mailer at python.org (ned-deily) Date: Sun, 09 Jan 2022 01:08:29 -0000 Subject: [Python-checkins] bpo-34602: Fix unportable test(1) operator in configure script (GH-30490) (GH-30491) Message-ID: https://github.com/python/cpython/commit/b962544594c6a7c695330dd20fedffb3a1916ba6 commit: b962544594c6a7c695330dd20fedffb3a1916ba6 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: ned-deily date: 2022-01-08T20:08:20-05:00 summary: bpo-34602: Fix unportable test(1) operator in configure script (GH-30490) (GH-30491) (cherry picked from commit 3d11c1b8b49800c5c4c295953cc3abf577f6065a) Co-authored-by: Thomas Klausner files: M configure M configure.ac diff --git a/configure b/configure index 0e97c5228df10..a7d2975f1f5e8 100755 --- a/configure +++ b/configure @@ -9895,7 +9895,7 @@ then # small for the default recursion limit. 
Increase the stack size # to ensure that tests don't crash stack_size="1000000" # 16 MB - if test "$with_ubsan" == "yes" + if test "$with_ubsan" = "yes" then # Undefined behavior sanitizer requires an even deeper stack stack_size="4000000" # 64 MB diff --git a/configure.ac b/configure.ac index 9151059f8946f..5aa91cbad3555 100644 --- a/configure.ac +++ b/configure.ac @@ -2816,7 +2816,7 @@ then # small for the default recursion limit. Increase the stack size # to ensure that tests don't crash stack_size="1000000" # 16 MB - if test "$with_ubsan" == "yes" + if test "$with_ubsan" = "yes" then # Undefined behavior sanitizer requires an even deeper stack stack_size="4000000" # 64 MB From webhook-mailer at python.org Sun Jan 9 05:29:03 2022 From: webhook-mailer at python.org (mdickinson) Date: Sun, 09 Jan 2022 10:29:03 -0000 Subject: [Python-checkins] Add a (conservative) timeout for Windows builds on GitHub Actions (GH-30301) Message-ID: https://github.com/python/cpython/commit/0ea2ef5fa81c72126e038c1a853e46c19bd4767e commit: 0ea2ef5fa81c72126e038c1a853e46c19bd4767e branch: main author: Mark Dickinson committer: mdickinson date: 2022-01-09T10:28:34Z summary: Add a (conservative) timeout for Windows builds on GitHub Actions (GH-30301) files: M .github/workflows/build.yml diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index f220aaa1d72a2..f11d51b2dc993 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -112,6 +112,7 @@ jobs: - uses: actions/checkout at v2 - name: Build CPython run: .\PCbuild\build.bat -e -p Win32 + timeout-minutes: 30 - name: Display build info run: .\python.bat -m test.pythoninfo - name: Tests @@ -130,6 +131,7 @@ jobs: run: echo "::add-matcher::.github/problem-matchers/msvc.json" - name: Build CPython run: .\PCbuild\build.bat -e -p x64 + timeout-minutes: 30 - name: Display build info run: .\python.bat -m test.pythoninfo - name: Tests From webhook-mailer at python.org Sun Jan 9 05:59:08 2022 From: webhook-mailer at python.org (tiran) Date: Sun, 09 Jan 2022 10:59:08 -0000 Subject: [Python-checkins] bpo-40280: Disable epoll_create in Emscripten config.site (GH-30494) Message-ID: https://github.com/python/cpython/commit/5c66414b5561c54e7a0f4bde8cc3271908ea525e commit: 5c66414b5561c54e7a0f4bde8cc3271908ea525e branch: main author: Ethan Smith committer: tiran date: 2022-01-09T11:58:59+01:00 summary: bpo-40280: Disable epoll_create in Emscripten config.site (GH-30494) Co-authored-by: nick.pope at infogrid.io files: M Tools/wasm/config.site-wasm32-emscripten diff --git a/Tools/wasm/config.site-wasm32-emscripten b/Tools/wasm/config.site-wasm32-emscripten index 67304be060b52..b291c802e1e4d 100644 --- a/Tools/wasm/config.site-wasm32-emscripten +++ b/Tools/wasm/config.site-wasm32-emscripten @@ -33,7 +33,7 @@ ac_cv_lib_bz2_BZ2_bzCompress=no # The rest is based on pyodide # https://github.com/pyodide/pyodide/blob/main/cpython/pyconfig.undefs.h -ac_cv_func_epoll=no +ac_cv_func_epoll_create=no ac_cv_func_epoll_create1=no ac_cv_header_linux_vm_sockets_h=no ac_cv_func_socketpair=no From webhook-mailer at python.org Sun Jan 9 08:32:34 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Sun, 09 Jan 2022 13:32:34 -0000 Subject: [Python-checkins] bpo-37295: Use constant-time comb() and perm() for larger n depending on k (GH-30305) Message-ID: https://github.com/python/cpython/commit/2d787971c65b005d0cce219399b9a8e2b70d4ef4 commit: 2d787971c65b005d0cce219399b9a8e2b70d4ef4 branch: main author: Serhiy Storchaka committer: serhiy-storchaka 
date: 2022-01-09T15:32:25+02:00 summary: bpo-37295: Use constant-time comb() and perm() for larger n depending on k (GH-30305) files: M Lib/test/test_math.py M Modules/mathmodule.c diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py index a7df00f0fb101..cfaf3b3ea26a7 100644 --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -1889,8 +1889,8 @@ def testPerm(self): perm = math.perm factorial = math.factorial # Test if factorial definition is satisfied - for n in range(100): - for k in range(n + 1): + for n in range(500): + for k in (range(n + 1) if n < 100 else range(30) if n < 200 else range(10)): self.assertEqual(perm(n, k), factorial(n) // factorial(n - k)) @@ -1953,8 +1953,8 @@ def testComb(self): comb = math.comb factorial = math.factorial # Test if factorial definition is satisfied - for n in range(100): - for k in range(n + 1): + for n in range(500): + for k in (range(n + 1) if n < 100 else range(30) if n < 200 else range(10)): self.assertEqual(comb(n, k), factorial(n) // (factorial(k) * factorial(n - k))) diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c index 952c51304a9fc..3ab1a0776046d 100644 --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -3225,6 +3225,123 @@ math_prod_impl(PyObject *module, PyObject *iterable, PyObject *start) } +/* least significant 64 bits of the odd part of factorial(n), for n in range(128). + +Python code to generate the values: + + import math + + for n in range(128): + fac = math.factorial(n) + fac_odd_part = fac // (fac & -fac) + reduced_fac_odd_part = fac_odd_part % (2**64) + print(f"{reduced_fac_odd_part:#018x}u") +*/ +static const uint64_t reduced_factorial_odd_part[] = { + 0x0000000000000001u, 0x0000000000000001u, 0x0000000000000001u, 0x0000000000000003u, + 0x0000000000000003u, 0x000000000000000fu, 0x000000000000002du, 0x000000000000013bu, + 0x000000000000013bu, 0x0000000000000b13u, 0x000000000000375fu, 0x0000000000026115u, + 0x000000000007233fu, 0x00000000005cca33u, 0x0000000002898765u, 0x00000000260eeeebu, + 0x00000000260eeeebu, 0x0000000286fddd9bu, 0x00000016beecca73u, 0x000001b02b930689u, + 0x00000870d9df20adu, 0x0000b141df4dae31u, 0x00079dd498567c1bu, 0x00af2e19afc5266du, + 0x020d8a4d0f4f7347u, 0x335281867ec241efu, 0x9b3093d46fdd5923u, 0x5e1f9767cc5866b1u, + 0x92dd23d6966aced7u, 0xa30d0f4f0a196e5bu, 0x8dc3e5a1977d7755u, 0x2ab8ce915831734bu, + 0x2ab8ce915831734bu, 0x81d2a0bc5e5fdcabu, 0x9efcac82445da75bu, 0xbc8b95cf58cde171u, + 0xa0e8444a1f3cecf9u, 0x4191deb683ce3ffdu, 0xddd3878bc84ebfc7u, 0xcb39a64b83ff3751u, + 0xf8203f7993fc1495u, 0xbd2a2a78b35f4bddu, 0x84757be6b6d13921u, 0x3fbbcfc0b524988bu, + 0xbd11ed47c8928df9u, 0x3c26b59e41c2f4c5u, 0x677a5137e883fdb3u, 0xff74e943b03b93ddu, + 0xfe5ebbcb10b2bb97u, 0xb021f1de3235e7e7u, 0x33509eb2e743a58fu, 0x390f9da41279fb7du, + 0xe5cb0154f031c559u, 0x93074695ba4ddb6du, 0x81c471caa636247fu, 0xe1347289b5a1d749u, + 0x286f21c3f76ce2ffu, 0x00be84a2173e8ac7u, 0x1595065ca215b88bu, 0xf95877595b018809u, + 0x9c2efe3c5516f887u, 0x373294604679382bu, 0xaf1ff7a888adcd35u, 0x18ddf279a2c5800bu, + 0x18ddf279a2c5800bu, 0x505a90e2542582cbu, 0x5bacad2cd8d5dc2bu, 0xfe3152bcbff89f41u, + 0xe1467e88bf829351u, 0xb8001adb9e31b4d5u, 0x2803ac06a0cbb91fu, 0x1904b5d698805799u, + 0xe12a648b5c831461u, 0x3516abbd6160cfa9u, 0xac46d25f12fe036du, 0x78bfa1da906b00efu, + 0xf6390338b7f111bdu, 0x0f25f80f538255d9u, 0x4ec8ca55b8db140fu, 0x4ff670740b9b30a1u, + 0x8fd032443a07f325u, 0x80dfe7965c83eeb5u, 0xa3dc1714d1213afdu, 0x205b7bbfcdc62007u, + 0xa78126bbe140a093u, 0x9de1dc61ca7550cfu, 0x84f0046d01b492c5u, 
0x2d91810b945de0f3u, + 0xf5408b7f6008aa71u, 0x43707f4863034149u, 0xdac65fb9679279d5u, 0xc48406e7d1114eb7u, + 0xa7dc9ed3c88e1271u, 0xfb25b2efdb9cb30du, 0x1bebda0951c4df63u, 0x5c85e975580ee5bdu, + 0x1591bc60082cb137u, 0x2c38606318ef25d7u, 0x76ca72f7c5c63e27u, 0xf04a75d17baa0915u, + 0x77458175139ae30du, 0x0e6c1330bc1b9421u, 0xdf87d2b5797e8293u, 0xefa5c703e1e68925u, + 0x2b6b1b3278b4f6e1u, 0xceee27b382394249u, 0xd74e3829f5dab91du, 0xfdb17989c26b5f1fu, + 0xc1b7d18781530845u, 0x7b4436b2105a8561u, 0x7ba7c0418372a7d7u, 0x9dbc5c67feb6c639u, + 0x502686d7f6ff6b8fu, 0x6101855406be7a1fu, 0x9956afb5806930e7u, 0xe1f0ee88af40f7c5u, + 0x984b057bda5c1151u, 0x9a49819acc13ea05u, 0x8ef0dead0896ef27u, 0x71f7826efe292b21u, + 0xad80a480e46986efu, 0x01cdc0ebf5e0c6f7u, 0x6e06f839968f68dbu, 0xdd5943ab56e76139u, + 0xcdcf31bf8604c5e7u, 0x7e2b4a847054a1cbu, 0x0ca75697a4d3d0f5u, 0x4703f53ac514a98bu, +}; + +/* inverses of reduced_factorial_odd_part values modulo 2**64. + +Python code to generate the values: + + import math + + for n in range(128): + fac = math.factorial(n) + fac_odd_part = fac // (fac & -fac) + inverted_fac_odd_part = pow(fac_odd_part, -1, 2**64) + print(f"{inverted_fac_odd_part:#018x}u") +*/ +static const uint64_t inverted_factorial_odd_part[] = { + 0x0000000000000001u, 0x0000000000000001u, 0x0000000000000001u, 0xaaaaaaaaaaaaaaabu, + 0xaaaaaaaaaaaaaaabu, 0xeeeeeeeeeeeeeeefu, 0x4fa4fa4fa4fa4fa5u, 0x2ff2ff2ff2ff2ff3u, + 0x2ff2ff2ff2ff2ff3u, 0x938cc70553e3771bu, 0xb71c27cddd93e49fu, 0xb38e3229fcdee63du, + 0xe684bb63544a4cbfu, 0xc2f684917ca340fbu, 0xf747c9cba417526du, 0xbb26eb51d7bd49c3u, + 0xbb26eb51d7bd49c3u, 0xb0a7efb985294093u, 0xbe4b8c69f259eabbu, 0x6854d17ed6dc4fb9u, + 0xe1aa904c915f4325u, 0x3b8206df131cead1u, 0x79c6009fea76fe13u, 0xd8c5d381633cd365u, + 0x4841f12b21144677u, 0x4a91ff68200b0d0fu, 0x8f9513a58c4f9e8bu, 0x2b3e690621a42251u, + 0x4f520f00e03c04e7u, 0x2edf84ee600211d3u, 0xadcaa2764aaacdfdu, 0x161f4f9033f4fe63u, + 0x161f4f9033f4fe63u, 0xbada2932ea4d3e03u, 0xcec189f3efaa30d3u, 0xf7475bb68330bf91u, + 0x37eb7bf7d5b01549u, 0x46b35660a4e91555u, 0xa567c12d81f151f7u, 0x4c724007bb2071b1u, + 0x0f4a0cce58a016bdu, 0xfa21068e66106475u, 0x244ab72b5a318ae1u, 0x366ce67e080d0f23u, + 0xd666fdae5dd2a449u, 0xd740ddd0acc06a0du, 0xb050bbbb28e6f97bu, 0x70b003fe890a5c75u, + 0xd03aabff83037427u, 0x13ec4ca72c783bd7u, 0x90282c06afdbd96fu, 0x4414ddb9db4a95d5u, + 0xa2c68735ae6832e9u, 0xbf72d71455676665u, 0xa8469fab6b759b7fu, 0xc1e55b56e606caf9u, + 0x40455630fc4a1cffu, 0x0120a7b0046d16f7u, 0xa7c3553b08faef23u, 0x9f0bfd1b08d48639u, + 0xa433ffce9a304d37u, 0xa22ad1d53915c683u, 0xcb6cbc723ba5dd1du, 0x547fb1b8ab9d0ba3u, + 0x547fb1b8ab9d0ba3u, 0x8f15a826498852e3u, 0x32e1a03f38880283u, 0x3de4cce63283f0c1u, + 0x5dfe6667e4da95b1u, 0xfda6eeeef479e47du, 0xf14de991cc7882dfu, 0xe68db79247630ca9u, + 0xa7d6db8207ee8fa1u, 0x255e1f0fcf034499u, 0xc9a8990e43dd7e65u, 0x3279b6f289702e0fu, + 0xe7b5905d9b71b195u, 0x03025ba41ff0da69u, 0xb7df3d6d3be55aefu, 0xf89b212ebff2b361u, + 0xfe856d095996f0adu, 0xd6e533e9fdf20f9du, 0xf8c0e84a63da3255u, 0xa677876cd91b4db7u, + 0x07ed4f97780d7d9bu, 0x90a8705f258db62fu, 0xa41bbb2be31b1c0du, 0x6ec28690b038383bu, + 0xdb860c3bb2edd691u, 0x0838286838a980f9u, 0x558417a74b36f77du, 0x71779afc3646ef07u, + 0x743cda377ccb6e91u, 0x7fdf9f3fe89153c5u, 0xdc97d25df49b9a4bu, 0x76321a778eb37d95u, + 0x7cbb5e27da3bd487u, 0x9cff4ade1a009de7u, 0x70eb166d05c15197u, 0xdcf0460b71d5fe3du, + 0x5ac1ee5260b6a3c5u, 0xc922dedfdd78efe1u, 0xe5d381dc3b8eeb9bu, 0xd57e5347bafc6aadu, + 0x86939040983acd21u, 0x395b9d69740a4ff9u, 0x1467299c8e43d135u, 
0x5fe440fcad975cdfu, + 0xcaa9a39794a6ca8du, 0xf61dbd640868dea1u, 0xac09d98d74843be7u, 0x2b103b9e1a6b4809u, + 0x2ab92d16960f536fu, 0x6653323d5e3681dfu, 0xefd48c1c0624e2d7u, 0xa496fefe04816f0du, + 0x1754a7b07bbdd7b1u, 0x23353c829a3852cdu, 0xbf831261abd59097u, 0x57a8e656df0618e1u, + 0x16e9206c3100680fu, 0xadad4c6ee921dac7u, 0x635f2b3860265353u, 0xdd6d0059f44b3d09u, + 0xac4dd6b894447dd7u, 0x42ea183eeaa87be3u, 0x15612d1550ee5b5du, 0x226fa19d656cb623u, +}; + +/* exponent of the largest power of 2 dividing factorial(n), for n in range(68) + +Python code to generate the values: + +import math + +for n in range(128): + fac = math.factorial(n) + fac_trailing_zeros = (fac & -fac).bit_length() - 1 + print(fac_trailing_zeros) +*/ + +static const uint8_t factorial_trailing_zeros[] = { + 0, 0, 1, 1, 3, 3, 4, 4, 7, 7, 8, 8, 10, 10, 11, 11, // 0-15 + 15, 15, 16, 16, 18, 18, 19, 19, 22, 22, 23, 23, 25, 25, 26, 26, // 16-31 + 31, 31, 32, 32, 34, 34, 35, 35, 38, 38, 39, 39, 41, 41, 42, 42, // 32-47 + 46, 46, 47, 47, 49, 49, 50, 50, 53, 53, 54, 54, 56, 56, 57, 57, // 48-63 + 63, 63, 64, 64, 66, 66, 67, 67, 70, 70, 71, 71, 73, 73, 74, 74, // 64-79 + 78, 78, 79, 79, 81, 81, 82, 82, 85, 85, 86, 86, 88, 88, 89, 89, // 80-95 + 94, 94, 95, 95, 97, 97, 98, 98, 101, 101, 102, 102, 104, 104, 105, 105, // 96-111 + 109, 109, 110, 110, 112, 112, 113, 113, 116, 116, 117, 117, 119, 119, 120, 120, // 112-127 +}; + /* Number of permutations and combinations. * P(n, k) = n! / (n-k)! * C(n, k) = P(n, k) / k! @@ -3234,48 +3351,93 @@ math_prod_impl(PyObject *module, PyObject *iterable, PyObject *start) static PyObject * perm_comb_small(unsigned long long n, unsigned long long k, int iscomb) { - /* long long is at least 64 bit */ - static const unsigned long long fast_comb_limits[] = { - 0, ULLONG_MAX, 4294967296ULL, 3329022, 102570, 13467, 3612, 1449, // 0-7 - 746, 453, 308, 227, 178, 147, 125, 110, // 8-15 - 99, 90, 84, 79, 75, 72, 69, 68, // 16-23 - 66, 65, 64, 63, 63, 62, 62, 62, // 24-31 - }; - static const unsigned long long fast_perm_limits[] = { - 0, ULLONG_MAX, 4294967296ULL, 2642246, 65537, 7133, 1627, 568, // 0-7 - 259, 142, 88, 61, 45, 36, 30, // 8-14 - }; - if (k == 0) { return PyLong_FromLong(1); } /* For small enough n and k the result fits in the 64-bit range and can * be calculated without allocating intermediate PyLong objects. */ - if (iscomb - ? (k < Py_ARRAY_LENGTH(fast_comb_limits) - && n <= fast_comb_limits[k]) - : (k < Py_ARRAY_LENGTH(fast_perm_limits) - && n <= fast_perm_limits[k])) - { - unsigned long long result = n; - if (iscomb) { + if (iscomb) { + /* Maps k to the maximal n so that 2*k-1 <= n <= 127 and C(n, k) + * fits into a uint64_t. Exclude k = 1, because the second fast + * path is faster for this case.*/ + static const unsigned char fast_comb_limits1[] = { + 0, 0, 127, 127, 127, 127, 127, 127, // 0-7 + 127, 127, 127, 127, 127, 127, 127, 127, // 8-15 + 116, 105, 97, 91, 86, 82, 78, 76, // 16-23 + 74, 72, 71, 70, 69, 68, 68, 67, // 24-31 + 67, 67, 67, // 32-34 + }; + if (k < Py_ARRAY_LENGTH(fast_comb_limits1) && n <= fast_comb_limits1[k]) { + /* + comb(n, k) fits into a uint64_t. We compute it as + + comb_odd_part << shift + + where 2**shift is the largest power of two dividing comb(n, k) + and comb_odd_part is comb(n, k) >> shift. comb_odd_part can be + calculated efficiently via arithmetic modulo 2**64, using three + lookups and two uint64_t multiplications. 
+ */ + uint64_t comb_odd_part = reduced_factorial_odd_part[n] + * inverted_factorial_odd_part[k] + * inverted_factorial_odd_part[n - k]; + int shift = factorial_trailing_zeros[n] + - factorial_trailing_zeros[k] + - factorial_trailing_zeros[n - k]; + return PyLong_FromUnsignedLongLong(comb_odd_part << shift); + } + + /* Maps k to the maximal n so that 2*k-1 <= n <= 127 and C(n, k)*k + * fits into a long long (which is at least 64 bit). Only contains + * items larger than in fast_comb_limits1. */ + static const unsigned long long fast_comb_limits2[] = { + 0, ULLONG_MAX, 4294967296ULL, 3329022, 102570, 13467, 3612, 1449, // 0-7 + 746, 453, 308, 227, 178, 147, // 8-13 + }; + if (k < Py_ARRAY_LENGTH(fast_comb_limits2) && n <= fast_comb_limits2[k]) { + /* C(n, k) = C(n, k-1) * (n-k+1) / k */ + unsigned long long result = n; for (unsigned long long i = 1; i < k;) { result *= --n; result /= ++i; } + return PyLong_FromUnsignedLongLong(result); } - else { + } + else { + /* Maps k to the maximal n so that k <= n and P(n, k) + * fits into a long long (which is at least 64 bit). */ + static const unsigned long long fast_perm_limits[] = { + 0, ULLONG_MAX, 4294967296ULL, 2642246, 65537, 7133, 1627, 568, // 0-7 + 259, 142, 88, 61, 45, 36, 30, 26, // 8-15 + 24, 22, 21, 20, 20, // 16-20 + }; + if (k < Py_ARRAY_LENGTH(fast_perm_limits) && n <= fast_perm_limits[k]) { + if (n <= 127) { + /* P(n, k) fits into a uint64_t. */ + uint64_t perm_odd_part = reduced_factorial_odd_part[n] + * inverted_factorial_odd_part[n - k]; + int shift = factorial_trailing_zeros[n] + - factorial_trailing_zeros[n - k]; + return PyLong_FromUnsignedLongLong(perm_odd_part << shift); + } + + /* P(n, k) = P(n, k-1) * (n-k+1) */ + unsigned long long result = n; for (unsigned long long i = 1; i < k;) { result *= --n; ++i; } + return PyLong_FromUnsignedLongLong(result); } - return PyLong_FromUnsignedLongLong(result); } - /* For larger n use recursive formula. */ - /* C(n, k) = C(n, j) * C(n-j, k-j) // C(k, j) */ + /* For larger n use recursive formulas: + * + * P(n, k) = P(n, j) * P(n-j, k-j) + * C(n, k) = C(n, j) * C(n-j, k-j) // C(k, j) + */ unsigned long long j = k / 2; PyObject *a, *b; a = perm_comb_small(n, j, iscomb); @@ -3450,90 +3612,6 @@ math_perm_impl(PyObject *module, PyObject *n, PyObject *k) return NULL; } -/* least significant 64 bits of the odd part of factorial(n), for n in range(68). 
- -Python code to generate the values: - - import math - - for n in range(68): - fac = math.factorial(n) - fac_odd_part = fac // (fac & -fac) - reduced_fac_odd_part = fac_odd_part % (2**64) - print(f"{reduced_fac_odd_part:#018x}u") -*/ -static const uint64_t reduced_factorial_odd_part[] = { - 0x0000000000000001u, 0x0000000000000001u, 0x0000000000000001u, 0x0000000000000003u, - 0x0000000000000003u, 0x000000000000000fu, 0x000000000000002du, 0x000000000000013bu, - 0x000000000000013bu, 0x0000000000000b13u, 0x000000000000375fu, 0x0000000000026115u, - 0x000000000007233fu, 0x00000000005cca33u, 0x0000000002898765u, 0x00000000260eeeebu, - 0x00000000260eeeebu, 0x0000000286fddd9bu, 0x00000016beecca73u, 0x000001b02b930689u, - 0x00000870d9df20adu, 0x0000b141df4dae31u, 0x00079dd498567c1bu, 0x00af2e19afc5266du, - 0x020d8a4d0f4f7347u, 0x335281867ec241efu, 0x9b3093d46fdd5923u, 0x5e1f9767cc5866b1u, - 0x92dd23d6966aced7u, 0xa30d0f4f0a196e5bu, 0x8dc3e5a1977d7755u, 0x2ab8ce915831734bu, - 0x2ab8ce915831734bu, 0x81d2a0bc5e5fdcabu, 0x9efcac82445da75bu, 0xbc8b95cf58cde171u, - 0xa0e8444a1f3cecf9u, 0x4191deb683ce3ffdu, 0xddd3878bc84ebfc7u, 0xcb39a64b83ff3751u, - 0xf8203f7993fc1495u, 0xbd2a2a78b35f4bddu, 0x84757be6b6d13921u, 0x3fbbcfc0b524988bu, - 0xbd11ed47c8928df9u, 0x3c26b59e41c2f4c5u, 0x677a5137e883fdb3u, 0xff74e943b03b93ddu, - 0xfe5ebbcb10b2bb97u, 0xb021f1de3235e7e7u, 0x33509eb2e743a58fu, 0x390f9da41279fb7du, - 0xe5cb0154f031c559u, 0x93074695ba4ddb6du, 0x81c471caa636247fu, 0xe1347289b5a1d749u, - 0x286f21c3f76ce2ffu, 0x00be84a2173e8ac7u, 0x1595065ca215b88bu, 0xf95877595b018809u, - 0x9c2efe3c5516f887u, 0x373294604679382bu, 0xaf1ff7a888adcd35u, 0x18ddf279a2c5800bu, - 0x18ddf279a2c5800bu, 0x505a90e2542582cbu, 0x5bacad2cd8d5dc2bu, 0xfe3152bcbff89f41u, -}; - -/* inverses of reduced_factorial_odd_part values modulo 2**64. 
- -Python code to generate the values: - - import math - - for n in range(68): - fac = math.factorial(n) - fac_odd_part = fac // (fac & -fac) - inverted_fac_odd_part = pow(fac_odd_part, -1, 2**64) - print(f"{inverted_fac_odd_part:#018x}u") -*/ -static const uint64_t inverted_factorial_odd_part[] = { - 0x0000000000000001u, 0x0000000000000001u, 0x0000000000000001u, 0xaaaaaaaaaaaaaaabu, - 0xaaaaaaaaaaaaaaabu, 0xeeeeeeeeeeeeeeefu, 0x4fa4fa4fa4fa4fa5u, 0x2ff2ff2ff2ff2ff3u, - 0x2ff2ff2ff2ff2ff3u, 0x938cc70553e3771bu, 0xb71c27cddd93e49fu, 0xb38e3229fcdee63du, - 0xe684bb63544a4cbfu, 0xc2f684917ca340fbu, 0xf747c9cba417526du, 0xbb26eb51d7bd49c3u, - 0xbb26eb51d7bd49c3u, 0xb0a7efb985294093u, 0xbe4b8c69f259eabbu, 0x6854d17ed6dc4fb9u, - 0xe1aa904c915f4325u, 0x3b8206df131cead1u, 0x79c6009fea76fe13u, 0xd8c5d381633cd365u, - 0x4841f12b21144677u, 0x4a91ff68200b0d0fu, 0x8f9513a58c4f9e8bu, 0x2b3e690621a42251u, - 0x4f520f00e03c04e7u, 0x2edf84ee600211d3u, 0xadcaa2764aaacdfdu, 0x161f4f9033f4fe63u, - 0x161f4f9033f4fe63u, 0xbada2932ea4d3e03u, 0xcec189f3efaa30d3u, 0xf7475bb68330bf91u, - 0x37eb7bf7d5b01549u, 0x46b35660a4e91555u, 0xa567c12d81f151f7u, 0x4c724007bb2071b1u, - 0x0f4a0cce58a016bdu, 0xfa21068e66106475u, 0x244ab72b5a318ae1u, 0x366ce67e080d0f23u, - 0xd666fdae5dd2a449u, 0xd740ddd0acc06a0du, 0xb050bbbb28e6f97bu, 0x70b003fe890a5c75u, - 0xd03aabff83037427u, 0x13ec4ca72c783bd7u, 0x90282c06afdbd96fu, 0x4414ddb9db4a95d5u, - 0xa2c68735ae6832e9u, 0xbf72d71455676665u, 0xa8469fab6b759b7fu, 0xc1e55b56e606caf9u, - 0x40455630fc4a1cffu, 0x0120a7b0046d16f7u, 0xa7c3553b08faef23u, 0x9f0bfd1b08d48639u, - 0xa433ffce9a304d37u, 0xa22ad1d53915c683u, 0xcb6cbc723ba5dd1du, 0x547fb1b8ab9d0ba3u, - 0x547fb1b8ab9d0ba3u, 0x8f15a826498852e3u, 0x32e1a03f38880283u, 0x3de4cce63283f0c1u, -}; - -/* exponent of the largest power of 2 dividing factorial(n), for n in range(68) - -Python code to generate the values: - -import math - -for n in range(68): - fac = math.factorial(n) - fac_trailing_zeros = (fac & -fac).bit_length() - 1 - print(fac_trailing_zeros) -*/ - -static const uint8_t factorial_trailing_zeros[] = { - 0, 0, 1, 1, 3, 3, 4, 4, 7, 7, 8, 8, 10, 10, 11, 11, // 0-15 - 15, 15, 16, 16, 18, 18, 19, 19, 22, 22, 23, 23, 25, 25, 26, 26, // 16-31 - 31, 31, 32, 32, 34, 34, 35, 35, 38, 38, 39, 39, 41, 41, 42, 42, // 32-47 - 46, 46, 47, 47, 49, 49, 50, 50, 53, 53, 54, 54, 56, 56, 57, 57, // 48-63 - 63, 63, 64, 64, // 64-67 -}; - /*[clinic input] math.comb @@ -3597,28 +3675,6 @@ math_comb_impl(PyObject *module, PyObject *n, PyObject *k) } assert(ki >= 0); - if (ni <= 67) { - /* - For 0 <= k <= n <= 67, comb(n, k) always fits into a uint64_t. - We compute it as - - comb_odd_part << shift - - where 2**shift is the largest power of two dividing comb(n, k) - and comb_odd_part is comb(n, k) >> shift. comb_odd_part can be - calculated efficiently via arithmetic modulo 2**64, using three - lookups and two uint64_t multiplications. 
- */ - uint64_t comb_odd_part = reduced_factorial_odd_part[ni] - * inverted_factorial_odd_part[ki] - * inverted_factorial_odd_part[ni - ki]; - int shift = factorial_trailing_zeros[ni] - - factorial_trailing_zeros[ki] - - factorial_trailing_zeros[ni - ki]; - result = PyLong_FromUnsignedLongLong(comb_odd_part << shift); - goto done; - } - ki = Py_MIN(ki, ni - ki); if (ki > 1) { result = perm_comb_small((unsigned long long)ni, From webhook-mailer at python.org Sun Jan 9 11:22:59 2022 From: webhook-mailer at python.org (miss-islington) Date: Sun, 09 Jan 2022 16:22:59 -0000 Subject: [Python-checkins] bpo-46272: Fix two heading comments in python.gram (GH-30499) Message-ID: https://github.com/python/cpython/commit/1bee9a4625e101d3308831de37590f4e2f57c71c commit: 1bee9a4625e101d3308831de37590f4e2f57c71c branch: main author: Mark Dickinson committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-09T08:22:54-08:00 summary: bpo-46272: Fix two heading comments in python.gram (GH-30499) One typo fix and one heading change, both in comments. No functional changes. files: M Grammar/python.gram diff --git a/Grammar/python.gram b/Grammar/python.gram index c989823e3091c..c5a5f1b7fe20e 100644 --- a/Grammar/python.gram +++ b/Grammar/python.gram @@ -676,8 +676,8 @@ inversion[expr_ty] (memo): | 'not' a=inversion { _PyAST_UnaryOp(Not, a, EXTRA) } | comparison -# Comparisons operators -# --------------------- +# Comparison operators +# -------------------- comparison[expr_ty]: | a=bitwise_or b=compare_op_bitwise_or_pair+ { @@ -712,7 +712,7 @@ in_bitwise_or[CmpopExprPair*]: 'in' a=bitwise_or { _PyPegen_cmpop_expr_pair(p, I isnot_bitwise_or[CmpopExprPair*]: 'is' 'not' a=bitwise_or { _PyPegen_cmpop_expr_pair(p, IsNot, a) } is_bitwise_or[CmpopExprPair*]: 'is' a=bitwise_or { _PyPegen_cmpop_expr_pair(p, Is, a) } -# Logical operators +# Bitwise operators # ----------------- bitwise_or[expr_ty]: From webhook-mailer at python.org Sun Jan 9 20:38:37 2022 From: webhook-mailer at python.org (methane) Date: Mon, 10 Jan 2022 01:38:37 -0000 Subject: [Python-checkins] bpo-23882: unittest: Drop PEP 420 support from discovery. (GH-29745) Message-ID: https://github.com/python/cpython/commit/0b2b9d251374c5ed94265e28039f82b37d039e3e commit: 0b2b9d251374c5ed94265e28039f82b37d039e3e branch: main author: Inada Naoki committer: methane date: 2022-01-10T10:38:33+09:00 summary: bpo-23882: unittest: Drop PEP 420 support from discovery. (GH-29745) files: A Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst M Doc/library/unittest.rst M Doc/whatsnew/3.11.rst M Lib/unittest/loader.py M Lib/unittest/test/test_discovery.py diff --git a/Doc/library/unittest.rst b/Doc/library/unittest.rst index 22723f42d048f..b5a533194b583 100644 --- a/Doc/library/unittest.rst +++ b/Doc/library/unittest.rst @@ -266,8 +266,7 @@ Test Discovery Unittest supports simple test discovery. In order to be compatible with test discovery, all of the test files must be :ref:`modules ` or -:ref:`packages ` (including :term:`namespace packages -`) importable from the top-level directory of +:ref:`packages ` importable from the top-level directory of the project (this means that their filenames must be valid :ref:`identifiers `). @@ -340,6 +339,24 @@ the `load_tests protocol`_. directory too (e.g. ``python -m unittest discover -s root/namespace -t root``). +.. versionchanged:: 3.11 + Python 3.11 dropped the :term:`namespace packages ` + support. It has been broken since Python 3.7. 
Start directory and + subdirectories containing tests must be regular package that have + ``__init__.py`` file. + + Directories containing start directory still can be a namespace package. + In this case, you need to specify start directory as dotted package name, + and target directory explicitly. For example:: + + # proj/ <-- current directory + # namespace/ + # mypkg/ + # __init__.py + # test_mypkg.py + + python -m unittest discover -s namespace.mypkg -t . + .. _organizing-tests: @@ -1858,6 +1875,10 @@ Loading and running tests whether their path matches *pattern*, because it is impossible for a package name to match the default pattern. + .. versionchanged:: 3.11 + *start_dir* can not be a :term:`namespace packages `. + It has been broken since Python 3.7 and Python 3.11 officially remove it. + The following attributes of a :class:`TestLoader` can be configured either by subclassing or assignment on an instance: diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 98ff2d44a811b..72243619891ae 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -542,6 +542,10 @@ Removed (Contributed by Hugo van Kemenade in :issue:`45320`.) +* Remove namespace package support from unittest discovery. It was introduced in + Python 3.4 but has been broken since Python 3.7. + (Contributed by Inada Naoki in :issue:`23882`.) + Porting to Python 3.11 ====================== diff --git a/Lib/unittest/loader.py b/Lib/unittest/loader.py index 5951f3f754eb1..eb18cd0b49cd2 100644 --- a/Lib/unittest/loader.py +++ b/Lib/unittest/loader.py @@ -264,8 +264,6 @@ def discover(self, start_dir, pattern='test*.py', top_level_dir=None): self._top_level_dir = top_level_dir is_not_importable = False - is_namespace = False - tests = [] if os.path.isdir(os.path.abspath(start_dir)): start_dir = os.path.abspath(start_dir) if start_dir != top_level_dir: @@ -281,50 +279,25 @@ def discover(self, start_dir, pattern='test*.py', top_level_dir=None): top_part = start_dir.split('.')[0] try: start_dir = os.path.abspath( - os.path.dirname((the_module.__file__))) + os.path.dirname((the_module.__file__))) except AttributeError: - # look for namespace packages - try: - spec = the_module.__spec__ - except AttributeError: - spec = None - - if spec and spec.loader is None: - if spec.submodule_search_locations is not None: - is_namespace = True - - for path in the_module.__path__: - if (not set_implicit_top and - not path.startswith(top_level_dir)): - continue - self._top_level_dir = \ - (path.split(the_module.__name__ - .replace(".", os.path.sep))[0]) - tests.extend(self._find_tests(path, - pattern, - namespace=True)) - elif the_module.__name__ in sys.builtin_module_names: + if the_module.__name__ in sys.builtin_module_names: # builtin module raise TypeError('Can not use builtin modules ' 'as dotted module names') from None else: raise TypeError( - 'don\'t know how to discover from {!r}' - .format(the_module)) from None + f"don't know how to discover from {the_module!r}" + ) from None if set_implicit_top: - if not is_namespace: - self._top_level_dir = \ - self._get_directory_containing_module(top_part) - sys.path.remove(top_level_dir) - else: - sys.path.remove(top_level_dir) + self._top_level_dir = self._get_directory_containing_module(top_part) + sys.path.remove(top_level_dir) if is_not_importable: raise ImportError('Start directory is not importable: %r' % start_dir) - if not is_namespace: - tests = list(self._find_tests(start_dir, pattern)) + tests = list(self._find_tests(start_dir, pattern)) return 
self.suiteClass(tests) def _get_directory_containing_module(self, module_name): @@ -359,7 +332,7 @@ def _match_path(self, path, full_path, pattern): # override this method to use alternative matching strategy return fnmatch(path, pattern) - def _find_tests(self, start_dir, pattern, namespace=False): + def _find_tests(self, start_dir, pattern): """Used by discovery. Yields test suites it loads.""" # Handle the __init__ in this package name = self._get_name_from_path(start_dir) @@ -368,8 +341,7 @@ def _find_tests(self, start_dir, pattern, namespace=False): if name != '.' and name not in self._loading_packages: # name is in self._loading_packages while we have called into # loadTestsFromModule with name. - tests, should_recurse = self._find_test_path( - start_dir, pattern, namespace) + tests, should_recurse = self._find_test_path(start_dir, pattern) if tests is not None: yield tests if not should_recurse: @@ -380,8 +352,7 @@ def _find_tests(self, start_dir, pattern, namespace=False): paths = sorted(os.listdir(start_dir)) for path in paths: full_path = os.path.join(start_dir, path) - tests, should_recurse = self._find_test_path( - full_path, pattern, namespace) + tests, should_recurse = self._find_test_path(full_path, pattern) if tests is not None: yield tests if should_recurse: @@ -389,11 +360,11 @@ def _find_tests(self, start_dir, pattern, namespace=False): name = self._get_name_from_path(full_path) self._loading_packages.add(name) try: - yield from self._find_tests(full_path, pattern, namespace) + yield from self._find_tests(full_path, pattern) finally: self._loading_packages.discard(name) - def _find_test_path(self, full_path, pattern, namespace=False): + def _find_test_path(self, full_path, pattern): """Used by discovery. Loads tests from a single file, or a directories' __init__.py when @@ -437,8 +408,7 @@ def _find_test_path(self, full_path, pattern, namespace=False): msg % (mod_name, module_dir, expected_dir)) return self.loadTestsFromModule(module, pattern=pattern), False elif os.path.isdir(full_path): - if (not namespace and - not os.path.isfile(os.path.join(full_path, '__init__.py'))): + if not os.path.isfile(os.path.join(full_path, '__init__.py')): return None, False load_tests = None diff --git a/Lib/unittest/test/test_discovery.py b/Lib/unittest/test/test_discovery.py index 9d502c51fb36a..3b58786ec16a1 100644 --- a/Lib/unittest/test/test_discovery.py +++ b/Lib/unittest/test/test_discovery.py @@ -396,7 +396,7 @@ def restore_isdir(): self.addCleanup(restore_isdir) _find_tests_args = [] - def _find_tests(start_dir, pattern, namespace=None): + def _find_tests(start_dir, pattern): _find_tests_args.append((start_dir, pattern)) return ['tests'] loader._find_tests = _find_tests @@ -792,7 +792,7 @@ def test_discovery_from_dotted_path(self): expectedPath = os.path.abspath(os.path.dirname(unittest.test.__file__)) self.wasRun = False - def _find_tests(start_dir, pattern, namespace=None): + def _find_tests(start_dir, pattern): self.wasRun = True self.assertEqual(start_dir, expectedPath) return tests @@ -825,37 +825,6 @@ def restore(): 'Can not use builtin modules ' 'as dotted module names') - def test_discovery_from_dotted_namespace_packages(self): - loader = unittest.TestLoader() - - package = types.ModuleType('package') - package.__path__ = ['/a', '/b'] - package.__spec__ = types.SimpleNamespace( - loader=None, - submodule_search_locations=['/a', '/b'] - ) - - def _import(packagename, *args, **kwargs): - sys.modules[packagename] = package - return package - - _find_tests_args = [] - def 
_find_tests(start_dir, pattern, namespace=None): - _find_tests_args.append((start_dir, pattern)) - return ['%s/tests' % start_dir] - - loader._find_tests = _find_tests - loader.suiteClass = list - - with unittest.mock.patch('builtins.__import__', _import): - # Since loader.discover() can modify sys.path, restore it when done. - with import_helper.DirsOnSysPath(): - # Make sure to remove 'package' from sys.modules when done. - with test.test_importlib.util.uncache('package'): - suite = loader.discover('package') - - self.assertEqual(suite, ['/a/tests', '/b/tests']) - def test_discovery_failed_discovery(self): loader = unittest.TestLoader() package = types.ModuleType('package') diff --git a/Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst b/Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst new file mode 100644 index 0000000000000..a37c0b869157d --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst @@ -0,0 +1,2 @@ +Remove namespace package (PEP 420) support from unittest discovery. It was +introduced in Python 3.4 but has been broken since Python 3.7. From webhook-mailer at python.org Sun Jan 9 21:02:15 2022 From: webhook-mailer at python.org (rhettinger) Date: Mon, 10 Jan 2022 02:02:15 -0000 Subject: [Python-checkins] bpo-46270: Describe the `in` and `not in` operators as membership tests. (GH-30504) Message-ID: https://github.com/python/cpython/commit/d24cd49acb25c36db31893b84d0ca4fe2f373b94 commit: d24cd49acb25c36db31893b84d0ca4fe2f373b94 branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-09T18:02:06-08:00 summary: bpo-46270: Describe the `in` and `not in` operators as membership tests. (GH-30504) files: M Doc/tutorial/datastructures.rst diff --git a/Doc/tutorial/datastructures.rst b/Doc/tutorial/datastructures.rst index e42b380db3d23..927a6722ca251 100644 --- a/Doc/tutorial/datastructures.rst +++ b/Doc/tutorial/datastructures.rst @@ -659,10 +659,12 @@ More on Conditions The conditions used in ``while`` and ``if`` statements can contain any operators, not just comparisons. -The comparison operators ``in`` and ``not in`` check whether a value occurs -(does not occur) in a sequence. The operators ``is`` and ``is not`` compare -whether two objects are really the same object. All comparison operators have -the same priority, which is lower than that of all numerical operators. + +The comparison operators ``in`` and ``not in`` are membership tests that +determine whether a value is in (or not in) a container. The operators ``is`` +and ``is not`` compare whether two objects are really the same object. All +comparison operators have the same priority, which is lower than that of all +numerical operators. Comparisons can be chained. For example, ``a < b == c`` tests whether ``a`` is less than ``b`` and moreover ``b`` equals ``c``. From webhook-mailer at python.org Sun Jan 9 21:32:01 2022 From: webhook-mailer at python.org (rhettinger) Date: Mon, 10 Jan 2022 02:32:01 -0000 Subject: [Python-checkins] bpo-46270: Describe the `in` and `not in` operators as membership tests. (GH-30504) (GH-30509) Message-ID: https://github.com/python/cpython/commit/2e6798f35260ff90129861ef1f289ac40c0396c2 commit: 2e6798f35260ff90129861ef1f289ac40c0396c2 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: rhettinger date: 2022-01-09T18:31:51-08:00 summary: bpo-46270: Describe the `in` and `not in` operators as membership tests. 
(GH-30504) (GH-30509) files: M Doc/tutorial/datastructures.rst diff --git a/Doc/tutorial/datastructures.rst b/Doc/tutorial/datastructures.rst index e42b380db3d23..927a6722ca251 100644 --- a/Doc/tutorial/datastructures.rst +++ b/Doc/tutorial/datastructures.rst @@ -659,10 +659,12 @@ More on Conditions The conditions used in ``while`` and ``if`` statements can contain any operators, not just comparisons. -The comparison operators ``in`` and ``not in`` check whether a value occurs -(does not occur) in a sequence. The operators ``is`` and ``is not`` compare -whether two objects are really the same object. All comparison operators have -the same priority, which is lower than that of all numerical operators. + +The comparison operators ``in`` and ``not in`` are membership tests that +determine whether a value is in (or not in) a container. The operators ``is`` +and ``is not`` compare whether two objects are really the same object. All +comparison operators have the same priority, which is lower than that of all +numerical operators. Comparisons can be chained. For example, ``a < b == c`` tests whether ``a`` is less than ``b`` and moreover ``b`` equals ``c``. From webhook-mailer at python.org Mon Jan 10 07:29:17 2022 From: webhook-mailer at python.org (markshannon) Date: Mon, 10 Jan 2022 12:29:17 -0000 Subject: [Python-checkins] bpo-46314: Remove extra RESUME when compiling a lamdba. (GH-30513) Message-ID: https://github.com/python/cpython/commit/ec0c392f34ee2474ceacf66881f05546b540e2d1 commit: ec0c392f34ee2474ceacf66881f05546b540e2d1 branch: main author: Mark Shannon committer: markshannon date: 2022-01-10T12:29:02Z summary: bpo-46314: Remove extra RESUME when compiling a lamdba. (GH-30513) files: M Lib/test/test_sys_settrace.py M Python/compile.c diff --git a/Lib/test/test_sys_settrace.py b/Lib/test/test_sys_settrace.py index dc2aef1545b0c..8e430f72f63cc 100644 --- a/Lib/test/test_sys_settrace.py +++ b/Lib/test/test_sys_settrace.py @@ -1388,6 +1388,21 @@ def func(): (19, 'line'), (19, 'return')]) + def test_notrace_lambda(self): + #Regression test for issue 46314 + + def func(): + 1 + lambda x: 2 + 3 + + self.run_and_compare(func, + [(0, 'call'), + (1, 'line'), + (2, 'line'), + (3, 'line'), + (3, 'return')]) + class SkipLineEventsTraceTestCase(TraceTestCase): """Repeat the trace tests, but with per-line events skipped""" diff --git a/Python/compile.c b/Python/compile.c index 643a5e507712c..590ca9dbfc6a3 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -630,140 +630,6 @@ compiler_unit_free(struct compiler_unit *u) PyObject_Free(u); } -static int -compiler_enter_scope(struct compiler *c, identifier name, - int scope_type, void *key, int lineno) -{ - struct compiler_unit *u; - basicblock *block; - - u = (struct compiler_unit *)PyObject_Calloc(1, sizeof( - struct compiler_unit)); - if (!u) { - PyErr_NoMemory(); - return 0; - } - u->u_scope_type = scope_type; - u->u_argcount = 0; - u->u_posonlyargcount = 0; - u->u_kwonlyargcount = 0; - u->u_ste = PySymtable_Lookup(c->c_st, key); - if (!u->u_ste) { - compiler_unit_free(u); - return 0; - } - Py_INCREF(name); - u->u_name = name; - u->u_varnames = list2dict(u->u_ste->ste_varnames); - u->u_cellvars = dictbytype(u->u_ste->ste_symbols, CELL, 0, 0); - if (!u->u_varnames || !u->u_cellvars) { - compiler_unit_free(u); - return 0; - } - if (u->u_ste->ste_needs_class_closure) { - /* Cook up an implicit __class__ cell. 
*/ - _Py_IDENTIFIER(__class__); - PyObject *name; - int res; - assert(u->u_scope_type == COMPILER_SCOPE_CLASS); - assert(PyDict_GET_SIZE(u->u_cellvars) == 0); - name = _PyUnicode_FromId(&PyId___class__); - if (!name) { - compiler_unit_free(u); - return 0; - } - res = PyDict_SetItem(u->u_cellvars, name, _PyLong_GetZero()); - if (res < 0) { - compiler_unit_free(u); - return 0; - } - } - - u->u_freevars = dictbytype(u->u_ste->ste_symbols, FREE, DEF_FREE_CLASS, - PyDict_GET_SIZE(u->u_cellvars)); - if (!u->u_freevars) { - compiler_unit_free(u); - return 0; - } - - u->u_blocks = NULL; - u->u_nfblocks = 0; - u->u_firstlineno = lineno; - u->u_lineno = lineno; - u->u_col_offset = 0; - u->u_end_lineno = lineno; - u->u_end_col_offset = 0; - u->u_consts = PyDict_New(); - if (!u->u_consts) { - compiler_unit_free(u); - return 0; - } - u->u_names = PyDict_New(); - if (!u->u_names) { - compiler_unit_free(u); - return 0; - } - - u->u_private = NULL; - - /* Push the old compiler_unit on the stack. */ - if (c->u) { - PyObject *capsule = PyCapsule_New(c->u, CAPSULE_NAME, NULL); - if (!capsule || PyList_Append(c->c_stack, capsule) < 0) { - Py_XDECREF(capsule); - compiler_unit_free(u); - return 0; - } - Py_DECREF(capsule); - u->u_private = c->u->u_private; - Py_XINCREF(u->u_private); - } - c->u = u; - - c->c_nestlevel++; - - block = compiler_new_block(c); - if (block == NULL) - return 0; - c->u->u_curblock = block; - - if (u->u_scope_type != COMPILER_SCOPE_MODULE) { - if (!compiler_set_qualname(c)) - return 0; - } - - return 1; -} - -static void -compiler_exit_scope(struct compiler *c) -{ - // Don't call PySequence_DelItem() with an exception raised - PyObject *exc_type, *exc_val, *exc_tb; - PyErr_Fetch(&exc_type, &exc_val, &exc_tb); - - c->c_nestlevel--; - compiler_unit_free(c->u); - /* Restore c->u to the parent unit. */ - Py_ssize_t n = PyList_GET_SIZE(c->c_stack) - 1; - if (n >= 0) { - PyObject *capsule = PyList_GET_ITEM(c->c_stack, n); - c->u = (struct compiler_unit *)PyCapsule_GetPointer(capsule, CAPSULE_NAME); - assert(c->u); - /* we are deleting from a list so this really shouldn't fail */ - if (PySequence_DelItem(c->c_stack, n) < 0) { - _PyErr_WriteUnraisableMsg("on removing the last compiler " - "stack item", NULL); - } - compiler_unit_check(c->u); - } - else { - c->u = NULL; - } - - PyErr_Restore(exc_type, exc_val, exc_tb); -} - static int compiler_set_qualname(struct compiler *c) { @@ -1715,6 +1581,144 @@ compiler_addop_j_noline(struct compiler *c, int opcode, basicblock *b) return 0; \ } +static int +compiler_enter_scope(struct compiler *c, identifier name, + int scope_type, void *key, int lineno) +{ + struct compiler_unit *u; + basicblock *block; + + u = (struct compiler_unit *)PyObject_Calloc(1, sizeof( + struct compiler_unit)); + if (!u) { + PyErr_NoMemory(); + return 0; + } + u->u_scope_type = scope_type; + u->u_argcount = 0; + u->u_posonlyargcount = 0; + u->u_kwonlyargcount = 0; + u->u_ste = PySymtable_Lookup(c->c_st, key); + if (!u->u_ste) { + compiler_unit_free(u); + return 0; + } + Py_INCREF(name); + u->u_name = name; + u->u_varnames = list2dict(u->u_ste->ste_varnames); + u->u_cellvars = dictbytype(u->u_ste->ste_symbols, CELL, 0, 0); + if (!u->u_varnames || !u->u_cellvars) { + compiler_unit_free(u); + return 0; + } + if (u->u_ste->ste_needs_class_closure) { + /* Cook up an implicit __class__ cell. 
*/ + _Py_IDENTIFIER(__class__); + PyObject *name; + int res; + assert(u->u_scope_type == COMPILER_SCOPE_CLASS); + assert(PyDict_GET_SIZE(u->u_cellvars) == 0); + name = _PyUnicode_FromId(&PyId___class__); + if (!name) { + compiler_unit_free(u); + return 0; + } + res = PyDict_SetItem(u->u_cellvars, name, _PyLong_GetZero()); + if (res < 0) { + compiler_unit_free(u); + return 0; + } + } + + u->u_freevars = dictbytype(u->u_ste->ste_symbols, FREE, DEF_FREE_CLASS, + PyDict_GET_SIZE(u->u_cellvars)); + if (!u->u_freevars) { + compiler_unit_free(u); + return 0; + } + + u->u_blocks = NULL; + u->u_nfblocks = 0; + u->u_firstlineno = lineno; + u->u_lineno = lineno; + u->u_col_offset = 0; + u->u_end_lineno = lineno; + u->u_end_col_offset = 0; + u->u_consts = PyDict_New(); + if (!u->u_consts) { + compiler_unit_free(u); + return 0; + } + u->u_names = PyDict_New(); + if (!u->u_names) { + compiler_unit_free(u); + return 0; + } + + u->u_private = NULL; + + /* Push the old compiler_unit on the stack. */ + if (c->u) { + PyObject *capsule = PyCapsule_New(c->u, CAPSULE_NAME, NULL); + if (!capsule || PyList_Append(c->c_stack, capsule) < 0) { + Py_XDECREF(capsule); + compiler_unit_free(u); + return 0; + } + Py_DECREF(capsule); + u->u_private = c->u->u_private; + Py_XINCREF(u->u_private); + } + c->u = u; + + c->c_nestlevel++; + + block = compiler_new_block(c); + if (block == NULL) + return 0; + c->u->u_curblock = block; + + if (u->u_scope_type == COMPILER_SCOPE_MODULE) { + c->u->u_lineno = -1; + } + else { + if (!compiler_set_qualname(c)) + return 0; + } + ADDOP_I(c, RESUME, 0); + + return 1; +} + +static void +compiler_exit_scope(struct compiler *c) +{ + // Don't call PySequence_DelItem() with an exception raised + PyObject *exc_type, *exc_val, *exc_tb; + PyErr_Fetch(&exc_type, &exc_val, &exc_tb); + + c->c_nestlevel--; + compiler_unit_free(c->u); + /* Restore c->u to the parent unit. */ + Py_ssize_t n = PyList_GET_SIZE(c->c_stack) - 1; + if (n >= 0) { + PyObject *capsule = PyList_GET_ITEM(c->c_stack, n); + c->u = (struct compiler_unit *)PyCapsule_GetPointer(capsule, CAPSULE_NAME); + assert(c->u); + /* we are deleting from a list so this really shouldn't fail */ + if (PySequence_DelItem(c->c_stack, n) < 0) { + _PyErr_WriteUnraisableMsg("on removing the last compiler " + "stack item", NULL); + } + compiler_unit_check(c->u); + } + else { + c->u = NULL; + } + + PyErr_Restore(exc_type, exc_val, exc_tb); +} + /* Search if variable annotations are present statically in a block. 
*/ static int @@ -2049,10 +2053,9 @@ compiler_mod(struct compiler *c, mod_ty mod) if (module == NULL) { return 0; } - if (!compiler_enter_scope(c, module, COMPILER_SCOPE_MODULE, mod, 1)) + if (!compiler_enter_scope(c, module, COMPILER_SCOPE_MODULE, mod, 1)) { return NULL; - c->u->u_lineno = -1; - ADDOP_I(c, RESUME, 0); + } c->u->u_lineno = 1; switch (mod->kind) { case Module_kind: @@ -2508,7 +2511,6 @@ compiler_function(struct compiler *c, stmt_ty s, int is_async) if (!compiler_enter_scope(c, name, scope_type, (void *)s, firstlineno)) { return 0; } - ADDOP_I(c, RESUME, 0); /* if not -OO mode, add docstring */ if (c->c_optimize < 2) { @@ -2581,7 +2583,6 @@ compiler_class(struct compiler *c, stmt_ty s) COMPILER_SCOPE_CLASS, (void *)s, firstlineno)) { return 0; } - ADDOP_I(c, RESUME, 0); /* this block represents what we do in the new scope */ { /* use the class name for name mangling */ @@ -2914,13 +2915,11 @@ compiler_lambda(struct compiler *c, expr_ty e) if (funcflags == -1) { return 0; } - ADDOP_I(c, RESUME, 0); if (!compiler_enter_scope(c, name, COMPILER_SCOPE_LAMBDA, - (void *)e, e->lineno)) + (void *)e, e->lineno)) { return 0; - - ADDOP_I(c, RESUME, 0); + } /* Make None the first constant, so the lambda can't have a docstring. */ if (compiler_add_const(c, Py_None) < 0) @@ -5292,7 +5291,6 @@ compiler_comprehension(struct compiler *c, expr_ty e, int type, { goto error; } - ADDOP_I(c, RESUME, 0); SET_LOC(c, e); is_async_generator = c->u->u_ste->ste_coroutine; From webhook-mailer at python.org Mon Jan 10 13:59:36 2022 From: webhook-mailer at python.org (iritkatriel) Date: Mon, 10 Jan 2022 18:59:36 -0000 Subject: [Python-checkins] bpo-46332: use raise..from instead of assigning __cause__ and raising (GH-30517) Message-ID: https://github.com/python/cpython/commit/0d639678d33a0f085851a07259b8fe2782943118 commit: 0d639678d33a0f085851a07259b8fe2782943118 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-10T18:59:21Z summary: bpo-46332: use raise..from instead of assigning __cause__ and raising (GH-30517) files: M Lib/logging/config.py diff --git a/Lib/logging/config.py b/Lib/logging/config.py index 3bc63b78621ab..9bc07eddd76b4 100644 --- a/Lib/logging/config.py +++ b/Lib/logging/config.py @@ -30,7 +30,6 @@ import logging.handlers import re import struct -import sys import threading import traceback @@ -392,11 +391,9 @@ def resolve(self, s): self.importer(used) found = getattr(found, frag) return found - except ImportError: - e, tb = sys.exc_info()[1:] + except ImportError as e: v = ValueError('Cannot resolve %r: %s' % (s, e)) - v.__cause__, v.__traceback__ = e, tb - raise v + raise v from e def ext_convert(self, value): """Default converter for the ext:// protocol.""" From webhook-mailer at python.org Mon Jan 10 14:09:12 2022 From: webhook-mailer at python.org (ethanfurman) Date: Mon, 10 Jan 2022 19:09:12 -0000 Subject: [Python-checkins] bpo-46301: [Enum] fix refleak tests (GH30510) Message-ID: https://github.com/python/cpython/commit/582286d71c7ee61f5376a846a83c7be4a5727636 commit: 582286d71c7ee61f5376a846a83c7be4a5727636 branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-10T11:09:00-08:00 summary: bpo-46301: [Enum] fix refleak tests (GH30510) files: M Lib/test/test_enum.py diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 7e919fb9b4263..dfa81a52a93a4 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -4449,30 +4449,34 @@ 
def test__all__(self): COMPLEX_A = 2j COMPLEX_B = 3j -class TestIntEnumConvert(unittest.TestCase): - def setUp(self): - # Reset the module-level test variables to their original integer - # values, otherwise the already created enum values get converted - # instead. - for suffix in ['A', 'B', 'C', 'D', 'E', 'F']: - globals()[f'CONVERT_TEST_NAME_{suffix}'] = 5 - globals()[f'CONVERT_STRING_TEST_NAME_{suffix}'] = 5 +class _ModuleWrapper: + """We use this class as a namespace for swapping modules.""" + def __init__(self, module): + self.__dict__.update(module.__dict__) + +class TestIntEnumConvert(unittest.TestCase): def test_convert_value_lookup_priority(self): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # We don't want the reverse lookup value to vary when there are # multiple possible names for a given value. It should always # report the first lexigraphical name in that case. self.assertEqual(test_type(5).name, 'CONVERT_TEST_NAME_A') def test_convert(self): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # Ensure that test_type has all of the desired names and values. self.assertEqual(test_type.CONVERT_TEST_NAME_F, test_type.CONVERT_TEST_NAME_A) @@ -4487,11 +4491,16 @@ def test_convert(self): [], msg='Names other than CONVERT_TEST_* found.') def test_convert_uncomparable(self): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('UNCOMPARABLE_'), - ) + # We swap a module to some other object with `__dict__` + # because otherwise refleak is created. + # `_convert_` uses a module side effect that does this. 
See 30472 + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('UNCOMPARABLE_')) # Should be ordered by `name` only: self.assertEqual( @@ -4500,11 +4509,13 @@ def test_convert_uncomparable(self): ) def test_convert_complex(self): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('COMPLEX_'), - ) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('COMPLEX_')) # Should be ordered by `name` only: self.assertEqual( @@ -4531,10 +4542,13 @@ def test_convert_raise(self): filter=lambda x: x.startswith('CONVERT_TEST_')) def test_convert_repr_and_str(self): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STRING_TEST_')) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STRING_TEST_')) self.assertEqual(repr(test_type.CONVERT_STRING_TEST_NAME_A), '%s.CONVERT_STRING_TEST_NAME_A' % SHORT_MODULE) self.assertEqual(str(test_type.CONVERT_STRING_TEST_NAME_A), 'CONVERT_STRING_TEST_NAME_A') self.assertEqual(format(test_type.CONVERT_STRING_TEST_NAME_A), '5') @@ -4544,17 +4558,14 @@ def test_convert_repr_and_str(self): CONVERT_STR_TEST_1 = 'hello' class TestStrEnumConvert(unittest.TestCase): - def setUp(self): - global CONVERT_STR_TEST_1 - global CONVERT_STR_TEST_2 - CONVERT_STR_TEST_2 = 'goodbye' - CONVERT_STR_TEST_1 = 'hello' - def test_convert(self): - test_type = enum.StrEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STR_')) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.StrEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STR_')) # Ensure that test_type has all of the desired names and values. 
self.assertEqual(test_type.CONVERT_STR_TEST_1, 'hello') self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') @@ -4565,10 +4576,13 @@ def test_convert(self): [], msg='Names other than CONVERT_STR_* found.') def test_convert_repr_and_str(self): - test_type = enum.StrEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STR_')) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.StrEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STR_')) self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') self.assertEqual(format(test_type.CONVERT_STR_TEST_1), 'hello') From webhook-mailer at python.org Mon Jan 10 14:12:42 2022 From: webhook-mailer at python.org (ethanfurman) Date: Mon, 10 Jan 2022 19:12:42 -0000 Subject: [Python-checkins] bpo-46327: [Enum] remove skipped tests (GH-30512) Message-ID: https://github.com/python/cpython/commit/13e4659276c2af2fa5b0f2b3a31dcd69064868ef commit: 13e4659276c2af2fa5b0f2b3a31dcd69064868ef branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-10T11:12:34-08:00 summary: bpo-46327: [Enum] remove skipped tests (GH-30512) files: M Lib/test/test_enum.py diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index dfa81a52a93a4..04a68cc9ea204 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -4523,17 +4523,6 @@ def test_convert_complex(self): [uncomp.COMPLEX_A, uncomp.COMPLEX_B, uncomp.COMPLEX_C], ) - @unittest.skipUnless(python_version == (3, 8), - '_convert was deprecated in 3.8') - def test_convert_warn(self): - with self.assertWarns(DeprecationWarning): - enum.IntEnum._convert( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) - - @unittest.skipUnless(python_version >= (3, 9), - '_convert was removed in 3.9') def test_convert_raise(self): with self.assertRaises(AttributeError): enum.IntEnum._convert( From webhook-mailer at python.org Mon Jan 10 18:42:58 2022 From: webhook-mailer at python.org (ethanfurman) Date: Mon, 10 Jan 2022 23:42:58 -0000 Subject: [Python-checkins] bpo-45331: [Enum] add rule to docs that mixin type must be subclassable (GH-30521) Message-ID: https://github.com/python/cpython/commit/6223cbf86ad7d5e6d12f9747e5a9cf1d8c72bdc8 commit: 6223cbf86ad7d5e6d12f9747e5a9cf1d8c72bdc8 branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-10T15:42:45-08:00 summary: bpo-45331: [Enum] add rule to docs that mixin type must be subclassable (GH-30521) files: M Doc/howto/enum.rst diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index 91bef218b9299..6c09b9925c1de 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -819,17 +819,20 @@ Some rules: 1. When subclassing :class:`Enum`, mix-in types must appear before :class:`Enum` itself in the sequence of bases, as in the :class:`IntEnum` example above. -2. While :class:`Enum` can have members of any type, once you mix in an +2. Mix-in types must be subclassable. For example, + :class:`bool` and :class:`range` are not subclassable + and will throw an error during Enum creation if used as the mix-in type. +3. While :class:`Enum` can have members of any type, once you mix in an additional type, all the members must have values of that type, e.g. :class:`int` above. This restriction does not apply to mix-ins which only add methods and don't specify another type. -3. 
When another data type is mixed in, the :attr:`value` attribute is *not the +4. When another data type is mixed in, the :attr:`value` attribute is *not the same* as the enum member itself, although it is equivalent and will compare equal. -4. %-style formatting: `%s` and `%r` call the :class:`Enum` class's +5. %-style formatting: `%s` and `%r` call the :class:`Enum` class's :meth:`__str__` and :meth:`__repr__` respectively; other codes (such as `%i` or `%h` for IntEnum) treat the enum member as its mixed-in type. -5. :ref:`Formatted string literals `, :meth:`str.format`, +6. :ref:`Formatted string literals `, :meth:`str.format`, and :func:`format` will use the mixed-in type's :meth:`__format__` unless :meth:`__str__` or :meth:`__format__` is overridden in the subclass, in which case the overridden methods or :class:`Enum` methods will be used. From webhook-mailer at python.org Mon Jan 10 18:43:44 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Mon, 10 Jan 2022 23:43:44 -0000 Subject: [Python-checkins] bpo-46244: Remove __slots__ from typing.TypeVar, .ParamSpec (#30444) Message-ID: https://github.com/python/cpython/commit/081a2140083680ffc309e53699aea29e71760d70 commit: 081a2140083680ffc309e53699aea29e71760d70 branch: main author: Arie Bovenberg committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-11T07:43:39+08:00 summary: bpo-46244: Remove __slots__ from typing.TypeVar, .ParamSpec (#30444) * add missing __slots__ to typing._TypeVarLike * add news entry * remove slots from _TypeVarLike base classes * cleanup diff * fix broken link in blurb files: A Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst M Lib/typing.py diff --git a/Lib/typing.py b/Lib/typing.py index ae1dd5c2d7689..d520f6b2e1b3d 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -805,9 +805,6 @@ def longest(x: A, y: A) -> A: Note that only type variables defined in global scope can be pickled. """ - __slots__ = ('__name__', '__bound__', '__constraints__', - '__covariant__', '__contravariant__', '__dict__') - def __init__(self, name, *constraints, bound=None, covariant=False, contravariant=False): self.__name__ = name @@ -907,9 +904,6 @@ def add_two(x: float, y: float) -> float: be pickled. """ - __slots__ = ('__name__', '__bound__', '__covariant__', '__contravariant__', - '__dict__') - @property def args(self): return ParamSpecArgs(self) diff --git a/Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst b/Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst new file mode 100644 index 0000000000000..5ca536a97c9cd --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst @@ -0,0 +1,2 @@ +Removed ``__slots__`` from :class:`typing.ParamSpec` and :class:`typing.TypeVar`. +They served no purpose. Patch by Arie Bovenberg. 
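For illustration, a minimal sketch (not part of the commit) of why the removed ``__slots__`` had no effect: the old declarations listed ``'__dict__'`` among the slots, so ``TypeVar`` and ``ParamSpec`` instances were always dict-backed and the usual ``__slots__`` benefits never applied. The names below are examples only.

    from typing import TypeVar, ParamSpec

    T = TypeVar("T", bound=int, covariant=True)
    P = ParamSpec("P")

    # Attribute access is unchanged by the removal.
    print(T.__name__, T.__bound__, T.__covariant__)        # T <class 'int'> True

    # Instances still carry a per-instance __dict__, exactly as before.
    print(hasattr(T, "__dict__"), hasattr(P, "__dict__"))  # True True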
From webhook-mailer at python.org Mon Jan 10 22:03:47 2022 From: webhook-mailer at python.org (corona10) Date: Tue, 11 Jan 2022 03:03:47 -0000 Subject: [Python-checkins] bpo-46205: exit if no workers are alive in runtest_mp (GH-30470) Message-ID: https://github.com/python/cpython/commit/e13cdca0f5224ec4e23bdd04bb3120506964bc8b commit: e13cdca0f5224ec4e23bdd04bb3120506964bc8b branch: main author: Sam Gross committer: corona10 date: 2022-01-11T12:03:09+09:00 summary: bpo-46205: exit if no workers are alive in runtest_mp (GH-30470) files: A Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst M Lib/test/libregrtest/runtest_mp.py diff --git a/Lib/test/libregrtest/runtest_mp.py b/Lib/test/libregrtest/runtest_mp.py index f02f56e98bcda2..75444e48080ce5 100644 --- a/Lib/test/libregrtest/runtest_mp.py +++ b/Lib/test/libregrtest/runtest_mp.py @@ -392,16 +392,12 @@ def stop_workers(self) -> None: worker.wait_stopped(start_time) def _get_result(self) -> QueueOutput | None: - if not any(worker.is_alive() for worker in self.workers): - # all worker threads are done: consume pending results - try: - return self.output.get(timeout=0) - except queue.Empty: - return None - use_faulthandler = (self.ns.timeout is not None) timeout = PROGRESS_UPDATE - while True: + + # bpo-46205: check the status of workers every iteration to avoid + # waiting forever on an empty queue. + while any(worker.is_alive() for worker in self.workers): if use_faulthandler: faulthandler.dump_traceback_later(MAIN_PROCESS_TIMEOUT, exit=True) @@ -417,6 +413,12 @@ def _get_result(self) -> QueueOutput | None: if running and not self.ns.pgo: self.log('running: %s' % ', '.join(running)) + # all worker threads are done: consume pending results + try: + return self.output.get(timeout=0) + except queue.Empty: + return None + def display_result(self, mp_result: MultiprocessResult) -> None: result = mp_result.result diff --git a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst new file mode 100644 index 00000000000000..7c6121fb162495 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst @@ -0,0 +1 @@ +Fix hang in runtest_mp due to race condition From webhook-mailer at python.org Mon Jan 10 22:29:40 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 03:29:40 -0000 Subject: [Python-checkins] bpo-46205: exit if no workers are alive in runtest_mp (GH-30470) Message-ID: https://github.com/python/cpython/commit/e0ec08dc49f8e6f94a735bc9946ef7a3fd898a44 commit: e0ec08dc49f8e6f94a735bc9946ef7a3fd898a44 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-10T19:29:31-08:00 summary: bpo-46205: exit if no workers are alive in runtest_mp (GH-30470) (cherry picked from commit e13cdca0f5224ec4e23bdd04bb3120506964bc8b) Co-authored-by: Sam Gross files: A Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst M Lib/test/libregrtest/runtest_mp.py diff --git a/Lib/test/libregrtest/runtest_mp.py b/Lib/test/libregrtest/runtest_mp.py index c83e44aed0514..d0d63ef8afe91 100644 --- a/Lib/test/libregrtest/runtest_mp.py +++ b/Lib/test/libregrtest/runtest_mp.py @@ -395,16 +395,12 @@ def stop_workers(self) -> None: worker.wait_stopped(start_time) def _get_result(self) -> QueueOutput | None: - if not any(worker.is_alive() for worker in self.workers): - # all worker threads are done: 
consume pending results - try: - return self.output.get(timeout=0) - except queue.Empty: - return None - use_faulthandler = (self.ns.timeout is not None) timeout = PROGRESS_UPDATE - while True: + + # bpo-46205: check the status of workers every iteration to avoid + # waiting forever on an empty queue. + while any(worker.is_alive() for worker in self.workers): if use_faulthandler: faulthandler.dump_traceback_later(MAIN_PROCESS_TIMEOUT, exit=True) @@ -420,6 +416,12 @@ def _get_result(self) -> QueueOutput | None: if running and not self.ns.pgo: self.log('running: %s' % ', '.join(running)) + # all worker threads are done: consume pending results + try: + return self.output.get(timeout=0) + except queue.Empty: + return None + def display_result(self, mp_result: MultiprocessResult) -> None: result = mp_result.result diff --git a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst new file mode 100644 index 0000000000000..7c6121fb16249 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst @@ -0,0 +1 @@ +Fix hang in runtest_mp due to race condition From webhook-mailer at python.org Mon Jan 10 22:32:25 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 03:32:25 -0000 Subject: [Python-checkins] bpo-46205: exit if no workers are alive in runtest_mp (GH-30470) Message-ID: https://github.com/python/cpython/commit/690ed889c537c008a2c5f3e6c4f06c5b0c0afbc6 commit: 690ed889c537c008a2c5f3e6c4f06c5b0c0afbc6 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-10T19:32:15-08:00 summary: bpo-46205: exit if no workers are alive in runtest_mp (GH-30470) (cherry picked from commit e13cdca0f5224ec4e23bdd04bb3120506964bc8b) Co-authored-by: Sam Gross files: A Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst M Lib/test/libregrtest/runtest_mp.py diff --git a/Lib/test/libregrtest/runtest_mp.py b/Lib/test/libregrtest/runtest_mp.py index b9404d53d3ce7..cef877cbad8c8 100644 --- a/Lib/test/libregrtest/runtest_mp.py +++ b/Lib/test/libregrtest/runtest_mp.py @@ -396,16 +396,12 @@ def stop_workers(self) -> None: worker.wait_stopped(start_time) def _get_result(self) -> QueueOutput | None: - if not any(worker.is_alive() for worker in self.workers): - # all worker threads are done: consume pending results - try: - return self.output.get(timeout=0) - except queue.Empty: - return None - use_faulthandler = (self.ns.timeout is not None) timeout = PROGRESS_UPDATE - while True: + + # bpo-46205: check the status of workers every iteration to avoid + # waiting forever on an empty queue. 
+ while any(worker.is_alive() for worker in self.workers): if use_faulthandler: faulthandler.dump_traceback_later(MAIN_PROCESS_TIMEOUT, exit=True) @@ -421,6 +417,12 @@ def _get_result(self) -> QueueOutput | None: if running and not self.ns.pgo: self.log('running: %s' % ', '.join(running)) + # all worker threads are done: consume pending results + try: + return self.output.get(timeout=0) + except queue.Empty: + return None + def display_result(self, mp_result: MultiprocessResult) -> None: result = mp_result.result diff --git a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst new file mode 100644 index 0000000000000..7c6121fb16249 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst @@ -0,0 +1 @@ +Fix hang in runtest_mp due to race condition From webhook-mailer at python.org Tue Jan 11 05:52:05 2022 From: webhook-mailer at python.org (asvetlov) Date: Tue, 11 Jan 2022 10:52:05 -0000 Subject: [Python-checkins] bpo-46310: simplify `for` loop in `asyncio/windows_events` (GH-30334) Message-ID: https://github.com/python/cpython/commit/fc75bfb8be8494e22123f2c14d1ab497c77cc22d commit: fc75bfb8be8494e22123f2c14d1ab497c77cc22d branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-11T12:51:34+02:00 summary: bpo-46310: simplify `for` loop in `asyncio/windows_events` (GH-30334) files: M Lib/asyncio/windows_events.py diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py index 427d4624ad076..0d9a07ef4772e 100644 --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -839,7 +839,7 @@ def close(self): return # Cancel remaining registered operations. - for address, (fut, ov, obj, callback) in list(self._cache.items()): + for fut, ov, obj, callback in list(self._cache.values()): if fut.cancelled(): # Nothing to do with cancelled futures pass From webhook-mailer at python.org Tue Jan 11 05:56:29 2022 From: webhook-mailer at python.org (vstinner) Date: Tue, 11 Jan 2022 10:56:29 -0000 Subject: [Python-checkins] bpo-46303: Move fileutils.h private functions to internal C API (GH-30484) Message-ID: https://github.com/python/cpython/commit/ea1a54506b4ac38b712ba63ec884292025f16111 commit: ea1a54506b4ac38b712ba63ec884292025f16111 branch: main author: Victor Stinner committer: vstinner date: 2022-01-11T11:56:16+01:00 summary: bpo-46303: Move fileutils.h private functions to internal C API (GH-30484) Move almost all private functions of Include/cpython/fileutils.h to the internal C API Include/internal/pycore_fileutils.h. Only keep _Py_fopen_obj() in Include/cpython/fileutils.h, since it's used by _testcapi which must not use the internal C API. Move EncodeLocaleEx() and DecodeLocaleEx() functions from _testcapi to _testinternalcapi, since the C API moved to the internal C API. 
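As context for the move just summarized, a rough sketch (simplified from the test_codecs.py change in this commit; the literal arguments are illustrative only) of how test code now reaches the locale-codec helpers through _testinternalcapi instead of _testcapi:

    try:
        import _testinternalcapi              # new home of the helpers
    except ImportError:
        _testinternalcapi = None              # builds without the test module

    if _testinternalcapi is not None:
        raw = _testinternalcapi.EncodeLocaleEx("abc", 0, "strict")   # -> bytes
        txt = _testinternalcapi.DecodeLocaleEx(raw, 0, "strict")     # -> str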
files: M Include/cpython/fileutils.h M Include/internal/pycore_fileutils.h M Include/internal/pycore_unicodeobject.h M Lib/test/test_codecs.py M Modules/_testcapimodule.c M Modules/_testinternalcapi.c M Modules/_tracemalloc.c M Modules/mmapmodule.c M Modules/ossaudiodev.c M Modules/selectmodule.c M Modules/socketmodule.c M Programs/_freeze_module.c M Python/bootstrap_hash.c diff --git a/Include/cpython/fileutils.h b/Include/cpython/fileutils.h index 7ab2290184a2f..b386ad107bde1 100644 --- a/Include/cpython/fileutils.h +++ b/Include/cpython/fileutils.h @@ -2,165 +2,7 @@ # error "this header file must not be included directly" #endif -typedef enum { - _Py_ERROR_UNKNOWN=0, - _Py_ERROR_STRICT, - _Py_ERROR_SURROGATEESCAPE, - _Py_ERROR_REPLACE, - _Py_ERROR_IGNORE, - _Py_ERROR_BACKSLASHREPLACE, - _Py_ERROR_SURROGATEPASS, - _Py_ERROR_XMLCHARREFREPLACE, - _Py_ERROR_OTHER -} _Py_error_handler; - -PyAPI_FUNC(_Py_error_handler) _Py_GetErrorHandler(const char *errors); - -PyAPI_FUNC(int) _Py_DecodeLocaleEx( - const char *arg, - wchar_t **wstr, - size_t *wlen, - const char **reason, - int current_locale, - _Py_error_handler errors); - -PyAPI_FUNC(int) _Py_EncodeLocaleEx( - const wchar_t *text, - char **str, - size_t *error_pos, - const char **reason, - int current_locale, - _Py_error_handler errors); - -PyAPI_FUNC(char*) _Py_EncodeLocaleRaw( - const wchar_t *text, - size_t *error_pos); - -PyAPI_FUNC(PyObject *) _Py_device_encoding(int); - -#if defined(MS_WINDOWS) || defined(__APPLE__) - /* On Windows, the count parameter of read() is an int (bpo-9015, bpo-9611). - On macOS 10.13, read() and write() with more than INT_MAX bytes - fail with EINVAL (bpo-24658). */ -# define _PY_READ_MAX INT_MAX -# define _PY_WRITE_MAX INT_MAX -#else - /* write() should truncate the input to PY_SSIZE_T_MAX bytes, - but it's safer to do it ourself to have a portable behaviour */ -# define _PY_READ_MAX PY_SSIZE_T_MAX -# define _PY_WRITE_MAX PY_SSIZE_T_MAX -#endif - -#ifdef MS_WINDOWS -struct _Py_stat_struct { - unsigned long st_dev; - uint64_t st_ino; - unsigned short st_mode; - int st_nlink; - int st_uid; - int st_gid; - unsigned long st_rdev; - __int64 st_size; - time_t st_atime; - int st_atime_nsec; - time_t st_mtime; - int st_mtime_nsec; - time_t st_ctime; - int st_ctime_nsec; - unsigned long st_file_attributes; - unsigned long st_reparse_tag; -}; -#else -# define _Py_stat_struct stat -#endif - -PyAPI_FUNC(int) _Py_fstat( - int fd, - struct _Py_stat_struct *status); - -PyAPI_FUNC(int) _Py_fstat_noraise( - int fd, - struct _Py_stat_struct *status); - -PyAPI_FUNC(int) _Py_stat( - PyObject *path, - struct stat *status); - -PyAPI_FUNC(int) _Py_open( - const char *pathname, - int flags); - -PyAPI_FUNC(int) _Py_open_noraise( - const char *pathname, - int flags); - -PyAPI_FUNC(FILE *) _Py_wfopen( - const wchar_t *path, - const wchar_t *mode); - +// Used by _testcapi which must not use the internal C API PyAPI_FUNC(FILE*) _Py_fopen_obj( PyObject *path, const char *mode); - -PyAPI_FUNC(Py_ssize_t) _Py_read( - int fd, - void *buf, - size_t count); - -PyAPI_FUNC(Py_ssize_t) _Py_write( - int fd, - const void *buf, - size_t count); - -PyAPI_FUNC(Py_ssize_t) _Py_write_noraise( - int fd, - const void *buf, - size_t count); - -#ifdef HAVE_READLINK -PyAPI_FUNC(int) _Py_wreadlink( - const wchar_t *path, - wchar_t *buf, - /* Number of characters of 'buf' buffer - including the trailing NUL character */ - size_t buflen); -#endif - -#ifdef HAVE_REALPATH -PyAPI_FUNC(wchar_t*) _Py_wrealpath( - const wchar_t *path, - wchar_t *resolved_path, - /* 
Number of characters of 'resolved_path' buffer - including the trailing NUL character */ - size_t resolved_path_len); -#endif - -PyAPI_FUNC(wchar_t*) _Py_wgetcwd( - wchar_t *buf, - /* Number of characters of 'buf' buffer - including the trailing NUL character */ - size_t buflen); - -PyAPI_FUNC(int) _Py_get_inheritable(int fd); - -PyAPI_FUNC(int) _Py_set_inheritable(int fd, int inheritable, - int *atomic_flag_works); - -PyAPI_FUNC(int) _Py_set_inheritable_async_safe(int fd, int inheritable, - int *atomic_flag_works); - -PyAPI_FUNC(int) _Py_dup(int fd); - -#ifndef MS_WINDOWS -PyAPI_FUNC(int) _Py_get_blocking(int fd); - -PyAPI_FUNC(int) _Py_set_blocking(int fd, int blocking); -#else /* MS_WINDOWS */ -PyAPI_FUNC(void*) _Py_get_osfhandle_noraise(int fd); - -PyAPI_FUNC(void*) _Py_get_osfhandle(int fd); - -PyAPI_FUNC(int) _Py_open_osfhandle_noraise(void *handle, int flags); - -PyAPI_FUNC(int) _Py_open_osfhandle(void *handle, int flags); -#endif /* MS_WINDOWS */ diff --git a/Include/internal/pycore_fileutils.h b/Include/internal/pycore_fileutils.h index 38df73b745bcb..61c11a8b2d3b4 100644 --- a/Include/internal/pycore_fileutils.h +++ b/Include/internal/pycore_fileutils.h @@ -10,6 +10,165 @@ extern "C" { #include /* struct lconv */ +typedef enum { + _Py_ERROR_UNKNOWN=0, + _Py_ERROR_STRICT, + _Py_ERROR_SURROGATEESCAPE, + _Py_ERROR_REPLACE, + _Py_ERROR_IGNORE, + _Py_ERROR_BACKSLASHREPLACE, + _Py_ERROR_SURROGATEPASS, + _Py_ERROR_XMLCHARREFREPLACE, + _Py_ERROR_OTHER +} _Py_error_handler; + +PyAPI_FUNC(_Py_error_handler) _Py_GetErrorHandler(const char *errors); + +PyAPI_FUNC(int) _Py_DecodeLocaleEx( + const char *arg, + wchar_t **wstr, + size_t *wlen, + const char **reason, + int current_locale, + _Py_error_handler errors); + +PyAPI_FUNC(int) _Py_EncodeLocaleEx( + const wchar_t *text, + char **str, + size_t *error_pos, + const char **reason, + int current_locale, + _Py_error_handler errors); + +PyAPI_FUNC(char*) _Py_EncodeLocaleRaw( + const wchar_t *text, + size_t *error_pos); + +PyAPI_FUNC(PyObject *) _Py_device_encoding(int); + +#if defined(MS_WINDOWS) || defined(__APPLE__) + /* On Windows, the count parameter of read() is an int (bpo-9015, bpo-9611). + On macOS 10.13, read() and write() with more than INT_MAX bytes + fail with EINVAL (bpo-24658). 
*/ +# define _PY_READ_MAX INT_MAX +# define _PY_WRITE_MAX INT_MAX +#else + /* write() should truncate the input to PY_SSIZE_T_MAX bytes, + but it's safer to do it ourself to have a portable behaviour */ +# define _PY_READ_MAX PY_SSIZE_T_MAX +# define _PY_WRITE_MAX PY_SSIZE_T_MAX +#endif + +#ifdef MS_WINDOWS +struct _Py_stat_struct { + unsigned long st_dev; + uint64_t st_ino; + unsigned short st_mode; + int st_nlink; + int st_uid; + int st_gid; + unsigned long st_rdev; + __int64 st_size; + time_t st_atime; + int st_atime_nsec; + time_t st_mtime; + int st_mtime_nsec; + time_t st_ctime; + int st_ctime_nsec; + unsigned long st_file_attributes; + unsigned long st_reparse_tag; +}; +#else +# define _Py_stat_struct stat +#endif + +PyAPI_FUNC(int) _Py_fstat( + int fd, + struct _Py_stat_struct *status); + +PyAPI_FUNC(int) _Py_fstat_noraise( + int fd, + struct _Py_stat_struct *status); + +PyAPI_FUNC(int) _Py_stat( + PyObject *path, + struct stat *status); + +PyAPI_FUNC(int) _Py_open( + const char *pathname, + int flags); + +PyAPI_FUNC(int) _Py_open_noraise( + const char *pathname, + int flags); + +PyAPI_FUNC(FILE *) _Py_wfopen( + const wchar_t *path, + const wchar_t *mode); + +PyAPI_FUNC(Py_ssize_t) _Py_read( + int fd, + void *buf, + size_t count); + +PyAPI_FUNC(Py_ssize_t) _Py_write( + int fd, + const void *buf, + size_t count); + +PyAPI_FUNC(Py_ssize_t) _Py_write_noraise( + int fd, + const void *buf, + size_t count); + +#ifdef HAVE_READLINK +PyAPI_FUNC(int) _Py_wreadlink( + const wchar_t *path, + wchar_t *buf, + /* Number of characters of 'buf' buffer + including the trailing NUL character */ + size_t buflen); +#endif + +#ifdef HAVE_REALPATH +PyAPI_FUNC(wchar_t*) _Py_wrealpath( + const wchar_t *path, + wchar_t *resolved_path, + /* Number of characters of 'resolved_path' buffer + including the trailing NUL character */ + size_t resolved_path_len); +#endif + +PyAPI_FUNC(wchar_t*) _Py_wgetcwd( + wchar_t *buf, + /* Number of characters of 'buf' buffer + including the trailing NUL character */ + size_t buflen); + +PyAPI_FUNC(int) _Py_get_inheritable(int fd); + +PyAPI_FUNC(int) _Py_set_inheritable(int fd, int inheritable, + int *atomic_flag_works); + +PyAPI_FUNC(int) _Py_set_inheritable_async_safe(int fd, int inheritable, + int *atomic_flag_works); + +PyAPI_FUNC(int) _Py_dup(int fd); + +#ifndef MS_WINDOWS +PyAPI_FUNC(int) _Py_get_blocking(int fd); + +PyAPI_FUNC(int) _Py_set_blocking(int fd, int blocking); +#else /* MS_WINDOWS */ +PyAPI_FUNC(void*) _Py_get_osfhandle_noraise(int fd); + +PyAPI_FUNC(void*) _Py_get_osfhandle(int fd); + +PyAPI_FUNC(int) _Py_open_osfhandle_noraise(void *handle, int flags); + +PyAPI_FUNC(int) _Py_open_osfhandle(void *handle, int flags); +#endif /* MS_WINDOWS */ + // This is used after getting NULL back from Py_DecodeLocale(). 
#define DECODE_LOCALE_ERR(NAME, LEN) \ ((LEN) == (size_t)-2) \ diff --git a/Include/internal/pycore_unicodeobject.h b/Include/internal/pycore_unicodeobject.h index 97d11aeb8201c..3b6dfe9dbbab4 100644 --- a/Include/internal/pycore_unicodeobject.h +++ b/Include/internal/pycore_unicodeobject.h @@ -8,6 +8,8 @@ extern "C" { # error "this header requires Py_BUILD_CORE define" #endif +#include "pycore_fileutils.h" // _Py_error_handler + /* runtime lifecycle */ diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py index f924826db9438..4ad24dbb9a924 100644 --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -15,6 +15,10 @@ import _testcapi except ImportError: _testcapi = None +try: + import _testinternalcapi +except ImportError: + _testinternalcapi = None try: import ctypes @@ -3345,7 +3349,7 @@ def test_seeking_write(self): self.assertEqual(sr.readline(), b'789\n') - at unittest.skipIf(_testcapi is None, 'need _testcapi module') + at unittest.skipIf(_testinternalcapi is None, 'need _testinternalcapi module') class LocaleCodecTest(unittest.TestCase): """ Test indirectly _Py_DecodeUTF8Ex() and _Py_EncodeUTF8Ex(). @@ -3359,7 +3363,7 @@ class LocaleCodecTest(unittest.TestCase): SURROGATES = "\uDC80\uDCFF" def encode(self, text, errors="strict"): - return _testcapi.EncodeLocaleEx(text, 0, errors) + return _testinternalcapi.EncodeLocaleEx(text, 0, errors) def check_encode_strings(self, errors): for text in self.STRINGS: @@ -3399,7 +3403,7 @@ def test_encode_unsupported_error_handler(self): self.assertEqual(str(cm.exception), 'unsupported error handler') def decode(self, encoded, errors="strict"): - return _testcapi.DecodeLocaleEx(encoded, 0, errors) + return _testinternalcapi.DecodeLocaleEx(encoded, 0, errors) def check_decode_strings(self, errors): is_utf8 = (self.ENCODING == "utf-8") diff --git a/Modules/_testcapimodule.c b/Modules/_testcapimodule.c index be40d68b40b17..ea9c048554d22 100644 --- a/Modules/_testcapimodule.c +++ b/Modules/_testcapimodule.c @@ -5417,98 +5417,6 @@ bad_get(PyObject *module, PyObject *const *args, Py_ssize_t nargs) } -static PyObject * -encode_locale_ex(PyObject *self, PyObject *args) -{ - PyObject *unicode; - int current_locale = 0; - wchar_t *wstr; - PyObject *res = NULL; - const char *errors = NULL; - - if (!PyArg_ParseTuple(args, "U|is", &unicode, ¤t_locale, &errors)) { - return NULL; - } - wstr = PyUnicode_AsWideCharString(unicode, NULL); - if (wstr == NULL) { - return NULL; - } - _Py_error_handler error_handler = _Py_GetErrorHandler(errors); - - char *str = NULL; - size_t error_pos; - const char *reason = NULL; - int ret = _Py_EncodeLocaleEx(wstr, - &str, &error_pos, &reason, - current_locale, error_handler); - PyMem_Free(wstr); - - switch(ret) { - case 0: - res = PyBytes_FromString(str); - PyMem_RawFree(str); - break; - case -1: - PyErr_NoMemory(); - break; - case -2: - PyErr_Format(PyExc_RuntimeError, "encode error: pos=%zu, reason=%s", - error_pos, reason); - break; - case -3: - PyErr_SetString(PyExc_ValueError, "unsupported error handler"); - break; - default: - PyErr_SetString(PyExc_ValueError, "unknown error code"); - break; - } - return res; -} - - -static PyObject * -decode_locale_ex(PyObject *self, PyObject *args) -{ - char *str; - int current_locale = 0; - PyObject *res = NULL; - const char *errors = NULL; - - if (!PyArg_ParseTuple(args, "y|is", &str, ¤t_locale, &errors)) { - return NULL; - } - _Py_error_handler error_handler = _Py_GetErrorHandler(errors); - - wchar_t *wstr = NULL; - size_t wlen = 0; - const char *reason = NULL; - int ret 
= _Py_DecodeLocaleEx(str, - &wstr, &wlen, &reason, - current_locale, error_handler); - - switch(ret) { - case 0: - res = PyUnicode_FromWideChar(wstr, wlen); - PyMem_RawFree(wstr); - break; - case -1: - PyErr_NoMemory(); - break; - case -2: - PyErr_Format(PyExc_RuntimeError, "decode error: pos=%zu, reason=%s", - wlen, reason); - break; - case -3: - PyErr_SetString(PyExc_ValueError, "unsupported error handler"); - break; - default: - PyErr_SetString(PyExc_ValueError, "unknown error code"); - break; - } - return res; -} - - #ifdef Py_REF_DEBUG static PyObject * negative_refcount(PyObject *self, PyObject *Py_UNUSED(args)) @@ -6125,8 +6033,6 @@ static PyMethodDef TestMethods[] = { {"test_pythread_tss_key_state", test_pythread_tss_key_state, METH_VARARGS}, {"hamt", new_hamt, METH_NOARGS}, {"bad_get", (PyCFunction)(void(*)(void))bad_get, METH_FASTCALL}, - {"EncodeLocaleEx", encode_locale_ex, METH_VARARGS}, - {"DecodeLocaleEx", decode_locale_ex, METH_VARARGS}, #ifdef Py_REF_DEBUG {"negative_refcount", negative_refcount, METH_NOARGS}, #endif diff --git a/Modules/_testinternalcapi.c b/Modules/_testinternalcapi.c index 19babb06f5635..9deba3558bf94 100644 --- a/Modules/_testinternalcapi.c +++ b/Modules/_testinternalcapi.c @@ -399,6 +399,98 @@ get_getpath_codeobject(PyObject *self, PyObject *Py_UNUSED(args)) { } +static PyObject * +encode_locale_ex(PyObject *self, PyObject *args) +{ + PyObject *unicode; + int current_locale = 0; + wchar_t *wstr; + PyObject *res = NULL; + const char *errors = NULL; + + if (!PyArg_ParseTuple(args, "U|is", &unicode, ¤t_locale, &errors)) { + return NULL; + } + wstr = PyUnicode_AsWideCharString(unicode, NULL); + if (wstr == NULL) { + return NULL; + } + _Py_error_handler error_handler = _Py_GetErrorHandler(errors); + + char *str = NULL; + size_t error_pos; + const char *reason = NULL; + int ret = _Py_EncodeLocaleEx(wstr, + &str, &error_pos, &reason, + current_locale, error_handler); + PyMem_Free(wstr); + + switch(ret) { + case 0: + res = PyBytes_FromString(str); + PyMem_RawFree(str); + break; + case -1: + PyErr_NoMemory(); + break; + case -2: + PyErr_Format(PyExc_RuntimeError, "encode error: pos=%zu, reason=%s", + error_pos, reason); + break; + case -3: + PyErr_SetString(PyExc_ValueError, "unsupported error handler"); + break; + default: + PyErr_SetString(PyExc_ValueError, "unknown error code"); + break; + } + return res; +} + + +static PyObject * +decode_locale_ex(PyObject *self, PyObject *args) +{ + char *str; + int current_locale = 0; + PyObject *res = NULL; + const char *errors = NULL; + + if (!PyArg_ParseTuple(args, "y|is", &str, ¤t_locale, &errors)) { + return NULL; + } + _Py_error_handler error_handler = _Py_GetErrorHandler(errors); + + wchar_t *wstr = NULL; + size_t wlen = 0; + const char *reason = NULL; + int ret = _Py_DecodeLocaleEx(str, + &wstr, &wlen, &reason, + current_locale, error_handler); + + switch(ret) { + case 0: + res = PyUnicode_FromWideChar(wstr, wlen); + PyMem_RawFree(wstr); + break; + case -1: + PyErr_NoMemory(); + break; + case -2: + PyErr_Format(PyExc_RuntimeError, "decode error: pos=%zu, reason=%s", + wlen, reason); + break; + case -3: + PyErr_SetString(PyExc_ValueError, "unsupported error handler"); + break; + default: + PyErr_SetString(PyExc_ValueError, "unknown error code"); + break; + } + return res; +} + + static PyMethodDef TestMethods[] = { {"get_configs", get_configs, METH_NOARGS}, {"get_recursion_depth", get_recursion_depth, METH_NOARGS}, @@ -413,6 +505,8 @@ static PyMethodDef TestMethods[] = { {"test_edit_cost", test_edit_cost, 
METH_NOARGS}, {"normalize_path", normalize_path, METH_O, NULL}, {"get_getpath_codeobject", get_getpath_codeobject, METH_NOARGS, NULL}, + {"EncodeLocaleEx", encode_locale_ex, METH_VARARGS}, + {"DecodeLocaleEx", decode_locale_ex, METH_VARARGS}, {NULL, NULL} /* sentinel */ }; diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c index 68e5f0d52dd36..b838439a9cb83 100644 --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1,8 +1,9 @@ #include "Python.h" +#include "pycore_fileutils.h" // _Py_write_noraise() #include "pycore_gc.h" // PyGC_Head +#include "pycore_hashtable.h" // _Py_hashtable_t #include "pycore_pymem.h" // _Py_tracemalloc_config #include "pycore_traceback.h" -#include "pycore_hashtable.h" #include #include // malloc() diff --git a/Modules/mmapmodule.c b/Modules/mmapmodule.c index 742bcb3d145fa..7c9c28f7fab59 100644 --- a/Modules/mmapmodule.c +++ b/Modules/mmapmodule.c @@ -18,8 +18,13 @@ / ftp://squirl.nightmare.com/pub/python/python-ext. */ +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif + #define PY_SSIZE_T_CLEAN #include +#include "pycore_fileutils.h" // _Py_stat_struct #include "structmember.h" // PyMemberDef #include // offsetof() diff --git a/Modules/ossaudiodev.c b/Modules/ossaudiodev.c index 4bab9a58eb104..c9e788fd70457 100644 --- a/Modules/ossaudiodev.c +++ b/Modules/ossaudiodev.c @@ -17,8 +17,13 @@ * $Id$ */ +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif + #define PY_SSIZE_T_CLEAN #include "Python.h" +#include "pycore_fileutils.h" // _Py_write() #include "structmember.h" // PyMemberDef #include // getenv() diff --git a/Modules/selectmodule.c b/Modules/selectmodule.c index ff1c028d0d672..367e299f83ae8 100644 --- a/Modules/selectmodule.c +++ b/Modules/selectmodule.c @@ -4,11 +4,16 @@ have any value except INVALID_SOCKET. */ +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif + #if defined(HAVE_POLL_H) && !defined(_GNU_SOURCE) -#define _GNU_SOURCE +# define _GNU_SOURCE #endif #include "Python.h" +#include "pycore_fileutils.h" // _Py_set_inheritable() #include "structmember.h" // PyMemberDef #ifdef HAVE_SYS_DEVPOLL_H diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c index 89e93c58187c9..ed83f5c82625e 100644 --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -85,6 +85,10 @@ Local naming conventions: */ +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif + #ifdef __APPLE__ // Issue #35569: Expose RFC 3542 socket options. 
#define __APPLE_USE_RFC_3542 1 @@ -103,6 +107,7 @@ Local naming conventions: #define PY_SSIZE_T_CLEAN #include "Python.h" +#include "pycore_fileutils.h" // _Py_set_inheritable() #include "structmember.h" // PyMemberDef #ifdef _Py_MEMORY_SANITIZER diff --git a/Programs/_freeze_module.c b/Programs/_freeze_module.c index d50787666f81c..b2f1a24016fac 100644 --- a/Programs/_freeze_module.c +++ b/Programs/_freeze_module.c @@ -11,6 +11,7 @@ #include #include +#include "pycore_fileutils.h" // _Py_stat_struct #include #include diff --git a/Python/bootstrap_hash.c b/Python/bootstrap_hash.c index 144f7cb4a218a..3a2a7318086f7 100644 --- a/Python/bootstrap_hash.c +++ b/Python/bootstrap_hash.c @@ -1,5 +1,7 @@ #include "Python.h" #include "pycore_initconfig.h" +#include "pycore_fileutils.h" // _Py_fstat_noraise() + #ifdef MS_WINDOWS # include # include From webhook-mailer at python.org Tue Jan 11 06:29:06 2022 From: webhook-mailer at python.org (markshannon) Date: Tue, 11 Jan 2022 11:29:06 -0000 Subject: [Python-checkins] bpo-46331: Do not set line number of instruction storing doc-string. (GH-30518) Message-ID: https://github.com/python/cpython/commit/bd04fac7eb929cd11ab6985deb61d9780447fbff commit: bd04fac7eb929cd11ab6985deb61d9780447fbff branch: main author: Mark Shannon committer: markshannon date: 2022-01-11T11:28:30Z summary: bpo-46331: Do not set line number of instruction storing doc-string. (GH-30518) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst M Lib/test/test_sys_settrace.py M Python/compile.c diff --git a/Lib/test/test_sys_settrace.py b/Lib/test/test_sys_settrace.py index 8e430f72f63cc..883b2842f2c77 100644 --- a/Lib/test/test_sys_settrace.py +++ b/Lib/test/test_sys_settrace.py @@ -1403,6 +1403,25 @@ def func(): (3, 'line'), (3, 'return')]) + def test_class_creation_with_docstrings(self): + + def func(): + class Class_1: + ''' the docstring. 2''' + def __init__(self): + ''' Another docstring. 4''' + self.a = 5 + + self.run_and_compare(func, + [(0, 'call'), + (1, 'line'), + (1, 'call'), + (1, 'line'), + (2, 'line'), + (3, 'line'), + (3, 'return'), + (1, 'return')]) + class SkipLineEventsTraceTestCase(TraceTestCase): """Repeat the trace tests, but with per-line events skipped""" diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst new file mode 100644 index 0000000000000..8bb9a995cce35 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst @@ -0,0 +1,2 @@ +Do not set line number of instruction storing doc-string. Fixes regression +introduced in 3.11 alpha. diff --git a/Python/compile.c b/Python/compile.c index 590ca9dbfc6a3..0d821d4183f12 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -2034,6 +2034,7 @@ compiler_body(struct compiler *c, asdl_stmt_seq *stmts) st = (stmt_ty)asdl_seq_GET(stmts, 0); assert(st->kind == Expr_kind); VISIT(c, expr, st->v.Expr.value); + UNSET_LOC(c); if (!compiler_nameop(c, __doc__, Store)) return 0; } From webhook-mailer at python.org Tue Jan 11 06:29:53 2022 From: webhook-mailer at python.org (markshannon) Date: Tue, 11 Jan 2022 11:29:53 -0000 Subject: [Python-checkins] News item for issue 46314. (GH-30515) Message-ID: https://github.com/python/cpython/commit/7357ac94f84ae768c39a6bccf1a3f5c4e8dc8c75 commit: 7357ac94f84ae768c39a6bccf1a3f5c4e8dc8c75 branch: main author: Mark Shannon committer: markshannon date: 2022-01-11T11:29:48Z summary: News item for issue 46314. 
(GH-30515) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst new file mode 100644 index 0000000000000..c92c0cd47897b --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst @@ -0,0 +1,2 @@ +Remove spurious "call" event when creating a lambda function that was +accidentally introduced in 3.11a4. From webhook-mailer at python.org Tue Jan 11 08:59:35 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Tue, 11 Jan 2022 13:59:35 -0000 Subject: [Python-checkins] Remove unused `Any` from `Concatenate` example in typing docs (GH-30516) Message-ID: https://github.com/python/cpython/commit/73decdf0214c3ca931c22889734758acf5e65dd7 commit: 73decdf0214c3ca931c22889734758acf5e65dd7 branch: main author: Michael Oliver committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-11T21:59:26+08:00 summary: Remove unused `Any` from `Concatenate` example in typing docs (GH-30516) files: M Doc/library/typing.rst diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index 08b59d84246f8..de7aa086a9f82 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -737,7 +737,7 @@ These can be used as types in annotations using ``[]``, each having a unique syn from collections.abc import Callable from threading import Lock - from typing import Any, Concatenate, ParamSpec, TypeVar + from typing import Concatenate, ParamSpec, TypeVar P = ParamSpec('P') R = TypeVar('R') From webhook-mailer at python.org Tue Jan 11 09:21:52 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 14:21:52 -0000 Subject: [Python-checkins] Remove unused `Any` from `Concatenate` example in typing docs (GH-30516) Message-ID: https://github.com/python/cpython/commit/da8c0759d2ad6eb70896fbd1e54b0f0d3402038d commit: da8c0759d2ad6eb70896fbd1e54b0f0d3402038d branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-11T06:21:40-08:00 summary: Remove unused `Any` from `Concatenate` example in typing docs (GH-30516) (cherry picked from commit 73decdf0214c3ca931c22889734758acf5e65dd7) Co-authored-by: Michael Oliver files: M Doc/library/typing.rst diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index 29bdb80ad22c6..929749bc0b341 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -737,7 +737,7 @@ These can be used as types in annotations using ``[]``, each having a unique syn from collections.abc import Callable from threading import Lock - from typing import Any, Concatenate, ParamSpec, TypeVar + from typing import Concatenate, ParamSpec, TypeVar P = ParamSpec('P') R = TypeVar('R') From webhook-mailer at python.org Tue Jan 11 09:33:18 2022 From: webhook-mailer at python.org (benjaminp) Date: Tue, 11 Jan 2022 14:33:18 -0000 Subject: [Python-checkins] closes bpo-46253: Change Py_UNICODE to Py_UCS4 in the C API docs to match the current source code (GH-30387) Message-ID: https://github.com/python/cpython/commit/43c5c1369cb21f08a1dc1d63923c3586b883e3e8 commit: 43c5c1369cb21f08a1dc1d63923c3586b883e3e8 branch: main author: Julian Gilbey committer: benjaminp date: 2022-01-11T08:33:06-06:00 summary: closes bpo-46253: Change Py_UNICODE to Py_UCS4 in the C API docs to match the 
current source code (GH-30387) files: M Doc/c-api/unicode.rst diff --git a/Doc/c-api/unicode.rst b/Doc/c-api/unicode.rst index 6cb453ef01075..cb7f0a7b85acf 100644 --- a/Doc/c-api/unicode.rst +++ b/Doc/c-api/unicode.rst @@ -268,57 +268,57 @@ are available through these macros which are mapped to C functions depending on the Python configuration. -.. c:function:: int Py_UNICODE_ISSPACE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISSPACE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a whitespace character. -.. c:function:: int Py_UNICODE_ISLOWER(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISLOWER(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a lowercase character. -.. c:function:: int Py_UNICODE_ISUPPER(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISUPPER(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an uppercase character. -.. c:function:: int Py_UNICODE_ISTITLE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISTITLE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a titlecase character. -.. c:function:: int Py_UNICODE_ISLINEBREAK(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISLINEBREAK(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a linebreak character. -.. c:function:: int Py_UNICODE_ISDECIMAL(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISDECIMAL(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a decimal character. -.. c:function:: int Py_UNICODE_ISDIGIT(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISDIGIT(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a digit character. -.. c:function:: int Py_UNICODE_ISNUMERIC(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISNUMERIC(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a numeric character. -.. c:function:: int Py_UNICODE_ISALPHA(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISALPHA(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an alphabetic character. -.. c:function:: int Py_UNICODE_ISALNUM(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISALNUM(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an alphanumeric character. -.. c:function:: int Py_UNICODE_ISPRINTABLE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISPRINTABLE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a printable character. Nonprintable characters are those characters defined in the Unicode character @@ -332,7 +332,7 @@ the Python configuration. These APIs can be used for fast direct character conversions: -.. c:function:: Py_UNICODE Py_UNICODE_TOLOWER(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOLOWER(Py_UCS4 ch) Return the character *ch* converted to lower case. @@ -340,7 +340,7 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: Py_UNICODE Py_UNICODE_TOUPPER(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOUPPER(Py_UCS4 ch) Return the character *ch* converted to upper case. @@ -348,7 +348,7 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: Py_UNICODE Py_UNICODE_TOTITLE(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOTITLE(Py_UCS4 ch) Return the character *ch* converted to title case. @@ -356,19 +356,19 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: int Py_UNICODE_TODECIMAL(Py_UNICODE ch) +.. 
c:function:: int Py_UNICODE_TODECIMAL(Py_UCS4 ch) Return the character *ch* converted to a decimal positive integer. Return ``-1`` if this is not possible. This macro does not raise exceptions. -.. c:function:: int Py_UNICODE_TODIGIT(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_TODIGIT(Py_UCS4 ch) Return the character *ch* converted to a single digit integer. Return ``-1`` if this is not possible. This macro does not raise exceptions. -.. c:function:: double Py_UNICODE_TONUMERIC(Py_UNICODE ch) +.. c:function:: double Py_UNICODE_TONUMERIC(Py_UCS4 ch) Return the character *ch* converted to a double. Return ``-1.0`` if this is not possible. This macro does not raise exceptions. From webhook-mailer at python.org Tue Jan 11 09:53:29 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 14:53:29 -0000 Subject: [Python-checkins] closes bpo-46253: Change Py_UNICODE to Py_UCS4 in the C API docs to match the current source code (GH-30387) Message-ID: https://github.com/python/cpython/commit/4cfb10979d74b8513ec751b81454709f38e3b51a commit: 4cfb10979d74b8513ec751b81454709f38e3b51a branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-11T06:53:08-08:00 summary: closes bpo-46253: Change Py_UNICODE to Py_UCS4 in the C API docs to match the current source code (GH-30387) (cherry picked from commit 43c5c1369cb21f08a1dc1d63923c3586b883e3e8) Co-authored-by: Julian Gilbey files: M Doc/c-api/unicode.rst diff --git a/Doc/c-api/unicode.rst b/Doc/c-api/unicode.rst index cb2438e24b749..343eaf1444df6 100644 --- a/Doc/c-api/unicode.rst +++ b/Doc/c-api/unicode.rst @@ -268,57 +268,57 @@ are available through these macros which are mapped to C functions depending on the Python configuration. -.. c:function:: int Py_UNICODE_ISSPACE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISSPACE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a whitespace character. -.. c:function:: int Py_UNICODE_ISLOWER(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISLOWER(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a lowercase character. -.. c:function:: int Py_UNICODE_ISUPPER(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISUPPER(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an uppercase character. -.. c:function:: int Py_UNICODE_ISTITLE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISTITLE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a titlecase character. -.. c:function:: int Py_UNICODE_ISLINEBREAK(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISLINEBREAK(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a linebreak character. -.. c:function:: int Py_UNICODE_ISDECIMAL(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISDECIMAL(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a decimal character. -.. c:function:: int Py_UNICODE_ISDIGIT(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISDIGIT(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a digit character. -.. c:function:: int Py_UNICODE_ISNUMERIC(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISNUMERIC(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a numeric character. -.. c:function:: int Py_UNICODE_ISALPHA(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISALPHA(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an alphabetic character. -.. c:function:: int Py_UNICODE_ISALNUM(Py_UNICODE ch) +.. 
c:function:: int Py_UNICODE_ISALNUM(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an alphanumeric character. -.. c:function:: int Py_UNICODE_ISPRINTABLE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISPRINTABLE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a printable character. Nonprintable characters are those characters defined in the Unicode character @@ -332,7 +332,7 @@ the Python configuration. These APIs can be used for fast direct character conversions: -.. c:function:: Py_UNICODE Py_UNICODE_TOLOWER(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOLOWER(Py_UCS4 ch) Return the character *ch* converted to lower case. @@ -340,7 +340,7 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: Py_UNICODE Py_UNICODE_TOUPPER(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOUPPER(Py_UCS4 ch) Return the character *ch* converted to upper case. @@ -348,7 +348,7 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: Py_UNICODE Py_UNICODE_TOTITLE(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOTITLE(Py_UCS4 ch) Return the character *ch* converted to title case. @@ -356,19 +356,19 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: int Py_UNICODE_TODECIMAL(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_TODECIMAL(Py_UCS4 ch) Return the character *ch* converted to a decimal positive integer. Return ``-1`` if this is not possible. This macro does not raise exceptions. -.. c:function:: int Py_UNICODE_TODIGIT(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_TODIGIT(Py_UCS4 ch) Return the character *ch* converted to a single digit integer. Return ``-1`` if this is not possible. This macro does not raise exceptions. -.. c:function:: double Py_UNICODE_TONUMERIC(Py_UNICODE ch) +.. c:function:: double Py_UNICODE_TONUMERIC(Py_UCS4 ch) Return the character *ch* converted to a double. Return ``-1.0`` if this is not possible. This macro does not raise exceptions. From webhook-mailer at python.org Tue Jan 11 09:59:38 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 14:59:38 -0000 Subject: [Python-checkins] closes bpo-46253: Change Py_UNICODE to Py_UCS4 in the C API docs to match the current source code (GH-30387) Message-ID: https://github.com/python/cpython/commit/238a36b753affd373a315b81a5024aed7ebf6479 commit: 238a36b753affd373a315b81a5024aed7ebf6479 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-11T06:59:26-08:00 summary: closes bpo-46253: Change Py_UNICODE to Py_UCS4 in the C API docs to match the current source code (GH-30387) (cherry picked from commit 43c5c1369cb21f08a1dc1d63923c3586b883e3e8) Co-authored-by: Julian Gilbey files: M Doc/c-api/unicode.rst diff --git a/Doc/c-api/unicode.rst b/Doc/c-api/unicode.rst index 8e237046ecd25..0d13d949e38c7 100644 --- a/Doc/c-api/unicode.rst +++ b/Doc/c-api/unicode.rst @@ -268,57 +268,57 @@ are available through these macros which are mapped to C functions depending on the Python configuration. -.. c:function:: int Py_UNICODE_ISSPACE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISSPACE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a whitespace character. -.. c:function:: int Py_UNICODE_ISLOWER(Py_UNICODE ch) +.. 
c:function:: int Py_UNICODE_ISLOWER(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a lowercase character. -.. c:function:: int Py_UNICODE_ISUPPER(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISUPPER(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an uppercase character. -.. c:function:: int Py_UNICODE_ISTITLE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISTITLE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a titlecase character. -.. c:function:: int Py_UNICODE_ISLINEBREAK(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISLINEBREAK(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a linebreak character. -.. c:function:: int Py_UNICODE_ISDECIMAL(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISDECIMAL(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a decimal character. -.. c:function:: int Py_UNICODE_ISDIGIT(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISDIGIT(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a digit character. -.. c:function:: int Py_UNICODE_ISNUMERIC(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISNUMERIC(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a numeric character. -.. c:function:: int Py_UNICODE_ISALPHA(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISALPHA(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an alphabetic character. -.. c:function:: int Py_UNICODE_ISALNUM(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISALNUM(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is an alphanumeric character. -.. c:function:: int Py_UNICODE_ISPRINTABLE(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_ISPRINTABLE(Py_UCS4 ch) Return ``1`` or ``0`` depending on whether *ch* is a printable character. Nonprintable characters are those characters defined in the Unicode character @@ -332,7 +332,7 @@ the Python configuration. These APIs can be used for fast direct character conversions: -.. c:function:: Py_UNICODE Py_UNICODE_TOLOWER(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOLOWER(Py_UCS4 ch) Return the character *ch* converted to lower case. @@ -340,7 +340,7 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: Py_UNICODE Py_UNICODE_TOUPPER(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOUPPER(Py_UCS4 ch) Return the character *ch* converted to upper case. @@ -348,7 +348,7 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: Py_UNICODE Py_UNICODE_TOTITLE(Py_UNICODE ch) +.. c:function:: Py_UCS4 Py_UNICODE_TOTITLE(Py_UCS4 ch) Return the character *ch* converted to title case. @@ -356,19 +356,19 @@ These APIs can be used for fast direct character conversions: This function uses simple case mappings. -.. c:function:: int Py_UNICODE_TODECIMAL(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_TODECIMAL(Py_UCS4 ch) Return the character *ch* converted to a decimal positive integer. Return ``-1`` if this is not possible. This macro does not raise exceptions. -.. c:function:: int Py_UNICODE_TODIGIT(Py_UNICODE ch) +.. c:function:: int Py_UNICODE_TODIGIT(Py_UCS4 ch) Return the character *ch* converted to a single digit integer. Return ``-1`` if this is not possible. This macro does not raise exceptions. -.. c:function:: double Py_UNICODE_TONUMERIC(Py_UNICODE ch) +.. c:function:: double Py_UNICODE_TONUMERIC(Py_UCS4 ch) Return the character *ch* converted to a double. Return ``-1.0`` if this is not possible. 
This macro does not raise exceptions. From webhook-mailer at python.org Tue Jan 11 11:30:44 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 16:30:44 -0000 Subject: [Python-checkins] bpo-46339: Fix crash in the parser when computing error text for multi-line f-strings (GH-30529) Message-ID: https://github.com/python/cpython/commit/cedec19be81e6bd153678bfb28c8e217af8bda58 commit: cedec19be81e6bd153678bfb28c8e217af8bda58 branch: main author: Pablo Galindo Salgado committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-11T08:30:39-08:00 summary: bpo-46339: Fix crash in the parser when computing error text for multi-line f-strings (GH-30529) Automerge-Triggered-By: GH:pablogsal files: A Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst M Lib/test/test_exceptions.py M Parser/pegen_errors.c diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index e4d685f4154ed..531b9c92deae5 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -280,6 +280,12 @@ def baz(): } \"\"\" }'''""", 5, 17) + check('''f""" + + + { + 6 + 0="""''', 5, 13) # Errors thrown by symtable.c check('x = [(yield i) for i in range(3)]', 1, 7) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst new file mode 100644 index 0000000000000..cd04f060826b2 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst @@ -0,0 +1,3 @@ +Fix a crash in the parser when retrieving the error text for multi-line +f-strings expressions that do not start in the first line of the string. +Patch by Pablo Galindo diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index f07d9d8a34df7..bffae8532ca2b 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -250,8 +250,15 @@ get_error_line_from_tokenizer_buffers(Parser *p, Py_ssize_t lineno) char *cur_line = p->tok->fp_interactive ? p->tok->interactive_src_start : p->tok->str; assert(cur_line != NULL); - for (int i = 0; i < lineno - 1; i++) { - cur_line = strchr(cur_line, '\n') + 1; + Py_ssize_t relative_lineno = p->starting_lineno ? lineno - p->starting_lineno + 1 : lineno; + + for (int i = 0; i < relative_lineno - 1; i++) { + char *new_line = strchr(cur_line, '\n') + 1; + assert(new_line != NULL && new_line < p->tok->inp); + if (new_line == NULL || new_line >= p->tok->inp) { + break; + } + cur_line = new_line; } char *next_newline; From webhook-mailer at python.org Tue Jan 11 11:32:51 2022 From: webhook-mailer at python.org (JulienPalard) Date: Tue, 11 Jan 2022 16:32:51 -0000 Subject: [Python-checkins] [doc] Add license_url for python-docs-theme 2022.1. (GH-30527) Message-ID: https://github.com/python/cpython/commit/6f05e1ec193c132015e9a23d1137b1731596f186 commit: 6f05e1ec193c132015e9a23d1137b1731596f186 branch: main author: Julien Palard committer: JulienPalard date: 2022-01-11T17:32:42+01:00 summary: [doc] Add license_url for python-docs-theme 2022.1. (GH-30527) files: M Doc/conf.py M Doc/requirements.txt diff --git a/Doc/conf.py b/Doc/conf.py index f626ce67b3c74..cbf201a314eec 100644 --- a/Doc/conf.py +++ b/Doc/conf.py @@ -69,7 +69,8 @@ html_theme_path = ['tools'] html_theme_options = { 'collapsiblesidebar': True, - 'issues_url': 'https://docs.python.org/3/bugs.html', + 'issues_url': '/bugs.html', + 'license_url': '/license.html', 'root_include_title': False # We use the version switcher instead. 
} diff --git a/Doc/requirements.txt b/Doc/requirements.txt index 785da2c321784..0331a8dbebc46 100644 --- a/Doc/requirements.txt +++ b/Doc/requirements.txt @@ -9,4 +9,4 @@ blurb # The theme used by the documentation is stored separately, so we need # to install that as well. -python-docs-theme +python-docs-theme>=2022.1 From webhook-mailer at python.org Tue Jan 11 11:33:13 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 16:33:13 -0000 Subject: [Python-checkins] bpo-46237: Fix the line number of tokenizer errors inside f-strings (GH-30463) Message-ID: https://github.com/python/cpython/commit/19a85501cee24a6e426a431243d0adcb5664c6fe commit: 19a85501cee24a6e426a431243d0adcb5664c6fe branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-11T08:33:08-08:00 summary: bpo-46237: Fix the line number of tokenizer errors inside f-strings (GH-30463) (cherry picked from commit 6fa8b2ceee38187b0ae96aee12fe4f0a5c8a2ce7) Co-authored-by: Pablo Galindo Salgado files: A Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst M Lib/test/test_exceptions.py M Parser/pegen.c M Parser/string_parser.c diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index cc0640dda0980..86b5dccaaed98 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -266,6 +266,18 @@ def baz(): check("(1+)", 1, 4) check("[interesting\nfoo()\n", 1, 1) check(b"\xef\xbb\xbf#coding: utf8\nprint('\xe6\x88\x91')\n", 0, -1) + check("""f''' + { + (123_a) + }'''""", 3, 17) + check("""f''' + { + f\"\"\" + { + (123_a) + } + \"\"\" + }'''""", 5, 17) # Errors thrown by symtable.c check('x = [(yield i) for i in range(3)]', 1, 7) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst new file mode 100644 index 0000000000000..931a2603293c3 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst @@ -0,0 +1,2 @@ +Fix the line number of tokenizer errors inside f-strings. Patch by Pablo +Galindo. diff --git a/Parser/pegen.c b/Parser/pegen.c index 0504906c947d0..e507415f6d14c 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -701,10 +701,10 @@ initialize_token(Parser *p, Token *token, const char *start, const char *end, in int col_offset = (start != NULL && start >= line_start) ? (int)(start - line_start) : -1; int end_col_offset = (end != NULL && end >= p->tok->line_start) ? (int)(end - p->tok->line_start) : -1; - token->lineno = p->starting_lineno + lineno; - token->col_offset = p->tok->lineno == 1 ? p->starting_col_offset + col_offset : col_offset; - token->end_lineno = p->starting_lineno + end_lineno; - token->end_col_offset = p->tok->lineno == 1 ? p->starting_col_offset + end_col_offset : end_col_offset; + token->lineno = lineno; + token->col_offset = p->tok->lineno == p->starting_lineno ? p->starting_col_offset + col_offset : col_offset; + token->end_lineno = end_lineno; + token->end_col_offset = p->tok->lineno == p->starting_lineno ? 
p->starting_col_offset + end_col_offset : end_col_offset; p->fill += 1; diff --git a/Parser/string_parser.c b/Parser/string_parser.c index dcd298cb358ee..c83e63fc6f8f2 100644 --- a/Parser/string_parser.c +++ b/Parser/string_parser.c @@ -392,11 +392,14 @@ fstring_compile_expr(Parser *p, const char *expr_start, const char *expr_end, return NULL; } Py_INCREF(p->tok->filename); + tok->filename = p->tok->filename; + tok->lineno = t->lineno + lines - 1; Parser *p2 = _PyPegen_Parser_New(tok, Py_fstring_input, p->flags, p->feature_version, NULL, p->arena); - p2->starting_lineno = t->lineno + lines - 1; + + p2->starting_lineno = t->lineno + lines; p2->starting_col_offset = t->col_offset + cols; expr = _PyPegen_run_parser(p2); From webhook-mailer at python.org Tue Jan 11 11:37:37 2022 From: webhook-mailer at python.org (ericsnowcurrently) Date: Tue, 11 Jan 2022 16:37:37 -0000 Subject: [Python-checkins] bpo-45953: Statically allocate and initialize global bytes objects. (gh-30096) Message-ID: https://github.com/python/cpython/commit/cf496d657a1a82eaf9ebfb47d721676fef6effa5 commit: cf496d657a1a82eaf9ebfb47d721676fef6effa5 branch: main author: Eric Snow committer: ericsnowcurrently date: 2022-01-11T09:37:24-07:00 summary: bpo-45953: Statically allocate and initialize global bytes objects. (gh-30096) The empty bytes object (b'') and the 256 one-character bytes objects were allocated at runtime init. Now we statically allocate and initialize them. https://bugs.python.org/issue45953 files: M Include/internal/pycore_bytesobject.h M Include/internal/pycore_global_objects.h M Include/internal/pycore_interp.h M Objects/bytesobject.c M Python/pylifecycle.c diff --git a/Include/internal/pycore_bytesobject.h b/Include/internal/pycore_bytesobject.h index b00ed9784ef34..18d9530aaf41e 100644 --- a/Include/internal/pycore_bytesobject.h +++ b/Include/internal/pycore_bytesobject.h @@ -11,17 +11,7 @@ extern "C" { /* runtime lifecycle */ -extern PyStatus _PyBytes_InitGlobalObjects(PyInterpreterState *); extern PyStatus _PyBytes_InitTypes(PyInterpreterState *); -extern void _PyBytes_Fini(PyInterpreterState *); - - -/* other API */ - -struct _Py_bytes_state { - PyObject *empty_string; - PyBytesObject *characters[256]; -}; #ifdef __cplusplus diff --git a/Include/internal/pycore_global_objects.h b/Include/internal/pycore_global_objects.h index 6cae3bca6be45..d2dc907c53d6d 100644 --- a/Include/internal/pycore_global_objects.h +++ b/Include/internal/pycore_global_objects.h @@ -34,6 +34,20 @@ extern "C" { } +/* bytes objects */ + +#define _PyBytes_SIMPLE_INIT(CH, LEN) \ + { \ + _PyVarObject_IMMORTAL_INIT(&PyBytes_Type, LEN), \ + .ob_shash = -1, \ + .ob_sval = { CH }, \ + } +#define _PyBytes_CHAR_INIT(CH) \ + { \ + _PyBytes_SIMPLE_INIT(CH, 1) \ + } + + /********************** * the global objects * **********************/ @@ -54,6 +68,12 @@ struct _Py_global_objects { * -_PY_NSMALLNEGINTS (inclusive) to _PY_NSMALLPOSINTS (exclusive). 
*/ PyLongObject small_ints[_PY_NSMALLNEGINTS + _PY_NSMALLPOSINTS]; + + PyBytesObject bytes_empty; + struct { + PyBytesObject ob; + char eos; + } bytes_characters[256]; } singletons; }; @@ -323,6 +343,266 @@ struct _Py_global_objects { _PyLong_DIGIT_INIT(255), \ _PyLong_DIGIT_INIT(256), \ }, \ + \ + .bytes_empty = _PyBytes_SIMPLE_INIT(0, 0), \ + .bytes_characters = { \ + _PyBytes_CHAR_INIT(0), \ + _PyBytes_CHAR_INIT(1), \ + _PyBytes_CHAR_INIT(2), \ + _PyBytes_CHAR_INIT(3), \ + _PyBytes_CHAR_INIT(4), \ + _PyBytes_CHAR_INIT(5), \ + _PyBytes_CHAR_INIT(6), \ + _PyBytes_CHAR_INIT(7), \ + _PyBytes_CHAR_INIT(8), \ + _PyBytes_CHAR_INIT(9), \ + _PyBytes_CHAR_INIT(10), \ + _PyBytes_CHAR_INIT(11), \ + _PyBytes_CHAR_INIT(12), \ + _PyBytes_CHAR_INIT(13), \ + _PyBytes_CHAR_INIT(14), \ + _PyBytes_CHAR_INIT(15), \ + _PyBytes_CHAR_INIT(16), \ + _PyBytes_CHAR_INIT(17), \ + _PyBytes_CHAR_INIT(18), \ + _PyBytes_CHAR_INIT(19), \ + _PyBytes_CHAR_INIT(20), \ + _PyBytes_CHAR_INIT(21), \ + _PyBytes_CHAR_INIT(22), \ + _PyBytes_CHAR_INIT(23), \ + _PyBytes_CHAR_INIT(24), \ + _PyBytes_CHAR_INIT(25), \ + _PyBytes_CHAR_INIT(26), \ + _PyBytes_CHAR_INIT(27), \ + _PyBytes_CHAR_INIT(28), \ + _PyBytes_CHAR_INIT(29), \ + _PyBytes_CHAR_INIT(30), \ + _PyBytes_CHAR_INIT(31), \ + _PyBytes_CHAR_INIT(32), \ + _PyBytes_CHAR_INIT(33), \ + _PyBytes_CHAR_INIT(34), \ + _PyBytes_CHAR_INIT(35), \ + _PyBytes_CHAR_INIT(36), \ + _PyBytes_CHAR_INIT(37), \ + _PyBytes_CHAR_INIT(38), \ + _PyBytes_CHAR_INIT(39), \ + _PyBytes_CHAR_INIT(40), \ + _PyBytes_CHAR_INIT(41), \ + _PyBytes_CHAR_INIT(42), \ + _PyBytes_CHAR_INIT(43), \ + _PyBytes_CHAR_INIT(44), \ + _PyBytes_CHAR_INIT(45), \ + _PyBytes_CHAR_INIT(46), \ + _PyBytes_CHAR_INIT(47), \ + _PyBytes_CHAR_INIT(48), \ + _PyBytes_CHAR_INIT(49), \ + _PyBytes_CHAR_INIT(50), \ + _PyBytes_CHAR_INIT(51), \ + _PyBytes_CHAR_INIT(52), \ + _PyBytes_CHAR_INIT(53), \ + _PyBytes_CHAR_INIT(54), \ + _PyBytes_CHAR_INIT(55), \ + _PyBytes_CHAR_INIT(56), \ + _PyBytes_CHAR_INIT(57), \ + _PyBytes_CHAR_INIT(58), \ + _PyBytes_CHAR_INIT(59), \ + _PyBytes_CHAR_INIT(60), \ + _PyBytes_CHAR_INIT(61), \ + _PyBytes_CHAR_INIT(62), \ + _PyBytes_CHAR_INIT(63), \ + _PyBytes_CHAR_INIT(64), \ + _PyBytes_CHAR_INIT(65), \ + _PyBytes_CHAR_INIT(66), \ + _PyBytes_CHAR_INIT(67), \ + _PyBytes_CHAR_INIT(68), \ + _PyBytes_CHAR_INIT(69), \ + _PyBytes_CHAR_INIT(70), \ + _PyBytes_CHAR_INIT(71), \ + _PyBytes_CHAR_INIT(72), \ + _PyBytes_CHAR_INIT(73), \ + _PyBytes_CHAR_INIT(74), \ + _PyBytes_CHAR_INIT(75), \ + _PyBytes_CHAR_INIT(76), \ + _PyBytes_CHAR_INIT(77), \ + _PyBytes_CHAR_INIT(78), \ + _PyBytes_CHAR_INIT(79), \ + _PyBytes_CHAR_INIT(80), \ + _PyBytes_CHAR_INIT(81), \ + _PyBytes_CHAR_INIT(82), \ + _PyBytes_CHAR_INIT(83), \ + _PyBytes_CHAR_INIT(84), \ + _PyBytes_CHAR_INIT(85), \ + _PyBytes_CHAR_INIT(86), \ + _PyBytes_CHAR_INIT(87), \ + _PyBytes_CHAR_INIT(88), \ + _PyBytes_CHAR_INIT(89), \ + _PyBytes_CHAR_INIT(90), \ + _PyBytes_CHAR_INIT(91), \ + _PyBytes_CHAR_INIT(92), \ + _PyBytes_CHAR_INIT(93), \ + _PyBytes_CHAR_INIT(94), \ + _PyBytes_CHAR_INIT(95), \ + _PyBytes_CHAR_INIT(96), \ + _PyBytes_CHAR_INIT(97), \ + _PyBytes_CHAR_INIT(98), \ + _PyBytes_CHAR_INIT(99), \ + _PyBytes_CHAR_INIT(100), \ + _PyBytes_CHAR_INIT(101), \ + _PyBytes_CHAR_INIT(102), \ + _PyBytes_CHAR_INIT(103), \ + _PyBytes_CHAR_INIT(104), \ + _PyBytes_CHAR_INIT(105), \ + _PyBytes_CHAR_INIT(106), \ + _PyBytes_CHAR_INIT(107), \ + _PyBytes_CHAR_INIT(108), \ + _PyBytes_CHAR_INIT(109), \ + _PyBytes_CHAR_INIT(110), \ + _PyBytes_CHAR_INIT(111), \ + _PyBytes_CHAR_INIT(112), \ + 
_PyBytes_CHAR_INIT(113), \ + _PyBytes_CHAR_INIT(114), \ + _PyBytes_CHAR_INIT(115), \ + _PyBytes_CHAR_INIT(116), \ + _PyBytes_CHAR_INIT(117), \ + _PyBytes_CHAR_INIT(118), \ + _PyBytes_CHAR_INIT(119), \ + _PyBytes_CHAR_INIT(120), \ + _PyBytes_CHAR_INIT(121), \ + _PyBytes_CHAR_INIT(122), \ + _PyBytes_CHAR_INIT(123), \ + _PyBytes_CHAR_INIT(124), \ + _PyBytes_CHAR_INIT(125), \ + _PyBytes_CHAR_INIT(126), \ + _PyBytes_CHAR_INIT(127), \ + _PyBytes_CHAR_INIT(128), \ + _PyBytes_CHAR_INIT(129), \ + _PyBytes_CHAR_INIT(130), \ + _PyBytes_CHAR_INIT(131), \ + _PyBytes_CHAR_INIT(132), \ + _PyBytes_CHAR_INIT(133), \ + _PyBytes_CHAR_INIT(134), \ + _PyBytes_CHAR_INIT(135), \ + _PyBytes_CHAR_INIT(136), \ + _PyBytes_CHAR_INIT(137), \ + _PyBytes_CHAR_INIT(138), \ + _PyBytes_CHAR_INIT(139), \ + _PyBytes_CHAR_INIT(140), \ + _PyBytes_CHAR_INIT(141), \ + _PyBytes_CHAR_INIT(142), \ + _PyBytes_CHAR_INIT(143), \ + _PyBytes_CHAR_INIT(144), \ + _PyBytes_CHAR_INIT(145), \ + _PyBytes_CHAR_INIT(146), \ + _PyBytes_CHAR_INIT(147), \ + _PyBytes_CHAR_INIT(148), \ + _PyBytes_CHAR_INIT(149), \ + _PyBytes_CHAR_INIT(150), \ + _PyBytes_CHAR_INIT(151), \ + _PyBytes_CHAR_INIT(152), \ + _PyBytes_CHAR_INIT(153), \ + _PyBytes_CHAR_INIT(154), \ + _PyBytes_CHAR_INIT(155), \ + _PyBytes_CHAR_INIT(156), \ + _PyBytes_CHAR_INIT(157), \ + _PyBytes_CHAR_INIT(158), \ + _PyBytes_CHAR_INIT(159), \ + _PyBytes_CHAR_INIT(160), \ + _PyBytes_CHAR_INIT(161), \ + _PyBytes_CHAR_INIT(162), \ + _PyBytes_CHAR_INIT(163), \ + _PyBytes_CHAR_INIT(164), \ + _PyBytes_CHAR_INIT(165), \ + _PyBytes_CHAR_INIT(166), \ + _PyBytes_CHAR_INIT(167), \ + _PyBytes_CHAR_INIT(168), \ + _PyBytes_CHAR_INIT(169), \ + _PyBytes_CHAR_INIT(170), \ + _PyBytes_CHAR_INIT(171), \ + _PyBytes_CHAR_INIT(172), \ + _PyBytes_CHAR_INIT(173), \ + _PyBytes_CHAR_INIT(174), \ + _PyBytes_CHAR_INIT(175), \ + _PyBytes_CHAR_INIT(176), \ + _PyBytes_CHAR_INIT(177), \ + _PyBytes_CHAR_INIT(178), \ + _PyBytes_CHAR_INIT(179), \ + _PyBytes_CHAR_INIT(180), \ + _PyBytes_CHAR_INIT(181), \ + _PyBytes_CHAR_INIT(182), \ + _PyBytes_CHAR_INIT(183), \ + _PyBytes_CHAR_INIT(184), \ + _PyBytes_CHAR_INIT(185), \ + _PyBytes_CHAR_INIT(186), \ + _PyBytes_CHAR_INIT(187), \ + _PyBytes_CHAR_INIT(188), \ + _PyBytes_CHAR_INIT(189), \ + _PyBytes_CHAR_INIT(190), \ + _PyBytes_CHAR_INIT(191), \ + _PyBytes_CHAR_INIT(192), \ + _PyBytes_CHAR_INIT(193), \ + _PyBytes_CHAR_INIT(194), \ + _PyBytes_CHAR_INIT(195), \ + _PyBytes_CHAR_INIT(196), \ + _PyBytes_CHAR_INIT(197), \ + _PyBytes_CHAR_INIT(198), \ + _PyBytes_CHAR_INIT(199), \ + _PyBytes_CHAR_INIT(200), \ + _PyBytes_CHAR_INIT(201), \ + _PyBytes_CHAR_INIT(202), \ + _PyBytes_CHAR_INIT(203), \ + _PyBytes_CHAR_INIT(204), \ + _PyBytes_CHAR_INIT(205), \ + _PyBytes_CHAR_INIT(206), \ + _PyBytes_CHAR_INIT(207), \ + _PyBytes_CHAR_INIT(208), \ + _PyBytes_CHAR_INIT(209), \ + _PyBytes_CHAR_INIT(210), \ + _PyBytes_CHAR_INIT(211), \ + _PyBytes_CHAR_INIT(212), \ + _PyBytes_CHAR_INIT(213), \ + _PyBytes_CHAR_INIT(214), \ + _PyBytes_CHAR_INIT(215), \ + _PyBytes_CHAR_INIT(216), \ + _PyBytes_CHAR_INIT(217), \ + _PyBytes_CHAR_INIT(218), \ + _PyBytes_CHAR_INIT(219), \ + _PyBytes_CHAR_INIT(220), \ + _PyBytes_CHAR_INIT(221), \ + _PyBytes_CHAR_INIT(222), \ + _PyBytes_CHAR_INIT(223), \ + _PyBytes_CHAR_INIT(224), \ + _PyBytes_CHAR_INIT(225), \ + _PyBytes_CHAR_INIT(226), \ + _PyBytes_CHAR_INIT(227), \ + _PyBytes_CHAR_INIT(228), \ + _PyBytes_CHAR_INIT(229), \ + _PyBytes_CHAR_INIT(230), \ + _PyBytes_CHAR_INIT(231), \ + _PyBytes_CHAR_INIT(232), \ + _PyBytes_CHAR_INIT(233), \ + _PyBytes_CHAR_INIT(234), \ + 
_PyBytes_CHAR_INIT(235), \ + _PyBytes_CHAR_INIT(236), \ + _PyBytes_CHAR_INIT(237), \ + _PyBytes_CHAR_INIT(238), \ + _PyBytes_CHAR_INIT(239), \ + _PyBytes_CHAR_INIT(240), \ + _PyBytes_CHAR_INIT(241), \ + _PyBytes_CHAR_INIT(242), \ + _PyBytes_CHAR_INIT(243), \ + _PyBytes_CHAR_INIT(244), \ + _PyBytes_CHAR_INIT(245), \ + _PyBytes_CHAR_INIT(246), \ + _PyBytes_CHAR_INIT(247), \ + _PyBytes_CHAR_INIT(248), \ + _PyBytes_CHAR_INIT(249), \ + _PyBytes_CHAR_INIT(250), \ + _PyBytes_CHAR_INIT(251), \ + _PyBytes_CHAR_INIT(252), \ + _PyBytes_CHAR_INIT(253), \ + _PyBytes_CHAR_INIT(254), \ + _PyBytes_CHAR_INIT(255), \ + }, \ }, \ } diff --git a/Include/internal/pycore_interp.h b/Include/internal/pycore_interp.h index e4d7b1b8752ea..d48ea87fd67fe 100644 --- a/Include/internal/pycore_interp.h +++ b/Include/internal/pycore_interp.h @@ -10,7 +10,6 @@ extern "C" { #include "pycore_atomic.h" // _Py_atomic_address #include "pycore_ast_state.h" // struct ast_state -#include "pycore_bytesobject.h" // struct _Py_bytes_state #include "pycore_context.h" // struct _Py_context_state #include "pycore_dict.h" // struct _Py_dict_state #include "pycore_exceptions.h" // struct _Py_exc_state @@ -152,7 +151,6 @@ struct _is { PyObject *audit_hooks; - struct _Py_bytes_state bytes; struct _Py_unicode_state unicode; struct _Py_float_state float_state; /* Using a cache is very effective since typically only a single slice is diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c index 2f7e0a6dde6fe..85d6912ca751f 100644 --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -5,9 +5,9 @@ #include "Python.h" #include "pycore_abstract.h" // _PyIndex_Check() #include "pycore_bytes_methods.h" // _Py_bytes_startswith() -#include "pycore_bytesobject.h" // struct _Py_bytes_state #include "pycore_call.h" // _PyObject_CallNoArgs() #include "pycore_format.h" // F_LJUST +#include "pycore_global_objects.h" // _Py_GET_GLOBAL_OBJECT() #include "pycore_initconfig.h" // _PyStatus_OK() #include "pycore_long.h" // _PyLong_DigitValue #include "pycore_object.h" // _PyObject_GC_TRACK @@ -38,49 +38,24 @@ Py_LOCAL_INLINE(Py_ssize_t) _PyBytesWriter_GetSize(_PyBytesWriter *writer, char *str); -static struct _Py_bytes_state* -get_bytes_state(void) -{ - PyInterpreterState *interp = _PyInterpreterState_GET(); - return &interp->bytes; -} +#define CHARACTERS _Py_SINGLETON(bytes_characters) +#define CHARACTER(ch) \ + ((PyBytesObject *)&(CHARACTERS[ch])); +#define EMPTY (&_Py_SINGLETON(bytes_empty)) // Return a borrowed reference to the empty bytes string singleton. static inline PyObject* bytes_get_empty(void) { - struct _Py_bytes_state *state = get_bytes_state(); - // bytes_get_empty() must not be called before _PyBytes_Init() - // or after _PyBytes_Fini() - assert(state->empty_string != NULL); - return state->empty_string; + return &EMPTY->ob_base.ob_base; } // Return a strong reference to the empty bytes string singleton. 
static inline PyObject* bytes_new_empty(void) { - PyObject *empty = bytes_get_empty(); - Py_INCREF(empty); - return (PyObject *)empty; -} - - -static int -bytes_create_empty_string_singleton(struct _Py_bytes_state *state) -{ - // Create the empty bytes string singleton - PyBytesObject *op = (PyBytesObject *)PyObject_Malloc(PyBytesObject_SIZE); - if (op == NULL) { - return -1; - } - _PyObject_InitVar((PyVarObject*)op, &PyBytes_Type, 0); - op->ob_shash = -1; - op->ob_sval[0] = '\0'; - - assert(state->empty_string == NULL); - state->empty_string = (PyObject *)op; - return 0; + Py_INCREF(EMPTY); + return (PyObject *)EMPTY; } @@ -148,12 +123,9 @@ PyBytes_FromStringAndSize(const char *str, Py_ssize_t size) return NULL; } if (size == 1 && str != NULL) { - struct _Py_bytes_state *state = get_bytes_state(); - op = state->characters[*str & UCHAR_MAX]; - if (op != NULL) { - Py_INCREF(op); - return (PyObject *)op; - } + op = CHARACTER(*str & 255); + Py_INCREF(op); + return (PyObject *)op; } if (size == 0) { return bytes_new_empty(); @@ -166,12 +138,6 @@ PyBytes_FromStringAndSize(const char *str, Py_ssize_t size) return (PyObject *) op; memcpy(op->ob_sval, str, size); - /* share short strings */ - if (size == 1) { - struct _Py_bytes_state *state = get_bytes_state(); - Py_INCREF(op); - state->characters[*str & UCHAR_MAX] = op; - } return (PyObject *) op; } @@ -189,16 +155,13 @@ PyBytes_FromString(const char *str) return NULL; } - struct _Py_bytes_state *state = get_bytes_state(); if (size == 0) { return bytes_new_empty(); } else if (size == 1) { - op = state->characters[*str & UCHAR_MAX]; - if (op != NULL) { - Py_INCREF(op); - return (PyObject *)op; - } + op = CHARACTER(*str & 255); + Py_INCREF(op); + return (PyObject *)op; } /* Inline PyObject_NewVar */ @@ -209,12 +172,6 @@ PyBytes_FromString(const char *str) _PyObject_InitVar((PyVarObject*)op, &PyBytes_Type, size); op->ob_shash = -1; memcpy(op->ob_sval, str, size+1); - /* share short strings */ - if (size == 1) { - assert(state->characters[*str & UCHAR_MAX] == NULL); - Py_INCREF(op); - state->characters[*str & UCHAR_MAX] = op; - } return (PyObject *) op; } @@ -3086,17 +3043,6 @@ _PyBytes_Resize(PyObject **pv, Py_ssize_t newsize) } -PyStatus -_PyBytes_InitGlobalObjects(PyInterpreterState *interp) -{ - struct _Py_bytes_state *state = &interp->bytes; - if (bytes_create_empty_string_singleton(state) < 0) { - return _PyStatus_NO_MEMORY(); - } - return _PyStatus_OK(); -} - - PyStatus _PyBytes_InitTypes(PyInterpreterState *interp) { @@ -3116,16 +3062,6 @@ _PyBytes_InitTypes(PyInterpreterState *interp) } -void -_PyBytes_Fini(PyInterpreterState *interp) -{ - struct _Py_bytes_state* state = &interp->bytes; - for (int i = 0; i < UCHAR_MAX + 1; i++) { - Py_CLEAR(state->characters[i]); - } - Py_CLEAR(state->empty_string); -} - /*********************** Bytes Iterator ****************************/ typedef struct { diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index b2f58f4e3e8a3..284cfac3c40a5 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -678,11 +678,6 @@ pycore_init_global_objects(PyInterpreterState *interp) _PyFloat_InitState(interp); - status = _PyBytes_InitGlobalObjects(interp); - if (_PyStatus_EXCEPTION(status)) { - return status; - } - status = _PyUnicode_InitGlobalObjects(interp); if (_PyStatus_EXCEPTION(status)) { return status; @@ -1685,7 +1680,6 @@ finalize_interp_types(PyInterpreterState *interp) _PySlice_Fini(interp); - _PyBytes_Fini(interp); _PyUnicode_Fini(interp); _PyFloat_Fini(interp); } From webhook-mailer at python.org 
Tue Jan 11 14:15:51 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 19:15:51 -0000 Subject: [Python-checkins] bpo-46307: Add string.Template.get_identifiers() method (GH-30493) Message-ID: https://github.com/python/cpython/commit/dce642f24418c58e67fa31a686575c980c31dd37 commit: dce642f24418c58e67fa31a686575c980c31dd37 branch: main author: Ben Kehoe committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-11T11:15:42-08:00 summary: bpo-46307: Add string.Template.get_identifiers() method (GH-30493) Add `string.Template.get_identifiers()` method that returns the identifiers within the template. By default, raises an error if it encounters an invalid identifier (like `substitute()`). The keyword-only argument `raise_on_invalid` can be set to `False` to ignore invalid identifiers (like `safe_substitute()`). Automerge-Triggered-By: GH:warsaw files: A Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst M Doc/library/string.rst M Lib/string.py M Lib/test/test_string.py diff --git a/Doc/library/string.rst b/Doc/library/string.rst index b27782f8d8e9b..9bc703e70cdaa 100644 --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -783,6 +783,22 @@ these rules. The methods of :class:`Template` are: templates containing dangling delimiters, unmatched braces, or placeholders that are not valid Python identifiers. + + .. method:: is_valid() + + Returns false if the template has invalid placeholders that will cause + :meth:`substitute` to raise :exc:`ValueError`. + + .. versionadded:: 3.11 + + + .. method:: get_identifiers() + + Returns a list of the valid identifiers in the template, in the order + they first appear, ignoring any invalid identifiers. + + .. versionadded:: 3.11 + :class:`Template` instances also provide one public data attribute: .. attribute:: template @@ -869,6 +885,9 @@ rule: * *invalid* -- This group matches any other delimiter pattern (usually a single delimiter), and it should appear last in the regular expression. +The methods on this class will raise :exc:`ValueError` if the pattern matches +the template without one of these named groups matching. + Helper functions ---------------- diff --git a/Lib/string.py b/Lib/string.py index 261789cc10a44..2eab6d4f595c4 100644 --- a/Lib/string.py +++ b/Lib/string.py @@ -141,6 +141,35 @@ def convert(mo): self.pattern) return self.pattern.sub(convert, self.template) + def is_valid(self): + for mo in self.pattern.finditer(self.template): + if mo.group('invalid') is not None: + return False + if (mo.group('named') is None + and mo.group('braced') is None + and mo.group('escaped') is None): + # If all the groups are None, there must be + # another group we're not expecting + raise ValueError('Unrecognized named group in pattern', + self.pattern) + return True + + def get_identifiers(self): + ids = [] + for mo in self.pattern.finditer(self.template): + named = mo.group('named') or mo.group('braced') + if named is not None and named not in ids: + # add a named group only the first time it appears + ids.append(named) + elif (named is None + and mo.group('invalid') is None + and mo.group('escaped') is None): + # If all the groups are None, there must be + # another group we're not expecting + raise ValueError('Unrecognized named group in pattern', + self.pattern) + return ids + # Initialize Template.pattern. __init_subclass__() is automatically called # only for subclasses, not for the Template class itself. 
Template.__init_subclass__() diff --git a/Lib/test/test_string.py b/Lib/test/test_string.py index 0be28fdb609ea..824b89ad517c1 100644 --- a/Lib/test/test_string.py +++ b/Lib/test/test_string.py @@ -475,6 +475,57 @@ class PieDelims(Template): self.assertEqual(s.substitute(dict(who='tim', what='ham')), 'tim likes to eat a bag of ham worth $100') + def test_is_valid(self): + eq = self.assertEqual + s = Template('$who likes to eat a bag of ${what} worth $$100') + self.assertTrue(s.is_valid()) + + s = Template('$who likes to eat a bag of ${what} worth $100') + self.assertFalse(s.is_valid()) + + # if the pattern has an unrecognized capture group, + # it should raise ValueError like substitute and safe_substitute do + class BadPattern(Template): + pattern = r""" + (?P.*) | + (?P@{2}) | + @(?P[_a-z][._a-z0-9]*) | + @{(?P[_a-z][._a-z0-9]*)} | + (?P@) | + """ + s = BadPattern('@bag.foo.who likes to eat a bag of @bag.what') + self.assertRaises(ValueError, s.is_valid) + + def test_get_identifiers(self): + eq = self.assertEqual + raises = self.assertRaises + s = Template('$who likes to eat a bag of ${what} worth $$100') + ids = s.get_identifiers() + eq(ids, ['who', 'what']) + + # repeated identifiers only included once + s = Template('$who likes to eat a bag of ${what} worth $$100; ${who} likes to eat a bag of $what worth $$100') + ids = s.get_identifiers() + eq(ids, ['who', 'what']) + + # invalid identifiers are ignored + s = Template('$who likes to eat a bag of ${what} worth $100') + ids = s.get_identifiers() + eq(ids, ['who', 'what']) + + # if the pattern has an unrecognized capture group, + # it should raise ValueError like substitute and safe_substitute do + class BadPattern(Template): + pattern = r""" + (?P.*) | + (?P@{2}) | + @(?P[_a-z][._a-z0-9]*) | + @{(?P[_a-z][._a-z0-9]*)} | + (?P@) | + """ + s = BadPattern('@bag.foo.who likes to eat a bag of @bag.what') + self.assertRaises(ValueError, s.get_identifiers) + if __name__ == '__main__': unittest.main() diff --git a/Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst b/Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst new file mode 100644 index 0000000000000..6207c424ce9c0 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst @@ -0,0 +1 @@ +Add :meth:`string.Template.is_valid` and :meth:`string.Template.get_identifiers` methods. From webhook-mailer at python.org Tue Jan 11 15:18:45 2022 From: webhook-mailer at python.org (JulienPalard) Date: Tue, 11 Jan 2022 20:18:45 -0000 Subject: [Python-checkins] [doc] Add license_url for python-docs-theme 2022.1. (GH-30527) (GH-30540) Message-ID: https://github.com/python/cpython/commit/6f035c07e0aeff9120a45e668a383c985cd8861a commit: 6f035c07e0aeff9120a45e668a383c985cd8861a branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: JulienPalard date: 2022-01-11T21:18:33+01:00 summary: [doc] Add license_url for python-docs-theme 2022.1. 
(GH-30527) (GH-30540) (cherry picked from commit 6f05e1ec193c132015e9a23d1137b1731596f186) Co-authored-by: Julien Palard Co-authored-by: Julien Palard files: M Doc/conf.py M Doc/requirements.txt diff --git a/Doc/conf.py b/Doc/conf.py index f626ce67b3c74..cbf201a314eec 100644 --- a/Doc/conf.py +++ b/Doc/conf.py @@ -69,7 +69,8 @@ html_theme_path = ['tools'] html_theme_options = { 'collapsiblesidebar': True, - 'issues_url': 'https://docs.python.org/3/bugs.html', + 'issues_url': '/bugs.html', + 'license_url': '/license.html', 'root_include_title': False # We use the version switcher instead. } diff --git a/Doc/requirements.txt b/Doc/requirements.txt index dd3c8e62237cf..95d320f4cb1f3 100644 --- a/Doc/requirements.txt +++ b/Doc/requirements.txt @@ -13,4 +13,4 @@ blurb # The theme used by the documentation is stored separately, so we need # to install that as well. -python-docs-theme +python-docs-theme>=2022.1 From webhook-mailer at python.org Tue Jan 11 15:18:49 2022 From: webhook-mailer at python.org (JulienPalard) Date: Tue, 11 Jan 2022 20:18:49 -0000 Subject: [Python-checkins] [doc] Add license_url for python-docs-theme 2022.1. (GH-30527) (GH-30541) Message-ID: https://github.com/python/cpython/commit/12cf91c3b17f6047eca4d7390741a37fc3bb03ca commit: 12cf91c3b17f6047eca4d7390741a37fc3bb03ca branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: JulienPalard date: 2022-01-11T21:18:45+01:00 summary: [doc] Add license_url for python-docs-theme 2022.1. (GH-30527) (GH-30541) (cherry picked from commit 6f05e1ec193c132015e9a23d1137b1731596f186) Co-authored-by: Julien Palard Co-authored-by: Julien Palard files: M Doc/conf.py M Doc/requirements.txt diff --git a/Doc/conf.py b/Doc/conf.py index d2dff7d7a8d83..7e355bae46f03 100644 --- a/Doc/conf.py +++ b/Doc/conf.py @@ -70,7 +70,8 @@ html_theme_path = ['tools'] html_theme_options = { 'collapsiblesidebar': True, - 'issues_url': 'https://docs.python.org/3/bugs.html', + 'issues_url': '/bugs.html', + 'license_url': '/license.html', 'root_include_title': False # We use the version switcher instead. } diff --git a/Doc/requirements.txt b/Doc/requirements.txt index cb21ed20397b2..1b75aed035ac9 100644 --- a/Doc/requirements.txt +++ b/Doc/requirements.txt @@ -13,4 +13,4 @@ blurb # The theme used by the documentation is stored separately, so we need # to install that as well. -python-docs-theme +python-docs-theme>=2022.1 From webhook-mailer at python.org Tue Jan 11 17:25:40 2022 From: webhook-mailer at python.org (1st1) Date: Tue, 11 Jan 2022 22:25:40 -0000 Subject: [Python-checkins] bpo-46347: Fix memory leak in PyEval_EvalCodeEx. (#30546) Message-ID: https://github.com/python/cpython/commit/607d8a838f29ad3c4c4e85b39f338dade5f9cafe commit: 607d8a838f29ad3c4c4e85b39f338dade5f9cafe branch: main author: Yury Selivanov committer: 1st1 date: 2022-01-11T14:25:28-08:00 summary: bpo-46347: Fix memory leak in PyEval_EvalCodeEx. (#30546) First introduced in 0332e569c12d3dc97171546c6dc10e42c27de34b files: A Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst M Python/ceval.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst new file mode 100644 index 0000000000000..fc12d6ba146ca --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst @@ -0,0 +1 @@ +Fix memory leak in PyEval_EvalCodeEx. 
diff --git a/Python/ceval.c b/Python/ceval.c index be26ffd822c13..85b4400de32f4 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -6128,16 +6128,9 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, } allargs = newargs; } - PyObject **kwargs = PyMem_Malloc(sizeof(PyObject *)*kwcount); - if (kwargs == NULL) { - res = NULL; - Py_DECREF(kwnames); - goto fail; - } for (int i = 0; i < kwcount; i++) { Py_INCREF(kws[2*i]); PyTuple_SET_ITEM(kwnames, i, kws[2*i]); - kwargs[i] = kws[2*i+1]; } PyFrameConstructor constr = { .fc_globals = globals, From webhook-mailer at python.org Tue Jan 11 18:09:26 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 11 Jan 2022 23:09:26 -0000 Subject: [Python-checkins] bpo-46347: Fix memory leak in PyEval_EvalCodeEx. (GH-30546) Message-ID: https://github.com/python/cpython/commit/b1a94f1fab7c0aee0705483616a1b2c3f2713c00 commit: b1a94f1fab7c0aee0705483616a1b2c3f2713c00 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-11T15:09:22-08:00 summary: bpo-46347: Fix memory leak in PyEval_EvalCodeEx. (GH-30546) First introduced in 0332e569c12d3dc97171546c6dc10e42c27de34b (cherry picked from commit 607d8a838f29ad3c4c4e85b39f338dade5f9cafe) Co-authored-by: Yury Selivanov files: A Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst M Python/ceval.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst new file mode 100644 index 0000000000000..fc12d6ba146ca --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst @@ -0,0 +1 @@ +Fix memory leak in PyEval_EvalCodeEx. diff --git a/Python/ceval.c b/Python/ceval.c index 8ad1713b78c01..e906076e27e56 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -5132,16 +5132,9 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, } allargs = newargs; } - PyObject **kwargs = PyMem_Malloc(sizeof(PyObject *)*kwcount); - if (kwargs == NULL) { - res = NULL; - Py_DECREF(kwnames); - goto fail; - } for (int i = 0; i < kwcount; i++) { Py_INCREF(kws[2*i]); PyTuple_SET_ITEM(kwnames, i, kws[2*i]); - kwargs[i] = kws[2*i+1]; } PyFrameConstructor constr = { .fc_globals = globals, From webhook-mailer at python.org Tue Jan 11 18:35:34 2022 From: webhook-mailer at python.org (vstinner) Date: Tue, 11 Jan 2022 23:35:34 -0000 Subject: [Python-checkins] bpo-46303: Fix fileutils.h compiler warnings (GH-30550) Message-ID: https://github.com/python/cpython/commit/08bc1bad11cad39f508bd662c9b28fcd9c995512 commit: 08bc1bad11cad39f508bd662c9b28fcd9c995512 branch: main author: Victor Stinner committer: vstinner date: 2022-01-12T00:35:26+01:00 summary: bpo-46303: Fix fileutils.h compiler warnings (GH-30550) Add missing pycore_fileutils.h include in _tkinter.c and _testconsole.c. files: M Modules/_tkinter.c M PC/_testconsole.c diff --git a/Modules/_tkinter.c b/Modules/_tkinter.c index aabf20b8d963c..f4d2716fe302d 100644 --- a/Modules/_tkinter.c +++ b/Modules/_tkinter.c @@ -22,9 +22,15 @@ Copyright (C) 1994 Steen Lumholt. 
*/ #define PY_SSIZE_T_CLEAN +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif #include "Python.h" #include +#ifdef MS_WINDOWS +# include "pycore_fileutils.h" // _Py_stat() +#endif #ifdef MS_WINDOWS #include diff --git a/PC/_testconsole.c b/PC/_testconsole.c index db84f73c7744f..a8308835d8f85 100644 --- a/PC/_testconsole.c +++ b/PC/_testconsole.c @@ -1,11 +1,15 @@ - /* Testing module for multi-phase initialization of extension modules (PEP 489) */ +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif + #include "Python.h" #ifdef MS_WINDOWS +#include "pycore_fileutils.h" // _Py_get_osfhandle() #include "..\modules\_io\_iomodule.h" #define WIN32_LEAN_AND_MEAN From webhook-mailer at python.org Tue Jan 11 18:37:18 2022 From: webhook-mailer at python.org (1st1) Date: Tue, 11 Jan 2022 23:37:18 -0000 Subject: [Python-checkins] bpo-46347: Fix PyEval_EvalCodeEx to correctly cleanup in error paths (#30551) Message-ID: https://github.com/python/cpython/commit/20b5791ce9b47195ce51cfd5acb223b1ca59cdf0 commit: 20b5791ce9b47195ce51cfd5acb223b1ca59cdf0 branch: main author: Yury Selivanov committer: 1st1 date: 2022-01-11T15:37:09-08:00 summary: bpo-46347: Fix PyEval_EvalCodeEx to correctly cleanup in error paths (#30551) files: M Python/ceval.c diff --git a/Python/ceval.c b/Python/ceval.c index 85b4400de32f4..c512afadb1f7a 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -6086,7 +6086,7 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, PyObject *kwdefs, PyObject *closure) { PyThreadState *tstate = _PyThreadState_GET(); - PyObject *res; + PyObject *res = NULL; PyObject *defaults = _PyTuple_FromArray(defs, defcount); if (defaults == NULL) { return NULL; @@ -6099,22 +6099,20 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, if (locals == NULL) { locals = globals; } - PyObject *kwnames; + PyObject *kwnames = NULL; PyObject *const *allargs; - PyObject **newargs; + PyObject **newargs = NULL; + PyFunctionObject *func = NULL; if (kwcount == 0) { allargs = args; - kwnames = NULL; } else { kwnames = PyTuple_New(kwcount); if (kwnames == NULL) { - res = NULL; goto fail; } newargs = PyMem_Malloc(sizeof(PyObject *)*(kwcount+argcount)); if (newargs == NULL) { - res = NULL; Py_DECREF(kwnames); goto fail; } @@ -6142,19 +6140,17 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, .fc_kwdefaults = kwdefs, .fc_closure = closure }; - PyFunctionObject *func = _PyFunction_FromConstructor(&constr); + func = _PyFunction_FromConstructor(&constr); if (func == NULL) { - return NULL; + goto fail; } res = _PyEval_Vector(tstate, func, locals, allargs, argcount, kwnames); - Py_DECREF(func); - if (kwcount) { - Py_DECREF(kwnames); - PyMem_Free(newargs); - } fail: + Py_XDECREF(func); + Py_XDECREF(kwnames); + PyMem_Free(newargs); Py_DECREF(defaults); return res; } From webhook-mailer at python.org Tue Jan 11 19:17:51 2022 From: webhook-mailer at python.org (1st1) Date: Wed, 12 Jan 2022 00:17:51 -0000 Subject: [Python-checkins] bpo-46347: Fix PyEval_EvalCodeEx to correctly cleanup in error paths (#30553) Message-ID: https://github.com/python/cpython/commit/6f9ca53a6ac343a5663cc5c52546acf9a63b605a commit: 6f9ca53a6ac343a5663cc5c52546acf9a63b605a branch: 3.10 author: Yury Selivanov committer: 1st1 date: 2022-01-11T16:17:42-08:00 summary: bpo-46347: Fix PyEval_EvalCodeEx to correctly cleanup in error paths (#30553) files: M Python/ceval.c diff --git a/Python/ceval.c b/Python/ceval.c index e906076e27e56..ab10b4166d6d2 100644 --- 
a/Python/ceval.c +++ b/Python/ceval.c @@ -5090,7 +5090,7 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, PyObject *kwdefs, PyObject *closure) { PyThreadState *tstate = _PyThreadState_GET(); - PyObject *res; + PyObject *res = NULL; PyObject *defaults = _PyTuple_FromArray(defs, defcount); if (defaults == NULL) { return NULL; @@ -5103,23 +5103,19 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, if (locals == NULL) { locals = globals; } - PyObject *kwnames; + PyObject *kwnames = NULL; PyObject *const *allargs; - PyObject **newargs; + PyObject **newargs = NULL; if (kwcount == 0) { allargs = args; - kwnames = NULL; } else { kwnames = PyTuple_New(kwcount); if (kwnames == NULL) { - res = NULL; goto fail; } newargs = PyMem_Malloc(sizeof(PyObject *)*(kwcount+argcount)); if (newargs == NULL) { - res = NULL; - Py_DECREF(kwnames); goto fail; } for (int i = 0; i < argcount; i++) { @@ -5149,11 +5145,9 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, res = _PyEval_Vector(tstate, &constr, locals, allargs, argcount, kwnames); - if (kwcount) { - Py_DECREF(kwnames); - PyMem_Free(newargs); - } fail: + Py_XDECREF(kwnames); + PyMem_Free(newargs); Py_DECREF(defaults); return res; } From webhook-mailer at python.org Tue Jan 11 19:35:44 2022 From: webhook-mailer at python.org (1st1) Date: Wed, 12 Jan 2022 00:35:44 -0000 Subject: [Python-checkins] bpo-46347: Yet another fix in the erorr path of PyEval_EvalCodeEx (#30554) Message-ID: https://github.com/python/cpython/commit/be578e0c063dad1dbb273f86d5bc77e4e6f14583 commit: be578e0c063dad1dbb273f86d5bc77e4e6f14583 branch: main author: Yury Selivanov committer: 1st1 date: 2022-01-11T16:35:19-08:00 summary: bpo-46347: Yet another fix in the erorr path of PyEval_EvalCodeEx (#30554) files: M Python/ceval.c diff --git a/Python/ceval.c b/Python/ceval.c index c512afadb1f7a..8e878cbf7e2b3 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -6113,7 +6113,6 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, } newargs = PyMem_Malloc(sizeof(PyObject *)*(kwcount+argcount)); if (newargs == NULL) { - Py_DECREF(kwnames); goto fail; } for (int i = 0; i < argcount; i++) { From webhook-mailer at python.org Wed Jan 12 10:08:33 2022 From: webhook-mailer at python.org (tiran) Date: Wed, 12 Jan 2022 15:08:33 -0000 Subject: [Python-checkins] bpo-40280: Add --with-emscripten-target to build for browser or node (GH-30552) Message-ID: https://github.com/python/cpython/commit/43839ba438368a50f22f718d4ce8ce607c17046c commit: 43839ba438368a50f22f718d4ce8ce607c17046c branch: main author: Christian Heimes committer: tiran date: 2022-01-12T16:08:19+01:00 summary: bpo-40280: Add --with-emscripten-target to build for browser or node (GH-30552) Co-authored-by: Ethan Smith files: A Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst M Makefile.pre.in M Modules/socketmodule.c M Tools/wasm/README.md M Tools/wasm/config.site-wasm32-emscripten M configure M configure.ac diff --git a/Makefile.pre.in b/Makefile.pre.in index fbd4c3a23fd81..41b123abcef11 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -246,6 +246,10 @@ SRCDIRS= @SRCDIRS@ # Other subdirectories SUBDIRSTOO= Include Lib Misc +# assets for Emscripten browser builds +WASM_ASSETS_DIR=".$(prefix)" +WASM_STDLIB="$(WASM_ASSETS_DIR)/local/lib/python$(VERSION)/os.py" + # Files and directories to be distributed CONFIGFILES= configure configure.ac acconfig.h pyconfig.h.in Makefile.pre.in DISTFILES= README.rst ChangeLog $(CONFIGFILES) @@ -601,6 
+605,7 @@ LIBEXPAT_HEADERS= \ all: @DEF_MAKE_ALL_RULE@ build_all: check-clean-src $(BUILDPYTHON) oldsharedmods sharedmods gdbhooks \ Programs/_testembed python-config +build_platform: check-clean-src $(BUILDPYTHON) platform # Check that the source is clean when building out of source. check-clean-src: @@ -833,19 +838,12 @@ $(DLLLIBRARY) libpython$(LDVERSION).dll.a: $(LIBRARY_OBJS) # wasm32-emscripten build # wasm assets directory is relative to current build dir, e.g. "./usr/local". # --preload-file turns a relative asset path into an absolute path. -WASM_ASSETS_DIR=".$(prefix)" -WASM_STDLIB="$(WASM_ASSETS_DIR)/local/lib/python$(VERSION)/os.py" $(WASM_STDLIB): $(srcdir)/Lib/*.py $(srcdir)/Lib/*/*.py \ pybuilddir.txt $(srcdir)/Tools/wasm/wasm_assets.py $(PYTHON_FOR_BUILD) $(srcdir)/Tools/wasm/wasm_assets.py \ --builddir . --prefix $(prefix) -python.html: Programs/python.o $(LIBRARY_DEPS) $(WASM_STDLIB) - $(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/python.o \ - $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) \ - -s ASSERTIONS=1 --preload-file $(WASM_ASSETS_DIR) - ########################################################################## # Build static libmpdec.a LIBMPDEC_CFLAGS=$(PY_STDMODULE_CFLAGS) $(CCSHARED) @LIBMPDEC_CFLAGS@ @@ -2396,7 +2394,7 @@ clean-retain-profile: pycremoval -rm -f pybuilddir.txt -rm -f Lib/lib2to3/*Grammar*.pickle -rm -f _bootstrap_python - -rm -f python.html python.js python.data + -rm -f python.html python*.js python.data -rm -f Programs/_testembed Programs/_freeze_module -rm -f Python/deepfreeze/*.[co] -rm -f Python/frozen_modules/*.h diff --git a/Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst b/Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst new file mode 100644 index 0000000000000..55fc0fc986b81 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst @@ -0,0 +1,2 @@ +The ``configure`` script has a new option ``--with-emscripten-target`` to +select browser or node as Emscripten build target. 
diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c index ed83f5c82625e..0e275639967c2 100644 --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -7931,7 +7931,7 @@ PyInit__socket(void) #ifdef IPPROTO_VRRP PyModule_AddIntMacro(m, IPPROTO_VRRP); #endif -#ifdef IPPROTO_SCTP +#if defined(IPPROTO_SCTP) && !defined(__EMSCRIPTEN__) PyModule_AddIntMacro(m, IPPROTO_SCTP); #endif #ifdef IPPROTO_BIP diff --git a/Tools/wasm/README.md b/Tools/wasm/README.md index 93c76b225db79..f59b876b11a74 100644 --- a/Tools/wasm/README.md +++ b/Tools/wasm/README.md @@ -27,6 +27,8 @@ embuilder build zlib ### Cross compile to wasm32-emscripten +For browser: + ```shell mkdir -p builddir/emscripten pushd builddir/emscripten @@ -35,9 +37,23 @@ CONFIG_SITE=../../Tools/wasm/config.site-wasm32-emscripten \ emconfigure ../../configure -C \ --host=wasm32-unknown-emscripten \ --build=$(../../config.guess) \ + --with-emscripten-target=browser \ + --with-build-python=$(pwd)/../build/python + +emmake make -j$(nproc) +``` + +For node: + +``` +CONFIG_SITE=../../Tools/wasm/config.site-wasm32-emscripten \ + emconfigure ../../configure -C \ + --host=wasm32-unknown-emscripten \ + --build=$(../../config.guess) \ + --with-emscripten-target=node \ --with-build-python=$(pwd)/../build/python -emmake make -j$(nproc) python.html +emmake make -j$(nproc) ``` ### Test in browser diff --git a/Tools/wasm/config.site-wasm32-emscripten b/Tools/wasm/config.site-wasm32-emscripten index b291c802e1e4d..ce9dec7ecf6d4 100644 --- a/Tools/wasm/config.site-wasm32-emscripten +++ b/Tools/wasm/config.site-wasm32-emscripten @@ -30,6 +30,11 @@ ac_cv_func_shutdown=no # breaks build, see https://github.com/ethanhs/python-wasm/issues/16 ac_cv_lib_bz2_BZ2_bzCompress=no +# clock_nanosleep() causes time.sleep() to sleep forever. +# nanosleep() works correctly +ac_cv_func_clock_nanosleep=no +ac_cv_lib_rt_clock_nanosleep=no + # The rest is based on pyodide # https://github.com/pyodide/pyodide/blob/main/cpython/pyconfig.undefs.h diff --git a/configure b/configure index 9712446d24c11..327e9bd2d3f34 100755 --- a/configure +++ b/configure @@ -846,6 +846,8 @@ SHLIB_SUFFIX LIBTOOL_CRUFT OTHER_LIBTOOL_OPT UNIVERSAL_ARCH_FLAGS +WASM_STDLIB +WASM_ASSETS_DIR LDFLAGS_NOLTO LDFLAGS_NODIST CFLAGS_NODIST @@ -1002,6 +1004,7 @@ with_universal_archs with_framework_name enable_framework with_cxx_main +with_emscripten_target with_suffix enable_shared enable_profiling @@ -1751,6 +1754,8 @@ Optional Packages: --with-cxx-main[=COMPILER] compile main() and link Python executable with C++ compiler specified in COMPILER (default is $CXX) + --with-emscripten-target=[browser|node] + Emscripten platform --with-suffix=SUFFIX set executable suffix to SUFFIX (default is empty, yes is mapped to '.exe') --with-pydebug build with Py_DEBUG defined (default is no) @@ -6205,6 +6210,41 @@ case $ac_sys_system/$ac_sys_release in #( ;; esac +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --with-emscripten-target" >&5 +$as_echo_n "checking for --with-emscripten-target... " >&6; } + +# Check whether --with-emscripten-target was given. +if test "${with_emscripten_target+set}" = set; then : + withval=$with_emscripten_target; + if test "x$ac_sys_system" = xEmscripten; then : + + case $with_emscripten_target in #( + browser) : + ac_sys_emscripten_target=browser ;; #( + node) : + ac_sys_emscripten_target=node ;; #( + *) : + as_fn_error $? "Invalid argument: --with-emscripten-target=browser|node" "$LINENO" 5 + ;; +esac + +else + + as_fn_error $? 
"--with-emscripten-target only applies to Emscripten" "$LINENO" 5 + +fi + +else + + if test "x$ac_sys_system" = xEmscripten; then : + ac_sys_emscripten_target=browser +fi + +fi + +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_sys_emscripten_target" >&5 +$as_echo "$ac_sys_emscripten_target" >&6; } + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for --with-suffix" >&5 $as_echo_n "checking for --with-suffix... " >&6; } @@ -6223,8 +6263,12 @@ esac else - case $ac_sys_system in #( - Emscripten) : + case $ac_sys_system/$ac_sys_emscripten_target in #( + Emscripten/browser) : + EXEEXT=.html ;; #( + Emscripten/node) : + EXEEXT=.js ;; #( + wasi/*) : EXEEXT=.wasm ;; #( *) : EXEEXT= @@ -7003,6 +7047,7 @@ else $as_echo "no" >&6; } fi + if test "$Py_OPT" = 'true' ; then # Intentionally not forcing Py_LTO='true' here. Too many toolchains do not # compile working code using it and both test_distutils and test_gdb are @@ -7053,8 +7098,10 @@ fi ;; esac - - +elif test "$ac_sys_system" = "Emscripten"; then + DEF_MAKE_ALL_RULE="build_platform" + REQUIRE_PGO="no" + DEF_MAKE_RULE="all" else DEF_MAKE_ALL_RULE="build_all" REQUIRE_PGO="no" @@ -7567,6 +7614,25 @@ then esac fi +# WASM flags +case $ac_sys_system/$ac_sys_emscripten_target in #( + Emscripten/browser) : + + LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 --preload-file \$(WASM_ASSETS_DIR)" + WASM_ASSETS_DIR=".\$(prefix)" + WASM_STDLIB="\$(WASM_ASSETS_DIR)/local/lib/python\$(VERSION)/os.py" + ;; #( + Emscripten/node) : + + LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" + CFLAGS_NODIST="$(CFLAGS_NODIST) -pthread" + ;; #( + *) : + ;; +esac + + + @@ -21170,6 +21236,14 @@ else LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" fi +case $ac_sys_system/$ac_sys_emscripten_target in #( + Emscripten/browser) : + LIBRARY_DEPS="$LIBRARY_DEPS \$(WASM_STDLIB)" ;; #( + *) : + ;; +esac + + # Check whether to disable test modules. Once set, setup.py will not build @@ -23458,7 +23532,7 @@ $as_echo_n "checking for stdlib extension module xxlimited... " >&6; } *xxlimited*) : py_cv_module_xxlimited=n/a ;; #( *) : - if test "$with_trace_refs" = "no"; then : + if test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"; then : if true; then : py_cv_module_xxlimited=yes else @@ -23494,7 +23568,7 @@ $as_echo_n "checking for stdlib extension module xxlimited_35... 
" >&6; } *xxlimited_35*) : py_cv_module_xxlimited_35=n/a ;; #( *) : - if test "$with_trace_refs" = "no"; then : + if test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"; then : if true; then : py_cv_module_xxlimited_35=yes else diff --git a/configure.ac b/configure.ac index 1720b9bfbee37..25181c0f7ed17 100644 --- a/configure.ac +++ b/configure.ac @@ -1062,6 +1062,24 @@ AS_CASE([$ac_sys_system/$ac_sys_release], ] ) +AC_MSG_CHECKING([for --with-emscripten-target]) +AC_ARG_WITH([emscripten-target], + [AS_HELP_STRING([--with-emscripten-target=@<:@browser|node@:>@], [Emscripten platform])], +[ + AS_VAR_IF([ac_sys_system], [Emscripten], [ + AS_CASE([$with_emscripten_target], + [browser], [ac_sys_emscripten_target=browser], + [node], [ac_sys_emscripten_target=node], + [AC_MSG_ERROR([Invalid argument: --with-emscripten-target=browser|node])] + ) + ], [ + AC_MSG_ERROR([--with-emscripten-target only applies to Emscripten]) + ]) +], [ + AS_VAR_IF([ac_sys_system], [Emscripten], [ac_sys_emscripten_target=browser]) +]) +AC_MSG_RESULT([$ac_sys_emscripten_target]) + AC_MSG_CHECKING([for --with-suffix]) AC_ARG_WITH([suffix], [AS_HELP_STRING([--with-suffix=SUFFIX], [set executable suffix to SUFFIX (default is empty, yes is mapped to '.exe')])], @@ -1072,8 +1090,10 @@ AC_ARG_WITH([suffix], [EXEEXT=$with_suffix] ) ], [ - AS_CASE([$ac_sys_system], - [Emscripten], [EXEEXT=.wasm], + AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], + [Emscripten/browser], [EXEEXT=.html], + [Emscripten/node], [EXEEXT=.js], + [wasi/*], [EXEEXT=.wasm], [EXEEXT=] ) ]) @@ -1446,6 +1466,7 @@ else AC_MSG_RESULT(no); fi], [AC_MSG_RESULT(no)]) + if test "$Py_OPT" = 'true' ; then # Intentionally not forcing Py_LTO='true' here. Too many toolchains do not # compile working code using it and both test_distutils and test_gdb are @@ -1462,8 +1483,12 @@ if test "$Py_OPT" = 'true' ; then ]) ;; esac - - +elif test "$ac_sys_system" = "Emscripten"; then + dnl Emscripten does not support shared extensions yet. Build + dnl "python.[js,html,wasm]", "pybuilddir.txt", and "platform" files. + DEF_MAKE_ALL_RULE="build_platform" + REQUIRE_PGO="no" + DEF_MAKE_RULE="all" else DEF_MAKE_ALL_RULE="build_all" REQUIRE_PGO="no" @@ -1769,10 +1794,25 @@ then esac fi +# WASM flags +AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], + [Emscripten/browser], [ + LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 --preload-file \$(WASM_ASSETS_DIR)" + WASM_ASSETS_DIR=".\$(prefix)" + WASM_STDLIB="\$(WASM_ASSETS_DIR)/local/lib/python\$(VERSION)/os.py" + ], + [Emscripten/node], [ + LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" + CFLAGS_NODIST="$(CFLAGS_NODIST) -pthread" + ], +) + AC_SUBST(BASECFLAGS) AC_SUBST(CFLAGS_NODIST) AC_SUBST(LDFLAGS_NODIST) AC_SUBST(LDFLAGS_NOLTO) +AC_SUBST([WASM_ASSETS_DIR]) +AC_SUBST([WASM_STDLIB]) # The -arch flags for universal builds on macOS UNIVERSAL_ARCH_FLAGS= @@ -6252,6 +6292,12 @@ if test "$PY_ENABLE_SHARED" = 1 || test "$enable_framework" ; then else LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" fi + +dnl browser needs a WASM assets stdlib bundle +AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], + [Emscripten/browser], [LIBRARY_DEPS="$LIBRARY_DEPS \$(WASM_STDLIB)"], +) + AC_SUBST(STATIC_LIBPYTHON) AC_SUBST(LIBRARY_DEPS) @@ -6520,8 +6566,9 @@ PY_STDLIB_MOD([_ctypes_test], [test "$TEST_MODULES" = yes], [], [], [-lm]) dnl Limited API template modules. 
dnl The limited C API is not compatible with the Py_TRACE_REFS macro. -PY_STDLIB_MOD([xxlimited], [test "$with_trace_refs" = "no"]) -PY_STDLIB_MOD([xxlimited_35], [test "$with_trace_refs" = "no"]) +dnl Emscripten does not support shared libraries yet. +PY_STDLIB_MOD([xxlimited], [test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"]) +PY_STDLIB_MOD([xxlimited_35], [test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"]) # substitute multiline block, must come after last PY_STDLIB_MOD() AC_SUBST([MODULE_BLOCK]) From webhook-mailer at python.org Wed Jan 12 11:48:14 2022 From: webhook-mailer at python.org (gvanrossum) Date: Wed, 12 Jan 2022 16:48:14 -0000 Subject: [Python-checkins] bpo-46348: modernize `test_typing` (GH-30547) Message-ID: https://github.com/python/cpython/commit/e2a9c8ef09cb7123d6b28852a323e6cc1f878b5b commit: e2a9c8ef09cb7123d6b28852a323e6cc1f878b5b branch: main author: Nikita Sobolev committer: gvanrossum date: 2022-01-12T08:48:10-08:00 summary: bpo-46348: modernize `test_typing` (GH-30547) files: M Lib/test/mod_generics_cache.py M Lib/test/test_typing.py diff --git a/Lib/test/mod_generics_cache.py b/Lib/test/mod_generics_cache.py index 6d35c58396d42..9d8b56cf03c36 100644 --- a/Lib/test/mod_generics_cache.py +++ b/Lib/test/mod_generics_cache.py @@ -1,53 +1,21 @@ """Module for testing the behavior of generics across different modules.""" -import sys -from textwrap import dedent from typing import TypeVar, Generic, Optional +default_a: Optional['A'] = None +default_b: Optional['B'] = None -if sys.version_info[:2] >= (3, 6): - exec(dedent(""" - default_a: Optional['A'] = None - default_b: Optional['B'] = None +T = TypeVar('T') - T = TypeVar('T') - - class A(Generic[T]): - some_b: 'B' - - - class B(Generic[T]): - class A(Generic[T]): - pass - - my_inner_a1: 'B.A' - my_inner_a2: A - my_outer_a: 'A' # unless somebody calls get_type_hints with localns=B.__dict__ - """)) -else: # This should stay in sync with the syntax above. 
- __annotations__ = dict( - default_a=Optional['A'], - default_b=Optional['B'], - ) - default_a = None - default_b = None - - T = TypeVar('T') +class A(Generic[T]): + some_b: 'B' +class B(Generic[T]): class A(Generic[T]): - __annotations__ = dict( - some_b='B' - ) - - - class B(Generic[T]): - class A(Generic[T]): - pass + pass - __annotations__ = dict( - my_inner_a1='B.A', - my_inner_a2=A, - my_outer_a='A' # unless somebody calls get_type_hints with localns=B.__dict__ - ) + my_inner_a1: 'B.A' + my_inner_a2: A + my_outer_a: 'A' # unless somebody calls get_type_hints with localns=B.__dict__ diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index a94d77d4edf4b..af5b1df6b04ca 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -2938,7 +2938,9 @@ def blah(): blah() -ASYNCIO_TESTS = """ +# Definitions needed for features introduced in Python 3.6 + +from test import ann_module, ann_module2, ann_module3, ann_module5, ann_module6 import asyncio T_a = TypeVar('T_a') @@ -2972,19 +2974,6 @@ async def __aenter__(self) -> int: return 42 async def __aexit__(self, etype, eval, tb): return None -""" - -try: - exec(ASYNCIO_TESTS) -except ImportError: - ASYNCIO = False # multithreading is not enabled -else: - ASYNCIO = True - -# Definitions needed for features introduced in Python 3.6 - -from test import ann_module, ann_module2, ann_module3, ann_module5, ann_module6 -from typing import AsyncContextManager class A: y: float @@ -3044,7 +3033,7 @@ class HasForeignBaseClass(mod_generics_cache.A): some_xrepr: 'XRepr' other_a: 'mod_generics_cache.A' -async def g_with(am: AsyncContextManager[int]): +async def g_with(am: typing.AsyncContextManager[int]): x: int async with am as x: return x @@ -3386,7 +3375,6 @@ def test_iterator(self): self.assertIsInstance(it, typing.Iterator) self.assertNotIsInstance(42, typing.Iterator) - @skipUnless(ASYNCIO, 'Python 3.5 and multithreading required') def test_awaitable(self): ns = {} exec( @@ -3399,7 +3387,6 @@ def test_awaitable(self): self.assertNotIsInstance(foo, typing.Awaitable) g.send(None) # Run foo() till completion, to avoid warning. - @skipUnless(ASYNCIO, 'Python 3.5 and multithreading required') def test_coroutine(self): ns = {} exec( @@ -3417,7 +3404,6 @@ def test_coroutine(self): except StopIteration: pass - @skipUnless(ASYNCIO, 'Python 3.5 and multithreading required') def test_async_iterable(self): base_it = range(10) # type: Iterator[int] it = AsyncIteratorWrapper(base_it) @@ -3425,7 +3411,6 @@ def test_async_iterable(self): self.assertIsInstance(it, typing.AsyncIterable) self.assertNotIsInstance(42, typing.AsyncIterable) - @skipUnless(ASYNCIO, 'Python 3.5 and multithreading required') def test_async_iterator(self): base_it = range(10) # type: Iterator[int] it = AsyncIteratorWrapper(base_it) @@ -3580,7 +3565,6 @@ class MyOrdDict(typing.OrderedDict[str, int]): self.assertIsSubclass(MyOrdDict, collections.OrderedDict) self.assertNotIsSubclass(collections.OrderedDict, MyOrdDict) - @skipUnless(sys.version_info >= (3, 3), 'ChainMap was added in 3.3') def test_chainmap_instantiation(self): self.assertIs(type(typing.ChainMap()), collections.ChainMap) self.assertIs(type(typing.ChainMap[KT, VT]()), collections.ChainMap) @@ -3588,7 +3572,6 @@ def test_chainmap_instantiation(self): class CM(typing.ChainMap[KT, VT]): ... 
self.assertIs(type(CM[int, str]()), CM) - @skipUnless(sys.version_info >= (3, 3), 'ChainMap was added in 3.3') def test_chainmap_subclass(self): class MyChainMap(typing.ChainMap[str, int]): @@ -3852,7 +3835,6 @@ def manager(): self.assertIsInstance(cm, typing.ContextManager) self.assertNotIsInstance(42, typing.ContextManager) - @skipUnless(ASYNCIO, 'Python 3.5 required') def test_async_contextmanager(self): class NotACM: pass From webhook-mailer at python.org Wed Jan 12 13:55:06 2022 From: webhook-mailer at python.org (tim-one) Date: Wed, 12 Jan 2022 18:55:06 -0000 Subject: [Python-checkins] bpo-46020: Optimize long_pow for the common case (GH-30555) Message-ID: https://github.com/python/cpython/commit/fc05e6bfce5d5dfc23859e6f7862c1e707a12e42 commit: fc05e6bfce5d5dfc23859e6f7862c1e707a12e42 branch: main author: Tim Peters committer: tim-one date: 2022-01-12T12:55:02-06:00 summary: bpo-46020: Optimize long_pow for the common case (GH-30555) This cuts a bit of overhead by not initializing the table of small odd powers unless it's needed for a large exponent. files: M Objects/longobject.c diff --git a/Objects/longobject.c b/Objects/longobject.c index 2db8701a841a9..5d181aa0850aa 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -4215,8 +4215,13 @@ long_pow(PyObject *v, PyObject *w, PyObject *x) /* k-ary values. If the exponent is large enough, table is * precomputed so that table[i] == a**(2*i+1) % c for i in * range(EXP_TABLE_LEN). + * Note: this is uninitialzed stack trash: don't pay to set it to known + * values unless it's needed. Instead ensure that num_table_entries is + * set to the number of entries actually filled whenever a branch to the + * Error or Done labels is possible. */ - PyLongObject *table[EXP_TABLE_LEN] = {0}; + PyLongObject *table[EXP_TABLE_LEN]; + Py_ssize_t num_table_entries = 0; /* a, b, c = v, w, x */ CHECK_BINOP(v, w); @@ -4408,10 +4413,14 @@ long_pow(PyObject *v, PyObject *w, PyObject *x) */ Py_INCREF(a); table[0] = a; + num_table_entries = 1; MULT(a, a, a2); /* table[i] == a**(2*i + 1) % c */ - for (i = 1; i < EXP_TABLE_LEN; ++i) + for (i = 1; i < EXP_TABLE_LEN; ++i) { + table[i] = NULL; /* must set to known value for MULT */ MULT(table[i-1], a2, table[i]); + ++num_table_entries; /* incremented iff MULT succeeded */ + } Py_CLEAR(a2); /* Repeatedly extract the next (no more than) EXP_WINDOW_SIZE bits @@ -4472,10 +4481,8 @@ long_pow(PyObject *v, PyObject *w, PyObject *x) Py_CLEAR(z); /* fall through */ Done: - if (Py_SIZE(b) > HUGE_EXP_CUTOFF / PyLong_SHIFT) { - for (i = 0; i < EXP_TABLE_LEN; ++i) - Py_XDECREF(table[i]); - } + for (i = 0; i < num_table_entries; ++i) + Py_DECREF(table[i]); Py_DECREF(a); Py_DECREF(b); Py_XDECREF(c); From webhook-mailer at python.org Wed Jan 12 14:27:42 2022 From: webhook-mailer at python.org (tiran) Date: Wed, 12 Jan 2022 19:27:42 -0000 Subject: [Python-checkins] bpo-40280: Allow to compile _testcapi as builtin module (GH-30559) Message-ID: https://github.com/python/cpython/commit/e34c9367f8e0068ca4bcad9fb5c2c1024d02a77d commit: e34c9367f8e0068ca4bcad9fb5c2c1024d02a77d branch: main author: Christian Heimes committer: tiran date: 2022-01-12T20:27:37+01:00 summary: bpo-40280: Allow to compile _testcapi as builtin module (GH-30559) files: M Modules/Setup.stdlib.in M Modules/_testcapimodule.c M configure M configure.ac diff --git a/Modules/Setup.stdlib.in b/Modules/Setup.stdlib.in index 5788b446201e5..73f041eb2fba9 100644 --- a/Modules/Setup.stdlib.in +++ b/Modules/Setup.stdlib.in @@ -169,11 +169,10 @@ 
@MODULE__XXTESTFUZZ_TRUE at _xxtestfuzz _xxtestfuzz/_xxtestfuzz.c _xxtestfuzz/fuzzer.c @MODULE__TESTBUFFER_TRUE at _testbuffer _testbuffer.c @MODULE__TESTINTERNALCAPI_TRUE at _testinternalcapi _testinternalcapi.c - + at MODULE__TESTCAPI_TRUE@_testcapi _testcapimodule.c # Some testing modules MUST be built as shared libraries. *shared* - at MODULE__TESTCAPI_TRUE@_testcapi _testcapimodule.c @MODULE__TESTIMPORTMULTIPLE_TRUE at _testimportmultiple _testimportmultiple.c @MODULE__TESTMULTIPHASE_TRUE at _testmultiphase _testmultiphase.c @MODULE__CTYPES_TEST_TRUE at _ctypes_test _ctypes/_ctypes_test.c diff --git a/Modules/_testcapimodule.c b/Modules/_testcapimodule.c index ea9c048554d22..7369f094faedd 100644 --- a/Modules/_testcapimodule.c +++ b/Modules/_testcapimodule.c @@ -12,6 +12,8 @@ macro defined, but only the public C API must be tested here. */ #undef Py_BUILD_CORE_MODULE +#undef Py_BUILD_CORE_BUILTIN + /* Always enable assertions */ #undef NDEBUG diff --git a/configure b/configure index 327e9bd2d3f34..6c9aacc68a956 100755 --- a/configure +++ b/configure @@ -21258,8 +21258,8 @@ fi if test "$enable_test_modules" = no; then TEST_MODULES=no else - case $ac_sys_system in #( - Emscripten) : + case $ac_sys_system/$ac_sys_emscripten_target in #( + Emscripten/browser) : TEST_MODULES=no ;; #( *) : TEST_MODULES=yes diff --git a/configure.ac b/configure.ac index 25181c0f7ed17..4396828bf6fe6 100644 --- a/configure.ac +++ b/configure.ac @@ -6309,8 +6309,8 @@ AC_ARG_ENABLE(test-modules, if test "$enable_test_modules" = no; then TEST_MODULES=no else - AS_CASE([$ac_sys_system], - [Emscripten], [TEST_MODULES=no], + AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], + [Emscripten/browser], [TEST_MODULES=no], [TEST_MODULES=yes] ) fi From webhook-mailer at python.org Wed Jan 12 14:38:34 2022 From: webhook-mailer at python.org (gvanrossum) Date: Wed, 12 Jan 2022 19:38:34 -0000 Subject: [Python-checkins] bpo-46342: make @typing.final introspectable (GH-30530) Message-ID: https://github.com/python/cpython/commit/0bbf30e2b910bc9c5899134ae9d73a8df968da35 commit: 0bbf30e2b910bc9c5899134ae9d73a8df968da35 branch: main author: Jelle Zijlstra committer: gvanrossum date: 2022-01-12T11:38:25-08:00 summary: bpo-46342: make @typing.final introspectable (GH-30530) Co-authored-by: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst M Doc/library/typing.rst M Lib/test/test_typing.py M Lib/typing.py diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index de7aa086a9f82..cb14db90711cf 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -1985,6 +1985,15 @@ Functions and decorators .. versionadded:: 3.8 + .. versionchanged:: 3.11 + The decorator will now set the ``__final__`` attribute to ``True`` + on the decorated object. Thus, a check like + ``if getattr(obj, "__final__", False)`` can be used at runtime + to determine whether an object ``obj`` has been marked as final. + If the decorated object does not support setting attributes, + the decorator returns the object unchanged without raising an exception. + + .. decorator:: no_type_check Decorator to indicate that annotations are not type hints. 
diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index af5b1df6b04ca..fd8237a1a8c33 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -1,5 +1,7 @@ import contextlib import collections +from functools import lru_cache +import inspect import pickle import re import sys @@ -2536,10 +2538,80 @@ def test_no_isinstance(self): with self.assertRaises(TypeError): issubclass(int, Final) + +class FinalDecoratorTests(BaseTestCase): def test_final_unmodified(self): def func(x): ... self.assertIs(func, final(func)) + def test_dunder_final(self): + @final + def func(): ... + @final + class Cls: ... + self.assertIs(True, func.__final__) + self.assertIs(True, Cls.__final__) + + class Wrapper: + __slots__ = ("func",) + def __init__(self, func): + self.func = func + def __call__(self, *args, **kwargs): + return self.func(*args, **kwargs) + + # Check that no error is thrown if the attribute + # is not writable. + @final + @Wrapper + def wrapped(): ... + self.assertIsInstance(wrapped, Wrapper) + self.assertIs(False, hasattr(wrapped, "__final__")) + + class Meta(type): + @property + def __final__(self): return "can't set me" + @final + class WithMeta(metaclass=Meta): ... + self.assertEqual(WithMeta.__final__, "can't set me") + + # Builtin classes throw TypeError if you try to set an + # attribute. + final(int) + self.assertIs(False, hasattr(int, "__final__")) + + # Make sure it works with common builtin decorators + class Methods: + @final + @classmethod + def clsmethod(cls): ... + + @final + @staticmethod + def stmethod(): ... + + # The other order doesn't work because property objects + # don't allow attribute assignment. + @property + @final + def prop(self): ... + + @final + @lru_cache() + def cached(self): ... + + # Use getattr_static because the descriptor returns the + # underlying function, which doesn't have __final__. + self.assertIs( + True, + inspect.getattr_static(Methods, "clsmethod").__final__ + ) + self.assertIs( + True, + inspect.getattr_static(Methods, "stmethod").__final__ + ) + self.assertIs(True, Methods.prop.fget.__final__) + self.assertIs(True, Methods.cached.__final__) + class CastTests(BaseTestCase): diff --git a/Lib/typing.py b/Lib/typing.py index d520f6b2e1b3d..972b8ba24b27e 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -2042,8 +2042,17 @@ class Leaf: class Other(Leaf): # Error reported by type checker ... - There is no runtime checking of these properties. + There is no runtime checking of these properties. The decorator + sets the ``__final__`` attribute to ``True`` on the decorated object + to allow runtime introspection. """ + try: + f.__final__ = True + except (AttributeError, TypeError): + # Skip the attribute silently if it is not writable. + # AttributeError happens if the object has __slots__ or a + # read-only property, TypeError if it's a builtin class. + pass return f diff --git a/Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst b/Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst new file mode 100644 index 0000000000000..31d484fc77f1f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst @@ -0,0 +1,2 @@ +The ``@typing.final`` decorator now sets the ``__final__`` attribute on the +decorated object to allow runtime introspection. Patch by Jelle Zijlstra. 
From webhook-mailer at python.org Wed Jan 12 18:28:56 2022 From: webhook-mailer at python.org (ericsnowcurrently) Date: Wed, 12 Jan 2022 23:28:56 -0000 Subject: [Python-checkins] bpo-45953: Statically allocate the main interpreter (and initial thread state). (gh-29883) Message-ID: https://github.com/python/cpython/commit/ed57b36c32e521162dbb97199e64a340d3bff827 commit: ed57b36c32e521162dbb97199e64a340d3bff827 branch: main author: Eric Snow committer: ericsnowcurrently date: 2022-01-12T16:28:46-07:00 summary: bpo-45953: Statically allocate the main interpreter (and initial thread state). (gh-29883) Previously, the main interpreter was allocated on the heap during runtime initialization. Here we instead embed it into _PyRuntimeState, which means it is statically allocated as part of the _PyRuntime global. The same goes for the initial thread state (of each interpreter, including the main one). Consequently there are fewer allocations during runtime/interpreter init, fewer possible failures, and better memory locality. FYI, this also helps efforts to consolidate globals, which in turns helps work on subinterpreter isolation. https://bugs.python.org/issue45953 files: A Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst M Include/cpython/pystate.h M Include/internal/pycore_global_objects.h M Include/internal/pycore_interp.h M Include/internal/pycore_runtime.h M Modules/signalmodule.c M Python/ceval.c M Python/pystate.c diff --git a/Include/cpython/pystate.h b/Include/cpython/pystate.h index c37123c4f6922..bcb1bb25a4940 100644 --- a/Include/cpython/pystate.h +++ b/Include/cpython/pystate.h @@ -2,6 +2,9 @@ # error "this header file must not be included directly" #endif +#include + + PyAPI_FUNC(int) _PyInterpreterState_RequiresIDRef(PyInterpreterState *); PyAPI_FUNC(void) _PyInterpreterState_RequireIDRef(PyInterpreterState *, int); @@ -83,6 +86,9 @@ struct _ts { after allocation. */ int _initialized; + /* Was this thread state statically allocated? */ + bool _static; + int recursion_remaining; int recursion_limit; int recursion_headroom; /* Allow 50 more calls to handle any errors. */ @@ -175,9 +181,11 @@ struct _ts { PyObject **datastack_top; PyObject **datastack_limit; /* XXX signal handlers should also be here */ - }; + +/* other API */ + // Alias for backward compatibility with Python 3.8 #define _PyInterpreterState_Get PyInterpreterState_Get diff --git a/Include/internal/pycore_global_objects.h b/Include/internal/pycore_global_objects.h index d2dc907c53d6d..de7ab9b53eb26 100644 --- a/Include/internal/pycore_global_objects.h +++ b/Include/internal/pycore_global_objects.h @@ -606,10 +606,6 @@ struct _Py_global_objects { }, \ } -static inline void -_Py_global_objects_reset(struct _Py_global_objects *objects) -{ -} #ifdef __cplusplus } diff --git a/Include/internal/pycore_interp.h b/Include/internal/pycore_interp.h index d48ea87fd67fe..77e42b65f5d3c 100644 --- a/Include/internal/pycore_interp.h +++ b/Include/internal/pycore_interp.h @@ -8,6 +8,8 @@ extern "C" { # error "this header requires Py_BUILD_CORE define" #endif +#include + #include "pycore_atomic.h" // _Py_atomic_address #include "pycore_ast_state.h" // struct ast_state #include "pycore_context.h" // struct _Py_context_state @@ -70,13 +72,18 @@ struct atexit_state { /* interpreter state */ -// The PyInterpreterState typedef is in Include/pystate.h. +/* PyInterpreterState holds the global state for one of the runtime's + interpreters. Typically the initial (main) interpreter is the only one. 
+ + The PyInterpreterState typedef is in Include/pystate.h. + */ struct _is { struct _is *next; struct pythreads { uint64_t next_unique_id; + /* The linked list of threads, newest first. */ struct _ts *head; /* Used in Modules/_threadmodule.c. */ long count; @@ -104,6 +111,9 @@ struct _is { int _initialized; int finalizing; + /* Was this interpreter statically allocated? */ + bool _static; + struct _ceval_state ceval; struct _gc_runtime_state gc; @@ -166,8 +176,26 @@ struct _is { struct ast_state ast; struct type_cache type_cache; + + /* The following fields are here to avoid allocation during init. + The data is exposed through PyInterpreterState pointer fields. + These fields should not be accessed directly outside of init. + + All other PyInterpreterState pointer fields are populated when + needed and default to NULL. + + For now there are some exceptions to that rule, which require + allocation during init. These will be addressed on a case-by-case + basis. Also see _PyRuntimeState regarding the various mutex fields. + */ + + /* the initial PyInterpreterState.threads.head */ + struct _ts _initial_thread; }; + +/* other API */ + extern void _PyInterpreterState_ClearModules(PyInterpreterState *interp); extern void _PyInterpreterState_Clear(PyThreadState *tstate); diff --git a/Include/internal/pycore_runtime.h b/Include/internal/pycore_runtime.h index 725c859ea7853..a66a3cf3a3944 100644 --- a/Include/internal/pycore_runtime.h +++ b/Include/internal/pycore_runtime.h @@ -11,8 +11,10 @@ extern "C" { #include "pycore_atomic.h" /* _Py_atomic_address */ #include "pycore_gil.h" // struct _gil_runtime_state #include "pycore_global_objects.h" // struct _Py_global_objects +#include "pycore_interp.h" // struct _is #include "pycore_unicodeobject.h" // struct _Py_unicode_runtime_ids + /* ceval state */ struct _ceval_runtime_state { @@ -53,6 +55,9 @@ typedef struct _Py_AuditHookEntry { /* Full Python runtime state */ +/* _PyRuntimeState holds the global state for the CPython runtime. + That data is exposed in the internal API as a static variable (_PyRuntime). + */ typedef struct pyruntimestate { /* Has been initialized to a safe state. @@ -81,7 +86,11 @@ typedef struct pyruntimestate { struct pyinterpreters { PyThread_type_lock mutex; + /* The linked list of interpreters, newest first. */ PyInterpreterState *head; + /* The runtime's initial interpreter, which has a special role + in the operation of the runtime. It is also often the only + interpreter. */ PyInterpreterState *main; /* _next_interp_id is an auto-numbered sequence of small integers. It gets initialized in _PyInterpreterState_Init(), @@ -118,25 +127,44 @@ typedef struct pyruntimestate { struct _Py_unicode_runtime_ids unicode_ids; + /* All the objects that are shared by the runtime's interpreters. */ struct _Py_global_objects global_objects; - // If anything gets added after global_objects then - // _PyRuntimeState_reset() needs to get updated to clear it. + + /* The following fields are here to avoid allocation during init. + The data is exposed through _PyRuntimeState pointer fields. + These fields should not be accessed directly outside of init. + + All other _PyRuntimeState pointer fields are populated when + needed and default to NULL. + + For now there are some exceptions to that rule, which require + allocation during init. These will be addressed on a case-by-case + basis. Most notably, we don't pre-allocated the several mutex + (PyThread_type_lock) fields, because on Windows we only ever get + a pointer type. 
+ */ + + /* PyInterpreterState.interpreters.main */ + PyInterpreterState _main_interpreter; } _PyRuntimeState; +#define _PyThreadState_INIT \ + { \ + ._static = 1, \ + } +#define _PyInterpreterState_INIT \ + { \ + ._static = 1, \ + ._initial_thread = _PyThreadState_INIT, \ + } #define _PyRuntimeState_INIT \ { \ .global_objects = _Py_global_objects_INIT, \ + ._main_interpreter = _PyInterpreterState_INIT, \ } -/* Note: _PyRuntimeState_INIT sets other fields to 0/NULL */ -static inline void -_PyRuntimeState_reset(_PyRuntimeState *runtime) -{ - /* Make it match _PyRuntimeState_INIT. */ - memset(runtime, 0, (size_t)&runtime->global_objects - (size_t)runtime); - _Py_global_objects_reset(&runtime->global_objects); -} +/* other API */ PyAPI_DATA(_PyRuntimeState) _PyRuntime; diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst new file mode 100644 index 0000000000000..4fa27b60c02f8 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst @@ -0,0 +1,4 @@ +The main interpreter in _PyRuntimeState.interpreters is now statically +allocated (as part of _PyRuntime). Likewise for the initial thread state of +each interpreter. This means less allocation during runtime init, as well +as better memory locality for these key state objects. diff --git a/Modules/signalmodule.c b/Modules/signalmodule.c index 9316a9eed7684..e6f56e0aea9a9 100644 --- a/Modules/signalmodule.c +++ b/Modules/signalmodule.c @@ -292,7 +292,7 @@ trip_signal(int sig_num) _Py_atomic_store(&is_tripped, 1); /* Signals are always handled by the main interpreter */ - PyInterpreterState *interp = _PyRuntime.interpreters.main; + PyInterpreterState *interp = _PyInterpreterState_Main(); /* Notify ceval.c */ _PyEval_SignalReceived(interp); diff --git a/Python/ceval.c b/Python/ceval.c index 8e878cbf7e2b3..d33cd4e1edb5d 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -617,7 +617,7 @@ Py_AddPendingCall(int (*func)(void *), void *arg) } else { /* Last resort: use the main interpreter */ - interp = _PyRuntime.interpreters.main; + interp = _PyInterpreterState_Main(); } return _PyEval_AddPendingCall(interp, func, arg); } diff --git a/Python/pystate.c b/Python/pystate.c index 68fae8d283091..a18a159b55175 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -46,6 +46,10 @@ static PyThreadState *_PyGILState_GetThisThreadState(struct _gilstate_runtime_st static void _PyThreadState_Delete(PyThreadState *tstate, int check_current); +/* We use "initial" if the runtime gets re-used + (e.g. Py_Finalize() followed by Py_Initialize(). */ +static const _PyRuntimeState initial = _PyRuntimeState_INIT; + static int alloc_for_runtime(PyThread_type_lock *plock1, PyThread_type_lock *plock2, PyThread_type_lock *plock3) @@ -91,9 +95,12 @@ init_runtime(_PyRuntimeState *runtime, PyThread_type_lock xidregistry_mutex) { if (runtime->_initialized) { - _PyRuntimeState_reset(runtime); - assert(!runtime->initialized); + Py_FatalError("runtime already initialized"); } + assert(!runtime->preinitializing && + !runtime->preinitialized && + !runtime->core_initialized && + !runtime->initialized); runtime->open_code_hook = open_code_hook; runtime->open_code_userdata = open_code_userdata; @@ -144,6 +151,11 @@ _PyRuntimeState_Init(_PyRuntimeState *runtime) return _PyStatus_NO_MEMORY(); } + if (runtime->_initialized) { + // Py_Initialize() must be running again. + // Reset to _PyRuntimeState_INIT. 
+ memcpy(runtime, &initial, sizeof(*runtime)); + } init_runtime(runtime, open_code_hook, open_code_userdata, audit_hook_head, unicode_next_index, lock1, lock2, lock3); @@ -250,13 +262,15 @@ alloc_interpreter(void) static void free_interpreter(PyInterpreterState *interp) { - PyMem_RawFree(interp); + if (!interp->_static) { + PyMem_RawFree(interp); + } } /* Get the interpreter state to a minimal consistent state. Further init happens in pylifecycle.c before it can be used. All fields not initialized here are expected to be zeroed out, - e.g. by PyMem_RawCalloc() or memset(). + e.g. by PyMem_RawCalloc() or memset(), or otherwise pre-initialized. The runtime state is not manipulated. Instead it is assumed that the interpreter is getting added to the runtime. */ @@ -338,23 +352,23 @@ PyInterpreterState_New(void) assert(interpreters->main == NULL); assert(id == 0); - interp = alloc_interpreter(); - if (interp == NULL) { - goto error; - } + interp = &runtime->_main_interpreter; assert(interp->id == 0); assert(interp->next == NULL); interpreters->main = interp; } else { - assert(id != 0); assert(interpreters->main != NULL); + assert(id != 0); interp = alloc_interpreter(); if (interp == NULL) { goto error; } + // Set to _PyInterpreterState_INIT. + memcpy(interp, &initial._main_interpreter, + sizeof(*interp)); if (id < 0) { /* overflow or Py_Initialize() not called yet! */ @@ -735,13 +749,15 @@ alloc_threadstate(void) static void free_threadstate(PyThreadState *tstate) { - PyMem_RawFree(tstate); + if (!tstate->_static) { + PyMem_RawFree(tstate); + } } /* Get the thread state to a minimal consistent state. Further init happens in pylifecycle.c before it can be used. All fields not initialized here are expected to be zeroed out, - e.g. by PyMem_RawCalloc() or memset(). + e.g. by PyMem_RawCalloc() or memset(), or otherwise pre-initialized. The interpreter state is not manipulated. Instead it is assumed that the thread is getting added to the interpreter. */ @@ -808,10 +824,7 @@ new_threadstate(PyInterpreterState *interp) // It's the interpreter's initial thread state. assert(id == 1); - tstate = alloc_threadstate(); - if (tstate == NULL) { - goto error; - } + tstate = &interp->_initial_thread; } else { // Every valid interpreter must have at least one thread. @@ -822,6 +835,10 @@ new_threadstate(PyInterpreterState *interp) if (tstate == NULL) { goto error; } + // Set to _PyThreadState_INIT. 
+ memcpy(tstate, + &initial._main_interpreter._initial_thread, + sizeof(*tstate)); } interp->threads.head = tstate; @@ -1159,7 +1176,7 @@ _PyThreadState_DeleteExcept(_PyRuntimeState *runtime, PyThreadState *tstate) for (p = list; p; p = next) { next = p->next; PyThreadState_Clear(p); - PyMem_RawFree(p); + free_threadstate(p); } } From webhook-mailer at python.org Wed Jan 12 18:36:01 2022 From: webhook-mailer at python.org (gvanrossum) Date: Wed, 12 Jan 2022 23:36:01 -0000 Subject: [Python-checkins] bpo-46345: Add a test case for implicit `Optional` class attribute (GH-30535) Message-ID: https://github.com/python/cpython/commit/1de60155d5d01be2924e72fb68dd13d4fd00acd7 commit: 1de60155d5d01be2924e72fb68dd13d4fd00acd7 branch: main author: Nikita Sobolev committer: gvanrossum date: 2022-01-12T15:35:44-08:00 summary: bpo-46345: Add a test case for implicit `Optional` class attribute (GH-30535) files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index fd8237a1a8c33..c11f6f7c19224 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -3177,6 +3177,12 @@ def test_get_type_hints_classes(self): 'my_inner_a2': mod_generics_cache.B.A, 'my_outer_a': mod_generics_cache.A}) + def test_get_type_hints_classes_no_implicit_optional(self): + class WithNoneDefault: + field: int = None # most type-checkers won't be happy with it + + self.assertEqual(gth(WithNoneDefault), {'field': int}) + def test_respect_no_type_check(self): @no_type_check class NoTpCheck: From webhook-mailer at python.org Wed Jan 12 20:45:58 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 13 Jan 2022 01:45:58 -0000 Subject: [Python-checkins] bpo-46345: Add a test case for implicit `Optional` class attribute (GH-30535) Message-ID: https://github.com/python/cpython/commit/a468866a67d83a24e3a3c3a0070129773d28bbd9 commit: a468866a67d83a24e3a3c3a0070129773d28bbd9 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-12T17:45:49-08:00 summary: bpo-46345: Add a test case for implicit `Optional` class attribute (GH-30535) (cherry picked from commit 1de60155d5d01be2924e72fb68dd13d4fd00acd7) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 82b6f8c1c6406..f943aed73614c 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -3104,6 +3104,12 @@ def test_get_type_hints_classes(self): 'my_inner_a2': mod_generics_cache.B.A, 'my_outer_a': mod_generics_cache.A}) + def test_get_type_hints_classes_no_implicit_optional(self): + class WithNoneDefault: + field: int = None # most type-checkers won't be happy with it + + self.assertEqual(gth(WithNoneDefault), {'field': int}) + def test_respect_no_type_check(self): @no_type_check class NoTpCheck: From webhook-mailer at python.org Wed Jan 12 20:46:39 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 13 Jan 2022 01:46:39 -0000 Subject: [Python-checkins] bpo-46345: Add a test case for implicit `Optional` class attribute (GH-30535) Message-ID: https://github.com/python/cpython/commit/d9101c4e49dc29f21319493818130ad5468402a2 commit: d9101c4e49dc29f21319493818130ad5468402a2 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-12T17:46:35-08:00 summary: bpo-46345: 
Add a test case for implicit `Optional` class attribute (GH-30535) (cherry picked from commit 1de60155d5d01be2924e72fb68dd13d4fd00acd7) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 19c38ec833c67..8cdb1166c847f 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -2943,6 +2943,12 @@ def test_get_type_hints_classes(self): 'my_inner_a2': mod_generics_cache.B.A, 'my_outer_a': mod_generics_cache.A}) + def test_get_type_hints_classes_no_implicit_optional(self): + class WithNoneDefault: + field: int = None # most type-checkers won't be happy with it + + self.assertEqual(gth(WithNoneDefault), {'field': int}) + def test_respect_no_type_check(self): @no_type_check class NoTpCheck: From webhook-mailer at python.org Thu Jan 13 03:46:15 2022 From: webhook-mailer at python.org (tiran) Date: Thu, 13 Jan 2022 08:46:15 -0000 Subject: [Python-checkins] bpo-46315: Add ifdef HAVE_ feature checks for WASI compatibility (GH-30507) Message-ID: https://github.com/python/cpython/commit/a6ca8eee2254762422f90cf94fbaac34f85db780 commit: a6ca8eee2254762422f90cf94fbaac34f85db780 branch: main author: Christian Heimes committer: tiran date: 2022-01-13T09:46:04+01:00 summary: bpo-46315: Add ifdef HAVE_ feature checks for WASI compatibility (GH-30507) files: A Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst A Tools/wasm/config.site-wasm32-wasi M Include/internal/pycore_condvar.h M Include/pythread.h M Modules/_randommodule.c M Modules/clinic/posixmodule.c.h M Modules/faulthandler.c M Modules/posixmodule.c M Modules/xxsubtype.c M PC/pyconfig.h M Python/pyfpe.c M configure M configure.ac M pyconfig.h.in diff --git a/Include/internal/pycore_condvar.h b/Include/internal/pycore_condvar.h index edb7dc8193cb8..981c962bf7dfd 100644 --- a/Include/internal/pycore_condvar.h +++ b/Include/internal/pycore_condvar.h @@ -20,7 +20,9 @@ */ #define Py_HAVE_CONDVAR -#include +#ifdef HAVE_PTHREAD_H +# include +#endif #define PyMUTEX_T pthread_mutex_t #define PyCOND_T pthread_cond_t diff --git a/Include/pythread.h b/Include/pythread.h index 1a6092c4ad0be..034e660551531 100644 --- a/Include/pythread.h +++ b/Include/pythread.h @@ -125,7 +125,7 @@ Py_DEPRECATED(3.7) PyAPI_FUNC(void) PyThread_ReInitTLS(void); typedef struct _Py_tss_t Py_tss_t; /* opaque */ #ifndef Py_LIMITED_API -#if defined(_POSIX_THREADS) +#ifdef HAVE_PTHREAD_H /* Darwin needs pthread.h to know type name the pthread_key_t. */ # include # define NATIVE_TSS_KEY_T pthread_key_t diff --git a/Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst b/Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst new file mode 100644 index 0000000000000..9360f91e45dd2 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst @@ -0,0 +1,2 @@ +Added and fixed ``#ifdef HAVE_FEATURE`` checks for functionality that is not +available on WASI platform. 
diff --git a/Modules/_randommodule.c b/Modules/_randommodule.c index 5243d5a05e290..45860e342eb43 100644 --- a/Modules/_randommodule.c +++ b/Modules/_randommodule.c @@ -258,7 +258,11 @@ random_seed_time_pid(RandomObject *self) key[0] = (uint32_t)(now & 0xffffffffU); key[1] = (uint32_t)(now >> 32); +#ifdef HAVE_GETPID key[2] = (uint32_t)getpid(); +#else + key[2] = 0; +#endif now = _PyTime_GetMonotonicClock(); key[3] = (uint32_t)(now & 0xffffffffU); diff --git a/Modules/clinic/posixmodule.c.h b/Modules/clinic/posixmodule.c.h index 86da08711fd44..282a5410f7020 100644 --- a/Modules/clinic/posixmodule.c.h +++ b/Modules/clinic/posixmodule.c.h @@ -1831,6 +1831,8 @@ os_system(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject *k #endif /* defined(HAVE_SYSTEM) && !defined(MS_WINDOWS) */ +#if defined(HAVE_UMASK) + PyDoc_STRVAR(os_umask__doc__, "umask($module, mask, /)\n" "--\n" @@ -1859,6 +1861,8 @@ os_umask(PyObject *module, PyObject *arg) return return_value; } +#endif /* defined(HAVE_UMASK) */ + PyDoc_STRVAR(os_unlink__doc__, "unlink($module, /, path, *, dir_fd=None)\n" "--\n" @@ -8812,6 +8816,10 @@ os_waitstatus_to_exitcode(PyObject *module, PyObject *const *args, Py_ssize_t na #define OS_SYSTEM_METHODDEF #endif /* !defined(OS_SYSTEM_METHODDEF) */ +#ifndef OS_UMASK_METHODDEF + #define OS_UMASK_METHODDEF +#endif /* !defined(OS_UMASK_METHODDEF) */ + #ifndef OS_UNAME_METHODDEF #define OS_UNAME_METHODDEF #endif /* !defined(OS_UNAME_METHODDEF) */ @@ -9295,4 +9303,4 @@ os_waitstatus_to_exitcode(PyObject *module, PyObject *const *args, Py_ssize_t na #ifndef OS_WAITSTATUS_TO_EXITCODE_METHODDEF #define OS_WAITSTATUS_TO_EXITCODE_METHODDEF #endif /* !defined(OS_WAITSTATUS_TO_EXITCODE_METHODDEF) */ -/*[clinic end generated code: output=05505f171cdcff72 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=d95ba7b0b9c52685 input=a9049054013a1b77]*/ diff --git a/Modules/faulthandler.c b/Modules/faulthandler.c index cb2e2588e19b2..1888337cf9f39 100644 --- a/Modules/faulthandler.c +++ b/Modules/faulthandler.c @@ -10,7 +10,7 @@ #include #include #include // abort() -#if defined(HAVE_PTHREAD_SIGMASK) && !defined(HAVE_BROKEN_PTHREAD_SIGMASK) +#if defined(HAVE_PTHREAD_SIGMASK) && !defined(HAVE_BROKEN_PTHREAD_SIGMASK) && defined(HAVE_PTHREAD_H) # include #endif #ifdef MS_WINDOWS diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index 21adf806a4e85..904f8bfa55807 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -3292,7 +3292,14 @@ os_chmod_impl(PyObject *module, path_t *path, int mode, int dir_fd, } else #endif /* HAVE_FHCMODAT */ + { +#ifdef HAVE_CHMOD result = chmod(path->narrow, mode); +#else + result = -1; + errno = ENOSYS; +#endif + } Py_END_ALLOW_THREADS if (result) { @@ -4885,6 +4892,7 @@ os_system_impl(PyObject *module, PyObject *command) #endif /* HAVE_SYSTEM */ +#ifdef HAVE_UMASK /*[clinic input] os.umask @@ -4903,6 +4911,7 @@ os_umask_impl(PyObject *module, int mask) return posix_error(); return PyLong_FromLong((long)i); } +#endif #ifdef MS_WINDOWS diff --git a/Modules/xxsubtype.c b/Modules/xxsubtype.c index 7200337724e08..768dac9d1b19b 100644 --- a/Modules/xxsubtype.c +++ b/Modules/xxsubtype.c @@ -237,10 +237,11 @@ spam_bench(PyObject *self, PyObject *args) { PyObject *obj, *name, *res; int n = 1000; - time_t t0, t1; + time_t t0 = 0, t1 = 0; if (!PyArg_ParseTuple(args, "OU|i", &obj, &name, &n)) return NULL; +#ifdef HAVE_CLOCK t0 = clock(); while (--n >= 0) { res = PyObject_GetAttr(obj, name); @@ -249,6 +250,7 @@ spam_bench(PyObject *self, PyObject 
*args) Py_DECREF(res); } t1 = clock(); +#endif return PyFloat_FromDouble((double)(t1-t0) / CLOCKS_PER_SEC); } diff --git a/PC/pyconfig.h b/PC/pyconfig.h index e0d875adf2e4a..e8649be568420 100644 --- a/PC/pyconfig.h +++ b/PC/pyconfig.h @@ -529,6 +529,9 @@ Py_NO_ENABLE_SHARED to find out. Also support MS_NO_COREDLL for b/w compat */ /* Define if you have times. */ /* #undef HAVE_TIMES */ +/* Define to 1 if you have the `umask' function. */ +#define HAVE_UMASK 1 + /* Define if you have uname. */ /* #undef HAVE_UNAME */ diff --git a/Python/pyfpe.c b/Python/pyfpe.c index 31ef5d73b70ac..9b1260f687a77 100644 --- a/Python/pyfpe.c +++ b/Python/pyfpe.c @@ -3,9 +3,12 @@ * though, because they may be referenced by extensions using the stable ABI. */ -#include "setjmp.h" +#ifdef HAVE_SETJMP_H +#include jmp_buf PyFPE_jbuf; +#endif + int PyFPE_counter; double diff --git a/Tools/wasm/config.site-wasm32-wasi b/Tools/wasm/config.site-wasm32-wasi new file mode 100644 index 0000000000000..be26c46a148fe --- /dev/null +++ b/Tools/wasm/config.site-wasm32-wasi @@ -0,0 +1,17 @@ +# config.site override for cross compiling to wasm32-wasi platform +# +# Written by Christian Heimes +# Partly based on pyodide's pyconfig.undefs.h file. + + +# cannot be detected in cross builds +ac_cv_buggy_getaddrinfo=no + +# WASI has no /dev/pt* +ac_cv_file__dev_ptmx=no +ac_cv_file__dev_ptc=no + +# dummy readelf, WASI build does not need readelf. +ac_cv_prog_ac_ct_READELF=true + +ac_cv_func_eventfd=no diff --git a/configure b/configure index 6c9aacc68a956..127b350b4bb04 100755 --- a/configure +++ b/configure @@ -6268,7 +6268,7 @@ else EXEEXT=.html ;; #( Emscripten/node) : EXEEXT=.js ;; #( - wasi/*) : + WASI/*) : EXEEXT=.wasm ;; #( *) : EXEEXT= @@ -7627,6 +7627,15 @@ case $ac_sys_system/$ac_sys_emscripten_target in #( LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" CFLAGS_NODIST="$(CFLAGS_NODIST) -pthread" ;; #( + WASI) : + + +$as_echo "#define _WASI_EMULATED_SIGNAL 1" >>confdefs.h + + LIBS="$LIBS -lwasi-emulated-signal" + echo "#define _WASI_EMULATED_SIGNAL 1" >> confdefs.h + + ;; #( *) : ;; esac @@ -8543,7 +8552,7 @@ for ac_header in \ alloca.h asm/types.h bluetooth.h conio.h crypt.h direct.h dlfcn.h endian.h errno.h fcntl.h grp.h \ ieeefp.h io.h langinfo.h libintl.h libutil.h linux/memfd.h linux/random.h linux/soundcard.h \ linux/tipc.h linux/wait.h netinet/in.h netpacket/packet.h poll.h process.h pthread.h pty.h \ - sched.h shadow.h signal.h spawn.h stropts.h sys/audioio.h sys/bsdtty.h sys/devpoll.h \ + sched.h setjmp.h shadow.h signal.h spawn.h stropts.h sys/audioio.h sys/bsdtty.h sys/devpoll.h \ sys/endian.h sys/epoll.h sys/event.h sys/eventfd.h sys/file.h sys/ioctl.h sys/kern_control.h \ sys/loadavg.h sys/lock.h sys/memfd.h sys/mkdev.h sys/mman.h sys/modem.h sys/param.h sys/poll.h \ sys/random.h sys/resource.h sys/select.h sys/sendfile.h sys/socket.h sys/soundcard.h sys/stat.h \ @@ -13630,7 +13639,7 @@ fi # checks for library functions for ac_func in \ - accept4 alarm bind_textdomain_codeset chown clock close_range confstr \ + accept4 alarm bind_textdomain_codeset chmod chown clock close_range confstr \ copy_file_range ctermid dup3 execv explicit_bzero explicit_memset \ faccessat fchmod fchmodat fchown fchownat fdopendir fdwalk fexecve \ fork fork1 fpathconf fstatat ftime ftruncate futimens futimes futimesat \ @@ -13652,7 +13661,7 @@ for ac_func in \ sigfillset siginterrupt sigpending sigrelse sigtimedwait sigwait \ sigwaitinfo 
snprintf splice strftime strlcpy strsignal symlinkat sync \ sysconf system tcgetpgrp tcsetpgrp tempnam timegm times tmpfile \ - tmpnam tmpnam_r truncate ttyname uname unlinkat utimensat utimes vfork \ + tmpnam tmpnam_r truncate ttyname umask uname unlinkat utimensat utimes vfork \ wait wait3 wait4 waitid waitpid wcscoll wcsftime wcsxfrm wmemcmp writev \ do : diff --git a/configure.ac b/configure.ac index 4396828bf6fe6..e5ebf7bc2e07a 100644 --- a/configure.ac +++ b/configure.ac @@ -1093,7 +1093,7 @@ AC_ARG_WITH([suffix], AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], [Emscripten/browser], [EXEEXT=.html], [Emscripten/node], [EXEEXT=.js], - [wasi/*], [EXEEXT=.wasm], + [WASI/*], [EXEEXT=.wasm], [EXEEXT=] ) ]) @@ -1805,6 +1805,11 @@ AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" CFLAGS_NODIST="$(CFLAGS_NODIST) -pthread" ], + [WASI], [ + AC_DEFINE([_WASI_EMULATED_SIGNAL], [1], [Define to 1 if you want to emulate signals on WASI]) + LIBS="$LIBS -lwasi-emulated-signal" + echo "#define _WASI_EMULATED_SIGNAL 1" >> confdefs.h + ] ) AC_SUBST(BASECFLAGS) @@ -2306,7 +2311,7 @@ AC_CHECK_HEADERS([ \ alloca.h asm/types.h bluetooth.h conio.h crypt.h direct.h dlfcn.h endian.h errno.h fcntl.h grp.h \ ieeefp.h io.h langinfo.h libintl.h libutil.h linux/memfd.h linux/random.h linux/soundcard.h \ linux/tipc.h linux/wait.h netinet/in.h netpacket/packet.h poll.h process.h pthread.h pty.h \ - sched.h shadow.h signal.h spawn.h stropts.h sys/audioio.h sys/bsdtty.h sys/devpoll.h \ + sched.h setjmp.h shadow.h signal.h spawn.h stropts.h sys/audioio.h sys/bsdtty.h sys/devpoll.h \ sys/endian.h sys/epoll.h sys/event.h sys/eventfd.h sys/file.h sys/ioctl.h sys/kern_control.h \ sys/loadavg.h sys/lock.h sys/memfd.h sys/mkdev.h sys/mman.h sys/modem.h sys/param.h sys/poll.h \ sys/random.h sys/resource.h sys/select.h sys/sendfile.h sys/socket.h sys/soundcard.h sys/stat.h \ @@ -4062,7 +4067,7 @@ fi # checks for library functions AC_CHECK_FUNCS([ \ - accept4 alarm bind_textdomain_codeset chown clock close_range confstr \ + accept4 alarm bind_textdomain_codeset chmod chown clock close_range confstr \ copy_file_range ctermid dup3 execv explicit_bzero explicit_memset \ faccessat fchmod fchmodat fchown fchownat fdopendir fdwalk fexecve \ fork fork1 fpathconf fstatat ftime ftruncate futimens futimes futimesat \ @@ -4084,7 +4089,7 @@ AC_CHECK_FUNCS([ \ sigfillset siginterrupt sigpending sigrelse sigtimedwait sigwait \ sigwaitinfo snprintf splice strftime strlcpy strsignal symlinkat sync \ sysconf system tcgetpgrp tcsetpgrp tempnam timegm times tmpfile \ - tmpnam tmpnam_r truncate ttyname uname unlinkat utimensat utimes vfork \ + tmpnam tmpnam_r truncate ttyname umask uname unlinkat utimensat utimes vfork \ wait wait3 wait4 waitid waitpid wcscoll wcsftime wcsxfrm wmemcmp writev \ ]) diff --git a/pyconfig.h.in b/pyconfig.h.in index f496b771999d9..21822197708d3 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -127,6 +127,9 @@ /* Define to 1 if you have the 'chflags' function. */ #undef HAVE_CHFLAGS +/* Define to 1 if you have the `chmod' function. */ +#undef HAVE_CHMOD + /* Define to 1 if you have the `chown' function. */ #undef HAVE_CHOWN @@ -977,6 +980,9 @@ /* Define to 1 if you have the `setitimer' function. */ #undef HAVE_SETITIMER +/* Define to 1 if you have the header file. */ +#undef HAVE_SETJMP_H + /* Define to 1 if you have the `setlocale' function. 
*/ #undef HAVE_SETLOCALE @@ -1336,6 +1342,9 @@ /* Define this if you have tcl and TCL_UTF_MAX==6 */ #undef HAVE_UCS4_TCL +/* Define to 1 if you have the `umask' function. */ +#undef HAVE_UMASK + /* Define to 1 if you have the `uname' function. */ #undef HAVE_UNAME @@ -1704,6 +1713,9 @@ /* Define to force use of thread-safe errno, h_errno, and other functions */ #undef _REENTRANT +/* Define to 1 if you want to emulate signals on WASI */ +#undef _WASI_EMULATED_SIGNAL + /* Define to the level of X/Open that your system supports */ #undef _XOPEN_SOURCE From webhook-mailer at python.org Thu Jan 13 03:46:48 2022 From: webhook-mailer at python.org (tiran) Date: Thu, 13 Jan 2022 08:46:48 -0000 Subject: [Python-checkins] bpo-40479: Fix hashlib's usedforsecurity for OpenSSL 3.0.0 (GH-30455) Message-ID: https://github.com/python/cpython/commit/443b308fee088e21bbf472c376c5c9e3648f916c commit: 443b308fee088e21bbf472c376c5c9e3648f916c branch: main author: Christian Heimes committer: tiran date: 2022-01-13T09:46:38+01:00 summary: bpo-40479: Fix hashlib's usedforsecurity for OpenSSL 3.0.0 (GH-30455) files: A Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst M Doc/library/hashlib.rst M Lib/test/test_hashlib.py M Lib/test/test_imaplib.py M Lib/test/test_poplib.py M Lib/test/test_smtplib.py M Lib/test/test_tools/test_md5sum.py M Lib/test/test_urllib2_localnet.py M Modules/_hashopenssl.c diff --git a/Doc/library/hashlib.rst b/Doc/library/hashlib.rst index 0c3bd7b5ac2c9..53320d9cb0d6c 100644 --- a/Doc/library/hashlib.rst +++ b/Doc/library/hashlib.rst @@ -120,10 +120,10 @@ More condensed: Using :func:`new` with an algorithm provided by OpenSSL: - >>> h = hashlib.new('sha512_256') + >>> h = hashlib.new('sha256') >>> h.update(b"Nobody inspects the spammish repetition") >>> h.hexdigest() - '19197dc4d03829df858011c6c87600f994a858103bbc19005f20987aa19a97e2' + '031edd7d41651593c5fe5c006fa5752b37fddff7bc4e843aa6af0c950f4b9406' Hashlib provides the following constant attributes: diff --git a/Lib/test/test_hashlib.py b/Lib/test/test_hashlib.py index 1623bf350e287..110eb48fd4f8c 100644 --- a/Lib/test/test_hashlib.py +++ b/Lib/test/test_hashlib.py @@ -48,12 +48,15 @@ builtin_hashlib = None try: - from _hashlib import HASH, HASHXOF, openssl_md_meth_names + from _hashlib import HASH, HASHXOF, openssl_md_meth_names, get_fips_mode except ImportError: HASH = None HASHXOF = None openssl_md_meth_names = frozenset() + def get_fips_mode(): + return 0 + try: import _blake2 except ImportError: @@ -192,10 +195,7 @@ def hash_constructors(self): @property def is_fips_mode(self): - if hasattr(self._hashlib, "get_fips_mode"): - return self._hashlib.get_fips_mode() - else: - return None + return get_fips_mode() def test_hash_array(self): a = array.array("b", range(10)) @@ -1017,7 +1017,7 @@ def _test_pbkdf2_hmac(self, pbkdf2, supported): self.assertEqual(out, expected, (digest_name, password, salt, rounds)) - with self.assertRaisesRegex(ValueError, 'unsupported hash type'): + with self.assertRaisesRegex(ValueError, '.*unsupported.*'): pbkdf2('unknown', b'pass', b'salt', 1) if 'sha1' in supported: @@ -1057,6 +1057,7 @@ def test_pbkdf2_hmac_c(self): @unittest.skipUnless(hasattr(hashlib, 'scrypt'), ' test requires OpenSSL > 1.1') + @unittest.skipIf(get_fips_mode(), reason="scrypt is blocked in FIPS mode") def test_scrypt(self): for password, salt, n, r, p, expected in self.scrypt_test_vectors: result = hashlib.scrypt(password, salt=salt, n=n, r=r, p=p) diff --git a/Lib/test/test_imaplib.py b/Lib/test/test_imaplib.py index 
c2b935f58164e..30b553746af11 100644 --- a/Lib/test/test_imaplib.py +++ b/Lib/test/test_imaplib.py @@ -387,7 +387,7 @@ def cmd_AUTHENTICATE(self, tag, args): self.assertEqual(code, 'OK') self.assertEqual(server.response, b'ZmFrZQ==\r\n') # b64 encoded 'fake' - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5_bytes(self): class AuthHandler(SimpleIMAPHandler): capabilities = 'LOGINDISABLED AUTH=CRAM-MD5' @@ -405,7 +405,7 @@ def cmd_AUTHENTICATE(self, tag, args): ret, _ = client.login_cram_md5("tim", b"tanstaaftanstaaf") self.assertEqual(ret, "OK") - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5_plain_text(self): class AuthHandler(SimpleIMAPHandler): capabilities = 'LOGINDISABLED AUTH=CRAM-MD5' @@ -851,7 +851,7 @@ def cmd_AUTHENTICATE(self, tag, args): b'ZmFrZQ==\r\n') # b64 encoded 'fake' @threading_helper.reap_threads - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5(self): class AuthHandler(SimpleIMAPHandler): diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py index 44cf5231f9d23..1220ca32ef82e 100644 --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -318,11 +318,11 @@ def test_noop(self): def test_rpop(self): self.assertOK(self.client.rpop('foo')) - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_apop_normal(self): self.assertOK(self.client.apop('foo', 'dummypassword')) - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_apop_REDOS(self): # Replace welcome with very long evil welcome. # NB The upper bound on welcome length is currently 2048. diff --git a/Lib/test/test_smtplib.py b/Lib/test/test_smtplib.py index 9761a37251731..1a60fef8a428b 100644 --- a/Lib/test/test_smtplib.py +++ b/Lib/test/test_smtplib.py @@ -1171,7 +1171,7 @@ def auth_buggy(challenge=None): finally: smtp.close() - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def testAUTH_CRAM_MD5(self): self.serv.add_feature("AUTH CRAM-MD5") smtp = smtplib.SMTP(HOST, self.port, local_hostname='localhost', @@ -1180,7 +1180,7 @@ def testAUTH_CRAM_MD5(self): self.assertEqual(resp, (235, b'Authentication Succeeded')) smtp.close() - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def testAUTH_multiple(self): # Test that multiple authentication methods are tried. 
self.serv.add_feature("AUTH BOGUS PLAIN LOGIN CRAM-MD5") diff --git a/Lib/test/test_tools/test_md5sum.py b/Lib/test/test_tools/test_md5sum.py index 92315f181c82c..c5a230e95c2b7 100644 --- a/Lib/test/test_tools/test_md5sum.py +++ b/Lib/test/test_tools/test_md5sum.py @@ -11,7 +11,7 @@ skip_if_missing() - at hashlib_helper.requires_hashdigest('md5') + at hashlib_helper.requires_hashdigest('md5', openssl=True) class MD5SumTests(unittest.TestCase): @classmethod def setUpClass(cls): diff --git a/Lib/test/test_urllib2_localnet.py b/Lib/test/test_urllib2_localnet.py index 36fb05d3db0e2..0b2d07ce61d5c 100644 --- a/Lib/test/test_urllib2_localnet.py +++ b/Lib/test/test_urllib2_localnet.py @@ -317,7 +317,7 @@ def test_basic_auth_httperror(self): self.assertRaises(urllib.error.HTTPError, urllib.request.urlopen, self.server_url) - at hashlib_helper.requires_hashdigest("md5") + at hashlib_helper.requires_hashdigest("md5", openssl=True) class ProxyAuthTests(unittest.TestCase): URL = "http://localhost" diff --git a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst new file mode 100644 index 0000000000000..af72923bbd759 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst @@ -0,0 +1,2 @@ +Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL +3.0.0 in FIPS mode. diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c index 12491917832b6..eeea61aeceb54 100644 --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -25,6 +25,7 @@ #define PY_SSIZE_T_CLEAN #include "Python.h" +#include "pycore_hashtable.h" #include "hashlib.h" #include "pycore_strhex.h" // _Py_strhex() @@ -49,6 +50,160 @@ #define PY_OPENSSL_HAS_SHAKE 1 #define PY_OPENSSL_HAS_BLAKE2 1 +#if OPENSSL_VERSION_NUMBER >= 0x30000000L +#define PY_EVP_MD EVP_MD +#define PY_EVP_MD_fetch(algorithm, properties) EVP_MD_fetch(NULL, algorithm, properties) +#define PY_EVP_MD_up_ref(md) EVP_MD_up_ref(md) +#define PY_EVP_MD_free(md) EVP_MD_free(md) +#else +#define PY_EVP_MD const EVP_MD +#define PY_EVP_MD_fetch(algorithm, properties) EVP_get_digestbyname(algorithm) +#define PY_EVP_MD_up_ref(md) do {} while(0) +#define PY_EVP_MD_free(md) do {} while(0) +#endif + +/* hash alias map and fast lookup + * + * Map between Python's preferred names and OpenSSL internal names. Maintain + * cache of fetched EVP MD objects. The EVP_get_digestbyname() and + * EVP_MD_fetch() API calls have a performance impact. + * + * The py_hashentry_t items are stored in a _Py_hashtable_t with py_name and + * py_alias as keys. 
+ */ + +enum Py_hash_type { + Py_ht_evp, // usedforsecurity=True / default + Py_ht_evp_nosecurity, // usedforsecurity=False + Py_ht_mac, // HMAC + Py_ht_pbkdf2, // PKBDF2 +}; + +typedef struct { + const char *py_name; + const char *py_alias; + const char *ossl_name; + int ossl_nid; + int refcnt; + PY_EVP_MD *evp; + PY_EVP_MD *evp_nosecurity; +} py_hashentry_t; + +#define Py_hash_md5 "md5" +#define Py_hash_sha1 "sha1" +#define Py_hash_sha224 "sha224" +#define Py_hash_sha256 "sha256" +#define Py_hash_sha384 "sha384" +#define Py_hash_sha512 "sha512" +#define Py_hash_sha512_224 "sha512_224" +#define Py_hash_sha512_256 "sha512_256" +#define Py_hash_sha3_224 "sha3_224" +#define Py_hash_sha3_256 "sha3_256" +#define Py_hash_sha3_384 "sha3_384" +#define Py_hash_sha3_512 "sha3_512" +#define Py_hash_shake_128 "shake_128" +#define Py_hash_shake_256 "shake_256" +#define Py_hash_blake2s "blake2s" +#define Py_hash_blake2b "blake2b" + +#define PY_HASH_ENTRY(py_name, py_alias, ossl_name, ossl_nid) \ + {py_name, py_alias, ossl_name, ossl_nid, 0, NULL, NULL} + +static const py_hashentry_t py_hashes[] = { + /* md5 */ + PY_HASH_ENTRY(Py_hash_md5, "MD5", SN_md5, NID_md5), + /* sha1 */ + PY_HASH_ENTRY(Py_hash_sha1, "SHA1", SN_sha1, NID_sha1), + /* sha2 family */ + PY_HASH_ENTRY(Py_hash_sha224, "SHA224", SN_sha224, NID_sha224), + PY_HASH_ENTRY(Py_hash_sha256, "SHA256", SN_sha256, NID_sha256), + PY_HASH_ENTRY(Py_hash_sha384, "SHA384", SN_sha384, NID_sha384), + PY_HASH_ENTRY(Py_hash_sha512, "SHA512", SN_sha512, NID_sha512), + /* truncated sha2 */ + PY_HASH_ENTRY(Py_hash_sha512_224, "SHA512_224", SN_sha512_224, NID_sha512_224), + PY_HASH_ENTRY(Py_hash_sha512_256, "SHA512_256", SN_sha512_256, NID_sha512_256), + /* sha3 */ + PY_HASH_ENTRY(Py_hash_sha3_224, NULL, SN_sha3_224, NID_sha3_224), + PY_HASH_ENTRY(Py_hash_sha3_256, NULL, SN_sha3_256, NID_sha3_256), + PY_HASH_ENTRY(Py_hash_sha3_384, NULL, SN_sha3_384, NID_sha3_384), + PY_HASH_ENTRY(Py_hash_sha3_512, NULL, SN_sha3_512, NID_sha3_512), + /* sha3 shake */ + PY_HASH_ENTRY(Py_hash_shake_128, NULL, SN_shake128, NID_shake128), + PY_HASH_ENTRY(Py_hash_shake_256, NULL, SN_shake256, NID_shake256), + /* blake2 digest */ + PY_HASH_ENTRY(Py_hash_blake2s, "blake2s256", SN_blake2s256, NID_blake2s256), + PY_HASH_ENTRY(Py_hash_blake2b, "blake2b512", SN_blake2b512, NID_blake2b512), + PY_HASH_ENTRY(NULL, NULL, NULL, 0), +}; + +static Py_uhash_t +py_hashentry_t_hash_name(const void *key) { + return _Py_HashBytes(key, strlen((const char *)key)); +} + +static int +py_hashentry_t_compare_name(const void *key1, const void *key2) { + return strcmp((const char *)key1, (const char *)key2) == 0; +} + +static void +py_hashentry_t_destroy_value(void *entry) { + py_hashentry_t *h = (py_hashentry_t *)entry; + if (--(h->refcnt) == 0) { + if (h->evp != NULL) { + PY_EVP_MD_free(h->evp); + h->evp = NULL; + } + if (h->evp_nosecurity != NULL) { + PY_EVP_MD_free(h->evp_nosecurity); + h->evp_nosecurity = NULL; + } + PyMem_Free(entry); + } +} + +static _Py_hashtable_t * +py_hashentry_table_new(void) { + _Py_hashtable_t *ht = _Py_hashtable_new_full( + py_hashentry_t_hash_name, + py_hashentry_t_compare_name, + NULL, + py_hashentry_t_destroy_value, + NULL + ); + if (ht == NULL) { + return NULL; + } + + for (const py_hashentry_t *h = py_hashes; h->py_name != NULL; h++) { + py_hashentry_t *entry = (py_hashentry_t *)PyMem_Malloc(sizeof(py_hashentry_t)); + if (entry == NULL) { + goto error; + } + memcpy(entry, h, sizeof(py_hashentry_t)); + + if (_Py_hashtable_set(ht, (const void*)entry->py_name, (void*)entry) 
< 0) { + PyMem_Free(entry); + goto error; + } + entry->refcnt = 1; + + if (h->py_alias != NULL) { + if (_Py_hashtable_set(ht, (const void*)entry->py_alias, (void*)entry) < 0) { + PyMem_Free(entry); + goto error; + } + entry->refcnt++; + } + } + + return ht; + error: + _Py_hashtable_destroy(ht); + return NULL; +} + +/* Module state */ static PyModuleDef _hashlibmodule; typedef struct { @@ -59,6 +214,7 @@ typedef struct { #endif PyObject *constructs; PyObject *unsupported_digestmod_error; + _Py_hashtable_t *hashtable; } _hashlibstate; static inline _hashlibstate* @@ -93,16 +249,26 @@ class _hashlib.HMAC "HMACobject *" "((_hashlibstate *)PyModule_GetState(module)) /* LCOV_EXCL_START */ static PyObject * -_setException(PyObject *exc) +_setException(PyObject *exc, const char* altmsg, ...) { - unsigned long errcode; + unsigned long errcode = ERR_peek_last_error(); const char *lib, *func, *reason; + va_list vargs; - errcode = ERR_peek_last_error(); +#ifdef HAVE_STDARG_PROTOTYPES + va_start(vargs, altmsg); +#else + va_start(vargs); +#endif if (!errcode) { - PyErr_SetString(exc, "unknown reasons"); + if (altmsg == NULL) { + PyErr_SetString(exc, "no reason supplied"); + } else { + PyErr_FormatV(exc, altmsg, vargs); + } return NULL; } + va_end(vargs); ERR_clear_error(); lib = ERR_lib_error_string(errcode); @@ -127,68 +293,15 @@ py_digest_name(const EVP_MD *md) { int nid = EVP_MD_nid(md); const char *name = NULL; + const py_hashentry_t *h; - /* Hard-coded names for well-known hashing algorithms. - * OpenSSL uses slightly different names algorithms like SHA3. - */ - switch (nid) { - case NID_md5: - name = "md5"; - break; - case NID_sha1: - name = "sha1"; - break; - case NID_sha224: - name ="sha224"; - break; - case NID_sha256: - name ="sha256"; - break; - case NID_sha384: - name ="sha384"; - break; - case NID_sha512: - name ="sha512"; - break; -#ifdef NID_sha512_224 - case NID_sha512_224: - name ="sha512_224"; - break; - case NID_sha512_256: - name ="sha512_256"; - break; -#endif -#ifdef PY_OPENSSL_HAS_SHA3 - case NID_sha3_224: - name ="sha3_224"; - break; - case NID_sha3_256: - name ="sha3_256"; - break; - case NID_sha3_384: - name ="sha3_384"; - break; - case NID_sha3_512: - name ="sha3_512"; - break; -#endif -#ifdef PY_OPENSSL_HAS_SHAKE - case NID_shake128: - name ="shake_128"; - break; - case NID_shake256: - name ="shake_256"; - break; -#endif -#ifdef PY_OPENSSL_HAS_BLAKE2 - case NID_blake2s256: - name ="blake2s"; - break; - case NID_blake2b512: - name ="blake2b"; - break; -#endif - default: + for (h = py_hashes; h->py_name != NULL; h++) { + if (h->ossl_nid == nid) { + name = h->py_name; + break; + } + } + if (name == NULL) { /* Ignore aliased names and only use long, lowercase name. 
The aliases * pollute the list and OpenSSL appears to have its own definition of * alias as the resulting list still contains duplicate and alternate @@ -197,67 +310,58 @@ py_digest_name(const EVP_MD *md) name = OBJ_nid2ln(nid); if (name == NULL) name = OBJ_nid2sn(nid); - break; } return PyUnicode_FromString(name); } -static const EVP_MD* -py_digest_by_name(const char *name) +/* Get EVP_MD by HID and purpose */ +static PY_EVP_MD* +py_digest_by_name(PyObject *module, const char *name, enum Py_hash_type py_ht) { - const EVP_MD *digest = EVP_get_digestbyname(name); + PY_EVP_MD *digest = NULL; + _hashlibstate *state = get_hashlib_state(module); + py_hashentry_t *entry = (py_hashentry_t *)_Py_hashtable_get( + state->hashtable, (const void*)name + ); - /* OpenSSL uses dash instead of underscore in names of some algorithms - * like SHA3 and SHAKE. Detect different spellings. */ - if (digest == NULL) { - if (0) {} -#ifdef NID_sha512_224 - else if (!strcmp(name, "sha512_224") || !strcmp(name, "SHA512_224")) { - digest = EVP_sha512_224(); - } - else if (!strcmp(name, "sha512_256") || !strcmp(name, "SHA512_256")) { - digest = EVP_sha512_256(); - } -#endif -#ifdef PY_OPENSSL_HAS_SHA3 - /* could be sha3_ or shake_, Python never defined upper case */ - else if (!strcmp(name, "sha3_224")) { - digest = EVP_sha3_224(); - } - else if (!strcmp(name, "sha3_256")) { - digest = EVP_sha3_256(); - } - else if (!strcmp(name, "sha3_384")) { - digest = EVP_sha3_384(); - } - else if (!strcmp(name, "sha3_512")) { - digest = EVP_sha3_512(); - } -#endif -#ifdef PY_OPENSSL_HAS_SHAKE - else if (!strcmp(name, "shake_128")) { - digest = EVP_shake128(); + if (entry != NULL) { + switch (py_ht) { + case Py_ht_evp: + case Py_ht_mac: + case Py_ht_pbkdf2: + if (entry->evp == NULL) { + entry->evp = PY_EVP_MD_fetch(entry->ossl_name, NULL); + } + digest = entry->evp; + break; + case Py_ht_evp_nosecurity: + if (entry->evp_nosecurity == NULL) { + entry->evp_nosecurity = PY_EVP_MD_fetch(entry->ossl_name, "-fips"); + } + digest = entry->evp_nosecurity; + break; } - else if (!strcmp(name, "shake_256")) { - digest = EVP_shake256(); + if (digest != NULL) { + PY_EVP_MD_up_ref(digest); } -#endif -#ifdef PY_OPENSSL_HAS_BLAKE2 - else if (!strcmp(name, "blake2s256")) { - digest = EVP_blake2s256(); - } - else if (!strcmp(name, "blake2b512")) { - digest = EVP_blake2b512(); + } else { + // Fall back for looking up an unindexed OpenSSL specific name. + switch (py_ht) { + case Py_ht_evp: + case Py_ht_mac: + case Py_ht_pbkdf2: + digest = PY_EVP_MD_fetch(name, NULL); + break; + case Py_ht_evp_nosecurity: + digest = PY_EVP_MD_fetch(name, "-fips"); + break; } -#endif } - if (digest == NULL) { - PyErr_Format(PyExc_ValueError, "unsupported hash type %s", name); + _setException(PyExc_ValueError, "unsupported hash type %s", name); return NULL; } - return digest; } @@ -268,9 +372,9 @@ py_digest_by_name(const char *name) * * on error returns NULL with exception set. 
*/ -static const EVP_MD* -py_digest_by_digestmod(PyObject *module, PyObject *digestmod) { - const EVP_MD* evp; +static PY_EVP_MD* +py_digest_by_digestmod(PyObject *module, PyObject *digestmod, enum Py_hash_type py_ht) { + PY_EVP_MD* evp; PyObject *name_obj = NULL; const char *name; @@ -295,7 +399,7 @@ py_digest_by_digestmod(PyObject *module, PyObject *digestmod) { return NULL; } - evp = py_digest_by_name(name); + evp = py_digest_by_name(module, name, py_ht); if (evp == NULL) { return NULL; } @@ -334,7 +438,7 @@ EVP_hash(EVPobject *self, const void *vp, Py_ssize_t len) else process = Py_SAFE_DOWNCAST(len, Py_ssize_t, unsigned int); if (!EVP_DigestUpdate(self->ctx, (const void*)cp, process)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return -1; } len -= process; @@ -385,7 +489,7 @@ EVP_copy_impl(EVPobject *self) if (!locked_EVP_MD_CTX_copy(newobj->ctx, self)) { Py_DECREF(newobj); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return (PyObject *)newobj; } @@ -412,11 +516,11 @@ EVP_digest_impl(EVPobject *self) } if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } digest_size = EVP_MD_CTX_size(temp_ctx); if (!EVP_DigestFinal(temp_ctx, digest, NULL)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -447,11 +551,11 @@ EVP_hexdigest_impl(EVPobject *self) /* Get the raw (binary) digest value */ if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } digest_size = EVP_MD_CTX_size(temp_ctx); if (!EVP_DigestFinal(temp_ctx, digest, NULL)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -627,14 +731,14 @@ EVPXOF_digest_impl(EVPobject *self, Py_ssize_t length) if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { Py_DECREF(retval); EVP_MD_CTX_free(temp_ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!EVP_DigestFinalXOF(temp_ctx, (unsigned char*)PyBytes_AS_STRING(retval), length)) { Py_DECREF(retval); EVP_MD_CTX_free(temp_ctx); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -675,12 +779,12 @@ EVPXOF_hexdigest_impl(EVPobject *self, Py_ssize_t length) if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { PyMem_Free(digest); EVP_MD_CTX_free(temp_ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!EVP_DigestFinalXOF(temp_ctx, digest, length)) { PyMem_Free(digest); EVP_MD_CTX_free(temp_ctx); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -748,55 +852,74 @@ static PyType_Spec EVPXOFtype_spec = { #endif -static PyObject * -EVPnew(PyObject *module, const EVP_MD *digest, - const unsigned char *cp, Py_ssize_t len, int usedforsecurity) +static PyObject* +py_evp_fromname(PyObject *module, const char *digestname, PyObject *data_obj, + int usedforsecurity) { - int result = 0; - EVPobject *self; - PyTypeObject *type = get_hashlib_state(module)->EVPtype; + Py_buffer view = { 0 }; + PY_EVP_MD *digest = NULL; + PyTypeObject *type; + EVPobject *self = NULL; - if (!digest) { - PyErr_SetString(PyExc_ValueError, "unsupported hash type"); - return NULL; + if (data_obj != NULL) { + GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); + } + + digest = py_digest_by_name( + module, digestname, usedforsecurity ? 
Py_ht_evp : Py_ht_evp_nosecurity + ); + if (digest == NULL) { + goto exit; } -#ifdef PY_OPENSSL_HAS_SHAKE if ((EVP_MD_flags(digest) & EVP_MD_FLAG_XOF) == EVP_MD_FLAG_XOF) { type = get_hashlib_state(module)->EVPXOFtype; + } else { + type = get_hashlib_state(module)->EVPtype; } -#endif - if ((self = newEVPobject(type)) == NULL) - return NULL; + self = newEVPobject(type); + if (self == NULL) { + goto exit; + } +#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER >= 0x30000000L + // In OpenSSL 1.1.1 the non FIPS allowed flag is context specific while + // in 3.0.0 it is a different EVP_MD provider. if (!usedforsecurity) { -#ifdef EVP_MD_CTX_FLAG_NON_FIPS_ALLOW EVP_MD_CTX_set_flags(self->ctx, EVP_MD_CTX_FLAG_NON_FIPS_ALLOW); -#endif } +#endif - - if (!EVP_DigestInit_ex(self->ctx, digest, NULL)) { - _setException(PyExc_ValueError); - Py_DECREF(self); - return NULL; + int result = EVP_DigestInit_ex(self->ctx, digest, NULL); + if (!result) { + _setException(PyExc_ValueError, NULL); + Py_CLEAR(self); + goto exit; } - if (cp && len) { - if (len >= HASHLIB_GIL_MINSIZE) { + if (view.buf && view.len) { + if (view.len >= HASHLIB_GIL_MINSIZE) { Py_BEGIN_ALLOW_THREADS - result = EVP_hash(self, cp, len); + result = EVP_hash(self, view.buf, view.len); Py_END_ALLOW_THREADS } else { - result = EVP_hash(self, cp, len); + result = EVP_hash(self, view.buf, view.len); } if (result == -1) { - Py_DECREF(self); - return NULL; + Py_CLEAR(self); + goto exit; } } + exit: + if (data_obj != NULL) { + PyBuffer_Release(&view); + } + if (digest != NULL) { + PY_EVP_MD_free(digest); + } + return (PyObject *)self; } @@ -824,53 +947,14 @@ EVP_new_impl(PyObject *module, PyObject *name_obj, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=ddd5053f92dffe90 input=c24554d0337be1b0]*/ { - Py_buffer view = { 0 }; - PyObject *ret_obj = NULL; char *name; - const EVP_MD *digest = NULL; - if (!PyArg_Parse(name_obj, "s", &name)) { PyErr_SetString(PyExc_TypeError, "name must be a string"); return NULL; } - - if (data_obj) - GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); - - digest = py_digest_by_name(name); - if (digest == NULL) { - goto exit; - } - - ret_obj = EVPnew(module, digest, - (unsigned char*)view.buf, view.len, - usedforsecurity); - -exit: - if (data_obj) - PyBuffer_Release(&view); - return ret_obj; + return py_evp_fromname(module, name, data_obj, usedforsecurity); } -static PyObject* -EVP_fast_new(PyObject *module, PyObject *data_obj, const EVP_MD *digest, - int usedforsecurity) -{ - Py_buffer view = { 0 }; - PyObject *ret_obj; - - if (data_obj) - GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); - - ret_obj = EVPnew(module, digest, - (unsigned char*)view.buf, view.len, - usedforsecurity); - - if (data_obj) - PyBuffer_Release(&view); - - return ret_obj; -} /*[clinic input] _hashlib.openssl_md5 @@ -888,7 +972,7 @@ _hashlib_openssl_md5_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=87b0186440a44f8c input=990e36d5e689b16e]*/ { - return EVP_fast_new(module, data_obj, EVP_md5(), usedforsecurity); + return py_evp_fromname(module, Py_hash_md5, data_obj, usedforsecurity); } @@ -908,7 +992,7 @@ _hashlib_openssl_sha1_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=6813024cf690670d input=948f2f4b6deabc10]*/ { - return EVP_fast_new(module, data_obj, EVP_sha1(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha1, data_obj, usedforsecurity); } @@ -928,7 +1012,7 @@ 
_hashlib_openssl_sha224_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=a2dfe7cc4eb14ebb input=f9272821fadca505]*/ { - return EVP_fast_new(module, data_obj, EVP_sha224(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha224, data_obj, usedforsecurity); } @@ -948,7 +1032,7 @@ _hashlib_openssl_sha256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=1f874a34870f0a68 input=549fad9d2930d4c5]*/ { - return EVP_fast_new(module, data_obj, EVP_sha256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha256, data_obj, usedforsecurity); } @@ -968,7 +1052,7 @@ _hashlib_openssl_sha384_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=58529eff9ca457b2 input=48601a6e3bf14ad7]*/ { - return EVP_fast_new(module, data_obj, EVP_sha384(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha384, data_obj, usedforsecurity); } @@ -988,7 +1072,7 @@ _hashlib_openssl_sha512_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=2c744c9e4a40d5f6 input=c5c46a2a817aa98f]*/ { - return EVP_fast_new(module, data_obj, EVP_sha512(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha512, data_obj, usedforsecurity); } @@ -1010,7 +1094,7 @@ _hashlib_openssl_sha3_224_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=144641c1d144b974 input=e3a01b2888916157]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_224(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_224, data_obj, usedforsecurity); } /*[clinic input] @@ -1029,7 +1113,7 @@ _hashlib_openssl_sha3_256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=c61f1ab772d06668 input=e2908126c1b6deed]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_256, data_obj , usedforsecurity); } /*[clinic input] @@ -1048,7 +1132,7 @@ _hashlib_openssl_sha3_384_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=f68e4846858cf0ee input=ec0edf5c792f8252]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_384(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_384, data_obj , usedforsecurity); } /*[clinic input] @@ -1067,7 +1151,7 @@ _hashlib_openssl_sha3_512_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=2eede478c159354a input=64e2cc0c094d56f4]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_512(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_512, data_obj , usedforsecurity); } #endif /* PY_OPENSSL_HAS_SHA3 */ @@ -1088,7 +1172,7 @@ _hashlib_openssl_shake_128_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=bc49cdd8ada1fa97 input=6c9d67440eb33ec8]*/ { - return EVP_fast_new(module, data_obj, EVP_shake128(), usedforsecurity); + return py_evp_fromname(module, Py_hash_shake_128, data_obj , usedforsecurity); } /*[clinic input] @@ -1107,7 +1191,7 @@ _hashlib_openssl_shake_256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=358d213be8852df7 input=479cbe9fefd4a9f8]*/ { - return EVP_fast_new(module, data_obj, EVP_shake256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_shake_256, data_obj , usedforsecurity); } 
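
(An illustrative aside, not part of the patch above.) The named constructors in these hunks now resolve their digest through py_digest_by_name(), which on OpenSSL 3.0 boils down to an EVP_MD_fetch()/EVP_MD_free() lifecycle with an optional "-fips" property query for usedforsecurity=False. A minimal standalone sketch of that lifecycle, assuming OpenSSL 3.0 headers, with a hypothetical function name and abbreviated error handling:

    #include <openssl/evp.h>

    /* Fetch a digest implementation, optionally bypassing FIPS restrictions,
     * hash one buffer with it, and release the fetched object again. */
    static int sketch_md5_once(const unsigned char *buf, size_t len,
                               unsigned char out[EVP_MAX_MD_SIZE],
                               unsigned int *outlen, int usedforsecurity)
    {
        /* The "-fips" property query asks the providers for a non-FIPS
         * implementation; this is how usedforsecurity=False is mapped. */
        EVP_MD *md = EVP_MD_fetch(NULL, "MD5", usedforsecurity ? NULL : "-fips");
        if (md == NULL) {
            return -1;
        }
        int ok = EVP_Digest(buf, len, out, outlen, md, NULL);
        EVP_MD_free(md);   /* fetched digests are reference counted */
        return ok ? 0 : -1;
    }

Because EVP_MD_fetch() has a performance cost, the module does not repeat this per call; the py_hashentry_t table introduced at the top of this patch caches the fetched objects instead.
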
#endif /* PY_OPENSSL_HAS_SHAKE */ @@ -1133,9 +1217,8 @@ pbkdf2_hmac_impl(PyObject *module, const char *hash_name, char *key; long dklen; int retval; - const EVP_MD *digest; - digest = py_digest_by_name(hash_name); + PY_EVP_MD *digest = py_digest_by_name(module, hash_name, Py_ht_pbkdf2); if (digest == NULL) { goto end; } @@ -1198,11 +1281,14 @@ pbkdf2_hmac_impl(PyObject *module, const char *hash_name, if (!retval) { Py_CLEAR(key_obj); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto end; } end: + if (digest != NULL) { + PY_EVP_MD_free(digest); + } return key_obj; } @@ -1301,9 +1387,7 @@ _hashlib_scrypt_impl(PyObject *module, Py_buffer *password, Py_buffer *salt, /* let OpenSSL validate the rest */ retval = EVP_PBE_scrypt(NULL, 0, NULL, 0, n, r, p, maxmem, NULL, 0); if (!retval) { - /* sorry, can't do much better */ - PyErr_SetString(PyExc_ValueError, - "Invalid parameter combination for n, r, p, maxmem."); + _setException(PyExc_ValueError, "Invalid parameter combination for n, r, p, maxmem."); return NULL; } @@ -1324,7 +1408,7 @@ _hashlib_scrypt_impl(PyObject *module, Py_buffer *password, Py_buffer *salt, if (!retval) { Py_CLEAR(key_obj); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } return key_obj; @@ -1352,12 +1436,7 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, unsigned char md[EVP_MAX_MD_SIZE] = {0}; unsigned int md_len = 0; unsigned char *result; - const EVP_MD *evp; - - evp = py_digest_by_digestmod(module, digest); - if (evp == NULL) { - return NULL; - } + PY_EVP_MD *evp; if (key->len > INT_MAX) { PyErr_SetString(PyExc_OverflowError, @@ -1370,6 +1449,11 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, return NULL; } + evp = py_digest_by_digestmod(module, digest, Py_ht_mac); + if (evp == NULL) { + return NULL; + } + Py_BEGIN_ALLOW_THREADS result = HMAC( evp, @@ -1378,9 +1462,10 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, md, &md_len ); Py_END_ALLOW_THREADS + PY_EVP_MD_free(evp); if (result == NULL) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } return PyBytes_FromStringAndSize((const char*)md, md_len); @@ -1407,7 +1492,7 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, /*[clinic end generated code: output=c20d9e4d9ed6d219 input=5f4071dcc7f34362]*/ { PyTypeObject *type = get_hashlib_state(module)->HMACtype; - const EVP_MD *digest; + PY_EVP_MD *digest; HMAC_CTX *ctx = NULL; HMACobject *self = NULL; int r; @@ -1424,14 +1509,14 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, return NULL; } - digest = py_digest_by_digestmod(module, digestmod); + digest = py_digest_by_digestmod(module, digestmod, Py_ht_mac); if (digest == NULL) { return NULL; } ctx = HMAC_CTX_new(); if (ctx == NULL) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto error; } @@ -1441,8 +1526,9 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, (int)key->len, digest, NULL /*impl*/); + PY_EVP_MD_free(digest); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto error; } @@ -1512,7 +1598,7 @@ _hmac_update(HMACobject *self, PyObject *obj) PyBuffer_Release(&view); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } return 1; @@ -1532,11 +1618,11 @@ _hashlib_HMAC_copy_impl(HMACobject *self) HMAC_CTX *ctx = HMAC_CTX_new(); if (ctx == NULL) { 
- return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!locked_HMAC_CTX_copy(ctx, self)) { HMAC_CTX_free(ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } retval = (HMACobject *)PyObject_New(HMACobject, Py_TYPE(self)); @@ -1602,13 +1688,13 @@ _hmac_digest(HMACobject *self, unsigned char *buf, unsigned int len) return 0; } if (!locked_HMAC_CTX_copy(temp_ctx, self)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } int r = HMAC_Final(temp_ctx, buf, &len); HMAC_CTX_free(temp_ctx); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } return 1; @@ -1626,7 +1712,7 @@ _hashlib_HMAC_digest_impl(HMACobject *self) unsigned char digest[EVP_MAX_MD_SIZE]; unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } int r = _hmac_digest(self, digest, digest_size); if (r == 0) { @@ -1651,7 +1737,7 @@ _hashlib_HMAC_hexdigest_impl(HMACobject *self) unsigned char digest[EVP_MAX_MD_SIZE]; unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } int r = _hmac_digest(self, digest, digest_size); if (r == 0) { @@ -1665,7 +1751,7 @@ _hashlib_hmac_get_digest_size(HMACobject *self, void *closure) { unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return PyLong_FromLong(digest_size); } @@ -1675,7 +1761,7 @@ _hashlib_hmac_get_block_size(HMACobject *self, void *closure) { const EVP_MD *md = HMAC_CTX_get_md(self->ctx); if (md == NULL) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return PyLong_FromLong(EVP_MD_block_size(md)); } @@ -1828,7 +1914,7 @@ _hashlib_get_fips_mode_impl(PyObject *module) // But 0 is also a valid result value. 
unsigned long errcode = ERR_peek_last_error(); if (errcode) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return -1; } } @@ -2004,6 +2090,12 @@ hashlib_clear(PyObject *m) #endif Py_CLEAR(state->constructs); Py_CLEAR(state->unsupported_digestmod_error); + + if (state->hashtable != NULL) { + _Py_hashtable_destroy(state->hashtable); + state->hashtable = NULL; + } + return 0; } @@ -2014,6 +2106,19 @@ hashlib_free(void *m) } /* Py_mod_exec functions */ +static int +hashlib_init_hashtable(PyObject *module) +{ + _hashlibstate *state = get_hashlib_state(module); + + state->hashtable = py_hashentry_table_new(); + if (state->hashtable == NULL) { + PyErr_NoMemory(); + return -1; + } + return 0; +} + static int hashlib_init_evptype(PyObject *module) { @@ -2141,6 +2246,7 @@ hashlib_exception(PyObject *module) static PyModuleDef_Slot hashlib_slots[] = { + {Py_mod_exec, hashlib_init_hashtable}, {Py_mod_exec, hashlib_init_evptype}, {Py_mod_exec, hashlib_init_evpxoftype}, {Py_mod_exec, hashlib_init_hmactype}, From webhook-mailer at python.org Thu Jan 13 04:42:54 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 13 Jan 2022 09:42:54 -0000 Subject: [Python-checkins] Define Py_BUILD_CORE_MODULE Message-ID: https://github.com/python/cpython/commit/3ce6945f5f434806eea700eb5ff1bed6d39395e1 commit: 3ce6945f5f434806eea700eb5ff1bed6d39395e1 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-13T01:42:47-08:00 summary: Define Py_BUILD_CORE_MODULE files: A Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst M Doc/library/hashlib.rst M Lib/test/test_hashlib.py M Lib/test/test_imaplib.py M Lib/test/test_poplib.py M Lib/test/test_smtplib.py M Lib/test/test_tools/test_md5sum.py M Lib/test/test_urllib2_localnet.py M Modules/_hashopenssl.c diff --git a/Doc/library/hashlib.rst b/Doc/library/hashlib.rst index 0c3bd7b5ac2c9..53320d9cb0d6c 100644 --- a/Doc/library/hashlib.rst +++ b/Doc/library/hashlib.rst @@ -120,10 +120,10 @@ More condensed: Using :func:`new` with an algorithm provided by OpenSSL: - >>> h = hashlib.new('sha512_256') + >>> h = hashlib.new('sha256') >>> h.update(b"Nobody inspects the spammish repetition") >>> h.hexdigest() - '19197dc4d03829df858011c6c87600f994a858103bbc19005f20987aa19a97e2' + '031edd7d41651593c5fe5c006fa5752b37fddff7bc4e843aa6af0c950f4b9406' Hashlib provides the following constant attributes: diff --git a/Lib/test/test_hashlib.py b/Lib/test/test_hashlib.py index 1623bf350e287..110eb48fd4f8c 100644 --- a/Lib/test/test_hashlib.py +++ b/Lib/test/test_hashlib.py @@ -48,12 +48,15 @@ builtin_hashlib = None try: - from _hashlib import HASH, HASHXOF, openssl_md_meth_names + from _hashlib import HASH, HASHXOF, openssl_md_meth_names, get_fips_mode except ImportError: HASH = None HASHXOF = None openssl_md_meth_names = frozenset() + def get_fips_mode(): + return 0 + try: import _blake2 except ImportError: @@ -192,10 +195,7 @@ def hash_constructors(self): @property def is_fips_mode(self): - if hasattr(self._hashlib, "get_fips_mode"): - return self._hashlib.get_fips_mode() - else: - return None + return get_fips_mode() def test_hash_array(self): a = array.array("b", range(10)) @@ -1017,7 +1017,7 @@ def _test_pbkdf2_hmac(self, pbkdf2, supported): self.assertEqual(out, expected, (digest_name, password, salt, rounds)) - with self.assertRaisesRegex(ValueError, 'unsupported hash type'): + with 
self.assertRaisesRegex(ValueError, '.*unsupported.*'): pbkdf2('unknown', b'pass', b'salt', 1) if 'sha1' in supported: @@ -1057,6 +1057,7 @@ def test_pbkdf2_hmac_c(self): @unittest.skipUnless(hasattr(hashlib, 'scrypt'), ' test requires OpenSSL > 1.1') + @unittest.skipIf(get_fips_mode(), reason="scrypt is blocked in FIPS mode") def test_scrypt(self): for password, salt, n, r, p, expected in self.scrypt_test_vectors: result = hashlib.scrypt(password, salt=salt, n=n, r=r, p=p) diff --git a/Lib/test/test_imaplib.py b/Lib/test/test_imaplib.py index c2b935f58164e..30b553746af11 100644 --- a/Lib/test/test_imaplib.py +++ b/Lib/test/test_imaplib.py @@ -387,7 +387,7 @@ def cmd_AUTHENTICATE(self, tag, args): self.assertEqual(code, 'OK') self.assertEqual(server.response, b'ZmFrZQ==\r\n') # b64 encoded 'fake' - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5_bytes(self): class AuthHandler(SimpleIMAPHandler): capabilities = 'LOGINDISABLED AUTH=CRAM-MD5' @@ -405,7 +405,7 @@ def cmd_AUTHENTICATE(self, tag, args): ret, _ = client.login_cram_md5("tim", b"tanstaaftanstaaf") self.assertEqual(ret, "OK") - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5_plain_text(self): class AuthHandler(SimpleIMAPHandler): capabilities = 'LOGINDISABLED AUTH=CRAM-MD5' @@ -851,7 +851,7 @@ def cmd_AUTHENTICATE(self, tag, args): b'ZmFrZQ==\r\n') # b64 encoded 'fake' @threading_helper.reap_threads - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5(self): class AuthHandler(SimpleIMAPHandler): diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py index 44cf5231f9d23..1220ca32ef82e 100644 --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -318,11 +318,11 @@ def test_noop(self): def test_rpop(self): self.assertOK(self.client.rpop('foo')) - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_apop_normal(self): self.assertOK(self.client.apop('foo', 'dummypassword')) - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_apop_REDOS(self): # Replace welcome with very long evil welcome. # NB The upper bound on welcome length is currently 2048. diff --git a/Lib/test/test_smtplib.py b/Lib/test/test_smtplib.py index 9761a37251731..1a60fef8a428b 100644 --- a/Lib/test/test_smtplib.py +++ b/Lib/test/test_smtplib.py @@ -1171,7 +1171,7 @@ def auth_buggy(challenge=None): finally: smtp.close() - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def testAUTH_CRAM_MD5(self): self.serv.add_feature("AUTH CRAM-MD5") smtp = smtplib.SMTP(HOST, self.port, local_hostname='localhost', @@ -1180,7 +1180,7 @@ def testAUTH_CRAM_MD5(self): self.assertEqual(resp, (235, b'Authentication Succeeded')) smtp.close() - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def testAUTH_multiple(self): # Test that multiple authentication methods are tried. 
self.serv.add_feature("AUTH BOGUS PLAIN LOGIN CRAM-MD5") diff --git a/Lib/test/test_tools/test_md5sum.py b/Lib/test/test_tools/test_md5sum.py index 92315f181c82c..c5a230e95c2b7 100644 --- a/Lib/test/test_tools/test_md5sum.py +++ b/Lib/test/test_tools/test_md5sum.py @@ -11,7 +11,7 @@ skip_if_missing() - at hashlib_helper.requires_hashdigest('md5') + at hashlib_helper.requires_hashdigest('md5', openssl=True) class MD5SumTests(unittest.TestCase): @classmethod def setUpClass(cls): diff --git a/Lib/test/test_urllib2_localnet.py b/Lib/test/test_urllib2_localnet.py index 36fb05d3db0e2..0b2d07ce61d5c 100644 --- a/Lib/test/test_urllib2_localnet.py +++ b/Lib/test/test_urllib2_localnet.py @@ -317,7 +317,7 @@ def test_basic_auth_httperror(self): self.assertRaises(urllib.error.HTTPError, urllib.request.urlopen, self.server_url) - at hashlib_helper.requires_hashdigest("md5") + at hashlib_helper.requires_hashdigest("md5", openssl=True) class ProxyAuthTests(unittest.TestCase): URL = "http://localhost" diff --git a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst new file mode 100644 index 0000000000000..af72923bbd759 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst @@ -0,0 +1,2 @@ +Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL +3.0.0 in FIPS mode. diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c index b9e68c05c3edb..cb8460ab2fcf2 100644 --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -18,9 +18,14 @@ #endif #define OPENSSL_NO_DEPRECATED 1 +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif + #define PY_SSIZE_T_CLEAN #include "Python.h" +#include "pycore_hashtable.h" #include "hashlib.h" #include "pystrhex.h" @@ -45,6 +50,160 @@ #define PY_OPENSSL_HAS_SHAKE 1 #define PY_OPENSSL_HAS_BLAKE2 1 +#if OPENSSL_VERSION_NUMBER >= 0x30000000L +#define PY_EVP_MD EVP_MD +#define PY_EVP_MD_fetch(algorithm, properties) EVP_MD_fetch(NULL, algorithm, properties) +#define PY_EVP_MD_up_ref(md) EVP_MD_up_ref(md) +#define PY_EVP_MD_free(md) EVP_MD_free(md) +#else +#define PY_EVP_MD const EVP_MD +#define PY_EVP_MD_fetch(algorithm, properties) EVP_get_digestbyname(algorithm) +#define PY_EVP_MD_up_ref(md) do {} while(0) +#define PY_EVP_MD_free(md) do {} while(0) +#endif + +/* hash alias map and fast lookup + * + * Map between Python's preferred names and OpenSSL internal names. Maintain + * cache of fetched EVP MD objects. The EVP_get_digestbyname() and + * EVP_MD_fetch() API calls have a performance impact. + * + * The py_hashentry_t items are stored in a _Py_hashtable_t with py_name and + * py_alias as keys. 
+ */ + +enum Py_hash_type { + Py_ht_evp, // usedforsecurity=True / default + Py_ht_evp_nosecurity, // usedforsecurity=False + Py_ht_mac, // HMAC + Py_ht_pbkdf2, // PKBDF2 +}; + +typedef struct { + const char *py_name; + const char *py_alias; + const char *ossl_name; + int ossl_nid; + int refcnt; + PY_EVP_MD *evp; + PY_EVP_MD *evp_nosecurity; +} py_hashentry_t; + +#define Py_hash_md5 "md5" +#define Py_hash_sha1 "sha1" +#define Py_hash_sha224 "sha224" +#define Py_hash_sha256 "sha256" +#define Py_hash_sha384 "sha384" +#define Py_hash_sha512 "sha512" +#define Py_hash_sha512_224 "sha512_224" +#define Py_hash_sha512_256 "sha512_256" +#define Py_hash_sha3_224 "sha3_224" +#define Py_hash_sha3_256 "sha3_256" +#define Py_hash_sha3_384 "sha3_384" +#define Py_hash_sha3_512 "sha3_512" +#define Py_hash_shake_128 "shake_128" +#define Py_hash_shake_256 "shake_256" +#define Py_hash_blake2s "blake2s" +#define Py_hash_blake2b "blake2b" + +#define PY_HASH_ENTRY(py_name, py_alias, ossl_name, ossl_nid) \ + {py_name, py_alias, ossl_name, ossl_nid, 0, NULL, NULL} + +static const py_hashentry_t py_hashes[] = { + /* md5 */ + PY_HASH_ENTRY(Py_hash_md5, "MD5", SN_md5, NID_md5), + /* sha1 */ + PY_HASH_ENTRY(Py_hash_sha1, "SHA1", SN_sha1, NID_sha1), + /* sha2 family */ + PY_HASH_ENTRY(Py_hash_sha224, "SHA224", SN_sha224, NID_sha224), + PY_HASH_ENTRY(Py_hash_sha256, "SHA256", SN_sha256, NID_sha256), + PY_HASH_ENTRY(Py_hash_sha384, "SHA384", SN_sha384, NID_sha384), + PY_HASH_ENTRY(Py_hash_sha512, "SHA512", SN_sha512, NID_sha512), + /* truncated sha2 */ + PY_HASH_ENTRY(Py_hash_sha512_224, "SHA512_224", SN_sha512_224, NID_sha512_224), + PY_HASH_ENTRY(Py_hash_sha512_256, "SHA512_256", SN_sha512_256, NID_sha512_256), + /* sha3 */ + PY_HASH_ENTRY(Py_hash_sha3_224, NULL, SN_sha3_224, NID_sha3_224), + PY_HASH_ENTRY(Py_hash_sha3_256, NULL, SN_sha3_256, NID_sha3_256), + PY_HASH_ENTRY(Py_hash_sha3_384, NULL, SN_sha3_384, NID_sha3_384), + PY_HASH_ENTRY(Py_hash_sha3_512, NULL, SN_sha3_512, NID_sha3_512), + /* sha3 shake */ + PY_HASH_ENTRY(Py_hash_shake_128, NULL, SN_shake128, NID_shake128), + PY_HASH_ENTRY(Py_hash_shake_256, NULL, SN_shake256, NID_shake256), + /* blake2 digest */ + PY_HASH_ENTRY(Py_hash_blake2s, "blake2s256", SN_blake2s256, NID_blake2s256), + PY_HASH_ENTRY(Py_hash_blake2b, "blake2b512", SN_blake2b512, NID_blake2b512), + PY_HASH_ENTRY(NULL, NULL, NULL, 0), +}; + +static Py_uhash_t +py_hashentry_t_hash_name(const void *key) { + return _Py_HashBytes(key, strlen((const char *)key)); +} + +static int +py_hashentry_t_compare_name(const void *key1, const void *key2) { + return strcmp((const char *)key1, (const char *)key2) == 0; +} + +static void +py_hashentry_t_destroy_value(void *entry) { + py_hashentry_t *h = (py_hashentry_t *)entry; + if (--(h->refcnt) == 0) { + if (h->evp != NULL) { + PY_EVP_MD_free(h->evp); + h->evp = NULL; + } + if (h->evp_nosecurity != NULL) { + PY_EVP_MD_free(h->evp_nosecurity); + h->evp_nosecurity = NULL; + } + PyMem_Free(entry); + } +} + +static _Py_hashtable_t * +py_hashentry_table_new(void) { + _Py_hashtable_t *ht = _Py_hashtable_new_full( + py_hashentry_t_hash_name, + py_hashentry_t_compare_name, + NULL, + py_hashentry_t_destroy_value, + NULL + ); + if (ht == NULL) { + return NULL; + } + + for (const py_hashentry_t *h = py_hashes; h->py_name != NULL; h++) { + py_hashentry_t *entry = (py_hashentry_t *)PyMem_Malloc(sizeof(py_hashentry_t)); + if (entry == NULL) { + goto error; + } + memcpy(entry, h, sizeof(py_hashentry_t)); + + if (_Py_hashtable_set(ht, (const void*)entry->py_name, (void*)entry) 
< 0) { + PyMem_Free(entry); + goto error; + } + entry->refcnt = 1; + + if (h->py_alias != NULL) { + if (_Py_hashtable_set(ht, (const void*)entry->py_alias, (void*)entry) < 0) { + PyMem_Free(entry); + goto error; + } + entry->refcnt++; + } + } + + return ht; + error: + _Py_hashtable_destroy(ht); + return NULL; +} + +/* Module state */ static PyModuleDef _hashlibmodule; typedef struct { @@ -55,6 +214,7 @@ typedef struct { #endif PyObject *constructs; PyObject *unsupported_digestmod_error; + _Py_hashtable_t *hashtable; } _hashlibstate; static inline _hashlibstate* @@ -89,16 +249,26 @@ class _hashlib.HMAC "HMACobject *" "((_hashlibstate *)PyModule_GetState(module)) /* LCOV_EXCL_START */ static PyObject * -_setException(PyObject *exc) +_setException(PyObject *exc, const char* altmsg, ...) { - unsigned long errcode; + unsigned long errcode = ERR_peek_last_error(); const char *lib, *func, *reason; + va_list vargs; - errcode = ERR_peek_last_error(); +#ifdef HAVE_STDARG_PROTOTYPES + va_start(vargs, altmsg); +#else + va_start(vargs); +#endif if (!errcode) { - PyErr_SetString(exc, "unknown reasons"); + if (altmsg == NULL) { + PyErr_SetString(exc, "no reason supplied"); + } else { + PyErr_FormatV(exc, altmsg, vargs); + } return NULL; } + va_end(vargs); ERR_clear_error(); lib = ERR_lib_error_string(errcode); @@ -123,68 +293,15 @@ py_digest_name(const EVP_MD *md) { int nid = EVP_MD_nid(md); const char *name = NULL; + const py_hashentry_t *h; - /* Hard-coded names for well-known hashing algorithms. - * OpenSSL uses slightly different names algorithms like SHA3. - */ - switch (nid) { - case NID_md5: - name = "md5"; - break; - case NID_sha1: - name = "sha1"; - break; - case NID_sha224: - name ="sha224"; - break; - case NID_sha256: - name ="sha256"; - break; - case NID_sha384: - name ="sha384"; - break; - case NID_sha512: - name ="sha512"; - break; -#ifdef NID_sha512_224 - case NID_sha512_224: - name ="sha512_224"; - break; - case NID_sha512_256: - name ="sha512_256"; - break; -#endif -#ifdef PY_OPENSSL_HAS_SHA3 - case NID_sha3_224: - name ="sha3_224"; - break; - case NID_sha3_256: - name ="sha3_256"; - break; - case NID_sha3_384: - name ="sha3_384"; - break; - case NID_sha3_512: - name ="sha3_512"; - break; -#endif -#ifdef PY_OPENSSL_HAS_SHAKE - case NID_shake128: - name ="shake_128"; - break; - case NID_shake256: - name ="shake_256"; - break; -#endif -#ifdef PY_OPENSSL_HAS_BLAKE2 - case NID_blake2s256: - name ="blake2s"; - break; - case NID_blake2b512: - name ="blake2b"; - break; -#endif - default: + for (h = py_hashes; h->py_name != NULL; h++) { + if (h->ossl_nid == nid) { + name = h->py_name; + break; + } + } + if (name == NULL) { /* Ignore aliased names and only use long, lowercase name. 
The aliases * pollute the list and OpenSSL appears to have its own definition of * alias as the resulting list still contains duplicate and alternate @@ -193,67 +310,58 @@ py_digest_name(const EVP_MD *md) name = OBJ_nid2ln(nid); if (name == NULL) name = OBJ_nid2sn(nid); - break; } return PyUnicode_FromString(name); } -static const EVP_MD* -py_digest_by_name(const char *name) +/* Get EVP_MD by HID and purpose */ +static PY_EVP_MD* +py_digest_by_name(PyObject *module, const char *name, enum Py_hash_type py_ht) { - const EVP_MD *digest = EVP_get_digestbyname(name); + PY_EVP_MD *digest = NULL; + _hashlibstate *state = get_hashlib_state(module); + py_hashentry_t *entry = (py_hashentry_t *)_Py_hashtable_get( + state->hashtable, (const void*)name + ); - /* OpenSSL uses dash instead of underscore in names of some algorithms - * like SHA3 and SHAKE. Detect different spellings. */ - if (digest == NULL) { - if (0) {} -#ifdef NID_sha512_224 - else if (!strcmp(name, "sha512_224") || !strcmp(name, "SHA512_224")) { - digest = EVP_sha512_224(); - } - else if (!strcmp(name, "sha512_256") || !strcmp(name, "SHA512_256")) { - digest = EVP_sha512_256(); - } -#endif -#ifdef PY_OPENSSL_HAS_SHA3 - /* could be sha3_ or shake_, Python never defined upper case */ - else if (!strcmp(name, "sha3_224")) { - digest = EVP_sha3_224(); - } - else if (!strcmp(name, "sha3_256")) { - digest = EVP_sha3_256(); - } - else if (!strcmp(name, "sha3_384")) { - digest = EVP_sha3_384(); - } - else if (!strcmp(name, "sha3_512")) { - digest = EVP_sha3_512(); + if (entry != NULL) { + switch (py_ht) { + case Py_ht_evp: + case Py_ht_mac: + case Py_ht_pbkdf2: + if (entry->evp == NULL) { + entry->evp = PY_EVP_MD_fetch(entry->ossl_name, NULL); + } + digest = entry->evp; + break; + case Py_ht_evp_nosecurity: + if (entry->evp_nosecurity == NULL) { + entry->evp_nosecurity = PY_EVP_MD_fetch(entry->ossl_name, "-fips"); + } + digest = entry->evp_nosecurity; + break; } -#endif -#ifdef PY_OPENSSL_HAS_SHAKE - else if (!strcmp(name, "shake_128")) { - digest = EVP_shake128(); - } - else if (!strcmp(name, "shake_256")) { - digest = EVP_shake256(); - } -#endif -#ifdef PY_OPENSSL_HAS_BLAKE2 - else if (!strcmp(name, "blake2s256")) { - digest = EVP_blake2s256(); + if (digest != NULL) { + PY_EVP_MD_up_ref(digest); } - else if (!strcmp(name, "blake2b512")) { - digest = EVP_blake2b512(); + } else { + // Fall back for looking up an unindexed OpenSSL specific name. + switch (py_ht) { + case Py_ht_evp: + case Py_ht_mac: + case Py_ht_pbkdf2: + digest = PY_EVP_MD_fetch(name, NULL); + break; + case Py_ht_evp_nosecurity: + digest = PY_EVP_MD_fetch(name, "-fips"); + break; } -#endif } - if (digest == NULL) { - PyErr_Format(PyExc_ValueError, "unsupported hash type %s", name); + _setException(PyExc_ValueError, "unsupported hash type %s", name); return NULL; } - return digest; } @@ -264,9 +372,9 @@ py_digest_by_name(const char *name) * * on error returns NULL with exception set. 
*/ -static const EVP_MD* -py_digest_by_digestmod(PyObject *module, PyObject *digestmod) { - const EVP_MD* evp; +static PY_EVP_MD* +py_digest_by_digestmod(PyObject *module, PyObject *digestmod, enum Py_hash_type py_ht) { + PY_EVP_MD* evp; PyObject *name_obj = NULL; const char *name; @@ -291,7 +399,7 @@ py_digest_by_digestmod(PyObject *module, PyObject *digestmod) { return NULL; } - evp = py_digest_by_name(name); + evp = py_digest_by_name(module, name, py_ht); if (evp == NULL) { return NULL; } @@ -330,7 +438,7 @@ EVP_hash(EVPobject *self, const void *vp, Py_ssize_t len) else process = Py_SAFE_DOWNCAST(len, Py_ssize_t, unsigned int); if (!EVP_DigestUpdate(self->ctx, (const void*)cp, process)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return -1; } len -= process; @@ -381,7 +489,7 @@ EVP_copy_impl(EVPobject *self) if (!locked_EVP_MD_CTX_copy(newobj->ctx, self)) { Py_DECREF(newobj); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return (PyObject *)newobj; } @@ -408,11 +516,11 @@ EVP_digest_impl(EVPobject *self) } if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } digest_size = EVP_MD_CTX_size(temp_ctx); if (!EVP_DigestFinal(temp_ctx, digest, NULL)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -443,11 +551,11 @@ EVP_hexdigest_impl(EVPobject *self) /* Get the raw (binary) digest value */ if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } digest_size = EVP_MD_CTX_size(temp_ctx); if (!EVP_DigestFinal(temp_ctx, digest, NULL)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -623,14 +731,14 @@ EVPXOF_digest_impl(EVPobject *self, Py_ssize_t length) if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { Py_DECREF(retval); EVP_MD_CTX_free(temp_ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!EVP_DigestFinalXOF(temp_ctx, (unsigned char*)PyBytes_AS_STRING(retval), length)) { Py_DECREF(retval); EVP_MD_CTX_free(temp_ctx); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -671,12 +779,12 @@ EVPXOF_hexdigest_impl(EVPobject *self, Py_ssize_t length) if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { PyMem_Free(digest); EVP_MD_CTX_free(temp_ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!EVP_DigestFinalXOF(temp_ctx, digest, length)) { PyMem_Free(digest); EVP_MD_CTX_free(temp_ctx); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -744,55 +852,74 @@ static PyType_Spec EVPXOFtype_spec = { #endif -static PyObject * -EVPnew(PyObject *module, const EVP_MD *digest, - const unsigned char *cp, Py_ssize_t len, int usedforsecurity) +static PyObject* +py_evp_fromname(PyObject *module, const char *digestname, PyObject *data_obj, + int usedforsecurity) { - int result = 0; - EVPobject *self; - PyTypeObject *type = get_hashlib_state(module)->EVPtype; + Py_buffer view = { 0 }; + PY_EVP_MD *digest = NULL; + PyTypeObject *type; + EVPobject *self = NULL; - if (!digest) { - PyErr_SetString(PyExc_ValueError, "unsupported hash type"); - return NULL; + if (data_obj != NULL) { + GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); + } + + digest = py_digest_by_name( + module, digestname, usedforsecurity ? 
Py_ht_evp : Py_ht_evp_nosecurity + ); + if (digest == NULL) { + goto exit; } -#ifdef PY_OPENSSL_HAS_SHAKE if ((EVP_MD_flags(digest) & EVP_MD_FLAG_XOF) == EVP_MD_FLAG_XOF) { type = get_hashlib_state(module)->EVPXOFtype; + } else { + type = get_hashlib_state(module)->EVPtype; } -#endif - if ((self = newEVPobject(type)) == NULL) - return NULL; + self = newEVPobject(type); + if (self == NULL) { + goto exit; + } +#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER >= 0x30000000L + // In OpenSSL 1.1.1 the non FIPS allowed flag is context specific while + // in 3.0.0 it is a different EVP_MD provider. if (!usedforsecurity) { -#ifdef EVP_MD_CTX_FLAG_NON_FIPS_ALLOW EVP_MD_CTX_set_flags(self->ctx, EVP_MD_CTX_FLAG_NON_FIPS_ALLOW); -#endif } +#endif - - if (!EVP_DigestInit_ex(self->ctx, digest, NULL)) { - _setException(PyExc_ValueError); - Py_DECREF(self); - return NULL; + int result = EVP_DigestInit_ex(self->ctx, digest, NULL); + if (!result) { + _setException(PyExc_ValueError, NULL); + Py_CLEAR(self); + goto exit; } - if (cp && len) { - if (len >= HASHLIB_GIL_MINSIZE) { + if (view.buf && view.len) { + if (view.len >= HASHLIB_GIL_MINSIZE) { Py_BEGIN_ALLOW_THREADS - result = EVP_hash(self, cp, len); + result = EVP_hash(self, view.buf, view.len); Py_END_ALLOW_THREADS } else { - result = EVP_hash(self, cp, len); + result = EVP_hash(self, view.buf, view.len); } if (result == -1) { - Py_DECREF(self); - return NULL; + Py_CLEAR(self); + goto exit; } } + exit: + if (data_obj != NULL) { + PyBuffer_Release(&view); + } + if (digest != NULL) { + PY_EVP_MD_free(digest); + } + return (PyObject *)self; } @@ -820,53 +947,14 @@ EVP_new_impl(PyObject *module, PyObject *name_obj, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=ddd5053f92dffe90 input=c24554d0337be1b0]*/ { - Py_buffer view = { 0 }; - PyObject *ret_obj = NULL; char *name; - const EVP_MD *digest = NULL; - if (!PyArg_Parse(name_obj, "s", &name)) { PyErr_SetString(PyExc_TypeError, "name must be a string"); return NULL; } - - if (data_obj) - GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); - - digest = py_digest_by_name(name); - if (digest == NULL) { - goto exit; - } - - ret_obj = EVPnew(module, digest, - (unsigned char*)view.buf, view.len, - usedforsecurity); - -exit: - if (data_obj) - PyBuffer_Release(&view); - return ret_obj; + return py_evp_fromname(module, name, data_obj, usedforsecurity); } -static PyObject* -EVP_fast_new(PyObject *module, PyObject *data_obj, const EVP_MD *digest, - int usedforsecurity) -{ - Py_buffer view = { 0 }; - PyObject *ret_obj; - - if (data_obj) - GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); - - ret_obj = EVPnew(module, digest, - (unsigned char*)view.buf, view.len, - usedforsecurity); - - if (data_obj) - PyBuffer_Release(&view); - - return ret_obj; -} /*[clinic input] _hashlib.openssl_md5 @@ -884,7 +972,7 @@ _hashlib_openssl_md5_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=87b0186440a44f8c input=990e36d5e689b16e]*/ { - return EVP_fast_new(module, data_obj, EVP_md5(), usedforsecurity); + return py_evp_fromname(module, Py_hash_md5, data_obj, usedforsecurity); } @@ -904,7 +992,7 @@ _hashlib_openssl_sha1_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=6813024cf690670d input=948f2f4b6deabc10]*/ { - return EVP_fast_new(module, data_obj, EVP_sha1(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha1, data_obj, usedforsecurity); } @@ -924,7 +1012,7 @@ 
_hashlib_openssl_sha224_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=a2dfe7cc4eb14ebb input=f9272821fadca505]*/ { - return EVP_fast_new(module, data_obj, EVP_sha224(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha224, data_obj, usedforsecurity); } @@ -944,7 +1032,7 @@ _hashlib_openssl_sha256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=1f874a34870f0a68 input=549fad9d2930d4c5]*/ { - return EVP_fast_new(module, data_obj, EVP_sha256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha256, data_obj, usedforsecurity); } @@ -964,7 +1052,7 @@ _hashlib_openssl_sha384_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=58529eff9ca457b2 input=48601a6e3bf14ad7]*/ { - return EVP_fast_new(module, data_obj, EVP_sha384(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha384, data_obj, usedforsecurity); } @@ -984,7 +1072,7 @@ _hashlib_openssl_sha512_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=2c744c9e4a40d5f6 input=c5c46a2a817aa98f]*/ { - return EVP_fast_new(module, data_obj, EVP_sha512(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha512, data_obj, usedforsecurity); } @@ -1006,7 +1094,7 @@ _hashlib_openssl_sha3_224_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=144641c1d144b974 input=e3a01b2888916157]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_224(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_224, data_obj, usedforsecurity); } /*[clinic input] @@ -1025,7 +1113,7 @@ _hashlib_openssl_sha3_256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=c61f1ab772d06668 input=e2908126c1b6deed]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_256, data_obj , usedforsecurity); } /*[clinic input] @@ -1044,7 +1132,7 @@ _hashlib_openssl_sha3_384_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=f68e4846858cf0ee input=ec0edf5c792f8252]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_384(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_384, data_obj , usedforsecurity); } /*[clinic input] @@ -1063,7 +1151,7 @@ _hashlib_openssl_sha3_512_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=2eede478c159354a input=64e2cc0c094d56f4]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_512(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_512, data_obj , usedforsecurity); } #endif /* PY_OPENSSL_HAS_SHA3 */ @@ -1084,7 +1172,7 @@ _hashlib_openssl_shake_128_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=bc49cdd8ada1fa97 input=6c9d67440eb33ec8]*/ { - return EVP_fast_new(module, data_obj, EVP_shake128(), usedforsecurity); + return py_evp_fromname(module, Py_hash_shake_128, data_obj , usedforsecurity); } /*[clinic input] @@ -1103,7 +1191,7 @@ _hashlib_openssl_shake_256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=358d213be8852df7 input=479cbe9fefd4a9f8]*/ { - return EVP_fast_new(module, data_obj, EVP_shake256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_shake_256, data_obj , usedforsecurity); } 
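
(An illustrative aside, not part of the patch above.) For contrast with the provider-based fetch on OpenSSL 3.0, the py_evp_fromname() hunk above also references the older per-context flag EVP_MD_CTX_FLAG_NON_FIPS_ALLOW. A minimal standalone sketch of that flag API, with a hypothetical function name and simplified error handling; the guard under which CPython actually sets the flag is the preprocessor condition shown in the diff, not this sketch:

    #include <openssl/evp.h>

    static EVP_MD_CTX *sketch_md5_ctx(int usedforsecurity)
    {
        EVP_MD_CTX *ctx = EVP_MD_CTX_new();
        if (ctx == NULL) {
            return NULL;
        }
    #ifdef EVP_MD_CTX_FLAG_NON_FIPS_ALLOW
        /* Pre-provider OpenSSL: usedforsecurity=False is expressed as a
         * per-context flag rather than as a separately fetched digest. */
        if (!usedforsecurity) {
            EVP_MD_CTX_set_flags(ctx, EVP_MD_CTX_FLAG_NON_FIPS_ALLOW);
        }
    #endif
        if (!EVP_DigestInit_ex(ctx, EVP_md5(), NULL)) {
            EVP_MD_CTX_free(ctx);
            return NULL;
        }
        return ctx;
    }
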
#endif /* PY_OPENSSL_HAS_SHAKE */ @@ -1129,9 +1217,8 @@ pbkdf2_hmac_impl(PyObject *module, const char *hash_name, char *key; long dklen; int retval; - const EVP_MD *digest; - digest = py_digest_by_name(hash_name); + PY_EVP_MD *digest = py_digest_by_name(module, hash_name, Py_ht_pbkdf2); if (digest == NULL) { goto end; } @@ -1194,11 +1281,14 @@ pbkdf2_hmac_impl(PyObject *module, const char *hash_name, if (!retval) { Py_CLEAR(key_obj); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto end; } end: + if (digest != NULL) { + PY_EVP_MD_free(digest); + } return key_obj; } @@ -1297,9 +1387,7 @@ _hashlib_scrypt_impl(PyObject *module, Py_buffer *password, Py_buffer *salt, /* let OpenSSL validate the rest */ retval = EVP_PBE_scrypt(NULL, 0, NULL, 0, n, r, p, maxmem, NULL, 0); if (!retval) { - /* sorry, can't do much better */ - PyErr_SetString(PyExc_ValueError, - "Invalid parameter combination for n, r, p, maxmem."); + _setException(PyExc_ValueError, "Invalid parameter combination for n, r, p, maxmem."); return NULL; } @@ -1320,7 +1408,7 @@ _hashlib_scrypt_impl(PyObject *module, Py_buffer *password, Py_buffer *salt, if (!retval) { Py_CLEAR(key_obj); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } return key_obj; @@ -1348,12 +1436,7 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, unsigned char md[EVP_MAX_MD_SIZE] = {0}; unsigned int md_len = 0; unsigned char *result; - const EVP_MD *evp; - - evp = py_digest_by_digestmod(module, digest); - if (evp == NULL) { - return NULL; - } + PY_EVP_MD *evp; if (key->len > INT_MAX) { PyErr_SetString(PyExc_OverflowError, @@ -1366,6 +1449,11 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, return NULL; } + evp = py_digest_by_digestmod(module, digest, Py_ht_mac); + if (evp == NULL) { + return NULL; + } + Py_BEGIN_ALLOW_THREADS result = HMAC( evp, @@ -1374,9 +1462,10 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, md, &md_len ); Py_END_ALLOW_THREADS + PY_EVP_MD_free(evp); if (result == NULL) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } return PyBytes_FromStringAndSize((const char*)md, md_len); @@ -1403,7 +1492,7 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, /*[clinic end generated code: output=c20d9e4d9ed6d219 input=5f4071dcc7f34362]*/ { PyTypeObject *type = get_hashlib_state(module)->HMACtype; - const EVP_MD *digest; + PY_EVP_MD *digest; HMAC_CTX *ctx = NULL; HMACobject *self = NULL; int r; @@ -1420,14 +1509,14 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, return NULL; } - digest = py_digest_by_digestmod(module, digestmod); + digest = py_digest_by_digestmod(module, digestmod, Py_ht_mac); if (digest == NULL) { return NULL; } ctx = HMAC_CTX_new(); if (ctx == NULL) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto error; } @@ -1437,8 +1526,9 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, (int)key->len, digest, NULL /*impl*/); + PY_EVP_MD_free(digest); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto error; } @@ -1508,7 +1598,7 @@ _hmac_update(HMACobject *self, PyObject *obj) PyBuffer_Release(&view); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } return 1; @@ -1528,11 +1618,11 @@ _hashlib_HMAC_copy_impl(HMACobject *self) HMAC_CTX *ctx = HMAC_CTX_new(); if (ctx == NULL) { 
- return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!locked_HMAC_CTX_copy(ctx, self)) { HMAC_CTX_free(ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } retval = (HMACobject *)PyObject_New(HMACobject, Py_TYPE(self)); @@ -1598,13 +1688,13 @@ _hmac_digest(HMACobject *self, unsigned char *buf, unsigned int len) return 0; } if (!locked_HMAC_CTX_copy(temp_ctx, self)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } int r = HMAC_Final(temp_ctx, buf, &len); HMAC_CTX_free(temp_ctx); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } return 1; @@ -1622,7 +1712,7 @@ _hashlib_HMAC_digest_impl(HMACobject *self) unsigned char digest[EVP_MAX_MD_SIZE]; unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } int r = _hmac_digest(self, digest, digest_size); if (r == 0) { @@ -1647,7 +1737,7 @@ _hashlib_HMAC_hexdigest_impl(HMACobject *self) unsigned char digest[EVP_MAX_MD_SIZE]; unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } int r = _hmac_digest(self, digest, digest_size); if (r == 0) { @@ -1661,7 +1751,7 @@ _hashlib_hmac_get_digest_size(HMACobject *self, void *closure) { unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return PyLong_FromLong(digest_size); } @@ -1671,7 +1761,7 @@ _hashlib_hmac_get_block_size(HMACobject *self, void *closure) { const EVP_MD *md = HMAC_CTX_get_md(self->ctx); if (md == NULL) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return PyLong_FromLong(EVP_MD_block_size(md)); } @@ -1824,7 +1914,7 @@ _hashlib_get_fips_mode_impl(PyObject *module) // But 0 is also a valid result value. 
unsigned long errcode = ERR_peek_last_error(); if (errcode) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return -1; } } @@ -2000,6 +2090,12 @@ hashlib_clear(PyObject *m) #endif Py_CLEAR(state->constructs); Py_CLEAR(state->unsupported_digestmod_error); + + if (state->hashtable != NULL) { + _Py_hashtable_destroy(state->hashtable); + state->hashtable = NULL; + } + return 0; } @@ -2010,6 +2106,19 @@ hashlib_free(void *m) } /* Py_mod_exec functions */ +static int +hashlib_init_hashtable(PyObject *module) +{ + _hashlibstate *state = get_hashlib_state(module); + + state->hashtable = py_hashentry_table_new(); + if (state->hashtable == NULL) { + PyErr_NoMemory(); + return -1; + } + return 0; +} + static int hashlib_init_evptype(PyObject *module) { @@ -2137,6 +2246,7 @@ hashlib_exception(PyObject *module) static PyModuleDef_Slot hashlib_slots[] = { + {Py_mod_exec, hashlib_init_hashtable}, {Py_mod_exec, hashlib_init_evptype}, {Py_mod_exec, hashlib_init_evpxoftype}, {Py_mod_exec, hashlib_init_hmactype}, From webhook-mailer at python.org Thu Jan 13 05:21:15 2022 From: webhook-mailer at python.org (tiran) Date: Thu, 13 Jan 2022 10:21:15 -0000 Subject: [Python-checkins] [3.9] bpo-40479: Fix hashlib's usedforsecurity for OpenSSL 3.0.0 (GH-30455) (GH-30574) Message-ID: https://github.com/python/cpython/commit/4ddd5da2691bea39e36debbc7f53c7cc4f13904e commit: 4ddd5da2691bea39e36debbc7f53c7cc4f13904e branch: 3.9 author: Christian Heimes committer: tiran date: 2022-01-13T11:20:45+01:00 summary: [3.9] bpo-40479: Fix hashlib's usedforsecurity for OpenSSL 3.0.0 (GH-30455) (GH-30574) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst M Doc/library/hashlib.rst M Lib/test/test_hashlib.py M Lib/test/test_imaplib.py M Lib/test/test_poplib.py M Lib/test/test_smtplib.py M Lib/test/test_tools/test_md5sum.py M Lib/test/test_urllib2_localnet.py M Modules/_hashopenssl.c diff --git a/Doc/library/hashlib.rst b/Doc/library/hashlib.rst index 58ccbbedb2c63..bd25f008c64a8 100644 --- a/Doc/library/hashlib.rst +++ b/Doc/library/hashlib.rst @@ -120,10 +120,10 @@ More condensed: Using :func:`new` with an algorithm provided by OpenSSL: - >>> h = hashlib.new('sha512_256') + >>> h = hashlib.new('sha256') >>> h.update(b"Nobody inspects the spammish repetition") >>> h.hexdigest() - '19197dc4d03829df858011c6c87600f994a858103bbc19005f20987aa19a97e2' + '031edd7d41651593c5fe5c006fa5752b37fddff7bc4e843aa6af0c950f4b9406' Hashlib provides the following constant attributes: diff --git a/Lib/test/test_hashlib.py b/Lib/test/test_hashlib.py index 86f31a5587823..969e5e42e44c7 100644 --- a/Lib/test/test_hashlib.py +++ b/Lib/test/test_hashlib.py @@ -45,12 +45,15 @@ builtin_hashlib = None try: - from _hashlib import HASH, HASHXOF, openssl_md_meth_names + from _hashlib import HASH, HASHXOF, openssl_md_meth_names, get_fips_mode except ImportError: HASH = None HASHXOF = None openssl_md_meth_names = frozenset() + def get_fips_mode(): + return 0 + try: import _blake2 except ImportError: @@ -196,10 +199,7 @@ def hash_constructors(self): @property def is_fips_mode(self): - if hasattr(self._hashlib, "get_fips_mode"): - return self._hashlib.get_fips_mode() - else: - return None + return get_fips_mode() def test_hash_array(self): a = array.array("b", range(10)) @@ -1013,7 +1013,7 @@ def _test_pbkdf2_hmac(self, pbkdf2, supported): self.assertEqual(out, expected, (digest_name, password, salt, rounds)) - with self.assertRaisesRegex(ValueError, 'unsupported hash type'): + with 
self.assertRaisesRegex(ValueError, '.*unsupported.*'): pbkdf2('unknown', b'pass', b'salt', 1) if 'sha1' in supported: @@ -1050,6 +1050,7 @@ def test_pbkdf2_hmac_c(self): @unittest.skipUnless(hasattr(hashlib, 'scrypt'), ' test requires OpenSSL > 1.1') + @unittest.skipIf(get_fips_mode(), reason="scrypt is blocked in FIPS mode") def test_scrypt(self): for password, salt, n, r, p, expected in self.scrypt_test_vectors: result = hashlib.scrypt(password, salt=salt, n=n, r=r, p=p) diff --git a/Lib/test/test_imaplib.py b/Lib/test/test_imaplib.py index 914a75a6adabf..70559383e2699 100644 --- a/Lib/test/test_imaplib.py +++ b/Lib/test/test_imaplib.py @@ -385,7 +385,7 @@ def cmd_AUTHENTICATE(self, tag, args): self.assertEqual(code, 'OK') self.assertEqual(server.response, b'ZmFrZQ==\r\n') # b64 encoded 'fake' - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5_bytes(self): class AuthHandler(SimpleIMAPHandler): capabilities = 'LOGINDISABLED AUTH=CRAM-MD5' @@ -403,7 +403,7 @@ def cmd_AUTHENTICATE(self, tag, args): ret, _ = client.login_cram_md5("tim", b"tanstaaftanstaaf") self.assertEqual(ret, "OK") - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5_plain_text(self): class AuthHandler(SimpleIMAPHandler): capabilities = 'LOGINDISABLED AUTH=CRAM-MD5' @@ -849,7 +849,7 @@ def cmd_AUTHENTICATE(self, tag, args): b'ZmFrZQ==\r\n') # b64 encoded 'fake' @reap_threads - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_login_cram_md5(self): class AuthHandler(SimpleIMAPHandler): diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py index 5228fc46f4b6e..5d9f557eefb95 100644 --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -313,11 +313,11 @@ def test_noop(self): def test_rpop(self): self.assertOK(self.client.rpop('foo')) - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_apop_normal(self): self.assertOK(self.client.apop('foo', 'dummypassword')) - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def test_apop_REDOS(self): # Replace welcome with very long evil welcome. # NB The upper bound on welcome length is currently 2048. diff --git a/Lib/test/test_smtplib.py b/Lib/test/test_smtplib.py index 52e9d6beecfa0..2d78019adbe9b 100644 --- a/Lib/test/test_smtplib.py +++ b/Lib/test/test_smtplib.py @@ -1167,7 +1167,7 @@ def auth_buggy(challenge=None): finally: smtp.close() - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def testAUTH_CRAM_MD5(self): self.serv.add_feature("AUTH CRAM-MD5") smtp = smtplib.SMTP(HOST, self.port, local_hostname='localhost', @@ -1176,7 +1176,7 @@ def testAUTH_CRAM_MD5(self): self.assertEqual(resp, (235, b'Authentication Succeeded')) smtp.close() - @hashlib_helper.requires_hashdigest('md5') + @hashlib_helper.requires_hashdigest('md5', openssl=True) def testAUTH_multiple(self): # Test that multiple authentication methods are tried. 
self.serv.add_feature("AUTH BOGUS PLAIN LOGIN CRAM-MD5") diff --git a/Lib/test/test_tools/test_md5sum.py b/Lib/test/test_tools/test_md5sum.py index 7321b488be5a5..f515378ac161d 100644 --- a/Lib/test/test_tools/test_md5sum.py +++ b/Lib/test/test_tools/test_md5sum.py @@ -11,7 +11,7 @@ skip_if_missing() - at hashlib_helper.requires_hashdigest('md5') + at hashlib_helper.requires_hashdigest('md5', openssl=True) class MD5SumTests(unittest.TestCase): @classmethod def setUpClass(cls): diff --git a/Lib/test/test_urllib2_localnet.py b/Lib/test/test_urllib2_localnet.py index e73132ab69974..7c716341c7023 100644 --- a/Lib/test/test_urllib2_localnet.py +++ b/Lib/test/test_urllib2_localnet.py @@ -316,7 +316,7 @@ def test_basic_auth_httperror(self): self.assertRaises(urllib.error.HTTPError, urllib.request.urlopen, self.server_url) - at hashlib_helper.requires_hashdigest("md5") + at hashlib_helper.requires_hashdigest("md5", openssl=True) class ProxyAuthTests(unittest.TestCase): URL = "http://localhost" diff --git a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst new file mode 100644 index 0000000000000..af72923bbd759 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst @@ -0,0 +1,2 @@ +Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL +3.0.0 in FIPS mode. diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c index ff3a1aef5ee13..4f117b3afae7c 100644 --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -18,9 +18,14 @@ #endif #define OPENSSL_NO_DEPRECATED 1 +#ifndef Py_BUILD_CORE_BUILTIN +# define Py_BUILD_CORE_MODULE 1 +#endif + #define PY_SSIZE_T_CLEAN #include "Python.h" +#include "pycore_hashtable.h" #include "hashlib.h" #include "pystrhex.h" @@ -84,6 +89,168 @@ HMAC_CTX_get_md(const HMAC_CTX *ctx) #define PY_OPENSSL_HAS_BLAKE2 1 #endif +#if OPENSSL_VERSION_NUMBER >= 0x30000000L +#define PY_EVP_MD EVP_MD +#define PY_EVP_MD_fetch(algorithm, properties) EVP_MD_fetch(NULL, algorithm, properties) +#define PY_EVP_MD_up_ref(md) EVP_MD_up_ref(md) +#define PY_EVP_MD_free(md) EVP_MD_free(md) +#else +#define PY_EVP_MD const EVP_MD +#define PY_EVP_MD_fetch(algorithm, properties) EVP_get_digestbyname(algorithm) +#define PY_EVP_MD_up_ref(md) do {} while(0) +#define PY_EVP_MD_free(md) do {} while(0) +#endif + +/* hash alias map and fast lookup + * + * Map between Python's preferred names and OpenSSL internal names. Maintain + * cache of fetched EVP MD objects. The EVP_get_digestbyname() and + * EVP_MD_fetch() API calls have a performance impact. + * + * The py_hashentry_t items are stored in a _Py_hashtable_t with py_name and + * py_alias as keys. 
+ */ + +enum Py_hash_type { + Py_ht_evp, // usedforsecurity=True / default + Py_ht_evp_nosecurity, // usedforsecurity=False + Py_ht_mac, // HMAC + Py_ht_pbkdf2, // PKBDF2 +}; + +typedef struct { + const char *py_name; + const char *py_alias; + const char *ossl_name; + int ossl_nid; + int refcnt; + PY_EVP_MD *evp; + PY_EVP_MD *evp_nosecurity; +} py_hashentry_t; + +#define Py_hash_md5 "md5" +#define Py_hash_sha1 "sha1" +#define Py_hash_sha224 "sha224" +#define Py_hash_sha256 "sha256" +#define Py_hash_sha384 "sha384" +#define Py_hash_sha512 "sha512" +#define Py_hash_sha512_224 "sha512_224" +#define Py_hash_sha512_256 "sha512_256" +#define Py_hash_sha3_224 "sha3_224" +#define Py_hash_sha3_256 "sha3_256" +#define Py_hash_sha3_384 "sha3_384" +#define Py_hash_sha3_512 "sha3_512" +#define Py_hash_shake_128 "shake_128" +#define Py_hash_shake_256 "shake_256" +#define Py_hash_blake2s "blake2s" +#define Py_hash_blake2b "blake2b" + +#define PY_HASH_ENTRY(py_name, py_alias, ossl_name, ossl_nid) \ + {py_name, py_alias, ossl_name, ossl_nid, 0, NULL, NULL} + +static const py_hashentry_t py_hashes[] = { + /* md5 */ + PY_HASH_ENTRY(Py_hash_md5, "MD5", SN_md5, NID_md5), + /* sha1 */ + PY_HASH_ENTRY(Py_hash_sha1, "SHA1", SN_sha1, NID_sha1), + /* sha2 family */ + PY_HASH_ENTRY(Py_hash_sha224, "SHA224", SN_sha224, NID_sha224), + PY_HASH_ENTRY(Py_hash_sha256, "SHA256", SN_sha256, NID_sha256), + PY_HASH_ENTRY(Py_hash_sha384, "SHA384", SN_sha384, NID_sha384), + PY_HASH_ENTRY(Py_hash_sha512, "SHA512", SN_sha512, NID_sha512), + /* truncated sha2 */ +#ifdef NID_sha512_224 + PY_HASH_ENTRY(Py_hash_sha512_224, "SHA512_224", SN_sha512_224, NID_sha512_224), + PY_HASH_ENTRY(Py_hash_sha512_256, "SHA512_256", SN_sha512_256, NID_sha512_256), +#endif + /* sha3 */ +#ifdef PY_OPENSSL_HAS_SHA3 + PY_HASH_ENTRY(Py_hash_sha3_224, NULL, SN_sha3_224, NID_sha3_224), + PY_HASH_ENTRY(Py_hash_sha3_256, NULL, SN_sha3_256, NID_sha3_256), + PY_HASH_ENTRY(Py_hash_sha3_384, NULL, SN_sha3_384, NID_sha3_384), + PY_HASH_ENTRY(Py_hash_sha3_512, NULL, SN_sha3_512, NID_sha3_512), +#endif + /* sha3 shake */ +#ifdef PY_OPENSSL_HAS_SHAKE + PY_HASH_ENTRY(Py_hash_shake_128, NULL, SN_shake128, NID_shake128), + PY_HASH_ENTRY(Py_hash_shake_256, NULL, SN_shake256, NID_shake256), +#endif + /* blake2 digest */ +#ifdef PY_OPENSSL_HAS_BLAKE2 + PY_HASH_ENTRY(Py_hash_blake2s, "blake2s256", SN_blake2s256, NID_blake2s256), + PY_HASH_ENTRY(Py_hash_blake2b, "blake2b512", SN_blake2b512, NID_blake2b512), +#endif + PY_HASH_ENTRY(NULL, NULL, NULL, 0), +}; + +static Py_uhash_t +py_hashentry_t_hash_name(const void *key) { + return _Py_HashBytes(key, strlen((const char *)key)); +} + +static int +py_hashentry_t_compare_name(const void *key1, const void *key2) { + return strcmp((const char *)key1, (const char *)key2) == 0; +} + +static void +py_hashentry_t_destroy_value(void *entry) { + py_hashentry_t *h = (py_hashentry_t *)entry; + if (--(h->refcnt) == 0) { + if (h->evp != NULL) { + PY_EVP_MD_free(h->evp); + h->evp = NULL; + } + if (h->evp_nosecurity != NULL) { + PY_EVP_MD_free(h->evp_nosecurity); + h->evp_nosecurity = NULL; + } + PyMem_Free(entry); + } +} + +static _Py_hashtable_t * +py_hashentry_table_new(void) { + _Py_hashtable_t *ht = _Py_hashtable_new_full( + py_hashentry_t_hash_name, + py_hashentry_t_compare_name, + NULL, + py_hashentry_t_destroy_value, + NULL + ); + if (ht == NULL) { + return NULL; + } + + for (const py_hashentry_t *h = py_hashes; h->py_name != NULL; h++) { + py_hashentry_t *entry = (py_hashentry_t *)PyMem_Malloc(sizeof(py_hashentry_t)); + if (entry == 
NULL) { + goto error; + } + memcpy(entry, h, sizeof(py_hashentry_t)); + + if (_Py_hashtable_set(ht, (const void*)entry->py_name, (void*)entry) < 0) { + PyMem_Free(entry); + goto error; + } + entry->refcnt = 1; + + if (h->py_alias != NULL) { + if (_Py_hashtable_set(ht, (const void*)entry->py_alias, (void*)entry) < 0) { + PyMem_Free(entry); + goto error; + } + entry->refcnt++; + } + } + + return ht; + error: + _Py_hashtable_destroy(ht); + return NULL; +} + +/* Module state */ static PyModuleDef _hashlibmodule; typedef struct { @@ -92,6 +259,7 @@ typedef struct { #ifdef PY_OPENSSL_HAS_SHAKE PyTypeObject *EVPXOFtype; #endif + _Py_hashtable_t *hashtable; } _hashlibstate; static inline _hashlibstate* @@ -126,16 +294,26 @@ class _hashlib.HMAC "HMACobject *" "((_hashlibstate *)PyModule_GetState(module)) /* LCOV_EXCL_START */ static PyObject * -_setException(PyObject *exc) +_setException(PyObject *exc, const char* altmsg, ...) { - unsigned long errcode; + unsigned long errcode = ERR_peek_last_error(); const char *lib, *func, *reason; + va_list vargs; - errcode = ERR_peek_last_error(); +#ifdef HAVE_STDARG_PROTOTYPES + va_start(vargs, altmsg); +#else + va_start(vargs); +#endif if (!errcode) { - PyErr_SetString(exc, "unknown reasons"); + if (altmsg == NULL) { + PyErr_SetString(exc, "no reason supplied"); + } else { + PyErr_FormatV(exc, altmsg, vargs); + } return NULL; } + va_end(vargs); ERR_clear_error(); lib = ERR_lib_error_string(errcode); @@ -169,68 +347,15 @@ py_digest_name(const EVP_MD *md) { int nid = EVP_MD_nid(md); const char *name = NULL; + const py_hashentry_t *h; - /* Hard-coded names for well-known hashing algorithms. - * OpenSSL uses slightly different names algorithms like SHA3. - */ - switch (nid) { - case NID_md5: - name = "md5"; - break; - case NID_sha1: - name = "sha1"; - break; - case NID_sha224: - name ="sha224"; - break; - case NID_sha256: - name ="sha256"; - break; - case NID_sha384: - name ="sha384"; - break; - case NID_sha512: - name ="sha512"; - break; -#ifdef NID_sha512_224 - case NID_sha512_224: - name ="sha512_224"; - break; - case NID_sha512_256: - name ="sha512_256"; - break; -#endif -#ifdef PY_OPENSSL_HAS_SHA3 - case NID_sha3_224: - name ="sha3_224"; - break; - case NID_sha3_256: - name ="sha3_256"; - break; - case NID_sha3_384: - name ="sha3_384"; - break; - case NID_sha3_512: - name ="sha3_512"; - break; -#endif -#ifdef PY_OPENSSL_HAS_SHAKE - case NID_shake128: - name ="shake_128"; - break; - case NID_shake256: - name ="shake_256"; - break; -#endif -#ifdef PY_OPENSSL_HAS_BLAKE2 - case NID_blake2s256: - name ="blake2s"; - break; - case NID_blake2b512: - name ="blake2b"; - break; -#endif - default: + for (h = py_hashes; h->py_name != NULL; h++) { + if (h->ossl_nid == nid) { + name = h->py_name; + break; + } + } + if (name == NULL) { /* Ignore aliased names and only use long, lowercase name. 
The aliases * pollute the list and OpenSSL appears to have its own definition of * alias as the resulting list still contains duplicate and alternate @@ -239,62 +364,58 @@ py_digest_name(const EVP_MD *md) name = OBJ_nid2ln(nid); if (name == NULL) name = OBJ_nid2sn(nid); - break; } return PyUnicode_FromString(name); } -static const EVP_MD* -py_digest_by_name(const char *name) +/* Get EVP_MD by HID and purpose */ +static PY_EVP_MD* +py_digest_by_name(PyObject *module, const char *name, enum Py_hash_type py_ht) { - const EVP_MD *digest = EVP_get_digestbyname(name); + PY_EVP_MD *digest = NULL; + _hashlibstate *state = get_hashlib_state(module); + py_hashentry_t *entry = (py_hashentry_t *)_Py_hashtable_get( + state->hashtable, (const void*)name + ); - /* OpenSSL uses dash instead of underscore in names of some algorithms - * like SHA3 and SHAKE. Detect different spellings. */ - if (digest == NULL) { - if (0) {} -#ifdef NID_sha512_224 - else if (!strcmp(name, "sha512_224") || !strcmp(name, "SHA512_224")) { - digest = EVP_sha512_224(); + if (entry != NULL) { + switch (py_ht) { + case Py_ht_evp: + case Py_ht_mac: + case Py_ht_pbkdf2: + if (entry->evp == NULL) { + entry->evp = PY_EVP_MD_fetch(entry->ossl_name, NULL); + } + digest = entry->evp; + break; + case Py_ht_evp_nosecurity: + if (entry->evp_nosecurity == NULL) { + entry->evp_nosecurity = PY_EVP_MD_fetch(entry->ossl_name, "-fips"); + } + digest = entry->evp_nosecurity; + break; } - else if (!strcmp(name, "sha512_256") || !strcmp(name, "SHA512_256")) { - digest = EVP_sha512_256(); - } -#endif -#ifdef PY_OPENSSL_HAS_SHA3 - /* could be sha3_ or shake_, Python never defined upper case */ - else if (!strcmp(name, "sha3_224")) { - digest = EVP_sha3_224(); - } - else if (!strcmp(name, "sha3_256")) { - digest = EVP_sha3_256(); - } - else if (!strcmp(name, "sha3_384")) { - digest = EVP_sha3_384(); - } - else if (!strcmp(name, "sha3_512")) { - digest = EVP_sha3_512(); - } -#endif -#ifdef PY_OPENSSL_HAS_SHAKE - else if (!strcmp(name, "shake_128")) { - digest = EVP_shake128(); + if (digest != NULL) { + PY_EVP_MD_up_ref(digest); } - else if (!strcmp(name, "shake_256")) { - digest = EVP_shake256(); - } -#endif -#ifdef PY_OPENSSL_HAS_BLAKE2 - else if (!strcmp(name, "blake2s256")) { - digest = EVP_blake2s256(); - } - else if (!strcmp(name, "blake2b512")) { - digest = EVP_blake2b512(); + } else { + // Fall back for looking up an unindexed OpenSSL specific name. 
+ switch (py_ht) { + case Py_ht_evp: + case Py_ht_mac: + case Py_ht_pbkdf2: + digest = PY_EVP_MD_fetch(name, NULL); + break; + case Py_ht_evp_nosecurity: + digest = PY_EVP_MD_fetch(name, "-fips"); + break; } -#endif } - + if (digest == NULL) { + _setException(PyExc_ValueError, "unsupported hash type %s", name); + return NULL; + } return digest; } @@ -329,7 +450,7 @@ EVP_hash(EVPobject *self, const void *vp, Py_ssize_t len) else process = Py_SAFE_DOWNCAST(len, Py_ssize_t, unsigned int); if (!EVP_DigestUpdate(self->ctx, (const void*)cp, process)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return -1; } len -= process; @@ -380,7 +501,7 @@ EVP_copy_impl(EVPobject *self) if (!locked_EVP_MD_CTX_copy(newobj->ctx, self)) { Py_DECREF(newobj); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return (PyObject *)newobj; } @@ -407,11 +528,11 @@ EVP_digest_impl(EVPobject *self) } if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } digest_size = EVP_MD_CTX_size(temp_ctx); if (!EVP_DigestFinal(temp_ctx, digest, NULL)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -442,11 +563,11 @@ EVP_hexdigest_impl(EVPobject *self) /* Get the raw (binary) digest value */ if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } digest_size = EVP_MD_CTX_size(temp_ctx); if (!EVP_DigestFinal(temp_ctx, digest, NULL)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -623,14 +744,14 @@ EVPXOF_digest_impl(EVPobject *self, Py_ssize_t length) if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { Py_DECREF(retval); EVP_MD_CTX_free(temp_ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!EVP_DigestFinalXOF(temp_ctx, (unsigned char*)PyBytes_AS_STRING(retval), length)) { Py_DECREF(retval); EVP_MD_CTX_free(temp_ctx); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -671,12 +792,12 @@ EVPXOF_hexdigest_impl(EVPobject *self, Py_ssize_t length) if (!locked_EVP_MD_CTX_copy(temp_ctx, self)) { PyMem_Free(digest); EVP_MD_CTX_free(temp_ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!EVP_DigestFinalXOF(temp_ctx, digest, length)) { PyMem_Free(digest); EVP_MD_CTX_free(temp_ctx); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } @@ -745,55 +866,77 @@ static PyType_Spec EVPXOFtype_spec = { #endif -static PyObject * -EVPnew(PyObject *module, const EVP_MD *digest, - const unsigned char *cp, Py_ssize_t len, int usedforsecurity) +static PyObject* +py_evp_fromname(PyObject *module, const char *digestname, PyObject *data_obj, + int usedforsecurity) { - int result = 0; - EVPobject *self; - PyTypeObject *type = get_hashlib_state(module)->EVPtype; + Py_buffer view = { 0 }; + PY_EVP_MD *digest = NULL; + PyTypeObject *type; + EVPobject *self = NULL; - if (!digest) { - PyErr_SetString(PyExc_ValueError, "unsupported hash type"); - return NULL; + if (data_obj != NULL) { + GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); + } + + digest = py_digest_by_name( + module, digestname, usedforsecurity ? 
Py_ht_evp : Py_ht_evp_nosecurity + ); + if (digest == NULL) { + goto exit; } #ifdef PY_OPENSSL_HAS_SHAKE if ((EVP_MD_flags(digest) & EVP_MD_FLAG_XOF) == EVP_MD_FLAG_XOF) { type = get_hashlib_state(module)->EVPXOFtype; - } + } else #endif + { + type = get_hashlib_state(module)->EVPtype; + } - if ((self = newEVPobject(type)) == NULL) - return NULL; + self = newEVPobject(type); + if (self == NULL) { + goto exit; + } +#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER >= 0x30000000L + // In OpenSSL 1.1.1 the non FIPS allowed flag is context specific while + // in 3.0.0 it is a different EVP_MD provider. if (!usedforsecurity) { -#ifdef EVP_MD_CTX_FLAG_NON_FIPS_ALLOW EVP_MD_CTX_set_flags(self->ctx, EVP_MD_CTX_FLAG_NON_FIPS_ALLOW); -#endif } +#endif - - if (!EVP_DigestInit_ex(self->ctx, digest, NULL)) { - _setException(PyExc_ValueError); - Py_DECREF(self); - return NULL; + int result = EVP_DigestInit_ex(self->ctx, digest, NULL); + if (!result) { + _setException(PyExc_ValueError, NULL); + Py_CLEAR(self); + goto exit; } - if (cp && len) { - if (len >= HASHLIB_GIL_MINSIZE) { + if (view.buf && view.len) { + if (view.len >= HASHLIB_GIL_MINSIZE) { Py_BEGIN_ALLOW_THREADS - result = EVP_hash(self, cp, len); + result = EVP_hash(self, view.buf, view.len); Py_END_ALLOW_THREADS } else { - result = EVP_hash(self, cp, len); + result = EVP_hash(self, view.buf, view.len); } if (result == -1) { - Py_DECREF(self); - return NULL; + Py_CLEAR(self); + goto exit; } } + exit: + if (data_obj != NULL) { + PyBuffer_Release(&view); + } + if (digest != NULL) { + PY_EVP_MD_free(digest); + } + return (PyObject *)self; } @@ -821,49 +964,14 @@ EVP_new_impl(PyObject *module, PyObject *name_obj, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=ddd5053f92dffe90 input=c24554d0337be1b0]*/ { - Py_buffer view = { 0 }; - PyObject *ret_obj; char *name; - const EVP_MD *digest = NULL; - if (!PyArg_Parse(name_obj, "s", &name)) { PyErr_SetString(PyExc_TypeError, "name must be a string"); return NULL; } - - if (data_obj) - GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); - - digest = py_digest_by_name(name); - - ret_obj = EVPnew(module, digest, - (unsigned char*)view.buf, view.len, - usedforsecurity); - - if (data_obj) - PyBuffer_Release(&view); - return ret_obj; + return py_evp_fromname(module, name, data_obj, usedforsecurity); } -static PyObject* -EVP_fast_new(PyObject *module, PyObject *data_obj, const EVP_MD *digest, - int usedforsecurity) -{ - Py_buffer view = { 0 }; - PyObject *ret_obj; - - if (data_obj) - GET_BUFFER_VIEW_OR_ERROUT(data_obj, &view); - - ret_obj = EVPnew(module, digest, - (unsigned char*)view.buf, view.len, - usedforsecurity); - - if (data_obj) - PyBuffer_Release(&view); - - return ret_obj; -} /*[clinic input] _hashlib.openssl_md5 @@ -881,7 +989,7 @@ _hashlib_openssl_md5_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=87b0186440a44f8c input=990e36d5e689b16e]*/ { - return EVP_fast_new(module, data_obj, EVP_md5(), usedforsecurity); + return py_evp_fromname(module, Py_hash_md5, data_obj, usedforsecurity); } @@ -901,7 +1009,7 @@ _hashlib_openssl_sha1_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=6813024cf690670d input=948f2f4b6deabc10]*/ { - return EVP_fast_new(module, data_obj, EVP_sha1(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha1, data_obj, usedforsecurity); } @@ -921,7 +1029,7 @@ _hashlib_openssl_sha224_impl(PyObject *module, PyObject *data_obj, int 
usedforsecurity) /*[clinic end generated code: output=a2dfe7cc4eb14ebb input=f9272821fadca505]*/ { - return EVP_fast_new(module, data_obj, EVP_sha224(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha224, data_obj, usedforsecurity); } @@ -941,7 +1049,7 @@ _hashlib_openssl_sha256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=1f874a34870f0a68 input=549fad9d2930d4c5]*/ { - return EVP_fast_new(module, data_obj, EVP_sha256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha256, data_obj, usedforsecurity); } @@ -961,7 +1069,7 @@ _hashlib_openssl_sha384_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=58529eff9ca457b2 input=48601a6e3bf14ad7]*/ { - return EVP_fast_new(module, data_obj, EVP_sha384(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha384, data_obj, usedforsecurity); } @@ -981,7 +1089,7 @@ _hashlib_openssl_sha512_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=2c744c9e4a40d5f6 input=c5c46a2a817aa98f]*/ { - return EVP_fast_new(module, data_obj, EVP_sha512(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha512, data_obj, usedforsecurity); } @@ -1003,7 +1111,7 @@ _hashlib_openssl_sha3_224_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=144641c1d144b974 input=e3a01b2888916157]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_224(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_224, data_obj, usedforsecurity); } /*[clinic input] @@ -1022,7 +1130,7 @@ _hashlib_openssl_sha3_256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=c61f1ab772d06668 input=e2908126c1b6deed]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_256, data_obj , usedforsecurity); } /*[clinic input] @@ -1041,7 +1149,7 @@ _hashlib_openssl_sha3_384_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=f68e4846858cf0ee input=ec0edf5c792f8252]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_384(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_384, data_obj , usedforsecurity); } /*[clinic input] @@ -1060,7 +1168,7 @@ _hashlib_openssl_sha3_512_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=2eede478c159354a input=64e2cc0c094d56f4]*/ { - return EVP_fast_new(module, data_obj, EVP_sha3_512(), usedforsecurity); + return py_evp_fromname(module, Py_hash_sha3_512, data_obj , usedforsecurity); } #endif /* PY_OPENSSL_HAS_SHA3 */ @@ -1081,7 +1189,7 @@ _hashlib_openssl_shake_128_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=bc49cdd8ada1fa97 input=6c9d67440eb33ec8]*/ { - return EVP_fast_new(module, data_obj, EVP_shake128(), usedforsecurity); + return py_evp_fromname(module, Py_hash_shake_128, data_obj , usedforsecurity); } /*[clinic input] @@ -1100,7 +1208,7 @@ _hashlib_openssl_shake_256_impl(PyObject *module, PyObject *data_obj, int usedforsecurity) /*[clinic end generated code: output=358d213be8852df7 input=479cbe9fefd4a9f8]*/ { - return EVP_fast_new(module, data_obj, EVP_shake256(), usedforsecurity); + return py_evp_fromname(module, Py_hash_shake_256, data_obj , usedforsecurity); } #endif /* PY_OPENSSL_HAS_SHAKE */ @@ -1126,9 +1234,8 @@ 
pbkdf2_hmac_impl(PyObject *module, const char *hash_name, char *key; long dklen; int retval; - const EVP_MD *digest; - digest = py_digest_by_name(hash_name); + PY_EVP_MD *digest = py_digest_by_name(module, hash_name, Py_ht_pbkdf2); if (digest == NULL) { PyErr_SetString(PyExc_ValueError, "unsupported hash type"); goto end; @@ -1192,11 +1299,14 @@ pbkdf2_hmac_impl(PyObject *module, const char *hash_name, if (!retval) { Py_CLEAR(key_obj); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto end; } end: + if (digest != NULL) { + PY_EVP_MD_free(digest); + } return key_obj; } @@ -1296,9 +1406,7 @@ _hashlib_scrypt_impl(PyObject *module, Py_buffer *password, Py_buffer *salt, /* let OpenSSL validate the rest */ retval = EVP_PBE_scrypt(NULL, 0, NULL, 0, n, r, p, maxmem, NULL, 0); if (!retval) { - /* sorry, can't do much better */ - PyErr_SetString(PyExc_ValueError, - "Invalid parameter combination for n, r, p, maxmem."); + _setException(PyExc_ValueError, "Invalid parameter combination for n, r, p, maxmem."); return NULL; } @@ -1319,7 +1427,7 @@ _hashlib_scrypt_impl(PyObject *module, Py_buffer *password, Py_buffer *salt, if (!retval) { Py_CLEAR(key_obj); - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } return key_obj; @@ -1347,11 +1455,10 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, unsigned char md[EVP_MAX_MD_SIZE] = {0}; unsigned int md_len = 0; unsigned char *result; - const EVP_MD *evp; + PY_EVP_MD *evp; - evp = py_digest_by_name(digest); + evp = py_digest_by_name(module, digest, Py_ht_mac); if (evp == NULL) { - PyErr_SetString(PyExc_ValueError, "unsupported hash type"); return NULL; } if (key->len > INT_MAX) { @@ -1365,6 +1472,11 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, return NULL; } + evp = py_digest_by_name(module, digest, Py_ht_mac); + if (evp == NULL) { + return NULL; + } + Py_BEGIN_ALLOW_THREADS result = HMAC( evp, @@ -1373,9 +1485,10 @@ _hashlib_hmac_singleshot_impl(PyObject *module, Py_buffer *key, md, &md_len ); Py_END_ALLOW_THREADS + PY_EVP_MD_free(evp); if (result == NULL) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return NULL; } return PyBytes_FromStringAndSize((const char*)md, md_len); @@ -1402,7 +1515,7 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, /*[clinic end generated code: output=9a35673be0cbea1b input=a0878868eb190134]*/ { PyTypeObject *type = get_hashlib_state(module)->HMACtype; - const EVP_MD *digest; + PY_EVP_MD *digest; HMAC_CTX *ctx = NULL; HMACobject *self = NULL; int r; @@ -1419,15 +1532,14 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, return NULL; } - digest = py_digest_by_name(digestmod); + digest = py_digest_by_name(module, digestmod, Py_ht_mac); if (!digest) { - PyErr_SetString(PyExc_ValueError, "unknown hash function"); return NULL; } ctx = HMAC_CTX_new(); if (ctx == NULL) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto error; } @@ -1437,8 +1549,9 @@ _hashlib_hmac_new_impl(PyObject *module, Py_buffer *key, PyObject *msg_obj, (int)key->len, digest, NULL /*impl*/); + PY_EVP_MD_free(digest); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); goto error; } @@ -1508,7 +1621,7 @@ _hmac_update(HMACobject *self, PyObject *obj) PyBuffer_Release(&view); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } return 1; @@ -1528,11 +1641,11 @@ 
_hashlib_HMAC_copy_impl(HMACobject *self) HMAC_CTX *ctx = HMAC_CTX_new(); if (ctx == NULL) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } if (!locked_HMAC_CTX_copy(ctx, self)) { HMAC_CTX_free(ctx); - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } retval = (HMACobject *)PyObject_New(HMACobject, Py_TYPE(self)); @@ -1598,13 +1711,13 @@ _hmac_digest(HMACobject *self, unsigned char *buf, unsigned int len) return 0; } if (!locked_HMAC_CTX_copy(temp_ctx, self)) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } int r = HMAC_Final(temp_ctx, buf, &len); HMAC_CTX_free(temp_ctx); if (r == 0) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return 0; } return 1; @@ -1622,7 +1735,7 @@ _hashlib_HMAC_digest_impl(HMACobject *self) unsigned char digest[EVP_MAX_MD_SIZE]; unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } int r = _hmac_digest(self, digest, digest_size); if (r == 0) { @@ -1647,7 +1760,7 @@ _hashlib_HMAC_hexdigest_impl(HMACobject *self) unsigned char digest[EVP_MAX_MD_SIZE]; unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } int r = _hmac_digest(self, digest, digest_size); if (r == 0) { @@ -1661,7 +1774,7 @@ _hashlib_hmac_get_digest_size(HMACobject *self, void *closure) { unsigned int digest_size = _hmac_digest_size(self); if (digest_size == 0) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return PyLong_FromLong(digest_size); } @@ -1671,7 +1784,7 @@ _hashlib_hmac_get_block_size(HMACobject *self, void *closure) { const EVP_MD *md = HMAC_CTX_get_md(self->ctx); if (md == NULL) { - return _setException(PyExc_ValueError); + return _setException(PyExc_ValueError, NULL); } return PyLong_FromLong(EVP_MD_block_size(md)); } @@ -1831,7 +1944,7 @@ _hashlib_get_fips_mode_impl(PyObject *module) // But 0 is also a valid result value. 
unsigned long errcode = ERR_peek_last_error(); if (errcode) { - _setException(PyExc_ValueError); + _setException(PyExc_ValueError, NULL); return -1; } } @@ -2004,6 +2117,10 @@ hashlib_clear(PyObject *m) #ifdef PY_OPENSSL_HAS_SHAKE Py_CLEAR(state->EVPXOFtype); #endif + if (state->hashtable != NULL) { + _Py_hashtable_destroy(state->hashtable); + state->hashtable = NULL; + } return 0; } @@ -2025,6 +2142,19 @@ hashlib_openssl_legacy_init(PyObject *module) return 0; } +static int +hashlib_init_hashtable(PyObject *module) +{ + _hashlibstate *state = get_hashlib_state(module); + + state->hashtable = py_hashentry_table_new(); + if (state->hashtable == NULL) { + PyErr_NoMemory(); + return -1; + } + return 0; +} + static int hashlib_init_evptype(PyObject *module) { @@ -2089,6 +2219,7 @@ hashlib_init_hmactype(PyObject *module) static PyModuleDef_Slot hashlib_slots[] = { /* OpenSSL 1.0.2 and LibreSSL */ {Py_mod_exec, hashlib_openssl_legacy_init}, + {Py_mod_exec, hashlib_init_hashtable}, {Py_mod_exec, hashlib_init_evptype}, {Py_mod_exec, hashlib_init_evpxoftype}, {Py_mod_exec, hashlib_init_hmactype}, @@ -2127,6 +2258,10 @@ PyInit__hashlib(void) Py_DECREF(m); return NULL; } + if (hashlib_init_hashtable(m) < 0) { + Py_DECREF(m); + return NULL; + } if (hashlib_init_evptype(m) < 0) { Py_DECREF(m); return NULL; From webhook-mailer at python.org Thu Jan 13 07:34:42 2022 From: webhook-mailer at python.org (iritkatriel) Date: Thu, 13 Jan 2022 12:34:42 -0000 Subject: [Python-checkins] bpo-46344: Fix trace bug in else of try and try-star blocks (GH-30544) Message-ID: https://github.com/python/cpython/commit/9c2ebb906d1c68c3d571b100c92ceb08805b94cd commit: 9c2ebb906d1c68c3d571b100c92ceb08805b94cd branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-13T12:34:38Z summary: bpo-46344: Fix trace bug in else of try and try-star blocks (GH-30544) files: M Lib/test/test_dis.py M Lib/test/test_sys_settrace.py M Python/compile.c diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py index c4473a4c261ad..19a4be2c4132b 100644 --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -1182,7 +1182,7 @@ def _prepare_test_cases(): Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=130, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=132, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=134, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=34, argval=206, argrepr='to 206', offset=136, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=35, argval=208, argrepr='to 208', offset=136, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=138, starts_line=22, is_jump_target=True, positions=None), Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=140, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=142, starts_line=None, is_jump_target=False, positions=None), @@ -1199,7 +1199,7 @@ def _prepare_test_cases(): Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=164, 
starts_line=None, is_jump_target=False, positions=None), Instruction(opname='CALL_NO_KW', opcode=169, arg=3, argval=3, argrepr='', offset=166, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=168, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=11, argval=194, argrepr='to 194', offset=170, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=25, argval=222, argrepr='to 222', offset=170, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=172, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='WITH_EXCEPT_START', opcode=49, arg=None, argval=None, argrepr='', offset=174, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_JUMP_IF_TRUE', opcode=115, arg=93, argval=186, argrepr='to 186', offset=176, starts_line=None, is_jump_target=False, positions=None), @@ -1211,28 +1211,36 @@ def _prepare_test_cases(): Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=188, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=190, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=192, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=194, starts_line=28, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=196, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=198, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=200, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=202, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=204, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=206, starts_line=23, is_jump_target=True, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=208, starts_line=28, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=210, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=212, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=214, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=216, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', 
offset=218, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=220, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=222, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=224, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=226, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=228, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=230, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=232, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=234, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=236, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=194, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=196, starts_line=28, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=198, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=200, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=202, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=204, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=206, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=208, starts_line=23, is_jump_target=True, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=210, starts_line=28, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=212, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=214, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=216, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=218, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=220, starts_line=None, 
is_jump_target=False, positions=None), + Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=222, starts_line=25, is_jump_target=True, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=224, starts_line=28, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=226, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=228, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=230, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=232, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=234, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=236, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='print', argrepr='print', offset=238, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=240, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=242, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=244, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=246, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=248, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=250, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=252, starts_line=None, is_jump_target=False, positions=None), ] # One last piece of inspect fodder to check the default line number handling diff --git a/Lib/test/test_sys_settrace.py b/Lib/test/test_sys_settrace.py index 883b2842f2c77..e5fc88e1b6f59 100644 --- a/Lib/test/test_sys_settrace.py +++ b/Lib/test/test_sys_settrace.py @@ -644,16 +644,25 @@ def func(): 4 else: 6 + if False: + 8 + else: + 10 + if func.__name__ == 'Fred': + 12 finally: - 8 + 14 self.run_and_compare(func, [(0, 'call'), (1, 'line'), (2, 'line'), (6, 'line'), - (8, 'line'), - (8, 'return')]) + (7, 'line'), + (10, 'line'), + (11, 'line'), + (14, 'line'), + (14, 'return')]) def test_nested_loops(self): @@ -1222,16 +1231,25 @@ def func(): 4 else: 6 + if False: + 8 + else: + 10 + if func.__name__ == 'Fred': + 12 finally: - 8 + 14 self.run_and_compare(func, [(0, 'call'), (1, 'line'), (2, 'line'), (6, 'line'), - (8, 'line'), - (8, 'return')]) + (7, 'line'), + (10, 'line'), + (11, 'line'), + (14, 'line'), + (14, 'return')]) def test_try_except_star_named_no_exception(self): diff --git a/Python/compile.c b/Python/compile.c index 0d821d4183f12..b2702da8707f3 
100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -3475,7 +3475,6 @@ compiler_try_except(struct compiler *c, stmt_ty s) POP_EXCEPT_AND_RERAISE(c); compiler_use_next_block(c, orelse); VISIT_SEQ(c, stmt, s->v.Try.orelse); - ADDOP_JUMP(c, JUMP_FORWARD, end); compiler_use_next_block(c, end); return 1; } @@ -3702,7 +3701,6 @@ compiler_try_star_except(struct compiler *c, stmt_ty s) POP_EXCEPT_AND_RERAISE(c); compiler_use_next_block(c, orelse); VISIT_SEQ(c, stmt, s->v.TryStar.orelse); - ADDOP_JUMP(c, JUMP_FORWARD, end); compiler_use_next_block(c, end); return 1; } From webhook-mailer at python.org Thu Jan 13 07:36:02 2022 From: webhook-mailer at python.org (iritkatriel) Date: Thu, 13 Jan 2022 12:36:02 -0000 Subject: [Python-checkins] bpo-46328: Add sys.exception() (GH-30514) Message-ID: https://github.com/python/cpython/commit/c590b581bba517f81ced2e6f531ccc9e2e22eab5 commit: c590b581bba517f81ced2e6f531ccc9e2e22eab5 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-13T12:35:58Z summary: bpo-46328: Add sys.exception() (GH-30514) files: A Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst M Doc/library/sys.rst M Doc/tutorial/errors.rst M Doc/whatsnew/3.11.rst M Lib/test/test_sys.py M Python/clinic/sysmodule.c.h M Python/sysmodule.c diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst index 7d1b21f05edb1..5e47201f88eae 100644 --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -378,26 +378,41 @@ always available. .. versionadded:: 3.8 __unraisablehook__ + +.. function:: exception() + + This function returns the exception instance that is currently being + handled. This exception is specific both to the current thread and + to the current stack frame. If the current stack frame is not handling + an exception, the exception is taken from the calling stack frame, or its + caller, and so on until a stack frame is found that is handling an + exception. Here, "handling an exception" is defined as "executing an + except clause." For any stack frame, only the exception being currently + handled is accessible. + + .. index:: object: traceback + + If no exception is being handled anywhere on the stack, ``None`` is + returned. + + .. versionadded:: 3.11 + + .. function:: exc_info() - This function returns a tuple of three values that give information about the - exception that is currently being handled. The information returned is specific - both to the current thread and to the current stack frame. If the current stack - frame is not handling an exception, the information is taken from the calling - stack frame, or its caller, and so on until a stack frame is found that is - handling an exception. Here, "handling an exception" is defined as "executing - an except clause." For any stack frame, only information about the exception - being currently handled is accessible. + This function returns the old-style representation of the handled + exception. If an exception ``e`` is currently handled (so + :func:`exception` would return ``e``), :func:`exc_info` returns the + tuple ``(type(e), e, e.__traceback__)``. + That is, a tuple containing the type of the exception (a subclass of + :exc:`BaseException`), the exception itself, and a :ref:`traceback + object ` which typically encapsulates the call + stack at the point where the exception last occurred. .. 
index:: object: traceback - If no exception is being handled anywhere on the stack, a tuple containing - three ``None`` values is returned. Otherwise, the values returned are - ``(type, value, traceback)``. Their meaning is: *type* gets the type of the - exception being handled (a subclass of :exc:`BaseException`); *value* gets - the exception instance (an instance of the exception type); *traceback* gets - a :ref:`traceback object ` which typically encapsulates - the call stack at the point where the exception last occurred. + If no exception is being handled anywhere on the stack, this function + return a tuple containing three ``None`` values. .. versionchanged:: 3.11 The ``type`` and ``traceback`` fields are now derived from the ``value`` diff --git a/Doc/tutorial/errors.rst b/Doc/tutorial/errors.rst index ad1ef841bffc4..888740cdd0f19 100644 --- a/Doc/tutorial/errors.rst +++ b/Doc/tutorial/errors.rst @@ -167,7 +167,7 @@ then re-raise the exception (allowing a caller to handle the exception as well): raise Alternatively the last except clause may omit the exception name(s), however the exception -value must then be retrieved from ``sys.exc_info()[1]``. +value must then be retrieved with ``sys.exception()``. The :keyword:`try` ... :keyword:`except` statement has an optional *else clause*, which, when present, must follow all *except clauses*. It is useful diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 72243619891ae..28ac57e954438 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -305,6 +305,9 @@ sys the results of subsequent calls to :func:`exc_info`. (Contributed by Irit Katriel in :issue:`45711`.) +* Add :func:`sys.exception` which returns the active exception instance + (equivalent to ``sys.exc_info()[1]``). + (Contributed by Irit Katriel in :issue:`46328`.) 
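A minimal sketch of the behaviour documented above (not part of the committed diff), assuming a Python 3.11+ interpreter where ``sys.exception()`` exists: inside an except clause the new function returns the same instance that ``sys.exc_info()`` wraps, and it returns ``None`` once no exception is being handled::

    >>> import sys
    >>> try:
    ...     raise ValueError(42)
    ... except ValueError:
    ...     exc = sys.exception()        # the handled exception instance
    ...     tp, val, tb = sys.exc_info() # old-style (type, value, traceback) view
    ...
    >>> isinstance(exc, ValueError)
    True
    >>> (tp, val, tb) == (type(exc), exc, exc.__traceback__)
    True
    >>> sys.exception() is None          # no exception is being handled here
    True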
threading --------- diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py index 38771d427da7b..f05cd75af97b5 100644 --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -71,6 +71,69 @@ def baddisplayhook(obj): code = compile("42", "", "single") self.assertRaises(ValueError, eval, code) +class ActiveExceptionTests(unittest.TestCase): + def test_exc_info_no_exception(self): + self.assertEqual(sys.exc_info(), (None, None, None)) + + def test_sys_exception_no_exception(self): + self.assertEqual(sys.exception(), None) + + def test_exc_info_with_exception_instance(self): + def f(): + raise ValueError(42) + + try: + f() + except Exception as e_: + e = e_ + exc_info = sys.exc_info() + + self.assertIsInstance(e, ValueError) + self.assertIs(exc_info[0], ValueError) + self.assertIs(exc_info[1], e) + self.assertIs(exc_info[2], e.__traceback__) + + def test_exc_info_with_exception_type(self): + def f(): + raise ValueError + + try: + f() + except Exception as e_: + e = e_ + exc_info = sys.exc_info() + + self.assertIsInstance(e, ValueError) + self.assertIs(exc_info[0], ValueError) + self.assertIs(exc_info[1], e) + self.assertIs(exc_info[2], e.__traceback__) + + def test_sys_exception_with_exception_instance(self): + def f(): + raise ValueError(42) + + try: + f() + except Exception as e_: + e = e_ + exc = sys.exception() + + self.assertIsInstance(e, ValueError) + self.assertIs(exc, e) + + def test_sys_exception_with_exception_type(self): + def f(): + raise ValueError + + try: + f() + except Exception as e_: + e = e_ + exc = sys.exception() + + self.assertIsInstance(e, ValueError) + self.assertIs(exc, e) + class ExceptHookTest(unittest.TestCase): diff --git a/Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst b/Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst new file mode 100644 index 0000000000000..fec790d52cef3 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst @@ -0,0 +1 @@ +Added the :meth:`sys.exception` method which returns the active exception instance. 
\ No newline at end of file diff --git a/Python/clinic/sysmodule.c.h b/Python/clinic/sysmodule.c.h index 8350fbf98561a..ce5390c8a1e58 100644 --- a/Python/clinic/sysmodule.c.h +++ b/Python/clinic/sysmodule.c.h @@ -76,6 +76,28 @@ sys_excepthook(PyObject *module, PyObject *const *args, Py_ssize_t nargs) return return_value; } +PyDoc_STRVAR(sys_exception__doc__, +"exception($module, /)\n" +"--\n" +"\n" +"Return the current exception.\n" +"\n" +"Return the most recent exception caught by an except clause\n" +"in the current stack frame or in an older stack frame, or None\n" +"if no such exception exists."); + +#define SYS_EXCEPTION_METHODDEF \ + {"exception", (PyCFunction)sys_exception, METH_NOARGS, sys_exception__doc__}, + +static PyObject * +sys_exception_impl(PyObject *module); + +static PyObject * +sys_exception(PyObject *module, PyObject *Py_UNUSED(ignored)) +{ + return sys_exception_impl(module); +} + PyDoc_STRVAR(sys_exc_info__doc__, "exc_info($module, /)\n" "--\n" @@ -992,4 +1014,4 @@ sys_getandroidapilevel(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef SYS_GETANDROIDAPILEVEL_METHODDEF #define SYS_GETANDROIDAPILEVEL_METHODDEF #endif /* !defined(SYS_GETANDROIDAPILEVEL_METHODDEF) */ -/*[clinic end generated code: output=855fc93b2347710b input=a9049054013a1b77]*/ +/*[clinic end generated code: output=60756bc6f683e0c8 input=a9049054013a1b77]*/ diff --git a/Python/sysmodule.c b/Python/sysmodule.c index f912115560704..0b7b61d8b1e28 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -771,6 +771,28 @@ sys_excepthook_impl(PyObject *module, PyObject *exctype, PyObject *value, } +/*[clinic input] +sys.exception + +Return the current exception. + +Return the most recent exception caught by an except clause +in the current stack frame or in an older stack frame, or None +if no such exception exists. 
+[clinic start generated code]*/ + +static PyObject * +sys_exception_impl(PyObject *module) +/*[clinic end generated code: output=2381ee2f25953e40 input=c88fbb94b6287431]*/ +{ + _PyErr_StackItem *err_info = _PyErr_GetTopmostException(_PyThreadState_GET()); + if (err_info->exc_value != NULL) { + return Py_NewRef(err_info->exc_value); + } + Py_RETURN_NONE; +} + + /*[clinic input] sys.exc_info @@ -1963,6 +1985,7 @@ static PyMethodDef sys_methods[] = { SYS__CURRENT_FRAMES_METHODDEF SYS__CURRENT_EXCEPTIONS_METHODDEF SYS_DISPLAYHOOK_METHODDEF + SYS_EXCEPTION_METHODDEF SYS_EXC_INFO_METHODDEF SYS_EXCEPTHOOK_METHODDEF SYS_EXIT_METHODDEF @@ -2457,7 +2480,8 @@ Functions:\n\ \n\ displayhook() -- print an object to the screen, and save it in builtins._\n\ excepthook() -- print an exception and its traceback to sys.stderr\n\ -exc_info() -- return thread-safe information about the current exception\n\ +exception() -- return the current thread's active exception\n\ +exc_info() -- return information about the current thread's active exception\n\ exit() -- exit the interpreter by raising SystemExit\n\ getdlopenflags() -- returns flags to be used for dlopen() calls\n\ getprofile() -- get the global profiling function\n\ From webhook-mailer at python.org Thu Jan 13 08:09:48 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Thu, 13 Jan 2022 13:09:48 -0000 Subject: [Python-checkins] bpo-46359: Modernize `test_typing` by removing checks for EOL Python versions (GH-30563) Message-ID: https://github.com/python/cpython/commit/8c49d057bf8618208d4ed67c9caecbfa71f7a2d0 commit: 8c49d057bf8618208d4ed67c9caecbfa71f7a2d0 branch: main author: Nikita Sobolev committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-13T21:09:40+08:00 summary: bpo-46359: Modernize `test_typing` by removing checks for EOL Python versions (GH-30563) Also removes unused tests meant for older versions of Python. files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index c11f6f7c19224..cf719df6da1d7 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -1940,13 +1940,12 @@ class A(Generic[T]): self.assertNotEqual(typing.FrozenSet[A[str]], typing.FrozenSet[mod_generics_cache.B.A[str]]) - if sys.version_info[:2] > (3, 2): - self.assertTrue(repr(Tuple[A[str]]).endswith('.A[str]]')) - self.assertTrue(repr(Tuple[B.A[str]]).endswith('.B.A[str]]')) - self.assertTrue(repr(Tuple[mod_generics_cache.A[str]]) - .endswith('mod_generics_cache.A[str]]')) - self.assertTrue(repr(Tuple[mod_generics_cache.B.A[str]]) - .endswith('mod_generics_cache.B.A[str]]')) + self.assertTrue(repr(Tuple[A[str]]).endswith('.A[str]]')) + self.assertTrue(repr(Tuple[B.A[str]]).endswith('.B.A[str]]')) + self.assertTrue(repr(Tuple[mod_generics_cache.A[str]]) + .endswith('mod_generics_cache.A[str]]')) + self.assertTrue(repr(Tuple[mod_generics_cache.B.A[str]]) + .endswith('mod_generics_cache.B.A[str]]')) def test_extended_generic_rules_eq(self): T = TypeVar('T') @@ -2065,11 +2064,9 @@ class MyDict(typing.Dict[T, T]): ... class MyDef(typing.DefaultDict[str, T]): ... self.assertIs(MyDef[int]().__class__, MyDef) self.assertEqual(MyDef[int]().__orig_class__, MyDef[int]) - # ChainMap was added in 3.3 - if sys.version_info >= (3, 3): - class MyChain(typing.ChainMap[str, T]): ... - self.assertIs(MyChain[int]().__class__, MyChain) - self.assertEqual(MyChain[int]().__orig_class__, MyChain[int]) + class MyChain(typing.ChainMap[str, T]): ... 
+ self.assertIs(MyChain[int]().__class__, MyChain) + self.assertEqual(MyChain[int]().__orig_class__, MyChain[int]) def test_all_repr_eq_any(self): objs = (getattr(typing, el) for el in typing.__all__) @@ -4096,14 +4093,6 @@ def test_basics(self): self.assertEqual(Emp.__annotations__, collections.OrderedDict([('name', str), ('id', int)])) - def test_namedtuple_pyversion(self): - if sys.version_info[:2] < (3, 6): - with self.assertRaises(TypeError): - NamedTuple('Name', one=int, other=str) - with self.assertRaises(TypeError): - class NotYet(NamedTuple): - whatever = 0 - def test_annotation_usage(self): tim = CoolEmployee('Tim', 9000) self.assertIsInstance(tim, CoolEmployee) From webhook-mailer at python.org Thu Jan 13 09:28:11 2022 From: webhook-mailer at python.org (corona10) Date: Thu, 13 Jan 2022 14:28:11 -0000 Subject: [Python-checkins] bpo-46358: modernize `test_asyncio` (GH-30562) Message-ID: https://github.com/python/cpython/commit/f779faccd3a7a7e8c372492e858d021c449cdd85 commit: f779faccd3a7a7e8c372492e858d021c449cdd85 branch: main author: Nikita Sobolev committer: corona10 date: 2022-01-13T23:28:02+09:00 summary: bpo-46358: modernize `test_asyncio` (GH-30562) files: M Lib/test/test_asyncio/test_base_events.py M Lib/test/test_asyncio/test_events.py M Lib/test/test_asyncio/test_futures.py diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py index 3f4c2d85e0380..17e8396cfe2ef 100644 --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -21,7 +21,6 @@ MOCK_ANY = mock.ANY -PY34 = sys.version_info >= (3, 4) def tearDownModule(): @@ -596,18 +595,10 @@ async def zero_error_coro(): self.loop.run_forever() fut = None # Trigger Future.__del__ or futures._TracebackLogger support.gc_collect() - if PY34: - # Future.__del__ in Python 3.4 logs error with - # an actual exception context - log.error.assert_called_with( - test_utils.MockPattern('.*exception was never retrieved'), - exc_info=(ZeroDivisionError, MOCK_ANY, MOCK_ANY)) - else: - # futures._TracebackLogger logs only textual traceback - log.error.assert_called_with( - test_utils.MockPattern( - '.*exception was never retrieved.*ZeroDiv'), - exc_info=False) + # Future.__del__ in logs error with an actual exception context + log.error.assert_called_with( + test_utils.MockPattern('.*exception was never retrieved'), + exc_info=(ZeroDivisionError, MOCK_ANY, MOCK_ANY)) def test_set_exc_handler_invalid(self): with self.assertRaisesRegex(TypeError, 'A callable object or None'): diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py index c46c9dd40c83b..a30867e28029f 100644 --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -737,14 +737,6 @@ def client(): @unittest.skipIf(ssl is None, 'No ssl module') def test_ssl_connect_accepted_socket(self): - if (sys.platform == 'win32' and - sys.version_info < (3, 5) and - isinstance(self.loop, proactor_events.BaseProactorEventLoop) - ): - raise unittest.SkipTest( - 'SSL not supported with proactor event loops before Python 3.5' - ) - server_context = test_utils.simple_server_sslcontext() client_context = test_utils.simple_client_sslcontext() @@ -2206,17 +2198,15 @@ def test_handle_repr(self): self.assertRegex(repr(h), regex) # partial method - if sys.version_info >= (3, 4): - method = HandleTests.test_handle_repr - cb = functools.partialmethod(method) - filename, lineno = test_utils.get_function_source(method) - h = asyncio.Handle(cb, (), self.loop) - 
- cb_regex = r'' - cb_regex = (r'functools.partialmethod\(%s, , \)\(\)' % cb_regex) - regex = (r'^$' - % (cb_regex, re.escape(filename), lineno)) - self.assertRegex(repr(h), regex) + method = HandleTests.test_handle_repr + cb = functools.partialmethod(method) + filename, lineno = test_utils.get_function_source(method) + h = asyncio.Handle(cb, (), self.loop) + + cb_regex = r'' + cb_regex = fr'functools.partialmethod\({cb_regex}, , \)\(\)' + regex = fr'^$' + self.assertRegex(repr(h), regex) def test_handle_repr_debug(self): self.loop.get_debug.return_value = True diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py index 0c379e0fb0f95..95983f0550807 100644 --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -571,13 +571,10 @@ def memory_error(): test_utils.run_briefly(self.loop) support.gc_collect() - if sys.version_info >= (3, 4): - regex = f'^{self.cls.__name__} exception was never retrieved\n' - exc_info = (type(exc), exc, exc.__traceback__) - m_log.error.assert_called_once_with(mock.ANY, exc_info=exc_info) - else: - regex = r'^Future/Task exception was never retrieved\n' - m_log.error.assert_called_once_with(mock.ANY, exc_info=False) + regex = f'^{self.cls.__name__} exception was never retrieved\n' + exc_info = (type(exc), exc, exc.__traceback__) + m_log.error.assert_called_once_with(mock.ANY, exc_info=exc_info) + message = m_log.error.call_args[0][0] self.assertRegex(message, re.compile(regex, re.DOTALL)) From webhook-mailer at python.org Thu Jan 13 13:22:08 2022 From: webhook-mailer at python.org (vstinner) Date: Thu, 13 Jan 2022 18:22:08 -0000 Subject: [Python-checkins] bpo-46355: Document PyFrameObject and PyThreadState changes (GH-30558) Message-ID: https://github.com/python/cpython/commit/0885999a8e5ffad3fae0302675ad0030e33a15af commit: 0885999a8e5ffad3fae0302675ad0030e33a15af branch: main author: Victor Stinner committer: vstinner date: 2022-01-13T19:21:50+01:00 summary: bpo-46355: Document PyFrameObject and PyThreadState changes (GH-30558) Document PyFrameObject and PyThreadState changes in What's New in Python 3.11 and explain how to port code. files: M Doc/whatsnew/3.11.rst diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 28ac57e954438..6a6c22c9077c9 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -755,6 +755,110 @@ Porting to Python 3.11 which are not available in the limited C API. (Contributed by Victor Stinner in :issue:`46007`.) +* Changes of the :c:type:`PyFrameObject` structure members: + + * ``f_code``: removed, use :c:func:`PyFrame_GetCode` instead. + Warning: the function returns a :term:`strong reference`, need to call + :c:func:`Py_DECREF`. + * ``f_back``: changed, use :c:func:`PyFrame_GetBack`. + * ``f_builtins``: removed, + use ``PyObject_GetAttrString(frame, "f_builtins")``. + * ``f_globals``: removed, + use ``PyObject_GetAttrString(frame, "f_globals")``. + * ``f_locals``: removed, + use ``PyObject_GetAttrString(frame, "f_locals")``. + * ``f_lasti``: removed, + use ``PyObject_GetAttrString(frame, "f_lasti")``. + * ``f_valuesstack``: removed. + * ``f_stackdepth``: removed. + * ``f_gen``: removed. + * ``f_iblock``: removed. + * ``f_state``: removed. + * ``f_blockstack``: removed. + * ``f_localsplus``: removed. + + The Python frame object is now created lazily. A side effect is that the + ``f_back`` member must not be accessed directly, since its value is now also + computed lazily. The :c:func:`PyFrame_GetBack` function must be called + instead. 
+ + Code defining ``PyFrame_GetCode()`` on Python 3.8 and older:: + + #if PY_VERSION_HEX < 0x030900B1 + static inline PyCodeObject* PyFrame_GetCode(PyFrameObject *frame) + { + Py_INCREF(frame->f_code); + return frame->f_code; + } + #endif + + Code defining ``PyFrame_GetBack()`` on Python 3.8 and older:: + + #if PY_VERSION_HEX < 0x030900B1 + static inline PyFrameObject* PyFrame_GetBack(PyFrameObject *frame) + { + Py_XINCREF(frame->f_back); + return frame->f_back; + } + #endif + + Or use `the pythoncapi_compat project + `__ to get these functions + on old Python functions. + +* Changes of the :c:type:`PyThreadState` structure members: + + * ``frame``: removed, use :c:func:`PyThreadState_GetFrame` (function added + to Python 3.9 by :issue:`40429`). + Warning: the function returns a :term:`strong reference`, need to call + :c:func:`Py_XDECREF`. + * ``tracing``: changed, use :c:func:`PyThreadState_EnterTracing` + and :c:func:`PyThreadState_LeaveTracing` + (functions added to Python 3.11 by :issue:`43760`). + * ``recursion_depth``: removed, + use ``(tstate->recursion_limit - tstate->recursion_remaining)`` instead. + * ``stackcheck_counter``: removed. + + Code defining ``PyThreadState_GetFrame()`` on Python 3.8 and older:: + + #if PY_VERSION_HEX < 0x030900B1 + static inline PyFrameObject* PyThreadState_GetFrame(PyThreadState *tstate) + { + Py_XINCREF(tstate->frame); + return tstate->frame; + } + #endif + + Code defining ``PyThreadState_EnterTracing()`` and + ``PyThreadState_LeaveTracing()`` on Python 3.10 and older:: + + #if PY_VERSION_HEX < 0x030B00A2 + static inline void PyThreadState_EnterTracing(PyThreadState *tstate) + { + tstate->tracing++; + #if PY_VERSION_HEX >= 0x030A00A1 + tstate->cframe->use_tracing = 0; + #else + tstate->use_tracing = 0; + #endif + } + + static inline void PyThreadState_LeaveTracing(PyThreadState *tstate) + { + int use_tracing = (tstate->c_tracefunc != NULL || tstate->c_profilefunc != NULL); + tstate->tracing--; + #if PY_VERSION_HEX >= 0x030A00A1 + tstate->cframe->use_tracing = use_tracing; + #else + tstate->use_tracing = use_tracing; + #endif + } + #endif + + Or use `the pythoncapi_compat project + `__ to get these functions + on old Python functions. + Deprecated ---------- From webhook-mailer at python.org Thu Jan 13 13:24:37 2022 From: webhook-mailer at python.org (vstinner) Date: Thu, 13 Jan 2022 18:24:37 -0000 Subject: [Python-checkins] bpo-44133: Link Python executable with object files (GH-30556) Message-ID: https://github.com/python/cpython/commit/6be848922bc0f4c632c255c39de82a45b6480286 commit: 6be848922bc0f4c632c255c39de82a45b6480286 branch: main author: Victor Stinner committer: vstinner date: 2022-01-13T19:24:28+01:00 summary: bpo-44133: Link Python executable with object files (GH-30556) When Python is built without --enable-shared, the "python" program is now linked to object files, rather than being linked to the Python library (libpython.a), to make sure that all symbols are exported. Previously, the linker omitted some symbols like the Py_FrozenMain() function. When Python is configured with --without-static-libpython, the Python static library (libpython.a) is no longer built. * Check --without-static-libpython earlier in configure.ac * Add LINK_PYTHON_OBJS and LINK_PYTHON_DEPS variables to Makefile. * test_capi now ensures that the "Py_FrozenMain" symbol is exported. 
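
On a POSIX build the new check amounts to a symbol lookup against the running interpreter, which can be reproduced by hand (illustrative only; as the summary and the test note, Py_FrozenMain is not expected to be reachable this way on Windows)::

    >>> import ctypes
    >>> hasattr(ctypes.pythonapi, "Py_FrozenMain")   # a dlsym() lookup on the running binary
    True
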
files: A Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst A Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst M Lib/test/test_capi.py M Makefile.pre.in M configure M configure.ac diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 99263bff091be..9f217852ec529 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -643,6 +643,24 @@ def test_Py_CompileString(self): expected = compile(code, "", "exec") self.assertEqual(result.co_consts, expected.co_consts) + def test_export_symbols(self): + # bpo-44133: Ensure that the "Py_FrozenMain" and + # "PyThread_get_thread_native_id" symbols are exported by the Python + # (directly by the binary, or via by the Python dynamic library). + ctypes = import_helper.import_module('ctypes') + names = ['PyThread_get_thread_native_id'] + + # Python/frozenmain.c fails to build on Windows when the symbols are + # missing: + # - PyWinFreeze_ExeInit + # - PyWinFreeze_ExeTerm + # - PyInitFrozenExtensions + if os.name != 'nt': + names.append('Py_FrozenMain') + for name in names: + with self.subTest(name=name): + self.assertTrue(hasattr(ctypes.pythonapi, name)) + class TestPendingCalls(unittest.TestCase): diff --git a/Makefile.pre.in b/Makefile.pre.in index 41b123abcef11..a84badcd49389 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -265,6 +265,7 @@ DLLLIBRARY= @DLLLIBRARY@ LDLIBRARYDIR= @LDLIBRARYDIR@ INSTSONAME= @INSTSONAME@ LIBRARY_DEPS= @LIBRARY_DEPS@ +LINK_PYTHON_DEPS=@LINK_PYTHON_DEPS@ PY_ENABLE_SHARED= @PY_ENABLE_SHARED@ STATIC_LIBPYTHON= @STATIC_LIBPYTHON@ @@ -526,6 +527,8 @@ LIBRARY_OBJS= \ Modules/getpath.o \ Python/frozen.o +LINK_PYTHON_OBJS=@LINK_PYTHON_OBJS@ + ########################################################################## # DTrace @@ -721,8 +724,8 @@ clinic: check-clean-src $(srcdir)/Modules/_blake2/blake2s_impl.c $(PYTHON_FOR_REGEN) $(srcdir)/Tools/clinic/clinic.py --make --srcdir $(srcdir) # Build the interpreter -$(BUILDPYTHON): Programs/python.o $(LIBRARY_DEPS) - $(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/python.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) +$(BUILDPYTHON): Programs/python.o $(LINK_PYTHON_DEPS) + $(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/python.o $(LINK_PYTHON_OBJS) $(LIBS) $(MODLIBS) $(SYSLIBS) platform: $(BUILDPYTHON) pybuilddir.txt $(RUNSHARED) $(PYTHON_FOR_BUILD) -c 'import sys ; from sysconfig import get_platform ; print("%s-%d.%d" % (get_platform(), *sys.version_info[:2]))' >platform @@ -965,8 +968,8 @@ regen-test-frozenmain: $(BUILDPYTHON) # using Programs/freeze_test_frozenmain.py $(RUNSHARED) ./$(BUILDPYTHON) $(srcdir)/Programs/freeze_test_frozenmain.py Programs/test_frozenmain.h -Programs/_testembed: Programs/_testembed.o $(LIBRARY_DEPS) - $(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/_testembed.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) +Programs/_testembed: Programs/_testembed.o $(LINK_PYTHON_DEPS) + $(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/_testembed.o $(LINK_PYTHON_OBJS) $(LIBS) $(MODLIBS) $(SYSLIBS) ############################################################################ # "Bootstrap Python" used to run deepfreeze.py diff --git a/Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst b/Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst new file mode 100644 index 0000000000000..7c2a48a9e0d56 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst @@ -0,0 +1,5 @@ +When Python is built without :option:`--enable-shared`, the 
``python`` +program is now linked to object files, rather than being linked to the Python +static library (libpython.a), to make sure that all symbols are exported. +Previously, the linker omitted some symbols like the :c:func:`Py_FrozenMain` +function. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst b/Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst new file mode 100644 index 0000000000000..3542850ff286b --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst @@ -0,0 +1,2 @@ +When Python is configured with :option:`--without-static-libpython`, the Python +static library (libpython.a) is no longer built. Patch by Victor Stinner. diff --git a/configure b/configure index 127b350b4bb04..ccbf4fc0b180f 100755 --- a/configure +++ b/configure @@ -775,8 +775,6 @@ MODULE__IO_TRUE MODULES_SETUP_STDLIB MODULE_BUILDTYPE TEST_MODULES -LIBRARY_DEPS -STATIC_LIBPYTHON OPENSSL_RPATH OPENSSL_LDFLAGS OPENSSL_LIBS @@ -877,6 +875,10 @@ READELF ARFLAGS ac_ct_AR AR +LINK_PYTHON_OBJS +LINK_PYTHON_DEPS +LIBRARY_DEPS +STATIC_LIBPYTHON GNULD EXPORTSFROM EXPORTSYMS @@ -1007,6 +1009,7 @@ with_cxx_main with_emscripten_target with_suffix enable_shared +with_static_libpython enable_profiling with_pydebug with_trace_refs @@ -1048,7 +1051,6 @@ with_openssl_rpath with_ssl_default_suites with_builtin_hashlib_hashes with_experimental_isolated_subinterpreters -with_static_libpython enable_test_modules ' ac_precious_vars='build_alias @@ -1758,6 +1760,9 @@ Optional Packages: Emscripten platform --with-suffix=SUFFIX set executable suffix to SUFFIX (default is empty, yes is mapped to '.exe') + --without-static-libpython + do not build libpythonMAJOR.MINOR.a and do not + install python.o (default is yes) --with-pydebug build with Py_DEBUG defined (default is no) --with-trace-refs enable tracing references for debugging purpose (default is no) @@ -1840,9 +1845,6 @@ Optional Packages: --with-experimental-isolated-subinterpreters better isolate subinterpreters, experimental build mode (default is no) - --without-static-libpython - do not build libpythonMAJOR.MINOR.a and do not - install python.o (default is yes) Some influential environment variables: PKG_CONFIG path to pkg-config utility @@ -6428,6 +6430,30 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $enable_shared" >&5 $as_echo "$enable_shared" >&6; } +# --with-static-libpython +STATIC_LIBPYTHON=1 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --with-static-libpython" >&5 +$as_echo_n "checking for --with-static-libpython... " >&6; } + +# Check whether --with-static-libpython was given. +if test "${with_static_libpython+set}" = set; then : + withval=$with_static_libpython; +if test "$withval" = no +then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; }; + STATIC_LIBPYTHON=0 +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; }; +fi +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 +$as_echo "yes" >&6; } +fi + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for --enable-profiling" >&5 $as_echo_n "checking for --enable-profiling... " >&6; } # Check whether --enable-profiling was given. 
@@ -6550,6 +6576,31 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $LDLIBRARY" >&5 $as_echo "$LDLIBRARY" >&6; } +# LIBRARY_DEPS, LINK_PYTHON_OBJS and LINK_PYTHON_DEPS variable +LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)' +LINK_PYTHON_DEPS='$(LIBRARY_DEPS)' +if test "$PY_ENABLE_SHARED" = 1 || test "$enable_framework" ; then + LIBRARY_DEPS="\$(LDLIBRARY) $LIBRARY_DEPS" + if test "$STATIC_LIBPYTHON" = 1; then + LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" + fi + # Link Python program to the shared library + LINK_PYTHON_OBJS='$(BLDLIBRARY)' +else + if test "$STATIC_LIBPYTHON" = 0; then + # Build Python needs object files but don't need to build + # Python static library + LINK_PYTHON_DEPS="$LIBRARY_DEPS \$(LIBRARY_OBJS)" + fi + LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" + # Link Python program to object files + LINK_PYTHON_OBJS='$(LIBRARY_OBJS)' +fi + + + + +# ar program if test -n "$ac_tool_prefix"; then for ac_prog in ar aal @@ -21213,48 +21264,6 @@ $as_echo "no" >&6; } fi -# --with-static-libpython -STATIC_LIBPYTHON=1 -{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --with-static-libpython" >&5 -$as_echo_n "checking for --with-static-libpython... " >&6; } - -# Check whether --with-static-libpython was given. -if test "${with_static_libpython+set}" = set; then : - withval=$with_static_libpython; -if test "$withval" = no -then - { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 -$as_echo "no" >&6; }; - STATIC_LIBPYTHON=0 -else - { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 -$as_echo "yes" >&6; }; -fi -else - { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 -$as_echo "yes" >&6; } -fi - -LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)' -if test "$PY_ENABLE_SHARED" = 1 || test "$enable_framework" ; then - LIBRARY_DEPS="\$(LDLIBRARY) $LIBRARY_DEPS" - if test "$STATIC_LIBPYTHON" = 1; then - LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" - fi -else - LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" -fi - -case $ac_sys_system/$ac_sys_emscripten_target in #( - Emscripten/browser) : - LIBRARY_DEPS="$LIBRARY_DEPS \$(WASM_STDLIB)" ;; #( - *) : - ;; -esac - - - - # Check whether to disable test modules. Once set, setup.py will not build # test extension modules and "make install" will not install test suites. 
{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --disable-test-modules" >&5 diff --git a/configure.ac b/configure.ac index e5ebf7bc2e07a..89041b205f50d 100644 --- a/configure.ac +++ b/configure.ac @@ -1231,6 +1231,23 @@ then fi AC_MSG_RESULT($enable_shared) +# --with-static-libpython +STATIC_LIBPYTHON=1 +AC_MSG_CHECKING(for --with-static-libpython) +AC_ARG_WITH(static-libpython, + AS_HELP_STRING([--without-static-libpython], + [do not build libpythonMAJOR.MINOR.a and do not install python.o (default is yes)]), +[ +if test "$withval" = no +then + AC_MSG_RESULT(no); + STATIC_LIBPYTHON=0 +else + AC_MSG_RESULT(yes); +fi], +[AC_MSG_RESULT(yes)]) +AC_SUBST(STATIC_LIBPYTHON) + AC_MSG_CHECKING(for --enable-profiling) AC_ARG_ENABLE(profiling, AS_HELP_STRING([--enable-profiling], [enable C-level code profiling with gprof (default is no)])) @@ -1336,6 +1353,31 @@ fi AC_MSG_RESULT($LDLIBRARY) +# LIBRARY_DEPS, LINK_PYTHON_OBJS and LINK_PYTHON_DEPS variable +LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)' +LINK_PYTHON_DEPS='$(LIBRARY_DEPS)' +if test "$PY_ENABLE_SHARED" = 1 || test "$enable_framework" ; then + LIBRARY_DEPS="\$(LDLIBRARY) $LIBRARY_DEPS" + if test "$STATIC_LIBPYTHON" = 1; then + LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" + fi + # Link Python program to the shared library + LINK_PYTHON_OBJS='$(BLDLIBRARY)' +else + if test "$STATIC_LIBPYTHON" = 0; then + # Build Python needs object files but don't need to build + # Python static library + LINK_PYTHON_DEPS="$LIBRARY_DEPS \$(LIBRARY_OBJS)" + fi + LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" + # Link Python program to object files + LINK_PYTHON_OBJS='$(LIBRARY_OBJS)' +fi +AC_SUBST(LIBRARY_DEPS) +AC_SUBST(LINK_PYTHON_DEPS) +AC_SUBST(LINK_PYTHON_OBJS) + +# ar program AC_SUBST(AR) AC_CHECK_TOOLS(AR, ar aal, ar) @@ -6273,39 +6315,6 @@ else fi], [AC_MSG_RESULT(no)]) -# --with-static-libpython -STATIC_LIBPYTHON=1 -AC_MSG_CHECKING(for --with-static-libpython) -AC_ARG_WITH(static-libpython, - AS_HELP_STRING([--without-static-libpython], - [do not build libpythonMAJOR.MINOR.a and do not install python.o (default is yes)]), -[ -if test "$withval" = no -then - AC_MSG_RESULT(no); - STATIC_LIBPYTHON=0 -else - AC_MSG_RESULT(yes); -fi], -[AC_MSG_RESULT(yes)]) -LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)' -if test "$PY_ENABLE_SHARED" = 1 || test "$enable_framework" ; then - LIBRARY_DEPS="\$(LDLIBRARY) $LIBRARY_DEPS" - if test "$STATIC_LIBPYTHON" = 1; then - LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" - fi -else - LIBRARY_DEPS="\$(LIBRARY) $LIBRARY_DEPS" -fi - -dnl browser needs a WASM assets stdlib bundle -AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], - [Emscripten/browser], [LIBRARY_DEPS="$LIBRARY_DEPS \$(WASM_STDLIB)"], -) - -AC_SUBST(STATIC_LIBPYTHON) -AC_SUBST(LIBRARY_DEPS) - # Check whether to disable test modules. Once set, setup.py will not build # test extension modules and "make install" will not install test suites. AC_MSG_CHECKING(for --disable-test-modules) From webhook-mailer at python.org Thu Jan 13 13:28:40 2022 From: webhook-mailer at python.org (vstinner) Date: Thu, 13 Jan 2022 18:28:40 -0000 Subject: [Python-checkins] bpo-46070: _PyGC_Fini() untracks objects (GH-30577) Message-ID: https://github.com/python/cpython/commit/1a4d1c1c9b08e75e88aeac90901920938f649832 commit: 1a4d1c1c9b08e75e88aeac90901920938f649832 branch: main author: Victor Stinner committer: vstinner date: 2022-01-13T19:28:32+01:00 summary: bpo-46070: _PyGC_Fini() untracks objects (GH-30577) Py_EndInterpreter() now explicitly untracks all objects currently tracked by the GC. 
Previously, if an object was used later by another interpreter, calling PyObject_GC_UnTrack() on the object crashed if the previous or the next object of the PyGC_Head structure became a dangling pointer. files: A Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst M Modules/gcmodule.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst new file mode 100644 index 0000000000000..4ed088f9898eb --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst @@ -0,0 +1,5 @@ +:c:func:`Py_EndInterpreter` now explicitly untracks all objects currently +tracked by the GC. Previously, if an object was used later by another +interpreter, calling :c:func:`PyObject_GC_UnTrack` on the object crashed if the +previous or the next object of the :c:type:`PyGC_Head` structure became a +dangling pointer. Patch by Victor Stinner. diff --git a/Modules/gcmodule.c b/Modules/gcmodule.c index 1808057a650e9..e22f031f57490 100644 --- a/Modules/gcmodule.c +++ b/Modules/gcmodule.c @@ -2161,12 +2161,36 @@ _PyGC_DumpShutdownStats(PyInterpreterState *interp) } } + +static void +gc_fini_untrack(PyGC_Head *list) +{ + PyGC_Head *gc; + for (gc = GC_NEXT(list); gc != list; gc = GC_NEXT(list)) { + PyObject *op = FROM_GC(gc); + _PyObject_GC_UNTRACK(op); + } +} + + void _PyGC_Fini(PyInterpreterState *interp) { GCState *gcstate = &interp->gc; Py_CLEAR(gcstate->garbage); Py_CLEAR(gcstate->callbacks); + + if (!_Py_IsMainInterpreter(interp)) { + // bpo-46070: Explicitly untrack all objects currently tracked by the + // GC. Otherwise, if an object is used later by another interpreter, + // calling PyObject_GC_UnTrack() on the object crashs if the previous + // or the next object of the PyGC_Head structure became a dangling + // pointer. + for (int i = 0; i < NUM_GENERATIONS; i++) { + PyGC_Head *gen = GEN_HEAD(gcstate, i); + gc_fini_untrack(gen); + } + } } /* for debugging */ From webhook-mailer at python.org Thu Jan 13 13:50:19 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 13 Jan 2022 18:50:19 -0000 Subject: [Python-checkins] bpo-46070: _PyGC_Fini() untracks objects (GH-30577) Message-ID: https://github.com/python/cpython/commit/e6bb17fe29713368e1fd93d9ac9611017c4f570c commit: e6bb17fe29713368e1fd93d9ac9611017c4f570c branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-13T10:50:09-08:00 summary: bpo-46070: _PyGC_Fini() untracks objects (GH-30577) Py_EndInterpreter() now explicitly untracks all objects currently tracked by the GC. Previously, if an object was used later by another interpreter, calling PyObject_GC_UnTrack() on the object crashed if the previous or the next object of the PyGC_Head structure became a dangling pointer. 
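
"Tracked" here refers to container objects registered with the cyclic garbage collector; gc.is_tracked() shows the distinction (a plain illustration of the terminology only, not of the subinterpreter scenario itself)::

    >>> import gc
    >>> gc.is_tracked(42)        # atomic objects never enter the GC lists
    False
    >>> gc.is_tracked([1, 2])    # containers do, and are what _PyGC_Fini() now untracks
    True
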
(cherry picked from commit 1a4d1c1c9b08e75e88aeac90901920938f649832) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst M Modules/gcmodule.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst new file mode 100644 index 0000000000000..4ed088f9898eb --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst @@ -0,0 +1,5 @@ +:c:func:`Py_EndInterpreter` now explicitly untracks all objects currently +tracked by the GC. Previously, if an object was used later by another +interpreter, calling :c:func:`PyObject_GC_UnTrack` on the object crashed if the +previous or the next object of the :c:type:`PyGC_Head` structure became a +dangling pointer. Patch by Victor Stinner. diff --git a/Modules/gcmodule.c b/Modules/gcmodule.c index e5e5aa3287b0d..805a159d53d60 100644 --- a/Modules/gcmodule.c +++ b/Modules/gcmodule.c @@ -2162,12 +2162,36 @@ _PyGC_DumpShutdownStats(PyInterpreterState *interp) } } + +static void +gc_fini_untrack(PyGC_Head *list) +{ + PyGC_Head *gc; + for (gc = GC_NEXT(list); gc != list; gc = GC_NEXT(list)) { + PyObject *op = FROM_GC(gc); + _PyObject_GC_UNTRACK(op); + } +} + + void _PyGC_Fini(PyInterpreterState *interp) { GCState *gcstate = &interp->gc; Py_CLEAR(gcstate->garbage); Py_CLEAR(gcstate->callbacks); + + if (!_Py_IsMainInterpreter(interp)) { + // bpo-46070: Explicitly untrack all objects currently tracked by the + // GC. Otherwise, if an object is used later by another interpreter, + // calling PyObject_GC_UnTrack() on the object crashs if the previous + // or the next object of the PyGC_Head structure became a dangling + // pointer. + for (int i = 0; i < NUM_GENERATIONS; i++) { + PyGC_Head *gen = GEN_HEAD(gcstate, i); + gc_fini_untrack(gen); + } + } } /* for debugging */ From webhook-mailer at python.org Thu Jan 13 14:12:59 2022 From: webhook-mailer at python.org (vstinner) Date: Thu, 13 Jan 2022 19:12:59 -0000 Subject: [Python-checkins] bpo-46070: _PyGC_Fini() untracks objects (GH-30577) (GH-30580) Message-ID: https://github.com/python/cpython/commit/52937c26adc35350ca0402070160cf6dc838f359 commit: 52937c26adc35350ca0402070160cf6dc838f359 branch: 3.9 author: Victor Stinner committer: vstinner date: 2022-01-13T20:12:50+01:00 summary: bpo-46070: _PyGC_Fini() untracks objects (GH-30577) (GH-30580) Py_EndInterpreter() now explicitly untracks all objects currently tracked by the GC. Previously, if an object was used later by another interpreter, calling PyObject_GC_UnTrack() on the object crashed if the previous or the next object of the PyGC_Head structure became a dangling pointer. (cherry picked from commit 1a4d1c1c9b08e75e88aeac90901920938f649832) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst M Modules/gcmodule.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst new file mode 100644 index 0000000000000..4ed088f9898eb --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst @@ -0,0 +1,5 @@ +:c:func:`Py_EndInterpreter` now explicitly untracks all objects currently +tracked by the GC. 
Previously, if an object was used later by another +interpreter, calling :c:func:`PyObject_GC_UnTrack` on the object crashed if the +previous or the next object of the :c:type:`PyGC_Head` structure became a +dangling pointer. Patch by Victor Stinner. diff --git a/Modules/gcmodule.c b/Modules/gcmodule.c index 2c6ef9252eae8..3cf1a00b003b3 100644 --- a/Modules/gcmodule.c +++ b/Modules/gcmodule.c @@ -2149,12 +2149,36 @@ _PyGC_DumpShutdownStats(PyThreadState *tstate) } } + +static void +gc_fini_untrack(PyGC_Head *list) +{ + PyGC_Head *gc; + for (gc = GC_NEXT(list); gc != list; gc = GC_NEXT(list)) { + PyObject *op = FROM_GC(gc); + _PyObject_GC_UNTRACK(op); + } +} + + void _PyGC_Fini(PyThreadState *tstate) { GCState *gcstate = &tstate->interp->gc; Py_CLEAR(gcstate->garbage); Py_CLEAR(gcstate->callbacks); + + if (!_Py_IsMainInterpreter(tstate)) { + // bpo-46070: Explicitly untrack all objects currently tracked by the + // GC. Otherwise, if an object is used later by another interpreter, + // calling PyObject_GC_UnTrack() on the object crashs if the previous + // or the next object of the PyGC_Head structure became a dangling + // pointer. + for (int i = 0; i < NUM_GENERATIONS; i++) { + PyGC_Head *gen = GEN_HEAD(gcstate, i); + gc_fini_untrack(gen); + } + } } /* for debugging */ From webhook-mailer at python.org Thu Jan 13 15:48:13 2022 From: webhook-mailer at python.org (tiran) Date: Thu, 13 Jan 2022 20:48:13 -0000 Subject: [Python-checkins] bpo-40479: Fix typo, flag must be set for OpenSSL < 3.0.0 (GH-30584) Message-ID: https://github.com/python/cpython/commit/276c234ce0fa6732237f1b187989837324d9dea3 commit: 276c234ce0fa6732237f1b187989837324d9dea3 branch: main author: Christian Heimes committer: tiran date: 2022-01-13T21:47:42+01:00 summary: bpo-40479: Fix typo, flag must be set for OpenSSL < 3.0.0 (GH-30584) files: M Modules/_hashopenssl.c diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c index eeea61aeceb54..fb155b2e62253 100644 --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -883,7 +883,7 @@ py_evp_fromname(PyObject *module, const char *digestname, PyObject *data_obj, goto exit; } -#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER >= 0x30000000L +#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER < 0x30000000L // In OpenSSL 1.1.1 the non FIPS allowed flag is context specific while // in 3.0.0 it is a different EVP_MD provider. 
if (!usedforsecurity) { From webhook-mailer at python.org Thu Jan 13 16:08:55 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 13 Jan 2022 21:08:55 -0000 Subject: [Python-checkins] [3.10] bpo-40479: Fix typo, flag must be set for OpenSSL < 3.0.0 (GH-30584) (GH-30585) Message-ID: https://github.com/python/cpython/commit/47422a852de14a8ec11d058136c7c864d2cc7fc9 commit: 47422a852de14a8ec11d058136c7c864d2cc7fc9 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-13T13:08:47-08:00 summary: [3.10] bpo-40479: Fix typo, flag must be set for OpenSSL < 3.0.0 (GH-30584) (GH-30585) (cherry picked from commit 276c234ce0fa6732237f1b187989837324d9dea3) Co-authored-by: Christian Heimes Automerge-Triggered-By: GH:tiran files: M Modules/_hashopenssl.c diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c index cb8460ab2fcf2..2eaa5f7d85d80 100644 --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -883,7 +883,7 @@ py_evp_fromname(PyObject *module, const char *digestname, PyObject *data_obj, goto exit; } -#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER >= 0x30000000L +#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER < 0x30000000L // In OpenSSL 1.1.1 the non FIPS allowed flag is context specific while // in 3.0.0 it is a different EVP_MD provider. if (!usedforsecurity) { From webhook-mailer at python.org Thu Jan 13 16:19:54 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 13 Jan 2022 21:19:54 -0000 Subject: [Python-checkins] bpo-40479: Fix typo, flag must be set for OpenSSL < 3.0.0 (GH-30584) Message-ID: https://github.com/python/cpython/commit/537f16adfa31b5b1fe9d656d571d1e10fb115351 commit: 537f16adfa31b5b1fe9d656d571d1e10fb115351 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-13T13:19:50-08:00 summary: bpo-40479: Fix typo, flag must be set for OpenSSL < 3.0.0 (GH-30584) (cherry picked from commit 276c234ce0fa6732237f1b187989837324d9dea3) Co-authored-by: Christian Heimes files: M Modules/_hashopenssl.c diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c index 4f117b3afae7c..a488945082187 100644 --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -900,7 +900,7 @@ py_evp_fromname(PyObject *module, const char *digestname, PyObject *data_obj, goto exit; } -#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER >= 0x30000000L +#if defined(EVP_MD_CTX_FLAG_NON_FIPS_ALLOW) && OPENSSL_VERSION_NUMBER < 0x30000000L // In OpenSSL 1.1.1 the non FIPS allowed flag is context specific while // in 3.0.0 it is a different EVP_MD provider. if (!usedforsecurity) { From webhook-mailer at python.org Thu Jan 13 17:54:44 2022 From: webhook-mailer at python.org (ericsnowcurrently) Date: Thu, 13 Jan 2022 22:54:44 -0000 Subject: [Python-checkins] bpo-46370: Move the static initializer for _PyRuntime to its own header file. (gh-30587) Message-ID: https://github.com/python/cpython/commit/bc02eac9d2cb36faffc5027b7ce09e6dd0922a7f commit: bc02eac9d2cb36faffc5027b7ce09e6dd0922a7f branch: main author: Eric Snow committer: ericsnowcurrently date: 2022-01-13T15:54:36-07:00 summary: bpo-46370: Move the static initializer for _PyRuntime to its own header file. 
(gh-30587) https://bugs.python.org/issue46370 files: A Include/internal/pycore_runtime_init.h M Include/internal/pycore_global_objects.h M Include/internal/pycore_object.h M Include/internal/pycore_runtime.h M Makefile.pre.in M PCbuild/pythoncore.vcxproj M PCbuild/pythoncore.vcxproj.filters M Python/pylifecycle.c M Python/pystate.c diff --git a/Include/internal/pycore_global_objects.h b/Include/internal/pycore_global_objects.h index de7ab9b53eb26..acf0b2aed5ae4 100644 --- a/Include/internal/pycore_global_objects.h +++ b/Include/internal/pycore_global_objects.h @@ -9,48 +9,10 @@ extern "C" { #endif -#define _PyObject_IMMORTAL_INIT(type) \ - { \ - .ob_refcnt = 999999999, \ - .ob_type = type, \ - } -#define _PyVarObject_IMMORTAL_INIT(type, size) \ - { \ - .ob_base = _PyObject_IMMORTAL_INIT(type), \ - .ob_size = size, \ - } - - -/* int objects */ - +// These would be in pycore_long.h if it weren't for an include cycle. #define _PY_NSMALLPOSINTS 257 #define _PY_NSMALLNEGINTS 5 -#define _PyLong_DIGIT_INIT(val) \ - { \ - _PyVarObject_IMMORTAL_INIT(&PyLong_Type, \ - ((val) == 0 ? 0 : ((val) > 0 ? 1 : -1))), \ - .ob_digit = { ((val) >= 0 ? (val) : -(val)) }, \ - } - - -/* bytes objects */ - -#define _PyBytes_SIMPLE_INIT(CH, LEN) \ - { \ - _PyVarObject_IMMORTAL_INIT(&PyBytes_Type, LEN), \ - .ob_shash = -1, \ - .ob_sval = { CH }, \ - } -#define _PyBytes_CHAR_INIT(CH) \ - { \ - _PyBytes_SIMPLE_INIT(CH, 1) \ - } - - -/********************** - * the global objects * - **********************/ // Only immutable objects should be considered runtime-global. // All others must be per-interpreter. @@ -77,535 +39,6 @@ struct _Py_global_objects { } singletons; }; -#define _Py_global_objects_INIT { \ - .singletons = { \ - .small_ints = { \ - _PyLong_DIGIT_INIT(-5), \ - _PyLong_DIGIT_INIT(-4), \ - _PyLong_DIGIT_INIT(-3), \ - _PyLong_DIGIT_INIT(-2), \ - _PyLong_DIGIT_INIT(-1), \ - _PyLong_DIGIT_INIT(0), \ - _PyLong_DIGIT_INIT(1), \ - _PyLong_DIGIT_INIT(2), \ - _PyLong_DIGIT_INIT(3), \ - _PyLong_DIGIT_INIT(4), \ - _PyLong_DIGIT_INIT(5), \ - _PyLong_DIGIT_INIT(6), \ - _PyLong_DIGIT_INIT(7), \ - _PyLong_DIGIT_INIT(8), \ - _PyLong_DIGIT_INIT(9), \ - _PyLong_DIGIT_INIT(10), \ - _PyLong_DIGIT_INIT(11), \ - _PyLong_DIGIT_INIT(12), \ - _PyLong_DIGIT_INIT(13), \ - _PyLong_DIGIT_INIT(14), \ - _PyLong_DIGIT_INIT(15), \ - _PyLong_DIGIT_INIT(16), \ - _PyLong_DIGIT_INIT(17), \ - _PyLong_DIGIT_INIT(18), \ - _PyLong_DIGIT_INIT(19), \ - _PyLong_DIGIT_INIT(20), \ - _PyLong_DIGIT_INIT(21), \ - _PyLong_DIGIT_INIT(22), \ - _PyLong_DIGIT_INIT(23), \ - _PyLong_DIGIT_INIT(24), \ - _PyLong_DIGIT_INIT(25), \ - _PyLong_DIGIT_INIT(26), \ - _PyLong_DIGIT_INIT(27), \ - _PyLong_DIGIT_INIT(28), \ - _PyLong_DIGIT_INIT(29), \ - _PyLong_DIGIT_INIT(30), \ - _PyLong_DIGIT_INIT(31), \ - _PyLong_DIGIT_INIT(32), \ - _PyLong_DIGIT_INIT(33), \ - _PyLong_DIGIT_INIT(34), \ - _PyLong_DIGIT_INIT(35), \ - _PyLong_DIGIT_INIT(36), \ - _PyLong_DIGIT_INIT(37), \ - _PyLong_DIGIT_INIT(38), \ - _PyLong_DIGIT_INIT(39), \ - _PyLong_DIGIT_INIT(40), \ - _PyLong_DIGIT_INIT(41), \ - _PyLong_DIGIT_INIT(42), \ - _PyLong_DIGIT_INIT(43), \ - _PyLong_DIGIT_INIT(44), \ - _PyLong_DIGIT_INIT(45), \ - _PyLong_DIGIT_INIT(46), \ - _PyLong_DIGIT_INIT(47), \ - _PyLong_DIGIT_INIT(48), \ - _PyLong_DIGIT_INIT(49), \ - _PyLong_DIGIT_INIT(50), \ - _PyLong_DIGIT_INIT(51), \ - _PyLong_DIGIT_INIT(52), \ - _PyLong_DIGIT_INIT(53), \ - _PyLong_DIGIT_INIT(54), \ - _PyLong_DIGIT_INIT(55), \ - _PyLong_DIGIT_INIT(56), \ - _PyLong_DIGIT_INIT(57), \ - _PyLong_DIGIT_INIT(58), \ - _PyLong_DIGIT_INIT(59), 
\ - _PyLong_DIGIT_INIT(60), \ - _PyLong_DIGIT_INIT(61), \ - _PyLong_DIGIT_INIT(62), \ - _PyLong_DIGIT_INIT(63), \ - _PyLong_DIGIT_INIT(64), \ - _PyLong_DIGIT_INIT(65), \ - _PyLong_DIGIT_INIT(66), \ - _PyLong_DIGIT_INIT(67), \ - _PyLong_DIGIT_INIT(68), \ - _PyLong_DIGIT_INIT(69), \ - _PyLong_DIGIT_INIT(70), \ - _PyLong_DIGIT_INIT(71), \ - _PyLong_DIGIT_INIT(72), \ - _PyLong_DIGIT_INIT(73), \ - _PyLong_DIGIT_INIT(74), \ - _PyLong_DIGIT_INIT(75), \ - _PyLong_DIGIT_INIT(76), \ - _PyLong_DIGIT_INIT(77), \ - _PyLong_DIGIT_INIT(78), \ - _PyLong_DIGIT_INIT(79), \ - _PyLong_DIGIT_INIT(80), \ - _PyLong_DIGIT_INIT(81), \ - _PyLong_DIGIT_INIT(82), \ - _PyLong_DIGIT_INIT(83), \ - _PyLong_DIGIT_INIT(84), \ - _PyLong_DIGIT_INIT(85), \ - _PyLong_DIGIT_INIT(86), \ - _PyLong_DIGIT_INIT(87), \ - _PyLong_DIGIT_INIT(88), \ - _PyLong_DIGIT_INIT(89), \ - _PyLong_DIGIT_INIT(90), \ - _PyLong_DIGIT_INIT(91), \ - _PyLong_DIGIT_INIT(92), \ - _PyLong_DIGIT_INIT(93), \ - _PyLong_DIGIT_INIT(94), \ - _PyLong_DIGIT_INIT(95), \ - _PyLong_DIGIT_INIT(96), \ - _PyLong_DIGIT_INIT(97), \ - _PyLong_DIGIT_INIT(98), \ - _PyLong_DIGIT_INIT(99), \ - _PyLong_DIGIT_INIT(100), \ - _PyLong_DIGIT_INIT(101), \ - _PyLong_DIGIT_INIT(102), \ - _PyLong_DIGIT_INIT(103), \ - _PyLong_DIGIT_INIT(104), \ - _PyLong_DIGIT_INIT(105), \ - _PyLong_DIGIT_INIT(106), \ - _PyLong_DIGIT_INIT(107), \ - _PyLong_DIGIT_INIT(108), \ - _PyLong_DIGIT_INIT(109), \ - _PyLong_DIGIT_INIT(110), \ - _PyLong_DIGIT_INIT(111), \ - _PyLong_DIGIT_INIT(112), \ - _PyLong_DIGIT_INIT(113), \ - _PyLong_DIGIT_INIT(114), \ - _PyLong_DIGIT_INIT(115), \ - _PyLong_DIGIT_INIT(116), \ - _PyLong_DIGIT_INIT(117), \ - _PyLong_DIGIT_INIT(118), \ - _PyLong_DIGIT_INIT(119), \ - _PyLong_DIGIT_INIT(120), \ - _PyLong_DIGIT_INIT(121), \ - _PyLong_DIGIT_INIT(122), \ - _PyLong_DIGIT_INIT(123), \ - _PyLong_DIGIT_INIT(124), \ - _PyLong_DIGIT_INIT(125), \ - _PyLong_DIGIT_INIT(126), \ - _PyLong_DIGIT_INIT(127), \ - _PyLong_DIGIT_INIT(128), \ - _PyLong_DIGIT_INIT(129), \ - _PyLong_DIGIT_INIT(130), \ - _PyLong_DIGIT_INIT(131), \ - _PyLong_DIGIT_INIT(132), \ - _PyLong_DIGIT_INIT(133), \ - _PyLong_DIGIT_INIT(134), \ - _PyLong_DIGIT_INIT(135), \ - _PyLong_DIGIT_INIT(136), \ - _PyLong_DIGIT_INIT(137), \ - _PyLong_DIGIT_INIT(138), \ - _PyLong_DIGIT_INIT(139), \ - _PyLong_DIGIT_INIT(140), \ - _PyLong_DIGIT_INIT(141), \ - _PyLong_DIGIT_INIT(142), \ - _PyLong_DIGIT_INIT(143), \ - _PyLong_DIGIT_INIT(144), \ - _PyLong_DIGIT_INIT(145), \ - _PyLong_DIGIT_INIT(146), \ - _PyLong_DIGIT_INIT(147), \ - _PyLong_DIGIT_INIT(148), \ - _PyLong_DIGIT_INIT(149), \ - _PyLong_DIGIT_INIT(150), \ - _PyLong_DIGIT_INIT(151), \ - _PyLong_DIGIT_INIT(152), \ - _PyLong_DIGIT_INIT(153), \ - _PyLong_DIGIT_INIT(154), \ - _PyLong_DIGIT_INIT(155), \ - _PyLong_DIGIT_INIT(156), \ - _PyLong_DIGIT_INIT(157), \ - _PyLong_DIGIT_INIT(158), \ - _PyLong_DIGIT_INIT(159), \ - _PyLong_DIGIT_INIT(160), \ - _PyLong_DIGIT_INIT(161), \ - _PyLong_DIGIT_INIT(162), \ - _PyLong_DIGIT_INIT(163), \ - _PyLong_DIGIT_INIT(164), \ - _PyLong_DIGIT_INIT(165), \ - _PyLong_DIGIT_INIT(166), \ - _PyLong_DIGIT_INIT(167), \ - _PyLong_DIGIT_INIT(168), \ - _PyLong_DIGIT_INIT(169), \ - _PyLong_DIGIT_INIT(170), \ - _PyLong_DIGIT_INIT(171), \ - _PyLong_DIGIT_INIT(172), \ - _PyLong_DIGIT_INIT(173), \ - _PyLong_DIGIT_INIT(174), \ - _PyLong_DIGIT_INIT(175), \ - _PyLong_DIGIT_INIT(176), \ - _PyLong_DIGIT_INIT(177), \ - _PyLong_DIGIT_INIT(178), \ - _PyLong_DIGIT_INIT(179), \ - _PyLong_DIGIT_INIT(180), \ - _PyLong_DIGIT_INIT(181), \ - _PyLong_DIGIT_INIT(182), \ - 
_PyLong_DIGIT_INIT(183), \ - _PyLong_DIGIT_INIT(184), \ - _PyLong_DIGIT_INIT(185), \ - _PyLong_DIGIT_INIT(186), \ - _PyLong_DIGIT_INIT(187), \ - _PyLong_DIGIT_INIT(188), \ - _PyLong_DIGIT_INIT(189), \ - _PyLong_DIGIT_INIT(190), \ - _PyLong_DIGIT_INIT(191), \ - _PyLong_DIGIT_INIT(192), \ - _PyLong_DIGIT_INIT(193), \ - _PyLong_DIGIT_INIT(194), \ - _PyLong_DIGIT_INIT(195), \ - _PyLong_DIGIT_INIT(196), \ - _PyLong_DIGIT_INIT(197), \ - _PyLong_DIGIT_INIT(198), \ - _PyLong_DIGIT_INIT(199), \ - _PyLong_DIGIT_INIT(200), \ - _PyLong_DIGIT_INIT(201), \ - _PyLong_DIGIT_INIT(202), \ - _PyLong_DIGIT_INIT(203), \ - _PyLong_DIGIT_INIT(204), \ - _PyLong_DIGIT_INIT(205), \ - _PyLong_DIGIT_INIT(206), \ - _PyLong_DIGIT_INIT(207), \ - _PyLong_DIGIT_INIT(208), \ - _PyLong_DIGIT_INIT(209), \ - _PyLong_DIGIT_INIT(210), \ - _PyLong_DIGIT_INIT(211), \ - _PyLong_DIGIT_INIT(212), \ - _PyLong_DIGIT_INIT(213), \ - _PyLong_DIGIT_INIT(214), \ - _PyLong_DIGIT_INIT(215), \ - _PyLong_DIGIT_INIT(216), \ - _PyLong_DIGIT_INIT(217), \ - _PyLong_DIGIT_INIT(218), \ - _PyLong_DIGIT_INIT(219), \ - _PyLong_DIGIT_INIT(220), \ - _PyLong_DIGIT_INIT(221), \ - _PyLong_DIGIT_INIT(222), \ - _PyLong_DIGIT_INIT(223), \ - _PyLong_DIGIT_INIT(224), \ - _PyLong_DIGIT_INIT(225), \ - _PyLong_DIGIT_INIT(226), \ - _PyLong_DIGIT_INIT(227), \ - _PyLong_DIGIT_INIT(228), \ - _PyLong_DIGIT_INIT(229), \ - _PyLong_DIGIT_INIT(230), \ - _PyLong_DIGIT_INIT(231), \ - _PyLong_DIGIT_INIT(232), \ - _PyLong_DIGIT_INIT(233), \ - _PyLong_DIGIT_INIT(234), \ - _PyLong_DIGIT_INIT(235), \ - _PyLong_DIGIT_INIT(236), \ - _PyLong_DIGIT_INIT(237), \ - _PyLong_DIGIT_INIT(238), \ - _PyLong_DIGIT_INIT(239), \ - _PyLong_DIGIT_INIT(240), \ - _PyLong_DIGIT_INIT(241), \ - _PyLong_DIGIT_INIT(242), \ - _PyLong_DIGIT_INIT(243), \ - _PyLong_DIGIT_INIT(244), \ - _PyLong_DIGIT_INIT(245), \ - _PyLong_DIGIT_INIT(246), \ - _PyLong_DIGIT_INIT(247), \ - _PyLong_DIGIT_INIT(248), \ - _PyLong_DIGIT_INIT(249), \ - _PyLong_DIGIT_INIT(250), \ - _PyLong_DIGIT_INIT(251), \ - _PyLong_DIGIT_INIT(252), \ - _PyLong_DIGIT_INIT(253), \ - _PyLong_DIGIT_INIT(254), \ - _PyLong_DIGIT_INIT(255), \ - _PyLong_DIGIT_INIT(256), \ - }, \ - \ - .bytes_empty = _PyBytes_SIMPLE_INIT(0, 0), \ - .bytes_characters = { \ - _PyBytes_CHAR_INIT(0), \ - _PyBytes_CHAR_INIT(1), \ - _PyBytes_CHAR_INIT(2), \ - _PyBytes_CHAR_INIT(3), \ - _PyBytes_CHAR_INIT(4), \ - _PyBytes_CHAR_INIT(5), \ - _PyBytes_CHAR_INIT(6), \ - _PyBytes_CHAR_INIT(7), \ - _PyBytes_CHAR_INIT(8), \ - _PyBytes_CHAR_INIT(9), \ - _PyBytes_CHAR_INIT(10), \ - _PyBytes_CHAR_INIT(11), \ - _PyBytes_CHAR_INIT(12), \ - _PyBytes_CHAR_INIT(13), \ - _PyBytes_CHAR_INIT(14), \ - _PyBytes_CHAR_INIT(15), \ - _PyBytes_CHAR_INIT(16), \ - _PyBytes_CHAR_INIT(17), \ - _PyBytes_CHAR_INIT(18), \ - _PyBytes_CHAR_INIT(19), \ - _PyBytes_CHAR_INIT(20), \ - _PyBytes_CHAR_INIT(21), \ - _PyBytes_CHAR_INIT(22), \ - _PyBytes_CHAR_INIT(23), \ - _PyBytes_CHAR_INIT(24), \ - _PyBytes_CHAR_INIT(25), \ - _PyBytes_CHAR_INIT(26), \ - _PyBytes_CHAR_INIT(27), \ - _PyBytes_CHAR_INIT(28), \ - _PyBytes_CHAR_INIT(29), \ - _PyBytes_CHAR_INIT(30), \ - _PyBytes_CHAR_INIT(31), \ - _PyBytes_CHAR_INIT(32), \ - _PyBytes_CHAR_INIT(33), \ - _PyBytes_CHAR_INIT(34), \ - _PyBytes_CHAR_INIT(35), \ - _PyBytes_CHAR_INIT(36), \ - _PyBytes_CHAR_INIT(37), \ - _PyBytes_CHAR_INIT(38), \ - _PyBytes_CHAR_INIT(39), \ - _PyBytes_CHAR_INIT(40), \ - _PyBytes_CHAR_INIT(41), \ - _PyBytes_CHAR_INIT(42), \ - _PyBytes_CHAR_INIT(43), \ - _PyBytes_CHAR_INIT(44), \ - _PyBytes_CHAR_INIT(45), \ - _PyBytes_CHAR_INIT(46), \ - 
_PyBytes_CHAR_INIT(47), \ - _PyBytes_CHAR_INIT(48), \ - _PyBytes_CHAR_INIT(49), \ - _PyBytes_CHAR_INIT(50), \ - _PyBytes_CHAR_INIT(51), \ - _PyBytes_CHAR_INIT(52), \ - _PyBytes_CHAR_INIT(53), \ - _PyBytes_CHAR_INIT(54), \ - _PyBytes_CHAR_INIT(55), \ - _PyBytes_CHAR_INIT(56), \ - _PyBytes_CHAR_INIT(57), \ - _PyBytes_CHAR_INIT(58), \ - _PyBytes_CHAR_INIT(59), \ - _PyBytes_CHAR_INIT(60), \ - _PyBytes_CHAR_INIT(61), \ - _PyBytes_CHAR_INIT(62), \ - _PyBytes_CHAR_INIT(63), \ - _PyBytes_CHAR_INIT(64), \ - _PyBytes_CHAR_INIT(65), \ - _PyBytes_CHAR_INIT(66), \ - _PyBytes_CHAR_INIT(67), \ - _PyBytes_CHAR_INIT(68), \ - _PyBytes_CHAR_INIT(69), \ - _PyBytes_CHAR_INIT(70), \ - _PyBytes_CHAR_INIT(71), \ - _PyBytes_CHAR_INIT(72), \ - _PyBytes_CHAR_INIT(73), \ - _PyBytes_CHAR_INIT(74), \ - _PyBytes_CHAR_INIT(75), \ - _PyBytes_CHAR_INIT(76), \ - _PyBytes_CHAR_INIT(77), \ - _PyBytes_CHAR_INIT(78), \ - _PyBytes_CHAR_INIT(79), \ - _PyBytes_CHAR_INIT(80), \ - _PyBytes_CHAR_INIT(81), \ - _PyBytes_CHAR_INIT(82), \ - _PyBytes_CHAR_INIT(83), \ - _PyBytes_CHAR_INIT(84), \ - _PyBytes_CHAR_INIT(85), \ - _PyBytes_CHAR_INIT(86), \ - _PyBytes_CHAR_INIT(87), \ - _PyBytes_CHAR_INIT(88), \ - _PyBytes_CHAR_INIT(89), \ - _PyBytes_CHAR_INIT(90), \ - _PyBytes_CHAR_INIT(91), \ - _PyBytes_CHAR_INIT(92), \ - _PyBytes_CHAR_INIT(93), \ - _PyBytes_CHAR_INIT(94), \ - _PyBytes_CHAR_INIT(95), \ - _PyBytes_CHAR_INIT(96), \ - _PyBytes_CHAR_INIT(97), \ - _PyBytes_CHAR_INIT(98), \ - _PyBytes_CHAR_INIT(99), \ - _PyBytes_CHAR_INIT(100), \ - _PyBytes_CHAR_INIT(101), \ - _PyBytes_CHAR_INIT(102), \ - _PyBytes_CHAR_INIT(103), \ - _PyBytes_CHAR_INIT(104), \ - _PyBytes_CHAR_INIT(105), \ - _PyBytes_CHAR_INIT(106), \ - _PyBytes_CHAR_INIT(107), \ - _PyBytes_CHAR_INIT(108), \ - _PyBytes_CHAR_INIT(109), \ - _PyBytes_CHAR_INIT(110), \ - _PyBytes_CHAR_INIT(111), \ - _PyBytes_CHAR_INIT(112), \ - _PyBytes_CHAR_INIT(113), \ - _PyBytes_CHAR_INIT(114), \ - _PyBytes_CHAR_INIT(115), \ - _PyBytes_CHAR_INIT(116), \ - _PyBytes_CHAR_INIT(117), \ - _PyBytes_CHAR_INIT(118), \ - _PyBytes_CHAR_INIT(119), \ - _PyBytes_CHAR_INIT(120), \ - _PyBytes_CHAR_INIT(121), \ - _PyBytes_CHAR_INIT(122), \ - _PyBytes_CHAR_INIT(123), \ - _PyBytes_CHAR_INIT(124), \ - _PyBytes_CHAR_INIT(125), \ - _PyBytes_CHAR_INIT(126), \ - _PyBytes_CHAR_INIT(127), \ - _PyBytes_CHAR_INIT(128), \ - _PyBytes_CHAR_INIT(129), \ - _PyBytes_CHAR_INIT(130), \ - _PyBytes_CHAR_INIT(131), \ - _PyBytes_CHAR_INIT(132), \ - _PyBytes_CHAR_INIT(133), \ - _PyBytes_CHAR_INIT(134), \ - _PyBytes_CHAR_INIT(135), \ - _PyBytes_CHAR_INIT(136), \ - _PyBytes_CHAR_INIT(137), \ - _PyBytes_CHAR_INIT(138), \ - _PyBytes_CHAR_INIT(139), \ - _PyBytes_CHAR_INIT(140), \ - _PyBytes_CHAR_INIT(141), \ - _PyBytes_CHAR_INIT(142), \ - _PyBytes_CHAR_INIT(143), \ - _PyBytes_CHAR_INIT(144), \ - _PyBytes_CHAR_INIT(145), \ - _PyBytes_CHAR_INIT(146), \ - _PyBytes_CHAR_INIT(147), \ - _PyBytes_CHAR_INIT(148), \ - _PyBytes_CHAR_INIT(149), \ - _PyBytes_CHAR_INIT(150), \ - _PyBytes_CHAR_INIT(151), \ - _PyBytes_CHAR_INIT(152), \ - _PyBytes_CHAR_INIT(153), \ - _PyBytes_CHAR_INIT(154), \ - _PyBytes_CHAR_INIT(155), \ - _PyBytes_CHAR_INIT(156), \ - _PyBytes_CHAR_INIT(157), \ - _PyBytes_CHAR_INIT(158), \ - _PyBytes_CHAR_INIT(159), \ - _PyBytes_CHAR_INIT(160), \ - _PyBytes_CHAR_INIT(161), \ - _PyBytes_CHAR_INIT(162), \ - _PyBytes_CHAR_INIT(163), \ - _PyBytes_CHAR_INIT(164), \ - _PyBytes_CHAR_INIT(165), \ - _PyBytes_CHAR_INIT(166), \ - _PyBytes_CHAR_INIT(167), \ - _PyBytes_CHAR_INIT(168), \ - _PyBytes_CHAR_INIT(169), \ - _PyBytes_CHAR_INIT(170), \ - 
_PyBytes_CHAR_INIT(171), \ - _PyBytes_CHAR_INIT(172), \ - _PyBytes_CHAR_INIT(173), \ - _PyBytes_CHAR_INIT(174), \ - _PyBytes_CHAR_INIT(175), \ - _PyBytes_CHAR_INIT(176), \ - _PyBytes_CHAR_INIT(177), \ - _PyBytes_CHAR_INIT(178), \ - _PyBytes_CHAR_INIT(179), \ - _PyBytes_CHAR_INIT(180), \ - _PyBytes_CHAR_INIT(181), \ - _PyBytes_CHAR_INIT(182), \ - _PyBytes_CHAR_INIT(183), \ - _PyBytes_CHAR_INIT(184), \ - _PyBytes_CHAR_INIT(185), \ - _PyBytes_CHAR_INIT(186), \ - _PyBytes_CHAR_INIT(187), \ - _PyBytes_CHAR_INIT(188), \ - _PyBytes_CHAR_INIT(189), \ - _PyBytes_CHAR_INIT(190), \ - _PyBytes_CHAR_INIT(191), \ - _PyBytes_CHAR_INIT(192), \ - _PyBytes_CHAR_INIT(193), \ - _PyBytes_CHAR_INIT(194), \ - _PyBytes_CHAR_INIT(195), \ - _PyBytes_CHAR_INIT(196), \ - _PyBytes_CHAR_INIT(197), \ - _PyBytes_CHAR_INIT(198), \ - _PyBytes_CHAR_INIT(199), \ - _PyBytes_CHAR_INIT(200), \ - _PyBytes_CHAR_INIT(201), \ - _PyBytes_CHAR_INIT(202), \ - _PyBytes_CHAR_INIT(203), \ - _PyBytes_CHAR_INIT(204), \ - _PyBytes_CHAR_INIT(205), \ - _PyBytes_CHAR_INIT(206), \ - _PyBytes_CHAR_INIT(207), \ - _PyBytes_CHAR_INIT(208), \ - _PyBytes_CHAR_INIT(209), \ - _PyBytes_CHAR_INIT(210), \ - _PyBytes_CHAR_INIT(211), \ - _PyBytes_CHAR_INIT(212), \ - _PyBytes_CHAR_INIT(213), \ - _PyBytes_CHAR_INIT(214), \ - _PyBytes_CHAR_INIT(215), \ - _PyBytes_CHAR_INIT(216), \ - _PyBytes_CHAR_INIT(217), \ - _PyBytes_CHAR_INIT(218), \ - _PyBytes_CHAR_INIT(219), \ - _PyBytes_CHAR_INIT(220), \ - _PyBytes_CHAR_INIT(221), \ - _PyBytes_CHAR_INIT(222), \ - _PyBytes_CHAR_INIT(223), \ - _PyBytes_CHAR_INIT(224), \ - _PyBytes_CHAR_INIT(225), \ - _PyBytes_CHAR_INIT(226), \ - _PyBytes_CHAR_INIT(227), \ - _PyBytes_CHAR_INIT(228), \ - _PyBytes_CHAR_INIT(229), \ - _PyBytes_CHAR_INIT(230), \ - _PyBytes_CHAR_INIT(231), \ - _PyBytes_CHAR_INIT(232), \ - _PyBytes_CHAR_INIT(233), \ - _PyBytes_CHAR_INIT(234), \ - _PyBytes_CHAR_INIT(235), \ - _PyBytes_CHAR_INIT(236), \ - _PyBytes_CHAR_INIT(237), \ - _PyBytes_CHAR_INIT(238), \ - _PyBytes_CHAR_INIT(239), \ - _PyBytes_CHAR_INIT(240), \ - _PyBytes_CHAR_INIT(241), \ - _PyBytes_CHAR_INIT(242), \ - _PyBytes_CHAR_INIT(243), \ - _PyBytes_CHAR_INIT(244), \ - _PyBytes_CHAR_INIT(245), \ - _PyBytes_CHAR_INIT(246), \ - _PyBytes_CHAR_INIT(247), \ - _PyBytes_CHAR_INIT(248), \ - _PyBytes_CHAR_INIT(249), \ - _PyBytes_CHAR_INIT(250), \ - _PyBytes_CHAR_INIT(251), \ - _PyBytes_CHAR_INIT(252), \ - _PyBytes_CHAR_INIT(253), \ - _PyBytes_CHAR_INIT(254), \ - _PyBytes_CHAR_INIT(255), \ - }, \ - }, \ -} - #ifdef __cplusplus } diff --git a/Include/internal/pycore_object.h b/Include/internal/pycore_object.h index 9041a4dc8a3ce..0348563218072 100644 --- a/Include/internal/pycore_object.h +++ b/Include/internal/pycore_object.h @@ -12,6 +12,19 @@ extern "C" { #include "pycore_interp.h" // PyInterpreterState.gc #include "pycore_pystate.h" // _PyInterpreterState_GET() + +#define _PyObject_IMMORTAL_INIT(type) \ + { \ + .ob_refcnt = 999999999, \ + .ob_type = type, \ + } +#define _PyVarObject_IMMORTAL_INIT(type, size) \ + { \ + .ob_base = _PyObject_IMMORTAL_INIT(type), \ + .ob_size = size, \ + } + + PyAPI_FUNC(int) _PyType_CheckConsistency(PyTypeObject *type); PyAPI_FUNC(int) _PyDict_CheckConsistency(PyObject *mp, int check_content); diff --git a/Include/internal/pycore_runtime.h b/Include/internal/pycore_runtime.h index a66a3cf3a3944..038e6f8263fae 100644 --- a/Include/internal/pycore_runtime.h +++ b/Include/internal/pycore_runtime.h @@ -148,21 +148,6 @@ typedef struct pyruntimestate { PyInterpreterState _main_interpreter; } _PyRuntimeState; -#define 
_PyThreadState_INIT \ - { \ - ._static = 1, \ - } -#define _PyInterpreterState_INIT \ - { \ - ._static = 1, \ - ._initial_thread = _PyThreadState_INIT, \ - } -#define _PyRuntimeState_INIT \ - { \ - .global_objects = _Py_global_objects_INIT, \ - ._main_interpreter = _PyInterpreterState_INIT, \ - } - /* other API */ diff --git a/Include/internal/pycore_runtime_init.h b/Include/internal/pycore_runtime_init.h new file mode 100644 index 0000000000000..e35c696610b94 --- /dev/null +++ b/Include/internal/pycore_runtime_init.h @@ -0,0 +1,590 @@ +#ifndef Py_INTERNAL_RUNTIME_INIT_H +#define Py_INTERNAL_RUNTIME_INIT_H +#ifdef __cplusplus +extern "C" { +#endif + +#ifndef Py_BUILD_CORE +# error "this header requires Py_BUILD_CORE define" +#endif + +#include "pycore_object.h" + + +/* The static initializers defined here should only be used + in the runtime init code (in pystate.c and pylifecycle.c). */ + + +#define _PyRuntimeState_INIT \ + { \ + .global_objects = _Py_global_objects_INIT, \ + ._main_interpreter = _PyInterpreterState_INIT, \ + } + +#define _PyInterpreterState_INIT \ + { \ + ._static = 1, \ + ._initial_thread = _PyThreadState_INIT, \ + } + +#define _PyThreadState_INIT \ + { \ + ._static = 1, \ + } + + +// global objects + +#define _PyLong_DIGIT_INIT(val) \ + { \ + _PyVarObject_IMMORTAL_INIT(&PyLong_Type, \ + ((val) == 0 ? 0 : ((val) > 0 ? 1 : -1))), \ + .ob_digit = { ((val) >= 0 ? (val) : -(val)) }, \ + } + + +#define _PyBytes_SIMPLE_INIT(CH, LEN) \ + { \ + _PyVarObject_IMMORTAL_INIT(&PyBytes_Type, LEN), \ + .ob_shash = -1, \ + .ob_sval = { CH }, \ + } +#define _PyBytes_CHAR_INIT(CH) \ + { \ + _PyBytes_SIMPLE_INIT(CH, 1) \ + } + +#define _Py_global_objects_INIT { \ + .singletons = { \ + .small_ints = { \ + _PyLong_DIGIT_INIT(-5), \ + _PyLong_DIGIT_INIT(-4), \ + _PyLong_DIGIT_INIT(-3), \ + _PyLong_DIGIT_INIT(-2), \ + _PyLong_DIGIT_INIT(-1), \ + _PyLong_DIGIT_INIT(0), \ + _PyLong_DIGIT_INIT(1), \ + _PyLong_DIGIT_INIT(2), \ + _PyLong_DIGIT_INIT(3), \ + _PyLong_DIGIT_INIT(4), \ + _PyLong_DIGIT_INIT(5), \ + _PyLong_DIGIT_INIT(6), \ + _PyLong_DIGIT_INIT(7), \ + _PyLong_DIGIT_INIT(8), \ + _PyLong_DIGIT_INIT(9), \ + _PyLong_DIGIT_INIT(10), \ + _PyLong_DIGIT_INIT(11), \ + _PyLong_DIGIT_INIT(12), \ + _PyLong_DIGIT_INIT(13), \ + _PyLong_DIGIT_INIT(14), \ + _PyLong_DIGIT_INIT(15), \ + _PyLong_DIGIT_INIT(16), \ + _PyLong_DIGIT_INIT(17), \ + _PyLong_DIGIT_INIT(18), \ + _PyLong_DIGIT_INIT(19), \ + _PyLong_DIGIT_INIT(20), \ + _PyLong_DIGIT_INIT(21), \ + _PyLong_DIGIT_INIT(22), \ + _PyLong_DIGIT_INIT(23), \ + _PyLong_DIGIT_INIT(24), \ + _PyLong_DIGIT_INIT(25), \ + _PyLong_DIGIT_INIT(26), \ + _PyLong_DIGIT_INIT(27), \ + _PyLong_DIGIT_INIT(28), \ + _PyLong_DIGIT_INIT(29), \ + _PyLong_DIGIT_INIT(30), \ + _PyLong_DIGIT_INIT(31), \ + _PyLong_DIGIT_INIT(32), \ + _PyLong_DIGIT_INIT(33), \ + _PyLong_DIGIT_INIT(34), \ + _PyLong_DIGIT_INIT(35), \ + _PyLong_DIGIT_INIT(36), \ + _PyLong_DIGIT_INIT(37), \ + _PyLong_DIGIT_INIT(38), \ + _PyLong_DIGIT_INIT(39), \ + _PyLong_DIGIT_INIT(40), \ + _PyLong_DIGIT_INIT(41), \ + _PyLong_DIGIT_INIT(42), \ + _PyLong_DIGIT_INIT(43), \ + _PyLong_DIGIT_INIT(44), \ + _PyLong_DIGIT_INIT(45), \ + _PyLong_DIGIT_INIT(46), \ + _PyLong_DIGIT_INIT(47), \ + _PyLong_DIGIT_INIT(48), \ + _PyLong_DIGIT_INIT(49), \ + _PyLong_DIGIT_INIT(50), \ + _PyLong_DIGIT_INIT(51), \ + _PyLong_DIGIT_INIT(52), \ + _PyLong_DIGIT_INIT(53), \ + _PyLong_DIGIT_INIT(54), \ + _PyLong_DIGIT_INIT(55), \ + _PyLong_DIGIT_INIT(56), \ + _PyLong_DIGIT_INIT(57), \ + _PyLong_DIGIT_INIT(58), \ + _PyLong_DIGIT_INIT(59), \ + 
_PyLong_DIGIT_INIT(60), \ + _PyLong_DIGIT_INIT(61), \ + _PyLong_DIGIT_INIT(62), \ + _PyLong_DIGIT_INIT(63), \ + _PyLong_DIGIT_INIT(64), \ + _PyLong_DIGIT_INIT(65), \ + _PyLong_DIGIT_INIT(66), \ + _PyLong_DIGIT_INIT(67), \ + _PyLong_DIGIT_INIT(68), \ + _PyLong_DIGIT_INIT(69), \ + _PyLong_DIGIT_INIT(70), \ + _PyLong_DIGIT_INIT(71), \ + _PyLong_DIGIT_INIT(72), \ + _PyLong_DIGIT_INIT(73), \ + _PyLong_DIGIT_INIT(74), \ + _PyLong_DIGIT_INIT(75), \ + _PyLong_DIGIT_INIT(76), \ + _PyLong_DIGIT_INIT(77), \ + _PyLong_DIGIT_INIT(78), \ + _PyLong_DIGIT_INIT(79), \ + _PyLong_DIGIT_INIT(80), \ + _PyLong_DIGIT_INIT(81), \ + _PyLong_DIGIT_INIT(82), \ + _PyLong_DIGIT_INIT(83), \ + _PyLong_DIGIT_INIT(84), \ + _PyLong_DIGIT_INIT(85), \ + _PyLong_DIGIT_INIT(86), \ + _PyLong_DIGIT_INIT(87), \ + _PyLong_DIGIT_INIT(88), \ + _PyLong_DIGIT_INIT(89), \ + _PyLong_DIGIT_INIT(90), \ + _PyLong_DIGIT_INIT(91), \ + _PyLong_DIGIT_INIT(92), \ + _PyLong_DIGIT_INIT(93), \ + _PyLong_DIGIT_INIT(94), \ + _PyLong_DIGIT_INIT(95), \ + _PyLong_DIGIT_INIT(96), \ + _PyLong_DIGIT_INIT(97), \ + _PyLong_DIGIT_INIT(98), \ + _PyLong_DIGIT_INIT(99), \ + _PyLong_DIGIT_INIT(100), \ + _PyLong_DIGIT_INIT(101), \ + _PyLong_DIGIT_INIT(102), \ + _PyLong_DIGIT_INIT(103), \ + _PyLong_DIGIT_INIT(104), \ + _PyLong_DIGIT_INIT(105), \ + _PyLong_DIGIT_INIT(106), \ + _PyLong_DIGIT_INIT(107), \ + _PyLong_DIGIT_INIT(108), \ + _PyLong_DIGIT_INIT(109), \ + _PyLong_DIGIT_INIT(110), \ + _PyLong_DIGIT_INIT(111), \ + _PyLong_DIGIT_INIT(112), \ + _PyLong_DIGIT_INIT(113), \ + _PyLong_DIGIT_INIT(114), \ + _PyLong_DIGIT_INIT(115), \ + _PyLong_DIGIT_INIT(116), \ + _PyLong_DIGIT_INIT(117), \ + _PyLong_DIGIT_INIT(118), \ + _PyLong_DIGIT_INIT(119), \ + _PyLong_DIGIT_INIT(120), \ + _PyLong_DIGIT_INIT(121), \ + _PyLong_DIGIT_INIT(122), \ + _PyLong_DIGIT_INIT(123), \ + _PyLong_DIGIT_INIT(124), \ + _PyLong_DIGIT_INIT(125), \ + _PyLong_DIGIT_INIT(126), \ + _PyLong_DIGIT_INIT(127), \ + _PyLong_DIGIT_INIT(128), \ + _PyLong_DIGIT_INIT(129), \ + _PyLong_DIGIT_INIT(130), \ + _PyLong_DIGIT_INIT(131), \ + _PyLong_DIGIT_INIT(132), \ + _PyLong_DIGIT_INIT(133), \ + _PyLong_DIGIT_INIT(134), \ + _PyLong_DIGIT_INIT(135), \ + _PyLong_DIGIT_INIT(136), \ + _PyLong_DIGIT_INIT(137), \ + _PyLong_DIGIT_INIT(138), \ + _PyLong_DIGIT_INIT(139), \ + _PyLong_DIGIT_INIT(140), \ + _PyLong_DIGIT_INIT(141), \ + _PyLong_DIGIT_INIT(142), \ + _PyLong_DIGIT_INIT(143), \ + _PyLong_DIGIT_INIT(144), \ + _PyLong_DIGIT_INIT(145), \ + _PyLong_DIGIT_INIT(146), \ + _PyLong_DIGIT_INIT(147), \ + _PyLong_DIGIT_INIT(148), \ + _PyLong_DIGIT_INIT(149), \ + _PyLong_DIGIT_INIT(150), \ + _PyLong_DIGIT_INIT(151), \ + _PyLong_DIGIT_INIT(152), \ + _PyLong_DIGIT_INIT(153), \ + _PyLong_DIGIT_INIT(154), \ + _PyLong_DIGIT_INIT(155), \ + _PyLong_DIGIT_INIT(156), \ + _PyLong_DIGIT_INIT(157), \ + _PyLong_DIGIT_INIT(158), \ + _PyLong_DIGIT_INIT(159), \ + _PyLong_DIGIT_INIT(160), \ + _PyLong_DIGIT_INIT(161), \ + _PyLong_DIGIT_INIT(162), \ + _PyLong_DIGIT_INIT(163), \ + _PyLong_DIGIT_INIT(164), \ + _PyLong_DIGIT_INIT(165), \ + _PyLong_DIGIT_INIT(166), \ + _PyLong_DIGIT_INIT(167), \ + _PyLong_DIGIT_INIT(168), \ + _PyLong_DIGIT_INIT(169), \ + _PyLong_DIGIT_INIT(170), \ + _PyLong_DIGIT_INIT(171), \ + _PyLong_DIGIT_INIT(172), \ + _PyLong_DIGIT_INIT(173), \ + _PyLong_DIGIT_INIT(174), \ + _PyLong_DIGIT_INIT(175), \ + _PyLong_DIGIT_INIT(176), \ + _PyLong_DIGIT_INIT(177), \ + _PyLong_DIGIT_INIT(178), \ + _PyLong_DIGIT_INIT(179), \ + _PyLong_DIGIT_INIT(180), \ + _PyLong_DIGIT_INIT(181), \ + _PyLong_DIGIT_INIT(182), \ + _PyLong_DIGIT_INIT(183), \ 
+ _PyLong_DIGIT_INIT(184), \ + _PyLong_DIGIT_INIT(185), \ + _PyLong_DIGIT_INIT(186), \ + _PyLong_DIGIT_INIT(187), \ + _PyLong_DIGIT_INIT(188), \ + _PyLong_DIGIT_INIT(189), \ + _PyLong_DIGIT_INIT(190), \ + _PyLong_DIGIT_INIT(191), \ + _PyLong_DIGIT_INIT(192), \ + _PyLong_DIGIT_INIT(193), \ + _PyLong_DIGIT_INIT(194), \ + _PyLong_DIGIT_INIT(195), \ + _PyLong_DIGIT_INIT(196), \ + _PyLong_DIGIT_INIT(197), \ + _PyLong_DIGIT_INIT(198), \ + _PyLong_DIGIT_INIT(199), \ + _PyLong_DIGIT_INIT(200), \ + _PyLong_DIGIT_INIT(201), \ + _PyLong_DIGIT_INIT(202), \ + _PyLong_DIGIT_INIT(203), \ + _PyLong_DIGIT_INIT(204), \ + _PyLong_DIGIT_INIT(205), \ + _PyLong_DIGIT_INIT(206), \ + _PyLong_DIGIT_INIT(207), \ + _PyLong_DIGIT_INIT(208), \ + _PyLong_DIGIT_INIT(209), \ + _PyLong_DIGIT_INIT(210), \ + _PyLong_DIGIT_INIT(211), \ + _PyLong_DIGIT_INIT(212), \ + _PyLong_DIGIT_INIT(213), \ + _PyLong_DIGIT_INIT(214), \ + _PyLong_DIGIT_INIT(215), \ + _PyLong_DIGIT_INIT(216), \ + _PyLong_DIGIT_INIT(217), \ + _PyLong_DIGIT_INIT(218), \ + _PyLong_DIGIT_INIT(219), \ + _PyLong_DIGIT_INIT(220), \ + _PyLong_DIGIT_INIT(221), \ + _PyLong_DIGIT_INIT(222), \ + _PyLong_DIGIT_INIT(223), \ + _PyLong_DIGIT_INIT(224), \ + _PyLong_DIGIT_INIT(225), \ + _PyLong_DIGIT_INIT(226), \ + _PyLong_DIGIT_INIT(227), \ + _PyLong_DIGIT_INIT(228), \ + _PyLong_DIGIT_INIT(229), \ + _PyLong_DIGIT_INIT(230), \ + _PyLong_DIGIT_INIT(231), \ + _PyLong_DIGIT_INIT(232), \ + _PyLong_DIGIT_INIT(233), \ + _PyLong_DIGIT_INIT(234), \ + _PyLong_DIGIT_INIT(235), \ + _PyLong_DIGIT_INIT(236), \ + _PyLong_DIGIT_INIT(237), \ + _PyLong_DIGIT_INIT(238), \ + _PyLong_DIGIT_INIT(239), \ + _PyLong_DIGIT_INIT(240), \ + _PyLong_DIGIT_INIT(241), \ + _PyLong_DIGIT_INIT(242), \ + _PyLong_DIGIT_INIT(243), \ + _PyLong_DIGIT_INIT(244), \ + _PyLong_DIGIT_INIT(245), \ + _PyLong_DIGIT_INIT(246), \ + _PyLong_DIGIT_INIT(247), \ + _PyLong_DIGIT_INIT(248), \ + _PyLong_DIGIT_INIT(249), \ + _PyLong_DIGIT_INIT(250), \ + _PyLong_DIGIT_INIT(251), \ + _PyLong_DIGIT_INIT(252), \ + _PyLong_DIGIT_INIT(253), \ + _PyLong_DIGIT_INIT(254), \ + _PyLong_DIGIT_INIT(255), \ + _PyLong_DIGIT_INIT(256), \ + }, \ + \ + .bytes_empty = _PyBytes_SIMPLE_INIT(0, 0), \ + .bytes_characters = { \ + _PyBytes_CHAR_INIT(0), \ + _PyBytes_CHAR_INIT(1), \ + _PyBytes_CHAR_INIT(2), \ + _PyBytes_CHAR_INIT(3), \ + _PyBytes_CHAR_INIT(4), \ + _PyBytes_CHAR_INIT(5), \ + _PyBytes_CHAR_INIT(6), \ + _PyBytes_CHAR_INIT(7), \ + _PyBytes_CHAR_INIT(8), \ + _PyBytes_CHAR_INIT(9), \ + _PyBytes_CHAR_INIT(10), \ + _PyBytes_CHAR_INIT(11), \ + _PyBytes_CHAR_INIT(12), \ + _PyBytes_CHAR_INIT(13), \ + _PyBytes_CHAR_INIT(14), \ + _PyBytes_CHAR_INIT(15), \ + _PyBytes_CHAR_INIT(16), \ + _PyBytes_CHAR_INIT(17), \ + _PyBytes_CHAR_INIT(18), \ + _PyBytes_CHAR_INIT(19), \ + _PyBytes_CHAR_INIT(20), \ + _PyBytes_CHAR_INIT(21), \ + _PyBytes_CHAR_INIT(22), \ + _PyBytes_CHAR_INIT(23), \ + _PyBytes_CHAR_INIT(24), \ + _PyBytes_CHAR_INIT(25), \ + _PyBytes_CHAR_INIT(26), \ + _PyBytes_CHAR_INIT(27), \ + _PyBytes_CHAR_INIT(28), \ + _PyBytes_CHAR_INIT(29), \ + _PyBytes_CHAR_INIT(30), \ + _PyBytes_CHAR_INIT(31), \ + _PyBytes_CHAR_INIT(32), \ + _PyBytes_CHAR_INIT(33), \ + _PyBytes_CHAR_INIT(34), \ + _PyBytes_CHAR_INIT(35), \ + _PyBytes_CHAR_INIT(36), \ + _PyBytes_CHAR_INIT(37), \ + _PyBytes_CHAR_INIT(38), \ + _PyBytes_CHAR_INIT(39), \ + _PyBytes_CHAR_INIT(40), \ + _PyBytes_CHAR_INIT(41), \ + _PyBytes_CHAR_INIT(42), \ + _PyBytes_CHAR_INIT(43), \ + _PyBytes_CHAR_INIT(44), \ + _PyBytes_CHAR_INIT(45), \ + _PyBytes_CHAR_INIT(46), \ + _PyBytes_CHAR_INIT(47), \ + 
_PyBytes_CHAR_INIT(48), \ + _PyBytes_CHAR_INIT(49), \ + _PyBytes_CHAR_INIT(50), \ + _PyBytes_CHAR_INIT(51), \ + _PyBytes_CHAR_INIT(52), \ + _PyBytes_CHAR_INIT(53), \ + _PyBytes_CHAR_INIT(54), \ + _PyBytes_CHAR_INIT(55), \ + _PyBytes_CHAR_INIT(56), \ + _PyBytes_CHAR_INIT(57), \ + _PyBytes_CHAR_INIT(58), \ + _PyBytes_CHAR_INIT(59), \ + _PyBytes_CHAR_INIT(60), \ + _PyBytes_CHAR_INIT(61), \ + _PyBytes_CHAR_INIT(62), \ + _PyBytes_CHAR_INIT(63), \ + _PyBytes_CHAR_INIT(64), \ + _PyBytes_CHAR_INIT(65), \ + _PyBytes_CHAR_INIT(66), \ + _PyBytes_CHAR_INIT(67), \ + _PyBytes_CHAR_INIT(68), \ + _PyBytes_CHAR_INIT(69), \ + _PyBytes_CHAR_INIT(70), \ + _PyBytes_CHAR_INIT(71), \ + _PyBytes_CHAR_INIT(72), \ + _PyBytes_CHAR_INIT(73), \ + _PyBytes_CHAR_INIT(74), \ + _PyBytes_CHAR_INIT(75), \ + _PyBytes_CHAR_INIT(76), \ + _PyBytes_CHAR_INIT(77), \ + _PyBytes_CHAR_INIT(78), \ + _PyBytes_CHAR_INIT(79), \ + _PyBytes_CHAR_INIT(80), \ + _PyBytes_CHAR_INIT(81), \ + _PyBytes_CHAR_INIT(82), \ + _PyBytes_CHAR_INIT(83), \ + _PyBytes_CHAR_INIT(84), \ + _PyBytes_CHAR_INIT(85), \ + _PyBytes_CHAR_INIT(86), \ + _PyBytes_CHAR_INIT(87), \ + _PyBytes_CHAR_INIT(88), \ + _PyBytes_CHAR_INIT(89), \ + _PyBytes_CHAR_INIT(90), \ + _PyBytes_CHAR_INIT(91), \ + _PyBytes_CHAR_INIT(92), \ + _PyBytes_CHAR_INIT(93), \ + _PyBytes_CHAR_INIT(94), \ + _PyBytes_CHAR_INIT(95), \ + _PyBytes_CHAR_INIT(96), \ + _PyBytes_CHAR_INIT(97), \ + _PyBytes_CHAR_INIT(98), \ + _PyBytes_CHAR_INIT(99), \ + _PyBytes_CHAR_INIT(100), \ + _PyBytes_CHAR_INIT(101), \ + _PyBytes_CHAR_INIT(102), \ + _PyBytes_CHAR_INIT(103), \ + _PyBytes_CHAR_INIT(104), \ + _PyBytes_CHAR_INIT(105), \ + _PyBytes_CHAR_INIT(106), \ + _PyBytes_CHAR_INIT(107), \ + _PyBytes_CHAR_INIT(108), \ + _PyBytes_CHAR_INIT(109), \ + _PyBytes_CHAR_INIT(110), \ + _PyBytes_CHAR_INIT(111), \ + _PyBytes_CHAR_INIT(112), \ + _PyBytes_CHAR_INIT(113), \ + _PyBytes_CHAR_INIT(114), \ + _PyBytes_CHAR_INIT(115), \ + _PyBytes_CHAR_INIT(116), \ + _PyBytes_CHAR_INIT(117), \ + _PyBytes_CHAR_INIT(118), \ + _PyBytes_CHAR_INIT(119), \ + _PyBytes_CHAR_INIT(120), \ + _PyBytes_CHAR_INIT(121), \ + _PyBytes_CHAR_INIT(122), \ + _PyBytes_CHAR_INIT(123), \ + _PyBytes_CHAR_INIT(124), \ + _PyBytes_CHAR_INIT(125), \ + _PyBytes_CHAR_INIT(126), \ + _PyBytes_CHAR_INIT(127), \ + _PyBytes_CHAR_INIT(128), \ + _PyBytes_CHAR_INIT(129), \ + _PyBytes_CHAR_INIT(130), \ + _PyBytes_CHAR_INIT(131), \ + _PyBytes_CHAR_INIT(132), \ + _PyBytes_CHAR_INIT(133), \ + _PyBytes_CHAR_INIT(134), \ + _PyBytes_CHAR_INIT(135), \ + _PyBytes_CHAR_INIT(136), \ + _PyBytes_CHAR_INIT(137), \ + _PyBytes_CHAR_INIT(138), \ + _PyBytes_CHAR_INIT(139), \ + _PyBytes_CHAR_INIT(140), \ + _PyBytes_CHAR_INIT(141), \ + _PyBytes_CHAR_INIT(142), \ + _PyBytes_CHAR_INIT(143), \ + _PyBytes_CHAR_INIT(144), \ + _PyBytes_CHAR_INIT(145), \ + _PyBytes_CHAR_INIT(146), \ + _PyBytes_CHAR_INIT(147), \ + _PyBytes_CHAR_INIT(148), \ + _PyBytes_CHAR_INIT(149), \ + _PyBytes_CHAR_INIT(150), \ + _PyBytes_CHAR_INIT(151), \ + _PyBytes_CHAR_INIT(152), \ + _PyBytes_CHAR_INIT(153), \ + _PyBytes_CHAR_INIT(154), \ + _PyBytes_CHAR_INIT(155), \ + _PyBytes_CHAR_INIT(156), \ + _PyBytes_CHAR_INIT(157), \ + _PyBytes_CHAR_INIT(158), \ + _PyBytes_CHAR_INIT(159), \ + _PyBytes_CHAR_INIT(160), \ + _PyBytes_CHAR_INIT(161), \ + _PyBytes_CHAR_INIT(162), \ + _PyBytes_CHAR_INIT(163), \ + _PyBytes_CHAR_INIT(164), \ + _PyBytes_CHAR_INIT(165), \ + _PyBytes_CHAR_INIT(166), \ + _PyBytes_CHAR_INIT(167), \ + _PyBytes_CHAR_INIT(168), \ + _PyBytes_CHAR_INIT(169), \ + _PyBytes_CHAR_INIT(170), \ + _PyBytes_CHAR_INIT(171), \ + 
_PyBytes_CHAR_INIT(172), \ + _PyBytes_CHAR_INIT(173), \ + _PyBytes_CHAR_INIT(174), \ + _PyBytes_CHAR_INIT(175), \ + _PyBytes_CHAR_INIT(176), \ + _PyBytes_CHAR_INIT(177), \ + _PyBytes_CHAR_INIT(178), \ + _PyBytes_CHAR_INIT(179), \ + _PyBytes_CHAR_INIT(180), \ + _PyBytes_CHAR_INIT(181), \ + _PyBytes_CHAR_INIT(182), \ + _PyBytes_CHAR_INIT(183), \ + _PyBytes_CHAR_INIT(184), \ + _PyBytes_CHAR_INIT(185), \ + _PyBytes_CHAR_INIT(186), \ + _PyBytes_CHAR_INIT(187), \ + _PyBytes_CHAR_INIT(188), \ + _PyBytes_CHAR_INIT(189), \ + _PyBytes_CHAR_INIT(190), \ + _PyBytes_CHAR_INIT(191), \ + _PyBytes_CHAR_INIT(192), \ + _PyBytes_CHAR_INIT(193), \ + _PyBytes_CHAR_INIT(194), \ + _PyBytes_CHAR_INIT(195), \ + _PyBytes_CHAR_INIT(196), \ + _PyBytes_CHAR_INIT(197), \ + _PyBytes_CHAR_INIT(198), \ + _PyBytes_CHAR_INIT(199), \ + _PyBytes_CHAR_INIT(200), \ + _PyBytes_CHAR_INIT(201), \ + _PyBytes_CHAR_INIT(202), \ + _PyBytes_CHAR_INIT(203), \ + _PyBytes_CHAR_INIT(204), \ + _PyBytes_CHAR_INIT(205), \ + _PyBytes_CHAR_INIT(206), \ + _PyBytes_CHAR_INIT(207), \ + _PyBytes_CHAR_INIT(208), \ + _PyBytes_CHAR_INIT(209), \ + _PyBytes_CHAR_INIT(210), \ + _PyBytes_CHAR_INIT(211), \ + _PyBytes_CHAR_INIT(212), \ + _PyBytes_CHAR_INIT(213), \ + _PyBytes_CHAR_INIT(214), \ + _PyBytes_CHAR_INIT(215), \ + _PyBytes_CHAR_INIT(216), \ + _PyBytes_CHAR_INIT(217), \ + _PyBytes_CHAR_INIT(218), \ + _PyBytes_CHAR_INIT(219), \ + _PyBytes_CHAR_INIT(220), \ + _PyBytes_CHAR_INIT(221), \ + _PyBytes_CHAR_INIT(222), \ + _PyBytes_CHAR_INIT(223), \ + _PyBytes_CHAR_INIT(224), \ + _PyBytes_CHAR_INIT(225), \ + _PyBytes_CHAR_INIT(226), \ + _PyBytes_CHAR_INIT(227), \ + _PyBytes_CHAR_INIT(228), \ + _PyBytes_CHAR_INIT(229), \ + _PyBytes_CHAR_INIT(230), \ + _PyBytes_CHAR_INIT(231), \ + _PyBytes_CHAR_INIT(232), \ + _PyBytes_CHAR_INIT(233), \ + _PyBytes_CHAR_INIT(234), \ + _PyBytes_CHAR_INIT(235), \ + _PyBytes_CHAR_INIT(236), \ + _PyBytes_CHAR_INIT(237), \ + _PyBytes_CHAR_INIT(238), \ + _PyBytes_CHAR_INIT(239), \ + _PyBytes_CHAR_INIT(240), \ + _PyBytes_CHAR_INIT(241), \ + _PyBytes_CHAR_INIT(242), \ + _PyBytes_CHAR_INIT(243), \ + _PyBytes_CHAR_INIT(244), \ + _PyBytes_CHAR_INIT(245), \ + _PyBytes_CHAR_INIT(246), \ + _PyBytes_CHAR_INIT(247), \ + _PyBytes_CHAR_INIT(248), \ + _PyBytes_CHAR_INIT(249), \ + _PyBytes_CHAR_INIT(250), \ + _PyBytes_CHAR_INIT(251), \ + _PyBytes_CHAR_INIT(252), \ + _PyBytes_CHAR_INIT(253), \ + _PyBytes_CHAR_INIT(254), \ + _PyBytes_CHAR_INIT(255), \ + }, \ + }, \ +} + + +#ifdef __cplusplus +} +#endif +#endif /* !Py_INTERNAL_RUNTIME_INIT_H */ diff --git a/Makefile.pre.in b/Makefile.pre.in index a84badcd49389..0782d95482576 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1651,6 +1651,7 @@ PYTHON_HEADERS= \ $(srcdir)/Include/internal/pycore_pymem.h \ $(srcdir)/Include/internal/pycore_pystate.h \ $(srcdir)/Include/internal/pycore_runtime.h \ + $(srcdir)/Include/internal/pycore_runtime_init.h \ $(srcdir)/Include/internal/pycore_sliceobject.h \ $(srcdir)/Include/internal/pycore_strhex.h \ $(srcdir)/Include/internal/pycore_structseq.h \ diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj index 83ae2f08749aa..12eac8ebab510 100644 --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -233,6 +233,7 @@ + diff --git a/PCbuild/pythoncore.vcxproj.filters b/PCbuild/pythoncore.vcxproj.filters index cc0a2b0f75924..4a502078177d8 100644 --- a/PCbuild/pythoncore.vcxproj.filters +++ b/PCbuild/pythoncore.vcxproj.filters @@ -609,6 +609,9 @@ Include\internal + + Include\internal + Include\internal diff --git a/Python/pylifecycle.c 
b/Python/pylifecycle.c index 284cfac3c40a5..8bcad67e80a0c 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -20,6 +20,7 @@ #include "pycore_pyerrors.h" // _PyErr_Occurred() #include "pycore_pylifecycle.h" // _PyErr_Print() #include "pycore_pystate.h" // _PyThreadState_GET() +#include "pycore_runtime_init.h" // _PyRuntimeState_INIT #include "pycore_sliceobject.h" // _PySlice_Fini() #include "pycore_structseq.h" // _PyStructSequence_InitState() #include "pycore_sysmodule.h" // _PySys_ClearAuditHooks() diff --git a/Python/pystate.c b/Python/pystate.c index a18a159b55175..e6175852c461c 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -10,6 +10,7 @@ #include "pycore_pylifecycle.h" #include "pycore_pymem.h" // _PyMem_SetDefaultAllocator() #include "pycore_pystate.h" // _PyThreadState_GET() +#include "pycore_runtime_init.h" // _PyRuntimeState_INIT #include "pycore_sysmodule.h" /* -------------------------------------------------------------------------- From webhook-mailer at python.org Thu Jan 13 18:33:52 2022 From: webhook-mailer at python.org (ericsnowcurrently) Date: Thu, 13 Jan 2022 23:33:52 -0000 Subject: [Python-checkins] Statically initialize _PyRuntimeState fields. (gh-30588) Message-ID: https://github.com/python/cpython/commit/b8ddf7e794e5316981016d6d014862e3c4ce149a commit: b8ddf7e794e5316981016d6d014862e3c4ce149a branch: main author: Eric Snow committer: ericsnowcurrently date: 2022-01-13T16:33:40-07:00 summary: Statically initialize _PyRuntimeState fields. (gh-30588) https://bugs.python.org/issue45953 files: M Include/internal/pycore_runtime_init.h M Python/pystate.c diff --git a/Include/internal/pycore_runtime_init.h b/Include/internal/pycore_runtime_init.h index e35c696610b94..fc768ff5d053f 100644 --- a/Include/internal/pycore_runtime_init.h +++ b/Include/internal/pycore_runtime_init.h @@ -17,6 +17,17 @@ extern "C" { #define _PyRuntimeState_INIT \ { \ + .gilstate = { \ + .check_enabled = 1, \ + /* A TSS key must be initialized with Py_tss_NEEDS_INIT \ + in accordance with the specification. */ \ + .autoTSSkey = Py_tss_NEEDS_INIT, \ + }, \ + .interpreters = { \ + /* This prevents interpreters from getting created \ + until _PyInterpreterState_Enable() is called. */ \ + .next_id = -1, \ + }, \ .global_objects = _Py_global_objects_INIT, \ ._main_interpreter = _PyInterpreterState_INIT, \ } diff --git a/Python/pystate.c b/Python/pystate.c index e6175852c461c..50b3621896028 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -111,17 +111,7 @@ init_runtime(_PyRuntimeState *runtime, PyPreConfig_InitPythonConfig(&runtime->preconfig); - runtime->gilstate.check_enabled = 1; - - /* A TSS key must be initialized with Py_tss_NEEDS_INIT - in accordance with the specification. */ - Py_tss_t initial = Py_tss_NEEDS_INIT; - runtime->gilstate.autoTSSkey = initial; - runtime->interpreters.mutex = interpreters_mutex; - // This prevents interpreters from getting created - // until _PyInterpreterState_Enable() is called. 
- runtime->interpreters.next_id = -1; runtime->xidregistry.mutex = xidregistry_mutex; From webhook-mailer at python.org Thu Jan 13 18:35:47 2022 From: webhook-mailer at python.org (zooba) Date: Thu, 13 Jan 2022 23:35:47 -0000 Subject: [Python-checkins] bpo-46362: Ensure ntpath.abspath() uses the Windows API correctly (GH-30571) Message-ID: https://github.com/python/cpython/commit/d4e64cd4b0ea431d4e371f9b0a25f6b75a069dc1 commit: d4e64cd4b0ea431d4e371f9b0a25f6b75a069dc1 branch: main author: neonene <53406459+neonene at users.noreply.github.com> committer: zooba date: 2022-01-13T23:35:42Z summary: bpo-46362: Ensure ntpath.abspath() uses the Windows API correctly (GH-30571) This makes ntpath.abspath()/getpath_abspath() follow normpath(), since some WinAPIs such as PathCchSkipRoot() require backslashed paths. files: A Misc/NEWS.d/next/Windows/2022-01-13-22-31-09.bpo-46362.f2cuEb.rst M Include/internal/pycore_fileutils.h M Lib/ntpath.py M Lib/test/test_embed.py M Lib/test/test_ntpath.py M Modules/getpath.c M Modules/posixmodule.c M Python/fileutils.c diff --git a/Include/internal/pycore_fileutils.h b/Include/internal/pycore_fileutils.h index 61c11a8b2d3b4..3ce8108e4e04f 100644 --- a/Include/internal/pycore_fileutils.h +++ b/Include/internal/pycore_fileutils.h @@ -235,6 +235,9 @@ extern int _Py_EncodeNonUnicodeWchar_InPlace( extern int _Py_isabs(const wchar_t *path); extern int _Py_abspath(const wchar_t *path, wchar_t **abspath_p); +#ifdef MS_WINDOWS +extern int _PyOS_getfullpathname(const wchar_t *path, wchar_t **abspath_p); +#endif extern wchar_t * _Py_join_relfile(const wchar_t *dirname, const wchar_t *relfile); extern int _Py_add_relfile(wchar_t *dirname, diff --git a/Lib/ntpath.py b/Lib/ntpath.py index 58483a0c0a98b..041ebc75cb127 100644 --- a/Lib/ntpath.py +++ b/Lib/ntpath.py @@ -551,7 +551,7 @@ def _abspath_fallback(path): def abspath(path): """Return the absolute version of a path.""" try: - return normpath(_getfullpathname(path)) + return _getfullpathname(normpath(path)) except (OSError, ValueError): return _abspath_fallback(path) diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index dd43669ba9674..02bbe3511c6f7 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -1404,6 +1404,33 @@ def test_init_pyvenv_cfg(self): api=API_COMPAT, env=env, ignore_stderr=True, cwd=tmpdir) + @unittest.skipUnless(MS_WINDOWS, 'specific to Windows') + def test_getpath_abspath_win32(self): + # Check _Py_abspath() is passed a backslashed path not to fall back to + # GetFullPathNameW() on startup, which (re-)normalizes the path overly. + # Currently, _Py_normpath() doesn't trim trailing dots and spaces. + CASES = [ + ("C:/a. . .", "C:\\a. . ."), + ("C:\\a. . .", "C:\\a. . ."), + ("\\\\?\\C:////a////b. . .", "\\\\?\\C:\\a\\b. . ."), + ("//a/b/c. . .", "\\\\a\\b\\c. . ."), + ("\\\\a\\b\\c. . .", "\\\\a\\b\\c. . ."), + ("a. . 
.", f"{os.getcwd()}\\a"), # relpath gets fully normalized + ] + out, err = self.run_embedded_interpreter( + "test_init_initialize_config", + env=dict(PYTHONPATH=os.path.pathsep.join(c[0] for c in CASES)) + ) + self.assertEqual(err, "") + try: + out = json.loads(out) + except json.JSONDecodeError: + self.fail(f"fail to decode stdout: {out!r}") + + results = out['config']["module_search_paths"] + for (_, expected), result in zip(CASES, results): + self.assertEqual(result, expected) + def test_global_pathconfig(self): # Test C API functions getting the path configuration: # diff --git a/Lib/test/test_ntpath.py b/Lib/test/test_ntpath.py index cc29881049224..99a77e3fb43dc 100644 --- a/Lib/test/test_ntpath.py +++ b/Lib/test/test_ntpath.py @@ -613,6 +613,40 @@ def test_expanduser(self): @unittest.skipUnless(nt, "abspath requires 'nt' module") def test_abspath(self): tester('ntpath.abspath("C:\\")', "C:\\") + tester('ntpath.abspath("\\\\?\\C:////spam////eggs. . .")', "\\\\?\\C:\\spam\\eggs") + tester('ntpath.abspath("\\\\.\\C:////spam////eggs. . .")', "\\\\.\\C:\\spam\\eggs") + tester('ntpath.abspath("//spam//eggs. . .")', "\\\\spam\\eggs") + tester('ntpath.abspath("\\\\spam\\\\eggs. . .")', "\\\\spam\\eggs") + tester('ntpath.abspath("C:/spam. . .")', "C:\\spam") + tester('ntpath.abspath("C:\\spam. . .")', "C:\\spam") + tester('ntpath.abspath("C:/nul")', "\\\\.\\nul") + tester('ntpath.abspath("C:\\nul")', "\\\\.\\nul") + tester('ntpath.abspath("//..")', "\\\\") + tester('ntpath.abspath("//../")', "\\\\..\\") + tester('ntpath.abspath("//../..")', "\\\\..\\") + tester('ntpath.abspath("//../../")', "\\\\..\\..\\") + tester('ntpath.abspath("//../../../")', "\\\\..\\..\\") + tester('ntpath.abspath("//../../../..")', "\\\\..\\..\\") + tester('ntpath.abspath("//../../../../")', "\\\\..\\..\\") + tester('ntpath.abspath("//server")', "\\\\server") + tester('ntpath.abspath("//server/")', "\\\\server\\") + tester('ntpath.abspath("//server/..")', "\\\\server\\") + tester('ntpath.abspath("//server/../")', "\\\\server\\..\\") + tester('ntpath.abspath("//server/../..")', "\\\\server\\..\\") + tester('ntpath.abspath("//server/../../")', "\\\\server\\..\\") + tester('ntpath.abspath("//server/../../..")', "\\\\server\\..\\") + tester('ntpath.abspath("//server/../../../")', "\\\\server\\..\\") + tester('ntpath.abspath("//server/share")', "\\\\server\\share") + tester('ntpath.abspath("//server/share/")', "\\\\server\\share\\") + tester('ntpath.abspath("//server/share/..")', "\\\\server\\share\\") + tester('ntpath.abspath("//server/share/../")', "\\\\server\\share\\") + tester('ntpath.abspath("//server/share/../..")', "\\\\server\\share\\") + tester('ntpath.abspath("//server/share/../../")', "\\\\server\\share\\") + tester('ntpath.abspath("C:\\nul. . .")', "\\\\.\\nul") + tester('ntpath.abspath("//... . .")', "\\\\") + tester('ntpath.abspath("//.. . . .")', "\\\\") + tester('ntpath.abspath("//../... . .")', "\\\\..\\") + tester('ntpath.abspath("//../.. . . .")', "\\\\..\\") with os_helper.temp_cwd(os_helper.TESTFN) as cwd_dir: # bpo-31047 tester('ntpath.abspath("")', cwd_dir) tester('ntpath.abspath(" ")', cwd_dir + "\\ ") diff --git a/Misc/NEWS.d/next/Windows/2022-01-13-22-31-09.bpo-46362.f2cuEb.rst b/Misc/NEWS.d/next/Windows/2022-01-13-22-31-09.bpo-46362.f2cuEb.rst new file mode 100644 index 0000000000000..0b59cd28ba4fd --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-01-13-22-31-09.bpo-46362.f2cuEb.rst @@ -0,0 +1,2 @@ +os.path.abspath("C:\CON") is now fixed to return "\\.\CON", not the same path. 
+The regression was true of all legacy DOS devices such as COM1, LPT1, or NUL. \ No newline at end of file diff --git a/Modules/getpath.c b/Modules/getpath.c index fdfe929514530..5c646c9c83cbf 100644 --- a/Modules/getpath.c +++ b/Modules/getpath.c @@ -59,7 +59,7 @@ getpath_abspath(PyObject *Py_UNUSED(self), PyObject *args) { PyObject *r = NULL; PyObject *pathobj; - const wchar_t *path; + wchar_t *path; if (!PyArg_ParseTuple(args, "U", &pathobj)) { return NULL; } @@ -67,8 +67,8 @@ getpath_abspath(PyObject *Py_UNUSED(self), PyObject *args) path = PyUnicode_AsWideCharString(pathobj, &len); if (path) { wchar_t *abs; - if (_Py_abspath(path, &abs) == 0 && abs) { - r = PyUnicode_FromWideChar(_Py_normpath(abs, -1), -1); + if (_Py_abspath((const wchar_t *)_Py_normpath(path, -1), &abs) == 0 && abs) { + r = PyUnicode_FromWideChar(abs, -1); PyMem_RawFree((void *)abs); } else { PyErr_SetString(PyExc_OSError, "failed to make path absolute"); diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index 904f8bfa55807..7b5c3ef575565 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -4240,6 +4240,48 @@ os_listdir_impl(PyObject *module, path_t *path) } #ifdef MS_WINDOWS +int +_PyOS_getfullpathname(const wchar_t *path, wchar_t **abspath_p) +{ + wchar_t woutbuf[MAX_PATH], *woutbufp = woutbuf; + DWORD result; + + result = GetFullPathNameW(path, + Py_ARRAY_LENGTH(woutbuf), woutbuf, + NULL); + if (!result) { + return -1; + } + + if (result >= Py_ARRAY_LENGTH(woutbuf)) { + if ((size_t)result <= (size_t)PY_SSIZE_T_MAX / sizeof(wchar_t)) { + woutbufp = PyMem_RawMalloc((size_t)result * sizeof(wchar_t)); + } + else { + woutbufp = NULL; + } + if (!woutbufp) { + *abspath_p = NULL; + return 0; + } + + result = GetFullPathNameW(path, result, woutbufp, NULL); + if (!result) { + PyMem_RawFree(woutbufp); + return -1; + } + } + + if (woutbufp != woutbuf) { + *abspath_p = woutbufp; + return 0; + } + + *abspath_p = _PyMem_RawWcsdup(woutbufp); + return 0; +} + + /* A helper function for abspath on win32 */ /*[clinic input] os._getfullpathname @@ -4255,8 +4297,7 @@ os__getfullpathname_impl(PyObject *module, path_t *path) { wchar_t *abspath; - /* _Py_abspath() is implemented with GetFullPathNameW() on Windows */ - if (_Py_abspath(path->wide, &abspath) < 0) { + if (_PyOS_getfullpathname(path->wide, &abspath) < 0) { return win32_error_object("GetFullPathNameW", path->object); } if (abspath == NULL) { diff --git a/Python/fileutils.c b/Python/fileutils.c index 151c6feb2ebe1..9a71b83f45578 100644 --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -2049,42 +2049,7 @@ _Py_abspath(const wchar_t *path, wchar_t **abspath_p) } #ifdef MS_WINDOWS - wchar_t woutbuf[MAX_PATH], *woutbufp = woutbuf; - DWORD result; - - result = GetFullPathNameW(path, - Py_ARRAY_LENGTH(woutbuf), woutbuf, - NULL); - if (!result) { - return -1; - } - - if (result >= Py_ARRAY_LENGTH(woutbuf)) { - if ((size_t)result <= (size_t)PY_SSIZE_T_MAX / sizeof(wchar_t)) { - woutbufp = PyMem_RawMalloc((size_t)result * sizeof(wchar_t)); - } - else { - woutbufp = NULL; - } - if (!woutbufp) { - *abspath_p = NULL; - return 0; - } - - result = GetFullPathNameW(path, result, woutbufp, NULL); - if (!result) { - PyMem_RawFree(woutbufp); - return -1; - } - } - - if (woutbufp != woutbuf) { - *abspath_p = woutbufp; - return 0; - } - - *abspath_p = _PyMem_RawWcsdup(woutbufp); - return 0; + return _PyOS_getfullpathname(path, abspath_p); #else wchar_t cwd[MAXPATHLEN + 1]; cwd[Py_ARRAY_LENGTH(cwd) - 1] = 0; From webhook-mailer at python.org Thu Jan 13 19:09:34 2022 From: 
webhook-mailer at python.org (ericsnowcurrently) Date: Fri, 14 Jan 2022 00:09:34 -0000 Subject: [Python-checkins] bpo-45953: Statically initialize all the PyThreadState fields we can. (gh-30590) Message-ID: https://github.com/python/cpython/commit/324908ba936d5d262026deebb81f050803848c41 commit: 324908ba936d5d262026deebb81f050803848c41 branch: main author: Eric Snow committer: ericsnowcurrently date: 2022-01-13T17:09:24-07:00 summary: bpo-45953: Statically initialize all the PyThreadState fields we can. (gh-30590) https://bugs.python.org/issue45953 files: M Include/cpython/pystate.h M Include/internal/pycore_ceval.h M Include/internal/pycore_runtime_init.h M Python/ceval.c M Python/pystate.c diff --git a/Include/cpython/pystate.h b/Include/cpython/pystate.h index bcb1bb25a4940..a35e5b803bd08 100644 --- a/Include/cpython/pystate.h +++ b/Include/cpython/pystate.h @@ -53,12 +53,19 @@ typedef struct _cframe { } CFrame; typedef struct _err_stackitem { - /* This struct represents an entry on the exception stack, which is a - * per-coroutine state. (Coroutine in the computer science sense, - * including the thread and generators). - * This ensures that the exception state is not impacted by "yields" - * from an except handler. + /* This struct represents a single execution context where we might + * be currently handling an exception. It is a per-coroutine state + * (coroutine in the computer science sense, including the thread + * and generators). + * + * This is used as an entry on the exception stack, where each + * entry indicates if it is currently handling an exception. + * This ensures that the exception state is not impacted + * by "yields" from an except handler. The thread + * always has an entry (the bottom-most one). */ + + /* The exception currently being handled in this context, if any. */ PyObject *exc_value; struct _err_stackitem *previous_item; @@ -112,13 +119,9 @@ struct _ts { PyObject *curexc_value; PyObject *curexc_traceback; - /* The exception currently being handled, if no coroutines/generators - * are present. Always last element on the stack referred to be exc_info. - */ - _PyErr_StackItem exc_state; - - /* Pointer to the top of the stack of the exceptions currently - * being handled */ + /* Pointer to the top of the exception stack for the exceptions + * we may be currently handling. (See _PyErr_StackItem above.) + * This is never NULL. */ _PyErr_StackItem *exc_info; PyObject *dict; /* Stores per-thread state */ @@ -174,13 +177,26 @@ struct _ts { /* Unique thread state id. */ uint64_t id; - CFrame root_cframe; PyTraceInfo trace_info; _PyStackChunk *datastack_chunk; PyObject **datastack_top; PyObject **datastack_limit; /* XXX signal handlers should also be here */ + + /* The following fields are here to avoid allocation during init. + The data is exposed through PyThreadState pointer fields. + These fields should not be accessed directly outside of init. + + All other PyInterpreterState pointer fields are populated when + needed and default to NULL. + */ + + /* The thread's exception stack entry. (Always the last entry.) */ + _PyErr_StackItem _exc_state; + + /* The bottom-most frame on the stack. 
*/ + CFrame _root_cframe; }; diff --git a/Include/internal/pycore_ceval.h b/Include/internal/pycore_ceval.h index 20508d4a68747..53d0b5c4549bc 100644 --- a/Include/internal/pycore_ceval.h +++ b/Include/internal/pycore_ceval.h @@ -12,9 +12,14 @@ extern "C" { struct pyruntimestate; struct _ceval_runtime_state; +#ifndef Py_DEFAULT_RECURSION_LIMIT +# define Py_DEFAULT_RECURSION_LIMIT 1000 +#endif + #include "pycore_interp.h" // PyInterpreterState.eval_frame #include "pycore_pystate.h" // _PyThreadState_GET() + extern void _Py_FinishPendingCalls(PyThreadState *tstate); extern void _PyEval_InitRuntimeState(struct _ceval_runtime_state *); extern void _PyEval_InitState(struct _ceval_state *, PyThread_type_lock); diff --git a/Include/internal/pycore_runtime_init.h b/Include/internal/pycore_runtime_init.h index fc768ff5d053f..aa6612063d0ba 100644 --- a/Include/internal/pycore_runtime_init.h +++ b/Include/internal/pycore_runtime_init.h @@ -41,6 +41,8 @@ extern "C" { #define _PyThreadState_INIT \ { \ ._static = 1, \ + .recursion_limit = Py_DEFAULT_RECURSION_LIMIT, \ + .context_ver = 1, \ } diff --git a/Python/ceval.c b/Python/ceval.c index d33cd4e1edb5d..eed902fc68791 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -737,10 +737,6 @@ Py_MakePendingCalls(void) /* The interpreter's recursion limit */ -#ifndef Py_DEFAULT_RECURSION_LIMIT -# define Py_DEFAULT_RECURSION_LIMIT 1000 -#endif - void _PyEval_InitRuntimeState(struct _ceval_runtime_state *ceval) { diff --git a/Python/pystate.c b/Python/pystate.c index 50b3621896028..23156850bfeba 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -775,21 +775,19 @@ init_threadstate(PyThreadState *tstate, next->prev = tstate; } tstate->next = next; - tstate->prev = NULL; + assert(tstate->prev == NULL); tstate->thread_id = PyThread_get_thread_ident(); #ifdef PY_HAVE_THREAD_NATIVE_ID tstate->native_thread_id = PyThread_get_thread_native_id(); #endif - tstate->context_ver = 1; - tstate->recursion_limit = interp->ceval.recursion_limit, tstate->recursion_remaining = interp->ceval.recursion_limit, - tstate->exc_info = &tstate->exc_state; + tstate->exc_info = &tstate->_exc_state; - tstate->cframe = &tstate->root_cframe; + tstate->cframe = &tstate->_root_cframe; tstate->datastack_chunk = NULL; tstate->datastack_top = NULL; tstate->datastack_limit = NULL; @@ -1027,10 +1025,10 @@ PyThreadState_Clear(PyThreadState *tstate) Py_CLEAR(tstate->curexc_value); Py_CLEAR(tstate->curexc_traceback); - Py_CLEAR(tstate->exc_state.exc_value); + Py_CLEAR(tstate->_exc_state.exc_value); /* The stack of exception states should contain just this thread. */ - if (verbose && tstate->exc_info != &tstate->exc_state) { + if (verbose && tstate->exc_info != &tstate->_exc_state) { fprintf(stderr, "PyThreadState_Clear: warning: thread still has a generator\n"); } From webhook-mailer at python.org Thu Jan 13 19:17:47 2022 From: webhook-mailer at python.org (ericsnowcurrently) Date: Fri, 14 Jan 2022 00:17:47 -0000 Subject: [Python-checkins] bpo-45953: Statically initialize all the non-object PyInterpreterState fields we can. (gh-30589) Message-ID: https://github.com/python/cpython/commit/322f962f3ee31d0dbde99e36379de8488ccc6804 commit: 322f962f3ee31d0dbde99e36379de8488ccc6804 branch: main author: Eric Snow committer: ericsnowcurrently date: 2022-01-13T17:17:28-07:00 summary: bpo-45953: Statically initialize all the non-object PyInterpreterState fields we can. 
(gh-30589) https://bugs.python.org/issue45953 files: M Include/internal/pycore_gc.h M Include/internal/pycore_runtime_init.h M Modules/gcmodule.c M Python/ceval.c M Python/pystate.c diff --git a/Include/internal/pycore_gc.h b/Include/internal/pycore_gc.h index a23dca805491d..56a23e9970752 100644 --- a/Include/internal/pycore_gc.h +++ b/Include/internal/pycore_gc.h @@ -134,6 +134,7 @@ struct _gc_runtime_state { /* Current call-stack depth of tp_dealloc calls. */ int trash_delete_nesting; + /* Is automatic collection enabled? */ int enabled; int debug; /* linked lists of container objects */ @@ -161,6 +162,7 @@ struct _gc_runtime_state { Py_ssize_t long_lived_pending; }; + extern void _PyGC_InitState(struct _gc_runtime_state *); extern Py_ssize_t _PyGC_CollectNoFail(PyThreadState *tstate); diff --git a/Include/internal/pycore_runtime_init.h b/Include/internal/pycore_runtime_init.h index aa6612063d0ba..72ca3464f58db 100644 --- a/Include/internal/pycore_runtime_init.h +++ b/Include/internal/pycore_runtime_init.h @@ -32,9 +32,36 @@ extern "C" { ._main_interpreter = _PyInterpreterState_INIT, \ } +#ifdef HAVE_DLOPEN +# include +# if HAVE_DECL_RTLD_NOW +# define _Py_DLOPEN_FLAGS RTLD_NOW +# else +# define _Py_DLOPEN_FLAGS RTLD_LAZY +# endif +# define DLOPENFLAGS_INIT .dlopenflags = _Py_DLOPEN_FLAGS, +#else +# define _Py_DLOPEN_FLAGS 0 +# define DLOPENFLAGS_INIT +#endif + #define _PyInterpreterState_INIT \ { \ ._static = 1, \ + .id_refcount = -1, \ + DLOPENFLAGS_INIT \ + .ceval = { \ + .recursion_limit = Py_DEFAULT_RECURSION_LIMIT, \ + }, \ + .gc = { \ + .enabled = 1, \ + .generations = { \ + /* .head is set in _PyGC_InitState(). */ \ + { .threshold = 700, }, \ + { .threshold = 10, }, \ + { .threshold = 10, }, \ + }, \ + }, \ ._initial_thread = _PyThreadState_INIT, \ } diff --git a/Modules/gcmodule.c b/Modules/gcmodule.c index e22f031f57490..16f8c2b18e717 100644 --- a/Modules/gcmodule.c +++ b/Modules/gcmodule.c @@ -139,24 +139,20 @@ get_gc_state(void) void _PyGC_InitState(GCState *gcstate) { - gcstate->enabled = 1; /* automatic collection enabled? 
*/ - -#define _GEN_HEAD(n) GEN_HEAD(gcstate, n) - struct gc_generation generations[NUM_GENERATIONS] = { - /* PyGC_Head, threshold, count */ - {{(uintptr_t)_GEN_HEAD(0), (uintptr_t)_GEN_HEAD(0)}, 700, 0}, - {{(uintptr_t)_GEN_HEAD(1), (uintptr_t)_GEN_HEAD(1)}, 10, 0}, - {{(uintptr_t)_GEN_HEAD(2), (uintptr_t)_GEN_HEAD(2)}, 10, 0}, - }; +#define INIT_HEAD(GEN) \ + do { \ + GEN.head._gc_next = (uintptr_t)&GEN.head; \ + GEN.head._gc_prev = (uintptr_t)&GEN.head; \ + } while (0) + for (int i = 0; i < NUM_GENERATIONS; i++) { - gcstate->generations[i] = generations[i]; + assert(gcstate->generations[i].count == 0); + INIT_HEAD(gcstate->generations[i]); }; gcstate->generation0 = GEN_HEAD(gcstate, 0); - struct gc_generation permanent_generation = { - {(uintptr_t)&gcstate->permanent_generation.head, - (uintptr_t)&gcstate->permanent_generation.head}, 0, 0 - }; - gcstate->permanent_generation = permanent_generation; + INIT_HEAD(gcstate->permanent_generation); + +#undef INIT_HEAD } diff --git a/Python/ceval.c b/Python/ceval.c index eed902fc68791..70a7750f81190 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -748,8 +748,6 @@ _PyEval_InitRuntimeState(struct _ceval_runtime_state *ceval) void _PyEval_InitState(struct _ceval_state *ceval, PyThread_type_lock pending_lock) { - ceval->recursion_limit = Py_DEFAULT_RECURSION_LIMIT; - struct _pending_calls *pending = &ceval->pending; assert(pending->lock == NULL); diff --git a/Python/pystate.c b/Python/pystate.c index 23156850bfeba..4b698f2b1d771 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -281,7 +281,6 @@ init_interpreter(PyInterpreterState *interp, assert(id > 0 || (id == 0 && interp == runtime->interpreters.main)); interp->id = id; - interp->id_refcount = -1; assert(runtime->interpreters.head == interp); assert(next != NULL || (interp == runtime->interpreters.main)); @@ -291,14 +290,6 @@ init_interpreter(PyInterpreterState *interp, _PyGC_InitState(&interp->gc); PyConfig_InitPythonConfig(&interp->config); _PyType_InitCache(interp); - interp->eval_frame = NULL; -#ifdef HAVE_DLOPEN -#if HAVE_DECL_RTLD_NOW - interp->dlopenflags = RTLD_NOW; -#else - interp->dlopenflags = RTLD_LAZY; -#endif -#endif interp->_initialized = 1; } From webhook-mailer at python.org Thu Jan 13 23:11:47 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 14 Jan 2022 04:11:47 -0000 Subject: [Python-checkins] bpo-46280: Fix tracemalloc_copy_domain() (GH-30591) Message-ID: https://github.com/python/cpython/commit/7c770d3350813a82a639fcb3babae0de2b87aaae commit: 7c770d3350813a82a639fcb3babae0de2b87aaae branch: main author: Victor Stinner committer: vstinner date: 2022-01-14T05:11:38+01:00 summary: bpo-46280: Fix tracemalloc_copy_domain() (GH-30591) Test if tracemalloc_copy_traces() failed to allocated memory in tracemalloc_copy_domain(). 
files: M Modules/_tracemalloc.c diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c index b838439a9cb83..14bad00e4c629 100644 --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1242,6 +1242,9 @@ tracemalloc_copy_domain(_Py_hashtable_t *domains, _Py_hashtable_t *traces = (_Py_hashtable_t *)value; _Py_hashtable_t *traces2 = tracemalloc_copy_traces(traces); + if (traces2 == NULL) { + return -1; + } if (_Py_hashtable_set(domains2, TO_PTR(domain), traces2) < 0) { _Py_hashtable_destroy(traces2); return -1; From webhook-mailer at python.org Thu Jan 13 23:32:47 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 14 Jan 2022 04:32:47 -0000 Subject: [Python-checkins] bpo-46280: Fix tracemalloc_copy_domain() (GH-30591) Message-ID: https://github.com/python/cpython/commit/86d18019e96167c5ab6f5157fa90598202849904 commit: 86d18019e96167c5ab6f5157fa90598202849904 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-13T20:32:40-08:00 summary: bpo-46280: Fix tracemalloc_copy_domain() (GH-30591) Test if tracemalloc_copy_traces() failed to allocated memory in tracemalloc_copy_domain(). (cherry picked from commit 7c770d3350813a82a639fcb3babae0de2b87aaae) Co-authored-by: Victor Stinner files: M Modules/_tracemalloc.c diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c index 90498fb7a7897..ba0eb738abcbc 100644 --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1241,6 +1241,9 @@ tracemalloc_copy_domain(_Py_hashtable_t *domains, _Py_hashtable_t *traces = (_Py_hashtable_t *)value; _Py_hashtable_t *traces2 = tracemalloc_copy_traces(traces); + if (traces2 == NULL) { + return -1; + } if (_Py_hashtable_set(domains2, TO_PTR(domain), traces2) < 0) { _Py_hashtable_destroy(traces2); return -1; From webhook-mailer at python.org Thu Jan 13 23:35:26 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 14 Jan 2022 04:35:26 -0000 Subject: [Python-checkins] bpo-46280: Fix tracemalloc_copy_domain() (GH-30591) Message-ID: https://github.com/python/cpython/commit/ae6e255cb362557ff713ff2967aecb92f7eb069c commit: ae6e255cb362557ff713ff2967aecb92f7eb069c branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-13T20:35:21-08:00 summary: bpo-46280: Fix tracemalloc_copy_domain() (GH-30591) Test if tracemalloc_copy_traces() failed to allocated memory in tracemalloc_copy_domain(). 
(cherry picked from commit 7c770d3350813a82a639fcb3babae0de2b87aaae) Co-authored-by: Victor Stinner files: M Modules/_tracemalloc.c diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c index 90498fb7a7897..ba0eb738abcbc 100644 --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1241,6 +1241,9 @@ tracemalloc_copy_domain(_Py_hashtable_t *domains, _Py_hashtable_t *traces = (_Py_hashtable_t *)value; _Py_hashtable_t *traces2 = tracemalloc_copy_traces(traces); + if (traces2 == NULL) { + return -1; + } if (_Py_hashtable_set(domains2, TO_PTR(domain), traces2) < 0) { _Py_hashtable_destroy(traces2); return -1; From webhook-mailer at python.org Fri Jan 14 06:11:55 2022 From: webhook-mailer at python.org (tiran) Date: Fri, 14 Jan 2022 11:11:55 -0000 Subject: [Python-checkins] bpo-40280: Build WASM stdlib bundle and more modules for node (GH-30597) Message-ID: https://github.com/python/cpython/commit/c8319f7921fbcb0dea04da48a1b04a5d0d21ae1c commit: c8319f7921fbcb0dea04da48a1b04a5d0d21ae1c branch: main author: Christian Heimes committer: tiran date: 2022-01-14T12:11:49+01:00 summary: bpo-40280: Build WASM stdlib bundle and more modules for node (GH-30597) files: M Makefile.pre.in M configure M configure.ac diff --git a/Makefile.pre.in b/Makefile.pre.in index 0782d95482576..0b4d9a5240158 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -2399,6 +2399,7 @@ clean-retain-profile: pycremoval -rm -f Lib/lib2to3/*Grammar*.pickle -rm -f _bootstrap_python -rm -f python.html python*.js python.data + -rm -rf $(WASM_STDLIB) -rm -f Programs/_testembed Programs/_freeze_module -rm -f Python/deepfreeze/*.[co] -rm -f Python/frozen_modules/*.h diff --git a/configure b/configure index ccbf4fc0b180f..b5a6e0c51bd52 100755 --- a/configure +++ b/configure @@ -6577,7 +6577,13 @@ fi $as_echo "$LDLIBRARY" >&6; } # LIBRARY_DEPS, LINK_PYTHON_OBJS and LINK_PYTHON_DEPS variable -LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)' +case $ac_sys_system/$ac_sys_emscripten_target in #( + Emscripten/browser) : + LIBRARY_DEPS='$(PY3LIBRARY) $(WASM_STDLIB)' ;; #( + *) : + LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)' + ;; +esac LINK_PYTHON_DEPS='$(LIBRARY_DEPS)' if test "$PY_ENABLE_SHARED" = 1 || test "$enable_framework" ; then LIBRARY_DEPS="\$(LDLIBRARY) $LIBRARY_DEPS" @@ -7669,14 +7675,14 @@ fi case $ac_sys_system/$ac_sys_emscripten_target in #( Emscripten/browser) : - LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 --preload-file \$(WASM_ASSETS_DIR)" + LDFLAGS_NODIST="$LDFLAGS_NODIST -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 --preload-file \$(WASM_ASSETS_DIR)" WASM_ASSETS_DIR=".\$(prefix)" WASM_STDLIB="\$(WASM_ASSETS_DIR)/local/lib/python\$(VERSION)/os.py" ;; #( Emscripten/node) : - LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" - CFLAGS_NODIST="$(CFLAGS_NODIST) -pthread" + LDFLAGS_NODIST="$LDFLAGS_NODIST -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" + CFLAGS_NODIST="$CFLAGS_NODIST -pthread" ;; #( WASI) : @@ -21294,22 +21300,26 @@ $as_echo "yes" >&6; } fi -case $ac_sys_system in #( - AIX) : +case $ac_sys_system/$ac_sys_emscripten_target in #( + AIX/*) : py_stdlib_not_available="_scproxy spwd" ;; #( - VxWorks*) : + VxWorks*/*) : py_stdlib_not_available="_scproxy _crypt termios grp" ;; #( - Darwin) : + Darwin/*) : py_stdlib_not_available="ossaudiodev spwd" ;; #( - CYGWIN*) : + CYGWIN*/*) : py_stdlib_not_available="_scproxy nis" ;; #( - QNX*) : 
+ QNX*/*) : py_stdlib_not_available="_scproxy nis" ;; #( - FreeBSD*) : + FreeBSD*/*) : py_stdlib_not_available="_scproxy spwd" ;; #( - Emscripten) : + Emscripten/browser) : py_stdlib_not_available="_ctypes _curses _curses_panel _dbm _gdbm _multiprocessing _posixshmem _posixsubprocess _scproxy _tkinter _xxsubinterpreters fcntl grp nis ossaudiodev resource readline spwd syslog termios" + ;; #( + Emscripten/node) : + + py_stdlib_not_available="_ctypes _curses _curses_panel _dbm _gdbm _scproxy _tkinter nis ossaudiodev spwd syslog" ;; #( *) : py_stdlib_not_available="_scproxy" diff --git a/configure.ac b/configure.ac index 89041b205f50d..300d793ad7dfa 100644 --- a/configure.ac +++ b/configure.ac @@ -1354,7 +1354,10 @@ fi AC_MSG_RESULT($LDLIBRARY) # LIBRARY_DEPS, LINK_PYTHON_OBJS and LINK_PYTHON_DEPS variable -LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)' +AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], + [Emscripten/browser], [LIBRARY_DEPS='$(PY3LIBRARY) $(WASM_STDLIB)'], + [LIBRARY_DEPS='$(PY3LIBRARY) $(EXPORTSYMS)'] +) LINK_PYTHON_DEPS='$(LIBRARY_DEPS)' if test "$PY_ENABLE_SHARED" = 1 || test "$enable_framework" ; then LIBRARY_DEPS="\$(LDLIBRARY) $LIBRARY_DEPS" @@ -1839,13 +1842,13 @@ fi # WASM flags AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], [Emscripten/browser], [ - LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 --preload-file \$(WASM_ASSETS_DIR)" + LDFLAGS_NODIST="$LDFLAGS_NODIST -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 --preload-file \$(WASM_ASSETS_DIR)" WASM_ASSETS_DIR=".\$(prefix)" WASM_STDLIB="\$(WASM_ASSETS_DIR)/local/lib/python\$(VERSION)/os.py" ], [Emscripten/node], [ - LDFLAGS_NODIST="$(LDFLAGS_NODIST) -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" - CFLAGS_NODIST="$(CFLAGS_NODIST) -pthread" + LDFLAGS_NODIST="$LDFLAGS_NODIST -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" + CFLAGS_NODIST="$CFLAGS_NODIST -pthread" ], [WASI], [ AC_DEFINE([_WASI_EMULATED_SIGNAL], [1], [Define to 1 if you want to emulate signals on WASI]) @@ -6336,14 +6339,14 @@ AC_SUBST(TEST_MODULES) dnl Modules that are not available on some platforms dnl AIX has shadow passwords, but access is not via getspent() dnl VxWorks does not provide crypt() function -AS_CASE([$ac_sys_system], - [AIX], [py_stdlib_not_available="_scproxy spwd"], - [VxWorks*], [py_stdlib_not_available="_scproxy _crypt termios grp"], - [Darwin], [py_stdlib_not_available="ossaudiodev spwd"], - [CYGWIN*], [py_stdlib_not_available="_scproxy nis"], - [QNX*], [py_stdlib_not_available="_scproxy nis"], - [FreeBSD*], [py_stdlib_not_available="_scproxy spwd"], - [Emscripten], [ +AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], + [AIX/*], [py_stdlib_not_available="_scproxy spwd"], + [VxWorks*/*], [py_stdlib_not_available="_scproxy _crypt termios grp"], + [Darwin/*], [py_stdlib_not_available="ossaudiodev spwd"], + [CYGWIN*/*], [py_stdlib_not_available="_scproxy nis"], + [QNX*/*], [py_stdlib_not_available="_scproxy nis"], + [FreeBSD*/*], [py_stdlib_not_available="_scproxy spwd"], + [Emscripten/browser], [ py_stdlib_not_available="m4_normalize([ _ctypes _curses @@ -6367,6 +6370,23 @@ AS_CASE([$ac_sys_system], termios ])" ], + dnl Some modules like _posixsubprocess do not work. We build them anyway + dnl so imports in tests do not fail. 
+ [Emscripten/node], [ + py_stdlib_not_available="m4_normalize([ + _ctypes + _curses + _curses_panel + _dbm + _gdbm + _scproxy + _tkinter + nis + ossaudiodev + spwd + syslog + ])" + ], [py_stdlib_not_available="_scproxy"] ) From webhook-mailer at python.org Fri Jan 14 10:31:48 2022 From: webhook-mailer at python.org (zooba) Date: Fri, 14 Jan 2022 15:31:48 -0000 Subject: [Python-checkins] bpo-46362: Ensure abspath() tests pass through environment variables to subprocess (GH-30595) Message-ID: https://github.com/python/cpython/commit/71c0b859ae16ee748cbb050a1f4de93c04e04f83 commit: 71c0b859ae16ee748cbb050a1f4de93c04e04f83 branch: main author: neonene <53406459+neonene at users.noreply.github.com> committer: zooba date: 2022-01-14T15:31:15Z summary: bpo-46362: Ensure abspath() tests pass through environment variables to subprocess (GH-30595) files: M Lib/test/test_embed.py diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index 02bbe3511c6f7..9fed0a5f14e65 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -1419,7 +1419,8 @@ def test_getpath_abspath_win32(self): ] out, err = self.run_embedded_interpreter( "test_init_initialize_config", - env=dict(PYTHONPATH=os.path.pathsep.join(c[0] for c in CASES)) + env={**remove_python_envvars(), + "PYTHONPATH": os.path.pathsep.join(c[0] for c in CASES)} ) self.assertEqual(err, "") try: From webhook-mailer at python.org Fri Jan 14 12:25:53 2022 From: webhook-mailer at python.org (iritkatriel) Date: Fri, 14 Jan 2022 17:25:53 -0000 Subject: [Python-checkins] bpo-23183: Document the timeit output (GH-30359) Message-ID: https://github.com/python/cpython/commit/73140de97cbeb01bb6c9af1da89ecb9355921e91 commit: 73140de97cbeb01bb6c9af1da89ecb9355921e91 branch: main author: Hugo van Kemenade committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-14T17:25:36Z summary: bpo-23183: Document the timeit output (GH-30359) Co-authored-by: Robert Collins files: M Doc/library/timeit.rst diff --git a/Doc/library/timeit.rst b/Doc/library/timeit.rst index 7f1c41d46399e..660a546e72189 100644 --- a/Doc/library/timeit.rst +++ b/Doc/library/timeit.rst @@ -282,6 +282,13 @@ It is possible to provide a setup statement that is executed only once at the be $ python -m timeit -s 'text = "sample string"; char = "g"' 'text.find(char)' 1000000 loops, best of 5: 0.342 usec per loop +In the output, there are three fields. The loop count, which tells you how many +times the statement body was run per timing loop repetition. The repetition +count ('best of 5') which tells you how many times the timing loop was +repeated, and finally the time the statement body took on average within the +best repetition of the timing loop. That is, the time the fastest repetition +took divided by the loop count. 
+ :: >>> import timeit From webhook-mailer at python.org Fri Jan 14 12:48:00 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 14 Jan 2022 17:48:00 -0000 Subject: [Python-checkins] bpo-23183: Document the timeit output (GH-30359) Message-ID: https://github.com/python/cpython/commit/26039d1e0a1da897d28688895126eb8bbd16f2c9 commit: 26039d1e0a1da897d28688895126eb8bbd16f2c9 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-14T09:47:51-08:00 summary: bpo-23183: Document the timeit output (GH-30359) Co-authored-by: Robert Collins (cherry picked from commit 73140de97cbeb01bb6c9af1da89ecb9355921e91) Co-authored-by: Hugo van Kemenade files: M Doc/library/timeit.rst diff --git a/Doc/library/timeit.rst b/Doc/library/timeit.rst index d4e8b749db480..ca21fe622323f 100644 --- a/Doc/library/timeit.rst +++ b/Doc/library/timeit.rst @@ -282,6 +282,13 @@ It is possible to provide a setup statement that is executed only once at the be $ python -m timeit -s 'text = "sample string"; char = "g"' 'text.find(char)' 1000000 loops, best of 5: 0.342 usec per loop +In the output, there are three fields. The loop count, which tells you how many +times the statement body was run per timing loop repetition. The repetition +count ('best of 5') which tells you how many times the timing loop was +repeated, and finally the time the statement body took on average within the +best repetition of the timing loop. That is, the time the fastest repetition +took divided by the loop count. + :: >>> import timeit From webhook-mailer at python.org Fri Jan 14 12:48:48 2022 From: webhook-mailer at python.org (tiran) Date: Fri, 14 Jan 2022 17:48:48 -0000 Subject: [Python-checkins] bpo-40280: Block more syscalls that are causing crashes in tests (GH-30601) Message-ID: https://github.com/python/cpython/commit/ee1a8b336d30476e9635a6826f61a99fc3604159 commit: ee1a8b336d30476e9635a6826f61a99fc3604159 branch: main author: Christian Heimes committer: tiran date: 2022-01-14T18:48:44+01:00 summary: bpo-40280: Block more syscalls that are causing crashes in tests (GH-30601) files: M Tools/wasm/config.site-wasm32-emscripten diff --git a/Tools/wasm/config.site-wasm32-emscripten b/Tools/wasm/config.site-wasm32-emscripten index ce9dec7ecf6d4..c15e4fc6b64b1 100644 --- a/Tools/wasm/config.site-wasm32-emscripten +++ b/Tools/wasm/config.site-wasm32-emscripten @@ -45,9 +45,10 @@ ac_cv_func_socketpair=no ac_cv_func_utimensat=no ac_cv_func_sigaction=no -# Untested syscalls in emscripten +# Untested or failing syscalls in emscripten ac_cv_func_openat=no ac_cv_func_mkdirat=no +ac_cv_func_faccessat=no ac_cv_func_fchownat=no ac_cv_func_renameat=no ac_cv_func_linkat=no @@ -71,5 +72,10 @@ ac_cv_header_sys_ioctl_h=no # sockets are supported, but only in non-blocking mode # ac_cv_header_sys_socket_h=no -# Unsupported functionality -#undef HAVE_PTHREAD_H +# aborts with bad ioctl +ac_cv_func_openpty=no +ac_cv_func_forkpty=no + +# To use dlopen, you need to use Emscripten's linking support, +# see https://github.com/emscripten-core/emscripten/wiki/Linking) +ac_cv_func_dlopen=no From webhook-mailer at python.org Fri Jan 14 12:53:50 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 14 Jan 2022 17:53:50 -0000 Subject: [Python-checkins] bpo-23183: Document the timeit output (GH-30359) Message-ID: https://github.com/python/cpython/commit/9badf6895a9bc1b01b2d6b2fb35419e7c5523ce6 commit: 
9badf6895a9bc1b01b2d6b2fb35419e7c5523ce6 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-14T09:53:38-08:00 summary: bpo-23183: Document the timeit output (GH-30359) Co-authored-by: Robert Collins (cherry picked from commit 73140de97cbeb01bb6c9af1da89ecb9355921e91) Co-authored-by: Hugo van Kemenade files: M Doc/library/timeit.rst diff --git a/Doc/library/timeit.rst b/Doc/library/timeit.rst index d4e8b749db480..ca21fe622323f 100644 --- a/Doc/library/timeit.rst +++ b/Doc/library/timeit.rst @@ -282,6 +282,13 @@ It is possible to provide a setup statement that is executed only once at the be $ python -m timeit -s 'text = "sample string"; char = "g"' 'text.find(char)' 1000000 loops, best of 5: 0.342 usec per loop +In the output, there are three fields. The loop count, which tells you how many +times the statement body was run per timing loop repetition. The repetition +count ('best of 5') which tells you how many times the timing loop was +repeated, and finally the time the statement body took on average within the +best repetition of the timing loop. That is, the time the fastest repetition +took divided by the loop count. + :: >>> import timeit From webhook-mailer at python.org Fri Jan 14 13:55:09 2022 From: webhook-mailer at python.org (mdickinson) Date: Fri, 14 Jan 2022 18:55:09 -0000 Subject: [Python-checkins] bpo-45569: Change PYLONG_BITS_IN_DIGIT default to 30 (GH-30497) Message-ID: https://github.com/python/cpython/commit/025cbe7a9b5d3058ce2eb8015d3650e396004545 commit: 025cbe7a9b5d3058ce2eb8015d3650e396004545 branch: main author: Mark Dickinson committer: mdickinson date: 2022-01-14T18:54:56Z summary: bpo-45569: Change PYLONG_BITS_IN_DIGIT default to 30 (GH-30497) files: A Misc/NEWS.d/next/Build/2022-01-09-11-24-54.bpo-45569.zCIENy.rst M Doc/using/configure.rst M Doc/whatsnew/3.11.rst M Include/pyport.h M configure M configure.ac diff --git a/Doc/using/configure.rst b/Doc/using/configure.rst index 771ad3cf18d97..f1c156c042353 100644 --- a/Doc/using/configure.rst +++ b/Doc/using/configure.rst @@ -35,8 +35,7 @@ General Options Define the size in bits of Python :class:`int` digits: 15 or 30 bits. - By default, the number of bits is selected depending on ``sizeof(void*)``: - 30 bits if ``void*`` size is 64-bit or larger, 15 bits otherwise. + By default, the digit size is 30. Define the ``PYLONG_BITS_IN_DIGIT`` to ``15`` or ``30``. diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 6a6c22c9077c9..96d6e26709342 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -622,6 +622,16 @@ Build Changes like Pyodide. (Contributed by Christian Heimes and Ethan Smith in :issue:`40280`.) +* CPython will now use 30-bit digits by default for the Python :class:`int` + implementation. Previously, the default was to use 30-bit digits on platforms + with ``SIZEOF_VOID_P >= 8``, and 15-bit digits otherwise. It's still possible + to explicitly request use of 15-bit digits via either the + ``--enable-big-digits`` option to the configure script or (for Windows) the + ``PYLONG_BITS_IN_DIGIT`` variable in ``PC/pyconfig.h``, but this option may + be removed at some point in the future. (Contributed by Mark Dickinson in + :issue:`45569`.) 
+ + C API Changes ============= diff --git a/Include/pyport.h b/Include/pyport.h index 81b1bde841e08..d27b3dde11659 100644 --- a/Include/pyport.h +++ b/Include/pyport.h @@ -85,20 +85,12 @@ Used in: Py_SAFE_DOWNCAST #define PY_INT32_T int32_t #define PY_INT64_T int64_t -/* If PYLONG_BITS_IN_DIGIT is not defined then we'll use 30-bit digits if all - the necessary integer types are available, and we're on a 64-bit platform - (as determined by SIZEOF_VOID_P); otherwise we use 15-bit digits. - - From pyodide: WASM has 32 bit pointers but has native 64 bit arithmetic - so it is more efficient to use 30 bit digits. +/* PYLONG_BITS_IN_DIGIT describes the number of bits per "digit" (limb) in the + * PyLongObject implementation (longintrepr.h). It's currently either 30 or 15, + * defaulting to 30. The 15-bit digit option may be removed in the future. */ - #ifndef PYLONG_BITS_IN_DIGIT -#if SIZEOF_VOID_P >= 8 || defined(__wasm__) -# define PYLONG_BITS_IN_DIGIT 30 -#else -# define PYLONG_BITS_IN_DIGIT 15 -#endif +#define PYLONG_BITS_IN_DIGIT 30 #endif /* uintptr_t is the C9X name for an unsigned integral type such that a diff --git a/Misc/NEWS.d/next/Build/2022-01-09-11-24-54.bpo-45569.zCIENy.rst b/Misc/NEWS.d/next/Build/2022-01-09-11-24-54.bpo-45569.zCIENy.rst new file mode 100644 index 0000000000000..69716cd9af5b2 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-09-11-24-54.bpo-45569.zCIENy.rst @@ -0,0 +1,5 @@ +The build now defaults to using 30-bit digits for Python integers. Previously +either 15-bit or 30-bit digits would be selected, depending on the platform. +15-bit digits may still be selected using the ``--enable-big-digits=15`` option +to the ``configure`` script, or by defining ``PYLONG_BITS_IN_DIGIT`` in +``pyconfig.h``. diff --git a/configure b/configure index b5a6e0c51bd52..1dee645c387eb 100755 --- a/configure +++ b/configure @@ -1730,7 +1730,7 @@ Optional Features: Doc/library/socket.rst (default is yes if supported) --enable-big-digits[=15|30] use big digits (30 or 15 bits) for Python longs - (default is system-dependent)] + (default is 30)] --disable-test-modules don't build nor install test modules Optional Packages: diff --git a/configure.ac b/configure.ac index 300d793ad7dfa..7b084a264d411 100644 --- a/configure.ac +++ b/configure.ac @@ -5084,7 +5084,7 @@ AC_CHECK_DECLS([RTLD_LAZY, RTLD_NOW, RTLD_GLOBAL, RTLD_LOCAL, RTLD_NODELETE, RTL # determine what size digit to use for Python's longs AC_MSG_CHECKING([digit size for Python's longs]) AC_ARG_ENABLE(big-digits, -AS_HELP_STRING([--enable-big-digits@<:@=15|30@:>@],[use big digits (30 or 15 bits) for Python longs (default is system-dependent)]]), +AS_HELP_STRING([--enable-big-digits@<:@=15|30@:>@],[use big digits (30 or 15 bits) for Python longs (default is 30)]]), [case $enable_big_digits in yes) enable_big_digits=30 ;; From webhook-mailer at python.org Fri Jan 14 16:13:54 2022 From: webhook-mailer at python.org (rhettinger) Date: Fri, 14 Jan 2022 21:13:54 -0000 Subject: [Python-checkins] bpo-46380: Apply tests to both C and Python version (GH-30606) Message-ID: https://github.com/python/cpython/commit/c5640ef87511c960e339af37b486678788be910a commit: c5640ef87511c960e339af37b486678788be910a branch: main author: Nikita Sobolev committer: rhettinger date: 2022-01-14T13:13:45-08:00 summary: bpo-46380: Apply tests to both C and Python version (GH-30606) files: M Lib/test/test_functools.py diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py index 70ae8e06bb475..d527e31f39ffe 100644 --- a/Lib/test/test_functools.py +++ 
b/Lib/test/test_functools.py @@ -1414,7 +1414,7 @@ def test_lru_reentrancy_with_len(self): def test_lru_star_arg_handling(self): # Test regression that arose in ea064ff3c10f - @functools.lru_cache() + @self.module.lru_cache() def f(*args): return args @@ -1426,11 +1426,11 @@ def test_lru_type_error(self): # lru_cache was leaking when one of the arguments # wasn't cacheable. - @functools.lru_cache(maxsize=None) + @self.module.lru_cache(maxsize=None) def infinite_cache(o): pass - @functools.lru_cache(maxsize=10) + @self.module.lru_cache(maxsize=10) def limited_cache(o): pass From webhook-mailer at python.org Fri Jan 14 16:15:05 2022 From: webhook-mailer at python.org (pablogsal) Date: Fri, 14 Jan 2022 21:15:05 -0000 Subject: [Python-checkins] Python 3.10.2 Message-ID: https://github.com/python/cpython/commit/a58ebcc701dd6c43630df941481475ff0f615a81 commit: a58ebcc701dd6c43630df941481475ff0f615a81 branch: 3.10 author: Pablo Galindo committer: pablogsal date: 2022-01-13T18:52:14Z summary: Python 3.10.2 files: A Misc/NEWS.d/3.10.2.rst D Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst D Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst D Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-24-24.bpo-46004.TTEU1p.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-08-11-06-53.bpo-46009.cL8pH0.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst D Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst D Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst D Misc/NEWS.d/next/Documentation/2021-12-11-20-03-09.bpo-46040.qrsG0C.rst D Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst D Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst D Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst D Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst D Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst D Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst D Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst D Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst D Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst D Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst D Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst D Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst D Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst D Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst D Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst D 
Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst D Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst D Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst D Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst D Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst M Include/patchlevel.h M Lib/pydoc_data/topics.py M README.rst diff --git a/Include/patchlevel.h b/Include/patchlevel.h index 8a6fec39f6e2e..5769674a3272c 100644 --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -18,12 +18,12 @@ /*--start constants--*/ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 10 -#define PY_MICRO_VERSION 1 +#define PY_MICRO_VERSION 2 #define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.10.1+" +#define PY_VERSION "3.10.2" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. diff --git a/Lib/pydoc_data/topics.py b/Lib/pydoc_data/topics.py index 00c98ad51072c..1b5cfe24001c5 100644 --- a/Lib/pydoc_data/topics.py +++ b/Lib/pydoc_data/topics.py @@ -1,5 +1,5 @@ # -*- coding: utf-8 -*- -# Autogenerated by Sphinx on Mon Dec 6 17:57:38 2021 +# Autogenerated by Sphinx on Thu Jan 13 18:49:56 2022 topics = {'assert': 'The "assert" statement\n' '**********************\n' '\n' @@ -1007,7 +1007,7 @@ '"super(B,\n' ' obj).m()" searches "obj.__class__.__mro__" for the ' 'base class "A"\n' - ' immediately preceding "B" and then invokes the ' + ' immediately following "B" and then invokes the ' 'descriptor with the\n' ' call: "A.__dict__[\'m\'].__get__(obj, ' 'obj.__class__)".\n' @@ -1038,14 +1038,15 @@ 'can be\n' 'overridden by instances.\n' '\n' - 'Python methods (including "staticmethod()" and ' - '"classmethod()") are\n' - 'implemented as non-data descriptors. Accordingly, ' - 'instances can\n' - 'redefine and override methods. This allows individual ' - 'instances to\n' - 'acquire behaviors that differ from other instances of ' - 'the same class.\n' + 'Python methods (including those decorated with ' + '"@staticmethod" and\n' + '"@classmethod") are implemented as non-data ' + 'descriptors. Accordingly,\n' + 'instances can redefine and override methods. This ' + 'allows individual\n' + 'instances to acquire behaviors that differ from other ' + 'instances of the\n' + 'same class.\n' '\n' 'The "property()" function is implemented as a data ' 'descriptor.\n' @@ -1058,12 +1059,12 @@ '\n' '*__slots__* allow us to explicitly declare data members ' '(like\n' - 'properties) and deny the creation of *__dict__* and ' + 'properties) and deny the creation of "__dict__" and ' '*__weakref__*\n' '(unless explicitly declared in *__slots__* or available ' 'in a parent.)\n' '\n' - 'The space saved over using *__dict__* can be ' + 'The space saved over using "__dict__" can be ' 'significant. 
Attribute\n' 'lookup speed can be significantly improved as well.\n' '\n' @@ -1075,7 +1076,7 @@ '*__slots__*\n' ' reserves space for the declared variables and ' 'prevents the\n' - ' automatic creation of *__dict__* and *__weakref__* ' + ' automatic creation of "__dict__" and *__weakref__* ' 'for each\n' ' instance.\n' '\n' @@ -1084,11 +1085,11 @@ '--------------------------\n' '\n' '* When inheriting from a class without *__slots__*, the ' - '*__dict__* and\n' + '"__dict__" and\n' ' *__weakref__* attribute of the instances will always ' 'be accessible.\n' '\n' - '* Without a *__dict__* variable, instances cannot be ' + '* Without a "__dict__" variable, instances cannot be ' 'assigned new\n' ' variables not listed in the *__slots__* definition. ' 'Attempts to\n' @@ -1102,28 +1103,28 @@ '\n' '* Without a *__weakref__* variable for each instance, ' 'classes defining\n' - ' *__slots__* do not support weak references to its ' - 'instances. If weak\n' - ' reference support is needed, then add ' + ' *__slots__* do not support "weak references" to its ' + 'instances. If\n' + ' weak reference support is needed, then add ' '"\'__weakref__\'" to the\n' ' sequence of strings in the *__slots__* declaration.\n' '\n' '* *__slots__* are implemented at the class level by ' 'creating\n' - ' descriptors (Implementing Descriptors) for each ' - 'variable name. As a\n' - ' result, class attributes cannot be used to set default ' - 'values for\n' - ' instance variables defined by *__slots__*; otherwise, ' - 'the class\n' - ' attribute would overwrite the descriptor assignment.\n' + ' descriptors for each variable name. As a result, ' + 'class attributes\n' + ' cannot be used to set default values for instance ' + 'variables defined\n' + ' by *__slots__*; otherwise, the class attribute would ' + 'overwrite the\n' + ' descriptor assignment.\n' '\n' '* The action of a *__slots__* declaration is not limited ' 'to the class\n' ' where it is defined. *__slots__* declared in parents ' 'are available\n' ' in child classes. However, child subclasses will get a ' - '*__dict__*\n' + '"__dict__"\n' ' and *__weakref__* unless they also define *__slots__* ' '(which should\n' ' only contain names of any *additional* slots).\n' @@ -1143,13 +1144,19 @@ ' ?variable-length? built-in types such as "int", ' '"bytes" and "tuple".\n' '\n' - '* Any non-string iterable may be assigned to ' - '*__slots__*. Mappings may\n' - ' also be used; however, in the future, special meaning ' - 'may be\n' - ' assigned to the values corresponding to each key.\n' + '* Any non-string *iterable* may be assigned to ' + '*__slots__*.\n' '\n' - '* *__class__* assignment works only if both classes have ' + '* If a "dictionary" is used to assign *__slots__*, the ' + 'dictionary keys\n' + ' will be used as the slot names. The values of the ' + 'dictionary can be\n' + ' used to provide per-attribute docstrings that will be ' + 'recognised by\n' + ' "inspect.getdoc()" and displayed in the output of ' + '"help()".\n' + '\n' + '* "__class__" assignment works only if both classes have ' 'the same\n' ' *__slots__*.\n' '\n' @@ -1161,10 +1168,10 @@ 'violations\n' ' raise "TypeError".\n' '\n' - '* If an iterator is used for *__slots__* then a ' - 'descriptor is created\n' - ' for each of the iterator?s values. However, the ' - '*__slots__*\n' + '* If an *iterator* is used for *__slots__* then a ' + '*descriptor* is\n' + ' created for each of the iterator?s values. 
However, ' + 'the *__slots__*\n' ' attribute will be an empty iterator.\n', 'attribute-references': 'Attribute references\n' '********************\n' @@ -2378,33 +2385,6 @@ ':= a to b do"; e.g., "list(range(3))" returns the list "[0, 1, ' '2]".\n' '\n' - 'Note:\n' - '\n' - ' There is a subtlety when the sequence is being modified by the ' - 'loop\n' - ' (this can only occur for mutable sequences, e.g. lists). An\n' - ' internal counter is used to keep track of which item is used ' - 'next,\n' - ' and this is incremented on each iteration. When this counter ' - 'has\n' - ' reached the length of the sequence the loop terminates. This ' - 'means\n' - ' that if the suite deletes the current (or a previous) item ' - 'from the\n' - ' sequence, the next item will be skipped (since it gets the ' - 'index of\n' - ' the current item which has already been treated). Likewise, ' - 'if the\n' - ' suite inserts an item in the sequence before the current item, ' - 'the\n' - ' current item will be treated again the next time through the ' - 'loop.\n' - ' This can lead to nasty bugs that can be avoided by making a\n' - ' temporary copy using a slice of the whole sequence, e.g.,\n' - '\n' - ' for x in a[:]:\n' - ' if x < 0: a.remove(x)\n' - '\n' '\n' 'The "try" statement\n' '===================\n' @@ -4622,17 +4602,16 @@ 'debugger will pause execution just before the first line of the\n' 'module.\n' '\n' - 'The typical usage to break into the debugger from a running ' - 'program is\n' - 'to insert\n' + 'The typical usage to break into the debugger is to insert:\n' '\n' ' import pdb; pdb.set_trace()\n' '\n' - 'at the location you want to break into the debugger. You can ' - 'then\n' - 'step through the code following this statement, and continue ' - 'running\n' - 'without the debugger using the "continue" command.\n' + 'at the location you want to break into the debugger, and then ' + 'run the\n' + 'program. You can then step through the code following this ' + 'statement,\n' + 'and continue running without the debugger using the "continue"\n' + 'command.\n' '\n' 'New in version 3.7: The built-in "breakpoint()", when called ' 'with\n' @@ -5894,30 +5873,7 @@ 'all by the loop. Hint: the built-in function "range()" returns an\n' 'iterator of integers suitable to emulate the effect of Pascal?s "for ' 'i\n' - ':= a to b do"; e.g., "list(range(3))" returns the list "[0, 1, 2]".\n' - '\n' - 'Note:\n' - '\n' - ' There is a subtlety when the sequence is being modified by the ' - 'loop\n' - ' (this can only occur for mutable sequences, e.g. lists). An\n' - ' internal counter is used to keep track of which item is used next,\n' - ' and this is incremented on each iteration. When this counter has\n' - ' reached the length of the sequence the loop terminates. This ' - 'means\n' - ' that if the suite deletes the current (or a previous) item from ' - 'the\n' - ' sequence, the next item will be skipped (since it gets the index ' - 'of\n' - ' the current item which has already been treated). 
Likewise, if ' - 'the\n' - ' suite inserts an item in the sequence before the current item, the\n' - ' current item will be treated again the next time through the loop.\n' - ' This can lead to nasty bugs that can be avoided by making a\n' - ' temporary copy using a slice of the whole sequence, e.g.,\n' - '\n' - ' for x in a[:]:\n' - ' if x < 0: a.remove(x)\n', + ':= a to b do"; e.g., "list(range(3))" returns the list "[0, 1, 2]".\n', 'formatstrings': 'Format String Syntax\n' '********************\n' '\n' @@ -8574,61 +8530,62 @@ '\n' 'The following methods can be defined to implement ' 'container objects.\n' - 'Containers usually are sequences (such as lists or tuples) ' - 'or mappings\n' - '(like dictionaries), but can represent other containers as ' - 'well. The\n' - 'first set of methods is used either to emulate a sequence ' - 'or to\n' - 'emulate a mapping; the difference is that for a sequence, ' - 'the\n' - 'allowable keys should be the integers *k* for which "0 <= ' - 'k < N" where\n' - '*N* is the length of the sequence, or slice objects, which ' - 'define a\n' - 'range of items. It is also recommended that mappings ' - 'provide the\n' - 'methods "keys()", "values()", "items()", "get()", ' - '"clear()",\n' - '"setdefault()", "pop()", "popitem()", "copy()", and ' - '"update()"\n' - 'behaving similar to those for Python?s standard dictionary ' + 'Containers usually are *sequences* (such as "lists" or ' + '"tuples") or\n' + '*mappings* (like "dictionaries"), but can represent other ' + 'containers\n' + 'as well. The first set of methods is used either to ' + 'emulate a\n' + 'sequence or to emulate a mapping; the difference is that ' + 'for a\n' + 'sequence, the allowable keys should be the integers *k* ' + 'for which "0\n' + '<= k < N" where *N* is the length of the sequence, or ' + '"slice" objects,\n' + 'which define a range of items. It is also recommended ' + 'that mappings\n' + 'provide the methods "keys()", "values()", "items()", ' + '"get()",\n' + '"clear()", "setdefault()", "pop()", "popitem()", "copy()", ' + 'and\n' + '"update()" behaving similar to those for Python?s ' + 'standard\n' + '"dictionary" objects. The "collections.abc" module ' + 'provides a\n' + '"MutableMapping" *abstract base class* to help create ' + 'those methods\n' + 'from a base set of "__getitem__()", "__setitem__()", ' + '"__delitem__()",\n' + 'and "keys()". Mutable sequences should provide methods ' + '"append()",\n' + '"count()", "index()", "extend()", "insert()", "pop()", ' + '"remove()",\n' + '"reverse()" and "sort()", like Python standard "list" ' 'objects.\n' - 'The "collections.abc" module provides a "MutableMapping" ' - 'abstract base\n' - 'class to help create those methods from a base set of ' - '"__getitem__()",\n' - '"__setitem__()", "__delitem__()", and "keys()". Mutable ' - 'sequences\n' - 'should provide methods "append()", "count()", "index()", ' - '"extend()",\n' - '"insert()", "pop()", "remove()", "reverse()" and "sort()", ' - 'like Python\n' - 'standard list objects. Finally, sequence types should ' - 'implement\n' - 'addition (meaning concatenation) and multiplication ' + 'Finally, sequence types should implement addition ' '(meaning\n' - 'repetition) by defining the methods "__add__()", ' - '"__radd__()",\n' - '"__iadd__()", "__mul__()", "__rmul__()" and "__imul__()" ' - 'described\n' - 'below; they should not define other numerical operators. 
' + 'concatenation) and multiplication (meaning repetition) by ' + 'defining the\n' + 'methods "__add__()", "__radd__()", "__iadd__()", ' + '"__mul__()",\n' + '"__rmul__()" and "__imul__()" described below; they should ' + 'not define\n' + 'other numerical operators. It is recommended that both ' + 'mappings and\n' + 'sequences implement the "__contains__()" method to allow ' + 'efficient use\n' + 'of the "in" operator; for mappings, "in" should search the ' + 'mapping?s\n' + 'keys; for sequences, it should search through the values. ' 'It is\n' - 'recommended that both mappings and sequences implement ' + 'further recommended that both mappings and sequences ' + 'implement the\n' + '"__iter__()" method to allow efficient iteration through ' 'the\n' - '"__contains__()" method to allow efficient use of the "in" ' - 'operator;\n' - 'for mappings, "in" should search the mapping?s keys; for ' - 'sequences, it\n' - 'should search through the values. It is further ' - 'recommended that both\n' - 'mappings and sequences implement the "__iter__()" method ' - 'to allow\n' - 'efficient iteration through the container; for mappings, ' - '"__iter__()"\n' - 'should iterate through the object?s keys; for sequences, ' - 'it should\n' - 'iterate through the values.\n' + 'container; for mappings, "__iter__()" should iterate ' + 'through the\n' + 'object?s keys; for sequences, it should iterate through ' + 'the values.\n' '\n' 'object.__len__(self)\n' '\n' @@ -9789,7 +9746,7 @@ '"super(B,\n' ' obj).m()" searches "obj.__class__.__mro__" for the base ' 'class "A"\n' - ' immediately preceding "B" and then invokes the descriptor ' + ' immediately following "B" and then invokes the descriptor ' 'with the\n' ' call: "A.__dict__[\'m\'].__get__(obj, obj.__class__)".\n' '\n' @@ -9819,13 +9776,14 @@ 'be\n' 'overridden by instances.\n' '\n' - 'Python methods (including "staticmethod()" and ' - '"classmethod()") are\n' - 'implemented as non-data descriptors. Accordingly, instances ' - 'can\n' - 'redefine and override methods. This allows individual ' - 'instances to\n' - 'acquire behaviors that differ from other instances of the ' + 'Python methods (including those decorated with ' + '"@staticmethod" and\n' + '"@classmethod") are implemented as non-data descriptors. ' + 'Accordingly,\n' + 'instances can redefine and override methods. This allows ' + 'individual\n' + 'instances to acquire behaviors that differ from other ' + 'instances of the\n' 'same class.\n' '\n' 'The "property()" function is implemented as a data ' @@ -9839,12 +9797,12 @@ '\n' '*__slots__* allow us to explicitly declare data members ' '(like\n' - 'properties) and deny the creation of *__dict__* and ' + 'properties) and deny the creation of "__dict__" and ' '*__weakref__*\n' '(unless explicitly declared in *__slots__* or available in a ' 'parent.)\n' '\n' - 'The space saved over using *__dict__* can be significant. ' + 'The space saved over using "__dict__" can be significant. 
' 'Attribute\n' 'lookup speed can be significantly improved as well.\n' '\n' @@ -9856,7 +9814,7 @@ '*__slots__*\n' ' reserves space for the declared variables and prevents ' 'the\n' - ' automatic creation of *__dict__* and *__weakref__* for ' + ' automatic creation of "__dict__" and *__weakref__* for ' 'each\n' ' instance.\n' '\n' @@ -9865,11 +9823,11 @@ '~~~~~~~~~~~~~~~~~~~~~~~~~~\n' '\n' '* When inheriting from a class without *__slots__*, the ' - '*__dict__* and\n' + '"__dict__" and\n' ' *__weakref__* attribute of the instances will always be ' 'accessible.\n' '\n' - '* Without a *__dict__* variable, instances cannot be ' + '* Without a "__dict__" variable, instances cannot be ' 'assigned new\n' ' variables not listed in the *__slots__* definition. ' 'Attempts to\n' @@ -9882,28 +9840,28 @@ '\n' '* Without a *__weakref__* variable for each instance, ' 'classes defining\n' - ' *__slots__* do not support weak references to its ' - 'instances. If weak\n' - ' reference support is needed, then add "\'__weakref__\'" to ' - 'the\n' + ' *__slots__* do not support "weak references" to its ' + 'instances. If\n' + ' weak reference support is needed, then add ' + '"\'__weakref__\'" to the\n' ' sequence of strings in the *__slots__* declaration.\n' '\n' '* *__slots__* are implemented at the class level by ' 'creating\n' - ' descriptors (Implementing Descriptors) for each variable ' - 'name. As a\n' - ' result, class attributes cannot be used to set default ' - 'values for\n' - ' instance variables defined by *__slots__*; otherwise, the ' - 'class\n' - ' attribute would overwrite the descriptor assignment.\n' + ' descriptors for each variable name. As a result, class ' + 'attributes\n' + ' cannot be used to set default values for instance ' + 'variables defined\n' + ' by *__slots__*; otherwise, the class attribute would ' + 'overwrite the\n' + ' descriptor assignment.\n' '\n' '* The action of a *__slots__* declaration is not limited to ' 'the class\n' ' where it is defined. *__slots__* declared in parents are ' 'available\n' ' in child classes. However, child subclasses will get a ' - '*__dict__*\n' + '"__dict__"\n' ' and *__weakref__* unless they also define *__slots__* ' '(which should\n' ' only contain names of any *additional* slots).\n' @@ -9923,13 +9881,18 @@ ' ?variable-length? built-in types such as "int", "bytes" ' 'and "tuple".\n' '\n' - '* Any non-string iterable may be assigned to *__slots__*. ' - 'Mappings may\n' - ' also be used; however, in the future, special meaning may ' - 'be\n' - ' assigned to the values corresponding to each key.\n' + '* Any non-string *iterable* may be assigned to *__slots__*.\n' + '\n' + '* If a "dictionary" is used to assign *__slots__*, the ' + 'dictionary keys\n' + ' will be used as the slot names. The values of the ' + 'dictionary can be\n' + ' used to provide per-attribute docstrings that will be ' + 'recognised by\n' + ' "inspect.getdoc()" and displayed in the output of ' + '"help()".\n' '\n' - '* *__class__* assignment works only if both classes have the ' + '* "__class__" assignment works only if both classes have the ' 'same\n' ' *__slots__*.\n' '\n' @@ -9941,9 +9904,9 @@ 'violations\n' ' raise "TypeError".\n' '\n' - '* If an iterator is used for *__slots__* then a descriptor ' - 'is created\n' - ' for each of the iterator?s values. However, the ' + '* If an *iterator* is used for *__slots__* then a ' + '*descriptor* is\n' + ' created for each of the iterator?s values. 
However, the ' '*__slots__*\n' ' attribute will be an empty iterator.\n' '\n' @@ -9952,7 +9915,7 @@ '==========================\n' '\n' 'Whenever a class inherits from another class, ' - '*__init_subclass__* is\n' + '"__init_subclass__()" is\n' 'called on that class. This way, it is possible to write ' 'classes which\n' 'change the behavior of subclasses. This is closely related ' @@ -10152,10 +10115,10 @@ 'come from\n' 'the class definition). The "__prepare__" method should be ' 'implemented\n' - 'as a "classmethod()". The namespace returned by ' - '"__prepare__" is\n' - 'passed in to "__new__", but when the final class object is ' - 'created the\n' + 'as a "classmethod". The namespace returned by "__prepare__" ' + 'is passed\n' + 'in to "__new__", but when the final class object is created ' + 'the\n' 'namespace is copied into a new "dict".\n' '\n' 'If the metaclass has no "__prepare__" attribute, then the ' @@ -10532,60 +10495,60 @@ '\n' 'The following methods can be defined to implement container ' 'objects.\n' - 'Containers usually are sequences (such as lists or tuples) ' - 'or mappings\n' - '(like dictionaries), but can represent other containers as ' - 'well. The\n' - 'first set of methods is used either to emulate a sequence or ' - 'to\n' - 'emulate a mapping; the difference is that for a sequence, ' - 'the\n' - 'allowable keys should be the integers *k* for which "0 <= k ' - '< N" where\n' - '*N* is the length of the sequence, or slice objects, which ' - 'define a\n' - 'range of items. It is also recommended that mappings ' - 'provide the\n' - 'methods "keys()", "values()", "items()", "get()", ' - '"clear()",\n' - '"setdefault()", "pop()", "popitem()", "copy()", and ' - '"update()"\n' - 'behaving similar to those for Python?s standard dictionary ' + 'Containers usually are *sequences* (such as "lists" or ' + '"tuples") or\n' + '*mappings* (like "dictionaries"), but can represent other ' + 'containers\n' + 'as well. The first set of methods is used either to emulate ' + 'a\n' + 'sequence or to emulate a mapping; the difference is that for ' + 'a\n' + 'sequence, the allowable keys should be the integers *k* for ' + 'which "0\n' + '<= k < N" where *N* is the length of the sequence, or ' + '"slice" objects,\n' + 'which define a range of items. It is also recommended that ' + 'mappings\n' + 'provide the methods "keys()", "values()", "items()", ' + '"get()",\n' + '"clear()", "setdefault()", "pop()", "popitem()", "copy()", ' + 'and\n' + '"update()" behaving similar to those for Python?s standard\n' + '"dictionary" objects. The "collections.abc" module provides ' + 'a\n' + '"MutableMapping" *abstract base class* to help create those ' + 'methods\n' + 'from a base set of "__getitem__()", "__setitem__()", ' + '"__delitem__()",\n' + 'and "keys()". Mutable sequences should provide methods ' + '"append()",\n' + '"count()", "index()", "extend()", "insert()", "pop()", ' + '"remove()",\n' + '"reverse()" and "sort()", like Python standard "list" ' 'objects.\n' - 'The "collections.abc" module provides a "MutableMapping" ' - 'abstract base\n' - 'class to help create those methods from a base set of ' - '"__getitem__()",\n' - '"__setitem__()", "__delitem__()", and "keys()". Mutable ' - 'sequences\n' - 'should provide methods "append()", "count()", "index()", ' - '"extend()",\n' - '"insert()", "pop()", "remove()", "reverse()" and "sort()", ' - 'like Python\n' - 'standard list objects. 
Finally, sequence types should ' - 'implement\n' - 'addition (meaning concatenation) and multiplication ' - '(meaning\n' - 'repetition) by defining the methods "__add__()", ' - '"__radd__()",\n' - '"__iadd__()", "__mul__()", "__rmul__()" and "__imul__()" ' - 'described\n' - 'below; they should not define other numerical operators. It ' - 'is\n' - 'recommended that both mappings and sequences implement the\n' - '"__contains__()" method to allow efficient use of the "in" ' - 'operator;\n' - 'for mappings, "in" should search the mapping?s keys; for ' - 'sequences, it\n' - 'should search through the values. It is further recommended ' - 'that both\n' - 'mappings and sequences implement the "__iter__()" method to ' - 'allow\n' - 'efficient iteration through the container; for mappings, ' - '"__iter__()"\n' - 'should iterate through the object?s keys; for sequences, it ' - 'should\n' - 'iterate through the values.\n' + 'Finally, sequence types should implement addition (meaning\n' + 'concatenation) and multiplication (meaning repetition) by ' + 'defining the\n' + 'methods "__add__()", "__radd__()", "__iadd__()", ' + '"__mul__()",\n' + '"__rmul__()" and "__imul__()" described below; they should ' + 'not define\n' + 'other numerical operators. It is recommended that both ' + 'mappings and\n' + 'sequences implement the "__contains__()" method to allow ' + 'efficient use\n' + 'of the "in" operator; for mappings, "in" should search the ' + 'mapping?s\n' + 'keys; for sequences, it should search through the values. ' + 'It is\n' + 'further recommended that both mappings and sequences ' + 'implement the\n' + '"__iter__()" method to allow efficient iteration through ' + 'the\n' + 'container; for mappings, "__iter__()" should iterate through ' + 'the\n' + 'object?s keys; for sequences, it should iterate through the ' + 'values.\n' '\n' 'object.__len__(self)\n' '\n' @@ -11493,9 +11456,9 @@ ' >>> from keyword import iskeyword\n' '\n' " >>> 'hello'.isidentifier(), iskeyword('hello')\n" - ' True, False\n' + ' (True, False)\n' " >>> 'def'.isidentifier(), iskeyword('def')\n" - ' True, True\n' + ' (True, True)\n' '\n' 'str.islower()\n' '\n' @@ -11846,7 +11809,7 @@ " >>> ' 1 2 3 '.split()\n" " ['1', '2', '3']\n" '\n' - 'str.splitlines([keepends])\n' + 'str.splitlines(keepends=False)\n' '\n' ' Return a list of the lines in the string, breaking at ' 'line\n' @@ -13203,14 +13166,14 @@ '"async\n' ' for" statement to execute the body of the function.\n' '\n' - ' Calling the asynchronous iterator?s "aiterator.__anext__()"\n' - ' method will return an *awaitable* which when awaited will\n' - ' execute until it provides a value using the "yield" ' - 'expression.\n' - ' When the function executes an empty "return" statement or ' - 'falls\n' - ' off the end, a "StopAsyncIteration" exception is raised and ' + ' Calling the asynchronous iterator?s "aiterator.__anext__" ' + 'method\n' + ' will return an *awaitable* which when awaited will execute ' + 'until\n' + ' it provides a value using the "yield" expression. 
When the\n' + ' function executes an empty "return" statement or falls off ' 'the\n' + ' end, a "StopAsyncIteration" exception is raised and the\n' ' asynchronous iterator will have reached the end of the set ' 'of\n' ' values to be yielded.\n' @@ -13754,9 +13717,9 @@ '"dict"\n' 'constructor.\n' '\n' - 'class dict(**kwarg)\n' - 'class dict(mapping, **kwarg)\n' - 'class dict(iterable, **kwarg)\n' + 'class dict(**kwargs)\n' + 'class dict(mapping, **kwargs)\n' + 'class dict(iterable, **kwargs)\n' '\n' ' Return a new dictionary initialized from an optional ' 'positional\n' @@ -14406,6 +14369,14 @@ 'Comparisons in\n' 'the language reference.)\n' '\n' + 'Forward and reversed iterators over mutable sequences access ' + 'values\n' + 'using an index. That index will continue to march forward (or\n' + 'backward) even if the underlying sequence is mutated. The ' + 'iterator\n' + 'terminates only when an "IndexError" or a "StopIteration" is\n' + 'encountered (or when the index drops below zero).\n' + '\n' 'Notes:\n' '\n' '1. While the "in" and "not in" operations are used only for ' @@ -14877,7 +14848,8 @@ '\n' ' The arguments to the range constructor must be integers ' '(either\n' - ' built-in "int" or any object that implements the "__index__"\n' + ' built-in "int" or any object that implements the ' + '"__index__()"\n' ' special method). If the *step* argument is omitted, it ' 'defaults to\n' ' "1". If the *start* argument is omitted, it defaults to "0". ' diff --git a/Misc/NEWS.d/3.10.2.rst b/Misc/NEWS.d/3.10.2.rst new file mode 100644 index 0000000000000..c0fc751c19971 --- /dev/null +++ b/Misc/NEWS.d/3.10.2.rst @@ -0,0 +1,393 @@ +.. bpo: 46347 +.. date: 2022-01-11-13-57-00 +.. nonce: Gd8M-S +.. release date: 2022-01-13 +.. section: Core and Builtins + +Fix memory leak in PyEval_EvalCodeEx. + +.. + +.. bpo: 46289 +.. date: 2022-01-07-23-32-03 +.. nonce: NnjpVc +.. section: Core and Builtins + +ASDL declaration of ``FormattedValue`` has changed to reflect ``conversion`` +field is not optional. + +.. + +.. bpo: 46237 +.. date: 2022-01-07-19-33-05 +.. nonce: 9A6Hpq +.. section: Core and Builtins + +Fix the line number of tokenizer errors inside f-strings. Patch by Pablo +Galindo. + +.. + +.. bpo: 46006 +.. date: 2022-01-05-17-13-47 +.. nonce: hdH5Vn +.. section: Core and Builtins + +Fix a regression when a type method like ``__init__()`` is modified in a +subinterpreter. Fix a regression in ``_PyUnicode_EqualToASCIIId()`` and type +``update_slot()``. Revert the change which made the Unicode dictionary of +interned strings compatible with subinterpreters: the internal interned +dictionary is shared again by all interpreters. Patch by Victor Stinner. + +.. + +.. bpo: 46085 +.. date: 2021-12-30-00-23-41 +.. nonce: bDuJqu +.. section: Core and Builtins + +Fix iterator cache mechanism of :class:`OrderedDict`. + +.. + +.. bpo: 46110 +.. date: 2021-12-18-02-37-07 +.. nonce: B6hAfu +.. section: Core and Builtins + +Add a maximum recursion check to the PEG parser to avoid stack overflow. +Patch by Pablo Galindo + +.. + +.. bpo: 46054 +.. date: 2021-12-12-05-30-21 +.. nonce: 2P-foG +.. section: Core and Builtins + +Fix parser error when parsing non-utf8 characters in source files. Patch by +Pablo Galindo. + +.. + +.. bpo: 46042 +.. date: 2021-12-11-17-40-34 +.. nonce: aqYxku +.. section: Core and Builtins + +Improve the location of the caret in :exc:`SyntaxError` exceptions emitted +by the symbol table. Patch by Pablo Galindo. + +.. + +.. bpo: 46025 +.. date: 2021-12-09-11-41-35 +.. nonce: pkEvW9 +.. 
section: Core and Builtins + +Fix a crash in the :mod:`atexit` module involving functions that unregister +themselves before raising exceptions. Patch by Pablo Galindo. + +.. + +.. bpo: 46009 +.. date: 2021-12-08-11-06-53 +.. nonce: cL8pH0 +.. section: Core and Builtins + +Restore behavior from 3.9 and earlier when sending non-None to newly started +generator. In 3.9 this did not affect the state of the generator. In 3.10.0 +and 3.10.1 ``gen_func().send(0)`` is equivalent to +``gen_func().throw(TypeError(...)`` which exhausts the generator. In 3.10.2 +onward, the behavior has been reverted to that of 3.9. + +.. + +.. bpo: 46000 +.. date: 2021-12-07-11-42-44 +.. nonce: v_ru3k +.. section: Core and Builtins + +Improve compatibility of the :mod:`curses` module with NetBSD curses. + +.. + +.. bpo: 46004 +.. date: 2021-12-07-11-24-24 +.. nonce: TTEU1p +.. section: Core and Builtins + +Fix the :exc:`SyntaxError` location for errors involving for loops with +invalid targets. Patch by Pablo Galindo + +.. + +.. bpo: 42918 +.. date: 2021-12-06-15-32-12 +.. nonce: Czpgtg +.. section: Core and Builtins + +Fix bug where the built-in :func:`compile` function did not always raise a +:exc:`SyntaxError` when passed multiple statements in 'single' mode. Patch +by Weipeng Hong. + +.. + +.. bpo: 40479 +.. date: 2022-01-07-15-20-19 +.. nonce: EKfr3F +.. section: Library + +Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL +3.0.0 in FIPS mode. + +.. + +.. bpo: 46070 +.. date: 2022-01-07-13-51-22 +.. nonce: -axLUW +.. section: Library + +Fix possible segfault when importing the :mod:`asyncio` module from +different sub-interpreters in parallel. Patch by Erlend E. Aasland. + +.. + +.. bpo: 46278 +.. date: 2022-01-06-13-38-00 +.. nonce: wILA80 +.. section: Library + +Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop +implementations already support it. + +.. + +.. bpo: 46239 +.. date: 2022-01-03-12-59-20 +.. nonce: ySVSEy +.. section: Library + +Improve error message when importing :mod:`asyncio.windows_events` on +non-Windows. + +.. + +.. bpo: 20369 +.. date: 2021-12-17-12-06-40 +.. nonce: zzLuBz +.. section: Library + +:func:`concurrent.futures.wait` no longer blocks forever when given +duplicate Futures. Patch by Kumar Aditya. + +.. + +.. bpo: 46105 +.. date: 2021-12-16-14-30-36 +.. nonce: pprB1K +.. section: Library + +Honor spec when generating requirement specs with urls and extras +(importlib_metadata 4.8.3). + +.. + +.. bpo: 26952 +.. date: 2021-12-14-13-18-45 +.. nonce: hjhISq +.. section: Library + +:mod:`argparse` raises :exc:`ValueError` with clear message when trying to +render usage for an empty mutually-exclusive group. Previously it raised a +cryptic :exc:`IndexError`. + +.. + +.. bpo: 27718 +.. date: 2021-12-11-22-51-30 +.. nonce: MgQiGl +.. section: Library + +Fix help for the :mod:`signal` module. Some functions (e.g. ``signal()`` and +``getsignal()``) were omitted. + +.. + +.. bpo: 46032 +.. date: 2021-12-11-15-45-07 +.. nonce: HmciLT +.. section: Library + +The ``registry()`` method of :func:`functools.singledispatch` functions +checks now the first argument or the first parameter annotation and raises a +TypeError if it is not supported. Previously unsupported "types" were +ignored (e.g. ``typing.List[int]``) or caused an error at calling time (e.g. +``list[int]``). + +.. + +.. bpo: 46018 +.. date: 2021-12-09-00-44-42 +.. nonce: hkTI7v +.. section: Library + +Ensure that :func:`math.expm1` does not raise on underflow. + +.. + +.. 
bpo: 45755 +.. date: 2021-12-07-21-55-22 +.. nonce: bRqKGa +.. section: Library + +:mod:`typing` generic aliases now reveal the class attributes of the +original generic class when passed to ``dir()``. This was the behavior up to +Python 3.6, but was changed in 3.7-3.9. + +.. + +.. bpo: 13236 +.. date: 2021-11-30-13-52-02 +.. nonce: FmJIkO +.. section: Library + +:class:`unittest.TextTestResult` and :class:`unittest.TextTestRunner` flush +now the output stream more often. + +.. + +.. bpo: 42378 +.. date: 2021-07-25-08-17-55 +.. nonce: WIhUZK +.. section: Library + +Fixes the issue with log file being overwritten when +:class:`logging.FileHandler` is used in :mod:`atexit` with *filemode* set to +``'w'``. Note this will cause the message in *atexit* not being logged if +the log stream is already closed due to shutdown of logging. + +.. + +.. bpo: 46120 +.. date: 2021-12-21-12-45-57 +.. nonce: PE0DmJ +.. section: Documentation + +State that ``|`` is preferred for readability over ``Union`` in the +:mod:`typing` docs. + +.. + +.. bpo: 46040 +.. date: 2021-12-11-20-03-09 +.. nonce: qrsG0C +.. section: Documentation + +Fix removal Python version for ``@asyncio.coroutine``, the correct value is +3.11. + +.. + +.. bpo: 19737 +.. date: 2021-11-28-22-43-21 +.. nonce: cOOubB +.. section: Documentation + +Update the documentation for the :func:`globals` function. + +.. + +.. bpo: 45840 +.. date: 2021-11-19-02-02-32 +.. nonce: A51B2S +.. section: Documentation + +Improve cross-references in the documentation for the data model. + +.. + +.. bpo: 46205 +.. date: 2022-01-07-14-06-12 +.. nonce: dnc2OC +.. section: Tests + +Fix hang in runtest_mp due to race condition + +.. + +.. bpo: 46263 +.. date: 2022-01-06-15-45-34 +.. nonce: bJXek6 +.. section: Tests + +Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory +with junk byte. + +.. + +.. bpo: 46150 +.. date: 2021-12-23-13-42-15 +.. nonce: RhtADs +.. section: Tests + +Now ``fakename`` in ``test_pathlib.PosixPathTest.test_expanduser`` is +checked to be non-existent. + +.. + +.. bpo: 46129 +.. date: 2021-12-19-12-20-57 +.. nonce: I3MunH +.. section: Tests + +Rewrite ``asyncio.locks`` tests with +:class:`unittest.IsolatedAsyncioTestCase` usage. + +.. + +.. bpo: 46114 +.. date: 2021-12-17-14-46-19 +.. nonce: 9iyZ_9 +.. section: Tests + +Fix test case for OpenSSL 3.0.1 version. OpenSSL 3.0 uses ``0xMNN00PP0L``. + +.. + +.. bpo: 46263 +.. date: 2022-01-05-02-58-10 +.. nonce: xiv8NU +.. section: Build + +``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. + +.. + +.. bpo: 46106 +.. date: 2021-12-20-07-10-41 +.. nonce: 5qcv3L +.. section: Build + +Updated OpenSSL to 1.1.1m in Windows builds, macOS installer builds, and CI. +Patch by Kumar Aditya. + +.. + +.. bpo: 40477 +.. date: 2022-01-02-21-56-53 +.. nonce: W3nnM6 +.. section: macOS + +The Python Launcher app for macOS now properly launches scripts and, if +necessary, the Terminal app when running on recent macOS releases. + +.. + +.. bpo: 46236 +.. date: 2022-01-05-10-16-16 +.. nonce: pcmVQw +.. section: C API + +Fix a bug in :c:func:`PyFunction_GetAnnotations` that caused it to return a +``tuple`` instead of a ``dict``. 
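The :c:func:`PyFunction_GetAnnotations` fix above has a straightforward Python-level analogue: the ``__annotations__`` attribute of a function is expected to be a ``dict`` (a minimal sketch, not taken from the change itself)::

    >>> def f(x: int) -> str:
    ...     return str(x)
    ...
    >>> type(f.__annotations__)   # always a dict, never a tuple
    <class 'dict'>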
diff --git a/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst b/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst deleted file mode 100644 index d3e25f77c7336..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst +++ /dev/null @@ -1,2 +0,0 @@ -Updated OpenSSL to 1.1.1m in Windows builds, macOS installer builds, and CI. -Patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst deleted file mode 100644 index 3a575ed7f556b..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst +++ /dev/null @@ -1 +0,0 @@ -``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. diff --git a/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst b/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst deleted file mode 100644 index 61906584a16a3..0000000000000 --- a/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst +++ /dev/null @@ -1 +0,0 @@ -Fix a bug in :c:func:`PyFunction_GetAnnotations` that caused it to return a ``tuple`` instead of a ``dict``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst deleted file mode 100644 index f03dadebcf3b3..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix bug where the built-in :func:`compile` function did not always raise a -:exc:`SyntaxError` when passed multiple statements in 'single' mode. Patch by -Weipeng Hong. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-24-24.bpo-46004.TTEU1p.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-24-24.bpo-46004.TTEU1p.rst deleted file mode 100644 index 199bccf8166f0..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-24-24.bpo-46004.TTEU1p.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix the :exc:`SyntaxError` location for errors involving for loops with -invalid targets. Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst deleted file mode 100644 index 68e4bfa9e77b1..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst +++ /dev/null @@ -1 +0,0 @@ -Improve compatibility of the :mod:`curses` module with NetBSD curses. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-08-11-06-53.bpo-46009.cL8pH0.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-08-11-06-53.bpo-46009.cL8pH0.rst deleted file mode 100644 index a80e66b7c6451..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-08-11-06-53.bpo-46009.cL8pH0.rst +++ /dev/null @@ -1,5 +0,0 @@ -Restore behavior from 3.9 and earlier when sending non-None to newly started -generator. In 3.9 this did not affect the state of the generator. In 3.10.0 -and 3.10.1 ``gen_func().send(0)`` is equivalent to -``gen_func().throw(TypeError(...)`` which exhausts the generator. In 3.10.2 -onward, the behavior has been reverted to that of 3.9. 
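The generator change described in the bpo-46009 entry above is easiest to see interactively; the following is a sketch of the expected 3.9/3.10.2 behaviour rather than a captured session::

    >>> def gen_func():
    ...     yield 1
    ...     yield 2
    ...
    >>> g = gen_func()
    >>> g.send(0)          # sending non-None before the generator has started
    Traceback (most recent call last):
      ...
    TypeError: can't send non-None value to a just-started generator
    >>> next(g)            # the failed send leaves the generator state untouched
    1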
diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst deleted file mode 100644 index dd2f1ff4731e7..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix a crash in the :mod:`atexit` module involving functions that unregister -themselves before raising exceptions. Patch by Pablo Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst deleted file mode 100644 index 7a302bcd7648b..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst +++ /dev/null @@ -1,2 +0,0 @@ -Improve the location of the caret in :exc:`SyntaxError` exceptions emitted -by the symbol table. Patch by Pablo Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst deleted file mode 100644 index 6ca91f03445e2..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix parser error when parsing non-utf8 characters in source files. Patch by -Pablo Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst deleted file mode 100644 index 593d2855972c4..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add a maximum recursion check to the PEG parser to avoid stack overflow. -Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst deleted file mode 100644 index a2093f75c3b62..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst +++ /dev/null @@ -1 +0,0 @@ -Fix iterator cache mechanism of :class:`OrderedDict`. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst deleted file mode 100644 index 3acd2b09390a8..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst +++ /dev/null @@ -1,5 +0,0 @@ -Fix a regression when a type method like ``__init__()`` is modified in a -subinterpreter. Fix a regression in ``_PyUnicode_EqualToASCIIId()`` and type -``update_slot()``. Revert the change which made the Unicode dictionary of -interned strings compatible with subinterpreters: the internal interned -dictionary is shared again by all interpreters. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst deleted file mode 100644 index 931a2603293c3..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix the line number of tokenizer errors inside f-strings. Patch by Pablo -Galindo. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst deleted file mode 100644 index 816ff585f14e6..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst +++ /dev/null @@ -1,2 +0,0 @@ -ASDL declaration of ``FormattedValue`` has changed to reflect ``conversion`` -field is not optional. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst deleted file mode 100644 index fc12d6ba146ca..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst +++ /dev/null @@ -1 +0,0 @@ -Fix memory leak in PyEval_EvalCodeEx. diff --git a/Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst b/Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst deleted file mode 100644 index 87371e5b76bc1..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst +++ /dev/null @@ -1 +0,0 @@ -Improve cross-references in the documentation for the data model. diff --git a/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst b/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst deleted file mode 100644 index a3e16c9fdd0e6..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst +++ /dev/null @@ -1 +0,0 @@ -Update the documentation for the :func:`globals` function. diff --git a/Misc/NEWS.d/next/Documentation/2021-12-11-20-03-09.bpo-46040.qrsG0C.rst b/Misc/NEWS.d/next/Documentation/2021-12-11-20-03-09.bpo-46040.qrsG0C.rst deleted file mode 100644 index c63b2c92b3790..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-12-11-20-03-09.bpo-46040.qrsG0C.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix removal Python version for ``@asyncio.coroutine``, the correct value is -3.11. diff --git a/Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst b/Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst deleted file mode 100644 index 17f67472e2ab0..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst +++ /dev/null @@ -1 +0,0 @@ -State that ``|`` is preferred for readability over ``Union`` in the :mod:`typing` docs. diff --git a/Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst b/Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst deleted file mode 100644 index 90c3961dc87d8..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-07-25-08-17-55.bpo-42378.WIhUZK.rst +++ /dev/null @@ -1,4 +0,0 @@ -Fixes the issue with log file being overwritten when -:class:`logging.FileHandler` is used in :mod:`atexit` with *filemode* set to -``'w'``. Note this will cause the message in *atexit* not being logged if -the log stream is already closed due to shutdown of logging. diff --git a/Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst b/Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst deleted file mode 100644 index bfea8d4fca0e0..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst +++ /dev/null @@ -1,2 +0,0 @@ -:class:`unittest.TextTestResult` and :class:`unittest.TextTestRunner` flush -now the output stream more often. 
diff --git a/Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst b/Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst deleted file mode 100644 index e5201b0dfde2d..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst +++ /dev/null @@ -1,3 +0,0 @@ -:mod:`typing` generic aliases now reveal the class attributes of the -original generic class when passed to ``dir()``. This was the behavior up to -Python 3.6, but was changed in 3.7-3.9. diff --git a/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst b/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst deleted file mode 100644 index 6ff76f58779d2..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst +++ /dev/null @@ -1 +0,0 @@ -Ensure that :func:`math.expm1` does not raise on underflow. diff --git a/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst b/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst deleted file mode 100644 index 97a553d7ba29f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst +++ /dev/null @@ -1,5 +0,0 @@ -The ``registry()`` method of :func:`functools.singledispatch` functions -checks now the first argument or the first parameter annotation and raises a -TypeError if it is not supported. Previously unsupported "types" were -ignored (e.g. ``typing.List[int]``) or caused an error at calling time (e.g. -``list[int]``). diff --git a/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst b/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst deleted file mode 100644 index c68e98ff0630b..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix help for the :mod:`signal` module. Some functions (e.g. ``signal()`` and -``getsignal()``) were omitted. diff --git a/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst b/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst deleted file mode 100644 index 379dbb55c7ca8..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst +++ /dev/null @@ -1 +0,0 @@ -:mod:`argparse` raises :exc:`ValueError` with clear message when trying to render usage for an empty mutually-exclusive group. Previously it raised a cryptic :exc:`IndexError`. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst b/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst deleted file mode 100644 index 145edccb2e7a5..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst +++ /dev/null @@ -1,2 +0,0 @@ -Honor spec when generating requirement specs with urls and extras -(importlib_metadata 4.8.3). diff --git a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst deleted file mode 100644 index cc5cd0067e61f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst +++ /dev/null @@ -1 +0,0 @@ -:func:`concurrent.futures.wait` no longer blocks forever when given duplicate Futures. Patch by Kumar Aditya. 
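The :func:`concurrent.futures.wait` entry above concerns calls where the same Future object is passed more than once; a minimal sketch (assuming a trivial ``pow`` task)::

    >>> from concurrent.futures import ThreadPoolExecutor, wait
    >>> with ThreadPoolExecutor() as pool:
    ...     f = pool.submit(pow, 2, 10)
    ...     done, not_done = wait([f, f])   # duplicate Future; previously this could block forever
    ...
    >>> len(done), len(not_done)            # duplicates collapse into a single completed Future
    (1, 0)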
diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst deleted file mode 100644 index 202febf84fd10..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst +++ /dev/null @@ -1,2 +0,0 @@ -Improve error message when importing :mod:`asyncio.windows_events` on -non-Windows. diff --git a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst deleted file mode 100644 index 40849044cf1c8..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst +++ /dev/null @@ -1,2 +0,0 @@ -Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop -implementations already support it. diff --git a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst deleted file mode 100644 index 0fedc9dfb8fb1..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix possible segfault when importing the :mod:`asyncio` module from -different sub-interpreters in parallel. Patch by Erlend E. Aasland. diff --git a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst deleted file mode 100644 index af72923bbd759..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL -3.0.0 in FIPS mode. diff --git a/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst b/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst deleted file mode 100644 index 6878cea032387..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst +++ /dev/null @@ -1 +0,0 @@ -Fix test case for OpenSSL 3.0.1 version. OpenSSL 3.0 uses ``0xMNN00PP0L``. diff --git a/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst b/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst deleted file mode 100644 index b06436a4c8460..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst +++ /dev/null @@ -1,2 +0,0 @@ -Rewrite ``asyncio.locks`` tests with -:class:`unittest.IsolatedAsyncioTestCase` usage. diff --git a/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst b/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst deleted file mode 100644 index 8ef9cd9b4a594..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst +++ /dev/null @@ -1,2 +0,0 @@ -Now ``fakename`` in ``test_pathlib.PosixPathTest.test_expanduser`` is checked -to be non-existent. diff --git a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst deleted file mode 100644 index 0334af4e3cbe8..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory -with junk byte. 
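The *usedforsecurity* keyword from the bpo-40479 entry above is the regular :mod:`hashlib` flag for requesting an algorithm for non-security purposes (for example under FIPS); a minimal sketch::

    >>> import hashlib
    >>> h = hashlib.md5(b"cache key", usedforsecurity=False)   # allowed even where FIPS forbids MD5 for security use
    >>> len(h.hexdigest())                                     # MD5 digests are 32 hex characters
    32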
diff --git a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst deleted file mode 100644 index 7c6121fb16249..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst +++ /dev/null @@ -1 +0,0 @@ -Fix hang in runtest_mp due to race condition diff --git a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst deleted file mode 100644 index fc953b85dcc2a..0000000000000 --- a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst +++ /dev/null @@ -1,2 +0,0 @@ -The Python Launcher app for macOS now properly launches scripts and, if -necessary, the Terminal app when running on recent macOS releases. diff --git a/README.rst b/README.rst index d98a2ad9a3dae..ae35928628bb0 100644 --- a/README.rst +++ b/README.rst @@ -1,4 +1,4 @@ -This is Python version 3.10.1 +This is Python version 3.10.2 ============================= .. image:: https://travis-ci.com/python/cpython.svg?branch=master From webhook-mailer at python.org Fri Jan 14 16:21:29 2022 From: webhook-mailer at python.org (pablogsal) Date: Fri, 14 Jan 2022 21:21:29 -0000 Subject: [Python-checkins] Python 3.11.0a4 Message-ID: https://github.com/python/cpython/commit/9471106fd5b47418ffd2f90c6b729f58698f573e commit: 9471106fd5b47418ffd2f90c6b729f58698f573e branch: main author: Pablo Galindo committer: pablogsal date: 2022-01-13T19:38:15Z summary: Python 3.11.0a4 files: A Misc/NEWS.d/3.11.0a4.rst D Misc/NEWS.d/next/Build/2021-12-02-23-21-18.bpo-45949.OTSo9X.rst D Misc/NEWS.d/next/Build/2021-12-09-10-25-11.bpo-46023.PLpNB6.rst D Misc/NEWS.d/next/Build/2021-12-13-21-03-52.bpo-40280.b7NG4Y.rst D Misc/NEWS.d/next/Build/2021-12-15-10-37-44.bpo-46072.GgeAU3.rst D Misc/NEWS.d/next/Build/2021-12-16-14-18-07.bpo-46088.8UUuAd.rst D Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst D Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst D Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst D Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst D Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst D Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst D Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst D Misc/NEWS.d/next/C API/2021-12-08-12-41-51.bpo-46007.sMgDLz.rst D Misc/NEWS.d/next/C API/2021-12-11-08-41-36.bpo-45855.Lq2_gR.rst D Misc/NEWS.d/next/C API/2021-12-12-10-09-02.bpo-45855.MVsTDj.rst D Misc/NEWS.d/next/C API/2021-12-21-22-56-36.bpo-46140.dvXkYK.rst D Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst D Misc/NEWS.d/next/Core and Builtins/2021-04-24-15-39-23.bpo-43931.zpChDi.rst D Misc/NEWS.d/next/Core and Builtins/2021-05-30-16-37-47.bpo-43413.vYFPPC1.rst D Misc/NEWS.d/next/Core and Builtins/2021-11-22-13-05-32.bpo-45292.pfEouJ.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-04-21.bpo-44525.6OWCgr.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-57-43.bpo-45654.MZc7ei.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-10-13-42-17.bpo-37971.6BC1Tx.rst D Misc/NEWS.d/next/Core 
and Builtins/2021-12-11-13-14-42.bpo-46048._-OGD9.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-11-13-49-19.bpo-46049.9dNto2.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-12-15-52-41.bpo-45635.ADVaPT.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-01-13.bpo-46039.TrCBbF.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-12-16.bpo-44525.4-FiSf.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-15-15-17-04.bpo-45711.QK4QrB.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-16-23-27-05.bpo-46107.7q5an0.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-24-20-21-45.bpo-46055.R0QMVQ.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst D Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst D Misc/NEWS.d/next/Documentation/2021-12-16-21-13-55.bpo-46109.0-RNzu.rst D Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst D Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst D Misc/NEWS.d/next/Library/2020-11-26-10-23-46.bpo-42413.HFikOl.rst D Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst D Misc/NEWS.d/next/Library/2021-10-28-11-40-59.bpo-45643.jeiPiX.rst D Misc/NEWS.d/next/Library/2021-11-24-12-25-42.bpo-25066.YIcIkn.rst D Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst D Misc/NEWS.d/next/Library/2021-11-29-19-37-20.bpo-44674.NijWLt.rst D Misc/NEWS.d/next/Library/2021-12-02-11-55-45.bpo-45874.dtJIsN.rst D Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst D Misc/NEWS.d/next/Library/2021-12-08-19-15-03.bpo-46016.s9PuyF.rst D Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst D Misc/NEWS.d/next/Library/2021-12-09-11-50-32.bpo-27062.R5vii6.rst D Misc/NEWS.d/next/Library/2021-12-10-03-13-57.bpo-46014.3xYdST.rst D Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst D Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst D Misc/NEWS.d/next/Library/2021-12-12-13-41-47.bpo-16594.yfC7L4.rst 
D Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst D Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst D Misc/NEWS.d/next/Library/2021-12-15-19-24-54.bpo-22047.gBV4vT.rst D Misc/NEWS.d/next/Library/2021-12-16-12-54-21.bpo-22815.0NRH8s.rst D Misc/NEWS.d/next/Library/2021-12-16-13-54-55.bpo-44893.I7aLiW.rst D Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst D Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst D Misc/NEWS.d/next/Library/2021-12-17-13-22-37.bpo-37578._tluuR.rst D Misc/NEWS.d/next/Library/2021-12-17-16-27-44.bpo-46118.euAy0E.rst D Misc/NEWS.d/next/Library/2021-12-18-18-29-07.bpo-46125.LLmcox.rst D Misc/NEWS.d/next/Library/2021-12-19-00-00-48.bpo-45321.OyuhaY.rst D Misc/NEWS.d/next/Library/2021-12-23-14-36-58.bpo-43424.d9x2JZ.rst D Misc/NEWS.d/next/Library/2021-12-25-11-11-21.bpo-46176.EOY9wd.rst D Misc/NEWS.d/next/Library/2021-12-27-15-52-28.bpo-37295.s3LPo0.rst D Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst D Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst D Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst D Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst D Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst D Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst D Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst D Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst D Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst D Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst D Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst D Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst D Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst D Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst D Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst D Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst D Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst D Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst D Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst D Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst D Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst D Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst D Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst D Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst D Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst M Include/patchlevel.h M Lib/pydoc_data/topics.py M README.rst diff --git a/Include/patchlevel.h b/Include/patchlevel.h index 70205dbac07e7..c7da3481e2d0b 100644 --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -20,10 +20,10 @@ #define PY_MINOR_VERSION 11 #define PY_MICRO_VERSION 0 #define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_ALPHA -#define PY_RELEASE_SERIAL 3 +#define PY_RELEASE_SERIAL 4 /* Version as a string */ -#define PY_VERSION "3.11.0a3+" +#define PY_VERSION "3.11.0a4" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. 
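The comment in patchlevel.h above spells out the layout of the 4-byte hex version (0x010502B2 == 1.5.2b2). A small sketch decoding sys.hexversion, which uses the same layout; the local variable names are chosen here for illustration::

    import sys

    hv = sys.hexversion                     # same layout as PY_VERSION_HEX
    major  = (hv >> 24) & 0xFF
    minor  = (hv >> 16) & 0xFF
    micro  = (hv >> 8)  & 0xFF
    level  = {0xA: "alpha", 0xB: "beta", 0xC: "candidate", 0xF: "final"}[(hv >> 4) & 0xF]
    serial = hv & 0xF
    print(major, minor, micro, level, serial)   # e.g. 3 11 0 alpha 4 for 3.11.0a4
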
diff --git a/Lib/pydoc_data/topics.py b/Lib/pydoc_data/topics.py index e31d2d8de647b..5ce05420414de 100644 --- a/Lib/pydoc_data/topics.py +++ b/Lib/pydoc_data/topics.py @@ -1,5 +1,5 @@ # -*- coding: utf-8 -*- -# Autogenerated by Sphinx on Wed Dec 8 22:23:59 2021 +# Autogenerated by Sphinx on Thu Jan 13 19:37:48 2022 topics = {'assert': 'The "assert" statement\n' '**********************\n' '\n' @@ -1142,11 +1142,17 @@ ' ?variable-length? built-in types such as "int", ' '"bytes" and "tuple".\n' '\n' - '* Any non-string iterable may be assigned to ' - '*__slots__*. Mappings may\n' - ' also be used; however, in the future, special meaning ' - 'may be\n' - ' assigned to the values corresponding to each key.\n' + '* Any non-string *iterable* may be assigned to ' + '*__slots__*.\n' + '\n' + '* If a "dictionary" is used to assign *__slots__*, the ' + 'dictionary keys\n' + ' will be used as the slot names. The values of the ' + 'dictionary can be\n' + ' used to provide per-attribute docstrings that will be ' + 'recognised by\n' + ' "inspect.getdoc()" and displayed in the output of ' + '"help()".\n' '\n' '* "__class__" assignment works only if both classes have ' 'the same\n' @@ -2376,33 +2382,6 @@ ':= a to b do"; e.g., "list(range(3))" returns the list "[0, 1, ' '2]".\n' '\n' - 'Note:\n' - '\n' - ' There is a subtlety when the sequence is being modified by the ' - 'loop\n' - ' (this can only occur for mutable sequences, e.g. lists). An\n' - ' internal counter is used to keep track of which item is used ' - 'next,\n' - ' and this is incremented on each iteration. When this counter ' - 'has\n' - ' reached the length of the sequence the loop terminates. This ' - 'means\n' - ' that if the suite deletes the current (or a previous) item ' - 'from the\n' - ' sequence, the next item will be skipped (since it gets the ' - 'index of\n' - ' the current item which has already been treated). Likewise, ' - 'if the\n' - ' suite inserts an item in the sequence before the current item, ' - 'the\n' - ' current item will be treated again the next time through the ' - 'loop.\n' - ' This can lead to nasty bugs that can be avoided by making a\n' - ' temporary copy using a slice of the whole sequence, e.g.,\n' - '\n' - ' for x in a[:]:\n' - ' if x < 0: a.remove(x)\n' - '\n' '\n' 'The "try" statement\n' '===================\n' @@ -2411,13 +2390,18 @@ 'code\n' 'for a group of statements:\n' '\n' - ' try_stmt ::= try1_stmt | try2_stmt\n' + ' try_stmt ::= try1_stmt | try2_stmt | try3_stmt\n' ' try1_stmt ::= "try" ":" suite\n' ' ("except" [expression ["as" identifier]] ":" ' 'suite)+\n' ' ["else" ":" suite]\n' ' ["finally" ":" suite]\n' ' try2_stmt ::= "try" ":" suite\n' + ' ("except" "*" expression ["as" identifier] ":" ' + 'suite)+\n' + ' ["else" ":" suite]\n' + ' ["finally" ":" suite]\n' + ' try3_stmt ::= "try" ":" suite\n' ' "finally" ":" suite\n' '\n' 'The "except" clause(s) specify one or more exception handlers. ' @@ -2534,6 +2518,60 @@ ' >>> print(sys.exc_info())\n' ' (None, None, None)\n' '\n' + 'The "except*" clause(s) are used for handling "ExceptionGroup`s. ' + 'The\n' + 'exception type for matching is interpreted as in the case of\n' + ':keyword:`except", but in the case of exception groups we can ' + 'have\n' + 'partial matches when the type matches some of the exceptions in ' + 'the\n' + 'group. This means that multiple except* clauses can execute, ' + 'each\n' + 'handling part of the exception group. Each clause executes once ' + 'and\n' + 'handles an exception group of all matching exceptions. 
Each ' + 'exception\n' + 'in the group is handled by at most one except* clause, the first ' + 'that\n' + 'matches it.\n' + '\n' + ' >>> try:\n' + ' ... raise ExceptionGroup("eg",\n' + ' ... [ValueError(1), TypeError(2), OSError(3), ' + 'OSError(4)])\n' + ' ... except* TypeError as e:\n' + " ... print(f'caught {type(e)} with nested " + "{e.exceptions}')\n" + ' ... except* OSError as e:\n' + " ... print(f'caught {type(e)} with nested " + "{e.exceptions}')\n" + ' ...\n' + " caught with nested (TypeError(2),)\n" + " caught with nested (OSError(3), " + 'OSError(4))\n' + ' + Exception Group Traceback (most recent call last):\n' + ' | File "", line 2, in \n' + ' | ExceptionGroup: eg\n' + ' +-+---------------- 1 ----------------\n' + ' | ValueError: 1\n' + ' +------------------------------------\n' + ' >>>\n' + '\n' + ' Any remaining exceptions that were not handled by any except* ' + 'clause\n' + ' are re-raised at the end, combined into an exception group ' + 'along with\n' + ' all exceptions that were raised from within except* clauses.\n' + '\n' + ' An except* clause must have a matching type, and this type ' + 'cannot be a\n' + ' subclass of :exc:`BaseExceptionGroup`. It is not possible to ' + 'mix except\n' + ' and except* in the same :keyword:`try`. :keyword:`break`,\n' + ' :keyword:`continue` and :keyword:`return` cannot appear in an ' + 'except*\n' + ' clause.\n' + '\n' 'The optional "else" clause is executed if the control flow ' 'leaves the\n' '"try" suite, no exception was raised, and no "return", ' @@ -4620,17 +4658,16 @@ 'debugger will pause execution just before the first line of the\n' 'module.\n' '\n' - 'The typical usage to break into the debugger from a running ' - 'program is\n' - 'to insert\n' + 'The typical usage to break into the debugger is to insert:\n' '\n' ' import pdb; pdb.set_trace()\n' '\n' - 'at the location you want to break into the debugger. You can ' - 'then\n' - 'step through the code following this statement, and continue ' - 'running\n' - 'without the debugger using the "continue" command.\n' + 'at the location you want to break into the debugger, and then ' + 'run the\n' + 'program. You can then step through the code following this ' + 'statement,\n' + 'and continue running without the debugger using the "continue"\n' + 'command.\n' '\n' 'New in version 3.7: The built-in "breakpoint()", when called ' 'with\n' @@ -5897,30 +5934,7 @@ 'all by the loop. Hint: the built-in function "range()" returns an\n' 'iterator of integers suitable to emulate the effect of Pascal?s "for ' 'i\n' - ':= a to b do"; e.g., "list(range(3))" returns the list "[0, 1, 2]".\n' - '\n' - 'Note:\n' - '\n' - ' There is a subtlety when the sequence is being modified by the ' - 'loop\n' - ' (this can only occur for mutable sequences, e.g. lists). An\n' - ' internal counter is used to keep track of which item is used next,\n' - ' and this is incremented on each iteration. When this counter has\n' - ' reached the length of the sequence the loop terminates. This ' - 'means\n' - ' that if the suite deletes the current (or a previous) item from ' - 'the\n' - ' sequence, the next item will be skipped (since it gets the index ' - 'of\n' - ' the current item which has already been treated). 
Likewise, if ' - 'the\n' - ' suite inserts an item in the sequence before the current item, the\n' - ' current item will be treated again the next time through the loop.\n' - ' This can lead to nasty bugs that can be avoided by making a\n' - ' temporary copy using a slice of the whole sequence, e.g.,\n' - '\n' - ' for x in a[:]:\n' - ' if x < 0: a.remove(x)\n', + ':= a to b do"; e.g., "list(range(3))" returns the list "[0, 1, 2]".\n', 'formatstrings': 'Format String Syntax\n' '********************\n' '\n' @@ -9934,11 +9948,16 @@ ' ?variable-length? built-in types such as "int", "bytes" ' 'and "tuple".\n' '\n' - '* Any non-string iterable may be assigned to *__slots__*. ' - 'Mappings may\n' - ' also be used; however, in the future, special meaning may ' - 'be\n' - ' assigned to the values corresponding to each key.\n' + '* Any non-string *iterable* may be assigned to *__slots__*.\n' + '\n' + '* If a "dictionary" is used to assign *__slots__*, the ' + 'dictionary keys\n' + ' will be used as the slot names. The values of the ' + 'dictionary can be\n' + ' used to provide per-attribute docstrings that will be ' + 'recognised by\n' + ' "inspect.getdoc()" and displayed in the output of ' + '"help()".\n' '\n' '* "__class__" assignment works only if both classes have the ' 'same\n' @@ -11504,9 +11523,9 @@ ' >>> from keyword import iskeyword\n' '\n' " >>> 'hello'.isidentifier(), iskeyword('hello')\n" - ' True, False\n' + ' (True, False)\n' " >>> 'def'.isidentifier(), iskeyword('def')\n" - ' True, True\n' + ' (True, True)\n' '\n' 'str.islower()\n' '\n' @@ -11857,7 +11876,7 @@ " >>> ' 1 2 3 '.split()\n" " ['1', '2', '3']\n" '\n' - 'str.splitlines([keepends])\n' + 'str.splitlines(keepends=False)\n' '\n' ' Return a list of the lines in the string, breaking at ' 'line\n' @@ -12432,13 +12451,18 @@ 'The "try" statement specifies exception handlers and/or cleanup code\n' 'for a group of statements:\n' '\n' - ' try_stmt ::= try1_stmt | try2_stmt\n' + ' try_stmt ::= try1_stmt | try2_stmt | try3_stmt\n' ' try1_stmt ::= "try" ":" suite\n' ' ("except" [expression ["as" identifier]] ":" ' 'suite)+\n' ' ["else" ":" suite]\n' ' ["finally" ":" suite]\n' ' try2_stmt ::= "try" ":" suite\n' + ' ("except" "*" expression ["as" identifier] ":" ' + 'suite)+\n' + ' ["else" ":" suite]\n' + ' ["finally" ":" suite]\n' + ' try3_stmt ::= "try" ":" suite\n' ' "finally" ":" suite\n' '\n' 'The "except" clause(s) specify one or more exception handlers. When ' @@ -12538,6 +12562,53 @@ ' >>> print(sys.exc_info())\n' ' (None, None, None)\n' '\n' + 'The "except*" clause(s) are used for handling "ExceptionGroup`s. The\n' + 'exception type for matching is interpreted as in the case of\n' + ':keyword:`except", but in the case of exception groups we can have\n' + 'partial matches when the type matches some of the exceptions in the\n' + 'group. This means that multiple except* clauses can execute, each\n' + 'handling part of the exception group. Each clause executes once and\n' + 'handles an exception group of all matching exceptions. Each ' + 'exception\n' + 'in the group is handled by at most one except* clause, the first ' + 'that\n' + 'matches it.\n' + '\n' + ' >>> try:\n' + ' ... raise ExceptionGroup("eg",\n' + ' ... [ValueError(1), TypeError(2), OSError(3), ' + 'OSError(4)])\n' + ' ... except* TypeError as e:\n' + " ... print(f'caught {type(e)} with nested {e.exceptions}')\n" + ' ... except* OSError as e:\n' + " ... 
print(f'caught {type(e)} with nested {e.exceptions}')\n" + ' ...\n' + " caught with nested (TypeError(2),)\n" + " caught with nested (OSError(3), " + 'OSError(4))\n' + ' + Exception Group Traceback (most recent call last):\n' + ' | File "", line 2, in \n' + ' | ExceptionGroup: eg\n' + ' +-+---------------- 1 ----------------\n' + ' | ValueError: 1\n' + ' +------------------------------------\n' + ' >>>\n' + '\n' + ' Any remaining exceptions that were not handled by any except* ' + 'clause\n' + ' are re-raised at the end, combined into an exception group along ' + 'with\n' + ' all exceptions that were raised from within except* clauses.\n' + '\n' + ' An except* clause must have a matching type, and this type cannot ' + 'be a\n' + ' subclass of :exc:`BaseExceptionGroup`. It is not possible to mix ' + 'except\n' + ' and except* in the same :keyword:`try`. :keyword:`break`,\n' + ' :keyword:`continue` and :keyword:`return` cannot appear in an ' + 'except*\n' + ' clause.\n' + '\n' 'The optional "else" clause is executed if the control flow leaves ' 'the\n' '"try" suite, no exception was raised, and no "return", "continue", ' @@ -13814,9 +13885,9 @@ '"dict"\n' 'constructor.\n' '\n' - 'class dict(**kwarg)\n' - 'class dict(mapping, **kwarg)\n' - 'class dict(iterable, **kwarg)\n' + 'class dict(**kwargs)\n' + 'class dict(mapping, **kwargs)\n' + 'class dict(iterable, **kwargs)\n' '\n' ' Return a new dictionary initialized from an optional ' 'positional\n' @@ -14466,6 +14537,14 @@ 'Comparisons in\n' 'the language reference.)\n' '\n' + 'Forward and reversed iterators over mutable sequences access ' + 'values\n' + 'using an index. That index will continue to march forward (or\n' + 'backward) even if the underlying sequence is mutated. The ' + 'iterator\n' + 'terminates only when an "IndexError" or a "StopIteration" is\n' + 'encountered (or when the index drops below zero).\n' + '\n' 'Notes:\n' '\n' '1. While the "in" and "not in" operations are used only for ' @@ -14937,7 +15016,8 @@ '\n' ' The arguments to the range constructor must be integers ' '(either\n' - ' built-in "int" or any object that implements the "__index__"\n' + ' built-in "int" or any object that implements the ' + '"__index__()"\n' ' special method). If the *step* argument is omitted, it ' 'defaults to\n' ' "1". If the *start* argument is omitted, it defaults to "0". ' diff --git a/Misc/NEWS.d/3.11.0a4.rst b/Misc/NEWS.d/3.11.0a4.rst new file mode 100644 index 0000000000000..2391f43e2b7fa --- /dev/null +++ b/Misc/NEWS.d/3.11.0a4.rst @@ -0,0 +1,1177 @@ +.. bpo: 46070 +.. date: 2022-01-13-17-58-56 +.. nonce: q8IGth +.. release date: 2022-01-13 +.. section: Core and Builtins + +:c:func:`Py_EndInterpreter` now explicitly untracks all objects currently +tracked by the GC. Previously, if an object was used later by another +interpreter, calling :c:func:`PyObject_GC_UnTrack` on the object crashed if +the previous or the next object of the :c:type:`PyGC_Head` structure became +a dangling pointer. Patch by Victor Stinner. + +.. + +.. bpo: 46347 +.. date: 2022-01-11-13-57-00 +.. nonce: Gd8M-S +.. section: Core and Builtins + +Fix memory leak in PyEval_EvalCodeEx. + +.. + +.. bpo: 46339 +.. date: 2022-01-11-11-50-19 +.. nonce: OVumDZ +.. section: Core and Builtins + +Fix a crash in the parser when retrieving the error text for multi-line +f-strings expressions that do not start in the first line of the string. +Patch by Pablo Galindo + +.. + +.. bpo: 46331 +.. date: 2022-01-10-16-21-54 +.. nonce: h1AC-i +.. 
section: Core and Builtins + +Do not set line number of instruction storing doc-string. Fixes regression +introduced in 3.11 alpha. + +.. + +.. bpo: 46314 +.. date: 2022-01-10-12-34-17 +.. nonce: jId9Ky +.. section: Core and Builtins + +Remove spurious "call" event when creating a lambda function that was +accidentally introduced in 3.11a4. + +.. + +.. bpo: 46289 +.. date: 2022-01-07-23-32-03 +.. nonce: NnjpVc +.. section: Core and Builtins + +ASDL declaration of ``FormattedValue`` has changed to reflect ``conversion`` +field is not optional. + +.. + +.. bpo: 46297 +.. date: 2022-01-07-22-13-59 +.. nonce: 83ThTl +.. section: Core and Builtins + +Fixed an interpreter crash on bootup with multiple PythonPaths set in the +Windows registry. Patch by Derzsi D?niel. + +.. + +.. bpo: 46237 +.. date: 2022-01-07-19-33-05 +.. nonce: 9A6Hpq +.. section: Core and Builtins + +Fix the line number of tokenizer errors inside f-strings. Patch by Pablo +Galindo. + +.. + +.. bpo: 46263 +.. date: 2022-01-06-10-54-07 +.. nonce: 60dRZb +.. section: Core and Builtins + +We always expect the "use_frozen_modules" config to be set, now that +getpath.c was rewritten in pure Python and the logic improved. + +.. + +.. bpo: 46006 +.. date: 2022-01-05-17-13-47 +.. nonce: hdH5Vn +.. section: Core and Builtins + +Fix a regression when a type method like ``__init__()`` is modified in a +subinterpreter. Fix a regression in ``_PyUnicode_EqualToASCIIId()`` and type +``update_slot()``. Revert the change which made the Unicode dictionary of +interned strings compatible with subinterpreters: the internal interned +dictionary is shared again by all interpreters. Patch by Victor Stinner. + +.. + +.. bpo: 45923 +.. date: 2022-01-04-14-08-10 +.. nonce: rBp7r1 +.. section: Core and Builtins + +Add RESUME opcode. This is a logical no-op. It is emitted by the compiler +anywhere a Python function can be entered. It is used by the interpreter to +perform tracing and optimizer checks. + +.. + +.. bpo: 46208 +.. date: 2022-01-04-01-53-35 +.. nonce: i00Vz5 +.. section: Core and Builtins + +Fix the regression of os.path.normpath("A/../../B") not returning expected +"../B" but "B". + +.. + +.. bpo: 46240 +.. date: 2022-01-03-23-31-25 +.. nonce: 8lGjeK +.. section: Core and Builtins + +Correct the error message for unclosed parentheses when the tokenizer +doesn't reach the end of the source when the error is reported. Patch by +Pablo Galindo + +.. + +.. bpo: 46009 +.. date: 2022-01-03-11-36-34 +.. nonce: QZGrov +.. section: Core and Builtins + +Remove the ``GEN_START`` opcode. + +.. + +.. bpo: 46235 +.. date: 2022-01-02-23-55-13 +.. nonce: gUjp2v +.. section: Core and Builtins + +Certain sequence multiplication operations like ``[0] * 1_000`` are now +faster due to reference-counting optimizations. Patch by Dennis Sweeney. + +.. + +.. bpo: 46221 +.. date: 2022-01-01-14-23-57 +.. nonce: 7oGp-I +.. section: Core and Builtins + +:opcode:`PREP_RERAISE_STAR` no longer pushes ``lasti`` to the stack. + +.. + +.. bpo: 46202 +.. date: 2021-12-30-11-06-27 +.. nonce: IKx4v6 +.. section: Core and Builtins + +Remove :opcode:`POP_EXCEPT_AND_RERAISE` and replace it by an equivalent +sequence of other opcodes. + +.. + +.. bpo: 46085 +.. date: 2021-12-30-00-23-41 +.. nonce: bDuJqu +.. section: Core and Builtins + +Fix iterator cache mechanism of :class:`OrderedDict`. + +.. + +.. bpo: 46055 +.. date: 2021-12-24-20-21-45 +.. nonce: R0QMVQ +.. section: Core and Builtins + +Speed up shifting operation involving integers less than +:c:macro:`PyLong_BASE`. 
Patch by Xinhang Xu. + +.. + +.. bpo: 46110 +.. date: 2021-12-18-02-37-07 +.. nonce: B6hAfu +.. section: Core and Builtins + +Add a maximum recursion check to the PEG parser to avoid stack overflow. +Patch by Pablo Galindo + +.. + +.. bpo: 46107 +.. date: 2021-12-16-23-27-05 +.. nonce: 7q5an0 +.. section: Core and Builtins + +Fix bug where :meth:`ExceptionGroup.split` and +:meth:`ExceptionGroup.subgroup` did not copy the exception group's +``__note__`` field to the parts. + +.. + +.. bpo: 45711 +.. date: 2021-12-15-15-17-04 +.. nonce: QK4QrB +.. section: Core and Builtins + +The interpreter state's representation of handled exceptions (a.k.a +exc_info, or _PyErr_StackItem) now has only the ``exc_value`` field, +``exc_type`` and ``exc_traceback`` have been removed as their values can be +derived from ``exc_value``. + +.. + +.. bpo: 44525 +.. date: 2021-12-13-17-12-16 +.. nonce: 4-FiSf +.. section: Core and Builtins + +Replace the four call bytecode instructions which one pre-call instruction +and two call instructions. + +Removes ``CALL_FUNCTION``, ``CALL_FUNCTION_KW``, ``CALL_METHOD`` and +``CALL_METHOD_KW``. + +Adds ``CALL_NO_KW`` and ``CALL_KW`` call instructions, and +``PRECALL_METHOD`` prefix for pairing with ``LOAD_METHOD``. + +.. + +.. bpo: 46039 +.. date: 2021-12-13-17-01-13 +.. nonce: TrCBbF +.. section: Core and Builtins + +Remove the ``YIELD_FROM`` instruction and replace it with the ``SEND`` +instruction which performs the same operation, but without the loop. + +.. + +.. bpo: 45635 +.. date: 2021-12-12-15-52-41 +.. nonce: ADVaPT +.. section: Core and Builtins + +The code called from :c:func:`_PyErr_Display` was refactored to improve +error handling. It now exits immediately upon an unrecoverable error. + +.. + +.. bpo: 46054 +.. date: 2021-12-12-05-30-21 +.. nonce: 2P-foG +.. section: Core and Builtins + +Fix parser error when parsing non-utf8 characters in source files. Patch by +Pablo Galindo. + +.. + +.. bpo: 46042 +.. date: 2021-12-11-17-40-34 +.. nonce: aqYxku +.. section: Core and Builtins + +Improve the location of the caret in :exc:`SyntaxError` exceptions emitted +by the symbol table. Patch by Pablo Galindo. + +.. + +.. bpo: 46049 +.. date: 2021-12-11-13-49-19 +.. nonce: 9dNto2 +.. section: Core and Builtins + +Ensure :file:`._pth` files work as intended on platforms other than Windows. + +.. + +.. bpo: 46048 +.. date: 2021-12-11-13-14-42 +.. nonce: _-OGD9 +.. section: Core and Builtins + +Fixes parsing of :file:`._pth` files on startup so that single-character +paths are correctly read. + +.. + +.. bpo: 37971 +.. date: 2021-12-10-13-42-17 +.. nonce: 6BC1Tx +.. section: Core and Builtins + +Fix a bug where the line numbers given in a traceback when a decorator +application raised an exception were wrong. + +.. + +.. bpo: 46031 +.. date: 2021-12-10-09-10-32 +.. nonce: rM7JOX +.. section: Core and Builtins + +Add :opcode:`POP_JUMP_IF_NOT_NONE` and :opcode:`POP_JUMP_IF_NONE` opcodes to +speed up conditional jumps. + +.. + +.. bpo: 45654 +.. date: 2021-12-09-11-57-43 +.. nonce: MZc7ei +.. section: Core and Builtins + +Deepfreeze :mod:`runpy`, patch by Kumar Aditya. + +.. + +.. bpo: 46025 +.. date: 2021-12-09-11-41-35 +.. nonce: pkEvW9 +.. section: Core and Builtins + +Fix a crash in the :mod:`atexit` module involving functions that unregister +themselves before raising exceptions. Patch by Pablo Galindo. + +.. + +.. bpo: 46000 +.. date: 2021-12-07-11-42-44 +.. nonce: v_ru3k +.. 
section: Core and Builtins + +Improve compatibility of the :mod:`curses` module with NetBSD curses. + +.. + +.. bpo: 44525 +.. date: 2021-12-07-11-04-21 +.. nonce: 6OWCgr +.. section: Core and Builtins + +Specialize the CALL_FUNCTION instruction for calls to builtin types with a +single argument. Speeds up ``range(x)``, ``list(x)``, and specifically +``type(obj)``. + +.. + +.. bpo: 42918 +.. date: 2021-12-06-15-32-12 +.. nonce: Czpgtg +.. section: Core and Builtins + +Fix bug where the built-in :func:`compile` function did not always raise a +:exc:`SyntaxError` when passed multiple statements in 'single' mode. Patch +by Weipeng Hong. + +.. + +.. bpo: 45953 +.. date: 2021-12-01-11-54-27 +.. nonce: 2znR0E +.. section: Core and Builtins + +The main interpreter in _PyRuntimeState.interpreters is now statically +allocated (as part of _PyRuntime). Likewise for the initial thread state of +each interpreter. This means less allocation during runtime init, as well +as better memory locality for these key state objects. + +.. + +.. bpo: 45292 +.. date: 2021-11-22-13-05-32 +.. nonce: pfEouJ +.. section: Core and Builtins + +Complete the :pep:`654` implementation: add ``except*``. + +.. + +.. bpo: 43413 +.. date: 2021-05-30-16-37-47 +.. nonce: vYFPPC1 +.. section: Core and Builtins + +Revert changes in ``set.__init__``. Subclass of :class:`set` needs to define +a ``__init__()`` method if it defines a ``__new__()`` method with additional +keyword parameters. + +.. + +.. bpo: 43931 +.. date: 2021-04-24-15-39-23 +.. nonce: zpChDi +.. section: Core and Builtins + +Added the :c:data:`Py_Version` constant which bears the same value as +:c:macro:`PY_VERSION_HEX`. Patch by Gabriele N. Tornetta. + +.. + +.. bpo: 46342 +.. date: 2022-01-11-04-28-09 +.. nonce: 5QVEH1 +.. section: Library + +The ``@typing.final`` decorator now sets the ``__final__`` attribute on the +decorated object to allow runtime introspection. Patch by Jelle Zijlstra. + +.. + +.. bpo: 46328 +.. date: 2022-01-10-11-53-15 +.. nonce: 6i9Wvq +.. section: Library + +Added the :meth:`sys.exception` method which returns the active exception +instance. + +.. + +.. bpo: 46307 +.. date: 2022-01-10-07-51-43 +.. nonce: SKvOIY +.. section: Library + +Add :meth:`string.Template.is_valid` and +:meth:`string.Template.get_identifiers` methods. + +.. + +.. bpo: 46306 +.. date: 2022-01-08-13-53-25 +.. nonce: 1_es8z +.. section: Library + +Assume that :class:`types.CodeType` always has +:attr:`types.CodeType.co_firstlineno` in :mod:`doctest`. + +.. + +.. bpo: 40479 +.. date: 2022-01-07-15-20-19 +.. nonce: EKfr3F +.. section: Library + +Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL +3.0.0 in FIPS mode. + +.. + +.. bpo: 46070 +.. date: 2022-01-07-13-51-22 +.. nonce: -axLUW +.. section: Library + +Fix possible segfault when importing the :mod:`asyncio` module from +different sub-interpreters in parallel. Patch by Erlend E. Aasland. + +.. + +.. bpo: 46244 +.. date: 2022-01-06-21-31-14 +.. nonce: hjyfJj +.. section: Library + +Removed ``__slots__`` from :class:`typing.ParamSpec` and +:class:`typing.TypeVar`. They served no purpose. Patch by Arie Bovenberg. + +.. + +.. bpo: 46278 +.. date: 2022-01-06-13-38-00 +.. nonce: wILA80 +.. section: Library + +Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop +implementations already support it. + +.. + +.. bpo: 46269 +.. date: 2022-01-05-18-16-13 +.. nonce: K16Z1S +.. section: Library + +Remove special-casing of ``__new__`` in :meth:`enum.Enum.__dir__`. + +.. + +.. 
bpo: 46266 +.. date: 2022-01-05-12-48-18 +.. nonce: ACQCgX +.. section: Library + +Improve day constants in :mod:`calendar`. + +Now all constants (`MONDAY` ... `SUNDAY`) are documented, tested, and added +to ``__all__``. + +.. + +.. bpo: 46257 +.. date: 2022-01-04-11-04-20 +.. nonce: _o2ADe +.. section: Library + +Optimized the mean, variance, and stdev functions in the statistics module. +If the input is an iterator, it is consumed in a single pass rather than +eating memory by conversion to a list. The single pass algorithm is about +twice as fast as the previous two pass code. + +.. + +.. bpo: 41011 +.. date: 2022-01-03-21-03-50 +.. nonce: uULmGi +.. section: Library + +Added two new variables to *pyvenv.cfg* which is generated by :mod:`venv` +module: *executable* for the executable and *command* for the command line +used to create the environment. + +.. + +.. bpo: 46239 +.. date: 2022-01-03-12-59-20 +.. nonce: ySVSEy +.. section: Library + +Improve error message when importing :mod:`asyncio.windows_events` on +non-Windows. + +.. + +.. bpo: 46238 +.. date: 2022-01-03-12-19-10 +.. nonce: lANhCi +.. section: Library + +Reuse ``_winapi`` constants in ``asyncio.windows_events``. + +.. + +.. bpo: 46222 +.. date: 2022-01-01-17-34-32 +.. nonce: s2fzZU +.. section: Library + +Adding ``SF_NOCACHE`` sendfile constant for FreeBSD for the posixmodule. + +.. + +.. bpo: 37295 +.. date: 2021-12-27-15-52-28 +.. nonce: s3LPo0 +.. section: Library + +Add fast path for ``0 <= k <= n <= 67`` for :func:`math.comb`. + +.. + +.. bpo: 46176 +.. date: 2021-12-25-11-11-21 +.. nonce: EOY9wd +.. section: Library + +Adding the ``MAP_STACK`` constant for the mmap module. + +.. + +.. bpo: 43424 +.. date: 2021-12-23-14-36-58 +.. nonce: d9x2JZ +.. section: Library + +Deprecate :attr:`webbrowser.MacOSXOSAScript._name` and use ``name`` instead. + +.. + +.. bpo: 45321 +.. date: 2021-12-19-00-00-48 +.. nonce: OyuhaY +.. section: Library + +Added missing error codes to module ``xml.parsers.expat.errors``. + +.. + +.. bpo: 46125 +.. date: 2021-12-18-18-29-07 +.. nonce: LLmcox +.. section: Library + +Refactor tests to test traversable API directly. Includes changes from +importlib 5.4.0. + +.. + +.. bpo: 46118 +.. date: 2021-12-17-16-27-44 +.. nonce: euAy0E +.. section: Library + +Moved importlib.resources and its related functionality to a package. + +.. + +.. bpo: 37578 +.. date: 2021-12-17-13-22-37 +.. nonce: _tluuR +.. section: Library + +Add *include_hidden* parameter to :func:`~glob.glob` and :func:`~glob.iglob` +to match hidden files and directories when using special characters like +``*``, ``**``, ``?`` and ``[]``. + +.. + +.. bpo: 20369 +.. date: 2021-12-17-12-06-40 +.. nonce: zzLuBz +.. section: Library + +:func:`concurrent.futures.wait` no longer blocks forever when given +duplicate Futures. Patch by Kumar Aditya. + +.. + +.. bpo: 46105 +.. date: 2021-12-16-14-30-36 +.. nonce: pprB1K +.. section: Library + +Honor spec when generating requirement specs with urls and extras +(importlib_metadata 4.8.3). + +.. + +.. bpo: 44893 +.. date: 2021-12-16-13-54-55 +.. nonce: I7aLiW +.. section: Library + +EntryPoint objects are no longer tuples. Recommended means to access is by +attribute ('.name', '.group') or accessor ('.load()'). Access by index is +deprecated and will raise deprecation warning. + +.. + +.. bpo: 22815 +.. date: 2021-12-16-12-54-21 +.. nonce: 0NRH8s +.. section: Library + +Print unexpected successes together with failures and errors in summary in +:class:`unittest.TextTestResult`. + +.. + +.. 
bpo: 22047 +.. date: 2021-12-15-19-24-54 +.. nonce: gBV4vT +.. section: Library + +Calling :meth:`add_argument_group` on an argument group is deprecated. +Calling :meth:`add_argument_group` or :meth:`add_mutually_exclusive_group` +on a mutually exclusive group is deprecated. + +These features were never supported and do not always work correctly. The +functions exist on the API by accident through inheritance and will be +removed in the future. + +.. + +.. bpo: 26952 +.. date: 2021-12-14-13-18-45 +.. nonce: hjhISq +.. section: Library + +:mod:`argparse` raises :exc:`ValueError` with clear message when trying to +render usage for an empty mutually-exclusive group. Previously it raised a +cryptic :exc:`IndexError`. + +.. + +.. bpo: 45615 +.. date: 2021-12-13-15-51-16 +.. nonce: hVx83Q +.. section: Library + +Functions in the :mod:`traceback` module raise :exc:`TypeError` rather than +:exc:`AttributeError` when an exception argument is not of type +:exc:`BaseException`. + +.. + +.. bpo: 16594 +.. date: 2021-12-12-13-41-47 +.. nonce: yfC7L4 +.. section: Library + +Add allow allow_reuse_port flag in socketserver. + +.. + +.. bpo: 27718 +.. date: 2021-12-11-22-51-30 +.. nonce: MgQiGl +.. section: Library + +Fix help for the :mod:`signal` module. Some functions (e.g. ``signal()`` and +``getsignal()``) were omitted. + +.. + +.. bpo: 46032 +.. date: 2021-12-11-15-45-07 +.. nonce: HmciLT +.. section: Library + +The ``registry()`` method of :func:`functools.singledispatch` functions +checks now the first argument or the first parameter annotation and raises a +TypeError if it is not supported. Previously unsupported "types" were +ignored (e.g. ``typing.List[int]``) or caused an error at calling time (e.g. +``list[int]``). + +.. + +.. bpo: 46014 +.. date: 2021-12-10-03-13-57 +.. nonce: 3xYdST +.. section: Library + +Add ability to use ``typing.Union`` and ``types.UnionType`` as dispatch +argument to ``functools.singledispatch``. Patch provided by Yurii Karabas. + +.. + +.. bpo: 27062 +.. date: 2021-12-09-11-50-32 +.. nonce: R5vii6 +.. section: Library + +Add :attr:`__all__` to :mod:`inspect`, patch by Kumar Aditya. + +.. + +.. bpo: 46018 +.. date: 2021-12-09-00-44-42 +.. nonce: hkTI7v +.. section: Library + +Ensure that :func:`math.expm1` does not raise on underflow. + +.. + +.. bpo: 46016 +.. date: 2021-12-08-19-15-03 +.. nonce: s9PuyF +.. section: Library + +Adding :attr:`F_DUP2FD` and :attr:`F_DUP2FD_CLOEXEC` constants from FreeBSD +into the fcntl module. + +.. + +.. bpo: 45755 +.. date: 2021-12-07-21-55-22 +.. nonce: bRqKGa +.. section: Library + +:mod:`typing` generic aliases now reveal the class attributes of the +original generic class when passed to ``dir()``. This was the behavior up to +Python 3.6, but was changed in 3.7-3.9. + +.. + +.. bpo: 45874 +.. date: 2021-12-02-11-55-45 +.. nonce: dtJIsN +.. section: Library + +The empty query string, consisting of no query arguments, is now handled +correctly in ``urllib.parse.parse_qsl``. This caused problems before when +strict parsing was enabled. + +.. + +.. bpo: 44674 +.. date: 2021-11-29-19-37-20 +.. nonce: NijWLt +.. section: Library + +Change how dataclasses disallows mutable default values. It used to use a +list of known types (list, dict, set). Now it disallows unhashable objects +to be defaults. It's using unhashability as a proxy for mutability. Patch +by Eric V. Smith, idea by Raymond Hettinger. + +.. + +.. bpo: 23882 +.. date: 2021-11-24-19-09-14 +.. nonce: _tctCv +.. 
section: Library + +Remove namespace package (PEP 420) support from unittest discovery. It was +introduced in Python 3.4 but has been broken since Python 3.7. + +.. + +.. bpo: 25066 +.. date: 2021-11-24-12-25-42 +.. nonce: YIcIkn +.. section: Library + +Added a :meth:`__repr__` method to :class:`multiprocessing.Event` objects, +patch by Kumar Aditya. + +.. + +.. bpo: 45643 +.. date: 2021-10-28-11-40-59 +.. nonce: jeiPiX +.. section: Library + +Added :data:`signal.SIGSTKFLT` on platforms where this signal is defined. + +.. + +.. bpo: 44092 +.. date: 2021-05-19-12-35-49 +.. nonce: hiSlI5 +.. section: Library + +Fetch across rollback no longer raises :exc:`~sqlite3.InterfaceError`. +Instead we leave it to the SQLite library to handle these cases. Patch by +Erlend E. Aasland. + +.. + +.. bpo: 42413 +.. date: 2020-11-26-10-23-46 +.. nonce: HFikOl +.. section: Library + +Replace ``concurrent.futures.TimeoutError`` and ``asyncio.TimeoutError`` +with builtin :exc:`TimeoutError`, keep these names as deprecated aliases. + +.. + +.. bpo: 46196 +.. date: 2021-12-30-19-12-24 +.. nonce: UvQ8Sq +.. section: Documentation + +Document method :meth:`cmd.Cmd.columnize`. + +.. + +.. bpo: 46120 +.. date: 2021-12-21-12-45-57 +.. nonce: PE0DmJ +.. section: Documentation + +State that ``|`` is preferred for readability over ``Union`` in the +:mod:`typing` docs. + +.. + +.. bpo: 46109 +.. date: 2021-12-16-21-13-55 +.. nonce: 0-RNzu +.. section: Documentation + +Extracted ``importlib.resources`` and ``importlib.resources.abc`` +documentation into separate files. + +.. + +.. bpo: 19737 +.. date: 2021-11-28-22-43-21 +.. nonce: cOOubB +.. section: Documentation + +Update the documentation for the :func:`globals` function. + +.. + +.. bpo: 46296 +.. date: 2022-01-08-00-00-38 +.. nonce: vqxgTm +.. section: Tests + +Add a test case for :mod:`enum` with ``_use_args_ == True`` and +``_member_type_ == object``. + +.. + +.. bpo: 46205 +.. date: 2022-01-07-14-06-12 +.. nonce: dnc2OC +.. section: Tests + +Fix hang in runtest_mp due to race condition + +.. + +.. bpo: 46263 +.. date: 2022-01-06-15-45-34 +.. nonce: bJXek6 +.. section: Tests + +Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory +with junk byte. + +.. + +.. bpo: 46262 +.. date: 2022-01-05-01-38-45 +.. nonce: MhiLWP +.. section: Tests + +Cover ``ValueError`` path in tests for :meth:`enum.Flag._missing_`. + +.. + +.. bpo: 46150 +.. date: 2021-12-23-13-42-15 +.. nonce: RhtADs +.. section: Tests + +Now ``fakename`` in ``test_pathlib.PosixPathTest.test_expanduser`` is +checked to be non-existent. + +.. + +.. bpo: 46129 +.. date: 2021-12-19-12-20-57 +.. nonce: I3MunH +.. section: Tests + +Rewrite ``asyncio.locks`` tests with +:class:`unittest.IsolatedAsyncioTestCase` usage. + +.. + +.. bpo: 23819 +.. date: 2021-12-19-08-44-32 +.. nonce: 9ueiII +.. section: Tests + +Fixed :mod:`asyncio` tests in python optimized mode. Patch by Kumar Aditya. + +.. + +.. bpo: 46114 +.. date: 2021-12-17-14-46-19 +.. nonce: 9iyZ_9 +.. section: Tests + +Fix test case for OpenSSL 3.0.1 version. OpenSSL 3.0 uses ``0xMNN00PP0L``. + +.. + +.. bpo: 44133 +.. date: 2022-01-12-13-42-16 +.. nonce: NgyNAh +.. section: Build + +When Python is configured with :option:`--without-static-libpython`, the +Python static library (libpython.a) is no longer built. Patch by Victor +Stinner. + +.. + +.. bpo: 44133 +.. date: 2022-01-12-13-34-52 +.. nonce: HYCNXb +.. 
section: Build + +When Python is built without :option:`--enable-shared`, the ``python`` +program is now linked to object files, rather than being linked to the +Python static library (libpython.a), to make sure that all symbols are +exported. Previously, the linker omitted some symbols like the +:c:func:`Py_FrozenMain` function. Patch by Victor Stinner. + +.. + +.. bpo: 40280 +.. date: 2022-01-12-10-22-23 +.. nonce: 5maBz8 +.. section: Build + +The ``configure`` script has a new option ``--with-emscripten-target`` to +select browser or node as Emscripten build target. + +.. + +.. bpo: 46315 +.. date: 2022-01-09-15-48-49 +.. nonce: NdCRLu +.. section: Build + +Added and fixed ``#ifdef HAVE_FEATURE`` checks for functionality that is not +available on WASI platform. + +.. + +.. bpo: 45723 +.. date: 2022-01-07-08-33-45 +.. nonce: uq2nBU +.. section: Build + +Fixed a regression in ``configure`` check for :func:`select.epoll`. + +.. + +.. bpo: 46263 +.. date: 2022-01-05-02-58-10 +.. nonce: xiv8NU +.. section: Build + +``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. + +.. + +.. bpo: 46106 +.. date: 2021-12-20-07-10-41 +.. nonce: 5qcv3L +.. section: Build + +Updated OpenSSL to 1.1.1m in Windows builds, macOS installer builds, and CI. +Patch by Kumar Aditya. + +.. + +.. bpo: 46088 +.. date: 2021-12-16-14-18-07 +.. nonce: 8UUuAd +.. section: Build + +Automatically detect or install bootstrap Python runtime when building from +Visual Studio. + +.. + +.. bpo: 46072 +.. date: 2021-12-15-10-37-44 +.. nonce: GgeAU3 +.. section: Build + +Add a --with-pystats configure option to turn on internal statistics +gathering. + +.. + +.. bpo: 40280 +.. date: 2021-12-13-21-03-52 +.. nonce: b7NG4Y +.. section: Build + +A new directory ``Tools/wasm`` contains WebAssembly-related helpers like +``config.site`` override for wasm32-emscripten, wasm assets generator to +bundle the stdlib, and a README. + +.. + +.. bpo: 46023 +.. date: 2021-12-09-10-25-11 +.. nonce: PLpNB6 +.. section: Build + +:program:`makesetup` no longer builds extensions that have been marked as +*disabled*. This allows users to disable modules in ``Modules/Setup.local``. + +.. + +.. bpo: 45949 +.. date: 2021-12-02-23-21-18 +.. nonce: OTSo9X +.. section: Build + +Use pure Python ``freeze_module`` for all but importlib bootstrap files. +``--with-freeze-module`` :program:`configure` option is no longer needed for +cross builds. + +.. + +.. bpo: 46217 +.. date: 2022-01-07-22-55-11 +.. nonce: tgJEsB +.. section: Windows + +Removed parameter that is unsupported on Windows 8.1 and early Windows 10 +and may have caused build or runtime failures. + +.. + +.. bpo: 40477 +.. date: 2022-01-02-21-56-53 +.. nonce: W3nnM6 +.. section: macOS + +The Python Launcher app for macOS now properly launches scripts and, if +necessary, the Terminal app when running on recent macOS releases. + +.. + +.. bpo: 46236 +.. date: 2022-01-05-10-16-16 +.. nonce: pcmVQw +.. section: C API + +Fix a bug in :c:func:`PyFunction_GetAnnotations` that caused it to return a +``tuple`` instead of a ``dict``. + +.. + +.. bpo: 46140 +.. date: 2021-12-21-22-56-36 +.. nonce: dvXkYK +.. section: C API + +:c:func:`PyBuffer_GetPointer`, :c:func:`PyBuffer_FromContiguous`, +:c:func:`PyBuffer_ToContiguous` and :c:func:`PyMemoryView_FromBuffer` now +take buffer info by ``const Py_buffer *`` instead of ``Py_buffer *``, as +they do not need mutability. 
:c:func:`PyBuffer_FromContiguous` also now +takes the source buffer as ``const void *``, and similarly +:c:func:`PyBuffer_GetPointer` takes the strides as ``const Py_ssize_t *``. + +.. + +.. bpo: 45855 +.. date: 2021-12-12-10-09-02 +.. nonce: MVsTDj +.. section: C API + +Document that the *no_block* argument to :c:func:`PyCapsule_Import` is a +no-op now. + +.. + +.. bpo: 45855 +.. date: 2021-12-11-08-41-36 +.. nonce: Lq2_gR +.. section: C API + +Replaced deprecated usage of :c:func:`PyImport_ImportModuleNoBlock` with +:c:func:`PyImport_ImportModule` in stdlib modules. Patch by Kumar Aditya. + +.. + +.. bpo: 46007 +.. date: 2021-12-08-12-41-51 +.. nonce: sMgDLz +.. section: C API + +The :c:func:`PyUnicode_CHECK_INTERNED` macro has been excluded from the +limited C API. It was never usable there, because it used internal +structures which are not available in the limited C API. Patch by Victor +Stinner. diff --git a/Misc/NEWS.d/next/Build/2021-12-02-23-21-18.bpo-45949.OTSo9X.rst b/Misc/NEWS.d/next/Build/2021-12-02-23-21-18.bpo-45949.OTSo9X.rst deleted file mode 100644 index c746d71b1dcae..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-02-23-21-18.bpo-45949.OTSo9X.rst +++ /dev/null @@ -1,3 +0,0 @@ -Use pure Python ``freeze_module`` for all but importlib bootstrap files. -``--with-freeze-module`` :program:`configure` option is no longer needed for -cross builds. diff --git a/Misc/NEWS.d/next/Build/2021-12-09-10-25-11.bpo-46023.PLpNB6.rst b/Misc/NEWS.d/next/Build/2021-12-09-10-25-11.bpo-46023.PLpNB6.rst deleted file mode 100644 index 4ef9202559394..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-09-10-25-11.bpo-46023.PLpNB6.rst +++ /dev/null @@ -1,2 +0,0 @@ -:program:`makesetup` no longer builds extensions that have been marked as -*disabled*. This allows users to disable modules in ``Modules/Setup.local``. diff --git a/Misc/NEWS.d/next/Build/2021-12-13-21-03-52.bpo-40280.b7NG4Y.rst b/Misc/NEWS.d/next/Build/2021-12-13-21-03-52.bpo-40280.b7NG4Y.rst deleted file mode 100644 index 905ee44680276..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-13-21-03-52.bpo-40280.b7NG4Y.rst +++ /dev/null @@ -1 +0,0 @@ -A new directory ``Tools/wasm`` contains WebAssembly-related helpers like ``config.site`` override for wasm32-emscripten, wasm assets generator to bundle the stdlib, and a README. diff --git a/Misc/NEWS.d/next/Build/2021-12-15-10-37-44.bpo-46072.GgeAU3.rst b/Misc/NEWS.d/next/Build/2021-12-15-10-37-44.bpo-46072.GgeAU3.rst deleted file mode 100644 index 9cc8b6c82de41..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-15-10-37-44.bpo-46072.GgeAU3.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add a --with-pystats configure option to turn on internal statistics -gathering. diff --git a/Misc/NEWS.d/next/Build/2021-12-16-14-18-07.bpo-46088.8UUuAd.rst b/Misc/NEWS.d/next/Build/2021-12-16-14-18-07.bpo-46088.8UUuAd.rst deleted file mode 100644 index 408ed53111fd2..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-16-14-18-07.bpo-46088.8UUuAd.rst +++ /dev/null @@ -1,2 +0,0 @@ -Automatically detect or install bootstrap Python runtime when building from -Visual Studio. diff --git a/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst b/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst deleted file mode 100644 index d3e25f77c7336..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst +++ /dev/null @@ -1,2 +0,0 @@ -Updated OpenSSL to 1.1.1m in Windows builds, macOS installer builds, and CI. -Patch by Kumar Aditya. 
\ No newline at end of file diff --git a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst deleted file mode 100644 index 3a575ed7f556b..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst +++ /dev/null @@ -1 +0,0 @@ -``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. diff --git a/Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst b/Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst deleted file mode 100644 index ca923b2f81f13..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-07-08-33-45.bpo-45723.uq2nBU.rst +++ /dev/null @@ -1 +0,0 @@ -Fixed a regression in ``configure`` check for :func:`select.epoll`. diff --git a/Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst b/Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst deleted file mode 100644 index 9360f91e45dd2..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-09-15-48-49.bpo-46315.NdCRLu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Added and fixed ``#ifdef HAVE_FEATURE`` checks for functionality that is not -available on WASI platform. diff --git a/Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst b/Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst deleted file mode 100644 index 55fc0fc986b81..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-12-10-22-23.bpo-40280.5maBz8.rst +++ /dev/null @@ -1,2 +0,0 @@ -The ``configure`` script has a new option ``--with-emscripten-target`` to -select browser or node as Emscripten build target. diff --git a/Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst b/Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst deleted file mode 100644 index 7c2a48a9e0d56..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-12-13-34-52.bpo-44133.HYCNXb.rst +++ /dev/null @@ -1,5 +0,0 @@ -When Python is built without :option:`--enable-shared`, the ``python`` -program is now linked to object files, rather than being linked to the Python -static library (libpython.a), to make sure that all symbols are exported. -Previously, the linker omitted some symbols like the :c:func:`Py_FrozenMain` -function. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst b/Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst deleted file mode 100644 index 3542850ff286b..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-12-13-42-16.bpo-44133.NgyNAh.rst +++ /dev/null @@ -1,2 +0,0 @@ -When Python is configured with :option:`--without-static-libpython`, the Python -static library (libpython.a) is no longer built. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/C API/2021-12-08-12-41-51.bpo-46007.sMgDLz.rst b/Misc/NEWS.d/next/C API/2021-12-08-12-41-51.bpo-46007.sMgDLz.rst deleted file mode 100644 index 6ed871b9950af..0000000000000 --- a/Misc/NEWS.d/next/C API/2021-12-08-12-41-51.bpo-46007.sMgDLz.rst +++ /dev/null @@ -1,3 +0,0 @@ -The :c:func:`PyUnicode_CHECK_INTERNED` macro has been excluded from the limited -C API. It was never usable there, because it used internal structures which are -not available in the limited C API. Patch by Victor Stinner. 
diff --git a/Misc/NEWS.d/next/C API/2021-12-11-08-41-36.bpo-45855.Lq2_gR.rst b/Misc/NEWS.d/next/C API/2021-12-11-08-41-36.bpo-45855.Lq2_gR.rst deleted file mode 100644 index 03258df00420f..0000000000000 --- a/Misc/NEWS.d/next/C API/2021-12-11-08-41-36.bpo-45855.Lq2_gR.rst +++ /dev/null @@ -1 +0,0 @@ -Replaced deprecated usage of :c:func:`PyImport_ImportModuleNoBlock` with :c:func:`PyImport_ImportModule` in stdlib modules. Patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/C API/2021-12-12-10-09-02.bpo-45855.MVsTDj.rst b/Misc/NEWS.d/next/C API/2021-12-12-10-09-02.bpo-45855.MVsTDj.rst deleted file mode 100644 index e00d56e0e4ad2..0000000000000 --- a/Misc/NEWS.d/next/C API/2021-12-12-10-09-02.bpo-45855.MVsTDj.rst +++ /dev/null @@ -1,2 +0,0 @@ -Document that the *no_block* argument to :c:func:`PyCapsule_Import` is a -no-op now. diff --git a/Misc/NEWS.d/next/C API/2021-12-21-22-56-36.bpo-46140.dvXkYK.rst b/Misc/NEWS.d/next/C API/2021-12-21-22-56-36.bpo-46140.dvXkYK.rst deleted file mode 100644 index 26b985924b54d..0000000000000 --- a/Misc/NEWS.d/next/C API/2021-12-21-22-56-36.bpo-46140.dvXkYK.rst +++ /dev/null @@ -1 +0,0 @@ -:c:func:`PyBuffer_GetPointer`, :c:func:`PyBuffer_FromContiguous`, :c:func:`PyBuffer_ToContiguous` and :c:func:`PyMemoryView_FromBuffer` now take buffer info by ``const Py_buffer *`` instead of ``Py_buffer *``, as they do not need mutability. :c:func:`PyBuffer_FromContiguous` also now takes the source buffer as ``const void *``, and similarly :c:func:`PyBuffer_GetPointer` takes the strides as ``const Py_ssize_t *``. \ No newline at end of file diff --git a/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst b/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst deleted file mode 100644 index 61906584a16a3..0000000000000 --- a/Misc/NEWS.d/next/C API/2022-01-05-10-16-16.bpo-46236.pcmVQw.rst +++ /dev/null @@ -1 +0,0 @@ -Fix a bug in :c:func:`PyFunction_GetAnnotations` that caused it to return a ``tuple`` instead of a ``dict``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-04-24-15-39-23.bpo-43931.zpChDi.rst b/Misc/NEWS.d/next/Core and Builtins/2021-04-24-15-39-23.bpo-43931.zpChDi.rst deleted file mode 100644 index 037512916878c..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-04-24-15-39-23.bpo-43931.zpChDi.rst +++ /dev/null @@ -1,2 +0,0 @@ -Added the :c:data:`Py_Version` constant which bears the same value as -:c:macro:`PY_VERSION_HEX`. Patch by Gabriele N. Tornetta. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-05-30-16-37-47.bpo-43413.vYFPPC1.rst b/Misc/NEWS.d/next/Core and Builtins/2021-05-30-16-37-47.bpo-43413.vYFPPC1.rst deleted file mode 100644 index cf879eceeb721..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-05-30-16-37-47.bpo-43413.vYFPPC1.rst +++ /dev/null @@ -1 +0,0 @@ -Revert changes in ``set.__init__``. Subclass of :class:`set` needs to define a ``__init__()`` method if it defines a ``__new__()`` method with additional keyword parameters. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-11-22-13-05-32.bpo-45292.pfEouJ.rst b/Misc/NEWS.d/next/Core and Builtins/2021-11-22-13-05-32.bpo-45292.pfEouJ.rst deleted file mode 100644 index 2cc7f39d46480..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-11-22-13-05-32.bpo-45292.pfEouJ.rst +++ /dev/null @@ -1 +0,0 @@ -Complete the :pep:`654` implementation: add ``except*``. 
\ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst deleted file mode 100644 index 4fa27b60c02f8..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-01-11-54-27.bpo-45953.2znR0E.rst +++ /dev/null @@ -1,4 +0,0 @@ -The main interpreter in _PyRuntimeState.interpreters is now statically -allocated (as part of _PyRuntime). Likewise for the initial thread state of -each interpreter. This means less allocation during runtime init, as well -as better memory locality for these key state objects. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst deleted file mode 100644 index f03dadebcf3b3..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-06-15-32-12.bpo-42918.Czpgtg.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix bug where the built-in :func:`compile` function did not always raise a -:exc:`SyntaxError` when passed multiple statements in 'single' mode. Patch by -Weipeng Hong. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-04-21.bpo-44525.6OWCgr.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-04-21.bpo-44525.6OWCgr.rst deleted file mode 100644 index 8e1533f477e3d..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-04-21.bpo-44525.6OWCgr.rst +++ /dev/null @@ -1,3 +0,0 @@ -Specialize the CALL_FUNCTION instruction for calls to builtin types with a -single argument. Speeds up ``range(x)``, ``list(x)``, and specifically -``type(obj)``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst deleted file mode 100644 index 68e4bfa9e77b1..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst +++ /dev/null @@ -1 +0,0 @@ -Improve compatibility of the :mod:`curses` module with NetBSD curses. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst deleted file mode 100644 index dd2f1ff4731e7..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-41-35.bpo-46025.pkEvW9.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix a crash in the :mod:`atexit` module involving functions that unregister -themselves before raising exceptions. Patch by Pablo Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-57-43.bpo-45654.MZc7ei.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-57-43.bpo-45654.MZc7ei.rst deleted file mode 100644 index 9072558a30e9a..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-09-11-57-43.bpo-45654.MZc7ei.rst +++ /dev/null @@ -1 +0,0 @@ -Deepfreeze :mod:`runpy`, patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst deleted file mode 100644 index 65c8b38cf8acc..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-10-09-10-32.bpo-46031.rM7JOX.rst +++ /dev/null @@ -1 +0,0 @@ -Add :opcode:`POP_JUMP_IF_NOT_NONE` and :opcode:`POP_JUMP_IF_NONE` opcodes to speed up conditional jumps. 
\ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-10-13-42-17.bpo-37971.6BC1Tx.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-10-13-42-17.bpo-37971.6BC1Tx.rst deleted file mode 100644 index 17f44f0e5b0a2..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-10-13-42-17.bpo-37971.6BC1Tx.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix a bug where the line numbers given in a traceback when a decorator -application raised an exception were wrong. - diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-13-14-42.bpo-46048._-OGD9.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-11-13-14-42.bpo-46048._-OGD9.rst deleted file mode 100644 index 647fb6df7a87b..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-13-14-42.bpo-46048._-OGD9.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fixes parsing of :file:`._pth` files on startup so that single-character -paths are correctly read. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-13-49-19.bpo-46049.9dNto2.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-11-13-49-19.bpo-46049.9dNto2.rst deleted file mode 100644 index 07c6cb45614b6..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-13-49-19.bpo-46049.9dNto2.rst +++ /dev/null @@ -1 +0,0 @@ -Ensure :file:`._pth` files work as intended on platforms other than Windows. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst deleted file mode 100644 index 7a302bcd7648b..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-17-40-34.bpo-46042.aqYxku.rst +++ /dev/null @@ -1,2 +0,0 @@ -Improve the location of the caret in :exc:`SyntaxError` exceptions emitted -by the symbol table. Patch by Pablo Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst deleted file mode 100644 index 6ca91f03445e2..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-12-05-30-21.bpo-46054.2P-foG.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix parser error when parsing non-utf8 characters in source files. Patch by -Pablo Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-12-15-52-41.bpo-45635.ADVaPT.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-12-15-52-41.bpo-45635.ADVaPT.rst deleted file mode 100644 index d2c97f564b2d2..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-12-15-52-41.bpo-45635.ADVaPT.rst +++ /dev/null @@ -1 +0,0 @@ -The code called from :c:func:`_PyErr_Display` was refactored to improve error handling. It now exits immediately upon an unrecoverable error. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-01-13.bpo-46039.TrCBbF.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-01-13.bpo-46039.TrCBbF.rst deleted file mode 100644 index 18bdc34d21c6b..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-01-13.bpo-46039.TrCBbF.rst +++ /dev/null @@ -1,2 +0,0 @@ -Remove the ``YIELD_FROM`` instruction and replace it with the ``SEND`` -instruction which performs the same operation, but without the loop. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-12-16.bpo-44525.4-FiSf.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-12-16.bpo-44525.4-FiSf.rst deleted file mode 100644 index d929666c9107d..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-13-17-12-16.bpo-44525.4-FiSf.rst +++ /dev/null @@ -1,8 +0,0 @@ -Replace the four call bytecode instructions which one pre-call instruction -and two call instructions. - -Removes ``CALL_FUNCTION``, ``CALL_FUNCTION_KW``, ``CALL_METHOD`` and -``CALL_METHOD_KW``. - -Adds ``CALL_NO_KW`` and ``CALL_KW`` call instructions, and -``PRECALL_METHOD`` prefix for pairing with ``LOAD_METHOD``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-15-15-17-04.bpo-45711.QK4QrB.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-15-15-17-04.bpo-45711.QK4QrB.rst deleted file mode 100644 index 717f89ff0e279..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-15-15-17-04.bpo-45711.QK4QrB.rst +++ /dev/null @@ -1 +0,0 @@ -The interpreter state's representation of handled exceptions (a.k.a exc_info, or _PyErr_StackItem) now has only the ``exc_value`` field, ``exc_type`` and ``exc_traceback`` have been removed as their values can be derived from ``exc_value``. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-16-23-27-05.bpo-46107.7q5an0.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-16-23-27-05.bpo-46107.7q5an0.rst deleted file mode 100644 index 3257805f2ab5a..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-16-23-27-05.bpo-46107.7q5an0.rst +++ /dev/null @@ -1 +0,0 @@ -Fix bug where :meth:`ExceptionGroup.split` and :meth:`ExceptionGroup.subgroup` did not copy the exception group's ``__note__`` field to the parts. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst deleted file mode 100644 index 593d2855972c4..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add a maximum recursion check to the PEG parser to avoid stack overflow. -Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-24-20-21-45.bpo-46055.R0QMVQ.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-24-20-21-45.bpo-46055.R0QMVQ.rst deleted file mode 100644 index 124138806f17d..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-24-20-21-45.bpo-46055.R0QMVQ.rst +++ /dev/null @@ -1,2 +0,0 @@ -Speed up shifting operation involving integers less than -:c:macro:`PyLong_BASE`. Patch by Xinhang Xu. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst deleted file mode 100644 index a2093f75c3b62..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst +++ /dev/null @@ -1 +0,0 @@ -Fix iterator cache mechanism of :class:`OrderedDict`. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst deleted file mode 100644 index ee0a9038837de..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-11-06-27.bpo-46202.IKx4v6.rst +++ /dev/null @@ -1,2 +0,0 @@ -Remove :opcode:`POP_EXCEPT_AND_RERAISE` and replace it by an equivalent -sequence of other opcodes. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst deleted file mode 100644 index 0cb3e90a28d75..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-01-14-23-57.bpo-46221.7oGp-I.rst +++ /dev/null @@ -1 +0,0 @@ -:opcode:`PREP_RERAISE_STAR` no longer pushes ``lasti`` to the stack. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst deleted file mode 100644 index 9115c9d70a331..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-02-23-55-13.bpo-46235.gUjp2v.rst +++ /dev/null @@ -1 +0,0 @@ -Certain sequence multiplication operations like ``[0] * 1_000`` are now faster due to reference-counting optimizations. Patch by Dennis Sweeney. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst deleted file mode 100644 index 1ffcc766725e6..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-03-11-36-34.bpo-46009.QZGrov.rst +++ /dev/null @@ -1 +0,0 @@ -Remove the ``GEN_START`` opcode. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst deleted file mode 100644 index a7702ebafbd46..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst +++ /dev/null @@ -1,3 +0,0 @@ -Correct the error message for unclosed parentheses when the tokenizer -doesn't reach the end of the source when the error is reported. Patch by -Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst deleted file mode 100644 index 92025a02d5b25..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-04-01-53-35.bpo-46208.i00Vz5.rst +++ /dev/null @@ -1 +0,0 @@ -Fix the regression of os.path.normpath("A/../../B") not returning expected "../B" but "B". \ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst deleted file mode 100644 index 967f6db1236b7..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-04-14-08-10.bpo-45923.rBp7r1.rst +++ /dev/null @@ -1,3 +0,0 @@ -Add RESUME opcode. This is a logical no-op. It is emitted by the compiler -anywhere a Python function can be entered. It is used by the interpreter to -perform tracing and optimizer checks. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst deleted file mode 100644 index 3acd2b09390a8..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-05-17-13-47.bpo-46006.hdH5Vn.rst +++ /dev/null @@ -1,5 +0,0 @@ -Fix a regression when a type method like ``__init__()`` is modified in a -subinterpreter. Fix a regression in ``_PyUnicode_EqualToASCIIId()`` and type -``update_slot()``. Revert the change which made the Unicode dictionary of -interned strings compatible with subinterpreters: the internal interned -dictionary is shared again by all interpreters. Patch by Victor Stinner. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst deleted file mode 100644 index fdcfe50a84aa1..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-06-10-54-07.bpo-46263.60dRZb.rst +++ /dev/null @@ -1,2 +0,0 @@ -We always expect the "use_frozen_modules" config to be set, now that -getpath.c was rewritten in pure Python and the logic improved. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst deleted file mode 100644 index 931a2603293c3..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-19-33-05.bpo-46237.9A6Hpq.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix the line number of tokenizer errors inside f-strings. Patch by Pablo -Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst deleted file mode 100644 index 558d2392d6102..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-22-13-59.bpo-46297.83ThTl.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fixed an interpreter crash on bootup with multiple PythonPaths set in -the Windows registry. Patch by Derzsi Dániel. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst deleted file mode 100644 index 816ff585f14e6..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-07-23-32-03.bpo-46289.NnjpVc.rst +++ /dev/null @@ -1,2 +0,0 @@ -ASDL declaration of ``FormattedValue`` has changed to reflect ``conversion`` -field is not optional. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst deleted file mode 100644 index c92c0cd47897b..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-10-12-34-17.bpo-46314.jId9Ky.rst +++ /dev/null @@ -1,2 +0,0 @@ -Remove spurious "call" event when creating a lambda function that was -accidentally introduced in 3.11a4. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst deleted file mode 100644 index 8bb9a995cce35..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-10-16-21-54.bpo-46331.h1AC-i.rst +++ /dev/null @@ -1,2 +0,0 @@ -Do not set line number of instruction storing doc-string. Fixes regression -introduced in 3.11 alpha. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst deleted file mode 100644 index cd04f060826b2..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix a crash in the parser when retrieving the error text for multi-line -f-strings expressions that do not start in the first line of the string. 
-Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst deleted file mode 100644 index fc12d6ba146ca..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-13-57-00.bpo-46347.Gd8M-S.rst +++ /dev/null @@ -1 +0,0 @@ -Fix memory leak in PyEval_EvalCodeEx. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst deleted file mode 100644 index 4ed088f9898eb..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst +++ /dev/null @@ -1,5 +0,0 @@ -:c:func:`Py_EndInterpreter` now explicitly untracks all objects currently -tracked by the GC. Previously, if an object was used later by another -interpreter, calling :c:func:`PyObject_GC_UnTrack` on the object crashed if the -previous or the next object of the :c:type:`PyGC_Head` structure became a -dangling pointer. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst b/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst deleted file mode 100644 index a3e16c9fdd0e6..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst +++ /dev/null @@ -1 +0,0 @@ -Update the documentation for the :func:`globals` function. diff --git a/Misc/NEWS.d/next/Documentation/2021-12-16-21-13-55.bpo-46109.0-RNzu.rst b/Misc/NEWS.d/next/Documentation/2021-12-16-21-13-55.bpo-46109.0-RNzu.rst deleted file mode 100644 index 78d5149c80022..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-12-16-21-13-55.bpo-46109.0-RNzu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Extracted ``importlib.resources`` and ``importlib.resources.abc`` documentation into -separate files. diff --git a/Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst b/Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst deleted file mode 100644 index 17f67472e2ab0..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-12-21-12-45-57.bpo-46120.PE0DmJ.rst +++ /dev/null @@ -1 +0,0 @@ -State that ``|`` is preferred for readability over ``Union`` in the :mod:`typing` docs. diff --git a/Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst b/Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst deleted file mode 100644 index f14ada607522e..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-12-30-19-12-24.bpo-46196.UvQ8Sq.rst +++ /dev/null @@ -1 +0,0 @@ -Document method :meth:`cmd.Cmd.columnize`. diff --git a/Misc/NEWS.d/next/Library/2020-11-26-10-23-46.bpo-42413.HFikOl.rst b/Misc/NEWS.d/next/Library/2020-11-26-10-23-46.bpo-42413.HFikOl.rst deleted file mode 100644 index 85b7fe25074b3..0000000000000 --- a/Misc/NEWS.d/next/Library/2020-11-26-10-23-46.bpo-42413.HFikOl.rst +++ /dev/null @@ -1,2 +0,0 @@ -Replace ``concurrent.futures.TimeoutError`` and ``asyncio.TimeoutError`` -with builtin :exc:`TimeoutError`, keep these names as deprecated aliases. diff --git a/Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst b/Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst deleted file mode 100644 index 67777817ed550..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-05-19-12-35-49.bpo-44092.hiSlI5.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fetch across rollback no longer raises :exc:`~sqlite3.InterfaceError`. 
Instead -we leave it to the SQLite library to handle these cases. -Patch by Erlend E. Aasland. diff --git a/Misc/NEWS.d/next/Library/2021-10-28-11-40-59.bpo-45643.jeiPiX.rst b/Misc/NEWS.d/next/Library/2021-10-28-11-40-59.bpo-45643.jeiPiX.rst deleted file mode 100644 index e1592ed53ab25..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-10-28-11-40-59.bpo-45643.jeiPiX.rst +++ /dev/null @@ -1 +0,0 @@ -Added :data:`signal.SIGSTKFLT` on platforms where this signal is defined. diff --git a/Misc/NEWS.d/next/Library/2021-11-24-12-25-42.bpo-25066.YIcIkn.rst b/Misc/NEWS.d/next/Library/2021-11-24-12-25-42.bpo-25066.YIcIkn.rst deleted file mode 100644 index df19d041644c2..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-24-12-25-42.bpo-25066.YIcIkn.rst +++ /dev/null @@ -1 +0,0 @@ -Added a :meth:`__repr__` method to :class:`multiprocessing.Event` objects, patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst b/Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst deleted file mode 100644 index a37c0b869157d..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-24-19-09-14.bpo-23882._tctCv.rst +++ /dev/null @@ -1,2 +0,0 @@ -Remove namespace package (PEP 420) support from unittest discovery. It was -introduced in Python 3.4 but has been broken since Python 3.7. diff --git a/Misc/NEWS.d/next/Library/2021-11-29-19-37-20.bpo-44674.NijWLt.rst b/Misc/NEWS.d/next/Library/2021-11-29-19-37-20.bpo-44674.NijWLt.rst deleted file mode 100644 index 79e7a08b3b174..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-29-19-37-20.bpo-44674.NijWLt.rst +++ /dev/null @@ -1,6 +0,0 @@ -Change how dataclasses disallows mutable default values. It used to -use a list of known types (list, dict, set). Now it disallows -unhashable objects to be defaults. It's using unhashability as a -proxy for mutability. Patch by Eric V. Smith, idea by Raymond -Hettinger. - diff --git a/Misc/NEWS.d/next/Library/2021-12-02-11-55-45.bpo-45874.dtJIsN.rst b/Misc/NEWS.d/next/Library/2021-12-02-11-55-45.bpo-45874.dtJIsN.rst deleted file mode 100644 index ef793cf30a80e..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-02-11-55-45.bpo-45874.dtJIsN.rst +++ /dev/null @@ -1,3 +0,0 @@ -The empty query string, consisting of no query arguments, is now handled -correctly in ``urllib.parse.parse_qsl``. This caused problems before when -strict parsing was enabled. diff --git a/Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst b/Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst deleted file mode 100644 index e5201b0dfde2d..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-07-21-55-22.bpo-45755.bRqKGa.rst +++ /dev/null @@ -1,3 +0,0 @@ -:mod:`typing` generic aliases now reveal the class attributes of the -original generic class when passed to ``dir()``. This was the behavior up to -Python 3.6, but was changed in 3.7-3.9. diff --git a/Misc/NEWS.d/next/Library/2021-12-08-19-15-03.bpo-46016.s9PuyF.rst b/Misc/NEWS.d/next/Library/2021-12-08-19-15-03.bpo-46016.s9PuyF.rst deleted file mode 100644 index 485bd86f3145e..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-08-19-15-03.bpo-46016.s9PuyF.rst +++ /dev/null @@ -1 +0,0 @@ -Adding :attr:`F_DUP2FD` and :attr:`F_DUP2FD_CLOEXEC` constants from FreeBSD into the fcntl module. 
diff --git a/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst b/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst deleted file mode 100644 index 6ff76f58779d2..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst +++ /dev/null @@ -1 +0,0 @@ -Ensure that :func:`math.expm1` does not raise on underflow. diff --git a/Misc/NEWS.d/next/Library/2021-12-09-11-50-32.bpo-27062.R5vii6.rst b/Misc/NEWS.d/next/Library/2021-12-09-11-50-32.bpo-27062.R5vii6.rst deleted file mode 100644 index 3ca22b69d456f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-09-11-50-32.bpo-27062.R5vii6.rst +++ /dev/null @@ -1 +0,0 @@ -Add :attr:`__all__` to :mod:`inspect`, patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-10-03-13-57.bpo-46014.3xYdST.rst b/Misc/NEWS.d/next/Library/2021-12-10-03-13-57.bpo-46014.3xYdST.rst deleted file mode 100644 index 90aacaf7e088f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-10-03-13-57.bpo-46014.3xYdST.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add ability to use ``typing.Union`` and ``types.UnionType`` as dispatch -argument to ``functools.singledispatch``. Patch provided by Yurii Karabas. diff --git a/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst b/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst deleted file mode 100644 index 97a553d7ba29f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst +++ /dev/null @@ -1,5 +0,0 @@ -The ``registry()`` method of :func:`functools.singledispatch` functions -checks now the first argument or the first parameter annotation and raises a -TypeError if it is not supported. Previously unsupported "types" were -ignored (e.g. ``typing.List[int]``) or caused an error at calling time (e.g. -``list[int]``). diff --git a/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst b/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst deleted file mode 100644 index c68e98ff0630b..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix help for the :mod:`signal` module. Some functions (e.g. ``signal()`` and -``getsignal()``) were omitted. diff --git a/Misc/NEWS.d/next/Library/2021-12-12-13-41-47.bpo-16594.yfC7L4.rst b/Misc/NEWS.d/next/Library/2021-12-12-13-41-47.bpo-16594.yfC7L4.rst deleted file mode 100644 index a977a6b21dd64..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-12-13-41-47.bpo-16594.yfC7L4.rst +++ /dev/null @@ -1 +0,0 @@ -Add allow allow_reuse_port flag in socketserver. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst b/Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst deleted file mode 100644 index f8cd911ea6365..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-13-15-51-16.bpo-45615.hVx83Q.rst +++ /dev/null @@ -1 +0,0 @@ -Functions in the :mod:`traceback` module raise :exc:`TypeError` rather than :exc:`AttributeError` when an exception argument is not of type :exc:`BaseException`. 
\ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst b/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst deleted file mode 100644 index 379dbb55c7ca8..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst +++ /dev/null @@ -1 +0,0 @@ -:mod:`argparse` raises :exc:`ValueError` with clear message when trying to render usage for an empty mutually-exclusive group. Previously it raised a cryptic :exc:`IndexError`. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-15-19-24-54.bpo-22047.gBV4vT.rst b/Misc/NEWS.d/next/Library/2021-12-15-19-24-54.bpo-22047.gBV4vT.rst deleted file mode 100644 index a381ad88af6c5..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-15-19-24-54.bpo-22047.gBV4vT.rst +++ /dev/null @@ -1,3 +0,0 @@ -Calling :meth:`add_argument_group` on an argument group is deprecated. Calling :meth:`add_argument_group` or :meth:`add_mutually_exclusive_group` on a mutually exclusive group is deprecated. - -These features were never supported and do not always work correctly. The functions exist on the API by accident through inheritance and will be removed in the future. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-16-12-54-21.bpo-22815.0NRH8s.rst b/Misc/NEWS.d/next/Library/2021-12-16-12-54-21.bpo-22815.0NRH8s.rst deleted file mode 100644 index 5c4600f316ac3..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-16-12-54-21.bpo-22815.0NRH8s.rst +++ /dev/null @@ -1,2 +0,0 @@ -Print unexpected successes together with failures and errors in summary in -:class:`unittest.TextTestResult`. diff --git a/Misc/NEWS.d/next/Library/2021-12-16-13-54-55.bpo-44893.I7aLiW.rst b/Misc/NEWS.d/next/Library/2021-12-16-13-54-55.bpo-44893.I7aLiW.rst deleted file mode 100644 index e77c6ad2a483e..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-16-13-54-55.bpo-44893.I7aLiW.rst +++ /dev/null @@ -1,3 +0,0 @@ -EntryPoint objects are no longer tuples. Recommended means to access is by -attribute ('.name', '.group') or accessor ('.load()'). Access by index is -deprecated and will raise deprecation warning. diff --git a/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst b/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst deleted file mode 100644 index 145edccb2e7a5..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst +++ /dev/null @@ -1,2 +0,0 @@ -Honor spec when generating requirement specs with urls and extras -(importlib_metadata 4.8.3). diff --git a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst deleted file mode 100644 index cc5cd0067e61f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst +++ /dev/null @@ -1 +0,0 @@ -:func:`concurrent.futures.wait` no longer blocks forever when given duplicate Futures. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Library/2021-12-17-13-22-37.bpo-37578._tluuR.rst b/Misc/NEWS.d/next/Library/2021-12-17-13-22-37.bpo-37578._tluuR.rst deleted file mode 100644 index 455d0648a94c2..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-17-13-22-37.bpo-37578._tluuR.rst +++ /dev/null @@ -1,3 +0,0 @@ -Add *include_hidden* parameter to :func:`~glob.glob` and :func:`~glob.iglob` to -match hidden files and directories when using special characters like ``*``, -``**``, ``?`` and ``[]``. 
diff --git a/Misc/NEWS.d/next/Library/2021-12-17-16-27-44.bpo-46118.euAy0E.rst b/Misc/NEWS.d/next/Library/2021-12-17-16-27-44.bpo-46118.euAy0E.rst deleted file mode 100644 index c53e5765b9785..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-17-16-27-44.bpo-46118.euAy0E.rst +++ /dev/null @@ -1 +0,0 @@ -Moved importlib.resources and its related functionality to a package. diff --git a/Misc/NEWS.d/next/Library/2021-12-18-18-29-07.bpo-46125.LLmcox.rst b/Misc/NEWS.d/next/Library/2021-12-18-18-29-07.bpo-46125.LLmcox.rst deleted file mode 100644 index d2c3a32dfdc13..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-18-18-29-07.bpo-46125.LLmcox.rst +++ /dev/null @@ -1,2 +0,0 @@ -Refactor tests to test traversable API directly. Includes changes from -importlib 5.4.0. diff --git a/Misc/NEWS.d/next/Library/2021-12-19-00-00-48.bpo-45321.OyuhaY.rst b/Misc/NEWS.d/next/Library/2021-12-19-00-00-48.bpo-45321.OyuhaY.rst deleted file mode 100644 index 171bf8a43e645..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-19-00-00-48.bpo-45321.OyuhaY.rst +++ /dev/null @@ -1 +0,0 @@ -Added missing error codes to module ``xml.parsers.expat.errors``. diff --git a/Misc/NEWS.d/next/Library/2021-12-23-14-36-58.bpo-43424.d9x2JZ.rst b/Misc/NEWS.d/next/Library/2021-12-23-14-36-58.bpo-43424.d9x2JZ.rst deleted file mode 100644 index aa5f8d4211c37..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-23-14-36-58.bpo-43424.d9x2JZ.rst +++ /dev/null @@ -1 +0,0 @@ -Deprecate :attr:`webbrowser.MacOSXOSAScript._name` and use ``name`` instead. diff --git a/Misc/NEWS.d/next/Library/2021-12-25-11-11-21.bpo-46176.EOY9wd.rst b/Misc/NEWS.d/next/Library/2021-12-25-11-11-21.bpo-46176.EOY9wd.rst deleted file mode 100644 index 4a50c2617200e..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-25-11-11-21.bpo-46176.EOY9wd.rst +++ /dev/null @@ -1 +0,0 @@ -Adding the ``MAP_STACK`` constant for the mmap module. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-27-15-52-28.bpo-37295.s3LPo0.rst b/Misc/NEWS.d/next/Library/2021-12-27-15-52-28.bpo-37295.s3LPo0.rst deleted file mode 100644 index a624f10637002..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-27-15-52-28.bpo-37295.s3LPo0.rst +++ /dev/null @@ -1 +0,0 @@ -Add fast path for ``0 <= k <= n <= 67`` for :func:`math.comb`. diff --git a/Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst b/Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst deleted file mode 100644 index 1fe28792529d0..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-01-17-34-32.bpo-46222.s2fzZU.rst +++ /dev/null @@ -1 +0,0 @@ -Adding ``SF_NOCACHE`` sendfile constant for FreeBSD for the posixmodule. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst deleted file mode 100644 index 1617b0ed0538a..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-03-12-19-10.bpo-46238.lANhCi.rst +++ /dev/null @@ -1 +0,0 @@ -Reuse ``_winapi`` constants in ``asyncio.windows_events``. diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst deleted file mode 100644 index 202febf84fd10..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst +++ /dev/null @@ -1,2 +0,0 @@ -Improve error message when importing :mod:`asyncio.windows_events` on -non-Windows. 
diff --git a/Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst b/Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst deleted file mode 100644 index 1b1fa5d376527..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-03-21-03-50.bpo-41011.uULmGi.rst +++ /dev/null @@ -1,3 +0,0 @@ -Added two new variables to *pyvenv.cfg* which is generated by :mod:`venv` -module: *executable* for the executable and *command* for the command line used -to create the environment. diff --git a/Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst b/Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst deleted file mode 100644 index 72ae56ec412a6..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-04-11-04-20.bpo-46257._o2ADe.rst +++ /dev/null @@ -1,4 +0,0 @@ -Optimized the mean, variance, and stdev functions in the statistics module. -If the input is an iterator, it is consumed in a single pass rather than -eating memory by conversion to a list. The single pass algorithm is about -twice as fast as the previous two pass code. diff --git a/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst b/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst deleted file mode 100644 index 354dcb0106595..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst +++ /dev/null @@ -1,4 +0,0 @@ -Improve day constants in :mod:`calendar`. - -Now all constants (`MONDAY` ... `SUNDAY`) are documented, tested, and added -to ``__all__``. diff --git a/Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst b/Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst deleted file mode 100644 index 5d3687aaddfea..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-05-18-16-13.bpo-46269.K16Z1S.rst +++ /dev/null @@ -1 +0,0 @@ -Remove special-casing of ``__new__`` in :meth:`enum.Enum.__dir__`. diff --git a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst deleted file mode 100644 index 40849044cf1c8..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst +++ /dev/null @@ -1,2 +0,0 @@ -Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop -implementations already support it. diff --git a/Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst b/Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst deleted file mode 100644 index 5ca536a97c9cd..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-06-21-31-14.bpo-46244.hjyfJj.rst +++ /dev/null @@ -1,2 +0,0 @@ -Removed ``__slots__`` from :class:`typing.ParamSpec` and :class:`typing.TypeVar`. -They served no purpose. Patch by Arie Bovenberg. diff --git a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst deleted file mode 100644 index 0fedc9dfb8fb1..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix possible segfault when importing the :mod:`asyncio` module from -different sub-interpreters in parallel. Patch by Erlend E. Aasland. 
diff --git a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst deleted file mode 100644 index af72923bbd759..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL -3.0.0 in FIPS mode. diff --git a/Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst b/Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst deleted file mode 100644 index 02943c95a7d79..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-08-13-53-25.bpo-46306.1_es8z.rst +++ /dev/null @@ -1,2 +0,0 @@ -Assume that :class:`types.CodeType` always has :attr:`types.CodeType.co_firstlineno` in -:mod:`doctest`. diff --git a/Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst b/Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst deleted file mode 100644 index 6207c424ce9c0..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-10-07-51-43.bpo-46307.SKvOIY.rst +++ /dev/null @@ -1 +0,0 @@ -Add :meth:`string.Template.is_valid` and :meth:`string.Template.get_identifiers` methods. diff --git a/Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst b/Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst deleted file mode 100644 index fec790d52cef3..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-10-11-53-15.bpo-46328.6i9Wvq.rst +++ /dev/null @@ -1 +0,0 @@ -Added the :meth:`sys.exception` method which returns the active exception instance. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst b/Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst deleted file mode 100644 index 31d484fc77f1f..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-11-04-28-09.bpo-46342.5QVEH1.rst +++ /dev/null @@ -1,2 +0,0 @@ -The ``@typing.final`` decorator now sets the ``__final__`` attribute on the -decorated object to allow runtime introspection. Patch by Jelle Zijlstra. diff --git a/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst b/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst deleted file mode 100644 index 6878cea032387..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst +++ /dev/null @@ -1 +0,0 @@ -Fix test case for OpenSSL 3.0.1 version. OpenSSL 3.0 uses ``0xMNN00PP0L``. diff --git a/Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst b/Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst deleted file mode 100644 index 4ef0fe6f6d5fa..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst +++ /dev/null @@ -1 +0,0 @@ -Fixed :mod:`asyncio` tests in python optimized mode. Patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst b/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst deleted file mode 100644 index b06436a4c8460..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst +++ /dev/null @@ -1,2 +0,0 @@ -Rewrite ``asyncio.locks`` tests with -:class:`unittest.IsolatedAsyncioTestCase` usage. 
diff --git a/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst b/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst deleted file mode 100644 index 8ef9cd9b4a594..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst +++ /dev/null @@ -1,2 +0,0 @@ -Now ``fakename`` in ``test_pathlib.PosixPathTest.test_expanduser`` is checked -to be non-existent. diff --git a/Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst b/Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst deleted file mode 100644 index 456d1359e4732..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-05-01-38-45.bpo-46262.MhiLWP.rst +++ /dev/null @@ -1 +0,0 @@ -Cover ``ValueError`` path in tests for :meth:`enum.Flag._missing_`. diff --git a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst deleted file mode 100644 index 0334af4e3cbe8..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory -with junk byte. diff --git a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst deleted file mode 100644 index 7c6121fb16249..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst +++ /dev/null @@ -1 +0,0 @@ -Fix hang in runtest_mp due to race condition diff --git a/Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst b/Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst deleted file mode 100644 index 9e0d470e269cb..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-08-00-00-38.bpo-46296.vqxgTm.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add a test case for :mod:`enum` -with ``_use_args_ == True`` and ``_member_type_ == object``. diff --git a/Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst b/Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst deleted file mode 100644 index 78b3cd01a03f6..0000000000000 --- a/Misc/NEWS.d/next/Windows/2022-01-07-22-55-11.bpo-46217.tgJEsB.rst +++ /dev/null @@ -1,2 +0,0 @@ -Removed parameter that is unsupported on Windows 8.1 and early Windows 10 -and may have caused build or runtime failures. diff --git a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst deleted file mode 100644 index fc953b85dcc2a..0000000000000 --- a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst +++ /dev/null @@ -1,2 +0,0 @@ -The Python Launcher app for macOS now properly launches scripts and, if -necessary, the Terminal app when running on recent macOS releases. diff --git a/README.rst b/README.rst index ff9d7858fd5bf..dcc1c39d78a08 100644 --- a/README.rst +++ b/README.rst @@ -1,4 +1,4 @@ -This is Python version 3.11.0 alpha 3 +This is Python version 3.11.0 alpha 4 ===================================== .. 
image:: https://github.com/python/cpython/workflows/Tests/badge.svg From webhook-mailer at python.org Fri Jan 14 16:50:43 2022 From: webhook-mailer at python.org (ambv) Date: Fri, 14 Jan 2022 21:50:43 -0000 Subject: [Python-checkins] Python 3.9.10 Message-ID: https://github.com/python/cpython/commit/f2f3f537829ab0ef6948be5ee7f46b8ce8213ff2 commit: f2f3f537829ab0ef6948be5ee7f46b8ce8213ff2 branch: 3.9 author: Łukasz Langa committer: ambv date: 2022-01-13T22:21:23+01:00 summary: Python 3.9.10 files: A Misc/NEWS.d/3.9.10.rst D Misc/NEWS.d/next/Build/2021-11-24-17-14-06.bpo-45881.GTXXLk.rst D Misc/NEWS.d/next/Build/2021-11-25-09-15-04.bpo-41498.qAk5eo.rst D Misc/NEWS.d/next/Build/2021-11-25-13-53-36.bpo-45866.ZH1W8N.rst D Misc/NEWS.d/next/Build/2021-11-25-20-26-06.bpo-33393.24YNtM.rst D Misc/NEWS.d/next/Build/2021-12-06-09-31-27.bpo-44035.BiO4XC.rst D Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst D Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst D Misc/NEWS.d/next/C API/2021-11-09-15-42-11.bpo-39026.sUnYWn.rst D Misc/NEWS.d/next/Core and Builtins/2021-11-15-12-08-27.bpo-42540.V2w107.rst D Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-00-27.bpo-45820.2X6Psr.rst D Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-41-04.bpo-45822.OT6ueS.rst D Misc/NEWS.d/next/Core and Builtins/2021-11-19-19-21-48.bpo-45806.DflDMe.rst D Misc/NEWS.d/next/Core and Builtins/2021-11-23-12-06-41.bpo-45614.fIekgI.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst D Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst D Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst D Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst D Misc/NEWS.d/next/Documentation/2021-05-24-05-00-12.bpo-43905.tBIndE.rst D Misc/NEWS.d/next/Documentation/2021-06-21-17-51-51.bpo-25381.7Kn-_H.rst D Misc/NEWS.d/next/Documentation/2021-11-18-00-07-40.bpo-45788.qibUoB.rst D Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst D Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst D Misc/NEWS.d/next/Library/2018-08-21-16-20-33.bpo-29620.xxx666.rst D Misc/NEWS.d/next/Library/2021-04-20-14-14-16.bpo-43498.L_Hq-8.rst D Misc/NEWS.d/next/Library/2021-10-28-22-58-14.bpo-45662.sJd7Ir.rst D Misc/NEWS.d/next/Library/2021-10-28-23-11-59.bpo-45663.J90N5R.rst D Misc/NEWS.d/next/Library/2021-10-28-23-40-54.bpo-45664.7dqtxQ.rst D Misc/NEWS.d/next/Library/2021-11-17-11-38-30.bpo-41735.2feh9v.rst D Misc/NEWS.d/next/Library/2021-11-17-19-25-37.bpo-45831.9-TojK.rst D Misc/NEWS.d/next/Library/2021-11-28-15-30-34.bpo-37658.8Hno7d.rst D Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst D Misc/NEWS.d/next/Library/2021-12-04-20-08-42.bpo-27946.-Vuarf.rst D Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst D Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst D Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst D Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst D Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst D Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst D Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst D Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst D Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst D 
Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst D Misc/NEWS.d/next/Tests/2021-11-17-14-28-08.bpo-45835.Mgyhjx.rst D Misc/NEWS.d/next/Tests/2021-11-28-15-25-02.bpo-19460.lr0aWs.rst D Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst D Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst D Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst D Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst D Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst D Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst D Misc/NEWS.d/next/Tools-Demos/2021-11-18-11-20-21.bpo-45838.TH6mwc.rst D Misc/NEWS.d/next/Windows/2021-11-26-18-17-41.bpo-45901.c5IBqM.rst D Misc/NEWS.d/next/macOS/2021-12-05-23-52-03.bpo-45732.-BWrnh.rst D Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst M Include/patchlevel.h M Lib/pydoc_data/topics.py M README.rst diff --git a/Include/patchlevel.h b/Include/patchlevel.h index 19e90975330b3..23fbd55ffcd66 100644 --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -18,12 +18,12 @@ /*--start constants--*/ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 9 -#define PY_MICRO_VERSION 9 +#define PY_MICRO_VERSION 10 #define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.9.9+" +#define PY_VERSION "3.9.10" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. diff --git a/Lib/pydoc_data/topics.py b/Lib/pydoc_data/topics.py index 890a61668df1c..67a51977cfe74 100644 --- a/Lib/pydoc_data/topics.py +++ b/Lib/pydoc_data/topics.py @@ -1,5 +1,5 @@ # -*- coding: utf-8 -*- -# Autogenerated by Sphinx on Mon Nov 15 18:21:10 2021 +# Autogenerated by Sphinx on Thu Jan 13 21:46:32 2022 topics = {'assert': 'The "assert" statement\n' '**********************\n' '\n' @@ -979,7 +979,7 @@ '"super(B,\n' ' obj).m()" searches "obj.__class__.__mro__" for the ' 'base class "A"\n' - ' immediately preceding "B" and then invokes the ' + ' immediately following "B" and then invokes the ' 'descriptor with the\n' ' call: "A.__dict__[\'m\'].__get__(obj, ' 'obj.__class__)".\n' @@ -1010,14 +1010,15 @@ 'can be\n' 'overridden by instances.\n' '\n' - 'Python methods (including "staticmethod()" and ' - '"classmethod()") are\n' - 'implemented as non-data descriptors. Accordingly, ' - 'instances can\n' - 'redefine and override methods. This allows individual ' - 'instances to\n' - 'acquire behaviors that differ from other instances of ' - 'the same class.\n' + 'Python methods (including those decorated with ' + '"@staticmethod" and\n' + '"@classmethod") are implemented as non-data ' + 'descriptors. Accordingly,\n' + 'instances can redefine and override methods. This ' + 'allows individual\n' + 'instances to acquire behaviors that differ from other ' + 'instances of the\n' + 'same class.\n' '\n' 'The "property()" function is implemented as a data ' 'descriptor.\n' @@ -1030,12 +1031,12 @@ '\n' '*__slots__* allow us to explicitly declare data members ' '(like\n' - 'properties) and deny the creation of *__dict__* and ' + 'properties) and deny the creation of "__dict__" and ' '*__weakref__*\n' '(unless explicitly declared in *__slots__* or available ' 'in a parent.)\n' '\n' - 'The space saved over using *__dict__* can be ' + 'The space saved over using "__dict__" can be ' 'significant. 
Attribute\n' 'lookup speed can be significantly improved as well.\n' '\n' @@ -1047,7 +1048,7 @@ '*__slots__*\n' ' reserves space for the declared variables and ' 'prevents the\n' - ' automatic creation of *__dict__* and *__weakref__* ' + ' automatic creation of "__dict__" and *__weakref__* ' 'for each\n' ' instance.\n' '\n' @@ -1056,11 +1057,11 @@ '--------------------------\n' '\n' '* When inheriting from a class without *__slots__*, the ' - '*__dict__* and\n' + '"__dict__" and\n' ' *__weakref__* attribute of the instances will always ' 'be accessible.\n' '\n' - '* Without a *__dict__* variable, instances cannot be ' + '* Without a "__dict__" variable, instances cannot be ' 'assigned new\n' ' variables not listed in the *__slots__* definition. ' 'Attempts to\n' @@ -1074,28 +1075,28 @@ '\n' '* Without a *__weakref__* variable for each instance, ' 'classes defining\n' - ' *__slots__* do not support weak references to its ' - 'instances. If weak\n' - ' reference support is needed, then add ' + ' *__slots__* do not support "weak references" to its ' + 'instances. If\n' + ' weak reference support is needed, then add ' '"\'__weakref__\'" to the\n' ' sequence of strings in the *__slots__* declaration.\n' '\n' '* *__slots__* are implemented at the class level by ' 'creating\n' - ' descriptors (Implementing Descriptors) for each ' - 'variable name. As a\n' - ' result, class attributes cannot be used to set default ' - 'values for\n' - ' instance variables defined by *__slots__*; otherwise, ' - 'the class\n' - ' attribute would overwrite the descriptor assignment.\n' + ' descriptors for each variable name. As a result, ' + 'class attributes\n' + ' cannot be used to set default values for instance ' + 'variables defined\n' + ' by *__slots__*; otherwise, the class attribute would ' + 'overwrite the\n' + ' descriptor assignment.\n' '\n' '* The action of a *__slots__* declaration is not limited ' 'to the class\n' ' where it is defined. *__slots__* declared in parents ' 'are available\n' ' in child classes. However, child subclasses will get a ' - '*__dict__*\n' + '"__dict__"\n' ' and *__weakref__* unless they also define *__slots__* ' '(which should\n' ' only contain names of any *additional* slots).\n' @@ -1115,13 +1116,19 @@ ' ?variable-length? built-in types such as "int", ' '"bytes" and "tuple".\n' '\n' - '* Any non-string iterable may be assigned to ' - '*__slots__*. Mappings may\n' - ' also be used; however, in the future, special meaning ' - 'may be\n' - ' assigned to the values corresponding to each key.\n' + '* Any non-string *iterable* may be assigned to ' + '*__slots__*.\n' '\n' - '* *__class__* assignment works only if both classes have ' + '* If a "dictionary" is used to assign *__slots__*, the ' + 'dictionary keys\n' + ' will be used as the slot names. The values of the ' + 'dictionary can be\n' + ' used to provide per-attribute docstrings that will be ' + 'recognised by\n' + ' "inspect.getdoc()" and displayed in the output of ' + '"help()".\n' + '\n' + '* "__class__" assignment works only if both classes have ' 'the same\n' ' *__slots__*.\n' '\n' @@ -1133,10 +1140,10 @@ 'violations\n' ' raise "TypeError".\n' '\n' - '* If an iterator is used for *__slots__* then a ' - 'descriptor is created\n' - ' for each of the iterator?s values. However, the ' - '*__slots__*\n' + '* If an *iterator* is used for *__slots__* then a ' + '*descriptor* is\n' + ' created for each of the iterator?s values. 
However, ' + 'the *__slots__*\n' ' attribute will be an empty iterator.\n', 'attribute-references': 'Attribute references\n' '********************\n' @@ -3763,17 +3770,16 @@ 'debugger will pause execution just before the first line of the\n' 'module.\n' '\n' - 'The typical usage to break into the debugger from a running ' - 'program is\n' - 'to insert\n' + 'The typical usage to break into the debugger is to insert:\n' '\n' ' import pdb; pdb.set_trace()\n' '\n' - 'at the location you want to break into the debugger. You can ' - 'then\n' - 'step through the code following this statement, and continue ' - 'running\n' - 'without the debugger using the "continue" command.\n' + 'at the location you want to break into the debugger, and then ' + 'run the\n' + 'program. You can then step through the code following this ' + 'statement,\n' + 'and continue running without the debugger using the "continue"\n' + 'command.\n' '\n' 'New in version 3.7: The built-in "breakpoint()", when called ' 'with\n' @@ -7655,61 +7661,62 @@ '\n' 'The following methods can be defined to implement ' 'container objects.\n' - 'Containers usually are sequences (such as lists or tuples) ' - 'or mappings\n' - '(like dictionaries), but can represent other containers as ' - 'well. The\n' - 'first set of methods is used either to emulate a sequence ' - 'or to\n' - 'emulate a mapping; the difference is that for a sequence, ' - 'the\n' - 'allowable keys should be the integers *k* for which "0 <= ' - 'k < N" where\n' - '*N* is the length of the sequence, or slice objects, which ' - 'define a\n' - 'range of items. It is also recommended that mappings ' - 'provide the\n' - 'methods "keys()", "values()", "items()", "get()", ' - '"clear()",\n' - '"setdefault()", "pop()", "popitem()", "copy()", and ' - '"update()"\n' - 'behaving similar to those for Python?s standard dictionary ' + 'Containers usually are *sequences* (such as "lists" or ' + '"tuples") or\n' + '*mappings* (like "dictionaries"), but can represent other ' + 'containers\n' + 'as well. The first set of methods is used either to ' + 'emulate a\n' + 'sequence or to emulate a mapping; the difference is that ' + 'for a\n' + 'sequence, the allowable keys should be the integers *k* ' + 'for which "0\n' + '<= k < N" where *N* is the length of the sequence, or ' + '"slice" objects,\n' + 'which define a range of items. It is also recommended ' + 'that mappings\n' + 'provide the methods "keys()", "values()", "items()", ' + '"get()",\n' + '"clear()", "setdefault()", "pop()", "popitem()", "copy()", ' + 'and\n' + '"update()" behaving similar to those for Python?s ' + 'standard\n' + '"dictionary" objects. The "collections.abc" module ' + 'provides a\n' + '"MutableMapping" *abstract base class* to help create ' + 'those methods\n' + 'from a base set of "__getitem__()", "__setitem__()", ' + '"__delitem__()",\n' + 'and "keys()". Mutable sequences should provide methods ' + '"append()",\n' + '"count()", "index()", "extend()", "insert()", "pop()", ' + '"remove()",\n' + '"reverse()" and "sort()", like Python standard "list" ' 'objects.\n' - 'The "collections.abc" module provides a "MutableMapping" ' - 'abstract base\n' - 'class to help create those methods from a base set of ' - '"__getitem__()",\n' - '"__setitem__()", "__delitem__()", and "keys()". Mutable ' - 'sequences\n' - 'should provide methods "append()", "count()", "index()", ' - '"extend()",\n' - '"insert()", "pop()", "remove()", "reverse()" and "sort()", ' - 'like Python\n' - 'standard list objects. 
Finally, sequence types should ' - 'implement\n' - 'addition (meaning concatenation) and multiplication ' + 'Finally, sequence types should implement addition ' '(meaning\n' - 'repetition) by defining the methods "__add__()", ' - '"__radd__()",\n' - '"__iadd__()", "__mul__()", "__rmul__()" and "__imul__()" ' - 'described\n' - 'below; they should not define other numerical operators. ' + 'concatenation) and multiplication (meaning repetition) by ' + 'defining the\n' + 'methods "__add__()", "__radd__()", "__iadd__()", ' + '"__mul__()",\n' + '"__rmul__()" and "__imul__()" described below; they should ' + 'not define\n' + 'other numerical operators. It is recommended that both ' + 'mappings and\n' + 'sequences implement the "__contains__()" method to allow ' + 'efficient use\n' + 'of the "in" operator; for mappings, "in" should search the ' + 'mapping?s\n' + 'keys; for sequences, it should search through the values. ' 'It is\n' - 'recommended that both mappings and sequences implement ' + 'further recommended that both mappings and sequences ' + 'implement the\n' + '"__iter__()" method to allow efficient iteration through ' 'the\n' - '"__contains__()" method to allow efficient use of the "in" ' - 'operator;\n' - 'for mappings, "in" should search the mapping?s keys; for ' - 'sequences, it\n' - 'should search through the values. It is further ' - 'recommended that both\n' - 'mappings and sequences implement the "__iter__()" method ' - 'to allow\n' - 'efficient iteration through the container; for mappings, ' - '"__iter__()"\n' - 'should iterate through the object?s keys; for sequences, ' - 'it should\n' - 'iterate through the values.\n' + 'container; for mappings, "__iter__()" should iterate ' + 'through the\n' + 'object?s keys; for sequences, it should iterate through ' + 'the values.\n' '\n' 'object.__len__(self)\n' '\n' @@ -7768,22 +7775,24 @@ 'object.__getitem__(self, key)\n' '\n' ' Called to implement evaluation of "self[key]". For ' - 'sequence types,\n' - ' the accepted keys should be integers and slice ' - 'objects. Note that\n' - ' the special interpretation of negative indexes (if the ' - 'class wishes\n' - ' to emulate a sequence type) is up to the ' - '"__getitem__()" method. If\n' - ' *key* is of an inappropriate type, "TypeError" may be ' - 'raised; if of\n' - ' a value outside the set of indexes for the sequence ' - '(after any\n' - ' special interpretation of negative values), ' - '"IndexError" should be\n' - ' raised. For mapping types, if *key* is missing (not in ' + '*sequence*\n' + ' types, the accepted keys should be integers and slice ' + 'objects.\n' + ' Note that the special interpretation of negative ' + 'indexes (if the\n' + ' class wishes to emulate a *sequence* type) is up to ' 'the\n' - ' container), "KeyError" should be raised.\n' + ' "__getitem__()" method. If *key* is of an inappropriate ' + 'type,\n' + ' "TypeError" may be raised; if of a value outside the ' + 'set of indexes\n' + ' for the sequence (after any special interpretation of ' + 'negative\n' + ' values), "IndexError" should be raised. 
For *mapping* ' + 'types, if\n' + ' *key* is missing (not in the container), "KeyError" ' + 'should be\n' + ' raised.\n' '\n' ' Note:\n' '\n' @@ -7793,6 +7802,15 @@ 'of the\n' ' sequence.\n' '\n' + ' Note:\n' + '\n' + ' When subscripting a *class*, the special class ' + 'method\n' + ' "__class_getitem__()" may be called instead of ' + '"__getitem__()".\n' + ' See __class_getitem__ versus __getitem__ for more ' + 'details.\n' + '\n' 'object.__setitem__(self, key, value)\n' '\n' ' Called to implement assignment to "self[key]". Same ' @@ -8891,7 +8909,7 @@ '"super(B,\n' ' obj).m()" searches "obj.__class__.__mro__" for the base ' 'class "A"\n' - ' immediately preceding "B" and then invokes the descriptor ' + ' immediately following "B" and then invokes the descriptor ' 'with the\n' ' call: "A.__dict__[\'m\'].__get__(obj, obj.__class__)".\n' '\n' @@ -8921,13 +8939,14 @@ 'be\n' 'overridden by instances.\n' '\n' - 'Python methods (including "staticmethod()" and ' - '"classmethod()") are\n' - 'implemented as non-data descriptors. Accordingly, instances ' - 'can\n' - 'redefine and override methods. This allows individual ' - 'instances to\n' - 'acquire behaviors that differ from other instances of the ' + 'Python methods (including those decorated with ' + '"@staticmethod" and\n' + '"@classmethod") are implemented as non-data descriptors. ' + 'Accordingly,\n' + 'instances can redefine and override methods. This allows ' + 'individual\n' + 'instances to acquire behaviors that differ from other ' + 'instances of the\n' 'same class.\n' '\n' 'The "property()" function is implemented as a data ' @@ -8941,12 +8960,12 @@ '\n' '*__slots__* allow us to explicitly declare data members ' '(like\n' - 'properties) and deny the creation of *__dict__* and ' + 'properties) and deny the creation of "__dict__" and ' '*__weakref__*\n' '(unless explicitly declared in *__slots__* or available in a ' 'parent.)\n' '\n' - 'The space saved over using *__dict__* can be significant. ' + 'The space saved over using "__dict__" can be significant. ' 'Attribute\n' 'lookup speed can be significantly improved as well.\n' '\n' @@ -8958,7 +8977,7 @@ '*__slots__*\n' ' reserves space for the declared variables and prevents ' 'the\n' - ' automatic creation of *__dict__* and *__weakref__* for ' + ' automatic creation of "__dict__" and *__weakref__* for ' 'each\n' ' instance.\n' '\n' @@ -8967,11 +8986,11 @@ '~~~~~~~~~~~~~~~~~~~~~~~~~~\n' '\n' '* When inheriting from a class without *__slots__*, the ' - '*__dict__* and\n' + '"__dict__" and\n' ' *__weakref__* attribute of the instances will always be ' 'accessible.\n' '\n' - '* Without a *__dict__* variable, instances cannot be ' + '* Without a "__dict__" variable, instances cannot be ' 'assigned new\n' ' variables not listed in the *__slots__* definition. ' 'Attempts to\n' @@ -8984,28 +9003,28 @@ '\n' '* Without a *__weakref__* variable for each instance, ' 'classes defining\n' - ' *__slots__* do not support weak references to its ' - 'instances. If weak\n' - ' reference support is needed, then add "\'__weakref__\'" to ' - 'the\n' + ' *__slots__* do not support "weak references" to its ' + 'instances. If\n' + ' weak reference support is needed, then add ' + '"\'__weakref__\'" to the\n' ' sequence of strings in the *__slots__* declaration.\n' '\n' '* *__slots__* are implemented at the class level by ' 'creating\n' - ' descriptors (Implementing Descriptors) for each variable ' - 'name. 
As a\n' - ' result, class attributes cannot be used to set default ' - 'values for\n' - ' instance variables defined by *__slots__*; otherwise, the ' - 'class\n' - ' attribute would overwrite the descriptor assignment.\n' + ' descriptors for each variable name. As a result, class ' + 'attributes\n' + ' cannot be used to set default values for instance ' + 'variables defined\n' + ' by *__slots__*; otherwise, the class attribute would ' + 'overwrite the\n' + ' descriptor assignment.\n' '\n' '* The action of a *__slots__* declaration is not limited to ' 'the class\n' ' where it is defined. *__slots__* declared in parents are ' 'available\n' ' in child classes. However, child subclasses will get a ' - '*__dict__*\n' + '"__dict__"\n' ' and *__weakref__* unless they also define *__slots__* ' '(which should\n' ' only contain names of any *additional* slots).\n' @@ -9025,13 +9044,18 @@ ' ?variable-length? built-in types such as "int", "bytes" ' 'and "tuple".\n' '\n' - '* Any non-string iterable may be assigned to *__slots__*. ' - 'Mappings may\n' - ' also be used; however, in the future, special meaning may ' - 'be\n' - ' assigned to the values corresponding to each key.\n' + '* Any non-string *iterable* may be assigned to *__slots__*.\n' + '\n' + '* If a "dictionary" is used to assign *__slots__*, the ' + 'dictionary keys\n' + ' will be used as the slot names. The values of the ' + 'dictionary can be\n' + ' used to provide per-attribute docstrings that will be ' + 'recognised by\n' + ' "inspect.getdoc()" and displayed in the output of ' + '"help()".\n' '\n' - '* *__class__* assignment works only if both classes have the ' + '* "__class__" assignment works only if both classes have the ' 'same\n' ' *__slots__*.\n' '\n' @@ -9043,9 +9067,9 @@ 'violations\n' ' raise "TypeError".\n' '\n' - '* If an iterator is used for *__slots__* then a descriptor ' - 'is created\n' - ' for each of the iterator?s values. However, the ' + '* If an *iterator* is used for *__slots__* then a ' + '*descriptor* is\n' + ' created for each of the iterator?s values. However, the ' '*__slots__*\n' ' attribute will be an empty iterator.\n' '\n' @@ -9054,7 +9078,7 @@ '==========================\n' '\n' 'Whenever a class inherits from another class, ' - '*__init_subclass__* is\n' + '"__init_subclass__()" is\n' 'called on that class. This way, it is possible to write ' 'classes which\n' 'change the behavior of subclasses. This is closely related ' @@ -9222,10 +9246,10 @@ 'come from\n' 'the class definition). The "__prepare__" method should be ' 'implemented\n' - 'as a "classmethod()". The namespace returned by ' - '"__prepare__" is\n' - 'passed in to "__new__", but when the final class object is ' - 'created the\n' + 'as a "classmethod". The namespace returned by "__prepare__" ' + 'is passed\n' + 'in to "__new__", but when the final class object is created ' + 'the\n' 'namespace is copied into a new "dict".\n' '\n' 'If the metaclass has no "__prepare__" attribute, then the ' @@ -9413,9 +9437,33 @@ 'Emulating generic types\n' '=======================\n' '\n' - 'One can implement the generic class syntax as specified by ' - '**PEP 484**\n' - '(for example "List[int]") by defining a special method:\n' + 'When using *type annotations*, it is often useful to ' + '*parameterize* a\n' + '*generic type* using Python?s square-brackets notation. 
For ' + 'example,\n' + 'the annotation "list[int]" might be used to signify a "list" ' + 'in which\n' + 'all the elements are of type "int".\n' + '\n' + 'See also:\n' + '\n' + ' **PEP 484** - Type Hints\n' + ' Introducing Python?s framework for type annotations\n' + '\n' + ' Generic Alias Types\n' + ' Documentation for objects representing parameterized ' + 'generic\n' + ' classes\n' + '\n' + ' Generics, user-defined generics and "typing.Generic"\n' + ' Documentation on how to implement generic classes that ' + 'can be\n' + ' parameterized at runtime and understood by static ' + 'type-checkers.\n' + '\n' + 'A class can *generally* only be parameterized if it defines ' + 'the\n' + 'special class method "__class_getitem__()".\n' '\n' 'classmethod object.__class_getitem__(cls, key)\n' '\n' @@ -9423,18 +9471,144 @@ 'generic class\n' ' by type arguments found in *key*.\n' '\n' - 'This method is looked up on the class object itself, and ' - 'when defined\n' - 'in the class body, this method is implicitly a class ' - 'method. Note,\n' - 'this mechanism is primarily reserved for use with static ' - 'type hints,\n' - 'other usage is discouraged.\n' + ' When defined on a class, "__class_getitem__()" is ' + 'automatically a\n' + ' class method. As such, there is no need for it to be ' + 'decorated with\n' + ' "@classmethod" when it is defined.\n' + '\n' + '\n' + 'The purpose of *__class_getitem__*\n' + '----------------------------------\n' + '\n' + 'The purpose of "__class_getitem__()" is to allow runtime\n' + 'parameterization of standard-library generic classes in ' + 'order to more\n' + 'easily apply *type hints* to these classes.\n' + '\n' + 'To implement custom generic classes that can be ' + 'parameterized at\n' + 'runtime and understood by static type-checkers, users should ' + 'either\n' + 'inherit from a standard library class that already ' + 'implements\n' + '"__class_getitem__()", or inherit from "typing.Generic", ' + 'which has its\n' + 'own implementation of "__class_getitem__()".\n' + '\n' + 'Custom implementations of "__class_getitem__()" on classes ' + 'defined\n' + 'outside of the standard library may not be understood by ' + 'third-party\n' + 'type-checkers such as mypy. 
Using "__class_getitem__()" on ' + 'any class\n' + 'for purposes other than type hinting is discouraged.\n' + '\n' + '\n' + '*__class_getitem__* versus *__getitem__*\n' + '----------------------------------------\n' + '\n' + 'Usually, the subscription of an object using square brackets ' + 'will call\n' + 'the "__getitem__()" instance method defined on the object?s ' + 'class.\n' + 'However, if the object being subscribed is itself a class, ' + 'the class\n' + 'method "__class_getitem__()" may be called instead.\n' + '"__class_getitem__()" should return a GenericAlias object if ' + 'it is\n' + 'properly defined.\n' + '\n' + 'Presented with the *expression* "obj[x]", the Python ' + 'interpreter\n' + 'follows something like the following process to decide ' + 'whether\n' + '"__getitem__()" or "__class_getitem__()" should be called:\n' + '\n' + ' from inspect import isclass\n' + '\n' + ' def subscribe(obj, x):\n' + ' """Return the result of the expression `obj[x]`"""\n' + '\n' + ' class_of_obj = type(obj)\n' + '\n' + ' # If the class of obj defines __getitem__,\n' + ' # call class_of_obj.__getitem__(obj, x)\n' + " if hasattr(class_of_obj, '__getitem__'):\n" + ' return class_of_obj.__getitem__(obj, x)\n' + '\n' + ' # Else, if obj is a class and defines ' + '__class_getitem__,\n' + ' # call obj.__class_getitem__(x)\n' + ' elif isclass(obj) and hasattr(obj, ' + "'__class_getitem__'):\n" + ' return obj.__class_getitem__(x)\n' + '\n' + ' # Else, raise an exception\n' + ' else:\n' + ' raise TypeError(\n' + ' f"\'{class_of_obj.__name__}\' object is not ' + 'subscriptable"\n' + ' )\n' + '\n' + 'In Python, all classes are themselves instances of other ' + 'classes. The\n' + 'class of a class is known as that class?s *metaclass*, and ' + 'most\n' + 'classes have the "type" class as their metaclass. "type" ' + 'does not\n' + 'define "__getitem__()", meaning that expressions such as ' + '"list[int]",\n' + '"dict[str, float]" and "tuple[str, bytes]" all result in\n' + '"__class_getitem__()" being called:\n' + '\n' + ' >>> # list has class "type" as its metaclass, like most ' + 'classes:\n' + ' >>> type(list)\n' + " \n" + ' >>> type(dict) == type(list) == type(tuple) == type(str) ' + '== type(bytes)\n' + ' True\n' + ' >>> # "list[int]" calls "list.__class_getitem__(int)"\n' + ' >>> list[int]\n' + ' list[int]\n' + ' >>> # list.__class_getitem__ returns a GenericAlias ' + 'object:\n' + ' >>> type(list[int])\n' + " \n" + '\n' + 'However, if a class has a custom metaclass that defines\n' + '"__getitem__()", subscribing the class may result in ' + 'different\n' + 'behaviour. An example of this can be found in the "enum" ' + 'module:\n' + '\n' + ' >>> from enum import Enum\n' + ' >>> class Menu(Enum):\n' + ' ... """A breakfast menu"""\n' + " ... SPAM = 'spam'\n" + " ... 
BACON = 'bacon'\n" + ' ...\n' + ' >>> # Enum classes have a custom metaclass:\n' + ' >>> type(Menu)\n' + " \n" + ' >>> # EnumMeta defines __getitem__,\n' + ' >>> # so __class_getitem__ is not called,\n' + ' >>> # and the result is not a GenericAlias object:\n' + " >>> Menu['SPAM']\n" + " \n" + " >>> type(Menu['SPAM'])\n" + " \n" '\n' 'See also:\n' '\n' - ' **PEP 560** - Core support for typing module and generic ' + ' **PEP 560** - Core Support for typing module and generic ' 'types\n' + ' Introducing "__class_getitem__()", and outlining when ' + 'a\n' + ' subscription results in "__class_getitem__()" being ' + 'called\n' + ' instead of "__getitem__()"\n' '\n' '\n' 'Emulating callable objects\n' @@ -9453,60 +9627,60 @@ '\n' 'The following methods can be defined to implement container ' 'objects.\n' - 'Containers usually are sequences (such as lists or tuples) ' - 'or mappings\n' - '(like dictionaries), but can represent other containers as ' - 'well. The\n' - 'first set of methods is used either to emulate a sequence or ' - 'to\n' - 'emulate a mapping; the difference is that for a sequence, ' - 'the\n' - 'allowable keys should be the integers *k* for which "0 <= k ' - '< N" where\n' - '*N* is the length of the sequence, or slice objects, which ' - 'define a\n' - 'range of items. It is also recommended that mappings ' - 'provide the\n' - 'methods "keys()", "values()", "items()", "get()", ' - '"clear()",\n' - '"setdefault()", "pop()", "popitem()", "copy()", and ' - '"update()"\n' - 'behaving similar to those for Python?s standard dictionary ' + 'Containers usually are *sequences* (such as "lists" or ' + '"tuples") or\n' + '*mappings* (like "dictionaries"), but can represent other ' + 'containers\n' + 'as well. The first set of methods is used either to emulate ' + 'a\n' + 'sequence or to emulate a mapping; the difference is that for ' + 'a\n' + 'sequence, the allowable keys should be the integers *k* for ' + 'which "0\n' + '<= k < N" where *N* is the length of the sequence, or ' + '"slice" objects,\n' + 'which define a range of items. It is also recommended that ' + 'mappings\n' + 'provide the methods "keys()", "values()", "items()", ' + '"get()",\n' + '"clear()", "setdefault()", "pop()", "popitem()", "copy()", ' + 'and\n' + '"update()" behaving similar to those for Python?s standard\n' + '"dictionary" objects. The "collections.abc" module provides ' + 'a\n' + '"MutableMapping" *abstract base class* to help create those ' + 'methods\n' + 'from a base set of "__getitem__()", "__setitem__()", ' + '"__delitem__()",\n' + 'and "keys()". Mutable sequences should provide methods ' + '"append()",\n' + '"count()", "index()", "extend()", "insert()", "pop()", ' + '"remove()",\n' + '"reverse()" and "sort()", like Python standard "list" ' 'objects.\n' - 'The "collections.abc" module provides a "MutableMapping" ' - 'abstract base\n' - 'class to help create those methods from a base set of ' - '"__getitem__()",\n' - '"__setitem__()", "__delitem__()", and "keys()". Mutable ' - 'sequences\n' - 'should provide methods "append()", "count()", "index()", ' - '"extend()",\n' - '"insert()", "pop()", "remove()", "reverse()" and "sort()", ' - 'like Python\n' - 'standard list objects. Finally, sequence types should ' - 'implement\n' - 'addition (meaning concatenation) and multiplication ' - '(meaning\n' - 'repetition) by defining the methods "__add__()", ' - '"__radd__()",\n' - '"__iadd__()", "__mul__()", "__rmul__()" and "__imul__()" ' - 'described\n' - 'below; they should not define other numerical operators. 
It ' - 'is\n' - 'recommended that both mappings and sequences implement the\n' - '"__contains__()" method to allow efficient use of the "in" ' - 'operator;\n' - 'for mappings, "in" should search the mapping?s keys; for ' - 'sequences, it\n' - 'should search through the values. It is further recommended ' - 'that both\n' - 'mappings and sequences implement the "__iter__()" method to ' - 'allow\n' - 'efficient iteration through the container; for mappings, ' - '"__iter__()"\n' - 'should iterate through the object?s keys; for sequences, it ' - 'should\n' - 'iterate through the values.\n' + 'Finally, sequence types should implement addition (meaning\n' + 'concatenation) and multiplication (meaning repetition) by ' + 'defining the\n' + 'methods "__add__()", "__radd__()", "__iadd__()", ' + '"__mul__()",\n' + '"__rmul__()" and "__imul__()" described below; they should ' + 'not define\n' + 'other numerical operators. It is recommended that both ' + 'mappings and\n' + 'sequences implement the "__contains__()" method to allow ' + 'efficient use\n' + 'of the "in" operator; for mappings, "in" should search the ' + 'mapping?s\n' + 'keys; for sequences, it should search through the values. ' + 'It is\n' + 'further recommended that both mappings and sequences ' + 'implement the\n' + '"__iter__()" method to allow efficient iteration through ' + 'the\n' + 'container; for mappings, "__iter__()" should iterate through ' + 'the\n' + 'object?s keys; for sequences, it should iterate through the ' + 'values.\n' '\n' 'object.__len__(self)\n' '\n' @@ -9564,22 +9738,23 @@ 'object.__getitem__(self, key)\n' '\n' ' Called to implement evaluation of "self[key]". For ' - 'sequence types,\n' - ' the accepted keys should be integers and slice objects. ' - 'Note that\n' - ' the special interpretation of negative indexes (if the ' - 'class wishes\n' - ' to emulate a sequence type) is up to the "__getitem__()" ' - 'method. If\n' - ' *key* is of an inappropriate type, "TypeError" may be ' - 'raised; if of\n' - ' a value outside the set of indexes for the sequence ' - '(after any\n' - ' special interpretation of negative values), "IndexError" ' + '*sequence*\n' + ' types, the accepted keys should be integers and slice ' + 'objects.\n' + ' Note that the special interpretation of negative indexes ' + '(if the\n' + ' class wishes to emulate a *sequence* type) is up to the\n' + ' "__getitem__()" method. If *key* is of an inappropriate ' + 'type,\n' + ' "TypeError" may be raised; if of a value outside the set ' + 'of indexes\n' + ' for the sequence (after any special interpretation of ' + 'negative\n' + ' values), "IndexError" should be raised. For *mapping* ' + 'types, if\n' + ' *key* is missing (not in the container), "KeyError" ' 'should be\n' - ' raised. For mapping types, if *key* is missing (not in ' - 'the\n' - ' container), "KeyError" should be raised.\n' + ' raised.\n' '\n' ' Note:\n' '\n' @@ -9589,6 +9764,14 @@ 'the\n' ' sequence.\n' '\n' + ' Note:\n' + '\n' + ' When subscripting a *class*, the special class method\n' + ' "__class_getitem__()" may be called instead of ' + '"__getitem__()".\n' + ' See __class_getitem__ versus __getitem__ for more ' + 'details.\n' + '\n' 'object.__setitem__(self, key, value)\n' '\n' ' Called to implement assignment to "self[key]". 
Same note ' @@ -10376,9 +10559,9 @@ ' >>> from keyword import iskeyword\n' '\n' " >>> 'hello'.isidentifier(), iskeyword('hello')\n" - ' True, False\n' + ' (True, False)\n' " >>> 'def'.isidentifier(), iskeyword('def')\n" - ' True, True\n' + ' (True, True)\n' '\n' 'str.islower()\n' '\n' @@ -10729,7 +10912,7 @@ " >>> ' 1 2 3 '.split()\n" " ['1', '2', '3']\n" '\n' - 'str.splitlines([keepends])\n' + 'str.splitlines(keepends=False)\n' '\n' ' Return a list of the lines in the string, breaking at ' 'line\n' @@ -12060,14 +12243,14 @@ 'for"\n' ' statement to execute the body of the function.\n' '\n' - ' Calling the asynchronous iterator?s "aiterator.__anext__()"\n' - ' method will return an *awaitable* which when awaited will\n' - ' execute until it provides a value using the "yield" ' - 'expression.\n' - ' When the function executes an empty "return" statement or ' - 'falls\n' - ' off the end, a "StopAsyncIteration" exception is raised and ' + ' Calling the asynchronous iterator?s "aiterator.__anext__" ' + 'method\n' + ' will return an *awaitable* which when awaited will execute ' + 'until\n' + ' it provides a value using the "yield" expression. When the\n' + ' function executes an empty "return" statement or falls off ' 'the\n' + ' end, a "StopAsyncIteration" exception is raised and the\n' ' asynchronous iterator will have reached the end of the set ' 'of\n' ' values to be yielded.\n' @@ -12587,9 +12770,9 @@ '"dict"\n' 'constructor.\n' '\n' - 'class dict(**kwarg)\n' - 'class dict(mapping, **kwarg)\n' - 'class dict(iterable, **kwarg)\n' + 'class dict(**kwargs)\n' + 'class dict(mapping, **kwargs)\n' + 'class dict(iterable, **kwargs)\n' '\n' ' Return a new dictionary initialized from an optional ' 'positional\n' @@ -13694,7 +13877,8 @@ '\n' ' The arguments to the range constructor must be integers ' '(either\n' - ' built-in "int" or any object that implements the "__index__"\n' + ' built-in "int" or any object that implements the ' + '"__index__()"\n' ' special method). If the *step* argument is omitted, it ' 'defaults to\n' ' "1". If the *start* argument is omitted, it defaults to "0". ' diff --git a/Misc/NEWS.d/3.9.10.rst b/Misc/NEWS.d/3.9.10.rst new file mode 100644 index 0000000000000..5e119eb8598a8 --- /dev/null +++ b/Misc/NEWS.d/3.9.10.rst @@ -0,0 +1,557 @@ +.. bpo: 46070 +.. date: 2022-01-13-17-58-56 +.. nonce: q8IGth +.. release date: 2022-01-13 +.. section: Core and Builtins + +:c:func:`Py_EndInterpreter` now explicitly untracks all objects currently +tracked by the GC. Previously, if an object was used later by another +interpreter, calling :c:func:`PyObject_GC_UnTrack` on the object crashed if +the previous or the next object of the :c:type:`PyGC_Head` structure became +a dangling pointer. Patch by Victor Stinner. + +.. + +.. bpo: 46085 +.. date: 2021-12-30-00-23-41 +.. nonce: bDuJqu +.. section: Core and Builtins + +Fix iterator cache mechanism of :class:`OrderedDict`. + +.. + +.. bpo: 46110 +.. date: 2021-12-18-02-37-07 +.. nonce: B6hAfu +.. section: Core and Builtins + +Add a maximum recursion check to the PEG parser to avoid stack overflow. +Patch by Pablo Galindo + +.. + +.. bpo: 46000 +.. date: 2021-12-07-11-42-44 +.. nonce: v_ru3k +.. section: Core and Builtins + +Improve compatibility of the :mod:`curses` module with NetBSD curses. + +.. + +.. bpo: 45614 +.. date: 2021-11-23-12-06-41 +.. nonce: fIekgI +.. section: Core and Builtins + +Fix :mod:`traceback` display for exceptions with invalid module name. + +.. + +.. bpo: 45806 +.. date: 2021-11-19-19-21-48 +.. nonce: DflDMe +.. 
section: Core and Builtins + +Re-introduced fix that allows recovery from stack overflow without crashing +the interpreter. The original fix as part of :issue:`42500` was reverted +(see release notes for Python 3.9.4) since it introduced an ABI change in a +bugfix release which is not allowed. The new fix doesn't introduce any ABI +changes. Patch by Mark Shannon. + +.. + +.. bpo: 45822 +.. date: 2021-11-16-19-41-04 +.. nonce: OT6ueS +.. section: Core and Builtins + +Fixed a bug in the parser that was causing it to not respect :pep:`263` +coding cookies when no flags are provided. Patch by Pablo Galindo + +.. + +.. bpo: 45820 +.. date: 2021-11-16-19-00-27 +.. nonce: 2X6Psr +.. section: Core and Builtins + +Fix a segfault when the parser fails without reading any input. Patch by +Pablo Galindo + +.. + +.. bpo: 42540 +.. date: 2021-11-15-12-08-27 +.. nonce: V2w107 +.. section: Core and Builtins + +Fix crash when :func:`os.fork` is called with an active non-default memory +allocator. + +.. + +.. bpo: 40479 +.. date: 2022-01-07-15-20-19 +.. nonce: EKfr3F +.. section: Library + +Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL +3.0.0 in FIPS mode. + +.. + +.. bpo: 46070 +.. date: 2022-01-07-13-51-22 +.. nonce: -axLUW +.. section: Library + +Fix possible segfault when importing the :mod:`asyncio` module from +different sub-interpreters in parallel. Patch by Erlend E. Aasland. + +.. + +.. bpo: 46278 +.. date: 2022-01-06-13-38-00 +.. nonce: wILA80 +.. section: Library + +Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop +implementations already support it. + +.. + +.. bpo: 46239 +.. date: 2022-01-03-12-59-20 +.. nonce: ySVSEy +.. section: Library + +Improve error message when importing :mod:`asyncio.windows_events` on +non-Windows. + +.. + +.. bpo: 20369 +.. date: 2021-12-17-12-06-40 +.. nonce: zzLuBz +.. section: Library + +:func:`concurrent.futures.wait` no longer blocks forever when given +duplicate Futures. Patch by Kumar Aditya. + +.. + +.. bpo: 46105 +.. date: 2021-12-16-14-30-36 +.. nonce: pprB1K +.. section: Library + +Honor spec when generating requirement specs with urls and extras +(importlib_metadata 4.8.3). + +.. + +.. bpo: 26952 +.. date: 2021-12-14-13-18-45 +.. nonce: hjhISq +.. section: Library + +:mod:`argparse` raises :exc:`ValueError` with clear message when trying to +render usage for an empty mutually-exclusive group. Previously it raised a +cryptic :exc:`IndexError`. + +.. + +.. bpo: 27718 +.. date: 2021-12-11-22-51-30 +.. nonce: MgQiGl +.. section: Library + +Fix help for the :mod:`signal` module. Some functions (e.g. ``signal()`` and +``getsignal()``) were omitted. + +.. + +.. bpo: 46032 +.. date: 2021-12-11-15-45-07 +.. nonce: HmciLT +.. section: Library + +The ``registry()`` method of :func:`functools.singledispatch` functions +checks now the first argument or the first parameter annotation and raises a +TypeError if it is not supported. Previously unsupported "types" were +ignored (e.g. ``typing.List[int]``) or caused an error at calling time (e.g. +``list[int]``). + +.. + +.. bpo: 46018 +.. date: 2021-12-09-00-44-42 +.. nonce: hkTI7v +.. section: Library + +Ensure that :func:`math.expm1` does not raise on underflow. + +.. + +.. bpo: 27946 +.. date: 2021-12-04-20-08-42 +.. nonce: -Vuarf +.. section: Library + +Fix possible crash when getting an attribute of +class:`xml.etree.ElementTree.Element` simultaneously with replacing the +``attrib`` dict. + +.. + +.. bpo: 13236 +.. date: 2021-11-30-13-52-02 +.. nonce: FmJIkO +.. 
section: Library + +:class:`unittest.TextTestResult` and :class:`unittest.TextTestRunner` flush +now the output stream more often. + +.. + +.. bpo: 37658 +.. date: 2021-11-28-15-30-34 +.. nonce: 8Hno7d +.. section: Library + +Fix issue when on certain conditions ``asyncio.wait_for()`` may allow a +coroutine to complete successfully, but fail to return the result, +potentially causing memory leaks or other issues. + +.. + +.. bpo: 45831 +.. date: 2021-11-17-19-25-37 +.. nonce: 9-TojK +.. section: Library + +:mod:`faulthandler` can now write ASCII-only strings (like filenames and +function names) with a single write() syscall when dumping a traceback. It +reduces the risk of getting an unreadable dump when two threads or two +processes dump a traceback to the same file (like stderr) at the same time. +Patch by Victor Stinner. + +.. + +.. bpo: 41735 +.. date: 2021-11-17-11-38-30 +.. nonce: 2feh9v +.. section: Library + +Fix thread lock in ``zlib.Decompress.flush()`` method before +``PyObject_GetBuffer``. + +.. + +.. bpo: 45664 +.. date: 2021-10-28-23-40-54 +.. nonce: 7dqtxQ +.. section: Library + +Fix :func:`types.resolve_bases` and :func:`types.new_class` for +:class:`types.GenericAlias` instance as a base. + +.. + +.. bpo: 45663 +.. date: 2021-10-28-23-11-59 +.. nonce: J90N5R +.. section: Library + +Fix :func:`dataclasses.is_dataclass` for dataclasses which are subclasses of +:class:`types.GenericAlias`. + +.. + +.. bpo: 45662 +.. date: 2021-10-28-22-58-14 +.. nonce: sJd7Ir +.. section: Library + +Fix the repr of :data:`dataclasses.InitVar` with a type alias to the +built-in class, e.g. ``InitVar[list[int]]``. + +.. + +.. bpo: 43498 +.. date: 2021-04-20-14-14-16 +.. nonce: L_Hq-8 +.. section: Library + +Avoid a possible *"RuntimeError: dictionary changed size during iteration"* +when adjusting the process count of :class:`ProcessPoolExecutor`. + +.. + +.. bpo: 29620 +.. date: 2018-08-21-16-20-33 +.. nonce: xxx666 +.. section: Library + +:func:`~unittest.TestCase.assertWarns` no longer raises a +``RuntimeException`` when accessing a module's ``__warningregistry__`` +causes importation of a new module, or when a new module is imported in +another thread. Patch by Kernc. + +.. + +.. bpo: 19737 +.. date: 2021-11-28-22-43-21 +.. nonce: cOOubB +.. section: Documentation + +Update the documentation for the :func:`globals` function. + +.. + +.. bpo: 45840 +.. date: 2021-11-19-02-02-32 +.. nonce: A51B2S +.. section: Documentation + +Improve cross-references in the documentation for the data model. + +.. + +.. bpo: 45788 +.. date: 2021-11-18-00-07-40 +.. nonce: qibUoB +.. section: Documentation + +Link doc for sys.prefix to sysconfig doc on installation paths. + +.. + +.. bpo: 25381 +.. date: 2021-06-21-17-51-51 +.. nonce: 7Kn-_H +.. section: Documentation + +In the extending chapter of the extending doc, update a paragraph about the +global variables containing exception information. + +.. + +.. bpo: 43905 +.. date: 2021-05-24-05-00-12 +.. nonce: tBIndE +.. section: Documentation + +Expanded :func:`~dataclasses.astuple` and :func:`~dataclasses.asdict` docs, +warning about deepcopy being applied and providing a workaround. + +.. + +.. bpo: 41028 +.. date: 2020-06-18-23-37-03 +.. nonce: vM8bC8 +.. section: Documentation + +Language and version switchers, previously maintained in every cpython +branches, are now handled by docsbuild-script. + +.. + +.. bpo: 46205 +.. date: 2022-01-07-14-06-12 +.. nonce: dnc2OC +.. section: Tests + +Fix hang in runtest_mp due to race condition + +.. + +.. bpo: 46263 +.. 
date: 2022-01-06-15-45-34 +.. nonce: bJXek6 +.. section: Tests + +Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory +with junk byte. + +.. + +.. bpo: 46150 +.. date: 2021-12-23-13-42-15 +.. nonce: RhtADs +.. section: Tests + +Now ``fakename`` in ``test_pathlib.PosixPathTest.test_expanduser`` is +checked to be non-existent. + +.. + +.. bpo: 46129 +.. date: 2021-12-19-12-20-57 +.. nonce: I3MunH +.. section: Tests + +Rewrite ``asyncio.locks`` tests with +:class:`unittest.IsolatedAsyncioTestCase` usage. + +.. + +.. bpo: 23819 +.. date: 2021-12-19-08-44-32 +.. nonce: 9ueiII +.. section: Tests + +Fixed :mod:`asyncio` tests in python optimized mode. Patch by Kumar Aditya. + +.. + +.. bpo: 46114 +.. date: 2021-12-17-14-46-19 +.. nonce: 9iyZ_9 +.. section: Tests + +Fix test case for OpenSSL 3.0.1 version. OpenSSL 3.0 uses ``0xMNN00PP0L``. + +.. + +.. bpo: 19460 +.. date: 2021-11-28-15-25-02 +.. nonce: lr0aWs +.. section: Tests + +Add new Test for :class:`email.mime.nonmultipart.MIMENonMultipart`. + +.. + +.. bpo: 45835 +.. date: 2021-11-17-14-28-08 +.. nonce: Mgyhjx +.. section: Tests + +Fix race condition in test_queue tests with multiple "feeder" threads. + +.. + +.. bpo: 46263 +.. date: 2022-01-05-02-58-10 +.. nonce: xiv8NU +.. section: Build + +``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. + +.. + +.. bpo: 46106 +.. date: 2021-12-20-07-10-41 +.. nonce: 5qcv3L +.. section: Build + +Updated OpenSSL to 1.1.1m in Windows builds, macOS installer builds, and CI. +Patch by Kumar Aditya. + +.. + +.. bpo: 44035 +.. date: 2021-12-06-09-31-27 +.. nonce: BiO4XC +.. section: Build + +CI now verifies that autoconf files have been regenerated with a current and +unpatched autoconf package. + +.. + +.. bpo: 33393 +.. date: 2021-11-25-20-26-06 +.. nonce: 24YNtM +.. section: Build + +Update ``config.guess`` to 2021-06-03 and ``config.sub`` to 2021-08-14. +``Makefile`` now has an ``update-config`` target to make updating more +convenient. + +.. + +.. bpo: 45866 +.. date: 2021-11-25-13-53-36 +.. nonce: ZH1W8N +.. section: Build + +``make regen-all`` now produces the same output when run from a directory +other than the source tree: when building Python out of the source tree. +pegen now strips directory of the "generated by pygen from " +header Patch by Victor Stinner. + +.. + +.. bpo: 41498 +.. date: 2021-11-25-09-15-04 +.. nonce: qAk5eo +.. section: Build + +Python now compiles on platforms without ``sigset_t``. Several functions in +:mod:`signal` are not available when ``sigset_t`` is missing. + +Based on patch by Roman Yurchak for pyodide. + +.. + +.. bpo: 45881 +.. date: 2021-11-24-17-14-06 +.. nonce: GTXXLk +.. section: Build + +``setup.py`` now uses ``CC`` from environment first to discover multiarch +and cross compile paths. + +.. + +.. bpo: 45901 +.. date: 2021-11-26-18-17-41 +.. nonce: c5IBqM +.. section: Windows + +When installed through the Microsoft Store and set as the default app for +:file:`*.py` files, command line arguments will now be passed to Python when +invoking a script without explicitly launching Python (that is, ``script.py +args`` rather than ``python script.py args``). + +.. + +.. bpo: 40477 +.. date: 2022-01-02-21-56-53 +.. nonce: W3nnM6 +.. section: macOS + +The Python Launcher app for macOS now properly launches scripts and, if +necessary, the Terminal app when running on recent macOS releases. + +.. + +.. bpo: 45732 +.. date: 2021-12-05-23-52-03 +.. nonce: -BWrnh +.. 
section: macOS + +Update python.org macOS installer to use Tcl/Tk 8.6.12. + +.. + +.. bpo: 45838 +.. date: 2021-11-18-11-20-21 +.. nonce: TH6mwc +.. section: Tools/Demos + +Fix line number calculation when debugging Python with GDB. + +.. + +.. bpo: 39026 +.. date: 2021-11-09-15-42-11 +.. nonce: sUnYWn +.. section: C API + +Fix Python.h to build C extensions with Xcode: remove a relative include +from ``Include/cpython/pystate.h``. diff --git a/Misc/NEWS.d/next/Build/2021-11-24-17-14-06.bpo-45881.GTXXLk.rst b/Misc/NEWS.d/next/Build/2021-11-24-17-14-06.bpo-45881.GTXXLk.rst deleted file mode 100644 index b697658cf3aaa..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-11-24-17-14-06.bpo-45881.GTXXLk.rst +++ /dev/null @@ -1,2 +0,0 @@ -``setup.py`` now uses ``CC`` from environment first to discover multiarch -and cross compile paths. diff --git a/Misc/NEWS.d/next/Build/2021-11-25-09-15-04.bpo-41498.qAk5eo.rst b/Misc/NEWS.d/next/Build/2021-11-25-09-15-04.bpo-41498.qAk5eo.rst deleted file mode 100644 index 18dc290dafd02..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-11-25-09-15-04.bpo-41498.qAk5eo.rst +++ /dev/null @@ -1,4 +0,0 @@ -Python now compiles on platforms without ``sigset_t``. Several functions -in :mod:`signal` are not available when ``sigset_t`` is missing. - -Based on patch by Roman Yurchak for pyodide. diff --git a/Misc/NEWS.d/next/Build/2021-11-25-13-53-36.bpo-45866.ZH1W8N.rst b/Misc/NEWS.d/next/Build/2021-11-25-13-53-36.bpo-45866.ZH1W8N.rst deleted file mode 100644 index e87b93932ffa1..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-11-25-13-53-36.bpo-45866.ZH1W8N.rst +++ /dev/null @@ -1,4 +0,0 @@ -``make regen-all`` now produces the same output when run from a directory -other than the source tree: when building Python out of the source tree. -pegen now strips directory of the "generated by pygen from " header -Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Build/2021-11-25-20-26-06.bpo-33393.24YNtM.rst b/Misc/NEWS.d/next/Build/2021-11-25-20-26-06.bpo-33393.24YNtM.rst deleted file mode 100644 index c27869c9b6ded..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-11-25-20-26-06.bpo-33393.24YNtM.rst +++ /dev/null @@ -1,3 +0,0 @@ -Update ``config.guess`` to 2021-06-03 and ``config.sub`` to 2021-08-14. -``Makefile`` now has an ``update-config`` target to make updating more -convenient. diff --git a/Misc/NEWS.d/next/Build/2021-12-06-09-31-27.bpo-44035.BiO4XC.rst b/Misc/NEWS.d/next/Build/2021-12-06-09-31-27.bpo-44035.BiO4XC.rst deleted file mode 100644 index 7530587b73d14..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-06-09-31-27.bpo-44035.BiO4XC.rst +++ /dev/null @@ -1,2 +0,0 @@ -CI now verifies that autoconf files have been regenerated with a current and -unpatched autoconf package. diff --git a/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst b/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst deleted file mode 100644 index d3e25f77c7336..0000000000000 --- a/Misc/NEWS.d/next/Build/2021-12-20-07-10-41.bpo-46106.5qcv3L.rst +++ /dev/null @@ -1,2 +0,0 @@ -Updated OpenSSL to 1.1.1m in Windows builds, macOS installer builds, and CI. -Patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst b/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst deleted file mode 100644 index 3a575ed7f556b..0000000000000 --- a/Misc/NEWS.d/next/Build/2022-01-05-02-58-10.bpo-46263.xiv8NU.rst +++ /dev/null @@ -1 +0,0 @@ -``configure`` no longer sets ``MULTIARCH`` on FreeBSD platforms. 
diff --git a/Misc/NEWS.d/next/C API/2021-11-09-15-42-11.bpo-39026.sUnYWn.rst b/Misc/NEWS.d/next/C API/2021-11-09-15-42-11.bpo-39026.sUnYWn.rst deleted file mode 100644 index 77a0119792152..0000000000000 --- a/Misc/NEWS.d/next/C API/2021-11-09-15-42-11.bpo-39026.sUnYWn.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix Python.h to build C extensions with Xcode: remove a relative include -from ``Include/cpython/pystate.h``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-11-15-12-08-27.bpo-42540.V2w107.rst b/Misc/NEWS.d/next/Core and Builtins/2021-11-15-12-08-27.bpo-42540.V2w107.rst deleted file mode 100644 index 91160598bd3f4..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-11-15-12-08-27.bpo-42540.V2w107.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix crash when :func:`os.fork` is called with an active non-default -memory allocator. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-00-27.bpo-45820.2X6Psr.rst b/Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-00-27.bpo-45820.2X6Psr.rst deleted file mode 100644 index c2ec3d690cd4b..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-00-27.bpo-45820.2X6Psr.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix a segfault when the parser fails without reading any input. Patch by -Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-41-04.bpo-45822.OT6ueS.rst b/Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-41-04.bpo-45822.OT6ueS.rst deleted file mode 100644 index 1ac7a8becee40..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-11-16-19-41-04.bpo-45822.OT6ueS.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fixed a bug in the parser that was causing it to not respect :pep:`263` -coding cookies when no flags are provided. Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-11-19-19-21-48.bpo-45806.DflDMe.rst b/Misc/NEWS.d/next/Core and Builtins/2021-11-19-19-21-48.bpo-45806.DflDMe.rst deleted file mode 100644 index f8c47cae4b0b9..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-11-19-19-21-48.bpo-45806.DflDMe.rst +++ /dev/null @@ -1,5 +0,0 @@ -Re-introduced fix that allows recovery from stack overflow without crashing -the interpreter. The original fix as part of :issue:`42500` was reverted -(see release notes for Python 3.9.4) since it introduced an ABI change in a -bugfix release which is not allowed. The new fix doesn't introduce any ABI -changes. Patch by Mark Shannon. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-11-23-12-06-41.bpo-45614.fIekgI.rst b/Misc/NEWS.d/next/Core and Builtins/2021-11-23-12-06-41.bpo-45614.fIekgI.rst deleted file mode 100644 index 4255e1885ad67..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-11-23-12-06-41.bpo-45614.fIekgI.rst +++ /dev/null @@ -1 +0,0 @@ -Fix :mod:`traceback` display for exceptions with invalid module name. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst deleted file mode 100644 index 68e4bfa9e77b1..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-07-11-42-44.bpo-46000.v_ru3k.rst +++ /dev/null @@ -1 +0,0 @@ -Improve compatibility of the :mod:`curses` module with NetBSD curses. 
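For readers unfamiliar with the coding cookies mentioned in the bpo-45822 entry above, a :pep:`263` declaration is a comment in the first or second line of a source file; a minimal illustration (not part of the commit)::

    # -*- coding: cp1252 -*-
    # The declaration above tells the tokenizer how to decode the rest of
    # this file; bpo-45822 ensures it is honoured even when the parser is
    # invoked without explicit compile flags.
    print("encoding declared via a PEP 263 coding cookie")
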
diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst deleted file mode 100644 index 593d2855972c4..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-18-02-37-07.bpo-46110.B6hAfu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add a maximum recursion check to the PEG parser to avoid stack overflow. -Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst deleted file mode 100644 index a2093f75c3b62..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2021-12-30-00-23-41.bpo-46085.bDuJqu.rst +++ /dev/null @@ -1 +0,0 @@ -Fix iterator cache mechanism of :class:`OrderedDict`. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst deleted file mode 100644 index 4ed088f9898eb..0000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2022-01-13-17-58-56.bpo-46070.q8IGth.rst +++ /dev/null @@ -1,5 +0,0 @@ -:c:func:`Py_EndInterpreter` now explicitly untracks all objects currently -tracked by the GC. Previously, if an object was used later by another -interpreter, calling :c:func:`PyObject_GC_UnTrack` on the object crashed if the -previous or the next object of the :c:type:`PyGC_Head` structure became a -dangling pointer. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst b/Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst deleted file mode 100644 index 5fc4155b55346..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2020-06-18-23-37-03.bpo-41028.vM8bC8.rst +++ /dev/null @@ -1,2 +0,0 @@ -Language and version switchers, previously maintained in every cpython -branches, are now handled by docsbuild-script. diff --git a/Misc/NEWS.d/next/Documentation/2021-05-24-05-00-12.bpo-43905.tBIndE.rst b/Misc/NEWS.d/next/Documentation/2021-05-24-05-00-12.bpo-43905.tBIndE.rst deleted file mode 100644 index 760e1eea0deb7..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-05-24-05-00-12.bpo-43905.tBIndE.rst +++ /dev/null @@ -1,2 +0,0 @@ -Expanded :func:`~dataclasses.astuple` and :func:`~dataclasses.asdict` docs, -warning about deepcopy being applied and providing a workaround. diff --git a/Misc/NEWS.d/next/Documentation/2021-06-21-17-51-51.bpo-25381.7Kn-_H.rst b/Misc/NEWS.d/next/Documentation/2021-06-21-17-51-51.bpo-25381.7Kn-_H.rst deleted file mode 100644 index f009f880e917d..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-06-21-17-51-51.bpo-25381.7Kn-_H.rst +++ /dev/null @@ -1,2 +0,0 @@ -In the extending chapter of the extending doc, update a paragraph about the -global variables containing exception information. diff --git a/Misc/NEWS.d/next/Documentation/2021-11-18-00-07-40.bpo-45788.qibUoB.rst b/Misc/NEWS.d/next/Documentation/2021-11-18-00-07-40.bpo-45788.qibUoB.rst deleted file mode 100644 index 8aa3293673e1f..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-11-18-00-07-40.bpo-45788.qibUoB.rst +++ /dev/null @@ -1 +0,0 @@ -Link doc for sys.prefix to sysconfig doc on installation paths. 
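As background for the bpo-43905 entry above, a minimal sketch (not part of the commit) of the deep-copy behaviour the expanded :func:`~dataclasses.asdict` docs warn about, together with a shallow workaround along the lines the docs suggest::

    from dataclasses import dataclass, asdict, fields

    @dataclass
    class Point:
        coords: list

    p = Point([1, 2])
    d = asdict(p)                  # recurses into fields and deep-copies them
    d["coords"].append(3)
    print(p.coords)                # [1, 2]; the original dataclass is untouched

    # Shallow alternative: reference the field values instead of copying them.
    shallow = {f.name: getattr(p, f.name) for f in fields(p)}
    print(shallow["coords"] is p.coords)   # True
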
diff --git a/Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst b/Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst deleted file mode 100644 index 87371e5b76bc1..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-11-19-02-02-32.bpo-45840.A51B2S.rst +++ /dev/null @@ -1 +0,0 @@ -Improve cross-references in the documentation for the data model. diff --git a/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst b/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst deleted file mode 100644 index a3e16c9fdd0e6..0000000000000 --- a/Misc/NEWS.d/next/Documentation/2021-11-28-22-43-21.bpo-19737.cOOubB.rst +++ /dev/null @@ -1 +0,0 @@ -Update the documentation for the :func:`globals` function. diff --git a/Misc/NEWS.d/next/Library/2018-08-21-16-20-33.bpo-29620.xxx666.rst b/Misc/NEWS.d/next/Library/2018-08-21-16-20-33.bpo-29620.xxx666.rst deleted file mode 100644 index d781919504e68..0000000000000 --- a/Misc/NEWS.d/next/Library/2018-08-21-16-20-33.bpo-29620.xxx666.rst +++ /dev/null @@ -1,3 +0,0 @@ -:func:`~unittest.TestCase.assertWarns` no longer raises a ``RuntimeException`` -when accessing a module's ``__warningregistry__`` causes importation of a new -module, or when a new module is imported in another thread. Patch by Kernc. diff --git a/Misc/NEWS.d/next/Library/2021-04-20-14-14-16.bpo-43498.L_Hq-8.rst b/Misc/NEWS.d/next/Library/2021-04-20-14-14-16.bpo-43498.L_Hq-8.rst deleted file mode 100644 index 4713d1427ccd3..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-04-20-14-14-16.bpo-43498.L_Hq-8.rst +++ /dev/null @@ -1,2 +0,0 @@ -Avoid a possible *"RuntimeError: dictionary changed size during iteration"* -when adjusting the process count of :class:`ProcessPoolExecutor`. diff --git a/Misc/NEWS.d/next/Library/2021-10-28-22-58-14.bpo-45662.sJd7Ir.rst b/Misc/NEWS.d/next/Library/2021-10-28-22-58-14.bpo-45662.sJd7Ir.rst deleted file mode 100644 index 050b443dd7cb2..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-10-28-22-58-14.bpo-45662.sJd7Ir.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix the repr of :data:`dataclasses.InitVar` with a type alias to the -built-in class, e.g. ``InitVar[list[int]]``. diff --git a/Misc/NEWS.d/next/Library/2021-10-28-23-11-59.bpo-45663.J90N5R.rst b/Misc/NEWS.d/next/Library/2021-10-28-23-11-59.bpo-45663.J90N5R.rst deleted file mode 100644 index f246f67cf80e5..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-10-28-23-11-59.bpo-45663.J90N5R.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix :func:`dataclasses.is_dataclass` for dataclasses which are subclasses of -:class:`types.GenericAlias`. diff --git a/Misc/NEWS.d/next/Library/2021-10-28-23-40-54.bpo-45664.7dqtxQ.rst b/Misc/NEWS.d/next/Library/2021-10-28-23-40-54.bpo-45664.7dqtxQ.rst deleted file mode 100644 index 573a569845a87..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-10-28-23-40-54.bpo-45664.7dqtxQ.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix :func:`types.resolve_bases` and :func:`types.new_class` for -:class:`types.GenericAlias` instance as a base. diff --git a/Misc/NEWS.d/next/Library/2021-11-17-11-38-30.bpo-41735.2feh9v.rst b/Misc/NEWS.d/next/Library/2021-11-17-11-38-30.bpo-41735.2feh9v.rst deleted file mode 100644 index 101da0e9ce648..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-17-11-38-30.bpo-41735.2feh9v.rst +++ /dev/null @@ -1 +0,0 @@ -Fix thread lock in ``zlib.Decompress.flush()`` method before ``PyObject_GetBuffer``. 
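The bpo-29620 entry above hardens :meth:`~unittest.TestCase.assertWarns` against imports that happen while ``__warningregistry__`` is being inspected; the basic usage pattern it protects looks like this (illustrative only, not from the patch)::

    import unittest
    import warnings

    class WarnTest(unittest.TestCase):
        def test_deprecation(self):
            # assertWarns fails the test unless the block emits the warning.
            with self.assertWarns(DeprecationWarning):
                warnings.warn("old API", DeprecationWarning)

    if __name__ == "__main__":
        unittest.main()
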
diff --git a/Misc/NEWS.d/next/Library/2021-11-17-19-25-37.bpo-45831.9-TojK.rst b/Misc/NEWS.d/next/Library/2021-11-17-19-25-37.bpo-45831.9-TojK.rst deleted file mode 100644 index 049449ff0a4a1..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-17-19-25-37.bpo-45831.9-TojK.rst +++ /dev/null @@ -1,5 +0,0 @@ -:mod:`faulthandler` can now write ASCII-only strings (like filenames and -function names) with a single write() syscall when dumping a traceback. It -reduces the risk of getting an unreadable dump when two threads or two -processes dump a traceback to the same file (like stderr) at the same time. -Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Library/2021-11-28-15-30-34.bpo-37658.8Hno7d.rst b/Misc/NEWS.d/next/Library/2021-11-28-15-30-34.bpo-37658.8Hno7d.rst deleted file mode 100644 index 97d1e961ac2be..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-28-15-30-34.bpo-37658.8Hno7d.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix issue when on certain conditions ``asyncio.wait_for()`` may allow a -coroutine to complete successfully, but fail to return the result, -potentially causing memory leaks or other issues. diff --git a/Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst b/Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst deleted file mode 100644 index bfea8d4fca0e0..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-11-30-13-52-02.bpo-13236.FmJIkO.rst +++ /dev/null @@ -1,2 +0,0 @@ -:class:`unittest.TextTestResult` and :class:`unittest.TextTestRunner` flush -now the output stream more often. diff --git a/Misc/NEWS.d/next/Library/2021-12-04-20-08-42.bpo-27946.-Vuarf.rst b/Misc/NEWS.d/next/Library/2021-12-04-20-08-42.bpo-27946.-Vuarf.rst deleted file mode 100644 index 0378efca746bb..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-04-20-08-42.bpo-27946.-Vuarf.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix possible crash when getting an attribute of -class:`xml.etree.ElementTree.Element` simultaneously with -replacing the ``attrib`` dict. diff --git a/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst b/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst deleted file mode 100644 index 6ff76f58779d2..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-09-00-44-42.bpo-46018.hkTI7v.rst +++ /dev/null @@ -1 +0,0 @@ -Ensure that :func:`math.expm1` does not raise on underflow. diff --git a/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst b/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst deleted file mode 100644 index 97a553d7ba29f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-11-15-45-07.bpo-46032.HmciLT.rst +++ /dev/null @@ -1,5 +0,0 @@ -The ``registry()`` method of :func:`functools.singledispatch` functions -checks now the first argument or the first parameter annotation and raises a -TypeError if it is not supported. Previously unsupported "types" were -ignored (e.g. ``typing.List[int]``) or caused an error at calling time (e.g. -``list[int]``). diff --git a/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst b/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst deleted file mode 100644 index c68e98ff0630b..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-11-22-51-30.bpo-27718.MgQiGl.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix help for the :mod:`signal` module. Some functions (e.g. ``signal()`` and -``getsignal()``) were omitted. 
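A minimal sketch (not part of the commit) of the stricter checking described in the bpo-46032 entry above; with that fix, registering an implementation whose annotation is not a plain class fails at registration time rather than later::

    import functools
    import typing

    @functools.singledispatch
    def process(arg):
        return "default"

    @process.register
    def _(arg: int):                      # a plain class is a valid annotation
        return "int"

    try:
        @process.register
        def _(arg: typing.List[int]):     # a generic alias is not supported
            return "list of ints"
    except TypeError as exc:
        print("rejected at registration time:", exc)

    print(process(3))                     # -> "int"
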
diff --git a/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst b/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst deleted file mode 100644 index 379dbb55c7ca8..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-14-13-18-45.bpo-26952.hjhISq.rst +++ /dev/null @@ -1 +0,0 @@ -:mod:`argparse` raises :exc:`ValueError` with clear message when trying to render usage for an empty mutually-exclusive group. Previously it raised a cryptic :exc:`IndexError`. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst b/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst deleted file mode 100644 index 145edccb2e7a5..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-16-14-30-36.bpo-46105.pprB1K.rst +++ /dev/null @@ -1,2 +0,0 @@ -Honor spec when generating requirement specs with urls and extras -(importlib_metadata 4.8.3). diff --git a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst b/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst deleted file mode 100644 index cc5cd0067e61f..0000000000000 --- a/Misc/NEWS.d/next/Library/2021-12-17-12-06-40.bpo-20369.zzLuBz.rst +++ /dev/null @@ -1 +0,0 @@ -:func:`concurrent.futures.wait` no longer blocks forever when given duplicate Futures. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst b/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst deleted file mode 100644 index 202febf84fd10..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-03-12-59-20.bpo-46239.ySVSEy.rst +++ /dev/null @@ -1,2 +0,0 @@ -Improve error message when importing :mod:`asyncio.windows_events` on -non-Windows. diff --git a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst b/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst deleted file mode 100644 index 40849044cf1c8..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-06-13-38-00.bpo-46278.wILA80.rst +++ /dev/null @@ -1,2 +0,0 @@ -Reflect ``context`` argument in ``AbstractEventLoop.call_*()`` methods. Loop -implementations already support it. diff --git a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst b/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst deleted file mode 100644 index 0fedc9dfb8fb1..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-07-13-51-22.bpo-46070.-axLUW.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix possible segfault when importing the :mod:`asyncio` module from -different sub-interpreters in parallel. Patch by Erlend E. Aasland. diff --git a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst b/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst deleted file mode 100644 index af72923bbd759..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-07-15-20-19.bpo-40479.EKfr3F.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix :mod:`hashlib` *usedforsecurity* option to work correctly with OpenSSL -3.0.0 in FIPS mode. diff --git a/Misc/NEWS.d/next/Tests/2021-11-17-14-28-08.bpo-45835.Mgyhjx.rst b/Misc/NEWS.d/next/Tests/2021-11-17-14-28-08.bpo-45835.Mgyhjx.rst deleted file mode 100644 index 6a73b01959065..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-11-17-14-28-08.bpo-45835.Mgyhjx.rst +++ /dev/null @@ -1 +0,0 @@ -Fix race condition in test_queue tests with multiple "feeder" threads. 
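For context on the bpo-40479 entry above, *usedforsecurity* marks a digest as non-cryptographic so that FIPS-restricted OpenSSL builds can still provide it for purposes such as cache keys; a minimal sketch (not part of the commit)::

    import hashlib

    # MD5 used only as a cheap fingerprint, not for security.  The keyword
    # lets FIPS builds allow this call while still blocking cryptographic use.
    digest = hashlib.md5(b"cache-key", usedforsecurity=False).hexdigest()
    print(digest)
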
diff --git a/Misc/NEWS.d/next/Tests/2021-11-28-15-25-02.bpo-19460.lr0aWs.rst b/Misc/NEWS.d/next/Tests/2021-11-28-15-25-02.bpo-19460.lr0aWs.rst deleted file mode 100644 index b082d6de20c07..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-11-28-15-25-02.bpo-19460.lr0aWs.rst +++ /dev/null @@ -1 +0,0 @@ -Add new Test for :class:`email.mime.nonmultipart.MIMENonMultipart`. diff --git a/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst b/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst deleted file mode 100644 index 6878cea032387..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-17-14-46-19.bpo-46114.9iyZ_9.rst +++ /dev/null @@ -1 +0,0 @@ -Fix test case for OpenSSL 3.0.1 version. OpenSSL 3.0 uses ``0xMNN00PP0L``. diff --git a/Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst b/Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst deleted file mode 100644 index 4ef0fe6f6d5fa..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-19-08-44-32.bpo-23819.9ueiII.rst +++ /dev/null @@ -1 +0,0 @@ -Fixed :mod:`asyncio` tests in python optimized mode. Patch by Kumar Aditya. \ No newline at end of file diff --git a/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst b/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst deleted file mode 100644 index b06436a4c8460..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-19-12-20-57.bpo-46129.I3MunH.rst +++ /dev/null @@ -1,2 +0,0 @@ -Rewrite ``asyncio.locks`` tests with -:class:`unittest.IsolatedAsyncioTestCase` usage. diff --git a/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst b/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst deleted file mode 100644 index 8ef9cd9b4a594..0000000000000 --- a/Misc/NEWS.d/next/Tests/2021-12-23-13-42-15.bpo-46150.RhtADs.rst +++ /dev/null @@ -1,2 +0,0 @@ -Now ``fakename`` in ``test_pathlib.PosixPathTest.test_expanduser`` is checked -to be non-existent. diff --git a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst b/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst deleted file mode 100644 index 0334af4e3cbe8..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-06-15-45-34.bpo-46263.bJXek6.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix test_capi on FreeBSD 14-dev: instruct jemalloc to not fill freed memory -with junk byte. diff --git a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst b/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst deleted file mode 100644 index 7c6121fb16249..0000000000000 --- a/Misc/NEWS.d/next/Tests/2022-01-07-14-06-12.bpo-46205.dnc2OC.rst +++ /dev/null @@ -1 +0,0 @@ -Fix hang in runtest_mp due to race condition diff --git a/Misc/NEWS.d/next/Tools-Demos/2021-11-18-11-20-21.bpo-45838.TH6mwc.rst b/Misc/NEWS.d/next/Tools-Demos/2021-11-18-11-20-21.bpo-45838.TH6mwc.rst deleted file mode 100644 index b6dafc5c3edc4..0000000000000 --- a/Misc/NEWS.d/next/Tools-Demos/2021-11-18-11-20-21.bpo-45838.TH6mwc.rst +++ /dev/null @@ -1 +0,0 @@ -Fix line number calculation when debugging Python with GDB. 
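The bpo-46129 entry above moves the ``asyncio.locks`` tests to :class:`unittest.IsolatedAsyncioTestCase`, which lets coroutines be used directly as test methods while the framework manages the event loop; a minimal sketch of the style (illustrative, not taken from the rewritten tests)::

    import asyncio
    import unittest

    class LockTest(unittest.IsolatedAsyncioTestCase):
        async def test_acquire_release(self):
            lock = asyncio.Lock()
            async with lock:                    # acquired inside the block
                self.assertTrue(lock.locked())
            self.assertFalse(lock.locked())     # released again on exit

    if __name__ == "__main__":
        unittest.main()
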
diff --git a/Misc/NEWS.d/next/Windows/2021-11-26-18-17-41.bpo-45901.c5IBqM.rst b/Misc/NEWS.d/next/Windows/2021-11-26-18-17-41.bpo-45901.c5IBqM.rst deleted file mode 100644 index 2cb872bffe072..0000000000000 --- a/Misc/NEWS.d/next/Windows/2021-11-26-18-17-41.bpo-45901.c5IBqM.rst +++ /dev/null @@ -1,4 +0,0 @@ -When installed through the Microsoft Store and set as the default app for -:file:`*.py` files, command line arguments will now be passed to Python when -invoking a script without explicitly launching Python (that is, ``script.py -args`` rather than ``python script.py args``). diff --git a/Misc/NEWS.d/next/macOS/2021-12-05-23-52-03.bpo-45732.-BWrnh.rst b/Misc/NEWS.d/next/macOS/2021-12-05-23-52-03.bpo-45732.-BWrnh.rst deleted file mode 100644 index eb47985f86f95..0000000000000 --- a/Misc/NEWS.d/next/macOS/2021-12-05-23-52-03.bpo-45732.-BWrnh.rst +++ /dev/null @@ -1 +0,0 @@ -Update python.org macOS installer to use Tcl/Tk 8.6.12. diff --git a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst b/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst deleted file mode 100644 index fc953b85dcc2a..0000000000000 --- a/Misc/NEWS.d/next/macOS/2022-01-02-21-56-53.bpo-40477.W3nnM6.rst +++ /dev/null @@ -1,2 +0,0 @@ -The Python Launcher app for macOS now properly launches scripts and, if -necessary, the Terminal app when running on recent macOS releases. diff --git a/README.rst b/README.rst index 9fb0d243b4986..2d921be5e03d4 100644 --- a/README.rst +++ b/README.rst @@ -1,5 +1,5 @@ -This is Python version 3.9.9 -============================ +This is Python version 3.9.10 +============================= .. image:: https://travis-ci.org/python/cpython.svg?branch=3.9 :alt: CPython build status on Travis CI From webhook-mailer at python.org Fri Jan 14 17:18:23 2022 From: webhook-mailer at python.org (ethanfurman) Date: Fri, 14 Jan 2022 22:18:23 -0000 Subject: [Python-checkins] bpo-46242: [Enum] better error message for extending `Enum` with members (GH-30357) Message-ID: https://github.com/python/cpython/commit/e674e48ddc2712f28cc7ecdc66a6c328066694b0 commit: e674e48ddc2712f28cc7ecdc66a6c328066694b0 branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-14T14:18:00-08:00 summary: bpo-46242: [Enum] better error message for extending `Enum` with members (GH-30357) files: A Misc/NEWS.d/next/Library/2022-01-03-16-25-06.bpo-46242.f4l_CL.rst M Lib/enum.py M Lib/test/test_enum.py diff --git a/Lib/enum.py b/Lib/enum.py index 86928b4f79f0b..93ea1bea36db7 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -763,7 +763,7 @@ def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, s """ metacls = cls.__class__ bases = (cls, ) if type is None else (type, cls) - _, first_enum = cls._get_mixins_(cls, bases) + _, first_enum = cls._get_mixins_(class_name, bases) classdict = metacls.__prepare__(class_name, bases) # special processing needed for names? @@ -848,8 +848,8 @@ def _check_for_existing_members(class_name, bases): % (class_name, base.__name__) ) - @staticmethod - def _get_mixins_(class_name, bases): + @classmethod + def _get_mixins_(cls, class_name, bases): """ Returns the type for creating enum members, and the first inherited enum class. @@ -890,9 +890,8 @@ def _find_data_type(bases): if not issubclass(first_enum, Enum): raise TypeError("new enumerations should be created as " "`EnumName([mixin_type, ...] 
[data_type,] enum_type)`") + cls._check_for_existing_members(class_name, bases) member_type = _find_data_type(bases) or object - if first_enum._member_names_: - raise TypeError("Cannot extend enumerations") return member_type, first_enum @staticmethod diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 04a68cc9ea204..43f98c1c1efb6 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -1433,6 +1433,8 @@ class MoreColor(Color): with self.assertRaisesRegex(TypeError, "EvenMoreColor: cannot extend enumeration 'Color'"): class EvenMoreColor(Color, IntEnum): chartruese = 7 + with self.assertRaisesRegex(TypeError, "Foo: cannot extend enumeration 'Color'"): + Color('Foo', ('pink', 'black')) def test_exclude_methods(self): class whatever(Enum): diff --git a/Misc/NEWS.d/next/Library/2022-01-03-16-25-06.bpo-46242.f4l_CL.rst b/Misc/NEWS.d/next/Library/2022-01-03-16-25-06.bpo-46242.f4l_CL.rst new file mode 100644 index 0000000000000..6a5b5fdffda40 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-03-16-25-06.bpo-46242.f4l_CL.rst @@ -0,0 +1 @@ +Improve error message when creating a new :class:`enum.Enum` type subclassing an existing ``Enum`` with ``_member_names_`` using :meth:`enum.Enum.__call__`. From webhook-mailer at python.org Fri Jan 14 17:49:24 2022 From: webhook-mailer at python.org (brettcannon) Date: Fri, 14 Jan 2022 22:49:24 -0000 Subject: [Python-checkins] bpo-20281, bpo-29964: update datetime docs to refer %z and %Z to a pre-existing footnote (GH-30354) Message-ID: https://github.com/python/cpython/commit/305588c67cdede4ef127ada90c1557bc1ef7c200 commit: 305588c67cdede4ef127ada90c1557bc1ef7c200 branch: main author: Hugo van Kemenade committer: brettcannon date: 2022-01-14T14:49:12-08:00 summary: bpo-20281, bpo-29964: update datetime docs to refer %z and %Z to a pre-existing footnote (GH-30354) files: M Doc/library/time.rst diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 622f66719cee7..6540932eecbaa 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -476,10 +476,10 @@ Functions | | negative time difference from UTC/GMT of the | | | | form +HHMM or -HHMM, where H represents decimal| | | | hour digits and M represents decimal minute | | - | | digits [-23:59, +23:59]. | | + | | digits [-23:59, +23:59]. [1]_ | | +-----------+------------------------------------------------+-------+ | ``%Z`` | Time zone name (no characters if no time zone | | - | | exists). | | + | | exists). Deprecated. [1]_ | | +-----------+------------------------------------------------+-------+ | ``%%`` | A literal ``'%'`` character. | | +-----------+------------------------------------------------+-------+ @@ -500,7 +500,7 @@ Functions calculations when the day of the week and the year are specified. Here is an example, a format for dates compatible with that specified in the - :rfc:`2822` Internet email standard. [#]_ :: + :rfc:`2822` Internet email standard. [1]_ :: >>> from time import gmtime, strftime >>> strftime("%a, %d %b %Y %H:%M:%S +0000", gmtime()) @@ -928,10 +928,9 @@ Timezone Constants .. rubric:: Footnotes -.. [#] The use of ``%Z`` is now deprecated, but the ``%z`` escape that expands to the - preferred hour/minute offset is not supported by all ANSI C libraries. Also, a +.. [1] The use of ``%Z`` is now deprecated, but the ``%z`` escape that expands to the + preferred hour/minute offset is not supported by all ANSI C libraries. 
Also, a strict reading of the original 1982 :rfc:`822` standard calls for a two-digit - year (%y rather than %Y), but practice moved to 4-digit years long before the + year (``%y`` rather than ``%Y``), but practice moved to 4-digit years long before the year 2000. After that, :rfc:`822` became obsolete and the 4-digit year has been first recommended by :rfc:`1123` and then mandated by :rfc:`2822`. - From webhook-mailer at python.org Fri Jan 14 18:59:36 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 14 Jan 2022 23:59:36 -0000 Subject: [Python-checkins] bpo-20281, bpo-29964: update datetime docs to refer %z and %Z to a pre-existing footnote (GH-30354) Message-ID: https://github.com/python/cpython/commit/93dc1654dc3c925c062e19f0ef8587aa8961ef8a commit: 93dc1654dc3c925c062e19f0ef8587aa8961ef8a branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-14T15:59:20-08:00 summary: bpo-20281, bpo-29964: update datetime docs to refer %z and %Z to a pre-existing footnote (GH-30354) (cherry picked from commit 305588c67cdede4ef127ada90c1557bc1ef7c200) Co-authored-by: Hugo van Kemenade files: M Doc/library/time.rst diff --git a/Doc/library/time.rst b/Doc/library/time.rst index cfd67e87501cd..735588a17e6e3 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -456,10 +456,10 @@ Functions | | negative time difference from UTC/GMT of the | | | | form +HHMM or -HHMM, where H represents decimal| | | | hour digits and M represents decimal minute | | - | | digits [-23:59, +23:59]. | | + | | digits [-23:59, +23:59]. [1]_ | | +-----------+------------------------------------------------+-------+ | ``%Z`` | Time zone name (no characters if no time zone | | - | | exists). | | + | | exists). Deprecated. [1]_ | | +-----------+------------------------------------------------+-------+ | ``%%`` | A literal ``'%'`` character. | | +-----------+------------------------------------------------+-------+ @@ -480,7 +480,7 @@ Functions calculations when the day of the week and the year are specified. Here is an example, a format for dates compatible with that specified in the - :rfc:`2822` Internet email standard. [#]_ :: + :rfc:`2822` Internet email standard. [1]_ :: >>> from time import gmtime, strftime >>> strftime("%a, %d %b %Y %H:%M:%S +0000", gmtime()) @@ -908,10 +908,9 @@ Timezone Constants .. rubric:: Footnotes -.. [#] The use of ``%Z`` is now deprecated, but the ``%z`` escape that expands to the - preferred hour/minute offset is not supported by all ANSI C libraries. Also, a +.. [1] The use of ``%Z`` is now deprecated, but the ``%z`` escape that expands to the + preferred hour/minute offset is not supported by all ANSI C libraries. Also, a strict reading of the original 1982 :rfc:`822` standard calls for a two-digit - year (%y rather than %Y), but practice moved to 4-digit years long before the + year (``%y`` rather than ``%Y``), but practice moved to 4-digit years long before the year 2000. After that, :rfc:`822` became obsolete and the 4-digit year has been first recommended by :rfc:`1123` and then mandated by :rfc:`2822`. 
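A minimal sketch of the ``%z`` / ``%Z`` distinction that the footnote above describes (illustrative only, not taken from the patch; the exact output depends on the platform C library, the locale, and the local timezone, which is one reason ``%Z`` is marked deprecated)::

    >>> import time
    >>> time.strftime('%a, %d %b %Y %H:%M:%S %z')   # numeric UTC offset, RFC 2822 style  # doctest: +SKIP
    'Sat, 15 Jan 2022 09:52:19 +0000'
    >>> time.strftime('%Z')                         # zone name only, platform dependent  # doctest: +SKIP
    'UTC'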
- From webhook-mailer at python.org Fri Jan 14 19:00:00 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 15 Jan 2022 00:00:00 -0000 Subject: [Python-checkins] bpo-20281, bpo-29964: update datetime docs to refer %z and %Z to a pre-existing footnote (GH-30354) Message-ID: https://github.com/python/cpython/commit/f869b56fe4be416d356fffc94b8b18fe65039929 commit: f869b56fe4be416d356fffc94b8b18fe65039929 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-14T15:59:56-08:00 summary: bpo-20281, bpo-29964: update datetime docs to refer %z and %Z to a pre-existing footnote (GH-30354) (cherry picked from commit 305588c67cdede4ef127ada90c1557bc1ef7c200) Co-authored-by: Hugo van Kemenade files: M Doc/library/time.rst diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 11aba5dd842c5..0dca9a8eed24b 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -434,10 +434,10 @@ Functions | | negative time difference from UTC/GMT of the | | | | form +HHMM or -HHMM, where H represents decimal| | | | hour digits and M represents decimal minute | | - | | digits [-23:59, +23:59]. | | + | | digits [-23:59, +23:59]. [1]_ | | +-----------+------------------------------------------------+-------+ | ``%Z`` | Time zone name (no characters if no time zone | | - | | exists). | | + | | exists). Deprecated. [1]_ | | +-----------+------------------------------------------------+-------+ | ``%%`` | A literal ``'%'`` character. | | +-----------+------------------------------------------------+-------+ @@ -458,7 +458,7 @@ Functions calculations when the day of the week and the year are specified. Here is an example, a format for dates compatible with that specified in the - :rfc:`2822` Internet email standard. [#]_ :: + :rfc:`2822` Internet email standard. [1]_ :: >>> from time import gmtime, strftime >>> strftime("%a, %d %b %Y %H:%M:%S +0000", gmtime()) @@ -879,10 +879,9 @@ Timezone Constants .. rubric:: Footnotes -.. [#] The use of ``%Z`` is now deprecated, but the ``%z`` escape that expands to the - preferred hour/minute offset is not supported by all ANSI C libraries. Also, a +.. [1] The use of ``%Z`` is now deprecated, but the ``%z`` escape that expands to the + preferred hour/minute offset is not supported by all ANSI C libraries. Also, a strict reading of the original 1982 :rfc:`822` standard calls for a two-digit - year (%y rather than %Y), but practice moved to 4-digit years long before the + year (``%y`` rather than ``%Y``), but practice moved to 4-digit years long before the year 2000. After that, :rfc:`822` became obsolete and the 4-digit year has been first recommended by :rfc:`1123` and then mandated by :rfc:`2822`. 
- From webhook-mailer at python.org Sat Jan 15 03:52:28 2022 From: webhook-mailer at python.org (tiran) Date: Sat, 15 Jan 2022 08:52:28 -0000 Subject: [Python-checkins] bpo-46383: Fix signature of zoneinfo module_free function (GH-30607) Message-ID: https://github.com/python/cpython/commit/cfbde65df318eea243706ff876e5ef834c085e5f commit: cfbde65df318eea243706ff876e5ef834c085e5f branch: main author: Christian Heimes committer: tiran date: 2022-01-15T09:52:19+01:00 summary: bpo-46383: Fix signature of zoneinfo module_free function (GH-30607) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst M Modules/_zoneinfo.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst new file mode 100644 index 0000000000000..8f8b12732a690 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst @@ -0,0 +1,2 @@ +Fix invalid signature of ``_zoneinfo``'s ``module_free`` function to resolve +a crash on wasm32-emscripten platform. diff --git a/Modules/_zoneinfo.c b/Modules/_zoneinfo.c index cac347071f91d..1535721b026d1 100644 --- a/Modules/_zoneinfo.c +++ b/Modules/_zoneinfo.c @@ -2612,7 +2612,7 @@ static PyTypeObject PyZoneInfo_ZoneInfoType = { // Specify the _zoneinfo module static PyMethodDef module_methods[] = {{NULL, NULL}}; static void -module_free(void) +module_free(void *m) { Py_XDECREF(_tzpath_find_tzfile); _tzpath_find_tzfile = NULL; From webhook-mailer at python.org Sat Jan 15 04:58:09 2022 From: webhook-mailer at python.org (mdickinson) Date: Sat, 15 Jan 2022 09:58:09 -0000 Subject: [Python-checkins] bpo-46258: Streamline isqrt fast path (#30333) Message-ID: https://github.com/python/cpython/commit/d02c5e9b55a8651b7d396ac3f2bdedf1fc1780b5 commit: d02c5e9b55a8651b7d396ac3f2bdedf1fc1780b5 branch: main author: Mark Dickinson committer: mdickinson date: 2022-01-15T09:58:04Z summary: bpo-46258: Streamline isqrt fast path (#30333) files: A Misc/NEWS.d/next/Library/2022-01-04-18-05-25.bpo-46258.DYgwRo.rst M Modules/mathmodule.c diff --git a/Misc/NEWS.d/next/Library/2022-01-04-18-05-25.bpo-46258.DYgwRo.rst b/Misc/NEWS.d/next/Library/2022-01-04-18-05-25.bpo-46258.DYgwRo.rst new file mode 100644 index 0000000000000..b918ed1a5d9e0 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-04-18-05-25.bpo-46258.DYgwRo.rst @@ -0,0 +1,2 @@ +Speed up :func:`math.isqrt` for small positive integers by replacing two +division steps with a lookup table. diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c index 3ab1a0776046d..0c7d4de068621 100644 --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -1718,20 +1718,49 @@ completes the proof sketch. */ +/* + The _approximate_isqrt_tab table provides approximate square roots for + 16-bit integers. For any n in the range 2**14 <= n < 2**16, the value + + a = _approximate_isqrt_tab[(n >> 8) - 64] + + is an approximate square root of n, satisfying (a - 1)**2 < n < (a + 1)**2. 
+ + The table was computed in Python using the expression: + + [min(round(sqrt(256*n + 128)), 255) for n in range(64, 256)] +*/ + +static const uint8_t _approximate_isqrt_tab[192] = { + 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, + 140, 141, 142, 143, 144, 144, 145, 146, 147, 148, 149, 150, + 151, 151, 152, 153, 154, 155, 156, 156, 157, 158, 159, 160, + 160, 161, 162, 163, 164, 164, 165, 166, 167, 167, 168, 169, + 170, 170, 171, 172, 173, 173, 174, 175, 176, 176, 177, 178, + 179, 179, 180, 181, 181, 182, 183, 183, 184, 185, 186, 186, + 187, 188, 188, 189, 190, 190, 191, 192, 192, 193, 194, 194, + 195, 196, 196, 197, 198, 198, 199, 200, 200, 201, 201, 202, + 203, 203, 204, 205, 205, 206, 206, 207, 208, 208, 209, 210, + 210, 211, 211, 212, 213, 213, 214, 214, 215, 216, 216, 217, + 217, 218, 219, 219, 220, 220, 221, 221, 222, 223, 223, 224, + 224, 225, 225, 226, 227, 227, 228, 228, 229, 229, 230, 230, + 231, 232, 232, 233, 233, 234, 234, 235, 235, 236, 237, 237, + 238, 238, 239, 239, 240, 240, 241, 241, 242, 242, 243, 243, + 244, 244, 245, 246, 246, 247, 247, 248, 248, 249, 249, 250, + 250, 251, 251, 252, 252, 253, 253, 254, 254, 255, 255, 255, +}; /* Approximate square root of a large 64-bit integer. Given `n` satisfying `2**62 <= n < 2**64`, return `a` satisfying `(a - 1)**2 < n < (a + 1)**2`. */ -static uint64_t +static inline uint32_t _approximate_isqrt(uint64_t n) { - uint32_t u = 1U + (n >> 62); - u = (u << 1) + (n >> 59) / u; - u = (u << 3) + (n >> 53) / u; - u = (u << 7) + (n >> 41) / u; - return (u << 15) + (n >> 17) / u; + uint32_t u = _approximate_isqrt_tab[(n >> 56) - 64]; + u = (u << 7) + (uint32_t)(n >> 41) / u; + return (u << 15) + (uint32_t)((n >> 17) / u); } /*[clinic input] @@ -1749,7 +1778,8 @@ math_isqrt(PyObject *module, PyObject *n) { int a_too_large, c_bit_length; size_t c, d; - uint64_t m, u; + uint64_t m; + uint32_t u; PyObject *a = NULL, *b; n = _PyNumber_Index(n); @@ -1776,18 +1806,17 @@ math_isqrt(PyObject *module, PyObject *n) c = (c - 1U) / 2U; /* Fast path: if c <= 31 then n < 2**64 and we can compute directly with a - fast, almost branch-free algorithm. In the final correction, we use `u*u - - 1 >= m` instead of the simpler `u*u > m` in order to get the correct - result in the corner case where `u=2**32`. */ + fast, almost branch-free algorithm. */ if (c <= 31U) { + int shift = 31 - (int)c; m = (uint64_t)PyLong_AsUnsignedLongLong(n); Py_DECREF(n); if (m == (uint64_t)(-1) && PyErr_Occurred()) { return NULL; } - u = _approximate_isqrt(m << (62U - 2U*c)) >> (31U - c); - u -= u * u - 1U >= m; - return PyLong_FromUnsignedLongLong((unsigned long long)u); + u = _approximate_isqrt(m << 2*shift) >> shift; + u -= (uint64_t)u * u > m; + return PyLong_FromUnsignedLong(u); } /* Slow path: n >= 2**64. 
We perform the first five iterations in C integer @@ -1811,7 +1840,7 @@ math_isqrt(PyObject *module, PyObject *n) goto error; } u = _approximate_isqrt(m) >> (31U - d); - a = PyLong_FromUnsignedLongLong((unsigned long long)u); + a = PyLong_FromUnsignedLong(u); if (a == NULL) { goto error; } From webhook-mailer at python.org Sat Jan 15 17:15:40 2022 From: webhook-mailer at python.org (rhettinger) Date: Sat, 15 Jan 2022 22:15:40 -0000 Subject: [Python-checkins] bpo-46388: Test NotImplemented code path for functools.total_ordering (GH-30616) Message-ID: https://github.com/python/cpython/commit/0a28118324f64851b684ec3afdd063c47513a236 commit: 0a28118324f64851b684ec3afdd063c47513a236 branch: main author: Russel Webber <24542073+RusselWebber at users.noreply.github.com> committer: rhettinger date: 2022-01-15T14:15:32-08:00 summary: bpo-46388: Test NotImplemented code path for functools.total_ordering (GH-30616) files: M Lib/test/test_functools.py diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py index d527e31f39ffe..abbd50a47f395 100644 --- a/Lib/test/test_functools.py +++ b/Lib/test/test_functools.py @@ -1049,6 +1049,73 @@ def test_no_operations_defined(self): class A: pass + def test_notimplemented(self): + # Verify NotImplemented results are correctly handled + @functools.total_ordering + class ImplementsLessThan: + def __init__(self, value): + self.value = value + def __eq__(self, other): + if isinstance(other, ImplementsLessThan): + return self.value == other.value + return False + def __lt__(self, other): + if isinstance(other, ImplementsLessThan): + return self.value < other.value + return NotImplemented + + @functools.total_ordering + class ImplementsLessThanEqualTo: + def __init__(self, value): + self.value = value + def __eq__(self, other): + if isinstance(other, ImplementsLessThanEqualTo): + return self.value == other.value + return False + def __le__(self, other): + if isinstance(other, ImplementsLessThanEqualTo): + return self.value <= other.value + return NotImplemented + + @functools.total_ordering + class ImplementsGreaterThan: + def __init__(self, value): + self.value = value + def __eq__(self, other): + if isinstance(other, ImplementsGreaterThan): + return self.value == other.value + return False + def __gt__(self, other): + if isinstance(other, ImplementsGreaterThan): + return self.value > other.value + return NotImplemented + + @functools.total_ordering + class ImplementsGreaterThanEqualTo: + def __init__(self, value): + self.value = value + def __eq__(self, other): + if isinstance(other, ImplementsGreaterThanEqualTo): + return self.value == other.value + return False + def __ge__(self, other): + if isinstance(other, ImplementsGreaterThanEqualTo): + return self.value >= other.value + return NotImplemented + + self.assertIs(ImplementsLessThan(1).__le__(1), NotImplemented) + self.assertIs(ImplementsLessThan(1).__gt__(1), NotImplemented) + self.assertIs(ImplementsLessThan(1).__ge__(1), NotImplemented) + self.assertIs(ImplementsLessThanEqualTo(1).__lt__(1), NotImplemented) + self.assertIs(ImplementsLessThanEqualTo(1).__gt__(1), NotImplemented) + self.assertIs(ImplementsLessThanEqualTo(1).__ge__(1), NotImplemented) + self.assertIs(ImplementsGreaterThan(1).__lt__(1), NotImplemented) + self.assertIs(ImplementsGreaterThan(1).__gt__(1), NotImplemented) + self.assertIs(ImplementsGreaterThan(1).__ge__(1), NotImplemented) + self.assertIs(ImplementsGreaterThanEqualTo(1).__lt__(1), NotImplemented) + self.assertIs(ImplementsGreaterThanEqualTo(1).__le__(1), NotImplemented) 
+ self.assertIs(ImplementsGreaterThanEqualTo(1).__gt__(1), NotImplemented) + def test_type_error_when_not_implemented(self): # bug 10042; ensure stack overflow does not occur # when decorated types return NotImplemented From webhook-mailer at python.org Sat Jan 15 23:33:39 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 16 Jan 2022 04:33:39 -0000 Subject: [Python-checkins] bpo-46387: test all pickle protos in `test_field_descriptor` in `test_collections` (GH-30614) Message-ID: https://github.com/python/cpython/commit/37eab55ac9da6b6361f136a1da15bfcef12ed954 commit: 37eab55ac9da6b6361f136a1da15bfcef12ed954 branch: main author: Nikita Sobolev committer: rhettinger date: 2022-01-15T22:33:28-06:00 summary: bpo-46387: test all pickle protos in `test_field_descriptor` in `test_collections` (GH-30614) files: M Lib/test/test_collections.py diff --git a/Lib/test/test_collections.py b/Lib/test/test_collections.py index 48327bf50ea42..3a16045c5aa1a 100644 --- a/Lib/test/test_collections.py +++ b/Lib/test/test_collections.py @@ -677,14 +677,16 @@ def test_field_descriptor(self): self.assertRaises(AttributeError, Point.x.__set__, p, 33) self.assertRaises(AttributeError, Point.x.__delete__, p) - class NewPoint(tuple): - x = pickle.loads(pickle.dumps(Point.x)) - y = pickle.loads(pickle.dumps(Point.y)) + for proto in range(pickle.HIGHEST_PROTOCOL + 1): + with self.subTest(proto=proto): + class NewPoint(tuple): + x = pickle.loads(pickle.dumps(Point.x, proto)) + y = pickle.loads(pickle.dumps(Point.y, proto)) - np = NewPoint([1, 2]) + np = NewPoint([1, 2]) - self.assertEqual(np.x, 1) - self.assertEqual(np.y, 2) + self.assertEqual(np.x, 1) + self.assertEqual(np.y, 2) def test_new_builtins_issue_43102(self): obj = namedtuple('C', ()) From webhook-mailer at python.org Sun Jan 16 01:42:12 2022 From: webhook-mailer at python.org (ethanfurman) Date: Sun, 16 Jan 2022 06:42:12 -0000 Subject: [Python-checkins] bpo-40066: [Enum] update str() and format() output (GH-30582) Message-ID: https://github.com/python/cpython/commit/acf7403f9baea3ae1119fc6b4a3298522188bf96 commit: acf7403f9baea3ae1119fc6b4a3298522188bf96 branch: main author: Ethan Furman committer: ethanfurman date: 2022-01-15T22:41:43-08:00 summary: bpo-40066: [Enum] update str() and format() output (GH-30582) Undo rejected PEP-663 changes: - restore `repr()` to its 3.10 status - restore `str()` to its 3.10 status New changes: - `IntEnum` and `IntFlag` now leave `__str__` as the original `int.__str__` so that str() and format() return the same result - zero-valued flags without a name have a slightly changed repr(), e.g. 
`repr(Color(0)) == ''` - update `dir()` for mixed-in types to return all the methods and attributes of the mixed-in type - added `_numeric_repr_` to `Flag` to control display of unnamed values - enums without doc strings have a more comprehensive doc string added - `ReprEnum` added -- inheriting from this makes it so only `__repr__` is replaced, not `__str__` nor `__format__`; `IntEnum`, `IntFlag`, and `StrEnum` all inherit from `ReprEnum` files: A Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst M Doc/howto/enum.rst M Doc/library/enum.rst M Doc/library/ssl.rst M Lib/enum.py M Lib/inspect.py M Lib/plistlib.py M Lib/re.py M Lib/ssl.py M Lib/test/test_enum.py M Lib/test/test_signal.py M Lib/test/test_socket.py M Lib/test/test_ssl.py M Lib/test/test_unicode.py diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index 6c09b9925c1de..fa0e2283ebc10 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -2,15 +2,10 @@ Enum HOWTO ========== -:Author: Ethan Furman - .. _enum-basic-tutorial: .. currentmodule:: enum -Basic Enum Tutorial -------------------- - An :class:`Enum` is a set of symbolic names bound to unique values. They are similar to global variables, but they offer a more useful :func:`repr()`, grouping, type-safety, and a few other features. @@ -28,6 +23,14 @@ selection of values. For example, the days of the week:: ... SATURDAY = 6 ... SUNDAY = 7 + Or perhaps the RGB primary colors:: + + >>> from enum import Enum + >>> class Color(Enum): + ... RED = 1 + ... GREEN = 2 + ... BLUE = 3 + As you can see, creating an :class:`Enum` is as simple as writing a class that inherits from :class:`Enum` itself. @@ -41,13 +44,14 @@ important, but either way that value can be used to get the corresponding member:: >>> Weekday(3) - Weekday.WEDNESDAY + -As you can see, the ``repr()`` of a member shows the enum name and the -member name. The ``str()`` on a member shows only its name:: +As you can see, the ``repr()`` of a member shows the enum name, the member name, +and the value. The ``str()`` of a member shows only the enum name and member +name:: >>> print(Weekday.THURSDAY) - THURSDAY + Weekday.THURSDAY The *type* of an enumeration member is the enum it belongs to:: @@ -97,8 +101,8 @@ The complete :class:`Weekday` enum now looks like this:: Now we can find out what today is! Observe:: >>> from datetime import date - >>> Weekday.from_date(date.today()) - Weekday.TUESDAY + >>> Weekday.from_date(date.today()) # doctest: +SKIP + Of course, if you're reading this on some other day, you'll see that day instead. @@ -124,21 +128,21 @@ Just like the original :class:`Weekday` enum above, we can have a single selecti >>> first_week_day = Weekday.MONDAY >>> first_week_day - Weekday.MONDAY + But :class:`Flag` also allows us to combine several members into a single variable:: >>> weekend = Weekday.SATURDAY | Weekday.SUNDAY >>> weekend - Weekday.SATURDAY|Weekday.SUNDAY + You can even iterate over a :class:`Flag` variable:: >>> for day in weekend: ... print(day) - SATURDAY - SUNDAY + Weekday.SATURDAY + Weekday.SUNDAY Okay, let's get some chores set up:: @@ -173,6 +177,7 @@ yourself some work and use :func:`auto()` for the values:: .. _enum-advanced-tutorial: + Programmatic access to enumeration members and their attributes --------------------------------------------------------------- @@ -181,16 +186,16 @@ situations where ``Color.RED`` won't do because the exact color is not known at program-writing time). 
``Enum`` allows such access:: >>> Color(1) - Color.RED + >>> Color(3) - Color.BLUE + If you want to access enum members by *name*, use item access:: >>> Color['RED'] - Color.RED + >>> Color['GREEN'] - Color.GREEN + If you have an enum member and need its :attr:`name` or :attr:`value`:: @@ -212,7 +217,7 @@ Having two enum members with the same name is invalid:: ... Traceback (most recent call last): ... - TypeError: 'SQUARE' already defined as: 2 + TypeError: 'SQUARE' already defined as 2 However, an enum member can have other names associated with it. Given two entries ``A`` and ``B`` with the same value (and ``A`` defined first), ``B`` @@ -227,11 +232,11 @@ By-name lookup of ``B`` will also return the member ``A``:: ... ALIAS_FOR_SQUARE = 2 ... >>> Shape.SQUARE - Shape.SQUARE + >>> Shape.ALIAS_FOR_SQUARE - Shape.SQUARE + >>> Shape(2) - Shape.SQUARE + .. note:: @@ -299,7 +304,7 @@ Iteration Iterating over the members of an enum does not provide the aliases:: >>> list(Shape) - [Shape.SQUARE, Shape.DIAMOND, Shape.CIRCLE] + [, , ] The special attribute ``__members__`` is a read-only ordered mapping of names to members. It includes all names defined in the enumeration, including the @@ -308,10 +313,10 @@ aliases:: >>> for name, member in Shape.__members__.items(): ... name, member ... - ('SQUARE', Shape.SQUARE) - ('DIAMOND', Shape.DIAMOND) - ('CIRCLE', Shape.CIRCLE) - ('ALIAS_FOR_SQUARE', Shape.SQUARE) + ('SQUARE', ) + ('DIAMOND', ) + ('CIRCLE', ) + ('ALIAS_FOR_SQUARE', ) The ``__members__`` attribute can be used for detailed programmatic access to the enumeration members. For example, finding all the aliases:: @@ -360,8 +365,8 @@ below):: Allowed members and attributes of enumerations ---------------------------------------------- -Most of the examples above use integers for enumeration values. Using integers is -short and handy (and provided by default by the `Functional API`_), but not +Most of the examples above use integers for enumeration values. Using integers +is short and handy (and provided by default by the `Functional API`_), but not strictly enforced. In the vast majority of use-cases, one doesn't care what the actual value of an enumeration is. But if the value *is* important, enumerations can have arbitrary values. @@ -389,7 +394,7 @@ usual. If we have this enumeration:: Then:: >>> Mood.favorite_mood() - Mood.HAPPY + >>> Mood.HAPPY.describe() ('HAPPY', 3) >>> str(Mood.FUNKY) @@ -425,7 +430,7 @@ any members. So this is forbidden:: ... Traceback (most recent call last): ... - TypeError: MoreColor: cannot extend enumeration 'Color' + TypeError: cannot extend But this is allowed:: @@ -476,11 +481,9 @@ The :class:`Enum` class is callable, providing the following functional API:: >>> Animal >>> Animal.ANT - Animal.ANT - >>> Animal.ANT.value - 1 + >>> list(Animal) - [Animal.ANT, Animal.BEE, Animal.CAT, Animal.DOG] + [, , , ] The semantics of this API resemble :class:`~collections.namedtuple`. The first argument of the call to :class:`Enum` is the name of the enumeration. @@ -625,16 +628,7 @@ StrEnum The second variation of :class:`Enum` that is provided is also a subclass of :class:`str`. Members of a :class:`StrEnum` can be compared to strings; by extension, string enumerations of different types can also be compared -to each other. :class:`StrEnum` exists to help avoid the problem of getting -an incorrect member:: - - >>> from enum import StrEnum - >>> class Directions(StrEnum): - ... NORTH = 'north', # notice the trailing comma - ... 
SOUTH = 'south' - -Before :class:`StrEnum`, ``Directions.NORTH`` would have been the :class:`tuple` -``('north',)``. +to each other. .. versionadded:: 3.11 @@ -645,9 +639,8 @@ IntFlag The next variation of :class:`Enum` provided, :class:`IntFlag`, is also based on :class:`int`. The difference being :class:`IntFlag` members can be combined using the bitwise operators (&, \|, ^, ~) and the result is still an -:class:`IntFlag` member, if possible. However, as the name implies, :class:`IntFlag` -members also subclass :class:`int` and can be used wherever an :class:`int` is -used. +:class:`IntFlag` member, if possible. Like :class:`IntEnum`, :class:`IntFlag` +members are also integers and can be used wherever an :class:`int` is used. .. note:: @@ -670,7 +663,7 @@ Sample :class:`IntFlag` class:: ... X = 1 ... >>> Perm.R | Perm.W - Perm.R|Perm.W + >>> Perm.R + Perm.W 6 >>> RW = Perm.R | Perm.W @@ -685,11 +678,11 @@ It is also possible to name the combinations:: ... X = 1 ... RWX = 7 >>> Perm.RWX - Perm.RWX + >>> ~Perm.RWX - Perm(0) + >>> Perm(7) - Perm.RWX + .. note:: @@ -702,7 +695,7 @@ Another important difference between :class:`IntFlag` and :class:`Enum` is that if no flags are set (the value is 0), its boolean evaluation is :data:`False`:: >>> Perm.R & Perm.X - Perm(0) + >>> bool(Perm.R & Perm.X) False @@ -710,7 +703,7 @@ Because :class:`IntFlag` members are also subclasses of :class:`int` they can be combined with them (but may lose :class:`IntFlag` membership:: >>> Perm.X | 4 - Perm.R|Perm.X + >>> Perm.X | 8 9 @@ -726,7 +719,7 @@ be combined with them (but may lose :class:`IntFlag` membership:: :class:`IntFlag` members can also be iterated over:: >>> list(RW) - [Perm.R, Perm.W] + [, ] .. versionadded:: 3.11 @@ -753,7 +746,7 @@ flags being set, the boolean evaluation is :data:`False`:: ... GREEN = auto() ... >>> Color.RED & Color.GREEN - Color(0) + >>> bool(Color.RED & Color.GREEN) False @@ -767,7 +760,7 @@ while combinations of flags won't:: ... WHITE = RED | BLUE | GREEN ... >>> Color.WHITE - Color.WHITE + Giving a name to the "no flags set" condition does not change its boolean value:: @@ -779,7 +772,7 @@ value:: ... GREEN = auto() ... >>> Color.BLACK - Color.BLACK + >>> bool(Color.BLACK) False @@ -787,7 +780,7 @@ value:: >>> purple = Color.RED | Color.BLUE >>> list(purple) - [Color.RED, Color.BLUE] + [, ] .. versionadded:: 3.11 @@ -812,16 +805,16 @@ simple to implement independently:: pass This demonstrates how similar derived enumerations can be defined; for example -a :class:`StrEnum` that mixes in :class:`str` instead of :class:`int`. +a :class:`FloatEnum` that mixes in :class:`float` instead of :class:`int`. Some rules: 1. When subclassing :class:`Enum`, mix-in types must appear before :class:`Enum` itself in the sequence of bases, as in the :class:`IntEnum` example above. -2. Mix-in types must be subclassable. For example, - :class:`bool` and :class:`range` are not subclassable - and will throw an error during Enum creation if used as the mix-in type. +2. Mix-in types must be subclassable. For example, :class:`bool` and + :class:`range` are not subclassable and will throw an error during Enum + creation if used as the mix-in type. 3. While :class:`Enum` can have members of any type, once you mix in an additional type, all the members must have values of that type, e.g. :class:`int` above. This restriction does not apply to mix-ins which only @@ -829,15 +822,18 @@ Some rules: 4. 
When another data type is mixed in, the :attr:`value` attribute is *not the same* as the enum member itself, although it is equivalent and will compare equal. -5. %-style formatting: `%s` and `%r` call the :class:`Enum` class's +5. %-style formatting: ``%s`` and ``%r`` call the :class:`Enum` class's :meth:`__str__` and :meth:`__repr__` respectively; other codes (such as - `%i` or `%h` for IntEnum) treat the enum member as its mixed-in type. + ``%i`` or ``%h`` for IntEnum) treat the enum member as its mixed-in type. 6. :ref:`Formatted string literals `, :meth:`str.format`, - and :func:`format` will use the mixed-in type's :meth:`__format__` - unless :meth:`__str__` or :meth:`__format__` is overridden in the subclass, - in which case the overridden methods or :class:`Enum` methods will be used. - Use the !s and !r format codes to force usage of the :class:`Enum` class's - :meth:`__str__` and :meth:`__repr__` methods. + and :func:`format` will use the enum's :meth:`__str__` method. + +.. note:: + + Because :class:`IntEnum`, :class:`IntFlag`, and :class:`StrEnum` are + designed to be drop-in replacements for existing constants, their + :meth:`__str__` method has been reset to their data types + :meth:`__str__` method. When to use :meth:`__new__` vs. :meth:`__init__` ------------------------------------------------ @@ -866,10 +862,10 @@ want one of them to be the value:: ... >>> print(Coordinate['PY']) - PY + Coordinate.PY >>> print(Coordinate(3)) - VY + Coordinate.VY Finer Points @@ -927,8 +923,8 @@ and raise an error if the two do not match:: Traceback (most recent call last): ... TypeError: member order does not match _order_: - ['RED', 'BLUE', 'GREEN'] - ['RED', 'GREEN', 'BLUE'] + ['RED', 'BLUE', 'GREEN'] + ['RED', 'GREEN', 'BLUE'] .. note:: @@ -949,35 +945,36 @@ but remain normal attributes. """""""""""""""""""" Enum members are instances of their enum class, and are normally accessed as -``EnumClass.member``. In Python versions ``3.5`` to ``3.9`` you could access -members from other members -- this practice was discouraged, and in ``3.12`` -:class:`Enum` will return to not allowing it, while in ``3.10`` and ``3.11`` -it will raise a :exc:`DeprecationWarning`:: +``EnumClass.member``. In Python versions ``3.5`` to ``3.10`` you could access +members from other members -- this practice was discouraged, and in ``3.11`` +:class:`Enum` returns to not allowing it:: >>> class FieldTypes(Enum): ... name = 0 ... value = 1 ... size = 2 ... - >>> FieldTypes.value.size # doctest: +SKIP - DeprecationWarning: accessing one member from another is not supported, - and will be disabled in 3.12 - + >>> FieldTypes.value.size + Traceback (most recent call last): + ... + AttributeError: member has no attribute 'size' + .. versionchanged:: 3.5 +.. versionchanged:: 3.11 Creating members that are mixed with other data types """"""""""""""""""""""""""""""""""""""""""""""""""""" When subclassing other data types, such as :class:`int` or :class:`str`, with -an :class:`Enum`, all values after the `=` are passed to that data type's +an :class:`Enum`, all values after the ``=`` are passed to that data type's constructor. For example:: - >>> class MyEnum(IntEnum): - ... example = '11', 16 # '11' will be interpreted as a hexadecimal - ... # number - >>> MyEnum.example.value + >>> class MyEnum(IntEnum): # help(int) -> int(x, base=10) -> integer + ... example = '11', 16 # so x='11' and base=16 + ... + >>> MyEnum.example.value # and hex(11) is... 
17 @@ -1000,13 +997,12 @@ Plain :class:`Enum` classes always evaluate as :data:`True`. """"""""""""""""""""""""""""" If you give your enum subclass extra methods, like the `Planet`_ -class below, those methods will show up in a :func:`dir` of the member and the -class. Attributes defined in an :func:`__init__` method will only show up in a -:func:`dir` of the member:: +class below, those methods will show up in a :func:`dir` of the member, +but not of the class:: - >>> dir(Planet) - ['EARTH', 'JUPITER', 'MARS', 'MERCURY', 'NEPTUNE', 'SATURN', 'URANUS', 'VENUS', '__class__', '__doc__', '__init__', '__members__', '__module__', 'surface_gravity'] - >>> dir(Planet.EARTH) + >>> dir(Planet) # doctest: +SKIP + ['EARTH', 'JUPITER', 'MARS', 'MERCURY', 'NEPTUNE', 'SATURN', 'URANUS', 'VENUS', '__class__', '__doc__', '__members__', '__module__'] + >>> dir(Planet.EARTH) # doctest: +SKIP ['__class__', '__doc__', '__module__', 'mass', 'name', 'radius', 'surface_gravity', 'value'] @@ -1025,19 +1021,10 @@ are comprised of a single bit:: ... CYAN = GREEN | BLUE ... >>> Color(3) # named combination - Color.YELLOW + >>> Color(7) # not named combination - Color.RED|Color.GREEN|Color.BLUE + -``StrEnum`` and :meth:`str.__str__` -""""""""""""""""""""""""""""""""""" - -An important difference between :class:`StrEnum` and other Enums is the -:meth:`__str__` method; because :class:`StrEnum` members are strings, some -parts of Python will read the string data directly, while others will call -:meth:`str()`. To make those two operations have the same result, -:meth:`StrEnum.__str__` will be the same as :meth:`str.__str__` so that -``str(StrEnum.member) == StrEnum.member`` is true. ``Flag`` and ``IntFlag`` minutia """""""""""""""""""""""""""""""" @@ -1060,16 +1047,16 @@ the following are true: - only canonical flags are returned during iteration:: >>> list(Color.WHITE) - [Color.RED, Color.GREEN, Color.BLUE] + [, , ] - negating a flag or flag set returns a new flag/flag set with the corresponding positive integer value:: >>> Color.BLUE - Color.BLUE + >>> ~Color.BLUE - Color.RED|Color.GREEN + - names of pseudo-flags are constructed from their members' names:: @@ -1079,25 +1066,29 @@ the following are true: - multi-bit flags, aka aliases, can be returned from operations:: >>> Color.RED | Color.BLUE - Color.PURPLE + >>> Color(7) # or Color(-1) - Color.WHITE + >>> Color(0) - Color.BLACK + -- membership / containment checking has changed slightly -- zero-valued flags - are never considered to be contained:: +- membership / containment checking: zero-valued flags are always considered + to be contained:: >>> Color.BLACK in Color.WHITE - False + True - otherwise, if all bits of one flag are in the other flag, True is returned:: + otherwise, only if all bits of one flag are in the other flag will True + be returned:: >>> Color.PURPLE in Color.WHITE True + >>> Color.GREEN in Color.PURPLE + False + There is a new boundary mechanism that controls how out-of-range / invalid bits are handled: ``STRICT``, ``CONFORM``, ``EJECT``, and ``KEEP``: @@ -1181,7 +1172,7 @@ Using :class:`auto` would look like:: ... GREEN = auto() ... >>> Color.GREEN - + Using :class:`object` @@ -1194,10 +1185,24 @@ Using :class:`object` would look like:: ... GREEN = object() ... BLUE = object() ... + >>> Color.GREEN # doctest: +SKIP + > + +This is also a good example of why you might want to write your own +:meth:`__repr__`:: + + >>> class Color(Enum): + ... RED = object() + ... GREEN = object() + ... BLUE = object() + ... def __repr__(self): + ... 
return "<%s.%s>" % (self.__class__.__name__, self._name_) + ... >>> Color.GREEN + Using a descriptive string """""""""""""""""""""""""" @@ -1209,9 +1214,7 @@ Using a string as the value would look like:: ... BLUE = 'too fast!' ... >>> Color.GREEN - - >>> Color.GREEN.value - 'go' + Using a custom :meth:`__new__` @@ -1232,9 +1235,7 @@ Using an auto-numbering :meth:`__new__` would look like:: ... BLUE = () ... >>> Color.GREEN - - >>> Color.GREEN.value - 2 + To make a more general purpose ``AutoNumber``, add ``*args`` to the signature:: @@ -1257,7 +1258,7 @@ to handle any extra arguments:: ... BLEACHED_CORAL = () # New color, no Pantone code yet! ... >>> Swatch.SEA_GREEN - + >>> Swatch.SEA_GREEN.pantone '1246' >>> Swatch.BLEACHED_CORAL.pantone @@ -1384,30 +1385,9 @@ An example to show the :attr:`_ignore_` attribute in use:: ... Period['day_%d' % i] = i ... >>> list(Period)[:2] - [Period.day_0, Period.day_1] + [, ] >>> list(Period)[-2:] - [Period.day_365, Period.day_366] - - -Conforming input to Flag -^^^^^^^^^^^^^^^^^^^^^^^^ - -To create a :class:`Flag` enum that is more resilient to out-of-bounds results -from mathematical operations, you can use the :attr:`FlagBoundary.CONFORM` -setting:: - - >>> from enum import Flag, CONFORM, auto - >>> class Weekday(Flag, boundary=CONFORM): - ... MONDAY = auto() - ... TUESDAY = auto() - ... WEDNESDAY = auto() - ... THURSDAY = auto() - ... FRIDAY = auto() - ... SATURDAY = auto() - ... SUNDAY = auto() - >>> today = Weekday.TUESDAY - >>> Weekday(today + 22) # what day is three weeks from tomorrow? - >>> Weekday.WEDNESDAY + [, ] .. _enumtype-examples: diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 8bb19dcdf2b61..906c60bc3efe3 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -31,7 +31,7 @@ An enumeration: * uses *call* syntax to return members by value * uses *index* syntax to return members by name -Enumerations are created either by using the :keyword:`class` syntax, or by +Enumerations are created either by using :keyword:`class` syntax, or by using function-call syntax:: >>> from enum import Enum @@ -45,7 +45,7 @@ using function-call syntax:: >>> # functional syntax >>> Color = Enum('Color', ['RED', 'GREEN', 'BLUE']) -Even though we can use the :keyword:`class` syntax to create Enums, Enums +Even though we can use :keyword:`class` syntax to create Enums, Enums are not normal Python classes. See :ref:`How are Enums different? ` for more details. @@ -53,7 +53,7 @@ are not normal Python classes. See - The class :class:`Color` is an *enumeration* (or *enum*) - The attributes :attr:`Color.RED`, :attr:`Color.GREEN`, etc., are - *enumeration members* (or *enum members*) and are functionally constants. + *enumeration members* (or *members*) and are functionally constants. - The enum members have *names* and *values* (the name of :attr:`Color.RED` is ``RED``, the value of :attr:`Color.BLUE` is ``3``, etc.) @@ -110,15 +110,10 @@ Module Contents :class:`StrEnum` defaults to the lower-cased version of the member name, while other Enums default to 1 and increase from there. - :func:`global_enum` - - :class:`Enum` class decorator to apply the appropriate global `__repr__`, - and export its members into the global name space. - - :func:`.property` + :func:`property` Allows :class:`Enum` members to have attributes without conflicting with - other members' names. + member names. :func:`unique` @@ -131,7 +126,7 @@ Module Contents .. versionadded:: 3.6 ``Flag``, ``IntFlag``, ``auto`` -.. 
versionadded:: 3.11 ``StrEnum``, ``EnumCheck``, ``FlagBoundary`` +.. versionadded:: 3.11 ``StrEnum``, ``EnumCheck``, ``FlagBoundary``, ``property`` --------------- @@ -145,6 +140,11 @@ Data Types to subclass *EnumType* -- see :ref:`Subclassing EnumType ` for details. + *EnumType* is responsible for setting the correct :meth:`__repr__`, + :meth:`__str__`, :meth:`__format__`, and :meth:`__reduce__` methods on the + final *enum*, as well as creating the enum members, properly handling + duplicates, providing iteration over the enum class, etc. + .. method:: EnumType.__contains__(cls, member) Returns ``True`` if member belongs to the ``cls``:: @@ -162,32 +162,31 @@ Data Types .. method:: EnumType.__dir__(cls) Returns ``['__class__', '__doc__', '__members__', '__module__']`` and the - names of the members in ``cls``. User-defined methods and methods from - mixin classes will also be included:: + names of the members in *cls*:: >>> dir(Color) - ['BLUE', 'GREEN', 'RED', '__class__', '__doc__', '__members__', '__module__'] + ['BLUE', 'GREEN', 'RED', '__class__', '__contains__', '__doc__', '__getitem__', '__init_subclass__', '__iter__', '__len__', '__members__', '__module__', '__name__', '__qualname__'] .. method:: EnumType.__getattr__(cls, name) Returns the Enum member in *cls* matching *name*, or raises an :exc:`AttributeError`:: >>> Color.GREEN - Color.GREEN + .. method:: EnumType.__getitem__(cls, name) - Returns the Enum member in *cls* matching *name*, or raises a :exc:`KeyError`:: + Returns the Enum member in *cls* matching *name*, or raises an :exc:`KeyError`:: >>> Color['BLUE'] - Color.BLUE + .. method:: EnumType.__iter__(cls) Returns each member in *cls* in definition order:: >>> list(Color) - [Color.RED, Color.GREEN, Color.BLUE] + [, , ] .. method:: EnumType.__len__(cls) @@ -201,7 +200,7 @@ Data Types Returns each member in *cls* in reverse definition order:: >>> list(reversed(Color)) - [Color.BLUE, Color.GREEN, Color.RED] + [, , ] .. class:: Enum @@ -232,7 +231,7 @@ Data Types .. attribute:: Enum._ignore_ ``_ignore_`` is only used during creation and is removed from the - enumeration once that is complete. + enumeration once creation is complete. ``_ignore_`` is a list of names that will not become members, and whose names will also be removed from the completed enumeration. See @@ -261,7 +260,7 @@ Data Types .. method:: Enum.__dir__(self) Returns ``['__class__', '__doc__', '__module__', 'name', 'value']`` and - any public methods defined on ``self.__class__`` or a mixin class:: + any public methods defined on *self.__class__*:: >>> from datetime import date >>> class Weekday(Enum): @@ -276,7 +275,7 @@ Data Types ... def today(cls): ... print('today is %s' % cls(date.today().isoweekday()).name) >>> dir(Weekday.SATURDAY) - ['__class__', '__doc__', '__module__', 'name', 'today', 'value'] + ['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'today', 'value'] .. method:: Enum._generate_next_value_(name, start, count, last_values) @@ -298,6 +297,11 @@ Data Types >>> PowersOfThree.SECOND.value 6 + .. method:: Enum.__init_subclass__(cls, \**kwds) + + A *classmethod* that is used to further configure subsequent subclasses. + By default, does nothing. + .. method:: Enum._missing_(cls, value) A *classmethod* for looking up values not found in *cls*. By default it @@ -317,42 +321,55 @@ Data Types >>> Build.DEBUG.value 'debug' >>> Build('deBUG') - Build.DEBUG + .. method:: Enum.__repr__(self) Returns the string used for *repr()* calls. 
By default, returns the - *Enum* name and the member name, but can be overridden:: + *Enum* name, member name, and value, but can be overridden:: - >>> class OldStyle(Enum): - ... RETRO = auto() - ... OLD_SCHOOl = auto() - ... YESTERYEAR = auto() + >>> class OtherStyle(Enum): + ... ALTERNATE = auto() + ... OTHER = auto() + ... SOMETHING_ELSE = auto() ... def __repr__(self): ... cls_name = self.__class__.__name__ - ... return f'<{cls_name}.{self.name}: {self.value}>' - >>> OldStyle.RETRO - + ... return f'{cls_name}.{self.name}' + >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" + (OtherStyle.ALTERNATE, 'OtherStyle.ALTERNATE', 'OtherStyle.ALTERNATE') .. method:: Enum.__str__(self) Returns the string used for *str()* calls. By default, returns the - member name, but can be overridden:: + *Enum* name and member name, but can be overridden:: - >>> class OldStyle(Enum): - ... RETRO = auto() - ... OLD_SCHOOl = auto() - ... YESTERYEAR = auto() + >>> class OtherStyle(Enum): + ... ALTERNATE = auto() + ... OTHER = auto() + ... SOMETHING_ELSE = auto() ... def __str__(self): - ... cls_name = self.__class__.__name__ - ... return f'{cls_name}.{self.name}' - >>> OldStyle.RETRO - OldStyle.RETRO + ... return f'{self.name}' + >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" + (, 'ALTERNATE', 'ALTERNATE') + + .. method:: Enum.__format__(self) + + Returns the string used for *format()* and *f-string* calls. By default, + returns :meth:`__str__` returns, but can be overridden:: + + >>> class OtherStyle(Enum): + ... ALTERNATE = auto() + ... OTHER = auto() + ... SOMETHING_ELSE = auto() + ... def __format__(self, spec): + ... return f'{self.name}' + >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" + (, 'OtherStyle.ALTERNATE', 'ALTERNATE') -.. note:: + .. note:: - Using :class:`auto` with :class:`Enum` results in integers of increasing value, - starting with ``1``. + Using :class:`auto` with :class:`Enum` results in integers of increasing value, + starting with ``1``. .. class:: IntEnum @@ -367,7 +384,7 @@ Data Types ... TWO = 2 ... THREE = 3 >>> Numbers.THREE - Numbers.THREE + >>> Numbers.ONE + Numbers.TWO 3 >>> Numbers.THREE + 5 @@ -375,10 +392,14 @@ Data Types >>> Numbers.THREE == 3 True -.. note:: + .. note:: - Using :class:`auto` with :class:`IntEnum` results in integers of increasing value, - starting with ``1``. + Using :class:`auto` with :class:`IntEnum` results in integers of increasing + value, starting with ``1``. + + .. versionchanged:: 3.11 :meth:`__str__` is now :func:`int.__str__` to + better support the *replacement of existing constants* use-case. + :meth:`__format__` was already :func:`int.__format__` for that same reason. .. class:: StrEnum @@ -392,13 +413,16 @@ Data Types instead of ``isinstance(str, unknown)``), and in those locations you will need to use ``str(StrEnum.member)``. + .. note:: -.. note:: + Using :class:`auto` with :class:`StrEnum` results in the lower-cased member + name as the value. - Using :class:`auto` with :class:`StrEnum` results in values of the member name, - lower-cased. + .. note:: :meth:`__str__` is :func:`str.__str__` to better support the + *replacement of existing constants* use-case. :meth:`__format__` is likewise + :func:`int.__format__` for that same reason. -.. versionadded:: 3.11 + .. versionadded:: 3.11 .. class:: Flag @@ -431,9 +455,9 @@ Data Types Returns all contained members:: >>> list(Color.RED) - [Color.RED] + [] >>> list(purple) - [Color.RED, Color.BLUE] + [, ] .. 
method:: __len__(self): @@ -461,42 +485,52 @@ Data Types Returns current flag binary or'ed with other:: >>> Color.RED | Color.GREEN - Color.RED|Color.GREEN + .. method:: __and__(self, other) Returns current flag binary and'ed with other:: >>> purple & white - Color.RED|Color.BLUE + >>> purple & Color.GREEN - 0x0 + .. method:: __xor__(self, other) Returns current flag binary xor'ed with other:: >>> purple ^ white - Color.GREEN + >>> purple ^ Color.GREEN - Color.RED|Color.GREEN|Color.BLUE + .. method:: __invert__(self): Returns all the flags in *type(self)* that are not in self:: >>> ~white - 0x0 + >>> ~purple - Color.GREEN + >>> ~Color.RED - Color.GREEN|Color.BLUE + + + .. method:: _numeric_repr_ + + Function used to format any remaining unnamed numeric values. Default is + the value's repr; common choices are :func:`hex` and :func:`oct`. + + .. note:: -.. note:: + Using :class:`auto` with :class:`Flag` results in integers that are powers + of two, starting with ``1``. - Using :class:`auto` with :class:`Flag` results in integers that are powers - of two, starting with ``1``. + .. versionchanged:: 3.11 The *repr()* of zero-valued flags has changed. It + is now:: + >>> Color(0) + .. class:: IntFlag @@ -509,9 +543,9 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> Color.RED & 2 - 0x0 + >>> Color.RED | 2 - Color.RED|Color.GREEN + If any integer operation is performed with an *IntFlag* member, the result is not an *IntFlag*:: @@ -524,15 +558,25 @@ Data Types * the result is a valid *IntFlag*: an *IntFlag* is returned * the result is not a valid *IntFlag*: the result depends on the *FlagBoundary* setting -.. note:: + The *repr()* of unnamed zero-valued flags has changed. It is now: + + >>> Color(0) + + + .. note:: + + Using :class:`auto` with :class:`IntFlag` results in integers that are powers + of two, starting with ``1``. + + .. versionchanged:: 3.11 :meth:`__str__` is now :func:`int.__str__` to + better support the *replacement of existing constants* use-case. + :meth:`__format__` was already :func:`int.__format__` for that same reason. - Using :class:`auto` with :class:`IntFlag` results in integers that are powers - of two, starting with ``1``. .. class:: EnumCheck *EnumCheck* contains the options used by the :func:`verify` decorator to ensure - various constraints; failed constraints result in a :exc:`TypeError`. + various constraints; failed constraints result in a :exc:`ValueError`. .. attribute:: UNIQUE @@ -582,11 +626,11 @@ Data Types ... ValueError: invalid Flag 'Color': aliases WHITE and NEON are missing combined values of 0x18 [use enum.show_flag_values(value) for details] -.. note:: + .. note:: - CONTINUOUS and NAMED_FLAGS are designed to work with integer-valued members. + CONTINUOUS and NAMED_FLAGS are designed to work with integer-valued members. -.. versionadded:: 3.11 + .. versionadded:: 3.11 .. class:: FlagBoundary @@ -606,7 +650,7 @@ Data Types >>> StrictFlag(2**2 + 2**4) Traceback (most recent call last): ... - ValueError: StrictFlag: invalid value: 20 + ValueError: invalid value 20 given 0b0 10100 allowed 0b0 00111 @@ -621,7 +665,7 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> ConformFlag(2**2 + 2**4) - ConformFlag.BLUE + .. attribute:: EJECT @@ -647,12 +691,52 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> KeepFlag(2**2 + 2**4) - KeepFlag.BLUE|0x10 + .. versionadded:: 3.11 --------------- +Supported ``__dunder__`` names +"""""""""""""""""""""""""""""" + +:attr:`__members__` is a read-only ordered mapping of ``member_name``:``member`` +items. 
It is only available on the class. + +:meth:`__new__`, if specified, must create and return the enum members; it is +also a very good idea to set the member's :attr:`_value_` appropriately. Once +all the members are created it is no longer used. + + +Supported ``_sunder_`` names +"""""""""""""""""""""""""""" + +- ``_name_`` -- name of the member +- ``_value_`` -- value of the member; can be set / modified in ``__new__`` + +- ``_missing_`` -- a lookup function used when a value is not found; may be + overridden +- ``_ignore_`` -- a list of names, either as a :class:`list` or a :class:`str`, + that will not be transformed into members, and will be removed from the final + class +- ``_order_`` -- used in Python 2/3 code to ensure member order is consistent + (class attribute, removed during class creation) +- ``_generate_next_value_`` -- used to get an appropriate value for an enum + member; may be overridden + + .. note:: + + For standard :class:`Enum` classes the next value chosen is the last value seen + incremented by one. + + For :class:`Flag` classes the next value chosen will be the next highest + power-of-two, regardless of the last value seen. + +.. versionadded:: 3.6 ``_missing_``, ``_order_``, ``_generate_next_value_`` +.. versionadded:: 3.7 ``_ignore_`` + +--------------- + Utilities and Decorators ------------------------ @@ -668,15 +752,6 @@ Utilities and Decorators ``_generate_next_value_`` can be overridden to customize the values used by *auto*. -.. decorator:: global_enum - - A :keyword:`class` decorator specifically for enumerations. It replaces the - :meth:`__repr__` method with one that shows *module_name*.*member_name*. It - also injects the members, and their aliases, into the global namespace they - were defined in. - -.. versionadded:: 3.11 - .. decorator:: property A decorator similar to the built-in *property*, but specifically for @@ -688,7 +763,7 @@ Utilities and Decorators *Enum* class, and *Enum* subclasses can define members with the names ``value`` and ``name``. -.. versionadded:: 3.11 + .. versionadded:: 3.11 .. decorator:: unique @@ -714,7 +789,7 @@ Utilities and Decorators :class:`EnumCheck` are used to specify which constraints should be checked on the decorated enumeration. -.. versionadded:: 3.11 + .. versionadded:: 3.11 --------------- @@ -726,14 +801,20 @@ Notes These three enum types are designed to be drop-in replacements for existing integer- and string-based values; as such, they have extra limitations: - - ``format()`` will use the value of the enum member, unless ``__str__`` - has been overridden + - ``__str__`` uses the value and not the name of the enum member - - ``StrEnum.__str__`` uses the value and not the name of the enum member + - ``__format__``, because it uses ``__str__``, will also use the value of + the enum member instead of its name - If you do not need/want those limitations, you can create your own base - class by mixing in the ``int`` or ``str`` type yourself:: + If you do not need/want those limitations, you can either create your own + base class by mixing in the ``int`` or ``str`` type yourself:: >>> from enum import Enum >>> class MyIntEnum(int, Enum): ... pass + + or you can reassign the appropriate :meth:`str`, etc., in your enum:: + + >>> from enum import IntEnum + >>> class MyIntEnum(IntEnum): + ... 
__str__ = IntEnum.__str__ diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst index eb33d7e1778a7..4d8488a4a28de 100644 --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -2070,7 +2070,7 @@ to speed up repeated connections from the same clients. :attr:`SSLContext.verify_flags` returns :class:`VerifyFlags` flags: >>> ssl.create_default_context().verify_flags # doctest: +SKIP - ssl.VERIFY_X509_TRUSTED_FIRST + .. attribute:: SSLContext.verify_mode @@ -2082,7 +2082,7 @@ to speed up repeated connections from the same clients. :attr:`SSLContext.verify_mode` returns :class:`VerifyMode` enum: >>> ssl.create_default_context().verify_mode - ssl.CERT_REQUIRED + .. index:: single: certificates diff --git a/Lib/enum.py b/Lib/enum.py index 93ea1bea36db7..772e1eac0e1e6 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -1,16 +1,16 @@ import sys +import builtins as bltns from types import MappingProxyType, DynamicClassAttribute from operator import or_ as _or_ from functools import reduce -from builtins import property as _bltin_property, bin as _bltin_bin __all__ = [ 'EnumType', 'EnumMeta', - 'Enum', 'IntEnum', 'StrEnum', 'Flag', 'IntFlag', + 'Enum', 'IntEnum', 'StrEnum', 'Flag', 'IntFlag', 'ReprEnum', 'auto', 'unique', 'property', 'verify', 'FlagBoundary', 'STRICT', 'CONFORM', 'EJECT', 'KEEP', - 'global_flag_repr', 'global_enum_repr', 'global_enum', + 'global_flag_repr', 'global_enum_repr', 'global_str', 'global_enum', 'EnumCheck', 'CONTINUOUS', 'NAMED_FLAGS', 'UNIQUE', ] @@ -18,7 +18,7 @@ # Dummy value for Enum and Flag as there are explicit checks for them # before they have been created. # This is also why there are checks in EnumType like `if Enum is not None` -Enum = Flag = EJECT = None +Enum = Flag = EJECT = _stdlib_enums = ReprEnum = None def _is_descriptor(obj): """ @@ -116,9 +116,9 @@ def bin(num, max_bits=None): ceiling = 2 ** (num).bit_length() if num >= 0: - s = _bltin_bin(num + ceiling).replace('1', '0', 1) + s = bltns.bin(num + ceiling).replace('1', '0', 1) else: - s = _bltin_bin(~num ^ (ceiling - 1) + ceiling) + s = bltns.bin(~num ^ (ceiling - 1) + ceiling) sign = s[:3] digits = s[3:] if max_bits is not None: @@ -126,6 +126,19 @@ def bin(num, max_bits=None): digits = (sign[-1] * max_bits + digits)[-max_bits:] return "%s %s" % (sign, digits) +def _dedent(text): + """ + Like textwrap.dedent. Rewritten because we cannot import textwrap. 
+ """ + lines = text.split('\n') + blanks = 0 + for i, ch in enumerate(lines[0]): + if ch != ' ': + break + for j, l in enumerate(lines): + lines[j] = l[i:] + return '\n'.join(lines) + _auto_null = object() class auto: @@ -149,22 +162,12 @@ def __get__(self, instance, ownerclass=None): return ownerclass._member_map_[self.name] except KeyError: raise AttributeError( - '%s: no class attribute %r' % (ownerclass.__name__, self.name) + '%r has no attribute %r' % (ownerclass, self.name) ) else: if self.fget is None: - # check for member - if self.name in ownerclass._member_map_: - import warnings - warnings.warn( - "accessing one member from another is not supported, " - " and will be disabled in 3.12", - DeprecationWarning, - stacklevel=2, - ) - return ownerclass._member_map_[self.name] raise AttributeError( - '%s: no instance attribute %r' % (ownerclass.__name__, self.name) + '%r member has no attribute %r' % (ownerclass, self.name) ) else: return self.fget(instance) @@ -172,7 +175,7 @@ def __get__(self, instance, ownerclass=None): def __set__(self, instance, value): if self.fset is None: raise AttributeError( - "%s: cannot set instance attribute %r" % (self.clsname, self.name) + " cannot set attribute %r" % (self.clsname, self.name) ) else: return self.fset(instance, value) @@ -180,7 +183,7 @@ def __set__(self, instance, value): def __delete__(self, instance): if self.fdel is None: raise AttributeError( - "%s: cannot delete instance attribute %r" % (self.clsname, self.name) + " cannot delete attribute %r" % (self.clsname, self.name) ) else: return self.fdel(instance) @@ -328,7 +331,7 @@ def __setitem__(self, key, value): elif _is_sunder(key): if key not in ( '_order_', - '_generate_next_value_', '_missing_', '_ignore_', + '_generate_next_value_', '_numeric_repr_', '_missing_', '_ignore_', '_iter_member_', '_iter_member_by_value_', '_iter_member_by_def_', ): raise ValueError( @@ -358,13 +361,13 @@ def __setitem__(self, key, value): key = '_order_' elif key in self._member_names: # descriptor overwriting an enum? - raise TypeError('%r already defined as: %r' % (key, self[key])) + raise TypeError('%r already defined as %r' % (key, self[key])) elif key in self._ignore: pass elif not _is_descriptor(value): if key in self: # enum overwriting a descriptor? - raise TypeError('%r already defined as: %r' % (key, self[key])) + raise TypeError('%r already defined as %r' % (key, self[key])) if isinstance(value, auto): if value.value == _auto_null: value.value = self._generate_next_value( @@ -395,7 +398,7 @@ class EnumType(type): @classmethod def __prepare__(metacls, cls, bases, **kwds): # check that previous enum members do not exist - metacls._check_for_existing_members(cls, bases) + metacls._check_for_existing_members_(cls, bases) # create the namespace dict enum_dict = _EnumDict() enum_dict._cls_name = cls @@ -413,9 +416,10 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # inherited __new__ unless a new __new__ is defined (or the resulting # class will fail). # - # remove any keys listed in _ignore_ if _simple: return super().__new__(metacls, cls, bases, classdict, **kwds) + # + # remove any keys listed in _ignore_ classdict.setdefault('_ignore_', []).append('_ignore_') ignore = classdict['_ignore_'] for key in ignore: @@ -427,8 +431,8 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # check for illegal enum names (any others?) 
invalid_names = set(member_names) & {'mro', ''} if invalid_names: - raise ValueError('Invalid enum member name: {0}'.format( - ','.join(invalid_names))) + raise ValueError('invalid enum member name(s) '.format( + ','.join(repr(n) for n in invalid_names))) # # adjust the sunders _order_ = classdict.pop('_order_', None) @@ -458,6 +462,8 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k classdict['_value2member_map_'] = {} classdict['_unhashable_values_'] = [] classdict['_member_type_'] = member_type + # now set the __repr__ for the value + classdict['_value_repr_'] = metacls._find_data_repr_(cls, bases) # # Flag structures (will be removed if final class is not a Flag classdict['_boundary_'] = ( @@ -467,10 +473,6 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k classdict['_flag_mask_'] = flag_mask classdict['_all_bits_'] = 2 ** ((flag_mask).bit_length()) - 1 classdict['_inverted_'] = None - # - # create a default docstring if one has not been provided - if '__doc__' not in classdict: - classdict['__doc__'] = 'An enumeration.' try: exc = None enum_class = super().__new__(metacls, cls, bases, classdict, **kwds) @@ -481,18 +483,140 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k if exc is not None: raise exc # + # update classdict with any changes made by __init_subclass__ + classdict.update(enum_class.__dict__) + # + # create a default docstring if one has not been provided + if enum_class.__doc__ is None: + if not member_names: + enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ + Create a collection of name/value pairs. + + Example enumeration: + + >>> class Color(Enum): + ... RED = 1 + ... BLUE = 2 + ... GREEN = 3 + + Access them by: + + - attribute access:: + + >>> Color.RED + + + - value lookup: + + >>> Color(1) + + + - name lookup: + + >>> Color['RED'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Color) + 3 + + >>> list(Color) + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """) + else: + member = list(enum_class)[0] + enum_length = len(enum_class) + cls_name = enum_class.__name__ + if enum_length == 1: + list_line = 'list(%s)' % cls_name + list_repr = '[<%s.%s: %r>]' % (cls_name, member.name, member.value) + elif enum_length == 2: + member2 = list(enum_class)[1] + list_line = 'list(%s)' % cls_name + list_repr = '[<%s.%s: %r>, <%s.%s: %r>]' % ( + cls_name, member.name, member.value, + cls_name, member2.name, member2.value, + ) + else: + member2 = list(enum_class)[1] + member3 = list(enum_class)[2] + list_line = 'list(%s)%s' % (cls_name, ('','[:3]')[enum_length > 3]) + list_repr = '[<%s.%s: %r>, <%s.%s: %r>, <%s.%s: %r>]' % ( + cls_name, member.name, member.value, + cls_name, member2.name, member2.value, + cls_name, member3.name, member3.value, + ) + enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> %s.%s + <%s.%s: %r> + + - value lookup: + + >>> %s(%r) + <%s.%s: %r> + + - name lookup: + + >>> %s[%r] + <%s.%s: %r> + + Enumerations can be iterated over, and know how many members they have: + + >>> len(%s) + %r + + >>> %s + %s + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. 
+ """ + % (cls_name, member.name, + cls_name, member.name, member.value, + cls_name, member.value, + cls_name, member.name, member.value, + cls_name, member.name, + cls_name, member.name, member.value, + cls_name, enum_length, + list_line, list_repr, + )) + # # double check that repr and friends are not the mixin's or various # things break (such as pickle) # however, if the method is defined in the Enum itself, don't replace # it + # + # Also, special handling for ReprEnum + if ReprEnum is not None and ReprEnum in bases: + if member_type is object: + raise TypeError( + 'ReprEnum subclasses must be mixed with a data type (i.e.' + ' int, str, float, etc.)' + ) + if '__format__' not in classdict: + enum_class.__format__ = member_type.__format__ + classdict['__format__'] = enum_class.__format__ + if '__str__' not in classdict: + method = member_type.__str__ + if method is object.__str__: + # if member_type does not define __str__, object.__str__ will use + # its __repr__ instead, so we'll also use its __repr__ + method = member_type.__repr__ + enum_class.__str__ = method + classdict['__str__'] = enum_class.__str__ for name in ('__repr__', '__str__', '__format__', '__reduce_ex__'): - if name in classdict: - continue - class_method = getattr(enum_class, name) - obj_method = getattr(member_type, name, None) - enum_method = getattr(first_enum, name, None) - if obj_method is not None and obj_method is class_method: - setattr(enum_class, name, enum_method) + if name not in classdict: + setattr(enum_class, name, getattr(first_enum, name)) # # replace any other __new__ with our own (as long as Enum is not None, # anyway) -- again, this is to support pickle @@ -563,13 +687,13 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # _order_ step 4: verify that _order_ and _member_names_ match if _order_ != enum_class._member_names_: raise TypeError( - 'member order does not match _order_:\n%r\n%r' + 'member order does not match _order_:\n %r\n %r' % (enum_class._member_names_, _order_) ) # return enum_class - def __bool__(self): + def __bool__(cls): """ classes/types should always be True. """ @@ -614,6 +738,13 @@ def __call__(cls, value, names=None, *, module=None, qualname=None, type=None, s ) def __contains__(cls, member): + """ + Return True if member is a member of this enum + raises TypeError if member is not an enum member + + note: in 3.12 TypeError will no longer be raised, and True will also be + returned if member is the value of a member in this enum + """ if not isinstance(member, Enum): import warnings warnings.warn( @@ -631,60 +762,33 @@ def __delattr__(cls, attr): # nicer error message when someone tries to delete an attribute # (see issue19025). if attr in cls._member_map_: - raise AttributeError("%s: cannot delete Enum member %r." % (cls.__name__, attr)) + raise AttributeError("%r cannot delete member %r." 
% (cls.__name__, attr)) super().__delattr__(attr) - def __dir__(self): - # Start off with the desired result for dir(Enum) - cls_dir = {'__class__', '__doc__', '__members__', '__module__'} - add_to_dir = cls_dir.add - mro = self.__mro__ - this_module = globals().values() - is_from_this_module = lambda cls: any(cls is thing for thing in this_module) - first_enum_base = next(cls for cls in mro if is_from_this_module(cls)) - enum_dict = Enum.__dict__ - sentinel = object() - # special-case __new__ - ignored = {'__new__', *filter(_is_sunder, enum_dict)} - add_to_ignored = ignored.add - - # We want these added to __dir__ - # if and only if they have been user-overridden - enum_dunders = set(filter(_is_dunder, enum_dict)) - - for cls in mro: - # Ignore any classes defined in this module - if cls is object or is_from_this_module(cls): - continue - - cls_lookup = cls.__dict__ - - # If not an instance of EnumType, - # ensure all attributes excluded from that class's `dir()` are ignored here. - if not isinstance(cls, EnumType): - cls_lookup = set(cls_lookup).intersection(dir(cls)) - - for attr_name in cls_lookup: - # Already seen it? Carry on - if attr_name in cls_dir or attr_name in ignored: - continue - # Sunders defined in Enum.__dict__ are already in `ignored`, - # But sunders defined in a subclass won't be (we want all sunders excluded). - elif _is_sunder(attr_name): - add_to_ignored(attr_name) - # Not an "enum dunder"? Add it to dir() output. - elif attr_name not in enum_dunders: - add_to_dir(attr_name) - # Is an "enum dunder", and is defined by a class from enum.py? Ignore it. - elif getattr(self, attr_name) is getattr(first_enum_base, attr_name, sentinel): - add_to_ignored(attr_name) - # Is an "enum dunder", and is either user-defined or defined by a mixin class? - # Add it to dir() output. - else: - add_to_dir(attr_name) - - # sort the output before returning it, so that the result is deterministic. - return sorted(cls_dir) + def __dir__(cls): + # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ + # on object-based enums + if cls._member_type_ is object: + interesting = set(cls._member_names_) + if cls._new_member_ is not object.__new__: + interesting.add('__new__') + if cls.__init_subclass__ is not object.__init_subclass__: + interesting.add('__init_subclass__') + for method in ('__init__', '__format__', '__repr__', '__str__'): + if getattr(cls, method) not in (getattr(Enum, method), getattr(Flag, method)): + interesting.add(method) + return sorted(set([ + '__class__', '__contains__', '__doc__', '__getitem__', + '__iter__', '__len__', '__members__', '__module__', + '__name__', '__qualname__', + ]) | interesting + ) + else: + # return whatever mixed-in data type has + return sorted(set( + dir(cls._member_type_) + + cls._member_names_ + )) def __getattr__(cls, name): """ @@ -703,18 +807,24 @@ def __getattr__(cls, name): raise AttributeError(name) from None def __getitem__(cls, name): + """ + Return the member matching `name`. + """ return cls._member_map_[name] def __iter__(cls): """ - Returns members in definition order. + Return members in definition order. """ return (cls._member_map_[name] for name in cls._member_names_) def __len__(cls): + """ + Return the number of members (no aliases) + """ return len(cls._member_names_) - @_bltin_property + @bltns.property def __members__(cls): """ Returns a mapping of member name->value. @@ -732,7 +842,7 @@ def __repr__(cls): def __reversed__(cls): """ - Returns members in reverse definition order. 
+ Return members in reverse definition order. """ return (cls._member_map_[name] for name in reversed(cls._member_names_)) @@ -746,7 +856,7 @@ def __setattr__(cls, name, value): """ member_map = cls.__dict__.get('_member_map_', {}) if name in member_map: - raise AttributeError('Cannot reassign member %r.' % (name, )) + raise AttributeError('cannot reassign member %r' % (name, )) super().__setattr__(name, value) def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, start=1, boundary=None): @@ -801,8 +911,7 @@ def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, s return metacls.__new__(metacls, class_name, bases, classdict, boundary=boundary) - def _convert_(cls, name, module, filter, source=None, *, boundary=None): - + def _convert_(cls, name, module, filter, source=None, *, boundary=None, as_global=False): """ Create a new Enum subclass that replaces a collection of global constants """ @@ -834,22 +943,25 @@ def _convert_(cls, name, module, filter, source=None, *, boundary=None): tmp_cls = type(name, (object, ), body) cls = _simple_enum(etype=cls, boundary=boundary or KEEP)(tmp_cls) cls.__reduce_ex__ = _reduce_ex_by_global_name - global_enum(cls) + if as_global: + global_enum(cls) + else: + sys.modules[cls.__module__].__dict__.update(cls.__members__) module_globals[name] = cls return cls - @staticmethod - def _check_for_existing_members(class_name, bases): + @classmethod + def _check_for_existing_members_(mcls, class_name, bases): for chain in bases: for base in chain.__mro__: if issubclass(base, Enum) and base._member_names_: raise TypeError( - "%s: cannot extend enumeration %r" - % (class_name, base.__name__) + " cannot extend %r" + % (class_name, base) ) @classmethod - def _get_mixins_(cls, class_name, bases): + def _get_mixins_(mcls, class_name, bases): """ Returns the type for creating enum members, and the first inherited enum class. @@ -859,30 +971,7 @@ def _get_mixins_(cls, class_name, bases): if not bases: return object, Enum - def _find_data_type(bases): - data_types = set() - for chain in bases: - candidate = None - for base in chain.__mro__: - if base is object: - continue - elif issubclass(base, Enum): - if base._member_type_ is not object: - data_types.add(base._member_type_) - break - elif '__new__' in base.__dict__: - if issubclass(base, Enum): - continue - data_types.add(candidate or base) - break - else: - candidate = candidate or base - if len(data_types) > 1: - raise TypeError('%r: too many data types: %r' % (class_name, data_types)) - elif data_types: - return data_types.pop() - else: - return None + mcls._check_for_existing_members_(class_name, bases) # ensure final parent class is an Enum derivative, find any concrete # data type, and check that Enum has no members @@ -890,12 +979,51 @@ def _find_data_type(bases): if not issubclass(first_enum, Enum): raise TypeError("new enumerations should be created as " "`EnumName([mixin_type, ...] 
[data_type,] enum_type)`") - cls._check_for_existing_members(class_name, bases) - member_type = _find_data_type(bases) or object + member_type = mcls._find_data_type_(class_name, bases) or object return member_type, first_enum - @staticmethod - def _find_new_(classdict, member_type, first_enum): + @classmethod + def _find_data_repr_(mcls, class_name, bases): + for chain in bases: + for base in chain.__mro__: + if base is object: + continue + elif issubclass(base, Enum): + # if we hit an Enum, use it's _value_repr_ + return base._value_repr_ + elif '__repr__' in base.__dict__: + # this is our data repr + return base.__dict__['__repr__'] + return None + + @classmethod + def _find_data_type_(mcls, class_name, bases): + data_types = set() + for chain in bases: + candidate = None + for base in chain.__mro__: + if base is object: + continue + elif issubclass(base, Enum): + if base._member_type_ is not object: + data_types.add(base._member_type_) + break + elif '__new__' in base.__dict__: + if issubclass(base, Enum): + continue + data_types.add(candidate or base) + break + else: + candidate = candidate or base + if len(data_types) > 1: + raise TypeError('too many data types for %r: %r' % (class_name, data_types)) + elif data_types: + return data_types.pop() + else: + return None + + @classmethod + def _find_new_(mcls, classdict, member_type, first_enum): """ Returns the __new__ to be used for creating the enum members. @@ -943,9 +1071,42 @@ def _find_new_(classdict, member_type, first_enum): class Enum(metaclass=EnumType): """ - Generic enumeration. + Create a collection of name/value pairs. + + Example enumeration: + + >>> class Color(Enum): + ... RED = 1 + ... BLUE = 2 + ... GREEN = 3 + + Access them by: + + - attribute access:: + + >>> Color.RED + + + - value lookup: + + >>> Color(1) + - Derive from this class to define new enumerations. + - name lookup: + + >>> Color['RED'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Color) + 3 + + >>> list(Color) + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. """ def __new__(cls, value): @@ -999,6 +1160,9 @@ def __new__(cls, value): exc = None ve_exc = None + def __init__(self, *args, **kwds): + pass + def _generate_next_value_(name, start, count, last_values): """ Generate the next value when not given. 
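A short sketch of how this hook cooperates with ``auto()``, under the behaviour described in the docstring above (the ``Direction`` enum is illustrative, not part of the patch):

    from enum import Enum, auto

    class Direction(Enum):
        # Must be defined before any auto() members; receives the member
        # name, the start value, the count of existing members, and the
        # values assigned so far, and returns the value for this member.
        def _generate_next_value_(name, start, count, last_values):
            return name.lower()

        NORTH = auto()   # value 'north'
        SOUTH = auto()   # value 'south'

    assert Direction.NORTH.value == 'north'
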
@@ -1021,47 +1185,44 @@ def _missing_(cls, value): return None def __repr__(self): - return "%s.%s" % ( self.__class__.__name__, self._name_) + v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ + return "<%s.%s: %s>" % (self.__class__.__name__, self._name_, v_repr(self._value_)) def __str__(self): - return "%s" % (self._name_, ) + return "%s.%s" % (self.__class__.__name__, self._name_, ) def __dir__(self): """ Returns all members and all public methods """ - cls = type(self) - to_exclude = {'__members__', '__init__', '__new__', *cls._member_names_} - filtered_self_dict = (name for name in self.__dict__ if not name.startswith('_')) - return sorted({'name', 'value', *dir(cls), *filtered_self_dict} - to_exclude) + if self.__class__._member_type_ is object: + interesting = set(['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'value']) + else: + interesting = set(object.__dir__(self)) + for name in getattr(self, '__dict__', []): + if name[0] != '_': + interesting.add(name) + for cls in self.__class__.mro(): + for name, obj in cls.__dict__.items(): + if name[0] == '_': + continue + if isinstance(obj, property): + # that's an enum.property + if obj.fget is not None or name not in self._member_map_: + interesting.add(name) + else: + # in case it was added by `dir(self)` + interesting.discard(name) + else: + interesting.add(name) + names = sorted( + set(['__class__', '__doc__', '__eq__', '__hash__', '__module__']) + | interesting + ) + return names def __format__(self, format_spec): - """ - Returns format using actual value type unless __str__ has been overridden. - """ - # mixed-in Enums should use the mixed-in type's __format__, otherwise - # we can get strange results with the Enum name showing up instead of - # the value - # - # pure Enum branch, or branch with __str__ explicitly overridden - str_overridden = type(self).__str__ not in (Enum.__str__, IntEnum.__str__, Flag.__str__) - if self._member_type_ is object or str_overridden: - cls = str - val = str(self) - # mix-in branch - else: - if not format_spec or format_spec in ('{}','{:}'): - import warnings - warnings.warn( - "in 3.12 format() will use the enum member, not the enum member's value;\n" - "use a format specifier, such as :d for an integer-based Enum, to maintain " - "the current display", - DeprecationWarning, - stacklevel=2, - ) - cls = self._member_type_ - val = self._value_ - return cls.__format__(val, format_spec) + return str.__format__(str(self), format_spec) def __hash__(self): return hash(self._name_) @@ -1088,34 +1249,25 @@ def value(self): return self._value_ -class IntEnum(int, Enum): +class ReprEnum(Enum): """ - Enum where members are also (and must be) ints + Only changes the repr(), leaving str() and format() to the mixed-in type. """ - def __str__(self): - return "%s" % (self._name_, ) - def __format__(self, format_spec): - """ - Returns format using actual value unless __str__ has been overridden. 
- """ - str_overridden = type(self).__str__ != IntEnum.__str__ - if str_overridden: - cls = str - val = str(self) - else: - cls = self._member_type_ - val = self._value_ - return cls.__format__(val, format_spec) +class IntEnum(int, ReprEnum): + """ + Enum where members are also (and must be) ints + """ -class StrEnum(str, Enum): +class StrEnum(str, ReprEnum): """ Enum where members are also (and must be) strings """ def __new__(cls, *values): + "values must already be of type `str`" if len(values) > 3: raise TypeError('too many arguments for str(): %r' % (values, )) if len(values) == 1: @@ -1135,10 +1287,6 @@ def __new__(cls, *values): member._value_ = value return member - __str__ = str.__str__ - - __format__ = str.__format__ - def _generate_next_value_(name, start, count, last_values): """ Return the lower-cased version of the member name. @@ -1169,6 +1317,8 @@ class Flag(Enum, boundary=STRICT): Support for flags """ + _numeric_repr_ = repr + def _generate_next_value_(name, start, count, last_values): """ Generate the next value when not given. @@ -1184,7 +1334,7 @@ def _generate_next_value_(name, start, count, last_values): try: high_bit = _high_bit(last_value) except Exception: - raise TypeError('Invalid Flag value: %r' % last_value) from None + raise TypeError('invalid flag value %r' % last_value) from None return 2 ** (high_bit+1) @classmethod @@ -1232,8 +1382,8 @@ def _missing_(cls, value): if cls._boundary_ is STRICT: max_bits = max(value.bit_length(), flag_mask.bit_length()) raise ValueError( - "%s: invalid value: %r\n given %s\n allowed %s" % ( - cls.__name__, value, bin(value, max_bits), bin(flag_mask, max_bits), + "%r invalid value %r\n given %s\n allowed %s" % ( + cls, value, bin(value, max_bits), bin(flag_mask, max_bits), )) elif cls._boundary_ is CONFORM: value = value & flag_mask @@ -1247,7 +1397,7 @@ def _missing_(cls, value): ) else: raise ValueError( - 'unknown flag boundary: %r' % (cls._boundary_, ) + '%r unknown flag boundary %r' % (cls, cls._boundary_, ) ) if value < 0: neg_value = value @@ -1274,7 +1424,7 @@ def _missing_(cls, value): m._name_ for m in cls._iter_member_(member_value) ]) if unknown: - pseudo_member._name_ += '|0x%x' % unknown + pseudo_member._name_ += '|%s' % cls._numeric_repr_(unknown) else: pseudo_member._name_ = None # use setdefault in case another thread already created a composite @@ -1292,10 +1442,8 @@ def __contains__(self, other): """ if not isinstance(other, self.__class__): raise TypeError( - "unsupported operand type(s) for 'in': '%s' and '%s'" % ( + "unsupported operand type(s) for 'in': %r and %r" % ( type(other).__qualname__, self.__class__.__qualname__)) - if other._value_ == 0 or self._value_ == 0: - return False return other._value_ & self._value_ == other._value_ def __iter__(self): @@ -1309,27 +1457,18 @@ def __len__(self): def __repr__(self): cls_name = self.__class__.__name__ + v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ if self._name_ is None: - return "0x%x" % (self._value_, ) - if _is_single_bit(self._value_): - return '%s.%s' % (cls_name, self._name_) - if self._boundary_ is not FlagBoundary.KEEP: - return '%s.' % cls_name + ('|%s.' 
% cls_name).join(self.name.split('|')) + return "<%s: %s>" % (cls_name, v_repr(self._value_)) else: - name = [] - for n in self._name_.split('|'): - if n.startswith('0'): - name.append(n) - else: - name.append('%s.%s' % (cls_name, n)) - return '|'.join(name) + return "<%s.%s: %s>" % (cls_name, self._name_, v_repr(self._value_)) def __str__(self): - cls = self.__class__ + cls_name = self.__class__.__name__ if self._name_ is None: - return '%s(%x)' % (cls.__name__, self._value_) + return '%s(%r)' % (cls_name, self._value_) else: - return self._name_ + return "%s.%s" % (cls_name, self._name_) def __bool__(self): return bool(self._value_) @@ -1362,20 +1501,11 @@ def __invert__(self): return self._inverted_ -class IntFlag(int, Flag, boundary=EJECT): +class IntFlag(int, ReprEnum, Flag, boundary=EJECT): """ Support for integer-based Flags """ - def __format__(self, format_spec): - """ - Returns format using actual value unless __str__ has been overridden. - """ - str_overridden = type(self).__str__ != Flag.__str__ - value = self - if not str_overridden: - value = self._value_ - return int.__format__(value, format_spec) def __or__(self, other): if isinstance(other, self.__class__): @@ -1412,6 +1542,7 @@ def __xor__(self, other): __rxor__ = __xor__ __invert__ = Flag.__invert__ + def _high_bit(value): """ returns index of highest bit, or -1 if value is zero or negative @@ -1456,7 +1587,7 @@ def global_flag_repr(self): module = self.__class__.__module__.split('.')[-1] cls_name = self.__class__.__name__ if self._name_ is None: - return "%s.%s(0x%x)" % (module, cls_name, self._value_) + return "%s.%s(%r)" % (module, cls_name, self._value_) if _is_single_bit(self): return '%s.%s' % (module, self._name_) if self._boundary_ is not FlagBoundary.KEEP: @@ -1464,14 +1595,22 @@ def global_flag_repr(self): else: name = [] for n in self._name_.split('|'): - if n.startswith('0'): + if n[0].isdigit(): name.append(n) else: name.append('%s.%s' % (module, n)) return '|'.join(name) +def global_str(self): + """ + use enum_name instead of class.enum_name + """ + if self._name_ is None: + return "%s(%r)" % (cls_name, self._value_) + else: + return self._name_ -def global_enum(cls): +def global_enum(cls, update_str=False): """ decorator that makes the repr() of an enum member reference its module instead of its class; also exports all members to the enum's module's @@ -1481,6 +1620,8 @@ def global_enum(cls): cls.__repr__ = global_flag_repr else: cls.__repr__ = global_enum_repr + if not issubclass(cls, ReprEnum) or update_str: + cls.__str__ = global_str sys.modules[cls.__module__].__dict__.update(cls.__members__) return cls @@ -1522,6 +1663,7 @@ def convert_class(cls): body['_value2member_map_'] = value2member_map = {} body['_unhashable_values_'] = [] body['_member_type_'] = member_type = etype._member_type_ + body['_value_repr_'] = etype._value_repr_ if issubclass(etype, Flag): body['_boundary_'] = boundary or etype._boundary_ body['_flag_mask_'] = None @@ -1543,13 +1685,8 @@ def convert_class(cls): # it enum_class = type(cls_name, (etype, ), body, boundary=boundary, _simple=True) for name in ('__repr__', '__str__', '__format__', '__reduce_ex__'): - if name in body: - continue - class_method = getattr(enum_class, name) - obj_method = getattr(member_type, name, None) - enum_method = getattr(etype, name, None) - if obj_method is not None and obj_method is class_method: - setattr(enum_class, name, enum_method) + if name not in body: + setattr(enum_class, name, getattr(etype, name)) gnv_last_values = [] if 
issubclass(enum_class, Flag): # Flag / IntFlag @@ -1760,8 +1897,8 @@ def _test_simple_enum(checked_enum, simple_enum): + list(simple_enum._member_map_.keys()) ) for key in set(checked_keys + simple_keys): - if key in ('__module__', '_member_map_', '_value2member_map_'): - # keys known to be different + if key in ('__module__', '_member_map_', '_value2member_map_', '__doc__'): + # keys known to be different, or very long continue elif key in member_names: # members are checked below @@ -1882,3 +2019,5 @@ def _old_convert_(etype, name, module, filter, source=None, *, boundary=None): cls.__reduce_ex__ = _reduce_ex_by_global_name cls.__repr__ = global_enum_repr return cls + +_stdlib_enums = IntEnum, StrEnum, IntFlag diff --git a/Lib/inspect.py b/Lib/inspect.py index 5d33f0d445fb9..8236698b8de0f 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -2567,15 +2567,21 @@ class _empty: class _ParameterKind(enum.IntEnum): - POSITIONAL_ONLY = 0 - POSITIONAL_OR_KEYWORD = 1 - VAR_POSITIONAL = 2 - KEYWORD_ONLY = 3 - VAR_KEYWORD = 4 + POSITIONAL_ONLY = 'positional-only' + POSITIONAL_OR_KEYWORD = 'positional or keyword' + VAR_POSITIONAL = 'variadic positional' + KEYWORD_ONLY = 'keyword-only' + VAR_KEYWORD = 'variadic keyword' + + def __new__(cls, description): + value = len(cls.__members__) + member = int.__new__(cls, value) + member._value_ = value + member.description = description + return member - @property - def description(self): - return _PARAM_NAME_MAPPING[self] + def __str__(self): + return self.name _POSITIONAL_ONLY = _ParameterKind.POSITIONAL_ONLY _POSITIONAL_OR_KEYWORD = _ParameterKind.POSITIONAL_OR_KEYWORD @@ -2583,14 +2589,6 @@ def description(self): _KEYWORD_ONLY = _ParameterKind.KEYWORD_ONLY _VAR_KEYWORD = _ParameterKind.VAR_KEYWORD -_PARAM_NAME_MAPPING = { - _POSITIONAL_ONLY: 'positional-only', - _POSITIONAL_OR_KEYWORD: 'positional or keyword', - _VAR_POSITIONAL: 'variadic positional', - _KEYWORD_ONLY: 'keyword-only', - _VAR_KEYWORD: 'variadic keyword' -} - class Parameter: """Represents a parameter in a function signature. 
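The ``_ParameterKind`` hunk above uses a reusable pattern: an ``IntEnum`` whose ``__new__`` derives the integer value from the number of members defined so far and stores extra per-member data. A stand-alone sketch of the same idea (the ``Severity`` enum is hypothetical):

    from enum import IntEnum

    class Severity(IntEnum):
        def __new__(cls, description):
            # Number the members 0, 1, 2, ... in definition order and keep
            # the human-readable text on the member itself.
            value = len(cls.__members__)
            member = int.__new__(cls, value)
            member._value_ = value
            member.description = description
            return member

        LOW = 'minor issue'
        HIGH = 'major issue'

    assert Severity.HIGH == 1 and Severity.HIGH.description == 'major issue'
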
diff --git a/Lib/plistlib.py b/Lib/plistlib.py index 3ab71edc320af..4862355b2252c 100644 --- a/Lib/plistlib.py +++ b/Lib/plistlib.py @@ -61,7 +61,8 @@ from xml.parsers.expat import ParserCreate -PlistFormat = enum.global_enum(enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__)) +PlistFormat = enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__) +globals().update(PlistFormat.__members__) class UID: diff --git a/Lib/re.py b/Lib/re.py index ea41217ce08c2..a7ab9b3706748 100644 --- a/Lib/re.py +++ b/Lib/re.py @@ -155,6 +155,8 @@ class RegexFlag: # sre extensions (experimental, don't rely on these) TEMPLATE = T = sre_compile.SRE_FLAG_TEMPLATE # disable backtracking DEBUG = sre_compile.SRE_FLAG_DEBUG # dump pattern after compilation + __str__ = object.__str__ + _numeric_repr_ = hex # sre exception error = sre_compile.error diff --git a/Lib/ssl.py b/Lib/ssl.py index 207925166efa3..dafb70a67864c 100644 --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -119,7 +119,6 @@ ) from _ssl import _DEFAULT_CIPHERS, _OPENSSL_API_VERSION - _IntEnum._convert_( '_SSLMethod', __name__, lambda name: name.startswith('PROTOCOL_') and name != 'PROTOCOL_SSLv23', diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 43f98c1c1efb6..a0953fb960f33 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -6,15 +6,18 @@ import sys import unittest import threading +import builtins as bltns from collections import OrderedDict +from datetime import date from enum import Enum, IntEnum, StrEnum, EnumType, Flag, IntFlag, unique, auto from enum import STRICT, CONFORM, EJECT, KEEP, _simple_enum, _test_simple_enum -from enum import verify, UNIQUE, CONTINUOUS, NAMED_FLAGS +from enum import verify, UNIQUE, CONTINUOUS, NAMED_FLAGS, ReprEnum from io import StringIO from pickle import dumps, loads, PicklingError, HIGHEST_PROTOCOL from test import support from test.support import ALWAYS_EQ from test.support import threading_helper +from textwrap import dedent from datetime import timedelta python_version = sys.version_info[:2] @@ -107,6 +110,12 @@ def test_pickle_exception(assertion, exception, obj): class TestHelpers(unittest.TestCase): # _is_descriptor, _is_sunder, _is_dunder + sunder_names = '_bad_', '_good_', '_what_ho_' + dunder_names = '__mal__', '__bien__', '__que_que__' + private_names = '_MyEnum__private', '_MyEnum__still_private' + private_and_sunder_names = '_MyEnum__private_', '_MyEnum__also_private_' + random_names = 'okay', '_semi_private', '_weird__', '_MyEnum__' + def test_is_descriptor(self): class foo: pass @@ -116,21 +125,36 @@ class foo: setattr(obj, attr, 1) self.assertTrue(enum._is_descriptor(obj)) - def test_is_sunder(self): + def test_sunder(self): + for name in self.sunder_names + self.private_and_sunder_names: + self.assertTrue(enum._is_sunder(name), '%r is a not sunder name?' % name) + for name in self.dunder_names + self.private_names + self.random_names: + self.assertFalse(enum._is_sunder(name), '%r is a sunder name?' % name) for s in ('_a_', '_aa_'): self.assertTrue(enum._is_sunder(s)) - for s in ('a', 'a_', '_a', '__a', 'a__', '__a__', '_a__', '__a_', '_', '__', '___', '____', '_____',): self.assertFalse(enum._is_sunder(s)) - def test_is_dunder(self): + def test_dunder(self): + for name in self.dunder_names: + self.assertTrue(enum._is_dunder(name), '%r is a not dunder name?' % name) + for name in self.sunder_names + self.private_names + self.private_and_sunder_names + self.random_names: + self.assertFalse(enum._is_dunder(name), '%r is a dunder name?' 
% name) for s in ('__a__', '__aa__'): self.assertTrue(enum._is_dunder(s)) for s in ('a', 'a_', '_a', '__a', 'a__', '_a_', '_a__', '__a_', '_', '__', '___', '____', '_____',): self.assertFalse(enum._is_dunder(s)) + + def test_is_private(self): + for name in self.private_names + self.private_and_sunder_names: + self.assertTrue(enum._is_private('MyEnum', name), '%r is a not private name?') + for name in self.sunder_names + self.dunder_names + self.random_names: + self.assertFalse(enum._is_private('MyEnum', name), '%r is a private name?') + + # for subclassing tests class classproperty: @@ -166,473 +190,658 @@ class HeadlightsC(IntFlag, boundary=enum.CONFORM): # tests -class TestEnum(unittest.TestCase): +class _EnumTests: + """ + Test for behavior that is the same across the different types of enumerations. + """ + + values = None def setUp(self): - class Season(Enum): - SPRING = 1 - SUMMER = 2 - AUTUMN = 3 - WINTER = 4 - self.Season = Season + class BaseEnum(self.enum_type): + @enum.property + def first(self): + return '%s is first!' % self.name + class MainEnum(BaseEnum): + first = auto() + second = auto() + third = auto() + if issubclass(self.enum_type, Flag): + dupe = 3 + else: + dupe = third + self.MainEnum = MainEnum + # + class NewStrEnum(self.enum_type): + def __str__(self): + return self.name.upper() + first = auto() + self.NewStrEnum = NewStrEnum + # + class NewFormatEnum(self.enum_type): + def __format__(self, spec): + return self.name.upper() + first = auto() + self.NewFormatEnum = NewFormatEnum + # + class NewStrFormatEnum(self.enum_type): + def __str__(self): + return self.name.title() + def __format__(self, spec): + return ''.join(reversed(self.name)) + first = auto() + self.NewStrFormatEnum = NewStrFormatEnum + # + class NewBaseEnum(self.enum_type): + def __str__(self): + return self.name.title() + def __format__(self, spec): + return ''.join(reversed(self.name)) + class NewSubEnum(NewBaseEnum): + first = auto() + self.NewSubEnum = NewSubEnum + # + self.is_flag = False + self.names = ['first', 'second', 'third'] + if issubclass(MainEnum, StrEnum): + self.values = self.names + elif MainEnum._member_type_ is str: + self.values = ['1', '2', '3'] + elif issubclass(self.enum_type, Flag): + self.values = [1, 2, 4] + self.is_flag = True + self.dupe2 = MainEnum(5) + else: + self.values = self.values or [1, 2, 3] + # + if not getattr(self, 'source_values', False): + self.source_values = self.values - class Konstants(float, Enum): - E = 2.7182818 - PI = 3.1415926 - TAU = 2 * PI - self.Konstants = Konstants + def assertFormatIsValue(self, spec, member): + self.assertEqual(spec.format(member), spec.format(member.value)) - class Grades(IntEnum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - self.Grades = Grades + def assertFormatIsStr(self, spec, member): + self.assertEqual(spec.format(member), spec.format(str(member))) - class Directional(str, Enum): - EAST = 'east' - WEST = 'west' - NORTH = 'north' - SOUTH = 'south' - self.Directional = Directional + def test_attribute_deletion(self): + class Season(self.enum_type): + SPRING = auto() + SUMMER = auto() + AUTUMN = auto() + # + def spam(cls): + pass + # + self.assertTrue(hasattr(Season, 'spam')) + del Season.spam + self.assertFalse(hasattr(Season, 'spam')) + # + with self.assertRaises(AttributeError): + del Season.SPRING + with self.assertRaises(AttributeError): + del Season.DRY + with self.assertRaises(AttributeError): + del Season.SPRING.name - from datetime import date - class Holiday(date, Enum): - NEW_YEAR = 2013, 1, 1 - IDES_OF_MARCH = 
2013, 3, 15 - self.Holiday = Holiday + def test_basics(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(repr(TE), "") + self.assertEqual(str(TE), "") + self.assertEqual(format(TE), "") + self.assertTrue(TE(5) is self.dupe2) + else: + self.assertEqual(repr(TE), "") + self.assertEqual(str(TE), "") + self.assertEqual(format(TE), "") + self.assertEqual(list(TE), [TE.first, TE.second, TE.third]) + self.assertEqual( + [m.name for m in TE], + self.names, + ) + self.assertEqual( + [m.value for m in TE], + self.values, + ) + self.assertEqual( + [m.first for m in TE], + ['first is first!', 'second is first!', 'third is first!'] + ) + for member, name in zip(TE, self.names, strict=True): + self.assertIs(TE[name], member) + for member, value in zip(TE, self.values, strict=True): + self.assertIs(TE(value), member) + if issubclass(TE, StrEnum): + self.assertTrue(TE.dupe is TE('third') is TE['dupe']) + elif TE._member_type_ is str: + self.assertTrue(TE.dupe is TE('3') is TE['dupe']) + elif issubclass(TE, Flag): + self.assertTrue(TE.dupe is TE(3) is TE['dupe']) + else: + self.assertTrue(TE.dupe is TE(self.values[2]) is TE['dupe']) - class DateEnum(date, Enum): pass - self.DateEnum = DateEnum + def test_bool_is_true(self): + class Empty(self.enum_type): + pass + self.assertTrue(Empty) + # + self.assertTrue(self.MainEnum) + for member in self.MainEnum: + self.assertTrue(member) - class FloatEnum(float, Enum): pass - self.FloatEnum = FloatEnum + def test_changing_member_fails(self): + MainEnum = self.MainEnum + with self.assertRaises(AttributeError): + self.MainEnum.second = 'really first' - class Wowser(Enum): - this = 'that' - these = 'those' - def wowser(self): - """Wowser docstring""" - return ("Wowser! I'm %s!" % self.name) - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.Wowser = Wowser - - class IntWowser(IntEnum): - this = 1 - these = 2 - def wowser(self): - """Wowser docstring""" - return ("Wowser! I'm %s!" % self.name) - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.IntWowser = IntWowser - - class FloatWowser(float, Enum): - this = 3.14 - these = 4.2 - def wowser(self): - """Wowser docstring""" - return ("Wowser! I'm %s!" 
% self.name) - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.FloatWowser = FloatWowser + @unittest.skipIf( + python_version >= (3, 12), + '__contains__ now returns True/False for all inputs', + ) + def test_contains_er(self): + MainEnum = self.MainEnum + self.assertIn(MainEnum.third, MainEnum) + with self.assertRaises(TypeError): + with self.assertWarns(DeprecationWarning): + self.source_values[1] in MainEnum + with self.assertRaises(TypeError): + with self.assertWarns(DeprecationWarning): + 'first' in MainEnum + val = MainEnum.dupe + self.assertIn(val, MainEnum) + # + class OtherEnum(Enum): + one = auto() + two = auto() + self.assertNotIn(OtherEnum.two, MainEnum) - class WowserNoMembers(Enum): - def wowser(self): pass - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - class SubclassOfWowserNoMembers(WowserNoMembers): pass - self.WowserNoMembers = WowserNoMembers - self.SubclassOfWowserNoMembers = SubclassOfWowserNoMembers - - class IntWowserNoMembers(IntEnum): - def wowser(self): pass - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.IntWowserNoMembers = IntWowserNoMembers + @unittest.skipIf( + python_version < (3, 12), + '__contains__ works only with enum memmbers before 3.12', + ) + def test_contains_tf(self): + MainEnum = self.MainEnum + self.assertIn(MainEnum.first, MainEnum) + self.assertTrue(self.source_values[0] in MainEnum) + self.assertFalse('first' in MainEnum) + val = MainEnum.dupe + self.assertIn(val, MainEnum) + # + class OtherEnum(Enum): + one = auto() + two = auto() + self.assertNotIn(OtherEnum.two, MainEnum) - class FloatWowserNoMembers(float, Enum): - def wowser(self): pass - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.FloatWowserNoMembers = FloatWowserNoMembers - - class EnumWithInit(Enum): - def __init__(self, greeting, farewell): - self.greeting = greeting - self.farewell = farewell - ENGLISH = 'hello', 'goodbye' - GERMAN = 'Guten Morgen', 'Auf Wiedersehen' - def some_method(self): pass - self.EnumWithInit = EnumWithInit + def test_dir_on_class(self): + TE = self.MainEnum + self.assertEqual(set(dir(TE)), set(enum_dir(TE))) + + def test_dir_on_item(self): + TE = self.MainEnum + self.assertEqual(set(dir(TE.first)), set(member_dir(TE.first))) + + def test_dir_with_added_behavior(self): + class Test(self.enum_type): + this = auto() + these = auto() + def wowser(self): + return ("Wowser! I'm %s!" % self.name) + self.assertTrue('wowser' not in dir(Test)) + self.assertTrue('wowser' in dir(Test.this)) + def test_dir_on_sub_with_behavior_on_super(self): # see issue22506 - class SuperEnum1(Enum): + class SuperEnum(self.enum_type): def invisible(self): return "did you see me?" 
- class SubEnum1(SuperEnum1): - sample = 5 - self.SubEnum1 = SubEnum1 + class SubEnum(SuperEnum): + sample = auto() + self.assertTrue('invisible' not in dir(SubEnum)) + self.assertTrue('invisible' in dir(SubEnum.sample)) - class SuperEnum2(IntEnum): - def __new__(cls, value, description=""): - obj = int.__new__(cls, value) - obj._value_ = value - obj.description = description + def test_dir_on_sub_with_behavior_including_instance_dict_on_super(self): + # see issue40084 + class SuperEnum(self.enum_type): + def __new__(cls, *value, **kwds): + new = self.enum_type._member_type_.__new__ + if self.enum_type._member_type_ is object: + obj = new(cls) + else: + if isinstance(value[0], tuple): + create_value ,= value[0] + else: + create_value = value + obj = new(cls, *create_value) + obj._value_ = value[0] if len(value) == 1 else value + obj.description = 'test description' return obj - class SubEnum2(SuperEnum2): - sample = 5 - self.SubEnum2 = SubEnum2 - - def test_dir_basics_for_all_enums(self): - enums_for_tests = ( - # Generic enums in enum.py - Enum, - IntEnum, - StrEnum, - # Generic enums defined outside of enum.py - self.DateEnum, - self.FloatEnum, - # Concrete enums derived from enum.py generics - self.Grades, - self.Season, - # Concrete enums derived from generics defined outside of enum.py - self.Konstants, - self.Holiday, - # Standard enum with added behaviour & members - self.Wowser, - # Mixin-enum-from-enum.py with added behaviour & members - self.IntWowser, - # Mixin-enum-from-oustide-enum.py with added behaviour & members - self.FloatWowser, - # Equivalents of the three immediately above, but with no members - self.WowserNoMembers, - self.IntWowserNoMembers, - self.FloatWowserNoMembers, - # Enum with members and an __init__ method - self.EnumWithInit, - # Special cases to test - self.SubEnum1, - self.SubEnum2 - ) - - for cls in enums_for_tests: - with self.subTest(cls=cls): - cls_dir = dir(cls) - # test that dir is deterministic - self.assertEqual(cls_dir, dir(cls)) - # test that dir is sorted - self.assertEqual(list(cls_dir), sorted(cls_dir)) - # test that there are no dupes in dir - self.assertEqual(len(cls_dir), len(set(cls_dir))) - # test that there are no sunders in dir - self.assertFalse(any(enum._is_sunder(attr) for attr in cls_dir)) - self.assertNotIn('__new__', cls_dir) - - for attr in ('__class__', '__doc__', '__members__', '__module__'): - with self.subTest(attr=attr): - self.assertIn(attr, cls_dir) - - def test_dir_for_enum_with_members(self): - enums_for_test = ( - # Enum with members - self.Season, - # IntEnum with members - self.Grades, - # Two custom-mixin enums with members - self.Konstants, - self.Holiday, - # several enums-with-added-behaviour and members - self.Wowser, - self.IntWowser, - self.FloatWowser, - # An enum with an __init__ method and members - self.EnumWithInit, - # Special cases to test - self.SubEnum1, - self.SubEnum2 - ) - - for cls in enums_for_test: - cls_dir = dir(cls) - member_names = cls._member_names_ - with self.subTest(cls=cls): - self.assertTrue(all(member_name in cls_dir for member_name in member_names)) - for member in cls: - member_dir = dir(member) - # test that dir is deterministic - self.assertEqual(member_dir, dir(member)) - # test that dir is sorted - self.assertEqual(list(member_dir), sorted(member_dir)) - # test that there are no dupes in dir - self.assertEqual(len(member_dir), len(set(member_dir))) - - for attr_name in cls_dir: - with self.subTest(attr_name=attr_name): - if attr_name in {'__members__', '__init__', '__new__', 
*member_names}: - self.assertNotIn(attr_name, member_dir) - else: - self.assertIn(attr_name, member_dir) - - self.assertFalse(any(enum._is_sunder(attr) for attr in member_dir)) - - def test_dir_for_enums_with_added_behaviour(self): - enums_for_test = ( - self.Wowser, - self.IntWowser, - self.FloatWowser, - self.WowserNoMembers, - self.SubclassOfWowserNoMembers, - self.IntWowserNoMembers, - self.FloatWowserNoMembers - ) - - for cls in enums_for_test: - with self.subTest(cls=cls): - self.assertIn('wowser', dir(cls)) - self.assertIn('classmethod_wowser', dir(cls)) - self.assertIn('staticmethod_wowser', dir(cls)) - self.assertTrue(all( - all(attr in dir(member) for attr in ('wowser', 'classmethod_wowser', 'staticmethod_wowser')) - for member in cls - )) + class SubEnum(SuperEnum): + sample = self.source_values[1] + self.assertTrue('description' not in dir(SubEnum)) + self.assertTrue('description' in dir(SubEnum.sample), dir(SubEnum.sample)) - self.assertEqual(dir(self.WowserNoMembers), dir(self.SubclassOfWowserNoMembers)) - # Check classmethods are present - self.assertIn('from_bytes', dir(self.IntWowser)) - self.assertIn('from_bytes', dir(self.IntWowserNoMembers)) - - def test_help_output_on_enum_members(self): - added_behaviour_enums = ( - self.Wowser, - self.IntWowser, - self.FloatWowser - ) - - for cls in added_behaviour_enums: - with self.subTest(cls=cls): - rendered_doc = pydoc.render_doc(cls.this) - self.assertIn('Wowser docstring', rendered_doc) - if cls in {self.IntWowser, self.FloatWowser}: - self.assertIn('float(self)', rendered_doc) - - def test_dir_for_enum_with_init(self): - EnumWithInit = self.EnumWithInit - - cls_dir = dir(EnumWithInit) - self.assertIn('__init__', cls_dir) - self.assertIn('some_method', cls_dir) - self.assertNotIn('greeting', cls_dir) - self.assertNotIn('farewell', cls_dir) - - member_dir = dir(EnumWithInit.ENGLISH) - self.assertNotIn('__init__', member_dir) - self.assertIn('some_method', member_dir) - self.assertIn('greeting', member_dir) - self.assertIn('farewell', member_dir) - - def test_mixin_dirs(self): - from datetime import date + def test_enum_in_enum_out(self): + Main = self.MainEnum + self.assertIs(Main(Main.first), Main.first) - enums_for_test = ( - # generic mixins from enum.py - (IntEnum, int), - (StrEnum, str), - # generic mixins from outside enum.py - (self.FloatEnum, float), - (self.DateEnum, date), - # concrete mixin from enum.py - (self.Grades, int), - # concrete mixin from outside enum.py - (self.Holiday, date), - # concrete mixin from enum.py with added behaviour - (self.IntWowser, int), - # concrete mixin from outside enum.py with added behaviour - (self.FloatWowser, float) - ) - - enum_dict = Enum.__dict__ - enum_dir = dir(Enum) - enum_module_names = enum.__all__ - is_from_enum_module = lambda cls: cls.__name__ in enum_module_names - is_enum_dunder = lambda attr: enum._is_dunder(attr) and attr in enum_dict - - def attr_is_inherited_from_object(cls, attr_name): - for base in cls.__mro__: - if attr_name in base.__dict__: - return base is object - return False - - # General tests - for enum_cls, mixin_cls in enums_for_test: - with self.subTest(enum_cls=enum_cls): - cls_dir = dir(enum_cls) - cls_dict = enum_cls.__dict__ - - mixin_attrs = [ - x for x in dir(mixin_cls) - if not attr_is_inherited_from_object(cls=mixin_cls, attr_name=x) - ] + def test_hash(self): + MainEnum = self.MainEnum + mapping = {} + mapping[MainEnum.first] = '1225' + mapping[MainEnum.second] = '0315' + mapping[MainEnum.third] = '0704' + 
self.assertEqual(mapping[MainEnum.second], '0315') + + def test_invalid_names(self): + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + mro = 9 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _create_= 11 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _get_mixins_ = 9 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _find_new_ = 1 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _any_name_ = 9 + + def test_object_str_override(self): + "check that setting __str__ to object's is not reset to Enum's" + class Generic(self.enum_type): + item = self.source_values[2] + def __repr__(self): + return "%s.test" % (self._name_, ) + __str__ = object.__str__ + self.assertEqual(str(Generic.item), 'item.test') + + def test_overridden_str(self): + NS = self.NewStrEnum + self.assertEqual(str(NS.first), NS.first.name.upper()) + self.assertEqual(format(NS.first), NS.first.name.upper()) - first_enum_base = next( - base for base in enum_cls.__mro__ - if is_from_enum_module(base) + def test_overridden_str_format(self): + NSF = self.NewStrFormatEnum + self.assertEqual(str(NSF.first), NSF.first.name.title()) + self.assertEqual(format(NSF.first), ''.join(reversed(NSF.first.name))) + + def test_overridden_str_format_inherited(self): + NSE = self.NewSubEnum + self.assertEqual(str(NSE.first), NSE.first.name.title()) + self.assertEqual(format(NSE.first), ''.join(reversed(NSE.first.name))) + + def test_programmatic_function_string(self): + MinorEnum = self.enum_type('MinorEnum', 'june july august') + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, ) + values = self.values + if self.enum_type is StrEnum: + values = ['june','july','august'] + for month, av in zip('june july august'.split(), values): + e = MinorEnum[month] + self.assertEqual(e.value, av, list(MinorEnum)) + self.assertEqual(e.name, month) + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - for attr in mixin_attrs: - with self.subTest(attr=attr): - if enum._is_sunder(attr): - # Unlikely, but no harm in testing - self.assertNotIn(attr, cls_dir) - elif attr in {'__class__', '__doc__', '__members__', '__module__'}: - self.assertIn(attr, cls_dir) - elif is_enum_dunder(attr): - if is_from_enum_module(enum_cls): - self.assertNotIn(attr, cls_dir) - elif getattr(enum_cls, attr) is getattr(first_enum_base, attr): - self.assertNotIn(attr, cls_dir) - else: - self.assertIn(attr, cls_dir) - else: - self.assertIn(attr, cls_dir) - - # Some specific examples - int_enum_dir = dir(IntEnum) - self.assertIn('imag', int_enum_dir) - self.assertIn('__rfloordiv__', int_enum_dir) - self.assertNotIn('__format__', int_enum_dir) - self.assertNotIn('__hash__', int_enum_dir) - self.assertNotIn('__init_subclass__', int_enum_dir) - self.assertNotIn('__subclasshook__', int_enum_dir) - - class OverridesFormatOutsideEnumModule(Enum): - def __format__(self, *args, **kwargs): - return super().__format__(*args, **kwargs) - SOME_MEMBER = 1 - - self.assertIn('__format__', dir(OverridesFormatOutsideEnumModule)) - self.assertIn('__format__', dir(OverridesFormatOutsideEnumModule.SOME_MEMBER)) + def 
test_programmatic_function_string_list(self): + MinorEnum = self.enum_type('MinorEnum', ['june', 'july', 'august']) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + values = self.values + if self.enum_type is StrEnum: + values = ['june','july','august'] + for month, av in zip('june july august'.split(), values): + e = MinorEnum[month] + self.assertEqual(e.value, av) + self.assertEqual(e.name, month) + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - def test_dir_on_sub_with_behavior_on_super(self): - # see issue22506 + def test_programmatic_function_iterable(self): + MinorEnum = self.enum_type( + 'MinorEnum', + (('june', self.source_values[0]), ('july', self.source_values[1]), ('august', self.source_values[2])) + ) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) self.assertEqual( - set(dir(self.SubEnum1.sample)), - set(['__class__', '__doc__', '__module__', 'name', 'value', 'invisible']), + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, ) + for month, av in zip('june july august'.split(), self.values): + e = MinorEnum[month] + self.assertEqual(e.value, av) + self.assertEqual(e.name, month) + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - def test_dir_on_sub_with_behavior_including_instance_dict_on_super(self): - # see issue40084 - self.assertTrue({'description'} <= set(dir(self.SubEnum2.sample))) + def test_programmatic_function_from_dict(self): + MinorEnum = self.enum_type( + 'MinorEnum', + OrderedDict((('june', self.source_values[0]), ('july', self.source_values[1]), ('august', self.source_values[2]))) + ) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for month, av in zip('june july august'.split(), self.values): + e = MinorEnum[month] + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - def test_enum_in_enum_out(self): - Season = self.Season - self.assertIs(Season(Season.WINTER), Season.WINTER) + def test_repr(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(repr(TE(0)), "") + self.assertEqual(repr(TE.dupe), "") + self.assertEqual(repr(self.dupe2), "") + elif issubclass(TE, StrEnum): + self.assertEqual(repr(TE.dupe), "") + else: + self.assertEqual(repr(TE.dupe), "" % (self.values[2], ), TE._value_repr_) + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(repr(member), "" % (member.name, member.value)) - def test_enum_value(self): - Season = self.Season - self.assertEqual(Season.SPRING.value, 1) + def test_repr_override(self): + class Generic(self.enum_type): + first = auto() + second = auto() + third = auto() + def 
__repr__(self): + return "don't you just love shades of %s?" % self.name + self.assertEqual( + repr(Generic.third), + "don't you just love shades of third?", + ) - def test_intenum_value(self): - self.assertEqual(IntStooges.CURLY.value, 2) + def test_inherited_repr(self): + class MyEnum(self.enum_type): + def __repr__(self): + return "My name is %s." % self.name + class MySubEnum(MyEnum): + this = auto() + that = auto() + theother = auto() + self.assertEqual(repr(MySubEnum.that), "My name is that.") - def test_enum(self): - Season = self.Season - lst = list(Season) - self.assertEqual(len(lst), len(Season)) - self.assertEqual(len(Season), 4, Season) + def test_reversed_iteration_order(self): self.assertEqual( - [Season.SPRING, Season.SUMMER, Season.AUTUMN, Season.WINTER], lst) + list(reversed(self.MainEnum)), + [self.MainEnum.third, self.MainEnum.second, self.MainEnum.first], + ) - for i, season in enumerate('SPRING SUMMER AUTUMN WINTER'.split(), 1): - e = Season(i) - self.assertEqual(e, getattr(Season, season)) - self.assertEqual(e.value, i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, season) - self.assertIn(e, Season) - self.assertIs(type(e), Season) - self.assertIsInstance(e, Season) - self.assertEqual(str(e), season) - self.assertEqual(repr(e), 'Season.{0}'.format(season)) - - def test_value_name(self): - Season = self.Season - self.assertEqual(Season.SPRING.name, 'SPRING') - self.assertEqual(Season.SPRING.value, 1) - with self.assertRaises(AttributeError): - Season.SPRING.name = 'invierno' - with self.assertRaises(AttributeError): - Season.SPRING.value = 2 +class _PlainOutputTests: - def test_changing_member(self): - Season = self.Season - with self.assertRaises(AttributeError): - Season.WINTER = 'really cold' + def test_str(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(str(TE.dupe), "MainEnum.dupe") + self.assertEqual(str(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(str(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(str(member), "MainEnum.%s" % (member.name, )) - def test_attribute_deletion(self): - class Season(Enum): - SPRING = 1 - SUMMER = 2 - AUTUMN = 3 - WINTER = 4 + def test_format(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(format(TE.dupe), "MainEnum.dupe") + self.assertEqual(format(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(format(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(format(member), "MainEnum.%s" % (member.name, )) - def spam(cls): - pass + def test_overridden_format(self): + NF = self.NewFormatEnum + self.assertEqual(str(NF.first), "NewFormatEnum.first", '%s %r' % (NF.__str__, NF.first)) + self.assertEqual(format(NF.first), "FIRST") - self.assertTrue(hasattr(Season, 'spam')) - del Season.spam - self.assertFalse(hasattr(Season, 'spam')) + def test_format_specs(self): + TE = self.MainEnum + self.assertFormatIsStr('{}', TE.second) + self.assertFormatIsStr('{:}', TE.second) + self.assertFormatIsStr('{:20}', TE.second) + self.assertFormatIsStr('{:^20}', TE.second) + self.assertFormatIsStr('{:>20}', TE.second) + self.assertFormatIsStr('{:<20}', TE.second) + self.assertFormatIsStr('{:5.2}', TE.second) - with self.assertRaises(AttributeError): - del Season.SPRING - with self.assertRaises(AttributeError): - del Season.DRY - with self.assertRaises(AttributeError): - del Season.SPRING.name - def test_bool_of_class(self): - class 
Empty(Enum): - pass - self.assertTrue(bool(Empty)) +class _MixedOutputTests: - def test_bool_of_member(self): - class Count(Enum): - zero = 0 - one = 1 - two = 2 - for member in Count: - self.assertTrue(bool(member)) + def test_str(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(str(TE.dupe), "MainEnum.dupe") + self.assertEqual(str(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(str(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(str(member), "MainEnum.%s" % (member.name, )) + + def test_format(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(format(TE.dupe), "MainEnum.dupe") + self.assertEqual(format(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(format(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(format(member), "MainEnum.%s" % (member.name, )) + + def test_overridden_format(self): + NF = self.NewFormatEnum + self.assertEqual(str(NF.first), "NewFormatEnum.first") + self.assertEqual(format(NF.first), "FIRST") + + def test_format_specs(self): + TE = self.MainEnum + self.assertFormatIsStr('{}', TE.first) + self.assertFormatIsStr('{:}', TE.first) + self.assertFormatIsStr('{:20}', TE.first) + self.assertFormatIsStr('{:^20}', TE.first) + self.assertFormatIsStr('{:>20}', TE.first) + self.assertFormatIsStr('{:<20}', TE.first) + self.assertFormatIsStr('{:5.2}', TE.first) + + +class _MinimalOutputTests: + + def test_str(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(str(TE.dupe), "3") + self.assertEqual(str(self.dupe2), "5") + else: + self.assertEqual(str(TE.dupe), str(self.values[2])) + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(str(member), str(value)) + + def test_format(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(format(TE.dupe), "3") + self.assertEqual(format(self.dupe2), "5") + else: + self.assertEqual(format(TE.dupe), format(self.values[2])) + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(format(member), format(value)) + + def test_overridden_format(self): + NF = self.NewFormatEnum + self.assertEqual(str(NF.first), str(self.values[0])) + self.assertEqual(format(NF.first), "FIRST") + + def test_format_specs(self): + TE = self.MainEnum + self.assertFormatIsValue('{}', TE.third) + self.assertFormatIsValue('{:}', TE.third) + self.assertFormatIsValue('{:20}', TE.third) + self.assertFormatIsValue('{:^20}', TE.third) + self.assertFormatIsValue('{:>20}', TE.third) + self.assertFormatIsValue('{:<20}', TE.third) + if TE._member_type_ is float: + self.assertFormatIsValue('{:n}', TE.third) + self.assertFormatIsValue('{:5.2}', TE.third) + self.assertFormatIsValue('{:f}', TE.third) + + +class _FlagTests: + + def test_default_missing_with_wrong_type_value(self): + with self.assertRaisesRegex( + ValueError, + "'RED' is not a valid TestFlag.Color", + ) as ctx: + self.MainEnum('RED') + self.assertIs(ctx.exception.__context__, None) + +class TestPlainEnum(_EnumTests, _PlainOutputTests, unittest.TestCase): + enum_type = Enum + + +class TestPlainFlag(_EnumTests, _PlainOutputTests, unittest.TestCase): + enum_type = Flag + + +class TestIntEnum(_EnumTests, _MinimalOutputTests, unittest.TestCase): + enum_type = IntEnum + + +class TestStrEnum(_EnumTests, _MinimalOutputTests, unittest.TestCase): + enum_type = StrEnum + + +class TestIntFlag(_EnumTests, 
_MinimalOutputTests, unittest.TestCase): + enum_type = IntFlag + + +class TestMixedInt(_EnumTests, _MixedOutputTests, unittest.TestCase): + class enum_type(int, Enum): pass + + +class TestMixedStr(_EnumTests, _MixedOutputTests, unittest.TestCase): + class enum_type(str, Enum): pass + + +class TestMixedIntFlag(_EnumTests, _MixedOutputTests, unittest.TestCase): + class enum_type(int, Flag): pass + + +class TestMixedDate(_EnumTests, _MixedOutputTests, unittest.TestCase): + + values = [date(2021, 12, 25), date(2020, 3, 15), date(2019, 11, 27)] + source_values = [(2021, 12, 25), (2020, 3, 15), (2019, 11, 27)] + + class enum_type(date, Enum): + def _generate_next_value_(name, start, count, last_values): + values = [(2021, 12, 25), (2020, 3, 15), (2019, 11, 27)] + return values[count] + + +class TestMinimalDate(_EnumTests, _MinimalOutputTests, unittest.TestCase): + + values = [date(2023, 12, 1), date(2016, 2, 29), date(2009, 1, 1)] + source_values = [(2023, 12, 1), (2016, 2, 29), (2009, 1, 1)] + + class enum_type(date, ReprEnum): + def _generate_next_value_(name, start, count, last_values): + values = [(2023, 12, 1), (2016, 2, 29), (2009, 1, 1)] + return values[count] - def test_invalid_names(self): - with self.assertRaises(ValueError): - class Wrong(Enum): - mro = 9 - with self.assertRaises(ValueError): - class Wrong(Enum): - _create_= 11 - with self.assertRaises(ValueError): - class Wrong(Enum): - _get_mixins_ = 9 - with self.assertRaises(ValueError): - class Wrong(Enum): - _find_new_ = 1 - with self.assertRaises(ValueError): - class Wrong(Enum): - _any_name_ = 9 + +class TestMixedFloat(_EnumTests, _MixedOutputTests, unittest.TestCase): + + values = [1.1, 2.2, 3.3] + + class enum_type(float, Enum): + def _generate_next_value_(name, start, count, last_values): + values = [1.1, 2.2, 3.3] + return values[count] + + +class TestMinimalFloat(_EnumTests, _MinimalOutputTests, unittest.TestCase): + + values = [4.4, 5.5, 6.6] + + class enum_type(float, ReprEnum): + def _generate_next_value_(name, start, count, last_values): + values = [4.4, 5.5, 6.6] + return values[count] + + +class TestSpecial(unittest.TestCase): + """ + various operations that are not attributable to every possible enum + """ + + def setUp(self): + class Season(Enum): + SPRING = 1 + SUMMER = 2 + AUTUMN = 3 + WINTER = 4 + self.Season = Season + # + class Grades(IntEnum): + A = 5 + B = 4 + C = 3 + D = 2 + F = 0 + self.Grades = Grades + # + class Directional(str, Enum): + EAST = 'east' + WEST = 'west' + NORTH = 'north' + SOUTH = 'south' + self.Directional = Directional + # + from datetime import date + class Holiday(date, Enum): + NEW_YEAR = 2013, 1, 1 + IDES_OF_MARCH = 2013, 3, 15 + self.Holiday = Holiday def test_bool(self): # plain Enum members are always True @@ -656,92 +865,56 @@ class IntLogic(int, Enum): self.assertTrue(IntLogic.true) self.assertFalse(IntLogic.false) - @unittest.skipIf( - python_version >= (3, 12), - '__contains__ now returns True/False for all inputs', - ) - def test_contains_er(self): - Season = self.Season - self.assertIn(Season.AUTUMN, Season) - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 3 in Season - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'AUTUMN' in Season - val = Season(3) - self.assertIn(val, Season) - # - class OtherEnum(Enum): - one = 1; two = 2 - self.assertNotIn(OtherEnum.two, Season) - - @unittest.skipIf( - python_version < (3, 12), - '__contains__ only works with enum memmbers before 3.12', - ) - def 
test_contains_tf(self): - Season = self.Season - self.assertIn(Season.AUTUMN, Season) - self.assertTrue(3 in Season) - self.assertFalse('AUTUMN' in Season) - val = Season(3) - self.assertIn(val, Season) - # - class OtherEnum(Enum): - one = 1; two = 2 - self.assertNotIn(OtherEnum.two, Season) - def test_comparisons(self): Season = self.Season with self.assertRaises(TypeError): Season.SPRING < Season.WINTER with self.assertRaises(TypeError): Season.SPRING > 4 - + # self.assertNotEqual(Season.SPRING, 1) - + # class Part(Enum): SPRING = 1 CLIP = 2 BARREL = 3 - + # self.assertNotEqual(Season.SPRING, Part.SPRING) with self.assertRaises(TypeError): Season.SPRING < Part.CLIP - def test_enum_duplicates(self): - class Season(Enum): - SPRING = 1 - SUMMER = 2 - AUTUMN = FALL = 3 - WINTER = 4 - ANOTHER_SPRING = 1 - lst = list(Season) - self.assertEqual( - lst, - [Season.SPRING, Season.SUMMER, - Season.AUTUMN, Season.WINTER, - ]) - self.assertIs(Season.FALL, Season.AUTUMN) - self.assertEqual(Season.FALL.value, 3) - self.assertEqual(Season.AUTUMN.value, 3) - self.assertIs(Season(3), Season.AUTUMN) - self.assertIs(Season(1), Season.SPRING) - self.assertEqual(Season.FALL.name, 'AUTUMN') - self.assertEqual( - [k for k,v in Season.__members__.items() if v.name != k], - ['FALL', 'ANOTHER_SPRING'], - ) + def test_dir_with_custom_dunders(self): + class PlainEnum(Enum): + pass + cls_dir = dir(PlainEnum) + self.assertNotIn('__repr__', cls_dir) + self.assertNotIn('__str__', cls_dir) + self.assertNotIn('__repr__', cls_dir) + self.assertNotIn('__repr__', cls_dir) + # + class MyEnum(Enum): + def __repr__(self): + return object.__repr__(self) + def __str__(self): + return object.__repr__(self) + def __format__(self): + return object.__repr__(self) + def __init__(self): + pass + cls_dir = dir(MyEnum) + self.assertIn('__repr__', cls_dir) + self.assertIn('__str__', cls_dir) + self.assertIn('__repr__', cls_dir) + self.assertIn('__repr__', cls_dir) - def test_duplicate_name(self): + def test_duplicate_name_error(self): with self.assertRaises(TypeError): class Color(Enum): red = 1 green = 2 blue = 3 red = 4 - + # with self.assertRaises(TypeError): class Color(Enum): red = 1 @@ -749,232 +922,45 @@ class Color(Enum): blue = 3 def red(self): return 'red' - + # with self.assertRaises(TypeError): class Color(Enum): - @property + @enum.property def red(self): return 'redder' - red = 1 - green = 2 - blue = 3 - - def test_reserved__sunder_(self): - with self.assertRaisesRegex( - ValueError, - '_sunder_ names, such as ._bad_., are reserved', - ): - class Bad(Enum): - _bad_ = 1 + red = 1 + green = 2 + blue = 3 + + def test_enum_function_with_qualname(self): + if isinstance(Theory, Exception): + raise Theory + self.assertEqual(Theory.__qualname__, 'spanish_inquisition') def test_enum_with_value_name(self): class Huh(Enum): name = 1 value = 2 - self.assertEqual( - list(Huh), - [Huh.name, Huh.value], - ) + self.assertEqual(list(Huh), [Huh.name, Huh.value]) self.assertIs(type(Huh.name), Huh) self.assertEqual(Huh.name.name, 'name') self.assertEqual(Huh.name.value, 1) - def test_format_enum(self): - Season = self.Season - self.assertEqual('{}'.format(Season.SPRING), - '{}'.format(str(Season.SPRING))) - self.assertEqual( '{:}'.format(Season.SPRING), - '{:}'.format(str(Season.SPRING))) - self.assertEqual('{:20}'.format(Season.SPRING), - '{:20}'.format(str(Season.SPRING))) - self.assertEqual('{:^20}'.format(Season.SPRING), - '{:^20}'.format(str(Season.SPRING))) - self.assertEqual('{:>20}'.format(Season.SPRING), - 
'{:>20}'.format(str(Season.SPRING))) - self.assertEqual('{:<20}'.format(Season.SPRING), - '{:<20}'.format(str(Season.SPRING))) - - def test_str_override_enum(self): - class EnumWithStrOverrides(Enum): - one = auto() - two = auto() - - def __str__(self): - return 'Str!' - self.assertEqual(str(EnumWithStrOverrides.one), 'Str!') - self.assertEqual('{}'.format(EnumWithStrOverrides.one), 'Str!') - - def test_format_override_enum(self): - class EnumWithFormatOverride(Enum): - one = 1.0 - two = 2.0 - def __format__(self, spec): - return 'Format!!' - self.assertEqual(str(EnumWithFormatOverride.one), 'one') - self.assertEqual('{}'.format(EnumWithFormatOverride.one), 'Format!!') - - def test_str_and_format_override_enum(self): - class EnumWithStrFormatOverrides(Enum): - one = auto() - two = auto() - def __str__(self): - return 'Str!' - def __format__(self, spec): - return 'Format!' - self.assertEqual(str(EnumWithStrFormatOverrides.one), 'Str!') - self.assertEqual('{}'.format(EnumWithStrFormatOverrides.one), 'Format!') - - def test_str_override_mixin(self): - class MixinEnumWithStrOverride(float, Enum): - one = 1.0 - two = 2.0 - def __str__(self): - return 'Overridden!' - self.assertEqual(str(MixinEnumWithStrOverride.one), 'Overridden!') - self.assertEqual('{}'.format(MixinEnumWithStrOverride.one), 'Overridden!') - - def test_str_and_format_override_mixin(self): - class MixinWithStrFormatOverrides(float, Enum): - one = 1.0 - two = 2.0 - def __str__(self): - return 'Str!' - def __format__(self, spec): - return 'Format!' - self.assertEqual(str(MixinWithStrFormatOverrides.one), 'Str!') - self.assertEqual('{}'.format(MixinWithStrFormatOverrides.one), 'Format!') - - def test_format_override_mixin(self): - class TestFloat(float, Enum): - one = 1.0 - two = 2.0 - def __format__(self, spec): - return 'TestFloat success!' 
- self.assertEqual(str(TestFloat.one), 'one') - self.assertEqual('{}'.format(TestFloat.one), 'TestFloat success!') - - @unittest.skipIf( - python_version < (3, 12), - 'mixin-format is still using member.value', - ) - def test_mixin_format_warning(self): - class Grades(int, Enum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - self.assertEqual(f'{self.Grades.B}', 'B') - - @unittest.skipIf( - python_version >= (3, 12), - 'mixin-format now uses member instead of member.value', - ) - def test_mixin_format_warning(self): - class Grades(int, Enum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - with self.assertWarns(DeprecationWarning): - self.assertEqual(f'{Grades.B}', '4') - - def assertFormatIsValue(self, spec, member): - if python_version < (3, 12) and (not spec or spec in ('{}','{:}')): - with self.assertWarns(DeprecationWarning): - self.assertEqual(spec.format(member), spec.format(member.value)) - else: - self.assertEqual(spec.format(member), spec.format(member.value)) - - def test_format_enum_date(self): - Holiday = self.Holiday - self.assertFormatIsValue('{}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:^20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:>20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:<20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:%Y %m}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:%Y %m %M:00}', Holiday.IDES_OF_MARCH) - - def test_format_enum_float(self): - Konstants = self.Konstants - self.assertFormatIsValue('{}', Konstants.TAU) - self.assertFormatIsValue('{:}', Konstants.TAU) - self.assertFormatIsValue('{:20}', Konstants.TAU) - self.assertFormatIsValue('{:^20}', Konstants.TAU) - self.assertFormatIsValue('{:>20}', Konstants.TAU) - self.assertFormatIsValue('{:<20}', Konstants.TAU) - self.assertFormatIsValue('{:n}', Konstants.TAU) - self.assertFormatIsValue('{:5.2}', Konstants.TAU) - self.assertFormatIsValue('{:f}', Konstants.TAU) - - def test_format_enum_int(self): - class Grades(int, Enum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - self.assertFormatIsValue('{}', Grades.C) - self.assertFormatIsValue('{:}', Grades.C) - self.assertFormatIsValue('{:20}', Grades.C) - self.assertFormatIsValue('{:^20}', Grades.C) - self.assertFormatIsValue('{:>20}', Grades.C) - self.assertFormatIsValue('{:<20}', Grades.C) - self.assertFormatIsValue('{:+}', Grades.C) - self.assertFormatIsValue('{:08X}', Grades.C) - self.assertFormatIsValue('{:b}', Grades.C) - - def test_format_enum_str(self): - Directional = self.Directional - self.assertFormatIsValue('{}', Directional.WEST) - self.assertFormatIsValue('{:}', Directional.WEST) - self.assertFormatIsValue('{:20}', Directional.WEST) - self.assertFormatIsValue('{:^20}', Directional.WEST) - self.assertFormatIsValue('{:>20}', Directional.WEST) - self.assertFormatIsValue('{:<20}', Directional.WEST) - - def test_object_str_override(self): - class Colors(Enum): - RED, GREEN, BLUE = 1, 2, 3 - def __repr__(self): - return "test.%s" % (self._name_, ) - __str__ = object.__str__ - self.assertEqual(str(Colors.RED), 'test.RED') - - def test_enum_str_override(self): - class MyStrEnum(Enum): - def __str__(self): - return 'MyStr' - class MyMethodEnum(Enum): - def hello(self): - return 'Hello! 
My name is %s' % self.name - class Test1Enum(MyMethodEnum, int, MyStrEnum): - One = 1 - Two = 2 - self.assertTrue(Test1Enum._member_type_ is int) - self.assertEqual(str(Test1Enum.One), 'MyStr') - self.assertEqual(format(Test1Enum.One, ''), 'MyStr') - # - class Test2Enum(MyStrEnum, MyMethodEnum): - One = 1 - Two = 2 - self.assertEqual(str(Test2Enum.One), 'MyStr') - self.assertEqual(format(Test1Enum.One, ''), 'MyStr') - def test_inherited_data_type(self): class HexInt(int): + __qualname__ = 'HexInt' def __repr__(self): return hex(self) class MyEnum(HexInt, enum.Enum): + __qualname__ = 'MyEnum' A = 1 B = 2 C = 3 - def __repr__(self): - return '<%s.%s: %r>' % (self.__class__.__name__, self._name_, self._value_) self.assertEqual(repr(MyEnum.A), '') + globals()['HexInt'] = HexInt + globals()['MyEnum'] = MyEnum + test_pickle_dump_load(self.assertIs, MyEnum.A) + test_pickle_dump_load(self.assertIs, MyEnum) # class SillyInt(HexInt): __qualname__ = 'SillyInt' @@ -990,7 +976,7 @@ class MyOtherEnum(SillyInt, enum.Enum): test_pickle_dump_load(self.assertIs, MyOtherEnum.E) test_pickle_dump_load(self.assertIs, MyOtherEnum) # - # This did not work in 3.9, but does now with pickling by name + # This did not work in 3.10, but does now with pickling by name class UnBrokenInt(int): __qualname__ = 'UnBrokenInt' def __new__(cls, value): @@ -1007,6 +993,124 @@ class MyUnBrokenEnum(UnBrokenInt, Enum): test_pickle_dump_load(self.assertIs, MyUnBrokenEnum.I) test_pickle_dump_load(self.assertIs, MyUnBrokenEnum) + def test_floatenum_fromhex(self): + h = float.hex(FloatStooges.MOE.value) + self.assertIs(FloatStooges.fromhex(h), FloatStooges.MOE) + h = float.hex(FloatStooges.MOE.value + 0.01) + with self.assertRaises(ValueError): + FloatStooges.fromhex(h) + + def test_programmatic_function_type(self): + MinorEnum = Enum('MinorEnum', 'june july august', type=int) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_string_with_start(self): + MinorEnum = Enum('MinorEnum', 'june july august', start=10) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 10): + e = MinorEnum(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_type_with_start(self): + MinorEnum = Enum('MinorEnum', 'june july august', type=int, start=30) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 30): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_string_list_with_start(self): + MinorEnum = Enum('MinorEnum', ['june', 'july', 'august'], start=20) + lst = list(MinorEnum) + 
self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 20): + e = MinorEnum(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_type_from_subclass(self): + MinorEnum = IntEnum('MinorEnum', 'june july august') + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_type_from_subclass_with_start(self): + MinorEnum = IntEnum('MinorEnum', 'june july august', start=40) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 40): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_intenum_from_bytes(self): + self.assertIs(IntStooges.from_bytes(b'\x00\x03', 'big'), IntStooges.MOE) + with self.assertRaises(ValueError): + IntStooges.from_bytes(b'\x00\x05', 'big') + + def test_reserved_sunder_error(self): + with self.assertRaisesRegex( + ValueError, + '_sunder_ names, such as ._bad_., are reserved', + ): + class Bad(Enum): + _bad_ = 1 + def test_too_many_data_types(self): with self.assertRaisesRegex(TypeError, 'too many data types'): class Huh(str, int, Enum): @@ -1022,122 +1126,6 @@ def repr(self): class Huh(MyStr, MyInt, Enum): One = 1 - def test_value_auto_assign(self): - class Some(Enum): - def __new__(cls, val): - return object.__new__(cls) - x = 1 - y = 2 - - self.assertEqual(Some.x.value, 1) - self.assertEqual(Some.y.value, 2) - - def test_hash(self): - Season = self.Season - dates = {} - dates[Season.WINTER] = '1225' - dates[Season.SPRING] = '0315' - dates[Season.SUMMER] = '0704' - dates[Season.AUTUMN] = '1031' - self.assertEqual(dates[Season.AUTUMN], '1031') - - def test_intenum_from_scratch(self): - class phy(int, Enum): - pi = 3 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_intenum_inherited(self): - class IntEnum(int, Enum): - pass - class phy(IntEnum): - pi = 3 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_floatenum_from_scratch(self): - class phy(float, Enum): - pi = 3.1415926 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_floatenum_inherited(self): - class FloatEnum(float, Enum): - pass - class phy(FloatEnum): - pi = 3.1415926 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_strenum_from_scratch(self): - class phy(str, Enum): - pi = 'Pi' - tau = 'Tau' - self.assertTrue(phy.pi < phy.tau) - - def test_strenum_inherited_methods(self): - class phy(StrEnum): - pi = 'Pi' - tau = 'Tau' - self.assertTrue(phy.pi < phy.tau) - self.assertEqual(phy.pi.upper(), 'PI') - self.assertEqual(phy.tau.count('a'), 1) - - def test_intenum(self): - class WeekDay(IntEnum): - SUNDAY = 1 - MONDAY = 2 - TUESDAY = 3 - WEDNESDAY = 4 - THURSDAY = 
5 - FRIDAY = 6 - SATURDAY = 7 - - self.assertEqual(['a', 'b', 'c'][WeekDay.MONDAY], 'c') - self.assertEqual([i for i in range(WeekDay.TUESDAY)], [0, 1, 2]) - - lst = list(WeekDay) - self.assertEqual(len(lst), len(WeekDay)) - self.assertEqual(len(WeekDay), 7) - target = 'SUNDAY MONDAY TUESDAY WEDNESDAY THURSDAY FRIDAY SATURDAY' - target = target.split() - for i, weekday in enumerate(target, 1): - e = WeekDay(i) - self.assertEqual(e, i) - self.assertEqual(int(e), i) - self.assertEqual(e.name, weekday) - self.assertIn(e, WeekDay) - self.assertEqual(lst.index(e)+1, i) - self.assertTrue(0 < e < 8) - self.assertIs(type(e), WeekDay) - self.assertIsInstance(e, int) - self.assertIsInstance(e, Enum) - - def test_intenum_duplicates(self): - class WeekDay(IntEnum): - SUNDAY = 1 - MONDAY = 2 - TUESDAY = TEUSDAY = 3 - WEDNESDAY = 4 - THURSDAY = 5 - FRIDAY = 6 - SATURDAY = 7 - self.assertIs(WeekDay.TEUSDAY, WeekDay.TUESDAY) - self.assertEqual(WeekDay(3).name, 'TUESDAY') - self.assertEqual([k for k,v in WeekDay.__members__.items() - if v.name != k], ['TEUSDAY', ]) - - def test_intenum_from_bytes(self): - self.assertIs(IntStooges.from_bytes(b'\x00\x03', 'big'), IntStooges.MOE) - with self.assertRaises(ValueError): - IntStooges.from_bytes(b'\x00\x05', 'big') - - def test_floatenum_fromhex(self): - h = float.hex(FloatStooges.MOE.value) - self.assertIs(FloatStooges.fromhex(h), FloatStooges.MOE) - h = float.hex(FloatStooges.MOE.value + 0.01) - with self.assertRaises(ValueError): - FloatStooges.fromhex(h) def test_pickle_enum(self): if isinstance(Stooges, Exception): @@ -1169,12 +1157,7 @@ def test_pickle_enum_function_with_module(self): test_pickle_dump_load(self.assertIs, Question.who) test_pickle_dump_load(self.assertIs, Question) - def test_enum_function_with_qualname(self): - if isinstance(Theory, Exception): - raise Theory - self.assertEqual(Theory.__qualname__, 'spanish_inquisition') - - def test_class_nested_enum_and_pickle_protocol_four(self): + def test_pickle_nested_class(self): # would normally just have this directly in the class namespace class NestedEnum(Enum): twigs = 'common' @@ -1192,225 +1175,46 @@ class ReplaceGlobalInt(IntEnum): for proto in range(HIGHEST_PROTOCOL): self.assertEqual(ReplaceGlobalInt.TWO.__reduce_ex__(proto), 'TWO') - def test_exploding_pickle(self): + def test_pickle_explodes(self): BadPickle = Enum( 'BadPickle', 'dill sweet bread-n-butter', module=__name__) globals()['BadPickle'] = BadPickle # now break BadPickle to test exception raising enum._make_class_unpicklable(BadPickle) - test_pickle_exception(self.assertRaises, TypeError, BadPickle.dill) - test_pickle_exception(self.assertRaises, PicklingError, BadPickle) - - def test_string_enum(self): - class SkillLevel(str, Enum): - master = 'what is the sound of one hand clapping?' - journeyman = 'why did the chicken cross the road?' - apprentice = 'knock, knock!' 
- self.assertEqual(SkillLevel.apprentice, 'knock, knock!') - - def test_getattr_getitem(self): - class Period(Enum): - morning = 1 - noon = 2 - evening = 3 - night = 4 - self.assertIs(Period(2), Period.noon) - self.assertIs(getattr(Period, 'night'), Period.night) - self.assertIs(Period['morning'], Period.morning) - - def test_getattr_dunder(self): - Season = self.Season - self.assertTrue(getattr(Season, '__eq__')) - - def test_iteration_order(self): - class Season(Enum): - SUMMER = 2 - WINTER = 4 - AUTUMN = 3 - SPRING = 1 - self.assertEqual( - list(Season), - [Season.SUMMER, Season.WINTER, Season.AUTUMN, Season.SPRING], - ) - - def test_reversed_iteration_order(self): - self.assertEqual( - list(reversed(self.Season)), - [self.Season.WINTER, self.Season.AUTUMN, self.Season.SUMMER, - self.Season.SPRING] - ) - - def test_programmatic_function_string(self): - SummerMonth = Enum('SummerMonth', 'june july august') - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_string_with_start(self): - SummerMonth = Enum('SummerMonth', 'june july august', start=10) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 10): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_string_list(self): - SummerMonth = Enum('SummerMonth', ['june', 'july', 'august']) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_string_list_with_start(self): - SummerMonth = Enum('SummerMonth', ['june', 'july', 'august'], start=20) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 20): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_iterable(self): - SummerMonth = Enum( - 'SummerMonth', - (('june', 1), ('july', 2), ('august', 3)) - ) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in 
enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_from_dict(self): - SummerMonth = Enum( - 'SummerMonth', - OrderedDict((('june', 1), ('july', 2), ('august', 3))) - ) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + test_pickle_exception(self.assertRaises, TypeError, BadPickle.dill) + test_pickle_exception(self.assertRaises, PicklingError, BadPickle) - def test_programmatic_function_type(self): - SummerMonth = Enum('SummerMonth', 'june july august', type=int) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + def test_string_enum(self): + class SkillLevel(str, Enum): + master = 'what is the sound of one hand clapping?' + journeyman = 'why did the chicken cross the road?' + apprentice = 'knock, knock!' + self.assertEqual(SkillLevel.apprentice, 'knock, knock!') - def test_programmatic_function_type_with_start(self): - SummerMonth = Enum('SummerMonth', 'june july august', type=int, start=30) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 30): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + def test_getattr_getitem(self): + class Period(Enum): + morning = 1 + noon = 2 + evening = 3 + night = 4 + self.assertIs(Period(2), Period.noon) + self.assertIs(getattr(Period, 'night'), Period.night) + self.assertIs(Period['morning'], Period.morning) - def test_programmatic_function_type_from_subclass(self): - SummerMonth = IntEnum('SummerMonth', 'june july august') - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + def test_getattr_dunder(self): + Season = self.Season + self.assertTrue(getattr(Season, '__eq__')) - def test_programmatic_function_type_from_subclass_with_start(self): - SummerMonth = IntEnum('SummerMonth', 'june july august', start=40) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) + def test_iteration_order(self): + class 
Season(Enum): + SUMMER = 2 + WINTER = 4 + AUTUMN = 3 + SPRING = 1 self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, + list(Season), + [Season.SUMMER, Season.WINTER, Season.AUTUMN, Season.SPRING], ) - for i, month in enumerate('june july august'.split(), 40): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) def test_subclassing(self): if isinstance(Name, Exception): @@ -1425,15 +1229,18 @@ class Color(Enum): red = 1 green = 2 blue = 3 + # with self.assertRaises(TypeError): class MoreColor(Color): cyan = 4 magenta = 5 yellow = 6 - with self.assertRaisesRegex(TypeError, "EvenMoreColor: cannot extend enumeration 'Color'"): + # + with self.assertRaisesRegex(TypeError, " cannot extend "): class EvenMoreColor(Color, IntEnum): chartruese = 7 - with self.assertRaisesRegex(TypeError, "Foo: cannot extend enumeration 'Color'"): + # + with self.assertRaisesRegex(TypeError, " cannot extend "): Color('Foo', ('pink', 'black')) def test_exclude_methods(self): @@ -1537,27 +1344,7 @@ class Color(Enum): with self.assertRaises(KeyError): Color['chartreuse'] - def test_new_repr(self): - class Color(Enum): - red = 1 - green = 2 - blue = 3 - def __repr__(self): - return "don't you just love shades of %s?" % self.name - self.assertEqual( - repr(Color.blue), - "don't you just love shades of blue?", - ) - - def test_inherited_repr(self): - class MyEnum(Enum): - def __repr__(self): - return "My name is %s." % self.name - class MyIntEnum(int, MyEnum): - this = 1 - that = 2 - theother = 3 - self.assertEqual(repr(MyIntEnum.that), "My name is that.") + # tests that need to be evalualted for moving def test_multiple_mixin_mro(self): class auto_enum(type(Enum)): @@ -1610,7 +1397,7 @@ def __new__(cls, *args): return self def __getnewargs__(self): return self._args - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1670,7 +1457,7 @@ def __new__(cls, *args): return self def __getnewargs_ex__(self): return self._args, {} - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1730,7 +1517,7 @@ def __new__(cls, *args): return self def __reduce__(self): return self.__class__, self._args - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1790,7 +1577,7 @@ def __new__(cls, *args): return self def __reduce_ex__(self, proto): return self.__class__, self._args - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1847,7 +1634,7 @@ def __new__(cls, *args): self._intname = name self._args = _args return self - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1902,7 +1689,7 @@ def __new__(cls, *args): self._intname = name self._args = _args return self - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -2091,6 +1878,7 @@ def test(self): class Test(Base): test = 1 self.assertEqual(Test.test.test, 'dynamic') + self.assertEqual(Test.test.value, 1) class Base2(Enum): @enum.property def flash(self): @@ -2098,6 +1886,7 @@ def flash(self): class Test(Base2): flash = 1 self.assertEqual(Test.flash.flash, 'flashy dynamic') + self.assertEqual(Test.flash.value, 1) def test_no_duplicates(self): class UniqueEnum(Enum): @@ -2134,7 +1923,7 @@ class Planet(Enum): def __init__(self, mass, radius): self.mass = mass # in kilograms self.radius = radius # in meters - 
@property + @enum.property def surface_gravity(self): # universal gravitational constant (m3 kg-1 s-2) G = 6.67300E-11 @@ -2204,90 +1993,7 @@ class LabelledList(LabelledIntEnum): self.assertEqual(LabelledList.unprocessed, 1) self.assertEqual(LabelledList(1), LabelledList.unprocessed) - def test_auto_number(self): - class Color(Enum): - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 1) - self.assertEqual(Color.blue.value, 2) - self.assertEqual(Color.green.value, 3) - - def test_auto_name(self): - class Color(Enum): - def _generate_next_value_(name, start, count, last): - return name - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 'blue') - self.assertEqual(Color.green.value, 'green') - - def test_auto_name_inherit(self): - class AutoNameEnum(Enum): - def _generate_next_value_(name, start, count, last): - return name - class Color(AutoNameEnum): - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 'blue') - self.assertEqual(Color.green.value, 'green') - - def test_auto_garbage(self): - class Color(Enum): - red = 'red' - blue = auto() - self.assertEqual(Color.blue.value, 1) - - def test_auto_garbage_corrected(self): - class Color(Enum): - red = 'red' - blue = 2 - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 2) - self.assertEqual(Color.green.value, 3) - - def test_auto_order(self): - with self.assertRaises(TypeError): - class Color(Enum): - red = auto() - green = auto() - blue = auto() - def _generate_next_value_(name, start, count, last): - return name - - def test_auto_order_wierd(self): - weird_auto = auto() - weird_auto.value = 'pathological case' - class Color(Enum): - red = weird_auto - def _generate_next_value_(name, start, count, last): - return name - blue = auto() - self.assertEqual(list(Color), [Color.red, Color.blue]) - self.assertEqual(Color.red.value, 'pathological case') - self.assertEqual(Color.blue.value, 'blue') - - def test_duplicate_auto(self): - class Dupes(Enum): - first = primero = auto() - second = auto() - third = auto() - self.assertEqual([Dupes.first, Dupes.second, Dupes.third], list(Dupes)) - - def test_default_missing(self): + def test_default_missing_no_chained_exception(self): class Color(Enum): RED = 1 GREEN = 2 @@ -2299,7 +2005,7 @@ class Color(Enum): else: raise Exception('Exception not raised.') - def test_missing(self): + def test_missing_override(self): class Color(Enum): red = 1 green = 2 @@ -2363,9 +2069,9 @@ def __init__(self): class_1_ref = weakref.ref(Class1()) class_2_ref = weakref.ref(Class2()) # - # The exception raised by Enum creates a reference loop and thus - # Class2 instances will stick around until the next garbage collection - # cycle, unlike Class1. + # The exception raised by Enum used to create a reference loop and thus + # Class2 instances would stick around until the next garbage collection + # cycle, unlike Class1. Verify Class2 no longer does this. gc.collect() # For PyPy or other GCs. 
self.assertIs(class_1_ref(), None) self.assertIs(class_2_ref(), None) @@ -2396,11 +2102,12 @@ class Color(MaxMixin, Enum): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) self.assertEqual(Color.MAX, 3) - self.assertEqual(str(Color.BLUE), 'BLUE') + self.assertEqual(str(Color.BLUE), 'Color.BLUE') class Color(MaxMixin, StrMixin, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) @@ -2410,6 +2117,7 @@ class Color(StrMixin, MaxMixin, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) @@ -2419,6 +2127,7 @@ class CoolColor(StrMixin, SomeEnum, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolColor.RED.value, 1) self.assertEqual(CoolColor.GREEN.value, 2) self.assertEqual(CoolColor.BLUE.value, 3) @@ -2428,6 +2137,7 @@ class CoolerColor(StrMixin, AnotherEnum, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolerColor.RED.value, 1) self.assertEqual(CoolerColor.GREEN.value, 2) self.assertEqual(CoolerColor.BLUE.value, 3) @@ -2438,6 +2148,7 @@ class CoolestColor(StrMixin, SomeEnum, AnotherEnum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolestColor.RED.value, 1) self.assertEqual(CoolestColor.GREEN.value, 2) self.assertEqual(CoolestColor.BLUE.value, 3) @@ -2448,6 +2159,7 @@ class ConfusedColor(StrMixin, AnotherEnum, SomeEnum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(ConfusedColor.RED.value, 1) self.assertEqual(ConfusedColor.GREEN.value, 2) self.assertEqual(ConfusedColor.BLUE.value, 3) @@ -2458,6 +2170,7 @@ class ReformedColor(StrMixin, IntEnum, SomeEnum, AnotherEnum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(ReformedColor.RED.value, 1) self.assertEqual(ReformedColor.GREEN.value, 2) self.assertEqual(ReformedColor.BLUE.value, 3) @@ -2490,11 +2203,12 @@ def __repr__(self): return hex(self) class MyIntEnum(HexMixin, MyInt, enum.Enum): - pass + __repr__ = HexMixin.__repr__ class Foo(MyIntEnum): TEST = 1 self.assertTrue(isinstance(Foo.TEST, MyInt)) + self.assertEqual(Foo._member_type_, MyInt) self.assertEqual(repr(Foo.TEST), "0x1") class Fee(MyIntEnum): @@ -2506,7 +2220,7 @@ def __new__(cls, value): return member self.assertEqual(Fee.TEST, 2) - def test_miltuple_mixin_with_common_data_type(self): + def test_multiple_mixin_with_common_data_type(self): class CaseInsensitiveStrEnum(str, Enum): @classmethod def _missing_(cls, value): @@ -2526,7 +2240,7 @@ def _missing_(cls, value): unknown._value_ = value cls._member_map_[value] = unknown return unknown - @property + @enum.property def valid(self): return self._valid # @@ -2570,7 +2284,7 @@ class GoodStrEnum(StrEnum): self.assertEqual('{}'.format(GoodStrEnum.one), '1') self.assertEqual(GoodStrEnum.one, str(GoodStrEnum.one)) self.assertEqual(GoodStrEnum.one, '{}'.format(GoodStrEnum.one)) - self.assertEqual(repr(GoodStrEnum.one), 'GoodStrEnum.one') + self.assertEqual(repr(GoodStrEnum.one), "") # class DumbMixin: def __str__(self): @@ -2579,6 +2293,7 @@ class DumbStrEnum(DumbMixin, 
StrEnum): five = '5' six = '6' seven = '7' + __str__ = DumbMixin.__str__ # needed as of 3.11 self.assertEqual(DumbStrEnum.seven, '7') self.assertEqual(str(DumbStrEnum.seven), "don't do this") # @@ -2620,74 +2335,6 @@ class ThirdFailedStrEnum(StrEnum): one = '1' two = b'2', 'ascii', 9 - @unittest.skipIf( - python_version >= (3, 12), - 'mixin-format now uses member instead of member.value', - ) - def test_custom_strenum_with_warning(self): - class CustomStrEnum(str, Enum): - pass - class OkayEnum(CustomStrEnum): - one = '1' - two = '2' - three = b'3', 'ascii' - four = b'4', 'latin1', 'strict' - self.assertEqual(OkayEnum.one, '1') - self.assertEqual(str(OkayEnum.one), 'one') - with self.assertWarns(DeprecationWarning): - self.assertEqual('{}'.format(OkayEnum.one), '1') - self.assertEqual(OkayEnum.one, '{}'.format(OkayEnum.one)) - self.assertEqual(repr(OkayEnum.one), 'OkayEnum.one') - # - class DumbMixin: - def __str__(self): - return "don't do this" - class DumbStrEnum(DumbMixin, CustomStrEnum): - five = '5' - six = '6' - seven = '7' - self.assertEqual(DumbStrEnum.seven, '7') - self.assertEqual(str(DumbStrEnum.seven), "don't do this") - # - class EnumMixin(Enum): - def hello(self): - print('hello from %s' % (self, )) - class HelloEnum(EnumMixin, CustomStrEnum): - eight = '8' - self.assertEqual(HelloEnum.eight, '8') - self.assertEqual(str(HelloEnum.eight), 'eight') - # - class GoodbyeMixin: - def goodbye(self): - print('%s wishes you a fond farewell') - class GoodbyeEnum(GoodbyeMixin, EnumMixin, CustomStrEnum): - nine = '9' - self.assertEqual(GoodbyeEnum.nine, '9') - self.assertEqual(str(GoodbyeEnum.nine), 'nine') - # - class FirstFailedStrEnum(CustomStrEnum): - one = 1 # this will become '1' - two = '2' - class SecondFailedStrEnum(CustomStrEnum): - one = '1' - two = 2, # this will become '2' - three = '3' - class ThirdFailedStrEnum(CustomStrEnum): - one = '1' - two = 2 # this will become '2' - with self.assertRaisesRegex(TypeError, '.encoding. must be str, not '): - class ThirdFailedStrEnum(CustomStrEnum): - one = '1' - two = b'2', sys.getdefaultencoding - with self.assertRaisesRegex(TypeError, '.errors. 
must be str, not '): - class ThirdFailedStrEnum(CustomStrEnum): - one = '1' - two = b'2', 'ascii', 9 - - @unittest.skipIf( - python_version < (3, 12), - 'mixin-format currently uses member.value', - ) def test_custom_strenum(self): class CustomStrEnum(str, Enum): pass @@ -2697,9 +2344,9 @@ class OkayEnum(CustomStrEnum): three = b'3', 'ascii' four = b'4', 'latin1', 'strict' self.assertEqual(OkayEnum.one, '1') - self.assertEqual(str(OkayEnum.one), 'one') - self.assertEqual('{}'.format(OkayEnum.one), 'one') - self.assertEqual(repr(OkayEnum.one), 'OkayEnum.one') + self.assertEqual(str(OkayEnum.one), 'OkayEnum.one') + self.assertEqual('{}'.format(OkayEnum.one), 'OkayEnum.one') + self.assertEqual(repr(OkayEnum.one), "<OkayEnum.one: '1'>") # class DumbMixin: def __str__(self): @@ -2708,6 +2355,7 @@ class DumbStrEnum(DumbMixin, CustomStrEnum): five = '5' six = '6' seven = '7' + __str__ = DumbMixin.__str__ # needed as of 3.11 self.assertEqual(DumbStrEnum.seven, '7') self.assertEqual(str(DumbStrEnum.seven), "don't do this") # @@ -2717,7 +2365,7 @@ def hello(self): class HelloEnum(EnumMixin, CustomStrEnum): eight = '8' self.assertEqual(HelloEnum.eight, '8') - self.assertEqual(str(HelloEnum.eight), 'eight') + self.assertEqual(str(HelloEnum.eight), 'HelloEnum.eight') # class GoodbyeMixin: def goodbye(self): @@ -2725,7 +2373,7 @@ def goodbye(self): class GoodbyeEnum(GoodbyeMixin, EnumMixin, CustomStrEnum): nine = '9' self.assertEqual(GoodbyeEnum.nine, '9') - self.assertEqual(str(GoodbyeEnum.nine), 'nine') + self.assertEqual(str(GoodbyeEnum.nine), 'GoodbyeEnum.nine') # class FirstFailedStrEnum(CustomStrEnum): one = 1 # this will become '1' @@ -2771,21 +2419,6 @@ def __repr__(self): code = 'An$(5,1)', 2 description = 'Bn$', 3 - @unittest.skipUnless( - python_version == (3, 9), - 'private variables are now normal attributes', - ) - def test_warning_for_private_variables(self): - with self.assertWarns(DeprecationWarning): - class Private(Enum): - __corporal = 'Radar' - self.assertEqual(Private._Private__corporal.value, 'Radar') - try: - with self.assertWarns(DeprecationWarning): - class Private(Enum): - __major_ = 'Hoolihan' - except ValueError: - pass def test_private_variable_is_normal_attribute(self): class Private(Enum): @@ -2794,35 +2427,13 @@ class Private(Enum): self.assertEqual(Private._Private__corporal, 'Radar') self.assertEqual(Private._Private__major_, 'Hoolihan') - @unittest.skipUnless( - python_version < (3, 12), - 'member-member access now raises an exception', - ) - def test_warning_for_member_from_member_access(self): - with self.assertWarns(DeprecationWarning): - class Di(Enum): - YES = 1 - NO = 0 - nope = Di.YES.NO - self.assertIs(Di.NO, nope) - - @unittest.skipUnless( - python_version >= (3, 12), - 'member-member access currently issues a warning', - ) def test_exception_for_member_from_member_access(self): - with self.assertRaisesRegex(AttributeError, "Di: no instance attribute .NO."): + with self.assertRaisesRegex(AttributeError, "<enum .Di.> member has no attribute .NO."): class Di(Enum): YES = 1 NO = 0 nope = Di.YES.NO - def test_strenum_auto(self): - class Strings(StrEnum): - ONE = auto() - TWO = auto() - self.assertEqual([Strings.ONE, Strings.TWO], ['one', 'two']) - def test_dynamic_members_with_static_methods(self): # @@ -2839,7 +2450,7 @@ def upper(self): self.assertEqual(Foo.FOO_CAT.value, 'aloof') self.assertEqual(Foo.FOO_HORSE.upper(), 'BIG') # - with self.assertRaisesRegex(TypeError, "'FOO_CAT' already defined as: 'aloof'"): + with self.assertRaisesRegex(TypeError, "'FOO_CAT' already defined as 'aloof'"): 
class FooBar(Enum): vars().update({ k: v @@ -2851,8 +2462,42 @@ class FooBar(Enum): def upper(self): return self.value.upper() + def test_repr_with_dataclass(self): + "ensure dataclass-mixin has correct repr()" + from dataclasses import dataclass + @dataclass + class Foo: + __qualname__ = 'Foo' + a: int = 0 + class Entries(Foo, Enum): + ENTRY1 = Foo(1) + self.assertEqual(repr(Entries.ENTRY1), '<Entries.ENTRY1: Foo(a=1)>') + + def test_repr_with_non_data_type_mixin(self): + # non-data_type is a mixin that doesn't define __new__ + class Foo: + def __init__(self, a): + self.a = a + def __repr__(self): + return f'Foo(a={self.a!r})' + class Entries(Foo, Enum): + ENTRY1 = Foo(1) + + self.assertEqual(repr(Entries.ENTRY1), '<Entries.ENTRY1: Foo(a=1)>') + + def test_value_backup_assign(self): + # check that enum will add missing values when custom __new__ does not + class Some(Enum): + def __new__(cls, val): + return object.__new__(cls) + x = 1 + y = 2 + self.assertEqual(Some.x.value, 1) + self.assertEqual(Some.y.value, 2) + class TestOrder(unittest.TestCase): + "test usage of the `_order_` attribute" def test_same_members(self): class Color(Enum): @@ -2914,7 +2559,7 @@ class Color(Enum): verde = green -class TestFlag(unittest.TestCase): +class OldTestFlag(unittest.TestCase): """Tests of the Flags.""" class Perm(Flag): @@ -2934,67 +2579,8 @@ class Color(Flag): GREEN = 2 BLUE = 4 PURPLE = RED|BLUE - WHITE = RED|GREEN|BLUE - BLANCO = RED|GREEN|BLUE - - def test_str(self): - Perm = self.Perm - self.assertEqual(str(Perm.R), 'R') - self.assertEqual(str(Perm.W), 'W') - self.assertEqual(str(Perm.X), 'X') - self.assertEqual(str(Perm.R | Perm.W), 'R|W') - self.assertEqual(str(Perm.R | Perm.W | Perm.X), 'R|W|X') - self.assertEqual(str(Perm(0)), 'Perm(0)') - self.assertEqual(str(~Perm.R), 'W|X') - self.assertEqual(str(~Perm.W), 'R|X') - self.assertEqual(str(~Perm.X), 'R|W') - self.assertEqual(str(~(Perm.R | Perm.W)), 'X') - self.assertEqual(str(~(Perm.R | Perm.W | Perm.X)), 'Perm(0)') - self.assertEqual(str(Perm(~0)), 'R|W|X') - - Open = self.Open - self.assertEqual(str(Open.RO), 'RO') - self.assertEqual(str(Open.WO), 'WO') - self.assertEqual(str(Open.AC), 'AC') - self.assertEqual(str(Open.RO | Open.CE), 'CE') - self.assertEqual(str(Open.WO | Open.CE), 'WO|CE') - self.assertEqual(str(~Open.RO), 'WO|RW|CE') - self.assertEqual(str(~Open.WO), 'RW|CE') - self.assertEqual(str(~Open.AC), 'CE') - self.assertEqual(str(~(Open.RO | Open.CE)), 'AC') - self.assertEqual(str(~(Open.WO | Open.CE)), 'RW') - - def test_repr(self): - Perm = self.Perm - self.assertEqual(repr(Perm.R), 'Perm.R') - self.assertEqual(repr(Perm.W), 'Perm.W') - self.assertEqual(repr(Perm.X), 'Perm.X') - self.assertEqual(repr(Perm.R | Perm.W), 'Perm.R|Perm.W') - self.assertEqual(repr(Perm.R | Perm.W | Perm.X), 'Perm.R|Perm.W|Perm.X') - self.assertEqual(repr(Perm(0)), '0x0') - self.assertEqual(repr(~Perm.R), 'Perm.W|Perm.X') - self.assertEqual(repr(~Perm.W), 'Perm.R|Perm.X') - self.assertEqual(repr(~Perm.X), 'Perm.R|Perm.W') - self.assertEqual(repr(~(Perm.R | Perm.W)), 'Perm.X') - self.assertEqual(repr(~(Perm.R | Perm.W | Perm.X)), '0x0') - self.assertEqual(repr(Perm(~0)), 'Perm.R|Perm.W|Perm.X') - - Open = self.Open - self.assertEqual(repr(Open.RO), 'Open.RO') - self.assertEqual(repr(Open.WO), 'Open.WO') - self.assertEqual(repr(Open.AC), 'Open.AC') - self.assertEqual(repr(Open.RO | Open.CE), 'Open.CE') - self.assertEqual(repr(Open.WO | Open.CE), 'Open.WO|Open.CE') - self.assertEqual(repr(~Open.RO), 'Open.WO|Open.RW|Open.CE') - self.assertEqual(repr(~Open.WO), 'Open.RW|Open.CE') - 
self.assertEqual(repr(~Open.AC), 'Open.CE') - self.assertEqual(repr(~(Open.RO | Open.CE)), 'Open.AC') - self.assertEqual(repr(~(Open.WO | Open.CE)), 'Open.RW') - - def test_format(self): - Perm = self.Perm - self.assertEqual(format(Perm.R, ''), 'R') - self.assertEqual(format(Perm.R | Perm.X, ''), 'R|X') + WHITE = RED|GREEN|BLUE + BLANCO = RED|GREEN|BLUE def test_or(self): Perm = self.Perm @@ -3088,7 +2674,7 @@ class Bizarre(Flag, boundary=KEEP): c = 4 d = 6 # - self.assertRaisesRegex(ValueError, 'invalid value: 7', Iron, 7) + self.assertRaisesRegex(ValueError, 'invalid value 7', Iron, 7) # self.assertIs(Water(7), Water.ONE|Water.TWO) self.assertIs(Water(~9), Water.TWO) @@ -3297,7 +2883,7 @@ class Color(Flag): self.assertEqual(Color.green.value, 4) def test_auto_number_garbage(self): - with self.assertRaisesRegex(TypeError, 'Invalid Flag value: .not an int.'): + with self.assertRaisesRegex(TypeError, 'invalid flag value .not an int.'): class Color(Flag): red = 'not an int' blue = auto() @@ -3332,11 +2918,12 @@ class Color(AllMixin, Flag): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) self.assertEqual(Color.ALL.value, 7) - self.assertEqual(str(Color.BLUE), 'BLUE') + self.assertEqual(str(Color.BLUE), 'Color.BLUE') class Color(AllMixin, StrMixin, Flag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3346,6 +2933,7 @@ class Color(StrMixin, AllMixin, Flag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3426,21 +3014,8 @@ class NeverEnum(WhereEnum): self.assertFalse(NeverEnum.__dict__.get('_test1', False)) self.assertFalse(NeverEnum.__dict__.get('_test2', False)) - def test_default_missing(self): - with self.assertRaisesRegex( - ValueError, - "'RED' is not a valid TestFlag.Color", - ) as ctx: - self.Color('RED') - self.assertIs(ctx.exception.__context__, None) - - P = Flag('P', 'X Y') - with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: - P('X') - self.assertIs(ctx.exception.__context__, None) - -class TestIntFlag(unittest.TestCase): +class OldTestIntFlag(unittest.TestCase): """Tests of the IntFlags.""" class Perm(IntFlag): @@ -3485,73 +3060,6 @@ def test_type(self): self.assertTrue(isinstance(Open.WO | Open.RW, Open)) self.assertEqual(Open.WO | Open.RW, 3) - - def test_str(self): - Perm = self.Perm - self.assertEqual(str(Perm.R), 'R') - self.assertEqual(str(Perm.W), 'W') - self.assertEqual(str(Perm.X), 'X') - self.assertEqual(str(Perm.R | Perm.W), 'R|W') - self.assertEqual(str(Perm.R | Perm.W | Perm.X), 'R|W|X') - self.assertEqual(str(Perm.R | 8), '12') - self.assertEqual(str(Perm(0)), 'Perm(0)') - self.assertEqual(str(Perm(8)), '8') - self.assertEqual(str(~Perm.R), 'W|X') - self.assertEqual(str(~Perm.W), 'R|X') - self.assertEqual(str(~Perm.X), 'R|W') - self.assertEqual(str(~(Perm.R | Perm.W)), 'X') - self.assertEqual(str(~(Perm.R | Perm.W | Perm.X)), 'Perm(0)') - self.assertEqual(str(~(Perm.R | 8)), '-13') - self.assertEqual(str(Perm(~0)), 'R|W|X') - self.assertEqual(str(Perm(~8)), '-9') - - Open = self.Open - self.assertEqual(str(Open.RO), 'RO') - self.assertEqual(str(Open.WO), 'WO') - self.assertEqual(str(Open.AC), 'AC') - self.assertEqual(str(Open.RO | Open.CE), 'CE') - self.assertEqual(str(Open.WO | Open.CE), 'WO|CE') - self.assertEqual(str(Open(4)), '4') - 
self.assertEqual(str(~Open.RO), 'WO|RW|CE') - self.assertEqual(str(~Open.WO), 'RW|CE') - self.assertEqual(str(~Open.AC), 'CE') - self.assertEqual(str(~(Open.RO | Open.CE)), 'AC') - self.assertEqual(str(~(Open.WO | Open.CE)), 'RW') - self.assertEqual(str(Open(~4)), '-5') - - def test_repr(self): - Perm = self.Perm - self.assertEqual(repr(Perm.R), 'Perm.R') - self.assertEqual(repr(Perm.W), 'Perm.W') - self.assertEqual(repr(Perm.X), 'Perm.X') - self.assertEqual(repr(Perm.R | Perm.W), 'Perm.R|Perm.W') - self.assertEqual(repr(Perm.R | Perm.W | Perm.X), 'Perm.R|Perm.W|Perm.X') - self.assertEqual(repr(Perm.R | 8), '12') - self.assertEqual(repr(Perm(0)), '0x0') - self.assertEqual(repr(Perm(8)), '8') - self.assertEqual(repr(~Perm.R), 'Perm.W|Perm.X') - self.assertEqual(repr(~Perm.W), 'Perm.R|Perm.X') - self.assertEqual(repr(~Perm.X), 'Perm.R|Perm.W') - self.assertEqual(repr(~(Perm.R | Perm.W)), 'Perm.X') - self.assertEqual(repr(~(Perm.R | Perm.W | Perm.X)), '0x0') - self.assertEqual(repr(~(Perm.R | 8)), '-13') - self.assertEqual(repr(Perm(~0)), 'Perm.R|Perm.W|Perm.X') - self.assertEqual(repr(Perm(~8)), '-9') - - Open = self.Open - self.assertEqual(repr(Open.RO), 'Open.RO') - self.assertEqual(repr(Open.WO), 'Open.WO') - self.assertEqual(repr(Open.AC), 'Open.AC') - self.assertEqual(repr(Open.RO | Open.CE), 'Open.CE') - self.assertEqual(repr(Open.WO | Open.CE), 'Open.WO|Open.CE') - self.assertEqual(repr(Open(4)), '4') - self.assertEqual(repr(~Open.RO), 'Open.WO|Open.RW|Open.CE') - self.assertEqual(repr(~Open.WO), 'Open.RW|Open.CE') - self.assertEqual(repr(~Open.AC), 'Open.CE') - self.assertEqual(repr(~(Open.RO | Open.CE)), 'Open.AC') - self.assertEqual(repr(~(Open.WO | Open.CE)), 'Open.RW') - self.assertEqual(repr(Open(~4)), '-5') - def test_global_repr_keep(self): self.assertEqual( repr(HeadlightsK(0)), @@ -3559,11 +3067,11 @@ def test_global_repr_keep(self): ) self.assertEqual( repr(HeadlightsK(2**0 + 2**2 + 2**3)), - '%(m)s.LOW_BEAM_K|%(m)s.FOG_K|0x8' % {'m': SHORT_MODULE}, + '%(m)s.LOW_BEAM_K|%(m)s.FOG_K|8' % {'m': SHORT_MODULE}, ) self.assertEqual( repr(HeadlightsK(2**3)), - '%(m)s.HeadlightsK(0x8)' % {'m': SHORT_MODULE}, + '%(m)s.HeadlightsK(8)' % {'m': SHORT_MODULE}, ) def test_global_repr_conform1(self): @@ -3705,7 +3213,7 @@ class Bizarre(IntFlag, boundary=KEEP): c = 4 d = 6 # - self.assertRaisesRegex(ValueError, 'invalid value: 5', Iron, 5) + self.assertRaisesRegex(ValueError, 'invalid value 5', Iron, 5) # self.assertIs(Water(7), Water.ONE|Water.TWO) self.assertIs(Water(~9), Water.TWO) @@ -3942,11 +3450,12 @@ class Color(AllMixin, IntFlag): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) self.assertEqual(Color.ALL.value, 7) - self.assertEqual(str(Color.BLUE), 'BLUE') + self.assertEqual(str(Color.BLUE), '4') class Color(AllMixin, StrMixin, IntFlag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3956,6 +3465,7 @@ class Color(StrMixin, AllMixin, IntFlag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -4000,19 +3510,6 @@ def cycle_enum(): 'at least one thread failed while creating composite members') self.assertEqual(256, len(seen), 'too many composite members created') - def test_default_missing(self): - with self.assertRaisesRegex( - ValueError, - "'RED' is not a valid 
TestIntFlag.Color", - ) as ctx: - self.Color('RED') - self.assertIs(ctx.exception.__context__, None) - - P = IntFlag('P', 'X Y') - with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: - P('X') - self.assertIs(ctx.exception.__context__, None) - class TestEmptyAndNonLatinStrings(unittest.TestCase): @@ -4229,6 +3726,89 @@ def test_is_private(self): for name in self.sunder_names + self.dunder_names + self.random_names: self.assertFalse(enum._is_private('MyEnum', name), '%r is a private name?') + def test_auto_number(self): + class Color(Enum): + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 1) + self.assertEqual(Color.blue.value, 2) + self.assertEqual(Color.green.value, 3) + + def test_auto_name(self): + class Color(Enum): + def _generate_next_value_(name, start, count, last): + return name + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 'blue') + self.assertEqual(Color.green.value, 'green') + + def test_auto_name_inherit(self): + class AutoNameEnum(Enum): + def _generate_next_value_(name, start, count, last): + return name + class Color(AutoNameEnum): + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 'blue') + self.assertEqual(Color.green.value, 'green') + + def test_auto_garbage(self): + class Color(Enum): + red = 'red' + blue = auto() + self.assertEqual(Color.blue.value, 1) + + def test_auto_garbage_corrected(self): + class Color(Enum): + red = 'red' + blue = 2 + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 2) + self.assertEqual(Color.green.value, 3) + + def test_auto_order(self): + with self.assertRaises(TypeError): + class Color(Enum): + red = auto() + green = auto() + blue = auto() + def _generate_next_value_(name, start, count, last): + return name + + def test_auto_order_wierd(self): + weird_auto = auto() + weird_auto.value = 'pathological case' + class Color(Enum): + red = weird_auto + def _generate_next_value_(name, start, count, last): + return name + blue = auto() + self.assertEqual(list(Color), [Color.red, Color.blue]) + self.assertEqual(Color.red.value, 'pathological case') + self.assertEqual(Color.blue.value, 'blue') + + def test_duplicate_auto(self): + class Dupes(Enum): + first = primero = auto() + second = auto() + third = auto() + self.assertEqual([Dupes.first, Dupes.second, Dupes.third], list(Dupes)) + class TestEnumTypeSubclassing(unittest.TestCase): pass @@ -4238,7 +3818,35 @@ class TestEnumTypeSubclassing(unittest.TestCase): class Color(enum.Enum) | Color(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None) |\x20\x20 - | An enumeration. + | A collection of name/value pairs. 
+ |\x20\x20 + | Access them by: + |\x20\x20 + | - attribute access:: + |\x20\x20 + | >>> Color.CYAN + | + |\x20\x20 + | - value lookup: + |\x20\x20 + | >>> Color(1) + | + |\x20\x20 + | - name lookup: + |\x20\x20 + | >>> Color['CYAN'] + | + |\x20\x20 + | Enumerations can be iterated over, and know how many members they have: + |\x20\x20 + | >>> len(Color) + | 3 + |\x20\x20 + | >>> list(Color) + | [, , ] + |\x20\x20 + | Methods can be added to enumerations, and members can have their own + | attributes -- see the documentation for details. |\x20\x20 | Method resolution order: | Color @@ -4247,11 +3855,11 @@ class Color(enum.Enum) |\x20\x20 | Data and other attributes defined here: |\x20\x20 - | blue = Color.blue + | CYAN = |\x20\x20 - | green = Color.green + | MAGENTA = |\x20\x20 - | red = Color.red + | YELLOW = |\x20\x20 | ---------------------------------------------------------------------- | Data descriptors inherited from enum.Enum: @@ -4263,6 +3871,25 @@ class Color(enum.Enum) | The value of the Enum member. |\x20\x20 | ---------------------------------------------------------------------- + | Methods inherited from enum.EnumType: + |\x20\x20 + | __contains__(member) from enum.EnumType + | Return True if member is a member of this enum + | raises TypeError if member is not an enum member + |\x20\x20\x20\x20\x20\x20 + | note: in 3.12 TypeError will no longer be raised, and True will also be + | returned if member is the value of a member in this enum + |\x20\x20 + | __getitem__(name) from enum.EnumType + | Return the member matching `name`. + |\x20\x20 + | __iter__() from enum.EnumType + | Return members in definition order. + |\x20\x20 + | __len__() from enum.EnumType + | Return the number of members (no aliases) + |\x20\x20 + | ---------------------------------------------------------------------- | Readonly properties inherited from enum.EnumType: |\x20\x20 | __members__ @@ -4284,11 +3911,11 @@ class Color(enum.Enum) |\x20\x20 | Data and other attributes defined here: |\x20\x20 - | blue = Color.blue + | YELLOW = |\x20\x20 - | green = Color.green + | MAGENTA = |\x20\x20 - | red = Color.red + | CYAN = |\x20\x20 | ---------------------------------------------------------------------- | Data descriptors inherited from enum.Enum: @@ -4307,9 +3934,9 @@ class TestStdLib(unittest.TestCase): maxDiff = None class Color(Enum): - red = 1 - green = 2 - blue = 3 + CYAN = 1 + MAGENTA = 2 + YELLOW = 3 def test_pydoc(self): # indirectly test __objclass__ @@ -4321,24 +3948,34 @@ def test_pydoc(self): helper = pydoc.Helper(output=output) helper(self.Color) result = output.getvalue().strip() - self.assertEqual(result, expected_text) + self.assertEqual(result, expected_text, result) def test_inspect_getmembers(self): values = dict(( ('__class__', EnumType), - ('__doc__', 'An enumeration.'), + ('__doc__', '...'), ('__members__', self.Color.__members__), ('__module__', __name__), - ('blue', self.Color.blue), - ('green', self.Color.green), + ('YELLOW', self.Color.YELLOW), + ('MAGENTA', self.Color.MAGENTA), + ('CYAN', self.Color.CYAN), ('name', Enum.__dict__['name']), - ('red', self.Color.red), ('value', Enum.__dict__['value']), + ('__len__', self.Color.__len__), + ('__contains__', self.Color.__contains__), + ('__name__', 'Color'), + ('__getitem__', self.Color.__getitem__), + ('__qualname__', 'TestStdLib.Color'), + ('__init_subclass__', getattr(self.Color, '__init_subclass__')), + ('__iter__', self.Color.__iter__), )) result = dict(inspect.getmembers(self.Color)) self.assertEqual(set(values.keys()), 
set(result.keys())) failed = False for k in values.keys(): + if k == '__doc__': + # __doc__ is huge, not comparing + continue if result[k] != values[k]: print() print('\n%s\n key: %s\n result: %s\nexpected: %s\n%s\n' % @@ -4353,23 +3990,42 @@ def test_inspect_classify_class_attrs(self): values = [ Attribute(name='__class__', kind='data', defining_class=object, object=EnumType), + Attribute(name='__contains__', kind='method', + defining_class=EnumType, object=self.Color.__contains__), Attribute(name='__doc__', kind='data', - defining_class=self.Color, object='An enumeration.'), + defining_class=self.Color, object='...'), + Attribute(name='__getitem__', kind='method', + defining_class=EnumType, object=self.Color.__getitem__), + Attribute(name='__iter__', kind='method', + defining_class=EnumType, object=self.Color.__iter__), + Attribute(name='__init_subclass__', kind='class method', + defining_class=object, object=getattr(self.Color, '__init_subclass__')), + Attribute(name='__len__', kind='method', + defining_class=EnumType, object=self.Color.__len__), Attribute(name='__members__', kind='property', defining_class=EnumType, object=EnumType.__members__), Attribute(name='__module__', kind='data', defining_class=self.Color, object=__name__), - Attribute(name='blue', kind='data', - defining_class=self.Color, object=self.Color.blue), - Attribute(name='green', kind='data', - defining_class=self.Color, object=self.Color.green), - Attribute(name='red', kind='data', - defining_class=self.Color, object=self.Color.red), + Attribute(name='__name__', kind='data', + defining_class=self.Color, object='Color'), + Attribute(name='__qualname__', kind='data', + defining_class=self.Color, object='TestStdLib.Color'), + Attribute(name='YELLOW', kind='data', + defining_class=self.Color, object=self.Color.YELLOW), + Attribute(name='MAGENTA', kind='data', + defining_class=self.Color, object=self.Color.MAGENTA), + Attribute(name='CYAN', kind='data', + defining_class=self.Color, object=self.Color.CYAN), Attribute(name='name', kind='data', defining_class=Enum, object=Enum.__dict__['name']), Attribute(name='value', kind='data', defining_class=Enum, object=Enum.__dict__['value']), ] + for v in values: + try: + v.name + except AttributeError: + print(v) values.sort(key=lambda item: item.name) result = list(inspect.classify_class_attrs(self.Color)) result.sort(key=lambda item: item.name) @@ -4379,7 +4035,15 @@ def test_inspect_classify_class_attrs(self): ) failed = False for v, r in zip(values, result): - if r != v: + if r.name in ('__init_subclass__', '__doc__'): + # not sure how to make the __init_subclass_ Attributes match + # so as long as there is one, call it good + # __doc__ is too big to check exactly, so treat the same as __init_subclass__ + for name in ('name','kind','defining_class'): + if getattr(v, name) != getattr(r, name): + print('\n%s\n%s\n%s\n%s\n' % ('=' * 75, r, v, '=' * 75), sep='') + failed = True + elif r != v: print('\n%s\n%s\n%s\n%s\n' % ('=' * 75, r, v, '=' * 75), sep='') failed = True if failed: @@ -4388,15 +4052,15 @@ def test_inspect_classify_class_attrs(self): def test_test_simple_enum(self): @_simple_enum(Enum) class SimpleColor: - RED = 1 - GREEN = 2 - BLUE = 3 + CYAN = 1 + MAGENTA = 2 + YELLOW = 3 class CheckedColor(Enum): - RED = 1 - GREEN = 2 - BLUE = 3 + CYAN = 1 + MAGENTA = 2 + YELLOW = 3 self.assertTrue(_test_simple_enum(CheckedColor, SimpleColor) is None) - SimpleColor.GREEN._value_ = 9 + SimpleColor.MAGENTA._value_ = 9 self.assertRaisesRegex( TypeError, "enum mismatch", 
_test_simple_enum, CheckedColor, SimpleColor, @@ -4422,9 +4086,165 @@ class Missing: class MiscTestCase(unittest.TestCase): + def test__all__(self): support.check__all__(self, enum, not_exported={'bin', 'show_flag_values'}) + def test_doc_1(self): + class Single(Enum): + ONE = 1 + self.assertEqual( + Single.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Single.ONE + + + - value lookup: + + >>> Single(1) + + + - name lookup: + + >>> Single['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Single) + 1 + + >>> list(Single) + [] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + + def test_doc_2(self): + class Double(Enum): + ONE = 1 + TWO = 2 + self.assertEqual( + Double.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Double.ONE + + + - value lookup: + + >>> Double(1) + + + - name lookup: + + >>> Double['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Double) + 2 + + >>> list(Double) + [, ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + + + def test_doc_1(self): + class Triple(Enum): + ONE = 1 + TWO = 2 + THREE = 3 + self.assertEqual( + Triple.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Triple.ONE + + + - value lookup: + + >>> Triple(1) + + + - name lookup: + + >>> Triple['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Triple) + 3 + + >>> list(Triple) + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + + def test_doc_1(self): + class Quadruple(Enum): + ONE = 1 + TWO = 2 + THREE = 3 + FOUR = 4 + self.assertEqual( + Quadruple.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Quadruple.ONE + + + - value lookup: + + >>> Quadruple(1) + + + - name lookup: + + >>> Quadruple['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Quadruple) + 4 + + >>> list(Quadruple)[:3] + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + # These are unordered here on purpose to ensure that declaration order # makes no difference. @@ -4442,6 +4262,10 @@ def test__all__(self): CONVERT_STRING_TEST_NAME_E = 5 CONVERT_STRING_TEST_NAME_F = 5 +# global names for StrEnum._convert_ test +CONVERT_STR_TEST_2 = 'goodbye' +CONVERT_STR_TEST_1 = 'hello' + # We also need values that cannot be compared: UNCOMPARABLE_A = 5 UNCOMPARABLE_C = (9, 1) # naming order is broken on purpose @@ -4453,32 +4277,40 @@ def test__all__(self): class _ModuleWrapper: """We use this class as a namespace for swapping modules.""" - def __init__(self, module): self.__dict__.update(module.__dict__) -class TestIntEnumConvert(unittest.TestCase): +class TestConvert(unittest.TestCase): + def tearDown(self): + # Reset the module-level test variables to their original integer + # values, otherwise the already created enum values get converted + # instead. 
+ g = globals() + for suffix in ['A', 'B', 'C', 'D', 'E', 'F']: + g['CONVERT_TEST_NAME_%s' % suffix] = 5 + g['CONVERT_STRING_TEST_NAME_%s' % suffix] = 5 + for suffix, value in (('A', 5), ('B', (9, 1)), ('C', 'value')): + g['UNCOMPARABLE_%s' % suffix] = value + for suffix, value in (('A', 2j), ('B', 3j), ('C', 1j)): + g['COMPLEX_%s' % suffix] = value + for suffix, value in (('1', 'hello'), ('2', 'goodbye')): + g['CONVERT_STR_TEST_%s' % suffix] = value + def test_convert_value_lookup_priority(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # We don't want the reverse lookup value to vary when there are # multiple possible names for a given value. It should always # report the first lexigraphical name in that case. self.assertEqual(test_type(5).name, 'CONVERT_TEST_NAME_A') - def test_convert(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + def test_convert_int(self): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # Ensure that test_type has all of the desired names and values. self.assertEqual(test_type.CONVERT_TEST_NAME_F, test_type.CONVERT_TEST_NAME_A) @@ -4487,43 +4319,57 @@ def test_convert(self): self.assertEqual(test_type.CONVERT_TEST_NAME_D, 5) self.assertEqual(test_type.CONVERT_TEST_NAME_E, 5) # Ensure that test_type only picked up names matching the filter. - self.assertEqual([name for name in dir(test_type) - if name[0:2] not in ('CO', '__') - and name not in dir(IntEnum)], - [], msg='Names other than CONVERT_TEST_* found.') + int_dir = dir(int) + [ + 'CONVERT_TEST_NAME_A', 'CONVERT_TEST_NAME_B', 'CONVERT_TEST_NAME_C', + 'CONVERT_TEST_NAME_D', 'CONVERT_TEST_NAME_E', 'CONVERT_TEST_NAME_F', + ] + self.assertEqual( + [name for name in dir(test_type) if name not in int_dir], + [], + msg='Names other than CONVERT_TEST_* found.', + ) def test_convert_uncomparable(self): - # We swap a module to some other object with `__dict__` - # because otherwise refleak is created. - # `_convert_` uses a module side effect that does this. 
See 30472 - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('UNCOMPARABLE_')) - + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('UNCOMPARABLE_')) # Should be ordered by `name` only: self.assertEqual( list(uncomp), [uncomp.UNCOMPARABLE_A, uncomp.UNCOMPARABLE_B, uncomp.UNCOMPARABLE_C], - ) + ) def test_convert_complex(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('COMPLEX_')) - + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('COMPLEX_')) # Should be ordered by `name` only: self.assertEqual( list(uncomp), [uncomp.COMPLEX_A, uncomp.COMPLEX_B, uncomp.COMPLEX_C], - ) + ) + + def test_convert_str(self): + test_type = enum.StrEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STR_'), + as_global=True) + # Ensure that test_type has all of the desired names and values. + self.assertEqual(test_type.CONVERT_STR_TEST_1, 'hello') + self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') + # Ensure that test_type only picked up names matching the filter. + str_dir = dir(str) + ['CONVERT_STR_TEST_1', 'CONVERT_STR_TEST_2'] + self.assertEqual( + [name for name in dir(test_type) if name not in str_dir], + [], + msg='Names other than CONVERT_STR_* found.', + ) + self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) + self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') + self.assertEqual(format(test_type.CONVERT_STR_TEST_1), 'hello') def test_convert_raise(self): with self.assertRaises(AttributeError): @@ -4533,50 +4379,58 @@ def test_convert_raise(self): filter=lambda x: x.startswith('CONVERT_TEST_')) def test_convert_repr_and_str(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STRING_TEST_')) + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STRING_TEST_'), + as_global=True) self.assertEqual(repr(test_type.CONVERT_STRING_TEST_NAME_A), '%s.CONVERT_STRING_TEST_NAME_A' % SHORT_MODULE) - self.assertEqual(str(test_type.CONVERT_STRING_TEST_NAME_A), 'CONVERT_STRING_TEST_NAME_A') + self.assertEqual(str(test_type.CONVERT_STRING_TEST_NAME_A), '5') self.assertEqual(format(test_type.CONVERT_STRING_TEST_NAME_A), '5') -# global names for StrEnum._convert_ test -CONVERT_STR_TEST_2 = 'goodbye' -CONVERT_STR_TEST_1 = 'hello' -class TestStrEnumConvert(unittest.TestCase): - def test_convert(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.StrEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STR_')) - # Ensure that test_type has all of the desired names and values. - self.assertEqual(test_type.CONVERT_STR_TEST_1, 'hello') - self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') - # Ensure that test_type only picked up names matching the filter. 
- self.assertEqual([name for name in dir(test_type) - if name[0:2] not in ('CO', '__') - and name not in dir(StrEnum)], - [], msg='Names other than CONVERT_STR_* found.') +# helpers - def test_convert_repr_and_str(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.StrEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STR_')) - self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) - self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') - self.assertEqual(format(test_type.CONVERT_STR_TEST_1), 'hello') +def enum_dir(cls): + # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ + if cls._member_type_ is object: + interesting = set() + if cls.__init_subclass__ is not object.__init_subclass__: + interesting.add('__init_subclass__') + return sorted(set([ + '__class__', '__contains__', '__doc__', '__getitem__', + '__iter__', '__len__', '__members__', '__module__', + '__name__', '__qualname__', + ] + + cls._member_names_ + ) | interesting + ) + else: + # return whatever mixed-in data type has + return sorted(set( + dir(cls._member_type_) + + cls._member_names_ + )) + +def member_dir(member): + if member.__class__._member_type_ is object: + allowed = set(['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'value']) + else: + allowed = set(dir(member)) + for cls in member.__class__.mro(): + for name, obj in cls.__dict__.items(): + if name[0] == '_': + continue + if isinstance(obj, enum.property): + if obj.fget is not None or name not in member._member_map_: + allowed.add(name) + else: + allowed.discard(name) + else: + allowed.add(name) + return sorted(allowed) + +missing = object() if __name__ == '__main__': diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index 3f0e7270eb26f..ac4626d0c456e 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -908,7 +908,7 @@ def handler(signum, frame): %s - blocked = %r + blocked = %s signum = signal.SIGALRM # child: block and wait the signal diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index 394d2942483fb..56cc23dbbbf4e 100755 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -1517,9 +1517,11 @@ def testGetaddrinfo(self): infos = socket.getaddrinfo(HOST, 80, socket.AF_INET, socket.SOCK_STREAM) for family, type, _, _, _ in infos: self.assertEqual(family, socket.AF_INET) - self.assertEqual(str(family), 'AF_INET') + self.assertEqual(repr(family), '') + self.assertEqual(str(family), '2') self.assertEqual(type, socket.SOCK_STREAM) - self.assertEqual(str(type), 'SOCK_STREAM') + self.assertEqual(repr(type), '') + self.assertEqual(str(type), '1') infos = socket.getaddrinfo(HOST, None, 0, socket.SOCK_STREAM) for _, socktype, _, _, _ in infos: self.assertEqual(socktype, socket.SOCK_STREAM) @@ -1793,8 +1795,10 @@ def test_str_for_enums(self): # Make sure that the AF_* and SOCK_* constants have enum-like string # reprs. 
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: - self.assertEqual(str(s.family), 'AF_INET') - self.assertEqual(str(s.type), 'SOCK_STREAM') + self.assertEqual(repr(s.family), '') + self.assertEqual(repr(s.type), '') + self.assertEqual(str(s.family), '2') + self.assertEqual(str(s.type), '1') def test_socket_consistent_sock_type(self): SOCK_NONBLOCK = getattr(socket, 'SOCK_NONBLOCK', 0) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index f99a3e8da95f8..64f4bce7f7781 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -373,7 +373,8 @@ def test_str_for_enums(self): # Make sure that the PROTOCOL_* constants have enum-like string # reprs. proto = ssl.PROTOCOL_TLS_CLIENT - self.assertEqual(str(proto), 'PROTOCOL_TLS_CLIENT') + self.assertEqual(repr(proto), '<_SSLMethod.PROTOCOL_TLS_CLIENT: 16>') + self.assertEqual(str(proto), '16') ctx = ssl.SSLContext(proto) self.assertIs(ctx.protocol, proto) @@ -622,7 +623,7 @@ def test_openssl111_deprecations(self): with self.assertWarns(DeprecationWarning) as cm: ssl.SSLContext(protocol) self.assertEqual( - f'{protocol!r} is deprecated', + f'ssl.{protocol.name} is deprecated', str(cm.warning) ) @@ -631,8 +632,9 @@ def test_openssl111_deprecations(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT) with self.assertWarns(DeprecationWarning) as cm: ctx.minimum_version = version + version_text = '%s.%s' % (version.__class__.__name__, version.name) self.assertEqual( - f'ssl.{version!r} is deprecated', + f'ssl.{version_text} is deprecated', str(cm.warning) ) diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index d5e2c5266aae7..8e4e64808b688 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1490,8 +1490,10 @@ def test_formatting_with_enum(self): # issue18780 import enum class Float(float, enum.Enum): + # a mixed-in type will use the name for %s etc. PI = 3.1415926 class Int(enum.IntEnum): + # IntEnum uses the value and not the name for %s etc. IDES = 15 class Str(enum.StrEnum): # StrEnum uses the value and not the name for %s etc. @@ -1508,8 +1510,10 @@ class Str(enum.StrEnum): # formatting jobs delegated from the string implementation: self.assertEqual('...%(foo)s...' % {'foo':Str.ABC}, '...abc...') + self.assertEqual('...%(foo)r...' % {'foo':Int.IDES}, + '......') self.assertEqual('...%(foo)s...' % {'foo':Int.IDES}, - '...IDES...') + '...15...') self.assertEqual('...%(foo)i...' % {'foo':Int.IDES}, '...15...') self.assertEqual('...%(foo)d...' % {'foo':Int.IDES}, diff --git a/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst b/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst new file mode 100644 index 0000000000000..2df487855785e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst @@ -0,0 +1,2 @@ +``IntEnum``, ``IntFlag``, and ``StrEnum`` use the mixed-in type for their +``str()`` and ``format()`` output. 
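As a quick illustration of the behavior the NEWS entry above describes -- a minimal sketch only, using a hypothetical ``Number`` enum that is not part of the patch -- ``str()``, ``format()``, and %-style formatting of an ``IntEnum`` member defer to the mixed-in ``int`` (as the updated socket, ssl, and unicode tests in this commit assert), while ``repr()`` still identifies the member::

    >>> from enum import IntEnum
    >>> class Number(IntEnum):
    ...     ONE = 1
    ...     TWO = 2
    ...
    >>> str(Number.TWO)                       # int.__str__ is used
    '2'
    >>> format(Number.TWO)                    # int.__format__ likewise
    '2'
    >>> '%s / %d' % (Number.TWO, Number.TWO)  # %-codes treat the member as an int
    '2 / 2'
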
From webhook-mailer at python.org Sun Jan 16 02:32:25 2022 From: webhook-mailer at python.org (corona10) Date: Sun, 16 Jan 2022 07:32:25 -0000 Subject: [Python-checkins] bpo-46386: improve `test_typing:test_immutability_by_copy_and_pickle` (GH-30613) Message-ID: https://github.com/python/cpython/commit/09087b8519316608b85131ee7455b664c00c38d2 commit: 09087b8519316608b85131ee7455b664c00c38d2 branch: main author: Nikita Sobolev committer: corona10 date: 2022-01-16T16:32:11+09:00 summary: bpo-46386: improve `test_typing:test_immutability_by_copy_and_pickle` (GH-30613) files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index cf719df6da1d7..c8a077e2f1ff5 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -2130,22 +2130,30 @@ class Node(Generic[T]): ... def test_immutability_by_copy_and_pickle(self): # Special forms like Union, Any, etc., generic aliases to containers like List, # Mapping, etc., and type variabcles are considered immutable by copy and pickle. - global TP, TPB, TPV # for pickle + global TP, TPB, TPV, PP # for pickle TP = TypeVar('TP') TPB = TypeVar('TPB', bound=int) TPV = TypeVar('TPV', bytes, str) - for X in [TP, TPB, TPV, List, typing.Mapping, ClassVar, typing.Iterable, + PP = ParamSpec('PP') + for X in [TP, TPB, TPV, PP, + List, typing.Mapping, ClassVar, typing.Iterable, Union, Any, Tuple, Callable]: - self.assertIs(copy(X), X) - self.assertIs(deepcopy(X), X) - self.assertIs(pickle.loads(pickle.dumps(X)), X) + with self.subTest(thing=X): + self.assertIs(copy(X), X) + self.assertIs(deepcopy(X), X) + for proto in range(pickle.HIGHEST_PROTOCOL + 1): + self.assertIs(pickle.loads(pickle.dumps(X, proto)), X) + del TP, TPB, TPV, PP + # Check that local type variables are copyable. 
TL = TypeVar('TL') TLB = TypeVar('TLB', bound=int) TLV = TypeVar('TLV', bytes, str) - for X in [TL, TLB, TLV]: - self.assertIs(copy(X), X) - self.assertIs(deepcopy(X), X) + PL = ParamSpec('PL') + for X in [TL, TLB, TLV, PL]: + with self.subTest(thing=X): + self.assertIs(copy(X), X) + self.assertIs(deepcopy(X), X) def test_copy_generic_instances(self): T = TypeVar('T') From webhook-mailer at python.org Sun Jan 16 11:06:46 2022 From: webhook-mailer at python.org (mdickinson) Date: Sun, 16 Jan 2022 16:06:46 -0000 Subject: [Python-checkins] bpo-46361: Fix "small" `int` caching (GH-30583) Message-ID: https://github.com/python/cpython/commit/5cd9a162cd02a3d0f1b0a182d80feeb17439e84f commit: 5cd9a162cd02a3d0f1b0a182d80feeb17439e84f branch: main author: Brandt Bucher committer: mdickinson date: 2022-01-16T16:06:37Z summary: bpo-46361: Fix "small" `int` caching (GH-30583) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-12-17-15-17.bpo-46361.mgI_j_.rst M Lib/test/test_decimal.py M Lib/test/test_long.py M Modules/_decimal/_decimal.c M Objects/longobject.c diff --git a/Lib/test/test_decimal.py b/Lib/test/test_decimal.py index b6173a5ffec96..9ced801afc2e9 100644 --- a/Lib/test/test_decimal.py +++ b/Lib/test/test_decimal.py @@ -2552,6 +2552,13 @@ def test_int(self): self.assertRaises(OverflowError, int, Decimal('inf')) self.assertRaises(OverflowError, int, Decimal('-inf')) + @cpython_only + def test_small_ints(self): + Decimal = self.decimal.Decimal + # bpo-46361 + for x in range(-5, 257): + self.assertIs(int(Decimal(x)), x) + def test_trunc(self): Decimal = self.decimal.Decimal diff --git a/Lib/test/test_long.py b/Lib/test/test_long.py index f2a622b5868f0..c7dd0b274d1e3 100644 --- a/Lib/test/test_long.py +++ b/Lib/test/test_long.py @@ -1471,6 +1471,13 @@ def __init__(self, value): self.assertEqual(i, 1) self.assertEqual(getattr(i, 'foo', 'none'), 'bar') + @support.cpython_only + def test_from_bytes_small(self): + # bpo-46361 + for i in range(-5, 257): + b = i.to_bytes(2, signed=True) + self.assertIs(int.from_bytes(b, signed=True), i) + def test_access_to_nonexistent_digit_0(self): # http://bugs.python.org/issue14630: A bug in _PyLong_Copy meant that # ob_digit[0] was being incorrectly accessed for instances of a diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-12-17-15-17.bpo-46361.mgI_j_.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-12-17-15-17.bpo-46361.mgI_j_.rst new file mode 100644 index 0000000000000..eef877d5cbd8f --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-12-17-15-17.bpo-46361.mgI_j_.rst @@ -0,0 +1,2 @@ +Ensure that "small" integers created by :meth:`int.from_bytes` and +:class:`decimal.Decimal` are properly cached. diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c index 7fc7315603e7a..35a115676a71b 100644 --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -3394,6 +3394,13 @@ dec_as_long(PyObject *dec, PyObject *context, int round) return NULL; } + if (n == 1) { + sdigit val = mpd_arith_sign(x) * ob_digit[0]; + mpd_free(ob_digit); + mpd_del(x); + return PyLong_FromLong(val); + } + assert(n > 0); pylong = _PyLong_New(n); if (pylong == NULL) { diff --git a/Objects/longobject.c b/Objects/longobject.c index 5d181aa0850aa..1b2d1266c6bc5 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -911,7 +911,7 @@ _PyLong_FromByteArray(const unsigned char* bytes, size_t n, } Py_SET_SIZE(v, is_signed ? 
-idigit : idigit); - return (PyObject *)long_normalize(v); + return (PyObject *)maybe_small_long(long_normalize(v)); } int From webhook-mailer at python.org Sun Jan 16 16:52:55 2022 From: webhook-mailer at python.org (tiran) Date: Sun, 16 Jan 2022 21:52:55 -0000 Subject: [Python-checkins] bpo-40280: Add requires_fork test helper (GH-30622) Message-ID: https://github.com/python/cpython/commit/91e33ac3d08a1c6004c469da2c0e2a97b5bdc53c commit: 91e33ac3d08a1c6004c469da2c0e2a97b5bdc53c branch: main author: Christian Heimes committer: tiran date: 2022-01-16T22:52:43+01:00 summary: bpo-40280: Add requires_fork test helper (GH-30622) files: A Misc/NEWS.d/next/Tests/2022-01-16-14-11-57.bpo-40280.fNnFfx.rst M Lib/test/support/__init__.py M Lib/test/test_fork1.py M Lib/test/test_random.py M Lib/test/test_support.py M Lib/test/test_sysconfig.py M Lib/test/test_tempfile.py M Lib/test/test_thread.py M Lib/test/test_threading.py M Lib/test/test_uuid.py diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index f8faa41ad439c..ca903d302bdd3 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -39,12 +39,13 @@ "requires_gzip", "requires_bz2", "requires_lzma", "bigmemtest", "bigaddrspacetest", "cpython_only", "get_attribute", "requires_IEEE_754", "requires_zlib", + "has_fork_support", "requires_fork", "anticipate_failure", "load_package_tests", "detect_api_mismatch", "check__all__", "skip_if_buggy_ucrt_strfptime", "check_disallow_instantiation", # sys - "is_jython", "is_android", "check_impl_detail", "unix_shell", - "setswitchinterval", + "is_jython", "is_android", "is_emscripten", + "check_impl_detail", "unix_shell", "setswitchinterval", # network "open_urlresource", # processes @@ -466,6 +467,15 @@ def requires_debug_ranges(reason='requires co_positions / debug_ranges'): else: unix_shell = None +# wasm32-emscripten is POSIX-like but does not provide a +# working fork() or subprocess API. +is_emscripten = sys.platform == "emscripten" + +has_fork_support = hasattr(os, "fork") and not is_emscripten + +def requires_fork(): + return unittest.skipUnless(has_fork_support, "requires working os.fork()") + # Define the URL of a dedicated HTTP server for the network tests. # The URL must use clear-text HTTP: no redirection to encrypted HTTPS. TEST_HTTP_URL = "http://www.pythontest.net" diff --git a/Lib/test/test_fork1.py b/Lib/test/test_fork1.py index a2f7cfee9cf69..a6523bbc51817 100644 --- a/Lib/test/test_fork1.py +++ b/Lib/test/test_fork1.py @@ -14,7 +14,9 @@ # Skip test if fork does not exist. 
-support.get_attribute(os, 'fork') +if not support.has_fork_support: + raise unittest.SkipTest("test module requires working os.fork") + class ForkTest(ForkWait): def test_threaded_import_lock_fork(self): diff --git a/Lib/test/test_random.py b/Lib/test/test_random.py index b80aeca26cf48..f980c5b8df0d2 100644 --- a/Lib/test/test_random.py +++ b/Lib/test/test_random.py @@ -1293,7 +1293,7 @@ def test__all__(self): # tests validity but not completeness of the __all__ list self.assertTrue(set(random.__all__) <= set(dir(random))) - @unittest.skipUnless(hasattr(os, "fork"), "fork() required") + @test.support.requires_fork() def test_after_fork(self): # Test the global Random instance gets reseeded in child r, w = os.pipe() diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py index d5a1d447f0563..4dac7f6cd4200 100644 --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -198,7 +198,7 @@ def test_temp_dir__existing_dir__quiet_true(self): f'temporary directory {path!r}: '), warn) - @unittest.skipUnless(hasattr(os, "fork"), "test requires os.fork") + @support.requires_fork() def test_temp_dir__forked_child(self): """Test that a forked child process does not remove the directory.""" # See bpo-30028 for details. @@ -447,6 +447,7 @@ def test_check__all__(self): @unittest.skipUnless(hasattr(os, 'waitpid') and hasattr(os, 'WNOHANG'), 'need os.waitpid() and os.WNOHANG') + @support.requires_fork() def test_reap_children(self): # Make sure that there is no other pending child process support.reap_children() diff --git a/Lib/test/test_sysconfig.py b/Lib/test/test_sysconfig.py index 506266d08185d..6fbb80d77f793 100644 --- a/Lib/test/test_sysconfig.py +++ b/Lib/test/test_sysconfig.py @@ -412,6 +412,8 @@ def test_SO_value(self): 'EXT_SUFFIX required for this test') def test_EXT_SUFFIX_in_vars(self): import _imp + if not _imp.extension_suffixes(): + self.skipTest("stub loader has no suffixes") vars = sysconfig.get_config_vars() self.assertIsNotNone(vars['SO']) self.assertEqual(vars['SO'], vars['EXT_SUFFIX']) diff --git a/Lib/test/test_tempfile.py b/Lib/test/test_tempfile.py index 2b0ec46a10327..25fddaec6d317 100644 --- a/Lib/test/test_tempfile.py +++ b/Lib/test/test_tempfile.py @@ -198,8 +198,7 @@ def supports_iter(self): if i == 20: break - @unittest.skipUnless(hasattr(os, 'fork'), - "os.fork is required for this test") + @support.requires_fork() def test_process_awareness(self): # ensure that the random source differs between # child and parent. diff --git a/Lib/test/test_thread.py b/Lib/test/test_thread.py index 4ae8a833b990d..d55fb731b6df5 100644 --- a/Lib/test/test_thread.py +++ b/Lib/test/test_thread.py @@ -224,7 +224,7 @@ class TestForkInThread(unittest.TestCase): def setUp(self): self.read_fd, self.write_fd = os.pipe() - @unittest.skipUnless(hasattr(os, 'fork'), 'need os.fork') + @support.requires_fork() @threading_helper.reap_threads def test_forkinthread(self): pid = None diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py index a8f3c139b24be..f03a64232e17c 100644 --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -505,7 +505,7 @@ def test_daemon_param(self): t = threading.Thread(daemon=True) self.assertTrue(t.daemon) - @unittest.skipUnless(hasattr(os, 'fork'), 'needs os.fork()') + @support.requires_fork() def test_fork_at_exit(self): # bpo-42350: Calling os.fork() after threading._shutdown() must # not log an error. 
@@ -533,7 +533,7 @@ def exit_handler(): self.assertEqual(out, b'') self.assertEqual(err.rstrip(), b'child process ok') - @unittest.skipUnless(hasattr(os, 'fork'), 'test needs fork()') + @support.requires_fork() def test_dummy_thread_after_fork(self): # Issue #14308: a dummy thread in the active list doesn't mess up # the after-fork mechanism. @@ -560,7 +560,7 @@ def background_thread(evt): self.assertEqual(out, b'') self.assertEqual(err, b'') - @unittest.skipUnless(hasattr(os, 'fork'), "needs os.fork()") + @support.requires_fork() def test_is_alive_after_fork(self): # Try hard to trigger #18418: is_alive() could sometimes be True on # threads that vanished after a fork. @@ -594,7 +594,7 @@ def f(): th.start() th.join() - @unittest.skipUnless(hasattr(os, 'fork'), "test needs os.fork()") + @support.requires_fork() @unittest.skipUnless(hasattr(os, 'waitpid'), "test needs os.waitpid()") def test_main_thread_after_fork(self): code = """if 1: @@ -616,7 +616,7 @@ def test_main_thread_after_fork(self): self.assertEqual(data, "MainThread\nTrue\nTrue\n") @unittest.skipIf(sys.platform in platforms_to_skip, "due to known OS bug") - @unittest.skipUnless(hasattr(os, 'fork'), "test needs os.fork()") + @support.requires_fork() @unittest.skipUnless(hasattr(os, 'waitpid'), "test needs os.waitpid()") def test_main_thread_after_fork_from_nonmain_thread(self): code = """if 1: @@ -993,7 +993,7 @@ def test_1_join_on_shutdown(self): """ self._run_and_join(script) - @unittest.skipUnless(hasattr(os, 'fork'), "needs os.fork()") + @support.requires_fork() @unittest.skipIf(sys.platform in platforms_to_skip, "due to known OS bug") def test_2_join_in_forked_process(self): # Like the test above, but from a forked interpreter @@ -1014,7 +1014,7 @@ def test_2_join_in_forked_process(self): """ self._run_and_join(script) - @unittest.skipUnless(hasattr(os, 'fork'), "needs os.fork()") + @support.requires_fork() @unittest.skipIf(sys.platform in platforms_to_skip, "due to known OS bug") def test_3_join_in_forked_from_thread(self): # Like the test above, but fork() was called from a worker thread @@ -1085,7 +1085,7 @@ def main(): rc, out, err = assert_python_ok('-c', script) self.assertFalse(err) - @unittest.skipUnless(hasattr(os, 'fork'), "needs os.fork()") + @support.requires_fork() @unittest.skipIf(sys.platform in platforms_to_skip, "due to known OS bug") def test_reinit_tls_after_fork(self): # Issue #13817: fork() would deadlock in a multithreaded program with @@ -1109,7 +1109,7 @@ def do_fork_and_wait(): for t in threads: t.join() - @unittest.skipUnless(hasattr(os, 'fork'), "needs os.fork()") + @support.requires_fork() def test_clear_threads_states_after_fork(self): # Issue #17094: check that threads states are cleared after fork() diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py index 3f56192c70e84..411eec0f40621 100755 --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -647,7 +647,7 @@ def test_uuid5(self): equal(u, self.uuid.UUID(v)) equal(str(u), v) - @unittest.skipUnless(hasattr(os, 'fork'), 'need os.fork') + @support.requires_fork() def testIssue8621(self): # On at least some versions of OSX self.uuid.uuid4 generates # the same sequence of UUIDs in the parent and any diff --git a/Misc/NEWS.d/next/Tests/2022-01-16-14-11-57.bpo-40280.fNnFfx.rst b/Misc/NEWS.d/next/Tests/2022-01-16-14-11-57.bpo-40280.fNnFfx.rst new file mode 100644 index 0000000000000..2d66db1210854 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-16-14-11-57.bpo-40280.fNnFfx.rst @@ -0,0 +1,2 @@ +Add :func:`test.support.requires_fork` 
decorators to mark tests that require +a working :func:`os.fork`. From webhook-mailer at python.org Mon Jan 17 01:23:54 2022 From: webhook-mailer at python.org (tiran) Date: Mon, 17 Jan 2022 06:23:54 -0000 Subject: [Python-checkins] bpo-40280: Change subprocess imports for cleaner error on wasm32 (GH-30620) Message-ID: https://github.com/python/cpython/commit/7f4b69b9076bdbcea31f6ad16eb125ee99cf0175 commit: 7f4b69b9076bdbcea31f6ad16eb125ee99cf0175 branch: main author: Christian Heimes committer: tiran date: 2022-01-17T07:23:36+01:00 summary: bpo-40280: Change subprocess imports for cleaner error on wasm32 (GH-30620) files: A Misc/NEWS.d/next/Library/2022-01-16-14-07-14.bpo-40280.LtFHfF.rst M Lib/subprocess.py diff --git a/Lib/subprocess.py b/Lib/subprocess.py index 33f022f8fced6..358f49a5f8cd8 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -65,16 +65,11 @@ # NOTE: We intentionally exclude list2cmdline as it is # considered an internal implementation detail. issue10838. -try: +_mswindows = sys.platform == "win32" + +if _mswindows: import msvcrt import _winapi - _mswindows = True -except ModuleNotFoundError: - _mswindows = False - import _posixsubprocess - import select - import selectors -else: from _winapi import (CREATE_NEW_CONSOLE, CREATE_NEW_PROCESS_GROUP, STD_INPUT_HANDLE, STD_OUTPUT_HANDLE, STD_ERROR_HANDLE, SW_HIDE, @@ -95,6 +90,10 @@ "NORMAL_PRIORITY_CLASS", "REALTIME_PRIORITY_CLASS", "CREATE_NO_WINDOW", "DETACHED_PROCESS", "CREATE_DEFAULT_ERROR_MODE", "CREATE_BREAKAWAY_FROM_JOB"]) +else: + import _posixsubprocess + import select + import selectors # Exception classes used by this module. diff --git a/Misc/NEWS.d/next/Library/2022-01-16-14-07-14.bpo-40280.LtFHfF.rst b/Misc/NEWS.d/next/Library/2022-01-16-14-07-14.bpo-40280.LtFHfF.rst new file mode 100644 index 0000000000000..b7bd7abd80c42 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-16-14-07-14.bpo-40280.LtFHfF.rst @@ -0,0 +1,4 @@ +:mod:`subprocess` now imports Windows-specific imports when +``sys.platform == "win32"`` and POSIX-specific imports on all other +platforms. This gives a clean exception when ``_posixsubprocess`` is not +available (e.g. Emscripten browser target) and it's slightly faster, too. From webhook-mailer at python.org Mon Jan 17 07:58:45 2022 From: webhook-mailer at python.org (vstinner) Date: Mon, 17 Jan 2022 12:58:45 -0000 Subject: [Python-checkins] Revert "bpo-40066: [Enum] update str() and format() output (GH-30582)" (GH-30632) Message-ID: https://github.com/python/cpython/commit/42a64c03ec5c443f2a5c2ee4284622f5d1f5326c commit: 42a64c03ec5c443f2a5c2ee4284622f5d1f5326c branch: main author: Victor Stinner committer: vstinner date: 2022-01-17T13:58:40+01:00 summary: Revert "bpo-40066: [Enum] update str() and format() output (GH-30582)" (GH-30632) This reverts commit acf7403f9baea3ae1119fc6b4a3298522188bf96. files: D Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst M Doc/howto/enum.rst M Doc/library/enum.rst M Doc/library/ssl.rst M Lib/enum.py M Lib/inspect.py M Lib/plistlib.py M Lib/re.py M Lib/ssl.py M Lib/test/test_enum.py M Lib/test/test_signal.py M Lib/test/test_socket.py M Lib/test/test_ssl.py M Lib/test/test_unicode.py diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index fa0e2283ebc10..6c09b9925c1de 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -2,10 +2,15 @@ Enum HOWTO ========== +:Author: Ethan Furman + .. _enum-basic-tutorial: .. 
currentmodule:: enum +Basic Enum Tutorial +------------------- + An :class:`Enum` is a set of symbolic names bound to unique values. They are similar to global variables, but they offer a more useful :func:`repr()`, grouping, type-safety, and a few other features. @@ -23,14 +28,6 @@ selection of values. For example, the days of the week:: ... SATURDAY = 6 ... SUNDAY = 7 - Or perhaps the RGB primary colors:: - - >>> from enum import Enum - >>> class Color(Enum): - ... RED = 1 - ... GREEN = 2 - ... BLUE = 3 - As you can see, creating an :class:`Enum` is as simple as writing a class that inherits from :class:`Enum` itself. @@ -44,14 +41,13 @@ important, but either way that value can be used to get the corresponding member:: >>> Weekday(3) - + Weekday.WEDNESDAY -As you can see, the ``repr()`` of a member shows the enum name, the member name, -and the value. The ``str()`` of a member shows only the enum name and member -name:: +As you can see, the ``repr()`` of a member shows the enum name and the +member name. The ``str()`` on a member shows only its name:: >>> print(Weekday.THURSDAY) - Weekday.THURSDAY + THURSDAY The *type* of an enumeration member is the enum it belongs to:: @@ -101,8 +97,8 @@ The complete :class:`Weekday` enum now looks like this:: Now we can find out what today is! Observe:: >>> from datetime import date - >>> Weekday.from_date(date.today()) # doctest: +SKIP - + >>> Weekday.from_date(date.today()) + Weekday.TUESDAY Of course, if you're reading this on some other day, you'll see that day instead. @@ -128,21 +124,21 @@ Just like the original :class:`Weekday` enum above, we can have a single selecti >>> first_week_day = Weekday.MONDAY >>> first_week_day - + Weekday.MONDAY But :class:`Flag` also allows us to combine several members into a single variable:: >>> weekend = Weekday.SATURDAY | Weekday.SUNDAY >>> weekend - + Weekday.SATURDAY|Weekday.SUNDAY You can even iterate over a :class:`Flag` variable:: >>> for day in weekend: ... print(day) - Weekday.SATURDAY - Weekday.SUNDAY + SATURDAY + SUNDAY Okay, let's get some chores set up:: @@ -177,7 +173,6 @@ yourself some work and use :func:`auto()` for the values:: .. _enum-advanced-tutorial: - Programmatic access to enumeration members and their attributes --------------------------------------------------------------- @@ -186,16 +181,16 @@ situations where ``Color.RED`` won't do because the exact color is not known at program-writing time). ``Enum`` allows such access:: >>> Color(1) - + Color.RED >>> Color(3) - + Color.BLUE If you want to access enum members by *name*, use item access:: >>> Color['RED'] - + Color.RED >>> Color['GREEN'] - + Color.GREEN If you have an enum member and need its :attr:`name` or :attr:`value`:: @@ -217,7 +212,7 @@ Having two enum members with the same name is invalid:: ... Traceback (most recent call last): ... - TypeError: 'SQUARE' already defined as 2 + TypeError: 'SQUARE' already defined as: 2 However, an enum member can have other names associated with it. Given two entries ``A`` and ``B`` with the same value (and ``A`` defined first), ``B`` @@ -232,11 +227,11 @@ By-name lookup of ``B`` will also return the member ``A``:: ... ALIAS_FOR_SQUARE = 2 ... >>> Shape.SQUARE - + Shape.SQUARE >>> Shape.ALIAS_FOR_SQUARE - + Shape.SQUARE >>> Shape(2) - + Shape.SQUARE .. 
note:: @@ -304,7 +299,7 @@ Iteration Iterating over the members of an enum does not provide the aliases:: >>> list(Shape) - [, , ] + [Shape.SQUARE, Shape.DIAMOND, Shape.CIRCLE] The special attribute ``__members__`` is a read-only ordered mapping of names to members. It includes all names defined in the enumeration, including the @@ -313,10 +308,10 @@ aliases:: >>> for name, member in Shape.__members__.items(): ... name, member ... - ('SQUARE', ) - ('DIAMOND', ) - ('CIRCLE', ) - ('ALIAS_FOR_SQUARE', ) + ('SQUARE', Shape.SQUARE) + ('DIAMOND', Shape.DIAMOND) + ('CIRCLE', Shape.CIRCLE) + ('ALIAS_FOR_SQUARE', Shape.SQUARE) The ``__members__`` attribute can be used for detailed programmatic access to the enumeration members. For example, finding all the aliases:: @@ -365,8 +360,8 @@ below):: Allowed members and attributes of enumerations ---------------------------------------------- -Most of the examples above use integers for enumeration values. Using integers -is short and handy (and provided by default by the `Functional API`_), but not +Most of the examples above use integers for enumeration values. Using integers is +short and handy (and provided by default by the `Functional API`_), but not strictly enforced. In the vast majority of use-cases, one doesn't care what the actual value of an enumeration is. But if the value *is* important, enumerations can have arbitrary values. @@ -394,7 +389,7 @@ usual. If we have this enumeration:: Then:: >>> Mood.favorite_mood() - + Mood.HAPPY >>> Mood.HAPPY.describe() ('HAPPY', 3) >>> str(Mood.FUNKY) @@ -430,7 +425,7 @@ any members. So this is forbidden:: ... Traceback (most recent call last): ... - TypeError: cannot extend + TypeError: MoreColor: cannot extend enumeration 'Color' But this is allowed:: @@ -481,9 +476,11 @@ The :class:`Enum` class is callable, providing the following functional API:: >>> Animal >>> Animal.ANT - + Animal.ANT + >>> Animal.ANT.value + 1 >>> list(Animal) - [, , , ] + [Animal.ANT, Animal.BEE, Animal.CAT, Animal.DOG] The semantics of this API resemble :class:`~collections.namedtuple`. The first argument of the call to :class:`Enum` is the name of the enumeration. @@ -628,7 +625,16 @@ StrEnum The second variation of :class:`Enum` that is provided is also a subclass of :class:`str`. Members of a :class:`StrEnum` can be compared to strings; by extension, string enumerations of different types can also be compared -to each other. +to each other. :class:`StrEnum` exists to help avoid the problem of getting +an incorrect member:: + + >>> from enum import StrEnum + >>> class Directions(StrEnum): + ... NORTH = 'north', # notice the trailing comma + ... SOUTH = 'south' + +Before :class:`StrEnum`, ``Directions.NORTH`` would have been the :class:`tuple` +``('north',)``. .. versionadded:: 3.11 @@ -639,8 +645,9 @@ IntFlag The next variation of :class:`Enum` provided, :class:`IntFlag`, is also based on :class:`int`. The difference being :class:`IntFlag` members can be combined using the bitwise operators (&, \|, ^, ~) and the result is still an -:class:`IntFlag` member, if possible. Like :class:`IntEnum`, :class:`IntFlag` -members are also integers and can be used wherever an :class:`int` is used. +:class:`IntFlag` member, if possible. However, as the name implies, :class:`IntFlag` +members also subclass :class:`int` and can be used wherever an :class:`int` is +used. .. note:: @@ -663,7 +670,7 @@ Sample :class:`IntFlag` class:: ... X = 1 ... 
>>> Perm.R | Perm.W - + Perm.R|Perm.W >>> Perm.R + Perm.W 6 >>> RW = Perm.R | Perm.W @@ -678,11 +685,11 @@ It is also possible to name the combinations:: ... X = 1 ... RWX = 7 >>> Perm.RWX - + Perm.RWX >>> ~Perm.RWX - + Perm(0) >>> Perm(7) - + Perm.RWX .. note:: @@ -695,7 +702,7 @@ Another important difference between :class:`IntFlag` and :class:`Enum` is that if no flags are set (the value is 0), its boolean evaluation is :data:`False`:: >>> Perm.R & Perm.X - + Perm(0) >>> bool(Perm.R & Perm.X) False @@ -703,7 +710,7 @@ Because :class:`IntFlag` members are also subclasses of :class:`int` they can be combined with them (but may lose :class:`IntFlag` membership:: >>> Perm.X | 4 - + Perm.R|Perm.X >>> Perm.X | 8 9 @@ -719,7 +726,7 @@ be combined with them (but may lose :class:`IntFlag` membership:: :class:`IntFlag` members can also be iterated over:: >>> list(RW) - [, ] + [Perm.R, Perm.W] .. versionadded:: 3.11 @@ -746,7 +753,7 @@ flags being set, the boolean evaluation is :data:`False`:: ... GREEN = auto() ... >>> Color.RED & Color.GREEN - + Color(0) >>> bool(Color.RED & Color.GREEN) False @@ -760,7 +767,7 @@ while combinations of flags won't:: ... WHITE = RED | BLUE | GREEN ... >>> Color.WHITE - + Color.WHITE Giving a name to the "no flags set" condition does not change its boolean value:: @@ -772,7 +779,7 @@ value:: ... GREEN = auto() ... >>> Color.BLACK - + Color.BLACK >>> bool(Color.BLACK) False @@ -780,7 +787,7 @@ value:: >>> purple = Color.RED | Color.BLUE >>> list(purple) - [, ] + [Color.RED, Color.BLUE] .. versionadded:: 3.11 @@ -805,16 +812,16 @@ simple to implement independently:: pass This demonstrates how similar derived enumerations can be defined; for example -a :class:`FloatEnum` that mixes in :class:`float` instead of :class:`int`. +a :class:`StrEnum` that mixes in :class:`str` instead of :class:`int`. Some rules: 1. When subclassing :class:`Enum`, mix-in types must appear before :class:`Enum` itself in the sequence of bases, as in the :class:`IntEnum` example above. -2. Mix-in types must be subclassable. For example, :class:`bool` and - :class:`range` are not subclassable and will throw an error during Enum - creation if used as the mix-in type. +2. Mix-in types must be subclassable. For example, + :class:`bool` and :class:`range` are not subclassable + and will throw an error during Enum creation if used as the mix-in type. 3. While :class:`Enum` can have members of any type, once you mix in an additional type, all the members must have values of that type, e.g. :class:`int` above. This restriction does not apply to mix-ins which only @@ -822,18 +829,15 @@ Some rules: 4. When another data type is mixed in, the :attr:`value` attribute is *not the same* as the enum member itself, although it is equivalent and will compare equal. -5. %-style formatting: ``%s`` and ``%r`` call the :class:`Enum` class's +5. %-style formatting: `%s` and `%r` call the :class:`Enum` class's :meth:`__str__` and :meth:`__repr__` respectively; other codes (such as - ``%i`` or ``%h`` for IntEnum) treat the enum member as its mixed-in type. + `%i` or `%h` for IntEnum) treat the enum member as its mixed-in type. 6. :ref:`Formatted string literals `, :meth:`str.format`, - and :func:`format` will use the enum's :meth:`__str__` method. - -.. note:: - - Because :class:`IntEnum`, :class:`IntFlag`, and :class:`StrEnum` are - designed to be drop-in replacements for existing constants, their - :meth:`__str__` method has been reset to their data types - :meth:`__str__` method. 
+ and :func:`format` will use the mixed-in type's :meth:`__format__` + unless :meth:`__str__` or :meth:`__format__` is overridden in the subclass, + in which case the overridden methods or :class:`Enum` methods will be used. + Use the !s and !r format codes to force usage of the :class:`Enum` class's + :meth:`__str__` and :meth:`__repr__` methods. When to use :meth:`__new__` vs. :meth:`__init__` ------------------------------------------------ @@ -862,10 +866,10 @@ want one of them to be the value:: ... >>> print(Coordinate['PY']) - Coordinate.PY + PY >>> print(Coordinate(3)) - Coordinate.VY + VY Finer Points @@ -923,8 +927,8 @@ and raise an error if the two do not match:: Traceback (most recent call last): ... TypeError: member order does not match _order_: - ['RED', 'BLUE', 'GREEN'] - ['RED', 'GREEN', 'BLUE'] + ['RED', 'BLUE', 'GREEN'] + ['RED', 'GREEN', 'BLUE'] .. note:: @@ -945,36 +949,35 @@ but remain normal attributes. """""""""""""""""""" Enum members are instances of their enum class, and are normally accessed as -``EnumClass.member``. In Python versions ``3.5`` to ``3.10`` you could access -members from other members -- this practice was discouraged, and in ``3.11`` -:class:`Enum` returns to not allowing it:: +``EnumClass.member``. In Python versions ``3.5`` to ``3.9`` you could access +members from other members -- this practice was discouraged, and in ``3.12`` +:class:`Enum` will return to not allowing it, while in ``3.10`` and ``3.11`` +it will raise a :exc:`DeprecationWarning`:: >>> class FieldTypes(Enum): ... name = 0 ... value = 1 ... size = 2 ... - >>> FieldTypes.value.size - Traceback (most recent call last): - ... - AttributeError: member has no attribute 'size' - + >>> FieldTypes.value.size # doctest: +SKIP + DeprecationWarning: accessing one member from another is not supported, + and will be disabled in 3.12 + .. versionchanged:: 3.5 -.. versionchanged:: 3.11 Creating members that are mixed with other data types """"""""""""""""""""""""""""""""""""""""""""""""""""" When subclassing other data types, such as :class:`int` or :class:`str`, with -an :class:`Enum`, all values after the ``=`` are passed to that data type's +an :class:`Enum`, all values after the `=` are passed to that data type's constructor. For example:: - >>> class MyEnum(IntEnum): # help(int) -> int(x, base=10) -> integer - ... example = '11', 16 # so x='11' and base=16 - ... - >>> MyEnum.example.value # and hex(11) is... + >>> class MyEnum(IntEnum): + ... example = '11', 16 # '11' will be interpreted as a hexadecimal + ... # number + >>> MyEnum.example.value 17 @@ -997,12 +1000,13 @@ Plain :class:`Enum` classes always evaluate as :data:`True`. """"""""""""""""""""""""""""" If you give your enum subclass extra methods, like the `Planet`_ -class below, those methods will show up in a :func:`dir` of the member, -but not of the class:: +class below, those methods will show up in a :func:`dir` of the member and the +class. 
Attributes defined in an :func:`__init__` method will only show up in a +:func:`dir` of the member:: - >>> dir(Planet) # doctest: +SKIP - ['EARTH', 'JUPITER', 'MARS', 'MERCURY', 'NEPTUNE', 'SATURN', 'URANUS', 'VENUS', '__class__', '__doc__', '__members__', '__module__'] - >>> dir(Planet.EARTH) # doctest: +SKIP + >>> dir(Planet) + ['EARTH', 'JUPITER', 'MARS', 'MERCURY', 'NEPTUNE', 'SATURN', 'URANUS', 'VENUS', '__class__', '__doc__', '__init__', '__members__', '__module__', 'surface_gravity'] + >>> dir(Planet.EARTH) ['__class__', '__doc__', '__module__', 'mass', 'name', 'radius', 'surface_gravity', 'value'] @@ -1021,10 +1025,19 @@ are comprised of a single bit:: ... CYAN = GREEN | BLUE ... >>> Color(3) # named combination - + Color.YELLOW >>> Color(7) # not named combination - + Color.RED|Color.GREEN|Color.BLUE +``StrEnum`` and :meth:`str.__str__` +""""""""""""""""""""""""""""""""""" + +An important difference between :class:`StrEnum` and other Enums is the +:meth:`__str__` method; because :class:`StrEnum` members are strings, some +parts of Python will read the string data directly, while others will call +:meth:`str()`. To make those two operations have the same result, +:meth:`StrEnum.__str__` will be the same as :meth:`str.__str__` so that +``str(StrEnum.member) == StrEnum.member`` is true. ``Flag`` and ``IntFlag`` minutia """""""""""""""""""""""""""""""" @@ -1047,16 +1060,16 @@ the following are true: - only canonical flags are returned during iteration:: >>> list(Color.WHITE) - [, , ] + [Color.RED, Color.GREEN, Color.BLUE] - negating a flag or flag set returns a new flag/flag set with the corresponding positive integer value:: >>> Color.BLUE - + Color.BLUE >>> ~Color.BLUE - + Color.RED|Color.GREEN - names of pseudo-flags are constructed from their members' names:: @@ -1066,29 +1079,25 @@ the following are true: - multi-bit flags, aka aliases, can be returned from operations:: >>> Color.RED | Color.BLUE - + Color.PURPLE >>> Color(7) # or Color(-1) - + Color.WHITE >>> Color(0) - + Color.BLACK -- membership / containment checking: zero-valued flags are always considered - to be contained:: +- membership / containment checking has changed slightly -- zero-valued flags + are never considered to be contained:: >>> Color.BLACK in Color.WHITE - True + False - otherwise, only if all bits of one flag are in the other flag will True - be returned:: + otherwise, if all bits of one flag are in the other flag, True is returned:: >>> Color.PURPLE in Color.WHITE True - >>> Color.GREEN in Color.PURPLE - False - There is a new boundary mechanism that controls how out-of-range / invalid bits are handled: ``STRICT``, ``CONFORM``, ``EJECT``, and ``KEEP``: @@ -1172,7 +1181,7 @@ Using :class:`auto` would look like:: ... GREEN = auto() ... >>> Color.GREEN - + Using :class:`object` @@ -1185,24 +1194,10 @@ Using :class:`object` would look like:: ... GREEN = object() ... BLUE = object() ... - >>> Color.GREEN # doctest: +SKIP - > - -This is also a good example of why you might want to write your own -:meth:`__repr__`:: - - >>> class Color(Enum): - ... RED = object() - ... GREEN = object() - ... BLUE = object() - ... def __repr__(self): - ... return "<%s.%s>" % (self.__class__.__name__, self._name_) - ... >>> Color.GREEN - Using a descriptive string """""""""""""""""""""""""" @@ -1214,7 +1209,9 @@ Using a string as the value would look like:: ... BLUE = 'too fast!' ... 
>>> Color.GREEN - + + >>> Color.GREEN.value + 'go' Using a custom :meth:`__new__` @@ -1235,7 +1232,9 @@ Using an auto-numbering :meth:`__new__` would look like:: ... BLUE = () ... >>> Color.GREEN - + + >>> Color.GREEN.value + 2 To make a more general purpose ``AutoNumber``, add ``*args`` to the signature:: @@ -1258,7 +1257,7 @@ to handle any extra arguments:: ... BLEACHED_CORAL = () # New color, no Pantone code yet! ... >>> Swatch.SEA_GREEN - + >>> Swatch.SEA_GREEN.pantone '1246' >>> Swatch.BLEACHED_CORAL.pantone @@ -1385,9 +1384,30 @@ An example to show the :attr:`_ignore_` attribute in use:: ... Period['day_%d' % i] = i ... >>> list(Period)[:2] - [, ] + [Period.day_0, Period.day_1] >>> list(Period)[-2:] - [, ] + [Period.day_365, Period.day_366] + + +Conforming input to Flag +^^^^^^^^^^^^^^^^^^^^^^^^ + +To create a :class:`Flag` enum that is more resilient to out-of-bounds results +from mathematical operations, you can use the :attr:`FlagBoundary.CONFORM` +setting:: + + >>> from enum import Flag, CONFORM, auto + >>> class Weekday(Flag, boundary=CONFORM): + ... MONDAY = auto() + ... TUESDAY = auto() + ... WEDNESDAY = auto() + ... THURSDAY = auto() + ... FRIDAY = auto() + ... SATURDAY = auto() + ... SUNDAY = auto() + >>> today = Weekday.TUESDAY + >>> Weekday(today + 22) # what day is three weeks from tomorrow? + >>> Weekday.WEDNESDAY .. _enumtype-examples: diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 906c60bc3efe3..8bb19dcdf2b61 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -31,7 +31,7 @@ An enumeration: * uses *call* syntax to return members by value * uses *index* syntax to return members by name -Enumerations are created either by using :keyword:`class` syntax, or by +Enumerations are created either by using the :keyword:`class` syntax, or by using function-call syntax:: >>> from enum import Enum @@ -45,7 +45,7 @@ using function-call syntax:: >>> # functional syntax >>> Color = Enum('Color', ['RED', 'GREEN', 'BLUE']) -Even though we can use :keyword:`class` syntax to create Enums, Enums +Even though we can use the :keyword:`class` syntax to create Enums, Enums are not normal Python classes. See :ref:`How are Enums different? ` for more details. @@ -53,7 +53,7 @@ are not normal Python classes. See - The class :class:`Color` is an *enumeration* (or *enum*) - The attributes :attr:`Color.RED`, :attr:`Color.GREEN`, etc., are - *enumeration members* (or *members*) and are functionally constants. + *enumeration members* (or *enum members*) and are functionally constants. - The enum members have *names* and *values* (the name of :attr:`Color.RED` is ``RED``, the value of :attr:`Color.BLUE` is ``3``, etc.) @@ -110,10 +110,15 @@ Module Contents :class:`StrEnum` defaults to the lower-cased version of the member name, while other Enums default to 1 and increase from there. - :func:`property` + :func:`global_enum` + + :class:`Enum` class decorator to apply the appropriate global `__repr__`, + and export its members into the global name space. + + :func:`.property` Allows :class:`Enum` members to have attributes without conflicting with - member names. + other members' names. :func:`unique` @@ -126,7 +131,7 @@ Module Contents .. versionadded:: 3.6 ``Flag``, ``IntFlag``, ``auto`` -.. versionadded:: 3.11 ``StrEnum``, ``EnumCheck``, ``FlagBoundary``, ``property`` +.. versionadded:: 3.11 ``StrEnum``, ``EnumCheck``, ``FlagBoundary`` --------------- @@ -140,11 +145,6 @@ Data Types to subclass *EnumType* -- see :ref:`Subclassing EnumType ` for details. 
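A minimal doctest-style sketch (an illustrative aside, not part of the commit above) of the class-level behaviour that *EnumType* provides on every subclass, using a throwaway ``Color`` enum::

    >>> from enum import Enum
    >>> class Color(Enum):
    ...     RED = 1
    ...     GREEN = 2
    ...     BLUE = 3
    ...
    >>> Color['RED'] is Color(1) is Color.RED   # name, value, and attribute lookup
    True
    >>> len(Color), [member.name for member in Color]
    (3, ['RED', 'GREEN', 'BLUE'])
    >>> 'RED' in Color.__members__
    True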
- *EnumType* is responsible for setting the correct :meth:`__repr__`, - :meth:`__str__`, :meth:`__format__`, and :meth:`__reduce__` methods on the - final *enum*, as well as creating the enum members, properly handling - duplicates, providing iteration over the enum class, etc. - .. method:: EnumType.__contains__(cls, member) Returns ``True`` if member belongs to the ``cls``:: @@ -162,31 +162,32 @@ Data Types .. method:: EnumType.__dir__(cls) Returns ``['__class__', '__doc__', '__members__', '__module__']`` and the - names of the members in *cls*:: + names of the members in ``cls``. User-defined methods and methods from + mixin classes will also be included:: >>> dir(Color) - ['BLUE', 'GREEN', 'RED', '__class__', '__contains__', '__doc__', '__getitem__', '__init_subclass__', '__iter__', '__len__', '__members__', '__module__', '__name__', '__qualname__'] + ['BLUE', 'GREEN', 'RED', '__class__', '__doc__', '__members__', '__module__'] .. method:: EnumType.__getattr__(cls, name) Returns the Enum member in *cls* matching *name*, or raises an :exc:`AttributeError`:: >>> Color.GREEN - + Color.GREEN .. method:: EnumType.__getitem__(cls, name) - Returns the Enum member in *cls* matching *name*, or raises an :exc:`KeyError`:: + Returns the Enum member in *cls* matching *name*, or raises a :exc:`KeyError`:: >>> Color['BLUE'] - + Color.BLUE .. method:: EnumType.__iter__(cls) Returns each member in *cls* in definition order:: >>> list(Color) - [, , ] + [Color.RED, Color.GREEN, Color.BLUE] .. method:: EnumType.__len__(cls) @@ -200,7 +201,7 @@ Data Types Returns each member in *cls* in reverse definition order:: >>> list(reversed(Color)) - [, , ] + [Color.BLUE, Color.GREEN, Color.RED] .. class:: Enum @@ -231,7 +232,7 @@ Data Types .. attribute:: Enum._ignore_ ``_ignore_`` is only used during creation and is removed from the - enumeration once creation is complete. + enumeration once that is complete. ``_ignore_`` is a list of names that will not become members, and whose names will also be removed from the completed enumeration. See @@ -260,7 +261,7 @@ Data Types .. method:: Enum.__dir__(self) Returns ``['__class__', '__doc__', '__module__', 'name', 'value']`` and - any public methods defined on *self.__class__*:: + any public methods defined on ``self.__class__`` or a mixin class:: >>> from datetime import date >>> class Weekday(Enum): @@ -275,7 +276,7 @@ Data Types ... def today(cls): ... print('today is %s' % cls(date.today().isoweekday()).name) >>> dir(Weekday.SATURDAY) - ['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'today', 'value'] + ['__class__', '__doc__', '__module__', 'name', 'today', 'value'] .. method:: Enum._generate_next_value_(name, start, count, last_values) @@ -297,11 +298,6 @@ Data Types >>> PowersOfThree.SECOND.value 6 - .. method:: Enum.__init_subclass__(cls, \**kwds) - - A *classmethod* that is used to further configure subsequent subclasses. - By default, does nothing. - .. method:: Enum._missing_(cls, value) A *classmethod* for looking up values not found in *cls*. By default it @@ -321,55 +317,42 @@ Data Types >>> Build.DEBUG.value 'debug' >>> Build('deBUG') - + Build.DEBUG .. method:: Enum.__repr__(self) Returns the string used for *repr()* calls. By default, returns the - *Enum* name, member name, and value, but can be overridden:: + *Enum* name and the member name, but can be overridden:: - >>> class OtherStyle(Enum): - ... ALTERNATE = auto() - ... OTHER = auto() - ... SOMETHING_ELSE = auto() + >>> class OldStyle(Enum): + ... RETRO = auto() + ... 
OLD_SCHOOl = auto() + ... YESTERYEAR = auto() ... def __repr__(self): ... cls_name = self.__class__.__name__ - ... return f'{cls_name}.{self.name}' - >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" - (OtherStyle.ALTERNATE, 'OtherStyle.ALTERNATE', 'OtherStyle.ALTERNATE') + ... return f'<{cls_name}.{self.name}: {self.value}>' + >>> OldStyle.RETRO + .. method:: Enum.__str__(self) Returns the string used for *str()* calls. By default, returns the - *Enum* name and member name, but can be overridden:: + member name, but can be overridden:: - >>> class OtherStyle(Enum): - ... ALTERNATE = auto() - ... OTHER = auto() - ... SOMETHING_ELSE = auto() + >>> class OldStyle(Enum): + ... RETRO = auto() + ... OLD_SCHOOl = auto() + ... YESTERYEAR = auto() ... def __str__(self): - ... return f'{self.name}' - >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" - (, 'ALTERNATE', 'ALTERNATE') - - .. method:: Enum.__format__(self) - - Returns the string used for *format()* and *f-string* calls. By default, - returns :meth:`__str__` returns, but can be overridden:: - - >>> class OtherStyle(Enum): - ... ALTERNATE = auto() - ... OTHER = auto() - ... SOMETHING_ELSE = auto() - ... def __format__(self, spec): - ... return f'{self.name}' - >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" - (, 'OtherStyle.ALTERNATE', 'ALTERNATE') + ... cls_name = self.__class__.__name__ + ... return f'{cls_name}.{self.name}' + >>> OldStyle.RETRO + OldStyle.RETRO - .. note:: +.. note:: - Using :class:`auto` with :class:`Enum` results in integers of increasing value, - starting with ``1``. + Using :class:`auto` with :class:`Enum` results in integers of increasing value, + starting with ``1``. .. class:: IntEnum @@ -384,7 +367,7 @@ Data Types ... TWO = 2 ... THREE = 3 >>> Numbers.THREE - + Numbers.THREE >>> Numbers.ONE + Numbers.TWO 3 >>> Numbers.THREE + 5 @@ -392,14 +375,10 @@ Data Types >>> Numbers.THREE == 3 True - .. note:: +.. note:: - Using :class:`auto` with :class:`IntEnum` results in integers of increasing - value, starting with ``1``. - - .. versionchanged:: 3.11 :meth:`__str__` is now :func:`int.__str__` to - better support the *replacement of existing constants* use-case. - :meth:`__format__` was already :func:`int.__format__` for that same reason. + Using :class:`auto` with :class:`IntEnum` results in integers of increasing value, + starting with ``1``. .. class:: StrEnum @@ -413,16 +392,13 @@ Data Types instead of ``isinstance(str, unknown)``), and in those locations you will need to use ``str(StrEnum.member)``. - .. note:: - Using :class:`auto` with :class:`StrEnum` results in the lower-cased member - name as the value. +.. note:: - .. note:: :meth:`__str__` is :func:`str.__str__` to better support the - *replacement of existing constants* use-case. :meth:`__format__` is likewise - :func:`int.__format__` for that same reason. + Using :class:`auto` with :class:`StrEnum` results in values of the member name, + lower-cased. - .. versionadded:: 3.11 +.. versionadded:: 3.11 .. class:: Flag @@ -455,9 +431,9 @@ Data Types Returns all contained members:: >>> list(Color.RED) - [] + [Color.RED] >>> list(purple) - [, ] + [Color.RED, Color.BLUE] .. method:: __len__(self): @@ -485,52 +461,42 @@ Data Types Returns current flag binary or'ed with other:: >>> Color.RED | Color.GREEN - + Color.RED|Color.GREEN .. 
method:: __and__(self, other) Returns current flag binary and'ed with other:: >>> purple & white - + Color.RED|Color.BLUE >>> purple & Color.GREEN - + 0x0 .. method:: __xor__(self, other) Returns current flag binary xor'ed with other:: >>> purple ^ white - + Color.GREEN >>> purple ^ Color.GREEN - + Color.RED|Color.GREEN|Color.BLUE .. method:: __invert__(self): Returns all the flags in *type(self)* that are not in self:: >>> ~white - + 0x0 >>> ~purple - + Color.GREEN >>> ~Color.RED - - - .. method:: _numeric_repr_ - - Function used to format any remaining unnamed numeric values. Default is - the value's repr; common choices are :func:`hex` and :func:`oct`. - - .. note:: + Color.GREEN|Color.BLUE - Using :class:`auto` with :class:`Flag` results in integers that are powers - of two, starting with ``1``. +.. note:: - .. versionchanged:: 3.11 The *repr()* of zero-valued flags has changed. It - is now:: + Using :class:`auto` with :class:`Flag` results in integers that are powers + of two, starting with ``1``. - >>> Color(0) - .. class:: IntFlag @@ -543,9 +509,9 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> Color.RED & 2 - + 0x0 >>> Color.RED | 2 - + Color.RED|Color.GREEN If any integer operation is performed with an *IntFlag* member, the result is not an *IntFlag*:: @@ -558,25 +524,15 @@ Data Types * the result is a valid *IntFlag*: an *IntFlag* is returned * the result is not a valid *IntFlag*: the result depends on the *FlagBoundary* setting - The *repr()* of unnamed zero-valued flags has changed. It is now: - - >>> Color(0) - - - .. note:: - - Using :class:`auto` with :class:`IntFlag` results in integers that are powers - of two, starting with ``1``. - - .. versionchanged:: 3.11 :meth:`__str__` is now :func:`int.__str__` to - better support the *replacement of existing constants* use-case. - :meth:`__format__` was already :func:`int.__format__` for that same reason. +.. note:: + Using :class:`auto` with :class:`IntFlag` results in integers that are powers + of two, starting with ``1``. .. class:: EnumCheck *EnumCheck* contains the options used by the :func:`verify` decorator to ensure - various constraints; failed constraints result in a :exc:`ValueError`. + various constraints; failed constraints result in a :exc:`TypeError`. .. attribute:: UNIQUE @@ -626,11 +582,11 @@ Data Types ... ValueError: invalid Flag 'Color': aliases WHITE and NEON are missing combined values of 0x18 [use enum.show_flag_values(value) for details] - .. note:: +.. note:: - CONTINUOUS and NAMED_FLAGS are designed to work with integer-valued members. + CONTINUOUS and NAMED_FLAGS are designed to work with integer-valued members. - .. versionadded:: 3.11 +.. versionadded:: 3.11 .. class:: FlagBoundary @@ -650,7 +606,7 @@ Data Types >>> StrictFlag(2**2 + 2**4) Traceback (most recent call last): ... - ValueError: invalid value 20 + ValueError: StrictFlag: invalid value: 20 given 0b0 10100 allowed 0b0 00111 @@ -665,7 +621,7 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> ConformFlag(2**2 + 2**4) - + ConformFlag.BLUE .. attribute:: EJECT @@ -691,52 +647,12 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> KeepFlag(2**2 + 2**4) - + KeepFlag.BLUE|0x10 .. versionadded:: 3.11 --------------- -Supported ``__dunder__`` names -"""""""""""""""""""""""""""""" - -:attr:`__members__` is a read-only ordered mapping of ``member_name``:``member`` -items. It is only available on the class. 
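Because ``__members__`` also contains aliases, it is the natural way to find them; a small sketch (illustrative aside, not part of the commit above) reusing the ``Shape`` enum from the tutorial section::

    >>> from enum import Enum
    >>> class Shape(Enum):
    ...     SQUARE = 2
    ...     DIAMOND = 3
    ...     CIRCLE = 4
    ...     ALIAS_FOR_SQUARE = 2
    ...
    >>> list(Shape.__members__)     # aliases are included
    ['SQUARE', 'DIAMOND', 'CIRCLE', 'ALIAS_FOR_SQUARE']
    >>> [name for name, member in Shape.__members__.items() if name != member.name]
    ['ALIAS_FOR_SQUARE']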
- -:meth:`__new__`, if specified, must create and return the enum members; it is -also a very good idea to set the member's :attr:`_value_` appropriately. Once -all the members are created it is no longer used. - - -Supported ``_sunder_`` names -"""""""""""""""""""""""""""" - -- ``_name_`` -- name of the member -- ``_value_`` -- value of the member; can be set / modified in ``__new__`` - -- ``_missing_`` -- a lookup function used when a value is not found; may be - overridden -- ``_ignore_`` -- a list of names, either as a :class:`list` or a :class:`str`, - that will not be transformed into members, and will be removed from the final - class -- ``_order_`` -- used in Python 2/3 code to ensure member order is consistent - (class attribute, removed during class creation) -- ``_generate_next_value_`` -- used to get an appropriate value for an enum - member; may be overridden - - .. note:: - - For standard :class:`Enum` classes the next value chosen is the last value seen - incremented by one. - - For :class:`Flag` classes the next value chosen will be the next highest - power-of-two, regardless of the last value seen. - -.. versionadded:: 3.6 ``_missing_``, ``_order_``, ``_generate_next_value_`` -.. versionadded:: 3.7 ``_ignore_`` - ---------------- - Utilities and Decorators ------------------------ @@ -752,6 +668,15 @@ Utilities and Decorators ``_generate_next_value_`` can be overridden to customize the values used by *auto*. +.. decorator:: global_enum + + A :keyword:`class` decorator specifically for enumerations. It replaces the + :meth:`__repr__` method with one that shows *module_name*.*member_name*. It + also injects the members, and their aliases, into the global namespace they + were defined in. + +.. versionadded:: 3.11 + .. decorator:: property A decorator similar to the built-in *property*, but specifically for @@ -763,7 +688,7 @@ Utilities and Decorators *Enum* class, and *Enum* subclasses can define members with the names ``value`` and ``name``. - .. versionadded:: 3.11 +.. versionadded:: 3.11 .. decorator:: unique @@ -789,7 +714,7 @@ Utilities and Decorators :class:`EnumCheck` are used to specify which constraints should be checked on the decorated enumeration. - .. versionadded:: 3.11 +.. versionadded:: 3.11 --------------- @@ -801,20 +726,14 @@ Notes These three enum types are designed to be drop-in replacements for existing integer- and string-based values; as such, they have extra limitations: - - ``__str__`` uses the value and not the name of the enum member + - ``format()`` will use the value of the enum member, unless ``__str__`` + has been overridden - - ``__format__``, because it uses ``__str__``, will also use the value of - the enum member instead of its name + - ``StrEnum.__str__`` uses the value and not the name of the enum member - If you do not need/want those limitations, you can either create your own - base class by mixing in the ``int`` or ``str`` type yourself:: + If you do not need/want those limitations, you can create your own base + class by mixing in the ``int`` or ``str`` type yourself:: >>> from enum import Enum >>> class MyIntEnum(int, Enum): ... pass - - or you can reassign the appropriate :meth:`str`, etc., in your enum:: - - >>> from enum import IntEnum - >>> class MyIntEnum(IntEnum): - ... __str__ = IntEnum.__str__ diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst index 4d8488a4a28de..eb33d7e1778a7 100644 --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -2070,7 +2070,7 @@ to speed up repeated connections from the same clients. 
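The ssl hunk that follows documents context attributes whose values are enum members; a tiny sketch (illustrative aside, not part of the commit above; the exact repr differs between Python versions, so only equality and type are checked)::

    >>> import ssl
    >>> ctx = ssl.create_default_context()
    >>> ctx.verify_mode == ssl.CERT_REQUIRED
    True
    >>> isinstance(ctx.verify_mode, ssl.VerifyMode)
    True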
:attr:`SSLContext.verify_flags` returns :class:`VerifyFlags` flags: >>> ssl.create_default_context().verify_flags # doctest: +SKIP - + ssl.VERIFY_X509_TRUSTED_FIRST .. attribute:: SSLContext.verify_mode @@ -2082,7 +2082,7 @@ to speed up repeated connections from the same clients. :attr:`SSLContext.verify_mode` returns :class:`VerifyMode` enum: >>> ssl.create_default_context().verify_mode - + ssl.CERT_REQUIRED .. index:: single: certificates diff --git a/Lib/enum.py b/Lib/enum.py index 772e1eac0e1e6..93ea1bea36db7 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -1,16 +1,16 @@ import sys -import builtins as bltns from types import MappingProxyType, DynamicClassAttribute from operator import or_ as _or_ from functools import reduce +from builtins import property as _bltin_property, bin as _bltin_bin __all__ = [ 'EnumType', 'EnumMeta', - 'Enum', 'IntEnum', 'StrEnum', 'Flag', 'IntFlag', 'ReprEnum', + 'Enum', 'IntEnum', 'StrEnum', 'Flag', 'IntFlag', 'auto', 'unique', 'property', 'verify', 'FlagBoundary', 'STRICT', 'CONFORM', 'EJECT', 'KEEP', - 'global_flag_repr', 'global_enum_repr', 'global_str', 'global_enum', + 'global_flag_repr', 'global_enum_repr', 'global_enum', 'EnumCheck', 'CONTINUOUS', 'NAMED_FLAGS', 'UNIQUE', ] @@ -18,7 +18,7 @@ # Dummy value for Enum and Flag as there are explicit checks for them # before they have been created. # This is also why there are checks in EnumType like `if Enum is not None` -Enum = Flag = EJECT = _stdlib_enums = ReprEnum = None +Enum = Flag = EJECT = None def _is_descriptor(obj): """ @@ -116,9 +116,9 @@ def bin(num, max_bits=None): ceiling = 2 ** (num).bit_length() if num >= 0: - s = bltns.bin(num + ceiling).replace('1', '0', 1) + s = _bltin_bin(num + ceiling).replace('1', '0', 1) else: - s = bltns.bin(~num ^ (ceiling - 1) + ceiling) + s = _bltin_bin(~num ^ (ceiling - 1) + ceiling) sign = s[:3] digits = s[3:] if max_bits is not None: @@ -126,19 +126,6 @@ def bin(num, max_bits=None): digits = (sign[-1] * max_bits + digits)[-max_bits:] return "%s %s" % (sign, digits) -def _dedent(text): - """ - Like textwrap.dedent. Rewritten because we cannot import textwrap. 
- """ - lines = text.split('\n') - blanks = 0 - for i, ch in enumerate(lines[0]): - if ch != ' ': - break - for j, l in enumerate(lines): - lines[j] = l[i:] - return '\n'.join(lines) - _auto_null = object() class auto: @@ -162,12 +149,22 @@ def __get__(self, instance, ownerclass=None): return ownerclass._member_map_[self.name] except KeyError: raise AttributeError( - '%r has no attribute %r' % (ownerclass, self.name) + '%s: no class attribute %r' % (ownerclass.__name__, self.name) ) else: if self.fget is None: + # check for member + if self.name in ownerclass._member_map_: + import warnings + warnings.warn( + "accessing one member from another is not supported, " + " and will be disabled in 3.12", + DeprecationWarning, + stacklevel=2, + ) + return ownerclass._member_map_[self.name] raise AttributeError( - '%r member has no attribute %r' % (ownerclass, self.name) + '%s: no instance attribute %r' % (ownerclass.__name__, self.name) ) else: return self.fget(instance) @@ -175,7 +172,7 @@ def __get__(self, instance, ownerclass=None): def __set__(self, instance, value): if self.fset is None: raise AttributeError( - " cannot set attribute %r" % (self.clsname, self.name) + "%s: cannot set instance attribute %r" % (self.clsname, self.name) ) else: return self.fset(instance, value) @@ -183,7 +180,7 @@ def __set__(self, instance, value): def __delete__(self, instance): if self.fdel is None: raise AttributeError( - " cannot delete attribute %r" % (self.clsname, self.name) + "%s: cannot delete instance attribute %r" % (self.clsname, self.name) ) else: return self.fdel(instance) @@ -331,7 +328,7 @@ def __setitem__(self, key, value): elif _is_sunder(key): if key not in ( '_order_', - '_generate_next_value_', '_numeric_repr_', '_missing_', '_ignore_', + '_generate_next_value_', '_missing_', '_ignore_', '_iter_member_', '_iter_member_by_value_', '_iter_member_by_def_', ): raise ValueError( @@ -361,13 +358,13 @@ def __setitem__(self, key, value): key = '_order_' elif key in self._member_names: # descriptor overwriting an enum? - raise TypeError('%r already defined as %r' % (key, self[key])) + raise TypeError('%r already defined as: %r' % (key, self[key])) elif key in self._ignore: pass elif not _is_descriptor(value): if key in self: # enum overwriting a descriptor? - raise TypeError('%r already defined as %r' % (key, self[key])) + raise TypeError('%r already defined as: %r' % (key, self[key])) if isinstance(value, auto): if value.value == _auto_null: value.value = self._generate_next_value( @@ -398,7 +395,7 @@ class EnumType(type): @classmethod def __prepare__(metacls, cls, bases, **kwds): # check that previous enum members do not exist - metacls._check_for_existing_members_(cls, bases) + metacls._check_for_existing_members(cls, bases) # create the namespace dict enum_dict = _EnumDict() enum_dict._cls_name = cls @@ -416,10 +413,9 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # inherited __new__ unless a new __new__ is defined (or the resulting # class will fail). # + # remove any keys listed in _ignore_ if _simple: return super().__new__(metacls, cls, bases, classdict, **kwds) - # - # remove any keys listed in _ignore_ classdict.setdefault('_ignore_', []).append('_ignore_') ignore = classdict['_ignore_'] for key in ignore: @@ -431,8 +427,8 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # check for illegal enum names (any others?) 
invalid_names = set(member_names) & {'mro', ''} if invalid_names: - raise ValueError('invalid enum member name(s) '.format( - ','.join(repr(n) for n in invalid_names))) + raise ValueError('Invalid enum member name: {0}'.format( + ','.join(invalid_names))) # # adjust the sunders _order_ = classdict.pop('_order_', None) @@ -462,8 +458,6 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k classdict['_value2member_map_'] = {} classdict['_unhashable_values_'] = [] classdict['_member_type_'] = member_type - # now set the __repr__ for the value - classdict['_value_repr_'] = metacls._find_data_repr_(cls, bases) # # Flag structures (will be removed if final class is not a Flag classdict['_boundary_'] = ( @@ -473,6 +467,10 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k classdict['_flag_mask_'] = flag_mask classdict['_all_bits_'] = 2 ** ((flag_mask).bit_length()) - 1 classdict['_inverted_'] = None + # + # create a default docstring if one has not been provided + if '__doc__' not in classdict: + classdict['__doc__'] = 'An enumeration.' try: exc = None enum_class = super().__new__(metacls, cls, bases, classdict, **kwds) @@ -483,140 +481,18 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k if exc is not None: raise exc # - # update classdict with any changes made by __init_subclass__ - classdict.update(enum_class.__dict__) - # - # create a default docstring if one has not been provided - if enum_class.__doc__ is None: - if not member_names: - enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ - Create a collection of name/value pairs. - - Example enumeration: - - >>> class Color(Enum): - ... RED = 1 - ... BLUE = 2 - ... GREEN = 3 - - Access them by: - - - attribute access:: - - >>> Color.RED - - - - value lookup: - - >>> Color(1) - - - - name lookup: - - >>> Color['RED'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Color) - 3 - - >>> list(Color) - [, , ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """) - else: - member = list(enum_class)[0] - enum_length = len(enum_class) - cls_name = enum_class.__name__ - if enum_length == 1: - list_line = 'list(%s)' % cls_name - list_repr = '[<%s.%s: %r>]' % (cls_name, member.name, member.value) - elif enum_length == 2: - member2 = list(enum_class)[1] - list_line = 'list(%s)' % cls_name - list_repr = '[<%s.%s: %r>, <%s.%s: %r>]' % ( - cls_name, member.name, member.value, - cls_name, member2.name, member2.value, - ) - else: - member2 = list(enum_class)[1] - member3 = list(enum_class)[2] - list_line = 'list(%s)%s' % (cls_name, ('','[:3]')[enum_length > 3]) - list_repr = '[<%s.%s: %r>, <%s.%s: %r>, <%s.%s: %r>]' % ( - cls_name, member.name, member.value, - cls_name, member2.name, member2.value, - cls_name, member3.name, member3.value, - ) - enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> %s.%s - <%s.%s: %r> - - - value lookup: - - >>> %s(%r) - <%s.%s: %r> - - - name lookup: - - >>> %s[%r] - <%s.%s: %r> - - Enumerations can be iterated over, and know how many members they have: - - >>> len(%s) - %r - - >>> %s - %s - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. 
- """ - % (cls_name, member.name, - cls_name, member.name, member.value, - cls_name, member.value, - cls_name, member.name, member.value, - cls_name, member.name, - cls_name, member.name, member.value, - cls_name, enum_length, - list_line, list_repr, - )) - # # double check that repr and friends are not the mixin's or various # things break (such as pickle) # however, if the method is defined in the Enum itself, don't replace # it - # - # Also, special handling for ReprEnum - if ReprEnum is not None and ReprEnum in bases: - if member_type is object: - raise TypeError( - 'ReprEnum subclasses must be mixed with a data type (i.e.' - ' int, str, float, etc.)' - ) - if '__format__' not in classdict: - enum_class.__format__ = member_type.__format__ - classdict['__format__'] = enum_class.__format__ - if '__str__' not in classdict: - method = member_type.__str__ - if method is object.__str__: - # if member_type does not define __str__, object.__str__ will use - # its __repr__ instead, so we'll also use its __repr__ - method = member_type.__repr__ - enum_class.__str__ = method - classdict['__str__'] = enum_class.__str__ for name in ('__repr__', '__str__', '__format__', '__reduce_ex__'): - if name not in classdict: - setattr(enum_class, name, getattr(first_enum, name)) + if name in classdict: + continue + class_method = getattr(enum_class, name) + obj_method = getattr(member_type, name, None) + enum_method = getattr(first_enum, name, None) + if obj_method is not None and obj_method is class_method: + setattr(enum_class, name, enum_method) # # replace any other __new__ with our own (as long as Enum is not None, # anyway) -- again, this is to support pickle @@ -687,13 +563,13 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # _order_ step 4: verify that _order_ and _member_names_ match if _order_ != enum_class._member_names_: raise TypeError( - 'member order does not match _order_:\n %r\n %r' + 'member order does not match _order_:\n%r\n%r' % (enum_class._member_names_, _order_) ) # return enum_class - def __bool__(cls): + def __bool__(self): """ classes/types should always be True. """ @@ -738,13 +614,6 @@ def __call__(cls, value, names=None, *, module=None, qualname=None, type=None, s ) def __contains__(cls, member): - """ - Return True if member is a member of this enum - raises TypeError if member is not an enum member - - note: in 3.12 TypeError will no longer be raised, and True will also be - returned if member is the value of a member in this enum - """ if not isinstance(member, Enum): import warnings warnings.warn( @@ -762,33 +631,60 @@ def __delattr__(cls, attr): # nicer error message when someone tries to delete an attribute # (see issue19025). if attr in cls._member_map_: - raise AttributeError("%r cannot delete member %r." % (cls.__name__, attr)) + raise AttributeError("%s: cannot delete Enum member %r." 
% (cls.__name__, attr)) super().__delattr__(attr) - def __dir__(cls): - # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ - # on object-based enums - if cls._member_type_ is object: - interesting = set(cls._member_names_) - if cls._new_member_ is not object.__new__: - interesting.add('__new__') - if cls.__init_subclass__ is not object.__init_subclass__: - interesting.add('__init_subclass__') - for method in ('__init__', '__format__', '__repr__', '__str__'): - if getattr(cls, method) not in (getattr(Enum, method), getattr(Flag, method)): - interesting.add(method) - return sorted(set([ - '__class__', '__contains__', '__doc__', '__getitem__', - '__iter__', '__len__', '__members__', '__module__', - '__name__', '__qualname__', - ]) | interesting - ) - else: - # return whatever mixed-in data type has - return sorted(set( - dir(cls._member_type_) - + cls._member_names_ - )) + def __dir__(self): + # Start off with the desired result for dir(Enum) + cls_dir = {'__class__', '__doc__', '__members__', '__module__'} + add_to_dir = cls_dir.add + mro = self.__mro__ + this_module = globals().values() + is_from_this_module = lambda cls: any(cls is thing for thing in this_module) + first_enum_base = next(cls for cls in mro if is_from_this_module(cls)) + enum_dict = Enum.__dict__ + sentinel = object() + # special-case __new__ + ignored = {'__new__', *filter(_is_sunder, enum_dict)} + add_to_ignored = ignored.add + + # We want these added to __dir__ + # if and only if they have been user-overridden + enum_dunders = set(filter(_is_dunder, enum_dict)) + + for cls in mro: + # Ignore any classes defined in this module + if cls is object or is_from_this_module(cls): + continue + + cls_lookup = cls.__dict__ + + # If not an instance of EnumType, + # ensure all attributes excluded from that class's `dir()` are ignored here. + if not isinstance(cls, EnumType): + cls_lookup = set(cls_lookup).intersection(dir(cls)) + + for attr_name in cls_lookup: + # Already seen it? Carry on + if attr_name in cls_dir or attr_name in ignored: + continue + # Sunders defined in Enum.__dict__ are already in `ignored`, + # But sunders defined in a subclass won't be (we want all sunders excluded). + elif _is_sunder(attr_name): + add_to_ignored(attr_name) + # Not an "enum dunder"? Add it to dir() output. + elif attr_name not in enum_dunders: + add_to_dir(attr_name) + # Is an "enum dunder", and is defined by a class from enum.py? Ignore it. + elif getattr(self, attr_name) is getattr(first_enum_base, attr_name, sentinel): + add_to_ignored(attr_name) + # Is an "enum dunder", and is either user-defined or defined by a mixin class? + # Add it to dir() output. + else: + add_to_dir(attr_name) + + # sort the output before returning it, so that the result is deterministic. + return sorted(cls_dir) def __getattr__(cls, name): """ @@ -807,24 +703,18 @@ def __getattr__(cls, name): raise AttributeError(name) from None def __getitem__(cls, name): - """ - Return the member matching `name`. - """ return cls._member_map_[name] def __iter__(cls): """ - Return members in definition order. + Returns members in definition order. """ return (cls._member_map_[name] for name in cls._member_names_) def __len__(cls): - """ - Return the number of members (no aliases) - """ return len(cls._member_names_) - @bltns.property + @_bltin_property def __members__(cls): """ Returns a mapping of member name->value. @@ -842,7 +732,7 @@ def __repr__(cls): def __reversed__(cls): """ - Return members in reverse definition order. 
+ Returns members in reverse definition order. """ return (cls._member_map_[name] for name in reversed(cls._member_names_)) @@ -856,7 +746,7 @@ def __setattr__(cls, name, value): """ member_map = cls.__dict__.get('_member_map_', {}) if name in member_map: - raise AttributeError('cannot reassign member %r' % (name, )) + raise AttributeError('Cannot reassign member %r.' % (name, )) super().__setattr__(name, value) def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, start=1, boundary=None): @@ -911,7 +801,8 @@ def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, s return metacls.__new__(metacls, class_name, bases, classdict, boundary=boundary) - def _convert_(cls, name, module, filter, source=None, *, boundary=None, as_global=False): + def _convert_(cls, name, module, filter, source=None, *, boundary=None): + """ Create a new Enum subclass that replaces a collection of global constants """ @@ -943,25 +834,22 @@ def _convert_(cls, name, module, filter, source=None, *, boundary=None, as_globa tmp_cls = type(name, (object, ), body) cls = _simple_enum(etype=cls, boundary=boundary or KEEP)(tmp_cls) cls.__reduce_ex__ = _reduce_ex_by_global_name - if as_global: - global_enum(cls) - else: - sys.modules[cls.__module__].__dict__.update(cls.__members__) + global_enum(cls) module_globals[name] = cls return cls - @classmethod - def _check_for_existing_members_(mcls, class_name, bases): + @staticmethod + def _check_for_existing_members(class_name, bases): for chain in bases: for base in chain.__mro__: if issubclass(base, Enum) and base._member_names_: raise TypeError( - " cannot extend %r" - % (class_name, base) + "%s: cannot extend enumeration %r" + % (class_name, base.__name__) ) @classmethod - def _get_mixins_(mcls, class_name, bases): + def _get_mixins_(cls, class_name, bases): """ Returns the type for creating enum members, and the first inherited enum class. @@ -971,7 +859,30 @@ def _get_mixins_(mcls, class_name, bases): if not bases: return object, Enum - mcls._check_for_existing_members_(class_name, bases) + def _find_data_type(bases): + data_types = set() + for chain in bases: + candidate = None + for base in chain.__mro__: + if base is object: + continue + elif issubclass(base, Enum): + if base._member_type_ is not object: + data_types.add(base._member_type_) + break + elif '__new__' in base.__dict__: + if issubclass(base, Enum): + continue + data_types.add(candidate or base) + break + else: + candidate = candidate or base + if len(data_types) > 1: + raise TypeError('%r: too many data types: %r' % (class_name, data_types)) + elif data_types: + return data_types.pop() + else: + return None # ensure final parent class is an Enum derivative, find any concrete # data type, and check that Enum has no members @@ -979,51 +890,12 @@ def _get_mixins_(mcls, class_name, bases): if not issubclass(first_enum, Enum): raise TypeError("new enumerations should be created as " "`EnumName([mixin_type, ...] 
[data_type,] enum_type)`") - member_type = mcls._find_data_type_(class_name, bases) or object + cls._check_for_existing_members(class_name, bases) + member_type = _find_data_type(bases) or object return member_type, first_enum - @classmethod - def _find_data_repr_(mcls, class_name, bases): - for chain in bases: - for base in chain.__mro__: - if base is object: - continue - elif issubclass(base, Enum): - # if we hit an Enum, use it's _value_repr_ - return base._value_repr_ - elif '__repr__' in base.__dict__: - # this is our data repr - return base.__dict__['__repr__'] - return None - - @classmethod - def _find_data_type_(mcls, class_name, bases): - data_types = set() - for chain in bases: - candidate = None - for base in chain.__mro__: - if base is object: - continue - elif issubclass(base, Enum): - if base._member_type_ is not object: - data_types.add(base._member_type_) - break - elif '__new__' in base.__dict__: - if issubclass(base, Enum): - continue - data_types.add(candidate or base) - break - else: - candidate = candidate or base - if len(data_types) > 1: - raise TypeError('too many data types for %r: %r' % (class_name, data_types)) - elif data_types: - return data_types.pop() - else: - return None - - @classmethod - def _find_new_(mcls, classdict, member_type, first_enum): + @staticmethod + def _find_new_(classdict, member_type, first_enum): """ Returns the __new__ to be used for creating the enum members. @@ -1071,42 +943,9 @@ def _find_new_(mcls, classdict, member_type, first_enum): class Enum(metaclass=EnumType): """ - Create a collection of name/value pairs. - - Example enumeration: - - >>> class Color(Enum): - ... RED = 1 - ... BLUE = 2 - ... GREEN = 3 - - Access them by: - - - attribute access:: - - >>> Color.RED - - - - value lookup: - - >>> Color(1) - + Generic enumeration. - - name lookup: - - >>> Color['RED'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Color) - 3 - - >>> list(Color) - [, , ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. + Derive from this class to define new enumerations. """ def __new__(cls, value): @@ -1160,9 +999,6 @@ def __new__(cls, value): exc = None ve_exc = None - def __init__(self, *args, **kwds): - pass - def _generate_next_value_(name, start, count, last_values): """ Generate the next value when not given. 
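``_generate_next_value_`` is the hook that ``auto()`` calls, so overriding it changes what ``auto()`` produces; a minimal sketch (illustrative aside, not part of the commit above)::

    >>> from enum import Enum, auto
    >>> class Direction(Enum):
    ...     # the hook must be defined before any members that use auto()
    ...     def _generate_next_value_(name, start, count, last_values):
    ...         return name.lower()     # value becomes the lower-cased member name
    ...     NORTH = auto()
    ...     SOUTH = auto()
    ...
    >>> Direction.NORTH.value
    'north'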
@@ -1185,44 +1021,47 @@ def _missing_(cls, value): return None def __repr__(self): - v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ - return "<%s.%s: %s>" % (self.__class__.__name__, self._name_, v_repr(self._value_)) + return "%s.%s" % ( self.__class__.__name__, self._name_) def __str__(self): - return "%s.%s" % (self.__class__.__name__, self._name_, ) + return "%s" % (self._name_, ) def __dir__(self): """ Returns all members and all public methods """ - if self.__class__._member_type_ is object: - interesting = set(['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'value']) - else: - interesting = set(object.__dir__(self)) - for name in getattr(self, '__dict__', []): - if name[0] != '_': - interesting.add(name) - for cls in self.__class__.mro(): - for name, obj in cls.__dict__.items(): - if name[0] == '_': - continue - if isinstance(obj, property): - # that's an enum.property - if obj.fget is not None or name not in self._member_map_: - interesting.add(name) - else: - # in case it was added by `dir(self)` - interesting.discard(name) - else: - interesting.add(name) - names = sorted( - set(['__class__', '__doc__', '__eq__', '__hash__', '__module__']) - | interesting - ) - return names + cls = type(self) + to_exclude = {'__members__', '__init__', '__new__', *cls._member_names_} + filtered_self_dict = (name for name in self.__dict__ if not name.startswith('_')) + return sorted({'name', 'value', *dir(cls), *filtered_self_dict} - to_exclude) def __format__(self, format_spec): - return str.__format__(str(self), format_spec) + """ + Returns format using actual value type unless __str__ has been overridden. + """ + # mixed-in Enums should use the mixed-in type's __format__, otherwise + # we can get strange results with the Enum name showing up instead of + # the value + # + # pure Enum branch, or branch with __str__ explicitly overridden + str_overridden = type(self).__str__ not in (Enum.__str__, IntEnum.__str__, Flag.__str__) + if self._member_type_ is object or str_overridden: + cls = str + val = str(self) + # mix-in branch + else: + if not format_spec or format_spec in ('{}','{:}'): + import warnings + warnings.warn( + "in 3.12 format() will use the enum member, not the enum member's value;\n" + "use a format specifier, such as :d for an integer-based Enum, to maintain " + "the current display", + DeprecationWarning, + stacklevel=2, + ) + cls = self._member_type_ + val = self._value_ + return cls.__format__(val, format_spec) def __hash__(self): return hash(self._name_) @@ -1249,25 +1088,34 @@ def value(self): return self._value_ -class ReprEnum(Enum): +class IntEnum(int, Enum): """ - Only changes the repr(), leaving str() and format() to the mixed-in type. + Enum where members are also (and must be) ints """ + def __str__(self): + return "%s" % (self._name_, ) -class IntEnum(int, ReprEnum): - """ - Enum where members are also (and must be) ints - """ + def __format__(self, format_spec): + """ + Returns format using actual value unless __str__ has been overridden. 
+ """ + str_overridden = type(self).__str__ != IntEnum.__str__ + if str_overridden: + cls = str + val = str(self) + else: + cls = self._member_type_ + val = self._value_ + return cls.__format__(val, format_spec) -class StrEnum(str, ReprEnum): +class StrEnum(str, Enum): """ Enum where members are also (and must be) strings """ def __new__(cls, *values): - "values must already be of type `str`" if len(values) > 3: raise TypeError('too many arguments for str(): %r' % (values, )) if len(values) == 1: @@ -1287,6 +1135,10 @@ def __new__(cls, *values): member._value_ = value return member + __str__ = str.__str__ + + __format__ = str.__format__ + def _generate_next_value_(name, start, count, last_values): """ Return the lower-cased version of the member name. @@ -1317,8 +1169,6 @@ class Flag(Enum, boundary=STRICT): Support for flags """ - _numeric_repr_ = repr - def _generate_next_value_(name, start, count, last_values): """ Generate the next value when not given. @@ -1334,7 +1184,7 @@ def _generate_next_value_(name, start, count, last_values): try: high_bit = _high_bit(last_value) except Exception: - raise TypeError('invalid flag value %r' % last_value) from None + raise TypeError('Invalid Flag value: %r' % last_value) from None return 2 ** (high_bit+1) @classmethod @@ -1382,8 +1232,8 @@ def _missing_(cls, value): if cls._boundary_ is STRICT: max_bits = max(value.bit_length(), flag_mask.bit_length()) raise ValueError( - "%r invalid value %r\n given %s\n allowed %s" % ( - cls, value, bin(value, max_bits), bin(flag_mask, max_bits), + "%s: invalid value: %r\n given %s\n allowed %s" % ( + cls.__name__, value, bin(value, max_bits), bin(flag_mask, max_bits), )) elif cls._boundary_ is CONFORM: value = value & flag_mask @@ -1397,7 +1247,7 @@ def _missing_(cls, value): ) else: raise ValueError( - '%r unknown flag boundary %r' % (cls, cls._boundary_, ) + 'unknown flag boundary: %r' % (cls._boundary_, ) ) if value < 0: neg_value = value @@ -1424,7 +1274,7 @@ def _missing_(cls, value): m._name_ for m in cls._iter_member_(member_value) ]) if unknown: - pseudo_member._name_ += '|%s' % cls._numeric_repr_(unknown) + pseudo_member._name_ += '|0x%x' % unknown else: pseudo_member._name_ = None # use setdefault in case another thread already created a composite @@ -1442,8 +1292,10 @@ def __contains__(self, other): """ if not isinstance(other, self.__class__): raise TypeError( - "unsupported operand type(s) for 'in': %r and %r" % ( + "unsupported operand type(s) for 'in': '%s' and '%s'" % ( type(other).__qualname__, self.__class__.__qualname__)) + if other._value_ == 0 or self._value_ == 0: + return False return other._value_ & self._value_ == other._value_ def __iter__(self): @@ -1457,18 +1309,27 @@ def __len__(self): def __repr__(self): cls_name = self.__class__.__name__ - v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ if self._name_ is None: - return "<%s: %s>" % (cls_name, v_repr(self._value_)) + return "0x%x" % (self._value_, ) + if _is_single_bit(self._value_): + return '%s.%s' % (cls_name, self._name_) + if self._boundary_ is not FlagBoundary.KEEP: + return '%s.' % cls_name + ('|%s.' 
% cls_name).join(self.name.split('|')) else: - return "<%s.%s: %s>" % (cls_name, self._name_, v_repr(self._value_)) + name = [] + for n in self._name_.split('|'): + if n.startswith('0'): + name.append(n) + else: + name.append('%s.%s' % (cls_name, n)) + return '|'.join(name) def __str__(self): - cls_name = self.__class__.__name__ + cls = self.__class__ if self._name_ is None: - return '%s(%r)' % (cls_name, self._value_) + return '%s(%x)' % (cls.__name__, self._value_) else: - return "%s.%s" % (cls_name, self._name_) + return self._name_ def __bool__(self): return bool(self._value_) @@ -1501,11 +1362,20 @@ def __invert__(self): return self._inverted_ -class IntFlag(int, ReprEnum, Flag, boundary=EJECT): +class IntFlag(int, Flag, boundary=EJECT): """ Support for integer-based Flags """ + def __format__(self, format_spec): + """ + Returns format using actual value unless __str__ has been overridden. + """ + str_overridden = type(self).__str__ != Flag.__str__ + value = self + if not str_overridden: + value = self._value_ + return int.__format__(value, format_spec) def __or__(self, other): if isinstance(other, self.__class__): @@ -1542,7 +1412,6 @@ def __xor__(self, other): __rxor__ = __xor__ __invert__ = Flag.__invert__ - def _high_bit(value): """ returns index of highest bit, or -1 if value is zero or negative @@ -1587,7 +1456,7 @@ def global_flag_repr(self): module = self.__class__.__module__.split('.')[-1] cls_name = self.__class__.__name__ if self._name_ is None: - return "%s.%s(%r)" % (module, cls_name, self._value_) + return "%s.%s(0x%x)" % (module, cls_name, self._value_) if _is_single_bit(self): return '%s.%s' % (module, self._name_) if self._boundary_ is not FlagBoundary.KEEP: @@ -1595,22 +1464,14 @@ def global_flag_repr(self): else: name = [] for n in self._name_.split('|'): - if n[0].isdigit(): + if n.startswith('0'): name.append(n) else: name.append('%s.%s' % (module, n)) return '|'.join(name) -def global_str(self): - """ - use enum_name instead of class.enum_name - """ - if self._name_ is None: - return "%s(%r)" % (cls_name, self._value_) - else: - return self._name_ -def global_enum(cls, update_str=False): +def global_enum(cls): """ decorator that makes the repr() of an enum member reference its module instead of its class; also exports all members to the enum's module's @@ -1620,8 +1481,6 @@ def global_enum(cls, update_str=False): cls.__repr__ = global_flag_repr else: cls.__repr__ = global_enum_repr - if not issubclass(cls, ReprEnum) or update_str: - cls.__str__ = global_str sys.modules[cls.__module__].__dict__.update(cls.__members__) return cls @@ -1663,7 +1522,6 @@ def convert_class(cls): body['_value2member_map_'] = value2member_map = {} body['_unhashable_values_'] = [] body['_member_type_'] = member_type = etype._member_type_ - body['_value_repr_'] = etype._value_repr_ if issubclass(etype, Flag): body['_boundary_'] = boundary or etype._boundary_ body['_flag_mask_'] = None @@ -1685,8 +1543,13 @@ def convert_class(cls): # it enum_class = type(cls_name, (etype, ), body, boundary=boundary, _simple=True) for name in ('__repr__', '__str__', '__format__', '__reduce_ex__'): - if name not in body: - setattr(enum_class, name, getattr(etype, name)) + if name in body: + continue + class_method = getattr(enum_class, name) + obj_method = getattr(member_type, name, None) + enum_method = getattr(etype, name, None) + if obj_method is not None and obj_method is class_method: + setattr(enum_class, name, enum_method) gnv_last_values = [] if issubclass(enum_class, Flag): # Flag / IntFlag @@ 
-1897,8 +1760,8 @@ def _test_simple_enum(checked_enum, simple_enum): + list(simple_enum._member_map_.keys()) ) for key in set(checked_keys + simple_keys): - if key in ('__module__', '_member_map_', '_value2member_map_', '__doc__'): - # keys known to be different, or very long + if key in ('__module__', '_member_map_', '_value2member_map_'): + # keys known to be different continue elif key in member_names: # members are checked below @@ -2019,5 +1882,3 @@ def _old_convert_(etype, name, module, filter, source=None, *, boundary=None): cls.__reduce_ex__ = _reduce_ex_by_global_name cls.__repr__ = global_enum_repr return cls - -_stdlib_enums = IntEnum, StrEnum, IntFlag diff --git a/Lib/inspect.py b/Lib/inspect.py index 8236698b8de0f..5d33f0d445fb9 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -2567,21 +2567,15 @@ class _empty: class _ParameterKind(enum.IntEnum): - POSITIONAL_ONLY = 'positional-only' - POSITIONAL_OR_KEYWORD = 'positional or keyword' - VAR_POSITIONAL = 'variadic positional' - KEYWORD_ONLY = 'keyword-only' - VAR_KEYWORD = 'variadic keyword' - - def __new__(cls, description): - value = len(cls.__members__) - member = int.__new__(cls, value) - member._value_ = value - member.description = description - return member + POSITIONAL_ONLY = 0 + POSITIONAL_OR_KEYWORD = 1 + VAR_POSITIONAL = 2 + KEYWORD_ONLY = 3 + VAR_KEYWORD = 4 - def __str__(self): - return self.name + @property + def description(self): + return _PARAM_NAME_MAPPING[self] _POSITIONAL_ONLY = _ParameterKind.POSITIONAL_ONLY _POSITIONAL_OR_KEYWORD = _ParameterKind.POSITIONAL_OR_KEYWORD @@ -2589,6 +2583,14 @@ def __str__(self): _KEYWORD_ONLY = _ParameterKind.KEYWORD_ONLY _VAR_KEYWORD = _ParameterKind.VAR_KEYWORD +_PARAM_NAME_MAPPING = { + _POSITIONAL_ONLY: 'positional-only', + _POSITIONAL_OR_KEYWORD: 'positional or keyword', + _VAR_POSITIONAL: 'variadic positional', + _KEYWORD_ONLY: 'keyword-only', + _VAR_KEYWORD: 'variadic keyword' +} + class Parameter: """Represents a parameter in a function signature. 
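Whichever of the two ``_ParameterKind`` implementations above is in use, the public ``description`` behaviour is the same; a short sketch (illustrative aside, not part of the commit above)::

    >>> import inspect
    >>> inspect.Parameter.KEYWORD_ONLY.description
    'keyword-only'
    >>> [kind.description for kind in inspect._ParameterKind]
    ['positional-only', 'positional or keyword', 'variadic positional', 'keyword-only', 'variadic keyword']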
diff --git a/Lib/plistlib.py b/Lib/plistlib.py index 4862355b2252c..3ab71edc320af 100644 --- a/Lib/plistlib.py +++ b/Lib/plistlib.py @@ -61,8 +61,7 @@ from xml.parsers.expat import ParserCreate -PlistFormat = enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__) -globals().update(PlistFormat.__members__) +PlistFormat = enum.global_enum(enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__)) class UID: diff --git a/Lib/re.py b/Lib/re.py index a7ab9b3706748..ea41217ce08c2 100644 --- a/Lib/re.py +++ b/Lib/re.py @@ -155,8 +155,6 @@ class RegexFlag: # sre extensions (experimental, don't rely on these) TEMPLATE = T = sre_compile.SRE_FLAG_TEMPLATE # disable backtracking DEBUG = sre_compile.SRE_FLAG_DEBUG # dump pattern after compilation - __str__ = object.__str__ - _numeric_repr_ = hex # sre exception error = sre_compile.error diff --git a/Lib/ssl.py b/Lib/ssl.py index dafb70a67864c..207925166efa3 100644 --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -119,6 +119,7 @@ ) from _ssl import _DEFAULT_CIPHERS, _OPENSSL_API_VERSION + _IntEnum._convert_( '_SSLMethod', __name__, lambda name: name.startswith('PROTOCOL_') and name != 'PROTOCOL_SSLv23', diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index a0953fb960f33..43f98c1c1efb6 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -6,18 +6,15 @@ import sys import unittest import threading -import builtins as bltns from collections import OrderedDict -from datetime import date from enum import Enum, IntEnum, StrEnum, EnumType, Flag, IntFlag, unique, auto from enum import STRICT, CONFORM, EJECT, KEEP, _simple_enum, _test_simple_enum -from enum import verify, UNIQUE, CONTINUOUS, NAMED_FLAGS, ReprEnum +from enum import verify, UNIQUE, CONTINUOUS, NAMED_FLAGS from io import StringIO from pickle import dumps, loads, PicklingError, HIGHEST_PROTOCOL from test import support from test.support import ALWAYS_EQ from test.support import threading_helper -from textwrap import dedent from datetime import timedelta python_version = sys.version_info[:2] @@ -110,12 +107,6 @@ def test_pickle_exception(assertion, exception, obj): class TestHelpers(unittest.TestCase): # _is_descriptor, _is_sunder, _is_dunder - sunder_names = '_bad_', '_good_', '_what_ho_' - dunder_names = '__mal__', '__bien__', '__que_que__' - private_names = '_MyEnum__private', '_MyEnum__still_private' - private_and_sunder_names = '_MyEnum__private_', '_MyEnum__also_private_' - random_names = 'okay', '_semi_private', '_weird__', '_MyEnum__' - def test_is_descriptor(self): class foo: pass @@ -125,36 +116,21 @@ class foo: setattr(obj, attr, 1) self.assertTrue(enum._is_descriptor(obj)) - def test_sunder(self): - for name in self.sunder_names + self.private_and_sunder_names: - self.assertTrue(enum._is_sunder(name), '%r is a not sunder name?' % name) - for name in self.dunder_names + self.private_names + self.random_names: - self.assertFalse(enum._is_sunder(name), '%r is a sunder name?' % name) + def test_is_sunder(self): for s in ('_a_', '_aa_'): self.assertTrue(enum._is_sunder(s)) + for s in ('a', 'a_', '_a', '__a', 'a__', '__a__', '_a__', '__a_', '_', '__', '___', '____', '_____',): self.assertFalse(enum._is_sunder(s)) - def test_dunder(self): - for name in self.dunder_names: - self.assertTrue(enum._is_dunder(name), '%r is a not dunder name?' % name) - for name in self.sunder_names + self.private_names + self.private_and_sunder_names + self.random_names: - self.assertFalse(enum._is_dunder(name), '%r is a dunder name?' 
% name) + def test_is_dunder(self): for s in ('__a__', '__aa__'): self.assertTrue(enum._is_dunder(s)) for s in ('a', 'a_', '_a', '__a', 'a__', '_a_', '_a__', '__a_', '_', '__', '___', '____', '_____',): self.assertFalse(enum._is_dunder(s)) - - def test_is_private(self): - for name in self.private_names + self.private_and_sunder_names: - self.assertTrue(enum._is_private('MyEnum', name), '%r is a not private name?') - for name in self.sunder_names + self.dunder_names + self.random_names: - self.assertFalse(enum._is_private('MyEnum', name), '%r is a private name?') - - # for subclassing tests class classproperty: @@ -190,658 +166,473 @@ class HeadlightsC(IntFlag, boundary=enum.CONFORM): # tests -class _EnumTests: - """ - Test for behavior that is the same across the different types of enumerations. - """ - - values = None +class TestEnum(unittest.TestCase): def setUp(self): - class BaseEnum(self.enum_type): - @enum.property - def first(self): - return '%s is first!' % self.name - class MainEnum(BaseEnum): - first = auto() - second = auto() - third = auto() - if issubclass(self.enum_type, Flag): - dupe = 3 - else: - dupe = third - self.MainEnum = MainEnum - # - class NewStrEnum(self.enum_type): - def __str__(self): - return self.name.upper() - first = auto() - self.NewStrEnum = NewStrEnum - # - class NewFormatEnum(self.enum_type): - def __format__(self, spec): - return self.name.upper() - first = auto() - self.NewFormatEnum = NewFormatEnum - # - class NewStrFormatEnum(self.enum_type): - def __str__(self): - return self.name.title() - def __format__(self, spec): - return ''.join(reversed(self.name)) - first = auto() - self.NewStrFormatEnum = NewStrFormatEnum - # - class NewBaseEnum(self.enum_type): - def __str__(self): - return self.name.title() - def __format__(self, spec): - return ''.join(reversed(self.name)) - class NewSubEnum(NewBaseEnum): - first = auto() - self.NewSubEnum = NewSubEnum - # - self.is_flag = False - self.names = ['first', 'second', 'third'] - if issubclass(MainEnum, StrEnum): - self.values = self.names - elif MainEnum._member_type_ is str: - self.values = ['1', '2', '3'] - elif issubclass(self.enum_type, Flag): - self.values = [1, 2, 4] - self.is_flag = True - self.dupe2 = MainEnum(5) - else: - self.values = self.values or [1, 2, 3] - # - if not getattr(self, 'source_values', False): - self.source_values = self.values - - def assertFormatIsValue(self, spec, member): - self.assertEqual(spec.format(member), spec.format(member.value)) - - def assertFormatIsStr(self, spec, member): - self.assertEqual(spec.format(member), spec.format(str(member))) - - def test_attribute_deletion(self): - class Season(self.enum_type): - SPRING = auto() - SUMMER = auto() - AUTUMN = auto() - # - def spam(cls): - pass - # - self.assertTrue(hasattr(Season, 'spam')) - del Season.spam - self.assertFalse(hasattr(Season, 'spam')) - # - with self.assertRaises(AttributeError): - del Season.SPRING - with self.assertRaises(AttributeError): - del Season.DRY - with self.assertRaises(AttributeError): - del Season.SPRING.name - - def test_basics(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(repr(TE), "") - self.assertEqual(str(TE), "") - self.assertEqual(format(TE), "") - self.assertTrue(TE(5) is self.dupe2) - else: - self.assertEqual(repr(TE), "") - self.assertEqual(str(TE), "") - self.assertEqual(format(TE), "") - self.assertEqual(list(TE), [TE.first, TE.second, TE.third]) - self.assertEqual( - [m.name for m in TE], - self.names, - ) - self.assertEqual( - [m.value for m in TE], - 
self.values, - ) - self.assertEqual( - [m.first for m in TE], - ['first is first!', 'second is first!', 'third is first!'] - ) - for member, name in zip(TE, self.names, strict=True): - self.assertIs(TE[name], member) - for member, value in zip(TE, self.values, strict=True): - self.assertIs(TE(value), member) - if issubclass(TE, StrEnum): - self.assertTrue(TE.dupe is TE('third') is TE['dupe']) - elif TE._member_type_ is str: - self.assertTrue(TE.dupe is TE('3') is TE['dupe']) - elif issubclass(TE, Flag): - self.assertTrue(TE.dupe is TE(3) is TE['dupe']) - else: - self.assertTrue(TE.dupe is TE(self.values[2]) is TE['dupe']) + class Season(Enum): + SPRING = 1 + SUMMER = 2 + AUTUMN = 3 + WINTER = 4 + self.Season = Season - def test_bool_is_true(self): - class Empty(self.enum_type): - pass - self.assertTrue(Empty) - # - self.assertTrue(self.MainEnum) - for member in self.MainEnum: - self.assertTrue(member) + class Konstants(float, Enum): + E = 2.7182818 + PI = 3.1415926 + TAU = 2 * PI + self.Konstants = Konstants - def test_changing_member_fails(self): - MainEnum = self.MainEnum - with self.assertRaises(AttributeError): - self.MainEnum.second = 'really first' + class Grades(IntEnum): + A = 5 + B = 4 + C = 3 + D = 2 + F = 0 + self.Grades = Grades - @unittest.skipIf( - python_version >= (3, 12), - '__contains__ now returns True/False for all inputs', - ) - def test_contains_er(self): - MainEnum = self.MainEnum - self.assertIn(MainEnum.third, MainEnum) - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - self.source_values[1] in MainEnum - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'first' in MainEnum - val = MainEnum.dupe - self.assertIn(val, MainEnum) - # - class OtherEnum(Enum): - one = auto() - two = auto() - self.assertNotIn(OtherEnum.two, MainEnum) + class Directional(str, Enum): + EAST = 'east' + WEST = 'west' + NORTH = 'north' + SOUTH = 'south' + self.Directional = Directional - @unittest.skipIf( - python_version < (3, 12), - '__contains__ works only with enum memmbers before 3.12', - ) - def test_contains_tf(self): - MainEnum = self.MainEnum - self.assertIn(MainEnum.first, MainEnum) - self.assertTrue(self.source_values[0] in MainEnum) - self.assertFalse('first' in MainEnum) - val = MainEnum.dupe - self.assertIn(val, MainEnum) - # - class OtherEnum(Enum): - one = auto() - two = auto() - self.assertNotIn(OtherEnum.two, MainEnum) + from datetime import date + class Holiday(date, Enum): + NEW_YEAR = 2013, 1, 1 + IDES_OF_MARCH = 2013, 3, 15 + self.Holiday = Holiday - def test_dir_on_class(self): - TE = self.MainEnum - self.assertEqual(set(dir(TE)), set(enum_dir(TE))) + class DateEnum(date, Enum): pass + self.DateEnum = DateEnum - def test_dir_on_item(self): - TE = self.MainEnum - self.assertEqual(set(dir(TE.first)), set(member_dir(TE.first))) + class FloatEnum(float, Enum): pass + self.FloatEnum = FloatEnum - def test_dir_with_added_behavior(self): - class Test(self.enum_type): - this = auto() - these = auto() + class Wowser(Enum): + this = 'that' + these = 'those' + def wowser(self): + """Wowser docstring""" + return ("Wowser! I'm %s!" % self.name) + @classmethod + def classmethod_wowser(cls): pass + @staticmethod + def staticmethod_wowser(): pass + self.Wowser = Wowser + + class IntWowser(IntEnum): + this = 1 + these = 2 + def wowser(self): + """Wowser docstring""" + return ("Wowser! I'm %s!" 
% self.name) + @classmethod + def classmethod_wowser(cls): pass + @staticmethod + def staticmethod_wowser(): pass + self.IntWowser = IntWowser + + class FloatWowser(float, Enum): + this = 3.14 + these = 4.2 def wowser(self): + """Wowser docstring""" return ("Wowser! I'm %s!" % self.name) - self.assertTrue('wowser' not in dir(Test)) - self.assertTrue('wowser' in dir(Test.this)) + @classmethod + def classmethod_wowser(cls): pass + @staticmethod + def staticmethod_wowser(): pass + self.FloatWowser = FloatWowser + + class WowserNoMembers(Enum): + def wowser(self): pass + @classmethod + def classmethod_wowser(cls): pass + @staticmethod + def staticmethod_wowser(): pass + class SubclassOfWowserNoMembers(WowserNoMembers): pass + self.WowserNoMembers = WowserNoMembers + self.SubclassOfWowserNoMembers = SubclassOfWowserNoMembers + + class IntWowserNoMembers(IntEnum): + def wowser(self): pass + @classmethod + def classmethod_wowser(cls): pass + @staticmethod + def staticmethod_wowser(): pass + self.IntWowserNoMembers = IntWowserNoMembers + + class FloatWowserNoMembers(float, Enum): + def wowser(self): pass + @classmethod + def classmethod_wowser(cls): pass + @staticmethod + def staticmethod_wowser(): pass + self.FloatWowserNoMembers = FloatWowserNoMembers + + class EnumWithInit(Enum): + def __init__(self, greeting, farewell): + self.greeting = greeting + self.farewell = farewell + ENGLISH = 'hello', 'goodbye' + GERMAN = 'Guten Morgen', 'Auf Wiedersehen' + def some_method(self): pass + self.EnumWithInit = EnumWithInit - def test_dir_on_sub_with_behavior_on_super(self): # see issue22506 - class SuperEnum(self.enum_type): + class SuperEnum1(Enum): def invisible(self): return "did you see me?" - class SubEnum(SuperEnum): - sample = auto() - self.assertTrue('invisible' not in dir(SubEnum)) - self.assertTrue('invisible' in dir(SubEnum.sample)) + class SubEnum1(SuperEnum1): + sample = 5 + self.SubEnum1 = SubEnum1 - def test_dir_on_sub_with_behavior_including_instance_dict_on_super(self): - # see issue40084 - class SuperEnum(self.enum_type): - def __new__(cls, *value, **kwds): - new = self.enum_type._member_type_.__new__ - if self.enum_type._member_type_ is object: - obj = new(cls) - else: - if isinstance(value[0], tuple): - create_value ,= value[0] - else: - create_value = value - obj = new(cls, *create_value) - obj._value_ = value[0] if len(value) == 1 else value - obj.description = 'test description' + class SuperEnum2(IntEnum): + def __new__(cls, value, description=""): + obj = int.__new__(cls, value) + obj._value_ = value + obj.description = description return obj - class SubEnum(SuperEnum): - sample = self.source_values[1] - self.assertTrue('description' not in dir(SubEnum)) - self.assertTrue('description' in dir(SubEnum.sample), dir(SubEnum.sample)) - - def test_enum_in_enum_out(self): - Main = self.MainEnum - self.assertIs(Main(Main.first), Main.first) - - def test_hash(self): - MainEnum = self.MainEnum - mapping = {} - mapping[MainEnum.first] = '1225' - mapping[MainEnum.second] = '0315' - mapping[MainEnum.third] = '0704' - self.assertEqual(mapping[MainEnum.second], '0315') - - def test_invalid_names(self): - with self.assertRaises(ValueError): - class Wrong(self.enum_type): - mro = 9 - with self.assertRaises(ValueError): - class Wrong(self.enum_type): - _create_= 11 - with self.assertRaises(ValueError): - class Wrong(self.enum_type): - _get_mixins_ = 9 - with self.assertRaises(ValueError): - class Wrong(self.enum_type): - _find_new_ = 1 - with self.assertRaises(ValueError): - class 
Wrong(self.enum_type): - _any_name_ = 9 - - def test_object_str_override(self): - "check that setting __str__ to object's is not reset to Enum's" - class Generic(self.enum_type): - item = self.source_values[2] - def __repr__(self): - return "%s.test" % (self._name_, ) - __str__ = object.__str__ - self.assertEqual(str(Generic.item), 'item.test') - - def test_overridden_str(self): - NS = self.NewStrEnum - self.assertEqual(str(NS.first), NS.first.name.upper()) - self.assertEqual(format(NS.first), NS.first.name.upper()) + class SubEnum2(SuperEnum2): + sample = 5 + self.SubEnum2 = SubEnum2 + + def test_dir_basics_for_all_enums(self): + enums_for_tests = ( + # Generic enums in enum.py + Enum, + IntEnum, + StrEnum, + # Generic enums defined outside of enum.py + self.DateEnum, + self.FloatEnum, + # Concrete enums derived from enum.py generics + self.Grades, + self.Season, + # Concrete enums derived from generics defined outside of enum.py + self.Konstants, + self.Holiday, + # Standard enum with added behaviour & members + self.Wowser, + # Mixin-enum-from-enum.py with added behaviour & members + self.IntWowser, + # Mixin-enum-from-oustide-enum.py with added behaviour & members + self.FloatWowser, + # Equivalents of the three immediately above, but with no members + self.WowserNoMembers, + self.IntWowserNoMembers, + self.FloatWowserNoMembers, + # Enum with members and an __init__ method + self.EnumWithInit, + # Special cases to test + self.SubEnum1, + self.SubEnum2 + ) + + for cls in enums_for_tests: + with self.subTest(cls=cls): + cls_dir = dir(cls) + # test that dir is deterministic + self.assertEqual(cls_dir, dir(cls)) + # test that dir is sorted + self.assertEqual(list(cls_dir), sorted(cls_dir)) + # test that there are no dupes in dir + self.assertEqual(len(cls_dir), len(set(cls_dir))) + # test that there are no sunders in dir + self.assertFalse(any(enum._is_sunder(attr) for attr in cls_dir)) + self.assertNotIn('__new__', cls_dir) + + for attr in ('__class__', '__doc__', '__members__', '__module__'): + with self.subTest(attr=attr): + self.assertIn(attr, cls_dir) + + def test_dir_for_enum_with_members(self): + enums_for_test = ( + # Enum with members + self.Season, + # IntEnum with members + self.Grades, + # Two custom-mixin enums with members + self.Konstants, + self.Holiday, + # several enums-with-added-behaviour and members + self.Wowser, + self.IntWowser, + self.FloatWowser, + # An enum with an __init__ method and members + self.EnumWithInit, + # Special cases to test + self.SubEnum1, + self.SubEnum2 + ) + + for cls in enums_for_test: + cls_dir = dir(cls) + member_names = cls._member_names_ + with self.subTest(cls=cls): + self.assertTrue(all(member_name in cls_dir for member_name in member_names)) + for member in cls: + member_dir = dir(member) + # test that dir is deterministic + self.assertEqual(member_dir, dir(member)) + # test that dir is sorted + self.assertEqual(list(member_dir), sorted(member_dir)) + # test that there are no dupes in dir + self.assertEqual(len(member_dir), len(set(member_dir))) + + for attr_name in cls_dir: + with self.subTest(attr_name=attr_name): + if attr_name in {'__members__', '__init__', '__new__', *member_names}: + self.assertNotIn(attr_name, member_dir) + else: + self.assertIn(attr_name, member_dir) + + self.assertFalse(any(enum._is_sunder(attr) for attr in member_dir)) + + def test_dir_for_enums_with_added_behaviour(self): + enums_for_test = ( + self.Wowser, + self.IntWowser, + self.FloatWowser, + self.WowserNoMembers, + self.SubclassOfWowserNoMembers, + 
self.IntWowserNoMembers, + self.FloatWowserNoMembers + ) + + for cls in enums_for_test: + with self.subTest(cls=cls): + self.assertIn('wowser', dir(cls)) + self.assertIn('classmethod_wowser', dir(cls)) + self.assertIn('staticmethod_wowser', dir(cls)) + self.assertTrue(all( + all(attr in dir(member) for attr in ('wowser', 'classmethod_wowser', 'staticmethod_wowser')) + for member in cls + )) - def test_overridden_str_format(self): - NSF = self.NewStrFormatEnum - self.assertEqual(str(NSF.first), NSF.first.name.title()) - self.assertEqual(format(NSF.first), ''.join(reversed(NSF.first.name))) + self.assertEqual(dir(self.WowserNoMembers), dir(self.SubclassOfWowserNoMembers)) + # Check classmethods are present + self.assertIn('from_bytes', dir(self.IntWowser)) + self.assertIn('from_bytes', dir(self.IntWowserNoMembers)) + + def test_help_output_on_enum_members(self): + added_behaviour_enums = ( + self.Wowser, + self.IntWowser, + self.FloatWowser + ) + + for cls in added_behaviour_enums: + with self.subTest(cls=cls): + rendered_doc = pydoc.render_doc(cls.this) + self.assertIn('Wowser docstring', rendered_doc) + if cls in {self.IntWowser, self.FloatWowser}: + self.assertIn('float(self)', rendered_doc) + + def test_dir_for_enum_with_init(self): + EnumWithInit = self.EnumWithInit + + cls_dir = dir(EnumWithInit) + self.assertIn('__init__', cls_dir) + self.assertIn('some_method', cls_dir) + self.assertNotIn('greeting', cls_dir) + self.assertNotIn('farewell', cls_dir) + + member_dir = dir(EnumWithInit.ENGLISH) + self.assertNotIn('__init__', member_dir) + self.assertIn('some_method', member_dir) + self.assertIn('greeting', member_dir) + self.assertIn('farewell', member_dir) + + def test_mixin_dirs(self): + from datetime import date - def test_overridden_str_format_inherited(self): - NSE = self.NewSubEnum - self.assertEqual(str(NSE.first), NSE.first.name.title()) - self.assertEqual(format(NSE.first), ''.join(reversed(NSE.first.name))) + enums_for_test = ( + # generic mixins from enum.py + (IntEnum, int), + (StrEnum, str), + # generic mixins from outside enum.py + (self.FloatEnum, float), + (self.DateEnum, date), + # concrete mixin from enum.py + (self.Grades, int), + # concrete mixin from outside enum.py + (self.Holiday, date), + # concrete mixin from enum.py with added behaviour + (self.IntWowser, int), + # concrete mixin from outside enum.py with added behaviour + (self.FloatWowser, float) + ) + + enum_dict = Enum.__dict__ + enum_dir = dir(Enum) + enum_module_names = enum.__all__ + is_from_enum_module = lambda cls: cls.__name__ in enum_module_names + is_enum_dunder = lambda attr: enum._is_dunder(attr) and attr in enum_dict + + def attr_is_inherited_from_object(cls, attr_name): + for base in cls.__mro__: + if attr_name in base.__dict__: + return base is object + return False + + # General tests + for enum_cls, mixin_cls in enums_for_test: + with self.subTest(enum_cls=enum_cls): + cls_dir = dir(enum_cls) + cls_dict = enum_cls.__dict__ + + mixin_attrs = [ + x for x in dir(mixin_cls) + if not attr_is_inherited_from_object(cls=mixin_cls, attr_name=x) + ] - def test_programmatic_function_string(self): - MinorEnum = self.enum_type('MinorEnum', 'june july august') - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, + first_enum_base = next( + base for base in enum_cls.__mro__ + if is_from_enum_module(base) ) - values = self.values - if self.enum_type is StrEnum: - 
values = ['june','july','august'] - for month, av in zip('june july august'.split(), values): - e = MinorEnum[month] - self.assertEqual(e.value, av, list(MinorEnum)) - self.assertEqual(e.name, month) - if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): - self.assertEqual(e, av) - else: - self.assertNotEqual(e, av) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - self.assertIs(e, MinorEnum(av)) - def test_programmatic_function_string_list(self): - MinorEnum = self.enum_type('MinorEnum', ['june', 'july', 'august']) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - values = self.values - if self.enum_type is StrEnum: - values = ['june','july','august'] - for month, av in zip('june july august'.split(), values): - e = MinorEnum[month] - self.assertEqual(e.value, av) - self.assertEqual(e.name, month) - if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): - self.assertEqual(e, av) - else: - self.assertNotEqual(e, av) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - self.assertIs(e, MinorEnum(av)) + for attr in mixin_attrs: + with self.subTest(attr=attr): + if enum._is_sunder(attr): + # Unlikely, but no harm in testing + self.assertNotIn(attr, cls_dir) + elif attr in {'__class__', '__doc__', '__members__', '__module__'}: + self.assertIn(attr, cls_dir) + elif is_enum_dunder(attr): + if is_from_enum_module(enum_cls): + self.assertNotIn(attr, cls_dir) + elif getattr(enum_cls, attr) is getattr(first_enum_base, attr): + self.assertNotIn(attr, cls_dir) + else: + self.assertIn(attr, cls_dir) + else: + self.assertIn(attr, cls_dir) + + # Some specific examples + int_enum_dir = dir(IntEnum) + self.assertIn('imag', int_enum_dir) + self.assertIn('__rfloordiv__', int_enum_dir) + self.assertNotIn('__format__', int_enum_dir) + self.assertNotIn('__hash__', int_enum_dir) + self.assertNotIn('__init_subclass__', int_enum_dir) + self.assertNotIn('__subclasshook__', int_enum_dir) + + class OverridesFormatOutsideEnumModule(Enum): + def __format__(self, *args, **kwargs): + return super().__format__(*args, **kwargs) + SOME_MEMBER = 1 + + self.assertIn('__format__', dir(OverridesFormatOutsideEnumModule)) + self.assertIn('__format__', dir(OverridesFormatOutsideEnumModule.SOME_MEMBER)) - def test_programmatic_function_iterable(self): - MinorEnum = self.enum_type( - 'MinorEnum', - (('june', self.source_values[0]), ('july', self.source_values[1]), ('august', self.source_values[2])) - ) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) + def test_dir_on_sub_with_behavior_on_super(self): + # see issue22506 self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, + set(dir(self.SubEnum1.sample)), + set(['__class__', '__doc__', '__module__', 'name', 'value', 'invisible']), ) - for month, av in zip('june july august'.split(), self.values): - e = MinorEnum[month] - self.assertEqual(e.value, av) - self.assertEqual(e.name, month) - if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): - self.assertEqual(e, av) - else: - self.assertNotEqual(e, av) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - self.assertIs(e, MinorEnum(av)) - def test_programmatic_function_from_dict(self): - MinorEnum = self.enum_type( - 
'MinorEnum', - OrderedDict((('june', self.source_values[0]), ('july', self.source_values[1]), ('august', self.source_values[2]))) - ) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - for month, av in zip('june july august'.split(), self.values): - e = MinorEnum[month] - if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): - self.assertEqual(e, av) - else: - self.assertNotEqual(e, av) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - self.assertIs(e, MinorEnum(av)) + def test_dir_on_sub_with_behavior_including_instance_dict_on_super(self): + # see issue40084 + self.assertTrue({'description'} <= set(dir(self.SubEnum2.sample))) - def test_repr(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(repr(TE(0)), "") - self.assertEqual(repr(TE.dupe), "") - self.assertEqual(repr(self.dupe2), "") - elif issubclass(TE, StrEnum): - self.assertEqual(repr(TE.dupe), "") - else: - self.assertEqual(repr(TE.dupe), "" % (self.values[2], ), TE._value_repr_) - for name, value, member in zip(self.names, self.values, TE, strict=True): - self.assertEqual(repr(member), "" % (member.name, member.value)) + def test_enum_in_enum_out(self): + Season = self.Season + self.assertIs(Season(Season.WINTER), Season.WINTER) - def test_repr_override(self): - class Generic(self.enum_type): - first = auto() - second = auto() - third = auto() - def __repr__(self): - return "don't you just love shades of %s?" % self.name - self.assertEqual( - repr(Generic.third), - "don't you just love shades of third?", - ) + def test_enum_value(self): + Season = self.Season + self.assertEqual(Season.SPRING.value, 1) - def test_inherited_repr(self): - class MyEnum(self.enum_type): - def __repr__(self): - return "My name is %s." 
% self.name - class MySubEnum(MyEnum): - this = auto() - that = auto() - theother = auto() - self.assertEqual(repr(MySubEnum.that), "My name is that.") + def test_intenum_value(self): + self.assertEqual(IntStooges.CURLY.value, 2) - def test_reversed_iteration_order(self): + def test_enum(self): + Season = self.Season + lst = list(Season) + self.assertEqual(len(lst), len(Season)) + self.assertEqual(len(Season), 4, Season) self.assertEqual( - list(reversed(self.MainEnum)), - [self.MainEnum.third, self.MainEnum.second, self.MainEnum.first], - ) - -class _PlainOutputTests: - - def test_str(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(str(TE.dupe), "MainEnum.dupe") - self.assertEqual(str(self.dupe2), "MainEnum.first|third") - else: - self.assertEqual(str(TE.dupe), "MainEnum.third") - for name, value, member in zip(self.names, self.values, TE, strict=True): - self.assertEqual(str(member), "MainEnum.%s" % (member.name, )) - - def test_format(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(format(TE.dupe), "MainEnum.dupe") - self.assertEqual(format(self.dupe2), "MainEnum.first|third") - else: - self.assertEqual(format(TE.dupe), "MainEnum.third") - for name, value, member in zip(self.names, self.values, TE, strict=True): - self.assertEqual(format(member), "MainEnum.%s" % (member.name, )) - - def test_overridden_format(self): - NF = self.NewFormatEnum - self.assertEqual(str(NF.first), "NewFormatEnum.first", '%s %r' % (NF.__str__, NF.first)) - self.assertEqual(format(NF.first), "FIRST") - - def test_format_specs(self): - TE = self.MainEnum - self.assertFormatIsStr('{}', TE.second) - self.assertFormatIsStr('{:}', TE.second) - self.assertFormatIsStr('{:20}', TE.second) - self.assertFormatIsStr('{:^20}', TE.second) - self.assertFormatIsStr('{:>20}', TE.second) - self.assertFormatIsStr('{:<20}', TE.second) - self.assertFormatIsStr('{:5.2}', TE.second) + [Season.SPRING, Season.SUMMER, Season.AUTUMN, Season.WINTER], lst) + for i, season in enumerate('SPRING SUMMER AUTUMN WINTER'.split(), 1): + e = Season(i) + self.assertEqual(e, getattr(Season, season)) + self.assertEqual(e.value, i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, season) + self.assertIn(e, Season) + self.assertIs(type(e), Season) + self.assertIsInstance(e, Season) + self.assertEqual(str(e), season) + self.assertEqual(repr(e), 'Season.{0}'.format(season)) + + def test_value_name(self): + Season = self.Season + self.assertEqual(Season.SPRING.name, 'SPRING') + self.assertEqual(Season.SPRING.value, 1) + with self.assertRaises(AttributeError): + Season.SPRING.name = 'invierno' + with self.assertRaises(AttributeError): + Season.SPRING.value = 2 -class _MixedOutputTests: - - def test_str(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(str(TE.dupe), "MainEnum.dupe") - self.assertEqual(str(self.dupe2), "MainEnum.first|third") - else: - self.assertEqual(str(TE.dupe), "MainEnum.third") - for name, value, member in zip(self.names, self.values, TE, strict=True): - self.assertEqual(str(member), "MainEnum.%s" % (member.name, )) - - def test_format(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(format(TE.dupe), "MainEnum.dupe") - self.assertEqual(format(self.dupe2), "MainEnum.first|third") - else: - self.assertEqual(format(TE.dupe), "MainEnum.third") - for name, value, member in zip(self.names, self.values, TE, strict=True): - self.assertEqual(format(member), "MainEnum.%s" % (member.name, )) - - def test_overridden_format(self): - NF = self.NewFormatEnum - 
self.assertEqual(str(NF.first), "NewFormatEnum.first") - self.assertEqual(format(NF.first), "FIRST") - - def test_format_specs(self): - TE = self.MainEnum - self.assertFormatIsStr('{}', TE.first) - self.assertFormatIsStr('{:}', TE.first) - self.assertFormatIsStr('{:20}', TE.first) - self.assertFormatIsStr('{:^20}', TE.first) - self.assertFormatIsStr('{:>20}', TE.first) - self.assertFormatIsStr('{:<20}', TE.first) - self.assertFormatIsStr('{:5.2}', TE.first) - - -class _MinimalOutputTests: - - def test_str(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(str(TE.dupe), "3") - self.assertEqual(str(self.dupe2), "5") - else: - self.assertEqual(str(TE.dupe), str(self.values[2])) - for name, value, member in zip(self.names, self.values, TE, strict=True): - self.assertEqual(str(member), str(value)) - - def test_format(self): - TE = self.MainEnum - if self.is_flag: - self.assertEqual(format(TE.dupe), "3") - self.assertEqual(format(self.dupe2), "5") - else: - self.assertEqual(format(TE.dupe), format(self.values[2])) - for name, value, member in zip(self.names, self.values, TE, strict=True): - self.assertEqual(format(member), format(value)) - - def test_overridden_format(self): - NF = self.NewFormatEnum - self.assertEqual(str(NF.first), str(self.values[0])) - self.assertEqual(format(NF.first), "FIRST") - - def test_format_specs(self): - TE = self.MainEnum - self.assertFormatIsValue('{}', TE.third) - self.assertFormatIsValue('{:}', TE.third) - self.assertFormatIsValue('{:20}', TE.third) - self.assertFormatIsValue('{:^20}', TE.third) - self.assertFormatIsValue('{:>20}', TE.third) - self.assertFormatIsValue('{:<20}', TE.third) - if TE._member_type_ is float: - self.assertFormatIsValue('{:n}', TE.third) - self.assertFormatIsValue('{:5.2}', TE.third) - self.assertFormatIsValue('{:f}', TE.third) - - -class _FlagTests: - - def test_default_missing_with_wrong_type_value(self): - with self.assertRaisesRegex( - ValueError, - "'RED' is not a valid TestFlag.Color", - ) as ctx: - self.MainEnum('RED') - self.assertIs(ctx.exception.__context__, None) - -class TestPlainEnum(_EnumTests, _PlainOutputTests, unittest.TestCase): - enum_type = Enum - - -class TestPlainFlag(_EnumTests, _PlainOutputTests, unittest.TestCase): - enum_type = Flag - - -class TestIntEnum(_EnumTests, _MinimalOutputTests, unittest.TestCase): - enum_type = IntEnum - - -class TestStrEnum(_EnumTests, _MinimalOutputTests, unittest.TestCase): - enum_type = StrEnum - - -class TestIntFlag(_EnumTests, _MinimalOutputTests, unittest.TestCase): - enum_type = IntFlag - - -class TestMixedInt(_EnumTests, _MixedOutputTests, unittest.TestCase): - class enum_type(int, Enum): pass - - -class TestMixedStr(_EnumTests, _MixedOutputTests, unittest.TestCase): - class enum_type(str, Enum): pass - - -class TestMixedIntFlag(_EnumTests, _MixedOutputTests, unittest.TestCase): - class enum_type(int, Flag): pass - - -class TestMixedDate(_EnumTests, _MixedOutputTests, unittest.TestCase): - - values = [date(2021, 12, 25), date(2020, 3, 15), date(2019, 11, 27)] - source_values = [(2021, 12, 25), (2020, 3, 15), (2019, 11, 27)] - - class enum_type(date, Enum): - def _generate_next_value_(name, start, count, last_values): - values = [(2021, 12, 25), (2020, 3, 15), (2019, 11, 27)] - return values[count] - - -class TestMinimalDate(_EnumTests, _MinimalOutputTests, unittest.TestCase): - - values = [date(2023, 12, 1), date(2016, 2, 29), date(2009, 1, 1)] - source_values = [(2023, 12, 1), (2016, 2, 29), (2009, 1, 1)] - - class enum_type(date, ReprEnum): - def 
_generate_next_value_(name, start, count, last_values): - values = [(2023, 12, 1), (2016, 2, 29), (2009, 1, 1)] - return values[count] - - -class TestMixedFloat(_EnumTests, _MixedOutputTests, unittest.TestCase): - - values = [1.1, 2.2, 3.3] - - class enum_type(float, Enum): - def _generate_next_value_(name, start, count, last_values): - values = [1.1, 2.2, 3.3] - return values[count] + def test_changing_member(self): + Season = self.Season + with self.assertRaises(AttributeError): + Season.WINTER = 'really cold' + def test_attribute_deletion(self): + class Season(Enum): + SPRING = 1 + SUMMER = 2 + AUTUMN = 3 + WINTER = 4 -class TestMinimalFloat(_EnumTests, _MinimalOutputTests, unittest.TestCase): + def spam(cls): + pass - values = [4.4, 5.5, 6.6] + self.assertTrue(hasattr(Season, 'spam')) + del Season.spam + self.assertFalse(hasattr(Season, 'spam')) - class enum_type(float, ReprEnum): - def _generate_next_value_(name, start, count, last_values): - values = [4.4, 5.5, 6.6] - return values[count] + with self.assertRaises(AttributeError): + del Season.SPRING + with self.assertRaises(AttributeError): + del Season.DRY + with self.assertRaises(AttributeError): + del Season.SPRING.name + def test_bool_of_class(self): + class Empty(Enum): + pass + self.assertTrue(bool(Empty)) -class TestSpecial(unittest.TestCase): - """ - various operations that are not attributable to every possible enum - """ + def test_bool_of_member(self): + class Count(Enum): + zero = 0 + one = 1 + two = 2 + for member in Count: + self.assertTrue(bool(member)) - def setUp(self): - class Season(Enum): - SPRING = 1 - SUMMER = 2 - AUTUMN = 3 - WINTER = 4 - self.Season = Season - # - class Grades(IntEnum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - self.Grades = Grades - # - class Directional(str, Enum): - EAST = 'east' - WEST = 'west' - NORTH = 'north' - SOUTH = 'south' - self.Directional = Directional - # - from datetime import date - class Holiday(date, Enum): - NEW_YEAR = 2013, 1, 1 - IDES_OF_MARCH = 2013, 3, 15 - self.Holiday = Holiday + def test_invalid_names(self): + with self.assertRaises(ValueError): + class Wrong(Enum): + mro = 9 + with self.assertRaises(ValueError): + class Wrong(Enum): + _create_= 11 + with self.assertRaises(ValueError): + class Wrong(Enum): + _get_mixins_ = 9 + with self.assertRaises(ValueError): + class Wrong(Enum): + _find_new_ = 1 + with self.assertRaises(ValueError): + class Wrong(Enum): + _any_name_ = 9 def test_bool(self): # plain Enum members are always True @@ -865,56 +656,92 @@ class IntLogic(int, Enum): self.assertTrue(IntLogic.true) self.assertFalse(IntLogic.false) + @unittest.skipIf( + python_version >= (3, 12), + '__contains__ now returns True/False for all inputs', + ) + def test_contains_er(self): + Season = self.Season + self.assertIn(Season.AUTUMN, Season) + with self.assertRaises(TypeError): + with self.assertWarns(DeprecationWarning): + 3 in Season + with self.assertRaises(TypeError): + with self.assertWarns(DeprecationWarning): + 'AUTUMN' in Season + val = Season(3) + self.assertIn(val, Season) + # + class OtherEnum(Enum): + one = 1; two = 2 + self.assertNotIn(OtherEnum.two, Season) + + @unittest.skipIf( + python_version < (3, 12), + '__contains__ only works with enum memmbers before 3.12', + ) + def test_contains_tf(self): + Season = self.Season + self.assertIn(Season.AUTUMN, Season) + self.assertTrue(3 in Season) + self.assertFalse('AUTUMN' in Season) + val = Season(3) + self.assertIn(val, Season) + # + class OtherEnum(Enum): + one = 1; two = 2 + self.assertNotIn(OtherEnum.two, 
Season) + def test_comparisons(self): Season = self.Season with self.assertRaises(TypeError): Season.SPRING < Season.WINTER with self.assertRaises(TypeError): Season.SPRING > 4 - # + self.assertNotEqual(Season.SPRING, 1) - # + class Part(Enum): SPRING = 1 CLIP = 2 BARREL = 3 - # + self.assertNotEqual(Season.SPRING, Part.SPRING) with self.assertRaises(TypeError): Season.SPRING < Part.CLIP - def test_dir_with_custom_dunders(self): - class PlainEnum(Enum): - pass - cls_dir = dir(PlainEnum) - self.assertNotIn('__repr__', cls_dir) - self.assertNotIn('__str__', cls_dir) - self.assertNotIn('__repr__', cls_dir) - self.assertNotIn('__repr__', cls_dir) - # - class MyEnum(Enum): - def __repr__(self): - return object.__repr__(self) - def __str__(self): - return object.__repr__(self) - def __format__(self): - return object.__repr__(self) - def __init__(self): - pass - cls_dir = dir(MyEnum) - self.assertIn('__repr__', cls_dir) - self.assertIn('__str__', cls_dir) - self.assertIn('__repr__', cls_dir) - self.assertIn('__repr__', cls_dir) + def test_enum_duplicates(self): + class Season(Enum): + SPRING = 1 + SUMMER = 2 + AUTUMN = FALL = 3 + WINTER = 4 + ANOTHER_SPRING = 1 + lst = list(Season) + self.assertEqual( + lst, + [Season.SPRING, Season.SUMMER, + Season.AUTUMN, Season.WINTER, + ]) + self.assertIs(Season.FALL, Season.AUTUMN) + self.assertEqual(Season.FALL.value, 3) + self.assertEqual(Season.AUTUMN.value, 3) + self.assertIs(Season(3), Season.AUTUMN) + self.assertIs(Season(1), Season.SPRING) + self.assertEqual(Season.FALL.name, 'AUTUMN') + self.assertEqual( + [k for k,v in Season.__members__.items() if v.name != k], + ['FALL', 'ANOTHER_SPRING'], + ) - def test_duplicate_name_error(self): + def test_duplicate_name(self): with self.assertRaises(TypeError): class Color(Enum): red = 1 green = 2 blue = 3 red = 4 - # + with self.assertRaises(TypeError): class Color(Enum): red = 1 @@ -922,45 +749,232 @@ class Color(Enum): blue = 3 def red(self): return 'red' - # + with self.assertRaises(TypeError): class Color(Enum): - @enum.property + @property def red(self): return 'redder' red = 1 green = 2 blue = 3 - def test_enum_function_with_qualname(self): - if isinstance(Theory, Exception): - raise Theory - self.assertEqual(Theory.__qualname__, 'spanish_inquisition') + def test_reserved__sunder_(self): + with self.assertRaisesRegex( + ValueError, + '_sunder_ names, such as ._bad_., are reserved', + ): + class Bad(Enum): + _bad_ = 1 def test_enum_with_value_name(self): class Huh(Enum): name = 1 value = 2 - self.assertEqual(list(Huh), [Huh.name, Huh.value]) + self.assertEqual( + list(Huh), + [Huh.name, Huh.value], + ) self.assertIs(type(Huh.name), Huh) self.assertEqual(Huh.name.name, 'name') self.assertEqual(Huh.name.value, 1) + def test_format_enum(self): + Season = self.Season + self.assertEqual('{}'.format(Season.SPRING), + '{}'.format(str(Season.SPRING))) + self.assertEqual( '{:}'.format(Season.SPRING), + '{:}'.format(str(Season.SPRING))) + self.assertEqual('{:20}'.format(Season.SPRING), + '{:20}'.format(str(Season.SPRING))) + self.assertEqual('{:^20}'.format(Season.SPRING), + '{:^20}'.format(str(Season.SPRING))) + self.assertEqual('{:>20}'.format(Season.SPRING), + '{:>20}'.format(str(Season.SPRING))) + self.assertEqual('{:<20}'.format(Season.SPRING), + '{:<20}'.format(str(Season.SPRING))) + + def test_str_override_enum(self): + class EnumWithStrOverrides(Enum): + one = auto() + two = auto() + + def __str__(self): + return 'Str!' 
+ self.assertEqual(str(EnumWithStrOverrides.one), 'Str!') + self.assertEqual('{}'.format(EnumWithStrOverrides.one), 'Str!') + + def test_format_override_enum(self): + class EnumWithFormatOverride(Enum): + one = 1.0 + two = 2.0 + def __format__(self, spec): + return 'Format!!' + self.assertEqual(str(EnumWithFormatOverride.one), 'one') + self.assertEqual('{}'.format(EnumWithFormatOverride.one), 'Format!!') + + def test_str_and_format_override_enum(self): + class EnumWithStrFormatOverrides(Enum): + one = auto() + two = auto() + def __str__(self): + return 'Str!' + def __format__(self, spec): + return 'Format!' + self.assertEqual(str(EnumWithStrFormatOverrides.one), 'Str!') + self.assertEqual('{}'.format(EnumWithStrFormatOverrides.one), 'Format!') + + def test_str_override_mixin(self): + class MixinEnumWithStrOverride(float, Enum): + one = 1.0 + two = 2.0 + def __str__(self): + return 'Overridden!' + self.assertEqual(str(MixinEnumWithStrOverride.one), 'Overridden!') + self.assertEqual('{}'.format(MixinEnumWithStrOverride.one), 'Overridden!') + + def test_str_and_format_override_mixin(self): + class MixinWithStrFormatOverrides(float, Enum): + one = 1.0 + two = 2.0 + def __str__(self): + return 'Str!' + def __format__(self, spec): + return 'Format!' + self.assertEqual(str(MixinWithStrFormatOverrides.one), 'Str!') + self.assertEqual('{}'.format(MixinWithStrFormatOverrides.one), 'Format!') + + def test_format_override_mixin(self): + class TestFloat(float, Enum): + one = 1.0 + two = 2.0 + def __format__(self, spec): + return 'TestFloat success!' + self.assertEqual(str(TestFloat.one), 'one') + self.assertEqual('{}'.format(TestFloat.one), 'TestFloat success!') + + @unittest.skipIf( + python_version < (3, 12), + 'mixin-format is still using member.value', + ) + def test_mixin_format_warning(self): + class Grades(int, Enum): + A = 5 + B = 4 + C = 3 + D = 2 + F = 0 + self.assertEqual(f'{self.Grades.B}', 'B') + + @unittest.skipIf( + python_version >= (3, 12), + 'mixin-format now uses member instead of member.value', + ) + def test_mixin_format_warning(self): + class Grades(int, Enum): + A = 5 + B = 4 + C = 3 + D = 2 + F = 0 + with self.assertWarns(DeprecationWarning): + self.assertEqual(f'{Grades.B}', '4') + + def assertFormatIsValue(self, spec, member): + if python_version < (3, 12) and (not spec or spec in ('{}','{:}')): + with self.assertWarns(DeprecationWarning): + self.assertEqual(spec.format(member), spec.format(member.value)) + else: + self.assertEqual(spec.format(member), spec.format(member.value)) + + def test_format_enum_date(self): + Holiday = self.Holiday + self.assertFormatIsValue('{}', Holiday.IDES_OF_MARCH) + self.assertFormatIsValue('{:}', Holiday.IDES_OF_MARCH) + self.assertFormatIsValue('{:20}', Holiday.IDES_OF_MARCH) + self.assertFormatIsValue('{:^20}', Holiday.IDES_OF_MARCH) + self.assertFormatIsValue('{:>20}', Holiday.IDES_OF_MARCH) + self.assertFormatIsValue('{:<20}', Holiday.IDES_OF_MARCH) + self.assertFormatIsValue('{:%Y %m}', Holiday.IDES_OF_MARCH) + self.assertFormatIsValue('{:%Y %m %M:00}', Holiday.IDES_OF_MARCH) + + def test_format_enum_float(self): + Konstants = self.Konstants + self.assertFormatIsValue('{}', Konstants.TAU) + self.assertFormatIsValue('{:}', Konstants.TAU) + self.assertFormatIsValue('{:20}', Konstants.TAU) + self.assertFormatIsValue('{:^20}', Konstants.TAU) + self.assertFormatIsValue('{:>20}', Konstants.TAU) + self.assertFormatIsValue('{:<20}', Konstants.TAU) + self.assertFormatIsValue('{:n}', Konstants.TAU) + self.assertFormatIsValue('{:5.2}', Konstants.TAU) 
+ self.assertFormatIsValue('{:f}', Konstants.TAU) + + def test_format_enum_int(self): + class Grades(int, Enum): + A = 5 + B = 4 + C = 3 + D = 2 + F = 0 + self.assertFormatIsValue('{}', Grades.C) + self.assertFormatIsValue('{:}', Grades.C) + self.assertFormatIsValue('{:20}', Grades.C) + self.assertFormatIsValue('{:^20}', Grades.C) + self.assertFormatIsValue('{:>20}', Grades.C) + self.assertFormatIsValue('{:<20}', Grades.C) + self.assertFormatIsValue('{:+}', Grades.C) + self.assertFormatIsValue('{:08X}', Grades.C) + self.assertFormatIsValue('{:b}', Grades.C) + + def test_format_enum_str(self): + Directional = self.Directional + self.assertFormatIsValue('{}', Directional.WEST) + self.assertFormatIsValue('{:}', Directional.WEST) + self.assertFormatIsValue('{:20}', Directional.WEST) + self.assertFormatIsValue('{:^20}', Directional.WEST) + self.assertFormatIsValue('{:>20}', Directional.WEST) + self.assertFormatIsValue('{:<20}', Directional.WEST) + + def test_object_str_override(self): + class Colors(Enum): + RED, GREEN, BLUE = 1, 2, 3 + def __repr__(self): + return "test.%s" % (self._name_, ) + __str__ = object.__str__ + self.assertEqual(str(Colors.RED), 'test.RED') + + def test_enum_str_override(self): + class MyStrEnum(Enum): + def __str__(self): + return 'MyStr' + class MyMethodEnum(Enum): + def hello(self): + return 'Hello! My name is %s' % self.name + class Test1Enum(MyMethodEnum, int, MyStrEnum): + One = 1 + Two = 2 + self.assertTrue(Test1Enum._member_type_ is int) + self.assertEqual(str(Test1Enum.One), 'MyStr') + self.assertEqual(format(Test1Enum.One, ''), 'MyStr') + # + class Test2Enum(MyStrEnum, MyMethodEnum): + One = 1 + Two = 2 + self.assertEqual(str(Test2Enum.One), 'MyStr') + self.assertEqual(format(Test1Enum.One, ''), 'MyStr') + def test_inherited_data_type(self): class HexInt(int): - __qualname__ = 'HexInt' def __repr__(self): return hex(self) class MyEnum(HexInt, enum.Enum): - __qualname__ = 'MyEnum' A = 1 B = 2 C = 3 + def __repr__(self): + return '<%s.%s: %r>' % (self.__class__.__name__, self._name_, self._value_) self.assertEqual(repr(MyEnum.A), '') - globals()['HexInt'] = HexInt - globals()['MyEnum'] = MyEnum - test_pickle_dump_load(self.assertIs, MyEnum.A) - test_pickle_dump_load(self.assertIs, MyEnum) # class SillyInt(HexInt): __qualname__ = 'SillyInt' @@ -976,7 +990,7 @@ class MyOtherEnum(SillyInt, enum.Enum): test_pickle_dump_load(self.assertIs, MyOtherEnum.E) test_pickle_dump_load(self.assertIs, MyOtherEnum) # - # This did not work in 3.10, but does now with pickling by name + # This did not work in 3.9, but does now with pickling by name class UnBrokenInt(int): __qualname__ = 'UnBrokenInt' def __new__(cls, value): @@ -993,124 +1007,6 @@ class MyUnBrokenEnum(UnBrokenInt, Enum): test_pickle_dump_load(self.assertIs, MyUnBrokenEnum.I) test_pickle_dump_load(self.assertIs, MyUnBrokenEnum) - def test_floatenum_fromhex(self): - h = float.hex(FloatStooges.MOE.value) - self.assertIs(FloatStooges.fromhex(h), FloatStooges.MOE) - h = float.hex(FloatStooges.MOE.value + 0.01) - with self.assertRaises(ValueError): - FloatStooges.fromhex(h) - - def test_programmatic_function_type(self): - MinorEnum = Enum('MinorEnum', 'june july august', type=int) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = MinorEnum(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - 
self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - - def test_programmatic_function_string_with_start(self): - MinorEnum = Enum('MinorEnum', 'june july august', start=10) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 10): - e = MinorEnum(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - - def test_programmatic_function_type_with_start(self): - MinorEnum = Enum('MinorEnum', 'june july august', type=int, start=30) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 30): - e = MinorEnum(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - - def test_programmatic_function_string_list_with_start(self): - MinorEnum = Enum('MinorEnum', ['june', 'july', 'august'], start=20) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 20): - e = MinorEnum(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - - def test_programmatic_function_type_from_subclass(self): - MinorEnum = IntEnum('MinorEnum', 'june july august') - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = MinorEnum(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - - def test_programmatic_function_type_from_subclass_with_start(self): - MinorEnum = IntEnum('MinorEnum', 'june july august', start=40) - lst = list(MinorEnum) - self.assertEqual(len(lst), len(MinorEnum)) - self.assertEqual(len(MinorEnum), 3, MinorEnum) - self.assertEqual( - [MinorEnum.june, MinorEnum.july, MinorEnum.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 40): - e = MinorEnum(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, MinorEnum) - self.assertIs(type(e), MinorEnum) - - def test_intenum_from_bytes(self): - self.assertIs(IntStooges.from_bytes(b'\x00\x03', 'big'), IntStooges.MOE) - with self.assertRaises(ValueError): - IntStooges.from_bytes(b'\x00\x05', 'big') - - def test_reserved_sunder_error(self): - with self.assertRaisesRegex( - ValueError, - '_sunder_ names, such as ._bad_., are reserved', - ): - class Bad(Enum): - _bad_ = 1 - def test_too_many_data_types(self): with self.assertRaisesRegex(TypeError, 'too many data types'): class Huh(str, int, Enum): @@ -1126,6 +1022,122 @@ def repr(self): class Huh(MyStr, MyInt, Enum): One = 1 + def test_value_auto_assign(self): + class Some(Enum): + def __new__(cls, val): + return object.__new__(cls) + x = 1 + y = 2 + + 
self.assertEqual(Some.x.value, 1) + self.assertEqual(Some.y.value, 2) + + def test_hash(self): + Season = self.Season + dates = {} + dates[Season.WINTER] = '1225' + dates[Season.SPRING] = '0315' + dates[Season.SUMMER] = '0704' + dates[Season.AUTUMN] = '1031' + self.assertEqual(dates[Season.AUTUMN], '1031') + + def test_intenum_from_scratch(self): + class phy(int, Enum): + pi = 3 + tau = 2 * pi + self.assertTrue(phy.pi < phy.tau) + + def test_intenum_inherited(self): + class IntEnum(int, Enum): + pass + class phy(IntEnum): + pi = 3 + tau = 2 * pi + self.assertTrue(phy.pi < phy.tau) + + def test_floatenum_from_scratch(self): + class phy(float, Enum): + pi = 3.1415926 + tau = 2 * pi + self.assertTrue(phy.pi < phy.tau) + + def test_floatenum_inherited(self): + class FloatEnum(float, Enum): + pass + class phy(FloatEnum): + pi = 3.1415926 + tau = 2 * pi + self.assertTrue(phy.pi < phy.tau) + + def test_strenum_from_scratch(self): + class phy(str, Enum): + pi = 'Pi' + tau = 'Tau' + self.assertTrue(phy.pi < phy.tau) + + def test_strenum_inherited_methods(self): + class phy(StrEnum): + pi = 'Pi' + tau = 'Tau' + self.assertTrue(phy.pi < phy.tau) + self.assertEqual(phy.pi.upper(), 'PI') + self.assertEqual(phy.tau.count('a'), 1) + + def test_intenum(self): + class WeekDay(IntEnum): + SUNDAY = 1 + MONDAY = 2 + TUESDAY = 3 + WEDNESDAY = 4 + THURSDAY = 5 + FRIDAY = 6 + SATURDAY = 7 + + self.assertEqual(['a', 'b', 'c'][WeekDay.MONDAY], 'c') + self.assertEqual([i for i in range(WeekDay.TUESDAY)], [0, 1, 2]) + + lst = list(WeekDay) + self.assertEqual(len(lst), len(WeekDay)) + self.assertEqual(len(WeekDay), 7) + target = 'SUNDAY MONDAY TUESDAY WEDNESDAY THURSDAY FRIDAY SATURDAY' + target = target.split() + for i, weekday in enumerate(target, 1): + e = WeekDay(i) + self.assertEqual(e, i) + self.assertEqual(int(e), i) + self.assertEqual(e.name, weekday) + self.assertIn(e, WeekDay) + self.assertEqual(lst.index(e)+1, i) + self.assertTrue(0 < e < 8) + self.assertIs(type(e), WeekDay) + self.assertIsInstance(e, int) + self.assertIsInstance(e, Enum) + + def test_intenum_duplicates(self): + class WeekDay(IntEnum): + SUNDAY = 1 + MONDAY = 2 + TUESDAY = TEUSDAY = 3 + WEDNESDAY = 4 + THURSDAY = 5 + FRIDAY = 6 + SATURDAY = 7 + self.assertIs(WeekDay.TEUSDAY, WeekDay.TUESDAY) + self.assertEqual(WeekDay(3).name, 'TUESDAY') + self.assertEqual([k for k,v in WeekDay.__members__.items() + if v.name != k], ['TEUSDAY', ]) + + def test_intenum_from_bytes(self): + self.assertIs(IntStooges.from_bytes(b'\x00\x03', 'big'), IntStooges.MOE) + with self.assertRaises(ValueError): + IntStooges.from_bytes(b'\x00\x05', 'big') + + def test_floatenum_fromhex(self): + h = float.hex(FloatStooges.MOE.value) + self.assertIs(FloatStooges.fromhex(h), FloatStooges.MOE) + h = float.hex(FloatStooges.MOE.value + 0.01) + with self.assertRaises(ValueError): + FloatStooges.fromhex(h) def test_pickle_enum(self): if isinstance(Stooges, Exception): @@ -1157,7 +1169,12 @@ def test_pickle_enum_function_with_module(self): test_pickle_dump_load(self.assertIs, Question.who) test_pickle_dump_load(self.assertIs, Question) - def test_pickle_nested_class(self): + def test_enum_function_with_qualname(self): + if isinstance(Theory, Exception): + raise Theory + self.assertEqual(Theory.__qualname__, 'spanish_inquisition') + + def test_class_nested_enum_and_pickle_protocol_four(self): # would normally just have this directly in the class namespace class NestedEnum(Enum): twigs = 'common' @@ -1175,7 +1192,7 @@ class ReplaceGlobalInt(IntEnum): for proto in 
range(HIGHEST_PROTOCOL): self.assertEqual(ReplaceGlobalInt.TWO.__reduce_ex__(proto), 'TWO') - def test_pickle_explodes(self): + def test_exploding_pickle(self): BadPickle = Enum( 'BadPickle', 'dill sweet bread-n-butter', module=__name__) globals()['BadPickle'] = BadPickle @@ -1184,37 +1201,216 @@ def test_pickle_explodes(self): test_pickle_exception(self.assertRaises, TypeError, BadPickle.dill) test_pickle_exception(self.assertRaises, PicklingError, BadPickle) - def test_string_enum(self): - class SkillLevel(str, Enum): - master = 'what is the sound of one hand clapping?' - journeyman = 'why did the chicken cross the road?' - apprentice = 'knock, knock!' - self.assertEqual(SkillLevel.apprentice, 'knock, knock!') + def test_string_enum(self): + class SkillLevel(str, Enum): + master = 'what is the sound of one hand clapping?' + journeyman = 'why did the chicken cross the road?' + apprentice = 'knock, knock!' + self.assertEqual(SkillLevel.apprentice, 'knock, knock!') + + def test_getattr_getitem(self): + class Period(Enum): + morning = 1 + noon = 2 + evening = 3 + night = 4 + self.assertIs(Period(2), Period.noon) + self.assertIs(getattr(Period, 'night'), Period.night) + self.assertIs(Period['morning'], Period.morning) + + def test_getattr_dunder(self): + Season = self.Season + self.assertTrue(getattr(Season, '__eq__')) + + def test_iteration_order(self): + class Season(Enum): + SUMMER = 2 + WINTER = 4 + AUTUMN = 3 + SPRING = 1 + self.assertEqual( + list(Season), + [Season.SUMMER, Season.WINTER, Season.AUTUMN, Season.SPRING], + ) + + def test_reversed_iteration_order(self): + self.assertEqual( + list(reversed(self.Season)), + [self.Season.WINTER, self.Season.AUTUMN, self.Season.SUMMER, + self.Season.SPRING] + ) + + def test_programmatic_function_string(self): + SummerMonth = Enum('SummerMonth', 'june july august') + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = SummerMonth(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) + + def test_programmatic_function_string_with_start(self): + SummerMonth = Enum('SummerMonth', 'june july august', start=10) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 10): + e = SummerMonth(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) + + def test_programmatic_function_string_list(self): + SummerMonth = Enum('SummerMonth', ['june', 'july', 'august']) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = SummerMonth(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) + + def test_programmatic_function_string_list_with_start(self): + 
SummerMonth = Enum('SummerMonth', ['june', 'july', 'august'], start=20) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 20): + e = SummerMonth(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) + + def test_programmatic_function_iterable(self): + SummerMonth = Enum( + 'SummerMonth', + (('june', 1), ('july', 2), ('august', 3)) + ) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = SummerMonth(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) + + def test_programmatic_function_from_dict(self): + SummerMonth = Enum( + 'SummerMonth', + OrderedDict((('june', 1), ('july', 2), ('august', 3))) + ) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = SummerMonth(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) + + def test_programmatic_function_type(self): + SummerMonth = Enum('SummerMonth', 'june july august', type=int) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = SummerMonth(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) - def test_getattr_getitem(self): - class Period(Enum): - morning = 1 - noon = 2 - evening = 3 - night = 4 - self.assertIs(Period(2), Period.noon) - self.assertIs(getattr(Period, 'night'), Period.night) - self.assertIs(Period['morning'], Period.morning) + def test_programmatic_function_type_with_start(self): + SummerMonth = Enum('SummerMonth', 'june july august', type=int, start=30) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 30): + e = SummerMonth(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) - def test_getattr_dunder(self): - Season = self.Season - self.assertTrue(getattr(Season, '__eq__')) + def test_programmatic_function_type_from_subclass(self): + SummerMonth = IntEnum('SummerMonth', 'june july august') + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) + self.assertEqual( + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, + ) + 
for i, month in enumerate('june july august'.split(), 1): + e = SummerMonth(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) - def test_iteration_order(self): - class Season(Enum): - SUMMER = 2 - WINTER = 4 - AUTUMN = 3 - SPRING = 1 + def test_programmatic_function_type_from_subclass_with_start(self): + SummerMonth = IntEnum('SummerMonth', 'june july august', start=40) + lst = list(SummerMonth) + self.assertEqual(len(lst), len(SummerMonth)) + self.assertEqual(len(SummerMonth), 3, SummerMonth) self.assertEqual( - list(Season), - [Season.SUMMER, Season.WINTER, Season.AUTUMN, Season.SPRING], + [SummerMonth.june, SummerMonth.july, SummerMonth.august], + lst, ) + for i, month in enumerate('june july august'.split(), 40): + e = SummerMonth(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, SummerMonth) + self.assertIs(type(e), SummerMonth) def test_subclassing(self): if isinstance(Name, Exception): @@ -1229,18 +1425,15 @@ class Color(Enum): red = 1 green = 2 blue = 3 - # with self.assertRaises(TypeError): class MoreColor(Color): cyan = 4 magenta = 5 yellow = 6 - # - with self.assertRaisesRegex(TypeError, " cannot extend "): + with self.assertRaisesRegex(TypeError, "EvenMoreColor: cannot extend enumeration 'Color'"): class EvenMoreColor(Color, IntEnum): chartruese = 7 - # - with self.assertRaisesRegex(TypeError, " cannot extend "): + with self.assertRaisesRegex(TypeError, "Foo: cannot extend enumeration 'Color'"): Color('Foo', ('pink', 'black')) def test_exclude_methods(self): @@ -1344,7 +1537,27 @@ class Color(Enum): with self.assertRaises(KeyError): Color['chartreuse'] - # tests that need to be evalualted for moving + def test_new_repr(self): + class Color(Enum): + red = 1 + green = 2 + blue = 3 + def __repr__(self): + return "don't you just love shades of %s?" % self.name + self.assertEqual( + repr(Color.blue), + "don't you just love shades of blue?", + ) + + def test_inherited_repr(self): + class MyEnum(Enum): + def __repr__(self): + return "My name is %s." 
% self.name + class MyIntEnum(int, MyEnum): + this = 1 + that = 2 + theother = 3 + self.assertEqual(repr(MyIntEnum.that), "My name is that.") def test_multiple_mixin_mro(self): class auto_enum(type(Enum)): @@ -1397,7 +1610,7 @@ def __new__(cls, *args): return self def __getnewargs__(self): return self._args - @bltns.property + @property def __name__(self): return self._intname def __repr__(self): @@ -1457,7 +1670,7 @@ def __new__(cls, *args): return self def __getnewargs_ex__(self): return self._args, {} - @bltns.property + @property def __name__(self): return self._intname def __repr__(self): @@ -1517,7 +1730,7 @@ def __new__(cls, *args): return self def __reduce__(self): return self.__class__, self._args - @bltns.property + @property def __name__(self): return self._intname def __repr__(self): @@ -1577,7 +1790,7 @@ def __new__(cls, *args): return self def __reduce_ex__(self, proto): return self.__class__, self._args - @bltns.property + @property def __name__(self): return self._intname def __repr__(self): @@ -1634,7 +1847,7 @@ def __new__(cls, *args): self._intname = name self._args = _args return self - @bltns.property + @property def __name__(self): return self._intname def __repr__(self): @@ -1689,7 +1902,7 @@ def __new__(cls, *args): self._intname = name self._args = _args return self - @bltns.property + @property def __name__(self): return self._intname def __repr__(self): @@ -1878,7 +2091,6 @@ def test(self): class Test(Base): test = 1 self.assertEqual(Test.test.test, 'dynamic') - self.assertEqual(Test.test.value, 1) class Base2(Enum): @enum.property def flash(self): @@ -1886,7 +2098,6 @@ def flash(self): class Test(Base2): flash = 1 self.assertEqual(Test.flash.flash, 'flashy dynamic') - self.assertEqual(Test.flash.value, 1) def test_no_duplicates(self): class UniqueEnum(Enum): @@ -1923,7 +2134,7 @@ class Planet(Enum): def __init__(self, mass, radius): self.mass = mass # in kilograms self.radius = radius # in meters - @enum.property + @property def surface_gravity(self): # universal gravitational constant (m3 kg-1 s-2) G = 6.67300E-11 @@ -1993,7 +2204,90 @@ class LabelledList(LabelledIntEnum): self.assertEqual(LabelledList.unprocessed, 1) self.assertEqual(LabelledList(1), LabelledList.unprocessed) - def test_default_missing_no_chained_exception(self): + def test_auto_number(self): + class Color(Enum): + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 1) + self.assertEqual(Color.blue.value, 2) + self.assertEqual(Color.green.value, 3) + + def test_auto_name(self): + class Color(Enum): + def _generate_next_value_(name, start, count, last): + return name + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 'blue') + self.assertEqual(Color.green.value, 'green') + + def test_auto_name_inherit(self): + class AutoNameEnum(Enum): + def _generate_next_value_(name, start, count, last): + return name + class Color(AutoNameEnum): + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 'blue') + self.assertEqual(Color.green.value, 'green') + + def test_auto_garbage(self): + class Color(Enum): + red = 'red' + blue = auto() + self.assertEqual(Color.blue.value, 1) + + def test_auto_garbage_corrected(self): + 
class Color(Enum): + red = 'red' + blue = 2 + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 2) + self.assertEqual(Color.green.value, 3) + + def test_auto_order(self): + with self.assertRaises(TypeError): + class Color(Enum): + red = auto() + green = auto() + blue = auto() + def _generate_next_value_(name, start, count, last): + return name + + def test_auto_order_wierd(self): + weird_auto = auto() + weird_auto.value = 'pathological case' + class Color(Enum): + red = weird_auto + def _generate_next_value_(name, start, count, last): + return name + blue = auto() + self.assertEqual(list(Color), [Color.red, Color.blue]) + self.assertEqual(Color.red.value, 'pathological case') + self.assertEqual(Color.blue.value, 'blue') + + def test_duplicate_auto(self): + class Dupes(Enum): + first = primero = auto() + second = auto() + third = auto() + self.assertEqual([Dupes.first, Dupes.second, Dupes.third], list(Dupes)) + + def test_default_missing(self): class Color(Enum): RED = 1 GREEN = 2 @@ -2005,7 +2299,7 @@ class Color(Enum): else: raise Exception('Exception not raised.') - def test_missing_override(self): + def test_missing(self): class Color(Enum): red = 1 green = 2 @@ -2069,9 +2363,9 @@ def __init__(self): class_1_ref = weakref.ref(Class1()) class_2_ref = weakref.ref(Class2()) # - # The exception raised by Enum used to create a reference loop and thus - # Class2 instances would stick around until the next garbage collection - # cycle, unlike Class1. Verify Class2 no longer does this. + # The exception raised by Enum creates a reference loop and thus + # Class2 instances will stick around until the next garbage collection + # cycle, unlike Class1. gc.collect() # For PyPy or other GCs. 
self.assertIs(class_1_ref(), None) self.assertIs(class_2_ref(), None) @@ -2102,12 +2396,11 @@ class Color(MaxMixin, Enum): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) self.assertEqual(Color.MAX, 3) - self.assertEqual(str(Color.BLUE), 'Color.BLUE') + self.assertEqual(str(Color.BLUE), 'BLUE') class Color(MaxMixin, StrMixin, Enum): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) @@ -2117,7 +2410,6 @@ class Color(StrMixin, MaxMixin, Enum): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) @@ -2127,7 +2419,6 @@ class CoolColor(StrMixin, SomeEnum, Enum): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolColor.RED.value, 1) self.assertEqual(CoolColor.GREEN.value, 2) self.assertEqual(CoolColor.BLUE.value, 3) @@ -2137,7 +2428,6 @@ class CoolerColor(StrMixin, AnotherEnum, Enum): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolerColor.RED.value, 1) self.assertEqual(CoolerColor.GREEN.value, 2) self.assertEqual(CoolerColor.BLUE.value, 3) @@ -2148,7 +2438,6 @@ class CoolestColor(StrMixin, SomeEnum, AnotherEnum): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolestColor.RED.value, 1) self.assertEqual(CoolestColor.GREEN.value, 2) self.assertEqual(CoolestColor.BLUE.value, 3) @@ -2159,7 +2448,6 @@ class ConfusedColor(StrMixin, AnotherEnum, SomeEnum): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(ConfusedColor.RED.value, 1) self.assertEqual(ConfusedColor.GREEN.value, 2) self.assertEqual(ConfusedColor.BLUE.value, 3) @@ -2170,7 +2458,6 @@ class ReformedColor(StrMixin, IntEnum, SomeEnum, AnotherEnum): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(ReformedColor.RED.value, 1) self.assertEqual(ReformedColor.GREEN.value, 2) self.assertEqual(ReformedColor.BLUE.value, 3) @@ -2203,12 +2490,11 @@ def __repr__(self): return hex(self) class MyIntEnum(HexMixin, MyInt, enum.Enum): - __repr__ = HexMixin.__repr__ + pass class Foo(MyIntEnum): TEST = 1 self.assertTrue(isinstance(Foo.TEST, MyInt)) - self.assertEqual(Foo._member_type_, MyInt) self.assertEqual(repr(Foo.TEST), "0x1") class Fee(MyIntEnum): @@ -2220,7 +2506,7 @@ def __new__(cls, value): return member self.assertEqual(Fee.TEST, 2) - def test_multiple_mixin_with_common_data_type(self): + def test_miltuple_mixin_with_common_data_type(self): class CaseInsensitiveStrEnum(str, Enum): @classmethod def _missing_(cls, value): @@ -2240,7 +2526,7 @@ def _missing_(cls, value): unknown._value_ = value cls._member_map_[value] = unknown return unknown - @enum.property + @property def valid(self): return self._valid # @@ -2284,7 +2570,7 @@ class GoodStrEnum(StrEnum): self.assertEqual('{}'.format(GoodStrEnum.one), '1') self.assertEqual(GoodStrEnum.one, str(GoodStrEnum.one)) self.assertEqual(GoodStrEnum.one, '{}'.format(GoodStrEnum.one)) - self.assertEqual(repr(GoodStrEnum.one), "") + self.assertEqual(repr(GoodStrEnum.one), 'GoodStrEnum.one') # class DumbMixin: def __str__(self): @@ -2293,7 +2579,6 @@ class DumbStrEnum(DumbMixin, 
StrEnum): five = '5' six = '6' seven = '7' - __str__ = DumbMixin.__str__ # needed as of 3.11 self.assertEqual(DumbStrEnum.seven, '7') self.assertEqual(str(DumbStrEnum.seven), "don't do this") # @@ -2335,6 +2620,74 @@ class ThirdFailedStrEnum(StrEnum): one = '1' two = b'2', 'ascii', 9 + @unittest.skipIf( + python_version >= (3, 12), + 'mixin-format now uses member instead of member.value', + ) + def test_custom_strenum_with_warning(self): + class CustomStrEnum(str, Enum): + pass + class OkayEnum(CustomStrEnum): + one = '1' + two = '2' + three = b'3', 'ascii' + four = b'4', 'latin1', 'strict' + self.assertEqual(OkayEnum.one, '1') + self.assertEqual(str(OkayEnum.one), 'one') + with self.assertWarns(DeprecationWarning): + self.assertEqual('{}'.format(OkayEnum.one), '1') + self.assertEqual(OkayEnum.one, '{}'.format(OkayEnum.one)) + self.assertEqual(repr(OkayEnum.one), 'OkayEnum.one') + # + class DumbMixin: + def __str__(self): + return "don't do this" + class DumbStrEnum(DumbMixin, CustomStrEnum): + five = '5' + six = '6' + seven = '7' + self.assertEqual(DumbStrEnum.seven, '7') + self.assertEqual(str(DumbStrEnum.seven), "don't do this") + # + class EnumMixin(Enum): + def hello(self): + print('hello from %s' % (self, )) + class HelloEnum(EnumMixin, CustomStrEnum): + eight = '8' + self.assertEqual(HelloEnum.eight, '8') + self.assertEqual(str(HelloEnum.eight), 'eight') + # + class GoodbyeMixin: + def goodbye(self): + print('%s wishes you a fond farewell') + class GoodbyeEnum(GoodbyeMixin, EnumMixin, CustomStrEnum): + nine = '9' + self.assertEqual(GoodbyeEnum.nine, '9') + self.assertEqual(str(GoodbyeEnum.nine), 'nine') + # + class FirstFailedStrEnum(CustomStrEnum): + one = 1 # this will become '1' + two = '2' + class SecondFailedStrEnum(CustomStrEnum): + one = '1' + two = 2, # this will become '2' + three = '3' + class ThirdFailedStrEnum(CustomStrEnum): + one = '1' + two = 2 # this will become '2' + with self.assertRaisesRegex(TypeError, '.encoding. must be str, not '): + class ThirdFailedStrEnum(CustomStrEnum): + one = '1' + two = b'2', sys.getdefaultencoding + with self.assertRaisesRegex(TypeError, '.errors. 
must be str, not '): + class ThirdFailedStrEnum(CustomStrEnum): + one = '1' + two = b'2', 'ascii', 9 + + @unittest.skipIf( + python_version < (3, 12), + 'mixin-format currently uses member.value', + ) def test_custom_strenum(self): class CustomStrEnum(str, Enum): pass @@ -2344,9 +2697,9 @@ class OkayEnum(CustomStrEnum): three = b'3', 'ascii' four = b'4', 'latin1', 'strict' self.assertEqual(OkayEnum.one, '1') - self.assertEqual(str(OkayEnum.one), 'OkayEnum.one') - self.assertEqual('{}'.format(OkayEnum.one), 'OkayEnum.one') - self.assertEqual(repr(OkayEnum.one), "") + self.assertEqual(str(OkayEnum.one), 'one') + self.assertEqual('{}'.format(OkayEnum.one), 'one') + self.assertEqual(repr(OkayEnum.one), 'OkayEnum.one') # class DumbMixin: def __str__(self): @@ -2355,7 +2708,6 @@ class DumbStrEnum(DumbMixin, CustomStrEnum): five = '5' six = '6' seven = '7' - __str__ = DumbMixin.__str__ # needed as of 3.11 self.assertEqual(DumbStrEnum.seven, '7') self.assertEqual(str(DumbStrEnum.seven), "don't do this") # @@ -2365,7 +2717,7 @@ def hello(self): class HelloEnum(EnumMixin, CustomStrEnum): eight = '8' self.assertEqual(HelloEnum.eight, '8') - self.assertEqual(str(HelloEnum.eight), 'HelloEnum.eight') + self.assertEqual(str(HelloEnum.eight), 'eight') # class GoodbyeMixin: def goodbye(self): @@ -2373,7 +2725,7 @@ def goodbye(self): class GoodbyeEnum(GoodbyeMixin, EnumMixin, CustomStrEnum): nine = '9' self.assertEqual(GoodbyeEnum.nine, '9') - self.assertEqual(str(GoodbyeEnum.nine), 'GoodbyeEnum.nine') + self.assertEqual(str(GoodbyeEnum.nine), 'nine') # class FirstFailedStrEnum(CustomStrEnum): one = 1 # this will become '1' @@ -2419,6 +2771,21 @@ def __repr__(self): code = 'An$(5,1)', 2 description = 'Bn$', 3 + @unittest.skipUnless( + python_version == (3, 9), + 'private variables are now normal attributes', + ) + def test_warning_for_private_variables(self): + with self.assertWarns(DeprecationWarning): + class Private(Enum): + __corporal = 'Radar' + self.assertEqual(Private._Private__corporal.value, 'Radar') + try: + with self.assertWarns(DeprecationWarning): + class Private(Enum): + __major_ = 'Hoolihan' + except ValueError: + pass def test_private_variable_is_normal_attribute(self): class Private(Enum): @@ -2427,13 +2794,35 @@ class Private(Enum): self.assertEqual(Private._Private__corporal, 'Radar') self.assertEqual(Private._Private__major_, 'Hoolihan') + @unittest.skipUnless( + python_version < (3, 12), + 'member-member access now raises an exception', + ) + def test_warning_for_member_from_member_access(self): + with self.assertWarns(DeprecationWarning): + class Di(Enum): + YES = 1 + NO = 0 + nope = Di.YES.NO + self.assertIs(Di.NO, nope) + + @unittest.skipUnless( + python_version >= (3, 12), + 'member-member access currently issues a warning', + ) def test_exception_for_member_from_member_access(self): - with self.assertRaisesRegex(AttributeError, " member has no attribute .NO."): + with self.assertRaisesRegex(AttributeError, "Di: no instance attribute .NO."): class Di(Enum): YES = 1 NO = 0 nope = Di.YES.NO + def test_strenum_auto(self): + class Strings(StrEnum): + ONE = auto() + TWO = auto() + self.assertEqual([Strings.ONE, Strings.TWO], ['one', 'two']) + def test_dynamic_members_with_static_methods(self): # @@ -2450,7 +2839,7 @@ def upper(self): self.assertEqual(Foo.FOO_CAT.value, 'aloof') self.assertEqual(Foo.FOO_HORSE.upper(), 'BIG') # - with self.assertRaisesRegex(TypeError, "'FOO_CAT' already defined as 'aloof'"): + with self.assertRaisesRegex(TypeError, "'FOO_CAT' already defined as: 'aloof'"): 
class FooBar(Enum): vars().update({ k: v @@ -2462,42 +2851,8 @@ class FooBar(Enum): def upper(self): return self.value.upper() - def test_repr_with_dataclass(self): - "ensure dataclass-mixin has correct repr()" - from dataclasses import dataclass - @dataclass - class Foo: - __qualname__ = 'Foo' - a: int = 0 - class Entries(Foo, Enum): - ENTRY1 = Foo(1) - self.assertEqual(repr(Entries.ENTRY1), '') - - def test_repr_with_non_data_type_mixin(self): - # non-data_type is a mixin that doesn't define __new__ - class Foo: - def __init__(self, a): - self.a = a - def __repr__(self): - return f'Foo(a={self.a!r})' - class Entries(Foo, Enum): - ENTRY1 = Foo(1) - - self.assertEqual(repr(Entries.ENTRY1), '') - - def test_value_backup_assign(self): - # check that enum will add missing values when custom __new__ does not - class Some(Enum): - def __new__(cls, val): - return object.__new__(cls) - x = 1 - y = 2 - self.assertEqual(Some.x.value, 1) - self.assertEqual(Some.y.value, 2) - class TestOrder(unittest.TestCase): - "test usage of the `_order_` attribute" def test_same_members(self): class Color(Enum): @@ -2559,7 +2914,7 @@ class Color(Enum): verde = green -class OldTestFlag(unittest.TestCase): +class TestFlag(unittest.TestCase): """Tests of the Flags.""" class Perm(Flag): @@ -2582,6 +2937,65 @@ class Color(Flag): WHITE = RED|GREEN|BLUE BLANCO = RED|GREEN|BLUE + def test_str(self): + Perm = self.Perm + self.assertEqual(str(Perm.R), 'R') + self.assertEqual(str(Perm.W), 'W') + self.assertEqual(str(Perm.X), 'X') + self.assertEqual(str(Perm.R | Perm.W), 'R|W') + self.assertEqual(str(Perm.R | Perm.W | Perm.X), 'R|W|X') + self.assertEqual(str(Perm(0)), 'Perm(0)') + self.assertEqual(str(~Perm.R), 'W|X') + self.assertEqual(str(~Perm.W), 'R|X') + self.assertEqual(str(~Perm.X), 'R|W') + self.assertEqual(str(~(Perm.R | Perm.W)), 'X') + self.assertEqual(str(~(Perm.R | Perm.W | Perm.X)), 'Perm(0)') + self.assertEqual(str(Perm(~0)), 'R|W|X') + + Open = self.Open + self.assertEqual(str(Open.RO), 'RO') + self.assertEqual(str(Open.WO), 'WO') + self.assertEqual(str(Open.AC), 'AC') + self.assertEqual(str(Open.RO | Open.CE), 'CE') + self.assertEqual(str(Open.WO | Open.CE), 'WO|CE') + self.assertEqual(str(~Open.RO), 'WO|RW|CE') + self.assertEqual(str(~Open.WO), 'RW|CE') + self.assertEqual(str(~Open.AC), 'CE') + self.assertEqual(str(~(Open.RO | Open.CE)), 'AC') + self.assertEqual(str(~(Open.WO | Open.CE)), 'RW') + + def test_repr(self): + Perm = self.Perm + self.assertEqual(repr(Perm.R), 'Perm.R') + self.assertEqual(repr(Perm.W), 'Perm.W') + self.assertEqual(repr(Perm.X), 'Perm.X') + self.assertEqual(repr(Perm.R | Perm.W), 'Perm.R|Perm.W') + self.assertEqual(repr(Perm.R | Perm.W | Perm.X), 'Perm.R|Perm.W|Perm.X') + self.assertEqual(repr(Perm(0)), '0x0') + self.assertEqual(repr(~Perm.R), 'Perm.W|Perm.X') + self.assertEqual(repr(~Perm.W), 'Perm.R|Perm.X') + self.assertEqual(repr(~Perm.X), 'Perm.R|Perm.W') + self.assertEqual(repr(~(Perm.R | Perm.W)), 'Perm.X') + self.assertEqual(repr(~(Perm.R | Perm.W | Perm.X)), '0x0') + self.assertEqual(repr(Perm(~0)), 'Perm.R|Perm.W|Perm.X') + + Open = self.Open + self.assertEqual(repr(Open.RO), 'Open.RO') + self.assertEqual(repr(Open.WO), 'Open.WO') + self.assertEqual(repr(Open.AC), 'Open.AC') + self.assertEqual(repr(Open.RO | Open.CE), 'Open.CE') + self.assertEqual(repr(Open.WO | Open.CE), 'Open.WO|Open.CE') + self.assertEqual(repr(~Open.RO), 'Open.WO|Open.RW|Open.CE') + self.assertEqual(repr(~Open.WO), 'Open.RW|Open.CE') + self.assertEqual(repr(~Open.AC), 'Open.CE') + 
self.assertEqual(repr(~(Open.RO | Open.CE)), 'Open.AC') + self.assertEqual(repr(~(Open.WO | Open.CE)), 'Open.RW') + + def test_format(self): + Perm = self.Perm + self.assertEqual(format(Perm.R, ''), 'R') + self.assertEqual(format(Perm.R | Perm.X, ''), 'R|X') + def test_or(self): Perm = self.Perm for i in Perm: @@ -2674,7 +3088,7 @@ class Bizarre(Flag, boundary=KEEP): c = 4 d = 6 # - self.assertRaisesRegex(ValueError, 'invalid value 7', Iron, 7) + self.assertRaisesRegex(ValueError, 'invalid value: 7', Iron, 7) # self.assertIs(Water(7), Water.ONE|Water.TWO) self.assertIs(Water(~9), Water.TWO) @@ -2883,7 +3297,7 @@ class Color(Flag): self.assertEqual(Color.green.value, 4) def test_auto_number_garbage(self): - with self.assertRaisesRegex(TypeError, 'invalid flag value .not an int.'): + with self.assertRaisesRegex(TypeError, 'Invalid Flag value: .not an int.'): class Color(Flag): red = 'not an int' blue = auto() @@ -2918,12 +3332,11 @@ class Color(AllMixin, Flag): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) self.assertEqual(Color.ALL.value, 7) - self.assertEqual(str(Color.BLUE), 'Color.BLUE') + self.assertEqual(str(Color.BLUE), 'BLUE') class Color(AllMixin, StrMixin, Flag): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -2933,7 +3346,6 @@ class Color(StrMixin, AllMixin, Flag): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3014,8 +3426,21 @@ class NeverEnum(WhereEnum): self.assertFalse(NeverEnum.__dict__.get('_test1', False)) self.assertFalse(NeverEnum.__dict__.get('_test2', False)) + def test_default_missing(self): + with self.assertRaisesRegex( + ValueError, + "'RED' is not a valid TestFlag.Color", + ) as ctx: + self.Color('RED') + self.assertIs(ctx.exception.__context__, None) + + P = Flag('P', 'X Y') + with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: + P('X') + self.assertIs(ctx.exception.__context__, None) + -class OldTestIntFlag(unittest.TestCase): +class TestIntFlag(unittest.TestCase): """Tests of the IntFlags.""" class Perm(IntFlag): @@ -3060,6 +3485,73 @@ def test_type(self): self.assertTrue(isinstance(Open.WO | Open.RW, Open)) self.assertEqual(Open.WO | Open.RW, 3) + + def test_str(self): + Perm = self.Perm + self.assertEqual(str(Perm.R), 'R') + self.assertEqual(str(Perm.W), 'W') + self.assertEqual(str(Perm.X), 'X') + self.assertEqual(str(Perm.R | Perm.W), 'R|W') + self.assertEqual(str(Perm.R | Perm.W | Perm.X), 'R|W|X') + self.assertEqual(str(Perm.R | 8), '12') + self.assertEqual(str(Perm(0)), 'Perm(0)') + self.assertEqual(str(Perm(8)), '8') + self.assertEqual(str(~Perm.R), 'W|X') + self.assertEqual(str(~Perm.W), 'R|X') + self.assertEqual(str(~Perm.X), 'R|W') + self.assertEqual(str(~(Perm.R | Perm.W)), 'X') + self.assertEqual(str(~(Perm.R | Perm.W | Perm.X)), 'Perm(0)') + self.assertEqual(str(~(Perm.R | 8)), '-13') + self.assertEqual(str(Perm(~0)), 'R|W|X') + self.assertEqual(str(Perm(~8)), '-9') + + Open = self.Open + self.assertEqual(str(Open.RO), 'RO') + self.assertEqual(str(Open.WO), 'WO') + self.assertEqual(str(Open.AC), 'AC') + self.assertEqual(str(Open.RO | Open.CE), 'CE') + self.assertEqual(str(Open.WO | Open.CE), 'WO|CE') + self.assertEqual(str(Open(4)), '4') + self.assertEqual(str(~Open.RO), 'WO|RW|CE') + self.assertEqual(str(~Open.WO), 
'RW|CE') + self.assertEqual(str(~Open.AC), 'CE') + self.assertEqual(str(~(Open.RO | Open.CE)), 'AC') + self.assertEqual(str(~(Open.WO | Open.CE)), 'RW') + self.assertEqual(str(Open(~4)), '-5') + + def test_repr(self): + Perm = self.Perm + self.assertEqual(repr(Perm.R), 'Perm.R') + self.assertEqual(repr(Perm.W), 'Perm.W') + self.assertEqual(repr(Perm.X), 'Perm.X') + self.assertEqual(repr(Perm.R | Perm.W), 'Perm.R|Perm.W') + self.assertEqual(repr(Perm.R | Perm.W | Perm.X), 'Perm.R|Perm.W|Perm.X') + self.assertEqual(repr(Perm.R | 8), '12') + self.assertEqual(repr(Perm(0)), '0x0') + self.assertEqual(repr(Perm(8)), '8') + self.assertEqual(repr(~Perm.R), 'Perm.W|Perm.X') + self.assertEqual(repr(~Perm.W), 'Perm.R|Perm.X') + self.assertEqual(repr(~Perm.X), 'Perm.R|Perm.W') + self.assertEqual(repr(~(Perm.R | Perm.W)), 'Perm.X') + self.assertEqual(repr(~(Perm.R | Perm.W | Perm.X)), '0x0') + self.assertEqual(repr(~(Perm.R | 8)), '-13') + self.assertEqual(repr(Perm(~0)), 'Perm.R|Perm.W|Perm.X') + self.assertEqual(repr(Perm(~8)), '-9') + + Open = self.Open + self.assertEqual(repr(Open.RO), 'Open.RO') + self.assertEqual(repr(Open.WO), 'Open.WO') + self.assertEqual(repr(Open.AC), 'Open.AC') + self.assertEqual(repr(Open.RO | Open.CE), 'Open.CE') + self.assertEqual(repr(Open.WO | Open.CE), 'Open.WO|Open.CE') + self.assertEqual(repr(Open(4)), '4') + self.assertEqual(repr(~Open.RO), 'Open.WO|Open.RW|Open.CE') + self.assertEqual(repr(~Open.WO), 'Open.RW|Open.CE') + self.assertEqual(repr(~Open.AC), 'Open.CE') + self.assertEqual(repr(~(Open.RO | Open.CE)), 'Open.AC') + self.assertEqual(repr(~(Open.WO | Open.CE)), 'Open.RW') + self.assertEqual(repr(Open(~4)), '-5') + def test_global_repr_keep(self): self.assertEqual( repr(HeadlightsK(0)), @@ -3067,11 +3559,11 @@ def test_global_repr_keep(self): ) self.assertEqual( repr(HeadlightsK(2**0 + 2**2 + 2**3)), - '%(m)s.LOW_BEAM_K|%(m)s.FOG_K|8' % {'m': SHORT_MODULE}, + '%(m)s.LOW_BEAM_K|%(m)s.FOG_K|0x8' % {'m': SHORT_MODULE}, ) self.assertEqual( repr(HeadlightsK(2**3)), - '%(m)s.HeadlightsK(8)' % {'m': SHORT_MODULE}, + '%(m)s.HeadlightsK(0x8)' % {'m': SHORT_MODULE}, ) def test_global_repr_conform1(self): @@ -3213,7 +3705,7 @@ class Bizarre(IntFlag, boundary=KEEP): c = 4 d = 6 # - self.assertRaisesRegex(ValueError, 'invalid value 5', Iron, 5) + self.assertRaisesRegex(ValueError, 'invalid value: 5', Iron, 5) # self.assertIs(Water(7), Water.ONE|Water.TWO) self.assertIs(Water(~9), Water.TWO) @@ -3450,12 +3942,11 @@ class Color(AllMixin, IntFlag): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) self.assertEqual(Color.ALL.value, 7) - self.assertEqual(str(Color.BLUE), '4') + self.assertEqual(str(Color.BLUE), 'BLUE') class Color(AllMixin, StrMixin, IntFlag): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3465,7 +3956,6 @@ class Color(StrMixin, AllMixin, IntFlag): RED = auto() GREEN = auto() BLUE = auto() - __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3510,6 +4000,19 @@ def cycle_enum(): 'at least one thread failed while creating composite members') self.assertEqual(256, len(seen), 'too many composite members created') + def test_default_missing(self): + with self.assertRaisesRegex( + ValueError, + "'RED' is not a valid TestIntFlag.Color", + ) as ctx: + self.Color('RED') + self.assertIs(ctx.exception.__context__, 
None) + + P = IntFlag('P', 'X Y') + with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: + P('X') + self.assertIs(ctx.exception.__context__, None) + class TestEmptyAndNonLatinStrings(unittest.TestCase): @@ -3726,89 +4229,6 @@ def test_is_private(self): for name in self.sunder_names + self.dunder_names + self.random_names: self.assertFalse(enum._is_private('MyEnum', name), '%r is a private name?') - def test_auto_number(self): - class Color(Enum): - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 1) - self.assertEqual(Color.blue.value, 2) - self.assertEqual(Color.green.value, 3) - - def test_auto_name(self): - class Color(Enum): - def _generate_next_value_(name, start, count, last): - return name - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 'blue') - self.assertEqual(Color.green.value, 'green') - - def test_auto_name_inherit(self): - class AutoNameEnum(Enum): - def _generate_next_value_(name, start, count, last): - return name - class Color(AutoNameEnum): - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 'blue') - self.assertEqual(Color.green.value, 'green') - - def test_auto_garbage(self): - class Color(Enum): - red = 'red' - blue = auto() - self.assertEqual(Color.blue.value, 1) - - def test_auto_garbage_corrected(self): - class Color(Enum): - red = 'red' - blue = 2 - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 2) - self.assertEqual(Color.green.value, 3) - - def test_auto_order(self): - with self.assertRaises(TypeError): - class Color(Enum): - red = auto() - green = auto() - blue = auto() - def _generate_next_value_(name, start, count, last): - return name - - def test_auto_order_wierd(self): - weird_auto = auto() - weird_auto.value = 'pathological case' - class Color(Enum): - red = weird_auto - def _generate_next_value_(name, start, count, last): - return name - blue = auto() - self.assertEqual(list(Color), [Color.red, Color.blue]) - self.assertEqual(Color.red.value, 'pathological case') - self.assertEqual(Color.blue.value, 'blue') - - def test_duplicate_auto(self): - class Dupes(Enum): - first = primero = auto() - second = auto() - third = auto() - self.assertEqual([Dupes.first, Dupes.second, Dupes.third], list(Dupes)) - class TestEnumTypeSubclassing(unittest.TestCase): pass @@ -3818,35 +4238,7 @@ class TestEnumTypeSubclassing(unittest.TestCase): class Color(enum.Enum) | Color(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None) |\x20\x20 - | A collection of name/value pairs. - |\x20\x20 - | Access them by: - |\x20\x20 - | - attribute access:: - |\x20\x20 - | >>> Color.CYAN - | - |\x20\x20 - | - value lookup: - |\x20\x20 - | >>> Color(1) - | - |\x20\x20 - | - name lookup: - |\x20\x20 - | >>> Color['CYAN'] - | - |\x20\x20 - | Enumerations can be iterated over, and know how many members they have: - |\x20\x20 - | >>> len(Color) - | 3 - |\x20\x20 - | >>> list(Color) - | [, , ] - |\x20\x20 - | Methods can be added to enumerations, and members can have their own - | attributes -- see the documentation for details. 
+ | An enumeration. |\x20\x20 | Method resolution order: | Color @@ -3855,11 +4247,11 @@ class Color(enum.Enum) |\x20\x20 | Data and other attributes defined here: |\x20\x20 - | CYAN = + | blue = Color.blue |\x20\x20 - | MAGENTA = + | green = Color.green |\x20\x20 - | YELLOW = + | red = Color.red |\x20\x20 | ---------------------------------------------------------------------- | Data descriptors inherited from enum.Enum: @@ -3871,25 +4263,6 @@ class Color(enum.Enum) | The value of the Enum member. |\x20\x20 | ---------------------------------------------------------------------- - | Methods inherited from enum.EnumType: - |\x20\x20 - | __contains__(member) from enum.EnumType - | Return True if member is a member of this enum - | raises TypeError if member is not an enum member - |\x20\x20\x20\x20\x20\x20 - | note: in 3.12 TypeError will no longer be raised, and True will also be - | returned if member is the value of a member in this enum - |\x20\x20 - | __getitem__(name) from enum.EnumType - | Return the member matching `name`. - |\x20\x20 - | __iter__() from enum.EnumType - | Return members in definition order. - |\x20\x20 - | __len__() from enum.EnumType - | Return the number of members (no aliases) - |\x20\x20 - | ---------------------------------------------------------------------- | Readonly properties inherited from enum.EnumType: |\x20\x20 | __members__ @@ -3911,11 +4284,11 @@ class Color(enum.Enum) |\x20\x20 | Data and other attributes defined here: |\x20\x20 - | YELLOW = + | blue = Color.blue |\x20\x20 - | MAGENTA = + | green = Color.green |\x20\x20 - | CYAN = + | red = Color.red |\x20\x20 | ---------------------------------------------------------------------- | Data descriptors inherited from enum.Enum: @@ -3934,9 +4307,9 @@ class TestStdLib(unittest.TestCase): maxDiff = None class Color(Enum): - CYAN = 1 - MAGENTA = 2 - YELLOW = 3 + red = 1 + green = 2 + blue = 3 def test_pydoc(self): # indirectly test __objclass__ @@ -3948,34 +4321,24 @@ def test_pydoc(self): helper = pydoc.Helper(output=output) helper(self.Color) result = output.getvalue().strip() - self.assertEqual(result, expected_text, result) + self.assertEqual(result, expected_text) def test_inspect_getmembers(self): values = dict(( ('__class__', EnumType), - ('__doc__', '...'), + ('__doc__', 'An enumeration.'), ('__members__', self.Color.__members__), ('__module__', __name__), - ('YELLOW', self.Color.YELLOW), - ('MAGENTA', self.Color.MAGENTA), - ('CYAN', self.Color.CYAN), + ('blue', self.Color.blue), + ('green', self.Color.green), ('name', Enum.__dict__['name']), + ('red', self.Color.red), ('value', Enum.__dict__['value']), - ('__len__', self.Color.__len__), - ('__contains__', self.Color.__contains__), - ('__name__', 'Color'), - ('__getitem__', self.Color.__getitem__), - ('__qualname__', 'TestStdLib.Color'), - ('__init_subclass__', getattr(self.Color, '__init_subclass__')), - ('__iter__', self.Color.__iter__), )) result = dict(inspect.getmembers(self.Color)) self.assertEqual(set(values.keys()), set(result.keys())) failed = False for k in values.keys(): - if k == '__doc__': - # __doc__ is huge, not comparing - continue if result[k] != values[k]: print() print('\n%s\n key: %s\n result: %s\nexpected: %s\n%s\n' % @@ -3990,42 +4353,23 @@ def test_inspect_classify_class_attrs(self): values = [ Attribute(name='__class__', kind='data', defining_class=object, object=EnumType), - Attribute(name='__contains__', kind='method', - defining_class=EnumType, object=self.Color.__contains__), Attribute(name='__doc__', kind='data', - 
defining_class=self.Color, object='...'), - Attribute(name='__getitem__', kind='method', - defining_class=EnumType, object=self.Color.__getitem__), - Attribute(name='__iter__', kind='method', - defining_class=EnumType, object=self.Color.__iter__), - Attribute(name='__init_subclass__', kind='class method', - defining_class=object, object=getattr(self.Color, '__init_subclass__')), - Attribute(name='__len__', kind='method', - defining_class=EnumType, object=self.Color.__len__), + defining_class=self.Color, object='An enumeration.'), Attribute(name='__members__', kind='property', defining_class=EnumType, object=EnumType.__members__), Attribute(name='__module__', kind='data', defining_class=self.Color, object=__name__), - Attribute(name='__name__', kind='data', - defining_class=self.Color, object='Color'), - Attribute(name='__qualname__', kind='data', - defining_class=self.Color, object='TestStdLib.Color'), - Attribute(name='YELLOW', kind='data', - defining_class=self.Color, object=self.Color.YELLOW), - Attribute(name='MAGENTA', kind='data', - defining_class=self.Color, object=self.Color.MAGENTA), - Attribute(name='CYAN', kind='data', - defining_class=self.Color, object=self.Color.CYAN), + Attribute(name='blue', kind='data', + defining_class=self.Color, object=self.Color.blue), + Attribute(name='green', kind='data', + defining_class=self.Color, object=self.Color.green), + Attribute(name='red', kind='data', + defining_class=self.Color, object=self.Color.red), Attribute(name='name', kind='data', defining_class=Enum, object=Enum.__dict__['name']), Attribute(name='value', kind='data', defining_class=Enum, object=Enum.__dict__['value']), ] - for v in values: - try: - v.name - except AttributeError: - print(v) values.sort(key=lambda item: item.name) result = list(inspect.classify_class_attrs(self.Color)) result.sort(key=lambda item: item.name) @@ -4035,15 +4379,7 @@ def test_inspect_classify_class_attrs(self): ) failed = False for v, r in zip(values, result): - if r.name in ('__init_subclass__', '__doc__'): - # not sure how to make the __init_subclass_ Attributes match - # so as long as there is one, call it good - # __doc__ is too big to check exactly, so treat the same as __init_subclass__ - for name in ('name','kind','defining_class'): - if getattr(v, name) != getattr(r, name): - print('\n%s\n%s\n%s\n%s\n' % ('=' * 75, r, v, '=' * 75), sep='') - failed = True - elif r != v: + if r != v: print('\n%s\n%s\n%s\n%s\n' % ('=' * 75, r, v, '=' * 75), sep='') failed = True if failed: @@ -4052,15 +4388,15 @@ def test_inspect_classify_class_attrs(self): def test_test_simple_enum(self): @_simple_enum(Enum) class SimpleColor: - CYAN = 1 - MAGENTA = 2 - YELLOW = 3 + RED = 1 + GREEN = 2 + BLUE = 3 class CheckedColor(Enum): - CYAN = 1 - MAGENTA = 2 - YELLOW = 3 + RED = 1 + GREEN = 2 + BLUE = 3 self.assertTrue(_test_simple_enum(CheckedColor, SimpleColor) is None) - SimpleColor.MAGENTA._value_ = 9 + SimpleColor.GREEN._value_ = 9 self.assertRaisesRegex( TypeError, "enum mismatch", _test_simple_enum, CheckedColor, SimpleColor, @@ -4086,165 +4422,9 @@ class Missing: class MiscTestCase(unittest.TestCase): - def test__all__(self): support.check__all__(self, enum, not_exported={'bin', 'show_flag_values'}) - def test_doc_1(self): - class Single(Enum): - ONE = 1 - self.assertEqual( - Single.__doc__, - dedent("""\ - A collection of name/value pairs. 
- - Access them by: - - - attribute access:: - - >>> Single.ONE - - - - value lookup: - - >>> Single(1) - - - - name lookup: - - >>> Single['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Single) - 1 - - >>> list(Single) - [] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """)) - - def test_doc_2(self): - class Double(Enum): - ONE = 1 - TWO = 2 - self.assertEqual( - Double.__doc__, - dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> Double.ONE - - - - value lookup: - - >>> Double(1) - - - - name lookup: - - >>> Double['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Double) - 2 - - >>> list(Double) - [, ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """)) - - - def test_doc_1(self): - class Triple(Enum): - ONE = 1 - TWO = 2 - THREE = 3 - self.assertEqual( - Triple.__doc__, - dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> Triple.ONE - - - - value lookup: - - >>> Triple(1) - - - - name lookup: - - >>> Triple['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Triple) - 3 - - >>> list(Triple) - [, , ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """)) - - def test_doc_1(self): - class Quadruple(Enum): - ONE = 1 - TWO = 2 - THREE = 3 - FOUR = 4 - self.assertEqual( - Quadruple.__doc__, - dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> Quadruple.ONE - - - - value lookup: - - >>> Quadruple(1) - - - - name lookup: - - >>> Quadruple['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Quadruple) - 4 - - >>> list(Quadruple)[:3] - [, , ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """)) - # These are unordered here on purpose to ensure that declaration order # makes no difference. @@ -4262,10 +4442,6 @@ class Quadruple(Enum): CONVERT_STRING_TEST_NAME_E = 5 CONVERT_STRING_TEST_NAME_F = 5 -# global names for StrEnum._convert_ test -CONVERT_STR_TEST_2 = 'goodbye' -CONVERT_STR_TEST_1 = 'hello' - # We also need values that cannot be compared: UNCOMPARABLE_A = 5 UNCOMPARABLE_C = (9, 1) # naming order is broken on purpose @@ -4277,40 +4453,32 @@ class Quadruple(Enum): class _ModuleWrapper: """We use this class as a namespace for swapping modules.""" + def __init__(self, module): self.__dict__.update(module.__dict__) -class TestConvert(unittest.TestCase): - def tearDown(self): - # Reset the module-level test variables to their original integer - # values, otherwise the already created enum values get converted - # instead. 
- g = globals() - for suffix in ['A', 'B', 'C', 'D', 'E', 'F']: - g['CONVERT_TEST_NAME_%s' % suffix] = 5 - g['CONVERT_STRING_TEST_NAME_%s' % suffix] = 5 - for suffix, value in (('A', 5), ('B', (9, 1)), ('C', 'value')): - g['UNCOMPARABLE_%s' % suffix] = value - for suffix, value in (('A', 2j), ('B', 3j), ('C', 1j)): - g['COMPLEX_%s' % suffix] = value - for suffix, value in (('1', 'hello'), ('2', 'goodbye')): - g['CONVERT_STR_TEST_%s' % suffix] = value - +class TestIntEnumConvert(unittest.TestCase): def test_convert_value_lookup_priority(self): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # We don't want the reverse lookup value to vary when there are # multiple possible names for a given value. It should always # report the first lexigraphical name in that case. self.assertEqual(test_type(5).name, 'CONVERT_TEST_NAME_A') - def test_convert_int(self): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + def test_convert(self): + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # Ensure that test_type has all of the desired names and values. self.assertEqual(test_type.CONVERT_TEST_NAME_F, test_type.CONVERT_TEST_NAME_A) @@ -4319,57 +4487,43 @@ def test_convert_int(self): self.assertEqual(test_type.CONVERT_TEST_NAME_D, 5) self.assertEqual(test_type.CONVERT_TEST_NAME_E, 5) # Ensure that test_type only picked up names matching the filter. - int_dir = dir(int) + [ - 'CONVERT_TEST_NAME_A', 'CONVERT_TEST_NAME_B', 'CONVERT_TEST_NAME_C', - 'CONVERT_TEST_NAME_D', 'CONVERT_TEST_NAME_E', 'CONVERT_TEST_NAME_F', - ] - self.assertEqual( - [name for name in dir(test_type) if name not in int_dir], - [], - msg='Names other than CONVERT_TEST_* found.', - ) + self.assertEqual([name for name in dir(test_type) + if name[0:2] not in ('CO', '__') + and name not in dir(IntEnum)], + [], msg='Names other than CONVERT_TEST_* found.') def test_convert_uncomparable(self): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('UNCOMPARABLE_')) + # We swap a module to some other object with `__dict__` + # because otherwise refleak is created. + # `_convert_` uses a module side effect that does this. 
See 30472 + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('UNCOMPARABLE_')) + # Should be ordered by `name` only: self.assertEqual( list(uncomp), [uncomp.UNCOMPARABLE_A, uncomp.UNCOMPARABLE_B, uncomp.UNCOMPARABLE_C], - ) + ) def test_convert_complex(self): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('COMPLEX_')) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('COMPLEX_')) + # Should be ordered by `name` only: self.assertEqual( list(uncomp), [uncomp.COMPLEX_A, uncomp.COMPLEX_B, uncomp.COMPLEX_C], - ) - - def test_convert_str(self): - test_type = enum.StrEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STR_'), - as_global=True) - # Ensure that test_type has all of the desired names and values. - self.assertEqual(test_type.CONVERT_STR_TEST_1, 'hello') - self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') - # Ensure that test_type only picked up names matching the filter. - str_dir = dir(str) + ['CONVERT_STR_TEST_1', 'CONVERT_STR_TEST_2'] - self.assertEqual( - [name for name in dir(test_type) if name not in str_dir], - [], - msg='Names other than CONVERT_STR_* found.', - ) - self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) - self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') - self.assertEqual(format(test_type.CONVERT_STR_TEST_1), 'hello') + ) def test_convert_raise(self): with self.assertRaises(AttributeError): @@ -4379,58 +4533,50 @@ def test_convert_raise(self): filter=lambda x: x.startswith('CONVERT_TEST_')) def test_convert_repr_and_str(self): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STRING_TEST_'), - as_global=True) + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STRING_TEST_')) self.assertEqual(repr(test_type.CONVERT_STRING_TEST_NAME_A), '%s.CONVERT_STRING_TEST_NAME_A' % SHORT_MODULE) - self.assertEqual(str(test_type.CONVERT_STRING_TEST_NAME_A), '5') + self.assertEqual(str(test_type.CONVERT_STRING_TEST_NAME_A), 'CONVERT_STRING_TEST_NAME_A') self.assertEqual(format(test_type.CONVERT_STRING_TEST_NAME_A), '5') +# global names for StrEnum._convert_ test +CONVERT_STR_TEST_2 = 'goodbye' +CONVERT_STR_TEST_1 = 'hello' -# helpers - -def enum_dir(cls): - # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ - if cls._member_type_ is object: - interesting = set() - if cls.__init_subclass__ is not object.__init_subclass__: - interesting.add('__init_subclass__') - return sorted(set([ - '__class__', '__contains__', '__doc__', '__getitem__', - '__iter__', '__len__', '__members__', '__module__', - '__name__', '__qualname__', - ] - + cls._member_names_ - ) | interesting - ) - else: - # return whatever mixed-in data type has - return sorted(set( - dir(cls._member_type_) - + cls._member_names_ - )) - -def member_dir(member): - if member.__class__._member_type_ is object: - allowed = set(['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'value']) - else: - allowed = set(dir(member)) - for cls in member.__class__.mro(): - 
for name, obj in cls.__dict__.items(): - if name[0] == '_': - continue - if isinstance(obj, enum.property): - if obj.fget is not None or name not in member._member_map_: - allowed.add(name) - else: - allowed.discard(name) - else: - allowed.add(name) - return sorted(allowed) +class TestStrEnumConvert(unittest.TestCase): + def test_convert(self): + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.StrEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STR_')) + # Ensure that test_type has all of the desired names and values. + self.assertEqual(test_type.CONVERT_STR_TEST_1, 'hello') + self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') + # Ensure that test_type only picked up names matching the filter. + self.assertEqual([name for name in dir(test_type) + if name[0:2] not in ('CO', '__') + and name not in dir(StrEnum)], + [], msg='Names other than CONVERT_STR_* found.') -missing = object() + def test_convert_repr_and_str(self): + with support.swap_item( + sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), + ): + test_type = enum.StrEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STR_')) + self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) + self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') + self.assertEqual(format(test_type.CONVERT_STR_TEST_1), 'hello') if __name__ == '__main__': diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index ac4626d0c456e..3f0e7270eb26f 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -908,7 +908,7 @@ def handler(signum, frame): %s - blocked = %s + blocked = %r signum = signal.SIGALRM # child: block and wait the signal diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index 56cc23dbbbf4e..394d2942483fb 100755 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -1517,11 +1517,9 @@ def testGetaddrinfo(self): infos = socket.getaddrinfo(HOST, 80, socket.AF_INET, socket.SOCK_STREAM) for family, type, _, _, _ in infos: self.assertEqual(family, socket.AF_INET) - self.assertEqual(repr(family), '') - self.assertEqual(str(family), '2') + self.assertEqual(str(family), 'AF_INET') self.assertEqual(type, socket.SOCK_STREAM) - self.assertEqual(repr(type), '') - self.assertEqual(str(type), '1') + self.assertEqual(str(type), 'SOCK_STREAM') infos = socket.getaddrinfo(HOST, None, 0, socket.SOCK_STREAM) for _, socktype, _, _, _ in infos: self.assertEqual(socktype, socket.SOCK_STREAM) @@ -1795,10 +1793,8 @@ def test_str_for_enums(self): # Make sure that the AF_* and SOCK_* constants have enum-like string # reprs. with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: - self.assertEqual(repr(s.family), '') - self.assertEqual(repr(s.type), '') - self.assertEqual(str(s.family), '2') - self.assertEqual(str(s.type), '1') + self.assertEqual(str(s.family), 'AF_INET') + self.assertEqual(str(s.type), 'SOCK_STREAM') def test_socket_consistent_sock_type(self): SOCK_NONBLOCK = getattr(socket, 'SOCK_NONBLOCK', 0) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index 64f4bce7f7781..f99a3e8da95f8 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -373,8 +373,7 @@ def test_str_for_enums(self): # Make sure that the PROTOCOL_* constants have enum-like string # reprs. 
proto = ssl.PROTOCOL_TLS_CLIENT - self.assertEqual(repr(proto), '<_SSLMethod.PROTOCOL_TLS_CLIENT: 16>') - self.assertEqual(str(proto), '16') + self.assertEqual(str(proto), 'PROTOCOL_TLS_CLIENT') ctx = ssl.SSLContext(proto) self.assertIs(ctx.protocol, proto) @@ -623,7 +622,7 @@ def test_openssl111_deprecations(self): with self.assertWarns(DeprecationWarning) as cm: ssl.SSLContext(protocol) self.assertEqual( - f'ssl.{protocol.name} is deprecated', + f'{protocol!r} is deprecated', str(cm.warning) ) @@ -632,9 +631,8 @@ def test_openssl111_deprecations(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT) with self.assertWarns(DeprecationWarning) as cm: ctx.minimum_version = version - version_text = '%s.%s' % (version.__class__.__name__, version.name) self.assertEqual( - f'ssl.{version_text} is deprecated', + f'ssl.{version!r} is deprecated', str(cm.warning) ) diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index 8e4e64808b688..d5e2c5266aae7 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1490,10 +1490,8 @@ def test_formatting_with_enum(self): # issue18780 import enum class Float(float, enum.Enum): - # a mixed-in type will use the name for %s etc. PI = 3.1415926 class Int(enum.IntEnum): - # IntEnum uses the value and not the name for %s etc. IDES = 15 class Str(enum.StrEnum): # StrEnum uses the value and not the name for %s etc. @@ -1510,10 +1508,8 @@ class Str(enum.StrEnum): # formatting jobs delegated from the string implementation: self.assertEqual('...%(foo)s...' % {'foo':Str.ABC}, '...abc...') - self.assertEqual('...%(foo)r...' % {'foo':Int.IDES}, - '......') self.assertEqual('...%(foo)s...' % {'foo':Int.IDES}, - '...15...') + '...IDES...') self.assertEqual('...%(foo)i...' % {'foo':Int.IDES}, '...15...') self.assertEqual('...%(foo)d...' % {'foo':Int.IDES}, diff --git a/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst b/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst deleted file mode 100644 index 2df487855785e..0000000000000 --- a/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst +++ /dev/null @@ -1,2 +0,0 @@ -``IntEnum``, ``IntFlag``, and ``StrEnum`` use the mixed-in type for their -``str()`` and ``format()`` output. From webhook-mailer at python.org Mon Jan 17 08:00:57 2022 From: webhook-mailer at python.org (vstinner) Date: Mon, 17 Jan 2022 13:00:57 -0000 Subject: [Python-checkins] bpo-13886: Skip PTY non-ASCII tests if readline is loaded (GH-30631) Message-ID: https://github.com/python/cpython/commit/ad6e640f910787e73fd00f59117fbd22cdf88c78 commit: ad6e640f910787e73fd00f59117fbd22cdf88c78 branch: main author: Victor Stinner committer: vstinner date: 2022-01-17T14:00:50+01:00 summary: bpo-13886: Skip PTY non-ASCII tests if readline is loaded (GH-30631) Skip test_builtin PTY tests on non-ASCII characters if the readline module is loaded. The readline module changes input() behavior, but test_builtin is not intented to test the readline module. When the readline module is loaded, PyOS_Readline() uses the readline implementation. In some cases, the Python readline callback rlhandler() is called by readline with a string without non-ASCII characters. 
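
For context, the skip described above boils down to a check of sys.modules before exercising input() over a PTY. The following is an illustrative sketch only, not part of the patch; the test class and method names here are hypothetical, while the actual change adds a skip_if_readline() helper to Lib/test/test_builtin.py, shown in the diff below.

    import sys
    import unittest

    class PTYNonASCIITests(unittest.TestCase):
        def test_input_tty_non_ascii(self):
            # When readline has been imported, PyOS_Readline() delegates to the
            # readline implementation, which can hand input() a string with the
            # non-ASCII bytes dropped, so skip instead of reporting a false failure.
            if 'readline' in sys.modules:
                self.skipTest("the readline module is loaded")
            # ... exercise input() over a PTY with b"quux\xe9" here ...

    if __name__ == '__main__':
        unittest.main()
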
files: A Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst M Lib/test/test_builtin.py diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 6dc4fa555021c..7456803221964 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -2090,12 +2090,24 @@ def test_input_tty(self): # is different and invokes GNU readline if available). self.check_input_tty("prompt", b"quux") + def skip_if_readline(self): + # bpo-13886: When the readline module is loaded, PyOS_Readline() uses + # the readline implementation. In some cases, the Python readline + # callback rlhandler() is called by readline with a string without + # non-ASCII characters. Skip tests on non-ASCII characters if the + # readline module is loaded, since test_builtin is not intented to test + # the readline module, but the builtins module. + if 'readline' in sys.modules: + self.skipTest("the readline module is loaded") + def test_input_tty_non_ascii(self): - # Check stdin/stdout encoding is used when invoking GNU readline + self.skip_if_readline() + # Check stdin/stdout encoding is used when invoking PyOS_Readline() self.check_input_tty("prompt?", b"quux\xe9", "utf-8") def test_input_tty_non_ascii_unicode_errors(self): - # Check stdin/stdout error handler is used when invoking GNU readline + self.skip_if_readline() + # Check stdin/stdout error handler is used when invoking PyOS_Readline() self.check_input_tty("prompt?", b"quux\xe9", "ascii") def test_input_no_stdout_fileno(self): diff --git a/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst b/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst new file mode 100644 index 0000000000000..cd19dce37d5c8 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst @@ -0,0 +1,3 @@ +Skip test_builtin PTY tests on non-ASCII characters if the readline module +is loaded. The readline module changes input() behavior, but test_builtin is +not intented to test the readline module. Patch by Victor Stinner. From webhook-mailer at python.org Mon Jan 17 08:35:17 2022 From: webhook-mailer at python.org (miss-islington) Date: Mon, 17 Jan 2022 13:35:17 -0000 Subject: [Python-checkins] bpo-13886: Skip PTY non-ASCII tests if readline is loaded (GH-30631) Message-ID: https://github.com/python/cpython/commit/1345b460f568afa8a6f9c0e2b23adba5015f208e commit: 1345b460f568afa8a6f9c0e2b23adba5015f208e branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-17T05:35:07-08:00 summary: bpo-13886: Skip PTY non-ASCII tests if readline is loaded (GH-30631) Skip test_builtin PTY tests on non-ASCII characters if the readline module is loaded. The readline module changes input() behavior, but test_builtin is not intented to test the readline module. When the readline module is loaded, PyOS_Readline() uses the readline implementation. In some cases, the Python readline callback rlhandler() is called by readline with a string without non-ASCII characters. 
(cherry picked from commit ad6e640f910787e73fd00f59117fbd22cdf88c78) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst M Lib/test/test_builtin.py diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 6dc4fa555021c..7456803221964 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -2090,12 +2090,24 @@ def test_input_tty(self): # is different and invokes GNU readline if available). self.check_input_tty("prompt", b"quux") + def skip_if_readline(self): + # bpo-13886: When the readline module is loaded, PyOS_Readline() uses + # the readline implementation. In some cases, the Python readline + # callback rlhandler() is called by readline with a string without + # non-ASCII characters. Skip tests on non-ASCII characters if the + # readline module is loaded, since test_builtin is not intented to test + # the readline module, but the builtins module. + if 'readline' in sys.modules: + self.skipTest("the readline module is loaded") + def test_input_tty_non_ascii(self): - # Check stdin/stdout encoding is used when invoking GNU readline + self.skip_if_readline() + # Check stdin/stdout encoding is used when invoking PyOS_Readline() self.check_input_tty("prompt?", b"quux\xe9", "utf-8") def test_input_tty_non_ascii_unicode_errors(self): - # Check stdin/stdout error handler is used when invoking GNU readline + self.skip_if_readline() + # Check stdin/stdout error handler is used when invoking PyOS_Readline() self.check_input_tty("prompt?", b"quux\xe9", "ascii") def test_input_no_stdout_fileno(self): diff --git a/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst b/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst new file mode 100644 index 0000000000000..cd19dce37d5c8 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst @@ -0,0 +1,3 @@ +Skip test_builtin PTY tests on non-ASCII characters if the readline module +is loaded. The readline module changes input() behavior, but test_builtin is +not intented to test the readline module. Patch by Victor Stinner. From webhook-mailer at python.org Mon Jan 17 08:47:34 2022 From: webhook-mailer at python.org (vstinner) Date: Mon, 17 Jan 2022 13:47:34 -0000 Subject: [Python-checkins] bpo-13886: Skip PTY non-ASCII tests if readline is loaded (GH-30631) (GH-30635) Message-ID: https://github.com/python/cpython/commit/0fbb9afbddb93408e34bdb7625002374cb2ad68c commit: 0fbb9afbddb93408e34bdb7625002374cb2ad68c branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-17T14:47:21+01:00 summary: bpo-13886: Skip PTY non-ASCII tests if readline is loaded (GH-30631) (GH-30635) Skip test_builtin PTY tests on non-ASCII characters if the readline module is loaded. The readline module changes input() behavior, but test_builtin is not intented to test the readline module. When the readline module is loaded, PyOS_Readline() uses the readline implementation. In some cases, the Python readline callback rlhandler() is called by readline with a string without non-ASCII characters. 
(cherry picked from commit ad6e640f910787e73fd00f59117fbd22cdf88c78) Co-authored-by: Victor Stinner Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst M Lib/test/test_builtin.py diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index d009f57e47e05..1f224bfe1ba99 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -1980,12 +1980,24 @@ def test_input_tty(self): # is different and invokes GNU readline if available). self.check_input_tty("prompt", b"quux") + def skip_if_readline(self): + # bpo-13886: When the readline module is loaded, PyOS_Readline() uses + # the readline implementation. In some cases, the Python readline + # callback rlhandler() is called by readline with a string without + # non-ASCII characters. Skip tests on non-ASCII characters if the + # readline module is loaded, since test_builtin is not intented to test + # the readline module, but the builtins module. + if 'readline' in sys.modules: + self.skipTest("the readline module is loaded") + def test_input_tty_non_ascii(self): - # Check stdin/stdout encoding is used when invoking GNU readline + self.skip_if_readline() + # Check stdin/stdout encoding is used when invoking PyOS_Readline() self.check_input_tty("prompt?", b"quux\xe9", "utf-8") def test_input_tty_non_ascii_unicode_errors(self): - # Check stdin/stdout error handler is used when invoking GNU readline + self.skip_if_readline() + # Check stdin/stdout error handler is used when invoking PyOS_Readline() self.check_input_tty("prompt?", b"quux\xe9", "ascii") def test_input_no_stdout_fileno(self): diff --git a/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst b/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst new file mode 100644 index 0000000000000..cd19dce37d5c8 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-17-13-10-04.bpo-13886.5mZH4b.rst @@ -0,0 +1,3 @@ +Skip test_builtin PTY tests on non-ASCII characters if the readline module +is loaded. The readline module changes input() behavior, but test_builtin is +not intented to test the readline module. Patch by Victor Stinner. From webhook-mailer at python.org Mon Jan 17 08:47:55 2022 From: webhook-mailer at python.org (vstinner) Date: Mon, 17 Jan 2022 13:47:55 -0000 Subject: [Python-checkins] bpo-46383: Fix signature of zoneinfo module_free function (GH-30607) (GH-30610) Message-ID: https://github.com/python/cpython/commit/7a822c92782ffda8fa32a4b30a95b9de7cc1b8e6 commit: 7a822c92782ffda8fa32a4b30a95b9de7cc1b8e6 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-17T14:47:51+01:00 summary: bpo-46383: Fix signature of zoneinfo module_free function (GH-30607) (GH-30610) (cherry picked from commit cfbde65df318eea243706ff876e5ef834c085e5f) Co-authored-by: Christian Heimes Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst M Modules/_zoneinfo.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst new file mode 100644 index 0000000000000..8f8b12732a690 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst @@ -0,0 +1,2 @@ +Fix invalid signature of ``_zoneinfo``'s ``module_free`` function to resolve +a crash on wasm32-emscripten platform. 
diff --git a/Modules/_zoneinfo.c b/Modules/_zoneinfo.c index 04fa09422b213..0388d27ce10a4 100644 --- a/Modules/_zoneinfo.c +++ b/Modules/_zoneinfo.c @@ -2608,7 +2608,7 @@ static PyTypeObject PyZoneInfo_ZoneInfoType = { // Specify the _zoneinfo module static PyMethodDef module_methods[] = {{NULL, NULL}}; static void -module_free(void) +module_free(void *m) { Py_XDECREF(_tzpath_find_tzfile); _tzpath_find_tzfile = NULL; From webhook-mailer at python.org Mon Jan 17 08:49:32 2022 From: webhook-mailer at python.org (vstinner) Date: Mon, 17 Jan 2022 13:49:32 -0000 Subject: [Python-checkins] bpo-44133: Skip PyThread_get_thread_native_id() if not available (GH-30636) Message-ID: https://github.com/python/cpython/commit/16901c0482734dbd389b09ca3edfcf3e22faeed7 commit: 16901c0482734dbd389b09ca3edfcf3e22faeed7 branch: main author: Victor Stinner committer: vstinner date: 2022-01-17T14:49:20+01:00 summary: bpo-44133: Skip PyThread_get_thread_native_id() if not available (GH-30636) test_capi.test_export_symbols() doesn't check if Python exports the "PyThread_get_thread_native_id" symbol if the _thread.get_native_id() function is not available (if the PY_HAVE_THREAD_NATIVE_ID macro is not defined). files: M Lib/test/test_capi.py diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 9f217852ec529..7ada8406a3584 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -2,6 +2,7 @@ # these are all functions _testcapi exports whose name begins with 'test_'. from collections import OrderedDict +import _thread import importlib.machinery import importlib.util import os @@ -648,7 +649,11 @@ def test_export_symbols(self): # "PyThread_get_thread_native_id" symbols are exported by the Python # (directly by the binary, or via by the Python dynamic library). ctypes = import_helper.import_module('ctypes') - names = ['PyThread_get_thread_native_id'] + names = [] + + # Test if the PY_HAVE_THREAD_NATIVE_ID macro is defined + if hasattr(_thread, 'get_native_id'): + names.append('PyThread_get_thread_native_id') # Python/frozenmain.c fails to build on Windows when the symbols are # missing: @@ -657,6 +662,7 @@ def test_export_symbols(self): # - PyInitFrozenExtensions if os.name != 'nt': names.append('Py_FrozenMain') + for name in names: with self.subTest(name=name): self.assertTrue(hasattr(ctypes.pythonapi, name)) From webhook-mailer at python.org Mon Jan 17 10:18:35 2022 From: webhook-mailer at python.org (ethanfurman) Date: Mon, 17 Jan 2022 15:18:35 -0000 Subject: [Python-checkins] bpo-40066: [Enum] skip failing doc test (GH-30637) Message-ID: https://github.com/python/cpython/commit/83d544b9292870eb44f6fca37df0aa351c4ef83a commit: 83d544b9292870eb44f6fca37df0aa351c4ef83a branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: ethanfurman date: 2022-01-17T07:18:13-08:00 summary: bpo-40066: [Enum] skip failing doc test (GH-30637) files: A Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst M Doc/howto/enum.rst M Doc/library/enum.rst M Doc/library/ssl.rst M Lib/enum.py M Lib/inspect.py M Lib/plistlib.py M Lib/re.py M Lib/ssl.py M Lib/test/test_enum.py M Lib/test/test_signal.py M Lib/test/test_socket.py M Lib/test/test_ssl.py M Lib/test/test_unicode.py diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index 6c09b9925c1de..fa0e2283ebc10 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -2,15 +2,10 @@ Enum HOWTO ========== -:Author: Ethan Furman - .. _enum-basic-tutorial: .. 
currentmodule:: enum -Basic Enum Tutorial -------------------- - An :class:`Enum` is a set of symbolic names bound to unique values. They are similar to global variables, but they offer a more useful :func:`repr()`, grouping, type-safety, and a few other features. @@ -28,6 +23,14 @@ selection of values. For example, the days of the week:: ... SATURDAY = 6 ... SUNDAY = 7 + Or perhaps the RGB primary colors:: + + >>> from enum import Enum + >>> class Color(Enum): + ... RED = 1 + ... GREEN = 2 + ... BLUE = 3 + As you can see, creating an :class:`Enum` is as simple as writing a class that inherits from :class:`Enum` itself. @@ -41,13 +44,14 @@ important, but either way that value can be used to get the corresponding member:: >>> Weekday(3) - Weekday.WEDNESDAY + -As you can see, the ``repr()`` of a member shows the enum name and the -member name. The ``str()`` on a member shows only its name:: +As you can see, the ``repr()`` of a member shows the enum name, the member name, +and the value. The ``str()`` of a member shows only the enum name and member +name:: >>> print(Weekday.THURSDAY) - THURSDAY + Weekday.THURSDAY The *type* of an enumeration member is the enum it belongs to:: @@ -97,8 +101,8 @@ The complete :class:`Weekday` enum now looks like this:: Now we can find out what today is! Observe:: >>> from datetime import date - >>> Weekday.from_date(date.today()) - Weekday.TUESDAY + >>> Weekday.from_date(date.today()) # doctest: +SKIP + Of course, if you're reading this on some other day, you'll see that day instead. @@ -124,21 +128,21 @@ Just like the original :class:`Weekday` enum above, we can have a single selecti >>> first_week_day = Weekday.MONDAY >>> first_week_day - Weekday.MONDAY + But :class:`Flag` also allows us to combine several members into a single variable:: >>> weekend = Weekday.SATURDAY | Weekday.SUNDAY >>> weekend - Weekday.SATURDAY|Weekday.SUNDAY + You can even iterate over a :class:`Flag` variable:: >>> for day in weekend: ... print(day) - SATURDAY - SUNDAY + Weekday.SATURDAY + Weekday.SUNDAY Okay, let's get some chores set up:: @@ -173,6 +177,7 @@ yourself some work and use :func:`auto()` for the values:: .. _enum-advanced-tutorial: + Programmatic access to enumeration members and their attributes --------------------------------------------------------------- @@ -181,16 +186,16 @@ situations where ``Color.RED`` won't do because the exact color is not known at program-writing time). ``Enum`` allows such access:: >>> Color(1) - Color.RED + >>> Color(3) - Color.BLUE + If you want to access enum members by *name*, use item access:: >>> Color['RED'] - Color.RED + >>> Color['GREEN'] - Color.GREEN + If you have an enum member and need its :attr:`name` or :attr:`value`:: @@ -212,7 +217,7 @@ Having two enum members with the same name is invalid:: ... Traceback (most recent call last): ... - TypeError: 'SQUARE' already defined as: 2 + TypeError: 'SQUARE' already defined as 2 However, an enum member can have other names associated with it. Given two entries ``A`` and ``B`` with the same value (and ``A`` defined first), ``B`` @@ -227,11 +232,11 @@ By-name lookup of ``B`` will also return the member ``A``:: ... ALIAS_FOR_SQUARE = 2 ... >>> Shape.SQUARE - Shape.SQUARE + >>> Shape.ALIAS_FOR_SQUARE - Shape.SQUARE + >>> Shape(2) - Shape.SQUARE + .. 
note:: @@ -299,7 +304,7 @@ Iteration Iterating over the members of an enum does not provide the aliases:: >>> list(Shape) - [Shape.SQUARE, Shape.DIAMOND, Shape.CIRCLE] + [, , ] The special attribute ``__members__`` is a read-only ordered mapping of names to members. It includes all names defined in the enumeration, including the @@ -308,10 +313,10 @@ aliases:: >>> for name, member in Shape.__members__.items(): ... name, member ... - ('SQUARE', Shape.SQUARE) - ('DIAMOND', Shape.DIAMOND) - ('CIRCLE', Shape.CIRCLE) - ('ALIAS_FOR_SQUARE', Shape.SQUARE) + ('SQUARE', ) + ('DIAMOND', ) + ('CIRCLE', ) + ('ALIAS_FOR_SQUARE', ) The ``__members__`` attribute can be used for detailed programmatic access to the enumeration members. For example, finding all the aliases:: @@ -360,8 +365,8 @@ below):: Allowed members and attributes of enumerations ---------------------------------------------- -Most of the examples above use integers for enumeration values. Using integers is -short and handy (and provided by default by the `Functional API`_), but not +Most of the examples above use integers for enumeration values. Using integers +is short and handy (and provided by default by the `Functional API`_), but not strictly enforced. In the vast majority of use-cases, one doesn't care what the actual value of an enumeration is. But if the value *is* important, enumerations can have arbitrary values. @@ -389,7 +394,7 @@ usual. If we have this enumeration:: Then:: >>> Mood.favorite_mood() - Mood.HAPPY + >>> Mood.HAPPY.describe() ('HAPPY', 3) >>> str(Mood.FUNKY) @@ -425,7 +430,7 @@ any members. So this is forbidden:: ... Traceback (most recent call last): ... - TypeError: MoreColor: cannot extend enumeration 'Color' + TypeError: cannot extend But this is allowed:: @@ -476,11 +481,9 @@ The :class:`Enum` class is callable, providing the following functional API:: >>> Animal >>> Animal.ANT - Animal.ANT - >>> Animal.ANT.value - 1 + >>> list(Animal) - [Animal.ANT, Animal.BEE, Animal.CAT, Animal.DOG] + [, , , ] The semantics of this API resemble :class:`~collections.namedtuple`. The first argument of the call to :class:`Enum` is the name of the enumeration. @@ -625,16 +628,7 @@ StrEnum The second variation of :class:`Enum` that is provided is also a subclass of :class:`str`. Members of a :class:`StrEnum` can be compared to strings; by extension, string enumerations of different types can also be compared -to each other. :class:`StrEnum` exists to help avoid the problem of getting -an incorrect member:: - - >>> from enum import StrEnum - >>> class Directions(StrEnum): - ... NORTH = 'north', # notice the trailing comma - ... SOUTH = 'south' - -Before :class:`StrEnum`, ``Directions.NORTH`` would have been the :class:`tuple` -``('north',)``. +to each other. .. versionadded:: 3.11 @@ -645,9 +639,8 @@ IntFlag The next variation of :class:`Enum` provided, :class:`IntFlag`, is also based on :class:`int`. The difference being :class:`IntFlag` members can be combined using the bitwise operators (&, \|, ^, ~) and the result is still an -:class:`IntFlag` member, if possible. However, as the name implies, :class:`IntFlag` -members also subclass :class:`int` and can be used wherever an :class:`int` is -used. +:class:`IntFlag` member, if possible. Like :class:`IntEnum`, :class:`IntFlag` +members are also integers and can be used wherever an :class:`int` is used. .. note:: @@ -670,7 +663,7 @@ Sample :class:`IntFlag` class:: ... X = 1 ... 
>>> Perm.R | Perm.W - Perm.R|Perm.W + >>> Perm.R + Perm.W 6 >>> RW = Perm.R | Perm.W @@ -685,11 +678,11 @@ It is also possible to name the combinations:: ... X = 1 ... RWX = 7 >>> Perm.RWX - Perm.RWX + >>> ~Perm.RWX - Perm(0) + >>> Perm(7) - Perm.RWX + .. note:: @@ -702,7 +695,7 @@ Another important difference between :class:`IntFlag` and :class:`Enum` is that if no flags are set (the value is 0), its boolean evaluation is :data:`False`:: >>> Perm.R & Perm.X - Perm(0) + >>> bool(Perm.R & Perm.X) False @@ -710,7 +703,7 @@ Because :class:`IntFlag` members are also subclasses of :class:`int` they can be combined with them (but may lose :class:`IntFlag` membership:: >>> Perm.X | 4 - Perm.R|Perm.X + >>> Perm.X | 8 9 @@ -726,7 +719,7 @@ be combined with them (but may lose :class:`IntFlag` membership:: :class:`IntFlag` members can also be iterated over:: >>> list(RW) - [Perm.R, Perm.W] + [, ] .. versionadded:: 3.11 @@ -753,7 +746,7 @@ flags being set, the boolean evaluation is :data:`False`:: ... GREEN = auto() ... >>> Color.RED & Color.GREEN - Color(0) + >>> bool(Color.RED & Color.GREEN) False @@ -767,7 +760,7 @@ while combinations of flags won't:: ... WHITE = RED | BLUE | GREEN ... >>> Color.WHITE - Color.WHITE + Giving a name to the "no flags set" condition does not change its boolean value:: @@ -779,7 +772,7 @@ value:: ... GREEN = auto() ... >>> Color.BLACK - Color.BLACK + >>> bool(Color.BLACK) False @@ -787,7 +780,7 @@ value:: >>> purple = Color.RED | Color.BLUE >>> list(purple) - [Color.RED, Color.BLUE] + [, ] .. versionadded:: 3.11 @@ -812,16 +805,16 @@ simple to implement independently:: pass This demonstrates how similar derived enumerations can be defined; for example -a :class:`StrEnum` that mixes in :class:`str` instead of :class:`int`. +a :class:`FloatEnum` that mixes in :class:`float` instead of :class:`int`. Some rules: 1. When subclassing :class:`Enum`, mix-in types must appear before :class:`Enum` itself in the sequence of bases, as in the :class:`IntEnum` example above. -2. Mix-in types must be subclassable. For example, - :class:`bool` and :class:`range` are not subclassable - and will throw an error during Enum creation if used as the mix-in type. +2. Mix-in types must be subclassable. For example, :class:`bool` and + :class:`range` are not subclassable and will throw an error during Enum + creation if used as the mix-in type. 3. While :class:`Enum` can have members of any type, once you mix in an additional type, all the members must have values of that type, e.g. :class:`int` above. This restriction does not apply to mix-ins which only @@ -829,15 +822,18 @@ Some rules: 4. When another data type is mixed in, the :attr:`value` attribute is *not the same* as the enum member itself, although it is equivalent and will compare equal. -5. %-style formatting: `%s` and `%r` call the :class:`Enum` class's +5. %-style formatting: ``%s`` and ``%r`` call the :class:`Enum` class's :meth:`__str__` and :meth:`__repr__` respectively; other codes (such as - `%i` or `%h` for IntEnum) treat the enum member as its mixed-in type. + ``%i`` or ``%h`` for IntEnum) treat the enum member as its mixed-in type. 6. :ref:`Formatted string literals `, :meth:`str.format`, - and :func:`format` will use the mixed-in type's :meth:`__format__` - unless :meth:`__str__` or :meth:`__format__` is overridden in the subclass, - in which case the overridden methods or :class:`Enum` methods will be used. 
- Use the !s and !r format codes to force usage of the :class:`Enum` class's - :meth:`__str__` and :meth:`__repr__` methods. + and :func:`format` will use the enum's :meth:`__str__` method. + +.. note:: + + Because :class:`IntEnum`, :class:`IntFlag`, and :class:`StrEnum` are + designed to be drop-in replacements for existing constants, their + :meth:`__str__` method has been reset to their data types + :meth:`__str__` method. When to use :meth:`__new__` vs. :meth:`__init__` ------------------------------------------------ @@ -866,10 +862,10 @@ want one of them to be the value:: ... >>> print(Coordinate['PY']) - PY + Coordinate.PY >>> print(Coordinate(3)) - VY + Coordinate.VY Finer Points @@ -927,8 +923,8 @@ and raise an error if the two do not match:: Traceback (most recent call last): ... TypeError: member order does not match _order_: - ['RED', 'BLUE', 'GREEN'] - ['RED', 'GREEN', 'BLUE'] + ['RED', 'BLUE', 'GREEN'] + ['RED', 'GREEN', 'BLUE'] .. note:: @@ -949,35 +945,36 @@ but remain normal attributes. """""""""""""""""""" Enum members are instances of their enum class, and are normally accessed as -``EnumClass.member``. In Python versions ``3.5`` to ``3.9`` you could access -members from other members -- this practice was discouraged, and in ``3.12`` -:class:`Enum` will return to not allowing it, while in ``3.10`` and ``3.11`` -it will raise a :exc:`DeprecationWarning`:: +``EnumClass.member``. In Python versions ``3.5`` to ``3.10`` you could access +members from other members -- this practice was discouraged, and in ``3.11`` +:class:`Enum` returns to not allowing it:: >>> class FieldTypes(Enum): ... name = 0 ... value = 1 ... size = 2 ... - >>> FieldTypes.value.size # doctest: +SKIP - DeprecationWarning: accessing one member from another is not supported, - and will be disabled in 3.12 - + >>> FieldTypes.value.size + Traceback (most recent call last): + ... + AttributeError: member has no attribute 'size' + .. versionchanged:: 3.5 +.. versionchanged:: 3.11 Creating members that are mixed with other data types """"""""""""""""""""""""""""""""""""""""""""""""""""" When subclassing other data types, such as :class:`int` or :class:`str`, with -an :class:`Enum`, all values after the `=` are passed to that data type's +an :class:`Enum`, all values after the ``=`` are passed to that data type's constructor. For example:: - >>> class MyEnum(IntEnum): - ... example = '11', 16 # '11' will be interpreted as a hexadecimal - ... # number - >>> MyEnum.example.value + >>> class MyEnum(IntEnum): # help(int) -> int(x, base=10) -> integer + ... example = '11', 16 # so x='11' and base=16 + ... + >>> MyEnum.example.value # and hex(11) is... 17 @@ -1000,13 +997,12 @@ Plain :class:`Enum` classes always evaluate as :data:`True`. """"""""""""""""""""""""""""" If you give your enum subclass extra methods, like the `Planet`_ -class below, those methods will show up in a :func:`dir` of the member and the -class. 
Attributes defined in an :func:`__init__` method will only show up in a -:func:`dir` of the member:: +class below, those methods will show up in a :func:`dir` of the member, +but not of the class:: - >>> dir(Planet) - ['EARTH', 'JUPITER', 'MARS', 'MERCURY', 'NEPTUNE', 'SATURN', 'URANUS', 'VENUS', '__class__', '__doc__', '__init__', '__members__', '__module__', 'surface_gravity'] - >>> dir(Planet.EARTH) + >>> dir(Planet) # doctest: +SKIP + ['EARTH', 'JUPITER', 'MARS', 'MERCURY', 'NEPTUNE', 'SATURN', 'URANUS', 'VENUS', '__class__', '__doc__', '__members__', '__module__'] + >>> dir(Planet.EARTH) # doctest: +SKIP ['__class__', '__doc__', '__module__', 'mass', 'name', 'radius', 'surface_gravity', 'value'] @@ -1025,19 +1021,10 @@ are comprised of a single bit:: ... CYAN = GREEN | BLUE ... >>> Color(3) # named combination - Color.YELLOW + >>> Color(7) # not named combination - Color.RED|Color.GREEN|Color.BLUE + -``StrEnum`` and :meth:`str.__str__` -""""""""""""""""""""""""""""""""""" - -An important difference between :class:`StrEnum` and other Enums is the -:meth:`__str__` method; because :class:`StrEnum` members are strings, some -parts of Python will read the string data directly, while others will call -:meth:`str()`. To make those two operations have the same result, -:meth:`StrEnum.__str__` will be the same as :meth:`str.__str__` so that -``str(StrEnum.member) == StrEnum.member`` is true. ``Flag`` and ``IntFlag`` minutia """""""""""""""""""""""""""""""" @@ -1060,16 +1047,16 @@ the following are true: - only canonical flags are returned during iteration:: >>> list(Color.WHITE) - [Color.RED, Color.GREEN, Color.BLUE] + [, , ] - negating a flag or flag set returns a new flag/flag set with the corresponding positive integer value:: >>> Color.BLUE - Color.BLUE + >>> ~Color.BLUE - Color.RED|Color.GREEN + - names of pseudo-flags are constructed from their members' names:: @@ -1079,25 +1066,29 @@ the following are true: - multi-bit flags, aka aliases, can be returned from operations:: >>> Color.RED | Color.BLUE - Color.PURPLE + >>> Color(7) # or Color(-1) - Color.WHITE + >>> Color(0) - Color.BLACK + -- membership / containment checking has changed slightly -- zero-valued flags - are never considered to be contained:: +- membership / containment checking: zero-valued flags are always considered + to be contained:: >>> Color.BLACK in Color.WHITE - False + True - otherwise, if all bits of one flag are in the other flag, True is returned:: + otherwise, only if all bits of one flag are in the other flag will True + be returned:: >>> Color.PURPLE in Color.WHITE True + >>> Color.GREEN in Color.PURPLE + False + There is a new boundary mechanism that controls how out-of-range / invalid bits are handled: ``STRICT``, ``CONFORM``, ``EJECT``, and ``KEEP``: @@ -1181,7 +1172,7 @@ Using :class:`auto` would look like:: ... GREEN = auto() ... >>> Color.GREEN - + Using :class:`object` @@ -1194,10 +1185,24 @@ Using :class:`object` would look like:: ... GREEN = object() ... BLUE = object() ... + >>> Color.GREEN # doctest: +SKIP + > + +This is also a good example of why you might want to write your own +:meth:`__repr__`:: + + >>> class Color(Enum): + ... RED = object() + ... GREEN = object() + ... BLUE = object() + ... def __repr__(self): + ... return "<%s.%s>" % (self.__class__.__name__, self._name_) + ... >>> Color.GREEN + Using a descriptive string """""""""""""""""""""""""" @@ -1209,9 +1214,7 @@ Using a string as the value would look like:: ... BLUE = 'too fast!' ... 
>>> Color.GREEN - - >>> Color.GREEN.value - 'go' + Using a custom :meth:`__new__` @@ -1232,9 +1235,7 @@ Using an auto-numbering :meth:`__new__` would look like:: ... BLUE = () ... >>> Color.GREEN - - >>> Color.GREEN.value - 2 + To make a more general purpose ``AutoNumber``, add ``*args`` to the signature:: @@ -1257,7 +1258,7 @@ to handle any extra arguments:: ... BLEACHED_CORAL = () # New color, no Pantone code yet! ... >>> Swatch.SEA_GREEN - + >>> Swatch.SEA_GREEN.pantone '1246' >>> Swatch.BLEACHED_CORAL.pantone @@ -1384,30 +1385,9 @@ An example to show the :attr:`_ignore_` attribute in use:: ... Period['day_%d' % i] = i ... >>> list(Period)[:2] - [Period.day_0, Period.day_1] + [, ] >>> list(Period)[-2:] - [Period.day_365, Period.day_366] - - -Conforming input to Flag -^^^^^^^^^^^^^^^^^^^^^^^^ - -To create a :class:`Flag` enum that is more resilient to out-of-bounds results -from mathematical operations, you can use the :attr:`FlagBoundary.CONFORM` -setting:: - - >>> from enum import Flag, CONFORM, auto - >>> class Weekday(Flag, boundary=CONFORM): - ... MONDAY = auto() - ... TUESDAY = auto() - ... WEDNESDAY = auto() - ... THURSDAY = auto() - ... FRIDAY = auto() - ... SATURDAY = auto() - ... SUNDAY = auto() - >>> today = Weekday.TUESDAY - >>> Weekday(today + 22) # what day is three weeks from tomorrow? - >>> Weekday.WEDNESDAY + [, ] .. _enumtype-examples: diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 8bb19dcdf2b61..a37c9d4506241 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -31,7 +31,7 @@ An enumeration: * uses *call* syntax to return members by value * uses *index* syntax to return members by name -Enumerations are created either by using the :keyword:`class` syntax, or by +Enumerations are created either by using :keyword:`class` syntax, or by using function-call syntax:: >>> from enum import Enum @@ -45,7 +45,7 @@ using function-call syntax:: >>> # functional syntax >>> Color = Enum('Color', ['RED', 'GREEN', 'BLUE']) -Even though we can use the :keyword:`class` syntax to create Enums, Enums +Even though we can use :keyword:`class` syntax to create Enums, Enums are not normal Python classes. See :ref:`How are Enums different? ` for more details. @@ -53,7 +53,7 @@ are not normal Python classes. See - The class :class:`Color` is an *enumeration* (or *enum*) - The attributes :attr:`Color.RED`, :attr:`Color.GREEN`, etc., are - *enumeration members* (or *enum members*) and are functionally constants. + *enumeration members* (or *members*) and are functionally constants. - The enum members have *names* and *values* (the name of :attr:`Color.RED` is ``RED``, the value of :attr:`Color.BLUE` is ``3``, etc.) @@ -110,15 +110,10 @@ Module Contents :class:`StrEnum` defaults to the lower-cased version of the member name, while other Enums default to 1 and increase from there. - :func:`global_enum` - - :class:`Enum` class decorator to apply the appropriate global `__repr__`, - and export its members into the global name space. - - :func:`.property` + :func:`property` Allows :class:`Enum` members to have attributes without conflicting with - other members' names. + member names. :func:`unique` @@ -131,7 +126,7 @@ Module Contents .. versionadded:: 3.6 ``Flag``, ``IntFlag``, ``auto`` -.. versionadded:: 3.11 ``StrEnum``, ``EnumCheck``, ``FlagBoundary`` +.. versionadded:: 3.11 ``StrEnum``, ``EnumCheck``, ``FlagBoundary``, ``property`` --------------- @@ -145,6 +140,11 @@ Data Types to subclass *EnumType* -- see :ref:`Subclassing EnumType ` for details. 
+ *EnumType* is responsible for setting the correct :meth:`__repr__`, + :meth:`__str__`, :meth:`__format__`, and :meth:`__reduce__` methods on the + final *enum*, as well as creating the enum members, properly handling + duplicates, providing iteration over the enum class, etc. + .. method:: EnumType.__contains__(cls, member) Returns ``True`` if member belongs to the ``cls``:: @@ -162,32 +162,31 @@ Data Types .. method:: EnumType.__dir__(cls) Returns ``['__class__', '__doc__', '__members__', '__module__']`` and the - names of the members in ``cls``. User-defined methods and methods from - mixin classes will also be included:: + names of the members in *cls*:: >>> dir(Color) - ['BLUE', 'GREEN', 'RED', '__class__', '__doc__', '__members__', '__module__'] + ['BLUE', 'GREEN', 'RED', '__class__', '__contains__', '__doc__', '__getitem__', '__init_subclass__', '__iter__', '__len__', '__members__', '__module__', '__name__', '__qualname__'] .. method:: EnumType.__getattr__(cls, name) Returns the Enum member in *cls* matching *name*, or raises an :exc:`AttributeError`:: >>> Color.GREEN - Color.GREEN + .. method:: EnumType.__getitem__(cls, name) - Returns the Enum member in *cls* matching *name*, or raises a :exc:`KeyError`:: + Returns the Enum member in *cls* matching *name*, or raises an :exc:`KeyError`:: >>> Color['BLUE'] - Color.BLUE + .. method:: EnumType.__iter__(cls) Returns each member in *cls* in definition order:: >>> list(Color) - [Color.RED, Color.GREEN, Color.BLUE] + [, , ] .. method:: EnumType.__len__(cls) @@ -201,7 +200,7 @@ Data Types Returns each member in *cls* in reverse definition order:: >>> list(reversed(Color)) - [Color.BLUE, Color.GREEN, Color.RED] + [, , ] .. class:: Enum @@ -232,7 +231,7 @@ Data Types .. attribute:: Enum._ignore_ ``_ignore_`` is only used during creation and is removed from the - enumeration once that is complete. + enumeration once creation is complete. ``_ignore_`` is a list of names that will not become members, and whose names will also be removed from the completed enumeration. See @@ -261,7 +260,7 @@ Data Types .. method:: Enum.__dir__(self) Returns ``['__class__', '__doc__', '__module__', 'name', 'value']`` and - any public methods defined on ``self.__class__`` or a mixin class:: + any public methods defined on *self.__class__*:: >>> from datetime import date >>> class Weekday(Enum): @@ -276,7 +275,7 @@ Data Types ... def today(cls): ... print('today is %s' % cls(date.today().isoweekday()).name) >>> dir(Weekday.SATURDAY) - ['__class__', '__doc__', '__module__', 'name', 'today', 'value'] + ['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'today', 'value'] .. method:: Enum._generate_next_value_(name, start, count, last_values) @@ -298,6 +297,11 @@ Data Types >>> PowersOfThree.SECOND.value 6 + .. method:: Enum.__init_subclass__(cls, \**kwds) + + A *classmethod* that is used to further configure subsequent subclasses. + By default, does nothing. + .. method:: Enum._missing_(cls, value) A *classmethod* for looking up values not found in *cls*. By default it @@ -317,42 +321,55 @@ Data Types >>> Build.DEBUG.value 'debug' >>> Build('deBUG') - Build.DEBUG + .. method:: Enum.__repr__(self) Returns the string used for *repr()* calls. By default, returns the - *Enum* name and the member name, but can be overridden:: + *Enum* name, member name, and value, but can be overridden:: - >>> class OldStyle(Enum): - ... RETRO = auto() - ... OLD_SCHOOl = auto() - ... YESTERYEAR = auto() + >>> class OtherStyle(Enum): + ... ALTERNATE = auto() + ... 
OTHER = auto() + ... SOMETHING_ELSE = auto() ... def __repr__(self): ... cls_name = self.__class__.__name__ - ... return f'<{cls_name}.{self.name}: {self.value}>' - >>> OldStyle.RETRO - + ... return f'{cls_name}.{self.name}' + >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" + (OtherStyle.ALTERNATE, 'OtherStyle.ALTERNATE', 'OtherStyle.ALTERNATE') .. method:: Enum.__str__(self) Returns the string used for *str()* calls. By default, returns the - member name, but can be overridden:: + *Enum* name and member name, but can be overridden:: - >>> class OldStyle(Enum): - ... RETRO = auto() - ... OLD_SCHOOl = auto() - ... YESTERYEAR = auto() + >>> class OtherStyle(Enum): + ... ALTERNATE = auto() + ... OTHER = auto() + ... SOMETHING_ELSE = auto() ... def __str__(self): - ... cls_name = self.__class__.__name__ - ... return f'{cls_name}.{self.name}' - >>> OldStyle.RETRO - OldStyle.RETRO + ... return f'{self.name}' + >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" + (, 'ALTERNATE', 'ALTERNATE') + + .. method:: Enum.__format__(self) + + Returns the string used for *format()* and *f-string* calls. By default, + returns :meth:`__str__` returns, but can be overridden:: + + >>> class OtherStyle(Enum): + ... ALTERNATE = auto() + ... OTHER = auto() + ... SOMETHING_ELSE = auto() + ... def __format__(self, spec): + ... return f'{self.name}' + >>> OtherStyle.ALTERNATE, str(OtherStyle.ALTERNATE), f"{OtherStyle.ALTERNATE}" + (, 'OtherStyle.ALTERNATE', 'ALTERNATE') -.. note:: + .. note:: - Using :class:`auto` with :class:`Enum` results in integers of increasing value, - starting with ``1``. + Using :class:`auto` with :class:`Enum` results in integers of increasing value, + starting with ``1``. .. class:: IntEnum @@ -367,7 +384,7 @@ Data Types ... TWO = 2 ... THREE = 3 >>> Numbers.THREE - Numbers.THREE + >>> Numbers.ONE + Numbers.TWO 3 >>> Numbers.THREE + 5 @@ -375,10 +392,14 @@ Data Types >>> Numbers.THREE == 3 True -.. note:: + .. note:: - Using :class:`auto` with :class:`IntEnum` results in integers of increasing value, - starting with ``1``. + Using :class:`auto` with :class:`IntEnum` results in integers of increasing + value, starting with ``1``. + + .. versionchanged:: 3.11 :meth:`__str__` is now :func:`int.__str__` to + better support the *replacement of existing constants* use-case. + :meth:`__format__` was already :func:`int.__format__` for that same reason. .. class:: StrEnum @@ -392,13 +413,16 @@ Data Types instead of ``isinstance(str, unknown)``), and in those locations you will need to use ``str(StrEnum.member)``. + .. note:: -.. note:: + Using :class:`auto` with :class:`StrEnum` results in the lower-cased member + name as the value. - Using :class:`auto` with :class:`StrEnum` results in values of the member name, - lower-cased. + .. note:: :meth:`__str__` is :func:`str.__str__` to better support the + *replacement of existing constants* use-case. :meth:`__format__` is likewise + :func:`int.__format__` for that same reason. -.. versionadded:: 3.11 + .. versionadded:: 3.11 .. class:: Flag @@ -431,9 +455,9 @@ Data Types Returns all contained members:: >>> list(Color.RED) - [Color.RED] + [] >>> list(purple) - [Color.RED, Color.BLUE] + [, ] .. method:: __len__(self): @@ -461,42 +485,52 @@ Data Types Returns current flag binary or'ed with other:: >>> Color.RED | Color.GREEN - Color.RED|Color.GREEN + .. 
method:: __and__(self, other) Returns current flag binary and'ed with other:: >>> purple & white - Color.RED|Color.BLUE + >>> purple & Color.GREEN - 0x0 + .. method:: __xor__(self, other) Returns current flag binary xor'ed with other:: >>> purple ^ white - Color.GREEN + >>> purple ^ Color.GREEN - Color.RED|Color.GREEN|Color.BLUE + .. method:: __invert__(self): Returns all the flags in *type(self)* that are not in self:: >>> ~white - 0x0 + >>> ~purple - Color.GREEN + >>> ~Color.RED - Color.GREEN|Color.BLUE + + + .. method:: _numeric_repr_ + + Function used to format any remaining unnamed numeric values. Default is + the value's repr; common choices are :func:`hex` and :func:`oct`. + + .. note:: -.. note:: + Using :class:`auto` with :class:`Flag` results in integers that are powers + of two, starting with ``1``. - Using :class:`auto` with :class:`Flag` results in integers that are powers - of two, starting with ``1``. + .. versionchanged:: 3.11 The *repr()* of zero-valued flags has changed. It + is now:: + >>> Color(0) # doctest: +SKIP + .. class:: IntFlag @@ -509,9 +543,9 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> Color.RED & 2 - 0x0 + >>> Color.RED | 2 - Color.RED|Color.GREEN + If any integer operation is performed with an *IntFlag* member, the result is not an *IntFlag*:: @@ -524,15 +558,25 @@ Data Types * the result is a valid *IntFlag*: an *IntFlag* is returned * the result is not a valid *IntFlag*: the result depends on the *FlagBoundary* setting -.. note:: + The *repr()* of unnamed zero-valued flags has changed. It is now: + + >>> Color(0) + + + .. note:: + + Using :class:`auto` with :class:`IntFlag` results in integers that are powers + of two, starting with ``1``. + + .. versionchanged:: 3.11 :meth:`__str__` is now :func:`int.__str__` to + better support the *replacement of existing constants* use-case. + :meth:`__format__` was already :func:`int.__format__` for that same reason. - Using :class:`auto` with :class:`IntFlag` results in integers that are powers - of two, starting with ``1``. .. class:: EnumCheck *EnumCheck* contains the options used by the :func:`verify` decorator to ensure - various constraints; failed constraints result in a :exc:`TypeError`. + various constraints; failed constraints result in a :exc:`ValueError`. .. attribute:: UNIQUE @@ -582,11 +626,11 @@ Data Types ... ValueError: invalid Flag 'Color': aliases WHITE and NEON are missing combined values of 0x18 [use enum.show_flag_values(value) for details] -.. note:: + .. note:: - CONTINUOUS and NAMED_FLAGS are designed to work with integer-valued members. + CONTINUOUS and NAMED_FLAGS are designed to work with integer-valued members. -.. versionadded:: 3.11 + .. versionadded:: 3.11 .. class:: FlagBoundary @@ -606,7 +650,7 @@ Data Types >>> StrictFlag(2**2 + 2**4) Traceback (most recent call last): ... - ValueError: StrictFlag: invalid value: 20 + ValueError: invalid value 20 given 0b0 10100 allowed 0b0 00111 @@ -621,7 +665,7 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> ConformFlag(2**2 + 2**4) - ConformFlag.BLUE + .. attribute:: EJECT @@ -647,12 +691,52 @@ Data Types ... GREEN = auto() ... BLUE = auto() >>> KeepFlag(2**2 + 2**4) - KeepFlag.BLUE|0x10 + .. versionadded:: 3.11 --------------- +Supported ``__dunder__`` names +"""""""""""""""""""""""""""""" + +:attr:`__members__` is a read-only ordered mapping of ``member_name``:``member`` +items. It is only available on the class. 
+ +:meth:`__new__`, if specified, must create and return the enum members; it is +also a very good idea to set the member's :attr:`_value_` appropriately. Once +all the members are created it is no longer used. + + +Supported ``_sunder_`` names +"""""""""""""""""""""""""""" + +- ``_name_`` -- name of the member +- ``_value_`` -- value of the member; can be set / modified in ``__new__`` + +- ``_missing_`` -- a lookup function used when a value is not found; may be + overridden +- ``_ignore_`` -- a list of names, either as a :class:`list` or a :class:`str`, + that will not be transformed into members, and will be removed from the final + class +- ``_order_`` -- used in Python 2/3 code to ensure member order is consistent + (class attribute, removed during class creation) +- ``_generate_next_value_`` -- used to get an appropriate value for an enum + member; may be overridden + + .. note:: + + For standard :class:`Enum` classes the next value chosen is the last value seen + incremented by one. + + For :class:`Flag` classes the next value chosen will be the next highest + power-of-two, regardless of the last value seen. + +.. versionadded:: 3.6 ``_missing_``, ``_order_``, ``_generate_next_value_`` +.. versionadded:: 3.7 ``_ignore_`` + +--------------- + Utilities and Decorators ------------------------ @@ -668,15 +752,6 @@ Utilities and Decorators ``_generate_next_value_`` can be overridden to customize the values used by *auto*. -.. decorator:: global_enum - - A :keyword:`class` decorator specifically for enumerations. It replaces the - :meth:`__repr__` method with one that shows *module_name*.*member_name*. It - also injects the members, and their aliases, into the global namespace they - were defined in. - -.. versionadded:: 3.11 - .. decorator:: property A decorator similar to the built-in *property*, but specifically for @@ -688,7 +763,7 @@ Utilities and Decorators *Enum* class, and *Enum* subclasses can define members with the names ``value`` and ``name``. -.. versionadded:: 3.11 + .. versionadded:: 3.11 .. decorator:: unique @@ -714,7 +789,7 @@ Utilities and Decorators :class:`EnumCheck` are used to specify which constraints should be checked on the decorated enumeration. -.. versionadded:: 3.11 + .. versionadded:: 3.11 --------------- @@ -726,14 +801,20 @@ Notes These three enum types are designed to be drop-in replacements for existing integer- and string-based values; as such, they have extra limitations: - - ``format()`` will use the value of the enum member, unless ``__str__`` - has been overridden + - ``__str__`` uses the value and not the name of the enum member - - ``StrEnum.__str__`` uses the value and not the name of the enum member + - ``__format__``, because it uses ``__str__``, will also use the value of + the enum member instead of its name - If you do not need/want those limitations, you can create your own base - class by mixing in the ``int`` or ``str`` type yourself:: + If you do not need/want those limitations, you can either create your own + base class by mixing in the ``int`` or ``str`` type yourself:: >>> from enum import Enum >>> class MyIntEnum(int, Enum): ... pass + + or you can reassign the appropriate :meth:`str`, etc., in your enum:: + + >>> from enum import IntEnum + >>> class MyIntEnum(IntEnum): + ... __str__ = IntEnum.__str__ diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst index eb33d7e1778a7..4d8488a4a28de 100644 --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -2070,7 +2070,7 @@ to speed up repeated connections from the same clients. 
:attr:`SSLContext.verify_flags` returns :class:`VerifyFlags` flags: >>> ssl.create_default_context().verify_flags # doctest: +SKIP - ssl.VERIFY_X509_TRUSTED_FIRST + .. attribute:: SSLContext.verify_mode @@ -2082,7 +2082,7 @@ to speed up repeated connections from the same clients. :attr:`SSLContext.verify_mode` returns :class:`VerifyMode` enum: >>> ssl.create_default_context().verify_mode - ssl.CERT_REQUIRED + .. index:: single: certificates diff --git a/Lib/enum.py b/Lib/enum.py index 93ea1bea36db7..772e1eac0e1e6 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -1,16 +1,16 @@ import sys +import builtins as bltns from types import MappingProxyType, DynamicClassAttribute from operator import or_ as _or_ from functools import reduce -from builtins import property as _bltin_property, bin as _bltin_bin __all__ = [ 'EnumType', 'EnumMeta', - 'Enum', 'IntEnum', 'StrEnum', 'Flag', 'IntFlag', + 'Enum', 'IntEnum', 'StrEnum', 'Flag', 'IntFlag', 'ReprEnum', 'auto', 'unique', 'property', 'verify', 'FlagBoundary', 'STRICT', 'CONFORM', 'EJECT', 'KEEP', - 'global_flag_repr', 'global_enum_repr', 'global_enum', + 'global_flag_repr', 'global_enum_repr', 'global_str', 'global_enum', 'EnumCheck', 'CONTINUOUS', 'NAMED_FLAGS', 'UNIQUE', ] @@ -18,7 +18,7 @@ # Dummy value for Enum and Flag as there are explicit checks for them # before they have been created. # This is also why there are checks in EnumType like `if Enum is not None` -Enum = Flag = EJECT = None +Enum = Flag = EJECT = _stdlib_enums = ReprEnum = None def _is_descriptor(obj): """ @@ -116,9 +116,9 @@ def bin(num, max_bits=None): ceiling = 2 ** (num).bit_length() if num >= 0: - s = _bltin_bin(num + ceiling).replace('1', '0', 1) + s = bltns.bin(num + ceiling).replace('1', '0', 1) else: - s = _bltin_bin(~num ^ (ceiling - 1) + ceiling) + s = bltns.bin(~num ^ (ceiling - 1) + ceiling) sign = s[:3] digits = s[3:] if max_bits is not None: @@ -126,6 +126,19 @@ def bin(num, max_bits=None): digits = (sign[-1] * max_bits + digits)[-max_bits:] return "%s %s" % (sign, digits) +def _dedent(text): + """ + Like textwrap.dedent. Rewritten because we cannot import textwrap. 
+ """ + lines = text.split('\n') + blanks = 0 + for i, ch in enumerate(lines[0]): + if ch != ' ': + break + for j, l in enumerate(lines): + lines[j] = l[i:] + return '\n'.join(lines) + _auto_null = object() class auto: @@ -149,22 +162,12 @@ def __get__(self, instance, ownerclass=None): return ownerclass._member_map_[self.name] except KeyError: raise AttributeError( - '%s: no class attribute %r' % (ownerclass.__name__, self.name) + '%r has no attribute %r' % (ownerclass, self.name) ) else: if self.fget is None: - # check for member - if self.name in ownerclass._member_map_: - import warnings - warnings.warn( - "accessing one member from another is not supported, " - " and will be disabled in 3.12", - DeprecationWarning, - stacklevel=2, - ) - return ownerclass._member_map_[self.name] raise AttributeError( - '%s: no instance attribute %r' % (ownerclass.__name__, self.name) + '%r member has no attribute %r' % (ownerclass, self.name) ) else: return self.fget(instance) @@ -172,7 +175,7 @@ def __get__(self, instance, ownerclass=None): def __set__(self, instance, value): if self.fset is None: raise AttributeError( - "%s: cannot set instance attribute %r" % (self.clsname, self.name) + " cannot set attribute %r" % (self.clsname, self.name) ) else: return self.fset(instance, value) @@ -180,7 +183,7 @@ def __set__(self, instance, value): def __delete__(self, instance): if self.fdel is None: raise AttributeError( - "%s: cannot delete instance attribute %r" % (self.clsname, self.name) + " cannot delete attribute %r" % (self.clsname, self.name) ) else: return self.fdel(instance) @@ -328,7 +331,7 @@ def __setitem__(self, key, value): elif _is_sunder(key): if key not in ( '_order_', - '_generate_next_value_', '_missing_', '_ignore_', + '_generate_next_value_', '_numeric_repr_', '_missing_', '_ignore_', '_iter_member_', '_iter_member_by_value_', '_iter_member_by_def_', ): raise ValueError( @@ -358,13 +361,13 @@ def __setitem__(self, key, value): key = '_order_' elif key in self._member_names: # descriptor overwriting an enum? - raise TypeError('%r already defined as: %r' % (key, self[key])) + raise TypeError('%r already defined as %r' % (key, self[key])) elif key in self._ignore: pass elif not _is_descriptor(value): if key in self: # enum overwriting a descriptor? - raise TypeError('%r already defined as: %r' % (key, self[key])) + raise TypeError('%r already defined as %r' % (key, self[key])) if isinstance(value, auto): if value.value == _auto_null: value.value = self._generate_next_value( @@ -395,7 +398,7 @@ class EnumType(type): @classmethod def __prepare__(metacls, cls, bases, **kwds): # check that previous enum members do not exist - metacls._check_for_existing_members(cls, bases) + metacls._check_for_existing_members_(cls, bases) # create the namespace dict enum_dict = _EnumDict() enum_dict._cls_name = cls @@ -413,9 +416,10 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # inherited __new__ unless a new __new__ is defined (or the resulting # class will fail). # - # remove any keys listed in _ignore_ if _simple: return super().__new__(metacls, cls, bases, classdict, **kwds) + # + # remove any keys listed in _ignore_ classdict.setdefault('_ignore_', []).append('_ignore_') ignore = classdict['_ignore_'] for key in ignore: @@ -427,8 +431,8 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # check for illegal enum names (any others?) 
invalid_names = set(member_names) & {'mro', ''} if invalid_names: - raise ValueError('Invalid enum member name: {0}'.format( - ','.join(invalid_names))) + raise ValueError('invalid enum member name(s) '.format( + ','.join(repr(n) for n in invalid_names))) # # adjust the sunders _order_ = classdict.pop('_order_', None) @@ -458,6 +462,8 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k classdict['_value2member_map_'] = {} classdict['_unhashable_values_'] = [] classdict['_member_type_'] = member_type + # now set the __repr__ for the value + classdict['_value_repr_'] = metacls._find_data_repr_(cls, bases) # # Flag structures (will be removed if final class is not a Flag classdict['_boundary_'] = ( @@ -467,10 +473,6 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k classdict['_flag_mask_'] = flag_mask classdict['_all_bits_'] = 2 ** ((flag_mask).bit_length()) - 1 classdict['_inverted_'] = None - # - # create a default docstring if one has not been provided - if '__doc__' not in classdict: - classdict['__doc__'] = 'An enumeration.' try: exc = None enum_class = super().__new__(metacls, cls, bases, classdict, **kwds) @@ -481,18 +483,140 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k if exc is not None: raise exc # + # update classdict with any changes made by __init_subclass__ + classdict.update(enum_class.__dict__) + # + # create a default docstring if one has not been provided + if enum_class.__doc__ is None: + if not member_names: + enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ + Create a collection of name/value pairs. + + Example enumeration: + + >>> class Color(Enum): + ... RED = 1 + ... BLUE = 2 + ... GREEN = 3 + + Access them by: + + - attribute access:: + + >>> Color.RED + + + - value lookup: + + >>> Color(1) + + + - name lookup: + + >>> Color['RED'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Color) + 3 + + >>> list(Color) + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """) + else: + member = list(enum_class)[0] + enum_length = len(enum_class) + cls_name = enum_class.__name__ + if enum_length == 1: + list_line = 'list(%s)' % cls_name + list_repr = '[<%s.%s: %r>]' % (cls_name, member.name, member.value) + elif enum_length == 2: + member2 = list(enum_class)[1] + list_line = 'list(%s)' % cls_name + list_repr = '[<%s.%s: %r>, <%s.%s: %r>]' % ( + cls_name, member.name, member.value, + cls_name, member2.name, member2.value, + ) + else: + member2 = list(enum_class)[1] + member3 = list(enum_class)[2] + list_line = 'list(%s)%s' % (cls_name, ('','[:3]')[enum_length > 3]) + list_repr = '[<%s.%s: %r>, <%s.%s: %r>, <%s.%s: %r>]' % ( + cls_name, member.name, member.value, + cls_name, member2.name, member2.value, + cls_name, member3.name, member3.value, + ) + enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> %s.%s + <%s.%s: %r> + + - value lookup: + + >>> %s(%r) + <%s.%s: %r> + + - name lookup: + + >>> %s[%r] + <%s.%s: %r> + + Enumerations can be iterated over, and know how many members they have: + + >>> len(%s) + %r + + >>> %s + %s + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. 
+ """ + % (cls_name, member.name, + cls_name, member.name, member.value, + cls_name, member.value, + cls_name, member.name, member.value, + cls_name, member.name, + cls_name, member.name, member.value, + cls_name, enum_length, + list_line, list_repr, + )) + # # double check that repr and friends are not the mixin's or various # things break (such as pickle) # however, if the method is defined in the Enum itself, don't replace # it + # + # Also, special handling for ReprEnum + if ReprEnum is not None and ReprEnum in bases: + if member_type is object: + raise TypeError( + 'ReprEnum subclasses must be mixed with a data type (i.e.' + ' int, str, float, etc.)' + ) + if '__format__' not in classdict: + enum_class.__format__ = member_type.__format__ + classdict['__format__'] = enum_class.__format__ + if '__str__' not in classdict: + method = member_type.__str__ + if method is object.__str__: + # if member_type does not define __str__, object.__str__ will use + # its __repr__ instead, so we'll also use its __repr__ + method = member_type.__repr__ + enum_class.__str__ = method + classdict['__str__'] = enum_class.__str__ for name in ('__repr__', '__str__', '__format__', '__reduce_ex__'): - if name in classdict: - continue - class_method = getattr(enum_class, name) - obj_method = getattr(member_type, name, None) - enum_method = getattr(first_enum, name, None) - if obj_method is not None and obj_method is class_method: - setattr(enum_class, name, enum_method) + if name not in classdict: + setattr(enum_class, name, getattr(first_enum, name)) # # replace any other __new__ with our own (as long as Enum is not None, # anyway) -- again, this is to support pickle @@ -563,13 +687,13 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # _order_ step 4: verify that _order_ and _member_names_ match if _order_ != enum_class._member_names_: raise TypeError( - 'member order does not match _order_:\n%r\n%r' + 'member order does not match _order_:\n %r\n %r' % (enum_class._member_names_, _order_) ) # return enum_class - def __bool__(self): + def __bool__(cls): """ classes/types should always be True. """ @@ -614,6 +738,13 @@ def __call__(cls, value, names=None, *, module=None, qualname=None, type=None, s ) def __contains__(cls, member): + """ + Return True if member is a member of this enum + raises TypeError if member is not an enum member + + note: in 3.12 TypeError will no longer be raised, and True will also be + returned if member is the value of a member in this enum + """ if not isinstance(member, Enum): import warnings warnings.warn( @@ -631,60 +762,33 @@ def __delattr__(cls, attr): # nicer error message when someone tries to delete an attribute # (see issue19025). if attr in cls._member_map_: - raise AttributeError("%s: cannot delete Enum member %r." % (cls.__name__, attr)) + raise AttributeError("%r cannot delete member %r." 
% (cls.__name__, attr)) super().__delattr__(attr) - def __dir__(self): - # Start off with the desired result for dir(Enum) - cls_dir = {'__class__', '__doc__', '__members__', '__module__'} - add_to_dir = cls_dir.add - mro = self.__mro__ - this_module = globals().values() - is_from_this_module = lambda cls: any(cls is thing for thing in this_module) - first_enum_base = next(cls for cls in mro if is_from_this_module(cls)) - enum_dict = Enum.__dict__ - sentinel = object() - # special-case __new__ - ignored = {'__new__', *filter(_is_sunder, enum_dict)} - add_to_ignored = ignored.add - - # We want these added to __dir__ - # if and only if they have been user-overridden - enum_dunders = set(filter(_is_dunder, enum_dict)) - - for cls in mro: - # Ignore any classes defined in this module - if cls is object or is_from_this_module(cls): - continue - - cls_lookup = cls.__dict__ - - # If not an instance of EnumType, - # ensure all attributes excluded from that class's `dir()` are ignored here. - if not isinstance(cls, EnumType): - cls_lookup = set(cls_lookup).intersection(dir(cls)) - - for attr_name in cls_lookup: - # Already seen it? Carry on - if attr_name in cls_dir or attr_name in ignored: - continue - # Sunders defined in Enum.__dict__ are already in `ignored`, - # But sunders defined in a subclass won't be (we want all sunders excluded). - elif _is_sunder(attr_name): - add_to_ignored(attr_name) - # Not an "enum dunder"? Add it to dir() output. - elif attr_name not in enum_dunders: - add_to_dir(attr_name) - # Is an "enum dunder", and is defined by a class from enum.py? Ignore it. - elif getattr(self, attr_name) is getattr(first_enum_base, attr_name, sentinel): - add_to_ignored(attr_name) - # Is an "enum dunder", and is either user-defined or defined by a mixin class? - # Add it to dir() output. - else: - add_to_dir(attr_name) - - # sort the output before returning it, so that the result is deterministic. - return sorted(cls_dir) + def __dir__(cls): + # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ + # on object-based enums + if cls._member_type_ is object: + interesting = set(cls._member_names_) + if cls._new_member_ is not object.__new__: + interesting.add('__new__') + if cls.__init_subclass__ is not object.__init_subclass__: + interesting.add('__init_subclass__') + for method in ('__init__', '__format__', '__repr__', '__str__'): + if getattr(cls, method) not in (getattr(Enum, method), getattr(Flag, method)): + interesting.add(method) + return sorted(set([ + '__class__', '__contains__', '__doc__', '__getitem__', + '__iter__', '__len__', '__members__', '__module__', + '__name__', '__qualname__', + ]) | interesting + ) + else: + # return whatever mixed-in data type has + return sorted(set( + dir(cls._member_type_) + + cls._member_names_ + )) def __getattr__(cls, name): """ @@ -703,18 +807,24 @@ def __getattr__(cls, name): raise AttributeError(name) from None def __getitem__(cls, name): + """ + Return the member matching `name`. + """ return cls._member_map_[name] def __iter__(cls): """ - Returns members in definition order. + Return members in definition order. """ return (cls._member_map_[name] for name in cls._member_names_) def __len__(cls): + """ + Return the number of members (no aliases) + """ return len(cls._member_names_) - @_bltin_property + @bltns.property def __members__(cls): """ Returns a mapping of member name->value. @@ -732,7 +842,7 @@ def __repr__(cls): def __reversed__(cls): """ - Returns members in reverse definition order. 
+ Return members in reverse definition order. """ return (cls._member_map_[name] for name in reversed(cls._member_names_)) @@ -746,7 +856,7 @@ def __setattr__(cls, name, value): """ member_map = cls.__dict__.get('_member_map_', {}) if name in member_map: - raise AttributeError('Cannot reassign member %r.' % (name, )) + raise AttributeError('cannot reassign member %r' % (name, )) super().__setattr__(name, value) def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, start=1, boundary=None): @@ -801,8 +911,7 @@ def _create_(cls, class_name, names, *, module=None, qualname=None, type=None, s return metacls.__new__(metacls, class_name, bases, classdict, boundary=boundary) - def _convert_(cls, name, module, filter, source=None, *, boundary=None): - + def _convert_(cls, name, module, filter, source=None, *, boundary=None, as_global=False): """ Create a new Enum subclass that replaces a collection of global constants """ @@ -834,22 +943,25 @@ def _convert_(cls, name, module, filter, source=None, *, boundary=None): tmp_cls = type(name, (object, ), body) cls = _simple_enum(etype=cls, boundary=boundary or KEEP)(tmp_cls) cls.__reduce_ex__ = _reduce_ex_by_global_name - global_enum(cls) + if as_global: + global_enum(cls) + else: + sys.modules[cls.__module__].__dict__.update(cls.__members__) module_globals[name] = cls return cls - @staticmethod - def _check_for_existing_members(class_name, bases): + @classmethod + def _check_for_existing_members_(mcls, class_name, bases): for chain in bases: for base in chain.__mro__: if issubclass(base, Enum) and base._member_names_: raise TypeError( - "%s: cannot extend enumeration %r" - % (class_name, base.__name__) + " cannot extend %r" + % (class_name, base) ) @classmethod - def _get_mixins_(cls, class_name, bases): + def _get_mixins_(mcls, class_name, bases): """ Returns the type for creating enum members, and the first inherited enum class. @@ -859,30 +971,7 @@ def _get_mixins_(cls, class_name, bases): if not bases: return object, Enum - def _find_data_type(bases): - data_types = set() - for chain in bases: - candidate = None - for base in chain.__mro__: - if base is object: - continue - elif issubclass(base, Enum): - if base._member_type_ is not object: - data_types.add(base._member_type_) - break - elif '__new__' in base.__dict__: - if issubclass(base, Enum): - continue - data_types.add(candidate or base) - break - else: - candidate = candidate or base - if len(data_types) > 1: - raise TypeError('%r: too many data types: %r' % (class_name, data_types)) - elif data_types: - return data_types.pop() - else: - return None + mcls._check_for_existing_members_(class_name, bases) # ensure final parent class is an Enum derivative, find any concrete # data type, and check that Enum has no members @@ -890,12 +979,51 @@ def _find_data_type(bases): if not issubclass(first_enum, Enum): raise TypeError("new enumerations should be created as " "`EnumName([mixin_type, ...] 
[data_type,] enum_type)`") - cls._check_for_existing_members(class_name, bases) - member_type = _find_data_type(bases) or object + member_type = mcls._find_data_type_(class_name, bases) or object return member_type, first_enum - @staticmethod - def _find_new_(classdict, member_type, first_enum): + @classmethod + def _find_data_repr_(mcls, class_name, bases): + for chain in bases: + for base in chain.__mro__: + if base is object: + continue + elif issubclass(base, Enum): + # if we hit an Enum, use it's _value_repr_ + return base._value_repr_ + elif '__repr__' in base.__dict__: + # this is our data repr + return base.__dict__['__repr__'] + return None + + @classmethod + def _find_data_type_(mcls, class_name, bases): + data_types = set() + for chain in bases: + candidate = None + for base in chain.__mro__: + if base is object: + continue + elif issubclass(base, Enum): + if base._member_type_ is not object: + data_types.add(base._member_type_) + break + elif '__new__' in base.__dict__: + if issubclass(base, Enum): + continue + data_types.add(candidate or base) + break + else: + candidate = candidate or base + if len(data_types) > 1: + raise TypeError('too many data types for %r: %r' % (class_name, data_types)) + elif data_types: + return data_types.pop() + else: + return None + + @classmethod + def _find_new_(mcls, classdict, member_type, first_enum): """ Returns the __new__ to be used for creating the enum members. @@ -943,9 +1071,42 @@ def _find_new_(classdict, member_type, first_enum): class Enum(metaclass=EnumType): """ - Generic enumeration. + Create a collection of name/value pairs. + + Example enumeration: + + >>> class Color(Enum): + ... RED = 1 + ... BLUE = 2 + ... GREEN = 3 + + Access them by: + + - attribute access:: + + >>> Color.RED + + + - value lookup: + + >>> Color(1) + - Derive from this class to define new enumerations. + - name lookup: + + >>> Color['RED'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Color) + 3 + + >>> list(Color) + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. """ def __new__(cls, value): @@ -999,6 +1160,9 @@ def __new__(cls, value): exc = None ve_exc = None + def __init__(self, *args, **kwds): + pass + def _generate_next_value_(name, start, count, last_values): """ Generate the next value when not given. 
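A quick illustrative session showing the behaviour documented in the new Enum docstring above (the member repr/str formats come from the __repr__ and __str__ defined later in this diff; this is a sketch for reference, not part of the patch):

    >>> from enum import Enum
    >>> class Color(Enum):
    ...     RED = 1
    ...     BLUE = 2
    ...     GREEN = 3
    ...
    >>> Color.RED                # attribute access; repr is <Class.NAME: value>
    <Color.RED: 1>
    >>> Color(1)                 # value lookup
    <Color.RED: 1>
    >>> Color['RED']             # name lookup
    <Color.RED: 1>
    >>> print(Color.RED)         # str() keeps the class name for plain Enums
    Color.RED
    >>> len(Color)
    3
    >>> list(Color)
    [<Color.RED: 1>, <Color.BLUE: 2>, <Color.GREEN: 3>]
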
@@ -1021,47 +1185,44 @@ def _missing_(cls, value): return None def __repr__(self): - return "%s.%s" % ( self.__class__.__name__, self._name_) + v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ + return "<%s.%s: %s>" % (self.__class__.__name__, self._name_, v_repr(self._value_)) def __str__(self): - return "%s" % (self._name_, ) + return "%s.%s" % (self.__class__.__name__, self._name_, ) def __dir__(self): """ Returns all members and all public methods """ - cls = type(self) - to_exclude = {'__members__', '__init__', '__new__', *cls._member_names_} - filtered_self_dict = (name for name in self.__dict__ if not name.startswith('_')) - return sorted({'name', 'value', *dir(cls), *filtered_self_dict} - to_exclude) + if self.__class__._member_type_ is object: + interesting = set(['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'value']) + else: + interesting = set(object.__dir__(self)) + for name in getattr(self, '__dict__', []): + if name[0] != '_': + interesting.add(name) + for cls in self.__class__.mro(): + for name, obj in cls.__dict__.items(): + if name[0] == '_': + continue + if isinstance(obj, property): + # that's an enum.property + if obj.fget is not None or name not in self._member_map_: + interesting.add(name) + else: + # in case it was added by `dir(self)` + interesting.discard(name) + else: + interesting.add(name) + names = sorted( + set(['__class__', '__doc__', '__eq__', '__hash__', '__module__']) + | interesting + ) + return names def __format__(self, format_spec): - """ - Returns format using actual value type unless __str__ has been overridden. - """ - # mixed-in Enums should use the mixed-in type's __format__, otherwise - # we can get strange results with the Enum name showing up instead of - # the value - # - # pure Enum branch, or branch with __str__ explicitly overridden - str_overridden = type(self).__str__ not in (Enum.__str__, IntEnum.__str__, Flag.__str__) - if self._member_type_ is object or str_overridden: - cls = str - val = str(self) - # mix-in branch - else: - if not format_spec or format_spec in ('{}','{:}'): - import warnings - warnings.warn( - "in 3.12 format() will use the enum member, not the enum member's value;\n" - "use a format specifier, such as :d for an integer-based Enum, to maintain " - "the current display", - DeprecationWarning, - stacklevel=2, - ) - cls = self._member_type_ - val = self._value_ - return cls.__format__(val, format_spec) + return str.__format__(str(self), format_spec) def __hash__(self): return hash(self._name_) @@ -1088,34 +1249,25 @@ def value(self): return self._value_ -class IntEnum(int, Enum): +class ReprEnum(Enum): """ - Enum where members are also (and must be) ints + Only changes the repr(), leaving str() and format() to the mixed-in type. """ - def __str__(self): - return "%s" % (self._name_, ) - def __format__(self, format_spec): - """ - Returns format using actual value unless __str__ has been overridden. 
- """ - str_overridden = type(self).__str__ != IntEnum.__str__ - if str_overridden: - cls = str - val = str(self) - else: - cls = self._member_type_ - val = self._value_ - return cls.__format__(val, format_spec) +class IntEnum(int, ReprEnum): + """ + Enum where members are also (and must be) ints + """ -class StrEnum(str, Enum): +class StrEnum(str, ReprEnum): """ Enum where members are also (and must be) strings """ def __new__(cls, *values): + "values must already be of type `str`" if len(values) > 3: raise TypeError('too many arguments for str(): %r' % (values, )) if len(values) == 1: @@ -1135,10 +1287,6 @@ def __new__(cls, *values): member._value_ = value return member - __str__ = str.__str__ - - __format__ = str.__format__ - def _generate_next_value_(name, start, count, last_values): """ Return the lower-cased version of the member name. @@ -1169,6 +1317,8 @@ class Flag(Enum, boundary=STRICT): Support for flags """ + _numeric_repr_ = repr + def _generate_next_value_(name, start, count, last_values): """ Generate the next value when not given. @@ -1184,7 +1334,7 @@ def _generate_next_value_(name, start, count, last_values): try: high_bit = _high_bit(last_value) except Exception: - raise TypeError('Invalid Flag value: %r' % last_value) from None + raise TypeError('invalid flag value %r' % last_value) from None return 2 ** (high_bit+1) @classmethod @@ -1232,8 +1382,8 @@ def _missing_(cls, value): if cls._boundary_ is STRICT: max_bits = max(value.bit_length(), flag_mask.bit_length()) raise ValueError( - "%s: invalid value: %r\n given %s\n allowed %s" % ( - cls.__name__, value, bin(value, max_bits), bin(flag_mask, max_bits), + "%r invalid value %r\n given %s\n allowed %s" % ( + cls, value, bin(value, max_bits), bin(flag_mask, max_bits), )) elif cls._boundary_ is CONFORM: value = value & flag_mask @@ -1247,7 +1397,7 @@ def _missing_(cls, value): ) else: raise ValueError( - 'unknown flag boundary: %r' % (cls._boundary_, ) + '%r unknown flag boundary %r' % (cls, cls._boundary_, ) ) if value < 0: neg_value = value @@ -1274,7 +1424,7 @@ def _missing_(cls, value): m._name_ for m in cls._iter_member_(member_value) ]) if unknown: - pseudo_member._name_ += '|0x%x' % unknown + pseudo_member._name_ += '|%s' % cls._numeric_repr_(unknown) else: pseudo_member._name_ = None # use setdefault in case another thread already created a composite @@ -1292,10 +1442,8 @@ def __contains__(self, other): """ if not isinstance(other, self.__class__): raise TypeError( - "unsupported operand type(s) for 'in': '%s' and '%s'" % ( + "unsupported operand type(s) for 'in': %r and %r" % ( type(other).__qualname__, self.__class__.__qualname__)) - if other._value_ == 0 or self._value_ == 0: - return False return other._value_ & self._value_ == other._value_ def __iter__(self): @@ -1309,27 +1457,18 @@ def __len__(self): def __repr__(self): cls_name = self.__class__.__name__ + v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ if self._name_ is None: - return "0x%x" % (self._value_, ) - if _is_single_bit(self._value_): - return '%s.%s' % (cls_name, self._name_) - if self._boundary_ is not FlagBoundary.KEEP: - return '%s.' % cls_name + ('|%s.' 
% cls_name).join(self.name.split('|')) + return "<%s: %s>" % (cls_name, v_repr(self._value_)) else: - name = [] - for n in self._name_.split('|'): - if n.startswith('0'): - name.append(n) - else: - name.append('%s.%s' % (cls_name, n)) - return '|'.join(name) + return "<%s.%s: %s>" % (cls_name, self._name_, v_repr(self._value_)) def __str__(self): - cls = self.__class__ + cls_name = self.__class__.__name__ if self._name_ is None: - return '%s(%x)' % (cls.__name__, self._value_) + return '%s(%r)' % (cls_name, self._value_) else: - return self._name_ + return "%s.%s" % (cls_name, self._name_) def __bool__(self): return bool(self._value_) @@ -1362,20 +1501,11 @@ def __invert__(self): return self._inverted_ -class IntFlag(int, Flag, boundary=EJECT): +class IntFlag(int, ReprEnum, Flag, boundary=EJECT): """ Support for integer-based Flags """ - def __format__(self, format_spec): - """ - Returns format using actual value unless __str__ has been overridden. - """ - str_overridden = type(self).__str__ != Flag.__str__ - value = self - if not str_overridden: - value = self._value_ - return int.__format__(value, format_spec) def __or__(self, other): if isinstance(other, self.__class__): @@ -1412,6 +1542,7 @@ def __xor__(self, other): __rxor__ = __xor__ __invert__ = Flag.__invert__ + def _high_bit(value): """ returns index of highest bit, or -1 if value is zero or negative @@ -1456,7 +1587,7 @@ def global_flag_repr(self): module = self.__class__.__module__.split('.')[-1] cls_name = self.__class__.__name__ if self._name_ is None: - return "%s.%s(0x%x)" % (module, cls_name, self._value_) + return "%s.%s(%r)" % (module, cls_name, self._value_) if _is_single_bit(self): return '%s.%s' % (module, self._name_) if self._boundary_ is not FlagBoundary.KEEP: @@ -1464,14 +1595,22 @@ def global_flag_repr(self): else: name = [] for n in self._name_.split('|'): - if n.startswith('0'): + if n[0].isdigit(): name.append(n) else: name.append('%s.%s' % (module, n)) return '|'.join(name) +def global_str(self): + """ + use enum_name instead of class.enum_name + """ + if self._name_ is None: + return "%s(%r)" % (cls_name, self._value_) + else: + return self._name_ -def global_enum(cls): +def global_enum(cls, update_str=False): """ decorator that makes the repr() of an enum member reference its module instead of its class; also exports all members to the enum's module's @@ -1481,6 +1620,8 @@ def global_enum(cls): cls.__repr__ = global_flag_repr else: cls.__repr__ = global_enum_repr + if not issubclass(cls, ReprEnum) or update_str: + cls.__str__ = global_str sys.modules[cls.__module__].__dict__.update(cls.__members__) return cls @@ -1522,6 +1663,7 @@ def convert_class(cls): body['_value2member_map_'] = value2member_map = {} body['_unhashable_values_'] = [] body['_member_type_'] = member_type = etype._member_type_ + body['_value_repr_'] = etype._value_repr_ if issubclass(etype, Flag): body['_boundary_'] = boundary or etype._boundary_ body['_flag_mask_'] = None @@ -1543,13 +1685,8 @@ def convert_class(cls): # it enum_class = type(cls_name, (etype, ), body, boundary=boundary, _simple=True) for name in ('__repr__', '__str__', '__format__', '__reduce_ex__'): - if name in body: - continue - class_method = getattr(enum_class, name) - obj_method = getattr(member_type, name, None) - enum_method = getattr(etype, name, None) - if obj_method is not None and obj_method is class_method: - setattr(enum_class, name, enum_method) + if name not in body: + setattr(enum_class, name, getattr(etype, name)) gnv_last_values = [] if 
issubclass(enum_class, Flag): # Flag / IntFlag @@ -1760,8 +1897,8 @@ def _test_simple_enum(checked_enum, simple_enum): + list(simple_enum._member_map_.keys()) ) for key in set(checked_keys + simple_keys): - if key in ('__module__', '_member_map_', '_value2member_map_'): - # keys known to be different + if key in ('__module__', '_member_map_', '_value2member_map_', '__doc__'): + # keys known to be different, or very long continue elif key in member_names: # members are checked below @@ -1882,3 +2019,5 @@ def _old_convert_(etype, name, module, filter, source=None, *, boundary=None): cls.__reduce_ex__ = _reduce_ex_by_global_name cls.__repr__ = global_enum_repr return cls + +_stdlib_enums = IntEnum, StrEnum, IntFlag diff --git a/Lib/inspect.py b/Lib/inspect.py index 5d33f0d445fb9..8236698b8de0f 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -2567,15 +2567,21 @@ class _empty: class _ParameterKind(enum.IntEnum): - POSITIONAL_ONLY = 0 - POSITIONAL_OR_KEYWORD = 1 - VAR_POSITIONAL = 2 - KEYWORD_ONLY = 3 - VAR_KEYWORD = 4 + POSITIONAL_ONLY = 'positional-only' + POSITIONAL_OR_KEYWORD = 'positional or keyword' + VAR_POSITIONAL = 'variadic positional' + KEYWORD_ONLY = 'keyword-only' + VAR_KEYWORD = 'variadic keyword' + + def __new__(cls, description): + value = len(cls.__members__) + member = int.__new__(cls, value) + member._value_ = value + member.description = description + return member - @property - def description(self): - return _PARAM_NAME_MAPPING[self] + def __str__(self): + return self.name _POSITIONAL_ONLY = _ParameterKind.POSITIONAL_ONLY _POSITIONAL_OR_KEYWORD = _ParameterKind.POSITIONAL_OR_KEYWORD @@ -2583,14 +2589,6 @@ def description(self): _KEYWORD_ONLY = _ParameterKind.KEYWORD_ONLY _VAR_KEYWORD = _ParameterKind.VAR_KEYWORD -_PARAM_NAME_MAPPING = { - _POSITIONAL_ONLY: 'positional-only', - _POSITIONAL_OR_KEYWORD: 'positional or keyword', - _VAR_POSITIONAL: 'variadic positional', - _KEYWORD_ONLY: 'keyword-only', - _VAR_KEYWORD: 'variadic keyword' -} - class Parameter: """Represents a parameter in a function signature. 
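The _ParameterKind rewrite above uses a reusable pattern: a custom __new__ auto-numbers the members from len(cls.__members__) and attaches an extra per-member attribute, while __str__ returns the bare member name. A minimal sketch of the same idea using a hypothetical Kind enum (illustration only, not part of this patch):

    from enum import IntEnum

    class Kind(IntEnum):
        # members are declared with a human-readable description;
        # the integer value is assigned automatically in __new__
        POSITIONAL_ONLY = 'positional-only'
        KEYWORD_ONLY = 'keyword-only'

        def __new__(cls, description):
            value = len(cls.__members__)      # 0 for the first member, 1 for the next, ...
            member = int.__new__(cls, value)
            member._value_ = value
            member.description = description  # extra attribute carried by the member
            return member

        def __str__(self):
            return self.name                  # 'KEYWORD_ONLY' rather than '1'

    assert Kind.KEYWORD_ONLY == 1
    assert Kind.KEYWORD_ONLY.description == 'keyword-only'
    assert str(Kind.KEYWORD_ONLY) == 'KEYWORD_ONLY'
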
diff --git a/Lib/plistlib.py b/Lib/plistlib.py index 3ab71edc320af..4862355b2252c 100644 --- a/Lib/plistlib.py +++ b/Lib/plistlib.py @@ -61,7 +61,8 @@ from xml.parsers.expat import ParserCreate -PlistFormat = enum.global_enum(enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__)) +PlistFormat = enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__) +globals().update(PlistFormat.__members__) class UID: diff --git a/Lib/re.py b/Lib/re.py index ea41217ce08c2..a7ab9b3706748 100644 --- a/Lib/re.py +++ b/Lib/re.py @@ -155,6 +155,8 @@ class RegexFlag: # sre extensions (experimental, don't rely on these) TEMPLATE = T = sre_compile.SRE_FLAG_TEMPLATE # disable backtracking DEBUG = sre_compile.SRE_FLAG_DEBUG # dump pattern after compilation + __str__ = object.__str__ + _numeric_repr_ = hex # sre exception error = sre_compile.error diff --git a/Lib/ssl.py b/Lib/ssl.py index 207925166efa3..dafb70a67864c 100644 --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -119,7 +119,6 @@ ) from _ssl import _DEFAULT_CIPHERS, _OPENSSL_API_VERSION - _IntEnum._convert_( '_SSLMethod', __name__, lambda name: name.startswith('PROTOCOL_') and name != 'PROTOCOL_SSLv23', diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 43f98c1c1efb6..a0953fb960f33 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -6,15 +6,18 @@ import sys import unittest import threading +import builtins as bltns from collections import OrderedDict +from datetime import date from enum import Enum, IntEnum, StrEnum, EnumType, Flag, IntFlag, unique, auto from enum import STRICT, CONFORM, EJECT, KEEP, _simple_enum, _test_simple_enum -from enum import verify, UNIQUE, CONTINUOUS, NAMED_FLAGS +from enum import verify, UNIQUE, CONTINUOUS, NAMED_FLAGS, ReprEnum from io import StringIO from pickle import dumps, loads, PicklingError, HIGHEST_PROTOCOL from test import support from test.support import ALWAYS_EQ from test.support import threading_helper +from textwrap import dedent from datetime import timedelta python_version = sys.version_info[:2] @@ -107,6 +110,12 @@ def test_pickle_exception(assertion, exception, obj): class TestHelpers(unittest.TestCase): # _is_descriptor, _is_sunder, _is_dunder + sunder_names = '_bad_', '_good_', '_what_ho_' + dunder_names = '__mal__', '__bien__', '__que_que__' + private_names = '_MyEnum__private', '_MyEnum__still_private' + private_and_sunder_names = '_MyEnum__private_', '_MyEnum__also_private_' + random_names = 'okay', '_semi_private', '_weird__', '_MyEnum__' + def test_is_descriptor(self): class foo: pass @@ -116,21 +125,36 @@ class foo: setattr(obj, attr, 1) self.assertTrue(enum._is_descriptor(obj)) - def test_is_sunder(self): + def test_sunder(self): + for name in self.sunder_names + self.private_and_sunder_names: + self.assertTrue(enum._is_sunder(name), '%r is a not sunder name?' % name) + for name in self.dunder_names + self.private_names + self.random_names: + self.assertFalse(enum._is_sunder(name), '%r is a sunder name?' % name) for s in ('_a_', '_aa_'): self.assertTrue(enum._is_sunder(s)) - for s in ('a', 'a_', '_a', '__a', 'a__', '__a__', '_a__', '__a_', '_', '__', '___', '____', '_____',): self.assertFalse(enum._is_sunder(s)) - def test_is_dunder(self): + def test_dunder(self): + for name in self.dunder_names: + self.assertTrue(enum._is_dunder(name), '%r is a not dunder name?' % name) + for name in self.sunder_names + self.private_names + self.private_and_sunder_names + self.random_names: + self.assertFalse(enum._is_dunder(name), '%r is a dunder name?' 
% name) for s in ('__a__', '__aa__'): self.assertTrue(enum._is_dunder(s)) for s in ('a', 'a_', '_a', '__a', 'a__', '_a_', '_a__', '__a_', '_', '__', '___', '____', '_____',): self.assertFalse(enum._is_dunder(s)) + + def test_is_private(self): + for name in self.private_names + self.private_and_sunder_names: + self.assertTrue(enum._is_private('MyEnum', name), '%r is a not private name?') + for name in self.sunder_names + self.dunder_names + self.random_names: + self.assertFalse(enum._is_private('MyEnum', name), '%r is a private name?') + + # for subclassing tests class classproperty: @@ -166,473 +190,658 @@ class HeadlightsC(IntFlag, boundary=enum.CONFORM): # tests -class TestEnum(unittest.TestCase): +class _EnumTests: + """ + Test for behavior that is the same across the different types of enumerations. + """ + + values = None def setUp(self): - class Season(Enum): - SPRING = 1 - SUMMER = 2 - AUTUMN = 3 - WINTER = 4 - self.Season = Season + class BaseEnum(self.enum_type): + @enum.property + def first(self): + return '%s is first!' % self.name + class MainEnum(BaseEnum): + first = auto() + second = auto() + third = auto() + if issubclass(self.enum_type, Flag): + dupe = 3 + else: + dupe = third + self.MainEnum = MainEnum + # + class NewStrEnum(self.enum_type): + def __str__(self): + return self.name.upper() + first = auto() + self.NewStrEnum = NewStrEnum + # + class NewFormatEnum(self.enum_type): + def __format__(self, spec): + return self.name.upper() + first = auto() + self.NewFormatEnum = NewFormatEnum + # + class NewStrFormatEnum(self.enum_type): + def __str__(self): + return self.name.title() + def __format__(self, spec): + return ''.join(reversed(self.name)) + first = auto() + self.NewStrFormatEnum = NewStrFormatEnum + # + class NewBaseEnum(self.enum_type): + def __str__(self): + return self.name.title() + def __format__(self, spec): + return ''.join(reversed(self.name)) + class NewSubEnum(NewBaseEnum): + first = auto() + self.NewSubEnum = NewSubEnum + # + self.is_flag = False + self.names = ['first', 'second', 'third'] + if issubclass(MainEnum, StrEnum): + self.values = self.names + elif MainEnum._member_type_ is str: + self.values = ['1', '2', '3'] + elif issubclass(self.enum_type, Flag): + self.values = [1, 2, 4] + self.is_flag = True + self.dupe2 = MainEnum(5) + else: + self.values = self.values or [1, 2, 3] + # + if not getattr(self, 'source_values', False): + self.source_values = self.values - class Konstants(float, Enum): - E = 2.7182818 - PI = 3.1415926 - TAU = 2 * PI - self.Konstants = Konstants + def assertFormatIsValue(self, spec, member): + self.assertEqual(spec.format(member), spec.format(member.value)) - class Grades(IntEnum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - self.Grades = Grades + def assertFormatIsStr(self, spec, member): + self.assertEqual(spec.format(member), spec.format(str(member))) - class Directional(str, Enum): - EAST = 'east' - WEST = 'west' - NORTH = 'north' - SOUTH = 'south' - self.Directional = Directional + def test_attribute_deletion(self): + class Season(self.enum_type): + SPRING = auto() + SUMMER = auto() + AUTUMN = auto() + # + def spam(cls): + pass + # + self.assertTrue(hasattr(Season, 'spam')) + del Season.spam + self.assertFalse(hasattr(Season, 'spam')) + # + with self.assertRaises(AttributeError): + del Season.SPRING + with self.assertRaises(AttributeError): + del Season.DRY + with self.assertRaises(AttributeError): + del Season.SPRING.name - from datetime import date - class Holiday(date, Enum): - NEW_YEAR = 2013, 1, 1 - IDES_OF_MARCH = 
2013, 3, 15 - self.Holiday = Holiday + def test_basics(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(repr(TE), "") + self.assertEqual(str(TE), "") + self.assertEqual(format(TE), "") + self.assertTrue(TE(5) is self.dupe2) + else: + self.assertEqual(repr(TE), "") + self.assertEqual(str(TE), "") + self.assertEqual(format(TE), "") + self.assertEqual(list(TE), [TE.first, TE.second, TE.third]) + self.assertEqual( + [m.name for m in TE], + self.names, + ) + self.assertEqual( + [m.value for m in TE], + self.values, + ) + self.assertEqual( + [m.first for m in TE], + ['first is first!', 'second is first!', 'third is first!'] + ) + for member, name in zip(TE, self.names, strict=True): + self.assertIs(TE[name], member) + for member, value in zip(TE, self.values, strict=True): + self.assertIs(TE(value), member) + if issubclass(TE, StrEnum): + self.assertTrue(TE.dupe is TE('third') is TE['dupe']) + elif TE._member_type_ is str: + self.assertTrue(TE.dupe is TE('3') is TE['dupe']) + elif issubclass(TE, Flag): + self.assertTrue(TE.dupe is TE(3) is TE['dupe']) + else: + self.assertTrue(TE.dupe is TE(self.values[2]) is TE['dupe']) - class DateEnum(date, Enum): pass - self.DateEnum = DateEnum + def test_bool_is_true(self): + class Empty(self.enum_type): + pass + self.assertTrue(Empty) + # + self.assertTrue(self.MainEnum) + for member in self.MainEnum: + self.assertTrue(member) - class FloatEnum(float, Enum): pass - self.FloatEnum = FloatEnum + def test_changing_member_fails(self): + MainEnum = self.MainEnum + with self.assertRaises(AttributeError): + self.MainEnum.second = 'really first' - class Wowser(Enum): - this = 'that' - these = 'those' - def wowser(self): - """Wowser docstring""" - return ("Wowser! I'm %s!" % self.name) - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.Wowser = Wowser - - class IntWowser(IntEnum): - this = 1 - these = 2 - def wowser(self): - """Wowser docstring""" - return ("Wowser! I'm %s!" % self.name) - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.IntWowser = IntWowser - - class FloatWowser(float, Enum): - this = 3.14 - these = 4.2 - def wowser(self): - """Wowser docstring""" - return ("Wowser! I'm %s!" 
% self.name) - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.FloatWowser = FloatWowser + @unittest.skipIf( + python_version >= (3, 12), + '__contains__ now returns True/False for all inputs', + ) + def test_contains_er(self): + MainEnum = self.MainEnum + self.assertIn(MainEnum.third, MainEnum) + with self.assertRaises(TypeError): + with self.assertWarns(DeprecationWarning): + self.source_values[1] in MainEnum + with self.assertRaises(TypeError): + with self.assertWarns(DeprecationWarning): + 'first' in MainEnum + val = MainEnum.dupe + self.assertIn(val, MainEnum) + # + class OtherEnum(Enum): + one = auto() + two = auto() + self.assertNotIn(OtherEnum.two, MainEnum) - class WowserNoMembers(Enum): - def wowser(self): pass - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - class SubclassOfWowserNoMembers(WowserNoMembers): pass - self.WowserNoMembers = WowserNoMembers - self.SubclassOfWowserNoMembers = SubclassOfWowserNoMembers - - class IntWowserNoMembers(IntEnum): - def wowser(self): pass - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.IntWowserNoMembers = IntWowserNoMembers + @unittest.skipIf( + python_version < (3, 12), + '__contains__ works only with enum memmbers before 3.12', + ) + def test_contains_tf(self): + MainEnum = self.MainEnum + self.assertIn(MainEnum.first, MainEnum) + self.assertTrue(self.source_values[0] in MainEnum) + self.assertFalse('first' in MainEnum) + val = MainEnum.dupe + self.assertIn(val, MainEnum) + # + class OtherEnum(Enum): + one = auto() + two = auto() + self.assertNotIn(OtherEnum.two, MainEnum) - class FloatWowserNoMembers(float, Enum): - def wowser(self): pass - @classmethod - def classmethod_wowser(cls): pass - @staticmethod - def staticmethod_wowser(): pass - self.FloatWowserNoMembers = FloatWowserNoMembers - - class EnumWithInit(Enum): - def __init__(self, greeting, farewell): - self.greeting = greeting - self.farewell = farewell - ENGLISH = 'hello', 'goodbye' - GERMAN = 'Guten Morgen', 'Auf Wiedersehen' - def some_method(self): pass - self.EnumWithInit = EnumWithInit + def test_dir_on_class(self): + TE = self.MainEnum + self.assertEqual(set(dir(TE)), set(enum_dir(TE))) + + def test_dir_on_item(self): + TE = self.MainEnum + self.assertEqual(set(dir(TE.first)), set(member_dir(TE.first))) + + def test_dir_with_added_behavior(self): + class Test(self.enum_type): + this = auto() + these = auto() + def wowser(self): + return ("Wowser! I'm %s!" % self.name) + self.assertTrue('wowser' not in dir(Test)) + self.assertTrue('wowser' in dir(Test.this)) + def test_dir_on_sub_with_behavior_on_super(self): # see issue22506 - class SuperEnum1(Enum): + class SuperEnum(self.enum_type): def invisible(self): return "did you see me?" 
- class SubEnum1(SuperEnum1): - sample = 5 - self.SubEnum1 = SubEnum1 + class SubEnum(SuperEnum): + sample = auto() + self.assertTrue('invisible' not in dir(SubEnum)) + self.assertTrue('invisible' in dir(SubEnum.sample)) - class SuperEnum2(IntEnum): - def __new__(cls, value, description=""): - obj = int.__new__(cls, value) - obj._value_ = value - obj.description = description + def test_dir_on_sub_with_behavior_including_instance_dict_on_super(self): + # see issue40084 + class SuperEnum(self.enum_type): + def __new__(cls, *value, **kwds): + new = self.enum_type._member_type_.__new__ + if self.enum_type._member_type_ is object: + obj = new(cls) + else: + if isinstance(value[0], tuple): + create_value ,= value[0] + else: + create_value = value + obj = new(cls, *create_value) + obj._value_ = value[0] if len(value) == 1 else value + obj.description = 'test description' return obj - class SubEnum2(SuperEnum2): - sample = 5 - self.SubEnum2 = SubEnum2 - - def test_dir_basics_for_all_enums(self): - enums_for_tests = ( - # Generic enums in enum.py - Enum, - IntEnum, - StrEnum, - # Generic enums defined outside of enum.py - self.DateEnum, - self.FloatEnum, - # Concrete enums derived from enum.py generics - self.Grades, - self.Season, - # Concrete enums derived from generics defined outside of enum.py - self.Konstants, - self.Holiday, - # Standard enum with added behaviour & members - self.Wowser, - # Mixin-enum-from-enum.py with added behaviour & members - self.IntWowser, - # Mixin-enum-from-oustide-enum.py with added behaviour & members - self.FloatWowser, - # Equivalents of the three immediately above, but with no members - self.WowserNoMembers, - self.IntWowserNoMembers, - self.FloatWowserNoMembers, - # Enum with members and an __init__ method - self.EnumWithInit, - # Special cases to test - self.SubEnum1, - self.SubEnum2 - ) - - for cls in enums_for_tests: - with self.subTest(cls=cls): - cls_dir = dir(cls) - # test that dir is deterministic - self.assertEqual(cls_dir, dir(cls)) - # test that dir is sorted - self.assertEqual(list(cls_dir), sorted(cls_dir)) - # test that there are no dupes in dir - self.assertEqual(len(cls_dir), len(set(cls_dir))) - # test that there are no sunders in dir - self.assertFalse(any(enum._is_sunder(attr) for attr in cls_dir)) - self.assertNotIn('__new__', cls_dir) - - for attr in ('__class__', '__doc__', '__members__', '__module__'): - with self.subTest(attr=attr): - self.assertIn(attr, cls_dir) - - def test_dir_for_enum_with_members(self): - enums_for_test = ( - # Enum with members - self.Season, - # IntEnum with members - self.Grades, - # Two custom-mixin enums with members - self.Konstants, - self.Holiday, - # several enums-with-added-behaviour and members - self.Wowser, - self.IntWowser, - self.FloatWowser, - # An enum with an __init__ method and members - self.EnumWithInit, - # Special cases to test - self.SubEnum1, - self.SubEnum2 - ) - - for cls in enums_for_test: - cls_dir = dir(cls) - member_names = cls._member_names_ - with self.subTest(cls=cls): - self.assertTrue(all(member_name in cls_dir for member_name in member_names)) - for member in cls: - member_dir = dir(member) - # test that dir is deterministic - self.assertEqual(member_dir, dir(member)) - # test that dir is sorted - self.assertEqual(list(member_dir), sorted(member_dir)) - # test that there are no dupes in dir - self.assertEqual(len(member_dir), len(set(member_dir))) - - for attr_name in cls_dir: - with self.subTest(attr_name=attr_name): - if attr_name in {'__members__', '__init__', '__new__', 
*member_names}: - self.assertNotIn(attr_name, member_dir) - else: - self.assertIn(attr_name, member_dir) - - self.assertFalse(any(enum._is_sunder(attr) for attr in member_dir)) - - def test_dir_for_enums_with_added_behaviour(self): - enums_for_test = ( - self.Wowser, - self.IntWowser, - self.FloatWowser, - self.WowserNoMembers, - self.SubclassOfWowserNoMembers, - self.IntWowserNoMembers, - self.FloatWowserNoMembers - ) - - for cls in enums_for_test: - with self.subTest(cls=cls): - self.assertIn('wowser', dir(cls)) - self.assertIn('classmethod_wowser', dir(cls)) - self.assertIn('staticmethod_wowser', dir(cls)) - self.assertTrue(all( - all(attr in dir(member) for attr in ('wowser', 'classmethod_wowser', 'staticmethod_wowser')) - for member in cls - )) + class SubEnum(SuperEnum): + sample = self.source_values[1] + self.assertTrue('description' not in dir(SubEnum)) + self.assertTrue('description' in dir(SubEnum.sample), dir(SubEnum.sample)) - self.assertEqual(dir(self.WowserNoMembers), dir(self.SubclassOfWowserNoMembers)) - # Check classmethods are present - self.assertIn('from_bytes', dir(self.IntWowser)) - self.assertIn('from_bytes', dir(self.IntWowserNoMembers)) - - def test_help_output_on_enum_members(self): - added_behaviour_enums = ( - self.Wowser, - self.IntWowser, - self.FloatWowser - ) - - for cls in added_behaviour_enums: - with self.subTest(cls=cls): - rendered_doc = pydoc.render_doc(cls.this) - self.assertIn('Wowser docstring', rendered_doc) - if cls in {self.IntWowser, self.FloatWowser}: - self.assertIn('float(self)', rendered_doc) - - def test_dir_for_enum_with_init(self): - EnumWithInit = self.EnumWithInit - - cls_dir = dir(EnumWithInit) - self.assertIn('__init__', cls_dir) - self.assertIn('some_method', cls_dir) - self.assertNotIn('greeting', cls_dir) - self.assertNotIn('farewell', cls_dir) - - member_dir = dir(EnumWithInit.ENGLISH) - self.assertNotIn('__init__', member_dir) - self.assertIn('some_method', member_dir) - self.assertIn('greeting', member_dir) - self.assertIn('farewell', member_dir) - - def test_mixin_dirs(self): - from datetime import date + def test_enum_in_enum_out(self): + Main = self.MainEnum + self.assertIs(Main(Main.first), Main.first) - enums_for_test = ( - # generic mixins from enum.py - (IntEnum, int), - (StrEnum, str), - # generic mixins from outside enum.py - (self.FloatEnum, float), - (self.DateEnum, date), - # concrete mixin from enum.py - (self.Grades, int), - # concrete mixin from outside enum.py - (self.Holiday, date), - # concrete mixin from enum.py with added behaviour - (self.IntWowser, int), - # concrete mixin from outside enum.py with added behaviour - (self.FloatWowser, float) - ) - - enum_dict = Enum.__dict__ - enum_dir = dir(Enum) - enum_module_names = enum.__all__ - is_from_enum_module = lambda cls: cls.__name__ in enum_module_names - is_enum_dunder = lambda attr: enum._is_dunder(attr) and attr in enum_dict - - def attr_is_inherited_from_object(cls, attr_name): - for base in cls.__mro__: - if attr_name in base.__dict__: - return base is object - return False - - # General tests - for enum_cls, mixin_cls in enums_for_test: - with self.subTest(enum_cls=enum_cls): - cls_dir = dir(enum_cls) - cls_dict = enum_cls.__dict__ - - mixin_attrs = [ - x for x in dir(mixin_cls) - if not attr_is_inherited_from_object(cls=mixin_cls, attr_name=x) - ] + def test_hash(self): + MainEnum = self.MainEnum + mapping = {} + mapping[MainEnum.first] = '1225' + mapping[MainEnum.second] = '0315' + mapping[MainEnum.third] = '0704' + 
self.assertEqual(mapping[MainEnum.second], '0315') + + def test_invalid_names(self): + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + mro = 9 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _create_= 11 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _get_mixins_ = 9 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _find_new_ = 1 + with self.assertRaises(ValueError): + class Wrong(self.enum_type): + _any_name_ = 9 + + def test_object_str_override(self): + "check that setting __str__ to object's is not reset to Enum's" + class Generic(self.enum_type): + item = self.source_values[2] + def __repr__(self): + return "%s.test" % (self._name_, ) + __str__ = object.__str__ + self.assertEqual(str(Generic.item), 'item.test') + + def test_overridden_str(self): + NS = self.NewStrEnum + self.assertEqual(str(NS.first), NS.first.name.upper()) + self.assertEqual(format(NS.first), NS.first.name.upper()) - first_enum_base = next( - base for base in enum_cls.__mro__ - if is_from_enum_module(base) + def test_overridden_str_format(self): + NSF = self.NewStrFormatEnum + self.assertEqual(str(NSF.first), NSF.first.name.title()) + self.assertEqual(format(NSF.first), ''.join(reversed(NSF.first.name))) + + def test_overridden_str_format_inherited(self): + NSE = self.NewSubEnum + self.assertEqual(str(NSE.first), NSE.first.name.title()) + self.assertEqual(format(NSE.first), ''.join(reversed(NSE.first.name))) + + def test_programmatic_function_string(self): + MinorEnum = self.enum_type('MinorEnum', 'june july august') + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, ) + values = self.values + if self.enum_type is StrEnum: + values = ['june','july','august'] + for month, av in zip('june july august'.split(), values): + e = MinorEnum[month] + self.assertEqual(e.value, av, list(MinorEnum)) + self.assertEqual(e.name, month) + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - for attr in mixin_attrs: - with self.subTest(attr=attr): - if enum._is_sunder(attr): - # Unlikely, but no harm in testing - self.assertNotIn(attr, cls_dir) - elif attr in {'__class__', '__doc__', '__members__', '__module__'}: - self.assertIn(attr, cls_dir) - elif is_enum_dunder(attr): - if is_from_enum_module(enum_cls): - self.assertNotIn(attr, cls_dir) - elif getattr(enum_cls, attr) is getattr(first_enum_base, attr): - self.assertNotIn(attr, cls_dir) - else: - self.assertIn(attr, cls_dir) - else: - self.assertIn(attr, cls_dir) - - # Some specific examples - int_enum_dir = dir(IntEnum) - self.assertIn('imag', int_enum_dir) - self.assertIn('__rfloordiv__', int_enum_dir) - self.assertNotIn('__format__', int_enum_dir) - self.assertNotIn('__hash__', int_enum_dir) - self.assertNotIn('__init_subclass__', int_enum_dir) - self.assertNotIn('__subclasshook__', int_enum_dir) - - class OverridesFormatOutsideEnumModule(Enum): - def __format__(self, *args, **kwargs): - return super().__format__(*args, **kwargs) - SOME_MEMBER = 1 - - self.assertIn('__format__', dir(OverridesFormatOutsideEnumModule)) - self.assertIn('__format__', dir(OverridesFormatOutsideEnumModule.SOME_MEMBER)) + def 
test_programmatic_function_string_list(self): + MinorEnum = self.enum_type('MinorEnum', ['june', 'july', 'august']) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + values = self.values + if self.enum_type is StrEnum: + values = ['june','july','august'] + for month, av in zip('june july august'.split(), values): + e = MinorEnum[month] + self.assertEqual(e.value, av) + self.assertEqual(e.name, month) + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - def test_dir_on_sub_with_behavior_on_super(self): - # see issue22506 + def test_programmatic_function_iterable(self): + MinorEnum = self.enum_type( + 'MinorEnum', + (('june', self.source_values[0]), ('july', self.source_values[1]), ('august', self.source_values[2])) + ) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) self.assertEqual( - set(dir(self.SubEnum1.sample)), - set(['__class__', '__doc__', '__module__', 'name', 'value', 'invisible']), + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, ) + for month, av in zip('june july august'.split(), self.values): + e = MinorEnum[month] + self.assertEqual(e.value, av) + self.assertEqual(e.name, month) + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - def test_dir_on_sub_with_behavior_including_instance_dict_on_super(self): - # see issue40084 - self.assertTrue({'description'} <= set(dir(self.SubEnum2.sample))) + def test_programmatic_function_from_dict(self): + MinorEnum = self.enum_type( + 'MinorEnum', + OrderedDict((('june', self.source_values[0]), ('july', self.source_values[1]), ('august', self.source_values[2]))) + ) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for month, av in zip('june july august'.split(), self.values): + e = MinorEnum[month] + if MinorEnum._member_type_ is not object and issubclass(MinorEnum, MinorEnum._member_type_): + self.assertEqual(e, av) + else: + self.assertNotEqual(e, av) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + self.assertIs(e, MinorEnum(av)) - def test_enum_in_enum_out(self): - Season = self.Season - self.assertIs(Season(Season.WINTER), Season.WINTER) + def test_repr(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(repr(TE(0)), "") + self.assertEqual(repr(TE.dupe), "") + self.assertEqual(repr(self.dupe2), "") + elif issubclass(TE, StrEnum): + self.assertEqual(repr(TE.dupe), "") + else: + self.assertEqual(repr(TE.dupe), "" % (self.values[2], ), TE._value_repr_) + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(repr(member), "" % (member.name, member.value)) - def test_enum_value(self): - Season = self.Season - self.assertEqual(Season.SPRING.value, 1) + def test_repr_override(self): + class Generic(self.enum_type): + first = auto() + second = auto() + third = auto() + def 
__repr__(self): + return "don't you just love shades of %s?" % self.name + self.assertEqual( + repr(Generic.third), + "don't you just love shades of third?", + ) - def test_intenum_value(self): - self.assertEqual(IntStooges.CURLY.value, 2) + def test_inherited_repr(self): + class MyEnum(self.enum_type): + def __repr__(self): + return "My name is %s." % self.name + class MySubEnum(MyEnum): + this = auto() + that = auto() + theother = auto() + self.assertEqual(repr(MySubEnum.that), "My name is that.") - def test_enum(self): - Season = self.Season - lst = list(Season) - self.assertEqual(len(lst), len(Season)) - self.assertEqual(len(Season), 4, Season) + def test_reversed_iteration_order(self): self.assertEqual( - [Season.SPRING, Season.SUMMER, Season.AUTUMN, Season.WINTER], lst) + list(reversed(self.MainEnum)), + [self.MainEnum.third, self.MainEnum.second, self.MainEnum.first], + ) - for i, season in enumerate('SPRING SUMMER AUTUMN WINTER'.split(), 1): - e = Season(i) - self.assertEqual(e, getattr(Season, season)) - self.assertEqual(e.value, i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, season) - self.assertIn(e, Season) - self.assertIs(type(e), Season) - self.assertIsInstance(e, Season) - self.assertEqual(str(e), season) - self.assertEqual(repr(e), 'Season.{0}'.format(season)) - - def test_value_name(self): - Season = self.Season - self.assertEqual(Season.SPRING.name, 'SPRING') - self.assertEqual(Season.SPRING.value, 1) - with self.assertRaises(AttributeError): - Season.SPRING.name = 'invierno' - with self.assertRaises(AttributeError): - Season.SPRING.value = 2 +class _PlainOutputTests: - def test_changing_member(self): - Season = self.Season - with self.assertRaises(AttributeError): - Season.WINTER = 'really cold' + def test_str(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(str(TE.dupe), "MainEnum.dupe") + self.assertEqual(str(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(str(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(str(member), "MainEnum.%s" % (member.name, )) - def test_attribute_deletion(self): - class Season(Enum): - SPRING = 1 - SUMMER = 2 - AUTUMN = 3 - WINTER = 4 + def test_format(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(format(TE.dupe), "MainEnum.dupe") + self.assertEqual(format(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(format(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(format(member), "MainEnum.%s" % (member.name, )) - def spam(cls): - pass + def test_overridden_format(self): + NF = self.NewFormatEnum + self.assertEqual(str(NF.first), "NewFormatEnum.first", '%s %r' % (NF.__str__, NF.first)) + self.assertEqual(format(NF.first), "FIRST") - self.assertTrue(hasattr(Season, 'spam')) - del Season.spam - self.assertFalse(hasattr(Season, 'spam')) + def test_format_specs(self): + TE = self.MainEnum + self.assertFormatIsStr('{}', TE.second) + self.assertFormatIsStr('{:}', TE.second) + self.assertFormatIsStr('{:20}', TE.second) + self.assertFormatIsStr('{:^20}', TE.second) + self.assertFormatIsStr('{:>20}', TE.second) + self.assertFormatIsStr('{:<20}', TE.second) + self.assertFormatIsStr('{:5.2}', TE.second) - with self.assertRaises(AttributeError): - del Season.SPRING - with self.assertRaises(AttributeError): - del Season.DRY - with self.assertRaises(AttributeError): - del Season.SPRING.name - def test_bool_of_class(self): - class 
Empty(Enum): - pass - self.assertTrue(bool(Empty)) +class _MixedOutputTests: - def test_bool_of_member(self): - class Count(Enum): - zero = 0 - one = 1 - two = 2 - for member in Count: - self.assertTrue(bool(member)) + def test_str(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(str(TE.dupe), "MainEnum.dupe") + self.assertEqual(str(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(str(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(str(member), "MainEnum.%s" % (member.name, )) + + def test_format(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(format(TE.dupe), "MainEnum.dupe") + self.assertEqual(format(self.dupe2), "MainEnum.first|third") + else: + self.assertEqual(format(TE.dupe), "MainEnum.third") + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(format(member), "MainEnum.%s" % (member.name, )) + + def test_overridden_format(self): + NF = self.NewFormatEnum + self.assertEqual(str(NF.first), "NewFormatEnum.first") + self.assertEqual(format(NF.first), "FIRST") + + def test_format_specs(self): + TE = self.MainEnum + self.assertFormatIsStr('{}', TE.first) + self.assertFormatIsStr('{:}', TE.first) + self.assertFormatIsStr('{:20}', TE.first) + self.assertFormatIsStr('{:^20}', TE.first) + self.assertFormatIsStr('{:>20}', TE.first) + self.assertFormatIsStr('{:<20}', TE.first) + self.assertFormatIsStr('{:5.2}', TE.first) + + +class _MinimalOutputTests: + + def test_str(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(str(TE.dupe), "3") + self.assertEqual(str(self.dupe2), "5") + else: + self.assertEqual(str(TE.dupe), str(self.values[2])) + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(str(member), str(value)) + + def test_format(self): + TE = self.MainEnum + if self.is_flag: + self.assertEqual(format(TE.dupe), "3") + self.assertEqual(format(self.dupe2), "5") + else: + self.assertEqual(format(TE.dupe), format(self.values[2])) + for name, value, member in zip(self.names, self.values, TE, strict=True): + self.assertEqual(format(member), format(value)) + + def test_overridden_format(self): + NF = self.NewFormatEnum + self.assertEqual(str(NF.first), str(self.values[0])) + self.assertEqual(format(NF.first), "FIRST") + + def test_format_specs(self): + TE = self.MainEnum + self.assertFormatIsValue('{}', TE.third) + self.assertFormatIsValue('{:}', TE.third) + self.assertFormatIsValue('{:20}', TE.third) + self.assertFormatIsValue('{:^20}', TE.third) + self.assertFormatIsValue('{:>20}', TE.third) + self.assertFormatIsValue('{:<20}', TE.third) + if TE._member_type_ is float: + self.assertFormatIsValue('{:n}', TE.third) + self.assertFormatIsValue('{:5.2}', TE.third) + self.assertFormatIsValue('{:f}', TE.third) + + +class _FlagTests: + + def test_default_missing_with_wrong_type_value(self): + with self.assertRaisesRegex( + ValueError, + "'RED' is not a valid TestFlag.Color", + ) as ctx: + self.MainEnum('RED') + self.assertIs(ctx.exception.__context__, None) + +class TestPlainEnum(_EnumTests, _PlainOutputTests, unittest.TestCase): + enum_type = Enum + + +class TestPlainFlag(_EnumTests, _PlainOutputTests, unittest.TestCase): + enum_type = Flag + + +class TestIntEnum(_EnumTests, _MinimalOutputTests, unittest.TestCase): + enum_type = IntEnum + + +class TestStrEnum(_EnumTests, _MinimalOutputTests, unittest.TestCase): + enum_type = StrEnum + + +class TestIntFlag(_EnumTests, 
_MinimalOutputTests, unittest.TestCase): + enum_type = IntFlag + + +class TestMixedInt(_EnumTests, _MixedOutputTests, unittest.TestCase): + class enum_type(int, Enum): pass + + +class TestMixedStr(_EnumTests, _MixedOutputTests, unittest.TestCase): + class enum_type(str, Enum): pass + + +class TestMixedIntFlag(_EnumTests, _MixedOutputTests, unittest.TestCase): + class enum_type(int, Flag): pass + + +class TestMixedDate(_EnumTests, _MixedOutputTests, unittest.TestCase): + + values = [date(2021, 12, 25), date(2020, 3, 15), date(2019, 11, 27)] + source_values = [(2021, 12, 25), (2020, 3, 15), (2019, 11, 27)] + + class enum_type(date, Enum): + def _generate_next_value_(name, start, count, last_values): + values = [(2021, 12, 25), (2020, 3, 15), (2019, 11, 27)] + return values[count] + + +class TestMinimalDate(_EnumTests, _MinimalOutputTests, unittest.TestCase): + + values = [date(2023, 12, 1), date(2016, 2, 29), date(2009, 1, 1)] + source_values = [(2023, 12, 1), (2016, 2, 29), (2009, 1, 1)] + + class enum_type(date, ReprEnum): + def _generate_next_value_(name, start, count, last_values): + values = [(2023, 12, 1), (2016, 2, 29), (2009, 1, 1)] + return values[count] - def test_invalid_names(self): - with self.assertRaises(ValueError): - class Wrong(Enum): - mro = 9 - with self.assertRaises(ValueError): - class Wrong(Enum): - _create_= 11 - with self.assertRaises(ValueError): - class Wrong(Enum): - _get_mixins_ = 9 - with self.assertRaises(ValueError): - class Wrong(Enum): - _find_new_ = 1 - with self.assertRaises(ValueError): - class Wrong(Enum): - _any_name_ = 9 + +class TestMixedFloat(_EnumTests, _MixedOutputTests, unittest.TestCase): + + values = [1.1, 2.2, 3.3] + + class enum_type(float, Enum): + def _generate_next_value_(name, start, count, last_values): + values = [1.1, 2.2, 3.3] + return values[count] + + +class TestMinimalFloat(_EnumTests, _MinimalOutputTests, unittest.TestCase): + + values = [4.4, 5.5, 6.6] + + class enum_type(float, ReprEnum): + def _generate_next_value_(name, start, count, last_values): + values = [4.4, 5.5, 6.6] + return values[count] + + +class TestSpecial(unittest.TestCase): + """ + various operations that are not attributable to every possible enum + """ + + def setUp(self): + class Season(Enum): + SPRING = 1 + SUMMER = 2 + AUTUMN = 3 + WINTER = 4 + self.Season = Season + # + class Grades(IntEnum): + A = 5 + B = 4 + C = 3 + D = 2 + F = 0 + self.Grades = Grades + # + class Directional(str, Enum): + EAST = 'east' + WEST = 'west' + NORTH = 'north' + SOUTH = 'south' + self.Directional = Directional + # + from datetime import date + class Holiday(date, Enum): + NEW_YEAR = 2013, 1, 1 + IDES_OF_MARCH = 2013, 3, 15 + self.Holiday = Holiday def test_bool(self): # plain Enum members are always True @@ -656,92 +865,56 @@ class IntLogic(int, Enum): self.assertTrue(IntLogic.true) self.assertFalse(IntLogic.false) - @unittest.skipIf( - python_version >= (3, 12), - '__contains__ now returns True/False for all inputs', - ) - def test_contains_er(self): - Season = self.Season - self.assertIn(Season.AUTUMN, Season) - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 3 in Season - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'AUTUMN' in Season - val = Season(3) - self.assertIn(val, Season) - # - class OtherEnum(Enum): - one = 1; two = 2 - self.assertNotIn(OtherEnum.two, Season) - - @unittest.skipIf( - python_version < (3, 12), - '__contains__ only works with enum memmbers before 3.12', - ) - def 
test_contains_tf(self): - Season = self.Season - self.assertIn(Season.AUTUMN, Season) - self.assertTrue(3 in Season) - self.assertFalse('AUTUMN' in Season) - val = Season(3) - self.assertIn(val, Season) - # - class OtherEnum(Enum): - one = 1; two = 2 - self.assertNotIn(OtherEnum.two, Season) - def test_comparisons(self): Season = self.Season with self.assertRaises(TypeError): Season.SPRING < Season.WINTER with self.assertRaises(TypeError): Season.SPRING > 4 - + # self.assertNotEqual(Season.SPRING, 1) - + # class Part(Enum): SPRING = 1 CLIP = 2 BARREL = 3 - + # self.assertNotEqual(Season.SPRING, Part.SPRING) with self.assertRaises(TypeError): Season.SPRING < Part.CLIP - def test_enum_duplicates(self): - class Season(Enum): - SPRING = 1 - SUMMER = 2 - AUTUMN = FALL = 3 - WINTER = 4 - ANOTHER_SPRING = 1 - lst = list(Season) - self.assertEqual( - lst, - [Season.SPRING, Season.SUMMER, - Season.AUTUMN, Season.WINTER, - ]) - self.assertIs(Season.FALL, Season.AUTUMN) - self.assertEqual(Season.FALL.value, 3) - self.assertEqual(Season.AUTUMN.value, 3) - self.assertIs(Season(3), Season.AUTUMN) - self.assertIs(Season(1), Season.SPRING) - self.assertEqual(Season.FALL.name, 'AUTUMN') - self.assertEqual( - [k for k,v in Season.__members__.items() if v.name != k], - ['FALL', 'ANOTHER_SPRING'], - ) + def test_dir_with_custom_dunders(self): + class PlainEnum(Enum): + pass + cls_dir = dir(PlainEnum) + self.assertNotIn('__repr__', cls_dir) + self.assertNotIn('__str__', cls_dir) + self.assertNotIn('__repr__', cls_dir) + self.assertNotIn('__repr__', cls_dir) + # + class MyEnum(Enum): + def __repr__(self): + return object.__repr__(self) + def __str__(self): + return object.__repr__(self) + def __format__(self): + return object.__repr__(self) + def __init__(self): + pass + cls_dir = dir(MyEnum) + self.assertIn('__repr__', cls_dir) + self.assertIn('__str__', cls_dir) + self.assertIn('__repr__', cls_dir) + self.assertIn('__repr__', cls_dir) - def test_duplicate_name(self): + def test_duplicate_name_error(self): with self.assertRaises(TypeError): class Color(Enum): red = 1 green = 2 blue = 3 red = 4 - + # with self.assertRaises(TypeError): class Color(Enum): red = 1 @@ -749,232 +922,45 @@ class Color(Enum): blue = 3 def red(self): return 'red' - + # with self.assertRaises(TypeError): class Color(Enum): - @property + @enum.property def red(self): return 'redder' - red = 1 - green = 2 - blue = 3 - - def test_reserved__sunder_(self): - with self.assertRaisesRegex( - ValueError, - '_sunder_ names, such as ._bad_., are reserved', - ): - class Bad(Enum): - _bad_ = 1 + red = 1 + green = 2 + blue = 3 + + def test_enum_function_with_qualname(self): + if isinstance(Theory, Exception): + raise Theory + self.assertEqual(Theory.__qualname__, 'spanish_inquisition') def test_enum_with_value_name(self): class Huh(Enum): name = 1 value = 2 - self.assertEqual( - list(Huh), - [Huh.name, Huh.value], - ) + self.assertEqual(list(Huh), [Huh.name, Huh.value]) self.assertIs(type(Huh.name), Huh) self.assertEqual(Huh.name.name, 'name') self.assertEqual(Huh.name.value, 1) - def test_format_enum(self): - Season = self.Season - self.assertEqual('{}'.format(Season.SPRING), - '{}'.format(str(Season.SPRING))) - self.assertEqual( '{:}'.format(Season.SPRING), - '{:}'.format(str(Season.SPRING))) - self.assertEqual('{:20}'.format(Season.SPRING), - '{:20}'.format(str(Season.SPRING))) - self.assertEqual('{:^20}'.format(Season.SPRING), - '{:^20}'.format(str(Season.SPRING))) - self.assertEqual('{:>20}'.format(Season.SPRING), - 
'{:>20}'.format(str(Season.SPRING))) - self.assertEqual('{:<20}'.format(Season.SPRING), - '{:<20}'.format(str(Season.SPRING))) - - def test_str_override_enum(self): - class EnumWithStrOverrides(Enum): - one = auto() - two = auto() - - def __str__(self): - return 'Str!' - self.assertEqual(str(EnumWithStrOverrides.one), 'Str!') - self.assertEqual('{}'.format(EnumWithStrOverrides.one), 'Str!') - - def test_format_override_enum(self): - class EnumWithFormatOverride(Enum): - one = 1.0 - two = 2.0 - def __format__(self, spec): - return 'Format!!' - self.assertEqual(str(EnumWithFormatOverride.one), 'one') - self.assertEqual('{}'.format(EnumWithFormatOverride.one), 'Format!!') - - def test_str_and_format_override_enum(self): - class EnumWithStrFormatOverrides(Enum): - one = auto() - two = auto() - def __str__(self): - return 'Str!' - def __format__(self, spec): - return 'Format!' - self.assertEqual(str(EnumWithStrFormatOverrides.one), 'Str!') - self.assertEqual('{}'.format(EnumWithStrFormatOverrides.one), 'Format!') - - def test_str_override_mixin(self): - class MixinEnumWithStrOverride(float, Enum): - one = 1.0 - two = 2.0 - def __str__(self): - return 'Overridden!' - self.assertEqual(str(MixinEnumWithStrOverride.one), 'Overridden!') - self.assertEqual('{}'.format(MixinEnumWithStrOverride.one), 'Overridden!') - - def test_str_and_format_override_mixin(self): - class MixinWithStrFormatOverrides(float, Enum): - one = 1.0 - two = 2.0 - def __str__(self): - return 'Str!' - def __format__(self, spec): - return 'Format!' - self.assertEqual(str(MixinWithStrFormatOverrides.one), 'Str!') - self.assertEqual('{}'.format(MixinWithStrFormatOverrides.one), 'Format!') - - def test_format_override_mixin(self): - class TestFloat(float, Enum): - one = 1.0 - two = 2.0 - def __format__(self, spec): - return 'TestFloat success!' 
- self.assertEqual(str(TestFloat.one), 'one') - self.assertEqual('{}'.format(TestFloat.one), 'TestFloat success!') - - @unittest.skipIf( - python_version < (3, 12), - 'mixin-format is still using member.value', - ) - def test_mixin_format_warning(self): - class Grades(int, Enum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - self.assertEqual(f'{self.Grades.B}', 'B') - - @unittest.skipIf( - python_version >= (3, 12), - 'mixin-format now uses member instead of member.value', - ) - def test_mixin_format_warning(self): - class Grades(int, Enum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - with self.assertWarns(DeprecationWarning): - self.assertEqual(f'{Grades.B}', '4') - - def assertFormatIsValue(self, spec, member): - if python_version < (3, 12) and (not spec or spec in ('{}','{:}')): - with self.assertWarns(DeprecationWarning): - self.assertEqual(spec.format(member), spec.format(member.value)) - else: - self.assertEqual(spec.format(member), spec.format(member.value)) - - def test_format_enum_date(self): - Holiday = self.Holiday - self.assertFormatIsValue('{}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:^20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:>20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:<20}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:%Y %m}', Holiday.IDES_OF_MARCH) - self.assertFormatIsValue('{:%Y %m %M:00}', Holiday.IDES_OF_MARCH) - - def test_format_enum_float(self): - Konstants = self.Konstants - self.assertFormatIsValue('{}', Konstants.TAU) - self.assertFormatIsValue('{:}', Konstants.TAU) - self.assertFormatIsValue('{:20}', Konstants.TAU) - self.assertFormatIsValue('{:^20}', Konstants.TAU) - self.assertFormatIsValue('{:>20}', Konstants.TAU) - self.assertFormatIsValue('{:<20}', Konstants.TAU) - self.assertFormatIsValue('{:n}', Konstants.TAU) - self.assertFormatIsValue('{:5.2}', Konstants.TAU) - self.assertFormatIsValue('{:f}', Konstants.TAU) - - def test_format_enum_int(self): - class Grades(int, Enum): - A = 5 - B = 4 - C = 3 - D = 2 - F = 0 - self.assertFormatIsValue('{}', Grades.C) - self.assertFormatIsValue('{:}', Grades.C) - self.assertFormatIsValue('{:20}', Grades.C) - self.assertFormatIsValue('{:^20}', Grades.C) - self.assertFormatIsValue('{:>20}', Grades.C) - self.assertFormatIsValue('{:<20}', Grades.C) - self.assertFormatIsValue('{:+}', Grades.C) - self.assertFormatIsValue('{:08X}', Grades.C) - self.assertFormatIsValue('{:b}', Grades.C) - - def test_format_enum_str(self): - Directional = self.Directional - self.assertFormatIsValue('{}', Directional.WEST) - self.assertFormatIsValue('{:}', Directional.WEST) - self.assertFormatIsValue('{:20}', Directional.WEST) - self.assertFormatIsValue('{:^20}', Directional.WEST) - self.assertFormatIsValue('{:>20}', Directional.WEST) - self.assertFormatIsValue('{:<20}', Directional.WEST) - - def test_object_str_override(self): - class Colors(Enum): - RED, GREEN, BLUE = 1, 2, 3 - def __repr__(self): - return "test.%s" % (self._name_, ) - __str__ = object.__str__ - self.assertEqual(str(Colors.RED), 'test.RED') - - def test_enum_str_override(self): - class MyStrEnum(Enum): - def __str__(self): - return 'MyStr' - class MyMethodEnum(Enum): - def hello(self): - return 'Hello! 
My name is %s' % self.name - class Test1Enum(MyMethodEnum, int, MyStrEnum): - One = 1 - Two = 2 - self.assertTrue(Test1Enum._member_type_ is int) - self.assertEqual(str(Test1Enum.One), 'MyStr') - self.assertEqual(format(Test1Enum.One, ''), 'MyStr') - # - class Test2Enum(MyStrEnum, MyMethodEnum): - One = 1 - Two = 2 - self.assertEqual(str(Test2Enum.One), 'MyStr') - self.assertEqual(format(Test1Enum.One, ''), 'MyStr') - def test_inherited_data_type(self): class HexInt(int): + __qualname__ = 'HexInt' def __repr__(self): return hex(self) class MyEnum(HexInt, enum.Enum): + __qualname__ = 'MyEnum' A = 1 B = 2 C = 3 - def __repr__(self): - return '<%s.%s: %r>' % (self.__class__.__name__, self._name_, self._value_) self.assertEqual(repr(MyEnum.A), '') + globals()['HexInt'] = HexInt + globals()['MyEnum'] = MyEnum + test_pickle_dump_load(self.assertIs, MyEnum.A) + test_pickle_dump_load(self.assertIs, MyEnum) # class SillyInt(HexInt): __qualname__ = 'SillyInt' @@ -990,7 +976,7 @@ class MyOtherEnum(SillyInt, enum.Enum): test_pickle_dump_load(self.assertIs, MyOtherEnum.E) test_pickle_dump_load(self.assertIs, MyOtherEnum) # - # This did not work in 3.9, but does now with pickling by name + # This did not work in 3.10, but does now with pickling by name class UnBrokenInt(int): __qualname__ = 'UnBrokenInt' def __new__(cls, value): @@ -1007,6 +993,124 @@ class MyUnBrokenEnum(UnBrokenInt, Enum): test_pickle_dump_load(self.assertIs, MyUnBrokenEnum.I) test_pickle_dump_load(self.assertIs, MyUnBrokenEnum) + def test_floatenum_fromhex(self): + h = float.hex(FloatStooges.MOE.value) + self.assertIs(FloatStooges.fromhex(h), FloatStooges.MOE) + h = float.hex(FloatStooges.MOE.value + 0.01) + with self.assertRaises(ValueError): + FloatStooges.fromhex(h) + + def test_programmatic_function_type(self): + MinorEnum = Enum('MinorEnum', 'june july august', type=int) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_string_with_start(self): + MinorEnum = Enum('MinorEnum', 'june july august', start=10) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 10): + e = MinorEnum(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_type_with_start(self): + MinorEnum = Enum('MinorEnum', 'june july august', type=int, start=30) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 30): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_string_list_with_start(self): + MinorEnum = Enum('MinorEnum', ['june', 'july', 'august'], start=20) + lst = list(MinorEnum) + 
self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 20): + e = MinorEnum(i) + self.assertEqual(int(e.value), i) + self.assertNotEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_type_from_subclass(self): + MinorEnum = IntEnum('MinorEnum', 'june july august') + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 1): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_programmatic_function_type_from_subclass_with_start(self): + MinorEnum = IntEnum('MinorEnum', 'june july august', start=40) + lst = list(MinorEnum) + self.assertEqual(len(lst), len(MinorEnum)) + self.assertEqual(len(MinorEnum), 3, MinorEnum) + self.assertEqual( + [MinorEnum.june, MinorEnum.july, MinorEnum.august], + lst, + ) + for i, month in enumerate('june july august'.split(), 40): + e = MinorEnum(i) + self.assertEqual(e, i) + self.assertEqual(e.name, month) + self.assertIn(e, MinorEnum) + self.assertIs(type(e), MinorEnum) + + def test_intenum_from_bytes(self): + self.assertIs(IntStooges.from_bytes(b'\x00\x03', 'big'), IntStooges.MOE) + with self.assertRaises(ValueError): + IntStooges.from_bytes(b'\x00\x05', 'big') + + def test_reserved_sunder_error(self): + with self.assertRaisesRegex( + ValueError, + '_sunder_ names, such as ._bad_., are reserved', + ): + class Bad(Enum): + _bad_ = 1 + def test_too_many_data_types(self): with self.assertRaisesRegex(TypeError, 'too many data types'): class Huh(str, int, Enum): @@ -1022,122 +1126,6 @@ def repr(self): class Huh(MyStr, MyInt, Enum): One = 1 - def test_value_auto_assign(self): - class Some(Enum): - def __new__(cls, val): - return object.__new__(cls) - x = 1 - y = 2 - - self.assertEqual(Some.x.value, 1) - self.assertEqual(Some.y.value, 2) - - def test_hash(self): - Season = self.Season - dates = {} - dates[Season.WINTER] = '1225' - dates[Season.SPRING] = '0315' - dates[Season.SUMMER] = '0704' - dates[Season.AUTUMN] = '1031' - self.assertEqual(dates[Season.AUTUMN], '1031') - - def test_intenum_from_scratch(self): - class phy(int, Enum): - pi = 3 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_intenum_inherited(self): - class IntEnum(int, Enum): - pass - class phy(IntEnum): - pi = 3 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_floatenum_from_scratch(self): - class phy(float, Enum): - pi = 3.1415926 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_floatenum_inherited(self): - class FloatEnum(float, Enum): - pass - class phy(FloatEnum): - pi = 3.1415926 - tau = 2 * pi - self.assertTrue(phy.pi < phy.tau) - - def test_strenum_from_scratch(self): - class phy(str, Enum): - pi = 'Pi' - tau = 'Tau' - self.assertTrue(phy.pi < phy.tau) - - def test_strenum_inherited_methods(self): - class phy(StrEnum): - pi = 'Pi' - tau = 'Tau' - self.assertTrue(phy.pi < phy.tau) - self.assertEqual(phy.pi.upper(), 'PI') - self.assertEqual(phy.tau.count('a'), 1) - - def test_intenum(self): - class WeekDay(IntEnum): - SUNDAY = 1 - MONDAY = 2 - TUESDAY = 3 - WEDNESDAY = 4 - THURSDAY = 
5 - FRIDAY = 6 - SATURDAY = 7 - - self.assertEqual(['a', 'b', 'c'][WeekDay.MONDAY], 'c') - self.assertEqual([i for i in range(WeekDay.TUESDAY)], [0, 1, 2]) - - lst = list(WeekDay) - self.assertEqual(len(lst), len(WeekDay)) - self.assertEqual(len(WeekDay), 7) - target = 'SUNDAY MONDAY TUESDAY WEDNESDAY THURSDAY FRIDAY SATURDAY' - target = target.split() - for i, weekday in enumerate(target, 1): - e = WeekDay(i) - self.assertEqual(e, i) - self.assertEqual(int(e), i) - self.assertEqual(e.name, weekday) - self.assertIn(e, WeekDay) - self.assertEqual(lst.index(e)+1, i) - self.assertTrue(0 < e < 8) - self.assertIs(type(e), WeekDay) - self.assertIsInstance(e, int) - self.assertIsInstance(e, Enum) - - def test_intenum_duplicates(self): - class WeekDay(IntEnum): - SUNDAY = 1 - MONDAY = 2 - TUESDAY = TEUSDAY = 3 - WEDNESDAY = 4 - THURSDAY = 5 - FRIDAY = 6 - SATURDAY = 7 - self.assertIs(WeekDay.TEUSDAY, WeekDay.TUESDAY) - self.assertEqual(WeekDay(3).name, 'TUESDAY') - self.assertEqual([k for k,v in WeekDay.__members__.items() - if v.name != k], ['TEUSDAY', ]) - - def test_intenum_from_bytes(self): - self.assertIs(IntStooges.from_bytes(b'\x00\x03', 'big'), IntStooges.MOE) - with self.assertRaises(ValueError): - IntStooges.from_bytes(b'\x00\x05', 'big') - - def test_floatenum_fromhex(self): - h = float.hex(FloatStooges.MOE.value) - self.assertIs(FloatStooges.fromhex(h), FloatStooges.MOE) - h = float.hex(FloatStooges.MOE.value + 0.01) - with self.assertRaises(ValueError): - FloatStooges.fromhex(h) def test_pickle_enum(self): if isinstance(Stooges, Exception): @@ -1169,12 +1157,7 @@ def test_pickle_enum_function_with_module(self): test_pickle_dump_load(self.assertIs, Question.who) test_pickle_dump_load(self.assertIs, Question) - def test_enum_function_with_qualname(self): - if isinstance(Theory, Exception): - raise Theory - self.assertEqual(Theory.__qualname__, 'spanish_inquisition') - - def test_class_nested_enum_and_pickle_protocol_four(self): + def test_pickle_nested_class(self): # would normally just have this directly in the class namespace class NestedEnum(Enum): twigs = 'common' @@ -1192,225 +1175,46 @@ class ReplaceGlobalInt(IntEnum): for proto in range(HIGHEST_PROTOCOL): self.assertEqual(ReplaceGlobalInt.TWO.__reduce_ex__(proto), 'TWO') - def test_exploding_pickle(self): + def test_pickle_explodes(self): BadPickle = Enum( 'BadPickle', 'dill sweet bread-n-butter', module=__name__) globals()['BadPickle'] = BadPickle # now break BadPickle to test exception raising enum._make_class_unpicklable(BadPickle) - test_pickle_exception(self.assertRaises, TypeError, BadPickle.dill) - test_pickle_exception(self.assertRaises, PicklingError, BadPickle) - - def test_string_enum(self): - class SkillLevel(str, Enum): - master = 'what is the sound of one hand clapping?' - journeyman = 'why did the chicken cross the road?' - apprentice = 'knock, knock!' 
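
The relocated ``test_intenum``/``test_intenum_duplicates`` tests in this hunk rely on two long-standing properties of ``IntEnum``: members behave like plain ints, and a repeated value becomes an alias of the first member defined with it. A short sketch of just those two points (the misspelled ``TEUSDAY`` alias is deliberate in the tests and kept here):

    from enum import IntEnum

    class WeekDay(IntEnum):
        SUNDAY = 1
        MONDAY = 2
        TUESDAY = TEUSDAY = 3        # TEUSDAY becomes an alias for TUESDAY

    assert ['a', 'b', 'c'][WeekDay.MONDAY] == 'c'      # usable wherever an int is expected
    assert list(range(WeekDay.TUESDAY)) == [0, 1, 2]
    assert WeekDay.TEUSDAY is WeekDay.TUESDAY          # aliases resolve to the canonical member
    assert WeekDay(3).name == 'TUESDAY'
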
- self.assertEqual(SkillLevel.apprentice, 'knock, knock!') - - def test_getattr_getitem(self): - class Period(Enum): - morning = 1 - noon = 2 - evening = 3 - night = 4 - self.assertIs(Period(2), Period.noon) - self.assertIs(getattr(Period, 'night'), Period.night) - self.assertIs(Period['morning'], Period.morning) - - def test_getattr_dunder(self): - Season = self.Season - self.assertTrue(getattr(Season, '__eq__')) - - def test_iteration_order(self): - class Season(Enum): - SUMMER = 2 - WINTER = 4 - AUTUMN = 3 - SPRING = 1 - self.assertEqual( - list(Season), - [Season.SUMMER, Season.WINTER, Season.AUTUMN, Season.SPRING], - ) - - def test_reversed_iteration_order(self): - self.assertEqual( - list(reversed(self.Season)), - [self.Season.WINTER, self.Season.AUTUMN, self.Season.SUMMER, - self.Season.SPRING] - ) - - def test_programmatic_function_string(self): - SummerMonth = Enum('SummerMonth', 'june july august') - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_string_with_start(self): - SummerMonth = Enum('SummerMonth', 'june july august', start=10) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 10): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_string_list(self): - SummerMonth = Enum('SummerMonth', ['june', 'july', 'august']) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_string_list_with_start(self): - SummerMonth = Enum('SummerMonth', ['june', 'july', 'august'], start=20) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 20): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_iterable(self): - SummerMonth = Enum( - 'SummerMonth', - (('june', 1), ('july', 2), ('august', 3)) - ) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in 
enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) - - def test_programmatic_function_from_dict(self): - SummerMonth = Enum( - 'SummerMonth', - OrderedDict((('june', 1), ('july', 2), ('august', 3))) - ) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(int(e.value), i) - self.assertNotEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + test_pickle_exception(self.assertRaises, TypeError, BadPickle.dill) + test_pickle_exception(self.assertRaises, PicklingError, BadPickle) - def test_programmatic_function_type(self): - SummerMonth = Enum('SummerMonth', 'june july august', type=int) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + def test_string_enum(self): + class SkillLevel(str, Enum): + master = 'what is the sound of one hand clapping?' + journeyman = 'why did the chicken cross the road?' + apprentice = 'knock, knock!' + self.assertEqual(SkillLevel.apprentice, 'knock, knock!') - def test_programmatic_function_type_with_start(self): - SummerMonth = Enum('SummerMonth', 'june july august', type=int, start=30) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 30): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + def test_getattr_getitem(self): + class Period(Enum): + morning = 1 + noon = 2 + evening = 3 + night = 4 + self.assertIs(Period(2), Period.noon) + self.assertIs(getattr(Period, 'night'), Period.night) + self.assertIs(Period['morning'], Period.morning) - def test_programmatic_function_type_from_subclass(self): - SummerMonth = IntEnum('SummerMonth', 'june july august') - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) - self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, - ) - for i, month in enumerate('june july august'.split(), 1): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) + def test_getattr_dunder(self): + Season = self.Season + self.assertTrue(getattr(Season, '__eq__')) - def test_programmatic_function_type_from_subclass_with_start(self): - SummerMonth = IntEnum('SummerMonth', 'june july august', start=40) - lst = list(SummerMonth) - self.assertEqual(len(lst), len(SummerMonth)) - self.assertEqual(len(SummerMonth), 3, SummerMonth) + def test_iteration_order(self): + class 
Season(Enum): + SUMMER = 2 + WINTER = 4 + AUTUMN = 3 + SPRING = 1 self.assertEqual( - [SummerMonth.june, SummerMonth.july, SummerMonth.august], - lst, + list(Season), + [Season.SUMMER, Season.WINTER, Season.AUTUMN, Season.SPRING], ) - for i, month in enumerate('june july august'.split(), 40): - e = SummerMonth(i) - self.assertEqual(e, i) - self.assertEqual(e.name, month) - self.assertIn(e, SummerMonth) - self.assertIs(type(e), SummerMonth) def test_subclassing(self): if isinstance(Name, Exception): @@ -1425,15 +1229,18 @@ class Color(Enum): red = 1 green = 2 blue = 3 + # with self.assertRaises(TypeError): class MoreColor(Color): cyan = 4 magenta = 5 yellow = 6 - with self.assertRaisesRegex(TypeError, "EvenMoreColor: cannot extend enumeration 'Color'"): + # + with self.assertRaisesRegex(TypeError, " cannot extend "): class EvenMoreColor(Color, IntEnum): chartruese = 7 - with self.assertRaisesRegex(TypeError, "Foo: cannot extend enumeration 'Color'"): + # + with self.assertRaisesRegex(TypeError, " cannot extend "): Color('Foo', ('pink', 'black')) def test_exclude_methods(self): @@ -1537,27 +1344,7 @@ class Color(Enum): with self.assertRaises(KeyError): Color['chartreuse'] - def test_new_repr(self): - class Color(Enum): - red = 1 - green = 2 - blue = 3 - def __repr__(self): - return "don't you just love shades of %s?" % self.name - self.assertEqual( - repr(Color.blue), - "don't you just love shades of blue?", - ) - - def test_inherited_repr(self): - class MyEnum(Enum): - def __repr__(self): - return "My name is %s." % self.name - class MyIntEnum(int, MyEnum): - this = 1 - that = 2 - theother = 3 - self.assertEqual(repr(MyIntEnum.that), "My name is that.") + # tests that need to be evalualted for moving def test_multiple_mixin_mro(self): class auto_enum(type(Enum)): @@ -1610,7 +1397,7 @@ def __new__(cls, *args): return self def __getnewargs__(self): return self._args - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1670,7 +1457,7 @@ def __new__(cls, *args): return self def __getnewargs_ex__(self): return self._args, {} - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1730,7 +1517,7 @@ def __new__(cls, *args): return self def __reduce__(self): return self.__class__, self._args - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1790,7 +1577,7 @@ def __new__(cls, *args): return self def __reduce_ex__(self, proto): return self.__class__, self._args - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1847,7 +1634,7 @@ def __new__(cls, *args): self._intname = name self._args = _args return self - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -1902,7 +1689,7 @@ def __new__(cls, *args): self._intname = name self._args = _args return self - @property + @bltns.property def __name__(self): return self._intname def __repr__(self): @@ -2091,6 +1878,7 @@ def test(self): class Test(Base): test = 1 self.assertEqual(Test.test.test, 'dynamic') + self.assertEqual(Test.test.value, 1) class Base2(Enum): @enum.property def flash(self): @@ -2098,6 +1886,7 @@ def flash(self): class Test(Base2): flash = 1 self.assertEqual(Test.flash.flash, 'flashy dynamic') + self.assertEqual(Test.flash.value, 1) def test_no_duplicates(self): class UniqueEnum(Enum): @@ -2134,7 +1923,7 @@ class Planet(Enum): def __init__(self, mass, radius): self.mass = mass # in kilograms self.radius = radius # in meters - 
@property + @enum.property def surface_gravity(self): # universal gravitational constant (m3 kg-1 s-2) G = 6.67300E-11 @@ -2204,90 +1993,7 @@ class LabelledList(LabelledIntEnum): self.assertEqual(LabelledList.unprocessed, 1) self.assertEqual(LabelledList(1), LabelledList.unprocessed) - def test_auto_number(self): - class Color(Enum): - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 1) - self.assertEqual(Color.blue.value, 2) - self.assertEqual(Color.green.value, 3) - - def test_auto_name(self): - class Color(Enum): - def _generate_next_value_(name, start, count, last): - return name - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 'blue') - self.assertEqual(Color.green.value, 'green') - - def test_auto_name_inherit(self): - class AutoNameEnum(Enum): - def _generate_next_value_(name, start, count, last): - return name - class Color(AutoNameEnum): - red = auto() - blue = auto() - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 'blue') - self.assertEqual(Color.green.value, 'green') - - def test_auto_garbage(self): - class Color(Enum): - red = 'red' - blue = auto() - self.assertEqual(Color.blue.value, 1) - - def test_auto_garbage_corrected(self): - class Color(Enum): - red = 'red' - blue = 2 - green = auto() - - self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) - self.assertEqual(Color.red.value, 'red') - self.assertEqual(Color.blue.value, 2) - self.assertEqual(Color.green.value, 3) - - def test_auto_order(self): - with self.assertRaises(TypeError): - class Color(Enum): - red = auto() - green = auto() - blue = auto() - def _generate_next_value_(name, start, count, last): - return name - - def test_auto_order_wierd(self): - weird_auto = auto() - weird_auto.value = 'pathological case' - class Color(Enum): - red = weird_auto - def _generate_next_value_(name, start, count, last): - return name - blue = auto() - self.assertEqual(list(Color), [Color.red, Color.blue]) - self.assertEqual(Color.red.value, 'pathological case') - self.assertEqual(Color.blue.value, 'blue') - - def test_duplicate_auto(self): - class Dupes(Enum): - first = primero = auto() - second = auto() - third = auto() - self.assertEqual([Dupes.first, Dupes.second, Dupes.third], list(Dupes)) - - def test_default_missing(self): + def test_default_missing_no_chained_exception(self): class Color(Enum): RED = 1 GREEN = 2 @@ -2299,7 +2005,7 @@ class Color(Enum): else: raise Exception('Exception not raised.') - def test_missing(self): + def test_missing_override(self): class Color(Enum): red = 1 green = 2 @@ -2363,9 +2069,9 @@ def __init__(self): class_1_ref = weakref.ref(Class1()) class_2_ref = weakref.ref(Class2()) # - # The exception raised by Enum creates a reference loop and thus - # Class2 instances will stick around until the next garbage collection - # cycle, unlike Class1. + # The exception raised by Enum used to create a reference loop and thus + # Class2 instances would stick around until the next garbage collection + # cycle, unlike Class1. Verify Class2 no longer does this. gc.collect() # For PyPy or other GCs. 
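
Several ``@property`` decorators in this file are switched to ``@enum.property`` above. As the added assertions show, an ``enum.property`` defined on a base enum can coexist with a member of the same name in a subclass; a minimal sketch of that behaviour, with illustrative class names:

    import enum
    from enum import Enum

    class Base(Enum):
        @enum.property
        def flash(self):
            return 'flashy dynamic'

    class Test(Base):                # Base has no members, so it may be subclassed
        flash = 1                    # member name may match the property name

    assert Test.flash.value == 1
    assert Test.flash.flash == 'flashy dynamic'
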
self.assertIs(class_1_ref(), None) self.assertIs(class_2_ref(), None) @@ -2396,11 +2102,12 @@ class Color(MaxMixin, Enum): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) self.assertEqual(Color.MAX, 3) - self.assertEqual(str(Color.BLUE), 'BLUE') + self.assertEqual(str(Color.BLUE), 'Color.BLUE') class Color(MaxMixin, StrMixin, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) @@ -2410,6 +2117,7 @@ class Color(StrMixin, MaxMixin, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 3) @@ -2419,6 +2127,7 @@ class CoolColor(StrMixin, SomeEnum, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolColor.RED.value, 1) self.assertEqual(CoolColor.GREEN.value, 2) self.assertEqual(CoolColor.BLUE.value, 3) @@ -2428,6 +2137,7 @@ class CoolerColor(StrMixin, AnotherEnum, Enum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolerColor.RED.value, 1) self.assertEqual(CoolerColor.GREEN.value, 2) self.assertEqual(CoolerColor.BLUE.value, 3) @@ -2438,6 +2148,7 @@ class CoolestColor(StrMixin, SomeEnum, AnotherEnum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(CoolestColor.RED.value, 1) self.assertEqual(CoolestColor.GREEN.value, 2) self.assertEqual(CoolestColor.BLUE.value, 3) @@ -2448,6 +2159,7 @@ class ConfusedColor(StrMixin, AnotherEnum, SomeEnum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(ConfusedColor.RED.value, 1) self.assertEqual(ConfusedColor.GREEN.value, 2) self.assertEqual(ConfusedColor.BLUE.value, 3) @@ -2458,6 +2170,7 @@ class ReformedColor(StrMixin, IntEnum, SomeEnum, AnotherEnum): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ # needed as of 3.11 self.assertEqual(ReformedColor.RED.value, 1) self.assertEqual(ReformedColor.GREEN.value, 2) self.assertEqual(ReformedColor.BLUE.value, 3) @@ -2490,11 +2203,12 @@ def __repr__(self): return hex(self) class MyIntEnum(HexMixin, MyInt, enum.Enum): - pass + __repr__ = HexMixin.__repr__ class Foo(MyIntEnum): TEST = 1 self.assertTrue(isinstance(Foo.TEST, MyInt)) + self.assertEqual(Foo._member_type_, MyInt) self.assertEqual(repr(Foo.TEST), "0x1") class Fee(MyIntEnum): @@ -2506,7 +2220,7 @@ def __new__(cls, value): return member self.assertEqual(Fee.TEST, 2) - def test_miltuple_mixin_with_common_data_type(self): + def test_multiple_mixin_with_common_data_type(self): class CaseInsensitiveStrEnum(str, Enum): @classmethod def _missing_(cls, value): @@ -2526,7 +2240,7 @@ def _missing_(cls, value): unknown._value_ = value cls._member_map_[value] = unknown return unknown - @property + @enum.property def valid(self): return self._valid # @@ -2570,7 +2284,7 @@ class GoodStrEnum(StrEnum): self.assertEqual('{}'.format(GoodStrEnum.one), '1') self.assertEqual(GoodStrEnum.one, str(GoodStrEnum.one)) self.assertEqual(GoodStrEnum.one, '{}'.format(GoodStrEnum.one)) - self.assertEqual(repr(GoodStrEnum.one), 'GoodStrEnum.one') + self.assertEqual(repr(GoodStrEnum.one), "") # class DumbMixin: def __str__(self): @@ -2579,6 +2293,7 @@ class DumbStrEnum(DumbMixin, 
StrEnum): five = '5' six = '6' seven = '7' + __str__ = DumbMixin.__str__ # needed as of 3.11 self.assertEqual(DumbStrEnum.seven, '7') self.assertEqual(str(DumbStrEnum.seven), "don't do this") # @@ -2620,74 +2335,6 @@ class ThirdFailedStrEnum(StrEnum): one = '1' two = b'2', 'ascii', 9 - @unittest.skipIf( - python_version >= (3, 12), - 'mixin-format now uses member instead of member.value', - ) - def test_custom_strenum_with_warning(self): - class CustomStrEnum(str, Enum): - pass - class OkayEnum(CustomStrEnum): - one = '1' - two = '2' - three = b'3', 'ascii' - four = b'4', 'latin1', 'strict' - self.assertEqual(OkayEnum.one, '1') - self.assertEqual(str(OkayEnum.one), 'one') - with self.assertWarns(DeprecationWarning): - self.assertEqual('{}'.format(OkayEnum.one), '1') - self.assertEqual(OkayEnum.one, '{}'.format(OkayEnum.one)) - self.assertEqual(repr(OkayEnum.one), 'OkayEnum.one') - # - class DumbMixin: - def __str__(self): - return "don't do this" - class DumbStrEnum(DumbMixin, CustomStrEnum): - five = '5' - six = '6' - seven = '7' - self.assertEqual(DumbStrEnum.seven, '7') - self.assertEqual(str(DumbStrEnum.seven), "don't do this") - # - class EnumMixin(Enum): - def hello(self): - print('hello from %s' % (self, )) - class HelloEnum(EnumMixin, CustomStrEnum): - eight = '8' - self.assertEqual(HelloEnum.eight, '8') - self.assertEqual(str(HelloEnum.eight), 'eight') - # - class GoodbyeMixin: - def goodbye(self): - print('%s wishes you a fond farewell') - class GoodbyeEnum(GoodbyeMixin, EnumMixin, CustomStrEnum): - nine = '9' - self.assertEqual(GoodbyeEnum.nine, '9') - self.assertEqual(str(GoodbyeEnum.nine), 'nine') - # - class FirstFailedStrEnum(CustomStrEnum): - one = 1 # this will become '1' - two = '2' - class SecondFailedStrEnum(CustomStrEnum): - one = '1' - two = 2, # this will become '2' - three = '3' - class ThirdFailedStrEnum(CustomStrEnum): - one = '1' - two = 2 # this will become '2' - with self.assertRaisesRegex(TypeError, '.encoding. must be str, not '): - class ThirdFailedStrEnum(CustomStrEnum): - one = '1' - two = b'2', sys.getdefaultencoding - with self.assertRaisesRegex(TypeError, '.errors. 
must be str, not '): - class ThirdFailedStrEnum(CustomStrEnum): - one = '1' - two = b'2', 'ascii', 9 - - @unittest.skipIf( - python_version < (3, 12), - 'mixin-format currently uses member.value', - ) def test_custom_strenum(self): class CustomStrEnum(str, Enum): pass @@ -2697,9 +2344,9 @@ class OkayEnum(CustomStrEnum): three = b'3', 'ascii' four = b'4', 'latin1', 'strict' self.assertEqual(OkayEnum.one, '1') - self.assertEqual(str(OkayEnum.one), 'one') - self.assertEqual('{}'.format(OkayEnum.one), 'one') - self.assertEqual(repr(OkayEnum.one), 'OkayEnum.one') + self.assertEqual(str(OkayEnum.one), 'OkayEnum.one') + self.assertEqual('{}'.format(OkayEnum.one), 'OkayEnum.one') + self.assertEqual(repr(OkayEnum.one), "") # class DumbMixin: def __str__(self): @@ -2708,6 +2355,7 @@ class DumbStrEnum(DumbMixin, CustomStrEnum): five = '5' six = '6' seven = '7' + __str__ = DumbMixin.__str__ # needed as of 3.11 self.assertEqual(DumbStrEnum.seven, '7') self.assertEqual(str(DumbStrEnum.seven), "don't do this") # @@ -2717,7 +2365,7 @@ def hello(self): class HelloEnum(EnumMixin, CustomStrEnum): eight = '8' self.assertEqual(HelloEnum.eight, '8') - self.assertEqual(str(HelloEnum.eight), 'eight') + self.assertEqual(str(HelloEnum.eight), 'HelloEnum.eight') # class GoodbyeMixin: def goodbye(self): @@ -2725,7 +2373,7 @@ def goodbye(self): class GoodbyeEnum(GoodbyeMixin, EnumMixin, CustomStrEnum): nine = '9' self.assertEqual(GoodbyeEnum.nine, '9') - self.assertEqual(str(GoodbyeEnum.nine), 'nine') + self.assertEqual(str(GoodbyeEnum.nine), 'GoodbyeEnum.nine') # class FirstFailedStrEnum(CustomStrEnum): one = 1 # this will become '1' @@ -2771,21 +2419,6 @@ def __repr__(self): code = 'An$(5,1)', 2 description = 'Bn$', 3 - @unittest.skipUnless( - python_version == (3, 9), - 'private variables are now normal attributes', - ) - def test_warning_for_private_variables(self): - with self.assertWarns(DeprecationWarning): - class Private(Enum): - __corporal = 'Radar' - self.assertEqual(Private._Private__corporal.value, 'Radar') - try: - with self.assertWarns(DeprecationWarning): - class Private(Enum): - __major_ = 'Hoolihan' - except ValueError: - pass def test_private_variable_is_normal_attribute(self): class Private(Enum): @@ -2794,35 +2427,13 @@ class Private(Enum): self.assertEqual(Private._Private__corporal, 'Radar') self.assertEqual(Private._Private__major_, 'Hoolihan') - @unittest.skipUnless( - python_version < (3, 12), - 'member-member access now raises an exception', - ) - def test_warning_for_member_from_member_access(self): - with self.assertWarns(DeprecationWarning): - class Di(Enum): - YES = 1 - NO = 0 - nope = Di.YES.NO - self.assertIs(Di.NO, nope) - - @unittest.skipUnless( - python_version >= (3, 12), - 'member-member access currently issues a warning', - ) def test_exception_for_member_from_member_access(self): - with self.assertRaisesRegex(AttributeError, "Di: no instance attribute .NO."): + with self.assertRaisesRegex(AttributeError, " member has no attribute .NO."): class Di(Enum): YES = 1 NO = 0 nope = Di.YES.NO - def test_strenum_auto(self): - class Strings(StrEnum): - ONE = auto() - TWO = auto() - self.assertEqual([Strings.ONE, Strings.TWO], ['one', 'two']) - def test_dynamic_members_with_static_methods(self): # @@ -2839,7 +2450,7 @@ def upper(self): self.assertEqual(Foo.FOO_CAT.value, 'aloof') self.assertEqual(Foo.FOO_HORSE.upper(), 'BIG') # - with self.assertRaisesRegex(TypeError, "'FOO_CAT' already defined as: 'aloof'"): + with self.assertRaisesRegex(TypeError, "'FOO_CAT' already defined as 'aloof'"): 
class FooBar(Enum): vars().update({ k: v @@ -2851,8 +2462,42 @@ class FooBar(Enum): def upper(self): return self.value.upper() + def test_repr_with_dataclass(self): + "ensure dataclass-mixin has correct repr()" + from dataclasses import dataclass + @dataclass + class Foo: + __qualname__ = 'Foo' + a: int = 0 + class Entries(Foo, Enum): + ENTRY1 = Foo(1) + self.assertEqual(repr(Entries.ENTRY1), '') + + def test_repr_with_non_data_type_mixin(self): + # non-data_type is a mixin that doesn't define __new__ + class Foo: + def __init__(self, a): + self.a = a + def __repr__(self): + return f'Foo(a={self.a!r})' + class Entries(Foo, Enum): + ENTRY1 = Foo(1) + + self.assertEqual(repr(Entries.ENTRY1), '') + + def test_value_backup_assign(self): + # check that enum will add missing values when custom __new__ does not + class Some(Enum): + def __new__(cls, val): + return object.__new__(cls) + x = 1 + y = 2 + self.assertEqual(Some.x.value, 1) + self.assertEqual(Some.y.value, 2) + class TestOrder(unittest.TestCase): + "test usage of the `_order_` attribute" def test_same_members(self): class Color(Enum): @@ -2914,7 +2559,7 @@ class Color(Enum): verde = green -class TestFlag(unittest.TestCase): +class OldTestFlag(unittest.TestCase): """Tests of the Flags.""" class Perm(Flag): @@ -2934,67 +2579,8 @@ class Color(Flag): GREEN = 2 BLUE = 4 PURPLE = RED|BLUE - WHITE = RED|GREEN|BLUE - BLANCO = RED|GREEN|BLUE - - def test_str(self): - Perm = self.Perm - self.assertEqual(str(Perm.R), 'R') - self.assertEqual(str(Perm.W), 'W') - self.assertEqual(str(Perm.X), 'X') - self.assertEqual(str(Perm.R | Perm.W), 'R|W') - self.assertEqual(str(Perm.R | Perm.W | Perm.X), 'R|W|X') - self.assertEqual(str(Perm(0)), 'Perm(0)') - self.assertEqual(str(~Perm.R), 'W|X') - self.assertEqual(str(~Perm.W), 'R|X') - self.assertEqual(str(~Perm.X), 'R|W') - self.assertEqual(str(~(Perm.R | Perm.W)), 'X') - self.assertEqual(str(~(Perm.R | Perm.W | Perm.X)), 'Perm(0)') - self.assertEqual(str(Perm(~0)), 'R|W|X') - - Open = self.Open - self.assertEqual(str(Open.RO), 'RO') - self.assertEqual(str(Open.WO), 'WO') - self.assertEqual(str(Open.AC), 'AC') - self.assertEqual(str(Open.RO | Open.CE), 'CE') - self.assertEqual(str(Open.WO | Open.CE), 'WO|CE') - self.assertEqual(str(~Open.RO), 'WO|RW|CE') - self.assertEqual(str(~Open.WO), 'RW|CE') - self.assertEqual(str(~Open.AC), 'CE') - self.assertEqual(str(~(Open.RO | Open.CE)), 'AC') - self.assertEqual(str(~(Open.WO | Open.CE)), 'RW') - - def test_repr(self): - Perm = self.Perm - self.assertEqual(repr(Perm.R), 'Perm.R') - self.assertEqual(repr(Perm.W), 'Perm.W') - self.assertEqual(repr(Perm.X), 'Perm.X') - self.assertEqual(repr(Perm.R | Perm.W), 'Perm.R|Perm.W') - self.assertEqual(repr(Perm.R | Perm.W | Perm.X), 'Perm.R|Perm.W|Perm.X') - self.assertEqual(repr(Perm(0)), '0x0') - self.assertEqual(repr(~Perm.R), 'Perm.W|Perm.X') - self.assertEqual(repr(~Perm.W), 'Perm.R|Perm.X') - self.assertEqual(repr(~Perm.X), 'Perm.R|Perm.W') - self.assertEqual(repr(~(Perm.R | Perm.W)), 'Perm.X') - self.assertEqual(repr(~(Perm.R | Perm.W | Perm.X)), '0x0') - self.assertEqual(repr(Perm(~0)), 'Perm.R|Perm.W|Perm.X') - - Open = self.Open - self.assertEqual(repr(Open.RO), 'Open.RO') - self.assertEqual(repr(Open.WO), 'Open.WO') - self.assertEqual(repr(Open.AC), 'Open.AC') - self.assertEqual(repr(Open.RO | Open.CE), 'Open.CE') - self.assertEqual(repr(Open.WO | Open.CE), 'Open.WO|Open.CE') - self.assertEqual(repr(~Open.RO), 'Open.WO|Open.RW|Open.CE') - self.assertEqual(repr(~Open.WO), 'Open.RW|Open.CE') - 
self.assertEqual(repr(~Open.AC), 'Open.CE') - self.assertEqual(repr(~(Open.RO | Open.CE)), 'Open.AC') - self.assertEqual(repr(~(Open.WO | Open.CE)), 'Open.RW') - - def test_format(self): - Perm = self.Perm - self.assertEqual(format(Perm.R, ''), 'R') - self.assertEqual(format(Perm.R | Perm.X, ''), 'R|X') + WHITE = RED|GREEN|BLUE + BLANCO = RED|GREEN|BLUE def test_or(self): Perm = self.Perm @@ -3088,7 +2674,7 @@ class Bizarre(Flag, boundary=KEEP): c = 4 d = 6 # - self.assertRaisesRegex(ValueError, 'invalid value: 7', Iron, 7) + self.assertRaisesRegex(ValueError, 'invalid value 7', Iron, 7) # self.assertIs(Water(7), Water.ONE|Water.TWO) self.assertIs(Water(~9), Water.TWO) @@ -3297,7 +2883,7 @@ class Color(Flag): self.assertEqual(Color.green.value, 4) def test_auto_number_garbage(self): - with self.assertRaisesRegex(TypeError, 'Invalid Flag value: .not an int.'): + with self.assertRaisesRegex(TypeError, 'invalid flag value .not an int.'): class Color(Flag): red = 'not an int' blue = auto() @@ -3332,11 +2918,12 @@ class Color(AllMixin, Flag): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) self.assertEqual(Color.ALL.value, 7) - self.assertEqual(str(Color.BLUE), 'BLUE') + self.assertEqual(str(Color.BLUE), 'Color.BLUE') class Color(AllMixin, StrMixin, Flag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3346,6 +2933,7 @@ class Color(StrMixin, AllMixin, Flag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3426,21 +3014,8 @@ class NeverEnum(WhereEnum): self.assertFalse(NeverEnum.__dict__.get('_test1', False)) self.assertFalse(NeverEnum.__dict__.get('_test2', False)) - def test_default_missing(self): - with self.assertRaisesRegex( - ValueError, - "'RED' is not a valid TestFlag.Color", - ) as ctx: - self.Color('RED') - self.assertIs(ctx.exception.__context__, None) - - P = Flag('P', 'X Y') - with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: - P('X') - self.assertIs(ctx.exception.__context__, None) - -class TestIntFlag(unittest.TestCase): +class OldTestIntFlag(unittest.TestCase): """Tests of the IntFlags.""" class Perm(IntFlag): @@ -3485,73 +3060,6 @@ def test_type(self): self.assertTrue(isinstance(Open.WO | Open.RW, Open)) self.assertEqual(Open.WO | Open.RW, 3) - - def test_str(self): - Perm = self.Perm - self.assertEqual(str(Perm.R), 'R') - self.assertEqual(str(Perm.W), 'W') - self.assertEqual(str(Perm.X), 'X') - self.assertEqual(str(Perm.R | Perm.W), 'R|W') - self.assertEqual(str(Perm.R | Perm.W | Perm.X), 'R|W|X') - self.assertEqual(str(Perm.R | 8), '12') - self.assertEqual(str(Perm(0)), 'Perm(0)') - self.assertEqual(str(Perm(8)), '8') - self.assertEqual(str(~Perm.R), 'W|X') - self.assertEqual(str(~Perm.W), 'R|X') - self.assertEqual(str(~Perm.X), 'R|W') - self.assertEqual(str(~(Perm.R | Perm.W)), 'X') - self.assertEqual(str(~(Perm.R | Perm.W | Perm.X)), 'Perm(0)') - self.assertEqual(str(~(Perm.R | 8)), '-13') - self.assertEqual(str(Perm(~0)), 'R|W|X') - self.assertEqual(str(Perm(~8)), '-9') - - Open = self.Open - self.assertEqual(str(Open.RO), 'RO') - self.assertEqual(str(Open.WO), 'WO') - self.assertEqual(str(Open.AC), 'AC') - self.assertEqual(str(Open.RO | Open.CE), 'CE') - self.assertEqual(str(Open.WO | Open.CE), 'WO|CE') - self.assertEqual(str(Open(4)), '4') - 
self.assertEqual(str(~Open.RO), 'WO|RW|CE') - self.assertEqual(str(~Open.WO), 'RW|CE') - self.assertEqual(str(~Open.AC), 'CE') - self.assertEqual(str(~(Open.RO | Open.CE)), 'AC') - self.assertEqual(str(~(Open.WO | Open.CE)), 'RW') - self.assertEqual(str(Open(~4)), '-5') - - def test_repr(self): - Perm = self.Perm - self.assertEqual(repr(Perm.R), 'Perm.R') - self.assertEqual(repr(Perm.W), 'Perm.W') - self.assertEqual(repr(Perm.X), 'Perm.X') - self.assertEqual(repr(Perm.R | Perm.W), 'Perm.R|Perm.W') - self.assertEqual(repr(Perm.R | Perm.W | Perm.X), 'Perm.R|Perm.W|Perm.X') - self.assertEqual(repr(Perm.R | 8), '12') - self.assertEqual(repr(Perm(0)), '0x0') - self.assertEqual(repr(Perm(8)), '8') - self.assertEqual(repr(~Perm.R), 'Perm.W|Perm.X') - self.assertEqual(repr(~Perm.W), 'Perm.R|Perm.X') - self.assertEqual(repr(~Perm.X), 'Perm.R|Perm.W') - self.assertEqual(repr(~(Perm.R | Perm.W)), 'Perm.X') - self.assertEqual(repr(~(Perm.R | Perm.W | Perm.X)), '0x0') - self.assertEqual(repr(~(Perm.R | 8)), '-13') - self.assertEqual(repr(Perm(~0)), 'Perm.R|Perm.W|Perm.X') - self.assertEqual(repr(Perm(~8)), '-9') - - Open = self.Open - self.assertEqual(repr(Open.RO), 'Open.RO') - self.assertEqual(repr(Open.WO), 'Open.WO') - self.assertEqual(repr(Open.AC), 'Open.AC') - self.assertEqual(repr(Open.RO | Open.CE), 'Open.CE') - self.assertEqual(repr(Open.WO | Open.CE), 'Open.WO|Open.CE') - self.assertEqual(repr(Open(4)), '4') - self.assertEqual(repr(~Open.RO), 'Open.WO|Open.RW|Open.CE') - self.assertEqual(repr(~Open.WO), 'Open.RW|Open.CE') - self.assertEqual(repr(~Open.AC), 'Open.CE') - self.assertEqual(repr(~(Open.RO | Open.CE)), 'Open.AC') - self.assertEqual(repr(~(Open.WO | Open.CE)), 'Open.RW') - self.assertEqual(repr(Open(~4)), '-5') - def test_global_repr_keep(self): self.assertEqual( repr(HeadlightsK(0)), @@ -3559,11 +3067,11 @@ def test_global_repr_keep(self): ) self.assertEqual( repr(HeadlightsK(2**0 + 2**2 + 2**3)), - '%(m)s.LOW_BEAM_K|%(m)s.FOG_K|0x8' % {'m': SHORT_MODULE}, + '%(m)s.LOW_BEAM_K|%(m)s.FOG_K|8' % {'m': SHORT_MODULE}, ) self.assertEqual( repr(HeadlightsK(2**3)), - '%(m)s.HeadlightsK(0x8)' % {'m': SHORT_MODULE}, + '%(m)s.HeadlightsK(8)' % {'m': SHORT_MODULE}, ) def test_global_repr_conform1(self): @@ -3705,7 +3213,7 @@ class Bizarre(IntFlag, boundary=KEEP): c = 4 d = 6 # - self.assertRaisesRegex(ValueError, 'invalid value: 5', Iron, 5) + self.assertRaisesRegex(ValueError, 'invalid value 5', Iron, 5) # self.assertIs(Water(7), Water.ONE|Water.TWO) self.assertIs(Water(~9), Water.TWO) @@ -3942,11 +3450,12 @@ class Color(AllMixin, IntFlag): self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) self.assertEqual(Color.ALL.value, 7) - self.assertEqual(str(Color.BLUE), 'BLUE') + self.assertEqual(str(Color.BLUE), '4') class Color(AllMixin, StrMixin, IntFlag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -3956,6 +3465,7 @@ class Color(StrMixin, AllMixin, IntFlag): RED = auto() GREEN = auto() BLUE = auto() + __str__ = StrMixin.__str__ self.assertEqual(Color.RED.value, 1) self.assertEqual(Color.GREEN.value, 2) self.assertEqual(Color.BLUE.value, 4) @@ -4000,19 +3510,6 @@ def cycle_enum(): 'at least one thread failed while creating composite members') self.assertEqual(256, len(seen), 'too many composite members created') - def test_default_missing(self): - with self.assertRaisesRegex( - ValueError, - "'RED' is not a valid 
TestIntFlag.Color", - ) as ctx: - self.Color('RED') - self.assertIs(ctx.exception.__context__, None) - - P = IntFlag('P', 'X Y') - with self.assertRaisesRegex(ValueError, "'X' is not a valid P") as ctx: - P('X') - self.assertIs(ctx.exception.__context__, None) - class TestEmptyAndNonLatinStrings(unittest.TestCase): @@ -4229,6 +3726,89 @@ def test_is_private(self): for name in self.sunder_names + self.dunder_names + self.random_names: self.assertFalse(enum._is_private('MyEnum', name), '%r is a private name?') + def test_auto_number(self): + class Color(Enum): + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 1) + self.assertEqual(Color.blue.value, 2) + self.assertEqual(Color.green.value, 3) + + def test_auto_name(self): + class Color(Enum): + def _generate_next_value_(name, start, count, last): + return name + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 'blue') + self.assertEqual(Color.green.value, 'green') + + def test_auto_name_inherit(self): + class AutoNameEnum(Enum): + def _generate_next_value_(name, start, count, last): + return name + class Color(AutoNameEnum): + red = auto() + blue = auto() + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 'blue') + self.assertEqual(Color.green.value, 'green') + + def test_auto_garbage(self): + class Color(Enum): + red = 'red' + blue = auto() + self.assertEqual(Color.blue.value, 1) + + def test_auto_garbage_corrected(self): + class Color(Enum): + red = 'red' + blue = 2 + green = auto() + + self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) + self.assertEqual(Color.red.value, 'red') + self.assertEqual(Color.blue.value, 2) + self.assertEqual(Color.green.value, 3) + + def test_auto_order(self): + with self.assertRaises(TypeError): + class Color(Enum): + red = auto() + green = auto() + blue = auto() + def _generate_next_value_(name, start, count, last): + return name + + def test_auto_order_wierd(self): + weird_auto = auto() + weird_auto.value = 'pathological case' + class Color(Enum): + red = weird_auto + def _generate_next_value_(name, start, count, last): + return name + blue = auto() + self.assertEqual(list(Color), [Color.red, Color.blue]) + self.assertEqual(Color.red.value, 'pathological case') + self.assertEqual(Color.blue.value, 'blue') + + def test_duplicate_auto(self): + class Dupes(Enum): + first = primero = auto() + second = auto() + third = auto() + self.assertEqual([Dupes.first, Dupes.second, Dupes.third], list(Dupes)) + class TestEnumTypeSubclassing(unittest.TestCase): pass @@ -4238,7 +3818,35 @@ class TestEnumTypeSubclassing(unittest.TestCase): class Color(enum.Enum) | Color(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None) |\x20\x20 - | An enumeration. + | A collection of name/value pairs. 
+ |\x20\x20 + | Access them by: + |\x20\x20 + | - attribute access:: + |\x20\x20 + | >>> Color.CYAN + | + |\x20\x20 + | - value lookup: + |\x20\x20 + | >>> Color(1) + | + |\x20\x20 + | - name lookup: + |\x20\x20 + | >>> Color['CYAN'] + | + |\x20\x20 + | Enumerations can be iterated over, and know how many members they have: + |\x20\x20 + | >>> len(Color) + | 3 + |\x20\x20 + | >>> list(Color) + | [, , ] + |\x20\x20 + | Methods can be added to enumerations, and members can have their own + | attributes -- see the documentation for details. |\x20\x20 | Method resolution order: | Color @@ -4247,11 +3855,11 @@ class Color(enum.Enum) |\x20\x20 | Data and other attributes defined here: |\x20\x20 - | blue = Color.blue + | CYAN = |\x20\x20 - | green = Color.green + | MAGENTA = |\x20\x20 - | red = Color.red + | YELLOW = |\x20\x20 | ---------------------------------------------------------------------- | Data descriptors inherited from enum.Enum: @@ -4263,6 +3871,25 @@ class Color(enum.Enum) | The value of the Enum member. |\x20\x20 | ---------------------------------------------------------------------- + | Methods inherited from enum.EnumType: + |\x20\x20 + | __contains__(member) from enum.EnumType + | Return True if member is a member of this enum + | raises TypeError if member is not an enum member + |\x20\x20\x20\x20\x20\x20 + | note: in 3.12 TypeError will no longer be raised, and True will also be + | returned if member is the value of a member in this enum + |\x20\x20 + | __getitem__(name) from enum.EnumType + | Return the member matching `name`. + |\x20\x20 + | __iter__() from enum.EnumType + | Return members in definition order. + |\x20\x20 + | __len__() from enum.EnumType + | Return the number of members (no aliases) + |\x20\x20 + | ---------------------------------------------------------------------- | Readonly properties inherited from enum.EnumType: |\x20\x20 | __members__ @@ -4284,11 +3911,11 @@ class Color(enum.Enum) |\x20\x20 | Data and other attributes defined here: |\x20\x20 - | blue = Color.blue + | YELLOW = |\x20\x20 - | green = Color.green + | MAGENTA = |\x20\x20 - | red = Color.red + | CYAN = |\x20\x20 | ---------------------------------------------------------------------- | Data descriptors inherited from enum.Enum: @@ -4307,9 +3934,9 @@ class TestStdLib(unittest.TestCase): maxDiff = None class Color(Enum): - red = 1 - green = 2 - blue = 3 + CYAN = 1 + MAGENTA = 2 + YELLOW = 3 def test_pydoc(self): # indirectly test __objclass__ @@ -4321,24 +3948,34 @@ def test_pydoc(self): helper = pydoc.Helper(output=output) helper(self.Color) result = output.getvalue().strip() - self.assertEqual(result, expected_text) + self.assertEqual(result, expected_text, result) def test_inspect_getmembers(self): values = dict(( ('__class__', EnumType), - ('__doc__', 'An enumeration.'), + ('__doc__', '...'), ('__members__', self.Color.__members__), ('__module__', __name__), - ('blue', self.Color.blue), - ('green', self.Color.green), + ('YELLOW', self.Color.YELLOW), + ('MAGENTA', self.Color.MAGENTA), + ('CYAN', self.Color.CYAN), ('name', Enum.__dict__['name']), - ('red', self.Color.red), ('value', Enum.__dict__['value']), + ('__len__', self.Color.__len__), + ('__contains__', self.Color.__contains__), + ('__name__', 'Color'), + ('__getitem__', self.Color.__getitem__), + ('__qualname__', 'TestStdLib.Color'), + ('__init_subclass__', getattr(self.Color, '__init_subclass__')), + ('__iter__', self.Color.__iter__), )) result = dict(inspect.getmembers(self.Color)) self.assertEqual(set(values.keys()), 
set(result.keys())) failed = False for k in values.keys(): + if k == '__doc__': + # __doc__ is huge, not comparing + continue if result[k] != values[k]: print() print('\n%s\n key: %s\n result: %s\nexpected: %s\n%s\n' % @@ -4353,23 +3990,42 @@ def test_inspect_classify_class_attrs(self): values = [ Attribute(name='__class__', kind='data', defining_class=object, object=EnumType), + Attribute(name='__contains__', kind='method', + defining_class=EnumType, object=self.Color.__contains__), Attribute(name='__doc__', kind='data', - defining_class=self.Color, object='An enumeration.'), + defining_class=self.Color, object='...'), + Attribute(name='__getitem__', kind='method', + defining_class=EnumType, object=self.Color.__getitem__), + Attribute(name='__iter__', kind='method', + defining_class=EnumType, object=self.Color.__iter__), + Attribute(name='__init_subclass__', kind='class method', + defining_class=object, object=getattr(self.Color, '__init_subclass__')), + Attribute(name='__len__', kind='method', + defining_class=EnumType, object=self.Color.__len__), Attribute(name='__members__', kind='property', defining_class=EnumType, object=EnumType.__members__), Attribute(name='__module__', kind='data', defining_class=self.Color, object=__name__), - Attribute(name='blue', kind='data', - defining_class=self.Color, object=self.Color.blue), - Attribute(name='green', kind='data', - defining_class=self.Color, object=self.Color.green), - Attribute(name='red', kind='data', - defining_class=self.Color, object=self.Color.red), + Attribute(name='__name__', kind='data', + defining_class=self.Color, object='Color'), + Attribute(name='__qualname__', kind='data', + defining_class=self.Color, object='TestStdLib.Color'), + Attribute(name='YELLOW', kind='data', + defining_class=self.Color, object=self.Color.YELLOW), + Attribute(name='MAGENTA', kind='data', + defining_class=self.Color, object=self.Color.MAGENTA), + Attribute(name='CYAN', kind='data', + defining_class=self.Color, object=self.Color.CYAN), Attribute(name='name', kind='data', defining_class=Enum, object=Enum.__dict__['name']), Attribute(name='value', kind='data', defining_class=Enum, object=Enum.__dict__['value']), ] + for v in values: + try: + v.name + except AttributeError: + print(v) values.sort(key=lambda item: item.name) result = list(inspect.classify_class_attrs(self.Color)) result.sort(key=lambda item: item.name) @@ -4379,7 +4035,15 @@ def test_inspect_classify_class_attrs(self): ) failed = False for v, r in zip(values, result): - if r != v: + if r.name in ('__init_subclass__', '__doc__'): + # not sure how to make the __init_subclass_ Attributes match + # so as long as there is one, call it good + # __doc__ is too big to check exactly, so treat the same as __init_subclass__ + for name in ('name','kind','defining_class'): + if getattr(v, name) != getattr(r, name): + print('\n%s\n%s\n%s\n%s\n' % ('=' * 75, r, v, '=' * 75), sep='') + failed = True + elif r != v: print('\n%s\n%s\n%s\n%s\n' % ('=' * 75, r, v, '=' * 75), sep='') failed = True if failed: @@ -4388,15 +4052,15 @@ def test_inspect_classify_class_attrs(self): def test_test_simple_enum(self): @_simple_enum(Enum) class SimpleColor: - RED = 1 - GREEN = 2 - BLUE = 3 + CYAN = 1 + MAGENTA = 2 + YELLOW = 3 class CheckedColor(Enum): - RED = 1 - GREEN = 2 - BLUE = 3 + CYAN = 1 + MAGENTA = 2 + YELLOW = 3 self.assertTrue(_test_simple_enum(CheckedColor, SimpleColor) is None) - SimpleColor.GREEN._value_ = 9 + SimpleColor.MAGENTA._value_ = 9 self.assertRaisesRegex( TypeError, "enum mismatch", 
_test_simple_enum, CheckedColor, SimpleColor, @@ -4422,9 +4086,165 @@ class Missing: class MiscTestCase(unittest.TestCase): + def test__all__(self): support.check__all__(self, enum, not_exported={'bin', 'show_flag_values'}) + def test_doc_1(self): + class Single(Enum): + ONE = 1 + self.assertEqual( + Single.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Single.ONE + + + - value lookup: + + >>> Single(1) + + + - name lookup: + + >>> Single['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Single) + 1 + + >>> list(Single) + [] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + + def test_doc_2(self): + class Double(Enum): + ONE = 1 + TWO = 2 + self.assertEqual( + Double.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Double.ONE + + + - value lookup: + + >>> Double(1) + + + - name lookup: + + >>> Double['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Double) + 2 + + >>> list(Double) + [, ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + + + def test_doc_1(self): + class Triple(Enum): + ONE = 1 + TWO = 2 + THREE = 3 + self.assertEqual( + Triple.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Triple.ONE + + + - value lookup: + + >>> Triple(1) + + + - name lookup: + + >>> Triple['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Triple) + 3 + + >>> list(Triple) + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + + def test_doc_1(self): + class Quadruple(Enum): + ONE = 1 + TWO = 2 + THREE = 3 + FOUR = 4 + self.assertEqual( + Quadruple.__doc__, + dedent("""\ + A collection of name/value pairs. + + Access them by: + + - attribute access:: + + >>> Quadruple.ONE + + + - value lookup: + + >>> Quadruple(1) + + + - name lookup: + + >>> Quadruple['ONE'] + + + Enumerations can be iterated over, and know how many members they have: + + >>> len(Quadruple) + 4 + + >>> list(Quadruple)[:3] + [, , ] + + Methods can be added to enumerations, and members can have their own + attributes -- see the documentation for details. + """)) + # These are unordered here on purpose to ensure that declaration order # makes no difference. @@ -4442,6 +4262,10 @@ def test__all__(self): CONVERT_STRING_TEST_NAME_E = 5 CONVERT_STRING_TEST_NAME_F = 5 +# global names for StrEnum._convert_ test +CONVERT_STR_TEST_2 = 'goodbye' +CONVERT_STR_TEST_1 = 'hello' + # We also need values that cannot be compared: UNCOMPARABLE_A = 5 UNCOMPARABLE_C = (9, 1) # naming order is broken on purpose @@ -4453,32 +4277,40 @@ def test__all__(self): class _ModuleWrapper: """We use this class as a namespace for swapping modules.""" - def __init__(self, module): self.__dict__.update(module.__dict__) -class TestIntEnumConvert(unittest.TestCase): +class TestConvert(unittest.TestCase): + def tearDown(self): + # Reset the module-level test variables to their original integer + # values, otherwise the already created enum values get converted + # instead. 
+ g = globals() + for suffix in ['A', 'B', 'C', 'D', 'E', 'F']: + g['CONVERT_TEST_NAME_%s' % suffix] = 5 + g['CONVERT_STRING_TEST_NAME_%s' % suffix] = 5 + for suffix, value in (('A', 5), ('B', (9, 1)), ('C', 'value')): + g['UNCOMPARABLE_%s' % suffix] = value + for suffix, value in (('A', 2j), ('B', 3j), ('C', 1j)): + g['COMPLEX_%s' % suffix] = value + for suffix, value in (('1', 'hello'), ('2', 'goodbye')): + g['CONVERT_STR_TEST_%s' % suffix] = value + def test_convert_value_lookup_priority(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # We don't want the reverse lookup value to vary when there are # multiple possible names for a given value. It should always # report the first lexigraphical name in that case. self.assertEqual(test_type(5).name, 'CONVERT_TEST_NAME_A') - def test_convert(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_TEST_')) + def test_convert_int(self): + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_TEST_')) # Ensure that test_type has all of the desired names and values. self.assertEqual(test_type.CONVERT_TEST_NAME_F, test_type.CONVERT_TEST_NAME_A) @@ -4487,43 +4319,57 @@ def test_convert(self): self.assertEqual(test_type.CONVERT_TEST_NAME_D, 5) self.assertEqual(test_type.CONVERT_TEST_NAME_E, 5) # Ensure that test_type only picked up names matching the filter. - self.assertEqual([name for name in dir(test_type) - if name[0:2] not in ('CO', '__') - and name not in dir(IntEnum)], - [], msg='Names other than CONVERT_TEST_* found.') + int_dir = dir(int) + [ + 'CONVERT_TEST_NAME_A', 'CONVERT_TEST_NAME_B', 'CONVERT_TEST_NAME_C', + 'CONVERT_TEST_NAME_D', 'CONVERT_TEST_NAME_E', 'CONVERT_TEST_NAME_F', + ] + self.assertEqual( + [name for name in dir(test_type) if name not in int_dir], + [], + msg='Names other than CONVERT_TEST_* found.', + ) def test_convert_uncomparable(self): - # We swap a module to some other object with `__dict__` - # because otherwise refleak is created. - # `_convert_` uses a module side effect that does this. 
See 30472 - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('UNCOMPARABLE_')) - + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('UNCOMPARABLE_')) # Should be ordered by `name` only: self.assertEqual( list(uncomp), [uncomp.UNCOMPARABLE_A, uncomp.UNCOMPARABLE_B, uncomp.UNCOMPARABLE_C], - ) + ) def test_convert_complex(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - uncomp = enum.Enum._convert_( - 'Uncomparable', - MODULE, - filter=lambda x: x.startswith('COMPLEX_')) - + uncomp = enum.Enum._convert_( + 'Uncomparable', + MODULE, + filter=lambda x: x.startswith('COMPLEX_')) # Should be ordered by `name` only: self.assertEqual( list(uncomp), [uncomp.COMPLEX_A, uncomp.COMPLEX_B, uncomp.COMPLEX_C], - ) + ) + + def test_convert_str(self): + test_type = enum.StrEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STR_'), + as_global=True) + # Ensure that test_type has all of the desired names and values. + self.assertEqual(test_type.CONVERT_STR_TEST_1, 'hello') + self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') + # Ensure that test_type only picked up names matching the filter. + str_dir = dir(str) + ['CONVERT_STR_TEST_1', 'CONVERT_STR_TEST_2'] + self.assertEqual( + [name for name in dir(test_type) if name not in str_dir], + [], + msg='Names other than CONVERT_STR_* found.', + ) + self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) + self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') + self.assertEqual(format(test_type.CONVERT_STR_TEST_1), 'hello') def test_convert_raise(self): with self.assertRaises(AttributeError): @@ -4533,50 +4379,58 @@ def test_convert_raise(self): filter=lambda x: x.startswith('CONVERT_TEST_')) def test_convert_repr_and_str(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.IntEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STRING_TEST_')) + test_type = enum.IntEnum._convert_( + 'UnittestConvert', + MODULE, + filter=lambda x: x.startswith('CONVERT_STRING_TEST_'), + as_global=True) self.assertEqual(repr(test_type.CONVERT_STRING_TEST_NAME_A), '%s.CONVERT_STRING_TEST_NAME_A' % SHORT_MODULE) - self.assertEqual(str(test_type.CONVERT_STRING_TEST_NAME_A), 'CONVERT_STRING_TEST_NAME_A') + self.assertEqual(str(test_type.CONVERT_STRING_TEST_NAME_A), '5') self.assertEqual(format(test_type.CONVERT_STRING_TEST_NAME_A), '5') -# global names for StrEnum._convert_ test -CONVERT_STR_TEST_2 = 'goodbye' -CONVERT_STR_TEST_1 = 'hello' -class TestStrEnumConvert(unittest.TestCase): - def test_convert(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.StrEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STR_')) - # Ensure that test_type has all of the desired names and values. - self.assertEqual(test_type.CONVERT_STR_TEST_1, 'hello') - self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') - # Ensure that test_type only picked up names matching the filter. 
- self.assertEqual([name for name in dir(test_type) - if name[0:2] not in ('CO', '__') - and name not in dir(StrEnum)], - [], msg='Names other than CONVERT_STR_* found.') +# helpers - def test_convert_repr_and_str(self): - with support.swap_item( - sys.modules, MODULE, _ModuleWrapper(sys.modules[MODULE]), - ): - test_type = enum.StrEnum._convert_( - 'UnittestConvert', - MODULE, - filter=lambda x: x.startswith('CONVERT_STR_')) - self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) - self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') - self.assertEqual(format(test_type.CONVERT_STR_TEST_1), 'hello') +def enum_dir(cls): + # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ + if cls._member_type_ is object: + interesting = set() + if cls.__init_subclass__ is not object.__init_subclass__: + interesting.add('__init_subclass__') + return sorted(set([ + '__class__', '__contains__', '__doc__', '__getitem__', + '__iter__', '__len__', '__members__', '__module__', + '__name__', '__qualname__', + ] + + cls._member_names_ + ) | interesting + ) + else: + # return whatever mixed-in data type has + return sorted(set( + dir(cls._member_type_) + + cls._member_names_ + )) + +def member_dir(member): + if member.__class__._member_type_ is object: + allowed = set(['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'value']) + else: + allowed = set(dir(member)) + for cls in member.__class__.mro(): + for name, obj in cls.__dict__.items(): + if name[0] == '_': + continue + if isinstance(obj, enum.property): + if obj.fget is not None or name not in member._member_map_: + allowed.add(name) + else: + allowed.discard(name) + else: + allowed.add(name) + return sorted(allowed) + +missing = object() if __name__ == '__main__': diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index 3f0e7270eb26f..ac4626d0c456e 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -908,7 +908,7 @@ def handler(signum, frame): %s - blocked = %r + blocked = %s signum = signal.SIGALRM # child: block and wait the signal diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index 394d2942483fb..56cc23dbbbf4e 100755 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -1517,9 +1517,11 @@ def testGetaddrinfo(self): infos = socket.getaddrinfo(HOST, 80, socket.AF_INET, socket.SOCK_STREAM) for family, type, _, _, _ in infos: self.assertEqual(family, socket.AF_INET) - self.assertEqual(str(family), 'AF_INET') + self.assertEqual(repr(family), '') + self.assertEqual(str(family), '2') self.assertEqual(type, socket.SOCK_STREAM) - self.assertEqual(str(type), 'SOCK_STREAM') + self.assertEqual(repr(type), '') + self.assertEqual(str(type), '1') infos = socket.getaddrinfo(HOST, None, 0, socket.SOCK_STREAM) for _, socktype, _, _, _ in infos: self.assertEqual(socktype, socket.SOCK_STREAM) @@ -1793,8 +1795,10 @@ def test_str_for_enums(self): # Make sure that the AF_* and SOCK_* constants have enum-like string # reprs. 
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: - self.assertEqual(str(s.family), 'AF_INET') - self.assertEqual(str(s.type), 'SOCK_STREAM') + self.assertEqual(repr(s.family), '') + self.assertEqual(repr(s.type), '') + self.assertEqual(str(s.family), '2') + self.assertEqual(str(s.type), '1') def test_socket_consistent_sock_type(self): SOCK_NONBLOCK = getattr(socket, 'SOCK_NONBLOCK', 0) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index f99a3e8da95f8..64f4bce7f7781 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -373,7 +373,8 @@ def test_str_for_enums(self): # Make sure that the PROTOCOL_* constants have enum-like string # reprs. proto = ssl.PROTOCOL_TLS_CLIENT - self.assertEqual(str(proto), 'PROTOCOL_TLS_CLIENT') + self.assertEqual(repr(proto), '<_SSLMethod.PROTOCOL_TLS_CLIENT: 16>') + self.assertEqual(str(proto), '16') ctx = ssl.SSLContext(proto) self.assertIs(ctx.protocol, proto) @@ -622,7 +623,7 @@ def test_openssl111_deprecations(self): with self.assertWarns(DeprecationWarning) as cm: ssl.SSLContext(protocol) self.assertEqual( - f'{protocol!r} is deprecated', + f'ssl.{protocol.name} is deprecated', str(cm.warning) ) @@ -631,8 +632,9 @@ def test_openssl111_deprecations(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT) with self.assertWarns(DeprecationWarning) as cm: ctx.minimum_version = version + version_text = '%s.%s' % (version.__class__.__name__, version.name) self.assertEqual( - f'ssl.{version!r} is deprecated', + f'ssl.{version_text} is deprecated', str(cm.warning) ) diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index d5e2c5266aae7..8e4e64808b688 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1490,8 +1490,10 @@ def test_formatting_with_enum(self): # issue18780 import enum class Float(float, enum.Enum): + # a mixed-in type will use the name for %s etc. PI = 3.1415926 class Int(enum.IntEnum): + # IntEnum uses the value and not the name for %s etc. IDES = 15 class Str(enum.StrEnum): # StrEnum uses the value and not the name for %s etc. @@ -1508,8 +1510,10 @@ class Str(enum.StrEnum): # formatting jobs delegated from the string implementation: self.assertEqual('...%(foo)s...' % {'foo':Str.ABC}, '...abc...') + self.assertEqual('...%(foo)r...' % {'foo':Int.IDES}, + '......') self.assertEqual('...%(foo)s...' % {'foo':Int.IDES}, - '...IDES...') + '...15...') self.assertEqual('...%(foo)i...' % {'foo':Int.IDES}, '...15...') self.assertEqual('...%(foo)d...' % {'foo':Int.IDES}, diff --git a/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst b/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst new file mode 100644 index 0000000000000..2df487855785e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-13-11-41-24.bpo-40066.1QuVli.rst @@ -0,0 +1,2 @@ +``IntEnum``, ``IntFlag``, and ``StrEnum`` use the mixed-in type for their +``str()`` and ``format()`` output. 
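A minimal interactive sketch of the behavior change described in the NEWS entry above (not part of the commit itself; it assumes a platform where socket.AF_INET has the value 2, as on most systems):

    >>> import socket
    >>> socket.AF_INET             # repr() still identifies the enum member
    <AddressFamily.AF_INET: 2>
    >>> str(socket.AF_INET)        # str() now comes from the mixed-in int type
    '2'
    >>> format(socket.AF_INET)     # format() matches str()
    '2'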
From webhook-mailer at python.org Mon Jan 17 11:30:25 2022 From: webhook-mailer at python.org (markshannon) Date: Mon, 17 Jan 2022 16:30:25 -0000 Subject: [Python-checkins] bpo-46405: fix msvc compiler warnings (GH-30627) Message-ID: https://github.com/python/cpython/commit/a4bc2218d270c4c7a898c8b3967c8c271afe9abe commit: a4bc2218d270c4c7a898c8b3967c8c271afe9abe branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: markshannon date: 2022-01-17T16:30:10Z summary: bpo-46405: fix msvc compiler warnings (GH-30627) files: M Python/specialize.c diff --git a/Python/specialize.c b/Python/specialize.c index 2da9e0f29b7a4..7c2252dd7a0e5 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -1240,7 +1240,7 @@ _Py_Specialize_StoreSubscr(PyObject *container, PyObject *sub, _Py_CODEUNIT *ins if (container_type == &PyList_Type) { if (PyLong_CheckExact(sub)) { if ((Py_SIZE(sub) == 0 || Py_SIZE(sub) == 1) - && ((PyLongObject *)sub)->ob_digit[0] < PyList_GET_SIZE(container)) + && ((PyLongObject *)sub)->ob_digit[0] < (size_t)PyList_GET_SIZE(container)) { *instr = _Py_MAKECODEUNIT(STORE_SUBSCR_LIST_INT, initial_counter_value()); From webhook-mailer at python.org Mon Jan 17 11:52:57 2022 From: webhook-mailer at python.org (ethanfurman) Date: Mon, 17 Jan 2022 16:52:57 -0000 Subject: [Python-checkins] bpo-40066: [Enum] fix tests (GH-30643) Message-ID: https://github.com/python/cpython/commit/62a6594e66ca955073be2f4e5a40291a39252ef3 commit: 62a6594e66ca955073be2f4e5a40291a39252ef3 branch: main author: Ethan Furman committer: ethanfurman date: 2022-01-17T08:52:42-08:00 summary: bpo-40066: [Enum] fix tests (GH-30643) - skip doctest that changes depending on target system - skip doctest that only fails on CI - substitute in values that change depending on target system files: M Doc/library/ssl.rst M Lib/test/test_socket.py M Lib/test/test_ssl.py diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst index 4d8488a4a28de..151f2546eeb60 100644 --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -2081,7 +2081,7 @@ to speed up repeated connections from the same clients. .. versionchanged:: 3.6 :attr:`SSLContext.verify_mode` returns :class:`VerifyMode` enum: - >>> ssl.create_default_context().verify_mode + >>> ssl.create_default_context().verify_mode # doctest: +SKIP .. index:: single: certificates diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index 56cc23dbbbf4e..53aa5e90fa25c 100755 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -1517,11 +1517,11 @@ def testGetaddrinfo(self): infos = socket.getaddrinfo(HOST, 80, socket.AF_INET, socket.SOCK_STREAM) for family, type, _, _, _ in infos: self.assertEqual(family, socket.AF_INET) - self.assertEqual(repr(family), '') - self.assertEqual(str(family), '2') + self.assertEqual(repr(family), '' % family.value) + self.assertEqual(str(family), str(family.value)) self.assertEqual(type, socket.SOCK_STREAM) - self.assertEqual(repr(type), '') - self.assertEqual(str(type), '1') + self.assertEqual(repr(type), '' % type.value) + self.assertEqual(str(type), str(type.value)) infos = socket.getaddrinfo(HOST, None, 0, socket.SOCK_STREAM) for _, socktype, _, _, _ in infos: self.assertEqual(socktype, socket.SOCK_STREAM) @@ -1795,10 +1795,10 @@ def test_str_for_enums(self): # Make sure that the AF_* and SOCK_* constants have enum-like string # reprs. 
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: - self.assertEqual(repr(s.family), '') - self.assertEqual(repr(s.type), '') - self.assertEqual(str(s.family), '2') - self.assertEqual(str(s.type), '1') + self.assertEqual(repr(s.family), '' % s.family.value) + self.assertEqual(repr(s.type), '' % s.type.value) + self.assertEqual(str(s.family), str(s.family.value)) + self.assertEqual(str(s.type), str(s.type.value)) def test_socket_consistent_sock_type(self): SOCK_NONBLOCK = getattr(socket, 'SOCK_NONBLOCK', 0) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index 64f4bce7f7781..543d34a546933 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -373,8 +373,8 @@ def test_str_for_enums(self): # Make sure that the PROTOCOL_* constants have enum-like string # reprs. proto = ssl.PROTOCOL_TLS_CLIENT - self.assertEqual(repr(proto), '<_SSLMethod.PROTOCOL_TLS_CLIENT: 16>') - self.assertEqual(str(proto), '16') + self.assertEqual(repr(proto), '<_SSLMethod.PROTOCOL_TLS_CLIENT: %r>' % proto.value) + self.assertEqual(str(proto), str(proto.value)) ctx = ssl.SSLContext(proto) self.assertIs(ctx.protocol, proto) From webhook-mailer at python.org Mon Jan 17 12:45:57 2022 From: webhook-mailer at python.org (markshannon) Date: Mon, 17 Jan 2022 17:45:57 -0000 Subject: [Python-checkins] bpo-46161: Fix bug in starunpack_helper in compile.c (GH-30235) Message-ID: https://github.com/python/cpython/commit/c118c2455c95baea08045dc64963600b7a56b6fd commit: c118c2455c95baea08045dc64963600b7a56b6fd branch: main author: zq1997 committer: markshannon date: 2022-01-17T17:45:44Z summary: bpo-46161: Fix bug in starunpack_helper in compile.c (GH-30235) files: A Misc/NEWS.d/next/Core and Builtins/2021-12-23-12-32-45.bpo-46161.EljBmu.rst M Lib/test/test_class.py M Python/compile.c diff --git a/Lib/test/test_class.py b/Lib/test/test_class.py index 7524f58a3ce73..7cf5e06d59e20 100644 --- a/Lib/test/test_class.py +++ b/Lib/test/test_class.py @@ -666,5 +666,23 @@ def __init__(self, *args, **kwargs): with self.assertRaisesRegex(TypeError, error_msg): object.__init__(E(), 42) + def testClassWithExtCall(self): + class Meta(int): + def __init__(*args, **kwargs): + pass + + def __new__(cls, name, bases, attrs, **kwargs): + return bases, kwargs + + d = {'metaclass': Meta} + + class A(**d): pass + self.assertEqual(A, ((), {})) + class A(0, 1, 2, 3, 4, 5, 6, 7, **d): pass + self.assertEqual(A, (tuple(range(8)), {})) + class A(0, *range(1, 8), **d, foo='bar'): pass + self.assertEqual(A, (tuple(range(8)), {'foo': 'bar'})) + + if __name__ == '__main__': unittest.main() diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-23-12-32-45.bpo-46161.EljBmu.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-23-12-32-45.bpo-46161.EljBmu.rst new file mode 100644 index 0000000000000..3eeb358c52080 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-23-12-32-45.bpo-46161.EljBmu.rst @@ -0,0 +1 @@ +Fix the class building error when the arguments are constants and CALL_FUNCTION_EX is used. 
\ No newline at end of file diff --git a/Python/compile.c b/Python/compile.c index b2702da8707f3..86f888ef8a394 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -4233,7 +4233,7 @@ starunpack_helper(struct compiler *c, asdl_expr_seq *elts, int pushed, Py_INCREF(val); PyTuple_SET_ITEM(folded, i, val); } - if (tuple) { + if (tuple && !pushed) { ADDOP_LOAD_CONST_NEW(c, folded); } else { if (add == SET_ADD) { @@ -4245,6 +4245,9 @@ starunpack_helper(struct compiler *c, asdl_expr_seq *elts, int pushed, ADDOP_I(c, build, pushed); ADDOP_LOAD_CONST_NEW(c, folded); ADDOP_I(c, extend, 1); + if (tuple) { + ADDOP(c, LIST_TO_TUPLE); + } } return 1; } From webhook-mailer at python.org Mon Jan 17 13:05:28 2022 From: webhook-mailer at python.org (zooba) Date: Mon, 17 Jan 2022 18:05:28 -0000 Subject: [Python-checkins] Skip signing side-loadable MSIX for Windows (GH-30644) Message-ID: https://github.com/python/cpython/commit/d6c6e6ba739ee714e5706144853008f1eed446ba commit: d6c6e6ba739ee714e5706144853008f1eed446ba branch: main author: Steve Dower committer: zooba date: 2022-01-17T18:05:16Z summary: Skip signing side-loadable MSIX for Windows (GH-30644) We currently do not release these files, and so there's nothing lost by not signing them. Our code signing certificate is somehow incompatible with signing MSIX files. We may be able to re-enable this when we next renew, or if Microsoft updates their signing tool to work with our certificate. files: M .azure-pipelines/windows-release/stage-pack-msix.yml diff --git a/.azure-pipelines/windows-release/stage-pack-msix.yml b/.azure-pipelines/windows-release/stage-pack-msix.yml index f967cfdbe326f..6f3e7a5e5d593 100644 --- a/.azure-pipelines/windows-release/stage-pack-msix.yml +++ b/.azure-pipelines/windows-release/stage-pack-msix.yml @@ -96,7 +96,9 @@ jobs: displayName: Sign side-loadable MSIX bundles dependsOn: - Pack_MSIX - condition: and(succeeded(), variables['SigningCertificate']) + # Our current certificate does not support MSIX signing, so we unconditionally skip this step + #condition: and(succeeded(), variables['SigningCertificate']) + condition: false pool: name: 'Windows Release' From webhook-mailer at python.org Mon Jan 17 15:17:06 2022 From: webhook-mailer at python.org (ethanfurman) Date: Mon, 17 Jan 2022 20:17:06 -0000 Subject: [Python-checkins] bpo-46418: [Enum] simplify `MODULE` declaration in tests (GH-30647) Message-ID: https://github.com/python/cpython/commit/596cf51a4d40f1ac3090cbccb83ad0663d739ae2 commit: 596cf51a4d40f1ac3090cbccb83ad0663d739ae2 branch: main author: Nikita Sobolev committer: ethanfurman date: 2022-01-17T12:16:56-08:00 summary: bpo-46418: [Enum] simplify `MODULE` declaration in tests (GH-30647) files: M Lib/test/test_enum.py diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index a0953fb960f33..18cc2f30ce559 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -31,7 +31,7 @@ def load_tests(loader, tests, ignore): )) return tests -MODULE = ('test.test_enum', '__main__')[__name__=='__main__'] +MODULE = __name__ SHORT_MODULE = MODULE.split('.')[-1] # for pickle tests From webhook-mailer at python.org Mon Jan 17 15:23:09 2022 From: webhook-mailer at python.org (zooba) Date: Mon, 17 Jan 2022 20:23:09 -0000 Subject: [Python-checkins] Restore MSIX signing and ensure expired certificates are not selected (GH-30649) Message-ID: https://github.com/python/cpython/commit/9e20ec4d437993715a8d1317a9b80043e6c07fe1 commit: 9e20ec4d437993715a8d1317a9b80043e6c07fe1 branch: main author: Steve Dower committer: zooba date:
2022-01-17T20:22:52Z summary: Restore MSIX signing and ensure expired certificates are not selected (GH-30649) Reverts the change in d6c6e6b and applies a better fix. files: M .azure-pipelines/windows-release/stage-pack-msix.yml M .azure-pipelines/windows-release/stage-sign.yml diff --git a/.azure-pipelines/windows-release/stage-pack-msix.yml b/.azure-pipelines/windows-release/stage-pack-msix.yml index 6f3e7a5e5d593..9f7919ee64706 100644 --- a/.azure-pipelines/windows-release/stage-pack-msix.yml +++ b/.azure-pipelines/windows-release/stage-pack-msix.yml @@ -96,9 +96,7 @@ jobs: displayName: Sign side-loadable MSIX bundles dependsOn: - Pack_MSIX - # Our current certificate does not support MSIX signing, so we unconditionally skip this step - #condition: and(succeeded(), variables['SigningCertificate']) - condition: false + condition: and(succeeded(), variables['SigningCertificate']) pool: name: 'Windows Release' @@ -123,6 +121,10 @@ jobs: downloadPath: $(Build.BinariesDirectory) # MSIX must be signed and timestamped simultaneously + # + # Getting "Error: SignerSign() failed." (-2147024885/0x8007000b)"? + # It may be that the certificate info collected in stage-sign.yml is wrong. Check that + # you do not have multiple matches for the certificate name you have specified. - powershell: | $failed = $true foreach ($retry in 1..3) { diff --git a/.azure-pipelines/windows-release/stage-sign.yml b/.azure-pipelines/windows-release/stage-sign.yml index c21e1c9f2b0f9..d43e077186c42 100644 --- a/.azure-pipelines/windows-release/stage-sign.yml +++ b/.azure-pipelines/windows-release/stage-sign.yml @@ -91,7 +91,7 @@ jobs: - powershell: | $m = 'CN=$(SigningCertificate)' $c = ((gci Cert:\CurrentUser\My), (gci Cert:\LocalMachine\My)) | %{ $_ } | ` - ?{ $_.Subject -match $m } | ` + ?{ $_.Subject -match $m -and $_.NotBefore -lt (Get-Date) -and $_.NotAfter -gt (Get-Date) } | ` select -First 1 if (-not $c) { Write-Host "Failed to find certificate for $(SigningCertificate)" From webhook-mailer at python.org Mon Jan 17 19:19:06 2022 From: webhook-mailer at python.org (zooba) Date: Tue, 18 Jan 2022 00:19:06 -0000 Subject: [Python-checkins] bpo-44934: Add optional feature AppendPath to Windows MSI installer (GH-27889) Message-ID: https://github.com/python/cpython/commit/c47c9e6589eb7a272cfe4d352eb87389eb20ec2f commit: c47c9e6589eb7a272cfe4d352eb87389eb20ec2f branch: main author: bneuburg committer: zooba date: 2022-01-18T00:18:44Z summary: bpo-44934: Add optional feature AppendPath to Windows MSI installer (GH-27889) The option must be enabled from the command line files: A Misc/NEWS.d/next/Windows/2021-09-01-10-48-11.bpo-44934.W1xPATH.rst A Tools/msi/appendpath/appendpath.wixproj A Tools/msi/appendpath/appendpath.wxs A Tools/msi/appendpath/appendpath_en-US.wxl M Doc/using/windows.rst M Doc/whatsnew/3.11.rst M Tools/msi/bundle/Default.wxl M Tools/msi/bundle/bootstrap/PythonBootstrapperApplication.cpp M Tools/msi/bundle/bundle.targets M Tools/msi/bundle/bundle.wxs M Tools/msi/bundle/packagegroups/postinstall.wxs diff --git a/Doc/using/windows.rst b/Doc/using/windows.rst index 68ee09c565e21..041166f4f8584 100644 --- a/Doc/using/windows.rst +++ b/Doc/using/windows.rst @@ -165,9 +165,13 @@ of available options is shown below. | CompileAll | Compile all ``.py`` files to | 0 | | | ``.pyc``. 
| | +---------------------------+--------------------------------------+--------------------------+ -| PrependPath | Add install and Scripts directories | 0 | -| | to :envvar:`PATH` and ``.PY`` to | | -| | :envvar:`PATHEXT` | | +| PrependPath | Prepend install and Scripts | 0 | +| | directories to :envvar:`PATH` and | | +| | add ``.PY`` to :envvar:`PATHEXT` | | ++---------------------------+--------------------------------------+--------------------------+ +| AppendPath | Append install and Scripts | 0 | +| | directories to :envvar:`PATH` and | | +| | add ``.PY`` to :envvar:`PATHEXT` | | +---------------------------+--------------------------------------+--------------------------+ | Shortcuts | Create shortcuts for the interpreter,| 1 | | | documentation and IDLE if installed. | | diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 96d6e26709342..5563e3d84de6d 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -200,6 +200,12 @@ Other CPython Implementation Changes have been removed as their values can be derived from ``exc_value``. (Contributed by Irit Katriel in :issue:`45711`.) +* A new command line option for the Windows installer ``AppendPath`` has beend added. + It behaves similiar to ``PrependPath`` but appends the install and scripts directories + instead of prepending it. + (Contributed by Bastian Neuburger in :issue:`44934`.) + + New Modules =========== diff --git a/Misc/NEWS.d/next/Windows/2021-09-01-10-48-11.bpo-44934.W1xPATH.rst b/Misc/NEWS.d/next/Windows/2021-09-01-10-48-11.bpo-44934.W1xPATH.rst new file mode 100644 index 0000000000000..0f1c25a0705df --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2021-09-01-10-48-11.bpo-44934.W1xPATH.rst @@ -0,0 +1 @@ +The installer now offers a command-line only option to add the installation directory to the end of :envvar:`PATH` instead of at the start. diff --git a/Tools/msi/appendpath/appendpath.wixproj b/Tools/msi/appendpath/appendpath.wixproj new file mode 100644 index 0000000000000..897087678c56f --- /dev/null +++ b/Tools/msi/appendpath/appendpath.wixproj @@ -0,0 +1,19 @@ + + + + {12B59A06-37CC-4558-A9C8-DAE922E64EF3} + 2.0 + appendpath + Package + ICE71 + + + + + + + + + + + diff --git a/Tools/msi/appendpath/appendpath.wxs b/Tools/msi/appendpath/appendpath.wxs new file mode 100644 index 0000000000000..b972f612bf799 --- /dev/null +++ b/Tools/msi/appendpath/appendpath.wxs @@ -0,0 +1,39 @@ +? + + + + + + + + + + NOT ALLUSERS=1 + + + + + + + + + + + + ALLUSERS=1 + + + + + + + + + + + + + + + + diff --git a/Tools/msi/appendpath/appendpath_en-US.wxl b/Tools/msi/appendpath/appendpath_en-US.wxl new file mode 100644 index 0000000000000..19a2e7734f8c3 --- /dev/null +++ b/Tools/msi/appendpath/appendpath_en-US.wxl @@ -0,0 +1,6 @@ +? + + Append to Path + AppendPath + No !(loc.ProductName) installation was detected. + diff --git a/Tools/msi/bundle/Default.wxl b/Tools/msi/bundle/Default.wxl index 791ce6eab7474..053306b0d7dcf 100644 --- a/Tools/msi/bundle/Default.wxl +++ b/Tools/msi/bundle/Default.wxl @@ -84,6 +84,8 @@ Select Customize to review current options. 
Create shortcuts for installed applications Add Python to &environment variables Add &Python [ShortVersion] to PATH + Append Python to &environment variables + Append &Python [ShortVersion] to PATH Install for &all users for &all users (requires elevation) Install &launcher for all users (recommended) diff --git a/Tools/msi/bundle/bootstrap/PythonBootstrapperApplication.cpp b/Tools/msi/bundle/bootstrap/PythonBootstrapperApplication.cpp index 3c54e401330cf..fdc2a21d83d5f 100644 --- a/Tools/msi/bundle/bootstrap/PythonBootstrapperApplication.cpp +++ b/Tools/msi/bundle/bootstrap/PythonBootstrapperApplication.cpp @@ -205,6 +205,7 @@ static struct { LPCWSTR regName; LPCWSTR variableName; } OPTIONAL_FEATURES[] = { { L"exe", L"Include_exe" }, { L"lib", L"Include_lib" }, { L"path", L"PrependPath" }, + { L"appendpath", L"AppendPath" }, { L"pip", L"Include_pip" }, { L"tcltk", L"Include_tcltk" }, { L"test", L"Include_test" }, diff --git a/Tools/msi/bundle/bundle.targets b/Tools/msi/bundle/bundle.targets index f882d2ee1f0c2..5d8ae6c1d7cba 100644 --- a/Tools/msi/bundle/bundle.targets +++ b/Tools/msi/bundle/bundle.targets @@ -67,6 +67,7 @@ + @@ -104,4 +105,4 @@ - \ No newline at end of file + diff --git a/Tools/msi/bundle/bundle.wxs b/Tools/msi/bundle/bundle.wxs index e2f871889340f..a145d840d3305 100644 --- a/Tools/msi/bundle/bundle.wxs +++ b/Tools/msi/bundle/bundle.wxs @@ -1,4 +1,4 @@ -? + @@ -87,6 +87,7 @@ + diff --git a/Tools/msi/bundle/packagegroups/postinstall.wxs b/Tools/msi/bundle/packagegroups/postinstall.wxs index 11ab673907054..64f42dd30e8ba 100644 --- a/Tools/msi/bundle/packagegroups/postinstall.wxs +++ b/Tools/msi/bundle/packagegroups/postinstall.wxs @@ -20,6 +20,27 @@ + + + + + + + + + + + - \ No newline at end of file + From webhook-mailer at python.org Tue Jan 18 02:05:22 2022 From: webhook-mailer at python.org (iritkatriel) Date: Tue, 18 Jan 2022 07:05:22 -0000 Subject: [Python-checkins] bpo-46411: Remove unnecessary calls to sys.exc_info() in tests (GH-30638) Message-ID: https://github.com/python/cpython/commit/a287b31bcb065e4122400cb59167340d25480e6d commit: a287b31bcb065e4122400cb59167340d25480e6d branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-18T07:05:16Z summary: bpo-46411: Remove unnecessary calls to sys.exc_info() in tests (GH-30638) files: M Lib/test/test_argparse.py M Lib/test/test_builtin.py M Lib/test/test_inspect.py M Lib/test/test_logging.py M Lib/test/test_raise.py M Lib/test/test_zipimport.py diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py index 4c23610b9e961..afcb88ff5ce0f 100644 --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -102,8 +102,8 @@ def stderr_to_parser_error(parse_args, *args, **kwargs): if getattr(result, key) is sys.stderr: setattr(result, key, old_stderr) return result - except SystemExit: - code = sys.exc_info()[1].code + except SystemExit as e: + code = e.code stdout = sys.stdout.getvalue() stderr = sys.stderr.getvalue() raise ArgumentParserError( @@ -1850,8 +1850,7 @@ def __call__(self, parser, namespace, value, option_string=None): raise AssertionError('value: %s' % value) assert expected_ns == namespace, ('expected %s, got %s' % (expected_ns, namespace)) - except AssertionError: - e = sys.exc_info()[1] + except AssertionError as e: raise ArgumentParserError('opt_action failed: %s' % e) setattr(namespace, 'spam', value) @@ -1876,8 +1875,7 @@ def __call__(self, parser, namespace, value, 
option_string=None): raise AssertionError('value: %s' % value) assert expected_ns == namespace, ('expected %s, got %s' % (expected_ns, namespace)) - except AssertionError: - e = sys.exc_info()[1] + except AssertionError as e: raise ArgumentParserError('arg_action failed: %s' % e) setattr(namespace, 'badger', value) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 7456803221964..4b0b15f0a9361 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -581,8 +581,8 @@ def __dir__(self): # dir(traceback) try: raise IndexError - except: - self.assertEqual(len(dir(sys.exc_info()[2])), 4) + except IndexError as e: + self.assertEqual(len(dir(e.__traceback__)), 4) # test that object has a __dir__() self.assertEqual(sorted([].__dir__()), dir([])) diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index 56168817a27f2..67372cca6ed1f 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -135,8 +135,8 @@ def test_excluding_predicates(self): self.istest(inspect.iscode, 'mod.spam.__code__') try: 1/0 - except: - tb = sys.exc_info()[2] + except Exception as e: + tb = e.__traceback__ self.istest(inspect.isframe, 'tb.tb_frame') self.istest(inspect.istraceback, 'tb') if hasattr(types, 'GetSetDescriptorType'): diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index e61ccdf86bdea..7c38676012bab 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -5524,8 +5524,8 @@ def test_compute_rollover(self, when=when, exp=exp): print('currentSecond: %s' % currentSecond, file=sys.stderr) print('r: %s' % r, file=sys.stderr) print('result: %s' % result, file=sys.stderr) - except Exception: - print('exception in diagnostic code: %s' % sys.exc_info()[1], file=sys.stderr) + except Exception as e: + print('exception in diagnostic code: %s' % e, file=sys.stderr) self.assertEqual(exp, actual) rh.close() setattr(TimedRotatingFileHandlerTest, "test_compute_rollover_%s" % when, test_compute_rollover) diff --git a/Lib/test/test_raise.py b/Lib/test/test_raise.py index 8dc62a933e872..8225504c4756d 100644 --- a/Lib/test/test_raise.py +++ b/Lib/test/test_raise.py @@ -12,8 +12,8 @@ def get_tb(): try: raise OSError() - except: - return sys.exc_info()[2] + except OSError as e: + return e.__traceback__ class Context: diff --git a/Lib/test/test_zipimport.py b/Lib/test/test_zipimport.py index 392dcfa87a19b..85dbf4d8f68eb 100644 --- a/Lib/test/test_zipimport.py +++ b/Lib/test/test_zipimport.py @@ -710,8 +710,8 @@ def testDoctestSuite(self): def doTraceback(self, module): try: module.do_raise() - except: - tb = sys.exc_info()[2].tb_next + except Exception as e: + tb = e.__traceback__.tb_next f,lno,n,line = extract_tb(tb, 1)[0] self.assertEqual(line, raise_src.strip()) From webhook-mailer at python.org Tue Jan 18 02:29:10 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 18 Jan 2022 07:29:10 -0000 Subject: [Python-checkins] [3.9] bpo-46411: Remove unnecessary calls to sys.exc_info() in tests (GH-30638) (GH-30658) Message-ID: https://github.com/python/cpython/commit/1d6530dd0564a6bb75989b9fca25a649b5ddc1b0 commit: 1d6530dd0564a6bb75989b9fca25a649b5ddc1b0 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-17T23:29:02-08:00 summary: [3.9] bpo-46411: Remove unnecessary calls to sys.exc_info() in tests (GH-30638) (GH-30658) (cherry picked from commit a287b31bcb065e4122400cb59167340d25480e6d) 
Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> Automerge-Triggered-By: GH:iritkatriel files: M Lib/test/test_argparse.py M Lib/test/test_builtin.py M Lib/test/test_inspect.py M Lib/test/test_logging.py M Lib/test/test_raise.py M Lib/test/test_zipimport.py diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py index 043a9cf93f771..cc5e8491b4c49 100644 --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -101,8 +101,8 @@ def stderr_to_parser_error(parse_args, *args, **kwargs): if getattr(result, key) is sys.stderr: setattr(result, key, old_stderr) return result - except SystemExit: - code = sys.exc_info()[1].code + except SystemExit as e: + code = e.code stdout = sys.stdout.getvalue() stderr = sys.stderr.getvalue() raise ArgumentParserError( @@ -1828,8 +1828,7 @@ def __call__(self, parser, namespace, value, option_string=None): raise AssertionError('value: %s' % value) assert expected_ns == namespace, ('expected %s, got %s' % (expected_ns, namespace)) - except AssertionError: - e = sys.exc_info()[1] + except AssertionError as e: raise ArgumentParserError('opt_action failed: %s' % e) setattr(namespace, 'spam', value) @@ -1854,8 +1853,7 @@ def __call__(self, parser, namespace, value, option_string=None): raise AssertionError('value: %s' % value) assert expected_ns == namespace, ('expected %s, got %s' % (expected_ns, namespace)) - except AssertionError: - e = sys.exc_info()[1] + except AssertionError as e: raise ArgumentParserError('arg_action failed: %s' % e) setattr(namespace, 'badger', value) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 1f224bfe1ba99..9927b7ec3728c 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -581,8 +581,8 @@ def __dir__(self): # dir(traceback) try: raise IndexError - except: - self.assertEqual(len(dir(sys.exc_info()[2])), 4) + except IndexError as e: + self.assertEqual(len(dir(e.__traceback__)), 4) # test that object has a __dir__() self.assertEqual(sorted([].__dir__()), dir([])) diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index 08aed248b9665..2a6de956cdc6d 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -127,8 +127,8 @@ def test_excluding_predicates(self): self.istest(inspect.iscode, 'mod.spam.__code__') try: 1/0 - except: - tb = sys.exc_info()[2] + except Exception as e: + tb = e.__traceback__ self.istest(inspect.isframe, 'tb.tb_frame') self.istest(inspect.istraceback, 'tb') if hasattr(types, 'GetSetDescriptorType'): diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index 363ea29d3af2f..0a9d449078f10 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -5374,8 +5374,8 @@ def test_compute_rollover(self, when=when, exp=exp): print('currentSecond: %s' % currentSecond, file=sys.stderr) print('r: %s' % r, file=sys.stderr) print('result: %s' % result, file=sys.stderr) - except Exception: - print('exception in diagnostic code: %s' % sys.exc_info()[1], file=sys.stderr) + except Exception as e: + print('exception in diagnostic code: %s' % e, file=sys.stderr) self.assertEqual(exp, actual) rh.close() setattr(TimedRotatingFileHandlerTest, "test_compute_rollover_%s" % when, test_compute_rollover) diff --git a/Lib/test/test_raise.py b/Lib/test/test_raise.py index 8dc62a933e872..8225504c4756d 100644 --- a/Lib/test/test_raise.py +++ b/Lib/test/test_raise.py @@ -12,8 +12,8 @@ def get_tb(): try: raise OSError() - except: - return sys.exc_info()[2] + except OSError as e: + return e.__traceback__ class 
Context: diff --git a/Lib/test/test_zipimport.py b/Lib/test/test_zipimport.py index 2e2438863da28..b7347a384f1ab 100644 --- a/Lib/test/test_zipimport.py +++ b/Lib/test/test_zipimport.py @@ -628,8 +628,8 @@ def testDoctestSuite(self): def doTraceback(self, module): try: module.do_raise() - except: - tb = sys.exc_info()[2].tb_next + except Exception as e: + tb = e.__traceback__.tb_next f,lno,n,line = extract_tb(tb, 1)[0] self.assertEqual(line, raise_src.strip()) From webhook-mailer at python.org Tue Jan 18 02:33:09 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 18 Jan 2022 07:33:09 -0000 Subject: [Python-checkins] bpo-46411: Remove unnecessary calls to sys.exc_info() in tests (GH-30638) Message-ID: https://github.com/python/cpython/commit/42038d00ea7b0b5455e371285102d85006fbf687 commit: 42038d00ea7b0b5455e371285102d85006fbf687 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-17T23:33:00-08:00 summary: bpo-46411: Remove unnecessary calls to sys.exc_info() in tests (GH-30638) (cherry picked from commit a287b31bcb065e4122400cb59167340d25480e6d) Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> files: M Lib/test/test_argparse.py M Lib/test/test_builtin.py M Lib/test/test_inspect.py M Lib/test/test_logging.py M Lib/test/test_raise.py M Lib/test/test_zipimport.py diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py index c96a540a8b315..37a73e0686377 100644 --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -101,8 +101,8 @@ def stderr_to_parser_error(parse_args, *args, **kwargs): if getattr(result, key) is sys.stderr: setattr(result, key, old_stderr) return result - except SystemExit: - code = sys.exc_info()[1].code + except SystemExit as e: + code = e.code stdout = sys.stdout.getvalue() stderr = sys.stderr.getvalue() raise ArgumentParserError( @@ -1830,8 +1830,7 @@ def __call__(self, parser, namespace, value, option_string=None): raise AssertionError('value: %s' % value) assert expected_ns == namespace, ('expected %s, got %s' % (expected_ns, namespace)) - except AssertionError: - e = sys.exc_info()[1] + except AssertionError as e: raise ArgumentParserError('opt_action failed: %s' % e) setattr(namespace, 'spam', value) @@ -1856,8 +1855,7 @@ def __call__(self, parser, namespace, value, option_string=None): raise AssertionError('value: %s' % value) assert expected_ns == namespace, ('expected %s, got %s' % (expected_ns, namespace)) - except AssertionError: - e = sys.exc_info()[1] + except AssertionError as e: raise ArgumentParserError('arg_action failed: %s' % e) setattr(namespace, 'badger', value) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 7456803221964..4b0b15f0a9361 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -581,8 +581,8 @@ def __dir__(self): # dir(traceback) try: raise IndexError - except: - self.assertEqual(len(dir(sys.exc_info()[2])), 4) + except IndexError as e: + self.assertEqual(len(dir(e.__traceback__)), 4) # test that object has a __dir__() self.assertEqual(sorted([].__dir__()), dir([])) diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index 93ff2f85df193..545dab5c6348f 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -132,8 +132,8 @@ def test_excluding_predicates(self): self.istest(inspect.iscode, 'mod.spam.__code__') try: 1/0 - except: - tb = sys.exc_info()[2] + except 
Exception as e: + tb = e.__traceback__ self.istest(inspect.isframe, 'tb.tb_frame') self.istest(inspect.istraceback, 'tb') if hasattr(types, 'GetSetDescriptorType'): diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index 03d0319306a48..8212cf7a9a964 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -5508,8 +5508,8 @@ def test_compute_rollover(self, when=when, exp=exp): print('currentSecond: %s' % currentSecond, file=sys.stderr) print('r: %s' % r, file=sys.stderr) print('result: %s' % result, file=sys.stderr) - except Exception: - print('exception in diagnostic code: %s' % sys.exc_info()[1], file=sys.stderr) + except Exception as e: + print('exception in diagnostic code: %s' % e, file=sys.stderr) self.assertEqual(exp, actual) rh.close() setattr(TimedRotatingFileHandlerTest, "test_compute_rollover_%s" % when, test_compute_rollover) diff --git a/Lib/test/test_raise.py b/Lib/test/test_raise.py index 8dc62a933e872..8225504c4756d 100644 --- a/Lib/test/test_raise.py +++ b/Lib/test/test_raise.py @@ -12,8 +12,8 @@ def get_tb(): try: raise OSError() - except: - return sys.exc_info()[2] + except OSError as e: + return e.__traceback__ class Context: diff --git a/Lib/test/test_zipimport.py b/Lib/test/test_zipimport.py index 19d3a880f4cd7..b291d5301690d 100644 --- a/Lib/test/test_zipimport.py +++ b/Lib/test/test_zipimport.py @@ -709,8 +709,8 @@ def testDoctestSuite(self): def doTraceback(self, module): try: module.do_raise() - except: - tb = sys.exc_info()[2].tb_next + except Exception as e: + tb = e.__traceback__.tb_next f,lno,n,line = extract_tb(tb, 1)[0] self.assertEqual(line, raise_src.strip()) From webhook-mailer at python.org Tue Jan 18 03:02:44 2022 From: webhook-mailer at python.org (rhettinger) Date: Tue, 18 Jan 2022 08:02:44 -0000 Subject: [Python-checkins] bpo-42161: Hoist the _PyLong_GetOne() call out of the inner loop. (GH-30656) Message-ID: https://github.com/python/cpython/commit/243c31667cc15a9a338330ad9b2a29b1cd1c76ec commit: 243c31667cc15a9a338330ad9b2a29b1cd1c76ec branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-18T00:02:35-08:00 summary: bpo-42161: Hoist the _PyLong_GetOne() call out of the inner loop. 
(GH-30656) files: M Lib/test/test_sys.py M Objects/enumobject.c diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py index f05cd75af97b5..2c8c6ab6cee76 100644 --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -1375,7 +1375,7 @@ class C(object): pass x = codecs.charmap_build(encodings.iso8859_3.decoding_table) check(x, size('32B2iB')) # enumerate - check(enumerate([]), size('n3P')) + check(enumerate([]), size('n4P')) # reverse check(reversed(''), size('nP')) # float diff --git a/Objects/enumobject.c b/Objects/enumobject.c index b78230ddaebaa..8fbf4fd6e470b 100644 --- a/Objects/enumobject.c +++ b/Objects/enumobject.c @@ -16,9 +16,10 @@ class reversed "reversedobject *" "&PyReversed_Type" typedef struct { PyObject_HEAD Py_ssize_t en_index; /* current index of enumeration */ - PyObject* en_sit; /* secondary iterator of enumeration */ + PyObject* en_sit; /* secondary iterator of enumeration */ PyObject* en_result; /* result tuple */ PyObject* en_longindex; /* index for sequences >= PY_SSIZE_T_MAX */ + PyObject* one; /* borrowed reference */ } enumobject; @@ -78,6 +79,7 @@ enum_new_impl(PyTypeObject *type, PyObject *iterable, PyObject *start) Py_DECREF(en); return NULL; } + en->one = _PyLong_GetOne(); /* borrowed reference */ return (PyObject *)en; } @@ -157,7 +159,7 @@ enum_next_long(enumobject *en, PyObject* next_item) } next_index = en->en_longindex; assert(next_index != NULL); - stepped_up = PyNumber_Add(next_index, _PyLong_GetOne()); + stepped_up = PyNumber_Add(next_index, en->one); if (stepped_up == NULL) { Py_DECREF(next_item); return NULL; From webhook-mailer at python.org Tue Jan 18 06:13:09 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 18 Jan 2022 11:13:09 -0000 Subject: [Python-checkins] bpo-46339: Include clarification on assert in 'get_error_line_from_tokenizer_buffers' (#30545) Message-ID: https://github.com/python/cpython/commit/8c2fd09f365e082cfceb29afdf38953cdd670946 commit: 8c2fd09f365e082cfceb29afdf38953cdd670946 branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-18T11:13:00Z summary: bpo-46339: Include clarification on assert in 'get_error_line_from_tokenizer_buffers' (#30545) files: M Parser/pegen_errors.c diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index bffae8532ca2b..f348ac3000dda 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -254,6 +254,9 @@ get_error_line_from_tokenizer_buffers(Parser *p, Py_ssize_t lineno) for (int i = 0; i < relative_lineno - 1; i++) { char *new_line = strchr(cur_line, '\n') + 1; + // The assert is here for debug builds but the conditional that + // follows is there so in release builds we do not crash at the cost + // to report a potentially wrong line. assert(new_line != NULL && new_line < p->tok->inp); if (new_line == NULL || new_line >= p->tok->inp) { break; From webhook-mailer at python.org Tue Jan 18 07:37:31 2022 From: webhook-mailer at python.org (ericvsmith) Date: Tue, 18 Jan 2022 12:37:31 -0000 Subject: [Python-checkins] bpo-46402: Promote SQLite URI tricks in `sqlite3` docs (GH-30660) Message-ID: https://github.com/python/cpython/commit/bdf2ab1887a2edfb089a3c2a1590cf1e84ea0048 commit: bdf2ab1887a2edfb089a3c2a1590cf1e84ea0048 branch: main author: Erlend Egeberg Aasland committer: ericvsmith date: 2022-01-18T07:37:02-05:00 summary: bpo-46402: Promote SQLite URI tricks in `sqlite3` docs (GH-30660) Provide some examples of URI parameters in sqlite connect(). 
Co-authored-by: Ned Batchelder files: M Doc/library/sqlite3.rst diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index 415a5f9a92902..d213933ba5827 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -273,14 +273,28 @@ Module functions and constants for the connection, you can set the *cached_statements* parameter. The currently implemented default is to cache 128 statements. - If *uri* is true, *database* is interpreted as a URI. This allows you - to specify options. For example, to open a database in read-only mode - you can use:: - - db = sqlite3.connect('file:path/to/database?mode=ro', uri=True) - - More information about this feature, including a list of recognized options, can - be found in the `SQLite URI documentation `_. + If *uri* is :const:`True`, *database* is interpreted as a + :abbr:`URI (Uniform Resource Identifier)` with a file path and an optional + query string. The scheme part *must* be ``"file:"``. The path can be a + relative or absolute file path. The query string allows us to pass + parameters to SQLite. Some useful URI tricks include:: + + # Open a database in read-only mode. + con = sqlite3.connect("file:template.db?mode=ro", uri=True) + + # Don't implicitly create a new database file if it does not already exist. + # Will raise sqlite3.OperationalError if unable to open a database file. + con = sqlite3.connect("file:nosuchdb.db?mode=rw", uri=True) + + # Create a shared named in-memory database. + con1 = sqlite3.connect("file:mem1?mode=memory&cache=shared", uri=True) + con2 = sqlite3.connect("file:mem1?mode=memory&cache=shared", uri=True) + con1.executescript("create table t(t); insert into t values(28);") + rows = con2.execute("select * from t").fetchall() + + More information about this feature, including a list of recognized + parameters, can be found in the + `SQLite URI documentation `_. .. audit-event:: sqlite3.connect database sqlite3.connect .. 
audit-event:: sqlite3.connect/handle connection_handle sqlite3.connect From webhook-mailer at python.org Tue Jan 18 09:44:17 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Tue, 18 Jan 2022 14:44:17 -0000 Subject: [Python-checkins] bpo-46424: [typing] cover `Annotation[arg]` invalid usage in tests (GH-30663) Message-ID: https://github.com/python/cpython/commit/32398294fb3fcf4ee74da54722fd0221c4e6cb74 commit: 32398294fb3fcf4ee74da54722fd0221c4e6cb74 branch: main author: Nikita Sobolev committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-18T22:43:51+08:00 summary: bpo-46424: [typing] cover `Annotation[arg]` invalid usage in tests (GH-30663) files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index c8a077e2f1ff5..97c2c7f56cecb 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -4609,6 +4609,10 @@ def test_cannot_check_subclass(self): with self.assertRaises(TypeError): issubclass(int, Annotated[int, "positive"]) + def test_too_few_type_args(self): + with self.assertRaisesRegex(TypeError, 'at least two arguments'): + Annotated[int] + def test_pickle(self): samples = [typing.Any, typing.Union[int, str], typing.Optional[str], Tuple[int, ...], From webhook-mailer at python.org Tue Jan 18 10:46:35 2022 From: webhook-mailer at python.org (zooba) Date: Tue, 18 Jan 2022 15:46:35 -0000 Subject: [Python-checkins] bpo-46028: Calculate base_executable by resolving symlinks in a venv (GH-30144) Message-ID: https://github.com/python/cpython/commit/7407fe4c25ba0308d49e3e88e4a107ef32251cdc commit: 7407fe4c25ba0308d49e3e88e4a107ef32251cdc branch: main author: Steve Dower committer: zooba date: 2022-01-18T15:46:26Z summary: bpo-46028: Calculate base_executable by resolving symlinks in a venv (GH-30144) files: A Misc/NEWS.d/next/Core and Builtins/2021-12-16-15-04-58.bpo-46028.zfWacB.rst M Lib/test/test_getpath.py M Modules/getpath.py diff --git a/Lib/test/test_getpath.py b/Lib/test/test_getpath.py index 1a336a4abcafd..eaf4a99279663 100644 --- a/Lib/test/test_getpath.py +++ b/Lib/test/test_getpath.py @@ -328,6 +328,38 @@ def test_venv_posix(self): actual = getpath(ns, expected) self.assertEqual(expected, actual) + def test_venv_changed_name_posix(self): + "Test a venv layout on *nix." 
+ ns = MockPosixNamespace( + argv0="python", + PREFIX="/usr", + ENV_PATH="/venv/bin:/usr/bin", + ) + ns.add_known_xfile("/usr/bin/python3") + ns.add_known_xfile("/venv/bin/python") + ns.add_known_link("/venv/bin/python", "/usr/bin/python3") + ns.add_known_file("/usr/lib/python9.8/os.py") + ns.add_known_dir("/usr/lib/python9.8/lib-dynload") + ns.add_known_file("/venv/pyvenv.cfg", [ + r"home = /usr/bin" + ]) + expected = dict( + executable="/venv/bin/python", + prefix="/usr", + exec_prefix="/usr", + base_executable="/usr/bin/python3", + base_prefix="/usr", + base_exec_prefix="/usr", + module_search_paths_set=1, + module_search_paths=[ + "/usr/lib/python98.zip", + "/usr/lib/python9.8", + "/usr/lib/python9.8/lib-dynload", + ], + ) + actual = getpath(ns, expected) + self.assertEqual(expected, actual) + def test_symlink_normal_posix(self): "Test a 'standard' install layout via symlink on *nix" ns = MockPosixNamespace( diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-16-15-04-58.bpo-46028.zfWacB.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-16-15-04-58.bpo-46028.zfWacB.rst new file mode 100644 index 0000000000000..cc34c0fa2405b --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-16-15-04-58.bpo-46028.zfWacB.rst @@ -0,0 +1,3 @@ +Fixes calculation of :data:`sys._base_executable` when inside a virtual +environment that uses symlinks with different binary names than the base +environment provides. diff --git a/Modules/getpath.py b/Modules/getpath.py index 6f2e038557722..f84e6e8afaf62 100644 --- a/Modules/getpath.py +++ b/Modules/getpath.py @@ -351,7 +351,18 @@ def search_up(prefix, *landmarks, test=isfile): key, had_equ, value = line.partition('=') if had_equ and key.strip().lower() == 'home': executable_dir = real_executable_dir = value.strip() - base_executable = joinpath(executable_dir, basename(executable)) + if not base_executable: + # First try to resolve symlinked executables, since that may be + # more accurate than assuming the executable in 'home'. + try: + base_executable = realpath(executable) + if base_executable == executable: + # No change, so probably not a link. Clear it and fall back + base_executable = '' + except OSError: + pass + if not base_executable: + base_executable = joinpath(executable_dir, basename(executable)) break else: venv_prefix = None From webhook-mailer at python.org Tue Jan 18 10:55:38 2022 From: webhook-mailer at python.org (vstinner) Date: Tue, 18 Jan 2022 15:55:38 -0000 Subject: [Python-checkins] bpo-43869: Improve epoch docs (GH-25777) Message-ID: https://github.com/python/cpython/commit/ff7703c4b609a697ada8165fd1c52a73404b6d07 commit: ff7703c4b609a697ada8165fd1c52a73404b6d07 branch: main author: Miguel Brito <5544985+miguendes at users.noreply.github.com> committer: vstinner date: 2022-01-18T16:55:16+01:00 summary: bpo-43869: Improve epoch docs (GH-25777) files: M Doc/library/time.rst diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 6540932eecbaa..3a771208519b4 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -22,7 +22,7 @@ An explanation of some terminology and conventions is in order. .. index:: single: epoch * The :dfn:`epoch` is the point where the time starts, and is platform - dependent. For Unix, the epoch is January 1, 1970, 00:00:00 (UTC). + dependent. For Unix and Windows, the epoch is January 1, 1970, 00:00:00 (UTC). To find out what the epoch is on a given platform, look at ``time.gmtime(0)``. 
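As a quick illustration of the time documentation change above, this is what the suggested check looks like in an interactive session (a sketch; the struct_time shown is the standard repr for the Unix/Windows epoch):

    >>> import time
    >>> time.gmtime(0)
    time.struct_time(tm_year=1970, tm_mon=1, tm_mday=1, tm_hour=0, tm_min=0, tm_sec=0, tm_wday=3, tm_yday=1, tm_isdst=0)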
From webhook-mailer at python.org Tue Jan 18 14:28:36 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Tue, 18 Jan 2022 19:28:36 -0000 Subject: [Python-checkins] bpo-46425: Fix direct invocation of multiple test modules (GH-30666) Message-ID: https://github.com/python/cpython/commit/1292aa6db5bed889a3c87df443754fcae0177801 commit: 1292aa6db5bed889a3c87df443754fcae0177801 branch: main author: Nikita Sobolev committer: serhiy-storchaka date: 2022-01-18T21:28:18+02:00 summary: bpo-46425: Fix direct invocation of multiple test modules (GH-30666) files: M Lib/test/test_compileall.py M Lib/test/test_distutils.py M Lib/test/test_dtrace.py M Lib/test/test_tools/test_freeze.py M Lib/test/test_zipfile64.py M Lib/unittest/test/test_program.py diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py index 9e15ecf3aae29..33f0c939325f5 100644 --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -3,7 +3,6 @@ import filecmp import importlib.util import io -import itertools import os import pathlib import py_compile @@ -29,9 +28,8 @@ from test import support from test.support import os_helper from test.support import script_helper - -from .test_py_compile import without_source_date_epoch -from .test_py_compile import SourceDateEpochTestMeta +from test.test_py_compile import without_source_date_epoch +from test.test_py_compile import SourceDateEpochTestMeta def get_pyc(script, opt): diff --git a/Lib/test/test_distutils.py b/Lib/test/test_distutils.py index 4b40af0213234..d82d2b6423433 100644 --- a/Lib/test/test_distutils.py +++ b/Lib/test/test_distutils.py @@ -5,7 +5,7 @@ be run. """ -import warnings +import unittest from test import support from test.support import warnings_helper diff --git a/Lib/test/test_dtrace.py b/Lib/test/test_dtrace.py index 3957077f5d612..8a436ad123b80 100644 --- a/Lib/test/test_dtrace.py +++ b/Lib/test/test_dtrace.py @@ -170,4 +170,4 @@ class SystemTapOptimizedTests(TraceTests, unittest.TestCase): if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_tools/test_freeze.py b/Lib/test/test_tools/test_freeze.py index 386c35a973bc2..cca3c47f5ac05 100644 --- a/Lib/test/test_tools/test_freeze.py +++ b/Lib/test/test_tools/test_freeze.py @@ -6,8 +6,8 @@ from test import support from test.support import os_helper +from test.test_tools import imports_under_tool, skip_if_missing -from . import imports_under_tool, skip_if_missing skip_if_missing('freeze') with imports_under_tool('freeze', 'test'): import freeze as helper diff --git a/Lib/test/test_zipfile64.py b/Lib/test/test_zipfile64.py index 810fdedef39dd..0947013afbc6e 100644 --- a/Lib/test/test_zipfile64.py +++ b/Lib/test/test_zipfile64.py @@ -18,8 +18,9 @@ from tempfile import TemporaryFile from test.support import os_helper -from test.support import TESTFN, requires_zlib +from test.support import requires_zlib +TESTFN = os_helper.TESTFN TESTFN2 = TESTFN + "2" # How much time in seconds can pass before we print a 'Still working' message. 
diff --git a/Lib/unittest/test/test_program.py b/Lib/unittest/test/test_program.py index f7049fbb24e7b..687f62996740e 100644 --- a/Lib/unittest/test/test_program.py +++ b/Lib/unittest/test/test_program.py @@ -6,7 +6,7 @@ from test import support import unittest import unittest.test -from .test_result import BufferedWriter +from unittest.test.test_result import BufferedWriter class Test_TestProgram(unittest.TestCase): From webhook-mailer at python.org Tue Jan 18 14:53:53 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Tue, 18 Jan 2022 19:53:53 -0000 Subject: [Python-checkins] bpo-20823: Clarify copyreg.pickle() documentation (GH-30230) Message-ID: https://github.com/python/cpython/commit/65940fa5c12a4b4a0650c7845044ffd63b94e227 commit: 65940fa5c12a4b4a0650c7845044ffd63b94e227 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: serhiy-storchaka date: 2022-01-18T21:53:43+02:00 summary: bpo-20823: Clarify copyreg.pickle() documentation (GH-30230) files: M Doc/library/copyreg.rst diff --git a/Doc/library/copyreg.rst b/Doc/library/copyreg.rst index 4392021009595..dc35965be3e40 100644 --- a/Doc/library/copyreg.rst +++ b/Doc/library/copyreg.rst @@ -33,8 +33,8 @@ Such constructors may be factory functions or class instances. The optional *constructor* parameter, if provided, is a callable object which can be used to reconstruct the object when called with the tuple of arguments - returned by *function* at pickling time. :exc:`TypeError` will be raised if - *object* is a class or *constructor* is not callable. + returned by *function* at pickling time. A :exc:`TypeError` is raised if the + *constructor* is not callable. See the :mod:`pickle` module for more details on the interface expected of *function* and *constructor*. Note that the From webhook-mailer at python.org Tue Jan 18 15:17:04 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 18 Jan 2022 20:17:04 -0000 Subject: [Python-checkins] bpo-20823: Clarify copyreg.pickle() documentation (GH-30230) Message-ID: https://github.com/python/cpython/commit/9238a52cbc39c17ca6c7a8cbda32808dd5522a59 commit: 9238a52cbc39c17ca6c7a8cbda32808dd5522a59 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-18T12:16:54-08:00 summary: bpo-20823: Clarify copyreg.pickle() documentation (GH-30230) (cherry picked from commit 65940fa5c12a4b4a0650c7845044ffd63b94e227) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: M Doc/library/copyreg.rst diff --git a/Doc/library/copyreg.rst b/Doc/library/copyreg.rst index 4392021009595..dc35965be3e40 100644 --- a/Doc/library/copyreg.rst +++ b/Doc/library/copyreg.rst @@ -33,8 +33,8 @@ Such constructors may be factory functions or class instances. The optional *constructor* parameter, if provided, is a callable object which can be used to reconstruct the object when called with the tuple of arguments - returned by *function* at pickling time. :exc:`TypeError` will be raised if - *object* is a class or *constructor* is not callable. + returned by *function* at pickling time. A :exc:`TypeError` is raised if the + *constructor* is not callable. See the :mod:`pickle` module for more details on the interface expected of *function* and *constructor*. 
Note that the From webhook-mailer at python.org Tue Jan 18 15:17:11 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 18 Jan 2022 20:17:11 -0000 Subject: [Python-checkins] bpo-20823: Clarify copyreg.pickle() documentation (GH-30230) Message-ID: https://github.com/python/cpython/commit/8527f7a722aee3d9025267cc7ff2eb8afa38d166 commit: 8527f7a722aee3d9025267cc7ff2eb8afa38d166 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-18T12:17:06-08:00 summary: bpo-20823: Clarify copyreg.pickle() documentation (GH-30230) (cherry picked from commit 65940fa5c12a4b4a0650c7845044ffd63b94e227) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: M Doc/library/copyreg.rst diff --git a/Doc/library/copyreg.rst b/Doc/library/copyreg.rst index 4392021009595..dc35965be3e40 100644 --- a/Doc/library/copyreg.rst +++ b/Doc/library/copyreg.rst @@ -33,8 +33,8 @@ Such constructors may be factory functions or class instances. The optional *constructor* parameter, if provided, is a callable object which can be used to reconstruct the object when called with the tuple of arguments - returned by *function* at pickling time. :exc:`TypeError` will be raised if - *object* is a class or *constructor* is not callable. + returned by *function* at pickling time. A :exc:`TypeError` is raised if the + *constructor* is not callable. See the :mod:`pickle` module for more details on the interface expected of *function* and *constructor*. Note that the From webhook-mailer at python.org Tue Jan 18 15:38:40 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Tue, 18 Jan 2022 20:38:40 -0000 Subject: [Python-checkins] bpo-46045: Do not use POSIX semaphores on NetBSD (GH-30047) Message-ID: https://github.com/python/cpython/commit/60ceedbdd5b5fb22803039a59954798d931f659a commit: 60ceedbdd5b5fb22803039a59954798d931f659a branch: main author: Thomas Klausner committer: serhiy-storchaka date: 2022-01-18T22:38:35+02:00 summary: bpo-46045: Do not use POSIX semaphores on NetBSD (GH-30047) This fixes hanging tests test_compileall,, test_multiprocessing_fork and test_concurrent_futures. 
files: A Misc/NEWS.d/next/Core and Builtins/2021-12-11-11-36-48.bpo-46045.sfThay.rst M configure M configure.ac diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-11-11-36-48.bpo-46045.sfThay.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-11-11-36-48.bpo-46045.sfThay.rst new file mode 100644 index 00000000000000..97fd1883eb2ab3 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-11-11-36-48.bpo-46045.sfThay.rst @@ -0,0 +1 @@ +Do not use POSIX semaphores on NetBSD diff --git a/configure b/configure index 1dee645c387eb6..7236e0930e15bb 100755 --- a/configure +++ b/configure @@ -13006,6 +13006,10 @@ $as_echo "#define HAVE_BROKEN_POSIX_SEMAPHORES 1" >>confdefs.h AIX/*) $as_echo "#define HAVE_BROKEN_POSIX_SEMAPHORES 1" >>confdefs.h + ;; + NetBSD/*) +$as_echo "#define HAVE_BROKEN_POSIX_SEMAPHORES 1" >>confdefs.h + ;; esac diff --git a/configure.ac b/configure.ac index 7b084a264d411e..aea12128c1217d 100644 --- a/configure.ac +++ b/configure.ac @@ -3716,6 +3716,9 @@ if test "$posix_threads" = "yes"; then AIX/*) AC_DEFINE(HAVE_BROKEN_POSIX_SEMAPHORES, 1, [Define if the Posix semaphores do not work on your system]) ;; + NetBSD/*) AC_DEFINE(HAVE_BROKEN_POSIX_SEMAPHORES, 1, + [Define if the Posix semaphores do not work on your system]) + ;; esac AC_CACHE_CHECK([if PTHREAD_SCOPE_SYSTEM is supported], [ac_cv_pthread_system_supported], From webhook-mailer at python.org Tue Jan 18 15:46:36 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Tue, 18 Jan 2022 20:46:36 -0000 Subject: [Python-checkins] bpo-44024: Improve the TypeError message in getattr and hasattr (GH-25863) Message-ID: https://github.com/python/cpython/commit/16bf9bd157c7bf5f9c60414fa8e0fe5047c55a9b commit: 16bf9bd157c7bf5f9c60414fa8e0fe5047c55a9b branch: main author: Géry Ogam committer: serhiy-storchaka date: 2022-01-18T22:46:26+02:00 summary: bpo-44024: Improve the TypeError message in getattr and hasattr (GH-25863) Use common error message for non-string attribute name in the builtin functions getattr and hasattr. The special check no longer needed since Python 3.0. 
files: A Misc/NEWS.d/next/Core and Builtins/2021-05-04-21-55-49.bpo-44024.M9m8Qd.rst M Lib/test/test_builtin.py M Python/bltinmodule.c diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 4b0b15f0a9361..c6e67cc2910cf 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -509,6 +509,9 @@ def test_delattr(self): sys.spam = 1 delattr(sys, 'spam') self.assertRaises(TypeError, delattr) + self.assertRaises(TypeError, delattr, sys) + msg = r"^attribute name must be string, not 'int'$" + self.assertRaisesRegex(TypeError, msg, delattr, sys, 1) def test_dir(self): # dir(wrong number of arguments) @@ -801,17 +804,21 @@ def test_filter_pickle(self): def test_getattr(self): self.assertTrue(getattr(sys, 'stdout') is sys.stdout) - self.assertRaises(TypeError, getattr, sys, 1) - self.assertRaises(TypeError, getattr, sys, 1, "foo") self.assertRaises(TypeError, getattr) + self.assertRaises(TypeError, getattr, sys) + msg = r"^attribute name must be string, not 'int'$" + self.assertRaisesRegex(TypeError, msg, getattr, sys, 1) + self.assertRaisesRegex(TypeError, msg, getattr, sys, 1, 'spam') self.assertRaises(AttributeError, getattr, sys, chr(sys.maxunicode)) # unicode surrogates are not encodable to the default encoding (utf8) self.assertRaises(AttributeError, getattr, 1, "\uDAD1\uD51E") def test_hasattr(self): self.assertTrue(hasattr(sys, 'stdout')) - self.assertRaises(TypeError, hasattr, sys, 1) self.assertRaises(TypeError, hasattr) + self.assertRaises(TypeError, hasattr, sys) + msg = r"^attribute name must be string, not 'int'$" + self.assertRaisesRegex(TypeError, msg, hasattr, sys, 1) self.assertEqual(False, hasattr(sys, chr(sys.maxunicode))) # Check that hasattr propagates all exceptions outside of @@ -1457,8 +1464,11 @@ def test_bug_27936(self): def test_setattr(self): setattr(sys, 'spam', 1) self.assertEqual(sys.spam, 1) - self.assertRaises(TypeError, setattr, sys, 1, 'spam') self.assertRaises(TypeError, setattr) + self.assertRaises(TypeError, setattr, sys) + self.assertRaises(TypeError, setattr, sys, 'spam') + msg = r"^attribute name must be string, not 'int'$" + self.assertRaisesRegex(TypeError, msg, setattr, sys, 1, 'spam') # test_str(): see test_unicode.py and test_bytes.py for str() tests. diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-05-04-21-55-49.bpo-44024.M9m8Qd.rst b/Misc/NEWS.d/next/Core and Builtins/2021-05-04-21-55-49.bpo-44024.M9m8Qd.rst new file mode 100644 index 0000000000000..5037413353d28 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-05-04-21-55-49.bpo-44024.M9m8Qd.rst @@ -0,0 +1,2 @@ +Improve the exc:`TypeError` message for non-string second arguments passed to +the built-in functions :func:`getattr` and :func:`hasattr`. Patch by G?ry Ogam. 
diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index 6763f9969707d..ef1b2bb9cf644 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -1091,11 +1091,6 @@ builtin_getattr(PyObject *self, PyObject *const *args, Py_ssize_t nargs) v = args[0]; name = args[1]; - if (!PyUnicode_Check(name)) { - PyErr_SetString(PyExc_TypeError, - "getattr(): attribute name must be string"); - return NULL; - } if (nargs > 2) { if (_PyObject_LookupAttr(v, name, &result) == 0) { PyObject *dflt = args[2]; @@ -1156,11 +1151,6 @@ builtin_hasattr_impl(PyObject *module, PyObject *obj, PyObject *name) { PyObject *v; - if (!PyUnicode_Check(name)) { - PyErr_SetString(PyExc_TypeError, - "hasattr(): attribute name must be string"); - return NULL; - } if (_PyObject_LookupAttr(obj, name, &v) < 0) { return NULL; } From webhook-mailer at python.org Tue Jan 18 16:31:40 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 18 Jan 2022 21:31:40 -0000 Subject: [Python-checkins] bpo-45554: Document multiprocessing.Process.exitcode values (GH-30142) Message-ID: https://github.com/python/cpython/commit/3852269b91fcc8ee668cd876b3669eba6da5b1ac commit: 3852269b91fcc8ee668cd876b3669eba6da5b1ac branch: main author: John Marshall committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-18T13:31:27-08:00 summary: bpo-45554: Document multiprocessing.Process.exitcode values (GH-30142) This addresses [bpo-45554]() by expanding the `exitcode` documentation to also describe what `exitcode` will be in cases of normal termination, `sys.exit()` called, and on uncaught exceptions. Automerge-Triggered-By: GH:pitrou files: M Doc/library/multiprocessing.rst M Misc/ACKS diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index 7a1a285255ff7..9bb7dd3d703ab 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -569,8 +569,15 @@ The :mod:`multiprocessing` package mostly replicates the API of the .. attribute:: exitcode The child's exit code. This will be ``None`` if the process has not yet - terminated. A negative value *-N* indicates that the child was terminated - by signal *N*. + terminated. + + If the child's :meth:`run` method returned normally, the exit code + will be 0. If it terminated via :func:`sys.exit` with an integer + argument *N*, the exit code will be *N*. + + If the child terminated due to an exception not caught within + :meth:`run`, the exit code will be 1. If it was terminated by + signal *N*, the exit code will be the negative value *-N*. .. 
attribute:: authkey diff --git a/Misc/ACKS b/Misc/ACKS index 7f2e94dfa615f..04d6a651489bb 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1118,6 +1118,7 @@ Vincent Marchetti David Marek Doug Marien Sven Marnach +John Marshall Alex Martelli Dennis M?rtensson Anthony Martin From webhook-mailer at python.org Tue Jan 18 16:51:38 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 18 Jan 2022 21:51:38 -0000 Subject: [Python-checkins] bpo-45554: Document multiprocessing.Process.exitcode values (GH-30142) Message-ID: https://github.com/python/cpython/commit/4449a1694a0fd2c63fcef5eb7d0ad1d7dfbb6077 commit: 4449a1694a0fd2c63fcef5eb7d0ad1d7dfbb6077 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-18T13:51:30-08:00 summary: bpo-45554: Document multiprocessing.Process.exitcode values (GH-30142) This addresses [bpo-45554]() by expanding the `exitcode` documentation to also describe what `exitcode` will be in cases of normal termination, `sys.exit()` called, and on uncaught exceptions. Automerge-Triggered-By: GH:pitrou (cherry picked from commit 3852269b91fcc8ee668cd876b3669eba6da5b1ac) Co-authored-by: John Marshall files: M Doc/library/multiprocessing.rst M Misc/ACKS diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index e81dd7e648f7f..e0954b285b37b 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -569,8 +569,15 @@ The :mod:`multiprocessing` package mostly replicates the API of the .. attribute:: exitcode The child's exit code. This will be ``None`` if the process has not yet - terminated. A negative value *-N* indicates that the child was terminated - by signal *N*. + terminated. + + If the child's :meth:`run` method returned normally, the exit code + will be 0. If it terminated via :func:`sys.exit` with an integer + argument *N*, the exit code will be *N*. + + If the child terminated due to an exception not caught within + :meth:`run`, the exit code will be 1. If it was terminated by + signal *N*, the exit code will be the negative value *-N*. .. attribute:: authkey diff --git a/Misc/ACKS b/Misc/ACKS index 94b0ed0b241cd..9292bdc8dc73b 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1115,6 +1115,7 @@ Vincent Marchetti David Marek Doug Marien Sven Marnach +John Marshall Alex Martelli Dennis M?rtensson Anthony Martin From webhook-mailer at python.org Tue Jan 18 16:57:48 2022 From: webhook-mailer at python.org (ericvsmith) Date: Tue, 18 Jan 2022 21:57:48 -0000 Subject: [Python-checkins] [3.10] bpo-46402: Promote SQLite URI tricks in sqlite3 docs (GH-30660) (GH-30671) Message-ID: https://github.com/python/cpython/commit/01e6cbefd3d0f60c942ed711131f5d638dde1227 commit: 01e6cbefd3d0f60c942ed711131f5d638dde1227 branch: 3.10 author: Erlend Egeberg Aasland committer: ericvsmith date: 2022-01-18T16:57:33-05:00 summary: [3.10] bpo-46402: Promote SQLite URI tricks in sqlite3 docs (GH-30660) (GH-30671) * bpo-46402: Promote SQLite URI tricks in `sqlite3` docs (GH-30660) Provide some examples of URI parameters in sqlite connect(). 
Co-authored-by: Ned Batchelder (cherry picked from commit bdf2ab1887a2edfb089a3c2a1590cf1e84ea0048) Co-authored-by: Erlend Egeberg Aasland * Update suspicious rules files: M Doc/library/sqlite3.rst M Doc/tools/susp-ignored.csv diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index 1c3bde3b914d0..492dadb2746ac 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -255,14 +255,28 @@ Module functions and constants for the connection, you can set the *cached_statements* parameter. The currently implemented default is to cache 100 statements. - If *uri* is true, *database* is interpreted as a URI. This allows you - to specify options. For example, to open a database in read-only mode - you can use:: - - db = sqlite3.connect('file:path/to/database?mode=ro', uri=True) - - More information about this feature, including a list of recognized options, can - be found in the `SQLite URI documentation `_. + If *uri* is :const:`True`, *database* is interpreted as a + :abbr:`URI (Uniform Resource Identifier)` with a file path and an optional + query string. The scheme part *must* be ``"file:"``. The path can be a + relative or absolute file path. The query string allows us to pass + parameters to SQLite. Some useful URI tricks include:: + + # Open a database in read-only mode. + con = sqlite3.connect("file:template.db?mode=ro", uri=True) + + # Don't implicitly create a new database file if it does not already exist. + # Will raise sqlite3.OperationalError if unable to open a database file. + con = sqlite3.connect("file:nosuchdb.db?mode=rw", uri=True) + + # Create a shared named in-memory database. + con1 = sqlite3.connect("file:mem1?mode=memory&cache=shared", uri=True) + con2 = sqlite3.connect("file:mem1?mode=memory&cache=shared", uri=True) + con1.executescript("create table t(t); insert into t values(28);") + rows = con2.execute("select * from t").fetchall() + + More information about this feature, including a list of recognized + parameters, can be found in the + `SQLite URI documentation `_. .. audit-event:: sqlite3.connect database sqlite3.connect .. audit-event:: sqlite3.connect/handle connection_handle sqlite3.connect diff --git a/Doc/tools/susp-ignored.csv b/Doc/tools/susp-ignored.csv index 2453faa8ccd74..9f4a44f1de64e 100644 --- a/Doc/tools/susp-ignored.csv +++ b/Doc/tools/susp-ignored.csv @@ -212,7 +212,10 @@ library/socket,,:can,"return (can_id, can_dlc, data[:can_dlc])" library/socket,,:len,fds.frombytes(cmsg_data[:len(cmsg_data) - (len(cmsg_data) % fds.itemsize)]) library/sqlite3,,:year,"cur.execute(""select * from lang where first_appeared=:year"", {""year"": 1972})" library/sqlite3,,:memory, -library/sqlite3,,:path,"db = sqlite3.connect('file:path/to/database?mode=ro', uri=True)" +library/sqlite3,,:template,"con = sqlite3.connect(""file:template.db?mode=ro"", uri=True)" +library/sqlite3,,:nosuchdb,"con = sqlite3.connect(""file:nosuchdb.db?mode=rw"", uri=True)" +library/sqlite3,,:mem1,"con1 = sqlite3.connect(""file:mem1?mode=memory&cache=shared"", uri=True)" +library/sqlite3,,:mem1,"con2 = sqlite3.connect(""file:mem1?mode=memory&cache=shared"", uri=True)" library/ssl,,:My,"Organizational Unit Name (eg, section) []:My Group" library/ssl,,:My,"Organization Name (eg, company) [Internet Widgits Pty Ltd]:My Organization, Inc." 
library/ssl,,:myserver,"Common Name (eg, YOUR name) []:myserver.mygroup.myorganization.com" From webhook-mailer at python.org Tue Jan 18 16:58:51 2022 From: webhook-mailer at python.org (ericvsmith) Date: Tue, 18 Jan 2022 21:58:51 -0000 Subject: [Python-checkins] [3.9] bpo-46402: Promote SQLite URI tricks in sqlite3 docs (GH-30660) (#30672) Message-ID: https://github.com/python/cpython/commit/0ae22577606f1b52e3b6c2de6c5b307518044605 commit: 0ae22577606f1b52e3b6c2de6c5b307518044605 branch: 3.9 author: Erlend Egeberg Aasland committer: ericvsmith date: 2022-01-18T16:58:47-05:00 summary: [3.9] bpo-46402: Promote SQLite URI tricks in sqlite3 docs (GH-30660) (#30672) * bpo-46402: Promote SQLite URI tricks in `sqlite3` docs (GH-30660) Provide some examples of URI parameters in sqlite connect(). Co-authored-by: Ned Batchelder (cherry picked from commit bdf2ab1887a2edfb089a3c2a1590cf1e84ea0048) Co-authored-by: Erlend Egeberg Aasland * Update suspicious rules files: M Doc/library/sqlite3.rst M Doc/tools/susp-ignored.csv diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index 0ffb8ff0b969c..e50928a3845c4 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -254,14 +254,28 @@ Module functions and constants for the connection, you can set the *cached_statements* parameter. The currently implemented default is to cache 100 statements. - If *uri* is true, *database* is interpreted as a URI. This allows you - to specify options. For example, to open a database in read-only mode - you can use:: - - db = sqlite3.connect('file:path/to/database?mode=ro', uri=True) - - More information about this feature, including a list of recognized options, can - be found in the `SQLite URI documentation `_. + If *uri* is :const:`True`, *database* is interpreted as a + :abbr:`URI (Uniform Resource Identifier)` with a file path and an optional + query string. The scheme part *must* be ``"file:"``. The path can be a + relative or absolute file path. The query string allows us to pass + parameters to SQLite. Some useful URI tricks include:: + + # Open a database in read-only mode. + con = sqlite3.connect("file:template.db?mode=ro", uri=True) + + # Don't implicitly create a new database file if it does not already exist. + # Will raise sqlite3.OperationalError if unable to open a database file. + con = sqlite3.connect("file:nosuchdb.db?mode=rw", uri=True) + + # Create a shared named in-memory database. + con1 = sqlite3.connect("file:mem1?mode=memory&cache=shared", uri=True) + con2 = sqlite3.connect("file:mem1?mode=memory&cache=shared", uri=True) + con1.executescript("create table t(t); insert into t values(28);") + rows = con2.execute("select * from t").fetchall() + + More information about this feature, including a list of recognized + parameters, can be found in the + `SQLite URI documentation `_. .. 
audit-event:: sqlite3.connect database sqlite3.connect diff --git a/Doc/tools/susp-ignored.csv b/Doc/tools/susp-ignored.csv index 02fe9175278c5..48480ed567713 100644 --- a/Doc/tools/susp-ignored.csv +++ b/Doc/tools/susp-ignored.csv @@ -213,7 +213,10 @@ library/socket,,:can,"return (can_id, can_dlc, data[:can_dlc])" library/socket,,:len,fds.frombytes(cmsg_data[:len(cmsg_data) - (len(cmsg_data) % fds.itemsize)]) library/sqlite3,,:year,"cur.execute(""select * from lang where first_appeared=:year"", {""year"": 1972})" library/sqlite3,,:memory, -library/sqlite3,,:path,"db = sqlite3.connect('file:path/to/database?mode=ro', uri=True)" +library/sqlite3,,:template,"con = sqlite3.connect(""file:template.db?mode=ro"", uri=True)" +library/sqlite3,,:nosuchdb,"con = sqlite3.connect(""file:nosuchdb.db?mode=rw"", uri=True)" +library/sqlite3,,:mem1,"con1 = sqlite3.connect(""file:mem1?mode=memory&cache=shared"", uri=True)" +library/sqlite3,,:mem1,"con2 = sqlite3.connect(""file:mem1?mode=memory&cache=shared"", uri=True)" library/ssl,,:My,"Organizational Unit Name (eg, section) []:My Group" library/ssl,,:My,"Organization Name (eg, company) [Internet Widgits Pty Ltd]:My Organization, Inc." library/ssl,,:myserver,"Common Name (eg, YOUR name) []:myserver.mygroup.myorganization.com" From webhook-mailer at python.org Tue Jan 18 17:03:34 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 18 Jan 2022 22:03:34 -0000 Subject: [Python-checkins] [3.9] bpo-45554: Document multiprocessing.Process.exitcode values (GH-30142) (GH-30675) Message-ID: https://github.com/python/cpython/commit/0be4760d85399a308421d9229b5d7f1b4ec718a2 commit: 0be4760d85399a308421d9229b5d7f1b4ec718a2 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-18T14:03:22-08:00 summary: [3.9] bpo-45554: Document multiprocessing.Process.exitcode values (GH-30142) (GH-30675) This addresses [[bpo-45554]()]() by expanding the `exitcode` documentation to also describe what `exitcode` will be in cases of normal termination, `sys.exit()` called, and on uncaught exceptions. (cherry picked from commit 3852269b91fcc8ee668cd876b3669eba6da5b1ac) Co-authored-by: John Marshall files: M Doc/library/multiprocessing.rst M Misc/ACKS diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index 96bc6c153284a..4966be303f425 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -569,8 +569,15 @@ The :mod:`multiprocessing` package mostly replicates the API of the .. attribute:: exitcode The child's exit code. This will be ``None`` if the process has not yet - terminated. A negative value *-N* indicates that the child was terminated - by signal *N*. + terminated. + + If the child's :meth:`run` method returned normally, the exit code + will be 0. If it terminated via :func:`sys.exit` with an integer + argument *N*, the exit code will be *N*. + + If the child terminated due to an exception not caught within + :meth:`run`, the exit code will be 1. If it was terminated by + signal *N*, the exit code will be the negative value *-N*. .. 
attribute:: authkey diff --git a/Misc/ACKS b/Misc/ACKS index ac893acbf3e46..25c88656d4245 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1095,6 +1095,7 @@ Vincent Marchetti David Marek Doug Marien Sven Marnach +John Marshall Alex Martelli Dennis M?rtensson Anthony Martin From webhook-mailer at python.org Tue Jan 18 18:13:24 2022 From: webhook-mailer at python.org (ethanfurman) Date: Tue, 18 Jan 2022 23:13:24 -0000 Subject: [Python-checkins] bpo-45535: [Enum] include special dunders in dir() (GH-30677) Message-ID: https://github.com/python/cpython/commit/7c0914d35eaaab2f323260ba5fe8884732533888 commit: 7c0914d35eaaab2f323260ba5fe8884732533888 branch: main author: Ethan Furman committer: ethanfurman date: 2022-01-18T15:13:13-08:00 summary: bpo-45535: [Enum] include special dunders in dir() (GH-30677) Include the `__dunders__` in `dir()` that make `Enum` special: - `__contains__` - `__getitem__` - `__iter__` - `__len__` - `__members__` files: M Lib/enum.py M Lib/test/test_enum.py diff --git a/Lib/enum.py b/Lib/enum.py index 772e1eac0e1e6..b510467731293 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -766,29 +766,22 @@ def __delattr__(cls, attr): super().__delattr__(attr) def __dir__(cls): - # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ - # on object-based enums + interesting = set([ + '__class__', '__contains__', '__doc__', '__getitem__', + '__iter__', '__len__', '__members__', '__module__', + '__name__', '__qualname__', + ] + + cls._member_names_ + ) + if cls._new_member_ is not object.__new__: + interesting.add('__new__') + if cls.__init_subclass__ is not object.__init_subclass__: + interesting.add('__init_subclass__') if cls._member_type_ is object: - interesting = set(cls._member_names_) - if cls._new_member_ is not object.__new__: - interesting.add('__new__') - if cls.__init_subclass__ is not object.__init_subclass__: - interesting.add('__init_subclass__') - for method in ('__init__', '__format__', '__repr__', '__str__'): - if getattr(cls, method) not in (getattr(Enum, method), getattr(Flag, method)): - interesting.add(method) - return sorted(set([ - '__class__', '__contains__', '__doc__', '__getitem__', - '__iter__', '__len__', '__members__', '__module__', - '__name__', '__qualname__', - ]) | interesting - ) + return sorted(interesting) else: # return whatever mixed-in data type has - return sorted(set( - dir(cls._member_type_) - + cls._member_names_ - )) + return sorted(set(dir(cls._member_type_)) | interesting) def __getattr__(cls, name): """ diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 18cc2f30ce559..d7ce8add78715 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -883,14 +883,15 @@ class Part(Enum): with self.assertRaises(TypeError): Season.SPRING < Part.CLIP + @unittest.skip('to-do list') def test_dir_with_custom_dunders(self): class PlainEnum(Enum): pass cls_dir = dir(PlainEnum) self.assertNotIn('__repr__', cls_dir) self.assertNotIn('__str__', cls_dir) - self.assertNotIn('__repr__', cls_dir) - self.assertNotIn('__repr__', cls_dir) + self.assertNotIn('__format__', cls_dir) + self.assertNotIn('__init__', cls_dir) # class MyEnum(Enum): def __repr__(self): @@ -904,8 +905,8 @@ def __init__(self): cls_dir = dir(MyEnum) self.assertIn('__repr__', cls_dir) self.assertIn('__str__', cls_dir) - self.assertIn('__repr__', cls_dir) - self.assertIn('__repr__', cls_dir) + self.assertIn('__format__', cls_dir) + self.assertIn('__init__', cls_dir) def test_duplicate_name_error(self): with self.assertRaises(TypeError): @@ 
-4322,13 +4323,18 @@ def test_convert_int(self): int_dir = dir(int) + [ 'CONVERT_TEST_NAME_A', 'CONVERT_TEST_NAME_B', 'CONVERT_TEST_NAME_C', 'CONVERT_TEST_NAME_D', 'CONVERT_TEST_NAME_E', 'CONVERT_TEST_NAME_F', + 'CONVERT_TEST_SIGABRT', 'CONVERT_TEST_SIGIOT', + 'CONVERT_TEST_EIO', 'CONVERT_TEST_EBUS', ] + extra = [name for name in dir(test_type) if name not in enum_dir(test_type)] + missing = [name for name in enum_dir(test_type) if name not in dir(test_type)] self.assertEqual( - [name for name in dir(test_type) if name not in int_dir], + extra + missing, [], - msg='Names other than CONVERT_TEST_* found.', + msg='extra names: %r; missing names: %r' % (extra, missing), ) + def test_convert_uncomparable(self): uncomp = enum.Enum._convert_( 'Uncomparable', @@ -4362,10 +4368,12 @@ def test_convert_str(self): self.assertEqual(test_type.CONVERT_STR_TEST_2, 'goodbye') # Ensure that test_type only picked up names matching the filter. str_dir = dir(str) + ['CONVERT_STR_TEST_1', 'CONVERT_STR_TEST_2'] + extra = [name for name in dir(test_type) if name not in enum_dir(test_type)] + missing = [name for name in enum_dir(test_type) if name not in dir(test_type)] self.assertEqual( - [name for name in dir(test_type) if name not in str_dir], + extra + missing, [], - msg='Names other than CONVERT_STR_* found.', + msg='extra names: %r; missing names: %r' % (extra, missing), ) self.assertEqual(repr(test_type.CONVERT_STR_TEST_1), '%s.CONVERT_STR_TEST_1' % SHORT_MODULE) self.assertEqual(str(test_type.CONVERT_STR_TEST_2), 'goodbye') @@ -4392,25 +4400,22 @@ def test_convert_repr_and_str(self): # helpers def enum_dir(cls): - # TODO: check for custom __init__, __new__, __format__, __repr__, __str__, __init_subclass__ + interesting = set([ + '__class__', '__contains__', '__doc__', '__getitem__', + '__iter__', '__len__', '__members__', '__module__', + '__name__', '__qualname__', + ] + + cls._member_names_ + ) + if cls._new_member_ is not object.__new__: + interesting.add('__new__') + if cls.__init_subclass__ is not object.__init_subclass__: + interesting.add('__init_subclass__') if cls._member_type_ is object: - interesting = set() - if cls.__init_subclass__ is not object.__init_subclass__: - interesting.add('__init_subclass__') - return sorted(set([ - '__class__', '__contains__', '__doc__', '__getitem__', - '__iter__', '__len__', '__members__', '__module__', - '__name__', '__qualname__', - ] - + cls._member_names_ - ) | interesting - ) + return sorted(interesting) else: # return whatever mixed-in data type has - return sorted(set( - dir(cls._member_type_) - + cls._member_names_ - )) + return sorted(set(dir(cls._member_type_)) | interesting) def member_dir(member): if member.__class__._member_type_ is object: From webhook-mailer at python.org Wed Jan 19 05:27:37 2022 From: webhook-mailer at python.org (vstinner) Date: Wed, 19 Jan 2022 10:27:37 -0000 Subject: [Python-checkins] bpo-43869: Time Epoch is the same on all platforms (GH-30664) Message-ID: https://github.com/python/cpython/commit/a847785b40ed8819bde2dac5849dc31d15e99a74 commit: a847785b40ed8819bde2dac5849dc31d15e99a74 branch: main author: Victor Stinner committer: vstinner date: 2022-01-19T11:27:11+01:00 summary: bpo-43869: Time Epoch is the same on all platforms (GH-30664) files: A Misc/NEWS.d/next/Library/2022-01-18-17-24-21.bpo-43869.NayN12.rst M Doc/library/time.rst M Lib/test/test_time.py diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 3a771208519b4..d524f4ffebc75 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -21,10 +21,8 
@@ An explanation of some terminology and conventions is in order. .. index:: single: epoch -* The :dfn:`epoch` is the point where the time starts, and is platform - dependent. For Unix and Windows, the epoch is January 1, 1970, 00:00:00 (UTC). - To find out what the epoch is on a given platform, look at - ``time.gmtime(0)``. +* The :dfn:`epoch` is the point where the time starts, the return value of + ``time.gmtime(0)``. It is January 1, 1970, 00:00:00 (UTC) on all platforms. .. _leap seconds: https://en.wikipedia.org/wiki/Leap_second @@ -37,7 +35,7 @@ An explanation of some terminology and conventions is in order. .. index:: single: Year 2038 -* The functions in this module may not handle dates and times before the epoch or +* The functions in this module may not handle dates and times before the epoch_ or far in the future. The cut-off point in the future is determined by the C library; for 32-bit systems, it is typically in 2038. @@ -207,7 +205,7 @@ Functions .. function:: ctime([secs]) - Convert a time expressed in seconds since the epoch to a string of a form: + Convert a time expressed in seconds since the epoch_ to a string of a form: ``'Sun Jun 20 23:21:05 1993'`` representing local time. The day field is two characters long and is space padded if the day is a single digit, e.g.: ``'Wed Jun 9 04:26:40 1993'``. @@ -245,7 +243,7 @@ Functions .. function:: gmtime([secs]) - Convert a time expressed in seconds since the epoch to a :class:`struct_time` in + Convert a time expressed in seconds since the epoch_ to a :class:`struct_time` in UTC in which the dst flag is always zero. If *secs* is not provided or :const:`None`, the current time as returned by :func:`.time` is used. Fractions of a second are ignored. See above for a description of the @@ -601,14 +599,10 @@ Functions .. function:: time() -> float Return the time in seconds since the epoch_ as a floating point - number. The specific date of the epoch and the handling of - `leap seconds`_ is platform dependent. - On Windows and most Unix systems, the epoch is January 1, 1970, - 00:00:00 (UTC) and leap seconds are not counted towards the time - in seconds since the epoch. This is commonly referred to as - `Unix time `_. - To find out what the epoch is on a given platform, look at - ``gmtime(0)``. + number. The handling of `leap seconds`_ is platform dependent. + On Windows and most Unix systems, the leap seconds are not counted towards + the time in seconds since the epoch_. This is commonly referred to as `Unix + time `_. Note that even though the time is always returned as a floating point number, not all systems provide time with a better precision than 1 second. @@ -629,8 +623,8 @@ Functions .. function:: time_ns() -> int - Similar to :func:`~time.time` but returns time as an integer number of nanoseconds - since the epoch_. + Similar to :func:`~time.time` but returns time as an integer number of + nanoseconds since the epoch_. .. versionadded:: 3.7 diff --git a/Lib/test/test_time.py b/Lib/test/test_time.py index 1aa5874dfe272..1c444e381a552 100644 --- a/Lib/test/test_time.py +++ b/Lib/test/test_time.py @@ -159,6 +159,13 @@ def test_sleep(self): self.assertRaises(ValueError, time.sleep, -1) time.sleep(1.2) + def test_epoch(self): + # bpo-43869: Make sure that Python use the same Epoch on all platforms: + # January 1, 1970, 00:00:00 (UTC). 
+ epoch = time.gmtime(0) + # Only test the date and time, ignore other gmtime() members + self.assertEqual(tuple(epoch)[:6], (1970, 1, 1, 0, 0, 0), epoch) + def test_strftime(self): tt = time.gmtime(self.t) for directive in ('a', 'A', 'b', 'B', 'c', 'd', 'H', 'I', diff --git a/Misc/NEWS.d/next/Library/2022-01-18-17-24-21.bpo-43869.NayN12.rst b/Misc/NEWS.d/next/Library/2022-01-18-17-24-21.bpo-43869.NayN12.rst new file mode 100644 index 0000000000000..5486c95b0689b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-18-17-24-21.bpo-43869.NayN12.rst @@ -0,0 +1,2 @@ +Python uses the same time Epoch on all platforms. Add an explicit unit test +to ensure that it's the case. Patch by Victor Stinner. From webhook-mailer at python.org Wed Jan 19 07:04:00 2022 From: webhook-mailer at python.org (iritkatriel) Date: Wed, 19 Jan 2022 12:04:00 -0000 Subject: [Python-checkins] bpo-22039: [doc] clarify that there are no plans to disable deleting an attribute via PyObject_SetAttr (GH-30639) Message-ID: https://github.com/python/cpython/commit/3bf6315c4cabf72d64e65e6f85bf72c65137255a commit: 3bf6315c4cabf72d64e65e6f85bf72c65137255a branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-19T12:03:51Z summary: bpo-22039: [doc] clarify that there are no plans to disable deleting an attribute via PyObject_SetAttr (GH-30639) files: M Doc/c-api/object.rst diff --git a/Doc/c-api/object.rst b/Doc/c-api/object.rst index 17e3707799496..41a3affcf9842 100644 --- a/Doc/c-api/object.rst +++ b/Doc/c-api/object.rst @@ -81,8 +81,9 @@ Object Protocol return ``0`` on success. This is the equivalent of the Python statement ``o.attr_name = v``. - If *v* is ``NULL``, the attribute is deleted, however this feature is - deprecated in favour of using :c:func:`PyObject_DelAttr`. + If *v* is ``NULL``, the attribute is deleted. This behaviour is deprecated + in favour of using :c:func:`PyObject_DelAttr`, but there are currently no + plans to remove it. .. c:function:: int PyObject_SetAttrString(PyObject *o, const char *attr_name, PyObject *v) From webhook-mailer at python.org Wed Jan 19 07:30:10 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 12:30:10 -0000 Subject: [Python-checkins] [3.9] bpo-22039: [doc] clarify that there are no plans to disable deleting an attribute via PyObject_SetAttr (GH-30639) (GH-30685) Message-ID: https://github.com/python/cpython/commit/7b694b816f30c463ffcab0952d3319320d23e154 commit: 7b694b816f30c463ffcab0952d3319320d23e154 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T04:29:52-08:00 summary: [3.9] bpo-22039: [doc] clarify that there are no plans to disable deleting an attribute via PyObject_SetAttr (GH-30639) (GH-30685) (cherry picked from commit 3bf6315c4cabf72d64e65e6f85bf72c65137255a) Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> Automerge-Triggered-By: GH:iritkatriel files: M Doc/c-api/object.rst diff --git a/Doc/c-api/object.rst b/Doc/c-api/object.rst index 05faa72346ed6..9f874382e85bb 100644 --- a/Doc/c-api/object.rst +++ b/Doc/c-api/object.rst @@ -81,8 +81,9 @@ Object Protocol return ``0`` on success. This is the equivalent of the Python statement ``o.attr_name = v``. 
- If *v* is ``NULL``, the attribute is deleted, however this feature is - deprecated in favour of using :c:func:`PyObject_DelAttr`. + If *v* is ``NULL``, the attribute is deleted. This behaviour is deprecated + in favour of using :c:func:`PyObject_DelAttr`, but there are currently no + plans to remove it. .. c:function:: int PyObject_SetAttrString(PyObject *o, const char *attr_name, PyObject *v) From webhook-mailer at python.org Wed Jan 19 07:34:37 2022 From: webhook-mailer at python.org (iritkatriel) Date: Wed, 19 Jan 2022 12:34:37 -0000 Subject: [Python-checkins] bpo-22039: [doc] clarify that there are no plans to disable deleting an attribute via PyObject_SetAttr (GH-30639) (GH-30684) Message-ID: https://github.com/python/cpython/commit/0861a50bd434d1f3e12fe7122e37356f1fce93dc commit: 0861a50bd434d1f3e12fe7122e37356f1fce93dc branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-19T12:34:17Z summary: bpo-22039: [doc] clarify that there are no plans to disable deleting an attribute via PyObject_SetAttr (GH-30639) (GH-30684) (cherry picked from commit 3bf6315c4cabf72d64e65e6f85bf72c65137255a) Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> Co-authored-by: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> files: M Doc/c-api/object.rst diff --git a/Doc/c-api/object.rst b/Doc/c-api/object.rst index 17e3707799496..41a3affcf9842 100644 --- a/Doc/c-api/object.rst +++ b/Doc/c-api/object.rst @@ -81,8 +81,9 @@ Object Protocol return ``0`` on success. This is the equivalent of the Python statement ``o.attr_name = v``. - If *v* is ``NULL``, the attribute is deleted, however this feature is - deprecated in favour of using :c:func:`PyObject_DelAttr`. + If *v* is ``NULL``, the attribute is deleted. This behaviour is deprecated + in favour of using :c:func:`PyObject_DelAttr`, but there are currently no + plans to remove it. .. c:function:: int PyObject_SetAttrString(PyObject *o, const char *attr_name, PyObject *v) From webhook-mailer at python.org Wed Jan 19 09:13:49 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Wed, 19 Jan 2022 14:13:49 -0000 Subject: [Python-checkins] bpo-46416: Allow direct invocation of `Lib/test/test_typing.py` (GH-30641) Message-ID: https://github.com/python/cpython/commit/2792d6d18eab3efeb71e6397f88db86e610541f1 commit: 2792d6d18eab3efeb71e6397f88db86e610541f1 branch: main author: Nikita Sobolev committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-19T22:13:38+08:00 summary: bpo-46416: Allow direct invocation of `Lib/test/test_typing.py` (GH-30641) Use `__name__` files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 97c2c7f56cecb..e61f8828f5405 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -5071,7 +5071,7 @@ def test_special_attrs2(self): ) self.assertEqual( SpecialAttrsTests.TypeName.__module__, - 'test.test_typing', + __name__, ) # NewTypes are picklable assuming correct qualname information. for proto in range(pickle.HIGHEST_PROTOCOL + 1): @@ -5085,7 +5085,7 @@ def test_special_attrs2(self): # __qualname__ is unnecessary. 
self.assertEqual(SpecialAttrsT.__name__, 'SpecialAttrsT') self.assertFalse(hasattr(SpecialAttrsT, '__qualname__')) - self.assertEqual(SpecialAttrsT.__module__, 'test.test_typing') + self.assertEqual(SpecialAttrsT.__module__, __name__) # Module-level type variables are picklable. for proto in range(pickle.HIGHEST_PROTOCOL + 1): s = pickle.dumps(SpecialAttrsT, proto) @@ -5094,7 +5094,7 @@ def test_special_attrs2(self): self.assertEqual(SpecialAttrsP.__name__, 'SpecialAttrsP') self.assertFalse(hasattr(SpecialAttrsP, '__qualname__')) - self.assertEqual(SpecialAttrsP.__module__, 'test.test_typing') + self.assertEqual(SpecialAttrsP.__module__, __name__) # Module-level ParamSpecs are picklable. for proto in range(pickle.HIGHEST_PROTOCOL + 1): s = pickle.dumps(SpecialAttrsP, proto) From webhook-mailer at python.org Wed Jan 19 09:46:00 2022 From: webhook-mailer at python.org (pablogsal) Date: Wed, 19 Jan 2022 14:46:00 -0000 Subject: [Python-checkins] bpo-46231: Remove invalid_* rules preceded by more tokens from the grammar docs (GH-30341) (GH-30392) Message-ID: https://github.com/python/cpython/commit/353674f289076eecf848d7a26871cce529b89a98 commit: 353674f289076eecf848d7a26871cce529b89a98 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: pablogsal date: 2022-01-19T14:45:50Z summary: bpo-46231: Remove invalid_* rules preceded by more tokens from the grammar docs (GH-30341) (GH-30392) (cherry picked from commit e09d94a140a5f6903017da9b6ac752ba041d69da) Co-authored-by: Pablo Galindo Salgado Co-authored-by: Pablo Galindo Salgado files: M Doc/tools/extensions/peg_highlight.py diff --git a/Doc/tools/extensions/peg_highlight.py b/Doc/tools/extensions/peg_highlight.py index 8bc24670fbe0a..4262687d95b9d 100644 --- a/Doc/tools/extensions/peg_highlight.py +++ b/Doc/tools/extensions/peg_highlight.py @@ -45,8 +45,8 @@ class PEGLexer(RegexLexer): ], "variables": [(_name + _text_ws + "(=)", bygroups(None, None, None),),], "invalids": [ - (r"^(\s+\|\s+invalid_\w+\s*\n)", bygroups(None)), - (r"^(\s+\|\s+incorrect_\w+\s*\n)", bygroups(None)), + (r"^(\s+\|\s+.*invalid_\w+.*\n)", bygroups(None)), + (r"^(\s+\|\s+.*incorrect_\w+.*\n)", bygroups(None)), (r"^(#.*invalid syntax.*(?:.|\n)*)", bygroups(None),), ], "root": [ From webhook-mailer at python.org Wed Jan 19 09:53:51 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Wed, 19 Jan 2022 14:53:51 -0000 Subject: [Python-checkins] bpo-45680: Clarify documentation on ``GenericAlias`` objects (GH-29335) Message-ID: https://github.com/python/cpython/commit/0eae9a2a2db6cc5a72535f61bb988cc417011640 commit: 0eae9a2a2db6cc5a72535f61bb988cc417011640 branch: main author: Alex Waygood committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-19T22:53:41+08:00 summary: bpo-45680: Clarify documentation on ``GenericAlias`` objects (GH-29335) The documentation on ``GenericAlias`` objects implies at multiple points that only container classes can define ``__class_getitem__``. This is misleading. This PR proposes a rewrite of the documentation to clarify that non-container classes can define ``__class_getitem__``, and to clarify what it means when a non-container class is parameterized. See also: initial discussion of issues with this piece of documentation in GH-29308, and previous BPO issue [42280](https://bugs.python.org/issue42280). Also improved references in glossary and typing docs. Fixed some links. 
Co-authored-by: Erlend Egeberg Aasland Co-authored-by: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> files: M Doc/library/stdtypes.rst diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index dc423bfbb7f55..f97e7c222b172 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -4834,33 +4834,54 @@ Generic Alias Type object: GenericAlias pair: Generic; Alias -``GenericAlias`` objects are created by subscripting a class (usually a -container), such as ``list[int]``. They are intended primarily for +``GenericAlias`` objects are generally created by +:ref:`subscripting ` a class. They are most often used with +:ref:`container classes `, such as :class:`list` or +:class:`dict`. For example, ``list[int]`` is a ``GenericAlias`` object created +by subscripting the ``list`` class with the argument :class:`int`. +``GenericAlias`` objects are intended primarily for use with :term:`type annotations `. -Usually, the :ref:`subscription ` of container objects calls the -method :meth:`__getitem__` of the object. However, the subscription of some -containers' classes may call the classmethod :meth:`__class_getitem__` of the -class instead. The classmethod :meth:`__class_getitem__` should return a -``GenericAlias`` object. - .. note:: - If the :meth:`__getitem__` of the class' metaclass is present, it will take - precedence over the :meth:`__class_getitem__` defined in the class (see - :pep:`560` for more details). -The ``GenericAlias`` object acts as a proxy for :term:`generic types -`, implementing *parameterized generics* - a specific instance -of a generic which provides the types for container elements. + It is generally only possible to subscript a class if the class implements + the special method :meth:`~object.__class_getitem__`. + +A ``GenericAlias`` object acts as a proxy for a :term:`generic type`, +implementing *parameterized generics*. + +For a container class, the +argument(s) supplied to a :ref:`subscription ` of the class may +indicate the type(s) of the elements an object contains. For example, +``set[bytes]`` can be used in type annotations to signify a :class:`set` in +which all the elements are of type :class:`bytes`. + +For a class which defines :meth:`~object.__class_getitem__` but is not a +container, the argument(s) supplied to a subscription of the class will often +indicate the return type(s) of one or more methods defined on an object. For +example, :mod:`regular expressions ` can be used on both the :class:`str` data +type and the :class:`bytes` data type: + +* If ``x = re.search('foo', 'foo')``, ``x`` will be a + :ref:`re.Match ` object where the return values of + ``x.group(0)`` and ``x[0]`` will both be of type :class:`str`. We can + represent this kind of object in type annotations with the ``GenericAlias`` + ``re.Match[str]``. -The user-exposed type for the ``GenericAlias`` object can be accessed from -:class:`types.GenericAlias` and used for :func:`isinstance` checks. It can -also be used to create ``GenericAlias`` objects directly. +* If ``y = re.search(b'bar', b'bar')``, (note the ``b`` for :class:`bytes`), + ``y`` will also be an instance of ``re.Match``, but the return + values of ``y.group(0)`` and ``y[0]`` will both be of type + :class:`bytes`. In type annotations, we would represent this + variety of :ref:`re.Match ` objects with ``re.Match[bytes]``. + +``GenericAlias`` objects are instances of the class +:class:`types.GenericAlias`, which can also be used to create ``GenericAlias`` +objects directly. .. 
describe:: T[X, Y, ...] - Creates a ``GenericAlias`` representing a type ``T`` containing elements - of types *X*, *Y*, and more depending on the ``T`` used. + Creates a ``GenericAlias`` representing a type ``T`` parameterized by types + *X*, *Y*, and more depending on the ``T`` used. For example, a function expecting a :class:`list` containing :class:`float` elements:: @@ -4885,7 +4906,7 @@ The builtin functions :func:`isinstance` and :func:`issubclass` do not accept The Python runtime does not enforce :term:`type annotations `. This extends to generic types and their type parameters. When creating -an object from a ``GenericAlias``, container elements are not checked +a container object from a ``GenericAlias``, the elements in the container are not checked against their type. For example, the following code is discouraged, but will run without errors:: @@ -4912,8 +4933,8 @@ Calling :func:`repr` or :func:`str` on a generic shows the parameterized type:: >>> str(list[int]) 'list[int]' -The :meth:`__getitem__` method of generics will raise an exception to disallow -mistakes like ``dict[str][str]``:: +The :meth:`~object.__getitem__` method of generic containers will raise an +exception to disallow mistakes like ``dict[str][str]``:: >>> dict[str][str] Traceback (most recent call last): @@ -4922,7 +4943,7 @@ mistakes like ``dict[str][str]``:: However, such expressions are valid when :ref:`type variables ` are used. The index must have as many elements as there are type variable items -in the ``GenericAlias`` object's :attr:`__args__ `. :: +in the ``GenericAlias`` object's :attr:`~genericalias.__args__`. :: >>> from typing import TypeVar >>> Y = TypeVar('Y') @@ -4930,10 +4951,11 @@ in the ``GenericAlias`` object's :attr:`__args__ `. :: dict[str, int] -Standard Generic Collections -^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Standard Generic Classes +^^^^^^^^^^^^^^^^^^^^^^^^ -These standard library collections support parameterized generics. +The following standard library classes support parameterized generics. This +list is non-exhaustive. * :class:`tuple` * :class:`list` @@ -4971,12 +4993,33 @@ These standard library collections support parameterized generics. * :class:`collections.abc.ValuesView` * :class:`contextlib.AbstractContextManager` * :class:`contextlib.AbstractAsyncContextManager` +* :class:`dataclasses.Field` +* :class:`functools.cached_property` +* :class:`functools.partialmethod` +* :class:`os.PathLike` +* :class:`pathlib.Path` +* :class:`pathlib.PurePath` +* :class:`pathlib.PurePosixPath` +* :class:`pathlib.PureWindowsPath` +* :class:`queue.LifoQueue` +* :class:`queue.Queue` +* :class:`queue.PriorityQueue` +* :class:`queue.SimpleQueue` * :ref:`re.Pattern ` * :ref:`re.Match ` +* :class:`shelve.BsdDbShelf` +* :class:`shelve.DbfilenameShelf` +* :class:`shelve.Shelf` +* :class:`types.MappingProxyType` +* :class:`weakref.WeakKeyDictionary` +* :class:`weakref.WeakMethod` +* :class:`weakref.WeakSet` +* :class:`weakref.WeakValueDictionary` + -Special Attributes of Generic Alias -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Special Attributes of ``GenericAlias`` objects +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ All parameterized generics implement special read-only attributes. @@ -4991,8 +5034,8 @@ All parameterized generics implement special read-only attributes. .. 
attribute:: genericalias.__args__ This attribute is a :class:`tuple` (possibly of length 1) of generic - types passed to the original :meth:`__class_getitem__` - of the generic container:: + types passed to the original :meth:`~object.__class_getitem__` of the + generic class:: >>> dict[str, list[int]].__args__ (, list[int]) @@ -5017,9 +5060,17 @@ All parameterized generics implement special read-only attributes. .. seealso:: - * :pep:`585` -- "Type Hinting Generics In Standard Collections" - * :meth:`__class_getitem__` -- Used to implement parameterized generics. - * :ref:`generics` -- Generics in the :mod:`typing` module. + :pep:`484` - Type Hints + Introducing Python's framework for type annotations. + + :pep:`585` - "Type Hinting Generics In Standard Collections" + Introducing the ability to natively parameterize standard-library + classes, provided they implement the special class method + :meth:`~object.__class_getitem__`. + + :ref:`Generics`, :ref:`user-defined generics ` and :class:`typing.Generic` + Documentation on how to implement generic classes that can be + parameterized at runtime and understood by static type-checkers. .. versionadded:: 3.9 From webhook-mailer at python.org Wed Jan 19 10:24:36 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 15:24:36 -0000 Subject: [Python-checkins] [3.10] bpo-45680: Clarify documentation on ``GenericAlias`` objects (GH-29335) (GH-30688) Message-ID: https://github.com/python/cpython/commit/24d0b331e81b4e4af8dd4c1b66ea7159c1fdabc5 commit: 24d0b331e81b4e4af8dd4c1b66ea7159c1fdabc5 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T07:24:14-08:00 summary: [3.10] bpo-45680: Clarify documentation on ``GenericAlias`` objects (GH-29335) (GH-30688) The documentation on ``GenericAlias`` objects implies at multiple points that only container classes can define ``__class_getitem__``. This is misleading. This PR proposes a rewrite of the documentation to clarify that non-container classes can define ``__class_getitem__``, and to clarify what it means when a non-container class is parameterized. See also: initial discussion of issues with this piece of documentation in GH-29308, and previous BPO issue [42280](). Also improved references in glossary and typing docs. Fixed some links. Co-authored-by: Erlend Egeberg Aasland Co-authored-by: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> (cherry picked from commit 0eae9a2a2db6cc5a72535f61bb988cc417011640) Co-authored-by: Alex Waygood Automerge-Triggered-By: GH:Fidget-Spinner files: M Doc/library/stdtypes.rst diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index 8fa252b04d706..08e7c0db8cc97 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -4796,33 +4796,54 @@ Generic Alias Type object: GenericAlias pair: Generic; Alias -``GenericAlias`` objects are created by subscripting a class (usually a -container), such as ``list[int]``. They are intended primarily for +``GenericAlias`` objects are generally created by +:ref:`subscripting ` a class. They are most often used with +:ref:`container classes `, such as :class:`list` or +:class:`dict`. For example, ``list[int]`` is a ``GenericAlias`` object created +by subscripting the ``list`` class with the argument :class:`int`. +``GenericAlias`` objects are intended primarily for use with :term:`type annotations `. 
-Usually, the :ref:`subscription ` of container objects calls the -method :meth:`__getitem__` of the object. However, the subscription of some -containers' classes may call the classmethod :meth:`__class_getitem__` of the -class instead. The classmethod :meth:`__class_getitem__` should return a -``GenericAlias`` object. - .. note:: - If the :meth:`__getitem__` of the class' metaclass is present, it will take - precedence over the :meth:`__class_getitem__` defined in the class (see - :pep:`560` for more details). -The ``GenericAlias`` object acts as a proxy for :term:`generic types -`, implementing *parameterized generics* - a specific instance -of a generic which provides the types for container elements. + It is generally only possible to subscript a class if the class implements + the special method :meth:`~object.__class_getitem__`. + +A ``GenericAlias`` object acts as a proxy for a :term:`generic type`, +implementing *parameterized generics*. + +For a container class, the +argument(s) supplied to a :ref:`subscription ` of the class may +indicate the type(s) of the elements an object contains. For example, +``set[bytes]`` can be used in type annotations to signify a :class:`set` in +which all the elements are of type :class:`bytes`. + +For a class which defines :meth:`~object.__class_getitem__` but is not a +container, the argument(s) supplied to a subscription of the class will often +indicate the return type(s) of one or more methods defined on an object. For +example, :mod:`regular expressions ` can be used on both the :class:`str` data +type and the :class:`bytes` data type: + +* If ``x = re.search('foo', 'foo')``, ``x`` will be a + :ref:`re.Match ` object where the return values of + ``x.group(0)`` and ``x[0]`` will both be of type :class:`str`. We can + represent this kind of object in type annotations with the ``GenericAlias`` + ``re.Match[str]``. -The user-exposed type for the ``GenericAlias`` object can be accessed from -:class:`types.GenericAlias` and used for :func:`isinstance` checks. It can -also be used to create ``GenericAlias`` objects directly. +* If ``y = re.search(b'bar', b'bar')``, (note the ``b`` for :class:`bytes`), + ``y`` will also be an instance of ``re.Match``, but the return + values of ``y.group(0)`` and ``y[0]`` will both be of type + :class:`bytes`. In type annotations, we would represent this + variety of :ref:`re.Match ` objects with ``re.Match[bytes]``. + +``GenericAlias`` objects are instances of the class +:class:`types.GenericAlias`, which can also be used to create ``GenericAlias`` +objects directly. .. describe:: T[X, Y, ...] - Creates a ``GenericAlias`` representing a type ``T`` containing elements - of types *X*, *Y*, and more depending on the ``T`` used. + Creates a ``GenericAlias`` representing a type ``T`` parameterized by types + *X*, *Y*, and more depending on the ``T`` used. For example, a function expecting a :class:`list` containing :class:`float` elements:: @@ -4847,7 +4868,7 @@ The builtin functions :func:`isinstance` and :func:`issubclass` do not accept The Python runtime does not enforce :term:`type annotations `. This extends to generic types and their type parameters. When creating -an object from a ``GenericAlias``, container elements are not checked +a container object from a ``GenericAlias``, the elements in the container are not checked against their type. 
For example, the following code is discouraged, but will run without errors:: @@ -4874,8 +4895,8 @@ Calling :func:`repr` or :func:`str` on a generic shows the parameterized type:: >>> str(list[int]) 'list[int]' -The :meth:`__getitem__` method of generics will raise an exception to disallow -mistakes like ``dict[str][str]``:: +The :meth:`~object.__getitem__` method of generic containers will raise an +exception to disallow mistakes like ``dict[str][str]``:: >>> dict[str][str] Traceback (most recent call last): @@ -4884,7 +4905,7 @@ mistakes like ``dict[str][str]``:: However, such expressions are valid when :ref:`type variables ` are used. The index must have as many elements as there are type variable items -in the ``GenericAlias`` object's :attr:`__args__ `. :: +in the ``GenericAlias`` object's :attr:`~genericalias.__args__`. :: >>> from typing import TypeVar >>> Y = TypeVar('Y') @@ -4892,10 +4913,11 @@ in the ``GenericAlias`` object's :attr:`__args__ `. :: dict[str, int] -Standard Generic Collections -^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Standard Generic Classes +^^^^^^^^^^^^^^^^^^^^^^^^ -These standard library collections support parameterized generics. +The following standard library classes support parameterized generics. This +list is non-exhaustive. * :class:`tuple` * :class:`list` @@ -4933,12 +4955,33 @@ These standard library collections support parameterized generics. * :class:`collections.abc.ValuesView` * :class:`contextlib.AbstractContextManager` * :class:`contextlib.AbstractAsyncContextManager` +* :class:`dataclasses.Field` +* :class:`functools.cached_property` +* :class:`functools.partialmethod` +* :class:`os.PathLike` +* :class:`pathlib.Path` +* :class:`pathlib.PurePath` +* :class:`pathlib.PurePosixPath` +* :class:`pathlib.PureWindowsPath` +* :class:`queue.LifoQueue` +* :class:`queue.Queue` +* :class:`queue.PriorityQueue` +* :class:`queue.SimpleQueue` * :ref:`re.Pattern ` * :ref:`re.Match ` +* :class:`shelve.BsdDbShelf` +* :class:`shelve.DbfilenameShelf` +* :class:`shelve.Shelf` +* :class:`types.MappingProxyType` +* :class:`weakref.WeakKeyDictionary` +* :class:`weakref.WeakMethod` +* :class:`weakref.WeakSet` +* :class:`weakref.WeakValueDictionary` + -Special Attributes of Generic Alias -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Special Attributes of ``GenericAlias`` objects +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ All parameterized generics implement special read-only attributes. @@ -4953,8 +4996,8 @@ All parameterized generics implement special read-only attributes. .. attribute:: genericalias.__args__ This attribute is a :class:`tuple` (possibly of length 1) of generic - types passed to the original :meth:`__class_getitem__` - of the generic container:: + types passed to the original :meth:`~object.__class_getitem__` of the + generic class:: >>> dict[str, list[int]].__args__ (, list[int]) @@ -4979,9 +5022,17 @@ All parameterized generics implement special read-only attributes. .. seealso:: - * :pep:`585` -- "Type Hinting Generics In Standard Collections" - * :meth:`__class_getitem__` -- Used to implement parameterized generics. - * :ref:`generics` -- Generics in the :mod:`typing` module. + :pep:`484` - Type Hints + Introducing Python's framework for type annotations. + + :pep:`585` - Type Hinting Generics In Standard Collections + Introducing the ability to natively parameterize standard-library + classes, provided they implement the special class method + :meth:`~object.__class_getitem__`. 
+ + :ref:`Generics`, :ref:`user-defined generics ` and :class:`typing.Generic` + Documentation on how to implement generic classes that can be + parameterized at runtime and understood by static type-checkers. .. versionadded:: 3.9 From webhook-mailer at python.org Wed Jan 19 10:32:36 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Wed, 19 Jan 2022 15:32:36 -0000 Subject: [Python-checkins] bpo-46413: properly test `__{r}or__` code paths in `_SpecialGenericAlias` (GH-30640) Message-ID: https://github.com/python/cpython/commit/0a49148e87cca11e3820cbff2abfd316986a68c6 commit: 0a49148e87cca11e3820cbff2abfd316986a68c6 branch: main author: Nikita Sobolev committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-19T23:32:25+08:00 summary: bpo-46413: properly test `__{r}or__` code paths in `_SpecialGenericAlias` (GH-30640) Co-authored-by: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index e61f8828f5405..8d024514fcb84 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -523,6 +523,10 @@ def test_ellipsis_in_generic(self): # Shouldn't crash; see https://github.com/python/typing/issues/259 typing.List[Callable[..., str]] + def test_or_and_ror(self): + Callable = self.Callable + self.assertEqual(Callable | Tuple, Union[Callable, Tuple]) + self.assertEqual(Tuple | Callable, Union[Tuple, Callable]) def test_basic(self): Callable = self.Callable @@ -3906,6 +3910,10 @@ class B: ... A.register(B) self.assertIsSubclass(B, typing.Mapping) + def test_or_and_ror(self): + self.assertEqual(typing.Sized | typing.Awaitable, Union[typing.Sized, typing.Awaitable]) + self.assertEqual(typing.Coroutine | typing.Hashable, Union[typing.Coroutine, typing.Hashable]) + class OtherABCTests(BaseTestCase): From webhook-mailer at python.org Wed Jan 19 10:37:17 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Wed, 19 Jan 2022 15:37:17 -0000 Subject: [Python-checkins] bpo-45680: Minor formatting fix in stdtypes.rst (GH-30690) Message-ID: https://github.com/python/cpython/commit/1faf7c4effbe8b66f9b0347cab570fb3b5c91fb0 commit: 1faf7c4effbe8b66f9b0347cab570fb3b5c91fb0 branch: main author: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-19T23:37:05+08:00 summary: bpo-45680: Minor formatting fix in stdtypes.rst (GH-30690) Makes quotation consistent with rest of docs in commit 0eae9a2a2db6cc5a72535f61bb988cc417011640. files: M Doc/library/stdtypes.rst diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index f97e7c222b172..3465320c87d8b 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -5063,7 +5063,7 @@ All parameterized generics implement special read-only attributes. :pep:`484` - Type Hints Introducing Python's framework for type annotations. - :pep:`585` - "Type Hinting Generics In Standard Collections" + :pep:`585` - Type Hinting Generics In Standard Collections Introducing the ability to natively parameterize standard-library classes, provided they implement the special class method :meth:`~object.__class_getitem__`. 
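The documentation rewritten above draws a line between parameterizing container classes and parameterizing non-container classes that merely define ``__class_getitem__``. A minimal runtime sketch of the behaviour being documented (illustrative only; the ``Reader`` class is invented for this example and is not part of any patch above)::

    from types import GenericAlias

    # Subscripting a builtin container yields a GenericAlias proxy object.
    alias = list[int]
    assert isinstance(alias, GenericAlias)
    assert alias.__origin__ is list        # the un-parameterized class
    assert alias.__args__ == (int,)        # the type arguments

    # Parameterization is not enforced when the container is instantiated.
    items = list[int]()
    items.append("not an int")             # runs without error

    # A non-container class can opt in by defining __class_getitem__.
    class Reader:
        "Hypothetical class whose read() return type is described by the parameter."
        __class_getitem__ = classmethod(GenericAlias)

    annotated = Reader[bytes]              # e.g. "a Reader whose read() returns bytes"
    assert annotated.__origin__ is Reader
    assert annotated.__args__ == (bytes,)
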
From webhook-mailer at python.org Wed Jan 19 10:54:26 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 15:54:26 -0000 Subject: [Python-checkins] bpo-46424: [typing] cover `Annotation[arg]` invalid usage in tests (GH-30663) Message-ID: https://github.com/python/cpython/commit/baf26d07a634b0ea3ff052716bdeaee985b3a3a9 commit: baf26d07a634b0ea3ff052716bdeaee985b3a3a9 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T07:54:07-08:00 summary: bpo-46424: [typing] cover `Annotation[arg]` invalid usage in tests (GH-30663) (cherry picked from commit 32398294fb3fcf4ee74da54722fd0221c4e6cb74) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index f943aed73614c..b886c38827f1f 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -4512,6 +4512,10 @@ def test_cannot_check_subclass(self): with self.assertRaises(TypeError): issubclass(int, Annotated[int, "positive"]) + def test_too_few_type_args(self): + with self.assertRaisesRegex(TypeError, 'at least two arguments'): + Annotated[int] + def test_pickle(self): samples = [typing.Any, typing.Union[int, str], typing.Optional[str], Tuple[int, ...], From webhook-mailer at python.org Wed Jan 19 11:00:00 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 16:00:00 -0000 Subject: [Python-checkins] [3.9] bpo-46424: [typing] cover `Annotation[arg]` invalid usage in tests (GH-30663) (GH-30692) Message-ID: https://github.com/python/cpython/commit/331378dffc334c1f05ab3152c87f46cd9155e169 commit: 331378dffc334c1f05ab3152c87f46cd9155e169 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T07:59:46-08:00 summary: [3.9] bpo-46424: [typing] cover `Annotation[arg]` invalid usage in tests (GH-30663) (GH-30692) (cherry picked from commit 32398294fb3fcf4ee74da54722fd0221c4e6cb74) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 8cdb1166c847f..cb6be2cee87d7 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -4243,6 +4243,10 @@ def test_cannot_check_subclass(self): with self.assertRaises(TypeError): issubclass(int, Annotated[int, "positive"]) + def test_too_few_type_args(self): + with self.assertRaisesRegex(TypeError, 'at least two arguments'): + Annotated[int] + def test_pickle(self): samples = [typing.Any, typing.Union[int, str], typing.Optional[str], Tuple[int, ...], From webhook-mailer at python.org Wed Jan 19 11:01:39 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Wed, 19 Jan 2022 16:01:39 -0000 Subject: [Python-checkins] [3.9] bpo-45680: Clarify documentation on ``GenericAlias`` objects (GH-29335) (GH-30689) Message-ID: https://github.com/python/cpython/commit/00645166b64e68001a425a15281a1ccdcb78f818 commit: 00645166b64e68001a425a15281a1ccdcb78f818 branch: 3.9 author: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-20T00:01:30+08:00 summary: [3.9] bpo-45680: Clarify documentation on ``GenericAlias`` objects (GH-29335) (GH-30689) The documentation on ``GenericAlias`` objects implies at multiple points that only container 
classes can define ``__class_getitem__``. This is misleading. This PR proposes a rewrite of the documentation to clarify that non-container classes can define ``__class_getitem__``, and to clarify what it means when a non-container class is parameterized. See also: initial discussion of issues with this piece of documentation in GH-29308, and previous BPO issue [42280](https://bugs.python.org/issue42280). Also improved references in glossary and typing docs. Fixed some links. (cherry picked from commit 0eae9a2a2db6cc5a72535f61bb988cc417011640) Co-Authored-By: Erlend Egeberg Aasland Co-Authored-By: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> Co-Authored-By: Alex Waygood files: M Doc/library/stdtypes.rst diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index bfa0e744f2c4f..ba5e0d5a164e8 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -4740,33 +4740,54 @@ Generic Alias Type object: GenericAlias pair: Generic; Alias -``GenericAlias`` objects are created by subscripting a class (usually a -container), such as ``list[int]``. They are intended primarily for +``GenericAlias`` objects are generally created by +:ref:`subscripting ` a class. They are most often used with +:ref:`container classes `, such as :class:`list` or +:class:`dict`. For example, ``list[int]`` is a ``GenericAlias`` object created +by subscripting the ``list`` class with the argument :class:`int`. +``GenericAlias`` objects are intended primarily for use with :term:`type annotations `. -Usually, the :ref:`subscription ` of container objects calls the -method :meth:`__getitem__` of the object. However, the subscription of some -containers' classes may call the classmethod :meth:`__class_getitem__` of the -class instead. The classmethod :meth:`__class_getitem__` should return a -``GenericAlias`` object. - .. note:: - If the :meth:`__getitem__` of the class' metaclass is present, it will take - precedence over the :meth:`__class_getitem__` defined in the class (see - :pep:`560` for more details). -The ``GenericAlias`` object acts as a proxy for :term:`generic types -`, implementing *parameterized generics* - a specific instance -of a generic which provides the types for container elements. + It is generally only possible to subscript a class if the class implements + the special method :meth:`~object.__class_getitem__`. + +A ``GenericAlias`` object acts as a proxy for a :term:`generic type`, +implementing *parameterized generics*. + +For a container class, the +argument(s) supplied to a :ref:`subscription ` of the class may +indicate the type(s) of the elements an object contains. For example, +``set[bytes]`` can be used in type annotations to signify a :class:`set` in +which all the elements are of type :class:`bytes`. + +For a class which defines :meth:`~object.__class_getitem__` but is not a +container, the argument(s) supplied to a subscription of the class will often +indicate the return type(s) of one or more methods defined on an object. For +example, :mod:`regular expressions ` can be used on both the :class:`str` data +type and the :class:`bytes` data type: + +* If ``x = re.search('foo', 'foo')``, ``x`` will be a + :ref:`re.Match ` object where the return values of + ``x.group(0)`` and ``x[0]`` will both be of type :class:`str`. We can + represent this kind of object in type annotations with the ``GenericAlias`` + ``re.Match[str]``. 
+ +* If ``y = re.search(b'bar', b'bar')``, (note the ``b`` for :class:`bytes`), + ``y`` will also be an instance of ``re.Match``, but the return + values of ``y.group(0)`` and ``y[0]`` will both be of type + :class:`bytes`. In type annotations, we would represent this + variety of :ref:`re.Match ` objects with ``re.Match[bytes]``. -The user-exposed type for the ``GenericAlias`` object can be accessed from -:class:`types.GenericAlias` and used for :func:`isinstance` checks. It can -also be used to create ``GenericAlias`` objects directly. +``GenericAlias`` objects are instances of the class +:class:`types.GenericAlias`, which can also be used to create ``GenericAlias`` +objects directly. .. describe:: T[X, Y, ...] - Creates a ``GenericAlias`` representing a type ``T`` containing elements - of types *X*, *Y*, and more depending on the ``T`` used. + Creates a ``GenericAlias`` representing a type ``T`` parameterized by types + *X*, *Y*, and more depending on the ``T`` used. For example, a function expecting a :class:`list` containing :class:`float` elements:: @@ -4791,7 +4812,7 @@ The builtin functions :func:`isinstance` and :func:`issubclass` do not accept The Python runtime does not enforce :term:`type annotations `. This extends to generic types and their type parameters. When creating -an object from a ``GenericAlias``, container elements are not checked +a container object from a ``GenericAlias``, the elements in the container are not checked against their type. For example, the following code is discouraged, but will run without errors:: @@ -4818,8 +4839,8 @@ Calling :func:`repr` or :func:`str` on a generic shows the parameterized type:: >>> str(list[int]) 'list[int]' -The :meth:`__getitem__` method of generics will raise an exception to disallow -mistakes like ``dict[str][str]``:: +The :meth:`~object.__getitem__` method of generic containers will raise an +exception to disallow mistakes like ``dict[str][str]``:: >>> dict[str][str] Traceback (most recent call last): @@ -4828,7 +4849,7 @@ mistakes like ``dict[str][str]``:: However, such expressions are valid when :ref:`type variables ` are used. The index must have as many elements as there are type variable items -in the ``GenericAlias`` object's :attr:`__args__ `. :: +in the ``GenericAlias`` object's :attr:`~genericalias.__args__`. :: >>> from typing import TypeVar >>> Y = TypeVar('Y') @@ -4836,10 +4857,11 @@ in the ``GenericAlias`` object's :attr:`__args__ `. :: dict[str, int] -Standard Generic Collections ----------------------------- +Standard Generic Classes +------------------------ -These standard library collections support parameterized generics. +The following standard library classes support parameterized generics. This +list is non-exhaustive. * :class:`tuple` * :class:`list` @@ -4877,12 +4899,33 @@ These standard library collections support parameterized generics. 
* :class:`collections.abc.ValuesView` * :class:`contextlib.AbstractContextManager` * :class:`contextlib.AbstractAsyncContextManager` +* :class:`dataclasses.Field` +* :class:`functools.cached_property` +* :class:`functools.partialmethod` +* :class:`os.PathLike` +* :class:`pathlib.Path` +* :class:`pathlib.PurePath` +* :class:`pathlib.PurePosixPath` +* :class:`pathlib.PureWindowsPath` +* :class:`queue.LifoQueue` +* :class:`queue.Queue` +* :class:`queue.PriorityQueue` +* :class:`queue.SimpleQueue` * :ref:`re.Pattern ` * :ref:`re.Match ` +* :class:`shelve.BsdDbShelf` +* :class:`shelve.DbfilenameShelf` +* :class:`shelve.Shelf` +* :class:`types.MappingProxyType` +* :class:`weakref.WeakKeyDictionary` +* :class:`weakref.WeakMethod` +* :class:`weakref.WeakSet` +* :class:`weakref.WeakValueDictionary` -Special Attributes of Generic Alias ------------------------------------ + +Special Attributes of ``GenericAlias`` objects +---------------------------------------------- All parameterized generics implement special read-only attributes. @@ -4897,8 +4940,8 @@ All parameterized generics implement special read-only attributes. .. attribute:: genericalias.__args__ This attribute is a :class:`tuple` (possibly of length 1) of generic - types passed to the original :meth:`__class_getitem__` - of the generic container:: + types passed to the original :meth:`~object.__class_getitem__` of the + generic class:: >>> dict[str, list[int]].__args__ (, list[int]) @@ -4918,9 +4961,17 @@ All parameterized generics implement special read-only attributes. .. seealso:: - * :pep:`585` -- "Type Hinting Generics In Standard Collections" - * :meth:`__class_getitem__` -- Used to implement parameterized generics. - * :ref:`generics` -- Generics in the :mod:`typing` module. + :pep:`484` - Type Hints + Introducing Python's framework for type annotations. + + :pep:`585` - Type Hinting Generics In Standard Collections + Introducing the ability to natively parameterize standard-library + classes, provided they implement the special class method + :meth:`~object.__class_getitem__`. + + :ref:`Generics`, :ref:`user-defined generics ` and :class:`typing.Generic` + Documentation on how to implement generic classes that can be + parameterized at runtime and understood by static type-checkers. .. 
versionadded:: 3.9 From webhook-mailer at python.org Wed Jan 19 11:11:22 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 16:11:22 -0000 Subject: [Python-checkins] [3.10] bpo-46413: properly test `__{r}or__` code paths in `_SpecialGenericAlias` (GH-30640) (GH-30694) Message-ID: https://github.com/python/cpython/commit/39374c44d98b470213256ceead0e2b4e44b14b92 commit: 39374c44d98b470213256ceead0e2b4e44b14b92 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T08:11:12-08:00 summary: [3.10] bpo-46413: properly test `__{r}or__` code paths in `_SpecialGenericAlias` (GH-30640) (GH-30694) Co-authored-by: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> (cherry picked from commit 0a49148e87cca11e3820cbff2abfd316986a68c6) Co-authored-by: Nikita Sobolev Automerge-Triggered-By: GH:Fidget-Spinner files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index b886c38827f1f..ee432b65cf5df 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -515,6 +515,10 @@ def test_ellipsis_in_generic(self): # Shouldn't crash; see https://github.com/python/typing/issues/259 typing.List[Callable[..., str]] + def test_or_and_ror(self): + Callable = self.Callable + self.assertEqual(Callable | Tuple, Union[Callable, Tuple]) + self.assertEqual(Tuple | Callable, Union[Tuple, Callable]) def test_basic(self): Callable = self.Callable @@ -3834,6 +3838,10 @@ class B: ... A.register(B) self.assertIsSubclass(B, typing.Mapping) + def test_or_and_ror(self): + self.assertEqual(typing.Sized | typing.Awaitable, Union[typing.Sized, typing.Awaitable]) + self.assertEqual(typing.Coroutine | typing.Hashable, Union[typing.Coroutine, typing.Hashable]) + class OtherABCTests(BaseTestCase): From webhook-mailer at python.org Wed Jan 19 11:54:50 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 16:54:50 -0000 Subject: [Python-checkins] Mark all clinic headers as generated (GH-30679) Message-ID: https://github.com/python/cpython/commit/71734d0b9ca584bcbdcb2fb44ae16bb2fbfcaf6e commit: 71734d0b9ca584bcbdcb2fb44ae16bb2fbfcaf6e branch: main author: Erlend Egeberg Aasland committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T08:54:45-08:00 summary: Mark all clinic headers as generated (GH-30679) files: M .gitattributes diff --git a/.gitattributes b/.gitattributes index 3363ea8e4e744..05b0420714e1b 100644 --- a/.gitattributes +++ b/.gitattributes @@ -41,11 +41,8 @@ PCbuild/readme.txt text eol=crlf PC/readme.txt text eol=crlf # Generated files -# https://github.com/github/linguist#generated-code -Modules/clinic/*.h linguist-generated=true -Objects/clinic/*.h linguist-generated=true -PC/clinic/*.h linguist-generated=true -Python/clinic/*.h linguist-generated=true +# https://github.com/github/linguist/blob/master/docs/overrides.md +**/clinic/*.h linguist-generated=true Python/deepfreeze/*.c linguist-generated=true Python/frozen_modules/*.h linguist-generated=true Python/frozen_modules/MANIFEST linguist-generated=true From webhook-mailer at python.org Wed Jan 19 12:40:01 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 17:40:01 -0000 Subject: [Python-checkins] Update documentation in datetime module strftime-and-strptime-behavior fix typo in '%W' format code description (GH-30232) Message-ID: 
https://github.com/python/cpython/commit/d45cd2d20770f72a000ba6dfa9ac88dd49423c27 commit: d45cd2d20770f72a000ba6dfa9ac88dd49423c27 branch: main author: Evan committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T09:39:57-08:00 summary: Update documentation in datetime module strftime-and-strptime-behavior fix typo in '%W' format code description (GH-30232) A small change to the documentation of datetime module , in the format codes section of stftime and strptime. Changed the description of format code '%W' from 'as a decimal number' to 'a zero padded decimal number' so it's in line with the example having leading zeros. Similar to the format code '%U' above. Automerge-Triggered-By: GH:pganssle files: M Doc/library/datetime.rst diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index 217cdf222b89b..f447b7bc9491e 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -2375,7 +2375,7 @@ requires, and these work on all platforms with a standard C implementation. +-----------+--------------------------------+------------------------+-------+ | ``%U`` | Week number of the year | 00, 01, ..., 53 | \(7), | | | (Sunday as the first day of | | \(9) | -| | the week) as a zero padded | | | +| | the week) as a zero-padded | | | | | decimal number. All days in a | | | | | new year preceding the first | | | | | Sunday are considered to be in | | | @@ -2383,10 +2383,10 @@ requires, and these work on all platforms with a standard C implementation. +-----------+--------------------------------+------------------------+-------+ | ``%W`` | Week number of the year | 00, 01, ..., 53 | \(7), | | | (Monday as the first day of | | \(9) | -| | the week) as a decimal number. | | | -| | All days in a new year | | | -| | preceding the first Monday | | | -| | are considered to be in | | | +| | the week) as a zero-padded | | | +| | decimal number. All days in a | | | +| | new year preceding the first | | | +| | Monday are considered to be in | | | | | week 0. | | | +-----------+--------------------------------+------------------------+-------+ | ``%c`` | Locale's appropriate date and || Tue Aug 16 21:30:00 | \(1) | From webhook-mailer at python.org Wed Jan 19 13:02:59 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 18:02:59 -0000 Subject: [Python-checkins] Update documentation in datetime module strftime-and-strptime-behavior fix typo in '%W' format code description (GH-30232) Message-ID: https://github.com/python/cpython/commit/c4fe0aa670480d887f1f736d1a4251234914b58c commit: c4fe0aa670480d887f1f736d1a4251234914b58c branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T10:02:07-08:00 summary: Update documentation in datetime module strftime-and-strptime-behavior fix typo in '%W' format code description (GH-30232) A small change to the documentation of datetime module , in the format codes section of stftime and strptime. Changed the description of format code '%W' from 'as a decimal number' to 'a zero padded decimal number' so it's in line with the example having leading zeros. Similar to the format code '%U' above. 
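For readers checking the corrected wording, a small sketch of what the zero-padded week numbers look like in practice (expected values are shown as comments and assume a standard C library ``strftime``; this is illustration, not part of the patch)::

    from datetime import date

    monday = date(2022, 1, 3)                # first Monday of 2022
    print(monday.strftime('%U'))             # expected '01' -- Sunday-based week, zero-padded
    print(monday.strftime('%W'))             # expected '01' -- Monday-based week, zero-padded
    print(date(2022, 1, 1).strftime('%W'))   # expected '00' -- days before the first Monday fall in week 0
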
Automerge-Triggered-By: GH:pganssle (cherry picked from commit d45cd2d20770f72a000ba6dfa9ac88dd49423c27) Co-authored-by: Evan files: M Doc/library/datetime.rst diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index 217cdf222b89b..f447b7bc9491e 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -2375,7 +2375,7 @@ requires, and these work on all platforms with a standard C implementation. +-----------+--------------------------------+------------------------+-------+ | ``%U`` | Week number of the year | 00, 01, ..., 53 | \(7), | | | (Sunday as the first day of | | \(9) | -| | the week) as a zero padded | | | +| | the week) as a zero-padded | | | | | decimal number. All days in a | | | | | new year preceding the first | | | | | Sunday are considered to be in | | | @@ -2383,10 +2383,10 @@ requires, and these work on all platforms with a standard C implementation. +-----------+--------------------------------+------------------------+-------+ | ``%W`` | Week number of the year | 00, 01, ..., 53 | \(7), | | | (Monday as the first day of | | \(9) | -| | the week) as a decimal number. | | | -| | All days in a new year | | | -| | preceding the first Monday | | | -| | are considered to be in | | | +| | the week) as a zero-padded | | | +| | decimal number. All days in a | | | +| | new year preceding the first | | | +| | Monday are considered to be in | | | | | week 0. | | | +-----------+--------------------------------+------------------------+-------+ | ``%c`` | Locale's appropriate date and || Tue Aug 16 21:30:00 | \(1) | From webhook-mailer at python.org Wed Jan 19 16:21:20 2022 From: webhook-mailer at python.org (gvanrossum) Date: Wed, 19 Jan 2022 21:21:20 -0000 Subject: [Python-checkins] docs: correct outdated MappingProxyType docstrings (#30281) Message-ID: https://github.com/python/cpython/commit/2d10fa9bc4cf83c5e5dd73decc9a138d6d247374 commit: 2d10fa9bc4cf83c5e5dd73decc9a138d6d247374 branch: main author: Joshua Bronson committer: gvanrossum date: 2022-01-19T13:20:43-08:00 summary: docs: correct outdated MappingProxyType docstrings (#30281) The docstrings for MappingProxyType's keys(), values(), and items() methods were never updated to reflect the changes that Python 3 brought to these APIs, namely returning views rather than lists. files: M Objects/descrobject.c diff --git a/Objects/descrobject.c b/Objects/descrobject.c index 946ea6aa80319..962136beae05d 100644 --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1135,11 +1135,11 @@ static PyMethodDef mappingproxy_methods[] = { PyDoc_STR("D.get(k[,d]) -> D[k] if k in D, else d." 
" d defaults to None.")}, {"keys", (PyCFunction)mappingproxy_keys, METH_NOARGS, - PyDoc_STR("D.keys() -> list of D's keys")}, + PyDoc_STR("D.keys() -> a set-like object providing a view on D's keys")}, {"values", (PyCFunction)mappingproxy_values, METH_NOARGS, - PyDoc_STR("D.values() -> list of D's values")}, + PyDoc_STR("D.values() -> an object providing a view on D's values")}, {"items", (PyCFunction)mappingproxy_items, METH_NOARGS, - PyDoc_STR("D.items() -> list of D's (key, value) pairs, as 2-tuples")}, + PyDoc_STR("D.items() -> a set-like object providing a view on D's items")}, {"copy", (PyCFunction)mappingproxy_copy, METH_NOARGS, PyDoc_STR("D.copy() -> a shallow copy of D")}, {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, From webhook-mailer at python.org Wed Jan 19 16:22:32 2022 From: webhook-mailer at python.org (gvanrossum) Date: Wed, 19 Jan 2022 21:22:32 -0000 Subject: [Python-checkins] doc: Clarify os.urandom return type (#30282) Message-ID: https://github.com/python/cpython/commit/4b99803b861e58eb476a7a30e2e8aacdec5df104 commit: 4b99803b861e58eb476a7a30e2e8aacdec5df104 branch: main author: Florian Bruhin committer: gvanrossum date: 2022-01-19T13:22:15-08:00 summary: doc: Clarify os.urandom return type (#30282) Other descriptions in the same file also use 'bytestring' to refer to bytes objects files: M Doc/library/os.rst diff --git a/Doc/library/os.rst b/Doc/library/os.rst index eb3035344455f..234ea3238ef99 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -4879,7 +4879,7 @@ Random numbers .. function:: urandom(size) - Return a string of *size* random bytes suitable for cryptographic use. + Return a bytestring of *size* random bytes suitable for cryptographic use. This function returns random bytes from an OS-specific randomness source. The returned data should be unpredictable enough for cryptographic applications, From webhook-mailer at python.org Wed Jan 19 16:24:31 2022 From: webhook-mailer at python.org (gvanrossum) Date: Wed, 19 Jan 2022 21:24:31 -0000 Subject: [Python-checkins] bpo-46437: remove useless `hasattr` from `test_typing` (#30704) Message-ID: https://github.com/python/cpython/commit/263c0dd16017613c5ea2fbfc270be4de2b41b5ad commit: 263c0dd16017613c5ea2fbfc270be4de2b41b5ad branch: main author: Nikita Sobolev committer: gvanrossum date: 2022-01-19T13:24:27-08:00 summary: bpo-46437: remove useless `hasattr` from `test_typing` (#30704) files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 8d024514fcb84..ce0c940e2a112 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -3513,11 +3513,10 @@ def test_container(self): self.assertNotIsInstance(42, typing.Container) def test_collection(self): - if hasattr(typing, 'Collection'): - self.assertIsInstance(tuple(), typing.Collection) - self.assertIsInstance(frozenset(), typing.Collection) - self.assertIsSubclass(dict, typing.Collection) - self.assertNotIsInstance(42, typing.Collection) + self.assertIsInstance(tuple(), typing.Collection) + self.assertIsInstance(frozenset(), typing.Collection) + self.assertIsSubclass(dict, typing.Collection) + self.assertNotIsInstance(42, typing.Collection) def test_abstractset(self): self.assertIsInstance(set(), typing.AbstractSet) @@ -5130,8 +5129,9 @@ def test_all(self): self.assertIn('ValuesView', a) self.assertIn('cast', a) self.assertIn('overload', a) - if hasattr(contextlib, 'AbstractContextManager'): - self.assertIn('ContextManager', a) + # Context managers. 
+ self.assertIn('ContextManager', a) + self.assertIn('AsyncContextManager', a) # Check that io and re are not exported. self.assertNotIn('io', a) self.assertNotIn('re', a) @@ -5145,8 +5145,6 @@ def test_all(self): self.assertIn('SupportsComplex', a) def test_all_exported_names(self): - import typing - actual_all = set(typing.__all__) computed_all = { k for k, v in vars(typing).items() From webhook-mailer at python.org Wed Jan 19 16:57:20 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 21:57:20 -0000 Subject: [Python-checkins] docs: correct outdated MappingProxyType docstrings (GH-30281) Message-ID: https://github.com/python/cpython/commit/d2b7e08d86874be7d4375a4994617ba8f068a65e commit: d2b7e08d86874be7d4375a4994617ba8f068a65e branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T13:57:09-08:00 summary: docs: correct outdated MappingProxyType docstrings (GH-30281) The docstrings for MappingProxyType's keys(), values(), and items() methods were never updated to reflect the changes that Python 3 brought to these APIs, namely returning views rather than lists. (cherry picked from commit 2d10fa9bc4cf83c5e5dd73decc9a138d6d247374) Co-authored-by: Joshua Bronson files: M Objects/descrobject.c diff --git a/Objects/descrobject.c b/Objects/descrobject.c index 97669bef368da..26726cc0973df 100644 --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1135,11 +1135,11 @@ static PyMethodDef mappingproxy_methods[] = { PyDoc_STR("D.get(k[,d]) -> D[k] if k in D, else d." " d defaults to None.")}, {"keys", (PyCFunction)mappingproxy_keys, METH_NOARGS, - PyDoc_STR("D.keys() -> list of D's keys")}, + PyDoc_STR("D.keys() -> a set-like object providing a view on D's keys")}, {"values", (PyCFunction)mappingproxy_values, METH_NOARGS, - PyDoc_STR("D.values() -> list of D's values")}, + PyDoc_STR("D.values() -> an object providing a view on D's values")}, {"items", (PyCFunction)mappingproxy_items, METH_NOARGS, - PyDoc_STR("D.items() -> list of D's (key, value) pairs, as 2-tuples")}, + PyDoc_STR("D.items() -> a set-like object providing a view on D's items")}, {"copy", (PyCFunction)mappingproxy_copy, METH_NOARGS, PyDoc_STR("D.copy() -> a shallow copy of D")}, {"__class_getitem__", (PyCFunction)Py_GenericAlias, METH_O|METH_CLASS, From webhook-mailer at python.org Wed Jan 19 16:57:34 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 21:57:34 -0000 Subject: [Python-checkins] docs: correct outdated MappingProxyType docstrings (GH-30281) Message-ID: https://github.com/python/cpython/commit/54feddae8832f24f1ca8ebd1f21a19d6aec4b3fd commit: 54feddae8832f24f1ca8ebd1f21a19d6aec4b3fd branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T13:57:30-08:00 summary: docs: correct outdated MappingProxyType docstrings (GH-30281) The docstrings for MappingProxyType's keys(), values(), and items() methods were never updated to reflect the changes that Python 3 brought to these APIs, namely returning views rather than lists. 
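As a quick illustration of the behaviour the corrected docstrings describe (not part of the patch itself)::

    from types import MappingProxyType

    proxy = MappingProxyType({'a': 1, 'b': 2})
    keys = proxy.keys()
    print(type(keys).__name__)      # 'dict_keys' -- a view object, not a list
    print('a' in keys)              # True; views support membership testing
    print(list(proxy.items()))      # [('a', 1), ('b', 2)] -- call list() explicitly if a list is needed
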
(cherry picked from commit 2d10fa9bc4cf83c5e5dd73decc9a138d6d247374) Co-authored-by: Joshua Bronson files: M Objects/descrobject.c diff --git a/Objects/descrobject.c b/Objects/descrobject.c index 075a92d4084d5..ee40645955206 100644 --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1134,11 +1134,11 @@ static PyMethodDef mappingproxy_methods[] = { PyDoc_STR("D.get(k[,d]) -> D[k] if k in D, else d." " d defaults to None.")}, {"keys", (PyCFunction)mappingproxy_keys, METH_NOARGS, - PyDoc_STR("D.keys() -> list of D's keys")}, + PyDoc_STR("D.keys() -> a set-like object providing a view on D's keys")}, {"values", (PyCFunction)mappingproxy_values, METH_NOARGS, - PyDoc_STR("D.values() -> list of D's values")}, + PyDoc_STR("D.values() -> an object providing a view on D's values")}, {"items", (PyCFunction)mappingproxy_items, METH_NOARGS, - PyDoc_STR("D.items() -> list of D's (key, value) pairs, as 2-tuples")}, + PyDoc_STR("D.items() -> a set-like object providing a view on D's items")}, {"copy", (PyCFunction)mappingproxy_copy, METH_NOARGS, PyDoc_STR("D.copy() -> a shallow copy of D")}, {"__class_getitem__", (PyCFunction)Py_GenericAlias, METH_O|METH_CLASS, From webhook-mailer at python.org Wed Jan 19 16:58:18 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 21:58:18 -0000 Subject: [Python-checkins] doc: Clarify os.urandom return type (GH-30282) Message-ID: https://github.com/python/cpython/commit/ee077500888ca5c1360bbd224b3af4a0fbbf6e02 commit: ee077500888ca5c1360bbd224b3af4a0fbbf6e02 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T13:58:13-08:00 summary: doc: Clarify os.urandom return type (GH-30282) Other descriptions in the same file also use 'bytestring' to refer to bytes objects (cherry picked from commit 4b99803b861e58eb476a7a30e2e8aacdec5df104) Co-authored-by: Florian Bruhin files: M Doc/library/os.rst diff --git a/Doc/library/os.rst b/Doc/library/os.rst index 629a32f1b63e7..5b2c2e0d0f2d0 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -4869,7 +4869,7 @@ Random numbers .. function:: urandom(size) - Return a string of *size* random bytes suitable for cryptographic use. + Return a bytestring of *size* random bytes suitable for cryptographic use. This function returns random bytes from an OS-specific randomness source. The returned data should be unpredictable enough for cryptographic applications, From webhook-mailer at python.org Wed Jan 19 17:08:36 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 22:08:36 -0000 Subject: [Python-checkins] doc: Clarify os.urandom return type (GH-30282) Message-ID: https://github.com/python/cpython/commit/981c1dc8b6a5673aa59a213cd62e7a8a3cb3fdd9 commit: 981c1dc8b6a5673aa59a213cd62e7a8a3cb3fdd9 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T14:07:47-08:00 summary: doc: Clarify os.urandom return type (GH-30282) Other descriptions in the same file also use 'bytestring' to refer to bytes objects (cherry picked from commit 4b99803b861e58eb476a7a30e2e8aacdec5df104) Co-authored-by: Florian Bruhin files: M Doc/library/os.rst diff --git a/Doc/library/os.rst b/Doc/library/os.rst index f8d567af48466..d70f403e8cf1f 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -4623,7 +4623,7 @@ Random numbers .. 
function:: urandom(size) - Return a string of *size* random bytes suitable for cryptographic use. + Return a bytestring of *size* random bytes suitable for cryptographic use. This function returns random bytes from an OS-specific randomness source. The returned data should be unpredictable enough for cryptographic applications, From webhook-mailer at python.org Wed Jan 19 17:12:32 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 22:12:32 -0000 Subject: [Python-checkins] bpo-46437: remove useless `hasattr` from `test_typing` (GH-30704) Message-ID: https://github.com/python/cpython/commit/3b51926ee9838e746a5cdb08c7eb985646bd133c commit: 3b51926ee9838e746a5cdb08c7eb985646bd133c branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T14:12:25-08:00 summary: bpo-46437: remove useless `hasattr` from `test_typing` (GH-30704) (cherry picked from commit 263c0dd16017613c5ea2fbfc270be4de2b41b5ad) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index cb6be2cee87d7..17da4b81f5193 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -3230,11 +3230,10 @@ def test_container(self): self.assertNotIsInstance(42, typing.Container) def test_collection(self): - if hasattr(typing, 'Collection'): - self.assertIsInstance(tuple(), typing.Collection) - self.assertIsInstance(frozenset(), typing.Collection) - self.assertIsSubclass(dict, typing.Collection) - self.assertNotIsInstance(42, typing.Collection) + self.assertIsInstance(tuple(), typing.Collection) + self.assertIsInstance(frozenset(), typing.Collection) + self.assertIsSubclass(dict, typing.Collection) + self.assertNotIsInstance(42, typing.Collection) def test_abstractset(self): self.assertIsInstance(set(), typing.AbstractSet) @@ -4321,8 +4320,9 @@ def test_all(self): self.assertIn('ValuesView', a) self.assertIn('cast', a) self.assertIn('overload', a) - if hasattr(contextlib, 'AbstractContextManager'): - self.assertIn('ContextManager', a) + # Context managers. + self.assertIn('ContextManager', a) + self.assertIn('AsyncContextManager', a) # Check that io and re are not exported. 
self.assertNotIn('io', a) self.assertNotIn('re', a) @@ -4336,8 +4336,6 @@ def test_all(self): self.assertIn('SupportsComplex', a) def test_all_exported_names(self): - import typing - actual_all = set(typing.__all__) computed_all = { k for k, v in vars(typing).items() From webhook-mailer at python.org Wed Jan 19 17:30:31 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 19 Jan 2022 22:30:31 -0000 Subject: [Python-checkins] bpo-46437: remove useless `hasattr` from `test_typing` (GH-30704) Message-ID: https://github.com/python/cpython/commit/07b12fdf5545a20e0fb7be9d6ad35344337e00ae commit: 07b12fdf5545a20e0fb7be9d6ad35344337e00ae branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-19T14:30:07-08:00 summary: bpo-46437: remove useless `hasattr` from `test_typing` (GH-30704) (cherry picked from commit 263c0dd16017613c5ea2fbfc270be4de2b41b5ad) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index ee432b65cf5df..1d16e78d422cd 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -3439,11 +3439,10 @@ def test_container(self): self.assertNotIsInstance(42, typing.Container) def test_collection(self): - if hasattr(typing, 'Collection'): - self.assertIsInstance(tuple(), typing.Collection) - self.assertIsInstance(frozenset(), typing.Collection) - self.assertIsSubclass(dict, typing.Collection) - self.assertNotIsInstance(42, typing.Collection) + self.assertIsInstance(tuple(), typing.Collection) + self.assertIsInstance(frozenset(), typing.Collection) + self.assertIsSubclass(dict, typing.Collection) + self.assertNotIsInstance(42, typing.Collection) def test_abstractset(self): self.assertIsInstance(set(), typing.AbstractSet) @@ -5033,8 +5032,9 @@ def test_all(self): self.assertIn('ValuesView', a) self.assertIn('cast', a) self.assertIn('overload', a) - if hasattr(contextlib, 'AbstractContextManager'): - self.assertIn('ContextManager', a) + # Context managers. + self.assertIn('ContextManager', a) + self.assertIn('AsyncContextManager', a) # Check that io and re are not exported. 
self.assertNotIn('io', a) self.assertNotIn('re', a) @@ -5048,8 +5048,6 @@ def test_all(self): self.assertIn('SupportsComplex', a) def test_all_exported_names(self): - import typing - actual_all = set(typing.__all__) computed_all = { k for k, v in vars(typing).items() From webhook-mailer at python.org Wed Jan 19 20:43:58 2022 From: webhook-mailer at python.org (corona10) Date: Thu, 20 Jan 2022 01:43:58 -0000 Subject: [Python-checkins] [3.9] bpo-46425: Fix direct invocation of multiple test modules (GH-30666) (GH-30700) Message-ID: https://github.com/python/cpython/commit/8105dd24112509fab2eabfce5352afc41e3a34b6 commit: 8105dd24112509fab2eabfce5352afc41e3a34b6 branch: 3.9 author: Nikita Sobolev committer: corona10 date: 2022-01-20T10:43:49+09:00 summary: [3.9] bpo-46425: Fix direct invocation of multiple test modules (GH-30666) (GH-30700) files: M Lib/test/test_compileall.py M Lib/test/test_distutils.py M Lib/test/test_dtrace.py M Lib/unittest/test/test_program.py diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py index 6e1f4b2f397e2..a904f426b172a 100644 --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -3,7 +3,6 @@ import filecmp import importlib.util import io -import itertools import os import pathlib import py_compile @@ -24,9 +23,8 @@ from test import support from test.support import script_helper - -from .test_py_compile import without_source_date_epoch -from .test_py_compile import SourceDateEpochTestMeta +from test.test_py_compile import without_source_date_epoch +from test.test_py_compile import SourceDateEpochTestMeta def get_pyc(script, opt): diff --git a/Lib/test/test_distutils.py b/Lib/test/test_distutils.py index 790d39c6d35ae..849aa737e9bdc 100644 --- a/Lib/test/test_distutils.py +++ b/Lib/test/test_distutils.py @@ -7,7 +7,7 @@ import distutils.tests import test.support - +import unittest def load_tests(*_): # used by unittest diff --git a/Lib/test/test_dtrace.py b/Lib/test/test_dtrace.py index 3957077f5d612..8a436ad123b80 100644 --- a/Lib/test/test_dtrace.py +++ b/Lib/test/test_dtrace.py @@ -170,4 +170,4 @@ class SystemTapOptimizedTests(TraceTests, unittest.TestCase): if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/unittest/test/test_program.py b/Lib/unittest/test/test_program.py index 4746d71e0b603..b7fbbc1e7badd 100644 --- a/Lib/unittest/test/test_program.py +++ b/Lib/unittest/test/test_program.py @@ -6,7 +6,7 @@ from test import support import unittest import unittest.test -from .test_result import BufferedWriter +from unittest.test.test_result import BufferedWriter class Test_TestProgram(unittest.TestCase): From webhook-mailer at python.org Wed Jan 19 20:44:26 2022 From: webhook-mailer at python.org (corona10) Date: Thu, 20 Jan 2022 01:44:26 -0000 Subject: [Python-checkins] [3.10] bpo-46425: Fix direct invocation of multiple test modules (GH-30666) (GH-30699) Message-ID: https://github.com/python/cpython/commit/a6a088548063226233c08b8d35dde130746fdd10 commit: a6a088548063226233c08b8d35dde130746fdd10 branch: 3.10 author: Nikita Sobolev committer: corona10 date: 2022-01-20T10:44:21+09:00 summary: [3.10] bpo-46425: Fix direct invocation of multiple test modules (GH-30666) (GH-30699) files: M Lib/test/test_compileall.py M Lib/test/test_distutils.py M Lib/test/test_dtrace.py M Lib/test/test_zipfile64.py M Lib/unittest/test/test_program.py diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py index 9e15ecf3aae29..33f0c939325f5 100644 --- a/Lib/test/test_compileall.py +++ 
b/Lib/test/test_compileall.py @@ -3,7 +3,6 @@ import filecmp import importlib.util import io -import itertools import os import pathlib import py_compile @@ -29,9 +28,8 @@ from test import support from test.support import os_helper from test.support import script_helper - -from .test_py_compile import without_source_date_epoch -from .test_py_compile import SourceDateEpochTestMeta +from test.test_py_compile import without_source_date_epoch +from test.test_py_compile import SourceDateEpochTestMeta def get_pyc(script, opt): diff --git a/Lib/test/test_distutils.py b/Lib/test/test_distutils.py index 4b40af0213234..d82d2b6423433 100644 --- a/Lib/test/test_distutils.py +++ b/Lib/test/test_distutils.py @@ -5,7 +5,7 @@ be run. """ -import warnings +import unittest from test import support from test.support import warnings_helper diff --git a/Lib/test/test_dtrace.py b/Lib/test/test_dtrace.py index 3957077f5d612..8a436ad123b80 100644 --- a/Lib/test/test_dtrace.py +++ b/Lib/test/test_dtrace.py @@ -170,4 +170,4 @@ class SystemTapOptimizedTests(TraceTests, unittest.TestCase): if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_zipfile64.py b/Lib/test/test_zipfile64.py index 810fdedef39dd..0947013afbc6e 100644 --- a/Lib/test/test_zipfile64.py +++ b/Lib/test/test_zipfile64.py @@ -18,8 +18,9 @@ from tempfile import TemporaryFile from test.support import os_helper -from test.support import TESTFN, requires_zlib +from test.support import requires_zlib +TESTFN = os_helper.TESTFN TESTFN2 = TESTFN + "2" # How much time in seconds can pass before we print a 'Still working' message. diff --git a/Lib/unittest/test/test_program.py b/Lib/unittest/test/test_program.py index 4746d71e0b603..b7fbbc1e7badd 100644 --- a/Lib/unittest/test/test_program.py +++ b/Lib/unittest/test/test_program.py @@ -6,7 +6,7 @@ from test import support import unittest import unittest.test -from .test_result import BufferedWriter +from unittest.test.test_result import BufferedWriter class Test_TestProgram(unittest.TestCase): From webhook-mailer at python.org Thu Jan 20 01:13:37 2022 From: webhook-mailer at python.org (gvanrossum) Date: Thu, 20 Jan 2022 06:13:37 -0000 Subject: [Python-checkins] bpo-46443: deepfreeze: use small ints and singleton zero bytes (GH-30715) Message-ID: https://github.com/python/cpython/commit/194ecc6d44adc1fb39a56ca696418368b69432ce commit: 194ecc6d44adc1fb39a56ca696418368b69432ce branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: gvanrossum date: 2022-01-19T22:13:21-08:00 summary: bpo-46443: deepfreeze: use small ints and singleton zero bytes (GH-30715) files: A Misc/NEWS.d/next/Build/2022-01-20-05-27-07.bpo-46443.udCVII.rst M Tools/scripts/deepfreeze.py diff --git a/Misc/NEWS.d/next/Build/2022-01-20-05-27-07.bpo-46443.udCVII.rst b/Misc/NEWS.d/next/Build/2022-01-20-05-27-07.bpo-46443.udCVII.rst new file mode 100644 index 0000000000000..8e3fa197be9da --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-20-05-27-07.bpo-46443.udCVII.rst @@ -0,0 +1 @@ +Deepfreeze now uses cached small integers as it saves some space for common small integers. 
\ No newline at end of file diff --git a/Tools/scripts/deepfreeze.py b/Tools/scripts/deepfreeze.py index 002d680e10c2f..49638b8400285 100644 --- a/Tools/scripts/deepfreeze.py +++ b/Tools/scripts/deepfreeze.py @@ -113,6 +113,7 @@ def __init__(self, file: TextIO): self.write('#include "Python.h"') self.write('#include "internal/pycore_gc.h"') self.write('#include "internal/pycore_code.h"') + self.write('#include "internal/pycore_long.h"') self.write("") @contextlib.contextmanager @@ -148,6 +149,8 @@ def field(self, obj: object, name: str) -> None: self.write(f".{name} = {getattr(obj, name)},") def generate_bytes(self, name: str, b: bytes) -> str: + if b == b"": + return "(PyObject *)&_Py_SINGLETON(bytes_empty)" self.write("static") with self.indent(): with self.block("struct"): @@ -313,6 +316,8 @@ def _generate_int_for_bits(self, name: str, i: int, digit: int) -> None: self.write(f".ob_digit = {{ {ds} }},") def generate_int(self, name: str, i: int) -> str: + if -5 <= i <= 256: + return f"(PyObject *)&_PyLong_SMALL_INTS[_PY_NSMALLNEGINTS + {i}]" if abs(i) < 2**15: self._generate_int_for_bits(name, i, 2**15) else: From webhook-mailer at python.org Thu Jan 20 03:18:05 2022 From: webhook-mailer at python.org (corona10) Date: Thu, 20 Jan 2022 08:18:05 -0000 Subject: [Python-checkins] no-issue: Fix documentation typos. (GH-30576) Message-ID: https://github.com/python/cpython/commit/d05a66339b5e07d72d96e4c30a34cc3821bb61a2 commit: d05a66339b5e07d72d96e4c30a34cc3821bb61a2 branch: main author: Piotr Fusik committer: corona10 date: 2022-01-20T17:17:15+09:00 summary: no-issue: Fix documentation typos. (GH-30576) files: M Doc/c-api/init_config.rst M Doc/howto/descriptor.rst diff --git a/Doc/c-api/init_config.rst b/Doc/c-api/init_config.rst index 7c8fb7b1dee5d..922412c142302 100644 --- a/Doc/c-api/init_config.rst +++ b/Doc/c-api/init_config.rst @@ -656,7 +656,7 @@ PyConfig .. c:member:: int dump_refs - Dump Python refererences? + Dump Python references? If non-zero, dump all objects which are still alive at exit. diff --git a/Doc/howto/descriptor.rst b/Doc/howto/descriptor.rst index 6ce062d0fa853..f8b1e00d96fad 100644 --- a/Doc/howto/descriptor.rst +++ b/Doc/howto/descriptor.rst @@ -1544,7 +1544,7 @@ variables: 'Simulate how the type metaclass adds member objects for slots' def __new__(mcls, clsname, bases, mapping): - 'Emuluate type_new() in Objects/typeobject.c' + 'Emulate type_new() in Objects/typeobject.c' # type_new() calls PyTypeReady() which calls add_methods() slot_names = mapping.get('slot_names', []) for offset, name in enumerate(slot_names): From webhook-mailer at python.org Thu Jan 20 06:46:53 2022 From: webhook-mailer at python.org (markshannon) Date: Thu, 20 Jan 2022 11:46:53 -0000 Subject: [Python-checkins] bpo-46409: Make generators in bytecode (GH-30633) Message-ID: https://github.com/python/cpython/commit/b04dfbbe4bd7071d46c8688c2263726ea31d33cd commit: b04dfbbe4bd7071d46c8688c2263726ea31d33cd branch: main author: Mark Shannon committer: markshannon date: 2022-01-20T11:46:39Z summary: bpo-46409: Make generators in bytecode (GH-30633) * Add RETURN_GENERATOR and JUMP_NO_INTERRUPT opcodes. * Trim frame and generator by word each. * Minor refactor of frame.c * Update test.test_sys to account for smaller frames. * Treat generator functions as normal functions when evaluating and specializing. 
files: A Misc/NEWS.d/next/Core and Builtins/2022-01-17-12-57-27.bpo-46409.HouS6m.rst M Doc/library/dis.rst M Include/cpython/genobject.h M Include/internal/pycore_frame.h M Include/opcode.h M Lib/importlib/_bootstrap_external.py M Lib/inspect.py M Lib/opcode.py M Lib/test/test_compile.py M Lib/test/test_generators.py M Lib/test/test_sys.py M Objects/frameobject.c M Objects/genobject.c M Python/ceval.c M Python/compile.c M Python/frame.c M Python/opcode_targets.h M Python/specialize.c diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index 6bbe4ecbe8a1f..af28e5c115934 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -942,6 +942,13 @@ All of the following opcodes use their arguments. Set bytecode counter to *target*. +.. opcode:: JUMP_NO_INTERRUPT (target) + + Set bytecode counter to *target*. Do not check for interrupts. + + .. versionadded:: 3.11 + + .. opcode:: FOR_ITER (delta) TOS is an :term:`iterator`. Call its :meth:`~iterator.__next__` method. If @@ -1220,6 +1227,14 @@ All of the following opcodes use their arguments. .. versionadded:: 3.11 +.. opcode:: RETURN_GENERATOR + + Create a generator, coroutine, or async generator from the current frame. + Clear the current frame and return the newly created generator. + + .. versionadded:: 3.11 + + .. opcode:: HAVE_ARGUMENT This is not really an opcode. It identifies the dividing line between diff --git a/Include/cpython/genobject.h b/Include/cpython/genobject.h index ad2818e881667..838ca6cd24691 100644 --- a/Include/cpython/genobject.h +++ b/Include/cpython/genobject.h @@ -13,7 +13,6 @@ extern "C" { and coroutine objects. */ #define _PyGenObject_HEAD(prefix) \ PyObject_HEAD \ - /* Note: gi_frame can be NULL if the generator is "finished" */ \ /* The code object backing the generator */ \ PyCodeObject *prefix##_code; \ /* List of weak reference. */ \ diff --git a/Include/internal/pycore_frame.h b/Include/internal/pycore_frame.h index 42df51f635615..937c13b5203ac 100644 --- a/Include/internal/pycore_frame.h +++ b/Include/internal/pycore_frame.h @@ -41,12 +41,12 @@ typedef struct _interpreter_frame { PyObject *f_locals; /* Strong reference, may be NULL */ PyCodeObject *f_code; /* Strong reference */ PyFrameObject *frame_obj; /* Strong reference, may be NULL */ - PyObject *generator; /* Borrowed reference, may be NULL */ struct _interpreter_frame *previous; int f_lasti; /* Last instruction if called */ int stacktop; /* Offset of TOS from localsplus */ PyFrameState f_state; /* What state the frame is in */ bool is_entry; // Whether this is the "root" frame for the current CFrame. 
+ bool is_generator; PyObject *localsplus[1]; } InterpreterFrame; @@ -100,10 +100,10 @@ _PyFrame_InitializeSpecials( frame->f_locals = Py_XNewRef(locals); frame->stacktop = nlocalsplus; frame->frame_obj = NULL; - frame->generator = NULL; frame->f_lasti = -1; frame->f_state = FRAME_CREATED; frame->is_entry = false; + frame->is_generator = false; } /* Gets the pointer to the locals array diff --git a/Include/opcode.h b/Include/opcode.h index 5cc885597ac35..c0686bd2249ce 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -38,6 +38,7 @@ extern "C" { #define LOAD_BUILD_CLASS 71 #define GET_AWAITABLE 73 #define LOAD_ASSERTION_ERROR 74 +#define RETURN_GENERATOR 75 #define LIST_TO_TUPLE 82 #define RETURN_VALUE 83 #define IMPORT_STAR 84 @@ -89,6 +90,7 @@ extern "C" { #define RAISE_VARARGS 130 #define MAKE_FUNCTION 132 #define BUILD_SLICE 133 +#define JUMP_NO_INTERRUPT 134 #define MAKE_CELL 135 #define LOAD_CLOSURE 136 #define LOAD_DEREF 137 @@ -157,18 +159,18 @@ extern "C" { #define LOAD_GLOBAL_BUILTIN 66 #define LOAD_METHOD_ADAPTIVE 67 #define LOAD_METHOD_CACHED 72 -#define LOAD_METHOD_CLASS 75 -#define LOAD_METHOD_MODULE 76 -#define LOAD_METHOD_NO_DICT 77 -#define STORE_ATTR_ADAPTIVE 78 -#define STORE_ATTR_INSTANCE_VALUE 79 -#define STORE_ATTR_SLOT 80 -#define STORE_ATTR_WITH_HINT 81 -#define LOAD_FAST__LOAD_FAST 87 -#define STORE_FAST__LOAD_FAST 131 -#define LOAD_FAST__LOAD_CONST 134 -#define LOAD_CONST__LOAD_FAST 140 -#define STORE_FAST__STORE_FAST 141 +#define LOAD_METHOD_CLASS 76 +#define LOAD_METHOD_MODULE 77 +#define LOAD_METHOD_NO_DICT 78 +#define STORE_ATTR_ADAPTIVE 79 +#define STORE_ATTR_INSTANCE_VALUE 80 +#define STORE_ATTR_SLOT 81 +#define STORE_ATTR_WITH_HINT 87 +#define LOAD_FAST__LOAD_FAST 131 +#define STORE_FAST__LOAD_FAST 140 +#define LOAD_FAST__LOAD_CONST 141 +#define LOAD_CONST__LOAD_FAST 143 +#define STORE_FAST__STORE_FAST 150 #define DO_TRACING 255 #ifdef NEED_OPCODE_JUMP_TABLES static uint32_t _PyOpcode_RelativeJump[8] = { @@ -186,7 +188,7 @@ static uint32_t _PyOpcode_Jump[8] = { 0U, 536870912U, 2316288000U, - 3U, + 67U, 0U, 0U, 0U, diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 5aea0c4f92477..1560e60dbb925 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -380,6 +380,7 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3472 (bpo-46009: replace GEN_START with POP_TOP) # Python 3.11a4 3473 (Add POP_JUMP_IF_NOT_NONE/POP_JUMP_IF_NONE opcodes) # Python 3.11a4 3474 (Add RESUME opcode) +# Python 3.11a5 3475 (Add RETURN_GENERATOR opcode) # Python 3.12 will start with magic number 3500 @@ -393,7 +394,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. 
-MAGIC_NUMBER = (3474).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3475).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/inspect.py b/Lib/inspect.py index 8236698b8de0f..7a8f5d3464318 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -1819,11 +1819,11 @@ def getgeneratorstate(generator): """ if generator.gi_running: return GEN_RUNNING + if generator.gi_suspended: + return GEN_SUSPENDED if generator.gi_frame is None: return GEN_CLOSED - if generator.gi_frame.f_lasti == -1: - return GEN_CREATED - return GEN_SUSPENDED + return GEN_CREATED def getgeneratorlocals(generator): @@ -1861,11 +1861,11 @@ def getcoroutinestate(coroutine): """ if coroutine.cr_running: return CORO_RUNNING + if coroutine.cr_suspended: + return CORO_SUSPENDED if coroutine.cr_frame is None: return CORO_CLOSED - if coroutine.cr_frame.f_lasti == -1: - return CORO_CREATED - return CORO_SUSPENDED + return CORO_CREATED def getcoroutinelocals(coroutine): diff --git a/Lib/opcode.py b/Lib/opcode.py index 7f39a7bfe2e8c..73b41d22df2fc 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -94,6 +94,7 @@ def jabs_op(name, op): def_op('GET_AWAITABLE', 73) def_op('LOAD_ASSERTION_ERROR', 74) +def_op('RETURN_GENERATOR', 75) def_op('LIST_TO_TUPLE', 82) def_op('RETURN_VALUE', 83) @@ -155,7 +156,7 @@ def jabs_op(name, op): def_op('MAKE_FUNCTION', 132) # Flags def_op('BUILD_SLICE', 133) # Number of items - +jabs_op('JUMP_NO_INTERRUPT', 134) # Target byte offset from beginning of code def_op('MAKE_CELL', 135) hasfree.append(135) def_op('LOAD_CLOSURE', 136) diff --git a/Lib/test/test_compile.py b/Lib/test/test_compile.py index e237156c75f8b..f007aec9d3819 100644 --- a/Lib/test/test_compile.py +++ b/Lib/test/test_compile.py @@ -954,7 +954,7 @@ def return_genexp(): x in y) - genexp_lines = [None, 1, 3, 1] + genexp_lines = [1, 3, 1] genexp_code = return_genexp.__code__.co_consts[1] code_lines = [ None if line is None else line-return_genexp.__code__.co_firstlineno @@ -967,7 +967,7 @@ async def test(aseq): async for i in aseq: body - expected_lines = [None, 0, 1, 2, 1] + expected_lines = [0, 1, 2, 1] code_lines = [ None if line is None else line-test.__code__.co_firstlineno for (_, _, line) in test.__code__.co_lines() ] self.assertEqual(expected_lines, code_lines) diff --git a/Lib/test/test_generators.py b/Lib/test/test_generators.py index 4f4fd9c5aa76e..87a7dd69d106c 100644 --- a/Lib/test/test_generators.py +++ b/Lib/test/test_generators.py @@ -897,7 +897,7 @@ def b(): >>> type(i) >>> [s for s in dir(i) if not s.startswith('_')] -['close', 'gi_code', 'gi_frame', 'gi_running', 'gi_yieldfrom', 'send', 'throw'] +['close', 'gi_code', 'gi_frame', 'gi_running', 'gi_suspended', 'gi_yieldfrom', 'send', 'throw'] >>> from test.support import HAVE_DOCSTRINGS >>> print(i.__next__.__doc__ if HAVE_DOCSTRINGS else 'Implement next(self).') Implement next(self). 
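For illustration only (not part of the patch above): on a build that includes this change, the new ``gi_suspended`` attribute and the updated ``inspect.getgeneratorstate`` logic can be observed with a minimal sketch like the following::

    import inspect

    def gen():
        yield 1

    g = gen()
    print(g.gi_suspended, inspect.getgeneratorstate(g))  # False GEN_CREATED
    next(g)                                              # park the generator at the yield
    print(g.gi_suspended, inspect.getgeneratorstate(g))  # True GEN_SUSPENDED
    g.close()
    print(g.gi_suspended, inspect.getgeneratorstate(g))  # False GEN_CLOSED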
diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py index 2c8c6ab6cee76..accd35e4ab271 100644 --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -1386,7 +1386,7 @@ class C(object): pass def func(): return sys._getframe() x = func() - check(x, size('3Pi3c8P2ic?P')) + check(x, size('3Pi3c7P2ic??P')) # function def func(): pass check(func, size('14Pi')) @@ -1403,7 +1403,7 @@ def bar(cls): check(bar, size('PP')) # generator def get_gen(): yield 1 - check(get_gen(), size('P2P4P4c8P2ic?P')) + check(get_gen(), size('P2P4P4c7P2ic??P')) # iterator check(iter('abc'), size('lP')) # callable-iterator diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-17-12-57-27.bpo-46409.HouS6m.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-17-12-57-27.bpo-46409.HouS6m.rst new file mode 100644 index 0000000000000..aa61bc5201118 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-17-12-57-27.bpo-46409.HouS6m.rst @@ -0,0 +1,6 @@ +Add new ``RETURN_GENERATOR`` bytecode to make generators. +Simplifies calling Python functions in the VM, as there is no +longer any need to special-case generator functions. + +Also add ``JUMP_NO_INTERRUPT`` bytecode that acts like +``JUMP_ABSOLUTE``, but does not check for interrupts. diff --git a/Objects/frameobject.c b/Objects/frameobject.c index 4dd2183040dac..81ad4cc65d150 100644 --- a/Objects/frameobject.c +++ b/Objects/frameobject.c @@ -242,6 +242,7 @@ mark_stacks(PyCodeObject *code_obj, int len) break; } case JUMP_ABSOLUTE: + case JUMP_NO_INTERRUPT: j = get_arg(code, i); assert(j < len); if (stacks[j] == UNINITIALIZED && j < i) { @@ -625,7 +626,7 @@ frame_dealloc(PyFrameObject *f) { /* It is the responsibility of the owning generator/coroutine * to have cleared the generator pointer */ - assert(f->f_frame->generator == NULL); + assert(!f->f_frame->is_generator); if (_PyObject_GC_IS_TRACKED(f)) { _PyObject_GC_UNTRACK(f); @@ -698,8 +699,11 @@ frame_clear(PyFrameObject *f, PyObject *Py_UNUSED(ignored)) "cannot clear an executing frame"); return NULL; } - if (f->f_frame->generator) { - _PyGen_Finalize(f->f_frame->generator); + if (f->f_frame->is_generator) { + assert(!f->f_owns_frame); + size_t offset_in_gen = offsetof(PyGenObject, gi_iframe); + PyObject *gen = (PyObject *)(((char *)f->f_frame) - offset_in_gen); + _PyGen_Finalize(gen); } (void)frame_tp_clear(f); Py_RETURN_NONE; diff --git a/Objects/genobject.c b/Objects/genobject.c index d093f3dd7de30..46b019051a064 100644 --- a/Objects/genobject.c +++ b/Objects/genobject.c @@ -87,7 +87,7 @@ _PyGen_Finalize(PyObject *self) issue a RuntimeWarning.
*/ if (gen->gi_code != NULL && ((PyCodeObject *)gen->gi_code)->co_flags & CO_COROUTINE && - ((InterpreterFrame *)gen->gi_iframe)->f_lasti == -1) + ((InterpreterFrame *)gen->gi_iframe)->f_state == FRAME_CREATED) { _PyErr_WarnUnawaitedCoroutine((PyObject *)gen); } @@ -133,7 +133,7 @@ gen_dealloc(PyGenObject *gen) if (gen->gi_frame_valid) { InterpreterFrame *frame = (InterpreterFrame *)gen->gi_iframe; gen->gi_frame_valid = 0; - frame->generator = NULL; + frame->is_generator = false; frame->previous = NULL; _PyFrame_Clear(frame); } @@ -156,7 +156,7 @@ gen_send_ex2(PyGenObject *gen, PyObject *arg, PyObject **presult, PyObject *result; *presult = NULL; - if (frame->f_lasti < 0 && arg && arg != Py_None) { + if (frame->f_state == FRAME_CREATED && arg && arg != Py_None) { const char *msg = "can't send non-None value to a " "just-started generator"; if (PyCoro_CheckExact(gen)) { @@ -265,7 +265,7 @@ gen_send_ex2(PyGenObject *gen, PyObject *arg, PyObject **presult, /* first clean reference cycle through stored exception traceback */ _PyErr_ClearExcState(&gen->gi_exc_state); - frame->generator = NULL; + frame->is_generator = false; gen->gi_frame_valid = 0; _PyFrame_Clear(frame); *presult = result; @@ -753,6 +753,15 @@ gen_getrunning(PyGenObject *gen, void *Py_UNUSED(ignored)) return PyBool_FromLong(_PyFrame_IsExecuting((InterpreterFrame *)gen->gi_iframe)); } +static PyObject * +gen_getsuspended(PyGenObject *gen, void *Py_UNUSED(ignored)) +{ + if (gen->gi_frame_valid == 0) { + Py_RETURN_FALSE; + } + return PyBool_FromLong(((InterpreterFrame *)gen->gi_iframe)->f_state == FRAME_SUSPENDED); +} + static PyObject * _gen_getframe(PyGenObject *gen, const char *const name) { @@ -780,6 +789,7 @@ static PyGetSetDef gen_getsetlist[] = { PyDoc_STR("object being iterated by yield from, or None")}, {"gi_running", (getter)gen_getrunning, NULL, NULL}, {"gi_frame", (getter)gen_getframe, NULL, NULL}, + {"gi_suspended", (getter)gen_getsuspended, NULL, NULL}, {NULL} /* Sentinel */ }; @@ -886,22 +896,16 @@ make_gen(PyTypeObject *type, PyFunctionObject *func) gen->gi_weakreflist = NULL; gen->gi_exc_state.exc_value = NULL; gen->gi_exc_state.previous_item = NULL; - if (func->func_name != NULL) - gen->gi_name = func->func_name; - else - gen->gi_name = gen->gi_code->co_name; - Py_INCREF(gen->gi_name); - if (func->func_qualname != NULL) - gen->gi_qualname = func->func_qualname; - else - gen->gi_qualname = gen->gi_name; - Py_INCREF(gen->gi_qualname); + assert(func->func_name != NULL); + gen->gi_name = Py_NewRef(func->func_name); + assert(func->func_qualname != NULL); + gen->gi_qualname = Py_NewRef(func->func_qualname); _PyObject_GC_TRACK(gen); return (PyObject *)gen; } static PyObject * -compute_cr_origin(int origin_depth); +compute_cr_origin(int origin_depth, InterpreterFrame *current_frame); PyObject * _Py_MakeCoro(PyFunctionObject *func) @@ -935,7 +939,8 @@ _Py_MakeCoro(PyFunctionObject *func) if (origin_depth == 0) { ((PyCoroObject *)coro)->cr_origin_or_finalizer = NULL; } else { - PyObject *cr_origin = compute_cr_origin(origin_depth); + assert(_PyEval_GetFrame()); + PyObject *cr_origin = compute_cr_origin(origin_depth, _PyEval_GetFrame()->previous); ((PyCoroObject *)coro)->cr_origin_or_finalizer = cr_origin; if (!cr_origin) { Py_DECREF(coro); @@ -965,7 +970,7 @@ gen_new_with_qualname(PyTypeObject *type, PyFrameObject *f, assert(frame->frame_obj == f); f->f_owns_frame = 0; f->f_frame = frame; - frame->generator = (PyObject *) gen; + frame->is_generator = true; assert(PyObject_GC_IsTracked((PyObject *)f)); gen->gi_code = 
PyFrame_GetCode(f); Py_INCREF(gen->gi_code); @@ -1097,6 +1102,15 @@ coro_get_cr_await(PyCoroObject *coro, void *Py_UNUSED(ignored)) return yf; } +static PyObject * +cr_getsuspended(PyCoroObject *coro, void *Py_UNUSED(ignored)) +{ + if (coro->cr_frame_valid == 0) { + Py_RETURN_FALSE; + } + return PyBool_FromLong(((InterpreterFrame *)coro->cr_iframe)->f_state == FRAME_SUSPENDED); +} + static PyObject * cr_getrunning(PyCoroObject *coro, void *Py_UNUSED(ignored)) { @@ -1122,6 +1136,7 @@ static PyGetSetDef coro_getsetlist[] = { PyDoc_STR("object being awaited on, or None")}, {"cr_running", (getter)cr_getrunning, NULL, NULL}, {"cr_frame", (getter)cr_getframe, NULL, NULL}, + {"cr_suspended", (getter)cr_getsuspended, NULL, NULL}, {NULL} /* Sentinel */ }; @@ -1299,9 +1314,9 @@ PyTypeObject _PyCoroWrapper_Type = { }; static PyObject * -compute_cr_origin(int origin_depth) +compute_cr_origin(int origin_depth, InterpreterFrame *current_frame) { - InterpreterFrame *frame = _PyEval_GetFrame(); + InterpreterFrame *frame = current_frame; /* First count how many frames we have */ int frame_count = 0; for (; frame && frame_count < origin_depth; ++frame_count) { @@ -1313,7 +1328,7 @@ compute_cr_origin(int origin_depth) if (cr_origin == NULL) { return NULL; } - frame = _PyEval_GetFrame(); + frame = current_frame; for (int i = 0; i < frame_count; ++i) { PyCodeObject *code = frame->f_code; PyObject *frameinfo = Py_BuildValue("OiO", @@ -1345,7 +1360,7 @@ PyCoro_New(PyFrameObject *f, PyObject *name, PyObject *qualname) if (origin_depth == 0) { ((PyCoroObject *)coro)->cr_origin_or_finalizer = NULL; } else { - PyObject *cr_origin = compute_cr_origin(origin_depth); + PyObject *cr_origin = compute_cr_origin(origin_depth, _PyEval_GetFrame()); ((PyCoroObject *)coro)->cr_origin_or_finalizer = cr_origin; if (!cr_origin) { Py_DECREF(coro); diff --git a/Python/ceval.c b/Python/ceval.c index 70a7750f81190..9aaddd99edacf 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1345,7 +1345,7 @@ eval_frame_handle_pending(PyThreadState *tstate) #define CHECK_EVAL_BREAKER() \ if (_Py_atomic_load_relaxed(eval_breaker)) { \ - goto check_eval_breaker; \ + goto handle_eval_breaker; \ } @@ -1620,12 +1620,6 @@ trace_function_exit(PyThreadState *tstate, InterpreterFrame *frame, PyObject *re return 0; } -static PyObject * -make_coro(PyThreadState *tstate, PyFunctionObject *func, - PyObject *locals, - PyObject* const* args, size_t argcount, - PyObject *kwnames); - static int skip_backwards_over_extended_args(PyCodeObject *code, int offset) { @@ -1760,49 +1754,21 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr assert(!_PyErr_Occurred(tstate)); #endif -check_eval_breaker: - { - assert(STACK_LEVEL() >= 0); /* else underflow */ - assert(STACK_LEVEL() <= frame->f_code->co_stacksize); /* else overflow */ - assert(!_PyErr_Occurred(tstate)); - - /* Do periodic things. Doing this every time through - the loop would add too much overhead, so we do it - only every Nth instruction. We also do it if - ``pending.calls_to_do'' is set, i.e. when an asynchronous - event needs attention (e.g. a signal handler or - async I/O handler); see Py_AddPendingCall() and - Py_MakePendingCalls() above. */ - - if (_Py_atomic_load_relaxed(eval_breaker)) { - opcode = _Py_OPCODE(*next_instr); - if (opcode != BEFORE_ASYNC_WITH && - opcode != SEND && - _Py_OPCODE(next_instr[-1]) != SEND) { - /* Few cases where we skip running signal handlers and other - pending calls: - - If we're about to enter the 'with:'. 
It will prevent - emitting a resource warning in the common idiom - 'with open(path) as file:'. - - If we're about to enter the 'async with:'. - - If we're about to enter the 'try:' of a try/finally (not - *very* useful, but might help in some cases and it's - traditional) - - If we're resuming a chain of nested 'yield from' or - 'await' calls, then each frame is parked with YIELD_FROM - as its next opcode. If the user hit control-C we want to - wait until we've reached the innermost frame before - running the signal handler and raising KeyboardInterrupt - (see bpo-30039). - */ - if (eval_frame_handle_pending(tstate) != 0) { - goto error; - } - } - } + DISPATCH(); +handle_eval_breaker: + + /* Do periodic things, like check for signals and async I/0. + * We need to do reasonably frequently, but not too frequently. + * All loops should include a check of the eval breaker. + * We also check on return from any builtin function. + */ + if (eval_frame_handle_pending(tstate) != 0) { + goto error; + } DISPATCH(); + { /* Start instructions */ #if USE_COMPUTED_GOTOS { @@ -1834,6 +1800,9 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr next_instr = first_instr + nexti; } frame->f_state = FRAME_EXECUTING; + if (_Py_atomic_load_relaxed(eval_breaker) && oparg < 2) { + goto handle_eval_breaker; + } DISPATCH(); } @@ -4152,6 +4121,17 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr DISPATCH(); } + TARGET(JUMP_NO_INTERRUPT) { + /* This bytecode is used in the `yield from` or `await` loop. + * If there is an interrupt, we want it handled in the innermost + * generator or coroutine, so we deliberately do not check it here. + * (see bpo-30039). + */ + frame->f_state = FRAME_EXECUTING; + JUMPTO(oparg); + DISPATCH(); + } + TARGET(JUMP_ABSOLUTE_QUICK) { assert(oparg < INSTR_OFFSET()); JUMPTO(oparg); @@ -4627,28 +4607,25 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr // Check if the call can be inlined or not if (Py_TYPE(function) == &PyFunction_Type && tstate->interp->eval_frame == NULL) { int code_flags = ((PyCodeObject*)PyFunction_GET_CODE(function))->co_flags; - int is_generator = code_flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR); - if (!is_generator) { - PyObject *locals = code_flags & CO_OPTIMIZED ? NULL : PyFunction_GET_GLOBALS(function); - STACK_SHRINK(oparg); - InterpreterFrame *new_frame = _PyEvalFramePushAndInit( - tstate, (PyFunctionObject *)function, locals, - stack_pointer, nargs, kwnames - ); - STACK_SHRINK(postcall_shrink); - RESET_STACK_ADJUST_FOR_CALLS; - // The frame has stolen all the arguments from the stack, - // so there is no need to clean them up. - Py_XDECREF(kwnames); - Py_DECREF(function); - if (new_frame == NULL) { - goto error; - } - _PyFrame_SetStackPointer(frame, stack_pointer); - new_frame->previous = frame; - cframe.current_frame = frame = new_frame; - goto start_frame; + PyObject *locals = code_flags & CO_OPTIMIZED ? NULL : PyFunction_GET_GLOBALS(function); + STACK_SHRINK(oparg); + InterpreterFrame *new_frame = _PyEvalFramePushAndInit( + tstate, (PyFunctionObject *)function, locals, + stack_pointer, nargs, kwnames + ); + STACK_SHRINK(postcall_shrink); + RESET_STACK_ADJUST_FOR_CALLS; + // The frame has stolen all the arguments from the stack, + // so there is no need to clean them up. 
+ Py_XDECREF(kwnames); + Py_DECREF(function); + if (new_frame == NULL) { + goto error; } + _PyFrame_SetStackPointer(frame, stack_pointer); + new_frame->previous = frame; + cframe.current_frame = frame = new_frame; + goto start_frame; } /* Callable is not a normal Python function */ PyObject *res; @@ -5076,6 +5053,40 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr DISPATCH(); } + TARGET(RETURN_GENERATOR) { + PyGenObject *gen = (PyGenObject *)_Py_MakeCoro(frame->f_func); + if (gen == NULL) { + goto error; + } + assert(EMPTY()); + _PyFrame_SetStackPointer(frame, stack_pointer); + InterpreterFrame *gen_frame = (InterpreterFrame *)gen->gi_iframe; + _PyFrame_Copy(frame, gen_frame); + assert(frame->frame_obj == NULL); + gen->gi_frame_valid = 1; + gen_frame->is_generator = true; + gen_frame->f_state = FRAME_CREATED; + _Py_LeaveRecursiveCall(tstate); + if (!frame->is_entry) { + InterpreterFrame *prev = frame->previous; + _PyThreadState_PopFrame(tstate, frame); + frame = cframe.current_frame = prev; + _PyFrame_StackPush(frame, (PyObject *)gen); + goto resume_frame; + } + /* Make sure that frame is in a valid state */ + frame->stacktop = 0; + frame->f_locals = NULL; + Py_INCREF(frame->f_func); + Py_INCREF(frame->f_code); + /* Restore previous cframe and return. */ + tstate->cframe = cframe.previous; + tstate->cframe->use_tracing = cframe.use_tracing; + assert(tstate->cframe->current_frame == frame->previous); + assert(!_PyErr_Occurred(tstate)); + return (PyObject *)gen; + } + TARGET(BUILD_SLICE) { PyObject *start, *stop, *step, *slice; if (oparg == 3) @@ -5222,11 +5233,14 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr frame->f_lasti = INSTR_OFFSET(); TRACING_NEXTOPARG(); if (opcode == RESUME) { + if (oparg < 2) { + CHECK_EVAL_BREAKER(); + } /* Call tracing */ TRACE_FUNCTION_ENTRY(); DTRACE_FUNCTION_ENTRY(); } - else { + else if (frame->f_state > FRAME_CREATED) { /* line-by-line tracing support */ if (PyDTrace_LINE_ENABLED()) { maybe_dtrace_line(frame, &tstate->trace_info, instr_prev); @@ -5961,33 +5975,6 @@ initialize_locals(PyThreadState *tstate, PyFunctionObject *func, return -1; } -/* Consumes all the references to the args */ -static PyObject * -make_coro(PyThreadState *tstate, PyFunctionObject *func, - PyObject *locals, - PyObject* const* args, size_t argcount, - PyObject *kwnames) -{ - assert (((PyCodeObject *)func->func_code)->co_flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)); - PyObject *gen = _Py_MakeCoro(func); - if (gen == NULL) { - return NULL; - } - InterpreterFrame *frame = (InterpreterFrame *)((PyGenObject *)gen)->gi_iframe; - PyCodeObject *code = (PyCodeObject *)func->func_code; - _PyFrame_InitializeSpecials(frame, func, locals, code->co_nlocalsplus); - for (int i = 0; i < code->co_nlocalsplus; i++) { - frame->localsplus[i] = NULL; - } - ((PyGenObject *)gen)->gi_frame_valid = 1; - if (initialize_locals(tstate, func, frame->localsplus, args, argcount, kwnames)) { - Py_DECREF(gen); - return NULL; - } - frame->generator = gen; - return gen; -} - /* Consumes all the references to the args */ static InterpreterFrame * _PyEvalFramePushAndInit(PyThreadState *tstate, PyFunctionObject *func, @@ -6041,10 +6028,7 @@ _PyEval_Vector(PyThreadState *tstate, PyFunctionObject *func, PyObject* const* args, size_t argcount, PyObject *kwnames) { - PyCodeObject *code = (PyCodeObject *)func->func_code; - /* _PyEvalFramePushAndInit and make_coro consume - * all the references to their arguments - */ + /* 
_PyEvalFramePushAndInit consumes all the references to its arguments */ for (size_t i = 0; i < argcount; i++) { Py_INCREF(args[i]); } @@ -6054,19 +6038,16 @@ _PyEval_Vector(PyThreadState *tstate, PyFunctionObject *func, Py_INCREF(args[i+argcount]); } } - int is_coro = code->co_flags & - (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR); - if (is_coro) { - return make_coro(tstate, func, locals, args, argcount, kwnames); - } InterpreterFrame *frame = _PyEvalFramePushAndInit( tstate, func, locals, args, argcount, kwnames); if (frame == NULL) { return NULL; } PyObject *retval = _PyEval_EvalFrame(tstate, frame, 0); - assert(frame->stacktop >= 0); - assert(_PyFrame_GetStackPointer(frame) == _PyFrame_Stackbase(frame)); + assert( + _PyFrame_GetStackPointer(frame) == _PyFrame_Stackbase(frame) || + _PyFrame_GetStackPointer(frame) == frame->localsplus + ); _PyEvalFrameClearAndPop(tstate, frame); return retval; } diff --git a/Python/compile.c b/Python/compile.c index 86f888ef8a394..5d32959db3b65 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -969,6 +969,7 @@ stack_effect(int opcode, int oparg, int jump) /* Jumps */ case JUMP_FORWARD: case JUMP_ABSOLUTE: + case JUMP_NO_INTERRUPT: return 0; case JUMP_IF_TRUE_OR_POP: @@ -1017,6 +1018,9 @@ stack_effect(int opcode, int oparg, int jump) case DELETE_FAST: return 0; + case RETURN_GENERATOR: + return 0; + case RAISE_VARARGS: return -oparg; @@ -1841,7 +1845,7 @@ compiler_add_yield_from(struct compiler *c, int await) ADDOP_JUMP(c, SEND, exit); compiler_use_next_block(c, resume); ADDOP_I(c, RESUME, await ? 3 : 2); - ADDOP_JUMP(c, JUMP_ABSOLUTE, start); + ADDOP_JUMP(c, JUMP_NO_INTERRUPT, start); compiler_use_next_block(c, exit); return 1; } @@ -7055,6 +7059,7 @@ stackdepth(struct compiler *c) } depth = new_depth; if (instr->i_opcode == JUMP_ABSOLUTE || + instr->i_opcode == JUMP_NO_INTERRUPT || instr->i_opcode == JUMP_FORWARD || instr->i_opcode == RETURN_VALUE || instr->i_opcode == RAISE_VARARGS || @@ -7572,9 +7577,6 @@ normalize_jumps(struct assembler *a) if (last->i_target->b_visited == 0) { last->i_opcode = JUMP_FORWARD; } - else if (b->b_iused >= 2 && b->b_instr[b->b_iused-2].i_opcode == SEND) { - last->i_opcode = JUMP_ABSOLUTE_QUICK; - } } } } @@ -7998,6 +8000,34 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, } assert(c->u->u_firstlineno > 0); + /* Add the generator prefix instructions. */ + if (flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) { + struct instr make_gen = { + .i_opcode = RETURN_GENERATOR, + .i_oparg = 0, + .i_lineno = c->u->u_firstlineno, + .i_col_offset = -1, + .i_end_lineno = c->u->u_firstlineno, + .i_end_col_offset = -1, + .i_target = NULL, + }; + if (insert_instruction(entryblock, 0, &make_gen) < 0) { + return -1; + } + struct instr pop_top = { + .i_opcode = POP_TOP, + .i_oparg = 0, + .i_lineno = -1, + .i_col_offset = -1, + .i_end_lineno = -1, + .i_end_col_offset = -1, + .i_target = NULL, + }; + if (insert_instruction(entryblock, 1, &pop_top) < 0) { + return -1; + } + } + /* Set up cells for any variable that escapes, to be put in a closure. */ const int ncellvars = (int)PyDict_GET_SIZE(c->u->u_cellvars); if (ncellvars) { @@ -8036,22 +8066,6 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, PyMem_RawFree(sorted); } - /* Add the generator prefix instructions. 
*/ - if (flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) { - struct instr pop_top = { - .i_opcode = POP_TOP, - .i_oparg = 0, - .i_lineno = -1, - .i_col_offset = -1, - .i_end_lineno = -1, - .i_end_col_offset = -1, - .i_target = NULL, - }; - if (insert_instruction(entryblock, 0, &pop_top) < 0) { - return -1; - } - } - if (nfreevars) { struct instr copy_frees = { .i_opcode = COPY_FREE_VARS, @@ -8801,6 +8815,7 @@ normalize_basic_block(basicblock *bb) { break; case JUMP_ABSOLUTE: case JUMP_FORWARD: + case JUMP_NO_INTERRUPT: bb->b_nofallthrough = 1; /* fall through */ case POP_JUMP_IF_NOT_NONE: @@ -8985,6 +9000,7 @@ optimize_cfg(struct compiler *c, struct assembler *a, PyObject *consts) if (b->b_iused > 0) { struct instr *b_last_instr = &b->b_instr[b->b_iused - 1]; if (b_last_instr->i_opcode == JUMP_ABSOLUTE || + b_last_instr->i_opcode == JUMP_NO_INTERRUPT || b_last_instr->i_opcode == JUMP_FORWARD) { if (b_last_instr->i_target == b->b_next) { assert(b->b_next->b_iused); diff --git a/Python/frame.c b/Python/frame.c index da2c1c4aec075..9578747c19d87 100644 --- a/Python/frame.c +++ b/Python/frame.c @@ -3,6 +3,7 @@ #include "frameobject.h" #include "pycore_frame.h" #include "pycore_object.h" // _PyObject_GC_UNTRACK() +#include "opcode.h" int _PyFrame_Traverse(InterpreterFrame *frame, visitproc visit, void *arg) @@ -51,15 +52,6 @@ _PyFrame_Copy(InterpreterFrame *src, InterpreterFrame *dest) memcpy(dest, src, size); } -static inline void -clear_specials(InterpreterFrame *frame) -{ - frame->generator = NULL; - Py_XDECREF(frame->frame_obj); - Py_XDECREF(frame->f_locals); - Py_DECREF(frame->f_func); - Py_DECREF(frame->f_code); -} static void take_ownership(PyFrameObject *f, InterpreterFrame *frame) @@ -94,8 +86,8 @@ void _PyFrame_Clear(InterpreterFrame * frame) { /* It is the responsibility of the owning generator/coroutine - * to have cleared the generator pointer */ - assert(frame->generator == NULL); + * to have cleared the enclosing generator, if any. 
*/ + assert(!frame->is_generator); if (frame->frame_obj) { PyFrameObject *f = frame->frame_obj; frame->frame_obj = NULL; @@ -110,5 +102,8 @@ _PyFrame_Clear(InterpreterFrame * frame) for (int i = 0; i < frame->stacktop; i++) { Py_XDECREF(frame->localsplus[i]); } - clear_specials(frame); + Py_XDECREF(frame->frame_obj); + Py_XDECREF(frame->f_locals); + Py_DECREF(frame->f_func); + Py_DECREF(frame->f_code); } diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index c78425ff9bb64..11ac0e975fdcd 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -74,19 +74,19 @@ static void *opcode_targets[256] = { &&TARGET_LOAD_METHOD_CACHED, &&TARGET_GET_AWAITABLE, &&TARGET_LOAD_ASSERTION_ERROR, + &&TARGET_RETURN_GENERATOR, &&TARGET_LOAD_METHOD_CLASS, &&TARGET_LOAD_METHOD_MODULE, &&TARGET_LOAD_METHOD_NO_DICT, &&TARGET_STORE_ATTR_ADAPTIVE, &&TARGET_STORE_ATTR_INSTANCE_VALUE, &&TARGET_STORE_ATTR_SLOT, - &&TARGET_STORE_ATTR_WITH_HINT, &&TARGET_LIST_TO_TUPLE, &&TARGET_RETURN_VALUE, &&TARGET_IMPORT_STAR, &&TARGET_SETUP_ANNOTATIONS, &&TARGET_YIELD_VALUE, - &&TARGET_LOAD_FAST__LOAD_FAST, + &&TARGET_STORE_ATTR_WITH_HINT, &&TARGET_PREP_RERAISE_STAR, &&TARGET_POP_EXCEPT, &&TARGET_STORE_NAME, @@ -130,26 +130,26 @@ static void *opcode_targets[256] = { &&TARGET_POP_JUMP_IF_NOT_NONE, &&TARGET_POP_JUMP_IF_NONE, &&TARGET_RAISE_VARARGS, - &&TARGET_STORE_FAST__LOAD_FAST, + &&TARGET_LOAD_FAST__LOAD_FAST, &&TARGET_MAKE_FUNCTION, &&TARGET_BUILD_SLICE, - &&TARGET_LOAD_FAST__LOAD_CONST, + &&TARGET_JUMP_NO_INTERRUPT, &&TARGET_MAKE_CELL, &&TARGET_LOAD_CLOSURE, &&TARGET_LOAD_DEREF, &&TARGET_STORE_DEREF, &&TARGET_DELETE_DEREF, - &&TARGET_LOAD_CONST__LOAD_FAST, - &&TARGET_STORE_FAST__STORE_FAST, + &&TARGET_STORE_FAST__LOAD_FAST, + &&TARGET_LOAD_FAST__LOAD_CONST, &&TARGET_CALL_FUNCTION_EX, - &&_unknown_opcode, + &&TARGET_LOAD_CONST__LOAD_FAST, &&TARGET_EXTENDED_ARG, &&TARGET_LIST_APPEND, &&TARGET_SET_ADD, &&TARGET_MAP_ADD, &&TARGET_LOAD_CLASSDEREF, &&TARGET_COPY_FREE_VARS, - &&_unknown_opcode, + &&TARGET_STORE_FAST__STORE_FAST, &&TARGET_RESUME, &&TARGET_MATCH_CLASS, &&_unknown_opcode, diff --git a/Python/specialize.c b/Python/specialize.c index 7c2252dd7a0e5..e32986ad9d61a 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -499,7 +499,6 @@ initial_counter_value(void) { #define SPEC_FAIL_DIFFERENT_TYPES 12 /* Calls */ -#define SPEC_FAIL_GENERATOR 7 #define SPEC_FAIL_COMPLEX_PARAMETERS 8 #define SPEC_FAIL_WRONG_NUMBER_ARGUMENTS 9 #define SPEC_FAIL_CO_NOT_OPTIMIZED 10 @@ -1153,9 +1152,6 @@ _Py_IDENTIFIER(__getitem__); static int function_kind(PyCodeObject *code) { int flags = code->co_flags; - if (flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) { - return SPEC_FAIL_GENERATOR; - } if ((flags & (CO_VARKEYWORDS | CO_VARARGS)) || code->co_kwonlyargcount) { return SPEC_FAIL_COMPLEX_PARAMETERS; } From webhook-mailer at python.org Thu Jan 20 08:05:28 2022 From: webhook-mailer at python.org (pablogsal) Date: Thu, 20 Jan 2022 13:05:28 -0000 Subject: [Python-checkins] [3.10] bpo-46339: Fix crash in the parser when computing error text for multi-line f-strings (GH-30529) (GH-30542) Message-ID: https://github.com/python/cpython/commit/1fb1f5d8bd084c20f0a5fde547b563c08d103f09 commit: 1fb1f5d8bd084c20f0a5fde547b563c08d103f09 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: pablogsal date: 2022-01-20T13:05:10Z summary: [3.10] bpo-46339: Fix crash in the parser when computing error text for multi-line f-strings (GH-30529) (GH-30542) * bpo-46339: 
Fix crash in the parser when computing error text for multi-line f-strings (GH-30529) Automerge-Triggered-By: GH:pablogsal (cherry picked from commit cedec19be81e6bd153678bfb28c8e217af8bda58) Co-authored-by: Pablo Galindo Salgado * Fix interactive mode Co-authored-by: Pablo Galindo Salgado files: A Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst D Parser/pegen_errors.c M Lib/test/test_exceptions.py M Parser/pegen.c diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index 86b5dccaaed98..b3d1c35274c71 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -278,6 +278,12 @@ def baz(): } \"\"\" }'''""", 5, 17) + check('''f""" + + + { + 6 + 0="""''', 5, 13) # Errors thrown by symtable.c check('x = [(yield i) for i in range(3)]', 1, 7) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst new file mode 100644 index 0000000000000..cd04f060826b2 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-11-11-50-19.bpo-46339.OVumDZ.rst @@ -0,0 +1,3 @@ +Fix a crash in the parser when retrieving the error text for multi-line +f-strings expressions that do not start in the first line of the string. +Patch by Pablo Galindo diff --git a/Parser/pegen.c b/Parser/pegen.c index e507415f6d14c..f9812c0ea8f02 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -436,9 +436,17 @@ get_error_line(Parser *p, Py_ssize_t lineno) char *cur_line = p->tok->fp_interactive ? p->tok->interactive_src_start : p->tok->str; assert(cur_line != NULL); + const char* buf_end = p->tok->fp_interactive ? p->tok->interactive_src_end : p->tok->inp; - for (int i = 0; i < lineno - 1; i++) { - cur_line = strchr(cur_line, '\n') + 1; + Py_ssize_t relative_lineno = p->starting_lineno ? 
lineno - p->starting_lineno + 1 : lineno; + + for (int i = 0; i < relative_lineno - 1; i++) { + char *new_line = strchr(cur_line, '\n') + 1; + assert(new_line != NULL && new_line <= buf_end); + if (new_line == NULL || new_line > buf_end) { + break; + } + cur_line = new_line; } char *next_newline; diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c deleted file mode 100644 index 93057d151db38..0000000000000 --- a/Parser/pegen_errors.c +++ /dev/null @@ -1,425 +0,0 @@ -#include -#include - -#include "tokenizer.h" -#include "pegen.h" - -// TOKENIZER ERRORS - -void -_PyPegen_raise_tokenizer_init_error(PyObject *filename) -{ - if (!(PyErr_ExceptionMatches(PyExc_LookupError) - || PyErr_ExceptionMatches(PyExc_SyntaxError) - || PyErr_ExceptionMatches(PyExc_ValueError) - || PyErr_ExceptionMatches(PyExc_UnicodeDecodeError))) { - return; - } - PyObject *errstr = NULL; - PyObject *tuple = NULL; - PyObject *type; - PyObject *value; - PyObject *tback; - PyErr_Fetch(&type, &value, &tback); - errstr = PyObject_Str(value); - if (!errstr) { - goto error; - } - - PyObject *tmp = Py_BuildValue("(OiiO)", filename, 0, -1, Py_None); - if (!tmp) { - goto error; - } - - tuple = PyTuple_Pack(2, errstr, tmp); - Py_DECREF(tmp); - if (!value) { - goto error; - } - PyErr_SetObject(PyExc_SyntaxError, tuple); - -error: - Py_XDECREF(type); - Py_XDECREF(value); - Py_XDECREF(tback); - Py_XDECREF(errstr); - Py_XDECREF(tuple); -} - -static inline void -raise_unclosed_parentheses_error(Parser *p) { - int error_lineno = p->tok->parenlinenostack[p->tok->level-1]; - int error_col = p->tok->parencolstack[p->tok->level-1]; - RAISE_ERROR_KNOWN_LOCATION(p, PyExc_SyntaxError, - error_lineno, error_col, error_lineno, -1, - "'%c' was never closed", - p->tok->parenstack[p->tok->level-1]); -} - -int -_Pypegen_tokenizer_error(Parser *p) -{ - if (PyErr_Occurred()) { - return -1; - } - - const char *msg = NULL; - PyObject* errtype = PyExc_SyntaxError; - Py_ssize_t col_offset = -1; - switch (p->tok->done) { - case E_TOKEN: - msg = "invalid token"; - break; - case E_EOF: - if (p->tok->level) { - raise_unclosed_parentheses_error(p); - } else { - RAISE_SYNTAX_ERROR("unexpected EOF while parsing"); - } - return -1; - case E_DEDENT: - RAISE_INDENTATION_ERROR("unindent does not match any outer indentation level"); - return -1; - case E_INTR: - if (!PyErr_Occurred()) { - PyErr_SetNone(PyExc_KeyboardInterrupt); - } - return -1; - case E_NOMEM: - PyErr_NoMemory(); - return -1; - case E_TABSPACE: - errtype = PyExc_TabError; - msg = "inconsistent use of tabs and spaces in indentation"; - break; - case E_TOODEEP: - errtype = PyExc_IndentationError; - msg = "too many levels of indentation"; - break; - case E_LINECONT: { - col_offset = p->tok->cur - p->tok->buf - 1; - msg = "unexpected character after line continuation character"; - break; - } - default: - msg = "unknown parsing error"; - } - - RAISE_ERROR_KNOWN_LOCATION(p, errtype, p->tok->lineno, - col_offset >= 0 ? 
col_offset : 0, - p->tok->lineno, -1, msg); - return -1; -} - -int -_Pypegen_raise_decode_error(Parser *p) -{ - assert(PyErr_Occurred()); - const char *errtype = NULL; - if (PyErr_ExceptionMatches(PyExc_UnicodeError)) { - errtype = "unicode error"; - } - else if (PyErr_ExceptionMatches(PyExc_ValueError)) { - errtype = "value error"; - } - if (errtype) { - PyObject *type; - PyObject *value; - PyObject *tback; - PyObject *errstr; - PyErr_Fetch(&type, &value, &tback); - errstr = PyObject_Str(value); - if (errstr) { - RAISE_SYNTAX_ERROR("(%s) %U", errtype, errstr); - Py_DECREF(errstr); - } - else { - PyErr_Clear(); - RAISE_SYNTAX_ERROR("(%s) unknown error", errtype); - } - Py_XDECREF(type); - Py_XDECREF(value); - Py_XDECREF(tback); - } - - return -1; -} - -static int -_PyPegen_tokenize_full_source_to_check_for_errors(Parser *p) { - // Tokenize the whole input to see if there are any tokenization - // errors such as mistmatching parentheses. These will get priority - // over generic syntax errors only if the line number of the error is - // before the one that we had for the generic error. - - // We don't want to tokenize to the end for interactive input - if (p->tok->prompt != NULL) { - return 0; - } - - PyObject *type, *value, *traceback; - PyErr_Fetch(&type, &value, &traceback); - - Token *current_token = p->known_err_token != NULL ? p->known_err_token : p->tokens[p->fill - 1]; - Py_ssize_t current_err_line = current_token->lineno; - - int ret = 0; - - for (;;) { - const char *start; - const char *end; - switch (_PyTokenizer_Get(p->tok, &start, &end)) { - case ERRORTOKEN: - if (p->tok->level != 0) { - int error_lineno = p->tok->parenlinenostack[p->tok->level-1]; - if (current_err_line > error_lineno) { - raise_unclosed_parentheses_error(p); - ret = -1; - goto exit; - } - } - break; - case ENDMARKER: - break; - default: - continue; - } - break; - } - - -exit: - if (PyErr_Occurred()) { - Py_XDECREF(value); - Py_XDECREF(type); - Py_XDECREF(traceback); - } else { - PyErr_Restore(type, value, traceback); - } - return ret; -} - -// PARSER ERRORS - -void * -_PyPegen_raise_error(Parser *p, PyObject *errtype, const char *errmsg, ...) -{ - if (p->fill == 0) { - va_list va; - va_start(va, errmsg); - _PyPegen_raise_error_known_location(p, errtype, 0, 0, 0, -1, errmsg, va); - va_end(va); - return NULL; - } - - Token *t = p->known_err_token != NULL ? p->known_err_token : p->tokens[p->fill - 1]; - Py_ssize_t col_offset; - Py_ssize_t end_col_offset = -1; - if (t->col_offset == -1) { - if (p->tok->cur == p->tok->buf) { - col_offset = 0; - } else { - const char* start = p->tok->buf ? p->tok->line_start : p->tok->buf; - col_offset = Py_SAFE_DOWNCAST(p->tok->cur - start, intptr_t, int); - } - } else { - col_offset = t->col_offset + 1; - } - - if (t->end_col_offset != -1) { - end_col_offset = t->end_col_offset + 1; - } - - va_list va; - va_start(va, errmsg); - _PyPegen_raise_error_known_location(p, errtype, t->lineno, col_offset, t->end_lineno, end_col_offset, errmsg, va); - va_end(va); - - return NULL; -} - -static PyObject * -get_error_line_from_tokenizer_buffers(Parser *p, Py_ssize_t lineno) -{ - /* If the file descriptor is interactive, the source lines of the current - * (multi-line) statement are stored in p->tok->interactive_src_start. - * If not, we're parsing from a string, which means that the whole source - * is stored in p->tok->str. */ - assert((p->tok->fp == NULL && p->tok->str != NULL) || p->tok->fp == stdin); - - char *cur_line = p->tok->fp_interactive ? 
p->tok->interactive_src_start : p->tok->str; - assert(cur_line != NULL); - - for (int i = 0; i < lineno - 1; i++) { - cur_line = strchr(cur_line, '\n') + 1; - } - - char *next_newline; - if ((next_newline = strchr(cur_line, '\n')) == NULL) { // This is the last line - next_newline = cur_line + strlen(cur_line); - } - return PyUnicode_DecodeUTF8(cur_line, next_newline - cur_line, "replace"); -} - -void * -_PyPegen_raise_error_known_location(Parser *p, PyObject *errtype, - Py_ssize_t lineno, Py_ssize_t col_offset, - Py_ssize_t end_lineno, Py_ssize_t end_col_offset, - const char *errmsg, va_list va) -{ - PyObject *value = NULL; - PyObject *errstr = NULL; - PyObject *error_line = NULL; - PyObject *tmp = NULL; - p->error_indicator = 1; - - if (end_lineno == CURRENT_POS) { - end_lineno = p->tok->lineno; - } - if (end_col_offset == CURRENT_POS) { - end_col_offset = p->tok->cur - p->tok->line_start; - } - - if (p->start_rule == Py_fstring_input) { - const char *fstring_msg = "f-string: "; - Py_ssize_t len = strlen(fstring_msg) + strlen(errmsg); - - char *new_errmsg = PyMem_Malloc(len + 1); // Lengths of both strings plus NULL character - if (!new_errmsg) { - return (void *) PyErr_NoMemory(); - } - - // Copy both strings into new buffer - memcpy(new_errmsg, fstring_msg, strlen(fstring_msg)); - memcpy(new_errmsg + strlen(fstring_msg), errmsg, strlen(errmsg)); - new_errmsg[len] = 0; - errmsg = new_errmsg; - } - errstr = PyUnicode_FromFormatV(errmsg, va); - if (!errstr) { - goto error; - } - - if (p->tok->fp_interactive) { - error_line = get_error_line_from_tokenizer_buffers(p, lineno); - } - else if (p->start_rule == Py_file_input) { - error_line = _PyErr_ProgramDecodedTextObject(p->tok->filename, - (int) lineno, p->tok->encoding); - } - - if (!error_line) { - /* PyErr_ProgramTextObject was not called or returned NULL. If it was not called, - then we need to find the error line from some other source, because - p->start_rule != Py_file_input. If it returned NULL, then it either unexpectedly - failed or we're parsing from a string or the REPL. 
There's a third edge case where - we're actually parsing from a file, which has an E_EOF SyntaxError and in that case - `PyErr_ProgramTextObject` fails because lineno points to last_file_line + 1, which - does not physically exist */ - assert(p->tok->fp == NULL || p->tok->fp == stdin || p->tok->done == E_EOF); - - if (p->tok->lineno <= lineno && p->tok->inp > p->tok->buf) { - Py_ssize_t size = p->tok->inp - p->tok->buf; - error_line = PyUnicode_DecodeUTF8(p->tok->buf, size, "replace"); - } - else if (p->tok->fp == NULL || p->tok->fp == stdin) { - error_line = get_error_line_from_tokenizer_buffers(p, lineno); - } - else { - error_line = PyUnicode_FromStringAndSize("", 0); - } - if (!error_line) { - goto error; - } - } - - if (p->start_rule == Py_fstring_input) { - col_offset -= p->starting_col_offset; - end_col_offset -= p->starting_col_offset; - } - - Py_ssize_t col_number = col_offset; - Py_ssize_t end_col_number = end_col_offset; - - if (p->tok->encoding != NULL) { - col_number = _PyPegen_byte_offset_to_character_offset(error_line, col_offset); - if (col_number < 0) { - goto error; - } - if (end_col_number > 0) { - Py_ssize_t end_col_offset = _PyPegen_byte_offset_to_character_offset(error_line, end_col_number); - if (end_col_offset < 0) { - goto error; - } else { - end_col_number = end_col_offset; - } - } - } - tmp = Py_BuildValue("(OiiNii)", p->tok->filename, lineno, col_number, error_line, end_lineno, end_col_number); - if (!tmp) { - goto error; - } - value = PyTuple_Pack(2, errstr, tmp); - Py_DECREF(tmp); - if (!value) { - goto error; - } - PyErr_SetObject(errtype, value); - - Py_DECREF(errstr); - Py_DECREF(value); - if (p->start_rule == Py_fstring_input) { - PyMem_Free((void *)errmsg); - } - return NULL; - -error: - Py_XDECREF(errstr); - Py_XDECREF(error_line); - if (p->start_rule == Py_fstring_input) { - PyMem_Free((void *)errmsg); - } - return NULL; -} - -void -_Pypegen_set_syntax_error(Parser* p, Token* last_token) { - // Existing sintax error - if (PyErr_Occurred()) { - // Prioritize tokenizer errors to custom syntax errors raised - // on the second phase only if the errors come from the parser. - if (p->tok->done == E_DONE && PyErr_ExceptionMatches(PyExc_SyntaxError)) { - _PyPegen_tokenize_full_source_to_check_for_errors(p); - } - // Propagate the existing syntax error. - return; - } - // Initialization error - if (p->fill == 0) { - RAISE_SYNTAX_ERROR("error at start before reading any input"); - } - // Parser encountered EOF (End of File) unexpectedtly - if (last_token->type == ERRORTOKEN && p->tok->done == E_EOF) { - if (p->tok->level) { - raise_unclosed_parentheses_error(p); - } else { - RAISE_SYNTAX_ERROR("unexpected EOF while parsing"); - } - return; - } - // Indentation error in the tokenizer - if (last_token->type == INDENT || last_token->type == DEDENT) { - RAISE_INDENTATION_ERROR(last_token->type == INDENT ? "unexpected indent" : "unexpected unindent"); - return; - } - // Unknown error (generic case) - - // Use the last token we found on the first pass to avoid reporting - // incorrect locations for generic syntax errors just because we reached - // further away when trying to find specific syntax errors in the second - // pass. - RAISE_SYNTAX_ERROR_KNOWN_LOCATION(last_token, "invalid syntax"); - // _PyPegen_tokenize_full_source_to_check_for_errors will override the existing - // generic SyntaxError we just raised if errors are found. 
- _PyPegen_tokenize_full_source_to_check_for_errors(p); -} From webhook-mailer at python.org Thu Jan 20 10:34:24 2022 From: webhook-mailer at python.org (pablogsal) Date: Thu, 20 Jan 2022 15:34:24 -0000 Subject: [Python-checkins] Fix the caret position in some syntax errors in interactive mode (GH-30718) Message-ID: https://github.com/python/cpython/commit/650720a0cfa1673938e6d1bad53b6c37c9edb47d commit: 650720a0cfa1673938e6d1bad53b6c37c9edb47d branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-20T15:34:13Z summary: Fix the caret position in some syntax errors in interactive mode (GH-30718) files: M Parser/pegen_errors.c diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index f348ac3000dda..0be9df0ae5535 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -251,14 +251,15 @@ get_error_line_from_tokenizer_buffers(Parser *p, Py_ssize_t lineno) assert(cur_line != NULL); Py_ssize_t relative_lineno = p->starting_lineno ? lineno - p->starting_lineno + 1 : lineno; + const char* buf_end = p->tok->fp_interactive ? p->tok->interactive_src_end : p->tok->inp; for (int i = 0; i < relative_lineno - 1; i++) { char *new_line = strchr(cur_line, '\n') + 1; // The assert is here for debug builds but the conditional that // follows is there so in release builds we do not crash at the cost // to report a potentially wrong line. - assert(new_line != NULL && new_line < p->tok->inp); - if (new_line == NULL || new_line >= p->tok->inp) { + assert(new_line != NULL && new_line <= buf_end); + if (new_line == NULL || new_line > buf_end) { break; } cur_line = new_line; From webhook-mailer at python.org Thu Jan 20 11:39:19 2022 From: webhook-mailer at python.org (gvanrossum) Date: Thu, 20 Jan 2022 16:39:19 -0000 Subject: [Python-checkins] bpo-46429: Merge all deepfrozen files into one (GH-30572) Message-ID: https://github.com/python/cpython/commit/ef3ef6fa43d5cca072eed2a66064e818de583be7 commit: ef3ef6fa43d5cca072eed2a66064e818de583be7 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: gvanrossum date: 2022-01-20T08:38:39-08:00 summary: bpo-46429: Merge all deepfrozen files into one (GH-30572) files: A Misc/NEWS.d/next/Build/2022-01-19-04-36-15.bpo-46429.y0OtVL.rst M Makefile.pre.in M PCbuild/_freeze_module.vcxproj M PCbuild/pythoncore.vcxproj M Tools/scripts/deepfreeze.py M Tools/scripts/freeze_modules.py diff --git a/Makefile.pre.in b/Makefile.pre.in index 0b4d9a5240158..3ea019b7cac47 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -484,32 +484,7 @@ OBJECT_OBJS= \ Objects/unionobject.o \ Objects/weakrefobject.o -# DEEPFREEZE_OBJS is auto-generated by Tools/scripts/freeze_modules.py. 
-DEEPFREEZE_OBJS = \ - Python/deepfreeze/importlib._bootstrap.o \ - Python/deepfreeze/importlib._bootstrap_external.o \ - Python/deepfreeze/zipimport.o \ - Python/deepfreeze/abc.o \ - Python/deepfreeze/codecs.o \ - Python/deepfreeze/io.o \ - Python/deepfreeze/_collections_abc.o \ - Python/deepfreeze/_sitebuiltins.o \ - Python/deepfreeze/genericpath.o \ - Python/deepfreeze/ntpath.o \ - Python/deepfreeze/posixpath.o \ - Python/deepfreeze/os.o \ - Python/deepfreeze/site.o \ - Python/deepfreeze/stat.o \ - Python/deepfreeze/importlib.util.o \ - Python/deepfreeze/importlib.machinery.o \ - Python/deepfreeze/runpy.o \ - Python/deepfreeze/__hello__.o \ - Python/deepfreeze/__phello__.o \ - Python/deepfreeze/__phello__.ham.o \ - Python/deepfreeze/__phello__.ham.eggs.o \ - Python/deepfreeze/__phello__.spam.o \ - Python/deepfreeze/frozen_only.o -# End DEEPFREEZE_OBJS +DEEPFREEZE_OBJS = Python/deepfreeze/deepfreeze.o ########################################################################## # objects that get linked into the Python library @@ -984,86 +959,6 @@ _bootstrap_python: $(LIBRARY_OBJS_OMIT_FROZEN) Programs/_bootstrap_python.o Modu $(LINKCC) $(PY_LDFLAGS_NOLTO) -o $@ $(LIBRARY_OBJS_OMIT_FROZEN) \ Programs/_bootstrap_python.o Modules/getpath.o $(LIBS) $(MODLIBS) $(SYSLIBS) -############################################################################ -# Deepfreeze targets - -.PHONY: regen-deepfreeze -regen-deepfreeze: $(DEEPFREEZE_OBJS) - -DEEPFREEZE_DEPS=$(srcdir)/Tools/scripts/deepfreeze.py $(FREEZE_MODULE_DEPS) - -# BEGIN: deepfreeze modules - -Python/deepfreeze/importlib._bootstrap.c: Python/frozen_modules/importlib._bootstrap.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/importlib._bootstrap.h -m importlib._bootstrap -o Python/deepfreeze/importlib._bootstrap.c - -Python/deepfreeze/importlib._bootstrap_external.c: Python/frozen_modules/importlib._bootstrap_external.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/importlib._bootstrap_external.h -m importlib._bootstrap_external -o Python/deepfreeze/importlib._bootstrap_external.c - -Python/deepfreeze/zipimport.c: Python/frozen_modules/zipimport.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/zipimport.h -m zipimport -o Python/deepfreeze/zipimport.c - -Python/deepfreeze/abc.c: Python/frozen_modules/abc.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/abc.h -m abc -o Python/deepfreeze/abc.c - -Python/deepfreeze/codecs.c: Python/frozen_modules/codecs.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/codecs.h -m codecs -o Python/deepfreeze/codecs.c - -Python/deepfreeze/io.c: Python/frozen_modules/io.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/io.h -m io -o Python/deepfreeze/io.c - -Python/deepfreeze/_collections_abc.c: Python/frozen_modules/_collections_abc.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/_collections_abc.h -m _collections_abc -o Python/deepfreeze/_collections_abc.c - -Python/deepfreeze/_sitebuiltins.c: Python/frozen_modules/_sitebuiltins.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/_sitebuiltins.h -m _sitebuiltins -o Python/deepfreeze/_sitebuiltins.c - -Python/deepfreeze/genericpath.c: 
Python/frozen_modules/genericpath.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/genericpath.h -m genericpath -o Python/deepfreeze/genericpath.c - -Python/deepfreeze/ntpath.c: Python/frozen_modules/ntpath.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/ntpath.h -m ntpath -o Python/deepfreeze/ntpath.c - -Python/deepfreeze/posixpath.c: Python/frozen_modules/posixpath.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/posixpath.h -m posixpath -o Python/deepfreeze/posixpath.c - -Python/deepfreeze/os.c: Python/frozen_modules/os.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/os.h -m os -o Python/deepfreeze/os.c - -Python/deepfreeze/site.c: Python/frozen_modules/site.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/site.h -m site -o Python/deepfreeze/site.c - -Python/deepfreeze/stat.c: Python/frozen_modules/stat.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/stat.h -m stat -o Python/deepfreeze/stat.c - -Python/deepfreeze/importlib.util.c: Python/frozen_modules/importlib.util.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/importlib.util.h -m importlib.util -o Python/deepfreeze/importlib.util.c - -Python/deepfreeze/importlib.machinery.c: Python/frozen_modules/importlib.machinery.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/importlib.machinery.h -m importlib.machinery -o Python/deepfreeze/importlib.machinery.c - -Python/deepfreeze/runpy.c: Python/frozen_modules/runpy.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/runpy.h -m runpy -o Python/deepfreeze/runpy.c - -Python/deepfreeze/__hello__.c: Python/frozen_modules/__hello__.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/__hello__.h -m __hello__ -o Python/deepfreeze/__hello__.c - -Python/deepfreeze/__phello__.c: Python/frozen_modules/__phello__.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/__phello__.h -m __phello__ -o Python/deepfreeze/__phello__.c - -Python/deepfreeze/__phello__.ham.c: Python/frozen_modules/__phello__.ham.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/__phello__.ham.h -m __phello__.ham -o Python/deepfreeze/__phello__.ham.c - -Python/deepfreeze/__phello__.ham.eggs.c: Python/frozen_modules/__phello__.ham.eggs.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/__phello__.ham.eggs.h -m __phello__.ham.eggs -o Python/deepfreeze/__phello__.ham.eggs.c - -Python/deepfreeze/__phello__.spam.c: Python/frozen_modules/__phello__.spam.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/__phello__.spam.h -m __phello__.spam -o Python/deepfreeze/__phello__.spam.c - -Python/deepfreeze/frozen_only.c: Python/frozen_modules/frozen_only.h $(DEEPFREEZE_DEPS) - $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py Python/frozen_modules/frozen_only.h -m frozen_only -o Python/deepfreeze/frozen_only.c - -# END: deepfreeze modules 
############################################################################ # frozen modules (including importlib) @@ -1235,6 +1130,44 @@ regen-frozen: Tools/scripts/freeze_modules.py $(FROZEN_FILES_IN) $(PYTHON_FOR_REGEN) $(srcdir)/Tools/scripts/freeze_modules.py @echo "The Makefile was updated, you may need to re-run make." +############################################################################ +# Deepfreeze targets + +.PHONY: regen-deepfreeze +regen-deepfreeze: $(DEEPFREEZE_OBJS) + +DEEPFREEZE_DEPS=$(srcdir)/Tools/scripts/deepfreeze.py $(FREEZE_MODULE_DEPS) $(FROZEN_FILES_OUT) + +# BEGIN: deepfreeze modules +Python/deepfreeze/deepfreeze.c: $(DEEPFREEZE_DEPS) + $(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py \ + Python/frozen_modules/importlib._bootstrap.h:importlib._bootstrap \ + Python/frozen_modules/importlib._bootstrap_external.h:importlib._bootstrap_external \ + Python/frozen_modules/zipimport.h:zipimport \ + Python/frozen_modules/abc.h:abc \ + Python/frozen_modules/codecs.h:codecs \ + Python/frozen_modules/io.h:io \ + Python/frozen_modules/_collections_abc.h:_collections_abc \ + Python/frozen_modules/_sitebuiltins.h:_sitebuiltins \ + Python/frozen_modules/genericpath.h:genericpath \ + Python/frozen_modules/ntpath.h:ntpath \ + Python/frozen_modules/posixpath.h:posixpath \ + Python/frozen_modules/os.h:os \ + Python/frozen_modules/site.h:site \ + Python/frozen_modules/stat.h:stat \ + Python/frozen_modules/importlib.util.h:importlib.util \ + Python/frozen_modules/importlib.machinery.h:importlib.machinery \ + Python/frozen_modules/runpy.h:runpy \ + Python/frozen_modules/__hello__.h:__hello__ \ + Python/frozen_modules/__phello__.h:__phello__ \ + Python/frozen_modules/__phello__.ham.h:__phello__.ham \ + Python/frozen_modules/__phello__.ham.eggs.h:__phello__.ham.eggs \ + Python/frozen_modules/__phello__.spam.h:__phello__.spam \ + Python/frozen_modules/frozen_only.h:frozen_only \ + -o Python/deepfreeze/deepfreeze.c + +# END: deepfreeze modules + # We keep this renamed target around for folks with muscle memory. .PHONY: regen-importlib regen-importlib: regen-frozen diff --git a/Misc/NEWS.d/next/Build/2022-01-19-04-36-15.bpo-46429.y0OtVL.rst b/Misc/NEWS.d/next/Build/2022-01-19-04-36-15.bpo-46429.y0OtVL.rst new file mode 100644 index 0000000000000..c983d9637fc89 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-19-04-36-15.bpo-46429.y0OtVL.rst @@ -0,0 +1 @@ +Merge all deep-frozen files into one for space savings. Patch by Kumar Aditya. 
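For illustration only (not part of the change above): each positional argument of the merged deepfreeze invocation pairs a frozen-module header with its module name in ``file:modname`` form, which the script splits on the last colon with ``rsplit(':', 1)``. A minimal stand-alone sketch of that parsing, using paths taken from the Makefile rule::

    # Hypothetical sketch of the file:modname handling in the single-invocation
    # deepfreeze.py; the real script compiles each module into one deepfreeze.c.
    args = [
        "Python/frozen_modules/abc.h:abc",
        "Python/frozen_modules/codecs.h:codecs",
    ]
    for arg in args:
        file, modname = arg.rsplit(":", 1)  # split on the last colon only
        print(f"deep-freeze {modname} from {file} into deepfreeze.c")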
\ No newline at end of file diff --git a/PCbuild/_freeze_module.vcxproj b/PCbuild/_freeze_module.vcxproj index 59519cade2670..0a74f5850a1e8 100644 --- a/PCbuild/_freeze_module.vcxproj +++ b/PCbuild/_freeze_module.vcxproj @@ -241,162 +241,116 @@ importlib._bootstrap $(IntDir)importlib._bootstrap.g.h $(PySourcePath)Python\frozen_modules\importlib._bootstrap.h - $(IntDir)importlib._bootstrap.g.c - $(PySourcePath)Python\deepfreeze\df.importlib._bootstrap.c importlib._bootstrap_external $(IntDir)importlib._bootstrap_external.g.h $(PySourcePath)Python\frozen_modules\importlib._bootstrap_external.h - $(IntDir)importlib._bootstrap_external.g.c - $(PySourcePath)Python\deepfreeze\df.importlib._bootstrap_external.c zipimport $(IntDir)zipimport.g.h $(PySourcePath)Python\frozen_modules\zipimport.h - $(IntDir)zipimport.g.c - $(PySourcePath)Python\deepfreeze\df.zipimport.c abc $(IntDir)abc.g.h $(PySourcePath)Python\frozen_modules\abc.h - $(IntDir)abc.g.c - $(PySourcePath)Python\deepfreeze\df.abc.c codecs $(IntDir)codecs.g.h $(PySourcePath)Python\frozen_modules\codecs.h - $(IntDir)codecs.g.c - $(PySourcePath)Python\deepfreeze\df.codecs.c io $(IntDir)io.g.h $(PySourcePath)Python\frozen_modules\io.h - $(IntDir)io.g.c - $(PySourcePath)Python\deepfreeze\df.io.c _collections_abc $(IntDir)_collections_abc.g.h $(PySourcePath)Python\frozen_modules\_collections_abc.h - $(IntDir)_collections_abc.g.c - $(PySourcePath)Python\deepfreeze\df._collections_abc.c _sitebuiltins $(IntDir)_sitebuiltins.g.h $(PySourcePath)Python\frozen_modules\_sitebuiltins.h - $(IntDir)_sitebuiltins.g.c - $(PySourcePath)Python\deepfreeze\df._sitebuiltins.c genericpath $(IntDir)genericpath.g.h $(PySourcePath)Python\frozen_modules\genericpath.h - $(IntDir)genericpath.g.c - $(PySourcePath)Python\deepfreeze\df.genericpath.c ntpath $(IntDir)ntpath.g.h $(PySourcePath)Python\frozen_modules\ntpath.h - $(IntDir)ntpath.g.c - $(PySourcePath)Python\deepfreeze\df.ntpath.c posixpath $(IntDir)posixpath.g.h $(PySourcePath)Python\frozen_modules\posixpath.h - $(IntDir)posixpath.g.c - $(PySourcePath)Python\deepfreeze\df.posixpath.c os $(IntDir)os.g.h $(PySourcePath)Python\frozen_modules\os.h - $(IntDir)os.g.c - $(PySourcePath)Python\deepfreeze\df.os.c site $(IntDir)site.g.h $(PySourcePath)Python\frozen_modules\site.h - $(IntDir)site.g.c - $(PySourcePath)Python\deepfreeze\df.site.c stat $(IntDir)stat.g.h $(PySourcePath)Python\frozen_modules\stat.h - $(IntDir)stat.g.c - $(PySourcePath)Python\deepfreeze\df.stat.c importlib.util $(IntDir)importlib.util.g.h $(PySourcePath)Python\frozen_modules\importlib.util.h - $(IntDir)importlib.util.g.c - $(PySourcePath)Python\deepfreeze\df.importlib.util.c importlib.machinery $(IntDir)importlib.machinery.g.h $(PySourcePath)Python\frozen_modules\importlib.machinery.h - $(IntDir)importlib.machinery.g.c - $(PySourcePath)Python\deepfreeze\df.importlib.machinery.c runpy $(IntDir)runpy.g.h $(PySourcePath)Python\frozen_modules\runpy.h - $(IntDir)runpy.g.c - $(PySourcePath)Python\deepfreeze\df.runpy.c __hello__ $(IntDir)__hello__.g.h $(PySourcePath)Python\frozen_modules\__hello__.h - $(IntDir)__hello__.g.c - $(PySourcePath)Python\deepfreeze\df.__hello__.c __phello__ $(IntDir)__phello__.g.h $(PySourcePath)Python\frozen_modules\__phello__.h - $(IntDir)__phello__.g.c - $(PySourcePath)Python\deepfreeze\df.__phello__.c __phello__.ham $(IntDir)__phello__.ham.g.h $(PySourcePath)Python\frozen_modules\__phello__.ham.h - $(IntDir)__phello__.ham.g.c - $(PySourcePath)Python\deepfreeze\df.__phello__.ham.c __phello__.ham.eggs 
$(IntDir)__phello__.ham.eggs.g.h $(PySourcePath)Python\frozen_modules\__phello__.ham.eggs.h - $(IntDir)__phello__.ham.eggs.g.c - $(PySourcePath)Python\deepfreeze\df.__phello__.ham.eggs.c __phello__.spam $(IntDir)__phello__.spam.g.h $(PySourcePath)Python\frozen_modules\__phello__.spam.h - $(IntDir)__phello__.spam.g.c - $(PySourcePath)Python\deepfreeze\df.__phello__.spam.c frozen_only $(IntDir)frozen_only.g.h $(PySourcePath)Python\frozen_modules\frozen_only.h - $(IntDir)frozen_only.g.c - $(PySourcePath)Python\deepfreeze\df.frozen_only.c @@ -424,11 +378,11 @@ Condition="'@(_UpdatedGetPath)' != ''" Importance="high" /> - + - + Condition="!Exists(%(None.OutFile)) or (Exists(%(None.IntFile)) and '$([System.IO.File]::ReadAllText(%(None.OutFile)).Replace(` `, ` `))' != '$([System.IO.File]::ReadAllText(%(None.IntFile)).Replace(` `, ` `))')"> @@ -439,16 +393,33 @@ AfterTargets="_RebuildFrozen" DependsOnTargets="FindPythonForBuild" Condition="$(Configuration) != 'PGUpdate'"> - - - - - - - + + + diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj index 12eac8ebab510..fd1ab837c0775 100644 --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -528,29 +528,7 @@ - - - - - - - - - - - - - - - - - - - - - - - + diff --git a/Tools/scripts/deepfreeze.py b/Tools/scripts/deepfreeze.py index 49638b8400285..a7546a8c60751 100644 --- a/Tools/scripts/deepfreeze.py +++ b/Tools/scripts/deepfreeze.py @@ -10,10 +10,9 @@ import contextlib import os import re -import sys import time import types -from typing import Dict, FrozenSet, Tuple, TextIO +from typing import Dict, FrozenSet, TextIO, Tuple import umarshal @@ -104,10 +103,10 @@ def removesuffix(base: str, suffix: str) -> str: class Printer: - def __init__(self, file: TextIO): + def __init__(self, file: TextIO) -> None: self.level = 0 self.file = file - self.cache: Dict[Tuple[type, object], str] = {} + self.cache: Dict[tuple[type, object, str], str] = {} self.hits, self.misses = 0, 0 self.patchups: list[str] = [] self.write('#include "Python.h"') @@ -349,6 +348,15 @@ def generate_frozenset(self, name: str, fs: FrozenSet[object]) -> str: self.write("// TODO: The above tuple should be a frozenset") return ret + def generate_file(self, module: str, code: object)-> None: + module = module.replace(".", "_") + self.generate(f"{module}_toplevel", code) + with self.block(f"static void {module}_do_patchups(void)"): + for p in self.patchups: + self.write(p) + self.patchups.clear() + self.write(EPILOGUE.replace("%%NAME%%", module)) + def generate(self, name: str, obj: object) -> str: # Use repr() in the key to distinguish -0.0 from +0.0 key = (type(obj), obj, repr(obj)) @@ -357,7 +365,7 @@ def generate(self, name: str, obj: object) -> str: # print(f"Cache hit {key!r:.40}: {self.cache[key]!r:.40}") return self.cache[key] self.misses += 1 - if isinstance(obj, types.CodeType) or isinstance(obj, umarshal.Code): + if isinstance(obj, (types.CodeType, umarshal.Code)) : val = self.generate_code(name, obj) elif isinstance(obj, tuple): val = self.generate_tuple(name, obj) @@ -393,8 +401,8 @@ def generate(self, name: str, obj: object) -> str: PyObject * _Py_get_%%NAME%%_toplevel(void) { - do_patchups(); - return (PyObject *) &toplevel; + %%NAME%%_do_patchups(); + return (PyObject *) &%%NAME%%_toplevel; } """ @@ -419,29 +427,25 @@ def decode_frozen_data(source: str) -> types.CodeType: return umarshal.loads(data) -def generate(source: str, filename: str, modname: str, file: TextIO) -> None: - if is_frozen_header(source): - code = decode_frozen_data(source) - 
else: - code = compile(source, filename, "exec") - printer = Printer(file) - printer.generate("toplevel", code) - printer.write("") - with printer.block("static void do_patchups(void)"): - for p in printer.patchups: - printer.write(p) - here = os.path.dirname(__file__) - printer.write(EPILOGUE.replace("%%NAME%%", modname.replace(".", "_"))) +def generate(args: list[str], output: TextIO) -> None: + printer = Printer(output) + for arg in args: + file, modname = arg.rsplit(':', 1) + with open(file, "r", encoding="utf8") as fd: + source = fd.read() + if is_frozen_header(source): + code = decode_frozen_data(source) + else: + code = compile(fd.read(), f"", "exec") + printer.generate_file(modname, code) if verbose: print(f"Cache hits: {printer.hits}, misses: {printer.misses}") parser = argparse.ArgumentParser() -parser.add_argument("-m", "--module", help="Defaults to basename(file)") -parser.add_argument("-o", "--output", help="Defaults to MODULE.c") +parser.add_argument("-o", "--output", help="Defaults to deepfreeze.c", default="deepfreeze.c") parser.add_argument("-v", "--verbose", action="store_true", help="Print diagnostics") -parser.add_argument("file", help="Input file (required)") - +parser.add_argument('args', nargs="+", help="Input file and module name (required) in file:modname format") @contextlib.contextmanager def report_time(label: str): @@ -458,13 +462,10 @@ def main() -> None: global verbose args = parser.parse_args() verbose = args.verbose - with open(args.file, encoding="utf-8") as f: - source = f.read() - modname = args.module or removesuffix(os.path.basename(args.file), ".py") - output = args.output or modname + ".c" + output = args.output with open(output, "w", encoding="utf-8") as file: with report_time("generate"): - generate(source, f"", modname, file) + generate(args.args, file) if verbose: print(f"Wrote {os.path.getsize(output)} bytes to {output}") diff --git a/Tools/scripts/freeze_modules.py b/Tools/scripts/freeze_modules.py index cbe8bf1ce60cd..6d10758b5285c 100644 --- a/Tools/scripts/freeze_modules.py +++ b/Tools/scripts/freeze_modules.py @@ -575,16 +575,12 @@ def regen_frozen(modules): def regen_makefile(modules): pyfiles = [] frozenfiles = [] - deepfreezefiles = [] rules = [''] - deepfreezerules = [''] + deepfreezerules = ["Python/deepfreeze/deepfreeze.c: $(DEEPFREEZE_DEPS)", + "\t$(PYTHON_FOR_FREEZE) $(srcdir)/Tools/scripts/deepfreeze.py \\"] for src in _iter_sources(modules): frozen_header = relpath_for_posix_display(src.frozenfile, ROOT_DIR) - deepfreeze_header = relpath_for_posix_display(src.deepfreezefile, ROOT_DIR) frozenfiles.append(f'\t\t{frozen_header} \\') - cfile = deepfreeze_header[:-2] + ".c" - ofile = deepfreeze_header[:-2] + ".o" - deepfreezefiles.append(f"\t\t{ofile} \\") pyfile = relpath_for_posix_display(src.pyfile, ROOT_DIR) pyfiles.append(f'\t\t{pyfile} \\') @@ -603,15 +599,11 @@ def regen_makefile(modules): f'\t{freeze}', '', ]) - deepfreezerules.append(f'{cfile}: {frozen_header} $(DEEPFREEZE_DEPS)') - deepfreezerules.append( - f"\t$(PYTHON_FOR_FREEZE) " - f"$(srcdir)/Tools/scripts/deepfreeze.py " - f"{frozen_header} -m {src.frozenid} -o {cfile}") - deepfreezerules.append('') + deepfreezerules.append(f"\t{frozen_header}:{src.frozenid} \\") + deepfreezerules.append('\t-o Python/deepfreeze/deepfreeze.c') + deepfreezerules.append('') pyfiles[-1] = pyfiles[-1].rstrip(" \\") frozenfiles[-1] = frozenfiles[-1].rstrip(" \\") - deepfreezefiles[-1] = deepfreezefiles[-1].rstrip(" \\") print(f'# Updating {os.path.relpath(MAKEFILE)}') with 
updating_file_with_tmpfile(MAKEFILE) as (infile, outfile): @@ -630,13 +622,6 @@ def regen_makefile(modules): frozenfiles, MAKEFILE, ) - lines = replace_block( - lines, - "DEEPFREEZE_OBJS =", - "# End DEEPFREEZE_OBJS", - deepfreezefiles, - MAKEFILE, - ) lines = replace_block( lines, "# BEGIN: freezing modules", @@ -658,26 +643,24 @@ def regen_pcbuild(modules): projlines = [] filterlines = [] corelines = [] + deepfreezerules = ['\t') projlines.append(f' {src.frozenid}') projlines.append(f' $(IntDir){intfile}') projlines.append(f' $(PySourcePath){header}') - projlines.append(f' $(IntDir){deepintfile}') - projlines.append(f' $(PySourcePath){deepoutfile}') projlines.append(f' ') filterlines.append(f' ') filterlines.append(' Python Files') filterlines.append(' ') + deepfreezerules.append(f'\t\t "$(PySourcePath){header}:{src.frozenid}" ^') + deepfreezerules.append('\t\t "-o" "$(PySourcePath)Python\\deepfreeze\\deepfreeze.c"\'/>' ) - corelines.append(f' ') + corelines.append(f' ') print(f'# Updating {os.path.relpath(PCBUILD_PROJECT)}') with updating_file_with_tmpfile(PCBUILD_PROJECT) as (infile, outfile): @@ -690,6 +673,16 @@ def regen_pcbuild(modules): PCBUILD_PROJECT, ) outfile.writelines(lines) + with updating_file_with_tmpfile(PCBUILD_PROJECT) as (infile, outfile): + lines = infile.readlines() + lines = replace_block( + lines, + '', + '', + deepfreezerules, + PCBUILD_PROJECT, + ) + outfile.writelines(lines) print(f'# Updating {os.path.relpath(PCBUILD_FILTERS)}') with updating_file_with_tmpfile(PCBUILD_FILTERS) as (infile, outfile): lines = infile.readlines() From webhook-mailer at python.org Thu Jan 20 12:56:41 2022 From: webhook-mailer at python.org (tiran) Date: Thu, 20 Jan 2022 17:56:41 -0000 Subject: [Python-checkins] bpo-40280: Misc fixes for wasm32-emscripten (GH-30722) Message-ID: https://github.com/python/cpython/commit/c02e860ee79f29905be6fca997c96bb1a404bb32 commit: c02e860ee79f29905be6fca997c96bb1a404bb32 branch: main author: Christian Heimes committer: tiran date: 2022-01-20T18:56:33+01:00 summary: bpo-40280: Misc fixes for wasm32-emscripten (GH-30722) files: M Lib/test/test___all__.py M Lib/test/test_capi.py M Lib/test/test_compileall.py M Lib/test/test_imp.py M Lib/test/test_pty.py M Lib/test/test_tracemalloc.py M Tools/wasm/config.site-wasm32-emscripten M configure M configure.ac diff --git a/Lib/test/test___all__.py b/Lib/test/test___all__.py index 15f42d2d114a6..81293e15f8163 100644 --- a/Lib/test/test___all__.py +++ b/Lib/test/test___all__.py @@ -3,6 +3,12 @@ from test.support import warnings_helper import os import sys +import types + +try: + import _multiprocessing +except ModuleNotFoundError: + _multiprocessing = None class NoAll(RuntimeError): @@ -14,6 +20,17 @@ class FailedImport(RuntimeError): class AllTest(unittest.TestCase): + def setUp(self): + # concurrent.futures uses a __getattr__ hook. Its __all__ triggers + # import of a submodule, which fails when _multiprocessing is not + # available. 
+ if _multiprocessing is None: + sys.modules["_multiprocessing"] = types.ModuleType("_multiprocessing") + + def tearDown(self): + if _multiprocessing is None: + sys.modules.pop("_multiprocessing") + def check_all(self, modname): names = {} with warnings_helper.check_warnings( diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 7ada8406a3584..0957f3253d7a6 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -26,6 +26,10 @@ import _posixsubprocess except ImportError: _posixsubprocess = None +try: + import _testmultiphase +except ImportError: + _testmultiphase = None # Skip this test if the _testcapi module isn't available. _testcapi = import_helper.import_module('_testcapi') @@ -798,6 +802,7 @@ def test_mutate_exception(self): self.assertFalse(hasattr(binascii.Error, "foobar")) + @unittest.skipIf(_testmultiphase is None, "test requires _testmultiphase module") def test_module_state_shared_in_global(self): """ bpo-44050: Extension module state should be shared between interpreters @@ -991,6 +996,7 @@ class PyMemDefaultTests(PyMemDebugTests): PYTHONMALLOC = '' + at unittest.skipIf(_testmultiphase is None, "test requires _testmultiphase module") class Test_ModuleStateAccess(unittest.TestCase): """Test access to module start (PEP 573)""" diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py index 33f0c939325f5..e207cf8f1793b 100644 --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -15,14 +15,14 @@ import unittest from unittest import mock, skipUnless -from concurrent.futures import ProcessPoolExecutor try: # compileall relies on ProcessPoolExecutor if ProcessPoolExecutor exists # and it can function. + from concurrent.futures import ProcessPoolExecutor from concurrent.futures.process import _check_system_limits _check_system_limits() _have_multiprocessing = True -except NotImplementedError: +except (NotImplementedError, ModuleNotFoundError): _have_multiprocessing = False from test import support diff --git a/Lib/test/test_imp.py b/Lib/test/test_imp.py index 1a21025fe6eaf..35e9a2a186552 100644 --- a/Lib/test/test_imp.py +++ b/Lib/test/test_imp.py @@ -23,7 +23,7 @@ def requires_load_dynamic(meth): """Decorator to skip a test if not running under CPython or lacking imp.load_dynamic().""" meth = support.cpython_only(meth) - return unittest.skipIf(not hasattr(imp, 'load_dynamic'), + return unittest.skipIf(getattr(imp, 'load_dynamic', None) is None, 'imp.load_dynamic() required')(meth) diff --git a/Lib/test/test_pty.py b/Lib/test/test_pty.py index 0c178127571b0..0781cde1e1582 100644 --- a/Lib/test/test_pty.py +++ b/Lib/test/test_pty.py @@ -1,8 +1,9 @@ from test.support import verbose, reap_children from test.support.import_helper import import_module -# Skip these tests if termios is not available +# Skip these tests if termios or fcntl are not available import_module('termios') +import_module("fcntl") import errno import os diff --git a/Lib/test/test_tracemalloc.py b/Lib/test/test_tracemalloc.py index 82be98dfd8f5a..d2a5ede61e3ff 100644 --- a/Lib/test/test_tracemalloc.py +++ b/Lib/test/test_tracemalloc.py @@ -346,7 +346,7 @@ def fork_child(self): # everything is fine return 0 - @unittest.skipUnless(hasattr(os, 'fork'), 'need os.fork()') + @support.requires_fork() def test_fork(self): # check that tracemalloc is still working after fork pid = os.fork() diff --git a/Tools/wasm/config.site-wasm32-emscripten b/Tools/wasm/config.site-wasm32-emscripten index c15e4fc6b64b1..413506bbc9abd 100644 --- a/Tools/wasm/config.site-wasm32-emscripten 
+++ b/Tools/wasm/config.site-wasm32-emscripten @@ -58,12 +58,14 @@ ac_cv_func_fchmodat=no ac_cv_func_dup3=no # Syscalls not implemented in emscripten +# [Errno 52] Function not implemented ac_cv_func_preadv2=no ac_cv_func_preadv=no ac_cv_func_pwritev2=no ac_cv_func_pwritev=no ac_cv_func_pipe2=no ac_cv_func_nice=no +ac_cv_func_setitimer=no # Syscalls that resulted in a segfault ac_cv_func_utimensat=no diff --git a/configure b/configure index 7236e0930e15b..402e626b6992d 100755 --- a/configure +++ b/configure @@ -21323,7 +21323,7 @@ case $ac_sys_system/$ac_sys_emscripten_target in #( ;; #( Emscripten/node) : - py_stdlib_not_available="_ctypes _curses _curses_panel _dbm _gdbm _scproxy _tkinter nis ossaudiodev spwd syslog" + py_stdlib_not_available="_ctypes _curses _curses_panel _dbm _gdbm _scproxy _tkinter _xxsubinterpreters grp nis ossaudiodev spwd syslog" ;; #( *) : py_stdlib_not_available="_scproxy" diff --git a/configure.ac b/configure.ac index aea12128c1217..9c9a338576736 100644 --- a/configure.ac +++ b/configure.ac @@ -6384,6 +6384,8 @@ AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], _gdbm _scproxy _tkinter + _xxsubinterpreters + grp nis ossaudiodev spwd From webhook-mailer at python.org Thu Jan 20 14:20:08 2022 From: webhook-mailer at python.org (zware) Date: Thu, 20 Jan 2022 19:20:08 -0000 Subject: [Python-checkins] bpo-46316: optimize `pathlib.Path.iterdir()` (GH-30501) Message-ID: https://github.com/python/cpython/commit/a1c88414926610a3527398a478c3e63c531dc742 commit: a1c88414926610a3527398a478c3e63c531dc742 branch: main author: Barney Gale committer: zware date: 2022-01-20T13:20:00-06:00 summary: bpo-46316: optimize `pathlib.Path.iterdir()` (GH-30501) `os.listdir()` doesn't return entries for `.` or `..`, so we don't need to check for them here. files: A Misc/NEWS.d/next/Library/2022-01-09-15-04-56.bpo-46316.AMTyd0.rst M Lib/pathlib.py diff --git a/Lib/pathlib.py b/Lib/pathlib.py index f1a33178e2958..04b321b9ccf16 100644 --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -1013,9 +1013,6 @@ def iterdir(self): result for the special paths '.' and '..'. """ for name in self._accessor.listdir(self): - if name in {'.', '..'}: - # Yielding a path object for these makes little sense - continue yield self._make_child_relpath(name) def glob(self, pattern): diff --git a/Misc/NEWS.d/next/Library/2022-01-09-15-04-56.bpo-46316.AMTyd0.rst b/Misc/NEWS.d/next/Library/2022-01-09-15-04-56.bpo-46316.AMTyd0.rst new file mode 100644 index 0000000000000..09acb77855f15 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-09-15-04-56.bpo-46316.AMTyd0.rst @@ -0,0 +1 @@ +Optimize :meth:`pathlib.Path.iterdir` by removing an unnecessary check for special entries. 
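As context for the pathlib change above: os.listdir() already omits the special entries '.' and '..', so Path.iterdir() never needed to filter them out. A small illustrative session (not part of the commit; the temporary directory is only for demonstration)::

    >>> import os, tempfile
    >>> from pathlib import Path
    >>> d = tempfile.mkdtemp()
    >>> open(os.path.join(d, "spam.txt"), "w").close()
    >>> os.listdir(d)                          # no '.' or '..' entries to skip
    ['spam.txt']
    >>> [p.name for p in Path(d).iterdir()]    # same result, one check fewer
    ['spam.txt']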
From webhook-mailer at python.org Thu Jan 20 15:07:47 2022 From: webhook-mailer at python.org (isidentical) Date: Thu, 20 Jan 2022 20:07:47 -0000 Subject: [Python-checkins] bpo-46441: Add a boilerplate to test syntax errors in interactive mode (GH-30720) Message-ID: https://github.com/python/cpython/commit/30fb6d073d9ca00dff8e4155c523cdfa63abab6b commit: 30fb6d073d9ca00dff8e4155c523cdfa63abab6b branch: main author: Batuhan Taskaya committer: isidentical date: 2022-01-20T23:07:43+03:00 summary: bpo-46441: Add a boilerplate to test syntax errors in interactive mode (GH-30720) files: M Lib/test/test_repl.py diff --git a/Lib/test/test_repl.py b/Lib/test/test_repl.py index 03bf8d8b5483f..a8d04a425e278 100644 --- a/Lib/test/test_repl.py +++ b/Lib/test/test_repl.py @@ -36,6 +36,21 @@ def spawn_repl(*args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kw): stdout=stdout, stderr=stderr, **kw) +def run_on_interactive_mode(source): + """Spawn a new Python interpreter, pass the given + input source code from the stdin and return the + result back. If the interpreter exits non-zero, it + raises a ValueError.""" + + process = spawn_repl() + process.stdin.write(source) + output = kill_python(process) + + if process.returncode != 0: + raise ValueError("Process didn't exit properly.") + return output + + class TestInteractiveInterpreter(unittest.TestCase): @cpython_only @@ -108,5 +123,23 @@ def test_close_stdin(self): self.assertIn('before close', output) +class TestInteractiveModeSyntaxErrors(unittest.TestCase): + + def test_interactive_syntax_error_correct_line(self): + output = run_on_interactive_mode(dedent("""\ + def f(): + print(0) + return yield 42 + """)) + + traceback_lines = output.splitlines()[-4:-1] + expected_lines = [ + ' return yield 42', + ' ^^^^^', + 'SyntaxError: invalid syntax' + ] + self.assertEqual(traceback_lines, expected_lines) + + if __name__ == "__main__": unittest.main() From webhook-mailer at python.org Thu Jan 20 16:06:56 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 20 Jan 2022 21:06:56 -0000 Subject: [Python-checkins] [3.10] Mark all clinic headers as generated (GH-30679). (GH-30726) Message-ID: https://github.com/python/cpython/commit/876ade1ae3a805b546a211fd7303253c10395569 commit: 876ade1ae3a805b546a211fd7303253c10395569 branch: 3.10 author: Erlend Egeberg Aasland committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-20T13:06:47-08:00 summary: [3.10] Mark all clinic headers as generated (GH-30679). 
(GH-30726) (cherry picked from commit 71734d0b9ca584bcbdcb2fb44ae16bb2fbfcaf6e) Co-authored-by: Erlend Egeberg Aasland files: M .gitattributes diff --git a/.gitattributes b/.gitattributes index c66e765266382..be369d2a5c63c 100644 --- a/.gitattributes +++ b/.gitattributes @@ -40,11 +40,8 @@ PCbuild/readme.txt text eol=crlf PC/readme.txt text eol=crlf # Generated files -# https://github.com/github/linguist#generated-code -Modules/clinic/*.h linguist-generated=true -Objects/clinic/*.h linguist-generated=true -PC/clinic/*.h linguist-generated=true -Python/clinic/*.h linguist-generated=true +# https://github.com/github/linguist/blob/master/docs/overrides.md +**/clinic/*.h linguist-generated=true Python/importlib.h linguist-generated=true Python/importlib_external.h linguist-generated=true Include/internal/pycore_ast.h linguist-generated=true From webhook-mailer at python.org Thu Jan 20 16:07:23 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 20 Jan 2022 21:07:23 -0000 Subject: [Python-checkins] [3.9] Mark all clinic headers as generated (GH-30679). (GH-30728) Message-ID: https://github.com/python/cpython/commit/e8e71c481a4ac05f35b423d661b38e67d035912a commit: e8e71c481a4ac05f35b423d661b38e67d035912a branch: 3.9 author: Erlend Egeberg Aasland committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-20T13:07:19-08:00 summary: [3.9] Mark all clinic headers as generated (GH-30679). (GH-30728) (cherry picked from commit 71734d0b9ca584bcbdcb2fb44ae16bb2fbfcaf6e) Co-authored-by: Erlend Egeberg Aasland files: M .gitattributes diff --git a/.gitattributes b/.gitattributes index bec16a08152eb..f73a2e6d90828 100644 --- a/.gitattributes +++ b/.gitattributes @@ -40,13 +40,10 @@ PCbuild/readme.txt text eol=crlf PC/readme.txt text eol=crlf # Generated files -# https://github.com/github/linguist#generated-code +# https://github.com/github/linguist/blob/master/docs/overrides.md Include/graminit.h linguist-generated=true Python/graminit.h linguist-generated=true -Modules/clinic/*.h linguist-generated=true -Objects/clinic/*.h linguist-generated=true -PC/clinic/*.h linguist-generated=true -Python/clinic/*.h linguist-generated=true +**/clinic/*.h linguist-generated=true Python/importlib.h linguist-generated=true Python/importlib_external.h linguist-generated=true Include/Python-ast.h linguist-generated=true From webhook-mailer at python.org Thu Jan 20 17:48:56 2022 From: webhook-mailer at python.org (taleinat) Date: Thu, 20 Jan 2022 22:48:56 -0000 Subject: [Python-checkins] bpo-46080: fix argparse help generation exception in edge case (GH-30111) Message-ID: https://github.com/python/cpython/commit/9e87c0e03fa501fb90008547983ce4c1dcaaf90c commit: 9e87c0e03fa501fb90008547983ce4c1dcaaf90c branch: main author: Felix Fontein committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-21T00:48:48+02:00 summary: bpo-46080: fix argparse help generation exception in edge case (GH-30111) Fix an uncaught exception during help text generation when argparse.BooleanOptionalAction is used with default=argparse.SUPPRESS and help is specified. 
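The failing case matches the test added below: a BooleanOptionalAction argument whose default is argparse.SUPPRESS and which also has a help string. A minimal reproducer, sketched from that test (the option name is arbitrary)::

    import argparse

    parser = argparse.ArgumentParser(prog='PROG')
    parser.add_argument('--bazz', action=argparse.BooleanOptionalAction,
                        default=argparse.SUPPRESS, help='Bazz!')
    parser.print_help()   # before this fix, help expansion raised here;
                          # with the fix no "(default: ...)" suffix is added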
files: A Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst M Lib/argparse.py M Lib/test/test_argparse.py diff --git a/Lib/argparse.py b/Lib/argparse.py index 1529d9e768737..9344dab3e60d5 100644 --- a/Lib/argparse.py +++ b/Lib/argparse.py @@ -881,7 +881,7 @@ def __init__(self, option_string = '--no-' + option_string[2:] _option_strings.append(option_string) - if help is not None and default is not None: + if help is not None and default is not None and default is not SUPPRESS: help += " (default: %(default)s)" super().__init__( diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py index afcb88ff5ce0f..df6da928c9beb 100644 --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -3626,6 +3626,8 @@ class TestHelpUsage(HelpTestCase): Sig('--bar', help='Whether to bar', default=True, action=argparse.BooleanOptionalAction), Sig('-f', '--foobar', '--barfoo', action=argparse.BooleanOptionalAction), + Sig('--bazz', action=argparse.BooleanOptionalAction, + default=argparse.SUPPRESS, help='Bazz!'), ] argument_group_signatures = [ (Sig('group'), [ @@ -3638,8 +3640,8 @@ class TestHelpUsage(HelpTestCase): usage = '''\ usage: PROG [-h] [-w W [W ...]] [-x [X ...]] [--foo | --no-foo] [--bar | --no-bar] - [-f | --foobar | --no-foobar | --barfoo | --no-barfoo] [-y [Y]] - [-z Z Z Z] + [-f | --foobar | --no-foobar | --barfoo | --no-barfoo] + [--bazz | --no-bazz] [-y [Y]] [-z Z Z Z] a b b [c] [d ...] e [e ...] ''' help = usage + '''\ @@ -3656,6 +3658,7 @@ class TestHelpUsage(HelpTestCase): --foo, --no-foo Whether to foo --bar, --no-bar Whether to bar (default: True) -f, --foobar, --no-foobar, --barfoo, --no-barfoo + --bazz, --no-bazz Bazz! group: -y [Y] y diff --git a/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst b/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst new file mode 100644 index 0000000000000..e42d84e31e759 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst @@ -0,0 +1,3 @@ +Fix exception in argparse help text generation if a +:class:`argparse.BooleanOptionalAction` argument's default is +``argparse.SUPPRESS`` and it has ``help`` specified. Patch by Felix Fontein. 
\ No newline at end of file From webhook-mailer at python.org Thu Jan 20 18:08:53 2022 From: webhook-mailer at python.org (taleinat) Date: Thu, 20 Jan 2022 23:08:53 -0000 Subject: [Python-checkins] bpo-41857: mention timeout argument units in select.poll() and select.depoll() doc-strings (GH-22406) Message-ID: https://github.com/python/cpython/commit/27df7566bc19699b967e0e30d7808637b90141f6 commit: 27df7566bc19699b967e0e30d7808637b90141f6 branch: main author: Zane Bitter committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-21T01:08:44+02:00 summary: bpo-41857: mention timeout argument units in select.poll() and select.depoll() doc-strings (GH-22406) files: M Modules/clinic/selectmodule.c.h M Modules/selectmodule.c diff --git a/Modules/clinic/selectmodule.c.h b/Modules/clinic/selectmodule.c.h index a1695f2390a9c..ca06dae32964f 100644 --- a/Modules/clinic/selectmodule.c.h +++ b/Modules/clinic/selectmodule.c.h @@ -193,6 +193,10 @@ PyDoc_STRVAR(select_poll_poll__doc__, "\n" "Polls the set of registered file descriptors.\n" "\n" +" timeout\n" +" The maximum time to wait in milliseconds, or else None (or a negative\n" +" value) to wait indefinitely.\n" +"\n" "Returns a list containing any descriptors that have events or errors to\n" "report, as a list of (fd, event) 2-tuples."); @@ -363,6 +367,10 @@ PyDoc_STRVAR(select_devpoll_poll__doc__, "\n" "Polls the set of registered file descriptors.\n" "\n" +" timeout\n" +" The maximum time to wait in milliseconds, or else None (or a negative\n" +" value) to wait indefinitely.\n" +"\n" "Returns a list containing any descriptors that have events or errors to\n" "report, as a list of (fd, event) 2-tuples."); @@ -1181,4 +1189,4 @@ select_kqueue_control(kqueue_queue_Object *self, PyObject *const *args, Py_ssize #ifndef SELECT_KQUEUE_CONTROL_METHODDEF #define SELECT_KQUEUE_CONTROL_METHODDEF #endif /* !defined(SELECT_KQUEUE_CONTROL_METHODDEF) */ -/*[clinic end generated code: output=ed1e5a658863244c input=a9049054013a1b77]*/ +/*[clinic end generated code: output=09ff9484c1b092fb input=a9049054013a1b77]*/ diff --git a/Modules/selectmodule.c b/Modules/selectmodule.c index 367e299f83ae8..1a5a63249c7b8 100644 --- a/Modules/selectmodule.c +++ b/Modules/selectmodule.c @@ -570,6 +570,8 @@ select_poll_unregister_impl(pollObject *self, int fd) select.poll.poll timeout as timeout_obj: object = None + The maximum time to wait in milliseconds, or else None (or a negative + value) to wait indefinitely. / Polls the set of registered file descriptors. @@ -580,7 +582,7 @@ report, as a list of (fd, event) 2-tuples. static PyObject * select_poll_poll_impl(pollObject *self, PyObject *timeout_obj) -/*[clinic end generated code: output=876e837d193ed7e4 input=7a446ed45189e894]*/ +/*[clinic end generated code: output=876e837d193ed7e4 input=c2f6953ec45e5622]*/ { PyObject *result_list = NULL; int poll_result, i, j; @@ -894,6 +896,8 @@ select_devpoll_unregister_impl(devpollObject *self, int fd) /*[clinic input] select.devpoll.poll timeout as timeout_obj: object = None + The maximum time to wait in milliseconds, or else None (or a negative + value) to wait indefinitely. / Polls the set of registered file descriptors. @@ -904,7 +908,7 @@ report, as a list of (fd, event) 2-tuples. 
static PyObject * select_devpoll_poll_impl(devpollObject *self, PyObject *timeout_obj) -/*[clinic end generated code: output=2654e5457cca0b3c input=fd0db698d84f0333]*/ +/*[clinic end generated code: output=2654e5457cca0b3c input=3c3f0a355ec2bedb]*/ { struct dvpoll dvp; PyObject *result_list = NULL; From webhook-mailer at python.org Thu Jan 20 18:13:26 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 20 Jan 2022 23:13:26 -0000 Subject: [Python-checkins] bpo-46080: fix argparse help generation exception in edge case (GH-30111) Message-ID: https://github.com/python/cpython/commit/e5edc8d737a45d9d8b9b93b8be52f85d79d0f417 commit: e5edc8d737a45d9d8b9b93b8be52f85d79d0f417 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-20T15:13:17-08:00 summary: bpo-46080: fix argparse help generation exception in edge case (GH-30111) Fix an uncaught exception during help text generation when argparse.BooleanOptionalAction is used with default=argparse.SUPPRESS and help is specified. (cherry picked from commit 9e87c0e03fa501fb90008547983ce4c1dcaaf90c) Co-authored-by: Felix Fontein files: A Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst M Lib/argparse.py M Lib/test/test_argparse.py diff --git a/Lib/argparse.py b/Lib/argparse.py index e177e4fe034d3..b71a6703c3982 100644 --- a/Lib/argparse.py +++ b/Lib/argparse.py @@ -878,7 +878,7 @@ def __init__(self, option_string = '--no-' + option_string[2:] _option_strings.append(option_string) - if help is not None and default is not None: + if help is not None and default is not None and default is not SUPPRESS: help += " (default: %(default)s)" super().__init__( diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py index 37a73e0686377..9d66ace5474a1 100644 --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -3597,6 +3597,8 @@ class TestHelpUsage(HelpTestCase): Sig('--bar', help='Whether to bar', default=True, action=argparse.BooleanOptionalAction), Sig('-f', '--foobar', '--barfoo', action=argparse.BooleanOptionalAction), + Sig('--bazz', action=argparse.BooleanOptionalAction, + default=argparse.SUPPRESS, help='Bazz!'), ] argument_group_signatures = [ (Sig('group'), [ @@ -3609,8 +3611,8 @@ class TestHelpUsage(HelpTestCase): usage = '''\ usage: PROG [-h] [-w W [W ...]] [-x [X ...]] [--foo | --no-foo] [--bar | --no-bar] - [-f | --foobar | --no-foobar | --barfoo | --no-barfoo] [-y [Y]] - [-z Z Z Z] + [-f | --foobar | --no-foobar | --barfoo | --no-barfoo] + [--bazz | --no-bazz] [-y [Y]] [-z Z Z Z] a b b [c] [d ...] e [e ...] ''' help = usage + '''\ @@ -3627,6 +3629,7 @@ class TestHelpUsage(HelpTestCase): --foo, --no-foo Whether to foo --bar, --no-bar Whether to bar (default: True) -f, --foobar, --no-foobar, --barfoo, --no-barfoo + --bazz, --no-bazz Bazz! group: -y [Y] y diff --git a/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst b/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst new file mode 100644 index 0000000000000..e42d84e31e759 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst @@ -0,0 +1,3 @@ +Fix exception in argparse help text generation if a +:class:`argparse.BooleanOptionalAction` argument's default is +``argparse.SUPPRESS`` and it has ``help`` specified. Patch by Felix Fontein. 
\ No newline at end of file From webhook-mailer at python.org Thu Jan 20 18:22:58 2022 From: webhook-mailer at python.org (miss-islington) Date: Thu, 20 Jan 2022 23:22:58 -0000 Subject: [Python-checkins] bpo-46080: fix argparse help generation exception in edge case (GH-30111) Message-ID: https://github.com/python/cpython/commit/c6691a7ccbd027298ea2486014b55db037fffc9f commit: c6691a7ccbd027298ea2486014b55db037fffc9f branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-20T15:22:50-08:00 summary: bpo-46080: fix argparse help generation exception in edge case (GH-30111) Fix an uncaught exception during help text generation when argparse.BooleanOptionalAction is used with default=argparse.SUPPRESS and help is specified. (cherry picked from commit 9e87c0e03fa501fb90008547983ce4c1dcaaf90c) Co-authored-by: Felix Fontein files: A Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst M Lib/argparse.py M Lib/test/test_argparse.py diff --git a/Lib/argparse.py b/Lib/argparse.py index 40569437ac62b..fbdfd5172fb35 100644 --- a/Lib/argparse.py +++ b/Lib/argparse.py @@ -878,7 +878,7 @@ def __init__(self, option_string = '--no-' + option_string[2:] _option_strings.append(option_string) - if help is not None and default is not None: + if help is not None and default is not None and default is not SUPPRESS: help += " (default: %(default)s)" super().__init__( diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py index cc5e8491b4c49..5c86583ce10a4 100644 --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -3597,6 +3597,8 @@ class TestHelpUsage(HelpTestCase): Sig('--bar', help='Whether to bar', default=True, action=argparse.BooleanOptionalAction), Sig('-f', '--foobar', '--barfoo', action=argparse.BooleanOptionalAction), + Sig('--bazz', action=argparse.BooleanOptionalAction, + default=argparse.SUPPRESS, help='Bazz!'), ] argument_group_signatures = [ (Sig('group'), [ @@ -3609,8 +3611,8 @@ class TestHelpUsage(HelpTestCase): usage = '''\ usage: PROG [-h] [-w W [W ...]] [-x [X ...]] [--foo | --no-foo] [--bar | --no-bar] - [-f | --foobar | --no-foobar | --barfoo | --no-barfoo] [-y [Y]] - [-z Z Z Z] + [-f | --foobar | --no-foobar | --barfoo | --no-barfoo] + [--bazz | --no-bazz] [-y [Y]] [-z Z Z Z] a b b [c] [d ...] e [e ...] ''' help = usage + '''\ @@ -3627,6 +3629,7 @@ class TestHelpUsage(HelpTestCase): --foo, --no-foo Whether to foo --bar, --no-bar Whether to bar (default: True) -f, --foobar, --no-foobar, --barfoo, --no-barfoo + --bazz, --no-bazz Bazz! group: -y [Y] y diff --git a/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst b/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst new file mode 100644 index 0000000000000..e42d84e31e759 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-15-06-29-00.bpo-46080.AuQpLt.rst @@ -0,0 +1,3 @@ +Fix exception in argparse help text generation if a +:class:`argparse.BooleanOptionalAction` argument's default is +``argparse.SUPPRESS`` and it has ``help`` specified. Patch by Felix Fontein. 
\ No newline at end of file From webhook-mailer at python.org Thu Jan 20 19:42:36 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 00:42:36 -0000 Subject: [Python-checkins] bpo-46417: Finalize structseq types at exit (GH-30645) Message-ID: https://github.com/python/cpython/commit/e9e3eab0b868c7d0b48e472705024240d5c39d5c commit: e9e3eab0b868c7d0b48e472705024240d5c39d5c branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T01:42:25+01:00 summary: bpo-46417: Finalize structseq types at exit (GH-30645) Add _PyStructSequence_FiniType() and _PyStaticType_Dealloc() functions to finalize a structseq static type in Py_Finalize(). Currrently, these functions do nothing if Python is built in release mode. Clear static types: * AsyncGenHooksType: sys.set_asyncgen_hooks() * FlagsType: sys.flags * FloatInfoType: sys.float_info * Hash_InfoType: sys.hash_info * Int_InfoType: sys.int_info * ThreadInfoType: sys.thread_info * UnraisableHookArgsType: sys.unraisablehook * VersionInfoType: sys.version * WindowsVersionType: sys.getwindowsversion() files: A Lib/test/_test_embed_structseq.py M Include/internal/pycore_floatobject.h M Include/internal/pycore_long.h M Include/internal/pycore_pyerrors.h M Include/internal/pycore_pylifecycle.h M Include/internal/pycore_typeobject.h M Include/structseq.h M Lib/test/test_embed.py M Objects/floatobject.c M Objects/longobject.c M Objects/structseq.c M Objects/typeobject.c M Programs/_testembed.c M Python/errors.c M Python/pylifecycle.c M Python/sysmodule.c M Python/thread.c diff --git a/Include/internal/pycore_floatobject.h b/Include/internal/pycore_floatobject.h index be6045587de1c..891e422f59472 100644 --- a/Include/internal/pycore_floatobject.h +++ b/Include/internal/pycore_floatobject.h @@ -14,6 +14,7 @@ extern "C" { extern void _PyFloat_InitState(PyInterpreterState *); extern PyStatus _PyFloat_InitTypes(PyInterpreterState *); extern void _PyFloat_Fini(PyInterpreterState *); +extern void _PyFloat_FiniType(PyInterpreterState *); /* other API */ diff --git a/Include/internal/pycore_long.h b/Include/internal/pycore_long.h index 4d1a0d0424969..436bf08468599 100644 --- a/Include/internal/pycore_long.h +++ b/Include/internal/pycore_long.h @@ -15,6 +15,7 @@ extern "C" { /* runtime lifecycle */ extern PyStatus _PyLong_InitTypes(PyInterpreterState *); +extern void _PyLong_FiniTypes(PyInterpreterState *interp); /* other API */ diff --git a/Include/internal/pycore_pyerrors.h b/Include/internal/pycore_pyerrors.h index f375337a405bb..e3c445ba5d926 100644 --- a/Include/internal/pycore_pyerrors.h +++ b/Include/internal/pycore_pyerrors.h @@ -12,6 +12,7 @@ extern "C" { /* runtime lifecycle */ extern PyStatus _PyErr_InitTypes(PyInterpreterState *); +extern void _PyErr_FiniTypes(PyInterpreterState *); /* other API */ diff --git a/Include/internal/pycore_pylifecycle.h b/Include/internal/pycore_pylifecycle.h index 766e889f237b9..dfa8fd6bd0d28 100644 --- a/Include/internal/pycore_pylifecycle.h +++ b/Include/internal/pycore_pylifecycle.h @@ -58,6 +58,7 @@ extern PyStatus _PySys_Create( extern PyStatus _PySys_ReadPreinitWarnOptions(PyWideStringList *options); extern PyStatus _PySys_ReadPreinitXOptions(PyConfig *config); extern int _PySys_UpdateConfig(PyThreadState *tstate); +extern void _PySys_Fini(PyInterpreterState *interp); extern PyStatus _PyBuiltins_AddExceptions(PyObject * bltinmod); extern PyStatus _Py_HashRandomization_Init(const PyConfig *); @@ -81,6 +82,7 @@ extern void _PyTraceMalloc_Fini(void); extern void 
_PyWarnings_Fini(PyInterpreterState *interp); extern void _PyAST_Fini(PyInterpreterState *interp); extern void _PyAtExit_Fini(PyInterpreterState *interp); +extern void _PyThread_FiniType(PyInterpreterState *interp); extern PyStatus _PyGILState_Init(_PyRuntimeState *runtime); extern PyStatus _PyGILState_SetTstate(PyThreadState *tstate); diff --git a/Include/internal/pycore_typeobject.h b/Include/internal/pycore_typeobject.h index 7fd8a1f35092f..ba95bbc1c4820 100644 --- a/Include/internal/pycore_typeobject.h +++ b/Include/internal/pycore_typeobject.h @@ -40,6 +40,8 @@ struct type_cache { extern PyStatus _PyTypes_InitSlotDefs(void); +extern void _PyStaticType_Dealloc(PyTypeObject *type); + #ifdef __cplusplus } diff --git a/Include/structseq.h b/Include/structseq.h index e89265a67c322..8abd2443468b7 100644 --- a/Include/structseq.h +++ b/Include/structseq.h @@ -27,6 +27,9 @@ PyAPI_FUNC(void) PyStructSequence_InitType(PyTypeObject *type, PyAPI_FUNC(int) PyStructSequence_InitType2(PyTypeObject *type, PyStructSequence_Desc *desc); #endif +#ifdef Py_BUILD_CORE +PyAPI_FUNC(void) _PyStructSequence_FiniType(PyTypeObject *type); +#endif PyAPI_FUNC(PyTypeObject*) PyStructSequence_NewType(PyStructSequence_Desc *desc); PyAPI_FUNC(PyObject *) PyStructSequence_New(PyTypeObject* type); diff --git a/Lib/test/_test_embed_structseq.py b/Lib/test/_test_embed_structseq.py new file mode 100644 index 0000000000000..868f9f83e8be7 --- /dev/null +++ b/Lib/test/_test_embed_structseq.py @@ -0,0 +1,55 @@ +import sys +import types +import unittest + + +# bpo-46417: Test that structseq types used by the sys module are still +# valid when Py_Finalize()/Py_Initialize() are called multiple times. +class TestStructSeq(unittest.TestCase): + # test PyTypeObject members + def check_structseq(self, obj_type): + # ob_refcnt + self.assertGreaterEqual(sys.getrefcount(obj_type), 1) + # tp_base + self.assertTrue(issubclass(obj_type, tuple)) + # tp_bases + self.assertEqual(obj_type.__bases__, (tuple,)) + # tp_dict + self.assertIsInstance(obj_type.__dict__, types.MappingProxyType) + # tp_mro + self.assertEqual(obj_type.__mro__, (obj_type, tuple, object)) + # tp_name + self.assertIsInstance(type.__name__, str) + # tp_subclasses + self.assertEqual(obj_type.__subclasses__(), []) + + def test_sys_attrs(self): + for attr_name in ( + 'flags', # FlagsType + 'float_info', # FloatInfoType + 'hash_info', # Hash_InfoType + 'int_info', # Int_InfoType + 'thread_info', # ThreadInfoType + 'version_info', # VersionInfoType + ): + with self.subTest(attr=attr_name): + attr = getattr(sys, attr_name) + self.check_structseq(type(attr)) + + def test_sys_funcs(self): + func_names = ['get_asyncgen_hooks'] # AsyncGenHooksType + if hasattr(sys, 'getwindowsversion'): + func_names.append('getwindowsversion') # WindowsVersionType + for func_name in func_names: + with self.subTest(func=func_name): + func = getattr(sys, func_name) + obj = func() + self.check_structseq(type(obj)) + + +try: + unittest.main() +except SystemExit as exc: + if exc.args[0] != 0: + raise +print("Tests passed") diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index 9fed0a5f14e65..204b194ed3bf3 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -329,6 +329,18 @@ def test_run_main_loop(self): self.assertEqual(out, "Py_RunMain(): sys.argv=['-c', 'arg2']\n" * nloop) self.assertEqual(err, '') + def test_finalize_structseq(self): + # bpo-46417: Py_Finalize() clears structseq static types. 
Check that + # sys attributes using struct types still work when + # Py_Finalize()/Py_Initialize() is called multiple times. + # print() calls type->tp_repr(instance) and so checks that the types + # are still working properly. + script = support.findfile('_test_embed_structseq.py') + with open(script, encoding="utf-8") as fp: + code = fp.read() + out, err = self.run_embedded_interpreter("test_repeated_init_exec", code) + self.assertEqual(out, 'Tests passed\n' * INIT_LOOPS) + class InitConfigTests(EmbeddingTestsMixin, unittest.TestCase): maxDiff = 4096 diff --git a/Objects/floatobject.c b/Objects/floatobject.c index f8620d6f8ef0b..88f25d6b8c886 100644 --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -2082,6 +2082,14 @@ _PyFloat_Fini(PyInterpreterState *interp) #endif } +void +_PyFloat_FiniType(PyInterpreterState *interp) +{ + if (_Py_IsMainInterpreter(interp)) { + _PyStructSequence_FiniType(&FloatInfoType); + } +} + /* Print summary info about the state of the optimized allocator */ void _PyFloat_DebugMallocStats(FILE *out) diff --git a/Objects/longobject.c b/Objects/longobject.c index 1b2d1266c6bc5..5aa53dd91c299 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -5949,3 +5949,14 @@ _PyLong_InitTypes(PyInterpreterState *interp) return _PyStatus_OK(); } + + +void +_PyLong_FiniTypes(PyInterpreterState *interp) +{ + if (!_Py_IsMainInterpreter(interp)) { + return; + } + + _PyStructSequence_FiniType(&Int_InfoType); +} diff --git a/Objects/structseq.c b/Objects/structseq.c index a2eefb0455a17..f8bf9477f2848 100644 --- a/Objects/structseq.c +++ b/Objects/structseq.c @@ -532,6 +532,36 @@ PyStructSequence_InitType(PyTypeObject *type, PyStructSequence_Desc *desc) (void)PyStructSequence_InitType2(type, desc); } + +void +_PyStructSequence_FiniType(PyTypeObject *type) +{ + // Ensure that the type is initialized + assert(type->tp_name != NULL); + assert(type->tp_base == &PyTuple_Type); + + // Cannot delete a type if it still has subclasses + if (type->tp_subclasses != NULL) { + return; + } + + // Undo PyStructSequence_NewType() + type->tp_name = NULL; + PyMem_Free(type->tp_members); + + _PyStaticType_Dealloc(type); + assert(Py_REFCNT(type) == 1); + // Undo Py_INCREF(type) of _PyStructSequence_InitType(). + // Don't use Py_DECREF(): static type must not be deallocated + Py_SET_REFCNT(type, 0); + + // Make sure that _PyStructSequence_InitType() will initialize + // the type again + assert(Py_REFCNT(type) == 0); + assert(type->tp_name == NULL); +} + + PyTypeObject * PyStructSequence_NewType(PyStructSequence_Desc *desc) { diff --git a/Objects/typeobject.c b/Objects/typeobject.c index cbf806b074b9f..66a10a5bc57dd 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -4070,10 +4070,27 @@ type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) extern void _PyDictKeys_DecRef(PyDictKeysObject *keys); + +void +_PyStaticType_Dealloc(PyTypeObject *type) +{ + // _PyStaticType_Dealloc() must not be called if a type has subtypes. + // A subtype can inherit attributes and methods of its parent type, + // and a type must no longer be used once it's deallocated. 
+ assert(type->tp_subclasses == NULL); + + Py_CLEAR(type->tp_dict); + Py_CLEAR(type->tp_bases); + Py_CLEAR(type->tp_mro); + Py_CLEAR(type->tp_cache); + Py_CLEAR(type->tp_subclasses); + type->tp_flags &= ~Py_TPFLAGS_READY; +} + + static void type_dealloc(PyTypeObject *type) { - PyHeapTypeObject *et; PyObject *tp, *val, *tb; /* Assert this is a heap-allocated type object */ @@ -4082,8 +4099,8 @@ type_dealloc(PyTypeObject *type) PyErr_Fetch(&tp, &val, &tb); remove_all_subclasses(type, type->tp_bases); PyErr_Restore(tp, val, tb); + PyObject_ClearWeakRefs((PyObject *)type); - et = (PyHeapTypeObject *)type; Py_XDECREF(type->tp_base); Py_XDECREF(type->tp_dict); Py_XDECREF(type->tp_bases); @@ -4094,6 +4111,8 @@ type_dealloc(PyTypeObject *type) * of most other objects. It's okay to cast it to char *. */ PyObject_Free((char *)type->tp_doc); + + PyHeapTypeObject *et = (PyHeapTypeObject *)type; Py_XDECREF(et->ht_name); Py_XDECREF(et->ht_qualname); Py_XDECREF(et->ht_slots); diff --git a/Programs/_testembed.c b/Programs/_testembed.c index b31781938eb39..5bc0a127a8750 100644 --- a/Programs/_testembed.c +++ b/Programs/_testembed.c @@ -15,12 +15,18 @@ #include // putenv() #include +int main_argc; +char **main_argv; + /********************************************************* * Embedded interpreter tests that need a custom exe * * Executed via 'EmbeddingTests' in Lib/test/test_capi.py *********************************************************/ +// Use to display the usage +#define PROGRAM "test_embed" + /* Use path starting with "./" avoids a search along the PATH */ #define PROGRAM_NAME L"./_testembed" @@ -113,6 +119,36 @@ PyInit_embedded_ext(void) return PyModule_Create(&embedded_ext); } +/**************************************************************************** + * Call Py_Initialize()/Py_Finalize() multiple times and execute Python code + ***************************************************************************/ + +// Used by bpo-46417 to test that structseq types used by the sys module are +// cleared properly and initialized again properly when Python is finalized +// multiple times. 
+static int test_repeated_init_exec(void) +{ + if (main_argc < 3) { + fprintf(stderr, "usage: %s test_repeated_init_exec CODE\n", PROGRAM); + exit(1); + } + const char *code = main_argv[2]; + + for (int i=1; i <= INIT_LOOPS; i++) { + fprintf(stderr, "--- Loop #%d ---\n", i); + fflush(stderr); + + _testembed_Py_Initialize(); + int err = PyRun_SimpleString(code); + Py_Finalize(); + if (err) { + return 1; + } + } + return 0; +} + + /***************************************************** * Test forcing a particular IO encoding *****************************************************/ @@ -1880,6 +1916,7 @@ struct TestCase static struct TestCase TestCases[] = { // Python initialization + {"test_repeated_init_exec", test_repeated_init_exec}, {"test_forced_io_encoding", test_forced_io_encoding}, {"test_repeated_init_and_subinterpreters", test_repeated_init_and_subinterpreters}, {"test_repeated_init_and_inittab", test_repeated_init_and_inittab}, @@ -1946,6 +1983,9 @@ static struct TestCase TestCases[] = { int main(int argc, char *argv[]) { + main_argc = argc; + main_argv = argv; + if (argc > 1) { for (struct TestCase *tc = TestCases; tc && tc->name; tc++) { if (strcmp(argv[1], tc->name) == 0) diff --git a/Python/errors.c b/Python/errors.c index 6c5fe41142304..211881ca5eb6c 100644 --- a/Python/errors.c +++ b/Python/errors.c @@ -1241,6 +1241,17 @@ _PyErr_InitTypes(PyInterpreterState *interp) } +void +_PyErr_FiniTypes(PyInterpreterState *interp) +{ + if (!_Py_IsMainInterpreter(interp)) { + return; + } + + _PyStructSequence_FiniType(&UnraisableHookArgsType); +} + + static PyObject * make_unraisable_hook_args(PyThreadState *tstate, PyObject *exc_type, PyObject *exc_value, PyObject *exc_tb, diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 8bcad67e80a0c..0b1f47147696d 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -1666,11 +1666,17 @@ flush_std_files(void) static void finalize_interp_types(PyInterpreterState *interp) { + _PySys_Fini(interp); _PyExc_Fini(interp); _PyFrame_Fini(interp); _PyAsyncGen_Fini(interp); _PyContext_Fini(interp); + _PyFloat_FiniType(interp); + _PyLong_FiniTypes(interp); + _PyThread_FiniType(interp); + _PyErr_FiniTypes(interp); _PyTypes_Fini(interp); + // Call _PyUnicode_ClearInterned() before _PyDict_Fini() since it uses // a dict internally. 
_PyUnicode_ClearInterned(interp); diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 0b7b61d8b1e28..515994f049086 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -3102,6 +3102,21 @@ _PySys_Create(PyThreadState *tstate, PyObject **sysmod_p) } +void +_PySys_Fini(PyInterpreterState *interp) +{ + if (_Py_IsMainInterpreter(interp)) { + _PyStructSequence_FiniType(&VersionInfoType); + _PyStructSequence_FiniType(&FlagsType); +#if defined(MS_WINDOWS) + _PyStructSequence_FiniType(&WindowsVersionType); +#endif + _PyStructSequence_FiniType(&Hash_InfoType); + _PyStructSequence_FiniType(&AsyncGenHooksType); + } +} + + static PyObject * makepathobject(const wchar_t *path, wchar_t delim) { diff --git a/Python/thread.c b/Python/thread.c index b1c0cfe84f28d..c2457c4f8fe83 100644 --- a/Python/thread.c +++ b/Python/thread.c @@ -243,3 +243,14 @@ PyThread_GetInfo(void) PyStructSequence_SET_ITEM(threadinfo, pos++, value); return threadinfo; } + + +void +_PyThread_FiniType(PyInterpreterState *interp) +{ + if (!_Py_IsMainInterpreter(interp)) { + return; + } + + _PyStructSequence_FiniType(&ThreadInfoType); +} From webhook-mailer at python.org Thu Jan 20 20:12:28 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 01:12:28 -0000 Subject: [Python-checkins] bpo-46417: _testembed.c avoids Py_SetProgramName() (GH-30732) Message-ID: https://github.com/python/cpython/commit/6415e2ee4955b1a995c1e75544e2506b03780c3d commit: 6415e2ee4955b1a995c1e75544e2506b03780c3d branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T02:12:18+01:00 summary: bpo-46417: _testembed.c avoids Py_SetProgramName() (GH-30732) * _testembed_Py_Initialize() now uses the PyConfig API, rather than deprecated Py_SetProgramName(). * Reduce INIT_LOOPS from 16 to 4: test_embed now takes 8.7 seconds rather than 14.7 seconds. files: M Lib/test/test_embed.py M Programs/_testembed.c diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index 204b194ed3bf3..19c53c392607b 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -32,7 +32,7 @@ # _PyCoreConfig_InitIsolatedConfig() API_ISOLATED = 3 -INIT_LOOPS = 16 +INIT_LOOPS = 4 MAX_HASH_SEED = 4294967295 diff --git a/Programs/_testembed.c b/Programs/_testembed.c index 5bc0a127a8750..08e27d97723b3 100644 --- a/Programs/_testembed.c +++ b/Programs/_testembed.c @@ -30,7 +30,7 @@ char **main_argv; /* Use path starting with "./" avoids a search along the PATH */ #define PROGRAM_NAME L"./_testembed" -#define INIT_LOOPS 16 +#define INIT_LOOPS 4 // Ignore Py_DEPRECATED() compiler warnings: deprecated functions are // tested on purpose here. 
@@ -45,10 +45,39 @@ static void error(const char *msg) } +static void config_set_string(PyConfig *config, wchar_t **config_str, const wchar_t *str) +{ + PyStatus status = PyConfig_SetString(config, config_str, str); + if (PyStatus_Exception(status)) { + PyConfig_Clear(config); + Py_ExitStatusException(status); + } +} + + +static void config_set_program_name(PyConfig *config) +{ + const wchar_t *program_name = PROGRAM_NAME; + config_set_string(config, &config->program_name, program_name); +} + + +static void init_from_config_clear(PyConfig *config) +{ + PyStatus status = Py_InitializeFromConfig(config); + PyConfig_Clear(config); + if (PyStatus_Exception(status)) { + Py_ExitStatusException(status); + } +} + + static void _testembed_Py_Initialize(void) { - Py_SetProgramName(PROGRAM_NAME); - Py_Initialize(); + PyConfig config; + _PyConfig_InitCompatConfig(&config); + config_set_program_name(&config); + init_from_config_clear(&config); } @@ -391,16 +420,6 @@ static int test_init_initialize_config(void) } -static void config_set_string(PyConfig *config, wchar_t **config_str, const wchar_t *str) -{ - PyStatus status = PyConfig_SetString(config, config_str, str); - if (PyStatus_Exception(status)) { - PyConfig_Clear(config); - Py_ExitStatusException(status); - } -} - - static void config_set_argv(PyConfig *config, Py_ssize_t argc, wchar_t * const *argv) { PyStatus status = PyConfig_SetArgv(config, argc, argv); @@ -423,23 +442,6 @@ config_set_wide_string_list(PyConfig *config, PyWideStringList *list, } -static void config_set_program_name(PyConfig *config) -{ - const wchar_t *program_name = PROGRAM_NAME; - config_set_string(config, &config->program_name, program_name); -} - - -static void init_from_config_clear(PyConfig *config) -{ - PyStatus status = Py_InitializeFromConfig(config); - PyConfig_Clear(config); - if (PyStatus_Exception(status)) { - Py_ExitStatusException(status); - } -} - - static int check_init_compat_config(int preinit) { PyStatus status; From webhook-mailer at python.org Thu Jan 20 20:51:12 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 01:51:12 -0000 Subject: [Python-checkins] bpo-46417: _thread uses PyStructSequence_NewType() (GH-30733) Message-ID: https://github.com/python/cpython/commit/f389b37fb1cebe7ed66331cdd373a014695261f6 commit: f389b37fb1cebe7ed66331cdd373a014695261f6 branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T02:51:04+01:00 summary: bpo-46417: _thread uses PyStructSequence_NewType() (GH-30733) The _thread module now creates its _ExceptHookArgs type as a heap type using PyStructSequence_NewType(), rather than using a static type. 
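For reference, _ExceptHookArgs is the struct sequence handed to threading.excepthook; its Python-level behaviour is unchanged by this refactoring. A small usage sketch (the hook and thread names here are arbitrary)::

    import threading

    def hook(args):
        # args carries exc_type, exc_value, exc_traceback and thread fields
        print(f"unhandled {args.exc_type.__name__} in {args.thread.name}: {args.exc_value}")

    threading.excepthook = hook
    t = threading.Thread(target=lambda: 1 / 0, name="worker")
    t.start()
    t.join()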
files: M Modules/_threadmodule.c diff --git a/Modules/_threadmodule.c b/Modules/_threadmodule.c index cde2e0b6be7e4..9e6e462b59e06 100644 --- a/Modules/_threadmodule.c +++ b/Modules/_threadmodule.c @@ -28,6 +28,7 @@ static struct PyModuleDef thread_module; typedef struct { + PyTypeObject *excepthook_type; PyTypeObject *lock_type; PyTypeObject *local_type; PyTypeObject *local_dummy_type; @@ -1473,8 +1474,6 @@ PyDoc_STRVAR(ExceptHookArgs__doc__, \n\ Type used to pass arguments to threading.excepthook."); -static PyTypeObject ExceptHookArgsType; - static PyStructSequence_Field ExceptHookArgs_fields[] = { {"exc_type", "Exception type"}, {"exc_value", "Exception value"}, @@ -1492,9 +1491,11 @@ static PyStructSequence_Desc ExceptHookArgs_desc = { static PyObject * -thread_excepthook(PyObject *self, PyObject *args) +thread_excepthook(PyObject *module, PyObject *args) { - if (!Py_IS_TYPE(args, &ExceptHookArgsType)) { + thread_module_state *state = get_thread_state(module); + + if (!Py_IS_TYPE(args, state->excepthook_type)) { PyErr_SetString(PyExc_TypeError, "_thread.excepthook argument type " "must be ExceptHookArgs"); @@ -1629,18 +1630,17 @@ thread_module_exec(PyObject *module) return -1; } - if (ExceptHookArgsType.tp_name == NULL) { - if (PyStructSequence_InitType2(&ExceptHookArgsType, - &ExceptHookArgs_desc) < 0) { - return -1; - } - } - // Add module attributes if (PyDict_SetItemString(d, "error", ThreadError) < 0) { return -1; } - if (PyModule_AddType(module, &ExceptHookArgsType) < 0) { + + // _ExceptHookArgs type + state->excepthook_type = PyStructSequence_NewType(&ExceptHookArgs_desc); + if (state->excepthook_type == NULL) { + return -1; + } + if (PyModule_AddType(module, state->excepthook_type) < 0) { return -1; } @@ -1664,6 +1664,7 @@ static int thread_module_traverse(PyObject *module, visitproc visit, void *arg) { thread_module_state *state = get_thread_state(module); + Py_VISIT(state->excepthook_type); Py_VISIT(state->lock_type); Py_VISIT(state->local_type); Py_VISIT(state->local_dummy_type); @@ -1674,6 +1675,7 @@ static int thread_module_clear(PyObject *module) { thread_module_state *state = get_thread_state(module); + Py_CLEAR(state->excepthook_type); Py_CLEAR(state->lock_type); Py_CLEAR(state->local_type); Py_CLEAR(state->local_dummy_type); From webhook-mailer at python.org Thu Jan 20 20:52:51 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 01:52:51 -0000 Subject: [Python-checkins] bpo-46417: time module uses PyStructSequence_NewType() (GH-30734) Message-ID: https://github.com/python/cpython/commit/17f268a4ae6190b2659c89c6f32ad2d006e0e3c8 commit: 17f268a4ae6190b2659c89c6f32ad2d006e0e3c8 branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T02:52:43+01:00 summary: bpo-46417: time module uses PyStructSequence_NewType() (GH-30734) The time module now creates its struct_time type as a heap type using PyStructSequence_NewType(), rather than using a static type. * Add a module state to the time module: add traverse, clear and free functions. * Use PyModule_AddType(). * Remove the 'initialized' variable. 
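struct_time keeps its documented behaviour after this refactoring: it is still the tuple subclass with named fields returned by time.gmtime() and time.localtime(). A quick check (illustrative only)::

    >>> import time
    >>> t = time.gmtime(0)
    >>> type(t).__name__
    'struct_time'
    >>> isinstance(t, tuple)      # struct sequences subclass tuple
    True
    >>> t.tm_year, t.tm_mon, t.tm_mday
    (1970, 1, 1)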
files: M Modules/timemodule.c diff --git a/Modules/timemodule.c b/Modules/timemodule.c index dd81d352fd713..35b8e14e82711 100644 --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -2,6 +2,7 @@ #include "Python.h" #include "pycore_fileutils.h" // _Py_BEGIN_SUPPRESS_IPH +#include "pycore_moduleobject.h" // _PyModule_GetState() #include "pycore_namespace.h" // _PyNamespace_New() #include @@ -64,6 +65,19 @@ static int pysleep(_PyTime_t timeout); +typedef struct { + PyTypeObject *struct_time_type; +} time_module_state; + +static inline time_module_state* +get_time_state(PyObject *module) +{ + void *state = _PyModule_GetState(module); + assert(state != NULL); + return (time_module_state *)state; +} + + static PyObject* _PyFloat_FromPyTime(_PyTime_t t) { @@ -405,9 +419,6 @@ static PyStructSequence_Desc struct_time_type_desc = { 9, }; -static int initialized; -static PyTypeObject StructTimeType; - #if defined(MS_WINDOWS) #ifndef CREATE_WAITABLE_TIMER_HIGH_RESOLUTION #define CREATE_WAITABLE_TIMER_HIGH_RESOLUTION 0x00000002 @@ -417,13 +428,13 @@ static DWORD timer_flags = (DWORD)-1; #endif static PyObject * -tmtotuple(struct tm *p +tmtotuple(time_module_state *state, struct tm *p #ifndef HAVE_STRUCT_TM_TM_ZONE , const char *zone, time_t gmtoff #endif ) { - PyObject *v = PyStructSequence_New(&StructTimeType); + PyObject *v = PyStructSequence_New(state->struct_time_type); if (v == NULL) return NULL; @@ -480,7 +491,7 @@ parse_time_t_args(PyObject *args, const char *format, time_t *pwhen) } static PyObject * -time_gmtime(PyObject *self, PyObject *args) +time_gmtime(PyObject *module, PyObject *args) { time_t when; struct tm buf; @@ -491,10 +502,12 @@ time_gmtime(PyObject *self, PyObject *args) errno = 0; if (_PyTime_gmtime(when, &buf) != 0) return NULL; + + time_module_state *state = get_time_state(module); #ifdef HAVE_STRUCT_TM_TM_ZONE - return tmtotuple(&buf); + return tmtotuple(state, &buf); #else - return tmtotuple(&buf, "UTC", 0); + return tmtotuple(state, &buf, "UTC", 0); #endif } @@ -522,7 +535,7 @@ If the platform supports the tm_gmtoff and tm_zone, they are available as\n\ attributes only."); static PyObject * -time_localtime(PyObject *self, PyObject *args) +time_localtime(PyObject *module, PyObject *args) { time_t when; struct tm buf; @@ -531,8 +544,10 @@ time_localtime(PyObject *self, PyObject *args) return NULL; if (_PyTime_localtime(when, &buf) != 0) return NULL; + + time_module_state *state = get_time_state(module); #ifdef HAVE_STRUCT_TM_TM_ZONE - return tmtotuple(&buf); + return tmtotuple(state, &buf); #else { struct tm local = buf; @@ -540,7 +555,7 @@ time_localtime(PyObject *self, PyObject *args) time_t gmtoff; strftime(zone, sizeof(zone), "%Z", &buf); gmtoff = timegm(&buf) - when; - return tmtotuple(&local, zone, gmtoff); + return tmtotuple(state, &local, zone, gmtoff); } #endif } @@ -560,7 +575,8 @@ When 'seconds' is not passed in, convert the current time instead."); * an exception and return 0 on error. 
*/ static int -gettmarg(PyObject *args, struct tm *p, const char *format) +gettmarg(time_module_state *state, PyObject *args, + struct tm *p, const char *format) { int y; @@ -588,7 +604,7 @@ gettmarg(PyObject *args, struct tm *p, const char *format) p->tm_wday = (p->tm_wday + 1) % 7; p->tm_yday--; #ifdef HAVE_STRUCT_TM_TM_ZONE - if (Py_IS_TYPE(args, &StructTimeType)) { + if (Py_IS_TYPE(args, state->struct_time_type)) { PyObject *item; item = PyStructSequence_GET_ITEM(args, 9); if (item != Py_None) { @@ -729,7 +745,7 @@ the C library strftime function.\n" #endif static PyObject * -time_strftime(PyObject *self, PyObject *args) +time_strftime(PyObject *module, PyObject *args) { PyObject *tup = NULL; struct tm buf; @@ -753,12 +769,13 @@ time_strftime(PyObject *self, PyObject *args) if (!PyArg_ParseTuple(args, "U|O:strftime", &format_arg, &tup)) return NULL; + time_module_state *state = get_time_state(module); if (tup == NULL) { time_t tt = time(NULL); if (_PyTime_localtime(tt, &buf) != 0) return NULL; } - else if (!gettmarg(tup, &buf, + else if (!gettmarg(state, tup, &buf, "iiiiiiiii;strftime(): illegal time tuple argument") || !checktm(&buf)) { @@ -941,19 +958,21 @@ _asctime(struct tm *timeptr) } static PyObject * -time_asctime(PyObject *self, PyObject *args) +time_asctime(PyObject *module, PyObject *args) { PyObject *tup = NULL; struct tm buf; if (!PyArg_UnpackTuple(args, "asctime", 0, 1, &tup)) return NULL; + + time_module_state *state = get_time_state(module); if (tup == NULL) { time_t tt = time(NULL); if (_PyTime_localtime(tt, &buf) != 0) return NULL; } - else if (!gettmarg(tup, &buf, + else if (!gettmarg(state, tup, &buf, "iiiiiiiii;asctime(): illegal time tuple argument") || !checktm(&buf)) { @@ -990,12 +1009,13 @@ not present, current time as returned by localtime() is used."); #ifdef HAVE_MKTIME static PyObject * -time_mktime(PyObject *self, PyObject *tm_tuple) +time_mktime(PyObject *module, PyObject *tm_tuple) { struct tm tm; time_t tt; - if (!gettmarg(tm_tuple, &tm, + time_module_state *state = get_time_state(module); + if (!gettmarg(state, tm_tuple, &tm, "iiiiiiiii;mktime(): illegal time tuple argument")) { return NULL; @@ -1888,6 +1908,7 @@ if it is -1, mktime() should guess based on the date and time.\n"); static int time_exec(PyObject *module) { + time_module_state *state = get_time_state(module); #if defined(__APPLE__) && defined(HAVE_CLOCK_GETTIME) if (HAVE_CLOCK_GETTIME_RUNTIME) { /* pass: ^^^ cannot use '!' 
here */ @@ -2001,21 +2022,18 @@ time_exec(PyObject *module) #endif /* defined(HAVE_CLOCK_GETTIME) || defined(HAVE_CLOCK_SETTIME) || defined(HAVE_CLOCK_GETRES) */ - if (!initialized) { - if (PyStructSequence_InitType2(&StructTimeType, - &struct_time_type_desc) < 0) { - return -1; - } - } if (PyModule_AddIntConstant(module, "_STRUCT_TM_ITEMS", 11)) { return -1; } - Py_INCREF(&StructTimeType); - if (PyModule_AddObject(module, "struct_time", (PyObject*) &StructTimeType)) { - Py_DECREF(&StructTimeType); + + // struct_time type + state->struct_time_type = PyStructSequence_NewType(&struct_time_type_desc); + if (state->struct_time_type == NULL) { + return -1; + } + if (PyModule_AddType(module, state->struct_time_type)) { return -1; } - initialized = 1; #if defined(__linux__) && !defined(__GLIBC__) struct tm tm; @@ -2044,6 +2062,32 @@ time_exec(PyObject *module) return 0; } + +static int +time_module_traverse(PyObject *module, visitproc visit, void *arg) +{ + time_module_state *state = get_time_state(module); + Py_VISIT(state->struct_time_type); + return 0; +} + + +static int +time_module_clear(PyObject *module) +{ + time_module_state *state = get_time_state(module); + Py_CLEAR(state->struct_time_type); + return 0; +} + + +static void +time_module_free(void *module) +{ + time_module_clear((PyObject *)module); +} + + static struct PyModuleDef_Slot time_slots[] = { {Py_mod_exec, time_exec}, {0, NULL} @@ -2051,14 +2095,14 @@ static struct PyModuleDef_Slot time_slots[] = { static struct PyModuleDef timemodule = { PyModuleDef_HEAD_INIT, - "time", - module_doc, - 0, - time_methods, - time_slots, - NULL, - NULL, - NULL + .m_name = "time", + .m_doc = module_doc, + .m_size = sizeof(time_module_state), + .m_methods = time_methods, + .m_slots = time_slots, + .m_traverse = time_module_traverse, + .m_clear = time_module_clear, + .m_free = time_module_free, }; PyMODINIT_FUNC From webhook-mailer at python.org Thu Jan 20 21:30:30 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 02:30:30 -0000 Subject: [Python-checkins] bpo-46417: _curses uses PyStructSequence_NewType() (GH-30736) Message-ID: https://github.com/python/cpython/commit/1781d55eb34f94029e50970232635fc5082378cb commit: 1781d55eb34f94029e50970232635fc5082378cb branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T03:30:20+01:00 summary: bpo-46417: _curses uses PyStructSequence_NewType() (GH-30736) The _curses module now creates its ncurses_version type as a heap type using PyStructSequence_NewType(), rather than using a static type. * Move _PyStructSequence_FiniType() definition to pycore_structseq.h. * test.pythoninfo: log curses.ncurses_version. 
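The type is observable from Python on interpreters built against ncurses; the values
shown in the comments are illustrative:

    import curses

    print(curses.ncurses_version)            # e.g. curses.ncurses_version(major=6, minor=2, patch=20200212)
    print(curses.ncurses_version.major)      # e.g. 6
    print(curses.ncurses_version >= (6, 1))  # compares like a tuple
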
files: M Include/internal/pycore_structseq.h M Include/structseq.h M Lib/test/pythoninfo.py M Modules/_cursesmodule.c M Objects/floatobject.c M Objects/longobject.c M Objects/structseq.c M Python/errors.c M Python/sysmodule.c M Python/thread.c diff --git a/Include/internal/pycore_structseq.h b/Include/internal/pycore_structseq.h index 3a61cb9a12608..c0323bbea8991 100644 --- a/Include/internal/pycore_structseq.h +++ b/Include/internal/pycore_structseq.h @@ -16,11 +16,16 @@ extern PyStatus _PyStructSequence_InitState(PyInterpreterState *); /* other API */ +PyAPI_FUNC(PyTypeObject *) _PyStructSequence_NewType( + PyStructSequence_Desc *desc, + unsigned long tp_flags); + PyAPI_FUNC(int) _PyStructSequence_InitType( PyTypeObject *type, PyStructSequence_Desc *desc, unsigned long tp_flags); +extern void _PyStructSequence_FiniType(PyTypeObject *type); #ifdef __cplusplus } diff --git a/Include/structseq.h b/Include/structseq.h index 8abd2443468b7..e89265a67c322 100644 --- a/Include/structseq.h +++ b/Include/structseq.h @@ -27,9 +27,6 @@ PyAPI_FUNC(void) PyStructSequence_InitType(PyTypeObject *type, PyAPI_FUNC(int) PyStructSequence_InitType2(PyTypeObject *type, PyStructSequence_Desc *desc); #endif -#ifdef Py_BUILD_CORE -PyAPI_FUNC(void) _PyStructSequence_FiniType(PyTypeObject *type); -#endif PyAPI_FUNC(PyTypeObject*) PyStructSequence_NewType(PyStructSequence_Desc *desc); PyAPI_FUNC(PyObject *) PyStructSequence_New(PyTypeObject* type); diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 39ee9e1d769f8..9d733c5721cde 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -434,6 +434,15 @@ def collect_time(info_add): info_add('time.get_clock_info(%s)' % clock, clock_info) +def collect_curses(info_add): + try: + import curses + except ImportError: + return + + copy_attr(info_add, 'curses.ncurses_version', curses, 'ncurses_version') + + def collect_datetime(info_add): try: import datetime @@ -752,6 +761,7 @@ def collect_info(info): collect_builtins, collect_cc, + collect_curses, collect_datetime, collect_decimal, collect_expat, diff --git a/Modules/_cursesmodule.c b/Modules/_cursesmodule.c index bf742dacf0110..423b042b90755 100644 --- a/Modules/_cursesmodule.c +++ b/Modules/_cursesmodule.c @@ -108,7 +108,7 @@ static const char PyCursesVersion[] = "2.2"; #include "Python.h" #include "pycore_long.h" // _PyLong_GetZero() -#include "pycore_structseq.h" // PyStructSequence_InitType() +#include "pycore_structseq.h" // _PyStructSequence_NewType() #ifdef __hpux #define STRICT_SYSV_CURSES @@ -4569,8 +4569,6 @@ PyDoc_STRVAR(ncurses_version__doc__, \n\ Ncurses version information as a named tuple."); -static PyTypeObject NcursesVersionType; - static PyStructSequence_Field ncurses_version_fields[] = { {"major", "Major release number"}, {"minor", "Minor release number"}, @@ -4586,12 +4584,12 @@ static PyStructSequence_Desc ncurses_version_desc = { }; static PyObject * -make_ncurses_version(void) +make_ncurses_version(PyTypeObject *type) { PyObject *ncurses_version; int pos = 0; - ncurses_version = PyStructSequence_New(&NcursesVersionType); + ncurses_version = PyStructSequence_New(type); if (ncurses_version == NULL) { return NULL; } @@ -4796,14 +4794,14 @@ PyInit__curses(void) #ifdef NCURSES_VERSION /* ncurses_version */ - if (NcursesVersionType.tp_name == NULL) { - if (_PyStructSequence_InitType(&NcursesVersionType, - &ncurses_version_desc, - Py_TPFLAGS_DISALLOW_INSTANTIATION) < 0) { - return NULL; - } + PyTypeObject *version_type; + version_type = 
_PyStructSequence_NewType(&ncurses_version_desc, + Py_TPFLAGS_DISALLOW_INSTANTIATION); + if (version_type == NULL) { + return NULL; } - v = make_ncurses_version(); + v = make_ncurses_version(version_type); + Py_DECREF(version_type); if (v == NULL) { return NULL; } diff --git a/Objects/floatobject.c b/Objects/floatobject.c index 88f25d6b8c886..68be7acaa2e72 100644 --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -12,6 +12,7 @@ #include "pycore_object.h" // _PyObject_Init() #include "pycore_pymath.h" // _Py_ADJUST_ERANGE1() #include "pycore_pystate.h" // _PyInterpreterState_GET() +#include "pycore_structseq.h" // _PyStructSequence_FiniType() #include #include diff --git a/Objects/longobject.c b/Objects/longobject.c index 5aa53dd91c299..7721f40adbba6 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -9,6 +9,7 @@ #include "pycore_object.h" // _PyObject_InitVar() #include "pycore_pystate.h" // _Py_IsMainInterpreter() #include "pycore_runtime.h" // _PY_NSMALLPOSINTS +#include "pycore_structseq.h" // _PyStructSequence_FiniType() #include #include diff --git a/Objects/structseq.c b/Objects/structseq.c index f8bf9477f2848..dfefae8928eb6 100644 --- a/Objects/structseq.c +++ b/Objects/structseq.c @@ -563,7 +563,7 @@ _PyStructSequence_FiniType(PyTypeObject *type) PyTypeObject * -PyStructSequence_NewType(PyStructSequence_Desc *desc) +_PyStructSequence_NewType(PyStructSequence_Desc *desc, unsigned long tp_flags) { PyMemberDef *members; PyTypeObject *type; @@ -596,7 +596,7 @@ PyStructSequence_NewType(PyStructSequence_Desc *desc) spec.name = desc->name; spec.basicsize = sizeof(PyStructSequence) - sizeof(PyObject *); spec.itemsize = sizeof(PyObject *); - spec.flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC; + spec.flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | tp_flags; spec.slots = slots; type = (PyTypeObject *)PyType_FromSpecWithBases(&spec, (PyObject *)&PyTuple_Type); @@ -615,6 +615,13 @@ PyStructSequence_NewType(PyStructSequence_Desc *desc) } +PyTypeObject * +PyStructSequence_NewType(PyStructSequence_Desc *desc) +{ + return _PyStructSequence_NewType(desc, 0); +} + + /* runtime lifecycle */ PyStatus _PyStructSequence_InitState(PyInterpreterState *interp) diff --git a/Python/errors.c b/Python/errors.c index 211881ca5eb6c..023234974c47d 100644 --- a/Python/errors.c +++ b/Python/errors.c @@ -6,6 +6,7 @@ #include "pycore_initconfig.h" // _PyStatus_ERR() #include "pycore_pyerrors.h" // _PyErr_Format() #include "pycore_pystate.h" // _PyThreadState_GET() +#include "pycore_structseq.h" // _PyStructSequence_FiniType() #include "pycore_sysmodule.h" // _PySys_Audit() #include "pycore_traceback.h" // _PyTraceBack_FromFrame() diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 515994f049086..7597ea2ea9e49 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -27,7 +27,7 @@ Data members: #include "pycore_pylifecycle.h" // _PyErr_WriteUnraisableDefaultHook() #include "pycore_pymem.h" // _PyMem_SetDefaultAllocator() #include "pycore_pystate.h" // _PyThreadState_GET() -#include "pycore_structseq.h" // PyStructSequence_InitType() +#include "pycore_structseq.h" // _PyStructSequence_InitType() #include "pycore_tuple.h" // _PyTuple_FromArray() #include "code.h" diff --git a/Python/thread.c b/Python/thread.c index c2457c4f8fe83..c6b16251a05b6 100644 --- a/Python/thread.c +++ b/Python/thread.c @@ -6,7 +6,8 @@ Stuff shared by all thread_*.h files is collected here. 
*/ #include "Python.h" -#include "pycore_pystate.h" // _PyInterpreterState_GET() +#include "pycore_pystate.h" // _PyInterpreterState_GET() +#include "pycore_structseq.h" // _PyStructSequence_FiniType() #ifndef _POSIX_THREADS /* This means pthreads are not implemented in libc headers, hence the macro From webhook-mailer at python.org Thu Jan 20 22:02:49 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 03:02:49 -0000 Subject: [Python-checkins] bpo-46417: signal uses PyStructSequence_NewType() (GH-30735) Message-ID: https://github.com/python/cpython/commit/d013b241352e902389f955f8f99d75f16c124ee2 commit: d013b241352e902389f955f8f99d75f16c124ee2 branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T04:02:38+01:00 summary: bpo-46417: signal uses PyStructSequence_NewType() (GH-30735) The signal module now creates its struct_siginfo type as a heap type using PyStructSequence_NewType(), rather than using a static type. Add 'siginfo_type' member to the global signal_state_t structure. files: M Modules/signalmodule.c diff --git a/Modules/signalmodule.c b/Modules/signalmodule.c index e6f56e0aea9a9..423dc1687bf24 100644 --- a/Modules/signalmodule.c +++ b/Modules/signalmodule.c @@ -136,6 +136,7 @@ typedef struct { #ifdef MS_WINDOWS HANDLE sigint_event; #endif + PyTypeObject *siginfo_type; } signal_state_t; // State shared by all Python interpreters @@ -1136,12 +1137,13 @@ static PyStructSequence_Desc struct_siginfo_desc = { 7 /* n_in_sequence */ }; -static PyTypeObject SiginfoType; static PyObject * fill_siginfo(siginfo_t *si) { - PyObject *result = PyStructSequence_New(&SiginfoType); + signal_state_t *state = &signal_global_state; + + PyObject *result = PyStructSequence_New(state->siginfo_type); if (!result) return NULL; @@ -1660,7 +1662,7 @@ signal_module_exec(PyObject *m) } #endif #if defined(HAVE_SIGWAITINFO) || defined(HAVE_SIGTIMEDWAIT) - if (PyModule_AddType(m, &SiginfoType) < 0) { + if (PyModule_AddType(m, state->siginfo_type) < 0) { return -1; } #endif @@ -1758,6 +1760,7 @@ _PySignal_Fini(void) Py_CLEAR(state->default_handler); Py_CLEAR(state->ignore_handler); + Py_CLEAR(state->siginfo_type); } @@ -1966,10 +1969,9 @@ _PySignal_Init(int install_signal_handlers) #endif #if defined(HAVE_SIGWAITINFO) || defined(HAVE_SIGTIMEDWAIT) - if (SiginfoType.tp_name == NULL) { - if (PyStructSequence_InitType2(&SiginfoType, &struct_siginfo_desc) < 0) { - return -1; - } + state->siginfo_type = PyStructSequence_NewType(&struct_siginfo_desc); + if (state->siginfo_type == NULL) { + return -1; } #endif From webhook-mailer at python.org Fri Jan 21 02:36:37 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Fri, 21 Jan 2022 07:36:37 -0000 Subject: [Python-checkins] bpo-46425: Fix direct invocation of `test_contextlib` (GH-30681) Message-ID: https://github.com/python/cpython/commit/22f73bd9f1fc573d5c998f345b66c29f7ca6614d commit: 22f73bd9f1fc573d5c998f345b66c29f7ca6614d branch: main author: Nikita Sobolev committer: serhiy-storchaka date: 2022-01-21T09:36:19+02:00 summary: bpo-46425: Fix direct invocation of `test_contextlib` (GH-30681) files: M Lib/test/test_contextlib.py diff --git a/Lib/test/test_contextlib.py b/Lib/test/test_contextlib.py index bc8e4e4e2918f..e238548be9e2b 100644 --- a/Lib/test/test_contextlib.py +++ b/Lib/test/test_contextlib.py @@ -1117,9 +1117,15 @@ def test_cm_is_reentrant(self): class TestChdir(unittest.TestCase): + def make_relative_path(self, *parts): + return os.path.join( + os.path.dirname(os.path.realpath(__file__)), + *parts, + 
) + def test_simple(self): old_cwd = os.getcwd() - target = os.path.join(os.path.dirname(__file__), 'data') + target = self.make_relative_path('data') self.assertNotEqual(old_cwd, target) with chdir(target): @@ -1128,8 +1134,8 @@ def test_simple(self): def test_reentrant(self): old_cwd = os.getcwd() - target1 = os.path.join(os.path.dirname(__file__), 'data') - target2 = os.path.join(os.path.dirname(__file__), 'ziptestdata') + target1 = self.make_relative_path('data') + target2 = self.make_relative_path('ziptestdata') self.assertNotIn(old_cwd, (target1, target2)) chdir1, chdir2 = chdir(target1), chdir(target2) @@ -1145,7 +1151,7 @@ def test_reentrant(self): def test_exception(self): old_cwd = os.getcwd() - target = os.path.join(os.path.dirname(__file__), 'data') + target = self.make_relative_path('data') self.assertNotEqual(old_cwd, target) try: From webhook-mailer at python.org Fri Jan 21 02:40:45 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Fri, 21 Jan 2022 07:40:45 -0000 Subject: [Python-checkins] bpo-21987: Fix TarFile.getmember getting a dir with a trailing slash (GH-30283) Message-ID: https://github.com/python/cpython/commit/cfadcc31ea84617b1c73022ce54d4ae831333e8d commit: cfadcc31ea84617b1c73022ce54d4ae831333e8d branch: main author: andrei kulakov committer: serhiy-storchaka date: 2022-01-21T09:40:32+02:00 summary: bpo-21987: Fix TarFile.getmember getting a dir with a trailing slash (GH-30283) files: A Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst M Lib/tarfile.py M Lib/test/test_tarfile.py diff --git a/Lib/tarfile.py b/Lib/tarfile.py index c1ee1222e09b5..e187da2b1994a 100755 --- a/Lib/tarfile.py +++ b/Lib/tarfile.py @@ -1789,7 +1789,7 @@ def getmember(self, name): than once in the archive, its last occurrence is assumed to be the most up-to-date version. """ - tarinfo = self._getmember(name) + tarinfo = self._getmember(name.rstrip('/')) if tarinfo is None: raise KeyError("filename %r not found" % name) return tarinfo diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py index e4b5c52bf1eaf..1357df57eb179 100644 --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -220,6 +220,25 @@ def test_fileobj_symlink2(self): def test_issue14160(self): self._test_fileobj_link("symtype2", "ustar/regtype") + def test_add_dir_getmember(self): + # bpo-21987 + self.add_dir_and_getmember('bar') + self.add_dir_and_getmember('a'*101) + + def add_dir_and_getmember(self, name): + with os_helper.temp_cwd(): + with tarfile.open(tmpname, 'w') as tar: + try: + os.mkdir(name) + tar.add(name) + finally: + os.rmdir(name) + with tarfile.open(tmpname) as tar: + self.assertEqual( + tar.getmember(name), + tar.getmember(name + '/') + ) + class GzipUstarReadTest(GzipTest, UstarReadTest): pass diff --git a/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst b/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst new file mode 100644 index 0000000000000..305dd16d53b49 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst @@ -0,0 +1,2 @@ +Fix an issue with :meth:`tarfile.TarFile.getmember` getting a directory name +with a trailing slash. 
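In practice the fix means a directory member can now be looked up with or without the
trailing slash; a small usage sketch (paths and names are illustrative):

    import os, tarfile, tempfile

    with tempfile.TemporaryDirectory() as tmp:
        os.mkdir(os.path.join(tmp, "bar"))
        archive = os.path.join(tmp, "test.tar")
        with tarfile.open(archive, "w") as tar:
            tar.add(os.path.join(tmp, "bar"), arcname="bar")
        with tarfile.open(archive) as tar:
            # Both lookups now return the same TarInfo for the directory.
            print(tar.getmember("bar") == tar.getmember("bar/"))
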
From webhook-mailer at python.org Fri Jan 21 02:44:21 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Fri, 21 Jan 2022 07:44:21 -0000 Subject: [Python-checkins] bpo-30512: Add CAN Socket support for NetBSD (GH-30066) Message-ID: https://github.com/python/cpython/commit/40fcd16889028bd3cd2289e0f8a2af43f17a5824 commit: 40fcd16889028bd3cd2289e0f8a2af43f17a5824 branch: main author: Thomas Klausner committer: serhiy-storchaka date: 2022-01-21T09:44:05+02:00 summary: bpo-30512: Add CAN Socket support for NetBSD (GH-30066) files: A Misc/NEWS.d/next/Core and Builtins/2021-12-12-00-49-19.bpo-30512.nU9E9V.rst M Doc/library/socket.rst M Doc/whatsnew/3.11.rst M Modules/socketmodule.c M Modules/socketmodule.h M configure M configure.ac M pyconfig.h.in diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst index d6edc057f5e9c..679631a739092 100755 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -396,10 +396,13 @@ Constants Many constants of these forms, documented in the Linux documentation, are also defined in the socket module. - .. availability:: Linux >= 2.6.25. + .. availability:: Linux >= 2.6.25, NetBSD >= 8. .. versionadded:: 3.3 + .. versionchanged:: 3.11 + NetBSD support was added. + .. data:: CAN_BCM CAN_BCM_* diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 5563e3d84de6d..ad421b16fbac3 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -264,6 +264,13 @@ os (Contributed by Dong-hee Na in :issue:`44611`.) +socket +------ + +* Add CAN Socket support for NetBSD. + (Contributed by Thomas Klausner in :issue:`30512`.) + + sqlite3 ------- diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-12-00-49-19.bpo-30512.nU9E9V.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-12-00-49-19.bpo-30512.nU9E9V.rst new file mode 100644 index 0000000000000..da2ce12fec15d --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-12-00-49-19.bpo-30512.nU9E9V.rst @@ -0,0 +1 @@ +Add CAN Socket support for NetBSD. 
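With this change the Python-level pattern already used on Linux also applies on
NetBSD 8 and later; a minimal raw CAN socket sketch (the interface name is
illustrative, and the 16-byte classic CAN frame layout described in the comment is
the Linux one):

    import socket

    with socket.socket(socket.AF_CAN, socket.SOCK_RAW, socket.CAN_RAW) as s:
        s.bind(("can0",))   # bind to a CAN network interface
        frame = s.recv(16)  # one classic frame: can_id, dlc, padding, 8 data bytes
        print(frame.hex())
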
diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c index 0e275639967c2..1c8ef1eb3b5b3 100644 --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -7703,7 +7703,7 @@ PyInit__socket(void) PyModule_AddIntMacro(m, SOL_CAN_RAW); PyModule_AddIntMacro(m, CAN_RAW); #endif -#ifdef HAVE_LINUX_CAN_H +#if defined(HAVE_LINUX_CAN_H) || defined(HAVE_NETCAN_CAN_H) PyModule_AddIntMacro(m, CAN_EFF_FLAG); PyModule_AddIntMacro(m, CAN_RTR_FLAG); PyModule_AddIntMacro(m, CAN_ERR_FLAG); @@ -7718,9 +7718,11 @@ PyInit__socket(void) PyModule_AddIntMacro(m, CAN_J1939); #endif #endif -#ifdef HAVE_LINUX_CAN_RAW_H +#if defined(HAVE_LINUX_CAN_RAW_H) || defined(HAVE_NETCAN_CAN_H) PyModule_AddIntMacro(m, CAN_RAW_FILTER); +#ifdef CAN_RAW_ERR_FILTER PyModule_AddIntMacro(m, CAN_RAW_ERR_FILTER); +#endif PyModule_AddIntMacro(m, CAN_RAW_LOOPBACK); PyModule_AddIntMacro(m, CAN_RAW_RECV_OWN_MSGS); #endif diff --git a/Modules/socketmodule.h b/Modules/socketmodule.h index aea599f0ee6c8..db26c046c3637 100644 --- a/Modules/socketmodule.h +++ b/Modules/socketmodule.h @@ -129,6 +129,8 @@ typedef int socklen_t; #ifdef HAVE_LINUX_CAN_H # include +#elif defined(HAVE_NETCAN_CAN_H) +# include #else # undef AF_CAN # undef PF_CAN @@ -253,7 +255,7 @@ typedef union sock_addr { #ifdef HAVE_NETPACKET_PACKET_H struct sockaddr_ll ll; #endif -#ifdef HAVE_LINUX_CAN_H +#if defined(HAVE_LINUX_CAN_H) || defined(HAVE_NETCAN_CAN_H) struct sockaddr_can can; #endif #ifdef HAVE_SYS_KERN_CONTROL_H diff --git a/configure b/configure index 402e626b6992d..f40d425371dc6 100755 --- a/configure +++ b/configure @@ -8940,7 +8940,8 @@ done # On Linux, can.h, can/bcm.h, can/j1939.h, can/raw.h require sys/socket.h -for ac_header in linux/can.h linux/can/bcm.h linux/can/j1939.h linux/can/raw.h +# On NetBSD, netcan/can.h requires sys/socket.h +for ac_header in linux/can.h linux/can/bcm.h linux/can/j1939.h linux/can/raw.h netcan/can.h do : as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh` ac_fn_c_check_header_compile "$LINENO" "$ac_header" "$as_ac_Header" " diff --git a/configure.ac b/configure.ac index 9c9a338576736..8d140427de48d 100644 --- a/configure.ac +++ b/configure.ac @@ -2411,7 +2411,8 @@ AC_CHECK_HEADERS(linux/vm_sockets.h,,,[ ]) # On Linux, can.h, can/bcm.h, can/j1939.h, can/raw.h require sys/socket.h -AC_CHECK_HEADERS(linux/can.h linux/can/bcm.h linux/can/j1939.h linux/can/raw.h,,,[ +# On NetBSD, netcan/can.h requires sys/socket.h +AC_CHECK_HEADERS(linux/can.h linux/can/bcm.h linux/can/j1939.h linux/can/raw.h netcan/can.h,,,[ #ifdef HAVE_SYS_SOCKET_H #include #endif diff --git a/pyconfig.h.in b/pyconfig.h.in index 21822197708d3..a779ffadf200f 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -772,6 +772,9 @@ /* Define to 1 if you have the header file, and it defines `DIR'. */ #undef HAVE_NDIR_H +/* Define to 1 if you have the header file. */ +#undef HAVE_NETCAN_CAN_H + /* Define to 1 if you have the header file. 
*/ #undef HAVE_NETINET_IN_H From webhook-mailer at python.org Fri Jan 21 02:55:02 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Fri, 21 Jan 2022 07:55:02 -0000 Subject: [Python-checkins] bpo-46426: Improve tests for the dir_fd argument (GH-30668) Message-ID: https://github.com/python/cpython/commit/54610bb448a9cf5be77d53b66169fca4c11be6cb commit: 54610bb448a9cf5be77d53b66169fca4c11be6cb branch: main author: Serhiy Storchaka committer: serhiy-storchaka date: 2022-01-21T09:54:50+02:00 summary: bpo-46426: Improve tests for the dir_fd argument (GH-30668) Ensure that directory file descriptors refer to directories different from the current directory, and that src_dir_fd and dst_dir_fd refer to different directories. Add context manager open_dir_fd() in test.support.os_helper. files: M Lib/test/support/os_helper.py M Lib/test/test_os.py M Lib/test/test_posix.py diff --git a/Lib/test/support/os_helper.py b/Lib/test/support/os_helper.py index ce01417ed07d8..50aa7a7176c0a 100644 --- a/Lib/test/support/os_helper.py +++ b/Lib/test/support/os_helper.py @@ -455,6 +455,17 @@ def create_empty_file(filename): os.close(fd) + at contextlib.contextmanager +def open_dir_fd(path): + """Open a file descriptor to a directory.""" + assert os.path.isdir(path) + dir_fd = os.open(path, os.O_RDONLY) + try: + yield dir_fd + finally: + os.close(dir_fd) + + def fs_is_case_insensitive(directory): """Detects if the file system for the specified directory is case-insensitive.""" diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py index 8da0aa3163fe2..89e5e4190c640 100644 --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -848,12 +848,9 @@ def set_time(filename, ns): def test_utime_dir_fd(self): def set_time(filename, ns): dirname, name = os.path.split(filename) - dirfd = os.open(dirname, os.O_RDONLY) - try: + with os_helper.open_dir_fd(dirname) as dirfd: # pass dir_fd to test utimensat(timespec) or futimesat(timeval) os.utime(name, dir_fd=dirfd, ns=ns) - finally: - os.close(dirfd) self._test_utime(set_time) def test_utime_directory(self): @@ -4341,8 +4338,7 @@ def test_fd(self): os.symlink('file.txt', os.path.join(self.path, 'link')) expected_names.append('link') - fd = os.open(self.path, os.O_RDONLY) - try: + with os_helper.open_dir_fd(self.path) as fd: with os.scandir(fd) as it: entries = list(it) names = [entry.name for entry in entries] @@ -4357,8 +4353,6 @@ def test_fd(self): self.assertEqual(entry.stat(), st) st = os.stat(entry.name, dir_fd=fd, follow_symlinks=False) self.assertEqual(entry.stat(follow_symlinks=False), st) - finally: - os.close(fd) def test_empty_path(self): self.assertRaises(FileNotFoundError, os.scandir, '') diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index 56b72f465c1c0..974edd766cc80 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -21,6 +21,7 @@ import unittest import warnings import textwrap +from contextlib import contextmanager _DUMMY_SYMLINK = os.path.join(tempfile.gettempdir(), os_helper.TESTFN + '-dummy-symlink') @@ -1081,187 +1082,6 @@ def test_getgroups(self): symdiff = idg_groups.symmetric_difference(posix.getgroups()) self.assertTrue(not symdiff or symdiff == {posix.getegid()}) - # tests for the posix *at functions follow - - @unittest.skipUnless(os.access in os.supports_dir_fd, "test needs dir_fd support for os.access()") - def test_access_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - self.assertTrue(posix.access(os_helper.TESTFN, os.R_OK, dir_fd=f)) - finally: - posix.close(f) - - 
@unittest.skipUnless(os.chmod in os.supports_dir_fd, "test needs dir_fd support in os.chmod()") - def test_chmod_dir_fd(self): - os.chmod(os_helper.TESTFN, stat.S_IRUSR) - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.chmod(os_helper.TESTFN, stat.S_IRUSR | stat.S_IWUSR, dir_fd=f) - - s = posix.stat(os_helper.TESTFN) - self.assertEqual(s[0] & stat.S_IRWXU, stat.S_IRUSR | stat.S_IWUSR) - finally: - posix.close(f) - - @unittest.skipUnless(hasattr(os, 'chown') and (os.chown in os.supports_dir_fd), - "test needs dir_fd support in os.chown()") - def test_chown_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - os_helper.create_empty_file(os_helper.TESTFN) - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.chown(os_helper.TESTFN, os.getuid(), os.getgid(), dir_fd=f) - finally: - posix.close(f) - - @unittest.skipUnless(os.stat in os.supports_dir_fd, "test needs dir_fd support in os.stat()") - def test_stat_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - with open(os_helper.TESTFN, 'w') as outfile: - outfile.write("testline\n") - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - s1 = posix.stat(os_helper.TESTFN) - s2 = posix.stat(os_helper.TESTFN, dir_fd=f) - self.assertEqual(s1, s2) - s2 = posix.stat(os_helper.TESTFN, dir_fd=None) - self.assertEqual(s1, s2) - self.assertRaisesRegex(TypeError, 'should be integer or None, not', - posix.stat, os_helper.TESTFN, dir_fd=posix.getcwd()) - self.assertRaisesRegex(TypeError, 'should be integer or None, not', - posix.stat, os_helper.TESTFN, dir_fd=float(f)) - self.assertRaises(OverflowError, - posix.stat, os_helper.TESTFN, dir_fd=10**20) - finally: - posix.close(f) - - @unittest.skipUnless(os.utime in os.supports_dir_fd, "test needs dir_fd support in os.utime()") - def test_utime_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - now = time.time() - posix.utime(os_helper.TESTFN, None, dir_fd=f) - posix.utime(os_helper.TESTFN, dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - now, dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (None, None), dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (now, None), dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (None, now), dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (now, "x"), dir_fd=f) - posix.utime(os_helper.TESTFN, (int(now), int(now)), dir_fd=f) - posix.utime(os_helper.TESTFN, (now, now), dir_fd=f) - posix.utime(os_helper.TESTFN, - (int(now), int((now - int(now)) * 1e9)), dir_fd=f) - posix.utime(os_helper.TESTFN, dir_fd=f, - times=(int(now), int((now - int(now)) * 1e9))) - - # try dir_fd and follow_symlinks together - if os.utime in os.supports_follow_symlinks: - try: - posix.utime(os_helper.TESTFN, follow_symlinks=False, - dir_fd=f) - except ValueError: - # whoops! using both together not supported on this platform. 
- pass - - finally: - posix.close(f) - - @unittest.skipUnless(os.link in os.supports_dir_fd, "test needs dir_fd support in os.link()") - def test_link_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.link(os_helper.TESTFN, os_helper.TESTFN + 'link', - src_dir_fd=f, dst_dir_fd=f) - except PermissionError as e: - self.skipTest('posix.link(): %s' % e) - else: - # should have same inodes - self.assertEqual(posix.stat(os_helper.TESTFN)[1], - posix.stat(os_helper.TESTFN + 'link')[1]) - finally: - posix.close(f) - os_helper.unlink(os_helper.TESTFN + 'link') - - @unittest.skipUnless(os.mkdir in os.supports_dir_fd, "test needs dir_fd support in os.mkdir()") - def test_mkdir_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.mkdir(os_helper.TESTFN + 'dir', dir_fd=f) - posix.stat(os_helper.TESTFN + 'dir') # should not raise exception - finally: - posix.close(f) - os_helper.rmtree(os_helper.TESTFN + 'dir') - - @unittest.skipUnless(hasattr(os, 'mknod') - and (os.mknod in os.supports_dir_fd) - and hasattr(stat, 'S_IFIFO'), - "test requires both stat.S_IFIFO and dir_fd support for os.mknod()") - def test_mknod_dir_fd(self): - # Test using mknodat() to create a FIFO (the only use specified - # by POSIX). - os_helper.unlink(os_helper.TESTFN) - mode = stat.S_IFIFO | stat.S_IRUSR | stat.S_IWUSR - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.mknod(os_helper.TESTFN, mode, 0, dir_fd=f) - except OSError as e: - # Some old systems don't allow unprivileged users to use - # mknod(), or only support creating device nodes. - self.assertIn(e.errno, (errno.EPERM, errno.EINVAL, errno.EACCES)) - else: - self.assertTrue(stat.S_ISFIFO(posix.stat(os_helper.TESTFN).st_mode)) - finally: - posix.close(f) - - @unittest.skipUnless(os.open in os.supports_dir_fd, "test needs dir_fd support in os.open()") - def test_open_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - with open(os_helper.TESTFN, 'w') as outfile: - outfile.write("testline\n") - a = posix.open(posix.getcwd(), posix.O_RDONLY) - b = posix.open(os_helper.TESTFN, posix.O_RDONLY, dir_fd=a) - try: - res = posix.read(b, 9).decode(encoding="utf-8") - self.assertEqual("testline\n", res) - finally: - posix.close(a) - posix.close(b) - - @unittest.skipUnless(hasattr(os, 'readlink') and (os.readlink in os.supports_dir_fd), - "test needs dir_fd support in os.readlink()") - def test_readlink_dir_fd(self): - os.symlink(os_helper.TESTFN, os_helper.TESTFN + 'link') - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - self.assertEqual(posix.readlink(os_helper.TESTFN + 'link'), - posix.readlink(os_helper.TESTFN + 'link', dir_fd=f)) - finally: - os_helper.unlink(os_helper.TESTFN + 'link') - posix.close(f) - - @unittest.skipUnless(os.rename in os.supports_dir_fd, "test needs dir_fd support in os.rename()") - def test_rename_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - os_helper.create_empty_file(os_helper.TESTFN + 'ren') - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.rename(os_helper.TESTFN + 'ren', os_helper.TESTFN, src_dir_fd=f, dst_dir_fd=f) - except: - posix.rename(os_helper.TESTFN + 'ren', os_helper.TESTFN) - raise - else: - posix.stat(os_helper.TESTFN) # should not raise exception - finally: - posix.close(f) - @unittest.skipUnless(hasattr(signal, 'SIGCHLD'), 'CLD_XXXX be placed in si_code for a SIGCHLD signal') @unittest.skipUnless(hasattr(os, 'waitid_result'), "test needs os.waitid_result") def test_cld_xxxx_constants(self): @@ -1272,47 +1092,6 @@ def 
test_cld_xxxx_constants(self): os.CLD_STOPPED os.CLD_CONTINUED - @unittest.skipUnless(os.symlink in os.supports_dir_fd, "test needs dir_fd support in os.symlink()") - def test_symlink_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.symlink(os_helper.TESTFN, os_helper.TESTFN + 'link', - dir_fd=f) - self.assertEqual(posix.readlink(os_helper.TESTFN + 'link'), - os_helper.TESTFN) - finally: - posix.close(f) - os_helper.unlink(os_helper.TESTFN + 'link') - - @unittest.skipUnless(os.unlink in os.supports_dir_fd, "test needs dir_fd support in os.unlink()") - def test_unlink_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - os_helper.create_empty_file(os_helper.TESTFN + 'del') - posix.stat(os_helper.TESTFN + 'del') # should not raise exception - try: - posix.unlink(os_helper.TESTFN + 'del', dir_fd=f) - except: - os_helper.unlink(os_helper.TESTFN + 'del') - raise - else: - self.assertRaises(OSError, posix.stat, os_helper.TESTFN + 'link') - finally: - posix.close(f) - - @unittest.skipUnless(os.mkfifo in os.supports_dir_fd, "test needs dir_fd support in os.mkfifo()") - def test_mkfifo_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - try: - posix.mkfifo(os_helper.TESTFN, - stat.S_IRUSR | stat.S_IWUSR, dir_fd=f) - except PermissionError as e: - self.skipTest('posix.mkfifo(): %s' % e) - self.assertTrue(stat.S_ISFIFO(posix.stat(os_helper.TESTFN).st_mode)) - finally: - posix.close(f) - requires_sched_h = unittest.skipUnless(hasattr(posix, 'sched_yield'), "don't have scheduling support") requires_sched_affinity = unittest.skipUnless(hasattr(posix, 'sched_setaffinity'), @@ -1519,6 +1298,200 @@ def test_pidfd_open(self): self.assertEqual(cm.exception.errno, errno.EINVAL) os.close(os.pidfd_open(os.getpid(), 0)) + +# tests for the posix *at functions follow +class TestPosixDirFd(unittest.TestCase): + count = 0 + + @contextmanager + def prepare(self): + TestPosixDirFd.count += 1 + name = f'{os_helper.TESTFN}_{self.count}' + base_dir = f'{os_helper.TESTFN}_{self.count}base' + posix.mkdir(base_dir) + self.addCleanup(posix.rmdir, base_dir) + fullname = os.path.join(base_dir, name) + assert not os.path.exists(fullname) + with os_helper.open_dir_fd(base_dir) as dir_fd: + yield (dir_fd, name, fullname) + + @contextmanager + def prepare_file(self): + with self.prepare() as (dir_fd, name, fullname): + os_helper.create_empty_file(fullname) + self.addCleanup(posix.unlink, fullname) + yield (dir_fd, name, fullname) + + @unittest.skipUnless(os.access in os.supports_dir_fd, "test needs dir_fd support for os.access()") + def test_access_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + self.assertTrue(posix.access(name, os.R_OK, dir_fd=dir_fd)) + + @unittest.skipUnless(os.chmod in os.supports_dir_fd, "test needs dir_fd support in os.chmod()") + def test_chmod_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + posix.chmod(fullname, stat.S_IRUSR) + posix.chmod(name, stat.S_IRUSR | stat.S_IWUSR, dir_fd=dir_fd) + s = posix.stat(fullname) + self.assertEqual(s.st_mode & stat.S_IRWXU, + stat.S_IRUSR | stat.S_IWUSR) + + @unittest.skipUnless(hasattr(os, 'chown') and (os.chown in os.supports_dir_fd), + "test needs dir_fd support in os.chown()") + def test_chown_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + posix.chown(name, os.getuid(), os.getgid(), dir_fd=dir_fd) + + @unittest.skipUnless(os.stat in os.supports_dir_fd, "test needs dir_fd support in os.stat()") + 
def test_stat_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + with open(fullname, 'w') as outfile: + outfile.write("testline\n") + self.addCleanup(posix.unlink, fullname) + + s1 = posix.stat(fullname) + s2 = posix.stat(name, dir_fd=dir_fd) + self.assertEqual(s1, s2) + s2 = posix.stat(fullname, dir_fd=None) + self.assertEqual(s1, s2) + + self.assertRaisesRegex(TypeError, 'should be integer or None, not', + posix.stat, name, dir_fd=posix.getcwd()) + self.assertRaisesRegex(TypeError, 'should be integer or None, not', + posix.stat, name, dir_fd=float(dir_fd)) + self.assertRaises(OverflowError, + posix.stat, name, dir_fd=10**20) + + @unittest.skipUnless(os.utime in os.supports_dir_fd, "test needs dir_fd support in os.utime()") + def test_utime_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + now = time.time() + posix.utime(name, None, dir_fd=dir_fd) + posix.utime(name, dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + now, dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (None, None), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (now, None), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (None, now), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (now, "x"), dir_fd=dir_fd) + posix.utime(name, (int(now), int(now)), dir_fd=dir_fd) + posix.utime(name, (now, now), dir_fd=dir_fd) + posix.utime(name, + (int(now), int((now - int(now)) * 1e9)), dir_fd=dir_fd) + posix.utime(name, dir_fd=dir_fd, + times=(int(now), int((now - int(now)) * 1e9))) + + # try dir_fd and follow_symlinks together + if os.utime in os.supports_follow_symlinks: + try: + posix.utime(name, follow_symlinks=False, dir_fd=dir_fd) + except ValueError: + # whoops! using both together not supported on this platform. + pass + + @unittest.skipUnless(os.link in os.supports_dir_fd, "test needs dir_fd support in os.link()") + def test_link_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname), \ + self.prepare() as (dir_fd2, linkname, fulllinkname): + try: + posix.link(name, linkname, src_dir_fd=dir_fd, dst_dir_fd=dir_fd2) + except PermissionError as e: + self.skipTest('posix.link(): %s' % e) + self.addCleanup(posix.unlink, fulllinkname) + # should have same inodes + self.assertEqual(posix.stat(fullname)[1], + posix.stat(fulllinkname)[1]) + + @unittest.skipUnless(os.mkdir in os.supports_dir_fd, "test needs dir_fd support in os.mkdir()") + def test_mkdir_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + posix.mkdir(name, dir_fd=dir_fd) + self.addCleanup(posix.rmdir, fullname) + posix.stat(fullname) # should not raise exception + + @unittest.skipUnless(hasattr(os, 'mknod') + and (os.mknod in os.supports_dir_fd) + and hasattr(stat, 'S_IFIFO'), + "test requires both stat.S_IFIFO and dir_fd support for os.mknod()") + def test_mknod_dir_fd(self): + # Test using mknodat() to create a FIFO (the only use specified + # by POSIX). + with self.prepare() as (dir_fd, name, fullname): + mode = stat.S_IFIFO | stat.S_IRUSR | stat.S_IWUSR + try: + posix.mknod(name, mode, 0, dir_fd=dir_fd) + except OSError as e: + # Some old systems don't allow unprivileged users to use + # mknod(), or only support creating device nodes. 
+ self.assertIn(e.errno, (errno.EPERM, errno.EINVAL, errno.EACCES)) + else: + self.addCleanup(posix.unlink, fullname) + self.assertTrue(stat.S_ISFIFO(posix.stat(fullname).st_mode)) + + @unittest.skipUnless(os.open in os.supports_dir_fd, "test needs dir_fd support in os.open()") + def test_open_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + with open(fullname, 'wb') as outfile: + outfile.write(b"testline\n") + self.addCleanup(posix.unlink, fullname) + fd = posix.open(name, posix.O_RDONLY, dir_fd=dir_fd) + try: + res = posix.read(fd, 9) + self.assertEqual(b"testline\n", res) + finally: + posix.close(fd) + + @unittest.skipUnless(hasattr(os, 'readlink') and (os.readlink in os.supports_dir_fd), + "test needs dir_fd support in os.readlink()") + def test_readlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + os.symlink('symlink', fullname) + self.addCleanup(posix.unlink, fullname) + self.assertEqual(posix.readlink(name, dir_fd=dir_fd), 'symlink') + + @unittest.skipUnless(os.rename in os.supports_dir_fd, "test needs dir_fd support in os.rename()") + def test_rename_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname), \ + self.prepare() as (dir_fd2, name2, fullname2): + posix.rename(name, name2, + src_dir_fd=dir_fd, dst_dir_fd=dir_fd2) + posix.stat(fullname2) # should not raise exception + posix.rename(fullname2, fullname) + + @unittest.skipUnless(os.symlink in os.supports_dir_fd, "test needs dir_fd support in os.symlink()") + def test_symlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + posix.symlink('symlink', name, dir_fd=dir_fd) + self.addCleanup(posix.unlink, fullname) + self.assertEqual(posix.readlink(fullname), 'symlink') + + @unittest.skipUnless(os.unlink in os.supports_dir_fd, "test needs dir_fd support in os.unlink()") + def test_unlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + os_helper.create_empty_file(fullname) + posix.stat(fullname) # should not raise exception + try: + posix.unlink(name, dir_fd=dir_fd) + self.assertRaises(OSError, posix.stat, fullname) + except: + self.addCleanup(posix.unlink, fullname) + raise + + @unittest.skipUnless(os.mkfifo in os.supports_dir_fd, "test needs dir_fd support in os.mkfifo()") + def test_mkfifo_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + try: + posix.mkfifo(name, stat.S_IRUSR | stat.S_IWUSR, dir_fd=dir_fd) + except PermissionError as e: + self.skipTest('posix.mkfifo(): %s' % e) + self.addCleanup(posix.unlink, fullname) + self.assertTrue(stat.S_ISFIFO(posix.stat(fullname).st_mode)) + + class PosixGroupsTester(unittest.TestCase): def setUp(self): From webhook-mailer at python.org Fri Jan 21 03:06:06 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 08:06:06 -0000 Subject: [Python-checkins] bpo-21987: Fix TarFile.getmember getting a dir with a trailing slash (GH-30283) Message-ID: https://github.com/python/cpython/commit/1d11fdd3eeff77ba600278433b7ab0ce4d2a7f3b commit: 1d11fdd3eeff77ba600278433b7ab0ce4d2a7f3b branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T00:05:57-08:00 summary: bpo-21987: Fix TarFile.getmember getting a dir with a trailing slash (GH-30283) (cherry picked from commit cfadcc31ea84617b1c73022ce54d4ae831333e8d) Co-authored-by: andrei kulakov files: A Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst M Lib/tarfile.py M 
Lib/test/test_tarfile.py diff --git a/Lib/tarfile.py b/Lib/tarfile.py index c1ee1222e09b5..e187da2b1994a 100755 --- a/Lib/tarfile.py +++ b/Lib/tarfile.py @@ -1789,7 +1789,7 @@ def getmember(self, name): than once in the archive, its last occurrence is assumed to be the most up-to-date version. """ - tarinfo = self._getmember(name) + tarinfo = self._getmember(name.rstrip('/')) if tarinfo is None: raise KeyError("filename %r not found" % name) return tarinfo diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py index e4b5c52bf1eaf..1357df57eb179 100644 --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -220,6 +220,25 @@ def test_fileobj_symlink2(self): def test_issue14160(self): self._test_fileobj_link("symtype2", "ustar/regtype") + def test_add_dir_getmember(self): + # bpo-21987 + self.add_dir_and_getmember('bar') + self.add_dir_and_getmember('a'*101) + + def add_dir_and_getmember(self, name): + with os_helper.temp_cwd(): + with tarfile.open(tmpname, 'w') as tar: + try: + os.mkdir(name) + tar.add(name) + finally: + os.rmdir(name) + with tarfile.open(tmpname) as tar: + self.assertEqual( + tar.getmember(name), + tar.getmember(name + '/') + ) + class GzipUstarReadTest(GzipTest, UstarReadTest): pass diff --git a/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst b/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst new file mode 100644 index 0000000000000..305dd16d53b49 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst @@ -0,0 +1,2 @@ +Fix an issue with :meth:`tarfile.TarFile.getmember` getting a directory name +with a trailing slash. From webhook-mailer at python.org Fri Jan 21 04:02:55 2022 From: webhook-mailer at python.org (taleinat) Date: Fri, 21 Jan 2022 09:02:55 -0000 Subject: [Python-checkins] [3.10] bpo-41857: mention timeout argument units in select.poll() and select.depoll() doc-strings (GH-22406) Message-ID: https://github.com/python/cpython/commit/f6e5972fa984c10d47694973db1c91c6486d654a commit: f6e5972fa984c10d47694973db1c91c6486d654a branch: 3.10 author: Tal Einat <532281+taleinat at users.noreply.github.com> committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-21T11:02:25+02:00 summary: [3.10] bpo-41857: mention timeout argument units in select.poll() and select.depoll() doc-strings (GH-22406) (cherry picked from commit 27df7566bc19699b967e0e30d7808637b90141f6) Co-authored-by: Zane Bitter files: M Modules/clinic/selectmodule.c.h M Modules/selectmodule.c diff --git a/Modules/clinic/selectmodule.c.h b/Modules/clinic/selectmodule.c.h index d7095dfb00ead..be752e981667c 100644 --- a/Modules/clinic/selectmodule.c.h +++ b/Modules/clinic/selectmodule.c.h @@ -193,6 +193,10 @@ PyDoc_STRVAR(select_poll_poll__doc__, "\n" "Polls the set of registered file descriptors.\n" "\n" +" timeout\n" +" The maximum time to wait in milliseconds, or else None (or a negative\n" +" value) to wait indefinitely.\n" +"\n" "Returns a list containing any descriptors that have events or errors to\n" "report, as a list of (fd, event) 2-tuples."); @@ -363,6 +367,10 @@ PyDoc_STRVAR(select_devpoll_poll__doc__, "\n" "Polls the set of registered file descriptors.\n" "\n" +" timeout\n" +" The maximum time to wait in milliseconds, or else None (or a negative\n" +" value) to wait indefinitely.\n" +"\n" "Returns a list containing any descriptors that have events or errors to\n" "report, as a list of (fd, event) 2-tuples."); @@ -1179,4 +1187,4 @@ select_kqueue_control(kqueue_queue_Object *self, PyObject 
*const *args, Py_ssize #ifndef SELECT_KQUEUE_CONTROL_METHODDEF #define SELECT_KQUEUE_CONTROL_METHODDEF #endif /* !defined(SELECT_KQUEUE_CONTROL_METHODDEF) */ -/*[clinic end generated code: output=cd2062a787e13b35 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=a8fc031269d28454 input=a9049054013a1b77]*/ diff --git a/Modules/selectmodule.c b/Modules/selectmodule.c index 3ecd0c32b3038..3afcb0e2a0220 100644 --- a/Modules/selectmodule.c +++ b/Modules/selectmodule.c @@ -564,6 +564,8 @@ select_poll_unregister_impl(pollObject *self, int fd) select.poll.poll timeout as timeout_obj: object = None + The maximum time to wait in milliseconds, or else None (or a negative + value) to wait indefinitely. / Polls the set of registered file descriptors. @@ -574,7 +576,7 @@ report, as a list of (fd, event) 2-tuples. static PyObject * select_poll_poll_impl(pollObject *self, PyObject *timeout_obj) -/*[clinic end generated code: output=876e837d193ed7e4 input=7a446ed45189e894]*/ +/*[clinic end generated code: output=876e837d193ed7e4 input=c2f6953ec45e5622]*/ { PyObject *result_list = NULL; int poll_result, i, j; @@ -888,6 +890,8 @@ select_devpoll_unregister_impl(devpollObject *self, int fd) /*[clinic input] select.devpoll.poll timeout as timeout_obj: object = None + The maximum time to wait in milliseconds, or else None (or a negative + value) to wait indefinitely. / Polls the set of registered file descriptors. @@ -898,7 +902,7 @@ report, as a list of (fd, event) 2-tuples. static PyObject * select_devpoll_poll_impl(devpollObject *self, PyObject *timeout_obj) -/*[clinic end generated code: output=2654e5457cca0b3c input=fd0db698d84f0333]*/ +/*[clinic end generated code: output=2654e5457cca0b3c input=3c3f0a355ec2bedb]*/ { struct dvpoll dvp; PyObject *result_list = NULL; From webhook-mailer at python.org Fri Jan 21 04:37:45 2022 From: webhook-mailer at python.org (taleinat) Date: Fri, 21 Jan 2022 09:37:45 -0000 Subject: [Python-checkins] [3.9] bpo-41857: mention timeout argument units in select.poll() and select.depoll() doc-strings (GH-22406) Message-ID: https://github.com/python/cpython/commit/656971e4953a70a6048170377888db5530eea0a6 commit: 656971e4953a70a6048170377888db5530eea0a6 branch: 3.9 author: Tal Einat <532281+taleinat at users.noreply.github.com> committer: taleinat <532281+taleinat at users.noreply.github.com> date: 2022-01-21T11:37:39+02:00 summary: [3.9] bpo-41857: mention timeout argument units in select.poll() and select.depoll() doc-strings (GH-22406) (cherry picked from commit 27df7566bc19699b967e0e30d7808637b90141f6) Co-authored-by: Zane Bitter files: M Modules/clinic/selectmodule.c.h M Modules/selectmodule.c diff --git a/Modules/clinic/selectmodule.c.h b/Modules/clinic/selectmodule.c.h index c1072e6ef9430..7791af8f93143 100644 --- a/Modules/clinic/selectmodule.c.h +++ b/Modules/clinic/selectmodule.c.h @@ -193,6 +193,10 @@ PyDoc_STRVAR(select_poll_poll__doc__, "\n" "Polls the set of registered file descriptors.\n" "\n" +" timeout\n" +" The maximum time to wait in milliseconds, or else None (or a negative\n" +" value) to wait indefinitely.\n" +"\n" "Returns a list containing any descriptors that have events or errors to\n" "report, as a list of (fd, event) 2-tuples."); @@ -363,6 +367,10 @@ PyDoc_STRVAR(select_devpoll_poll__doc__, "\n" "Polls the set of registered file descriptors.\n" "\n" +" timeout\n" +" The maximum time to wait in milliseconds, or else None (or a negative\n" +" value) to wait indefinitely.\n" +"\n" "Returns a list containing any descriptors that have events or 
errors to\n" "report, as a list of (fd, event) 2-tuples."); @@ -1219,4 +1227,4 @@ select_kqueue_control(kqueue_queue_Object *self, PyObject *const *args, Py_ssize #ifndef SELECT_KQUEUE_CONTROL_METHODDEF #define SELECT_KQUEUE_CONTROL_METHODDEF #endif /* !defined(SELECT_KQUEUE_CONTROL_METHODDEF) */ -/*[clinic end generated code: output=ef42c3485a8fe3a0 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=d78e30f231a926d6 input=a9049054013a1b77]*/ diff --git a/Modules/selectmodule.c b/Modules/selectmodule.c index fb71e9196f32a..d4af4f7b52db4 100644 --- a/Modules/selectmodule.c +++ b/Modules/selectmodule.c @@ -580,6 +580,8 @@ select_poll_unregister_impl(pollObject *self, int fd) select.poll.poll timeout as timeout_obj: object = None + The maximum time to wait in milliseconds, or else None (or a negative + value) to wait indefinitely. / Polls the set of registered file descriptors. @@ -590,7 +592,7 @@ report, as a list of (fd, event) 2-tuples. static PyObject * select_poll_poll_impl(pollObject *self, PyObject *timeout_obj) -/*[clinic end generated code: output=876e837d193ed7e4 input=7a446ed45189e894]*/ +/*[clinic end generated code: output=876e837d193ed7e4 input=c2f6953ec45e5622]*/ { PyObject *result_list = NULL; int poll_result, i, j; @@ -911,6 +913,8 @@ select_devpoll_unregister_impl(devpollObject *self, int fd) /*[clinic input] select.devpoll.poll timeout as timeout_obj: object = None + The maximum time to wait in milliseconds, or else None (or a negative + value) to wait indefinitely. / Polls the set of registered file descriptors. @@ -921,7 +925,7 @@ report, as a list of (fd, event) 2-tuples. static PyObject * select_devpoll_poll_impl(devpollObject *self, PyObject *timeout_obj) -/*[clinic end generated code: output=2654e5457cca0b3c input=fd0db698d84f0333]*/ +/*[clinic end generated code: output=2654e5457cca0b3c input=3c3f0a355ec2bedb]*/ { struct dvpoll dvp; PyObject *result_list = NULL; From webhook-mailer at python.org Fri Jan 21 07:05:36 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 12:05:36 -0000 Subject: [Python-checkins] bpo-46417: Call _PyDebug_PrintTotalRefs() later (GH-30744) Message-ID: https://github.com/python/cpython/commit/ea38e436fe1e585fb8c1f0badf5482f525b7f9ff commit: ea38e436fe1e585fb8c1f0badf5482f525b7f9ff branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T13:05:26+01:00 summary: bpo-46417: Call _PyDebug_PrintTotalRefs() later (GH-30744) "python -X showrefcount" now shows the total reference count after clearing and destroyed the main Python interpreter. Previously, it was shown before. Py_FinalizeEx() now calls _PyDebug_PrintTotalRefs() after finalize_interp_delete(). files: A Misc/NEWS.d/next/Core and Builtins/2022-01-21-12-24-14.bpo-46417.i3IqMf.rst M Python/pylifecycle.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-21-12-24-14.bpo-46417.i3IqMf.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-21-12-24-14.bpo-46417.i3IqMf.rst new file mode 100644 index 0000000000000..c7e2ee33500d9 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-21-12-24-14.bpo-46417.i3IqMf.rst @@ -0,0 +1,3 @@ +``python -X showrefcount`` now shows the total reference count after clearing +and destroyed the main Python interpreter. Previously, it was shown before. +Patch by Victor Stinner. 
diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 0b1f47147696d..5572f61c7288a 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -1862,12 +1862,6 @@ Py_FinalizeEx(void) /* dump hash stats */ _PyHash_Fini(); -#ifdef Py_REF_DEBUG - if (show_ref_count) { - _PyDebug_PrintTotalRefs(); - } -#endif - #ifdef Py_TRACE_REFS /* Display all objects still alive -- this can invoke arbitrary * __repr__ overrides, so requires a mostly-intact interpreter. @@ -1895,6 +1889,12 @@ Py_FinalizeEx(void) finalize_interp_clear(tstate); finalize_interp_delete(tstate->interp); +#ifdef Py_REF_DEBUG + if (show_ref_count) { + _PyDebug_PrintTotalRefs(); + } +#endif + #ifdef Py_TRACE_REFS /* Display addresses (& refcnts) of all objects still alive. * An address can be used to find the repr of the object, printed From webhook-mailer at python.org Fri Jan 21 07:06:39 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 12:06:39 -0000 Subject: [Python-checkins] bpo-46417: Py_Finalize() clears static types (GH-30743) Message-ID: https://github.com/python/cpython/commit/595225e86dcc6ea520a584839925a878dce7a9b2 commit: 595225e86dcc6ea520a584839925a878dce7a9b2 branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T13:06:34+01:00 summary: bpo-46417: Py_Finalize() clears static types (GH-30743) Add _PyTypes_FiniTypes() best-effort function to clear static types: don't deallocate a type if it still has subclasses. remove_subclass() now sets tp_subclasses to NULL when removing the last subclass. files: M Include/internal/pycore_typeobject.h M Objects/object.c M Objects/typeobject.c M Python/pylifecycle.c diff --git a/Include/internal/pycore_typeobject.h b/Include/internal/pycore_typeobject.h index ba95bbc1c4820..c480a3a57b436 100644 --- a/Include/internal/pycore_typeobject.h +++ b/Include/internal/pycore_typeobject.h @@ -13,6 +13,7 @@ extern "C" { extern PyStatus _PyTypes_InitState(PyInterpreterState *); extern PyStatus _PyTypes_InitTypes(PyInterpreterState *); +extern void _PyTypes_FiniTypes(PyInterpreterState *); extern void _PyTypes_Fini(PyInterpreterState *); diff --git a/Objects/object.c b/Objects/object.c index 124485d64ab77..dc2cba2ebccec 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -1837,6 +1837,94 @@ _PyTypes_InitState(PyInterpreterState *interp) return _PyStatus_OK(); } + +static PyTypeObject* static_types[] = { + // base types + &PyAsyncGen_Type, + &PyBool_Type, + &PyByteArrayIter_Type, + &PyByteArray_Type, + &PyCFunction_Type, + &PyCallIter_Type, + &PyCapsule_Type, + &PyCell_Type, + &PyClassMethodDescr_Type, + &PyClassMethod_Type, + &PyCode_Type, + &PyComplex_Type, + &PyCoro_Type, + &PyDictItems_Type, + &PyDictIterItem_Type, + &PyDictIterKey_Type, + &PyDictIterValue_Type, + &PyDictKeys_Type, + &PyDictProxy_Type, + &PyDictRevIterItem_Type, + &PyDictRevIterKey_Type, + &PyDictRevIterValue_Type, + &PyDictValues_Type, + &PyDict_Type, + &PyEllipsis_Type, + &PyEnum_Type, + &PyFrame_Type, + &PyFrozenSet_Type, + &PyFunction_Type, + &PyGen_Type, + &PyGetSetDescr_Type, + &PyInstanceMethod_Type, + &PyListIter_Type, + &PyListRevIter_Type, + &PyList_Type, + &PyLongRangeIter_Type, + &PyMemberDescr_Type, + &PyMemoryView_Type, + &PyMethodDescr_Type, + &PyMethod_Type, + &PyModuleDef_Type, + &PyModule_Type, + &PyODictIter_Type, + &PyPickleBuffer_Type, + &PyProperty_Type, + &PyRangeIter_Type, + &PyRange_Type, + &PyReversed_Type, + &PySTEntry_Type, + &PySeqIter_Type, + &PySetIter_Type, + &PySet_Type, + &PySlice_Type, + &PyStaticMethod_Type, + &PyStdPrinter_Type, + 
&PySuper_Type, + &PyTraceBack_Type, + &PyWrapperDescr_Type, + &Py_GenericAliasType, + &_PyAnextAwaitable_Type, + &_PyAsyncGenASend_Type, + &_PyAsyncGenAThrow_Type, + &_PyAsyncGenWrappedValue_Type, + &_PyCoroWrapper_Type, + &_PyInterpreterID_Type, + &_PyManagedBuffer_Type, + &_PyMethodWrapper_Type, + &_PyNamespace_Type, + &_PyNone_Type, + &_PyNotImplemented_Type, + &_PyUnion_Type, + &_PyWeakref_CallableProxyType, + &_PyWeakref_ProxyType, + &_PyWeakref_RefType, + + // subclasses: _PyTypes_FiniTypes() deallocates them before their base + // class + &PyCMethod_Type, // base=&PyCFunction_Type + &PyODictItems_Type, // base=&PyDictItems_Type + &PyODictKeys_Type, // base=&PyDictKeys_Type + &PyODictValues_Type, // base=&PyDictValues_Type + &PyODict_Type, // base=&PyDict_Type +}; + + PyStatus _PyTypes_InitTypes(PyInterpreterState *interp) { @@ -1858,91 +1946,44 @@ _PyTypes_InitTypes(PyInterpreterState *interp) assert(PyType_Type.tp_base == &PyBaseObject_Type); // All other static types (unless initialized elsewhere) - INIT_TYPE(PyAsyncGen_Type); - INIT_TYPE(PyBool_Type); - INIT_TYPE(PyByteArrayIter_Type); - INIT_TYPE(PyByteArray_Type); - INIT_TYPE(PyCFunction_Type); - INIT_TYPE(PyCMethod_Type); - INIT_TYPE(PyCallIter_Type); - INIT_TYPE(PyCapsule_Type); - INIT_TYPE(PyCell_Type); - INIT_TYPE(PyClassMethodDescr_Type); - INIT_TYPE(PyClassMethod_Type); - INIT_TYPE(PyCode_Type); - INIT_TYPE(PyComplex_Type); - INIT_TYPE(PyCoro_Type); - INIT_TYPE(PyDictItems_Type); - INIT_TYPE(PyDictIterItem_Type); - INIT_TYPE(PyDictIterKey_Type); - INIT_TYPE(PyDictIterValue_Type); - INIT_TYPE(PyDictKeys_Type); - INIT_TYPE(PyDictProxy_Type); - INIT_TYPE(PyDictRevIterItem_Type); - INIT_TYPE(PyDictRevIterKey_Type); - INIT_TYPE(PyDictRevIterValue_Type); - INIT_TYPE(PyDictValues_Type); - INIT_TYPE(PyDict_Type); - INIT_TYPE(PyEllipsis_Type); - INIT_TYPE(PyEnum_Type); - INIT_TYPE(PyFrame_Type); - INIT_TYPE(PyFrozenSet_Type); - INIT_TYPE(PyFunction_Type); - INIT_TYPE(PyGen_Type); - INIT_TYPE(PyGetSetDescr_Type); - INIT_TYPE(PyInstanceMethod_Type); - INIT_TYPE(PyListIter_Type); - INIT_TYPE(PyListRevIter_Type); - INIT_TYPE(PyList_Type); - INIT_TYPE(PyLongRangeIter_Type); - INIT_TYPE(PyMemberDescr_Type); - INIT_TYPE(PyMemoryView_Type); - INIT_TYPE(PyMethodDescr_Type); - INIT_TYPE(PyMethod_Type); - INIT_TYPE(PyModuleDef_Type); - INIT_TYPE(PyModule_Type); - INIT_TYPE(PyODictItems_Type); - INIT_TYPE(PyODictIter_Type); - INIT_TYPE(PyODictKeys_Type); - INIT_TYPE(PyODictValues_Type); - INIT_TYPE(PyODict_Type); - INIT_TYPE(PyPickleBuffer_Type); - INIT_TYPE(PyProperty_Type); - INIT_TYPE(PyRangeIter_Type); - INIT_TYPE(PyRange_Type); - INIT_TYPE(PyReversed_Type); - INIT_TYPE(PySTEntry_Type); - INIT_TYPE(PySeqIter_Type); - INIT_TYPE(PySetIter_Type); - INIT_TYPE(PySet_Type); - INIT_TYPE(PySlice_Type); - INIT_TYPE(PyStaticMethod_Type); - INIT_TYPE(PyStdPrinter_Type); - INIT_TYPE(PySuper_Type); - INIT_TYPE(PyTraceBack_Type); - INIT_TYPE(PyWrapperDescr_Type); - INIT_TYPE(Py_GenericAliasType); - INIT_TYPE(_PyAnextAwaitable_Type); - INIT_TYPE(_PyAsyncGenASend_Type); - INIT_TYPE(_PyAsyncGenAThrow_Type); - INIT_TYPE(_PyAsyncGenWrappedValue_Type); - INIT_TYPE(_PyCoroWrapper_Type); - INIT_TYPE(_PyInterpreterID_Type); - INIT_TYPE(_PyManagedBuffer_Type); - INIT_TYPE(_PyMethodWrapper_Type); - INIT_TYPE(_PyNamespace_Type); - INIT_TYPE(_PyNone_Type); - INIT_TYPE(_PyNotImplemented_Type); - INIT_TYPE(_PyWeakref_CallableProxyType); - INIT_TYPE(_PyWeakref_ProxyType); - INIT_TYPE(_PyWeakref_RefType); - INIT_TYPE(_PyUnion_Type); + for (size_t i=0; i < 
Py_ARRAY_LENGTH(static_types); i++) { + PyTypeObject *type = static_types[i]; + if (PyType_Ready(type) < 0) { + return _PyStatus_ERR("Can't initialize types"); + } + } return _PyStatus_OK(); #undef INIT_TYPE } +// Best-effort function clearing static types. +// +// Don't deallocate a type if it still has subclasses. If a Py_Finalize() +// sub-function is interrupted by CTRL+C or fails with MemoryError, some +// subclasses are not cleared properly. Leave the static type unchanged in this +// case. +void +_PyTypes_FiniTypes(PyInterpreterState *interp) +{ + if (!_Py_IsMainInterpreter(interp)) { + return; + } + + // Deallocate types in the reverse order to deallocate subclasses before + // their base classes. + for (Py_ssize_t i=Py_ARRAY_LENGTH(static_types)-1; i>=0; i--) { + PyTypeObject *type = static_types[i]; + // Cannot delete a type if it still has subclasses + if (type->tp_subclasses != NULL) { + continue; + } + _PyStaticType_Dealloc(type); + } +} + + void _Py_NewReference(PyObject *op) { diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 66a10a5bc57dd..97a9a65c36b0e 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -4071,6 +4071,18 @@ extern void _PyDictKeys_DecRef(PyDictKeysObject *keys); +static void +type_dealloc_common(PyTypeObject *type) +{ + PyObject *tp, *val, *tb; + PyErr_Fetch(&tp, &val, &tb); + remove_all_subclasses(type, type->tp_bases); + PyErr_Restore(tp, val, tb); + + PyObject_ClearWeakRefs((PyObject *)type); +} + + void _PyStaticType_Dealloc(PyTypeObject *type) { @@ -4079,11 +4091,14 @@ _PyStaticType_Dealloc(PyTypeObject *type) // and a type must no longer be used once it's deallocated. assert(type->tp_subclasses == NULL); + type_dealloc_common(type); + Py_CLEAR(type->tp_dict); Py_CLEAR(type->tp_bases); Py_CLEAR(type->tp_mro); Py_CLEAR(type->tp_cache); Py_CLEAR(type->tp_subclasses); + type->tp_flags &= ~Py_TPFLAGS_READY; } @@ -4091,22 +4106,19 @@ _PyStaticType_Dealloc(PyTypeObject *type) static void type_dealloc(PyTypeObject *type) { - PyObject *tp, *val, *tb; - /* Assert this is a heap-allocated type object */ _PyObject_ASSERT((PyObject *)type, type->tp_flags & Py_TPFLAGS_HEAPTYPE); _PyObject_GC_UNTRACK(type); - PyErr_Fetch(&tp, &val, &tb); - remove_all_subclasses(type, type->tp_bases); - PyErr_Restore(tp, val, tb); - PyObject_ClearWeakRefs((PyObject *)type); + type_dealloc_common(type); + Py_XDECREF(type->tp_base); Py_XDECREF(type->tp_dict); Py_XDECREF(type->tp_bases); Py_XDECREF(type->tp_mro); Py_XDECREF(type->tp_cache); Py_XDECREF(type->tp_subclasses); + /* A type's tp_doc is heap allocated, unlike the tp_doc slots * of most other objects. It's okay to cast it to char *. */ @@ -6541,6 +6553,10 @@ remove_subclass(PyTypeObject *base, PyTypeObject *type) PyErr_Clear(); } Py_XDECREF(key); + + if (PyDict_Size(dict) == 0) { + Py_CLEAR(base->tp_subclasses); + } } static void diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 5572f61c7288a..662e578818349 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -1676,6 +1676,7 @@ finalize_interp_types(PyInterpreterState *interp) _PyThread_FiniType(interp); _PyErr_FiniTypes(interp); _PyTypes_Fini(interp); + _PyTypes_FiniTypes(interp); // Call _PyUnicode_ClearInterned() before _PyDict_Fini() since it uses // a dict internally. From webhook-mailer at python.org Fri Jan 21 07:45:53 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 12:45:53 -0000 Subject: [Python-checkins] no-issue: Fix documentation typos. 
(GH-30576) Message-ID: https://github.com/python/cpython/commit/68a31dba975419b7b4432fa31730e5ca67071d9f commit: 68a31dba975419b7b4432fa31730e5ca67071d9f branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T04:45:42-08:00 summary: no-issue: Fix documentation typos. (GH-30576) (cherry picked from commit d05a66339b5e07d72d96e4c30a34cc3821bb61a2) Co-authored-by: Piotr Fusik files: M Doc/c-api/init_config.rst M Doc/howto/descriptor.rst diff --git a/Doc/c-api/init_config.rst b/Doc/c-api/init_config.rst index c037f19ce64f3..b8b41510b89ec 100644 --- a/Doc/c-api/init_config.rst +++ b/Doc/c-api/init_config.rst @@ -634,7 +634,7 @@ PyConfig .. c:member:: int dump_refs - Dump Python refererences? + Dump Python references? If non-zero, dump all objects which are still alive at exit. diff --git a/Doc/howto/descriptor.rst b/Doc/howto/descriptor.rst index 6ce062d0fa853..f8b1e00d96fad 100644 --- a/Doc/howto/descriptor.rst +++ b/Doc/howto/descriptor.rst @@ -1544,7 +1544,7 @@ variables: 'Simulate how the type metaclass adds member objects for slots' def __new__(mcls, clsname, bases, mapping): - 'Emuluate type_new() in Objects/typeobject.c' + 'Emulate type_new() in Objects/typeobject.c' # type_new() calls PyTypeReady() which calls add_methods() slot_names = mapping.get('slot_names', []) for offset, name in enumerate(slot_names): From webhook-mailer at python.org Fri Jan 21 10:45:19 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 15:45:19 -0000 Subject: [Python-checkins] bpo-46417: Revert remove_subclass() change (GH-30750) Message-ID: https://github.com/python/cpython/commit/fda88864980ffce57add0ea03fb9cbda2798975e commit: fda88864980ffce57add0ea03fb9cbda2798975e branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T16:45:14+01:00 summary: bpo-46417: Revert remove_subclass() change (GH-30750) remove_subclass() doesn't clear the tp_subclasses dict if the dict becomes empty. files: M Objects/typeobject.c diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 97a9a65c36b0e..34a9817a3178e 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -6553,10 +6553,6 @@ remove_subclass(PyTypeObject *base, PyTypeObject *type) PyErr_Clear(); } Py_XDECREF(key); - - if (PyDict_Size(dict) == 0) { - Py_CLEAR(base->tp_subclasses); - } } static void From webhook-mailer at python.org Fri Jan 21 11:53:20 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 16:53:20 -0000 Subject: [Python-checkins] bpo-46417: Add missing types of _PyTypes_InitTypes() (GH-30749) Message-ID: https://github.com/python/cpython/commit/a1bf329bca80a0259da454c936075e11e6af710f commit: a1bf329bca80a0259da454c936075e11e6af710f branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T17:53:13+01:00 summary: bpo-46417: Add missing types of _PyTypes_InitTypes() (GH-30749) Add types removed by mistake by the commit adding _PyTypes_FiniTypes(). Move also PyBool_Type at the end, since it depends on PyLong_Type. PyBytes_Type and PyUnicode_Type no longer depend explicitly on PyBaseObject_Type: it's the default of PyType_Ready(). 
files: M Objects/bytesobject.c M Objects/object.c M Objects/unicodeobject.c diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c index 85d6912ca751fc..b6edfb9acb2dbb 100644 --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -2904,7 +2904,7 @@ PyTypeObject PyBytes_Type = { bytes_methods, /* tp_methods */ 0, /* tp_members */ 0, /* tp_getset */ - &PyBaseObject_Type, /* tp_base */ + 0, /* tp_base */ 0, /* tp_dict */ 0, /* tp_descr_get */ 0, /* tp_descr_set */ diff --git a/Objects/object.c b/Objects/object.c index dc2cba2ebccec0..a5ee8eef4a3b49 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -1841,9 +1841,10 @@ _PyTypes_InitState(PyInterpreterState *interp) static PyTypeObject* static_types[] = { // base types &PyAsyncGen_Type, - &PyBool_Type, &PyByteArrayIter_Type, &PyByteArray_Type, + &PyBytesIter_Type, + &PyBytes_Type, &PyCFunction_Type, &PyCallIter_Type, &PyCapsule_Type, @@ -1866,6 +1867,7 @@ static PyTypeObject* static_types[] = { &PyDict_Type, &PyEllipsis_Type, &PyEnum_Type, + &PyFloat_Type, &PyFrame_Type, &PyFrozenSet_Type, &PyFunction_Type, @@ -1876,6 +1878,7 @@ static PyTypeObject* static_types[] = { &PyListRevIter_Type, &PyList_Type, &PyLongRangeIter_Type, + &PyLong_Type, &PyMemberDescr_Type, &PyMemoryView_Type, &PyMethodDescr_Type, @@ -1897,6 +1900,10 @@ static PyTypeObject* static_types[] = { &PyStdPrinter_Type, &PySuper_Type, &PyTraceBack_Type, + &PyTupleIter_Type, + &PyTuple_Type, + &PyUnicodeIter_Type, + &PyUnicode_Type, &PyWrapperDescr_Type, &Py_GenericAliasType, &_PyAnextAwaitable_Type, @@ -1917,6 +1924,7 @@ static PyTypeObject* static_types[] = { // subclasses: _PyTypes_FiniTypes() deallocates them before their base // class + &PyBool_Type, // base=&PyLong_Type &PyCMethod_Type, // base=&PyCFunction_Type &PyODictItems_Type, // base=&PyDictItems_Type &PyODictKeys_Type, // base=&PyDictKeys_Type diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index 31b8710defbea6..2e1f8a6ac4e565 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -15511,7 +15511,7 @@ PyTypeObject PyUnicode_Type = { unicode_methods, /* tp_methods */ 0, /* tp_members */ 0, /* tp_getset */ - &PyBaseObject_Type, /* tp_base */ + 0, /* tp_base */ 0, /* tp_dict */ 0, /* tp_descr_get */ 0, /* tp_descr_set */ From webhook-mailer at python.org Fri Jan 21 12:01:08 2022 From: webhook-mailer at python.org (zooba) Date: Fri, 21 Jan 2022 17:01:08 -0000 Subject: [Python-checkins] bpo-46434: Handle missing docstrings in pdb help (GH-30705) Message-ID: https://github.com/python/cpython/commit/60705cff70576482fea31dcafbf8a37cbb751ea5 commit: 60705cff70576482fea31dcafbf8a37cbb751ea5 branch: main author: Tom Sparrow <793763+sparrowt at users.noreply.github.com> committer: zooba date: 2022-01-21T17:00:48Z summary: bpo-46434: Handle missing docstrings in pdb help (GH-30705) files: A Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst M Lib/pdb.py M Lib/test/test_pdb.py M Misc/ACKS diff --git a/Lib/pdb.py b/Lib/pdb.py index d7110074538ac..58bc720275a28 100755 --- a/Lib/pdb.py +++ b/Lib/pdb.py @@ -1577,6 +1577,9 @@ def do_help(self, arg): self.error('No help for %r; please do not run Python with -OO ' 'if you need command help' % arg) return + if command.__doc__ is None: + self.error('No help for %r; __doc__ string missing' % arg) + return self.message(command.__doc__.rstrip()) do_h = do_help diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py index 01263db28f18c..d2bf3dc90ed23 100644 --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -1474,6 +1474,27 @@ def 
test_issue7964(self): self.assertNotIn(b'SyntaxError', stdout, "Got a syntax error running test script under PDB") + def test_issue46434(self): + # Temporarily patch in an extra help command which doesn't have a + # docstring to emulate what happens in an embeddable distribution + script = """ + def do_testcmdwithnodocs(self, arg): + pass + + import pdb + pdb.Pdb.do_testcmdwithnodocs = do_testcmdwithnodocs + """ + commands = """ + continue + help testcmdwithnodocs + """ + stdout, stderr = self.run_pdb_script(script, commands) + output = (stdout or '') + (stderr or '') + self.assertNotIn('AttributeError', output, + 'Calling help on a command with no docs should be handled gracefully') + self.assertIn("*** No help for 'testcmdwithnodocs'; __doc__ string missing", output, + 'Calling help on a command with no docs should print an error') + def test_issue13183(self): script = """ from bar import bar diff --git a/Misc/ACKS b/Misc/ACKS index 04d6a651489bb..cf023c9af927a 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1675,6 +1675,7 @@ Evgeny Sologubov Cody Somerville Anthony Sottile Edoardo Spadolini +Tom Sparrow Geoffrey Spear Clay Spence Stefan Sperling diff --git a/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst b/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst new file mode 100644 index 0000000000000..6000781fa5aea --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst @@ -0,0 +1,2 @@ +:mod:`pdb` now gracefully handles ``help`` when :attr:`__doc__` is missing, +for example when run with pregenerated optimized ``.pyc`` files. From webhook-mailer at python.org Fri Jan 21 12:31:35 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Fri, 21 Jan 2022 17:31:35 -0000 Subject: [Python-checkins] bpo-46426: Improve tests for the dir_fd argument (GH-30668) (GH-30739) Message-ID: https://github.com/python/cpython/commit/a1015c6478e8cbec2ecb984a3cba733783d168b5 commit: a1015c6478e8cbec2ecb984a3cba733783d168b5 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: serhiy-storchaka date: 2022-01-21T19:31:25+02:00 summary: bpo-46426: Improve tests for the dir_fd argument (GH-30668) (GH-30739) Ensure that directory file descriptors refer to directories different from the current directory, and that src_dir_fd and dst_dir_fd refer to different directories. Add context manager open_dir_fd() in test.support.os_helper. 
(cherry picked from commit 54610bb448a9cf5be77d53b66169fca4c11be6cb) Co-authored-by: Serhiy Storchaka files: M Lib/test/support/os_helper.py M Lib/test/test_os.py M Lib/test/test_posix.py diff --git a/Lib/test/support/os_helper.py b/Lib/test/support/os_helper.py index d9807a1e114b6..82a6de789c866 100644 --- a/Lib/test/support/os_helper.py +++ b/Lib/test/support/os_helper.py @@ -455,6 +455,17 @@ def create_empty_file(filename): os.close(fd) + at contextlib.contextmanager +def open_dir_fd(path): + """Open a file descriptor to a directory.""" + assert os.path.isdir(path) + dir_fd = os.open(path, os.O_RDONLY) + try: + yield dir_fd + finally: + os.close(dir_fd) + + def fs_is_case_insensitive(directory): """Detects if the file system for the specified directory is case-insensitive.""" diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py index 0ad13d59ded37..7f7d14ef0a095 100644 --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -848,12 +848,9 @@ def set_time(filename, ns): def test_utime_dir_fd(self): def set_time(filename, ns): dirname, name = os.path.split(filename) - dirfd = os.open(dirname, os.O_RDONLY) - try: + with os_helper.open_dir_fd(dirname) as dirfd: # pass dir_fd to test utimensat(timespec) or futimesat(timeval) os.utime(name, dir_fd=dirfd, ns=ns) - finally: - os.close(dirfd) self._test_utime(set_time) def test_utime_directory(self): @@ -4339,8 +4336,7 @@ def test_fd(self): os.symlink('file.txt', os.path.join(self.path, 'link')) expected_names.append('link') - fd = os.open(self.path, os.O_RDONLY) - try: + with os_helper.open_dir_fd(self.path) as fd: with os.scandir(fd) as it: entries = list(it) names = [entry.name for entry in entries] @@ -4355,8 +4351,6 @@ def test_fd(self): self.assertEqual(entry.stat(), st) st = os.stat(entry.name, dir_fd=fd, follow_symlinks=False) self.assertEqual(entry.stat(follow_symlinks=False), st) - finally: - os.close(fd) def test_empty_path(self): self.assertRaises(FileNotFoundError, os.scandir, '') diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index 56b72f465c1c0..974edd766cc80 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -21,6 +21,7 @@ import unittest import warnings import textwrap +from contextlib import contextmanager _DUMMY_SYMLINK = os.path.join(tempfile.gettempdir(), os_helper.TESTFN + '-dummy-symlink') @@ -1081,187 +1082,6 @@ def test_getgroups(self): symdiff = idg_groups.symmetric_difference(posix.getgroups()) self.assertTrue(not symdiff or symdiff == {posix.getegid()}) - # tests for the posix *at functions follow - - @unittest.skipUnless(os.access in os.supports_dir_fd, "test needs dir_fd support for os.access()") - def test_access_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - self.assertTrue(posix.access(os_helper.TESTFN, os.R_OK, dir_fd=f)) - finally: - posix.close(f) - - @unittest.skipUnless(os.chmod in os.supports_dir_fd, "test needs dir_fd support in os.chmod()") - def test_chmod_dir_fd(self): - os.chmod(os_helper.TESTFN, stat.S_IRUSR) - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.chmod(os_helper.TESTFN, stat.S_IRUSR | stat.S_IWUSR, dir_fd=f) - - s = posix.stat(os_helper.TESTFN) - self.assertEqual(s[0] & stat.S_IRWXU, stat.S_IRUSR | stat.S_IWUSR) - finally: - posix.close(f) - - @unittest.skipUnless(hasattr(os, 'chown') and (os.chown in os.supports_dir_fd), - "test needs dir_fd support in os.chown()") - def test_chown_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - os_helper.create_empty_file(os_helper.TESTFN) - - f = posix.open(posix.getcwd(), 
posix.O_RDONLY) - try: - posix.chown(os_helper.TESTFN, os.getuid(), os.getgid(), dir_fd=f) - finally: - posix.close(f) - - @unittest.skipUnless(os.stat in os.supports_dir_fd, "test needs dir_fd support in os.stat()") - def test_stat_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - with open(os_helper.TESTFN, 'w') as outfile: - outfile.write("testline\n") - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - s1 = posix.stat(os_helper.TESTFN) - s2 = posix.stat(os_helper.TESTFN, dir_fd=f) - self.assertEqual(s1, s2) - s2 = posix.stat(os_helper.TESTFN, dir_fd=None) - self.assertEqual(s1, s2) - self.assertRaisesRegex(TypeError, 'should be integer or None, not', - posix.stat, os_helper.TESTFN, dir_fd=posix.getcwd()) - self.assertRaisesRegex(TypeError, 'should be integer or None, not', - posix.stat, os_helper.TESTFN, dir_fd=float(f)) - self.assertRaises(OverflowError, - posix.stat, os_helper.TESTFN, dir_fd=10**20) - finally: - posix.close(f) - - @unittest.skipUnless(os.utime in os.supports_dir_fd, "test needs dir_fd support in os.utime()") - def test_utime_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - now = time.time() - posix.utime(os_helper.TESTFN, None, dir_fd=f) - posix.utime(os_helper.TESTFN, dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - now, dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (None, None), dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (now, None), dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (None, now), dir_fd=f) - self.assertRaises(TypeError, posix.utime, os_helper.TESTFN, - (now, "x"), dir_fd=f) - posix.utime(os_helper.TESTFN, (int(now), int(now)), dir_fd=f) - posix.utime(os_helper.TESTFN, (now, now), dir_fd=f) - posix.utime(os_helper.TESTFN, - (int(now), int((now - int(now)) * 1e9)), dir_fd=f) - posix.utime(os_helper.TESTFN, dir_fd=f, - times=(int(now), int((now - int(now)) * 1e9))) - - # try dir_fd and follow_symlinks together - if os.utime in os.supports_follow_symlinks: - try: - posix.utime(os_helper.TESTFN, follow_symlinks=False, - dir_fd=f) - except ValueError: - # whoops! using both together not supported on this platform. - pass - - finally: - posix.close(f) - - @unittest.skipUnless(os.link in os.supports_dir_fd, "test needs dir_fd support in os.link()") - def test_link_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.link(os_helper.TESTFN, os_helper.TESTFN + 'link', - src_dir_fd=f, dst_dir_fd=f) - except PermissionError as e: - self.skipTest('posix.link(): %s' % e) - else: - # should have same inodes - self.assertEqual(posix.stat(os_helper.TESTFN)[1], - posix.stat(os_helper.TESTFN + 'link')[1]) - finally: - posix.close(f) - os_helper.unlink(os_helper.TESTFN + 'link') - - @unittest.skipUnless(os.mkdir in os.supports_dir_fd, "test needs dir_fd support in os.mkdir()") - def test_mkdir_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.mkdir(os_helper.TESTFN + 'dir', dir_fd=f) - posix.stat(os_helper.TESTFN + 'dir') # should not raise exception - finally: - posix.close(f) - os_helper.rmtree(os_helper.TESTFN + 'dir') - - @unittest.skipUnless(hasattr(os, 'mknod') - and (os.mknod in os.supports_dir_fd) - and hasattr(stat, 'S_IFIFO'), - "test requires both stat.S_IFIFO and dir_fd support for os.mknod()") - def test_mknod_dir_fd(self): - # Test using mknodat() to create a FIFO (the only use specified - # by POSIX). 
- os_helper.unlink(os_helper.TESTFN) - mode = stat.S_IFIFO | stat.S_IRUSR | stat.S_IWUSR - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.mknod(os_helper.TESTFN, mode, 0, dir_fd=f) - except OSError as e: - # Some old systems don't allow unprivileged users to use - # mknod(), or only support creating device nodes. - self.assertIn(e.errno, (errno.EPERM, errno.EINVAL, errno.EACCES)) - else: - self.assertTrue(stat.S_ISFIFO(posix.stat(os_helper.TESTFN).st_mode)) - finally: - posix.close(f) - - @unittest.skipUnless(os.open in os.supports_dir_fd, "test needs dir_fd support in os.open()") - def test_open_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - with open(os_helper.TESTFN, 'w') as outfile: - outfile.write("testline\n") - a = posix.open(posix.getcwd(), posix.O_RDONLY) - b = posix.open(os_helper.TESTFN, posix.O_RDONLY, dir_fd=a) - try: - res = posix.read(b, 9).decode(encoding="utf-8") - self.assertEqual("testline\n", res) - finally: - posix.close(a) - posix.close(b) - - @unittest.skipUnless(hasattr(os, 'readlink') and (os.readlink in os.supports_dir_fd), - "test needs dir_fd support in os.readlink()") - def test_readlink_dir_fd(self): - os.symlink(os_helper.TESTFN, os_helper.TESTFN + 'link') - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - self.assertEqual(posix.readlink(os_helper.TESTFN + 'link'), - posix.readlink(os_helper.TESTFN + 'link', dir_fd=f)) - finally: - os_helper.unlink(os_helper.TESTFN + 'link') - posix.close(f) - - @unittest.skipUnless(os.rename in os.supports_dir_fd, "test needs dir_fd support in os.rename()") - def test_rename_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - os_helper.create_empty_file(os_helper.TESTFN + 'ren') - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.rename(os_helper.TESTFN + 'ren', os_helper.TESTFN, src_dir_fd=f, dst_dir_fd=f) - except: - posix.rename(os_helper.TESTFN + 'ren', os_helper.TESTFN) - raise - else: - posix.stat(os_helper.TESTFN) # should not raise exception - finally: - posix.close(f) - @unittest.skipUnless(hasattr(signal, 'SIGCHLD'), 'CLD_XXXX be placed in si_code for a SIGCHLD signal') @unittest.skipUnless(hasattr(os, 'waitid_result'), "test needs os.waitid_result") def test_cld_xxxx_constants(self): @@ -1272,47 +1092,6 @@ def test_cld_xxxx_constants(self): os.CLD_STOPPED os.CLD_CONTINUED - @unittest.skipUnless(os.symlink in os.supports_dir_fd, "test needs dir_fd support in os.symlink()") - def test_symlink_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.symlink(os_helper.TESTFN, os_helper.TESTFN + 'link', - dir_fd=f) - self.assertEqual(posix.readlink(os_helper.TESTFN + 'link'), - os_helper.TESTFN) - finally: - posix.close(f) - os_helper.unlink(os_helper.TESTFN + 'link') - - @unittest.skipUnless(os.unlink in os.supports_dir_fd, "test needs dir_fd support in os.unlink()") - def test_unlink_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - os_helper.create_empty_file(os_helper.TESTFN + 'del') - posix.stat(os_helper.TESTFN + 'del') # should not raise exception - try: - posix.unlink(os_helper.TESTFN + 'del', dir_fd=f) - except: - os_helper.unlink(os_helper.TESTFN + 'del') - raise - else: - self.assertRaises(OSError, posix.stat, os_helper.TESTFN + 'link') - finally: - posix.close(f) - - @unittest.skipUnless(os.mkfifo in os.supports_dir_fd, "test needs dir_fd support in os.mkfifo()") - def test_mkfifo_dir_fd(self): - os_helper.unlink(os_helper.TESTFN) - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - try: - posix.mkfifo(os_helper.TESTFN, - 
stat.S_IRUSR | stat.S_IWUSR, dir_fd=f) - except PermissionError as e: - self.skipTest('posix.mkfifo(): %s' % e) - self.assertTrue(stat.S_ISFIFO(posix.stat(os_helper.TESTFN).st_mode)) - finally: - posix.close(f) - requires_sched_h = unittest.skipUnless(hasattr(posix, 'sched_yield'), "don't have scheduling support") requires_sched_affinity = unittest.skipUnless(hasattr(posix, 'sched_setaffinity'), @@ -1519,6 +1298,200 @@ def test_pidfd_open(self): self.assertEqual(cm.exception.errno, errno.EINVAL) os.close(os.pidfd_open(os.getpid(), 0)) + +# tests for the posix *at functions follow +class TestPosixDirFd(unittest.TestCase): + count = 0 + + @contextmanager + def prepare(self): + TestPosixDirFd.count += 1 + name = f'{os_helper.TESTFN}_{self.count}' + base_dir = f'{os_helper.TESTFN}_{self.count}base' + posix.mkdir(base_dir) + self.addCleanup(posix.rmdir, base_dir) + fullname = os.path.join(base_dir, name) + assert not os.path.exists(fullname) + with os_helper.open_dir_fd(base_dir) as dir_fd: + yield (dir_fd, name, fullname) + + @contextmanager + def prepare_file(self): + with self.prepare() as (dir_fd, name, fullname): + os_helper.create_empty_file(fullname) + self.addCleanup(posix.unlink, fullname) + yield (dir_fd, name, fullname) + + @unittest.skipUnless(os.access in os.supports_dir_fd, "test needs dir_fd support for os.access()") + def test_access_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + self.assertTrue(posix.access(name, os.R_OK, dir_fd=dir_fd)) + + @unittest.skipUnless(os.chmod in os.supports_dir_fd, "test needs dir_fd support in os.chmod()") + def test_chmod_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + posix.chmod(fullname, stat.S_IRUSR) + posix.chmod(name, stat.S_IRUSR | stat.S_IWUSR, dir_fd=dir_fd) + s = posix.stat(fullname) + self.assertEqual(s.st_mode & stat.S_IRWXU, + stat.S_IRUSR | stat.S_IWUSR) + + @unittest.skipUnless(hasattr(os, 'chown') and (os.chown in os.supports_dir_fd), + "test needs dir_fd support in os.chown()") + def test_chown_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + posix.chown(name, os.getuid(), os.getgid(), dir_fd=dir_fd) + + @unittest.skipUnless(os.stat in os.supports_dir_fd, "test needs dir_fd support in os.stat()") + def test_stat_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + with open(fullname, 'w') as outfile: + outfile.write("testline\n") + self.addCleanup(posix.unlink, fullname) + + s1 = posix.stat(fullname) + s2 = posix.stat(name, dir_fd=dir_fd) + self.assertEqual(s1, s2) + s2 = posix.stat(fullname, dir_fd=None) + self.assertEqual(s1, s2) + + self.assertRaisesRegex(TypeError, 'should be integer or None, not', + posix.stat, name, dir_fd=posix.getcwd()) + self.assertRaisesRegex(TypeError, 'should be integer or None, not', + posix.stat, name, dir_fd=float(dir_fd)) + self.assertRaises(OverflowError, + posix.stat, name, dir_fd=10**20) + + @unittest.skipUnless(os.utime in os.supports_dir_fd, "test needs dir_fd support in os.utime()") + def test_utime_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + now = time.time() + posix.utime(name, None, dir_fd=dir_fd) + posix.utime(name, dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + now, dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (None, None), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (now, None), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (None, now), dir_fd=dir_fd) + self.assertRaises(TypeError, 
posix.utime, name, + (now, "x"), dir_fd=dir_fd) + posix.utime(name, (int(now), int(now)), dir_fd=dir_fd) + posix.utime(name, (now, now), dir_fd=dir_fd) + posix.utime(name, + (int(now), int((now - int(now)) * 1e9)), dir_fd=dir_fd) + posix.utime(name, dir_fd=dir_fd, + times=(int(now), int((now - int(now)) * 1e9))) + + # try dir_fd and follow_symlinks together + if os.utime in os.supports_follow_symlinks: + try: + posix.utime(name, follow_symlinks=False, dir_fd=dir_fd) + except ValueError: + # whoops! using both together not supported on this platform. + pass + + @unittest.skipUnless(os.link in os.supports_dir_fd, "test needs dir_fd support in os.link()") + def test_link_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname), \ + self.prepare() as (dir_fd2, linkname, fulllinkname): + try: + posix.link(name, linkname, src_dir_fd=dir_fd, dst_dir_fd=dir_fd2) + except PermissionError as e: + self.skipTest('posix.link(): %s' % e) + self.addCleanup(posix.unlink, fulllinkname) + # should have same inodes + self.assertEqual(posix.stat(fullname)[1], + posix.stat(fulllinkname)[1]) + + @unittest.skipUnless(os.mkdir in os.supports_dir_fd, "test needs dir_fd support in os.mkdir()") + def test_mkdir_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + posix.mkdir(name, dir_fd=dir_fd) + self.addCleanup(posix.rmdir, fullname) + posix.stat(fullname) # should not raise exception + + @unittest.skipUnless(hasattr(os, 'mknod') + and (os.mknod in os.supports_dir_fd) + and hasattr(stat, 'S_IFIFO'), + "test requires both stat.S_IFIFO and dir_fd support for os.mknod()") + def test_mknod_dir_fd(self): + # Test using mknodat() to create a FIFO (the only use specified + # by POSIX). + with self.prepare() as (dir_fd, name, fullname): + mode = stat.S_IFIFO | stat.S_IRUSR | stat.S_IWUSR + try: + posix.mknod(name, mode, 0, dir_fd=dir_fd) + except OSError as e: + # Some old systems don't allow unprivileged users to use + # mknod(), or only support creating device nodes. 
+ self.assertIn(e.errno, (errno.EPERM, errno.EINVAL, errno.EACCES)) + else: + self.addCleanup(posix.unlink, fullname) + self.assertTrue(stat.S_ISFIFO(posix.stat(fullname).st_mode)) + + @unittest.skipUnless(os.open in os.supports_dir_fd, "test needs dir_fd support in os.open()") + def test_open_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + with open(fullname, 'wb') as outfile: + outfile.write(b"testline\n") + self.addCleanup(posix.unlink, fullname) + fd = posix.open(name, posix.O_RDONLY, dir_fd=dir_fd) + try: + res = posix.read(fd, 9) + self.assertEqual(b"testline\n", res) + finally: + posix.close(fd) + + @unittest.skipUnless(hasattr(os, 'readlink') and (os.readlink in os.supports_dir_fd), + "test needs dir_fd support in os.readlink()") + def test_readlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + os.symlink('symlink', fullname) + self.addCleanup(posix.unlink, fullname) + self.assertEqual(posix.readlink(name, dir_fd=dir_fd), 'symlink') + + @unittest.skipUnless(os.rename in os.supports_dir_fd, "test needs dir_fd support in os.rename()") + def test_rename_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname), \ + self.prepare() as (dir_fd2, name2, fullname2): + posix.rename(name, name2, + src_dir_fd=dir_fd, dst_dir_fd=dir_fd2) + posix.stat(fullname2) # should not raise exception + posix.rename(fullname2, fullname) + + @unittest.skipUnless(os.symlink in os.supports_dir_fd, "test needs dir_fd support in os.symlink()") + def test_symlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + posix.symlink('symlink', name, dir_fd=dir_fd) + self.addCleanup(posix.unlink, fullname) + self.assertEqual(posix.readlink(fullname), 'symlink') + + @unittest.skipUnless(os.unlink in os.supports_dir_fd, "test needs dir_fd support in os.unlink()") + def test_unlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + os_helper.create_empty_file(fullname) + posix.stat(fullname) # should not raise exception + try: + posix.unlink(name, dir_fd=dir_fd) + self.assertRaises(OSError, posix.stat, fullname) + except: + self.addCleanup(posix.unlink, fullname) + raise + + @unittest.skipUnless(os.mkfifo in os.supports_dir_fd, "test needs dir_fd support in os.mkfifo()") + def test_mkfifo_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + try: + posix.mkfifo(name, stat.S_IRUSR | stat.S_IWUSR, dir_fd=dir_fd) + except PermissionError as e: + self.skipTest('posix.mkfifo(): %s' % e) + self.addCleanup(posix.unlink, fullname) + self.assertTrue(stat.S_ISFIFO(posix.stat(fullname).st_mode)) + + class PosixGroupsTester(unittest.TestCase): def setUp(self): From webhook-mailer at python.org Fri Jan 21 12:33:29 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 17:33:29 -0000 Subject: [Python-checkins] bpo-46434: Handle missing docstrings in pdb help (GH-30705) Message-ID: https://github.com/python/cpython/commit/c3ad850b57f92bd7c5515616b59afbd9e1c79538 commit: c3ad850b57f92bd7c5515616b59afbd9e1c79538 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T09:33:25-08:00 summary: bpo-46434: Handle missing docstrings in pdb help (GH-30705) (cherry picked from commit 60705cff70576482fea31dcafbf8a37cbb751ea5) Co-authored-by: Tom Sparrow <793763+sparrowt at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst M Lib/pdb.py M 
Lib/test/test_pdb.py M Misc/ACKS diff --git a/Lib/pdb.py b/Lib/pdb.py index 943211158ac41..7ab50b4845d3e 100755 --- a/Lib/pdb.py +++ b/Lib/pdb.py @@ -1493,6 +1493,9 @@ def do_help(self, arg): self.error('No help for %r; please do not run Python with -OO ' 'if you need command help' % arg) return + if command.__doc__ is None: + self.error('No help for %r; __doc__ string missing' % arg) + return self.message(command.__doc__.rstrip()) do_h = do_help diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py index cb9cd07b07143..58778300eee09 100644 --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -1400,6 +1400,27 @@ def test_issue7964(self): self.assertNotIn(b'SyntaxError', stdout, "Got a syntax error running test script under PDB") + def test_issue46434(self): + # Temporarily patch in an extra help command which doesn't have a + # docstring to emulate what happens in an embeddable distribution + script = """ + def do_testcmdwithnodocs(self, arg): + pass + + import pdb + pdb.Pdb.do_testcmdwithnodocs = do_testcmdwithnodocs + """ + commands = """ + continue + help testcmdwithnodocs + """ + stdout, stderr = self.run_pdb_script(script, commands) + output = (stdout or '') + (stderr or '') + self.assertNotIn('AttributeError', output, + 'Calling help on a command with no docs should be handled gracefully') + self.assertIn("*** No help for 'testcmdwithnodocs'; __doc__ string missing", output, + 'Calling help on a command with no docs should print an error') + def test_issue13183(self): script = """ from bar import bar diff --git a/Misc/ACKS b/Misc/ACKS index 25c88656d4245..61267d2a23bea 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1640,6 +1640,7 @@ Evgeny Sologubov Cody Somerville Anthony Sottile Edoardo Spadolini +Tom Sparrow Geoffrey Spear Clay Spence Stefan Sperling diff --git a/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst b/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst new file mode 100644 index 0000000000000..6000781fa5aea --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst @@ -0,0 +1,2 @@ +:mod:`pdb` now gracefully handles ``help`` when :attr:`__doc__` is missing, +for example when run with pregenerated optimized ``.pyc`` files. 
From webhook-mailer at python.org Fri Jan 21 14:32:59 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 19:32:59 -0000 Subject: [Python-checkins] bpo-46434: Handle missing docstrings in pdb help (GH-30705) Message-ID: https://github.com/python/cpython/commit/05063fa15c594012e6dc9c2c7a3ea72e7cb933f2 commit: 05063fa15c594012e6dc9c2c7a3ea72e7cb933f2 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T11:32:43-08:00 summary: bpo-46434: Handle missing docstrings in pdb help (GH-30705) (cherry picked from commit 60705cff70576482fea31dcafbf8a37cbb751ea5) Co-authored-by: Tom Sparrow <793763+sparrowt at users.noreply.github.com> files: A Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst M Lib/pdb.py M Lib/test/test_pdb.py M Misc/ACKS diff --git a/Lib/pdb.py b/Lib/pdb.py index 943211158ac41..7ab50b4845d3e 100755 --- a/Lib/pdb.py +++ b/Lib/pdb.py @@ -1493,6 +1493,9 @@ def do_help(self, arg): self.error('No help for %r; please do not run Python with -OO ' 'if you need command help' % arg) return + if command.__doc__ is None: + self.error('No help for %r; __doc__ string missing' % arg) + return self.message(command.__doc__.rstrip()) do_h = do_help diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py index d4c037dabff97..6ac1a4a3c3025 100644 --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -1463,6 +1463,27 @@ def test_issue7964(self): self.assertNotIn(b'SyntaxError', stdout, "Got a syntax error running test script under PDB") + def test_issue46434(self): + # Temporarily patch in an extra help command which doesn't have a + # docstring to emulate what happens in an embeddable distribution + script = """ + def do_testcmdwithnodocs(self, arg): + pass + + import pdb + pdb.Pdb.do_testcmdwithnodocs = do_testcmdwithnodocs + """ + commands = """ + continue + help testcmdwithnodocs + """ + stdout, stderr = self.run_pdb_script(script, commands) + output = (stdout or '') + (stderr or '') + self.assertNotIn('AttributeError', output, + 'Calling help on a command with no docs should be handled gracefully') + self.assertIn("*** No help for 'testcmdwithnodocs'; __doc__ string missing", output, + 'Calling help on a command with no docs should print an error') + def test_issue13183(self): script = """ from bar import bar diff --git a/Misc/ACKS b/Misc/ACKS index 9292bdc8dc73b..7f9166cd74cfa 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1668,6 +1668,7 @@ Evgeny Sologubov Cody Somerville Anthony Sottile Edoardo Spadolini +Tom Sparrow Geoffrey Spear Clay Spence Stefan Sperling diff --git a/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst b/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst new file mode 100644 index 0000000000000..6000781fa5aea --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-20-10-35-10.bpo-46434.geS-aP.rst @@ -0,0 +1,2 @@ +:mod:`pdb` now gracefully handles ``help`` when :attr:`__doc__` is missing, +for example when run with pregenerated optimized ``.pyc`` files. 
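
An illustration of the pdb change above (bpo-46434). This is only a sketch, not part of the commit: the do_nodocs command name is invented, mirroring the do_testcmdwithnodocs command added in test_issue46434, and the printed messages assume an interpreter that already contains the fix.

    import pdb

    def do_nodocs(self, arg):       # deliberately defined without a docstring
        pass

    # Hypothetical extra command, patched onto the class the same way the added test does it.
    pdb.Pdb.do_nodocs = do_nodocs

    p = pdb.Pdb()
    p.do_help("nodocs")    # prints: *** No help for 'nodocs'; __doc__ string missing
    p.do_help("continue")  # documented commands still print their docstring-based help

Before the change, do_help ended in command.__doc__.rstrip(), which raised AttributeError whenever __doc__ was None, for example when the command came from a pregenerated optimized .pyc file.
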
From webhook-mailer at python.org Fri Jan 21 15:39:11 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 20:39:11 -0000 Subject: [Python-checkins] bpo-46417: Add _PyType_CAST() macro (GH-30760) Message-ID: https://github.com/python/cpython/commit/bc67f189fdd62ed42013fa05cd0ef2df498f5967 commit: bc67f189fdd62ed42013fa05cd0ef2df498f5967 branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T21:39:01+01:00 summary: bpo-46417: Add _PyType_CAST() macro (GH-30760) In debug mode, the macro makes sure that its argument is a type using an assertion. files: M Include/object.h M Include/py_curses.h M Objects/typeobject.c diff --git a/Include/object.h b/Include/object.h index e5544e8b588ed..4fd16616ce705 100644 --- a/Include/object.h +++ b/Include/object.h @@ -755,6 +755,8 @@ static inline int _PyType_Check(PyObject *op) { } #define PyType_Check(op) _PyType_Check(_PyObject_CAST(op)) +#define _PyType_CAST(op) (assert(PyType_Check(op)), (PyTypeObject*)(op)) + static inline int _PyType_CheckExact(PyObject *op) { return Py_IS_TYPE(op, &PyType_Type); } diff --git a/Include/py_curses.h b/Include/py_curses.h index b70252d9d7605..b2c7f1bb4309c 100644 --- a/Include/py_curses.h +++ b/Include/py_curses.h @@ -77,7 +77,7 @@ typedef struct { static void **PyCurses_API; -#define PyCursesWindow_Type (*(PyTypeObject *) PyCurses_API[0]) +#define PyCursesWindow_Type (*_PyType_CAST(PyCurses_API[0])) #define PyCursesSetupTermCalled {if (! ((int (*)(void))PyCurses_API[1]) () ) return NULL;} #define PyCursesInitialised {if (! ((int (*)(void))PyCurses_API[2]) () ) return NULL;} #define PyCursesInitialisedColor {if (! ((int (*)(void))PyCurses_API[3]) () ) return NULL;} diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 34a9817a3178e..c46c3d80edbaa 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -342,7 +342,7 @@ PyType_Modified(PyTypeObject *type) assert(PyWeakref_CheckRef(ref)); ref = PyWeakref_GET_OBJECT(ref); if (ref != Py_None) { - PyType_Modified((PyTypeObject *)ref); + PyType_Modified(_PyType_CAST(ref)); } } } @@ -387,10 +387,7 @@ type_mro_modified(PyTypeObject *type, PyObject *bases) { n = PyTuple_GET_SIZE(bases); for (i = 0; i < n; i++) { PyObject *b = PyTuple_GET_ITEM(bases, i); - PyTypeObject *cls; - - assert(PyType_Check(b)); - cls = (PyTypeObject *)b; + PyTypeObject *cls = _PyType_CAST(b); if (!PyType_IsSubtype(type, cls)) { goto clear; @@ -431,8 +428,7 @@ assign_version_tag(struct type_cache *cache, PyTypeObject *type) n = PyTuple_GET_SIZE(bases); for (i = 0; i < n; i++) { PyObject *b = PyTuple_GET_ITEM(bases, i); - assert(PyType_Check(b)); - if (!assign_version_tag(cache, (PyTypeObject *)b)) + if (!assign_version_tag(cache, _PyType_CAST(b))) return 0; } type->tp_flags |= Py_TPFLAGS_VALID_VERSION_TAG; @@ -736,8 +732,7 @@ mro_hierarchy(PyTypeObject *type, PyObject *temp) return -1; n = PyList_GET_SIZE(subclasses); for (i = 0; i < n; i++) { - PyTypeObject *subclass; - subclass = (PyTypeObject *)PyList_GET_ITEM(subclasses, i); + PyTypeObject *subclass = _PyType_CAST(PyList_GET_ITEM(subclasses, i)); res = mro_hierarchy(subclass, temp); if (res < 0) break; @@ -771,18 +766,15 @@ type_set_bases(PyTypeObject *type, PyObject *new_bases, void *context) return -1; } for (i = 0; i < PyTuple_GET_SIZE(new_bases); i++) { - PyObject *ob; - PyTypeObject *base; - - ob = PyTuple_GET_ITEM(new_bases, i); + PyObject *ob = PyTuple_GET_ITEM(new_bases, i); if (!PyType_Check(ob)) { PyErr_Format(PyExc_TypeError, "%s.__bases__ must be tuple of classes, not '%s'", 
type->tp_name, Py_TYPE(ob)->tp_name); return -1; } + PyTypeObject *base = (PyTypeObject*)ob; - base = (PyTypeObject*)ob; if (PyType_IsSubtype(base, type) || /* In case of reentering here again through a custom mro() the above check is not enough since it relies on @@ -1947,7 +1939,7 @@ mro_implementation(PyTypeObject *type) assert(PyTuple_Check(bases)); n = PyTuple_GET_SIZE(bases); for (i = 0; i < n; i++) { - PyTypeObject *base = (PyTypeObject *)PyTuple_GET_ITEM(bases, i); + PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(bases, i)); if (base->tp_mro == NULL) { PyErr_Format(PyExc_TypeError, "Cannot extend an incomplete type '%.100s'", @@ -1961,7 +1953,7 @@ mro_implementation(PyTypeObject *type) /* Fast path: if there is a single base, constructing the MRO * is trivial. */ - PyTypeObject *base = (PyTypeObject *)PyTuple_GET_ITEM(bases, 0); + PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(bases, 0)); Py_ssize_t k = PyTuple_GET_SIZE(base->tp_mro); result = PyTuple_New(k + 1); if (result == NULL) { @@ -1998,7 +1990,7 @@ mro_implementation(PyTypeObject *type) } for (i = 0; i < n; i++) { - PyTypeObject *base = (PyTypeObject *)PyTuple_GET_ITEM(bases, i); + PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(bases, i)); to_merge[i] = base->tp_mro; } to_merge[n] = bases; @@ -2047,19 +2039,16 @@ mro_check(PyTypeObject *type, PyObject *mro) n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { - PyTypeObject *base; - PyObject *tmp; - - tmp = PyTuple_GET_ITEM(mro, i); - if (!PyType_Check(tmp)) { + PyObject *obj = PyTuple_GET_ITEM(mro, i); + if (!PyType_Check(obj)) { PyErr_Format( PyExc_TypeError, "mro() returned a non-class ('%.500s')", - Py_TYPE(tmp)->tp_name); + Py_TYPE(obj)->tp_name); return -1; } + PyTypeObject *base = (PyTypeObject*)obj; - base = (PyTypeObject*)tmp; if (!PyType_IsSubtype(solid, solid_base(base))) { PyErr_Format( PyExc_TypeError, @@ -2196,8 +2185,7 @@ static PyTypeObject * best_base(PyObject *bases) { Py_ssize_t i, n; - PyTypeObject *base, *winner, *candidate, *base_i; - PyObject *base_proto; + PyTypeObject *base, *winner, *candidate; assert(PyTuple_Check(bases)); n = PyTuple_GET_SIZE(bases); @@ -2205,14 +2193,15 @@ best_base(PyObject *bases) base = NULL; winner = NULL; for (i = 0; i < n; i++) { - base_proto = PyTuple_GET_ITEM(bases, i); + PyObject *base_proto = PyTuple_GET_ITEM(bases, i); if (!PyType_Check(base_proto)) { PyErr_SetString( PyExc_TypeError, "bases must be types"); return NULL; } - base_i = (PyTypeObject *)base_proto; + PyTypeObject *base_i = (PyTypeObject *)base_proto; + if (!_PyType_IsReady(base_i)) { if (PyType_Ready(base_i) < 0) return NULL; @@ -2663,9 +2652,8 @@ type_new_slots_bases(type_new_ctx *ctx) /* Skip primary base */ continue; } + PyTypeObject *type = _PyType_CAST(base); - assert(PyType_Check(base)); - PyTypeObject *type = (PyTypeObject *)base; if (ctx->may_add_dict && ctx->add_dict == 0 && type->tp_dictoffset != 0) { @@ -3754,7 +3742,7 @@ _PyType_GetModuleByDef(PyTypeObject *type, struct PyModuleDef *def) // by PyType_FromModuleAndSpec() or on its subclasses. // type_ready_mro() ensures that a static type cannot inherit from a // heap type. 
- assert(_PyType_HasFeature((PyTypeObject *)type, Py_TPFLAGS_HEAPTYPE)); + assert(_PyType_HasFeature(type, Py_TPFLAGS_HEAPTYPE)); PyHeapTypeObject *ht = (PyHeapTypeObject*)super; PyObject *module = ht->ht_module; @@ -3818,7 +3806,7 @@ find_name_in_mro(PyTypeObject *type, PyObject *name, int *error) for (i = 0; i < n; i++) { base = PyTuple_GET_ITEM(mro, i); assert(PyType_Check(base)); - dict = ((PyTypeObject *)base)->tp_dict; + dict = _PyType_CAST(base)->tp_dict; assert(dict && PyDict_Check(dict)); res = _PyDict_GetItem_KnownHash(dict, name, hash); if (res != NULL) @@ -4780,7 +4768,6 @@ static int object_set_class(PyObject *self, PyObject *value, void *closure) { PyTypeObject *oldto = Py_TYPE(self); - PyTypeObject *newto; if (value == NULL) { PyErr_SetString(PyExc_TypeError, @@ -4793,12 +4780,13 @@ object_set_class(PyObject *self, PyObject *value, void *closure) Py_TYPE(value)->tp_name); return -1; } + PyTypeObject *newto = (PyTypeObject *)value; + if (PySys_Audit("object.__setattr__", "OsO", self, "__class__", value) < 0) { return -1; } - newto = (PyTypeObject *)value; /* In versions of CPython prior to 3.5, the code in compatible_for_assignment was not set up to correctly check for memory layout / slot / etc. compatibility for non-HEAPTYPE classes, so we just @@ -6219,7 +6207,7 @@ type_ready_mro(PyTypeObject *type) PyObject *mro = type->tp_mro; Py_ssize_t n = PyTuple_GET_SIZE(mro); for (Py_ssize_t i = 0; i < n; i++) { - PyTypeObject *base = (PyTypeObject *)PyTuple_GET_ITEM(mro, i); + PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(mro, i)); if (PyType_Check(base) && (base->tp_flags & Py_TPFLAGS_HEAPTYPE)) { PyErr_Format(PyExc_TypeError, "type '%.100s' is not dynamically allocated but " @@ -6528,7 +6516,9 @@ add_all_subclasses(PyTypeObject *type, PyObject *bases) PyObject *base = PyTuple_GET_ITEM(bases, i); if (PyType_Check(base) && add_subclass((PyTypeObject*)base, type) < 0) + { res = -1; + } } } @@ -6562,8 +6552,9 @@ remove_all_subclasses(PyTypeObject *type, PyObject *bases) Py_ssize_t i; for (i = 0; i < PyTuple_GET_SIZE(bases); i++) { PyObject *base = PyTuple_GET_ITEM(bases, i); - if (PyType_Check(base)) + if (PyType_Check(base)) { remove_subclass((PyTypeObject*) base, type); + } } } } @@ -6857,7 +6848,7 @@ hackcheck(PyObject *self, setattrofunc func, const char *what) PyTypeObject *defining_type = type; Py_ssize_t i; for (i = PyTuple_GET_SIZE(mro) - 1; i >= 0; i--) { - PyTypeObject *base = (PyTypeObject*) PyTuple_GET_ITEM(mro, i); + PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(mro, i)); if (base->tp_setattro == slot_tp_setattro) { /* Ignore Python classes: they never define their own C-level setattro. 
*/ @@ -7062,7 +7053,7 @@ wrap_init(PyObject *self, PyObject *args, void *wrapped, PyObject *kwds) static PyObject * tp_new_wrapper(PyObject *self, PyObject *args, PyObject *kwds) { - PyTypeObject *type, *subtype, *staticbase; + PyTypeObject *staticbase; PyObject *arg0, *res; if (self == NULL || !PyType_Check(self)) { @@ -7070,7 +7061,7 @@ tp_new_wrapper(PyObject *self, PyObject *args, PyObject *kwds) "__new__() called with non-type 'self'"); return NULL; } - type = (PyTypeObject *)self; + PyTypeObject *type = (PyTypeObject *)self; if (!PyTuple_Check(args) || PyTuple_GET_SIZE(args) < 1) { PyErr_Format(PyExc_TypeError, @@ -7086,7 +7077,8 @@ tp_new_wrapper(PyObject *self, PyObject *args, PyObject *kwds) Py_TYPE(arg0)->tp_name); return NULL; } - subtype = (PyTypeObject *)arg0; + PyTypeObject *subtype = (PyTypeObject *)arg0; + if (!PyType_IsSubtype(subtype, type)) { PyErr_Format(PyExc_TypeError, "%s.__new__(%s): %s is not a subtype of %s", @@ -8646,7 +8638,6 @@ static int recurse_down_subclasses(PyTypeObject *type, PyObject *name, update_callback callback, void *data) { - PyTypeObject *subclass; PyObject *ref, *subclasses, *dict; Py_ssize_t i; @@ -8657,11 +8648,13 @@ recurse_down_subclasses(PyTypeObject *type, PyObject *name, i = 0; while (PyDict_Next(subclasses, &i, NULL, &ref)) { assert(PyWeakref_CheckRef(ref)); - subclass = (PyTypeObject *)PyWeakref_GET_OBJECT(ref); - assert(subclass != NULL); - if ((PyObject *)subclass == Py_None) + PyObject *obj = PyWeakref_GET_OBJECT(ref); + assert(obj != NULL); + if (obj == Py_None) { continue; - assert(PyType_Check(subclass)); + } + PyTypeObject *subclass = _PyType_CAST(obj); + /* Avoid recursing down into unaffected classes */ dict = subclass->tp_dict; if (dict != NULL && PyDict_Check(dict)) { @@ -8838,28 +8831,24 @@ super_getattro(PyObject *self, PyObject *name) replaced during PyDict_GetItemWithError(dict, name) */ Py_INCREF(mro); do { - PyObject *res, *tmp, *dict; - descrgetfunc f; - - tmp = PyTuple_GET_ITEM(mro, i); - assert(PyType_Check(tmp)); - - dict = ((PyTypeObject *)tmp)->tp_dict; + PyObject *obj = PyTuple_GET_ITEM(mro, i); + PyObject *dict = _PyType_CAST(obj)->tp_dict; assert(dict != NULL && PyDict_Check(dict)); - res = PyDict_GetItemWithError(dict, name); + PyObject *res = PyDict_GetItemWithError(dict, name); if (res != NULL) { Py_INCREF(res); - f = Py_TYPE(res)->tp_descr_get; + descrgetfunc f = Py_TYPE(res)->tp_descr_get; if (f != NULL) { - tmp = f(res, + PyObject *res2; + res2 = f(res, /* Only pass 'obj' param if this is instance-mode super (See SF ID #743627) */ (su->obj == (PyObject *)starttype) ? 
NULL : su->obj, (PyObject *)starttype); Py_DECREF(res); - res = tmp; + res = res2; } Py_DECREF(mro); @@ -8920,8 +8909,9 @@ supercheck(PyTypeObject *type, PyObject *obj) { int ok = PyType_IsSubtype( (PyTypeObject *)class_attr, type); - if (ok) + if (ok) { return (PyTypeObject *)class_attr; + } } Py_XDECREF(class_attr); } From webhook-mailer at python.org Fri Jan 21 15:59:50 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 20:59:50 -0000 Subject: [Python-checkins] Update generated files list and add `diff=generated` attribute (GH-30745) Message-ID: https://github.com/python/cpython/commit/f1e559b7544d665a84747a7bc7ecebadef2a6be2 commit: f1e559b7544d665a84747a7bc7ecebadef2a6be2 branch: main author: Erlend Egeberg Aasland committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T12:59:45-08:00 summary: Update generated files list and add `diff=generated` attribute (GH-30745) As a side effect, the list of generated files is relocated after the language aware diff settings. Closes python/core-workflow#425 Automerge-Triggered-By: GH:zware files: M .gitattributes diff --git a/.gitattributes b/.gitattributes index 05b0420714e1b..2718e63e5748f 100644 --- a/.gitattributes +++ b/.gitattributes @@ -40,24 +40,6 @@ Lib/test/test_importlib/namespacedata01/* -text PCbuild/readme.txt text eol=crlf PC/readme.txt text eol=crlf -# Generated files -# https://github.com/github/linguist/blob/master/docs/overrides.md -**/clinic/*.h linguist-generated=true -Python/deepfreeze/*.c linguist-generated=true -Python/frozen_modules/*.h linguist-generated=true -Python/frozen_modules/MANIFEST linguist-generated=true -Include/internal/pycore_ast.h linguist-generated=true -Python/Python-ast.c linguist-generated=true -Include/opcode.h linguist-generated=true -Python/opcode_targets.h linguist-generated=true -Objects/typeslots.inc linguist-generated=true -*_db.h linguist-generated=true -Doc/library/token-list.inc linguist-generated=true -Include/token.h linguist-generated=true -Lib/token.py linguist-generated=true -Parser/token.c linguist-generated=true -Programs/test_frozenmain.h linguist-generated=true - # Language aware diff headers # https://tekin.co.uk/2020/10/better-git-diff-output-for-ruby-python-elixir-and-more # https://gist.github.com/tekin/12500956bd56784728e490d8cef9cb81 @@ -67,3 +49,31 @@ Programs/test_frozenmain.h linguist-generated=true *.html diff=html *.py diff=python *.md diff=markdown + +# Generated files +# https://github.com/github/linguist/blob/master/docs/overrides.md +# +# To always hide generated files in local diffs, mark them as binary: +# $ git config diff.generated.binary true +# +[attr]generated linguist-generated=true diff=generated + +**/clinic/*.c.h generated +*_db.h generated +Doc/library/token-list.inc generated +Include/internal/pycore_ast.h generated +Include/internal/pycore_ast_state.h generated +Include/opcode.h generated +Include/token.h generated +Lib/keyword.py generated +Lib/token.py generated +Objects/typeslots.inc generated +Parser/parser.c generated +Parser/token.c generated +Programs/test_frozenmain.h generated +Python/Python-ast.c generated +Python/opcode_targets.h generated +Python/stdlib_module_names.h generated +Tools/peg_generator/pegen/grammar_parser.py generated +aclocal.m4 generated +configure generated From webhook-mailer at python.org Fri Jan 21 16:18:36 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 21:18:36 -0000 Subject: [Python-checkins] bpo-46124: Update zoneinfo to 
rely on importlib.resources traversable API. (GH-30190) Message-ID: https://github.com/python/cpython/commit/00b2b578bd9e516d601063a086b03177f546bcdd commit: 00b2b578bd9e516d601063a086b03177f546bcdd branch: main author: Jason R. Coombs committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T13:18:31-08:00 summary: bpo-46124: Update zoneinfo to rely on importlib.resources traversable API. (GH-30190) Automerge-Triggered-By: GH:jaraco files: A Misc/NEWS.d/next/Library/2021-12-18-18-41-30.bpo-46124.ESPrb7.rst M Lib/zoneinfo/_common.py M Lib/zoneinfo/_tzpath.py diff --git a/Lib/zoneinfo/_common.py b/Lib/zoneinfo/_common.py index 4c24f01bd7b27..98cdfe37ca6ca 100644 --- a/Lib/zoneinfo/_common.py +++ b/Lib/zoneinfo/_common.py @@ -2,14 +2,14 @@ def load_tzdata(key): - import importlib.resources + from importlib import resources components = key.split("/") package_name = ".".join(["tzdata.zoneinfo"] + components[:-1]) resource_name = components[-1] try: - return importlib.resources.open_binary(package_name, resource_name) + return resources.files(package_name).joinpath(resource_name).open("rb") except (ImportError, FileNotFoundError, UnicodeEncodeError): # There are three types of exception that can be raised that all amount # to "we cannot find this key": diff --git a/Lib/zoneinfo/_tzpath.py b/Lib/zoneinfo/_tzpath.py index 672560b951442..4985dce2dc36d 100644 --- a/Lib/zoneinfo/_tzpath.py +++ b/Lib/zoneinfo/_tzpath.py @@ -118,7 +118,7 @@ def available_timezones(): # Start with loading from the tzdata package if it exists: this has a # pre-assembled list of zones that only requires opening one file. try: - with resources.open_text("tzdata", "zones") as f: + with resources.files("tzdata").joinpath("zones").open("r") as f: for zone in f: zone = zone.strip() if zone: diff --git a/Misc/NEWS.d/next/Library/2021-12-18-18-41-30.bpo-46124.ESPrb7.rst b/Misc/NEWS.d/next/Library/2021-12-18-18-41-30.bpo-46124.ESPrb7.rst new file mode 100644 index 0000000000000..26f9f81303a96 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-18-18-41-30.bpo-46124.ESPrb7.rst @@ -0,0 +1 @@ +Update :mod:`zoneinfo` to rely on importlib.resources traversable API. From webhook-mailer at python.org Fri Jan 21 16:24:40 2022 From: webhook-mailer at python.org (gvanrossum) Date: Fri, 21 Jan 2022 21:24:40 -0000 Subject: [Python-checkins] bpo-43118: Fix bug in inspect.signature around 'base.__text_signature__' (GH-30285) Message-ID: https://github.com/python/cpython/commit/881a763cfe07ef4a5806ec78f13a9bc99e8909dc commit: 881a763cfe07ef4a5806ec78f13a9bc99e8909dc branch: main author: Weipeng Hong committer: gvanrossum date: 2022-01-21T13:24:33-08:00 summary: bpo-43118: Fix bug in inspect.signature around 'base.__text_signature__' (GH-30285) files: A Lib/test/ann_module7.py A Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst M Lib/inspect.py M Lib/test/test_inspect.py diff --git a/Lib/inspect.py b/Lib/inspect.py index 7a8f5d3464318..879a577d43fbe 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -2511,9 +2511,9 @@ def _signature_from_callable(obj, *, pass else: if text_sig: - # If 'obj' class has a __text_signature__ attribute: + # If 'base' class has a __text_signature__ attribute: # return a signature based on it - return _signature_fromstr(sigcls, obj, text_sig) + return _signature_fromstr(sigcls, base, text_sig) # No '__text_signature__' was found for the 'obj' class. 
# Last option is to check if its '__init__' is diff --git a/Lib/test/ann_module7.py b/Lib/test/ann_module7.py new file mode 100644 index 0000000000000..8f890cd28025b --- /dev/null +++ b/Lib/test/ann_module7.py @@ -0,0 +1,11 @@ +# Tests class have ``__text_signature__`` + +from __future__ import annotations + +DEFAULT_BUFFER_SIZE = 8192 + +class BufferedReader(object): + """BufferedReader(raw, buffer_size=DEFAULT_BUFFER_SIZE)\n--\n\n + Create a new buffered reader using the given readable raw IO object. + """ + pass diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index 67372cca6ed1f..cdbb9eb6a8f7c 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -4151,6 +4151,17 @@ def func(*args, **kwargs): sig = inspect.signature(func) self.assertEqual(str(sig), '(self, a, b=1, /, *args, c, d=2, **kwargs)') + def test_base_class_have_text_signature(self): + # see issue 43118 + from test.ann_module7 import BufferedReader + class MyBufferedReader(BufferedReader): + """buffer reader class.""" + + text_signature = BufferedReader.__text_signature__ + self.assertEqual(text_signature, '(raw, buffer_size=DEFAULT_BUFFER_SIZE)') + sig = inspect.signature(MyBufferedReader) + self.assertEqual(str(sig), '(raw, buffer_size=8192)') + class NTimesUnwrappable: def __init__(self, n): diff --git a/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst b/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst new file mode 100644 index 0000000000000..a37c22cd78c09 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst @@ -0,0 +1,3 @@ +Fix a bug in :func:`inspect.signature` that was causing it to fail on some +subclasses of classes with a ``__text_signature__`` referencing module +globals. Patch by Weipeng Hong. 
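As an illustration of the behaviour the new test above exercises, here is a
minimal sketch that can be run as a script or at the prompt. The names
Reader/MyReader and the module constant are invented for the sketch (the real
test uses test.ann_module7.BufferedReader); the point is that when a subclass
carries no signature information of its own, inspect.signature() falls back to
the base class's __text_signature__ and, with this fix, evaluates it against
the *base's* module globals:

    import inspect

    DEFAULT_BUFFER_SIZE = 8192

    class Reader:
        """Reader(raw, buffer_size=DEFAULT_BUFFER_SIZE)\n--\n\n
        Toy class whose docstring carries a C-style text signature.
        """

    class MyReader(Reader):
        """Subclass with no signature information of its own."""

    print(Reader.__text_signature__)    # (raw, buffer_size=DEFAULT_BUFFER_SIZE)
    print(inspect.signature(MyReader))  # (raw, buffer_size=8192)

In a single module both lookups happen to succeed either way; the bug fixed
here showed up when, as in the new test, the subclass lives in a different
module from the base, so the default must be resolved against the base's
globals rather than the subclass's.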
From webhook-mailer at python.org Fri Jan 21 16:38:28 2022 From: webhook-mailer at python.org (gvanrossum) Date: Fri, 21 Jan 2022 21:38:28 -0000 Subject: [Python-checkins] bpo-46445: Cover multiple inheritance of `TypedDict` in `test_typing` (GH-30719) Message-ID: https://github.com/python/cpython/commit/65b88d5e01c845c0cfa3ff61bc8b2faec8f67a57 commit: 65b88d5e01c845c0cfa3ff61bc8b2faec8f67a57 branch: main author: Nikita Sobolev committer: gvanrossum date: 2022-01-21T13:38:23-08:00 summary: bpo-46445: Cover multiple inheritance of `TypedDict` in `test_typing` (GH-30719) files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index ce0c940e2a112..150d7c081c30b 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -22,7 +22,6 @@ from typing import is_typeddict from typing import no_type_check, no_type_check_decorator from typing import Type -from typing import NewType from typing import NamedTuple, TypedDict from typing import IO, TextIO, BinaryIO from typing import Pattern, Match @@ -4393,6 +4392,93 @@ class Cat(Animal): 'voice': str, } + def test_multiple_inheritance(self): + class One(TypedDict): + one: int + class Two(TypedDict): + two: str + class Untotal(TypedDict, total=False): + untotal: str + Inline = TypedDict('Inline', {'inline': bool}) + class Regular: + pass + + class Child(One, Two): + child: bool + self.assertEqual( + Child.__required_keys__, + frozenset(['one', 'two', 'child']), + ) + self.assertEqual( + Child.__optional_keys__, + frozenset([]), + ) + self.assertEqual( + Child.__annotations__, + {'one': int, 'two': str, 'child': bool}, + ) + + class ChildWithOptional(One, Untotal): + child: bool + self.assertEqual( + ChildWithOptional.__required_keys__, + frozenset(['one', 'child']), + ) + self.assertEqual( + ChildWithOptional.__optional_keys__, + frozenset(['untotal']), + ) + self.assertEqual( + ChildWithOptional.__annotations__, + {'one': int, 'untotal': str, 'child': bool}, + ) + + class ChildWithTotalFalse(One, Untotal, total=False): + child: bool + self.assertEqual( + ChildWithTotalFalse.__required_keys__, + frozenset(['one']), + ) + self.assertEqual( + ChildWithTotalFalse.__optional_keys__, + frozenset(['untotal', 'child']), + ) + self.assertEqual( + ChildWithTotalFalse.__annotations__, + {'one': int, 'untotal': str, 'child': bool}, + ) + + class ChildWithInlineAndOptional(Untotal, Inline): + child: bool + self.assertEqual( + ChildWithInlineAndOptional.__required_keys__, + frozenset(['inline', 'child']), + ) + self.assertEqual( + ChildWithInlineAndOptional.__optional_keys__, + frozenset(['untotal']), + ) + self.assertEqual( + ChildWithInlineAndOptional.__annotations__, + {'inline': bool, 'untotal': str, 'child': bool}, + ) + + wrong_bases = [ + (One, Regular), + (Regular, One), + (One, Two, Regular), + (Inline, Regular), + (Untotal, Regular), + ] + for bases in wrong_bases: + with self.subTest(bases=bases): + with self.assertRaisesRegex( + TypeError, + 'cannot inherit from both a TypedDict type and a non-TypedDict', + ): + class Wrong(*bases): + pass + def test_is_typeddict(self): assert is_typeddict(Point2D) is True assert is_typeddict(Union[str, int]) is False From webhook-mailer at python.org Fri Jan 21 16:51:20 2022 From: webhook-mailer at python.org (zooba) Date: Fri, 21 Jan 2022 21:51:20 -0000 Subject: [Python-checkins] bpo-46463: Fixes escape4chm.py script used when building the CHM documentation file (GH-30768) Message-ID: https://github.com/python/cpython/commit/57d1855682dbeb9233ef3a531f9535c6442e9992 
commit: 57d1855682dbeb9233ef3a531f9535c6442e9992 branch: main author: Steve Dower committer: zooba date: 2022-01-21T21:51:15Z summary: bpo-46463: Fixes escape4chm.py script used when building the CHM documentation file (GH-30768) files: A Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst M Doc/tools/extensions/escape4chm.py diff --git a/Doc/tools/extensions/escape4chm.py b/Doc/tools/extensions/escape4chm.py index e999971625173..89970975b9032 100644 --- a/Doc/tools/extensions/escape4chm.py +++ b/Doc/tools/extensions/escape4chm.py @@ -5,6 +5,7 @@ https://bugs.python.org/issue32174 """ +import pathlib import re from html.entities import codepoint2name @@ -39,12 +40,12 @@ def fixup_keywords(app, exception): return getLogger(__name__).info('fixing HTML escapes in keywords file...') - outdir = app.builder.outdir + outdir = pathlib.Path(app.builder.outdir) outname = app.builder.config.htmlhelp_basename - with app.builder.open_file(outdir, outname + '.hhk', 'r') as f: + with open(outdir / (outname + '.hhk'), 'rb') as f: index = f.read() - with app.builder.open_file(outdir, outname + '.hhk', 'w') as f: - f.write(index.replace(''', ''')) + with open(outdir / (outname + '.hhk'), 'wb') as f: + f.write(index.replace(b''', b''')) def setup(app): # `html-page-context` event emitted when the HTML builder has diff --git a/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst b/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst new file mode 100644 index 0000000000000..d418190bb8fc8 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst @@ -0,0 +1,2 @@ +Fixes :file:`escape4chm.py` script used when building the CHM documentation +file From webhook-mailer at python.org Fri Jan 21 17:06:42 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 22:06:42 -0000 Subject: [Python-checkins] bpo-43118: Fix bug in inspect.signature around 'base.__text_signature__' (GH-30285) Message-ID: https://github.com/python/cpython/commit/9e3ff821dac05e8fde030ec83bd988f3eba66065 commit: 9e3ff821dac05e8fde030ec83bd988f3eba66065 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T14:06:35-08:00 summary: bpo-43118: Fix bug in inspect.signature around 'base.__text_signature__' (GH-30285) (cherry picked from commit 881a763cfe07ef4a5806ec78f13a9bc99e8909dc) Co-authored-by: Weipeng Hong files: A Lib/test/ann_module7.py A Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst M Lib/inspect.py M Lib/test/test_inspect.py diff --git a/Lib/inspect.py b/Lib/inspect.py index 10b5d14eb37bf..6f91435541b8a 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -2379,9 +2379,9 @@ def _signature_from_callable(obj, *, pass else: if text_sig: - # If 'obj' class has a __text_signature__ attribute: + # If 'base' class has a __text_signature__ attribute: # return a signature based on it - return _signature_fromstr(sigcls, obj, text_sig) + return _signature_fromstr(sigcls, base, text_sig) # No '__text_signature__' was found for the 'obj' class. 
# Last option is to check if its '__init__' is diff --git a/Lib/test/ann_module7.py b/Lib/test/ann_module7.py new file mode 100644 index 0000000000000..8f890cd28025b --- /dev/null +++ b/Lib/test/ann_module7.py @@ -0,0 +1,11 @@ +# Tests class have ``__text_signature__`` + +from __future__ import annotations + +DEFAULT_BUFFER_SIZE = 8192 + +class BufferedReader(object): + """BufferedReader(raw, buffer_size=DEFAULT_BUFFER_SIZE)\n--\n\n + Create a new buffered reader using the given readable raw IO object. + """ + pass diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index 2a6de956cdc6d..ef26d79a50a02 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -3949,6 +3949,17 @@ def func(*args, **kwargs): sig = inspect.signature(func) self.assertEqual(str(sig), '(self, a, b=1, /, *args, c, d=2, **kwargs)') + def test_base_class_have_text_signature(self): + # see issue 43118 + from test.ann_module7 import BufferedReader + class MyBufferedReader(BufferedReader): + """buffer reader class.""" + + text_signature = BufferedReader.__text_signature__ + self.assertEqual(text_signature, '(raw, buffer_size=DEFAULT_BUFFER_SIZE)') + sig = inspect.signature(MyBufferedReader) + self.assertEqual(str(sig), '(raw, buffer_size=8192)') + class NTimesUnwrappable: def __init__(self, n): diff --git a/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst b/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst new file mode 100644 index 0000000000000..a37c22cd78c09 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst @@ -0,0 +1,3 @@ +Fix a bug in :func:`inspect.signature` that was causing it to fail on some +subclasses of classes with a ``__text_signature__`` referencing module +globals. Patch by Weipeng Hong. 
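Looking back at the zoneinfo change earlier in this digest (GH-30190), the
traversable importlib.resources API it switches to can be sketched as below.
This assumes Python 3.9 or newer (where importlib.resources.files() exists)
and that the third-party "tzdata" distribution is installed; the package and
resource names are the ones used by the patch:

    from importlib import resources

    # Older helper the patch moves away from:
    #   with resources.open_text("tzdata", "zones") as f: ...

    # Traversable spelling used by the patch:
    with resources.files("tzdata").joinpath("zones").open("r") as f:
        zones = [line.strip() for line in f if line.strip()]
    print(len(zones))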
From webhook-mailer at python.org Fri Jan 21 17:11:56 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 22:11:56 -0000 Subject: [Python-checkins] bpo-46463: Fixes escape4chm.py script used when building the CHM documentation file (GH-30768) Message-ID: https://github.com/python/cpython/commit/d548c871716dfda73714d9f38b4e4219878a414e commit: d548c871716dfda73714d9f38b4e4219878a414e branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T14:11:47-08:00 summary: bpo-46463: Fixes escape4chm.py script used when building the CHM documentation file (GH-30768) (cherry picked from commit 57d1855682dbeb9233ef3a531f9535c6442e9992) Co-authored-by: Steve Dower files: A Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst M Doc/tools/extensions/escape4chm.py diff --git a/Doc/tools/extensions/escape4chm.py b/Doc/tools/extensions/escape4chm.py index e999971625173..89970975b9032 100644 --- a/Doc/tools/extensions/escape4chm.py +++ b/Doc/tools/extensions/escape4chm.py @@ -5,6 +5,7 @@ https://bugs.python.org/issue32174 """ +import pathlib import re from html.entities import codepoint2name @@ -39,12 +40,12 @@ def fixup_keywords(app, exception): return getLogger(__name__).info('fixing HTML escapes in keywords file...') - outdir = app.builder.outdir + outdir = pathlib.Path(app.builder.outdir) outname = app.builder.config.htmlhelp_basename - with app.builder.open_file(outdir, outname + '.hhk', 'r') as f: + with open(outdir / (outname + '.hhk'), 'rb') as f: index = f.read() - with app.builder.open_file(outdir, outname + '.hhk', 'w') as f: - f.write(index.replace(''', ''')) + with open(outdir / (outname + '.hhk'), 'wb') as f: + f.write(index.replace(b''', b''')) def setup(app): # `html-page-context` event emitted when the HTML builder has diff --git a/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst b/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst new file mode 100644 index 0000000000000..d418190bb8fc8 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst @@ -0,0 +1,2 @@ +Fixes :file:`escape4chm.py` script used when building the CHM documentation +file From webhook-mailer at python.org Fri Jan 21 17:15:54 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 22:15:54 -0000 Subject: [Python-checkins] bpo-46463: Fixes escape4chm.py script used when building the CHM documentation file (GH-30768) Message-ID: https://github.com/python/cpython/commit/b37f3e993a978eacf05c5fddd716be2d31f18a8d commit: b37f3e993a978eacf05c5fddd716be2d31f18a8d branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T14:15:44-08:00 summary: bpo-46463: Fixes escape4chm.py script used when building the CHM documentation file (GH-30768) (cherry picked from commit 57d1855682dbeb9233ef3a531f9535c6442e9992) Co-authored-by: Steve Dower files: A Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst M Doc/tools/extensions/escape4chm.py diff --git a/Doc/tools/extensions/escape4chm.py b/Doc/tools/extensions/escape4chm.py index e999971625173..89970975b9032 100644 --- a/Doc/tools/extensions/escape4chm.py +++ b/Doc/tools/extensions/escape4chm.py @@ -5,6 +5,7 @@ https://bugs.python.org/issue32174 """ +import pathlib 
import re from html.entities import codepoint2name @@ -39,12 +40,12 @@ def fixup_keywords(app, exception): return getLogger(__name__).info('fixing HTML escapes in keywords file...') - outdir = app.builder.outdir + outdir = pathlib.Path(app.builder.outdir) outname = app.builder.config.htmlhelp_basename - with app.builder.open_file(outdir, outname + '.hhk', 'r') as f: + with open(outdir / (outname + '.hhk'), 'rb') as f: index = f.read() - with app.builder.open_file(outdir, outname + '.hhk', 'w') as f: - f.write(index.replace(''', ''')) + with open(outdir / (outname + '.hhk'), 'wb') as f: + f.write(index.replace(b''', b''')) def setup(app): # `html-page-context` event emitted when the HTML builder has diff --git a/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst b/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst new file mode 100644 index 0000000000000..d418190bb8fc8 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-01-21-21-33-48.bpo-46463.fBbdTG.rst @@ -0,0 +1,2 @@ +Fixes :file:`escape4chm.py` script used when building the CHM documentation +file From webhook-mailer at python.org Fri Jan 21 17:29:21 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 22:29:21 -0000 Subject: [Python-checkins] bpo-46417: Add _PyType_GetSubclasses() function (GH-30761) Message-ID: https://github.com/python/cpython/commit/8ee07dda139f3fa1d7c58a29532a98efc790568d commit: 8ee07dda139f3fa1d7c58a29532a98efc790568d branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T23:29:10+01:00 summary: bpo-46417: Add _PyType_GetSubclasses() function (GH-30761) Add a new _PyType_GetSubclasses() function to get type's subclasses. _PyType_GetSubclasses(type) returns a list which holds strong refererences to subclasses. It is safer than iterating on type->tp_subclasses which yields weak references and can be modified in the loop. _PyType_GetSubclasses(type) now holds a reference to the tp_subclasses dict while creating the list of subclasses. set_collection_flag_recursive() of _abc.c now uses _PyType_GetSubclasses(). 
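At the Python level the semantics of the new helper mirror what
type.__subclasses__() already does: build a fresh list of strong references
from the weak references the parent type keeps in tp_subclasses. A small
sketch of that behaviour (pure illustration; gc.collect() is needed because
heap types sit in reference cycles):

    import gc

    class Base:
        pass

    def make_temporary_subclass():
        class Temporary(Base):
            pass
        # no strong reference to Temporary survives this call

    make_temporary_subclass()
    gc.collect()                      # heap types live in reference cycles
    print(Base.__subclasses__())      # [] -- the parent held only a weak reference

    class Kept(Base):
        pass

    snapshot = Base.__subclasses__()  # strong references, safe to iterate over
    del Kept
    gc.collect()
    print([c.__name__ for c in snapshot])  # ['Kept'] -- the snapshot keeps it alive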
files: M Include/internal/pycore_object.h M Modules/_abc.c M Objects/typeobject.c diff --git a/Include/internal/pycore_object.h b/Include/internal/pycore_object.h index 0348563218072..be308cd25d710 100644 --- a/Include/internal/pycore_object.h +++ b/Include/internal/pycore_object.h @@ -220,11 +220,12 @@ static inline PyObject **_PyObject_ManagedDictPointer(PyObject *obj) return ((PyObject **)obj)-3; } -PyObject ** _PyObject_DictPointer(PyObject *); -int _PyObject_VisitInstanceAttributes(PyObject *self, visitproc visit, void *arg); -void _PyObject_ClearInstanceAttributes(PyObject *self); -void _PyObject_FreeInstanceAttributes(PyObject *self); -int _PyObject_IsInstanceDictEmpty(PyObject *); +extern PyObject ** _PyObject_DictPointer(PyObject *); +extern int _PyObject_VisitInstanceAttributes(PyObject *self, visitproc visit, void *arg); +extern void _PyObject_ClearInstanceAttributes(PyObject *self); +extern void _PyObject_FreeInstanceAttributes(PyObject *self); +extern int _PyObject_IsInstanceDictEmpty(PyObject *); +extern PyObject* _PyType_GetSubclasses(PyTypeObject *); #ifdef __cplusplus } diff --git a/Modules/_abc.c b/Modules/_abc.c index b7465c379dddf..a043961812041 100644 --- a/Modules/_abc.c +++ b/Modules/_abc.c @@ -4,6 +4,7 @@ #endif #include "Python.h" +#include "pycore_object.h" // _PyType_GetSubclasses() #include "pycore_moduleobject.h" // _PyModule_GetState() #include "clinic/_abc.c.h" @@ -493,21 +494,20 @@ set_collection_flag_recursive(PyTypeObject *child, unsigned long flag) { return; } + child->tp_flags &= ~COLLECTION_FLAGS; child->tp_flags |= flag; - PyObject *grandchildren = child->tp_subclasses; + + PyObject *grandchildren = _PyType_GetSubclasses(child); if (grandchildren == NULL) { return; } - assert(PyDict_CheckExact(grandchildren)); - Py_ssize_t i = 0; - while (PyDict_Next(grandchildren, &i, NULL, &grandchildren)) { - assert(PyWeakref_CheckRef(grandchildren)); - PyObject *grandchild = PyWeakref_GET_OBJECT(grandchildren); - if (PyType_Check(grandchild)) { - set_collection_flag_recursive((PyTypeObject *)grandchild, flag); - } + + for (Py_ssize_t i = 0; i < PyList_GET_SIZE(grandchildren); i++) { + PyObject *grandchild = PyList_GET_ITEM(grandchildren, i); + set_collection_flag_recursive((PyTypeObject *)grandchild, flag); } + Py_DECREF(grandchildren); } /*[clinic input] diff --git a/Objects/typeobject.c b/Objects/typeobject.c index c46c3d80edbaa..e4a4824fa2e41 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -687,27 +687,28 @@ static int recurse_down_subclasses(PyTypeObject *type, PyObject *name, static int mro_hierarchy(PyTypeObject *type, PyObject *temp) { - int res; - PyObject *new_mro, *old_mro; - PyObject *tuple; - PyObject *subclasses; - Py_ssize_t i, n; - - res = mro_internal(type, &old_mro); - if (res <= 0) + PyObject *old_mro; + int res = mro_internal(type, &old_mro); + if (res <= 0) { /* error / reentrance */ return res; - new_mro = type->tp_mro; + } + PyObject *new_mro = type->tp_mro; - if (old_mro != NULL) + PyObject *tuple; + if (old_mro != NULL) { tuple = PyTuple_Pack(3, type, new_mro, old_mro); - else + } + else { tuple = PyTuple_Pack(2, type, new_mro); + } - if (tuple != NULL) + if (tuple != NULL) { res = PyList_Append(temp, tuple); - else + } + else { res = -1; + } Py_XDECREF(tuple); if (res < 0) { @@ -727,15 +728,18 @@ mro_hierarchy(PyTypeObject *type, PyObject *temp) Finally, this makes things simple avoiding the need to deal with dictionary iterators and weak references. 
*/ - subclasses = type___subclasses___impl(type); - if (subclasses == NULL) + PyObject *subclasses = _PyType_GetSubclasses(type); + if (subclasses == NULL) { return -1; - n = PyList_GET_SIZE(subclasses); - for (i = 0; i < n; i++) { + } + + Py_ssize_t n = PyList_GET_SIZE(subclasses); + for (Py_ssize_t i = 0; i < n; i++) { PyTypeObject *subclass = _PyType_CAST(PyList_GET_ITEM(subclasses, i)); res = mro_hierarchy(subclass, temp); - if (res < 0) + if (res < 0) { break; + } } Py_DECREF(subclasses); @@ -4124,6 +4128,42 @@ type_dealloc(PyTypeObject *type) Py_TYPE(type)->tp_free((PyObject *)type); } + +PyObject* +_PyType_GetSubclasses(PyTypeObject *self) +{ + PyObject *list = PyList_New(0); + if (list == NULL) { + return NULL; + } + + // Hold a strong reference to tp_subclasses while iterating on it + PyObject *dict = Py_XNewRef(self->tp_subclasses); + if (dict == NULL) { + return list; + } + assert(PyDict_CheckExact(dict)); + + Py_ssize_t i = 0; + PyObject *ref; // borrowed ref + while (PyDict_Next(dict, &i, NULL, &ref)) { + assert(PyWeakref_CheckRef(ref)); + PyObject *obj = PyWeakref_GET_OBJECT(ref); // borrowed ref + if (obj == Py_None) { + continue; + } + assert(PyType_Check(obj)); + if (PyList_Append(list, obj) < 0) { + Py_CLEAR(list); + goto done; + } + } +done: + Py_DECREF(dict); + return list; +} + + /*[clinic input] type.__subclasses__ @@ -4134,28 +4174,7 @@ static PyObject * type___subclasses___impl(PyTypeObject *self) /*[clinic end generated code: output=eb5eb54485942819 input=5af66132436f9a7b]*/ { - PyObject *list, *raw, *ref; - Py_ssize_t i; - - list = PyList_New(0); - if (list == NULL) - return NULL; - raw = self->tp_subclasses; - if (raw == NULL) - return list; - assert(PyDict_CheckExact(raw)); - i = 0; - while (PyDict_Next(raw, &i, NULL, &ref)) { - assert(PyWeakref_CheckRef(ref)); - ref = PyWeakref_GET_OBJECT(ref); - if (ref != Py_None) { - if (PyList_Append(list, ref) < 0) { - Py_DECREF(list); - return NULL; - } - } - } - return list; + return _PyType_GetSubclasses(self); } static PyObject * @@ -4165,6 +4184,7 @@ type_prepare(PyObject *self, PyObject *const *args, Py_ssize_t nargs, return PyDict_New(); } + /* Merge the __dict__ of aclass into dict, and recursively also all the __dict__s of aclass's base classes. 
The order of merging isn't From webhook-mailer at python.org Fri Jan 21 17:30:28 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 22:30:28 -0000 Subject: [Python-checkins] bpo-46417: Use _PyType_CAST() in Python directory (GH-30769) Message-ID: https://github.com/python/cpython/commit/7835cbf949c413a746324721a352cc72670a8a36 commit: 7835cbf949c413a746324721a352cc72670a8a36 branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T23:30:17+01:00 summary: bpo-46417: Use _PyType_CAST() in Python directory (GH-30769) files: M Python/bltinmodule.c M Python/specialize.c diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index ef1b2bb9cf644..ecd8be1af6f2d 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -536,7 +536,7 @@ static PyObject * filter_vectorcall(PyObject *type, PyObject * const*args, size_t nargsf, PyObject *kwnames) { - PyTypeObject *tp = (PyTypeObject *)type; + PyTypeObject *tp = _PyType_CAST(type); if (tp == &PyFilter_Type && !_PyArg_NoKwnames("filter", kwnames)) { return NULL; } @@ -1251,7 +1251,7 @@ static PyObject * map_vectorcall(PyObject *type, PyObject * const*args, size_t nargsf, PyObject *kwnames) { - PyTypeObject *tp = (PyTypeObject *)type; + PyTypeObject *tp = _PyType_CAST(type); if (tp == &PyMap_Type && !_PyArg_NoKwnames("map", kwnames)) { return NULL; } diff --git a/Python/specialize.c b/Python/specialize.c index e32986ad9d61a..8daeaa6cb2f51 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -1339,8 +1339,7 @@ specialize_class_call( PyObject *callable, _Py_CODEUNIT *instr, int nargs, SpecializedCacheEntry *cache) { - assert(PyType_Check(callable)); - PyTypeObject *tp = (PyTypeObject *)callable; + PyTypeObject *tp = _PyType_CAST(callable); if (_Py_OPCODE(instr[-1]) == PRECALL_METHOD) { SPECIALIZATION_FAIL(CALL_NO_KW, SPEC_FAIL_METHOD_CALL_CLASS); return -1; From webhook-mailer at python.org Fri Jan 21 17:33:58 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 22:33:58 -0000 Subject: [Python-checkins] bpo-46417: Use _PyType_CAST() in Objects directory (GH-30764) Message-ID: https://github.com/python/cpython/commit/ac1f152421fab3ac854fe4565c575b306e2bb4b5 commit: ac1f152421fab3ac854fe4565c575b306e2bb4b5 branch: main author: Victor Stinner committer: vstinner date: 2022-01-21T23:33:43+01:00 summary: bpo-46417: Use _PyType_CAST() in Objects directory (GH-30764) files: M Objects/complexobject.c M Objects/dictobject.c M Objects/enumobject.c M Objects/exceptions.c M Objects/floatobject.c M Objects/listobject.c M Objects/setobject.c M Objects/structseq.c M Objects/tupleobject.c diff --git a/Objects/complexobject.c b/Objects/complexobject.c index f658dbf336dbf..e0766de258805 100644 --- a/Objects/complexobject.c +++ b/Objects/complexobject.c @@ -846,7 +846,7 @@ complex_from_string_inner(const char *s, Py_ssize_t len, void *type) if (s-start != len) goto parse_error; - return complex_subtype_from_doubles((PyTypeObject *)type, x, y); + return complex_subtype_from_doubles(_PyType_CAST(type), x, y); parse_error: PyErr_SetString(PyExc_ValueError, diff --git a/Objects/dictobject.c b/Objects/dictobject.c index 7ce4b9069f77e..39be189e12000 100644 --- a/Objects/dictobject.c +++ b/Objects/dictobject.c @@ -3450,13 +3450,12 @@ static PyObject * dict_vectorcall(PyObject *type, PyObject * const*args, size_t nargsf, PyObject *kwnames) { - assert(PyType_Check(type)); Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); if (!_PyArg_CheckPositional("dict", nargs, 0, 1)) { return NULL; } - PyObject *self = 
dict_new((PyTypeObject *)type, NULL, NULL); + PyObject *self = dict_new(_PyType_CAST(type), NULL, NULL); if (self == NULL) { return NULL; } diff --git a/Objects/enumobject.c b/Objects/enumobject.c index 8fbf4fd6e470b..36f592d7c239c 100644 --- a/Objects/enumobject.c +++ b/Objects/enumobject.c @@ -88,8 +88,7 @@ static PyObject * enumerate_vectorcall(PyObject *type, PyObject *const *args, size_t nargsf, PyObject *kwnames) { - assert(PyType_Check(type)); - PyTypeObject *tp = (PyTypeObject *)type; + PyTypeObject *tp = _PyType_CAST(type); Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); Py_ssize_t nkwargs = 0; if (nargs == 0) { @@ -373,8 +372,6 @@ static PyObject * reversed_vectorcall(PyObject *type, PyObject * const*args, size_t nargsf, PyObject *kwnames) { - assert(PyType_Check(type)); - if (!_PyArg_NoKwnames("reversed", kwnames)) { return NULL; } @@ -384,7 +381,7 @@ reversed_vectorcall(PyObject *type, PyObject * const*args, return NULL; } - return reversed_new_impl((PyTypeObject *)type, args[0]); + return reversed_new_impl(_PyType_CAST(type), args[0]); } static void diff --git a/Objects/exceptions.c b/Objects/exceptions.c index 403d2d4a3fddf..22a47131aa12c 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -1775,8 +1775,7 @@ OSError_new(PyTypeObject *type, PyObject *args, PyObject *kwds) PyObject *newtype; newtype = PyDict_GetItemWithError(state->errnomap, myerrno); if (newtype) { - assert(PyType_Check(newtype)); - type = (PyTypeObject *) newtype; + type = _PyType_CAST(newtype); } else if (PyErr_Occurred()) goto error; diff --git a/Objects/floatobject.c b/Objects/floatobject.c index 68be7acaa2e72..79fbdabce9608 100644 --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -1686,7 +1686,7 @@ float_vectorcall(PyObject *type, PyObject * const*args, } PyObject *x = nargs >= 1 ? args[0] : NULL; - return float_new_impl((PyTypeObject *)type, x); + return float_new_impl(_PyType_CAST(type), x); } diff --git a/Objects/listobject.c b/Objects/listobject.c index 29f5d70f1dbd3..0ce58b240327f 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -2858,8 +2858,7 @@ list_vectorcall(PyObject *type, PyObject * const*args, return NULL; } - assert(PyType_Check(type)); - PyObject *list = PyType_GenericAlloc((PyTypeObject *)type, 0); + PyObject *list = PyType_GenericAlloc(_PyType_CAST(type), 0); if (list == NULL) { return NULL; } diff --git a/Objects/setobject.c b/Objects/setobject.c index 6e110ef196c82..ca3cfe8196467 100644 --- a/Objects/setobject.c +++ b/Objects/setobject.c @@ -1001,7 +1001,7 @@ make_new_frozenset(PyTypeObject *type, PyObject *iterable) Py_INCREF(iterable); return iterable; } - return make_new_set((PyTypeObject *)type, iterable); + return make_new_set(type, iterable); } static PyObject * @@ -1036,7 +1036,7 @@ frozenset_vectorcall(PyObject *type, PyObject * const*args, } PyObject *iterable = (nargs ? 
args[0] : NULL); - return make_new_frozenset((PyTypeObject *)type, iterable); + return make_new_frozenset(_PyType_CAST(type), iterable); } static PyObject * @@ -1974,10 +1974,10 @@ set_vectorcall(PyObject *type, PyObject * const*args, } if (nargs) { - return make_new_set((PyTypeObject *)type, args[0]); + return make_new_set(_PyType_CAST(type), args[0]); } - return make_new_set((PyTypeObject *)type, NULL); + return make_new_set(_PyType_CAST(type), NULL); } static PySequenceMethods set_as_sequence = { diff --git a/Objects/structseq.c b/Objects/structseq.c index dfefae8928eb6..cded877300d9e 100644 --- a/Objects/structseq.c +++ b/Objects/structseq.c @@ -108,10 +108,9 @@ static void structseq_dealloc(PyStructSequence *obj) { Py_ssize_t i, size; - PyTypeObject *tp; PyObject_GC_UnTrack(obj); - tp = (PyTypeObject *) Py_TYPE(obj); + PyTypeObject *tp = Py_TYPE(obj); size = REAL_SIZE(obj); for (i = 0; i < size; ++i) { Py_XDECREF(obj->ob_item[i]); diff --git a/Objects/tupleobject.c b/Objects/tupleobject.c index 2051c1812efe2..86f541a96a5a1 100644 --- a/Objects/tupleobject.c +++ b/Objects/tupleobject.c @@ -817,7 +817,7 @@ tuple_vectorcall(PyObject *type, PyObject * const*args, } if (nargs) { - return tuple_new_impl((PyTypeObject *)type, args[0]); + return tuple_new_impl(_PyType_CAST(type), args[0]); } else { return tuple_get_empty(); From webhook-mailer at python.org Fri Jan 21 17:37:59 2022 From: webhook-mailer at python.org (miss-islington) Date: Fri, 21 Jan 2022 22:37:59 -0000 Subject: [Python-checkins] bpo-46445: Cover multiple inheritance of `TypedDict` in `test_typing` (GH-30719) Message-ID: https://github.com/python/cpython/commit/46e6aad12958d3b73c5377ec034d056bb1a36d65 commit: 46e6aad12958d3b73c5377ec034d056bb1a36d65 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T14:37:52-08:00 summary: bpo-46445: Cover multiple inheritance of `TypedDict` in `test_typing` (GH-30719) (cherry picked from commit 65b88d5e01c845c0cfa3ff61bc8b2faec8f67a57) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 1d16e78d422cd..d6c55ef1de75f 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -19,7 +19,6 @@ from typing import is_typeddict from typing import no_type_check, no_type_check_decorator from typing import Type -from typing import NewType from typing import NamedTuple, TypedDict from typing import IO, TextIO, BinaryIO from typing import Pattern, Match @@ -4302,6 +4301,93 @@ class Cat(Animal): 'voice': str, } + def test_multiple_inheritance(self): + class One(TypedDict): + one: int + class Two(TypedDict): + two: str + class Untotal(TypedDict, total=False): + untotal: str + Inline = TypedDict('Inline', {'inline': bool}) + class Regular: + pass + + class Child(One, Two): + child: bool + self.assertEqual( + Child.__required_keys__, + frozenset(['one', 'two', 'child']), + ) + self.assertEqual( + Child.__optional_keys__, + frozenset([]), + ) + self.assertEqual( + Child.__annotations__, + {'one': int, 'two': str, 'child': bool}, + ) + + class ChildWithOptional(One, Untotal): + child: bool + self.assertEqual( + ChildWithOptional.__required_keys__, + frozenset(['one', 'child']), + ) + self.assertEqual( + ChildWithOptional.__optional_keys__, + frozenset(['untotal']), + ) + self.assertEqual( + ChildWithOptional.__annotations__, + {'one': int, 'untotal': str, 'child': bool}, + ) + + 
class ChildWithTotalFalse(One, Untotal, total=False): + child: bool + self.assertEqual( + ChildWithTotalFalse.__required_keys__, + frozenset(['one']), + ) + self.assertEqual( + ChildWithTotalFalse.__optional_keys__, + frozenset(['untotal', 'child']), + ) + self.assertEqual( + ChildWithTotalFalse.__annotations__, + {'one': int, 'untotal': str, 'child': bool}, + ) + + class ChildWithInlineAndOptional(Untotal, Inline): + child: bool + self.assertEqual( + ChildWithInlineAndOptional.__required_keys__, + frozenset(['inline', 'child']), + ) + self.assertEqual( + ChildWithInlineAndOptional.__optional_keys__, + frozenset(['untotal']), + ) + self.assertEqual( + ChildWithInlineAndOptional.__annotations__, + {'inline': bool, 'untotal': str, 'child': bool}, + ) + + wrong_bases = [ + (One, Regular), + (Regular, One), + (One, Two, Regular), + (Inline, Regular), + (Untotal, Regular), + ] + for bases in wrong_bases: + with self.subTest(bases=bases): + with self.assertRaisesRegex( + TypeError, + 'cannot inherit from both a TypedDict type and a non-TypedDict', + ): + class Wrong(*bases): + pass + def test_is_typeddict(self): assert is_typeddict(Point2D) is True assert is_typeddict(Union[str, int]) is False From webhook-mailer at python.org Fri Jan 21 18:54:53 2022 From: webhook-mailer at python.org (vstinner) Date: Fri, 21 Jan 2022 23:54:53 -0000 Subject: [Python-checkins] bpo-29882: _Py_popcount32() doesn't need 64x64 multiply (GH-30774) Message-ID: https://github.com/python/cpython/commit/cd8de40b3b10311de2db7b90abdf80af9e35535f commit: cd8de40b3b10311de2db7b90abdf80af9e35535f branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T00:54:42+01:00 summary: bpo-29882: _Py_popcount32() doesn't need 64x64 multiply (GH-30774) 32x32 bits multiply is enough for _Py_popcount32(). 
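For reference, the fallback path in pycore_bitutils.h can be transcribed in
Python roughly as follows (masking with 0xFFFFFFFF stands in for C's 32-bit
wrap-around). After the byte-summing step every byte holds a count of at most
8, so bits 24..31 of the truncated 32-bit product already contain the total,
which is why the 64-bit widening removed by this change was never needed:

    def popcount32(x: int) -> int:
        # Rough Python transcription of the C fallback in pycore_bitutils.h.
        M1, M2, M4, SUM = 0x55555555, 0x33333333, 0x0F0F0F0F, 0x01010101
        x &= 0xFFFFFFFF
        x = x - ((x >> 1) & M1)          # count of each 2 bits
        x = (x & M2) + ((x >> 2) & M2)   # count of each 4 bits
        x = (x + (x >> 4)) & M4          # count of each 8 bits (each byte <= 8)
        # Low 32 bits of the multiply suffice: bits 24..31 hold the total.
        return ((x * SUM) & 0xFFFFFFFF) >> 24

    assert popcount32(0) == 0
    assert popcount32(0xFFFFFFFF) == 32
    assert all(popcount32(1 << i) == 1 for i in range(32))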
files: M Include/internal/pycore_bitutils.h diff --git a/Include/internal/pycore_bitutils.h b/Include/internal/pycore_bitutils.h index e4aa7a3d0d056..3fd70b0e417c1 100644 --- a/Include/internal/pycore_bitutils.h +++ b/Include/internal/pycore_bitutils.h @@ -125,7 +125,7 @@ _Py_popcount32(uint32_t x) // Put count of each 8 bits into those 8 bits x = (x + (x >> 4)) & M4; // Sum of the 4 byte counts - return (uint32_t)((uint64_t)x * (uint64_t)SUM) >> 24; + return (x * SUM) >> 24; #endif } From webhook-mailer at python.org Fri Jan 21 20:13:24 2022 From: webhook-mailer at python.org (zooba) Date: Sat, 22 Jan 2022 01:13:24 -0000 Subject: [Python-checkins] Improve the Windows release build scripts (GH-30771) Message-ID: https://github.com/python/cpython/commit/70c16468deee9390e34322d32fda57df6e0f46bb commit: 70c16468deee9390e34322d32fda57df6e0f46bb branch: main author: Steve Dower committer: zooba date: 2022-01-22T01:13:16Z summary: Improve the Windows release build scripts (GH-30771) Update to windows-2022 image Promote queue variables to parameters for better UI Structure build steps using parameters instead of conditions for simpler status display files: M .azure-pipelines/ci.yml M .azure-pipelines/pr.yml M .azure-pipelines/windows-release.yml M .azure-pipelines/windows-release/stage-build.yml M .azure-pipelines/windows-release/stage-layout-embed.yml M .azure-pipelines/windows-release/stage-layout-full.yml M .azure-pipelines/windows-release/stage-layout-msix.yml M .azure-pipelines/windows-release/stage-layout-nuget.yml M .azure-pipelines/windows-release/stage-msi.yml M .azure-pipelines/windows-release/stage-pack-msix.yml M .azure-pipelines/windows-release/stage-publish-nugetorg.yml M .azure-pipelines/windows-release/stage-publish-pythonorg.yml M .azure-pipelines/windows-release/stage-publish-store.yml M .azure-pipelines/windows-release/stage-sign.yml M .azure-pipelines/windows-release/stage-test-embed.yml M .azure-pipelines/windows-release/stage-test-msi.yml M .azure-pipelines/windows-release/stage-test-nuget.yml diff --git a/.azure-pipelines/ci.yml b/.azure-pipelines/ci.yml index 25cc726504b37..638625540e44c 100644 --- a/.azure-pipelines/ci.yml +++ b/.azure-pipelines/ci.yml @@ -98,7 +98,7 @@ jobs: condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 strategy: matrix: diff --git a/.azure-pipelines/pr.yml b/.azure-pipelines/pr.yml index e2aae324f211b..8b065e6caea53 100644 --- a/.azure-pipelines/pr.yml +++ b/.azure-pipelines/pr.yml @@ -98,7 +98,7 @@ jobs: condition: and(succeeded(), eq(dependencies.Prebuild.outputs['tests.run'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 strategy: matrix: diff --git a/.azure-pipelines/windows-release.yml b/.azure-pipelines/windows-release.yml index 3d072e3b43e17..338c305ecdc0a 100644 --- a/.azure-pipelines/windows-release.yml +++ b/.azure-pipelines/windows-release.yml @@ -1,129 +1,183 @@ name: Release_$(Build.SourceBranchName)_$(SourceTag)_$(Date:yyyyMMdd)$(Rev:.rr) +parameters: +- name: GitRemote + displayName: "Git remote" + type: string + default: python + values: + - 'python' + - 'pablogsal' + - 'ambv' + - '(Other)' +- name: GitRemote_Other + displayName: "If Other, specify Git remote" + type: string + default: 'python' +- name: SourceTag + displayName: "Git tag" + type: string + default: main +- name: DoPublish + displayName: "Publish release" + type: boolean + default: false +- name: SigningCertificate + displayName: "Code signing certificate" + type: 
string + default: 'Python Software Foundation' + values: + - 'Python Software Foundation' + - 'TestSign' + - 'Unsigned' +- name: SigningDescription + displayName: "Signature description" + type: string + default: 'Built: $(Build.BuildNumber)' +- name: DoPGO + displayName: "Run PGO" + type: boolean + default: true +- name: DoLayout + displayName: "Produce full layout artifact" + type: boolean + default: true +- name: DoMSIX + displayName: "Produce Store packages" + type: boolean + default: true +- name: DoNuget + displayName: "Produce Nuget packages" + type: boolean + default: true +- name: DoEmbed + displayName: "Produce embeddable package" + type: boolean + default: true +- name: DoMSI + displayName: "Produce EXE/MSI installer" + type: boolean + default: true +- name: BuildToPublish + displayName: "Build number to publish (0 to skip)" + type: number + default: '0' + variables: __RealSigningCertificate: 'Python Software Foundation' + ${{ if ne(parameters.GitRemote, '(Other)') }}: + GitRemote: ${{ parameters.GitRemote }} + ${{ else }}: + GitRemote: ${{ parameters.GitRemote_Other }} + SourceTag: ${{ parameters.SourceTag }} + DoPGO: ${{ parameters.DoPGO }} + ${{ if ne(parameters.SigningCertificate, 'Unsigned') }}: + SigningCertificate: ${{ parameters.SigningCertificate }} + SigningDescription: ${{ parameters.SigningDescription }} + DoLayout: ${{ parameters.DoLayout }} + DoMSIX: ${{ parameters.DoMSIX }} + DoNuget: ${{ parameters.DoNuget }} + DoEmbed: ${{ parameters.DoEmbed }} + DoMSI: ${{ parameters.DoMSI }} + DoPublish: ${{ parameters.DoPublish }} # QUEUE TIME VARIABLES -# GitRemote: python -# SourceTag: -# DoPGO: true -# SigningCertificate: 'Python Software Foundation' -# SigningDescription: 'Built: $(Build.BuildNumber)' -# DoLayout: true -# DoMSIX: true -# DoNuget: true -# DoEmbed: true -# DoMSI: true -# DoPublish: false -# PyDotOrgUsername: '' -# PyDotOrgServer: '' -# BuildToPublish: '' +# PyDotOrgUsername: '' +# PyDotOrgServer: '' trigger: none pr: none stages: -- stage: Build - displayName: Build binaries - condition: and(succeeded(), not(variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-build.yml - -- stage: Sign - displayName: Sign binaries - dependsOn: Build - condition: and(succeeded(), not(variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-sign.yml - -- stage: Layout - displayName: Generate layouts - dependsOn: Sign - condition: and(succeeded(), not(variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-layout-full.yml - - template: windows-release/stage-layout-embed.yml - - template: windows-release/stage-layout-nuget.yml - -- stage: Pack - dependsOn: Layout - condition: and(succeeded(), not(variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-pack-nuget.yml - -- stage: Test - dependsOn: Pack - condition: and(succeeded(), not(variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-test-embed.yml - - template: windows-release/stage-test-nuget.yml - -- stage: Layout_MSIX - displayName: Generate MSIX layouts - dependsOn: Sign - condition: and(succeeded(), and(eq(variables['DoMSIX'], 'true'), not(variables['BuildToPublish']))) - jobs: - - template: windows-release/stage-layout-msix.yml - -- stage: Pack_MSIX - displayName: Package MSIX - dependsOn: Layout_MSIX - condition: and(succeeded(), not(variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-pack-msix.yml - -- stage: Build_MSI - displayName: Build MSI installer - dependsOn: Sign - condition: and(succeeded(), 
and(eq(variables['DoMSI'], 'true'), not(variables['BuildToPublish']))) - jobs: - - template: windows-release/stage-msi.yml - -- stage: Test_MSI - displayName: Test MSI installer - dependsOn: Build_MSI - condition: and(succeeded(), not(variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-test-msi.yml - -- stage: PublishPyDotOrg - displayName: Publish to python.org - dependsOn: ['Test_MSI', 'Test'] - condition: and(succeeded(), and(eq(variables['DoPublish'], 'true'), not(variables['BuildToPublish']))) - jobs: - - template: windows-release/stage-publish-pythonorg.yml - -- stage: PublishNuget - displayName: Publish to nuget.org - dependsOn: Test - condition: and(succeeded(), and(eq(variables['DoPublish'], 'true'), not(variables['BuildToPublish']))) - jobs: - - template: windows-release/stage-publish-nugetorg.yml - -- stage: PublishStore - displayName: Publish to Store - dependsOn: Pack_MSIX - condition: and(succeeded(), and(eq(variables['DoPublish'], 'true'), not(variables['BuildToPublish']))) - jobs: - - template: windows-release/stage-publish-store.yml - - -- stage: PublishExistingPyDotOrg - displayName: Publish existing build to python.org - dependsOn: [] - condition: and(succeeded(), and(eq(variables['DoPublish'], 'true'), variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-publish-pythonorg.yml - -- stage: PublishExistingNuget - displayName: Publish existing build to nuget.org - dependsOn: [] - condition: and(succeeded(), and(eq(variables['DoPublish'], 'true'), variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-publish-nugetorg.yml - -- stage: PublishExistingStore - displayName: Publish existing build to Store - dependsOn: [] - condition: and(succeeded(), and(eq(variables['DoPublish'], 'true'), variables['BuildToPublish'])) - jobs: - - template: windows-release/stage-publish-store.yml +- ${{ if eq(parameters.BuildToPublish, '0') }}: + - stage: Build + displayName: Build binaries + jobs: + - template: windows-release/stage-build.yml + + - stage: Sign + displayName: Sign binaries + dependsOn: Build + jobs: + - template: windows-release/stage-sign.yml + + - stage: Layout + displayName: Generate layouts + dependsOn: Sign + jobs: + - template: windows-release/stage-layout-full.yml + - template: windows-release/stage-layout-embed.yml + - template: windows-release/stage-layout-nuget.yml + + - stage: Pack + dependsOn: Layout + jobs: + - template: windows-release/stage-pack-nuget.yml + + - stage: Test + dependsOn: Pack + jobs: + - template: windows-release/stage-test-embed.yml + - template: windows-release/stage-test-nuget.yml + + - stage: Layout_MSIX + displayName: Generate MSIX layouts + dependsOn: Sign + condition: and(succeeded(), eq(variables['DoMSIX'], 'true')) + jobs: + - template: windows-release/stage-layout-msix.yml + + - stage: Pack_MSIX + displayName: Package MSIX + dependsOn: Layout_MSIX + jobs: + - template: windows-release/stage-pack-msix.yml + + - stage: Build_MSI + displayName: Build MSI installer + dependsOn: Sign + condition: and(succeeded(), eq(variables['DoMSI'], 'true')) + jobs: + - template: windows-release/stage-msi.yml + + - stage: Test_MSI + displayName: Test MSI installer + dependsOn: Build_MSI + jobs: + - template: windows-release/stage-test-msi.yml + + - ${{ if eq(parameters.DoPublish, 'true') }}: + - stage: PublishPyDotOrg + displayName: Publish to python.org + dependsOn: ['Test_MSI', 'Test'] + jobs: + - template: windows-release/stage-publish-pythonorg.yml + + - stage: PublishNuget + displayName: 
Publish to nuget.org + dependsOn: Test + jobs: + - template: windows-release/stage-publish-nugetorg.yml + + - stage: PublishStore + displayName: Publish to Store + dependsOn: Pack_MSIX + jobs: + - template: windows-release/stage-publish-store.yml + +- ${{ else }}: + - stage: PublishExisting + displayName: Publish existing build + dependsOn: [] + condition: and(succeeded(), eq(variables['DoPublish'], 'true')) + jobs: + - template: windows-release/stage-publish-pythonorg.yml + parameters: + BuildToPublish: ${{ parameters.BuildToPublish }} + + - template: windows-release/stage-publish-nugetorg.yml + parameters: + BuildToPublish: ${{ parameters.BuildToPublish }} + + - template: windows-release/stage-publish-store.yml + parameters: + BuildToPublish: ${{ parameters.BuildToPublish }} diff --git a/.azure-pipelines/windows-release/stage-build.yml b/.azure-pipelines/windows-release/stage-build.yml index 69f3b1e16451e..f70414ba21145 100644 --- a/.azure-pipelines/windows-release/stage-build.yml +++ b/.azure-pipelines/windows-release/stage-build.yml @@ -2,8 +2,8 @@ jobs: - job: Build_Docs displayName: Docs build pool: - name: 'Windows Release' - #vmImage: windows-2019 + #name: 'Windows Release' + vmImage: windows-2022 workspace: clean: all @@ -45,7 +45,7 @@ jobs: displayName: Python build pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all @@ -91,7 +91,7 @@ jobs: condition: and(succeeded(), ne(variables['DoPGO'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all @@ -141,7 +141,7 @@ jobs: displayName: Publish Tcl/Tk Library pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-layout-embed.yml b/.azure-pipelines/windows-release/stage-layout-embed.yml index dbccdead143b2..c8b23d308d81e 100644 --- a/.azure-pipelines/windows-release/stage-layout-embed.yml +++ b/.azure-pipelines/windows-release/stage-layout-embed.yml @@ -4,7 +4,7 @@ jobs: condition: and(succeeded(), eq(variables['DoEmbed'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-layout-full.yml b/.azure-pipelines/windows-release/stage-layout-full.yml index 8fc8da3e52fe0..0ba2fc017d987 100644 --- a/.azure-pipelines/windows-release/stage-layout-full.yml +++ b/.azure-pipelines/windows-release/stage-layout-full.yml @@ -4,7 +4,7 @@ jobs: condition: and(succeeded(), eq(variables['DoLayout'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-layout-msix.yml b/.azure-pipelines/windows-release/stage-layout-msix.yml index def4f7d3c6bee..6efd327bdb32e 100644 --- a/.azure-pipelines/windows-release/stage-layout-msix.yml +++ b/.azure-pipelines/windows-release/stage-layout-msix.yml @@ -3,7 +3,7 @@ jobs: displayName: Make MSIX layout pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-layout-nuget.yml b/.azure-pipelines/windows-release/stage-layout-nuget.yml index 41cdff850e83b..b60a324dd90e3 100644 --- a/.azure-pipelines/windows-release/stage-layout-nuget.yml +++ b/.azure-pipelines/windows-release/stage-layout-nuget.yml @@ -4,7 +4,7 @@ jobs: condition: and(succeeded(), eq(variables['DoNuget'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-msi.yml b/.azure-pipelines/windows-release/stage-msi.yml 
index 9b965b09c1474..f14bc9a45ae38 100644 --- a/.azure-pipelines/windows-release/stage-msi.yml +++ b/.azure-pipelines/windows-release/stage-msi.yml @@ -4,7 +4,7 @@ jobs: condition: and(succeeded(), not(variables['SigningCertificate'])) pool: - vmImage: windows-2019 + vmImage: windows-2022 variables: ReleaseUri: http://www.python.org/{arch} diff --git a/.azure-pipelines/windows-release/stage-pack-msix.yml b/.azure-pipelines/windows-release/stage-pack-msix.yml index 9f7919ee64706..95988151a03db 100644 --- a/.azure-pipelines/windows-release/stage-pack-msix.yml +++ b/.azure-pipelines/windows-release/stage-pack-msix.yml @@ -3,7 +3,7 @@ jobs: displayName: Pack MSIX bundles pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-publish-nugetorg.yml b/.azure-pipelines/windows-release/stage-publish-nugetorg.yml index d5edf44ef5c2e..38f6772afcde3 100644 --- a/.azure-pipelines/windows-release/stage-publish-nugetorg.yml +++ b/.azure-pipelines/windows-release/stage-publish-nugetorg.yml @@ -1,10 +1,13 @@ +parameters: + BuildToPublish: '' + jobs: - job: Publish_Nuget displayName: Publish Nuget packages condition: and(succeeded(), eq(variables['DoNuget'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all @@ -12,24 +15,25 @@ jobs: steps: - checkout: none - - task: DownloadBuildArtifacts at 0 - displayName: 'Download artifact: nuget' - condition: and(succeeded(), not(variables['BuildToPublish'])) - inputs: - artifactName: nuget - downloadPath: $(Build.BinariesDirectory) + - ${{ if parameters.BuildToPublish }}: + - task: DownloadBuildArtifacts at 0 + displayName: 'Download artifact from ${{ parameters.BuildToPublish }}' + inputs: + artifactName: nuget + downloadPath: $(Build.BinariesDirectory) + buildType: specific + project: $(System.TeamProject) + pipeline: $(Build.DefinitionName) + buildVersionToDownload: specific + buildId: ${{ parameters.BuildToPublish }} + + - ${{ else }}: + - task: DownloadBuildArtifacts at 0 + displayName: 'Download artifact: nuget' + inputs: + artifactName: nuget + downloadPath: $(Build.BinariesDirectory) - - task: DownloadBuildArtifacts at 0 - displayName: 'Download artifact: nuget' - condition: and(succeeded(), variables['BuildToPublish']) - inputs: - artifactName: nuget - downloadPath: $(Build.BinariesDirectory) - buildType: specific - project: cpython - pipeline: Windows-Release - buildVersionToDownload: specific - buildId: $(BuildToPublish) - powershell: 'gci pythonarm*.nupkg | %{ Write-Host "Not publishing: $($_.Name)"; gi $_ } | del' displayName: 'Prevent publishing ARM/ARM64 packages' diff --git a/.azure-pipelines/windows-release/stage-publish-pythonorg.yml b/.azure-pipelines/windows-release/stage-publish-pythonorg.yml index 4b88bdebf8cc4..ef95572f7d165 100644 --- a/.azure-pipelines/windows-release/stage-publish-pythonorg.yml +++ b/.azure-pipelines/windows-release/stage-publish-pythonorg.yml @@ -1,10 +1,13 @@ +parameters: + BuildToPublish: '' + jobs: - job: Publish_Python displayName: Publish python.org packages condition: and(succeeded(), and(eq(variables['DoMSI'], 'true'), eq(variables['DoEmbed'], 'true'))) pool: - #vmImage: windows-2019 + #vmImage: windows-2022 name: 'Windows Release' workspace: @@ -18,62 +21,61 @@ jobs: inputs: versionSpec: '>=3.6' - - task: DownloadPipelineArtifact at 1 - displayName: 'Download artifact: Doc' - condition: and(succeeded(), not(variables['BuildToPublish'])) - inputs: - artifactName: Doc - targetPath: $(Build.BinariesDirectory)\Doc - 
- - task: DownloadPipelineArtifact at 1 - displayName: 'Download artifact: msi' - condition: and(succeeded(), not(variables['BuildToPublish'])) - inputs: - artifactName: msi - targetPath: $(Build.BinariesDirectory)\msi + - ${{ if parameters.BuildToPublish }}: + - task: DownloadPipelineArtifact at 1 + displayName: 'Download artifact from ${{ parameters.BuildToPublish }}: Doc' + inputs: + artifactName: Doc + targetPath: $(Build.BinariesDirectory)\Doc + buildType: specific + project: $(System.TeamProject) + pipeline: $(Build.DefinitionName) + buildVersionToDownload: specific + buildId: ${{ parameters.BuildToPublish }} + + - task: DownloadPipelineArtifact at 1 + displayName: 'Download artifact from ${{ parameters.BuildToPublish }}: msi' + inputs: + artifactName: msi + targetPath: $(Build.BinariesDirectory)\msi + buildType: specific + project: $(System.TeamProject) + pipeline: $(Build.DefinitionName) + buildVersionToDownload: specific + buildId: ${{ parameters.BuildToPublish }} + + # Note that embed is a 'build' artifact, not a 'pipeline' artifact + - task: DownloadBuildArtifacts at 0 + displayName: 'Download artifact from ${{ parameters.BuildToPublish }}: embed' + inputs: + artifactName: embed + downloadPath: $(Build.BinariesDirectory) + buildType: specific + project: $(System.TeamProject) + pipeline: $(Build.DefinitionName) + buildVersionToDownload: specific + buildId: ${{ parameters.BuildToPublish }} + + - ${{ else }}: + - task: DownloadPipelineArtifact at 1 + displayName: 'Download artifact: Doc' + inputs: + artifactName: Doc + targetPath: $(Build.BinariesDirectory)\Doc + + - task: DownloadPipelineArtifact at 1 + displayName: 'Download artifact: msi' + inputs: + artifactName: msi + targetPath: $(Build.BinariesDirectory)\msi + + # Note that embed is a 'build' artifact, not a 'pipeline' artifact + - task: DownloadBuildArtifacts at 0 + displayName: 'Download artifact: embed' + inputs: + artifactName: embed + downloadPath: $(Build.BinariesDirectory) - - task: DownloadBuildArtifacts at 0 - displayName: 'Download artifact: embed' - condition: and(succeeded(), not(variables['BuildToPublish'])) - inputs: - artifactName: embed - downloadPath: $(Build.BinariesDirectory) - - - task: DownloadPipelineArtifact at 1 - displayName: 'Download artifact from $(BuildToPublish): Doc' - condition: and(succeeded(), variables['BuildToPublish']) - inputs: - artifactName: Doc - targetPath: $(Build.BinariesDirectory)\Doc - buildType: specific - project: cpython - pipeline: 21 - buildVersionToDownload: specific - buildId: $(BuildToPublish) - - - task: DownloadPipelineArtifact at 1 - displayName: 'Download artifact from $(BuildToPublish): msi' - condition: and(succeeded(), variables['BuildToPublish']) - inputs: - artifactName: msi - targetPath: $(Build.BinariesDirectory)\msi - buildType: specific - project: cpython - pipeline: 21 - buildVersionToDownload: specific - buildId: $(BuildToPublish) - - - task: DownloadBuildArtifacts at 0 - displayName: 'Download artifact from $(BuildToPublish): embed' - condition: and(succeeded(), variables['BuildToPublish']) - inputs: - artifactName: embed - downloadPath: $(Build.BinariesDirectory) - buildType: specific - project: cpython - pipeline: Windows-Release - buildVersionToDownload: specific - buildId: $(BuildToPublish) - powershell: 'gci *embed-arm*.zip | %{ Write-Host "Not publishing: $($_.Name)"; gi $_ } | del' displayName: 'Prevent publishing ARM/ARM64 packages' @@ -105,6 +107,7 @@ jobs: "$(Build.SourcesDirectory)\Tools\msi\purge.py" (gci msi\*\python-*.exe | %{ $_.Name 
-replace 'python-(.+?)(-|\.exe).+', '$1' } | select -First 1) workingDirectory: $(Build.BinariesDirectory) + condition: and(succeeded(), eq(variables['SigningCertificate'], variables['__RealSigningCertificate'])) displayName: 'Purge CDN' - powershell: | @@ -124,7 +127,7 @@ jobs: Write-Error "Failed to validate $failures installers" exit 1 } - #condition: and(succeeded(), eq(variables['SigningCertificate'], variables['__RealSigningCertificate'])) + condition: and(succeeded(), eq(variables['SigningCertificate'], variables['__RealSigningCertificate'])) workingDirectory: $(Build.BinariesDirectory) displayName: 'Test layouts' diff --git a/.azure-pipelines/windows-release/stage-publish-store.yml b/.azure-pipelines/windows-release/stage-publish-store.yml index e0512b95f27da..f3d4c80be9138 100644 --- a/.azure-pipelines/windows-release/stage-publish-store.yml +++ b/.azure-pipelines/windows-release/stage-publish-store.yml @@ -1,10 +1,13 @@ +parameters: + BuildToPublish: '' + jobs: - job: Publish_Store displayName: Publish Store packages condition: and(succeeded(), eq(variables['DoMSIX'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all @@ -12,24 +15,24 @@ jobs: steps: - checkout: none - - task: DownloadBuildArtifacts at 0 - displayName: 'Download artifact: msixupload' - condition: and(succeeded(), not(variables['BuildToPublish'])) - inputs: - artifactName: msixupload - downloadPath: $(Build.BinariesDirectory) + - ${{ if parameters.BuildToPublish }}: + - task: DownloadBuildArtifacts at 0 + displayName: 'Download artifact: msixupload' + inputs: + artifactName: msixupload + downloadPath: $(Build.BinariesDirectory) + buildType: specific + project: cpython + pipeline: Windows-Release + buildVersionToDownload: specific + buildId: ${{ parameters.BuildToPublish }} - - task: DownloadBuildArtifacts at 0 - displayName: 'Download artifact: msixupload' - condition: and(succeeded(), variables['BuildToPublish']) - inputs: - artifactName: msixupload - downloadPath: $(Build.BinariesDirectory) - buildType: specific - project: cpython - pipeline: Windows-Release - buildVersionToDownload: specific - buildId: $(BuildToPublish) + - ${{ else }}: + - task: DownloadBuildArtifacts at 0 + displayName: 'Download artifact: msixupload' + inputs: + artifactName: msixupload + downloadPath: $(Build.BinariesDirectory) # TODO: eq(variables['SigningCertificate'], variables['__RealSigningCertificate']) # If we are not real-signed, DO NOT PUBLISH diff --git a/.azure-pipelines/windows-release/stage-sign.yml b/.azure-pipelines/windows-release/stage-sign.yml index d43e077186c42..4481aa86edc2c 100644 --- a/.azure-pipelines/windows-release/stage-sign.yml +++ b/.azure-pipelines/windows-release/stage-sign.yml @@ -120,7 +120,7 @@ jobs: condition: and(succeeded(), not(variables['SigningCertificate'])) pool: - vmImage: windows-2019 + vmImage: windows-2022 steps: - checkout: none diff --git a/.azure-pipelines/windows-release/stage-test-embed.yml b/.azure-pipelines/windows-release/stage-test-embed.yml index d99bd74722bac..252db959930f2 100644 --- a/.azure-pipelines/windows-release/stage-test-embed.yml +++ b/.azure-pipelines/windows-release/stage-test-embed.yml @@ -4,7 +4,7 @@ jobs: condition: and(succeeded(), eq(variables['DoEmbed'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-test-msi.yml b/.azure-pipelines/windows-release/stage-test-msi.yml index 21e38c39590f7..4b02f478ce0a1 100644 --- 
a/.azure-pipelines/windows-release/stage-test-msi.yml +++ b/.azure-pipelines/windows-release/stage-test-msi.yml @@ -3,7 +3,7 @@ jobs: displayName: Test MSI pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all diff --git a/.azure-pipelines/windows-release/stage-test-nuget.yml b/.azure-pipelines/windows-release/stage-test-nuget.yml index 94d815e95226e..c500baf29b457 100644 --- a/.azure-pipelines/windows-release/stage-test-nuget.yml +++ b/.azure-pipelines/windows-release/stage-test-nuget.yml @@ -4,7 +4,7 @@ jobs: condition: and(succeeded(), eq(variables['DoNuget'], 'true')) pool: - vmImage: windows-2019 + vmImage: windows-2022 workspace: clean: all From webhook-mailer at python.org Sat Jan 22 02:09:46 2022 From: webhook-mailer at python.org (ericvsmith) Date: Sat, 22 Jan 2022 07:09:46 -0000 Subject: [Python-checkins] bpo-46442: improve and rename testExceptionCleanupNames (GH-30758) Message-ID: https://github.com/python/cpython/commit/82c53229e18f5853c82cb8ab6b9af1925a0e9e58 commit: 82c53229e18f5853c82cb8ab6b9af1925a0e9e58 branch: main author: Yellow Dusk committer: ericvsmith date: 2022-01-22T02:09:34-05:00 summary: bpo-46442: improve and rename testExceptionCleanupNames (GH-30758) The test tested that explicitly deleting the local variable bound to the exception did not cause problems, but it did not test what it actually claimed to test, i.e. that the variable is deleted automatically. files: M Lib/test/test_exceptions.py diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index 531b9c92deae5..5932b9d4f6677 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -671,15 +671,27 @@ def test_str(self): self.assertTrue(str(Exception('a'))) self.assertTrue(str(Exception('a', 'b'))) - def testExceptionCleanupNames(self): + def test_exception_cleanup_names(self): # Make sure the local variable bound to the exception instance by # an "except" statement is only visible inside the except block. try: raise Exception() except Exception as e: - self.assertTrue(e) + self.assertIsInstance(e, Exception) + self.assertNotIn('e', locals()) + with self.assertRaises(UnboundLocalError): + e + + def test_exception_cleanup_names2(self): + # Make sure the cleanup doesn't break if the variable is explicitly deleted. + try: + raise Exception() + except Exception as e: + self.assertIsInstance(e, Exception) del e self.assertNotIn('e', locals()) + with self.assertRaises(UnboundLocalError): + e def testExceptionCleanupState(self): # Make sure exception state is cleaned up as soon as the except From webhook-mailer at python.org Sat Jan 22 02:34:38 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 22 Jan 2022 07:34:38 -0000 Subject: [Python-checkins] bpo-46442: improve and rename testExceptionCleanupNames (GH-30758) Message-ID: https://github.com/python/cpython/commit/d4a9e34401d519250d3b3744cd10394069f748c1 commit: d4a9e34401d519250d3b3744cd10394069f748c1 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T23:34:29-08:00 summary: bpo-46442: improve and rename testExceptionCleanupNames (GH-30758) The test tested that explicitly deleting the local variable bound to the exception did not cause problems, but it did not test what it actually claimed to test, i.e. that the variable is deleted automatically. 
(cherry picked from commit 82c53229e18f5853c82cb8ab6b9af1925a0e9e58) Co-authored-by: Yellow Dusk files: M Lib/test/test_exceptions.py diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index b3d1c35274c71..802dc9a67eb21 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -648,15 +648,27 @@ def test_str(self): self.assertTrue(str(Exception('a'))) self.assertTrue(str(Exception('a', 'b'))) - def testExceptionCleanupNames(self): + def test_exception_cleanup_names(self): # Make sure the local variable bound to the exception instance by # an "except" statement is only visible inside the except block. try: raise Exception() except Exception as e: - self.assertTrue(e) + self.assertIsInstance(e, Exception) + self.assertNotIn('e', locals()) + with self.assertRaises(UnboundLocalError): + e + + def test_exception_cleanup_names2(self): + # Make sure the cleanup doesn't break if the variable is explicitly deleted. + try: + raise Exception() + except Exception as e: + self.assertIsInstance(e, Exception) del e self.assertNotIn('e', locals()) + with self.assertRaises(UnboundLocalError): + e def testExceptionCleanupState(self): # Make sure exception state is cleaned up as soon as the except From webhook-mailer at python.org Sat Jan 22 02:37:41 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 22 Jan 2022 07:37:41 -0000 Subject: [Python-checkins] bpo-46442: improve and rename testExceptionCleanupNames (GH-30758) Message-ID: https://github.com/python/cpython/commit/e064af564c8580285a76199229864ddfb1e50c0f commit: e064af564c8580285a76199229864ddfb1e50c0f branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-21T23:37:32-08:00 summary: bpo-46442: improve and rename testExceptionCleanupNames (GH-30758) The test tested that explicitly deleting the local variable bound to the exception did not cause problems, but it did not test what it actually claimed to test, i.e. that the variable is deleted automatically. (cherry picked from commit 82c53229e18f5853c82cb8ab6b9af1925a0e9e58) Co-authored-by: Yellow Dusk files: M Lib/test/test_exceptions.py diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index 5168b0b8a0831..90d7f37dd8670 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -598,15 +598,27 @@ def test_str(self): self.assertTrue(str(Exception('a'))) self.assertTrue(str(Exception('a', 'b'))) - def testExceptionCleanupNames(self): + def test_exception_cleanup_names(self): # Make sure the local variable bound to the exception instance by # an "except" statement is only visible inside the except block. try: raise Exception() except Exception as e: - self.assertTrue(e) + self.assertIsInstance(e, Exception) + self.assertNotIn('e', locals()) + with self.assertRaises(UnboundLocalError): + e + + def test_exception_cleanup_names2(self): + # Make sure the cleanup doesn't break if the variable is explicitly deleted. 
+ try: + raise Exception() + except Exception as e: + self.assertIsInstance(e, Exception) del e self.assertNotIn('e', locals()) + with self.assertRaises(UnboundLocalError): + e def testExceptionCleanupState(self): # Make sure exception state is cleaned up as soon as the except From webhook-mailer at python.org Sat Jan 22 04:40:30 2022 From: webhook-mailer at python.org (corona10) Date: Sat, 22 Jan 2022 09:40:30 -0000 Subject: [Python-checkins] bpo-46249: Move set lastrowid out of the sqlite3 query loop (GH-30489) Message-ID: https://github.com/python/cpython/commit/38afeb1a336f0451c0db86df567ef726f49f6438 commit: 38afeb1a336f0451c0db86df567ef726f49f6438 branch: main author: Erlend Egeberg Aasland committer: corona10 date: 2022-01-22T18:40:22+09:00 summary: bpo-46249: Move set lastrowid out of the sqlite3 query loop (GH-30489) files: M Modules/_sqlite/cursor.c diff --git a/Modules/_sqlite/cursor.c b/Modules/_sqlite/cursor.c index 2729a85f3195d..4700afbbf1188 100644 --- a/Modules/_sqlite/cursor.c +++ b/Modules/_sqlite/cursor.c @@ -465,7 +465,6 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation int rc; int numcols; PyObject* column_name; - sqlite_int64 lastrowid; if (!check_cursor(self)) { goto error; @@ -630,16 +629,6 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation self->rowcount= -1L; } - if (!multiple) { - Py_BEGIN_ALLOW_THREADS - lastrowid = sqlite3_last_insert_rowid(self->connection->db); - Py_END_ALLOW_THREADS - Py_SETREF(self->lastrowid, PyLong_FromLongLong(lastrowid)); - if (self->lastrowid == NULL) { - goto error; - } - } - if (rc == SQLITE_DONE && !multiple) { pysqlite_statement_reset(self->statement); Py_CLEAR(self->statement); @@ -651,6 +640,17 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation Py_XDECREF(parameters); } + if (!multiple) { + sqlite_int64 lastrowid; + + Py_BEGIN_ALLOW_THREADS + lastrowid = sqlite3_last_insert_rowid(self->connection->db); + Py_END_ALLOW_THREADS + + Py_SETREF(self->lastrowid, PyLong_FromLongLong(lastrowid)); + // Fall through on error. + } + error: Py_XDECREF(parameters); Py_XDECREF(parameters_iter); From webhook-mailer at python.org Sat Jan 22 06:06:00 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 22 Jan 2022 11:06:00 -0000 Subject: [Python-checkins] [3.9] bpo-46383: Fix signature of zoneinfo module_free function (GH-3… (GH-30611) Message-ID: https://github.com/python/cpython/commit/3e7d06a1fa2102723314552b37410d11fefa928a commit: 3e7d06a1fa2102723314552b37410d11fefa928a branch: 3.9 author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-22T03:05:55-08:00 summary: [3.9] bpo-46383: Fix signature of zoneinfo module_free function (GH-3… (GH-30611) …0607) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst M Modules/_zoneinfo.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst new file mode 100644 index 0000000000000..8f8b12732a690 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-14-20-55-34.bpo-46383.v8MTl4.rst @@ -0,0 +1,2 @@ +Fix invalid signature of ``_zoneinfo``'s ``module_free`` function to resolve +a crash on wasm32-emscripten platform. 
diff --git a/Modules/_zoneinfo.c b/Modules/_zoneinfo.c index d7945d31affea..cd147aedb4cdb 100644 --- a/Modules/_zoneinfo.c +++ b/Modules/_zoneinfo.c @@ -2613,7 +2613,7 @@ static PyTypeObject PyZoneInfo_ZoneInfoType = { // Specify the _zoneinfo module static PyMethodDef module_methods[] = {{NULL, NULL}}; static void -module_free() +module_free(void *m) { Py_XDECREF(_tzpath_find_tzfile); _tzpath_find_tzfile = NULL; From webhook-mailer at python.org Sat Jan 22 06:06:31 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 11:06:31 -0000 Subject: [Python-checkins] bpo-46425: fix direct invocation of `asyncio` tests (#30725) Message-ID: https://github.com/python/cpython/commit/5a5340044ca98cbe6297668d91bccba04b102923 commit: 5a5340044ca98cbe6297668d91bccba04b102923 branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-22T13:06:27+02:00 summary: bpo-46425: fix direct invocation of `asyncio` tests (#30725) files: M Lib/test/test_asyncio/test_context.py M Lib/test/test_asyncio/test_futures2.py M Lib/test/test_asyncio/test_protocols.py M Lib/test/test_asyncio/test_runners.py M Lib/test/test_asyncio/test_sendfile.py M Lib/test/test_asyncio/test_sock_lowlevel.py diff --git a/Lib/test/test_asyncio/test_context.py b/Lib/test/test_asyncio/test_context.py index 63b1eb320ce16b..6b80721873d95c 100644 --- a/Lib/test/test_asyncio/test_context.py +++ b/Lib/test/test_asyncio/test_context.py @@ -32,3 +32,7 @@ async def main(): self.assertEqual(str(r2[0]), '0.333333') self.assertEqual(str(r2[1]), '0.111111') + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_futures2.py b/Lib/test/test_asyncio/test_futures2.py index 13dbc703277c81..57d24190bc0bd5 100644 --- a/Lib/test/test_asyncio/test_futures2.py +++ b/Lib/test/test_asyncio/test_futures2.py @@ -16,3 +16,7 @@ async def func(): # The check for returned string is not very reliable but # exact comparison for the whole string is even weaker. self.assertIn('...', repr(await asyncio.wait_for(func(), timeout=10))) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_protocols.py b/Lib/test/test_asyncio/test_protocols.py index 438111cccd3478..d8cde6d89aadcd 100644 --- a/Lib/test/test_asyncio/test_protocols.py +++ b/Lib/test/test_asyncio/test_protocols.py @@ -55,3 +55,7 @@ def test_subprocess_protocol(self): self.assertIsNone(sp.pipe_connection_lost(1, f)) self.assertIsNone(sp.process_exited()) self.assertFalse(hasattr(sp, '__dict__')) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_runners.py b/Lib/test/test_asyncio/test_runners.py index b9ae02dc3c04e0..5c06a1aaa830fa 100644 --- a/Lib/test/test_asyncio/test_runners.py +++ b/Lib/test/test_asyncio/test_runners.py @@ -2,7 +2,7 @@ import unittest from unittest import mock -from . 
import utils as test_utils +from test.test_asyncio import utils as test_utils class TestPolicy(asyncio.AbstractEventLoopPolicy): @@ -180,3 +180,7 @@ async def main(): self.assertIsNone(spinner.ag_frame) self.assertFalse(spinner.ag_running) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 0a5466a0af152b..57b56bba34100b 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -565,3 +565,7 @@ class SelectEventLoopTests(SendfileTestsBase, def create_event_loop(self): return asyncio.SelectorEventLoop(selectors.SelectSelector()) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_sock_lowlevel.py b/Lib/test/test_asyncio/test_sock_lowlevel.py index ab022a357d205e..448d835b04d570 100644 --- a/Lib/test/test_asyncio/test_sock_lowlevel.py +++ b/Lib/test/test_asyncio/test_sock_lowlevel.py @@ -1,5 +1,4 @@ import socket -import time import asyncio import sys import unittest @@ -512,3 +511,7 @@ class SelectEventLoopTests(BaseSockTestsMixin, def create_event_loop(self): return asyncio.SelectorEventLoop(selectors.SelectSelector()) + + +if __name__ == '__main__': + unittest.main() From webhook-mailer at python.org Sat Jan 22 06:16:08 2022 From: webhook-mailer at python.org (iritkatriel) Date: Sat, 22 Jan 2022 11:16:08 -0000 Subject: [Python-checkins] bpo-46460: remove duplicated `versionchanged` from `dis.rst` (GH-30752) Message-ID: https://github.com/python/cpython/commit/5d735241168cefe00be177ef4152955c100177ae commit: 5d735241168cefe00be177ef4152955c100177ae branch: main author: Nikita Sobolev committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-22T11:15:58Z summary: bpo-46460: remove duplicated `versionchanged` from `dis.rst` (GH-30752) files: M Doc/library/dis.rst diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index af28e5c115934..c9a4768618702 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -596,8 +596,6 @@ iterations of the loop. has occurred in a :keyword:`with` statement. .. versionadded:: 3.9 - .. versionchanged:: 3.11 - The ``__exit__`` function is in position 8 of the stack rather than 7. .. versionchanged:: 3.11 The ``__exit__`` function is in position 4 of the stack rather than 7. 
From webhook-mailer at python.org Sat Jan 22 06:20:14 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 11:20:14 -0000 Subject: [Python-checkins] fix DeprecationWarning when running asyncio tests (GH-30486) Message-ID: https://github.com/python/cpython/commit/ab8fe22e5e4e282da8ea6f4e77f4c0a6616ec9c2 commit: ab8fe22e5e4e282da8ea6f4e77f4c0a6616ec9c2 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: asvetlov date: 2022-01-22T13:20:10+02:00 summary: fix DeprecationWarning when running asyncio tests (GH-30486) files: M Lib/test/test_asyncio/test_windows_events.py diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py index f276cd205a2f8e..6b4f65c3376f5d 100644 --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -45,7 +45,7 @@ def SIGINT_after_delay(): signal.raise_signal(signal.SIGINT) thread = threading.Thread(target=SIGINT_after_delay) - loop = asyncio.get_event_loop() + loop = asyncio.new_event_loop() try: # only start the loop once the event loop is running loop.call_soon(thread.start) From webhook-mailer at python.org Sat Jan 22 06:29:20 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 11:29:20 -0000 Subject: [Python-checkins] bpo-46469: Make asyncio generic classes return GenericAlias (GH-30777) Message-ID: https://github.com/python/cpython/commit/ea5b96842e066623a53015d8b2492ed61a5baf96 commit: ea5b96842e066623a53015d8b2492ed61a5baf96 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: asvetlov date: 2022-01-22T13:28:53+02:00 summary: bpo-46469: Make asyncio generic classes return GenericAlias (GH-30777) * bpo-46469: Make asyncio generic classes return GenericAlias * ?? Added by blurb_it. * Update Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst Co-authored-by: Jelle Zijlstra Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> Co-authored-by: Jelle Zijlstra files: A Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst M Lib/asyncio/futures.py M Lib/asyncio/queues.py M Lib/asyncio/tasks.py M Lib/test/test_asyncio/test_futures.py M Lib/test/test_asyncio/test_queues.py M Lib/test/test_asyncio/test_tasks.py M Modules/_asynciomodule.c diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py index 10f8f0554e4b6..8e8cd87612588 100644 --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -8,6 +8,7 @@ import contextvars import logging import sys +from types import GenericAlias from . import base_futures from . import events @@ -106,8 +107,7 @@ def __del__(self): context['source_traceback'] = self._source_traceback self._loop.call_exception_handler(context) - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) @property def _log_traceback(self): diff --git a/Lib/asyncio/queues.py b/Lib/asyncio/queues.py index a87ec8b215876..10dd689bbb2ef 100644 --- a/Lib/asyncio/queues.py +++ b/Lib/asyncio/queues.py @@ -2,6 +2,7 @@ import collections import heapq +from types import GenericAlias from . import locks from . 
import mixins @@ -69,8 +70,7 @@ def __repr__(self): def __str__(self): return f'<{type(self).__name__} {self._format()}>' - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) def _format(self): result = f'maxsize={self._maxsize!r}' diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py index 53eef84427be1..445a9f51226ed 100644 --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -17,6 +17,7 @@ import types import warnings import weakref +from types import GenericAlias from . import base_tasks from . import coroutines @@ -123,8 +124,7 @@ def __del__(self): self._loop.call_exception_handler(context) super().__del__() - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) def _repr_info(self): return base_tasks._task_repr_info(self) diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py index 95983f0550807..84d7d45af949e 100644 --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -7,7 +7,7 @@ import threading import unittest from unittest import mock - +from types import GenericAlias import asyncio from asyncio import futures from test.test_asyncio import utils as test_utils @@ -109,6 +109,11 @@ def setUp(self): self.loop = self.new_test_loop() self.addCleanup(self.loop.close) + def test_generic_alias(self): + future = self.cls[str] + self.assertEqual(future.__args__, (str,)) + self.assertIsInstance(future, GenericAlias) + def test_isfuture(self): class MyFuture: _asyncio_future_blocking = None diff --git a/Lib/test/test_asyncio/test_queues.py b/Lib/test/test_asyncio/test_queues.py index 63a9a5f270cc9..b1a53b859c5cc 100644 --- a/Lib/test/test_asyncio/test_queues.py +++ b/Lib/test/test_asyncio/test_queues.py @@ -1,9 +1,8 @@ """Tests for queues.py""" import unittest -from unittest import mock - import asyncio +from types import GenericAlias from test.test_asyncio import utils as test_utils @@ -74,6 +73,11 @@ def test_repr(self): def test_str(self): self._test_repr_or_str(str, False) + def test_generic_alias(self): + q = asyncio.Queue[int] + self.assertEqual(q.__args__, (int,)) + self.assertIsInstance(q, GenericAlias) + def test_empty(self): q = asyncio.Queue() self.assertTrue(q.empty()) diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py index 832ff80f115ca..8c4dceacdeec9 100644 --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -11,10 +11,10 @@ import sys import textwrap import traceback -import types import unittest import weakref from unittest import mock +from types import GenericAlias import asyncio from asyncio import coroutines @@ -108,6 +108,12 @@ def setUp(self): self.loop.set_task_factory(self.new_task) self.loop.create_future = lambda: self.new_future(self.loop) + + def test_generic_alias(self): + task = self.__class__.Task[str] + self.assertEqual(task.__args__, (str,)) + self.assertIsInstance(task, GenericAlias) + def test_task_cancel_message_getter(self): async def coro(): pass diff --git a/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst b/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst new file mode 100644 index 0000000000000..0d0e4b5d3d735 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst @@ -0,0 +1 @@ +:mod:`asyncio` generic classes now return :class:`types.GenericAlias` in ``__class_getitem__`` instead of the same class. 
\ No newline at end of file diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index b08a7d1c024c3..2216dd0178173 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -1480,13 +1480,6 @@ FutureObj_finalize(FutureObj *fut) PyErr_Restore(error_type, error_value, error_traceback); } -static PyObject * -future_cls_getitem(PyObject *cls, PyObject *type) -{ - Py_INCREF(cls); - return cls; -} - static PyAsyncMethods FutureType_as_async = { (unaryfunc)future_new_iter, /* am_await */ 0, /* am_aiter */ @@ -1507,7 +1500,7 @@ static PyMethodDef FutureType_methods[] = { _ASYNCIO_FUTURE_GET_LOOP_METHODDEF _ASYNCIO_FUTURE__MAKE_CANCELLED_ERROR_METHODDEF _ASYNCIO_FUTURE__REPR_INFO_METHODDEF - {"__class_getitem__", future_cls_getitem, METH_O|METH_CLASS, NULL}, + {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* Sentinel */ }; @@ -2449,13 +2442,6 @@ TaskObj_finalize(TaskObj *task) FutureObj_finalize((FutureObj*)task); } -static PyObject * -task_cls_getitem(PyObject *cls, PyObject *type) -{ - Py_INCREF(cls); - return cls; -} - static void TaskObj_dealloc(PyObject *); /* Needs Task_CheckExact */ static PyMethodDef TaskType_methods[] = { @@ -2475,7 +2461,7 @@ static PyMethodDef TaskType_methods[] = { _ASYNCIO_TASK_GET_NAME_METHODDEF _ASYNCIO_TASK_SET_NAME_METHODDEF _ASYNCIO_TASK_GET_CORO_METHODDEF - {"__class_getitem__", task_cls_getitem, METH_O|METH_CLASS, NULL}, + {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* Sentinel */ }; From webhook-mailer at python.org Sat Jan 22 06:54:12 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 22 Jan 2022 11:54:12 -0000 Subject: [Python-checkins] bpo-46425: fix direct invocation of `asyncio` tests (GH-30725) Message-ID: https://github.com/python/cpython/commit/3c4a3745b900e748f99e80fc3728b534e857d1ff commit: 3c4a3745b900e748f99e80fc3728b534e857d1ff branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-22T03:54:07-08:00 summary: bpo-46425: fix direct invocation of `asyncio` tests (GH-30725) (cherry picked from commit 5a5340044ca98cbe6297668d91bccba04b102923) Co-authored-by: Nikita Sobolev files: M Lib/test/test_asyncio/test_context.py M Lib/test/test_asyncio/test_futures2.py M Lib/test/test_asyncio/test_protocols.py M Lib/test/test_asyncio/test_runners.py M Lib/test/test_asyncio/test_sendfile.py M Lib/test/test_asyncio/test_sock_lowlevel.py diff --git a/Lib/test/test_asyncio/test_context.py b/Lib/test/test_asyncio/test_context.py index 63b1eb320ce16..6b80721873d95 100644 --- a/Lib/test/test_asyncio/test_context.py +++ b/Lib/test/test_asyncio/test_context.py @@ -32,3 +32,7 @@ async def main(): self.assertEqual(str(r2[0]), '0.333333') self.assertEqual(str(r2[1]), '0.111111') + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_futures2.py b/Lib/test/test_asyncio/test_futures2.py index 13dbc703277c8..57d24190bc0bd 100644 --- a/Lib/test/test_asyncio/test_futures2.py +++ b/Lib/test/test_asyncio/test_futures2.py @@ -16,3 +16,7 @@ async def func(): # The check for returned string is not very reliable but # exact comparison for the whole string is even weaker. 
self.assertIn('...', repr(await asyncio.wait_for(func(), timeout=10))) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_protocols.py b/Lib/test/test_asyncio/test_protocols.py index 438111cccd347..d8cde6d89aadc 100644 --- a/Lib/test/test_asyncio/test_protocols.py +++ b/Lib/test/test_asyncio/test_protocols.py @@ -55,3 +55,7 @@ def test_subprocess_protocol(self): self.assertIsNone(sp.pipe_connection_lost(1, f)) self.assertIsNone(sp.process_exited()) self.assertFalse(hasattr(sp, '__dict__')) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_runners.py b/Lib/test/test_asyncio/test_runners.py index b9ae02dc3c04e..5c06a1aaa830f 100644 --- a/Lib/test/test_asyncio/test_runners.py +++ b/Lib/test/test_asyncio/test_runners.py @@ -2,7 +2,7 @@ import unittest from unittest import mock -from . import utils as test_utils +from test.test_asyncio import utils as test_utils class TestPolicy(asyncio.AbstractEventLoopPolicy): @@ -180,3 +180,7 @@ async def main(): self.assertIsNone(spinner.ag_frame) self.assertFalse(spinner.ag_running) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 1b1af08db0a13..28f8de35edd96 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -560,3 +560,7 @@ class SelectEventLoopTests(SendfileTestsBase, def create_event_loop(self): return asyncio.SelectorEventLoop(selectors.SelectSelector()) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_sock_lowlevel.py b/Lib/test/test_asyncio/test_sock_lowlevel.py index d8a5df8ede1f8..7de27b11154a4 100644 --- a/Lib/test/test_asyncio/test_sock_lowlevel.py +++ b/Lib/test/test_asyncio/test_sock_lowlevel.py @@ -1,5 +1,4 @@ import socket -import time import asyncio import sys import unittest @@ -508,3 +507,7 @@ class SelectEventLoopTests(BaseSockTestsMixin, def create_event_loop(self): return asyncio.SelectorEventLoop(selectors.SelectSelector()) + + +if __name__ == '__main__': + unittest.main() From webhook-mailer at python.org Sat Jan 22 07:28:58 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 12:28:58 -0000 Subject: [Python-checkins] [3.10] bpo-46469: Make asyncio generic classes return GenericAlias (GH-30777) (#30784) Message-ID: https://github.com/python/cpython/commit/90e2998db78cd15e45b3c82f6360ac8841e03945 commit: 90e2998db78cd15e45b3c82f6360ac8841e03945 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-22T14:28:51+02:00 summary: [3.10] bpo-46469: Make asyncio generic classes return GenericAlias (GH-30777) (#30784) * bpo-46469: Make asyncio generic classes return GenericAlias (GH-30777) * bpo-46469: Make asyncio generic classes return GenericAlias * ?? Added by blurb_it. 
* Update Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst Co-authored-by: Jelle Zijlstra Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> Co-authored-by: Jelle Zijlstra (cherry picked from commit ea5b96842e066623a53015d8b2492ed61a5baf96) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> * Fix tests Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> Co-authored-by: Andrew Svetlov files: A Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst M Lib/asyncio/futures.py M Lib/asyncio/queues.py M Lib/asyncio/tasks.py M Lib/test/test_asyncio/test_futures.py M Lib/test/test_asyncio/test_queues.py M Lib/test/test_asyncio/test_tasks.py M Modules/_asynciomodule.c diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py index 10f8f0554e4b60..8e8cd87612588c 100644 --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -8,6 +8,7 @@ import contextvars import logging import sys +from types import GenericAlias from . import base_futures from . import events @@ -106,8 +107,7 @@ def __del__(self): context['source_traceback'] = self._source_traceback self._loop.call_exception_handler(context) - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) @property def _log_traceback(self): diff --git a/Lib/asyncio/queues.py b/Lib/asyncio/queues.py index a87ec8b2158767..10dd689bbb2efa 100644 --- a/Lib/asyncio/queues.py +++ b/Lib/asyncio/queues.py @@ -2,6 +2,7 @@ import collections import heapq +from types import GenericAlias from . import locks from . import mixins @@ -69,8 +70,7 @@ def __repr__(self): def __str__(self): return f'<{type(self).__name__} {self._format()}>' - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) def _format(self): result = f'maxsize={self._maxsize!r}' diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py index 53eef84427be10..445a9f51226ed7 100644 --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -17,6 +17,7 @@ import types import warnings import weakref +from types import GenericAlias from . import base_tasks from . 
import coroutines @@ -123,8 +124,7 @@ def __del__(self): self._loop.call_exception_handler(context) super().__del__() - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) def _repr_info(self): return base_tasks._task_repr_info(self) diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py index 0c379e0fb0f956..838147b1e65f4e 100644 --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -7,7 +7,7 @@ import threading import unittest from unittest import mock - +from types import GenericAlias import asyncio from asyncio import futures from test.test_asyncio import utils as test_utils @@ -109,6 +109,11 @@ def setUp(self): self.loop = self.new_test_loop() self.addCleanup(self.loop.close) + def test_generic_alias(self): + future = self.cls[str] + self.assertEqual(future.__args__, (str,)) + self.assertIsInstance(future, GenericAlias) + def test_isfuture(self): class MyFuture: _asyncio_future_blocking = None diff --git a/Lib/test/test_asyncio/test_queues.py b/Lib/test/test_asyncio/test_queues.py index 63a9a5f270cc92..b1a53b859c5ccb 100644 --- a/Lib/test/test_asyncio/test_queues.py +++ b/Lib/test/test_asyncio/test_queues.py @@ -1,9 +1,8 @@ """Tests for queues.py""" import unittest -from unittest import mock - import asyncio +from types import GenericAlias from test.test_asyncio import utils as test_utils @@ -74,6 +73,11 @@ def test_repr(self): def test_str(self): self._test_repr_or_str(str, False) + def test_generic_alias(self): + q = asyncio.Queue[int] + self.assertEqual(q.__args__, (int,)) + self.assertIsInstance(q, GenericAlias) + def test_empty(self): q = asyncio.Queue() self.assertTrue(q.empty()) diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py index 1c05944c42d255..4782c92a7c1550 100644 --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -15,6 +15,7 @@ import unittest import weakref from unittest import mock +from types import GenericAlias import asyncio from asyncio import coroutines @@ -120,6 +121,12 @@ def setUp(self): self.loop.set_task_factory(self.new_task) self.loop.create_future = lambda: self.new_future(self.loop) + + def test_generic_alias(self): + task = self.__class__.Task[str] + self.assertEqual(task.__args__, (str,)) + self.assertIsInstance(task, GenericAlias) + def test_task_cancel_message_getter(self): async def coro(): pass diff --git a/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst b/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst new file mode 100644 index 00000000000000..0d0e4b5d3d7358 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst @@ -0,0 +1 @@ +:mod:`asyncio` generic classes now return :class:`types.GenericAlias` in ``__class_getitem__`` instead of the same class. 
\ No newline at end of file diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index befec9a8342ef4..392e0e7c7357f8 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -1476,13 +1476,6 @@ FutureObj_finalize(FutureObj *fut) PyErr_Restore(error_type, error_value, error_traceback); } -static PyObject * -future_cls_getitem(PyObject *cls, PyObject *type) -{ - Py_INCREF(cls); - return cls; -} - static PyAsyncMethods FutureType_as_async = { (unaryfunc)future_new_iter, /* am_await */ 0, /* am_aiter */ @@ -1503,7 +1496,7 @@ static PyMethodDef FutureType_methods[] = { _ASYNCIO_FUTURE_GET_LOOP_METHODDEF _ASYNCIO_FUTURE__MAKE_CANCELLED_ERROR_METHODDEF _ASYNCIO_FUTURE__REPR_INFO_METHODDEF - {"__class_getitem__", future_cls_getitem, METH_O|METH_CLASS, NULL}, + {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* Sentinel */ }; @@ -2445,13 +2438,6 @@ TaskObj_finalize(TaskObj *task) FutureObj_finalize((FutureObj*)task); } -static PyObject * -task_cls_getitem(PyObject *cls, PyObject *type) -{ - Py_INCREF(cls); - return cls; -} - static void TaskObj_dealloc(PyObject *); /* Needs Task_CheckExact */ static PyMethodDef TaskType_methods[] = { @@ -2471,7 +2457,7 @@ static PyMethodDef TaskType_methods[] = { _ASYNCIO_TASK_GET_NAME_METHODDEF _ASYNCIO_TASK_SET_NAME_METHODDEF _ASYNCIO_TASK_GET_CORO_METHODDEF - {"__class_getitem__", task_cls_getitem, METH_O|METH_CLASS, NULL}, + {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* Sentinel */ }; From webhook-mailer at python.org Sat Jan 22 07:29:55 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 12:29:55 -0000 Subject: [Python-checkins] bpo-46425: fix direct invocation of `asyncio` tests (GH-30725) (#30782) Message-ID: https://github.com/python/cpython/commit/6111d5dee2b24916ff95dba56efc569396a31851 commit: 6111d5dee2b24916ff95dba56efc569396a31851 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-22T14:29:51+02:00 summary: bpo-46425: fix direct invocation of `asyncio` tests (GH-30725) (#30782) (cherry picked from commit 5a5340044ca98cbe6297668d91bccba04b102923) Co-authored-by: Nikita Sobolev Co-authored-by: Nikita Sobolev files: M Lib/test/test_asyncio/test_context.py M Lib/test/test_asyncio/test_futures2.py M Lib/test/test_asyncio/test_protocols.py M Lib/test/test_asyncio/test_runners.py M Lib/test/test_asyncio/test_sendfile.py M Lib/test/test_asyncio/test_sock_lowlevel.py diff --git a/Lib/test/test_asyncio/test_context.py b/Lib/test/test_asyncio/test_context.py index 63b1eb320ce16..6b80721873d95 100644 --- a/Lib/test/test_asyncio/test_context.py +++ b/Lib/test/test_asyncio/test_context.py @@ -32,3 +32,7 @@ async def main(): self.assertEqual(str(r2[0]), '0.333333') self.assertEqual(str(r2[1]), '0.111111') + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_futures2.py b/Lib/test/test_asyncio/test_futures2.py index 13dbc703277c8..57d24190bc0bd 100644 --- a/Lib/test/test_asyncio/test_futures2.py +++ b/Lib/test/test_asyncio/test_futures2.py @@ -16,3 +16,7 @@ async def func(): # The check for returned string is not very reliable but # exact comparison for the whole string is even weaker. 
self.assertIn('...', repr(await asyncio.wait_for(func(), timeout=10))) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_protocols.py b/Lib/test/test_asyncio/test_protocols.py index 438111cccd347..d8cde6d89aadc 100644 --- a/Lib/test/test_asyncio/test_protocols.py +++ b/Lib/test/test_asyncio/test_protocols.py @@ -55,3 +55,7 @@ def test_subprocess_protocol(self): self.assertIsNone(sp.pipe_connection_lost(1, f)) self.assertIsNone(sp.process_exited()) self.assertFalse(hasattr(sp, '__dict__')) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_runners.py b/Lib/test/test_asyncio/test_runners.py index b9ae02dc3c04e..5c06a1aaa830f 100644 --- a/Lib/test/test_asyncio/test_runners.py +++ b/Lib/test/test_asyncio/test_runners.py @@ -2,7 +2,7 @@ import unittest from unittest import mock -from . import utils as test_utils +from test.test_asyncio import utils as test_utils class TestPolicy(asyncio.AbstractEventLoopPolicy): @@ -180,3 +180,7 @@ async def main(): self.assertIsNone(spinner.ag_frame) self.assertFalse(spinner.ag_running) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 0a5466a0af152..57b56bba34100 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -565,3 +565,7 @@ class SelectEventLoopTests(SendfileTestsBase, def create_event_loop(self): return asyncio.SelectorEventLoop(selectors.SelectSelector()) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_asyncio/test_sock_lowlevel.py b/Lib/test/test_asyncio/test_sock_lowlevel.py index ab022a357d205..448d835b04d57 100644 --- a/Lib/test/test_asyncio/test_sock_lowlevel.py +++ b/Lib/test/test_asyncio/test_sock_lowlevel.py @@ -1,5 +1,4 @@ import socket -import time import asyncio import sys import unittest @@ -512,3 +511,7 @@ class SelectEventLoopTests(BaseSockTestsMixin, def create_event_loop(self): return asyncio.SelectorEventLoop(selectors.SelectSelector()) + + +if __name__ == '__main__': + unittest.main() From webhook-mailer at python.org Sat Jan 22 07:31:20 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 12:31:20 -0000 Subject: [Python-checkins] bpo-46468: document that "-m http.server" defaults to port 8000 (GH-30776) Message-ID: https://github.com/python/cpython/commit/c8a536624e8f5d6612e3c275c5b19592583a8cf8 commit: c8a536624e8f5d6612e3c275c5b19592583a8cf8 branch: main author: Jelle Zijlstra committer: asvetlov date: 2022-01-22T14:31:15+02:00 summary: bpo-46468: document that "-m http.server" defaults to port 8000 (GH-30776) Code link: https://github.com/python/cpython/blob/70c16468deee9390e34322d32fda57df6e0f46bb/Lib/http/server.py#L1270 It's been this way since at least 3.4. Also improved some wording in the same section. files: M Doc/library/http.server.rst diff --git a/Doc/library/http.server.rst b/Doc/library/http.server.rst index c3cee079526b2..0de02834401aa 100644 --- a/Doc/library/http.server.rst +++ b/Doc/library/http.server.rst @@ -412,17 +412,22 @@ the current directory:: .. _http-server-cli: :mod:`http.server` can also be invoked directly using the :option:`-m` -switch of the interpreter with a ``port number`` argument. Similar to +switch of the interpreter. Similar to the previous example, this serves files relative to the current directory:: - python -m http.server 8000 + python -m http.server -By default, server binds itself to all interfaces. 
The option ``-b/--bind`` +The server listens to port 8000 by default. The default can be overridden +by passing the desired port number as an argument:: + + python -m http.server 9000 + +By default, the server binds itself to all interfaces. The option ``-b/--bind`` specifies a specific address to which it should bind. Both IPv4 and IPv6 addresses are supported. For example, the following command causes the server to bind to localhost only:: - python -m http.server 8000 --bind 127.0.0.1 + python -m http.server --bind 127.0.0.1 .. versionadded:: 3.4 ``--bind`` argument was introduced. @@ -430,14 +435,14 @@ to bind to localhost only:: .. versionadded:: 3.8 ``--bind`` argument enhanced to support IPv6 -By default, server uses the current directory. The option ``-d/--directory`` +By default, the server uses the current directory. The option ``-d/--directory`` specifies a directory to which it should serve the files. For example, the following command uses a specific directory:: python -m http.server --directory /tmp/ .. versionadded:: 3.7 - ``--directory`` specify alternate directory + ``--directory`` argument was introduced. .. class:: CGIHTTPRequestHandler(request, client_address, server) @@ -482,4 +487,4 @@ the following command uses a specific directory:: :class:`CGIHTTPRequestHandler` can be enabled in the command line by passing the ``--cgi`` option:: - python -m http.server --cgi 8000 + python -m http.server --cgi From webhook-mailer at python.org Sat Jan 22 07:52:30 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 22 Jan 2022 12:52:30 -0000 Subject: [Python-checkins] [3.9] bpo-46469: Make asyncio generic classes return GenericAlias (GH-30777) (GH-30785) Message-ID: https://github.com/python/cpython/commit/6ed874f8c59cc6c01d9663bad2f4bed8dc1c6109 commit: 6ed874f8c59cc6c01d9663bad2f4bed8dc1c6109 branch: 3.9 author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-22T04:52:24-08:00 summary: [3.9] bpo-46469: Make asyncio generic classes return GenericAlias (GH-30777) (GH-30785) Automerge-Triggered-By: GH:asvetlov files: A Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst M Lib/asyncio/futures.py M Lib/asyncio/queues.py M Lib/asyncio/tasks.py M Lib/test/test_asyncio/test_futures.py M Lib/test/test_asyncio/test_queues.py M Lib/test/test_asyncio/test_tasks.py M Modules/_asynciomodule.c diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py index bed4da52fd4d9..aaab09c28e6b5 100644 --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -8,6 +8,7 @@ import contextvars import logging import sys +from types import GenericAlias from . import base_futures from . import events @@ -106,8 +107,7 @@ def __del__(self): context['source_traceback'] = self._source_traceback self._loop.call_exception_handler(context) - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) @property def _log_traceback(self): diff --git a/Lib/asyncio/queues.py b/Lib/asyncio/queues.py index cd3f7c6a56789..14ae87e0a2dd2 100644 --- a/Lib/asyncio/queues.py +++ b/Lib/asyncio/queues.py @@ -3,6 +3,7 @@ import collections import heapq import warnings +from types import GenericAlias from . import events from . 
import locks @@ -76,8 +77,7 @@ def __repr__(self): def __str__(self): return f'<{type(self).__name__} {self._format()}>' - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) def _format(self): result = f'maxsize={self._maxsize!r}' diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py index 27a3c8c5a88df..39bd068535668 100644 --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -17,6 +17,7 @@ import types import warnings import weakref +from types import GenericAlias from . import base_tasks from . import coroutines @@ -147,8 +148,7 @@ def __del__(self): self._loop.call_exception_handler(context) super().__del__() - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) def _repr_info(self): return base_tasks._task_repr_info(self) diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py index c16bfc08a7abe..939ad7be6a436 100644 --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -7,7 +7,7 @@ import threading import unittest from unittest import mock - +from types import GenericAlias import asyncio from asyncio import futures from test.test_asyncio import utils as test_utils @@ -109,6 +109,11 @@ def setUp(self): self.loop = self.new_test_loop() self.addCleanup(self.loop.close) + def test_generic_alias(self): + future = self.cls[str] + self.assertEqual(future.__args__, (str,)) + self.assertIsInstance(future, GenericAlias) + def test_isfuture(self): class MyFuture: _asyncio_future_blocking = None diff --git a/Lib/test/test_asyncio/test_queues.py b/Lib/test/test_asyncio/test_queues.py index 81e888fbce6db..90d352276c9d6 100644 --- a/Lib/test/test_asyncio/test_queues.py +++ b/Lib/test/test_asyncio/test_queues.py @@ -4,6 +4,7 @@ from unittest import mock import asyncio +from types import GenericAlias from test.test_asyncio import utils as test_utils @@ -92,6 +93,11 @@ def test_repr(self): def test_str(self): self._test_repr_or_str(str, False) + def test_generic_alias(self): + q = asyncio.Queue[int] + self.assertEqual(q.__args__, (int,)) + self.assertIsInstance(q, GenericAlias) + def test_empty(self): with self.assertWarns(DeprecationWarning): q = asyncio.Queue(loop=self.loop) diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py index a6afce0766493..2f016740b9e76 100644 --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -9,12 +9,13 @@ import random import re import sys +import types import textwrap import traceback -import types import unittest import weakref from unittest import mock +from types import GenericAlias import asyncio from asyncio import coroutines @@ -120,6 +121,12 @@ def setUp(self): self.loop.set_task_factory(self.new_task) self.loop.create_future = lambda: self.new_future(self.loop) + + def test_generic_alias(self): + task = self.__class__.Task[str] + self.assertEqual(task.__args__, (str,)) + self.assertIsInstance(task, GenericAlias) + def test_task_cancel_message_getter(self): async def coro(): pass diff --git a/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst b/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst new file mode 100644 index 0000000000000..0d0e4b5d3d735 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-05-05-08.bpo-46469.plUab5.rst @@ -0,0 +1 @@ +:mod:`asyncio` generic classes now return :class:`types.GenericAlias` in ``__class_getitem__`` instead of the same class. 
\ No newline at end of file diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index a1421cf5dbec8..b64069a25a743 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -1474,13 +1474,6 @@ FutureObj_finalize(FutureObj *fut) PyErr_Restore(error_type, error_value, error_traceback); } -static PyObject * -future_cls_getitem(PyObject *cls, PyObject *type) -{ - Py_INCREF(cls); - return cls; -} - static PyAsyncMethods FutureType_as_async = { (unaryfunc)future_new_iter, /* am_await */ 0, /* am_aiter */ @@ -1500,7 +1493,7 @@ static PyMethodDef FutureType_methods[] = { _ASYNCIO_FUTURE_GET_LOOP_METHODDEF _ASYNCIO_FUTURE__MAKE_CANCELLED_ERROR_METHODDEF _ASYNCIO_FUTURE__REPR_INFO_METHODDEF - {"__class_getitem__", future_cls_getitem, METH_O|METH_CLASS, NULL}, + {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* Sentinel */ }; @@ -2487,13 +2480,6 @@ TaskObj_finalize(TaskObj *task) FutureObj_finalize((FutureObj*)task); } -static PyObject * -task_cls_getitem(PyObject *cls, PyObject *type) -{ - Py_INCREF(cls); - return cls; -} - static void TaskObj_dealloc(PyObject *); /* Needs Task_CheckExact */ static PyMethodDef TaskType_methods[] = { @@ -2513,7 +2499,7 @@ static PyMethodDef TaskType_methods[] = { _ASYNCIO_TASK_GET_NAME_METHODDEF _ASYNCIO_TASK_SET_NAME_METHODDEF _ASYNCIO_TASK_GET_CORO_METHODDEF - {"__class_getitem__", task_cls_getitem, METH_O|METH_CLASS, NULL}, + {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* Sentinel */ }; From webhook-mailer at python.org Sat Jan 22 09:08:54 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 14:08:54 -0000 Subject: [Python-checkins] bpo-46417: Fix race condition on setting type __bases__ (GH-30788) Message-ID: https://github.com/python/cpython/commit/f1c6ae3270913e095d24ae13ecf96f5a32c8c503 commit: f1c6ae3270913e095d24ae13ecf96f5a32c8c503 branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T15:08:42+01:00 summary: bpo-46417: Fix race condition on setting type __bases__ (GH-30788) Fix a race condition on setting a type __bases__ attribute: the internal function add_subclass() now gets the PyTypeObject.tp_subclasses member after calling PyWeakref_NewRef() which can trigger a garbage collection which can indirectly modify PyTypeObject.tp_subclasses. files: A Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst M Objects/typeobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst new file mode 100644 index 00000000000000..54fe09b7ba4544 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst @@ -0,0 +1,5 @@ +Fix a race condition on setting a type ``__bases__`` attribute: the internal +function ``add_subclass()`` now gets the ``PyTypeObject.tp_subclasses`` +member after calling :c:func:`PyWeakref_NewRef` which can trigger a garbage +collection which can indirectly modify ``PyTypeObject.tp_subclasses``. Patch +by Victor Stinner. 
diff --git a/Objects/typeobject.c b/Objects/typeobject.c index e4a4824fa2e41b..2b47afe30e6ecb 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -6503,24 +6503,29 @@ PyType_Ready(PyTypeObject *type) static int add_subclass(PyTypeObject *base, PyTypeObject *type) { - int result = -1; - PyObject *dict, *key, *newobj; + PyObject *key = PyLong_FromVoidPtr((void *) type); + if (key == NULL) + return -1; - dict = base->tp_subclasses; + PyObject *ref = PyWeakref_NewRef((PyObject *)type, NULL); + if (ref == NULL) { + Py_DECREF(key); + return -1; + } + + // Only get tp_subclasses after creating the key and value. + // PyWeakref_NewRef() can trigger a garbage collection which can execute + // arbitrary Python code and so modify base->tp_subclasses. + PyObject *dict = base->tp_subclasses; if (dict == NULL) { base->tp_subclasses = dict = PyDict_New(); if (dict == NULL) return -1; } assert(PyDict_CheckExact(dict)); - key = PyLong_FromVoidPtr((void *) type); - if (key == NULL) - return -1; - newobj = PyWeakref_NewRef((PyObject *)type, NULL); - if (newobj != NULL) { - result = PyDict_SetItem(dict, key, newobj); - Py_DECREF(newobj); - } + + int result = PyDict_SetItem(dict, key, ref); + Py_DECREF(ref); Py_DECREF(key); return result; } From webhook-mailer at python.org Sat Jan 22 09:28:57 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 14:28:57 -0000 Subject: [Python-checkins] bpo-46417: Fix race condition on setting type __bases__ (GH-30788) (GH-30789) Message-ID: https://github.com/python/cpython/commit/acda9f3b90c33e4020237cb9e5c676efb38f7847 commit: acda9f3b90c33e4020237cb9e5c676efb38f7847 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-22T15:28:36+01:00 summary: bpo-46417: Fix race condition on setting type __bases__ (GH-30788) (GH-30789) Fix a race condition on setting a type __bases__ attribute: the internal function add_subclass() now gets the PyTypeObject.tp_subclasses member after calling PyWeakref_NewRef() which can trigger a garbage collection which can indirectly modify PyTypeObject.tp_subclasses. (cherry picked from commit f1c6ae3270913e095d24ae13ecf96f5a32c8c503) Co-authored-by: Victor Stinner Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst M Objects/typeobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst new file mode 100644 index 0000000000000..54fe09b7ba454 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst @@ -0,0 +1,5 @@ +Fix a race condition on setting a type ``__bases__`` attribute: the internal +function ``add_subclass()`` now gets the ``PyTypeObject.tp_subclasses`` +member after calling :c:func:`PyWeakref_NewRef` which can trigger a garbage +collection which can indirectly modify ``PyTypeObject.tp_subclasses``. Patch +by Victor Stinner. 
diff --git a/Objects/typeobject.c b/Objects/typeobject.c index b23e36a420fa0..10b69fe1196db 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -6384,24 +6384,29 @@ PyType_Ready(PyTypeObject *type) static int add_subclass(PyTypeObject *base, PyTypeObject *type) { - int result = -1; - PyObject *dict, *key, *newobj; + PyObject *key = PyLong_FromVoidPtr((void *) type); + if (key == NULL) + return -1; - dict = base->tp_subclasses; + PyObject *ref = PyWeakref_NewRef((PyObject *)type, NULL); + if (ref == NULL) { + Py_DECREF(key); + return -1; + } + + // Only get tp_subclasses after creating the key and value. + // PyWeakref_NewRef() can trigger a garbage collection which can execute + // arbitrary Python code and so modify base->tp_subclasses. + PyObject *dict = base->tp_subclasses; if (dict == NULL) { base->tp_subclasses = dict = PyDict_New(); if (dict == NULL) return -1; } assert(PyDict_CheckExact(dict)); - key = PyLong_FromVoidPtr((void *) type); - if (key == NULL) - return -1; - newobj = PyWeakref_NewRef((PyObject *)type, NULL); - if (newobj != NULL) { - result = PyDict_SetItem(dict, key, newobj); - Py_DECREF(newobj); - } + + int result = PyDict_SetItem(dict, key, ref); + Py_DECREF(ref); Py_DECREF(key); return result; } From webhook-mailer at python.org Sat Jan 22 09:28:58 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 14:28:58 -0000 Subject: [Python-checkins] bpo-46417: Fix race condition on setting type __bases__ (GH-30788) (GH-30790) Message-ID: https://github.com/python/cpython/commit/f1796f29478f08f34e0c30a060622c0b2d843e2c commit: f1796f29478f08f34e0c30a060622c0b2d843e2c branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-22T15:28:42+01:00 summary: bpo-46417: Fix race condition on setting type __bases__ (GH-30788) (GH-30790) Fix a race condition on setting a type __bases__ attribute: the internal function add_subclass() now gets the PyTypeObject.tp_subclasses member after calling PyWeakref_NewRef() which can trigger a garbage collection which can indirectly modify PyTypeObject.tp_subclasses. (cherry picked from commit f1c6ae3270913e095d24ae13ecf96f5a32c8c503) Co-authored-by: Victor Stinner Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst M Objects/typeobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst new file mode 100644 index 0000000000000..54fe09b7ba454 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-22-14-39-23.bpo-46417.3U5SfN.rst @@ -0,0 +1,5 @@ +Fix a race condition on setting a type ``__bases__`` attribute: the internal +function ``add_subclass()`` now gets the ``PyTypeObject.tp_subclasses`` +member after calling :c:func:`PyWeakref_NewRef` which can trigger a garbage +collection which can indirectly modify ``PyTypeObject.tp_subclasses``. Patch +by Victor Stinner. 
diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 1cdf80bfcf5af..d9ea9e8626478 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -5622,24 +5622,29 @@ PyType_Ready(PyTypeObject *type) static int add_subclass(PyTypeObject *base, PyTypeObject *type) { - int result = -1; - PyObject *dict, *key, *newobj; + PyObject *key = PyLong_FromVoidPtr((void *) type); + if (key == NULL) + return -1; - dict = base->tp_subclasses; + PyObject *ref = PyWeakref_NewRef((PyObject *)type, NULL); + if (ref == NULL) { + Py_DECREF(key); + return -1; + } + + // Only get tp_subclasses after creating the key and value. + // PyWeakref_NewRef() can trigger a garbage collection which can execute + // arbitrary Python code and so modify base->tp_subclasses. + PyObject *dict = base->tp_subclasses; if (dict == NULL) { base->tp_subclasses = dict = PyDict_New(); if (dict == NULL) return -1; } assert(PyDict_CheckExact(dict)); - key = PyLong_FromVoidPtr((void *) type); - if (key == NULL) - return -1; - newobj = PyWeakref_NewRef((PyObject *)type, NULL); - if (newobj != NULL) { - result = PyDict_SetItem(dict, key, newobj); - Py_DECREF(newobj); - } + + int result = PyDict_SetItem(dict, key, ref); + Py_DECREF(ref); Py_DECREF(key); return result; } From webhook-mailer at python.org Sat Jan 22 10:31:49 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 15:31:49 -0000 Subject: [Python-checkins] bpo-46468: document that "-m http.server" defaults to port 8000 (GH-30776) (#30786) Message-ID: https://github.com/python/cpython/commit/b4088801db4b4f56b177b1c01dd873c7922e6a9f commit: b4088801db4b4f56b177b1c01dd873c7922e6a9f branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-22T17:31:40+02:00 summary: bpo-46468: document that "-m http.server" defaults to port 8000 (GH-30776) (#30786) Code link: https://github.com/python/cpython/blob/70c16468deee9390e34322d32fda57df6e0f46bb/Lib/http/server.pyGH-L1270 It's been this way since at least 3.4. Also improved some wording in the same section. (cherry picked from commit c8a536624e8f5d6612e3c275c5b19592583a8cf8) Co-authored-by: Jelle Zijlstra Co-authored-by: Jelle Zijlstra files: M Doc/library/http.server.rst diff --git a/Doc/library/http.server.rst b/Doc/library/http.server.rst index 08b0ddf5f2c6b..dea79108d34f2 100644 --- a/Doc/library/http.server.rst +++ b/Doc/library/http.server.rst @@ -412,17 +412,22 @@ the current directory:: .. _http-server-cli: :mod:`http.server` can also be invoked directly using the :option:`-m` -switch of the interpreter with a ``port number`` argument. Similar to +switch of the interpreter. Similar to the previous example, this serves files relative to the current directory:: - python -m http.server 8000 + python -m http.server -By default, server binds itself to all interfaces. The option ``-b/--bind`` +The server listens to port 8000 by default. The default can be overridden +by passing the desired port number as an argument:: + + python -m http.server 9000 + +By default, the server binds itself to all interfaces. The option ``-b/--bind`` specifies a specific address to which it should bind. Both IPv4 and IPv6 addresses are supported. For example, the following command causes the server to bind to localhost only:: - python -m http.server 8000 --bind 127.0.0.1 + python -m http.server --bind 127.0.0.1 .. versionadded:: 3.4 ``--bind`` argument was introduced. @@ -430,14 +435,14 @@ to bind to localhost only:: .. 
versionadded:: 3.8 ``--bind`` argument enhanced to support IPv6 -By default, server uses the current directory. The option ``-d/--directory`` +By default, the server uses the current directory. The option ``-d/--directory`` specifies a directory to which it should serve the files. For example, the following command uses a specific directory:: python -m http.server --directory /tmp/ .. versionadded:: 3.7 - ``--directory`` specify alternate directory + ``--directory`` argument was introduced. .. class:: CGIHTTPRequestHandler(request, client_address, server) @@ -482,4 +487,4 @@ the following command uses a specific directory:: :class:`CGIHTTPRequestHandler` can be enabled in the command line by passing the ``--cgi`` option:: - python -m http.server --cgi 8000 + python -m http.server --cgi From webhook-mailer at python.org Sat Jan 22 10:31:56 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 15:31:56 -0000 Subject: [Python-checkins] bpo-46468: document that "-m http.server" defaults to port 8000 (GH-30776) (#30787) Message-ID: https://github.com/python/cpython/commit/923c994400b3f1c67f95d25c703e131890a16912 commit: 923c994400b3f1c67f95d25c703e131890a16912 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: asvetlov date: 2022-01-22T17:31:52+02:00 summary: bpo-46468: document that "-m http.server" defaults to port 8000 (GH-30776) (#30787) Code link: https://github.com/python/cpython/blob/70c16468deee9390e34322d32fda57df6e0f46bb/Lib/http/server.pyGH-L1270 It's been this way since at least 3.4. Also improved some wording in the same section. (cherry picked from commit c8a536624e8f5d6612e3c275c5b19592583a8cf8) Co-authored-by: Jelle Zijlstra Co-authored-by: Jelle Zijlstra files: M Doc/library/http.server.rst diff --git a/Doc/library/http.server.rst b/Doc/library/http.server.rst index c3cee079526b2..0de02834401aa 100644 --- a/Doc/library/http.server.rst +++ b/Doc/library/http.server.rst @@ -412,17 +412,22 @@ the current directory:: .. _http-server-cli: :mod:`http.server` can also be invoked directly using the :option:`-m` -switch of the interpreter with a ``port number`` argument. Similar to +switch of the interpreter. Similar to the previous example, this serves files relative to the current directory:: - python -m http.server 8000 + python -m http.server -By default, server binds itself to all interfaces. The option ``-b/--bind`` +The server listens to port 8000 by default. The default can be overridden +by passing the desired port number as an argument:: + + python -m http.server 9000 + +By default, the server binds itself to all interfaces. The option ``-b/--bind`` specifies a specific address to which it should bind. Both IPv4 and IPv6 addresses are supported. For example, the following command causes the server to bind to localhost only:: - python -m http.server 8000 --bind 127.0.0.1 + python -m http.server --bind 127.0.0.1 .. versionadded:: 3.4 ``--bind`` argument was introduced. @@ -430,14 +435,14 @@ to bind to localhost only:: .. versionadded:: 3.8 ``--bind`` argument enhanced to support IPv6 -By default, server uses the current directory. The option ``-d/--directory`` +By default, the server uses the current directory. The option ``-d/--directory`` specifies a directory to which it should serve the files. For example, the following command uses a specific directory:: python -m http.server --directory /tmp/ .. 
versionadded:: 3.7 - ``--directory`` specify alternate directory + ``--directory`` argument was introduced. .. class:: CGIHTTPRequestHandler(request, client_address, server) @@ -482,4 +487,4 @@ the following command uses a specific directory:: :class:`CGIHTTPRequestHandler` can be enabled in the command line by passing the ``--cgi`` option:: - python -m http.server --cgi 8000 + python -m http.server --cgi From webhook-mailer at python.org Sat Jan 22 10:53:50 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 15:53:50 -0000 Subject: [Python-checkins] bpo-46417: remove_subclass() clears tp_subclasses (GH-30793) Message-ID: https://github.com/python/cpython/commit/2d03b73cc9c0dada3243eab1373a46dbd98d24a0 commit: 2d03b73cc9c0dada3243eab1373a46dbd98d24a0 branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T16:53:30+01:00 summary: bpo-46417: remove_subclass() clears tp_subclasses (GH-30793) The remove_subclass() function now deletes the dictionary when removing the last subclass (if the dictionary becomes empty) to save memory: set PyTypeObject.tp_subclasses to NULL. remove_subclass() is called when a type is deallocated. _PyType_GetSubclasses() no longer holds a reference to tp_subclasses: its loop cannot modify tp_subclasses. files: M Lib/test/test_descr.py M Objects/typeobject.c diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py index 707c93140e251..791cf714d46a3 100644 --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -4923,6 +4923,23 @@ def __new__(cls): cls.lst = [2**i for i in range(10000)] X.descr + def test_remove_subclass(self): + # bpo-46417: when the last subclass of a type is deleted, + # remove_subclass() clears the internal dictionary of subclasses: + # set PyTypeObject.tp_subclasses to NULL. remove_subclass() is called + # when a type is deallocated. + class Parent: + pass + self.assertEqual(Parent.__subclasses__(), []) + + class Child(Parent): + pass + self.assertEqual(Parent.__subclasses__(), [Child]) + + del Child + gc.collect() + self.assertEqual(Parent.__subclasses__(), []) + class DictProxyTests(unittest.TestCase): def setUp(self): diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 2b47afe30e6ec..b3c305e0bf430 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -4137,16 +4137,17 @@ _PyType_GetSubclasses(PyTypeObject *self) return NULL; } - // Hold a strong reference to tp_subclasses while iterating on it - PyObject *dict = Py_XNewRef(self->tp_subclasses); - if (dict == NULL) { + PyObject *subclasses = self->tp_subclasses; // borrowed ref + if (subclasses == NULL) { return list; } - assert(PyDict_CheckExact(dict)); + assert(PyDict_CheckExact(subclasses)); + // The loop cannot modify tp_subclasses, there is no need + // to hold a strong reference (use a borrowed reference). Py_ssize_t i = 0; PyObject *ref; // borrowed ref - while (PyDict_Next(dict, &i, NULL, &ref)) { + while (PyDict_Next(subclasses, &i, NULL, &ref)) { assert(PyWeakref_CheckRef(ref)); PyObject *obj = PyWeakref_GET_OBJECT(ref); // borrowed ref if (obj == Py_None) { @@ -4154,12 +4155,10 @@ _PyType_GetSubclasses(PyTypeObject *self) } assert(PyType_Check(obj)); if (PyList_Append(list, obj) < 0) { - Py_CLEAR(list); - goto done; + Py_DECREF(list); + return NULL; } } -done: - Py_DECREF(dict); return list; } @@ -6568,6 +6567,13 @@ remove_subclass(PyTypeObject *base, PyTypeObject *type) PyErr_Clear(); } Py_XDECREF(key); + + if (PyDict_Size(dict) == 0) { + // Delete the dictionary to save memory. 
_PyStaticType_Dealloc() + // callers also test if tp_subclasses is NULL to check if a static type + // has no subclass. + Py_CLEAR(base->tp_subclasses); + } } static void From webhook-mailer at python.org Sat Jan 22 11:03:17 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 16:03:17 -0000 Subject: [Python-checkins] bpo-46425: fix direct invocation of `test_traceback` (GH-30746) Message-ID: https://github.com/python/cpython/commit/101a184d49756043a0c39dde6eca08b1891137a2 commit: 101a184d49756043a0c39dde6eca08b1891137a2 branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-22T18:03:13+02:00 summary: bpo-46425: fix direct invocation of `test_traceback` (GH-30746) files: M Lib/test/test_traceback.py diff --git a/Lib/test/test_traceback.py b/Lib/test/test_traceback.py index a0e4656d3d9ea..966ff2a1241ca 100644 --- a/Lib/test/test_traceback.py +++ b/Lib/test/test_traceback.py @@ -18,6 +18,7 @@ import traceback from functools import partial +MODULE_PREFIX = f'{__name__}.' if __name__ == '__main__' else '' test_code = namedtuple('code', ['co_filename', 'co_name']) test_code.co_positions = lambda _: iter([(6, 6, 0, 0)]) @@ -1312,7 +1313,7 @@ def __str__(self): str_value = 'I am X' str_name = '.'.join([A.B.X.__module__, A.B.X.__qualname__]) exp = "%s: %s\n" % (str_name, str_value) - self.assertEqual(exp, err) + self.assertEqual(exp, MODULE_PREFIX + err) def test_exception_modulename(self): class X(Exception): @@ -1349,7 +1350,7 @@ def __str__(self): err = self.get_report(X()) str_value = '' str_name = '.'.join([X.__module__, X.__qualname__]) - self.assertEqual(err, f"{str_name}: {str_value}\n") + self.assertEqual(MODULE_PREFIX + err, f"{str_name}: {str_value}\n") # #### Exception Groups #### From webhook-mailer at python.org Sat Jan 22 11:04:01 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 16:04:01 -0000 Subject: [Python-checkins] bpo-46425: use absolute imports in `test_sqlite3` (GH-30676) Message-ID: https://github.com/python/cpython/commit/55f4ec460ee6dcffc26180fd982ad89083c9acb1 commit: 55f4ec460ee6dcffc26180fd982ad89083c9acb1 branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-22T18:03:56+02:00 summary: bpo-46425: use absolute imports in `test_sqlite3` (GH-30676) files: M Lib/test/test_sqlite3/test_hooks.py M Lib/test/test_sqlite3/test_regression.py M Lib/test/test_sqlite3/test_transactions.py M Lib/test/test_sqlite3/test_userfunctions.py diff --git a/Lib/test/test_sqlite3/test_hooks.py b/Lib/test/test_sqlite3/test_hooks.py index 9e5e53ad223f0..d4790cfe77b7b 100644 --- a/Lib/test/test_sqlite3/test_hooks.py +++ b/Lib/test/test_sqlite3/test_hooks.py @@ -24,7 +24,7 @@ import sqlite3 as sqlite from test.support.os_helper import TESTFN, unlink -from .test_userfunctions import with_tracebacks +from test.test_sqlite3.test_userfunctions import with_tracebacks class CollationTests(unittest.TestCase): def test_create_collation_not_string(self): diff --git a/Lib/test/test_sqlite3/test_regression.py b/Lib/test/test_sqlite3/test_regression.py index b527053039b8a..211f7636b746d 100644 --- a/Lib/test/test_sqlite3/test_regression.py +++ b/Lib/test/test_sqlite3/test_regression.py @@ -21,14 +21,13 @@ # 3. This notice may not be removed or altered from any source distribution. 
import datetime -import sys import unittest import sqlite3 as sqlite import weakref import functools -from test import support -from .test_dbapi import memory_database, managed_connect, cx_limit +from test import support +from test.test_sqlite3.test_dbapi import memory_database, managed_connect, cx_limit class RegressionTests(unittest.TestCase): diff --git a/Lib/test/test_sqlite3/test_transactions.py b/Lib/test/test_sqlite3/test_transactions.py index 55cf8f1fdfce3..040ab1ee608cf 100644 --- a/Lib/test/test_sqlite3/test_transactions.py +++ b/Lib/test/test_sqlite3/test_transactions.py @@ -23,7 +23,7 @@ import os, unittest import sqlite3 as sqlite -from .test_dbapi import memory_database +from test.test_sqlite3.test_dbapi import memory_database def get_db_path(): return "sqlite_testdb" diff --git a/Lib/test/test_sqlite3/test_userfunctions.py b/Lib/test/test_sqlite3/test_userfunctions.py index 996437b1a4bee..b2906081e5028 100644 --- a/Lib/test/test_sqlite3/test_userfunctions.py +++ b/Lib/test/test_sqlite3/test_userfunctions.py @@ -32,7 +32,7 @@ import sqlite3 as sqlite from test.support import bigmemtest, catch_unraisable_exception -from .test_dbapi import cx_limit +from test.test_sqlite3.test_dbapi import cx_limit def with_tracebacks(exc, regex="", name=""): From webhook-mailer at python.org Sat Jan 22 11:05:09 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 16:05:09 -0000 Subject: [Python-checkins] bpo-46425: fix direct invocation of `test_importlib` (GH-30682) Message-ID: https://github.com/python/cpython/commit/57316c52bae5d6420f5067f3891ec328deb97305 commit: 57316c52bae5d6420f5067f3891ec328deb97305 branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-22T18:05:05+02:00 summary: bpo-46425: fix direct invocation of `test_importlib` (GH-30682) files: M Lib/test/test_importlib/builtin/test_finder.py M Lib/test/test_importlib/builtin/test_loader.py M Lib/test/test_importlib/extension/test_case_sensitivity.py M Lib/test/test_importlib/extension/test_finder.py M Lib/test/test_importlib/extension/test_loader.py M Lib/test/test_importlib/extension/test_path_hook.py M Lib/test/test_importlib/frozen/test_finder.py M Lib/test/test_importlib/frozen/test_loader.py M Lib/test/test_importlib/import_/test___loader__.py M Lib/test/test_importlib/import_/test___package__.py M Lib/test/test_importlib/import_/test_api.py M Lib/test/test_importlib/import_/test_caching.py M Lib/test/test_importlib/import_/test_fromlist.py M Lib/test/test_importlib/import_/test_meta_path.py M Lib/test/test_importlib/import_/test_packages.py M Lib/test/test_importlib/import_/test_path.py M Lib/test/test_importlib/import_/test_relative_imports.py M Lib/test/test_importlib/source/test_case_sensitivity.py M Lib/test/test_importlib/source/test_file_loader.py M Lib/test/test_importlib/source/test_finder.py M Lib/test/test_importlib/source/test_path_hook.py M Lib/test/test_importlib/source/test_source_encoding.py M Lib/test/test_importlib/test_abc.py M Lib/test/test_importlib/test_api.py M Lib/test/test_importlib/test_compatibilty_files.py M Lib/test/test_importlib/test_contents.py M Lib/test/test_importlib/test_files.py M Lib/test/test_importlib/test_lazy.py M Lib/test/test_importlib/test_locks.py M Lib/test/test_importlib/test_main.py M Lib/test/test_importlib/test_metadata_api.py M Lib/test/test_importlib/test_open.py M Lib/test/test_importlib/test_path.py M Lib/test/test_importlib/test_read.py M Lib/test/test_importlib/test_resource.py M Lib/test/test_importlib/test_spec.py M 
Lib/test/test_importlib/test_util.py M Lib/test/test_importlib/test_windows.py M Lib/test/test_importlib/test_zip.py diff --git a/Lib/test/test_importlib/builtin/test_finder.py b/Lib/test/test_importlib/builtin/test_finder.py index 6f51abab9bcd1..a4869e07b9c0c 100644 --- a/Lib/test/test_importlib/builtin/test_finder.py +++ b/Lib/test/test_importlib/builtin/test_finder.py @@ -1,5 +1,4 @@ -from .. import abc -from .. import util +from test.test_importlib import abc, util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/builtin/test_loader.py b/Lib/test/test_importlib/builtin/test_loader.py index f6b6d97cd5bce..7e9d1b1960fdd 100644 --- a/Lib/test/test_importlib/builtin/test_loader.py +++ b/Lib/test/test_importlib/builtin/test_loader.py @@ -1,5 +1,4 @@ -from .. import abc -from .. import util +from test.test_importlib import abc, util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/extension/test_case_sensitivity.py b/Lib/test/test_importlib/extension/test_case_sensitivity.py index 20bf035cb5f66..366e565cf4b7a 100644 --- a/Lib/test/test_importlib/extension/test_case_sensitivity.py +++ b/Lib/test/test_importlib/extension/test_case_sensitivity.py @@ -2,7 +2,7 @@ from test.support import os_helper import unittest import sys -from .. import util +from test.test_importlib import util importlib = util.import_importlib('importlib') machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/extension/test_finder.py b/Lib/test/test_importlib/extension/test_finder.py index e8065d7dadecf..140f20657f736 100644 --- a/Lib/test/test_importlib/extension/test_finder.py +++ b/Lib/test/test_importlib/extension/test_finder.py @@ -1,5 +1,4 @@ -from .. import abc -from .. import util +from test.test_importlib import abc, util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/extension/test_loader.py b/Lib/test/test_importlib/extension/test_loader.py index 8fd556dbed57a..e7a88a8f5e321 100644 --- a/Lib/test/test_importlib/extension/test_loader.py +++ b/Lib/test/test_importlib/extension/test_loader.py @@ -1,6 +1,5 @@ from warnings import catch_warnings -from .. import abc -from .. import util +from test.test_importlib import abc, util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/extension/test_path_hook.py b/Lib/test/test_importlib/extension/test_path_hook.py index a4b5a64aae2a7..a0adc70ad1ec4 100644 --- a/Lib/test/test_importlib/extension/test_path_hook.py +++ b/Lib/test/test_importlib/extension/test_path_hook.py @@ -1,4 +1,4 @@ -from .. import util +from test.test_importlib import util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/frozen/test_finder.py b/Lib/test/test_importlib/frozen/test_finder.py index 66080b2ade009..069755606b40a 100644 --- a/Lib/test/test_importlib/frozen/test_finder.py +++ b/Lib/test/test_importlib/frozen/test_finder.py @@ -1,5 +1,4 @@ -from .. import abc -from .. import util +from test.test_importlib import abc, util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/frozen/test_loader.py b/Lib/test/test_importlib/frozen/test_loader.py index f1ccb8a188aca..f2df7e60bf8e3 100644 --- a/Lib/test/test_importlib/frozen/test_loader.py +++ b/Lib/test/test_importlib/frozen/test_loader.py @@ -1,5 +1,4 @@ -from .. import abc -from .. 
import util +from test.test_importlib import abc, util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/import_/test___loader__.py b/Lib/test/test_importlib/import_/test___loader__.py index ecd83c6567e70..eaf665a6f5b5a 100644 --- a/Lib/test/test_importlib/import_/test___loader__.py +++ b/Lib/test/test_importlib/import_/test___loader__.py @@ -4,7 +4,7 @@ import unittest import warnings -from .. import util +from test.test_importlib import util class SpecLoaderMock: diff --git a/Lib/test/test_importlib/import_/test___package__.py b/Lib/test/test_importlib/import_/test___package__.py index 4a2b34e5f67f2..1ab5018a431de 100644 --- a/Lib/test/test_importlib/import_/test___package__.py +++ b/Lib/test/test_importlib/import_/test___package__.py @@ -6,7 +6,7 @@ """ import unittest import warnings -from .. import util +from test.test_importlib import util class Using__package__: diff --git a/Lib/test/test_importlib/import_/test_api.py b/Lib/test/test_importlib/import_/test_api.py index 35c26977ea315..0ee032b0206df 100644 --- a/Lib/test/test_importlib/import_/test_api.py +++ b/Lib/test/test_importlib/import_/test_api.py @@ -1,4 +1,4 @@ -from .. import util +from test.test_importlib import util from importlib import machinery import sys diff --git a/Lib/test/test_importlib/import_/test_caching.py b/Lib/test/test_importlib/import_/test_caching.py index 0f987b22100c9..3ca765fb4ada9 100644 --- a/Lib/test/test_importlib/import_/test_caching.py +++ b/Lib/test/test_importlib/import_/test_caching.py @@ -1,5 +1,5 @@ """Test that sys.modules is used properly by import.""" -from .. import util +from test.test_importlib import util import sys from types import MethodType import unittest diff --git a/Lib/test/test_importlib/import_/test_fromlist.py b/Lib/test/test_importlib/import_/test_fromlist.py index deb21710a61fa..4b4b9bc3f5e04 100644 --- a/Lib/test/test_importlib/import_/test_fromlist.py +++ b/Lib/test/test_importlib/import_/test_fromlist.py @@ -1,5 +1,5 @@ """Test that the semantics relating to the 'fromlist' argument are correct.""" -from .. import util +from test.test_importlib import util import warnings import unittest diff --git a/Lib/test/test_importlib/import_/test_meta_path.py b/Lib/test/test_importlib/import_/test_meta_path.py index 5730119fe9933..c8b898ec23785 100644 --- a/Lib/test/test_importlib/import_/test_meta_path.py +++ b/Lib/test/test_importlib/import_/test_meta_path.py @@ -1,4 +1,4 @@ -from .. import util +from test.test_importlib import util import importlib._bootstrap import sys from types import MethodType diff --git a/Lib/test/test_importlib/import_/test_packages.py b/Lib/test/test_importlib/import_/test_packages.py index c73ac63f6eef3..eb0831f7d6d54 100644 --- a/Lib/test/test_importlib/import_/test_packages.py +++ b/Lib/test/test_importlib/import_/test_packages.py @@ -1,4 +1,4 @@ -from .. import util +from test.test_importlib import util import sys import unittest from test import support diff --git a/Lib/test/test_importlib/import_/test_path.py b/Lib/test/test_importlib/import_/test_path.py index 57a25228fc043..6f1d0cabd28a6 100644 --- a/Lib/test/test_importlib/import_/test_path.py +++ b/Lib/test/test_importlib/import_/test_path.py @@ -1,4 +1,4 @@ -from .. 
import util +from test.test_importlib import util importlib = util.import_importlib('importlib') machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/import_/test_relative_imports.py b/Lib/test/test_importlib/import_/test_relative_imports.py index 41aa18269952f..99c24f1fd9487 100644 --- a/Lib/test/test_importlib/import_/test_relative_imports.py +++ b/Lib/test/test_importlib/import_/test_relative_imports.py @@ -1,5 +1,5 @@ """Test relative imports (PEP 328).""" -from .. import util +from test.test_importlib import util import unittest import warnings diff --git a/Lib/test/test_importlib/source/test_case_sensitivity.py b/Lib/test/test_importlib/source/test_case_sensitivity.py index 19543f4a6653a..9d472707abe84 100644 --- a/Lib/test/test_importlib/source/test_case_sensitivity.py +++ b/Lib/test/test_importlib/source/test_case_sensitivity.py @@ -1,7 +1,7 @@ """Test case-sensitivity (PEP 235).""" import sys -from .. import util +from test.test_importlib import util importlib = util.import_importlib('importlib') machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/source/test_file_loader.py b/Lib/test/test_importlib/source/test_file_loader.py index 1065ac55fce3f..378dcbe08a805 100644 --- a/Lib/test/test_importlib/source/test_file_loader.py +++ b/Lib/test/test_importlib/source/test_file_loader.py @@ -1,5 +1,4 @@ -from .. import abc -from .. import util +from test.test_importlib import abc, util importlib = util.import_importlib('importlib') importlib_abc = util.import_importlib('importlib.abc') diff --git a/Lib/test/test_importlib/source/test_finder.py b/Lib/test/test_importlib/source/test_finder.py index 80e930cc6a1f2..6a23e9d50f6ff 100644 --- a/Lib/test/test_importlib/source/test_finder.py +++ b/Lib/test/test_importlib/source/test_finder.py @@ -1,5 +1,4 @@ -from .. import abc -from .. import util +from test.test_importlib import abc, util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/source/test_path_hook.py b/Lib/test/test_importlib/source/test_path_hook.py index 795d436c3b954..ead62f5e945e2 100644 --- a/Lib/test/test_importlib/source/test_path_hook.py +++ b/Lib/test/test_importlib/source/test_path_hook.py @@ -1,4 +1,4 @@ -from .. import util +from test.test_importlib import util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/source/test_source_encoding.py b/Lib/test/test_importlib/source/test_source_encoding.py index c0b9b031262eb..c09c9aa12b862 100644 --- a/Lib/test/test_importlib/source/test_source_encoding.py +++ b/Lib/test/test_importlib/source/test_source_encoding.py @@ -1,4 +1,4 @@ -from .. import util +from test.test_importlib import util machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/test_abc.py b/Lib/test/test_importlib/test_abc.py index 45cbf90791708..92cb78067d0eb 100644 --- a/Lib/test/test_importlib/test_abc.py +++ b/Lib/test/test_importlib/test_abc.py @@ -9,7 +9,7 @@ from unittest import mock import warnings -from . import util as test_util +from test.test_importlib import util as test_util init = test_util.import_importlib('importlib') abc = test_util.import_importlib('importlib.abc') diff --git a/Lib/test/test_importlib/test_api.py b/Lib/test/test_importlib/test_api.py index 763b2add07307..1f8f7c00bda53 100644 --- a/Lib/test/test_importlib/test_api.py +++ b/Lib/test/test_importlib/test_api.py @@ -1,4 +1,4 @@ -from . 
import util as test_util +from test.test_importlib import util as test_util init = test_util.import_importlib('importlib') util = test_util.import_importlib('importlib.util') diff --git a/Lib/test/test_importlib/test_compatibilty_files.py b/Lib/test/test_importlib/test_compatibilty_files.py index 9a823f2d93058..18cbdee6ce475 100644 --- a/Lib/test/test_importlib/test_compatibilty_files.py +++ b/Lib/test/test_importlib/test_compatibilty_files.py @@ -8,7 +8,7 @@ wrap_spec, ) -from .resources import util +from test.test_importlib.resources import util class CompatibilityFilesTests(unittest.TestCase): @@ -100,3 +100,7 @@ def files(self): def test_spec_path_joinpath(self): self.assertIsInstance(self.files / 'a', CompatibilityFiles.OrphanPath) + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_importlib/test_contents.py b/Lib/test/test_importlib/test_contents.py index 3323bf5b5cf56..a5b6538a2fc79 100644 --- a/Lib/test/test_importlib/test_contents.py +++ b/Lib/test/test_importlib/test_contents.py @@ -1,8 +1,8 @@ import unittest from importlib import resources -from . import data01 -from .resources import util +from test.test_importlib import data01 +from test.test_importlib.resources import util class ContentsTests: @@ -38,6 +38,10 @@ class ContentsNamespaceTests(ContentsTests, unittest.TestCase): } def setUp(self): - from . import namespacedata01 + from test.test_importlib import namespacedata01 self.data = namespacedata01 + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_importlib/test_files.py b/Lib/test/test_importlib/test_files.py index b9170d83bea91..3f28b55509bc1 100644 --- a/Lib/test/test_importlib/test_files.py +++ b/Lib/test/test_importlib/test_files.py @@ -3,8 +3,8 @@ from importlib import resources from importlib.abc import Traversable -from . import data01 -from .resources import util +from test.test_importlib import data01 +from test.test_importlib.resources import util class FilesTests: @@ -37,7 +37,7 @@ class OpenZipTests(FilesTests, util.ZipSetup, unittest.TestCase): class OpenNamespaceTests(FilesTests, unittest.TestCase): def setUp(self): - from . import namespacedata01 + from test.test_importlib import namespacedata01 self.data = namespacedata01 diff --git a/Lib/test/test_importlib/test_lazy.py b/Lib/test/test_importlib/test_lazy.py index 28608e95d060f..cc993f333e355 100644 --- a/Lib/test/test_importlib/test_lazy.py +++ b/Lib/test/test_importlib/test_lazy.py @@ -5,7 +5,7 @@ import types import unittest -from . import util as test_util +from test.test_importlib import util as test_util class CollectInit: diff --git a/Lib/test/test_importlib/test_locks.py b/Lib/test/test_importlib/test_locks.py index 9290bac80a78a..584d013caacad 100644 --- a/Lib/test/test_importlib/test_locks.py +++ b/Lib/test/test_importlib/test_locks.py @@ -1,4 +1,4 @@ -from . import util as test_util +from test.test_importlib import util as test_util init = test_util.import_importlib('importlib') diff --git a/Lib/test/test_importlib/test_main.py b/Lib/test/test_importlib/test_main.py index 2e120f7ac50ac..77e3dd7e08472 100644 --- a/Lib/test/test_importlib/test_main.py +++ b/Lib/test/test_importlib/test_main.py @@ -9,9 +9,9 @@ try: import pyfakefs.fake_filesystem_unittest as ffs except ImportError: - from .stubs import fake_filesystem_unittest as ffs + from test.test_importlib.stubs import fake_filesystem_unittest as ffs -from . 
import fixtures +from test.test_importlib import fixtures from importlib.metadata import ( Distribution, EntryPoint, @@ -315,3 +315,7 @@ def test_packages_distributions_neither_toplevel_nor_files(self): prefix=self.site_dir, ) packages_distributions() + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_importlib/test_metadata_api.py b/Lib/test/test_importlib/test_metadata_api.py index e16773a7e87ef..24d46c3d28013 100644 --- a/Lib/test/test_importlib/test_metadata_api.py +++ b/Lib/test/test_importlib/test_metadata_api.py @@ -5,7 +5,7 @@ import importlib import contextlib -from . import fixtures +from test.test_importlib import fixtures from importlib.metadata import ( Distribution, PackageNotFoundError, @@ -313,3 +313,7 @@ class InvalidateCache(unittest.TestCase): def test_invalidate_cache(self): # No externally observable behavior, but ensures test coverage... importlib.invalidate_caches() + + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_importlib/test_open.py b/Lib/test/test_importlib/test_open.py index df75e343d2c5b..ab390269e08f2 100644 --- a/Lib/test/test_importlib/test_open.py +++ b/Lib/test/test_importlib/test_open.py @@ -1,8 +1,8 @@ import unittest from importlib import resources -from . import data01 -from .resources import util +from test.test_importlib import data01 +from test.test_importlib.resources import util class CommonBinaryTests(util.CommonTests, unittest.TestCase): @@ -68,7 +68,7 @@ def setUp(self): class OpenDiskNamespaceTests(OpenTests, unittest.TestCase): def setUp(self): - from . import namespacedata01 + from test.test_importlib import namespacedata01 self.data = namespacedata01 diff --git a/Lib/test/test_importlib/test_path.py b/Lib/test/test_importlib/test_path.py index 6fc41f301d1ca..66dc0b215ad9f 100644 --- a/Lib/test/test_importlib/test_path.py +++ b/Lib/test/test_importlib/test_path.py @@ -2,8 +2,8 @@ import unittest from importlib import resources -from . import data01 -from .resources import util +from test.test_importlib import data01 +from test.test_importlib.resources import util class CommonTests(util.CommonTests, unittest.TestCase): diff --git a/Lib/test/test_importlib/test_read.py b/Lib/test/test_importlib/test_read.py index ebd72267776d9..7e907e4c8c59c 100644 --- a/Lib/test/test_importlib/test_read.py +++ b/Lib/test/test_importlib/test_read.py @@ -1,8 +1,8 @@ import unittest from importlib import import_module, resources -from . import data01 -from .resources import util +from test.test_importlib import data01 +from test.test_importlib.resources import util class CommonBinaryTests(util.CommonTests, unittest.TestCase): @@ -66,7 +66,7 @@ def test_read_submodule_resource_by_name(self): class ReadNamespaceTests(ReadTests, unittest.TestCase): def setUp(self): - from . import namespacedata01 + from test.test_importlib import namespacedata01 self.data = namespacedata01 diff --git a/Lib/test/test_importlib/test_resource.py b/Lib/test/test_importlib/test_resource.py index 834b8bd8a2818..825d1b0eb054e 100644 --- a/Lib/test/test_importlib/test_resource.py +++ b/Lib/test/test_importlib/test_resource.py @@ -3,9 +3,8 @@ import uuid import pathlib -from . import data01 -from . 
import zipdata01, zipdata02 -from .resources import util +from test.test_importlib import data01, zipdata01, zipdata02 +from test.test_importlib.resources import util from importlib import resources, import_module from test.support import import_helper from test.support.os_helper import unlink diff --git a/Lib/test/test_importlib/test_spec.py b/Lib/test/test_importlib/test_spec.py index dcb0527e33cfe..21e2c02094f22 100644 --- a/Lib/test/test_importlib/test_spec.py +++ b/Lib/test/test_importlib/test_spec.py @@ -1,4 +1,4 @@ -from . import util as test_util +from test.test_importlib import util as test_util init = test_util.import_importlib('importlib') machinery = test_util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/test_util.py b/Lib/test/test_importlib/test_util.py index 104452267c067..c77c7814a9ccd 100644 --- a/Lib/test/test_importlib/test_util.py +++ b/Lib/test/test_importlib/test_util.py @@ -1,4 +1,5 @@ -?from . import util +from test.test_importlib import util + abc = util.import_importlib('importlib.abc') init = util.import_importlib('importlib') machinery = util.import_importlib('importlib.machinery') diff --git a/Lib/test/test_importlib/test_windows.py b/Lib/test/test_importlib/test_windows.py index 6f09c5a7a5a46..b3e8e7e6d63fc 100644 --- a/Lib/test/test_importlib/test_windows.py +++ b/Lib/test/test_importlib/test_windows.py @@ -1,4 +1,4 @@ -from . import util as test_util +from test.test_importlib import util as test_util machinery = test_util.import_importlib('importlib.machinery') import os @@ -6,10 +6,9 @@ import sys import unittest import warnings -from test import support from test.support import import_helper from contextlib import contextmanager -from .util import temp_module +from test.test_importlib.util import temp_module import_helper.import_module('winreg', required_on=['win']) from winreg import ( @@ -178,3 +177,6 @@ def test_path_join(self): self.check_join("C:", "C:", "") self.check_join("//Server/Share\\", "//Server/Share/", "") self.check_join("//Server/Share\\", "//Server/Share", "") + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/test/test_importlib/test_zip.py b/Lib/test/test_importlib/test_zip.py index 276f6288c9159..a9f5c68ac60d7 100644 --- a/Lib/test/test_importlib/test_zip.py +++ b/Lib/test/test_importlib/test_zip.py @@ -1,7 +1,7 @@ import sys import unittest -from . 
import fixtures +from test.test_importlib import fixtures from importlib.metadata import ( PackageNotFoundError, distribution, @@ -60,3 +60,6 @@ def test_files(self): def test_normalized_name(self): dist = distribution('example') assert dist._normalized_name == 'example' + +if __name__ == '__main__': + unittest.main() From webhook-mailer at python.org Sat Jan 22 11:05:48 2022 From: webhook-mailer at python.org (asvetlov) Date: Sat, 22 Jan 2022 16:05:48 -0000 Subject: [Python-checkins] bpo-46425: fix direct invocation of `test_fileutils` and `test_zoneinfo` (GH-30792) Message-ID: https://github.com/python/cpython/commit/1f8014c5b4ea7acee069ca453f6fbcad5990ebf0 commit: 1f8014c5b4ea7acee069ca453f6fbcad5990ebf0 branch: main author: Nikita Sobolev committer: asvetlov date: 2022-01-22T18:05:43+02:00 summary: bpo-46425: fix direct invocation of `test_fileutils` and `test_zoneinfo` (GH-30792) files: M Lib/test/test_fileutils.py M Lib/test/test_zoneinfo/test_zoneinfo.py diff --git a/Lib/test/test_fileutils.py b/Lib/test/test_fileutils.py index 45b3f3233c617..ff13498fbfeb5 100644 --- a/Lib/test/test_fileutils.py +++ b/Lib/test/test_fileutils.py @@ -15,7 +15,7 @@ def test_capi_normalize_path(self): if os.name == 'nt': raise unittest.SkipTest('Windows has its own helper for this') else: - from .test_posixpath import PosixPathTest as posixdata + from test.test_posixpath import PosixPathTest as posixdata tests = posixdata.NORMPATH_CASES for filename, expected in tests: if not os.path.isabs(filename): diff --git a/Lib/test/test_zoneinfo/test_zoneinfo.py b/Lib/test/test_zoneinfo/test_zoneinfo.py index 59b35ef63f987..a2172f3ac21d0 100644 --- a/Lib/test/test_zoneinfo/test_zoneinfo.py +++ b/Lib/test/test_zoneinfo/test_zoneinfo.py @@ -17,8 +17,8 @@ from datetime import date, datetime, time, timedelta, timezone from functools import cached_property -from . 
import _support as test_support -from ._support import OS_ENV_LOCK, TZPATH_TEST_LOCK, ZoneInfoTestBase +from test.test_zoneinfo import _support as test_support +from test.test_zoneinfo._support import OS_ENV_LOCK, TZPATH_TEST_LOCK, ZoneInfoTestBase from test.support.import_helper import import_module lzma = import_module('lzma') @@ -2107,3 +2107,7 @@ def _Pacific_Kiritimati(): _ZONEDUMP_DATA = None _FIXED_OFFSET_ZONES = None + + +if __name__ == '__main__': + unittest.main() From webhook-mailer at python.org Sat Jan 22 12:28:57 2022 From: webhook-mailer at python.org (gvanrossum) Date: Sat, 22 Jan 2022 17:28:57 -0000 Subject: [Python-checkins] bpo-43118: Fix bug in inspect.signature around 'base.__text_signature__' (GH-30285) (#30765) Message-ID: https://github.com/python/cpython/commit/83aef4d34022f293336f606dba8598cc7ac8f9f2 commit: 83aef4d34022f293336f606dba8598cc7ac8f9f2 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: gvanrossum date: 2022-01-22T09:28:48-08:00 summary: bpo-43118: Fix bug in inspect.signature around 'base.__text_signature__' (GH-30285) (#30765) (cherry picked from commit 881a763cfe07ef4a5806ec78f13a9bc99e8909dc) Co-authored-by: Weipeng Hong Co-authored-by: Weipeng Hong files: A Lib/test/ann_module7.py A Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst M Lib/inspect.py M Lib/test/test_inspect.py diff --git a/Lib/inspect.py b/Lib/inspect.py index 6d43d8dad46b99..c5881cc808d21a 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -2511,9 +2511,9 @@ def _signature_from_callable(obj, *, pass else: if text_sig: - # If 'obj' class has a __text_signature__ attribute: + # If 'base' class has a __text_signature__ attribute: # return a signature based on it - return _signature_fromstr(sigcls, obj, text_sig) + return _signature_fromstr(sigcls, base, text_sig) # No '__text_signature__' was found for the 'obj' class. # Last option is to check if its '__init__' is diff --git a/Lib/test/ann_module7.py b/Lib/test/ann_module7.py new file mode 100644 index 00000000000000..8f890cd28025be --- /dev/null +++ b/Lib/test/ann_module7.py @@ -0,0 +1,11 @@ +# Tests class have ``__text_signature__`` + +from __future__ import annotations + +DEFAULT_BUFFER_SIZE = 8192 + +class BufferedReader(object): + """BufferedReader(raw, buffer_size=DEFAULT_BUFFER_SIZE)\n--\n\n + Create a new buffered reader using the given readable raw IO object. 
+ """ + pass diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index 545dab5c6348f7..28e4f5b4a718a4 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -4218,6 +4218,17 @@ def func(*args, **kwargs): sig = inspect.signature(func) self.assertEqual(str(sig), '(self, a, b=1, /, *args, c, d=2, **kwargs)') + def test_base_class_have_text_signature(self): + # see issue 43118 + from test.ann_module7 import BufferedReader + class MyBufferedReader(BufferedReader): + """buffer reader class.""" + + text_signature = BufferedReader.__text_signature__ + self.assertEqual(text_signature, '(raw, buffer_size=DEFAULT_BUFFER_SIZE)') + sig = inspect.signature(MyBufferedReader) + self.assertEqual(str(sig), '(raw, buffer_size=8192)') + class NTimesUnwrappable: def __init__(self, n): diff --git a/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst b/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst new file mode 100644 index 00000000000000..a37c22cd78c098 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-29-14-42-09.bpo-43118.BoVi_5.rst @@ -0,0 +1,3 @@ +Fix a bug in :func:`inspect.signature` that was causing it to fail on some +subclasses of classes with a ``__text_signature__`` referencing module +globals. Patch by Weipeng Hong. From webhook-mailer at python.org Sat Jan 22 12:55:56 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 17:55:56 -0000 Subject: [Python-checkins] bpo-46417: Clear more static types (GH-30796) Message-ID: https://github.com/python/cpython/commit/500c146387b01ea797b52e6a54caf228384e184c commit: 500c146387b01ea797b52e6a54caf228384e184c branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T18:55:48+01:00 summary: bpo-46417: Clear more static types (GH-30796) * Move PyContext static types into object.c static_types list. * Rename PyContextTokenMissing_Type to _PyContextTokenMissing_Type and declare it in pycore_context.h. * _PyHamtItems types are no long exported: replace PyAPI_DATA() with extern. 
files: M Include/internal/pycore_context.h M Include/internal/pycore_hamt.h M Objects/object.c M Python/bltinmodule.c M Python/context.c M Python/hamt.c M Python/pylifecycle.c diff --git a/Include/internal/pycore_context.h b/Include/internal/pycore_context.h index 31ca0a43fae29..1bf4e8f3ee532 100644 --- a/Include/internal/pycore_context.h +++ b/Include/internal/pycore_context.h @@ -8,9 +8,11 @@ #include "pycore_hamt.h" /* PyHamtObject */ +extern PyTypeObject _PyContextTokenMissing_Type; + /* runtime lifecycle */ -PyStatus _PyContext_InitTypes(PyInterpreterState *); +PyStatus _PyContext_Init(PyInterpreterState *); void _PyContext_Fini(PyInterpreterState *); diff --git a/Include/internal/pycore_hamt.h b/Include/internal/pycore_hamt.h index cf9c19e022d8a..85e35c5afc90c 100644 --- a/Include/internal/pycore_hamt.h +++ b/Include/internal/pycore_hamt.h @@ -8,9 +8,16 @@ #define _Py_HAMT_MAX_TREE_DEPTH 7 +extern PyTypeObject _PyHamt_Type; +extern PyTypeObject _PyHamt_ArrayNode_Type; +extern PyTypeObject _PyHamt_BitmapNode_Type; +extern PyTypeObject _PyHamt_CollisionNode_Type; +extern PyTypeObject _PyHamtKeys_Type; +extern PyTypeObject _PyHamtValues_Type; +extern PyTypeObject _PyHamtItems_Type; + /* runtime lifecycle */ -PyStatus _PyHamt_InitTypes(PyInterpreterState *); void _PyHamt_Fini(PyInterpreterState *); @@ -69,15 +76,6 @@ typedef struct { } PyHamtIterator; -PyAPI_DATA(PyTypeObject) _PyHamt_Type; -PyAPI_DATA(PyTypeObject) _PyHamt_ArrayNode_Type; -PyAPI_DATA(PyTypeObject) _PyHamt_BitmapNode_Type; -PyAPI_DATA(PyTypeObject) _PyHamt_CollisionNode_Type; -PyAPI_DATA(PyTypeObject) _PyHamtKeys_Type; -PyAPI_DATA(PyTypeObject) _PyHamtValues_Type; -PyAPI_DATA(PyTypeObject) _PyHamtItems_Type; - - /* Create a new HAMT immutable mapping. */ PyHamtObject * _PyHamt_New(void); diff --git a/Objects/object.c b/Objects/object.c index a5ee8eef4a3b4..a1663c0dbb7b7 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -4,6 +4,7 @@ #include "Python.h" #include "pycore_call.h" // _PyObject_CallNoArgs() #include "pycore_ceval.h" // _Py_EnterRecursiveCall() +#include "pycore_context.h" // _PyContextTokenMissing_Type #include "pycore_dict.h" // _PyObject_MakeDictFromInstanceAttributes() #include "pycore_floatobject.h" // _PyFloat_DebugMallocStats() #include "pycore_initconfig.h" // _PyStatus_EXCEPTION() @@ -1853,6 +1854,9 @@ static PyTypeObject* static_types[] = { &PyClassMethod_Type, &PyCode_Type, &PyComplex_Type, + &PyContextToken_Type, + &PyContextVar_Type, + &PyContext_Type, &PyCoro_Type, &PyDictItems_Type, &PyDictIterItem_Type, @@ -1867,6 +1871,7 @@ static PyTypeObject* static_types[] = { &PyDict_Type, &PyEllipsis_Type, &PyEnum_Type, + &PyFilter_Type, &PyFloat_Type, &PyFrame_Type, &PyFrozenSet_Type, @@ -1879,6 +1884,7 @@ static PyTypeObject* static_types[] = { &PyList_Type, &PyLongRangeIter_Type, &PyLong_Type, + &PyMap_Type, &PyMemberDescr_Type, &PyMemoryView_Type, &PyMethodDescr_Type, @@ -1905,12 +1911,21 @@ static PyTypeObject* static_types[] = { &PyUnicodeIter_Type, &PyUnicode_Type, &PyWrapperDescr_Type, + &PyZip_Type, &Py_GenericAliasType, &_PyAnextAwaitable_Type, &_PyAsyncGenASend_Type, &_PyAsyncGenAThrow_Type, &_PyAsyncGenWrappedValue_Type, + &_PyContextTokenMissing_Type, &_PyCoroWrapper_Type, + &_PyHamtItems_Type, + &_PyHamtKeys_Type, + &_PyHamtValues_Type, + &_PyHamt_ArrayNode_Type, + &_PyHamt_BitmapNode_Type, + &_PyHamt_CollisionNode_Type, + &_PyHamt_Type, &_PyInterpreterID_Type, &_PyManagedBuffer_Type, &_PyMethodWrapper_Type, diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index 
ecd8be1af6f2d..ed612091d8394 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -2986,11 +2986,6 @@ _PyBuiltin_Init(PyInterpreterState *interp) const PyConfig *config = _PyInterpreterState_GetConfig(interp); - if (PyType_Ready(&PyFilter_Type) < 0 || - PyType_Ready(&PyMap_Type) < 0 || - PyType_Ready(&PyZip_Type) < 0) - return NULL; - mod = _PyModule_CreateInitialized(&builtinsmodule, PYTHON_API_VERSION); if (mod == NULL) return NULL; diff --git a/Python/context.c b/Python/context.c index 9ed73b7444d44..f3033d9b649af 100644 --- a/Python/context.c +++ b/Python/context.c @@ -1260,7 +1260,7 @@ context_token_missing_tp_repr(PyObject *self) } -PyTypeObject PyContextTokenMissing_Type = { +PyTypeObject _PyContextTokenMissing_Type = { PyVarObject_HEAD_INIT(&PyType_Type, 0) "Token.MISSING", sizeof(PyContextTokenMissing), @@ -1279,7 +1279,7 @@ get_token_missing(void) } _token_missing = (PyObject *)PyObject_New( - PyContextTokenMissing, &PyContextTokenMissing_Type); + PyContextTokenMissing, &_PyContextTokenMissing_Type); if (_token_missing == NULL) { return NULL; } @@ -1323,25 +1323,12 @@ _PyContext_Fini(PyInterpreterState *interp) PyStatus -_PyContext_InitTypes(PyInterpreterState *interp) +_PyContext_Init(PyInterpreterState *interp) { if (!_Py_IsMainInterpreter(interp)) { return _PyStatus_OK(); } - PyStatus status = _PyHamt_InitTypes(interp); - if (_PyStatus_EXCEPTION(status)) { - return status; - } - - if ((PyType_Ready(&PyContext_Type) < 0) || - (PyType_Ready(&PyContextVar_Type) < 0) || - (PyType_Ready(&PyContextToken_Type) < 0) || - (PyType_Ready(&PyContextTokenMissing_Type) < 0)) - { - return _PyStatus_ERR("can't init context types"); - } - PyObject *missing = get_token_missing(); if (PyDict_SetItemString( PyContextToken_Type.tp_dict, "MISSING", missing)) diff --git a/Python/hamt.c b/Python/hamt.c index 8c8e025a3eff3..cbfe4459d3ed0 100644 --- a/Python/hamt.c +++ b/Python/hamt.c @@ -2953,27 +2953,6 @@ PyTypeObject _PyHamt_CollisionNode_Type = { }; -PyStatus -_PyHamt_InitTypes(PyInterpreterState *interp) -{ - if (!_Py_IsMainInterpreter(interp)) { - return _PyStatus_OK(); - } - - if ((PyType_Ready(&_PyHamt_Type) < 0) || - (PyType_Ready(&_PyHamt_ArrayNode_Type) < 0) || - (PyType_Ready(&_PyHamt_BitmapNode_Type) < 0) || - (PyType_Ready(&_PyHamt_CollisionNode_Type) < 0) || - (PyType_Ready(&_PyHamtKeys_Type) < 0) || - (PyType_Ready(&_PyHamtValues_Type) < 0) || - (PyType_Ready(&_PyHamtItems_Type) < 0)) - { - return _PyStatus_ERR("can't init hamt types"); - } - - return _PyStatus_OK(); -} - void _PyHamt_Fini(PyInterpreterState *interp) { diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 662e578818349..a53f532e9e202 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -760,7 +760,7 @@ pycore_init_types(PyInterpreterState *interp) return status; } - status = _PyContext_InitTypes(interp); + status = _PyContext_Init(interp); if (_PyStatus_EXCEPTION(status)) { return status; } From webhook-mailer at python.org Sat Jan 22 12:56:15 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 17:56:15 -0000 Subject: [Python-checkins] bpo-46417: Cleanup typeobject.c code (GH-30795) Message-ID: https://github.com/python/cpython/commit/3a4c15bb9815b6f4652621fe6043ae18e0d202b3 commit: 3a4c15bb9815b6f4652621fe6043ae18e0d202b3 branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T18:56:11+01:00 summary: bpo-46417: Cleanup typeobject.c code (GH-30795) * Add comment to recurse_down_subclasses() explaining why it's safe to use a borrowed reference to 
tp_subclasses. * remove_all_subclasses() no longer accept NULL cases * type_set_bases() now relies on the fact that new_bases is not NULL. * type_dealloc_common() avoids PyErr_Fetch/PyErr_Restore if tp_bases is NULL. * remove_all_subclasses() makes sure that no exception is raised. * Don't test at runtime if tp_mro only contains types: rely on _PyType_CAST() assertion for that. * _PyStaticType_Dealloc() no longer clears tp_subclasses which is already NULL. * mro_hierarchy() avoids calling _PyType_GetSubclasses() if tp_subclasses is NULL. Coding style: * Use Py_NewRef(). * Add braces and move variable declarations to the first variable assignement. * Rename a few variables and parameters to use better names. files: M Objects/typeobject.c diff --git a/Objects/typeobject.c b/Objects/typeobject.c index b3c305e0bf430..bf62b5389257f 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -328,24 +328,26 @@ PyType_Modified(PyTypeObject *type) We don't assign new version tags eagerly, but only as needed. */ - PyObject *raw, *ref; - Py_ssize_t i; - - if (!_PyType_HasFeature(type, Py_TPFLAGS_VALID_VERSION_TAG)) + if (!_PyType_HasFeature(type, Py_TPFLAGS_VALID_VERSION_TAG)) { return; + } + + PyObject *subclasses = type->tp_subclasses; + if (subclasses != NULL) { + assert(PyDict_CheckExact(subclasses)); - raw = type->tp_subclasses; - if (raw != NULL) { - assert(PyDict_CheckExact(raw)); - i = 0; - while (PyDict_Next(raw, &i, NULL, &ref)) { + Py_ssize_t i = 0; + PyObject *ref; + while (PyDict_Next(subclasses, &i, NULL, &ref)) { assert(PyWeakref_CheckRef(ref)); - ref = PyWeakref_GET_OBJECT(ref); - if (ref != Py_None) { - PyType_Modified(_PyType_CAST(ref)); + PyObject *obj = PyWeakref_GET_OBJECT(ref); + if (obj == Py_None) { + continue; } + PyType_Modified(_PyType_CAST(obj)); } } + type->tp_flags &= ~Py_TPFLAGS_VALID_VERSION_TAG; type->tp_version_tag = 0; /* 0 is not a valid version tag */ } @@ -409,13 +411,12 @@ assign_version_tag(struct type_cache *cache, PyTypeObject *type) must first be done on all super classes. Return 0 if this cannot be done, 1 if Py_TPFLAGS_VALID_VERSION_TAG. 
*/ - Py_ssize_t i, n; - PyObject *bases; - - if (_PyType_HasFeature(type, Py_TPFLAGS_VALID_VERSION_TAG)) + if (_PyType_HasFeature(type, Py_TPFLAGS_VALID_VERSION_TAG)) { return 1; - if (!_PyType_HasFeature(type, Py_TPFLAGS_READY)) + } + if (!_PyType_HasFeature(type, Py_TPFLAGS_READY)) { return 0; + } if (next_version_tag == 0) { /* We have run out of version numbers */ @@ -424,9 +425,9 @@ assign_version_tag(struct type_cache *cache, PyTypeObject *type) type->tp_version_tag = next_version_tag++; assert (type->tp_version_tag != 0); - bases = type->tp_bases; - n = PyTuple_GET_SIZE(bases); - for (i = 0; i < n; i++) { + PyObject *bases = type->tp_bases; + Py_ssize_t n = PyTuple_GET_SIZE(bases); + for (Py_ssize_t i = 0; i < n; i++) { PyObject *b = PyTuple_GET_ITEM(bases, i); if (!assign_version_tag(cache, _PyType_CAST(b))) return 0; @@ -679,7 +680,7 @@ static void remove_all_subclasses(PyTypeObject *type, PyObject *bases); static void update_all_slots(PyTypeObject *); typedef int (*update_callback)(PyTypeObject *, void *); -static int update_subclasses(PyTypeObject *type, PyObject *name, +static int update_subclasses(PyTypeObject *type, PyObject *attr_name, update_callback callback, void *data); static int recurse_down_subclasses(PyTypeObject *type, PyObject *name, update_callback callback, void *data); @@ -718,30 +719,33 @@ mro_hierarchy(PyTypeObject *type, PyObject *temp) } Py_XDECREF(old_mro); - /* Obtain a copy of subclasses list to iterate over. + // Avoid creating an empty list if there is no subclass + if (type->tp_subclasses != NULL) { + /* Obtain a copy of subclasses list to iterate over. - Otherwise type->tp_subclasses might be altered - in the middle of the loop, for example, through a custom mro(), - by invoking type_set_bases on some subclass of the type - which in turn calls remove_subclass/add_subclass on this type. + Otherwise type->tp_subclasses might be altered + in the middle of the loop, for example, through a custom mro(), + by invoking type_set_bases on some subclass of the type + which in turn calls remove_subclass/add_subclass on this type. - Finally, this makes things simple avoiding the need to deal - with dictionary iterators and weak references. - */ - PyObject *subclasses = _PyType_GetSubclasses(type); - if (subclasses == NULL) { - return -1; - } + Finally, this makes things simple avoiding the need to deal + with dictionary iterators and weak references. 
+ */ + PyObject *subclasses = _PyType_GetSubclasses(type); + if (subclasses == NULL) { + return -1; + } - Py_ssize_t n = PyList_GET_SIZE(subclasses); - for (Py_ssize_t i = 0; i < n; i++) { - PyTypeObject *subclass = _PyType_CAST(PyList_GET_ITEM(subclasses, i)); - res = mro_hierarchy(subclass, temp); - if (res < 0) { - break; + Py_ssize_t n = PyList_GET_SIZE(subclasses); + for (Py_ssize_t i = 0; i < n; i++) { + PyTypeObject *subclass = _PyType_CAST(PyList_GET_ITEM(subclasses, i)); + res = mro_hierarchy(subclass, temp); + if (res < 0) { + break; + } } + Py_DECREF(subclasses); } - Py_DECREF(subclasses); return res; } @@ -749,14 +753,12 @@ mro_hierarchy(PyTypeObject *type, PyObject *temp) static int type_set_bases(PyTypeObject *type, PyObject *new_bases, void *context) { - int res = 0; - PyObject *temp; - PyObject *old_bases; - PyTypeObject *new_base, *old_base; - Py_ssize_t i; - - if (!check_set_special_type_attr(type, new_bases, "__bases__")) + // Check arguments + if (!check_set_special_type_attr(type, new_bases, "__bases__")) { return -1; + } + assert(new_bases != NULL); + if (!PyTuple_Check(new_bases)) { PyErr_Format(PyExc_TypeError, "can only assign tuple to %s.__bases__, not %s", @@ -769,7 +771,8 @@ type_set_bases(PyTypeObject *type, PyObject *new_bases, void *context) type->tp_name); return -1; } - for (i = 0; i < PyTuple_GET_SIZE(new_bases); i++) { + Py_ssize_t n = PyTuple_GET_SIZE(new_bases); + for (Py_ssize_t i = 0; i < n; i++) { PyObject *ob = PyTuple_GET_ITEM(new_bases, i); if (!PyType_Check(ob)) { PyErr_Format(PyExc_TypeError, @@ -789,39 +792,42 @@ type_set_bases(PyTypeObject *type, PyObject *new_bases, void *context) below), which in turn may cause an inheritance cycle through tp_base chain. And this is definitely not what you want to ever happen. */ - (base->tp_mro != NULL && type_is_subtype_base_chain(base, type))) { - + (base->tp_mro != NULL && type_is_subtype_base_chain(base, type))) + { PyErr_SetString(PyExc_TypeError, "a __bases__ item causes an inheritance cycle"); return -1; } } - new_base = best_base(new_bases); + // Compute the new MRO and the new base class + PyTypeObject *new_base = best_base(new_bases); if (new_base == NULL) return -1; - if (!compatible_for_assignment(type->tp_base, new_base, "__bases__")) + if (!compatible_for_assignment(type->tp_base, new_base, "__bases__")) { return -1; + } - Py_INCREF(new_bases); - Py_INCREF(new_base); - - old_bases = type->tp_bases; - old_base = type->tp_base; + PyObject *old_bases = type->tp_bases; + assert(old_bases != NULL); + PyTypeObject *old_base = type->tp_base; - type->tp_bases = new_bases; - type->tp_base = new_base; + type->tp_bases = Py_NewRef(new_bases); + type->tp_base = (PyTypeObject *)Py_NewRef(new_base); - temp = PyList_New(0); - if (temp == NULL) + PyObject *temp = PyList_New(0); + if (temp == NULL) { goto bail; - if (mro_hierarchy(type, temp) < 0) + } + if (mro_hierarchy(type, temp) < 0) { goto undo; + } Py_DECREF(temp); /* Take no action in case if type->tp_bases has been replaced through reentrance. */ + int res; if (type->tp_bases == new_bases) { /* any base that was in __bases__ but now isn't, we need to remove |type| from its tp_subclasses. 
@@ -834,6 +840,9 @@ type_set_bases(PyTypeObject *type, PyObject *new_bases, void *context) res = add_all_subclasses(type, new_bases); update_all_slots(type); } + else { + res = 0; + } Py_DECREF(old_bases); Py_DECREF(old_base); @@ -842,7 +851,8 @@ type_set_bases(PyTypeObject *type, PyObject *new_bases, void *context) return res; undo: - for (i = PyList_GET_SIZE(temp) - 1; i >= 0; i--) { + n = PyList_GET_SIZE(temp); + for (Py_ssize_t i = n - 1; i >= 0; i--) { PyTypeObject *cls; PyObject *new_mro, *old_mro = NULL; @@ -1413,8 +1423,9 @@ subtype_dealloc(PyObject *self) and if self is tracked at that point, it will look like trash to GC and GC will try to delete self again. */ - if (type->tp_weaklistoffset && !base->tp_weaklistoffset) + if (type->tp_weaklistoffset && !base->tp_weaklistoffset) { PyObject_ClearWeakRefs(self); + } if (type->tp_del) { _PyObject_GC_TRACK(self); @@ -1929,20 +1940,14 @@ pmerge(PyObject *acc, PyObject **to_merge, Py_ssize_t to_merge_size) static PyObject * mro_implementation(PyTypeObject *type) { - PyObject *result; - PyObject *bases; - PyObject **to_merge; - Py_ssize_t i, n; - if (!_PyType_IsReady(type)) { if (PyType_Ready(type) < 0) return NULL; } - bases = type->tp_bases; - assert(PyTuple_Check(bases)); - n = PyTuple_GET_SIZE(bases); - for (i = 0; i < n; i++) { + PyObject *bases = type->tp_bases; + Py_ssize_t n = PyTuple_GET_SIZE(bases); + for (Py_ssize_t i = 0; i < n; i++) { PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(bases, i)); if (base->tp_mro == NULL) { PyErr_Format(PyExc_TypeError, @@ -1959,13 +1964,14 @@ mro_implementation(PyTypeObject *type) */ PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(bases, 0)); Py_ssize_t k = PyTuple_GET_SIZE(base->tp_mro); - result = PyTuple_New(k + 1); + PyObject *result = PyTuple_New(k + 1); if (result == NULL) { return NULL; } + Py_INCREF(type); PyTuple_SET_ITEM(result, 0, (PyObject *) type); - for (i = 0; i < k; i++) { + for (Py_ssize_t i = 0; i < k; i++) { PyObject *cls = PyTuple_GET_ITEM(base->tp_mro, i); Py_INCREF(cls); PyTuple_SET_ITEM(result, i + 1, cls); @@ -1986,20 +1992,19 @@ mro_implementation(PyTypeObject *type) linearization implied by a base class. The last element of to_merge is the declared tuple of bases. 
*/ - - to_merge = PyMem_New(PyObject *, n + 1); + PyObject **to_merge = PyMem_New(PyObject *, n + 1); if (to_merge == NULL) { PyErr_NoMemory(); return NULL; } - for (i = 0; i < n; i++) { + for (Py_ssize_t i = 0; i < n; i++) { PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(bases, i)); to_merge[i] = base->tp_mro; } to_merge[n] = bases; - result = PyList_New(1); + PyObject *result = PyList_New(1); if (result == NULL) { PyMem_Free(to_merge); return NULL; @@ -2010,8 +2015,8 @@ mro_implementation(PyTypeObject *type) if (pmerge(result, to_merge, n + 1) < 0) { Py_CLEAR(result); } - PyMem_Free(to_merge); + return result; } @@ -2651,20 +2656,20 @@ type_new_slots_bases(type_new_ctx *ctx) (ctx->may_add_weak && ctx->add_weak == 0))) { for (Py_ssize_t i = 0; i < nbases; i++) { - PyObject *base = PyTuple_GET_ITEM(ctx->bases, i); - if (base == (PyObject *)ctx->base) { + PyObject *obj = PyTuple_GET_ITEM(ctx->bases, i); + if (obj == (PyObject *)ctx->base) { /* Skip primary base */ continue; } - PyTypeObject *type = _PyType_CAST(base); + PyTypeObject *base = _PyType_CAST(obj); if (ctx->may_add_dict && ctx->add_dict == 0 && - type->tp_dictoffset != 0) + base->tp_dictoffset != 0) { ctx->add_dict++; } if (ctx->may_add_weak && ctx->add_weak == 0 && - type->tp_weaklistoffset != 0) + base->tp_weaklistoffset != 0) { ctx->add_weak++; } @@ -3739,8 +3744,8 @@ _PyType_GetModuleByDef(PyTypeObject *type, struct PyModuleDef *def) // to check i < PyTuple_GET_SIZE(mro) at the first loop iteration. assert(PyTuple_GET_SIZE(mro) >= 1); - Py_ssize_t i = 0; - do { + Py_ssize_t n = PyTuple_GET_SIZE(mro); + for (Py_ssize_t i = 0; i < n; i++) { PyObject *super = PyTuple_GET_ITEM(mro, i); // _PyType_GetModuleByDef() must only be called on a heap type created // by PyType_FromModuleAndSpec() or on its subclasses. @@ -3753,8 +3758,7 @@ _PyType_GetModuleByDef(PyTypeObject *type, struct PyModuleDef *def) if (module && _PyModule_GetDef(module) == def) { return module; } - i++; - } while (i < PyTuple_GET_SIZE(mro)); + } PyErr_Format( PyExc_TypeError, @@ -3770,10 +3774,7 @@ _PyType_GetModuleByDef(PyTypeObject *type, struct PyModuleDef *def) static PyObject * find_name_in_mro(PyTypeObject *type, PyObject *name, int *error) { - Py_ssize_t i, n; - PyObject *mro, *res, *base, *dict; Py_hash_t hash; - if (!PyUnicode_CheckExact(name) || (hash = ((PyASCIIObject *) name)->hash) == -1) { @@ -3785,8 +3786,7 @@ find_name_in_mro(PyTypeObject *type, PyObject *name, int *error) } /* Look in tp_dict of types in MRO */ - mro = type->tp_mro; - + PyObject *mro = type->tp_mro; if (mro == NULL) { if ((type->tp_flags & Py_TPFLAGS_READYING) == 0) { if (PyType_Ready(type) < 0) { @@ -3801,20 +3801,19 @@ find_name_in_mro(PyTypeObject *type, PyObject *name, int *error) } } - res = NULL; + PyObject *res = NULL; /* Keep a strong reference to mro because type->tp_mro can be replaced during dict lookup, e.g. when comparing to non-string keys. 
*/ Py_INCREF(mro); - assert(PyTuple_Check(mro)); - n = PyTuple_GET_SIZE(mro); - for (i = 0; i < n; i++) { - base = PyTuple_GET_ITEM(mro, i); - assert(PyType_Check(base)); - dict = _PyType_CAST(base)->tp_dict; + Py_ssize_t n = PyTuple_GET_SIZE(mro); + for (Py_ssize_t i = 0; i < n; i++) { + PyObject *base = PyTuple_GET_ITEM(mro, i); + PyObject *dict = _PyType_CAST(base)->tp_dict; assert(dict && PyDict_Check(dict)); res = _PyDict_GetItem_KnownHash(dict, name, hash); - if (res != NULL) + if (res != NULL) { break; + } if (PyErr_Occurred()) { *error = -1; goto done; @@ -4066,10 +4065,12 @@ _PyDictKeys_DecRef(PyDictKeysObject *keys); static void type_dealloc_common(PyTypeObject *type) { - PyObject *tp, *val, *tb; - PyErr_Fetch(&tp, &val, &tb); - remove_all_subclasses(type, type->tp_bases); - PyErr_Restore(tp, val, tb); + if (type->tp_bases != NULL) { + PyObject *tp, *val, *tb; + PyErr_Fetch(&tp, &val, &tb); + remove_all_subclasses(type, type->tp_bases); + PyErr_Restore(tp, val, tb); + } PyObject_ClearWeakRefs((PyObject *)type); } @@ -4089,7 +4090,7 @@ _PyStaticType_Dealloc(PyTypeObject *type) Py_CLEAR(type->tp_bases); Py_CLEAR(type->tp_mro); Py_CLEAR(type->tp_cache); - Py_CLEAR(type->tp_subclasses); + // type->tp_subclasses is NULL type->tp_flags &= ~Py_TPFLAGS_READY; } @@ -4154,6 +4155,7 @@ _PyType_GetSubclasses(PyTypeObject *self) continue; } assert(PyType_Check(obj)); + if (PyList_Append(list, obj) < 0) { Py_DECREF(list); return NULL; @@ -6227,7 +6229,7 @@ type_ready_mro(PyTypeObject *type) Py_ssize_t n = PyTuple_GET_SIZE(mro); for (Py_ssize_t i = 0; i < n; i++) { PyTypeObject *base = _PyType_CAST(PyTuple_GET_ITEM(mro, i)); - if (PyType_Check(base) && (base->tp_flags & Py_TPFLAGS_HEAPTYPE)) { + if (base->tp_flags & Py_TPFLAGS_HEAPTYPE) { PyErr_Format(PyExc_TypeError, "type '%.100s' is not dynamically allocated but " "its base type '%.100s' is dynamically allocated", @@ -6515,15 +6517,15 @@ add_subclass(PyTypeObject *base, PyTypeObject *type) // Only get tp_subclasses after creating the key and value. // PyWeakref_NewRef() can trigger a garbage collection which can execute // arbitrary Python code and so modify base->tp_subclasses. 
- PyObject *dict = base->tp_subclasses; - if (dict == NULL) { - base->tp_subclasses = dict = PyDict_New(); - if (dict == NULL) + PyObject *subclasses = base->tp_subclasses; + if (subclasses == NULL) { + base->tp_subclasses = subclasses = PyDict_New(); + if (subclasses == NULL) return -1; } - assert(PyDict_CheckExact(dict)); + assert(PyDict_CheckExact(subclasses)); - int result = PyDict_SetItem(dict, key, ref); + int result = PyDict_SetItem(subclasses, key, ref); Py_DECREF(ref); Py_DECREF(key); return result; @@ -6532,35 +6534,30 @@ add_subclass(PyTypeObject *base, PyTypeObject *type) static int add_all_subclasses(PyTypeObject *type, PyObject *bases) { + Py_ssize_t n = PyTuple_GET_SIZE(bases); int res = 0; - - if (bases) { - Py_ssize_t i; - for (i = 0; i < PyTuple_GET_SIZE(bases); i++) { - PyObject *base = PyTuple_GET_ITEM(bases, i); - if (PyType_Check(base) && - add_subclass((PyTypeObject*)base, type) < 0) - { - res = -1; - } + for (Py_ssize_t i = 0; i < n; i++) { + PyObject *obj = PyTuple_GET_ITEM(bases, i); + // bases tuple must only contain types + PyTypeObject *base = _PyType_CAST(obj); + if (add_subclass(base, type) < 0) { + res = -1; } } - return res; } static void remove_subclass(PyTypeObject *base, PyTypeObject *type) { - PyObject *dict, *key; - - dict = base->tp_subclasses; - if (dict == NULL) { + PyObject *subclasses = base->tp_subclasses; // borrowed ref + if (subclasses == NULL) { return; } - assert(PyDict_CheckExact(dict)); - key = PyLong_FromVoidPtr((void *) type); - if (key == NULL || PyDict_DelItem(dict, key)) { + assert(PyDict_CheckExact(subclasses)); + + PyObject *key = PyLong_FromVoidPtr((void *) type); + if (key == NULL || PyDict_DelItem(subclasses, key)) { /* This can happen if the type initialization errored out before the base subclasses were updated (e.g. a non-str __qualname__ was passed in the type dict). */ @@ -6568,7 +6565,7 @@ remove_subclass(PyTypeObject *base, PyTypeObject *type) } Py_XDECREF(key); - if (PyDict_Size(dict) == 0) { + if (PyDict_Size(subclasses) == 0) { // Delete the dictionary to save memory. _PyStaticType_Dealloc() // callers also test if tp_subclasses is NULL to check if a static type // has no subclass. @@ -6579,15 +6576,17 @@ remove_subclass(PyTypeObject *base, PyTypeObject *type) static void remove_all_subclasses(PyTypeObject *type, PyObject *bases) { - if (bases) { - Py_ssize_t i; - for (i = 0; i < PyTuple_GET_SIZE(bases); i++) { - PyObject *base = PyTuple_GET_ITEM(bases, i); - if (PyType_Check(base)) { - remove_subclass((PyTypeObject*) base, type); - } + assert(bases != NULL); + // remove_subclass() can clear the current exception + assert(!PyErr_Occurred()); + + for (Py_ssize_t i = 0; i < PyTuple_GET_SIZE(bases); i++) { + PyObject *base = PyTuple_GET_ITEM(bases, i); + if (PyType_Check(base)) { + remove_subclass((PyTypeObject*) base, type); } } + assert(!PyErr_Occurred()); } static int @@ -8466,9 +8465,9 @@ static int update_slots_callback(PyTypeObject *type, void *data) { slotdef **pp = (slotdef **)data; - - for (; *pp; pp++) + for (; *pp; pp++) { update_one_slot(type, *pp); + } return 0; } @@ -8654,29 +8653,33 @@ type_new_init_subclass(PyTypeObject *type, PyObject *kwds) /* recurse_down_subclasses() and update_subclasses() are mutually recursive functions to call a callback for all subclasses, - but refraining from recursing into subclasses that define 'name'. */ + but refraining from recursing into subclasses that define 'attr_name'. 
*/ static int -update_subclasses(PyTypeObject *type, PyObject *name, +update_subclasses(PyTypeObject *type, PyObject *attr_name, update_callback callback, void *data) { - if (callback(type, data) < 0) + if (callback(type, data) < 0) { return -1; - return recurse_down_subclasses(type, name, callback, data); + } + return recurse_down_subclasses(type, attr_name, callback, data); } static int -recurse_down_subclasses(PyTypeObject *type, PyObject *name, +recurse_down_subclasses(PyTypeObject *type, PyObject *attr_name, update_callback callback, void *data) { - PyObject *ref, *subclasses, *dict; - Py_ssize_t i; - - subclasses = type->tp_subclasses; - if (subclasses == NULL) + // It is safe to use a borrowed reference because update_subclasses() is + // only used with update_slots_callback() which doesn't modify + // tp_subclasses. + PyObject *subclasses = type->tp_subclasses; // borrowed ref + if (subclasses == NULL) { return 0; + } assert(PyDict_CheckExact(subclasses)); - i = 0; + + Py_ssize_t i = 0; + PyObject *ref; while (PyDict_Next(subclasses, &i, NULL, &ref)) { assert(PyWeakref_CheckRef(ref)); PyObject *obj = PyWeakref_GET_OBJECT(ref); @@ -8687,18 +8690,20 @@ recurse_down_subclasses(PyTypeObject *type, PyObject *name, PyTypeObject *subclass = _PyType_CAST(obj); /* Avoid recursing down into unaffected classes */ - dict = subclass->tp_dict; + PyObject *dict = subclass->tp_dict; if (dict != NULL && PyDict_Check(dict)) { - int r = PyDict_Contains(dict, name); - if (r > 0) { - continue; - } + int r = PyDict_Contains(dict, attr_name); if (r < 0) { return -1; } + if (r > 0) { + continue; + } } - if (update_subclasses(subclass, name, callback, data) < 0) + + if (update_subclasses(subclass, attr_name, callback, data) < 0) { return -1; + } } return 0; } From webhook-mailer at python.org Sat Jan 22 13:15:44 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 18:15:44 -0000 Subject: [Python-checkins] bpo-45200: GHA Address Sanitizer skips 3 slowest tests (GH-30797) Message-ID: https://github.com/python/cpython/commit/ce7d66771ec64488134a1dd114015aa056eef696 commit: ce7d66771ec64488134a1dd114015aa056eef696 branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T19:15:37+01:00 summary: bpo-45200: GHA Address Sanitizer skips 3 slowest tests (GH-30797) Skip the 3 slowest tests of the Address Sanitizer CI of GitHub Actions: * test_tools * test_peg_generator * test_concurrent_futures These tests take between 5 and 20 minutes on this CI which makes this CI job the slowest. Making this CI job faster makes the whole Python workflow faster. These tests are run on all others CIs. Example of Address Sanitizer output: 10 slowest tests: - test_peg_generator: 17 min 33 sec - test_tools: 8 min 27 sec - test_concurrent_futures: 5 min 24 sec - test_zipfile: 2 min 41 sec - test_compileall: 2 min 21 sec - test_asyncio: 2 min 17 sec - test_gdb: 1 min 43 sec - test_weakref: 1 min 35 sec - test_pickle: 1 min 18 sec - test_subprocess: 1 min 12 sec Moreover, test_concurrent_futures also seems to be affected by bpo-45200 bug: libasan dead lock in pthread_create(). 
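The change itself only edits the TESTOPTS exclusion list in the GitHub Actions workflow (see the diff below). For comparison only, a per-module guard that skips a slow test when the process appears to run under AddressSanitizer could look like the sketch below; treating the presence of ASAN_OPTIONS in the environment as the sanitizer signal is an assumption of this sketch, not something the CI job guarantees:

    import os
    import unittest

    # Hypothetical guard: skip an expensive test class when the environment
    # suggests an AddressSanitizer build (assumption: ASAN_OPTIONS is set).
    RUNNING_UNDER_ASAN = os.environ.get("ASAN_OPTIONS") is not None

    @unittest.skipIf(RUNNING_UNDER_ASAN, "too slow under AddressSanitizer")
    class SlowUnderASanTest(unittest.TestCase):
        def test_smoke(self):
            self.assertTrue(True)

    if __name__ == "__main__":
        unittest.main()
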
files: M .github/workflows/build.yml diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index f11d51b2dc993..d6af174d1c3a7 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -306,4 +306,9 @@ jobs: - name: Display build info run: make pythoninfo - name: Tests - run: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu -x test_ctypes test_crypt test_decimal test_faulthandler test_interpreters test___all__ test_idle test_tix test_tk test_ttk_guionly test_ttk_textonly test_multiprocessing_fork test_multiprocessing_forkserver test_multiprocessing_spawn" + # Skip test_tools test_peg_generator test_concurrent_futures because + # there are too slow: between 5 and 20 minutes on this CI. + # + # Skip multiprocessing and concurrent.futures tests which are affected by + # bpo-45200 bug: libasan dead lock in pthread_create(). + run: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu -x test_ctypes test_crypt test_decimal test_faulthandler test_interpreters test___all__ test_idle test_tix test_tk test_ttk_guionly test_ttk_textonly test_multiprocessing_fork test_multiprocessing_forkserver test_multiprocessing_spawn test_tools test_peg_generator test_concurrent_futures" From webhook-mailer at python.org Sat Jan 22 13:31:33 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 18:31:33 -0000 Subject: [Python-checkins] bpo-46417: _PyTypes_FiniTypes() clears object and type (GH-30798) Message-ID: https://github.com/python/cpython/commit/6cacdb42454264ae75cab5e32bb62876da43bf6f commit: 6cacdb42454264ae75cab5e32bb62876da43bf6f branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T19:31:24+01:00 summary: bpo-46417: _PyTypes_FiniTypes() clears object and type (GH-30798) files: M Objects/object.c diff --git a/Objects/object.c b/Objects/object.c index a1663c0dbb7b7..27f89e8d75212 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -1840,7 +1840,12 @@ _PyTypes_InitState(PyInterpreterState *interp) static PyTypeObject* static_types[] = { - // base types + // The two most important base types: must be initialized first and + // deallocated last. + &PyBaseObject_Type, + &PyType_Type, + + // Static types with base=&PyBaseObject_Type &PyAsyncGen_Type, &PyByteArrayIter_Type, &PyByteArray_Type, @@ -1955,29 +1960,20 @@ _PyTypes_InitTypes(PyInterpreterState *interp) return _PyStatus_OK(); } -#define INIT_TYPE(TYPE) \ - do { \ - if (PyType_Ready(&(TYPE)) < 0) { \ - return _PyStatus_ERR("Can't initialize " #TYPE " type"); \ - } \ - } while (0) - - // Base types - INIT_TYPE(PyBaseObject_Type); - INIT_TYPE(PyType_Type); - assert(PyBaseObject_Type.tp_base == NULL); - assert(PyType_Type.tp_base == &PyBaseObject_Type); - // All other static types (unless initialized elsewhere) for (size_t i=0; i < Py_ARRAY_LENGTH(static_types); i++) { PyTypeObject *type = static_types[i]; if (PyType_Ready(type) < 0) { return _PyStatus_ERR("Can't initialize types"); } + if (type == &PyType_Type) { + // Sanitify checks of the two most important types + assert(PyBaseObject_Type.tp_base == NULL); + assert(PyType_Type.tp_base == &PyBaseObject_Type); + } } return _PyStatus_OK(); -#undef INIT_TYPE } From webhook-mailer at python.org Sat Jan 22 13:49:47 2022 From: webhook-mailer at python.org (miss-islington) Date: Sat, 22 Jan 2022 18:49:47 -0000 Subject: [Python-checkins] bpo-46126: Disable 'descriptions' when running tests internally. 
(GH-30194) Message-ID: https://github.com/python/cpython/commit/a941e5927f7f2540946813606c61c6aea38db426 commit: a941e5927f7f2540946813606c61c6aea38db426 branch: main author: Jason R. Coombs committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-22T10:49:38-08:00 summary: bpo-46126: Disable 'descriptions' when running tests internally. (GH-30194) files: A Misc/NEWS.d/next/Tests/2021-12-18-22-23-50.bpo-46126.0LH3Yb.rst M Lib/test/support/testresult.py diff --git a/Lib/test/support/testresult.py b/Lib/test/support/testresult.py index 2cd1366cd8a9e..eb2279a88f9a0 100644 --- a/Lib/test/support/testresult.py +++ b/Lib/test/support/testresult.py @@ -145,7 +145,11 @@ def get_test_runner_class(verbosity, buffer=False): return functools.partial(unittest.TextTestRunner, resultclass=RegressionTestResult, buffer=buffer, - verbosity=verbosity) + verbosity=verbosity, + # disable descriptions so errors are + # readily traceable. bpo-46126 + descriptions=False, + ) return functools.partial(QuietRegressionTestRunner, buffer=buffer) def get_test_runner(stream, verbosity, capture_output=False): diff --git a/Misc/NEWS.d/next/Tests/2021-12-18-22-23-50.bpo-46126.0LH3Yb.rst b/Misc/NEWS.d/next/Tests/2021-12-18-22-23-50.bpo-46126.0LH3Yb.rst new file mode 100644 index 0000000000000..b7360b36454ea --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2021-12-18-22-23-50.bpo-46126.0LH3Yb.rst @@ -0,0 +1 @@ +Disable 'descriptions' when running tests internally. From webhook-mailer at python.org Sat Jan 22 15:49:12 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 20:49:12 -0000 Subject: [Python-checkins] bpo-46417: Factorize _PyExc_InitTypes() code (GH-30804) Message-ID: https://github.com/python/cpython/commit/f1bcdeaca6e912a2bec1fbcff76cc49e7f761d38 commit: f1bcdeaca6e912a2bec1fbcff76cc49e7f761d38 branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T21:48:56+01:00 summary: bpo-46417: Factorize _PyExc_InitTypes() code (GH-30804) Add 'static_exceptions' list to factorize code between _PyExc_InitTypes() and _PyBuiltins_AddExceptions(). _PyExc_InitTypes() does nothing if it's not the main interpreter. Sort exceptions in Lib/test/exception_hierarchy.txt. 
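A name-sorted view of the builtin hierarchy, similar to the regenerated exception_hierarchy.txt, can be produced from pure Python; a rough sketch (it only descends into subclasses that are re-exported from builtins, and on interpreters with PEP 654 it prints ExceptionGroup under both of its bases):

    import builtins

    def dump(exc, depth=0):
        # Print the builtin exception tree, children sorted by name,
        # matching the ordering now used in exception_hierarchy.txt.
        print("    " * depth + exc.__name__)
        for sub in sorted(exc.__subclasses__(), key=lambda c: c.__name__):
            if getattr(builtins, sub.__name__, None) is sub:
                dump(sub, depth + 1)

    dump(BaseException)
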
files: M Include/internal/pycore_exceptions.h M Include/internal/pycore_pylifecycle.h M Lib/test/exception_hierarchy.txt M Objects/exceptions.c M Python/pylifecycle.c diff --git a/Include/internal/pycore_exceptions.h b/Include/internal/pycore_exceptions.h index 1651966dad936..4a9df70913199 100644 --- a/Include/internal/pycore_exceptions.h +++ b/Include/internal/pycore_exceptions.h @@ -13,7 +13,7 @@ extern "C" { extern PyStatus _PyExc_InitState(PyInterpreterState *); extern PyStatus _PyExc_InitGlobalObjects(PyInterpreterState *); -extern PyStatus _PyExc_InitTypes(PyInterpreterState *); +extern int _PyExc_InitTypes(PyInterpreterState *); extern void _PyExc_Fini(PyInterpreterState *); diff --git a/Include/internal/pycore_pylifecycle.h b/Include/internal/pycore_pylifecycle.h index dfa8fd6bd0d28..35e560b42ee0a 100644 --- a/Include/internal/pycore_pylifecycle.h +++ b/Include/internal/pycore_pylifecycle.h @@ -59,7 +59,7 @@ extern PyStatus _PySys_ReadPreinitWarnOptions(PyWideStringList *options); extern PyStatus _PySys_ReadPreinitXOptions(PyConfig *config); extern int _PySys_UpdateConfig(PyThreadState *tstate); extern void _PySys_Fini(PyInterpreterState *interp); -extern PyStatus _PyBuiltins_AddExceptions(PyObject * bltinmod); +extern int _PyBuiltins_AddExceptions(PyObject * bltinmod); extern PyStatus _Py_HashRandomization_Init(const PyConfig *); extern PyStatus _PyImportZip_Init(PyThreadState *tstate); diff --git a/Lib/test/exception_hierarchy.txt b/Lib/test/exception_hierarchy.txt index 5c0bfda373794..1eca123be0fec 100644 --- a/Lib/test/exception_hierarchy.txt +++ b/Lib/test/exception_hierarchy.txt @@ -1,12 +1,9 @@ BaseException - ??? SystemExit - ??? KeyboardInterrupt - ??? GeneratorExit ??? BaseExceptionGroup + ??? GeneratorExit + ??? KeyboardInterrupt + ??? SystemExit ??? Exception - ??? ExceptionGroup [BaseExceptionGroup] - ??? StopIteration - ??? StopAsyncIteration ??? ArithmeticError ? ??? FloatingPointError ? ??? OverflowError @@ -15,6 +12,7 @@ BaseException ??? AttributeError ??? BufferError ??? EOFError + ??? ExceptionGroup [BaseExceptionGroup] ??? ImportError ? ??? ModuleNotFoundError ??? LookupError @@ -43,6 +41,8 @@ BaseException ??? RuntimeError ? ??? NotImplementedError ? ??? RecursionError + ??? StopAsyncIteration + ??? StopIteration ??? SyntaxError ? ??? IndentationError ? ??? TabError @@ -54,14 +54,14 @@ BaseException ? ??? UnicodeEncodeError ? ??? UnicodeTranslateError ??? Warning + ??? BytesWarning ??? DeprecationWarning + ??? EncodingWarning + ??? FutureWarning + ??? ImportWarning ??? PendingDeprecationWarning + ??? ResourceWarning ??? RuntimeWarning ??? SyntaxWarning - ??? UserWarning - ??? FutureWarning - ??? ImportWarning ??? UnicodeWarning - ??? BytesWarning - ??? EncodingWarning - ??? ResourceWarning + ??? 
UserWarning diff --git a/Objects/exceptions.c b/Objects/exceptions.c index 22a47131aa12c..f8f727c673c02 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -3421,92 +3421,121 @@ SimpleExtendsException(PyExc_Warning, ResourceWarning, #endif #endif /* MS_WINDOWS */ -PyStatus +struct static_exception { + PyTypeObject *exc; + const char *name; +}; + +static struct static_exception static_exceptions[] = { +#define ITEM(NAME) {&_PyExc_##NAME, #NAME} + // Level 1 + ITEM(BaseException), + + // Level 2: BaseException subclasses + ITEM(BaseExceptionGroup), + ITEM(Exception), + ITEM(GeneratorExit), + ITEM(KeyboardInterrupt), + ITEM(SystemExit), + + // Level 3: Exception(BaseException) subclasses + ITEM(ArithmeticError), + ITEM(AssertionError), + ITEM(AttributeError), + ITEM(BufferError), + ITEM(EOFError), + //ITEM(ExceptionGroup), + ITEM(ImportError), + ITEM(LookupError), + ITEM(MemoryError), + ITEM(NameError), + ITEM(OSError), + ITEM(ReferenceError), + ITEM(RuntimeError), + ITEM(StopAsyncIteration), + ITEM(StopIteration), + ITEM(SyntaxError), + ITEM(SystemError), + ITEM(TypeError), + ITEM(ValueError), + ITEM(Warning), + + // Level 4: ArithmeticError(Exception) subclasses + ITEM(FloatingPointError), + ITEM(OverflowError), + ITEM(ZeroDivisionError), + + // Level 4: Warning(Exception) subclasses + ITEM(BytesWarning), + ITEM(DeprecationWarning), + ITEM(EncodingWarning), + ITEM(FutureWarning), + ITEM(ImportWarning), + ITEM(PendingDeprecationWarning), + ITEM(ResourceWarning), + ITEM(RuntimeWarning), + ITEM(SyntaxWarning), + ITEM(UnicodeWarning), + ITEM(UserWarning), + + // Level 4: OSError(Exception) subclasses + ITEM(BlockingIOError), + ITEM(ChildProcessError), + ITEM(ConnectionError), + ITEM(FileExistsError), + ITEM(FileNotFoundError), + ITEM(InterruptedError), + ITEM(IsADirectoryError), + ITEM(NotADirectoryError), + ITEM(PermissionError), + ITEM(ProcessLookupError), + ITEM(TimeoutError), + + // Level 4: Other subclasses + ITEM(IndentationError), // base: SyntaxError(Exception) + ITEM(IndexError), // base: LookupError(Exception) + ITEM(KeyError), // base: LookupError(Exception) + ITEM(ModuleNotFoundError), // base: ImportError(Exception) + ITEM(NotImplementedError), // base: RuntimeError(Exception) + ITEM(RecursionError), // base: RuntimeError(Exception) + ITEM(UnboundLocalError), // base: NameError(Exception) + ITEM(UnicodeError), // base: ValueError(Exception) + + // Level 5: ConnectionError(OSError) subclasses + ITEM(BrokenPipeError), + ITEM(ConnectionAbortedError), + ITEM(ConnectionRefusedError), + ITEM(ConnectionResetError), + + // Level 5: IndentationError(SyntaxError) subclasses + ITEM(TabError), // base: IndentationError + + // Level 5: UnicodeError(ValueError) subclasses + ITEM(UnicodeDecodeError), + ITEM(UnicodeEncodeError), + ITEM(UnicodeTranslateError), +#undef ITEM +}; + + +int _PyExc_InitTypes(PyInterpreterState *interp) { -#define PRE_INIT(TYPE) \ - if (!(_PyExc_ ## TYPE.tp_flags & Py_TPFLAGS_READY)) { \ - if (PyType_Ready(&_PyExc_ ## TYPE) < 0) { \ - return _PyStatus_ERR("exceptions bootstrapping error."); \ - } \ - Py_INCREF(PyExc_ ## TYPE); \ + if (!_Py_IsMainInterpreter(interp)) { + return 0; } - PRE_INIT(BaseException); - PRE_INIT(BaseExceptionGroup); - PRE_INIT(Exception); - PRE_INIT(TypeError); - PRE_INIT(StopAsyncIteration); - PRE_INIT(StopIteration); - PRE_INIT(GeneratorExit); - PRE_INIT(SystemExit); - PRE_INIT(KeyboardInterrupt); - PRE_INIT(ImportError); - PRE_INIT(ModuleNotFoundError); - PRE_INIT(OSError); - PRE_INIT(EOFError); - PRE_INIT(RuntimeError); - 
PRE_INIT(RecursionError); - PRE_INIT(NotImplementedError); - PRE_INIT(NameError); - PRE_INIT(UnboundLocalError); - PRE_INIT(AttributeError); - PRE_INIT(SyntaxError); - PRE_INIT(IndentationError); - PRE_INIT(TabError); - PRE_INIT(LookupError); - PRE_INIT(IndexError); - PRE_INIT(KeyError); - PRE_INIT(ValueError); - PRE_INIT(UnicodeError); - PRE_INIT(UnicodeEncodeError); - PRE_INIT(UnicodeDecodeError); - PRE_INIT(UnicodeTranslateError); - PRE_INIT(AssertionError); - PRE_INIT(ArithmeticError); - PRE_INIT(FloatingPointError); - PRE_INIT(OverflowError); - PRE_INIT(ZeroDivisionError); - PRE_INIT(SystemError); - PRE_INIT(ReferenceError); - PRE_INIT(MemoryError); - PRE_INIT(BufferError); - PRE_INIT(Warning); - PRE_INIT(UserWarning); - PRE_INIT(EncodingWarning); - PRE_INIT(DeprecationWarning); - PRE_INIT(PendingDeprecationWarning); - PRE_INIT(SyntaxWarning); - PRE_INIT(RuntimeWarning); - PRE_INIT(FutureWarning); - PRE_INIT(ImportWarning); - PRE_INIT(UnicodeWarning); - PRE_INIT(BytesWarning); - PRE_INIT(ResourceWarning); - - /* OSError subclasses */ - PRE_INIT(ConnectionError); - - PRE_INIT(BlockingIOError); - PRE_INIT(BrokenPipeError); - PRE_INIT(ChildProcessError); - PRE_INIT(ConnectionAbortedError); - PRE_INIT(ConnectionRefusedError); - PRE_INIT(ConnectionResetError); - PRE_INIT(FileExistsError); - PRE_INIT(FileNotFoundError); - PRE_INIT(IsADirectoryError); - PRE_INIT(NotADirectoryError); - PRE_INIT(InterruptedError); - PRE_INIT(PermissionError); - PRE_INIT(ProcessLookupError); - PRE_INIT(TimeoutError); + for (size_t i=0; i < Py_ARRAY_LENGTH(static_exceptions); i++) { + PyTypeObject *exc = static_exceptions[i].exc; - return _PyStatus_OK(); - -#undef PRE_INIT + if (PyType_Ready(exc) < 0) { + return -1; + } + } + return 0; } + PyStatus _PyExc_InitGlobalObjects(PyInterpreterState *interp) { @@ -3569,12 +3598,28 @@ _PyExc_InitState(PyInterpreterState *interp) /* Add exception types to the builtins module */ -PyStatus +int _PyBuiltins_AddExceptions(PyObject *bltinmod) { -#define POST_INIT(TYPE) \ - if (PyDict_SetItemString(bdict, # TYPE, PyExc_ ## TYPE)) { \ - return _PyStatus_ERR("Module dictionary insertion problem."); \ + PyObject *mod_dict = PyModule_GetDict(bltinmod); + if (mod_dict == NULL) { + return -1; + } + + for (size_t i=0; i < Py_ARRAY_LENGTH(static_exceptions); i++) { + struct static_exception item = static_exceptions[i]; + + if (PyDict_SetItemString(mod_dict, item.name, (PyObject*)item.exc)) { + return -1; + } + } + + PyObject *PyExc_ExceptionGroup = create_exception_group_class(); + if (!PyExc_ExceptionGroup) { + return -1; + } + if (PyDict_SetItemString(mod_dict, "ExceptionGroup", PyExc_ExceptionGroup)) { + return -1; } #define INIT_ALIAS(NAME, TYPE) \ @@ -3582,103 +3627,20 @@ _PyBuiltins_AddExceptions(PyObject *bltinmod) Py_INCREF(PyExc_ ## TYPE); \ Py_XDECREF(PyExc_ ## NAME); \ PyExc_ ## NAME = PyExc_ ## TYPE; \ - if (PyDict_SetItemString(bdict, # NAME, PyExc_ ## NAME)) { \ - return _PyStatus_ERR("Module dictionary insertion problem."); \ + if (PyDict_SetItemString(mod_dict, # NAME, PyExc_ ## NAME)) { \ + return -1; \ } \ } while (0) - PyObject *bdict; - - bdict = PyModule_GetDict(bltinmod); - if (bdict == NULL) { - return _PyStatus_ERR("exceptions bootstrapping error."); - } - - PyObject *PyExc_ExceptionGroup = create_exception_group_class(); - if (!PyExc_ExceptionGroup) { - return _PyStatus_ERR("exceptions bootstrapping error."); - } - - POST_INIT(BaseException); - POST_INIT(Exception); - POST_INIT(BaseExceptionGroup); - POST_INIT(ExceptionGroup); - POST_INIT(TypeError); - 
POST_INIT(StopAsyncIteration); - POST_INIT(StopIteration); - POST_INIT(GeneratorExit); - POST_INIT(SystemExit); - POST_INIT(KeyboardInterrupt); - POST_INIT(ImportError); - POST_INIT(ModuleNotFoundError); - POST_INIT(OSError); INIT_ALIAS(EnvironmentError, OSError); INIT_ALIAS(IOError, OSError); #ifdef MS_WINDOWS INIT_ALIAS(WindowsError, OSError); #endif - POST_INIT(EOFError); - POST_INIT(RuntimeError); - POST_INIT(RecursionError); - POST_INIT(NotImplementedError); - POST_INIT(NameError); - POST_INIT(UnboundLocalError); - POST_INIT(AttributeError); - POST_INIT(SyntaxError); - POST_INIT(IndentationError); - POST_INIT(TabError); - POST_INIT(LookupError); - POST_INIT(IndexError); - POST_INIT(KeyError); - POST_INIT(ValueError); - POST_INIT(UnicodeError); - POST_INIT(UnicodeEncodeError); - POST_INIT(UnicodeDecodeError); - POST_INIT(UnicodeTranslateError); - POST_INIT(AssertionError); - POST_INIT(ArithmeticError); - POST_INIT(FloatingPointError); - POST_INIT(OverflowError); - POST_INIT(ZeroDivisionError); - POST_INIT(SystemError); - POST_INIT(ReferenceError); - POST_INIT(MemoryError); - POST_INIT(BufferError); - POST_INIT(Warning); - POST_INIT(UserWarning); - POST_INIT(EncodingWarning); - POST_INIT(DeprecationWarning); - POST_INIT(PendingDeprecationWarning); - POST_INIT(SyntaxWarning); - POST_INIT(RuntimeWarning); - POST_INIT(FutureWarning); - POST_INIT(ImportWarning); - POST_INIT(UnicodeWarning); - POST_INIT(BytesWarning); - POST_INIT(ResourceWarning); - - /* OSError subclasses */ - POST_INIT(ConnectionError); - - POST_INIT(BlockingIOError); - POST_INIT(BrokenPipeError); - POST_INIT(ChildProcessError); - POST_INIT(ConnectionAbortedError); - POST_INIT(ConnectionRefusedError); - POST_INIT(ConnectionResetError); - POST_INIT(FileExistsError); - POST_INIT(FileNotFoundError); - POST_INIT(IsADirectoryError); - POST_INIT(NotADirectoryError); - POST_INIT(InterruptedError); - POST_INIT(PermissionError); - POST_INIT(ProcessLookupError); - POST_INIT(TimeoutError); - - return _PyStatus_OK(); -#undef POST_INIT #undef INIT_ALIAS + + return 0; } void diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index a53f532e9e202..aca3b1a5fd1a4 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -740,9 +740,8 @@ pycore_init_types(PyInterpreterState *interp) return status; } - status = _PyExc_InitTypes(interp); - if (_PyStatus_EXCEPTION(status)) { - return status; + if (_PyExc_InitTypes(interp) < 0) { + return _PyStatus_ERR("failed to initialize an exception type"); } status = _PyExc_InitGlobalObjects(interp); @@ -790,9 +789,8 @@ pycore_init_builtins(PyThreadState *tstate) Py_INCREF(builtins_dict); interp->builtins = builtins_dict; - PyStatus status = _PyBuiltins_AddExceptions(bimod); - if (_PyStatus_EXCEPTION(status)) { - return status; + if (_PyBuiltins_AddExceptions(bimod) < 0) { + return _PyStatus_ERR("failed to add exceptions to builtins"); } interp->builtins_copy = PyDict_Copy(interp->builtins); From webhook-mailer at python.org Sat Jan 22 16:31:57 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 21:31:57 -0000 Subject: [Python-checkins] bpo-46417: Py_Finalize() clears static exceptioins (GH-30805) Message-ID: https://github.com/python/cpython/commit/621a45ccacd121f9ae4d8a539f040410c74b253b commit: 621a45ccacd121f9ae4d8a539f040410c74b253b branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T22:31:44+01:00 summary: bpo-46417: Py_Finalize() clears static exceptioins (GH-30805) The Py_Finalize() function now clears exceptions implemented as static types. 
Add _PyExc_FiniTypes() function, called by _PyExc_Fini(). files: M Objects/exceptions.c diff --git a/Objects/exceptions.c b/Objects/exceptions.c index f8f727c673c02..6bf70e2b15c16 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -3536,13 +3536,36 @@ _PyExc_InitTypes(PyInterpreterState *interp) } +static void +_PyExc_FiniTypes(PyInterpreterState *interp) +{ + if (!_Py_IsMainInterpreter(interp)) { + return; + } + + for (Py_ssize_t i=Py_ARRAY_LENGTH(static_exceptions) - 1; i >= 0; i--) { + PyTypeObject *exc = static_exceptions[i].exc; + + // Cannot delete a type if it still has subclasses + if (exc->tp_subclasses != NULL) { + continue; + } + + _PyStaticType_Dealloc(exc); + } +} + + PyStatus _PyExc_InitGlobalObjects(PyInterpreterState *interp) { + if (!_Py_IsMainInterpreter(interp)) { + return _PyStatus_OK(); + } + if (preallocate_memerrors() < 0) { return _PyStatus_NO_MEMORY(); } - return _PyStatus_OK(); } @@ -3656,6 +3679,8 @@ _PyExc_Fini(PyInterpreterState *interp) struct _Py_exc_state *state = &interp->exc_state; free_preallocated_memerrors(state); Py_CLEAR(state->errnomap); + + _PyExc_FiniTypes(interp); } /* Helper to do the equivalent of "raise X from Y" in C, but always using From webhook-mailer at python.org Sat Jan 22 16:56:09 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 21:56:09 -0000 Subject: [Python-checkins] bpo-46417: Clear Unicode static types at exit (GH-30806) Message-ID: https://github.com/python/cpython/commit/1626bf4ac7aef1244e6f886e63a31f7ed65fbd10 commit: 1626bf4ac7aef1244e6f886e63a31f7ed65fbd10 branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T22:55:39+01:00 summary: bpo-46417: Clear Unicode static types at exit (GH-30806) Add _PyUnicode_FiniTypes() function, called by finalize_interp_types(). It clears these static types: * EncodingMapType * PyFieldNameIter_Type * PyFormatterIter_Type _PyStaticType_Dealloc() now does nothing if tp_subclasses is not NULL. files: M Include/internal/pycore_unicodeobject.h M Objects/exceptions.c M Objects/object.c M Objects/typeobject.c M Objects/unicodeobject.c M Python/pylifecycle.c diff --git a/Include/internal/pycore_unicodeobject.h b/Include/internal/pycore_unicodeobject.h index 3b6dfe9dbbab4..fabe522f6fc23 100644 --- a/Include/internal/pycore_unicodeobject.h +++ b/Include/internal/pycore_unicodeobject.h @@ -17,6 +17,7 @@ extern void _PyUnicode_InitState(PyInterpreterState *); extern PyStatus _PyUnicode_InitGlobalObjects(PyInterpreterState *); extern PyStatus _PyUnicode_InitTypes(PyInterpreterState *); extern void _PyUnicode_Fini(PyInterpreterState *); +extern void _PyUnicode_FiniTypes(PyInterpreterState *); /* other API */ diff --git a/Objects/exceptions.c b/Objects/exceptions.c index 6bf70e2b15c16..065503f59d62d 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -3545,12 +3545,6 @@ _PyExc_FiniTypes(PyInterpreterState *interp) for (Py_ssize_t i=Py_ARRAY_LENGTH(static_exceptions) - 1; i >= 0; i--) { PyTypeObject *exc = static_exceptions[i].exc; - - // Cannot delete a type if it still has subclasses - if (exc->tp_subclasses != NULL) { - continue; - } - _PyStaticType_Dealloc(exc); } } diff --git a/Objects/object.c b/Objects/object.c index 27f89e8d75212..3082e70e7e230 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -1994,10 +1994,6 @@ _PyTypes_FiniTypes(PyInterpreterState *interp) // their base classes. 
for (Py_ssize_t i=Py_ARRAY_LENGTH(static_types)-1; i>=0; i--) { PyTypeObject *type = static_types[i]; - // Cannot delete a type if it still has subclasses - if (type->tp_subclasses != NULL) { - continue; - } _PyStaticType_Dealloc(type); } } diff --git a/Objects/typeobject.c b/Objects/typeobject.c index bf62b5389257f..cc4612f9308d0 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -4079,10 +4079,12 @@ type_dealloc_common(PyTypeObject *type) void _PyStaticType_Dealloc(PyTypeObject *type) { - // _PyStaticType_Dealloc() must not be called if a type has subtypes. + // If a type still has subtypes, it cannot be deallocated. // A subtype can inherit attributes and methods of its parent type, // and a type must no longer be used once it's deallocated. - assert(type->tp_subclasses == NULL); + if (type->tp_subclasses != NULL) { + return; + } type_dealloc_common(type); diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index 2e1f8a6ac4e56..4cea0d8e62e85 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -15567,23 +15567,19 @@ _PyUnicode_InitTypes(PyInterpreterState *interp) return _PyStatus_OK(); } - if (PyType_Ready(&PyUnicode_Type) < 0) { - return _PyStatus_ERR("Can't initialize unicode type"); - } - if (PyType_Ready(&PyUnicodeIter_Type) < 0) { - return _PyStatus_ERR("Can't initialize unicode iterator type"); - } - if (PyType_Ready(&EncodingMapType) < 0) { - return _PyStatus_ERR("Can't initialize encoding map type"); + goto error; } if (PyType_Ready(&PyFieldNameIter_Type) < 0) { - return _PyStatus_ERR("Can't initialize field name iterator type"); + goto error; } if (PyType_Ready(&PyFormatterIter_Type) < 0) { - return _PyStatus_ERR("Can't initialize formatter iter type"); + goto error; } return _PyStatus_OK(); + +error: + return _PyStatus_ERR("Can't initialize unicode types"); } @@ -16111,6 +16107,19 @@ unicode_is_finalizing(void) #endif +void +_PyUnicode_FiniTypes(PyInterpreterState *interp) +{ + if (!_Py_IsMainInterpreter(interp)) { + return; + } + + _PyStaticType_Dealloc(&EncodingMapType); + _PyStaticType_Dealloc(&PyFieldNameIter_Type); + _PyStaticType_Dealloc(&PyFormatterIter_Type); +} + + void _PyUnicode_Fini(PyInterpreterState *interp) { diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index aca3b1a5fd1a4..7fc9d3c94ce51 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -1664,6 +1664,7 @@ flush_std_files(void) static void finalize_interp_types(PyInterpreterState *interp) { + _PyUnicode_FiniTypes(interp); _PySys_Fini(interp); _PyExc_Fini(interp); _PyFrame_Fini(interp); From webhook-mailer at python.org Sat Jan 22 17:22:38 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 22:22:38 -0000 Subject: [Python-checkins] bpo-46417: Clear _io module static objects at exit (GH-30807) Message-ID: https://github.com/python/cpython/commit/9c8e490b8f9e40a6fe9815be58bacaecab5369ee commit: 9c8e490b8f9e40a6fe9815be58bacaecab5369ee branch: main author: Victor Stinner committer: vstinner date: 2022-01-22T23:22:20+01:00 summary: bpo-46417: Clear _io module static objects at exit (GH-30807) Add _PyIO_Fini() function, called by finalize_interp_clear(). It clears static objects used by the _io extension module. 
files: M Modules/_io/_iomodule.c M Python/pylifecycle.c diff --git a/Modules/_io/_iomodule.c b/Modules/_io/_iomodule.c index b4743fbd5e04f..116688da5e7af 100644 --- a/Modules/_io/_iomodule.c +++ b/Modules/_io/_iomodule.c @@ -666,6 +666,82 @@ struct PyModuleDef _PyIO_Module = { (freefunc)iomodule_free, }; + +static PyTypeObject* static_types[] = { + // Base classes + &PyIOBase_Type, + &PyIncrementalNewlineDecoder_Type, + + // PyIOBase_Type subclasses + &PyBufferedIOBase_Type, + &PyRawIOBase_Type, + &PyTextIOBase_Type, + + // PyBufferedIOBase_Type(PyIOBase_Type) subclasses + &PyBytesIO_Type, + &PyBufferedReader_Type, + &PyBufferedWriter_Type, + &PyBufferedRWPair_Type, + &PyBufferedRandom_Type, + + // PyRawIOBase_Type(PyIOBase_Type) subclasses + &PyFileIO_Type, + &_PyBytesIOBuffer_Type, +#ifdef MS_WINDOWS + &PyWindowsConsoleIO_Type, +#endif + + // PyTextIOBase_Type(PyIOBase_Type) subclasses + &PyStringIO_Type, + &PyTextIOWrapper_Type, +}; + + +void +_PyIO_Fini(void) +{ + for (Py_ssize_t i=Py_ARRAY_LENGTH(static_types) - 1; i >= 0; i--) { + PyTypeObject *exc = static_types[i]; + _PyStaticType_Dealloc(exc); + } + + /* Interned strings */ +#define CLEAR_INTERNED(name) \ + Py_CLEAR(_PyIO_str_ ## name) + + CLEAR_INTERNED(close); + CLEAR_INTERNED(closed); + CLEAR_INTERNED(decode); + CLEAR_INTERNED(encode); + CLEAR_INTERNED(fileno); + CLEAR_INTERNED(flush); + CLEAR_INTERNED(getstate); + CLEAR_INTERNED(isatty); + CLEAR_INTERNED(locale); + CLEAR_INTERNED(newlines); + CLEAR_INTERNED(peek); + CLEAR_INTERNED(read); + CLEAR_INTERNED(read1); + CLEAR_INTERNED(readable); + CLEAR_INTERNED(readall); + CLEAR_INTERNED(readinto); + CLEAR_INTERNED(readline); + CLEAR_INTERNED(reset); + CLEAR_INTERNED(seek); + CLEAR_INTERNED(seekable); + CLEAR_INTERNED(setstate); + CLEAR_INTERNED(tell); + CLEAR_INTERNED(truncate); + CLEAR_INTERNED(write); + CLEAR_INTERNED(writable); +#undef CLEAR_INTERNED + + Py_CLEAR(_PyIO_str_nl); + Py_CLEAR(_PyIO_empty_str); + Py_CLEAR(_PyIO_empty_bytes); +} + + PyMODINIT_FUNC PyInit__io(void) { @@ -676,11 +752,6 @@ PyInit__io(void) state = get_io_state(m); state->initialized = 0; -#define ADD_TYPE(type) \ - if (PyModule_AddType(m, type) < 0) { \ - goto fail; \ - } - /* DEFAULT_BUFFER_SIZE */ if (PyModule_AddIntMacro(m, DEFAULT_BUFFER_SIZE) < 0) goto fail; @@ -702,57 +773,34 @@ PyInit__io(void) (PyObject *) PyExc_BlockingIOError) < 0) goto fail; - /* Concrete base types of the IO ABCs. - (the ABCs themselves are declared through inheritance in io.py) - */ - ADD_TYPE(&PyIOBase_Type); - ADD_TYPE(&PyRawIOBase_Type); - ADD_TYPE(&PyBufferedIOBase_Type); - ADD_TYPE(&PyTextIOBase_Type); - - /* Implementation of concrete IO objects. 
*/ - /* FileIO */ + // Set type base classes PyFileIO_Type.tp_base = &PyRawIOBase_Type; - ADD_TYPE(&PyFileIO_Type); - - /* BytesIO */ PyBytesIO_Type.tp_base = &PyBufferedIOBase_Type; - ADD_TYPE(&PyBytesIO_Type); - if (PyType_Ready(&_PyBytesIOBuffer_Type) < 0) - goto fail; - - /* StringIO */ PyStringIO_Type.tp_base = &PyTextIOBase_Type; - ADD_TYPE(&PyStringIO_Type); - #ifdef MS_WINDOWS - /* WindowsConsoleIO */ PyWindowsConsoleIO_Type.tp_base = &PyRawIOBase_Type; - ADD_TYPE(&PyWindowsConsoleIO_Type); #endif - - /* BufferedReader */ PyBufferedReader_Type.tp_base = &PyBufferedIOBase_Type; - ADD_TYPE(&PyBufferedReader_Type); - - /* BufferedWriter */ PyBufferedWriter_Type.tp_base = &PyBufferedIOBase_Type; - ADD_TYPE(&PyBufferedWriter_Type); - - /* BufferedRWPair */ PyBufferedRWPair_Type.tp_base = &PyBufferedIOBase_Type; - ADD_TYPE(&PyBufferedRWPair_Type); - - /* BufferedRandom */ PyBufferedRandom_Type.tp_base = &PyBufferedIOBase_Type; - ADD_TYPE(&PyBufferedRandom_Type); - - /* TextIOWrapper */ PyTextIOWrapper_Type.tp_base = &PyTextIOBase_Type; - ADD_TYPE(&PyTextIOWrapper_Type); - /* IncrementalNewlineDecoder */ - ADD_TYPE(&PyIncrementalNewlineDecoder_Type); + // Add types + for (size_t i=0; i < Py_ARRAY_LENGTH(static_types); i++) { + PyTypeObject *type = static_types[i]; + // Private type not exposed in the _io module + if (type == &_PyBytesIOBuffer_Type) { + if (PyType_Ready(type) < 0) { + goto fail; + } + } + else { + if (PyModule_AddType(m, type) < 0) { + goto fail; + } + } + } /* Interned strings */ #define ADD_INTERNED(name) \ @@ -785,6 +833,7 @@ PyInit__io(void) ADD_INTERNED(truncate) ADD_INTERNED(write) ADD_INTERNED(writable) +#undef ADD_INTERNED if (!_PyIO_str_nl && !(_PyIO_str_nl = PyUnicode_InternFromString("\n"))) diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 7fc9d3c94ce51..92c8ad079c5fb 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -29,6 +29,8 @@ #include "pycore_typeobject.h" // _PyTypes_InitTypes() #include "pycore_unicodeobject.h" // _PyUnicode_InitTypes() +extern void _PyIO_Fini(void); + #include // setlocale() #include // getenv() @@ -1702,6 +1704,10 @@ finalize_interp_clear(PyThreadState *tstate) /* Clear interpreter state and all thread states */ _PyInterpreterState_Clear(tstate); + if (is_main_interp) { + _PyIO_Fini(); + } + /* Clear all loghooks */ /* Both _PySys_Audit function and users still need PyObject, such as tuple. Call _PySys_ClearAuditHooks when PyObject available. */ From webhook-mailer at python.org Sat Jan 22 17:52:35 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 22:52:35 -0000 Subject: [Python-checkins] bpo-41682: Skip unstable test_asyncio sendfile test on Windows (GH-30801) Message-ID: https://github.com/python/cpython/commit/1ded8ed8e817b8f9dae1a0ef92d97983afbc844e commit: 1ded8ed8e817b8f9dae1a0ef92d97983afbc844e branch: main author: Nikita Sobolev committer: vstinner date: 2022-01-22T23:52:26+01:00 summary: bpo-41682: Skip unstable test_asyncio sendfile test on Windows (GH-30801) files: M Lib/test/test_asyncio/test_sendfile.py diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 57b56bba34100..c8bfa892c73fc 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -456,6 +456,8 @@ def test_sendfile_ssl_close_peer_after_receiving(self): # themselves). 
@unittest.skipIf(sys.platform.startswith('sunos'), "Doesn't work on Solaris") + @unittest.skipIf(sys.platform == "win32", + "It is flaky on Windows and needs to be fixed") # TODO: bpo-41682 def test_sendfile_close_peer_in_the_middle_of_receiving(self): srv_proto, cli_proto = self.prepare_sendfile(close_after=1024) with self.assertRaises(ConnectionError): From webhook-mailer at python.org Sat Jan 22 18:07:06 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 23:07:06 -0000 Subject: [Python-checkins] bpo-46417: Clear symtable identifiers at exit (GH-30809) Message-ID: https://github.com/python/cpython/commit/12f4ac3bc848244242d6b8a7ee158b985fd64744 commit: 12f4ac3bc848244242d6b8a7ee158b985fd64744 branch: main author: Victor Stinner committer: vstinner date: 2022-01-23T00:06:56+01:00 summary: bpo-46417: Clear symtable identifiers at exit (GH-30809) Add _PySymtable_Fini() function, called by finalize_interp_clear(). Update test_cmd_line.test_showrefcount() to tolerate negative reference count. files: M Include/internal/pycore_symtable.h M Lib/test/test_cmd_line.py M Python/pylifecycle.c M Python/symtable.c diff --git a/Include/internal/pycore_symtable.h b/Include/internal/pycore_symtable.h index 28935f4ed5501..4ecfab5585032 100644 --- a/Include/internal/pycore_symtable.h +++ b/Include/internal/pycore_symtable.h @@ -128,6 +128,8 @@ extern struct symtable* _Py_SymtableStringObjectFlags( int start, PyCompilerFlags *flags); +extern void _PySymtable_Fini(void); + #ifdef __cplusplus } #endif diff --git a/Lib/test/test_cmd_line.py b/Lib/test/test_cmd_line.py index 86ee27485c964..fa5f39ea5fa97 100644 --- a/Lib/test/test_cmd_line.py +++ b/Lib/test/test_cmd_line.py @@ -119,7 +119,10 @@ def run_python(*args): rc, out, err = run_python('-X', 'showrefcount', '-c', code) self.assertEqual(out.rstrip(), b"{'showrefcount': True}") if Py_DEBUG: - self.assertRegex(err, br'^\[\d+ refs, \d+ blocks\]') + # bpo-46417: Tolerate negative reference count which can occur + # because of bugs in C extensions. This test is only about checking + # the showrefcount feature. 
+ self.assertRegex(err, br'^\[-?\d+ refs, \d+ blocks\]') else: self.assertEqual(err, b'') diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 92c8ad079c5fb..9d10f94efa732 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -23,6 +23,7 @@ #include "pycore_runtime_init.h" // _PyRuntimeState_INIT #include "pycore_sliceobject.h" // _PySlice_Fini() #include "pycore_structseq.h" // _PyStructSequence_InitState() +#include "pycore_symtable.h" // _PySymtable_Fini() #include "pycore_sysmodule.h" // _PySys_ClearAuditHooks() #include "pycore_traceback.h" // _Py_DumpTracebackThreads() #include "pycore_tuple.h" // _PyTuple_InitTypes() @@ -1700,6 +1701,9 @@ finalize_interp_clear(PyThreadState *tstate) int is_main_interp = _Py_IsMainInterpreter(tstate->interp); _PyExc_ClearExceptionGroupType(tstate->interp); + if (is_main_interp) { + _PySymtable_Fini(); + } /* Clear interpreter state and all thread states */ _PyInterpreterState_Clear(tstate); diff --git a/Python/symtable.c b/Python/symtable.c index 01c6ec1318d63..e9bdff3eba109 100644 --- a/Python/symtable.c +++ b/Python/symtable.c @@ -1121,7 +1121,7 @@ static int symtable_add_def(struct symtable *st, PyObject *name, int flag, int lineno, int col_offset, int end_lineno, int end_col_offset) { - return symtable_add_def_helper(st, name, flag, st->st_cur, + return symtable_add_def_helper(st, name, flag, st->st_cur, lineno, col_offset, end_lineno, end_col_offset); } @@ -2134,7 +2134,7 @@ symtable_raise_if_annotation_block(struct symtable *st, const char *name, expr_t static int symtable_raise_if_comprehension_block(struct symtable *st, expr_ty e) { _Py_comprehension_ty type = st->st_cur->ste_comprehension; - PyErr_SetString(PyExc_SyntaxError, + PyErr_SetString(PyExc_SyntaxError, (type == ListComprehension) ? "'yield' inside list comprehension" : (type == SetComprehension) ? "'yield' inside set comprehension" : (type == DictComprehension) ? "'yield' inside dict comprehension" : @@ -2173,3 +2173,16 @@ _Py_SymtableStringObjectFlags(const char *str, PyObject *filename, _PyArena_Free(arena); return st; } + +void +_PySymtable_Fini(void) +{ + Py_CLEAR(top); + Py_CLEAR(lambda); + Py_CLEAR(genexpr); + Py_CLEAR(listcomp); + Py_CLEAR(setcomp); + Py_CLEAR(dictcomp); + Py_CLEAR(__class__); + Py_CLEAR(_annotation); +} From webhook-mailer at python.org Sat Jan 22 18:32:16 2022 From: webhook-mailer at python.org (vstinner) Date: Sat, 22 Jan 2022 23:32:16 -0000 Subject: [Python-checkins] bpo-46417: Fix _PyStaticType_Dealloc() (GH-30810) Message-ID: https://github.com/python/cpython/commit/a1444f43584af0f7a0af72aa06ba0a86ae5a87a2 commit: a1444f43584af0f7a0af72aa06ba0a86ae5a87a2 branch: main author: Victor Stinner committer: vstinner date: 2022-01-23T00:32:05+01:00 summary: bpo-46417: Fix _PyStaticType_Dealloc() (GH-30810) _PyStaticType_Dealloc() now only calls PyObject_ClearWeakRefs() if the call is not going to fail. 
files: M Objects/typeobject.c diff --git a/Objects/typeobject.c b/Objects/typeobject.c index cc4612f9308d0..452759334f456 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -4071,8 +4071,6 @@ type_dealloc_common(PyTypeObject *type) remove_all_subclasses(type, type->tp_bases); PyErr_Restore(tp, val, tb); } - - PyObject_ClearWeakRefs((PyObject *)type); } @@ -4094,6 +4092,11 @@ _PyStaticType_Dealloc(PyTypeObject *type) Py_CLEAR(type->tp_cache); // type->tp_subclasses is NULL + // PyObject_ClearWeakRefs() raises an exception if Py_REFCNT() != 0 + if (Py_REFCNT(type) == 0) { + PyObject_ClearWeakRefs((PyObject *)type); + } + type->tp_flags &= ~Py_TPFLAGS_READY; } @@ -4101,12 +4104,17 @@ _PyStaticType_Dealloc(PyTypeObject *type) static void type_dealloc(PyTypeObject *type) { - /* Assert this is a heap-allocated type object */ + // Assert this is a heap-allocated type object _PyObject_ASSERT((PyObject *)type, type->tp_flags & Py_TPFLAGS_HEAPTYPE); + _PyObject_GC_UNTRACK(type); type_dealloc_common(type); + // PyObject_ClearWeakRefs() raises an exception if Py_REFCNT() != 0 + assert(Py_REFCNT(type) == 0); + PyObject_ClearWeakRefs((PyObject *)type); + Py_XDECREF(type->tp_base); Py_XDECREF(type->tp_dict); Py_XDECREF(type->tp_bases); From webhook-mailer at python.org Sat Jan 22 19:20:53 2022 From: webhook-mailer at python.org (miss-islington) Date: Sun, 23 Jan 2022 00:20:53 -0000 Subject: [Python-checkins] bpo-41682: Skip unstable test_asyncio sendfile test on Windows (GH-30801) Message-ID: https://github.com/python/cpython/commit/ba932d90244252f6d4073263f25989507a183f79 commit: ba932d90244252f6d4073263f25989507a183f79 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-22T16:20:49-08:00 summary: bpo-41682: Skip unstable test_asyncio sendfile test on Windows (GH-30801) (cherry picked from commit 1ded8ed8e817b8f9dae1a0ef92d97983afbc844e) Co-authored-by: Nikita Sobolev files: M Lib/test/test_asyncio/test_sendfile.py diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 28f8de35edd96..0ba966cb5ccf4 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -451,6 +451,8 @@ def test_sendfile_ssl_close_peer_after_receiving(self): # themselves). @unittest.skipIf(sys.platform.startswith('sunos'), "Doesn't work on Solaris") + @unittest.skipIf(sys.platform == "win32", + "It is flaky on Windows and needs to be fixed") # TODO: bpo-41682 def test_sendfile_close_peer_in_the_middle_of_receiving(self): srv_proto, cli_proto = self.prepare_sendfile(close_after=1024) with self.assertRaises(ConnectionError): From webhook-mailer at python.org Sat Jan 22 19:47:30 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 23 Jan 2022 00:47:30 -0000 Subject: [Python-checkins] Minor code rearrangement to group related methods together. (GH-30813) Message-ID: https://github.com/python/cpython/commit/bcacab47bf9e8bee58f6f248638e229ae8ea7992 commit: bcacab47bf9e8bee58f6f248638e229ae8ea7992 branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-22T18:47:22-06:00 summary: Minor code rearrangement to group related methods together. 
(GH-30813) * Make example more focused with math.prod() * Move comparison tests to the multiset operations section files: M Lib/collections/__init__.py diff --git a/Lib/collections/__init__.py b/Lib/collections/__init__.py index d989d85d6d829..fa8b30985a435 100644 --- a/Lib/collections/__init__.py +++ b/Lib/collections/__init__.py @@ -617,11 +617,9 @@ def elements(self): ['A', 'A', 'B', 'B', 'C', 'C'] # Knuth's example for prime factors of 1836: 2**2 * 3**3 * 17**1 + >>> import math >>> prime_factors = Counter({2: 2, 3: 3, 17: 1}) - >>> product = 1 - >>> for factor in prime_factors.elements(): # loop over factors - ... product *= factor # and multiply them - >>> product + >>> math.prod(prime_factors.elements()) 1836 Note, if an element's count has been set to zero or is a negative @@ -718,42 +716,6 @@ def __delitem__(self, elem): if elem in self: super().__delitem__(elem) - def __eq__(self, other): - 'True if all counts agree. Missing counts are treated as zero.' - if not isinstance(other, Counter): - return NotImplemented - return all(self[e] == other[e] for c in (self, other) for e in c) - - def __ne__(self, other): - 'True if any counts disagree. Missing counts are treated as zero.' - if not isinstance(other, Counter): - return NotImplemented - return not self == other - - def __le__(self, other): - 'True if all counts in self are a subset of those in other.' - if not isinstance(other, Counter): - return NotImplemented - return all(self[e] <= other[e] for c in (self, other) for e in c) - - def __lt__(self, other): - 'True if all counts in self are a proper subset of those in other.' - if not isinstance(other, Counter): - return NotImplemented - return self <= other and self != other - - def __ge__(self, other): - 'True if all counts in self are a superset of those in other.' - if not isinstance(other, Counter): - return NotImplemented - return all(self[e] >= other[e] for c in (self, other) for e in c) - - def __gt__(self, other): - 'True if all counts in self are a proper superset of those in other.' - if not isinstance(other, Counter): - return NotImplemented - return self >= other and self != other - def __repr__(self): if not self: return f'{self.__class__.__name__}()' @@ -795,6 +757,42 @@ def __repr__(self): # (cp >= cq) == (sp >= sq) # (cp > cq) == (sp > sq) + def __eq__(self, other): + 'True if all counts agree. Missing counts are treated as zero.' + if not isinstance(other, Counter): + return NotImplemented + return all(self[e] == other[e] for c in (self, other) for e in c) + + def __ne__(self, other): + 'True if any counts disagree. Missing counts are treated as zero.' + if not isinstance(other, Counter): + return NotImplemented + return not self == other + + def __le__(self, other): + 'True if all counts in self are a subset of those in other.' + if not isinstance(other, Counter): + return NotImplemented + return all(self[e] <= other[e] for c in (self, other) for e in c) + + def __lt__(self, other): + 'True if all counts in self are a proper subset of those in other.' + if not isinstance(other, Counter): + return NotImplemented + return self <= other and self != other + + def __ge__(self, other): + 'True if all counts in self are a superset of those in other.' + if not isinstance(other, Counter): + return NotImplemented + return all(self[e] >= other[e] for c in (self, other) for e in c) + + def __gt__(self, other): + 'True if all counts in self are a proper superset of those in other.' 
+ if not isinstance(other, Counter): + return NotImplemented + return self >= other and self != other + def __add__(self, other): '''Add counts from two counters. From webhook-mailer at python.org Sat Jan 22 19:58:25 2022 From: webhook-mailer at python.org (vstinner) Date: Sun, 23 Jan 2022 00:58:25 -0000 Subject: [Python-checkins] bpo-41682: Skip unstable test_asyncio sendfile test on Windows (GH-30801) (GH-30812) Message-ID: https://github.com/python/cpython/commit/486b4d3d8e1a5699a2854e310c58fe12b220b7a9 commit: 486b4d3d8e1a5699a2854e310c58fe12b220b7a9 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-23T01:58:16+01:00 summary: bpo-41682: Skip unstable test_asyncio sendfile test on Windows (GH-30801) (GH-30812) (cherry picked from commit 1ded8ed8e817b8f9dae1a0ef92d97983afbc844e) Co-authored-by: Nikita Sobolev Co-authored-by: Nikita Sobolev files: M Lib/test/test_asyncio/test_sendfile.py diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 57b56bba34100..c8bfa892c73fc 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -456,6 +456,8 @@ def test_sendfile_ssl_close_peer_after_receiving(self): # themselves). @unittest.skipIf(sys.platform.startswith('sunos'), "Doesn't work on Solaris") + @unittest.skipIf(sys.platform == "win32", + "It is flaky on Windows and needs to be fixed") # TODO: bpo-41682 def test_sendfile_close_peer_in_the_middle_of_receiving(self): srv_proto, cli_proto = self.prepare_sendfile(close_after=1024) with self.assertRaises(ConnectionError): From webhook-mailer at python.org Sat Jan 22 19:59:32 2022 From: webhook-mailer at python.org (vstinner) Date: Sun, 23 Jan 2022 00:59:32 -0000 Subject: [Python-checkins] Document optional 'task'/'asyncgen' fields in call_exception_handler (GH-21735) (GH-30727) Message-ID: https://github.com/python/cpython/commit/d807bf2ee9e4774c5a95dbbef3bdd722d1847e23 commit: d807bf2ee9e4774c5a95dbbef3bdd722d1847e23 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-23T01:59:23+01:00 summary: Document optional 'task'/'asyncgen' fields in call_exception_handler (GH-21735) (GH-30727) (cherry picked from commit a1652da2c89bb21f3fdc71780b63b1de2dff11f0) Co-authored-by: Shane Harvey Co-authored-by: Shane Harvey files: M Doc/library/asyncio-eventloop.rst diff --git a/Doc/library/asyncio-eventloop.rst b/Doc/library/asyncio-eventloop.rst index 140851ce2e448..2a6d82fa057b0 100644 --- a/Doc/library/asyncio-eventloop.rst +++ b/Doc/library/asyncio-eventloop.rst @@ -1193,10 +1193,13 @@ Allows customizing how exceptions are handled in the event loop. * 'message': Error message; * 'exception' (optional): Exception object; * 'future' (optional): :class:`asyncio.Future` instance; + * 'task' (optional): :class:`asyncio.Task` instance; * 'handle' (optional): :class:`asyncio.Handle` instance; * 'protocol' (optional): :ref:`Protocol ` instance; * 'transport' (optional): :ref:`Transport ` instance; - * 'socket' (optional): :class:`socket.socket` instance. + * 'socket' (optional): :class:`socket.socket` instance; + * 'asyncgen' (optional): Asynchronous generator that caused + the exception. .. 
note:: From webhook-mailer at python.org Sat Jan 22 19:59:45 2022 From: webhook-mailer at python.org (vstinner) Date: Sun, 23 Jan 2022 00:59:45 -0000 Subject: [Python-checkins] Update documentation in datetime module strftime-and-strptime-behavior fix typo in '%W' format code description (GH-30232) (GH-30703) Message-ID: https://github.com/python/cpython/commit/d0852c447aae650b665aaad61d914a1dc4d7ad96 commit: d0852c447aae650b665aaad61d914a1dc4d7ad96 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-23T01:59:41+01:00 summary: Update documentation in datetime module strftime-and-strptime-behavior fix typo in '%W' format code description (GH-30232) (GH-30703) A small change to the documentation of datetime module , in the format codes section of stftime and strptime. Changed the description of format code '%W' from 'as a decimal number' to 'a zero padded decimal number' so it's in line with the example having leading zeros. Similar to the format code '%U' above. Automerge-Triggered-By: GH:pganssle (cherry picked from commit d45cd2d20770f72a000ba6dfa9ac88dd49423c27) Co-authored-by: Evan Co-authored-by: Evan files: M Doc/library/datetime.rst diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index 217cdf222b89b..f447b7bc9491e 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -2375,7 +2375,7 @@ requires, and these work on all platforms with a standard C implementation. +-----------+--------------------------------+------------------------+-------+ | ``%U`` | Week number of the year | 00, 01, ..., 53 | \(7), | | | (Sunday as the first day of | | \(9) | -| | the week) as a zero padded | | | +| | the week) as a zero-padded | | | | | decimal number. All days in a | | | | | new year preceding the first | | | | | Sunday are considered to be in | | | @@ -2383,10 +2383,10 @@ requires, and these work on all platforms with a standard C implementation. +-----------+--------------------------------+------------------------+-------+ | ``%W`` | Week number of the year | 00, 01, ..., 53 | \(7), | | | (Monday as the first day of | | \(9) | -| | the week) as a decimal number. | | | -| | All days in a new year | | | -| | preceding the first Monday | | | -| | are considered to be in | | | +| | the week) as a zero-padded | | | +| | decimal number. All days in a | | | +| | new year preceding the first | | | +| | Monday are considered to be in | | | | | week 0. 
| | | +-----------+--------------------------------+------------------------+-------+ | ``%c`` | Locale's appropriate date and || Tue Aug 16 21:30:00 | \(1) | From webhook-mailer at python.org Sat Jan 22 20:00:15 2022 From: webhook-mailer at python.org (vstinner) Date: Sun, 23 Jan 2022 01:00:15 -0000 Subject: [Python-checkins] bpo-46266: Add calendar day of week constants to __all__ (GH-30412) (GH-30424) Message-ID: https://github.com/python/cpython/commit/f66ef3eab62c6d262ddbc8ab16fb43c8921ad33a commit: f66ef3eab62c6d262ddbc8ab16fb43c8921ad33a branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-23T02:00:11+01:00 summary: bpo-46266: Add calendar day of week constants to __all__ (GH-30412) (GH-30424) (cherry picked from commit e5894ca8fd05e6a6df1033025b9093b68baa718d) Co-authored-by: Nikita Sobolev Co-authored-by: Nikita Sobolev files: A Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst M Doc/library/calendar.rst M Lib/calendar.py M Lib/test/test_calendar.py diff --git a/Doc/library/calendar.rst b/Doc/library/calendar.rst index c3c04db853ed2..f641760d1bd1a 100644 --- a/Doc/library/calendar.rst +++ b/Doc/library/calendar.rst @@ -31,7 +31,7 @@ interpreted as prescribed by the ISO 8601 standard. Year 0 is 1 BC, year -1 is .. class:: Calendar(firstweekday=0) Creates a :class:`Calendar` object. *firstweekday* is an integer specifying the - first day of the week. ``0`` is Monday (the default), ``6`` is Sunday. + first day of the week. :const:`MONDAY` is ``0`` (the default), :const:`SUNDAY` is ``6``. A :class:`Calendar` object provides several methods that can be used for preparing the calendar data for formatting. This class doesn't do any formatting @@ -409,6 +409,15 @@ The :mod:`calendar` module exports the following data attributes: locale. This follows normal convention of January being month number 1, so it has a length of 13 and ``month_abbr[0]`` is the empty string. +.. data:: MONDAY + TUESDAY + WEDNESDAY + THURSDAY + FRIDAY + SATURDAY + SUNDAY + + Aliases for day numbers, where ``MONDAY`` is ``0`` and ``SUNDAY`` is ``6``. .. 
seealso:: diff --git a/Lib/calendar.py b/Lib/calendar.py index 7311a0173729e..cbea9ec99f550 100644 --- a/Lib/calendar.py +++ b/Lib/calendar.py @@ -15,7 +15,9 @@ "monthcalendar", "prmonth", "month", "prcal", "calendar", "timegm", "month_name", "month_abbr", "day_name", "day_abbr", "Calendar", "TextCalendar", "HTMLCalendar", "LocaleTextCalendar", - "LocaleHTMLCalendar", "weekheader"] + "LocaleHTMLCalendar", "weekheader", + "MONDAY", "TUESDAY", "WEDNESDAY", "THURSDAY", "FRIDAY", + "SATURDAY", "SUNDAY"] # Exception raised for bad input (with string parameter for details) error = ValueError diff --git a/Lib/test/test_calendar.py b/Lib/test/test_calendar.py index c641e8c418318..39094ad6fd9ab 100644 --- a/Lib/test/test_calendar.py +++ b/Lib/test/test_calendar.py @@ -935,8 +935,7 @@ def test_html_output_year_css(self): class MiscTestCase(unittest.TestCase): def test__all__(self): not_exported = { - 'mdays', 'January', 'February', 'EPOCH', 'MONDAY', 'TUESDAY', - 'WEDNESDAY', 'THURSDAY', 'FRIDAY', 'SATURDAY', 'SUNDAY', + 'mdays', 'January', 'February', 'EPOCH', 'different_locale', 'c', 'prweek', 'week', 'format', 'formatstring', 'main', 'monthlen', 'prevmonth', 'nextmonth'} support.check__all__(self, calendar, not_exported=not_exported) diff --git a/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst b/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst new file mode 100644 index 0000000000000..354dcb0106595 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-05-12-48-18.bpo-46266.ACQCgX.rst @@ -0,0 +1,4 @@ +Improve day constants in :mod:`calendar`. + +Now all constants (`MONDAY` ... `SUNDAY`) are documented, tested, and added +to ``__all__``. From webhook-mailer at python.org Sat Jan 22 20:21:00 2022 From: webhook-mailer at python.org (vstinner) Date: Sun, 23 Jan 2022 01:21:00 -0000 Subject: [Python-checkins] bpo-46417: _PyList_Fini() clears indexerr (GH-30815) Message-ID: https://github.com/python/cpython/commit/976dec9b3b35fddbaa893c99297e0c54731451b5 commit: 976dec9b3b35fddbaa893c99297e0c54731451b5 branch: main author: Victor Stinner committer: vstinner date: 2022-01-23T02:20:44+01:00 summary: bpo-46417: _PyList_Fini() clears indexerr (GH-30815) _PyList_Fini() now clears the 'indexerr' error message. 
files: M Objects/listobject.c diff --git a/Objects/listobject.c b/Objects/listobject.c index 0ce58b240327f..752d9e00bb7bf 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -15,6 +15,8 @@ class list "PyListObject *" "&PyList_Type" #include "clinic/listobject.c.h" +static PyObject *indexerr = NULL; + #if PyList_MAXFREELIST > 0 static struct _Py_list_state * get_list_state(void) @@ -123,6 +125,10 @@ _PyList_Fini(PyInterpreterState *interp) struct _Py_list_state *state = &interp->list; state->numfree = -1; #endif + + if (_Py_IsMainInterpreter(interp)) { + Py_CLEAR(indexerr); + } } /* Print summary info about the state of the optimized allocator */ @@ -224,8 +230,6 @@ valid_index(Py_ssize_t i, Py_ssize_t limit) return (size_t) i < (size_t) limit; } -static PyObject *indexerr = NULL; - PyObject * PyList_GetItem(PyObject *op, Py_ssize_t i) { From webhook-mailer at python.org Sat Jan 22 21:27:58 2022 From: webhook-mailer at python.org (ethanfurman) Date: Sun, 23 Jan 2022 02:27:58 -0000 Subject: [Python-checkins] bpo-46477: [Enum] ensure Flag subclasses have correct bitwise methods (GH-30816) Message-ID: https://github.com/python/cpython/commit/353e3b2820bed38da16140276786eef9ba33d3bd commit: 353e3b2820bed38da16140276786eef9ba33d3bd branch: main author: Ethan Furman committer: ethanfurman date: 2022-01-22T18:27:52-08:00 summary: bpo-46477: [Enum] ensure Flag subclasses have correct bitwise methods (GH-30816) files: M Lib/enum.py M Lib/test/test_enum.py diff --git a/Lib/enum.py b/Lib/enum.py index b510467731293..85245c95f9a9c 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -618,6 +618,18 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k if name not in classdict: setattr(enum_class, name, getattr(first_enum, name)) # + # for Flag, add __or__, __and__, __xor__, and __invert__ + if Flag is not None and issubclass(enum_class, Flag): + for name in ( + '__or__', '__and__', '__xor__', + '__ror__', '__rand__', '__rxor__', + '__invert__' + ): + if name not in classdict: + enum_method = getattr(Flag, name) + setattr(enum_class, name, enum_method) + classdict[name] = enum_method + # # replace any other __new__ with our own (as long as Enum is not None, # anyway) -- again, this is to support pickle if Enum is not None: @@ -1466,44 +1478,10 @@ def __str__(self): def __bool__(self): return bool(self._value_) - def __or__(self, other): - if not isinstance(other, self.__class__): - return NotImplemented - return self.__class__(self._value_ | other._value_) - - def __and__(self, other): - if not isinstance(other, self.__class__): - return NotImplemented - return self.__class__(self._value_ & other._value_) - - def __xor__(self, other): - if not isinstance(other, self.__class__): - return NotImplemented - return self.__class__(self._value_ ^ other._value_) - - def __invert__(self): - if self._inverted_ is None: - if self._boundary_ is KEEP: - # use all bits - self._inverted_ = self.__class__(~self._value_) - else: - # calculate flags not in this member - self._inverted_ = self.__class__(self._flag_mask_ ^ self._value_) - if isinstance(self._inverted_, self.__class__): - self._inverted_._inverted_ = self - return self._inverted_ - - -class IntFlag(int, ReprEnum, Flag, boundary=EJECT): - """ - Support for integer-based Flags - """ - - def __or__(self, other): if isinstance(other, self.__class__): other = other._value_ - elif isinstance(other, int): + elif self._member_type_ is not object and isinstance(other, self._member_type_): other = other else: return NotImplemented @@ 
-1513,7 +1491,7 @@ def __or__(self, other): def __and__(self, other): if isinstance(other, self.__class__): other = other._value_ - elif isinstance(other, int): + elif self._member_type_ is not object and isinstance(other, self._member_type_): other = other else: return NotImplemented @@ -1523,17 +1501,34 @@ def __and__(self, other): def __xor__(self, other): if isinstance(other, self.__class__): other = other._value_ - elif isinstance(other, int): + elif self._member_type_ is not object and isinstance(other, self._member_type_): other = other else: return NotImplemented value = self._value_ return self.__class__(value ^ other) - __ror__ = __or__ + def __invert__(self): + if self._inverted_ is None: + if self._boundary_ is KEEP: + # use all bits + self._inverted_ = self.__class__(~self._value_) + else: + # calculate flags not in this member + self._inverted_ = self.__class__(self._flag_mask_ ^ self._value_) + if isinstance(self._inverted_, self.__class__): + self._inverted_._inverted_ = self + return self._inverted_ + __rand__ = __and__ + __ror__ = __or__ __rxor__ = __xor__ - __invert__ = Flag.__invert__ + + +class IntFlag(int, ReprEnum, Flag, boundary=EJECT): + """ + Support for integer-based Flags + """ def _high_bit(value): @@ -1662,6 +1657,13 @@ def convert_class(cls): body['_flag_mask_'] = None body['_all_bits_'] = None body['_inverted_'] = None + body['__or__'] = Flag.__or__ + body['__xor__'] = Flag.__xor__ + body['__and__'] = Flag.__and__ + body['__ror__'] = Flag.__ror__ + body['__rxor__'] = Flag.__rxor__ + body['__rand__'] = Flag.__rand__ + body['__invert__'] = Flag.__invert__ for name, obj in cls.__dict__.items(): if name in ('__dict__', '__weakref__'): continue diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index d7ce8add78715..b8a7914355c53 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -2496,6 +2496,13 @@ def __new__(cls, val): self.assertEqual(Some.x.value, 1) self.assertEqual(Some.y.value, 2) + def test_custom_flag_bitwise(self): + class MyIntFlag(int, Flag): + ONE = 1 + TWO = 2 + FOUR = 4 + self.assertTrue(isinstance(MyIntFlag.ONE | MyIntFlag.TWO, MyIntFlag), MyIntFlag.ONE | MyIntFlag.TWO) + self.assertTrue(isinstance(MyIntFlag.ONE | 2, MyIntFlag)) class TestOrder(unittest.TestCase): "test usage of the `_order_` attribute" From webhook-mailer at python.org Sat Jan 22 21:38:46 2022 From: webhook-mailer at python.org (jaraco) Date: Sun, 23 Jan 2022 02:38:46 -0000 Subject: [Python-checkins] bpo-46425: Partially revert "bpo-46425: fix direct invocation of `test_importlib` (GH-30682)" (GH-30799) Message-ID: https://github.com/python/cpython/commit/d888ff5381594641126065e78dc9210dae4436a4 commit: d888ff5381594641126065e78dc9210dae4436a4 branch: main author: Jason R. Coombs committer: jaraco date: 2022-01-22T21:38:26-05:00 summary: bpo-46425: Partially revert "bpo-46425: fix direct invocation of `test_importlib` (GH-30682)" (GH-30799) This reverts commit 57316c52bae5d6420f5067f3891ec328deb97305 for files pertaining to importlib.metadata and importlib.resources. 
files: M Lib/test/test_importlib/test_compatibilty_files.py M Lib/test/test_importlib/test_contents.py M Lib/test/test_importlib/test_files.py M Lib/test/test_importlib/test_main.py M Lib/test/test_importlib/test_metadata_api.py M Lib/test/test_importlib/test_open.py M Lib/test/test_importlib/test_path.py M Lib/test/test_importlib/test_read.py M Lib/test/test_importlib/test_resource.py M Lib/test/test_importlib/test_zip.py diff --git a/Lib/test/test_importlib/test_compatibilty_files.py b/Lib/test/test_importlib/test_compatibilty_files.py index 18cbdee6ce475..9a823f2d93058 100644 --- a/Lib/test/test_importlib/test_compatibilty_files.py +++ b/Lib/test/test_importlib/test_compatibilty_files.py @@ -8,7 +8,7 @@ wrap_spec, ) -from test.test_importlib.resources import util +from .resources import util class CompatibilityFilesTests(unittest.TestCase): @@ -100,7 +100,3 @@ def files(self): def test_spec_path_joinpath(self): self.assertIsInstance(self.files / 'a', CompatibilityFiles.OrphanPath) - - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/test/test_importlib/test_contents.py b/Lib/test/test_importlib/test_contents.py index a5b6538a2fc79..3323bf5b5cf56 100644 --- a/Lib/test/test_importlib/test_contents.py +++ b/Lib/test/test_importlib/test_contents.py @@ -1,8 +1,8 @@ import unittest from importlib import resources -from test.test_importlib import data01 -from test.test_importlib.resources import util +from . import data01 +from .resources import util class ContentsTests: @@ -38,10 +38,6 @@ class ContentsNamespaceTests(ContentsTests, unittest.TestCase): } def setUp(self): - from test.test_importlib import namespacedata01 + from . import namespacedata01 self.data = namespacedata01 - - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/test/test_importlib/test_files.py b/Lib/test/test_importlib/test_files.py index 3f28b55509bc1..b9170d83bea91 100644 --- a/Lib/test/test_importlib/test_files.py +++ b/Lib/test/test_importlib/test_files.py @@ -3,8 +3,8 @@ from importlib import resources from importlib.abc import Traversable -from test.test_importlib import data01 -from test.test_importlib.resources import util +from . import data01 +from .resources import util class FilesTests: @@ -37,7 +37,7 @@ class OpenZipTests(FilesTests, util.ZipSetup, unittest.TestCase): class OpenNamespaceTests(FilesTests, unittest.TestCase): def setUp(self): - from test.test_importlib import namespacedata01 + from . import namespacedata01 self.data = namespacedata01 diff --git a/Lib/test/test_importlib/test_main.py b/Lib/test/test_importlib/test_main.py index 77e3dd7e08472..2e120f7ac50ac 100644 --- a/Lib/test/test_importlib/test_main.py +++ b/Lib/test/test_importlib/test_main.py @@ -9,9 +9,9 @@ try: import pyfakefs.fake_filesystem_unittest as ffs except ImportError: - from test.test_importlib.stubs import fake_filesystem_unittest as ffs + from .stubs import fake_filesystem_unittest as ffs -from test.test_importlib import fixtures +from . 
import fixtures from importlib.metadata import ( Distribution, EntryPoint, @@ -315,7 +315,3 @@ def test_packages_distributions_neither_toplevel_nor_files(self): prefix=self.site_dir, ) packages_distributions() - - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/test/test_importlib/test_metadata_api.py b/Lib/test/test_importlib/test_metadata_api.py index 24d46c3d28013..e16773a7e87ef 100644 --- a/Lib/test/test_importlib/test_metadata_api.py +++ b/Lib/test/test_importlib/test_metadata_api.py @@ -5,7 +5,7 @@ import importlib import contextlib -from test.test_importlib import fixtures +from . import fixtures from importlib.metadata import ( Distribution, PackageNotFoundError, @@ -313,7 +313,3 @@ class InvalidateCache(unittest.TestCase): def test_invalidate_cache(self): # No externally observable behavior, but ensures test coverage... importlib.invalidate_caches() - - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/test/test_importlib/test_open.py b/Lib/test/test_importlib/test_open.py index ab390269e08f2..df75e343d2c5b 100644 --- a/Lib/test/test_importlib/test_open.py +++ b/Lib/test/test_importlib/test_open.py @@ -1,8 +1,8 @@ import unittest from importlib import resources -from test.test_importlib import data01 -from test.test_importlib.resources import util +from . import data01 +from .resources import util class CommonBinaryTests(util.CommonTests, unittest.TestCase): @@ -68,7 +68,7 @@ def setUp(self): class OpenDiskNamespaceTests(OpenTests, unittest.TestCase): def setUp(self): - from test.test_importlib import namespacedata01 + from . import namespacedata01 self.data = namespacedata01 diff --git a/Lib/test/test_importlib/test_path.py b/Lib/test/test_importlib/test_path.py index 66dc0b215ad9f..6fc41f301d1ca 100644 --- a/Lib/test/test_importlib/test_path.py +++ b/Lib/test/test_importlib/test_path.py @@ -2,8 +2,8 @@ import unittest from importlib import resources -from test.test_importlib import data01 -from test.test_importlib.resources import util +from . import data01 +from .resources import util class CommonTests(util.CommonTests, unittest.TestCase): diff --git a/Lib/test/test_importlib/test_read.py b/Lib/test/test_importlib/test_read.py index 7e907e4c8c59c..ebd72267776d9 100644 --- a/Lib/test/test_importlib/test_read.py +++ b/Lib/test/test_importlib/test_read.py @@ -1,8 +1,8 @@ import unittest from importlib import import_module, resources -from test.test_importlib import data01 -from test.test_importlib.resources import util +from . import data01 +from .resources import util class CommonBinaryTests(util.CommonTests, unittest.TestCase): @@ -66,7 +66,7 @@ def test_read_submodule_resource_by_name(self): class ReadNamespaceTests(ReadTests, unittest.TestCase): def setUp(self): - from test.test_importlib import namespacedata01 + from . import namespacedata01 self.data = namespacedata01 diff --git a/Lib/test/test_importlib/test_resource.py b/Lib/test/test_importlib/test_resource.py index 825d1b0eb054e..834b8bd8a2818 100644 --- a/Lib/test/test_importlib/test_resource.py +++ b/Lib/test/test_importlib/test_resource.py @@ -3,8 +3,9 @@ import uuid import pathlib -from test.test_importlib import data01, zipdata01, zipdata02 -from test.test_importlib.resources import util +from . import data01 +from . 
import zipdata01, zipdata02 +from .resources import util from importlib import resources, import_module from test.support import import_helper from test.support.os_helper import unlink diff --git a/Lib/test/test_importlib/test_zip.py b/Lib/test/test_importlib/test_zip.py index a9f5c68ac60d7..276f6288c9159 100644 --- a/Lib/test/test_importlib/test_zip.py +++ b/Lib/test/test_importlib/test_zip.py @@ -1,7 +1,7 @@ import sys import unittest -from test.test_importlib import fixtures +from . import fixtures from importlib.metadata import ( PackageNotFoundError, distribution, @@ -60,6 +60,3 @@ def test_files(self): def test_normalized_name(self): dist = distribution('example') assert dist._normalized_name == 'example' - -if __name__ == '__main__': - unittest.main() From webhook-mailer at python.org Sat Jan 22 21:39:04 2022 From: webhook-mailer at python.org (jaraco) Date: Sun, 23 Jan 2022 02:39:04 -0000 Subject: [Python-checkins] bpo-46474: Apply changes from importlib_metadata 4.10.0 (GH-30802) Message-ID: https://github.com/python/cpython/commit/443dec6c9a104386ee90165d32fb28d0c5d29043 commit: 443dec6c9a104386ee90165d32fb28d0c5d29043 branch: main author: Jason R. Coombs committer: jaraco date: 2022-01-22T21:39:00-05:00 summary: bpo-46474: Apply changes from importlib_metadata 4.10.0 (GH-30802) files: A Misc/NEWS.d/next/Library/2022-01-22-14-45-46.bpo-46474.2DUC62.rst M Lib/importlib/metadata/__init__.py M Lib/test/test_importlib/fixtures.py diff --git a/Lib/importlib/metadata/__init__.py b/Lib/importlib/metadata/__init__.py index d44541fcbfbf4..5ef6d9dc4893d 100644 --- a/Lib/importlib/metadata/__init__.py +++ b/Lib/importlib/metadata/__init__.py @@ -571,18 +571,6 @@ def _discover_resolvers(): ) return filter(None, declared) - @classmethod - def _local(cls, root='.'): - from pep517 import build, meta - - system = build.compat_system(root) - builder = functools.partial( - meta.build, - source_dir=root, - system=system, - ) - return PathDistribution(zipfile.Path(meta.build_as_zip(builder))) - @property def metadata(self) -> _meta.PackageMetadata: """Return the parsed metadata for this Distribution. diff --git a/Lib/test/test_importlib/fixtures.py b/Lib/test/test_importlib/fixtures.py index d7ed4e9d56ff5..803d3738d263f 100644 --- a/Lib/test/test_importlib/fixtures.py +++ b/Lib/test/test_importlib/fixtures.py @@ -12,7 +12,7 @@ from typing import Dict, Union try: - from importlib import resources + from importlib import resources # type: ignore getattr(resources, 'files') getattr(resources, 'as_file') @@ -232,21 +232,6 @@ def setUp(self): build_files(EggInfoFile.files, prefix=self.site_dir) -class LocalPackage: - files: FilesDef = { - "setup.py": """ - import setuptools - setuptools.setup(name="local-pkg", version="2.0.1") - """, - } - - def setUp(self): - self.fixtures = contextlib.ExitStack() - self.addCleanup(self.fixtures.close) - self.fixtures.enter_context(tempdir_as_cwd()) - build_files(self.files) - - def build_files(file_defs, prefix=pathlib.Path()): """Build a set of files/directories, as described by the diff --git a/Misc/NEWS.d/next/Library/2022-01-22-14-45-46.bpo-46474.2DUC62.rst b/Misc/NEWS.d/next/Library/2022-01-22-14-45-46.bpo-46474.2DUC62.rst new file mode 100644 index 0000000000000..a5eafdf30f148 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-14-45-46.bpo-46474.2DUC62.rst @@ -0,0 +1,2 @@ +Removed private method from ``importlib.metadata.Path``. Sync with +importlib_metadata 4.10.0. 
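For context: the change above removes only the private ``Distribution._local()`` constructor; the public lookup API of ``importlib.metadata`` is unchanged. A minimal sketch of that public API follows (the project name "pip" is only an illustrative assumption; substitute any distribution installed in the current environment):

    from importlib import metadata

    # Look up an installed distribution by name.
    dist = metadata.distribution("pip")
    print(dist.metadata["Name"])                   # project name from its metadata
    print(metadata.version("pip"))                 # installed version string
    print([ep.name for ep in dist.entry_points])   # entry points declared by the project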
From webhook-mailer at python.org Sat Jan 22 21:53:03 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 23 Jan 2022 02:53:03 -0000 Subject: [Python-checkins] This localization technique is no longer cost effective. (GH-30818) Message-ID: https://github.com/python/cpython/commit/7ad52d174a12951d260aecfcdfe0fcfd2437a66b commit: 7ad52d174a12951d260aecfcdfe0fcfd2437a66b branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-22T20:52:55-06:00 summary: This localization technique is no longer cost effective. (GH-30818) files: M Lib/functools.py diff --git a/Lib/functools.py b/Lib/functools.py index 91b678c226966..cd5666dfa71fd 100644 --- a/Lib/functools.py +++ b/Lib/functools.py @@ -86,84 +86,84 @@ def wraps(wrapped, # infinite recursion that could occur when the operator dispatch logic # detects a NotImplemented result and then calls a reflected method. -def _gt_from_lt(self, other, NotImplemented=NotImplemented): +def _gt_from_lt(self, other): 'Return a > b. Computed by @total_ordering from (not a < b) and (a != b).' op_result = type(self).__lt__(self, other) if op_result is NotImplemented: return op_result return not op_result and self != other -def _le_from_lt(self, other, NotImplemented=NotImplemented): +def _le_from_lt(self, other): 'Return a <= b. Computed by @total_ordering from (a < b) or (a == b).' op_result = type(self).__lt__(self, other) if op_result is NotImplemented: return op_result return op_result or self == other -def _ge_from_lt(self, other, NotImplemented=NotImplemented): +def _ge_from_lt(self, other): 'Return a >= b. Computed by @total_ordering from (not a < b).' op_result = type(self).__lt__(self, other) if op_result is NotImplemented: return op_result return not op_result -def _ge_from_le(self, other, NotImplemented=NotImplemented): +def _ge_from_le(self, other): 'Return a >= b. Computed by @total_ordering from (not a <= b) or (a == b).' op_result = type(self).__le__(self, other) if op_result is NotImplemented: return op_result return not op_result or self == other -def _lt_from_le(self, other, NotImplemented=NotImplemented): +def _lt_from_le(self, other): 'Return a < b. Computed by @total_ordering from (a <= b) and (a != b).' op_result = type(self).__le__(self, other) if op_result is NotImplemented: return op_result return op_result and self != other -def _gt_from_le(self, other, NotImplemented=NotImplemented): +def _gt_from_le(self, other): 'Return a > b. Computed by @total_ordering from (not a <= b).' op_result = type(self).__le__(self, other) if op_result is NotImplemented: return op_result return not op_result -def _lt_from_gt(self, other, NotImplemented=NotImplemented): +def _lt_from_gt(self, other): 'Return a < b. Computed by @total_ordering from (not a > b) and (a != b).' op_result = type(self).__gt__(self, other) if op_result is NotImplemented: return op_result return not op_result and self != other -def _ge_from_gt(self, other, NotImplemented=NotImplemented): +def _ge_from_gt(self, other): 'Return a >= b. Computed by @total_ordering from (a > b) or (a == b).' op_result = type(self).__gt__(self, other) if op_result is NotImplemented: return op_result return op_result or self == other -def _le_from_gt(self, other, NotImplemented=NotImplemented): +def _le_from_gt(self, other): 'Return a <= b. Computed by @total_ordering from (not a > b).' 
op_result = type(self).__gt__(self, other) if op_result is NotImplemented: return op_result return not op_result -def _le_from_ge(self, other, NotImplemented=NotImplemented): +def _le_from_ge(self, other): 'Return a <= b. Computed by @total_ordering from (not a >= b) or (a == b).' op_result = type(self).__ge__(self, other) if op_result is NotImplemented: return op_result return not op_result or self == other -def _gt_from_ge(self, other, NotImplemented=NotImplemented): +def _gt_from_ge(self, other): 'Return a > b. Computed by @total_ordering from (a >= b) and (a != b).' op_result = type(self).__ge__(self, other) if op_result is NotImplemented: return op_result return op_result and self != other -def _lt_from_ge(self, other, NotImplemented=NotImplemented): +def _lt_from_ge(self, other): 'Return a < b. Computed by @total_ordering from (not a >= b).' op_result = type(self).__ge__(self, other) if op_result is NotImplemented: From webhook-mailer at python.org Sat Jan 22 22:03:51 2022 From: webhook-mailer at python.org (vstinner) Date: Sun, 23 Jan 2022 03:03:51 -0000 Subject: [Python-checkins] bpo-45382: test.pythoninfo logs more Windows versions (GH-30817) Message-ID: https://github.com/python/cpython/commit/b0898f4aa90d9397e23aef98a2d6b82445ee7455 commit: b0898f4aa90d9397e23aef98a2d6b82445ee7455 branch: main author: Victor Stinner committer: vstinner date: 2022-01-23T04:03:43+01:00 summary: bpo-45382: test.pythoninfo logs more Windows versions (GH-30817) Add the following info to test.pythoninfo: * windows.ver: output of the shell "ver" command * windows.version and windows.version_caption: output of the "wmic os get Caption,Version /value" command. files: M Lib/test/pythoninfo.py diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 9d733c5721cde..cfd7ac2755d51 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -729,6 +729,45 @@ def collect_windows(info_add): except (ImportError, AttributeError): pass + import subprocess + try: + proc = subprocess.Popen(["wmic", "os", "get", "Caption,Version", "/value"], + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + text=True) + output, stderr = proc.communicate() + if proc.returncode: + output = "" + except OSError: + pass + else: + for line in output.splitlines(): + line = line.strip() + if line.startswith('Caption='): + line = line.removeprefix('Caption=').strip() + if line: + info_add('windows.version_caption', line) + elif line.startswith('Version='): + line = line.removeprefix('Version=').strip() + if line: + info_add('windows.version', line) + + try: + proc = subprocess.Popen(["ver"], shell=True, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + text=True) + output = proc.communicate()[0] + if proc.returncode: + output = "" + except OSError: + return + else: + output = output.strip() + line = output.splitlines()[0] + if line: + info_add('windows.ver', line) + def collect_fips(info_add): try: From webhook-mailer at python.org Sat Jan 22 22:10:41 2022 From: webhook-mailer at python.org (pablogsal) Date: Sun, 23 Jan 2022 03:10:41 -0000 Subject: [Python-checkins] [3.10] bpo-46240: Correct the error for unclosed parentheses when the tokenizer is not finished (GH-30378). 
(GH-30819) Message-ID: https://github.com/python/cpython/commit/633db1c4eb863a1340e45c353e36f2f8dcf5945c commit: 633db1c4eb863a1340e45c353e36f2f8dcf5945c branch: 3.10 author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-23T03:10:37Z summary: [3.10] bpo-46240: Correct the error for unclosed parentheses when the tokenizer is not finished (GH-30378). (GH-30819) (cherry picked from commit 70f415fb8b632247e28d87998642317ca7a652ae) Co-authored-by: Pablo Galindo Salgado files: A Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst M Lib/test/test_exceptions.py M Lib/test/test_syntax.py M Parser/pegen.c diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index 802dc9a67eb21..606e6852627c4 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -227,7 +227,7 @@ def testSyntaxErrorOffset(self): check('x = "a', 1, 5) check('lambda x: x = 2', 1, 1) check('f{a + b + c}', 1, 2) - check('[file for str(file) in []\n])', 1, 11) + check('[file for str(file) in []\n]', 1, 11) check('a = ? hello ? ? world ?', 1, 5) check('[\nfile\nfor str(file)\nin\n[]\n]', 3, 5) check('[file for\n str(file) in []]', 2, 2) diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index b5bebb3d0bdfa..7aa93a012e113 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -1513,6 +1513,9 @@ def test_error_parenthesis(self): for paren in "([{": self._check_error(paren + "1 + 2", f"\\{paren}' was never closed") + for paren in "([{": + self._check_error(f"a = {paren} 1, 2, 3\nb=3", f"\\{paren}' was never closed") + for paren in ")]}": self._check_error(paren + "1 + 2", f"unmatched '\\{paren}'") diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst new file mode 100644 index 0000000000000..a7702ebafbd46 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-03-23-31-25.bpo-46240.8lGjeK.rst @@ -0,0 +1,3 @@ +Correct the error message for unclosed parentheses when the tokenizer +doesn't reach the end of the source when the error is reported. Patch by +Pablo Galindo diff --git a/Parser/pegen.c b/Parser/pegen.c index f9812c0ea8f02..26143f57c0924 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -1342,7 +1342,8 @@ _PyPegen_run_parser(Parser *p) if (PyErr_Occurred()) { // Prioritize tokenizer errors to custom syntax errors raised // on the second phase only if the errors come from the parser. - if (p->tok->done == E_DONE && PyErr_ExceptionMatches(PyExc_SyntaxError)) { + int is_tok_ok = (p->tok->done == E_DONE || p->tok->done == E_OK); + if (is_tok_ok && PyErr_ExceptionMatches(PyExc_SyntaxError)) { _PyPegen_check_tokenizer_errors(p); } return NULL; From webhook-mailer at python.org Sat Jan 22 23:00:46 2022 From: webhook-mailer at python.org (jaraco) Date: Sun, 23 Jan 2022 04:00:46 -0000 Subject: [Python-checkins] bpo-46474: Avoid REDoS in EntryPoint.pattern (sync with importlib_metadata 4.10.1) (GH-30803) Message-ID: https://github.com/python/cpython/commit/51c3e28c8a163e58dc753765e3cc51d5a717e70d commit: 51c3e28c8a163e58dc753765e3cc51d5a717e70d branch: main author: Jason R. 
Coombs committer: jaraco date: 2022-01-22T23:00:23-05:00 summary: bpo-46474: Avoid REDoS in EntryPoint.pattern (sync with importlib_metadata 4.10.1) (GH-30803) files: A Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst M Lib/importlib/metadata/__init__.py diff --git a/Lib/importlib/metadata/__init__.py b/Lib/importlib/metadata/__init__.py index 5ef6d9dc4893d..371c482209586 100644 --- a/Lib/importlib/metadata/__init__.py +++ b/Lib/importlib/metadata/__init__.py @@ -156,8 +156,8 @@ class EntryPoint(DeprecatedTuple): pattern = re.compile( r'(?P[\w.]+)\s*' - r'(:\s*(?P[\w.]+))?\s*' - r'(?P\[.*\])?\s*$' + r'(:\s*(?P[\w.]+)\s*)?' + r'((?P\[.*\])\s*)?$' ) """ A regular expression describing the syntax for an entry point, diff --git a/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst b/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst new file mode 100644 index 0000000000000..156b7de4f6787 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst @@ -0,0 +1,2 @@ +In ``importlib.metadata.EntryPoint.pattern``, avoid potential REDoS by +limiting ambiguity in consecutive whitespace. From webhook-mailer at python.org Sun Jan 23 04:59:43 2022 From: webhook-mailer at python.org (mdickinson) Date: Sun, 23 Jan 2022 09:59:43 -0000 Subject: [Python-checkins] bpo-29882: Fix portability bug introduced in GH-30774 (#30794) Message-ID: https://github.com/python/cpython/commit/83a0ef2162aa379071e243f1b696aa6814edcd2a commit: 83a0ef2162aa379071e243f1b696aa6814edcd2a branch: main author: Mark Dickinson committer: mdickinson date: 2022-01-23T09:59:34Z summary: bpo-29882: Fix portability bug introduced in GH-30774 (#30794) files: M Include/internal/pycore_bitutils.h M Modules/_testinternalcapi.c diff --git a/Include/internal/pycore_bitutils.h b/Include/internal/pycore_bitutils.h index 3fd70b0e417c1..e6bf61ef425bd 100644 --- a/Include/internal/pycore_bitutils.h +++ b/Include/internal/pycore_bitutils.h @@ -115,8 +115,6 @@ _Py_popcount32(uint32_t x) const uint32_t M2 = 0x33333333; // Binary: 0000 1111 0000 1111 ... const uint32_t M4 = 0x0F0F0F0F; - // 256**4 + 256**3 + 256**2 + 256**1 - const uint32_t SUM = 0x01010101; // Put count of each 2 bits into those 2 bits x = x - ((x >> 1) & M1); @@ -124,8 +122,20 @@ x = (x & M2) + ((x >> 2) & M2); // Put count of each 8 bits into those 8 bits x = (x + (x >> 4)) & M4; - // Sum of the 4 byte counts - return (x * SUM) >> 24; + // Sum of the 4 byte counts. + // Take care when considering changes to the next line. Portability and + // correctness are delicate here, thanks to C's "integer promotions" (C99 + // §6.3.1.1p2). On machines where the `int` type has width greater than 32 + // bits, `x` will be promoted to an `int`, and following C's "usual + // arithmetic conversions" (C99 §6.3.1.8), the multiplication will be + // performed as a multiplication of two `unsigned int` operands. In this + // case it's critical that we cast back to `uint32_t` in order to keep only + // the least significant 32 bits. On machines where the `int` type has + // width no greater than 32, the multiplication is of two 32-bit unsigned + // integer types, and the (uint32_t) cast is a no-op. In both cases, we + // avoid the risk of undefined behaviour due to overflow of a + // multiplication of signed integer types.
+ return (uint32_t)(x * 0x01010101U) >> 24; #endif } diff --git a/Modules/_testinternalcapi.c b/Modules/_testinternalcapi.c index 9deba3558bf94..5d5b3e6b2fd62 100644 --- a/Modules/_testinternalcapi.c +++ b/Modules/_testinternalcapi.c @@ -100,6 +100,7 @@ test_popcount(PyObject *self, PyObject *Py_UNUSED(args)) CHECK(0, 0); CHECK(1, 1); CHECK(0x08080808, 4); + CHECK(0x10000001, 2); CHECK(0x10101010, 4); CHECK(0x10204080, 4); CHECK(0xDEADCAFE, 22); From webhook-mailer at python.org Sun Jan 23 05:00:47 2022 From: webhook-mailer at python.org (mdickinson) Date: Sun, 23 Jan 2022 10:00:47 -0000 Subject: [Python-checkins] bpo-46406: Faster single digit int division. (#30626) Message-ID: https://github.com/python/cpython/commit/c7f20f1cc8c20654e5d539552604362feb9b0512 commit: c7f20f1cc8c20654e5d539552604362feb9b0512 branch: main author: Gregory P. Smith committer: mdickinson date: 2022-01-23T10:00:41Z summary: bpo-46406: Faster single digit int division. (#30626) * bpo-46406: Faster single digit int division. This expresses the algorithm in a more basic manner resulting in better instruction generation by todays compilers. See https://mail.python.org/archives/list/python-dev at python.org/thread/ZICIMX5VFCX4IOFH5NUPVHCUJCQ4Q7QM/#NEUNFZU3TQU4CPTYZNF3WCN7DOJBBTK5 files: A Misc/NEWS.d/next/Core and Builtins/2022-01-16-15-40-11.bpo-46406.g0mke-.rst M Objects/longobject.c diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-16-15-40-11.bpo-46406.g0mke-.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-16-15-40-11.bpo-46406.g0mke-.rst new file mode 100644 index 0000000000000..20d1e08bfd48b --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-16-15-40-11.bpo-46406.g0mke-.rst @@ -0,0 +1,3 @@ +The integer division ``//`` implementation has been optimized to better let the +compiler understand its constraints. It can be 20% faster on the amd64 platform +when dividing an int by a value smaller than ``2**30``. diff --git a/Objects/longobject.c b/Objects/longobject.c index 7721f40adbba6..ee20e2638bcad 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -1617,25 +1617,41 @@ v_rshift(digit *z, digit *a, Py_ssize_t m, int d) in pout, and returning the remainder. pin and pout point at the LSD. It's OK for pin == pout on entry, which saves oodles of mallocs/frees in _PyLong_Format, but that should be done with great care since ints are - immutable. */ + immutable. + This version of the code can be 20% faster than the pre-2022 version + on todays compilers on architectures like amd64. It evolved from Mark + Dickinson observing that a 128:64 divide instruction was always being + generated by the compiler despite us working with 30-bit digit values. + See the thread for full context: + + https://mail.python.org/archives/list/python-dev at python.org/thread/ZICIMX5VFCX4IOFH5NUPVHCUJCQ4Q7QM/#NEUNFZU3TQU4CPTYZNF3WCN7DOJBBTK5 + + If you ever want to change this code, pay attention to performance using + different compilers, optimization levels, and cpu architectures. Beware of + PGO/FDO builds doing value specialization such as a fast path for //10. 
:) + + Verify that 17 isn't specialized and this works as a quick test: + python -m timeit -s 'x = 10**1000; r=x//10; assert r == 10**999, r' 'x//17' +*/ static digit inplace_divrem1(digit *pout, digit *pin, Py_ssize_t size, digit n) { - twodigits rem = 0; + digit remainder = 0; assert(n > 0 && n <= PyLong_MASK); - pin += size; - pout += size; while (--size >= 0) { - digit hi; - rem = (rem << PyLong_SHIFT) | *--pin; - *--pout = hi = (digit)(rem / n); - rem -= (twodigits)hi * n; - } - return (digit)rem; + twodigits dividend; + dividend = ((twodigits)remainder << PyLong_SHIFT) | pin[size]; + digit quotient; + quotient = (digit)(dividend / n); + remainder = dividend % n; + pout[size] = quotient; + } + return remainder; } + /* Divide an integer by a digit, returning both the quotient (as function result) and the remainder (through *prem). The sign of a is ignored; n should not be zero. */ From webhook-mailer at python.org Sun Jan 23 09:48:49 2022 From: webhook-mailer at python.org (isidentical) Date: Sun, 23 Jan 2022 14:48:49 -0000 Subject: [Python-checkins] bpo-46483: change `PurePath.__class_getitem__` to return `GenericAlias` (GH-30822) Message-ID: https://github.com/python/cpython/commit/1f715d5bd3bc9ff444e109b6bbd13011913681b1 commit: 1f715d5bd3bc9ff444e109b6bbd13011913681b1 branch: main author: Nikita Sobolev committer: isidentical date: 2022-01-23T17:48:43+03:00 summary: bpo-46483: change `PurePath.__class_getitem__` to return `GenericAlias` (GH-30822) files: A Misc/NEWS.d/next/Library/2022-01-23-11-17-48.bpo-46483.j7qwWb.rst M Lib/pathlib.py M Lib/test/test_pathlib.py diff --git a/Lib/pathlib.py b/Lib/pathlib.py index 04b321b9ccf16..d42ee4dc90b43 100644 --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -12,6 +12,7 @@ from operator import attrgetter from stat import S_ISDIR, S_ISLNK, S_ISREG, S_ISSOCK, S_ISBLK, S_ISCHR, S_ISFIFO from urllib.parse import quote_from_bytes as urlquote_from_bytes +from types import GenericAlias __all__ = [ @@ -690,8 +691,7 @@ def __ge__(self, other): return NotImplemented return self._cparts >= other._cparts - def __class_getitem__(cls, type): - return cls + __class_getitem__ = classmethod(GenericAlias) drive = property(attrgetter('_drv'), doc="""The drive prefix (letter or UNC path), if any.""") diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py index 555c7ee795bd1..1bf21120a36ca 100644 --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -2429,13 +2429,19 @@ def test_complex_symlinks_relative(self): def test_complex_symlinks_relative_dot_dot(self): self._check_complex_symlinks(os.path.join('dirA', '..')) + def test_class_getitem(self): + from types import GenericAlias + + alias = self.cls[str] + self.assertIsInstance(alias, GenericAlias) + self.assertIs(alias.__origin__, self.cls) + self.assertEqual(alias.__args__, (str,)) + self.assertEqual(alias.__parameters__, ()) + class PathTest(_BasePathTest, unittest.TestCase): cls = pathlib.Path - def test_class_getitem(self): - self.assertIs(self.cls[str], self.cls) - def test_concrete_class(self): p = self.cls('a') self.assertIs(type(p), diff --git a/Misc/NEWS.d/next/Library/2022-01-23-11-17-48.bpo-46483.j7qwWb.rst b/Misc/NEWS.d/next/Library/2022-01-23-11-17-48.bpo-46483.j7qwWb.rst new file mode 100644 index 0000000000000..a84503d299a10 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-23-11-17-48.bpo-46483.j7qwWb.rst @@ -0,0 +1,2 @@ +Change :meth:`pathlib.PurePath.__class_getitem__` to return +:class:`types.GenericAlias`. 
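As a short illustration of the new behaviour (mirroring the test added in the diff above), subscripting a pure path class now produces a ``types.GenericAlias`` instead of returning the class itself:

    >>> import pathlib
    >>> from types import GenericAlias
    >>> alias = pathlib.PurePath[str]
    >>> isinstance(alias, GenericAlias)
    True
    >>> alias.__origin__ is pathlib.PurePath
    True
    >>> alias.__args__
    (<class 'str'>,)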
From webhook-mailer at python.org Sun Jan 23 10:17:35 2022 From: webhook-mailer at python.org (jaraco) Date: Sun, 23 Jan 2022 15:17:35 -0000 Subject: [Python-checkins] [3.10] bpo-46474: Avoid REDoS in EntryPoint.pattern (sync with importlib_metadata 4.10.1) (GH-30803) (GH-30827) Message-ID: https://github.com/python/cpython/commit/a7a4ca4f06c8c31d7f403113702ad2e80bfc326b commit: a7a4ca4f06c8c31d7f403113702ad2e80bfc326b branch: 3.10 author: Jason R. Coombs committer: jaraco date: 2022-01-23T10:17:27-05:00 summary: [3.10] bpo-46474: Avoid REDoS in EntryPoint.pattern (sync with importlib_metadata 4.10.1) (GH-30803) (GH-30827) (cherry picked from commit 51c3e28c8a163e58dc753765e3cc51d5a717e70d) Co-authored-by: Jason R. Coombs files: A Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst M Lib/importlib/metadata/__init__.py diff --git a/Lib/importlib/metadata/__init__.py b/Lib/importlib/metadata/__init__.py index ec41ed39157a9..33ce1b6b56964 100644 --- a/Lib/importlib/metadata/__init__.py +++ b/Lib/importlib/metadata/__init__.py @@ -132,8 +132,8 @@ class EntryPoint( pattern = re.compile( r'(?P[\w.]+)\s*' - r'(:\s*(?P[\w.]+))?\s*' - r'(?P\[.*\])?\s*$' + r'(:\s*(?P[\w.]+)\s*)?' + r'((?P\[.*\])\s*)?$' ) """ A regular expression describing the syntax for an entry point, diff --git a/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst b/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst new file mode 100644 index 0000000000000..156b7de4f6787 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst @@ -0,0 +1,2 @@ +In ``importlib.metadata.EntryPoint.pattern``, avoid potential REDoS by +limiting ambiguity in consecutive whitespace. From webhook-mailer at python.org Sun Jan 23 10:17:46 2022 From: webhook-mailer at python.org (jaraco) Date: Sun, 23 Jan 2022 15:17:46 -0000 Subject: [Python-checkins] [3.9] bpo-46474: Avoid REDoS in EntryPoint.pattern (sync with importlib_metadata 4.10.1) (GH-30803). (GH-30828) Message-ID: https://github.com/python/cpython/commit/1514d1252f96e6a83eb65c439522a6b5443f6a1a commit: 1514d1252f96e6a83eb65c439522a6b5443f6a1a branch: 3.9 author: Jason R. Coombs committer: jaraco date: 2022-01-23T10:17:41-05:00 summary: [3.9] bpo-46474: Avoid REDoS in EntryPoint.pattern (sync with importlib_metadata 4.10.1) (GH-30803). (GH-30828) (cherry picked from commit 51c3e28c8a163e58dc753765e3cc51d5a717e70d) Co-authored-by: Jason R. Coombs files: A Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst M Lib/importlib/metadata.py diff --git a/Lib/importlib/metadata.py b/Lib/importlib/metadata.py index 594986ce23efa..7b8038fd63a84 100644 --- a/Lib/importlib/metadata.py +++ b/Lib/importlib/metadata.py @@ -49,8 +49,8 @@ class EntryPoint( pattern = re.compile( r'(?P[\w.]+)\s*' - r'(:\s*(?P[\w.]+))?\s*' - r'(?P\[.*\])?\s*$' + r'(:\s*(?P[\w.]+)\s*)?' + r'((?P\[.*\])\s*)?$' ) """ A regular expression describing the syntax for an entry point, diff --git a/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst b/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst new file mode 100644 index 0000000000000..156b7de4f6787 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-14-49-10.bpo-46474.eKQhvx.rst @@ -0,0 +1,2 @@ +In ``importlib.metadata.EntryPoint.pattern``, avoid potential REDoS by +limiting ambiguity in consecutive whitespace. 
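To make the effect of the tightened expression concrete, here is a small sketch built from the "+" lines of the diffs above. The group names ``module``, ``attr`` and ``extras`` are an assumption for readability; they do not appear verbatim in the diff text shown here:

    import re

    # Fixed pattern; group names assumed to be 'module', 'attr' and 'extras'.
    pattern = re.compile(
        r'(?P<module>[\w.]+)\s*'
        r'(:\s*(?P<attr>[\w.]+)\s*)?'
        r'((?P<extras>\[.*\])\s*)?$'
    )

    m = pattern.match('mod.submod:obj.attr [extra1, extra2]')
    print(m.group('module'))   # mod.submod
    print(m.group('attr'))     # obj.attr
    print(m.group('extras'))   # [extra1, extra2]

Each trailing ``\s*`` now sits inside the optional group it belongs to, which limits the overlapping ways a run of whitespace can be matched and so avoids the catastrophic backtracking behind the REDoS report.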
From webhook-mailer at python.org Sun Jan 23 10:39:49 2022 From: webhook-mailer at python.org (corona10) Date: Sun, 23 Jan 2022 15:39:49 -0000 Subject: [Python-checkins] bpo-46481: Implement vectorcall for weakref.ref.__call__ method. (GH-30820) Message-ID: https://github.com/python/cpython/commit/76dc047a0e88d10aad0405228d56e94438cdd91c commit: 76dc047a0e88d10aad0405228d56e94438cdd91c branch: main author: Dong-hee Na committer: corona10 date: 2022-01-24T00:39:45+09:00 summary: bpo-46481: Implement vectorcall for weakref.ref.__call__ method. (GH-30820) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-23-06-56-33.bpo-46481.X_FfnB.rst M Include/cpython/weakrefobject.h M Lib/test/test_sys.py M Objects/weakrefobject.c diff --git a/Include/cpython/weakrefobject.h b/Include/cpython/weakrefobject.h index 9efcc412df9be..3623071cdb044 100644 --- a/Include/cpython/weakrefobject.h +++ b/Include/cpython/weakrefobject.h @@ -29,6 +29,7 @@ struct _PyWeakReference { */ PyWeakReference *wr_prev; PyWeakReference *wr_next; + vectorcallfunc vectorcall; }; PyAPI_FUNC(Py_ssize_t) _PyWeakref_GetWeakrefCount(PyWeakReference *head); diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py index accd35e4ab271..f6da57f55f161 100644 --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -1538,11 +1538,11 @@ class newstyleclass(object): pass # TODO: add check that forces layout of unicodefields # weakref import weakref - check(weakref.ref(int), size('2Pn2P')) + check(weakref.ref(int), size('2Pn3P')) # weakproxy # XXX # weakcallableproxy - check(weakref.proxy(int), size('2Pn2P')) + check(weakref.proxy(int), size('2Pn3P')) def check_slots(self, obj, base, extra): expected = sys.getsizeof(base) + struct.calcsize(extra) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-23-06-56-33.bpo-46481.X_FfnB.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-23-06-56-33.bpo-46481.X_FfnB.rst new file mode 100644 index 0000000000000..edab2eb014430 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-23-06-56-33.bpo-46481.X_FfnB.rst @@ -0,0 +1,2 @@ +Speed up calls to :meth:`weakref.ref.__call__` by using the :pep:`590` +``vectorcall`` calling convention. Patch by Dong-hee Na. 
diff --git a/Objects/weakrefobject.c b/Objects/weakrefobject.c index 8922768975221..b9920404c5f9f 100644 --- a/Objects/weakrefobject.c +++ b/Objects/weakrefobject.c @@ -19,6 +19,7 @@ _PyWeakref_GetWeakrefCount(PyWeakReference *head) return count; } +static PyObject *weakref_vectorcall(PyWeakReference *self, PyObject *const *args, size_t nargsf, PyObject *kwnames); static void init_weakref(PyWeakReference *self, PyObject *ob, PyObject *callback) @@ -27,8 +28,8 @@ init_weakref(PyWeakReference *self, PyObject *ob, PyObject *callback) self->wr_object = ob; self->wr_prev = NULL; self->wr_next = NULL; - Py_XINCREF(callback); - self->wr_callback = callback; + self->wr_callback = Py_XNewRef(callback); + self->vectorcall = (vectorcallfunc)weakref_vectorcall; } static PyWeakReference * @@ -128,19 +129,19 @@ gc_clear(PyWeakReference *self) static PyObject * -weakref_call(PyWeakReference *self, PyObject *args, PyObject *kw) +weakref_vectorcall(PyWeakReference *self, PyObject *const *args, + size_t nargsf, PyObject *kwnames) { - static char *kwlist[] = {NULL}; - - if (PyArg_ParseTupleAndKeywords(args, kw, ":__call__", kwlist)) { - PyObject *object = PyWeakref_GET_OBJECT(self); - Py_INCREF(object); - return (object); + if (!_PyArg_NoKwnames("weakref", kwnames)) { + return NULL; + } + Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); + if (!_PyArg_CheckPositional("weakref", nargs, 0, 0)) { + return NULL; } - return NULL; + return Py_NewRef(PyWeakref_GET_OBJECT(self)); } - static Py_hash_t weakref_hash(PyWeakReference *self) { @@ -371,45 +372,24 @@ static PyMethodDef weakref_methods[] = { PyTypeObject _PyWeakref_RefType = { PyVarObject_HEAD_INIT(&PyType_Type, 0) - "weakref", - sizeof(PyWeakReference), - 0, - weakref_dealloc, /*tp_dealloc*/ - 0, /*tp_vectorcall_offset*/ - 0, /*tp_getattr*/ - 0, /*tp_setattr*/ - 0, /*tp_as_async*/ - (reprfunc)weakref_repr, /*tp_repr*/ - 0, /*tp_as_number*/ - 0, /*tp_as_sequence*/ - 0, /*tp_as_mapping*/ - (hashfunc)weakref_hash, /*tp_hash*/ - (ternaryfunc)weakref_call, /*tp_call*/ - 0, /*tp_str*/ - 0, /*tp_getattro*/ - 0, /*tp_setattro*/ - 0, /*tp_as_buffer*/ - Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC - | Py_TPFLAGS_BASETYPE, /*tp_flags*/ - 0, /*tp_doc*/ - (traverseproc)gc_traverse, /*tp_traverse*/ - (inquiry)gc_clear, /*tp_clear*/ - (richcmpfunc)weakref_richcompare, /*tp_richcompare*/ - 0, /*tp_weaklistoffset*/ - 0, /*tp_iter*/ - 0, /*tp_iternext*/ - weakref_methods, /*tp_methods*/ - weakref_members, /*tp_members*/ - 0, /*tp_getset*/ - 0, /*tp_base*/ - 0, /*tp_dict*/ - 0, /*tp_descr_get*/ - 0, /*tp_descr_set*/ - 0, /*tp_dictoffset*/ - weakref___init__, /*tp_init*/ - PyType_GenericAlloc, /*tp_alloc*/ - weakref___new__, /*tp_new*/ - PyObject_GC_Del, /*tp_free*/ + .tp_name = "weakref", + .tp_basicsize = sizeof(PyWeakReference), + .tp_dealloc = weakref_dealloc, + .tp_vectorcall_offset = offsetof(PyWeakReference, vectorcall), + .tp_call = PyVectorcall_Call, + .tp_repr = (reprfunc)weakref_repr, + .tp_hash = (hashfunc)weakref_hash, + .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | + Py_TPFLAGS_HAVE_VECTORCALL | Py_TPFLAGS_BASETYPE, + .tp_traverse = (traverseproc)gc_traverse, + .tp_clear = (inquiry)gc_clear, + .tp_richcompare = (richcmpfunc)weakref_richcompare, + .tp_methods = weakref_methods, + .tp_members = weakref_members, + .tp_init = weakref___init__, + .tp_alloc = PyType_GenericAlloc, + .tp_new = weakref___new__, + .tp_free = PyObject_GC_Del, }; From webhook-mailer at python.org Sun Jan 23 12:40:43 2022 From: webhook-mailer at python.org (ethanfurman) Date: Sun, 23 Jan 2022 
17:40:43 -0000 Subject: [Python-checkins] bpo-46103: Fix inspect.getmembers to only get __bases__ from class (GH-30147) Message-ID: https://github.com/python/cpython/commit/691506f4e9408a1205166f99640946ad7822e302 commit: 691506f4e9408a1205166f99640946ad7822e302 branch: main author: Weipeng Hong committer: ethanfurman date: 2022-01-23T09:40:38-08:00 summary: bpo-46103: Fix inspect.getmembers to only get __bases__ from class (GH-30147) files: A Misc/NEWS.d/next/Library/2021-12-16-23-42-54.bpo-46103.LMnZAN.rst M Lib/inspect.py M Lib/test/test_inspect.py diff --git a/Lib/inspect.py b/Lib/inspect.py index 879a577d43fbe..d47f5b717471c 100644 --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -540,23 +540,23 @@ def isabstract(object): return False def _getmembers(object, predicate, getter): + results = [] + processed = set() + names = dir(object) if isclass(object): mro = (object,) + getmro(object) + # add any DynamicClassAttributes to the list of names if object is a class; + # this may result in duplicate entries if, for example, a virtual + # attribute with the same name as a DynamicClassAttribute exists + try: + for base in object.__bases__: + for k, v in base.__dict__.items(): + if isinstance(v, types.DynamicClassAttribute): + names.append(k) + except AttributeError: + pass else: mro = () - results = [] - processed = set() - names = dir(object) - # :dd any DynamicClassAttributes to the list of names if object is a class; - # this may result in duplicate entries if, for example, a virtual - # attribute with the same name as a DynamicClassAttribute exists - try: - for base in object.__bases__: - for k, v in base.__dict__.items(): - if isinstance(v, types.DynamicClassAttribute): - names.append(k) - except AttributeError: - pass for key in names: # First try to get the value via getattr. Some descriptors don't # like calling their __get__ (see bug #1785), so fall back to diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index cdbb9eb6a8f7c..76fa6f7e2dab8 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -1215,8 +1215,13 @@ class A(metaclass=M): @types.DynamicClassAttribute def eggs(self): return 'spam' + class B: + def __getattr__(self, attribute): + return None self.assertIn(('eggs', 'scrambled'), inspect.getmembers(A)) self.assertIn(('eggs', 'spam'), inspect.getmembers(A())) + b = B() + self.assertIn(('__getattr__', b.__getattr__), inspect.getmembers(b)) def test_getmembers_static(self): class A: diff --git a/Misc/NEWS.d/next/Library/2021-12-16-23-42-54.bpo-46103.LMnZAN.rst b/Misc/NEWS.d/next/Library/2021-12-16-23-42-54.bpo-46103.LMnZAN.rst new file mode 100644 index 0000000000000..3becbc3de8fc2 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-16-23-42-54.bpo-46103.LMnZAN.rst @@ -0,0 +1,2 @@ +Now :func:`inspect.getmembers` only gets :attr:`__bases__` attribute from +class type. Patch by Weipeng Hong. 
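A small sketch of the case the fix above targets: an instance whose class answers every attribute lookup, including ``__bases__``, via ``__getattr__``. With the change, ``getmembers()`` consults ``__bases__`` only when the object being inspected is a class, so such instances can be inspected normally (this mirrors the test added in the diff):

    import inspect

    class B:
        def __getattr__(self, attribute):
            # Answer every missing-attribute lookup, including __bases__, with None.
            return None

    b = B()
    members = dict(inspect.getmembers(b))
    print('__getattr__' in members)   # True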
From webhook-mailer at python.org Sun Jan 23 12:46:04 2022 From: webhook-mailer at python.org (gvanrossum) Date: Sun, 23 Jan 2022 17:46:04 -0000 Subject: [Python-checkins] bpo-46471: Use single byte singletons (GH-30781) Message-ID: https://github.com/python/cpython/commit/ca78130d7eb5265759697639e42487ec6d0a4caf commit: ca78130d7eb5265759697639e42487ec6d0a4caf branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: gvanrossum date: 2022-01-23T09:45:39-08:00 summary: bpo-46471: Use single byte singletons (GH-30781) files: A Misc/NEWS.d/next/Build/2022-01-22-11-06-23.bpo-46471.03snrE.rst M Tools/scripts/deepfreeze.py diff --git a/Misc/NEWS.d/next/Build/2022-01-22-11-06-23.bpo-46471.03snrE.rst b/Misc/NEWS.d/next/Build/2022-01-22-11-06-23.bpo-46471.03snrE.rst new file mode 100644 index 0000000000000..ca8f72868e69e --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-22-11-06-23.bpo-46471.03snrE.rst @@ -0,0 +1 @@ +Use global singletons for single byte bytes objects in deepfreeze. \ No newline at end of file diff --git a/Tools/scripts/deepfreeze.py b/Tools/scripts/deepfreeze.py index a7546a8c60751..a1ef85ea891a2 100644 --- a/Tools/scripts/deepfreeze.py +++ b/Tools/scripts/deepfreeze.py @@ -150,6 +150,8 @@ def field(self, obj: object, name: str) -> None: def generate_bytes(self, name: str, b: bytes) -> str: if b == b"": return "(PyObject *)&_Py_SINGLETON(bytes_empty)" + if len(b) == 1: + return f"(PyObject *)&_Py_SINGLETON(bytes_characters[{b[0]}])" self.write("static") with self.indent(): with self.block("struct"): From webhook-mailer at python.org Sun Jan 23 12:54:23 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Sun, 23 Jan 2022 17:54:23 -0000 Subject: [Python-checkins] [3.9] bpo-21987: Fix TarFile.getmember getting a dir with a trailing slash (GH-30283) (GH-30738) Message-ID: https://github.com/python/cpython/commit/94d6434ba7ec3e4b154e515c5583b0b665ab0b09 commit: 94d6434ba7ec3e4b154e515c5583b0b665ab0b09 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: serhiy-storchaka date: 2022-01-23T19:54:13+02:00 summary: [3.9] bpo-21987: Fix TarFile.getmember getting a dir with a trailing slash (GH-30283) (GH-30738) (cherry picked from commit cfadcc31ea84617b1c73022ce54d4ae831333e8d) Co-authored-by: andrei kulakov files: A Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst M Lib/tarfile.py M Lib/test/test_tarfile.py diff --git a/Lib/tarfile.py b/Lib/tarfile.py index 043a4ab5a52a6..a738e25684c48 100755 --- a/Lib/tarfile.py +++ b/Lib/tarfile.py @@ -1785,7 +1785,7 @@ def getmember(self, name): than once in the archive, its last occurrence is assumed to be the most up-to-date version. 
""" - tarinfo = self._getmember(name) + tarinfo = self._getmember(name.rstrip('/')) if tarinfo is None: raise KeyError("filename %r not found" % name) return tarinfo diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py index 06fb97212d856..a4b63ff82b3ae 100644 --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -219,6 +219,25 @@ def test_fileobj_symlink2(self): def test_issue14160(self): self._test_fileobj_link("symtype2", "ustar/regtype") + def test_add_dir_getmember(self): + # bpo-21987 + self.add_dir_and_getmember('bar') + self.add_dir_and_getmember('a'*101) + + def add_dir_and_getmember(self, name): + with support.temp_cwd(): + with tarfile.open(tmpname, 'w') as tar: + try: + os.mkdir(name) + tar.add(name) + finally: + os.rmdir(name) + with tarfile.open(tmpname) as tar: + self.assertEqual( + tar.getmember(name), + tar.getmember(name + '/') + ) + class GzipUstarReadTest(GzipTest, UstarReadTest): pass diff --git a/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst b/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst new file mode 100644 index 0000000000000..305dd16d53b49 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2021-12-28-11-55-10.bpo-21987.avBK-p.rst @@ -0,0 +1,2 @@ +Fix an issue with :meth:`tarfile.TarFile.getmember` getting a directory name +with a trailing slash. From webhook-mailer at python.org Sun Jan 23 13:42:53 2022 From: webhook-mailer at python.org (iritkatriel) Date: Sun, 23 Jan 2022 18:42:53 -0000 Subject: [Python-checkins] bpo-41403: Improve error message for invalid mock target (GH-30833) Message-ID: https://github.com/python/cpython/commit/f7955a82e36d4c32ebdd7b7707cdf0e6ffa7a418 commit: f7955a82e36d4c32ebdd7b7707cdf0e6ffa7a418 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-23T18:42:41Z summary: bpo-41403: Improve error message for invalid mock target (GH-30833) files: A Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst M Lib/unittest/mock.py M Lib/unittest/test/testmock/testpatch.py diff --git a/Lib/unittest/mock.py b/Lib/unittest/mock.py index 9f99a5aa5bcdc..9137501930000 100644 --- a/Lib/unittest/mock.py +++ b/Lib/unittest/mock.py @@ -1589,9 +1589,9 @@ def stop(self): def _get_target(target): try: target, attribute = target.rsplit('.', 1) - except (TypeError, ValueError): - raise TypeError("Need a valid target to patch. You supplied: %r" % - (target,)) + except (TypeError, ValueError, AttributeError): + raise TypeError( + f"Need a valid target to patch. 
You supplied: {target!r}") return partial(pkgutil.resolve_name, target), attribute diff --git a/Lib/unittest/test/testmock/testpatch.py b/Lib/unittest/test/testmock/testpatch.py index 233a5afffaed4..8ab63a1317d3d 100644 --- a/Lib/unittest/test/testmock/testpatch.py +++ b/Lib/unittest/test/testmock/testpatch.py @@ -1933,8 +1933,13 @@ def test(mock): def test_invalid_target(self): - with self.assertRaises(TypeError): - patch('') + class Foo: + pass + + for target in ['', 12, Foo()]: + with self.subTest(target=target): + with self.assertRaises(TypeError): + patch(target) def test_cant_set_kwargs_when_passing_a_mock(self): diff --git a/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst b/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst new file mode 100644 index 0000000000000..ede159b25641f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst @@ -0,0 +1,3 @@ +Make :meth:`mock.patch` raise a :exc:`TypeError` with a relevant error +message on invalid arg. Previously it allowed a cryptic +:exc:`AttributeError` to escape. From webhook-mailer at python.org Sun Jan 23 14:34:51 2022 From: webhook-mailer at python.org (iritkatriel) Date: Sun, 23 Jan 2022 19:34:51 -0000 Subject: [Python-checkins] bpo-41403: Improve error message for invalid mock target (GH-30833) (GH-30834) Message-ID: https://github.com/python/cpython/commit/e3ade66ec575e0cb4882cfdff155ef962e67c837 commit: e3ade66ec575e0cb4882cfdff155ef962e67c837 branch: 3.10 author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-23T19:34:43Z summary: bpo-41403: Improve error message for invalid mock target (GH-30833) (GH-30834) (cherry picked from commit f7955a82e36d4c32ebdd7b7707cdf0e6ffa7a418) files: A Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst M Lib/unittest/mock.py M Lib/unittest/test/testmock/testpatch.py diff --git a/Lib/unittest/mock.py b/Lib/unittest/mock.py index 6226bd4bc0c19..7152f86ed9694 100644 --- a/Lib/unittest/mock.py +++ b/Lib/unittest/mock.py @@ -1602,9 +1602,9 @@ def stop(self): def _get_target(target): try: target, attribute = target.rsplit('.', 1) - except (TypeError, ValueError): - raise TypeError("Need a valid target to patch. You supplied: %r" % - (target,)) + except (TypeError, ValueError, AttributeError): + raise TypeError( + f"Need a valid target to patch. You supplied: {target!r}") getter = lambda: _importer(target) return getter, attribute diff --git a/Lib/unittest/test/testmock/testpatch.py b/Lib/unittest/test/testmock/testpatch.py index 233a5afffaed4..8ab63a1317d3d 100644 --- a/Lib/unittest/test/testmock/testpatch.py +++ b/Lib/unittest/test/testmock/testpatch.py @@ -1933,8 +1933,13 @@ def test(mock): def test_invalid_target(self): - with self.assertRaises(TypeError): - patch('') + class Foo: + pass + + for target in ['', 12, Foo()]: + with self.subTest(target=target): + with self.assertRaises(TypeError): + patch(target) def test_cant_set_kwargs_when_passing_a_mock(self): diff --git a/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst b/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst new file mode 100644 index 0000000000000..ede159b25641f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst @@ -0,0 +1,3 @@ +Make :meth:`mock.patch` raise a :exc:`TypeError` with a relevant error +message on invalid arg. Previously it allowed a cryptic +:exc:`AttributeError` to escape. 
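As a quick illustration of the improved error, mirroring the test added in the diff above (the targets are illustrative): every invalid patch target now fails with a TypeError that names the offending value, instead of letting an AttributeError from the missing rsplit escape.

    from unittest import mock

    for bad_target in ("", 12, object()):
        try:
            mock.patch(bad_target)
        except TypeError as exc:
            # e.g. "Need a valid target to patch. You supplied: 12"
            print(f"{bad_target!r}: {exc}")
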
From webhook-mailer at python.org Sun Jan 23 14:35:19 2022 From: webhook-mailer at python.org (iritkatriel) Date: Sun, 23 Jan 2022 19:35:19 -0000 Subject: [Python-checkins] bpo-41403: Improve error message for invalid mock target (GH-30833) (GH-30835) Message-ID: https://github.com/python/cpython/commit/1398dca838529e682c06b496cc1911d91334ff3a commit: 1398dca838529e682c06b496cc1911d91334ff3a branch: 3.9 author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-23T19:35:15Z summary: bpo-41403: Improve error message for invalid mock target (GH-30833) (GH-30835) (cherry picked from commit f7955a82e36d4c32ebdd7b7707cdf0e6ffa7a418) files: A Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst M Lib/unittest/mock.py M Lib/unittest/test/testmock/testpatch.py diff --git a/Lib/unittest/mock.py b/Lib/unittest/mock.py index 5c78274d09357..3f5442ed80951 100644 --- a/Lib/unittest/mock.py +++ b/Lib/unittest/mock.py @@ -1557,9 +1557,9 @@ def stop(self): def _get_target(target): try: target, attribute = target.rsplit('.', 1) - except (TypeError, ValueError): - raise TypeError("Need a valid target to patch. You supplied: %r" % - (target,)) + except (TypeError, ValueError, AttributeError): + raise TypeError( + f"Need a valid target to patch. You supplied: {target!r}") getter = lambda: _importer(target) return getter, attribute diff --git a/Lib/unittest/test/testmock/testpatch.py b/Lib/unittest/test/testmock/testpatch.py index 233a5afffaed4..8ab63a1317d3d 100644 --- a/Lib/unittest/test/testmock/testpatch.py +++ b/Lib/unittest/test/testmock/testpatch.py @@ -1933,8 +1933,13 @@ def test(mock): def test_invalid_target(self): - with self.assertRaises(TypeError): - patch('') + class Foo: + pass + + for target in ['', 12, Foo()]: + with self.subTest(target=target): + with self.assertRaises(TypeError): + patch(target) def test_cant_set_kwargs_when_passing_a_mock(self): diff --git a/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst b/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst new file mode 100644 index 0000000000000..ede159b25641f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-23-18-04-45.bpo-41403.SgoHqV.rst @@ -0,0 +1,3 @@ +Make :meth:`mock.patch` raise a :exc:`TypeError` with a relevant error +message on invalid arg. Previously it allowed a cryptic +:exc:`AttributeError` to escape. From webhook-mailer at python.org Sun Jan 23 15:31:15 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 23 Jan 2022 20:31:15 -0000 Subject: [Python-checkins] Improve grouper() recipe to demonstrate all forms of zip() (GH-30837) Message-ID: https://github.com/python/cpython/commit/270a09184d312856ca112396daec8f360cc5510e commit: 270a09184d312856ca112396daec8f360cc5510e branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-23T14:31:10-06:00 summary: Improve grouper() recipe to demonstrate all forms of zip() (GH-30837) files: M Doc/library/itertools.rst M Lib/test/test_itertools.py diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst index 61d8b869711fa..34667561c3cfe 100644 --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -813,11 +813,20 @@ which incur interpreter overhead. 
return starmap(func, repeat(args)) return starmap(func, repeat(args, times)) - def grouper(iterable, n, fillvalue=None): + def grouper(iterable, n, *, incomplete='fill', fillvalue=None): "Collect data into non-overlapping fixed-length chunks or blocks" - # grouper('ABCDEFG', 3, 'x') --> ABC DEF Gxx + # grouper('ABCDEFG', 3, fillvalue='x') --> ABC DEF Gxx + # grouper('ABCDEFG', 3, incomplete='strict') --> ABC DEF ValueError + # grouper('ABCDEFG', 3, incomplete='ignore') --> ABC DEF args = [iter(iterable)] * n - return zip_longest(*args, fillvalue=fillvalue) + if incomplete == 'fill': + return zip_longest(*args, fillvalue=fillvalue) + if incomplete == 'strict': + return zip(*args, strict=True) + if incomplete == 'ignore': + return zip(*args) + else: + raise ValueError('Expected fill, strict, or ignore') def triplewise(iterable): "Return overlapping triplets from an iterable" diff --git a/Lib/test/test_itertools.py b/Lib/test/test_itertools.py index 808c32f7dd7ab..3043e8c404e6e 100644 --- a/Lib/test/test_itertools.py +++ b/Lib/test/test_itertools.py @@ -2436,6 +2436,21 @@ def test_permutations_sizeof(self): ... else: ... return starmap(func, repeat(args, times)) +>>> def grouper(iterable, n, *, incomplete='fill', fillvalue=None): +... "Collect data into non-overlapping fixed-length chunks or blocks" +... # grouper('ABCDEFG', 3, fillvalue='x') --> ABC DEF Gxx +... # grouper('ABCDEFG', 3, incomplete='strict') --> ABC DEF ValueError +... # grouper('ABCDEFG', 3, incomplete='ignore') --> ABC DEF +... args = [iter(iterable)] * n +... if incomplete == 'fill': +... return zip_longest(*args, fillvalue=fillvalue) +... if incomplete == 'strict': +... return zip(*args, strict=True) +... if incomplete == 'ignore': +... return zip(*args) +... else: +... raise ValueError('Expected fill, strict, or ignore') + >>> def triplewise(iterable): ... "Return overlapping triplets from an iterable" ... # pairwise('ABCDEFG') -> ABC BCD CDE DEF EFG @@ -2453,11 +2468,6 @@ def test_permutations_sizeof(self): ... window.append(x) ... yield tuple(window) ->>> def grouper(n, iterable, fillvalue=None): -... "grouper(3, 'ABCDEFG', 'x') --> ABC DEF Gxx" -... args = [iter(iterable)] * n -... return zip_longest(*args, fillvalue=fillvalue) - >>> def roundrobin(*iterables): ... "roundrobin('ABC', 'D', 'EF') --> A D E B F C" ... # Recipe credited to George Sakkis @@ -2626,9 +2636,22 @@ def test_permutations_sizeof(self): >>> dotproduct([1,2,3], [4,5,6]) 32 ->>> list(grouper(3, 'abcdefg', 'x')) +>>> list(grouper('abcdefg', 3, fillvalue='x')) [('a', 'b', 'c'), ('d', 'e', 'f'), ('g', 'x', 'x')] +>>> it = grouper('abcdefg', 3, incomplete='strict') +>>> next(it) +('a', 'b', 'c') +>>> next(it) +('d', 'e', 'f') +>>> next(it) +Traceback (most recent call last): + ... 
+ValueError: zip() argument 2 is shorter than argument 1 + +>>> list(grouper('abcdefg', n=3, incomplete='ignore')) +[('a', 'b', 'c'), ('d', 'e', 'f')] + >>> list(triplewise('ABCDEFG')) [('A', 'B', 'C'), ('B', 'C', 'D'), ('C', 'D', 'E'), ('D', 'E', 'F'), ('E', 'F', 'G')] From webhook-mailer at python.org Sun Jan 23 17:02:40 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 23 Jan 2022 22:02:40 -0000 Subject: [Python-checkins] Improve grouper() recipe to demonstrate all forms of zip() (GH-30837) (GH-30840) Message-ID: https://github.com/python/cpython/commit/b2c7fe1f61c8ec3742635428570bc61d820c7a68 commit: b2c7fe1f61c8ec3742635428570bc61d820c7a68 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: rhettinger date: 2022-01-23T16:02:31-06:00 summary: Improve grouper() recipe to demonstrate all forms of zip() (GH-30837) (GH-30840) files: M Doc/library/itertools.rst M Lib/test/test_itertools.py diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst index 61d8b869711fa..34667561c3cfe 100644 --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -813,11 +813,20 @@ which incur interpreter overhead. return starmap(func, repeat(args)) return starmap(func, repeat(args, times)) - def grouper(iterable, n, fillvalue=None): + def grouper(iterable, n, *, incomplete='fill', fillvalue=None): "Collect data into non-overlapping fixed-length chunks or blocks" - # grouper('ABCDEFG', 3, 'x') --> ABC DEF Gxx + # grouper('ABCDEFG', 3, fillvalue='x') --> ABC DEF Gxx + # grouper('ABCDEFG', 3, incomplete='strict') --> ABC DEF ValueError + # grouper('ABCDEFG', 3, incomplete='ignore') --> ABC DEF args = [iter(iterable)] * n - return zip_longest(*args, fillvalue=fillvalue) + if incomplete == 'fill': + return zip_longest(*args, fillvalue=fillvalue) + if incomplete == 'strict': + return zip(*args, strict=True) + if incomplete == 'ignore': + return zip(*args) + else: + raise ValueError('Expected fill, strict, or ignore') def triplewise(iterable): "Return overlapping triplets from an iterable" diff --git a/Lib/test/test_itertools.py b/Lib/test/test_itertools.py index a12f6f0b9773e..4c9c597cc4842 100644 --- a/Lib/test/test_itertools.py +++ b/Lib/test/test_itertools.py @@ -2394,6 +2394,21 @@ def test_permutations_sizeof(self): ... else: ... return starmap(func, repeat(args, times)) +>>> def grouper(iterable, n, *, incomplete='fill', fillvalue=None): +... "Collect data into non-overlapping fixed-length chunks or blocks" +... # grouper('ABCDEFG', 3, fillvalue='x') --> ABC DEF Gxx +... # grouper('ABCDEFG', 3, incomplete='strict') --> ABC DEF ValueError +... # grouper('ABCDEFG', 3, incomplete='ignore') --> ABC DEF +... args = [iter(iterable)] * n +... if incomplete == 'fill': +... return zip_longest(*args, fillvalue=fillvalue) +... if incomplete == 'strict': +... return zip(*args, strict=True) +... if incomplete == 'ignore': +... return zip(*args) +... else: +... raise ValueError('Expected fill, strict, or ignore') + >>> def triplewise(iterable): ... "Return overlapping triplets from an iterable" ... # pairwise('ABCDEFG') -> ABC BCD CDE DEF EFG @@ -2411,11 +2426,6 @@ def test_permutations_sizeof(self): ... window.append(x) ... yield tuple(window) ->>> def grouper(n, iterable, fillvalue=None): -... "grouper(3, 'ABCDEFG', 'x') --> ABC DEF Gxx" -... args = [iter(iterable)] * n -... return zip_longest(*args, fillvalue=fillvalue) - >>> def roundrobin(*iterables): ... "roundrobin('ABC', 'D', 'EF') --> A D E B F C" ... 
# Recipe credited to George Sakkis @@ -2584,9 +2594,22 @@ def test_permutations_sizeof(self): >>> dotproduct([1,2,3], [4,5,6]) 32 ->>> list(grouper(3, 'abcdefg', 'x')) +>>> list(grouper('abcdefg', 3, fillvalue='x')) [('a', 'b', 'c'), ('d', 'e', 'f'), ('g', 'x', 'x')] +>>> it = grouper('abcdefg', 3, incomplete='strict') +>>> next(it) +('a', 'b', 'c') +>>> next(it) +('d', 'e', 'f') +>>> next(it) +Traceback (most recent call last): + ... +ValueError: zip() argument 2 is shorter than argument 1 + +>>> list(grouper('abcdefg', n=3, incomplete='ignore')) +[('a', 'b', 'c'), ('d', 'e', 'f')] + >>> list(triplewise('ABCDEFG')) [('A', 'B', 'C'), ('B', 'C', 'D'), ('C', 'D', 'E'), ('D', 'E', 'F'), ('E', 'F', 'G')] From webhook-mailer at python.org Sun Jan 23 17:03:55 2022 From: webhook-mailer at python.org (rhettinger) Date: Sun, 23 Jan 2022 22:03:55 -0000 Subject: [Python-checkins] bpo-46486: Fixed misspelled name DesciptorClassification Message-ID: https://github.com/python/cpython/commit/d1beb241d9bdf912682bc8323a59c052f99b82a8 commit: d1beb241d9bdf912682bc8323a59c052f99b82a8 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: rhettinger date: 2022-01-23T16:03:50-06:00 summary: bpo-46486: Fixed misspelled name DesciptorClassification files: M Python/specialize.c diff --git a/Python/specialize.c b/Python/specialize.c index 8daeaa6cb2f51..44c006245ebfc 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -587,10 +587,10 @@ typedef enum { ABSENT, /* Attribute is not present on the class */ DUNDER_CLASS, /* __class__ attribute */ GETSET_OVERRIDDEN /* __getattribute__ or __setattr__ has been overridden */ -} DesciptorClassification; +} DescriptorClassification; -static DesciptorClassification +static DescriptorClassification analyze_descriptor(PyTypeObject *type, PyObject *name, PyObject **descr, int store) { if (store) { @@ -651,7 +651,7 @@ analyze_descriptor(PyTypeObject *type, PyObject *name, PyObject **descr, int sto static int specialize_dict_access( PyObject *owner, _Py_CODEUNIT *instr, PyTypeObject *type, - DesciptorClassification kind, PyObject *name, + DescriptorClassification kind, PyObject *name, _PyAdaptiveEntry *cache0, _PyAttrCache *cache1, int base_op, int values_op, int hint_op) { @@ -718,7 +718,7 @@ _Py_Specialize_LoadAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name, Sp } } PyObject *descr; - DesciptorClassification kind = analyze_descriptor(type, name, &descr, 0); + DescriptorClassification kind = analyze_descriptor(type, name, &descr, 0); switch(kind) { case OVERRIDING: SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_OVERRIDING_DESCRIPTOR); @@ -807,7 +807,7 @@ _Py_Specialize_StoreAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name, S goto fail; } PyObject *descr; - DesciptorClassification kind = analyze_descriptor(type, name, &descr, 1); + DescriptorClassification kind = analyze_descriptor(type, name, &descr, 1); switch(kind) { case OVERRIDING: SPECIALIZATION_FAIL(STORE_ATTR, SPEC_FAIL_OVERRIDING_DESCRIPTOR); @@ -881,7 +881,7 @@ _Py_Specialize_StoreAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name, S #ifdef Py_STATS static int -load_method_fail_kind(DesciptorClassification kind) +load_method_fail_kind(DescriptorClassification kind) { switch (kind) { case OVERRIDING: @@ -921,7 +921,7 @@ specialize_class_load_method(PyObject *owner, _Py_CODEUNIT *instr, PyObject *nam { PyObject *descr = NULL; - DesciptorClassification kind = 0; + DescriptorClassification kind = 0; kind = analyze_descriptor((PyTypeObject *)owner, name, 
&descr, 0); switch (kind) { case METHOD: @@ -969,7 +969,7 @@ _Py_Specialize_LoadMethod(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name, } PyObject *descr = NULL; - DesciptorClassification kind = 0; + DescriptorClassification kind = 0; kind = analyze_descriptor(owner_cls, name, &descr, 0); assert(descr != NULL || kind == ABSENT || kind == GETSET_OVERRIDDEN); if (kind != METHOD) { From webhook-mailer at python.org Sun Jan 23 18:36:13 2022 From: webhook-mailer at python.org (gvanrossum) Date: Sun, 23 Jan 2022 23:36:13 -0000 Subject: [Python-checkins] fix typo in typing.rst (#30841) Message-ID: https://github.com/python/cpython/commit/d75a51bea3c2442f81d38ff850b81b8b7f3330f0 commit: d75a51bea3c2442f81d38ff850b81b8b7f3330f0 branch: main author: Jelle Zijlstra committer: gvanrossum date: 2022-01-23T15:36:08-08:00 summary: fix typo in typing.rst (#30841) files: M Doc/library/typing.rst diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index cb14db90711cf..cdfd403a34ef9 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -2142,7 +2142,7 @@ Constant If ``from __future__ import annotations`` is used in Python 3.7 or later, annotations are not evaluated at function definition time. - Instead, they are stored as strings in ``__annotations__``, + Instead, they are stored as strings in ``__annotations__``. This makes it unnecessary to use quotes around the annotation. (see :pep:`563`). From webhook-mailer at python.org Mon Jan 24 05:14:47 2022 From: webhook-mailer at python.org (serhiy-storchaka) Date: Mon, 24 Jan 2022 10:14:47 -0000 Subject: [Python-checkins] [3.9] bpo-46426: Improve tests for the dir_fd argument (GH-30668) (GH-30757) Message-ID: https://github.com/python/cpython/commit/3f1ea163ea54513e00e0e9d5442fee1b639825cc commit: 3f1ea163ea54513e00e0e9d5442fee1b639825cc branch: 3.9 author: Serhiy Storchaka committer: serhiy-storchaka date: 2022-01-24T12:14:42+02:00 summary: [3.9] bpo-46426: Improve tests for the dir_fd argument (GH-30668) (GH-30757) Ensure that directory file descriptors refer to directories different from the current directory, and that src_dir_fd and dst_dir_fd refer to different directories. Add context manager open_dir_fd() in test.support.os_helper. (cherry picked from commit 54610bb448a9cf5be77d53b66169fca4c11be6cb) Co-authored-by: Serhiy Storchaka files: M Lib/test/support/__init__.py M Lib/test/test_os.py M Lib/test/test_posix.py diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 4ced1300cbfab..53804f13fc8a2 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -1013,6 +1013,16 @@ def create_empty_file(filename): fd = os.open(filename, os.O_WRONLY | os.O_CREAT | os.O_TRUNC) os.close(fd) + at contextlib.contextmanager +def open_dir_fd(path): + """Open a file descriptor to a directory.""" + assert os.path.isdir(path) + dir_fd = os.open(path, os.O_RDONLY) + try: + yield dir_fd + finally: + os.close(dir_fd) + def sortdict(dict): "Like repr(dict), but in sorted order." 
items = sorted(dict.items()) diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py index 59ddf9e0b3c91..e48157a3de26d 100644 --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -708,12 +708,9 @@ def set_time(filename, ns): def test_utime_dir_fd(self): def set_time(filename, ns): dirname, name = os.path.split(filename) - dirfd = os.open(dirname, os.O_RDONLY) - try: + with support.open_dir_fd(dirname) as dirfd: # pass dir_fd to test utimensat(timespec) or futimesat(timeval) os.utime(name, dir_fd=dirfd, ns=ns) - finally: - os.close(dirfd) self._test_utime(set_time) def test_utime_directory(self): @@ -4111,8 +4108,7 @@ def test_fd(self): os.symlink('file.txt', os.path.join(self.path, 'link')) expected_names.append('link') - fd = os.open(self.path, os.O_RDONLY) - try: + with support.open_dir_fd(self.path) as fd: with os.scandir(fd) as it: entries = list(it) names = [entry.name for entry in entries] @@ -4127,8 +4123,6 @@ def test_fd(self): self.assertEqual(entry.stat(), st) st = os.stat(entry.name, dir_fd=fd, follow_symlinks=False) self.assertEqual(entry.stat(follow_symlinks=False), st) - finally: - os.close(fd) def test_empty_path(self): self.assertRaises(FileNotFoundError, os.scandir, '') diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index 890b7e04da364..a8db30679af8d 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -18,6 +18,7 @@ import unittest import warnings import textwrap +from contextlib import contextmanager _DUMMY_SYMLINK = os.path.join(tempfile.gettempdir(), support.TESTFN + '-dummy-symlink') @@ -1055,176 +1056,6 @@ def test_getgroups(self): symdiff = idg_groups.symmetric_difference(posix.getgroups()) self.assertTrue(not symdiff or symdiff == {posix.getegid()}) - # tests for the posix *at functions follow - - @unittest.skipUnless(os.access in os.supports_dir_fd, "test needs dir_fd support for os.access()") - def test_access_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - self.assertTrue(posix.access(support.TESTFN, os.R_OK, dir_fd=f)) - finally: - posix.close(f) - - @unittest.skipUnless(os.chmod in os.supports_dir_fd, "test needs dir_fd support in os.chmod()") - def test_chmod_dir_fd(self): - os.chmod(support.TESTFN, stat.S_IRUSR) - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.chmod(support.TESTFN, stat.S_IRUSR | stat.S_IWUSR, dir_fd=f) - - s = posix.stat(support.TESTFN) - self.assertEqual(s[0] & stat.S_IRWXU, stat.S_IRUSR | stat.S_IWUSR) - finally: - posix.close(f) - - @unittest.skipUnless(os.chown in os.supports_dir_fd, "test needs dir_fd support in os.chown()") - def test_chown_dir_fd(self): - support.unlink(support.TESTFN) - support.create_empty_file(support.TESTFN) - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.chown(support.TESTFN, os.getuid(), os.getgid(), dir_fd=f) - finally: - posix.close(f) - - @unittest.skipUnless(os.stat in os.supports_dir_fd, "test needs dir_fd support in os.stat()") - def test_stat_dir_fd(self): - support.unlink(support.TESTFN) - with open(support.TESTFN, 'w') as outfile: - outfile.write("testline\n") - - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - s1 = posix.stat(support.TESTFN) - s2 = posix.stat(support.TESTFN, dir_fd=f) - self.assertEqual(s1, s2) - s2 = posix.stat(support.TESTFN, dir_fd=None) - self.assertEqual(s1, s2) - self.assertRaisesRegex(TypeError, 'should be integer or None, not', - posix.stat, support.TESTFN, dir_fd=posix.getcwd()) - self.assertRaisesRegex(TypeError, 'should be integer or None, not', - posix.stat, support.TESTFN, 
dir_fd=float(f)) - self.assertRaises(OverflowError, - posix.stat, support.TESTFN, dir_fd=10**20) - finally: - posix.close(f) - - @unittest.skipUnless(os.utime in os.supports_dir_fd, "test needs dir_fd support in os.utime()") - def test_utime_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - now = time.time() - posix.utime(support.TESTFN, None, dir_fd=f) - posix.utime(support.TESTFN, dir_fd=f) - self.assertRaises(TypeError, posix.utime, support.TESTFN, now, dir_fd=f) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, None), dir_fd=f) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (now, None), dir_fd=f) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, now), dir_fd=f) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (now, "x"), dir_fd=f) - posix.utime(support.TESTFN, (int(now), int(now)), dir_fd=f) - posix.utime(support.TESTFN, (now, now), dir_fd=f) - posix.utime(support.TESTFN, - (int(now), int((now - int(now)) * 1e9)), dir_fd=f) - posix.utime(support.TESTFN, dir_fd=f, - times=(int(now), int((now - int(now)) * 1e9))) - - # try dir_fd and follow_symlinks together - if os.utime in os.supports_follow_symlinks: - try: - posix.utime(support.TESTFN, follow_symlinks=False, dir_fd=f) - except ValueError: - # whoops! using both together not supported on this platform. - pass - - finally: - posix.close(f) - - @unittest.skipUnless(os.link in os.supports_dir_fd, "test needs dir_fd support in os.link()") - def test_link_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.link(support.TESTFN, support.TESTFN + 'link', src_dir_fd=f, dst_dir_fd=f) - except PermissionError as e: - self.skipTest('posix.link(): %s' % e) - else: - # should have same inodes - self.assertEqual(posix.stat(support.TESTFN)[1], - posix.stat(support.TESTFN + 'link')[1]) - finally: - posix.close(f) - support.unlink(support.TESTFN + 'link') - - @unittest.skipUnless(os.mkdir in os.supports_dir_fd, "test needs dir_fd support in os.mkdir()") - def test_mkdir_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.mkdir(support.TESTFN + 'dir', dir_fd=f) - posix.stat(support.TESTFN + 'dir') # should not raise exception - finally: - posix.close(f) - support.rmtree(support.TESTFN + 'dir') - - @unittest.skipUnless((os.mknod in os.supports_dir_fd) and hasattr(stat, 'S_IFIFO'), - "test requires both stat.S_IFIFO and dir_fd support for os.mknod()") - def test_mknod_dir_fd(self): - # Test using mknodat() to create a FIFO (the only use specified - # by POSIX). - support.unlink(support.TESTFN) - mode = stat.S_IFIFO | stat.S_IRUSR | stat.S_IWUSR - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.mknod(support.TESTFN, mode, 0, dir_fd=f) - except OSError as e: - # Some old systems don't allow unprivileged users to use - # mknod(), or only support creating device nodes. 
- self.assertIn(e.errno, (errno.EPERM, errno.EINVAL, errno.EACCES)) - else: - self.assertTrue(stat.S_ISFIFO(posix.stat(support.TESTFN).st_mode)) - finally: - posix.close(f) - - @unittest.skipUnless(os.open in os.supports_dir_fd, "test needs dir_fd support in os.open()") - def test_open_dir_fd(self): - support.unlink(support.TESTFN) - with open(support.TESTFN, 'w') as outfile: - outfile.write("testline\n") - a = posix.open(posix.getcwd(), posix.O_RDONLY) - b = posix.open(support.TESTFN, posix.O_RDONLY, dir_fd=a) - try: - res = posix.read(b, 9).decode(encoding="utf-8") - self.assertEqual("testline\n", res) - finally: - posix.close(a) - posix.close(b) - - @unittest.skipUnless(os.readlink in os.supports_dir_fd, "test needs dir_fd support in os.readlink()") - def test_readlink_dir_fd(self): - os.symlink(support.TESTFN, support.TESTFN + 'link') - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - self.assertEqual(posix.readlink(support.TESTFN + 'link'), - posix.readlink(support.TESTFN + 'link', dir_fd=f)) - finally: - support.unlink(support.TESTFN + 'link') - posix.close(f) - - @unittest.skipUnless(os.rename in os.supports_dir_fd, "test needs dir_fd support in os.rename()") - def test_rename_dir_fd(self): - support.unlink(support.TESTFN) - support.create_empty_file(support.TESTFN + 'ren') - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.rename(support.TESTFN + 'ren', support.TESTFN, src_dir_fd=f, dst_dir_fd=f) - except: - posix.rename(support.TESTFN + 'ren', support.TESTFN) - raise - else: - posix.stat(support.TESTFN) # should not raise exception - finally: - posix.close(f) - @unittest.skipUnless(hasattr(signal, 'SIGCHLD'), 'CLD_XXXX be placed in si_code for a SIGCHLD signal') @unittest.skipUnless(hasattr(os, 'waitid_result'), "test needs os.waitid_result") def test_cld_xxxx_constants(self): @@ -1235,45 +1066,6 @@ def test_cld_xxxx_constants(self): os.CLD_STOPPED os.CLD_CONTINUED - @unittest.skipUnless(os.symlink in os.supports_dir_fd, "test needs dir_fd support in os.symlink()") - def test_symlink_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - posix.symlink(support.TESTFN, support.TESTFN + 'link', dir_fd=f) - self.assertEqual(posix.readlink(support.TESTFN + 'link'), support.TESTFN) - finally: - posix.close(f) - support.unlink(support.TESTFN + 'link') - - @unittest.skipUnless(os.unlink in os.supports_dir_fd, "test needs dir_fd support in os.unlink()") - def test_unlink_dir_fd(self): - f = posix.open(posix.getcwd(), posix.O_RDONLY) - support.create_empty_file(support.TESTFN + 'del') - posix.stat(support.TESTFN + 'del') # should not raise exception - try: - posix.unlink(support.TESTFN + 'del', dir_fd=f) - except: - support.unlink(support.TESTFN + 'del') - raise - else: - self.assertRaises(OSError, posix.stat, support.TESTFN + 'link') - finally: - posix.close(f) - - @unittest.skipUnless(os.mkfifo in os.supports_dir_fd, "test needs dir_fd support in os.mkfifo()") - def test_mkfifo_dir_fd(self): - support.unlink(support.TESTFN) - f = posix.open(posix.getcwd(), posix.O_RDONLY) - try: - try: - posix.mkfifo(support.TESTFN, - stat.S_IRUSR | stat.S_IWUSR, dir_fd=f) - except PermissionError as e: - self.skipTest('posix.mkfifo(): %s' % e) - self.assertTrue(stat.S_ISFIFO(posix.stat(support.TESTFN).st_mode)) - finally: - posix.close(f) - requires_sched_h = unittest.skipUnless(hasattr(posix, 'sched_yield'), "don't have scheduling support") requires_sched_affinity = unittest.skipUnless(hasattr(posix, 'sched_setaffinity'), @@ -1480,6 +1272,200 @@ def 
test_pidfd_open(self): self.assertEqual(cm.exception.errno, errno.EINVAL) os.close(os.pidfd_open(os.getpid(), 0)) + +# tests for the posix *at functions follow +class TestPosixDirFd(unittest.TestCase): + count = 0 + + @contextmanager + def prepare(self): + TestPosixDirFd.count += 1 + name = f'{support.TESTFN}_{self.count}' + base_dir = f'{support.TESTFN}_{self.count}base' + posix.mkdir(base_dir) + self.addCleanup(posix.rmdir, base_dir) + fullname = os.path.join(base_dir, name) + assert not os.path.exists(fullname) + with support.open_dir_fd(base_dir) as dir_fd: + yield (dir_fd, name, fullname) + + @contextmanager + def prepare_file(self): + with self.prepare() as (dir_fd, name, fullname): + support.create_empty_file(fullname) + self.addCleanup(posix.unlink, fullname) + yield (dir_fd, name, fullname) + + @unittest.skipUnless(os.access in os.supports_dir_fd, "test needs dir_fd support for os.access()") + def test_access_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + self.assertTrue(posix.access(name, os.R_OK, dir_fd=dir_fd)) + + @unittest.skipUnless(os.chmod in os.supports_dir_fd, "test needs dir_fd support in os.chmod()") + def test_chmod_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + posix.chmod(fullname, stat.S_IRUSR) + posix.chmod(name, stat.S_IRUSR | stat.S_IWUSR, dir_fd=dir_fd) + s = posix.stat(fullname) + self.assertEqual(s.st_mode & stat.S_IRWXU, + stat.S_IRUSR | stat.S_IWUSR) + + @unittest.skipUnless(hasattr(os, 'chown') and (os.chown in os.supports_dir_fd), + "test needs dir_fd support in os.chown()") + def test_chown_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + posix.chown(name, os.getuid(), os.getgid(), dir_fd=dir_fd) + + @unittest.skipUnless(os.stat in os.supports_dir_fd, "test needs dir_fd support in os.stat()") + def test_stat_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + with open(fullname, 'w') as outfile: + outfile.write("testline\n") + self.addCleanup(posix.unlink, fullname) + + s1 = posix.stat(fullname) + s2 = posix.stat(name, dir_fd=dir_fd) + self.assertEqual(s1, s2) + s2 = posix.stat(fullname, dir_fd=None) + self.assertEqual(s1, s2) + + self.assertRaisesRegex(TypeError, 'should be integer or None, not', + posix.stat, name, dir_fd=posix.getcwd()) + self.assertRaisesRegex(TypeError, 'should be integer or None, not', + posix.stat, name, dir_fd=float(dir_fd)) + self.assertRaises(OverflowError, + posix.stat, name, dir_fd=10**20) + + @unittest.skipUnless(os.utime in os.supports_dir_fd, "test needs dir_fd support in os.utime()") + def test_utime_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname): + now = time.time() + posix.utime(name, None, dir_fd=dir_fd) + posix.utime(name, dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + now, dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (None, None), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (now, None), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (None, now), dir_fd=dir_fd) + self.assertRaises(TypeError, posix.utime, name, + (now, "x"), dir_fd=dir_fd) + posix.utime(name, (int(now), int(now)), dir_fd=dir_fd) + posix.utime(name, (now, now), dir_fd=dir_fd) + posix.utime(name, + (int(now), int((now - int(now)) * 1e9)), dir_fd=dir_fd) + posix.utime(name, dir_fd=dir_fd, + times=(int(now), int((now - int(now)) * 1e9))) + + # try dir_fd and follow_symlinks together + if os.utime in os.supports_follow_symlinks: + try: + posix.utime(name, 
follow_symlinks=False, dir_fd=dir_fd) + except ValueError: + # whoops! using both together not supported on this platform. + pass + + @unittest.skipUnless(os.link in os.supports_dir_fd, "test needs dir_fd support in os.link()") + def test_link_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname), \ + self.prepare() as (dir_fd2, linkname, fulllinkname): + try: + posix.link(name, linkname, src_dir_fd=dir_fd, dst_dir_fd=dir_fd2) + except PermissionError as e: + self.skipTest('posix.link(): %s' % e) + self.addCleanup(posix.unlink, fulllinkname) + # should have same inodes + self.assertEqual(posix.stat(fullname)[1], + posix.stat(fulllinkname)[1]) + + @unittest.skipUnless(os.mkdir in os.supports_dir_fd, "test needs dir_fd support in os.mkdir()") + def test_mkdir_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + posix.mkdir(name, dir_fd=dir_fd) + self.addCleanup(posix.rmdir, fullname) + posix.stat(fullname) # should not raise exception + + @unittest.skipUnless(hasattr(os, 'mknod') + and (os.mknod in os.supports_dir_fd) + and hasattr(stat, 'S_IFIFO'), + "test requires both stat.S_IFIFO and dir_fd support for os.mknod()") + def test_mknod_dir_fd(self): + # Test using mknodat() to create a FIFO (the only use specified + # by POSIX). + with self.prepare() as (dir_fd, name, fullname): + mode = stat.S_IFIFO | stat.S_IRUSR | stat.S_IWUSR + try: + posix.mknod(name, mode, 0, dir_fd=dir_fd) + except OSError as e: + # Some old systems don't allow unprivileged users to use + # mknod(), or only support creating device nodes. + self.assertIn(e.errno, (errno.EPERM, errno.EINVAL, errno.EACCES)) + else: + self.addCleanup(posix.unlink, fullname) + self.assertTrue(stat.S_ISFIFO(posix.stat(fullname).st_mode)) + + @unittest.skipUnless(os.open in os.supports_dir_fd, "test needs dir_fd support in os.open()") + def test_open_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + with open(fullname, 'wb') as outfile: + outfile.write(b"testline\n") + self.addCleanup(posix.unlink, fullname) + fd = posix.open(name, posix.O_RDONLY, dir_fd=dir_fd) + try: + res = posix.read(fd, 9) + self.assertEqual(b"testline\n", res) + finally: + posix.close(fd) + + @unittest.skipUnless(hasattr(os, 'readlink') and (os.readlink in os.supports_dir_fd), + "test needs dir_fd support in os.readlink()") + def test_readlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + os.symlink('symlink', fullname) + self.addCleanup(posix.unlink, fullname) + self.assertEqual(posix.readlink(name, dir_fd=dir_fd), 'symlink') + + @unittest.skipUnless(os.rename in os.supports_dir_fd, "test needs dir_fd support in os.rename()") + def test_rename_dir_fd(self): + with self.prepare_file() as (dir_fd, name, fullname), \ + self.prepare() as (dir_fd2, name2, fullname2): + posix.rename(name, name2, + src_dir_fd=dir_fd, dst_dir_fd=dir_fd2) + posix.stat(fullname2) # should not raise exception + posix.rename(fullname2, fullname) + + @unittest.skipUnless(os.symlink in os.supports_dir_fd, "test needs dir_fd support in os.symlink()") + def test_symlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + posix.symlink('symlink', name, dir_fd=dir_fd) + self.addCleanup(posix.unlink, fullname) + self.assertEqual(posix.readlink(fullname), 'symlink') + + @unittest.skipUnless(os.unlink in os.supports_dir_fd, "test needs dir_fd support in os.unlink()") + def test_unlink_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + support.create_empty_file(fullname) + posix.stat(fullname) # should not 
raise exception + try: + posix.unlink(name, dir_fd=dir_fd) + self.assertRaises(OSError, posix.stat, fullname) + except: + self.addCleanup(posix.unlink, fullname) + raise + + @unittest.skipUnless(os.mkfifo in os.supports_dir_fd, "test needs dir_fd support in os.mkfifo()") + def test_mkfifo_dir_fd(self): + with self.prepare() as (dir_fd, name, fullname): + try: + posix.mkfifo(name, stat.S_IRUSR | stat.S_IWUSR, dir_fd=dir_fd) + except PermissionError as e: + self.skipTest('posix.mkfifo(): %s' % e) + self.addCleanup(posix.unlink, fullname) + self.assertTrue(stat.S_ISFIFO(posix.stat(fullname).st_mode)) + + class PosixGroupsTester(unittest.TestCase): def setUp(self): From webhook-mailer at python.org Mon Jan 24 06:08:58 2022 From: webhook-mailer at python.org (markshannon) Date: Mon, 24 Jan 2022 11:08:58 -0000 Subject: [Python-checkins] bpo-43683: Streamline YIELD_VALUE and SEND (GH-30723) Message-ID: https://github.com/python/cpython/commit/0367a36fdc36b9c909c4d5acf7cde6ceeec0ba69 commit: 0367a36fdc36b9c909c4d5acf7cde6ceeec0ba69 branch: main author: Mark Shannon committer: markshannon date: 2022-01-24T11:08:53Z summary: bpo-43683: Streamline YIELD_VALUE and SEND (GH-30723) * Split YIELD_VALUE into ASYNC_GEN_WRAP; YIELD_VALUE for async generators. * Split SEND into SEND; YIELD_VALUE. * Document new opcodes. files: A Misc/NEWS.d/next/Core and Builtins/2022-01-20-17-13-49.bpo-43683.BqQ26Z.rst M Doc/library/dis.rst M Include/opcode.h M Lib/importlib/_bootstrap_external.py M Lib/opcode.py M Objects/genobject.c M Python/ceval.c M Python/compile.c M Python/opcode_targets.h diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index c9a4768618702..ddba668088e4a 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -1233,6 +1233,22 @@ All of the following opcodes use their arguments. .. versionadded:: 3.11 +.. opcode:: SEND + + Sends ``None`` to the sub-generator of this generator. + Used in ``yield from`` and ``await`` statements. + + .. versionadded:: 3.11 + + +.. opcode:: ASYNC_GEN_WRAP + + Wraps the value on top of the stack in an ``async_generator_wrapped_value``. + Used to yield in async generators. + + .. versionadded:: 3.11 + + .. opcode:: HAVE_ARGUMENT This is not really an opcode. 
It identifies the dividing line between diff --git a/Include/opcode.h b/Include/opcode.h index c0686bd2249ce..985758d8fdbf2 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -44,6 +44,7 @@ extern "C" { #define IMPORT_STAR 84 #define SETUP_ANNOTATIONS 85 #define YIELD_VALUE 86 +#define ASYNC_GEN_WRAP 87 #define PREP_RERAISE_STAR 88 #define POP_EXCEPT 89 #define HAVE_ARGUMENT 90 @@ -165,12 +166,12 @@ extern "C" { #define STORE_ATTR_ADAPTIVE 79 #define STORE_ATTR_INSTANCE_VALUE 80 #define STORE_ATTR_SLOT 81 -#define STORE_ATTR_WITH_HINT 87 -#define LOAD_FAST__LOAD_FAST 131 -#define STORE_FAST__LOAD_FAST 140 -#define LOAD_FAST__LOAD_CONST 141 -#define LOAD_CONST__LOAD_FAST 143 -#define STORE_FAST__STORE_FAST 150 +#define STORE_ATTR_WITH_HINT 131 +#define LOAD_FAST__LOAD_FAST 140 +#define STORE_FAST__LOAD_FAST 141 +#define LOAD_FAST__LOAD_CONST 143 +#define LOAD_CONST__LOAD_FAST 150 +#define STORE_FAST__STORE_FAST 153 #define DO_TRACING 255 #ifdef NEED_OPCODE_JUMP_TABLES static uint32_t _PyOpcode_RelativeJump[8] = { diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 1560e60dbb925..cd4f69c7aa149 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -381,6 +381,7 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3473 (Add POP_JUMP_IF_NOT_NONE/POP_JUMP_IF_NONE opcodes) # Python 3.11a4 3474 (Add RESUME opcode) # Python 3.11a5 3475 (Add RETURN_GENERATOR opcode) +# Python 3.11a5 3476 (Add ASYNC_GEN_WRAP opcode) # Python 3.12 will start with magic number 3500 @@ -394,7 +395,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. -MAGIC_NUMBER = (3475).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3476).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/opcode.py b/Lib/opcode.py index 73b41d22df2fc..1bd48eee8549a 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -101,7 +101,7 @@ def jabs_op(name, op): def_op('IMPORT_STAR', 84) def_op('SETUP_ANNOTATIONS', 85) def_op('YIELD_VALUE', 86) - +def_op('ASYNC_GEN_WRAP', 87) def_op('PREP_RERAISE_STAR', 88) def_op('POP_EXCEPT', 89) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-20-17-13-49.bpo-43683.BqQ26Z.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-20-17-13-49.bpo-43683.BqQ26Z.rst new file mode 100644 index 0000000000000..737f44f296cb1 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-20-17-13-49.bpo-43683.BqQ26Z.rst @@ -0,0 +1,3 @@ +Add ASYNC_GEN_WRAP opcode to wrap the value to be yielded in async +generators. Removes the need to special case async generators in the +``YIELD_VALUE`` instruction. diff --git a/Objects/genobject.c b/Objects/genobject.c index 46b019051a064..b2d402eba6333 100644 --- a/Objects/genobject.c +++ b/Objects/genobject.c @@ -353,7 +353,7 @@ _PyGen_yf(PyGenObject *gen) PyObject *bytecode = gen->gi_code->co_code; unsigned char *code = (unsigned char *)PyBytes_AS_STRING(bytecode); - if (frame->f_lasti < 0) { + if (frame->f_lasti < 1) { /* Return immediately if the frame didn't start yet. 
SEND always come after LOAD_CONST: a code object should not start with SEND */ @@ -361,7 +361,7 @@ _PyGen_yf(PyGenObject *gen) return NULL; } - if (code[frame->f_lasti*sizeof(_Py_CODEUNIT)] != SEND || frame->stacktop < 0) + if (code[(frame->f_lasti-1)*sizeof(_Py_CODEUNIT)] != SEND || frame->stacktop < 0) return NULL; yf = _PyFrame_StackPeek(frame); Py_INCREF(yf); @@ -488,6 +488,8 @@ _gen_throw(PyGenObject *gen, int close_on_genexit, assert(frame->f_lasti >= 0); PyObject *bytecode = gen->gi_code->co_code; unsigned char *code = (unsigned char *)PyBytes_AS_STRING(bytecode); + /* Backup to SEND */ + frame->f_lasti--; assert(code[frame->f_lasti*sizeof(_Py_CODEUNIT)] == SEND); int jump = code[frame->f_lasti*sizeof(_Py_CODEUNIT)+1]; frame->f_lasti += jump; diff --git a/Python/ceval.c b/Python/ceval.c index 9aaddd99edacf..2c524ab7e0422 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -2650,32 +2650,25 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } assert (gen_status == PYGEN_NEXT); assert (retval != NULL); - frame->f_state = FRAME_SUSPENDED; - _PyFrame_SetStackPointer(frame, stack_pointer); - TRACE_FUNCTION_EXIT(); - DTRACE_FUNCTION_EXIT(); - _Py_LeaveRecursiveCall(tstate); - /* Restore previous cframe and return. */ - tstate->cframe = cframe.previous; - tstate->cframe->use_tracing = cframe.use_tracing; - assert(tstate->cframe->current_frame == frame->previous); - assert(!_PyErr_Occurred(tstate)); - return retval; + PUSH(retval); + DISPATCH(); + } + + TARGET(ASYNC_GEN_WRAP) { + PyObject *v = TOP(); + assert(frame->f_code->co_flags & CO_ASYNC_GENERATOR); + PyObject *w = _PyAsyncGenValueWrapperNew(v); + if (w == NULL) { + goto error; + } + SET_TOP(w); + Py_DECREF(v); + DISPATCH(); } TARGET(YIELD_VALUE) { assert(frame->is_entry); PyObject *retval = POP(); - - if (frame->f_code->co_flags & CO_ASYNC_GENERATOR) { - PyObject *w = _PyAsyncGenValueWrapperNew(retval); - Py_DECREF(retval); - if (w == NULL) { - retval = NULL; - goto error; - } - retval = w; - } frame->f_state = FRAME_SUSPENDED; _PyFrame_SetStackPointer(frame, stack_pointer); TRACE_FUNCTION_EXIT(); diff --git a/Python/compile.c b/Python/compile.c index 5d32959db3b65..feb9fcac51254 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -910,6 +910,7 @@ stack_effect(int opcode, int oparg, int jump) return -1; case SETUP_ANNOTATIONS: return 0; + case ASYNC_GEN_WRAP: case YIELD_VALUE: return 0; case POP_BLOCK: @@ -1541,6 +1542,9 @@ compiler_addop_j_noline(struct compiler *c, int opcode, basicblock *b) #define POP_EXCEPT_AND_RERAISE(C) \ RETURN_IF_FALSE(compiler_pop_except_and_reraise((C))) +#define ADDOP_YIELD(C) \ + RETURN_IF_FALSE(addop_yield(C)) + #define VISIT(C, TYPE, V) {\ if (!compiler_visit_ ## TYPE((C), (V))) \ return 0; \ @@ -1844,6 +1848,7 @@ compiler_add_yield_from(struct compiler *c, int await) compiler_use_next_block(c, start); ADDOP_JUMP(c, SEND, exit); compiler_use_next_block(c, resume); + ADDOP(c, YIELD_VALUE); ADDOP_I(c, RESUME, await ? 
3 : 2); ADDOP_JUMP(c, JUMP_NO_INTERRUPT, start); compiler_use_next_block(c, exit); @@ -4094,6 +4099,17 @@ addop_binary(struct compiler *c, operator_ty binop, bool inplace) return 1; } + +static int +addop_yield(struct compiler *c) { + if (c->u->u_ste->ste_generator && c->u->u_ste->ste_coroutine) { + ADDOP(c, ASYNC_GEN_WRAP); + } + ADDOP(c, YIELD_VALUE); + ADDOP_I(c, RESUME, 1); + return 1; +} + static int compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx) { @@ -5144,8 +5160,7 @@ compiler_sync_comprehension_generator(struct compiler *c, switch (type) { case COMP_GENEXP: VISIT(c, expr, elt); - ADDOP(c, YIELD_VALUE); - ADDOP_I(c, RESUME, 1); + ADDOP_YIELD(c); ADDOP(c, POP_TOP); break; case COMP_LISTCOMP: @@ -5243,8 +5258,7 @@ compiler_async_comprehension_generator(struct compiler *c, switch (type) { case COMP_GENEXP: VISIT(c, expr, elt); - ADDOP(c, YIELD_VALUE); - ADDOP_I(c, RESUME, 1); + ADDOP_YIELD(c); ADDOP(c, POP_TOP); break; case COMP_LISTCOMP: @@ -5714,8 +5728,7 @@ compiler_visit_expr1(struct compiler *c, expr_ty e) else { ADDOP_LOAD_CONST(c, Py_None); } - ADDOP(c, YIELD_VALUE); - ADDOP_I(c, RESUME, 1); + ADDOP_YIELD(c); break; case YieldFrom_kind: if (c->u->u_ste->ste_type != FunctionBlock) diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index 11ac0e975fdcd..c19cd0e88468a 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -86,7 +86,7 @@ static void *opcode_targets[256] = { &&TARGET_IMPORT_STAR, &&TARGET_SETUP_ANNOTATIONS, &&TARGET_YIELD_VALUE, - &&TARGET_STORE_ATTR_WITH_HINT, + &&TARGET_ASYNC_GEN_WRAP, &&TARGET_PREP_RERAISE_STAR, &&TARGET_POP_EXCEPT, &&TARGET_STORE_NAME, @@ -130,7 +130,7 @@ static void *opcode_targets[256] = { &&TARGET_POP_JUMP_IF_NOT_NONE, &&TARGET_POP_JUMP_IF_NONE, &&TARGET_RAISE_VARARGS, - &&TARGET_LOAD_FAST__LOAD_FAST, + &&TARGET_STORE_ATTR_WITH_HINT, &&TARGET_MAKE_FUNCTION, &&TARGET_BUILD_SLICE, &&TARGET_JUMP_NO_INTERRUPT, @@ -139,20 +139,20 @@ static void *opcode_targets[256] = { &&TARGET_LOAD_DEREF, &&TARGET_STORE_DEREF, &&TARGET_DELETE_DEREF, + &&TARGET_LOAD_FAST__LOAD_FAST, &&TARGET_STORE_FAST__LOAD_FAST, - &&TARGET_LOAD_FAST__LOAD_CONST, &&TARGET_CALL_FUNCTION_EX, - &&TARGET_LOAD_CONST__LOAD_FAST, + &&TARGET_LOAD_FAST__LOAD_CONST, &&TARGET_EXTENDED_ARG, &&TARGET_LIST_APPEND, &&TARGET_SET_ADD, &&TARGET_MAP_ADD, &&TARGET_LOAD_CLASSDEREF, &&TARGET_COPY_FREE_VARS, - &&TARGET_STORE_FAST__STORE_FAST, + &&TARGET_LOAD_CONST__LOAD_FAST, &&TARGET_RESUME, &&TARGET_MATCH_CLASS, - &&_unknown_opcode, + &&TARGET_STORE_FAST__STORE_FAST, &&_unknown_opcode, &&TARGET_FORMAT_VALUE, &&TARGET_BUILD_CONST_KEY_MAP, From webhook-mailer at python.org Mon Jan 24 06:09:26 2022 From: webhook-mailer at python.org (isidentical) Date: Mon, 24 Jan 2022 11:09:26 -0000 Subject: [Python-checkins] bpo-46422: use `dis.Positions` in `dis.Instruction` (GH-30716) Message-ID: https://github.com/python/cpython/commit/58f3d980989c7346ad792d464c1d749dcec6af63 commit: 58f3d980989c7346ad792d464c1d749dcec6af63 branch: main author: Nikita Sobolev committer: isidentical date: 2022-01-24T14:09:20+03:00 summary: bpo-46422: use `dis.Positions` in `dis.Instruction` (GH-30716) Co-authored-by: Batuhan Taskaya files: A Misc/NEWS.d/next/Library/2022-01-20-10-35-50.bpo-46422.1UAEHL.rst M Doc/library/dis.rst M Lib/dis.py M Lib/test/test_dis.py diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index ddba668088e4a..793152d9d812c 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -316,8 +316,30 @@ details of bytecode instructions as 
:class:`Instruction` instances: ``True`` if other code jumps to here, otherwise ``False`` + + .. data:: positions + + :class:`dis.Positions` object holding the + start and end locations that are covered by this instruction. + .. versionadded:: 3.4 + .. versionchanged:: 3.11 + + Field ``positions`` is added. + + +.. class:: Positions + + In case the information is not available, some fields might be `None`. + + .. data:: lineno + .. data:: end_lineno + .. data:: col_offset + .. data:: end_col_offset + + .. versionadded:: 3.11 + The Python compiler currently generates the following bytecode instructions. diff --git a/Lib/dis.py b/Lib/dis.py index ac0c6e7f04c45..2462a8434e895 100644 --- a/Lib/dis.py +++ b/Lib/dis.py @@ -413,10 +413,7 @@ def _get_instructions_bytes(code, varname_from_oparg=None, is_jump_target = offset in labels argval = None argrepr = '' - try: - positions = next(co_positions) - except StopIteration: - positions = None + positions = Positions(*next(co_positions, ())) if arg is not None: # Set argval to the dereferenced value of the argument when # available, and argrepr to the string representation of argval. diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py index 19a4be2c4132b..ee9729ebabf4a 100644 --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -1313,6 +1313,12 @@ def test_co_positions(self): ] self.assertEqual(positions, expected) + named_positions = [ + (pos.lineno, pos.end_lineno, pos.col_offset, pos.end_col_offset) + for pos in positions + ] + self.assertEqual(named_positions, expected) + @requires_debug_ranges() def test_co_positions_missing_info(self): code = compile('x, y, z', '', 'exec') @@ -1320,25 +1326,27 @@ def test_co_positions_missing_info(self): actual = dis.get_instructions(code_without_column_table) for instruction in actual: with self.subTest(instruction=instruction): - start_line, end_line, start_offset, end_offset = instruction.positions + positions = instruction.positions + self.assertEqual(len(positions), 4) if instruction.opname == "RESUME": continue - assert start_line == 1 - assert end_line == 1 - assert start_offset is None - assert end_offset is None + self.assertEqual(positions.lineno, 1) + self.assertEqual(positions.end_lineno, 1) + self.assertIsNone(positions.col_offset) + self.assertIsNone(positions.end_col_offset) code_without_endline_table = code.replace(co_endlinetable=b'') actual = dis.get_instructions(code_without_endline_table) for instruction in actual: with self.subTest(instruction=instruction): - start_line, end_line, start_offset, end_offset = instruction.positions + positions = instruction.positions + self.assertEqual(len(positions), 4) if instruction.opname == "RESUME": continue - assert start_line == 1 - assert end_line is None - assert start_offset is not None - assert end_offset is not None + self.assertEqual(positions.lineno, 1) + self.assertIsNone(positions.end_lineno) + self.assertIsNotNone(positions.col_offset) + self.assertIsNotNone(positions.end_col_offset) # get_instructions has its own tests above, so can rely on it to validate # the object oriented API diff --git a/Misc/NEWS.d/next/Library/2022-01-20-10-35-50.bpo-46422.1UAEHL.rst b/Misc/NEWS.d/next/Library/2022-01-20-10-35-50.bpo-46422.1UAEHL.rst new file mode 100644 index 0000000000000..831f526359062 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-20-10-35-50.bpo-46422.1UAEHL.rst @@ -0,0 +1 @@ +Use ``dis.Positions`` in ``dis.Instruction`` instead of a regular ``tuple``. 
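A small sketch of how the new named field reads from user code (exact line and column values depend on the compiled source; the function here is illustrative):

    import dis

    def f(x):
        return x + 1

    for instr in dis.get_instructions(f):
        pos = instr.positions  # dis.Positions; fields may be None if info is missing
        print(instr.opname, pos.lineno, pos.col_offset, pos.end_col_offset)
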
From webhook-mailer at python.org Mon Jan 24 07:40:02 2022 From: webhook-mailer at python.org (miss-islington) Date: Mon, 24 Jan 2022 12:40:02 -0000 Subject: [Python-checkins] bpo-41906: Accept built filters in dictConfig (GH-30756) Message-ID: https://github.com/python/cpython/commit/d7c68639795a576ff58b6479c8bb34c113df3618 commit: d7c68639795a576ff58b6479c8bb34c113df3618 branch: main author: Mario Corchero committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-24T04:39:50-08:00 summary: bpo-41906: Accept built filters in dictConfig (GH-30756) When configuring the logging stack, accept already built filters (or just callables) in the filters array of loggers and handlers. This facilitates passing quick callables as filters. Automerge-Triggered-By: GH:vsajip files: A Misc/NEWS.d/next/Library/2022-01-21-18-19-45.bpo-41906.YBaquj.rst M Doc/library/logging.config.rst M Lib/logging/config.py M Lib/test/test_logging.py diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index a1b8dc755ba6b..c979961a221c9 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -288,6 +288,9 @@ otherwise, the context is used to determine what to instantiate. * ``filters`` (optional). A list of ids of the filters for this handler. + .. versionchanged:: 3.11 + ``filters`` can take filter instances in addition to ids. + All *other* keys are passed through as keyword arguments to the handler's constructor. For example, given the snippet: @@ -326,6 +329,9 @@ otherwise, the context is used to determine what to instantiate. * ``filters`` (optional). A list of ids of the filters for this logger. + .. versionchanged:: 3.11 + ``filters`` can take filter instances in addition to ids. + * ``handlers`` (optional). A list of ids of the handlers for this logger. @@ -524,6 +530,10 @@ valid keyword parameter name, and so will not clash with the names of the keyword arguments used in the call. The ``'()'`` also serves as a mnemonic that the corresponding value is a callable. + .. versionchanged:: 3.11 + The ``filters`` member of ``handlers`` and ``loggers`` can take + filter instances in addition to ids. + .. 
_logging-config-dict-externalobj: diff --git a/Lib/logging/config.py b/Lib/logging/config.py index 9bc07eddd76b4..86a1e4eaf4cbc 100644 --- a/Lib/logging/config.py +++ b/Lib/logging/config.py @@ -694,7 +694,11 @@ def add_filters(self, filterer, filters): """Add filters to a filterer from a list of names.""" for f in filters: try: - filterer.addFilter(self.config['filters'][f]) + if callable(f) or callable(getattr(f, 'filter', None)): + filter_ = f + else: + filter_ = self.config['filters'][f] + filterer.addFilter(filter_) except Exception as e: raise ValueError('Unable to add filter %r' % f) from e diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index 7c38676012bab..4f3315161cf20 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -3447,6 +3447,44 @@ def emit(self, record): logging.info('some log') self.assertEqual(stderr.getvalue(), 'some log my_type\n') + def test_config_callable_filter_works(self): + def filter_(_): + return 1 + self.apply_config({ + "version": 1, "root": {"level": "DEBUG", "filters": [filter_]} + }) + assert logging.getLogger().filters[0] is filter_ + logging.getLogger().filters = [] + + def test_config_filter_works(self): + filter_ = logging.Filter("spam.eggs") + self.apply_config({ + "version": 1, "root": {"level": "DEBUG", "filters": [filter_]} + }) + assert logging.getLogger().filters[0] is filter_ + logging.getLogger().filters = [] + + def test_config_filter_method_works(self): + class FakeFilter: + def filter(self, _): + return 1 + filter_ = FakeFilter() + self.apply_config({ + "version": 1, "root": {"level": "DEBUG", "filters": [filter_]} + }) + assert logging.getLogger().filters[0] is filter_ + logging.getLogger().filters = [] + + def test_invalid_type_raises(self): + class NotAFilter: pass + for filter_ in [None, 1, NotAFilter()]: + self.assertRaises( + ValueError, + self.apply_config, + {"version": 1, "root": {"level": "DEBUG", "filters": [filter_]}} + ) + + class ManagerTest(BaseTest): def test_manager_loggerclass(self): logged = [] diff --git a/Misc/NEWS.d/next/Library/2022-01-21-18-19-45.bpo-41906.YBaquj.rst b/Misc/NEWS.d/next/Library/2022-01-21-18-19-45.bpo-41906.YBaquj.rst new file mode 100644 index 0000000000000..be707130875f2 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-21-18-19-45.bpo-41906.YBaquj.rst @@ -0,0 +1,2 @@ +Support passing filter instances in the ``filters`` values of ``handlers`` and +``loggers`` in the dictionary passed to :func:`logging.config.dictConfig`. From webhook-mailer at python.org Mon Jan 24 10:43:04 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Mon, 24 Jan 2022 15:43:04 -0000 Subject: [Python-checkins] bpo-46470: remove unused branch from `typing._remove_dups_flatten` (GH-30780) Message-ID: https://github.com/python/cpython/commit/c144d9363107b50bcb0ccd01e7202e26a40c21f0 commit: c144d9363107b50bcb0ccd01e7202e26a40c21f0 branch: main author: Nikita Sobolev committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-24T23:42:54+08:00 summary: bpo-46470: remove unused branch from `typing._remove_dups_flatten` (GH-30780) The branch was a remnant of old 3.6 typing.Union implementation. 
files: A Misc/NEWS.d/next/Library/2022-01-22-13-17-35.bpo-46470.MnNhgU.rst M Lib/typing.py diff --git a/Lib/typing.py b/Lib/typing.py index 972b8ba24b27e..7ff546fbb6492 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -280,8 +280,6 @@ def _remove_dups_flatten(parameters): for p in parameters: if isinstance(p, (_UnionGenericAlias, types.UnionType)): params.extend(p.__args__) - elif isinstance(p, tuple) and len(p) > 0 and p[0] is Union: - params.extend(p[1:]) else: params.append(p) diff --git a/Misc/NEWS.d/next/Library/2022-01-22-13-17-35.bpo-46470.MnNhgU.rst b/Misc/NEWS.d/next/Library/2022-01-22-13-17-35.bpo-46470.MnNhgU.rst new file mode 100644 index 0000000000000..45b9cea3cd56a --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-22-13-17-35.bpo-46470.MnNhgU.rst @@ -0,0 +1 @@ +Remove unused branch from ``typing._remove_dups_flatten`` From webhook-mailer at python.org Mon Jan 24 11:18:45 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Mon, 24 Jan 2022 16:18:45 -0000 Subject: [Python-checkins] [3.10] bpo-46416: Allow direct invocation of `Lib/test/test_typing.py` (GH-30641) (GH-30697) Message-ID: https://github.com/python/cpython/commit/eaeb99468045b863d2dd3da3e3d1c3c9c78e1254 commit: eaeb99468045b863d2dd3da3e3d1c3c9c78e1254 branch: 3.10 author: Nikita Sobolev committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-25T00:18:38+08:00 summary: [3.10] bpo-46416: Allow direct invocation of `Lib/test/test_typing.py` (GH-30641) (GH-30697) Use `__name__` (cherry picked from commit 2792d6d18eab3efeb71e6397f88db86e610541f1) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index d6c55ef1de75f..acad35d18d5f3 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -5067,7 +5067,7 @@ def test_special_attrs2(self): ) self.assertEqual( SpecialAttrsTests.TypeName.__module__, - 'test.test_typing', + __name__, ) # NewTypes are picklable assuming correct qualname information. for proto in range(pickle.HIGHEST_PROTOCOL + 1): @@ -5081,7 +5081,7 @@ def test_special_attrs2(self): # __qualname__ is unnecessary. self.assertEqual(SpecialAttrsT.__name__, 'SpecialAttrsT') self.assertFalse(hasattr(SpecialAttrsT, '__qualname__')) - self.assertEqual(SpecialAttrsT.__module__, 'test.test_typing') + self.assertEqual(SpecialAttrsT.__module__, __name__) # Module-level type variables are picklable. for proto in range(pickle.HIGHEST_PROTOCOL + 1): s = pickle.dumps(SpecialAttrsT, proto) @@ -5090,7 +5090,7 @@ def test_special_attrs2(self): self.assertEqual(SpecialAttrsP.__name__, 'SpecialAttrsP') self.assertFalse(hasattr(SpecialAttrsP, '__qualname__')) - self.assertEqual(SpecialAttrsP.__module__, 'test.test_typing') + self.assertEqual(SpecialAttrsP.__module__, __name__) # Module-level ParamSpecs are picklable. 
for proto in range(pickle.HIGHEST_PROTOCOL + 1): s = pickle.dumps(SpecialAttrsP, proto) From webhook-mailer at python.org Mon Jan 24 12:45:15 2022 From: webhook-mailer at python.org (iritkatriel) Date: Mon, 24 Jan 2022 17:45:15 -0000 Subject: [Python-checkins] bpo-45711: move whatsnew entries which are incorrectly listed under New Features (GH-30849) Message-ID: https://github.com/python/cpython/commit/80e1def9ded2a1d017410394e50c88aa39135029 commit: 80e1def9ded2a1d017410394e50c88aa39135029 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-24T17:44:42Z summary: bpo-45711: move whatsnew entries which are incorrectly listed under New Features (GH-30849) files: M Doc/whatsnew/3.11.rst diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index ad421b16fbac3a..4328ee6a5030cd 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -648,6 +648,17 @@ Build Changes C API Changes ============= +* :c:func:`PyErr_SetExcInfo()` no longer uses the ``type`` and ``traceback`` + arguments, the interpreter now derives those values from the exception + instance (the ``value`` argument). The function still steals references + of all three arguments. + (Contributed by Irit Katriel in :issue:`45711`.) + +* :c:func:`PyErr_GetExcInfo()` now derives the ``type`` and ``traceback`` + fields of the result from the exception instance (the ``value`` field). + (Contributed by Irit Katriel in :issue:`45711`.) + + New Features ------------ @@ -662,16 +673,6 @@ New Features suspend and resume tracing and profiling. (Contributed by Victor Stinner in :issue:`43760`.) -* :c:func:`PyErr_SetExcInfo()` no longer uses the ``type`` and ``traceback`` - arguments, the interpreter now derives those values from the exception - instance (the ``value`` argument). The function still steals references - of all three arguments. - (Contributed by Irit Katriel in :issue:`45711`.) - -* :c:func:`PyErr_GetExcInfo()` now derives the ``type`` and ``traceback`` - fields of the result from the exception instance (the ``value`` field). - (Contributed by Irit Katriel in :issue:`45711`.) - * Added the :c:data:`Py_Version` constant which bears the same value as :c:macro:`PY_VERSION_HEX`. (Contributed by Gabriele N. Tornetta in :issue:`43931`.) 
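
Following up on the bpo-41906 dictConfig change a few messages above, here is a minimal sketch of passing a ready-made callable as a filter, assuming Python 3.11 or later; the handler name "console" and the `no_debug` helper are made up for illustration:

    import logging
    import logging.config

    def no_debug(record):
        # Any callable (or object with a filter() method) now works here;
        # return a falsey value to drop the record.
        return record.levelno > logging.DEBUG

    logging.config.dictConfig({
        "version": 1,
        "handlers": {
            "console": {"class": "logging.StreamHandler", "filters": [no_debug]},
        },
        "root": {"level": "DEBUG", "handlers": ["console"]},
    })

    logging.debug("dropped by the filter")
    logging.warning("passes through")

Before this change the `filters` lists in handler and logger configs could only name filter ids defined in the `filters` section of the same config dict.
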
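
As for the bpo-46470 cleanup above, the removed branch only handled a representation used by the 3.6-era Union implementation, so observable behaviour is unchanged. A quick illustration of the flattening that `_remove_dups_flatten` supports (ordinary typing behaviour, nothing new):

    from typing import Union

    # Nested unions are flattened and duplicate members removed.
    print(Union[int, Union[str, int]])   # typing.Union[int, str]
    print(Union[int, int])               # <class 'int'>
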
From webhook-mailer at python.org Mon Jan 24 16:02:13 2022 From: webhook-mailer at python.org (tiran) Date: Mon, 24 Jan 2022 21:02:13 -0000 Subject: [Python-checkins] bpo-40280: Get help() working and more (GH-30858) Message-ID: https://github.com/python/cpython/commit/d5fd438b38248a0d2e91898475369361e34f74b7 commit: d5fd438b38248a0d2e91898475369361e34f74b7 branch: main author: Christian Heimes committer: tiran date: 2022-01-24T22:02:01+01:00 summary: bpo-40280: Get help() working and more (GH-30858) files: M Tools/wasm/wasm_assets.py M configure M configure.ac diff --git a/Tools/wasm/wasm_assets.py b/Tools/wasm/wasm_assets.py index 6a4027184030f..bb1983af4c7a7 100755 --- a/Tools/wasm/wasm_assets.py +++ b/Tools/wasm/wasm_assets.py @@ -71,7 +71,11 @@ "smtplib.py", "socketserver.py", "telnetlib.py", - "urllib/", + # keep urllib.parse for pydoc + "urllib/error.py", + "urllib/request.py", + "urllib/response.py", + "urllib/robotparser.py", "wsgiref/", "xmlrpc/", # dbm / gdbm diff --git a/configure b/configure index f40d425371dc6..78e5a09927221 100755 --- a/configure +++ b/configure @@ -7684,7 +7684,7 @@ case $ac_sys_system/$ac_sys_emscripten_target in #( LDFLAGS_NODIST="$LDFLAGS_NODIST -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" CFLAGS_NODIST="$CFLAGS_NODIST -pthread" ;; #( - WASI) : + WASI/*) : $as_echo "#define _WASI_EMULATED_SIGNAL 1" >>confdefs.h @@ -21345,6 +21345,8 @@ $as_echo_n "checking for additional Modules/Setup files... " >&6; } case $ac_sys_system in #( Emscripten) : MODULES_SETUP_STDLIB=Modules/Setup.stdlib ;; #( + WASI) : + MODULES_SETUP_STDLIB=Modules/Setup.stdlib ;; #( *) : MODULES_SETUP_STDLIB= ;; @@ -23421,7 +23423,7 @@ $as_echo_n "checking for stdlib extension module _testimportmultiple... " >&6; } py_cv_module__testimportmultiple=n/a ;; #( *) : if test "$TEST_MODULES" = yes; then : - if true; then : + if test "$ac_cv_func_dlopen" = yes; then : py_cv_module__testimportmultiple=yes else py_cv_module__testimportmultiple=missing @@ -23457,7 +23459,7 @@ $as_echo_n "checking for stdlib extension module _testmultiphase... " >&6; } py_cv_module__testmultiphase=n/a ;; #( *) : if test "$TEST_MODULES" = yes; then : - if true; then : + if test "$ac_cv_func_dlopen" = yes; then : py_cv_module__testmultiphase=yes else py_cv_module__testmultiphase=missing @@ -23565,8 +23567,8 @@ $as_echo_n "checking for stdlib extension module xxlimited... " >&6; } *xxlimited*) : py_cv_module_xxlimited=n/a ;; #( *) : - if test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"; then : - if true; then : + if test "$with_trace_refs" = "no"; then : + if test "$ac_cv_func_dlopen" = yes; then : py_cv_module_xxlimited=yes else py_cv_module_xxlimited=missing @@ -23601,8 +23603,8 @@ $as_echo_n "checking for stdlib extension module xxlimited_35... 
" >&6; } *xxlimited_35*) : py_cv_module_xxlimited_35=n/a ;; #( *) : - if test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"; then : - if true; then : + if test "$with_trace_refs" = "no"; then : + if test "$ac_cv_func_dlopen" = yes; then : py_cv_module_xxlimited_35=yes else py_cv_module_xxlimited_35=missing diff --git a/configure.ac b/configure.ac index 8d140427de48d..a0fbe41c9ec59 100644 --- a/configure.ac +++ b/configure.ac @@ -1850,7 +1850,7 @@ AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], LDFLAGS_NODIST="$LDFLAGS_NODIST -s ASSERTIONS=1 -s ALLOW_MEMORY_GROWTH=1 -s NODERAWFS=1 -s EXIT_RUNTIME=1 -s USE_PTHREADS -s PROXY_TO_PTHREAD" CFLAGS_NODIST="$CFLAGS_NODIST -pthread" ], - [WASI], [ + [WASI/*], [ AC_DEFINE([_WASI_EMULATED_SIGNAL], [1], [Define to 1 if you want to emulate signals on WASI]) LIBS="$LIBS -lwasi-emulated-signal" echo "#define _WASI_EMULATED_SIGNAL 1" >> confdefs.h @@ -6407,6 +6407,7 @@ dnl Use Modules/Setup.stdlib as additional provider? AC_MSG_CHECKING([for additional Modules/Setup files]) AS_CASE([$ac_sys_system], [Emscripten], [MODULES_SETUP_STDLIB=Modules/Setup.stdlib], + [WASI], [MODULES_SETUP_STDLIB=Modules/Setup.stdlib], [MODULES_SETUP_STDLIB=] ) AC_MSG_RESULT([$MODULES_SETUP_STDLIB]) @@ -6599,16 +6600,16 @@ dnl test modules PY_STDLIB_MOD([_testcapi], [test "$TEST_MODULES" = yes]) PY_STDLIB_MOD([_testinternalcapi], [test "$TEST_MODULES" = yes]) PY_STDLIB_MOD([_testbuffer], [test "$TEST_MODULES" = yes]) -PY_STDLIB_MOD([_testimportmultiple], [test "$TEST_MODULES" = yes]) -PY_STDLIB_MOD([_testmultiphase], [test "$TEST_MODULES" = yes]) +PY_STDLIB_MOD([_testimportmultiple], [test "$TEST_MODULES" = yes], [test "$ac_cv_func_dlopen" = yes]) +PY_STDLIB_MOD([_testmultiphase], [test "$TEST_MODULES" = yes], [test "$ac_cv_func_dlopen" = yes]) PY_STDLIB_MOD([_xxtestfuzz], [test "$TEST_MODULES" = yes]) PY_STDLIB_MOD([_ctypes_test], [test "$TEST_MODULES" = yes], [], [], [-lm]) dnl Limited API template modules. dnl The limited C API is not compatible with the Py_TRACE_REFS macro. dnl Emscripten does not support shared libraries yet. 
-PY_STDLIB_MOD([xxlimited], [test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"]) -PY_STDLIB_MOD([xxlimited_35], [test "$with_trace_refs" = "no" -a "$ac_sys_system" != "Emscripten"]) +PY_STDLIB_MOD([xxlimited], [test "$with_trace_refs" = "no"], [test "$ac_cv_func_dlopen" = yes]) +PY_STDLIB_MOD([xxlimited_35], [test "$with_trace_refs" = "no"], [test "$ac_cv_func_dlopen" = yes]) # substitute multiline block, must come after last PY_STDLIB_MOD() AC_SUBST([MODULE_BLOCK]) From webhook-mailer at python.org Mon Jan 24 16:05:02 2022 From: webhook-mailer at python.org (pablogsal) Date: Mon, 24 Jan 2022 21:05:02 -0000 Subject: [Python-checkins] fixed flaky test (GH-30845) Message-ID: https://github.com/python/cpython/commit/1c705fda8f9902906edd26d46acb0433b0b098e1 commit: 1c705fda8f9902906edd26d46acb0433b0b098e1 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: pablogsal date: 2022-01-24T21:04:47Z summary: fixed flaky test (GH-30845) files: M Lib/test/test_asyncio/test_sendfile.py diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index c8bfa892c73fc..effca6644c062 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -92,9 +92,13 @@ async def wait_closed(self): class SendfileBase: - # 128 KiB plus small unaligned to buffer chunk - DATA = b"SendfileBaseData" * (1024 * 8 + 1) - + # 256 KiB plus small unaligned to buffer chunk + # Newer versions of Windows seems to have increased its internal + # buffer and tries to send as much of the data as it can as it + # has some form of buffering for this which is less than 256KiB + # on newer server versions and Windows 11. + # So DATA should be larger than 256 KiB to make this test reliable. + DATA = b"x" * (1024 * 256 + 1) # Reduce socket buffer size to test on relative small data sets. BUF_SIZE = 4 * 1024 # 4 KiB @@ -456,8 +460,6 @@ def test_sendfile_ssl_close_peer_after_receiving(self): # themselves). 
@unittest.skipIf(sys.platform.startswith('sunos'), "Doesn't work on Solaris") - @unittest.skipIf(sys.platform == "win32", - "It is flaky on Windows and needs to be fixed") # TODO: bpo-41682 def test_sendfile_close_peer_in_the_middle_of_receiving(self): srv_proto, cli_proto = self.prepare_sendfile(close_after=1024) with self.assertRaises(ConnectionError): From webhook-mailer at python.org Mon Jan 24 16:48:00 2022 From: webhook-mailer at python.org (iritkatriel) Date: Mon, 24 Jan 2022 21:48:00 -0000 Subject: [Python-checkins] bpo-46431: improve error message on invalid calls to BaseExceptionGroup.__new__ (GH-30854) Message-ID: https://github.com/python/cpython/commit/573b54515740ce51dcf2402038a9d953aa6c317f commit: 573b54515740ce51dcf2402038a9d953aa6c317f branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-24T21:47:40Z summary: bpo-46431: improve error message on invalid calls to BaseExceptionGroup.__new__ (GH-30854) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-24-16-58-01.bpo-46431.N6mKAx.rst M Lib/test/test_exception_group.py M Objects/exceptions.c diff --git a/Lib/test/test_exception_group.py b/Lib/test/test_exception_group.py index f0ae37741ab60..bbfce944c1765 100644 --- a/Lib/test/test_exception_group.py +++ b/Lib/test/test_exception_group.py @@ -22,7 +22,7 @@ def test_exception_group_is_generic_type(self): class BadConstructorArgs(unittest.TestCase): def test_bad_EG_construction__too_many_args(self): - MSG = 'function takes exactly 2 arguments' + MSG = 'BaseExceptionGroup.__new__\(\) takes exactly 2 arguments' with self.assertRaisesRegex(TypeError, MSG): ExceptionGroup('no errors') with self.assertRaisesRegex(TypeError, MSG): diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-24-16-58-01.bpo-46431.N6mKAx.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-16-58-01.bpo-46431.N6mKAx.rst new file mode 100644 index 0000000000000..3a2af9df03c38 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-16-58-01.bpo-46431.N6mKAx.rst @@ -0,0 +1 @@ +Improve error message on invalid calls to :meth:`BaseExceptionGroup.__new__`. \ No newline at end of file diff --git a/Objects/exceptions.c b/Objects/exceptions.c index 065503f59d62d..d8bfb31a6094a 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -685,7 +685,10 @@ BaseExceptionGroup_new(PyTypeObject *type, PyObject *args, PyObject *kwds) PyObject *message = NULL; PyObject *exceptions = NULL; - if (!PyArg_ParseTuple(args, "UO", &message, &exceptions)) { + if (!PyArg_ParseTuple(args, + "UO:BaseExceptionGroup.__new__", + &message, + &exceptions)) { return NULL; } From webhook-mailer at python.org Mon Jan 24 16:50:28 2022 From: webhook-mailer at python.org (iritkatriel) Date: Mon, 24 Jan 2022 21:50:28 -0000 Subject: [Python-checkins] bpo-46431: Add example of subclassing ExceptionGroup. Document the message and exceptions attributes (GH-30852) Message-ID: https://github.com/python/cpython/commit/b18fd54f8c27e4b2aac222e75ac58aa85e5a7988 commit: b18fd54f8c27e4b2aac222e75ac58aa85e5a7988 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-24T21:50:18Z summary: bpo-46431: Add example of subclassing ExceptionGroup. 
Document the message and exceptions attributes (GH-30852) files: M Doc/library/exceptions.rst diff --git a/Doc/library/exceptions.rst b/Doc/library/exceptions.rst index f90b6761154af..e093425cdc5ef 100644 --- a/Doc/library/exceptions.rst +++ b/Doc/library/exceptions.rst @@ -878,48 +878,72 @@ their subgroups based on the types of the contained exceptions. raises a :exc:`TypeError` if any contained exception is not an :exc:`Exception` subclass. + .. attribute:: message + + The ``msg`` argument to the constructor. This is a read-only attribute. + + .. attribute:: exceptions + + A tuple of the exceptions in the ``excs`` sequence given to the + constructor. This is a read-only attribute. + .. method:: subgroup(condition) - Returns an exception group that contains only the exceptions from the - current group that match *condition*, or ``None`` if the result is empty. + Returns an exception group that contains only the exceptions from the + current group that match *condition*, or ``None`` if the result is empty. - The condition can be either a function that accepts an exception and returns - true for those that should be in the subgroup, or it can be an exception type - or a tuple of exception types, which is used to check for a match using the - same check that is used in an ``except`` clause. + The condition can be either a function that accepts an exception and returns + true for those that should be in the subgroup, or it can be an exception type + or a tuple of exception types, which is used to check for a match using the + same check that is used in an ``except`` clause. - The nesting structure of the current exception is preserved in the result, - as are the values of its :attr:`message`, :attr:`__traceback__`, - :attr:`__cause__`, :attr:`__context__` and :attr:`__note__` fields. - Empty nested groups are omitted from the result. + The nesting structure of the current exception is preserved in the result, + as are the values of its :attr:`message`, :attr:`__traceback__`, + :attr:`__cause__`, :attr:`__context__` and :attr:`__note__` fields. + Empty nested groups are omitted from the result. - The condition is checked for all exceptions in the nested exception group, - including the top-level and any nested exception groups. If the condition is - true for such an exception group, it is included in the result in full. + The condition is checked for all exceptions in the nested exception group, + including the top-level and any nested exception groups. If the condition is + true for such an exception group, it is included in the result in full. .. method:: split(condition) - Like :meth:`subgroup`, but returns the pair ``(match, rest)`` where ``match`` - is ``subgroup(condition)`` and ``rest`` is the remaining non-matching - part. + Like :meth:`subgroup`, but returns the pair ``(match, rest)`` where ``match`` + is ``subgroup(condition)`` and ``rest`` is the remaining non-matching + part. .. method:: derive(excs) - Returns an exception group with the same :attr:`message`, - :attr:`__traceback__`, :attr:`__cause__`, :attr:`__context__` - and :attr:`__note__` but which wraps the exceptions in ``excs``. - - This method is used by :meth:`subgroup` and :meth:`split`. A - subclass needs to override it in order to make :meth:`subgroup` - and :meth:`split` return instances of the subclass rather - than :exc:`ExceptionGroup`. :: - - >>> class MyGroup(ExceptionGroup): - ... def derive(self, exc): - ... return MyGroup(self.message, exc) - ... 
- >>> MyGroup("eg", [ValueError(1), TypeError(2)]).split(TypeError) - (MyGroup('eg', [TypeError(2)]), MyGroup('eg', [ValueError(1)])) + Returns an exception group with the same :attr:`message`, + :attr:`__traceback__`, :attr:`__cause__`, :attr:`__context__` + and :attr:`__note__` but which wraps the exceptions in ``excs``. + + This method is used by :meth:`subgroup` and :meth:`split`. A + subclass needs to override it in order to make :meth:`subgroup` + and :meth:`split` return instances of the subclass rather + than :exc:`ExceptionGroup`. :: + + >>> class MyGroup(ExceptionGroup): + ... def derive(self, exc): + ... return MyGroup(self.message, exc) + ... + >>> MyGroup("eg", [ValueError(1), TypeError(2)]).split(TypeError) + (MyGroup('eg', [TypeError(2)]), MyGroup('eg', [ValueError(1)])) + + Note that :exc:`BaseExceptionGroup` defines :meth:`__new__`, so + subclasses that need a different constructor signature need to + override that rather than :meth:`__init__`. For example, the following + defines an exception group subclass which accepts an exit_code and + and constructs the group's message from it. :: + + class Errors(ExceptionGroup): + def __new__(cls, errors, exit_code): + self = super().__new__(Errors, f"exit code: {exit_code}", errors) + self.exit_code = exit_code + return self + + def derive(self, excs): + return Errors(excs, self.exit_code) .. versionadded:: 3.11 From webhook-mailer at python.org Mon Jan 24 20:06:10 2022 From: webhook-mailer at python.org (tim-one) Date: Tue, 25 Jan 2022 01:06:10 -0000 Subject: [Python-checkins] bpo-46504: faster code for trial quotient in x_divrem() (GH-30856) Message-ID: https://github.com/python/cpython/commit/7c26472d09548905d8c158b26b6a2b12de6cdc32 commit: 7c26472d09548905d8c158b26b6a2b12de6cdc32 branch: main author: Tim Peters committer: tim-one date: 2022-01-24T19:06:00-06:00 summary: bpo-46504: faster code for trial quotient in x_divrem() (GH-30856) * bpo-46504: faster code for trial quotient in x_divrem() This brings x_divrem() back into synch with x_divrem1(), which was changed in bpo-46406 to generate faster code to find machine-word division quotients and remainders. Modern processors compute both with a single machine instruction, but convincing C to exploit that requires writing _less_ "clever" C code. files: M Objects/longobject.c diff --git a/Objects/longobject.c b/Objects/longobject.c index ee20e2638bcad..5f0cc579c2cca 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -2767,8 +2767,15 @@ x_divrem(PyLongObject *v1, PyLongObject *w1, PyLongObject **prem) vtop = vk[size_w]; assert(vtop <= wm1); vv = ((twodigits)vtop << PyLong_SHIFT) | vk[size_w-1]; + /* The code used to compute the remainder via + * r = (digit)(vv - (twodigits)wm1 * q); + * and compilers generally generated code to do the * and -. + * But modern processors generally compute q and r with a single + * instruction, and modern optimizing compilers exploit that if we + * _don't_ try to optimize it. + */ q = (digit)(vv / wm1); - r = (digit)(vv - (twodigits)wm1 * q); /* r = vv % wm1 */ + r = (digit)(vv % wm1); while ((twodigits)wm2 * q > (((twodigits)r << PyLong_SHIFT) | vk[size_w-2])) { --q; From webhook-mailer at python.org Mon Jan 24 21:53:39 2022 From: webhook-mailer at python.org (ericvsmith) Date: Tue, 25 Jan 2022 02:53:39 -0000 Subject: [Python-checkins] bpo-46503: Prevent an assert from firing when parsing some invalid \N sequences in f-strings. 
(GH-30865) Message-ID: https://github.com/python/cpython/commit/0daf72194bd4e31de7f12020685bb39a14d6f45e commit: 0daf72194bd4e31de7f12020685bb39a14d6f45e branch: main author: Eric V. Smith committer: ericvsmith date: 2022-01-24T21:53:27-05:00 summary: bpo-46503: Prevent an assert from firing when parsing some invalid \N sequences in f-strings. (GH-30865) * bpo-46503: Prevent an assert from firing. Also fix one nearby tiny PEP-7 nit. * Added blurb. files: A Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst M Lib/test/test_fstring.py M Parser/string_parser.c diff --git a/Lib/test/test_fstring.py b/Lib/test/test_fstring.py index bd1ca943c7c09..d0b1ade15137b 100644 --- a/Lib/test/test_fstring.py +++ b/Lib/test/test_fstring.py @@ -746,12 +746,16 @@ def test_misformed_unicode_character_name(self): # differently inside f-strings. self.assertAllRaise(SyntaxError, r"\(unicode error\) 'unicodeescape' codec can't decode bytes in position .*: malformed \\N character escape", [r"f'\N'", + r"f'\N '", + r"f'\N '", # See bpo-46503. r"f'\N{'", r"f'\N{GREEK CAPITAL LETTER DELTA'", # Here are the non-f-string versions, # which should give the same errors. r"'\N'", + r"'\N '", + r"'\N '", r"'\N{'", r"'\N{GREEK CAPITAL LETTER DELTA'", ]) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst new file mode 100644 index 0000000000000..e48028d72ca8e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst @@ -0,0 +1 @@ +Fix an assert when parsing some invalid \N escape sequences in f-strings. diff --git a/Parser/string_parser.c b/Parser/string_parser.c index 57d9b9ed3fdbb..0b5e30ba2ca6a 100644 --- a/Parser/string_parser.c +++ b/Parser/string_parser.c @@ -442,12 +442,23 @@ fstring_find_literal(Parser *p, const char **str, const char *end, int raw, if (!raw && ch == '\\' && s < end) { ch = *s++; if (ch == 'N') { + /* We need to look at and skip matching braces for "\N{name}" + sequences because otherwise we'll think the opening '{' + starts an expression, which is not the case with "\N". + Keep looking for either a matched '{' '}' pair, or the end + of the string. */ + if (s < end && *s++ == '{') { while (s < end && *s++ != '}') { } continue; } - break; + + /* This is an invalid "\N" sequence, since it's a "\N" not + followed by a "{". Just keep parsing this literal. This + error will be caught later by + decode_unicode_with_escapes(). */ + continue; } if (ch == '{' && warn_invalid_escape_sequence(p, ch, t) < 0) { return -1; @@ -491,7 +502,8 @@ fstring_find_literal(Parser *p, const char **str, const char *end, int raw, *literal = PyUnicode_DecodeUTF8Stateful(literal_start, s - literal_start, NULL, NULL); - } else { + } + else { *literal = decode_unicode_with_escapes(p, literal_start, s - literal_start, t); } From webhook-mailer at python.org Mon Jan 24 22:08:51 2022 From: webhook-mailer at python.org (ericvsmith) Date: Tue, 25 Jan 2022 03:08:51 -0000 Subject: [Python-checkins] bpo-46503: Prevent an assert from firing when parsing some invalid \N sequences in f-strings. 
(GH-30865) (30867) Message-ID: https://github.com/python/cpython/commit/c314e3e829943b186e1c894071f00c613433cfe5 commit: c314e3e829943b186e1c894071f00c613433cfe5 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: ericvsmith date: 2022-01-24T22:08:42-05:00 summary: bpo-46503: Prevent an assert from firing when parsing some invalid \N sequences in f-strings. (GH-30865) (30867) * bpo-46503: Prevent an assert from firing. Also fix one nearby tiny PEP-7 nit. * Added blurb. (cherry picked from commit 0daf72194bd4e31de7f12020685bb39a14d6f45e) Co-authored-by: Eric V. Smith Co-authored-by: Eric V. Smith files: A Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst M Lib/test/test_fstring.py M Parser/pegen/parse_string.c diff --git a/Lib/test/test_fstring.py b/Lib/test/test_fstring.py index 518ebdf16c1c6..92a4d22062f98 100644 --- a/Lib/test/test_fstring.py +++ b/Lib/test/test_fstring.py @@ -747,12 +747,16 @@ def test_misformed_unicode_character_name(self): # differently inside f-strings. self.assertAllRaise(SyntaxError, r"\(unicode error\) 'unicodeescape' codec can't decode bytes in position .*: malformed \\N character escape", [r"f'\N'", + r"f'\N '", + r"f'\N '", # See bpo-46503. r"f'\N{'", r"f'\N{GREEK CAPITAL LETTER DELTA'", # Here are the non-f-string versions, # which should give the same errors. r"'\N'", + r"'\N '", + r"'\N '", r"'\N{'", r"'\N{GREEK CAPITAL LETTER DELTA'", ]) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst new file mode 100644 index 0000000000000..e48028d72ca8e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst @@ -0,0 +1 @@ +Fix an assert when parsing some invalid \N escape sequences in f-strings. diff --git a/Parser/pegen/parse_string.c b/Parser/pegen/parse_string.c index f1df2c46a6cf6..af350b340db68 100644 --- a/Parser/pegen/parse_string.c +++ b/Parser/pegen/parse_string.c @@ -444,12 +444,23 @@ fstring_find_literal(Parser *p, const char **str, const char *end, int raw, if (!raw && ch == '\\' && s < end) { ch = *s++; if (ch == 'N') { + /* We need to look at and skip matching braces for "\N{name}" + sequences because otherwise we'll think the opening '{' + starts an expression, which is not the case with "\N". + Keep looking for either a matched '{' '}' pair, or the end + of the string. */ + if (s < end && *s++ == '{') { while (s < end && *s++ != '}') { } continue; } - break; + + /* This is an invalid "\N" sequence, since it's a "\N" not + followed by a "{". Just keep parsing this literal. This + error will be caught later by + decode_unicode_with_escapes(). */ + continue; } if (ch == '{' && warn_invalid_escape_sequence(p, ch, t) < 0) { return -1; @@ -493,7 +504,8 @@ fstring_find_literal(Parser *p, const char **str, const char *end, int raw, *literal = PyUnicode_DecodeUTF8Stateful(literal_start, s - literal_start, NULL, NULL); - } else { + } + else { *literal = decode_unicode_with_escapes(p, literal_start, s - literal_start, t); } From webhook-mailer at python.org Mon Jan 24 22:13:20 2022 From: webhook-mailer at python.org (ericvsmith) Date: Tue, 25 Jan 2022 03:13:20 -0000 Subject: [Python-checkins] bpo-46503: Prevent an assert from firing when parsing some invalid \N sequences in f-strings. 
(GH-30865) (GH-30866) Message-ID: https://github.com/python/cpython/commit/894e8c13484822458d53cc77c9265b7a88450a4b commit: 894e8c13484822458d53cc77c9265b7a88450a4b branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: ericvsmith date: 2022-01-24T22:13:11-05:00 summary: bpo-46503: Prevent an assert from firing when parsing some invalid \N sequences in f-strings. (GH-30865) (GH-30866) * bpo-46503: Prevent an assert from firing. Also fix one nearby tiny PEP-7 nit. * Added blurb. (cherry picked from commit 0daf72194bd4e31de7f12020685bb39a14d6f45e) Co-authored-by: Eric V. Smith Co-authored-by: Eric V. Smith files: A Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst M Lib/test/test_fstring.py M Parser/string_parser.c diff --git a/Lib/test/test_fstring.py b/Lib/test/test_fstring.py index bd1ca943c7c09..d0b1ade15137b 100644 --- a/Lib/test/test_fstring.py +++ b/Lib/test/test_fstring.py @@ -746,12 +746,16 @@ def test_misformed_unicode_character_name(self): # differently inside f-strings. self.assertAllRaise(SyntaxError, r"\(unicode error\) 'unicodeescape' codec can't decode bytes in position .*: malformed \\N character escape", [r"f'\N'", + r"f'\N '", + r"f'\N '", # See bpo-46503. r"f'\N{'", r"f'\N{GREEK CAPITAL LETTER DELTA'", # Here are the non-f-string versions, # which should give the same errors. r"'\N'", + r"'\N '", + r"'\N '", r"'\N{'", r"'\N{GREEK CAPITAL LETTER DELTA'", ]) diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst new file mode 100644 index 0000000000000..e48028d72ca8e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-24-21-24-41.bpo-46503.4UrPsE.rst @@ -0,0 +1 @@ +Fix an assert when parsing some invalid \N escape sequences in f-strings. diff --git a/Parser/string_parser.c b/Parser/string_parser.c index c83e63fc6f8f2..4c2043521c0cd 100644 --- a/Parser/string_parser.c +++ b/Parser/string_parser.c @@ -442,12 +442,23 @@ fstring_find_literal(Parser *p, const char **str, const char *end, int raw, if (!raw && ch == '\\' && s < end) { ch = *s++; if (ch == 'N') { + /* We need to look at and skip matching braces for "\N{name}" + sequences because otherwise we'll think the opening '{' + starts an expression, which is not the case with "\N". + Keep looking for either a matched '{' '}' pair, or the end + of the string. */ + if (s < end && *s++ == '{') { while (s < end && *s++ != '}') { } continue; } - break; + + /* This is an invalid "\N" sequence, since it's a "\N" not + followed by a "{". Just keep parsing this literal. This + error will be caught later by + decode_unicode_with_escapes(). */ + continue; } if (ch == '{' && warn_invalid_escape_sequence(p, ch, t) < 0) { return -1; @@ -491,7 +502,8 @@ fstring_find_literal(Parser *p, const char **str, const char *end, int raw, *literal = PyUnicode_DecodeUTF8Stateful(literal_start, s - literal_start, NULL, NULL); - } else { + } + else { *literal = decode_unicode_with_escapes(p, literal_start, s - literal_start, t); } From webhook-mailer at python.org Mon Jan 24 23:48:45 2022 From: webhook-mailer at python.org (terryjreedy) Date: Tue, 25 Jan 2022 04:48:45 -0000 Subject: [Python-checkins] bpo-41841: update idlelib News up to 3.10.0. 
(GH-30868) Message-ID: https://github.com/python/cpython/commit/9d3c9788a6ccd4f2f53a147dd0026a316c396976 commit: 9d3c9788a6ccd4f2f53a147dd0026a316c396976 branch: main author: Terry Jan Reedy committer: terryjreedy date: 2022-01-24T23:48:40-05:00 summary: bpo-41841: update idlelib News up to 3.10.0. (GH-30868) files: M Lib/idlelib/NEWS.txt M Misc/NEWS.d/3.10.0a7.rst M Misc/NEWS.d/3.10.0b1.rst diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index 396820e9117b5..c0f9a10e10b1c 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -1,12 +1,30 @@ What's New in IDLE 3.10.0 (since 3.9.0) -Released on 2021-10-04? +Released on 2021-10-04 ========================= +bpo-40128: Mostly fix completions on macOS when not using tcl/tk 8.6.11 +(as with 3.9). + +bpo-33962: Move the indent space setting from the Font tab to the new Windows +tab. Patch by Mark Roseman and Terry Jan Reedy. + +bpo-40468: Split the settings dialog General tab into Windows and Shell/Ed +tabs. Move help sources, which extend the Help menu, to the Extensions tab. +Make space for new options and shorten the dialog. The latter makes the +dialog better fit small screens. + +bpo-44010: Highlight the new match statement's soft keywords: match, case, +and _. This highlighting is not perfect and will be incorrect in some rare +cases, especially for some _s in case patterns. + bpo-44026: Include interpreter's typo fix suggestions in message line for NameErrors and AttributeErrors. Patch by E. Paine. +bpo-41611: Avoid occasional uncaught exceptions and freezing when using +completions on macOS. + bpo-37903: Add mouse actions to the shell sidebar. Left click and optional drag selects one or more lines of text, as with the editor line number sidebar. Right click after selecting text lines @@ -14,6 +32,9 @@ displays a context menu with 'copy with prompts'. This zips together prompts from the sidebar with lines from the selected text. This option also appears on the context menu for the text. +bpo-43981: Fix reference leaks in test_sidebar and test_squeezer. +Patches by Terry Jan Reedy and Pablo Galindo + bpo-37892: Change Shell input indents from tabs to spaces. Shell input now 'looks right'. Making this feasible motivated the shell sidebar. @@ -22,6 +43,9 @@ bpo-37903: Move the Shell input prompt to a side bar. bpo-43655: Make window managers on macOS and X Window recognize IDLE dialog windows as dialogs. +bpo-42225: Document that IDLE can fail on Unix either from misconfigured IP +masquerade rules or failure displaying complex colored (non-ascii) characters. + bpo-43283: Document why printing to IDLE's Shell is often slower than printing to a system terminal and that it can be made faster by pre-formatting a single string before printing. @@ -50,6 +74,12 @@ by using inspect.getdoc. bpo-33987: Mostly finish using ttk widgets, mainly for editor, settings, and searches. Some patches by Mark Roseman. +bpo-40511: Stop unnecessary "flashing" when typing opening and closing +parentheses inside the parentheses of a function call. + +bpo-38439: Add a 256x256 pixel IDLE icon to the Windows .ico file. Created by +Andrew Clover. Remove the low-color gif variations from the .ico file. + bpo-41775: Make 'IDLE Shell' the shell title. bpo-35764: Rewrite the Calltips doc section. diff --git a/Misc/NEWS.d/3.10.0a7.rst b/Misc/NEWS.d/3.10.0a7.rst index aa332631292a7..7e9cb77266bd9 100644 --- a/Misc/NEWS.d/3.10.0a7.rst +++ b/Misc/NEWS.d/3.10.0a7.rst @@ -855,7 +855,7 @@ Aasland. .. nonce: iIeiLg .. 
section: IDLE -Document that IDLE can fail on Unix either from misconfigured IP masquerage +Document that IDLE can fail on Unix either from misconfigured IP masquerade rules or failure displaying complex colored (non-ascii) characters. .. diff --git a/Misc/NEWS.d/3.10.0b1.rst b/Misc/NEWS.d/3.10.0b1.rst index e4391a1ee3870..83ba504d04342 100644 --- a/Misc/NEWS.d/3.10.0b1.rst +++ b/Misc/NEWS.d/3.10.0b1.rst @@ -1673,7 +1673,8 @@ zips together prompts from the sidebar with lines from the selected text. .. nonce: 3EFl1H .. section: IDLE -Fix reference leak in test_squeezer. Patch by Pablo Galindo +Fix reference leak in test_sidebar and test_squeezer. +Patches by Terry Jan Reedy and Pablo Galindo .. From webhook-mailer at python.org Tue Jan 25 01:37:38 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 25 Jan 2022 06:37:38 -0000 Subject: [Python-checkins] bpo-46491: Allow Annotated on outside of Final/ClassVar (GH-30864) Message-ID: https://github.com/python/cpython/commit/e1abffca45b60729c460e3e2ad50c8c1946cfd4e commit: e1abffca45b60729c460e3e2ad50c8c1946cfd4e branch: main author: Gregory Beauregard committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-24T22:37:15-08:00 summary: bpo-46491: Allow Annotated on outside of Final/ClassVar (GH-30864) We treat Annotated type arg as class-level annotation. This exempts it from checks against Final and ClassVar in order to allow using them in any nesting order. Automerge-Triggered-By: GH:gvanrossum files: A Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst M Lib/test/test_typing.py M Lib/typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 150d7c081c30b..5777656552d79 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -4679,6 +4679,14 @@ class C: A.x = 5 self.assertEqual(C.x, 5) + def test_special_form_containment(self): + class C: + classvar: Annotated[ClassVar[int], "a decoration"] = 4 + const: Annotated[Final[int], "Const"] = 4 + + self.assertEqual(get_type_hints(C, globals())['classvar'], ClassVar[int]) + self.assertEqual(get_type_hints(C, globals())['const'], Final[int]) + def test_hash_eq(self): self.assertEqual(len({Annotated[int, 4, 5], Annotated[int, 4, 5]}), 1) self.assertNotEqual(Annotated[int, 4, 5], Annotated[int, 5, 4]) diff --git a/Lib/typing.py b/Lib/typing.py index 7ff546fbb6492..e3e098b1fcc8f 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -151,7 +151,7 @@ def _type_convert(arg, module=None): return arg -def _type_check(arg, msg, is_argument=True, module=None, *, is_class=False): +def _type_check(arg, msg, is_argument=True, module=None, *, allow_special_forms=False): """Check that the argument is a type, and return it (internal helper). As a special case, accept None and return type(None) instead. Also wrap strings @@ -164,7 +164,7 @@ def _type_check(arg, msg, is_argument=True, module=None, *, is_class=False): We append the repr() of the actual value (truncated to 100 chars). 
""" invalid_generic_forms = (Generic, Protocol) - if not is_class: + if not allow_special_forms: invalid_generic_forms += (ClassVar,) if is_argument: invalid_generic_forms += (Final,) @@ -697,7 +697,7 @@ def _evaluate(self, globalns, localns, recursive_guard): eval(self.__forward_code__, globalns, localns), "Forward references must evaluate to types.", is_argument=self.__forward_is_argument__, - is_class=self.__forward_is_class__, + allow_special_forms=self.__forward_is_class__, ) self.__forward_value__ = _eval_type( type_, globalns, localns, recursive_guard | {self.__forward_arg__} @@ -1674,7 +1674,7 @@ def __class_getitem__(cls, params): "with at least two arguments (a type and an " "annotation).") msg = "Annotated[t, ...]: t must be a type." - origin = _type_check(params[0], msg) + origin = _type_check(params[0], msg, allow_special_forms=True) metadata = tuple(params[1:]) return _AnnotatedAlias(origin, metadata) diff --git a/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst b/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst new file mode 100644 index 0000000000000..f66e8868f753f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst @@ -0,0 +1 @@ +Allow :data:`typing.Annotated` to wrap :data:`typing.Final` and :data:`typing.ClassVar`. Patch by Gregory Beauregard. From webhook-mailer at python.org Tue Jan 25 02:01:34 2022 From: webhook-mailer at python.org (terryjreedy) Date: Tue, 25 Jan 2022 07:01:34 -0000 Subject: [Python-checkins] bpo-41841: update idlelib News up to 3.10.0 (GH-30871) Message-ID: https://github.com/python/cpython/commit/98cabce59958914b59914abbffbfde7129d4c47f commit: 98cabce59958914b59914abbffbfde7129d4c47f branch: 3.9 author: Terry Jan Reedy committer: terryjreedy date: 2022-01-25T02:01:25-05:00 summary: bpo-41841: update idlelib News up to 3.10.0 (GH-30871) files: M Lib/idlelib/NEWS.txt diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index f68e144d25610..3e5915679a092 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -1,7 +1,24 @@ -What's New in IDLE 3.9.z -(since 3.9.0) -========================= +What's New in IDLE 3.9 after 3.9.0 +until 3.9.12, 2022-05-16 +================================== + + +bpo-40128: Mostly fix completions on macOS when not using tcl/tk 8.6.11 +(as with 3.9). + +bpo-33962: Move the indent space setting from the Font tab to the new Windows +tab. Patch by Mark Roseman and Terry Jan Reedy. +bpo-40468: Split the settings dialog General tab into Windows and Shell/ED +tabs. Move help sources, which extend the Help menu, to the Extensions tab. +Make space for new options and shorten the dialog. The latter makes the dialog +better fit small screens. + +bpo-41611: Avoid occasional uncaught exceptions and freezing when using +completions on macOS. + +bpo-42225: Document that IDLE can fail on Unix either from misconfigured IP +masquerade rules or failure displaying complex colored (non-ascii) characters. bpo-43283: Document why printing to IDLE's Shell is often slower than printing to a system terminal and that it can be made faster by @@ -30,6 +47,12 @@ by using inspect.getdoc. bpo-33987: Mostly finish using ttk widgets, mainly for editor, settings, and searches. Some patches by Mark Roseman. +bpo-40511: Stop unnecessary "flashing" when typing opening and closing +parentheses inside the parentheses of a function call. + +bpo-38439: Add a 256?256 pixel IDLE icon to the Windows .ico file. Created by +Andrew Clover. 
Remove the low-color gif variations from the .ico file. + bpo-41775: Make 'IDLE Shell' the shell title. bpo-35764: Rewrite the Calltips doc section. From webhook-mailer at python.org Tue Jan 25 02:02:07 2022 From: webhook-mailer at python.org (terryjreedy) Date: Tue, 25 Jan 2022 07:02:07 -0000 Subject: [Python-checkins] bpo-41841: update idlelib News up to 3.10.0. (GH-30868) (GH-30870) Message-ID: https://github.com/python/cpython/commit/ce79b504a790d02c080449d31356d33a5aaf19dd commit: ce79b504a790d02c080449d31356d33a5aaf19dd branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: terryjreedy date: 2022-01-25T02:02:02-05:00 summary: bpo-41841: update idlelib News up to 3.10.0. (GH-30868) (GH-30870) (cherry picked from commit 9d3c9788a6ccd4f2f53a147dd0026a316c396976) Co-authored-by: Terry Jan Reedy Co-authored-by: Terry Jan Reedy files: M Lib/idlelib/NEWS.txt M Misc/NEWS.d/3.10.0a7.rst M Misc/NEWS.d/3.10.0b1.rst diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index 396820e9117b5..c0f9a10e10b1c 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -1,12 +1,30 @@ What's New in IDLE 3.10.0 (since 3.9.0) -Released on 2021-10-04? +Released on 2021-10-04 ========================= +bpo-40128: Mostly fix completions on macOS when not using tcl/tk 8.6.11 +(as with 3.9). + +bpo-33962: Move the indent space setting from the Font tab to the new Windows +tab. Patch by Mark Roseman and Terry Jan Reedy. + +bpo-40468: Split the settings dialog General tab into Windows and Shell/Ed +tabs. Move help sources, which extend the Help menu, to the Extensions tab. +Make space for new options and shorten the dialog. The latter makes the +dialog better fit small screens. + +bpo-44010: Highlight the new match statement's soft keywords: match, case, +and _. This highlighting is not perfect and will be incorrect in some rare +cases, especially for some _s in case patterns. + bpo-44026: Include interpreter's typo fix suggestions in message line for NameErrors and AttributeErrors. Patch by E. Paine. +bpo-41611: Avoid occasional uncaught exceptions and freezing when using +completions on macOS. + bpo-37903: Add mouse actions to the shell sidebar. Left click and optional drag selects one or more lines of text, as with the editor line number sidebar. Right click after selecting text lines @@ -14,6 +32,9 @@ displays a context menu with 'copy with prompts'. This zips together prompts from the sidebar with lines from the selected text. This option also appears on the context menu for the text. +bpo-43981: Fix reference leaks in test_sidebar and test_squeezer. +Patches by Terry Jan Reedy and Pablo Galindo + bpo-37892: Change Shell input indents from tabs to spaces. Shell input now 'looks right'. Making this feasible motivated the shell sidebar. @@ -22,6 +43,9 @@ bpo-37903: Move the Shell input prompt to a side bar. bpo-43655: Make window managers on macOS and X Window recognize IDLE dialog windows as dialogs. +bpo-42225: Document that IDLE can fail on Unix either from misconfigured IP +masquerade rules or failure displaying complex colored (non-ascii) characters. + bpo-43283: Document why printing to IDLE's Shell is often slower than printing to a system terminal and that it can be made faster by pre-formatting a single string before printing. @@ -50,6 +74,12 @@ by using inspect.getdoc. bpo-33987: Mostly finish using ttk widgets, mainly for editor, settings, and searches. Some patches by Mark Roseman. 
+bpo-40511: Stop unnecessary "flashing" when typing opening and closing +parentheses inside the parentheses of a function call. + +bpo-38439: Add a 256x256 pixel IDLE icon to the Windows .ico file. Created by +Andrew Clover. Remove the low-color gif variations from the .ico file. + bpo-41775: Make 'IDLE Shell' the shell title. bpo-35764: Rewrite the Calltips doc section. diff --git a/Misc/NEWS.d/3.10.0a7.rst b/Misc/NEWS.d/3.10.0a7.rst index 6c32e60dc8a46..f62be491d56dd 100644 --- a/Misc/NEWS.d/3.10.0a7.rst +++ b/Misc/NEWS.d/3.10.0a7.rst @@ -855,7 +855,7 @@ Aasland. .. nonce: iIeiLg .. section: IDLE -Document that IDLE can fail on Unix either from misconfigured IP masquerage +Document that IDLE can fail on Unix either from misconfigured IP masquerade rules or failure displaying complex colored (non-ascii) characters. .. diff --git a/Misc/NEWS.d/3.10.0b1.rst b/Misc/NEWS.d/3.10.0b1.rst index ca524fe16b7cf..4731dca2e74cf 100644 --- a/Misc/NEWS.d/3.10.0b1.rst +++ b/Misc/NEWS.d/3.10.0b1.rst @@ -1673,7 +1673,8 @@ zips together prompts from the sidebar with lines from the selected text. .. nonce: 3EFl1H .. section: IDLE -Fix reference leak in test_squeezer. Patch by Pablo Galindo +Fix reference leak in test_sidebar and test_squeezer. +Patches by Terry Jan Reedy and Pablo Galindo .. From webhook-mailer at python.org Tue Jan 25 02:09:10 2022 From: webhook-mailer at python.org (tiran) Date: Tue, 25 Jan 2022 07:09:10 -0000 Subject: [Python-checkins] bpo-40280: Skip subprocess-based tests on wasm32-emscripten (GH-30615) Message-ID: https://github.com/python/cpython/commit/8464fbc42ecc9ce504faac499711dcdc6eedef16 commit: 8464fbc42ecc9ce504faac499711dcdc6eedef16 branch: main author: Christian Heimes committer: tiran date: 2022-01-25T08:09:06+01:00 summary: bpo-40280: Skip subprocess-based tests on wasm32-emscripten (GH-30615) files: A Misc/NEWS.d/next/Tests/2022-01-14-23-22-41.bpo-40280.nHLWoD.rst M Lib/distutils/tests/test_build_clib.py M Lib/distutils/tests/test_build_ext.py M Lib/distutils/tests/test_build_py.py M Lib/distutils/tests/test_config_cmd.py M Lib/distutils/tests/test_install.py M Lib/distutils/tests/test_install_lib.py M Lib/distutils/tests/test_spawn.py M Lib/distutils/tests/test_sysconfig.py M Lib/lib2to3/tests/test_parser.py M Lib/test/support/__init__.py M Lib/test/support/script_helper.py M Lib/test/test_audit.py M Lib/test/test_capi.py M Lib/test/test_cmd_line.py M Lib/test/test_embed.py M Lib/test/test_faulthandler.py M Lib/test/test_file_eintr.py M Lib/test/test_gc.py M Lib/test/test_gzip.py M Lib/test/test_os.py M Lib/test/test_platform.py M Lib/test/test_poll.py M Lib/test/test_popen.py M Lib/test/test_py_compile.py M Lib/test/test_quopri.py M Lib/test/test_regrtest.py M Lib/test/test_repl.py M Lib/test/test_runpy.py M Lib/test/test_signal.py M Lib/test/test_site.py M Lib/test/test_source_encoding.py M Lib/test/test_subprocess.py M Lib/test/test_support.py M Lib/test/test_sys.py M Lib/test/test_sysconfig.py M Lib/test/test_threading.py M Lib/test/test_traceback.py M Lib/test/test_utf8_mode.py M Lib/test/test_venv.py M Lib/test/test_webbrowser.py M Lib/test/test_zipfile.py diff --git a/Lib/distutils/tests/test_build_clib.py b/Lib/distutils/tests/test_build_clib.py index 601a1b10fa100..95f928288e004 100644 --- a/Lib/distutils/tests/test_build_clib.py +++ b/Lib/distutils/tests/test_build_clib.py @@ -4,7 +4,9 @@ import sys import sysconfig -from test.support import run_unittest, missing_compiler_executable +from test.support import ( + run_unittest, missing_compiler_executable, 
requires_subprocess +) from distutils.command.build_clib import build_clib from distutils.errors import DistutilsSetupError @@ -112,6 +114,7 @@ def test_finalize_options(self): self.assertRaises(DistutilsSetupError, cmd.finalize_options) @unittest.skipIf(sys.platform == 'win32', "can't test on Windows") + @requires_subprocess() def test_run(self): pkg_dir, dist = self.create_dist() cmd = build_clib(dist) diff --git a/Lib/distutils/tests/test_build_ext.py b/Lib/distutils/tests/test_build_ext.py index 3ee567d045508..460b62f50bc12 100644 --- a/Lib/distutils/tests/test_build_ext.py +++ b/Lib/distutils/tests/test_build_ext.py @@ -56,6 +56,7 @@ def tearDown(self): def build_ext(self, *args, **kwargs): return build_ext(*args, **kwargs) + @support.requires_subprocess() def test_build_ext(self): cmd = support.missing_compiler_executable() if cmd is not None: @@ -332,6 +333,7 @@ def test_compiler_option(self): cmd.run() self.assertEqual(cmd.compiler, 'unix') + @support.requires_subprocess() def test_get_outputs(self): cmd = support.missing_compiler_executable() if cmd is not None: diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py index a590a485a2b92..44a06cc963aa3 100644 --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -9,7 +9,7 @@ from distutils.errors import DistutilsFileError from distutils.tests import support -from test.support import run_unittest +from test.support import run_unittest, requires_subprocess class BuildPyTestCase(support.TempdirManager, @@ -89,6 +89,7 @@ def test_empty_package_dir(self): self.fail("failed package_data test when package_dir is ''") @unittest.skipIf(sys.dont_write_bytecode, 'byte-compile disabled') + @requires_subprocess() def test_byte_compile(self): project_dir, dist = self.create_dist(py_modules=['boiledeggs']) os.chdir(project_dir) @@ -106,6 +107,7 @@ def test_byte_compile(self): ['boiledeggs.%s.pyc' % sys.implementation.cache_tag]) @unittest.skipIf(sys.dont_write_bytecode, 'byte-compile disabled') + @requires_subprocess() def test_byte_compile_optimized(self): project_dir, dist = self.create_dist(py_modules=['boiledeggs']) os.chdir(project_dir) diff --git a/Lib/distutils/tests/test_config_cmd.py b/Lib/distutils/tests/test_config_cmd.py index 072f9ebe714aa..c79db68aae115 100644 --- a/Lib/distutils/tests/test_config_cmd.py +++ b/Lib/distutils/tests/test_config_cmd.py @@ -3,7 +3,9 @@ import os import sys import sysconfig -from test.support import run_unittest, missing_compiler_executable +from test.support import ( + run_unittest, missing_compiler_executable, requires_subprocess +) from distutils.command.config import dump_file, config from distutils.tests import support @@ -42,6 +44,7 @@ def test_dump_file(self): self.assertEqual(len(self._logs), numlines+1) @unittest.skipIf(sys.platform == 'win32', "can't test on Windows") + @requires_subprocess() def test_search_cpp(self): cmd = missing_compiler_executable(['preprocessor']) if cmd is not None: diff --git a/Lib/distutils/tests/test_install.py b/Lib/distutils/tests/test_install.py index b2a3887f0bbc9..c38f98b8b2c29 100644 --- a/Lib/distutils/tests/test_install.py +++ b/Lib/distutils/tests/test_install.py @@ -5,7 +5,7 @@ import unittest import site -from test.support import captured_stdout, run_unittest +from test.support import captured_stdout, run_unittest, requires_subprocess from distutils import sysconfig from distutils.command.install import install, HAS_USER_SITE @@ -208,6 +208,7 @@ def test_record(self): 
'UNKNOWN-0.0.0-py%s.%s.egg-info' % sys.version_info[:2]] self.assertEqual(found, expected) + @requires_subprocess() def test_record_extensions(self): cmd = test_support.missing_compiler_executable() if cmd is not None: diff --git a/Lib/distutils/tests/test_install_lib.py b/Lib/distutils/tests/test_install_lib.py index 652653f2b2c2b..f840d1a94665e 100644 --- a/Lib/distutils/tests/test_install_lib.py +++ b/Lib/distutils/tests/test_install_lib.py @@ -8,7 +8,7 @@ from distutils.extension import Extension from distutils.tests import support from distutils.errors import DistutilsOptionError -from test.support import run_unittest +from test.support import run_unittest, requires_subprocess class InstallLibTestCase(support.TempdirManager, @@ -35,6 +35,7 @@ def test_finalize_options(self): self.assertEqual(cmd.optimize, 2) @unittest.skipIf(sys.dont_write_bytecode, 'byte-compile disabled') + @requires_subprocess() def test_byte_compile(self): project_dir, dist = self.create_dist() os.chdir(project_dir) @@ -90,6 +91,7 @@ def test_get_inputs(self): inputs = cmd.get_inputs() self.assertEqual(len(inputs), 2, inputs) + @requires_subprocess() def test_dont_write_bytecode(self): # makes sure byte_compile is not used dist = self.create_dist()[1] diff --git a/Lib/distutils/tests/test_spawn.py b/Lib/distutils/tests/test_spawn.py index 631d6455d4572..a0a1145da5df8 100644 --- a/Lib/distutils/tests/test_spawn.py +++ b/Lib/distutils/tests/test_spawn.py @@ -3,7 +3,7 @@ import stat import sys import unittest.mock -from test.support import run_unittest, unix_shell +from test.support import run_unittest, unix_shell, requires_subprocess from test.support import os_helper from distutils.spawn import find_executable @@ -11,6 +11,8 @@ from distutils.errors import DistutilsExecError from distutils.tests import support + + at requires_subprocess() class SpawnTestCase(support.TempdirManager, support.LoggingSilencer, unittest.TestCase): diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py index 3697206229d20..7a88c88f6cdc5 100644 --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -10,7 +10,7 @@ from distutils import sysconfig from distutils.ccompiler import get_default_compiler from distutils.tests import support -from test.support import run_unittest, swap_item +from test.support import run_unittest, swap_item, requires_subprocess from test.support.os_helper import TESTFN from test.support.warnings_helper import check_warnings @@ -247,6 +247,7 @@ def test_SO_in_vars(self): self.assertIsNotNone(vars['SO']) self.assertEqual(vars['SO'], vars['EXT_SUFFIX']) + @requires_subprocess() def test_customize_compiler_before_get_config_vars(self): # Issue #21923: test that a Distribution compiler # instance can be called without an explicit call to diff --git a/Lib/lib2to3/tests/test_parser.py b/Lib/lib2to3/tests/test_parser.py index 5eefb5aad7d49..ff4f8078878d8 100644 --- a/Lib/lib2to3/tests/test_parser.py +++ b/Lib/lib2to3/tests/test_parser.py @@ -61,6 +61,9 @@ def test_load_grammar_from_pickle(self): shutil.rmtree(tmpdir) @unittest.skipIf(sys.executable is None, 'sys.executable required') + @unittest.skipIf( + sys.platform == 'emscripten', 'requires working subprocess' + ) def test_load_grammar_from_subprocess(self): tmpdir = tempfile.mkdtemp() tmpsubdir = os.path.join(tmpdir, 'subdir') diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index ca903d302bdd3..1e4935fc3e617 100644 --- a/Lib/test/support/__init__.py +++ 
b/Lib/test/support/__init__.py @@ -40,11 +40,12 @@ "bigmemtest", "bigaddrspacetest", "cpython_only", "get_attribute", "requires_IEEE_754", "requires_zlib", "has_fork_support", "requires_fork", + "has_subprocess_support", "requires_subprocess", "anticipate_failure", "load_package_tests", "detect_api_mismatch", "check__all__", "skip_if_buggy_ucrt_strfptime", "check_disallow_instantiation", # sys - "is_jython", "is_android", "is_emscripten", + "is_jython", "is_android", "is_emscripten", "is_wasi", "check_impl_detail", "unix_shell", "setswitchinterval", # network "open_urlresource", @@ -467,15 +468,23 @@ def requires_debug_ranges(reason='requires co_positions / debug_ranges'): else: unix_shell = None -# wasm32-emscripten is POSIX-like but does not provide a -# working fork() or subprocess API. +# wasm32-emscripten and -wasi are POSIX-like but do not +# have subprocess or fork support. is_emscripten = sys.platform == "emscripten" +is_wasi = sys.platform == "wasi" -has_fork_support = hasattr(os, "fork") and not is_emscripten +has_fork_support = hasattr(os, "fork") and not is_emscripten and not is_wasi def requires_fork(): return unittest.skipUnless(has_fork_support, "requires working os.fork()") +has_subprocess_support = not is_emscripten and not is_wasi + +def requires_subprocess(): + """Used for subprocess, os.spawn calls""" + return unittest.skipUnless(has_subprocess_support, "requires subprocess support") + + # Define the URL of a dedicated HTTP server for the network tests. # The URL must use clear-text HTTP: no redirection to encrypted HTTPS. TEST_HTTP_URL = "http://www.pythontest.net" diff --git a/Lib/test/support/script_helper.py b/Lib/test/support/script_helper.py index 6d699c8486cd2..c2b43f4060eb5 100644 --- a/Lib/test/support/script_helper.py +++ b/Lib/test/support/script_helper.py @@ -42,6 +42,10 @@ def interpreter_requires_environment(): if 'PYTHONHOME' in os.environ: __cached_interp_requires_environment = True return True + # cannot run subprocess, assume we don't need it + if not support.has_subprocess_support: + __cached_interp_requires_environment = False + return False # Try running an interpreter with -E to see if it works or not. try: @@ -87,6 +91,7 @@ def fail(self, cmd_line): # Executing the interpreter in a subprocess + at support.requires_subprocess() def run_python_until_end(*args, **env_vars): env_required = interpreter_requires_environment() cwd = env_vars.pop('__cwd', None) @@ -139,6 +144,7 @@ def run_python_until_end(*args, **env_vars): return _PythonRunResult(rc, out, err), cmd_line + at support.requires_subprocess() def _assert_python(expected_success, /, *args, **env_vars): res, cmd_line = run_python_until_end(*args, **env_vars) if (res.rc and expected_success) or (not res.rc and not expected_success): @@ -171,6 +177,7 @@ def assert_python_failure(*args, **env_vars): return _assert_python(False, *args, **env_vars) + at support.requires_subprocess() def spawn_python(*args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kw): """Run a Python subprocess with the given arguments. 
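The helper added to Lib/test/support/__init__.py above is an ordinary unittest skip decorator, so an individual test opts in with a single line. A minimal sketch of that usage; the class and method names below are hypothetical and not part of this patch:

    import subprocess
    import sys
    import unittest
    from test import support

    class ExampleTests(unittest.TestCase):
        # Skipped automatically where subprocess support is missing,
        # e.g. on wasm32-emscripten.
        @support.requires_subprocess()
        def test_child_process(self):
            out = subprocess.check_output([sys.executable, '-c', 'print(42)'])
            self.assertEqual(out.strip(), b'42')
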
@@ -273,6 +280,7 @@ def make_zip_pkg(zip_dir, zip_basename, pkg_name, script_basename, return zip_name, os.path.join(zip_name, script_name_in_zip) + at support.requires_subprocess() def run_test_script(script): # use -u to try to get the full output if the test hangs or crash if support.verbose: diff --git a/Lib/test/test_audit.py b/Lib/test/test_audit.py index d99b3b7ed7d36..0fa2d74835cba 100644 --- a/Lib/test/test_audit.py +++ b/Lib/test/test_audit.py @@ -16,6 +16,8 @@ class AuditTest(unittest.TestCase): + + @support.requires_subprocess() def do_test(self, *args): with subprocess.Popen( [sys.executable, "-Xutf8", AUDIT_TESTS_PY, *args], @@ -29,6 +31,7 @@ def do_test(self, *args): if p.returncode: self.fail("".join(p.stderr)) + @support.requires_subprocess() def run_python(self, *args): events = [] with subprocess.Popen( diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 0957f3253d7a6..a5db8a11c5f67 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -66,6 +66,7 @@ def test_instancemethod(self): self.assertEqual(testfunction.attribute, "test") self.assertRaises(AttributeError, setattr, inst.testfunction, "attribute", "test") + @support.requires_subprocess() def test_no_FatalError_infinite_loop(self): with support.SuppressCrashReport(): p = subprocess.Popen([sys.executable, "-c", diff --git a/Lib/test/test_cmd_line.py b/Lib/test/test_cmd_line.py index fa5f39ea5fa97..352109ed4b2ff 100644 --- a/Lib/test/test_cmd_line.py +++ b/Lib/test/test_cmd_line.py @@ -15,6 +15,8 @@ interpreter_requires_environment ) +if not support.has_subprocess_support: + raise unittest.SkipTest("test module requires subprocess") # Debug build? Py_DEBUG = hasattr(sys, "gettotalrefcount") diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index 19c53c392607b..15c6b05916f34 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -17,6 +17,8 @@ import tempfile import textwrap +if not support.has_subprocess_support: + raise unittest.SkipTest("test module requires subprocess") MS_WINDOWS = (os.name == 'nt') MACOS = (sys.platform == 'darwin') diff --git a/Lib/test/test_faulthandler.py b/Lib/test/test_faulthandler.py index de986a3cbea97..f7eaa77942476 100644 --- a/Lib/test/test_faulthandler.py +++ b/Lib/test/test_faulthandler.py @@ -412,6 +412,7 @@ def test_is_enabled(self): finally: sys.stderr = orig_stderr + @support.requires_subprocess() def test_disabled_by_default(self): # By default, the module should be disabled code = "import faulthandler; print(faulthandler.is_enabled())" @@ -420,6 +421,7 @@ def test_disabled_by_default(self): output = subprocess.check_output(args) self.assertEqual(output.rstrip(), b"False") + @support.requires_subprocess() def test_sys_xoptions(self): # Test python -X faulthandler code = "import faulthandler; print(faulthandler.is_enabled())" @@ -432,6 +434,7 @@ def test_sys_xoptions(self): output = subprocess.check_output(args, env=env) self.assertEqual(output.rstrip(), b"True") + @support.requires_subprocess() def test_env_var(self): # empty env var code = "import faulthandler; print(faulthandler.is_enabled())" diff --git a/Lib/test/test_file_eintr.py b/Lib/test/test_file_eintr.py index 01408d838a83c..f9236f45ca4be 100644 --- a/Lib/test/test_file_eintr.py +++ b/Lib/test/test_file_eintr.py @@ -15,12 +15,15 @@ import sys import time import unittest +from test import support + +if not support.has_subprocess_support: + raise unittest.SkipTest("test module requires subprocess") # Test import all of the things we're about to try testing up front. 
import _io import _pyio - @unittest.skipUnless(os.name == 'posix', 'tests requires a posix system.') class TestFileIOSignalInterrupt: def setUp(self): diff --git a/Lib/test/test_gc.py b/Lib/test/test_gc.py index 52948f1c7bde5..c4d4355dec9c6 100644 --- a/Lib/test/test_gc.py +++ b/Lib/test/test_gc.py @@ -1,7 +1,7 @@ import unittest import unittest.mock from test.support import (verbose, refcount_test, - cpython_only) + cpython_only, requires_subprocess) from test.support.import_helper import import_module from test.support.os_helper import temp_dir, TESTFN, unlink from test.support.script_helper import assert_python_ok, make_script @@ -661,6 +661,7 @@ def do_work(): gc.collect() # this blows up (bad C pointer) when it fails @cpython_only + @requires_subprocess() def test_garbage_at_shutdown(self): import subprocess code = """if 1: diff --git a/Lib/test/test_gzip.py b/Lib/test/test_gzip.py index aa66d2f07f508..497e66cd553b7 100644 --- a/Lib/test/test_gzip.py +++ b/Lib/test/test_gzip.py @@ -12,7 +12,7 @@ from subprocess import PIPE, Popen from test.support import import_helper from test.support import os_helper -from test.support import _4G, bigmemtest +from test.support import _4G, bigmemtest, requires_subprocess from test.support.script_helper import assert_python_ok, assert_python_failure gzip = import_helper.import_module('gzip') @@ -760,6 +760,7 @@ def wrapper(*args, **kwargs): class TestCommandLine(unittest.TestCase): data = b'This is a simple test with gzip' + @requires_subprocess() def test_decompress_stdin_stdout(self): with io.BytesIO() as bytes_io: with gzip.GzipFile(fileobj=bytes_io, mode='wb') as gzip_file: @@ -795,6 +796,7 @@ def test_decompress_infile_outfile_error(self): self.assertEqual(rc, 1) self.assertEqual(out, b'') + @requires_subprocess() @create_and_remove_directory(TEMPDIR) def test_compress_stdin_outfile(self): args = sys.executable, '-m', 'gzip' diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py index 89e5e4190c640..84c27f346c340 100644 --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -1096,6 +1096,7 @@ def test_environb(self): value_str = value.decode(sys.getfilesystemencoding(), 'surrogateescape') self.assertEqual(os.environ['bytes'], value_str) + @support.requires_subprocess() def test_putenv_unsetenv(self): name = "PYTHONTESTVAR" value = "testvalue" @@ -2279,6 +2280,7 @@ def test_setreuid(self): self.assertRaises(OverflowError, os.setreuid, 0, self.UID_OVERFLOW) @unittest.skipUnless(hasattr(os, 'setreuid'), 'test needs os.setreuid()') + @support.requires_subprocess() def test_setreuid_neg1(self): # Needs to accept -1. We run this in a subprocess to avoid # altering the test runner's process state (issue8045). @@ -2287,6 +2289,7 @@ def test_setreuid_neg1(self): 'import os,sys;os.setreuid(-1,-1);sys.exit(0)']) @unittest.skipUnless(hasattr(os, 'setregid'), 'test needs os.setregid()') + @support.requires_subprocess() def test_setregid(self): if os.getuid() != 0 and not HAVE_WHEEL_GROUP: self.assertRaises(OSError, os.setregid, 0, 0) @@ -2296,6 +2299,7 @@ def test_setregid(self): self.assertRaises(OverflowError, os.setregid, 0, self.GID_OVERFLOW) @unittest.skipUnless(hasattr(os, 'setregid'), 'test needs os.setregid()') + @support.requires_subprocess() def test_setregid_neg1(self): # Needs to accept -1. We run this in a subprocess to avoid # altering the test runner's process state (issue8045). 
@@ -2469,6 +2473,7 @@ def _kill_with_event(self, event, name): self.fail("subprocess did not stop on {}".format(name)) @unittest.skip("subprocesses aren't inheriting Ctrl+C property") + @support.requires_subprocess() def test_CTRL_C_EVENT(self): from ctypes import wintypes import ctypes @@ -2487,6 +2492,7 @@ def test_CTRL_C_EVENT(self): self._kill_with_event(signal.CTRL_C_EVENT, "CTRL_C_EVENT") + @support.requires_subprocess() def test_CTRL_BREAK_EVENT(self): self._kill_with_event(signal.CTRL_BREAK_EVENT, "CTRL_BREAK_EVENT") @@ -2924,6 +2930,7 @@ def test_device_encoding(self): self.assertTrue(codecs.lookup(encoding)) + at support.requires_subprocess() class PidTests(unittest.TestCase): @unittest.skipUnless(hasattr(os, 'getppid'), "test needs os.getppid") def test_getppid(self): @@ -2996,6 +3003,7 @@ def kill_process(pid): self.check_waitpid(code, exitcode=-signum, callback=kill_process) + at support.requires_subprocess() class SpawnTests(unittest.TestCase): def create_args(self, *, with_env=False, use_bytes=False): self.exitcode = 17 diff --git a/Lib/test/test_platform.py b/Lib/test/test_platform.py index 1a688775f4630..d70ef155271f5 100644 --- a/Lib/test/test_platform.py +++ b/Lib/test/test_platform.py @@ -79,6 +79,7 @@ def test_architecture(self): res = platform.architecture() @os_helper.skip_unless_symlink + @support.requires_subprocess() def test_architecture_via_symlink(self): # issue3762 with support.PythonSymlink() as py: cmd = "-c", "import platform; print(platform.architecture())" @@ -269,6 +270,7 @@ def test_uname_slices(self): self.assertEqual(res[:5], expected[:5]) @unittest.skipIf(sys.platform in ['win32', 'OpenVMS'], "uname -p not used") + @support.requires_subprocess() def test_uname_processor(self): """ On some systems, the processor must match the output diff --git a/Lib/test/test_poll.py b/Lib/test/test_poll.py index 82bbb3af9f1b3..ae3ffc77e9924 100644 --- a/Lib/test/test_poll.py +++ b/Lib/test/test_poll.py @@ -7,7 +7,7 @@ import threading import time import unittest -from test.support import cpython_only +from test.support import cpython_only, requires_subprocess from test.support import threading_helper from test.support.os_helper import TESTFN @@ -120,6 +120,7 @@ def fileno(self): # Another test case for poll(). This is copied from the test case for # select(), modified to use poll() instead. + @requires_subprocess() def test_poll2(self): cmd = 'for i in 0 1 2 3 4 5 6 7 8 9; do echo testing...; sleep 1; done' proc = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, diff --git a/Lib/test/test_popen.py b/Lib/test/test_popen.py index cac2f6177f325..e6bfc480cbd12 100644 --- a/Lib/test/test_popen.py +++ b/Lib/test/test_popen.py @@ -19,6 +19,7 @@ if ' ' in python: python = '"' + python + '"' # quote embedded space for cmdline + at support.requires_subprocess() class PopenTest(unittest.TestCase): def _do_test_commandline(self, cmdline, expected): diff --git a/Lib/test/test_py_compile.py b/Lib/test/test_py_compile.py index 5ed98dbff1737..794d6436b61ab 100644 --- a/Lib/test/test_py_compile.py +++ b/Lib/test/test_py_compile.py @@ -230,6 +230,7 @@ def setUp(self): def tearDown(self): os_helper.rmtree(self.directory) + @support.requires_subprocess() def pycompilecmd(self, *args, **kwargs): # assert_python_* helpers don't return proc object. 
We'll just use # subprocess.run() instead of spawn_python() and its friends to test diff --git a/Lib/test/test_quopri.py b/Lib/test/test_quopri.py index 715544c8a9669..152d1858dcdd2 100644 --- a/Lib/test/test_quopri.py +++ b/Lib/test/test_quopri.py @@ -3,6 +3,7 @@ import sys, io, subprocess import quopri +from test import support ENCSAMPLE = b"""\ @@ -180,6 +181,7 @@ def test_decode_header(self): for p, e in self.HSTRINGS: self.assertEqual(quopri.decodestring(e, header=True), p) + @support.requires_subprocess() def test_scriptencode(self): (p, e) = self.STRINGS[-1] process = subprocess.Popen([sys.executable, "-mquopri"], @@ -196,6 +198,7 @@ def test_scriptencode(self): self.assertEqual(cout[i], e[i]) self.assertEqual(cout, e) + @support.requires_subprocess() def test_scriptdecode(self): (p, e) = self.STRINGS[-1] process = subprocess.Popen([sys.executable, "-mquopri", "-d"], diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index 08e2c87e15c4d..babc8a690877a 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -22,6 +22,8 @@ from test.support import os_helper from test.libregrtest import utils, setup +if not support.has_subprocess_support: + raise unittest.SkipTest("test module requires subprocess") Py_DEBUG = hasattr(sys, 'gettotalrefcount') ROOT_DIR = os.path.join(os.path.dirname(__file__), '..', '..') diff --git a/Lib/test/test_repl.py b/Lib/test/test_repl.py index a8d04a425e278..ddb4aa68048df 100644 --- a/Lib/test/test_repl.py +++ b/Lib/test/test_repl.py @@ -5,9 +5,14 @@ import unittest import subprocess from textwrap import dedent -from test.support import cpython_only, SuppressCrashReport +from test.support import cpython_only, has_subprocess_support, SuppressCrashReport from test.support.script_helper import kill_python + +if not has_subprocess_support: + raise unittest.SkipTest("test module requires subprocess") + + def spawn_repl(*args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kw): """Run the Python REPL with the given arguments. 
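Test files in which nearly every case has to spawn a child interpreter (test_repl.py above; test_subprocess.py, test_cmd_line.py, test_embed.py and test_regrtest.py elsewhere in this patch) use the module-level guard instead of decorating each test, so the whole file is skipped once at import time. A short sketch of that pattern, with a hypothetical test body for illustration:

    import unittest
    from test import support

    # Raising SkipTest during import marks the entire module as skipped
    # on platforms without working child processes.
    if not support.has_subprocess_support:
        raise unittest.SkipTest("test module requires subprocess")

    class ReplLikeTests(unittest.TestCase):
        def test_runs_a_child_interpreter(self):
            ...  # would launch sys.executable via subprocess here
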
diff --git a/Lib/test/test_runpy.py b/Lib/test/test_runpy.py index 2954dfedc7e42..80e695a5f3f79 100644 --- a/Lib/test/test_runpy.py +++ b/Lib/test/test_runpy.py @@ -12,7 +12,7 @@ import textwrap import unittest import warnings -from test.support import no_tracing, verbose +from test.support import no_tracing, verbose, requires_subprocess from test.support.import_helper import forget, make_legacy_pyc, unload from test.support.os_helper import create_empty_file, temp_dir from test.support.script_helper import make_script, make_zip_script @@ -781,6 +781,7 @@ def run(self, *args, **kwargs): ) super().run(*args, **kwargs) + @requires_subprocess() def assertSigInt(self, *args, **kwargs): proc = subprocess.run(*args, **kwargs, text=True, stderr=subprocess.PIPE) self.assertTrue(proc.stderr.endswith("\nKeyboardInterrupt\n")) diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index ac4626d0c456e..09de608bb771f 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -116,6 +116,7 @@ def test_valid_signals(self): self.assertLess(len(s), signal.NSIG) @unittest.skipUnless(sys.executable, "sys.executable required.") + @support.requires_subprocess() def test_keyboard_interrupt_exit_code(self): """KeyboardInterrupt triggers exit via SIGINT.""" process = subprocess.run( @@ -166,6 +167,7 @@ def test_issue9324(self): signal.signal(7, handler) @unittest.skipUnless(sys.executable, "sys.executable required.") + @support.requires_subprocess() def test_keyboard_interrupt_exit_code(self): """KeyboardInterrupt triggers an exit using STATUS_CONTROL_C_EXIT.""" # We don't test via os.kill(os.getpid(), signal.CTRL_C_EVENT) here @@ -637,6 +639,7 @@ def handler(signum, frame): @unittest.skipIf(sys.platform == "win32", "Not valid on Windows") @unittest.skipUnless(hasattr(signal, 'siginterrupt'), "needs signal.siginterrupt()") + at support.requires_subprocess() class SiginterruptTest(unittest.TestCase): def readpipe_interrupted(self, interrupt): diff --git a/Lib/test/test_site.py b/Lib/test/test_site.py index c54d868cb234f..032a1be3aa529 100644 --- a/Lib/test/test_site.py +++ b/Lib/test/test_site.py @@ -211,6 +211,7 @@ def test_get_path(self): @unittest.skipUnless(site.ENABLE_USER_SITE, "requires access to PEP 370 " "user-site (site.ENABLE_USER_SITE)") + @support.requires_subprocess() def test_s_option(self): # (ncoghlan) Change this to use script_helper... 
usersite = site.USER_SITE @@ -497,6 +498,7 @@ def test_license_exists_at_url(self): class StartupImportTests(unittest.TestCase): + @support.requires_subprocess() def test_startup_imports(self): # Get sys.path in isolated mode (python3 -I) popen = subprocess.Popen([sys.executable, '-X', 'utf8', '-I', @@ -547,17 +549,20 @@ def test_startup_imports(self): }.difference(sys.builtin_module_names) self.assertFalse(modules.intersection(collection_mods), stderr) + @support.requires_subprocess() def test_startup_interactivehook(self): r = subprocess.Popen([sys.executable, '-c', 'import sys; sys.exit(hasattr(sys, "__interactivehook__"))']).wait() self.assertTrue(r, "'__interactivehook__' not added by site") + @support.requires_subprocess() def test_startup_interactivehook_isolated(self): # issue28192 readline is not automatically enabled in isolated mode r = subprocess.Popen([sys.executable, '-I', '-c', 'import sys; sys.exit(hasattr(sys, "__interactivehook__"))']).wait() self.assertFalse(r, "'__interactivehook__' added in isolated mode") + @support.requires_subprocess() def test_startup_interactivehook_isolated_explicit(self): # issue28192 readline can be explicitly enabled in isolated mode r = subprocess.Popen([sys.executable, '-I', '-c', @@ -607,6 +612,7 @@ def _calc_sys_path_for_underpth_nosite(self, sys_prefix, lines): sys_path.append(abs_path) return sys_path + @support.requires_subprocess() def test_underpth_basic(self): libpath = test.support.STDLIB_DIR exe_prefix = os.path.dirname(sys.executable) @@ -627,6 +633,7 @@ def test_underpth_basic(self): "sys.path is incorrect" ) + @support.requires_subprocess() def test_underpth_nosite_file(self): libpath = test.support.STDLIB_DIR exe_prefix = os.path.dirname(sys.executable) @@ -655,6 +662,7 @@ def test_underpth_nosite_file(self): "sys.path is incorrect" ) + @support.requires_subprocess() def test_underpth_file(self): libpath = test.support.STDLIB_DIR exe_prefix = os.path.dirname(sys.executable) @@ -679,6 +687,7 @@ def test_underpth_file(self): )], env=env) self.assertTrue(rc, "sys.path is incorrect") + @support.requires_subprocess() def test_underpth_dll_file(self): libpath = test.support.STDLIB_DIR exe_prefix = os.path.dirname(sys.executable) diff --git a/Lib/test/test_source_encoding.py b/Lib/test/test_source_encoding.py index a0cb605c1651c..a0375fda0d365 100644 --- a/Lib/test/test_source_encoding.py +++ b/Lib/test/test_source_encoding.py @@ -1,7 +1,7 @@ # -*- coding: koi8-r -*- import unittest -from test.support import script_helper, captured_stdout +from test.support import script_helper, captured_stdout, requires_subprocess from test.support.os_helper import TESTFN, unlink, rmtree from test.support.import_helper import unload import importlib @@ -65,6 +65,7 @@ def test_issue7820(self): # two bytes in common with the UTF-8 BOM self.assertRaises(SyntaxError, eval, b'\xef\xbb\x20') + @requires_subprocess() def test_20731(self): sub = subprocess.Popen([sys.executable, os.path.join(os.path.dirname(__file__), diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py index 3af523e8346c4..99a25e279df92 100644 --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -48,6 +48,9 @@ if support.PGO: raise unittest.SkipTest("test is not helpful for PGO") +if not support.has_subprocess_support: + raise unittest.SkipTest("test module requires subprocess") + mswindows = (sys.platform == "win32") # diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py index 4dac7f6cd4200..1ce3c826d6b1b 100644 --- a/Lib/test/test_support.py 
+++ b/Lib/test/test_support.py @@ -491,6 +491,7 @@ def test_reap_children(self): # pending child process support.reap_children() + @support.requires_subprocess() def check_options(self, args, func, expected=None): code = f'from test.support import {func}; print(repr({func}()))' cmd = [sys.executable, *args, '-c', code] diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py index f6da57f55f161..41c4618ad10d4 100644 --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -694,6 +694,7 @@ def test_sys_getwindowsversion_no_instantiation(self): def test_clear_type_cache(self): sys._clear_type_cache() + @support.requires_subprocess() def test_ioencoding(self): env = dict(os.environ) @@ -741,6 +742,7 @@ def test_ioencoding(self): 'requires OS support of non-ASCII encodings') @unittest.skipUnless(sys.getfilesystemencoding() == locale.getpreferredencoding(False), 'requires FS encoding to match locale') + @support.requires_subprocess() def test_ioencoding_nonascii(self): env = dict(os.environ) @@ -753,6 +755,7 @@ def test_ioencoding_nonascii(self): @unittest.skipIf(sys.base_prefix != sys.prefix, 'Test is not venv-compatible') + @support.requires_subprocess() def test_executable(self): # sys.executable should be absolute self.assertEqual(os.path.abspath(sys.executable), sys.executable) @@ -854,9 +857,11 @@ def check_locale_surrogateescape(self, locale): 'stdout: surrogateescape\n' 'stderr: backslashreplace\n') + @support.requires_subprocess() def test_c_locale_surrogateescape(self): self.check_locale_surrogateescape('C') + @support.requires_subprocess() def test_posix_locale_surrogateescape(self): self.check_locale_surrogateescape('POSIX') @@ -1005,6 +1010,7 @@ def test_getandroidapilevel(self): self.assertIsInstance(level, int) self.assertGreater(level, 0) + @support.requires_subprocess() def test_sys_tracebacklimit(self): code = """if 1: import sys @@ -1051,6 +1057,7 @@ def test__enablelegacywindowsfsencoding(self): out = out.decode('ascii', 'replace').rstrip() self.assertEqual(out, 'mbcs replace') + @support.requires_subprocess() def test_orig_argv(self): code = textwrap.dedent(''' import sys diff --git a/Lib/test/test_sysconfig.py b/Lib/test/test_sysconfig.py index 6fbb80d77f793..80fe9c8a8b1e0 100644 --- a/Lib/test/test_sysconfig.py +++ b/Lib/test/test_sysconfig.py @@ -5,7 +5,7 @@ import shutil from copy import copy -from test.support import (captured_stdout, PythonSymlink) +from test.support import (captured_stdout, PythonSymlink, requires_subprocess) from test.support.import_helper import import_module from test.support.os_helper import (TESTFN, unlink, skip_unless_symlink, change_cwd) @@ -273,6 +273,7 @@ def test_get_scheme_names(self): self.assertEqual(get_scheme_names(), tuple(sorted(wanted))) @skip_unless_symlink + @requires_subprocess() def test_symlink(self): # Issue 7880 with PythonSymlink() as py: cmd = "-c", "import sysconfig; print(sysconfig.get_platform())" @@ -326,6 +327,7 @@ def test_ldshared_value(self): self.assertIn(ldflags, ldshared) @unittest.skipUnless(sys.platform == "darwin", "test only relevant on MacOSX") + @requires_subprocess() def test_platform_in_subprocess(self): my_platform = sysconfig.get_platform() diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py index f03a64232e17c..4830571474b5b 100644 --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -3,7 +3,7 @@ """ import test.support -from test.support import threading_helper +from test.support import threading_helper, requires_subprocess from test.support import verbose, 
cpython_only, os_helper from test.support.import_helper import import_module from test.support.script_helper import assert_python_ok, assert_python_failure @@ -1259,6 +1259,7 @@ def test_releasing_unacquired_lock(self): lock = threading.Lock() self.assertRaises(RuntimeError, lock.release) + @requires_subprocess() def test_recursion_limit(self): # Issue 9670 # test that excessive recursion within a non-main thread causes diff --git a/Lib/test/test_traceback.py b/Lib/test/test_traceback.py index 966ff2a1241ca..e0884fb9b7814 100644 --- a/Lib/test/test_traceback.py +++ b/Lib/test/test_traceback.py @@ -9,7 +9,8 @@ import re from test import support from test.support import (Error, captured_output, cpython_only, ALWAYS_EQ, - requires_debug_ranges, has_no_debug_ranges) + requires_debug_ranges, has_no_debug_ranges, + requires_subprocess) from test.support.os_helper import TESTFN, unlink from test.support.script_helper import assert_python_ok, assert_python_failure @@ -203,6 +204,7 @@ def __str__(self): str_name = '.'.join([X.__module__, X.__qualname__]) self.assertEqual(err[0], "%s: %s\n" % (str_name, str_value)) + @requires_subprocess() def test_encoded_file(self): # Test that tracebacks are correctly printed for encoded source files: # - correct line number (Issue2384) diff --git a/Lib/test/test_utf8_mode.py b/Lib/test/test_utf8_mode.py index 8b6332ee22771..2b96f76df305f 100644 --- a/Lib/test/test_utf8_mode.py +++ b/Lib/test/test_utf8_mode.py @@ -255,6 +255,7 @@ def test_optim_level(self): @unittest.skipIf(MS_WINDOWS, "os.device_encoding() doesn't implement " "the UTF-8 Mode on Windows") + @support.requires_subprocess() def test_device_encoding(self): # Use stdout as TTY if not sys.stdout.isatty(): diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py index ca37abcf79854..043158c79214b 100644 --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -15,7 +15,8 @@ import sys import tempfile from test.support import (captured_stdout, captured_stderr, requires_zlib, - skip_if_broken_multiprocessing_synchronize, verbose) + skip_if_broken_multiprocessing_synchronize, verbose, + requires_subprocess) from test.support.os_helper import (can_symlink, EnvironmentVarGuard, rmtree) import unittest import venv @@ -33,6 +34,7 @@ or sys._base_executable != sys.executable, 'cannot run venv.create from within a venv on this platform') + at requires_subprocess() def check_output(cmd, encoding=None): p = subprocess.Popen(cmd, stdout=subprocess.PIPE, diff --git a/Lib/test/test_webbrowser.py b/Lib/test/test_webbrowser.py index dbfd2e5a0f280..9d608d63a01ed 100644 --- a/Lib/test/test_webbrowser.py +++ b/Lib/test/test_webbrowser.py @@ -8,6 +8,8 @@ from test.support import import_helper from test.support import os_helper +if not support.has_subprocess_support: + raise unittest.SkipTest("test webserver requires subprocess") URL = 'http://www.example.com' CMD_NAME = 'test' diff --git a/Lib/test/test_zipfile.py b/Lib/test/test_zipfile.py index df48fabff951d..e226dd741d7a7 100644 --- a/Lib/test/test_zipfile.py +++ b/Lib/test/test_zipfile.py @@ -21,7 +21,7 @@ from test.support import script_helper from test.support import (findfile, requires_zlib, requires_bz2, - requires_lzma, captured_stdout) + requires_lzma, captured_stdout, requires_subprocess) from test.support.os_helper import TESTFN, unlink, rmtree, temp_dir, temp_cwd @@ -2771,6 +2771,7 @@ def test_read_zip64_with_exe_prepended(self): @unittest.skipUnless(sys.executable, 'sys.executable required.') @unittest.skipUnless(os.access('/bin/bash', os.X_OK), 'Test 
relies on #!/bin/bash working.') + @requires_subprocess() def test_execute_zip2(self): output = subprocess.check_output([self.exe_zip, sys.executable]) self.assertIn(b'number in executable: 5', output) @@ -2778,6 +2779,7 @@ def test_execute_zip2(self): @unittest.skipUnless(sys.executable, 'sys.executable required.') @unittest.skipUnless(os.access('/bin/bash', os.X_OK), 'Test relies on #!/bin/bash working.') + @requires_subprocess() def test_execute_zip64(self): output = subprocess.check_output([self.exe_zip64, sys.executable]) self.assertIn(b'number in executable: 5', output) diff --git a/Misc/NEWS.d/next/Tests/2022-01-14-23-22-41.bpo-40280.nHLWoD.rst b/Misc/NEWS.d/next/Tests/2022-01-14-23-22-41.bpo-40280.nHLWoD.rst new file mode 100644 index 0000000000000..67134f1191cd2 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-01-14-23-22-41.bpo-40280.nHLWoD.rst @@ -0,0 +1,3 @@ +Add :func:`test.support.requires_subprocess` decorator to mark tests which +require working :mod:`subprocess` module or ``os.spawn*``. The +wasm32-emscripten platform has no support for processes. From webhook-mailer at python.org Tue Jan 25 03:27:14 2022 From: webhook-mailer at python.org (terryjreedy) Date: Tue, 25 Jan 2022 08:27:14 -0000 Subject: [Python-checkins] bpo-46496: Update IDLE News to 2021 Jan 24 (GH-30875) Message-ID: https://github.com/python/cpython/commit/b1a3446f077b7d56b89f55d98dadb8018986a3e5 commit: b1a3446f077b7d56b89f55d98dadb8018986a3e5 branch: main author: Terry Jan Reedy committer: terryjreedy date: 2022-01-25T03:27:09-05:00 summary: bpo-46496: Update IDLE News to 2021 Jan 24 (GH-30875) files: M Lib/idlelib/NEWS.txt diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index c0f9a10e10b1c..a1c64121aae4b 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -1,3 +1,15 @@ +What's New in IDLE 3.11.0 +(since 3.10.0) +Released on 2022-10-03 +========================= + + +bpo-45495: Add context keywords 'case' and 'match' to completions list. + +bpo-45296: On Windows, change exit/quit message to suggest Ctrl-D, which +works, instead of , which does not work in IDLE. + + What's New in IDLE 3.10.0 (since 3.9.0) Released on 2021-10-04 From webhook-mailer at python.org Tue Jan 25 03:28:34 2022 From: webhook-mailer at python.org (terryjreedy) Date: Tue, 25 Jan 2022 08:28:34 -0000 Subject: [Python-checkins] [3.10] bpo-46496: Update IDLE News to 2021 Jan 24 (GH-30876) Message-ID: https://github.com/python/cpython/commit/367a37a18c4411c42da9006947dd95b0afbdf200 commit: 367a37a18c4411c42da9006947dd95b0afbdf200 branch: 3.10 author: Terry Jan Reedy committer: terryjreedy date: 2022-01-25T03:28:29-05:00 summary: [3.10] bpo-46496: Update IDLE News to 2021 Jan 24 (GH-30876) Cherry picked from b1a3446f077b7d56b89f55d98dadb8018986a3e files: M Lib/idlelib/NEWS.txt diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index c0f9a10e10b1c..0fe1cd258bfa1 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -1,3 +1,15 @@ +What's New in IDLE 3.10.z +after 3.10.0 until 3.10.? +Released on 2022-05-16 +========================= + + +bpo-45495: Add context keywords 'case' and 'match' to completions list. + +bpo-45296: On Windows, change exit/quit message to suggest Ctrl-D, which +works, instead of , which does not work in IDLE. 
+ + What's New in IDLE 3.10.0 (since 3.9.0) Released on 2021-10-04 From webhook-mailer at python.org Tue Jan 25 03:28:57 2022 From: webhook-mailer at python.org (terryjreedy) Date: Tue, 25 Jan 2022 08:28:57 -0000 Subject: [Python-checkins] [3.9] bpo-46496: Update IDLE News to 2021 Jan 24 (GH-30877) Message-ID: https://github.com/python/cpython/commit/3178efbf06666409107237a3cfe61ba85a5d3a26 commit: 3178efbf06666409107237a3cfe61ba85a5d3a26 branch: 3.9 author: Terry Jan Reedy committer: terryjreedy date: 2022-01-25T03:28:53-05:00 summary: [3.9] bpo-46496: Update IDLE News to 2021 Jan 24 (GH-30877) Cherry picked from b1a3446f077b7d56b89f55d98dadb8018986a3e files: M Lib/idlelib/NEWS.txt diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt index 3e5915679a092..d656f0c0214b4 100644 --- a/Lib/idlelib/NEWS.txt +++ b/Lib/idlelib/NEWS.txt @@ -1,7 +1,11 @@ -What's New in IDLE 3.9 after 3.9.0 -until 3.9.12, 2022-05-16 -================================== +What's New in IDLE 3.9.z +after 3.9.0 until 3.9.12 +Released on 2022-05-16 +========================= + +bpo-45296: On Windows, change exit/quit message to suggest Ctrl-D, which +works, instead of , which does not work in IDLE. bpo-40128: Mostly fix completions on macOS when not using tcl/tk 8.6.11 (as with 3.9). From webhook-mailer at python.org Tue Jan 25 07:28:43 2022 From: webhook-mailer at python.org (markshannon) Date: Tue, 25 Jan 2022 12:28:43 -0000 Subject: [Python-checkins] bpo-46420: Use NOTRACE_DISPATCH() in specialized opcodes (GH-30652) Message-ID: https://github.com/python/cpython/commit/96bf84d57a7c29544866a6c20231603049de4919 commit: 96bf84d57a7c29544866a6c20231603049de4919 branch: main author: Dennis Sweeney <36520290+sweeneyde at users.noreply.github.com> committer: markshannon date: 2022-01-25T12:28:29Z summary: bpo-46420: Use NOTRACE_DISPATCH() in specialized opcodes (GH-30652) files: M Python/ceval.c diff --git a/Python/ceval.c b/Python/ceval.c index 2c524ab7e0422..0a6fc4a20660b 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -2017,6 +2017,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } TARGET(BINARY_OP_MULTIPLY_INT) { + assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyLong_CheckExact(left), BINARY_OP); @@ -2030,10 +2031,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (prod == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_OP_MULTIPLY_FLOAT) { + assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyFloat_CheckExact(left), BINARY_OP); @@ -2049,10 +2051,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (prod == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_OP_SUBTRACT_INT) { + assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyLong_CheckExact(left), BINARY_OP); @@ -2066,10 +2069,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (sub == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_OP_SUBTRACT_FLOAT) { + assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyFloat_CheckExact(left), BINARY_OP); @@ -2084,10 +2088,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (sub == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_OP_ADD_UNICODE) { + 
assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyUnicode_CheckExact(left), BINARY_OP); @@ -2101,10 +2106,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (TOP() == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_OP_INPLACE_ADD_UNICODE) { + assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyUnicode_CheckExact(left), BINARY_OP); @@ -2129,10 +2135,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (TOP() == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_OP_ADD_FLOAT) { + assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyFloat_CheckExact(left), BINARY_OP); @@ -2148,10 +2155,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (sum == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_OP_ADD_INT) { + assert(cframe.use_tracing == 0); PyObject *left = SECOND(); PyObject *right = TOP(); DEOPT_IF(!PyLong_CheckExact(left), BINARY_OP); @@ -2165,7 +2173,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr if (sum == NULL) { goto error; } - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_SUBSCR) { @@ -2202,6 +2210,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } TARGET(BINARY_SUBSCR_LIST_INT) { + assert(cframe.use_tracing == 0); PyObject *sub = TOP(); PyObject *list = SECOND(); DEOPT_IF(!PyLong_CheckExact(sub), BINARY_SUBSCR); @@ -2221,10 +2230,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_DECREF(sub); SET_TOP(res); Py_DECREF(list); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_SUBSCR_TUPLE_INT) { + assert(cframe.use_tracing == 0); PyObject *sub = TOP(); PyObject *tuple = SECOND(); DEOPT_IF(!PyLong_CheckExact(sub), BINARY_SUBSCR); @@ -2244,10 +2254,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_DECREF(sub); SET_TOP(res); Py_DECREF(tuple); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(BINARY_SUBSCR_DICT) { + assert(cframe.use_tracing == 0); PyObject *dict = SECOND(); DEOPT_IF(!PyDict_CheckExact(SECOND()), BINARY_SUBSCR); STAT_INC(BINARY_SUBSCR, hit); @@ -2354,6 +2365,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } TARGET(STORE_SUBSCR_LIST_INT) { + assert(cframe.use_tracing == 0); PyObject *sub = TOP(); PyObject *list = SECOND(); PyObject *value = THIRD(); @@ -2374,10 +2386,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_DECREF(old_value); Py_DECREF(sub); Py_DECREF(list); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(STORE_SUBSCR_DICT) { + assert(cframe.use_tracing == 0); PyObject *sub = TOP(); PyObject *dict = SECOND(); PyObject *value = THIRD(); @@ -3065,7 +3078,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr STAT_INC(LOAD_GLOBAL, hit); Py_INCREF(res); PUSH(res); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(LOAD_GLOBAL_BUILTIN) { @@ -3085,7 +3098,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr STAT_INC(LOAD_GLOBAL, hit); Py_INCREF(res); PUSH(res); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(DELETE_FAST) { @@ -3517,7 +3530,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_INCREF(res); SET_TOP(res); Py_DECREF(owner); - 
DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(LOAD_ATTR_MODULE) { @@ -3528,7 +3541,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr LOAD_MODULE_ATTR_OR_METHOD(ATTR); SET_TOP(res); Py_DECREF(owner); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(LOAD_ATTR_WITH_HINT) { @@ -3556,7 +3569,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_INCREF(res); SET_TOP(res); Py_DECREF(owner); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(LOAD_ATTR_SLOT) { @@ -3576,7 +3589,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_INCREF(res); SET_TOP(res); Py_DECREF(owner); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(STORE_ATTR_ADAPTIVE) { @@ -3625,7 +3638,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_DECREF(old_value); } Py_DECREF(owner); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(STORE_ATTR_WITH_HINT) { @@ -3660,7 +3673,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr /* PEP 509 */ dict->ma_version_tag = DICT_NEXT_VERSION(); Py_DECREF(owner); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(STORE_ATTR_SLOT) { @@ -3680,7 +3693,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr *(PyObject **)addr = value; Py_XDECREF(old_value); Py_DECREF(owner); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(COMPARE_OP) { @@ -4479,10 +4492,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_INCREF(res); SET_TOP(res); PUSH(self); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(LOAD_METHOD_NO_DICT) { + assert(cframe.use_tracing == 0); PyObject *self = TOP(); PyTypeObject *self_cls = Py_TYPE(self); SpecializedCacheEntry *caches = GET_CACHE(); @@ -4497,7 +4511,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_INCREF(res); SET_TOP(res); PUSH(self); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(LOAD_METHOD_MODULE) { @@ -4509,7 +4523,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr SET_TOP(NULL); Py_DECREF(owner); PUSH(res); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(LOAD_METHOD_CLASS) { @@ -4532,7 +4546,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr SET_TOP(NULL); Py_DECREF(cls); PUSH(res); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(PRECALL_METHOD) { @@ -4714,6 +4728,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } TARGET(CALL_NO_KW_TYPE_1) { + assert(cframe.use_tracing == 0); assert(STACK_ADJUST_IS_RESET); assert(GET_CACHE()->adaptive.original_oparg == 1); PyObject *obj = TOP(); @@ -4724,10 +4739,11 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr Py_DECREF(callable); Py_DECREF(obj); SET_TOP(res); - DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(CALL_NO_KW_BUILTIN_CLASS_1) { + assert(cframe.use_tracing == 0); assert(STACK_ADJUST_IS_RESET); SpecializedCacheEntry *caches = GET_CACHE(); _PyAdaptiveEntry *cache0 = &caches[0].adaptive; @@ -4878,6 +4894,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } TARGET(CALL_NO_KW_LIST_APPEND) { + assert(cframe.use_tracing == 0); assert(_Py_OPCODE(next_instr[-2]) == PRECALL_METHOD); assert(GET_CACHE()->adaptive.original_oparg == 1); DEOPT_IF(extra_args == 0, CALL_NO_KW); @@ -4899,7 +4916,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr STACK_SHRINK(2); SET_TOP(Py_None); Py_DECREF(callable); - 
DISPATCH(); + NOTRACE_DISPATCH(); } TARGET(CALL_NO_KW_METHOD_DESCRIPTOR_O) { From webhook-mailer at python.org Tue Jan 25 07:57:02 2022 From: webhook-mailer at python.org (rhettinger) Date: Tue, 25 Jan 2022 12:57:02 -0000 Subject: [Python-checkins] Move doctests to the main docs. Eliminate duplication. Improve coverage. (GH-30869) Message-ID: https://github.com/python/cpython/commit/ee60550e9ba3ab94ca43a890cf4313b63ffa1a81 commit: ee60550e9ba3ab94ca43a890cf4313b63ffa1a81 branch: main author: Raymond Hettinger committer: rhettinger date: 2022-01-25T06:56:53-06:00 summary: Move doctests to the main docs. Eliminate duplication. Improve coverage. (GH-30869) files: M Doc/library/itertools.rst M Lib/test/test_itertools.py diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst index 34667561c3cfe..f0d93ebb6b21a 100644 --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -1004,3 +1004,241 @@ which incur interpreter overhead. c, n = c*(n-r)//n, n-1 result.append(pool[-1-n]) return tuple(result) + +.. doctest:: + :hide: + + These examples no longer appear in the docs but are guaranteed + to keep working. + + >>> amounts = [120.15, 764.05, 823.14] + >>> for checknum, amount in zip(count(1200), amounts): + ... print('Check %d is for $%.2f' % (checknum, amount)) + ... + Check 1200 is for $120.15 + Check 1201 is for $764.05 + Check 1202 is for $823.14 + + >>> import operator + >>> for cube in map(operator.pow, range(1,4), repeat(3)): + ... print(cube) + ... + 1 + 8 + 27 + + >>> reportlines = ['EuroPython', 'Roster', '', 'alex', '', 'laura', '', 'martin', '', 'walter', '', 'samuele'] + >>> for name in islice(reportlines, 3, None, 2): + ... print(name.title()) + ... + Alex + Laura + Martin + Walter + Samuele + + >>> from operator import itemgetter + >>> d = dict(a=1, b=2, c=1, d=2, e=1, f=2, g=3) + >>> di = sorted(sorted(d.items()), key=itemgetter(1)) + >>> for k, g in groupby(di, itemgetter(1)): + ... print(k, list(map(itemgetter(0), g))) + ... + 1 ['a', 'c', 'e'] + 2 ['b', 'd', 'f'] + 3 ['g'] + + # Find runs of consecutive numbers using groupby. The key to the solution + # is differencing with a range so that consecutive numbers all appear in + # same group. + >>> data = [ 1, 4,5,6, 10, 15,16,17,18, 22, 25,26,27,28] + >>> for k, g in groupby(enumerate(data), lambda t:t[0]-t[1]): + ... print(list(map(operator.itemgetter(1), g))) + ... 
+ [1] + [4, 5, 6] + [10] + [15, 16, 17, 18] + [22] + [25, 26, 27, 28] + + Now, we test all of the itertool recipes + + >>> import operator + >>> import collections + + >>> take(10, count()) + [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] + + >>> list(prepend(1, [2, 3, 4])) + [1, 2, 3, 4] + + >>> list(enumerate('abc')) + [(0, 'a'), (1, 'b'), (2, 'c')] + + >>> list(islice(tabulate(lambda x: 2*x), 4)) + [0, 2, 4, 6] + + >>> list(tail(3, 'ABCDEFG')) + ['E', 'F', 'G'] + + >>> it = iter(range(10)) + >>> consume(it, 3) + >>> next(it) + 3 + >>> consume(it) + >>> next(it, 'Done') + 'Done' + + >>> nth('abcde', 3) + 'd' + + >>> nth('abcde', 9) is None + True + + >>> [all_equal(s) for s in ('', 'A', 'AAAA', 'AAAB', 'AAABA')] + [True, True, True, False, False] + + >>> quantify(range(99), lambda x: x%2==0) + 50 + + >>> quantify([True, False, False, True, True]) + 3 + + >>> quantify(range(12), pred=lambda x: x%2==1) + 6 + + >>> a = [[1, 2, 3], [4, 5, 6]] + >>> list(flatten(a)) + [1, 2, 3, 4, 5, 6] + + >>> list(repeatfunc(pow, 5, 2, 3)) + [8, 8, 8, 8, 8] + + >>> import random + >>> take(5, map(int, repeatfunc(random.random))) + [0, 0, 0, 0, 0] + + >>> list(islice(pad_none('abc'), 0, 6)) + ['a', 'b', 'c', None, None, None] + + >>> list(ncycles('abc', 3)) + ['a', 'b', 'c', 'a', 'b', 'c', 'a', 'b', 'c'] + + >>> dotproduct([1,2,3], [4,5,6]) + 32 + + >>> data = [20, 40, 24, 32, 20, 28, 16] + >>> list(convolve(data, [0.25, 0.25, 0.25, 0.25])) + [5.0, 15.0, 21.0, 29.0, 29.0, 26.0, 24.0, 16.0, 11.0, 4.0] + >>> list(convolve(data, [1, -1])) + [20, 20, -16, 8, -12, 8, -12, -16] + >>> list(convolve(data, [1, -2, 1])) + [20, 0, -36, 24, -20, 20, -20, -4, 16] + + >>> list(flatten([('a', 'b'), (), ('c', 'd', 'e'), ('f',), ('g', 'h', 'i')])) + ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i'] + + >>> import random + >>> random.seed(85753098575309) + >>> list(repeatfunc(random.random, 3)) + [0.16370491282496968, 0.45889608687313455, 0.3747076837820118] + >>> list(repeatfunc(chr, 3, 65)) + ['A', 'A', 'A'] + >>> list(repeatfunc(pow, 3, 2, 5)) + [32, 32, 32] + + >>> list(grouper('abcdefg', 3, fillvalue='x')) + [('a', 'b', 'c'), ('d', 'e', 'f'), ('g', 'x', 'x')] + + >>> it = grouper('abcdefg', 3, incomplete='strict') + >>> next(it) + ('a', 'b', 'c') + >>> next(it) + ('d', 'e', 'f') + >>> next(it) + Traceback (most recent call last): + ... + ValueError: zip() argument 2 is shorter than argument 1 + + >>> list(grouper('abcdefg', n=3, incomplete='ignore')) + [('a', 'b', 'c'), ('d', 'e', 'f')] + + >>> list(triplewise('ABCDEFG')) + [('A', 'B', 'C'), ('B', 'C', 'D'), ('C', 'D', 'E'), ('D', 'E', 'F'), ('E', 'F', 'G')] + + >>> list(sliding_window('ABCDEFG', 4)) + [('A', 'B', 'C', 'D'), ('B', 'C', 'D', 'E'), ('C', 'D', 'E', 'F'), ('D', 'E', 'F', 'G')] + + >>> list(roundrobin('abc', 'd', 'ef')) + ['a', 'd', 'e', 'b', 'f', 'c'] + + >>> def is_odd(x): + ... 
return x % 2 == 1 + + >>> evens, odds = partition(is_odd, range(10)) + >>> list(evens) + [0, 2, 4, 6, 8] + >>> list(odds) + [1, 3, 5, 7, 9] + + >>> it = iter('ABCdEfGhI') + >>> all_upper, remainder = before_and_after(str.isupper, it) + >>> ''.join(all_upper) + 'ABC' + >>> ''.join(remainder) + 'dEfGhI' + + >>> list(powerset([1,2,3])) + [(), (1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)] + + >>> all(len(list(powerset(range(n)))) == 2**n for n in range(18)) + True + + >>> list(powerset('abcde')) == sorted(sorted(set(powerset('abcde'))), key=len) + True + + >>> list(unique_everseen('AAAABBBCCDAABBB')) + ['A', 'B', 'C', 'D'] + + >>> list(unique_everseen('ABBCcAD', str.lower)) + ['A', 'B', 'C', 'D'] + + >>> list(unique_justseen('AAAABBBCCDAABBB')) + ['A', 'B', 'C', 'D', 'A', 'B'] + + >>> list(unique_justseen('ABBCcAD', str.lower)) + ['A', 'B', 'C', 'A', 'D'] + + >>> d = dict(a=1, b=2, c=3) + >>> it = iter_except(d.popitem, KeyError) + >>> d['d'] = 4 + >>> next(it) + ('d', 4) + >>> next(it) + ('c', 3) + >>> next(it) + ('b', 2) + >>> d['e'] = 5 + >>> next(it) + ('e', 5) + >>> next(it) + ('a', 1) + >>> next(it, 'empty') + 'empty' + + >>> first_true('ABC0DEF1', '9', str.isdigit) + '0' + + >>> population = 'ABCDEFGH' + >>> for r in range(len(population) + 1): + ... seq = list(combinations(population, r)) + ... for i in range(len(seq)): + ... assert nth_combination(population, r, i) == seq[i] + ... for i in range(-len(seq), 0): + ... assert nth_combination(population, r, i) == seq[i] + + >>> iterable = 'abcde' + >>> r = 3 + >>> combos = list(combinations(iterable, r)) + >>> all(nth_combination(iterable, r, i) == comb for i, comb in enumerate(combos)) + True diff --git a/Lib/test/test_itertools.py b/Lib/test/test_itertools.py index 3043e8c404e6e..3f3f7cb35d0bc 100644 --- a/Lib/test/test_itertools.py +++ b/Lib/test/test_itertools.py @@ -2321,399 +2321,6 @@ def test_permutations_sizeof(self): basesize + 10 * self.ssize_t + 4 * self.ssize_t) -libreftest = """ Doctest for examples in the library reference: libitertools.tex - - ->>> amounts = [120.15, 764.05, 823.14] ->>> for checknum, amount in zip(count(1200), amounts): -... print('Check %d is for $%.2f' % (checknum, amount)) -... -Check 1200 is for $120.15 -Check 1201 is for $764.05 -Check 1202 is for $823.14 - ->>> import operator ->>> for cube in map(operator.pow, range(1,4), repeat(3)): -... print(cube) -... -1 -8 -27 - ->>> reportlines = ['EuroPython', 'Roster', '', 'alex', '', 'laura', '', 'martin', '', 'walter', '', 'samuele'] ->>> for name in islice(reportlines, 3, None, 2): -... print(name.title()) -... -Alex -Laura -Martin -Walter -Samuele - ->>> from operator import itemgetter ->>> d = dict(a=1, b=2, c=1, d=2, e=1, f=2, g=3) ->>> di = sorted(sorted(d.items()), key=itemgetter(1)) ->>> for k, g in groupby(di, itemgetter(1)): -... print(k, list(map(itemgetter(0), g))) -... -1 ['a', 'c', 'e'] -2 ['b', 'd', 'f'] -3 ['g'] - -# Find runs of consecutive numbers using groupby. The key to the solution -# is differencing with a range so that consecutive numbers all appear in -# same group. ->>> data = [ 1, 4,5,6, 10, 15,16,17,18, 22, 25,26,27,28] ->>> for k, g in groupby(enumerate(data), lambda t:t[0]-t[1]): -... print(list(map(operator.itemgetter(1), g))) -... -[1] -[4, 5, 6] -[10] -[15, 16, 17, 18] -[22] -[25, 26, 27, 28] - ->>> def take(n, iterable): -... "Return first n items of the iterable as a list" -... return list(islice(iterable, n)) - ->>> def prepend(value, iterator): -... "Prepend a single value in front of an iterator" -... 
# prepend(1, [2, 3, 4]) -> 1 2 3 4 -... return chain([value], iterator) - ->>> def enumerate(iterable, start=0): -... return zip(count(start), iterable) - ->>> def tabulate(function, start=0): -... "Return function(0), function(1), ..." -... return map(function, count(start)) - ->>> import collections ->>> def consume(iterator, n=None): -... "Advance the iterator n-steps ahead. If n is None, consume entirely." -... # Use functions that consume iterators at C speed. -... if n is None: -... # feed the entire iterator into a zero-length deque -... collections.deque(iterator, maxlen=0) -... else: -... # advance to the empty slice starting at position n -... next(islice(iterator, n, n), None) - ->>> def nth(iterable, n, default=None): -... "Returns the nth item or a default value" -... return next(islice(iterable, n, None), default) - ->>> def all_equal(iterable): -... "Returns True if all the elements are equal to each other" -... g = groupby(iterable) -... return next(g, True) and not next(g, False) - ->>> def quantify(iterable, pred=bool): -... "Count how many times the predicate is true" -... return sum(map(pred, iterable)) - ->>> def pad_none(iterable): -... "Returns the sequence elements and then returns None indefinitely" -... return chain(iterable, repeat(None)) - ->>> def ncycles(iterable, n): -... "Returns the sequence elements n times" -... return chain(*repeat(iterable, n)) - ->>> def dotproduct(vec1, vec2): -... return sum(map(operator.mul, vec1, vec2)) - ->>> def flatten(listOfLists): -... return list(chain.from_iterable(listOfLists)) - ->>> def repeatfunc(func, times=None, *args): -... "Repeat calls to func with specified arguments." -... " Example: repeatfunc(random.random)" -... if times is None: -... return starmap(func, repeat(args)) -... else: -... return starmap(func, repeat(args, times)) - ->>> def grouper(iterable, n, *, incomplete='fill', fillvalue=None): -... "Collect data into non-overlapping fixed-length chunks or blocks" -... # grouper('ABCDEFG', 3, fillvalue='x') --> ABC DEF Gxx -... # grouper('ABCDEFG', 3, incomplete='strict') --> ABC DEF ValueError -... # grouper('ABCDEFG', 3, incomplete='ignore') --> ABC DEF -... args = [iter(iterable)] * n -... if incomplete == 'fill': -... return zip_longest(*args, fillvalue=fillvalue) -... if incomplete == 'strict': -... return zip(*args, strict=True) -... if incomplete == 'ignore': -... return zip(*args) -... else: -... raise ValueError('Expected fill, strict, or ignore') - ->>> def triplewise(iterable): -... "Return overlapping triplets from an iterable" -... # pairwise('ABCDEFG') -> ABC BCD CDE DEF EFG -... for (a, _), (b, c) in pairwise(pairwise(iterable)): -... yield a, b, c - ->>> import collections ->>> def sliding_window(iterable, n): -... # sliding_window('ABCDEFG', 4) -> ABCD BCDE CDEF DEFG -... it = iter(iterable) -... window = collections.deque(islice(it, n), maxlen=n) -... if len(window) == n: -... yield tuple(window) -... for x in it: -... window.append(x) -... yield tuple(window) - ->>> def roundrobin(*iterables): -... "roundrobin('ABC', 'D', 'EF') --> A D E B F C" -... # Recipe credited to George Sakkis -... pending = len(iterables) -... nexts = cycle(iter(it).__next__ for it in iterables) -... while pending: -... try: -... for next in nexts: -... yield next() -... except StopIteration: -... pending -= 1 -... nexts = cycle(islice(nexts, pending)) - ->>> def partition(pred, iterable): -... "Use a predicate to partition entries into false entries and true entries" -... 
# partition(is_odd, range(10)) --> 0 2 4 6 8 and 1 3 5 7 9 -... t1, t2 = tee(iterable) -... return filterfalse(pred, t1), filter(pred, t2) - ->>> def before_and_after(predicate, it): -... ''' Variant of takewhile() that allows complete -... access to the remainder of the iterator. -... -... >>> all_upper, remainder = before_and_after(str.isupper, 'ABCdEfGhI') -... >>> str.join('', all_upper) -... 'ABC' -... >>> str.join('', remainder) -... 'dEfGhI' -... -... Note that the first iterator must be fully -... consumed before the second iterator can -... generate valid results. -... ''' -... it = iter(it) -... transition = [] -... def true_iterator(): -... for elem in it: -... if predicate(elem): -... yield elem -... else: -... transition.append(elem) -... return -... def remainder_iterator(): -... yield from transition -... yield from it -... return true_iterator(), remainder_iterator() - ->>> def powerset(iterable): -... "powerset([1,2,3]) --> () (1,) (2,) (3,) (1,2) (1,3) (2,3) (1,2,3)" -... s = list(iterable) -... return chain.from_iterable(combinations(s, r) for r in range(len(s)+1)) - ->>> def unique_everseen(iterable, key=None): -... "List unique elements, preserving order. Remember all elements ever seen." -... # unique_everseen('AAAABBBCCDAABBB') --> A B C D -... # unique_everseen('ABBCcAD', str.lower) --> A B C D -... seen = set() -... seen_add = seen.add -... if key is None: -... for element in iterable: -... if element not in seen: -... seen_add(element) -... yield element -... else: -... for element in iterable: -... k = key(element) -... if k not in seen: -... seen_add(k) -... yield element - ->>> def unique_justseen(iterable, key=None): -... "List unique elements, preserving order. Remember only the element just seen." -... # unique_justseen('AAAABBBCCDAABBB') --> A B C D A B -... # unique_justseen('ABBCcAD', str.lower) --> A B C A D -... return map(next, map(itemgetter(1), groupby(iterable, key))) - ->>> def first_true(iterable, default=False, pred=None): -... '''Returns the first true value in the iterable. -... -... If no true value is found, returns *default* -... -... If *pred* is not None, returns the first item -... for which pred(item) is true. -... -... ''' -... # first_true([a,b,c], x) --> a or b or c or x -... # first_true([a,b], x, f) --> a if f(a) else b if f(b) else x -... return next(filter(pred, iterable), default) - ->>> def nth_combination(iterable, r, index): -... 'Equivalent to list(combinations(iterable, r))[index]' -... pool = tuple(iterable) -... n = len(pool) -... if r < 0 or r > n: -... raise ValueError -... c = 1 -... k = min(r, n-r) -... for i in range(1, k+1): -... c = c * (n - k + i) // i -... if index < 0: -... index += c -... if index < 0 or index >= c: -... raise IndexError -... result = [] -... while r: -... c, n, r = c*r//n, n-1, r-1 -... while index >= c: -... index -= c -... c, n = c*(n-r)//n, n-1 -... result.append(pool[-1-n]) -... return tuple(result) - - -This is not part of the examples but it tests to make sure the definitions -perform as purported. 
- ->>> take(10, count()) -[0, 1, 2, 3, 4, 5, 6, 7, 8, 9] - ->>> list(prepend(1, [2, 3, 4])) -[1, 2, 3, 4] - ->>> list(enumerate('abc')) -[(0, 'a'), (1, 'b'), (2, 'c')] - ->>> list(islice(tabulate(lambda x: 2*x), 4)) -[0, 2, 4, 6] - ->>> it = iter(range(10)) ->>> consume(it, 3) ->>> next(it) -3 ->>> consume(it) ->>> next(it, 'Done') -'Done' - ->>> nth('abcde', 3) -'d' - ->>> nth('abcde', 9) is None -True - ->>> [all_equal(s) for s in ('', 'A', 'AAAA', 'AAAB', 'AAABA')] -[True, True, True, False, False] - ->>> quantify(range(99), lambda x: x%2==0) -50 - ->>> a = [[1, 2, 3], [4, 5, 6]] ->>> flatten(a) -[1, 2, 3, 4, 5, 6] - ->>> list(repeatfunc(pow, 5, 2, 3)) -[8, 8, 8, 8, 8] - ->>> import random ->>> take(5, map(int, repeatfunc(random.random))) -[0, 0, 0, 0, 0] - ->>> list(islice(pad_none('abc'), 0, 6)) -['a', 'b', 'c', None, None, None] - ->>> list(ncycles('abc', 3)) -['a', 'b', 'c', 'a', 'b', 'c', 'a', 'b', 'c'] - ->>> dotproduct([1,2,3], [4,5,6]) -32 - ->>> list(grouper('abcdefg', 3, fillvalue='x')) -[('a', 'b', 'c'), ('d', 'e', 'f'), ('g', 'x', 'x')] - ->>> it = grouper('abcdefg', 3, incomplete='strict') ->>> next(it) -('a', 'b', 'c') ->>> next(it) -('d', 'e', 'f') ->>> next(it) -Traceback (most recent call last): - ... -ValueError: zip() argument 2 is shorter than argument 1 - ->>> list(grouper('abcdefg', n=3, incomplete='ignore')) -[('a', 'b', 'c'), ('d', 'e', 'f')] - ->>> list(triplewise('ABCDEFG')) -[('A', 'B', 'C'), ('B', 'C', 'D'), ('C', 'D', 'E'), ('D', 'E', 'F'), ('E', 'F', 'G')] - ->>> list(sliding_window('ABCDEFG', 4)) -[('A', 'B', 'C', 'D'), ('B', 'C', 'D', 'E'), ('C', 'D', 'E', 'F'), ('D', 'E', 'F', 'G')] - ->>> list(roundrobin('abc', 'd', 'ef')) -['a', 'd', 'e', 'b', 'f', 'c'] - ->>> def is_odd(x): -... return x % 2 == 1 - ->>> evens, odds = partition(is_odd, range(10)) ->>> list(evens) -[0, 2, 4, 6, 8] ->>> list(odds) -[1, 3, 5, 7, 9] - ->>> it = iter('ABCdEfGhI') ->>> all_upper, remainder = before_and_after(str.isupper, it) ->>> ''.join(all_upper) -'ABC' ->>> ''.join(remainder) -'dEfGhI' - ->>> list(powerset([1,2,3])) -[(), (1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)] - ->>> all(len(list(powerset(range(n)))) == 2**n for n in range(18)) -True - ->>> list(powerset('abcde')) == sorted(sorted(set(powerset('abcde'))), key=len) -True - ->>> list(unique_everseen('AAAABBBCCDAABBB')) -['A', 'B', 'C', 'D'] - ->>> list(unique_everseen('ABBCcAD', str.lower)) -['A', 'B', 'C', 'D'] - ->>> list(unique_justseen('AAAABBBCCDAABBB')) -['A', 'B', 'C', 'D', 'A', 'B'] - ->>> list(unique_justseen('ABBCcAD', str.lower)) -['A', 'B', 'C', 'A', 'D'] - ->>> first_true('ABC0DEF1', '9', str.isdigit) -'0' - ->>> population = 'ABCDEFGH' ->>> for r in range(len(population) + 1): -... seq = list(combinations(population, r)) -... for i in range(len(seq)): -... assert nth_combination(population, r, i) == seq[i] -... for i in range(-len(seq), 0): -... 
assert nth_combination(population, r, i) == seq[i] - - -""" - -__test__ = {'libreftest' : libreftest} - def load_tests(loader, tests, pattern): tests.addTest(doctest.DocTestSuite()) return tests From webhook-mailer at python.org Tue Jan 25 09:38:55 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 25 Jan 2022 14:38:55 -0000 Subject: [Python-checkins] bpo-46491: Allow Annotated on outside of Final/ClassVar (GH-30864) Message-ID: https://github.com/python/cpython/commit/41e0aead3defa6d0486514e313b6975fbf375998 commit: 41e0aead3defa6d0486514e313b6975fbf375998 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-25T06:38:45-08:00 summary: bpo-46491: Allow Annotated on outside of Final/ClassVar (GH-30864) We treat Annotated type arg as class-level annotation. This exempts it from checks against Final and ClassVar in order to allow using them in any nesting order. Automerge-Triggered-By: GH:gvanrossum (cherry picked from commit e1abffca45b60729c460e3e2ad50c8c1946cfd4e) Co-authored-by: Gregory Beauregard files: A Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst M Lib/test/test_typing.py M Lib/typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index acad35d18d5f3..a840ffe8daaec 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -4582,6 +4582,14 @@ class C: A.x = 5 self.assertEqual(C.x, 5) + def test_special_form_containment(self): + class C: + classvar: Annotated[ClassVar[int], "a decoration"] = 4 + const: Annotated[Final[int], "Const"] = 4 + + self.assertEqual(get_type_hints(C, globals())['classvar'], ClassVar[int]) + self.assertEqual(get_type_hints(C, globals())['const'], Final[int]) + def test_hash_eq(self): self.assertEqual(len({Annotated[int, 4, 5], Annotated[int, 4, 5]}), 1) self.assertNotEqual(Annotated[int, 4, 5], Annotated[int, 5, 4]) diff --git a/Lib/typing.py b/Lib/typing.py index 25225470afbac..705331a9a89a0 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -143,7 +143,7 @@ def _type_convert(arg, module=None): return arg -def _type_check(arg, msg, is_argument=True, module=None, *, is_class=False): +def _type_check(arg, msg, is_argument=True, module=None, *, allow_special_forms=False): """Check that the argument is a type, and return it (internal helper). As a special case, accept None and return type(None) instead. Also wrap strings @@ -156,7 +156,7 @@ def _type_check(arg, msg, is_argument=True, module=None, *, is_class=False): We append the repr() of the actual value (truncated to 100 chars). """ invalid_generic_forms = (Generic, Protocol) - if not is_class: + if not allow_special_forms: invalid_generic_forms += (ClassVar,) if is_argument: invalid_generic_forms += (Final,) @@ -691,7 +691,7 @@ def _evaluate(self, globalns, localns, recursive_guard): eval(self.__forward_code__, globalns, localns), "Forward references must evaluate to types.", is_argument=self.__forward_is_argument__, - is_class=self.__forward_is_class__, + allow_special_forms=self.__forward_is_class__, ) self.__forward_value__ = _eval_type( type_, globalns, localns, recursive_guard | {self.__forward_arg__} @@ -1677,7 +1677,7 @@ def __class_getitem__(cls, params): "with at least two arguments (a type and an " "annotation).") msg = "Annotated[t, ...]: t must be a type." 
- origin = _type_check(params[0], msg) + origin = _type_check(params[0], msg, allow_special_forms=True) metadata = tuple(params[1:]) return _AnnotatedAlias(origin, metadata) diff --git a/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst b/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst new file mode 100644 index 0000000000000..f66e8868f753f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst @@ -0,0 +1 @@ +Allow :data:`typing.Annotated` to wrap :data:`typing.Final` and :data:`typing.ClassVar`. Patch by Gregory Beauregard. From webhook-mailer at python.org Tue Jan 25 09:39:16 2022 From: webhook-mailer at python.org (miss-islington) Date: Tue, 25 Jan 2022 14:39:16 -0000 Subject: [Python-checkins] bpo-46491: Allow Annotated on outside of Final/ClassVar (GH-30864) Message-ID: https://github.com/python/cpython/commit/b0b8388a1c29dc9203dd1a9e8b1420a6a5e88c97 commit: b0b8388a1c29dc9203dd1a9e8b1420a6a5e88c97 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-25T06:39:12-08:00 summary: bpo-46491: Allow Annotated on outside of Final/ClassVar (GH-30864) We treat Annotated type arg as class-level annotation. This exempts it from checks against Final and ClassVar in order to allow using them in any nesting order. Automerge-Triggered-By: GH:gvanrossum (cherry picked from commit e1abffca45b60729c460e3e2ad50c8c1946cfd4e) Co-authored-by: Gregory Beauregard files: A Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst M Lib/test/test_typing.py M Lib/typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 17da4b81f5193..f87832a631d49 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -4219,6 +4219,14 @@ class C: A.x = 5 self.assertEqual(C.x, 5) + def test_special_form_containment(self): + class C: + classvar: Annotated[ClassVar[int], "a decoration"] = 4 + const: Annotated[Final[int], "Const"] = 4 + + self.assertEqual(get_type_hints(C, globals())['classvar'], ClassVar[int]) + self.assertEqual(get_type_hints(C, globals())['const'], Final[int]) + def test_hash_eq(self): self.assertEqual(len({Annotated[int, 4, 5], Annotated[int, 4, 5]}), 1) self.assertNotEqual(Annotated[int, 4, 5], Annotated[int, 5, 4]) diff --git a/Lib/typing.py b/Lib/typing.py index da70d4115fa23..fdc0f163d6504 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -134,7 +134,7 @@ def _type_convert(arg, module=None): return arg -def _type_check(arg, msg, is_argument=True, module=None, *, is_class=False): +def _type_check(arg, msg, is_argument=True, module=None, *, allow_special_forms=False): """Check that the argument is a type, and return it (internal helper). As a special case, accept None and return type(None) instead. Also wrap strings @@ -147,7 +147,7 @@ def _type_check(arg, msg, is_argument=True, module=None, *, is_class=False): We append the repr() of the actual value (truncated to 100 chars). 
""" invalid_generic_forms = (Generic, Protocol) - if not is_class: + if not allow_special_forms: invalid_generic_forms += (ClassVar,) if is_argument: invalid_generic_forms += (Final,) @@ -554,7 +554,7 @@ def _evaluate(self, globalns, localns, recursive_guard): eval(self.__forward_code__, globalns, localns), "Forward references must evaluate to types.", is_argument=self.__forward_is_argument__, - is_class=self.__forward_is_class__, + allow_special_forms=self.__forward_is_class__, ) self.__forward_value__ = _eval_type( type_, globalns, localns, recursive_guard | {self.__forward_arg__} @@ -1336,7 +1336,7 @@ def __class_getitem__(cls, params): "with at least two arguments (a type and an " "annotation).") msg = "Annotated[t, ...]: t must be a type." - origin = _type_check(params[0], msg) + origin = _type_check(params[0], msg, allow_special_forms=True) metadata = tuple(params[1:]) return _AnnotatedAlias(origin, metadata) diff --git a/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst b/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst new file mode 100644 index 0000000000000..f66e8868f753f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-24-23-55-30.bpo-46491.jmIKHo.rst @@ -0,0 +1 @@ +Allow :data:`typing.Annotated` to wrap :data:`typing.Final` and :data:`typing.ClassVar`. Patch by Gregory Beauregard. From webhook-mailer at python.org Tue Jan 25 10:31:24 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Tue, 25 Jan 2022 15:31:24 -0000 Subject: [Python-checkins] [3.10] bpo-46445, bpo-46519: Re-import typing.NewType (GH-30886) Message-ID: https://github.com/python/cpython/commit/9a7d01046723a8a0a10f9a26702c5e39e73d4414 commit: 9a7d01046723a8a0a10f9a26702c5e39e73d4414 branch: 3.10 author: Ken Jin <28750310+Fidget-Spinner at users.noreply.github.com> committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-25T23:31:12+08:00 summary: [3.10] bpo-46445, bpo-46519: Re-import typing.NewType (GH-30886) Partially reverts 65b88d5e01c845c0cfa3ff61bc8b2faec8f67a57. 
files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index a840ffe8daaec..9b552c422d56d 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -19,6 +19,7 @@ from typing import is_typeddict from typing import no_type_check, no_type_check_decorator from typing import Type +from typing import NewType from typing import NamedTuple, TypedDict from typing import IO, TextIO, BinaryIO from typing import Pattern, Match From webhook-mailer at python.org Tue Jan 25 10:34:17 2022 From: webhook-mailer at python.org (iritkatriel) Date: Tue, 25 Jan 2022 15:34:17 -0000 Subject: [Python-checkins] bpo-46510: update Python2-style exception handling in argparse (GH-30881) Message-ID: https://github.com/python/cpython/commit/45f5f52601ebccb195c19cb0a77beaf7f7dfa56a commit: 45f5f52601ebccb195c19cb0a77beaf7f7dfa56a branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-25T15:34:03Z summary: bpo-46510: update Python2-style exception handling in argparse (GH-30881) files: M Lib/argparse.py diff --git a/Lib/argparse.py b/Lib/argparse.py index 9344dab3e60d5..3c6aa3c991bfd 100644 --- a/Lib/argparse.py +++ b/Lib/argparse.py @@ -1875,8 +1875,7 @@ def parse_known_args(self, args=None, namespace=None): if self.exit_on_error: try: namespace, args = self._parse_known_args(args, namespace) - except ArgumentError: - err = _sys.exc_info()[1] + except ArgumentError as err: self.error(str(err)) else: namespace, args = self._parse_known_args(args, namespace) @@ -2151,8 +2150,7 @@ def _read_args_from_files(self, arg_strings): arg_strings.append(arg) arg_strings = self._read_args_from_files(arg_strings) new_arg_strings.extend(arg_strings) - except OSError: - err = _sys.exc_info()[1] + except OSError as err: self.error(str(err)) # return the modified argument list @@ -2502,9 +2500,9 @@ def _get_value(self, action, arg_string): result = type_func(arg_string) # ArgumentTypeErrors indicate errors - except ArgumentTypeError: + except ArgumentTypeError as err: name = getattr(action.type, '__name__', repr(action.type)) - msg = str(_sys.exc_info()[1]) + msg = str(err) raise ArgumentError(action, msg) # TypeErrors or ValueErrors also indicate errors From webhook-mailer at python.org Tue Jan 25 12:40:35 2022 From: webhook-mailer at python.org (vstinner) Date: Tue, 25 Jan 2022 17:40:35 -0000 Subject: [Python-checkins] bpo-41682: fixed flaky test test_sendfile_close_peer_in_the_middle_of_receiving (GH-30845) (#30860) Message-ID: https://github.com/python/cpython/commit/75d88b91e6b1320ae0511eaf72e860bea913a3eb commit: 75d88b91e6b1320ae0511eaf72e860bea913a3eb branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-25T18:40:13+01:00 summary: bpo-41682: fixed flaky test test_sendfile_close_peer_in_the_middle_of_receiving (GH-30845) (#30860) (cherry picked from commit 1c705fda8f9902906edd26d46acb0433b0b098e1) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: M Lib/test/test_asyncio/test_sendfile.py diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index c8bfa892c73fc..effca6644c062 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -92,9 +92,13 @@ async def wait_closed(self): class 
SendfileBase: - # 128 KiB plus small unaligned to buffer chunk - DATA = b"SendfileBaseData" * (1024 * 8 + 1) - + # 256 KiB plus small unaligned to buffer chunk + # Newer versions of Windows seems to have increased its internal + # buffer and tries to send as much of the data as it can as it + # has some form of buffering for this which is less than 256KiB + # on newer server versions and Windows 11. + # So DATA should be larger than 256 KiB to make this test reliable. + DATA = b"x" * (1024 * 256 + 1) # Reduce socket buffer size to test on relative small data sets. BUF_SIZE = 4 * 1024 # 4 KiB @@ -456,8 +460,6 @@ def test_sendfile_ssl_close_peer_after_receiving(self): # themselves). @unittest.skipIf(sys.platform.startswith('sunos'), "Doesn't work on Solaris") - @unittest.skipIf(sys.platform == "win32", - "It is flaky on Windows and needs to be fixed") # TODO: bpo-41682 def test_sendfile_close_peer_in_the_middle_of_receiving(self): srv_proto, cli_proto = self.prepare_sendfile(close_after=1024) with self.assertRaises(ConnectionError): From webhook-mailer at python.org Tue Jan 25 12:40:39 2022 From: webhook-mailer at python.org (vstinner) Date: Tue, 25 Jan 2022 17:40:39 -0000 Subject: [Python-checkins] bpo-41682: fixed flaky test test_sendfile_close_peer_in_the_middle_of_receiving (GH-30845) (#30861) Message-ID: https://github.com/python/cpython/commit/f9ff0bf515e0fa162889aca508e755cc65d85079 commit: f9ff0bf515e0fa162889aca508e755cc65d85079 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: vstinner date: 2022-01-25T18:40:34+01:00 summary: bpo-41682: fixed flaky test test_sendfile_close_peer_in_the_middle_of_receiving (GH-30845) (#30861) (cherry picked from commit 1c705fda8f9902906edd26d46acb0433b0b098e1) Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> Co-authored-by: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> files: M Lib/test/test_asyncio/test_sendfile.py diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 0ba966cb5ccf4..8e34ab4938390 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -87,9 +87,13 @@ async def wait_closed(self): class SendfileBase: - # 128 KiB plus small unaligned to buffer chunk - DATA = b"SendfileBaseData" * (1024 * 8 + 1) - + # 256 KiB plus small unaligned to buffer chunk + # Newer versions of Windows seems to have increased its internal + # buffer and tries to send as much of the data as it can as it + # has some form of buffering for this which is less than 256KiB + # on newer server versions and Windows 11. + # So DATA should be larger than 256 KiB to make this test reliable. + DATA = b"x" * (1024 * 256 + 1) # Reduce socket buffer size to test on relative small data sets. BUF_SIZE = 4 * 1024 # 4 KiB @@ -451,8 +455,6 @@ def test_sendfile_ssl_close_peer_after_receiving(self): # themselves). 
@unittest.skipIf(sys.platform.startswith('sunos'), "Doesn't work on Solaris") - @unittest.skipIf(sys.platform == "win32", - "It is flaky on Windows and needs to be fixed") # TODO: bpo-41682 def test_sendfile_close_peer_in_the_middle_of_receiving(self): srv_proto, cli_proto = self.prepare_sendfile(close_after=1024) with self.assertRaises(ConnectionError): From webhook-mailer at python.org Tue Jan 25 12:58:25 2022 From: webhook-mailer at python.org (iritkatriel) Date: Tue, 25 Jan 2022 17:58:25 -0000 Subject: [Python-checkins] bpo-46510: simplify exception handling code in xmlrpc (GH-30878) Message-ID: https://github.com/python/cpython/commit/d69d3d8b2fec501e51309221fb1fa4622c8a3db3 commit: d69d3d8b2fec501e51309221fb1fa4622c8a3db3 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-25T17:58:13Z summary: bpo-46510: simplify exception handling code in xmlrpc (GH-30878) files: M Lib/xmlrpc/server.py diff --git a/Lib/xmlrpc/server.py b/Lib/xmlrpc/server.py index e22e480a829ff..4228a8535bfba 100644 --- a/Lib/xmlrpc/server.py +++ b/Lib/xmlrpc/server.py @@ -268,17 +268,11 @@ def _marshaled_dispatch(self, data, dispatch_method = None, path = None): except Fault as fault: response = dumps(fault, allow_none=self.allow_none, encoding=self.encoding) - except: - # report exception back to server - exc_type, exc_value, exc_tb = sys.exc_info() - try: - response = dumps( - Fault(1, "%s:%s" % (exc_type, exc_value)), - encoding=self.encoding, allow_none=self.allow_none, - ) - finally: - # Break reference cycle - exc_type = exc_value = exc_tb = None + except BaseException as exc: + response = dumps( + Fault(1, "%s:%s" % (type(exc), exc)), + encoding=self.encoding, allow_none=self.allow_none, + ) return response.encode(self.encoding, 'xmlcharrefreplace') @@ -368,16 +362,11 @@ def system_multicall(self, call_list): {'faultCode' : fault.faultCode, 'faultString' : fault.faultString} ) - except: - exc_type, exc_value, exc_tb = sys.exc_info() - try: - results.append( - {'faultCode' : 1, - 'faultString' : "%s:%s" % (exc_type, exc_value)} - ) - finally: - # Break reference cycle - exc_type = exc_value = exc_tb = None + except BaseException as exc: + results.append( + {'faultCode' : 1, + 'faultString' : "%s:%s" % (type(exc), exc)} + ) return results def _dispatch(self, method, params): @@ -634,19 +623,14 @@ def _marshaled_dispatch(self, data, dispatch_method = None, path = None): try: response = self.dispatchers[path]._marshaled_dispatch( data, dispatch_method, path) - except: + except BaseException as exc: # report low level exception back to server # (each dispatcher should have handled their own # exceptions) - exc_type, exc_value = sys.exc_info()[:2] - try: - response = dumps( - Fault(1, "%s:%s" % (exc_type, exc_value)), - encoding=self.encoding, allow_none=self.allow_none) - response = response.encode(self.encoding, 'xmlcharrefreplace') - finally: - # Break reference cycle - exc_type = exc_value = None + response = dumps( + Fault(1, "%s:%s" % (type(exc), exc)), + encoding=self.encoding, allow_none=self.allow_none) + response = response.encode(self.encoding, 'xmlcharrefreplace') return response class CGIXMLRPCRequestHandler(SimpleXMLRPCDispatcher): From webhook-mailer at python.org Tue Jan 25 13:01:05 2022 From: webhook-mailer at python.org (iritkatriel) Date: Tue, 25 Jan 2022 18:01:05 -0000 Subject: [Python-checkins] bpo-46510: Add missing test for types.TracebackType/FrameType. 
Calculate them directly from the caught exception. (GH-30880) Message-ID: https://github.com/python/cpython/commit/ec7c17ea236f71c8376abcc2930a7c857d417966 commit: ec7c17ea236f71c8376abcc2930a7c857d417966 branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-25T18:00:57Z summary: bpo-46510: Add missing test for types.TracebackType/FrameType. Calculate them directly from the caught exception. (GH-30880) files: A Misc/NEWS.d/next/Library/2022-01-25-10-59-41.bpo-46510.PM5svI.rst M Lib/test/test_types.py M Lib/types.py diff --git a/Lib/test/test_types.py b/Lib/test/test_types.py index 3dfda5cb95663..c54854eeb5ad2 100644 --- a/Lib/test/test_types.py +++ b/Lib/test/test_types.py @@ -624,6 +624,14 @@ def test_notimplemented_type(self): def test_none_type(self): self.assertIsInstance(None, types.NoneType) + def test_traceback_and_frame_types(self): + try: + raise OSError + except OSError as e: + exc = e + self.assertIsInstance(exc.__traceback__, types.TracebackType) + self.assertIsInstance(exc.__traceback__.tb_frame, types.FrameType) + class UnionTests(unittest.TestCase): diff --git a/Lib/types.py b/Lib/types.py index 679c7f638b310..9490da7b9ee3b 100644 --- a/Lib/types.py +++ b/Lib/types.py @@ -52,11 +52,9 @@ def _m(self): pass try: raise TypeError -except TypeError: - tb = sys.exc_info()[2] - TracebackType = type(tb) - FrameType = type(tb.tb_frame) - tb = None; del tb +except TypeError as exc: + TracebackType = type(exc.__traceback__) + FrameType = type(exc.__traceback__.tb_frame) # For Jython, the following two types are identical GetSetDescriptorType = type(FunctionType.__code__) diff --git a/Misc/NEWS.d/next/Library/2022-01-25-10-59-41.bpo-46510.PM5svI.rst b/Misc/NEWS.d/next/Library/2022-01-25-10-59-41.bpo-46510.PM5svI.rst new file mode 100644 index 0000000000000..b416a1692270b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-25-10-59-41.bpo-46510.PM5svI.rst @@ -0,0 +1,3 @@ +Add missing test for :class:`types.TracebackType` and +:class:`types.FrameType`. Calculate them directly from the caught exception +without calling :func:`sys.exc_info`. 
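For illustration, the pattern adopted by the change above can be sketched as a small standalone snippet (an approximation of the new Lib/types.py logic shown in the diff, not the verbatim library code):

    # Derive the traceback and frame types from the caught exception itself,
    # without calling sys.exc_info().
    try:
        raise TypeError
    except TypeError as exc:
        TracebackType = type(exc.__traceback__)
        FrameType = type(exc.__traceback__.tb_frame)

    # Both names match the types already exposed by the types module:
    import types
    assert TracebackType is types.TracebackType
    assert FrameType is types.FrameType
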
From webhook-mailer at python.org Tue Jan 25 14:02:32 2022 From: webhook-mailer at python.org (vstinner) Date: Tue, 25 Jan 2022 19:02:32 -0000 Subject: [Python-checkins] bpo-45382: test.pythoninfo: set wmic.exe encoding to OEM (GH-30890) Message-ID: https://github.com/python/cpython/commit/cef0a5458f254c2f8536b928dee25862ca90ffa6 commit: cef0a5458f254c2f8536b928dee25862ca90ffa6 branch: main author: Victor Stinner committer: vstinner date: 2022-01-25T20:02:23+01:00 summary: bpo-45382: test.pythoninfo: set wmic.exe encoding to OEM (GH-30890) files: M Lib/test/pythoninfo.py diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index cfd7ac2755d51..d15a11c80b649 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -731,9 +731,12 @@ def collect_windows(info_add): import subprocess try: + # When wmic.exe output is redirected to a pipe, + # it uses the OEM code page proc = subprocess.Popen(["wmic", "os", "get", "Caption,Version", "/value"], stdout=subprocess.PIPE, stderr=subprocess.PIPE, + encoding="oem", text=True) output, stderr = proc.communicate() if proc.returncode: From webhook-mailer at python.org Tue Jan 25 15:20:49 2022 From: webhook-mailer at python.org (vstinner) Date: Tue, 25 Jan 2022 20:20:49 -0000 Subject: [Python-checkins] [3.10] bpo-45382: test.pythoninfo logs more Windows versions (GH-30891) Message-ID: https://github.com/python/cpython/commit/4a57fa296b92125e41220ecd201eb2e432b79fb0 commit: 4a57fa296b92125e41220ecd201eb2e432b79fb0 branch: 3.10 author: Victor Stinner committer: vstinner date: 2022-01-25T21:20:34+01:00 summary: [3.10] bpo-45382: test.pythoninfo logs more Windows versions (GH-30891) Add the following info to test.pythoninfo: * windows.ver: output of the shell "ver" command * windows.version and windows.version_caption: output of the "wmic os get Caption,Version /value" command. 
(cherry picked from commit b0898f4aa90d9397e23aef98a2d6b82445ee7455) * bpo-45382: test.pythoninfo: set wmic.exe encoding to OEM (GH-30890) (cherry picked from commit cef0a5458f254c2f8536b928dee25862ca90ffa6) files: M Lib/test/pythoninfo.py diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 39ee9e1d769f8..8c8011b550acd 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -720,6 +720,48 @@ def collect_windows(info_add): except (ImportError, AttributeError): pass + import subprocess + try: + # When wmic.exe output is redirected to a pipe, + # it uses the OEM code page + proc = subprocess.Popen(["wmic", "os", "get", "Caption,Version", "/value"], + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + encoding="oem", + text=True) + output, stderr = proc.communicate() + if proc.returncode: + output = "" + except OSError: + pass + else: + for line in output.splitlines(): + line = line.strip() + if line.startswith('Caption='): + line = line.removeprefix('Caption=').strip() + if line: + info_add('windows.version_caption', line) + elif line.startswith('Version='): + line = line.removeprefix('Version=').strip() + if line: + info_add('windows.version', line) + + try: + proc = subprocess.Popen(["ver"], shell=True, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + text=True) + output = proc.communicate()[0] + if proc.returncode: + output = "" + except OSError: + return + else: + output = output.strip() + line = output.splitlines()[0] + if line: + info_add('windows.ver', line) + def collect_fips(info_add): try: From webhook-mailer at python.org Tue Jan 25 17:01:21 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 25 Jan 2022 22:01:21 -0000 Subject: [Python-checkins] Refactor sanitiser skip tests into test.support (GH-30889) Message-ID: https://github.com/python/cpython/commit/b1cb8430504931f7854eac5d32cba74770078a4e commit: b1cb8430504931f7854eac5d32cba74770078a4e branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-25T22:01:10Z summary: Refactor sanitiser skip tests into test.support (GH-30889) * Refactor sanitizer skip tests into test.support * fixup! Refactor sanitizer skip tests into test.support * fixup! fixup! 
Refactor sanitizer skip tests into test.support files: M Lib/test/support/__init__.py M Lib/test/test_faulthandler.py M Lib/test/test_io.py diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 1e4935fc3e617..583d94f41e563 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -43,7 +43,7 @@ "has_subprocess_support", "requires_subprocess", "anticipate_failure", "load_package_tests", "detect_api_mismatch", "check__all__", "skip_if_buggy_ucrt_strfptime", - "check_disallow_instantiation", + "check_disallow_instantiation", "skip_if_sanitizer", # sys "is_jython", "is_android", "is_emscripten", "is_wasi", "check_impl_detail", "unix_shell", "setswitchinterval", @@ -384,6 +384,35 @@ def skip_if_buildbot(reason=None): isbuildbot = os.environ.get('USER') == 'buildbot' return unittest.skipIf(isbuildbot, reason) +def skip_if_sanitizer(reason=None, *, address=False, memory=False, ub=False): + """Decorator raising SkipTest if running with a sanitizer active.""" + if not (address or memory or ub): + raise ValueError('At least one of address, memory, or ub must be True') + + if not reason: + reason = 'not working with sanitizers active' + + _cflags = sysconfig.get_config_var('CFLAGS') or '' + _config_args = sysconfig.get_config_var('CONFIG_ARGS') or '' + memory_sanitizer = ( + '-fsanitize=memory' in _cflags or + '--with-memory-sanitizer' in _config_args + ) + address_sanitizer = ( + '-fsanitize=address' in _cflags or + '--with-memory-sanitizer' in _config_args + ) + ub_sanitizer = ( + '-fsanitize=undefined' in _cflags or + '--with-undefined-behavior-sanitizer' in _config_args + ) + skip = ( + (memory and memory_sanitizer) or + (address and address_sanitizer) or + (ub and ub_sanitizer) + ) + return unittest.skipIf(skip, reason) + def system_must_validate_cert(f): """Skip the test on TLS certificate validation failures.""" diff --git a/Lib/test/test_faulthandler.py b/Lib/test/test_faulthandler.py index f7eaa77942476..daacdeef5bc80 100644 --- a/Lib/test/test_faulthandler.py +++ b/Lib/test/test_faulthandler.py @@ -6,10 +6,10 @@ import signal import subprocess import sys -import sysconfig from test import support from test.support import os_helper from test.support import script_helper, is_android +from test.support import skip_if_sanitizer import tempfile import unittest from textwrap import dedent @@ -21,16 +21,6 @@ TIMEOUT = 0.5 MS_WINDOWS = (os.name == 'nt') -_cflags = sysconfig.get_config_var('CFLAGS') or '' -_config_args = sysconfig.get_config_var('CONFIG_ARGS') or '' -UB_SANITIZER = ( - '-fsanitize=undefined' in _cflags or - '--with-undefined-behavior-sanitizer' in _config_args -) -MEMORY_SANITIZER = ( - '-fsanitize=memory' in _cflags or - '--with-memory-sanitizer' in _config_args -) def expected_traceback(lineno1, lineno2, header, min_count=1): @@ -311,8 +301,8 @@ def test_gil_released(self): 3, 'Segmentation fault') - @unittest.skipIf(UB_SANITIZER or MEMORY_SANITIZER, - "sanitizer builds change crashing process output.") + @skip_if_sanitizer(memory=True, ub=True, reason="sanitizer " + "builds change crashing process output.") @skip_segfault_on_android def test_enable_file(self): with temporary_filename() as filename: @@ -328,8 +318,8 @@ def test_enable_file(self): @unittest.skipIf(sys.platform == "win32", "subprocess doesn't support pass_fds on Windows") - @unittest.skipIf(UB_SANITIZER or MEMORY_SANITIZER, - "sanitizer builds change crashing process output.") + @skip_if_sanitizer(memory=True, ub=True, reason="sanitizer " + "builds change crashing 
process output.") @skip_segfault_on_android def test_enable_fd(self): with tempfile.TemporaryFile('wb+') as fp: diff --git a/Lib/test/test_io.py b/Lib/test/test_io.py index 3619e749d1731..a10611abb13f4 100644 --- a/Lib/test/test_io.py +++ b/Lib/test/test_io.py @@ -28,7 +28,6 @@ import random import signal import sys -import sysconfig import textwrap import threading import time @@ -44,6 +43,7 @@ from test.support import os_helper from test.support import threading_helper from test.support import warnings_helper +from test.support import skip_if_sanitizer from test.support.os_helper import FakePath import codecs @@ -66,17 +66,6 @@ def byteslike(*pos, **kw): class EmptyStruct(ctypes.Structure): pass -_cflags = sysconfig.get_config_var('CFLAGS') or '' -_config_args = sysconfig.get_config_var('CONFIG_ARGS') or '' -MEMORY_SANITIZER = ( - '-fsanitize=memory' in _cflags or - '--with-memory-sanitizer' in _config_args -) - -ADDRESS_SANITIZER = ( - '-fsanitize=address' in _cflags -) - # Does io.IOBase finalizer log the exception if the close() method fails? # The exception is ignored silently by default in release build. IOBASE_EMITS_UNRAISABLE = (hasattr(sys, "gettotalrefcount") or sys.flags.dev_mode) @@ -1550,8 +1539,8 @@ def test_truncate_on_read_only(self): class CBufferedReaderTest(BufferedReaderTest, SizeofTest): tp = io.BufferedReader - @unittest.skipIf(MEMORY_SANITIZER or ADDRESS_SANITIZER, "sanitizer defaults to crashing " - "instead of returning NULL for malloc failure.") + @skip_if_sanitizer(memory=True, address=True, reason= "sanitizer defaults to crashing " + "instead of returning NULL for malloc failure.") def test_constructor(self): BufferedReaderTest.test_constructor(self) # The allocation can succeed on 32-bit builds, e.g. with more @@ -1915,8 +1904,8 @@ def test_slow_close_from_thread(self): class CBufferedWriterTest(BufferedWriterTest, SizeofTest): tp = io.BufferedWriter - @unittest.skipIf(MEMORY_SANITIZER or ADDRESS_SANITIZER, "sanitizer defaults to crashing " - "instead of returning NULL for malloc failure.") + @skip_if_sanitizer(memory=True, address=True, reason= "sanitizer defaults to crashing " + "instead of returning NULL for malloc failure.") def test_constructor(self): BufferedWriterTest.test_constructor(self) # The allocation can succeed on 32-bit builds, e.g. with more @@ -2414,8 +2403,8 @@ def test_interleaved_readline_write(self): class CBufferedRandomTest(BufferedRandomTest, SizeofTest): tp = io.BufferedRandom - @unittest.skipIf(MEMORY_SANITIZER or ADDRESS_SANITIZER, "sanitizer defaults to crashing " - "instead of returning NULL for malloc failure.") + @skip_if_sanitizer(memory=True, address=True, reason= "sanitizer defaults to crashing " + "instead of returning NULL for malloc failure.") def test_constructor(self): BufferedRandomTest.test_constructor(self) # The allocation can succeed on 32-bit builds, e.g. 
with more From webhook-mailer at python.org Tue Jan 25 17:12:20 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 25 Jan 2022 22:12:20 -0000 Subject: [Python-checkins] bpo-46091: Correctly calculate indentation levels for whitespace lines with continuation characters (GH-30130) Message-ID: https://github.com/python/cpython/commit/a0efc0c1960e2c49e0092694d98395555270914c commit: a0efc0c1960e2c49e0092694d98395555270914c branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-25T22:12:14Z summary: bpo-46091: Correctly calculate indentation levels for whitespace lines with continuation characters (GH-30130) files: A Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst M Lib/test/test_ast.py M Lib/test/test_syntax.py M Lib/test/test_tokenize.py M Parser/tokenizer.c diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index 314b360c58ba9..039d1c1010b6d 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -1078,8 +1078,7 @@ def test_literal_eval_malformed_lineno(self): ast.literal_eval(node) def test_literal_eval_syntax_errors(self): - msg = "unexpected character after line continuation character" - with self.assertRaisesRegex(SyntaxError, msg): + with self.assertRaisesRegex(SyntaxError, "unexpected indent"): ast.literal_eval(r''' \ (\ diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index 968d34809ce43..a6ff319af2ac8 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -1613,6 +1613,36 @@ def test_empty_line_after_linecont(self): except SyntaxError: self.fail("Empty line after a line continuation character is valid.") + # See issue-46091 + s1 = r"""\ +def fib(n): + \ +'''Print a Fibonacci series up to n.''' + \ +a, b = 0, 1 +""" + s2 = r"""\ +def fib(n): + '''Print a Fibonacci series up to n.''' + a, b = 0, 1 +""" + try: + self.assertEqual(compile(s1, '', 'exec'), compile(s2, '', 'exec')) + except SyntaxError: + self.fail("Indented statement over multiple lines is valid") + + def test_continuation_bad_indentation(self): + # Check that code that breaks indentation across multiple lines raises a syntax error + + code = r"""\ +if x: + y = 1 + \ + foo = 1 + """ + + self.assertRaises(IndentationError, exec, code) + @support.cpython_only def test_nested_named_except_blocks(self): code = "" diff --git a/Lib/test/test_tokenize.py b/Lib/test/test_tokenize.py index ca2821de7c081..334390abaa2de 100644 --- a/Lib/test/test_tokenize.py +++ b/Lib/test/test_tokenize.py @@ -6,6 +6,7 @@ NEWLINE, _generate_tokens_from_c_tokenizer) from io import BytesIO, StringIO import unittest +from textwrap import dedent from unittest import TestCase, mock from test.test_grammar import (VALID_UNDERSCORE_LITERALS, INVALID_UNDERSCORE_LITERALS) @@ -44,7 +45,6 @@ def check_tokenize(self, s, expected): # The ENDMARKER and final NEWLINE are omitted. 
f = BytesIO(s.encode('utf-8')) result = stringify_tokens_from_source(tokenize(f.readline), s) - self.assertEqual(result, [" ENCODING 'utf-8' (0, 0) (0, 0)"] + expected.rstrip().splitlines()) @@ -2511,7 +2511,105 @@ def get_tokens(string): self.assertRaises(SyntaxError, get_tokens, "("*1000+"a"+")"*1000) self.assertRaises(SyntaxError, get_tokens, "]") + + def test_continuation_lines_indentation(self): + def get_tokens(string): + return [(kind, string) for (kind, string, *_) in _generate_tokens_from_c_tokenizer(string)] + code = dedent(""" + def fib(n): + \\ + '''Print a Fibonacci series up to n.''' + \\ + a, b = 0, 1 + """) + + self.check_tokenize(code, """\ + NAME 'def' (2, 0) (2, 3) + NAME 'fib' (2, 4) (2, 7) + LPAR '(' (2, 7) (2, 8) + NAME 'n' (2, 8) (2, 9) + RPAR ')' (2, 9) (2, 10) + COLON ':' (2, 10) (2, 11) + NEWLINE '' (2, 11) (2, 11) + INDENT '' (4, -1) (4, -1) + STRING "'''Print a Fibonacci series up to n.'''" (4, 0) (4, 39) + NEWLINE '' (4, 39) (4, 39) + NAME 'a' (6, 0) (6, 1) + COMMA ',' (6, 1) (6, 2) + NAME 'b' (6, 3) (6, 4) + EQUAL '=' (6, 5) (6, 6) + NUMBER '0' (6, 7) (6, 8) + COMMA ',' (6, 8) (6, 9) + NUMBER '1' (6, 10) (6, 11) + NEWLINE '' (6, 11) (6, 11) + DEDENT '' (6, -1) (6, -1) + """) + + code_no_cont = dedent(""" + def fib(n): + '''Print a Fibonacci series up to n.''' + a, b = 0, 1 + """) + + self.assertEqual(get_tokens(code), get_tokens(code_no_cont)) + + code = dedent(""" + pass + \\ + + pass + """) + + self.check_tokenize(code, """\ + NAME 'pass' (2, 0) (2, 4) + NEWLINE '' (2, 4) (2, 4) + NAME 'pass' (5, 0) (5, 4) + NEWLINE '' (5, 4) (5, 4) + """) + + code_no_cont = dedent(""" + pass + pass + """) + + self.assertEqual(get_tokens(code), get_tokens(code_no_cont)) + + code = dedent(""" + if x: + y = 1 + \\ + \\ + \\ + \\ + foo = 1 + """) + + self.check_tokenize(code, """\ + NAME 'if' (2, 0) (2, 2) + NAME 'x' (2, 3) (2, 4) + COLON ':' (2, 4) (2, 5) + NEWLINE '' (2, 5) (2, 5) + INDENT '' (3, -1) (3, -1) + NAME 'y' (3, 4) (3, 5) + EQUAL '=' (3, 6) (3, 7) + NUMBER '1' (3, 8) (3, 9) + NEWLINE '' (3, 9) (3, 9) + NAME 'foo' (8, 4) (8, 7) + EQUAL '=' (8, 8) (8, 9) + NUMBER '1' (8, 10) (8, 11) + NEWLINE '' (8, 11) (8, 11) + DEDENT '' (8, -1) (8, -1) + """) + + code_no_cont = dedent(""" + if x: + y = 1 + foo = 1 + """) + + self.assertEqual(get_tokens(code), get_tokens(code_no_cont)) + if __name__ == "__main__": unittest.main() diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst new file mode 100644 index 0000000000000..a2eee0f3ebd51 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst @@ -0,0 +1,2 @@ +Correctly calculate indentation levels for lines with whitespace character +that are ended by line continuation characters. Patch by Pablo Galindo diff --git a/Parser/tokenizer.c b/Parser/tokenizer.c index 5e35d6fa621b1..cd4254f8b9077 100644 --- a/Parser/tokenizer.c +++ b/Parser/tokenizer.c @@ -1347,6 +1347,24 @@ tok_decimal_tail(struct tok_state *tok) /* Get next token, after space stripping etc. 
*/ +static inline int +tok_continuation_line(struct tok_state *tok) { + int c = tok_nextc(tok); + if (c != '\n') { + tok->done = E_LINECONT; + return -1; + } + c = tok_nextc(tok); + if (c == EOF) { + tok->done = E_EOF; + tok->cur = tok->inp; + return -1; + } else { + tok_backup(tok, c); + } + return c; +} + static int tok_get(struct tok_state *tok, const char **p_start, const char **p_end) { @@ -1363,6 +1381,7 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) int col = 0; int altcol = 0; tok->atbol = 0; + int cont_line_col = 0; for (;;) { c = tok_nextc(tok); if (c == ' ') { @@ -1375,14 +1394,23 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) else if (c == '\014') {/* Control-L (formfeed) */ col = altcol = 0; /* For Emacs users */ } + else if (c == '\\') { + // Indentation cannot be split over multiple physical lines + // using backslashes. This means that if we found a backslash + // preceded by whitespace, **the first one we find** determines + // the level of indentation of whatever comes next. + cont_line_col = cont_line_col ? cont_line_col : col; + if ((c = tok_continuation_line(tok)) == -1) { + return ERRORTOKEN; + } + } else { break; } } tok_backup(tok, c); - if (c == '#' || c == '\n' || c == '\\') { + if (c == '#' || c == '\n') { /* Lines with only whitespace and/or comments - and/or a line continuation character shouldn't affect the indentation and are not passed to the parser as NEWLINE tokens, except *totally* empty lines in interactive @@ -1403,6 +1431,8 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) may need to skip to the end of a comment */ } if (!blankline && tok->level == 0) { + col = cont_line_col ? cont_line_col : col; + altcol = cont_line_col ? cont_line_col : altcol; if (col == tok->indstack[tok->indent]) { /* No change */ if (altcol != tok->altindstack[tok->indent]) { @@ -1964,19 +1994,9 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) /* Line continuation */ if (c == '\\') { - c = tok_nextc(tok); - if (c != '\n') { - tok->done = E_LINECONT; + if ((c = tok_continuation_line(tok)) == -1) { return ERRORTOKEN; } - c = tok_nextc(tok); - if (c == EOF) { - tok->done = E_EOF; - tok->cur = tok->inp; - return ERRORTOKEN; - } else { - tok_backup(tok, c); - } tok->cont_line = 1; goto again; /* Read next line */ } From webhook-mailer at python.org Tue Jan 25 17:34:07 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 25 Jan 2022 22:34:07 -0000 Subject: [Python-checkins] [3.10] bpo-46091: Correctly calculate indentation levels for whitespace lines with continuation characters (GH-30130). (GH-30898) Message-ID: https://github.com/python/cpython/commit/3fc8b74ace033a17346a992f661928ba619e61e8 commit: 3fc8b74ace033a17346a992f661928ba619e61e8 branch: 3.10 author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-25T22:33:57Z summary: [3.10] bpo-46091: Correctly calculate indentation levels for whitespace lines with continuation characters (GH-30130). 
(GH-30898) (cherry picked from commit a0efc0c1960e2c49e0092694d98395555270914c) Co-authored-by: Pablo Galindo Salgado files: A Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst M Lib/test/test_ast.py M Lib/test/test_syntax.py M Lib/test/test_tokenize.py M Parser/tokenizer.c diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index 39fc7e9673816..95af9e2aa690c 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -1045,8 +1045,7 @@ def test_literal_eval_malformed_lineno(self): ast.literal_eval(node) def test_literal_eval_syntax_errors(self): - msg = "unexpected character after line continuation character" - with self.assertRaisesRegex(SyntaxError, msg): + with self.assertRaisesRegex(SyntaxError, "unexpected indent"): ast.literal_eval(r''' \ (\ diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index 7aa93a012e113..ac5a41ce4cc67 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -1463,6 +1463,36 @@ def test_empty_line_after_linecont(self): except SyntaxError: self.fail("Empty line after a line continuation character is valid.") + # See issue-46091 + s1 = r"""\ +def fib(n): + \ +'''Print a Fibonacci series up to n.''' + \ +a, b = 0, 1 +""" + s2 = r"""\ +def fib(n): + '''Print a Fibonacci series up to n.''' + a, b = 0, 1 +""" + try: + self.assertEqual(compile(s1, '', 'exec'), compile(s2, '', 'exec')) + except SyntaxError: + self.fail("Indented statement over multiple lines is valid") + + def test_continuation_bad_indentation(self): + # Check that code that breaks indentation across multiple lines raises a syntax error + + code = r"""\ +if x: + y = 1 + \ + foo = 1 + """ + + self.assertRaises(IndentationError, exec, code) + @support.cpython_only def test_nested_named_except_blocks(self): code = "" diff --git a/Lib/test/test_tokenize.py b/Lib/test/test_tokenize.py index 4bce1ca9c76f7..127f0a17c95e3 100644 --- a/Lib/test/test_tokenize.py +++ b/Lib/test/test_tokenize.py @@ -6,6 +6,7 @@ NEWLINE) from io import BytesIO, StringIO import unittest +from textwrap import dedent from unittest import TestCase, mock from test.test_grammar import (VALID_UNDERSCORE_LITERALS, INVALID_UNDERSCORE_LITERALS) @@ -45,7 +46,6 @@ def check_tokenize(self, s, expected): # The ENDMARKER and final NEWLINE are omitted. f = BytesIO(s.encode('utf-8')) result = stringify_tokens_from_source(tokenize(f.readline), s) - self.assertEqual(result, [" ENCODING 'utf-8' (0, 0) (0, 0)"] + expected.rstrip().splitlines()) diff --git a/Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst b/Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst new file mode 100644 index 0000000000000..a2eee0f3ebd51 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2021-12-16-00-24-00.bpo-46091.rJ_e_e.rst @@ -0,0 +1,2 @@ +Correctly calculate indentation levels for lines with whitespace character +that are ended by line continuation characters. Patch by Pablo Galindo diff --git a/Parser/tokenizer.c b/Parser/tokenizer.c index 8e9c69d0785af..de5f57649c256 100644 --- a/Parser/tokenizer.c +++ b/Parser/tokenizer.c @@ -1346,6 +1346,24 @@ tok_decimal_tail(struct tok_state *tok) /* Get next token, after space stripping etc. 
*/ +static inline int +tok_continuation_line(struct tok_state *tok) { + int c = tok_nextc(tok); + if (c != '\n') { + tok->done = E_LINECONT; + return -1; + } + c = tok_nextc(tok); + if (c == EOF) { + tok->done = E_EOF; + tok->cur = tok->inp; + return -1; + } else { + tok_backup(tok, c); + } + return c; +} + static int tok_get(struct tok_state *tok, const char **p_start, const char **p_end) { @@ -1362,6 +1380,7 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) int col = 0; int altcol = 0; tok->atbol = 0; + int cont_line_col = 0; for (;;) { c = tok_nextc(tok); if (c == ' ') { @@ -1374,14 +1393,23 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) else if (c == '\014') {/* Control-L (formfeed) */ col = altcol = 0; /* For Emacs users */ } + else if (c == '\\') { + // Indentation cannot be split over multiple physical lines + // using backslashes. This means that if we found a backslash + // preceded by whitespace, **the first one we find** determines + // the level of indentation of whatever comes next. + cont_line_col = cont_line_col ? cont_line_col : col; + if ((c = tok_continuation_line(tok)) == -1) { + return ERRORTOKEN; + } + } else { break; } } tok_backup(tok, c); - if (c == '#' || c == '\n' || c == '\\') { + if (c == '#' || c == '\n') { /* Lines with only whitespace and/or comments - and/or a line continuation character shouldn't affect the indentation and are not passed to the parser as NEWLINE tokens, except *totally* empty lines in interactive @@ -1402,6 +1430,8 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) may need to skip to the end of a comment */ } if (!blankline && tok->level == 0) { + col = cont_line_col ? cont_line_col : col; + altcol = cont_line_col ? cont_line_col : altcol; if (col == tok->indstack[tok->indent]) { /* No change */ if (altcol != tok->altindstack[tok->indent]) { @@ -1963,19 +1993,9 @@ tok_get(struct tok_state *tok, const char **p_start, const char **p_end) /* Line continuation */ if (c == '\\') { - c = tok_nextc(tok); - if (c != '\n') { - tok->done = E_LINECONT; + if ((c = tok_continuation_line(tok)) == -1) { return ERRORTOKEN; } - c = tok_nextc(tok); - if (c == EOF) { - tok->done = E_EOF; - tok->cur = tok->inp; - return ERRORTOKEN; - } else { - tok_backup(tok, c); - } tok->cont_line = 1; goto again; /* Read next line */ } From webhook-mailer at python.org Tue Jan 25 18:14:08 2022 From: webhook-mailer at python.org (pablogsal) Date: Tue, 25 Jan 2022 23:14:08 -0000 Subject: [Python-checkins] Add skips to crashing tests under sanitizers instead of manually skipping them (GH-30897) Message-ID: https://github.com/python/cpython/commit/a27505345e34d462139f5f8b6b5e7c9a59955150 commit: a27505345e34d462139f5f8b6b5e7c9a59955150 branch: main author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-25T23:14:03Z summary: Add skips to crashing tests under sanitizers instead of manually skipping them (GH-30897) files: M .github/workflows/build.yml M Lib/test/support/__init__.py M Lib/test/test_crypt.py M Lib/test/test_idle.py M Lib/test/test_tix.py M Lib/test/test_tk.py M Lib/test/test_ttk_guionly.py diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index d6af174d1c3a7..5d36dffa80108 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -311,4 +311,7 @@ jobs: # # Skip multiprocessing and concurrent.futures tests which are affected by # bpo-45200 bug: libasan dead lock in pthread_create(). 
- run: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu -x test_ctypes test_crypt test_decimal test_faulthandler test_interpreters test___all__ test_idle test_tix test_tk test_ttk_guionly test_ttk_textonly test_multiprocessing_fork test_multiprocessing_forkserver test_multiprocessing_spawn test_tools test_peg_generator test_concurrent_futures" + # + # test___all__ is skipped because importing some modules directly can trigger + # known problems with ASAN (like tk or crypt). + run: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu -x test___all__ test_multiprocessing_fork test_multiprocessing_forkserver test_multiprocessing_spawn test_tools test_peg_generator test_concurrent_futures" diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 583d94f41e563..d71cfe5ee44fd 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -43,7 +43,7 @@ "has_subprocess_support", "requires_subprocess", "anticipate_failure", "load_package_tests", "detect_api_mismatch", "check__all__", "skip_if_buggy_ucrt_strfptime", - "check_disallow_instantiation", "skip_if_sanitizer", + "check_disallow_instantiation", "check_sanitizer", "skip_if_sanitizer", # sys "is_jython", "is_android", "is_emscripten", "is_wasi", "check_impl_detail", "unix_shell", "setswitchinterval", @@ -384,13 +384,11 @@ def skip_if_buildbot(reason=None): isbuildbot = os.environ.get('USER') == 'buildbot' return unittest.skipIf(isbuildbot, reason) -def skip_if_sanitizer(reason=None, *, address=False, memory=False, ub=False): - """Decorator raising SkipTest if running with a sanitizer active.""" +def check_sanitizer(*, address=False, memory=False, ub=False): + """Returns True if Python is compiled with sanitizer support""" if not (address or memory or ub): raise ValueError('At least one of address, memory, or ub must be True') - if not reason: - reason = 'not working with sanitizers active' _cflags = sysconfig.get_config_var('CFLAGS') or '' _config_args = sysconfig.get_config_var('CONFIG_ARGS') or '' @@ -406,11 +404,18 @@ def skip_if_sanitizer(reason=None, *, address=False, memory=False, ub=False): '-fsanitize=undefined' in _cflags or '--with-undefined-behavior-sanitizer' in _config_args ) - skip = ( + return ( (memory and memory_sanitizer) or (address and address_sanitizer) or (ub and ub_sanitizer) ) + + +def skip_if_sanitizer(reason=None, *, address=False, memory=False, ub=False): + """Decorator raising SkipTest if running with a sanitizer active.""" + if not reason: + reason = 'not working with sanitizers active' + skip = check_sanitizer(address=address, memory=memory, ub=ub) return unittest.skipIf(skip, reason) diff --git a/Lib/test/test_crypt.py b/Lib/test/test_crypt.py index 5dc83b4ecbfa0..877c575c5534a 100644 --- a/Lib/test/test_crypt.py +++ b/Lib/test/test_crypt.py @@ -1,8 +1,11 @@ import sys import unittest +from test.support import check_sanitizer try: + if check_sanitizer(address=True, memory=True): + raise unittest.SkipTest("The crypt module SEGFAULTs on ASAN/MSAN builds") import crypt IMPORT_ERROR = None except ImportError as ex: diff --git a/Lib/test/test_idle.py b/Lib/test/test_idle.py index 8756b766334e8..b94b18a541a70 100644 --- a/Lib/test/test_idle.py +++ b/Lib/test/test_idle.py @@ -1,5 +1,9 @@ import unittest from test.support.import_helper import import_module +from test.support import check_sanitizer + +if check_sanitizer(address=True, memory=True): + raise unittest.SkipTest("Tests involvin libX11 can SEGFAULT on ASAN/MSAN builds") # Skip test_idle if _tkinter wasn't built, if 
tkinter is missing, # if tcl/tk is not the 8.5+ needed for ttk widgets, diff --git a/Lib/test/test_tix.py b/Lib/test/test_tix.py index 8a60c7c8e1fbd..454baeb38a934 100644 --- a/Lib/test/test_tix.py +++ b/Lib/test/test_tix.py @@ -2,6 +2,11 @@ import unittest from test import support from test.support import import_helper +from test.support import check_sanitizer + +if check_sanitizer(address=True, memory=True): + raise unittest.SkipTest("Tests involvin libX11 can SEGFAULT on ASAN/MSAN builds") + # Skip this test if the _tkinter module wasn't built. _tkinter = import_helper.import_module('_tkinter') diff --git a/Lib/test/test_tk.py b/Lib/test/test_tk.py index 69cc2322cd9aa..8f90cbaba9f7c 100644 --- a/Lib/test/test_tk.py +++ b/Lib/test/test_tk.py @@ -1,5 +1,11 @@ +import unittest from test import support from test.support import import_helper +from test.support import check_sanitizer + +if check_sanitizer(address=True, memory=True): + raise unittest.SkipTest("Tests involvin libX11 can SEGFAULT on ASAN/MSAN builds") + # Skip test if _tkinter wasn't built. import_helper.import_module('_tkinter') diff --git a/Lib/test/test_ttk_guionly.py b/Lib/test/test_ttk_guionly.py index 8f59839d066e6..c4919045d75cb 100644 --- a/Lib/test/test_ttk_guionly.py +++ b/Lib/test/test_ttk_guionly.py @@ -1,6 +1,10 @@ import unittest from test import support from test.support import import_helper +from test.support import check_sanitizer + +if check_sanitizer(address=True, memory=True): + raise unittest.SkipTest("Tests involvin libX11 can SEGFAULT on ASAN/MSAN builds") # Skip this test if _tkinter wasn't built. import_helper.import_module('_tkinter') From webhook-mailer at python.org Tue Jan 25 18:53:08 2022 From: webhook-mailer at python.org (iritkatriel) Date: Tue, 25 Jan 2022 23:53:08 -0000 Subject: [Python-checkins] bpo-46431: use raw string for regex in test (GH-30901) Message-ID: https://github.com/python/cpython/commit/072f4a473e861c6c987650f08990c0ed1f76715f commit: 072f4a473e861c6c987650f08990c0ed1f76715f branch: main author: Irit Katriel <1055913+iritkatriel at users.noreply.github.com> committer: iritkatriel <1055913+iritkatriel at users.noreply.github.com> date: 2022-01-25T23:52:43Z summary: bpo-46431: use raw string for regex in test (GH-30901) files: M Lib/test/test_exception_group.py diff --git a/Lib/test/test_exception_group.py b/Lib/test/test_exception_group.py index bbfce944c1765..b7b53bb2f0ece 100644 --- a/Lib/test/test_exception_group.py +++ b/Lib/test/test_exception_group.py @@ -22,7 +22,7 @@ def test_exception_group_is_generic_type(self): class BadConstructorArgs(unittest.TestCase): def test_bad_EG_construction__too_many_args(self): - MSG = 'BaseExceptionGroup.__new__\(\) takes exactly 2 arguments' + MSG = r'BaseExceptionGroup.__new__\(\) takes exactly 2 arguments' with self.assertRaisesRegex(TypeError, MSG): ExceptionGroup('no errors') with self.assertRaisesRegex(TypeError, MSG): From webhook-mailer at python.org Tue Jan 25 23:49:11 2022 From: webhook-mailer at python.org (terryjreedy) Date: Wed, 26 Jan 2022 04:49:11 -0000 Subject: [Python-checkins] bpo-48146: Update IDLE part of What's New 3.10 to 2022 (GH-30906) Message-ID: https://github.com/python/cpython/commit/4a49fa6ca66664383d406dbf6f6c28289ffeeeaa commit: 4a49fa6ca66664383d406dbf6f6c28289ffeeeaa branch: main author: Terry Jan Reedy committer: terryjreedy date: 2022-01-25T23:48:53-05:00 summary: bpo-48146: Update IDLE part of What's New 3.10 to 2022 (GH-30906) files: M Doc/whatsnew/3.10.rst diff --git a/Doc/whatsnew/3.10.rst 
b/Doc/whatsnew/3.10.rst index 85bb1f8838e9b..73c22aafc764d 100644 --- a/Doc/whatsnew/3.10.rst +++ b/Doc/whatsnew/3.10.rst @@ -1148,10 +1148,18 @@ IDLE and idlelib ---------------- Make IDLE invoke :func:`sys.excepthook` (when started without '-n'). -User hooks were previously ignored. (Patch by Ken Hilton in +User hooks were previously ignored. (Contributed by Ken Hilton in :issue:`43008`.) -This change was backported to a 3.9 maintenance release. +Rearrange the settings dialog. Split the General tab into Windows +and Shell/Ed tabs. Move help sources, which extend the Help menu, to the +Extensions tab. Make space for new options and shorten the dialog. The +latter makes the dialog better fit small screens. (Contributed by Terry Jan +Reedy in :issue:`40468`.) Move the indent space setting from the Font tab to +the new Windows tab. (Contributed by Mark Roseman and Terry Jan Reedy in +:issue:`33962`.) + +These changes were backported to a 3.9 maintenance release. Add a Shell sidebar. Move the primary prompt ('>>>') to the sidebar. Add secondary prompts ('...') to the sidebar. Left click and optional @@ -1164,12 +1172,9 @@ in :issue:`37903`.) Use spaces instead of tabs to indent interactive code. This makes interactive code entries 'look right'. Making this feasible was a -major motivation for adding the shell sidebar. Contributed by +major motivation for adding the shell sidebar. (Contributed by Terry Jan Reedy in :issue:`37892`.) -We expect to backport these shell changes to a future 3.9 maintenance -release. - Highlight the new :ref:`soft keywords ` :keyword:`match`, :keyword:`case `, and :keyword:`_ ` in pattern-matching statements. However, this highlighting is not perfect From webhook-mailer at python.org Tue Jan 25 23:50:03 2022 From: webhook-mailer at python.org (terryjreedy) Date: Wed, 26 Jan 2022 04:50:03 -0000 Subject: [Python-checkins] bpo-41844: Update IDLE part of What's New 3.9 to 20228 (GH-30905) Message-ID: https://github.com/python/cpython/commit/7cf285d82ec722d4225297366013e924805171f2 commit: 7cf285d82ec722d4225297366013e924805171f2 branch: main author: Terry Jan Reedy committer: terryjreedy date: 2022-01-25T23:49:54-05:00 summary: bpo-41844: Update IDLE part of What's New 3.9 to 20228 (GH-30905) files: M Doc/whatsnew/3.9.rst diff --git a/Doc/whatsnew/3.9.rst b/Doc/whatsnew/3.9.rst index 0d514084d6cc1..4fa3835a39900 100644 --- a/Doc/whatsnew/3.9.rst +++ b/Doc/whatsnew/3.9.rst @@ -484,8 +484,22 @@ Najera in :issue:`38944`.) Added keywords to module name completion list. (Contributed by Terry J. Reedy in :issue:`37765`.) +New in 3.9 maintenance releases + +Make IDLE invoke :func:`sys.excepthook` (when started without '-n'). +User hooks were previously ignored. (Contributed by Ken Hilton in +:issue:`43008`.) + The changes above have been backported to 3.8 maintenance releases. +Rearrange the settings dialog. Split the General tab into Windows +and Shell/Ed tabs. Move help sources, which extend the Help menu, to the +Extensions tab. Make space for new options and shorten the dialog. The +latter makes the dialog better fit small screens. (Contributed by Terry Jan +Reedy in :issue:`40468`.) Move the indent space setting from the Font tab to +the new Windows tab. (Contributed by Mark Roseman and Terry Jan Reedy in +:issue:`33962`.) 
+ imaplib ------- From webhook-mailer at python.org Wed Jan 26 00:11:03 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 05:11:03 -0000 Subject: [Python-checkins] bpo-41844: Update IDLE part of What's New 3.9 to 20228 (GH-30905) Message-ID: https://github.com/python/cpython/commit/8356f6aac2fc41cab44159574f5d8fd5fdf95a63 commit: 8356f6aac2fc41cab44159574f5d8fd5fdf95a63 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-25T21:10:53-08:00 summary: bpo-41844: Update IDLE part of What's New 3.9 to 20228 (GH-30905) (cherry picked from commit 7cf285d82ec722d4225297366013e924805171f2) Co-authored-by: Terry Jan Reedy files: M Doc/whatsnew/3.9.rst diff --git a/Doc/whatsnew/3.9.rst b/Doc/whatsnew/3.9.rst index 296c64d737d3e..a6f83a91f5e14 100644 --- a/Doc/whatsnew/3.9.rst +++ b/Doc/whatsnew/3.9.rst @@ -484,8 +484,22 @@ Najera in :issue:`38944`.) Added keywords to module name completion list. (Contributed by Terry J. Reedy in :issue:`37765`.) +New in 3.9 maintenance releases + +Make IDLE invoke :func:`sys.excepthook` (when started without '-n'). +User hooks were previously ignored. (Contributed by Ken Hilton in +:issue:`43008`.) + The changes above have been backported to 3.8 maintenance releases. +Rearrange the settings dialog. Split the General tab into Windows +and Shell/Ed tabs. Move help sources, which extend the Help menu, to the +Extensions tab. Make space for new options and shorten the dialog. The +latter makes the dialog better fit small screens. (Contributed by Terry Jan +Reedy in :issue:`40468`.) Move the indent space setting from the Font tab to +the new Windows tab. (Contributed by Mark Roseman and Terry Jan Reedy in +:issue:`33962`.) + imaplib ------- From webhook-mailer at python.org Wed Jan 26 00:15:10 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 05:15:10 -0000 Subject: [Python-checkins] bpo-41844: Update IDLE part of What's New 3.9 to 20228 (GH-30905) Message-ID: https://github.com/python/cpython/commit/f8a805bde1ff4679c2824ced4a28437da61b1506 commit: f8a805bde1ff4679c2824ced4a28437da61b1506 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-25T21:15:03-08:00 summary: bpo-41844: Update IDLE part of What's New 3.9 to 20228 (GH-30905) (cherry picked from commit 7cf285d82ec722d4225297366013e924805171f2) Co-authored-by: Terry Jan Reedy files: M Doc/whatsnew/3.9.rst diff --git a/Doc/whatsnew/3.9.rst b/Doc/whatsnew/3.9.rst index 0662adba7d4af..044cfc07bce9b 100644 --- a/Doc/whatsnew/3.9.rst +++ b/Doc/whatsnew/3.9.rst @@ -484,8 +484,22 @@ Najera in :issue:`38944`.) Added keywords to module name completion list. (Contributed by Terry J. Reedy in :issue:`37765`.) +New in 3.9 maintenance releases + +Make IDLE invoke :func:`sys.excepthook` (when started without '-n'). +User hooks were previously ignored. (Contributed by Ken Hilton in +:issue:`43008`.) + The changes above have been backported to 3.8 maintenance releases. +Rearrange the settings dialog. Split the General tab into Windows +and Shell/Ed tabs. Move help sources, which extend the Help menu, to the +Extensions tab. Make space for new options and shorten the dialog. The +latter makes the dialog better fit small screens. (Contributed by Terry Jan +Reedy in :issue:`40468`.) 
Move the indent space setting from the Font tab to +the new Windows tab. (Contributed by Mark Roseman and Terry Jan Reedy in +:issue:`33962`.) + imaplib ------- From webhook-mailer at python.org Wed Jan 26 02:05:14 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 07:05:14 -0000 Subject: [Python-checkins] bpo-48146: Update IDLE part of What's New 3.10 to 2022 (GH-30906) Message-ID: https://github.com/python/cpython/commit/c1254c44e2e0e807fa1b8a0b589732996d2a9c2e commit: c1254c44e2e0e807fa1b8a0b589732996d2a9c2e branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-25T23:04:52-08:00 summary: bpo-48146: Update IDLE part of What's New 3.10 to 2022 (GH-30906) (cherry picked from commit 4a49fa6ca66664383d406dbf6f6c28289ffeeeaa) Co-authored-by: Terry Jan Reedy files: M Doc/whatsnew/3.10.rst diff --git a/Doc/whatsnew/3.10.rst b/Doc/whatsnew/3.10.rst index c07523c49163b..6edc09ff6cef6 100644 --- a/Doc/whatsnew/3.10.rst +++ b/Doc/whatsnew/3.10.rst @@ -1137,10 +1137,18 @@ IDLE and idlelib ---------------- Make IDLE invoke :func:`sys.excepthook` (when started without '-n'). -User hooks were previously ignored. (Patch by Ken Hilton in +User hooks were previously ignored. (Contributed by Ken Hilton in :issue:`43008`.) -This change was backported to a 3.9 maintenance release. +Rearrange the settings dialog. Split the General tab into Windows +and Shell/Ed tabs. Move help sources, which extend the Help menu, to the +Extensions tab. Make space for new options and shorten the dialog. The +latter makes the dialog better fit small screens. (Contributed by Terry Jan +Reedy in :issue:`40468`.) Move the indent space setting from the Font tab to +the new Windows tab. (Contributed by Mark Roseman and Terry Jan Reedy in +:issue:`33962`.) + +These changes were backported to a 3.9 maintenance release. Add a Shell sidebar. Move the primary prompt ('>>>') to the sidebar. Add secondary prompts ('...') to the sidebar. Left click and optional @@ -1153,12 +1161,9 @@ in :issue:`37903`.) Use spaces instead of tabs to indent interactive code. This makes interactive code entries 'look right'. Making this feasible was a -major motivation for adding the shell sidebar. Contributed by +major motivation for adding the shell sidebar. (Contributed by Terry Jan Reedy in :issue:`37892`.) -We expect to backport these shell changes to a future 3.9 maintenance -release. - Highlight the new :ref:`soft keywords ` :keyword:`match`, :keyword:`case `, and :keyword:`_ ` in pattern-matching statements. 
However, this highlighting is not perfect From webhook-mailer at python.org Wed Jan 26 04:03:59 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 09:03:59 -0000 Subject: [Python-checkins] bpo-46513: Remove AC_C_CHAR_UNSIGNED / __CHAR_UNSIGNED__ (GH-30851) Message-ID: https://github.com/python/cpython/commit/6e5a193816e1bdf11f5fb78d620995fd6987ccf8 commit: 6e5a193816e1bdf11f5fb78d620995fd6987ccf8 branch: main author: Christian Heimes committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-26T01:03:49-08:00 summary: bpo-46513: Remove AC_C_CHAR_UNSIGNED / __CHAR_UNSIGNED__ (GH-30851) files: A Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst M Modules/audioop.c M configure M configure.ac M pyconfig.h.in diff --git a/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst b/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst new file mode 100644 index 0000000000000..b8986ae31a340 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst @@ -0,0 +1,2 @@ +:program:`configure` no longer uses ``AC_C_CHAR_UNSIGNED`` macro and +``pyconfig.h`` no longer defines reserved symbol ``__CHAR_UNSIGNED__``. diff --git a/Modules/audioop.c b/Modules/audioop.c index 3aeb6f04f13cb..2a5d805c053c7 100644 --- a/Modules/audioop.c +++ b/Modules/audioop.c @@ -5,13 +5,6 @@ #include "Python.h" -#if defined(__CHAR_UNSIGNED__) -#if defined(signed) -/* This module currently does not work on systems where only unsigned - characters are available. Take it out of Setup. Sorry. */ -#endif -#endif - static const int maxvals[] = {0, 0x7F, 0x7FFF, 0x7FFFFF, 0x7FFFFFFF}; /* -1 trick is needed on Windows to support -0x80000000 without a warning */ static const int minvals[] = {0, -0x80, -0x8000, -0x800000, -0x7FFFFFFF-1}; diff --git a/configure b/configure index 78e5a09927221..22c65e492fbab 100755 --- a/configure +++ b/configure @@ -17063,39 +17063,6 @@ fi # checks for compiler characteristics -{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether char is unsigned" >&5 -$as_echo_n "checking whether char is unsigned... " >&6; } -if ${ac_cv_c_char_unsigned+:} false; then : - $as_echo_n "(cached) " >&6 -else - cat confdefs.h - <<_ACEOF >conftest.$ac_ext -/* end confdefs.h. */ -$ac_includes_default -int -main () -{ -static int test_array [1 - 2 * !(((char) -1) < 0)]; -test_array [0] = 0; -return test_array [0]; - - ; - return 0; -} -_ACEOF -if ac_fn_c_try_compile "$LINENO"; then : - ac_cv_c_char_unsigned=no -else - ac_cv_c_char_unsigned=yes -fi -rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext -fi -{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_char_unsigned" >&5 -$as_echo "$ac_cv_c_char_unsigned" >&6; } -if test $ac_cv_c_char_unsigned = yes && test "$GCC" != yes; then - $as_echo "#define __CHAR_UNSIGNED__ 1" >>confdefs.h - -fi - { $as_echo "$as_me:${as_lineno-$LINENO}: checking for an ANSI C-conforming const" >&5 $as_echo_n "checking for an ANSI C-conforming const... 
" >&6; } if ${ac_cv_c_const+:} false; then : diff --git a/configure.ac b/configure.ac index a0fbe41c9ec59..9bc6e3beecbcc 100644 --- a/configure.ac +++ b/configure.ac @@ -4699,7 +4699,6 @@ fi # checks for compiler characteristics -AC_C_CHAR_UNSIGNED AC_C_CONST AC_CACHE_CHECK([for working signed char], [ac_cv_working_signed_char_c], [ diff --git a/pyconfig.h.in b/pyconfig.h.in index a779ffadf200f..02569c49543b6 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -1728,11 +1728,6 @@ /* Define on FreeBSD to activate all library features */ #undef __BSD_VISIBLE -/* Define to 1 if type `char' is unsigned and you are not using gcc. */ -#ifndef __CHAR_UNSIGNED__ -# undef __CHAR_UNSIGNED__ -#endif - /* Define to 'long' if doesn't define. */ #undef clock_t From webhook-mailer at python.org Wed Jan 26 05:05:56 2022 From: webhook-mailer at python.org (corona10) Date: Wed, 26 Jan 2022 10:05:56 -0000 Subject: [Python-checkins] bpo-45578: add a test case for `dis.findlabels` (GH-30058) Message-ID: https://github.com/python/cpython/commit/84f093918a4be663775afe2933f4be86f72fe495 commit: 84f093918a4be663775afe2933f4be86f72fe495 branch: main author: Nikita Sobolev committer: corona10 date: 2022-01-26T19:05:35+09:00 summary: bpo-45578: add a test case for `dis.findlabels` (GH-30058) files: M Lib/test/test_dis.py diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py index ee9729ebabf4a..c65b0143e87d0 100644 --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -1462,6 +1462,16 @@ def test__find_store_names(self): res = tuple(dis._find_store_names(code)) self.assertEqual(res, expected) + def test_findlabels(self): + labels = dis.findlabels(jumpy.__code__.co_code) + jumps = [ + instr.offset + for instr in expected_opinfo_jumpy + if instr.is_jump_target + ] + + self.assertEqual(sorted(labels), sorted(jumps)) + class TestDisTraceback(unittest.TestCase): def setUp(self) -> None: From webhook-mailer at python.org Wed Jan 26 05:06:16 2022 From: webhook-mailer at python.org (corona10) Date: Wed, 26 Jan 2022 10:06:16 -0000 Subject: [Python-checkins] bpo-43698: do not use `...` as argument name in docs (GH-30502) Message-ID: https://github.com/python/cpython/commit/b9d8980d89bfaa4bf16d60f0488adcc9d2cbf5ef commit: b9d8980d89bfaa4bf16d60f0488adcc9d2cbf5ef branch: main author: Nikita Sobolev committer: corona10 date: 2022-01-26T19:06:10+09:00 summary: bpo-43698: do not use `...` as argument name in docs (GH-30502) files: M Doc/faq/design.rst M Doc/glossary.rst M Doc/library/abc.rst M Doc/library/functions.rst diff --git a/Doc/faq/design.rst b/Doc/faq/design.rst index 0437b59d55da6..ff83a1b8134b7 100644 --- a/Doc/faq/design.rst +++ b/Doc/faq/design.rst @@ -266,12 +266,9 @@ For cases where you need to choose from a very large number of possibilities, you can create a dictionary mapping case values to functions to call. For example:: - def function_1(...): - ... - functions = {'a': function_1, 'b': function_2, - 'c': self.method_1, ...} + 'c': self.method_1} func = functions[value] func() @@ -279,14 +276,14 @@ example:: For calling methods on objects, you can simplify yet further by using the :func:`getattr` built-in to retrieve methods with a particular name:: - def visit_a(self, ...): - ... - ... + class MyVisitor: + def visit_a(self): + ... 
- def dispatch(self, value): - method_name = 'visit_' + str(value) - method = getattr(self, method_name) - method() + def dispatch(self, value): + method_name = 'visit_' + str(value) + method = getattr(self, method_name) + method() It's suggested that you use a prefix for the method names, such as ``visit_`` in this example. Without such a prefix, if values are coming from an untrusted diff --git a/Doc/glossary.rst b/Doc/glossary.rst index e71f6c0406a23..f0f33d577374b 100644 --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -282,12 +282,12 @@ Glossary The decorator syntax is merely syntactic sugar, the following two function definitions are semantically equivalent:: - def f(...): + def f(arg): ... f = staticmethod(f) @staticmethod - def f(...): + def f(arg): ... The same concept exists for classes, but is less commonly used there. See diff --git a/Doc/library/abc.rst b/Doc/library/abc.rst index 1a6ed474ff21d..3b74622e7ff46 100644 --- a/Doc/library/abc.rst +++ b/Doc/library/abc.rst @@ -186,15 +186,15 @@ The :mod:`abc` module also provides the following decorator: class C(ABC): @abstractmethod - def my_abstract_method(self, ...): + def my_abstract_method(self, arg1): ... @classmethod @abstractmethod - def my_abstract_classmethod(cls, ...): + def my_abstract_classmethod(cls, arg2): ... @staticmethod @abstractmethod - def my_abstract_staticmethod(...): + def my_abstract_staticmethod(arg3): ... @property @@ -255,7 +255,7 @@ The :mod:`abc` module also supports the following legacy decorators: class C(ABC): @classmethod @abstractmethod - def my_abstract_classmethod(cls, ...): + def my_abstract_classmethod(cls, arg): ... @@ -276,7 +276,7 @@ The :mod:`abc` module also supports the following legacy decorators: class C(ABC): @staticmethod @abstractmethod - def my_abstract_staticmethod(...): + def my_abstract_staticmethod(arg): ... diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 059a058d5888c..9c061bcd8252a 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -248,7 +248,7 @@ are always available. They are listed here in alphabetical order. class C: @classmethod - def f(cls, arg1, arg2, ...): ... + def f(cls, arg1, arg2): ... The ``@classmethod`` form is a function :term:`decorator` -- see :ref:`function` for details. 
From webhook-mailer at python.org Wed Jan 26 05:13:17 2022 From: webhook-mailer at python.org (Fidget-Spinner) Date: Wed, 26 Jan 2022 10:13:17 -0000 Subject: [Python-checkins] bpo-46529: increase coverage of `typing.Union.__repr__` method (GH-30911) Message-ID: https://github.com/python/cpython/commit/d0c690b5f85c679de6059cf353fe0524e905530e commit: d0c690b5f85c679de6059cf353fe0524e905530e branch: main author: Nikita Sobolev committer: Fidget-Spinner <28750310+Fidget-Spinner at users.noreply.github.com> date: 2022-01-26T18:13:02+08:00 summary: bpo-46529: increase coverage of `typing.Union.__repr__` method (GH-30911) files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 5777656552d79..b5767d02691d8 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -325,6 +325,15 @@ def test_repr(self): u = Union[int | float] self.assertEqual(repr(u), 'typing.Union[int, float]') + u = Union[None, str] + self.assertEqual(repr(u), 'typing.Optional[str]') + u = Union[str, None] + self.assertEqual(repr(u), 'typing.Optional[str]') + u = Union[None, str, int] + self.assertEqual(repr(u), 'typing.Union[NoneType, str, int]') + u = Optional[str] + self.assertEqual(repr(u), 'typing.Optional[str]') + def test_cannot_subclass(self): with self.assertRaises(TypeError): class C(Union): From webhook-mailer at python.org Wed Jan 26 05:39:52 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 10:39:52 -0000 Subject: [Python-checkins] bpo-46529: increase coverage of `typing.Union.__repr__` method (GH-30911) Message-ID: https://github.com/python/cpython/commit/c730342005edf67333c37b575b419e2fc67d232b commit: c730342005edf67333c37b575b419e2fc67d232b branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-26T02:39:40-08:00 summary: bpo-46529: increase coverage of `typing.Union.__repr__` method (GH-30911) (cherry picked from commit d0c690b5f85c679de6059cf353fe0524e905530e) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 9b552c422d56d..d4068242da6da 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -318,6 +318,15 @@ def test_repr(self): u = Union[int | float] self.assertEqual(repr(u), 'typing.Union[int, float]') + u = Union[None, str] + self.assertEqual(repr(u), 'typing.Optional[str]') + u = Union[str, None] + self.assertEqual(repr(u), 'typing.Optional[str]') + u = Union[None, str, int] + self.assertEqual(repr(u), 'typing.Union[NoneType, str, int]') + u = Optional[str] + self.assertEqual(repr(u), 'typing.Optional[str]') + def test_cannot_subclass(self): with self.assertRaises(TypeError): class C(Union): From webhook-mailer at python.org Wed Jan 26 05:40:48 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 10:40:48 -0000 Subject: [Python-checkins] bpo-46529: increase coverage of `typing.Union.__repr__` method (GH-30911) Message-ID: https://github.com/python/cpython/commit/29eefcc9c688bc4097f2660de1fa846c5ea54735 commit: 29eefcc9c688bc4097f2660de1fa846c5ea54735 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-26T02:40:40-08:00 summary: bpo-46529: increase coverage of `typing.Union.__repr__` method (GH-30911) (cherry picked from commit 
d0c690b5f85c679de6059cf353fe0524e905530e) Co-authored-by: Nikita Sobolev files: M Lib/test/test_typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index f87832a631d49..f70311c95b24e 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -304,6 +304,15 @@ def test_repr(self): u = Union[list[int], dict[str, float]] self.assertEqual(repr(u), 'typing.Union[list[int], dict[str, float]]') + u = Union[None, str] + self.assertEqual(repr(u), 'typing.Optional[str]') + u = Union[str, None] + self.assertEqual(repr(u), 'typing.Optional[str]') + u = Union[None, str, int] + self.assertEqual(repr(u), 'typing.Union[NoneType, str, int]') + u = Optional[str] + self.assertEqual(repr(u), 'typing.Optional[str]') + def test_cannot_subclass(self): with self.assertRaises(TypeError): class C(Union): From webhook-mailer at python.org Wed Jan 26 06:21:09 2022 From: webhook-mailer at python.org (tiran) Date: Wed, 26 Jan 2022 11:21:09 -0000 Subject: [Python-checkins] [3.10] bpo-46513: Remove AC_C_CHAR_UNSIGNED / __CHAR_UNSIGNED__ (GH-30851) (GH-30914) Message-ID: https://github.com/python/cpython/commit/4371fbd4328781496f5f2c6938c4d9a84049b187 commit: 4371fbd4328781496f5f2c6938c4d9a84049b187 branch: 3.10 author: Christian Heimes committer: tiran date: 2022-01-26T12:20:31+01:00 summary: [3.10] bpo-46513: Remove AC_C_CHAR_UNSIGNED / __CHAR_UNSIGNED__ (GH-30851) (GH-30914) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst M Modules/audioop.c M configure M configure.ac M pyconfig.h.in diff --git a/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst b/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst new file mode 100644 index 0000000000000..b8986ae31a340 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst @@ -0,0 +1,2 @@ +:program:`configure` no longer uses ``AC_C_CHAR_UNSIGNED`` macro and +``pyconfig.h`` no longer defines reserved symbol ``__CHAR_UNSIGNED__``. diff --git a/Modules/audioop.c b/Modules/audioop.c index 3aeb6f04f13cb..2a5d805c053c7 100644 --- a/Modules/audioop.c +++ b/Modules/audioop.c @@ -5,13 +5,6 @@ #include "Python.h" -#if defined(__CHAR_UNSIGNED__) -#if defined(signed) -/* This module currently does not work on systems where only unsigned - characters are available. Take it out of Setup. Sorry. */ -#endif -#endif - static const int maxvals[] = {0, 0x7F, 0x7FFF, 0x7FFFFF, 0x7FFFFFFF}; /* -1 trick is needed on Windows to support -0x80000000 without a warning */ static const int minvals[] = {0, -0x80, -0x8000, -0x800000, -0x7FFFFFFF-1}; diff --git a/configure b/configure index a7d2975f1f5e8..e68e00b0b338b 100755 --- a/configure +++ b/configure @@ -14183,39 +14183,6 @@ fi # checks for compiler characteristics -{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether char is unsigned" >&5 -$as_echo_n "checking whether char is unsigned... " >&6; } -if ${ac_cv_c_char_unsigned+:} false; then : - $as_echo_n "(cached) " >&6 -else - cat confdefs.h - <<_ACEOF >conftest.$ac_ext -/* end confdefs.h. 
*/ -$ac_includes_default -int -main () -{ -static int test_array [1 - 2 * !(((char) -1) < 0)]; -test_array [0] = 0; -return test_array [0]; - - ; - return 0; -} -_ACEOF -if ac_fn_c_try_compile "$LINENO"; then : - ac_cv_c_char_unsigned=no -else - ac_cv_c_char_unsigned=yes -fi -rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext -fi -{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_char_unsigned" >&5 -$as_echo "$ac_cv_c_char_unsigned" >&6; } -if test $ac_cv_c_char_unsigned = yes && test "$GCC" != yes; then - $as_echo "#define __CHAR_UNSIGNED__ 1" >>confdefs.h - -fi - { $as_echo "$as_me:${as_lineno-$LINENO}: checking for an ANSI C-conforming const" >&5 $as_echo_n "checking for an ANSI C-conforming const... " >&6; } if ${ac_cv_c_const+:} false; then : diff --git a/configure.ac b/configure.ac index 5aa91cbad3555..0efeb8f585d09 100644 --- a/configure.ac +++ b/configure.ac @@ -4322,7 +4322,6 @@ fi # checks for compiler characteristics -AC_C_CHAR_UNSIGNED AC_C_CONST works=no diff --git a/pyconfig.h.in b/pyconfig.h.in index b97b8f8bf8b4a..8a4aeda646923 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -1656,11 +1656,6 @@ /* Define on FreeBSD to activate all library features */ #undef __BSD_VISIBLE -/* Define to 1 if type `char' is unsigned and you are not using gcc. */ -#ifndef __CHAR_UNSIGNED__ -# undef __CHAR_UNSIGNED__ -#endif - /* Define to 'long' if doesn't define. */ #undef clock_t From webhook-mailer at python.org Wed Jan 26 06:21:09 2022 From: webhook-mailer at python.org (tiran) Date: Wed, 26 Jan 2022 11:21:09 -0000 Subject: [Python-checkins] [3.9] bpo-46513: Remove AC_C_CHAR_UNSIGNED / __CHAR_UNSIGNED__ (GH-30851) (GH-30915) Message-ID: https://github.com/python/cpython/commit/04772cd6f164c1219c8c74d55626ba114f01aa96 commit: 04772cd6f164c1219c8c74d55626ba114f01aa96 branch: 3.9 author: Christian Heimes committer: tiran date: 2022-01-26T12:20:39+01:00 summary: [3.9] bpo-46513: Remove AC_C_CHAR_UNSIGNED / __CHAR_UNSIGNED__ (GH-30851) (GH-30915) Co-authored-by: Christian Heimes files: A Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst M Modules/audioop.c M configure M configure.ac M pyconfig.h.in diff --git a/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst b/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst new file mode 100644 index 0000000000000..b8986ae31a340 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-25-12-32-37.bpo-46513.mPm9B4.rst @@ -0,0 +1,2 @@ +:program:`configure` no longer uses ``AC_C_CHAR_UNSIGNED`` macro and +``pyconfig.h`` no longer defines reserved symbol ``__CHAR_UNSIGNED__``. diff --git a/Modules/audioop.c b/Modules/audioop.c index 3aeb6f04f13cb..2a5d805c053c7 100644 --- a/Modules/audioop.c +++ b/Modules/audioop.c @@ -5,13 +5,6 @@ #include "Python.h" -#if defined(__CHAR_UNSIGNED__) -#if defined(signed) -/* This module currently does not work on systems where only unsigned - characters are available. Take it out of Setup. Sorry. */ -#endif -#endif - static const int maxvals[] = {0, 0x7F, 0x7FFF, 0x7FFFFF, 0x7FFFFFFF}; /* -1 trick is needed on Windows to support -0x80000000 without a warning */ static const int minvals[] = {0, -0x80, -0x8000, -0x800000, -0x7FFFFFFF-1}; diff --git a/configure b/configure index d078887b2f485..5232ce64b27fd 100755 --- a/configure +++ b/configure @@ -13998,39 +13998,6 @@ fi # checks for compiler characteristics -{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether char is unsigned" >&5 -$as_echo_n "checking whether char is unsigned... 
" >&6; } -if ${ac_cv_c_char_unsigned+:} false; then : - $as_echo_n "(cached) " >&6 -else - cat confdefs.h - <<_ACEOF >conftest.$ac_ext -/* end confdefs.h. */ -$ac_includes_default -int -main () -{ -static int test_array [1 - 2 * !(((char) -1) < 0)]; -test_array [0] = 0; -return test_array [0]; - - ; - return 0; -} -_ACEOF -if ac_fn_c_try_compile "$LINENO"; then : - ac_cv_c_char_unsigned=no -else - ac_cv_c_char_unsigned=yes -fi -rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext -fi -{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_char_unsigned" >&5 -$as_echo "$ac_cv_c_char_unsigned" >&6; } -if test $ac_cv_c_char_unsigned = yes && test "$GCC" != yes; then - $as_echo "#define __CHAR_UNSIGNED__ 1" >>confdefs.h - -fi - { $as_echo "$as_me:${as_lineno-$LINENO}: checking for an ANSI C-conforming const" >&5 $as_echo_n "checking for an ANSI C-conforming const... " >&6; } if ${ac_cv_c_const+:} false; then : diff --git a/configure.ac b/configure.ac index 431d66197bc7f..754621844b1ed 100644 --- a/configure.ac +++ b/configure.ac @@ -4309,7 +4309,6 @@ fi # checks for compiler characteristics -AC_C_CHAR_UNSIGNED AC_C_CONST works=no diff --git a/pyconfig.h.in b/pyconfig.h.in index 188faeeb12020..b7bfc2541f8ba 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -1632,11 +1632,6 @@ /* Define on FreeBSD to activate all library features */ #undef __BSD_VISIBLE -/* Define to 1 if type `char' is unsigned and you are not using gcc. */ -#ifndef __CHAR_UNSIGNED__ -# undef __CHAR_UNSIGNED__ -#endif - /* Define to 'long' if doesn't define. */ #undef clock_t From webhook-mailer at python.org Wed Jan 26 07:42:30 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 12:42:30 -0000 Subject: [Python-checkins] bpo-43698: do not use `...` as argument name in docs (GH-30502) Message-ID: https://github.com/python/cpython/commit/a57ec7a4feaf24f470a9d1e5b1b3f2cb1b062af7 commit: a57ec7a4feaf24f470a9d1e5b1b3f2cb1b062af7 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-26T04:42:26-08:00 summary: bpo-43698: do not use `...` as argument name in docs (GH-30502) (cherry picked from commit b9d8980d89bfaa4bf16d60f0488adcc9d2cbf5ef) Co-authored-by: Nikita Sobolev files: M Doc/faq/design.rst M Doc/glossary.rst M Doc/library/abc.rst M Doc/library/functions.rst diff --git a/Doc/faq/design.rst b/Doc/faq/design.rst index 0437b59d55da6..ff83a1b8134b7 100644 --- a/Doc/faq/design.rst +++ b/Doc/faq/design.rst @@ -266,12 +266,9 @@ For cases where you need to choose from a very large number of possibilities, you can create a dictionary mapping case values to functions to call. For example:: - def function_1(...): - ... - functions = {'a': function_1, 'b': function_2, - 'c': self.method_1, ...} + 'c': self.method_1} func = functions[value] func() @@ -279,14 +276,14 @@ example:: For calling methods on objects, you can simplify yet further by using the :func:`getattr` built-in to retrieve methods with a particular name:: - def visit_a(self, ...): - ... - ... + class MyVisitor: + def visit_a(self): + ... - def dispatch(self, value): - method_name = 'visit_' + str(value) - method = getattr(self, method_name) - method() + def dispatch(self, value): + method_name = 'visit_' + str(value) + method = getattr(self, method_name) + method() It's suggested that you use a prefix for the method names, such as ``visit_`` in this example. 
Without such a prefix, if values are coming from an untrusted diff --git a/Doc/glossary.rst b/Doc/glossary.rst index 1bbd05a5bcecd..ddf085b8c1a6c 100644 --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -292,12 +292,12 @@ Glossary The decorator syntax is merely syntactic sugar, the following two function definitions are semantically equivalent:: - def f(...): + def f(arg): ... f = staticmethod(f) @staticmethod - def f(...): + def f(arg): ... The same concept exists for classes, but is less commonly used there. See diff --git a/Doc/library/abc.rst b/Doc/library/abc.rst index 1a6ed474ff21d..3b74622e7ff46 100644 --- a/Doc/library/abc.rst +++ b/Doc/library/abc.rst @@ -186,15 +186,15 @@ The :mod:`abc` module also provides the following decorator: class C(ABC): @abstractmethod - def my_abstract_method(self, ...): + def my_abstract_method(self, arg1): ... @classmethod @abstractmethod - def my_abstract_classmethod(cls, ...): + def my_abstract_classmethod(cls, arg2): ... @staticmethod @abstractmethod - def my_abstract_staticmethod(...): + def my_abstract_staticmethod(arg3): ... @property @@ -255,7 +255,7 @@ The :mod:`abc` module also supports the following legacy decorators: class C(ABC): @classmethod @abstractmethod - def my_abstract_classmethod(cls, ...): + def my_abstract_classmethod(cls, arg): ... @@ -276,7 +276,7 @@ The :mod:`abc` module also supports the following legacy decorators: class C(ABC): @staticmethod @abstractmethod - def my_abstract_staticmethod(...): + def my_abstract_staticmethod(arg): ... diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 9a9c87e32013c..95231363a5dc2 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -248,7 +248,7 @@ are always available. They are listed here in alphabetical order. class C: @classmethod - def f(cls, arg1, arg2, ...): ... + def f(cls, arg1, arg2): ... The ``@classmethod`` form is a function :term:`decorator` -- see :ref:`function` for details. From webhook-mailer at python.org Wed Jan 26 07:42:43 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 12:42:43 -0000 Subject: [Python-checkins] bpo-43698: do not use `...` as argument name in docs (GH-30502) Message-ID: https://github.com/python/cpython/commit/49971b2d1890c15eeec2d83ea3e8d178f266c4f9 commit: 49971b2d1890c15eeec2d83ea3e8d178f266c4f9 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-26T04:42:39-08:00 summary: bpo-43698: do not use `...` as argument name in docs (GH-30502) (cherry picked from commit b9d8980d89bfaa4bf16d60f0488adcc9d2cbf5ef) Co-authored-by: Nikita Sobolev files: M Doc/faq/design.rst M Doc/glossary.rst M Doc/library/abc.rst M Doc/library/functions.rst diff --git a/Doc/faq/design.rst b/Doc/faq/design.rst index d0aee4e607268..01fa5c77d8659 100644 --- a/Doc/faq/design.rst +++ b/Doc/faq/design.rst @@ -267,12 +267,9 @@ For cases where you need to choose from a very large number of possibilities, you can create a dictionary mapping case values to functions to call. For example:: - def function_1(...): - ... - functions = {'a': function_1, 'b': function_2, - 'c': self.method_1, ...} + 'c': self.method_1} func = functions[value] func() @@ -280,14 +277,14 @@ example:: For calling methods on objects, you can simplify yet further by using the :func:`getattr` built-in to retrieve methods with a particular name:: - def visit_a(self, ...): - ... - ... 
+ class MyVisitor: + def visit_a(self): + ... - def dispatch(self, value): - method_name = 'visit_' + str(value) - method = getattr(self, method_name) - method() + def dispatch(self, value): + method_name = 'visit_' + str(value) + method = getattr(self, method_name) + method() It's suggested that you use a prefix for the method names, such as ``visit_`` in this example. Without such a prefix, if values are coming from an untrusted diff --git a/Doc/glossary.rst b/Doc/glossary.rst index da9dc9ceebfc4..f759c3fb05607 100644 --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -278,12 +278,12 @@ Glossary The decorator syntax is merely syntactic sugar, the following two function definitions are semantically equivalent:: - def f(...): + def f(arg): ... f = staticmethod(f) @staticmethod - def f(...): + def f(arg): ... The same concept exists for classes, but is less commonly used there. See diff --git a/Doc/library/abc.rst b/Doc/library/abc.rst index 424ae547d829a..35be01e891e9d 100644 --- a/Doc/library/abc.rst +++ b/Doc/library/abc.rst @@ -185,15 +185,15 @@ The :mod:`abc` module also provides the following decorator: class C(ABC): @abstractmethod - def my_abstract_method(self, ...): + def my_abstract_method(self, arg1): ... @classmethod @abstractmethod - def my_abstract_classmethod(cls, ...): + def my_abstract_classmethod(cls, arg2): ... @staticmethod @abstractmethod - def my_abstract_staticmethod(...): + def my_abstract_staticmethod(arg3): ... @property @@ -255,7 +255,7 @@ The :mod:`abc` module also supports the following legacy decorators: class C(ABC): @classmethod @abstractmethod - def my_abstract_classmethod(cls, ...): + def my_abstract_classmethod(cls, arg): ... @@ -276,7 +276,7 @@ The :mod:`abc` module also supports the following legacy decorators: class C(ABC): @staticmethod @abstractmethod - def my_abstract_staticmethod(...): + def my_abstract_staticmethod(arg): ... diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 8df557e47a16e..9a9c707dc177a 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -210,7 +210,7 @@ are always available. They are listed here in alphabetical order. class C: @classmethod - def f(cls, arg1, arg2, ...): ... + def f(cls, arg1, arg2): ... The ``@classmethod`` form is a function :term:`decorator` -- see :ref:`function` for details. 
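A note on the bpo-43698 changes above: ``...`` (the Ellipsis literal) is valid as a stub function body but not as a parameter name, so signatures like ``def f(cls, arg1, arg2, ...)`` do not parse. As an illustration only (not part of any commit above), the following sketch fills in the elided bodies of the FAQ's ``getattr``-based dispatch example so it runs as written; the printed messages and the extra ``visit_b`` handler are assumed here purely for demonstration::

    class MyVisitor:
        def visit_a(self):
            print("handling case 'a'")

        def visit_b(self):
            print("handling case 'b'")

        def dispatch(self, value):
            # Build the handler name from the value and look it up on self,
            # mirroring the getattr() pattern in the FAQ change above.
            method_name = 'visit_' + str(value)
            method = getattr(self, method_name)
            method()

    MyVisitor().dispatch('a')   # prints: handling case 'a'

Unknown values make ``getattr()`` raise :exc:`AttributeError`, and the ``visit_`` prefix suggested in the FAQ keeps untrusted values from reaching arbitrary attributes of the object.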
From webhook-mailer at python.org Wed Jan 26 08:58:37 2022 From: webhook-mailer at python.org (markshannon) Date: Wed, 26 Jan 2022 13:58:37 -0000 Subject: [Python-checkins] Use existing unbound_local_error label in DELETE_FAST opcode (GH-30882) Message-ID: https://github.com/python/cpython/commit/1e8a3a5579c9a96a45073b24d1af2dbe3039d366 commit: 1e8a3a5579c9a96a45073b24d1af2dbe3039d366 branch: main author: Kumar Aditya <59607654+kumaraditya303 at users.noreply.github.com> committer: markshannon date: 2022-01-26T13:58:28Z summary: Use existing unbound_local_error label in DELETE_FAST opcode (GH-30882) files: M Python/ceval.c diff --git a/Python/ceval.c b/Python/ceval.c index 0a6fc4a20660b..29ca5e3750cc0 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -3107,12 +3107,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr SETLOCAL(oparg, NULL); DISPATCH(); } - format_exc_check_arg( - tstate, PyExc_UnboundLocalError, - UNBOUNDLOCAL_ERROR_MSG, - PyTuple_GetItem(frame->f_code->co_localsplusnames, oparg) - ); - goto error; + goto unbound_local_error; } TARGET(MAKE_CELL) { From webhook-mailer at python.org Wed Jan 26 10:46:57 2022 From: webhook-mailer at python.org (corona10) Date: Wed, 26 Jan 2022 15:46:57 -0000 Subject: [Python-checkins] bpo-46527: allow calling enumerate(iterable=...) again (GH-30904) Message-ID: https://github.com/python/cpython/commit/ac0c6e128cb6553585af096c851c488b53a6c952 commit: ac0c6e128cb6553585af096c851c488b53a6c952 branch: main author: Jelle Zijlstra committer: corona10 date: 2022-01-27T00:46:48+09:00 summary: bpo-46527: allow calling enumerate(iterable=...) again (GH-30904) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-25-19-34-55.bpo-46527.mQLNPk.rst M Lib/test/test_enumerate.py M Objects/enumobject.c diff --git a/Lib/test/test_enumerate.py b/Lib/test/test_enumerate.py index 906bfc21a26ae..5cb54cff9b76f 100644 --- a/Lib/test/test_enumerate.py +++ b/Lib/test/test_enumerate.py @@ -128,6 +128,18 @@ def test_argumentcheck(self): self.assertRaises(TypeError, self.enum, 'abc', 'a') # wrong type self.assertRaises(TypeError, self.enum, 'abc', 2, 3) # too many arguments + def test_kwargs(self): + self.assertEqual(list(self.enum(iterable=Ig(self.seq))), self.res) + expected = list(self.enum(Ig(self.seq), 0)) + self.assertEqual(list(self.enum(iterable=Ig(self.seq), start=0)), + expected) + self.assertEqual(list(self.enum(start=0, iterable=Ig(self.seq))), + expected) + self.assertRaises(TypeError, self.enum, iterable=[], x=3) + self.assertRaises(TypeError, self.enum, start=0, x=3) + self.assertRaises(TypeError, self.enum, x=0, y=3) + self.assertRaises(TypeError, self.enum, x=0) + @support.cpython_only def test_tuple_reuse(self): # Tests an implementation detail where tuple is reused @@ -266,14 +278,16 @@ def test_basicfunction(self): class TestStart(EnumerateStartTestCase): + def enum(self, iterable, start=11): + return enumerate(iterable, start=start) - enum = lambda self, i: enumerate(i, start=11) seq, res = 'abc', [(11, 'a'), (12, 'b'), (13, 'c')] class TestLongStart(EnumerateStartTestCase): + def enum(self, iterable, start=sys.maxsize + 1): + return enumerate(iterable, start=start) - enum = lambda self, i: enumerate(i, start=sys.maxsize+1) seq, res = 'abc', [(sys.maxsize+1,'a'), (sys.maxsize+2,'b'), (sys.maxsize+3,'c')] diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-25-19-34-55.bpo-46527.mQLNPk.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-25-19-34-55.bpo-46527.mQLNPk.rst new file mode 100644 index 
0000000000000..c9fd0ed05e2ae --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-25-19-34-55.bpo-46527.mQLNPk.rst @@ -0,0 +1,2 @@ +Allow passing ``iterable`` as a keyword argument to :func:`enumerate` again. +Patch by Jelle Zijlstra. diff --git a/Objects/enumobject.c b/Objects/enumobject.c index 36f592d7c239c..828f1f925a0a1 100644 --- a/Objects/enumobject.c +++ b/Objects/enumobject.c @@ -83,6 +83,18 @@ enum_new_impl(PyTypeObject *type, PyObject *iterable, PyObject *start) return (PyObject *)en; } +static int check_keyword(PyObject *kwnames, int index, + const char *name) +{ + PyObject *kw = PyTuple_GET_ITEM(kwnames, index); + if (!_PyUnicode_EqualToASCIIString(kw, name)) { + PyErr_Format(PyExc_TypeError, + "'%S' is an invalid keyword argument for enumerate()", kw); + return 0; + } + return 1; +} + // TODO: Use AC when bpo-43447 is supported static PyObject * enumerate_vectorcall(PyObject *type, PyObject *const *args, @@ -91,31 +103,46 @@ enumerate_vectorcall(PyObject *type, PyObject *const *args, PyTypeObject *tp = _PyType_CAST(type); Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); Py_ssize_t nkwargs = 0; - if (nargs == 0) { - PyErr_SetString(PyExc_TypeError, - "enumerate() missing required argument 'iterable'"); - return NULL; - } if (kwnames != NULL) { nkwargs = PyTuple_GET_SIZE(kwnames); } + // Manually implement enumerate(iterable, start=...) if (nargs + nkwargs == 2) { if (nkwargs == 1) { - PyObject *kw = PyTuple_GET_ITEM(kwnames, 0); - if (!_PyUnicode_EqualToASCIIString(kw, "start")) { - PyErr_Format(PyExc_TypeError, - "'%S' is an invalid keyword argument for enumerate()", kw); + if (!check_keyword(kwnames, 0, "start")) { + return NULL; + } + } else if (nkwargs == 2) { + PyObject *kw0 = PyTuple_GET_ITEM(kwnames, 0); + if (_PyUnicode_EqualToASCIIString(kw0, "start")) { + if (!check_keyword(kwnames, 1, "iterable")) { + return NULL; + } + return enum_new_impl(tp, args[1], args[0]); + } + if (!check_keyword(kwnames, 0, "iterable") || + !check_keyword(kwnames, 1, "start")) { return NULL; } + } return enum_new_impl(tp, args[0], args[1]); } - if (nargs == 1 && nkwargs == 0) { + if (nargs + nkwargs == 1) { + if (nkwargs == 1 && !check_keyword(kwnames, 0, "iterable")) { + return NULL; + } return enum_new_impl(tp, args[0], NULL); } + if (nargs == 0) { + PyErr_SetString(PyExc_TypeError, + "enumerate() missing required argument 'iterable'"); + return NULL; + } + PyErr_Format(PyExc_TypeError, "enumerate() takes at most 2 arguments (%d given)", nargs + nkwargs); return NULL; From webhook-mailer at python.org Wed Jan 26 11:26:27 2022 From: webhook-mailer at python.org (gvanrossum) Date: Wed, 26 Jan 2022 16:26:27 -0000 Subject: [Python-checkins] bpo-43853: Expand test suite for SQLite UDF's (GH-27642) Message-ID: https://github.com/python/cpython/commit/3eb3b4f270757f66c7fb6dcf5afa416ee1582a4b commit: 3eb3b4f270757f66c7fb6dcf5afa416ee1582a4b branch: main author: Erlend Egeberg Aasland committer: gvanrossum date: 2022-01-26T08:26:16-08:00 summary: bpo-43853: Expand test suite for SQLite UDF's (GH-27642) files: M Lib/test/test_sqlite3/test_userfunctions.py M Modules/_sqlite/connection.c M Modules/_sqlite/statement.c diff --git a/Lib/test/test_sqlite3/test_userfunctions.py b/Lib/test/test_sqlite3/test_userfunctions.py index b2906081e5028..23ecfb4e8a689 100644 --- a/Lib/test/test_sqlite3/test_userfunctions.py +++ b/Lib/test/test_sqlite3/test_userfunctions.py @@ -23,7 +23,6 @@ import contextlib import functools -import gc import io import re import sys @@ -31,7 +30,8 @@ import unittest.mock 
import sqlite3 as sqlite -from test.support import bigmemtest, catch_unraisable_exception +from test.support import bigmemtest, catch_unraisable_exception, gc_collect + from test.test_sqlite3.test_dbapi import cx_limit @@ -94,22 +94,6 @@ def func_memoryerror(): def func_overflowerror(): raise OverflowError -def func_isstring(v): - return type(v) is str -def func_isint(v): - return type(v) is int -def func_isfloat(v): - return type(v) is float -def func_isnone(v): - return type(v) is type(None) -def func_isblob(v): - return isinstance(v, (bytes, memoryview)) -def func_islonglong(v): - return isinstance(v, int) and v >= 1<<31 - -def func(*args): - return len(args) - class AggrNoStep: def __init__(self): pass @@ -210,17 +194,15 @@ def setUp(self): self.con.create_function("returnnull", 0, func_returnnull) self.con.create_function("returnblob", 0, func_returnblob) self.con.create_function("returnlonglong", 0, func_returnlonglong) + self.con.create_function("returnnan", 0, lambda: float("nan")) + self.con.create_function("returntoolargeint", 0, lambda: 1 << 65) self.con.create_function("raiseexception", 0, func_raiseexception) self.con.create_function("memoryerror", 0, func_memoryerror) self.con.create_function("overflowerror", 0, func_overflowerror) - self.con.create_function("isstring", 1, func_isstring) - self.con.create_function("isint", 1, func_isint) - self.con.create_function("isfloat", 1, func_isfloat) - self.con.create_function("isnone", 1, func_isnone) - self.con.create_function("isblob", 1, func_isblob) - self.con.create_function("islonglong", 1, func_islonglong) - self.con.create_function("spam", -1, func) + self.con.create_function("isblob", 1, lambda x: isinstance(x, bytes)) + self.con.create_function("isnone", 1, lambda x: x is None) + self.con.create_function("spam", -1, lambda *x: len(x)) self.con.execute("create table test(t text)") def tearDown(self): @@ -305,6 +287,16 @@ def test_func_return_long_long(self): val = cur.fetchone()[0] self.assertEqual(val, 1<<31) + def test_func_return_nan(self): + cur = self.con.cursor() + cur.execute("select returnnan()") + self.assertIsNone(cur.fetchone()[0]) + + def test_func_return_too_large_int(self): + cur = self.con.cursor() + self.assertRaisesRegex(sqlite.DataError, "string or blob too big", + self.con.execute, "select returntoolargeint()") + @with_tracebacks(ZeroDivisionError, name="func_raiseexception") def test_func_exception(self): cur = self.con.cursor() @@ -327,44 +319,6 @@ def test_func_overflow_error(self): cur.execute("select overflowerror()") cur.fetchone() - def test_param_string(self): - cur = self.con.cursor() - for text in ["foo", str()]: - with self.subTest(text=text): - cur.execute("select isstring(?)", (text,)) - val = cur.fetchone()[0] - self.assertEqual(val, 1) - - def test_param_int(self): - cur = self.con.cursor() - cur.execute("select isint(?)", (42,)) - val = cur.fetchone()[0] - self.assertEqual(val, 1) - - def test_param_float(self): - cur = self.con.cursor() - cur.execute("select isfloat(?)", (3.14,)) - val = cur.fetchone()[0] - self.assertEqual(val, 1) - - def test_param_none(self): - cur = self.con.cursor() - cur.execute("select isnone(?)", (None,)) - val = cur.fetchone()[0] - self.assertEqual(val, 1) - - def test_param_blob(self): - cur = self.con.cursor() - cur.execute("select isblob(?)", (memoryview(b"blob"),)) - val = cur.fetchone()[0] - self.assertEqual(val, 1) - - def test_param_long_long(self): - cur = self.con.cursor() - cur.execute("select islonglong(?)", (1<<42,)) - val = cur.fetchone()[0] - 
self.assertEqual(val, 1) - def test_any_arguments(self): cur = self.con.cursor() cur.execute("select spam(?, ?)", (1, 2)) @@ -375,6 +329,52 @@ def test_empty_blob(self): cur = self.con.execute("select isblob(x'')") self.assertTrue(cur.fetchone()[0]) + def test_nan_float(self): + cur = self.con.execute("select isnone(?)", (float("nan"),)) + # SQLite has no concept of nan; it is converted to NULL + self.assertTrue(cur.fetchone()[0]) + + def test_too_large_int(self): + err = "Python int too large to convert to SQLite INTEGER" + self.assertRaisesRegex(OverflowError, err, self.con.execute, + "select spam(?)", (1 << 65,)) + + def test_non_contiguous_blob(self): + self.assertRaisesRegex(ValueError, "could not convert BLOB to buffer", + self.con.execute, "select spam(?)", + (memoryview(b"blob")[::2],)) + + def test_param_surrogates(self): + self.assertRaisesRegex(UnicodeEncodeError, "surrogates not allowed", + self.con.execute, "select spam(?)", + ("\ud803\ude6d",)) + + def test_func_params(self): + results = [] + def append_result(arg): + results.append((arg, type(arg))) + self.con.create_function("test_params", 1, append_result) + + dataset = [ + (42, int), + (-1, int), + (1234567890123456789, int), + (4611686018427387905, int), # 63-bit int with non-zero low bits + (3.14, float), + (float('inf'), float), + ("text", str), + ("1\x002", str), + ("\u02e2q\u02e1\u2071\u1d57\u1d49", str), + (b"blob", bytes), + (bytearray(range(2)), bytes), + (memoryview(b"blob"), bytes), + (None, type(None)), + ] + for val, _ in dataset: + cur = self.con.execute("select test_params(?)", (val,)) + cur.fetchone() + self.assertEqual(dataset, results) + # Regarding deterministic functions: # # Between 3.8.3 and 3.15.0, deterministic functions were only used to @@ -430,7 +430,7 @@ def md5sum(t): y.append(y) del x,y - gc.collect() + gc_collect() @with_tracebacks(OverflowError) def test_func_return_too_large_int(self): diff --git a/Modules/_sqlite/connection.c b/Modules/_sqlite/connection.c index 02f4ac46b7c35..caefdf483a0df 100644 --- a/Modules/_sqlite/connection.c +++ b/Modules/_sqlite/connection.c @@ -559,7 +559,11 @@ _pysqlite_set_result(sqlite3_context* context, PyObject* py_val) return -1; sqlite3_result_int64(context, value); } else if (PyFloat_Check(py_val)) { - sqlite3_result_double(context, PyFloat_AsDouble(py_val)); + double value = PyFloat_AsDouble(py_val); + if (value == -1 && PyErr_Occurred()) { + return -1; + } + sqlite3_result_double(context, value); } else if (PyUnicode_Check(py_val)) { Py_ssize_t sz; const char *str = PyUnicode_AsUTF8AndSize(py_val, &sz); diff --git a/Modules/_sqlite/statement.c b/Modules/_sqlite/statement.c index 66fadb63e53ca..6885b50f61637 100644 --- a/Modules/_sqlite/statement.c +++ b/Modules/_sqlite/statement.c @@ -166,9 +166,16 @@ int pysqlite_statement_bind_parameter(pysqlite_Statement* self, int pos, PyObjec rc = sqlite3_bind_int64(self->st, pos, value); break; } - case TYPE_FLOAT: - rc = sqlite3_bind_double(self->st, pos, PyFloat_AsDouble(parameter)); + case TYPE_FLOAT: { + double value = PyFloat_AsDouble(parameter); + if (value == -1 && PyErr_Occurred()) { + rc = -1; + } + else { + rc = sqlite3_bind_double(self->st, pos, value); + } break; + } case TYPE_UNICODE: string = PyUnicode_AsUTF8AndSize(parameter, &buflen); if (string == NULL) From webhook-mailer at python.org Wed Jan 26 11:32:57 2022 From: webhook-mailer at python.org (vstinner) Date: Wed, 26 Jan 2022 16:32:57 -0000 Subject: [Python-checkins] bpo-35134: Add Include/cpython/descrobject.h (GH-30923) Message-ID: 
https://github.com/python/cpython/commit/d4a85f104bf9d2e368f25c9a567eaaa2cc39a96a commit: d4a85f104bf9d2e368f25c9a567eaaa2cc39a96a branch: main author: Victor Stinner committer: vstinner date: 2022-01-26T17:32:47+01:00 summary: bpo-35134: Add Include/cpython/descrobject.h (GH-30923) Move Include/descrobject.h non-limited API to a new Include/cpython/descrobject.h header file. files: A Include/cpython/descrobject.h M Include/descrobject.h M Makefile.pre.in M PCbuild/pythoncore.vcxproj M PCbuild/pythoncore.vcxproj.filters diff --git a/Include/cpython/descrobject.h b/Include/cpython/descrobject.h new file mode 100644 index 0000000000000..5d6c3a24d951e --- /dev/null +++ b/Include/cpython/descrobject.h @@ -0,0 +1,64 @@ +#ifndef Py_CPYTHON_DESCROBJECT_H +# error "this header file must not be included directly" +#endif + +typedef PyObject *(*wrapperfunc)(PyObject *self, PyObject *args, + void *wrapped); + +typedef PyObject *(*wrapperfunc_kwds)(PyObject *self, PyObject *args, + void *wrapped, PyObject *kwds); + +struct wrapperbase { + const char *name; + int offset; + void *function; + wrapperfunc wrapper; + const char *doc; + int flags; + PyObject *name_strobj; +}; + +/* Flags for above struct */ +#define PyWrapperFlag_KEYWORDS 1 /* wrapper function takes keyword args */ + +/* Various kinds of descriptor objects */ + +typedef struct { + PyObject_HEAD + PyTypeObject *d_type; + PyObject *d_name; + PyObject *d_qualname; +} PyDescrObject; + +#define PyDescr_COMMON PyDescrObject d_common + +#define PyDescr_TYPE(x) (((PyDescrObject *)(x))->d_type) +#define PyDescr_NAME(x) (((PyDescrObject *)(x))->d_name) + +typedef struct { + PyDescr_COMMON; + PyMethodDef *d_method; + vectorcallfunc vectorcall; +} PyMethodDescrObject; + +typedef struct { + PyDescr_COMMON; + struct PyMemberDef *d_member; +} PyMemberDescrObject; + +typedef struct { + PyDescr_COMMON; + PyGetSetDef *d_getset; +} PyGetSetDescrObject; + +typedef struct { + PyDescr_COMMON; + struct wrapperbase *d_base; + void *d_wrapped; /* This can be any function pointer */ +} PyWrapperDescrObject; + +PyAPI_DATA(PyTypeObject) _PyMethodWrapper_Type; + +PyAPI_FUNC(PyObject *) PyDescr_NewWrapper(PyTypeObject *, + struct wrapperbase *, void *); +PyAPI_FUNC(int) PyDescr_IsData(PyObject *); diff --git a/Include/descrobject.h b/Include/descrobject.h index 703bc8fd6df21..36802f4c55afa 100644 --- a/Include/descrobject.h +++ b/Include/descrobject.h @@ -16,93 +16,33 @@ typedef struct PyGetSetDef { void *closure; } PyGetSetDef; -#ifndef Py_LIMITED_API -typedef PyObject *(*wrapperfunc)(PyObject *self, PyObject *args, - void *wrapped); - -typedef PyObject *(*wrapperfunc_kwds)(PyObject *self, PyObject *args, - void *wrapped, PyObject *kwds); - -struct wrapperbase { - const char *name; - int offset; - void *function; - wrapperfunc wrapper; - const char *doc; - int flags; - PyObject *name_strobj; -}; - -/* Flags for above struct */ -#define PyWrapperFlag_KEYWORDS 1 /* wrapper function takes keyword args */ - -/* Various kinds of descriptor objects */ - -typedef struct { - PyObject_HEAD - PyTypeObject *d_type; - PyObject *d_name; - PyObject *d_qualname; -} PyDescrObject; - -#define PyDescr_COMMON PyDescrObject d_common - -#define PyDescr_TYPE(x) (((PyDescrObject *)(x))->d_type) -#define PyDescr_NAME(x) (((PyDescrObject *)(x))->d_name) - -typedef struct { - PyDescr_COMMON; - PyMethodDef *d_method; - vectorcallfunc vectorcall; -} PyMethodDescrObject; - -typedef struct { - PyDescr_COMMON; - struct PyMemberDef *d_member; -} PyMemberDescrObject; - -typedef struct { - 
PyDescr_COMMON; - PyGetSetDef *d_getset; -} PyGetSetDescrObject; - -typedef struct { - PyDescr_COMMON; - struct wrapperbase *d_base; - void *d_wrapped; /* This can be any function pointer */ -} PyWrapperDescrObject; -#endif /* Py_LIMITED_API */ - PyAPI_DATA(PyTypeObject) PyClassMethodDescr_Type; PyAPI_DATA(PyTypeObject) PyGetSetDescr_Type; PyAPI_DATA(PyTypeObject) PyMemberDescr_Type; PyAPI_DATA(PyTypeObject) PyMethodDescr_Type; PyAPI_DATA(PyTypeObject) PyWrapperDescr_Type; PyAPI_DATA(PyTypeObject) PyDictProxy_Type; -#ifndef Py_LIMITED_API -PyAPI_DATA(PyTypeObject) _PyMethodWrapper_Type; -#endif /* Py_LIMITED_API */ +PyAPI_DATA(PyTypeObject) PyProperty_Type; +// Forward declaration for following prototype +struct PyMemberDef; PyAPI_FUNC(PyObject *) PyDescr_NewMethod(PyTypeObject *, PyMethodDef *); PyAPI_FUNC(PyObject *) PyDescr_NewClassMethod(PyTypeObject *, PyMethodDef *); -struct PyMemberDef; /* forward declaration for following prototype */ PyAPI_FUNC(PyObject *) PyDescr_NewMember(PyTypeObject *, struct PyMemberDef *); PyAPI_FUNC(PyObject *) PyDescr_NewGetSet(PyTypeObject *, struct PyGetSetDef *); -#ifndef Py_LIMITED_API -PyAPI_FUNC(PyObject *) PyDescr_NewWrapper(PyTypeObject *, - struct wrapperbase *, void *); -PyAPI_FUNC(int) PyDescr_IsData(PyObject *); -#endif PyAPI_FUNC(PyObject *) PyDictProxy_New(PyObject *); PyAPI_FUNC(PyObject *) PyWrapper_New(PyObject *, PyObject *); +#ifndef Py_LIMITED_API +# define Py_CPYTHON_DESCROBJECT_H +# include "cpython/descrobject.h" +# undef Py_CPYTHON_DESCROBJECT_H +#endif -PyAPI_DATA(PyTypeObject) PyProperty_Type; #ifdef __cplusplus } #endif #endif /* !Py_DESCROBJECT_H */ - diff --git a/Makefile.pre.in b/Makefile.pre.in index 3ea019b7cac47..75482f9c1c56b 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1505,6 +1505,7 @@ PYTHON_HEADERS= \ $(srcdir)/Include/cpython/code.h \ $(srcdir)/Include/cpython/compile.h \ $(srcdir)/Include/cpython/context.h \ + $(srcdir)/Include/cpython/descrobject.h \ $(srcdir)/Include/cpython/dictobject.h \ $(srcdir)/Include/cpython/fileobject.h \ $(srcdir)/Include/cpython/fileutils.h \ diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj index fd1ab837c0775..8f9c4fe63b8dc 100644 --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -142,6 +142,7 @@ + diff --git a/PCbuild/pythoncore.vcxproj.filters b/PCbuild/pythoncore.vcxproj.filters index 4a502078177d8..dc3b554779486 100644 --- a/PCbuild/pythoncore.vcxproj.filters +++ b/PCbuild/pythoncore.vcxproj.filters @@ -366,6 +366,9 @@ Include\cpython + + Include\cpython + Include\cpython @@ -1274,4 +1277,4 @@ Resource Files - \ No newline at end of file + From webhook-mailer at python.org Wed Jan 26 15:47:53 2022 From: webhook-mailer at python.org (brandtbucher) Date: Wed, 26 Jan 2022 20:47:53 -0000 Subject: [Python-checkins] bpo-46528: Simplify the VM's stack manipulations (GH-30902) Message-ID: https://github.com/python/cpython/commit/85483668647e7840c7b9a1877caaf2ef14a4443f commit: 85483668647e7840c7b9a1877caaf2ef14a4443f branch: main author: Brandt Bucher committer: brandtbucher date: 2022-01-26T12:47:45-08:00 summary: bpo-46528: Simplify the VM's stack manipulations (GH-30902) files: A Misc/NEWS.d/next/Core and Builtins/2022-01-25-17-40-07.bpo-46528.2Qmni9.rst M Doc/library/dis.rst M Doc/whatsnew/3.11.rst M Include/opcode.h M Lib/importlib/_bootstrap_external.py M Lib/opcode.py M Lib/test/test__opcode.py M Lib/test/test_dis.py M Lib/test/test_peepholer.py M Python/ceval.c M Python/compile.c M Python/opcode_targets.h diff --git 
a/Doc/library/dis.rst b/Doc/library/dis.rst index 793152d9d812c..d59680919716e 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -348,7 +348,8 @@ The Python compiler currently generates the following bytecode instructions. .. opcode:: NOP - Do nothing code. Used as a placeholder by the bytecode optimizer. + Do nothing code. Used as a placeholder by the bytecode optimizer, and to + generate line tracing events. .. opcode:: POP_TOP @@ -356,38 +357,19 @@ The Python compiler currently generates the following bytecode instructions. Removes the top-of-stack (TOS) item. -.. opcode:: ROT_TWO - - Swaps the two top-most stack items. - - -.. opcode:: ROT_THREE - - Lifts second and third stack item one position up, moves top down to position - three. - - -.. opcode:: ROT_FOUR - - Lifts second, third and fourth stack items one position up, moves top down - to position four. - - .. versionadded:: 3.8 - - -.. opcode:: DUP_TOP +.. opcode:: COPY (i) - Duplicates the reference on top of the stack. + Push the *i*-th item to the top of the stack. The item is not removed from its + original location. - .. versionadded:: 3.2 + .. versionadded:: 3.11 -.. opcode:: DUP_TOP_TWO +.. opcode:: SWAP (i) - Duplicates the two references on top of the stack, leaving them in the - same order. + Swap TOS with the item at position *i*. - .. versionadded:: 3.2 + .. versionadded:: 3.11 **Unary operations** @@ -689,8 +671,6 @@ iterations of the loop. success (``True``) or failure (``False``). -All of the following opcodes use their arguments. - .. opcode:: STORE_NAME (namei) Implements ``name = TOS``. *namei* is the index of *name* in the attribute @@ -1217,22 +1197,6 @@ All of the following opcodes use their arguments. success (``True``) or failure (``False``). -.. opcode:: ROT_N (count) - - Lift the top *count* stack items one position up, and move TOS down to - position *count*. - - .. versionadded:: 3.10 - - -.. opcode:: COPY (i) - - Push the *i*-th item to the top of the stack. The item is not removed from its - original location. - - .. versionadded:: 3.11 - - .. opcode:: RESUME (where) A no-op. Performs internal tracing, debugging and optimization checks. diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 4328ee6a5030c..edb2f6b89bbae 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -411,8 +411,9 @@ CPython bytecode changes indicate failure with :const:`None` (where a tuple of extracted values would otherwise be). -* Added :opcode:`COPY`, which pushes the *i*-th item to the top of the stack. - The item is not removed from its original location. +* Replace several stack manipulation instructions (``DUP_TOP``, ``DUP_TOP_TWO``, + ``ROT_TWO``, ``ROT_THREE``, ``ROT_FOUR``, and ``ROT_N``) with new + :opcode:`COPY` and :opcode:`SWAP` instructions. * Add :opcode:`POP_JUMP_IF_NOT_NONE` and :opcode:`POP_JUMP_IF_NONE` opcodes to speed up conditional jumps. 
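The bullets above give the user-visible summary; the remaining per-file diffs follow. As a quick orientation, here is a rough Python model of the new instructions' stack effects (the ``copy_op``/``swap_op`` helpers and the list-as-stack representation are illustrative assumptions, not CPython code; only the stack effects are meant to mirror the ceval.c handlers further down)::

    # Toy model: the stack top is the last element of the list.
    def copy_op(stack, i):
        # COPY i: push a fresh reference to the i-th item from the top.
        stack.append(stack[-i])

    def swap_op(stack, i):
        # SWAP i: exchange the top of the stack with the i-th item from the top.
        stack[-1], stack[-i] = stack[-i], stack[-1]

    # The removed instructions can be expressed with the new ones, which is
    # essentially what the compile.c changes in this commit do (sometimes via
    # shorter combined sequences):
    #   DUP_TOP      -> COPY 1
    #   DUP_TOP_TWO  -> COPY 2; COPY 2
    #   ROT_TWO      -> SWAP 2
    #   ROT_THREE    -> SWAP 3; SWAP 2
    #   ROT_FOUR     -> SWAP 4; SWAP 3; SWAP 2
    #   ROT_N(n)     -> SWAP n; SWAP n-1; ...; SWAP 2

    stack = ['third', 'second', 'top']
    swap_op(stack, 3)   # old ROT_THREE, step 1
    swap_op(stack, 2)   # old ROT_THREE, step 2
    assert stack == ['top', 'third', 'second']
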
diff --git a/Include/opcode.h b/Include/opcode.h index 985758d8fdbf2..02cdf42fe2443 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -8,11 +8,6 @@ extern "C" { /* Instruction opcodes for compiled code */ #define POP_TOP 1 -#define ROT_TWO 2 -#define ROT_THREE 3 -#define DUP_TOP 4 -#define DUP_TOP_TWO 5 -#define ROT_FOUR 6 #define NOP 9 #define UNARY_POSITIVE 10 #define UNARY_NEGATIVE 11 @@ -57,7 +52,7 @@ extern "C" { #define DELETE_ATTR 96 #define STORE_GLOBAL 97 #define DELETE_GLOBAL 98 -#define ROT_N 99 +#define SWAP 99 #define LOAD_CONST 100 #define LOAD_NAME 101 #define BUILD_TUPLE 102 @@ -117,61 +112,61 @@ extern "C" { #define PRECALL_METHOD 168 #define CALL_NO_KW 169 #define CALL_KW 170 -#define BINARY_OP_ADAPTIVE 7 -#define BINARY_OP_ADD_INT 8 -#define BINARY_OP_ADD_FLOAT 13 -#define BINARY_OP_ADD_UNICODE 14 -#define BINARY_OP_INPLACE_ADD_UNICODE 16 -#define BINARY_OP_MULTIPLY_INT 17 -#define BINARY_OP_MULTIPLY_FLOAT 18 -#define BINARY_OP_SUBTRACT_INT 19 -#define BINARY_OP_SUBTRACT_FLOAT 20 -#define COMPARE_OP_ADAPTIVE 21 -#define COMPARE_OP_FLOAT_JUMP 22 -#define COMPARE_OP_INT_JUMP 23 -#define COMPARE_OP_STR_JUMP 24 -#define BINARY_SUBSCR_ADAPTIVE 26 -#define BINARY_SUBSCR_GETITEM 27 -#define BINARY_SUBSCR_LIST_INT 28 -#define BINARY_SUBSCR_TUPLE_INT 29 -#define BINARY_SUBSCR_DICT 34 -#define STORE_SUBSCR_ADAPTIVE 36 -#define STORE_SUBSCR_LIST_INT 37 -#define STORE_SUBSCR_DICT 38 -#define CALL_NO_KW_ADAPTIVE 39 -#define CALL_NO_KW_BUILTIN_O 40 -#define CALL_NO_KW_BUILTIN_FAST 41 -#define CALL_NO_KW_LEN 42 -#define CALL_NO_KW_ISINSTANCE 43 -#define CALL_NO_KW_PY_SIMPLE 44 -#define CALL_NO_KW_LIST_APPEND 45 -#define CALL_NO_KW_METHOD_DESCRIPTOR_O 46 -#define CALL_NO_KW_TYPE_1 47 -#define CALL_NO_KW_BUILTIN_CLASS_1 48 -#define CALL_NO_KW_METHOD_DESCRIPTOR_FAST 55 -#define JUMP_ABSOLUTE_QUICK 56 -#define LOAD_ATTR_ADAPTIVE 57 -#define LOAD_ATTR_INSTANCE_VALUE 58 -#define LOAD_ATTR_WITH_HINT 59 -#define LOAD_ATTR_SLOT 62 -#define LOAD_ATTR_MODULE 63 -#define LOAD_GLOBAL_ADAPTIVE 64 -#define LOAD_GLOBAL_MODULE 65 -#define LOAD_GLOBAL_BUILTIN 66 -#define LOAD_METHOD_ADAPTIVE 67 -#define LOAD_METHOD_CACHED 72 -#define LOAD_METHOD_CLASS 76 -#define LOAD_METHOD_MODULE 77 -#define LOAD_METHOD_NO_DICT 78 -#define STORE_ATTR_ADAPTIVE 79 -#define STORE_ATTR_INSTANCE_VALUE 80 -#define STORE_ATTR_SLOT 81 -#define STORE_ATTR_WITH_HINT 131 -#define LOAD_FAST__LOAD_FAST 140 -#define STORE_FAST__LOAD_FAST 141 -#define LOAD_FAST__LOAD_CONST 143 -#define LOAD_CONST__LOAD_FAST 150 -#define STORE_FAST__STORE_FAST 153 +#define BINARY_OP_ADAPTIVE 2 +#define BINARY_OP_ADD_INT 3 +#define BINARY_OP_ADD_FLOAT 4 +#define BINARY_OP_ADD_UNICODE 5 +#define BINARY_OP_INPLACE_ADD_UNICODE 6 +#define BINARY_OP_MULTIPLY_INT 7 +#define BINARY_OP_MULTIPLY_FLOAT 8 +#define BINARY_OP_SUBTRACT_INT 13 +#define BINARY_OP_SUBTRACT_FLOAT 14 +#define COMPARE_OP_ADAPTIVE 16 +#define COMPARE_OP_FLOAT_JUMP 17 +#define COMPARE_OP_INT_JUMP 18 +#define COMPARE_OP_STR_JUMP 19 +#define BINARY_SUBSCR_ADAPTIVE 20 +#define BINARY_SUBSCR_GETITEM 21 +#define BINARY_SUBSCR_LIST_INT 22 +#define BINARY_SUBSCR_TUPLE_INT 23 +#define BINARY_SUBSCR_DICT 24 +#define STORE_SUBSCR_ADAPTIVE 26 +#define STORE_SUBSCR_LIST_INT 27 +#define STORE_SUBSCR_DICT 28 +#define CALL_NO_KW_ADAPTIVE 29 +#define CALL_NO_KW_BUILTIN_O 34 +#define CALL_NO_KW_BUILTIN_FAST 36 +#define CALL_NO_KW_LEN 37 +#define CALL_NO_KW_ISINSTANCE 38 +#define CALL_NO_KW_PY_SIMPLE 39 +#define CALL_NO_KW_LIST_APPEND 40 +#define CALL_NO_KW_METHOD_DESCRIPTOR_O 41 +#define 
CALL_NO_KW_TYPE_1 42 +#define CALL_NO_KW_BUILTIN_CLASS_1 43 +#define CALL_NO_KW_METHOD_DESCRIPTOR_FAST 44 +#define JUMP_ABSOLUTE_QUICK 45 +#define LOAD_ATTR_ADAPTIVE 46 +#define LOAD_ATTR_INSTANCE_VALUE 47 +#define LOAD_ATTR_WITH_HINT 48 +#define LOAD_ATTR_SLOT 55 +#define LOAD_ATTR_MODULE 56 +#define LOAD_GLOBAL_ADAPTIVE 57 +#define LOAD_GLOBAL_MODULE 58 +#define LOAD_GLOBAL_BUILTIN 59 +#define LOAD_METHOD_ADAPTIVE 62 +#define LOAD_METHOD_CACHED 63 +#define LOAD_METHOD_CLASS 64 +#define LOAD_METHOD_MODULE 65 +#define LOAD_METHOD_NO_DICT 66 +#define STORE_ATTR_ADAPTIVE 67 +#define STORE_ATTR_INSTANCE_VALUE 72 +#define STORE_ATTR_SLOT 76 +#define STORE_ATTR_WITH_HINT 77 +#define LOAD_FAST__LOAD_FAST 78 +#define STORE_FAST__LOAD_FAST 79 +#define LOAD_FAST__LOAD_CONST 80 +#define LOAD_CONST__LOAD_FAST 81 +#define STORE_FAST__STORE_FAST 131 #define DO_TRACING 255 #ifdef NEED_OPCODE_JUMP_TABLES static uint32_t _PyOpcode_RelativeJump[8] = { diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index cd4f69c7aa149..c05add9638aa1 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -382,6 +382,8 @@ def _write_atomic(path, data, mode=0o666): # Python 3.11a4 3474 (Add RESUME opcode) # Python 3.11a5 3475 (Add RETURN_GENERATOR opcode) # Python 3.11a5 3476 (Add ASYNC_GEN_WRAP opcode) +# Python 3.11a5 3477 (Replace DUP_TOP/DUP_TOP_TWO with COPY and +# ROT_TWO/ROT_THREE/ROT_FOUR/ROT_N with SWAP) # Python 3.12 will start with magic number 3500 @@ -395,7 +397,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. -MAGIC_NUMBER = (3476).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3477).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c _PYCACHE = '__pycache__' diff --git a/Lib/opcode.py b/Lib/opcode.py index 1bd48eee8549a..37da05d76d734 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -55,11 +55,6 @@ def jabs_op(name, op): # Blank lines correspond to available opcodes def_op('POP_TOP', 1) -def_op('ROT_TWO', 2) -def_op('ROT_THREE', 3) -def_op('DUP_TOP', 4) -def_op('DUP_TOP_TWO', 5) -def_op('ROT_FOUR', 6) def_op('NOP', 9) def_op('UNARY_POSITIVE', 10) @@ -116,7 +111,7 @@ def jabs_op(name, op): name_op('DELETE_ATTR', 96) # "" name_op('STORE_GLOBAL', 97) # "" name_op('DELETE_GLOBAL', 98) # "" -def_op('ROT_N', 99) +def_op('SWAP', 99) def_op('LOAD_CONST', 100) # Index in const list hasconst.append(100) name_op('LOAD_NAME', 101) # Index in name list diff --git a/Lib/test/test__opcode.py b/Lib/test/test__opcode.py index f6b6b3d3532bd..7c1c0cfdb069b 100644 --- a/Lib/test/test__opcode.py +++ b/Lib/test/test__opcode.py @@ -11,7 +11,6 @@ class OpcodeTests(unittest.TestCase): def test_stack_effect(self): self.assertEqual(stack_effect(dis.opmap['POP_TOP']), -1) - self.assertEqual(stack_effect(dis.opmap['DUP_TOP_TWO']), 2) self.assertEqual(stack_effect(dis.opmap['BUILD_SLICE'], 0), -1) self.assertEqual(stack_effect(dis.opmap['BUILD_SLICE'], 1), -1) self.assertEqual(stack_effect(dis.opmap['BUILD_SLICE'], 3), -2) diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py index c65b0143e87d0..72590649ea80c 100644 --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -1195,8 +1195,8 @@ def _prepare_test_cases(): Instruction(opname='CALL_NO_KW', opcode=169, arg=1, argval=1, argrepr='', offset=156, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', 
opcode=1, arg=None, argval=None, argrepr='', offset=158, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=160, starts_line=25, is_jump_target=False, positions=None), - Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=162, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='DUP_TOP', opcode=4, arg=None, argval=None, argrepr='', offset=164, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=162, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=164, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='CALL_NO_KW', opcode=169, arg=3, argval=3, argrepr='', offset=166, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=168, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='JUMP_FORWARD', opcode=110, arg=25, argval=222, argrepr='to 222', offset=170, starts_line=None, is_jump_target=False, positions=None), diff --git a/Lib/test/test_peepholer.py b/Lib/test/test_peepholer.py index 8306c896a57f4..659f654b5c676 100644 --- a/Lib/test/test_peepholer.py +++ b/Lib/test/test_peepholer.py @@ -119,8 +119,8 @@ def f(): def test_pack_unpack(self): for line, elem in ( ('a, = a,', 'LOAD_CONST',), - ('a, b = a, b', 'ROT_TWO',), - ('a, b, c = a, b, c', 'ROT_THREE',), + ('a, b = a, b', 'SWAP',), + ('a, b, c = a, b, c', 'SWAP',), ): with self.subTest(line=line): code = compile(line,'','single') diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-25-17-40-07.bpo-46528.2Qmni9.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-25-17-40-07.bpo-46528.2Qmni9.rst new file mode 100644 index 0000000000000..f1639f8b3f06e --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-25-17-40-07.bpo-46528.2Qmni9.rst @@ -0,0 +1,3 @@ +Replace several stack manipulation instructions (``DUP_TOP``, +``DUP_TOP_TWO``, ``ROT_TWO``, ``ROT_THREE``, ``ROT_FOUR``, and ``ROT_N``) +with new :opcode:`COPY` and :opcode:`SWAP` instructions. 
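The NEWS entry above states the change; the interpreter and compiler diffs follow. The subtle part is the new ``swaptimize()`` peephole pass added to Python/compile.c below, which collapses a run of SWAPs into a minimal equivalent sequence by walking the cycles of the permutation they produce (see the cs.stackexchange link quoted in that code). A small Python model of the idea, where the list indices stand for stack positions with the top at index 0 (the name ``minimal_swaps`` is invented here for illustration, and this is only a sketch of the algorithm, not the C implementation itself)::

    def minimal_swaps(swap_args, depth):
        # "Run" the original SWAP sequence on the positions 0..depth-1
        # (SWAP opargs are 1-indexed, as in the interpreter).
        stack = list(range(depth))
        for arg in swap_args:
            stack[0], stack[arg - 1] = stack[arg - 1], stack[0]
        # Re-emit the same permutation cycle by cycle, routing every
        # exchange through the top of the stack.
        VISITED = None
        emitted = []
        for i in range(depth):
            if stack[i] is VISITED or stack[i] == i:
                continue
            j = i
            while True:
                if j:                      # swapping the top with itself is useless
                    emitted.append(j + 1)
                if stack[j] is VISITED:
                    break                  # completed the cycle
                next_j = stack[j]
                stack[j] = VISITED
                j = next_j
        # The C pass writes the new SWAPs from the end of the run backwards,
        # so the executable order is the reverse of the emission order.
        return emitted[::-1]

    assert minimal_swaps([2, 2], depth=2) == []      # two SWAP 2s cancel out
    assert minimal_swaps([3, 2, 2], depth=3) == [3]  # three SWAPs fold into one
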
diff --git a/Python/ceval.c b/Python/ceval.c index 29ca5e3750cc0..106e4080840f4 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1434,8 +1434,6 @@ eval_frame_handle_pending(PyThreadState *tstate) #define PEEK(n) (stack_pointer[-(n)]) #define SET_TOP(v) (stack_pointer[-1] = (v)) #define SET_SECOND(v) (stack_pointer[-2] = (v)) -#define SET_THIRD(v) (stack_pointer[-3] = (v)) -#define SET_FOURTH(v) (stack_pointer[-4] = (v)) #define BASIC_STACKADJ(n) (stack_pointer += n) #define BASIC_PUSH(v) (*stack_pointer++ = (v)) #define BASIC_POP() (*--stack_pointer) @@ -1920,54 +1918,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr DISPATCH(); } - TARGET(ROT_TWO) { - PyObject *top = TOP(); - PyObject *second = SECOND(); - SET_TOP(second); - SET_SECOND(top); - DISPATCH(); - } - - TARGET(ROT_THREE) { - PyObject *top = TOP(); - PyObject *second = SECOND(); - PyObject *third = THIRD(); - SET_TOP(second); - SET_SECOND(third); - SET_THIRD(top); - DISPATCH(); - } - - TARGET(ROT_FOUR) { - PyObject *top = TOP(); - PyObject *second = SECOND(); - PyObject *third = THIRD(); - PyObject *fourth = FOURTH(); - SET_TOP(second); - SET_SECOND(third); - SET_THIRD(fourth); - SET_FOURTH(top); - DISPATCH(); - } - - TARGET(DUP_TOP) { - PyObject *top = TOP(); - Py_INCREF(top); - PUSH(top); - DISPATCH(); - } - - TARGET(DUP_TOP_TWO) { - PyObject *top = TOP(); - PyObject *second = SECOND(); - Py_INCREF(top); - Py_INCREF(second); - STACK_GROW(2); - SET_TOP(top); - SET_SECOND(second); - DISPATCH(); - } - TARGET(UNARY_POSITIVE) { PyObject *value = TOP(); PyObject *res = PyNumber_Positive(value); @@ -5170,14 +5120,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr DISPATCH(); } - TARGET(ROT_N) { - PyObject *top = TOP(); - memmove(&PEEK(oparg - 1), &PEEK(oparg), - sizeof(PyObject*) * (oparg - 1)); - PEEK(oparg) = top; - DISPATCH(); - } - TARGET(COPY) { assert(oparg != 0); PyObject *peek = PEEK(oparg); @@ -5221,6 +5163,14 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, InterpreterFrame *frame, int thr } } + TARGET(SWAP) { + assert(oparg != 0); + PyObject *top = TOP(); + SET_TOP(PEEK(oparg)); + PEEK(oparg) = top; + DISPATCH(); + } + TARGET(EXTENDED_ARG) { int oldoparg = oparg; NEXTOPARG(); @@ -7380,7 +7330,7 @@ format_awaitable_error(PyThreadState *tstate, PyTypeObject *type, int prevprevop "that does not implement __await__: %.100s", type->tp_name); } - else if (prevopcode == WITH_EXCEPT_START || (prevopcode == CALL_NO_KW && prevprevopcode == DUP_TOP)) { + else if (prevopcode == WITH_EXCEPT_START || (prevopcode == CALL_NO_KW && prevprevopcode == LOAD_CONST)) { _PyErr_Format(tstate, PyExc_TypeError, "'async with' received an object from __aexit__ " "that does not implement __await__: %.100s", diff --git a/Python/compile.c b/Python/compile.c index feb9fcac51254..f1049fd931e14 100644 --- a/Python/compile.c +++ b/Python/compile.c @@ -867,14 +867,8 @@ stack_effect(int opcode, int oparg, int jump) /* Stack manipulation */ case POP_TOP: return -1; - case ROT_TWO: - case ROT_THREE: - case ROT_FOUR: + case SWAP: return 0; - case DUP_TOP: - return 1; - case DUP_TOP_TWO: - return 2; /* Unary operators */ case UNARY_POSITIVE: @@ -1094,8 +1088,6 @@ stack_effect(int opcode, int oparg, int jump) case MATCH_SEQUENCE: case MATCH_KEYS: return 1; - case ROT_N: - return 0; case COPY: return 1; case BINARY_OP: @@ -1829,8 +1821,8 @@ compiler_pop_fblock(struct compiler *c, enum fblocktype t, basicblock *b) static int compiler_call_exit_with_nones(struct compiler *c) { 
ADDOP_LOAD_CONST(c, Py_None); - ADDOP(c, DUP_TOP); - ADDOP(c, DUP_TOP); + ADDOP_LOAD_CONST(c, Py_None); + ADDOP_LOAD_CONST(c, Py_None); ADDOP_I(c, CALL_NO_KW, 3); return 1; } @@ -1890,7 +1882,7 @@ compiler_unwind_fblock(struct compiler *c, struct fblockinfo *info, case FOR_LOOP: /* Pop the iterator */ if (preserve_tos) { - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); } ADDOP(c, POP_TOP); return 1; @@ -1920,11 +1912,11 @@ compiler_unwind_fblock(struct compiler *c, struct fblockinfo *info, case FINALLY_END: if (preserve_tos) { - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); } ADDOP(c, POP_TOP); /* exc_value */ if (preserve_tos) { - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); } ADDOP(c, POP_BLOCK); ADDOP(c, POP_EXCEPT); @@ -1935,7 +1927,7 @@ compiler_unwind_fblock(struct compiler *c, struct fblockinfo *info, SET_LOC(c, (stmt_ty)info->fb_datum); ADDOP(c, POP_BLOCK); if (preserve_tos) { - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); } if(!compiler_call_exit_with_nones(c)) { return 0; @@ -1957,7 +1949,7 @@ compiler_unwind_fblock(struct compiler *c, struct fblockinfo *info, ADDOP(c, POP_BLOCK); } if (preserve_tos) { - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); } ADDOP(c, POP_BLOCK); ADDOP(c, POP_EXCEPT); @@ -1970,7 +1962,7 @@ compiler_unwind_fblock(struct compiler *c, struct fblockinfo *info, case POP_VALUE: if (preserve_tos) { - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); } ADDOP(c, POP_TOP); return 1; @@ -2647,7 +2639,7 @@ compiler_class(struct compiler *c, stmt_ty s) assert(i == 0); ADDOP_I(c, LOAD_CLOSURE, i); - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); str = PyUnicode_InternFromString("__classcell__"); if (!str || !compiler_nameop(c, str, Store)) { Py_XDECREF(str); @@ -2843,8 +2835,8 @@ compiler_jump_if(struct compiler *c, expr_ty e, basicblock *next, int cond) for (i = 0; i < n; i++) { VISIT(c, expr, (expr_ty)asdl_seq_GET(e->v.Compare.comparators, i)); - ADDOP(c, DUP_TOP); - ADDOP(c, ROT_THREE); + ADDOP_I(c, SWAP, 2); + ADDOP_I(c, COPY, 2); ADDOP_COMPARE(c, asdl_seq_GET(e->v.Compare.ops, i)); ADDOP_JUMP(c, POP_JUMP_IF_FALSE, cleanup); NEXT_BLOCK(c); @@ -3500,9 +3492,9 @@ compiler_try_except(struct compiler *c, stmt_ty s) [] POP_BLOCK [] JUMP_FORWARD L0 - [exc] L1: DUP_TOP ) save copy of the original exception + [exc] L1: COPY 1 ) save copy of the original exception [orig, exc] BUILD_LIST ) list for raised/reraised excs ("result") - [orig, exc, res] ROT_TWO + [orig, exc, res] SWAP 2 [orig, res, exc] [orig, res, exc, E1] JUMP_IF_NOT_EG_MATCH L2 @@ -3522,12 +3514,12 @@ compiler_try_except(struct compiler *c, stmt_ty s) [orig, res, rest] Ln+1: LIST_APPEND 1 ) add unhandled exc to res (could be None) [orig, res] PREP_RERAISE_STAR - [exc] DUP_TOP + [exc] COPY 1 [exc, exc] POP_JUMP_IF_NOT_NONE RER [exc] POP_TOP [] JUMP_FORWARD L0 - [exc] RER: ROT_TWO + [exc] RER: SWAP 2 [exc, prev_exc_info] POP_EXCEPT [exc] RERAISE 0 @@ -3592,19 +3584,19 @@ compiler_try_star_except(struct compiler *c, stmt_ty s) if (i == 0) { /* Push the original EG into the stack */ /* - [exc] DUP_TOP + [exc] COPY 1 [orig, exc] */ - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); /* create empty list for exceptions raised/reraise in the except* blocks */ /* [orig, exc] BUILD_LIST - [orig, exc, []] ROT_TWO + [orig, exc, []] SWAP 2 [orig, [], exc] */ ADDOP_I(c, BUILD_LIST, 0); - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); } if (handler->v.ExceptHandler.type) { VISIT(c, expr, handler->v.ExceptHandler.type); @@ -3692,7 +3684,7 @@ compiler_try_star_except(struct compiler *c, stmt_ty s) compiler_use_next_block(c, reraise_star); ADDOP(c, PREP_RERAISE_STAR); - 
ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); ADDOP_JUMP(c, POP_JUMP_IF_NOT_NONE, reraise); NEXT_BLOCK(c); @@ -3703,7 +3695,7 @@ compiler_try_star_except(struct compiler *c, stmt_ty s) ADDOP_JUMP(c, JUMP_FORWARD, end); compiler_use_next_block(c, reraise); ADDOP(c, POP_BLOCK); - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); ADDOP(c, POP_EXCEPT); ADDOP_I(c, RERAISE, 0); compiler_use_next_block(c, cleanup); @@ -3761,7 +3753,7 @@ compiler_import_as(struct compiler *c, identifier name, identifier asname) if (dot == -1) { break; } - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); ADDOP(c, POP_TOP); } if (!compiler_nameop(c, asname, Store)) { @@ -3961,8 +3953,9 @@ compiler_visit_stmt(struct compiler *c, stmt_ty s) n = asdl_seq_LEN(s->v.Assign.targets); VISIT(c, expr, s->v.Assign.value); for (i = 0; i < n; i++) { - if (i < n - 1) - ADDOP(c, DUP_TOP); + if (i < n - 1) { + ADDOP_I(c, COPY, 1); + } VISIT(c, expr, (expr_ty)asdl_seq_GET(s->v.Assign.targets, i)); } @@ -4532,8 +4525,8 @@ compiler_compare(struct compiler *c, expr_ty e) for (i = 0; i < n; i++) { VISIT(c, expr, (expr_ty)asdl_seq_GET(e->v.Compare.comparators, i)); - ADDOP(c, DUP_TOP); - ADDOP(c, ROT_THREE); + ADDOP_I(c, SWAP, 2); + ADDOP_I(c, COPY, 2); ADDOP_COMPARE(c, asdl_seq_GET(e->v.Compare.ops, i)); ADDOP_JUMP(c, JUMP_IF_FALSE_OR_POP, cleanup); NEXT_BLOCK(c); @@ -4545,7 +4538,7 @@ compiler_compare(struct compiler *c, expr_ty e) return 0; ADDOP_JUMP_NOLINE(c, JUMP_FORWARD, end); compiler_use_next_block(c, cleanup); - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); ADDOP(c, POP_TOP); compiler_use_next_block(c, end); } @@ -5689,7 +5682,7 @@ compiler_visit_expr1(struct compiler *c, expr_ty e) switch (e->kind) { case NamedExpr_kind: VISIT(c, expr, e->v.NamedExpr.value); - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); VISIT(c, expr, e->v.NamedExpr.target); break; case BoolOp_kind: @@ -5854,7 +5847,7 @@ compiler_augassign(struct compiler *c, stmt_ty s) switch (e->kind) { case Attribute_kind: VISIT(c, expr, e->v.Attribute.value); - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); int old_lineno = c->u->u_lineno; c->u->u_lineno = e->end_lineno; ADDOP_NAME(c, LOAD_ATTR, e->v.Attribute.attr, names); @@ -5863,7 +5856,8 @@ compiler_augassign(struct compiler *c, stmt_ty s) case Subscript_kind: VISIT(c, expr, e->v.Subscript.value); VISIT(c, expr, e->v.Subscript.slice); - ADDOP(c, DUP_TOP_TWO); + ADDOP_I(c, COPY, 2); + ADDOP_I(c, COPY, 2); ADDOP(c, BINARY_SUBSCR); break; case Name_kind: @@ -5890,11 +5884,12 @@ compiler_augassign(struct compiler *c, stmt_ty s) switch (e->kind) { case Attribute_kind: c->u->u_lineno = e->end_lineno; - ADDOP(c, ROT_TWO); + ADDOP_I(c, SWAP, 2); ADDOP_NAME(c, STORE_ATTR, e->v.Attribute.attr, names); break; case Subscript_kind: - ADDOP(c, ROT_THREE); + ADDOP_I(c, SWAP, 3); + ADDOP_I(c, SWAP, 2); ADDOP(c, STORE_SUBSCR); break; case Name_kind: @@ -6246,6 +6241,16 @@ compiler_error_duplicate_store(struct compiler *c, identifier n) return compiler_error(c, "multiple assignments to name %R in pattern", n); } +// Duplicate the effect of 3.10's ROT_* instructions using SWAPs. 
+static int +pattern_helper_rotate(struct compiler *c, Py_ssize_t count) +{ + while (1 < count) { + ADDOP_I(c, SWAP, count--); + } + return 1; +} + static int pattern_helper_store_name(struct compiler *c, identifier n, pattern_context *pc) { @@ -6265,7 +6270,8 @@ pattern_helper_store_name(struct compiler *c, identifier n, pattern_context *pc) return compiler_error_duplicate_store(c, n); } // Rotate this object underneath any items we need to preserve: - ADDOP_I(c, ROT_N, pc->on_top + PyList_GET_SIZE(pc->stores) + 1); + Py_ssize_t rotations = pc->on_top + PyList_GET_SIZE(pc->stores) + 1; + RETURN_IF_FALSE(pattern_helper_rotate(c, rotations)); return !PyList_Append(pc->stores, n); } @@ -6334,7 +6340,7 @@ pattern_helper_sequence_subscr(struct compiler *c, asdl_pattern_seq *patterns, assert(WILDCARD_STAR_CHECK(pattern)); continue; } - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); if (i < star) { ADDOP_LOAD_CONST_NEW(c, PyLong_FromSsize_t(i)); } @@ -6383,7 +6389,7 @@ compiler_pattern_as(struct compiler *c, pattern_ty p, pattern_context *pc) } // Need to make a copy for (possibly) storing later: pc->on_top++; - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); RETURN_IF_FALSE(compiler_pattern(c, p->v.MatchAs.pattern, pc)); // Success! Store it: pc->on_top--; @@ -6458,7 +6464,7 @@ compiler_pattern_class(struct compiler *c, pattern_ty p, pattern_context *pc) } ADDOP_LOAD_CONST_NEW(c, attr_names); ADDOP_I(c, MATCH_CLASS, nargs); - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); ADDOP_LOAD_CONST(c, Py_None); ADDOP_I(c, IS_OP, 1); // TOS is now a tuple of (nargs + nattrs) attributes (or None): @@ -6576,7 +6582,7 @@ compiler_pattern_mapping(struct compiler *c, pattern_ty p, pattern_context *pc) ADDOP(c, MATCH_KEYS); // There's now a tuple of keys and a tuple of values on top of the subject: pc->on_top += 2; - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); ADDOP_LOAD_CONST(c, Py_None); ADDOP_I(c, IS_OP, 1); RETURN_IF_FALSE(jump_to_fail_pop(c, pc, POP_JUMP_IF_FALSE)); @@ -6600,13 +6606,12 @@ compiler_pattern_mapping(struct compiler *c, pattern_ty p, pattern_context *pc) // for key in TOS: // del rest[key] ADDOP_I(c, BUILD_MAP, 0); // [subject, keys, empty] - ADDOP(c, ROT_THREE); // [empty, subject, keys] - ADDOP(c, ROT_TWO); // [empty, keys, subject] + ADDOP_I(c, SWAP, 3); // [empty, keys, subject] ADDOP_I(c, DICT_UPDATE, 2); // [copy, keys] ADDOP_I(c, UNPACK_SEQUENCE, size); // [copy, keys...] while (size) { ADDOP_I(c, COPY, 1 + size--); // [copy, keys..., copy] - ADDOP(c, ROT_TWO); // [copy, keys..., copy, key] + ADDOP_I(c, SWAP, 2); // [copy, keys..., copy, key] ADDOP(c, DELETE_SUBSCR); // [copy, keys...] } RETURN_IF_FALSE(pattern_helper_store_name(c, star_target, pc)); @@ -6651,7 +6656,7 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) pc->fail_pop = NULL; pc->fail_pop_size = 0; pc->on_top = 0; - if (!compiler_addop(c, DUP_TOP) || !compiler_pattern(c, alt, pc)) { + if (!compiler_addop_i(c, COPY, 1) || !compiler_pattern(c, alt, pc)) { goto error; } // Success! @@ -6683,7 +6688,8 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) // this; the current solution is potentially very // inefficient when each alternative subpattern binds lots // of names in different orders. It's fine for reasonable - // cases, though. + // cases, though, and the peephole optimizer will ensure + // that the final code is as efficient as possible. 
assert(istores < icontrol); Py_ssize_t rotations = istores + 1; // Perform the same rotation on pc->stores: @@ -6702,9 +6708,10 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) // rotated = pc_stores[:rotations] // del pc_stores[:rotations] // pc_stores[icontrol-istores:icontrol-istores] = rotated - // Do the same thing to the stack, using several ROT_Ns: + // Do the same thing to the stack, using several + // rotations: while (rotations--) { - if (!compiler_addop_i(c, ROT_N, icontrol + 1)) { + if (!pattern_helper_rotate(c, icontrol + 1)){ goto error; } } @@ -6730,7 +6737,7 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) } compiler_use_next_block(c, end); Py_ssize_t nstores = PyList_GET_SIZE(control); - // There's a bunch of stuff on the stack between any where the new stores + // There's a bunch of stuff on the stack between where the new stores // are and where they need to be: // - The other stores. // - A copy of the subject. @@ -6739,7 +6746,7 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) Py_ssize_t nrots = nstores + 1 + pc->on_top + PyList_GET_SIZE(pc->stores); for (Py_ssize_t i = 0; i < nstores; i++) { // Rotate this capture to its proper place on the stack: - if (!compiler_addop_i(c, ROT_N, nrots)) { + if (!pattern_helper_rotate(c, nrots)) { goto error; } // Update the list of previous stores with this new name, checking for @@ -6898,7 +6905,7 @@ compiler_match_inner(struct compiler *c, stmt_ty s, pattern_context *pc) SET_LOC(c, m->pattern); // Only copy the subject if we're *not* on the last case: if (i != cases - has_default - 1) { - ADDOP(c, DUP_TOP); + ADDOP_I(c, COPY, 1); } RETURN_IF_FALSE(pc->stores = PyList_New(0)); // Irrefutable cases must be either guarded, last, or both: @@ -8433,36 +8440,99 @@ fold_tuple_on_constants(struct compiler *c, return 0; } +#define VISITED (-1) -// Eliminate n * ROT_N(n). -static void -fold_rotations(struct instr *inst, int n) +// Replace an arbitrary run of SWAPs and NOPs with an optimal one that has the +// same effect. Return the number of instructions that were optimized. +static int +swaptimize(basicblock *block, int ix) { - for (int i = 0; i < n; i++) { - int rot; - switch (inst[i].i_opcode) { - case ROT_N: - rot = inst[i].i_oparg; - break; - case ROT_FOUR: - rot = 4; - break; - case ROT_THREE: - rot = 3; - break; - case ROT_TWO: - rot = 2; - break; - default: - return; + // NOTE: "./python -m test test_patma" serves as a good, quick stress test + // for this function. Make sure to blow away cached *.pyc files first! 
+ assert(ix < block->b_iused); + struct instr *instructions = &block->b_instr[ix]; + // Find the length of the current sequence of SWAPs and NOPs, and record the + // maximum depth of the stack manipulations: + assert(instructions[0].i_opcode == SWAP); + int depth = instructions[0].i_oparg; + int len = 0; + int more = false; + while (++len < block->b_iused - ix) { + int opcode = instructions[len].i_opcode; + if (opcode == SWAP) { + depth = Py_MAX(depth, instructions[len].i_oparg); + more = true; } - if (rot != n) { - return; + else if (opcode != NOP) { + break; } } - for (int i = 0; i < n; i++) { - inst[i].i_opcode = NOP; + // It's already optimal if there's only one SWAP: + if (!more) { + return 0; + } + // Create an array with elements {0, 1, 2, ..., depth - 1}: + int *stack = PyMem_Malloc(depth * sizeof(int)); + for (int i = 0; i < depth; i++) { + stack[i] = i; + } + // Simulate the combined effect of these instructions by "running" them on + // our "stack": + for (int i = 0; i < len; i++) { + if (instructions[i].i_opcode == SWAP) { + int oparg = instructions[i].i_oparg; + int top = stack[0]; + // SWAPs are 1-indexed: + stack[0] = stack[oparg - 1]; + stack[oparg - 1] = top; + } + } + // Now we can begin! Our approach here is based on a solution to a closely + // related problem (https://cs.stackexchange.com/a/13938). It's easiest to + // think of this algorithm as determining the steps needed to efficiently + // "un-shuffle" our stack. By performing the moves in *reverse* order, + // though, we can efficiently *shuffle* it! For this reason, we will be + // replacing instructions starting from the *end* of the run. Since the + // solution is optimal, we don't need to worry about running out of space: + int current = len - 1; + for (int i = 0; i < depth; i++) { + // Skip items that have already been visited, or just happen to be in + // the correct location: + if (stack[i] == VISITED || stack[i] == i) { + continue; + } + // Okay, we've found an item that hasn't been visited. It forms a cycle + // with other items; traversing the cycle and swapping each item with + // the next will put them all in the correct place. The weird + // loop-and-a-half is necessary to insert 0 into every cycle, since we + // can only swap from that position: + int j = i; + while (true) { + // Skip the actual swap if our item is zero, since swapping the top + // item with itself is pointless: + if (j) { + assert(0 <= current); + // SWAPs are 1-indexed: + instructions[current].i_opcode = SWAP; + instructions[current--].i_oparg = j + 1; + } + if (stack[j] == VISITED) { + // Completed the cycle: + assert(j == i); + break; + } + int next_j = stack[j]; + stack[j] = VISITED; + j = next_j; + } } + // NOP out any unused instructions: + while (0 <= current) { + instructions[current--].i_opcode = NOP; + } + // Done! 
Return the number of optimized instructions: + PyMem_Free(stack); + return len - 1; } // Attempt to eliminate jumps to jumps by updating inst to jump to @@ -8591,12 +8661,16 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) bb->b_instr[i+1].i_opcode = NOP; break; case 2: - inst->i_opcode = ROT_TWO; + inst->i_opcode = SWAP; + inst->i_oparg = 2; bb->b_instr[i+1].i_opcode = NOP; + i--; break; case 3: - inst->i_opcode = ROT_THREE; - bb->b_instr[i+1].i_opcode = ROT_TWO; + inst->i_opcode = SWAP; + inst->i_oparg = 3; + bb->b_instr[i+1].i_opcode = NOP; + i--; } break; } @@ -8704,30 +8778,12 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) i -= jump_thread(inst, target, FOR_ITER); } break; - case ROT_N: - switch (oparg) { - case 0: - case 1: - inst->i_opcode = NOP; - continue; - case 2: - inst->i_opcode = ROT_TWO; - break; - case 3: - inst->i_opcode = ROT_THREE; - break; - case 4: - inst->i_opcode = ROT_FOUR; - break; - } - if (i >= oparg - 1) { - fold_rotations(inst - oparg + 1, oparg); - } - break; - case COPY: + case SWAP: if (oparg == 1) { - inst->i_opcode = DUP_TOP; + inst->i_opcode = NOP; + break; } + i += swaptimize(bb, i); break; default: /* All HAS_CONST opcodes should be handled with LOAD_CONST */ diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index c19cd0e88468a..c6e6d826aee60 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -1,45 +1,40 @@ static void *opcode_targets[256] = { &&_unknown_opcode, &&TARGET_POP_TOP, - &&TARGET_ROT_TWO, - &&TARGET_ROT_THREE, - &&TARGET_DUP_TOP, - &&TARGET_DUP_TOP_TWO, - &&TARGET_ROT_FOUR, &&TARGET_BINARY_OP_ADAPTIVE, &&TARGET_BINARY_OP_ADD_INT, - &&TARGET_NOP, - &&TARGET_UNARY_POSITIVE, - &&TARGET_UNARY_NEGATIVE, - &&TARGET_UNARY_NOT, &&TARGET_BINARY_OP_ADD_FLOAT, &&TARGET_BINARY_OP_ADD_UNICODE, - &&TARGET_UNARY_INVERT, &&TARGET_BINARY_OP_INPLACE_ADD_UNICODE, &&TARGET_BINARY_OP_MULTIPLY_INT, &&TARGET_BINARY_OP_MULTIPLY_FLOAT, + &&TARGET_NOP, + &&TARGET_UNARY_POSITIVE, + &&TARGET_UNARY_NEGATIVE, + &&TARGET_UNARY_NOT, &&TARGET_BINARY_OP_SUBTRACT_INT, &&TARGET_BINARY_OP_SUBTRACT_FLOAT, + &&TARGET_UNARY_INVERT, &&TARGET_COMPARE_OP_ADAPTIVE, &&TARGET_COMPARE_OP_FLOAT_JUMP, &&TARGET_COMPARE_OP_INT_JUMP, &&TARGET_COMPARE_OP_STR_JUMP, - &&TARGET_BINARY_SUBSCR, &&TARGET_BINARY_SUBSCR_ADAPTIVE, &&TARGET_BINARY_SUBSCR_GETITEM, &&TARGET_BINARY_SUBSCR_LIST_INT, &&TARGET_BINARY_SUBSCR_TUPLE_INT, - &&TARGET_GET_LEN, - &&TARGET_MATCH_MAPPING, - &&TARGET_MATCH_SEQUENCE, - &&TARGET_MATCH_KEYS, &&TARGET_BINARY_SUBSCR_DICT, - &&TARGET_PUSH_EXC_INFO, + &&TARGET_BINARY_SUBSCR, &&TARGET_STORE_SUBSCR_ADAPTIVE, &&TARGET_STORE_SUBSCR_LIST_INT, &&TARGET_STORE_SUBSCR_DICT, &&TARGET_CALL_NO_KW_ADAPTIVE, + &&TARGET_GET_LEN, + &&TARGET_MATCH_MAPPING, + &&TARGET_MATCH_SEQUENCE, + &&TARGET_MATCH_KEYS, &&TARGET_CALL_NO_KW_BUILTIN_O, + &&TARGET_PUSH_EXC_INFO, &&TARGET_CALL_NO_KW_BUILTIN_FAST, &&TARGET_CALL_NO_KW_LEN, &&TARGET_CALL_NO_KW_ISINSTANCE, @@ -48,39 +43,44 @@ static void *opcode_targets[256] = { &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_O, &&TARGET_CALL_NO_KW_TYPE_1, &&TARGET_CALL_NO_KW_BUILTIN_CLASS_1, + &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_FAST, + &&TARGET_JUMP_ABSOLUTE_QUICK, + &&TARGET_LOAD_ATTR_ADAPTIVE, + &&TARGET_LOAD_ATTR_INSTANCE_VALUE, + &&TARGET_LOAD_ATTR_WITH_HINT, &&TARGET_WITH_EXCEPT_START, &&TARGET_GET_AITER, &&TARGET_GET_ANEXT, &&TARGET_BEFORE_ASYNC_WITH, &&TARGET_BEFORE_WITH, &&TARGET_END_ASYNC_FOR, - &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_FAST, - &&TARGET_JUMP_ABSOLUTE_QUICK, - 
&&TARGET_LOAD_ATTR_ADAPTIVE, - &&TARGET_LOAD_ATTR_INSTANCE_VALUE, - &&TARGET_LOAD_ATTR_WITH_HINT, - &&TARGET_STORE_SUBSCR, - &&TARGET_DELETE_SUBSCR, &&TARGET_LOAD_ATTR_SLOT, &&TARGET_LOAD_ATTR_MODULE, &&TARGET_LOAD_GLOBAL_ADAPTIVE, &&TARGET_LOAD_GLOBAL_MODULE, &&TARGET_LOAD_GLOBAL_BUILTIN, + &&TARGET_STORE_SUBSCR, + &&TARGET_DELETE_SUBSCR, &&TARGET_LOAD_METHOD_ADAPTIVE, + &&TARGET_LOAD_METHOD_CACHED, + &&TARGET_LOAD_METHOD_CLASS, + &&TARGET_LOAD_METHOD_MODULE, + &&TARGET_LOAD_METHOD_NO_DICT, + &&TARGET_STORE_ATTR_ADAPTIVE, &&TARGET_GET_ITER, &&TARGET_GET_YIELD_FROM_ITER, &&TARGET_PRINT_EXPR, &&TARGET_LOAD_BUILD_CLASS, - &&TARGET_LOAD_METHOD_CACHED, + &&TARGET_STORE_ATTR_INSTANCE_VALUE, &&TARGET_GET_AWAITABLE, &&TARGET_LOAD_ASSERTION_ERROR, &&TARGET_RETURN_GENERATOR, - &&TARGET_LOAD_METHOD_CLASS, - &&TARGET_LOAD_METHOD_MODULE, - &&TARGET_LOAD_METHOD_NO_DICT, - &&TARGET_STORE_ATTR_ADAPTIVE, - &&TARGET_STORE_ATTR_INSTANCE_VALUE, &&TARGET_STORE_ATTR_SLOT, + &&TARGET_STORE_ATTR_WITH_HINT, + &&TARGET_LOAD_FAST__LOAD_FAST, + &&TARGET_STORE_FAST__LOAD_FAST, + &&TARGET_LOAD_FAST__LOAD_CONST, + &&TARGET_LOAD_CONST__LOAD_FAST, &&TARGET_LIST_TO_TUPLE, &&TARGET_RETURN_VALUE, &&TARGET_IMPORT_STAR, @@ -98,7 +98,7 @@ static void *opcode_targets[256] = { &&TARGET_DELETE_ATTR, &&TARGET_STORE_GLOBAL, &&TARGET_DELETE_GLOBAL, - &&TARGET_ROT_N, + &&TARGET_SWAP, &&TARGET_LOAD_CONST, &&TARGET_LOAD_NAME, &&TARGET_BUILD_TUPLE, @@ -130,7 +130,7 @@ static void *opcode_targets[256] = { &&TARGET_POP_JUMP_IF_NOT_NONE, &&TARGET_POP_JUMP_IF_NONE, &&TARGET_RAISE_VARARGS, - &&TARGET_STORE_ATTR_WITH_HINT, + &&TARGET_STORE_FAST__STORE_FAST, &&TARGET_MAKE_FUNCTION, &&TARGET_BUILD_SLICE, &&TARGET_JUMP_NO_INTERRUPT, @@ -139,20 +139,20 @@ static void *opcode_targets[256] = { &&TARGET_LOAD_DEREF, &&TARGET_STORE_DEREF, &&TARGET_DELETE_DEREF, - &&TARGET_LOAD_FAST__LOAD_FAST, - &&TARGET_STORE_FAST__LOAD_FAST, + &&_unknown_opcode, + &&_unknown_opcode, &&TARGET_CALL_FUNCTION_EX, - &&TARGET_LOAD_FAST__LOAD_CONST, + &&_unknown_opcode, &&TARGET_EXTENDED_ARG, &&TARGET_LIST_APPEND, &&TARGET_SET_ADD, &&TARGET_MAP_ADD, &&TARGET_LOAD_CLASSDEREF, &&TARGET_COPY_FREE_VARS, - &&TARGET_LOAD_CONST__LOAD_FAST, + &&_unknown_opcode, &&TARGET_RESUME, &&TARGET_MATCH_CLASS, - &&TARGET_STORE_FAST__STORE_FAST, + &&_unknown_opcode, &&_unknown_opcode, &&TARGET_FORMAT_VALUE, &&TARGET_BUILD_CONST_KEY_MAP, From webhook-mailer at python.org Wed Jan 26 18:22:13 2022 From: webhook-mailer at python.org (vstinner) Date: Wed, 26 Jan 2022 23:22:13 -0000 Subject: [Python-checkins] bpo-38472: setup.py uses LC_ALL=C to check the C compiler (GH-30929) Message-ID: https://github.com/python/cpython/commit/a9503ac39474a9cb1b1935ddf159c0d9672b04b6 commit: a9503ac39474a9cb1b1935ddf159c0d9672b04b6 branch: main author: Victor Stinner committer: vstinner date: 2022-01-27T00:22:04+01:00 summary: bpo-38472: setup.py uses LC_ALL=C to check the C compiler (GH-30929) Fix GCC detection in setup.py when cross-compiling. The C compiler is now run with LC_ALL=C. Previously, the detection failed with a German locale. files: A Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst M setup.py diff --git a/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst b/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst new file mode 100644 index 0000000000000..4e0ee70bdc513 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst @@ -0,0 +1,2 @@ +Fix GCC detection in setup.py when cross-compiling. The C compiler is now +run with LC_ALL=C. 
Previously, the detection failed with a German locale. diff --git a/setup.py b/setup.py index e30674f31cdb8..e47b2ab90958b 100644 --- a/setup.py +++ b/setup.py @@ -848,7 +848,9 @@ def add_cross_compiling_paths(self): tmpfile = os.path.join(self.build_temp, 'ccpaths') if not os.path.exists(self.build_temp): os.makedirs(self.build_temp) - ret = run_command('%s -E -v - %s 1>/dev/null' % (CC, tmpfile)) + # bpo-38472: With a German locale, GCC returns "gcc-Version 9.1.0 + # (GCC)", whereas it returns "gcc version 9.1.0" with the C locale. + ret = run_command('LC_ALL=C %s -E -v - %s 1>/dev/null' % (CC, tmpfile)) is_gcc = False is_clang = False in_incdirs = False From webhook-mailer at python.org Wed Jan 26 18:35:48 2022 From: webhook-mailer at python.org (vstinner) Date: Wed, 26 Jan 2022 23:35:48 -0000 Subject: [Python-checkins] make regen-all now suggests running: make autoconf (GH-30893) Message-ID: https://github.com/python/cpython/commit/13194084b40fe6579a70248b7be0665ff4f08374 commit: 13194084b40fe6579a70248b7be0665ff4f08374 branch: main author: Victor Stinner committer: vstinner date: 2022-01-27T00:35:40+01:00 summary: make regen-all now suggests running: make autoconf (GH-30893) "make autoconf" also runs autoheader, whereas "autoconf" does not. files: M Makefile.pre.in diff --git a/Makefile.pre.in b/Makefile.pre.in index 75482f9c1c56b..55f09c6e74b20 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1185,7 +1185,7 @@ regen-all: regen-opcode regen-opcode-targets regen-typeslots \ regen-token regen-ast regen-keyword regen-frozen clinic \ regen-pegen-metaparser regen-pegen regen-test-frozenmain @echo - @echo "Note: make regen-stdlib-module-names and autoconf should be run manually" + @echo "Note: make regen-stdlib-module-names and make autoconf should be run manually" ############################################################################ # Special rules for object files From webhook-mailer at python.org Wed Jan 26 18:49:14 2022 From: webhook-mailer at python.org (pablogsal) Date: Wed, 26 Jan 2022 23:49:14 -0000 Subject: [Python-checkins] bpo-46502: Remove "How do I tell incomplete input" from FAQ (GH-30925) Message-ID: https://github.com/python/cpython/commit/f0a648152f2d8011f47cc49873438ebaf01d3f82 commit: f0a648152f2d8011f47cc49873438ebaf01d3f82 branch: main author: Mateusz ?oskot committer: pablogsal date: 2022-01-26T23:49:06Z summary: bpo-46502: Remove "How do I tell incomplete input" from FAQ (GH-30925) Since, - Py_CompileString no longer allows to distinguish "incomplete input" from "invalid input" - there is no alternative solution available from the Python C API due to how the new parser works (rewritten in 3.9) - the only supported way is to manually import the codeop module from C and use its API as IDLE does, and accept its own complications it is desirable to remove this Q&A from the official FAQ. files: M Doc/faq/extending.rst diff --git a/Doc/faq/extending.rst b/Doc/faq/extending.rst index fd32b097335e5..1d2aca6f4c8d9 100644 --- a/Doc/faq/extending.rst +++ b/Doc/faq/extending.rst @@ -254,7 +254,6 @@ For Red Hat, install the python-devel RPM to get the necessary files. For Debian, run ``apt-get install python-dev``. - How do I tell "incomplete input" from "invalid input"? ------------------------------------------------------ @@ -273,127 +272,6 @@ you. You can also set the :c:func:`PyOS_ReadlineFunctionPointer` to point at you custom input function. See ``Modules/readline.c`` and ``Parser/myreadline.c`` for more hints. 
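The block removed below is the old recipe, which told the two cases apart by comparing SyntaxError message strings, an approach the commit message above explains no longer works. The commit message instead points at the ``codeop`` module as the supported route (it is what IDLE uses). For orientation, a minimal Python-level sketch of that check follows; the embedded-C workflow the removed text described would import codeop from C and call the same API, and the expected results in the comments reflect codeop's documented contract, where ``compile_command()`` returns None for incomplete input and raises for invalid input::

    import codeop

    def classify(source):
        try:
            code = codeop.compile_command(source, "<input>", "single")
        except (SyntaxError, ValueError, OverflowError):
            return "invalid"
        return "incomplete" if code is None else "complete"

    print(classify("if True:"))                # incomplete: more lines may follow
    print(classify("if True:\n    x = 1\n"))   # complete: compiles to a code object
    print(classify("if True"))                 # invalid: missing colon
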
-However sometimes you have to run the embedded Python interpreter in the same -thread as your rest application and you can't allow the -:c:func:`PyRun_InteractiveLoop` to stop while waiting for user input. -A solution is trying to compile the received string with -:c:func:`Py_CompileString`. If it compiles without errors, try to execute the -returned code object by calling :c:func:`PyEval_EvalCode`. Otherwise save the -input for later. If the compilation fails, find out if it's an error or just -more input is required - by extracting the message string from the exception -tuple and comparing it to the string "unexpected EOF while parsing". Here is a -complete example using the GNU readline library (you may want to ignore -**SIGINT** while calling readline()):: - - #include - #include - - #define PY_SSIZE_T_CLEAN - #include - - int main (int argc, char* argv[]) - { - int i, j, done = 0; /* lengths of line, code */ - char ps1[] = ">>> "; - char ps2[] = "... "; - char *prompt = ps1; - char *msg, *line, *code = NULL; - PyObject *src, *glb, *loc; - PyObject *exc, *val, *trb, *obj, *dum; - - Py_Initialize (); - loc = PyDict_New (); - glb = PyDict_New (); - PyDict_SetItemString (glb, "__builtins__", PyEval_GetBuiltins ()); - - while (!done) - { - line = readline (prompt); - - if (NULL == line) /* Ctrl-D pressed */ - { - done = 1; - } - else - { - i = strlen (line); - - if (i > 0) - add_history (line); /* save non-empty lines */ - - if (NULL == code) /* nothing in code yet */ - j = 0; - else - j = strlen (code); - - code = realloc (code, i + j + 2); - if (NULL == code) /* out of memory */ - exit (1); - - if (0 == j) /* code was empty, so */ - code[0] = '\0'; /* keep strncat happy */ - - strncat (code, line, i); /* append line to code */ - code[i + j] = '\n'; /* append '\n' to code */ - code[i + j + 1] = '\0'; - - src = Py_CompileString (code, "", Py_single_input); - - if (NULL != src) /* compiled just fine - */ - { - if (ps1 == prompt || /* ">>> " or */ - '\n' == code[i + j - 1]) /* "... " and double '\n' */ - { /* so execute it */ - dum = PyEval_EvalCode (src, glb, loc); - Py_XDECREF (dum); - Py_XDECREF (src); - free (code); - code = NULL; - if (PyErr_Occurred ()) - PyErr_Print (); - prompt = ps1; - } - } /* syntax error or E_EOF? */ - else if (PyErr_ExceptionMatches (PyExc_SyntaxError)) - { - PyErr_Fetch (&exc, &val, &trb); /* clears exception! */ - - if (PyArg_ParseTuple (val, "sO", &msg, &obj) && - !strcmp (msg, "unexpected EOF while parsing")) /* E_EOF */ - { - Py_XDECREF (exc); - Py_XDECREF (val); - Py_XDECREF (trb); - prompt = ps2; - } - else /* some other syntax error */ - { - PyErr_Restore (exc, val, trb); - PyErr_Print (); - free (code); - code = NULL; - prompt = ps1; - } - } - else /* some non-syntax error */ - { - PyErr_Print (); - free (code); - code = NULL; - prompt = ps1; - } - - free (line); - } - } - - Py_XDECREF(glb); - Py_XDECREF(loc); - Py_Finalize(); - exit(0); - } - - How do I find undefined g++ symbols __builtin_new or __pure_virtual? 
-------------------------------------------------------------------- From webhook-mailer at python.org Wed Jan 26 18:50:02 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 23:50:02 -0000 Subject: [Python-checkins] bpo-38472: setup.py uses LC_ALL=C to check the C compiler (GH-30929) Message-ID: https://github.com/python/cpython/commit/171fdf2162130bc8c748173bc8eef184b21f5a08 commit: 171fdf2162130bc8c748173bc8eef184b21f5a08 branch: 3.10 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-26T15:49:53-08:00 summary: bpo-38472: setup.py uses LC_ALL=C to check the C compiler (GH-30929) Fix GCC detection in setup.py when cross-compiling. The C compiler is now run with LC_ALL=C. Previously, the detection failed with a German locale. (cherry picked from commit a9503ac39474a9cb1b1935ddf159c0d9672b04b6) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst M setup.py diff --git a/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst b/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst new file mode 100644 index 0000000000000..4e0ee70bdc513 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst @@ -0,0 +1,2 @@ +Fix GCC detection in setup.py when cross-compiling. The C compiler is now +run with LC_ALL=C. Previously, the detection failed with a German locale. diff --git a/setup.py b/setup.py index 43e807f20d989..e74a275edbf2d 100644 --- a/setup.py +++ b/setup.py @@ -757,7 +757,9 @@ def add_cross_compiling_paths(self): tmpfile = os.path.join(self.build_temp, 'ccpaths') if not os.path.exists(self.build_temp): os.makedirs(self.build_temp) - ret = run_command('%s -E -v - %s 1>/dev/null' % (CC, tmpfile)) + # bpo-38472: With a German locale, GCC returns "gcc-Version 9.1.0 + # (GCC)", whereas it returns "gcc version 9.1.0" with the C locale. + ret = run_command('LC_ALL=C %s -E -v - %s 1>/dev/null' % (CC, tmpfile)) is_gcc = False is_clang = False in_incdirs = False From webhook-mailer at python.org Wed Jan 26 18:50:39 2022 From: webhook-mailer at python.org (miss-islington) Date: Wed, 26 Jan 2022 23:50:39 -0000 Subject: [Python-checkins] bpo-38472: setup.py uses LC_ALL=C to check the C compiler (GH-30929) Message-ID: https://github.com/python/cpython/commit/ff11effab7ae10b57719c066ee49b52d3991ead3 commit: ff11effab7ae10b57719c066ee49b52d3991ead3 branch: 3.9 author: Miss Islington (bot) <31488909+miss-islington at users.noreply.github.com> committer: miss-islington <31488909+miss-islington at users.noreply.github.com> date: 2022-01-26T15:50:30-08:00 summary: bpo-38472: setup.py uses LC_ALL=C to check the C compiler (GH-30929) Fix GCC detection in setup.py when cross-compiling. The C compiler is now run with LC_ALL=C. Previously, the detection failed with a German locale. (cherry picked from commit a9503ac39474a9cb1b1935ddf159c0d9672b04b6) Co-authored-by: Victor Stinner files: A Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst M setup.py diff --git a/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst b/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst new file mode 100644 index 0000000000000..4e0ee70bdc513 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-01-26-22-59-12.bpo-38472.RxfLho.rst @@ -0,0 +1,2 @@ +Fix GCC detection in setup.py when cross-compiling. The C compiler is now +run with LC_ALL=C. 
Previously, the detection failed with a German locale. diff --git a/setup.py b/setup.py index c6023e1ab6353..0bec170d3f244 100644 --- a/setup.py +++ b/setup.py @@ -682,7 +682,9 @@ def add_cross_compiling_paths(self): tmpfile = os.path.join(self.build_temp, 'ccpaths') if not os.path.exists(self.build_temp): os.makedirs(self.build_temp) - ret = run_command('%s -E -v - %s 1>/dev/null' % (CC, tmpfile)) + # bpo-38472: With a German locale, GCC returns "gcc-Version 9.1.0 + # (GCC)", whereas it returns "gcc version 9.1.0" with the C locale. + ret = run_command('LC_ALL=C %s -E -v - %s 1>/dev/null' % (CC, tmpfile)) is_gcc = False is_clang = False in_incdirs = False From webhook-mailer at python.org Wed Jan 26 19:16:54 2022 From: webhook-mailer at python.org (pablogsal) Date: Thu, 27 Jan 2022 00:16:54 -0000 Subject: [Python-checkins] [3.9] bpo-46502: Remove "How do I tell incomplete input" from FAQ (GH-30925) (GH-30934) Message-ID: https://github.com/python/cpython/commit/dafada393f9a790461430e2493ea1379e938b51a commit: dafada393f9a790461430e2493ea1379e938b51a branch: 3.9 author: Pablo Galindo Salgado committer: pablogsal date: 2022-01-27T00:16:44Z summary: [3.9] bpo-46502: Remove "How do I tell incomplete input" from FAQ (GH-30925) (GH-30934) Since, - Py_CompileString no longer allows to distinguish "incomplete input" from "invalid input" - there is no alternative solution available from the Python C API due to how the new parser works (rewritten in 3.9) - the only supported way is to manually import the codeop module from C and use its API as IDLE does, and accept its own complications it is desirable to remove this Q&A from the official FAQ.. (cherry picked from commit f0a648152f2d8011f47cc49873438ebaf01d3f82) Co-authored-by: Mateusz ?oskot Co-authored-by: Mateusz ?oskot files: M Doc/faq/extending.rst diff --git a/Doc/faq/extending.rst b/Doc/faq/extending.rst index aecb56eaa4fd2..1d2aca6f4c8d9 100644 --- a/Doc/faq/extending.rst +++ b/Doc/faq/extending.rst @@ -254,7 +254,6 @@ For Red Hat, install the python-devel RPM to get the necessary files. For Debian, run ``apt-get install python-dev``. - How do I tell "incomplete input" from "invalid input"? ------------------------------------------------------ @@ -273,161 +272,6 @@ you. You can also set the :c:func:`PyOS_ReadlineFunctionPointer` to point at you custom input function. See ``Modules/readline.c`` and ``Parser/myreadline.c`` for more hints. -However sometimes you have to run the embedded Python interpreter in the same -thread as your rest application and you can't allow the -:c:func:`PyRun_InteractiveLoop` to stop while waiting for user input. The one -solution then is to call :c:func:`PyParser_ParseString` and test for ``e.error`` -equal to ``E_EOF``, which means the input is incomplete. Here's a sample code -fragment, untested, inspired by code from Alex Farber:: - - #define PY_SSIZE_T_CLEAN - #include - #include - #include - #include - #include - #include - - int testcomplete(char *code) - /* code should end in \n */ - /* return -1 for error, 0 for incomplete, 1 for complete */ - { - node *n; - perrdetail e; - - n = PyParser_ParseString(code, &_PyParser_Grammar, - Py_file_input, &e); - if (n == NULL) { - if (e.error == E_EOF) - return 0; - return -1; - } - - PyNode_Free(n); - return 1; - } - -Another solution is trying to compile the received string with -:c:func:`Py_CompileString`. If it compiles without errors, try to execute the -returned code object by calling :c:func:`PyEval_EvalCode`. Otherwise save the -input for later. 
If the compilation fails, find out if it's an error or just -more input is required - by extracting the message string from the exception -tuple and comparing it to the string "unexpected EOF while parsing". Here is a -complete example using the GNU readline library (you may want to ignore -**SIGINT** while calling readline()):: - - #include - #include - - #define PY_SSIZE_T_CLEAN - #include - #include - #include - #include - - int main (int argc, char* argv[]) - { - int i, j, done = 0; /* lengths of line, code */ - char ps1[] = ">>> "; - char ps2[] = "... "; - char *prompt = ps1; - char *msg, *line, *code = NULL; - PyObject *src, *glb, *loc; - PyObject *exc, *val, *trb, *obj, *dum; - - Py_Initialize (); - loc = PyDict_New (); - glb = PyDict_New (); - PyDict_SetItemString (glb, "__builtins__", PyEval_GetBuiltins ()); - - while (!done) - { - line = readline (prompt); - - if (NULL == line) /* Ctrl-D pressed */ - { - done = 1; - } - else - { - i = strlen (line); - - if (i > 0) - add_history (line); /* save non-empty lines */ - - if (NULL == code) /* nothing in code yet */ - j = 0; - else - j = strlen (code); - - code = realloc (code, i + j + 2); - if (NULL == code) /* out of memory */ - exit (1); - - if (0 == j) /* code was empty, so */ - code[0] = '\0'; /* keep strncat happy */ - - strncat (code, line, i); /* append line to code */ - code[i + j] = '\n'; /* append '\n' to code */ - code[i + j + 1] = '\0'; - - src = Py_CompileString (code, "", Py_single_input); - - if (NULL != src) /* compiled just fine - */ - { - if (ps1 == prompt || /* ">>> " or */ - '\n' == code[i + j - 1]) /* "... " and double '\n' */ - { /* so execute it */ - dum = PyEval_EvalCode (src, glb, loc); - Py_XDECREF (dum); - Py_XDECREF (src); - free (code); - code = NULL; - if (PyErr_Occurred ()) - PyErr_Print (); - prompt = ps1; - } - } /* syntax error or E_EOF? */ - else if (PyErr_ExceptionMatches (PyExc_SyntaxError)) - { - PyErr_Fetch (&exc, &val, &trb); /* clears exception! */ - - if (PyArg_ParseTuple (val, "sO", &msg, &obj) && - !strcmp (msg, "unexpected EOF while parsing")) /* E_EOF */ - { - Py_XDECREF (exc); - Py_XDECREF (val); - Py_XDECREF (trb); - prompt = ps2; - } - else /* some other syntax error */ - { - PyErr_Restore (exc, val, trb); - PyErr_Print (); - free (code); - code = NULL; - prompt = ps1; - } - } - else /* some non-syntax error */ - { - PyErr_Print (); - free (code); - code = NULL; - prompt = ps1; - } - - free (line); - } - } - - Py_XDECREF(glb); - Py_XDECREF(loc); - Py_Finalize(); - exit(0); - } - - How do I find undefined g++ symbols __builtin_new or __pure_virtual? 
--------------------------------------------------------------------

From webhook-mailer at python.org Wed Jan 26 19:16:54 2022
From: webhook-mailer at python.org (pablogsal)
Date: Thu, 27 Jan 2022 00:16:54 -0000
Subject: [Python-checkins] [3.10] bpo-46502: Remove "How do I tell incomplete input" from FAQ (GH-30925) (GH-30933)
Message-ID: 

https://github.com/python/cpython/commit/c7af838805ddf52320bce3d5978bfdd37eed1b3a

commit: c7af838805ddf52320bce3d5978bfdd37eed1b3a
branch: 3.10
author: Pablo Galindo Salgado 
committer: pablogsal
date: 2022-01-27T00:16:50Z
summary:

[3.10] bpo-46502: Remove "How do I tell incomplete input" from FAQ (GH-30925) (GH-30933)

Since,

- Py_CompileString no longer allows to distinguish "incomplete input" from
  "invalid input"
- there is no alternative solution available from the Python C API due to how
  the new parser works (rewritten in 3.9)
- the only supported way is to manually import the codeop module from C and
  use its API as IDLE does, and accept its own complications

it is desirable to remove this Q&A from the official FAQ.
(cherry picked from commit f0a648152f2d8011f47cc49873438ebaf01d3f82)

Co-authored-by: Mateusz Łoskot 
Co-authored-by: Mateusz Łoskot 

files:
M Doc/faq/extending.rst

diff --git a/Doc/faq/extending.rst b/Doc/faq/extending.rst
index 3379e41d9de07..1d2aca6f4c8d9 100644
--- a/Doc/faq/extending.rst
+++ b/Doc/faq/extending.rst
@@ -254,7 +254,6 @@ For Red Hat, install the python-devel RPM to get the necessary files.
 For Debian, run ``apt-get install python-dev``.
 
-
 How do I tell "incomplete input" from "invalid input"?
 ------------------------------------------------------
 
@@ -273,130 +272,6 @@ you.
 
 You can also set the :c:func:`PyOS_ReadlineFunctionPointer` to point at you
 custom input function. See ``Modules/readline.c`` and ``Parser/myreadline.c``
 for more hints.
 
-However sometimes you have to run the embedded Python interpreter in the same
-thread as your rest application and you can't allow the
-:c:func:`PyRun_InteractiveLoop` to stop while waiting for user input.
-A solution is trying to compile the received string with
-:c:func:`Py_CompileString`. If it compiles without errors, try to execute the
-returned code object by calling :c:func:`PyEval_EvalCode`. Otherwise save the
-input for later. If the compilation fails, find out if it's an error or just
-more input is required - by extracting the message string from the exception
-tuple and comparing it to the string "unexpected EOF while parsing". Here is a
-complete example using the GNU readline library (you may want to ignore
-**SIGINT** while calling readline())::
-
-  #include <stdio.h>
-  #include <readline.h>
-
-  #define PY_SSIZE_T_CLEAN
-  #include <Python.h>
-  #include <object.h>
-  #include <compile.h>
-  #include <eval.h>
-
-  int main (int argc, char* argv[])
-  {
-    int i, j, done = 0;                          /* lengths of line, code */
-    char ps1[] = ">>> ";
-    char ps2[] = "... ";
-    char *prompt = ps1;
-    char *msg, *line, *code = NULL;
-    PyObject *src, *glb, *loc;
-    PyObject *exc, *val, *trb, *obj, *dum;
-
-    Py_Initialize ();
-    loc = PyDict_New ();
-    glb = PyDict_New ();
-    PyDict_SetItemString (glb, "__builtins__", PyEval_GetBuiltins ());
-
-    while (!done)
-    {
-      line = readline (prompt);
-
-      if (NULL == line)                          /* Ctrl-D pressed */
-      {
-        done = 1;
-      }
-      else
-      {
-        i = strlen (line);
-
-        if (i > 0)
-          add_history (line);                    /* save non-empty lines */
-
-        if (NULL == code)                        /* nothing in code yet */
-          j = 0;
-        else
-          j = strlen (code);
-
-        code = realloc (code, i + j + 2);
-        if (NULL == code)                        /* out of memory */
-          exit (1);
-
-        if (0 == j)                              /* code was empty, so */
-          code[0] = '\0';                        /* keep strncat happy */
-
-        strncat (code, line, i);                 /* append line to code */
-        code[i + j] = '\n';                      /* append '\n' to code */
-        code[i + j + 1] = '\0';
-
-        src = Py_CompileString (code, "<stdin>", Py_single_input);
-
-        if (NULL != src)                         /* compiled just fine - */
-        {
-          if (ps1 == prompt ||                   /* ">>> " or */
-              '\n' == code[i + j - 1])           /* "... " and double '\n' */
-          {                                      /* so execute it */
-            dum = PyEval_EvalCode (src, glb, loc);
-            Py_XDECREF (dum);
-            Py_XDECREF (src);
-            free (code);
-            code = NULL;
-            if (PyErr_Occurred ())
-              PyErr_Print ();
-            prompt = ps1;
-          }
-        }                                        /* syntax error or E_EOF? */
-        else if (PyErr_ExceptionMatches (PyExc_SyntaxError))
-        {
-          PyErr_Fetch (&exc, &val, &trb);        /* clears exception! */
-
-          if (PyArg_ParseTuple (val, "sO", &msg, &obj) &&
-              !strcmp (msg, "unexpected EOF while parsing")) /* E_EOF */
-          {
-            Py_XDECREF (exc);
-            Py_XDECREF (val);
-            Py_XDECREF (trb);
-            prompt = ps2;
-          }
-          else                                   /* some other syntax error */
-          {
-            PyErr_Restore (exc, val, trb);
-            PyErr_Print ();
-            free (code);
-            code = NULL;
-            prompt = ps1;
-          }
-        }
-        else                                     /* some non-syntax error */
-        {
-          PyErr_Print ();
-          free (code);
-          code = NULL;
-          prompt = ps1;
-        }
-
-        free (line);
-      }
-    }
-
-    Py_XDECREF(glb);
-    Py_XDECREF(loc);
-    Py_Finalize();
-    exit(0);
-  }
-
-
 How do I find undefined g++ symbols __builtin_new or __pure_virtual?
 --------------------------------------------------------------------

From webhook-mailer at python.org Wed Jan 26 21:01:07 2022
From: webhook-mailer at python.org (vstinner)
Date: Thu, 27 Jan 2022 02:01:07 -0000
Subject: [Python-checkins] bpo-40170: PyType_SUPPORTS_WEAKREFS() becomes a regular function (GH-30938)
Message-ID: 

https://github.com/python/cpython/commit/af32b3ef1fbad3c2242627a14398320960a0cb45

commit: af32b3ef1fbad3c2242627a14398320960a0cb45
branch: main
author: Victor Stinner 
committer: vstinner
date: 2022-01-27T03:00:55+01:00
summary:

bpo-40170: PyType_SUPPORTS_WEAKREFS() becomes a regular function (GH-30938)

Convert the PyType_SUPPORTS_WEAKREFS() macro to a regular function.
It no longer accesses the PyTypeObject.tp_weaklistoffset member directly.

Add a _PyType_SUPPORTS_WEAKREFS() static inline function, used internally by
Python for best performance.
files: M Include/cpython/objimpl.h M Include/internal/pycore_object.h M Modules/_weakref.c M Modules/gcmodule.c M Objects/exceptions.c M Objects/typeobject.c M Objects/weakrefobject.c diff --git a/Include/cpython/objimpl.h b/Include/cpython/objimpl.h index 4a905c25cc845..7fff96e8eb27f 100644 --- a/Include/cpython/objimpl.h +++ b/Include/cpython/objimpl.h @@ -91,7 +91,7 @@ PyAPI_FUNC(int) PyObject_IS_GC(PyObject *obj); #endif -/* Test if a type supports weak references */ -#define PyType_SUPPORTS_WEAKREFS(t) ((t)->tp_weaklistoffset > 0) +// Test if a type supports weak references +PyAPI_FUNC(int) PyType_SUPPORTS_WEAKREFS(PyTypeObject *type); PyAPI_FUNC(PyObject **) PyObject_GET_WEAKREFS_LISTPTR(PyObject *op); diff --git a/Include/internal/pycore_object.h b/Include/internal/pycore_object.h index be308cd25d710..5fe4ddb2efbe1 100644 --- a/Include/internal/pycore_object.h +++ b/Include/internal/pycore_object.h @@ -200,6 +200,11 @@ extern int _Py_CheckSlotResult( // See also the Py_TPFLAGS_READY flag. #define _PyType_IsReady(type) ((type)->tp_dict != NULL) +// Test if a type supports weak references +static inline int _PyType_SUPPORTS_WEAKREFS(PyTypeObject *type) { + return (type->tp_weaklistoffset > 0); +} + extern PyObject* _PyType_AllocNoTrack(PyTypeObject *type, Py_ssize_t nitems); extern int _PyObject_InitializeDict(PyObject *obj); diff --git a/Modules/_weakref.c b/Modules/_weakref.c index e33cba2a3dd81..edc09b949fd3e 100644 --- a/Modules/_weakref.c +++ b/Modules/_weakref.c @@ -28,7 +28,7 @@ _weakref_getweakrefcount_impl(PyObject *module, PyObject *object) { PyWeakReference **list; - if (!PyType_SUPPORTS_WEAKREFS(Py_TYPE(object))) + if (!_PyType_SUPPORTS_WEAKREFS(Py_TYPE(object))) return 0; list = GET_WEAKREFS_LISTPTR(object); @@ -85,7 +85,7 @@ weakref_getweakrefs(PyObject *self, PyObject *object) { PyObject *result = NULL; - if (PyType_SUPPORTS_WEAKREFS(Py_TYPE(object))) { + if (_PyType_SUPPORTS_WEAKREFS(Py_TYPE(object))) { PyWeakReference **list = GET_WEAKREFS_LISTPTR(object); Py_ssize_t count = _PyWeakref_GetWeakrefCount(*list); diff --git a/Modules/gcmodule.c b/Modules/gcmodule.c index 16f8c2b18e717..802c3eadccfb0 100644 --- a/Modules/gcmodule.c +++ b/Modules/gcmodule.c @@ -791,7 +791,7 @@ handle_weakrefs(PyGC_Head *unreachable, PyGC_Head *old) _PyWeakref_ClearRef((PyWeakReference *)op); } - if (! PyType_SUPPORTS_WEAKREFS(Py_TYPE(op))) + if (! _PyType_SUPPORTS_WEAKREFS(Py_TYPE(op))) continue; /* It supports weakrefs. Does it have any? */ diff --git a/Objects/exceptions.c b/Objects/exceptions.c index d8bfb31a6094a..ea8a31076b060 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -3729,7 +3729,7 @@ _PyErr_TrySetFromCause(const char *format, ...) base_exc_size = _PyExc_BaseException.tp_basicsize; same_basic_size = ( caught_type_size == base_exc_size || - (PyType_SUPPORTS_WEAKREFS(caught_type) && + (_PyType_SUPPORTS_WEAKREFS(caught_type) && (caught_type_size == base_exc_size + (Py_ssize_t)sizeof(PyObject *)) ) ); diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 452759334f456..39e8b466ce82d 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -2473,12 +2473,21 @@ type_init(PyObject *cls, PyObject *args, PyObject *kwds) return 0; } + unsigned long PyType_GetFlags(PyTypeObject *type) { return type->tp_flags; } + +int +PyType_SUPPORTS_WEAKREFS(PyTypeObject *type) +{ + return _PyType_SUPPORTS_WEAKREFS(type); +} + + /* Determine the most derived metatype. 
*/ PyTypeObject * _PyType_CalculateMetaclass(PyTypeObject *metatype, PyObject *bases) diff --git a/Objects/weakrefobject.c b/Objects/weakrefobject.c index b9920404c5f9f..76121f9fe8872 100644 --- a/Objects/weakrefobject.c +++ b/Objects/weakrefobject.c @@ -299,7 +299,7 @@ weakref___new__(PyTypeObject *type, PyObject *args, PyObject *kwargs) PyWeakReference *ref, *proxy; PyWeakReference **list; - if (!PyType_SUPPORTS_WEAKREFS(Py_TYPE(ob))) { + if (!_PyType_SUPPORTS_WEAKREFS(Py_TYPE(ob))) { PyErr_Format(PyExc_TypeError, "cannot create weak reference to '%s' object", Py_TYPE(ob)->tp_name); @@ -794,7 +794,7 @@ PyWeakref_NewRef(PyObject *ob, PyObject *callback) PyWeakReference **list; PyWeakReference *ref, *proxy; - if (!PyType_SUPPORTS_WEAKREFS(Py_TYPE(ob))) { + if (!_PyType_SUPPORTS_WEAKREFS(Py_TYPE(ob))) { PyErr_Format(PyExc_TypeError, "cannot create weak reference to '%s' object", Py_TYPE(ob)->tp_name); @@ -853,7 +853,7 @@ PyWeakref_NewProxy(PyObject *ob, PyObject *callback) PyWeakReference **list; PyWeakReference *ref, *proxy; - if (!PyType_SUPPORTS_WEAKREFS(Py_TYPE(ob))) { + if (!_PyType_SUPPORTS_WEAKREFS(Py_TYPE(ob))) { PyErr_Format(PyExc_TypeError, "cannot create weak reference to '%s' object", Py_TYPE(ob)->tp_name); @@ -949,7 +949,7 @@ PyObject_ClearWeakRefs(PyObject *object) PyWeakReference **list; if (object == NULL - || !PyType_SUPPORTS_WEAKREFS(Py_TYPE(object)) + || !_PyType_SUPPORTS_WEAKREFS(Py_TYPE(object)) || Py_REFCNT(object) != 0) { PyErr_BadInternalCall(); From webhook-mailer at python.org Wed Jan 26 21:35:56 2022 From: webhook-mailer at python.org (vstinner) Date: Thu, 27 Jan 2022 02:35:56 -0000 Subject: [Python-checkins] bpo-40170: Remove _Py_GetAllocatedBlocks() function (GH-30940) Message-ID: https://github.com/python/cpython/commit/6b491b9dc0b0fdfd1f07ea4e2151236186d8e7e6 commit: 6b491b9dc0b0fdfd1f07ea4e2151236186d8e7e6 branch: main author: Victor Stinner committer: vstinner date: 2022-01-27T03:35:51+01:00 summary: bpo-40170: Remove _Py_GetAllocatedBlocks() function (GH-30940) Move _Py_GetAllocatedBlocks() and _PyObject_DebugMallocStats() private functions to the internal C API. files: A Misc/NEWS.d/next/C API/2022-01-27-02-37-18.bpo-40170.XxQB0i.rst M Include/cpython/objimpl.h M Include/internal/pycore_object.h M Objects/obmalloc.c diff --git a/Include/cpython/objimpl.h b/Include/cpython/objimpl.h index 7fff96e8eb27f..d7c76eab5c731 100644 --- a/Include/cpython/objimpl.h +++ b/Include/cpython/objimpl.h @@ -52,14 +52,6 @@ the 1st step is performed automatically for you, so in a C++ class constructor you would start directly with PyObject_Init/InitVar. 
*/ -/* This function returns the number of allocated memory blocks, regardless of size */ -PyAPI_FUNC(Py_ssize_t) _Py_GetAllocatedBlocks(void); - -/* Macros */ -#ifdef WITH_PYMALLOC -PyAPI_FUNC(int) _PyObject_DebugMallocStats(FILE *out); -#endif - typedef struct { /* user context passed as the first argument to the 2 functions */ diff --git a/Include/internal/pycore_object.h b/Include/internal/pycore_object.h index 5fe4ddb2efbe1..c520122aa579b 100644 --- a/Include/internal/pycore_object.h +++ b/Include/internal/pycore_object.h @@ -232,6 +232,15 @@ extern void _PyObject_FreeInstanceAttributes(PyObject *self); extern int _PyObject_IsInstanceDictEmpty(PyObject *); extern PyObject* _PyType_GetSubclasses(PyTypeObject *); +/* This function returns the number of allocated memory blocks, regardless of size */ +PyAPI_FUNC(Py_ssize_t) _Py_GetAllocatedBlocks(void); + +/* Macros */ +#ifdef WITH_PYMALLOC +// Export the symbol for the 3rd party guppy3 project +PyAPI_FUNC(int) _PyObject_DebugMallocStats(FILE *out); +#endif + #ifdef __cplusplus } #endif diff --git a/Misc/NEWS.d/next/C API/2022-01-27-02-37-18.bpo-40170.XxQB0i.rst b/Misc/NEWS.d/next/C API/2022-01-27-02-37-18.bpo-40170.XxQB0i.rst new file mode 100644 index 0000000000000..7b743827bb168 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-01-27-02-37-18.bpo-40170.XxQB0i.rst @@ -0,0 +1,2 @@ +Move _Py_GetAllocatedBlocks() and _PyObject_DebugMallocStats() private +functions to the internal C API. Patch by Victor Stinner. diff --git a/Objects/obmalloc.c b/Objects/obmalloc.c index 4e17bf44b4e96..ea0faff5bbe30 100644 --- a/Objects/obmalloc.c +++ b/Objects/obmalloc.c @@ -8,6 +8,9 @@ /* Defined in tracemalloc.c */ extern void _PyMem_DumpTraceback(int fd, const void *ptr); +// Forward declaration +int _PyObject_DebugMallocStats(FILE *out); + /* Python's malloc wrappers (see pymem.h) */ @@ -1569,8 +1572,9 @@ new_arena(void) const char *opt = Py_GETENV("PYTHONMALLOCSTATS"); debug_stats = (opt != NULL && *opt != '\0'); } - if (debug_stats) + if (debug_stats) { _PyObject_DebugMallocStats(stderr); + } if (unused_arena_objects == NULL) { uint i; From webhook-mailer at python.org Wed Jan 26 22:12:04 2022 From: webhook-mailer at python.org (gvanrossum) Date: Thu, 27 Jan 2022 03:12:04 -0000 Subject: [Python-checkins] bpo-46539: Pass status of special typeforms to forward references (GH-30926) Message-ID: https://github.com/python/cpython/commit/ced50051bb752a7c1e616f4b0c001f37f0354f32 commit: ced50051bb752a7c1e616f4b0c001f37f0354f32 branch: main author: Gregory Beauregard committer: gvanrossum date: 2022-01-26T19:11:51-08:00 summary: bpo-46539: Pass status of special typeforms to forward references (GH-30926) Previously this didn't matter because there weren't any valid code paths that could trigger a type check with a special form, but after the bug fix for `Annotated` wrapping special forms it's now possible to annotate something like `Annotated['ClassVar[int]', (3, 4)]`. This change would also be needed for proposed future changes, such as allowing `ClassVar` and `Final` to nest each other in dataclasses. 
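For background, the newly supported spelling can be exercised directly from
Python; the following is a small sketch modeled on the test added below, not
the patch text itself::

    from typing import Annotated, ClassVar, Final, get_type_hints

    class C:
        # The special forms are spelled as strings inside Annotated, so they
        # are only resolved when get_type_hints() evaluates the forward refs.
        a: Annotated['ClassVar[int]', (3, 5)] = 4
        b: Annotated['Final[int]', "const"] = 4

    hints = get_type_hints(C)
    print(hints['a'])   # typing.ClassVar[int]
    print(hints['b'])   # typing.Final[int]

Previously, evaluating a stringified ``ClassVar`` or ``Final`` inside
``Annotated`` was rejected with a ``TypeError``.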
files: A Misc/NEWS.d/next/Library/2022-01-26-20-36-30.bpo-46539.23iW1d.rst M Lib/test/test_typing.py M Lib/typing.py diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index b5767d02691d8..4b260d49bdfe4 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -2870,6 +2870,20 @@ def foo(a: 'Callable[..., T]'): self.assertEqual(get_type_hints(foo, globals(), locals()), {'a': Callable[..., T]}) + def test_special_forms_forward(self): + + class C: + a: Annotated['ClassVar[int]', (3, 5)] = 4 + b: Annotated['Final[int]', "const"] = 4 + + class CF: + b: List['Final[int]'] = 4 + + self.assertEqual(get_type_hints(C, globals())['a'], ClassVar[int]) + self.assertEqual(get_type_hints(C, globals())['b'], Final[int]) + with self.assertRaises(TypeError): + get_type_hints(CF, globals()), + def test_syntax_error(self): with self.assertRaises(SyntaxError): diff --git a/Lib/typing.py b/Lib/typing.py index e3e098b1fcc8f..450cd7b51184e 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -142,12 +142,12 @@ def _idfunc(_, x): # legitimate imports of those modules. -def _type_convert(arg, module=None): +def _type_convert(arg, module=None, *, allow_special_forms=False): """For converting None to type(None), and strings to ForwardRef.""" if arg is None: return type(None) if isinstance(arg, str): - return ForwardRef(arg, module=module) + return ForwardRef(arg, module=module, is_class=allow_special_forms) return arg @@ -169,7 +169,7 @@ def _type_check(arg, msg, is_argument=True, module=None, *, allow_special_forms= if is_argument: invalid_generic_forms += (Final,) - arg = _type_convert(arg, module=module) + arg = _type_convert(arg, module=module, allow_special_forms=allow_special_forms) if (isinstance(arg, _GenericAlias) and arg.__origin__ in invalid_generic_forms): raise TypeError(f"{arg} is not valid as type argument") diff --git a/Misc/NEWS.d/next/Library/2022-01-26-20-36-30.bpo-46539.23iW1d.rst b/Misc/NEWS.d/next/Library/2022-01-26-20-36-30.bpo-46539.23iW1d.rst new file mode 100644 index 0000000000000..2bdde21b6e58e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-26-20-36-30.bpo-46539.23iW1d.rst @@ -0,0 +1 @@ +In :func:`typing.get_type_hints`, support evaluating stringified ``ClassVar`` and ``Final`` annotations inside ``Annotated``. Patch by Gregory Beauregard. From webhook-mailer at python.org Wed Jan 26 22:16:39 2022 From: webhook-mailer at python.org (terryjreedy) Date: Thu, 27 Jan 2022 03:16:39 -0000 Subject: [Python-checkins] bpo-45296: Clarify close, quit, and exit in IDLE (GH-30936) Message-ID: https://github.com/python/cpython/commit/fcde0bc10ddd836b62d0a8e893d80b8c55e0ba3f commit: fcde0bc10ddd836b62d0a8e893d80b8c55e0ba3f branch: main author: Terry Jan Reedy committer: terryjreedy date: 2022-01-26T22:16:31-05:00 summary: bpo-45296: Clarify close, quit, and exit in IDLE (GH-30936) In the File menu, 'Close' and 'Exit' are now 'Close Window' (the current one) and 'Exit' is now 'Exit IDLE' (by closing all windows). In Shell, 'quit()' and 'exit()' mean 'close Shell'. If there are no other windows, this also exits IDLE. files: A Misc/NEWS.d/next/IDLE/2022-01-26-19-33-55.bpo-45296.LzZKdU.rst M Doc/library/idle.rst M Lib/idlelib/help.html M Lib/idlelib/mainmenu.py diff --git a/Doc/library/idle.rst b/Doc/library/idle.rst index d740973af9124..d6021042c6116 100644 --- a/Doc/library/idle.rst +++ b/Doc/library/idle.rst @@ -96,11 +96,13 @@ Save Copy As... Print Window Print the current window to the default printer. -Close - Close the current window (ask to save if unsaved). 
+Close Window
+   Close the current window (if an unsaved editor, ask to save; if an unsaved
+   Shell, ask to quit execution).  Calling ``exit()`` or ``close()`` in the Shell
+   window also closes Shell.  If this is the only window, also exit IDLE.
 
-Exit
-   Close all windows and quit IDLE (ask to save unsaved windows).
+Exit IDLE
+   Close all windows and quit IDLE (ask to save unsaved edit windows).
 
 Edit menu (Shell and Editor)
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
diff --git a/Lib/idlelib/help.html b/Lib/idlelib/help.html
index 2468afa7148b9..41626ec5abb56 100644
--- a/Lib/idlelib/help.html
+++ b/Lib/idlelib/help.html

[help.html hunks omitted: the generated HTML did not survive archiving. They
regenerate the built help page from idle.rst, changing the File menu entries
to "Close Window" and "Exit IDLE" with the new descriptions above, and
updating the page title and navigation headers from "3.11.0a0 documentation"
to "3.11.0a4 documentation".]