[issue34428] tokenize

Serhiy Storchaka report at bugs.python.org
Sat Aug 18 07:21:53 EDT 2018


Serhiy Storchaka <storchaka+cpython at gmail.com> added the comment:

I can't reproduce.

>>> import tokenize
>>> list(tokenize.generate_tokens(iter(['(\n', r'"\(")']).__next__))
[TokenInfo(type=53 (OP), string='(', start=(1, 0), end=(1, 1), line='(\n'),
 TokenInfo(type=56 (NL), string='\n', start=(1, 1), end=(1, 2), line='(\n'),
 TokenInfo(type=3 (STRING), string='"\\("', start=(2, 0), end=(2, 4), line='"\\(")'),
 TokenInfo(type=53 (OP), string=')', start=(2, 4), end=(2, 5), line='"\\(")'),
 TokenInfo(type=4 (NEWLINE), string='', start=(2, 5), end=(2, 6), line=''),
 TokenInfo(type=0 (ENDMARKER), string='', start=(3, 0), end=(3, 0), line='')]
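
For reference, the same check as a standalone script (a minimal sketch; feeding the source through io.StringIO.readline instead of a hand-built iterator is my own choice, not part of the original report):

import io
import tokenize

# Tokenize the same two-line source: an opening paren on line 1,
# then a string containing an escaped paren followed by the closer.
source = '(\n"\\(")'
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    print(tok)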

Could you please provide a minimal script that reproduces your issue?

----------
nosy: +serhiy.storchaka

_______________________________________
Python tracker <report at bugs.python.org>
<https://bugs.python.org/issue34428>
_______________________________________
