[issue12486] tokenize module should have a unicode API

Thomas Kluyver report at bugs.python.org
Sun Mar 11 05:22:55 EDT 2018


Thomas Kluyver <thomas at kluyver.me.uk> added the comment:

> Why not just bless the existing generate_tokens() function as a public API

We're actually using generate_tokens() from IPython: we wanted a way to tokenize Unicode strings, and although it's undocumented, it has been there for a number of releases and does what we want. So +1 to promoting it to a public API.
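
For reference, here is a minimal sketch of the kind of usage I mean (the sample source string is just illustrative):

    import io
    import tokenize

    source = "greeting = 'héllo'\n"

    # generate_tokens() takes a readline callable that returns str,
    # so it tokenizes unicode source directly, without encoding it first.
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        print(tokenize.tok_name[tok.type], repr(tok.string))

Compare tokenize.tokenize(), which requires a readline callable returning bytes and performs encoding detection itself.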

In fact, at the moment IPython carries its own copy of tokenize to work around one or two old issues. I'm trying to get rid of that and use the stdlib module again, which is how I came to notice that we're relying on an undocumented API.

----------
nosy: +takluyver

_______________________________________
Python tracker <report at bugs.python.org>
<https://bugs.python.org/issue12486>
_______________________________________

