Which one to use: generate_tokens or tokenize?
André Roberge
andre.roberge at ns.sympatico.ca
Thu Sep 9 19:32:14 EDT 2004
According to the Python documentation:
18.5 tokenize -- Tokenizer for Python source
...
The primary entry point is a generator:
generate_tokens(readline)
...
An older entry point is retained for backward compatibility:
tokenize(readline[, tokeneater])
====
Does this mean that one should prefer generate_tokens? If so,
what are its advantages?
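For reference, here is a minimal sketch of calling generate_tokens on an in-memory string; the sample source line is made up for illustration. Since it is a generator, tokens are produced lazily rather than pushed into a callback the way the older tokenize(readline, tokeneater) entry point does.

```python
import io
import tokenize

# Illustrative source snippet (an assumption, not from the docs).
source = "x = 1 + 2\n"

# generate_tokens takes any readline-style callable and yields tokens lazily.
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    print(tok)
```

Because it is a plain generator, you can stop early, filter with a comprehension, or feed it to any consumer expecting an iterable, which is harder to do with the callback-based interface.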
André Roberge