Which one to use: generate_tokens or tokenize?

Tim Peters tim.peters at gmail.com
Thu Sep 9 19:39:39 EDT 2004


[André Roberge]
> According to the Python documentation:
> 
> 18.5 tokenize -- Tokenizer for Python source
> ...
> The primary entry point is a generator:
> generate_tokens(readline)
> ...
> An older entry point is retained for backward compatibility:
> tokenize(readline[, tokeneater])
> ====
> Does this mean that one should preferably use generate_tokens?

Yes.

> If so, what are the advantages?

Be adventurous:  try them both.  You'll figure it out quickly.  If you
have to endure "an explanation" first, read PEP 255, where
tokenize.tokenize was used as an example motivating the desirability
of introducing generators.
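To make "try them both" concrete, here is a minimal sketch of the generator entry point, written for modern Python 3 (where the old callback form `tokenize(readline, tokeneater)` shown in the quoted docs no longer exists; the `source` string and the printed fields are just illustrative choices):

```python
import io
import tokenize

source = "x = 1 + 2\n"

# generate_tokens(readline) is a generator: tokens are produced lazily,
# one per iteration, instead of being pushed into a callback.
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    print(tokenize.tok_name[tok.type], repr(tok.string))
```

Because it is a generator, you can also stop early, feed it to `list()`, or pass it straight to `itertools` tools, which is exactly the flexibility the callback interface lacked.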
