tokenise a string

Lonnie Princehouse finite.automaton at gmail.com
Mon Oct 11 19:08:01 EDT 2004


The tokenize module is only useful for tokenizing Python source code, so it
won't be much help with arbitrary input.  That said, reading the code of
tokenize.py might be helpful in writing your own tokenizer.
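
For what it's worth, a quick sketch of running it over a string
(generate_tokens wants a readline-style callable, so the string gets
wrapped in StringIO first):

    import tokenize
    from io import StringIO    # Python 2: from StringIO import StringIO

    source = "total = price * quantity\n"
    tokens = tokenize.generate_tokens(StringIO(source).readline)
    for toktype, tokstring, start, end, line in tokens:
        print("%-10s %r" % (tokenize.tok_name[toktype], tokstring))

The token types (NAME, OP, NUMBER, ...) all come from Python's own
grammar, which is why it's of limited use for other input formats.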

Look into the re (regular expression) module, or PLY or Spark if you
need more sophisticated parsers.
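
For simple input, a handful of regular expressions will take you a long
way.  A rough sketch (the token classes below, quoted strings, numbers,
field names and punctuation, are guesses at what your input looks like,
so adjust the patterns to the real format):

    import re

    # Each alternative is a named group; exactly one of them matches
    # per token, and match.lastgroup tells us which.
    token_pattern = re.compile(r"""
        \s* (?:
            (?P<string> "[^"]*"        )   # double-quoted string
          | (?P<number> \d+(?:\.\d+)?  )   # integer or decimal
          | (?P<name>   [A-Za-z_]\w*   )   # identifier / field name
          | (?P<punct>  [=,;()]        )   # single-character punctuation
        )""", re.VERBOSE)

    def tokenise(text):
        text = text.rstrip()
        pos = 0
        while pos < len(text):
            match = token_pattern.match(text, pos)
            if not match:
                raise ValueError("unexpected character %r at %d"
                                 % (text[pos], pos))
            pos = match.end()
            yield match.lastgroup, match.group(match.lastgroup)

    print(list(tokenise('name = "Smith", age = 42')))

PLY and Spark become worth the trouble once you need a real grammar
rather than a flat stream of tokens.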

Also check out SQLObject (http://sqlobject.org/) --- it makes SQL
almost pleasant =)
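
Just as a taste, a rough sketch (the table, columns, and sqlite
connection URI below are made up for illustration):

    from sqlobject import SQLObject, StringCol, IntCol
    from sqlobject import connectionForURI, sqlhub

    # In-memory sqlite database purely for the example; point the URI
    # at your real backend instead.
    sqlhub.processConnection = connectionForURI('sqlite:/:memory:')

    # A made-up table; SQLObject generates the schema and the SQL.
    class Person(SQLObject):
        name = StringCol()
        age = IntCol()

    Person.createTable()
    Person(name='Smith', age=42)
    for person in Person.selectBy(name='Smith'):
        print("%s, %s" % (person.name, person.age))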

Cousin Stanley <cousinstanley at hotmail.com> wrote in message news:<2sqeabF1nrio1U1 at uni-berlin.de>...
> On 2004-10-09, Matthias Teege <matthias-dated at mteege.de> wrote:
> > Hello,
> > ....
> > So I must parse the input, build tokens, and map the field names.
> >
> > Is there a special module which I can use 
> > or are the standard string functions adequate?
> >
> 
> Matthias .... 
> 
>   You might check the  tokenize  module ....
>   
>       import tokenize
>       
>       help( tokenize )
>       
>   I only know of it, but have never used it myself ....


