OT: novice regular expression question

M.E.Farmer mefjr75 at hotmail.com
Thu Dec 30 14:54:51 EST 2004


Hello me,
Have you tried shlex.py? It is a tokenizer for writing lexical parsers,
and it should be a breeze to whip something up with it.
An example of tokenizing:
py>import shlex
py># fake an open record
py>import cStringIO
py>myfakeRecord = cStringIO.StringIO()
py>myfakeRecord.write("['1','2'] \n 'fdfdfdfd' \n 'dfdfdfdfd' ['1','2']\n")
py>myfakeRecord.seek(0)
py>lexer = shlex.shlex(myfakeRecord)

py>lexer.get_token()
'['
py>lexer.get_token()
'1'
py>lexer.get_token()
','
py>lexer.get_token()
'2'
py>lexer.get_token()
']'
py>lexer.get_token()
'fdfdfdfd'

You can do a lot with it; that is just a teaser.
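If you want all the tokens at once, here is a minimal sketch (untested, assuming
the same fake record as above and the default non-POSIX mode, where get_token()
returns the empty-string eof marker at end of input):

import shlex
import cStringIO

# build the same fake record and lexer as above
myfakeRecord = cStringIO.StringIO("['1','2'] \n 'fdfdfdfd' \n 'dfdfdfdfd' ['1','2']\n")
lexer = shlex.shlex(myfakeRecord)

# collect every token until end of input
tokens = []
token = lexer.get_token()
while token != lexer.eof:   # lexer.eof is '' in the default (non-POSIX) mode
    tokens.append(token)
    token = lexer.get_token()
print tokens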
M.E.Farmer



