Macros in Python, and using regexps to build a Scheme interpreter

Ian Bicking ianb at colorstudy.com
Sun Nov 3 23:56:19 EST 2002


On Sat, 2002-11-02 at 12:12, Matthew Knepley wrote:
>   IB> You can create a tokenizer with regular expressions, though Scheme's tokenization rules are simple enough you
>   IB> could pretty much do it with string.find.  You can't turn the tokens into a parse tree using regular expressions,
>   IB> but the relation between tokens and parse trees in Scheme is clear enough that it should be obvious how you'll do
>   IB> that portion.
>   This is possible, but I would say that it is (in my opinion) more elegant, and certainly just as easy to write
>   the parser using PLY (Python Lex-Yacc) from David Beazley, http://systems.cs.uchicago.edu/ply. It has a great
>   design which makes it really easy to do these kinds of things (making use of Python's reflection capabilities).
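
Just to show how small the problem is, here's roughly what the
regular-expression tokenizer described above comes to (a quick,
untested sketch; the token pattern is my own guess at what's needed,
not anything from PLY):

    import re

    # One alternation covers the lexical categories we care about:
    # parens, the quote shorthand, string literals, and atoms.
    _token_re = re.compile(r'''
          \(               # open paren
        | \)               # close paren
        | '                # quote shorthand
        | "(?:\\.|[^"])*"  # string literal, with backslash escapes
        | [^\s()'"]+       # atom: symbol or number
        ''', re.VERBOSE)

    def tokenize(text):
        return _token_re.findall(text)

    # tokenize("(+ 1 'a)")  =>  ['(', '+', '1', "'", 'a', ')']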

For Scheme I doubt it's significantly easier to write the parser with a
parser-generator than to write it by hand.  Scheme is very, very, very
easy to parse -- and if it's not easy you aren't looking at it in the
right way (e.g., 'a is equivalent to (quote a), and the parser need not
distinguish between the two in its output).
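
A bare-bones reader for that token stream isn't much longer than the
tokenizer.  Something like this (again only a sketch I haven't run;
read_from and the strings-and-nested-lists representation are just
choices I made for illustration) builds the parse tree and expands the
quote shorthand as it goes:

    def parse(tokens):
        # Turn the flat token list into a nested-list parse tree.
        tree, rest = read_from(tokens)
        return tree

    def read_from(tokens):
        token, rest = tokens[0], tokens[1:]
        if token == '(':
            items = []
            while rest[0] != ')':
                item, rest = read_from(rest)
                items.append(item)
            return items, rest[1:]          # drop the closing ')'
        elif token == "'":
            quoted, rest = read_from(rest)
            return ['quote', quoted], rest  # 'a comes out as (quote a)
        else:
            return token, rest              # atom: left as a string

    # parse(tokenize("(define x '(1 2 3))"))
    #   =>  ['define', 'x', ['quote', ['1', '2', '3']]]

Copying the tail of the token list each time isn't efficient, but for a
toy interpreter it keeps the code obvious; an index into the list would
do just as well.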

  Ian




