Dynamically create classes for spark-parser-framework

Diez B. Roggisch deets_noospaam at web.de
Mon Dec 15 14:00:58 EST 2003


Hi,

I've got to create some spark-based parsers at runtime. For people not
familiar with spark, a parser looks like this:

class MixFixParser(spark.GenericParser):
    def __init__(self, start='p_start'):
        spark.GenericParser.__init__(self, start)

    def p_rules(_, args):
        """
        p_start ::= p_op p_start
        p_start ::=
        p_op    ::= p_start lbracket p_start rbracket
        """
        return args

The framework looks for methods whose names begin with p_ and inspects
their docstrings for grammar rules.

I expect the body of my p_rules to be uniform, so for now I'd like to have
one parser base class like MixFixParser above and then modify the docstring
accordingly - but only for one specific instance!

My first attempts failed with:

AttributeError: 'instancemethod' object attribute '__doc__' is read-only
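
For reference, a minimal Python 2-era sketch of what fails and what is
still writable (the bound method rejects the assignment, but the underlying
function object - which is shared by all instances - accepts it):

    p = MixFixParser()
    p.p_rules.__doc__ = "p_start ::="           # AttributeError, as above
    p.p_rules.im_func.__doc__ = "p_start ::="   # writable, but this changes
                                                # the function itself, and
                                                # with it every instance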

Of course I could create one big string with placeholders for the class
name, the docstring contents and maybe even the method body, and then
evaluate it, but that strikes me as inelegant - first of all, I lose emacs
indentation support for the code I write, and secondly I like the idea of
manipulating as little as possible - I don't _need_ distinct classes, only
instances with differing parser rules.
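
One way to keep real, indented Python and still vary the rules per instance
is sketched below. The helper name set_instance_rules is mine, and the
whole thing assumes spark discovers rules by looking at the instance. A
function's __doc__ *is* writable, so we build a fresh function per instance
and bind it to that one instance only:

    import types

    def set_instance_rules(inst, grammar):
        def p_rules(self, args):
            return args
        p_rules.__doc__ = grammar  # function docstrings are writable
        # Python 2 three-argument MethodType: bind to this single instance
        inst.p_rules = types.MethodType(p_rules, inst, inst.__class__)

If spark turns out to scan only the class, a throwaway subclass per
instance, e.g. type('RuleParser', (MixFixParser,), {'p_rules': p_rules}),
would still avoid exec'ing one big string.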

I'll also have to check whether spark allows a second or delayed call of
__init__, so that my altered rules get recognized - but I think that can be
made to work, come hell or high water.
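
Putting it together - again assuming, untested, that spark re-collects the
rules when __init__ runs a second time - the per-instance setup might look
like this, using the set_instance_rules helper sketched above:

    p = MixFixParser()
    set_instance_rules(p, """
        p_start ::= p_op p_start
        p_start ::=
        p_op    ::= p_start lbracket p_start rbracket
    """)
    spark.GenericParser.__init__(p, 'p_start')  # re-collect the rules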

Thanks for any suggestions,

Diez




