[Doc-SIG] A promise

Laurence Tratt laurie@eh.org
Mon, 27 Nov 2000 17:37:49 +0000


"Fred L. Drake, Jr." wrote:

[Laurie]
>> Bad idea. IMHO, the only way to create a professional tool is
>> to get it to use Python's parse trees ("import"ing the modules
>> is at best mickey mouse even if it solves one or two problems
>> that static analysis can't hope to) and that really means that 
>   I agree that parse-tree analysis is required; I don't see why this
> precludes making the tool a little more general.  If 1.5.2 support is
> required by the author of the tool, it's a requirement I won't argue
> with!

What one could do - at least a tool like Crystal supports this - is pluggable
parsers. In fact, thinking back, the default parser in Crystal was named CPython
specifically so that I could cater for the different interface in (what was
then) JPython.

However, when the grammar changes, other things tend to change too - the sorts
of changes that might well affect other parts of the docutils system. I may be
being a little too paranoid here... But then again, I wouldn't want to be the
poor git maintaining parsers for Python 1.5.2, 1.6, 2.0, Jython 1.0, Jython 1.1,
etc. That's just multiplying the number of chances for things to go seriously
wrong.

I'll put my neck on the line: long term, tracking the current Python release
(even if not using the built-in parser interface) is the way to go. In the short
term, you might get away with coping with multiple versions, but if, say, the
type/class dichotomy is resolved (did I see a PEP for that? Can't remember),
that might have ramifications beyond the grammar, probably ruining any easily
maintained multiple-version support.

> Supporting Jython will be harder, but also should not be too hard. 
> (This points to using the tokenize module instead of the parser module,
> as well.)

If you only use the tokenize module, you effectively have to write your own
grammar (whether for a parser system or implicitly in code)... It just seems
like another chance for things to go wrong, even if it does make the tool more
flexible. As the small size of the tokenize module suggests, tokenisation is
the easy part!
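A quick illustration of the point, using today's tokenize interface (the
details differed in 1.5.2-era Python, so treat this as a sketch): the module
hands you a flat token stream, and recovering "this is a function definition
containing a return statement" is left entirely to you.

```python
# tokenize gives you tokens, not structure: note there is no tree here,
# just a flat sequence of (token-type, text) pairs.
import io
import tokenize

source = "def f(x):\n    return x + 1\n"
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    print(tokenize.tok_name[tok.type], repr(tok.string))
```

Every grammatical fact - that `def` opens a suite, that the `return` belongs
to `f` - has to be reconstructed from this stream by your own code, which is
exactly the grammar-writing work the parse tree gives you for free.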


Laurie