Newbie design problem

Jonathan Gardner jgardner.jonathangardner.net at gmail.com
Mon Dec 17 13:17:59 EST 2007


On Dec 14, 8:02 am, MartinRineh... at gmail.com wrote:
>
> Lex is very crude. I've found that it takes about half a day to
> organize your token definitions and another half day to write a
> tokenizer by hand. What's the point of the second half-day's work?
>

As someone who has earned a BS in Physics, I have learned one powerful
truth: No matter how smart you are, you are not as smart as everyone
else.

See, physics has gotten as far as it has because people carefully
built on each other's work. They spend a great deal of time trying to
understand what everyone else is doing, and why they do it that way
and not another way, and very little time trying to outsmart each
other. The brightest bulbs in the physics community don't think they
are the brightest bulbs; they are just really good at understanding
everyone else and putting it all together. This is summed up in Isaac
Newton's remark that he saw farther because he stood on the shoulders
of giants.

The same applies to computer science. Either you can take a few days
to study how parsers and lexers really work, why they make your life
easier, and which implementations are worthwhile, or you can go off
and do things on your own and learn the hard way that everyone who
went before you was smarter than you think. Five months later, maybe
you will have made up the time you would have "wasted" by reading a
good book on formal languages, lexers, and parsers. At that point,
you will opt to use one of the existing libraries, perhaps even Bison
and Flex.
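
To give a rough idea of how little work an existing lexer generator
asks of you, here is a minimal sketch using PLY (Python Lex-Yacc), a
pure-Python cousin of Lex/Yacc. The toy token set below is just
something I made up for illustration, not anything from the original
thread, and it assumes the third-party ply package is installed.

    # Minimal PLY lexer sketch for a toy arithmetic language.
    # Requires the third-party "ply" package.
    import ply.lex as lex

    # Token names PLY will expose; chosen arbitrarily for illustration.
    tokens = ('NUMBER', 'PLUS', 'MINUS', 'TIMES', 'DIVIDE')

    # Simple tokens are plain regular expressions.
    t_PLUS   = r'\+'
    t_MINUS  = r'-'
    t_TIMES  = r'\*'
    t_DIVIDE = r'/'

    # Tokens needing processing get a function; the docstring is the regex.
    def t_NUMBER(t):
        r'\d+'
        t.value = int(t.value)
        return t

    # Skip spaces and tabs between tokens.
    t_ignore = ' \t'

    # Report and skip anything the rules above don't match.
    def t_error(t):
        print("Illegal character %r" % t.value[0])
        t.lexer.skip(1)

    lexer = lex.lex()
    lexer.input("3 + 4 * 10 - 2")
    for tok in lexer:
        print(tok.type, tok.value)

That is the whole lexer. The parsing half (ply.yacc) takes roughly
the same amount of ceremony, which is the point: the library does the
tedious part for you.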

It's your time that is at stake, man. Don't waste it trying to
reinvent the wheel, even if you think you need an upside-down one.
