What's better about Ruby than Python?
Kenny Tilton
ktilton at nyc.rr.com
Thu Aug 21 16:14:38 EDT 2003
Andrew Dalke wrote:
> Kenny Tilton
>
>>C? does something similar to what you think, but with an order of
>>magnitude more power. Estimated. :) Here is how C? can be used:
>>
>>[.. some lisp example ...]
>
>
> You have got to stop assuming that a description in Lisp is
> intuitive to me. I don't know anywhere near enough of that
> language to know what's normal vs. what's novel.
Don't feel bad, lispniks are scared by (c? ...). It does Something
Completely Different. Well, Garnet and COSI and other constraint
programming tools also do it, but a lot of people don't know about
those. I'd translate it to Python if I remembered more from my brief
exploration a while back.
>
>
>>So I do not have to drop out of the work at hand to put /somewhere else/
>>a top-level function which also caches,
>
>
> Didn't you have to "drop out of the work at hand" to make the macro?
C? is like your cacheCall: it works for anything. But I just toss it
in inline, around any arbitrary Lisp expression. I think you had to set
up a class with a method and a cache slot before even making the
cacheCall call. Too messy.
Can a Python first-class function close over a lexical variable and use
that as a cache across multiple calls? That is another Lisp feature I am
pressing into action.
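(To show what I mean, here is a rough sketch of that pattern in Python;
the names are made up, and note that rebinding the closed-over name
itself needed workarounds until `nonlocal` arrived in Python 3, so the
sketch mutates a dict instead:)

```python
# Sketch of the closure-over-a-cache pattern (hypothetical names).
# The inner function closes over `cache`, a lexical variable that
# persists across calls to `doubled`.
def make_doubler():
    cache = {}                      # shared by every call to doubled
    def doubled(n):
        if n not in cache:
            cache[n] = 2 * n        # stand-in for an expensive computation
        return cache[n]
    return doubled

doubled = make_doubler()
doubled(21)    # computes and stores the result
doubled(21)    # answered from the closed-over cache
```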
>
> I tried to understand the lisp but got lost with all the 'defun' vs
> 'function' vs 'funcall' vs 'defmodel' vs 'defclass' vs 'destructuring-bind'
> I know Lisp is a very nuanced language. I just don't understand
> all the subtleties. (And since Python works for what I do, I
> don't really see the need to understand those nuances.)
Build it and they will come. Maybe if you used a macro-powered language
and saw where macros improve code, they would become second nature...
>
>
>>Cool. But call cachedCall "memoize". :) Maybe the difference is that you
>>are caching a specific computation of 3, while my macro in a sense
>>caches the computation of arbitrary code by writing the necessary
>>plumbing at compile time, so I do not have to drop my train of thought
>>(and scatter my code all over the place).
>
>
> Sure, I'll call it memoize, but I don't see why that's to be preferred.
I think it's just industry practice (since I saw it today in a Perl
context <g>).
> The code caches the result of calling a given function, which could
> compute 3 or could compute bessel functions or could compute
> anything else. I don't see how that's any different than what your
> code does.
Because I do not have to set anything up like you did with the class and
the cache and the slot and the call to cacheCall. And if I did, a single
form such as:
(defmemoized count (n) (* 2 n))
...would suffice. In my case I can even inline, since I am immediately
dumping the outcome into an instance's slot where cooperating code
(written by the defmodel macro) makes it work.
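(For comparison, the closest Python analogue to a single defmemoized
form is probably a decorator that writes the caching plumbing once;
this is a hedged sketch with invented names, not anyone's actual
library:)

```python
# Hypothetical Python counterpart to (defmemoized count (n) (* 2 n)):
# the decorator supplies the cache plumbing once, so each definition
# stays a single form plus the @memoized tag.
def memoized(fn):
    cache = {}
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoized
def count(n):
    return 2 * n    # mirrors (* 2 n)
```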
>
> And I still don't see how your macro solution affects the train
> of thought any less than my class-based one.
Memoization does not come up that much, though Graham still saw fit to
create a macro for it. But my hack gets used hundreds of times
throughout a sizeable app. I dash them off inline without missing a
beat, and when I change C? they all keep working.
>
>
>>That is where Lisp macros step up--they are just one way code is treated
>>as data, albeit at compile time instead of the usual runtime
>>consideration.
>
> The distinction between run-time and compile time use of code is
> rarely important to me. Suppose the hardware was 'infinitely' fast
> (that is, fast enough that whatever you coded could be done in
> within your deadline). On that machine, there's little need for
> the extra efficiencies of code transformation at compile time. But
> there still is a need for human readablity, maintainability, and code
> sharing.
This has nothing to do with shifting computation from runtime to compile
time, though I have seen people cite that as one advantage. I agree with
you that that is negligible. The key thing about being there at compile
time is that my code can write code for me, driven by the very small
amount of code that varies from case to case. That makes my code more
readable than yours, because you are cutting and pasting all that
implementation cruft over and over again, and the varying essence does
not stand out as well.
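(A rough Python parallel, to make the point concrete. defmodel itself
is not shown in this thread, so the names and shape below are invented:
a class decorator plays the code-writing-code role, generating the
cached slot and accessor plumbing so that only the varying rule
appears per class:)

```python
# Invented sketch: "code writing code" via a class decorator. The
# decorator generates a caching property for each entry in `rules`;
# the class author writes only the one varying expression.
def modeled(cls):
    for name, rule in cls.rules.items():
        def make_prop(name=name, rule=rule):
            slot = '_' + name
            def get(self):
                if not hasattr(self, slot):           # compute once,
                    setattr(self, slot, rule(self))   # cache in a slot
                return getattr(self, slot)
            return property(get)
        setattr(cls, name, make_prop())
    return cls

@modeled
class Box:
    rules = {'area': lambda self: self.width * self.height}
    def __init__(self, w, h):
        self.width, self.height = w, h
```

Box(3, 4).area computes once and thereafter reads the cached slot;
every class tagged @modeled gets the same plumbing without
cut-and-paste.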
--
kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker