Inheriting the @ sign from Ruby

Roy Katz katz at Glue.umd.edu
Tue Dec 12 10:50:46 EST 2000


On Tue, 12 Dec 2000, Alex Martelli wrote:

> 'self' is not "pasted in front of a name" -- 'self' is an object,
> and it is used with perfectly regular syntax for object access.

Yes, we understand this.  Passing 'self' (or whatever people choose to
call it) to an instance method is automatic upon invocation, and manual
for anything else.  So we have:

   class X:
      def __init__( self, c ):
          @c = c     # @c binds to 'self'.c
      def example( self ):  X.__init__( self, 0 )      # 'self' passed by hand
      def same_example( self ): X.__init__( self, 0 )
      def another_example( myself, y ): @x = y     # binds to 'myself'.x
   

To me, grammatically and practically, self *is* pasted in as the first
argument.  I'm raising a syntactic question: whether or not such a
shorthand would be practical.
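For comparison, here is how the same class reads with today's explicit
syntax (just a sketch; the attribute names are only for illustration):

   class X:
      def __init__( self, c ):
          self.c = c                # the first parameter is spelled out
      def another_example( myself, y ):
          myself.x = y              # any name works for the first parameter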

> So, you would like '@identifier' to be a special-case
> syntax meaning exactly the same as 'self.identifier' (if
> the first argument, as per normal convention, is named
> 'self').

I would like @identifier to denote <self-object>.identifier, where
<self-object> is the first parameter of the method, regardless of whether
it is named 'self' or 'myself'; the 'def another_example(myself, y)' above
was just an example.


> such a violation of
> normal expectations will make your code harder to read
> and understand by any maintainers.

Why should there be any violation if all instance identifiers are labelled
with an @?  To be honest, I believe the @identifier convention would
decrease the number of variant programming styles.  We all want 'self' to
be used (instead of any other name).  Fine!  The @identifier scheme would
at least reduce that variation.  You wouldn't see 'myself' (or some other
deviant) pasted in front of every instance variable; you'd see a common @
sign.  So for reading someone else's code, that's a good thing.

> > What faults or strengths do you see with this?
> 
> I see no strength in this supplementary syntax form
> offering a second, different way to express what can
> be perfectly well expressed explicitly today.

Isn't the shorthand itself a strength?

> 
> I see it as a fault that all this does is offer a
> second, different way to express what can be perfectly
> well expressed explicitly today.

Setting aside the freedom to call the first parameter of a method whatever
you like, wouldn't this reduce the number of competing conventions?
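(A small sketch of what is legal today -- the names are made up -- showing
the freedom that leads to those competing conventions:)

   class Greeter:
      def greet( this, name ):        # 'this' instead of 'self'
          this.last_greeted = name
          return 'hello, ' + name

   g = Greeter()
   print g.greet( 'world' )           # 'this' is bound to g automatically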

> It would also be a (small) step along the road of

I disagree; as I mentioned above, this would enforce
standardization.

> turning the currently crystal-clear syntax of Python
> into that morass of special-purpose, ad-hoc, 'convenient'
> kludges that make certain other languages' typical
> programs look so peculiarly similar to line noise.

Python's syntax is not crystal-clear, IMHO.  If we're so tough on
Perl for having a kludgy interface, why do we have {} for
defining dictionaries instead of [ key:value ]?  Why do we have to
explain to new Python programmers how the label-value system Python uses
is different from traditional call-by-reference and
call-by-value?  (I've been programming in Python for three years now
and I understand only that passing an int or float is call-by-value,
otherwise it is call-by-reference.  At the risk of naivete, I would
like a simple & referencing operator so that I can explicitly,
finally, have control over this.)  When would I ever use print>>?
Isn't there a less ugly way of generating a raw string than placing
an 'r' in front of it?
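
For concreteness, here is a small sketch of the constructs in question
(Python 2.0 syntax; the names and values are arbitrary).  The two helper
functions illustrate the name-binding behaviour that has to be explained:
rebinding a parameter name has no effect on the caller, while mutating a
shared object does.

   import sys

   d = { 'spam': 1, 'eggs': 2 }       # dictionaries use {}, not [ key:value ]

   def rebind( n ):
       n = 99                         # rebinds the local name only
   def mutate( items ):
       items.append( 99 )             # mutates the shared list object

   i = 1
   lst = [ 1, 2, 3 ]
   rebind( i )
   mutate( lst )
   print i, lst                       # -> 1 [1, 2, 3, 99]

   print >> sys.stderr, 'a warning'   # print>> redirects output (new in 2.0)

   path = r'C:\new\dir'               # raw string: backslashes left alone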

It's good that you mentioned the language deteriorating.  IMHO, judging
from the above, it has seemed like this from the start.  (It's still my
favorite language.)




But that's just my opinion. 

Roey. 





