PyProtocols, Components, and Inheritance

Terry Hancock hancock at anansispaceworks.com
Fri Feb 17 00:53:51 EST 2006


I've been discussing PyProtocols with a friend who is
collaborating with me on an SF game project, specifically
the benefits and design concept of "component architecture",
and I'm a little confused by what I'm learning.

I learned the concepts of "component architecture" from
the Zope project, where, if I'm not mistaken, "components"
are essentially python "mix-in" classes: you make a
class from components by using multiple inheritance, and
you document this with an interface declaration.

So in this model, I have (let's say):

Interfaces: IVehicle, IAnimal
	these are special (non-class) objects,
	even though their declarations look like class
	declarations

Components: Vehicle, Animal (each provide these interfaces)
	these are just ordinary python classes

Then I can make a class by combining components:

class Horse(Animal, Vehicle):
	implements(IAnimal, IVehicle)

(This is an interactive fiction engine, so it would be
reasonable to think of a horse both as an animal and
as a means of transportation -- but it's just an example).
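
To make that concrete, here's roughly what I have in mind --
a minimal sketch in the zope.interface style, with the interface
methods (feed, ride) and their bodies invented purely for the
sake of example:

from zope.interface import Interface, implements

class IAnimal(Interface):
	def feed(food):
		"""Give the animal something to eat."""

class IVehicle(Interface):
	def ride(destination):
		"""Carry a rider to the destination."""

class Animal(object):
	# ordinary python class providing the IAnimal behaviour
	def feed(self, food):
		print "munch, munch"

class Vehicle(object):
	# ordinary python class providing the IVehicle behaviour
	def ride(self, destination):
		print "trotting off to", destination

class Horse(Animal, Vehicle):
	implements(IAnimal, IVehicle)

That is, Horse is just an ordinary class built by multiple
inheritance; the implements() call only records which interfaces
it claims to provide.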

However, this is apparently not how PyProtocols works.

The PP way apparently looks something like this:

Components: apparently special (non-class?) objects (just like
the interfaces?)

So I need something like:

class Vehicle(Component):
	#... implementation snipped
	pass

class Animal(Component):
	#... implementation snipped
	pass

then I use it by making a class:

class Horse(object):
	pass

and only then do I "attach" components:

Horse.addComponent(Vehicle)
Horse.addComponent(Animal)
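
To be clear about what I *imagine* addComponent doing -- and this
is purely my guess at the mechanism, not anything I've found in
PyProtocols itself -- something like grafting the component's
methods onto the class after the fact, written here as a plain
function just to show the idea:

def addComponent(cls, component):
	# hypothetical helper: copy the component's methods onto the
	# target class after the fact, instead of having the class
	# inherit from it
	for name, value in vars(component).items():
		if not name.startswith('_'):
			setattr(cls, name, value)

addComponent(Horse, Vehicle)
addComponent(Horse, Animal)

which, as far as I can tell, gets you essentially the same result
as the multiple-inheritance spelling above, just without a proper
MRO.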

To me, this looks like madness. Aren't we just reinventing
multiple inheritance with obtuse syntax?  Why is multiple
inheritance such a bad thing? (I can see that it can be
abused, but ISTM that the real trouble comes not from broad,
shallow multiple inheritance as in the component model, but
from the deep inheritance chains one is forced into in
single-inheritance languages.)

Anyway, my question is: WHY? I know PyProtocols is
fairly popular, so there must be some benefit to this,
but I'd really like to hear what that benefit is supposed to be.

My friend seems to think there is a great evil in
doing it with multiple inheritance, yet the PyProtocols
way seems rather complex to me. I had erroneously
assumed that the PyProtocols and Zope component models
were more-or-less equivalent, with PP perhaps being
a bit more complete (for example, PP's interface module is
apparently a superset of Zope's).

I realize that at some level we're just having a "cultural
conflict" since we learned different versions of this
concept, but I would like to make a real attempt to
understand where he's coming from.  (And yes, of course
I've asked him, but since he only knows the PP way of
doing this, I don't think he's able to tell me -- I guess
we need an interpreter. ;-) ).

I feel like there's something going on here that
I just don't "get".

Is there someone out there who can give me a PyProtocols
advocate's PoV on this? Can you tell me why PP's approach is
"better" than Zope's? (Or perhaps what you think it is
more appropriate for?)  Anyway, c.l.python seemed like the
(neutral) place to ask, since this compares two major
python packages.

Cheers,
Terry



-- 
Terry Hancock (hancock at AnansiSpaceworks.com)
Anansi Spaceworks http://www.AnansiSpaceworks.com



