UI Design, XUL, Blender

Terry Hancock hancock at anansispaceworks.com
Sat Oct 22 13:54:01 EDT 2005


Newcomers to Blender (the 3D modelling/animation program)
often find its rather unusual UI a bit off-putting,
but on closer inspection, I find it's a very compelling
design for "power users" (i.e. professionals who need to
use a given program on a daily basis, and who are therefore
willing to make the effort to learn the specific interface).
It is much better than either a command line interface
or a more conventional GUI, for that purpose, IMHO.

Unfortunately, Blender doesn't really follow a Model-View-
Controller design, so I'm not sure how separable the
interface is from the rest of the program (I'm asking that
question elsewhere).  My question here is just how unique
that interface really is.  Could it be implemented with
"standard" GUI toolkits (using complex widgets and
customizing button appearance, for example)?

The main things I notice as a user are that:

* The buttons are smaller and use iconic graphics, so you
  can access more controls at once.

* Extensive use of "tabs" allows control palettes to be
  brought up or expanded, facilitating highly hierarchical
  palettes.

* Widgets are color-coded by function and fall into
  several important categories:

  o ON/OFF (Bistate)
  o ON/OFF/FLOATING (Tristate)
  o Numerical data-entry widgets, which act simultaneously
    as sliders and text-entry fields

* The main thing is that these "buttons" actually have
  fairly complicated behavior, acting as sliders, text-entry
  fields, and select boxes simultaneously (see the sketch
  further down).

(I may be missing things that would be more apparent to GUI
designers, though, so I'd be even more interested in a reply
from someone who's actually seen the Blender UI themselves.)
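
To check my own understanding of that last point, here's a
very rough sketch of such a "drag or type" numeric button.
I've used Tkinter only because it ships with Python; wxPython
or PyGTK should be able to do the same thing with their own
event bindings.  The widget name and the drag-versus-click
behavior are my own approximation of what Blender does, not
anything taken from its code:

try:
    import tkinter as tk           # Python 3
except ImportError:
    import Tkinter as tk           # Python 2

class DragNumber(tk.Frame):
    """Numeric button: drag horizontally to change the value,
    click without dragging to type a value directly."""

    def __init__(self, master, value=0.0, step=0.01):
        tk.Frame.__init__(self, master)
        self.value = value
        self.step = step
        self._drag_start = None
        self._dragged = False
        self.label = tk.Label(self, relief="raised", width=12)
        self.entry = tk.Entry(self, width=12)
        self.label.pack(fill="both", expand=True)
        self.label.bind("<ButtonPress-1>", self._press)
        self.label.bind("<B1-Motion>", self._drag)
        self.label.bind("<ButtonRelease-1>", self._release)
        self.entry.bind("<Return>", self._commit)
        self.entry.bind("<FocusOut>", self._commit)
        self._refresh()

    def _refresh(self):
        self.label.config(text="%.3f" % self.value)

    def _press(self, event):
        self._drag_start = (event.x, self.value)
        self._dragged = False

    def _drag(self, event):
        if self._drag_start is None:
            return
        x0, v0 = self._drag_start
        dx = event.x - x0
        if abs(dx) > 2:            # treat tiny movements as a click
            self._dragged = True
            self.value = v0 + dx * self.step
            self._refresh()

    def _release(self, event):
        if not self._dragged:      # plain click: switch to text entry
            self.label.pack_forget()
            self.entry.delete(0, "end")
            self.entry.insert(0, "%.3f" % self.value)
            self.entry.pack(fill="both", expand=True)
            self.entry.focus_set()

    def _commit(self, event):
        try:
            self.value = float(self.entry.get())
        except ValueError:
            pass                   # keep the old value on bad input
        self.entry.pack_forget()
        self.label.pack(fill="both", expand=True)
        self._refresh()

if __name__ == "__main__":
    root = tk.Tk()
    DragNumber(root, value=1.0).pack(padx=20, pady=20)
    root.mainloop()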

I'm wondering whether this whole style of interface could be
achieved in a Python program using wxPython, PyGTK, or
another popular, cross-platform GUI toolkit.  Also, is a GUI
specification
language like XUL capable of expressing this kind of 
interface so that it could be made functional on multiple
GUI implementations?

I'm trying to compare two alternatives:

1) Figure out how to mimic the Blender GUI in a more
   conventional, separable GUI toolkit, so that an MVC
   design can be more easily used.  (This leaves a fairly
   complicated "View" component, but there are other quite
   advanced 3D visualization components available.)

2) Use Blender itself, using its Python scripting facility
   to wedge an MVC design into it (probably by hacking
   Blender to create a bridge between Blender's internal,
   C-struct-based model and an external model, probably
   represented in an object database component such as
   ZODB).  There's a rough sketch of this bridge below.

in order to figure out which would be easier.
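
To give a flavor of what I mean by (2), here's a very rough
sketch of the bridge: a script run from inside Blender that
copies a few fields out of Blender's internal objects into an
external ZODB database.  The Blender calls are from the 2.3x
Python API and the ZODB calls are from memory, so treat the
exact names as assumptions on my part; it also presumes ZODB
is importable from Blender's embedded Python:

import Blender                          # only available inside Blender
from ZODB.FileStorage import FileStorage
from ZODB.DB import DB
import transaction

# Open (or create) the external object database that plays
# the role of the MVC "Model".
storage = FileStorage("scene-model.fs")
db = DB(storage)
conn = db.open()
root = conn.root()

# Copy a minimal snapshot of Blender's internal (C-struct) model.
snapshot = {}
for obj in Blender.Object.Get():        # every object in the .blend
    snapshot[obj.getName()] = {
        "type": obj.getType(),
        "location": tuple(obj.getLocation()),
    }

root["scene"] = snapshot                # hand the data to the external model
transaction.commit()
conn.close()
db.close()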

Any more-informed comments from people who know the various
GUI packages would be a big help.  I assume it goes without
saying that I'm looking at Python as the integration
language.  In my ideal design, the M, V, and C components
are separate Python modules, so that all of the
communication between them happens at the Python level.
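
To make that concrete, the shape I'm imagining is roughly the
following (all names invented, just to show where the module
boundaries would sit):

# model.py: pure data, no GUI imports; notifies listeners on change.
class SceneModel:
    def __init__(self):
        self.objects = {}               # object name -> property dict
        self._listeners = []

    def subscribe(self, callback):
        self._listeners.append(callback)

    def set_property(self, name, key, value):
        self.objects.setdefault(name, {})[key] = value
        for callback in self._listeners:
            callback(name, key, value)

# controller.py: translates UI events into model calls.
class SceneController:
    def __init__(self, model):
        self.model = model

    def on_slider_changed(self, obj_name, value):
        self.model.set_property(obj_name, "scale", value)

# view.py: whichever toolkit ends up drawing things; it only
# needs to subscribe to the model and redraw when told.
class ConsoleView:
    def __init__(self, model):
        model.subscribe(self.refresh)

    def refresh(self, name, key, value):
        print("redraw %s: %s = %r" % (name, key, value))

if __name__ == "__main__":
    model = SceneModel()
    view = ConsoleView(model)
    controller = SceneController(model)
    controller.on_slider_changed("Cube", 2.0)   # the view prints a redraw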

Cheers,
Terry

--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks  http://www.anansispaceworks.com



