[Edu-sig] Interactive learning: Twenty years later

Mike C. Fletcher mcfletch@rogers.com
Sun, 29 Jun 2003 13:47:33 -0400


Terry Hancock wrote:
...

>I think the question about the "chord player" (never heard it called that,
>but then I'm probably looking at a derivative concept) is whether it is
>harder to learn to type using it or to learn how to type on a standard
>keyboard one-handed, which I do quite a lot (anybody who does a
>lot of CAD drawing will wind up doing this; I also do it a lot while
>programming).
>
...
There has been an enormous amount of research in the HCI literature
into multi-modal/multi-channel interface design.  To see *why* it's
such a huge win to have more channels for communication, take Dragon
Dictate and use it to dictate commands within AutoCAD.  You find
yourself becoming more productive: your pen stays right where you're
drawing, your other hand is basically free, and your eyes never *need*
to leave the particular spot on which you're concentrating.

A chording keyboard tries to increase the communications bandwidth
between user and computer by adding yet more functionality to the
(already overworked) hand "channels".  It's possible to reach greater
bandwidth with such a system, but that takes considerable training, and
until then the channel actually carries *less* than it did before.  Eye
tracking, voice dictation, and the like leverage as-yet-unused channels,
so they (generally) don't impact the existing ones, and tend to provide
greater facility immediately.

BTW, the Navy's "Put That There" research was fascinating: it combined
eye tracking, voice recognition, and logic for tracking the user's
current and previous conversational focus (another bandwidth-increasing
mechanism, "context compression").  It was aimed at very minimalist
applications (command-and-control interfaces), but it gave you some idea
of what could be done with enough channels open for communication.

>I'm looking forward to trying out an actual tablet -- I remember
>wanting one of these before even mice were commercially
>available, and they're actually pretty affordable nowadays.
>
Recommended.  I've been using them for years now; mice seem ridiculous
when I'm forced to use them.  Using sense-memory of where a particular
scrollbar/button/menu sits on the screen, so that you don't have to look
at where you're pointing, can dramatically speed up your computing
experience (and it doesn't interrupt your cognitive processes with a
slow feedback loop for targeting the control).

>Another thing that really fascinates me is the 6-axis input devices
>like the "Space Ball" or "Space Puck" that came out a few years
>back. These are a really cool concept.  I was actually trying to
>design something like this (hard) before I found them commercially
>available!  Basically it's an object which you hold and can move
>through all 6 axes (XYZ/PYR) of translation and rotation.  It'd be
>great for the "gripping hand"  in a VR environment, or with a variable
>end-effector trigger as a robotic teleoperation control (really it
>would then have 7 axes).  The downside is, I think they start at
>about $800 and I haven't got one.  I was able to find enough
>information to write a driver for Linux though (and one may
>already exist).  Someday maybe I'll try to find them again, if
>they're still available. :-P
>
They're okay, but having worked in VR for 3 years, and studied the
field for quite a while before that, the only time I ever found them
*useful* (as distinct from cool) was in driving a VR camera around a VR
play's set (with specialised software just for that task, and all other
tasks mapped to a keyboard-driven interface so they wouldn't require the
pointer).  For modelling, it's simply pointless to take your hand off
the tablet or keyboard just to reach for the (honestly rather clumsy)
thing.

Generalised VR and modelling apps are doing a lot more than simple 3D
manipulation with the "pointer", and the spaceball implementations just
aren't particularly good for regular mousing about in my experience (too
joystick-like).  I can imagine ways you could set up a console such that
you could use sense-memory to get to the spaceball, but it just doesn't
seem that critically important given the rather well-thought-out axis
manipulator mechanisms you see in Maya (or 3DSMax, though they are
clumsier in Max4), and the relatively minimal benefit the spaceball (in
its current, rather crude, forms) has for precision modelling.
Specialised applications seem to be where it'll stay until a compelling
general use is developed.
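
Incidentally, if you do go looking for one again: under Linux, if a
kernel driver already exposes the device through the event interface,
you don't need to write your own driver just to experiment; a few lines
of Python will dump the axis stream.  This is a rough sketch only -- the
device node and the axis-to-event-code mapping are assumptions that
depend on the actual driver:

    import struct

    # struct input_event: timeval (sec, usec), type, code, value
    # (assumes native 32-bit longs in the timestamp)
    EVENT_FORMAT = 'llHHi'
    EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

    EV_REL = 0x02  # relative-motion events
    AXES = ['X', 'Y', 'Z', 'RX', 'RY', 'RZ']  # 3 translation + 3 rotation

    # Hypothetical device node; check /proc/bus/input/devices for yours.
    device = open('/dev/input/event2', 'rb')
    while 1:
        data = device.read(EVENT_SIZE)
        if not data:
            break
        sec, usec, etype, code, value = struct.unpack(EVENT_FORMAT, data)
        if etype == EV_REL and code < len(AXES):
            print 'axis %s moved by %d' % (AXES[code], value)

Wrap that in a thread feeding a queue and you've got the beginnings of a
usable 6-axis input layer to play with.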

Enjoy,
Mike

_______________________________________
  Mike C. Fletcher
  Designer, VR Plumber, Coder
  http://members.rogers.com/mcfletch/