Text input with keyboard, via input methods

Rustom Mody rustompmody at gmail.com
Thu Mar 10 06:59:17 EST 2016


On Thursday, March 10, 2016 at 4:21:15 PM UTC+5:30, Marko Rauhamaa wrote:
> Ben Finney :
> 
> > As for how solved it is, that depends on what you're hoping for as a
> > solution.
> >
> > [...]
> >
> > Hopefully your operating system has a good input method system, with
> > many input methods available to choose from. May you find a decent
> > default there.
> 
> I don't have an answer. I have requirements, though:
> 
>  * I should be able to get the character by knowing its glyph (shape).
> 
>  * It should be very low-level and work system-wide, preferably over the
>    network (I'm typing this over the network).
> 
> The solution may require a touch screen and a canvas where I can draw
> with my fingers.
> 
> The solution may have to be implemented in the keyboard.
> 
> Or maybe we'll have to wait for brain-implantable bluetooth transceivers.
> Then, we'd just think of the character and it would appear on the
> screen.

Let's say you wrote or participated in a million-line C/C++ codebase.
And let's say you would/could redo it in a super-duper language 'L' (which could, but need not, be Python).
Even if you believe in a 1000-fold improvement going C → L, you'd still need
to input a thousand lines of L-code.

How would you do it? With character recognition?

OTOH…

I am ready to bet that on your keyboard, whether a US-104 or something more
exotic:

- There is a key marked with something that looks like 'A'
- Pounding that 'A' key produces something that looks like 'a'
- And to get an 'A' from that 'A' key you need to do SHIFT-A

IOW it's easy to forget that typing ASCII on a US-104 still needs input methods.

It's just that these need to become more reified/first-class in going from the
<100 chars of ASCII to the million+ of Unicode.
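
To make that concrete, here is a toy Python sketch -- not any real input-method
API; the tables and names are mine, purely for illustration. It models the
US-104 case as a mapping from (keycap, shift) to character, and a Unicode input
method as the same idea made first-class: a bigger, user-visible table, here in
the style of a compose-sequence table:

    # Toy model: even plain ASCII typing goes through a mapping from
    # physical key + modifier to character.
    us_104 = {
        ('A', False): 'a',   # pounding the 'A' key gives 'a'
        ('A', True):  'A',   # SHIFT-A gives 'A'
        ('1', False): '1',
        ('1', True):  '!',
    }

    # A Unicode input method is the same idea scaled up and made
    # first-class -- here a compose-style sequence table.
    compose = {
        'lambda': '\u03bb',   # λ
        '->':     '\u2192',   # →
        'forall': '\u2200',   # ∀
    }

    def type_key(keycap, shift=False):
        return us_104[(keycap, shift)]

    print(type_key('A'), type_key('A', shift=True))   # a A
    print(compose['lambda'], compose['->'])           # λ →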

Or if I may invoke programmer-intuition:

a. If one had to store a dozen values, a dozen variables would be OK
b. For a thousand, we'd like a -- maybe simple -- data structure like an array/dict
c. For a million (or billion) the data structure would need to be sophisticated
   (a toy sketch of this progression follows the list)
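
As that toy sketch (the table, names, and the choice of SQLite are mine, purely
for illustration) -- note that (c) is also roughly the shape of the Unicode
problem: finding one character among a million by some property such as its name:

    import sqlite3
    import unicodedata

    # (a) a dozen values: plain variables are fine
    alpha, beta, gamma = 1, 2, 3

    # (b) a thousand values: a simple dict/array does the job
    squares = {n: n * n for n in range(1000)}

    # (c) a million values: you want real structure -- indexes, queries,
    # lookup along more than one axis.  An in-memory SQLite table stands
    # in for "a sophisticated data structure"; a small Greek sample
    # stands in for the full million+ Unicode repertoire.
    db = sqlite3.connect(':memory:')
    db.execute('CREATE TABLE chars (codepoint INTEGER PRIMARY KEY, name TEXT)')
    db.execute('CREATE INDEX by_name ON chars (name)')
    for cp in range(0x0370, 0x0400):
        name = unicodedata.name(chr(cp), None)
        if name:
            db.execute('INSERT INTO chars VALUES (?, ?)', (cp, name))

    # Look a character up by (part of) its name rather than by codepoint
    print(db.execute(
        "SELECT codepoint, name FROM chars WHERE name LIKE '%LAMDA%'"
    ).fetchall())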

The problem with Unicode is not that 100,000+ is a large number, but that
we are applying the (a) paradigm to (c)-scale needs.

Some of my -- admittedly faltering -- attempts to correct this:

http://blog.languager.org/2015/01/unicode-and-universe.html
http://blog.languager.org/2015/03/whimsical-unicode.html
http://blog.languager.org/2015/02/universal-unicode.html


