[ExI] wii controllers

Kelly Anderson kellycoinguy at gmail.com
Tue Jul 5 17:17:49 UTC 2011


On Mon, Jul 4, 2011 at 4:47 PM, Adrian Tymes <atymes at gmail.com> wrote:
> On Tue, Jun 28, 2011 at 11:43 AM, Kelly Anderson <kellycoinguy at gmail.com> wrote:
>> The Kinect is the next mouse. It won't replace the mouse any more than
>> the mouse replaced the keyboard, but I think it will have just as much
>> impact, and not just in games. What I'm really excited about is
>> changing the focal length, and tweaking the recognition, so that the
>> Kinect will work on a person sitting at their desk from, say, the top
>> of their laptop. Then you can recognize individual finger joint
>> positions, facial expressions, and of course gross movements. That's
>> going to be really big. There is absolutely zero reason that this
>> can't and won't be the case in three years.
>
> I can.  The mouse came with software and applications to do what
> could not previously be done as easily, such as spreadsheets (and the
> entire concept of a graphical operating system).  The Kinect's
> suggested applications, so far as I've seen, are not that revolutionary.

I suggest that they will be at least evolutionary. They may remain
niche, like today's use of voice recognition. Or a computer that can
read your emotional state and respond appropriately may prove so
useful that everyone wants it.

> It's the applications that drive hardware.  What task, that most people
> do (or would do if it were practical), does the Kinect enable?  What
> can finger joint positions, facial expressions, and gross movements
> communicate to a computer that cannot be communicated as
> readily through point and click?

Body language is widely understood to communicate more than voice
alone. Why shouldn't computers be able to read it? I grant that the
jump from today's Kinect to genuinely reading body language is still
somewhat speculative, but the technology leads in that direction.

> The only applications I can think of are niche (emotional context,
> which requires software to interpret and make use of that), not
> attributable to this ("this will make communication so much more
> intuitive" - not without other improvements that could as easily be
> piled on point and click too), ignorant of reality (see "gorilla arm" for
> why touch screens didn't catch on more widely), or outright false
> (such as assuming it is easier to master 100 hand signals than
> moving a cursor to the right point on a 10*10 grid and clicking).

To be successful, the computer must adjust to the human, not vice
versa. While I agree that it takes an immense amount of imagination to
see the kinds of applications that are enabled by Kinect and similar
technologies, I respectfully submit that you will be amazed,
eventually. :-)

We aren't at the 1984 point yet. Remember, the mouse was invented
years earlier, by Douglas Engelbart at SRI in the 1960s, and refined
at Xerox PARC in the 1970s. It took Steve Jobs and the boys at Apple
to take that technology and finally apply it properly in the
Macintosh. We may have six years or so before we see the same level
of competence in natural user interfaces (NUIs). But trust me, it
will come. Hopefully, not just in niche markets.

My niche market is people who don't "do" computers, who need help,
and who might lose a remote or wireless keyboard. I want to help
elderly people continue to live with dignity in an independent
setting. While it is a niche market, the Kinect enables what I'm
trying to accomplish very well.
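
To make that concrete, here is a rough sketch of the kind of
no-remote interaction I mean: watch the skeleton joint positions a
Kinect-style sensor reports each frame, and when a hand is held above
the head for about a second, call for help. This is plain Python with
made-up joint names, thresholds, and fake data -- my own assumptions
for illustration, not any particular SDK's API.

# Hypothetical sketch: detect a "hand held above the head" gesture
# from per-frame skeleton joint positions (joint -> (x, y, z), with y
# pointing up). Joint names and coordinates are invented for this
# example; a real sensor SDK would supply them each frame.

HOLD_FRAMES = 30  # roughly one second at 30 frames per second

def hand_above_head(joints):
    """True if either hand is higher than the head in this frame."""
    head_y = joints["head"][1]
    return (joints["left_hand"][1] > head_y or
            joints["right_hand"][1] > head_y)

def watch_for_help_gesture(frames, on_help):
    """Fire on_help() once the gesture has been held for HOLD_FRAMES
    consecutive frames, then reset and keep watching."""
    held = 0
    for joints in frames:
        held = held + 1 if hand_above_head(joints) else 0
        if held >= HOLD_FRAMES:
            on_help()
            held = 0

# Fake data: 30 frames with the right hand raised above the head.
frames = [{"head": (0.0, 1.6, 2.0),
           "left_hand": (-0.3, 1.0, 2.0),
           "right_hand": (0.3, 1.9, 2.0)}] * 30
watch_for_help_gesture(frames, lambda: print("calling for help"))

The reason for requiring the pose to be held is to avoid triggering
on ordinary movement; that kind of tuning matters more for this
audience than raw recognition accuracy.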

-Kelly



