[ExI] Google’s Go Victory Is Just a Glimpse of How Powerful AI Will Be
William Flynn Wallace
foozler83 at gmail.com
Fri Feb 5 23:26:57 UTC 2016
Mike Dougherty wrote:
"People will adapt" should be an unacceptable attitude.
And I agree with that. I hear stories about engineers 'doing their own
things' and presenting their products to upper management. Then the
marketing people tell them that it might be OK for engineers but not for
regular people (whoever those are). Make it "user friendly," in the current
cant. This problem has plagued PCs from the beginning and is still a problem today.
I wish marketers would take their products and present them to us ordinary
people and watch them through a one-way mirror. The people in charge of
packaging (much less things in kits) would get a real lesson in how hard it
is for older folks to just get into their product without injuring
themselves! Surely there are ways to make packaging user friendly and
theft resistant at the same time.
I know little of AI but I do read that they are trying to make them learn and
adapt as a person would. That way the adaptation can go both ways - you
get used to it and it gets used to you. Like raising a child.
Now if it ever comes to pass that we can download our brains, then we can
download them into our robots etc. (leaving the original intact, of course)
and they would understand us perfectly. However, think of the problems
involved if the person in question is a contrarian. Endless arguments with
themselves!
On Fri, Feb 5, 2016 at 2:08 PM, Mike Dougherty <msd001 at gmail.com> wrote:
> On Thu, Feb 4, 2016 at 12:53 PM, William Flynn Wallace
> <foozler83 at gmail.com> wrote:
> > Yeah, I guess that was ambiguous. Of course I am biased, being a social
> > psychologist, but many fields overlap psych: marketing, management, AI,
> > economics and so forth. If someone is going to program robots to interact
> > with people, we need to know more about people than we do, esp. if the
> > robot is supposed to detect deception, hidden meanings, emotional
> > reactions, and other subtleties. Enormous progress has been made in
> > reading faces, for example, that will need to be programmed into robots
> > or AI or whatever broad term applies.
> > Ditto for research showing how vocalizations can be read for stress and
> > other emotions. Lie detection is getting sophisticated.
> I agree. I think it's an important lesson for non-AI developers to
> learn too. As a non-AI software developer, I am disgusted with the
> state of human computer interaction (HCI) that the majority of devs
> produce. There is such a long history of forcing people to adapt to
> whatever new software/UI is perceived (by the developer) as "better"
> (without regard for users' muscle memory).
> Why can't we use the latest version of Excel with a UI skin that recreates
> the good-ol' retro Excel '95 (or 2005, or 2010, or whatever it was when it
> was learned)? Sure, there might be some new features that weren't
> available then, but the basics of spreadsheets haven't changed since
> Lotus 1-2-3 (which was modeled on pre-industrial bookkeeping
> practices, right?) To suggest that everything we know about using a
> menu should be trashed so we can click on a ribbon (in the name of
> 'easier') is to miss a fundamental UX principle. The "Cool new stuff"
> menu might be a place to look... or the application (and computers in
> general) have enough resources to know what you're doing and offer
> context-aware modes for working through most of the heavy lifting.
> *shrug* I digress. The point was that computers need to get better at
> working with people rather than people getting better at working with
> computers. The people who help computers do that need to understand
> the people that would eventually be helped by the AI they create.
> "People will adapt" should be an unacceptable attitude.