[ExI] intelligence and generalization

Adrian Tymes atymes at gmail.com
Mon Jan 14 19:35:51 UTC 2019


On Mon, Jan 14, 2019 at 6:21 AM John Clark <johnkclark at gmail.com> wrote:
> People are always asking geniuses how they can do what they do but they can never give satisfactory answers because they don't know they just do it. If they did know and could explain it we'd all be geniuses just like them.

This is one of the advances I have noted, only in the past few
decades: more and more, those who have done well at some aspect of
thinking are beginning to explain how they do what they do, in formats
accessible to the masses - and critically, in formats accessible to
others who have done well at the same or a similar aspect, and who can
comment on or correct elements of the explanations to make them both
more accurate and more accessible.

For instance, I once intuited that the ability to build more basic,
fundamental mental models of the world helped with understanding wide
ranges of things.  At first I thought it was just something I did, and
that getting the general public to adopt such models would be an
educational technology revolution worthy of science fiction (in the
vein of sci-fi that boils down to, "imagine some miracle technology,
then explore its ramifications for the human condition/society/et
cetera").  So I wrote said science fiction: a spacefaring society
where multidisciplinary capability was the baseline education most
people received - for example, learning basics of science and critical
thought that could be applied to any scientific field, then picking up
details of biology/geology/physics/et cetera later on as relevant to
specific investigations, or (via more handwavium, in my work anyway)
practices for studying other fields broadly and quickly.  The story
explored the effects on society when what today we would call
"mastery" could be learned much more quickly - both in enabling a much
larger pool of experts as needed, and in raising the level of ability
such a society would consider "mastery".  Now I'm seeing that others
are coming around to the same concepts in real life.

Learning how to learn is a thing these days.  In a sense, this is what
much of AI research is about.  I wonder: is it feasible to set an AI
to modeling and codifying the process of learning itself, with the end
result being educational packages that could give humans of ordinary
intellect the same tools as a typical K-BS education, or better ones,
in a much shorter timeframe?  ("K-BS" might not be the best label, but
given how many people see college as necessary these days, merely
K-12 isn't cutting it.  If a package like this became standard, people
would then desire more education, meaning there would be calls for
even better packages - perhaps leading to something like the
Singularity, but driven mainly by human intellect in conjunction with
AI.)  In addition to the obvious application for children, there are
many adults who might directly benefit from such a package (including,
perhaps, graduates of such a package once much better packages come
out, assuming this becomes a self-reinforcing chain).


More information about the extropy-chat mailing list