[ExI] "general repudiation of Transhumanism"
possiblepaths2050 at gmail.com
Sun Apr 27 15:56:59 UTC 2008
Bryan Bishop wrote:
> Sorry, raising IQ does not mean raising awareness and it doesn't mean
> raising capability, there are many high IQ individuals that are
> basically 'worthless' in terms of democracy, that aren't living full
> and capable lives, etc.
I think James Hughes was misquoted on this subject. Instead of saying "IQ"
the speaker should have said "general intelligence." As we all know,
intelligence can be broken down into many highly-varied categories.
> Citizenry and nationalism just don't work out in this day and age, time
> to update your ideas. :-) There's more than just living as a citizen -
> you can live as a person, a human.
Please don't automatically connect being a concerned citizen with militant
nationalism. You can be a concerned and contributing citizen of your nation
and at the same time be a good "citizen of the Earth."
I wrote/you wrote:
> > After all, as a professor he is a member of America's
> > intellectual/high IQ elite. Or does he only want a relative few to
> > be in that club?
> So now you're saying that you *don't* want IQ elitism.
I never did.
> > Sarewitz went on to say the two key challenges facing humanity would
> > not be helped by aggressive intelligence augmentation. The first
> > challenge regards individuals, groups and societies experiencing
> > conflicting values and world views and trying to deal peacefully with
> > each other. The second challenge dealt with humanity's ability to
> > predict and manage the future. He pointed out that extremely bright
> > and educated people/think tanks have guided nations into very stupid
> > policies/wars over the years and done great damage and so why should
> > even brighter technologically augmented folks do better? I thought to
> > myself that perhaps we should instead use biotech to *weaken* our
> > collective intelligence... His talk seemed to inadvertently point
> > out very bright people as a threat to humanity! LOL
> Nah, I bet he was just trying to say that very bright people that try to
> control humanity are the threat, not that intelligence itself is the
> threat. Power corrupts, etc. That sort of thing.
No. He was not talking about "absolute power corrupts absolutely," etc.
The focus was on not expecting technologically augmented intellects to do
better than the naturally gifted intellects we already have in high places
to guide us politically. He was not
stating high intellect itself as a threat, but instead saying we should not
view it as the panacea to our political ills. My playful comments
were satirizing what he said.
> > The speaker did grudgingly admit that the technologies Transhumanists
> > endorse will be coming into being whether he likes it or not. And he
> > stated the primary mover for this was military and economic
> > competitiveness between nations. He saw this as the main reason why
> > reasonable people like him had to swing into action and carefully
> > control and regulate these new technologies.
> You both sound like tech regulators ... you have your 'tech democracy'
> stuff, he wants to regulate it so that you don't enforce those
> tech-democracy requirement stuffs, whatever. Same thing, same sides of
> the spectrum, nothing new.
Huh? Please restate what you wrote here in clearer language. Why
would I "enforce" democracy enhancing tech? I'm not for forcing
intelligence augmentation or other Transhumanist technologies onto people,
especially for the cause of advancing democracy.
> > I did like his concern about inequality in relation to the subject
> > and it was a person in the Q & A session who brought up the classic
> > scenario of rich parents buying their unborn offspring genetic
> > enhancements, causing even greater gaping inequities within society.
> > But Sarewitz to my surprise did mention how in time treatments might
> > become cheaper as they are easier to do. And so in time, due to the
> > "trickle down effect," middle class parents could afford these
> > treatments to enhance their own children.
> Looks like he's still thinking about scarcity economics, meanwhile
> transhumanists have been talking post-scarcity for decades -- see
> self-replicating tech, like Merkle, Freitas, Drexler, RepRap, fabbing,
> etc. So I don't think this guy is particularly informed. ;-)
Ahh, but we will still need resources in the form of matter and energy. And
in time even the resources of our home solar system and beyond could be
devoured unless we are careful stewards. If a company or individual
develops a wonderful gene tweak for whatever great benefit, they are going
to want some sort of payment (but that might come in the form of barter or
even simply reputation/fame). It may be quite a while before we free
ourselves from scarcity economics. I truly dream of the world you envision.
> > After the lecture he mingled with the crowd over refreshments and
> > then the real venom against Transhumanism came pouring out. Sarewitz
> > very mockingly referred to the Singularity as a crazy essentially
> > religious obsession Transhumanists had. And he spoke about how they
> > envisioned god-like computers running things and saving us from
> > ourselves. Sarewitz ridiculed Ray Kurzweil's book "The Singularity
> > Is Near" and said the predictions were pie in the sky overly
> > optimistic and basically just plain wrong. Oh, and the matter of
> > Transhumanist fear of death (especially in middle aged
> > Transhumanists) was also brought up as a reason why the Singularity
> > was predicted to be within the lifespan of many somewhat older
> > Transhumanists. As I listened to all of this I thought to myself,
> > "these people really don't like Transhumanists and want to totally
> > marginalize us!" And to think I always thought the Evangelicals and
> > not the academics would be our sparring partners. lol
> Ignore him. He doesn't know what he's talking about.
LOL I agree with you there! But he is a professor of fairly high stature
and such people need to be properly confronted (assertively &
respectfully and not aggressively & rudely) on the gladiatorial arena floor
of memetic conflict.
John Grigg : )