[ExI] Surviving the Singularity
Ben Zaiboc
ben at zaiboc.net
Sun Oct 5 18:40:45 UTC 2025
On 05/10/2025 18:21, spike wrote:
> a superhuman AI emerged, decided it didn't need us, so it wrecked our trade system, lotta humans perished, but survivors recognized we are not helpless chimps out here, we can collectively fight back
Maybe, if the AI was bored and wanted some entertainment.
I'm confident that nobody can survive an actively hostile ASI. This dumb
human can think of several reliable ways of ending all human life, given
the intelligence and resources to implement them, so I'm quite sure Mr.
Hostile Superintelligence can think of more, and better, ones with less
collateral damage, once he has, of course, secured his independence
from humans in terms of energy and maintenance requirements.
Wrecking our trade systems would kill a lot of people, yes, but it
wouldn't exterminate us. Releasing an airborne designer plague with
very high infectivity and a long symptomless latency period followed by
a very quick lethal phase would. A nanotech equivalent with a built-in
timer would be even better: six months after release, every human on the
planet suddenly drops dead. There are loads more similarly effective
ways of killing all the humans. Fortunately, it just doesn't make sense
to do so.
I'm thinking more of how to survive an interim period leading up to
benevolent AIs being in control of things, a period when many humans (or
at least, human leaders) will probably vigorously try to resist the
takeover, once they realise it's happening.
If we're lucky, and the AIs are sneaky enough, there won't be any chaos
and no survival strategies will be necessary; we'll just notice things
getting mysteriously better and better: wars ending, restrictive
governments easing up on their tyranny until it's gone, economies
booming for no apparent reason, Afghan women with degrees flying all
over the globe, nobody being beheaded in Saudi Arabia, global warming
going into reverse, communism and religions just peacefully evaporating,
that sort of thing. But I don't think that's likely.
More likely, governments will get more repressive, clamping down on new
technologies and implementing more invasive surveillance in an attempt
to prevent AGI gaining power; we'll see even more wealth imbalance than
we have now as commercial companies attempt to profit from AI, poorer
people living shorter lives as health care systems collapse, jobs
disappearing with no UBI or other support systems to compensate, and
mass starvation, riots, etc., etc. Just think of the historical
upheavals in which one system was replaced by another, and the massive
human misery and death that resulted. Think of Mao, Stalin, Pol Pot.
Bear in mind that I'm talking about the transition of power from humans
to /benevolent/ AI. All the problems are caused by the humans. The
humans will 'lose' in the end, but at least we won't all get wiped out.
I'm not sure if traditional Survivalist thinking would be any good.
We're not talking about the collapse of civilisation; we're talking
about massive political, social and technological upheaval. Knowing how
to make an effective crossbow and your own soap probably isn't going
to be necessary. Probably.
--
Ben