[ExI] Limiting factor to the Intelligence Singularity?

Mike Dougherty msd001 at gmail.com
Sat Dec 23 17:45:11 UTC 2023

On Sat, Dec 23, 2023, 11:13 AM Kelly Anderson via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> As I believe in emergence as a deep concept, I tend to see groups more than
> individuals, though I value individual contribution greatly.
> Perhaps my mind has wandered too freely and it is time to go work on
> my own, much simpler, inventions, as they seem to constantly be
> broken.

I know your focus was on low-level engineering principles, but I would
propose an emergent driver at another organizational level: values
alignment of the population.  Even if your smartest engineers prove that
something like climate change can be managed adequately, the implementation
costs (lifestyle change) may be unacceptable to the majority - so have the
constraints been properly modelled for the problem to actually be solved?
I originally thought, further down the line, that common resources (Earth)
might best be allocated as computation after converting the planet entirely
to computronium and abandoning this obsession with green biology - but the
climate example is closer to home (and to now).  I know we may as well
upload the bio-humans and run their zoo simulation with sufficient fidelity
that they never know the nature of their Nature, but on the way to that
implementation there will be disagreements that slow progress.  Slowing
progress is that negative feedback, but it has little to do with
engineering; it is instead a people problem.  The larger the impact on
people (positive or negative, per perception or opinion), the larger the
people problem.
