[ExI] Convergence

BillK pharos at gmail.com
Wed Oct 24 18:30:31 UTC 2012


On Wed, Oct 24, 2012 at 11:29 AM, Anders Sandberg  wrote:
> The problem is that societal behaviour is not really forecastable. There are
> too few constraints outside slow demographic and institutional change, and
> the system changes in response to what you do. You certainly ought to
> consider social and cultural factors, but it is not anything that can be
> reliably forecast.
>

Hmmmn.  I thought that general trends in society could be forecast.
For example, the War on Terror seems set to continue for a while, so
it will keep burning money that could be used on more useful projects.


> My standard example is twitter. If a time traveller had explained the
> concept to me and my classmates back in 1991 I think we could have
> programmed the system without too much effort. However, we would not have
> been able to make any prediction about what it would be used for. In fact,
> it would likely have been fairly useless given the small number of Internet
> users back then. Twitter gets its functionality from what social
> functionality people invent, and for that you need 1) lots of users, 2) some
> experience with social media, and 3) creativity. So the current role of
> twitter was entirely impossible to forecast, although *maybe* we would have
> been able to figure out the above 1-3 requirements and hence deduce that if
> it ever got big it would get big in the future when there would be more
> internet users. But most likely we would have just concluded it was a
> useless system.
>

Well, some would say that twitter is indeed an eminently useless
system: a time sink that stops productive work. If lots of shiny new
toys are part of our future society, planning should take account of
this and allow for a distracted population with little interest in big
engineering projects.


> Many very successful systems have been successful without
> such planning (consider the Internet, where again as in the twitter example,
> the actual role became known only as it was invented). And if you conclude
> that your system would be opposed by group X, is that an argument against
> the system or just an argument for a marketing campaign aimed at group X?
>
> In general humans are bad at long-term forecasting and planning. Having
> foresight is useful, but one should not overestimate our ability to be
> accurate. Adaptivity is better than overconfidence in any plan.
>
>

Some projects will work against other projects. A starship programme,
for example, is not the only project around and may lose engineers and
money to rivals. Then there are 'entertainment' projects which suck
whole populations away into a non-productive euphoria. The worst case,
of course, is humans moving to virtual reality, where nothing gets
done in the real world.

I think all I'm really saying is that plans that say 'we could have AI
by 2025' need to consider some of the external problems that might be
encountered. Problems outside the project can easily double its time
and cost estimates.


BillK
