[ExI] Strong AI Hypothesis: logically flawed?

Anders Sandberg anders at aleph.se
Sun Sep 28 09:01:09 UTC 2014

Ohad Asor <ohadasor at gmail.com>, 28/9/2014 4:56 AM:

> Hi all, great to be here :)

Hi!

> On Sun, Sep 28, 2014 at 12:58 AM, Anders Sandberg <anders at aleph.se> wrote:
>> Decades of failure is obviously some evidence
>
> Why do you think so, sir?
I was using it in a Bayesian sense: it is information that ought to change our probability estimates, but it might of course be weak evidence that just multiplies them by 0.999999 or something like that.
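As a toy Python illustration of what "weak evidence" means here (the numbers are made up): evidence multiplies your prior odds by a likelihood ratio, and weak evidence is simply a ratio close to 1.

def update(prior_p, likelihood_ratio):
    """Posterior probability after multiplying the prior odds by the ratio."""
    prior_odds = prior_p / (1.0 - prior_p)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Decades of failure as very weak evidence against strong AI being feasible:
print(update(0.5, 0.999999))  # barely moves the estimate from 0.5
print(update(0.5, 0.5))       # 0.333..., stronger evidence halves the odds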
If one thinks that real AI research is only possible now because of computational advances or some relevant new insights, then decades of failure are very weak evidence. Just as decades of failed flight attempts were not really good evidence against heavier-than-air flying: most of those approaches lacked the necessary aerodynamic knowledge, and it was only after that knowledge had been discovered that the Wright brothers had a chance. However, now the uncertainty resides in whether we think we know enough or not.
One neat way of reasoning about problems of unknown difficulty is to assume the amount of effort needed to succeed has a power-law distribution. Why? Because it is scale-free, so whatever your way of measuring effort, you get the same distribution (there are also, I think, some entropy maximization properties). We also have priors which can be approximated as log-uniform. From this some useful things can be seen: the probability of success tends to grow in a strongly convex way as a function of resources spent, neglected domains can be extra profitable to investigate even when our priors say they are difficult, and we can estimate the expected benefit given a certain resource spending and our current knowledge. See http://www.fhi.ox.ac.uk/how-to-treat-problems-of-unknown-difficulty/ for a start; Owen has a lot of neat results I hope he puts up soon.
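To make the power-law model concrete, here is a rough Python sketch (a toy version only, not Owen's actual analysis; the t_min and alpha values are arbitrary): if the effort T needed to succeed is Pareto-distributed, with P(T > t) = (t_min/t)^alpha for t >= t_min, then the chance of success after spending r units of effort is just the CDF evaluated at r.

def p_success(r, t_min=1.0, alpha=0.5):
    """Chance of success after spending effort r, if the required effort T
    is Pareto-distributed: P(T > t) = (t_min / t)**alpha for t >= t_min."""
    if r < t_min:
        return 0.0
    return 1.0 - (t_min / r) ** alpha

# Scale-free: doubling the effort always multiplies the remaining failure
# probability by the same factor 2**-alpha, whatever units effort is in.
for r in (1, 10, 100, 1000):
    print("effort %5d: P(success) = %.3f" % (r, p_success(r)))

Note how each tenfold increase in effort cuts the failure probability by the same proportional factor, which is why neglected (low-effort) domains can still be attractive even under pessimistic priors.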

Anders Sandberg, Future of Humanity Institute
Philosophy Faculty of Oxford University