[extropy-chat] AI design

Alejandro Dubrovsky alito at organicrobot.com
Thu Jun 3 12:24:22 UTC 2004


On Thu, 2004-06-03 at 05:57 -0400, Eliezer Yudkowsky wrote:
> An AI that had been programmed with a static utility function over human 
> happiness and to not destroy human bodies, would rewrite all extant human 
> brains in a state of maximum "pleasure" as defined in the utility function, 
> and then freeze them in that exact position (because the utility function 
> is over static states rather than dynamic states).
> 

I don't see why the utility function can't be time-dependent, e.g.:
V(x_t) = pleasure of system x at time t

U(x_t) = 0        if V(x_t) is in the set { V(x_t1) : 0 < t1 < t mod N },
                  where N = the number of possible states of x;
       = V(x_t)   otherwise
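
As a toy illustration only (my own sketch, not from the post above), assuming
the V values for each time step have been precomputed into a Python list:

def utility(pleasures, t, num_states):
    # U(x_t): 0 if the current pleasure value already occurred at some
    # earlier step t1 with 0 < t1 < (t mod num_states), else V(x_t).
    v_now = pleasures[t]
    seen = {pleasures[t1] for t1 in range(1, t % num_states)}
    return 0 if v_now in seen else v_now

pleasures = [3, 5, 5, 2, 5, 7]      # toy V(x_t) values per time step
print(utility(pleasures, 4, 100))   # prints 0: the value 5 was already seen

The point is just that U penalizes revisiting an earlier state, so a
maximizer can't freeze everything in one "maximum pleasure" configuration.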

Not that I would recommend that as a utility function.




