[extropy-chat] Playing Go & demandingness in ethics

Jeff Medina analyticphilosophy at gmail.com
Wed Jan 4 18:53:11 UTC 2006


On 1/4/06, Eliezer S. Yudkowsky <sentience at pobox.com> wrote:
> Let me clarify:  Playing Go has a positive side effect, you get better
> at Go and learn generalizable skills.  Playing Go is fun.

Playing Go also has a negative side effect: its opportunity cost. For
example, if someone smart enough to play Go well is spending time
playing Go, they aren't making the contributions they otherwise could
to the solution of technical problems, contributions that would
benefit {a subset of the current and future morally significant
population} more than playing Go benefits the subset of the
population that enjoys playing Go or gains from the generalizable
skills it imparts to its players.

This is not to pick on Go. It just gave me an opportunity to seed a
discussion on demandingness in ethics, which has re-emerged in the
past few days to steal some of my CPU cycles.

Demandingness is a common criticism of consequentialist ethics --
e.g., never eat a fancy pasta dish, since you can nearly always
replace it with oatmeal or some other nutritious, less expensive food,
stay just as healthy, and donate the difference in cost to charitable
causes (this sort of replacement argument applies to a bewildering
proportion of our daily activities and decisions). Though the point is
argued less often, it applies just as well to deontological systems
and to a number of virtue-ethical systems.
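
To make the replacement argument concrete, here is a minimal sketch
of its arithmetic in Python. All of the figures are made up for
illustration; the argument only needs the cheaper option to free up
money that does more good elsewhere than the pricier option does for
you.

# Toy sketch of the replacement argument (all figures hypothetical).
fancy_pasta_cost = 18.00   # what the fancy meal costs
oatmeal_cost = 2.00        # the nutritious replacement
own_welfare_pasta = 1.0    # welfare the chooser gets from the pasta
own_welfare_oatmeal = 0.9  # nearly as healthy, slightly less fun

savings = fancy_pasta_cost - oatmeal_cost  # money freed up: 16.00

# Hypothetical assumption: a donated dollar produces 0.1 units of
# welfare for others. The argument goes through whenever donated
# dollars beat the marginal welfare you get from the pricier option.
WELFARE_PER_DONATED_DOLLAR = 0.1

total_if_pasta = own_welfare_pasta                  # 1.0
total_if_oatmeal = (own_welfare_oatmeal
                    + savings * WELFARE_PER_DONATED_DOLLAR)  # 2.5

print(total_if_pasta, total_if_oatmeal)  # 1.0 2.5 -- replacement wins

Run the same comparison against nearly any daily expenditure and the
replacement wins, which is exactly why the demandingness objection
bites so hard.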

Now given that many people interested in transhumanism express an
interest in the ethical arguments for various technological
developments, the permissibility of enhancement, the right to
morphological freedom (whether or not others consider what you're
doing 'enhancement')... why are the demands of our alleged beliefs
nigh universally ignored? Is it just a fact of human psychology that
we can't motivate ourselves to moral behavior if it's not right in
our faces, or if it doesn't present immediate & painful consequences
for ignoring it? Is it an illusory problem, because none of us really
cares about ethics at all and we are only engaged in a social
reciprocity game? Or is there some other explanation? And can and
should we do something to change, acting more in accord with the
demands of our ethics?

Best,
--
Jeff Medina
http://www.painfullyclear.com/

Community Director
Singularity Institute for Artificial Intelligence
http://www.singinst.org/

Relationships & Community Fellow
Institute for Ethics & Emerging Technologies
http://www.ieet.org/

School of Philosophy, Birkbeck, University of London
http://www.bbk.ac.uk/phil/


