[extropy-chat] Playing Go & demandingness in ethics
Jef Allbright
jef at jefallbright.net
Thu Jan 5 14:54:28 UTC 2006
On 1/4/06, Jeff Medina <analyticphilosophy at gmail.com> wrote:
> Now given that many people interested in transhumanism express an
> interest in the ethical arguments for various technological
> developments, the permissibility of enhancement, the right to
> morphological freedom (whether others consider what you're doing
> 'enhancement' or not)... why are the demands of our alleged beliefs
> nigh universally ignored? Is it just a fact of human psychology that
> we can't motivate ourselves to moral behavior if it's not right in our
> face, or if ignoring it doesn't present immediate & painful
> consequences? Is it an illusory problem because none of us really care
> about ethics at all, and are only engaged in a social reciprocity
> game? Or is there some other explanation? And should and can we do
> something to change, acting more in accord with the demands of our
> ethics?
This question appears to be the same one I asked myself when I was
about eight years old, having just finished reading the Bible, as I
tried to reconcile the Christian words and teachings (in which I was
immersed) with the readily apparent Christian hypocrisy.
And I eventually came to understand that people have only limited
awareness of themselves and their relationship to others, and to
compound the matter, they have only limited motivation to increase
their awareness, or their awareness of their awareness, and so on.
It became clear to me that while a completely objective, big-picture
view of any particular domain will facilitate rational decision-making
(and thus behavior) within that domain, in the real world there is
always a larger possible domain in which, even under the best
circumstances, we must operate with incomplete information.
Consistency is the ultimate measure of morality, but we can only
approach, and never reach, that level of objectivity (and gladly so,
because at that point our subjective values would be nil).
Later it became clear to me that even the Self, which is assumed to be
evaluating and making these rational decisions, doesn't exist in the
discrete and independent sense that is assumed by most people raised
within western culture.
So, in objective terms, as systems operating within the physical
universe, our behavior certainly is consistent, but in terms of
subjective agents, operating with only an approximate internal model
of reality, our behavior is quite naturally inconsistent.
I hope the foregoing provides a useful response to your question about
why people do not meet the demands of their professed ethics.
To your second question, "should and can we do something to change,
acting more in accord with the demands of our ethics?", I would point
out that for the last four or so years I have been something of a
broken record on this very subject.
It is disturbing to see intelligent thinkers simply assume that giving
to charity is a fundamental good, that all humans or all "sentient
beings" possess equal moral value, or that "rights" are somehow
inherent in the structure of our world.
These values and the actions they imply, like all others, are "good"
to the extent that they work to promote increasingly shared values
into the future. Charity, for example, is seen as a fundamental good
because altruism is woven deeply into the fabric of our culture and
our genes -- because it tends to work, and our values have been
thereby shaped. Given a different environment of evolutionary
adaptation -- or an imminent environment of rapid technological change
-- charity may be seen and evaluated differently.
Certain values become increasingly shared because they work, meaning
they persist and grow. These increasingly shared values are the basis
of our morality, what we tend to agree is good.
Certain actions are considered good to the extent that they promote
our increasingly shared values.
What can we do? We can continue to build a framework of social
decision-making that increases our awareness of our increasingly shared
subjective values, and that increases our increasingly objective
instrumental knowledge applied to the promotion of those values.
[To head off just one likely and immediate objection: "increasingly
shared values" does not imply that we become Borg-like. On the
contrary, increasingly shared values that work imply a great
practical respect for freedom and diversity.]
- Jef