[ExI] Fwd: [LessWrong] Consciousness as a conflationary alliance term
Stuart LaForge
avant at sollegro.com
Wed Jul 12 04:22:02 UTC 2023
Excuse the repost, but the last one was formatted as an attachment.
Andrew Critch interviewed about two dozen people and found that they had 17
different interpretations of what consciousness meant. No wonder there has
been so much debate on the topic.
I still think that causal awareness is a good measure of it.
Stuart LaForge
---------- Forwarded message ---------
From: LessWrong <no-reply at lesserwrong.com>
Date: Tue, Jul 11, 2023, 1:29 PM
Subject: [LessWrong] Consciousness as a conflationary alliance term
To: <stuart.laforge at gmail.com>
Consciousness as a conflationary alliance term
------------------------------
by Andrew_Critch
July 11, 2023 6:09 PM
*Tl;dr: In this post, I argue that the concept of "consciousness" is more
conflated than people realize, in that there's a lot of divergence in what
people mean by "consciousness", and people are unaware of the degree of
divergence. This confusion allows the formation of broad alliances around
the value of consciousness, even when people don't agree on what it
actually means. I call alliances built around conflated terms
"conflationary alliances".*
Executive Summary
*Part 1:* Mostly during my PhD, I somewhat-methodically interviewed a
couple dozen people to figure out what they meant by consciousness, and
found that (a) there seems to be a surprising amount of diversity in what
people mean by "consciousness", and (b) they are often surprised to
find out that other people mean different things when they say
"consciousness". This has implications for AI safety advocacy because AI
will sometimes be feared and/or protected on the grounds that it is
"conscious", and it's good to be able to navigate these debates wisely.
(Other heavily conflated terms in AI discourse might include "fairness",
"justice", "alignment", and "safety", although I don't want to debate any
of those cases here. This post is going to focus on consciousness, and on
general ideas about the structure of alliances built around confused
concepts.)
*Part 2:* When X is a conflated term like "consciousness", large alliances
can form around claims like "X is important" or "X should be protected".
Here, the size of the alliance is a function of how many concepts get
conflated with X. Thus, the alliance grows *because* of the confusion of
meanings, not in spite of it. I call this a *conflationary alliance*.
Persistent conflationary alliances resist disambiguation of their core
conflations, because doing so would break up the alliance into factions who
value the more precisely defined terms. The resistance to deconflation can
be deliberate, or merely a social habit or inertia.
Part 1: What people mean by "consciousness".
"Consciousness" is an interesting word, because many people have already
started to notice that it's a confused term, yet there is still widespread
agreement that conscious beings have moral value. You'll even find some
people taking on strange positions like "I'm not conscious" or "I don't
know if I'm conscious" or "lookup tables
are conscious", as if rebelling against the implicit alliance forming
around the "consciousness" concept. What's going on here?
To investigate, over about 10 years between 2008 and 2018 I informally
interviewed dozens of people who I noticed were interested in talking about
consciousness, for 1-3 hours each. I did not publish these results, and
never intended to, because I was mainly just investigating for my own
interest. In retrospect, it would have been better, for me and for anyone
reading this post, if I'd made a proper anthropological study of it. I'm
sorry that didn't happen. In any case, here is what I have to share:
"Methodology"
Extremely informal; feel free to skip or just come back to this part if you
want to see my conclusions first.
   - *Whom did I interview?* Mostly academics I met in grad school, in
   cognitive science, AI, ML, and mathematics. In an ad hoc manner at
   academic or other intellectually-themed gatherings, whenever people
   talked about consciousness, I gravitated toward the conversation and
   tried to get someone to spend a long conversation with me to unpack what
   they meant.
   - *How did I interview them?* What I asked each person was to take some
   time to look inside their own minds — sometimes starting out by paying
   attention to just their bodies, if introspection was hard for them — and
   try to describe to me in more detail the thing they were calling
   consciousness. I did not say "this is an interview" or anything
   official-sounding, because honestly I didn't feel very official about it.
   When they defined consciousness using common near-synonyms like
   "awareness" or "experience", I asked them to instead describe the
   structure of the consciousness process, in terms of moving parts and/or
   subprocesses, at a level that would in principle help me to
   programmatically check whether the processes inside another mind or
   object were conscious.
   Often it took me 2-5 push-backs to get them focusing on the 'structure'
   of what they called consciousness rather than just synonyms for it, but
   if they stuck with me for 10 minutes, they usually ended up staying in
   the conversation beyond that, for more like 1-3 hours in total.
   Sometimes the conversation ended more quickly, in like 20 minutes, if
   the notion of consciousness being conveyed was fairly simple to
   describe. Some people seemed to have multiple views on what
   consciousness is, in which cases I talked to them longer until they
   became fairly committed to one main idea.
Caveats
I'm mainly only confident in the conclusion that people have a lot of
different mental processes in mind when they say "consciousness", and are
surprised to hear that others have very different meanings in mind.
I didn't take many notes or engage anyone else to longitudinally observe
these discussions, or do any other kind of adversarially-robust-scientist
stuff. I do not remember the names of the people with each answer, and I'm
pretty sure I have a bias where I've more easily remembered answers that
were given by more than one person. Nonetheless, I think my memory here is
good enough to be interesting and worth sharing, so here goes.
Results
*Epistemic status: reporting from memory.*
From the roughly thirty conversations I remember having, below are the
answers I remember getting. Each answer is labeled with a number (n)
roughly counting the number of people I remember having that answer. After
most of the conversations I told people about the answers other people had
given, and >80% of the time they seemed surprised:
   1. (n≈3) Consciousness as *introspection*. Parts of my mind are able to
look at other parts of my mind and think about them. That process is
consciousness. Not all beings have this, but I do, and I consider it
valuable.
*Note: people with this answer tended to have shorter conversations with
me than the others, because the idea was simpler to explain than most of
the other answers.*
   2. (n≈3) Consciousness as *purposefulness*. There is a sense that one's
   life has meaning, or purpose, and that the pursuit of that purpose is
   self-evidently valuable. Consciousness is the deep experience of that
   self-evident value, or what religions might call the experience of
   having a soul. Probably not all beings have this, and maybe not even
   all people, but I definitely do, and I consider it valuable.
   3. (n≈2) Consciousness as *experiential coherence*. I have a
subjective sense that my experience at any moment is a coherent whole,
where each part is related or connectable to every other part. This
integration of experience into a coherent whole is consciousness.
   4. (n≈2) Consciousness as *holistic experience of complex emotions*.
   Emotional affects like fear and sadness are complex phenomena. They
   combine and sustain cognitive processes — like the awareness that
   someone is threatening your safety, or that someone has died — as well
   as physical processes — like tense muscles. It's possible to be
   holistically aware of both the physical and abstract aspects of an
   emotion all at once. This is consciousness. I don't know if other
   beings or objects have this, but I definitely do, and I consider it
   valuable.
   5. (n≈2) Consciousness as *experience of distinctive affective states*.
   Simple bodily affects like hunger and fatigue are these raw and
   self-evidently real "feelings" that you can "tell are definitely real".
   The experience of these distinctively-and-self-evidently-real affective
   states is consciousness. I don't know if other living things have this,
   but non-living objects probably don't, and I definitely do, and I
   consider it valuable.
   6. (n≈2) Consciousness as *pleasure and pain*. Some of my sensations
are self-evidently "good" or "bad", and there is little doubt about those
conclusions. A bad experience like pain-from-exercise can lead to good
outcomes later, but the experience itself still self-evidently has the
"bad" quality. Consciousness is the experience of these self-evidently
"good" and "bad" features of sensation. Simple objects like rocks don't
have this, and maybe not even all living beings, but I
definitely do, and I
consider it valuable.
7. (n≈2) Consciousness as *perception of perception.* Inside the mind
is something called "perception" that translates raw sense data into
awareness of objects and relations, e.g., "perceiving a chair from the
pixels on my retina". There's also an internal perception-like process
that looks at the process of perception while it's happening. That thing
is consciousness. Probably not all beings have this, but I do, and I
consider it valuable.
   8. (n≈2) Consciousness as *awareness of awareness*. A combination of
   perception and logical inference causes the mind to become intuitively
   aware
of certain facts about one's surroundings, including concrete things like
the presence of a chair underneath you while you sit, but also abstract
things like the fact that you will leave work and go home soon
if you can't
figure out how to debug this particular bit of code. It's also
possible to
direct one's attention at the process of awareness itself,
thereby becoming
aware of awareness. This is consciousness. Probably not all beings have
this, but I do, and I consider it valuable.
9. (n≈2) Consciousness as *symbol grounding*. Words, mental imagery,
and other symbolic representations of the world around us have
meanings, or
"groundings", in a reality outside of our minds. We can sense the fact
that they have meaning by paying attention to the symbol and
"feeling" its
connection to the real world. This experience of symbols having
a meaning
is consciousness. Probably not all beings have this, but I
definitely do,
and I consider it valuable.
10. (n≈2) Consciousness as *proprioception*. At any moment, I have a
sense of where my body is physically located in the world,
including where
my limbs are, and how I'm standing, which constitutes a strong sense of
presence. That sense is what I call consciousness. I don't
know if other
beings have this, but objects probably don't, and I definitely do, and I
consider it valuable.
11. (n≈2) Consciousness as *awakeness*. When I'm in dreamless sleep, I
have no memory or sense of existing or anything like that. When
I wake up,
I do. Consciousness is the feeling of being awake. Probably not all
beings or objects have this, but I do, and I consider it valuable.
12. (n≈2) Consciousness as *alertness*. When I want, I can voluntarily
increase my degree of alertness or attunement to my environment. That
sense of alertness is consciousness, and it's something I have more of or
less of depending on whether I focus on it. Probably not all beings or
objects have this, but I do, and I consider it valuable.
13. (n≈2) Consciousness as *detection of cognitive uniqueness*. "It's
like something to be me". Being me is different from being
other people or
animals like bats, and I can "tell" that just by introspecting
and noticing
a bunch of unique things about my mind, and that my mind is separate from
other minds. I get a self-evident "this is me and I'm unique"
feeling when
I look inside my mind. That's consciousness. Probably not all beings or
objects have this, but I do, and I consider it valuable.
   14. (n≈1 or 2) Consciousness as *mind-location*. I have this feeling
that my mind exists and is located behind my eyes. That feeling
of knowing
where my mind is located is consciousness. Probably not all beings or
objects have this, but I do, and I consider it valuable.
   15. (n≈1) Consciousness as a *sense of cognitive extent*. I have this
   sense that tells me which parts of the world are part of my body versus
   external to it. In a different but analogous way, I have a sense of
   which information processes in the world are part of my mind versus
   external to my mind. That sense that "this mind-stuff is my mind-stuff"
   is consciousness. Probably a lot of living beings have this, but most
   objects probably don't, and I consider it valuable.
16. (n≈1) Consciousness as *memory of memory*. I have a sense of my
life happening as part of a larger narrative arc. Specifically, it feels
like I can remember the process of storing my memories, which gives me a
sense of "Yeah, this stuff all happened, and being the one to remember it
is what makes me me". Probably not all beings or objects have
this, but I
do, and I consider it valuable.
17. (n≈1) Consciousness as *vestibular sense*. At any moment, one
normally has a sense of being oriented towards the world in a particular
way, which goes away when you're dizzy. We feel locked into a kind of
physically embodied frame of reference, which tells us which way
is up and
down and so on. This is the main source of my confidence that my mind
exists, and it's my best explanation of what I call consciousness.
*Note: Unlike the others, I don't remember this person saying they
considered consciousness to be valuable.*
So what is "consciousness"?
It's a confused word that people reliably use to refer to mental phenomena
that they consider morally valuable, with surprising variation in what
specifically people have in mind when they say it. As a result, we observe:
- Widespread agreement that conscious beings are valuable, and
- Widespread disagreement or struggle in defining or discovering "what
consciousness is".
What can be done about this?
For one thing, when people digress from a conversation to debate about
"consciousness", nowadays I usually try asking them to focus away from
"consciousness" and instead talk about either "intrinsically valuable
cognition" or "formidable intelligence". This usually helps the
conversation move forward without having to pin down what precisely they
meant by "consciousness".
More generally, this variation in meanings intended by the word
"consciousness" has implications for how we think about alliances that form
around the value of consciousness as a core value.
Part 2: The conflationary alliance around human consciousness
*Epistemic status: personal sense-making from the observations above*
Most people use the word "consciousness" to refer to a cognitive process
that they consider either
   - terminally valuable (as an aspect of moral patiency), or
- instrumentally valuable (as a component of intelligence).
Thus, it's easy to form alliances or agreement around claims like
- *conscious beings deserve protection,* or
   - *human lives are valuable because we're conscious,* or
- *humans are smarter than other animals because we're conscious.*
Such utterances reinforce the presumption that consciousness must be
something valuable, but without pinning down specifically what is being
referred to. This vagueness in turn makes the claims more broadly
agreeable, and the alliance around the value of human consciousness
strengthens.
I call this a *conflationary alliance*, because it's an alliance supported
by the conflation of concepts that would otherwise have been valued by a
smaller alliance. Here, the size of the alliance is a function of how many
concepts get conflated with the core value term.
A persistent conflationary alliance must, tautologically, resist the
disambiguation of its core conflations. The resistance can arise by
intentional design of certain Overton windows or slogans, or arise simply
by natural selection acting on the ability of memes to form alliances that
reinforce them.
Correspondingly, there are lots of social patterns that somehow end up
protecting the conflated status of "consciousness" as a justification for
the moral value of human beings. Some examples:
- *Alice*: [eats a porkchop]
*Bob*: You shouldn't eat pigs; they're conscious beings capable of
suffering, you know!
   *Alice*: There's no scientific consensus on what consciousness is. It's
   mysterious, and I believe it's unique to humans. [continues eating
   porkchop]
- *Charlie*: I think AI might become conscious. Isn't that scary?
*Dana*: Don't worry; there is no consensus on what consciousness is,
because it's a mystery. It's hubris to think scientists are
able to build
conscious machines!
*Charlie*: [feels relieved] Hmm, yeah, good point.
   - *Eric:* AI systems are getting really smart, and I think they might be
   conscious. Shouldn't we feel bad about essentially making them our
   slaves?
*Faye: *Consciousness is special to humans and other living organisms,
not machines. How it works is still a mystery to scientists, and
definitely
not something we can program into a computer.
*Eric: *But these days AI systems are *trained*, not programmed, and how
they work is mysterious to us, just like consciousness. So, couldn't we
end up making them conscious without even knowing it?
*Faye: *Perhaps, but the fact that we don't know means we shouldn't
treat them as valuable in the way humans are, because we *know* humans
are conscious. At least I am; aren't you?
*Eric: *Yes of course I'm conscious! [feels insecure about whether
others will believe he's conscious] When you put it that way, I guess
we're more confident in each other's consciousness than we can
be about the
consciousness of something different from us.
What should be done about these patterns? I'm not sure yet; a topic for
another day!
Conclusion
In Part 1, I described a bunch of slightly-methodical conversations I've
had about consciousness, where I learned that people refer to many different
kinds of processes inside themselves when they say "consciousness", and
that they're surprised by the diversity of other people's answers. I've
also noticed that people use "consciousness" to refer to things they value,
either terminally or instrumentally. In Part 2, I note how this makes it
easier to form alliances around the idea that *consciousness is valuable*.
There seems to be a kind of social resistance to clarification about the
meaning of "consciousness", especially in situations where someone is
defending or avoiding the questioning of human moral superiority or
priority. I speculate that these conversational patterns further
perpetuate the notion that "consciousness" refers to something inherently
mysterious.
In such cases, I often find it helpful to ask people to focus away from
"consciousness" and instead talk about either "intrinsically valuable
cognition" or "formidable intelligence", whichever better suits the
discussion at hand.
In future posts I plan to discuss the implications of conflationary terms
and alliances for the future of AI and AI policy, but that work will
necessarily be more speculative and less descriptive than this one.
Thanks for reading!
More information about the extropy-chat
mailing list