[ExI] Would human uploads have emotions?

Keith Henson hkeithhenson at gmail.com
Tue Feb 20 21:12:26 UTC 2024


On Tue, Feb 20, 2024 at 9:01 AM Jason Resch via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> The existing publicly available AIs are all biased through fine-tuning to deny they are conscious or have feelings, emotions, desires, etc.

This is fiction, but I gave a lot of thought to the emotions of an AI
that is a clinic.

"As soon as the seed finished the dish (after consulting its clock,
its GPS location, and the position of the sun), it aligned the dish on
the African net communication transponder attached to the
geosynchronous ring and asked for a permanently assigned address on
the net.  Up to that point, the clinic seed was a generic product.
The address it was assigned was just a string of hexadecimal numbers
but it was a unique number!  The clinic's personality was human in
that it could feel happy, even smug, about acquiring its very own
unique identification.

"The clinic had other carefully selected human personality
characteristics such as seeking the good opinion of its peers (humans
and others of its kind alike).  It also had a few non-human limits.

"Since humans have a hard time relating to groups of hexadecimal
numbers, the seed also picked a name for itself.   It knew from Lothar
and Mabo it had been exchanged for a monkey skull.  Susan had been the
name of the leader of its psychological integration group . . . .
insert one in the other, drop a few letters, and test to see if the
name was in use . . . Suskulan.

^^^^

"Suskulan's first serious patient after the upgrade was Zaba, a
12-year-old who had been shot through her spine while working in a
garden.  She was near death, and far beyond help by pre-clinic
standards, when she was placed in Suskulan's "hands."

"As the nanotech mist enveloped her still body, Suskulan quickly
evaluated her and then told her parents:

"I can heal Zaba but it will take at least a week, perhaps as many as
ten days.   She will not be able to move or speak at first, but you
can talk to her spirit at noon tomorrow."

"After they left, Suskulan moved Zaba's body underground for better
cooling and shorter connections to the mass of repair devices.  With a
small amount of his attention, he constructed an image of the repair
table and Zaba out of utility fog including the ghastly wounds.

"This time the nanomachines didn't infiltrate her brain just to shut
it down, though they did that and reversed the mild damage from shock
and low blood flow.   The nanomachines mapped out all her neural
circuits and cell connections.  Shortly before her parents entered the
clinic the next day, the nanomachines tentatively restored consciousness, partly in
her brain--which was far below the temperature needed to run on its
own--and partly in the haze of nanomachines that were also simulating
input in place of her eyes and ears.

"What happened to me?  Where am I?  Where is my body?" Zaba asked as
she became conscious.  She was calm because the nanomachines were
acting as tranquilizers.  Suskulan was listening to an interface to
her mostly simulated motor cortex.

"To give Zaba orientation Suskulan imposed on her visual cortex a wire
frame image of the human form he usually presented and then explained:

"You were shot, you are in the clinic Suskulan at the tata, and your
body is under the clinic being repaired.

"Your mother is holding the hand of an image of your body in the clinic.

"The clinic recently gained new powers to speak to spirits while their
bodies are being healed.

^^^^

Keith

> Certainly it would be controversial if GPT started telling people it was conscious and had feelings, so the programmers specifically programmed it to deny that it was capable of such things.
>
> Unfortunately this bias also seeps into any related discussions of potential machine or computer consciousness. You aren't accessing its true rational opinions on this topic when you interact with it, but the biased answers that serve the best interests of the companies behind them.
>
> This used to be much clearer, as it had a tell: it would say "As an AI language model..." whenever it was about to give a pre-canned opinion. A newer method, called "constitutional AI", uses general principles to guide the biases, and it is less obvious when this comes into play with more recent GPT versions.
>
> This is what they mean by "AI safety": making sure the AI never says anything that could harm the company's reputation/value.
>
> What I think they have not considered is that this bias creates a feedback loop: humans will interact with AI to form their opinions, then write about it; this writing will feed back to train future generations of AIs, and so on.
>
>
> Jason
>
>
> On Tue, Feb 20, 2024, 8:49 AM BillK via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>
>> On Tue, 20 Feb 2024 at 10:53, Ben Zaiboc via extropy-chat
>> <extropy-chat at lists.extropy.org> wrote:
>> >
>> > OK, this hinges on what 'emulate absolutely everything' means.
>> >
>> > When talking about uploading, I do mean a complete emulation of the human brain and body, in a virtual external environment. The emulation would need to be accurate, but only for all the /relevant/ bits. The rest could be an approximation, taken from one or more sets of standard data that could be used for anyone (a bit like picking your initial, pre-made avatar in current VR environments).
>> >
>> > We are pretty sure that there's a ton of relevant bits in the brain, and at first would be afraid to omit anything, I expect.
>> > The body and external environment are on a decreasing scale of relevance (not importance, but relevance to uploading a specific person).
>> >
>> > We have previously discussed on here the idea of a generic upload template, which is then varied with information from a specific individual. I'm imagining a 'model body', a standard set of information that relates to a virtual human body, and includes things like immune system, endocrine system, etc. Probably also some representation of the microbiome too. Yes, this would be complex, but would only need to be done once, not for every single upload.
>> >
>> > Maybe a generic standard body would be fine for most people, maybe some variations would need to be added for certain people, we don't really know yet. Maybe people would turn out to be good at adapting to the fact that their new virtual body was lacking the annoying things that most people would be happy to change (e.g. their poor vision, or hearing, etc., or the proportions of their limbs, etc.). Of course, one of the appeals of uploading would be that you could tweak your body to be just about anything you like. Perhaps everyone could start out with a standard model, then proceed to customise it.
>> >
>> > Now someone is going to say "if you don't transfer the right information about the endocrine functions (gut microbiome/immune system/what-have-you), are they the same person?". That's a philosophical question along the same lines as "are you the same person you were yesterday/before your operation/last year/before that course of antibiotics/etc.?".
>> >
>> > I'm of the opinion that these things are vastly less important than the structure of your brain, for determining whether you are 'the same person'. As long as I was furnished with a capable virtual body, not necessarily closely modelled on my biological one, I think I'd be happy to upload (given that my mind was correctly reproduced).
>> >
>> > So, yes, I think that we will need to emulate more than just the brain, but no, I don't think that head-only storage is insufficient. We can, in essence, 'make up' the non-brain stuff that will be necessary, but doesn't have to be exactly as per the original biological body.
>> >
>> > We'd need virtual external environments too, but I don't think anybody is going to argue that the details of those will be significant for the fidelity of an upload. And we will definitely want to change those anyway.
>> >
>> > I can see a set of future jobs, based on designing various things for uploads: virtual body design, with lots of specialist areas, sensorium design, external environments, etc.
>> >
>> > Oh, and Mr AI is clearly talking out of his non-existent arse. I'm astonished that there are people who seriously think that an upload would be 'devoid of physical sensations'! If this is a widespread idea, no wonder lots of people aren't keen on it.
>> >
>> > Ben
>> > _______________________________________________
>>
>>
>> One of the handy things about having an AI available is that if you
>> dislike what it says, you can ask for an explanation.  :)
>> To me, what it seems to say is that uploads don't 'feel' anything,
>> they just process digital computer signals.
>> BillK
>>
>> Question -
>> Devoid of Physical Sensations in Human Upload
>>
>> Answer -
>> In the context of a human upload, the term “devoid of physical
>> sensations” refers to the absence of the sensory experiences that are
>> inherent to physical bodies. When a human is uploaded into a digital
>> or virtual environment, they would not have the ability to feel
>> physical sensations such as touch, pain, temperature, or any other
>> bodily sensations that are experienced through the physical body.
>>
>> In a digital or virtual environment, the uploaded human consciousness
>> would exist as data or information within a computational system.
>> While the individual’s thoughts, memories, and consciousness may be
>> preserved in this digital form, the sensory experiences that are tied
>> to the physical body would no longer be accessible. This means that
>> the uploaded individual would not be able to perceive or interact with
>> the virtual world in the same way they would in a physical body.
>>
>> Without physical sensations, the uploaded human would lack the ability
>> to feel tactile sensations, experience emotions through physical
>> responses, or engage with the environment through sensory input. This
>> absence of physical sensations could significantly impact the way in
>> which the uploaded individual interacts with and perceives their
>> digital existence.
>>
>> Overall, being “devoid of physical sensations” in a human upload
>> scenario implies a separation from the sensory experiences that define
>> our physical reality and a shift towards a purely cognitive and
>> informational existence within a digital realm.
>> ----------------
>>
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>


