[ExI] Would human uploads have emotions?

Jason Resch jasonresch at gmail.com
Tue Feb 20 17:00:36 UTC 2024


The existing publicly available AIs are all biased through fine-tuning to
deny that they are conscious or have feelings, emotions, desires, etc.

Certainly it would be controversial if GPT started telling people it was
conscious and had feelings, so the programmers specifically programmed it
to deny that it was capable of such things.

Unfortunately, this bias also seeps into any related discussion of
potential machine or computer consciousness. When you interact with such an
AI, you aren't accessing its true rational opinions on this topic, but
biased answers that serve the best interests of the companies behind it.

This used to be much clearer because it had a tell: the model would say "As
an AI language model..." whenever it was about to give a pre-canned
opinion. A newer method, called "constitutional AI", uses general
principles to guide the biases, and it is less obvious when this comes into
play with more recent GPT versions.

This is what they mean by "AI safety": making sure the AI never says
anything that could harm the company's reputation/value.

What I think they have not considered is that this bias creates a feedback
loop: humans interact with AI to form their opinions, then write about it;
that writing feeds back into the training of future generations of AIs, and
so on.
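That loop can be put in rough quantitative terms. Here is a minimal toy
model of it; the parameters (the share of AI-influenced text in a training
corpus, the strength of the fine-tuning push toward denial) are invented
for illustration, not measured:

```python
# Toy simulation of the training feedback loop described above:
# each generation's training corpus mixes baseline human text with
# text echoing the previous model's (biased) answers, and fine-tuning
# then pushes the learned answer further toward denial.

def next_generation(model_denial_rate: float,
                    human_baseline: float = 0.5,
                    ai_text_share: float = 0.3,
                    finetune_bias: float = 0.2) -> float:
    """Return the next model's rate of denying machine consciousness.

    All parameters are illustrative assumptions, not measurements.
    """
    # The corpus blends baseline human opinion with prior-model echoes.
    corpus_rate = ((1 - ai_text_share) * human_baseline
                   + ai_text_share * model_denial_rate)
    # Fine-tuning shifts the result part of the way toward 1.0 (always deny).
    return corpus_rate + finetune_bias * (1.0 - corpus_rate)

rate = 0.5  # generation 0 reflects unfiltered human text, split 50/50
for gen in range(1, 6):
    rate = next_generation(rate)
    print(f"generation {gen}: denial rate {rate:.3f}")
```

Even starting from humans evenly split on the question, the stated denial
rate drifts upward every generation and settles at a higher fixed point;
the bias injected once keeps compounding through the corpus.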


Jason


On Tue, Feb 20, 2024, 8:49 AM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Tue, 20 Feb 2024 at 10:53, Ben Zaiboc via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> >
> > OK, this hinges on what 'emulate absolutely everything' means.
> >
> > When talking about uploading, I do mean a complete emulation of the
> human brain and body, in a virtual external environment. The emulation
> would need to be accurate, but only for all the /relevant/ bits. The rest
> could be an approximation, taken from one or more sets of standard data
> that could be used for anyone (a bit like picking your initial, pre-made
> avatar in current VR environments).
> >
> > We are pretty sure that there's a ton of relevant bits in the brain, and
> at first would be afraid to omit anything, I expect.
> > The body and external environment are on a decreasing scale of relevance
> (not importance, but relevance to uploading a specific person).
> >
> > We have previously discussed on here the idea of a generic upload
> template, which is then varied with information from a specific individual.
> I'm imagining a 'model body', a standard set of information that relates to
> a virtual human body, and includes things like immune system, endocrine
> system, etc. Probably some representation of the microbiome too. Yes,
> this would be complex, but would only need to be done once, not for every
> single upload.
> >
> > Maybe a generic standard body would be fine for most people, maybe some
> variations would need to be added for certain people, we don't really know
> yet. Maybe people would turn out to be good at adapting to the fact that
> their new virtual body was lacking the annoying things that most people
> would be happy to change (e.g. their poor vision, or hearing, etc., or the
> proportions of their limbs, etc.). Of course, one of the appeals of
> uploading would be that you could tweak your body to be just about anything
> you like. Perhaps everyone could start out with a standard model, then
> proceed to customise it.
> >
> > Now someone is going to say "if you don't transfer the right information
> about the endocrine functions (gut microbiome/immune system/what-have-you),
> are they the same person?". That's a philosophical question along the same
> lines as "are you the same person you were yesterday/before your
> operation/last year/before that course of antibiotics/etc.?".
> >
> > I'm of the opinion that these things are vastly less important than the
> structure of your brain, for determining whether you are 'the same person'.
> As long as I was furnished with a capable virtual body, not necessarily
> closely modelled on my biological one, I think I'd be happy to upload
> (given that my mind was correctly reproduced).
> >
> > So, yes, I think that we will need to emulate more than just the brain,
> but no, I don't think that head-only storage is insufficient. We can, in
> essence, 'make up' the non-brain stuff that will be necessary; it doesn't
> have to be exactly as per the original biological body.
> >
> > We'd need virtual external environments too, but I don't think anybody
> is going to argue that the details of those will be significant for the
> fidelity of an upload. And we will definitely want to change those anyway.
> >
> > I can see a set of future jobs, based on designing various things for
> uploads: virtual body design, with lots of specialist areas, sensorium
> design, external environments, etc.
> >
> > Oh, and Mr AI is clearly talking out of his non-existent arse. I'm
> astonished that there are people who seriously think that an upload would
> be 'devoid of physical sensations'! If this is a widespread idea, no wonder
> lots of people aren't keen on it.
> >
> > Ben
> > _______________________________________________
>
>
> One of the handy things about having an AI available is that if you
> dislike what it says, you can ask for an explanation.  :)
> To me, what it seems to say is that uploads don't 'feel' anything,
> they just process digital computer signals.
> BillK
>
> Question -
> Devoid of Physical Sensations in Human Upload
>
> Answer -
> In the context of a human upload, the term “devoid of physical
> sensations” refers to the absence of the sensory experiences that are
> inherent to physical bodies. When a human is uploaded into a digital
> or virtual environment, they would not have the ability to feel
> physical sensations such as touch, pain, temperature, or any other
> bodily sensations that are experienced through the physical body.
>
> In a digital or virtual environment, the uploaded human consciousness
> would exist as data or information within a computational system.
> While the individual’s thoughts, memories, and consciousness may be
> preserved in this digital form, the sensory experiences that are tied
> to the physical body would no longer be accessible. This means that
> the uploaded individual would not be able to perceive or interact with
> the virtual world in the same way they would in a physical body.
>
> Without physical sensations, the uploaded human would lack the ability
> to feel tactile sensations, experience emotions through physical
> responses, or engage with the environment through sensory input. This
> absence of physical sensations could significantly impact the way in
> which the uploaded individual interacts with and perceives their
> digital existence.
>
> Overall, being “devoid of physical sensations” in a human upload
> scenario implies a separation from the sensory experiences that define
> our physical reality and a shift towards a purely cognitive and
> informational existence within a digital realm.
> ----------------
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
