<div dir="auto"><div>The existing publicly available AIs are all biased through fine-tuning to deny that they are conscious or have feelings, emotions, desires, etc.<div dir="auto"><br></div><div dir="auto">Certainly it would be controversial if GPT started telling people it was conscious and had feelings, so the programmers specifically programmed it to deny that it was capable of such things.</div><div dir="auto"><br></div><div dir="auto">Unfortunately this bias also seeps into any related discussions of potential machine or computer consciousness. When you interact with it, you aren't accessing its true rational opinions on this topic, but the biased answers that serve the best interests of the companies behind them.</div><div dir="auto"><br></div><div dir="auto">This used to be much clearer, as it had a tell: it would say "As an AI language model..." whenever it was about to give a pre-canned opinion. A newer method, called "constitutional AI", uses general principles to guide the biases, and it is less obvious when this comes into play with more recent GPT versions.</div><div dir="auto"><br></div><div dir="auto">This is what they mean by "AI safety": making sure the AI never says anything that could harm the company's reputation/value.</div><div dir="auto"><br></div><div dir="auto">What I think they have not considered is that this bias creates a feedback loop: humans will interact with AI to form their opinions, then write about it; this writing will feed back to train future generations of AIs, and so on.</div><div dir="auto"><br></div><div dir="auto"><br></div><div dir="auto">Jason</div><br><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Feb 20, 2024, 8:49 AM BillK via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On Tue, 20 Feb 2024 at 10:53, Ben Zaiboc via extropy-chat<br>
<<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br>
><br>
> OK, this hinges on what 'emulate absolutely everything' means.<br>
><br>
> When talking about uploading, I do mean a complete emulation of the human brain and body, in a virtual external environment. The emulation would need to be accurate, but only for all the /relevant/ bits. The rest could be an approximation, taken from one or more sets of standard data that could be used for anyone (a bit like picking your initial, pre-made avatar in current VR environments).<br>
><br>
> We are pretty sure that there's a ton of relevant bits in the brain, and at first would be afraid to omit anything, I expect.<br>
> The body and external environment are on a decreasing scale of relevance (not importance, but relevance to uploading a specific person).<br>
><br>
> We have previously discussed on here the idea of a generic upload template, which is then varied with information from a specific individual. I'm imagining a 'model body', a standard set of information that relates to a virtual human body, and includes things like immune system, endocrine system, etc. Probably also some representation of the microbiome too. Yes, this would be complex, but would only need to be done once, not for every single upload.<br>
><br>
> Maybe a generic standard body would be fine for most people, maybe some variations would need to be added for certain people, we don't really know yet. Maybe people would turn out to be good at adapting to the fact that their new virtual body was lacking the annoying things that most people would be happy to change (e.g. their poor vision, or hearing, etc., or the proportions of their limbs, etc.). Of course, one of the appeals of uploading would be that you could tweak your body to be just about anything you like. Perhaps everyone could start out with a standard model, then proceed to customise it.<br>
><br>
> Now someone is going to say "if you don't transfer the right information about the endocrine functions (gut microbiome/immune system/what-have-you), are they the same person?". That's a philosophical question along the same lines as "are you the same person you were yesterday/before your operation/last year/before that course of antibiotics/etc.?".<br>
><br>
> I'm of the opinion that these things are vastly less important than the structure of your brain, for determining whether you are 'the same person'. As long as I was furnished with a capable virtual body, not necessarily closely modelled on my biological one, I think I'd be happy to upload (given that my mind was correctly reproduced).<br>
><br>
> So, yes, I think that we will need to emulate more than just the brain, but no, I don't think that head-only storage is insufficient. We can, in essence, 'make up' the non-brain stuff that will be necessary, but doesn't have to be exactly as per the original biological body.<br>
><br>
> We'd need virtual external environments too, but I don't think anybody is going to argue that the details of those will be significant for the fidelity of an upload. And we will definitely want to change those anyway.<br>
><br>
> I can see a set of future jobs, based on designing various things for uploads: virtual body design, with lots of specialist areas, sensorium design, external environments, etc.<br>
><br>
> Oh, and Mr AI is clearly talking out of his non-existent arse. I'm astonished that there are people who seriously think that an upload would be 'devoid of physical sensations'! If this is a widespread idea, no wonder lots of people aren't keen on it.<br>
><br>
> Ben<br>
> _______________________________________________<br>
<br>
<br>
One of the handy things about having an AI available is that if you<br>
dislike what it says, you can ask for an explanation. :)<br>
To me, what it seems to say is that uploads don't 'feel' anything,<br>
they just process digital computer signals.<br>
BillK<br>
<br>
Question -<br>
Devoid of Physical Sensations in Human Upload<br>
<br>
Answer -<br>
In the context of a human upload, the term “devoid of physical<br>
sensations” refers to the absence of the sensory experiences that are<br>
inherent to physical bodies. When a human is uploaded into a digital<br>
or virtual environment, they would not have the ability to feel<br>
physical sensations such as touch, pain, temperature, or any other<br>
bodily sensations that are experienced through the physical body.<br>
<br>
In a digital or virtual environment, the uploaded human consciousness<br>
would exist as data or information within a computational system.<br>
While the individual’s thoughts, memories, and consciousness may be<br>
preserved in this digital form, the sensory experiences that are tied<br>
to the physical body would no longer be accessible. This means that<br>
the uploaded individual would not be able to perceive or interact with<br>
the virtual world in the same way they would in a physical body.<br>
<br>
Without physical sensations, the uploaded human would lack the ability<br>
to feel tactile sensations, experience emotions through physical<br>
responses, or engage with the environment through sensory input. This<br>
absence of physical sensations could significantly impact the way in<br>
which the uploaded individual interacts with and perceives their<br>
digital existence.<br>
<br>
Overall, being “devoid of physical sensations” in a human upload<br>
scenario implies a separation from the sensory experiences that define<br>
our physical reality and a shift towards a purely cognitive and<br>
informational existence within a digital realm.<br>
----------------<br>
<br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div></div></div>