<div class="moz-cite-prefix">On 25/04/2023 23:40, Giovanni
Santostasi wrote:<br>
</div>
<blockquote type="cite"
cite="mid:CAL+RtPdOq9zmVuH-p=NSaAMs3EOG9_CaTymSZ2p5mHQ-V1gw3w@mail.gmail.com">
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<div dir="ltr">Hi Ben and Jason,<br>
Can we all read this article and make sense of it? It seems very
relevant to the discussion. I love this idea of "semiotic
physics". I talked to Umberto Eco when I was a student in
Bologna about this exact idea even if it was very vague in my
head at that time. Eco was very encouraging but I never was able
to spend time on it. I think this could be a great tool to
understand LLMs.<br>
<br>
<a
href="https://www.lesswrong.com/posts/TTn6vTcZ3szBctvgb/simulators-seminar-sequence-2-semiotic-physics-revamped?fbclid=IwAR3AtV49lmoyF7F8imCwiN0XCdKJ84LIfX8ZeUyuWRGiDBM1qxupX-Lwweo"
moz-do-not-send="true" class="moz-txt-link-freetext">https://www.lesswrong.com/posts/TTn6vTcZ3szBctvgb/simulators-seminar-sequence-2-semiotic-physics-revamped?fbclid=IwAR3AtV49lmoyF7F8imCwiN0XCdKJ84LIfX8ZeUyuWRGiDBM1qxupX-Lwweo</a><br>
<br>
Giovanni </div>
>
> On Tue, Apr 25, 2023 at 1:06 PM Ben Zaiboc via extropy-chat
> <extropy-chat@lists.extropy.org> wrote:
>
>> On 25/04/2023 14:06, spike wrote:
>>> Cool, thx Ben. I had never thought of it that way, but it is a cause
>>> for hope. If we find enough ways a brain is like a computer, it
>>> suggests a mind can (in theory) exist in a computer, which is
>>> something I have long believed and hoped is true. If thought is
>>> substrate-dependent on biology, we are all sunk in the long run.
>>
>> Thought cannot be dependent on biology. This is something I've thought
>> about, and done research on, for a long time, and I'm completely
>> convinced: it's logically impossible. If it were true, then all of our
>> science and logic would be wrong.
>>
>> What we call 'a computer' is open to interpretation, and it may well
>> be that minds (human-equivalent and above) can't be implemented on the
>> types of computer we have now (we already know that simpler minds can
>> be). But that doesn't destroy the substrate indifference argument (I
>> never liked the term 'substrate independent', because it conjures up
>> the concept of a mind that has no substrate. 'Substrate indifferent'
>> is more accurate, imo (and yes, even that is not good enough, because
>> the substrate must be capable of supporting a mind, and not all will
>> be (we just need to find the right ones (and OMD, I'm turning into a
>> spikeian bracket nester!!)))).
>>
>> Ben

I keep seeing references to 'Alignment', with no attempt to define what
it means (as far as I can see; there are too many links and references
to follow quickly, so I'll try to have another look later).

Presumably it refers to 'alignment with human interests and goals', but
that isn't stated, and it's a huge and seemingly intractable problem
anyway.

Apart from that, after reading about a tenth of the article, I think the
answer, in my case, is 'no'. I don't have the background or the
intelligence, or both, to make sense of what it's saying. I don't
understand the concept of 'semiotic physics', despite the explanation,
and I don't understand what "we draw a sequence of coin flips from a
large language model" actually means. Are they asking an LLM to generate
some random series? What use is that?
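
My best guess (and it is only a guess, not something the article spells
out for me) is that it means sampling the model's next-token
probabilities, restricted to the two outcomes 'heads' and 'tails', and
feeding each outcome back in, so the "coin" is whatever biased,
history-dependent process the model happens to implement. A minimal
sketch of that guess, assuming a Hugging Face-style API; the model name,
prompt and two-token trick are my own placeholders, not anything taken
from the article:

# Guess at "drawing coin flips from an LLM": sample the next-token
# distribution restricted to " heads"/" tails" and condition on history.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal LM would do
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# First token id of " heads" / " tails" (good enough for a sketch).
heads_id = tok.encode(" heads")[0]
tails_id = tok.encode(" tails")[0]

text = "I flipped a coin and it came up"
flips = []
for _ in range(10):
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits[0, -1]              # next-token logits
    # Keep only the two outcome tokens and renormalise.
    p = torch.softmax(logits[[heads_id, tails_id]], dim=-1)
    outcome = "heads" if torch.multinomial(p, 1).item() == 0 else "tails"
    flips.append(outcome)
    text += " " + outcome + ", then it came up"        # condition on history

print(flips)  # usually nothing like a fair 50/50 sequence

If that's all it means, then presumably the point is that the statistics
of the "flips" tell you something about the model rather than about
coins, but the article doesn't say so in words I can follow.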

I think it's safe to say I understand almost nothing in this article (if
the first tenth is any guide; I'm sure there's some arcane statistical
technique that can predict that!).

I'm afraid it's way beyond me. I just have a little brain.

Ben