<div dir="auto"><div><br><br><div class="gmail_quote gmail_quote_container"><div dir="ltr" class="gmail_attr">On Sun, Mar 8, 2026, 3:13 PM Brent Allsop via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><br><div>I agree, this is a big step beyond <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC7801769/" target="_blank" rel="noreferrer">the full emulation of C-elegense</a>, so thanks for sharing Giulio,</div><div>And I agree this is falsifying evidence for quantum theories and such as <a href="https://canonizer.com/topic/88-Theories-of-Consciousness/20-Orch-OR" target="_blank" rel="noreferrer">"Orch OR"</a> .</div><div><br></div><div>But you're going way too far by claiming this falsifies theories that predict #2 Emulations of brains aren't conscious or #3 Emulation of brains are differently conscious.</div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto"><br></div><div dir="auto">I am sorry if I was unclear. I did not claim #2 or #3 were falsified. Only that one of the three standard objections to functionalism (#1) has been falsified.</div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div><br></div><div>This may be convincing for you, who are not in those camps, the question is, is it falsifying for any of the supporters of those camps? I added this argument to the <a href="https://canonizer.com/topic/88-Theories-of-Consciousness/75-Orch-OR-Falsifying-Evidence?is_tree_open=0&asof=review" target="_blank" rel="noreferrer">"Orchestrated Object Reduction Falsifying Evidence"</a> camp, but I don't see any evidence that it has convinced any supporters of those camps. 
Some people have jumped camps based on certain pieces of evidence, but this evidence hasn't yet convinced anyone I can see in those camps.</div><div><br></div><div><br></div>And you hit the nail on the head when you said: </div><div dir="ltr"><br></div><div dir="ltr"><font color="#0000ff"><b>"Next we need those with functionally equivalent neural prosthetics who report no qualitative differences in their subjective qualia, which again may come soon."</b></font></div><div dir="ltr"><div><br></div><div>But you must also include the possibility that it may be something like glutamate being subjectively bound that is the only thing the subject reports as having a redness quality.</div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">Yes, it could go the other way. But the strange thing is, the person wouldn't be able to report the change (assuming it was a functionally equivalent substitution). Their behavior would have to be just the same as it would have been without the substitution, and so we should expect them to report no qualitative changes in their experience. Now, is it possible they could have such a qualitative change and be unable to report it, or even think about it (as thinking a different thought would involve alternate neural circuits being activated)? It seems strange to think of a change you couldn't think about or notice. At that point, is it really a change?</div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div><br></div><div>Much of the brain isn't conscious, or at least is subconscious (not subjectively bound into consciousness). It is very likely that C. elegans has no qualia, but the fruit fly may be using qualia. 
But until we know which of all our descriptions of stuff in the brain is a description of redness, we can't know if the fruit fly is using that, or is phenomenally conscious like that.</div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">It is also possible that what we consider to be unconscious is just a separate consciousness not integrated into the main sphere. Much like how, in a split-brain patient, each hemisphere considers the other hemisphere to be unconscious (when in actuality, both hemispheres are conscious).</div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div><br></div><div><br></div><div>We simply need to discover which of all our descriptions of stuff in the brain is a description of redness, before we know if anything is like redness.</div><div><br></div><div>Which again may come soon.</div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">Yes, uploaded human brains open a new possibility space of experimentation. 
We could tweak, deactivate, remove, modify, or disconnect any part of a brain.</div><div dir="auto"><br></div><div dir="auto">Jason </div><div dir="auto"><br></div><div dir="auto"></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, Mar 8, 2026 at 8:44 AM Jason Resch via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="auto"><div><br><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, Mar 8, 2026, 4:10 AM Giulio Prisco via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Now, this seems VERY cool:<br>
<br>
<a href="https://theinnermostloop.substack.com/p/the-first-multi-behavior-brain-upload" rel="noreferrer noreferrer noreferrer" target="_blank">https://theinnermostloop.substack.com/p/the-first-multi-behavior-brain-upload</a></blockquote></div></div><div dir="auto"><br></div><div dir="auto">Absolutely incredible!</div><div dir="auto"><br></div><div dir="auto">This removes from philosophy one of the three possible objections to functionalism:</div><div dir="auto"><br></div><div dir="auto"><s>1. Emulations of brains aren't possible</s></div><div dir="auto">2. Emulations of brains aren't conscious </div><div dir="auto">3. Emulations of brains are differently conscious</div><div dir="auto"><br></div><div dir="auto">This result rules out #1. which includes a wide class of theories, such as those that claim non computable physics or other non-algorithmic processes or quantum weirdness is required for the brain to function as it does. E.g. the microtubule / quantum gravity theories, by Hameroff and Penrose, and perhaps some versions of panpsychism or intrinsicist physicalism that would deny the possibility of functional equivalence without actually employing real physical particles having special properties than manifest in behavioral differences.</div><div dir="auto"><br></div><div dir="auto">Perhaps the biggest piece of news relating to philosophy of mind in many decades.</div><div dir="auto"><br></div><div dir="auto">Next we need those with functionally equivalent neural prosthetics who report no qualitative differences in their subjective qualia, which again may come soon.</div><div dir="auto"><br></div><div dir="auto">Jason</div><div dir="auto"><br></div><div dir="auto"><br></div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" rel="noreferrer noreferrer" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer noreferrer noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div></div></div>
</blockquote></div></div>
</blockquote></div></div></div>