<div dir="auto"><div><br><br><div class="gmail_quote gmail_quote_container"><div dir="ltr" class="gmail_attr">On Tue, Mar 24, 2026, 7:24 AM John Clark via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif"><span style="font-size:large;font-family:Arial,Helvetica,sans-serif">On 21/03/2026 21:19, Jason Resch wrote:</span></div></div><div class="gmail_quote"><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><font size="4" face="georgia, serif"><i>
> Functionalism is a theory in the philosophy of mind. If one accepts functionalism, then that is enough to establish the uploaded mind will be conscious.</i></font><br></blockquote><div><br></div><div class="gmail_default"><font face="arial, helvetica, sans-serif"></font><font size="4" face="tahoma, sans-serif"><b>Yes. And an implicit belief in functionalism is the reason you feel certain that solipsism is untrue and your fellow human beings are conscious, except when they are sleeping or under anesthesia or dead. </b></font></div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">I would say that strictly speaking functionalism isn't enough to escape solipsism, which also requires an ontological claim (e.g., that other people I see are real with functional brains of their own, rather than figments of my imagination). But functionalism could perhaps be used to argue that even if they were figments of your imagination, then at a certain point of accuracy, your brain generating a simulation of their behavior would invoke something like a functional process that emulates (and thus generates) their mind.</div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><font size="4" face="georgia, serif"><i>
> But functionalism is silent on the question of which experiences instantiated in which places are experience[s] you can expect to be yours.</i></font><br></blockquote><div><br></div><div class="gmail_default"><span style="font-family:arial,helvetica,sans-serif"></span><font face="tahoma, sans-serif" size="4"><b>No, and when discussing this topic great care is needed in the use of personal pronouns. According to functionalism the "you" of yesterday is the "you" who says he remembers being the "you" of yesterday. And yes, if two beings are able to do that then they are both the "you" of yesterday.</b></font></div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">Then you are assuming more than just functionalism. You're subscribing to a memory-based theory of personal identity. This is common, but by no means universal. There are functionalists who would consider surviving through amnesia plausible.</div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div class="gmail_default"><font face="tahoma, sans-serif" size="4"><b> Using this procedure one can always look back through time and see a continuous chain of "yous", but trying to do this into the future does not work, it would be like pushing on a string.</b></font></div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">It works fine. You just aren't using your imagination. Set up a thought experiment that jumps forward in time. Then you can apply your rearward-facing identity comparison function to that future state. If it matches, then you can infer that this current you is indeed linked to that future you. 
Thus you can make predictions using it.</div><div dir="auto"><br></div><div dir="auto">After all, what good is a theory that can't make predictions?</div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div class="gmail_default"><font face="tahoma, sans-serif" size="4"><b> As Hugh Everett said in his original PhD thesis that introduced the Many Worlds interpretation of quantum mechanics, it would be like asking which one was the real original amoeba after it reproduced by dividing in two. </b></font></div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">That's a different problem. In the amoeba case (like a split-teletransporter case) there is no unique original, for both have an equal claim.</div><div dir="auto"><br></div><div dir="auto">But note that theories of personal identity don't all demand unique individuals. Many are fine with saying both amoebae are the same amoeba, or that both results of a splitting teletransporter are equally the same person.</div><div dir="auto"><br></div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div class="gmail_default"><font face="tahoma, sans-serif" size="4"><b><br></b></font></div><div class="gmail_default"><font face="tahoma, sans-serif" size="4"><b>And if you reject functionalism then you'd need to take the idea that you're the only conscious being in the universe seriously. Do you really want to do that? 
</b></font></div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">There are many routes into and out of solipsism, but they are largely independent of any assumptions in the philosophy of mind.</div><div dir="auto"><br></div><div dir="auto">Jason </div><div dir="auto"><div class="gmail_quote gmail_quote_container"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div class="gmail_default"></div></div></div>
</blockquote></div></div></div>