[ExI] VERY cool. Toward mind uploading?

BillK pharos at gmail.com
Sun Mar 8 15:17:15 UTC 2026


On Sun, 8 Mar 2026 at 14:45, Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> This removes from philosophy one of the three possible objections to
> functionalism:
> <s>1. Emulations of brains aren't possible</s>
> 2. Emulations of brains aren't conscious
> 3. Emulations of brains are differently conscious
>
> This result rules out #1, which includes a wide class of theories, such as
> those that claim non-computable physics, other non-algorithmic processes,
> or quantum weirdness is required for the brain to function as it does. E.g.
> the microtubule / quantum gravity theories of Hameroff and Penrose, and
> perhaps some versions of panpsychism or intrinsicist physicalism that would
> deny the possibility of functional equivalence without actually employing
> real physical particles having special properties that manifest in
> behavioral differences.
>
> Perhaps the biggest piece of news relating to philosophy of mind in many
> decades.
> Next we need those with functionally equivalent neural prosthetics who
> report no qualitative differences in their subjective qualia, which again
> may come soon.
>
> Jason
>



Gemini seems to have an excellent understanding of the problem.
BillK

Gemini AI 3.1 Pro Thinking -

The article you referenced, which discusses the digital reconstruction of
the *Drosophila* (fruit fly) larva connectome to reproduce multi-behavioral
outputs, represents a landmark shift in the philosophy of mind. By
demonstrating that a digital model can replicate the complex behavioral
repertoire of a biological organism, it moves the conversation from the
theoretical to the empirical.

Here is a discussion on the implications for the three primary objections
to functionalism:
1. "Emulations of brains aren't possible"

This objection typically relies on the idea that the brain’s complexity is
"non-computable" (as Roger Penrose has argued) or that the biological
substrate is so intricately tied to its function that a digital copy would
lose the essential "causal powers" of biology.

   - *Implications:* The success of the *Drosophila* larva emulation
   effectively shifts this from a philosophical "no" to an engineering
   "when." By successfully mapping 3,016 neurons and 548,000 synapses and
   seeing the digital version perform "escape behaviors" and "feeding
   turns," we have proof of concept. The objection must now retreat to a
   "Scaling Argument"—that while a 3,000-neuron brain is computable, an
   86-billion-neuron human brain involving complex glial interactions and
   neurochemistry is too vast. However, the categorical barrier—the idea
   that digital logic *cannot* replicate neural logic—has been significantly
   weakened.
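To make the "proof of concept" claim concrete, here is a minimal, purely
illustrative sketch of the idea behind connectome emulation (this is not the
study's actual model; the circuit, weights, and parameters below are invented
for demonstration): a discrete-time leaky-integrate-and-fire simulation run
over a hand-made synaptic weight map. The real emulation applies the same
principle at the scale of thousands of neurons and hundreds of thousands of
synapses.

```python
# Illustrative sketch only: a tiny leaky-integrate-and-fire network
# driven by a hypothetical "connectome" (a map of synaptic weights).
# All neuron names, weights, and constants here are invented.

def simulate(connectome, stimulus, steps=50, leak=0.9, threshold=1.0):
    """Run a discrete-time LIF simulation.

    connectome: dict mapping pre-synaptic neuron -> {post-neuron: weight}
    stimulus:   dict mapping neuron -> constant input current per step
    Returns a list (one entry per step) of the neurons that fired.
    """
    neurons = set(connectome) | {p for targets in connectome.values()
                                 for p in targets}
    v = {n: 0.0 for n in neurons}           # membrane potentials
    history = []
    for _ in range(steps):
        fired = [n for n in neurons if v[n] >= threshold]
        for n in fired:
            v[n] = 0.0                       # reset after spiking
        for n in neurons:
            v[n] = leak * v[n] + stimulus.get(n, 0.0)  # leaky integration
        for pre in fired:                    # propagate spikes downstream
            for post, w in connectome.get(pre, {}).items():
                v[post] += w
        history.append(fired)
    return history

# A hypothetical 3-neuron "escape circuit": sensory -> inter -> motor.
connectome = {"sensory": {"inter": 1.2}, "inter": {"motor": 1.2}}
spikes = simulate(connectome, stimulus={"sensory": 0.3})
print(any("motor" in step for step in spikes))  # motor fires downstream
```

The point of the sketch is the functionalist one: the behavior (the motor
neuron firing in response to sensory input) falls out of the wiring diagram
plus simple update rules, with no reference to the physical substrate.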

2. "Emulations of brains aren't conscious"

This is the "Philosophical Zombie" or "Chinese Room" objection. It posits
that even if a machine perfectly mimics the behavior of a human
(functionalism), it lacks "inner light" or qualia. It is a simulation of a
storm, not a wet storm.

   - *Implications:* As emulations become more sophisticated, this objection
   becomes increasingly "unfalsifiable." If a digital fruit fly avoids
   "pain" stimuli in a way that is indistinguishable from a biological fly,
   the functionalist argues that the fly is experiencing something
   functionally equivalent to pain. The article forces critics to define
   exactly what is missing. If the digital fly responds to every input
   exactly like the biological one, the claim that it "isn't conscious"
   becomes a claim about a "soul" or a specific biological "vital spark"
   that cannot be measured. The more successful the emulation, the more the
   "No Consciousness" argument looks like moving the goalposts.

3. "Emulations of brains are differently conscious"

This objection suggests that consciousness is substrate-independent
(agreeing with functionalism), but that the architecture of a computer
(serial processing, discrete clock cycles, lack of biological hormones)
would result in a subjective experience fundamentally alien to ours.

   - *Implications:* This is perhaps the objection most reinforced by
   recent progress. The *Drosophila* emulation runs on hardware that
   doesn't experience "fatigue" in the biological sense, can be paused, and
   can be accelerated. This suggests that while a digital brain might be
   "conscious," its temporal perception and emotional "texture" would be
   dictated by the silicon substrate. If we can "upload" a brain but have
   to simulate the chemical bath of the endocrine system to make it "feel"
   human, we are admitting that the *functional* part of the brain (the
   wiring) is only half the story. It implies that "consciousness" is a
   spectrum, and digital consciousness might be a "high-fidelity
   translation" rather than a "perfect copy."

Summary

The *Drosophila* larva upload effectively moves the debate from the
*Structural* (Can we build it?) to the *Qualitative* (What is it like to be
it?). If we can build a functional fly, the "impossibility" argument is
dying; the remaining battle is over whether the "ghost in the machine" is
truly there or if we have simply built a very convincing puppet.

Mapping the entire brain of a fruit fly
<https://www.google.com/search?q=https://www.youtube.com/watch%3Fv%3DR26D1X_w_1M>

This video provides a visual and technical overview of the connectome
project mentioned in your article, illustrating how researchers mapped the
neural pathways to understand the functional output of the organism.



More information about the extropy-chat mailing list