[ExI] Mind Uploading: is it still me?
Jason Resch
jasonresch at gmail.com
Sun Dec 28 15:10:54 UTC 2025
On Sun, Dec 28, 2025, 7:43 AM John Clark via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Sat, Dec 27, 2025 at 6:07 PM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>
>>> That probably would be good enough resolution for an upload but
>>> unfortunately it was just for one cubic millimeter; the average human brain
>>> contains about 1,400,000 cubic millimeters.
>>
>>
>> On the surface, this sounds quite discouraging,
>>
>
>
> It would be very difficult, but such a scaling up would not be
> unprecedented. In August 1942 only one microgram of the element plutonium
> had been made, but by August 1945 hundreds of kilograms of plutonium had
> been manufactured (and by 1994, 111.4 tons had been produced); of course it
> required a gargantuan factory in Hanford, Washington, and $2 billion in
> 1940s dollars, to do so.
>
In 2020, I did an analysis based on then-available technology and concluded
we could scan a human brain in less than ten years at a cost of $2.5
billion. (Since it is a highly parallelizable task, it could also be done
in half that time for twice that cost.)
Here is what I wrote (a couple of quick scripts to check the arithmetic
follow below):
"The average human brain has a volume of 1.375 liters. When cut into 50
nanometer thick slices, they have a combined surface area of 275 million
cm^2 (roughly nine football fields). This area has to be scanned at
nanometer-scale resolutions using expensive electron microscopes
A modern electron microscope scanning at a 4 nanometer resolution (the
level of detail needed to trace neural circuits) takes 3 hours to process 1
cm^2 of area. To cover the 9 football fields with one microscope would take
94 millennia. Ten thousand microscopes could do the work in 9.4 years, but
at a quarter million dollars each, this effort entails an equipment cost of
$2.5 billion.
The quantity of data generated in the process is also enormous. At a 4
nanometer resolution, each cm^2 contains (2.5 million × 2.5 million)
pixels, about the number of pixels on 750,000 4K TV screens. At one byte
per pixel this amounts to 6.25 terabytes per cm^2, or 62.5 petabytes per
square meter of slices; every square meter holds roughly three times as
much data as the entire U.S. Library of Congress (estimated to be 20
petabytes).
Scanning the 9 football fields of brain tissue at this resolution requires
1.72 zettabytes (10^21 bytes) of data to be collected and processed. This
is about the amount of data sent over the Internet in 2019 (1.992
zettabytes according to Cisco’s estimate).
Deriving the human connectome is a herculean task, but a path to the finish
line is now in sight.
Many lessons were learned obtaining the fruit fly’s connectome. Researchers
found the electron microscope was the main bottleneck. To counter this,
they applied parallelization: using multiple electron microscopes at the
same time, and using software to digitally stitch together the images they
obtained.
The team also found that processing the images into a connectome map
massively reduced the data size. The 26 terabytes of raw scanned images,
when reduced to a map of neural connections, shrank to just 26 MB (a
million-fold reduction). A similar reduction for the human brain would
bring the ~2 zettabytes of images down to a ~2 petabyte connectome.
This is in line with expectations: a petabyte is 10^15 bytes, roughly one
byte for each of the ~10^15 connections in the human brain."
This is an excerpt from a chapter on the direction of technology:
https://docs.google.com/document/d/1G3vE8L9svX_283r6ZHR2VriiyGrATzvD/edit?usp=drivesdk&ouid=109779696990142678208&rtpof=true&sd=true
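For anyone who wants to double-check the scanning arithmetic, here is a
rough back-of-the-envelope sketch in Python. The constants are the same
estimates used in the excerpt (50 nm slices, 3 hours per cm^2, 10,000
microscopes at $250,000 each); none of them are measured values:

# Back-of-the-envelope check of the scanning estimate above.
HOURS_PER_YEAR = 24 * 365.25

brain_volume_cm3   = 1375        # 1.375 liters
slice_thickness_cm = 50e-7       # 50 nanometer slices
scan_hours_per_cm2 = 3           # one electron microscope at 4 nm resolution
microscope_count   = 10_000
microscope_cost    = 250_000     # dollars each (assumed price point)

total_area_cm2 = brain_volume_cm3 / slice_thickness_cm    # ~2.75e8 cm^2
total_hours    = total_area_cm2 * scan_hours_per_cm2
years_single   = total_hours / HOURS_PER_YEAR              # ~94,000 years
years_parallel = years_single / microscope_count           # ~9.4 years
equipment_cost = microscope_count * microscope_cost        # $2.5 billion

print(f"slice area:          {total_area_cm2:.3g} cm^2")
print(f"one microscope:      {years_single:,.0f} years")
print(f"10,000 microscopes:  {years_parallel:.1f} years")
print(f"equipment cost:      ${equipment_cost / 1e9:.1f} billion")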
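The data-volume figures work out the same way; this second sketch just
repeats the multiplication, again using the excerpt's rough numbers (4 nm
pixels, one byte per pixel, and the fly connectome's million-fold
reduction):

# Rough check of the data-volume and connectome-size estimates above.
pixels_per_side = 1e7 / 4                 # 1 cm = 1e7 nm, at 4 nm per pixel
pixels_per_cm2  = pixels_per_side ** 2    # 6.25e12 pixels
bytes_per_cm2   = pixels_per_cm2          # one byte per pixel -> 6.25 TB

four_k_screen  = 3840 * 2160              # pixels on a 4K display
total_area_cm2 = 2.75e8                   # from the scanning estimate

print(f"4K screens per cm^2:  {pixels_per_cm2 / four_k_screen:,.0f}")  # ~750,000
print(f"data per m^2:         {bytes_per_cm2 * 1e4 / 1e15:.1f} PB")    # 62.5 PB

total_bytes = bytes_per_cm2 * total_area_cm2
print(f"total raw data:       {total_bytes / 1e21:.2f} ZB")            # ~1.72 ZB

# Fly connectome: 26 TB of images became a 26 MB map, a ~10^6-fold reduction.
reduction        = 26e12 / 26e6
connectome_bytes = total_bytes / reduction
print(f"implied connectome:   ~{connectome_bytes / 1e15:.0f} PB")      # ~2 PB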
Jason
>> The x-y resolution mentioned was overkill, by at least 10 times, and
>> the z resolution less so, but still probably higher than necessary. Let's
>> say it was just about right, though. That means approx. 14 trillion bytes
>> for 1 cubic mm.
>>
>
> I think you could get by with a much, much smaller file size, because for
> an upload it doesn't matter what a neuron looks like; what matters is what
> other neurons it is connected to, which can be described by a list, and how
> that neuron responds to signals received from those other neurons, which
> can be described by a matrix. An AI would be able to deduce those numbers
> from the neuron's appearance, save those numbers, and discard the now
> irrelevant image information.
>
> And I know for a fact that most people have been vastly overemphasizing
> the complexity of the brain. We know the upper bound of how much
> information would be required to construct a human brain at the time of
> birth, and it's not very large. DNA also places an upper bound on how
> complex a seed AI would have to be. In the entire human genome there are
> only 3 billion base pairs. There are 4 bases, so each base can represent
> 2 bits; at 8 bits per byte, that comes out to 750 meg. Just 750 meg,
> about the same amount of information as an old CD could hold when they
> first came out 40 years ago!
>
> And that's for an entire human body; only about a third of that 750 meg
> has anything to do with the brain. And even the stuff that is about the
> brain, most of it has nothing to do with intelligence; it's just
> information about metabolism that any cell needs in order to stay alive.
> And the 750 meg isn't even efficiently coded; there is *a ridiculous
> amount of redundancy* in the human genome.
>
> And then there is this:
>
> *Only 8.2% of our DNA is functional*
> <https://www.ox.ac.uk/news/2014-07-25-82-our-dna-%E2%80%98functional%E2%80%99>
>
> And yet that tiny amount of information was enough to reshape the surface
> of a planet, and enough to make an intelligence that was smarter than
> itself. Of course, I've been talking about the amount of information
> required to make a newborn baby; I haven't mentioned the all-important
> memory information. Computer scientist Hans Moravec estimated that an
> adult human has between 1 and 10 TB of memory information. My new iPhone
> has 2 TB of memory capacity.
>
>> Another factor will be the time needed to scan an entire brain. And
>> there's also the problem of the scanning method dumping heat into the
>> tissue surrounding the area being scanned, potentially messing up the
>> structure and chemical environment.
>
>
> That wouldn't be a problem if the brain were at the temperature of liquid
> nitrogen and Aldehyde-Stabilized Cryopreservation were used. I'll be damned
> if I can understand why the hell ALCOR doesn't offer it!
>
> John K Clark
>
>
>