[ExI] teachers

BillK pharos at gmail.com
Sat Sep 16 22:51:21 UTC 2023


On Sat, 16 Sept 2023 at 23:02, efc--- via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> Upon some thought I decided to follow Bill's modus operandi of asking
> iask.ai the following questions about my scenarios below:
>_______________________________________________


Putting these questions to a different AI (there are many nowadays)
gave these answers -

1. How does a quantum computer store the information in 4000 qubits when
 there are only 2^260 atoms in the universe?

Quantum computers don't actually store information in physical qubits
the same way classical bits store 1s and 0s. Instead, the qubits exist
in superpositions of states that can represent many values
simultaneously. So 4000 qubits don't need to store 2^4000 distinct
values - they can explore a space of 2^4000 possibilities in parallel
during computation. The actual physical implementation only needs to
maintain the superposition, not store every distinct state. So there
is no need for an infeasible number of atoms. The power comes from
manipulating coherences among superpositions to perform computations
in ways not possible with classical bits. The final read-out then
samples from the quantum probability distribution to get the result.
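
To make the scale in that answer concrete, here is a minimal NumPy
sketch (my own illustration, not part of the AI's answer): an n-qubit
state is described by 2^n complex amplitudes, so brute-force classical
simulation blows up exponentially even though the device itself only
holds n physical qubits.

import numpy as np

def classical_state_size_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold the full 2**n amplitude vector classically."""
    return (2 ** n_qubits) * bytes_per_amplitude

# A 3-qubit register: 2**3 = 8 complex amplitudes; here the uniform
# superposition over all 8 basis states.
n = 3
state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
print(state)                        # 8 equal amplitudes
print(np.sum(np.abs(state) ** 2))   # probabilities sum to 1

# Why brute-force classical simulation blows up:
print(classical_state_size_bytes(30) / 1e9, "GB for 30 qubits")  # ~17 GB
# classical_state_size_bytes(4000) would be ~2**4000 * 16 bytes -- vastly
# more than the number of atoms in the observable universe, which is the
# point behind question 1. The hardware itself only holds 4000 physical
# qubits in superposition.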


2. What is the biggest quantum computer ever built in terms of qubits?

Of the machines listed below, the largest in terms of qubits is IBM's
'Eagle' processor with 127 qubits. Google's Sycamore processor, with
53 qubits, is best known for its 2019 quantum-supremacy demonstration.
Here are some notable quantum computers and their qubit counts:

IBM quantum computer 'Eagle' - 127 qubits (unveiled in November 2021)
IonQ quantum computer - 32 qubits
Rigetti quantum computer - 40 qubits
Honeywell quantum computer - 64 qubits
Quantum Brilliance quantum computer - 19 qubits
Intel quantum computer 'Horse Ridge' - 49 qubits
Amazon Braket quantum computer - 32 qubits

So in summary, IBM holds the largest qubit count in this list with the
127-qubit Eagle processor unveiled in 2021, while Google's 53-qubit
Sycamore is notable for its quantum-supremacy result. Many companies
and research labs are working to build even larger quantum computers
of hundreds of qubits and beyond in the near future. The race is on to
achieve practical quantum advantage over classical supercomputers.


3. Are there any limits to how big a quantum computer can be in terms of qubits?

There are both practical and theoretical limits to how large quantum
computers can scale in terms of number of qubits:

Practical limits:

Noise and decoherence - More qubits means more opportunities for
errors to creep in from environmental noise, limiting useful
computation time. Mitigating noise is a huge engineering challenge.
Connectivity - Connecting many qubits in useful ways for multi-qubit
gates becomes complex. Current hardware uses nearest-neighbor
interactions.
Components - Scaling up requires having many high quality quantum
processors, amplifiers, cables etc. Very technically challenging.
Temperature maintenance - Larger systems need more complex cooling
mechanisms to maintain quantum coherence.

Theoretical limits:

Threshold theorem - To maintain low error rates when correcting
errors, noise per qubit must be below a threshold to allow
fault-tolerant quantum error correction.
Surface code lattice - For 2D surface code error correction, each
logical qubit may require a lattice of thousands of physical qubits.
Overheads scale up costs.
Quantum parallelism - While qubits themselves can represent large
Hilbert spaces, the number of parallel operations is still limited by
the number of qubits.

So in practice, technical challenges and error rates may keep quantum
computers at a few hundred well-controlled qubits in the near term.
But the theoretical limits allow much larger machines, given low
enough physical error rates per qubit. For full fault tolerance, each
logical qubit may need on the order of a thousand physical qubits, so
a useful machine could require millions of physical qubits in total.
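
As a rough illustration of that error-correction overhead, here is a
back-of-the-envelope sketch (mine, not the AI's; the threshold value
and scaling formula are standard textbook approximations for the
surface code, not exact figures):

def physical_qubits_per_logical(d):
    # A distance-d surface-code patch uses roughly 2*d**2 physical qubits.
    return 2 * d ** 2

def logical_error_rate(p, d, p_th=1e-2):
    # Standard approximation: logical errors are suppressed roughly as
    # (p / p_th) ** ((d + 1) / 2) below the threshold p_th (~1e-2).
    return (p / p_th) ** ((d + 1) // 2)

p = 1e-3        # assumed physical error rate, 10x below threshold
target = 1e-12  # target logical error rate per operation

d = 3
while logical_error_rate(p, d) > target:
    d += 2      # surface-code distances are odd
print(f"distance {d}: ~{physical_qubits_per_logical(d)} "
      f"physical qubits per logical qubit")
# With p = 1e-3 this lands at a code distance in the mid-20s, i.e. on
# the order of a thousand physical qubits per logical qubit -- and
# millions in total for a machine with thousands of logical qubits.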


4. Is the brain a quantum computer?

No, there is no conclusive evidence that the brain acts as a quantum
computer. Here are some key reasons why the brain is unlikely to rely
on meaningful quantum effects:

Temperature - The brain operates at a high temperature, around 310 K.
Delicate quantum states generally require very low temperatures, near
absolute zero, to avoid decoherence (a rough comparison of the energy
scales involved is sketched after this answer).
Noise - The noisy, wet environment of the brain would quickly destroy
most quantum coherence. The brain lacks quantum error correction
mechanisms.
Signaling - Neurotransmitter signaling operates classically, without
evidence of exploiting quantum effects like superposition or
entanglement.
Structure - Neurons and synapses are far too large-scale and chaotic
to support quantum behavior. No atomic-scale qubit architecture
exists.
Observation - No quantum effects like interference or tunneling have
been observed in the brain. Its cognitive functions appear classically
simulatable.
Evolution - The brain slowly evolved within a classical environment.
There are no signs of selective pressures for quantum advantages.
Consciousness - Reports of quantum effects related to consciousness
remain speculative and lack robust evidence.

While some small-scale quantum effects may play minor roles in
biology, overall the brain appears to fully operate via classical
information processing. There is no evidence that the brain supports
fundamental quantum computation or that quantum mechanics explains its
core functions. The search continues for quantum phenomena in the
brain.
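
To put the temperature argument in numbers, here is a small sketch of
my own (not part of the AI's answer; the 5 GHz figure is an assumed,
typical transition frequency for a superconducting qubit, used only
for comparison):

# Compare thermal energy at body temperature with the level splitting
# of a typical engineered qubit (assumed ~5 GHz, superconducting-style).
k_B = 1.380649e-23        # Boltzmann constant, J/K
h = 6.62607015e-34        # Planck constant, J*s

T_brain = 310             # K, body temperature
T_fridge = 0.015          # K, dilution-refrigerator temperature

f_qubit = 5e9             # Hz, assumed qubit transition frequency
E_qubit = h * f_qubit

print(f"thermal energy at 310 K:  {k_B * T_brain:.2e} J")
print(f"qubit level splitting:    {E_qubit:.2e} J")
print(f"ratio kT/E at 310 K:      {k_B * T_brain / E_qubit:.0f}")
print(f"ratio kT/E at 15 mK:      {k_B * T_fridge / E_qubit:.2f}")
# At 310 K the thermal energy exceeds such a qubit's splitting by a
# factor of roughly 1300, so the superposition would be scrambled
# almost instantly; at 15 mK it is well below the splitting, which is
# why engineered quantum computers are run near absolute zero.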
-----------------------------------------

Interestingly different answers?


BillK

