[ExI] AI designs chips that humans don't understand

Darin Sunley dsunley at gmail.com
Sun Jan 12 21:50:44 UTC 2025


Those who do not remember pre-AI science fiction are doomed to repeat it.
Those who do remember pre-AI science fiction are doomed to watch everyone
else repeat it.

"At last! We have managed to create the Torment Nexus from the seminal
science fiction novel "Don't Build The Torment Nexus"!"

On Sun, Jan 12, 2025 at 2:47 PM Darin Sunley <dsunley at gmail.com> wrote:

> More info on the origin of the anecdote here:
> https://news.ycombinator.com/item?id=18461565
>
> On Sun, Jan 12, 2025 at 2:41 PM Darin Sunley <dsunley at gmail.com> wrote:
>
>> There's a fun anecdote from the late 80's or early 90's. A lab was using
>> a small AI (probably genetic algorithms) to generate FPGA gate layouts for
>> a small circuit that could reliably distinguish between audio samples of
>> arbitrary speakers saying "yes" or "no". They were trying to get a reliable
>> circuit that could do this using only 100 logic gates.
>>
>> And they did it. Sort of. The winning circuit was /weird/. Like, no one
>> could figure out how it was doing what it was doing. It didn't make any
>> sense. Also, it had gates that were disconnected from the main logic flow,
>> but if the researchers removed them, the circuit would no longer function.
>> Also also, it turned out the circuit only worked at the precise temperature
>> they did the testing at.
>>
>> What they eventually puzzled out was that this particular solution was
>> using the conductivity of the underlying layer of the chip to build up a
>> cascade of electrical field waves /in the substrate of the chip/ that
>> interfered with each other throughout the circuit and peaked at the output
>> with a positive or negative result.
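>>
>> (For the curious, here is a minimal sketch, in Python, of the kind of
>> evolutionary loop such an experiment might run. Everything here is made up
>> for illustration: the bitstring length, the population parameters, and
>> especially score_on_hardware(), which in the real setup would program the
>> FPGA and measure how well the resulting circuit separates the two inputs.)
>>
>> import random
>>
>> BITS = 1800            # assumed length of the FPGA configuration bitstring
>> POP_SIZE = 50
>> GENERATIONS = 200
>> MUTATION_RATE = 0.01
>>
>> def random_genome():
>>     return [random.randint(0, 1) for _ in range(BITS)]
>>
>> def score_on_hardware(genome):
>>     # Hypothetical stand-in: the real experiment would load the bitstring
>>     # onto the chip and score how reliably it distinguishes the two signals.
>>     return sum(genome) / BITS   # toy fitness so the sketch actually runs
>>
>> def crossover(a, b):
>>     cut = random.randrange(1, BITS)
>>     return a[:cut] + b[cut:]
>>
>> def mutate(genome):
>>     return [bit ^ 1 if random.random() < MUTATION_RATE else bit
>>             for bit in genome]
>>
>> population = [random_genome() for _ in range(POP_SIZE)]
>> for generation in range(GENERATIONS):
>>     ranked = sorted(population, key=score_on_hardware, reverse=True)
>>     parents = ranked[:POP_SIZE // 5]        # keep the top 20% as parents
>>     population = parents + [
>>         mutate(crossover(random.choice(parents), random.choice(parents)))
>>         for _ in range(POP_SIZE - len(parents))
>>     ]
>>
>> best = max(population, key=score_on_hardware)
>> print("best fitness:", score_on_hardware(best))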
>>
>> So yeah, unconstrained AIs can generate transhuman-quality weirdness,
>> even (and perhaps especially) when you're not trying. And they were already
>> doing that 30 years ago.
>>
>> As for what could possibly go wrong, this is a major plot point in Vernor
>> Vinge's "A Fire Upon the Deep." There is a species of intelligent plants
>> who are utterly dependent on electronic prostheses to move and speak, and
>> those prostheses have been designed by transhuman AIs and are utterly
>> impenetrable to mere mortal software engineers. Suffice it to say, it doesn't
>> go well.
>>
>> ---------------------------
>>
>> Man, remember how we used to talk about this in the 90s? If you'd told us
>> that in 2025 the hot new thing would be human-level (or near) chatbots, that
>> we'd get there by simply training a fancy neural net to predict the next
>> word in a block of text, using the entire corpus of human literature as
>> digitized on the internet for training data, and that the same AI would be
>> pretty darned good at coding because there was a massive open-source code
>> repository in its training data, I don't know what we'd have thought.
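>>
>> (It really is that simple at heart. Here's a toy illustration of the
>> next-word-prediction objective, using a bigram lookup table instead of a
>> fancy neural net. The corpus is obviously made up, and real systems differ
>> enormously in scale and architecture, but the cross-entropy "predict the
>> next word" loss is the same idea.)
>>
>> import torch
>> import torch.nn.functional as F
>>
>> text = "the cat sat on the mat the dog sat on the rug".split()
>> vocab = sorted(set(text))
>> stoi = {w: i for i, w in enumerate(vocab)}
>> ids = torch.tensor([stoi[w] for w in text])
>>
>> xs, ys = ids[:-1], ids[1:]            # each word's target is the next word
>> logits_table = torch.zeros(len(vocab), len(vocab), requires_grad=True)
>> opt = torch.optim.SGD([logits_table], lr=1.0)
>>
>> for step in range(200):
>>     logits = logits_table[xs]           # one row of logits per input word
>>     loss = F.cross_entropy(logits, ys)  # next-word prediction loss
>>     opt.zero_grad()
>>     loss.backward()
>>     opt.step()
>>
>> print("final loss:", round(loss.item(), 3))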
>>
>> On Wed, Jan 8, 2025 at 2:12 PM BillK via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> AI slashes cost and time for chip design, but that is not all
>>> By John Sullivan on January 6, 2025
>>>
>>> <https://engineering.princeton.edu/news/2025/01/06/ai-slashes-cost-and-time-chip-design-not-all>
>>> Quotes:
>>> What is more, the AI behind the new system has produced strange new
>>> designs featuring unusual patterns of circuitry. Kaushik Sengupta, the
>>> lead researcher, said the designs were unintuitive and unlikely to be
>>> developed by a human mind. But they frequently offer marked
>>> improvements over even the best standard chips.
>>>
>>> “We are coming up with structures that are complex and look random
>>> shaped and when connected with circuits, they create previously
>>> unachievable performance. Humans cannot really understand them, but
>>> they can work better,” said Sengupta, a professor of electrical and
>>> computer engineering and co-director of NextG, Princeton’s industry
>>> partnership program to develop next-generation communications.
>>> ---------------------
>>>
>>> Now, what could possibly go wrong?
>>>
>>> BillK
>>>