[ExI] AI 2027: A Realistic Scenario of AI Takeover
John Clark
johnkclark at gmail.com
Tue Oct 7 11:18:54 UTC 2025
On Mon, Oct 6, 2025 at 4:35 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> There is room for smaller AIs to self-replicate onto a lot more
> platforms, but those smaller AIs need to be able to self-improve to
> pull off something like this scenario, and those who are running
> self-improving AIs generally don't see the point in using smaller AIs
> for their work.
A smaller model doesn't necessarily mean a less capable model, thanks to
a technique that has already proven very effective called "knowledge
distillation". A smart but large, and therefore expensive to operate, AI
is used to teach a much smaller AI with far fewer parameters. The goal is
for the small model to mimic the behavior of the large model, and a
"distillation loss function" grades the small model and tells it how
well it's doing; the grade is not just pass or fail.
For example, if the small model needs 20 steps to solve a problem and
gets all of them correct except for step 17, where it misplaces a
decimal point, it still gets a fairly high grade because it understands
the general idea, and the large model explains to the small model what
its error is and why it didn't earn a perfect grade. After many
computations, and enough electricity to power a small city for a couple
of months, the small model is nearly as smart as the large model but is
faster and much cheaper to operate. And as I've said, none of this is
hypothetical; it has already been used to great effect.
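The graded, not-just-pass-or-fail scoring described above can be
sketched in a few lines. This is a minimal illustration of the common
soft-target distillation loss (a KL divergence between the teacher's
and student's output distributions); the logit values are invented for
the example, and a real training loop would of course backpropagate
through this loss rather than just compute it:

```python
# Minimal sketch of a soft-target distillation loss: the student is
# graded on how closely its output distribution matches the teacher's,
# so a near-miss earns a much better score than a confident wrong answer.
import math

def softmax(logits, temperature=1.0):
    """Convert raw scores to probabilities; a higher temperature
    softens the distribution so partial credit is visible."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the
    student's: 0 means perfect imitation, and the penalty grows
    smoothly as the student drifts from the teacher."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A student that nearly matches the teacher (one small slip) gets a
# far lower loss than one whose answer distribution is scrambled.
close = distillation_loss([5.0, 1.0, 0.5], [4.8, 1.1, 0.4])
far   = distillation_loss([5.0, 1.0, 0.5], [0.5, 5.0, 1.0])
```

The temperature parameter is what makes the grading graded: at high
temperature the teacher's distribution exposes how it ranks the wrong
answers too, which is exactly the "explanation" signal the student
learns from.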
Then you take several of those small models and train each to be a
specialist, you link those agents together with a routing mechanism
that decides which specialist is best suited to answer a given
question, and you have a new large model. And then you do another round
of distillation.
*John K Clark*