[ExI] Dark AI is fueling cybercrime

Ben Zaiboc ben at zaiboc.net
Fri Oct 24 20:55:25 UTC 2025


On 24/10/2025 17:31,
BillK wrote:
> Dark AI is fueling cybercrime — and accelerating the cybersecurity arms race
> A look at how criminals are using unrestricted chatbots and how cyber
> defenders are fighting back.
> Mike Wehner     October 24, 2025
>
> <https://bigthink.com/the-future/dark-ai/>
> Quote:
> Key Takeaways
>   “Dark AI” — models and tools fine-tuned for fraud, hacking, and
> deception — are giving cybercriminals new ways to automate scams and
> launch attacks at scale.
> The legal system hasn’t had time to catch up, leaving a gray area
> where creating malicious AI tools isn’t illegal — only using them is.
> Cybersecurity experts are fighting fire with fire, using AI to detect
> threats, patch vulnerabilities, and counter hackers in real time.
> -----------------------------

And what counts as a 'malicious AI tool'?

Assuming that this is something that /needs/ to be legislated against 
sounds like a power-grab to me. Think about how you would legislate 
against 'malicious power tools', given a rise in the incidence of people 
using various power tools to commit crimes. Make possession of an 
angle-grinder illegal, perhaps?

There's no such thing as a malicious tool. Using tools maliciously is, 
and should be, a crime. Even a guillotine is not malicious. Using one on 
someone is.

> AI works for the bad guys as well as the good guys.
> BillK

Of course. Just like fire, electricity, screwdrivers and every other 
technology you can think of.

Will we never learn that it's actual crimes that need to be prosecuted, 
not the ability to commit a crime?

-- 
Ben
