[ExI] Dark AI is fueling cybercrime
    Adrian Tymes 
    atymes at gmail.com
       
    Fri Oct 24 15:57:27 UTC 2025
    
    
  
This is kind of like the "gray area" where creating guns isn't
illegal, only using them in unapproved ways (such as to murder) is.
For the most part, the legal system can handle this just fine.
There's little to "catch up" on.
It is not the AI tools themselves that are malicious, even if some
tools are more readily used for malicious ends.
On Fri, Oct 24, 2025 at 11:12 AM BillK via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> Dark AI is fueling cybercrime — and accelerating the cybersecurity arms race
> A look at how criminals are using unrestricted chatbots and how cyber
> defenders are fighting back.
> Mike Wehner     October 24, 2025
>
> <https://bigthink.com/the-future/dark-ai/>
> Quote:
> Key Takeaways
>  “Dark AI” — models and tools fine-tuned for fraud, hacking, and
> deception — are giving cybercriminals new ways to automate scams and
> launch attacks at scale.
> The legal system hasn’t had time to catch up, leaving a gray area
> where creating malicious AI tools isn’t illegal — only using them is.
> Cybersecurity experts are fighting fire with fire, using AI to detect
> threats, patch vulnerabilities, and counter hackers in real time.
> -----------------------------
>
> AI works for the bad guys as well as the good guys.
> BillK
>