[ExI] AI and scams

Ben Zaiboc ben at zaiboc.net
Tue Apr 30 12:21:24 UTC 2024

> Digital life is getting more dangerous


Authoritarian governments are threatened by the free flow of 
information, so they have a vested interest in combating it. One way to 
do that is to create a walled garden, but everyone knows that has 
limited effectiveness. A better way is to make people distrust what they 
see, by flooding them with false and confusing information. This is 
obvious in some cases, but I would be very surprised if there weren't a 
lot of covert operations around the world creating and spreading 
disinformation that gets ever more subtle and difficult to detect as 
time goes on. That then makes it easy to brand genuine information (that 
a regime wants to censor or suppress) as disinformation too.

This seems even to extend to scientific papers. There has been a flood 
of recent cases of papers that were obviously written by AI systems, 
often with blatant clues to their origin left in the text, casting 
doubt on their accuracy and validity.

If a regime is 'anti-science', what better way than this to discredit 
science and muddy the waters? It's cheap and easy, and it can be done on 
a massive scale. It threatens the backbone of international scientific 
co-operation: trust in the published record.

These things are much more worrying than national 'firewalls', as they 
affect the whole world.

I don't see any good way to combat or guard against this trend (do 
you?). Scientific journals doing more to vet submitted papers would be 
a start, but only a limited one. We may be heading towards a much more 
fragmented world in which relatively small groups of people trust only 
information from within their own group (and must constantly verify 
that even that information is accurate), which would slow down progress 
immensely.

Ironic, because it's the very AI that we expect to lead to the 
singularity that is enabling this. Are we getting closer, or farther 
away? It's difficult to say. It looks like a negative feedback 
mechanism that prevents cultures from achieving a singularity just as 
they get close to it. Another candidate for the 'great filter'? Maybe.

Ben
