atymes at gmail.com
Wed Mar 11 16:16:34 UTC 2020
On Wed, Mar 11, 2020 at 8:54 AM SR Ballard via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> It’s all fine and well to think this kind of thing is isolated, but it’s
> actually partially driven by AI: YouTube’s algorithm was changed
> specifically because people were getting sucked into this niche without
> ever having sought it out. After watching one mild video, they’d be
> presented with two that were a bit more intense, then four that were
> medium, and so on until they were hardcore decrying the women’s rights
> movement. They were literally groomed for extremism by YouTube’s AI.
I understand that this sort of thing was going on long before YouTube's
current AI, although that AI has contributed to the current generation.
I also understand that simple recognition and downweighting of extremist
content (simple enough that today's AI is capable of it, even if it would
seem trivial for a human) could solve the problem. But YouTube (and
Alphabet as a whole) is under pressure not to do any such thing from many
politicians inside and outside the US, who are themselves extremist (as
you put it, essentially cult members, having obtained their positions
through said cults).
Does this agree with your understanding?