[ExI] youtube a radicalizer

Robert G Kennedy III, PE robot at ultimax.com
Mon Mar 12 23:16:52 UTC 2018


Concur w/Bill W's experience.

Recently, I searched for the "Radetzky March" on YouTube. (The one they 
always play at the Vienna Philharmonic's New Year's concert, where the 
audience traditionally claps along, written by Johann Strauss Sr.)

After I watched it, the Recommended screen refreshed itself.  Among 
other highly-ranked suggestions appeared the "Horst Wessel Song", with 
swastika and all.  WTF?  I was pretty stunned.

RGK3


On 2018-03-12 16:15, extropy-chat-request at lists.extropy.org wrote:
> Message: 2
> Date: Mon, 12 Mar 2018 11:00:52 -0500
> From: William Flynn Wallace <foozler83 at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Subject: [ExI] youtube a radicalizer
> Message-ID:
> 	<CAO+xQEZ9C+VqRvXjXuNCochr+eRb7O=vHrL84ZdNpsSmsosARQ at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> In the NYT
> 
> The journalist created two accounts:  one in which she viewed liberal
> videos, the other, conservative.
> 
> The YouTube algorithm recommended videos that were more radical than
> the ones she viewed:
> 
> After viewing Trump rallies, she was recommended white supremacy ones.
> 
> After viewing Clinton and Sanders, she got recommendations alluding to
> a conspiracy in government being behind the towers attack. (This is
> liberal?)
> 
> After viewing videos about vegetarianism, she got ones about vegans.
> 
> After viewing jogging, she got marathons.
> 
> Quite a consistent pattern.
> 
> The idea behind the algorithm must be that continued viewing of YouTube
> depends on interest level, and people are more interested in radical
> ideas than in moderate ones.
> 
> The media itself is radicalizing.  I don't have to tell you that
> someone at Google knows something about psychology.
> 
> bill w
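
For what it's worth, the drift bill w describes is roughly what you'd
expect from any recommender that ranks candidates purely by predicted
engagement. Here's a toy sketch in Python (the names, numbers, and
scoring are invented for illustration; this is not YouTube's actual
system):

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    topic: str
    avg_watch_fraction: float   # how much of the video viewers typically finish

def predicted_engagement(video, user_topics):
    # Crude engagement score: topical match times how long people keep watching.
    topical = 1.0 if video.topic in user_topics else 0.1
    return topical * video.avg_watch_fraction

def recommend(candidates, user_topics, k=3):
    # Rank purely by predicted engagement -- nothing here "wants" extremism,
    # but whatever holds attention longest rises to the top.
    return sorted(candidates,
                  key=lambda v: predicted_engagement(v, user_topics),
                  reverse=True)[:k]

catalog = [
    Video("Light jogging tips",        "running", 0.40),
    Video("Marathon training plan",    "running", 0.55),
    Video("Ultramarathon documentary", "running", 0.70),
]

for v in recommend(catalog, {"running"}):
    print(v.title)
# If the more extreme take on a topic holds attention longer (as in this
# made-up data), it gets recommended first -- the jogging-to-marathons
# drift described above.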

-- 
Robert G Kennedy III, PE
www.ultimax.com


