YouTube to Remove Thousands of Videos Pushing Extreme Views

Black Panther

Long Live The King
Supporter
Joined
Nov 20, 2016
Messages
12,751
Reputation
9,773
Daps
67,497
Reppin
Wakanda

By Kevin Roose and Kate Conger
June 5, 2019

YouTube announced plans on Wednesday to remove thousands of videos and channels that advocate for neo-Nazism, white supremacy and other bigoted ideologies in an attempt to clean up extremism and hate speech on its popular service.

The new policy will ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion,” the company said in a blog post. The prohibition will also cover videos denying that violent incidents, like the mass shooting at Sandy Hook Elementary School in Connecticut, took place.

YouTube did not name any specific channels or videos that would be banned.

“It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence,” the company said in the blog post.

The decision by YouTube, which is owned by Google, is the latest action by a Silicon Valley company to stem the spread of hate speech and disinformation on its site. A month ago, Facebook evicted seven of its most controversial users, including Alex Jones, the conspiracy theorist and founder of InfoWars. Twitter banned Mr. Jones last year.

The companies have come under intense criticism for their delayed reaction to the spread of hateful and false content. At the same time, President Trump and others argue that the giant tech platforms censor right-wing opinions, and the new policies put in place by the companies have inflamed those debates.

The tension was evident on Tuesday, when YouTube said that a prominent right-wing creator who used racial language and homophobic slurs to harass a journalist in videos on YouTube did not violate its policies. The decision set off a firestorm online, including accusations that YouTube was giving a free pass to some of its popular creators.

In the videos, that creator, Steven Crowder, a conservative commentator with nearly four million YouTube subscribers, repeatedly insulted Carlos Maza, a journalist from Vox. Mr. Crowder used slurs about Mr. Maza’s Cuban-American ethnicity and sexual orientation. Mr. Crowder said that his comments were harmless, and YouTube determined they did not break its rules.

“Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site,” YouTube said in a statement about its decision on Mr. Crowder.

The back-to-back decisions illustrated a central theme that has defined the moderation struggles of social media companies: Making rules is often easier than enforcing them.

“This is an important and long-overdue change,” Becca Lewis, a research affiliate at the nonprofit organization Data & Society, said about the new policy. “However, YouTube has often executed its community guidelines unevenly, so it remains to be seen how effective these updates will be.”

YouTube’s scale — more than 500 hours of new videos are uploaded every minute — has made it difficult for the company to track rule violations. And the company’s historically lax approach to moderating extreme videos has led to a drumbeat of scandals, including accusations that the site has promoted disturbing videos to children and allowed extremist groups to organize on its platform. YouTube’s automated advertising system has paired offensive videos with ads from major corporations, prompting several advertisers to abandon the site.

The kind of content that will be prohibited under YouTube’s new hate speech policies includes videos that claim Jews secretly control the world, that say women are intellectually inferior to men and therefore should be denied certain rights, or that suggest the white race is superior to another race, a YouTube spokesman said.

Channels that post some hateful content, but that do not violate YouTube’s rules with the majority of their videos, may receive strikes under YouTube’s three-strike enforcement system, but would not be immediately banned.

The company also said that channels that “repeatedly brush up against our hate speech policies,” but don’t violate them outright, would be removed from YouTube’s advertising program, which allows channel owners to share in the advertising revenue their videos generate.

In addition to tightening its hate speech rules, YouTube announced it would also tweak its recommendation algorithm, the automated software that shows users videos based on their interests and past viewing habits. This algorithm is responsible for more than 70 percent of overall time spent on YouTube, and has been a major engine for the platform’s growth. But it has also drawn accusations of leading users down rabbit holes filled with extreme and divisive content, in an attempt to keep them watching and drive up the site’s usage numbers.

“If the hate and intolerance and supremacy is a match, then YouTube is lighter fluid,” said Rashad Robinson, president of the civil rights nonprofit Color of Change. “YouTube and other platforms have been quite slow to address the structure they’ve created to incentivize hate.”

In response to the criticism, YouTube announced in January that it would recommend fewer objectionable videos, such as those with 9/11 conspiracy theories and vaccine misinformation, a category it called “borderline content.” The YouTube spokesman said on Tuesday that the algorithm changes had resulted in a 50 percent drop in recommendations to such videos in the United States. He declined to share specific data about which videos YouTube considered “borderline.”

“Our systems are also getting smarter about what types of videos should get this treatment, and we’ll be able to apply it to even more borderline videos moving forward,” the company’s blog post said.

Other social media companies have faced criticism for allowing white supremacist content. Facebook recently banned a slew of accounts, including that of Paul Joseph Watson, a contributor to the conspiracy theory website Infowars, and Laura Loomer, a far-right activist. Twitter bans violent extremist groups but allows some of their members to maintain personal accounts — for instance, the Ku Klux Klan was banned from Twitter last August, while its former leader, David Duke, remains on the service. Twitter is currently studying whether the removal of content is effective in stemming the tide of radicalization online. A Twitter spokesman declined to comment on the study.

When Twitter banned the conspiracy theorist Alex Jones last year, Mr. Jones responded with a series of videos decrying the platform’s decision and drumming up donations from his supporters.

YouTube’s ban of white supremacists could prompt a similar cycle of outrage and grievance, said Joan Donovan, the director of the Technology and Social Change Research Project at Harvard. The ban, she said, “presents an opportunity for content creators to get a wave of media attention, so we may see some particularly disingenuous uploads.”

“I wonder to what degree will the removed content be amplified on different platforms, and get a second life?” Ms. Donovan added.
 

Ya' Cousin Cleon

OG COUCH CORNER HUSTLA
Joined
Jun 21, 2014
Messages
24,285
Reputation
-1,595
Daps
81,966
Reppin
Harvey World to Dallas, TX
:hubie:that conspiracy theory shyt has done more harm to any possible motivation to push people towards revolutionary potential than social media has, good riddance

:mjgrin:but for some reason, they ain't going after the white supremacy pipeline tho. but it makes sense. "liberals" care more about ideas of the far-right than they do people being possibly harmed.
 

Black Panther

:hubie:that conspiracy theory shyt has done more harm to any possible motivation to push people towards revolutionary potential than social media has, good riddance

Conspiracy brehs gotta hold this L...what will they do, now that they can't post a billion YouTube videos in lieu of having a coherent argument or providing evidence? :mjlol:
 

goatmane

Veteran
Joined
Jan 26, 2017
Messages
16,586
Reputation
2,417
Daps
112,975
:gucci:mufukkas in here really caping for YouTube and Google like that?

they not even banning Steven Crowder.

this is all about censorship. they get to decide "extreme views". they are trying to reduce costs by demonetizing any content they don't want.

:manny:when your favorite black channel goes down for spitting real shyt, don't be surprised when a bunch of them spam the Report button
 

Black Panther

Long Live The King
Supporter
Joined
Nov 20, 2016
Messages
12,751
Reputation
9,773
Daps
67,497
Reppin
Wakanda
:gucci:mufukkas in here really caping for YouTube and Google like that?

they not even banning Steven Crowder.

They haven't announced who they will and will not ban, specifically.

this is all about censorship. they get to decide "extreme views".

I'm all for this type of censorship. I don't think "slippery slope" arguments are really all that valid. :kanyebp:
 

Apollo Creed

Look at your face
Supporter
Joined
Feb 20, 2014
Messages
52,634
Reputation
12,832
Daps
199,468
Reppin
Handsome Boyz Ent
They haven't announced who they will and will not ban, specifically.



I'm all for this type of censorship. I don't think "slippery slope" arguments are really all that valid. :kanyebp:
Banning videos that use terms that YouTube decides to classify as “bad” is one thing and pretty fair. Anything else that can be left up to interpretation will always result in black people catching the short end of the stick
 

Black Panther

“Extreme” is subjective lol. They will get rid of a ton of black channels and a few non black so people dont call racism.

You didn't read the article, then. Yeah, it can be subjective, but they laid out, in clear terms, what they define as "extreme":

The kind of content that will be prohibited under YouTube’s new hate speech policies includes videos that claim Jews secretly control the world, that say women are intellectually inferior to men and therefore should be denied certain rights, or that suggest the white race is superior to another race, a YouTube spokesman said.
 

Black Panther

Banning videos that use terms that YouTube decides to classify as “bad” is one thing and pretty fair. Anything else that can be left up to interpretation will always result in black people catching the short end of the stick

I answered this above, but just to reiterate, they've said clearly how they're defining "extreme". They're targeting anti-semitic, neo-nazi, white supremacist, and misogynist content.

That's good.
 

Apollo Creed

You didn't read the article, then. Yeah, it can be subjective, but they laid out, in clear terms, what they define as "extreme":

1. That isn't a full list.
2. The concepts of right and wrong are subjective, point blank. YouTube would be better off simply drawing a line in the sand and presenting what they value as a company and platform, as opposed to acting like they're doing something that is right and agnostic of any specific moral system.
 