YouTube announces it will no longer recommend conspiracy videos in the US

Orphia Nay

"YouTube has announced that it will no longer recommend videos that "come close to" violating its community guidelines, such as conspiracy or medically inaccurate videos."

https://www.nbcnews.com/tech/tech-n...no-longer-recommend-conspiracy-videos-n969856

"Guillaume Chaslot, a former Google engineer, said that he helped to build the artificial intelligence used to curate recommended videos. In a thread of tweets posted on Saturday, he praised the change."

Guillaume Chaslot's Twitter thread:

https://twitter.com/gchaslot/status/1094359564559044610?s=21


From YouTube's announcement:

"To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11"

https://youtube.googleblog.com/2019/01/continuing-our-work-to-improve.html

"While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community. To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube. As always, people can still access all videos that comply with our Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results. We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.
This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations. These evaluators are trained using public guidelines and provide critical input on the quality of a video.
This will be a gradual change and initially will only affect recommendations of a very small set of videos in the United States. Over time, as our systems become more accurate, we'll roll this change out to more countries. It's just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube."




It's good the AI won't recommend CT crap to someone watching a random CT video.
But it appears it will still recommend crap to people already subscribed to CT channels.
 
"YouTube has announced that it will no longer recommend videos that "come close to" violating its community guidelines, such as...

...videos promoting a phony miracle cure for a serious illness...

Says it all.
 
That's the dilemma, isn't it, what happens when a corporation becomes the arbitrator of truth.

YouTube has no right to be an arbitrator of truth, but every right to be an arbitrator of who they allow to use their website and who they don't. They pay for the website, they have the right to set up any damn set of rules they want.
 
Wow. So youtube is deciding to shut itself down for the good of humanity. Never saw that coming.



If only there had been other content on there somewhere!
 
Good. Less nonsense in my feed.

I wish they'd stop recommending me videos from people who hated The Last Jedi, though.

Seriously, I have constantly marked those videos off as "not interested" and it still recommends them. Even the channels I specifically singled out as stuff I don't want to see.

It's like all the dude-bro "make a million dollars in a week" videos I get because I follow a couple of personal finance and real estate channels.
 
That's the dilemma, isn't it, what happens when a corporation becomes the arbitrator of truth.

Well, they are doing no such thing. Whack-job and nutbar videos will still be available on YouTube, they are simply not going to recommend them to users.

I see this as not much different from a bookshop not displaying their adult content in the public displays.
 
Seriously, I have constantly marked those videos off as "not interested" and it still recommends them. Even the channels I specifically singled out as stuff I don't want to see.

It's like all the dude-bro "make a million dollars in a week" videos I get because I follow a couple of personal finance and real estate channels.

What annoys me the most is that when I have watched a few space videos or 'reality of Apollo' videos, YT keeps recommending nutjob flat-earth, 'space travel is faked' and moon landing hoax videos.
 
YouTube has no right to be an arbitrator of truth, but every right to be an arbitrator of who they allow to use their website and who they don't. They pay for the website, they have the right to set up any damn set of rules they want.

You misunderstand the meaning of my post.
 
Well, they are doing no such thing. Whack-job and nutbar videos will still be available on YouTube, they are simply not going to recommend them to users.

I see this as not much different from a bookshop not displaying their adult content in the public displays.

Apparently more than a few people don't get it.

Later when I'm not so tired, I'll explain what I meant.
 
What annoys me the most is that when I have watched a few space videos or 'reality of Apollo' videos, YT keeps recommending nutjob flat-earth, 'space travel is faked' and moon landing hoax videos.

That's why I stopped clicking on YouTube links posted by CTists...and when I do I report them to YouTube to get them taken down...
 
Only in the US? Any mention of other parts of the world?



It says in my OP:

"This will be a gradual change and initially will only affect recommendations of a very small set of videos in the United States. Over time, as our systems become more accurate, we'll roll this change out to more countries. It's just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube."

- YouTube blog.




 
Do you guys actually use the internet or just this forum?
This is the problem with the damn algorithm. All you have to do is watch one, two, three recommendations, and they start multiplying like rabbits. I could see how it might lead you down a conspiracy path that would self-perpetuate after a while. The most insidious part of this is the "auto-play next" feature, which just keeps it going until you manually stop the videos.


For someone watching that type of content, YouTube itself could be a pretty bleak place.



Personally, my last watched videos on YouTube are about bike repair, songs for the Eurovision contest, and an old Christopher Hitchens video. My recommendations are all over the board.






ETA - The word that some posters are seeking is ARBITER.
 
Well, yeah. The Internet and its ability to connect fringe idiots to each other is surely responsible, in a way, for those fringe idiots spreading their ideas. Back in the day they'd be isolated village fools and no one would ever talk about them.
 
"YouTube has removed ads from videos that promote anti-vaccination content, citing a ban on “dangerous and harmful” material. BuzzFeed News reported the news this afternoon, saying YouTube had confirmed the decision after the publication contacted seven companies who were unaware that their advertisements were running on anti-vaccination videos. It’s the latest of several ways YouTube has recently restricted conspiracy theories and other objectionable material on its platform."



"YouTube has had some difficulty distinguishing “harmful” conspiratorial misinformation from programs intended for entertainment, but advocating against vaccines poses clear public health risks. US Representative Adam Schiff recently sent Facebook and YouTube’s parent company Google a letter raising concerns about vaccine-related misinformation, and Facebook is reportedly exploring new options to limit it. The image board site Pinterest, meanwhile, simply stopped returning results for searches about vaccination — saying it was “better not to serve those results than to lead people down what is like a recommendation rabbit hole.”"

https://www.theverge.com/2019/2/22/...n-conspiracy-videos-dangerous-harmful-content
 
That's good news.



I agree.

I think the idea with not giving any results on Pinterest is good.

People should get their medical advice from doctors, not from whatever the **** Pinterest is meant to be.

(Yes, I've seen Pinterest. Brainless fluff.)


 
"YouTube has removed ads from videos that promote anti-vaccination content, citing a ban on “dangerous and harmful” material. BuzzFeed News reported the news this afternoon, saying YouTube had confirmed the decision after the publication contacted seven companies who were unaware that their advertisements were running on anti-vaccination videos. It’s the latest of several ways YouTube has recently restricted conspiracy theories and other objectionable material on its platform."



"YouTube has had some difficulty distinguishing “harmful” conspiratorial misinformation from programs intended for entertainment, but advocating against vaccines poses clear public health risks. US Representative Adam Schiff recently sent Facebook and YouTube’s parent company Google a letter raising concerns about vaccine-related misinformation, and Facebook is reportedly exploring new options to limit it. The image board site Pinterest, meanwhile, simply stopped returning results for searches about vaccination — saying it was “better not to serve those results than to lead people down what is like a recommendation rabbit hole.”"

https://www.theverge.com/2019/2/22/...n-conspiracy-videos-dangerous-harmful-content

Looks like some internet companies are starting to wake up to the fact that by allowing this stuff to appear on their platforms, they are promoting stupid.

IMO, channels that have stupid and harmful CT stuff such as anti-vaccine, Holocaust denial and 9/11 Truth should not be permitted to monetize their videos.
 
Looks like some internet companies are starting to wake up to the fact that by allowing this stuff to appear on their platforms, they are promoting stupid.



IMO, channels that have stupid and harmful CT stuff such as anti-vaccine, Holocaust denial and 9/11 Truth should not be permitted to monetize their videos.



I think these companies also realise it damages their reputations, and their profits.

Who here hasn't avoided such sites due to the derp?


 
More on YouTube's recommendation system changes, with respect to post-Christchurch white supremacist terrorism:

"YouTube’s Product Chief on Online Radicalization and Algorithmic Rabbit Holes

Neal Mohan discusses the streaming site’s recommendation engine, which has become a growing liability amid accusations that it steers users to increasingly extreme content."

https://www.nytimes.com/2019/03/29/technology/youtube-online-extremism.html


"I hear a lot about the “rabbit hole” effect, where you start watching one video and you get nudged with recommendations toward a slightly more sort of extreme video, and so on, and all of a sudden you’re watching something really extreme. Is that a real phenomenon?

Yeah, so I’ve heard this before, and I think that there are some myths that go into that description that I think it would be useful for me to debunk.

The first is this notion that it’s somehow in our interests for the recommendations to shift people in this direction because it boosts watch time or what have you. I can say categorically that’s not the way that our recommendation systems are designed. Watch time is one signal that they use, but they use a number of other engagement and satisfaction signals from the user. It is not the case that “extreme” content drives a higher version of engagement or watch time than content of other types.

I can also say that it’s not in our business interest to promote any of this sort of content. It’s not something that has a disproportionate effect in terms of watch time. Just as importantly, the watch time that it does generate doesn’t monetize, because advertisers many times don’t want to be associated with this sort of content.

And so the idea that it has anything to do with our business interests, I think it’s just purely a myth.

[...]It’s an ongoing effort. I think we’ve made great strides here. But clearly there’s more work to be done."
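
For anyone wondering what "watch time is one signal, but they use a number of other engagement and satisfaction signals" might actually look like, here's a rough Python sketch of a multi-signal ranking score. To be clear, the signal names and weights below are invented for illustration; they are not anything YouTube has published.

# Purely illustrative sketch (not YouTube's actual system): combine several
# engagement/satisfaction signals into one ranking score, so that watch time
# is just one weighted input among several.

def rank_score(video_signals, weights=None):
    # Hypothetical signal names and weights, made up for this example.
    if weights is None:
        weights = {
            "expected_watch_time": 0.4,   # watch time matters...
            "likes_ratio": 0.2,           # ...but so do other signals
            "survey_satisfaction": 0.3,   # explicit "was this worth your time?" feedback
            "share_rate": 0.1,
        }
    return sum(w * video_signals.get(name, 0.0) for name, w in weights.items())

# A candidate video with normalized signals in [0, 1]:
candidate = {
    "expected_watch_time": 0.9,   # very "sticky" video
    "survey_satisfaction": 0.1,   # but viewers say they regret watching it
    "likes_ratio": 0.3,
    "share_rate": 0.05,
}
print(rank_score(candidate))   # 0.455 -- high watch time alone doesn't win

The point is only that a score built this way can down-weight "sticky but regrettable" videos without removing them, which is roughly what the blog post and the interview both describe.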
 
(Apologies if these NYT articles are behind a paywall. I've tried to extract the main relevant details.)

Also:

Facebook Extends Ban on Hate Speech to 'White Nationalists'

https://www.nytimes.com/aponline/2019/03/27/technology/ap-us-tec-facebook-white-extremism.html

SAN FRANCISCO — Facebook is extending its ban on hate speech to prohibit the promotion and support of white nationalism and white separatism.

"The company previously allowed such material even though it has long banned white supremacists. The social network said Wednesday that it didn't apply the ban previously to expressions of white nationalism because it linked such expressions with broader concepts of nationalism and separatism — such as American pride or Basque separatism (which are still allowed).

"But civil rights groups and academics called this view "misguided" and have long pressured the company to change its stance. Facebook said it concluded after months of "conversations" with them that white nationalism and separatism cannot be meaningfully separated from white supremacy and organized hate groups."


"As part of Wednesday's change, people who search for terms associated with white supremacy on Facebook will be directed to a group called Life After Hate, which was founded by former extremists who want to help people leave the violent far-right."
 
I agree.

I think the idea with not giving any results on Pinterest is good.

People should get their medical advice from doctors, not from whatever the **** Pinterest is meant to be.

(Yes, I've seen Pinterest. Brainless fluff.)



Pinterest is actually a great tool for costume designers, set designers, filmmakers and other artists. I use it a lot myself. It's a useful system for finding and cataloging visual reference.
 
