YouTube relaxes moderation rules to allow more controversial content. Videos are allowed if "freedom of expression value may outweigh harm risk"
-
Introducing: Pay More, Say More
Surely this can't become deeply problematic for the social fabric.
-
At least true crime YouTubers won't have to bleep half their videos anymore /s
-
Moderators were previously told to remove videos if one-quarter or more of the content violated YouTube policies. Now, that limit has been increased to half.
This seems like alien speak to me. They announce that shit, someone reads it, and then repeats it in an article. But what does it mean?
Can you have 6 contents and make 2 really crazy? Can you tell people to commit violence for 5 minutes and then review a game for 6 minutes? Is there a dude with tvtropes open going through and marking the contents of content?
I've worked in Trust & Safety (TnS) at a different company.
I would guess it's per video, or they have multiple types of review applied on a per-video basis and then maybe also to the channel as a whole.
So you could have half of any video contain the violating content, and it would be in the clear.
-
Intention: YouTubers can stop with the whole self-censoring shit.
Examples: Unaliving; PDF file; grape; etc.
Reality: Ben Shapiro, Matt Walsh, and other right-wing grifters receive zero censorship while YouTubers still have to self-censor to receive monetization.
Actual reality: Right-wing grifters are already on a "whitelist", as long as they also talk about lower taxes, lessening regulations and workers' protections, and got popular enough. Source: knew a former moderator for Google.
-
Do people still have to say unalive?
Censorship is goddamn stupid.
They should just tag content & let people decide what to filter.
People have literally never had to say that.
First off, it was TikTok, not YouTube, that started the "unalived" trend, but even then, make no mistake: "killed," "murdered," "died," etc. have never been banned on TikTok either.
What has been happening is that videos (on TikTok) with "potentially divisive content" are not being promoted by TikTok. Your video will not get removed just for saying the word "killed" and will still be fully available for viewing by your followers; it just won't be promoted on the For You page for strangers.
And it's fine if you still object to this, but we have to stop conflating the two. Not being promoted is not the same thing as being censored.
Edit: I don't know who downvoted this or why, but I'm right
-
This sucks, but I think this will lead to a YouTube exodus, and other platforms like PeerTube will grow their creator and user base.
When the site becomes a 100% right-wing echo chamber, people will flee it.
-
I still wouldn't trust Google not to nuke my channel on a whim, even with those relaxed moderation rules. What's stopping a little bribe from the right company or political party from causing them to backpedal or even tighten their grip further?
This is why one should at least mirror their content to PeerTube or a similar alternative platform, even if they're not going to outright post future content to said alternative and give up on YT altogether.
They don't need to nuke your channel; they just bury your videos with their algorithm. This is what pushed many smaller content creators off YouTube in the past. For a channel I used to follow, I would have to painstakingly search for it through other videos that aren't even related to it.
-
I agree. We've seen it enough times in the past: a creator gets a strike on a video from several years ago because the rules changed. Anyone legit should be careful.
I saw that on a channel I used to follow; they were getting strikes, so they had to take videos down. Kinda hoping they would get handed their last strike, because they turned into Trump-loving shitheels.
-
Actual reality: Right-wing grifters are already on a "whitelist", as long as they also talk about lower taxes, lessening regulations and workers' protections, and got popular enough. Source: knew a former moderator for Google.
Whitelist vs. discovered the algorithm: same thing, really.
-
People have literally never had to say that.
First off, it was TikTok, not YouTube, that started the "unalived" trend, but even then, make no mistake: "killed," "murdered," "died," etc. have never been banned on TikTok either.
What has been happening is that videos (on TikTok) with "potentially divisive content" are not being promoted by TikTok. Your video will not get removed just for saying the word "killed" and will still be fully available for viewing by your followers; it just won't be promoted on the For You page for strangers.
And it's fine if you still object to this, but we have to stop conflating the two. Not being promoted is not the same thing as being censored.
Edit: I don't know who downvoted this or why, but I'm right
Maybe you're right: is there verification?
Neither content policy (YouTube or TikTok) clearly lays out rules on those words.
I only find unverified claims: some write it started at YouTube, others claim TikTok.
They claim YouTube demonetizes & TikTok shadowbans.
They generally agree content restrictions by these platforms led to the propagation of circumspect shit like unalive & SA.
TikTok policy outlines their moderation methods, which include removal and ineligibility for the For You feed.
Given their policy on self-harm & automated removal of potential violations, their approach is to effectively & recklessly censor such language.
Generally, censorship is suppression of expression.
Censorship doesn't exclusively mean content removal, though they're doing that, too.
(Digression: revisionism & whitewashing are forms of censorship.)
Regardless of how they censor or induce self-censorship, they're chilling inoffensive language pointlessly.
While as private entities they are free to moderate as they please, it's unnecessary & the effect is an obnoxious affront to self-expression that contorts language for the sake of avoiding idiotic restrictions.