YouTube says it will crack down on bizarre videos targeting children
Earlier this week, a report in The New York Times and a blog post on Medium drew a lot of attention to a world of strange and sometimes disturbing videos on YouTube aimed at young children. The genre, which we reported on in February of this year, makes use of popular characters from family-friendly entertainment, but it’s often created with little care, and can quickly stray from innocent themes to scenes of violence or sexuality.
In August of this year, YouTube announced that it would no longer allow creators to monetize videos which “made inappropriate use of family friendly characters.” Today it’s taking another step to try to police this genre.
“We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged,” said Juniper Downs, YouTube’s director of policy. “Age-restricted content is automatically not allowed in YouTube Kids.” YouTube says that it’s been formulating this new policy for a while, and that it’s not rolling it out in direct response to the recent coverage.
The first line of defense for YouTube Kids is algorithmic filters. After that, a team of humans reviews videos that have been flagged. If a video with recognizable children’s characters gets flagged in YouTube’s main app, which is much larger than the Kids app, it will be sent to the policy review team. YouTube says it has thousands of people working around the clock in different time zones to review flagged content. If the review finds the video is in violation of the new policy, it will be age restricted, automatically blocking it from showing up in the Kids app.
YouTube says it typically takes at least a few days for content to make its way from YouTube proper to YouTube Kids, and the hope is that within that window, users will flag anything potentially disturbing to children. YouTube also has a team of volunteer moderators, which it calls Contributors, looking for inappropriate content. YouTube says it will start training its review team on the new policy and it should be live within a few weeks.
Along with filtering content out of the Kids app, the new policy will also tweak who can see these videos on YouTube’s main service. Flagged content will be age restricted, and users won’t be able to see those videos unless they’re logged in to accounts registered to users 18 years or older. All age-gated content is also automatically exempt from advertising. That means this new policy could put a squeeze on the booming business of crafting strange kids’ content.
YouTube is trying to walk a fine line between owning up to this problem and arguing that the issue is relatively minor. It says that the fraction of videos on YouTube Kids that were missed by its algorithmic filters and then flagged by users during the last 30 days amounted to just 0.005 percent of videos on the service. The company also says the reports that inappropriate videos racked up millions of views on YouTube Kids without being vetted are false, because those views came from activity on YouTube proper, which makes clear in its terms of service that it’s aimed at users 13 years and older.
In today’s policy announcement, YouTube is acknowledging the problem and promising to police it better. It doesn’t want to outright ban the use of family-friendly characters across all of YouTube by creators who aren’t the original copyright holders. There is a place, the company is arguing, for satire about Peppa Pig drinking bleach, however distasteful you might find it. But YouTube is acknowledging that YouTube Kids requires even more moderation. And the company is willing to forgo additional ad revenue — and there is a lot of money flowing through this segment of the industry — if that’s what it takes to ensure YouTube Kids feels like a safe experience for families.