Parents, regulators, advertisers and law enforcement have grown increasingly concerned about the open nature of YouTube. They contend that Google must do more to remove and restrict access to inappropriate videos, whether propaganda from religious extremists and Russia or comedy skits that appear to show children being forcibly drowned.
According to a blog post, the company wants to toughen its approach to the issue with policies that include applying community guidelines more quickly and strictly, removing ads from disturbing videos, blocking inappropriate comments on videos featuring minors, and giving creators of family-friendly content more guidance.
“In recent months, we’ve noticed a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not,” wrote Johanna Wright, YouTube’s vice president of product management, in a blog post dated Nov. 22. “While some of these videos may be suitable for adults, others are completely unacceptable, so we are working to remove them from YouTube.”
Moderators are now instructed to delete videos “featuring minors that may be endangering a child, even if that was not the uploader’s intent,” Wright said. Videos with popular characters “but containing mature themes or adult humor” will be restricted to adults, she said. Wright stated that, under an expanded set of guidelines for determining whether content endangers minors, the company has terminated 50 channels, removed thousands of videos, and stripped advertisements from 500,000 exploitative videos.
Since June, Google has removed ads from more than 3.5 million videos that violated its updated advertising rules covering videos in which family-entertainment characters engage in inappropriate behavior. The company also promises to take an aggressive stance against nasty comments on videos involving minors: commenting will be disabled on any video where comments refer to children in a “sexual or predatory” manner.