YouTube paid the FTC a $170 million fine this year to settle charges that it violated the Children's Online Privacy Protection Act. Even after the payment, the episode remains a stain on the site's reputation. Last year, the situation around children's content grew so serious that YouTube reportedly considered screening every YouTube Kids video individually.
YouTube reportedly assembled a roughly 40-person team code-named Crosswalk. One proposal was to screen every video aimed at children under the age of eight to ensure it contained no inappropriate content.
Last year, reporters found disturbing videos depicting suicide and violence, often featuring unofficial versions of popular cartoon characters such as Mickey Mouse and Peppa Pig.
The screening proposal got far enough that a press release was drafted. At the last minute, CEO Susan Wojcicki abandoned the plan, according to Bloomberg's sources. Such moderation would reportedly have made the site look much more like a media company, which in turn could have exposed it to the same kinds of liability faced by publishers: over copyrighted material, threats, hate speech, and more.
YouTube promised earlier this year to take stronger action on children's content, starting by disabling comments on "tens of millions" of videos. The company says it reduced views of policy-violating videos by 80 percent.
Since then, however, the company has taken no more significant action. Wojcicki recently told CBS that "if we were responsible for every piece of content we recommend, we would have to review it individually." The sheer scale also makes comprehensive review nearly impossible: more than 500 hours of footage are uploaded every minute. A former YouTube marketing manager even said the site is poorly equipped to handle such challenges.