Content advertising on YouTube will become less risky as new video-monitoring policies protect advertisers from placing ads alongside inappropriate content.
For the CEO of a social media platform, understanding and prioritizing user safety is essential. YouTube CEO Susan Wojcicki is currently developing new ways to make sure everyone is safe within the video environment.
According to Adweek, Wojcicki said in a recent blog post that the company plans to hire 10,000 new people to monitor and filter their content in the coming years. This decision comes after complaints regarding how inappropriate videos, often featuring children, attracted sexual predators and unacceptable online activity on the popular video platform.
Content Advertising on YouTube
Many advertisers became concerned that their video ads would appear alongside such content. And even though this content violates YouTube’s policies, the sheer volume of content on YouTube makes it difficult to filter out inappropriate videos.
Wojcicki said the following in her statement:

“We are also taking aggressive action on comments, launching new comment moderation tools and, in some cases, shutting down comments altogether. In the past few weeks, we’ve used machine learning to help human reviewers find and terminate hundreds of accounts and shut down hundreds of thousands of comments. Our teams also work closely with the National Center for Missing and Exploited Children, the International Women’s Forum and other child safety organizations around the world to report predatory behavior and accounts to the correct law enforcement agencies.”
Since June, Wojcicki reports, more than 150,000 videos have been removed for violent or extremist content. A machine-learning flagging system has helped human reviewers pull more than five times as much content as they could have without the new algorithms.
“Our advances in machine learning let us now take down nearly 70 percent of violent extremist content within eight hours of upload and nearly one-half of it in two hours, and we continue to accelerate that speed. Since we started using machine learning to flag violent and extremist content in June, the technology has reviewed and flagged content that would have taken 180,000 people working 40 hours per week to assess,” Wojcicki said.
For advertisers, the addition of reviewers and the new technology is important to note. In one poll on advertising and shopping habits, about 32% of respondents said they had visited a retailer after seeing a billboard; no matter the medium, advertising can make a huge impact.
For anyone who advertises on YouTube, it’s important to follow changes to the company’s policies. Recent concerns over inappropriate content may have pushed people away from the platform, but YouTube’s quick action has helped assure advertisers that their videos won’t appear alongside inappropriate content.
And according to Variety, Wojcicki says that advertisers’ content will be placed under stricter review to determine whether the videos it appears alongside line up with their brand. This will help brands avoid being associated with objectionable content moving forward.
“We are taking these actions because it’s the right thing to do,” Wojcicki said.
By Valerie M. — December 22, 2017