YouTube weighs changes to how it handles children’s content


SAN BRUNO, Calif. – YouTube is weighing a number of changes to its handling of content for children following pressure from inside and outside the company, according to a person familiar with the matter.

Among the changes under consideration is disabling the autoplay feature that starts a new video as soon as the previous one finishes, the person said. Another concept, first reported by The Wall Street Journal, proposes moving children’s videos off YouTube and into YouTube Kids, a standalone app that more tightly limits the content it allows. But that would be a drastic step and is unlikely to occur, the person said, and no decisions have been made.

In a statement, YouTube said it considers many ideas for improving the platform. “Some remain just that — ideas,” said a YouTube spokesperson. “Others, we develop and launch, like our restrictions to minors live streaming or updated hate speech policy.”

YouTube is also in the late stages of an investigation by the Federal Trade Commission concerning its handling of children’s videos, according to a report Wednesday by The Washington Post, citing four people familiar with the matter. YouTube declined to comment on the report.

Pressure on YouTube and other tech giants is rising as regulators eye the industry’s power and influence. In recent weeks, the Justice Department has taken jurisdiction over a possible antitrust probe of Google, YouTube’s parent company, while the Federal Trade Commission has reportedly taken oversight responsibility for Facebook and Amazon.

At Google, YouTube has found itself in the crosshairs amid concerns that the platform’s video-recommendation software directs viewers toward violent, disturbing or conspiratorial content. Users, including children, may start viewing safe content and then be led to less appropriate videos that have been optimized for YouTube’s algorithm. There are videos on YouTube of familiar children’s cartoon characters in dangerous or unsettling scenarios.

“My hope would be that YouTube’s algorithms would be better at picking these sorts of things up, and in the case of disturbing or violent images, at the very least put them behind an interstitial warning that content might be seen as offensive,” said Stephen Balkam, founder and CEO of the Family Online Safety Institute.

Sundar Pichai, the CEO of Google, defended YouTube’s recommendation algorithm in a recent interview with CNN.

Pichai acknowledged that the recommendation engine helps increase user engagement with YouTube, a critical factor for the advertising-based platform. But, he told CNN’s Poppy Harlow, the software also seeks to direct viewers toward better content.

“We are really making changes to how it works,” said Pichai in the interview. “Our recommendation is not based on getting you to just watch more videos. It’s based on the fact that we are optimizing for higher-quality videos as well.”

Copyright 2019 Scripps Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
