Google’s new safety measures are designed to protect kids on YouTube, Search and more

Google has announced changes to YouTube, Search and its other apps designed to make them safer for kids. The latest updates will make YouTube videos created by kids private by default, allow minors or their parents to request the removal of their images from Google Images results, turn SafeSearch on by default and more. The move is part of a recent push by Google to protect kids and give parents more control over what their children see.

Many of the updates are dedicated to YouTube and YouTube Kids. The key change is aimed at young creators aged 13 to 17, changing the default upload setting to the most private option available. That means videos can be seen only by select users unless the creator changes the setting to public. “We want to help younger users… make an intentional choice if they’d like to make their content public,” Google wrote.

Google is also expanding its so-called digital well-being tools for YouTube. It’ll turn on break and bedtime reminders by default for all kids aged 13 to 17 while turning off autoplay by default. It’s also adding an autoplay option to YouTube Kids, though the feature will be off by default in the app. Parents will also be able to choose a “locked” default autoplay setting.

“Knowing the accurate age of our users across multiple products and surfaces, while at the same time respecting their privacy and ensuring that our services remain accessible, is a complex challenge,” Google noted.

Finally, Google said it will be removing “overly commercial content” from YouTube Kids, like a video that “only focuses on product packaging or directly encourages children to spend money.” It also updated the disclosures that appear on “made for kids” content when a creator identifies a video as containing paid promotions.

On Search, Google promised to give minors “more control over their digital footprint.” To that end, it’s introducing a new policy allowing anyone under 18, or their parents or guardians, to request the removal of their images from Google Image search results. That change is designed to “help give young people more control of their image online,” Google wrote. It will also turn SafeSearch on for all existing users under 18 and make it the default for teens setting up new accounts. Currently, it’s just turned on for teen accounts managed by Family Link.

In other apps, Google will disable location history for all users under 18 without the ability to turn it on. It’s launching a safety section in Play that will show parents which apps follow Google’s Families policies and disclose in greater detail how those apps use the data they collect. On the advertising side, it will “block ad targeting based on the age, gender, or interests of people under 18,” the company wrote.

Google stressed that it wants to work with “kids and teens, parents, governments, industry leaders, and experts in the fields of privacy, child safety, wellbeing and education to design better, safer products for kids and teens.” Taken as a whole, the new changes should help prevent young people from seeing harmful content while blocking exploitative ads. In practice, however, it may take some time to shake out any bugs and ensure that advertisers are following the rules — so as always, it’s best to keep a close eye on your kids’ digital habits.
