In response to new regulations in several countries, Google is updating its policies around minors online:
Letting those under 18 remove images from search. “Children are at particular risk when it comes to controlling their imagery on the internet. In the coming weeks, we’ll introduce a new policy that enables anyone under the age of 18, or their parent or guardian, to request the removal of their images from Google Image results,” wrote Mindy Brooks, product and UX director for kids and families at Google. While this doesn’t remove the image from the internet completely, it can prevent it from showing in image search results.
Adjusting product experiences for young users. YouTube will change the default upload setting to private for users aged 13-17. SafeSearch will be turned on automatically for those under 18 using Google Search. And users under 18 will not be able to turn on Location History.
Advertising changes. In the coming months, Google Ads will “be expanding safeguards to prevent age-sensitive ad categories from being shown to teens, and we will block ad targeting based on the age, gender, or interests of people under 18,” the company said.
Why we care. Any move to protect kids online is a step in the right direction. We’ve all been more online than ever over the past eighteen months as the pandemic forced lockdowns, homeschooling and working from home. With the Delta and Lambda variants, this trend may continue into 2022. This move is a step toward protecting those under 18 as they navigate the internet to attend classes, connect with family and friends and explore the world. While most advertisers should not be drastically affected, you may see shifts in your ad metrics as some audiences are removed from your targeting options.