6 Google measures to protect young people online
In Dublin, experts, educators and policymakers met to talk about something that touches all of us: how platforms help children and teens grow up healthy on the internet. What does that really mean for you—parent, teacher or creator? The main idea was simple: fewer blanket bans, more concrete, understandable and adaptable tools.
Six key points we heard in Dublin
Constantly redefine basic protections
Parents expect an online experience that comes with safeguards turned on by default. Google already enables SafeSearch in Search, and uploads on YouTube are private by default for minors. In Gemini Apps, some protections can't be disabled: for example, blocking language that simulates intimacy or passes itself off as a real person.
Does this mean total control? No. It means the initial experience is designed to minimize risks without shutting doors.
Give parents simple, customizable controls
Receiving fragmented reports is not the same as having a single page where you adjust screen time, see summaries and change limits. Tools like Family Link are being improved to centralize everything. On YouTube, parents can limit time on Shorts and will soon be able to set the timer to zero, a control not widely available elsewhere in the industry.
Invest in teen digital wellbeing at global scale
Google.org and YouTube announced a $20 million initiative to build an open, multilingual resource hub about adolescent digital wellbeing. Based on a global Ipsos study with more than 9,500 teens, the project aims for practical materials: from how to ask for help to understanding responsible use of AI.
Define what quality, age-appropriate content means
Parents and experts want clear criteria. YouTube shared principles and a guide for creators aimed at teens, and adjusted its recommendation system to prioritize higher-quality videos for that group. Concrete examples—like family-trusted channels such as BBC Studios with Bluey—help you know what to look for.
Develop smarter, less invasive age checks
The debate often falls into two bad options: weak barriers or invasive scans. In Dublin a risk-based approach was proposed: the level of verification should match the risk of the content or feature. The analogy sums it up well: you don’t ask the credit card to check whether you can buy a beer; the bar does that.
Google supports interoperable global standards and opening up technology so other platforms can adopt age checks that preserve privacy.
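To make the risk-based idea concrete, here is a minimal sketch in Python. It is purely illustrative: the risk tiers, feature examples and check names are assumptions for this example, not any real Google or industry API. The point it shows is the one from the summit: the strength of verification scales with the risk of the feature, so low-risk experiences stay frictionless and privacy-preserving.

```python
# Hypothetical sketch of a risk-based age-check policy.
# Tier names and check types are illustrative assumptions, not a real API.
from enum import Enum

class Risk(Enum):
    LOW = 1     # e.g. browsing curated, age-appropriate content
    MEDIUM = 2  # e.g. commenting or uploading publicly
    HIGH = 3    # e.g. live streaming or making purchases

def required_check(risk: Risk) -> str:
    """Match the strength of age verification to the feature's risk."""
    if risk is Risk.LOW:
        return "none"               # safe defaults and supervision suffice
    if risk is Risk.MEDIUM:
        return "self_declared_age"  # lightweight signal, no documents
    return "verified_age"           # stronger, privacy-preserving proof

print(required_check(Risk.HIGH))  # -> verified_age
```

The design choice mirrors the bar analogy: most interactions never trigger a strong check; only the genuinely risky ones do.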
Avoid one-size-fits-all solutions that push young people out of oversight
Banning is not the same as protecting. Rigid restrictions can drive young people to less regulated platforms where risks are higher. The summit message was clear: offer supervised experiences, effective parental controls and options that let teens make informed choices.
What does this mean for you, whether parent, teacher, creator or developer?
If you're a parent: look for tools that give you control without isolating your child. It's not about blocking everything, but about setting clear limits and having open conversations.
If you're a teacher or policymaker: prioritize open resources and curricula that teach digital skills, not just rules.
If you're a creator or developer: applying quality principles and designing with age in mind isn't just ethical—it improves experience and builds family trust.
Working with young people throughout the process is essential. In the end, their voices define what actually works.
Protecting kids in the digital world doesn't mean shielding them from it. It means creating safe, informed and adaptable experiences.