Philips scales AI literacy to 70,000 employees
Philips had already been integrating artificial intelligence models into its health products for years. The new thing isn’t the technology: it’s the scale. Now the goal is for AI to be a capability any employee can use with confidence, not just specialized teams.
How they move from experts to the whole organization
The strategy described by Patrick Mans, Head of Data Science & AI Engineering, is simple and concrete: move people along a curve from toy to tool and then to transformation. Ambitious? Yes. Impossible? No.
Philips combines top-down and bottom-up actions:
Hands-on training for leadership, so executives model the change through practical use rather than just issuing orders.
Company-wide open challenges, where employees propose real, applicable use cases.
Access to enterprise ChatGPT, which accelerated demand because familiarity was already there: many employees were already using OpenAI tools privately.
That dual lever — executive backing and grassroots pull — created real momentum. The idea is to channel curiosity into operational capability.
Trust and safety before touching what's critical
Philips is a century-old company operating under strict health regulations. So AI adoption happened step by step.
They started with low-risk internal workflows.
They encouraged experimentation in controlled environments.
They formalized responsible AI principles: transparency, fairness, and human oversight.
Only after trust and skills grew did they allow AI to move into processes that impact patients. Think about it: would you let an untested system touch critical care? Exactly. It’s not just about rolling out technology — it’s about changing culture and building confidence.
Practical priority: give time back to clinicians
The priority is clear and human: reduce administrative burden so healthcare professionals can spend more time with patients. One powerful anecdote illustrates this: a clinician spends 15 minutes saving a life and then another 15 documenting the case. If documentation gets faster, that reclaimed time can go toward the next patient.
That’s why Philips focuses on automating processes and building agents and workflows that operate at the process level, not just boosting individual productivity.
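To make the process-level idea concrete, here is a minimal sketch of what one automated documentation step could look like, assuming an OpenAI-style chat API such as the enterprise ChatGPT access mentioned above. The function name, model choice, and prompt are hypothetical illustrations, not Philips' actual implementation, and the output is explicitly a draft left for human review, in line with the human-oversight principle.

```python
# Hypothetical sketch: turn a clinician's dictated notes into a draft case note.
# Not Philips' implementation; function name, model, and prompt are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def draft_case_note(dictated_notes: str) -> str:
    """Return a structured draft that a clinician must review and sign off on."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You draft clinical case notes from dictated text. "
                    "Structure the note as Summary, Interventions, Follow-up. "
                    "Mark anything uncertain as [VERIFY] so a human can check it."
                ),
            },
            {"role": "user", "content": dictated_notes},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    draft = draft_case_note(
        "Patient stabilized after acute episode, standard protocol administered, "
        "recommend follow-up in two weeks."
    )
    print("DRAFT FOR CLINICIAN REVIEW:\n", draft)
```

The point of a sketch like this is not the model call itself but where it sits: embedded in the documentation workflow, producing a reviewable draft rather than replacing the clinician's judgment.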
What works and what to learn from this approach
Train leadership with real practice to change cultural norms.
Open channels for ideas from the ground and accelerate prototypes with access to enterprise tools.
Start with low-risk scenarios and formalize responsible AI principles.
Prioritize impacts that give time back in critical settings, like hospitals.
Final thought
What Philips shows is something any organization can apply: you don't need a technical revolution to scale AI, you need a plan so people actually adopt it. With leaders who lead by using the tools, channels for employees to propose ideas, responsible controls, and a human goal of giving time back to caregivers, AI stops being an engineers-only concern and becomes a tool for improving care.