In Focus Security

Inside PDCA: a practical framework for sustainable cybersecurity awareness

Words: Jody Williams

Webinar recording: 'The PDCA approach to cybersecurity awareness'

The PDCA approach to cybersecurity awareness: structuring behavioural change for lasting impact

Most security awareness campaigns follow a familiar rhythm: a poster here, a phishing simulation there, a burst of activity during Cybersecurity Month. Then the campaign ends, and everyone goes back to business as usual.

That’s the cycle that Rosanne Pouw, Product Manager for Awareness and Training at Dutch NREN SURF, is determined to break. After years of helping Dutch universities and colleges build stronger security cultures, she believes awareness shouldn’t be a side project, but a continuous part of how institutions operate.

“Raising awareness is often a very ad hoc activity, but it should be part of the cycle inside your institution.”

Rosanne’s perspective combines security with social psychology. To make awareness truly stick, she says, we need to understand how people’s brains work, and design systems that support secure choices instead of fighting human habits.

“It’s not realistic to believe that teaching people to not click links will save your organisation. It is realistic to believe the people working for you want to do the right thing, so we have to make that as easy as possible for them.”

These are the ideas behind the Plan-Do-Check-Act (PDCA) approach to awareness currently being pioneered by some Dutch universities and colleges. In her recent GÉANT webinar, The PDCA approach to cybersecurity awareness: structuring behavioural change for lasting impact, Rosanne explores how the model works in practice and why it’s proving so effective.

Moving beyond fear: building sustainable awareness

For years, cybersecurity awareness has relied on fear to grab attention — dramatic warnings, shocking statistics, simulated attacks. It works in the short term, but it doesn’t last.

“Fear is a short-term emotion that catches attention. But people don’t like feeling afraid for a long time, so they start blocking it out: ‘Ok, we’re still at risk. I’ve got used to it. Let’s move on.’”

This pattern is familiar in higher education. Institutions invest in big awareness campaigns — hire consultants, buy platforms, roll out training — and then relax, believing they’re safe. But staff move on, threats evolve, and short-term gains in security awareness rarely translate into lasting behavioural change.

So two years ago, the Dutch higher-education awareness community came together to ask: how do we make this sustainable?

“We got together and made a plan for the long term,” says Rosanne. “What parts of awareness do we need to do inside our institutions, what skills do we need, and what parts can be done by experts we hire?”

Stop trying to fix people, start designing for them

Rosanne’s background in social psychology shapes her approach to cybersecurity. She believes effective awareness starts with recognising how real humans think and behave.

“Research shows the human brain interacting with technology is much less rational than we often believe,” she says.

“We used to think that once people know about a risk, they’ll change their behaviour. But when you’re hungry, tired, stressed or busy, you’re not using your brain’s full capacity. You just want to log in as fast as possible, so you use ‘welcome123’ instead of a strong password.”

“We should stop trying to fix people, because this is how brains work and have done for tens of thousands of years. It’s not likely we can change that with more training!”

Strict rules may sound good in theory. But in practice, people either stop trying to follow them all or break them when they get in the way of getting things done.

“We keep adding things we want people to do or not do, some of which contradict each other,” Rosanne says. “People get overwhelmed and just give up.”

“People will always find ways to do things that make them happy or achieve their goals. Whether it’s using social media or downloading unauthorised software because you’re a teacher who wants a special tool your IT department doesn’t provide, you may feel the benefits outweigh the potential harm.”

Awareness programmes, therefore, must work with human nature, not against it. That means fewer rules, clearer expectations, and systems that make secure behaviour easy.

“You can tell people not to click links, but clicking links is literally part of their job. Instead, make sure the whole system won’t go down if someone clicks on a phishing link.”

“The key to effective awareness is looking at the whole picture: the specific risks to your organisation, the behaviour you expect from your employees and students, and the aspects of risk you can fix with technical or process changes rather than awareness.”

PDCA: making awareness measurable and manageable

That’s where the PDCA model comes in. Borrowed from continuous-improvement disciplines, it gives awareness professionals a recognisable, repeatable structure for planning, executing, evaluating and improving their work.

“The PDCA approach isn’t just a one-off activity; it shows you’re working to improve as an organisation,” Rosanne explains. “It gives a structured overview of all the skills and disciplines needed to create a learning cycle and achieve better awareness.”

For leadership, PDCA is familiar and measurable. It helps justify investment by showing concrete plans, timelines and outcomes.

“Often, the biggest hurdle people face is explaining to others what you are going to do, why, how much time and effort it will cost, and what it’s going to deliver,” Rosanne says.

“The PDCA model very clearly defines the information you need to bring others into this mindset: not just leadership, but people across all departments. Because if you only talk about risks without explaining the behaviour you want to change, people still don’t see why they should invest in awareness.”

“In cybersecurity, we’re essentially saying: you need to spend a lot of money to not get hacked, and we don’t know how much it would cost if you did. PDCA helps you show what you’ll actually achieve.”

For awareness officers, PDCA breaks the work into manageable steps: identifying the specific risks their institution faces, defining the exact behaviours to change, collaborating with stakeholders, and tracking what works.

“The PDCA framework is practical, not theoretical. We looked at the pieces you really need for effective awareness — risk management, behaviour change, planning, training and informing, stakeholder engagement, evaluation — and put them in logical order. It gives you a structure, especially if you don’t know where to start.”
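The pieces Rosanne lists — risks, target behaviours, interventions, evaluation, improvement — map naturally onto the four PDCA phases. As a purely illustrative sketch (the names below are hypothetical, not taken from SURF's actual templates), one cycle might be recorded like this:

```python
# A minimal, illustrative sketch of one PDCA awareness cycle.
# Field and variable names are hypothetical, not from SURF's materials.
from dataclasses import dataclass, field

@dataclass
class AwarenessCycle:
    risks: list[str]                       # Plan: institution-specific risks
    target_behaviours: list[str]           # Plan: exact behaviours to change
    interventions: list[str] = field(default_factory=list)   # Do
    metrics: dict[str, float] = field(default_factory=dict)  # Check
    improvements: list[str] = field(default_factory=list)    # Act

cycle = AwarenessCycle(
    risks=["credential-stealing malware"],
    target_behaviours=[
        "staff and students report suspicious activity",
        "service desk follows precise removal steps",
    ],
)
cycle.interventions.append("focused campaign with adaptable templates")  # Do
cycle.metrics["reports_per_month"] = 12                                  # Check
if cycle.metrics["reports_per_month"] > 0:                               # Act
    cycle.improvements.append("extend campaign to all faculties")
```

The point of the structure, as Rosanne describes it, is simply that each phase produces an explicit, reviewable artefact that feeds the next cycle.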

To support this, Rosanne and her colleagues have created open-source templates, exercises and examples so institutions can adapt the PDCA framework to their own environments. They plan to review and adjust the materials annually, so it continues to evolve as a living, community-driven model.

PDCA in action: from small campaigns to multi-faculty coordination

Since spring 2025, two Dutch universities have been testing out the PDCA model, with others now starting to adopt it. The results so far are promising and highlight how flexible it is: effective for both small interventions and whole-university strategies.

Tackling infostealers

When a wave of credential-stealing malware hit universities, Rosanne and her colleagues used the PDCA approach to plan a focused, evidence-based campaign.

Rather than producing generic warnings, they defined two concrete desired behaviours: staff and students reporting suspicious activity, and service-desk teams following precise removal steps. They built materials around those goals, with a process, templates and examples for institutions to adapt.

This precise, practical focus makes it clear to people exactly what actions are required of them, and allows the impact to be tracked.

Coordinating across diverse faculties at Erasmus University

Erasmus University is using the PDCA framework to help coordinate awareness across multiple faculties with decentralised awareness teams, while respecting their differences.

“The PDCA structure is the same, but each faculty can decide what is most important for them and what actions would be most effective,” says Rosanne. “Some deal with international research, others with health data. The framework gets them all on the same page but also gives them the freedom to differentiate based on their specific risks.”

That combination of structure and flexibility is what makes PDCA so powerful — it can be as focused or as far-reaching as the situation demands.

“What’s great about PDCA is it’s not just a tool you use in one specific way. It’s a framework that allows you the freedom to use it for whatever you need.”

Why mindfulness beats rigid rules in the age of AI

PDCA succeeds because it’s flexible, human-centred, and grounded in real behaviour — the same qualities needed to face new AI-driven threats. As deepfakes, cloned voices, and AI-generated phishing make technical checks less reliable, Rosanne believes mindfulness is becoming an essential defence.

“We used to tell people, ‘Check the lock icon’ or ‘Look for spelling mistakes’,” she says. “But with AI, those rules don’t hold. We need to give people the tools and knowledge to notice when something feels off, and to act accordingly.”

For Rosanne, this is what mindfulness in cybersecurity means: pausing before reacting, noticing emotions like urgency or anxiety, and giving your rational brain time to catch up. “Being mindful brings your head back into a restful state so you can make better judgments.”

It’s a flexible practice, not a prescription. “Mindfulness empowers users to take action in a way that fits their lives and personality. For one person, it might mean they only answer emails between 9 and 10am. For another, it might mean waiting at least two hours before answering email requests.”

“With AI making deepfakes and scam emails more convincing, we should rely more on people being people — trusting their intuition, recognising manipulation — instead of trying to lock them into a harness of rules.”

A living model for a more mindful future

The PDCA model, like the community that inspired it, is designed to keep evolving. Rosanne has translated the Dutch materials into English and plans to share them on an open platform so others can access and adapt them.

“We hope it will be a living practice, not a static model,” she says. “I’ve noticed the PDCA message really resonates with people. They feel it makes sense.”

For higher-education institutions long caught in the loop of one-off campaigns, PDCA offers something new: a practical, human-centred way to move from ad hoc activities to embedded practices, from fear to empowerment, and from rigid rules to mindful resilience.

And as awareness communities continue to learn and share across borders, that living, collaborative spirit may be the sector’s strongest defence of all.

Want to hear more? Watch the recording of Rosanne’s webinar, in which she introduces the Plan-Do-Check-Act (PDCA) approach that helps institutions build a comprehensive, data-driven awareness programme that grows and adapts over time.

Watch the recording


About Rosanne Pouw

Rosanne Pouw is Product Manager for Awareness and Training at SURF. She helps Dutch research and education institutions increase security awareness with the Cybersave Yourself Toolkit. She holds an MSc in Social Psychology, a Master’s in Public Information Management and an Executive Master’s in Business Administration. Rosanne is a member of the Human Factors in Cyber Security Working Group (ACCSS) and has contributed to several National Cyber Security Centre publications on awareness, such as ‘Beyond the e-learning’.


GÉANT Cybersecurity Campaign 2025

Join GÉANT and our community of European NRENs for this year’s edition of the cybersecurity campaign: “Be mindful. Stay safe.” Download campaign resources, watch the videos, sign up for webinars and much more on our campaign website: security.geant.org/cybersecurity-campaign-2025/

 
