Cybersecurity risk is no longer just a matter for the IT or compliance function. The front line is increasingly behavioural, not technical. That means HR has a critical role to play
When a cyberattack hits, most organisations reach for the tech team. But the real fault lines often lie elsewhere, in a culture of fear, silence or fatigue. People become vulnerable when cultures tolerate disengagement, punish transparency or reward unquestioning compliance.
Threat actors don’t need to break your systems. They just need to manipulate your employees – and too often your leadership, your processes and your values make that easy.
Cybersecurity has become a behavioural risk. And that puts HR – not IT – at the centre of organisational defence.
At the CIPD Festival of Work, Sarah Armstrong-Smith, chief security adviser at Microsoft, issued a blunt warning: the way organisations treat people during times of pressure, failure or scrutiny can either fortify their resilience or destroy it from within.
Stop blaming the click
The phrase “people are the weakest link” has become shorthand in cybersecurity circles. But as Armstrong-Smith points out, it’s both reductive and counterproductive. Organisations routinely run phishing simulations designed to catch employees out. Those who fail are labelled ‘repeat offenders’. In some cases this leads to formal warnings or disciplinary action.
“Do they face a disciplinary? Is it three strikes and you're out? I don't know how many times I need to tell you to stop clicking the link… but you keep doing it.”
This response, argues Armstrong-Smith, only serves to create a climate of fear. Employees learn that mistakes will be punished and so they start hiding them. But cyber incidents thrive in silence. When an employee suspects something has gone wrong – a mistaken click, an accidental data leak or suspicious access – the most valuable thing they can do is speak up immediately. If they’ve been conditioned to expect blame or dismissal, they’re unlikely to do so.
“Do you think they’re going to stand up and tell you that? Or do you think they’re going to put it under the carpet and hope nobody notices?” asks Armstrong-Smith.
Blaming the user might make for a tidy internal narrative but it does nothing to address the conditions that led to the failure. Without that understanding the problem will simply repeat, often with greater consequences.
Why attackers love your culture
Technical defences are essential, but most cyber incidents today don’t require sophisticated hacks. They rely on manipulation, pressure and misplaced trust. According to Armstrong-Smith, 80% of cyberattacks begin with phishing – because it works. And it works especially well in cultures that leave employees isolated, exhausted or afraid to challenge authority.
“Cyber attackers are masters of manipulation. They are masters of control. They have honed their art. They know exactly what they're doing. They know exactly what to say, how to say it and when to say it,” says Armstrong-Smith.
What they exploit isn’t just vulnerability in the system but vulnerability in behaviour. They mimic the tone of the CEO, invoke a sense of urgency or play on emotional triggers like loyalty or stress. They rely on patterns in human psychology: authority, scarcity, reciprocation, social proof and trust.
“If you get a message from your boss or your boss’s boss and they tell you to do something, are you more likely to react?” Armstrong-Smith asks.
In organisations where questioning a senior figure is discouraged or delayed, these tactics are even more powerful. The more rigid the hierarchy, the easier it becomes to impersonate someone important and get a response. The more tired or unappreciated an employee feels, the more likely they are to bypass a cumbersome process or approve an unusual request.
This is why culture is the attacker’s playground.
The risk inside the building
Much of the conversation about cybersecurity still assumes the attacker is on the outside. But increasingly organisations are being breached from within, sometimes deliberately, sometimes unknowingly. And the motivations for these insider threats are often emotional, not financial.
Armstrong-Smith introduces the idea of the “super malicious user”: someone inside your organisation who understands your security controls intimately, knows exactly how to avoid detection and has reached a point where they no longer care.
“This is a new breed. These are the people in your organisation who absolutely 100% understand all your controls. They understand exactly what you're looking for and what you're not looking for. These people might even work in IT. They might even work in your security team. They might work in fraud. And the question is who is auditing the auditor? Do you just assume because people in a certain position have a level of authority and they should be trusted because of who they are and what they do?”
The super malicious user isn’t necessarily driven by greed. More often, they’re the result of long-term neglect, being overlooked or repeated disappointment. As Armstrong-Smith puts it: “Mary gets looked over, over and over again. She gets to the point where she doesn't care anymore. She has this level of apathy. And I can tell you this is the most dangerous quality in your organisation. When people stop caring, bad things happen.”
And when people stop caring, they stop protecting. They stop reporting. They start bypassing. In some cases they start collaborating with external actors or orchestrating internal fraud. But it doesn’t always look dramatic. It might be slow, quiet exfiltration of data. It might be rule-bending justified as ‘efficiency’. It might be switching off a control they once helped build.
The danger lies not just in their knowledge but in their disillusionment. And it grows silently in organisations where hard work is unrecognised, wellbeing is poorly managed and psychological safety is low.
Your values won’t save you
Many organisations have mission statements and values that reference integrity, openness and care. But Armstrong-Smith challenges HR leaders to look beneath the slogans. When an incident occurs, what really happens? Who gets blamed? Who feels safe to speak up? What changes and what doesn’t?
“The scapegoat is not usually someone in a leadership role… they’re someone quite junior. Someone we can control. Someone we can show to the outside world, ‘We took action.’ But nothing has actually changed.”
Without honest examination of cultural dynamics, organisations fall into a cycle of blame and denial. Disciplinary action might satisfy stakeholders temporarily. But if underlying conditions like overwork, fear, opaque processes and poor communication are left unaddressed, the breach will recur.
Public inquiry reports into major crises (including non-cyber ones) consistently show the same themes: missed warning signs, ineffective leadership, ignored audit findings and failed whistleblowing channels.
“You’ve been told. You had audit reports. You had test reports. You had previous incidents. In fact, there was another company just last week that had an incident like this and you just wiped your brow and moved on,” notes Armstrong-Smith.
HR’s role in building human-centric security
So how can HR turn culture into a defence, rather than a risk factor?
Armstrong-Smith outlines a vision for human-centric security, one rooted in empathy, empowerment, transparency and readiness. And it’s HR, not IT, that holds the keys to embedding this at scale.
Build psychological safety
Employees must feel safe to admit mistakes or raise concerns, especially when the stakes are high. HR leaders should review whistleblowing processes, examine disciplinary patterns and create environments where early reporting is seen as a strength rather than a weakness.
Empower people to say no
Attackers rely on urgency. Cultures that reward unquestioning compliance are easier to exploit. Train employees, including those at junior levels, to pause, question and verify. Make this the norm, not the exception.
Embrace the red
Too often senior leaders prefer PowerPoint slides with green traffic lights and neat graphs. But if the underlying reality is risk-laden, optimism becomes dangerous. HR should help foster board cultures that welcome difficult truths.
Prepare for the worst day
Most business continuity plans are process-heavy and people-light. HR must ensure that cyber preparedness includes emotional and leadership readiness for the stress, blame, confusion and decision-making that follow an attack. As Armstrong-Smith says: “Train like you fight to fight like you train.”
Support leadership under pressure
Executives are often the most targeted in cyberattacks – impersonated in phishing, manipulated for access or overloaded with responsibility. HR must provide support, training and systems that reduce single points of failure.
Where culture meets risk
Armstrong-Smith calls for empathy and empowerment not just as cultural values but as strategic defences. Cyber attackers will keep coming. AI will make them faster, harder to detect and more persuasive. But your best defence is still your people, as long as they feel safe, valued and equipped to act.
In the end, the weakest link in your organisation is not your people. It’s what they’ve been led to believe: that mistakes are fatal, that silence is safer, that nobody is listening.
If HR can change that belief and build cultures of care, courage and accountability then the risk doesn’t disappear. But it becomes visible. And visibility is the first step to protection.