Last Friday, Valentine’s night, I found myself seated at a poker table instead of a candlelit restaurant. Not quite the Hollywood version of romance, but the conversation that unfolded was far more interesting than small talk over dessert. 🙂
Between hands, one of the group, a successful founder, casually admitted something that made the table go quiet:
“We’ve invested heavily in AI… but I’m pretty sure parts of my team are deliberately underusing it.”
Someone asked why. She didn’t hesitate.
“Because they assume the more efficient they become, the fewer of them we’ll need.”
No one laughed. No one argued. Instead, there was a slow recognition around the table — the kind that happens when an uncomfortable truth is spoken out loud.
I then leaned forward and added:
“If that’s true, you don’t have an AI adoption problem. You have a psychological safety problem. And if people are protecting themselves, you’ve got to ask yourself what else aren’t they telling you?”
That question lingered longer than the final hand of the night, because beneath it sits a leadership tension many organisations are now facing:
Can you truly build a psychologically safe organisation — key for performance — when efficiency is relentlessly prioritised and when technology is advancing faster than trust can keep up?
It’s a question that goes far beyond culture. It reaches directly into cyber resilience, decision-making under pressure, and ultimately organisational survivability, the subject of the new book I’m writing.
The question I’m interrogating is this: can psychological safety truly exist in organisations where efficiency is king, and where people suspect that becoming more productive may ultimately cost them their jobs?
This is not a theoretical debate. It’s a leadership decision with direct consequences for cyber resilience, organisational survivability, and long-term performance.
The Efficiency Trap Leaders Don’t Intend to Create
Efficiency is not the enemy. No serious executive would argue against disciplined operations or intelligent cost structures. However, something subtle happens when efficiency becomes the dominant narrative.
Employees begin to calculate risk differently.
When employees believe that adopting AI may eliminate their role, many will quietly avoid it. When performance management is tightly coupled to headcount reduction, discretionary effort declines. When speed matters more than reflection, mistakes get hidden rather than surfaced.
None of this is irrational. It’s human self-protection, and self-protection produces the exact behaviours that weaken cyber resilience, for example:
- Silence instead of escalation.
- Compliance instead of curiosity.
- Workarounds instead of transparency.
- Individual optimisation instead of collective defence.
Fear does not create strong organisations. It creates brittle ones.
People Are Not Infrastructure
As AI expands, there’s a growing temptation to think of organisations as systems to be optimised — friction removed, latency reduced, output increased. But people are not compute layers.
They tire. They interpret ambiguity emotionally. They avoid environments where speaking up feels dangerous.
And when organisations push humans toward machine-like expectations, three predictable outcomes emerge:
- Burnout rises — exhausted people make poor security decisions.
- Judgment deteriorates — cognitive overload narrows thinking.
- Risk visibility drops — because reporting risk begins to feel personally risky.
From a cyber perspective, this is combustible!
Most major incidents are not failures of tooling. They’re failures of communication, escalation, and judgment. This brings me to a critical leadership realisation:
Cyber resilience is less a technology problem than a human systems problem.
Your Security Stack Starts Long Before Technology
Many organisations still imagine cybersecurity as a tech or tooling conversation. It’s not. Your true security stack is built in layers — and like any structure, its strength is determined by its foundation.
Here are the domains that matter:
Leadership → Culture → GRC → Defence → Collaboration
So let’s examine what must hold if cyber resilience is the objective.
1. Leadership: The Load-Bearing Layer
Cyber posture is a leadership signal long before it becomes an operational one. Employees watch what leaders prioritise, tolerate and ignore:
- Do we reward transparency — or punish bad news?
- Do we treat mistakes as learning — or liability?
- Do we communicate strategy clearly — or allow ambiguity to breed fear?
If leadership communicates, even unintentionally, that efficiency matters more than people, employees will optimise for self-preservation. But when leaders signal that excellence and learning are both required, ideally without fear or force (e.g., performance reviews tied to AI adoption), something powerful happens: people begin to think instead of hide.
Cyber resilience begins precisely at that moment.
2. Culture — Where Psychological Safety Lives
Psychological safety is often misunderstood as softness. It’s not. It’s operational courage at scale, and in psychologically safe environments, people do the following:
- Escalate anomalies early.
- Admit uncertainty.
- Challenge flawed assumptions.
- Question risky timelines.
- Flag ethical concerns around AI.
These behaviours are not cultural luxuries. They’re defensive capabilities. A culture of fear, by contrast, delays breach discovery, suppresses dissent, and rewards performative certainty — all dangerous traits in a threat landscape defined by speed and unpredictability. If leadership is the foundation, culture is the reinforcement, and without it, the structure cracks under pressure.
3. GRC: Turning Intent into Discipline
GRC (governance, risk, and compliance) often gets framed as procedural overhead. In resilient organisations, it’s something else entirely…
Institutional clarity.
Clear decision rights. Transparent risk thresholds. Predictable consequences. Ethical guardrails for AI usage.
Ambiguity is the enemy of psychological safety. When people understand how decisions are made — and what constitutes acceptable risk — fear declines and accountability rises simultaneously. So safety and standards are not opposites. They are partners!
4. Defence: Tools Are Only as Strong as the Humans Behind Them
Organisations spend millions on detection stacks, automation platforms, and zero-trust architectures. Yet tools cannot compensate for a workforce that’s afraid to speak.
Consider these scenarios:
- An analyst who hesitates to escalate because leadership dislikes disruption.
- An engineer who notices unusual system behaviour but assumes it’s “probably nothing”.
- A team that avoids reporting near-misses during a transformation initiative.
Technology did not fail in these scenarios. Culture did. Defence is powerful, but it’s not foundational.
5. Collaboration: The Multiplier Most Firms Underestimate
Cyber resilience is inherently collective. Threat actors collaborate. Intelligence networks collaborate. Incident responders collaborate. Yet internally, many organisations still operate in silos reinforced by performance pressure.
Psychological safety is what allows information to move laterally. It enables legal to talk to engineering early. Risk to challenge delivery timelines. Security to influence product design. Executives to hear uncomfortable truths quickly.
Collaboration is what turns strong components into a functioning organism, and without it, even mature security programmes fragment under stress.
The AI Fear Leaders Must Address Directly
There’s a particularly modern contradiction unfolding inside many firms — organisations want aggressive AI adoption but employees suspect AI adoption may make them redundant.
When this happens, you do not have an adoption problem. You have a trust problem, and trust, once fractured, is extraordinarily expensive to rebuild. So leaders must decide, and communicate, their strategic truth. For example, is AI:
- Primarily a growth lever?
- A survivability lever?
- Or a margin lever?
Each creates a radically different psychological climate. And what’s worth understanding is that people handle hard truths far better than hidden ones; what they can’t tolerate is ambiguity.
Efficiency vs. Adaptability: The False Tradeoff
The executive misconception that needs to be challenged is this: efficiency alone does not create durable organisations. Adaptability does.
And adaptability requires:
- Information flow.
- Intellectual honesty.
- Fast learning cycles, growth mindsets.
- Willingness to challenge assumptions.
All of which depend on psychological safety.
That’s why the organisations that will outperform in the next decade will not be those that extract the last ounce of productivity through fear, in this case via AI.
They’ll be the ones that combine: High standards + high transparency + high learning velocity.
That’s not a soft model. It’s a strategic one.
To End
Here’s a thought worth sitting with. If employees begin to believe:
“The more efficient I become, the more replaceable I am,”
they’ll — consciously or not — limit their contribution.
But if they believe:
“The more I leverage technology, the more valuable I become,”
they’ll accelerate transformation themselves.
The difference is leadership narrative, and narrative becomes culture faster than most executives realise.
Cyber resilience is ultimately a human achievement. Technology matters, without question. So do architecture and policies. But survivability belongs to organisations where people think clearly under pressure and speak quickly when something is wrong.
That environment does not happen accidentally. It’s designed. It’s led. And it’s reinforced every day through signals about what truly matters.
If cyber is genuinely a shared responsibility — and it must be — then it cannot live inside a function. It must be woven into the organisational DNA. And DNA is shaped at the cultural level by leaders long before it appears in dashboards.
So the real question for today’s executives is not:
“Can we afford psychological safety?” but rather: “Can we afford cyber resilience without it?”
Because in volatile environments, the strongest organisations are not the most efficient. They are the most aware. The most honest. The most adaptive. And those qualities are impossible to manufacture in cultures governed by fear.
Now I Want to Hear from You …
This tension between efficiency, AI adoption, psychological safety, and cyber resilience is not going away; it will define the next era of leadership. That’s why the questions I most want you to answer are:
- What signals is your organisation sending today?
- Are you building a lean machine…or a resilient one?
- How are you balancing performance pressure with the human conditions required for long-term cyber resilience?
I’d welcome your perspective, particularly if you’re navigating these decisions at the executive level. Join the conversation on LinkedIn and share your thoughts in the comments there.
