
Rethinking failure in cybersecurity 

 May 25, 2020

By Jane Frankland

(NOTE: This blog is one of a series. Unless you join my IN Security Tribe, where you can get them early, plus other useful content, you'll just have to wait for the others.)

I’d just finished my keynote and walked off the stage when a young woman approached me with a question. She wanted to know if I’d ever failed. I smiled, and perhaps I even laughed a little as I answered her,

“All the time!”

Any extremely successful person will tell you this,

“Success truly is the result of good judgement. Good judgement is the result of experience. And, experience is often the result of bad judgement.”

The wisdom of learning from failure is undeniable. Yet, despite a commitment to learn from failure, few leaders succeed. For over two decades I've watched leaders devote hours to critique sessions, postmortems, and analysis in the hope of not repeating the same lessons. And, whilst these approaches are useful (I advise doing them, and wrote about why in a recent blog), again and again I've noticed minimal change, other than an increase in stress.

Here’s why.

Leaders, especially in cybersecurity, are thinking about failure the wrong way. Most believe failure to be bad, and that it must be avoided at all costs. They also believe that learning from mistakes is straightforward. Both of these beliefs are incorrect.

Failure isn’t uniform, it’s not always bad, and it’s certainly not created equal. In fact, there are many different types and degrees of failure. And ascertaining what really happened can be long-winded and challenging.

Take a breach for instance. Did it occur because…

  • a member of the security team didn’t patch a specific vulnerability?
  • a project manager signed off on known vulnerabilities in an application so he/she could make the go-live date?
  • PR and Comms blocked a company-wide security awareness project?
  • the security department was under-resourced and underfunded?
  • a disgruntled employee released a time bomb just before being sacked, corrupting systems and falsifying data?
  • an opportunistic employee was recruited by a competitor to steal data?
  • an employee was being blackmailed by a cybercriminal?
  • of negligence and carelessness whereby an employee emailed passwords he/she needed to remember, or sensitive information he/she needed to work on at home?
  • of something an employee did unknowingly, when he/she accidentally downloaded a spreadsheet with malware?

The list could go on and on….

Deviance, inattention to detail, inability to perform a task, process inadequacy or process complexity may indeed warrant blame. But what if uncertainty, which can cause people to act in seemingly reasonable ways yet produce undesired results, or the testing of an innovative idea that goes wrong, were added in? Should these people be praised for ‘intelligent failures’ or simply blamed for the outcome?

Failure is complex and learning from it is far from simple. It brings up feelings of insecurity, and blame often ensues. Jobs may be at stake, too, especially if an executive needs a scapegoat. And this is why leaders must rethink failure and adopt a new approach to tackling it. It’s why they must create cultures of psychological safety, where team members feel accepted and respected, and safe to take interpersonal risks without fear of negative consequences to their self-image, status or career.

It’s why environments must be High Challenge and High Support, where there are high expectations of what can be achieved, and where team members are actively being enabled to grow, develop and meet the challenges. Environments like these equip teams to secure their own success and operate with responsible initiative and energy rather than being dependent.

Only leaders can create and strengthen a culture that offsets blame and makes their team feel safe and responsible for surfacing and learning from failures. And, cybersecurity leaders must learn how to influence their stakeholders—the C-Suite and the Board—so they can understand what happened rather than “who did what” when things go wrong. This requires strong communication skills, a good degree of trust and a commitment to consistent improvement, what's known as Kaizen. It also requires teamwork, personal discipline, improved morale, quality circles, consistent reporting of failures (large and small), analysis and suggestions for improvement. Only when this is achieved will an improvement in an organisation's security posture be seen.

Now I want to hear from you…

  • Tell me what insights you've gained on failure and how you're going to modify your planning or leadership approach.
  • Come join my IN Security Tribe, as that's where you'll get more useful content and insights ahead of everyone else to help you in your career and life.
  • And, if you want to work with me, just complete the form here.


Jane Frankland

 

Jane Frankland is a cybersecurity market influencer, award-winning entrepreneur, consultant and speaker. She is the Founder of KnewStart and the IN Security Movement. Having held executive positions within her own companies and several large PLCs, she now provides agile, forward-thinking organisations with strategic business solutions. Jane works with leaders of all levels and supports women in male-dominated industries like cybersecurity and tech. Her book, ‘IN Security: Why a failure to attract and retain women in cybersecurity is making us all less safe’, is a best-seller.

 



