WannaCry?  Understandable; it's a scary, and long forewarned, development in cyber attack tactics and capabilities.  However, one often understated aspect of the WannaCry attack concerns the tactic that ended up stopping it dead in its tracks, and the elements that led to the use of that tactic. What secret weapon was used to defeat WannaCry? Luck.

Luck 

The mechanism that stopped the spread of the attack was stumbled upon by an act of luck.  There was no grand plan or execution of programmatic incident response actions that stopped WannaCry.  In fact, Marcus Hutchins, the "hero" who tripped the "kill switch" that stopped WannaCry's spread by registering a domain he found in WannaCry's code, had no idea that doing so would act as a kill switch on the malware.
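The mechanism itself is simple. In rough terms, before spreading, WannaCry reached out to a hard-coded, unregistered domain; if the request succeeded, the malware quietly exited. Below is a minimal sketch of that pattern in Python, simplified to a DNS lookup (the real sample made an HTTP request, and the domain shown here is a placeholder, not WannaCry's actual hard-coded string):

```python
import socket

# Placeholder only; the real kill-switch domain was a long pseudorandom
# string hard-coded into the WannaCry binary.
KILL_SWITCH_DOMAIN = "example-killswitch-placeholder.test"

def kill_switch_tripped(domain: str) -> bool:
    """Return True if the domain resolves, i.e. someone has registered it."""
    try:
        socket.gethostbyname(domain)
        return True   # Resolves: someone registered it, so stand down.
    except socket.gaierror:
        return False  # Unregistered: carry on.

if kill_switch_tripped(KILL_SWITCH_DOMAIN):
    raise SystemExit("Kill-switch domain is live; exiting.")
# ...the worm's propagation and encryption logic would have run from here...
```

By registering the domain, Mr. Hutchins made that check succeed on every newly infected machine, so fresh infections simply exited before doing damage.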

Which brings up some points worth considering about cyber security.

It’s the culture . . .

There’s a well-worn adage in American politics that goes like this: It’s the economy, stupid.

Well, a variant of that adage applies to cyber security, and in a critical way.  It goes like this: It’s the culture, stupid.  Forgive me, but I hope I got your attention.  Because it’s important.

Consider some of the following key factors regarding the WannaCry event:

  • One of the main mechanisms by which WannaCry propagated itself through networks (the capability that really scared the bejesus out of the IT world) exploited a vulnerability in the Windows SMB service that Microsoft had patched (MS17-010) roughly two months before WannaCry was let loose in the wilds of the internet.  Nothing screams culture more clearly than the priority an organization gives its activities.  Important things get attention, and are acted upon.
  • Many of the systems compromised by WannaCry, including those running Windows XP, were on operating systems that were out of service support; Microsoft had stopped supporting them.  Again, nothing screams culture more clearly than priorities.  Virtually every organization operating systems no longer supported by the manufacturer is demonstrating its priorities by its actions (especially with high-profile cybersecurity incidents and threats in the news and on social media daily).
  • The way WannaCry was stopped, as mentioned, was effectively by chance.  This factor is where the cultural aspects get interesting. Mr. Hutchins was clearly not a monkey on a keyboard; registering the domain was a common activity in his profession.  But Mr. Hutchins was not following the steps of an incident response plan containing a lessons-learned step that said "if you see a domain in the code, check whether it is registered, and if it is not, register it."  In fact, had WannaCry’s code been written differently, the registration action Mr. Hutchins took could just as easily have exacerbated the malware’s negative effects (as the sketch after this list illustrates).
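To make that last point concrete, here is a hypothetical contrast, not taken from WannaCry's actual code: the same registration act that tripped WannaCry's kill switch would have armed the payload if the author had simply inverted the check, for instance by using the domain as a remote "go" signal rather than a brake. The domain below is a placeholder:

```python
import socket

BEACON = "example-beacon-placeholder.test"  # hypothetical placeholder

def domain_resolves(domain: str) -> bool:
    """True if the domain resolves, i.e. someone has registered it."""
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

# What WannaCry actually did: a live beacon acts as a brake.
if domain_resolves(BEACON):
    raise SystemExit("Beacon live; standing down.")

# A one-line inversion an author could just as easily have written,
# turning the same registration into a trigger instead of a brake:
#
#   if domain_resolves(BEACON):
#       detonate()  # hypothetical; registering the domain now makes things worse
```

Mr. Hutchins' instinct was sound either way, but the outcome hinged on a coin flip inside someone else's code.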

Winging it

Looking more closely at that last factor, Mr. Hutchins was, as we would characterize it in military aviation, “winging it.”  But Mr. Hutchins’ “winging it” did not happen in a vacuum.  His action was grounded in things he had picked up through the culture, conversations, and education of his profession.   In naval aviation, there is a term for knowledge picked up through such informal processes: hangar flying.  The term derives from the knowledge gained by hanging around a hangar.  Or, in its more enjoyable form, sitting in the bar listening to old pilots tell “there I was” stories.   These activities, along with the appropriate leadership and governance, are the things that build aviation culture and “situational awareness.”   Which sounds a lot like “security awareness.”

It is not by chance that aviation safety and cyber security share commonalities in the framework of their policies, programs, processes, standards, guidance, and training.  It is because all these things impact the culture of the field.  And culture is critical to creating safe and secure environments.

So, the real silver bullet in the WannaCry outbreak, in which the tines of its pitchfork were pressed against our collective chest but did not penetrate, was an effect of culture, not formal process.  Which raises the questions: why not, and is this a bad thing?

They are questions the cyber security community – and every organization impacted by cyber threats – needs to seriously think about.

In another blog post I’ll weave in how the concept of “Warnings, Cautions, and Notes” from military aviation can be leveraged to build both the culture and the processes necessary to effectively fight cyber threats.

But WannaCry, like virtually everything else in cyber, is really about the culture . . .

If you’d like to learn more about establishing a security culture or have other questions about cybersecurity, we’d be happy to help. Email info@peters.com to start the conversation.