By now you will have heard of the cyber attack on Waikato Hospital: Russian ransomware holding communications hostage, cancelling elective surgeries and leaving doctors and nurses unable to call codes, contact each other or find patient details. We asked a cyber security consultant to give us the lowdown.
Over the last week, the media have published some mixed opinions about Waikato DHB and the ransomware incident, which has translated into some gnarly hot takes. Unfortunately, most of the cyber security experts in this country can’t really talk to the media – we have NDAs, work for large risk-averse companies, and have a tonne of (legitimate and not) fears about speaking out.
I’m a cyber security professional. While I’ve never worked with Waikato DHB, through my work I’ve seen the inner workings of the NZ healthcare sector’s IT (which is why I’m anonymous today), and it’s not pretty. As you would expect from chronically underfunded organisations, the diagnosis is one of too much work, too little funding, and too much organisational pressure.
Let’s start at the beginning.
People are people, people make mistakes. Across society, we’ve learned over the years (mostly through accidents) how to create safer environments for people to make mistakes in. As an example, modern cars have numerous safety features, to protect both the people in the car, and the people outside the car. But cyber security hasn’t grown up in the same way.
There was a lot of focus in early reporting on how the ransomware first came into the Waikato DHB network through an email that someone opened. I haven’t seen this confirmed, but it’s a likely story, as this is an easy and cheap way for attackers to gain initial access to a network. These days, most attackers aren’t picky; they’re simply going after the low-hanging fruit. Instead of spending valuable time and resources researching targets to go after, they’re spending that time and those resources finding ways to get past spam filters.
One of the first major malware-laden emails, the ILOVEYOU worm, came out in May 2000 – it’s now old enough to have graduated. People received an email with a tempting (and virus-laden) attachment to open, so people opened it, and the worm spread. In the 21 years that have followed, what have we done to prevent the next ILOVEYOU? Blame people for opening the email. Don’t open the email and you won’t get the virus, duh.
Since ILOVEYOU, people have been told stupid nonsense like “Don’t open emails from people you don’t know and trust”, which is about the same level of sensibility as making a brand new car without seatbelts because you’ve told the driver “Don’t drive on roads where you might crash”.
But if it’s that obvious, why does such bad advice propagate?
Because it’s easy, and most importantly free, to put the onus back on people, and blame people when they make mistakes. Creating an environment where people can safely open emails, even if they’re malicious, is (mostly) possible with today’s technology; but it costs money and takes time to get right, and when money isn’t available, people take shortcuts.
So, an attacker has got someone to open an email – now what? They still have a bit of work to do. Typically, this involves moving around the network, finding passwords that give them IT admin permissions (so it’s easier to move around), finding a way of distributing their ransomware, and then triggering it and making the ransom demand. All of these steps are opportunities for the attacker to make a mistake, be detected, and be evicted before they can finish their attack.
Many people believe that anti-virus is the next layer of defence, and often it’s the only layer. Anti-virus is designed to detect malware and alert you to its presence. It’s not foolproof; new viruses are created every day, so we can’t expect it to catch everything, always. And once a virus is detected, the next step should be finding out where it came from, how it got in, and where else it’s gone.
But often, anti-virus tools are simply installed and forgotten about. Imagine having a swab that tested positive for COVID-19, but instead of quarantining the person who had it, we looked at the swab and went “well, thank God the swab removed all the virus, now you can go about your day!”
This is the second point that many organisations struggle with. Most organisations don’t have a dedicated team to look after their systems, so that if something is detected, someone can jump on it right away. Or if they do, often it’s part of a general IT contract without dedicated cyber security experts, and without the right tools to quickly investigate alerts.
I don’t think I need to harp on about how woefully underfunded our DHBs are. I mean, you know they can’t pay doctors and nurses enough, and they can’t replace crumbling buildings or fill buildings with enough equipment, so you don’t really expect them to have well-funded IT and cyber security departments, do you?
It’s not uncommon for organisations to not be able to fund IT to the levels that they need or would like. The good news is that security isn’t just about putting in shiny new expensive things; instead of trying to defend a large complex IT system, it’s much easier (and cheaper) to defend a simple IT system, and some of the best security initiatives are actually about taking things away, especially old and difficult to manage things.
But this is where NZ healthcare really struggles. Health providers have some of the oldest, clunkiest IT systems I’ve ever seen, and what’s worse, they’re all interconnected. Trying to improve things even slightly is a mammoth task that can span years of concentrated effort. And changing the culture and expectations of organisations with thousands of people takes effort and influence over senior leaders; IT is often seen as a cost centre, so it’s difficult to get traction.
Where to from here?
Astute readers will have noticed I haven’t written about the other side of this equation; about the criminal scumbags who are carrying out these attacks, or the foreign governments who are shielding them from facing charges, or the cryptocurrency markets which have made it easier for them to cash out. I haven’t, because these are areas outside our control. New Zealand is not going to be able to take down Bitcoin, or be able to extradite criminals from Russia. Let’s instead focus on what we can control and influence.
I think one of the most positive steps this government has taken is announcing the removal of the DHB model, with a single Health NZ organisation providing services up and down the country. It provides an opportunity to reset; to redesign the entire healthcare system and ensure that cyber security is considered from day one.
Of course, there’s a risk. If Health NZ isn’t funded adequately, if Health NZ doesn’t have access to good cyber security expertise, and especially if Health NZ isn’t given a chance to start from a clean slate, we will be in a worse position: one where even fewer security professionals have even more work to do, under an even bigger, more complex and more difficult to manage organisation.
So once the restoration is complete, what the government and Waikato DHB announce is vitally important. But it’s also important for you to think about how you can help. Across IT, and especially in cyber security, we have a massive skills shortage in NZ. Every day I see the same high-paying jobs advertised and not filled, which only makes it harder for the health sector to attract good people.
It’s not all roses, but it is a worthwhile career, where I genuinely feel I’m making a difference. And what’s awesome is that the best people I know didn’t start as IT nerds; the best people in this industry are people who are curious, can think laterally, and have good attention to detail. If you can come up with new ideas, that’s even better. Because, after 21 years of dispensing the same bad advice, clearly it can’t get any worse.