Last month a California hospital was the victim of "ransomware."
You probably know what a computer virus is. You may have experienced first-hand the havoc viruses can wreak. But if you've never heard the term ransomware before, here's the five-second definition: ransomware is a virus that locks you out of your own computer, then forces you to pay cash to get access to your own stuff.
Like old-fashioned computer viruses, ransomware comes in different types. Some install pornography on your computer that won't go away till you pay up. Some install a program that hijacks your screen, covering everything up so that you just can't click on anything. Some encrypt your files and make them unreadable. In all these cases, the only solution is to pay the "computer kidnapper" whatever they ask or to buy a new computer. The FBI actually recommends just paying the "ransom" rather than trying to fight it.
In the case of the California hospital, that ransom amounted to $17,000. And while it may cheer some of us up to see a hospital instead of a patient having to pay through the nose for once, this brings up an unsettling question that most people haven't yet considered. Medicine is becoming increasingly computer-based. Computers are vulnerable to viruses and hackers. So — how is that going to affect our health?
"Data breaches" are the least of our worries
When you hear the words "hacker" and "hospital" in the same sentence, the first thing that probably springs to mind is your medical record. And make no mistake, that is a gold mine to a hacker. I think we'll see more instances of ransomware and outright data theft by hacking, and sooner rather than later. But Big Medicine and Big Pharma are already handing out your data left and right, HIPAA rules or not. In all honesty, your data isn't the real issue.
Read the fine print for some "patient assistance programs" and you may find you're agreeing to share your medical records with a drug manufacturer. And not only that, you're allowing them to share your information with pretty much anyone who's willing to pay them for it. Now your employer can demand your DNA through "workplace wellness" programs. And GE has openly said that they want all our medical data to be stored "in the cloud" by the end of the decade. So it's pretty much a given that sooner or later your personal data is going to get out.
No, medical privacy has already been a sad casualty of the digital age. What concerns me even more than this is the growing use of "connected" devices in medicine.
Could a computer virus be as deadly as the plague?
A shocking amount of hospital equipment is already connected to the internet. This includes not just things like medical records systems, but also items like MRI machines. Heart monitors. Infusion pumps. Ultrasound machines. Virtually anything that generates data may be wirelessly connected to the hospital's — or doctor's office's — network. And the network, in turn, is connected to the internet.
And we're increasingly taking this connection home with us. Insulin pumps don't just send your numbers to the attached monitor. They're "connected" devices. They'll send your info to your phone if you download the companion app. They'll share it with your doctor. Or your spouse. Or your child. Or anyone else you authorize.
They do this by sending the data over the internet.
Any medical or fitness device that shares data does this. It might be a fitness tracker that sends data to an app on your phone. It might be a wireless heart monitor that sends readings to your doctor. It might be an implantable device like a pacemaker. Once again, if it shares data with any other person or device, from your phone to your doctor's office, it's a "connected" device.
And that means it's vulnerable to hackers, viruses, and malware.
Now let me ask you this: what happens when your pacemaker gets infected by a computer virus and malfunctions? Or when a hacker tells your insulin pump to dump all of its insulin in your bloodstream at once?
It could happen.
Computer security firm McAfee showed that, with the push of a button, every insulin pump within 300 feet could be made to release its entire contents into the bloodstream of its unsuspecting wearer, or that a person's pacemaker could be made to deliver an 850-volt electric shock from 50 feet away.
That's pretty chilling. And it takes a lot of the shine off the appeal of wearable or implantable wireless medical technology. Of course, scenarios like those are unlikely. A virus or malware infection, however, is not. Medical tech security, when it exists at all, is generally stuck back in 1998. And that's a very bad thing.
What can you do? That depends on what type of device you have and whether it has any built-in security, such as a password. In general, there are a few things that can increase security:
- If you have a choice in what device you get, look at the manufacturer's security track record. Choose the one with the strongest commitment to security.
- If the device uses a password, don't stick to the default one provided by the manufacturer. Change it to one of your own choosing.
- If the device gives you the option to update its software or firmware, do so. Updates often include security patches, and by skipping them you may be leaving your device open to hackers.
- Make sure your home internet connection is secured. This means that it's password-protected and that the connection itself is encrypted, so it's harder for hackers to break in. If you're clueless when it comes to internet hardware, call your internet service provider and have them walk you through securing your network.
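For readers comfortable with a little code, the advice about ditching default passwords can be sketched in a few lines. This is a minimal, hypothetical checker: the list of factory defaults and the length threshold here are illustrative assumptions, not any real device's rules.

```python
# Hypothetical sketch: flag a device password that is a known factory
# default or too short to resist guessing. The default list is a tiny
# illustrative sample, not a complete database of factory passwords.

COMMON_DEFAULTS = {"admin", "password", "1234", "0000", "default"}

def password_is_risky(password: str, min_length: int = 8) -> bool:
    """Return True if the password is a known default or shorter than min_length."""
    if password.lower() in COMMON_DEFAULTS:
        return True
    return len(password) < min_length

# A factory default gets flagged; a longer, unique phrase passes these basic checks.
print(password_is_risky("admin"))           # prints True
print(password_is_risky("Hummingbird-42"))  # prints False
```

Real devices and routers ship with many more defaults than the five shown here, which is exactly why changing the password to something of your own choosing matters.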
And last but not least, if you've been recommended a connected medical device but don't yet have it, ask questions. Is there a non-connected version out there? If so, what are the benefits of being connected? Do they really outweigh the risks?
Medical tech is a new frontier for cyber criminals. They're just beginning to explore it. The hospital ransomware case shows just how little such criminals care for the lives and health of others — so don't leave yourself open to a cyberattack.