Healthcare insecurity gets personal when you look beyond the big picture
- 25 August 2020
In his latest column, our cyber security expert, Davey Winder, explores why healthcare insecurity is about more than just protecting data; it’s about protecting lives as well.
When writing about cybersecurity it’s all too easy to focus on the big picture. That’s especially true when it comes to healthcare where the strategy for securing NHS data is, understandably, front and centre of my attention as a rule. However, every now and then, something crops up that reminds me that the devil can often be found in the smaller detail.
Sometimes stories that would otherwise swim unnoticed in the crowded cyber-ocean float to the surface and expose a stark reality: healthcare insecurity is about more than just protecting data; it’s about protecting lives as well. This particular incident hit home on another level, a very personal one in fact.
Looking at smart tracker watches
The ethical hackers over at Pen Test Partners have something of a history of finding security problems with what you might call smart tracker watches. A smart tracker watch is a device aimed at the most vulnerable of users, be that young children or, as in this case, those living with dementia.
Which is where things get personal for me, as my mother has Alzheimer’s disease. In her case the dementia is now so advanced that, having survived a Covid-19 infection, she is now in a nursing home. Were she not, then I would have invested in such a tracking device myself, as she had not only started to wander and then forget how to get home, but taking her life-essential medications had become problematic. Which brings me back to Pen Test Partners and what they discovered.
Triggering medical alerts
Like most such devices, the tracking watch in question worked in conjunction with an app and was linked to back end servers in order to deliver location data to a carer, trigger medication alerts and the like. In this case, the app happened to have been downloaded an incredible 10 million times.
More correctly, I should say apps: SETracker, SETracker2 and SETracker3, owned by 3G Electronics out of Shenzhen, China. It’s hard to say how many tracking watches use the apps, and the back end servers they connect to, but Pen Test Partners reckon most of the “low end” watches on Amazon and other online retailers are based around them.
“Like every smart tracker watch we’ve looked at,” the security researchers reported, “anyone with some basic hacking skills could track the wearer, audio bug them using the watch, or perhaps worst, could trigger the medication alert as often as they want”.
All of these implications are bad enough, but it’s the last of these that grabs my attention: they could trigger the medication alert as often as they want. The idea that a device that exists in order to protect a life could be exploited in such a way as to end it instead should be a reminder to us all of the importance of cybersecurity in healthcare. It’s not just another item on a regulatory compliance checklist, not just another heading on a budget meeting agenda. It’s a critical part of the patient care equation.
Exploiting vulnerabilities
In the case of the smart tracker watch, the security issue was an unrestricted server-to-server application programming interface (API) that allowed a hacker to trigger the “TAKEPILLS” command at will. In such a nightmare scenario, my mother, whose memory span could be measured in minutes, would happily take her medication with each resulting reminder.
Of course, there is no evidence that anyone ever did exploit this vulnerability, and it was fixed within days of the security researchers reporting their findings to 3G Electronics. The fix was an easy one, involving restricting server-to-server API access to specific IPs. Handling it at the server end meant that the exploit window was closed even for those users who might not, or could not, update their devices.
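The detail of 3G Electronics’ actual fix hasn’t been published beyond “restrict API access to specific IPs”, but the general technique is simple enough to sketch. Here is a minimal, hypothetical Python illustration of an IP allowlist guarding a server-to-server command endpoint; all the addresses, names and the command handler itself are my own illustrative assumptions, not the company’s code:

```python
# Hypothetical sketch of the server-side mitigation: check the caller's
# IP against an allowlist of trusted back end servers before accepting
# any command such as "TAKEPILLS".
from ipaddress import ip_address, ip_network

# Illustrative allowlist: only these (assumed) server ranges may call in.
TRUSTED_NETWORKS = [
    ip_network("10.20.0.0/16"),    # internal back end range (example)
    ip_network("192.0.2.10/32"),   # a single trusted peer (example)
]

def is_trusted(source_ip: str) -> bool:
    """Return True only if the caller's IP falls inside a trusted network."""
    addr = ip_address(source_ip)
    return any(addr in net for net in TRUSTED_NETWORKS)

def handle_command(source_ip: str, command: str) -> str:
    """Reject sensitive commands from any source not on the allowlist."""
    if not is_trusted(source_ip):
        return "403 Forbidden"
    return f"Accepted: {command}"
```

Because the check lives entirely on the server, every watch in the field benefits immediately, regardless of whether its owner ever updates the app or firmware.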
Consumer side of health tech
Let’s not forget that, when talking about the consumer side of the healthcare tech equation, many of the devices people buy will not be updatable at all. I’m not talking about the accompanying app, but rather the device firmware itself. Even when the firmware can be upgraded, most users can be filed into the ‘fire and forget’ category, having neither the technical knowledge nor the personal motivation to do so. This becomes more of an issue when you realise that many internet-connected devices ship with default, hard-coded passwords. Another small detail that can have big consequences in the cybersecurity scheme of things.
Proposals for regulating consumer ‘smart’ product cybersecurity are already underway, and could result in such universal default passwords being banned. I’m hopeful that a legal ‘minimum bar’ for cybersecurity can be put in place, because without it the not-so-small problem of unregulated personal healthcare products will only get worse, with ever bigger threats to those who are most vulnerable.
1 Comment
Dave, the biggest security issue (for patients) is the processes surrounding clinical care. I have personal and received knowledge of these and have come to the conclusion that they are broken or non-existent, and thus ad hoc or make-it-up-as-you-go. They need total redesign before even saying the word ‘technology’. Jeremy Hunt acknowledged some time ago that thousands (not a few) of deaths could be attributed to such issues, usually communications ones. I can quote many. I’m sick of saying this (about processes) but, as some physicist once said: ‘It is difficult to bear the torch of truth through a crowd without singeing someone’s beard.’