Davey Winder: where’s the appetite for security?

19 September 2016
Apps will be critical to health in the future; but they need to be secure

Speeches at this month’s Health and Social Care Expo showed, if nothing else, that the future of clinical care will be driven in part by wearables and data collection. And that means apps, lots of apps. Which makes me wonder why there hasn’t been more talk about health app security.

High level backing

Digital Health News reported that Simon Stevens, chief executive of NHS England, gave his support for both wearable tech and data mining in Manchester, while Dame Fiona Caldicott spoke about how the collected data would be shared and the patient opt-out options on the table.

It seems pretty clear that, in this shared, data-driven future, the issue of trust within the NHS will become ever more important. That there will be an opt-out or opt-outs for the use of identifiable health data outside of the immediate care brief (for regulation or research or both) is good.

That the central collection of information for release as pseudonymised or de-identified data sets will likely be excluded from any such remit is more worrying.

Also excluded from the public debate surrounding this whole area of data collection, it would appear, is any meaningful mention of app development security.

Where’s the security bit?

Secretary of State for Health Jeremy Hunt went on the record in Manchester to state: “I wear a Fitbit, many people use apps. What is going to change with apps is the way that these apps link directly into our own medical records…

“By March next year, NHS England is going to publish a library of approved apps in areas like mental health and chronic conditions like diabetes.”

But while Hunt speaks of putting patients “in control of their healthcare destiny”, he fails to home in on privacy considerations or security issues. This worries me, and it should worry you.

Not least because earlier this year researchers discovered that “the majority of mobile health apps failed security tests and could easily be hacked.”

Indeed, some 80% of the tested apps that had been approved by the NHS were found to be vulnerable to at least two of the Open Web Application Security Project (OWASP) top ten risks.

Think that’s bad? You had better sit down then. Of those NHS-approved apps tested, a staggering 100% (yes, every single one of the buggers) lacked binary code protection, which could help prevent privacy violations and tampering.
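
To put some flesh on those findings: one of the commonest OWASP mobile risks is insecure communication, often the result of a “trust everything” certificate check left in during development and never removed. What follows is a minimal, hypothetical Kotlin sketch of that anti-pattern, not code from any of the tested apps, but it shows how one shortcut quietly defeats TLS for every connection an app makes.

```kotlin
import java.security.SecureRandom
import java.security.cert.X509Certificate
import javax.net.ssl.SSLContext
import javax.net.ssl.TrustManager
import javax.net.ssl.X509TrustManager

// The classic "insecure communication" anti-pattern: a TrustManager that
// accepts ANY server certificate, so TLS no longer guarantees the app is
// really talking to the health service's servers. Patient data can then
// be intercepted by anyone sitting on the network path.
val trustEverything = object : X509TrustManager {
    override fun checkClientTrusted(chain: Array<X509Certificate>, authType: String) { /* no checks */ }
    override fun checkServerTrusted(chain: Array<X509Certificate>, authType: String) { /* no checks */ }
    override fun getAcceptedIssuers(): Array<X509Certificate> = emptyArray()
}

// Wiring it into an SSLContext silently disables certificate validation
// for every connection made with this context.
val insecureContext: SSLContext = SSLContext.getInstance("TLS").apply {
    init(null, arrayOf<TrustManager>(trustEverything), SecureRandom())
}
```

This is exactly the sort of flaw that automated security testing should flag, and exactly the sort that, apparently, kept slipping through.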

Don’t just trust it’ll be ok

A closer look at that report, which surveyed users in the UK as well as the US, Germany and Japan, reveals that only half of those users felt that everything was being done to protect the apps.

Here’s the kicker though: 75% of the ‘IT Decision Makers’ asked felt confident about the security of these apps. A confidence that the testing proves is hugely misplaced.

It mattered not whether these were Apple iOS or Android apps; the insecurity factor remained pretty constant.

Maybe of most concern is the report authors’ finding that there wasn’t any significant difference in security between health organisation-approved apps and non-approved ones. Which makes me wonder what the approval process is actually concerned with.

Is it, perhaps, more about meeting a misguided value for money brief rather than concentrating on the stuff that ultimately matters most: accuracy of function, ease of use and security?

If it’s in the library, it’d better be good

Listen, in a budget-constrained NHS I am quite obviously aware that the clinical coalface comes first. Given a choice between more lives saved and more data lost, even I would opt for the former.

However, that doesn’t mean that we can just ignore the security issues that introducing a whole bunch of newly NHS-approved apps into the ecosystem will inevitably bring. And I’m not blaming the security professionals who work in the NHS either; they do a pretty good job of trying to manage risk with one hand tied behind their backs.

What I am saying is that surely there should be an expectation that any app featured in the NHS library, which therefore comes with an NHS seal of approval however you look at it, should not be sending unencrypted data across the Internet.

That these apps should not be open to attack through fairly basic bad coding practices; should not be treating patient data privacy with scant regard.
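
And for the avoidance of doubt, “not sending unencrypted data” is not an exotic requirement. Here is a minimal sketch of what it can look like, assuming an Android app built on the widely used OkHttp library; the hostname and certificate pin below are placeholders, not real NHS endpoints.

```kotlin
import okhttp3.CertificatePinner
import okhttp3.OkHttpClient
import okhttp3.Request

// Hypothetical host: stands in for whatever backend a health app talks to.
private const val API_HOST = "api.examplehealthapp.co.uk"

// Pin the server's certificate so traffic is not only encrypted (HTTPS)
// but also resistant to man-in-the-middle attacks using rogue certificates.
// The sha256 value below is a placeholder; a real app pins its own cert's hash.
val pinner: CertificatePinner = CertificatePinner.Builder()
    .add(API_HOST, "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=")
    .build()

val client: OkHttpClient = OkHttpClient.Builder()
    .certificatePinner(pinner)
    .build()

// Every request goes over https:// -- a plain http:// URL to this host
// would be precisely the bug an approval process ought to catch.
val request: Request = Request.Builder()
    .url("https://$API_HOST/records")
    .build()
```

Pinning goes a step beyond plain HTTPS: even a fraudulently issued certificate won’t let an attacker sit between the app and its server. None of this is cutting-edge; it is table stakes.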

Paul Farrington, a solution architect manager at application security specialist Veracode, says that there has to be “a rigorous security policy to give patients and healthcare professionals complete assurance that their data is secure.

“It’s vital that our NHS ensures approved apps for both patients and professionals are thoroughly tested and secure.”

I agree with Paul wholeheartedly. If the NHS is to review apps in order to add them to an approved library, then that review process has to have teeth. Which means that we need to take a long, hard look at the current NHS approval scheme. Or better still, perhaps, scrap it altogether in favour of an independent third-party accreditation?

