Two NHS trusts sign agreements with Sensyne Health

  • 4 February 2019

Two NHS trusts have signed a strategic research agreement (SRA) with AI company Sensyne Health.

George Eliot Hospital NHS Trust in Nuneaton and Wye Valley NHS Trust in Hereford have both signed the agreement, which gives Sensyne access to anonymised patient data.

Sensyne Health will apply artificial intelligence to the data for analysis.

This data is then used to find potential clinical solutions which can later be sold to pharmaceutical companies.

In return, the trusts will receive a £2.5 million equity stake in Sensyne Health (at a price of £1.75 per share) and will also benefit from royalties that arise from any discoveries.

Any royalties the trusts receive will be reinvested in the NHS, funding further research and helping to deliver higher-quality patient care at lower cost.

Both George Eliot and Wye Valley join Oxford University Hospitals NHS Foundation Trust, South Warwickshire NHS Foundation Trust and Chelsea and Westminster Hospital NHS Foundation Trust, which have all signed SRAs with Sensyne Health.

The Oxford-based company has previously promised that no data is sold, nor is any ownership or control of data transferred to it or its pharmaceutical collaborators.

Lord Paul Drayson, former Labour science minister and CEO of Sensyne Health, said he was “delighted” that the two trusts had joined the company’s research partnership.

He added: “Together, we aim to make new discoveries that will improve care for patients, accelerate medical research and provide a return back into the NHS.”

On the same day, the company also announced a three-year collaboration with the University of Oxford’s Big Data Institute (BDI).

The collaboration aims to establish a research alliance to develop and evaluate the use of clinical artificial intelligence (clinical AI) and digital technology to understand the complexities of chronic disease.

No financial terms were disclosed but it was revealed that the programme will use data sets provided by NHS trusts which have the SRA with Sensyne Health.

It is hoped the programme will help discover and develop new medicines and improve patient care pathways for chronic diseases, with an initial focus on chronic kidney disease and cardiovascular disease.

In August 2018, Sensyne Health revealed it was planning to raise £60 million by floating its shares on the stock market.


7 Comments

  • Firstly I agree with all the comments made.

    As a DPO, Caldicott Guardian and cancer patient, I would have great concerns about an external organisation having ‘access’ to records being held on a hospital system. As has been said, it is always possible to trace a record back.

I’d also be concerned about a Trust that thinks it’s all right to share details of my disease, treatment and response without my permission. It isn’t their information to anonymise unless I give permission to do so. GDPR is quite clear that to use my information I must be informed, told how it will be anonymised and for what purpose it will be used. Recently, I heard someone say that all the data held by the NHS belonged to the Secretary of State – that might have been the case before GDPR but it’s not now. If I can be identified in any way by a piece of information then it is MINE and I have the right to determine how it will be used or not.

    If the Trusts think by taking a stake in Sensyne it means they are a partnership organisation, I don’t think that holds water. One is an organisation providing me with care and the other is out to make money from my condition.

    I’m not averse to sharing my information but I have a right to be involved in the decision making process, to understand the details of what will be shared and, under GDPR, a right to say ‘No’.

    I’d be interested to know the ICO stance on this and whether there was consultation and sign off by the DPOs and Caldicott Guardians because, as this reads, I wouldn’t have signed.

  • If properly anonymised, the data wouldn’t be useful for the sort of analysis needed. This is snake oil.

  • The article is pretty short on detail on some key points – unfortunately DigitalHealth is not known for its investigative journalism.

    Having worked, within the NHS, on one of the chronic diseases named as a target, I can say that any study purely focussing on secondary care data will be missing much by not also having the primary care data for the same person. So, if the data is truly anonymised, even if Sensyne gets access to primary care data, this study will be severely degraded.

    I think that it would be helpful for Bertl to define what they mean by “‘anonymised’ data, as understood by the NHS”. This implies different interpretations of what “anonymised” means. The readers could then understand how this comes within the scope of the GDPR, i.e. Article 4(1).

    It would also be very useful to reference the analysis of the “two leading experts in clinical informatics”. This would be very helpful to anybody who thinks they have adequately anonymised their data to understand possible shortcomings of what they have done.

    This may also be the issue that Joe McDonald suspects in the proposed arrangement. Although I have never dug deep enough to know whether patients’ consent is required for data usage that is truly anonymised.

  • Not sure they would know, as secondary care has not got access to the opt-out yet!

  • I’ve a different objection. The NHS Trusts act in cahoots with one AI company, which sells a “clinical solution” to a pharma company, that develops an effective drug, for which it demands grossly exorbitant, unjustifiable and unaffordable payment. (Cystic fibrosis, anyone? Voluntary sector bodies funded the research that led to Orkambi being developed and now, quite unaffordably, marketed.)

    Nowhere in this process is the NHS even trying to secure the real price of such collaboration, i.e. known affordable commercial access to the end-products and any by-products en route. Instead of which, a handful of individual NHS Trusts have sold this “birthright” for a relatively very trivial reward.

  • So called “anonymised” data is neither anonymous nor unidentifiable, a fact confirmed for me by two leading experts in clinical informatics. “Anonymised” data just takes marginally more trouble to identify whose data it is. “Anonymised” data, as understood by the NHS, is personal data under the terms of the GDPR. The NHS is deliberately deceiving patients by pretending that it is not personal data.

    If the NHS Trusts named in this article are using “anonymised” data to circumvent patients’ opt-outs from, or objections to, secondary use of their personal data, then they are contravening the GDPR. If the patient has opted out of or objected to secondary use of their data, they have thereby removed any possible legitimate purpose from use of their data for purposes beyond direct care. Any such processing will therefore contravene GDPR Article 5(1)(b) (legitimate purpose). Lacking legitimate purpose, the processing will inevitably contravene Article 5(1)(a–f), which is to say that it will violate ALL of the fundamental principles that must apply to ALL processing of personal data. The processing is unfair and illegal.

    The Trusts cannot use GDPR Article 6(1)(e) to overrule a patient’s objections, because Article 6(2) implies that 6(1)(e) does not constitute legal grounds for unfair or illegal processing.

    In this and thousands of other instances the NHS is routinely contravening data protection law, while telling patients that they have a legal basis for what they are doing. They have no such thing. When has effective healthcare ever been based on duplicity and outright lies?

    The NHS pretends to allow patients to opt out of secondary use of their personal data, while it never actually honours this undertaking. Opt-outs are always ignored, illegally overruled or illegally circumvented. There is no national data opt-out. All that there is, is an ongoing national data heist, in which the so called “guardians” of our personal data have chosen to collude, for reasons best known to themselves, but most obviously because they are being paid to do so, by the Government.

  • And all the patients were informed about this and happy with it, right? Nothing like DeepMind?

Comments are closed.