Daniel Ray: on direct data flows
- 25 July 2016
So, I was chatting to a friend of mine who works for a well-known supermarket chain, and we were talking shop, literally.
He was saying: “We have stores all around the country and a central head office. The data from the local store systems flows into a central hub that enables reporting in real time on how each of the stores is doing.
“That allows the central office to make real-time decisions on the operational business. How does it work in the NHS? What’s your set-up for hospitals and the central organisations?”
To which I responded: “Well… extracts are collected locally. These are then processed, and get XML’d, and submitted into a central repository, where they are then also processed and…”
“Are there plans to change this?” he said. “Definitely,” I said.
Time to call time on central returns
In order to produce more real-time analytics for the health and care system, data flow needs to increase greatly. Moving to data flowing from local warehouses or systems direct to the centre (by which I mean HSCIC, NHS England, the Department of Health and others) is what you would expect in our century and industry.
We are engaging with some pilot providers about what we need to do to achieve this, with data flowing into the HSCIC and in the future the national Data Services Platform.
This is a big shift. For as long as I can remember, hospitals have collected data from their local systems, processed it and then submitted a ‘polished’ (or not in some cases) dataset nationally.
So in this, my second column, I am setting out what I think some of the key considerations need to be. I hope it stimulates some debate, which can take us forward.
A win-win situation
Data gives us the ability to improve health and care outcomes, something which is recognised by both the National Information Board and within the ‘Five Year Forward View’.
But in order to engage hospital and social care provider organisations, we need to be able to demonstrate the benefits of direct data flows for the health and care system and create a win-win (as Covey would say) for all.
We believe that flowing data direct from local providers to the centre will support a reduction in the burden on providers, who have to submit data and returns.
I’ve worked with teams who have asked: “Why are we spending time calculating a set of numbers on an aggregate return on the one hand, and then submitting the underlying raw dataset that could be used to calculate the aggregate return on the other?”
If we receive health and care data in a more timely fashion, then we should be able to calculate a lot of the aggregate returns that we ask providers to calculate at the moment. And this will reduce the burden on them.
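To make the point concrete, here is a minimal sketch of the idea that an aggregate return can be derived centrally from the raw record-level data a provider already submits. The field names and the aggregate shown are hypothetical illustrations, not a real NHS submission format:

```python
# Illustrative sketch only: field names and the "aggregate return" here
# are hypothetical, not an actual NHS dataset or return specification.
from collections import Counter

# Raw record-level data a provider might already flow centrally.
raw_records = [
    {"provider": "RXX", "month": "2016-06", "admission_type": "emergency"},
    {"provider": "RXX", "month": "2016-06", "admission_type": "elective"},
    {"provider": "RXX", "month": "2016-06", "admission_type": "emergency"},
]

# The aggregate return that the provider currently calculates by hand
# can be computed centrally from the same underlying records.
aggregate_return = Counter(r["admission_type"] for r in raw_records)

print(aggregate_return["emergency"])  # 2
print(aggregate_return["elective"])   # 1
```

If the raw data flows in a timely fashion, the centre can run this calculation itself, and the provider no longer needs to prepare the aggregate separately.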
We will need to put valuable reports in place for providers, so they can view and analyse their data and national data. This will create incentives for providers to improve the quality of their data and the timeliness with which it is captured, as they’ll be able to see the value of what they are submitting when it comes back as health intelligence.
Providers will also need to be able to ‘sign off’ the data that represents their organisation online, and be given the opportunity to make any corrections necessary on the local systems that feed through to the national, central data systems.
A few practical concerns
In order to get to this place, the timeliness of data entry and the quality of the data that exists within provider systems may both need to improve. The use of the DQMI, which I outlined in my last column, should help with that; the next version is due out on 15 August.
Any validation should be done on the core hospital and social care provider system records, rather than outside the system on locally held records, as this improves governance and audit compliance.
Then, to get the feeds flowing we need to consider a number of other factors, such as security, governance, and technology.
One of the early questions is from where and how the data should flow. Should it use system supplier messages? Should it come out of a local trust data warehouse? Should there be a combination of the two, or other options?
Many providers have multiple systems that capture and record care, with the data culminating in a data warehouse. What happens if this set-up changes? How do we maintain this and configure it for health and social care providers?
There are some examples around the country where health economies have joined up care records and where data from multiple providers flows into a unified data warehouse.
This is to enable that geographical health economy to have a care record that is kept up to date that contains data from multiple providers in one place. The Connected Care work in Bristol is a good example of this.
Getting the benefit from data
So we can also look at getting data to flow from set-ups like this. One way or another, the idea is to get providers of all kinds moving away from the monthly extract-and-submit cycle that has existed for 20 years or more.
Centrally, the benefit of more timely information is astonishing. It should give us the ability to understand the care delivery status of the health and care system in a timelier manner, empower and inform those organisations that regulate, govern and operationally manage it, and eventually improve things for staff and patients.
About the author: Daniel Ray has worked in health informatics for 17 years. Until recently, he was director of informatics at a large teaching hospital, where he transformed the health informatics service, set up a quality outcomes research unit, and developed a patient portal among other work programmes.
He recently joined the Health and Social Care Information Centre (soon to become NHS Digital) as director of data science. He is also honorary professor of health informatics at UCL’s Farr Institute of Health Informatics, where he gets involved in leading-edge research and teaching.