DHSC outlines plans to tackle ethnic and other biases in medical devices

14 March 2024

Earlier this week, the government announced its plan to tackle ethnic and other biases in medical devices, in a bid to position the UK as a world leader on the topic. The plan follows a UK-first independent review into equity in medical devices, which identified both the extent and the impact of biases in the design and use of medical devices.

In particular, the review, led by Professor Dame Margaret Whitehead, professor of public health at the University of Liverpool, explored issues around pulse oximeters. This followed concerns that the devices were not as accurate for patients with darker skin tones.

The report found that ethnic minorities, women and people from disadvantaged communities were most at risk of bias from medical devices, and therefore of poorer health outcomes.

Following the publication of the report, the government will now address issues from the design stage of medical devices and provide extra funding for applications to develop new devices that operate without bias.

Minister of State, Andrew Stephenson, said: “Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”

The government has committed to ensuring that pulse oximetry devices used within the NHS can be used accurately across a range of skin tones, as well as to removing racial bias from data sets used in clinical studies.

In addition, the Medicines and Healthcare products Regulatory Agency is now requesting that approval applications for new medical devices describe how they will address issues of bias. Ongoing work with NHS England will also support the upskilling of clinical professionals on issues including health equality.

Professor Dame Whitehead said: “The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.

“Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning.

“Our recommendations therefore call for system-wide action, requiring full government support. The UK would take the lead internationally if it incorporated equity in AI-enabled medical devices into its global AI safety initiatives.”

The announcement forms part of the government’s ongoing work to tackle disparities in the healthcare system. In recent years it has established the Office for Health Improvement and Disparities, dedicated to reducing health disparities; commissioned Core20Plus5, a national NHS England approach to inform action to reduce healthcare inequalities; and invested £50m in health inequalities research for local authorities in 2022.

