
Department of Health investigates bias in medical devices and algorithms

The government has launched a review to look at the problem of discrimination in medical devices and decision-making software

The government has put out a call for evidence, looking for views and advice on how to tackle discrimination in medical devices and technology, as part of an independent review on medical tech.

The call for evidence, which is open until 6 October 2022, aims to gather insights from experts and organisations on potential racial and gender bias in medical devices. The review is seeking expertise from people who develop and use medical devices such as oxygen-measuring devices and infrared scanners, along with related software and hardware, including databases and instructions. It covers a device’s entire lifecycle, from evaluation to marketing and implementation, to identify potential biases at each stage.

As part of an independent review on equity in medical devices, led by Margaret Whitehead, WH Duncan chair of public health in the Department of Public Health and Policy, the government is seeking to tackle disparities in healthcare by gathering evidence on how medical devices and technologies may be biased against patients of different ethnicities, genders and other socio-demographic groups.

For instance, some devices employing infrared light or imaging may not perform as well on patients with darker skin pigmentation, which has not been accounted for in the development and testing of the devices.

Experts are being asked to provide as much information as possible about biases in medical devices. Along with information about the device type, name, brand or manufacturer, the independent review is also looking to gather as much detail as possible about the intended use of medical devices that may be discriminatory, the patient population on which they are used, and how and why these devices may not be equally effective or safe for all the intended patient groups.

Discussing the review, Whitehead said: “We aim to establish where and how potential ethnic and other unfair biases may arise in the design and use of medical devices, and what can be done to make improvements. We especially encourage health, technology, and industry experts and researchers to share their views and any evidence concerning medical devices to help us tackle inequalities in healthcare.”

Research suggests the way some medical devices are designed and used may be failing to account for differences related to ethnic background, gender, or other characteristics such as disabilities, potentially exacerbating existing inequalities in healthcare.

While current UK regulations set out clear expectations on medical devices and technologies, they do not include provisions to ensure that medical devices work equally well for different groups in the population based on their social or demographic characteristics.

Health minister Gillian Keegan said: “The independent review is part of our vital work to tackle healthcare inequalities, and I invite the industry to share their expertise in the call for evidence so we can ensure medical devices are free of any form of bias.”

Along with physical devices, the review is assessing artificial intelligence (AI)-enabled applications used in diagnostics and in healthcare decision-making, where biases may be built into the clinical algorithms they rely on. The review will also investigate risk-scoring systems, where genomics is used to make decisions about personalised medicine.

