Medical device bias needs urgent action - review


Image source, Getty Images

Image caption,

A pulse oximeter

Immediate action is needed to tackle the impact of ethnic and other biases in the use of medical devices, an independent review says.

It found pulse oximeter devices could be less accurate for people with darker skin tones, making it harder to spot dangerous falls in oxygen levels.

Meanwhile, devices using AI could under-diagnose skin cancer in people with darker skin, it warns.

The review said fairer devices needed to be designed urgently.

In total, it made 18 recommendations for improvement. The government says it fully accepts the report's conclusions.

It looked closely at three types of device where there is potential for "substantial" harm to patients:

  • optical medical devices - such as pulse oximeters, which send light waves through a patient's skin to estimate the level of oxygen in the blood. The light can behave differently depending on skin tone
  • artificial intelligence in healthcare
  • polygenic risk scores - which combine the results of several genetic tests to help estimate an individual's risk of disease and are used mostly for research purposes

Pulse oximeters were used frequently during the Covid pandemic, for example, alongside other observations, to help judge whether a patient needed hospital admission and treatment.

'Inherent bias'

Building on previous research, the review says the devices, which are clipped onto a finger, can over-estimate the level of oxygen in the blood for people with darker skin tones.

The report says there is evidence from the US to suggest this can sometimes lead to worse health outcomes for Black patients.

And researchers say the situation is compounded by the devices mostly being tested and calibrated on participants with lighter skin tones.
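To see where calibration comes into it, here is a minimal, purely illustrative sketch in Python of the "ratio of ratios" calculation pulse oximeters typically perform. The function name and the coefficients a and b are hypothetical stand-ins for the empirical calibration curve a manufacturer fits from volunteer studies; this is not any real device's algorithm.

# Illustrative only: a simplified "ratio of ratios" SpO2 estimate.
# The calibration coefficients a and b are hypothetical; real devices use
# curves fitted to data from volunteer studies.
def estimate_spo2(red_ac, red_dc, ir_ac, ir_dc, a=110.0, b=25.0):
    """Estimate blood-oxygen saturation (%) from red and infrared light
    absorption: AC is the pulsing part of the signal, DC the steady part."""
    r = (red_ac / red_dc) / (ir_ac / ir_dc)  # ratio of ratios
    spo2 = a - b * r                         # simple linear calibration
    return max(0.0, min(100.0, spo2))        # clamp to 0-100%

print(estimate_spo2(red_ac=0.02, red_dc=1.0, ir_ac=0.04, ir_dc=1.0))  # 97.5 with these illustrative inputs

If the curve behind those coefficients is fitted mostly on lighter-skinned volunteers, the same optical signal can translate into a less accurate reading for darker skin tones, which is the calibration gap the researchers describe.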

The government points to action already being taken on the issue, including updated NHS guidance on pulse oximeters and funding for further research into smarter devices.

But the researchers said it was crucial that people did not stop using pulse oximeters, which still give useful information about trends in oxygen levels, while further action is taken to resolve the problem.

Image source, Getty Images

Image caption,

Artificial intelligence can be used to help read X-rays

Chair of the review, Professor Dame Margaret Whitehead from the University of Liverpool, called for "system-wide action" to be implemented as a matter of priority.

She said: "The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.

"Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning."

Chest X-rays

One example is the potential under-diagnosis of skin cancers for people with darker skin.

This would likely be a result of machines being 'trained' predominantly on images of lighter skin tones, the team explains.

Another concern arises with AI systems for reading chest X-rays: these systems are trained mainly on images of men, who tend to have larger lung capacities.

This could potentially lead to under-diagnosis of heart disease in women, the report suggests, worsening an already long-standing problem.

The government says it is committed to removing bias in datasets and increasing training for health professionals.

When it comes to predicting someone's risk of disease using so-called polygenic risk scores, the report says there are similar issues with the data because it's based overwhelmingly on populations of European ancestry - meaning the results may not be applicable to people of other backgrounds.
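In essence, a polygenic risk score is a weighted sum: each risk variant a person carries is multiplied by an effect size estimated from reference studies, and the products are added together. The short Python sketch below uses made-up numbers to show the idea; the function and the values are illustrative, not taken from the review.

# Illustrative only: a polygenic risk score as a weighted sum of risk alleles.
# The variant weights (effect sizes) are invented for this example; in practice
# they come from large genetic studies, often of European-ancestry populations.
def polygenic_risk_score(genotypes, effect_sizes):
    """genotypes: number of risk alleles carried per variant (0, 1 or 2);
    effect_sizes: per-variant weights from a reference study."""
    return sum(g * w for g, w in zip(genotypes, effect_sizes))

print(polygenic_risk_score(genotypes=[2, 0, 1], effect_sizes=[0.12, 0.30, 0.05]))  # 0.29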

Another concern is that these scores are only predictive and cannot say for certain that people will develop a disease.

Professor Habib Naqvi, chief executive of the NHS Race and Health Observatory, welcomed the findings, saying access to better health should not be determined by ethnicity or skin colour, and that medical devices needed to be fit for purpose for all communities.

He added that the lack of diverse representation in health research and robust equity considerations had "led to racial bias in medical devices, clinical assessments and in other healthcare interventions".

Minister of State Andrew Stephenson said the review was important.

"Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS," he said.
