Bias risk means diversity must be built into health research

Out-Law Analysis | 24 Aug 2022 | 12:41 pm | 5 min. read

Imposing diversity requirements on researchers developing new medical technologies would improve health outcomes for all groups in society and could even save lives.

This is something the UK government could legislate for in the months ahead as it aims to address the risks of bias in the development of medical technology.

Is there evidence of bias in medtech?

There is health inequality in the UK and globally. Health inequality pre-dated the Covid-19 pandemic, but the fact that people from ethnic minority backgrounds were found to be more likely to be infected with and to die from the virus, together with the pandemic’s impact on healthcare provision, has brought the issue into stark focus. A study undertaken by Imperial College Business School found that people living in the nation’s poorest areas, together with Black people, people of Asian descent and people from other minority ethnic backgrounds, experienced the most disruption to hospital care from Covid-19-related cancellations and delays to hospital appointments.

Health inequality has the potential to affect the levels of trust in, and engagement with, healthcare systems within society. This in turn can have an impact on the way medicines and medical technologies are developed. Historically, for example, there has been a western bias in medical research and certain groups – including those born female – are underrepresented in pre-clinical and clinical studies. This in turn contributes to disparities in health outcomes.

The pulse oximeters example and the UK review

The importance of understanding and correcting bias in medical devices was underscored last year when the Medicines and Healthcare products Regulatory Agency (MHRA) warned that pulse oximeters, used to determine whether patients with Covid-19 require access to oxygen, could produce different results for patients depending on what colour of skin they had.

The first oximeters were tested in the 1980s, when clinical studies did not require gender, racial or ethnic diversity. The limitations of pulse oximeters in monitoring oxygen levels across diverse skin tones were identified as early as the 1990s, and a 2020 study in the US demonstrated the clinical significance of the potential racial bias.

Helen Cline

Legal Director

In July 2021, then health secretary for England Sajid Javid said that pulse oximeters “were giving the wrong readings, generally, for anyone that had dark skin – because they were designed for caucasians”. He said that, as a consequence, “you were less likely to end up on oxygen if you were black or brown, because the reading was just wrong”.

Determined to combat health inequality, Javid set up an independent review in a bid to establish the extent and impact of potential ethnic and other biases in the design and use of medical devices.

The ongoing review (3-page / 67KB PDF), led by Professor Dame Margaret Whitehead, “aims to collect and review evidence as to whether the way in which some medical devices are designed, developed or used may lead to them not being equally effective or safe for different populations based on their ethnicity or other social or demographic characteristics, such as socio-economic status”.

Life sciences companies, patient groups and other stakeholders were recently given an opportunity to feed data and other evidence into the review. The closing date for submitting evidence is 6 October 2022.

The polygenic risk score example

One area of focus for the UK review is polygenic risk scores (PRS) used for personalised medicine. PRS are derived from genomic data and aim to inform clinical decision-making.

The prediction of disease risks is an essential part of personalised medicine, which includes early disease detection, prevention, and intervention. The PRS has become the standard for estimating an individual’s genetic risk for a specific disease or trait.

Diseases and characteristics that are influenced by more than one gene are described as polygenic. Common health problems such as heart disease and breast cancer have a polygenic architecture, and large-scale genetic studies have enabled researchers to identify genetic variants associated with these diseases. These variants can be combined into a polygenic risk score that estimates an individual’s susceptibility to disease – the score represents the relative risk a person faces and does not constitute a diagnosis.

The accuracy of a PRS for any individual, however, depends on how closely that person’s DNA (genetic data) resembles the DNA of the people whose data was used to develop the score. One test developed in the US using data from the UK Biobank has been criticised because it appears less accurate for people from non-European backgrounds.
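The combination of variants described above is, at its core, a weighted sum: each variant’s risk-allele count is multiplied by an effect weight estimated in a reference study population. The sketch below illustrates this; the variant identifiers and effect weights are invented for illustration and are not real study results.

```python
# Minimal sketch of a polygenic risk score (PRS) calculation: a weighted
# sum of risk-allele counts (0, 1 or 2 copies per variant) across many
# genetic variants. All variant IDs and effect weights are hypothetical.

def polygenic_risk_score(genotypes, weights):
    """Sum each variant's risk-allele count times its effect weight."""
    return sum(weights[variant] * count for variant, count in genotypes.items())

# Hypothetical effect weights, as if estimated in a reference population.
weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

# One individual's risk-allele counts at those variants.
genotypes = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

print(round(polygenic_risk_score(genotypes, weights), 2))  # prints 0.19
```

Because the effect weights are estimated from a particular study population, the same arithmetic applied to an individual from an underrepresented group carries the biases of that reference data – which is precisely the concern the review highlights.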

Three ideas to address the risk of bias in medtech

Medical technology needs to be developed and tested with rigour and fairness and be affordable and accessible to all. Leaving patients out of clinical studies is both scientifically damaging and unfair. However, a major problem for medical researchers is getting access to diverse datasets at scale.

Datasets often stem from relatively few institutions, geographies or patient demographics, and might therefore contain unquantifiable bias. Medical technology powered by data like this can perpetuate racial and gender bias.

Actions to mitigate the risks of bias in the development of medical technology will need to span all phases, from development to market deployment and post-marketing evaluation, and will involve multiple agencies. There are three actions that could be taken now:

  • Regulators could be given a more active role by legislators in uncovering bias in medical innovation, given that regulators already act as a checkpoint for all new medical technologies. This could include more rigorous evaluation protocols to detect performance differences between demographic groups, and requirements for data and analyses demonstrating diversity by gender, race and ethnicity in clinical trials. Protocols could also be established to better identify and report bias in medical technologies post-approval, and the funding agencies for pre-clinical and clinical studies could require mandatory subgroup analyses.
  • Fill existing data gaps globally. Establish a database of all relevant data from historically underrepresented groups, such as women and people from ethnic minority backgrounds, and enable access to that data for designers and developers of new medical technologies.
  • Educate clinicians to recognise and report inequity and bias in medical technology.
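The kind of subgroup evaluation described in the first action above can be sketched simply: compare a device’s measurement error, group by group, against a gold-standard reference. The readings and group labels below are invented for illustration and are not real clinical data.

```python
# Hedged sketch of a subgroup performance check: compare a device's mean
# absolute measurement error across demographic groups. All readings and
# group labels are hypothetical.

def mean_abs_error(readings):
    """Mean absolute error between device readings and reference values."""
    return sum(abs(device - reference) for device, reference in readings) / len(readings)

# (device_reading, reference_reading) pairs per hypothetical subgroup.
readings_by_group = {
    "group_a": [(97, 96), (95, 95), (92, 93)],
    "group_b": [(96, 92), (95, 91), (97, 94)],
}

for group, pairs in readings_by_group.items():
    print(f"{group}: mean absolute error {mean_abs_error(pairs):.2f}")

# A markedly higher error in one subgroup is a signal for further
# investigation, not proof of bias on its own.
```

A mandatory subgroup analysis of this kind surfaces performance gaps that an aggregate accuracy figure would hide – the pulse oximeter example shows why aggregates are not enough.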

What happens next?

The UK government has said that the evidence on the existence and impact of bias in medical devices considered to date in the Whitehead review is “not yet conclusive”. That is why it is vital that any evidence, one way or the other, is submitted to the review team between now and 6 October.

The review is ongoing at a time when major reform of the UK’s medical devices regime is anticipated. Last year the government and MHRA consulted on proposed changes to the regulatory framework for medical devices in the UK. In June this year, the government published its response to the consultation, in which it confirmed (155-page / 1.25MB PDF) that the new UK medical device regulatory framework will be built on five pillars – one of which is addressing health inequalities and mitigating biases throughout the medical device product lifecycle.

The government has said it hopes the new medical device regulations will come into force in 2023, though transitional arrangements would apply before the new rulebook has effect. It is not yet clear whether the new legislation and guidance it has promised on bias and diversity will be included in a first wave of reforms, or follow on subsequently.

The government said in its consultation response that it would “continue to support” the Whitehead review, and it would appear to make sense to hold off legislating on bias and diversity until the final recommendations from the review, on how to ensure the development and use of medical devices is equitable, are published. This is expected to happen by June 2023.

Racial and gender bias can exacerbate inequalities in healthcare. It is a worldwide issue and arises across health technology sectors, not just in the context of medical devices. It is to be hoped that the outcome of the UK review will feed into a global programme that informs a plan for equity and inclusion in medical innovation.
