
Wednesday, 3 April 2024

Will medical AI apps help us to achieve healthcare justice?

AI-powered healthcare apps promise to solve current problems, offering users more efficient, unbiased and accurate care. According to these promises, AI apps can diagnose mental disorders, tell patients whether their moles are cancerous, and offer a more receptive ear for disclosure, since some patients report feeling more comfortable sharing their experiences with AI than with their clinician. Yet, as AI has been found to reproduce the biases in its training data, can it really mark a departure from the long and difficult history of inequality in healthcare?

Women have traditionally been underrepresented in science, leading to conditions faced by half the world's population being ignored and to women's symptoms being systemically underplayed and disbelieved. This has often resulted in an inadequate understanding of female physiology. For decades, studies on women's healthcare have revealed how women's pain is misdiagnosed or psychologised, with women reporting having to work harder for their symptoms to be believed and addressed.

Moreover, the history of psychiatry shows how diagnosing mental disorders has been used to oppress women. Today, great inequalities persist in the medical profession: only one in ten surgeons in the NHS identifies as female, while patients with endometriosis wait on average seven and a half years to receive a diagnosis.

Reproductive health is a popular area for apps, with one subsection targeting users who suffer from symptoms compatible with endometriosis, enabling them to track their pain and physical discomfort. The apps collect data that can then be presented to a physician, and through this process the data appear more reliable and trustworthy. This type of app is presented as a solution to epistemic injustice because the collection and visualisation of data points are seen as more objective and can thus accelerate diagnosis. The collected data, it is argued, can also power AI investigations into medical conditions and, at the individual level, empower people to track their medical journeys.

The apps collect user-inputted information and present it to healthcare providers, aiming to alleviate epistemic injustice in the clinic through data visualisation. However, we contend that this ultimately shifts the burden of proof from the practitioner to the user rather than resolving the structural inequalities responsible for the phenomenon. The apps could in fact increase the burden of proof for these conditions, requiring longitudinal self-tracking and interaction with the app interface: a labour cost to the individual who is seeking help. For users who, for whatever reason, do not or cannot provide this information, the barrier to healthcare will be further entrenched. Such apps could therefore make epistemic injustice worse rather than alleviating it. The risk is that the issue will be obfuscated by techno-solutionism and that, ultimately, women will not be believed without neatly packaged data.

Moreover, we are sceptical about the emancipatory role of such apps in improving healthcare for individuals who face epistemic injustice in the clinic, and about the argument that the data these apps collect can truly empower research into conditions such as endometriosis. What is currently needed to advance understanding of the condition is more and better imaging and screening of those affected, which starts with believing their symptoms. This is possible within traditional medical set-ups. So, before we get excited about the next toolkit, it is worth reflecting on why we expect technology to offer solutions to deeply entrenched socioeconomic problems.

Dr Milena Ivanova is a philosopher of science interested in the relationship between science and art, creativity and the automation of discovery. She teaches medical professionals on the role of epistemic diversity in medicine and is interested in the presence of gender and racial bias in the history of medicine and psychiatry.
Dr Aisha Sobey is an interdisciplinary STS researcher concerned with understanding the interaction of digital systems and human experience, especially in relation to power structures.
