This post by Kevin Elliott unpacks the value disagreements that lie at the heart of many epistemic injustices in health care and policy.
One of the reasons that epistemic injustice
is challenging to tackle in medical contexts is that it can be difficult to decide
how to handle situations where non-specialists challenge the views of the
mainstream medical community. In some cases, non-specialists may have very important
insights, whereas in other cases, they may be guided by misinformation. It’s
understandable that medical experts want to resist misinformation, but how can
they tell the difference between the two cases?
In an article
published recently in Topoi, I argued that recent scholarship in the
philosophy of science could help with tackling this challenge. Philosophers of
science working on the topic of “science and values” have been exploring the
wide array of value-laden choices that scientists make in the course of their
research. These choices are value-laden in the sense that they have
consequences for society, but they can’t be settled just by appealing to
evidence and logic. When non-specialists disagree with medical experts because
they are handling these sorts of value-laden choices differently, it suggests
that the non-specialists’ perspectives should be taken seriously and explored
further.
Consider three important kinds of
value-laden choices: (1) research questions and framing; (2) background
assumptions; and (3) standards of evidence. First, non-specialists might approach
problems differently from the mainstream medical community because they are
asking different questions. For example, Maya Goldenberg contends
that most public health experts who make claims about vaccine safety are
focused primarily on the overall costs and benefits of vaccines for society as a whole.
She argues that some parents are unconvinced by the experts’ assurances of
safety because they are worried that particular vaccines might pose significant
risks to their specific children based on their unique characteristics. The
parents might accept that the overall costs and benefits of vaccines are
favorable for society as a whole, but they might doubt that the experts have
adequately studied the risks of vaccines in all sub-populations.
Second, non-specialists might draw
different conclusions than specialists because they adopt different background
assumptions. For example, sociologist Gwen Ottinger describes how communities living near
industrial facilities in Louisiana have struggled to convince regulators to
take their concerns about air pollution seriously. This is partly because of a
difference in background assumptions: according to Ottinger, the regulators
assume that they should focus on average pollution levels over an extended
period of time (say, 24 hours or more), whereas community members argue that
they sometimes experience lasting health effects from short-term spikes in
pollution.
Third, specialists and non-specialists
might disagree because they demand different amounts or kinds of evidence. For
example, sociologist Steven Epstein points out that
many AIDS activists criticized the U.S. Food and Drug Administration (FDA) in
the 1980s and 1990s for being too slow to approve new drugs. The activists felt
that the FDA demanded too much evidence before declaring drugs safe and
effective, especially considering that AIDS patients were willing to take risks
because they were likely to die otherwise.
When non-specialists make these kinds of
choices differently from medical experts, it does not automatically mean that
the non-specialists are correct, of course. For example, vaccine-hesitant
parents might be asking a question that has already been addressed: experts
may have already assessed the risks to children just like
theirs and found them to be insignificant. Or the background assumptions
accepted by non-specialists might be highly implausible compared to the
background assumptions accepted by the mainstream medical community.
Nonetheless, even in cases where
non-specialists make implausible choices, clarifying these differences
can still foster greater understanding and richer dialogue between medical
professionals and non-specialists. By making these choices explicit, philosophers of
science can help non-specialists communicate more effectively about why they
disagree with professionals, and they can help professionals interpret the
perspectives of non-specialists more sympathetically. In some cases, medical
professionals might even change their minds. For example, AIDS activists
ultimately convinced the FDA to adopt an expedited approval process for some
drugs, and they altered the ways some clinical trials were designed.
Admittedly, not all cases will turn out as
well as the AIDS case. There will be some cases where those who question
mainstream medical views are simply misinformed or operating in bad faith. But in
order to promote a medical system that combats epistemic injustice, we need to
explore ways to promote dialogue and mutual understanding in the face of
disagreement. The philosophy of science can help with this task.
Note: This post is adapted from a post
written for the blog of the American Philosophical Association, “Threading
the Needle: Can We Respect Local Knowledge While Resisting Misinformation?”
Author bio
Kevin Elliott is a Red Cedar Distinguished
Professor in Lyman Briggs College, the Department of Fisheries and Wildlife,
and the Department of Philosophy at Michigan State University. His research
focuses on the philosophy of science and practical ethics, with an emphasis on
the roles that ethical and social values play in scientific research,
particularly in the environmental health sciences. His books include Values
in Science (Cambridge University Press, 2022) and A Tapestry of Values:
An Introduction to Values in Science (Oxford University Press, 2017).