POLICY BRIEF article

Front. Mar. Sci., 17 August 2023
Sec. Marine Affairs and Policy
This article is part of the Research Topic Contemporary Marine Science, its Utility and Influence on Regulation and Government Policy

Epistemology of ignorance: the contribution of philosophy to the science-policy interface of marine biosecurity

  • 1Centre for Biosecurity and One Health, Harry Butler Institute, Murdoch University, Murdoch, WA, Australia
  • 2School of Humanities, Arts and Social Sciences, Murdoch University, Murdoch, WA, Australia
  • 3Centre for One Biosecurity Research, Analysis and Synthesis, Lincoln University, Lincoln, New Zealand
  • 4School of Humanities, University of Western Australia, Crawley, WA, Australia
  • 5School of Life and Environmental Sciences, Faculty of Science, Engineering and Built Environment, Deakin University, Geelong, VIC, Australia
  • 6School of Molecular and Life Sciences, Curtin University, Bentley, WA, Australia
  • 7Department of Energy, Environment, and Climate Action, Melbourne, VIC, Australia

Marine ecosystems are under increasing pressure from human activity, yet successful management relies on knowledge. The evidence-based policy (EBP) approach has been promoted on the grounds that it provides greater transparency and consistency by relying on ‘high quality’ information. However, EBP also creates epistemic responsibilities. Decision-making where limited or no empirical evidence exists, as is often the case in marine systems, creates epistemic obligations for new information acquisition. We argue that philosophical approaches can inform the science-policy interface. Using marine biosecurity examples, we specifically examine the epistemic challenges in the acquisition and acceptance of evidence to inform policy, discussing epistemic due care and biases in consideration of evidence.

Introduction

The use of evidence in policy and decision making is increasingly promoted as highly desirable, especially for environmental issues. This has resulted in the adoption of evidence-based policy (EBP) ostensibly to provide greater transparency and consistency in decision making by relying on evidence that can be externally verified and validated (Wesselink et al., 2014). Yet the adoption of EBP creates epistemic challenges and responsibilities (i.e., with regard to the acquisition and reliability of knowledge) requiring decision-makers to use relevant scientific research findings, often on topics in which they have little or no expertise. Additionally, decision making in the face of uncertainty, specifically where limited or no prior empirical evidence exists, requires transparent approaches to determine how information is acquired, considered and accepted (or not) (Meßerschmidt, 2020).

Public policymaking has variously been viewed as a collective process of mediation and codification of social ideals or, in a contrary view, as an authoritative means to enact the will of government on the people (Wesselink et al., 2014; see also Wears and Hunte, 2014; Pak et al., 2021). Increasingly, there is a desire to shift from ideological and intuitive processes to systems that provide greater transparency and consistency by relying on ‘evidence’ through an EBP approach (Wesselink et al., 2014; Sánchez-Bayo et al., 2017).

The centrality of evidence in EBP generates epistemic responsibilities and challenges that are fundamentally philosophical by nature: What counts as evidence? Who is responsible for providing evidence? Where should the burden of proof lie? How far do our epistemic obligations extend? Additionally, there can be personal and systemic incentives (accidental and intentional biases) to encourage and maintain ignorance (defined as the absence or lack of knowledge or understanding). While different ways of effectively navigating the science-policy ‘space’ have frequently been debated (e.g., Ban et al., 2013; Sánchez-Bayo et al., 2017), the discipline of philosophy can help to illuminate the epistemological challenges arising therein.

Philosophy as a discipline is experiencing a surge in research activity regarding the social and ethical dimensions of knowledge and ignorance. One focus has been the novice-expert problem: how non-experts identify, access and interpret reliable sources of information (e.g., Goldman, 2001; Anderson, 2011; Guerrero, 2017). Using formal modelling techniques, philosophers have examined how knowledge spreads (or fails to spread) from scientists to decision-makers, and how propagandists may influence this process (Weatherall et al., 2020). This extends work in history of science showing how perceptions of the scientific record can be distorted by amplifying scientific findings that favor specific conclusions, thereby creating a false sense of legitimate controversy, confusing decision-makers and the public, and delaying action (Oreskes and Conway, 2010). Other work focuses on epistemic failings of our social structures, e.g., rejection of established scientific findings along partisan lines (Levy, 2019), or the incentive to rush into print and the resulting exacerbated risk of replicability problems (Heesen, 2018). Further, substantial philosophical debate exists on the assignation of responsibility for ignorance – when should we have known what we did not know, and to what extent are we required to investigate the impacts of our actions and omissions (Miller, 2017).

Here we consider these epistemic challenges in the acquisition, consideration and acceptance of scientific evidence to inform marine environmental policy, including standards of epistemic due care and biases in consideration of evidence, to demonstrate the contribution philosophy can make at the science-policy interface. Specifically, we consider the case of marine biosecurity (i.e., the management of human mediated biological introductions), which requires immediate action but is also heavily constrained by limited scientific information. In doing so, we leave aside some of the wider challenges of interpreting and accepting scientific evidence, where the problem may be one of ignorance of, or, more neutrally put, a lack of appreciation for, the methods of scientists, including the issue of statistical significance and replicability.

Standards of epistemic due care and epistemic obligations

The old saying that ‘ignorance is bliss’ rings hollow when it comes to irreversible changes to our social and natural environment that may have undesirable if not catastrophic consequences. Policy decisions are always made under some level of uncertainty – our knowledge concerning any issue is never (and can never be) complete. This raises the question of what reasonable standards of epistemic due care consist in. What responsibilities do policy-makers have to seek sufficient evidence to make an informed decision, and to what extent can decision-makers reasonably be expected to investigate the ramifications of proposed policies and regulation? Naturally, appropriate standards of epistemic due care will always be context-specific.

In the case of marine biosecurity incursions, a biosecurity response may involve trade or port closures to reduce the likelihood of spread and impact, while balancing such a response against potentially significant impacts to industries and wider economic repercussions for society. The rapid response to the Black Striped Mussel, Mytilopsis sallei (Récluz, 1849), incursion in Darwin, Northern Territory, Australia in 1999 was based on the then-available evidence. The incursion was determined to pose a sufficient risk to warrant the quarantine closure of three commercial and recreational marinas so that an eradication response could be mounted (Bax, 1999; Willan et al., 2000), despite significant economic impact to charter and tourist vessel operators. The eradication was successfully conducted over a 15-day period. In determining marine biosecurity action, policy-makers and decision-makers will – often implicitly and perhaps even unconsciously – make decisions about how much evidence is enough, what kind of evidence is needed, whether to investigate the issue further to collect more evidence and, if so, which direction such investigations should take. Evidence gathering takes time, and it is often necessary to act before much evidence becomes available. The highly complex, diverse and dynamic character of the systems in which conservation initiatives operate means that the people making decisions will unavoidably be ignorant of the full range of facts and potential outcomes of their decisions. The detection of a novel marine species requires rapid action, yet a species is frequently detected only after its population has reached sufficient density to be observed, reported and identified.

In the case of the Black Striped Mussel incursion, the then-available evidence, even though incomplete, was deemed to be of sufficient quantity and quality to warrant the above-described action response. The success of that response, the eradication of Mytilopsis sallei (Récluz, 1849) from the area, appears to suggest that appropriate standards of epistemic due care were met. However, where decisions concerning the adequacy of existing evidence are made without recourse to general principles or overarching standards, policy decisions are not based on solid foundations.

We believe that research in applied epistemology, focusing on the epistemological aspects of EBP can provide such foundations. In saying so, we want to emphasize that we are not advocating for a mere, straight-up ‘application’ of concepts developed in philosophical epistemology to the questions faced by EBP. Rather, we see a need for sustained engagement between philosophers, scientists and EBP practitioners. Such a systematic and comprehensive approach would enable us to develop general principles and guidelines for epistemically sound policy-making in the marine biosecurity space and beyond. One aim of such an approach would be to integrate existing epistemic principles into an overarching set of criteria for assessing the adequacy of one’s evidence. In the following, we discuss two such principles that are already being employed, albeit not necessarily in a systematic or even explicit way.

The first one is a type of epistemic proportionality principle: it would appear that the greater the potential detrimental impact of our actions (including inaction), the more demanding are our obligations to improve our epistemic position vis-à-vis the issue at hand, e.g. the characteristics and potential impact of Mytilopsis sallei. An action (or omission) that could lead to the eradication of an entire species plausibly requires more thorough investigation than an action that may merely affect a local population. In other words, we have obligations to gather evidence which we know bears on policy responses, in proportion to the significance of the problem. In the case of marine biosecurity responses this may impose a requirement for baseline knowledge to inform rapid response (Chapman and Carlton, 1991; Chapman and Carlton, 1994; Ojaveer et al., 2015; Campbell et al., 2018).
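As a toy illustration (not a calibrated tool), the proportionality idea can be sketched as a mapping from potential impact to evidence obligations. The severity scale, thresholds and tier descriptions below are invented for the example; in practice such standards would be set by domain experts:

```python
# Toy sketch of an epistemic proportionality principle: the required
# depth of investigation scales with the potential severity of an
# action's (or omission's) impact. All numbers and tier labels here
# are illustrative assumptions, not established biosecurity standards.

EVIDENCE_TIERS = [
    (3, "desktop review of existing literature"),
    (6, "targeted field surveys and expert elicitation"),
    (10, "comprehensive baseline studies before action"),
]

def required_evidence(severity: int) -> str:
    """Map a 1-10 potential-impact score to an evidence obligation."""
    if not 1 <= severity <= 10:
        raise ValueError("severity must be between 1 and 10")
    for threshold, obligation in EVIDENCE_TIERS:
        if severity <= threshold:
            return obligation

print(required_evidence(2))  # low local impact: lighter obligation
print(required_evidence(9))  # species-level risk: heaviest obligation
```

The point of the sketch is only structural: the evidence obligation is a monotonically increasing function of potential impact, rather than a fixed bar applied to every decision.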

The second one is an epistemic precautionary principle. The precautionary approach developed for the UN Convention on Biological Diversity dictates that “where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation”. In the context of conservation, a type of epistemic precautionary principle is often adopted: in the absence of knowledge or certainty concerning potential detrimental impacts on species and biodiversity, we should choose to err on the side of caution to prevent adverse conservation impacts. While it is difficult to establish what standard of epistemic due care (and which level of epistemic obligations) is appropriate in which context, it might be perfectly appropriate to have specific standards of epistemic due care for specific issues – for instance where inaction carries a high risk of irreversible or unacceptable impacts, such as biosecurity threats to the survival of a rare, threatened or endangered species, or a specific standard for public health threats such as COVID-19. In practice, however, we often see the epistemic precautionary principle reversed: no action is taken where uncertainty is high or where there is no explicit evidence of impact, possibly resulting from explicit tradeoffs between value systems (e.g., Campbell et al., 2009; Sánchez-Bayo et al., 2017; Meßerschmidt, 2020) or from individual or systemic biases.

A philosophically informed approach to unifying standards of epistemic due care in EBP would also reflect further insights from epistemology, such as the notion of blameworthy ignorance and of the collective nature of much of our knowledge.

In the face of unknown unknowns, it is particularly difficult to determine the extent of our epistemic obligations. In retrospect, we regularly evaluate cases of harm caused (or facilitated) by ignorance by asking whether a particular agent or agency could and should have known the consequences of certain actions and measures. From an ethical perspective then, lack of knowledge is no excuse if agents are culpably ignorant – if their ignorance arises in a negligent or even reckless way, e.g., where they violated accepted epistemic standards in their field of operation. These can be explicit, codified standards, but decision-makers may find themselves at a loss where these standards are inadequate, unsystematic, or completely lacking.

A final observation of how philosophical research can inform our understanding of epistemic standards of due care in EBP can be drawn from research on collective forms of knowledge. Decisions about policy responses tend to be made by (often very diverse) groups of people rather than by individuals. In order for such groups to make informed decisions, knowledge has to be distributed in the group in the right way. Often, group members will need to know what others know. This is what philosophers call second-order knowledge: they know (or have beliefs about) what other people know (or believe). Ignorance can obtain at all of these levels; in many cases there will be an easy remedy, in others it will be very difficult – it is much more difficult, for instance, to induce higher-order knowledge in larger and dispersed groups (Schwenkenbecher, 2022). Consequently, individual agents’ epistemic obligations do not just concern their own knowledge, but that of others, too. Or, to put it more clearly: one person’s epistemic obligations may concern a group’s shared or higher-order knowledge or beliefs (ibid.).

Biases in acquisition and consideration of evidence

There are a number of internal and external biases that can impact on people’s capacity and willingness to collect and appropriately evaluate evidence in the process of policymaking. Our focus here is on the philosophical dimensions of such psychological biases.

One famous example is the so-called status quo bias, which philosophers have examined to determine whether such preferences are failures of rationality (Douglas, 2009; Pauly and Zeller, 2015). Bostrom and Ord (2006) understand status quo bias to be “… an inappropriate (irrational) preference for an option because it preserves the status quo”, while Nebel (2015) defines it more neutrally as “a disposition, or tendency, to prefer some state of affairs because it is the status quo” that need not be irrational.

In the previous section we noted a tendency in practice towards inaction when uncertainty is high or explicit evidence of impact is lacking. We suggest that this tendency may be understood as a form of status quo bias. Using this perspective, we can apply existing philosophical analysis of status quo bias and proposed remedies. For example, Bostrom and Ord (2006) suggest a “reversal test” to mitigate status quo bias. In the context of marine biosecurity, this would involve combining the question “should we allocate money to investigate the risks associated with this (potential) invasive species?” with the question “suppose we had already allocated money to investigate such risks; would now be a good time to stop doing so?” If the answer to both questions is “no”, it suggests that status quo bias is at work, and the tentative decision not to allocate money should be seriously reconsidered, if not reversed. Or likewise, if the question at hand is “should we allocate money to attempt to eradicate this invasive species?”, we should also consider the hypothetical question “if there were a standing effort to eradicate this species, would the current situation be one in which we would be happy to end this effort?” Again, if the answer to both questions is “no”, status quo bias appears to be at work, and the case for allocating funds for eradication may be stronger than it is given credit for.
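The paired questions of the reversal test can be captured in a small sketch. The function, its inputs and the output wording are our illustrative construction under the paper's reading of Bostrom and Ord (2006), not an established implementation:

```python
# Illustrative sketch of the reversal test as a decision heuristic.
# The two boolean answers are supplied by the decision-maker; nothing
# here models biosecurity risk empirically.

def reversal_test(prefer_change_now: bool,
                  prefer_stop_if_already_doing: bool) -> str:
    """Flag possible status quo bias.

    prefer_change_now: from the current state, would we choose to start
        the intervention (e.g., fund a risk investigation or eradication)?
    prefer_stop_if_already_doing: if the intervention were already under
        way, would we choose to stop it now?
    """
    if not prefer_change_now and not prefer_stop_if_already_doing:
        # Rejecting a change in both directions suggests the status quo
        # is preferred merely because it is the status quo.
        return "status quo bias suspected: reconsider the decision"
    return "no asymmetry detected by the reversal test"

# Example: we decline to fund an eradication effort, yet admit that if
# such an effort already existed we would not stop it.
print(reversal_test(prefer_change_now=False,
                    prefer_stop_if_already_doing=False))
```

The heuristic does not decide the policy question itself; it only flags the asymmetry between starting and stopping that status quo bias produces, prompting a second look at the evidence and values in play.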

Moving beyond status quo bias, human and economic resource tradeoffs at the operational level may effectively lead to systemic biases against investigating and collecting evidence. One example is official global fisheries data suggesting catches are increasing or stable; however, reconstructed data accounting for a negative bias in reporting suggest fisheries stocks are significantly declining (Pauly and Zeller, 2015). While these tradeoffs may seem mundane, at the bottom of them are always value-based cost-benefit analyses – however informally conducted (Davidson and Hewitt, 2014). The requirement for further investigations to obtain additional knowledge or determine the potential impact of policies may be considered too expensive and unjustified given certain assumptions about the value of the expected outcome. Philosophers can help expose and evaluate the use of such non-epistemic values in science (Douglas, 2009; Elliott, 2017).

Loss aversion and temporal discounting can also be expected to influence what data are collected and what weight they are given. Loss aversion predicts that we will weight evidence of loss more heavily than evidence of foregone gains (Kemel and Paraschiv, 2018). Temporal discounting – our tendency to weigh near-term effects much more heavily than those that are delayed – has played a significant role in deferring actions on resource management challenges to the future, including decisions on habitat and species loss and other environmental problems. These effects can be further exacerbated by Treasury-applied discount rates (Ananthapavan et al., 2021). Salience bias and the availability heuristic may also play a role in limiting further evidence gathering. In particular, these biases militate against epistemic actions that might reduce our ignorance, because what we do not know is usually not salient to us.
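To see how strongly discounting suppresses delayed harms, consider a minimal sketch of standard exponential discounting, PV = FV / (1 + r)^t. The 7% rate and the damage figure are assumed purely for illustration and are not drawn from any Treasury guidance:

```python
# Illustrative arithmetic only: exponential discounting shrinks the
# present value of a delayed environmental harm. The 7% rate and the
# $10m damage figure are assumptions for the example, not policy values.

def present_value(future_cost: float, rate: float, years: int) -> float:
    """Standard exponential discounting: PV = FV / (1 + r)^t."""
    return future_cost / (1 + rate) ** years

damage = 10_000_000  # hypothetical damage from an unmanaged incursion
for years in (0, 10, 30, 50):
    pv = present_value(damage, rate=0.07, years=years)
    print(f"harm in {years:2d} years -> present value ${pv:,.0f}")
```

At a 7% rate, a harm 50 years out retains only a few percent of its face value in a cost-benefit analysis, which is one mechanism by which long-run habitat and species losses can be rationally deferred.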

Discussion

Many challenges to implementing evidence-based policy are not only conceptual but philosophical in nature. These cannot be truly understood, let alone resolved, by using the tools of the natural and the social sciences alone; nor are there simple fixes from philosophy. Rather, an ongoing, trans-disciplinary, collaborative effort to improve our collective understanding of the nature of these problems and the values expressed in opting for certain choices and not for others is much needed.

Author contributions

AS, CH and RH conceptualized, led and wrote the manuscript. AS and CH organized the workshop where all authors participated in ideation. All authors contributed to the article and approved the submitted version.

Funding

AS, CH and MC received funding from Murdoch University (Australia) to support an inception workshop; AS, CH and RH received funding from Zentrum für interdisziplinäre Forschung (ZiF) der Universität Bielefeld (Germany).

Acknowledgments

We thank the reviewers and editor for their comments on the paper.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Ananthapavan J., Moodie M., Milat A., Veerman L., Whittaker E., Carter R. (2021). A cost–benefit analysis framework for preventive health interventions to aid decision-making in Australian governments. Health Res. Policy Syst. 19 (1), 147. doi: 10.1186/s12961-021-00796-w

Anderson E. (2011). Democracy, public policy, and lay assessments of scientific testimony. Episteme 8, 144–164. doi: 10.3366/epi.2011.0013

Ban N. C., Mills M., Tam J., Hicks C., Klain S., Stoeckl N., et al. (2013). “A socio-ecological approach to conservation planning: Embedding social considerations”. Front. Ecol. Environ 11, 194–202. doi: 10.1890/110205

Bax N. J. (1999). Eradicating a dreissenid from Australia. Dreissena! 10 (3), 1–5. doi: 102.100.100/214161?index=1

Bostrom N., Ord T. (2006). The reversal test eliminating status quo bias in applied ethics. Ethics 116, 656–679. doi: 10.1086/505233

Campbell M. L., Grage A., Mabin C., Hewitt C. L. (2009). Conflict between International Treaties: failing to mitigate the effects of introduced marine species. Dialogue 28 (1), 46–56. Available at: https://socialsciences.org.au/publications/2009-dialogue-volume-28-number-1/.

Campbell M. L., Leonard K., Primo C., Hewitt C. L. (2018). Marine biosecurity crisis decision-making: two tools to aid “go” / “no go” decision-making. Front. Mar. Sci. 5. doi: 10.3389/fmars.2018.00331

Chapman J. W., Carlton J. T. (1991). A test of criteria for introduced species: the global invasion by the isopod Synidotea laevidorsalis (Miers 1881). J. Crustac. Biol. 11, 386–400. doi: 10.2307/1548465

Chapman J. W., Carlton J. T. (1994). Predicted discoveries of the introduced isopod Synidotea laevidorsalis (Miers 1881). J. Crustac. Biol. 14, 700–714. doi: 10.2307/1548863

Davidson A. D., Hewitt C. L. (2014). How often are invasion-induced ecological impacts missed? Biol. Invasions 16, 1165–1173. doi: 10.1007/s10530-013-0570-4

Douglas H. E. (2009). Science, policy, and the value-free ideal (Pittsburgh: University of Pittsburgh Press), 256.

Elliott K. C. (2017). A tapestry of values: An introduction to values in science (Oxford: Oxford University Press).

Goldman A. I. (2001). Experts: which ones should you trust? Philosophy Phenomenol Res. 63, 85–110. doi: 10.2307/3071090

Guerrero A. (2017). "Living with ignorance in a world of experts", in Perspectives on Ignorance from Moral and Social Philosophy. Ed. Peels, R. (New York: Routledge), pp, 156–185. doi: 10.4324/9781315671246

Heesen R. (2018). Why the reward structure of science makes reproducibility problems inevitable. J. Philosophy 115, 661–674. doi: 10.5840/jphil20181151239

Kemel E., Paraschiv C. (2018). Deciding about human lives: an experimental measure of risk attitudes under prospect theory. Soc. Choice Welfare 51, 163–192. doi: 10.1007/s00355-018-1111-y

Levy N. (2019). Due deference to denialism: explaining ordinary people’s rejection of established scientific findings. Synthese 196, 313–327. doi: 10.1007/s11229-017-1477-x

Meßerschmidt K. (2020). COVID-19 legislation in the light of the precautionary principle. Theory Pract. Legislation 8, 267–292. doi: 10.1080/20508840.2020.1783627

Miller D. J. (2017). Reasonable foreseeability and blameless ignorance. Philos. Stud. 174, 1561–1581. doi: 10.1007/s11098-016-0772-6

Nebel J. M. (2015). Status quo bias, rationality, and conservatism about value. Ethics 125, 449–476. doi: 10.1086/678482

Ojaveer H., Galil B. S., Campbell M. L., Carlton J. T., Clode J. C., Cook E., et al. (2015). Classification of non-indigenous species based on their impacts: the marine perspective. PloS Biol. 13 (4), e1002130. doi: 10.1371/journal.pbio.1002130

Oreskes N., Conway E. M. (2010). Merchants of doubt: how a handful of scientists obscured the truth on issues from tobacco smoke to global warming (New York: Bloomsbury Press).

Pak A., McBryde E., Adegboye O. A. (2021). Does high public trust amplify compliance with stringent COVID-19 government health guidelines? A multi-country analysis using data from 102,627 individuals. Risk Manage. Healthcare Policy 14, 293–302. doi: 10.2147/RMHP.S278774

Pauly D., Zeller D. (2015). Catch reconstructions reveal that global marine fisheries catches are higher than reported and declining. Nat. Commun. 7, 10244. doi: 10.1038/ncomms10244

Sánchez-Bayo F., Goulson D., Pennacchio F., Nazzi F., Goka K., Desneux N. (2017). Are bee diseases linked to pesticides? - A Brief review. Environ. Int. 89-90, 7–11. doi: 10.1016/j.envint.2016.01.009

Schwenkenbecher A. (2022). How we fail to know: Group-based ignorance and collective epistemic obligations. Political Stud. 70 (4), 901–918. doi: 10.1177/00323217211000926

Wears R. L., Hunte G. S. (2014). Seeing patient safety 'Like a State'. Saf. Sci. 67, 50–57. doi: 10.1016/j.ssci.2014.02.007

Weatherall J. O., O’Connor C., Bruner J. P. (2020). How to beat science and influence people: Policymakers and propaganda in epistemic networks. Br. J. Philosophy Sci. 71 (4), 1157–1186. doi: 10.1093/bjps/axy062

Wesselink A., Colebatch H., Pearce W. (2014). Evidence and policy: discourses, meanings and practices. Policy Sci. 47, 339–344. doi: 10.1007/s11077-014-9209-2

Willan R. C., Russell B. C., Murfet N. B., Moore K. L., McEnnulty F. R., Horner S. K., et al. (2000). Outbreak of Mytilopsis sallei (Récluz 1849) (Bivalvia: Dreissenidae) in Australia. Molluscan Res. 20 (2), 25–30. doi: 10.1080/13235818.2000.10673730

Keywords: invasion ecology, evidence, bias, obligation, uncertainty

Citation: Schwenkenbecher A, Hewitt CL, Heesen R, Campbell ML, Fritsch O, Knight AT and Nash E (2023) Epistemology of ignorance: the contribution of philosophy to the science-policy interface of marine biosecurity. Front. Mar. Sci. 10:1178949. doi: 10.3389/fmars.2023.1178949

Received: 03 March 2023; Accepted: 31 July 2023;
Published: 17 August 2023.

Edited by:

Di Jin, Woods Hole Oceanographic Institution, United States

Reviewed by:

Porter Hoagland, Woods Hole Oceanographic Institution, United States
Jay Jin, Independent Researcher, Los Angeles, CA, United States

Copyright © 2023 Schwenkenbecher, Hewitt, Heesen, Campbell, Fritsch, Knight and Nash. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Anne Schwenkenbecher, A.Schwenkenbecher@murdoch.edu.au

Present address: Remco Heesen, Department of Philosophy, Logic and Scientific Method, London School of Economics and Political Science, London, United Kingdom

These authors have contributed equally to this work and share first authorship
