Scientific dissent and public policy: Is targeting dissent a reasonable way to protect sound policy decisions?
EMBO reports
(2013)
14: 231–235
Dissent is crucial for the advancement of science. Disagreement is at the heart of peer review and is important for uncovering unjustified assumptions, flawed methodologies and problematic reasoning. Enabling and encouraging dissent also helps to generate alternative hypotheses, models and explanations. Yet, despite the importance of dissent in science, there is growing concern that dissenting voices have a negative effect on the public perception of science, on policy‐making and on public health. In some cases, dissenting views are deliberately used to derail certain policies. For example, dissenting positions on climate change, environmental toxins or the hazards of tobacco smoke [1,2] appear to laypeople to be equally valid conflicting opinions and thereby create or increase uncertainty. Critics often use legitimate scientific disagreements about narrow claims to reinforce the impression of uncertainty about general and widely accepted truths; for instance, that a given substance is harmful [3,4]. This impression of uncertainty about the evidence is then used to question particular policies [1,2,5,6].
The negative effects of dissent on establishing public policies are present even in cases in which the disagreements are scientifically well‐grounded, but the significance of the dissent is misunderstood or blown out of proportion. A study showing that many factors affect the size of reef islands, to the effect that they will not necessarily be reduced in size as sea levels rise [7], was simplistically interpreted by the media as evidence that climate change will not have a negative impact on reef islands [8].

In other instances, dissenting voices affect the public perception of, and motivation to follow, public‐health policies or recommendations. For example, the publication of a now debunked link between the measles, mumps and rubella vaccine and autism [9], as well as the claim that the mercury preservative thimerosal, which was used in childhood vaccines, was a possible risk factor for autism [10,11], created public doubts about the safety of vaccinating children. Although later studies showed no evidence for these claims, these doubts led many parents to reject vaccinations for their children, jeopardizing herd immunity against diseases that had been largely eradicated from the industrialized world [12,13,14,15]. Many scientists have therefore come to regard dissent as problematic if it has the potential to affect public behaviour and policy‐making. However, we argue that such concerns about dissent as an obstacle to public policy are both dangerous and misguided.
Whether dissent is based on genuine scientific evidence or is unfounded, interested parties can use it to sow doubt, thwart public policies, promote problematic alternatives and lead the public to ignore sound advice. In response, scientists have adopted several strategies to limit these negative effects of dissent—masking dissent, silencing dissent and discrediting dissenters. The first strategy aims to present a united front to the public. Scientists mask existing disagreements among themselves by presenting only those claims or pieces of evidence about which they agree [16]. Although there is nearly universal agreement among scientists that average global temperatures are increasing, there are also legitimate disagreements about how much warming will occur, how quickly it will occur and the impact it might have [7,17,18,19]. As presenting these disagreements to the public probably creates more doubt and uncertainty than is warranted, scientists react by presenting only general claims [20].

A second strategy is to silence dissenting views that might have negative consequences. This can take the form of self‐censorship when scientists are reluctant to publish or publicly discuss research that might—incorrectly—be used to question existing scientific knowledge. For example, there are genuine disagreements about how best to model cloud formation, water vapour feedback and aerosols in general circulation models, all of which have significant effects on the magnitude of global climate change predictions [17,19]. Yet, some scientists are hesitant to make these disagreements public, for fear that they will be accused of being denialists, faulted for confusing the public and policy‐makers, censured for abetting climate‐change deniers, or criticized for undermining public policy [21,22,23,24].
Another strategy is to discredit dissenters, especially in cases in which the dissent seems to be ideologically motivated. This could involve publicizing the financial or political ties of the dissenters [2,6,25], which would call attention to their probable bias. In other cases, scientists might discredit the expertise of the dissenter. One such example concerns a 2007 study published in the Proceedings of the National Academy of Sciences USA, which claimed that caddisfly larvae consuming Bt maize pollen died at twice the rate of larvae feeding on non‐Bt maize pollen [26]. Immediately after publication, both the authors and the study itself became the target of relentless and sometimes scathing attacks from a group of scientists who were concerned that anti‐GMO (genetically modified organism) interest groups would seize on the study to advance their agenda [27]. The article was criticized for its methodology and its conclusions, the Proceedings of the National Academy of Sciences USA was criticized for publishing the article and the US National Science Foundation was criticized for funding the study in the first place.
Public policies, health advice and regulatory decisions should be based on the best available evidence and knowledge. As the public often lack the expertise to assess the quality of dissenting views, disagreements have the potential to cast doubt over the reliability of scientific knowledge and lead the public to question relevant policies. Strategies to block dissent therefore seem reasonable as a means to protect much needed or effective health policies, advice and regulations. However, even if the public were unable to evaluate the science appropriately, targeting dissent is not the most appropriate strategy to prevent negative side effects for several reasons. Chiefly, it contributes to the problems that the critics of dissent seek to address, namely increasing the cacophony of dissenting voices that only aim to create doubt. Focusing on dissent as a problematic activity sends the message to policy‐makers and the public that any dissent undermines scientific knowledge. Reinforcing this false assumption further incentivizes those who seek merely to create doubt to thwart particular policies. Not surprisingly, think‐tanks, industry and other organizations are willing to manufacture dissent simply to derail policies that they find economically or ideologically undesirable.
Another danger of targeting dissent is that it probably stifles legitimate critical voices that are needed both for advancing science and for informing sound policy decisions. Attacking dissent makes scientists reluctant to voice genuine doubts, especially if they believe that doing so might harm their reputations, damage their careers or undermine prevailing theories or much‐needed policies. For instance, a panel of scientists for the US National Academy of Sciences, when presenting a risk assessment of radiation in 1956, omitted wildly different predictions about the potential genetic harm of radiation [16]. They did not include this wide range of predictions in their final report precisely because they thought the differences would undermine confidence in their recommendations. Yet, this information could have been relevant to policy‐makers. As such, targeting dissent as an obstacle to public policy might simply reinforce self‐censorship and stifle legitimate and scientifically informed debate. If this happens, scientific progress is hindered.
Second, even if the public has mistaken beliefs about science or the state of knowledge in the science in question, focusing on dissent is not an effective way to protect public policy from false claims. It fails to address the presumed cause of the problem—the apparent lack of understanding of the science by the public. A better alternative would be to promote the public's scientific literacy. If the public were educated to better assess the quality of the dissent and thus disregard instances of ideological, unsupported or unsound dissent, dissenting voices would not have such a negative effect. Of course, one might argue that educating the public would be costly and difficult, and that therefore the public should simply listen to scientists about which dissent to ignore and which to consider. This is, however, a paternalistic attitude that requires the public to remain ignorant ‘for their own good’; a position that seems unjustified on many levels, as there are better alternatives for addressing the problem.

Moreover, silencing dissent, rather than promoting scientific literacy, risks undermining public trust in science even if the dissent is invalid. This was exemplified by the 2009 case of hacked e‐mails from a computer server at the University of East Anglia's Climate Research Unit (CRU). After the selective leaking of the e‐mails, climate scientists at the CRU came under fire because some of the quotes, which were taken out of context, seemed to suggest that they were fudging data or suppressing dissenting views [28,29,30,31]. The stolen e‐mails gave further ammunition to those opposing policies to reduce greenhouse‐gas emissions, as they could use accusations of a data ‘cover up’ as proof that climate scientists were not being honest with the public [29,30,31]. It also allowed critics to present climate scientists as conspirators who were trying to push a political agenda [32]. As a result, although the ‘climategate’ e‐mails revealed nothing scientifically inappropriate, the affair undermined the public's trust in climate science [33,34,35,36].
A significant amount of evidence shows that the ‘deficit model’ of public understanding of science, as described above, is too simplistic to account correctly for the public's reluctance to accept particular policy decisions [37,38,39,40]. It ignores other important factors such as people's attitudes towards science and technology, their social, political and ethical values, their past experiences and the public's trust in governmental institutions [41,42,43,44]. The development of sound public policy depends not only on good science, but also on value judgements. One can agree with the scientific evidence for the safety of GMOs, for instance, but still disagree with the widespread use of GMOs because of social justice concerns about the developing world's dependence on the interests of the global market. Similarly, one need not reject the scientific evidence about the harmful health effects of sugar to reject regulations on sugary drinks. One could rationally challenge such regulations on the grounds that informed citizens ought to be able to make free decisions about what they consume. Whether or not these value judgements are justified is an open question, but the focus on dissent hinders our ability to have that debate.
As such, targeting dissent completely fails to address the real issues. The focus on dissent, and the threat that it seems to pose to public policy, misdiagnoses the problem as one of the public misunderstanding science, its quality and its authority. It assumes that scientific or technological knowledge is the only relevant factor in the development of policy and it ignores the role of other factors, such as value judgements about social benefits and harms, and institutional trust and reliability [45,46]. The emphasis on dissent, and thus on scientific knowledge, as the only or main factor in public policy decisions does not give due attention to these legitimate considerations.
Furthermore, by misdiagnosing the problem, targeting dissent also impedes more effective solutions and prevents an informed debate about the values that should guide public policy. By framing policy debates solely as debates over scientific facts, the normative aspects of public policy are hidden and neglected. Relevant ethical, social and political values fail to be publicly acknowledged and openly discussed.
Controversies over GMOs and climate policies have called attention to the negative effects of dissent in the scientific community. Based on the assumption that the public's reluctance to support particular policies is the result of their inability to properly understand scientific evidence, scientists have tried to limit dissenting views that create doubt. However, as outlined above, targeting dissent as an obstacle to public policy probably does more harm than good. It fails to focus on the real problem at stake—that science is not the only relevant factor in sound policy‐making. Of course, we do not deny that scientific evidence is important to the development of public policy and behavioural decisions. Rather, our claim is that this role is misunderstood and often oversimplified in ways that actually contribute to problems in developing sound science‐based policies.
Conflict of Interest
The authors declare that they have no conflict of interest.
References
Michaels D (2008) Doubt is their Product: How Industry's Assault on Science Threatens your Health. Oxford, UK: Oxford University Press
Oreskes N, Conway EM (2010) Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York, USA: Bloomsbury
Michaels PJ (2005) Shattered Consensus: The True State of Global Warming. Lanham, Maryland, USA: Rowman & Littlefield
Enstrom JE, Kabat GC (2003) Environmental tobacco smoke and tobacco related mortality in a prospective study of Californians, 1960–98. BMJ 326: 1057
Vastag B (2001) Congressional autism hearings continue: no evidence MMR vaccine causes disorder. JAMA 285: 2567–2569
Shrader‐Frechette K (2010) Conceptual analysis and special‐interest science: toxicology and the case of Edward Calabrese. Synthese 177: 449–469
Webb A, Kench P (2010) The dynamic response of reef islands to sea‐level rise: evidence from multi‐decadal analysis of island change in the central pacific. Global Planet Change 72: 234–246
Chapman P (2010) Pacific islands ‘growing not shrinking’ due to climate change [online]. The Telegraph 3 Jun. http://www.telegraph.co.uk/news/worldnews/australiaandthepacific/tuvalu/7799503/c‐islands‐growing‐not‐shrinking‐due‐to‐climate‐change.html
Wakefield AJ et al (1998) Ileal‐lymphoid‐nodular hyperplasia, non‐specific colitis, and pervasive developmental disorder in children. Lancet 351: 637–641
Geier DA, Geier MR (2006) A meta‐analysis epidemiological assessment of neurodevelopmental disorders following vaccines administered from 1994 through 2000 in the United States. Neuro Endocrinol Lett 27: 401–413
Geier DA, Geier MR (2007) A prospective study of thimerosal‐containing Rho(D)‐immune globulin administration as a risk factor for autistic disorders. J Matern Fetal Neonatal Med 20: 385–390
Omer SB, Salmon DA, Orenstein WA, deHart MP, Halsey N (2009) Vaccine refusal, mandatory immunization, and the risks of vaccine‐preventable diseases. N Engl J Med 360: 1981–1988
Glanz JM, McClure DL, Magid DJ, Daley MF, France EK, Salmon DA, Hembridge SJ (2009) Parental refusal of pertussis vaccination is associated with an increased risk of pertussis infection in children. Pediatrics 123: 1446–1451
Jansen VA, Stollenwerk N, Jensen HJ, Ramsay ME, Edmunds WJ, Rhodes CJ (2003) Measles outbreaks in a population with declining vaccine uptake. Science 301: 804
Brown KF, Kroll JS, Hudson MJ, Ramsay M, Green J, Long SJ, Vincent CA, Fraser G, Sevdalis N (2010) Factors underlying parental decisions about combination childhood vaccinations including MMR: a systematic review. Vaccine 28: 4235–4248
Beatty J (2006) Masking disagreement among experts. Episteme 3: 52–67
IPCC (2007) Climate Change 2007: Synthesis Report. Geneva, Switzerland: Intergovernmental Panel on Climate Change. http://www.ipcc.ch/pdf/assessment‐report/ar4/syr/ar4_syr.pdf
Weare B (2000) Insights into the importance of cloud vertical structure in climate. Geophys Res Lett 27: 907–910
Rind D, Kahn RA, Chin M, Schwartz SE, Remer LA, Feingold G, Yu H, Quinn PK, Halthorne R (2009) The way forward. In Atmospheric Aerosol Properties and Climate Impacts (eds Chin M, Kahn R, Schwartz S), pp 98–104. Washington DC, USA: National Aeronautics and Space Administration
Oppenheimer M, O'Neill BC, Webster M, Agrawala S (2007) Climate change. The limits of consensus. Science 317: 1505–1506
Pidgeon N, Fischhoff B (2011) The role of social and decision sciences in communicating uncertain climate risks. Nat Clim Change 1: 35–41
Schiermeier Q (2010) The real holes in climate science. Nature 463: 284–287
van der Sluijs JP, van Est R, Riphagen M (2010) Beyond consensus: reflections from a democratic perspective on the interaction between climate politics and science. Curr Opin Environ Sustainability 2: 409–415
The Ottawa Citizen (2006) Esteemed Ottawa scientist says cosmic rays, not greenhouse gases, cause global warming [online]. 16 Mar. http://www.canada.com/ottawacitizen/news/story.html?id=875fbe67‐8c67‐4d12‐ab86‐a9c5f5250d0f&k=82908
Elliott K (2011) Is a Little Pollution Good for You? Incorporating Societal Values in Environmental Research. New York, USA: Oxford University Press
Rosi‐Marshall EJ, Tank JL, Royer TV, Whiles MR, Evans‐White M et al (2007) Toxins in transgenic crop byproducts may affect headwater stream ecosystems. Proc Natl Acad Sci USA 104: 16204–16208
Waltz E (2009) GM crops: battlefield. Nature 461: 27–32
Schiermeier Q (2009) Storm clouds gather over leaked climate e‐mails. Nature 462: 397
Hickman L, Randerson J (2009) Climate sceptics claim leaked e‐mails are evidence of collusion among scientists. The Guardian 20 Nov
Johnson K (2009) Climate e‐mails stoke debate. The Wall Street Journal 23 Nov
Tierney J (2009) E‐mail fracas shows peril of trying to spin science. The New York Times 1 Dec
Sussman B (2010) Climategate: a veteran meteorologist exposes the global warming scam. Washington DC, USA: WND Books
Rapley C (2012) Climate science: time to raft up. Nature 488: 583–585
Maibach E, Leiserowitz A, Cobb S, Shank M, Cobb KM, Gulledge J (2012) The legacy of climategate: undermining or revitalizing climate science and policy? WIREs Clim Change 3: 289–295
Tollefson J (2010) Climate science: an erosion of trust? Nature 466: 24–26
Maibach E, Witte J, Wilson K (2011) ‘Climategate’ undermined belief in global warming among many American TV meteorologists. Bull Amer Meteor Soc 92: 31–37
Retzbach A, Marschall J, Rahnke M, Otto L, Maier M (2011) Public understanding of science and the perception of nanotechnology: the roles of interest in science, methodological knowledge, epistemological beliefs, and beliefs about science. J Nanopart Res 13: 6231–6244
Zia A, Todd AM (2010) Evaluating the effects of ideology on public understanding of climate change science: how to improve communication across ideological divides? Public Underst Sci 19: 743–761
Kerr A, Cunningham‐Burley S, Amos A (1998) The new genetics and health: mobilizing lay expertise. Public Underst Sci 7: 41–60
Molster C, Charles T, Samanek A, O'Leary P (2009) Australian study on public knowledge of human genetics and health. Public Health Genomics 12: 84–91
Irwin A, Wynne B (1996) Misunderstanding Science? The Public Reconstruction of Science and Technology. Cambridge, UK: Cambridge University Press
Bucchi M (1998) Science and the Media: Alternative Routes in Scientific Communication. London, UK: Routledge
Mikulak A (2011) Mismatches between ‘scientific’ and ‘non‐scientific’ ways of knowing and their contributions to public understanding of science. Integr Psychol Behav Sci 45: 201–215
Wagner W (2007) Vernacular science knowledge: its role in everyday life communication. Public Underst Sci 16: 7–22
Finucane ML, Holup JL (2005) Psychosocial and cultural factors affecting the perceived risk of genetically modified food: an overview of the literature. Soc Sci Med 60: 1603–1612
Wynne B (2001) Creating public alienation: expert cultures of risk and ethics on GMOs. Sci Cult (Lond) 10: 445–481
Biographies
Inmaculada de Melo‐Martín is at the Division of Medical Ethics, Department of Public Health, Weill Cornell Medical College in New York, USA. E‐mail: [email protected]

Kristen Intemann is at the Department of History & Philosophy, Montana State University, Bozeman, Montana, USA. E‐mail: [email protected]

Submission history
Published online: 8 February 2013
Published in issue: March 2013
Copyright
Copyright © 2013 European Molecular Biology Organization.