Spotting Pseudoscience Primer
Features of Science Misinformation/Disinformation Efforts: Understand how to detect false information
by Michael P. Clough, Benjamin C. Herman, and Joseph Taylor
It ain’t what you don’t know that gets you into trouble.
It’s what you know for sure that just ain’t so.
– Mark Twain
Primer
We all depend on accurate information to make important decisions. But therein lies the rub. How is one to know what information is accurate, misleading, or even false? Making such a determination has always required judgment, but even more vigilance is required in the information age. Neil Postman (1992), in Technopoly: The surrender of culture to technology, warned that the information glut is an assault on informed decision-making. Faced with a relentless flow of easily accessible information that increasingly lacks control mechanisms, citizens and policymakers struggle to discriminate accurate information from misinformation (false information) and disinformation (the deliberate dissemination of misleading or biased information). This is particularly difficult when dealing with science and technology, as few citizens have the knowledge to accurately judge the information being disseminated.
Science disinformation campaigns are especially problematic for the public and policymakers because such efforts deliberately cast misinformation as credible, creating controversy and doubt where they do not belong (Harker, 2015). For example, when the link between lung cancer and smoking became well-established, tobacco companies, rather than directly attack the science, purposely put forward disinformation to cast doubt on the scientific consensus and create uncertainty among the public (SRITA, 2020; Glantz et al., 1996). Perhaps learning from the significant successes that science disinformation efforts have had in the past, those behind such efforts now operate more widely and increasingly target socio-scientific issues (social issues that involve science). Given increasingly polarized views regarding societal issues, and the way that media and too many political and business leaders wrongly portray the nature of science and mislead the public regarding the science relevant to socio-scientific issues, citizens must become more adept at detecting the characteristics of science misinformation/disinformation.
The increasing acceptance of false or misleading science information (sometimes referred to as pseudoscience) both reflects and promotes distrust in science, and undermines the personal and societal decision-making on which everyone depends, placing all of society at greater risk (NASEM, 2017; Nichols, 2017; Osborne et al., 2022). To assist in detecting science misinformation/disinformation, the twelve stories appearing on this project website recount historical and contemporary instances of misinformation/disinformation efforts and their common features. To help readers get the most out of the stories and accurately interpret their intent, the following is an overview of the common features of science misinformation/disinformation. Not all features appear in every science misinformation/disinformation story, but understanding each feature will assist in avoiding misleading and false information in personal and societal decision-making.
REFERRING TO AND KEEPING IN MIND THE COMMON FEATURES OF SCIENCE MISINFORMATION/DISINFORMATION WHEN READING THE STORIES WILL HELP YOU BETTER UNDERSTAND MISINFORMATION/DISINFORMATION EFFORTS AND NOT FALL PREY TO THEM.
Hansson (2017) put forward characteristics of science denialism that are useful for detecting misinformation/disinformation. Those characteristics appear in Figure 1 under two categories. The first category, “Epistemological characteristics”, includes features of misinformation/disinformation that are at odds with establishing robust knowledge claims. The second category, “Sociological characteristics”, includes features of science misinformation/disinformation efforts with sociocultural underpinnings. Each of these features is explained in more detail below. Misinformation/disinformation efforts may entail one or more of the features. The twelve project stories address historical or contemporary socio-scientific issues that entail significant misinformation and blatant disinformation efforts, and Red Flag text boxes situated in each reading draw attention to the particular misinformation/disinformation features at work. Each story begins with a statement highlighting the misinformation/disinformation features relevant to that socio-scientific issue, with a link to this reading for a quick refresher of the features below.
Epistemological Characteristics
- Cherry picking
- Neglecting refuting information
- Fabricating a fake science controversy
- Inappropriate criteria for judging evidence
- The science idea is complex and based on strong, but less directly accessible evidence

Sociological Characteristics
- Lack of relevant expertise
- Inability to publish in peer-reviewed outlets
- Claims of conspiracy
- Appeals to the public
- Wrongly claiming large support in the scientific community
- Fierce attacks on legitimate scientists
- Male dominance
- The science idea is perceived as threatening to deniers’ worldview
- Strong political connections
Figure 1. Characteristics of science denialism (Hansson, 2017)
HANSSON HAS PUT FORTH CHARACTERISTICS OF SCIENCE DENIALISM. KNOWING THE CHARACTERISTICS OF SCIENCE DENIALISM IS USEFUL FOR DETECTING SCIENCE MISINFORMATION/DISINFORMATION.
Features of misinformation/disinformation
Cherry picking science evidence and ideas
A large body of evidence from many studies coalesces to support well-established scientific knowledge. But rarely is every individual study fully aligned with the consensus arising from the entire body of evidence. Research is complex and nuanced, and for a variety of reasons some investigations produce data that can be interpreted as at odds with the full body of evidence. Selectively focusing on particular studies and evidence that are at odds with the entire body of relevant studies and evidence is a common feature of misinformation/disinformation efforts. This feature is referred to as cherry-picking because of its highly selective choice of studies and evidence while ignoring the overwhelming number of studies and body of evidence that support the consensus position of the scientific community.
For example, from 1991 to 2015, of the 54,195 peer-reviewed publications on anthropogenic global warming (AGW), only 31 were at odds with AGW. Focusing on any or all of those 31 dissenting articles in the face of the 99.94% of publications supporting AGW is cherry-picking. Cherry-picking may also be done with evidence: selectively analyzing data to support an idea contrary to the entire body of evidence. Cherry-picking is not the same as putting forth the best representative data when publishing or presenting research. The latter is done to help reduce complexity, while cherry-picking misrepresents the overarching data and the conclusions it defensibly supports.
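For concreteness, the 99.94% figure follows directly from the publication counts cited above:

\[
\frac{54{,}195 - 31}{54{,}195} = \frac{54{,}164}{54{,}195} \approx 0.9994 \approx 99.94\%
\]

leaving the 31 dissenting articles at roughly 0.06% of the peer-reviewed literature.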
Neglecting refuting science information
Even well-established scientific knowledge is potentially open to revision. For example, Newtonian physics was well-established for nearly 300 years before Einstein’s mathematically derived work showed the shortcomings of classical physics. That such knowledge may be re-examined and altered is a strength of science. This does not mean such knowledge is easily changed – and for good reasons! Wide-ranging scientific ideas are often faced with anomalies: phenomena that are poorly accounted for or that perhaps even contradict the idea. The reasons for this are varied and nuanced, but the crux of the matter is that comprehensive ideas are not discarded simply because some pieces do not fit. When well-established science knowledge is faced with apparently refuting evidence, the far greater likelihood is that the problem lies with the seemingly disconfirming instance or instances. By contrast, science misinformation/disinformation efforts selectively ignore the body of well-established evidence in order to maintain ideas that have been thoroughly refuted by the community of relevant science experts.
Fabricating a fake science controversy
A feature of science misinformation/disinformation that is particularly difficult for the public and policymakers to detect is the fabrication of a perceived controversy within the scientific community where one does not exist. Playing off most people’s yearning for fairness, manufacturing the impression of a scientific controversy is an effective way to cast doubt and wrongly place misinformation/disinformation on a level playing field with established science ideas. Because laypeople lack expertise in the relevant science area(s), purveyors of science misinformation/disinformation can easily create the false appearance that an idea is considered by the scientific community to be a serious contender to the accepted idea. Many media outlets exacerbate this problem by seeking to provide balanced reporting, even when a particular idea has been overwhelmingly discredited by the scientific community.
Inappropriate criteria for judging a science idea
Established ideas about the natural world are durable, but potentially open to revision in light of new evidence or a reinterpretation of prior evidence. The scientific community’s criteria for assessing scientific knowledge are multifaceted and nuanced, but reasonable and evenhanded. The vetting of research is directed at ensuring appropriate methods and analysis. For much research, a control-treatment experimental approach is not appropriate, and other more suitable approaches are called for. In contrast to the appropriate and fair vetting of scientific research, those spreading science misinformation/disinformation establish criteria customized so that the accepted science idea can almost never satisfy them. Yet purveyors of science misinformation/disinformation do not hold the information they sow to such standards. Freudenburg and Muselli (2013) refer to this as the “Asymmetry of scientific challenge” (p. 777).
The science idea is complex and based on strong, but less directly accessible evidence
Some science research is sufficiently understandable that citizens and policymakers can spot dishonest claims. However, science ideas are typically complex, often defy common-sense thinking, and are not apparent through everyday observation. The struggles many people had when trying to deeply understand science ideas taught in high school should make apparent the challenges in understanding cutting-edge science. The complexity of many science ideas, and the inability of most laypeople to observe, understand, and interpret the relevant evidence, creates an opportunity for science misinformation/disinformation to flourish. When people face a complex science issue, simplistic notions that are more understandable, but misleading or incorrect, are appealing.
Lack of relevant expertise
Most contemporary STEM research is so advanced that even professional scientists are often not qualified to judge studies and evidence outside their limited area of expertise. A characteristic of misinformation/disinformation is that its purveyors rarely possess the relevant expertise regarding the particular issue. Possessing a graduate science, engineering, or mathematics degree, or even having credentials in a particular subdiscipline, confers credibility only on matters in a relatively narrow area.
Inability to publish in peer-reviewed outlets
Science misinformation/disinformation rarely passes through the peer-review process of reputable science publishing outlets. Historically, this made efforts to widely disseminate science misinformation/disinformation more difficult. While past science misinformation/disinformation efforts included creating and distributing publications, including journals, that lacked scientific credibility, today the peer-review process is more easily circumvented. The internet provides an easy platform to bypass the appropriate expert-review process, and well-designed layouts with false claims regarding the legitimacy of expert peer-review are a common feature of science misinformation/disinformation.
Claims of conspiracy
Unable to meet peer-review scrutiny among authentic experts in the relevant area of the scientific community, misinformation/disinformation efforts often make accusations of conspiracy – claiming deliberate underhanded efforts by the scientific community and others to deny the rejected idea credibility. Such claims ignore the size and diversity of a scientific community intent on accurately understanding the natural world. Conspiracy narratives put forward plots that would require worldwide assent among scientists in the relevant areas of research, which is not plausible given the number and diversity of those who conduct research.
Appeals to the public
Unable to pass the expert peer-review process, misinformation/disinformation efforts frequently go straight to the public with their false claims. This has even been done by scientists whose ideas have been discredited by the broader community of experts in their area. Historically, appeals to the public have been made via pamphlets, newspapers, and radio/television interviews, but the internet and social media have made such efforts easily accessible to large unsuspecting audiences. Those serious about getting to the truth of the matter regarding the natural world have their work vetted by the appropriate experts in the scientific community. When work is rejected, researchers genuinely interested in moving science forward conduct further studies in an effort to provide compelling evidence for their claim. Appealing directly to the public is a likely sign of misinformation/disinformation.
Wrongly claiming large support in the scientific community
Misinformation/disinformation efforts convey much greater support in the scientific community than actually exists. This can be accomplished by creating organizations, holding conferences, and establishing websites and even journals – all devoted to the discredited idea. Another tactic is putting forth long lists of individuals possessing PhDs, MDs, and other credentials who lend their support to the discredited idea. However, upon inspection, individuals on these lists often do not possess the scientific expertise required to speak to the issue. When such individuals do possess the relevant expertise, their position must be acknowledged as at odds with the consensus of science experts in that field. For example, the biochemist and Nobel Laureate Linus Pauling maintained that vitamin C would prevent, cure, or lower the risk of all sorts of illnesses, but his claims did not hold up to scrutiny by the community of relevant science experts.
Fierce attacks on legitimate scientists
Those promoting misinformation/disinformation will at times mount serious personal and legal attacks on researchers who publish and present peer-reviewed studies that debunk or are at odds with the misinformation/disinformation. Some scientists have abandoned their research because of the harassment and threats made against them and their families. Disagreements within the scientific community are at times contentious, but they are different from the personal and legal attacks made in misinformation/disinformation efforts.
Male dominance
Science denialism is far more common among males. Males generally report greater self-confidence than females (Bleidorn et al., 2016), and those possessing higher self-confidence may wrongly place their own thinking on par with that of genuine experts. Historically, science misinformation/disinformation efforts have been dominated by males, but this may change going forward as women increasingly become more assertive and hold positions of power.
Strong political connections
Strong ideological convictions, including extreme political views, often drive misinformation/disinformation efforts aimed at subverting established science. Furthermore, those at both ends of the political spectrum, liberals and conservatives, judge misinformation to be more accurate when the source is congruent with their own political ideology. For example, those having strong right-leaning political views were more likely to spread and believe misinformation during the COVID-19 pandemic, while those having strong left-leaning political views have, until recently, been more likely to spread alarmism and misinformation regarding GMOs (Herman et al., 2022; Zimmerman & Eddens, 2018). A study of Facebook sharing (Hopp, Ferrucci, & Vargo, 2020) determined that those at both ends of the political spectrum were most responsible for spreading misleading information. Of the total misinformation shared on Facebook, over 40% came from those at the political extremes: 26% from those on the extreme right and 17.5% from those on the extreme left.
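As a quick check, the “over 40%” figure is simply the sum of the two extremes reported in that study:

\[
26\% + 17.5\% = 43.5\%
\]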
The science idea is perceived as threatening to deniers’ worldview
Everyone develops a set of ideas that help them navigate through life. Those ideas are influenced, in part, by one’s time and place in history, friends and family, society at large, and personal efforts to contemplate important issues. The values, priorities, interests, and beliefs held by a person result in a particular way of thinking about and even viewing the world. The framework a person develops to make sense of experiences is called a worldview, and it encompasses a person’s ideologies, beliefs, values, assumptions, and attitudes. Some of what comprises a person’s worldview reflects purposeful thinking, while much is worked out at a subconscious level and develops without serious contemplation. For example, your preference for styles of music or movies may have resulted from seeking out and sifting through many different genres, or you may simply like what was popular at a formative time in your life. People hold tightly to their worldviews and rarely like them to be questioned. Some science ideas are perceived as threatening because they raise issues that intersect with a person’s worldview. In such cases, misinformation/disinformation that aligns with a person’s worldview is attractive. An underlying motive behind some science misinformation/disinformation efforts is the perception that the science is a threat to a particular worldview.
Questions for self-reflection
1. How can overtly acknowledging one’s worldview, biases, emotions, and preferences assist in: (a) more accurately assessing information, and (b) reflective personal and societal decision-making?
2. How can understanding the common features of science misinformation/disinformation assist in more fairly assessing information?
Evaluating Science Information
The features of science misinformation/disinformation described above are highlighted in Red Flag boxes appearing in the twelve project stories. Carefully think about how what appears in those boxes resulted in the public and policymakers questioning, ignoring, and even rejecting well-supported science and technology. Being on the lookout for the features of misinformation/disinformation is a useful strategy, but keep in mind that they often appear more straightforward in retrospect. Notice that several features of misinformation/disinformation require effort to determine who the authentic experts are and what they say about the information.
A pervasive issue that undermines an accurate determination of the authentic experts and the features of misinformation/disinformation is the frame of mind brought to the task. A large body of research makes apparent that emotions and intuition impact decision-making far more than most are aware (Haidt, 2013). People more often use their sociocultural identity and personal emotions to rapidly assess information and make decisions. This fast thinking plays an important and practical role in daily interactions, but creates significant problems when the task at hand is complex and calls for more careful and laborious deliberation (Kahneman, 2011). Decision-making regarding personal or societal issues involving science is more likely to be accurately informed and thoughtfully made when we are cognizant of our emotions and biases regarding the issue. Accurately identifying your own strong feelings is important, not for eliminating those feelings, but for acknowledging how, if left unchecked, they will likely interfere with sound thinking and decision-making. That honest self-assessment places you on firmer footing for seeking out and accurately assessing the credibility of scientific expertise and information. Also look for signs of bias, ideology, and conflicts of interest in the source of information being assessed.
Strategies for separating accurate science information from misinformation/disinformation:
- Understand that the results from an internet search are affected by many factors, and the results appearing first in a search are not necessarily the most accurate.
- After receiving the results of a search, instead of immediately clicking on the first one or two search results, employ “click restraint”. Click restraint means first scanning the search results to get a sense of the available sources. Scan the titles and URLs appearing on the first page of search results, and read the short description appearing under each title. This initial quick review is crucial because search results are not ranked by trustworthiness!
- Having employed click restraint, open another tab and learn more about a source before visiting it. That second tab is where lateral reading (Brodsky et al., 2021) should be conducted to learn more about websites before visiting them. Seek further information about a website’s credibility and possible biases. Even if the site appears to be a provider of trustworthy information, lateral reading assists in learning about a source’s strengths and limitations.
- Click restraint and lateral reading require acknowledging one’s own limitations in assessing science information, and the need to find credible sources. Humility is crucial! Wise people know the limits of their own understanding, and they are also aware of the biases that interfere with determining credible information. When employing click restraint and lateral reading, reflect on your biases and emotions to ward off confirmation bias: the tendency to seek information that fits one’s biases and emotions.
- Particularly when assessing science information, search well-established and respected science organizations that have a long track record of providing credible information. Examples include the American Association for the Advancement of Science (AAAS); American Medical Association (AMA); American Dental Association (ADA); National Academy of Sciences (NAS); National Oceanic and Atmospheric Administration (NOAA); National Science Foundation (NSF); U.S. Centers for Disease Control and Prevention (CDC); and World Health Organization (WHO). Science information from credible science sources typically coheres, providing greater confidence that the information is accurate.
Many legitimate factors (e.g., economic considerations and values) play a role in socio-scientific decision-making. While accurate science information does not alone determine decision-making, it is crucial for making well-informed decisions on personal and societal issues involving scientific knowledge. Employing the strategies above will assist in detecting science misinformation/disinformation and place decision-making on a more solid foundation. Credible science information comes from the diversity of researchers in the scientific community, both within the U.S. and worldwide, who together restrain the biases that plague misinformation/disinformation efforts. But keep in mind that successfully employing the above strategies will be considerably hindered if citizens and policymakers do not restrain their own personal biases. Unrestrained emotions, ideologies (e.g., political views), and other biases interfere with the effort necessary to detect science misinformation/disinformation.
Questions for self-reflection
3. If faced with uncertainty regarding science information, how do the information-checking strategies assist in detecting misinformation/disinformation?
4. What should a person do who is still uncertain about science information?
References
Bleidorn, W., Arslan, R. C., Denissen, J. J. A., Rentfrow, P. J., Gebauer, J. E., Potter, J., & Gosling, S. D. (2016). Age and gender differences in self-esteem—A cross-cultural window. Journal of Personality and Social Psychology, 111(3), 396-410.
Brodsky, J. E., Brooks, P. J., Scimeca, D., Todorova, R., Galati, P., Batson, M., Brosso, R., Matthews, M., Miller, V., & Caulfield, M. (2021). Improving college students’ fact-checking strategies through lateral reading instruction in a general education civics course. Cognitive Research: Principles and Implications, 6, 23. https://doi.org/10.1186/s41235-021-00291-4
Freudenburg, W. R., & Muselli, V. (2013). Reexamining climate change debates: Scientific disagreement or scientific certainty argumentation methods (SCAMs)? American Behavioral Scientist, 57(6), 777-795. https://doi.org/10.1177/0002764212458274
Glantz, S. A., Slade, J., Bero, L. A., Hanauer, P. & Barnes, D. E. (Eds.) (1996). The cigarette papers. Berkeley: University of California Press. http://ark.cdlib.org/ark:/13030/ft8489p25j/
Haidt, J. (2013). The righteous mind. Penguin Books, Harlow, England.
Hansson, S. O. (2017). Science denial as a form of pseudoscience. Studies in History and Philosophy of Science, 63, 39-47.
Harker, D. (2015). Creating scientific controversies: Uncertainty and bias in science and society. Cambridge University Press, New York.
Herman, B. C., Clough, M. P., & Asha, R. (2022). Socioscientific issues thinking and action in the midst of science-in-the-making. Science & Education. https://link.springer.com/article/10.1007/s11191-021-00306-y
Hopp, T., Ferrucci, P., & Vargo, C. (2020). Why do people share ideologically extreme, false, and misleading content on social media? A self-report and trace data-based analysis of countermedia content dissemination on Facebook and Twitter. Human Communication Research, 46(4), 357-384.
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux, New York.
National Academies of Sciences, Engineering, and Medicine (NASEM) (2017). Communicating science effectively: A research agenda. The National Academies Press, Washington, DC.