Prof. RNDr. Vanda Boštíková, Ph.D., and Prof. RNDr. Aleš Macela, DrSc.

Information is “factual data”, which includes knowledge that influences individuals as well as subsequently society in their choices, actions, efforts, and endeavors. We get information from many sources, including books, television, radio, newspapers, magazines, advertisements, the Internet, experts, friends, and the ever-expanding sphere of social media.
Unfortunately, rumors and myths also circulate as information. They spread quickly and without clear direction. Health information is a particularly sensitive category: incorrect health information can be very dangerous because it can distract us from taking appropriate action that would otherwise help protect our health.
Identifying misinformation that threatens public health programmes is a necessary first step in developing methods to combat it. Based on current analyses, a new approach to eliminating the harmful effects of misinformation campaigns is being considered, built on so-called prevention pathways. This involves assessing and evaluating conspiracy theories before they go viral so that public health professionals can develop effective, factual information, messages, and programmes to actively counteract misconceptions.
During the COVID-19 pandemic, people were exposed to a multitude of reports, conflicting guidance, summaries of various data and facts, infographics, research, opinions, rumors, myths, and untruths. The World Health Organization and the United Nations have referred to this unprecedented dissemination of information as an “infodemic” (1-3).
A paper by Gerts et al. (2021) (4) analyzed conspiracy theories and misinformation spread on Twitter in the first year of the pandemic. The authors extracted 120 million tweets by keyword for the period January to May 2020. The analysis focused on the sequential development of four selected conspiracy theories and associated misinformation: (1) 5G technology is somehow linked to COVID-19, (2) there is a link between HIV and COVID-19, (3) SARS-CoV-2 was man-made and escaped (either intentionally or accidentally) from a laboratory, and (4) the COVID-19 vaccine may be harmful, for example because it may contain a microchip.
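The keyword-based extraction and theme tracking described above can be sketched in a few lines of code. The keyword sets, tweet texts, and function names below are purely illustrative assumptions for the sake of the sketch; they are not the actual pipeline or keyword lists used by Gerts et al.

```python
from collections import Counter

# Illustrative keyword sets for the four conspiracy themes
# (assumed for this sketch, not the study's actual keywords).
THEME_KEYWORDS = {
    "5g": {"5g", "tower", "radiation"},
    "hiv": {"hiv", "gp120", "inserts"},
    "lab_origin": {"wuhan lab", "man-made", "bioweapon"},
    "vaccine": {"vaccine", "microchip"},
}

def classify_tweet(text: str) -> list[str]:
    """Return the themes whose keywords appear in the tweet text."""
    lowered = text.lower()
    return [theme for theme, words in THEME_KEYWORDS.items()
            if any(word in lowered for word in words)]

def theme_counts(tweets: list[str]) -> Counter:
    """Count how many tweets touch each theme (a tweet may match several)."""
    counts = Counter()
    for text in tweets:
        counts.update(classify_tweet(text))
    return counts

# Toy corpus standing in for the 120 million keyword-matched tweets.
tweets = [
    "New 5G tower went up and now everyone is sick",
    "The vaccine contains a microchip, wake up!",
    "Weather is nice today",
]
print(theme_counts(tweets))
```

A real study would of course add language filtering, deduplication, and topic modeling on top of such a keyword pass; the sketch only shows the first filtering step.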
The Four Conspiracies
The background of the first observed conspiracy theory dates to 2018, when mobile operators started offering 5G mobile service (5), which necessitated the installation of new cell towers (6). These new towers became the source of general speculation that the signal is harmful to humans and that this danger is being “covered up by the powerful in the telecommunications industry” (7). Wireless technologies have been consistently accused of damaging the human immune system (similar theories also emerged with the introduction of 2G, 3G, 4G, and Wi-Fi services) (7). Even the Russian flu of 1889 was allegedly caused by the then-new technology of electric lighting (8). The COVID-19/5G conspiracy theory evolved from a fringe opinion into a highly shared and widespread conspiracy theory as a result of a Twitter hashtag (9).
The conspiracy theory about a connection between HIV and COVID-19 was born on January 31, 2020, with the publication of the preprint “Uncanny similarity of unique inserts in the 2019-nCoV spike protein to HIV-1 gp120 and Gag” (10). Before its retraction, Anand Ranganathan, a molecular biologist with over 200,000 followers on Twitter, reposted it, citing the preprint as evidence of a possible laboratory origin in his now-deleted tweet: “Oh my god. Indian scientists have just discovered HIV (AIDS)-like insertions in the 2019-nCoV virus that are not found in any other coronavirus. They hint at the possibility that this Chinese virus was designed…” Within two hours, Ross Douthat, a prominent columnist for the New York Times, retweeted Ranganathan to his >140,000 followers, further legitimizing the theory through a reputable news outlet and greatly expanding the impact of the report beyond the scientific community (11). The preprint was retracted three days after publication.
The laboratory origin theory of SARS-CoV-2 also gained political attention; US President Donald Trump spoke of evidence for a Chinese laboratory origin of SARS-CoV-2 (12), prompting a reaction from the Chinese government on its Twitter account. Other conspiracy theories quickly emerged, including the notion that the virus was created to achieve a reduction in the global population or to impose quarantines, as well as travel bans (13, 14).
Vaccine-related articles on social media are often shared by people who have relatively little knowledge of, and a dislike for, vaccination (15); the content of such messages includes refuted associations with autism and general distrust of governments and the pharmaceutical industry. In the context of emerging diseases, conspiracy theories quickly arose concerning, for example, the possibility of profiting from vaccines while “exposing” a conspiracy of US pharmaceutical companies (16, 20, 21).

Tools Against Disinformation
The Office of the U.S. Surgeon General, headed by Dr. Vivek Murthy, has urgently and repeatedly alerted the American public to the threat of health misinformation and called for a society-wide solution. The following priorities are recommended (1, 21):
- Equip Americans with the tools to recognize misinformation, make responsible choices about what information to share next, and address health misinformation in their communities in partnership with trusted local leaders.
- Expand research that will deepen our understanding of health misinformation, including how it spreads and evolves, how and why it affects people, who is most susceptible to it, and what strategies are most effective in addressing it.
- Make changes in product design and policy on technology platforms to slow the spread of misinformation.
- Invest in long-term programs and training in media, science, digital, data, and health literacy for health professionals, journalists, librarians, and others to build resilience to health misinformation.
- Convene federal, state, local, territorial, tribal, private, nonprofit, and research partners to explore the impact of health misinformation, identify appropriate practices to prevent and address it, make recommendations, and find common ground in difficult questions. Further, develop appropriate legal and regulatory measures that target health misinformation while protecting user privacy and freedom of expression.
Recommendations for the various actors involved in the creation, dissemination, and spread of information disorders are discussed below.
Recommendations for families and individuals
- While many people share misinformation, they do not have to do so intentionally. They are only trying to inform others and are not aware that the information is false. Social media channels, blogs, forums, and group chats allow people to follow a variety of news sources. However, not every social media post can be considered credible. And misinformation can run rampant even in group texts or emails between friends and family. So, verify the accuracy of information by checking it with credible sources. If you’re not sure, don’t share it.
- Involve your friends and family in addressing health misinformation. If someone you care about has a misconception, you can reach out to them by first trying to understand, not judge. Try to listen with empathy, find common ground with them, ask questions, and provide alternative explanations, including pointing out sources of information. Remain calm and do not expect success from one conversation.
Recommendations for educators and educational institutions
- Strengthen and expand the use of evidence-based educational programs that build resilience to misinformation. Media, science, digital, data, and health literacy programmes should be implemented in all educational settings: primary, secondary, and post-secondary schools, as well as community settings. In addition to teaching how to better discern the credibility of news, educators should address a broader set of topics, such as information overload, internet infrastructure (IP addresses, metadata), content moderation issues, the impact of algorithms on digital outputs, AI-generated misinformation (deepfakes), and visual verification skills. It is recommended to familiarize students and the public with the mechanisms and tactics of online disinformation dissemination.
- Draw attention to the typical disinformation tactics used by those who deny the scientific consensus on health issues: presenting unqualified people as experts, setting unrealistic expectations for scientific research, and misinterpreting scientific results, for example when lay people who do not understand expert explanations translate and explain them in misleading ways (1, 17).
Recommendations for health professionals and health care institutions
- Physicians and nurses are highly trusted by the public and can be effective in helping to address health misinformation (1, 18). If possible, correct misinformation in an individualized manner, and consider using lay, less technical language that is accessible to all patients (in the United States, there has long been a strong emphasis on this point; health information is presented in a way that a fifth-grade elementary school child can understand). Use technology and media platforms to share accurate health information with the public (1, 19).
- Other recommendations listed here have already been put into practice in the United States. These are recommendations for particular professional groups that can be very effective in shaping society’s attitude toward, and responsibility for, the spread of information disorders. Professional associations are already beginning to act as consultants to American journalists on health issues. There is an effort to translate the results of peer-reviewed research and professional opinion for journalists in an effective and understandable manner. In addition, hospital systems are newly partnering with community members to create local health messages. Healthcare organizations are offering training for physicians on how to address misinformation in a way that takes into account the diverse needs, concerns, backgrounds, and experiences of patients. We provide individual recommendations here in the belief that they may provide motivation for developing one’s own system of defense against the spread of information disorders, whatever their nature (20).
Recommendations for journalists and media organizations
- Journalists, editors, and others in the media space need to be trained to recognize, correct, and not spread misinformation. New training programmes and courses need to be developed. The media need to start actively addressing the questions raised by the public: if something is new – a vaccine, for example – people naturally ask questions.
- By anticipating and proactively answering questions, media organizations and journalists can help prevent misinformation and increase the public’s health and information literacy. It is essential to provide context to the public to avoid distorting perceptions of ongoing debates on health topics. When conflicting expert opinions are discussed, lay people need to be provided with information about the position of the scientific community and the strength of the available evidence for different views. Framings such as “How much disagreement is there among experts?” or “Is a given explanation plausible even if it is unlikely?” can be counterproductive and should be used with caution.
- Journalists should, as a matter of principle, limit headlines and images that shock or provoke; for many readers, the headline is the only thing they read and remember, while the body text, which may not support it, escapes their attention. The images that accompany such messages are often shared on social media along with the headlines and can easily be manipulated, used out of context, and very easily become misinformation in themselves.
Recommendations for technology platforms
- Redesign recommender algorithms to avoid reinforcing misinformation, build in “friction elements” (suggestions and warnings) to reduce the sharing of misinformation, and make it easier for users to call out real or perceived misinformation. Provide researchers with access to useful data to properly analyze the spread and impact of misinformation.
- Researchers need data on what people actually see and hear, not just what they engage with, as well as data on which content is moderated (e.g., flagged, removed, downgraded), including data on automated accounts that spread misinformation. To protect user privacy, data can be anonymized and provided with user consent.
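One common way to implement the anonymization mentioned above is to replace account identifiers with salted one-way hashes before sharing data with researchers: the same account always maps to the same token, so behavior can be tracked over time, but the token cannot be reversed without the platform’s secret key. The sketch below is a generic illustration under that assumption, not any platform’s actual practice.

```python
import hashlib
import hmac

# Secret key held by the platform and never shared with researchers.
# (Illustrative value; in practice this would be a securely generated
# random key stored in a secrets manager.)
SECRET_SALT = b"platform-secret-key"

def pseudonymize(user_id: str) -> str:
    """Replace a user ID with a stable, one-way pseudonym.

    Uses HMAC-SHA256 keyed with the platform's secret salt, so plain
    dictionary attacks on known usernames do not work without the key.
    """
    return hmac.new(SECRET_SALT, user_id.encode(), hashlib.sha256).hexdigest()

# A shared research record carries the pseudonym instead of the handle.
record = {"user": pseudonymize("@example_account"), "flagged": True}
assert pseudonymize("@example_account") == record["user"]  # stable mapping
```

Stability of the mapping is what makes longitudinal analysis (e.g., identifying repeat spreaders) possible while withholding real identities.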
- Platforms should increase the number of staff in multilingual content moderation teams and improve the effectiveness of machine learning algorithms in languages other than English (recommendation applies to the United States but is valid globally) (3).
- Platforms should also address misinformation in live streams, which is more difficult to moderate due to its temporary nature and the use of audio and video. However, some recordings should be analyzed and lessons learned. Early detection of disinformation ‘super-spreaders’ and repeat offenders should be prioritized. Introduce and communicate clear consequences for accounts that repeatedly violate the platform’s policies.
- Evaluate the effectiveness of internal policies and procedures for dealing with misinformation and report transparently on the findings. Publish standardized measures of how often and through what channels users are exposed to misinformation, what types of misinformation are most common, and what proportion of misinformation is addressed promptly. Communicate why certain content is flagged or removed. Highlight messages from credible sources and trusted subject-matter experts. Once these measures have been applied, analyze their consequences; one typical consequence is the migration of users to less moderated platforms.
What scientists and research organizations can do
- There is an urgent need to comprehensively quantify the damage to human health caused by misinformation; to determine how and under what conditions misinformation arises and influences people’s beliefs and behavior; and to identify the consequences for their health. Related questions include what role emotions and other factors of human perception play in the mechanism of “catching” misinformation, and what the costs to society are if misinformation spreads unchecked. Adapting information interventions to the needs of specific population groups has proven effective, since different subpopulations perceive misinformation differently.
- The effects and perceptions of misinformation need to be studied in terms of its dissemination and reception, taking into account which population group is involved (e.g., African-Americans, Hispanics, Asians; again a recommendation for the United States, but valid worldwide). A very important question is whether it is possible to build resistance to misinformation through ‘inoculation’ methods such as ‘prebunking’. Prebunking, or preemptive debunking, involves warning people about misinformation they may encounter so that they are less likely to believe it once they come across it; debunking, on the other hand, involves correcting misinformation after someone has been exposed to it (1, 29).
What funders and foundations can do
- It is recommended that funders and foundations move urgently to make coordinated and large-scale investments in countering disinformation. Evaluate funding portfolios to secure meaningful, multi-year commitments for research programs. Invest in quantifying the damage caused by misinformation and identifying evidence-based interventions. Focus on areas facing a lack of private and public funding. Examples include independent and local journalism, platform accountability mechanisms, and community health literacy programs. It is also recommended to provide training and resources for grantees working in communities disproportionately affected by misinformation (e.g., in areas with lower trust in vaccination).
- Encourage coordination among grantees to maximize results, avoid duplication, and bring together a diversity of expertise. Encourage coordination on monitoring health misinformation in multiple languages (again, relevant to the United States, but applicable globally).
What governments can do (general recommendations from a US perspective)
- Convene federal, state, local, territorial, tribal, private, nonprofit, and research partners to explore the impact of health misinformation and identify approaches to prevent and address it; subsequently, make recommendations and find common ground on difficult issues, including appropriate legal and regulatory measures that address health misinformation while protecting user privacy and freedom of expression. Increased investment in research on misinformation is recommended, to better define it, document it, assess its harms, and identify practices to prevent and address it across media and communities.
- It is important to continue to modernize public health communication with respect to understanding health issues, concerns, and perceptions, especially among hard-to-reach populations (homeless people, people with low levels of education). Proactively and quickly publish accurate and understandable population health information in an online environment. Invest in fact-checking and rumor-checking mechanisms (1, 30). Support the creation of teams within public health agencies that can identify local patterns of misinformation, and train researchers on misinformation and the infodemic in public health. Expand efforts to build long-term resilience to misinformation by supporting educational programs that help people distinguish evidence-based information from opinions and personal stories (21-27).
The proposal presented here seems very reasonable and sensible and, if implemented consistently in daily practice, can effectively reduce the intensity of the spread of demagogic messages. It is therefore desirable that a suitable systemic form be found to address each of these points and facilitate their implementation. Even if the recommendations presented can be put into practice, the effect cannot be expected to be immediate; it will take time for society and individual professional groups to take these recommendations on board. Moreover, there is the permanent complication that knowledge evolves and today’s truth may no longer be the truth tomorrow; the world and its knowledge are constantly changing, and truth can therefore seem unpredictable (28).
It is also important to note that these recommendations took a fundamentally different direction in the United States after Donald Trump’s second victory in the presidential election. The current Secretary of Health and Human Services, Robert F. Kennedy, Jr., is an active and well-known opponent of vaccination and himself promotes misinformation and myths about COVID-19 as well as other topics, such as a purported link between vaccines and autism. Similarly, the important position of Surgeon General, previously held by Dr. V. Murthy, remains vacant.
Sources
- Larson HJ. The biggest pandemic risk? Viral misinformation. Nature. 2018;562(7727):309. doi: 10.1038/d41586-018-07034-4
- Lewandowsky S, Cook J. The Conspiracy Theory Handbook. George Mason University Center for Climate Change Communication. 2020. https://www.climatechangecommunication.org/wp-content/uploads/2020/03/ConspiracyTheoryHandbook.pdf
- Swami V, Voracek M, Stieger S, et al. Analytic thinking reduces belief in conspiracy theories. Cognition. 2014;133(3):572–85. doi: 10.1016/j.cognition
- Gerts D, Shelley CD, Parikh N, et al. “Thought I’d Share First” and Other Conspiracy Theory Tweets from the COVID-19 Infodemic: Exploratory Study. JMIR Public Health Surveill. 2021;7(4):e26527. doi: 10.2196/26527
- Oswald E, de Looper C. Verizon 5G: everything you need to know. Digital Trends. 2020. https://www.digitaltrends.com/mobile/verizon-5g-rollout/
- Holmes A. 5G cell service is coming. Who decides where it goes? New York Times. 2018. https://www.nytimes.com/2018/03/02/technology/5g-cellular-service.html
- Morgan A. What is the truth behind the 5G coronavirus conspiracy theory? Euronews. 2020. https://www.euronews.com/2020/05/15/what-is-the-truth-behind-the-5g-coronavirus-conspiracy-theory-culture-clash
- Knapp A. The original pandemic: unmasking the eerily familiar conspiracy theories behind the Russian flu of 1889. Forbes. 2020. https://www.forbes.com/sites/alexknapp/2020/05/15/the-original-plandemic-unmasking-the-eerily-parallel-conspiracy-theories-behind-the-russian-flu-of-1889/#3c4e54ff50d5
- Ahmed W, Vidal-Alaball J, Downing J, et al. COVID-19 and the 5G conspiracy theory: social network analysis of Twitter data. J Med Internet Res. 2020;22(5):e19458. doi: 10.2196/19458. https://www.jmir.org/2020/5/e19458/
- Pradhan P, Pandey A, Mishra A. Uncanny similarity of unique inserts in the 2019-nCoV spike protein to HIV-1 gp120 and Gag. bioRxiv [preprint, withdrawn]. doi: 10.1101/2020.01.30.927871
- Samorodnitsky D. Don’t believe the conspiracy theories you hear about coronavirus and HIV. Especially if you work for the New York Times. Massive Science. 2020. https://massivesci.com/notes/wuhan-coronavirus-ncov-sars-mers-hiv-human-immunodeficiency-virus/
- Singh M, Davidson H, Borger J. Trump claims to have evidence coronavirus started in Chinese lab but offers no details. The Guardian. 2020. https://www.theguardian.com/us-news/2020/apr/30/donald-trump-coronavirus-chinese-lab-claim
- Wallbank D, Bloomberg. Twitter applies another fact check—this time to China spokesman’s tweets about virus origins. Fortune. 2020. https://fortune.com/2020/05/28/twitter-fact-check-zhao-lijian-coronavirus-origin/
- Twitter flags China spokesman’s tweet on COVID-19. Reuters. 2020. https://www.reuters.com/article/us-twitter-china-factcheck/twitter-flags-china-spokesmans-tweet-on-covid-19-idUSKBN23506I
- Wang Y, McKee M, Torbica A, et al. Systematic literature review on the spread of health-related misinformation on social media. Soc Sci Med. 2019;240:112552. doi: 10.1016/j.socscimed.2019.112552. https://linkinghub.elsevier.com/retrieve/pii/S0277-9536(19)30546-5
- Feuer A. The Ebola conspiracy theories. New York Times. 2014. https://www.nytimes.com/2014/10/19/sunday-review/the-ebola-conspiracy-theories.html
- The Event 201 scenario. Event 201. https://www.centerforhealthsecurity.org/event201/scenario.html
- Ohlheiser A. How covid-19 conspiracy theorists are exploiting YouTube culture. MIT Technology Review. 2020. https://www.technologyreview.com/2020/05/07/1001252/youtube-covid-conspiracy-theories/
- Chou WS, Hunt YM, Beckjord EB, Moser RP, Hesse BW. Social media use in the United States: implications for health communication. J Med Internet Res. 2009;11(4):e48. doi: 10.2196/jmir.1249. https://www.jmir.org/2009/4/e48/
- Office of the U.S. Surgeon General. Health Misinformation. https://www.hhs.gov/surgeongeneral/priorities/health-misinformation/index.html
- Deutch G. How one particular coronavirus myth went viral. Wired. 2020. https://www.wired.com/story/opinion-how-one-particular-coronavirus-myth-went-viral/
- Kouzy R, Abi Jaoude J, Kraitem A, et al. Coronavirus goes viral: quantifying the COVID-19 misinformation epidemic on Twitter. Cureus. 2020;12(3):e7255. doi: 10.7759/cureus.7255. http://europepmc.org/abstract/MED/32292669
- WHO, UN, UNICEF, UNDP, UNESCO, UNAIDS, ITU, UN Global Pulse, & IFRC. Managing the COVID-19 infodemic: Promoting healthy behaviours and mitigating the harm from misinformation and disinformation. World Health Organization. https://www.who.int/news/item/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation
- Sell TK, Hosangadi D, Smith E, et al. National priorities to combat misinformation and disinformation for COVID-19 and future public health threats: A call for a national strategy. Johns Hopkins Center for Health Security. https://www.centerforhealthsecurity.org/our-work/publications/national-priorities-to-combat-misinformation-and-disinformation-for-covid-19
- Roozenbeek J, Schneider CR, Dryhurst S, et al. Susceptibility to misinformation about COVID-19 around the world. Royal Society Open Science. 2020;7(10):201199. doi: 10.1098/rsos.201199
- Chou WYS, Gaysynsky A, Cappella JN. Where we go from here: Health misinformation on social media. American Journal of Public Health. 2020;110:S273–S275. doi: 10.2105/AJPH.2020.305905
- Vraga E, Bode L. Defining misinformation and understanding its bounded nature: Using expertise and evidence for describing misinformation. Political Communication. 2020;37(1):136–144. doi: 10.1080/10584609.2020.1716500
- Chou WS, Oh A, Klein WMP. Addressing health-related misinformation on social media. JAMA, 2018, 320(23), 2417–24
- Jost J, Van der Linden S, Panagopoulos C, et al. Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Current Opinion in Psychology, 2018, 23, 77-83. http://doi.org/10.1016/j.copsyc.2018.01.003
- Wechsler H. Immunity and security using holism, ambient intelligence, triangulation, and stigmergy: Sensitivity analysis confronts fake news and COVID-19 using open set transduction. J Ambient Intell Humaniz Comput. 2023;14(4):3057-3074. doi: 10.1007/s12652-021-03434-z. PMID: 34457079
*This text has received support from the National Recovery Plan under project 1.4 CEDMO 1 – Z220312000000, Support for increasing the impact, innovation, and sustainability of CEDMO in the Czech Republic, which is financed by the EU Recovery and Resilience Facility.
