The Covid-19 catastrophe: A science communication mess?

Abstract Following the declaration, in March 2020, of the Covid-19 pandemic, there was an escalation of disinformation, involving multiple actors and reaching global dimensions. In this article, we analyze the possible causes and characteristics of the spread of disinformation on this issue. Disinformation about science can be explained by the distance that separates scientific knowledge from common knowledge and the difficult relationship between science and the media. The pandemic has multiplied the number of scientific publications and has accelerated publication rates, which has contributed to the dissemination of provisional, erroneous, or totally false information. A process of politicization has also developed, which has led to misinformation. In addition, the need to confront this health crisis has led society to demand accurate information from science, despite the fact that in many cases there is only uncertainty. The experience of this pandemic highlights the importance of providing citizens with accessible and rigorous knowledge that creates confidence in science. To achieve this, it is necessary to have specialized professionals capable of providing rigorous information, not only on the results but also on the research processes.


Introduction
In December 2019, a novel coronavirus of then-unknown origin was detected in Wuhan, China, and rapidly spread to other countries. On 11 March 2020, the World Health Organization (WHO) declared that Covid-19 could be characterized as a pandemic (the first caused by a coronavirus) and urged countries 'to take urgent and aggressive action' (WHO 2020a).
From then on, many governments around the world announced quarantines of certain population groups, in addition to other measures to prevent the spread of the virus. In this disruptive situation, the media and social networks played a major role in distributing information from public administrations and scientists to the broader population. But the scarcity of available information, especially in the first weeks, coupled with the uncertainty about how the pandemic would evolve, led to the spread of different forms of misinformation and disinformation, such as rumors and hoaxes, which spread through digital media with unprecedented speed. This escalation of misinformation was a cause of concern among governments and health authorities, as it led citizens to make decisions based on inaccurate information (Srivastava et al. 2020).
The escalation of misinformation reached such a high level that the WHO stated that the emergence of the virus and the measures for dealing with it were 'accompanied by a massive "infodemic"', defined as an over-abundance of news, some accurate and some not, that made it difficult for people to find reliable sources which could provide sound guidance on which to base their actions (WHO 2020b, 2). The importance of combatting this 'infodemic' led the WHO to track and refute erroneous information through its website and social media accounts.
This pandemic has given rise to what may be one of the greatest known episodes of misinformation in history. Although misinformation on other science-related matters has proliferated in recent history, in those cases it did not have a comparable scope (Salaverría 2021). Since the virus appeared, 'an indistinguishable mix of unverified information, helpful information, misinformation and intentionally manipulated disinformation' has grown to enormous proportions (Larson 2020, 1). In the first months of the pandemic, The Vaccine Confidence Project scoured social media and gathered more than 240 million pandemic-related posts on websites and social media, averaging 3.08 million posts per day. On Twitter alone, more than 113 million people posted about Covid-19, sharing everything from news stories to speculation about the origin of the virus to dangerous remedies (such as drinking bleach) (Larson 2020).
The #CoronaVirusFacts Alliance, spearheaded by the International Fact-Checking Network (IFCN) and the U.S.-based Poynter Institute, has identified more than 9,000 falsehoods about the coronavirus, disseminated in more than 70 countries and over 40 languages. According to this project, misinformation peaked in March and April of 2020 (the first months after the pandemic was declared) and remained at lower but relatively constant levels until April of 2021, when this article was first written. The information they checked refers to various topics related to Covid-19 (the spread of the virus, vaccines, governments, hospitals, lockdowns, etc.), and this information was then classified as either false, partially false, misleading, or unproven (Poynter Institute 2021). Other transnational projects have developed similar collaborative initiatives to detect false content relating to the pandemic. One example is the Latam Chequea initiative, which is made up of 48 media and fact-checking organizations from Latin America and Spain. Participants in this initiative have reported hundreds of verified pandemic hoaxes to impede the spread of false content in different countries.
The results of a survey conducted in March and April of 2020 across six countries (Argentina, Germany, South Korea, Spain, the United Kingdom, and the United States) indicated that one-third of the respondents claimed to have encountered a lot of fake news, especially fake news spread by ordinary citizens and through social media and messaging apps.
The goal of this article is to analyze the characteristics of misinformation related to scientific and medical content that was spread during this pandemic, as well as the factors that may have contributed to the problem and the lessons that society can learn for future health crises.

The phenomenon of disinformation
In every act of communication, information must meet one precondition to be considered as such: truth. If a publicly disseminated message is not true, then it ceases to be information and falls into another category which, depending on the intention of the communicating subject, lies somewhere between simple error or confusion and deliberate lying.
Differentiating truthful information from distorted messages has indeed become a serious issue of public concern in contemporary democratic societies. The popularization of social media has expanded the ecosystem of public communication, multiplying exponentially the number of messages that reach citizens, while the professional filters to guarantee their veracity are disappearing. By the end of 2020, more than 300 million Twitter users were posting some 350,000 tweets per minute, or around 500 million per day. The number of texts, photos, and videos posted on Facebook was just as colossal, thanks to its nearly 2.5 billion users across the globe. Through the global communication platform offered by these and other social networks, the volume of erroneous, misrepresented, or totally fabricated content that reaches the public has skyrocketed.
The growing severity of this phenomenon has led to increased academic research on misinformation over the past decade. These recent studies follow in the wake of works that were published starting in the first half of the twentieth century. In the context of the rise of totalitarianism and the disaster of the two world wars, these initial works analyzed the characteristics, processes, and consequences of propaganda and its various forms of information falsification (Lasswell 1927). During the Cold War in later decades, these techniques of strategic disinformation were refined until they became a key weapon for the intelligence agencies of the world's major powers. The refined propaganda techniques, in which the audiovisual media took on an increasingly prominent role, were also studied by various authors (Bittman 1985; Snyder 1995). With the popularization of the internet beginning in the 1990s, and especially with the emergence of social media in the first decade of the 21st century, the dynamics of disinformation have expanded beyond the traditional media, such as the press, radio, and television, moving the problem to the world of big data (Salaverría and León 2021).
In recent years, a theoretical framework for the various informational pathologies has been developed around what has been termed the 'information disorders' (Wardle and Derakhshan 2017). This theoretical framework distinguishes three main categories: misinformation, disinformation, and malinformation (Ireton and Posetti 2018; Turčilo and Obrenović 2020). Misinformation refers to erroneous content or involuntary falsehood. Disinformation, on the other hand, alludes to deliberate lying or information fabrication. Finally, malinformation refers to true content which, however, should not be publicly disclosed for ethical reasons. For example, an act of malinformation is committed when the identity of a victim of sexual violence is disclosed; even if the information is true, it is very likely that the public distribution of her name will cause her harm. While the public may also be harmed by unintentionally erroneous content (misinformation) or by the publication of ethically questionable content (malinformation), disinformation refers specifically to the premeditated dissemination of misrepresented content.
Different types of falsehoods have been described within the category of disinformation. Wardle and Derakhshan (2017, 17), for example, identify seven modalities, covering four types of falsehoods ('satire or parody', 'misleading content', 'imposter content', and 'fabricated content') and three procedures of disinformation ('false connection', 'false context', and 'manipulated context'). In their study of the Covid-19 pandemic hoaxes in Spain, Salaverría et al. (2020) offered a four-category typology: hoax, exaggeration, decontextualization, and deception. Based on this classification, these authors propose a 'hoax severity diagram', which describes the disinformation modalities carried out through jokes or hyperbole as milder and describes expressions that resort to decontextualization and deception as more serious and potentially harmful.
In any of its forms, but especially in its most serious forms, it is certain that misinformation produces significant effects. In the best cases, it leads to skepticism towards any public message, including the truthful ones. In the worst cases, it has a destabilizing effect, causes social alarm, and encourages dangerous behavior in a certain segment of the population.

Science and disinformation
Although the spread of false or misleading content about Covid-19 has reached previously unknown levels, this is by no means the first time that disinformation about science-related issues has spread.

From asbestos to climate change
Asbestos is a building material that was widely used between 1860 and 1980 (and is still used in some countries), despite overwhelming evidence that it is carcinogenic and that it causes asbestosis (a type of pulmonary fibrosis), meaning that its use can be deadly. As some research indicates, asbestos producers developed powerful disinformation campaigns in an attempt to deny or cast doubt on the credibility of studies showing the harmful effects of asbestos on human health, even though this knowledge is well-established in the scientific community (Baur and Frank 2021).
Another relevant case is that of tobacco. Doctors warned about the harmful effects of cigarettes in 1964, but it was not until 1999 that American health authorities declared that smoking was a cause of cancer. Why did it take so long to clearly alert the population to this danger? The answer can be found in a 2006 ruling in which a U.S. federal tribunal found tobacco manufacturers guilty of 'conspiring to deny, distort, and minimize the hazards of cigarette smoking to the public'.
Disinformation has also played a prominent role in the social perception of anthropogenic climate change. Research published in 2009 highlights that the level of consensus on its existence and causes (the emission of gases produced by human activity) among scientists who specialize in the field is between 97 and 98% (Anderegg et al. 2010). However, this consensus has not always been reported by the media (Boykoff 2007; Buettner 2010; Díaz Nosty 2015), and the existence or relevance of the greatest environmental problem of our time is questioned by different political spheres and social groups (Oreskes 2004). This means that the number of people who believe in the existence of this phenomenon has at certain times decreased (Leiserowitz et al. 2010).
Several different factors led to this disconnect. First, the powerful industry associated with fossil fuels has engaged in intense lobbying for decades, adopting the strategy of sowing doubts about the rigor of climate science through think tanks and denialist groups (Dunlap and McCright 2011). The media's norms, values, and methods have also led to the amplification of voices that contradict the scientific consensus, for example, by giving equal weight to sources with opposing views (Boykoff and Boykoff 2004) or by frequently turning to experts allied with fossil fuel industries (Zehr 2000).
In addition to the examples above, the recent history of science communication includes other instances of disinformation, such as the alleged causal relation between vaccines and autism (Kata 2010), false miracle weight-loss diets (Forbes 2002), and genetically modified foods' alleged harm to human health (Jiang and Fang 2019).

A perfect breeding ground
In this section, we shall briefly explain why science is a perfect breeding ground for misinformation, starting from the gap between scientific knowledge and common knowledge. Science is based on a systematic, rigorous, logically structured body of knowledge, while common knowledge tends to be 'vague, uncertain, disordered, and ununified' (Fernández del Moral and Esteve Ramírez 1994, 27, our translation). That is why, when science attempts to reach the broader society, it does so through communicators who try to overcome the distance that separates both types of knowledge, and that is no easy task.
Science communication theorists have tried to explain different ways to bridge this gap through various models. First, what is called the 'deficit model' was developed, according to which there is a lack of scientific knowledge in society that communicators must solve by transmitting that knowledge to the citizens. This model is based on the idea that science is too complicated for the general public, so it is necessary for scientific communicators to carry out a mediation process. Other subsequently developed models have highlighted the need to establish a dialogue with society (dialogical model) and to facilitate the participation of the citizens in the production of knowledge (participatory model). The experts suggest that effective public communication of science must be based on a combination of the three models so that it can start with a transfer of knowledge on an issue and proceed with dialogue and participation. In this way, citizens cease to be passive spectators and become active agents thus gaining greater cultural benefits from science (Bucchi 2008).
Several authors have analyzed the complex relationship between science and the media. As Nelkin (1987) points out, these are two cultures that are in tension but are inextricably linked. When reporting on science, the media tend to present it through frames like amazement at advances or innovations, which lead the public to form distorted ideas, such as the notion that easy solutions to economic, social, or medical problems are within reach. Crisis situations make it clear that the media have trouble reporting effectively on complex issues and critically analyzing contradictory information. In short, as this author states, 'the media can play an important role in enhancing public understanding, but they have frequently failed to do so' (Nelkin 1987, 162).
In the specific area of biomedical information, Bellver Capella (2006) recalls that the media have a short time in which to report, limited space in which to report, and a constant need to appeal to the public. Moreover, the editorial line of each media outlet usually plays an excessive role in such key decisions as headlines, selection of sources, and the orientation of information. This facilitates the publication of information that is not very rigorous, is imbued with a fascination with the quantitative and the extraordinary, and is often of a sensationalist nature.
This complicated relationship between science and the public leads to the spread of knowledge that only appears to be backed by science: pseudoscience. According to the etymology of the term, pseudoscience can be defined as non-science posing as science (Gardner 1957), which can be considered a type of disinformation, since it is a matter of beliefs that 'masquerade as genuinely scientific ones' (Baigrie 1988, 438). Although the line between science and pseudoscience is sometimes fuzzy and permeable, fields, such as astrology, ufology, parapsychology, iridology, homeopathy, and alternative medicine, just to name a few of the best-known, are generally considered pseudoscience (Romero 2019). In a broader sense, however, any practice that departs from the scientific establishment (that is, is outside of the scientific consensus) can also be included in this category.
At this point, a brief digression is needed to explain how scientific consensus is formed and what it entails. Even when there is a consensus on an issue within an expert community, it is commonplace for citizens outside of that scientific field to express their disagreement in public, thus giving rise to misinformation. Different arguments can explain these disagreements, for example: that the public is less informed than the experts (Irwin and Wynne 1996); or that the risk-benefit assessment criteria used by the public differ from those used by scientists (Slovic 2000). Other authors explain this dissonance through the existence of what they call 'cultural cognition of risk', which leads people to conform their perceptions of risk to their own values (Kahan, Jenkins-Smith, and Braman 2011).
This context allows us to better understand that communicating scientific knowledge to the broader society is a complex task. Moreover, this endeavor lies in the middle of a web of different interests and circumstances derived from the way such knowledge is generated. All of this favors misinformation on different science-related issues.
Salaverría (2021) identifies some factors that are conducive to misinformation about science and healthcare, distinguishing between endogenous and exogenous factors. The former include the acceleration of scientific publication processes and the inadequate transfer to society of knowledge published in scientific journals. The exogenous factors include those of a technological, psychological, political, medical, or educational nature.
As this author indicates, healthcare misinformation can have serious consequences. It can put people's health at risk, impede responsible individual behavior, and hinder authorities' management of health crises. Studies conducted during the Covid-19 pandemic corroborate these statements, indicating that in different countries misinformation reduced the willingness of citizens to get vaccinated and to follow the guidelines prescribed by health authorities (Roozenbeek et al. 2020). It also caused citizens to become less informed about the pandemic and to process information in a less systematic way (Kim et al. 2020).

Scientific and healthcare misinformation and disinformation during the pandemic
As we pointed out in the introduction, during the Covid-19 pandemic, citizens have received a lot of disinformation about different pandemic-related issues. One study conducted in Spain analyzed hoaxes identified by the country's three main fact-checking organizations during the first month of the pandemic. The results indicate that 34.9% of these hoaxes dealt with 'issues related to science and health', 26.7% with 'politics and government', and the remaining 38.4% with other topics. The science and healthcare hoaxes included false information about investigations, false recommendations to the public, and falsehoods related to health management (Salaverría et al. 2020). Another study, conducted in Iran, indicates that disinformation about Covid-19 spread through social media included 'disease statistics; treatments, vaccines and medicines; prevention and protection methods; dietary recommendations and ways for disease to be transmitted' (Bastani and Bahrami 2020).
There has undoubtedly been some occasional division and controversy between governments, official bodies, and scientists themselves, as well as different positions or ways of interpreting results and conclusions. All of this has produced misinformation in the opinions of the public. In this sense, some professional groups like 'Doctors for Truth' or 'Biologists for Truth', which have representation in several countries, have acted very irresponsibly and have been authentic sources of disinformation during the pandemic. 'Doctors for Truth' started in Germany, picked up steam in Spain, and expanded into Latin America. The falsehoods they have spread include the promotion of false cures, the call not to use masks, and even denial of the pandemic itself. The fact that they are physicians has given them a greater influence on the public discourse. In Spain, the Consejo General de Colegios Oficiales de Médicos de España (General Council of Medical Associations of Spain) began compiling an informational file on 'Doctors for Truth' because of the damage that they may have caused to public health (OMC 2020).
Most of the problem results from three main causes that we analyze in the following section: the acceleration of scientific processes; the politicization of the phenomenon (May 2020); and the lack of knowledge about how science works.

The problem of high-speed express science
During the pandemic, a huge amount of science has been published in record time, in a world that is hyper-connected by social media. This combination has proved very dangerous. The scientific publication process is usually slow: the time from when an author obtains results to when those results are published and accessible to the scientific community is normally a matter of months or even years. During the Covid-19 pandemic, however, the need for results to be shared quickly among the scientific community has led researchers to distribute pre-publication documents to colleagues.
These preprints or prepublications are manuscripts that have not yet passed peer review. The immediate distribution of preprints allows the authors to get feedback from colleagues, which can be very useful for revising and improving their work. Over time, servers or repositories have been created in which such preprints are openly available. One of the first fields of study to opt into this system was theoretical physics: since 1991 there has been an online archive, called arXiv, for prepublications of scientific papers in the fields of physics, mathematics, computer science, and quantitative biology (Ginsparg 2011). In 2013 the bioRxiv repository was created for the life sciences. Today there are preprint archives for practically every discipline, including law and philosophy. It is estimated that more than 40% of preprints eventually end up getting published (Tsunoda et al. 2019).
The seriousness and urgency of the pandemic have demanded rapid responses from the scientific community. At the end of January 2020, the journal Nature (Stoye 2020) published a commentary in which the author was astonished that, fewer than twenty days after the emergence of the new Chinese coronavirus had been announced, more than 50 scientific papers had already been published. Even then that number was impressive. One year later, however, there were more than 100,000 scientific articles on SARS-CoV-2 or Covid-19 in PubMed (2020), surpassing, for example, the number of those under the heading of 'malaria'. In the first months of 2020, the number of such emergency (unreviewed) scientific publications is estimated to have increased nearly 100-fold over the previous year (Else 2020). The number of scientific publications during the pandemic, and especially the number of preprints, has been so great that it has overwhelmed not just the scientists themselves, but also the publishers and specialized journals.
Some of these articles were no more than opinions or simple recommendations. During this time, scientific articles of low quality, but high media impact, were also published while others have been interpreted out of context or even misinterpreted by non-specialists. Moreover, results published in this way have occasionally been taken as proven scientific facts.
Covid-19 has been a perfect storm for the spread of both erroneous news and deliberately false news or hoaxes. Back in 2018, the journal Science published a study on the spread of news on social media which found that false news is 70 percent more likely to be retweeted than true news (Vosoughi, Roy, and Aral 2018). To achieve this reach, the creators of false news resort to tricks like using content that is surreal, exaggerated, stunning, emotional, or persuasive, or that relies on clickbait or shocking images (Baptista and Gradim 2020). The result of employing these tactics is that false news tends to spread faster and reach more people than true news.
The majority of these hoaxes were related to false interpretations regarding the origin and lethality of the virus, its permanence in the environment, treatments, vaccines, or absurd recommendations for fighting the virus, such as gargling, following certain diets, drinking wine or hypochlorite, or taking homeopathic remedies (Salaverr ıa et al. 2020).
For example, one scientific article suggested that SARS-CoV-2 was produced in a laboratory through genetic engineering as an artificial combination of a coronavirus and the HIV retrovirus that causes AIDS. This article was published as a preprint on 30 January 2020 (Pradhan et al. 2020) and withdrawn by the authors themselves on February 2, when errors were found in their bioinformatics analysis and interpretation. However, it was one of the most commented-on articles on social media, promoting the hoax of the artificial origin of SARS-CoV-2. Unfortunately, this hoax was echoed by Luc Montagnier, winner of the 2008 Nobel Prize for Medicine for having co-discovered HIV. It bears mentioning that in recent years, this researcher's prestige has been overshadowed by his support for the anti-vaccine and homeopathic medicine movements.
However, the issue of the origin of SARS-CoV-2 is still being debated. Based on the data currently available, the most likely hypothesis is that SARS-CoV-2, like all other human coronaviruses, is of natural origin: it emerged from a natural reservoir of bat coronaviruses and reached us through some (still unidentified) intermediate species in which it adapted to humans (Ye et al. 2020). However, there are reasonable doubts about what was being done, and how, at the Wuhan Institute of Virology. The tremendous opacity of the Chinese government means that a laboratory origin cannot be ruled out as a less likely but possible hypothesis. Thus, the scientific community has called for further study on the origin of SARS-CoV-2, calling on public health agencies and research laboratories to open their records to the public and allow transparent, objective, data-driven, and independent research (Bloom et al. 2021).
We have never had as much scientific knowledge or as great a technical capacity to deal with a pandemic as we do now. But science needs repose, that is, time to repeat experiments, for others to confirm the same results, and for scientists to evaluate each other. Scientific endeavors are not always compatible with the immediacy of the news. The media demand a lot of information right away. Science responds with thousands of publications, which are open access so that they can be shared by the entire scientific community. But this crisis has highlighted the complicated relationship between express science and the media's need for communication, which ends up leading to misinterpretations as well as malicious hoaxes, a breeding ground for charlatans and conspiracies (Else 2020).

Politics and fraud in science
Unfortunately, some pandemic-related topics have been muddied by political action. For example, whether the use of masks was recommended or even compelled to prevent outbreaks depended more on the ideology of the governor than on scientific criteria. In the United States, the recommendation for the use of masks in each state has varied widely depending on whether the governor was a Republican or a Democrat (The Guardian 2020). Pressure from anti-vaccine movements has also led to political disputes, which have resulted in what some have described as 'a time of political polarization that results in competing distortion of scientific study results' (May 2020).
But perhaps the most scandalous case has been that of hydroxychloroquine: a chemical compound derived from chloroquine that has been used for years to treat malaria and rheumatoid arthritis. This drug was also known to be a potent antiviral because it blocked the entry of viruses in general into cells. Preliminary studies had shown that this compound was able to inhibit the multiplication of SARS-CoV-2 in vitro in laboratory cell cultures. These results made hydroxychloroquine one of the first antivirals to be tested in the most severe cases of Covid-19. A famous (and peculiar) French microbiologist, Didier Raoult, advisor to the French government in the fight against the pandemic, quickly published results claiming that this compound was effective in humans against the coronavirus (Colson et al. 2020).
The WHO included hydroxychloroquine in the Solidarity clinical trial. However, some scientists criticized Raoult's work, warning of possible side effects and noting that significant benefits in patients had not been found. Raoult himself denounced a plot and accused the French Scientific Council and the American laboratory Gilead Sciences of putting the brakes on the use of hydroxychloroquine, which, as a cheap and readily available remedy, was not very lucrative for big pharma.
The issue was further muddied when the then President of the United States, Donald Trump, revealed at a press conference that he was taking hydroxychloroquine to prevent the coronavirus. The consequence of that folly was that there was a shortage of the drug in certain areas so some patients who really needed it had difficulty obtaining it. Thus, the efficacy of hydroxychloroquine became a political issue, with some in favor and others against it for ideological reasons rather than scientific ones (Saag 2020).
To further complicate matters, an article published in one of the most prestigious journals in the field of biomedicine, The Lancet, warned that not only was hydroxychloroquine ineffective but also that it was associated with serious adverse effects and an elevated risk of death. The work was not experimental: the authors relied on statistical data from more than 96,000 patients from 671 hospitals worldwide (Mehra et al. 2020). The WHO decided to discontinue the use of hydroxychloroquine based on this study. However, a group of 174 scientists from 24 countries subsequently questioned these results and thoroughly analyzed the data published in The Lancet. They found that both the experimental design and the database on which the authors based their work were unreliable (Watson et al. 2020).
It was later confirmed that the work was fraudulent and that some of the authors had previously been accused of malpractice. The Lancet retracted the article just two weeks after its publication, and the episode was dubbed #TheLancetGate. Fortunately, the high-speed science of the pandemic also meant high-speed correction.
All of this shows why such a technical question as whether a drug works should not be decided by political, economic, or ideological criteria. The WHO decided to resume the use of hydroxychloroquine in clinical trials, and it was ultimately shown not to be a suitable drug for treating the disease. The retraction of the results in The Lancet, thanks to the contributions of many scientists who detected the errors after publication, shows that the method works: deception, fraud, and simple errors are quickly detected.

Understanding how science works
One of the major problems the pandemic has accentuated is that society has demanded certainties at a time when everything has been full of uncertainty. This has generated a certain distrust of science, because a large part of the public does not understand how science works. Many statements made in science are provisional until someone confirms or disproves them. This is especially true in fields like biology, where results more often suggest than prove, and even then only tentatively. Science, in other words, has its limitations. Absolute trust in science can create false expectations that may lead to frustration.
It is rare to encounter a scientific experiment that is well designed, double-blind, randomized, repeated many times, with the correct controls, a sufficient sample size, and unequivocal conclusions about the subject under investigation. Many factors influence experiments and, especially, the interpretation of their results. Before the pandemic, the population was used to seeing only the end of the scientific process: the outcome. During this pandemic, the process by which science is carried out has become widely visible.
Although these may be very basic ideas, it is worth recalling some aspects of the scientific method that may have had a negative impact on how results were interpreted and communicated. For example, science is dedicated in part to finding the causes of the patterns we see, but there can be many different explanations for the same phenomenon. How a pandemic caused by an unknown virus will develop is simply unpredictable. Moreover, the design of an experiment or the method of measurement may produce results that are atypical or biased in a certain direction. For example, in a clinical trial, the outcome may be influenced by the expectations of the trial participants.
The researcher who collects and analyzes the results may also be influenced if he or she knows about the treatment beforehand. That is why in so-called double-blind experiments, neither the participants nor the investigators themselves know who has received which treatment. Bias occurs when researchers cease to be sufficiently critical, objective, and impartial about their own results, or stop investigating whether there is evidence contrary to their starting hypothesis. The aforementioned hydroxychloroquine scandal can serve as an example of this. On the other hand, the effectiveness of a treatment can naturally vary between different people. Therefore, a trial of a treatment or vaccine is more reliable if it is tested on tens of thousands of individuals than on a few hundred: testing a treatment against Covid-19 in a group of one hundred people is not the same as testing it in a group of five thousand. That is why large clinical trials such as Solidarity, DisCoVeRy, and RECOVERY have been so important. During the pandemic, some researchers published erroneous conclusions (including some with great media impact) based on the analysis of only a few cases; these conclusions were later amended when the number of cases studied grew.
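The effect of sample size can be made concrete with a standard back-of-the-envelope calculation (a textbook illustration added here, not drawn from the trials mentioned above): the uncertainty of an estimated response rate shrinks with the square root of the number of participants.

```latex
% Approximate 95% confidence half-width for an observed proportion p
% estimated from n trial participants:
\mathrm{margin} \;\approx\; 1.96 \sqrt{\frac{p(1-p)}{n}}
% In the worst case, p = 0.5:
%   n = 100:   1.96\sqrt{0.25/100}  \approx 0.098   (about \pm 10 points)
%   n = 5000:  1.96\sqrt{0.25/5000} \approx 0.014   (about \pm 1.4 points)
```

A fifty-fold increase in sample size thus narrows the uncertainty roughly seven-fold, which is one reason an apparent effect seen in a hundred patients can evaporate in a trial of thousands.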
It is also worth remembering that correlation is not equivalent to causation. This statement, essential in science, is often forgotten. It is very tempting to assume that one fact is the cause of another, but the correlation between the two may be mere temporal coincidence: the fact that two events usually occur in succession does not imply that one caused the other. For example, the fact that some phenomenon appears shortly after a vaccination does not, by itself, mean that the vaccine caused it.
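As a toy illustration (our own sketch, not an example from the article), two completely independent quantities that each happen to drift over the same period will show a strong correlation despite having no causal link:

```python
import random

random.seed(0)

# Two independent noisy series that each drift upward over time,
# e.g., two unrelated quantities that both grow during the same months.
n = 50
a = [t + random.gauss(0, 5) for t in range(n)]
b = [t + random.gauss(0, 5) for t in range(n)]

def pearson(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    vx = sum((xi - mx) ** 2 for xi in x)
    vy = sum((yi - my) ** 2 for yi in y)
    return cov / (vx * vy) ** 0.5

r = pearson(a, b)
print(f"correlation between two independent series: r = {r:.2f}")
```

Here it is the shared time trend, not any causal relation, that produces the high correlation; observational health data are full of such confounders.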
Finally, all experiments must have their own control group so that researchers can determine, for example, whether a treatment has been effective. The control experiment is essential for interpreting the results and for ensuring that they are not due to other variables. Many scientific articles are rejected because they did not include the proper controls. What some have seen as contradictions and constant backtracking is, for others, simply the natural way of conducting science.

Lessons learned: communicating science in times of crisis
During the Covid-19 pandemic, society was faced with a level of misinformation that was probably unknown until then. Beyond its causes, characteristics, and consequences, which will still need detailed study, the experience has allowed us to learn some important lessons which can be immensely valuable in dealing with future crises.
One of the positive aspects of this pandemic is that there has been a great deal of scientific information in the media. Concepts like viruses, PCR, antibodies, vaccines, and immunity have been headline news. There has been a high demand for information, and a great effort has been made to convert complex scientific concepts into information accessible to all citizens. There have been examples of specialized science journalists who have done an exceptional job. However, as we have already pointed out, the health, social, economic, and political impacts of the pandemic have also increased misinformation.
One of the problems that we have already discussed has been the challenge of combining the immense production of scientific information-express science-with its communication to society. That is why it is important to analyze how science is disseminated and how this pandemic might change the way we communicate it. Some are already proclaiming that the system is broken and that the way we communicate science needs to change.
Preprints, peer review, and open access in principle allow for a certain degree of self-regulation. However, the fact that the evaluation of researchers is based almost exclusively on the number of publications they have and on the prestige of the journals in which they appear has corrupted the system: the CV is measured by weight. A researcher's future ends up depending on what he or she publishes, which is why there is a rush to publish, sometimes at all costs. Competition between researchers is fierce as they fight to obtain scarce and insufficient funding, and, as we have seen, their merit depends on their publications.
On the other hand, neither the media nor the scientific journals themselves are usually interested in publishing experiments with negative results: positive results sell, and they are more likely to make the news. But an experiment that does not turn out well-because it has errors, for example-is not the same as one in which the result is negative, such as when a treatment is shown to have no effect. This lack of interest has meant that negative research experiences and results go practically unpublished, even though their publication could advance scientific knowledge much more quickly. Without getting into too much detail, knowing that an antiviral works and can block the virus is just as important as knowing that it does not. There is no doubt that we need a new model: a new way to evaluate researchers and a new way of publishing science.
Despite all this, it is estimated that more than 7000 scientific articles are published each day, most of which will never be read or cited. The old saying 'publish or perish' should perhaps be updated to 'be visible or vanish': either you make your science visible, or you disappear. The pandemic has also shown that scientific communication and dissemination are of vital importance, given the intermediary role they play between science and society. Scientific knowledge enriches public debate, strengthens democracy, and forms critical and free minds. Communicating science is much more than publishing an article in a journal: high-quality scientific communication can explain scientific knowledge and bring it to citizens.
Behind many crises there is also a crisis of communication. According to communication experts, it is of fundamental importance to convey credibility in times of crisis. The worst thing that can happen is for citizens to think that they are being lied to. The information must be truthful, because the truth consoles. This requires transparency and clarity. Transparency means telling what is known: everything that is known. Information must therefore be abundant and rigorous. At the same time, it must be clear and simple, resorting when necessary to infographics or to formats of data journalism, a mode of communication that has been on the rise during the pandemic (Westlund and Hermida 2021). Since the objective is for the information to be understood, it must be presented in a way that can be understood, and that requires knowing what questions and doubts ordinary people have, so that science communicators can respond to those questions and fears.
The experience of this pandemic, with cases such as that of hydroxychloroquine discussed in this article, shows that the media need professionals who specialize in scientific communication and whose criteria allow them to select rigorous sources and to avoid publications based on provisional results of dubious quality. In this way, citizens will not be misinformed by changing and contradictory news.
These simple ideas lay the foundation for good communication in times of crisis. The better informed citizens are, not only about the results of research but also about its methods, the less panic and the fewer hoaxes there will be. There is no doubt that scientific reporting is becoming more important in these times, and that such reporting is the path toward restored trust in science.