Fact or fake? How to check your science before reposting

Updated: Jul 23



THE COMPENDIUM

  • Every new piece of information should be verified before being believed. Considering which interests lay behind its publication is the first step toward securing facts and squashing myths.

  • The media often equates correlation with causation for the sake of attracting attention, but even if two events in a study are correlated, that does not mean that one is the cause of the other.

  • The CRAAP test (Currency, Relevance, Authority, Accuracy, Purpose) is a great tool to investigate the relationship between the information published in a popular science article and the scientific study/paper it originated in.

Currency: How old is the article? Has it been revised or updated?
Relevance: Does the original article relate to the information that is being spread?
Authority: Who are the authors? What are their qualifications and affiliations?
Accuracy: Do the conclusions match the experiments?
Purpose: For whom or what was the scientific article written? Is there a potential bias or conflict of interest?


Too much information

These are strange times for the scientifically inclined and/or curious. With the world’s knowledge readily available at the touch of a screen and information being widely accessible, one would think that misinformation or ignorance would be at an all-time low. However, in recent years, there has been a rise in movements based on the denial of evidence-based scientific facts, most notably anti-vaxxers and climate change deniers.

Why is it that, with all the facts and knowledge readily available, the spread of misinformation and myths has turned into a (political) issue? The tip of the (melting) iceberg is certainly the fact that on the internet, everyone can be an expert, and social media in particular furthers the development of echo chambers, where the potentially wrong ideas of like-minded people are neither challenged nor corrected.

To be completely fair, it is not an easy task to filter fact from myth in the endless stream of information, scientific or otherwise, when trying to form an informed opinion. If, in order to understand something everyone is talking about, you need to go through countless convoluted academic papers, scientific letters, communiqués and opinion pieces, the appeal of a friend’s link to a simple tweet or Facebook post is evident. From there, it’s easy to spiral into oversimplification, sensationalism, or fear-mongering, be it on purpose or not. And that’s how potentially dangerous myths about science are born.

But here lies the important point: the information you get from your circles does not necessarily have to be misleading or false. At the same time, you shouldn’t believe everything you read. All good sources of information share some common traits, and all you need in order to navigate the flood of available information is a little bit of awareness and the will to go beyond the web page you’re presented with.


How to read into what you're reading: bias and sensationalism

Let’s analyze a typical example: Someone you know shares a link on social media about a topic you are interested in. You follow the link and the tone, while alarming, seems reasonably confident. And yet, the information presented to you contradicts what you thought you knew about this particular topic. Your first instinct is to believe that what you’re reading is true; after all, hundreds of people have shared the same link and the comments reveal a lot of approval. But should you?

Let’s go through some important points.

One of the first steps to take when faced with a new source of information is checking who benefits from it. Ideally, specialists in the field aside, no group or company in particular should get any direct benefit, and especially not fringe groups like anti-vaxxers or flat-earthers. A scientific discovery is just that: another piece of the puzzle added to the big picture, neutral by definition (conflicts of interest notwithstanding, as in the case of Big Tobacco and their sponsored studies). Oftentimes, a scientific media article is filtered through a non-expert reporter; findings might be misinterpreted or slightly altered and used to support a certain position, while proven facts are just as often ignored or misreported. Identifying when a scientific finding is being used as “propaganda” is crucial, regardless of the point the author is trying to make.

Fringe content is fairly easy to identify; keeping a levelheaded perspective on something a non-specialist media outlet writes about, on the other hand, can be trickier. This is true especially these days, as such outlets are often on the front lines of partisan warzones. Regardless of their political leaning, news outlets will always make sure that their pieces have a catchy title and gripping opening lines. It’s their job, after all. But one of the most common issues a reader can face is how media often blurs the line between causation and correlation. This is a huge pet peeve of the scientific community, because oftentimes it shows that the writer is jumping to conclusions, which is of course the exact opposite of what proper science is all about.

Causation between two events means that one is the direct cause of the other, while correlation means that two events tend to be detected together, without one necessarily causing the other. Let’s imagine, for example, an article whose headline states “Born in the south of Europe? You will have brown eyes”, reporting on a scientific finding whose main conclusion is “86% of interviewees born in the south of Europe have brown eyes”. The headline seems to imply that your birthplace directly influences your eye color, which is a claim completely unsupported by the science that is being reported. What the scientists want to point out is that most people born in the south of Europe have brown eyes, not that being born there gives you brown eyes. This kind of misinterpretation of science has played a major part in the rise of the anti-vaxxer movement (and in its subsequent debunking).
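The difference can also be seen numerically. The following Python sketch (my own illustration, using made-up data) builds two variables that are strongly correlated only because both depend on a hidden third factor, a so-called confounder; neither variable causes the other:

```python
import random

random.seed(42)

# Hypothetical data: a hidden confounder drives both observed variables.
confounder = [random.gauss(0, 1) for _ in range(1000)]

# Each variable is the confounder plus its own independent noise.
a = [c + random.gauss(0, 0.3) for c in confounder]
b = [c + random.gauss(0, 0.3) for c in confounder]

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sx = sum((xi - mx) ** 2 for xi in x) ** 0.5
    sy = sum((yi - my) ** 2 for yi in y) ** 0.5
    return cov / (sx * sy)

r = pearson(a, b)
print(f"correlation between a and b: {r:.2f}")  # close to 1: strongly correlated
# ...yet intervening on `a` would do nothing to `b`: both merely track the confounder.
```

In the eye-color example, shared ancestry plays the role of the confounder: birthplace and eye color co-occur without one causing the other.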

Tackling the science

We can now go for the final step: Once we're sure that what we’re reading is not being exploited as propaganda or overly sensationalized, we can turn our attention to the institution and the scientists generating the data. That is, if we have the chance to examine the sources that informed our own source, which is where most of the critical analysis of an article tends to grind to a halt.

A scientific article can appear quite complex to any non-specialist reader. On top of that, due to the prevalence of paywalls in scientific publishing, even accessing the original science can prove difficult. Luckily, Open Access publishers, whose journals offer their contents for free, are bravely trying to change the status quo.

Once you decide to take the plunge and read the article, though, how can you navigate it?

One useful way to try and figure out if a source is relevant and used in the appropriate context is the so-called CRAAP test, developed by librarian Sarah Blakeslee in 2004 as a tool for teaching how to critically assess sources. The acronym stands for Currency, Relevance, Authority, Accuracy, Purpose, and it can be easily adapted to help figure out if the science behind our source is credible.

Currency, here, means how long ago the article was published. The most recent articles are considered to be the most reliable, since they take into account the latest developments in the field, but even a relatively old scientific article can be considered current if it has recently been amended.

Relevance essentially boils down to one question: How closely related are the article shared on social media by your friend and the scientific study it supposedly refers to? Although a research article might look complex, most of its information is summarized in its abstract, including the conclusions. A quick read might already give you an idea of how faithfully the data have been reported.

Authority is an important but easy parameter to verify: Who wrote the article? Who are their employers? How active have they been in their field? To get a quick answer, check the Impact Factor of the journal that published the study, which reports how often articles from that journal are cited by others: the higher the value, the better.

Accuracy, on the other hand, requires a certain degree of expertise, since it investigates whether the experiments performed actually support the findings of the study. For those unfamiliar with the topic, analyzing this parameter might not make much sense, which just goes to highlight the importance of the other parameters.

Last but not least, Purpose reconnects with the question of who benefits from the research conducted for the article. Was the study commissioned by a company? Was it based on statistics or experimental data? These fundamental questions can help you understand whether the science was unbiased or whether there is reason to believe the scientists might have aligned their interests with those who funded their work.
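To make the five criteria concrete, here is a minimal sketch of the CRAAP test as a programmatic checklist. The class, its field names, and the pass threshold are my own illustration, not part of Blakeslee's original test:

```python
from dataclasses import dataclass

@dataclass
class SourceCheck:
    """One yes/no judgment per CRAAP criterion (illustrative only)."""
    currency: bool   # recent, or recently amended?
    relevance: bool  # does the study actually match the claim being shared?
    authority: bool  # credible authors, affiliations, reputable journal?
    accuracy: bool   # do the experiments support the conclusions?
    purpose: bool    # free of obvious bias or conflicts of interest?

    def passes(self, threshold: int = 4) -> bool:
        # Count how many criteria hold; the threshold of 4 is an arbitrary choice.
        score = sum([self.currency, self.relevance, self.authority,
                     self.accuracy, self.purpose])
        return score >= threshold

# Example: a source that fails only on accuracy still clears the bar.
check = SourceCheck(currency=True, relevance=True, authority=True,
                    accuracy=False, purpose=True)
print(check.passes())  # True: four of five criteria hold
```

In practice each judgment is a matter of degree rather than a boolean, but the sketch captures the spirit of the test: no single criterion decides, and a source should satisfy most of them.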

Ultimately, the responsibility of what to believe and how deeply to challenge our views resides with us. The unprecedented freedom of expression granted by the Internet amplifies and, more often than not, distorts the data science builds upon, either because these already complicated facts are poorly explained, or because of the interests at play in the background. The truth is, science is iterative, and the rewriting of some of its chapters is nothing out of the ordinary. New methodologies and discoveries emerge constantly, and it would be negligent not to take advantage of them in order to improve and support what we already know, as well as to correct claims that need to be revisited in light of scientific progress.

The relevance of this is omnipresent; just this week, for example, Angela Merkel was quoted saying about the current novel coronavirus pandemic: “We are seeing at the moment that the pandemic can't be fought with lies and disinformation... Fact-denying populism is being shown its limits.” Ignoring (or disparaging) science is simply admitting that the comfort of the echo chamber and its plethora of unfounded myths is too appealing when compared to fact-based truths.


References and further reading


Merkel says the coronavirus pandemic has exposed leaders who rely on 'fact-denying populism'

https://www.businessinsider.com/angela-merkel-coronavirus-exposes-leaders-fact-denying-populism-trump-2020-7?r=DE&IR=T


Tips to identify whether a source is scholarly and reliable

https://www.editage.com/insights/tips-to-identify-whether-a-source-is-scholarly-and-reliable


What Makes Valid Research? How to Verify if a Source is Credible on the Internet

https://www.democracyandme.org/what-makes-valid-research-how-to-verify-if-a-source-is-credible-on-the-internet/


Evaluating Internet Resources

https://www.library.georgetown.edu/tutorials/research-guides/evaluating-internet-content


How Can I Tell if a Website is Credible?

https://uknowit.uwgb.edu/page.php?id=30276


How to Do Research: A Step-By-Step Guide

https://libguides.elmira.edu/research/evaluate_sources


CRAAP test

https://en.wikipedia.org/wiki/CRAAP_test#:~:text=CRAAP%20is%20an%20acronym%20for,use%20as%20tools%20for%20research.


The CRAAP Test

https://commons.emich.edu/loexquarterly/vol31/iss3/4/


Tobacco Industry Influence on Science and Scientists in Germany

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1470431/


Tobacco Explained

https://www.who.int/tobacco/media/en/TobaccoExplained.pdf?ua=1


Vaccines and Autism

https://www.chop.edu/centers-programs/vaccine-education-center/vaccines-and-other-conditions/vaccines-autism


What are correlation and causation and how are they different?

https://www.abs.gov.au/websitedbs/a3121120.nsf/home/statistical+language+-+correlation+and+causation#:~:text=A%20correlation%20between%20variables%2C%20however,relationship%20between%20the%20two%20events.


Correlation does not imply causation

https://en.wikipedia.org/wiki/Correlation_does_not_imply_causation

©2020 by Biocompendium.