
    Keynoter Outlines Do’s and Don’ts of Debunking False Science

    October 22, 2020, 12:58 pm News Staff -- It’s not like Timothy Caulfield, Ph.D., didn’t already have plenty of material to use in making the case that science ― real science ― is under attack. The Canadian health law and science professor, bestselling author, and unlikely Netflix star ticked off a list of mind-boggling examples during his Oct. 14 opening main-stage session at this year’s virtual Family Medicine Experience.


    Sandals infused with healing crystals that sell for upwards of $350.

    The “vampire facial” Kim Kardashian made famous, but then later came to regret.

    Someone else’s used tissue, marketed as empowering you to “get sick on your own terms” ― to the tune of $79.99. (Spoiler alert: The product was later revealed to be a hoax.)

    And don’t get him started on drinking urine ...

    “There’s so much noise out there ― and I’m sure you guys see this in your practice ― that distracts us. Distracts us from the things we can do to live a healthy lifestyle,” Caulfield told his virtual audience.

    “I was going to talk about that. But holy cow, how things have changed!” he exclaimed, referencing the stunning levels of misinformation swirling around the COVID-19 pandemic. “We truly live in an infodemic.”


    As research director of the Health Law Institute at the University of Alberta, Caulfield has spent decades studying the spread and impact of health misinformation. He is currently leading a project to collect and evaluate information ― and misinformation ― about SARS-CoV-2 and COVID-19 from multiple sources (e.g., social media, mainstream news outlets, search engine data) and determine which types of messages are gaining traction and why. Then he and his team will fashion those insights into an evidence-based framework that government and private entities, the media, and the public can use to cut through the noise and elevate accurate information from reliable sources.

    “When we first started studying this,” he said, “we were optimistic ― hopeful ― that the misinformation would be tackled early and that people would recognize the significance and the power of science and the harm that misinformation can do.

    “Unfortunately, that is not how it’s played out.”

    Clearly, misinformation about the pandemic is rampant, from claims that it’s a bioweapon (either Chinese or American, take your pick) or that it’s linked to 5G wireless network communications to speculation that it (along with Candida auris) made its way to Earth via meteorite.

    “I’m pretty sure that COVID-19 came from the grassy knoll,” Caulfield quipped. “That’s the next thing that we’re going to hear.”

    What’s truly incredible, though, is how many people buy into these conspiracies, said Caulfield, citing a University of Oxford study showing that about one in five Britons believe the coronavirus is a hoax.

    And here stateside, the numbers are equally dismaying, with survey results from the Pew Research Center indicating that three in 10 Americans believe the virus was made in a lab.

    What’s behind all the magical thinking? Researchers at Carleton University seem to have tapped into a likely cause: social media consumption ― or what Caulfield calls “availability bias.”

    A growing body of evidence points in that direction, he noted: “Those who believe in conspiracy theories are much more likely to be getting their information from social media.”

    Celebrities, too, have been responsible for spreading misinformation since long before SARS-CoV-2 ever appeared (think Jenny McCarthy’s rabid antivaccination activism), and some have more recently joined the ranks. Case in point: Woody Harrelson’s embrace of the 5G conspiracy theory.

    What’s particularly enlightening, said Caulfield, is to look at the amount of misinformation coming from celebrities, sports stars, politicians and other prominent people compared with the level of social media engagement that content receives. He calls it a “top-down, bottom-up phenomenon.”

    According to research from the University of Oxford’s Reuters Institute for the Study of Journalism, prominent public figures disseminated only about 20% of the misinformation sampled, but that misinformation attracted nearly 70% of all social media engagements seen in the sample.

    “So, it’s bottom-up in that all of us ― all of us ― we’re spreading that misinformation on social media,” Caulfield noted. “It’s important to recognize that because that gives us a sense of what we have to do to stop the spread of misinformation.”

    Use of slick video techniques is a particularly effective way to spread misinformation, he went on. The first Plandemic movie (yes, there were actually two of them), for example, commanded a far greater viewing audience than did The Office reunion or Taylor Swift’s City of Lover Concert. Its high-quality documentary presentation and convincing testimonials lent it an inordinate amount of credibility, Caulfield said, and helped advance the many conspiracy theories the film touted.

    Another important source of misinformation, according to Caulfield? Superspreaders. “There are entities out there, there are individuals out there that have had a disproportionate impact on the public discourse, who have had a large impact on the spread of misinformation,” he said.

    Rush Limbaugh, for example, received more than 2 million likes on Facebook for a commentary he wrote that contended the coronavirus was being weaponized against President Donald Trump, Caulfield explained.

    Not surprisingly, antivaccination activists have gleefully jumped on the bandwagon of naysayers when it comes to coronavirus. “It’s fascinating to see how the anti-vaxxers are positioning their message,” Caulfield observed. “Since the very early days, they tied their message to an ideological approach, perhaps even an intuitively appealing ideological idea. The idea of liberty. Choice. Freedom.

    “We know research tells us that if you can do that, if you can link misinformation to an ideologically appealing concept, it’s more likely to spread.”

    The bad news: This approach has been very successful, creating harmful effects in multiple areas. It’s causing physical harm, promoting stigma and discrimination, sparking property damage (yes, people are actually pulling down cell towers), wreaking havoc on science and health policy, and creating an overall chaotic and divisive information environment.

    The good news: The same approach can be appropriated to disseminate accurate, reliable information. It basically comes down to overpowering bad science with good science, according to Caulfield. And that requires strenuously adhering to salient aspects of scientific discourse and discovery, from highlighting the use of rigorous methodology to faithfully relating how evidence is interpreted and translated into appropriate findings.

    Furthermore, fighting misinformation and promoting transparency needs to happen across all fronts. At long last, he noted, some social media platforms appear to be acknowledging they have a role to play in this battle and are cracking down on sources that spread conspiracy theories and other falsehoods, including by flagging or outright removing the offending content.

    “Of course, we also need to debunk all of this,” said Caulfield. “I want to energize you guys to become the debunkers.”

    Right now, he noted, we’re seeing “the perfect elements for the embrace of conspiracy theories.”

    “Conspiracy theorists, for example, may have a particular cognitive style, they may embrace magical thinking. And also, when there’s times of uncertainty and people feel powerless, conspiracy theories thrive.”

    Add to that emerging research suggesting that certain personality types ― those who tend to feel greater distress, those who are more negative ― are especially likely to be drawn to such theories. Indeed, fear distracts people from being able to judge the accuracy of information they encounter online.

    Still, said Caulfield, “I think debunking works. I think we can get out there and we can debunk.”

    But don’t harbor unrealistic expectations, he cautioned. “It’s really important to highlight here that what we’re trying to do is move the needle. No one strategy is going to fix everything.”

    How to go about debunking scientific falsehoods? First, Caulfield advised, don’t be dissuaded by concerns about the so-called backfire effect, the belief that pushing back against an idea only leads to further entrenchment among those who hold that view. The idea was espoused in a 2010 study that has since been contested, he emphasized, and more recent research has undermined its validity. So, although it can’t be completely discounted, it shouldn’t stand in the way of attempts to counter false information.

    Such efforts are more likely to succeed when conducted by experts and trusted information sources before the misinformation becomes too firmly entrenched. Other keys to successful rebuttal are using compelling facts and highlighting the rhetorical techniques used to push misinformation and support denialism, Caulfield said.

    “What I mean by that is saying, ‘Look, this is a conspiracy theory.’ ‘Look, they’re misrepresenting the risk involved.’ ‘Look, they’re relying on testimonials.’ So, highlight the good data, the good science, and point out the rhetorical devices that are used to push the misinformation.”

    Here’s a brief recap of specific tips Caulfield offered:

    • Provide the science.
    • Use clear, shareable content.
    • Reference trustworthy and independent sources.
    • If possible, reference expert consensus (as well as the fact that science evolves, when appropriate).
    • Be nice, authentic, empathetic and humble.
    • Consider using a narrative.
    • Highlight gaps in logic and rhetorical tricks.
    • Make facts the hook, not the misinformation.
    • Remember: Your audience is the general public, not the hardcore denier.

    Finally, he noted, “We want to get people just to pause before they share.” Getting people to stop long enough to consider how truthful a headline is can disrupt the cycle of perpetuating misinformation, according to a study published in the Harvard Kennedy School Misinformation Review in February. More recently, a study from researchers at the University of Regina in Saskatchewan found this concept held true for COVID-19 misinformation.

    Furthermore, Caulfield noted, people typically aren’t deliberately trying to spread misinformation. “You know, we all seem to think that people have this nefarious agenda ― no, that’s not the case. Most people want to be accurate,” he said. “So, if we can just get people to pause and embrace a culture of accuracy, we can have an impact, we can move that needle. And that’s what we really want to do.”

    Caulfield concluded with a simple tip that traces back to a media campaign he’s involved with: “Our message is ‘Check first. Share after.’”

    “If we can get people to do that ― your patients, people you know ― we may be able to make a difference.”