Issue managers and communication professionals have to deal with them every day: people who seem to genuinely believe ideas about science and technology that are simply wrong. And social media spreads this “untelligence” at breakneck speed, potentially influencing millions.
However, this is not just about a witty neologism. “Untelligence” concerns a much more fundamental problem which inhibits informed discussion and cannot be dismissed as simply another manifestation of fake news.
Communicating difficult scientific concepts or nuanced medical issues is hard enough. But it is made even harder when millions of people are influenced by false information and anti-science. And it’s certainly not helped when, for example, the president of the United States declares that noise from windfarms causes cancer (it doesn’t) or suggests injecting disinfectant may be a treatment for coronavirus (it isn’t).
So how does this “untelligence” spread? We know from research at MIT that false news on Twitter spreads farther and faster than stories which have been independently verified. And we know from research at Iowa State that Russian trolls have deliberately planted anti-GMO messages around the world as part of a campaign to inflame divisive issues in the West and promote distrust of government, large corporations and experts. Indeed, The New York Times reports that Russia has also been very active in promoting the anti-5G conspiracy.
Yet it’s all too easy to blame outsiders. The news media is also at least partly to blame for the promotion of non-scientific information. It might seem harmless when they devote space to the efforts by rapper B.o.B. to prove the Earth is flat. But it can have real consequences.
Think back to when the Australian media lionised fake wellness blogger Belle Gibson and gave her a national platform to promote the dangerous lie that she had used diet to cure her own cancer. Or when the American media made a business hero of Elizabeth Holmes, CEO of Theranos, whose supposedly revolutionary blood-testing technology proved to be a fraud, costing investors millions.
In these pandemic times, we know that people who believe false cures and misinformation about the virus are risking their own lives and the lives of those around them.
For issue managers and science communicators, combatting false information is a daily responsibility which cannot be addressed by finger-pointing, no matter how satisfying that might be. It needs planned and persistent challenge and rebuttal.
And never forget: the public are generally not stupid. More likely, they are just wrong-headed or victims of “untelligence”.
This story is reprinted with permission from Managing Outcomes, a newsletter for people who work in issue and crisis management.
Tony Jaques is the director of Issue Outcomes.