For a handful of disinformation researchers, the information crisis that has unfolded around the coronavirus pandemic seemed inevitable.
Joan Donovan, Claire Wardle and Kate Starbird have been preparing for this moment for a decade, capturing the data of the internet, tracking disinformation campaigns and calling for traditional and social media platforms to take responsibility for their spread.
Some of those calls resonated. Tech platforms have taken steps to banish hate groups and include context around misleading information. But the efforts also highlight the severity of what the World Health Organization declared “a massive infodemic.”
NBC News spoke with Donovan, Wardle and Starbird as they continue to work on the front lines of the coronavirus information war to get a sense of what we should be watching for and what the internet might look like after the virus has passed.
“The Internet hasn’t been built for a pandemic,” Donovan said on an April episode of her webinar series, “Big, If True.” “It hasn’t been built for giving timely, relevant, local and redundant information, which is what everyone needs in this moment.”
The series is one of seemingly constant pandemic education and outreach efforts from Donovan. She and her team of researchers have published research in peer-reviewed journals, written op-eds and put out a weekly newsletter, Meme War Weekly, that decodes memes as political communication. She hosts the weekly pandemic-focused web series and gives daily interviews to reporters and other researchers. In April, she met with the WHO to advise on coming recommendations to combat misinformation.
Donovan studied disinformation and media manipulation at the Data & Society Research Institute, an independent nonprofit that covers the social and cultural impact of technology. She leveraged her research to implore journalists to practice what she coined “strategic silence,” editorial discretion that diminishes the profiles of violent extremists and slows the spread of hateful messages.
“Misinformation researchers have been dealing with platform accountability and content moderation issues for several years now, and public health is just now seeing how broken our internet infrastructure really is,” Donovan said in a phone interview.
Donovan stays in touch with her team through a long-running text chain, getting messages as late as 3 a.m. from researchers who document misinformation as it bubbles up. She wakes up around 7 and starts writing.
“It’s like crazy on crazy,” she said.
For Donovan, the coronavirus and the accompanying deluge of misinformation have laid bare a truth about the internet itself: The early notion that users could be both producers and consumers of information has turned platforms into information landfills, where people are forced to sift through increasingly dangerous garbage in the search for real information.
“Information is extremely cheap to produce,” she said. “That’s why misinformation is a problem, because you don’t need any evidence, you don’t need any investigation, you don’t need any methods to produce it. Knowledge is incredibly expensive to produce. Experts are sought after, and they aren’t going to work for free. So platform companies can’t rely on the idea that the web is something we build together.”
Donovan is watching coronavirus misinformation that goes unnoticed or undocumented because its virality is hidden: spread inside private messaging apps and secret groups, or by users who evade detection with tactics like sharing links through web archivers.
The volume of misinformation and its potential for widespread harm are unprecedented, and the only real solution would have to be, too, Donovan said, suggesting that the platforms should move beyond moderation to a kind of curation.
“Believing that platform companies can clean this up is like believing that you’re going to win the lottery,” Donovan said. “They need to hire 10,000 librarians to get in there and to look at what’s on the shelves, to sort, to document and to cull what is not viable, what is not useful, what is not serving the public’s interest.”
For the last half-decade, Claire Wardle, the co-founder and director of First Draft, a nonprofit that provides research and training for journalists, has been lobbying the platforms, particularly Facebook, to exercise greater collaboration, transparency and accountability in the fight against misinformation.
“We just kept saying the same thing, and they would nod and smile and say, ‘We’ll do what we can,'” Wardle said. “And then, this flipping infodemic. And no surprises — they weren’t ready.”
Wardle’s misinformation work had mostly meant looking at elections. Indeed, Wardle and her newly expanded team of 18 — 10 in New York and the rest in Sydney, India, London and France — were training journalists on topics surrounding the coming U.S. election when the coronavirus started spreading in Washington state.
In March, as Wardle and many on her U.S. team quarantined in their New York apartments, First Draft shifted focus.
“I think it was a bit of self-preservation,” Wardle said. “We didn’t know how to deal with what was happening, and we all threw ourselves into work.”
In a month, First Draft created a training course for journalists on how to navigate coronavirus misinformation, as well as a hub for reporters with a database of debunked claims and a toolkit for verifying content in the wild.
Wardle hosts weekly video chats on which local reporters throughout the country can talk with the team about local angles on national issues and share tips on how to cover misinformation responsibly. First Draft also hosts a 24/7 workspace where journalists and academics collaborate in real time, debunking hoaxes, tracking disinformation campaigns and feeling for “the tipping point,” a phrase Wardle coined, now widely known among journalists, to describe the point at which reporting on mis- and disinformation becomes worthwhile.
Coronavirus misinformation has had a life cycle. In early March, a wave of inaccurate viral private messages warned of a federally enforced lockdown. As the virus began to spread in the U.S., so did ads and articles pushing unapproved and even dangerous treatments. Now, with most states still ordering residents to stay at home, conspiracy theories around vaccines and government control have overwhelmed the conversation over reopening.
For those who might be financially, politically or otherwise motivated to spread false information, the coronavirus has provided an unprecedented opportunity. The pandemic has also blurred lines between normally separate groups, uniting fringe activists and conspiracy theorists around a similar distrust of institutions like public health organizations and other government agencies.
“Second Amendment groups, anti-vaccine groups, these different networked groups that normally we monitor separately, were all there together,” said Wardle, referring to small but vocal and potentially dangerous anti-quarantine protests that have popped up in nearly every state.
As for the platforms, their efforts haven’t gone nearly far enough, Wardle said.
“They trumpet these transparency initiatives, but they’re marking their own homework,” Wardle said, noting Facebook’s unwillingness to provide useful data to researchers and its recent announcements claiming successful cleanup of coronavirus misinformation. “I just think it’s a massive wasted opportunity.”
Where opportunity lies, Wardle said, is with the public. First Draft has created a guide for it, too.
“As dark as this pandemic is, I feel like it has helped people recognize how harmful misinformation can be in a way that, previously, when it was around political disinformation, people could dismiss,” Wardle said. “If we can use that to have tough conversations with the platforms, then there’ll be a tiny silver lining in the middle of all the horror.”
In December, Kate Starbird and several colleagues presciently launched the Center for an Informed Public with a mission to “resist strategic misinformation, promote an informed society, and strengthen democratic discourse.”
For more than a decade, Starbird’s work has focused on how misinformation and disinformation spread during crisis events, including mass shootings, natural disasters, terrorist attacks and war.
While Starbird and her colleagues are collecting data — nearing half a billion coronavirus-related tweets by now, she suspects — they’re also personally navigating a crisis.
Starbird’s parents live in a senior community, and in March, amid shocking reports of increased infections and deaths among older people, she and her brother were sharing articles and information over text, trying to decide whether and when their parents should move.
“Looking back to all the things I shared with my brother as we were trying to make that decision, some of it was not true. Some of it was misinformation,” Starbird said. “I have a lot of empathy for people who are sharing things that turn out not to be true, because it’s so hard to get everything right right now.”
Starbird said the early responses to misinformation have been heartening overall, but she worries about the next wave of disinformation, which she predicts will be more exploitative.
“Where we had rumors and misinformation that looked natural, with people just trying to figure things out early on, we’re seeing more disinformation, conspiracy theorizing and politicization,” Starbird said. “More people intentionally — because they want to mislead or due to their own confirmation biases — are picking and choosing information that is misleading and spreading information that may be leading us away from the best collective understanding of what’s going on.”
Starbird noted several current disinformation campaigns, the most concerning of which undermined trust in a coming vaccine and targeted Dr. Anthony Fauci, the director of the National Institute of Allergy and Infectious Diseases and the most vocal public proponent of emergency measures on the White House’s coronavirus task force.
“If the medical providers and scientists that we’ve been relying on for good information are delegitimized by disinformation campaigns, then we’re put in a vulnerable situation as we try to continue to deal with this virus, maybe for years to come,” she said.
The pandemic has also provided a superhighway for some of the wildest and longest-running conspiracy theories to enter the mainstream: baseless claims that 5G towers cause coronavirus symptoms, that a hostile government created the virus as a bioweapon and that Bill Gates is plotting to inject vaccine recipients with microchips have all gone viral across every major social media platform.
Conspiracy theories are powerful because, unlike science, they are uncomplicated and explain a chaotic world in a way that gives the believer a feeling of control. When those theories become embedded, Starbird said, they can be difficult to refute with evidence.
“You think, ‘I’ll just show this person one piece of information that’s so clear, then that will make their house of cards fall down,'” Starbird said, referring specifically to anti-vaccination activists. “But no, they just build another wing that supports it from the back side.”
Still, Starbird said, disasters can bring unity. Whether the pandemic will do so remains to be seen.
“Coming together to deal with COVID could be a bit of a healing moment,” she said. “But the forces that have been pulling us apart are so active right now. With the protests and people congregating when they’ve been asked not to, and the rhetoric, you can see people resorting back to only picking the information that aligns with their political views. And the same kinds of swindlers are in the space trying to take advantage of that to sell their ads and their nutritional supplements.”
“I hope that we’re better than that. I hope that we can do it,” she added. “I’m hopeful but maybe not very confident.”