Misinformation is a Problem of Supply and Demand
By Dan Gillmor
In today’s misinformation crisis, we’re all but ignoring half of the answer: the demand side of the supply-demand equation. The expanding supply of bogus “news,” propaganda, and other kinds of deceit that pollute the information ecosystem needs to be balanced by more demand for something better. That is, demand from people who know how to tell the difference between what’s real and what’s bogus, who share responsibly, and more.
It’s now been more than two years since the nation belatedly realized how pernicious this situation had become, in the wake of a 2016 election that was marked by trolls, disinformation, and other manipulation.
Since then, however, most of the conversation and resources have been devoted to the supply side. For example, the expanding world of (after-the-)fact checking has focused on improving the quality of what we read, watch, listen to, and otherwise encounter in a world of seemingly endless digital information, and on preventing the purveyors of misinformation from conning a gullible public.
One kind of misinformation – anti-vaccine propaganda – demonstrates how bad this can be. Outbreaks of preventable diseases are becoming more common, to the extent that the World Health Organization considers what it politely calls “vaccine hesitancy” to be one of the top threats to global health.
No one can (or at least should) argue against upgrading the quality of our supply. This is especially so when it comes to journalism, which has failed badly in recent years. Overall, most journalists still haven’t demonstrated real appreciation of the emergency we face. Yet how we upgrade supply is not as obvious as the need to do it.
What should be equally obvious, though it is not, is that we need to upgrade ourselves. All of us. You, me, families, friends, our colleagues. Everyone. We all suffer from confirmation bias; people often believe incorrect information even more strongly after being presented with facts to the contrary. But once we understand our tendency toward confirmation bias, we can challenge ourselves to recognize it. We are not hard-wired to exist in filter bubbles of our own creation; we can train ourselves to seek out information that challenges our assumptions.
Upgrading demand means many things. Key among them is a smarter and deeper public understanding of how media work. It means helping people learn that skepticism must be paired with judgment about what to trust. It means improving our skills at sorting truth from falsehood. And it means fostering a culture of “slow news” consumption, sharing, and creation that has integrity at its heart.
The demand side of the information marketplace has been called “media literacy,” in its broadest sense. The National Association for Media Literacy Education defines media literacy as “the ability to access, analyze, evaluate, create, and act on media messages using all forms of communication.” (Analyze plus evaluate equals “understand.”)
A subset of media literacy, news literacy, applies these skills to the news ecosystem, where misinformation about civic topics has become so endemic, from climate-change denial to constant lies from the president and his followers in politics and media, to so much more. I’ve come to prefer a different way of describing the needed collection of skills and attitudes that’s free of potential “you’re not literate” condescension: “news fluency.”
Critical thinking is at the heart of improving demand, but by itself it is not nearly enough. As Data &amp; Society and others have observed, if we simply teach people to distrust everything, we can play into the hands of deceitful people who tell compelling tales, however false they may be. We can’t just question bad information: we have to show people how to find good information. Judgment and other skills have to be part of the process. We need more development of these skills at all levels of society, in all age groups (though recent studies suggest that older, more conservative people share the most misinformation). Moreover, we need to do all of this at scale.
Three of our most powerful institutions can play indispensable roles in making it happen. Two have been with us for a long time: education and media. One is relatively new: technology platforms.
How the educational system can help
This is the place where some limited effort has already taken place. I say “limited” because, despite the admirable efforts of the media and news literacy communities (this project, spearheaded by Mike Caulfield, is super-promising), they haven’t made much of a dent yet.
Academic research, such as this paper, has described some of the many barriers, but I’d add one more: Since the heart of this work is instilling critical thinking, that makes it a nonstarter in communities and school districts where critical thinking emerges in the form of science “skepticism,” like climate change denial or the anti-vaccination movement.
The News Literacy Project works to bring these skills to high-school students. It labored without enough notice for a decade, but more recently has received more attention (including from journalists) and well-deserved financial support from donors, including Facebook. Its “Checkology” online course gives it a much greater potential to scale than the project’s original method of bringing journalists into classrooms on a one-off basis.
At the college level, Stony Brook University’s Center for News Literacy has been a leader, and its course materials are used at some other universities. My own work, at the Arizona State University News Co/Lab, is a project of the university’s Walter Cronkite School of Journalism and Mass Communication. We teach an online course in digital media literacy that reaches between 100 and 200 students at a time. Several years ago, we offered a massive open online course (MOOC) that thousands of people registered for and participated in. This was useful but, let’s be clear, not the kind of scale we should want. These principles need to reach millions, not thousands. The MOOC, like my book Mediactive, on which the course and MOOC are based, was published under a Creative Commons license. As a result, all of the course materials are available for download and remixing. [Editor’s Note: Defusing Disinfo contributor Claire Wardle will be teaching a MOOC on navigating misinformation online in April.]
The only way to reach real scale in traditional education is through mandates. It’s safe to assume that our bitterly polarized U.S. Congress won’t intervene anytime soon, in part because education is largely controlled by state and local governments.
California, the nation’s largest state, enacted a law in 2018 that purports to bring serious media literacy into classrooms. But Sam Wineburg, the Stanford researcher whose work helped prompt the law, is critical of the way it came out.
“I share legislators’ view that we need to do something,” he wrote. “What worries me is that the solutions they propose are more likely to exacerbate the problem than solve it.”
Meanwhile, we run into another issue: a slew of “literacies” that overlap and whose proponents tend to compete, not collaborate. Eric Newton, my News Co/Lab co-founder, likens them to Russian nesting dolls:
“The outermost, largest doll would be basic literacy — the ability to understand anything represented by any kind of symbol. Directly inside would be information literacy, which lays claim to all forms of information, followed by digital literacy, the fast-rising newcomer, as there is now more digital content than any other kind. Inside that comes media literacy, followed by the news literacy one needs to comprehend non-fiction, current affairs media. In the middle are topic-specific literacies like civic and health literacy, payoffs to those savvy enough to find them.”
Those of us in this arena need to work together, as Eric says, to “distill the key fundamentals of finding, understanding, acting on and creating news and information in the digital age.” And we need to do it now.
A few states have mandated a modern version of what we used to call “civics” training in classrooms, one component of which is critical thinking and, in effect, media literacy. But they are the exceptions, not the rule.
Even if Congress did try to mandate these literacies, let’s note that this White House is controlled by supremely deceitful people who despise journalism itself. Their increasingly strident efforts to overturn the very concept of truth, in a nakedly partisan manner, have contributed to the pernicious reality that people who hate the news media are more likely to be fooled by a fake headline. That’s just one key finding in a study the News Co/Lab, along with our research collaborators at the University of Texas’ Center for Media Engagement, published last fall (full PDF here).
The notion of any federal action to push these skills, given the deceitful nature of the Trump administration, appears absurd, at least for the immediate future.
There is, however, some progress on the international front. Germany, for example, requires young people to become digitally literate. That hasn’t stopped growing political polarization and trolling by misinformation experts, however. This is a universal problem.
Before I turn to media organizations’ vital role in making this situation better, I do want to flag several resources we’ve included in our digital media literacy course.
One is from the Media Education Lab, founded by one of the field’s more thoughtful pioneers, Renee Hobbs. It’s called “Mind Over Media: Analyzing Contemporary Propaganda,” and it offers powerful hands-on learning for people of any age.
The other is based on the important understanding that literate people create media, not just consume it. We require students to edit Wikipedia, using the resources of Wiki Education, a nonprofit spun off from the Wikimedia Foundation, which hosts Wikipedia. For students this has been revelatory, as they realize both the power of Wikipedia and their own power to be part of a resource that keeps growing and, on balance, getting better.
While education in this arena has been mostly a collection of one-offs in individual classrooms and programs, we’re learning how to make it a broad-based affair. We’re developing digital tools for teaching and learning inside and outside classrooms. We can push for fundamental literacies to be embedded, and taught, in all kinds of subject areas and at all ages. We can achieve scale, with enough resources and will.
How media companies can play a role
The second institution that can bring scale to news literacy is the media itself, from news organizations to entertainment companies to marketing, advertising, and public relations firms.
Not nearly enough is happening in any of those arenas, but there is a heartening development in the journalism field. Journalists should have made news literacy a key part of their core missions a long time ago, beyond providing news and information. They didn’t. I could speculate about the reasons, but the key one has to be a culturally ingrained belief, which persists in many places, that “our journalism speaks for itself,” to use an expression I’ve heard many times.
That arrogance is lessening, I’m glad to say, and in large part you can thank the unraveling of the monopoly business model. An overdue shift toward a genuine conversation with audiences for journalism may be partly a self-defense measure in this era when journalism and reality are themselves under attack. We at the News Co/Lab are fortunate to be part of that change.
Journalists are discovering and, in some cases, re-discovering the new best practices of the digital age:
- understand both what a community knows about journalism and how it works, and what journalists understand about a community’s attitudes toward news and news organizations
- become much more transparent about who they are, what they do, how they do it, and why they do it; and
- engage in deep conversations and collaborations with the people and institutions in the community, such as libraries and schools.
Until very recently, journalistic transparency has been rare, and deep engagement only slightly less so. Now, though, we’re seeing a notable increase in both of these strategies.
This isn’t about the expanding “fact-checking” movement, which aims to directly help truth compete with lies. Rather, we see transparency as a way to increase public understanding of how news works, and thereby help rebuild some of the trust that journalists have lost in recent decades. Engagement plays a parallel role, as communities help drive the journalism and even participate in it.
Our lab has compiled what we consider “Best Practices” in these areas. Examples include the Toronto Star’s major push toward transparency, Hearken’s work with news organizations to make audiences part of the journalism, and many more.
The Best Practices have been a baseline in our pilot project with newsrooms in four U.S. cities. Three are part of the McClatchy media company. So far, we’ve seen a remarkable change in newsroom cultures as they’ve embedded both transparency and engagement into the journalism. We have a long way to go, but we are far from alone in this movement and convinced this is the right direction.
Other kinds of media could, and should, do other kinds of things to improve the public’s understanding of how to discern truth from lies.
Imagine the good the “persuasion industry,” as I tend to think of advertising and PR companies, could do if it devoted some of its time, talent, and money to this cause. The entertainment industry could also work wonders with its platforms. I wish they’d all come together on a public campaign whose message was this: “Sharing bullshit isn’t cool. It’s just you being conned by experts in deceit, so don’t let them con you.”
Reforming social media platforms
The third arena in which we could make enormous progress is the one that almost defines scale in the modern world: technology platforms.
Here, too, we can report modest progress, with nearly boundless opportunities – if the platforms care to take more responsibility. At the moment, they’re being asked to take responsibility in ways that could lead in precisely the wrong direction, by people I call the “do something” brigades.
The demands come from people whose motivations I don’t question. They demand: Do something about Russian trolls! Do something about extremist posts and videos! Do something about malevolent bots!
But do what, exactly? If we insist that the platforms control content in a granular way, we are asking them to effectively become the editors of the Internet, the arbiters of what is true and false. Is that really what we want? Not me: if the platforms choose who’s allowed to speak, they may someday block the speech of people we like, or even our own speech. And governments will surely start forcing them to do so. In some countries, they already are.
Nor, however, do I want the platforms to use opaque algorithms to make decisions for us, which is what they do now. At some level, they already are the editors of the Internet.
So how can the platforms help best? They can offer more support on the demand side. To that end, they’ve been adding a few user-experience tweaks to their platforms, such as fact-checking links to some articles that show up in feeds and search results.
The platforms have also provided modest financial support to some organizations working in media/news literacy, including a Facebook Journalism Project donation in 2017 to the News Co/Lab. That support is helpful, but a drop in the bucket considering the scope of the problem.
I’ve asked the platforms to go far beyond these initial efforts in a number of ways. One is a huge ask: to change their products in ways that cede some control to the users of these services, so that we can make better decisions about our online experiences and information needs.
Facebook, Google, Twitter, et al. should make two major changes.
The first is dashboards so that users can much more fully control their own information feeds. Let us decide whether to filter our feeds, or not. If we do, let us decide how we want to make the filters work. Give us buttons that pop our personal filter bubbles, exposing us to ideas we’re bound to disagree with. Let communities of users build filters together.
The second change we need from the platforms is to revise how they create and manage the tools programmers use. Those tools have helped advertisers and sleazy propagandists to manipulate our data and violate our privacy in the past. They need to be aimed at helping us, individually and in communities, to control our experiences for our own benefit.
Again, the misuse of such capabilities is what led to many of the misinformation problems of the past few years. Doing it right, to help us reduce pollution rather than letting polluters poison us, should be the goal.
Both of these platform changes could disrupt their business models, which rely on getting people to stay inside the walled gardens and keep clicking and posting. So, don’t hold your breath.
But I am convinced that, done right, these changes could have a positive impact, even on the bottom line, in the long run. That’s because I believe the companies would engender more trust at a time when so many people hold them in contempt – and when the calls for regulation grow louder every day. We need their help, and I’m hoping they’ll understand why they should provide it.
In the end, we need everyone’s help on the demand side: yours, mine, teachers, librarians, journalists, technologists. Without improving demand, we can’t hope to fix the supply.
Dan Gillmor (@DanGillmor) is the co-founder of the News Co/Lab at Arizona State University’s Walter Cronkite School of Journalism and Mass Communication; a veteran journalist who has contributed to Future Tense, Backchannel, and Medium; and the author of two books, Mediactive and We the Media: Grassroots Journalism by the People, for the People.
[Image Credit: “Reading Room,” by Susan Golding]