Teaching students to sort fact from fiction, op-ed from advertorial, is crucial. Mike Caulfield, a research scientist at the University of Washington’s Center for an Informed Public, and Sam Wineburg, a co-founder of the Digital Inquiry Group and professor emeritus at Stanford University, have been testing strategies for more than a decade, and they’ve identified effective methods that even middle schoolers can learn. Here, we share an excerpt from their 2023 book, Verified, and links to their free resources for educators.
–EDITORS
The video is shocking. Two women approach a historic painting in London’s National Gallery and seemingly destroy it. As orange goop streaks down the painting, they read a statement about climate change and glue themselves to the gallery wall. Tweeting the video to his 18,000 followers, British journalist Damien Gayle reveals that the activists “have thrown tomato soup”1 on Vincent van Gogh’s beloved Sunflowers. Within 24 hours, the video racks up 40 million views.
Reaction was swift. For a rare moment, the political left and the political right found common cause. Destruction of art, as one tweeter summarized, “represents a repudiation of civilisation and the achievements of humanity.”2 The sentiment received about 10,000 retweets. Replies and retweets advocated long prison sentences for the women or, van Gogh–like, cutting off their ears.
All this outrage and concern missed a crucial fact: Sunflowers was behind glass.
Apart from minimal damage to the frame, the painting emerged unscathed. The soup had splashed harmlessly on the painting’s protective case—a fact the protesters knew, many bystanders knew, and the gallery knew.
Skillfully navigating the internet requires conceptions of critical thinking tailored to a digital environment. Over more than a decade of research, we have worked to distill a small set of flexible techniques that allow users to resolve easy questions quickly and inform difficult ones in not much more time.
Along with teams of fellow researchers, we field-tested our approach with students in middle school, high school, and college, as well as with adults, in the United States, Canada, Sweden, and the United Kingdom. To date, 13 separate studies involving nearly 10,000 participants have shown the effectiveness of our approach in helping people make better choices online.3 And in one of the most recent studies, students showed a sixfold increase in use of fact-checking techniques and a fivefold increase in citations of appropriate context after only seven hours of instruction.4 Today, Mike’s SIFT methods (discussed below) have become a mainstay of information literacy workshops, while the Digital Inquiry Group’s “Civic Online Reasoning” curriculum developed at Stanford is used in high schools and universities all over the country.5 While we can only share a handful of methods here, our instructional materials are available for free online (see “Free Resources for Educators”).
The Three Contexts
Over the past decade, we’ve looked at how students and the general public judge information that reaches them through the web. Our first finding will not surprise you, given both the state of the world and your social media feeds: on the internet, people reason quite poorly.6
The second finding is more surprising. Many so-called experts assumed that the mistakes people made on the web resulted from a lack of critical thinking. But when we looked at why people got confused, we found it wasn’t their thinking that malfunctioned. In fact, people were thinking quite hard, asking themselves: Does this seem plausible? Does this match how I think the world works?
Those are not bad questions. However, in areas where people have little expertise, or lack direct experience, asking such questions isn’t the first order of business. The first task when confronted with the unfamiliar is not analysis. It is the gathering of context. Let’s consider three crucial contexts that ground reasoning on the web and elsewhere:
- The context of the source. What’s the reputation of the source of information that you arrive at, whether through a social feed, a shared link, or a Google search result?
- The context of the claim. What have others said about the claim? If it’s a story, what’s the larger story? If a statistic, what’s the larger context?
- Finally, the context of you. What is your level of expertise in the area? What is your interest in the claim? What makes such a claim or source compelling to you, and what could change that?7
We found that when students attended to these contexts, spending as little as 30 seconds reflecting and seeking basic information on the web, something stunning happened. Supposedly weak “critical thinkers” became strong critical thinkers, without any additional training in logic or analysis. They made better decisions, leaned less on faulty presuppositions, and were fooled less by deceptive appearances and dirty tricks. They often showed greater nuance and stronger logical argument.
How could 30 seconds of simple web techniques, applied consistently, result in such a transformation? People who don’t seek context still devote a lot of thought to an issue, but their thinking is not much more advanced than playground speculation.
We see this in the Sunflowers story. While mistakes initially appear to be related to a deficit of thinking, they are much more related to a lack of doing. Once students engaged in context-seeking, the thinking often sorted itself out. With a dose of context, they were able to solve simple problems quickly, and they grounded deeper investigations in better sources, assumptions, and data.
Introducing SIFT
It’s one thing to know the context you need. It’s another to build a habit of seeking it out. In our scholarly work, we’ve often pretested participants on examples, telling them that they should assess the credibility of a claim by whatever means might help, including leaving a site and searching the web. We’ve gotten a variety of responses. Some participants remained glued to the original website, descending into “plausibility analysis”—that is, given what I know, do I think this thing is likely to have happened? Does it “sound” right? Never mind that the issue at hand is about cell biology and one’s experience consists solely of having watched the TV show House, M.D.
More perplexing is the response of a smaller number of participants, often formulated as “I’d have to know.” For example, a student might say, “To know if this was credible, I’d have to know more about the author.” That’s true, of course, but misses the bigger question: What’s stopping the student from finding out more? It’s a simple act of opening another tab and starting the search. Still, for some students, the gap between thought and action proves too large.
For this reason, the main tool we give students is formulated not as a set of questions to ask but as a set of things to do before they start reasoning about a specific piece of content that reaches them through the web. We’ve put them in an easy-to-remember acronym: SIFT.
- Stop. Ask yourself what you really know about the claim and the source that’s sharing it. For the moment, forget about questions of truth or falsehood. Do you really know what you’re looking at? Are you sure? If you find it upsetting or surprising, why?
- Investigate the source. Do a quick check to see if the source is trustworthy for this purpose. In a lot of cases, for simple claims, you can stop here if the source is good.
- Find other coverage. Whether you are looking at a news report or a research claim, take a second to zoom out and see what other sources say. Engage in “lateral reading” by opening up new tabs in your browser and using the internet to check the internet.8 If the story or claim is not being picked up by other reputable sources, proceed cautiously.
- Trace the claim, quote, or media to the original context. Sometimes the first source you encountered isn’t great, but it links to where it got its information. Go to that original source and judge (a) whether it’s reputable, and (b) whether it actually supports the assertion.
We can’t promise that if you follow this advice, you’ll never again forward a celebrity death hoax or cite a sketchy publication in your health policy paper. What we can guarantee is that those errors will be fewer and farther between. Just as important, you’ll become more confident sharing things that matter to you.
Takeaways
- When we encounter something online, our first question shouldn’t be “Is this true?” but rather “Do we know what we’re looking at?”
- Knowing what we’re looking at requires getting quick context, which usually means leaving the original source and reading laterally.
- SIFT is a way to help you get the sort of quick context that is essential to knowing what you’re looking at.
- Practicing the moves of SIFT can help you answer simple questions quickly and ground your understanding before moving on to more nuanced investigations.
Endnotes
1. Damien Gayle (@damiengayle), “Activists with @JustStop_Oil have thrown tomato soup on Van Gogh’s Sunflowers at the National Gallery and glued themselves to the wall,” Twitter (now X), October 14, 2022, 6:12 am, twitter.com/damiengayle/status/1580864210741133312.
2. Andrew Doyle (@andrewdoyle_com), “Activists vandalise Vincent van Gogh’s Sunflowers at the National Gallery. The vandalism or destruction of art is always an authoritarian act. But more than that - it represents a repudiation of civilisation and the achievements of humanity,” Twitter (now X), October 14, 2022, 6:46 am, twitter.com/andrewdoyle_com/status/1580872772590239746.
3. See C.-A. Axelsson, M. Guath, and T. Nygren, “Learning How to Separate Fake from Real News: Scalable Digital Tutorials Promoting Students’ Civic Online Reasoning,” Future Internet 13, no. 3 (2021): 60–78; J. Brodsky et al., “Associations Between Online Instruction in Lateral Reading Strategies and Fact-Checking COVID-19 News Among College Students,” AERA Open 7, no. 1 (2021): 1–17; J. Brodsky et al., “Fact-Checking Instruction Strengthens the Association Between Attitudes and Use of Lateral Reading Strategies in College Students,” Proceedings of the Annual Meeting of the Cognitive Science Society 44 (2022); J. Brodsky et al., “Improving College Students’ Fact-Checking Strategies Through Lateral Reading Instruction in a General Education Civics Course,” Cognitive Research: Principles and Implications 6, no. 23 (2021): 1–18; J. Breakstone et al., “Lateral Reading: College Students Learn to Critically Evaluate Internet Sources in an Online Course,” Harvard Kennedy School Misinformation Review 2, no. 1 (February 23, 2021); A. Kohnen, G. Mertens, and S. Boehm, “Can Middle Schoolers Learn to Read the Web like Experts? Possibilities and Limits of a Strategy-Based Intervention,” Journal of Media Literacy Education 12, no. 2 (2020): 64–79; S. McGrew, “Learning to Evaluate: An Intervention in Civic Online Reasoning,” Computers & Education 145 (2020): 1–13; S. McGrew et al., “Improving University Students’ Web Savvy: An Intervention Study,” British Journal of Educational Psychology 89 (2019): 485–500; F. Panizza et al., “Lateral Reading and Monetary Incentives to Spot Disinformation About Science,” Scientific Reports 12 (2022): 5678; D. Pavlounis et al., The Digital Media Literacy Gap: How to Build Widespread Resilience to False and Misleading Information Using Evidence-Based Classroom Tools (Toronto: CIVIX Canada, November 2021), ctrl-f.ca/en/the-evidence; L. Weisberg, A. Kohnen, and K. Dawson, “Impacts of a Digital Literacy Intervention on Preservice Teachers’ Civic Online Reasoning Abilities, Strategies, and Perceptions,” Journal of Technology and Teacher Education 30, no. 1 (2022): 73–98; S. Wineburg et al., “Lateral Reading on the Open Internet: A District-Wide Field Study in High School Government Classes,” Journal of Educational Psychology 114, no. 5 (2022): 893–909; and S. McGrew and J. Breakstone, “Civic Online Reasoning Across the Curriculum: Developing and Testing the Efficacy of Digital Literacy Lessons,” Stanford Digital Repository, July 28, 2022, doi.org/10.25740/dd707pp9195.
4. Pavlounis et al., The Digital Media Literacy Gap.
5. M. Caulfield, Web Literacy for Student Fact-Checkers (Montreal: Pressbooks, 2017), webliteracy.pressbooks.com; J. Breakstone et al., “Students’ Civic Online Reasoning: A National Portrait,” Educational Researcher 50, no. 8 (2021): 505–15; and S. McGrew et al., “The Challenge That’s Bigger Than Fake News: Civic Reasoning in a Social Media Environment,” American Educator 41, no. 3 (Fall 2017): 4–9.
6. Here’s one of many examples: Between 2019 and 2020, Sam’s research group surveyed 3,446 high school students who had a live internet connection and who evaluated a series of websites. When asked to investigate a site claiming to “disseminate factual reports” on climate science, 96 percent of the students never learned about the organization’s ties to the fossil fuel industry. Two-thirds were unable to distinguish news stories from ads on a popular website’s home page. More than half believed that an anonymously posted Facebook video, shot in Russia, provided “strong evidence” of US voter fraud. See Breakstone et al., “Students’ Civic Online Reasoning: A National Portrait.”
7. This discussion of context is an amalgam of separate work Mike and Sam have done over the years. It corresponds most closely to “stop” in SIFT and the concepts of “footing” and “lateral reading” in the Digital Inquiry Group’s “Civic Online Reasoning” curriculum. See Caulfield, Web Literacy for Student Fact-Checkers; and Wineburg et al., “Lateral Reading on the Open Internet.”
8. Digital Inquiry Group, “Sort Fact from Fiction Online with Lateral Reading,” YouTube, January 16, 2020, youtube.com/watch?v=SHNprb2hgzU.
[Illustrations by Egle Plytnikaite]