Research from the Stanford History Education Group shows how easily young people are deceived by information on the internet - and what schools can do about it.
Soon after the 2016 presidential election, as debates raged over "fake news" and its influence on the outcome, a landmark report from researchers at Stanford Graduate School of Education (GSE) provided sobering evidence of just how easily young people are duped by information online. The study, by the Stanford History Education Group (SHEG), found that middle and high school students overwhelmingly failed to demonstrate the skills necessary to distinguish credible sources from unreliable ones.
(Image credit: Getty Images)
Since the release of that report, policymakers and educators have introduced a wave of initiatives aimed at equipping students with stronger digital literacy skills. But as the 2020 election approaches and many of those students become first-time voters, SHEG researchers have found few signs of progress - and the consequences are dire, said Sam Wineburg, the Margaret Jacks Professor of Education, who co-founded SHEG in 2002.
"Our democracy depends on access to reliable information," he said. "And the internet is increasingly where we go to look for it."
Last year SHEG released Civic Online Reasoning, a free curriculum for educators to instill strategies for evaluating the trustworthiness of online information. More recently, Wineburg and SHEG director Joel Breakstone, PhD ’13, joined with colleagues at SHEG and faculty at MIT to develop a free course on how to teach these skills, which launched this fall. Wineburg will also share research and tools from SHEG during a virtual talk open to the public on Oct. 22.
Here, Wineburg and Breakstone talk about the state of digital literacy among future voters, two simple practices they’ve identified for detecting questionable information, and how to help young people become more discerning as the 2020 election draws near.
With so much attention to the problem of "fake news" since the 2016 election, have you seen a change in how young people approach information online?
Wineburg: We’re still very much seeing students struggle to make sense of the information they encounter. In 2019 we released the most extensive study to date on how young people go about trying to verify a claim on social media or the internet, based on research with more than 3,000 high school students matching the demographic profile of students across the United States.
What implications could this time of remote learning have on digital literacy skills?
Breakstone: Students are online all the time now, and they’re being confronted with even more content from questionable sources. So the need to prepare students to navigate this setting is more important than ever. But it also presents opportunities for innovation. Teachers are trying out different possibilities with students online.
We’re currently doing an analysis of a fully online intervention in a college setting, where students watched videos we’ve developed and completed activities from the curriculum we created. Our initial analysis suggests that students’ skills improved. But there needs to be a much more comprehensive effort if we’re going to make a dent in this problem.
Wineburg: We’ve found that even a modest investment of time can bear fruit. In one study with our curriculum in high school classrooms, six hour-long lessons over a 10-week period moved the needle.
There’s no question that we have a tremendous amount of work to do. But we’re optimistic that, with enough will on the part of educators and a strong curriculum that’s integrated into the school day, we can see an impact. Listen, misinformation and disinformation are polluting the information stream. If we can’t find a way to upgrade the skills of ordinary citizens - and fast - democracy itself will be the casualty.