Students have difficulties identifying credible online sources
Students have difficulty identifying credible sources on the internet, new data shows. A report released by the Stanford History Education Group found that students struggle to distinguish credible sources from advertisements and to determine where information originated.
The researchers started their work in January 2015 by tackling the question of civic online reasoning. They sought to assess how students distinguish between credible and false information online, and to find ways to give students the training needed to tell reliable sources from unreliable ones.
SHEG has previously created social studies programs to teach students how to evaluate sources. Its curriculum is used by several school districts and has been downloaded 3.5 million times.
“Many of the materials on web credibility were state of the art in 1999,” said Joel Breakstone, the director of SHEG. “Schools are stuck in the past.”
The new report was split into three phases and dealt more with social media and news literacy. According to the report, entitled “Evaluating Information: The Cornerstone of Civic Online Reasoning,” the first phase of the 18-month-long project borrowed elements of “design thinking,” which follows a sequence of prototyping, user testing and revision to ensure a continuous cycle of improvement. In this “Prototyping Assessments” phase, students reviewed the website MinimumWage.com to see if they could discern that it was a front group for a D.C. lobbyist. Only 9 percent of high school students in an Advanced Placement history course were able to parse the website’s language and realize that it was an untrustworthy source.
In the second phase of the project, the researchers refined the assessments extensively. Aside from revising the exercises up to six times, they also asked students to verbalize their thought process while completing the assigned tasks. This allowed them to consider what is known as cognitive validity, or the relationship between what an assessment seeks to measure and what it actually does measure.
In the last phase, the researchers drew on their extensive teacher network to reach a large student base. With help from educators in Los Angeles and elsewhere, SHEG was able to consult with teachers about its exercises and collect thousands of student responses.
At the end of its research, SHEG had designed, piloted and validated 15 assessments: five at the middle school level, five at the high school level and five at the college level. At the middle school level, where “online assessment is in its infancy,” the researchers used screenshots of Slate’s webpage to assess students’ ability to discern between news items and advertisements. In similar exercises, the SHEG researchers used screenshots of tweets, Facebook posts and a reproduction of CNN’s website. These screenshots were printed out on paper and given to students to assess with a pencil. This approach was taken in the hope that the assessment could be used in under-resourced schools, where taking assessments online is usually not possible.
At the high school level, students were given more difficult tasks that required them to reason through multiple sources. These tasks included comparing posts from a newspaper’s comment section, identifying the blue checkmark that distinguishes verified Facebook accounts from fakes, and judging whether a news story or a sponsored post was more reliable. At the college level, the assessments were administered online, with exercises that had students evaluate the trustworthiness of a website, conduct online research about a controversial topic and identify whether a partisan website was trustworthy. The assessments could be used across grade levels, matched to students’ proficiency rather than their level of education.
Although the work for this report began well before the presidential race and election, the researchers expressed concern that democracy was threatened by the ease with which false information spreads and flourishes. In one particular exercise, high school students were asked to evaluate two Facebook posts announcing Donald Trump’s candidacy for president.
One of the posts was from a verified Fox News account, while the other was from an account that resembled Fox News in appearance but was not the same. Only 25 percent of students were able to identify the verified account by its blue checkmark. Additionally, 30 percent of students argued that the fake account was more credible because it included more details.
“This finding indicates that students may focus more on the content of social media posts than on their sources,” the authors wrote. “Despite their fluency with social media, many students are unaware of basic conventions for indicating verified digital information.”
According to Sam Wineburg, the study’s lead author and the founder of SHEG, the next step after the research is to instruct teachers on how to gauge their students’ understanding and then adjust lessons accordingly. Wineburg also hopes to develop new curricula to teach students how to identify credible sources; the SHEG team has already begun piloting lesson plans in local high schools.