This link has been bookmarked by 174 people. It was first bookmarked on 22 Nov 2016 by someone privately.
-
07 Nov 17 bbrichacek
Stanford History Education Group
-
17 Oct 17 Bill Hemmig
"STANFORD HISTORY EDUCATION GROUP
EVALUATING EVIDENCE
Given the vast amount of information available online, students need to be able to distinguish between legitimate and dubious sources. Students need to ask a basic question: Where did this document I’m looking at come from? This task assesses whether students will stop to ask this question when confronted with a vivid photograph. Students are presented with a post from Imgur, a photo sharing website, which includes a picture of daisies along with the claim that the flowers have “nuclear birth defects” from Japan’s Fukushima Daiichi nuclear disaster.
Although the image is compelling and tempting to accept at face value, successful students will argue that the photograph does not provide strong evidence about conditions near the nuclear power plant. Students may question the source of the post, arguing that we know nothing about the credentials of the person who posted this photo (especially since it appears on a site where anyone can upload a photo). Alternatively, students may point out that the post provides no proof that the picture was taken near the power plant or that nuclear radiation caused the daisies’ unusual growth.
Various drafts of this task were piloted with 454 high school students. The final version was given to 170 high school students. By and large, students across grade levels were captivated by the photograph and relied on it to evaluate the trustworthiness of the post. They ignored key details, such as the source of the photo. Less than 20% of students constructed “Mastery” responses, or responses that questioned the source of the post or the source of the photo. On the other hand, nearly 40% of students argued that the post provided strong evidence…" -
17 Aug 17 Amanda Lucas
Evaluating information: The cornerstone of civic online reasoning
information literacy news literacy fake news research stanford evaluating sources media literacy
-
19 Jul 17 Tim Venchus
A study from the Stanford History Education Group that evaluated student online reasoning / information literacy. Good example activities and sample responses for assessing information reasoning.
education information literacy media literacy Stanford study digital citizenship activities
-
Geoff Hinman
Stanford Executive Summary
digital literacy web evaluation information_literacy evaluation information literacy research media literacy digitalcitizenship fake news
-
26 Mar 17 Kate Herzog
EVALUATING INFORMATION: THE CORNERSTONE OF CIVIC ONLINE REASONING
EXECUTIVE SUMMARY
STANFORD HISTORY EDUCATION GROUP
PRODUCED WITH THE SUPPORT OF THE ROBERT R. McCORMICK FOUNDATION -
07 Mar 17 gisellemk
Research from Stanford confirming that we need to help students think critically: students do not have the skills to discern what is true and what is not.
-
08 Feb 17 centregleanings
the Stanford History Education Group has prototyped, field tested, and validated a bank of assessments that tap civic online reasoning—the ability to judge the credibility of information that floods young people’s smartphones, tablets, and computers
information literacy stanford research report education digital literacy fake news critical thinking pdf
-
Shawn McCusker
Gr8 starter kit for MS/HS T's: literacy skills in time of fake news via @Stanford History Group HT @ShawnMcCusker https://t.co/jbhC5qEk8K
-
Adrienne Michetti
If UR wondering, "Should I be teaching my students how to detect false news?" Here you go: https://t.co/3cvH9dhfCJ #research #medialiteracy
-
17 Jan 17 Lew Douglas
Stanford Study: Teens can't distinguish true from false on the internet.
-
03 Jan 17 Jill Bergeron
This curriculum offers a way to help students evaluate information.
digitalcitizenship media literacy stanford digital citizenship
-
Deborah Healey
A well-designed study of middle school, high school, and college-age US students showed they're not digitally literate: most are unable to tell an ad from an article or assess the reliability of a tweet.
-
manuelm ch
Teaching critical reading: distinguishing ads from news, authentic profiles from fake ones on FB, doctored photos: https://t.co/EaTEBxx73I
-
anita z boudreau
Evaluating Information: The cornerstone of civic online reasoning
digitalcitizenship media literacy fakenews InformationLiteracy stanford
-
Jelmer Evers
Evaluating Information: The Cornerstone of Civic Online Reasoning
November 22, 2016 https://t.co/dNpR8nkS3R -
16 Dec 16 susan650
exec summary
research media literacy information literacy fakenews digitalcitizenship stanford hip1
-
Martha Hickson
Evaluating information: The cornerstone of civic online reasoning
Stanford History Education Group -
eric pichon
"A study from Stanford University investigated the ability of students to judge the credibility of information by testing 7,804 students from 12 states. More than 80% of students believed that a native advertisement, identified by the words “sponsored content”, was a real news story!" http://www.itecnet.ep.parl.union.eu/itecnet/cms/homepage/news_fake_news
-
09 Dec 16 soberle
Over the last year and a half, the Stanford History Education Group has prototyped, field tested, and validated a bank of assessments that tap civic online reasoning—the ability to judge the credibility of information that floods young people’s smartphones, tablets, and computers.
Between January 2015 and June 2016, we administered 56 tasks to students across 12 states. In total, we collected and analyzed 7,804 student responses. Our sites for field testing included under-resourced, inner-city schools in Los Angeles and well-resourced schools in suburbs outside of Minneapolis. Our college assessments, which focused on open web searches, were administered online at six different universities that ranged from Stanford, an institution that rejects 94% of its applicants, to large state universities that admit the majority of students who apply.
In what follows, we provide an overview of what we learned and sketch paths our future work might take. We end by providing samples of our assessments of civic online reasoning.
credibility fake_news digital_citizenship digital_literacy digital_media
-
Coleen Fillion
Stanford study about how people evaluate tweets for authenticity.
See also
https://evonews.com/tech-science/2016/nov/24/the-impact-of-social-media-on-young-minds-students-are-unable-to-tell-fake-news-from-real-news-these-days/ -