This link has been bookmarked by 117 people. It was first bookmarked on 13 Mar 2017 by contentcommando.
-
03 Mar 21
-
17 Dec 20
-
16 Sep 20
Joseph Kraus
"This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as “motivated reasoning.” Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs."
-
11 Sep 20
-
30 Mar 20
M
Says what?
Excerpt: The facts on why facts alone can’t fight false beliefs. There are facts, and there are beliefs, and there are things you want so badly to believe that they become as facts to you. // “For desired conclusions,” he writes, “it is as if we ask ourselves ‘Can I believe this?’, but for unpalatable conclusions we ask, ‘Must I believe this?’” People come to some information seeking permission to believe, and to other information looking for escape routes. // What’s more, being intelligent and informed can often make the problem worse. // Fact-checking erroneous statements made by politicians or cranks may also be ineffective. Nyhan’s work has shown that correcting people’s misperceptions often doesn’t work, and worse, sometimes it creates a backfire effect, making people endorse their misperceptions even more strongly. // “Personally my own theory is that there’s a slide that happens,” McIntyre says. “This is why we need to teach critical thinking, and this is why we need to push back against false beliefs, because there are some people who are still redeemable, who haven’t made that full slide into denialism yet.
critical thinking Siddha Yoga cults Dorothy Martin prophets motivated reasoning selective exposure identity ego willful ignorance ignorance social phenomenons politics polarisation tribalism fake news Donald J Trump Hillary Clinton dissonance pic doomsday
-
17 Jul 19
-
She’s totally lying
-
And that doesn’t matter
-
discomfort of simultaneously holding two thoughts that are in conflict
-
cognitive dissonance
-
believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood
-
no spacemen (and no flood) ever came
-
kept revising her predictions.
-
watched with fascination as the believers kept on believing,
-
individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before
-
doubling down in the face of conflicting evidence
-
reducing the discomfort of dissonance
-
motivated reasoning
-
convince themselves or remain convinced of what they want to believe
-
seek out agreeable information and learn it more easily
-
avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.
-
construct a pillow fort of the information that’s comfortable.
-
to remain convinced that they’re right.
-
evidence that disagrees with them as weaker
-
real enemy of truth
-
is false knowledge
-
understandings of the things they believe are often going to be incomplete
-
believe without proof.
-
religious faith
-
contradiction of false knowledge is that only those on the outside can tell that it’s false.
-
get people to believe you’re a good source without actually being one
-
benefits without having to put in the work
-
tendency to stick with what they already believe could help protect people from being taken in by every huckster
-
people who take you at your word
-
clearly people you can trust
-
defending your choices no matter what information is presented
-
lose your membership in this group that’s become so important to you.
-
conservatives are more susceptible to fake news
-
not that they really believed there were more people at Trump’s inauguration, but saying so was a way of showing support for Trump.
-
people often don’t engage with information as information but as a marker of identity
-
higher someone’s IQ, the better they are at coming up with arguments to support a position
-
only a position they already agree with,
-
when the coverage has implications for a person’s political party, then selective learning kicks into high gear.
-
fight against motivated reasoning by priming people to focus on accuracy in whatever task is at hand,
-
unclear how to translate that to the real world
-
-
20 Nov 17
-
04 Sep 17
-
02 Aug 17
-
24 Jul 17
-
17 Jun 17
-
For Daniel Shaw, believing the words of the guru he had spent years devoted to wasn’t blind faith exactly. It was something he chose.
-
I remember actually consciously making that choice.”
-
there are beliefs
-
“I chose to decide that they couldn’t be true.”
-
cognitive dissonance
-
Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.
-
People see evidence that disagrees with them as weaker, because ultimately, they’re asking themselves fundamentally different questions when evaluating that evidence, depending on whether they want to believe what it suggests or not,
-
-
14 Jun 17
-
There are facts, and there are beliefs, and there are things you want so badly to believe that they become as facts to you.
-
These days, he dedicates part of his practice to working with former cult members and family members of people in cults.
-
the extreme discomfort of simultaneously holding two thoughts that are in conflict
-
“motivated reasoning.”
-
It
-
You can construct a pillow fort of the information that’s comfortable.
-
well, then people become logical Simone Bileses, doing all the mental gymnastics it takes
-
“For desired conclusions,” he writes, “it is as if we ask ourselves ‘Can I believe this?’, but for unpalatable conclusions we ask, ‘Must I believe this?’”
-
“The natural environment of human beings, like the sea for dolphins or the ice for polar bears, is information provided by others, without which they could not forage, hunt, choose mates, or build tools. Without communication, no survival for humans.”
-
motivated reasoning
-
“You spread stories because you know that they’re likely to be a kind of litmus test, and the way people react will show whether they’re prepared to side with you or not,” Boyer says. “Having social support, from an evolutionary standpoint, is far more important than knowing the truth about some facts that do not directly impinge on your life.” The meditation and sense of belonging that Daniel Shaw got from Siddha Yoga, for example, was at one time more important to his life than the alleged misdeeds of the gurus who led the group.
-
a group that was deeply important to their lives and their sense of self.
-
motivated reasoning
-
And in modern America, one of the groups that people have most intensely hitched their identities to is their political party.
-
sort of cognitive tribalism
-
have made partisanship and its relationship to facts seem like one of the most urgent questions of the era
-
fake news stories
-
In one particularly potent example of party trumping fact, when shown photos of Trump’s inauguration and Barack Obama’s side by side, in which Obama clearly had a bigger crowd, some Trump supporters identified the bigger crowd as Trump’s. When researchers explicitly told subjects which photo was Trump’s and which was Obama’s, a smaller portion of Trump supporters falsely said Trump’s photo had more people in it.
-
It’s not that they really believed there were more people at Trump’s inauguration, but saying so was a way of showing support for Trump.
-
In these charged situations, people often don’t engage with information as information but as a marker of identity. Information becomes tribal.
-
For example, Democrats are overwhelmingly supportive of bills to ban the chemical BPA from household products, even though the FDA and many scientific studies have found it is safe at the low levels currently used. This reflects a “chemophobia” often seen among liberals, according to Politico.
-
perhaps because they have more to draw on when crafting a counterargument.
-
“Most people have no reason to have a position on climate change aside from expression of their identity,” Kahan says. “Their personal behavior isn’t going to affect the risk that they face. They don't matter enough as a voter to determine the outcome on policies or anything like this. These are just badges of membership in these groups, and that’s how most people process the information.”
-
and when people cocooned themselves thoroughly in social-media spheres that only told them what they wanted to hear. After careening through a partisan hall of mirrors,
-
Manjoo, who is now a technology columnist for The New York Times.
-
People have always been tribal and have always believed things that aren’t true. Is the present moment really so different, or do the stakes just feel higher?
-
This is hardly the first time there have been partisan publications, or many competing outlets, or even information silos. People often despair at the loss of the mid-20th-century model, when just a few newspapers and TV channels fed people most of their unbiased news vegetables. But in the 19th century, papers were known for competing for eyeballs with sensational headlines, and in the time of the Founding Fathers, Federalist and Republican papers were constantly sniping at each other.
-
These silos are also no longer geographical, but ideological and thus less diverse. A recent study in the Proceedings of the National Academy of Sciences that analyzed 376 million Facebook users’ interactions with 900 news outlets reports that “selective exposure drives news consumption.”
-
In areas where you lack expertise, you have to rely on trust. Even Clifford acknowledges this—it’s acceptable, he says, to believe what someone else tells you “when there is reasonable ground for supposing that he knows the matter of which he speaks.”
-
“The murder rate in the United States is the highest it’s been in 45 years,” something Trump often said on the campaign trail, as well as something that’s not true. “Because the claim is false,” Nyhan writes, “the most accurate response is to disagree. But what does it mean if a person agrees with the statement?”
-
“Probably nothing,” Tavris says. “I mean that seriously.”
-
motivated reasoning
-
Daniel Shaw ultimately left Siddha Yoga
-
-
11 May 17
-
06 Apr 17
-
cognitive dissonance
-
was developed by the social psychologist Leon Festinger in the 1950s
-
“A man with a conviction is a hard man to change,” Festinger, Henry Riecken, and Stanley Schacter wrote in When Prophecy Fails, their 1957 book about this study.
-
Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.
-
motivated reasoning
-
It starts at the borders of attention—what people even allow to breach their bubbles.
-
Most people aren’t totally ensconced in a cushiony cave, though. They build windows in the fort, they peek out from time to time, they go for long strolls out in the world. And so, they will occasionally encounter information that suggests something they believe is wrong. A lot of these instances are no big deal, and people change their minds if the evidence shows they should—you thought it was supposed to be nice out today, you step out the door and it’s raining, you grab an umbrella. Simple as that.
-
But if the thing you might be wrong about is a belief that’s deeply tied to your identity or worldview—the guru you’ve dedicated your life to is accused of some terrible things, the cigarettes you’re addicted to can kill you—well, then people become logical Simone Bileses, doing all the mental gymnastics it takes to remain convinced that they’re right.
-
Tom Gilovich
-
Whether it’s unethical or not is kind of beside the point, because people are going to be wrong and they’re going to believe things on insufficient evidence. And their understandings of the things they believe are often going to be incomplete—even if they’re correct.
-
How many people who (rightly) believe climate change is real could actually explain how it works? And as the philosopher and psychologist William James noted in an address rebutting Clifford’s essay, religious faith is one domain that, by definition, requires a person to believe without proof.
-
It’s hard for facts to fight it because to the person who holds it, it feels like truth.
-
But from an evolutionary perspective, there are more important things than truth. Take the same scenario McIntyre mentioned and flip it on its head—you hear a growl in the bushes that sounds remarkably tiger-like. The safest thing to do is probably high-tail it out of there, even if it turns out it was just your buddy messing with you.
-
Survival is more important than truth.
-
“Having social support, from an evolutionary standpoint, is far more important than knowing the truth.”
-
Pascal Boyer
-
The Most Natural Thing: How Evolution Explains Human Societies
-
“The natural environment of human beings, like the sea for dolphins or the ice for polar bears, is information provided by others, without which they could not forage, hunt, choose mates, or build tools.
-
Without communication, no survival for humans.”
-
In this environment, people with good information are valued. But expertise comes at a cost—it requires time and work. If you can get people to believe you’re a good source without actually being one, you get the benefits without having to put in the work. Liars prosper, in other words, if people believe them.
-
So some researchers have suggested motivated reasoning may have developed as a “shield against manipulation.” A tendency to stick with what they already believe could help protect people from being taken in by every huckster with a convincing tale who comes along.
-
Spreading a tall tale also gives people something even more important than false expertise—it lets them know who’s on their side.
-
If you accuse someone of being a witch, or explain why you think the contrails left by airplanes are actually spraying harmful chemicals, the people who take you at your word are clearly people you can trust, and who trust you.
-
“Having social support, from an evolutionary standpoint, is far more important than knowing the truth about some facts that do not directly impinge on your life.”
-
Though false beliefs are held by individuals, they are in many ways a social phenomenon.
-
Shaw describes the motivated reasoning that happens in these groups: “You’re in a position of defending your choices no matter what information is presented,” he says, “because if you don’t, it means that you lose your membership in this group that’s become so important to you.”
-
Though cults are an intense example, Shaw says people act the same way with regard to their families or other groups that are important to them.
-
political party
-
But political scientists Keith Poole and Howard Rosenthal have tracked polarization in Congress, and the most recent data shows that 2015 had the highest rate of polarization since 1879, the earliest year for which there’s data. And that was even before, well, you know.
-
Now, “party is a stronger part of our identity,”
-
While some studies suggest that conservatives are more susceptible to fake news—one fake news creator told NPR that stories he’d written targeting liberals never gained as much traction—after the election, the tables seem to have turned. As my colleague Robinson Meyer reported, in recent months there’s been an uptick in progressive fake news, stories that claim Trump is about to be arrested or that his administration is preparing for a coup.
-
Though both Hillary Clinton and Donald Trump were disliked by members of their own parties—with a “Never Trump” movement blooming within the Republican Party—ultimately most people voted along party lines.
-
Eighty-nine percent of Democrats voted for Clinton and 88 percent of Republicans voted for Trump, according to CNN’s exit polls.
-
Carol Tavris
-
It’s not that they really believed there were more people at Trump’s inauguration, but saying so was a way of showing support for Trump. “People knew what was being done here,” says Kahan, a professor of law and psychology at Yale University. “They knew that someone was just trying to show up Trump or trying to denigrate their identity.” The question behind the question was, “Whose team are you on?”
-
What’s more, being intelligent and informed can often make the problem worse. The higher someone’s IQ, the better they are at coming up with arguments to support a position—but only a position they already agree with, as one study showed.
-
High levels of knowledge make someone more likely to engage in motivated reasoning—perhaps because they have more to draw on when crafting a counterargument.
-
Nyhan’s work has shown that correcting people’s misperceptions often doesn’t work, and worse, sometimes it creates a backfire effect, making people endorse their misperceptions even more strongly.
-
Sometimes during experimental studies in the lab, Jerit says, researchers have been able to fight against motivated reasoning by priming people to focus on accuracy in whatever task is at hand, but it’s unclear how to translate that to the real world, where people wear information like team jerseys.
-
Especially because a lot of false political beliefs have to do with issues that don’t really affect people’s day-to-day lives.
-
“Most people have no reason to have a position on climate change aside from expression of their identity
-
But the problem, Nyhan says, with “post-truth, post-fact language is it suggests a kind of golden age that never existed in which political debate was based on facts and truth.”
People have always been tribal and have always believed things that aren’t true.
-
But in the 19th century, papers were known for competing for eyeballs with sensational headlines, and in the time of the Founding Fathers, Federalist and Republican papers were constantly sniping at each other. In times when communication wasn’t as easy as it is now, news was more local—you could say people were in geographical information silos.
-
The mid-20th-century “mainstream media” was an anomaly.
-
The situation now is in some ways a return to the bad old days of bias and silos and competition, “but it’s like a supercharged return,” Manjoo says. “It’s not just that I’m reading news that confirms my beliefs, but I’m sharing it and friending other people, and that affects their media.
-
I think it’s less important what a news story says than what your friend says about the news story.
-
These silos are also no longer geographical, but ideological and thus less diverse.
-
A recent study in the Proceedings of the National Academy of Sciences that analyzed 376 million Facebook users’ interactions with 900 news outlets reports that “selective exposure drives news consumption.”
-
Not everyone, however, agrees that the silos exist. Kahan says he’s not convinced: “I think that people have a preference for the sources that support their position. That doesn’t mean that they're never encountering what the other side is saying.” They’re just dismissing it when they do.
-
The sheer scale of the internet allows you to find evidence (if sometimes dubious evidence) for any claim you want to believe, and counterevidence against any claim you don’t want to have to believe. And because humans didn’t evolve to operate in such a large sea of people and information, Boyer says people can be fooled into thinking some ideas are more widespread than they really are.
-
“If you encounter 10 people who seem to have roughly the same idea, then it fools your system into thinking that it must be a probable idea because lots of people agree with it,” Boyer says. “One thing you assume, unconsciously, is that these 10 people came to the same belief independently.
-
You don’t think that nine of these are just repeating something that the 10th one said.”
-
Part of the problem is that society has advanced to the point that believing what’s true often means accepting things you don’t have any firsthand experience of and that you may not completely understand.
-
Sometimes it means disbelieving your own senses—Earth doesn’t feel like it’s moving, after all, and you can’t see climate change out your window.
In areas where you lack expertise, you have to rely on trust.
-
but Kahan has found that people see experts who agree with them as more legitimate than experts who don’t.
-
As of 2014, the most recent data, the number of people saying most others can be trusted was at a historic low.
-
On the other hand, there’s “particularized trust”—specifically, the trust you have for people in your groups. “Particularized trust destroys generalized trust,” Manjoo wrote in his book. “The more that people trust those who are like themselves—the more they trust people in their own town, say—the more they distrust strangers.”
This fuels tribalism. “Particularized trusters are likely to join groups composed of people like themselves
-
Eric Uslaner
-
Trump’s is an extreme, id-fueled version of particularized trust, to be sure, but it’s akin to a mind-set many are prone to.
-
These are more often disputes over values, Kahan says, about what kind of society people want and which group or politician aligns with that. “Even if a fact is corrected, why is that going to make a difference?” he asks.
-
So what would get someone to change their mind about a false belief that is deeply tied to their identity?
-
“Personally my own theory is that there’s a slide that happens,” McIntyre says. “This is why we need to teach critical thinking, and this is why we need to push back against false beliefs, because there are some people who are still redeemable, who haven’t made that full slide into denialism yet.
-
There are small things that could help. One recent study suggests that people can be “inoculated” against misinformation. For example, in the study, a message about the overwhelming scientific consensus on climate change included a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.”
-
Exposing people to the fact that this misinformation is out there should make them more resistant to it if they encounter it later. And in the study at least, it worked.
-
“We know people are less biased if they see that policies are supported by a mix of people from each party,” Jerit says.
-
“I think we need to get to an information environment where sharing is slowed down,” Manjoo says. “A really good example of this is Snapchat. Everything disappears after a day—you can’t have some lingering thing that gets bigger and bigger.”
-
But, he caveats: “Things must be particularly bad if you’re looking at Snapchat for reasons of hope.”
-
But what all this does seem to suggest is that, no matter how strong the evidence is, there’s little chance of it changing someone’s mind if they really don’t want to believe what it says. They have to change their own.
-
As previously noted, Daniel Shaw ultimately left Siddha Yoga. But it took a long time. “Before that [New Yorker] article came out,” he says, “I started to learn about what was going to be in that article, and the minute I heard it is the minute I left that group, because immediately it all clicked together. But it had taken at least five years of this growing unease and doubt, which I didn’t want to know about or face.”
-
It seems like if people are going to be open-minded, it’s more likely to happen in group interactions. As Manjoo noted in his book
-
“One real advantage of group reasoning is that you get critical feedback,” McIntyre says. “If you’re in a silo, you don’t get critical feedback, you just get applause.”
-
-
04 Apr 17
-
24 Mar 17
-
23 Mar 17
-
François Jourde
This Article Won’t Change Your Mind - The Atlantic
-
Jimmy Breeze
“Most people have no reason to have a position on climate change aside from expression of their identity,” Kahan says.
-
18 Mar 17
-
-
upstate New York
-
I felt that I had found my life’s calling.
-
tens of thousands
-
sexual abuse
-
allegations
-
as well as
-
ousted
-
embedded
-
doomsday
-
prophet
-
saucers
-
flying
-
unequivocal
-
unshaken
-
-
17 Mar 17
-
16 Mar 17
-
This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as “motivated reasoning.”
-
But if the thing you might be wrong about is a belief that’s deeply tied to your identity or worldview—the guru you’ve dedicated your life to is accused of some terrible things, the cigarettes you’re addicted to can kill you—well, then people become logical Simone Bileses, doing all the mental gymnastics it takes to remain convinced that they’re right
-
“For desired conclusions,” he writes, “it is as if we ask ourselves ‘Can I believe this?’, but for unpalatable conclusions we ask, ‘Must I believe this?’”
-
“The real enemy of truth is not ignorance, doubt, or even disbelief,” he writes. “It is false knowledge.”
-
Survival is more important than truth.
-
If you accuse someone of being a witch, or explain why you think the contrails left by airplanes are actually spraying harmful chemicals, the people who take you at your word are clearly people you can trust, and who trust you.
-
“Having social support, from an evolutionary standpoint, is far more important than knowing the truth about some facts that do not directly impinge on your life.”
-
Now, “party is a stronger part of our identity,” says Brendan Nyhan, a professor of government at Dartmouth College. “So it’s easy to see how we can slide into a sort of cognitive tribalism.”
-
While this may appear to be a remarkable feat of self-deception, Dan Kahan thinks it’s likely something else. It’s not that they really believed there were more people at Trump’s inauguration, but saying so was a way of showing support for Trump.
-
The question behind the question was, “Whose team are you on?”
-
Amanda Taub writes that sharing fake news stories on social media that denigrate the candidate you oppose “is a way to show public support for one’s partisan team—roughly the equivalent of painting your face with team colors on game day.”
-
Kahan has previously written that whether people “believe” in evolution or not has nothing to do with whether they understand the theory of it—saying you don’t believe in evolution is just another way of saying you’re religious.
-
Similarly, a recent Pew study found that a high level of science knowledge didn’t make Republicans any more likely to say they believed in climate change, though it did for Democrats
-
What’s more, being intelligent and informed can often make the problem worse. The higher someone’s IQ, the better they are at coming up with arguments to support a position—but only a position they already agree with, as one study showed.
-
People also learn selectively—they’re better at learning facts that confirm their worldview than facts that challenge it.
-
“Most people have no reason to have a position on climate change aside from expression of their identity,”
-
But the problem, Nyhan says, with “post-truth, post-fact language is it suggests a kind of golden age that never existed in which political debate was based on facts and truth.”
-
In areas where you lack expertise, you have to rely on trust.
-
The problem is that who and what people trust to give them reliable information is also tribal.
-
-
Jan Eggers
How can you convince someone of the opposite? Possibly not at all.
-
“A man with a conviction is a hard man to change,” Festinger, Henry Riecken, and Stanley Schacter wrote in When Prophecy Fails, their 1957 book about this study. “Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.”
-
Having social support, from an evolutionary standpoint, is far more important than knowing the truth about some facts that do not directly impinge on your life.”
-
People have always been tribal and have always believed things that aren’t true. Is the present moment really so different, or do the stakes just feel higher?
-
“If you encounter 10 people who seem to have roughly the same idea, then it fools your system into thinking that it must be a probable idea because lots of people agree with it,” Boyer says. “One thing you assume, unconsciously, is that these 10 people came to the same belief independently. You don’t think that nine of these are just repeating something that the 10th one said.”
-
Objectivity is a valiant battle, but sometimes, a losing one.
-
So what would get someone to change their mind about a false belief that is deeply tied to their identity?
“Probably nothing,” Tavris says. “I mean that seriously.”
-
But of course there are areas where facts can make a difference. There are people who are just mistaken or who are motivated to believe something false without treasuring the false belief like a crown jewel.
-
One recent study suggests that people can be “inoculated” against misinformation. For example, in the study, a message about the overwhelming scientific consensus on climate change included a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.” Exposing people to the fact that this misinformation is out there should make them more resistant to it if they encounter it later. And in the study at least, it worked.
-
“I think we need to get to an information environment where sharing is slowed down,” Manjoo says. “A really good example of this is Snapchat. Everything disappears after a day—you can’t have some lingering thing that gets bigger and bigger.”
-
groups are usually better at coming up with the correct answers to reasoning tasks than individuals are.
-
, the wisdom of groups is probably diminished if everyone in a group already agrees with each other.
-
-
15 Mar 17
-
Erich Feldmeier
“At first glance, it’s hard to see why evolution would have let humans stay resistant to facts... but Survival is more important than truth”
behaviour evolution psychology groupthink system1 excellent science biology BIOGENTS
-
14 Mar 17
-
Ingrid Glomp
"The facts on why facts alone can’t fight false beliefs" https://t.co/tTn9PIGHWk I’ll read it later, even though it sounds depressing.
-
Ryan Johnson
“I remember looking at her and thinking, ‘She’s totally lying.’ At the same time, I remember something in my mind saying, ‘And that doesn’t matter.’” For Daniel Shaw, believing the words of the guru he had spent years devoted to wasn’t blind faith e…
-
Md. Nazrul Islam
“I remember looking at her and thinking, ‘She’s totally lying.’ At the same time, I remember something in my mind saying, ‘And that doesn’t matter.’” For Daniel Shaw, believing the words of the guru he had spent years devoted to wasn’t blind faith e…
-
educationfairbd
“I remember looking at her and thinking, ‘She’s totally lying.’ At the same time, I remember something in my mind saying, ‘And that doesn’t matter.’” For Daniel Shaw, believing the words of the guru he had spent years devoted to wasn’t blind faith e…
-
Jordan Goldman
“I remember looking at her and thinking, ‘She’s totally lying.’ At the same time, I remember something in my mind saying, ‘And that doesn’t matter.’” For Daniel Shaw, believing the words of the guru he had spent years devoted to wasn’t blind faith e…
-
13 Mar 17
-
-
Liars prosper, in other words, if people believe them.
-
So some researchers have suggested motivated reasoning may have developed as a “shield against manipulation.” A tendency to stick with what they already believe could help protect people from being taken in by every huckster with a convincing tale who comes along.
-
While false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming, and Shaw held onto his reverence for his guru, because those beliefs were tethered to a group they belonged to.
-
“Partisanship has been revealed as the strongest force in U.S. public life—stronger than any norms, independent of any facts,” Vox’s David Roberts wrote in his extensive breakdown of the factors that influenced the election.
-
It’s not that they really believed there were more people at Trump’s inauguration, but saying so was a way of showing support for Trump.
-
“They knew that someone was just trying to show up Trump or trying to denigrate their identity.” The question behind the question was, “Whose team are you on?”
-
In these charged situations, people often don’t engage with information as information but as a marker of identity. Information becomes tribal.
-
Saying you don’t believe in evolution is just another way of saying you’re religious. Similarly, a recent Pew study found that a high level of science knowledge didn’t make Republicans any more likely to say they believed in climate change.
-
Fact-checking erroneous statements made by politicians or cranks may also be ineffective. Nyhan’s work has shown that correcting people’s misperceptions often doesn’t work
-
Sometimes during experimental studies in the lab, Jerit says, researchers have been able to fight against motivated reasoning by priming people to focus on accuracy in whatever task is at hand, but it’s unclear how to translate that to the real world
-
“Most people have no reason to have a position on climate change aside from expression of their identity,” Kahan says. “Their personal behavior isn’t going to affect the risk that they face. They don’t matter enough as a voter to determine the outcome on policies or anything like this. These are just badges of membership in these groups.”
-
Not everyone, however, agrees that the silos exist. Kahan says he’s not convinced: “I think that people have a preference for the sources that support their position. That doesn’t mean that they’re never encountering what the other side is saying.” They’re just dismissing it when they do.
-
Society has advanced to the point that believing what’s true often means accepting things you don’t have any firsthand experience of and that you may not completely understand. Sometimes it means disbelieving your own senses.
-
Though for the media outlets that are fact-checking these things, it’s a matter of truth and falsehood, for the ordinary person evaluating, adopting, rejecting, or spreading false beliefs, that may not be what it’s really about.
-
These are more often disputes over values, Kahan says, about what kind of society people want and which group or politician aligns with that.
-
“I think once they’ve hit denial, they’re too far gone and there’s not a lot you can do to save them.”
-
While there’s no erasing humans’ tribal tendencies, muddying the waters of partisanship could make people more open to changing their minds.
-
much of how people view the world has nothing to do with facts. That doesn’t mean truth is doomed, or even that people can’t change their minds. But what all this does seem to suggest is that, no matter how strong the evidence is, there’s little chance of it changing someone’s mind if they really don’t want to believe what it says.
-
“[The patient] has come to see a lot of problems with the ideology and maintains a relationship with his family in which he tries to discuss in a loving and compassionate way some of these issues,” Shaw says. “He is patient and persistent, and he chips away, and he may succeed eventually.”
“But are they going to listen to a [news] feature about why they’re wrong? I don’t think so.”
-