
Trump supporters know Trump lies. They just don’t care. – Vox

July 11th, 2017 9:46 am

During the campaign and into his presidency, Donald Trump repeatedly exaggerated and distorted crime statistics. "Decades of progress made in bringing down crime are now being reversed," he asserted in his dark speech at the Republican National Convention in July 2016. But the data here is unambiguous: FBI statistics show crime has been going down for decades.

CNN's Jake Tapper confronted Trump's then-campaign manager, Paul Manafort, right before the speech. "How can the Republicans make the argument that somehow it's more dangerous today, when the facts don't back that up?" Tapper asked.

"People don't feel safe in their neighborhoods," Manafort responded, and then dismissed the FBI as a credible source of data.

This type of exchange, where a journalist fact-checks a powerful figure, is an essential task of the news media. And for a long time, political scientists and psychologists have wondered: Do these fact-checks matter in the minds of viewers, particularly those whose candidate is distorting the truth? Simple question. Not-so-simple answer.

In the past, the research has found that not only do facts fail to sway minds, but they can sometimes produce what's known as a "backfire effect," leaving people even more stubborn and sure of their preexisting belief.

But there's new evidence on this question that's a bit more hopeful. It finds backfiring is rarer than originally thought and that fact-checks can make an impression on even the most ardent of Trump supporters.

But there's still a big problem: Trump supporters know their candidate lies, but that doesn't change how they feel about him. Which prompts a scary thought: Is this just a Trump phenomenon? Or can any charismatic politician get away with being called out on lies?

In 2010, political scientists Brendan Nyhan and Jason Reifler published one of the most talked-about (and most pessimistic) findings in all of political psychology.

The study, conducted in the fall of 2005, split 130 participants into groups who read different versions of a news article about President George W. Bush defending his rationale for engaging in the Iraq War. One version merely summarized Bush's rationale: "There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks." Another version of the article offered a correction that, no, there was no evidence Saddam Hussein was stockpiling weapons of mass destruction.

The results were stunning: Staunch conservatives who saw the correction became more likely to believe Hussein had weapons of mass destruction. (In another experiment, the study found a backfire on a question about tax cuts. On other questions, like on stem cell research, there was no backfire.)

"Backfire is a pretty radical claim if you think about it," Ethan Porter, a political scientist at George Washington University, says. "Not only do attempts to correct information not sink in, but they can actually make conflicts even more intractable." It means earnest attempts to educate the public may actually be making things worse. So in 2015, Porter and a colleague, Thomas Wood at the Ohio State University, set out to try to replicate the effect for a paper (which is currently undergoing peer review for publication in the journal Political Behavior).

And, among 8,100 participants and on the sort of political questions that tend to bring out hardline opinions, Porter and Wood hardly found any evidence of backfire. (The one exception, interestingly, was the question of weapons of mass destruction in Iraq. But even on that, the backfire effect went away when they tweaked the wording of the question.)

"There's no evidence that backfire describes a common reflex of Americans when it comes to facts," Porter assures me. (Nyhan, for his part, never asserted that backfire was ubiquitous, just that it was a possible and particularly consequential result of fact-checking.)

Stories of failed replications in social psychology often grow ugly, with accusations of bullying and scientific misconduct flying in both directions. But in this story, researchers decided to team up to test the idea again.

The fact that Nyhan and Reifler's breakthrough study didn't replicate isn't a shocker. This happens all the time in science. One group of researchers publishes a breakthrough finding. Another lab tries to replicate it, and fails.

But instead of feuding, Nyhan, Reifler, Porter, and Wood came together to conduct a new study.

"If you believe in social science, this is an ideal way to resolve a dispute," Porter says. "If we can devise an experiment together, then the results are going to have something meaningful to say about our differing understandings of the world."

So the four researchers collaborated on two experiments with a wide range of people as subjects, including Trump and Hillary Clinton supporters.

The first experiment drew on Trump's exaggerations of crime statistics.

In the experiment, participants read one of five news articles. One was a control article about bird watching. Another just contained a summary of Trump's message without a correction. The third was an article that included a correction. The fourth included a correction, but then also a line of pushback from onetime Trump campaign manager Paul Manafort, who said the FBI's statistics were not to be trusted. The fifth included a line where Manafort really laid into the FBI, saying, "The FBI is certainly suspect these days after what they just did with Hillary Clinton."

The thinking here: If anyone should be able to incite a backfire effect among Trump supporters, it's Trump's campaign manager. Manafort gives Trump supporters cover. They can reject the correction and cite one of the most influential figures in the campaign. And if there's a time backfire ought to occur, it's during a presidential campaign, when our political identities are fully activated.

But it didn't happen. On average, all the study's participants were more likely to accept the correction when they read it. Trump supporters were more hesitant to accept it than Clinton supporters. But that's not backfire; that's reluctance. Manafort's assertion that the FBI statistics were not to be trusted didn't make much of a difference either.

"Everyone's beliefs about changing crime over the last 10 years became more accurate in the face of a correction," Nyhan says.

The research group then conducted a second experiment during the presidential debates. This one was conducted in near-real time: On the night of the first presidential debate, the group ran an online study with 1,500-plus participants.

The study focused on one Trump claim in particular. Trump said "thousands of jobs [are] leaving Michigan, Ohio ... they're just gone."

This, again, isn't true. The Bureau of Labor Statistics actually finds both states created 70,000 new jobs in the previous year. Half of the participants saw the correction; the other half did not.

Again, the researchers found no evidence of backfire. It's worth underscoring: This was on the night of the first presidential debate. It's the Super Bowl of presidential politics. If corrections aren't going to backfire during a debate, when will they?

In both experiments, the researchers couldn't find any instance of backfire. Instead, they found that corrections did what they were intended to do: nudge people toward the truth. Trump supporters were more resistant to the nudge, but they were nudged all the same.

But here's the kicker: The corrections didn't change their feelings about Trump (when participants in the correction conditions were compared with controls).

"People were willing to say Trump was wrong, but it didn't have much of an effect on what they felt about him," Nyhan says.

So facts make an impression. They just don't matter for our decision-making, which is a conclusion that's abundant in psychological science.

(And if you're thinking, "How could one short experimental manipulation really change how much participants like Trump?" know that other research shows it's possible. Notably, studies conducted during the election found that just reminding white voters they may one day be a racial minority increased support for Trump.)

"The big question is: To what extent do those results generalize beyond Trump himself?" says Nyhan. Many of his supporters may already have come to terms with his record of misstatements by the time this study was conducted. (The researchers did not test any fact-checks of Hillary Clinton talking points.)

Nyhan doesn't place blame on Trump supporters themselves; it's just human nature to stand by our political party's candidates. But he says there's something wrong with the institutions, norms, and party leaders that enable the rise of candidates who constantly lie.

At least it's nice to know that facts do make an impression, right? On the other hand, we tend to avoid confronting facts that run counter to our political allegiances. Getting partisans to confront facts might be easy in the context of an online experiment. It's much harder to do in the real world.

These results have not yet been peer-reviewed or published in an academic journal, so treat them as preliminary. But I did run them by several political science and psychology researchers for a sniff test.

"These two experiments are well done, and the data analysis appears to be straightforward and correct: we observe clear movement on subjects' beliefs as a result of factual corrections," Alex Coppock, who researches political decision-making at Yale, writes in an email. "This piece is nice because it adds to the (small but growing) consensus that backfire effects, if they exist at all, are rare."

Others commended the researchers for collaborating in the face of conflicting results. "I think this is exactly how the scientific process should operate as we try to explain human behavior," Asheley Landrum, who researches politically motivated reasoning at Texas Tech, writes. "Social scientists, arguably, should be even more aware of motivated reasoning, recognizing that it also occurs in scientists."

Nyhan's research is about seeing if attitude change is possible. And this research often comes to frustrating ends. In one study, he and Reifler tested out four different interventions to try to nudge vaccine skeptics away from their beliefs. None made a difference. Though attitude change is elusive, he has at least found a little of it within himself.

"Jason [Reifler] and I have definitely updated our beliefs about the prevalence of the backfire effect," Nyhan says. He won't say it's been debunked. But he's moving in that direction.
