American politics is highly polarized. This is often seen as a product of irrationality: people can be tribal, be influenced by their peers, and often get information from very different, sometimes inaccurate, sources.
Tribalism and misinformation are real. But what if people also act rationally, even when they arrive at very different points of view? What if they are not misled or overly emotional, but are reasoning logically?
“There can be entirely rational ways for people to polarize in predictable directions,” says MIT philosopher Kevin Dorst, author of a new paper on the subject, based in part on his own empirical research.
This may be especially true when people face substantial ambiguity in evaluating political and civic issues. Such ambiguity generates political asymmetry: people weigh the same evidence in different ways, as one would expect, and so reach different conclusions. That does not mean they are reasoning illogically, though.
“What happens is that people scrutinize information selectively,” Dorst says. “That is really why they move in opposite directions: they search for flaws in different places, and so they end up with quite different points of view.”
The concept of rational polarization can help us form a more coherent picture of how differences of opinion arise, keeping us from assuming either that only we are rational or, conversely, that we never really reasoned our way to our own opinions. It can thus add nuance to how we evaluate others.
The paper, “Rational Polarization,” appears in The Philosophical Review. Dorst, the sole author, is an assistant professor in the Department of Linguistics and Philosophy at MIT.
Looking for faults
For Dorst, rational polarization provides a useful alternative to other models of belief formation. In particular, he says, it refines a “Bayesian” model of thinking, in which people continually use new information to update their views.
In Bayesian terms, people use new information to update their views, rationally changing their minds, or not, as the evidence warrants. But in reality, Dorst says, things are not that simple. Often, ambiguity arises when we evaluate new evidence, and Dorst argues that it can be rational to be unsure what to make of it. This can generate polarization, because people’s prior assumptions influence where they find ambiguity.
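To make the baseline Bayesian picture concrete, here is a minimal sketch (not code from the paper, with hypothetical priors and likelihoods chosen only for illustration). With unambiguous evidence, two readers who start from different priors both shift in the same direction, toward whatever the evidence supports:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Textbook Bayesian conditioning on one piece of unambiguous evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Two readers with different priors about a hypothesis see the same evidence,
# which is twice as likely if the hypothesis is true as if it is false.
for prior in (0.3, 0.7):
    print(prior, "->", round(bayes_update(prior, 0.6, 0.3), 3))
# 0.3 -> 0.462 and 0.7 -> 0.824: both move toward the hypothesis, never apart.
```

On this simple picture, shared evidence pulls people together; the question Dorst raises is what happens when the evidence is ambiguous rather than clear-cut.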
Suppose a group of people is given two studies about the death penalty: one finds that the death penalty has no deterrent effect on behavior, and the other concludes that it does. Even when reading the same evidence, group members will likely interpret it differently.
“Those who firmly believe in the deterrent effect will closely examine the study suggesting there is no deterrent effect, be skeptical of it, poke holes in the argument, and claim to find flaws in its reasoning,” says Dorst. “Conversely, for those who do not believe in the deterrent effect, it is exactly the opposite: they find flaws in the study suggesting there is a deterrent effect.”
These seemingly selective readings may nonetheless be rational, Dorst says: “It makes sense to scrutinize surprising information more than unsurprising information.” Therefore, he adds, “you can see how people with this tendency toward selective scrutiny can diverge even when presented with the same evidence, mixed in the same way.”
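As a rough illustration of that mechanism (my own toy simulation, not Dorst’s model or his experiment, with made-up rates and step sizes), suppose each reader scrutinizes only the study that contradicts their prior. Scrutiny sometimes turns up a clear flaw and otherwise yields ambiguous evidence that gets only partially discounted. Given the same mixed evidence, the two groups drift apart on average:

```python
import random

def read_mixed_evidence(prior, flaw_rate=0.3, step=0.1):
    """One reader sees two opposing studies and scrutinizes only the
    uncongenial (surprising) one. A found flaw licenses heavy discounting;
    an ambiguous result is discounted only partially."""
    congenial_shift = step                  # congenial study taken largely at face value
    if random.random() < flaw_rate:
        uncongenial_shift = 0.2 * step      # clear flaw found: study heavily discounted
    else:
        uncongenial_shift = 0.6 * step      # no flaw, just ambiguity: partial discount
    direction = 1 if prior > 0.5 else -1    # believers push up, skeptics push down
    belief = prior + direction * (congenial_shift - uncongenial_shift)
    return min(max(belief, 0.0), 1.0)

random.seed(0)
believers = [read_mixed_evidence(0.7) for _ in range(10_000)]
skeptics = [read_mixed_evidence(0.3) for _ in range(10_000)]
print(round(sum(believers) / len(believers), 3))  # drifts above 0.70
print(round(sum(skeptics) / len(skeptics), 3))    # drifts below 0.30
```

The point of the sketch is only that the asymmetry need not come from ignoring evidence: every reader processes both studies, but because the uncongenial one is always the one under scrutiny, the net movement tracks the prior.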
Letter by letter
To help show that this tendency exists, Dorst also conducted an online experiment on ambiguity with 250 participants on the online survey platform Prolific. The goal was to see how polarized people’s opinions could become in the presence of ambiguous information.
Participants were shown incomplete strings of letters, like those one might find in a crossword or on “Wheel of Fortune.” Some strings were parts of actual words, others were not. Depending on what additional information participants were given, the ambiguous, unresolvable letter strings had a strongly polarizing effect on how people responded to that information.
The process at work in the experiment, Dorst says, is similar to what happens when people receive uncertain information, in the news or in studies, about political issues.
“When you find a flaw, it gives you clear evidence that undermines the study,” Dorst says. Otherwise, people often remain unsure about what they have seen. “When you don’t find a flaw, it can give you ambiguous evidence, and you don’t know what to make of it. As a result, this can lead to predictable polarization.”
The larger point, Dorst believes, is that we can reach a more nuanced and coherent picture of how political differences arise even when people process similar information.
“There is a perception that in politics, rational brains turn off and people think with their gut,” Dorst says. “If you take this seriously, you should say, ‘I form my political beliefs the same way.'”
Unless, that is, you believe that only you are rational and everyone else is not, a worldview Dorst finds untenable.
“Part of what I’m trying to do is offer a narrative that isn’t subject to that kind of instability,” Dorst says. “You don’t necessarily have to point the finger at others. It’s a much more interesting process if you think that there’s something [rational] there too.”