There has been a lot of concern expressed about the role that social media might play in political polarization. The worry is that social media users might expose themselves only to news stories with which they agree and have friends who reinforce their own views, and thereby become more extreme in their views and less understanding or tolerant of those who disagree with them. A recent paper seems to show that the phenomenon is real, but less extreme than we might have thought, at least among those users who identify their political orientation in their profiles. This group is likely to be more politically aware than other users, and might therefore be expected to show especially skewed exposure to self-reinforcing stories. Yet on average, about 23% of this group’s friends held an opposing political viewpoint, and about 29% of the stories they read presented views opposed to theirs.
One thing worth mentioning about the study is that it found that user choice – what people choose to click on – played a greater role in skewing exposure towards stories that reinforce pre-existing views than did the Facebook algorithm, which tailors stories to user behavior (mainly each user’s history of clicking on links to particular sites). That’s worth noting because user choice can skew exposure in any medium: online newspapers, print publications, and so on. If skewed exposure is the problem, social media may not be especially to blame.
But I want to focus on another issue. What was the content, and what were the effects, of the 29% of opposing-view stories to which users were exposed? I now leave the realm of data for speculation, based on my own experience.
I see plenty of stories that express views conflicting with my world view. But those stories are often examples of extreme conflict. My friends don’t post links to articles presenting prima facie plausible, or even cogent, arguments for decreasing immigration or lowering taxes; they post links to posts in which rabid right-wingers express overtly racist views or rant about socialism. I suspect that conservatives have similarly skewed feeds: they don’t see links to nuanced economic analysis by social democrats, for instance, but to more extreme and far less well-informed opinions. Some evidence for this claim can easily be found at Snopes, which reports many requests to investigate whether ‘liberals’ want to ban this or that; these claims often turn out to trace back to a single individual expressing an opinion.
If my own experience is representative, the fact that my feed is far from being skewed all one way is not a reason to be sanguine about it. By exposing me to the worst of the right, the exposure I get to views that conflict with my own actually reinforces my views. Moreover, it may give rise to a false impression of how common overtly racist, misogynistic and ignorant views are among those I disagree with politically. In a world with millions of social media users and hundreds of millions of people whose views might get reported somewhere, it’s easy to fill a feed with trash without ever getting a representative picture of what people – left, right or centre – actually think and say.
Maybe. I’d be interested in a breakdown of the numbers in the first paragraph – my guess would be that many people of broadly centre-right persuasions would have lots of left-wing people as friends, but that people of left-wing persuasions may not reciprocate. (Anecdotally, I hear this from friends who are active in centre-right movements.) I wonder if it’s possible to get at things like the Shy Tory effect through social media?
“Among friendships with individuals who report their ideological affiliation in their profile, the median proportion of friendships that liberals maintain with conservatives is 0.20, interquartile range (IQR) [0.09, 0.36]. Similarly, the median proportion of friendships that conservatives maintain with liberals is 0.18, IQR [0.09, 0.30].” Of course the US categories ‘liberal’ and ‘conservative’ are not quite equivalent to any non-US categories.
Interesting piece, Neil. You may, or may not, be aware of a recent paper by Eran Halperin… it suggests that if people are presented with an extreme version of their own views, it may have the effect of moderating those views. Summary here: http://www.latimes.com/science/sciencenow/la-sci-sn-paradoxical-thinking-20140715-story.html
I am reminded of Scott Alexander’s intelligent (if perhaps too long) analysis of hypervigilance among social justice and anti-social justice people online: http://slatestarcodex.com/2015/06/14/fearful-symmetry/
His point is that experiences and media perceptions can create self-sustaining and socially emergent behaviors that are bad for reasonable debate or constructive compromise. Granted, he is talking about groups at the relatively extreme ends of various political or other cultural spectra, but since such groups produce a disproportionate amount of media output, they likely affect the mainstream too.
It’s good if social media makes this pattern more visible. But an in-group choosing to share information that makes it appear right and the out-group seem evil is not a new social phenomenon. Programmers are just conforming to pre-existing human preferences.