It’s increasingly clear that social media exacerbates our differences, using emotional wording and algorithms to curate what information we’re exposed to. The result is that these features conspire to echo, rather than expand, our thinking. Included in this chain of “causative” links is one of our most human needs: belonging – especially to a tribe.
The research sought to understand how
“people may also engage in motivated tweeting (or sharing, liking, or retweeting), selectively interacting with and attending to content that conforms to their partisan identity motivations.”
That quote is quite the mouthful. But the idea is that maintaining your group identity means it is better to “hate on” those outside your tribe than to “like” those within it. Do we share more of what denigrates others and less of what we “already know to be true” about ourselves?
The researchers searched for answers by examining large datasets of tweets and Facebook posts from mainstream media and members of Congress on both sides of the aisle. They counted the words in these social offerings referencing liberal, conservative, or including negative, positive, and moral language. Here is what they found.
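The counting step the researchers describe can be sketched in a few lines. This is a minimal illustration, not the study’s actual pipeline: the category names and the tiny word lists below are hypothetical stand-ins for the established lexicons (in-group, out-group, negative, positive, and moral-emotional word dictionaries) the paper drew on.

```python
import re

# Hypothetical, deliberately tiny category dictionaries for illustration;
# the study used much larger published word lists.
CATEGORIES = {
    "out_group": {"liberal", "democrat"},       # as seen from a conservative account
    "negative": {"corrupt", "disgraceful"},
    "moral_emotional": {"shame", "betray"},
}

def count_category_words(post: str) -> dict:
    """Count how many words in a post fall into each category."""
    tokens = re.findall(r"[a-z']+", post.lower())
    return {cat: sum(t in words for t in tokens)
            for cat, words in CATEGORIES.items()}

counts = count_category_words("The corrupt Democrat leadership should shame us all.")
# counts -> {'out_group': 1, 'negative': 1, 'moral_emotional': 1}
```

Per-post counts like these, paired with each post’s share or retweet total, are what allow a “per additional word” effect on engagement to be estimated.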
Major Media Outlets
“… political out-group language appears to be the most powerful predictor of engagement of all factors measured.”
This part of the study included 600,000 Facebook posts and 227,000 tweets by news organizations.
- Each additional negative word increased sharing by 5 to 8% in all but conservative media’s Facebook posts, where it decreased sharing by around 2%. Positive words reduced sharing by 2 to 11% in all cases. Is it fair to say we like to “dish the dirt”?
- Words with moral, emotional overtones increased sharing by 10 to 17%. As I have written previously, we are repelled by disgust and feel the need to warn others.
- Words referring to our tribe, “political in-group words,” increased sharing up to 37%. But words referring to the other tribe increased that sharing from 35 to 57% for each additional word used.
- The emotional connotations of the words also mattered. Many of the words chosen to describe the other tribe evoked anger or laughter within our tribe, whether liberal or conservative. We love to denigrate the others.
Members of Congress
This part of the study included one million tweets and 825,000 Facebook posts by members of Congress. Since the research data comes from just before the most recent election, there were clearly partisan political words. That said,
- Negative words increased sharing by 12 to 45% for each word, with the largest effect among conservatives.
- Moral words also increased sharing, though by a smaller margin of roughly 5 to 10%.
- In-group words reduced the sharing of tweets and barely moved Facebook posts for both liberals and conservatives. The biggest effect came from words about that other tribe, which increased sharing by 65 to 180% per word, far more than the same words when crafted by the media.
- Posts about the other tribe strongly predicted negative reactions, such as “angry”; posts about our tribe predictably elicited “love” reactions.
The researchers then looked at both datasets in a type of “internal” meta-analysis. Here they found:
- The words associated with the other tribe increased sharing by about 67%. Words associated with our tribe had no effect.
- Negative words increased the diffusion of comments by 14%; moral, emotional words by 10%.
- Positive words reduced sharing.
- Political orientation and social media platforms had no effect. We are all equally guilty; we just like our tribe much more than that other one.
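To see what a “per additional word” effect of about 67% means in practice, here is a toy calculation. It assumes, for illustration only, that the effect compounds multiplicatively with each extra out-group word; the 0.67 figure comes from the meta-analysis above, but the compounding model is my simplification, not the paper’s stated specification.

```python
def expected_relative_shares(n_out_group_words: int, per_word_boost: float = 0.67) -> float:
    """Sharing rate relative to an identical post with zero out-group words,
    under the toy assumption that each additional word multiplies shares
    by (1 + per_word_boost)."""
    return (1 + per_word_boost) ** n_out_group_words

print(round(expected_relative_shares(1), 2))  # one out-group word -> 1.67
print(round(expected_relative_shares(2), 2))  # two words -> 2.79
```

Under this simplification, a post with two out-group references would be expected to spread nearly three times as far as the same post with none.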
“While much of the literature on social media and political polarization has focused on the formation of echo chambers, the finding that social media amplifies out-group animosity might be more concerning than the formation of echo chambers alone.”
Let me put it this way. We are rude to those who hold an opposing view. So even if we reach out “across the aisle,” as it were, we often find that we are being denigrated, belittled, and mocked by our worthy opponent. Social media does not promote discourse; it promotes a malicious form of gossip.
The problem with Twitter, Facebook, and Amazon is not censorship, at which they do an abysmal job. It is the algorithm, which prompts you to stay attentive while they deliver up ads. Re-posting and re-tweeting keep you glued to those tiny smartphone screens, coarsening our discourse as we select the more salacious "dirt" to share. As the researchers conclude,
“When the chief goal is virality, this may create negative externalities in the form of polarizing, hyperpartisan, false, or hostile content. This kind of content may be good at generating superficial engagement but ultimately harms individuals, political parties, or society in the long-term.”
The researchers also offer an explanation for why the effect is so much larger for congressional accounts: “This might be due to the fact that members of Congress are explicitly identified with a political party and have a large partisan following.”
Source: Out-group animosity drives engagement on social media, PNAS, DOI: 10.1073/pnas.2024292118