“Israeli Mass Slaughter.” “Is Joe Biden Fit to be President?” Each time we log on to social media, potent headlines encircle us, as do the unwavering and charged opinions that fill the comment spaces. Each like, repost, or slight interaction we have with social media content is devoured by the “algorithm,” which tailors the space to our demonstrated beliefs.

So, where does this leave us? In our own personal “echo chamber,” claim the directors of Duke’s Political Polarization Lab in a recent panel.

Founded in 2018, the lab’s 40 scholars conduct cutting-edge research on politics and social media. This unique intersection requires a diverse team, and the lab draws its members from seven different disciplines and a range of career stages. The research has proven valuable: beneficiaries include government policy-makers, non-profit organizations, and social media companies.

The lab’s recent research project sought to probe the underlying mechanisms of our digital echo chambers: environments where we only connect with like-minded individuals. Do we have the power to shatter the glass and expand our perspectives? Researchers used bots to generate social media content reflecting opposing party views. The content was intermixed with subjects’ typical feeds, and participants were evaluated to see if their views would gradually moderate.

The results demonstrated the opposite: the more attention people paid to the bots, the more entrenched in their viewpoints, and the more polarized, they became.

Clicking the iconic Twitter bird or new “X” logo signifies a step onto the battlefield, where posts are ambushed by a flurry of rebuttals upon release.

Chris Bail, Professor of Political and Data Science, shared that 90% of these tweets are generated by a meager 6% of Twitter’s users. Those 6% identify as either very liberal or very conservative, rarely settling in a middle area. Their commitment to propagating their opinions is rewarded by the algorithm, which thrives on engagement. When reactive comments filter in, the post is boosted even more. The result is a distorted perception of social media’s community, when in truth the bulk of users are moderate and watching from the sidelines.

Graphic from the Political Polarization Lab presentation at Duke’s 2024 Research & Innovation Week

Can this be changed? Bail described the lab’s exploration of new incentives for social media users. This means rewarding voices from both sides while fighting off the “trolls” who wreak havoc on public forums. Enter a new strategy: using bots to retweet top content creators who receive engagement from both parties.

X’s (formerly Twitter’s) Community Notes feature allows users to annotate tweets that they find misleading. The strategy here includes boosting notes on bipartisan creators’ posts, after researchers found that notes tended to cluster on the most polarized tweets.

The results were hard to ignore: misinformation decreased by 25-35%, Bail said, saving companies millions of dollars.

“Social media is democracy’s public square.”

— Christopher Bail

Instead of simply bashing younger generations’ fixation on social media, Bail urged the audience to consider the bigger picture.

“What do we want to get out of social media? What’s the point, and how can it be made more productive?”

On a mission to answer these questions, the Polarization Lab has set out to develop evidence-based social media by creating custom platforms. To test the platforms, researchers prompted A.I. to create “digital twins” of real people to simulate users.

Co-Director Alex Volfovsky described the thought process that led to this idea: running experiments on existing social media often amounts to dumping data into an A.I. system and interpreting the results. By building their own engaging social network, researchers could instead manipulate conditions and observe causal effects.

How can the presence of a “like” button or “repost” feature affect our activity on a platform? On LinkedIn, even an experiment that tweaked recommended connections showed that people gain the most value from semi-distant ones.

In this exciting new field, unanswered questions ring loud. It can be frightening to place our trust in ambiguous algorithms for content moderation, especially when social media usage is at an all-time high.

After all, the media I consume has clearly trickled into my day-to-day decisions. I eat at restaurants I see on my Instagram feed, I purchase products that I see influencers promote, and I tend to read headlines that are spoon-fed to me. As a frequent social media user, I face the troubling reality of being susceptible to manipulation.

Amidst the fear, the panelists stressed that their research will help create a safer and more informed culture surrounding social media, in a pressing effort to preserve democracy.

Post by Ana Lucia Ochoa, class of 2026