The Hidden Force Behind Social Media Division
Every day, billions of posts, tweets, and videos compete for your attention. You might think that celebrities, politicians, or big media outlets decide what dominates your feed. But a new global study suggests something more powerful—and more invisible—is shaping what you see: your own ideology.
Researchers from several European universities analyzed millions of posts across three global debates—climate change, COVID-19, and the war in Ukraine. Their conclusion? The dividing lines that run through our social media platforms are not set by influencers or journalists. They’re carved by deep, consistent ideological alignments that cross topics, borders, and even languages.
Beyond the Power of Influencers
Let’s start with a surprise. The study looked at the most-shared accounts on X (formerly Twitter)—activists, politicians, journalists, NGOs, and regular people. You might expect that journalists or political figures consistently dominate attention. But they don’t. Engagement—likes, shares, and retweets—was roughly the same across all types. Greta Thunberg might stand out during a climate summit, or a Ukrainian journalist might surge during a conflict, but overall, no single group controls the conversation.
That’s a radical shift from traditional media. For decades, a few TV anchors or newspapers set the public agenda. Now, power is diffuse. As one of the study’s authors put it, “It’s not who you are—it’s what ideology you resonate with.”
But here’s where it gets interesting…
The Ideological Map of the Internet
When the team used network analysis to map who retweets whom, two clusters emerged—like magnetic poles. On one side were users supporting climate action, vaccines, and aid to Ukraine. On the other were skeptics: anti-vaccine voices, climate-change deniers, and critics of Western intervention.
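To see how such "poles" can fall out of raw retweet data, here is a minimal sketch of the idea using modularity-based community detection. Everything in it is an illustrative assumption — the toy account names, the invented edge list, and the choice of networkx's greedy modularity algorithm — not the study's actual pipeline.

```python
import itertools
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical retweet data: two tightly knit camps that almost never
# retweet each other, joined by a single cross-camp retweet.
pro = ["a1", "a2", "a3", "a4"]        # invented pro-action accounts
skeptic = ["s1", "s2", "s3", "s4"]    # invented skeptic accounts

G = nx.Graph()
G.add_edges_from(itertools.combinations(pro, 2))      # dense retweeting within camp
G.add_edges_from(itertools.combinations(skeptic, 2))  # same on the other side
G.add_edge("a1", "s1")                                # one rare cross-camp retweet

# Community detection recovers the two clusters from link structure alone
communities = [set(c) for c in greedy_modularity_communities(G)]
print(communities)
```

On this toy graph the algorithm separates the two camps purely from who retweets whom — no account labels, no content analysis — which is the essence of the mapping the researchers describe.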
The shocking part wasn’t that polarization existed—it was that the same users consistently stayed on one side of the divide across all three debates. In fact, over 90% of users held the same ideological stance regardless of topic. If someone opposed vaccination policies, they were overwhelmingly likely to be skeptical of climate science and supportive of anti-establishment narratives in the Ukraine war.
That’s not just topic-based disagreement—it’s an identity.
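The cross-topic figure above amounts to a simple per-user check: does a user's stance stay the same across all three debates? A minimal sketch, using invented stance labels rather than the study's data:

```python
# Hypothetical per-user stance labels across the three debates.
# (Users, topics, and labels are made up for illustration.)
stances = {
    "user1": {"climate": "pro", "covid": "pro", "ukraine": "pro"},
    "user2": {"climate": "skeptic", "covid": "skeptic", "ukraine": "skeptic"},
    "user3": {"climate": "pro", "covid": "skeptic", "ukraine": "pro"},
}

# A user is "consistent" if they hold exactly one stance across all topics
consistent = sum(
    1 for topics in stances.values() if len(set(topics.values())) == 1
)
rate = consistent / len(stances)
print(f"{rate:.0%} of users hold one stance across all topics")  # prints "67% ..."
```

In the study's data, this rate exceeded 90% — which is what turns three separate debates into one ideological divide.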
In a small city in Nigeria, a medical student might argue online with an engineer in Brazil about vaccine safety. In Delhi, a young climate activist might share posts that also express solidarity with Ukraine. They’ve likely never met—but online, they’re part of the same ideological ecosystem. Each side becomes a self-reinforcing community, echoing shared beliefs across issues.
A New Kind of Agenda Setting
Fifty years ago, communication scholars Maxwell McCombs and Donald Shaw introduced agenda-setting theory—the idea that media don’t tell us what to think, but what to think about. But on social media, the study shows that users set their own agendas, guided not by authority but by alignment.
It’s a kind of bottom-up gatekeeping. Instead of editors filtering stories, millions of micro-decisions—likes, shares, follows—collectively decide what rises or falls. The algorithm amplifies what keeps us engaged, and what keeps us engaged is often what confirms what we already believe.
Think of it like currents in the ocean. Each user’s interaction is a single molecule, moving independently but joining powerful, global flows. Together, they create ideological “currents” that carry information (and misinformation) around the world.
Why This Matters
This cross-topic polarization is more than a digital curiosity—it shapes how societies respond to real crises. During COVID-19, public health campaigns struggled to reach skeptical audiences. Climate action faces similar barriers, not because people don’t understand the science, but because the messenger already belongs to the “other” tribe.
For governments, journalists, and educators in resource-limited or divided regions, this poses a challenge: factual corrections won’t break polarization if the audience doesn’t trust the source. Instead, communication may need to be rooted in shared values—community health, faith, family, or economic well-being—rather than ideology.
In Lagos, that might mean faith leaders sharing climate resilience messages tied to stewardship. In São Paulo, local influencers could connect public health with family care. In Manila, teachers might frame vaccine campaigns around protecting elders. The message works best when it speaks across ideological divides rather than against them.
Rethinking “Filter Bubbles”
The study’s final insight challenges a common myth. Filter bubbles aren’t just topic-based—say, “anti-vaxxers” or “climate deniers.” They’re systemic. Once someone’s worldview diverges from the mainstream, that divergence tends to spread across issues. What starts as skepticism about one topic can metastasize into a broader distrust of institutions.
That means tackling misinformation one topic at a time—say, correcting a false vaccine claim—won’t be enough. Solutions must look at the whole ideological ecosystem: what values connect people, what fears drive engagement, and how networks form across issues.
Toward a Healthier Digital Commons
The researchers end on a sobering but hopeful note. Social media has democratized who can speak—but not necessarily who gets heard. If we want online spaces that bridge divides, we may need to design them differently: platforms that reward nuance, amplify diverse viewpoints, or connect people around shared local goals rather than abstract global battles.
Imagine a social app that promotes posts where people from opposing views collaborate on practical projects—like rebuilding after floods or improving local schools. It’s not impossible. It’s just not what current engagement algorithms reward.
But here’s where it gets inspiring: ideology is learned, not inherited. The same human networks that fuel division can also build solidarity. We just have to give them new reasons to connect.
Let’s Explore Together
Could a social media platform ever encourage empathy over outrage?
How might scientists and communicators reframe their messages to reach across ideological lines?
And if you were designing tomorrow’s online debate, what rules would you rewrite?
Share your thoughts below—and let’s rethink the algorithms shaping our world.


