The Noisy Room

A story about common confusion

It began with a study.

In December 2025, Stanford researchers analyzed 2.2 billion social media posts looking for a pattern. They wanted to know what percentage of users posted severely toxic content. Not rudeness, not sarcasm, but speech so hateful that 90% of people worldwide would flag it as problematic.1

With this data in hand, they then asked thousands of people to answer a simple question:

Take a guess: what percentage of social media users do you think post severely toxic content?

[Interactive slider: 0% to 100%]

The Bar

Imagine walking into a bar with 100 people. Three of them are screaming about politics, about each other, about nothing. But the bouncer, who gets paid based on how long you stand there staring, has wired those three into the sound system and turned it up to ten.

You walk in, hear the roar, and conclude: this place is full of lunatics. You never hear the 97 people having normal conversations a few feet away.

That's social media. The bouncer is an algorithm. And you have definitely been the bystander.

Pick a topic. Any topic. This is what your feed might look like:

Reading this feed, you might reasonably conclude that the country is split between unhinged extremes. It is not. And the gap between what Americans actually believe and what the feed suggests they believe may be the most consequential thing platforms are failing to show you.

See the Room

Let's scale a hypothetical social media platform down to a single room with 100 people inside. This is what it looks like:

[Interactive diagram: a room of 100 users, split into 97 regular users and 3 users who have posted toxic content. On most platforms, ~3% of accounts produce about a third of all content, and engagement ranking amplifies high-reaction content from the prolific few into your feed. The actual room: 3 out of 100 users have ever posted severely toxic content.]

The room itself is largely quiet. Most people are sharing their own thoughts and posts. But an engagement-based ranking system (the bouncer wired into the sound system) amplifies the loudest voices until they dominate your feed. This is largely the work of an engagement-maximizing algorithm monetizing the most provocative people in the room.

This pattern repeats across platforms. On Twitter/X, toxic tweets receive ~86% more retweets and ~27% more visibility than non-toxic ones, 0.3% of users shared 80% of all contested news,14 and just 6% of users produce roughly 73% of all political tweets.16 On TikTok, 25% of users produce 98% of all public videos.15 The specific numbers vary. The dynamic is the same: a small minority of highly active users overwhelms the majority.
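The arithmetic behind this dynamic is easy to see in a toy simulation. The numbers below are illustrative assumptions loosely based on the figures cited above (3% of users author a third of content; engagement-ranked posts from prolific users get an ~86% boost), not a model of any real platform:

```python
import random

random.seed(0)

# A room of 100 users: 3 hyperactive, 97 regular.
N_POSTS = 9_000
prolific = list(range(3))
regular = list(range(3, 100))

# 3% of users author a third of all posts.
authors = [random.choice(prolific) for _ in range(N_POSTS // 3)]
authors += [random.choice(regular) for _ in range(N_POSTS - N_POSTS // 3)]

# Engagement ranking: give the prolific users' posts an ~86% weight
# boost, then sample a 1,000-post feed proportionally to that weight.
weights = [1.86 if a < 3 else 1.0 for a in authors]
feed = random.choices(authors, weights=weights, k=1_000)

share = sum(1 for a in feed if a < 3) / len(feed)
print(f"3% of users fill about {share:.0%} of the ranked feed")
```

Under these assumptions, three people end up occupying roughly half of what you see: overrepresentation in posting, multiplied by overrepresentation in ranking.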

After enough time consuming content in this room, your brain performs a kind of ambient demography. The feed becomes a census. You conclude, logically, that the behavior must be widespread. The room might just be full of extreme people! Maybe most people really do believe these things.

This Is Not Just About What We See on Social Media

If this were just about the tone of our social posts, it wouldn't matter very much. But this distortion causes some seriously damaging patterns of behavior.

Pattern 1 The Majority Goes Silent

When the majority looks at the feed and assumes it's outnumbered, people often self-censor.3 The same dynamic replicates on social media:17 fear of social isolation suppresses opinion expression on platforms where it's perceived to be unwelcome. People go quiet, or leave the platform entirely. They cede the space to users with more extreme politics.

Pattern 2 The Loud Minority Thinks It's the Majority

The minority who post aggressively end up with their own distortion: believing they are part of the majority.5

A study of 17 extremist forums found the same pattern: the more someone posted, the more they believed the public agreed with them. More engaged participation bred false consensus.

Pattern 3 Everyone Gets Each Other Wrong

Both sides develop wildly inaccurate beliefs about who the other side actually is.6 See how some of your own beliefs line up:

What percentage of Democratic supporters do you think are LGBTQ?

[Interactive slider: 0% to 100%]

What percentage of Republican supporters do you think earn over $250,000 a year?

[Interactive slider: 0% to 100%]

The distortion extends to policy beliefs. Step through to see the perception gap on the issue of immigration.

Source: More in Common (2019) & Moore-Berg et al., PNAS 2020. Illustrative.
Pattern 4 Politicians Follow the Perceived Room, Not the Real One

Elected officials are very good at sensing political sentiment. It's literally their job. (They are not elected to correct people's beliefs.)

Politicians who can build a coalition around a perceived belief are more likely to win. They position themselves against an opponent who doesn't exist, but whom their supporters think exists.

And remember: most of our politics now happens on social media. Candidates often read the same distorted feed as everyone else, and they have little reason to question it.

The window of discourse shifts. Not because opinions changed, but because perceptions of opinions did.

Pattern 5 Misperception Turns into Hostility

When you believe the other side is extreme, you become more willing to treat them as a threat.7

Both Democrats and Republicans vastly overestimate how many on the other side support political violence. The result is a populace primed to assume the other side is ready to do horrible things.

"What percentage of the other side supports political violence?"

Democrats estimate that 35.5% of Republicans support political violence (about 3.4× the real figure). Republicans estimate that 37.1% of Democrats do (about 4.0× the real figure).
Both sides were wrong by 3 to 4 times. When researchers corrected these beliefs, partisan hostility dropped.
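A back-of-envelope check, using only the estimates and error factors quoted above, recovers the actual support rates the researchers measured:

```python
# Divide each side's estimate by how many times too high it was.
dem_estimate, dem_error = 35.5, 3.4   # Democrats' estimate of Republicans
rep_estimate, rep_error = 37.1, 4.0   # Republicans' estimate of Democrats

actual_republican_support = dem_estimate / dem_error   # ~10%
actual_democrat_support = rep_estimate / rep_error     # ~9%

print(f"Actual: ~{actual_republican_support:.0f}% of Republicans, "
      f"~{actual_democrat_support:.0f}% of Democrats")
```

In other words, roughly one in ten on each side, not more than one in three.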

Each step feeds the next. The distortion is self-reinforcing.

Knowing Isn't Enough

Okay. So now you know that a small minority dominates the feed.

You know that Republicans and Democrats actually have a far more nuanced set of opinions about contested issues.

Does that fix it? Not really. You also know that everyone else doesn't know it. And if the world continues operating as if the distortion is real, you should probably act the same — even though you know it's wrong. The room hasn't changed, even if you know people inside it are confused.

This is called a common knowledge problem.

Private knowledge
You've read the stat. But you have no idea who else has. The feed still looks the same. You still assume you're outnumbered. You stay quiet.

Steven Pinker lays this out cleanly in his recent book When Everyone Knows That Everyone Knows.8 Learning a fact changes what you know. Seeing it displayed publicly, where you know everyone else can see it too, changes what everyone knows, and subsequently how everyone acts.

Social media has no public square. It has 300 million private windows, each showing a different distortion of the same room. Making the views we hold in common visible to everyone has the potential to radically change that.

The Intervention

So what can we do about this?

Fortunately, there's some good evidence showing how it can be fixed. Multiple studies show that when misperceptions are corrected in a public way, hostility drops. Mernyk et al. found that a single correction reduced partisan hostility for a full month.7 Lee et al. found that correcting overestimates of toxic users improved how people felt about their country and each other.1

We can do this today.

Imagine every post on a contested topic had a quiet link beneath it. Not a fact check, a label, or a warning. Just a question:

How do people actually feel about this?

Let's explore an example that cuts across political identity:

Money in Politics

83% of Americans support a constitutional amendment to limit money in politics. 81% are concerned about the influence of money on elections, including 78% of Republicans and 90% of Democrats. 75% say unlimited spending weakens democracy. Only 15% believe unlimited political spending is protected free speech.

And yet, very little changes, largely because everyone assumes the other side is fine with it. The feed is full of people defending their team's donors and attacking the other team's. It might look like a 50/50 partisan battle, but it's not. It's a majority consensus that cannot see itself.

What if you could see this consensus?

@real_talk_politics · 2h
Everyone complains about money in politics but the second their candidate gets a massive donation they shut up real fast. You don't hate money in politics. You hate when the OTHER side has more of it.
♡ 11,847💬 6,203↻ 2,891
[How do people actually feel about this? Tap to see.]

Community Check draws from a random sample of platform users combined with robust national polls, surveyed independently of the content. The sample is statistically representative. The results update continuously. And critically: everyone sees the same numbers.

Nothing is censored. Nothing is labeled. The loud posters can still keep posting.

But you, the viewer, will know where the community stands.

Why This Isn't Fact Checking or Audience Polling

Fact-checking is a top-down approach that often feels like someone telling you what to think. This just shows you what people already think.

Content moderation has for years been perceived as removing speech. This simply adds context.

Nor is this just a user poll under a post. A poll beneath the content is already biased by algorithmic selection: it samples only the people the algorithm chose to show the content to, at that moment. A Community Check is different. It draws from all platform users, anchored by statistically representative national surveys. It's an actual window into the views of the majority, not just the views of those looking at the post.
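A toy simulation makes the selection-bias point concrete. The 83% true-support figure comes from the money-in-politics example above; the 5× exposure skew toward opponents is an invented assumption for illustration:

```python
import random

random.seed(1)

# 100 users: 83 support the reform, 17 oppose it.
POPULATION = [True] * 83 + [False] * 17

def in_feed_poll(k=500):
    # Self-selected: assume opponents are 5x more likely to be shown
    # the post (and to vote) because the algorithm favors conflict.
    weights = [1 if supports else 5 for supports in POPULATION]
    sample = random.choices(POPULATION, weights=weights, k=k)
    return sum(sample) / k

def community_check(k=500):
    # Uniform random sample of all platform users.
    sample = random.choices(POPULATION, k=k)
    return sum(sample) / k

print(f"In-feed poll:    {in_feed_poll():.0%} support")
print(f"Community Check: {community_check():.0%} support")
```

Under these assumptions, the in-feed poll reads like a 50/50 partisan battle while the representative sample recovers the true supermajority: the same "majority consensus that cannot see itself" described earlier.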

It Works for Video Too

Short-form video is the fastest-growing vector for political distortion. The same dynamic applies — a small minority of creators produce the vast majority of political content — but video bypasses the pause that text gives you. Community Check can adapt. Tap through to see how.

[Mock short-form video: "Money IS free speech. Deal with it. 🇺🇸 Citizens United was CORRECT" posted by @liberty_caucus_tv, tagged #FreeSpeech #CitizensUnited. 284K likes, 18.2K comments, 41K shares.]
A political video goes viral. 284K likes, 18K comments. The feed shows outrage. But what do people actually think?

See technical specs for how it works below ↓

This Doesn't Require New Technology

Every platform already has the data. They already survey users. They already know the base rates. They already have the infrastructure to display context beneath posts. They just don't have the incentive.

But the unseen majority is the public. And the public deserves to know itself.

A tiny minority, dominating the feed. That's all it ever was. The rest of us were here the whole time, quiet and decent and waiting to be seen.

See FAQ · See technical specifications

Community Check is an open specification.

The complete technical spec, research base, and open questions are published for researchers, engineers, and platform designers to stress-test and build on.

View on GitHub
References
1 Lee, Neumann, Zaki & Hancock, "Americans overestimate how many social media users post harmful content," PNAS Nexus, 4(12), 2025. n=1,090. Benchmark: Kumar et al., "Understanding the Behaviors of Toxic Accounts on Reddit," WWW '23, 2023. 3.1% of accounts produced 33.3% of all comments.
2 Grinberg et al., "Fake news on Twitter during the 2016 U.S. presidential election," Science, 363(6425), 2019. 0.1% of users accounted for nearly 80% of contested news sources shared.
3 Noelle-Neumann, "The Spiral of Silence," J. Communication, 24(2), 1974.
4 Hampton et al., "Social Media and the 'Spiral of Silence'," Pew Research, 2014.
5 Wojcieszak, "False Consensus Goes Online," Public Opinion Quarterly, 72(4), 2008.
6 Ahler & Sood, "The Parties in Our Heads," J. Politics, 80(3), 2018. 342% overestimate.
7 Mernyk et al., "Correcting Inaccurate Metaperceptions," PNAS, 119(16), 2022. n=4,741. Effects lasted 1 month.
8 Pinker, When Everyone Knows That Everyone Knows, 2025.
10 Moore-Berg, Ankori-Karlinsky, Hameiri & Bruneau, "Exaggerated meta-perceptions predict intergroup hostility between American political partisans," PNAS, 117(26), 2020. ~80% of both parties overestimated opposing party hostility by 50-300%. See also: "America's Divided Mind," Beyond Conflict, 2020.
11 Sparkman, Geiger & Weber, "Americans experience a false social reality by underestimating popular climate policy support by nearly half," Nature Communications, 13, 4779, 2022. n=6,119. 80% of Americans support siting renewables locally; perceived support: 43%.
12 Yudkin, Hawkins & Dixon, "The Perception Gap," More in Common, 2019. n=2,100 via YouGov. Average overestimation of opposing party's extreme views: ~55% estimated vs ~30% actual.
13 More in Common, "Americans' Environmental Blind Spot," 2022. 73% of Republicans support U.S. clean energy leadership; Republicans estimate only 33% of their own party agrees.
14 Baribi-Bartov, Munger & Pan, "Supersharers of fake news on Twitter," Science, 384(6700), 2024. 0.3% of users shared 80% of contested news during the 2020 U.S. election.
15 Pew Research Center, "How U.S. Adults Use TikTok," 2024. 25% of users produce 98% of all public videos.
16 Bail, Breaking the Social Media Prism, Princeton University Press, 2021; Pew Research Center, 2021. 6% of U.S. Twitter users produce ~73% of all political tweets.
17 Oz, Shahin & Greeves, "Platform affordances and spiral of silence: How perceived differences between Facebook and Twitter influence opinion expression online," Technology in Society, 76, 2024. Fear of social isolation suppresses opinion expression on platforms where it's perceived to be unwelcome — confirming the spiral-of-silence dynamic operates on social media, with platform-specific affordances (network association, anonymity, social presence) moderating the effect.

Common Questions

Honest objections deserve honest answers. These are the questions skeptics from every political perspective are most likely to ask.

Technical Specification

How Community Check would work in practice, from data sources to platform integration.