
Why Social Media Toxicity Is Built Into Its Design

Ars Technica

Research from the University of Amsterdam reveals that social media's toxic dynamics aren't caused by algorithms or human nature—they're built into the architecture itself. Petter Törnberg's study used AI personas to simulate online behavior and found that echo chambers emerge naturally, even when users actively seek diverse communities.

The counterintuitive finding: filter bubbles, long blamed for polarization, might actually stabilize communities. When 10% of users share your views, you become more tolerant of opposing opinions. Without that small minority, disagreement thresholds trigger departures, causing once-diverse groups to rapidly homogenize and grow more extreme.
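The departure dynamic described above can be sketched with a toy two-opinion model. This is an illustrative simplification, not Törnberg's actual simulation: the `support_floor` and `base_tolerance` parameters are assumptions chosen to show the effect, where agents stay only if enough of the group shares their view or overall disagreement stays within their tolerance.

```python
def run(minority_count, n=100, support_floor=0.10, base_tolerance=0.20, steps=50):
    """Toy two-opinion departure model (illustrative, not the study's model).

    An agent stays if at least support_floor of the group shares its view,
    or if the share of disagreeing members is within base_tolerance;
    otherwise it leaves. Returns (majority, minority) counts at equilibrium.
    """
    maj, mino = n - minority_count, minority_count
    for _ in range(steps):
        total = maj + mino
        if total == 0:
            break
        # Majority members stay if they have enough like-minded peers
        # or face tolerably little disagreement; same rule for the minority.
        new_maj = maj if (maj / total >= support_floor or mino / total <= base_tolerance) else 0
        new_mino = mino if (mino / total >= support_floor or maj / total <= base_tolerance) else 0
        if (new_maj, new_mino) == (maj, mino):
            break  # no one else wants to leave; group has settled
        maj, mino = new_maj, new_mino
    return maj, mino

# A 10% minority meets the support floor, so the mixed group is stable:
print(run(10))  # → (90, 10)
# A 5% minority falls below it, departs, and the group homogenizes:
print(run(5))   # → (95, 0)
```

With the minority at the 10% mark the group stays mixed, while just below it the minority exits entirely, mirroring the paper's claim that a small like-minded contingent stabilizes otherwise diverse communities.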

The data paints a grim picture. Twitter/X saw a 72-percentage-point rightward shift in engagement. Facebook posting now correlates strongly with partisan intensity, driving casual users away. Meanwhile, Reddit and TikTok buck the trend with modest growth, suggesting users are migrating toward less toxic platforms.

Human activity on legacy platforms has dropped by half, yet post volumes remain stable; the gap is filled by AI-generated content. "Do I want to tell Zuckerberg to implement more filter bubbles? I'd want a little more evidence," Törnberg admitted. The structural problems, he argues, require fundamental redesign, not superficial fixes.