The Algorithm Was Never Neutral
There is no neutral algorithm. Every ranking decision is a resource allocation decision, and resource allocation decisions are power decisions. When Facebook's feed algorithm deprioritized news content, that was not a technical adjustment. It was a policy choice expressed in code, insulated from accountability by the language of engineering.
The story being told in most press coverage is one of drift: Facebook "moved away" from news, publishers "lost traffic," audiences "shifted" toward video and social content. Passive constructions everywhere. No one chose anything. Events simply occurred.
The structural read is considerably less comfortable.
How the Trap Was Built
Between roughly 2010 and 2018, Facebook positioned itself as an indispensable distribution layer for journalism. Referral traffic from Facebook became, for many publishers, the dominant source of readership. Newsrooms hired social media editors. Headlines were engineered for shareability. Editorial calendars bent toward what Facebook rewarded.
This pattern suggests a classic platform dependency play: make yourself essential to a supplier, extract as much value as possible while the arrangement suits you, then restructure the terms when it no longer does. Publishers were not partners in this arrangement. They were content suppliers to a system they did not own and could not influence.
The tell was always in the asymmetry. Facebook needed journalism the way a fire needs oxygen - useful for burning, not worth preserving. News content drove engagement and time-on-platform during the years when Facebook was scaling. It also provided a veneer of civic seriousness that helped with regulators, advertisers, and Congress. Publishers provided that service for free, in exchange for traffic they were never guaranteed to keep.
When News Became Costly
Around 2016, the calculus shifted. News content became associated with misinformation controversies, political polarization accusations, and advertiser anxiety. The same engagement properties that made news valuable - conflict, urgency, strong emotional response - became the properties regulators and press critics blamed for social harm.
The structural logic here is straightforward: Facebook did not suddenly discover that users preferred Reels to Reuters. It discovered that news carried regulatory and reputational costs that entertainment content did not. The algorithm change was a liability hedge expressed as a product decision.
Meta's public framing centered on user preference data. People, the company suggested, simply wanted less news. This may even be partially true. But the framing obscures the mechanism. Platforms shape demand as much as they reflect it. A feed that systematically buries a content category will generate data showing low engagement with that category. The evidence and the outcome are manufactured by the same system.
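The circularity in that mechanism can be made concrete with a deliberately simplified simulation - a toy model, not a claim about Meta's actual ranking system, with all numbers invented for illustration. Suppose users are equally interested in two content categories, and the only thing that changes is how often the ranker shows one of them:

```python
# Toy feedback-loop model: if a ranker down-weights a category, the
# engagement data it later collects "confirms" low demand, because
# impressions, not intrinsic interest, drive total observed clicks.
import random

random.seed(0)

# Per-impression click probability: equal by construction, so any gap in
# observed engagement comes from ranking, not from user preference.
INTRINSIC_INTEREST = {"news": 0.30, "entertainment": 0.30}

def run_feed(rank_weight_news, n_slots=10_000):
    """Fill feed slots by weighted coin flip, then count clicks per category."""
    impressions = {"news": 0, "entertainment": 0}
    clicks = {"news": 0, "entertainment": 0}
    for _ in range(n_slots):
        cat = "news" if random.random() < rank_weight_news else "entertainment"
        impressions[cat] += 1
        if random.random() < INTRINSIC_INTEREST[cat]:
            clicks[cat] += 1
    return impressions, clicks

# Before the change, news fills half the feed; after, the ranker buries it.
for label, weight in [("before down-ranking", 0.50), ("after down-ranking", 0.05)]:
    imps, clks = run_feed(weight)
    share = clks["news"] / (clks["news"] + clks["entertainment"])
    ctr = clks["news"] / imps["news"]
    print(f"{label}: news share of clicks = {share:.2f}, news CTR = {ctr:.2f}")
```

The per-impression click rate on news is unchanged across both runs; only its share of total clicks collapses, because it is shown twenty times less often. A dashboard reporting total engagement by category would read the second run as "users don't want news."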
What Publishers Actually Lost
The material damage to journalism is real and measurable in human terms, even where precise figures are contested. Local newsrooms that had rebuilt their audience strategies around social referral traffic found themselves without readers. National outlets that had expanded aggressively during the Facebook traffic boom contracted sharply when it evaporated. The layoffs that followed were reported as market corrections. The structural read is that they were the delayed invoice for a dependency that should never have been built.
The deeper loss is less visible: publishers surrendered their direct relationships with readers. Email lists were neglected. Subscription muscles atrophied. The audience that once arrived through a publication's own front door was rerouted through a platform's lobby, and when the platform redecorated, the audience did not know how to find its way back.
The Leverage Was Always Elsewhere
This pattern will not resolve itself through better platform policy or improved media-tech partnerships. The structural condition that produced it - platforms control distribution while publishers bear the cost of generating content - remains entirely intact. Threads, TikTok, and YouTube are not correctives. They are the same arrangement with different logos.
The only interventions that address the actual structure are the ones least likely to happen: antitrust enforcement that treats algorithmic distribution as infrastructure, regulatory frameworks that create publisher rights over traffic, or a wholesale reconstruction of direct audience relationships that does not route through any platform at all.
Until then, every newsroom negotiating with an algorithm is negotiating with a counterparty that can change the terms unilaterally, at any time, for any reason it chooses to name.
The leverage was never shared. It was only briefly obscured.