The arrangement has a particular structure to it. A writer who used to sell a script for six figures now annotates AI outputs for twenty dollars an hour, flagging whether the dialogue sounds natural, whether the character motivation tracks, whether the scene structure works. They are being paid for the judgment that took them a decade to develop. The judgment is being fed into a system that will make their expertise unnecessary.
This is not irony. It is a business model.
How It Happened
The 2023 WGA and SAG-AFTRA strikes were, in part, about AI. The agreements that ended them included provisions around AI use, consent, and compensation. What the agreements could not address was the economic environment: both the conditions the strikes created and the conditions that already existed before them.
The streaming era dramatically compressed the television employment market. Peak TV created a brief abundance. Then the streamers rationalized their content spending, slashed development, shortened seasons, and shrank writers' rooms. The market for working writers tightened before anyone had to invoke an AI provision.
Into that tightened market came AI training companies offering piecework: flexible, NDA-covered, described in job postings as "creative consultant" or "content evaluator" or "narrative quality assessor." The language is designed not to trigger the contractual prohibitions many union members agreed to. The function is to extract their expertise and encode it in a model.
The Knowledge Transfer Problem
AI language models that generate creative content are not good at it by default. They require domain-specific training data from people who know what good looks like. A model trained on publicly available screenplays can produce screenplays. A model trained on those same screenplays plus the expert annotations of working writers, telling it which moments land and which don't, which dialogue is flat and which has texture, becomes substantially better.
That gap, between a generic model and a craft-specific model, is what creative workers are bridging. They are the training data differential. The irony of this is precise: the thing that makes them valuable as trainers is the same thing that makes them employable as writers. The skills are identical. The compensation is not.
A working screenwriter in a union production commands residuals, benefits, and credit. A screenwriter annotating AI outputs for a training company earns no credit, no residuals, and typically a rate that would have been considered entry-level in the industry they're helping automate.
Why They Do It
The question "why would they participate in this" assumes the alternative is viable. For many it is not.
The entertainment industry shed tens of thousands of creative jobs in the 2023 to 2025 period. Not all of them came back when the strikes ended. Production volume remains below its peak. The writers who used to staff a room for twenty-two-episode network seasons now compete for eight-episode limited series whose rooms run for six weeks. The math does not work.
For a writer with a mortgage, a family, and a decade of industry-specific skills that do not transfer easily to other sectors, the choice is not "train AI or keep your dignity." The choice is "train AI or do something else entirely." Many are choosing to train AI. Some do not disclose it because their union agreements may prohibit it. Some do not disclose it because the NDA prohibits it. Some do not disclose it because the social cost of being known as someone who trained AI is high in communities organized around resistance to AI.
They are doing it quietly. The models are getting better quietly.
The Structural Logic
This is a version of something that has happened in other displaced industries. When a skill set becomes economically marginal in its primary market, it becomes available cheaply in adjacent markets. Coal miners' physical capacity was once the most valued labor in their regions. As the industry declined, that capacity was available for whatever was hiring. The transition was not voluntary. It was economic.
Creative workers are experiencing a version of this transition faster, because the thing that is automating their skills is directly consuming their skills as its primary input. The displacement is not lateral. It is recursive. Each hour a writer spends training an AI is an hour of knowledge transfer in a direction that cannot be undone.
The AI companies understand this perfectly. Their sourcing strategies target people with the most relevant expertise and the fewest alternatives. That intersection, expertise without alternatives, is the structural condition that makes the arrangement possible.
What the Studios Know
The major studios are watching this happen with varying degrees of complicity. Some have explicit partnerships with AI companies. Some are allowing their own IP to be used as training data in exchange for licensing fees. Some are actively piloting AI-assisted production at the development and scripting stage while publicly maintaining that human writers remain central.
The public position and the operational reality are in tension. The public position is for talent relations. The operational reality is for the earnings call.
The writers training AI models are not naive about any of this. They understand what they're doing. They are doing it because the structure of their industry has made it the least bad available option.
That is how structural displacement always works. Not through force. Through narrowed options and the quiet math of economic survival.