The Day Hollywood Woke Up To AI — And Decided To Do Something About It
Hollywood | Photo by Venti Views on Unsplash
There is a particular kind of dread that settles in when someone very smart explains, very calmly, that the world as you know it may not survive the next decade. David Goyer — the writer behind the Dark Knight Trilogy, the showrunner of Foundation, Apple TV+'s epic about the collapse of a galactic civilization — knows that feeling intimately.
About fifteen months ago, he was invited to a private gathering in Silicon Valley, the kind with no press and no recording, where former insiders from Google, Anthropic, and OpenAI spent two days walking a small group of filmmakers through the unvarnished state of artificial intelligence. Where it was headed. How fast. And what the people building it privately believed could go wrong.
He came home shattered. "When I came home, I was extremely depressed and scared for the future of the world and my kids," he said. What had undone him wasn't science fiction. It was the gap between the speed of the technology and the near-total absence of any guardrails — a race toward AI systems of extraordinary power, where the incentives all pointed in one direction: faster, regardless of the consequences.
His wife gave him an ultimatum: find someone who was actually working on solutions, or stop drowning the household in existential dread.
When the Last Wake-Up Call Wasn't Enough
He found Randy Fernando, an advisor at the Center for Humane Technology — the organization behind The Social Dilemma, the Netflix documentary that exposed how social media platforms were engineered to manipulate human behavior — who was thinking seriously about a narrow path forward. That conversation led to filmmaker Daniel Kwan, who had his own awakening. Kwan, fresh off winning every Oscar in sight for Everything Everywhere All at Once, had used his newfound access to meet not with studio heads, but with the same researchers Goyer was now talking to. He had been sounding the alarm inside Hollywood for over a year.
By last summer, the three of them were on a Zoom call together, sketching out what would become the Creators Coalition on AI — a new organization whose purpose is straightforward and, in a town built on individual deals, quietly radical: get the entire creative industry to act as one. The coalition counts among its founders actor and filmmaker Joseph Gordon-Levitt, producer Janet Yang, and thousands of signatories from every corner of the business, from Oscar winners to below-the-line crew.
Ted Tremper, a documentary filmmaker who spent two and a half years tracing AI's creeping impact on society, joined early. The film he produced, "The AI Doc: Or How I Became an Apocaloptimist," is designed to do what legislation and white papers cannot — make people feel the stakes. The coalition is what comes next: the structure that catches people once the film has shaken them awake.
David S. Goyer | Filmmaker - Writer, Director, Producer
Nobody Was Talking to Each Other
What they found when they started organizing was not apathy. It was chaos. The writers' union had its own working group on AI. So did the Directors Guild, the actors' union, the streaming platforms, and the independent producers. Nobody was talking to each other. Worse, nobody was speaking the same language.
When the coalition convened one of the first joint lunches bringing all these groups into the same room, the result was illuminating. "We just had everyone from the individual organizations stand up and say, what are we worried about?" Goyer recalled. "That was just mind-blowing. Education really is a big issue."
The education problem runs deeper than most people realize. At a school event, Goyer found himself talking to the head of his child’s school — a PhD, by any measure a sophisticated person — who had concluded that the solution to AI was simple: opt out. "I'm just not going to use it. Then my family and I are safe." Goyer had to explain that this was not actually an option. AI is already embedded in the tools and platforms people use every day. Opting out is not a strategy. It is, at best, a feeling.
Building the Dictionary Before Writing the Rules
The coalition's first project is almost defiantly unglamorous: building a shared dictionary. Before anyone can agree on ethics or rights, they need to agree on what words mean. Amazon's definition of "AI-generated content" differs from Netflix's. Netflix's differs from the writers' union's. Every conversation starts from scratch, with everyone talking past each other.
The coalition is organizing what Tremper calls a "constitutional convention" — bringing every major stakeholder together to ratify a single set of industry-wide definitions, covering four areas: transparency and fair compensation for creative work used to train AI; job protection; guardrails against deepfakes; and safeguards to keep human creativity central. The goal is ultimately a "nutrition label" — a transparency standard telling anyone exactly how AI was used in making a film or show. "If we don't have this set of definitions, we can't describe what our personal values are in regards to how we want to interact with the technology," Tremper said. "But once we have them, people are given agency."
Ted Tremper, showrunner, director, and writer
A Playbook for Every Industry
The stakes extend far beyond Hollywood. "If we can prove that we were able to do this with the creative industries," Tremper said, "it becomes something we can open source to nurses, to truckers, to teachers, to doctors." The music industry never organized before Spotify rewrote its economics. The taxi industry never organized before Uber gutted it. "If we can decide that our values, the way that we want to create work, take precedence, and that we can unite, then we'll essentially be able to have a phalanx against being picked off one by one."
None of this is anti-technology. Tremper points out that AI has been part of filmmaking since Peter Jackson used machine learning to animate battle crowds in The Lord of the Rings. Their argument is not that the tool is wrong, but that the deployment — this fast, this opaque — demands a human response. "We cannot just apply the 'move fast and break things' motto when we're talking about civilization," Goyer said. "Sorry. We can't do that."
What surprised them both was how many people inside the AI labs quietly feel the same way. Tremper built relationships with more than a hundred sources inside major AI companies. "We've had companies come to us and say — please tell us what this looks like if this goes well. We want to be the good guys."
That opening — cautious, provisional, but real — is what the coalition is racing to fill.
People Are Ready to Act
The mood has shifted visibly across the festival circuit. At Sundance earlier this year, audiences were still trying to understand what AI actually was. By South by Southwest — the Austin festival where tech and culture have long collided — the question had changed entirely. People weren't asking what it was anymore. They were asking what they were supposed to do about it.
The coalition has an answer. "Every massive change in society is either going to require reform or revolution," Tremper said. "And reform is almost always cheaper in the long term." Hollywood, of all places, knows how to tell a story about the future. For once, it is trying to write one it actually wants to live in.
At Conspiracy of Love, we help changemakers tell their most powerful stories — stories that inspire action, build movements, and create lasting impact.