The death of Sora, and why you shouldn't build your studio on borrowed sand

OpenAI just killed Sora without warning. Your favourite AI tool could be next. And that should give all of us serious pause.

Image licensed via Alamy / MauriceNorbert

When OpenAI launched Sora, it landed like a thunderclap. A standalone app. A scrolling social feed. Hyper-realistic AI video conjured from a few lines of text. Within days, it had shot to the top of the Apple App Store. Actor, writer and producer Tyler Perry had already seen enough of Sora's early demos to put a planned $800 million studio expansion on indefinite hold. That's not hype. That's a person who builds things for a living, genuinely frightened.

But yesterday, OpenAI posted a farewell message on X. "To everyone who created with Sora, shared it, and built a community around it: thank you," it read. "What you made with Sora mattered, and we know this news is disappointing."

Disappointing. That's one word for it.

No warning. No wind-down period. One day, it was publishing safety guidelines for teenage users; the next, it was gone. A $1 billion content partnership with Disney—signed just four months ago, covering more than 200 licensed characters from Marvel, Pixar and Star Wars—is now dead in the water. Disney said it "respects OpenAI's decision to exit the video generation business." The rest of us are left staring at a blank screen where our workflow used to be.

Why this should worry you

If you're an art director, motion designer, filmmaker, content creator or video producer who's started weaving Sora into your process, even experimentally, you've just been handed a hard lesson. And if you haven't, think about whatever AI tool you have been relying on. Ask yourself: what happens if it disappears tomorrow? What will you do?

The reason OpenAI shut Sora down is almost comically mundane. The company is reportedly preparing for an IPO and needs its books to look more respectable. AI video consumes colossal amounts of computing power, and Sora wasn't making enough money to justify its costs.

In other words, this wasn't a creative decision. It wasn't even really a business decision in the conventional sense. It was a financial tidying-up exercise. And creatives who've invested real time, real workflows, and real client promises into the platform are collateral damage.

This is not an isolated case. The AI space is littered with products that arrive loudly and vanish quietly. The economics of the sector make this almost inevitable: vast upfront costs, unclear revenue models, fierce competition and investors who want returns. When the numbers don't add up, products disappear... regardless of how many people have built their practice around them.

The rules are still being written

Here in the UK, there's another dimension to this that creatives can't afford to ignore. Earlier this month, technology secretary Liz Kendall confirmed that the government was stepping back from its earlier plan to allow AI companies to use copyrighted works freely unless rights holders actively opted out. Following a furious response from artists (Elton John, Dua Lipa, Thom Yorke and a cast of thousands), Kendall said the government "no longer has a preferred option." It's listening. It's reconsidering. It's forming task forces.

That's progress of a sort, and credit where it's due. But campaigners are rightly cautious. As composer and copyright advocate Ed Newton-Rex put it, virtually everything is still on the table, including the opt-out proposal. The can has been kicked down the road. The creative industries shouldn't be popping champagne just yet.

What this tells us is that the legal and commercial foundations underpinning AI-generated creative work remain deeply unstable. The tools themselves can vanish overnight. The rules governing how those tools are trained—using whose work, for whose benefit—haven't been settled. We, as professionals, are being asked to build on ground that keeps shifting.

What should you do?

In my opinion, none of this means you shouldn't experiment with AI tools. Many of them are genuinely useful. But there's a meaningful difference between using AI as one component in a process you own and control, and restructuring your entire practice around a single platform you have no stake in.

So here's my take. Use AI tools. Learn them. Bill for the efficiency gains. But don't let any one of them become load-bearing. Keep your core skills sharp and portable. Make sure clients understand that specific tools may change. Don't over-promise deliverables that depend on platforms you can't guarantee will exist in six months.

If the saga of Sora tells us anything, it's that "we're discontinuing this service" can arrive with less notice than a broadband outage. And roughly the same amount of sympathy. So, in the words of the Scout and Guide motto I grew up with, be prepared.