Hooded Horse, the indie publisher behind Manor Lords and Terra Invicta, has put an explicit ban on generative AI assets into its publishing contracts — and isn't shy about why. CEO Tim Bender told Kotaku that the company now refuses to work with studios that use generative AI anywhere in the development pipeline. The language is blunt: Bender has described the technology as “cancerous” when it sneaks into projects.

Why Hooded Horse says no

This isn't a boutique policy tweak or a PR posture. Hooded Horse says the ban rests on both ethical and practical grounds. Ethically, Bender argues, relying on tools trained on other people's work would betray the artists the publisher employs for marketing and creative oversight. Practically, it guards against a small mistake becoming a public mess: placeholder art or voice lines generated during prototyping sometimes slip into final builds or marketing materials. The industry has already been bitten; several leaks and launch controversies in 2024–25 involved AI-generated placeholders making it into public-facing versions of games.

Hooded Horse's stance goes further than forbidding AI in final assets; the publisher urges studios not to touch generative tools at all during production. That reflects a fear familiar to many indie teams: a single outsourced contractor or a hurried prototype build can let an AI-derived piece persist by accident.

Industry friction and mixed responses

Hooded Horse sits on one end of a spectrum. Bigger players and some studios are experimenting — or doubling down — on generative tools to speed workflows, prototype ideas, or even produce certain voice lines. Embark Studios, for instance, has used generative AI in voice work for Arc Raiders while saying it doesn’t replace human creators. Other executives, from Larian to Embracer leadership, have publicly weighed the pros and cons, often promising human authorship remains central.

Regulation and platform rules complicate the picture. Steam introduced disclosure requirements for AI-generated content, but those largely rely on developer honesty — there's no universal scanner you can run that reliably says "this came from a generator." And in early 2025 the U.S. Copyright Office signaled that purely AI-created art lacking human creativity may not qualify for copyright protection, which adds legal and commercial uncertainty for developers and publishers alike.

Readers tracking AI in games outside the courtroom and the boardroom can see the debate play out in concrete releases. Some studios have publicly admitted to experimental use in pre-production before promising no generative assets in finished products, and others have been caught with placeholders in trailers or builds. The tension between experimentation and accountability is real, and sometimes messy. For a recent example of a game that leaned on generative audio, look at the industry conversation around Arc Raiders and its launch coverage.

Enforcement is harder than signing a contract

A written prohibition helps set expectations, but enforcement across distributed workflows is tricky. Many indies use freelancers, asset houses, or external contractors; unless you tightly control every file that enters a build, an AI-derived image or voice snippet could slip through. That’s the exact scenario Bender fears: a placeholder intended for a prototype that accidentally ships.

Critics of a blunt ban — and some commentators who want more nuance — point out grey areas. Does heavy human rework that started from a generated base count as an AI asset? What about tools baked into OSes and creative suites that nudge composition or color? Some outlets have suggested publishers should publish more detailed AI charters to define acceptable uses, while others see any reliance on models trained on scraped art as unacceptable.

Microsoft and other companies continue to push new image and text models, and those advances will keep raising the stakes for policy-making inside studios and storefronts. For readers who follow the technical side of image models, the broader ecosystem shift is worth monitoring; Microsoft's MAI text-to-image work is one example of the rapid progress forcing these conversations onto publisher desks.

Hooded Horse has also taken a people-first stance: it employs two full-time artists for marketing, and Bender says it would be a “betrayal” to them to accept partner work that used generative art. That kind of internal commitment — paying staff to do the work rather than outsourcing to models — is expensive, and not every publisher or developer will choose the same path.

The bluntness of Hooded Horse’s policy makes a clear promise to players and creators: if they publish your game, they expect craftsmanship that isn’t built on generative shortcuts. Whether other publishers follow, carve out nuanced middle grounds, or double down on AI will shape how games are made — and how the industry argues about creativity and credit — over the next few years.

If you want to keep an eye on how studios are using AI in real releases, industry launches and post-mortems (the kind that follow Arc Raiders) are telling places to start.

Generative AI · Game Publishing · Ethics · Indie Games