When a short excerpt of an interview started circulating online claiming Level‑5 was letting AI write ‘80 percent or more’ of its game code, the reaction was immediate and loud. Akihiro Hino, Level‑5’s president and CEO, pushed back — not by denying AI’s growing role in studios, but by asking people to stop treating generative AI like an automatic villain.

What actually happened

Hino says the viral figure was a misunderstanding. One programmer, working on an unreleased, AI‑themed title, mentioned deliberately letting AI handle much of that project’s programming as an example of what might be possible. That single, deliberate experiment became a headline: some viewers read it as proof that Level‑5 had quietly handed off most of its development to machines. Hino calls that an overreach.

He didn’t flatly reject AI. Instead he framed the tech as a tool delivering genuine time savings and potential workflow shifts. In his words (as shared on X), AI could help move us from five‑to‑ten year AAA cycles to a world where those kinds of games might appear every two years, if used effectively.

Why Hino is defending generative AI

Hino argues people are too quick to equate generative models with plagiarism or creative theft. He used a simple analogy: a knife can be used to cook or to harm; a computer can create games or enable cybercrime. Misused, AI can reproduce copyrighted material. Used well, he says, it can expand what creators imagine and build.

That stance is not isolated. Large studios are already experimenting with automating parts of development — QA, asset generation and prototyping feature in internal roadmaps — and some, Hino claims, don’t make those experiments public. You can see the same trend elsewhere in the industry: Square Enix has said it plans to automate much of its QA, automation-driven shifts have been reported across the market, and debate continues over whether AI has reached a practical tipping point in capability.

The pushback and real concerns

None of this conversation exists in a vacuum. The most visible example is the Clair Obscur: Expedition 33 controversy, in which awards were revoked after disclosures about generative AI use, and the studio later pledged to stop using the tools and patch the affected assets. Fans worry about creative credit, the ethics of training data, and the environmental costs of massive AI infrastructure (data centres aren’t free or green).

There’s also an authenticity argument. As former Tekken director Katsuhiro Harada has noted in similar conversations, fast, polished outputs from AI don’t necessarily mean better craft, or that the original creative intent is preserved. Those points feed into a broader debate about consent, bias and governance — conversations that are starting to produce frameworks such as consent‑first audits and benchmarks in vision systems, and that large companies are slowly taking up. Sony’s own work on ethical AI benchmarks is part of this wider move.

So what does this mean for games?

For players: you’ll see more AI‑assisted workflows behind the curtain before you see wholesale AI‑made titles on store shelves. Studios are incentivized to cut down bloated development schedules; for many, that means experimenting with tools that speed iteration. For creators: the question is less ‘can we use AI’ and more ‘how do we use it responsibly?’ — training data provenance, disclosure to consumers, and internal standards will matter.

Hino’s closing note was aspirational: he wants games that feel like dreams come true, and he doesn’t want a reflexive ‘AI = bad’ mindset to stall technological progress in the medium. That’s a reasonable plea — but it’s also a call for clearer guardrails. The industry will need both imaginative use cases and enforceable norms if it’s going to get the benefits without eroding trust.

There’s one practical reminder tucked into this debate: technology rarely stays neutral. The same tool that accelerates design can remove human jobs or enable questionable shortcuts. Companies and communities are now in the slow, public process of deciding which of those futures they’ll tolerate and which they won’t. Expect the fights — technical, ethical and legal — to continue.

If you follow hardware trends and developer ambitions, this is one of those moments where platform expectations matter. Console and PC audiences will pressure studios for richer, faster experiences — while also holding creators accountable. That tug will shape how quickly experiments turn into standard practice (and yes, some of those experiments will target high‑end machines and consoles like the PlayStation 5 Pro).

This story will keep evolving, and Hino’s position underlines a common industry tension: excitement about what AI can unlock, and anxiety about what it might cost. Neither feeling is going away soon.

Generative AI · Game Development · Level-5 · Industry Debate · Ethics