Capcom has told its investors it “will not implement assets generated by AI into our game content” — a firm line on generative AI in published games, even as the company plans to actively use AI tools to “enhance efficiency and boost productivity” across graphics, sound, and programming behind the scenes.
The statement came from Capcom’s February 2026 online investor information session, published as a Q&A summary on 23 March 2026 and reported by PC Gamer. It draws a clean split between what players see and what the development pipeline looks like internally — a distinction that lands at a particularly charged moment for Capcom.
What Capcom Told Investors
Capcom’s February 2026 investor briefing was direct: AI-generated assets will not appear in Capcom games. The full quote from the session summary reads: “Our company will not implement the materials generated by our AI into game content. However, we plan to actively utilize this technology to improve efficiency and productivity in the game development process. Therefore, we are currently exploring ways to use it in various areas, such as graphics, sound, and programming.”
That covers the textures, character art, audio, and other game-facing content that players actually encounter. Capcom did not elaborate on the exact scope of “assets” — whether that extends to concept art used only internally, or only to content shipped in the final product.
The Wording Capcom Used
The phrasing “implement assets” is important — it addresses output going into a shipped title, not every use of AI in the building process. Capcom is explicitly reserving the right to use generative AI in graphics, sound, and programming workflows, as long as the outputs of those processes do not become finished in-game assets.
Why Investors Were Asking
The timing of this publication matters. Just days before Capcom published its investor Q&A summary, Capcom’s Resident Evil Requiem was featured in Nvidia’s announcement of DLSS 5, a neural rendering technology that redraws game visuals using AI. The demo was widely criticised for the way it altered the appearance of protagonist Grace Ashcroft, with players and developers calling it an unwanted AI transformation of human-created art. The backlash was severe enough that Nvidia later confirmed developers have “artistic control” over the feature — but the damage to public perception was visible.
That context sits alongside a broader industry pattern. A 2026 GDC State of the Game Industry survey found that 36% of industry professionals already use generative AI in daily work. Pearl Abyss issued a public apology after AI-generated art was discovered in the released build of Crimson Desert, and the studio launched a “comprehensive audit” of all in-game assets as a result. Electronic Arts and Ubisoft have also publicly discussed AI adoption, and community backlash has been significant enough to damage the public perception of several titles. By naming a clear boundary in investor communications, Capcom signals that it is aware of that reputational risk.
Where Capcom Actually Uses AI
Capcom confirmed it is “currently evaluating potential applications” of generative AI across graphics, sound, and programming. The company gave its clearest public picture of what this looks like in practice back in January 2025, when technical director Kazuki Abe described a Google Cloud-based system built to tackle one of game development’s most labour-intensive tasks.
Abe explained that Capcom games require developers to come up with “hundreds of thousands of unique ideas” for in-game environments: every background object, prop, and world-building detail must be conceived and referenced individually. Capcom built a system on Google Cloud models (Gemini Pro, Gemini Flash, and Imagen) that ingests game design documents and generates initial visual references and idea proposals. Those outputs go to human art directors and artists, who produce the finished assets. Abe said the prototype received strongly positive feedback from Capcom’s development teams.
The Line Capcom Draws
The distinction Capcom is making is between AI as a production tool and AI as a content creator. The Google Cloud system Abe described produces visual references that a human artist still acts on; the AI does not produce a final asset. One category produces output players never see directly, while the other produces what players are sold. That is a meaningful difference in terms of both quality risk and brand integrity.
Why the Distinction Matters for Players
For players, the practical outcome is that Capcom is committing to human-created game art and assets in its released titles. That matters more for some franchises than others — the hand-crafted visual identity of the Resident Evil series, Street Fighter, and Monster Hunter carries significant weight with the player base. Eroding that through AI-generated textures or animations would be visible and unpopular, as Pearl Abyss learned when players identified AI-generated props in Crimson Desert’s released build within days of launch.
The ambiguity sits in what “assets” actually covers. Capcom has not published a detailed policy document. The investor statement answers the immediate question without fully defining the edges — and as the DLSS 5 controversy showed, the line between “AI-assisted production” and “AI-generated content” can become blurry fast.
What Comes Next
Capcom has not announced a formal public AI policy beyond what surfaced in the February 2026 investor Q&A. The industry-wide pressure on studios to clarify their positions is real — Larian committed to no generative AI art in its games after community pressure, and multiple publishers have faced backlash that required public statements. A more detailed Capcom policy seems likely as upcoming titles move closer to release.
Capcom’s upcoming slate — which includes titles across PlayStation 5, Xbox Series X/S, and PC — will face scrutiny from a player base that has now seen what happens when the commitment to human-made assets is not enforced consistently. The investor communication is the current record. Whether it holds under production pressure is the question.