A GOTY Winner Stripped of Awards Over AI? The Real Story

Imagine this: you've just created a record-breaking RPG, a game so beloved it just snagged the coveted Game of the Year award at The Game Awards. The champagne is flowing, the team is celebrating... and then the phone rings. It's the Indie Game Awards, and they're taking back not one, but *two* of your trophies. That's the dizzying reality for Sandfall Interactive, the studio behind the celebrated *Clair Obscur: Expedition 33*. The reason? A few placeholder textures made with generative AI that were, by their own admission, mistakenly left in the game at launch.

Key Highlights

  • ✓ *Clair Obscur: Expedition 33* had two Indie Game Awards revoked after winning Game of the Year at The Game Awards 2025.
  • ✓ The developer, Sandfall Interactive, admitted to using generative AI for placeholder textures that were accidentally left in the launch build.
  • ✓ The retraction was because Sandfall had previously stated no gen AI was used in their submission to the Indie Game Awards.
  • ✓ The controversy ignited amidst a separate backlash against Larian boss Swen Vincke for comments on using AI in the development of the next *Divinity* game.
  • ✓ The incident highlights a massive industry-wide debate, with companies like EA and Square Enix embracing AI while others face intense fan criticism.
  • ✓ Epic Games CEO Tim Sweeney has called on Valve to remove its mandatory AI disclosure policy on Steam.

This story isn't just about a few overlooked assets, though. It's a flashpoint in the gaming industry's most explosive and divisive debate: the role of artificial intelligence in creative work. This incident rips the cover off a much bigger conversation about transparency, artistic integrity, and the future of how our favorite games are made. What happened with *Clair Obscur* is a perfect storm, revealing just how sensitive this topic has become for both developers and players.

From The Pinnacle to a Public Rebuke

The whiplash for Sandfall Interactive must have been intense. One moment they're the darlings of the industry, clutching the biggest prize from The Game Awards 2025. The next, they're at the center of a firestorm. The source of the trouble traces back to an interview Sandfall's co-founder, François Meurisse, gave to the Spanish newspaper El País back in July. In it, he casually mentioned, “We use some AI, but not much,” noting how tools in Unreal Engine 5 helped them achieve incredible results.

At the time, the comment barely made a ripple. But in the volatile landscape of late 2025, with gamer sentiment against AI at an all-time high, those words resurfaced with a vengeance. The timing was awful for Sandfall, as the gaming community was already fired up over comments from Larian's Swen Vincke about AI use in their next project. Suddenly, all eyes were on *Clair Obscur*, and what was once a quiet admission became a smoking gun.

The Indie Game Awards (IGAs) acted swiftly. They announced they were retracting both the Debut Game and Game of the Year awards from *Clair Obscur*. Here's the critical part: it wasn't just *because* they used AI. According to the IGA's statement, it was because a representative from Sandfall had explicitly agreed that "no gen AI was used" when they submitted the game for consideration. That's the detail that turns this from a debate about technology into a matter of rules and integrity.

The Aftermath and the New Winners

The IGAs have a "hard stance" on generative AI, and breaking that rule, even unintentionally, led to disqualification. As a result, the awards were passed down to the runners-up. The Debut Game award now belongs to *Sorry We’re Closed*, and the coveted Game of the Year title was given to *Blue Prince*. From my perspective, this sets a powerful precedent. It tells every indie developer out there: be upfront about your tools, because the community and the institutions that celebrate you are paying very close attention.

💡 What's Interesting: The Indie Game Awards' decision highlights a crucial distinction. The problem wasn't purely the use of AI for temporary assets, but the failure to disclose it, which violated the terms of their submission. This shifts the focus from an ethical debate about AI to a clear-cut case of rule-breaking.

A "Mistake" or a Sign of the Times? Sandfall's Defense

So, what does Sandfall have to say for itself? In a clarifying statement issued to El País, the studio laid out its version of events. They claim that way back in 2022, when AI art tools first started popping up, some team members "briefly experimented with them to generate temporary placeholder textures." For those unfamiliar with game development, placeholders are common; they're temporary assets used during production until the final, polished art is ready.

According to Sandfall, a few of these AI-generated placeholder textures were missed during the Quality Assurance (QA) process and accidentally made it into the final launch build. They insist that once discovered, these were patched out within five days and replaced with the intended, human-created textures. They are adamant that "there are no generative AI-created assets in the game" now. It's a plausible explanation, and anyone who has worked on a large creative project knows that mistakes and oversights happen.

However, the optics are challenging. In today's climate, any undisclosed use of AI is viewed with deep suspicion by a large segment of the gaming audience. The "it was just an experiment" defense, while potentially true, sounds like a convenient excuse to skeptics. What this really tells us is that the line between experimentation and implementation is now under intense scrutiny. A developer's internal process is suddenly a matter of public debate, and a small oversight can spiral into a major controversy that tarnishes a game's otherwise stellar reputation.

The Larian Connection: How One Spark Ignited Another

You can't fully understand the *Clair Obscur* situation without looking at what was happening simultaneously with Larian Studios, the masterminds behind *Baldur's Gate 3*. Just as The Game Awards were celebrating the year's biggest hits, Larian boss Swen Vincke found himself in hot water. A Bloomberg report stated that Larian was "pushing hard" on generative AI for their next *Divinity* game.

The article specified that the studio was using the technology to "explore ideas, flesh out PowerPoint presentations, develop concept art and write placeholder text." Even though the piece noted these efforts hadn't led to big efficiency gains, the mere mention of a beloved, artist-forward studio like Larian using gen AI sent shockwaves through the community. Fans who champion Larian for its handcrafted detail felt betrayed, fearing it was a step toward replacing human artists and writers.

This backlash created a hyper-sensitive environment. It was this pre-existing outrage that made Meurisse's old comments about *Clair Obscur* so explosive. The Larian news primed the audience to be on high alert for AI use, and Sandfall inadvertently walked right into the crossfire. Vincke has since tried to calm the storm, promising an "Ask Me Anything" (AMA) session to provide more transparency. But the damage was done, and it perfectly illustrates the tightrope developers are walking. Even using AI for internal, non-final work is now seen as a potential betrayal of trust.

An Industry Divided: Embracers, Experimenters, and Evaders

The incidents with Sandfall and Larian are just the tip of the iceberg. The entire video game industry is currently wrestling with its AI identity, and it's creating a fascinating, if chaotic, landscape. We're seeing studios fall into a few distinct camps. On one side, you have the "Evaders"—companies caught using AI who then have to backtrack amidst fan backlash. Ubisoft had to pull an AI-generated image from *Anno 117: Pax Romana*, and Activision faced complaints over AI art in *Call of Duty: Black Ops 7*.

Then you have the vocal "Embracers." EA CEO Andrew Wilson has boldly stated that AI is "the very core of our business." Square Enix recently went through mass layoffs and a reorganization, explicitly stating a need to "be aggressive in applying AI." And Dead Space creator Glen Schofield sees AI as a key tool to "fix" the industry's broken development pipelines. These companies see AI not as a creative replacement, but as a revolutionary tool for efficiency and scale, allowing them to build bigger worlds faster.

What's really fascinating is the disconnect between the corporate boardrooms and the player base. While executives are touting efficiency and innovation, many gamers hear "AI" and think of soulless art, job losses for talented artists, and unethical data scraping. This chasm of perception is the source of all the friction. Studios trying to quietly experiment, like Sandfall and Larian, are getting caught in the middle, blasted by players who feel that any use of generative AI crosses an ethical line.

The Disclosure Dilemma: To Tell or Not to Tell?

All of this turmoil naturally leads to one central question: should developers be required to disclose their use of AI? This is where another industry titan, Epic Games CEO Tim Sweeney, enters the fray. He has publicly criticized Valve's policy, which requires developers to disclose any AI-generated content on their game's Steam store page. Sweeney seems to believe this is an unnecessary level of scrutiny, sarcastically questioning if developers should also have to disclose the brand of shampoo they use.

On the other hand, Valve's policy is a direct response to consumer demand for transparency. Take a look at the Steam page for a game like *Arc Raiders*, and you'll see a clear note explaining that they use AI-based tools to "assist with content creation" but that the final product reflects the team's own creativity. For many players, this is the ideal middle ground. It acknowledges the use of new technology while reassuring them that humans are still steering the ship. It builds trust, which is something the industry desperately needs right now.

From my viewpoint, the disclosure debate is really about control and information. Sweeney's stance represents a developer-centric view where the tools used are an internal matter. Valve's policy, and the fan backlash that prompted it, represents a player-centric view where consumers have a right to know how the products they buy are made. As we move forward, I suspect the pressure for transparency will only grow. The days of quietly using AI without anyone noticing are likely over.

Conclusion

The saga of *Clair Obscur: Expedition 33* is so much more than a simple story about a few rogue textures. It's a cautionary tale for the entire industry. It shows that in the current climate, transparency isn't just a good idea—it's a necessity for survival. Sandfall Interactive's critical error wasn't just experimenting with AI; it was failing to be upfront about it in a context where rules explicitly forbade it. That single misstep cost them two prestigious awards and threw their celebrated game into a vortex of controversy.

The bottom line is this: the AI genie is out of the bottle, and it's not going back. The technology will continue to evolve and integrate into game development pipelines. The real challenge ahead lies not in stopping it, but in defining the ethical and transparent framework for its use. Developers must learn to communicate openly with their audience, and players will have to decide what level of AI integration they're willing to accept. The story of *Clair Obscur* will be remembered as one of the first major casualties in this new, uncertain, and fascinating era of game creation.

About the Author

This article was written by the editorial team, dedicated to bringing you the latest news, trends, and insights.
