Finji, the company responsible for publishing games like Night in the Woods, Tunic, and the upcoming Usual June, has been locked in a bizarre battle against TikTok, which the company says has been creating and posting AI-generated ads using its characters without its permission.

IGN has an extensive report on the situation, which started gaining attention earlier this month when Finji CEO and co-founder Rebekah Saltsman posted on Bluesky asking anyone who saw Finji ads on social media that were “UN-Finji-like” to reach out with screenshots of the offending ads. As IGN’s story lays out, Finji had been using TikTok to promote its games, including the upcoming paranormal adventure game Usual June, and had turned off all of the app’s supposedly optional genAI permissions.

However, after a flood of social media comments from concerned viewers, the company was clued into what appear to be AI-generated ads on TikTok that look as if they came directly from Finji. According to IGN, which viewed screenshots of the ads in question, one included an altered image of June, the protagonist of Usual June, “with a bikini bottom, impossibly large hips and thighs, and boots that rise up over her knees,” which is a far cry from the character’s actual appearance and, given that she is a Black woman, plays into racist stereotypes. Finji says it had no option to view or edit these AI ads.

IGN’s story documents the back-and-forth between Finji and TikTok, but the gist of what happened here is that the short-form video app’s customer support has given the company the runaround, at one point even claiming it saw no indication of AI-generated assets in the offending ads and asserting that Finji must have uploaded the image of June itself.

After Finji escalated the issue, TikTok backed down and said that it was no longer disputing the publisher’s claim that it was making unauthorized alterations to Finji’s ads with the use of AI. Instead, TikTok said the company’s campaign had included an ad that used a “catalog ads format designed to demonstrate the performance benefits of combining carousel and video assets in Sales campaigns,” for which an opt-out was not guaranteed. The response in no way addressed Finji’s larger concerns about the racist and sexualized changes made to the company’s imagery. When Finji pushed to escalate the issue, TikTok said it would “re-escalate the issue internally,” but Finji has not heard back from the company since. TikTok declined to comment on the record when IGN reached out.

“This is just simply embarrassing but not for me as an individual,” Saltsman told IGN. “For me—I am just super pissed off. This is my work, my team’s work, and mine and my company’s reputation—which I have spent over a decade building. My expectation was a proper apology, systemic changes in how they use this technology for paying clients, and a hard look at why their technology is so obviously racist and sexist. I am obviously not holding my breath for any of the above.”

What a shitshow. But given that genAI evangelists are scrambling to invent an actual use case for the technology to fend off the slop accusations, it’s not too surprising that companies are shoving it into places without permission. It’s not like any company worth a damn is going to use this tech to “enhance” its ads when it would rather those ads reflect what it’s actually selling.