Some of the biggest tech news today is that Apple has quietly blocked apps like Replit from its App Store, constraining the average user’s ability to “vibe code” apps. Experts cite Apple’s concerns that vibe coding could alter existing app code, along with worries about data privacy.

But that leads to the larger and more overarching question: is vibe coding legal? And are there limits to what users can legally do in using AI to code mobile apps?

It’s actually not a cut-and-dried answer, and if you’re into the minutiae of how law and liability apply, it’s kind of interesting.

Not Illegal to Code

First of all, let’s separate the act of coding, or “vibe coding” in particular, from the act of offering an app, vibe coded or otherwise, on a platform. The coding itself is clearly legal: it is not against the law to vibe code apps, even if they are crappy, don’t adhere to standards, or don’t protect user data.

Any potential problem, if there is one, would come in when publishing such an app in either the Apple App Store or the Google Play store.

Here, one might hypothetically face liability if a lack of data protection harms a user. That is more likely to come in the form of a user-generated lawsuit, since few AI-specific laws have actually made it through the U.S. legislature. As Rep. Elissa Slotkin noted in a recent hearing, many constituents would like Congress to be more prolific in this regard.

“Congress is behind in putting left and right limits on the use of AI, and the first place to start should be at the Pentagon,” Slotkin said, in introducing an AI Guardrails Act that would limit, for example, fully autonomous lethal weapons. “My bill ensures a human is involved when deadly autonomous weapons are fired, AI cannot be used to spy on the American people, and that a human is on the switch to launch nuclear weapons. AI is going to shape the future of America’s national security, and we must win the AI race against China. But to do that, we need action that puts limits on AI in the Department of Defense. This is just common sense.”

Presumably, once the legislature gets that done, it might do something on vibe coding.

A Moving Target

To that point, I also want to note that this stuff is changing at an incredible pace. Glance offers this take in an Expert Guide Series on the current state of artificial intelligence:

“The legal landscape for apps built with vibe coding and AI-generated code is shifting faster than most developers can keep up with. What was considered legally acceptable just months ago might now put your app at risk of being pulled from stores or facing intellectual property disputes. This creates a real problem for developers who want to use these powerful AI tools but need to stay on the right side of the law.”

So, what is legal today might not be tomorrow.

“At Glance, we’ve seen firsthand how confusing this space can be,” the authors continue. “Clients come to us with apps that use AI-generated code, worried about everything from data protection compliance to whether they actually own the code their AI assistant wrote. These aren’t silly concerns—they’re legitimate legal questions that need proper answers.”

Yes, they are good questions. People should know what they’re up against if they release an app that doesn’t sufficiently guard user data. A good end run would be to code apps that don’t actually collect any significant user data. Presumably, there wouldn’t be many rules around those, except, maybe, that they shouldn’t promote or cause harm to users.

When you think about it a bit more deeply, though, we do need laws around the delivery of vibe coded apps. Otherwise, the gray area becomes a loophole in the law.

Lack of Knowledge is Not a Defense

Here’s another aspect of this to think about: when you’re vibe coding, coming up with the idea and letting AI do the work, if there are any “side effects” of either the concept or the execution, whoever did the vibe coding is still responsible. You couldn’t just say, “well, I don’t know code.” You’d still be liable. You might, or might not, escape an intent requirement in a criminal case. But the core idea is that app creation carries responsibility, regardless of whether you use AI.

This is not legal advice, in any way, shape, or form, and I am not a legal scholar. But some obvious issues rear their heads, and so, even if you think of Apple’s move as just another “walled garden” strategy, it still makes sense to examine the law around spinning up apps with AI. Stay tuned.