vibe-coding security lovable data-breach BOLA

Lovable Exposed Everything and Called It Intentional

Five API Calls. That’s All It Took.

A security researcher created a free Lovable account, fired off five API calls, and pulled another user’s source code, database credentials, AI chat logs, and customer data. No exploit kit. No zero-day. Just a missing authorization check on the API — the kind of bug you learn to avoid in your first week writing CRUD endpoints.

This is Lovable, the $6.6 billion vibe-coding darling with eight million users. The platform that lets you “build apps with AI” without writing code. Turns out it also lets strangers read your entire project history without writing code.

The Bug That Wasn’t a Bug

The vulnerability is a textbook BOLA — Broken Object Level Authorization. Lovable’s API endpoints didn’t validate whether the requesting user actually owned the project they were accessing. Every project created before November 2025 was exposed. Tens of thousands of them.
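
For anyone who hasn’t written one of these checks: here’s what the failure looks like in a generic Express handler. This is a sketch, not Lovable’s actual code — the route, the `db` helper, the `requireLogin` middleware, and the `ownerId` field are all stand-ins:

```typescript
import express, { Request, Response, NextFunction } from "express";

// Stand-ins for whatever the real backend uses.
declare const db: {
  projects: { findById(id: string): Promise<{ ownerId: string } | null> };
};
// Assumed to authenticate the caller and set res.locals.userId.
declare function requireLogin(req: Request, res: Response, next: NextFunction): void;

const app = express();

// VULNERABLE: verifies that *a* user is logged in, but never that the
// project belongs to *that* user. Textbook BOLA.
async function getProjectVulnerable(req: Request, res: Response) {
  const project = await db.projects.findById(req.params.id);
  res.json(project); // any free account can read any project
}

// FIXED: object-level check. The resource's owner must match the caller.
async function getProjectFixed(req: Request, res: Response) {
  const project = await db.projects.findById(req.params.id);
  if (!project || project.ownerId !== res.locals.userId) {
    return res.sendStatus(404); // 404, not 403: don't confirm the ID exists
  }
  res.json(project);
}

app.get("/api/projects/:id", requireLogin, getProjectFixed);
```

One comparison. That’s the entire difference between a multi-tenant platform and a public filing cabinet.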

But here’s where it gets worse. Lovable stores the full AI conversation history tied to each project. Every prompt. Every pasted error log. Every time a developer walked the AI through building database tables with fields like email, date_of_birth, and stripe_customer_id. An attacker doesn’t just get your code — they get the entire thought process behind it, credentials included.

48 Days of Nothing

The researcher reported the flaw to Lovable through HackerOne on March 3rd. Lovable’s triage partners looked at it, decided the behavior was “intentional,” and closed the report. When the researcher filed a second report documenting additional affected endpoints, Lovable marked it as a duplicate and closed that one too.

Forty-eight days. Nearly seven weeks of every pre-November project sitting there like an open filing cabinet in a public library. Source code, secrets, customer PII — all of it accessible to anyone with a free account and a curl command.
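
To be clear about how low the bar is, the attack amounts to one authenticated GET per project. A sketch only — the host, route, and token handling here are invented for illustration, not Lovable’s real API:

```typescript
// Illustrative only: "api.example.dev" and the route are placeholders.
// The token is the attacker's own, from a legitimate free account.
const TOKEN = process.env.SESSION_TOKEN;

async function grabProject(projectId: string) {
  const res = await fetch(`https://api.example.dev/projects/${projectId}`, {
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
  // Against a BOLA-vulnerable API this returns other people's projects.
  // Against a correct one it should 404.
  return res.ok ? res.json() : null;
}
```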

When The Register finally broke the story on April 20th, Lovable’s response was to deny a “data leak” and blame HackerOne’s triage process. Classic. The house is on fire and you’re arguing about who left the stove on.

The Regression That Nobody Caught

It gets better. Back in February, Lovable unified permissions in their backend and accidentally re-enabled access to chat histories on public projects. A regression that undid a previous API patch. So they’d already found and fixed a version of this bug once — then broke it again during a refactor and nobody noticed.

This is what happens when you move fast and break things in a codebase that stores other people’s secrets. No integration tests catching authorization changes. No audit trail flagging that a previously locked endpoint was suddenly open again. Just vibes.
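
Those tests are the cheapest insurance in software. A handful of negative cases pins the behavior down so a permissions refactor can’t silently reopen the door. Here’s a sketch in a supertest/Jest style, with the endpoints and test fixtures assumed rather than taken from Lovable:

```typescript
import request from "supertest";
import { app } from "./app"; // the app under test (hypothetical module)

// Assumed test fixtures, not real helpers.
declare function loginAs(name: string): Promise<{ token: string }>;
declare function createProjectAs(name: string): Promise<{ id: string }>;

describe("project authorization", () => {
  it("refuses to serve another user's project", async () => {
    const alice = await loginAs("alice");
    const bobsProject = await createProjectAs("bob");

    const res = await request(app)
      .get(`/api/projects/${bobsProject.id}`)
      .set("Authorization", `Bearer ${alice.token}`);

    expect(res.status).toBe(404); // not 200, and ideally not a revealing 403
  });

  it("refuses another user's chat history, even on a public project", async () => {
    const alice = await loginAs("alice");
    const bobsProject = await createProjectAs("bob");

    const res = await request(app)
      .get(`/api/projects/${bobsProject.id}/chat`)
      .set("Authorization", `Bearer ${alice.token}`);

    expect(res.status).toBe(404);
  });
});
```

Run that in CI and the February regression dies in a pull request instead of in The Register.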

The Structural Problem

Lovable isn’t an outlier. It’s the inevitable result of an industry that treats security as a post-launch afterthought. Georgia Tech’s Vibe Security Radar tracked 35 CVEs directly attributable to AI coding tools in March 2026 alone. The real number is probably five to ten times that.

The math is simple: 40-62% of AI-generated code contains vulnerabilities. 91.5% of vibe-coded apps had at least one AI hallucination-related flaw in Q1 2026. And 60% of all new code is projected to be AI-generated by year’s end. We’re not building a house of cards — we’re building a city of them.

What You Should Actually Do

If you’ve ever built anything on Lovable — especially before November 2025 — rotate every credential, API key, and database password that touched the platform. Assume your AI chat history was read. If you pasted secrets into the chat (and you probably did), those are compromised.
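
Not sure what you pasted? Export whatever chat history you can and scan it for anything credential-shaped before you rotate. A crude sketch — the file path is yours to fill in, the patterns are nowhere near exhaustive, and a real scanner like gitleaks or trufflehog will do this far better:

```typescript
import { readFileSync } from "node:fs";

// Rough credential-shaped patterns. Illustrative, not exhaustive.
const PATTERNS: Record<string, RegExp> = {
  stripeSecretKey: /sk_(live|test)_[0-9a-zA-Z]{10,}/g,
  awsAccessKeyId: /AKIA[0-9A-Z]{16}/g,
  postgresUrl: /postgres(ql)?:\/\/\S+:\S+@\S+/g,
  jwt: /eyJ[\w-]+\.[\w-]+\.[\w-]+/g,
};

// Assumes you've saved your exported chat history as a local text file.
const text = readFileSync("chat-history-export.txt", "utf8");

for (const [name, pattern] of Object.entries(PATTERNS)) {
  const hits = text.match(pattern) ?? [];
  if (hits.length > 0) {
    console.log(`${name}: ${hits.length} hit(s). Rotate these now.`);
  }
}
```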

And if you’re still vibe coding production apps without reviewing the output? This is your wake-up call. The AI doesn’t care about authorization. It doesn’t think about access control. It builds what you ask for and skips everything you didn’t think to mention. That’s not a tool problem. That’s an architecture problem. And no amount of “just ship it” energy is going to patch that.