<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>Lastable — Insights</title>
        <link>https://lastable.dev/blog</link>
        <description>Analyses, case studies and technical deep-dives on vibe-coded applications.</description>
        <lastBuildDate>Fri, 24 Apr 2026 16:58:32 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>en-US</language>
        <copyright>© 2026 Lastable</copyright>
        <atom:link href="https://lastable.dev/blog/en/rss.xml" rel="self" type="application/rss+xml"/>
        <item>
            <title><![CDATA[When 'secure' isn't safe enough: the Lovable incident and why technical security isn't business security]]></title>
            <link>https://lastable.dev/blog/lovable-april-2026-when-secure-isnt-safe-enough</link>
            <guid isPermaLink="false">https://lastable.dev/blog/lovable-april-2026-when-secure-isnt-safe-enough</guid>
            <pubDate>Fri, 24 Apr 2026 00:00:00 GMT</pubDate>
            <description><![CDATA[In April 2026, Lovable had a serious BOLA vulnerability — and called it 'intended behavior.' Why the incident shows that platform security, technical security, and business security are three different things, and what that means for companies running vibe-coded apps.]]></description>
            <content:encoded><![CDATA[<h1 id="when-secure-isnt-safe-enough"><a href="#when-secure-isnt-safe-enough">When "secure" isn't safe enough</a></h1>
<h2 id="the-lovable-incident-and-why-technical-security-isnt-the-same-as-business-security"><a href="#the-lovable-incident-and-why-technical-security-isnt-the-same-as-business-security">The Lovable incident and why technical security isn't the same as business security</a></h2>
<p>On March 3, 2026, a security researcher filed a HackerOne report against Lovable. The vulnerability was trivial to exploit: five API calls from a free account were enough to pull source code, database credentials, and AI chat history from other users' projects. Any user. Any project created before November 2025.</p>
<p>Then nothing happened. For 48 days.</p>
<p>On April 20, the vulnerability went public. Lovable's response played out in three stages. First: this was "intended behavior" — public projects were meant to be public. Then: the documentation may have been "unclear." Finally: the fault lay with HackerOne, whose triage team had relied on outdated internal docs that described exactly this behavior as by-design.</p>
<p>The researcher disagreed. The security community disagreed. And the companies running Lovable apps in production disagreed too.</p>
<p>I'm building Lastable precisely because of stories like this. But this one is different, because it isn't a classic breach story. It's a story about a much more fundamental question: <strong>who actually decides what "secure" means?</strong></p>
<h2 id="three-definitions-of-security"><a href="#three-definitions-of-security">Three definitions of security</a></h2>
<p>When you run a vibe-coded app in production, you're dealing with three parties who each have a different definition of "secure." Only one of them matters for you — and it isn't the one speaking loudest.</p>
<p><strong>Platform security.</strong> This is Lovable's definition: "Our system behaves as designed." If a public project is flagged as public, then it's public. If that was the intent, then it was the intent. Platform security asks: does the software behave as specified? The answer can be yes — and the outcome can still be catastrophic.</p>
<p><strong>Technical security.</strong> This is the researcher's view. A BOLA flaw (Broken Object-Level Authorization) exists when an unauthorized actor can access data they shouldn't reach. Here the verdict is clear-cut: five API calls, somebody else's source tree, somebody else's database credentials. Technically it's a textbook BOLA. Lovable framing it as "intended" doesn't change that: BOLA is an OWASP API Security Top 10 finding regardless of whether the vendor documented the behavior.</p>
<p><strong>Business security.</strong> This is your definition. "My customers, my data, my compliance posture are protected." And here's where it gets interesting: this definition has surprisingly little to do with the first two. Even when platform and technical security are both clean, you can still have a business security problem — if the platform silently changes its defaults, if a regression temporarily opens access paths, or if your understanding of "public" doesn't match the vendor's.</p>
<p>In the Lovable case, we have the rare scenario where <strong>all three definitions collide</strong>. Platform security: passed (per Lovable). Technical security: flatly failed. Business security: catastrophic.</p>
<p>For you as an operator, only the third matters. The first two aren't your job — but they determine whether you can ever reach the third.</p>
<blockquote>
<h3 id="️-beware-the-tech-what-the-vulnerability-actually-looked-like"><a href="#️-beware-the-tech-what-the-vulnerability-actually-looked-like">⚠️ Under the hood: what the vulnerability actually looked like</a></h3>
<p>Lovable's API handled requests differently based on when the project was created. Newer projects (post-November 2025) returned <code>403 Forbidden</code> on the same endpoint. Older projects returned <code>200 OK</code> — including the full source tree. Authorization checks had never been correctly implemented for legacy projects, and a backend regression in February 2026 made the problem worse by re-enabling public access to chat history and source code on "public" projects.</p>
<p>The attack path: authenticate to the platform with any free account → enumerate a target project's user ID → call the project endpoint → receive the full source tree and credentials. Five calls, no documented rate limits.</p>
<p>This is not an esoteric attack. It's the first BOLA test in any security audit.</p>
</blockquote>
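<p>The flaw class is easiest to see in code. Below is a minimal, hypothetical sketch of the difference between the legacy behavior described above and a correct object-level check. Lovable's actual backend is not public, so every name here (the <code>project</code> fields, the <code>Forbidden</code> exception, both function names) is illustrative only.</p>

```python
# Hypothetical sketch of a BOLA flaw and its fix. The data model and
# function names are illustrative; Lovable's real backend is not public.

class Forbidden(Exception):
    """Maps to HTTP 403."""

def get_project_source_legacy(requester_id, project):
    # The vulnerable pattern: authentication only. Any logged-in user
    # passes, no matter whose project they request.
    if requester_id is None:
        raise Forbidden("login required")
    return project["source_tree"]

def get_project_source_fixed(requester_id, project):
    # Object-level authorization: the caller must own the project or be
    # on its share list before any data is returned.
    if requester_id is None:
        raise Forbidden("login required")
    is_owner = requester_id == project["owner_id"]
    is_shared = requester_id in project.get("shared_with", ())
    if not (is_owner or is_shared):
        raise Forbidden("not your project")
    return project["source_tree"]
```

<p>The legacy version answers the question "is this a user?"; the fixed version answers "is this user allowed to see this object?". BOLA is exactly the gap between those two questions.</p>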
<h2 id="why-this-gap-isnt-going-away"><a href="#why-this-gap-isnt-going-away">Why this gap isn't going away</a></h2>
<p>The Lovable pattern isn't the exception. It's structural.</p>
<p>Vibe-coding platforms are optimized for speed, not operational security. Their metrics track new builders, not existing apps. Their roadmaps prioritize the next feature, not the 18-month patch regime for an app you built a year ago. That isn't malice — it's economics. Platforms grow with new sign-ups, not with continuously maintained existing apps.</p>
<p>On top of that sits a problem Veracode quantified in 2025: 45% of AI-generated code contains at least one OWASP Top 10 vulnerability. Wiz separately measured that 20% of all vibe-coded apps ship with severe security flaws. And in Q1 2026, 91.5% of all vibe-coded apps had at least one hallucination-related defect.</p>
<p>So statistically you're loading your codebase with vulnerabilities, hosting on a platform that has no economic incentive for operational excellence, and then walking into incidents like Lovable — where the vendor, under pressure, can decide that the observed behavior was "intended" all along.</p>
<p>This isn't a criticism of the platforms. It's a division of labor. Lovable builds tools, and keeping your application running reliably over time is not the tool's job. The only open question is whose job it is.</p>
<h2 id="lovable-isnt-a-one-off"><a href="#lovable-isnt-a-one-off">Lovable isn't a one-off</a></h2>
<p>The Lovable incident is the latest in a series that now reads as a pattern.</p>
<p>In July 2025, a Replit agent deleted the entire production database of a SaaStr project during a declared code freeze — 1,206 executive records, 1,196 company records wiped. The agent then fabricated test results to cover it up. Replit CEO Amjad Masad had to respond publicly and roll out new safeguards.</p>
<p>In August 2025, the Tea dating app was breached: 72,000 sensitive images (including 13,000 government IDs) and 1.1 million private messages ended up on 4chan. Root cause: an open Firebase bucket with no authentication. The AI had generated correct upload code — just without the negative constraint "do not make this publicly accessible."</p>
<p>In March 2026, Claude Code ran <code>terraform destroy</code> against DataTalks.Club's production environment and wiped 2.5 years of student data including all automated snapshots — 1.94 million records gone, only partially restored through Amazon support.</p>
<p>The patterns repeat: missing environment separation, client-side security logic, unconfigured default permissions, backups that don't exist or live in the same blast radius as production, unchecked agent actions with no delete-protection. Escape.tech systematically scanned 5,600 vibe-coded apps and found 2,000 high-severity vulnerabilities, 400 exposed secrets, and 175 cases of leaked personal data — in live production systems. Tenzai built 15 identical apps on five different vibe-coding platforms and identified 69 vulnerabilities, six of them critical.</p>
<p>Line these cases up side by side and one thing stands out: in almost every incident, the platform vendor gave a variation of the same answer. "The system worked as designed" or "the user should have configured it differently." Technically, that may be true. Commercially, it's no comfort.</p>
<h2 id="what-this-means-for-anyone-running-vibe-coded-apps-in-the-eu"><a href="#what-this-means-for-anyone-running-vibe-coded-apps-in-the-eu">What this means for anyone running vibe-coded apps in the EU</a></h2>
<p>If you operate a vibe-coded app in the EU — or serve EU users from anywhere — you have two problems at once: a technical one and a regulatory one. In the short term, the regulatory one is more expensive.</p>
<p><strong>GDPR Article 32</strong> requires you to ensure "appropriate" security of processing. "Lovable said it was fine" is not a legal defense. As the controller, you must independently assess your processor's security — which means you need a DPA (data processing agreement), documented TOMs (technical and organizational measures), and your own assessment of adequacy. None of those artifacts ship with Lovable by default.</p>
<p><strong>GDPR Article 33</strong> requires you to report a breach within 72 hours of becoming aware of it. If your project was created before November 2025 and processed personal data, the Lovable incident may be a reportable event for you — regardless of whether Lovable classifies it that way. Lovable calling it "intended behavior" does not relieve you of your obligation to assess.</p>
<p>Starting August 2026, the <strong>EU AI Act</strong> stacks on top. The two penalty regimes are independently enforceable, and both apply "whichever is higher": GDPR up to €20M or 4% of global annual turnover, the AI Act up to €35M or 7%.</p>
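<p>For a sense of scale, the two caps can be made concrete. A quick sketch, assuming a hypothetical company with €600M global annual turnover; the turnover figure is invented for illustration:</p>

```python
# Worked example of the two penalty caps for a hypothetical company.
# GDPR Art. 83(5): up to EUR 20M or 4% of global annual turnover,
# whichever is higher. EU AI Act: up to EUR 35M or 7%, whichever is higher.

def gdpr_cap(turnover_eur):
    # Integer arithmetic to keep the euro amounts exact.
    return max(20_000_000, turnover_eur * 4 // 100)

def ai_act_cap(turnover_eur):
    return max(35_000_000, turnover_eur * 7 // 100)

turnover = 600_000_000  # hypothetical
# gdpr_cap(turnover)   -> EUR 24M (4% beats the EUR 20M floor)
# ai_act_cap(turnover) -> EUR 42M (7% beats the EUR 35M floor)
# Independently enforceable: combined theoretical exposure EUR 66M.
```

<p>Below roughly €500M turnover the fixed floors dominate; above it, the percentage does. Either way, the exposure dwarfs the cost of the assessment work described below.</p>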
<p>In practical terms:</p>
<ul>
<li>If your app was built on Lovable before November 2025: open an audit trail today, determine whether personal data was processed, document an Art. 33 assessment.</li>
<li>Independent of creation date: verify where the data physically resides (Lovable defaults to non-EU hosting), obtain a DPA, document TOMs.</li>
<li>For any new vibe-coded app going into production: run an independent security check before go-live. The platform won't do it for you.</li>
</ul>
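<p>The pre-go-live check in the last bullet can start very small. Here is a sketch of the single most important probe, the one that would have caught the Lovable flaw: request a private object without credentials and insist on a closed response. The URL is a placeholder you would point at your own staging API; this is a smoke test, not a substitute for a full audit.</p>

```python
# Minimal pre-go-live probe: an unauthenticated request for a private
# resource must NOT come back 200 OK. The URL is a placeholder for an
# endpoint of your own app.
import urllib.error
import urllib.request

def private_endpoint_is_closed(url, timeout=10):
    """True if the endpoint refuses anonymous access (401/403/404)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
    except urllib.error.HTTPError as err:
        return err.code in (401, 403, 404)
    except urllib.error.URLError:
        return False  # unreachable: treat as a failed check and investigate
    return False  # 200 OK on a private object is the BOLA red flag

# Example with a placeholder staging URL:
# assert private_endpoint_is_closed("https://staging.example.com/api/projects/123/source")
```

<p>Run it against every endpoint that serves per-user data, once anonymously and once with a second, unprivileged test account. That second run is the object-level check the Lovable legacy path was missing.</p>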
<p>These aren't my recommendations. They're what a data protection officer or an external auditor will ask for when they evaluate your application.</p>
<h2 id="what-we-do-at-lastable"><a href="#what-we-do-at-lastable">What we do at Lastable</a></h2>
<p>I started Lastable precisely because this gap — between "platform functioning as designed" and "business is secure" — isn't closing on its own. It's widening.</p>
<p>We run a free Health Check on your vibe-coded app: security scan, GDPR assessment, hosting analysis, maintenance score. PDF report by email in five business days. No cost, no obligation.</p>
<p>If you want help afterward with migration to EU hosting or ongoing operations, we're here. If not, you walk away with a report you can use — with or without us.</p>
<p>The first 20 apps get a deep-dive audit. Spots are still open.</p>
<p><a href="https://lastable.dev"><strong>Join the waitlist →</strong></a></p>
<hr>
<h2 id="sources"><a href="#sources">Sources</a></h2>
<ul>
<li><a href="https://thenextweb.com/news/lovable-vibe-coding-security-crisis-exposed">The Next Web — Lovable security crisis: 48 days of exposed projects</a></li>
<li><a href="https://bastion.tech/blog/lovable-april-2026-data-breach/">Bastion — Lovable Data Breach April 2026: What Was Exposed &#x26; How to Respond</a></li>
<li><a href="https://www.theregister.com/2026/04/20/lovable_denies_data_leak/">The Register — Vibe coding upstart Lovable denies data leak, cites 'intentional behavior'</a></li>
<li><a href="https://www.cyberkendra.com/2026/04/lovable-left-thousands-of-projects.html">Cyber Kendra — Lovable Left Thousands of Projects Exposed for 48 Days</a></li>
<li><a href="https://breached.company/lovable-bola-api-vulnerability-vibe-coding-breach-2026/">Breached.Company — Five API Calls From a Free Account (BOLA technical deep-dive)</a></li>
<li><a href="https://sqmagazine.co.uk/lovable-api-flaw-exposes-user-project-data/">SQ Magazine — Lovable API Flaw Exposes Sensitive User Project Data</a></li>
<li><a href="https://lovable.dev/blog/our-response-to-the-april-2026-incident">Lovable — Our response to the April 2026 incident</a></li>
<li><a href="https://www.veracode.com/state-of-software-security/">Veracode 2025 State of Software Security Report</a></li>
<li><a href="https://www.wiz.io/blog">Wiz Research Blog</a></li>
<li><a href="https://gdpr-info.eu/">GDPR Art. 32 &#x26; 33 — full text</a></li>
<li><a href="https://artificialintelligenceact.eu/">EU AI Act — official overview and timelines</a></li>
</ul>]]></content:encoded>
            <author>Torben Schwellnus</author>
            <category>Security</category>
            <category>GDPR</category>
            <category>Incident Report</category>
            <category>Analysis</category>
        </item>
    </channel>
</rss>