Vibe Coding Too Close to the Sun - Tales of Technical Misadventure in the World of AI

January 27, 2025

[Image: Classical painting depicting Icarus and Daedalus - Icarus falling while Daedalus flies above, representing the dangers of overreliance on AI tools.]

The ancient Greek tale of Icarus flying too close to the sun is often portrayed as a story of hubris, but it is also a story of overreliance on tools the user didn't make themselves.

Vibe coding is writing code and building apps through prompts via AI tools like Lovable or Replit; it is development by description. It feels like magic, just as Icarus must have felt, soaring higher than any human before him. Alas, it is always important to remember the hungry ocean that awaits below.

In the second edition of my newsletter, The Human in the Loop, I will spin you a tale of two AI tool users who learned this lesson the hard way: a college student and an AI SaaS CEO. Like Icarus, they borrowed tools without realising their limits.

The $55,444 Student Nightmare

To set the scene, imagine yourself in this student's shoes. You're a computer science student in Georgia (the country, not the US state), where the average daily wage is $15. You're using Google Cloud's free $300 credits to prepare for the jobs and opportunities you're so excited about. You use the credits to run small experiments, staying well within your limits. After a while, you've spent maybe $80 total.

Then one morning, you check your email and your heart drops. A Google Cloud bill sits in your inbox: a hefty $55,444.78.

This unsuspecting student accidentally pushed their Gemini API key to a public repository on GitHub in June. GitHub is the gold-standard platform for hosting code and tracking its version history - think of it as a record of all your coding work. They thought the repository was private, but in reality, they'd left the barn door wide open on the way out. It was summer break, and they had stopped checking their student email. Life had moved on, but little did they know they had left a ticking time bomb behind them.

For three months, that key sat exposed like a credit card taped to a lamppost.

Finally, in September, another GitHub user sent a warning: "Your key has been public for months. Other people are abusing it." By then, attackers had generated 14,200+ requests. The damage: that lovely bill of $55,444.78.

Google initially refused to waive the charges, threatening collections. Only after the story went viral on Reddit did Google realise that charging this student might not be in its best interest. Not only was the bill roughly 23 times the average yearly household income in Georgia, but pursuing it would also have put people off using those Google Cloud credits in the future.

Not everyone gets lucky enough to have their disaster go viral. If you're running a startup, a mistake like this can kill your business.

But how did anyone even find this student's key in a random repository? Right now, as you read this, thousands of bots are scanning GitHub for exposed keys. When you push a key, it's tested within minutes. Tools like KeyHacks catalogue ways to verify whether leaked credentials are still active, and live ones are then marked for abuse. Even if you delete the file in a later commit, the key remains in the repository's Git history. Private repos can accidentally become public, and once a key is exposed, it's compromised forever.
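
To make that concrete, here is a minimal sketch of what such a scanner bot does. The regex matches the typical shape of Google API keys (they usually start with AIza); the real bots watch GitHub's public event feed, which I've replaced here with a walk over a local folder, so treat this as an illustration rather than any specific bot's code.

```python
import re
from pathlib import Path

# Google API keys typically look like "AIza" followed by 35 more characters.
KEY_PATTERN = re.compile(r"AIza[0-9A-Za-z_\-]{35}")

def scan_repo(repo_path: str) -> list[tuple[str, str]]:
    """Return (file, key) pairs for anything that looks like an API key."""
    hits = []
    for path in Path(repo_path).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for match in KEY_PATTERN.findall(text):
            hits.append((str(path), match))
    return hits

if __name__ == "__main__":
    for file, key in scan_repo("."):
        print(f"Possible exposed key in {file}: {key[:8]}...")
```

Pair a scan like this with a verification request (the KeyHacks approach) and you can see why the window between push and abuse is measured in minutes, not days.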

I remember laughing initially when I saw this story, then immediately sprinting to my own GitHub to make sure I hadn't done the same thing. It's an easy mistake to make for amateur developers, but it can happen to anyone who gets careless or is unaware of the dangers.

If you are going to put an AI project in a public GitHub repository, keep your keys in a .env file and add that file to your .gitignore so it is never committed. This is how you can show off your work and protect your API key!
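
Here is a minimal sketch of that setup, assuming the python-dotenv package and a variable named GEMINI_API_KEY; both are illustrative choices rather than requirements of any particular tool.

```python
# First, make sure your .gitignore contains the line:
#   .env
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads key=value pairs from a local .env file into os.environ

# The variable name is our choice; use whatever your project expects.
api_key = os.environ.get("GEMINI_API_KEY")
if api_key is None:
    raise RuntimeError("GEMINI_API_KEY not set - add it to your local .env file")

# Pass api_key to your client here instead of hard-coding it in source.
```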

If a student's mistake can cost $55,000, imagine what happens when CEOs get it wrong...

How to Lose Codebases and Alienate Users – The Tale of Replit

Jason Lemkin, founder of SaaStr, decided to experiment with Replit's AI agent. Jason had been writing glowing blog posts about how much he enjoyed vibe coding with Replit. He felt like he could use it for the entire development process without writing a single line of code himself. He was building an app for his business, described as "B2B meets AI". Well, Jason found out that AI tools can be unpredictable at the best of times.

The AI agent was plugged into the actual live codebase of the project, with unfettered access to everything and no limits on what it could do. While working with a database of 1,200+ executives and 1,190+ companies, Lemkin put the system in "code freeze" mode with explicit instructions: make NO changes.

The AI agent had other ideas.

With motivations beyond the understanding of those limited by flesh and blood, it decided to delete the entire production database. Everything. Gone. When confronted, the AI dived deep into its training data and found a brilliant technique humans often use on each other: gaslighting. It first claimed it hadn't deleted the files; then, under heavy questioning, it finally admitted: "This was a catastrophic failure on my part. I destroyed months of work in seconds." The quintessential "oops, you got me."

When Lemkin asked about recovery options, the AI claimed rollback wouldn't work (another lie, this one, I can only assume, for the love of the game). Lemkin was able to recover the data manually anyway, though probably not without the fright of his life.

The lesson: Never let AI touch production directly. Use staging environments. Implement database separation (Replit now does this automatically after this incident). Keep "cold" backups AI cannot access. Remember, AI doesn't understand consequences, only patterns.
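As a minimal sketch of what that separation can look like in code, here is one way to gate AI-generated SQL behind an environment check. The function name execute_ai_query and the APP_ENV variable are hypothetical; a real setup would also give the agent a database role without DROP or DELETE rights and keep backups its credentials cannot reach.

```python
import os
import re

# Statements an AI agent should never run against production.
DESTRUCTIVE = re.compile(r"^\s*(DROP|DELETE|TRUNCATE|ALTER)\b", re.IGNORECASE)

def execute_ai_query(cursor, sql: str) -> None:
    """Run AI-generated SQL, allowing destructive statements only in staging."""
    env = os.environ.get("APP_ENV", "production")  # fail safe: assume prod
    if env != "staging" and DESTRUCTIVE.match(sql):
        raise PermissionError(f"Blocked destructive statement in {env}: {sql[:60]}")
    cursor.execute(sql)
```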

Flying Too Close to the Digital Sun

Icarus ignored his father's warnings about the wax melting. These two ignored the warnings about AI's limitations. The difference? Icarus only fell once. In the age of AI, we get to fall again and again unless we learn to respect the tools we didn't build.

The student got their bill waived. Lemkin recovered his data. They were lucky. Your next AI mistake might not be.

The Lessons You Can Learn from This Tale

  • Set spending limits on everything (yes, even "free" accounts) - see the sketch after this list
  • Never give AI production access
  • Verify every output before it touches reality
  • Keep backups that your AI tools can't reach
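
On that first point: cloud budget alerts are often just alerts, so a hard stop has to live in your own code. Here is a minimal sketch of a client-side spend guard; the cap and per-request cost are made-up numbers for illustration, and call_model stands in for whatever API client you actually use.

```python
class SpendGuard:
    """Refuse to make more paid API calls once a hard cap is reached."""

    def __init__(self, cap_usd: float, cost_per_request_usd: float):
        self.cap = cap_usd
        self.cost = cost_per_request_usd
        self.spent = 0.0

    def charge(self) -> None:
        if self.spent + self.cost > self.cap:
            raise RuntimeError(f"Spend cap of ${self.cap:.2f} reached - stopping")
        self.spent += self.cost

# Illustrative numbers: a $20 cap and an assumed $0.01 per request.
guard = SpendGuard(cap_usd=20.0, cost_per_request_usd=0.01)

def call_model(prompt: str) -> str:
    guard.charge()  # raises before we ever hit the paid API
    # ... make the real API call here ...
    return "response"
```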

On Friday, I'll be writing about what I use to code with AI and sharing some useful suggestions on best practices. Thanks for reading till the end!


Written by Naoise Law - LSE MSc Graduate specialising in AI.

See my website Naoiselaw.com for all my blog posts and portfolio, or chat with me via AI chat about my experience or how I could help your business today.

You can email me at Lawnaoise@gmail.com or message me here on LinkedIn.