
Vibecoding: Building with AI When You’re Not a Dev (and Don’t Need to Be)

  • Writer: Aleksander Traks
  • Jun 9
  • 2 min read

Let’s talk about development with AI, or, as it’s more aptly called: vibecoding. The world is changing, and while vibecoding might not replace developers, it’s definitely transforming how we view them in the workplace.

Image: A split-screen illustration showing two paths to building software. On the left, a person climbs a broken wooden ladder labeled “Syntax,” “Libraries,” and “Errors,” under stormy skies. On the right, the same person calmly ascends in a sleek glass elevator while holding a tablet with a prompt, surrounded by floating UI elements. Both paths lead to a mountain peak labeled “Working Prototype.” Text below reads: “You used to climb. Now you can elevate.”

A couple of weeks ago, I won a hackathon with a team where I took the lead on both the technical and development sides. Now, I don’t exactly have a developer’s background. Sure, I can follow code, extrapolate what its functions do, and understand architectural or big-picture nuance, but I’ve never been great at catching the small details that stop code from running. (Those missing-slash hunts are a pain.) Overall, coding was something I’d rather pass up to more professional people. But now? I don’t need to hunt for those details. I don’t even need to write the functions myself. I need to understand the big picture and validate it.

A lot of the fundamentals feel lifted. You can see it in Y Combinator startups: more and more founders are getting better at pumping out prototypes and testing ideas. One person even got a third of a million in funding off an idea and just three hours of development work. But this isn’t just a prototyping tool. It’s also a way for enterprise IT to boost its output. That said, firing your developers isn’t the answer.

What I’ve seen, building from the trenches, is that AI has trouble with orchestration: understanding how everything connects. As soon as it builds a steering wheel, it forgets about your seats and starts trashing them because they’re “not important.” This is a limitation of processing power. It’s too costly to grasp the full context, so the model tries to optimize blindly. That’s where the human comes in, to make the goal clear. But that, in turn, requires understanding the whole system. And oh boy, that can get real messy.

Without proper guidance, AI will build you tangled spaghetti code. Monolithic. Opaque. Hard to extend. Imagine a car that’s one welded piece of metal: it works... until you try to install a radio and the steering stops working. So you need to break it apart. Document it. Make sure future-you actually understands what connects to what. That still seems like something that could be made more efficient.
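To make the “break it apart” idea concrete, here’s a minimal Python sketch of the car metaphor. All the names here (Steering, Radio, Car) are invented for illustration; the point is just that each part has one job and a documented boundary, so bolting on a radio can’t weld itself into the steering.

```python
class Steering:
    """Turns the wheels. Knows nothing about any other part."""

    def __init__(self):
        self.angle = 0

    def turn(self, degrees: int) -> int:
        """Adjust the wheel angle and return the new angle."""
        self.angle += degrees
        return self.angle


class Radio:
    """Plays a station. Installing it touches no other module."""

    def __init__(self):
        self.station = None

    def tune(self, station: str) -> str:
        self.station = station
        return self.station


class Car:
    """Thin orchestrator: wires the parts together, holds no logic itself."""

    def __init__(self):
        self.steering = Steering()
        self.radio = Radio()  # added later, without rewelding anything


car = Car()
car.radio.tune("92.5 FM")
car.steering.turn(15)  # the steering still works after the radio goes in
```

This is exactly the structure you have to nudge the AI toward: small pieces, clear interfaces, and docstrings that tell future-you what connects to what.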

But once it’s set up right, vibecoding becomes a seriously efficient way to build new features fast. It all comes down to how you integrate it. If anyone wants to chat about it, I’d love to riff on ideas.