It’s a crazy time to be a software developer. For many in our industry, AI tools represent existential threats that cut to the core of our identity. When Claude Code can churn out thousands of lines of production-grade code in hours, what value am I adding?
Given the number of layoffs we read about in the news, it’s easy to conclude that we are experts in a dying field. As real as the disruption is for many, I do think this crisis presents a chance to step back and take stock.
On one hand, AI represents a huge opportunity. In my experience, coding is rarely the hardest part of software development. I’ve seen far more software projects fail because they were poorly defined than because they were poorly implemented. Building the wrong thing is much more costly than building the right thing wrong. In this light, AI actually gives us an incredible gift: a declarative, natural-language interface to software development. We can spend more of our time at a higher altitude, guiding AI (or more junior human engineers) by bringing precision to the problems we’re trying to solve and the solutions we conjure up.
Vibe coding gets a bad rap from many software developers because it bypasses the years of experience normally needed to produce functional software, but in the right context I see it as a gift. People from all kinds of backgrounds are using AI coding tools to translate their ideas into working software. Is it ready to deploy to thousands of users? No. Is it full of awkward AI joints? Sure is. Does it communicate, in a high-fidelity form that trained engineers can build upon, ideas that might otherwise have been abandoned? Absolutely. The rapid prototyping that vibe coding affords helps us get to the best ideas faster, and it creates the potential for real innovation by lowering the barrier to entry.
On the flip side, there are real risks to treating LLMs as an abstraction over code. The biggest one I see is losing touch with the mediums we build for, and losing the ability to push them forward.
Code is our most direct interface to these mediums, and many of our industry’s innovations (including LLMs) have emerged from the deep understanding of our machines that code affords. For instance: React, and the declarative UI paradigm it made mainstream, would not have been created without a deep understanding of how browsers render web pages. Foundational knowledge of code is still how we learn to be engineers, and the deeper knowledge accumulated over years remains essential to creating new patterns for AI (and humans) to follow. It’s the only way to develop the taste and intuition required to push the boundaries of what is possible.
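To make that contrast concrete, here’s a minimal sketch of the two styles; the counter example and function names are mine, not anything from React’s documentation, and it assumes a React project with TypeScript and JSX. The imperative version mutates the DOM by hand; the declarative version simply describes the UI for a given state and lets React reconcile the page. You can only design an abstraction like the second one if you deeply understand what the first one is doing underneath.

```tsx
import { useState } from 'react';

// Imperative style: find and mutate DOM nodes by hand whenever state changes.
function mountImperativeCounter(root: HTMLElement) {
  let count = 0;
  const button = document.createElement('button');
  button.textContent = `Count: ${count}`;
  button.addEventListener('click', () => {
    count += 1;
    button.textContent = `Count: ${count}`; // every state change means manual DOM surgery
  });
  root.appendChild(button);
}

// Declarative style (React): describe what the UI should look like for a given
// state, and let the library figure out how to update the real DOM.
function Counter() {
  const [count, setCount] = useState(0);
  return <button onClick={() => setCount(count + 1)}>Count: {count}</button>;
}
```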
It’s important to recognize that this is not a new tension. Technology has been handing us higher-level abstractions for centuries, and learning to wield them for net-positive results has long been part of the human experience. There are plenty of recent parallels as automation has taken over more and more of our decision-making. One specific example is aviation. In the late 90s, pilots’ growing dependency on automated flight management systems, symbolized by the magenta-colored course lines on their navigation displays, earned them the nickname ‘Children of the Magenta.’
Pilot and journalist William Langewiesche observed: “We appear to be locked into a cycle in which automation begets the erosion of skills or the lack of skills in the first place and this then begets more automation.” And Warren Vanderburgh, the American Airlines captain who coined the nickname, reminded his trainees: “We are pilots and captains, not automation managers.”
The solution is balance. Pilots were encouraged to take planes off autopilot routinely to reduce skill erosion. The same goes for coding—and frankly, for any creative discipline. Use AI to speed through boilerplate tasks, rapidly prototype ideas, and strengthen your thinking, though not as a replacement for human collaboration. When you hit new territory, recognize it and put in the reps. Because if we lose the ability to innovate at the creative level, AI will just keep recycling existing patterns, generating increasingly stale, derivative output. Someone needs to create the patterns worth following.
Sources and further reading:
99% Invisible, “Children of the Magenta (Automation Paradox, Part 1)”
American Airlines Flight Academy lecture, “Children of the Magenta” (Warren Vanderburgh)