The skills that won't deprecate in the AI era
- Compilers in the 50s didn't eliminate programmers; they created more demand. LLMs will probably do the same, but with a crucial difference
- AI has already reached a level sufficient to radically change what's expected of a software professional, and the AGI debate is irrelevant
- Decision-making is a skill AI will take much longer to perform satisfactorily
- You'll be paid for your judgment. For the ability to choose paths and, most importantly, the ones not to follow
- Most people are too lazy to think, and that's exactly where you stand out
The way we develop software is changing, but patterns from the past seem to repeat.
In the 50s and 60s, programming in Assembly was something few could do. With the arrival of high-level languages like Fortran and COBOL, compilers automated the low-level details, allowing programmers to focus on logic and design. There was resistance: experienced programmers feared losing control and doubted that compiled code could be as efficient as hand-written code. In the end, compilers didn't eliminate jobs. On the contrary, they fueled an explosion in demand for programmers, democratizing programming.
Notice any similarity with the current moment?
Today, it seems we're living through an analogous transition. The specifications we write are becoming the new "high-level code", and LLMs plus their harnesses, the new compilers. But there's a fundamental difference. Compilers are deterministic: given the same input, they always produce the same output. LLMs are fundamentally probabilistic. No matter how perfect, beautiful, and polished your spec is, the LLM will always have a greater-than-zero probability of returning something different from what you expected.
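The contrast can be sketched in a few lines. Both functions below are toy stand-ins (there is no real compiler or model here); the point is only the shape of the mapping from input to output:

```python
import random

def compile_like(source: str) -> str:
    # Deterministic: the same input always maps to the same output.
    return source.upper()

def llm_like(prompt: str) -> str:
    # Probabilistic: even a "perfect" prompt samples from a distribution
    # of plausible outputs, so repeated calls can disagree.
    candidates = [prompt.upper(), prompt.title(), prompt[::-1]]
    return random.choice(candidates)

spec = "build the login page"
assert compile_like(spec) == compile_like(spec)  # always holds
# llm_like(spec) == llm_like(spec) may or may not hold on any given run
```

That greater-than-zero chance of divergence is exactly why the judgment around the model matters more than the model itself.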
And it's precisely this difference that makes certain skills more valuable than ever.
Stop debating AGI
"Oh no, here comes another know-it-all talking about what they don't understand..."
What is AGI, anyway? Some say it's Skynet. Others say it's just a finish line that moves forward as we get closer: every time AI solves a complex problem, we say that's still not real intelligence and push the definition further ahead.
But let's get real. At the end of the day, does it matter what technical level of autonomy AI will have?
The truth hurts, but it needs to be said: maybe it has already reached a level sufficient to replace a good portion of what you do today.
The Vinext case, Cloudflare's rewrite of Next.js in a week, shows this clearly. Almost all lines were written by AI. But the project only worked because there was an exceptional person piloting it, deciding the architecture, correcting course, and surrounding the process with determinism.
So instead of wasting time debating whether AI "thinks" or whether it's just a "probability parrot", spend that time evolving the skills it will take longest to reach.
Skill 1: Communication
I already talked in a previous post about how communication is the skill that will most differentiate professionals. But it's worth reinforcing.
"Your success will be determined largely by how well you speak, how well you write, and the quality of your ideas, in that order." Patrick Henry Winston
I'm not talking about being a great public speaker. I'm talking about being able to understand and be understood clearly and efficiently, both by people and by AI.
In the era where LLMs are the new compiler, the quality of the "source code" you write (specs, contexts, prompts) determines the quality of the output. Bad communication generates bad software, no matter how powerful the model is.
Skill 2: System Design
You don't need to be Martin Fowler. But you need to know how systems work under the hood, even at a basic level. Know how to design a solution before implementing it. Know how to evaluate trade-offs. Know how to communicate technical decisions to non-technical people.
LLMs can generate code, but they don't know the nuances of your business. They don't know that a given microservice will receive 10x more traffic next quarter. They don't know the team maintaining the code has only three people. They don't know the SLA requires 99.99% availability.
These design decisions (what to build, how to structure, what not to do) still depend on people.
Skill 3: Decision Making
This is the one I want to emphasize most. And it's the one fewest people are developing.
AI can give you five technically viable paths to solve a problem. They all compile, they all pass tests, they all look elegant. But only you know the company culture, the business risk appetite, the history of decisions that led to this point, and the nuances that make a solution brilliant or a disaster.
"AI can give you five technically viable paths, but only you know the culture, the business risk appetite, and the nuances that make a solution brilliant or a disaster."
This is why you'll be paid for your judgment. Your success will be directly related to how well you can decide which paths to follow and, most importantly, which ones not to follow.
"In the land of the blind, the one-eyed man is king."
Most people are too lazy to think. I never imagined saying this, but that's where you stand out. Strategic decision-making means evaluating and weighing risks. The weight of the final choice and the responsibility for the consequences are the only things that won't leave your hands.
What to do with all of this
- Write more. Document decisions. Practice explaining complex things simply, both for people and for AIs
- Study System Design. Not enough to pass FAANG interviews, but enough to design solutions before implementing them
- Practice making difficult technical decisions. Evaluate trade-offs. Learn to say "no" to paths that are technically viable but strategically wrong
- Surround AI with determinism. Tests, CI/CD, linting, type checking. The more deterministic processes around the LLM, the more reliable the result
- Put on the student hat. This is not the time for anyone to think they already know enough
- Don't get attached to tools. In two months everything will have changed. Experiment with everything, but remember that fundamentals are what lasts
- Don't confuse speed with quality. Generating code quickly is not the same as delivering software that works in production
- Don't ignore the fundamentals. Tools change every two months, but software engineering principles persist
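The "surround AI with determinism" item above can be sketched as a gate that only accepts generated code after deterministic checks pass. This is a minimal illustration (a real pipeline would shell out to linters, type checkers, and a full test suite; `deterministic_gate` and the `add` spec are hypothetical):

```python
def deterministic_gate(generated_code: str, tests) -> bool:
    """Accept LLM output only if it compiles and passes every test."""
    try:
        compiled = compile(generated_code, "<generated>", "exec")
    except SyntaxError:
        return False  # reject: the candidate doesn't even parse
    namespace: dict = {}
    exec(compiled, namespace)
    # Run each deterministic test against the generated namespace.
    return all(test(namespace) for test in tests)

# Example: the spec asked for an add(a, b) function
candidate = "def add(a, b):\n    return a + b"
tests = [
    lambda ns: ns["add"](2, 3) == 5,
    lambda ns: ns["add"](-1, 1) == 0,
]
print(deterministic_gate(candidate, tests))  # True: passes the gate
```

However probabilistic the generator is, the gate's verdict for a given candidate is reproducible, and that's what makes the overall process reliable.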
I know, I know... "What a motivational speaker vibe, Marcelo..."
Yeah. Buy my course... Just kidding, I don't have a course. But I should.
If the historical pattern repeats like in the 50s, the demand for professionals won't decrease, it will change shape. Those who know how to communicate, design systems, and make good decisions will have a disproportionate advantage. Because AI amplifies capability. And zero times any number is still zero.
The question isn't whether AI will change the profession. It already has. The question is: are you investing in the right skills?
Let's exchange ideas about software engineering, architecture, and technical leadership.