It is a scary time for many software engineers. AI has proven itself to be an effective tool for writing code, and it calls into question the investments engineers have made in languages, ecosystems, and even system design. First off, I don’t think AI will take everyone’s job, but it will take some. This is a simple economic force that reflects software becoming cheaper.
The bigger impact is the atrophy of software engineering skills. When everyone is using LLMs to do tasks, the skills you built by doing things yourself quickly fade. As a leader and manager who has been coding, this isn’t that big of a deal for me. I didn’t push out features. I wasn’t on-call to fix issues. My skills have already atrophied quite a bit. But for those writing code, this becomes a serious issue.
For Senior+ engineers, arguably, the problem isn’t as bad. You’ve got years of experience and have seen things. You’ve developed intuition that will continue to be valuable as we build systems with AI. At the same time, you need to find ways to experiment and grow your knowledge amid a flurry of new and powerful software. Distinguishing transformative tooling from vibe-coded slop will be a challenge. Your code review skills will be stretched, and you’ll need to find ways to consume more code. The AI providers will argue that doing this with AI is the answer, but at the end of the day real people own systems and have to be responsible for their actions.
For junior folks, I think you have the more difficult challenge. Delivering more complex solutions is expected to take less time. You will need to use AI to meet these unspoken delivery quotas. At the same time, you need to understand what is getting deployed. This becomes much harder because you’re not actually doing the detailed work where you learn the underlying systems. This extends beyond just the code. You may miss out on CI/CD, Kubernetes, DB administration, and git, just to name a few. Where things get really challenging is when a system needs to shift to meet new requirements. Sometimes this can be “easy” by introducing a new technology, but more often than not, you need to create headroom in an existing system, which means understanding how things work “under the hood” — exactly the place you have been avoiding.
Someone needs to understand how things work.
A while back I read [A Deepness in the Sky](https://en.wikipedia.org/wiki/A_Deepness_in_the_Sky). The character Pham Nuwen is an old pirate who created a huge pirate network spanning lightyears of space. When the ship he is traveling on is taken over, he manages to take control back. He does this because he knows how the old technology works. He remembers how to program microscopic robots to help him monitor and eventually attack his enemies. His saving trait was that he understood how the underlying layers of automation worked. Hopefully it is clear where I’m heading with this.
A while back, people used to argue about whether or not Open Source contributions were a good indicator when hiring. If someone worked on projects in their free time, there was a good chance they were a decent programmer and cared about what they did. While OSS contributions were typically not a hiring filter, a lot of folks started side projects and tried to contribute because it was good for their career. I think that mentality has to come back in the world of AI, but with a slightly different tactic.
First off, AI maximalists like Steve Yegge are right. You need to get up to speed on AI if you want to continue to be a software engineer. I can’t say I’m happy about this, but I’m pretty sure it is the truth. To deal with this, you need to play with AI. This is the side project you need to spend time on. If you can spend time on it at work (and this is much more feasible now), you should do it there too. Try to use as many tokens as possible trying anything and everything you can think of. You’ll throw most of it away, but in the process you’ll learn. Try making skills, plugins, CLIs, tools, scripts, tests, frameworks, agents, anything. Learn what others are doing just to see what is happening. Eventually, you’ll start seeing where and how the productivity expectations that organizations are hand-waving about could actually have something behind them. Keep doing this, especially if someone else is paying.
Next up, learn low-level technology. Pick up a lower-level language such as C, C++, or Rust, where you need to manage memory. You don’t have to build anything specific. You just need to play with it to start to understand the fundamentals of how languages touch the physical computers. This is how you build your personal moat. When that vibed cluster of code starts choking and needs to get optimized, you’ve got the skills to recognize where your memory leak is, or why loading all those JSON objects requires entirely too many allocations when most of the data isn’t even used. Again, you don’t have to be an expert. You still have AI. But you need to understand what happens under the hood with the hardware.
For those who don’t do this, yet still have these critical skills, I think there is still a place in this world. Cobol programmers are a rare breed and valuable to those who need their skills. There are many out there who will effectively become Cobol programmers for AI-generated systems. We don’t know what this looks like, but I’m almost certain there will be folks who can parachute into a broken code base, learn the needful, and fix the issue to let folks get back to vibing new features.
I can’t say I’m extremely bullish on AI, but I definitely see the writing on the wall, so I’m leaning in. Some of it is fun: exploring a new space and doing things I never could do before. I work harder to focus and read carefully. It is practice, and it is rewarding. At the same time, there is a strange feeling that something should be working all the time. Others have talked about it like it is burnout, but I’ll say it is different. Change can be painful. Hardship can make you better, so in this new world of AI, I’m going to try to get better.