2025-03-30
The AI Revolution and Its Consequences
Should I Find Another Career?
It seems like every time I talk with someone about my career, they inevitably ask whether I'm worried about AI taking my job. I always tell them no, and give them the same reasons I gave the previous person. Eventually I decided to write those reasons down so I can simply point people to this post instead.
At the time of writing, Large Language Models (LLMs) are not capable of replacing software engineers. I don't think anyone who has used LLMs to produce code disagrees with that statement, but if you haven't, just know that in my experience a good 10-20% of the code LLMs produce is outdated, buggy, or flat-out incorrect. That doesn't mean I don't use AI to write code; I just have to check its output every time before I integrate it into the rest of the codebase. Even with those precautions, though, I am still more productive with AI than without it.
You could then argue, "But if software engineers are more productive with AI than without it, companies can use fewer engineers to accomplish the same tasks as before; you could be indirectly replaced by AI as a side effect of increased individual productivity." This is theoretically possible, but if we look back through history at other inventions that increased individual productivity (e.g. the cotton gin, the automobile, the computer), companies didn't typically lay off employees just to maintain the same level of output. They replaced the less productive tools with more productive ones and increased their total productivity to keep up with competitors who were doing the same thing. In some cases, when operating the new tools required a different skill set (e.g. handwriting vs. typing), they fired the employees who operated the old tools and hired new ones to operate the new tools. But leveraging AI to write code isn't a completely different skill set for software engineers; it's essentially the same as working with a junior software developer, so I don't think this argument applies here.
A common follow-up argument is, "But AI will get better in the future, and will eventually be able to replace you entirely, like automobiles replaced horses." This is also possible, but if software engineering is eventually replaced entirely by AI, all jobs could be replaced. The world would be a vastly different place, and nothing anyone does today would prepare anyone for it. So I'm not going to worry about this scenario for the same reasons I'm not going to worry about a global nuclear war.
All Aboard The Hype Train
LLMs are, nevertheless, an incredible new technology, and every new technology brings hype and grifters with it. I recently watched this video from Y Combinator, and to be honest, I could barely finish it without my eyes rolling right out of my head. It feels like the essence of the AI hype bubble. Of every tech hype bubble, really: pump-and-dump startups trying to con venture capitalists into bestowing just one more round of funding. This time, though, the ice cream sundae comes topped with the additional promise of being able to fire 90% of your workforce.
I try to avoid hype in all aspects of my life; I've never pre-ordered a video game, my side projects use the technology I'm most comfortable with, and I recently researched a $40 purchase for weeks before deciding to order it. I'm not a particularly experienced or old software engineer, but given the breakneck pace at which this industry moves, even I have lived through multiple tech hype cycles. I believe that eventually, AI progress will reach a local maximum and the hype will settle down a bit.
You might argue that I'm begging the question: assuming that excitement about AI is unwarranted and therefore hype. That's a fair complaint, but I don't actually think the excitement is entirely unwarranted. It's an incredible technology! I use it practically every day! People should be excited about AI! But as I keep having to tell my friends and family, I don't believe my career and livelihood are threatened by it. Quite the opposite, actually.
AI Is Good, Actually
Like I said earlier, I treat an LLM like a junior engineer: I give it a task, it gives me an answer that looks correct, and then I double-check it and fix it when it isn't. It saves me a lot of time on repetitive tasks and boilerplate, and it can even act as a rubber duck to bounce ideas off of. This is a really convenient arrangement, but I still need to know how to code in order to evaluate the work it produces.
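To make the "double-check it" step concrete, here's a minimal sketch of the kind of smoke test I mean. The `check` helper and the `slugify` example are my own inventions for illustration, not from any real tool: before merging an LLM's suggestion, I run it against a few inputs whose answers I already know.

```python
def check(candidate, cases):
    """Run an LLM-suggested function against known input/output pairs.

    Returns a list of (args, expected, got) tuples for every failing
    case, so an empty list means the suggestion passed this smoke test.
    """
    failures = []
    for args, expected in cases:
        try:
            got = candidate(*args)
        except Exception as exc:  # a crash counts as a failure too
            got = exc
        if got != expected:
            failures.append((args, expected, got))
    return failures

# Suppose the LLM suggested this slug helper for a blog engine:
def slugify(title):
    return "-".join(title.lower().split())

print(check(slugify, [
    (("Hello World",), "hello-world"),
    (("AI Is Good, Actually",), "ai-is-good-actually"),  # fails: the comma survives
]))
```

The second case surfaces exactly the kind of subtle bug I keep catching: the code looks plausible, handles the happy path, and quietly mishandles an edge case.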
The video I mentioned above promotes "vibe coding", which seems to mean giving the LLM no oversight at all beyond asking it to add a feature or fix a bug. That is not the relationship of a senior engineer to a junior engineer; it's the relationship of a project manager to a senior engineer, and we've already established that LLMs are not at the level of senior engineers. It's a recipe for disaster, and I'd hope no company actually decides to try this method of engineering, but who am I kidding.
Now, I love having my own personal junior engineer, and I'd like to make a more optimistic prediction about the impact of AI. By making individual engineers more productive, AI lowers the cost of entry for new startups and companies. This is a really good thing. Usually you can't afford any employees in the early stages of a company unless you have a lot of cash to burn. This causes most startups to seek investment, and then we end up with the venture capital fueled nightmare we live in today.
Personally, I think it's much healthier for companies to make a profit and deliver real value from day one. With LLMs, a tech-savvy founder and one or two AI assistants can build an MVP at a fraction of the cost, with no need to constantly chase investors to extend the runway. Instead of spending months seeking funding just to keep the lights on, founders can focus on actually building their products. More healthy businesses means more competition, which leads to better products at better prices. Everyone wins. I'm currently attempting to leverage AI this way in my own side projects. I'm not building wrappers around ChatGPT; I'm using it to speed up development on real web apps, and I'll share my results once the experiment plays out.