Introduction
I’ve been hearing the same thing from all directions lately: “Why would I learn X when AI can do it for me?” I hear it from students, from colleagues, from people at conferences. And every time it makes me wince a little.
“Never before have I heard so many people speak about so many things they know so little about.”
That’s from a guy at the office, but it might as well be the motto of our era.
The argument goes something like this: AI is getting better at everything, so why invest time and effort into skills that a machine will eventually handle? Just learn to prompt well and let the model do the heavy lifting. Sounds reasonable on the surface. It’s also dangerously wrong.

The calculator didn’t replace mathematicians
Let me tell you what I’ve actually seen happen. I’ve spent 25 years building neural networks. I’ve watched the field go from niche academic curiosity to front-page news to dinner table conversation. And in all that time, the people who thrived weren’t the ones who learned to use the latest tool. They were the ones who understood why the tool worked, where it broke, and what to do when it broke.
This isn’t unique to AI. When calculators showed up, nobody stopped teaching mathematics. When compilers showed up, nobody stopped teaching computer science. The tools changed the baseline of what you could accomplish, but they didn’t remove the need to understand the foundations. They raised the floor. They didn’t lower the ceiling.
AI is a tool. A powerful one, sure. Possibly the most powerful one we’ve ever built. But it is still a tool. And the thing about tools is that they’re only as good as the person wielding them. Give a chainsaw to someone who doesn’t understand wood and you’ll get firewood at best, a trip to the hospital at worst.
What I’ve learned from martial arts
I taught martial arts for over a decade. One of the things you learn very quickly as an instructor is that there’s no shortcut to skill. You can watch a thousand YouTube videos of a technique, you can have someone explain the biomechanics in perfect detail, but until your body has done the movement a few thousand times, you don’t own it. It’s not yours. Under pressure, you will revert to whatever your body actually knows, not what your head thinks it knows.
The same applies to intellectual skills. You can ask an LLM to write code for you, and it’ll produce something that looks right. But when it breaks (and it will break), you need to understand why. When the output is subtly wrong (and it often is), you need to catch it. When the problem doesn’t fit neatly into what the model has seen before (and real problems rarely do), you need to improvise. And you can’t improvise with knowledge you don’t have.
I’ve watched junior developers paste LLM output into production without understanding what it does. It works for a while. Then it doesn’t, and they have no idea where to start debugging. They skipped the part where you learn the thing, and now they’re stuck. The model gave them a fish, but nobody taught them to fish.
The skills that matter
OK, so if AI handles the routine stuff, what’s left? Quite a lot, actually.
The most valuable people I’ve worked with over the years share a common trait: they can think across domains. They see connections that specialists miss. A physicist who understands finance. A programmer who understands biology. A martial artist who understands leadership. These intersections are where the interesting work happens, and they require deep skill in multiple areas, not surface-level prompting ability in one.
Here’s what I think actually matters:
Understanding the fundamentals. Not because you’ll manually compute gradients every day (though I do enjoy it), but because when your model behaves unexpectedly, the person who understands backpropagation will diagnose the problem in minutes while the person who only knows the API will spend days guessing.
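To make the gradient point concrete, here’s a minimal sketch (every name and number below is illustrative, not from any real codebase): checking a hand-derived gradient against a finite-difference estimate. This kind of sanity check is only available to you if you actually understand what backpropagation computes.

```python
import math

# Toy "network": a single neuron y = sigmoid(w*x + b) with squared-error loss.
# All parameters and inputs here are made-up values for illustration.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, b, x, target):
    y = sigmoid(w * x + b)
    return 0.5 * (y - target) ** 2

def analytic_grad_w(w, b, x, target):
    # Chain rule by hand: dL/dw = (y - target) * sigmoid'(z) * x,
    # where sigmoid'(z) = y * (1 - y).
    y = sigmoid(w * x + b)
    return (y - target) * y * (1.0 - y) * x

def numeric_grad_w(w, b, x, target, eps=1e-6):
    # Central finite difference: (L(w+eps) - L(w-eps)) / (2*eps).
    # Slow, but it doesn't care how the loss is computed internally.
    return (loss(w + eps, b, x, target) - loss(w - eps, b, x, target)) / (2 * eps)

w, b, x, target = 0.7, -0.3, 2.0, 1.0
print(analytic_grad_w(w, b, x, target))  # should closely match the line below
print(numeric_grad_w(w, b, x, target))
```

If the two numbers disagree, your hand derivation (or your code) is wrong, and knowing which term of the chain rule to inspect is precisely the fundamentals paying off.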
Critical thinking. AI is confident. Impressively, fluently, dangerously confident. It will give you wrong answers with the same tone and structure as right ones. If you can’t evaluate the output independently, you’re not using a tool. You’re being used by one.
Building things from scratch. The ability to start from a blank page and create something that didn’t exist before. This requires deep understanding, creativity, and the willingness to sit with discomfort while you figure it out. LLMs can help you iterate faster, but they can’t replace the generative spark.
Physical and embodied skills. This one might sound odd coming from someone who spends most of his day at a computer, but I genuinely believe it. The discipline you develop through martial arts, through sport, through any practice that demands full-body engagement, transfers directly to how you think and work. Patience, persistence, attention to detail, the ability to stay calm under pressure. These aren’t soft skills. They’re the foundation everything else is built on.
The electricity analogy (but for real this time)
People love comparing AI to electricity. Usually in a breathless “this will change everything” kind of way. But the analogy is actually useful if you take it seriously.
When electricity became widespread, it didn’t make human labor obsolete. It made new kinds of labor possible. It created entirely new industries, new professions, new ways of thinking about problems. But the people who benefited most weren’t the ones who just plugged things in. They were the electrical engineers, the inventors, the people who understood the technology deeply enough to push it into places nobody had imagined.
AI will follow the same pattern. The people who just “use” it will do fine. The people who understand it, who can build on it, extend it, and critically evaluate it, will do dramatically better. And the people who gave up learning because “AI will handle it” will find themselves dependent on systems they can’t control, can’t inspect, and can’t fix.
Conclusion
I’m not arguing against using AI. I use it every day. I’m building companies around it. What I am arguing against is the idea that AI makes learning unnecessary. That’s a story being told by people who benefit from you being a passive consumer of their product.
The truth is simpler and less comfortable: the skills that matter most are the ones that are hardest to acquire. They require time, effort, frustration, and the willingness to be bad at something before you’re good at it. No shortcut has ever changed that, and AI won’t either.
So learn the thing. Do the hard work. Build the muscle memory, whether it’s physical or intellectual. The people who do will be the ones shaping the future. The people who don’t will be along for the ride, hoping the driver knows where they’re going.