A World Remade by AI
Feb 15, 2026
The dominant argument given right now for why LLMs are going to remake the world is that "anyone can write software with 'natural language'; they don't need to be a programmer".
This is profoundly and utterly false, but let's set that aside for a moment and suppose it's true.
Let's see where that takes us...
Software is pretty difficult. Existence proof: if it were not, you wouldn't pay people much to create it, and it would be pretty much anywhere you wanted it to be and do pretty much anything you wanted it to do.
It isn't and it doesn't, and you can still make money telling a computer how to do things people want computers to do. And you can make a ton more money telling computers to do the things most people don't want them to do (see e.g. ransomware crews, online scams, and Palantir).
But LLMs pretty much make software a walk in the park. You don't need to know almost anything to write software, you just use your "natural language" to describe what you want and the LLMs grind, burp, cough, sputter, chug-chug and out clanks some "software".
Most people, though, don't want "software". They want a car to take them to the movie theater or mall or restaurant or amusement park or museum or zoo or beach. They want something to play music for them. They want something to eat. They want to put on (or take off) clothes. They want toys, big TVs, furniture, a bed sometimes.
Now, in a world where they just need to use "natural language" to tell an LLM to write "software" for them, why would Google, Facebook, Instagram, TikTok, Palantir, etc. exist? Why would Uber or Lyft or your banking app or the New York Times or WSJ or Washington Post exist?
Why wouldn't I use an LLM to make my own car, washing machine, toaster, refrigerator, couch, bed, lamp, table, clothes, etc. and so on?
I'll admit I'm imagining a pretty maximalist view here, but how improbable is it really?
If specialization arose in human society because time is limited and it takes time, effort, energy, etc. to learn specialties, but we suddenly have a thing that (despite costing an unknown number of trillions of dollars) can spit out software willy-nilly, why couldn't it spit out software to make me a toaster-maker, a washing-machine-maker, a bed-maker, a car-maker, and so on? What would really be the limiting factor that makes this infeasible?
And so where, in fact, would the boundary lie between "everyone can make their own software with 'natural language'" and "this is too hard for an AI, we need humans to do it"? Where exactly would that boundary be, and why?
There are some really bright chaps out there bloviating about this or that AI thing, but I haven't yet heard much about these questions I'm asking. Weird.