3/24/26
There’s a particular opinion that Gen AI tools are like compilers. The same way we write JavaScript instead of Assembly, we will write English instead of JavaScript.
I understand why this feels similar. When you say it, it sounds true. But I have several qualms with it.
Working with Gen AI does not feel at all like writing a program. Writing Assembly and writing JavaScript both feel like writing programs. The level of concern is very, very different, and I accomplished very little with Assembly, but in both cases I was arranging a sequence of instructions for a machine to interpret.
Gen AI feels different. Writing a prompt and then watching the tool do stuff feels off to me. It feels more like a communication task than a programming task. The issue is that I’m not communicating with an actual person, so it feels very odd.
I think it is possible this tool will be “like a compiler” in the sense that no one needs to look at the code it generates. I don’t think it is there right now. I need to look at the code it generates constantly. So that promise feels moot to me at the moment. If I need to look at the code I want to understand the code. I cannot understand the code properly if I prompt someone else to write it.
If we do get to a place where we never need to look at the generated code, I am still unsure that I will feel like it is programming. It feels like some other task to me, at this moment. I’m struggling to articulate it properly, but I don’t think “it’s just a compiler” holds water for me right now.
I read this article on the subject:
I balk at these lines:
When you interact with tools like Cursor, Windsurf, or Bolt, what you’re really doing is engaging with a system that takes your high-level intent—expressed in something close to natural language—and translates it into code.
The leap from Python to English (or whatever natural language you prefer) as the “programming language” isn’t fundamentally different from the leap from assembly to Python.
For the first, I do not think my “high-level intent” means anything until it is expressed in some way. I resonate with the idea from How to Take Smart Notes that learning occurs when you write. Until I’ve started putting my high-level intent to paper, it is mush. I think it makes sense, but reality always proves me wrong. I am worried about a mechanism that shortcuts a thinking process I feel is very important.
For the second, I think English is fundamentally different from Python. It is a human language, not a computer language. I can make up any arbitrary word I want in English and it is real; English is mushy and malleable. Python has a defined, strict grammar. I can rely on any expression I write in Python to perform the same way under the same conditions. I cannot rely on an English statement being interpreted the same way twice.
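To make that contrast concrete, here is a minimal Python sketch (the function name `total` and the prices are invented for illustration). The point is only that Python's grammar pins down exactly one behavior, so identical input always yields identical output:

```python
def total(prices):
    """Sum a list of prices. Same input, same output, every time."""
    return sum(prices)

# Evaluate the same expression twice; Python's semantics guarantee
# the results are identical.
a = total([1.50, 2.25, 3.00])
b = total([1.50, 2.25, 3.00])
assert a == b == 6.75

# Contrast with an English instruction like "add up the prices,
# roughly" -- no grammar pins down what "roughly" means, and two
# readers (or two model runs) may interpret it differently.
```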
2/7/26
I don’t like the “AI is just going to be like compilers were for machine code” argument. It’s a false equivalence in my mind: a good-sounding analogy that doesn’t hold up for me.