Respectfully, you seem to love the sound of your writing so much you forget what you are arguing about. The topic (at least for the rest of the people in this thread) seems to be whether AI assistance can truly eliminate programmers.
There is one painfully obvious, undeniable historical trend: making programmer work easier increases the number of programmers. I would argue a modern developer is 1000x more effective than one working in the times of punch cards - yet we have roughly 1000x more software developers than back then.
I'm not an AI skeptic by any means, and use it every day at my job where I am gainfully employed to develop production software used by paying customers. The overwhelming consensus among those similar to me (I've put down all of these qualifiers very intentionally) is that the currently existing modalities of AI tools are a massive productivity boost mostly for the "typing" part of software (yes, I use the latest SOTA tools, Claude Opus 4.5 thinking, blah, blah, and so do most of my colleagues). But the "typing" part hasn't been the hard part for a while already.
You could argue that there is a "step change" coming in the capabilities of AI models, which will entirely replace developers (so software can be "willed into existence", as elegantly put by OP), but we are no closer to that point now than we were in December 2022. All the success of AI tools in actual, real-world software has been in tools specifically designed to assist existing, working, competent developers (e.g. Cursor, Claude Code), while the tools that have positioned themselves to replace them have failed (Devin).
There is no respectful way of telling someone they like the sound of their own voice. Let’s be real, you were objectively and deliberately disrespectful. Own it if you are going to break the rules of conduct. I hate this sneaky shit. Also I’m not off topic, you’re just missing the point.
I responded to another person in this thread and it’s the same response I would throw at you. You can read that as well.
Your “historical trend” is just applying an analogy and assuming that an analogy can take the place of reasoning. There are about a thousand examples of careers where automation technology increased the need for human operators, and thousands of examples where automation eliminated human operators. Take pilots: automation didn’t lower the need for pilots. Take IntelliSense and autocomplete: those didn’t lower the demand for programmers.
But then take a look at Waymo. You have to be next level stupid to reason: cruise control raised automation in cars but didn’t lower the demand for drivers, therefore all car-related businesses, including Waymo, will always need physical drivers.
As anyone can see, this idea of using analogy as reasoning fails here. Waymo needs zero physical drivers thanks to automation. There is zero demand here, and your methodology of reasoning fails.
Analogies are a form of manipulation. They only help you elucidate and understand things via some thread of connection: you understand A, therefore understanding A can help you understand B. But you can’t use analogies as the basis for forecasting or reasoning, because although A can be similar to B, A is not in actuality B.
For AI coders it’s the same thing. You just need to use your common sense rather than rely on some inaccurate crutch of analogies and hoping everything will play out in the same way.
If AI becomes as good and as intelligent as a human SWE, then your job is going out the fucking window, replaced by a single prompter. That’s common sense.
Look at the actual trendline of the actual topic: AI taking over our jobs, not automation in other sectors of engineering or other types of automation in software. What happened with AI in the last decade? We went from zero to movies, music, and coding.
What does your common sense tell you the next decade will bring?
If the improvement of AI from the last decade keeps going or keeps accelerating, the conclusion is obvious.
Sometimes the delusion a lot of SWEs have is jarring. Literally, if AGI existed, thousands of jobs would be displaced. That’s common sense, but you still see tons of people clinging to some irrelevant analogy as if that exact analogy will play out against common sense.
How ironic of you to call my argument an analogy when it isn't one, while all you have to offer is exactly that: analogies. Analogies to pilots, to drivers, to "a thousand examples of careers".
My argument isn't an analogy - it's an observation based on the trajectory of SWE employment specifically. It's you who's trying to reason about what's going to happen to software based on what happened to three-field crop rotation or whatever, not me.
I argued that a developer today is 1000x more effective than in the days of punch cards, yet we have 1000x more developers today. Not only that, this correlation has tracked fairly linearly over the last several decades.
I would also argue that the productivity improvement between FORTRAN and C, or between C and Python was much, much more impactful than going from JavaScript to JavaScript with ChatGPT.
Software jobs will be redefined, they will require different skill sets, they may even be called something else - but they will still be there.
>How ironic of you to call my argument an analogy while it isn't an analogy, yet all you have to offer is exactly that
Bro, I offered you analogies to show you that they're IRRELEVANT. The point was to demonstrate that it's an ineffective form of reasoning by showing its ineffectiveness FOR YOUR conclusion: using this type of reasoning, you can just as easily conclude the OPPOSITE. Assuming this type of reasoning is effective means BOTH what I say and what you say are true, which leads to a logical contradiction.
There is no irony, only misunderstanding from you.
>I argued that a developer today is 1000x more effective than in the days of punch cards, yet we have 1000x more developers today. Not only that, this correlation tracked fairly linearly throughout the last many decades.
See, here you're using an analogy and claiming it's effective. I would typically offer you another analogy that shows the opposite effect, but I feel it would only confuse you further.
>Software jobs will be redefined, they will require different skill sets, they may even be called something else - but they will still be there.
Again, you believe this because of analogies. I recommend you take a stab at my way of reasoning. Try to arrive at your own conclusion without using analogies.