The US dictionary Merriam-Webster’s word of the year for 2025 was “slop”, which it defines as “digital content of low quality that is produced, usually in quantity, by means of artificial intelligence”. The choice underlined the fact that while AI is being widely embraced, not least by corporate bosses keen to cut payroll costs, its downsides are also becoming obvious. In 2026, AI’s reckoning with reality represents a growing economic risk.
Ed Zitron, the foul-mouthed figurehead of AI scepticism, argues pretty convincingly that, as things stand, the “unit economics” of the entire industry – the cost of servicing the requests of a single customer against the price companies are able to charge them – just don’t add up. In typically colourful language, he calls them “dogshit”.
Revenues from AI are rising rapidly as more paying clients sign up, but so far not by enough to cover the wild levels of investment under way: $400bn (£297bn) in 2025, with much more forecast in the next 12 months.
Another vehement sceptic, Cory Doctorow, argues: “These companies are not profitable. They can’t be profitable. They keep the lights on by soaking up hundreds of billions of dollars in other people’s money and then lighting it on fire.”

20 bucks a month is basically nothing for a developer who’s making $100 an hour.
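Back-of-the-envelope, using those example numbers (just a sketch, not anyone’s actual billing):

```python
# How many saved minutes per month the subscription has to buy
# to break even, at the example figures above.
subscription_per_month = 20   # dollars per month
hourly_rate = 100             # dollars per hour

break_even_minutes = subscription_per_month / hourly_rate * 60
print(f"Breaks even after {break_even_minutes:.0f} saved minutes per month")  # 12
```

If it saves even a dozen minutes of typing a month, it has paid for itself.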
My employer pays for copilot, and yeah, it makes mistakes, but if you pretend it’s a junior developer and double check its code, it can easily save time on a lot of tedious work, and will turn hours of typing into fewer hours of reading.
Where do these geniuses think they’ll get senior developers from when the current cohort retires? How does someone become a senior developer? Surely not through years of experience as a junior developer under the mentorship of a senior.
This mentality is like burning down an apple tree after one harvest. Fucking idiots, the whole lot of them. I can’t wait for the day all these people wake up and start wandering around confused about why their new talent pool is empty.
Should cars have been outlawed because they put farriers and stables out of business? Were shipping containers a bad idea because they required fewer longshoremen?
Technology comes along and makes jobs obsolete. It happens all the time. It just happens that this technology has arrived in a big, visible way, and many of the ways it’s used and marketed are useless and/or awful. That doesn’t mean it’s entirely bad, and there’s certainly no way to stop it now.
AI will replace jobs. We can’t get around that fact. Companies that fail to adapt will fall behind. Whether I use it at my job or not has no bearing on the industry, and I’m not in a position to push for industry-wide change (nor is the company I work for). So we can either use it, or also fall behind.
I work for a mid-sized company. We still hire junior developers. I don’t think we have any plans to get rid of them entirely, but I’m not involved in that process. But after a couple decades of huge growth in the industry, developers (especially junior ones) are going to have a rough few years as the industry realigns with the new normal. There will be job losses, there will be companies that disappear entirely because they either depended too much upon AI, or didn’t adapt fast enough. But pretending that AI isn’t a useful tool when used in specific ways is just sticking your head in the sand.
You have no idea the long-term impact such a tool has on a codebase. The more it generates, the less you understand, regardless of how much you “check” the output.
I work as a senior dev, and I’ve tested just about all the foundation models (and many local ones through Ollama) on both professional and personal projects. In 90% of the cases I’ve tested, it comes back to the same conclusion: if I had just done the work myself from the beginning, I would have had a working result that’s cleaner and functions better, in less time.
Generated code can work for a few lines, for some boilerplate, or for some refactoring, but anything beyond that is just asking for trouble.
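For what it’s worth, the kind of quick test I mean is nothing fancy; something like this against a locally pulled model via Ollama’s Python client (the model tag and prompt are just examples):

```python
# Sketch: poke a local model through the ollama Python client
# (pip install ollama). "llama3" is just an example model tag.
import ollama

response = ollama.chat(
    model="llama3",
    messages=[
        {
            "role": "user",
            "content": "Write a Python function that parses an ISO 8601 timestamp.",
        }
    ],
)
print(response["message"]["content"])
```

Then you read the output the way you’d review a stranger’s PR, and time how long it takes to get to something you’d actually merge.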
In my experience, you need to be a senior developer who has seen your own code go through a full project lifecycle (most importantly, including the support, maintenance and even expansion stages) to really feel in your bones, not just know intellectually, how massively the right coding practices reduce lifetime maintenance costs. Those are exactly the practices LLMs don’t reproduce, even with code reviews and fixes: even when they clone only “good” code, they can’t manage things like consistency, especially at the design level.
It’s very much the same problem as having junior developers do part of the coding, only worse. At least junior devs are consistent, and hence predictable, in how they fuck up: you know what to look for, and once you find a mistake you know to look for more of the same. You can also actually teach junior developers so they get better over time, and in particular teach them to stop making their worst mistakes. LLMs are unteachable and will never get better, and their mistakes are pretty much randomly distributed across the error space.
You give coding tasks to junior devs in a controlled way and absorb the impact of their mistakes because you’re investing in them; doing the same with an LLM has a higher chance of returning high-impact mistakes and yields no such return on the “investment” at all.
I highly doubt the person you’re replying to meant anything else. We’re all kinda on the same page here.
I hope so, but you’d be surprised. I know some devs who basically think LLMs can do their work for them, and treat them as such. They get them to do multi-hundred-line edits with a single prompt.
You’re not allowed to have an opinion other than “ai bad” on the fediverse.
Sorry.