Have you noticed that the quality of software has worsened?
It seems like a lot of the software I use is buggier than it used to be. Games, web apps, messaging apps and phone apps.
Here's how I notice it:
It's the endless need for more and more resources just to run xyz.
In former days, resources were limited, so programmers cared about performance through good coding practice and (over-)optimization. Example: NASA was searching for an engineer to work on the code of an old probe still flying through space. The problem: very limited resources. The engineer has to think like the probe's built-in processor and memory, slow and with few registers, yet capable of being heavily optimized for its task. They were looking for an old-school engineer from the '60s or something like that.
My father, too, is an old-school engineer. He doesn't like to use frameworks and would rather write everything himself. He says: "Why should I learn someone else's logic if I can do it myself in the same time without creating overhead? I use what is necessary, but not more."
And then, I remember a whole office suite fitting on a handful of 1.44 MB floppy disks, compared to today's gigabytes.
And that's the problem. There's big * debt (where for * you can insert whatever fits).
This demands more and more resources. At the same time, programmers don't care about the optimization side anymore, because the framework can't easily be touched, or because "it works like that".
This makes everything more opaque: bugs are harder to catch, or they're introduced by external libs or something like that.
I see the cause in the ever-bigger frameworks and the laziness of programmers. Why should one use an ORM when one can connect to the database directly? Sure, the reasons are the same as the reasons for creating an ORM lib in the first place: to make things easier. But also more resource-hungry and maybe buggy.
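To make the tradeoff concrete, here's a rough Python sketch (the "users" table and its id/name/active columns are made up for illustration): the first half talks to an SQLite database directly with the standard library, the second runs the same query through an ORM layer, here SQLAlchemy as one example, which adds convenience and object mapping at the cost of an extra dependency and more machinery.

    # Direct access with Python's built-in sqlite3 -- no extra layers.
    # (Hypothetical "users" table with id/name/active columns.)
    import sqlite3

    conn = sqlite3.connect("app.db")
    rows = conn.execute(
        "SELECT id, name FROM users WHERE active = 1"
    ).fetchall()
    conn.close()

    # The same query through an ORM (SQLAlchemy, as one example):
    # nicer objects, but a whole mapping layer sits in between.
    from sqlalchemy import create_engine, select, Column, Integer, String, Boolean
    from sqlalchemy.orm import declarative_base, Session

    Base = declarative_base()

    class User(Base):
        __tablename__ = "users"
        id = Column(Integer, primary_key=True)
        name = Column(String)
        active = Column(Boolean)

    engine = create_engine("sqlite:///app.db")
    with Session(engine) as session:
        users = session.scalars(select(User).where(User.active)).all()

Both fetch the same rows; the ORM version trades some resources and a new dependency for convenience, portability, and typed objects.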
Every year, the complexity of software increases. First you're running everything yourself: a few thousand lines of code that handle everything on bare metal. Then it's in an operating system, and that only becomes more complex as time goes on. Then you're running interpreted code, sending it across the wire, and running applications inside VMs that you're running yourself. Who knows how many millions of lines of code are effectively exercised every time you use your computer? And it's the same for applications: as time goes on they get updated to be more and more complex so they can do more things, and all this complexity means more interactions that can go wrong.
It doesn't seem to be a problem in GNU/Linux, which consists of thousands of interchangeable packages.
Could it possibly be related to the dogged persistence of those who would AI all the things?
I've seen quite a few "help, my vibe-coded app doesn't quite work, it's 16k lines of code, and I need a real programmer" posts around the interwebs. And there's the great push to replace workers with AI, irrespective of the clear evidence that this is a good way to tank your product. A lot of sunk-cost fallacy roaming about these days.
It started before the latest AI breakthroughs.
This is called enshittification: https://news.ycombinator.com/item?id=41277484
Even Apple isn't safe: https://news.ycombinator.com/item?id=43243075
It doesn't affect free software and decentralized systems though.