The article makes the case that writing code in a particular style (OOP as opposed to data-oriented design) is responsible for slow performance: that code architecture, rather than the specific implementation, is the primary cause, because some architectures will never map well onto what the compiler and hardware can optimize.
I get that picking an O(N) algorithm over an O(N^2) one is always going to matter more, but the article doesn't seem to be disagreeing with you at all; it's just another area where people need to consider their decisions.
But only a strict subset of problems is heavily data-oriented. For your 3-button GUI app, it doesn't matter whether you use a tightly packed AoS, an SoA, or even a slow linked list of pointers for that 3-element list.
(Agreeing) The 3 button GUI app is much more likely to be/feel slow because the UI framework used makes it difficult to get the widgets on the screen quickly than because of data processing.
Maybe it insists on a deep hierarchy of widget inheritance, as many do. Maybe the framework wants to load and configure all the possible widgets, even though the application only uses a few. Maybe there's a bunch of images loaded from disk even though they're never used. Maybe it's just the fantastic default compositing system that puts everything a frame behind, adding up with everything else.
There certainly are applications where data structures matter and time spent processing data is significant to the user experience, but that's not why the whole computing experience feels slower (although maybe prettier) than 25 years ago, despite capability being so much greater for most things. (As the article points out, RAM is faster than it was years ago, and we certainly have more of it, but a RAM access now costs far more CPU cycles relative to compute speed.)
I am torn on the issue. On one hand, there is definitely useless technical debt, over-abstraction, and no-longer-useful features slowing down our programs. On the other hand, we are actually handling every language on Earth properly (or at least we very well should by now) instead of just saying that ASCII is good enough, we somewhat care about accessibility (though far from enough), etc.
Also, actual native GUI apps are quite fast imo. Only Electron apps are slow, and I really do think that's mostly because the web is the wrong abstraction for GUIs, not for other reasons (of course there are shitty web apps, just as there are shitty programs everywhere else).