Pointless to compare those toy computers against a cherry-picked modern desktop, which is the equivalent of an old supercomputer. Indeed, if you look at the numbers for the older workstations (SGI Indy, NeXTcube, Symbolics 3620), they are not that different from, and sometimes worse than, the newer machines. 30 ms is not remarkable these days; with a proper gaming keyboard, gaming monitor, the right config, etc. you can get 5 ms or less for the whole chain.
old supercomputers sucked at latency too, and the modern desktop isn't cherry-picked
it's common for computers with less latency to also have less throughput; an arduino, the avr kind, can reliably respond to inputs in less than 200 nanoseconds
but these 'toy' computers are capable of running a word processor or spreadsheet, writing email, browsing the web (without js), compiling c, running an ide, etc. the zx spectrum even had a first-person 3-d shoot-em-up called elite. so i don't think it's pointless
So why would you expect them to be better shrunk down to micro size? That is the whole point of my comment: the observed latency is better explained as a function of hardware/software complexity than of "year of production". Once you understand this, it should not surprise you at all that most contemporary desktops have the latency characteristics they do; we have seen it before in the old supercomputers and workstations, their closest equivalents in terms of hardware/software complexity and capability. Of course it could be better, and on some machines it is, but you won't find them in that article.
>but these 'toy' computers are capable of running a word processor or spreadsheet, writing email, browsing the web (without js), compiling c, running an ide, etc. the zx spectrum even had a first-person 3-d shoot-em-up called elite. so i don't think it's pointless
Can they perform those tasks all at the same time, the way we expect now, or the way a mainframe/workstation user would have expected then? When you look at the entire landscape of computing hardware and software, it is difficult not to see the early micros in that light. An extreme example is the Altair 8800; I don't see how you could describe that computer as anything other than a plaything for enthusiasts.

I made the analogy between modern desktops and old supercomputers, which you seem to agree with; could you say with a straight face that an early micro is the rough equivalent of a '50s mainframe? It should be, but that analogy is much harder to justify: 36-bit vs 8-bit words, FORTRAN vs BASIC, an FPU (on some) vs no FPU. At every level of the analysis there are compromises. It should be clear that the micro represents an extremely compromised version of general computing that would be alien not only to us but to the serious computer user of the 70s/80s, and yet its characteristics should be held up as a benchmark for computing devices in general? Outside the context of "the home computing experience" they make for a poor comparison.

Unfortunately there is great ignorance in this area due to the disproportionate focus on micros when discussing historical computing. Videos and articles on the Apple II series, awash with insufferable amounts of sentimentality, are a dime a dozen, but you will be lucky to find anything on the IBM 704, a much more interesting and equally significant machine.
Unless you can produce actual measurements, following the same methodology, for these mythical "non-cherry-picked" modern machines, I am going to call bullshit here.
https://danluu.com/input-lag/