A lot of the non-mainstream PC clones, especially the ones in unconventional form factors, differed from the machines of major manufacturers like Compaq in subtler ways, and probably never got the testing needed to uncover bugs like this. I'm guessing the BIOS implemented the A20 enable call so that it returned success, but all it actually did was send a command to a nonexistent keyboard controller. (On a PC laptop, not having an 8042 is a bad idea anyway, since 8042 emulation is usually built into the EC and talking to it is far easier than requiring a whole USB stack, but I digress...) The OEM bought the BIOS and didn't fully customise it to the particular system's setup.
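For context, the keyboard-controller path such a BIOS would be faking is just a couple of port writes. Here's a minimal C sketch (hypothetical function names, GCC-style inline asm, assuming a pre-boot environment with direct port I/O), not the actual code from any particular BIOS:

    #include <stdint.h>

    /* Standard x86 port I/O helpers via inline asm. */
    static inline void outb(uint16_t port, uint8_t val)
    {
        __asm__ volatile ("outb %0, %1" : : "a"(val), "Nd"(port));
    }

    static inline uint8_t inb(uint16_t port)
    {
        uint8_t val;
        __asm__ volatile ("inb %1, %0" : "=a"(val) : "Nd"(port));
        return val;
    }

    /* Spin until the 8042's input buffer is empty (status bit 1 clear).
     * Real firmware bounds this wait with a timeout: a machine with no
     * 8042 at all may float the status port at 0xFF and spin forever. */
    static void kbc_wait_input_empty(void)
    {
        while (inb(0x64) & 0x02)
            ;
    }

    /* One common A20-enable sequence via the keyboard controller. */
    void enable_a20_via_kbc(void)
    {
        kbc_wait_input_empty();
        outb(0x64, 0xD1);   /* command: write to the 8042 output port     */
        kbc_wait_input_empty();
        outb(0x60, 0xDF);   /* output-port value with the A20 bit set     */
        kbc_wait_input_empty();
    }

If there's no controller (or no EC emulation of one) behind ports 0x60/0x64, those writes just vanish, which is the failure mode I'm guessing at above.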
Things might've been better if the PC had been officially standardised. AFAIK, despite IBM publishing schematics and BIOS source code for everything up to the AT, other companies couldn't make use of that and still had to reverse-engineer the functionality.
I see a lot of people, mostly newcomers, complain about the "legacy" stuff, but keep in mind that the strong backwards compatibility is what made the PC platform as successful as it is. It has its quirks due to evolution, but the relative openness and stability of the design is why I'd still prefer it over some of the other platforms out there, e.g. ARM SoCs, where everything except the CPU cores varies widely between manufacturers and models. To me, a mostly openly-documented platform is far better than a proprietary one even if the latter is "legacy free".
And it's a good thing. No one needs redundant silicon that is used for 0.0000000001% of the device's lifetime (the <1 second between power-on and loading the kernel).
People think we need ACPI on ARM; Linaro is even working on it right now :/ It's like no one gets what a clusterfuck ACPI was on the PC.
I blame Intel. It's time for an x86 CPU that drops all of this compatibility nonsense. Imagine something crazy like x86 bootstrapping in protected (or long) mode.
real 16 bit? gone
virtual mode? gone
MSR? gone
CRx? gone, btw wtf happened to CR1?
I'm sure MS would be on board (if not ecstatic) with a CPU that can run only the newest version of Windows, and Linux would happily adapt in a couple of weeks. Maybe <1% of computers have a CPU that ever touches this swamp of cruft and hacks outside of the BIOS/bootloader.
Itanium killed x86 compatibility, and that had zero chance of success in the commodity market and little to medium chance in the server space. You can't launch a product with no user base when your competitor (AMD) offers better and cheaper processors.
I am proposing removing _unused_ compatibility hacks. No one sane uses virtual 8086 mode anymore. I wonder how much silicon real estate all this garbage takes up.
Was that through technical failing, or was the project just mismanaged into the ground like the Alpha? Or was it killed by AMD's backward-compatible 64-bit architecture?
Itanium failed to deliver on its basic promise. The simple in-order VLIW design was intended to be much easier to implement in hardware than the complex out-of-order RISC designs with long pipelines that everybody else (including Intel x86/AMD64) was doing, but the actual Itanium CPUs were notorious for shipping years later than initially announced, and because of the delays the available out-of-order RISCs were usually much faster.
Nice writeup, but[1] I'm wondering: could the author have saved himself a lot of grief by not using an ancient, outdated bootloader in the first place?
[1]: I don't want to spoil the fun, but look at [2] in the article and wonder: who the hell is still using grub 1? Even mainstay isolinux (which, in addition, has to deal with lots of weird/broken BIOS implementations of CD booting (El Torito, anyone?)) does a lot better.
Because in 2009 grub 2 was a dreadful EFI bootloader and there were working patches for grub legacy. We shipped the bootloader we had, not the bootloader we wished we had.
afaict, work on grub2 started back in 2004, and a backport patch[1] of the 'paranoid' A20 checking for grub legacy was posted to the bug-grub mailing list back in 2006.
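For reference, the kind of check involved is, roughly, the classic wraparound test: with the A20 line masked, physical addresses wrap at 1 MiB, so a low address and the same address plus 1 MiB alias the same byte. A minimal C sketch, assuming a flat, identity-mapped pre-boot environment and illustrative scratch addresses (the real thing is done from real mode with a 0000: vs FFFF: segment pair), not the patch's actual code:

    #include <stdint.h>

    #define A20_TEST_LOW   0x000500u                  /* scratch byte below 1 MiB      */
    #define A20_TEST_HIGH  (A20_TEST_LOW + 0x100000u) /* same offset, 1 MiB higher     */

    /* Returns 1 if A20 is enabled (no wraparound), 0 if it is disabled. */
    static int a20_enabled(void)
    {
        volatile uint8_t *low  = (volatile uint8_t *)A20_TEST_LOW;
        volatile uint8_t *high = (volatile uint8_t *)A20_TEST_HIGH;

        uint8_t saved_low  = *low;
        uint8_t saved_high = *high;
        int enabled;

        *low  = 0x00;
        *high = 0xFF;            /* if A20 is masked, this write lands on *low */
        enabled = (*low != 0xFF);

        *low  = saved_low;       /* restore whatever was there before */
        *high = saved_high;
        return enabled;
    }

The "paranoid" part is about not trusting the BIOS's claim of success and re-testing (and retrying other enable methods) until the wraparound actually goes away.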
In fact, the very first reply[2] to that patch makes exactly the same point I did above: why not use grub2?
The rest of the discussion on the email thread from 2006 on the A20 patch is about whether GRUB 2 is ready for use in production environments, so I am not sure what your point is.
My point wasn't that GRUB 2 did not exist in 2009, but that "Who the hell is still using grub 1?" was not as valid a question then, and is definitely not a reason for you to dismiss the article.