Clearly Google don't want Gemini to draw black Vikings. That's just a mistake. I don't know enough about tweaking AIs to say whether it's a dumb mistake or the sort of mistake that's very difficult to avoid without making other, equally serious mistakes. But it's definitely not a political conspiracy.
One of the largest, most bureaucratic, but also most sophisticated companies did not catch that? Of course they did. The level of approvals required to release this model must have been insane.
A large org like this would have spent years of man-hours on requirements (functional and NFR), then regression testing and acceptance testing, multiple levels of sign-off, change request approvals, etc.
And yet lots of Google products get released with bugs. Buggy software is the default. Elaborate explanations are not required.
In concrete terms, what exactly could Google have hoped to gain from releasing an image generator that defaults to black Vikings? If it wasn't a mistake, what grand plan does it contribute to? And why did Google immediately about-face and abandon this plan? Did they think everyone would love the black Vikings?
In any case, the entire premise of this confected outrage (that AIs should be expected to produce historically accurate images for arbitrary inputs) is completely daft.