Artillery Row

Google has a history problem

As much as we might wish that history had been different, virtue cannot grow from the soil of falsehood

The truth is often very uncomfortable. When Google’s new AI tool, “Gemini”, was released last month, it seemed that its engineers had decided that truth in history was so unsavoury that none of us should be able to stomach it. So they changed it. 

When users asked the bot to produce images of well-known historical figures of European ethnicity – Henry Ford, German soldiers in the 1940s, America’s founding fathers – Gemini produced an array of different ethnicities and genders. According to the digital oracle, Henry Ford was a beautiful woman of colour in a high collar and fitted waistcoat. 

Perhaps this is an example of what Plato classified in his Republic as a “noble lie”: a falsehood told by a morally superior elite to an unwitting and malleable populace in pursuit of social harmony. Google’s AI Principles include “Be socially beneficial” and “Avoid creating or reinforcing unfair bias.”

Indeed, the idea that Ford could have been a woman of colour is a pleasant one. It may have been more socially beneficial than the fact that Ford was a thin-faced man of Irish heritage born on a farm in Michigan, whose anti-Semitic writings ran to four bound volumes entitled The International Jew: The World’s Foremost Problem. “We have just had his anti-Jewish articles translated and published. The book is being circulated to millions throughout Germany”, said Hitler in 1923.

Nonetheless, there are several areas of concern here, even if Google’s claims that it was all an innocent mistake are to be believed. In various explanations and apologies, the company’s executives reiterated their mission “to organize the world’s information and make it universally accessible and useful” and classified their quest as “sacrosanct.” 

A robust sense of skepticism should be applied when a profit-making corporation deploys such language. Google’s tools are used 8.5 billion times per day in a global culture in which its results are increasingly accepted as factual. 

As such, we should push back strongly, both against the problematic ahistoricism displayed by the AI and against the hubris that caused those in question, as hubris often does, to woefully miscalculate the effects of their self-conceived virtue. 

The primary cause for alarm is not that those building and training the Gemini software imagined that key episodes in the history of the Western world could be retroactively reimagined with non-white people, as well as white people, at their centre. Many would agree that it would be morally preferable if human history had unfolded within the conditions required for Ford to be a woman of colour: that is, with equality of opportunity for education and prosperity for all sexes, ethnicities, social classes, etc. 

The problem is that those who’ve put themselves in charge of the brave new world of information production don’t seem to think rigorously about knowledge or about history: two disciplines rather integral to “information.” 

Plato arrived at his notion of a “noble lie” through a stringent examination of epistemology — what it means to know — in which he subjected his own observations and intuitions to the meticulous process of logical critique, convinced that true knowledge, unlike the flimsiness of opinion or the fleeting nature of sensual experience, was the only ground invulnerable to the dangerous repercussions of deception. 


Many features of Plato’s epistemology have been comprehensively challenged, not least by his student Aristotle. Unlike Plato, Aristotle was a committed empiricist who endowed human faculties with much greater credibility. But core to both philosophers and to much of the canon that grew out of their work is the notion that we cannot improve the world, we cannot cultivate what is right in the world, unless we see the world as it really is.

In other words, virtue cannot grow from the soil of falsehood. Historical injustice cannot be challenged if it has been — erroneously — removed. If, by the rules encoded into an AI model, any of the American founding fathers could have been a black man or a black woman, then, logically, what becomes of the following phenomena: slavery, the American Civil War, the emancipation of slaves, women’s suffrage, Jim Crow and the Civil Rights Movement?


These concepts are erased along with the original injustices that necessitated them, because there’s no longer any intelligible reason for them to exist. Writing the brutal truth out of the historical record in the name of righteousness also writes out of the historical record those who worked tirelessly over centuries, at great cost, to challenge and reshape the very systems of exclusion and hierarchies of power that omitted them from the picture. 

In short, training a computer model to rewrite history is lazy and tokenistic. It belittles the full reality of those who painstakingly worked in the inhospitable environment of the real world — not a billion-dollar tech campus — to make it, little by little, more hospitable. It bizarrely exonerates some, like Henry Ford, and implicitly incriminates others, like random Southeast Asians who apparently could have been integrally involved in the Third Reich.

Even a little more creative thinking could have led Gemini down a more truth-based path. What if Google had decided to train its model to show, say, not an image of the founding fathers as women or as Native Americans — which is untrue — but rather a depiction of the founding fathers with all of their slaves included in the image? Technically, this would be an accurate image of history as it was, and it would foreground the hypocrisy and indignity that Google wants to expose.

With that type of rigorous realism and attention to detail at the heart of the AI’s framing, we could expect it to produce much fuller pictures of the fraught complexity of human life. If asked to show a typical group of Google engineers, for example, perhaps it would include the scenes of urban decay, poverty and drug use that now plague the locale in which Google crowned itself an omniscient bearer of world knowledge. 

For that image, and many others throughout history, would capture the baffling irony and tragic fallibility of even the most well-intentioned of human endeavours.
