In some ways it feels like technology is taking us backwards. At the turn of the century TiVo freed us not only from having to sit down for our favorite shows at fixed times, but also from having to sit through commercials. Now streaming services have begun to pepper video content with commercials that we can’t skip. And I’ve noticed something strange about the latest generation of these advertisements: People of pallor are startlingly underrepresented. Not that I object to that. I do find it to be an amusingly patronizing reversal of, say, the olde days of theater, when virtually every character, regardless of race or gender, was played by a male. It is also a little uncanny: When a white person shows up in a commercial I wonder, “What happened – couldn’t the casting director find a person of color? Is that even legal?”
I’ve taken an interest in the latest diffusion “artificial intelligence” image-generation systems. And the lengths to which their development organizations go to prevent them from doing anything politically incorrect are … Orwellian.
Here’s OpenAI explaining how it is doing its part to keep its DALL-E system woke:
Reducing bias: We implemented a new technique so that DALL·E generates images of people that more accurately reflect the diversity of the world’s population. This technique is applied at the system level when DALL·E is given a prompt about an individual that does not specify race or gender, like “CEO.”
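OpenAI hasn’t published the details of this “technique,” but from the description it amounts to silently rewriting your prompt before the model ever sees it. Here is a minimal sketch of what such system-level prompt augmentation might look like — the word lists, trigger logic, and appended phrases are my own illustrative guesses, not OpenAI’s actual implementation:

```python
import random

# Hypothetical sketch of system-level prompt augmentation: if the prompt
# mentions a person but specifies neither race nor gender, silently append
# demographic terms before the prompt reaches the image model.
# All word lists below are invented for illustration.

PERSON_WORDS = {"ceo", "doctor", "nurse", "teacher", "person", "firefighter"}
RACE_WORDS = {"white", "black", "asian", "hispanic", "latino", "latina"}
GENDER_WORDS = {"man", "woman", "male", "female"}
AUGMENTATIONS = ["Black woman", "Asian man", "Hispanic woman", "white man"]

def augment_prompt(prompt: str) -> str:
    words = set(prompt.lower().split())
    mentions_person = bool(words & PERSON_WORDS)
    specifies_demographics = bool(words & (RACE_WORDS | GENDER_WORDS))
    if mentions_person and not specifies_demographics:
        # Quietly rewrite the user's prompt with a randomly chosen demographic.
        return f"{prompt}, {random.choice(AUGMENTATIONS)}"
    return prompt

print(augment_prompt("a CEO"))             # demographic terms appended
print(augment_prompt("a white male CEO"))  # unchanged
```

The point of the sketch is that the user asking for “a CEO” never learns that the prompt actually submitted to the model was something else entirely.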
Google says that it is not releasing its Imagen system for general use because the system is not yet sufficiently woke, citing “safety” and “societal impact”:
Imagen relies on text encoders trained on uncurated web-scale data, and thus inherits the social biases and limitations of large language models. As such, there is a risk that Imagen has encoded harmful stereotypes and representations, which guides our decision to not release Imagen for public use without further safeguards in place.
“Like what harmful stereotypes?” you may wonder. Well, there is “an overall bias towards generating images of people with lighter skin tones and a tendency for images portraying different professions to align with Western gender stereotypes.” Can you imagine the damage if the general public had uncensored access to an information system that contained biases in its depiction of skin tone and gender?