I was playing around with the new version of Midjourney the other day and noticed something interesting: if you didn't specify race, it would feed back stereotypes for certain prompts. For example, I had it imagine 40 iterations of what a creative director at an ad agency in NYC looks like. It gave me 40 white men. I repeated this with a more general prompt for a portrait of a doctor, and again got all white men.

The first thing people said (and will say) in the comments was that the results reflect the industry. That can be partially true; I won't dispute that the industry is heavily white and male. But there are a few problems with that argument. First, a simple Google search for the same prompt brings up plenty of diverse results across races and sexes, so the images are out there to scrape from. Second, Midjourney spits out four images at a time for each prompt, and there's no reason it should produce four similar-looking people. Hell, you get more diverse results from a general prompt for a cat.

The problem with AI, and it has always been this, is that it indirectly reflects the bias (or laziness) of its creators (see the trouble self-driving cars have had recognizing dark-skinned pedestrians). I don't know whether the Midjourney creators have a bias, but I do suspect they were lazy with the images they used to train the platform. They probably weren't thinking about what they were doing, and this was the result.

Some people tried to argue that all you had to do was add "randomize" or "diverse" to the prompt to get the desired results, but that actually reinforces my point. How many people will actually do that (if they even know they can) rather than just go with the first thing spit back at them? When race and gender aren't specified, white/male (or whatever the stereotype for a profession is) shouldn't just be the default.

Anyway, I posted my findings to LinkedIn and they resonated with some people. Cindy Gallop even reposted it twice on her timeline (yes, THE Cindy Gallop).