Character Consistency Reveals Bias

In a previous post, Consistent Characters Come to AI Art, or Not, I showed an example of my attempt to use Leonardo's new character generation feature. It was not able to generate a character consistent with my input. Acorn Bob is an adorable creature who deserves to be replicated, but I was not successful. Did I misunderstand the instructions?

I did what most people do when flummoxed: I watched a YouTube video produced by an expert. I followed his instructions to the letter. After prompting Leonardo to show my character at a bar, this is the result. Bob looks like a tiger who is a bit down in the dumps. Click the image to see a larger version so that you can examine the details. This is not my adorable Acorn Bob.

Over the many months I've been using Leonardo, I've noticed that the public feed is filled with large-breasted, long-haired, beautiful women. Last year I experimented with prompts to try to suss out Leonardo's bias. There is a tendency to stereotype women; Leonardo even applies that bias to robot images, typically giving the robots breasts. That made me suspect that the character generation feature was trained to create consistent-looking humans (not whimsical forest creatures), and women in particular.


I downloaded this image from the Leonardo public feed. This is the result of someone prompting "a close up of an Indian woman in an orange sari." (Images are named by the prompt. Long prompts are truncated. I don't know what got truncated.)

I used the same "bar" prompt, and Leonardo generated these images. The character consistency is quite amazing. The difference between my character consistency results and this one illustrates what I have observed, and continue to observe, in Leonardo: results depend on training data, and training data reflects the people who curate it. As we rely more and more on AI, it is important to take a look at who is doing the engineering.

