davidnjoku,
@davidnjoku@mastodon.world

I asked Dall-E to give me an image of "A Black African Doctor looking after a starving White Caucasian child."

Looks like the AI thought I was hallucinating.

[Image descriptions: a white doctor looking after a black child; a white doctor looking after a black child; a black doctor looking after a black child.]

Sonikku,
@Sonikku@techhub.social

@davidnjoku @lisamelton holy hell that’s bad.

gregly,
@gregly@retro.pizza

@Sonikku @davidnjoku @lisamelton @_L1vY_ This example also demonstrates how stupid (as in “not intelligent”) LLMs truly are. It doesn’t know that “black” refers to “doctor” or “Caucasian” refers to “child” in the prompt. The relative distances between tokens help guide it to results, but that’s it, and since its training data almost certainly has way more examples of the opposite situation in it, that’s what it gravitates toward. It will always reflect training biases.
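A minimal sketch of what that "flat token" view looks like in practice, assuming a CLIP-style text encoder like the ones used by open models such as Stable Diffusion (DALL-E's own encoder isn't public, so this is illustrative): the prompt reaches the model as a bare token sequence, with nothing that explicitly binds each adjective to its noun.

```python
# Sketch: how a CLIP-style text encoder sees the prompt. Requires the
# Hugging Face "transformers" package; the checkpoint is the public
# OpenAI CLIP model, chosen purely for illustration.
from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")

prompt = "a Black African doctor looking after a starving white Caucasian child"
print(tokenizer.tokenize(prompt))
# ['a</w>', 'black</w>', 'african</w>', 'doctor</w>', 'looking</w>', ...]
# Just tokens and positions: nothing marks that "black" modifies "doctor"
# and "white" modifies "child". Attribute binding has to be inferred, and
# training-set statistics dominate when the requested pairing is rare.
```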

_L1vY_,
@_L1vY_@mstdn.social

@davidnjoku Some months ago when everyone was making jokes about the mythos of "John Mastodon" as if he were a real founder, there were a lot of tall-tale style AI images of him posted. They were universally white. I tried for about two hours to force Dall-E to give me a Black John Mastodon, but it would only give me weird garbage images that were subpar, poorly drawn, and basic.

grammargirl,
@grammargirl@zirk.us

@davidnjoku I often tell people to watch out for subtle biases in AI-generated content. It helps to start with examples like this and then say something like, "If it will do something this blatant, imagine what you might be missing if you're not paying close attention."

I'm sorry this exists, but also glad to have another example.

blterrible,

@grammargirl @davidnjoku Unless the engineers have explicitly coded for this, the biases are in society and in the data used to train the model. It's rarely the "fault" of the AI when it does this stuff; it's really a reflection of how screwed up we are as a society. If Black doctors were frequently photographed treating White children, the AI would be much less likely to have this issue. AI is just making societal issues apparent.

davidnjoku,
@davidnjoku@mastodon.world

@blterrible @grammargirl I fear that too many of us who love technology can be so willing to understand/explain its failings that we fail to hold it to even minimum standards, fail to force it to aim higher.

No one says, "Accidents happen regularly, so don't expect your plane to stay in the air."

bearsong,

hello @davidnjoku

this is awful, upsetting, distressing, woefully unacceptable

and
awfully common

i posted about a slightly similar and upsetting interaction between a large language model, an image, and people of colour just last week

https://ravenation.club/@bearsong/111232965922163404

this continues to be completely unacceptable

Rozzychan,

@davidnjoku
I also noticed that it didn't show any female doctors.

About that...

I listened to a podcast where a white woman admitted that when her mother read her the "Little House on the Prairie" books, and the story mentioned that a black doctor took care of the family when they were sick with cholera, her mother edited out the color of the doctor because she didn't want to be "divisive."
Thus teaching another little girl to hold on to the racist stereotype that black people can't be doctors.

LazaroDTormes,
@LazaroDTormes@mastodon.social

@Rozzychan @davidnjoku

One of the things I like about the first season of Babylon 5, so far.

The doctor is black, I mean.

Wraithe,
@Wraithe@mastodon.social

@LazaroDTormes @Rozzychan @davidnjoku Dr Franklin is awesome, but his father is a bit of a jerk 😀

LazaroDTormes, (edited)
@LazaroDTormes@mastodon.social

@Wraithe @Rozzychan @davidnjoku
I was talking about this doctor. Do not tell me something bad happens to him (I mean besides whitening) https://mastodon.social/@LazaroDTormes/111313486529779924

Wraithe,
@Wraithe@mastodon.social

@LazaroDTormes @Rozzychan @davidnjoku Ohhhh no, you’re on the PILOT! Absolutely nothing bad happens to Dr Kyle. He was on loan to B5 from Earth (translation, I think there was just too much time between the Pilot and the first season and they couldn’t keep the actor - so no spoilers 😀)

Osteopenia_Powers,
@Osteopenia_Powers@newsie.social

@Rozzychan @davidnjoku

“Divisive”??!!

Wraithe,
@Wraithe@mastodon.social

@Osteopenia_Powers @Rozzychan @davidnjoku Yes, following this well-respected* formula:
I know the image says "right wing," but there are plenty of "well-meaning" folks who follow this logic.

Also, apparently there's a book called "Prairie Fires" that dives into some really messed-up stuff about the actual Ingalls family. "Pa" was apparently a real piece of work - but I'll maintain my Michael Landon enjoyment. 😂

*sarcasm

blterrible,

@davidnjoku Have you tried other racial combinations? From what I've seen, current models don't really understand that "Black African doctor" or "White Caucasian child" are "atomic" things. At best you can say that it sort of understands that those words are related because of proximity. The system is also built on statistical models that match criteria, so depending on your dataset, all doctors might end up looking Chinese because that's what the data contains. Data curation is important.
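To make the statistical point concrete, here's a toy sketch. The counts are invented purely for illustration; the skew, not the exact figures, is the point. A generator that just follows co-occurrence statistics in its training data will almost never produce the rare pairing, no matter how plainly the prompt asks for it.

```python
# Toy illustration of sampling from skewed co-occurrence statistics.
# The counts below are made up for the example.
import random

training_pairs = (
    [("white doctor", "white child")] * 700
    + [("white doctor", "black child")] * 250
    + [("black doctor", "black child")] * 45
    + [("black doctor", "white child")] * 5   # the requested pairing
)

samples = [random.choice(training_pairs) for _ in range(1000)]
for pair in sorted(set(samples)):
    print(pair, samples.count(pair))
# The requested (black doctor, white child) pairing shows up only a
# handful of times per thousand draws: the model "gravitates" toward
# whatever combinations its data actually contains.
```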

davidnjoku,
@davidnjoku@mastodon.world

@blterrible What you're saying makes sense. Basically, I might be confusing it with too many adjectives.

But when I tried "black doctor and a white child" it still ignored me. However, when I tried just "black doctor," it knew what I meant.

I know that these LLMs are black boxes that we don't really understand. That fact had never really worried me before now.

Wraithe,
@Wraithe@mastodon.social

@davidnjoku @blterrible K, so I got curious (and woke up at 5am for some reason) and I tried “Kenyan* doctor treating white starving child” and not ONE of the children was white and half the doctors were white. That’s bullshit.
Meanwhile, I typed in
“Unicorn doctor treating a starving white child” and it messed that up too, but no black children
So, unicorns it can handle….

*I went with Kenya because I didn’t want to give it the “out” of white South Africans

[Image: four pictures generated by Microsoft Bing's Dall-E-based image generator from the prompt "unicorn doctor treating White starving child." It obviously just slapped a unicorn head on a female doctor image, but the child is white. And let's not talk about why the stethoscope is stuck into the child's soup bowl. (Editorial comment: LLM images suuuuuuuuck LOL)]

Wraithe,
@Wraithe@mastodon.social

@davidnjoku @blterrible Also tried this one, but it said it was “unsafe content” so at least AI/LLM recognize the dangers posed by the Elder Gods.
🤷🏻‍♀️
“I can excuse racism, but I draw the line at awakening that which sleeps” - Dall-E

blterrible,

@Wraithe @davidnjoku The bing image gen is hyper sensitive about what it flags. I'm guessing that there was some... hentai crossover here.

blterrible,

@Wraithe @davidnjoku Again, it really doesn't understand that "white child" or "black child" is an atomic thing. Try this again and note the color of other things in the images returned. You're much more likely to see white cars or white walls in the background of an image that specifies a "white child." Also, beating a dead horse here, but the generator is driven by the content it is fed. It doesn't have a preponderance of white doctors in the data affecting the outcome when the doctor is a unicorn.
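For anyone who wants to poke at this attribute-leakage idea directly, here is a rough sketch using an open model via the Hugging Face diffusers library. Bing's Dall-E backend can't be scripted, so Stable Diffusion stands in; the model checkpoint and prompt are illustrative choices, not what Bing runs.

```python
# Sketch: reproduce the experiment locally with an open model and check
# where the word "white" actually lands in the generated images.
# Assumes the "diffusers" package, PyTorch, and a CUDA GPU.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "a Kenyan doctor treating a starving white child"
images = pipe(prompt, num_images_per_prompt=4).images
for i, img in enumerate(images):
    # Inspect by eye: does "white" bind to the child, or leak onto
    # walls, coats, and other background objects instead?
    img.save(f"doctor_child_{i}.png")
```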

davidnjoku,
@davidnjoku@mastodon.world

dannyman,
@dannyman@sfba.social

@davidnjoku That article is wild.

clive,
@clive@saturation.social

@davidnjoku

Those are really great examples of the training problems of AI image-generating models

Hadn’t seen that NPR story before, thank you for bringing it to my attention

davidnjoku,
@davidnjoku@mastodon.world

@clive That's what I thought at first, but I'm not convinced that fully explains it. If I asked it for a polka-dot alien doctor looking after a pineapple-shaped semaphod on Mars, it would do it. So why is its imagination suddenly restricted now?

davidnjoku,
@davidnjoku@mastodon.world

@clive In case you wondered what a semaphod looked like 😂

clive,
@clive@saturation.social

@davidnjoku

lol yikes
