admiralteal,

There are two different fuckups happening here, in my opinion.

  1. Asking a generative model to do something factually accurate is an incorrect use of the tech. That's not what it's for. You cannot expect it to give you accurate images of historical figures unless you ALSO tell it accurately what that historical figure should look like.
  2. Which is to say: if you want it to only generate images of white people, tell it so. This tech clearly has guardrails on it because the developers KNOW it has a highly biased training dataset that they are trying to counter. They are correct to acknowledge that bias and try to balance for it. That isn't a scandal; it is exactly what they should be doing.