JoBo,

The data cannot be understood. These models are too large for that.

Apple says it doesn’t understand why its credit card gives lower credit limits to women than men, even when they have the same (or better) credit scores, because it doesn’t use sex as a data point. But it’s freaking obvious why, if you have a basic grasp of the social sciences and humanities. Women weren’t given the legal right to their own bank accounts until the 1970s. After that, banks could be forced to grant them accounts, but not to extend the same amount of credit. Women earn and spend in ways that differ, on average, from men. So the algorithm doesn’t need to be told that the applicant is a woman; it just identifies her as the sort of person who earns and spends like the class of people with historically lower credit limits.

Apple’s ‘sexist’ credit card investigated by US regulator

Garbage in, garbage out. Society has been garbage for marginalised groups since forever, and there’s no way to take that out of the data. Especially not big data. You can try, but you just end up playing whack-a-mole with new sources of bias, many of which cannot be measured well, if at all.
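
Here’s a rough toy sketch of what I mean, with completely made-up synthetic numbers (the feature names, group sizes and thresholds are just for illustration, not anything Apple or Goldman actually uses): the protected attribute is never handed to the model, but features correlated with it, trained against labels that already encode the historical bias, reproduce the gap anyway.

```python
# Toy illustration of proxy discrimination: the protected attribute is
# never a feature, but correlated features plus biased historical labels
# carry the signal anyway. All data here is synthetic and made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hidden protected attribute (0/1), never shown to the model.
group = rng.integers(0, 2, size=n)

# Two "neutral" features that happen to correlate with group membership,
# e.g. income and a spending-pattern score shaped by historical inequality.
income = rng.normal(loc=50 + 15 * (group == 0), scale=10, size=n)
spend_score = rng.normal(loc=0.5 * (group == 0), scale=1.0, size=n)

# Historical credit limits were set lower for group 1, so the training
# labels already encode the bias.
past_high_limit = (income + 5 * spend_score + rng.normal(0, 5, n)) > 55

# Note: no `group` column in the feature matrix.
X = np.column_stack([income, spend_score])
model = LogisticRegression().fit(X, past_high_limit)

pred = model.predict_proba(X)[:, 1]
print("mean predicted approval, group 0:", pred[group == 0].mean().round(3))
print("mean predicted approval, group 1:", pred[group == 1].mean().round(3))
# The gap persists even though the protected attribute was never a feature.
```

The model never sees the group column, yet the gap in its scores comes out roughly as wide as the gap in the history it was trained on. Dropping the sensitive field doesn’t drop the bias; it just hides where it comes in.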
