nyan,

The part that’s being ignored isn’t the existence of the hallucinations themselves; it’s that they’re a problem. Right now a lot of enthusiasts just brush them off with the equivalent of “boys will be boys”: AIs will be AIs. Which is fine until an AI, say, gets someone jailed by providing garbage caselaw citations.

And, um, you’re greatly overestimating what someone like my technophobic mother knows about AI (xkcd 2501: Average Familiarity seems apropos). There are a lot of people out there who never get into a conversation about LLMs.
