polygon6121,

The model makes decisions thinking it is right, but for whatever reason it can’t see a fire truck or a stop sign, or it misidentifies the object… you know, almost like how a hallucinating human would perceive something from external sensory input that is not there.

I don’t mind giving it another term, but “being wrong” is misleading. That said, you are correct in the sense that it depends on the given case…
