lukeb28

@lukeb28@lemm.ee

lukeb28,

It depends on the bias of the programmer, or the choice would just be random due to the impossibility of making a correct one. If we haven’t been able to solve the problem, a robot never will unless it knows something we don’t (in this hypothetical, not an option) or is able to take an action we could not.

It’s such an absurd situation that I don’t think it’s constructive to consider. There are always more options in reality than a binary choice, and likely even more for a machine that could consider so many more inputs so much faster.

In the end, an accident is just that, an accident. No matter how well you consider all possibilities and design contingencies, there is always risk in everything. After an accident, we assess what happened and modify our assumptions about the probability of the event repeating, and make changes to reduce the odds of it happening again.

That said, if someone makes a mistake that leads to the robot switching the track from an empty one to one with people on it, that’s not an accident, and someone fucked up royally.
