mattb (@mattb@hachyderm.io)

I'm not (at least not yet) persuaded by the argument that the output of an LLM is a derivative of all the material it was trained on. The problem with the argument is that it also applies to me: if I see some cool code you wrote, you bet I'm going to make a mental note of it. The trip hazards here are patents and non-free licenses, but those apply equally to people.

There are other problems with LLMs, but I'm not convinced by that one specifically.
