mattb, I'm not (at least not yet) persuaded by the argument that the output of an LLM is a derivative work of all the material it was trained on. The problem with that argument is that it applies to me too: if I see some cool code you wrote, you can bet I'll make a mental note of it. The trip hazards here are patents and non-free licenses, but those apply equally to people.
There are other problems with LLMs, but I'm not convinced by that one specifically.