Moon, LLMs are actually very good in many ways for consuming documentation. But it's incredibly irritating when they mess up. I don't know if there's even a way to fix that in an LLM, since it doesn't actually have a truth model, or whatever you call it.