If you badger an LLM, it will often hallucinate a false response.
That's quite the statement...
Been trying to get Copilot to return documents I know are there, but it's like trying to get a cat to find a book in a library. I can't help but think it's easier to find them myself.
Going to try some coding stuff at some point.