ChatGPT is dismissing it, but I’m not so sure.
Seriously, do not use LLMs as a source of authority. They are stochastic machines predicting the next token they emit; if what they say is true, it's pure chance.
Use them to draft outlines. Use them to summarize meeting notes (and review the summaries). But do not trust them to give you reliable information. You may as well go to a party, find the person who’s taken the most acid, and ask them for an answer.
Acid freaks are probably more reliable than ChatGPT.
You’ll certainly gain some valuable insight, even if it has nothing to do with your question. Which is more than I can say for LLMs.
I don’t understand the willingness to forgive error … Would you keep going to a person for answers if you knew for a fact that 1 in 5 things they say is wrong?