I got an ad for an AI legal consultant and it made me upset. Someone is trying to make a cheap buck by outsourcing legal advice to an unreliable source. Now I'm better prepared to explain how it's a bad idea, not just why.
I like that the article explains a few different ways hallucinations creep in, besides the obvious. Maybe what's most needed is a better retrieval/search system? If the AI can't fetch high quality data, then it's doomed before it tries.
What really upsets me is the idea of making therapist LLMs.
They may be able to regurgitate an impressive amount of psychology texts, but that doesn't give them the theory of mind, observational skill, experience, or judgement needed to be a good psychotherapist.