Hacker News

I got an ad for an AI legal consultant and it made me upset. Someone is trying to make a quick buck by outsourcing legal advice to an unreliable source. Now I'm better prepared to explain how it's a bad idea, instead of just why.

I like that the article explains a few different ways hallucinations creep in, besides the obvious. Maybe what's most needed is a better retrieval/search system? If the AI can't fetch high quality data, then it's doomed before it tries.
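For what it's worth, the retrieval step being suggested is usually called retrieval-augmented generation: fetch relevant source documents first, then have the model answer only from those. Here's a minimal sketch with a toy two-document corpus and naive word-overlap scoring; all names and documents are hypothetical illustration, not any real product's pipeline.

```python
# Minimal sketch of the retrieval step in retrieval-augmented generation (RAG).
# The corpus and scoring here are hypothetical; real systems use embeddings
# or a proper search index, but the shape of the pipeline is the same.
from collections import Counter

CORPUS = {
    "statute_of_limitations": "The statute of limitations sets the deadline for filing a lawsuit.",
    "contract_basics": "A contract requires offer, acceptance, and consideration to be enforceable.",
}

def retrieve(query: str, corpus: dict) -> str:
    """Score each document by word overlap with the query; return the best match."""
    q_words = set(query.lower().split())
    def score(text: str) -> int:
        counts = Counter(text.lower().split())
        return sum(counts[w] for w in q_words)
    best_key = max(corpus, key=lambda k: score(corpus[k]))
    return corpus[best_key]

def answer(query: str) -> str:
    """Ground the answer in the retrieved passage instead of free generation."""
    context = retrieve(query, CORPUS)
    return f"Based on the retrieved source: {context}"
```

The point of the comment stands either way: if the retrieval step returns low-quality or irrelevant documents, the generation step is grounded in garbage, so better search is upstream of better answers.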



What really upsets me is the idea of making therapist LLMs.

They may be able to regurgitate an impressive amount of psychology texts, but that doesn't give them the theory of mind, observational skill, experience or judgement needed to be a good psychotherapist.


That one is doomed to failure from a customer perspective IMO - though not for any of the reasons you're describing about what people or therapists want.

Rather, it's because the real reason people are in therapy is often to get someone else's perspective on what they're experiencing.

And fundamentally, that is destined to fail when there's no actual human involved.



