> Misunderstanding benchmarks seems to be the first step to claiming human level intelligence.

It's known as "hallucination," a.k.a. "guessing or making stuff up," and it is a major challenge for human intelligence. Attempts to eradicate it have met with limited success. Some say that human intelligence will never reach AGI because of it.


Thankfully nobody is trying to sell humans as a service in an attempt to replace the existing AIs in the workplace (yet).

I’m sure such a product would be met with ridicule, considering how often humans hallucinate. Especially since, as we all know, the only use for humans is producing responses to a given prompt.

> Thankfully nobody is trying to sell humans as a service

That’s a description of the entire service economy.

