
It's silly to assume we can reach something so sophisticated before we're even able to define what it is we're trying to achieve.

Am I the only one seeing the naked emperor here?



Well, intelligence itself is hard to define. We'd consider pretty much all humans "intelligent" even though things that are easy for some humans are near impossible for others. The greatest common denominator of human intelligence that people generally seem to imply when they talk about AGI is a multivariate overlap of the ability to learn, embodiment, abstract pattern recognition, visual and motor acuity, and emotional understanding. However, many of those facets are hard to test.

Intelligence is something that has thousands of variables. It is a spectrum with many points of "emergence" where something that was near impossible before becomes possible. There are of course the many benchmarks we give LLMs (MMLU, ARC, etc.), but the more practical test is whether a model can completely replace a human in an economically viable activity.


Everything you've written here is subject to interpretation.

Even “replace a human”.

For example, today there are only two pilots in an airliner, and even that is “just in case”. In the early days there were three.

So technology has literally already “completely replaced” some humans.

Does that mean “intelligence”?



