
The problem is that intelligence isn't the result, or at the very least, the ideas that word evokes in people don't match the actual capabilities of the machine.

Washing is a useful word to describe what that machine does. Our current setup is like if washing machines were called "badness removers," and there was a widespread belief that we were only a few years out from a new model of washing machine being able to cure diseases.



Arguably there isn't even a widely shared, coherent definition of intelligence: to some people, it might mean pure problem solving without in-task learning; others equate it with encyclopedic knowledge, and so on.

Given that, I consider it quite possible that we'll reach a point where ever more people consider LLMs to have reached or surpassed AGI, while others still consider them only "sufficiently advanced autocomplete".


I'd believe this more if companies weren't continuing to use words like reason, understand, learn, and genius when talking about these systems.

I buy that there's disagreement on what intelligence means in the enthusiast space, but "thinks like people" is pretty clearly the general understanding of the word, and the one that tech companies are hoping to leverage.


The defining feature of true AGI, in my opinion, is that the software itself would decide what to do and do it without external prompts beyond environmental input.

Doubly so if the AGI writes software for itself to accomplish a task it decided to do.

Once someone has software like that, not a dog that is sicced on a task, but a bloodhound that seeks out novelty and accomplishment for its own personal curiosity or to test its capabilities, then you have a good chance of convincing me that AGI has been achieved.

Until then, we have fancy autocomplete.
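The "fancy autocomplete" framing can be made concrete: at its core, a language model repeatedly predicts a likely next token and appends it to the context. A minimal sketch, with a hypothetical hand-written bigram table standing in for the learned model (the table and its counts are invented for illustration):

```python
# Toy "autocomplete": a bigram count table plays the role of the model.
# Real LLMs do the same loop at vastly larger scale, with learned
# probabilities over tens of thousands of tokens instead of this table.
BIGRAMS = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2, "ran": 1},
    "sat": {"down": 4},
}

def complete(prompt, max_tokens=5):
    tokens = prompt.split()
    for _ in range(max_tokens):
        candidates = BIGRAMS.get(tokens[-1])
        if not candidates:
            break  # no known continuation
        # Greedily pick the most frequent continuation.
        tokens.append(max(candidates, key=candidates.get))
    return " ".join(tokens)

print(complete("the"))  # "the cat sat down"
```

Whether scaling this loop up (plus training on most of the internet) amounts to "intelligence" is exactly the disagreement in this thread.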


What about letting customers actually try the products and figure out for themselves what it does and whether that's useful to them?

I don't understand this mindset that because someone stuck the label "AI" on it, consumers are suddenly unable to think for themselves. AI as a marketing label has been used for decades, yet only now is it taking off like crazy. The word hasn't changed - what it's actually capable of doing has.


> What about letting customers actually try the products and figure out for themselves what it does and whether that's useful to them?

Yikes. I’m guessing you’ve never lost anyone to “alternative” medical treatments.


Not to mention ChatGPT-induced suicidal ideation.


Please define intelligence.



