
Pretty sure it's not scale but sensors.

We have sensors for vision, sound, taste, temperature, texture, etc. that are constantly observing the world, not only affecting our current behavior but changing us in real time.



You can feed an LLM of typical size (~1 TB) all the video and audio you can find, but it won't turn into a human. I suspect but can't prove that even if you wired up a bunch of other sensors, gave it a robot body, and let it "explore" the world on its own, that would still be woefully insufficient.

A gazelle just days(!) old can control its body and four legs sufficiently well to outrun a cheetah. This is a complex motor-control loop involving all of its senses. Compare that to Tesla's Autopilot training system, which uses many millions of hours of training data and still struggles to move a car... slowly. The equivalent would be a training routine that can take just a handful of days of footage and produce an AI that can win a car race.

There's something magical about neural networks when scaled up to brain size. From what I gather, there's little else encoded in the genome except the high-level pattern of wiring, the equivalent of the PyTorch model configuration.
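To make that analogy concrete, here's a loose, hypothetical PyTorch sketch (the layer choices and sizes are made up). The point is that the "genome" would correspond only to the few lines of architecture spec, while the learned parameters start random and get filled in by experience:

    import torch.nn as nn

    # Hypothetical architecture spec standing in for the "genome":
    # it fixes layer types, sizes, and wiring, but carries no weights.
    genome_spec = nn.Sequential(
        nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3),  # sensory front end
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(64, 1000),  # downstream layers; weights start random
    )

    # The weights are not inherited; they are initialized randomly
    # and shaped entirely by training ("experience"):
    n_params = sum(p.numel() for p in genome_spec.parameters())
    print(f"spec is a few lines; learned state is {n_params:,} parameters")

Even for this toy network the spec-to-parameter ratio is huge, which is the gist of the claim: the genome is tiny relative to the state the brain ends up with.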


How many hours, or rather years, of evolution were needed to reach walking capability? If first life is believed to have appeared 4 billion years ago, and the first walking animals emerged about 450 million years ago during the Silurian period, that's roughly 3.55 billion years.


That's not really a great comparison, though, because evolution wasn't trying to learn how to walk.


Evolution wasn't trying to do anything. But learning (in some sense) to walk was exactly something that evolution did, albeit very indirectly.



