I would imagine that the alien civilisations would also conduct that kind of study, and pre-emptively prevent the extinction scenario from happening.


We have a rather lousy history of avoiding risks even when we predicted them.

Environmental issues consistently get addressed only after people start noticing harms, but AI may not give civilizations time to react after they fuck up.


Yeah, the article makes no sense. It suggests we need to try to prevent the problem by 1) regulating AI development, and 2) accelerating our progress towards becoming multi-planetary.

Well, if AI really is such an effective great filter, it would not be so easily mitigated by such predictable solutions.

If you can't implement those mitigations because of politics, economics, etc., then the great filter is your social dysfunction, not the AI.
