
Hah. In which direction?


Generating plausible-looking bullshit with LLMs will be (already is) super cheap, while LLMs will not significantly help debunk it.


The solution isn't to create an LLM to debunk the bullshit, it's to create a second LLM that doubles down on the bullshit, so that the onus is on LLM1 to debunk it.

LLM1: 8 reasons the moon landing was faked

LLM2: 9 reasons the moon doesn't exist

/sarcasm


My favorite example is AIs generating citations of papers that don't exist.



