
Thanks for replying! I disagree that current LLMs can't help build tooling that improves rigor and lets you manage greater complexity; however, I agree that most people aren't using them that way. Some threads from a colleague on this topic:

https://bsky.app/profile/sunshowers.io/post/3mbcinl4eqc2q

https://bsky.app/profile/sunshowers.io/post/3mbftmohzdc2q

https://bsky.app/profile/sunshowers.io/post/3mbflladlss26


