
You might try to find forums where people are doing the kind of LLM work you want to do, and see what hardware they recommend for it.

Personally, as my needs have changed, my GPU server is already on its 4th GPU, after I kept finding I needed more VRAM for various purposes. The RTX 3090 seems to be a sweet spot right now, and my next upgrade would probably be a 2nd 3090 with NVLink, or some affordable older datacenter/workstation GPU with even more VRAM. Other than the GPU, you just need an ordinary PC with an unusually big PSU, maybe a big HDD/SSD (for datasets and model files), and maybe a lot of RAM (some model-loading paths stage the full weights in system memory rather than streaming them to the GPU). https://www.neilvandyke.org/machine-learning/
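For sizing purposes, a rough rule of thumb is: VRAM for weights ≈ parameter count × bytes per parameter, plus some overhead for activations and the KV cache. Here's a minimal sketch of that arithmetic (the 20% overhead factor is my own rough assumption, not a precise figure; real usage varies with context length and batch size):

```python
def estimate_vram_gb(params_billions: float,
                     bytes_per_param: float = 2.0,
                     overhead: float = 1.2) -> float:
    """Back-of-envelope VRAM estimate for holding model weights.

    bytes_per_param: 2.0 for fp16/bf16, ~0.5-1.0 for 4-8 bit quantization.
    overhead: rough multiplier for activations / KV cache (assumption).
    """
    weights_gb = params_billions * bytes_per_param  # 1B params * 1 byte ~ 1 GB
    return weights_gb * overhead

# A 7B model in fp16 needs roughly 14 GB for weights alone,
# ~17 GB with overhead -- it fits on a 24 GB RTX 3090.
print(round(estimate_vram_gb(7), 1))
# A 70B model in fp16 (~168 GB with overhead) does not fit on one card,
# which is why people reach for a 2nd GPU or aggressive quantization.
print(round(estimate_vram_gb(70), 1))
```

This is why quantization matters so much for single-card setups: dropping from 16-bit to 4-bit cuts the weight footprint roughly 4x.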


