
i did this!

XXLLM: ~1T (GPT-4/4.5, Claude Opus, Gemini Pro)

XLLM: 300~500B (4o, o1, Sonnet)

LLM: 20~200B (4o, GPT-3, Claude, Llama 3 70B, Gemma 27B)

~~zone of emergence~~

MLM: 7~14B (4o-mini, Claude Haiku, T5, LLaMA, MPT)

SLM: 1~3B (GPT-2, Replit, Phi, DALL-E)

~~zone of generality~~

XSLM: <1B (Stable Diffusion, BERT)

4XSLM: <100M (TinyStories)

https://x.com/swyx/status/1679241722709311490
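The tiers above amount to a lookup on parameter count. A minimal sketch of that classification in Python (the function name is hypothetical; tier names and boundaries are taken from the list, with the gaps between ranges, e.g. 3~7B and 200~300B, resolved at the upper bound of the smaller tier):

```python
def size_class(params_b: float) -> str:
    """Classify a model by parameter count (in billions of parameters),
    following the tier boundaries in the taxonomy above."""
    if params_b < 0.1:
        return "4XSLM"  # <100M (TinyStories)
    if params_b < 1:
        return "XSLM"   # <1B (Stable Diffusion, BERT)
    if params_b <= 3:
        return "SLM"    # 1~3B (GPT-2, Phi)
    if params_b <= 14:
        return "MLM"    # 7~14B (T5, LLaMA)
    if params_b <= 200:
        return "LLM"    # 20~200B (Llama 3 70B, Gemma 27B)
    if params_b <= 500:
        return "XLLM"   # 300~500B (4o, o1, Sonnet)
    return "XXLLM"      # ~1T (GPT-4/4.5, Claude Opus)
```

For example, `size_class(70)` returns `"LLM"`, matching Llama 3 70B's placement in the list.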
