It's not exactly an out-of-the-park win though, as it's only $100-$200 cheaper, and the comparison trends worse the farther you get from the base models (particularly past 64 GB, where you have to leave the Mini family for the Studio). By the time you get to 128 GB, which you'd want for the 70B-class AI models, you end up back at the original statement.
Huh. Ate my words then. I briefly checked Apple's page and didn't see them.
However, those are the base M4 chip and not the M4 Pro. You need the M4 Pro to get competitive GPU compute numbers for a more like-for-like comparison. The M4 Pro Mini comes out at $1800 with the 48 GB option, or $2000 for 64 GB. For the same price the Framework machine gets you 128 GB.
I think you're probably right that comparing to the M4 Pro would make more sense, but keep in mind you don't really need much compute; it's just that the M4 Pro has memory bandwidth more similar to the AI Max 395+, while the "normal" M4 doesn't.
Every large AI model is heavily memory-bandwidth constrained, to the point that my 9800X3D (with the extra L3 cache), with 128 GB/s of memory attached, is only about 60% utilized running a 32 GB model in CPU-only mode (no NPU, iGPU, or GPU offload enabled). Really small AI models can start to be compute-bound, but at that point you don't really need the 32 GB of memory anymore and probably just want a normal GPU.
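For a rough sense of why bandwidth dominates, here's a back-of-envelope sketch (my own illustration, not from the thread): for dense inference, roughly the whole weight file has to be streamed from memory once per generated token, so tokens/sec is capped near bandwidth divided by model size. The 256 GB/s figure below is my approximation for the M4 Pro / AI Max 395+ class of parts; the utilization number is just illustrative.

```python
# Back-of-envelope ceiling on token throughput when an LLM is memory-bandwidth bound.
# Assumption: dense inference, so ~all weights are read from RAM once per token.

def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float,
                       achieved_fraction: float = 1.0) -> float:
    """Upper bound on tokens/sec from weight streaming alone."""
    return bandwidth_gb_s * achieved_fraction / model_size_gb

# Desktop-class setup from the comment above: ~128 GB/s bandwidth, 32 GB model.
print(max_tokens_per_sec(128, 32))        # ~4.0 tok/s theoretical ceiling
print(max_tokens_per_sec(128, 32, 0.7))   # ~2.8 tok/s if only ~70% of peak bandwidth is achieved

# Why the higher-bandwidth parts matter (assumed ~256 GB/s class): roughly
# doubling bandwidth roughly doubles the ceiling for the same model size.
print(max_tokens_per_sec(256, 32))        # ~8.0 tok/s
```

More compute doesn't move that ceiling; only more bandwidth (or a smaller/quantized model) does, which is the whole argument for the M4 Pro and AI Max 395+ class parts here.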
This starts at $1099 per the article, so the Mac Mini is the low-price alternative here.