> You have fallen headfirst into the "Not now, so never" fallacy.
Perhaps. Though we have empirical evidence of how far we can quantize and distill models before they become practically useless. That sets a bar for how large a local model needs to be to compete with the cloud ones for general use. We are talking in the area of 60GB for GPT-OSS/Qwen3.5, which is what enthusiasts are running on 32GB of DDR5 plus a 24GB RTX 3090.
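The 60GB figure is just weights-times-bits arithmetic. A back-of-the-envelope sketch in Python (the ~120B parameter count and the effective bits-per-weight are my assumptions for illustration, not published specs):

    def model_footprint_gb(params_b: float, bits_per_weight: float) -> float:
        """Rough weights-only memory footprint of a quantized model, in GB.

        params_b: parameter count in billions (~120 assumed here).
        bits_per_weight: effective bits per weight after quantization;
                         4-bit schemes land near 4.0-4.5 once scales are counted.
        KV cache and runtime buffers add several GB on top of this.
        """
        return params_b * bits_per_weight / 8  # (1e9 params * bits / 8) bytes -> GB

    # Hypothetical numbers, not published specs:
    print(model_footprint_gb(120, 4.0))  # 60.0 GB -- the ballpark cited above
    print(32 + 24)                       # 56 GB combined RAM + VRAM on that rig

Which is why that rig is already squeezed: the model barely fits split across system RAM and VRAM, with little headroom for context.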
> As if consumer hardware won't get more powerful
Now, with that last fact in hand, I'll let you plot a chart of what it has cost to provision that hardware over the past two years, and use it to prove me wrong about the affordability of local models.
I don't - idealistic motives seem to be common among leading AI developers and researchers. It's entirely realistic that Anthropic sticking to principle and taking a hit for it will give it an edge in recruiting those idealistic types.
That different path was grim. Adobe wrecked or killed a lot of really good software after the buyout. Freehand and Fireworks were incredible for their day.
You are flatly wrong. These countries are internationally recognized sovereign nations and members of the UN. They have independent, often conflicting interests and foreign policies. Meanwhile the US bases are held under formal leases, not through force.
Iran also has a gang of murderous theocratic nutters running it, massacring their citizens for taking to the streets and singing songs, undermining foreign societies, and lending their know-how and drones to other, bigger psychopaths for their invasions. It'll gladden my heart if the leadership is destroyed, even if some beautiful old masonry gets chipped along the way.
You have fallen headfirst into the "Not now, so never" fallacy. As if consumer hardware won't get more powerful, or models more economical.