Anthropic's CEO Dario has annoyed me to no end with his "AI will take all the jobs in 6 months" doomer speeches on every podcast he graces with his presence.
Focusing on Dario, his exact quote IIRC was "50% of all white collar jobs in 5 years" which is still a ways off, but to check his track record, his prediction on coding was only off by a month or so. If you revisit what he actually said, he didn't really say AI will replace 90% of all coders, as people widely report, he said it will be able to write 90% of all code.
And these days it's pretty accurate. 90% of all code, the "dark matter" of coding, is stuff like boilerplate and internal LoB CRUD apps and typical data-wrangling algorithms that Claude and Codex can one-shot all day long.
Actually replacing all those jobs however will take time. Not just to figure out adoption (e.g. AI coding workflows are very different from normal coding workflows and we're just figuring those out now), but to get the requisite compute. All AI capacity is already heavily constrained, and replacing that many jobs will require compute that won't exist for years and he, as someone scrounging for compute capacity, knows that very well.
But that just puts an upper limit on how long we have to figure out what to do with all those white collar professionals. We need to be thinking about it now.
He's not right though. He's trying to scare the market into his pocket. It's well established that AI just turns devs into AI babysitters that are 10% more productive and produce 200% the bugs, and in the long-term don't understand what they built.
> It's well established that AI just turns devs into AI babysitters that are 10% more productive and produce 200% the bugs, and in the long-term don't understand what they built.
It's not well established at all. In fact, there is increasing evidence to the contrary if you look outside the HN echo chamber.
The nuanced take is that AI in coding is an amplifier of your engineering culture: teams with strong software discipline (code reviews, tests, docs, CI/CD, etc.) enjoy more velocity and fewer outages, teams with weak discipline suffer more outages. There are at least two large-scale industry reports showing this trend -- DORA 2025 and the latest DX report -- not to mention the infinite anecdotes on this very forum.
> He's trying to scare the market into his pocket.
People say this, but I don't get it. Is portraying yourself as a destroyer of the economy considered good marketing? Maybe there was a case to be made for convincing the government to impose regulations on the industry, but as we're seeing and they're experiencing first hand, the problem is the government.
If these tools were so great they wouldn't be struggling so hard to sell them. Great sign that the company has to mandate a "productivity" tool that the workers hate.
Hence why all these LLM companies love government contracts: they can't sell to consumers, so they'll just steal from taxpayers instead.
Ah yes, the mythical "valuations" based on unicorn dust and pixie horns (note that they don't define what a month actually is; my hunch is they take their best week and multiply it by 52).
> Focusing on Dario, his exact quote IIRC was "50% of all white collar jobs in 5 years" which is still a ways off, but to check his track record, his prediction on coding was only off by a month or so. If you revisit what he actually said, he didn't really say AI will replace 90% of all coders, as people widely report, he said it will be able to write 90% of all code.
Ugh, people here seem to think that all software is react webapps. There are so many technologies and languages this stuff is not very good at. Web apps are basically low hanging fruit. Dario hasn't predicted anything, and he does not have anyone's interests other than his own in mind when he makes his doomer statements.
The problem is, the low hanging fruit, the stuff it's good at, is 90% of all software. Maybe more.
And it's getting better at the other 10% too. Two years ago ChatGPT struggled to help me with race conditions in a C++ LD_PRELOAD library. It was a side project so I dropped it. Last week Codex churned away for 10 minutes and gave me a working version with tests.
I think that typescript is a language uniquely suited to LLMs though:
- It's garbage collected, so variable lifetimes don't need to be traced
- It's structurally typed, so LLMs can get away with duplicating types as long as the shape fits.
- The type system has an escape hatch (any or unknown)
- It produces nice stack traces
- The industry has more or less settled styling issues (ie, most typescript looks pretty uniform stylistically).
- There is an insane amount of open source code to train on
- Even "compiled" code is somewhat easy(er) to deobfuscate and read (because you're compiling JS to JS)
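The structural-typing point above can be sketched in a few lines. This is a minimal illustration, not from the original comment, and the type names (`Point`, `Coord`) are hypothetical: two independently declared types with the same shape are interchangeable, so a model that re-declares a type it can't see still produces code that type-checks.

```typescript
// Structural typing: compatibility is decided by shape, not by name.
interface Point { x: number; y: number }        // hypothetical "canonical" type
type Coord = { x: number; y: number };          // a duplicate declaration with the same shape

function dist(p: Point): number {
  return Math.hypot(p.x, p.y);
}

// A Coord is accepted where a Point is expected: same shape, so it compiles.
const c: Coord = { x: 3, y: 4 };
console.log(dist(c)); // 5

// The escape hatch: `unknown` forces an explicit assertion past the checker,
// while `any` would silence it entirely.
const raw: unknown = JSON.parse('{"x": 6, "y": 8}');
const p = raw as Point;
console.log(dist(p)); // 10
```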
Contrast that with C/C++:
- Memory management is important, and tricky
- Segfaults give you hardly anything to work with
- There are like a thousand different coding styles
- Nobody can agree on the proper subset of the language to use (ie, exceptions allowed or not allowed, macros, etc.)
- Security issues are very much magnified (and they're already a huge problem in vibecoded typescript)
- The use cases are a lot more diverse. IE, if you're using typescript you're probably either writing a web page or a server (maybe a command line app). (I'm lumping electron in here, because it's still a web page and a server). C is used for operating systems, games, large hero apps, anything CPU or memory constrained, etc.
I'm not sure I agree that typescript is "90% of all software". I think it's 90% of what people on hacker news use. I think devs in different domains always overestimate the importance of their specific domain and underestimate the importance of other domains.
I wouldn't say TypeScript is 90% of all software exactly, but there are tons of apps built on all kinds of technologies like Python / Django, Ruby on Rails, PHP, WordPress, "enterprise" Java and the like, primarily doing CRUD and data plumbing, especially for niche applications and internal LoB sites that we never see on the open Internet.
I agree C++ is harder, and I still occasionally find a missing free(), but Codex did crack my problem, including fixing a segfault! I had a bunch of strategically placed printfs gated behind an environment variable; it found those, added its own, set the environment variable, and examined the outputs to debug the issue.
I cannot emphasize enough how mind-blowing this is, because years back I had spent over an hour doing the same thing unsuccessfully before being pulled away.
> 90% of all code, the "dark matter" of coding, is stuff like boilerplate and internal LoB CRUD apps and typical data-wrangling algorithms that Claude and Codex can one-shot all day long.
If you mean "us" on this forum, I would believe that. I would bet the number of engineers working on stuff "outside the distribution" is overrepresented here.
If you mean "us" as in all software engineers, not at all. The challenge we're facing is exactly that, reskilling the 90% of engineers who have been working on CRUD apps to the 10% that is outside the distribution.
> 90% of engineers who have been working on CRUD apps
I am a 30-year "veteran" in the industry, and in my opinion this could not be further from the truth, but it is often quoted (even before AI). CRUD apps have been a solved problem for quite some time now, and while there are still companies that may allow someone to "coast" doing CRUD stuff, they are hard to find these days. There is almost always more to it than building dumb stuff. I have also seen (more and more each year) these types of jobs being off-shored to teams for pennies on the dollar.
What I have experienced a lot is teams where there are what I call "innovators" and "closers." "Innovators" do the hard work, figure shit out, architect, design... and then once that is done you give it to "closers" to crank things out. With LLMs, the role of the "closers" could now be "replaced," but in my experience there is always some part, whether it is 5% or 10%, that is difficult to "automate" so to speak.
I agree, I'd say we're talking about the same thing, just in different terms. When I said CRUD apps, it was a crude stand-in for what you call the "closing" work. Over-simplifying, but it's unglamorous, not too complicated, somewhat mechanical, mostly a translation into working code from high-level designs that come down from the "innovators."
But I am concerned precisely because AI is usurping that closing work, which accounts for the bulk of the team. Realistically the innovators will be the only people required. But the innovators are able to do the hard stuff by learning through a lot of hands-on experience and painful lessons, which they typically get by spending a lot of time in the trenches as closers.
And we're only talking about coding here, but this pattern repeats ALL over knowledge work: product, legal, consultancy, finance, accounting, administration...
So now the problem is two-fold: how do we get the closers to upskill to innovators a) without the hands-on experience b) faster than AI can replace them?
I don't understand why some of these AI companies don't check their egos at the door and hire public relations companies. Yes, I understand they are changing the world, but customers do not open their wallets when they are scared. Very few people I know are as avant-garde as I am with AI; most people look at these new technologies and simply feel fear. Why pay for something that will replace you?
It's to drive FOMO for investors. He needs tens of billions of capital and is trying to scare them into not looking at his balance sheet before investing. It's reckless, and is soaking up capital that could have gone towards more legitimate investments.
It certainly is. For people who have not heard the statements, here are some quotes. I bring them up, because I think it's worthwhile to remember the bold predictions that are made now and how they will pan out in the future.
Council on Foreign Relations, 11 months ago: "In 12 months, we may be in a world where AI is essentially writing all of the code."
Axios interview, 8 months ago: "[...] AI could soon eliminate 50% of entry-level office jobs."
The Adolescence of Technology (essay), 1 month ago: "If the exponential continues—which is not certain, but now has a decade-long track record supporting it—then it cannot possibly be more than a few years before AI is better than humans at essentially everything."
To be fair, it's hilarious how much verbiage was spent discussing AI 'getting out of the box', when the first thing everyone did with LLMs was immediately throw away the box and go "Here! Have the internet! Here! Have root access! Want a robot body? I'll get you a robot body."
"Y'know, like, the thing is, like, y'know, here's the thing..."
I totally feel for people with speech pathologies or anxiety that makes it harder for them to communicate verbally, but how is this guy the public face of the company and doing all these interviews by himself? With as much as is at stake, I find it baffling.
What I find so funny about heads of AI companies coming out saying things like this, is their own career pages suggest they don't actually feel that way.
He's annoyed me most with the way he speaks. I'm not sure if it's a tic or what, but the way he'll repeat a word 10x before starting a sentence is painful to listen to.
Yes, the CEO's of these AI companies are clearly not the people who should be selling AI products. They need to be hidden away and kept behind closed doors where they can do their best work. And they need advertising companies, PR firms and better marketing tactics to try and soothe the customers.
I like the idea of helping people out of poverty. But the problem with government funded charities is they are so ripe for fraud, they almost never get managed properly.
Most people in a tech business can easily identify a whale hunt. That is, a business where a small number of customers provide such a disproportionate share of revenue that everyone else doesn't matter. But for some reason they fail to see government spending fraud is in fact a whale hunt.
I say we first ensure that fraudsters be not placed in government positions, and then worry more about eradicating the lesser fraud in charities that receive some funding from the government.
OP's comment is just hilarious on too many levels. "I like X, but...", so let's not even try. No evidence for the claim, either. And all while failing to see the bigger picture.
This is heartbreaking in a way, to see what's become of it. Windows was my childhood playground. I can't help but feel some kind of attachment and a desire to save it.
Nadella more than 10x'ed the value of Microsoft. I doubt many MS execs think it was the wrong call to move Windows work to the B team.
EDIT: somehow people seem to think I'm defending MS here. I'm not, I'm concurring that MS willingly turned Windows to shit (by moving it to the B team) because they thought they could earn more money elsewhere (and they were right). I don't like it, but I bet the people who got filthy rich over it do.
That mindset is why every tech product is turning to shit. They're not consumer focused. All Nadella cares about is making the stock price go up and extracting value.
At some point we went through the looking glass where the stock is the product.
Is this a new phenomenon? Stocks aren't new. Why is the modern market treated like this? Did Henry Ford make his vehicles shittier to increase his stock value?
Companies with insufficient competition have always treated customers badly. Antitrust enforcement has weakened since the 1970s, and investors demand short-term gains.
It's the private equity era. Much like how legislative behaviour is now dictated by the wealthy even to the point of contradicting the will/desire of informed voters, corporate behaviour is now dictated by private equity investment to the point of contradicting the demand from informed user/consumers.
They're fucking up even gaming, that awful gamebar is a pain to disable. Had to do it from powershell and even after it's gone Alt + W won't work in games.
I have a de-bloated win11 build running on my gaming rig, and I still occasionally get the prompt "no program to open link: ms-gamebar://" or something similar
Sorry you’re getting downvoted. Ideally downvoting would be for unconstructive posts, or posts with good info / good contributions that are presented unconstructively.
You’re just being controversial.
That’s not a strong enough reason to downvote someone.
Thanks for pointing this out. I don't participate in HN discussions like I used to because the HN crowd and I don't agree on much, and down-votes is not an engaging counter-point.
Fwiw I think it's perfectly fine if people downvote me because they disagree with me. I think that's an unavoidable effect of having up/down arrows, regardless of what the rules say. If I say something controversial I expect some downvotes. I just hadn't expressed myself clearly enough initially; everybody took me as a "money makes right" capitalist (not a weird assumption, there's plenty of those here on HN) and fortunately I could still edit to clarify.
Have you actually had it do anything substantial and then tried to work with the code it produces afterwards? It may "work" but it's a horrific mess. Good luck bug fixing that.
Yes, with GPT 5.2 Codex and 5.2 Pro specifically. It’s not a mess because of the context I provide and the guidance and reattempts I apply. It’s working great, the resulting code is good when I accept it, and I’m getting much more done than in the before times.
I'm working productively with it every day (Claude Opus). If you provide enough context, the code usually works veeeery well on the first shot.
(For sure, I do not tell it "build me a facebook"; instead I'm telling it precisely which single function I want to have, how it should work, and what the output should be.)
Chunk it into single parts/components/building blocks. Then give each of these a dedicated project/workspace. In each workspace, include only the source code relevant to that module, and work in each project/workspace separately.