> You pay junior devs way way way more money for the privilege of them being bad.
I hope you don't think that what you're paying for an LLM today is what it actually costs to run the LLM. You're paying a small fraction.
So much investment money is being pumped into AI that it's going to make the 2000 dot-com bubble burst look tiny in comparison, if LLMs don't start actually returning on the massive investments. People are waking up to the realities of what an LLM can and can't do, and it's turning out to not be the genie in the bottle that a lot of hype was suggesting. Same as crypto.
The tech world needs a hype machine, and "AI" is the current darling. Movie streaming was once in the spotlight too. "AI" will get old pretty soon if it can't stop "hallucinating". Trust me, I would know if a junior dev is hallucinating, and if they actually are, I can choose another one who won't and who will actually become a great software developer. I have no such hope for LLMs based on my experiences with them so far.
> I hope you don't think that what you're paying for an LLM today is what it actually costs to run the LLM. You're paying a small fraction.
Depends, right? Claude Code on a Max plan is obviously unsustainable if the API costs are any indication; people can burn through the subscription price in API credits in a day or less.
But otherwise? I don't feel like API pricing is that unrealistic. Compute is cheap, and LLMs aren't as energy-intensive in inference as some would have you believe (especially when they conveniently mix up training and inference). And LLMs beat juniors at API prices already.
E.g. a month ago, a few hours of playing with Gemini or Claude 3.5 / 3.7 Sonnet had me at maybe $5 for a completed little MVP of an embedded side project; it would've taken me days to do it myself, even more if I'd hired some random fresh grad as a junior, and $5 wouldn't fund even an hour of their work. API costs would have to be underpriced by at least two orders of magnitude for juniors to compete.
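For what it's worth, that last claim checks out on a rough back-of-envelope. The numbers below are assumptions for illustration, not anyone's actual payroll:

    // Rough sketch; juniorHourly and hoursForMvp are assumed, not measured.
    const juniorHourly = 50;  // $/hour fully loaded, assumption
    const hoursForMvp = 24;   // ~3 working days, assumption
    const juniorCost = juniorHourly * hoursForMvp; // $1,200
    const llmCost = 5;        // from the anecdote above
    console.log(juniorCost / llmCost); // 240, i.e. over two orders of magnitude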
Yeah, all fair, but I think there's enough capital to keep the gravy train rolling until the cost-per-performance actually gets way, way, way below human junior engineers.
A lot of the application layer will disappear when it fails to show ROI, but the foundation models will continue to have obscene amounts of money dumped into them, and the coding use case will come along with that.
Even though I think most people know this deep down, I still don't think we actively realize how optimized LLMs are towards sounding good. It's the ultra-processed-food version of information consumption. People are super lazy (economical, if you like), and RLHF et al. have optimized LLM output to be easy to digest.
The consequence is you get a bunch of output that looks really good as long as you don't think about it (and they actively promote not thinking about it), that you don't really understand, and that, if you did dig into it, you'd realize is empty fluff or actively wrong.
It's worse than not learning; it's actively generating unthinking but palatable garbage that's the opposite of learning.
Yeah, you have to be really careful about how you use LLMs. I've been finding it very useful to use them as teachers, or to use them the same way I'd use a coworker. "What's the idiomatic way to write this Python comprehension in JavaScript?" Or, "Hey, do you remember what you call it when..." And when I request these things, I'll try to ask in the most generic way possible, so that I then have to retype the relevant code, filling in the blanks with my own values.
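To make that first question concrete, here's a hypothetical sketch of the kind of translation I mean (the names are made up; the Python is in a comment, with the idiomatic JavaScript/TypeScript below it):

    // Python: squares = [x * x for x in nums if x % 2 == 0]
    // The idiomatic JS/TS equivalent chains filter and map:
    const nums = [1, 2, 3, 4, 5];
    const squares = nums.filter((x) => x % 2 === 0).map((x) => x * x);
    console.log(squares); // [4, 16]

Then I retype it by hand with my own names and values, which is the part that actually makes it stick.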
That's just one use though. The other is treating it like it's a jr developer, which requires its own shift in thinking. Practice in writing detailed specs goes a long way here.
Says who? While “grinding” is one way to learn something, asking AI for a detailed explanation and actually consuming that knowledge with the intent to learn (rather than just copying and pasting) is another way.
Yes, you should be on guard since a lot of what it says can be false, but it’s still a great tool to help you learn something. It doesn’t completely replace technical blogs, books, and hard earned experience, but let’s not pretend that LLMs, when used appropriately, don’t provide an educational benefit.
Pretty much all education research ever points to the act of actually applying knowledge, especially against varied cases, as being required to learn something.
There is no learning by consumption (unfortunately, given how we mostly attempt to "educate" our youth).
I didn't say they don't or can't provide an educational benefit.
Some of the best software learning I ever had when I was starting out was following along with video courses and writing the code line by line along with the instructor... or does this not count as "consumption"?
> I was... following along and writing the code line by line
That's application. Then presumably you started deviating a little bit from exactly what the instructor was doing. Then you deviated more and more.
If you had the instructor just writing the code for every new deviation you wanted to build and you just had to mash the "Accept Edit" button, you would not have learned very effectively.
Maybe it's the senior devs who should be the ones to worry?
Seniors on HN are often quick to dismiss AI-assisted coding as something that can't replace the hard-earned experience and skill they've built up during their careers. Well, maybe, maybe not. Senior devs can get a bit myopic in their specializations, whereas a junior dev doesn't have so much baggage; maybe the fertile brains of youth are better in times of rapid disruption, where extreme flexibility of thought is the killer skill.
Or maybe the whole senior/junior thing is a red herring and pure coding and tech skills are being deflated all across the board. Perhaps what is needed now is an entirely new skill set that we're only just starting to grasp.
> Seniors on HN are often quick to dismiss AI-assisted coding as something that can't replace the hard-earned experience and skill they've built up during their careers.
One definition of experience[0] is:
direct observation of or participation in events as a basis of knowledge
Since I assume by "AI-assisted coding" you are referring to LLM-based offerings, then yes, "hard-earned experience and skill" cannot be replaced with a statistical text generator.
One might as well assert an MS-Word document template can produce a novel Shakespearean play or that a spreadsheet is an IRS auditor.
> Or maybe the whole senior/junior thing is a red herring and pure coding and tech skills are being deflated all across the board. Perhaps what is needed now is an entirely new skill set that we're only just starting to grasp.
For a repudiation of this hypothesis, see this post[1] also currently on HN.
> Maybe it's the senior devs who should be the ones to worry?
Why would they be worried?
Who else is going to maintain the massive piles of badly designed vibe code being churned out at an increasingly alarming pace? The juniors prompting it certainly don't know what any of it does, and the AIs themselves have proven time and again to be incapable of performing basic maintenance on codebases above a very basic level of complexity.
As the ladder gets pulled up on new juniors, and the "fertile brains" of the few who do get a chance are wasted because they're actively encouraged not to learn anything and to just let a computer algorithm do the thinking for them (ensuring they will never have a chance to become seniors themselves), who else will be left to fix the mess?
If your seniors aren't analyzing the PRs being vibe-coded by others in the org to make sure they meet quality standards, that is the source of your problem, not the vibe coding.
Wherever you look, the conclusion is the same: balance is required. Too many seniors and you get stuck in one-way streets. Too many juniors and you trip over your own feet and diverge into unknown avenues.
Mix AI in and I don't see how that changes much at all... Juniors drive into unknown territory faster, seniors get stuck in their niches just the same. Acceleration, yes; a fundamental change in how we work, I don't see it yet.
Senior devs provide better instructions to the agent, and can recognize more kinds of mistakes and can recognize mistakes more quickly. The feedback loop is more useful to someone with more experience.
I had a feeling today that I should really be managing multiple instances at once, because they’re currently so slow that there’s some “downtime”.
See, if the promise were real (LLMs are great skill multipliers!), then we'd be in a new renaissance of one-developer businesses popping up left and right every day. Ain't nobody got time for corporate coercion hierarchy nonsense.
> I really would not want to be a junior dev right now... Very unfair and undesirable situation they've landed in.
I don't really get this. At the beginning of my career I masqueraded as a senior dev with experience as fast as I could, until it was laundered into actual experience.
Form the LLC, and that's your prior professional experience: working for it.
I felt I needed to do that, and that was way before generative AI, like at least a decade earlier.
> You pay junior devs way way way more money for the privilege of them being bad.
Oh, it's worse than that. You do that, and they complain that they are underpaid and should earn much, much more. They also think they are great, it's just you, the old-timer, that "doesn't get it". You invest lots of time to work with them, train them, and teach them how to work with your codebase.
And then they quit because the company next door offered them slightly more money and the job was easier, too.
> I think it would be great to be a junior dev now and be able to learn quickly with llms.
I'm not so sure; I get great results (learning) with them because I can nitpick what they give me and attempt to explain how I understand it, and I pretty much always preface my prompts with "be critical and show me where I am wrong".
I've seen a junior use it to "learn", which was basically "How do I do $FOO in $LANGUAGE".
For that junior to turn into a senior who prompts the way I do, they need a critical view of their questions, not just answers.
I have experienced multiple instances of junior devs using llm outputs without any understanding.
When I look at the PR, it is immediately obvious.
I use these tools every day to help me accelerate. But I know the limitations and can look at the output and throw certain junk away.
I feel junior devs are using it not to learn but to try to just complete shit faster, which doesn't actually happen because their prompts suck and their understanding of the results is bad.
The vilification of juniors and the abandonment of the idea that teaching and mentoring are worthwhile are single-handedly making me speedrun burnout. May a hundred years of Microsoft Visio befall anybody who thinks that way.
Unless you're running a police-state environment where every minute of company time is tracked, enough opportunities for it to happen organically exist that it's not a matter of how you organize it; it's a matter of culture. Give them as much responsibility as they can handle and they'll be the ones reaching out to you.
A constant reminder: you can't have wizards without having noobs.
Every wizard was once a noob. No one is born that way; they were forged. It's in everybody's interest to train them. If they leave, you still benefit from the juniors other companies trained, so the costs even out. Though if they leave, there are probably better ways to make them stay that you haven't considered (e.g. have you considered not paying new juniors more than your current junior who has been with the company for a few years? They should be able to get a pay bump without leaving).
I'm sure people (esp. engineers) know this. But imagine you're starting a company: would you try to deploy N agents (even if shitty), or take a financial/time/legal/social risk with a new hire? When you consider short-term costs, the math just never works out in favor of real humans.
Every single time I post my comment I get this response...
1) There is no universal rule for anything. It doesn't have to apply to every single case. No one is saying a startup needs to hire juniors. No one is saying you have to hire only juniors. We haven't even talked about the distribution tbh. That's very open to interpretation because it is implicit that you will have to modify that based on your context.
2) Lots of big companies still act like they're startups. You're right that, short term, "the math" doesn't work out. But it does in the medium and long term. So basically, as long as you aren't at the bootstrapping stage of a startup, you want to start considering this. Different distributions for different stages, of course.
But you shouldn't sacrifice long term rewards for short term ones. You are giving up larger rewards...
What about the financial / legal / social risk of your AI agent doing something bad? You're only looking at cost savings, without seeing the potentially major downsides.
To follow up on my previous comment: I worked on a project where someone fixed an old bug. This bug had become a feature for clients who built their systems around that API endpoint. The consequence was hundreds of thousands of duplicate users, with automations randomly attaching new resources and actions to the duplicates. Massive consequences for the customers. If it had been an AI doing the fixing with no human intervention, good luck understanding the mess, cleaning it up, and holding anyone accountable.
People seem to lightly assume that if the agent does something bad, it's just a risk to take. But when a codebase with massive amounts of LOC and logic is built and no human knows it, how do you deal with the consequences for people's businesses? I can't help but think it's crappy software with a "Google closed your Gmail account, no one knows why and we can't do anything about it, sorry" attitude. But instead of a mail account, it's part of your business.
I can't stop thinking that this way of thinking is either plain wrong and completely misses what software development is really about, or very true, and in X years people will just ask the trending AI "I need a billing/CRM/X system with these constraints". Then the AI will ask questions and refine the need, work for 30 minutes (the time to pull in libs and code the whole thing), pass it through systems that test and deploy, and voila: a custom feature on demand. No CEO, no sales, nobody. You just deploy your own SaaS feature.
Then good luck scaling properly, migrating data, and adding features and complexity. If agents hold to their promise, then the future is custom-built: you deploy what you need, the SaaS platform is dead, and everyone in between is useless.
I think too many see it more as "every stem cell has the potential to be any [something]", but it's generally better to let them self-differentiate until survivors with more potential emerge.
Be careful there... There are destructive steady-state solutions. For example, all your cells can become cancerous. Stem cells are shaped by their environments, just like people. Don't just approach things with a laissez-faire attitude. Flexibility is good, and an overly heavy hand is bad, but that doesn't mean a subtle hand is bad.
I spent a lot of time in my career, honestly some of the most impactful stuff I've done, mentoring college students and junior developers. I think you are dead on about the skills being very similar. Being verbose, not making assumptions about existing context, and giving generalized warnings against the pitfalls of the sort of thing you're asking it to do go a long, long way.
Just make sure you talk to Claude in addition to the humans and not instead of.
Damn, that sucks. My experience has been the exact opposite; maybe you need to adjust your approach and set expectations up-front, or get management involved? (I've had a similar experience to you with my teenage kids, but that's a whole other situation.)
My M.S. advisor gave me this advice on when I should ask for help, which I've passed on to lots of junior engineers: it's good to spend time struggling to understand something, and depending on the project, it's probably good to exert yourself on your own for somewhere between an hour and a day. If you give up after 5 minutes, you won't learn, but if you spend a week with no progress, that's also not good.