Posted to Technology on 03 Jun 2025 · 58 points (86.2% liked)
This seems like a perfectly reasonable experiment and not something they’re going to release without extensive human and security review.
OAuth libraries aren’t new, and AI can probably generate adequate code. My main problem with AI for this purpose is that senior developers and experts don’t pop out of thin air. You need junior developers now if you want any real experts in the future. Maybe you need fewer of them and more specialized training. Maybe the goal is to offload the training cost to universities, and tech companies will only want PhDs. Maybe someday LLMs will be good enough not to need much supervision. But that’s not where we are.
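For a sense of what "adequate code" means here, this is a minimal sketch of the kind of thing an OAuth library has to do: a client-credentials token request. The endpoint URL and credentials are placeholders, not any particular provider's API.

```python
# Minimal sketch of an OAuth 2.0 client-credentials token request (RFC 6749, 4.4).
# The URL and credentials used below are placeholders for illustration only.
import requests


def fetch_access_token(token_url: str, client_id: str, client_secret: str) -> str:
    """Request an access token using the client-credentials grant."""
    resp = requests.post(
        token_url,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


# Example usage with placeholder values:
# token = fetch_access_token("https://auth.example.com/oauth/token", "my-id", "my-secret")
```

An LLM can produce something like this easily; the hard part the thread is debating is the review, threat modeling, and long-term maintenance around it.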
We probably need a Level-X capability scale for this sort of thing, like the levels used for self-driving cars.
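Purely as a sketch of what such a scale might look like (the level names and descriptions are invented here, loosely modeled on the SAE driving-automation levels):

```python
# Hypothetical autonomy scale for AI-assisted coding, analogous to SAE J3016.
# These levels and their descriptions are invented for illustration only.
from enum import IntEnum


class CodingAutonomyLevel(IntEnum):
    L0_NO_ASSISTANCE = 0        # human writes everything
    L1_AUTOCOMPLETE = 1         # token/line-level suggestions only
    L2_SUPERVISED_PATCHES = 2   # AI drafts changes, human reviews every diff
    L3_REVIEWED_FEATURES = 3    # AI implements features, humans review design and security
    L4_SPOT_CHECKED = 4         # AI ships routine changes, humans audit samples
    L5_UNSUPERVISED = 5         # no routine human review (not where we are today)
```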
If you read the commentary on the process, you notice a heavy reliance on experts in the field to ensure the code is good and secure. Claude is great at pumping out code, but it can really get confused and, for example, forget or omit earlier work.
I think the notion of junior developers disappearing because of AI is false. These tools accelerate productivity; they don't replace human experience.
This is true, because AI is not the actual issue. The issue, as with most things, is humanity: our perception of and trust in AI. Regardless of logic, humans still make illogical decisions.
I think this take undervalues the AI. I think we self-select for high-quality code and high-quality engineers.
But many of us would absolutely gawk at something like Dieselgate. That is real code running in production on safety-critical machinery.
I'm basically convinced that Claude would have done better
Dieselgate wasn't a "bug"; it was a designed-in feature to circumvent emissions testing. Claude absolutely would have done the same, since it's exactly what the designers would have asked it for. Somehow I doubt it would have gone undetected as long if Claude had written it, though; it'd probably have messed it up some other way.
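To underline that it was deliberate logic rather than a defect, here is a purely schematic sketch of what a defeat device does: detect test-like conditions and switch behaviour. The condition names and thresholds are invented; this is not the actual ECU code.

```python
# Schematic illustration of a "defeat device": intentional logic that detects
# test-like conditions and changes behaviour. All names and thresholds invented.
def looks_like_dyno_test(steering_angle_deg: float,
                         non_driven_wheel_speed_kmh: float,
                         driven_wheel_speed_kmh: float) -> bool:
    # On a dynamometer, the driven wheels spin while the car neither steers nor moves.
    return (steering_angle_deg == 0.0
            and non_driven_wheel_speed_kmh < 1.0
            and driven_wheel_speed_kmh > 20.0)


def select_emissions_mode(steering_angle_deg, non_driven_wheel_speed_kmh,
                          driven_wheel_speed_kmh) -> str:
    if looks_like_dyno_test(steering_angle_deg, non_driven_wheel_speed_kmh,
                            driven_wheel_speed_kmh):
        return "full_emissions_control"   # clean mode only while being tested
    return "performance_mode"             # higher emissions on the road
```

The point is that nothing here is accidental; an LLM asked to write it would write it, and the failure mode would be the asking, not the tool.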
You should look into how Dieselgate worked
I don't think you understand my take
I guess that makes it a bad analogy
Doctors face a similar obstacle before they can practice: medical school and residency. They literally have to jump from zero to hero before the first real paycheck.
Things may evolve the same way for senior software developers, with a high dropout rate.
I hear you, and there’s merit to the concerns. My counter is
I don’t know anything about you, obviously, but I suspect you should take a more nuanced, historical view of the Luddites. Writing someone off as a “Luddite” probably isn’t the burn you think it is.
I’m all for technological progress. Who isn’t? It’s the politics and ownership that cause issues.
I apologize back. I didn’t mean to offend. You never know who you’re talking to on a message board, and in rereading it, my comment could easily have been taken as hostile. It’s hard to get nuance across in this medium.