> I'm hearing from senior devs all over, though, that junior developers are just garbage at it. They produce slow, insecure, or just outright awful code with it, and then they PR code they don't even understand.
If this is the case then we'd better have fully AI-generated code within the next 10 years, since those "juniors" will remain atrophied juniors forever and the old timers will be checking in with the big clock in the sky. If we, as a field, believe that this cannot possibly happen, then we are making a huge mistake leaning on a tool that requires "deep [orthogonal] experience" to operate properly.
You can't atrophy if you never grew in the first place. The juniors will be stunted. It's the seniors who will become atrophied.
As for whether it's a mistake, isn't that just the way of things these days? The current world is about extracting as much as you can while you're still here. Look around. Nobody is building for the future. There are a few niche groups that talk about it, but nobody is really doing it. It's just take, take, take.
This just seems like more of the same, but sped up. We started by extracting fossil fuels deposited over millions of years, then extracting resources and technology from civilisations deposited over millennia, then from the Victorians deposited only a century or two ago, and now it's software deposited over mere decades. Someone is going to be left holding the bag; we just hope it's not us. Meanwhile most of the population aren't even thinking about it, and most of the fraction that do are dreaming that technology will save us before it's payback time.
IT education and computer science (at least part of it) will need a stronger focus on software engineering and software architecture skills to teach developers how to be in control of an AI dev tool.
The fastest way is via struggle. Learn to do it yourself first. Understand WHY it does not work. What's good code? What's bad code? What are conventions?
There are no shortcuts - you are not an accountant just because you have a calculator.
With that mindset you don't have to go to school, you could learn everything through struggle... Ideally it's a bit of both, you need theory and experience to succeed.
Brains are not computers, and we don't learn by being given abstract rules. We also don't learn nearly as well from classroom teaching as we do from doing things IRL for a real purpose - the brain always knows the difference, and knows that the (real, not artificially created) stakes are low in a teaching environment.
That's also the huge difference between AI and brains: AI does not work on the real world but on our communication (and even that is limited to text, missing all the nuance face-to-face communication includes). The brain works on sensor data from the real world. The communication method, language, is a very limited add-on on top of how the brain really works. We don't think in language; even abstract, language-based thinking, e.g. formal math, requires a lot of concentration and effort and still relies on a lot of "under the hood" intuition.
That is why, even after years of learning the same curriculum, we still need to make a significant effort with every single concrete example to "get everyone on the same page", creating compatible internal models under the hood. Everybody's internal model of even simple things is slightly different, depending on what brain they brought to the learning and what exactly they learned - even things like social classroom interactions went into how the connections were formed. Only on the back of that huge effort can we use language to communicate in the abstract, and even then, once we leave the central corridor of ideas, people will argue forever about definitions. Even when the written text is the same, the internal model is different for every person.
As someone who took neuroscience, I found this surprisingly well written:
"The brain doesn't like to abstract unless you make it"
> This resource, prepared by members of the University of London Centre for Educational Neuroscience (CEN), gives a brief overview of how the brain works for a general audience. It is based on the most recent research. It aims to give a gist of the brain’s principles of function, covering the brain’s evolutionary origin, how it develops, and how it copes in the modern world.
The best way to learn is to do things IRL that matter. School is a compromise and not really all that great. People motivated by actual need often can learn things that take years in school with middling results significantly faster and with better and deeper results.
Yeah. The only, and I mean only, non-social/networking advantage of universities stems from forced learning and reasoning about complex theoretical concepts - the requisite base knowledge for learning the practical requirements of your field on the job.
Trade schools and certificate programs are designed to churn out people with journeyman-level skills in some field. They repeatedly drill you on the practical day-in-day-out requirements, tasks, troubleshooting tools and techniques, etc. that you need to walk up to a job site and be useful. The fields generally have a predictable enough set of technical problems that a deep theoretical exploration is unnecessary. This is just as true for electricians and auto mechanics as it is for people doing limited but logistically complex technical work, like orchestrating a big fleet of Windows workstations with all the Microsoft enterprise tools.
In software development and lots of other fields that require grappling with complex theoretical stuff, you really need both the practical and the theoretical background to be productive. That would be a ridiculous undertaking for a school, and it’s why we have internships/externships/jr positions.
Between these tools letting the seniors in a department do all of the work (so companies don't have to invest in interns/juniors, and there's no reliable entry point into the field) and an even bigger disconnect between what schools offer and the skills graduates need to compete, the industry has some rough days ahead, and a whole lot of people trying to get a foothold right now are screwed. I'm kind of surprised how little so many people in tech seem to care about the impending rough road for entry-level folks. I guess it's a combination of how little most higher-level developers have to interact with them, and the fact that everybody was tripping over themselves to hire developers when a lot of today's seniors joined the industry.
It's not a particularly moral way to think, but if you're currently mid level or senior, the junior dev pipeline being cut off will be beneficial to you personally in a few years' time.
Potentially very beneficial, if it turns out software engineers are still needed but nobody has been training them for half a decade.
Clearly it harms those who get to keep their jobs less, to some extent (though when you've got a glut of talent and few jobs, the only winners are employers, because salaries eventually tank). But frankly, that pervasive, intense greed and self-absorption used to be anathema to the American software industry. Now it looks a lot more like a bunch of private equity bros than a bunch of people who stood to make good money selling creative solutions to the world's problems. Even worse, the developers who built this business still think they're part of the in-club, too special and talented to get tossed out like a bag of moldy peaches. They're wrong, and it's sad to watch.
And that is the best thing about AI, it allows you to do and try so much more in the limited time you have. If you have an idea, build it with AI, test it, see where it breaks. AI is going to be a big boost for education, because it allows for so much more experimentation and hands-on.
By using AI, you learn how to use AI, not necessarily how to build architecturally sound and maintainable software, so being able to do much more in a limited amount of time will not necessarily make you a more knowledgeable programmer, or at least that knowledge will most likely only be surface-level pattern recognition. It still needs to be combined with hands-on building your own thing, to truly understand the nuts and bolts of such projects.
If you end up with a working project where you understand all the moving parts, I think AI is great for learning - and the ultimate proof of whether the learning was successful is whether you can actually build (and ship) things.
Human teachers are good to have as well, but I remember they were of limited use to me when I was learning programming without AI. So many concepts they tried to teach me without having understood them themselves first. AI would likely have given me better answers than "because that is how you do it" when I asked why something should be done a certain way.
Obviously I would have preferred competent teachers back then, and would now prefer competent teachers with unlimited time over faulty AIs for students. But in reality human time is limited and humans are flawed as well. So I don't share the doomsday expectations for the new generation of programmers. The ultimate goal, building something that works to spec, did not change, and horrible unmaintainable code was shipped 20 years ago too.
I don't agree. To me, switching from hand-coded source to AI-coded source is like going from a hand saw to an electric saw for your woodworking projects. In the end you still have to know woodworking, but you experiment much more, so you learn more.
Or maybe it's more like going from analog photography to digital photography. Whatever it is, you get more programming done.
Just like when you go from assembly to C to a memory-managed language like Java. I did some 6502 and 68000 assembly over 35 years ago; now nobody knows assembly.
Key words there: "to you". To you it's an electric saw because you already know how to program, and that's the other person's point; it doesn't necessarily empower people to build software. You? Yes. Generally, though, when you hand the public an electric saw and say "have at it, build stuff," you end up with a lot of lost appendages.
Sadly, in this case the "lost appendages" are going to be man-decades of time spent undoing all the landmines vibecoders are going to plant around the digital commons. Which means AI even fails as a metaphorical "electric saw", because a good electric saw should strike fear into the user by promising mortal damage through misuse. AI has no such misuse deterrent, so people will freely misuse it until consequences swing back wildly, and the blast radius is community-scale.
> more like going from analog photography to digital photography. Whatever it is, you get more programming done.
By volume, the primary outcome of digital photography has been a deluge of pointless photographs to the extent we've had to invent new words to categorize them. "selfies". "sexts". "foodstagramming". Sure, AI will increase the actual programming being done, the same way digital photography gave us more photography art. But much more than that, AI will bring the equivalent of "foodstagramming" but for programs. Kind of like how the Apple App Store brought us some good apps, but at the same time 9 bajillion travel guides and flashlight apps. When you lower the bar you also open the flood gates.
Being able to do it quicker and cheaper will often ensure more people will learn the basics. Electrical tools open up woodworking to more people, same with digital photography, more people take the effort to learn the basics. There will also be many more people making rubbish, but is that really a problem?
With AI it’s cheap and fast for a professional to ask the AI: what does this rubbish software do, and can you create me a more robust version following these guidelines?
> With AI it’s cheap and fast for a professional to ask the AI: what does this rubbish software do, and can you create me a more robust version following these guidelines?
This falls apart today with sufficiently complex software and also seems to require source availability (or perfect specifications).
One of the things I keep an eye out for in terms of "have LLMs actually cracked large-product complexity yet" (vs human-overseen patches or greenfield demos) is exactly that sort of re-implementation-and-improvement you talk about. Like a greenfield Photoshop substitute.
Your last point is also something that happened when the big game engines such as Unity became free to use. All of a sudden, Steam Greenlight was getting flooded with gems such as "potato peeling simulator" et al. I suppose it is just a natural side effect of making things more accessible.
> Sadly, in this case the "lost appendages" are going to be man-decades of time spent undoing all the landmines vibecoders are going to plant around the digital commons.
Aren't you being overly optimistic that these would even get traction?
Pessimistic, but yeah. It's just that my whole life has been a string of the absolute worst ideas being implemented at scale, so I don't see why this would buck the trend.
> By using AI, you learn how to use AI, not necessarily how to build architecturally sound and maintainable software
> will not necessarily make you a more knowledgeable programmer
I think we'd better start separating "building software" from programming, because the act of programming is going to continue to get less and less valuable.
I would argue that programming was very overvalued for a while even before AI, and that the industry believes its own hype, with a healthy dose of elitism mixed in.
But now AI is removing the facade and showing that the idea and the architecture are actually the important part, not the coding of it.
Ok. But most developers aren't building AI tech. Instead, they're coding a SPA or CRUD app or something else that's been done 10000 times before, but just doing it slightly differently. That's exactly why LLMs are so good at this kind of (programming) work.
I would say most people are dealing with tickets, and meetings about the tickets, more than they are actually spending time in their editor. The work may be similar to what's been done before, but that 1 percent difference needs to be nailed down right, because that's where the business lifeline lies.
Unfortunately, education everywhere is being badly hurt by access to AI, both by students who are enabled not to do their homework, and by teacher review/feedback being replaced by chatbots.
In Germany, software engineering is a trade: you go to trade school for three years while working at a company in parallel. I don't think IT education and computer science at universities should have a stronger focus on SE, as universities are basically a trade school for becoming a researcher.
Yes, but it is more of a cultural thing than anything else. Studying computer science to be a software developer* is like studying mechanical engineering to be a machine operator.
* except if you are developing complicated algorithms or do numeric stuff. However, I believe that the majority of developers will never be in such a situation.
A software degree or a CS degree with a more applied focus will teach you way better than the trade schools will. It'd be nice if that weren't the case, but from all I've seen it is.
So you end up in that weird spot where it would work very well for someone with a strong focus on self-learning and a company investing in their side of the training, but at that point you could almost skip the formal part completely and just start directly, assuming you have some self-taught base. Or work part-time while studying on the side, and get the more useful degree that way. Plenty of places will hire promising first-year uni students.