this post was submitted on 04 Apr 2025
840 points (97.6% liked)
Technology
I do as a software engineer. The fad will collapse. Software engineering hiring will increase, but the pipeline of new engineers is drying up because no one wants to enter the career with companies hanging AI over everyone's heads. Basic supply and demand says my skillset will become more valuable.
Someone will need to clean up the AI slop. I've already had similar positions where I was brought in to clean up code bases that failed after being outsourced.
AI is simply the next iteration. The problem is always the same: the business doesn't know what it really wants and needs, and has no ability to assess what has been delivered.
A completely random story, but: I'm on the AI team at my company. However, I do infrastructure/application work rather than the AI stuff. First off, I had to convince my company to move our data scientist to this team. They had him doing DevOps work (complete mismanagement of resources). Also, the work I was doing was SO unsatisfying with AI. We weren't tweaking any models. We were just shoving shit to ChatGPT. Now, it would be interesting if you're doing RAG stuff, maybe, or other things. However, I was "crafting" my prompt and I could not give a shit less about writing a perfect prompt. I'm typically used to coding what I want, but instead I had to figure out how to phrase it properly: "please don't format it like X". And I wasn't using AI to write code; it was a service endpoint.
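For anyone who hasn't done this kind of work, here's roughly what that "prompt crafting" looks like (a simplified sketch, all names invented, no real API calls): instead of calling a function with typed parameters, you concatenate English instructions and hope the model complies.

```python
def build_prompt(user_request: str, banned_formats: list[str]) -> str:
    """Assemble a prompt that asks the model to avoid certain output formats.

    There is no guarantee the model honors these constraints -- that's the
    frustration: the "API contract" is plain English, not code.
    """
    constraints = "\n".join(
        f"- Please do not format the answer as {fmt}." for fmt in banned_formats
    )
    return (
        "You are a service backend. Answer the request below.\n"
        f"Constraints:\n{constraints}\n\n"
        f"Request: {user_request}"
    )


# The assembled string would then be sent to a chat-completion endpoint.
prompt = build_prompt("Summarize the ticket", ["a bulleted list", "a table"])
print(prompt)
```

That string is the whole "program": tweak the wording, re-run, and eyeball whether the output got better.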
During lunch with the AI team, they keep saying things like "we only have 10 years left at most". I was like, "but if you have AI spit out this code, if something goes wrong ... don't you need us to look into it?" they were like, "yeah but what if it can tell you exactly what the code is doing". I'm like, "but who's going to understand what it's saying ...?" "no, it can explain the type of problem to anyone".
I said, "I feel like I'm talking to a libertarian right now. Every response is some solution that doesn't exist."
AI can look at a bajillion examples of code and spit out its own derivative impersonation of that code.
AI isn't good at doing a lot of other things software engineers actually do. It isn't very good at attending meetings, gathering requirements, managing projects, writing documentation for highly-industry-specific products and features that have never existed before, working user tickets, etc.
I too am a developer and I am sure you will agree that while the overall intelligence of models continues to rise, without a concerted focus on enhancing logic, the promise of AGI likely will remain elusive. AI cannot really develop without the logic being dramatically improved, yet logic is rather stagnant even in the latest reasoning models when it comes to coding at least.
I would argue that if we had much better logic with all other metrics being the same, we would have AGI now and developer jobs would be at risk. Given the lack of discussion about the logic gaps, I do not foresee AGI arriving anytime soon, even with bigger models coming.
If we had AGI, the number of jobs that would be at risk would be enormous. But these LLMs aren't it.
They are language models and until someone can replace that second L with Logic, no amount of layering is going to get us there.
Those layers are basically all the previous AI techniques laid over the top of an LLM, but anyone who has a basic understanding of languages can tell you how illogical they are.