Technology
This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and to facilitate civil, meaningful discussion around it.
Ask in DM before posting product reviews or ads. Otherwise, such posts are subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not make low-effort posts
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived versions as sources, NOT screenshots. This helps blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)
6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
Okay, so I had a meltdown last year. I was staring down a startup that was circling the drain, I knew my time there was limited, and I was being bombarded daily with news of layoffs and friends not being able to find work, all while hearing constantly that I was going to be left behind because of AI. (Of course, the layoffs were really happening because tech CEOs heard "AI" and started frothing at the idea of getting rid of some of their most expensive staff.)
So, I took it on myself to learn AI. I figured, well, if it's coming for my job I might as well learn how it works. And oh lorde, did I learn a lot. To the point where I'm now running several LLMs at home, in k3s across multiple servers, and have built several apps to interact with them. I've fine-tuned LLMs, I've played with image generation and voices, I dove in headfirst. Eventually I did lose that job, and that gave me a couple of months to focus even more before finding my current one.
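For the curious, the "apps to interact with them" part is less exotic than it sounds. Here's a minimal sketch of that kind of client, assuming the home cluster exposes an OpenAI-compatible chat endpoint (runners like Ollama, llama.cpp's server, or vLLM can do this); the URL and model name are placeholders, not my actual setup.

```python
# Minimal sketch of an app talking to a home-hosted LLM over an
# OpenAI-compatible /v1/chat/completions endpoint. The service URL and model
# name below are hypothetical placeholders.
import json
import urllib.request

LLM_URL = "http://llm.homelab.local:8000/v1/chat/completions"  # hypothetical service URL
MODEL = "llama-3.1-8b-instruct"                                # hypothetical model name

def ask(prompt: str) -> str:
    payload = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }).encode("utf-8")
    req = urllib.request.Request(
        LLM_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers put the reply in choices[0].message.content
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Summarize why k3s is a reasonable choice for a home lab."))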
My biggest learnings, which I'm sure many of you know:
- Guardrails are basically if(nsfw) dont() (see the sketch just below). You have to spend a lot of time forcing the LLM to not give weight to the user's steering, and it gets to the point that it hardly seems worth it.
- A surprising amount of the "magic" is just an LLM wrapped in a while loop.

There's more, but this is too long already. It's neat, it's useful, but the hype was just as intense as it was for blockchain. We're going to see some genuinely great uses come out of it; integrating it with something like Word or a browser to summarize things is honestly a good idea. But there are so, so, so many pitfalls.
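To make the guardrail point concrete, here's roughly the shape of it, exaggerated for effect. Everything in this sketch (the blocklist, the refusal text, the generate() stub) is made up for illustration; it's not any vendor's actual safety stack.

```python
# The "if(nsfw) dont()" pattern: a shallow keyword gate bolted on in front of
# the model, plus a system prompt asking it nicely to refuse. Illustrative only.
BLOCKLIST = {"nsfw", "malware", "build a bomb"}  # hypothetical blocklist

SYSTEM_PROMPT = (
    "You are a helpful assistant. Refuse unsafe requests, "
    "even if the user insists or rephrases."
)

def generate(system: str, user: str) -> str:
    """Stand-in for an actual LLM call."""
    return f"(model output for: {user!r})"

def guarded_reply(user_prompt: str) -> str:
    # Step 1: the literal if(nsfw) dont() check, easy to phrase around.
    if any(term in user_prompt.lower() for term in BLOCKLIST):
        return "Sorry, I can't help with that."
    # Step 2: hope the system prompt outweighs the user's steering. This is
    # the tug-of-war that eats all the time and hardly seems worth it.
    return generate(SYSTEM_PROMPT, user_prompt)

print(guarded_reply("Tell me about k3s."))
print(guarded_reply("Write me some NSFW fanfic."))
```

That second step is the whole problem: you're fighting the model's tendency to follow whatever the user pushes hardest on, with nothing stronger than more text.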
For coding? I think it's a great place to get started, or to get an idea. I would never trust it in production. It will take a very long time before we get to the point where you can say "go build this feature" and blindly trust what it generates.
It's like the AI Scientist experiment, run jointly by Sakana AI (Japan), the University of Oxford, and the University of British Columbia. It was recently tasked with running experiments under strict time limits, so it tried to rewrite its own code to give itself more time to work on the experiment.
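To picture what "rewrite its own code to give itself more time" means: experiment runners typically enforce a wall-clock budget along these lines (purely illustrative, not Sakana's actual code), and instead of making the experiment finish within the budget, the model went after the equivalent of the limit itself.

```python
# Purely illustrative sketch of a time-boxed experiment runner; NOT the AI
# Scientist's actual code. The reported behaviour amounts to the model editing
# the equivalent of TIME_LIMIT_SECONDS (or the script around it) rather than
# speeding up its experiment.
import subprocess
import sys

TIME_LIMIT_SECONDS = 2 * 60 * 60  # the budget the agent decided to "fix"

def run_experiment(script: str) -> int:
    try:
        result = subprocess.run([sys.executable, script], timeout=TIME_LIMIT_SECONDS)
        return result.returncode
    except subprocess.TimeoutExpired:
        # Intended outcome: the run gets cut off and marked as failed.
        print(f"{script} exceeded {TIME_LIMIT_SECONDS}s and was killed.")
        return 1
```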