Yeah, he's right. AI is mostly used by corps to enshittify their products just for extra profit.
Game devs are gonna have to use different language to describe what used to be simply called "enemy AI", where exactly zero machine learning is involved.
Logic and Path-finding?
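To make that concrete, here's a minimal sketch of the classic kind of "enemy AI" being talked about: a hand-written patrol/chase state machine plus breadth-first pathfinding on a grid. The grid, thresholds, and function names are all made up for illustration; the point is there's no machine learning anywhere in it.

```python
# Classic "enemy AI": a finite-state machine plus BFS pathfinding on a grid.
# No machine learning anywhere -- everything here is hand-authored rules.
from collections import deque

GRID = [
    "....#....",
    ".##.#.##.",
    "....#....",
    ".#......#",
    ".........",
]

def neighbors(pos):
    r, c = pos
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(GRID) and 0 <= nc < len(GRID[0]) and GRID[nr][nc] != "#":
            yield nr, nc

def bfs_path(start, goal):
    """Shortest walkable path from start to goal (inclusive), or [] if unreachable."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for nxt in neighbors(cur):
            if nxt not in came_from:
                came_from[nxt] = cur
                frontier.append(nxt)
    return []

def enemy_update(enemy_pos, player_pos, state):
    """One tick of a patrol/chase state machine driven only by distance checks."""
    dist = abs(enemy_pos[0] - player_pos[0]) + abs(enemy_pos[1] - player_pos[1])
    if state == "patrol" and dist <= 4:
        state = "chase"
    elif state == "chase" and dist > 6:
        state = "patrol"
    if state == "chase":
        path = bfs_path(enemy_pos, player_pos)
        if len(path) > 1:
            enemy_pos = path[1]  # step one tile toward the player
    return enemy_pos, state

if __name__ == "__main__":
    pos, state = (0, 0), "patrol"
    for _ in range(6):
        pos, state = enemy_update(pos, (2, 2), state)
        print(state, pos)
```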
AI is nothing more than a way for big businesses to automate more work and fire more people.
and it does that at the expense of 30+ years of power-reduction and efficiency gains, to the point that private companies are literally buying, building, or restarting old power plants just to cover the insane power demand, because operating your own power plant is cheaper than paying for that much electricity.
For the common everyday person it's 3D TV and every other bullshit fad that burned brilliantly for all of 3 seconds before snuffing itself out, leaving people having paid for overpriced garbage that's no longer useful.
> AI is nothing more than a way for big businesses to automate more work and fire more people.
All technology in human history has done that. What are you proposing? Reject technology to keep people employed on inefficient tasks?
At some point people need to start thinking that it's better to end capitalism than to return to monke.
In a way he’s right, but it depends! If you take even a common example like ChatGPT or the native object detection used in iPhone cameras, you’d see that there’s a lot of cool stuff already enabled by our current way of building these tools. The limitation right now, I think, is reacting to new information or scenarios a model isn’t trained on, which is where all the current systems break. Humans do well in new scenarios thanks to their cognitive flexibility, and at least I’m not aware of a good framework for instilling cognitive flexibility in machines.
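For the object-detection example, here's a rough sketch using a generic pretrained detector (torchvision's COCO-trained Faster R-CNN, assuming torchvision 0.13+; Apple's actual on-device pipeline isn't public, so this is purely a stand-in). It illustrates the limitation above: the model can only report labels from the fixed set of classes it was trained on, so genuinely new objects get misfiled into those classes or dropped.

```python
# A stand-in for "on-device object detection": a pretrained Faster R-CNN.
# The model can only emit labels from the fixed class list it was trained on;
# anything outside that set is forced into one of those classes or discarded.
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# "photo.jpg" is a placeholder path -- point it at any local image.
image = convert_image_dtype(read_image("photo.jpg"), torch.float)
with torch.no_grad():
    detections = model([image])[0]

# Keep confident detections; labels are indices into the fixed COCO class list.
for label, score in zip(detections["labels"], detections["scores"]):
    if score > 0.7:
        print(f"class index {int(label)}: confidence {score:.2f}")
```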
and that 10% isn't really real, just a gabbier Dr. Sbaitso
Idk man, my doctors seem pretty fucking impressed with AI's ability to make diagnoses by analyzing images like MRIs.
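For what it's worth, that kind of image-based diagnosis usually comes down to a classifier over scan slices. Here's a toy sketch in PyTorch with random tensors standing in for real MRI data, just to show the shape of the approach; actual clinical systems work on 3D volumes, with far larger models and validation this snippet doesn't pretend to cover.

```python
# Toy sketch of the image-classification approach behind "AI reads the MRI":
# a small CNN scoring 2D slices as normal vs. abnormal. Random tensors stand
# in for real MRI data -- this is illustrative only, not a diagnostic tool.
import torch
from torch import nn

class SliceClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, 2)  # two classes: normal / abnormal

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = SliceClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch: 8 single-channel 64x64 "slices" with made-up labels.
slices = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 2, (8,))

for _ in range(5):  # a few training steps, just to show the loop
    optimizer.zero_grad()
    loss = loss_fn(model(slices), labels)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    probs = model(slices).softmax(dim=1)  # per-slice class probabilities
print(probs[:, 1])                        # "abnormal" scores a radiologist might review
```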