this post was submitted on 22 Jun 2024
675 points (98.1% liked)
Technology
you are viewing a single comment's thread
And people will still say AI isn't a bubble.
There is a bubble in AI; AI itself isn't a bubble. In the same way, there was a bubble in e-commerce that led to the dotcom crash, but that didn't mean there was nothing of value there, just that there was too much money chasing hype.
I think it will hinge on one thing: will AI provide an experience that is maybe worse, but still sufficient to keep market share, at a lower cost than putting in the proper effort? If so, it might still become a tragic "success" story.
It's very, very costly, both the hardware and the electricity it takes to run it. There may be a bit of sunk-cost fallacy at play for some, especially the execs who are calling for AI Everything, but in the end, if AI doesn't generate enough increase in revenue to offset its operational costs, even those execs will bow out. I think the economics of AI will cause the bubble to burst, because end users aren't going to pay for a service that does a mediocre job at most things but costs more.
That's what I suspect, too, but I'm not entirely sure from my research so far. The question I'm still unsure about: is it as costly to run, or is the really costly part "just" training the model? I wondered that because, when I was messing around, things like generative text models could run on my potato PC with a bit of Python scripting without too much issue, even if not ideally, as long as I had the already-trained model downloaded.
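The asymmetry described above (training is the expensive pass over data; generation afterwards is cheap) can be illustrated with a toy sketch. This character-level bigram model is purely a stand-in for a real LLM, not any specific library or the commenter's actual setup: training scans the whole corpus, while each generation step is just a lookup plus a random choice.

```python
import random
from collections import defaultdict

def train(corpus):
    """The 'expensive' part: one full pass over the corpus,
    counting which character follows which. Cost scales with data size."""
    model = defaultdict(list)
    for a, b in zip(corpus, corpus[1:]):
        model[a].append(b)
    return dict(model)

def generate(model, seed, length, rng=None):
    """The 'cheap' part: each step is a dict lookup and a random pick,
    independent of how much data the model was trained on."""
    rng = rng or random.Random(0)
    out = [seed]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return "".join(out)

model = train("the cat sat on the mat. the cat ran.")
print(generate(model, "t", 20))
```

Real transformer inference is of course far heavier than a dict lookup, but the shape of the trade-off is the same: once the trained weights exist, sampling from them is a fraction of the training cost, which is consistent with small models being usable on modest hardware.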
Can't really answer the expense trade-off until you look at concrete use cases, something general AI is allergic to...
You've got a great point there, actually