this post was submitted on 19 Apr 2025
409 points (92.5% liked)

Technology

(page 2) 50 comments
[–] [email protected] 79 points 1 day ago* (last edited 1 day ago) (12 children)

AI AI AI AI

Yawn

Wake me up if they figure out how to make this cheap enough to put in a normal person's server.

[–] [email protected] 4 points 1 day ago (2 children)

You can get a Coral TPU for 40 bucks or so.

You can get an AMD APU with a NN-inference-optimized tile for under 200.

Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.

What price point are you trying to hit?

[–] [email protected] 1 points 23 hours ago

I just use pre-made AIs and write detailed instructions for them, then watch them churn out basic documents over hours... I need a better laptop.

[–] [email protected] 7 points 1 day ago (6 children)

What price point are you trying to hit?

With regards to AI? None, tbh.

With this super-fast storage I have other cool ideas, but I don't think I can get enough bandwidth to saturate it.

[–] [email protected] 26 points 1 day ago (1 children)

Clickbait article with some half-truths. A discovery was made, but it has little to do with AI, and real-world applications will be much, MUCH more limited than what's being talked about here. It will also likely take years to reach the market.

[–] [email protected] 42 points 1 day ago (1 children)

Too bad the US can't import any of it.

[–] [email protected] 26 points 1 day ago (1 children)

They can if they pay 6382538% tariffs.

Or was it 29403696%?

[–] [email protected] 14 points 1 day ago

“These chips are 10,000 times faster, therefore we will increase our tariffs to 10,100%!”

[–] [email protected] 32 points 1 day ago* (last edited 1 day ago) (1 children)

Brother, have you heard of buses? Even INSIDE CPUs/SoCs, bus speeds are a limitation. Also, I fucking hate how the first thing people mention now is how AI could benefit from a jump in computing power.

Edit: I haven't dabbled that much in high-speed stuff yet, but isn't the picosecond range so fast that the capacitance of simple traces and connectors between chips influences the rising and falling edges of signals?
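A back-of-the-envelope check supports the edit above. Treating a driver plus trace as a single-pole RC circuit (assumed values: a 50-ohm source impedance and a few picofarads of parasitic capacitance, both hypothetical but typical orders of magnitude), the 10-90% rise time is roughly 2.2·RC, which already dwarfs a picosecond-scale switching event:

```python
# Hypothetical single-pole RC model of a chip-to-chip connection.
R = 50        # ohms: assumed driver/trace impedance
C = 5e-12     # farads: assumed parasitic capacitance of trace + connector
tau = R * C   # RC time constant in seconds
rise_10_90 = 2.2 * tau  # 10-90% rise time for a first-order RC response
print(f"tau = {tau * 1e12:.0f} ps, 10-90% rise = {rise_10_90 * 1e12:.0f} ps")
```

With these assumptions the rise time comes out in the hundreds of picoseconds, so a cell that switches in ~1 ps would indeed be limited by the interconnect, not the cell itself.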

[–] [email protected] 8 points 1 day ago

That's pretty much my understanding. Most of the advancements in memory speed are related to the physical proximity of the memory and more efficient transmission/decoding.

GDDR7 chips, for example, are packed as close as physically possible to the GPU die and have insane read speeds of 28 Gbps/pin (and a 5090 has a 512-bit bus). Most of the limitation is the connection between the GPU and RAM, so speeding up the chips internally 1000x won't have a noticeable impact without also improving the memory bus.
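The two figures quoted above can be multiplied out to get the aggregate bus bandwidth (this just checks the comment's own numbers, not an official spec sheet):

```python
# Aggregate bandwidth from the per-pin rate and bus width quoted above.
gbps_per_pin = 28           # Gb/s per pin, as quoted for GDDR7
bus_width_bits = 512        # bus width attributed to the 5090
total_gbps = gbps_per_pin * bus_width_bits  # gigabits per second
total_gbytes = total_gbps / 8               # gigabytes per second
print(f"{total_gbytes:.0f} GB/s aggregate memory bandwidth")
```

That works out to 1792 GB/s, i.e. roughly 1.8 TB/s across the whole bus: enormous, but still the bottleneck relative to a memory cell that is 1000x faster internally.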

[–] [email protected] 11 points 1 day ago

Wow, finally graphene has been cracked. Exciting times for portable low-energy computing.

[–] [email protected] 8 points 1 day ago (1 children)

Is that fast enough to put an LLM in swap and have decent performance?

[–] [email protected] 12 points 1 day ago

Note that this, in theory, speaks to the performance of a non-volatile memory. It does not speak to cost.

We already have a non-volatile storage faster than NAND: phase-change memory. It failed due to expense.

If this thing is significantly more expensive than even RAM, it may fail even if it is everything it claims to be. If it is at least as cheap as RAM, it'll be huge, since it is faster than RAM and non-volatile.

Whether something makes sense as swap is determined by cost, not by non-volatility.

[–] [email protected] 3 points 1 day ago* (last edited 1 day ago) (1 children)

Does flash, like solid-state drives, have the same lifespan limits in terms of writes? If so, it feels like this would most certainly not be useful for AI, as that use case would involve billions or trillions of writes in a very short span of time.

Edit: It looks like it does: https://www.enterprisestorageforum.com/hardware/life-expectancy-of-a-drive/

Manufacturers say to expect flash drives to last about 10 years based on average use. But life expectancy can be cut short by defects in the manufacturing process, the quality of the materials used, and how the drive connects to the device, leading to wide variations. Depending on the manufacturing quality, flash memory can withstand between 10,000 and a million [program/erase] cycles.
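The quoted P/E range translates directly into lifetimes under a write-heavy workload. A rough sketch, assuming a hypothetical drive that gets fully rewritten once per hour (an aggressive, AI-style write pattern) with ideal wear leveling:

```python
# Lifetime estimate from the quoted P/E cycle range, assuming a
# hypothetical workload of one full-drive rewrite per hour.
full_rewrites_per_day = 24  # assumed write intensity

for pe_cycles in (10_000, 1_000_000):
    days = pe_cycles / full_rewrites_per_day  # each rewrite uses 1 P/E cycle
    print(f"{pe_cycles:>9} P/E cycles -> ~{days / 365:.1f} years")
```

At the low end of the quoted range (10,000 cycles) the drive wears out in about a year under this assumed load; only at the million-cycle high end does endurance stop being the limiting factor.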

[–] [email protected] 6 points 1 day ago* (last edited 1 day ago) (1 children)

For AI processing, I don't think it would make much difference if it lasted longer. I could be wrong, but AFAIK, running the actual transformer is done in VRAM, and staging and preprocessing are done in RAM. Anything else wouldn't really make sense speed- and bandwidth-wise.

[–] [email protected] 3 points 1 day ago

Oh, I agree, but the speeds in the article are much faster than any current volatile memory. So it could theoretically be used to vastly expand onboard memory capacity for accelerators/TPUs/etc.

I guess if they can replicate these speeds in volatile memory and increase the buses to handle it, then they'd be really onto something here for numerous use cases.

[–] [email protected] 5 points 1 day ago (1 children)

This sounds like that material would be more useful in high performance radars, not as flash memory

[–] [email protected] 21 points 1 day ago* (last edited 1 day ago) (1 children)

It's likely BS anyway. Maybe it's just me, but reading about another crazy breakthrough from China every single day during this trade war smells fishy. I've seen the exact same propaganda strategy during the pandemic, when relations between China and the rest of the world weren't exactly the best. A lot of those headlines are just claims about flashy topics with very little substance or second-guessing, and the papers releasing the stories aren't exactly the most renowned either.
