this post was submitted on 10 Dec 2024
329 points (99.1% liked)
Technology
Are you aware that the RAM in your computing devices loses information when you read a bit?
Why don't you switch from a smartphone to an abacus and dwell in the anti-science reality of medieval times?
And that it loses data after merely a few milliseconds if left alone? To account for that, DDR5 reads and rewrites unused data every 32 ms.
You're describing how ancient magnetic-core memory works; that's not how modern DRAM (Dynamic RAM) works. DRAM uses a constant pulsing refresh cycle to recharge the micro-capacitors of each cell.
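The refresh cycle can be sketched as a toy simulation. This is a hypothetical model, not real hardware timing: the leakage rate and sense threshold are made-up constants, and only the 32 ms refresh window is taken from the thread above.

```python
# Toy model of DRAM refresh (hypothetical simulation, not real hardware timing).
# Each cell is a leaky capacitor; a periodic refresh pass senses and rewrites
# every cell before its charge drops below the sense threshold.

DECAY_PER_MS = 0.01       # assumed leakage rate per millisecond (made up)
SENSE_THRESHOLD = 0.5     # below this, a stored 1 is misread as 0 (made up)
REFRESH_INTERVAL_MS = 32  # DDR5-style refresh window from the comment above

class DramCell:
    def __init__(self, bit):
        self.charge = 1.0 if bit else 0.0

    def leak(self, ms):
        # Capacitor charge bleeds away over time.
        self.charge = max(0.0, self.charge - DECAY_PER_MS * ms)

    def read(self):
        return 1 if self.charge > SENSE_THRESHOLD else 0

    def refresh(self):
        # Sense the cell, then rewrite it at full charge.
        self.charge = 1.0 if self.read() else 0.0

def run(cells, total_ms, refresh=True):
    for ms in range(1, total_ms + 1):
        for c in cells:
            c.leak(1)
        if refresh and ms % REFRESH_INTERVAL_MS == 0:
            for c in cells:
                c.refresh()
    return [c.read() for c in cells]

bits = [1, 0, 1, 1]
print(run([DramCell(b) for b in bits], 100, refresh=True))   # data survives
print(run([DramCell(b) for b in bits], 100, refresh=False))  # stored 1s decay to 0
```

With refresh enabled the charge never falls below the threshold between passes; with it disabled, every stored 1 decays to a 0 within the 100 ms window.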
And on top of that, SRAM (Static RAM) doesn't even need the refresh circuitry; it just works and holds its data as long as it remains powered. It only takes 2 discrete transistors, 2 resistors, 2 buttons and 2 LEDs to demonstrate this on a simple breadboard.
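The feedback loop behind that breadboard demo can be sketched in logic. This is a hedged toy model of a cross-coupled NOR latch (the storage element at the heart of an SRAM cell), settled by iteration; the gate-level details are an illustration, not a circuit schematic.

```python
# Toy cross-coupled NOR latch: two NOR gates feed each other's outputs back,
# so the stored bit is held by feedback alone - no refresh needed.

def nor(a, b):
    return 0 if (a or b) else 1

def latch(s, r, q=0, qn=1):
    # Iterate until the feedback loop settles into a stable state.
    for _ in range(4):
        q_new = nor(r, qn)
        qn_new = nor(s, q)
        if (q_new, qn_new) == (q, qn):
            break
        q, qn = q_new, qn_new
    return q, qn

q, qn = latch(s=1, r=0)              # set pulse -> stores 1
print(q)                             # 1
q, qn = latch(s=0, r=0, q=q, qn=qn)  # inputs idle -> bit retained by feedback
print(q)                             # 1
q, qn = latch(s=0, r=1, q=q, qn=qn)  # reset pulse -> stores 0
print(q)                             # 0
```

The "hold" call is the point: with both inputs released, the loop keeps re-deriving the same state, which is why SRAM holds its data for as long as power is applied.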
I'm taking a wild guess that you've never built any circuits yourself.
I'm taking a wild guess that you completely ignored the subject of the thread to start an electronics engineering pissing contest?
Do you really trust the results of any computing system, no matter how it's designed, when it has pathetic memory integrity compared to ancient technology?
That is not a product. This is research.
And you would have been there shitting on magnetic core memory when it came out. But without that we wouldn't have the more advanced successors we have now.
Nah, core memory is alright in my book, considering the era of technology anyways. I would have been shitting on the Williams tube CRT memory system.
https://youtube.com/watch?v=SpqayTc_Gcw
Though in all fairness, at the time even that was something of progress.
Doubt.
Core memory loses information on read and DRAM is only good while power is applied. Your street dime will be readable practically forever and your abacus is stable until someone kicks it over.
You're not the arbiter of what technology is "good enough" to warrant spending money on.
Core memory is also designed to accommodate that and almost instantly rewrites the data back to memory. That in itself might be a crude form of 'error' correction, but it still lasts way longer than an hour.
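That read-then-rewrite cycle can be sketched as a toy model. This is a hypothetical illustration of destructive readout, not an emulation of any real core plane: sensing a core drives it to 0, so the controller immediately writes the sensed value back.

```python
# Toy model of magnetic-core destructive read: reading a core flips it to 0,
# so the controller rewrites the sensed bit in the same memory cycle.

class CoreMemory:
    def __init__(self, bits):
        self.cores = list(bits)

    def read(self, addr):
        sensed = self.cores[addr]  # a stored 1 induces a sense-line pulse...
        self.cores[addr] = 0       # ...but the read current drives the core to 0
        self.cores[addr] = sensed  # rewrite half of the cycle restores the bit
        return sensed

mem = CoreMemory([1, 0, 1])
print(mem.read(0))  # 1
print(mem.read(0))  # still 1 - the rewrite preserved it across reads
```

Because the rewrite is part of every read cycle, the destructive read is invisible to the program, which is the point being made above.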
Granted that quantum computers are a different beast of their own, how much digital data does a qubit actually store? And how does that stack up in a price-per-bit comparison?
If they already know quantum computers are more prone to memory errors, why not use reliable conventional RAM to store the intermediate data and let the quantum side of things be the 'CPU', or QPU if you like?
I dunno, it just makes absolutely no sense to me to utilize any sort of memory technology that, even with error correction, still manages to lose information faster than a jumping spider's memory.