this post was submitted on 20 May 2024
180 points (95.0% liked)

Technology

[–] [email protected] 41 points 5 months ago (3 children)

with 85% of the promised functionality no longer functional

To be fair, 85% of the threads retracting doesn't seem to translate to an equal loss of function. The article mentions:

Neuralink was quick to note that it was able to adjust the algorithm used for decoding those neuronal signals to compensate for the lost electrode data. The adjustments were effective enough to regain and then exceed performance on at least one metric—the bits-per-second (BPS) rate used to measure how quickly and accurately a patient with an implant can control a computer cursor.

I think it will be impossible for us to assess how much it actually impacts function in real-world use cases.

It seems clear that this is a case of learning by trial and error, which, considering the stakes, doesn't seem like the right approach.

The question this article doesn't answer is whether they have learned anything at all, or if they are just proceeding to do the same thing again. And if they have learned something, is there anything preventing it from being applied to the first patient?

[–] [email protected] 20 points 5 months ago (1 children)

if they have learned something, is there anything preventing it from being applied to the first patient?

That's part of what makes me see this as a really bad look.

"Install it deeper" isn't rocket science, and it sounds like their first volunteer is willing.

They just want the extra data from leaving their first volunteer where they landed.

Human subject experiments are supposed to carry more long term obligation than this.

[–] [email protected] 6 points 5 months ago* (last edited 5 months ago)

Seriously. My father was part of a Deep Brain Stimulation trial. Their follow-up ran for ten years, just for the trial. The implant itself lasted the rest of his life, which was, what, another five or six years after that? I'm not feeling like doing the math.

[–] [email protected] 11 points 5 months ago* (last edited 5 months ago) (1 children)

I think it will be impossible for us to assess how much it actually impacts function in real-world use cases.

It does seem fair to say, though, that if you have 85% fewer data inputs/probes, you're losing a large amount of fidelity that an algorithm can only compensate for so much.

A potentially bad analogy, but think of it like listening to music at a high bitrate versus a low bitrate. The quality will be noticeably different, but you'd still be able to hear both songs in their entirety.

At the end of the day, the algorithm is now missing data it was originally expected to work with.
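The analogy can be sketched in a few lines of Python. Everything here is made up for illustration (the sample rates, the sine-wave "signal", and the nearest-neighbour "decode" have nothing to do with Neuralink's actual decoder); it just shows that a sparsely sampled signal still tracks the original, only less faithfully:

```python
import math

def sample(rate, duration=1.0, freq=5.0):
    # A 5 Hz sine wave standing in for some underlying signal.
    n = int(rate * duration)
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

def reconstruction_error(rate, reference_rate=1000):
    # Compare a coarse sampling of the signal against a dense reference,
    # using a naive nearest-neighbour hold as the "decoder".
    ref = sample(reference_rate)
    coarse = sample(rate)
    step = reference_rate / rate
    approx = [coarse[min(int(i / step), len(coarse) - 1)] for i in range(len(ref))]
    return sum(abs(a - b) for a, b in zip(ref, approx)) / len(ref)

full = reconstruction_error(1000)   # all "probes" reporting
reduced = reconstruction_error(150) # roughly 15% of the data left

# The reduced version still traces the whole signal, just with a larger
# average error -- recognizable, but lower fidelity.
print(full, reduced)
```

The coarse reconstruction never becomes unrecognizable; the error just grows, which is the "low bitrate music" point in numbers.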

~Anti~ ~Commercial-AI~ ~license~ ~(CC~ ~BY-NC-SA~ ~4.0)~

[–] [email protected] 7 points 5 months ago

Currently they've been having him control a cursor. He can left- and right-click with it.

It seems he can now perform as well as he could before the problem, so if that's the case, the extra bitrate wasn't needed for that task.

What this probably means is that he won't be able to do as much as they learn more about what the implant can do.

Maybe a year in, the 2nd patient, with full fidelity, is able to attach it to a robotic arm and fetch themselves a drink, while Nolan, even though he can click just as well, won't be able to do that.

Also, if they do fix it eventually (they didn't say never, just not yet), they'll never know whether that discrepancy would have occurred.

[–] [email protected] 6 points 5 months ago (1 children)

For sure they learned something; they must have some idea of why the threads retracted. They also confirmed the viability of the technology by running tests before the retraction.

[–] [email protected] 1 points 5 months ago

This was a known problem that they didn't fix in the animal models before moving to human trials. They learned nothing. All they did was scrap someone's brain. But I'm sure it's no big deal, he was a cripple, right? He should be happy to be part of this. /s