this post was submitted on 28 May 2024
73 points (82.3% liked)


Elon Musk's quest to wirelessly connect human brains with machines has run into a seemingly impossible obstacle, experts say. The company is now asking the public for help finding a solution.

Musk's startup Neuralink, which is in the early stages of testing in human subjects, is pitched as a brain implant that will let people control computers and other devices using their thoughts. Some of Musk's predictions for the technology include letting paralyzed people "walk again and use their arms normally."

Turning brain signals into computer inputs means transmitting a lot of data very quickly. A problem for Neuralink is that the implant generates about 200 times more brain data per second than it can currently wirelessly transmit. Now, the company is seeking a new algorithm that can transmit this data in a smaller package — a process called compression — through a public challenge.

As a barebones web page announcing the Neuralink Compression Challenge, posted on Thursday, explains, "> 200x compression is needed." The winning solution must also run in real time and at low power.
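For a rough sense of where that number comes from, here is a back-of-the-envelope sketch. The electrode count, sampling rate, bit depth, and radio budget below are assumptions based on figures reported for the challenge (1024 electrodes sampled at 20 kHz with 10-bit resolution, and roughly 1 Mbps of wireless bandwidth), not official specs.

```python
# Back-of-the-envelope check on the ">200x" figure. All parameters
# here are assumptions based on figures reported for the challenge,
# not official Neuralink specs.
electrodes = 1024
sample_rate_hz = 20_000
bits_per_sample = 10

raw_bps = electrodes * sample_rate_hz * bits_per_sample  # raw data rate
radio_bps = 1_000_000                                    # assumed ~1 Mbps radio budget

print(f"raw rate: {raw_bps / 1e6:.1f} Mbps")
print(f"required compression: {raw_bps / radio_bps:.0f}x")  # ~205x
```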

[–] [email protected] 22 points 5 months ago (9 children)

The implication of a 200-to-1 algorithm would be that the data they're collecting is almost entirely noise; specifically, that 99.5% of all the data is noise. In theory, if they had sufficient processing power in the implant, they could filter the data down before transmission, reducing the bandwidth usage by 99.5%. It seems like it would be fairly trivial to prove that any such 200-to-1 compression algorithm would be indistinguishable in function from a noise filter on the raw data.
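Here's a quick sketch of that intuition, using Python's standard zlib as a stand-in for a general-purpose compressor (the data sizes are arbitrary): random bytes barely compress at all, while a repetitive stream compresses easily, so a 200-to-1 ratio implies the raw stream is overwhelmingly redundant.

```python
# Sketch: a general-purpose compressor (zlib here) can only shrink
# data that contains structure. Pure random bytes are incompressible,
# so a huge ratio implies the input is mostly redundant.
import os
import zlib

random_data = os.urandom(1_000_000)          # ~1 MB of noise
structured_data = bytes(range(256)) * 3_906  # ~1 MB repeating pattern

for label, data in [("random", random_data), ("structured", structured_data)]:
    out = zlib.compress(data, level=9)
    print(f"{label}: {len(data)} -> {len(out)} bytes "
          f"({len(data) / len(out):.1f}x)")
```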

It's not quite the same situation, but this article lays out some of the related issues: https://matt.might.net/articles/why-infinite-or-guaranteed-file-compression-is-impossible/

[–] [email protected] 1 points 5 months ago (2 children)

Huh? That's not what it means at all. Compression saves on redundant data, but that doesn't mean the data is noise. Or are you using some definition of noise I'm not aware of?

[–] [email protected] 4 points 5 months ago (1 children)

I can try to explain, but there are people who know much more about this stuff than I do, so hopefully someone more knowledgeable steps in to check my work.

What does ‘random’ or ‘noise’ mean? In this context, random means that any given bit of information is equally likely to be a 1 or a 0. Noise means a collection of information that is either random or unimportant/non-useful.

So, you say “compression saves on redundant data”. If we think that through, using the definitions above, we can reason that ‘random noise’ either has no redundant information (due to the randomness), or that much of its information is not useful (due to its characteristic as noise).
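To put a number on “equally likely”: here's a minimal sketch of the Shannon entropy of a single bit (the probabilities are arbitrary examples). At p = 0.5 the entropy is a full 1 bit per bit, which is exactly why a truly random stream can't be compressed.

```python
# Illustration of "random" in the information-theoretic sense: a bit
# that is equally likely to be 0 or 1 carries maximum entropy, so no
# compressor can shrink a stream of such bits.
import math

def entropy_bits(p_one: float) -> float:
    """Shannon entropy of a bit that is 1 with probability p_one."""
    if p_one in (0.0, 1.0):
        return 0.0  # perfectly predictable: zero information per bit
    p_zero = 1.0 - p_one
    return -(p_one * math.log2(p_one) + p_zero * math.log2(p_zero))

print(entropy_bits(0.5))  # 1.0   -> incompressible (pure randomness)
print(entropy_bits(0.9))  # ~0.47 -> compressible to ~47% at best
print(entropy_bits(1.0))  # 0.0   -> trivially compressible
```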

I think that’s what the person is describing. Does that help?

[–] [email protected] 1 points 5 months ago

I agree with your point, but you're arguing that noise can be redundant data. I am arguing that redundant data is not necessarily noise.

In other words, a signal can never be filtered losslessly. You can slap a low-pass filter in front of the signal and call it a day, but there's loss. If lossless is a hard requirement, there's absolutely nothing you can do but compress the redundant data through, e.g., patterns, interpolation, what have you (I don't know much about compression algorithms).
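Here's a tiny sketch of the "filtering is lossy" point: a 2-tap moving average (a crude low-pass filter, used here purely for illustration) maps two different inputs to the same output, so the original samples can't be recovered from the filtered signal.

```python
# Why filtering can't be lossless: a 2-tap moving average maps two
# different inputs to the same output, so it is not invertible.
def moving_average(xs: list[float]) -> list[float]:
    """Average each adjacent pair of samples."""
    return [(a + b) / 2 for a, b in zip(xs, xs[1:])]

print(moving_average([0, 2, 0, 2]))  # [1.0, 1.0, 1.0]
print(moving_average([1, 1, 1, 1]))  # [1.0, 1.0, 1.0] (same output)
```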

A perfectly noise-free signal is arguably easier to compress, actually, since the signal is more predictable.
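A rough sketch of that last point, again leaning on zlib as a stand-in compressor; the sine and noise parameters are arbitrary assumptions. The clean, perfectly periodic wave compresses dramatically better than the noisy one.

```python
# Sketch: the same 8-bit-quantized sine wave, clean vs. with added
# Gaussian noise. The clean wave is perfectly periodic, so zlib
# compresses it far better. All parameters are arbitrary.
import math
import random
import zlib

n = 100_000
clean = bytes(int(127 + 120 * math.sin(2 * math.pi * i / 200)) for i in range(n))
noisy = bytes(
    min(255, max(0, int(127 + 120 * math.sin(2 * math.pi * i / 200)
                        + random.gauss(0, 20))))
    for i in range(n)
)

for label, data in [("clean", clean), ("noisy", noisy)]:
    print(f"{label}: {len(data) / len(zlib.compress(data, level=9)):.1f}x")
```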
