this post was submitted on 22 Sep 2023
62 points (80.4% liked)

all 40 comments
[–] [email protected] 50 points 1 year ago

Obligatory “What could possibly go wrong? /s”

[–] [email protected] 43 points 1 year ago (1 children)

For that thing that killed hundreds of monkeys? Yeah, sounds like a great plan.

[–] [email protected] 35 points 1 year ago

"Only" 15-17 monkeys, but thousands of other animals. Insanely depressing. The more you read about it, the more you start to actually believe that the death of one particular primate could indeed be beneficial for humanity ...

On several occasions over the years, Musk has told employees to imagine they had a bomb strapped to their heads in an effort to get them to move faster, according to three sources who repeatedly heard the comment.

On one occasion a few years ago, Musk told employees he would trigger a "market failure" at Neuralink unless they made more progress, a comment perceived by some employees as a threat to shut down operations, according to a former staffer who heard his comment.

Five people who've worked on Neuralink's animal experiments told Reuters they had raised concerns internally. They said they had advocated for a more traditional testing approach, in which researchers would test one element at a time in an animal study and draw relevant conclusions before moving on to more animal tests.

Instead, these people said, Neuralink launches tests in quick succession before fixing issues in earlier tests or drawing complete conclusions. The result: More animals overall are tested and killed, in part because the approach leads to repeated tests.

One former employee who asked management several years ago for more deliberate testing was told by a senior executive it wasn't possible given Musk's demands for speed, the employee said. Two people told Reuters they left the company over concerns about animal research.

[–] [email protected] 33 points 1 year ago (1 children)

Is there no government oversight for "Uhh no you aren't?"

Given the recent animal testing results this seems like assisted suicide

[–] [email protected] 4 points 1 year ago

There was; they were not initially approved.

[–] [email protected] 32 points 1 year ago (2 children)

Ole Musky should step up and prove how safe it is. If he gets one, I'll definitely get one.

[–] [email protected] 19 points 1 year ago

One implant to rule them all, one implant to find them, One implant to bring them all, and in the darkness bind them

[–] [email protected] 16 points 1 year ago (3 children)

He behaves like he already has one.

[–] [email protected] 13 points 1 year ago

Oh shit, what if the purchase of Twitter, change to X, etc...etc...is all due to brain implant side effects... Someone get this conspiracy going!

[–] [email protected] 2 points 1 year ago

So then the thing that killed the monkeys was smelling their smug farts all day? It all makes sense now...

[–] [email protected] 1 points 1 year ago

He thinks we’re living in a simulation.

[–] [email protected] 30 points 1 year ago (2 children)

Lol this guy can't even make a car that doesn't kill someone or have a bumper that doesn't fall off

[–] [email protected] 11 points 1 year ago (1 children)

I shudder to think what the human equivalent of "fully autonomous driving" or a launchpad explosion looks like.

[–] [email protected] 7 points 1 year ago

Or even the human equivalent of a bumper falling off.

[–] [email protected] -2 points 1 year ago (1 children)

All cars kill people. Don't Teslas have a pretty decent safety rating?

[–] [email protected] -2 points 1 year ago (1 children)
[–] [email protected] 11 points 1 year ago* (last edited 1 year ago)

Okay, instead of posting rage bait, can you show me that more people are dying in/from Teslas than in other vehicles per mile driven?

And just to be clear, I don't own a car. Nor do I care for Teslas. But you can't claim it's a dangerous car without comparing it to the rest of the industry. Cars in general are really fucking unsafe.

[–] [email protected] 24 points 1 year ago

First rule of technology: if the increase in complexity and decrease in reliability outweigh the added tangible value, don't implement it. This is why it's usually best to avoid "smart" appliances or, you know, brain implants.

[–] [email protected] 23 points 1 year ago (1 children)

A good rule of thumb with computers and software is to never touch/buy an alpha/version 1.0 of any system, as it's best to let someone else sort out the major bugs.

This is the dilemma people trying to create wetware (brain-hardware interfaces) face. There will be problems, and how the hell any experiments to advance this pass an ethics board is beyond me.

[–] [email protected] 38 points 1 year ago* (last edited 1 year ago) (2 children)
[–] [email protected] 18 points 1 year ago* (last edited 1 year ago)

Holy shit that is beyond terrible. It reminds me of something William Gibson would write (tech wise) with the absurdity of Douglas Adams or Kurt Vonnegut mixed in.

And I can see this sort of thing happening again and again if this tech keeps developing over the next 50 years.

I would now revise this to never touch any wetware interface for the next 30 years and maybe by then it will be stable.

[–] [email protected] 4 points 1 year ago

That is just terrifying: imagine suddenly going back to being completely blind, and then learning no one's really out there to fix it anymore because the company behind it just went poof one day.

[–] [email protected] 19 points 1 year ago (2 children)

If someone besides Musk was running things, I might be excited about the potential for progress… as it stands, though, I just can’t trust the guy.

[–] [email protected] 25 points 1 year ago

Given what is coming out about how the animal test subjects were treated, you'd be better off letting a random dentist poke at your brain

[–] [email protected] 3 points 1 year ago

Same.

Although I can't say I would trust one from a FAANG company much more.

[–] [email protected] 18 points 1 year ago (1 children)

I'll bet you need a lifetime subscription with that and get a blue verification mark on your forehead.

[–] [email protected] 17 points 1 year ago

A 30 day lifetime subscription.

[–] [email protected] 13 points 1 year ago

I can see how a quadriplegic or someone with ALS would be excited for this trial. I hope it goes well. It could give someone who is trapped in their body a new way to communicate with their loved ones.

[–] [email protected] 12 points 1 year ago (2 children)

Does it support full auto-think?

[–] [email protected] 13 points 1 year ago

Whoever thinks getting one is a good idea already has it enabled.

[–] [email protected] 2 points 1 year ago

I already have that and I'm not impressed

[–] [email protected] 8 points 1 year ago

I mean look on the bright side. It’s Musk’s sycophants who would line up to die for something like this.

[–] [email protected] 6 points 1 year ago (1 children)

People would end up forced to get this shit implanted to be competitive in the market. This sucks.

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago)

Is anyone up for starting a non religious Amish society?

[–] [email protected] 4 points 1 year ago
[–] [email protected] 3 points 1 year ago

Nice move there AI. I see what you're doing.

[–] [email protected] 2 points 1 year ago (1 children)

Always mount a scratch test subject before testing or reconfiguring.

http://www.catb.org/jargon/html/S/scratch-monkey.html

[–] [email protected] 1 points 1 year ago

They're all dead already.