this post was submitted on 24 Sep 2023
597 points (97.2% liked)
Technology
you are viewing a single comment's thread
Definitely a huge concern, but Amazon didn't erase this guy :P In a very real sense he was fine, and was still able to use most of his tech via Siri integration. I'm actually kind of glad Amazon is willing to shut down services for bigots.
How exactly was he a bigot?
As it turned out, he wasn't. But when they stopped servicing him, they had every reason to believe that he was.
Do you continue to service a customer whose behavior is otherwise unacceptable until you're absolutely sure he's a bigot? Or do you abide by your legal obligation to protect your workers from that behavior?
I don't know if Amazon did the worst thing here, but I don't know that the best thing is far off from what they did.
Who at Amazon would be hurt by a bigot using their Echo or doorbell? Stopping deliveries sure but this is a couple of steps further.
That's a great question. I don't know what kind of exposure Amazon employees have to audio logs from those devices, but I'm certain there's some, due to required troubleshooting and debugging.
I also don't know how integrated the various aspects of a user's account are, or whether it would even have been possible for Amazon to take a smaller step.