This will definitely make customers less trustful of Microsoft when it comes to privacy-focused AI projects. Here's hoping that open-source LLMs become more advanced and optimized.
I am not sure. This was mostly a case of human error in not properly securing URLs and storage accounts. The lack of centralised control over SAS tokens that the article highlights was a contributing factor, but not the root cause, which was human error.
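For what it's worth, the reason centralised control is so hard: a SAS token is computed entirely client-side as an HMAC over the grant, signed with the account key, so the storage service keeps no record of what has been issued. A minimal sketch with the `azure-storage-blob` Python SDK (the account name, key, and container name are placeholders):

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholder credentials -- in real code these come from configuration.
ACCOUNT_NAME = "examplestorageacct"
ACCOUNT_KEY = "<account-key>"

# The SAS is signed locally with the account key. The service never sees
# or records this token until someone presents it, so there is no central
# inventory of outstanding tokens to audit or revoke -- short of rotating
# the account key, which invalidates *every* token signed with it.
sas = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name="research-data",  # hypothetical container
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/research-data?{sas}"
print(url)  # Anyone holding this URL has the granted access until expiry.
```

Stored access policies and user-delegation SAS mitigate this somewhat, but the ad-hoc, key-signed variety above is the easy default.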
If I leave my front door unlocked and someone walks in and robs my house, who is to blame? Me, for not locking the door? Or the house builder, for not providing a sensor so I can remotely check whether the door is locked?
> If I leave my front door unlocked and someone walks in and robs my house, who is to blame?
In a private setting, one person's mistake can happen, period.
A corporate environment absolutely needs robust procedures in place to protect the company and all its clients from the huge impact of a single person's mistake.
But that's a looong tradition at M$ - not having them, I mean.
Azure has a huge problem with SAS tokens. The mechanism is so bad that it invites situations like this.
The root cause is whatever allows the human error to happen in the first place.
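To make the "invites situations like this" point concrete: nothing in the SDK or the service pushes back if you sign a token with full permissions over the whole account and a decades-long expiry, which is reportedly the kind of token involved in this leak. A sketch contrasting that with the narrowly scoped token a careful engineer has to remember to mint (all names are placeholders):

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    BlobSasPermissions,
    ResourceTypes,
    generate_account_sas,
    generate_blob_sas,
)

ACCOUNT_NAME = "examplestorageacct"  # placeholder
ACCOUNT_KEY = "<account-key>"        # placeholder

# The dangerous-but-easy version: read/write/delete/list over every
# container and blob in the account, valid for 30 years. The SDK signs
# it without complaint; no policy, warning, or approval step intervenes.
too_broad = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True
    ),
    expiry=datetime.now(timezone.utc) + timedelta(days=365 * 30),
)

# The version a careful human has to remember to write: read-only,
# one specific blob, one-hour lifetime.
scoped = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="research-data",   # hypothetical container
    blob_name="model-weights.bin",    # hypothetical blob
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)
```

Both calls succeed equally; the safe one just takes more deliberate effort, which is the whole critique.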
A better analogy: if you live in an apartment and the landlord doesn't replace the front door locks when they break.
Because of course they did.