
More than half a million UniSuper fund members went a week with no access to their superannuation accounts after a “one-of-a-kind” Google Cloud “misconfiguration” led to the financial services provider’s private cloud account being deleted, Google and UniSuper have revealed.

[–] [email protected] 57 points 6 months ago (3 children)

And the crazy part is that it sounds like Google didn't have backups of this data after the account was deleted. The only reason they were able to restore the data was because UniSuper had a backup on another provider.

This should make anyone really think hard about the situation before using Google's cloud. Sure, it is good practice and frankly refreshing to hear that a company actually backed up away from their primary cloud infrastructure, but I'm surprised Google themselves don't keep backups for a while after an account is deleted.
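For anyone curious what backing up away from your primary provider can look like in practice, here's a minimal sketch (the bucket names are made up, and it assumes the google-cloud-storage and boto3 packages plus working credentials for both clouds) that copies every object in a GCS bucket over to S3:

```python
# Minimal cross-provider backup sketch: copy every object from a GCS bucket
# to an S3 bucket at a different provider. Bucket names are hypothetical.
import boto3
from google.cloud import storage

GCS_BUCKET = "primary-prod-data"        # assumption: your Google Cloud bucket
S3_BUCKET = "offsite-backup-prod-data"  # assumption: your AWS backup bucket

def backup_gcs_to_s3() -> None:
    gcs = storage.Client()   # uses GOOGLE_APPLICATION_CREDENTIALS
    s3 = boto3.client("s3")  # uses the standard AWS credential chain

    for blob in gcs.list_blobs(GCS_BUCKET):
        data = blob.download_as_bytes()  # fine for small objects; stream large ones
        s3.put_object(Bucket=S3_BUCKET, Key=blob.name, Body=data)
        print(f"copied {blob.name} ({len(data)} bytes)")

if __name__ == "__main__":
    backup_gcs_to_s3()
```

In real life you'd run something like this from a scheduler that lives outside the primary provider and add versioning and retention on top, but the key point is that the copy sits under a completely separate account.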

[–] [email protected] 33 points 6 months ago (1 children)

Actually, it highlights the importance of a proper distributed backup strategy and disaster recovery plan.
The same can probably happen on AWS, Azure, or any data center, really.

[–] [email protected] 15 points 6 months ago (1 children)

Actually, it highlights the importance of a proper distributed backup strategy and disaster recovery plan.

Uh, yeah, that's why I said

it is good practice and frankly refreshing to hear that a company actually backed up away from their primary cloud infrastructure

The same can probably happen on AWS, Azure, or any data center, really.

Sure, if you colocate in a data center that isn't your own, they aren't backing your data up without some kind of separate agreement and configuration. I'm not sure about AWS, but Azure actually has offline, geographically separate backup options.

[–] [email protected] 5 points 6 months ago

I use AWS to host a fair number of servers and some microservices, and if you don't build backups into your architecture design and the live data gets corrupted, you're screwed.

They give you the tools to build it all, but it's up to you as the sysadmin/engineer/dev to actually use those tools.
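To make that concrete, here's a hedged sketch of one of those tools: snapshotting an EBS volume and copying the snapshot into a second region with boto3 (the volume ID and regions are placeholders). AWS will happily perform each call, but nothing runs this unless you build and schedule it yourself:

```python
# Sketch: snapshot an EBS volume, then copy the snapshot to another region.
# Volume ID and regions are placeholders; add your own scheduling, tagging,
# and retention policy around this.
import boto3

SOURCE_REGION = "us-east-1"           # assumption
BACKUP_REGION = "eu-west-1"           # assumption
VOLUME_ID = "vol-0123456789abcdef0"   # placeholder

def snapshot_and_copy() -> str:
    ec2_src = boto3.client("ec2", region_name=SOURCE_REGION)
    ec2_dst = boto3.client("ec2", region_name=BACKUP_REGION)

    # 1. Snapshot the live volume in the source region.
    snap = ec2_src.create_snapshot(
        VolumeId=VOLUME_ID,
        Description="nightly backup",
    )
    ec2_src.get_waiter("snapshot_completed").wait(SnapshotIds=[snap["SnapshotId"]])

    # 2. Copy the completed snapshot into the backup region.
    copy = ec2_dst.copy_snapshot(
        SourceRegion=SOURCE_REGION,
        SourceSnapshotId=snap["SnapshotId"],
        Description="cross-region copy of nightly backup",
    )
    return copy["SnapshotId"]

if __name__ == "__main__":
    print("backup snapshot:", snapshot_and_copy())
```

Even then, a cross-region copy still lives in the same account, which is part of why UniSuper's copy at a different provider ended up mattering so much.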

[–] [email protected] 21 points 6 months ago

The IT guy who set up that backup deserves a hell of a bonus.

A lot of people would have been happy with their multi-region resiliency and stopped there.

[–] [email protected] 8 points 6 months ago (2 children)

No, they had backups. They deleted those, too.

[–] [email protected] 9 points 6 months ago* (last edited 6 months ago) (1 children)

Google Cloud definitely backs up data. Specifically, I said:

after an account is deleted.

The surprise here is that those backups are gone (or unrecoverable) immediately after the account is deleted.

[–] [email protected] 1 points 6 months ago

I've found that Google deletes backups after a few months.

[–] [email protected] 1 points 6 months ago

A replica is not a backup.