this post was submitted on 31 Aug 2023
365 points (99.5% liked)

Selfhosted

39964 readers
291 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues in the community? Report them using the report flag.

Questions? DM the mods!

founded 1 year ago

I posted the other day that you can clean CSAM out of your object storage using my AI-based tool. Many people expressed the wish to use it on pict-rs instances backed by local file storage. So I've just extended its functionality to allow exactly that.

The new lemmy_safety_local_storage.py will go through your pict-rs volume on the filesystem, scan each image for CSAM, and delete any matches. The requirements are:

  • A linux account with read-write access to the volume files
  • Private-key authentication for that account

Since my main instance uses object storage, my testing is limited to my dev instance, where everything looks OK to me. But do run it with --dry_run if you're worried. Afterwards you can delete lemmy_safety.db and rerun to enforce the deletions (a method to act on the --dry_run results directly is coming soon).
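For the curious, the dry-run flow described above can be sketched roughly like this. This is not the actual lemmy_safety_local_storage.py code; `scan_volume` and the `is_csam` classifier callback are hypothetical stand-ins for illustration only:

```python
import os
from pathlib import Path

def scan_volume(volume, is_csam, dry_run=True):
    """Walk a pict-rs style volume and flag (or delete) matching images.

    `is_csam` stands in for the real AI classifier; this sketch only
    illustrates the dry-run workflow, not the detection itself.
    """
    flagged = []
    for root, _dirs, files in os.walk(volume):
        for name in files:
            path = Path(root) / name
            if is_csam(path):          # hypothetical classifier call
                flagged.append(path)
                if not dry_run:
                    path.unlink()      # only delete when not a dry run
    return flagged
```

The point of the two-pass design: a dry run records what would be flagged without touching any files; a second pass without the flag performs the actual deletions.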

PS: if you were using the object storage cleanup, that script has been renamed to lemmy_safety_object_storage.py

[–] [email protected] 66 points 1 year ago* (last edited 1 year ago) (2 children)

I hope people share the positive hits of CSAM and see how widespread the problem is...

DRAMATIC EDIT: the records lemmy_safety_local_storage.py identifies, not the images! @[email protected] seems to think it "sounds like" I am ACTIVELY encouraging the spreading of child pornography images... NO! I mean audit data, such as timestamps, the account that uploaded, etc. Once you have the timestamp, the nginx logs from a lemmy server should help identify the IP address.
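A rough sketch of that log-correlation step, assuming the default nginx "combined" access log format. The `uploads_near` helper and the `/pictrs/image` path are illustrative assumptions, not part of the tool:

```python
import re
from datetime import datetime

# Matches the default nginx "combined" log format, e.g.
# 1.2.3.4 - - [31/Aug/2023:12:00:00 +0000] "POST /pictrs/image HTTP/1.1" ...
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)')

def uploads_near(log_lines, when, window):
    """Yield (ip, path) for POST requests within `window` of `when`."""
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ip, ts, method, path = m.groups()
        t = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
        if method == "POST" and abs(t - when) <= window:
            yield ip, path
```

Feed it the flagged record's timestamp and a small window, and it narrows the access log down to the candidate upload requests.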

[–] [email protected] 31 points 1 year ago (1 children)

What a hilarious mistake haha

[–] [email protected] 28 points 1 year ago* (last edited 1 year ago) (2 children)

It is not even a mistake; it's some pretty messed-up crap on the part of @[email protected] to jump to such a conclusion.

[–] [email protected] 11 points 1 year ago

It's cool, most everybody knows what you mean lol. Glad you clarified so there wouldn't be future misunderstandings

[–] [email protected] -1 points 1 year ago (1 children)

Oh come on, it's just a correction of communication

[–] [email protected] 3 points 1 year ago (1 children)

It’s probably projection. Nobody reasonable would have jumped to the same conclusion. It doesn’t even remotely read like that.

[–] [email protected] 2 points 1 year ago (1 children)

It sounds to me like an NT desire for perfectly crafted arguments, without ambiguity. I do this too, and feel fortunate that I didn't call for a correction myself. See how vicious you all are about it.

[–] [email protected] 1 points 1 year ago (1 children)

Oh come on. Being ND doesn’t mean your mind jumps to sharing child porn. That’s a fuckin cop out.

[–] [email protected] 3 points 1 year ago

You are saying the mind jumps, but that is the topic. I meant to say that being ND can create a desire for clarity in communication. A direct or terse argument.