palitu

joined 1 year ago
[–] [email protected] 2 points 1 year ago

I have an older Kobo Glo, which does not have the library integration, but I am wanting to do exactly the same as you: load a few books to take with us, and use library borrowing to get more to read.

[–] [email protected] 3 points 1 year ago

That would be good to know!

[–] [email protected] 2 points 1 year ago (1 children)

Thanks. I will look at Readarr and Ubooquity for the server side.

I have seen some books I acquired come in at over 100 MB, typically kids' books. Not sure why, but there you have it.

I think I will skip the tablet due to battery life; I love the weeks of use I get from my current Kobo Glo.

Thanks.

[–] [email protected] 5 points 1 year ago

Cool work! Love the idea of hooking into the fediverse toolset.

[–] [email protected] 1 points 1 year ago

Never heard of him before. What about him?

[–] [email protected] 2 points 1 year ago

There are a number of blog posts that cover different details about the how/why, etc. I just followed the links in the article to the other parts of the series.

I expect that the use case is more prevalent than you think, wherever you are spending a decent chunk on cloud infra. I have been convinced for some time now that the costs are high compared to our on-prem. I really like the idea of a "Deft"-type hardware management service, where they look after the DCs, hardware, and connectivity, and we look after the software.

[–] [email protected] 8 points 1 year ago

You mean the ML model?

I don't think it is too bad, as it is more like looking for a description that has both children and a sexual context. This can be trained without CSAM, as the model generalises from situations it has seen before: a pornographic picture (sexual context) and kids playing at a playground (children in the scene).

[–] [email protected] 15 points 1 year ago (2 children)

There is a tool that someone built specifically to scan images uploaded to Lemmy for CSAM.

It is really quite clever. The image is put through an ML/AI model, which describes it (image to text), then the text is reviewed against a set of rules to see if it has the hallmarks of CSAM. If it does, the image is deleted.

This is fully self-hosted.

What I like is that it avoids the trauma of a person having to see that sort of thing.
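For anyone curious what that flow looks like, here is a minimal sketch, assuming the Hugging Face transformers library and a generic public captioning model; the keyword rules are made-up stand-ins, and the actual tool's model and rules will differ:

```python
# Purely illustrative sketch of the flow described above -- NOT the actual
# tool's code. Assumes the `transformers` library and a public
# image-captioning model; the term lists are hypothetical stand-ins for
# whatever rules the real scanner applies.
import re

from transformers import pipeline

# Step 1: image to text -- describe the uploaded picture in plain English.
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

# Step 2: rule check -- flag only when the description combines both
# categories, per the "children + sexual context" idea above.
CHILD_TERMS = {"child", "children", "kid", "kids", "boy", "girl"}
SEXUAL_TERMS = {"nude", "naked", "sexual", "pornographic"}

def should_delete(image_path: str) -> bool:
    caption = captioner(image_path)[0]["generated_text"]
    words = set(re.findall(r"[a-z]+", caption.lower()))
    return bool(words & CHILD_TERMS) and bool(words & SEXUAL_TERMS)

if should_delete("upload.jpg"):
    print("flagged: hallmarks of CSAM, deleting")
```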

[–] [email protected] 2 points 1 year ago

That must have been a long time ago. They have had it for ages!

[–] [email protected] 15 points 1 year ago

I think they are now x-cretes.

[–] [email protected] 33 points 1 year ago (6 children)

Is that it? No article?

[–] [email protected] 2 points 1 year ago

Oh yeah, this is all from memory.
