this post was submitted on 31 Mar 2025
17 points (100.0% liked)

Selfhosted

45394 readers
554 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues with the community? Report them using the report flag.

Questions? DM the mods!

founded 2 years ago

I already host multiple services via Caddy as my reverse proxy. For Jellyfin, I'm worried about authentication. How do you secure it?

[–] [email protected] 4 points 2 days ago (12 children)

I use good ol' obscurity. My reverse proxy requires that the correct subdomain is used to access any service I host, and my domain has a wildcard DNS entry. So if you access asdf.example.com you get an error, the same as directly accessing my IP, but going to jellyfin.example.com works. And since I don't post my valid URLs anywhere, no web scraper can find them. This filters out 99% of bots, and the rest are handled using Authelia and CrowdSec.
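A rough sketch of that catch-all setup in a Caddyfile (example.com, the Jellyfin port, and the Cloudflare DNS plugin are stand-ins, not the commenter's actual config):

```caddyfile
*.example.com {
	# Wildcard certs require the DNS challenge; this assumes the
	# Cloudflare DNS plugin is built in and a token is set.
	tls {
		dns cloudflare {env.CF_API_TOKEN}
	}

	# Only explicitly named subdomains are routed anywhere.
	@jellyfin host jellyfin.example.com
	handle @jellyfin {
		reverse_proxy localhost:8096
	}

	# Anything else (asdf.example.com, random scanner guesses, ...)
	# is dropped without a response.
	handle {
		abort
	}
}
```

The `handle` fallback with `abort` is what produces the "error on any unknown subdomain" behavior the comment describes.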

[–] [email protected] 3 points 1 day ago (1 children)

That's not how web scrapers work lol. There's no such thing as obscurity except for humans.

[–] [email protected] 1 points 1 day ago (1 children)

It seems to me that it works. I don't get any web scrapers hitting anything but my main domain, and I can't find any of my subdomains on Google.

Please tell me how you believe it works. Maybe I overlooked something...

[–] [email protected] 0 points 18 hours ago

My understanding is that scrapers don't rely on your URLs being posted anywhere: subdomains can be enumerated by brute-forcing DNS or harvested from certificate transparency logs. You're making it harder, but not impossible. Everything gets scraped.

It would be better if you also did IP whitelisting, rate limiting to slow down bots, bot detection via Cloudflare or something similar, etc.
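The IP whitelisting part could look something like this in Caddy (the subnet 203.0.113.0/24 and the domain are made-up placeholders; rate limiting and bot detection would need extra plugins or Cloudflare and aren't shown):

```caddyfile
jellyfin.example.com {
	# Drop every client outside the trusted subnet before the
	# request ever reaches Jellyfin.
	@outside not remote_ip 203.0.113.0/24
	abort @outside

	reverse_proxy localhost:8096
}
```

This layers on top of the subdomain obscurity: even a scraper that discovers the hostname gets its connection closed unless it comes from the allowed range.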
