d00phy

joined 1 year ago
 

Currently, I use dockerproxy + swag and Cloudflare for externally-facing services. I really like that I don't have to open any ports on my router for this to work, and I don't need to create any routes for new services. When a new service is started, I simply include a label so swag picks it up, and the subdomain & TLS cert are registered with Cloudflare. About the only complaint I have is Cloudflare's 100MB upload limit, but I can easily work around that, and it's not a limit I expect to hit very often.
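For context, adding a new service really is just a couple of labels in its compose entry. A rough sketch of what that looks like on my end (the exact label names depend on the swag auto-proxy mod version, so treat them as placeholders):

services:
  newservice:
    image: nginx:alpine            # stand-in for whatever the new service is
    container_name: newservice     # swag derives the subdomain from this
    labels:
      swag: enable                 # assumed label telling the auto-proxy mod to publish this container
      swag_port: "80"              # assumed label naming the internal port to proxy to
    networks:
      - proxy

networks:
  proxy:
    external: true                 # the existing network swag and dockerproxy are attached to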

What's not clear to me is what I'm missing by not using Traefik or Caddy. Currently, the only thing I don't have in my setup is central authentication. I'm leaning towards Authentik for that, and I might look at putting it on a VPS, but that's the only thing I have planned. Other than that, almost everything's running on a single Beelink S12. If I had to, I could probably stand up a failover pretty quickly, though.

[–] [email protected] 1 points 12 hours ago

Day to day is Yorkshire tea with a little sugar. Every now and then I like to mix some Darjeeling with licorice tea.

[–] [email protected] 4 points 17 hours ago

I’m surprised at the number of people who don’t know Weezer’s “Hashpipe” is about male prostitution.

[–] [email protected] 3 points 2 days ago

Stayed at Amberley Castle a couple years ago. I could go back there and never leave.

[–] [email protected] 8 points 2 days ago (2 children)

Only if your name’s “Bruce.”

[–] [email protected] 64 points 3 days ago (7 children)

American here. Can I come? Please? And stay?

[–] [email protected] 8 points 4 days ago

Damn! I missed that one. Working now. Thanks!

[–] [email protected] 1 points 4 days ago

Won’t connect on either port using http or https.

 

I've been banging my head on this for a few days now, and I can't figure it out. When I start up the immich container, I see this in docker ps:

CONTAINER ID   IMAGE                                                        COMMAND                  CREATED              STATUS                        PORTS                                                                                                             NAMES
1c496e061c5c   ghcr.io/immich-app/immich-server:release                     "tini -- /bin/bash s…"   About a minute ago   Up About a minute (healthy)   2283/tcp, 0.0.0.0:2284->3001/tcp, [::]:2283->3001/tcp                                                             immich

netstat shows that port 2283 is listening, but I cannot access http://IP_ADDRESS:2283 from a Windows, Linux, or Mac host. If I SSH in and run a browser back through that, I can't access it via localhost either. I even tried changing the port to 2284; I can see the change in the netstat and docker ps outputs, but still no luck accessing it. I also can't telnet to either port on the host. I know Immich is up because it's accessible via the swag reverse proxy (I've also tried bringing it up with that disabled). I don't see anything in the logs of any of the immich containers, or in any of the host system logs, when I try to access it.
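For reference, these are the sorts of checks I've been running from the docker host itself (CONTAINER_IP is a placeholder for whatever docker inspect reports):

# confirm what's actually listening inside the container's network namespace
docker exec immich netstat -tlnp       # or ss -tlnp, if either tool exists in the image

# hit the published port from the host itself
curl -v http://127.0.0.1:2283/

# bypass the port mapping and talk to the container directly
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}} {{end}}' immich
curl -v http://CONTAINER_IP:3001/      # 3001 being the internal port shown in docker ps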

All of this came about because I ran into the Cloudflare upload size limit and it seems I can't get around it for the strangest reason!

[–] [email protected] 1 points 6 days ago

This is really helpful. I’ll look into that. Thanks!

[–] [email protected] 1 points 6 days ago

I can upload files outside of the docroot, but if they stay there for too long, I get a nasty email from Dreamhost reminding me that this is for web space and not offsite storage (something they also sell). I haven't tried uploading something inside the docroot and just setting permissions to 400 or something!

[–] [email protected] 1 points 6 days ago

I haven't played with memory limits, but when I tried messing with bulk download of raw TIF files, it ran out of memory pretty quickly. I may look into what I can do about the limits, though.
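If I do dig into it, my understanding is that it's only a line or two in the compose file, something like this (the 2g figure is an arbitrary example, not a recommendation):

services:
  immich-server:
    image: ghcr.io/immich-app/immich-server:release
    mem_limit: 2g          # hard memory cap for the container
    memswap_limit: 2g      # set equal to mem_limit so swap doesn't mask the cap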

[–] [email protected] 1 points 1 week ago

Same. I still have a MediaWiki install on the shared hosting, but I haven't updated it in forever. For the $10.99/month I'm paying for shared hosting, I could save a little and get a more powerful VPS to host similar stuff... or just keep doing what I'm doing with my S12 Pro & Synology. Might look at some kind of failover down the road.

[–] [email protected] 1 points 1 week ago

Fair point. Currently, everything that requires off-site backup is sent to my father's Synology using Hyper Backup, so off-site is sorta self-hosted already. I was thinking in terms of a second fallback option.

 

A long, long time ago, I bought a domain or two and a shared hosting plan from Dreamhost with unlimited bandwidth/storage. I don't have root access and can't run containers on it. It's been useful for a Piwigo instance to share scanned family photos. The problem is that the plan's constrained resources really limit Piwigo's ability to handle the large TIF files involved in the archival scans. There are ways around this, but they all add time to a workflow that already eats into my free time enough. I'm looking at moving Piwigo to my local server, which has plenty of available resources. That leaves me with little reason to keep the Dreamhost space. So what's a decent use case for cheap, shared hosting space anymore?

To be clear, I'm not looking for suggestions to move to a cheap VPS. I've looked into them, and might use one in the future, but don't need one right now. The shared hosting costs about $10.99/month at the moment. If there were a way I could leverage the unlimited bandwidth/storage as offsite backup, that would be amazing, but I'm not sure it's a great idea to back things up to a webserver where the best security I can add is via an .htaccess file.
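The .htaccess side of it is at least simple; something like this dropped into the backup directory should block all web access (assuming Dreamhost is running Apache 2.4, which I haven't verified):

# .htaccess in the backup directory
Require all denied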

 

Was in DC at the end of September staying at the Waldorf (Trump’s old hotel), and saw a bunch of black SUVs with this flag in the windshield parked on the curb by an entrance not open to regular traffic. Also saw press there and some folks walking around in military uniforms. An image search suggested it might be Gabon, but that flag didn’t include the seal in the middle.

 

I currently have my home services set up in a way I like and think I understand. I have an S12 Pro with the *arr apps, Overseerr, Immich, Paperless, etc. running. The only things exposed are Immich, Paperless, and Overseerr, via swag/dockerproxy over a Cloudflare tunnel. This means I don't have to do anything on the Cloudflare end or on my router to add a new service: dockerproxy picks up a new container, and swag automatically configures a reverse proxy for it (assuming it recognizes the container, though it also supports custom configs), using the container_id as the subdomain.

I'm looking at setting up a VPS to host Authentik and Uptime Kuma (to start; maybe ntfy in the future). What I'd like to do is have the public interface of these containers use the same Cloudflare tunnel I'm currently using... or a second one, if necessary. For the interface back to my home server, I'd like to use Tailscale. I already have it running on my home server, and I expect I'll install it on the VPS. The goal is that the "public" connection uses the Cloudflare tunnel, and the backend connection runs over Tailscale.

I've tested that I can spin up swag/dockerproxy on a second box in my lab, and it will connect to Cloudflare. I have not yet tested standing up a container on that box to see whether the proxy works as expected.

So, questions:

  • Tailscale on the VPS: container or no? Obviously, if I can't install it locally, I'll put it in a container.
  • How do I configure a container to use these two networks? I'm fairly comfortable getting the Cloudflare part working, but the Tailscale part is new to me, and the documentation I've seen doesn't really cover other containers using the tailnet (see the sketch after this list for what I think it looks like).
  • Am I overthinking this? If I put these services on the tailnet alone, will the Cloudflare tunnel... tunnel back and forth to/from clients that aren't on the tailnet?
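For the second question, the pattern I keep running into is a Tailscale sidecar that the app container shares its network namespace with. A minimal sketch of what I think that looks like (env var names are from the tailscale/tailscale image as I remember them, and the authentik service is heavily simplified, so double-check both):

services:
  tailscale:
    image: tailscale/tailscale:latest
    hostname: vps-services               # name that shows up in the tailnet
    environment:
      - TS_AUTHKEY=tskey-auth-REPLACE_ME # auth key generated in the Tailscale admin console
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./ts-state:/var/lib/tailscale
    devices:
      - /dev/net/tun:/dev/net/tun
    cap_add:
      - NET_ADMIN

  authentik:
    image: ghcr.io/goauthentik/server:latest # placeholder; authentik really needs its full stack
    network_mode: service:tailscale          # share the sidecar's network so the app is reachable over the tailnet
    depends_on:
      - tailscale

The Cloudflare tunnel would still point at the container the way it does now; the tailnet side is just a second path into the same network namespace.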
 

I have the *arr stack and Immich running on a Beelink S12 Pro, based on the geekau mediastack on GitHub. Basically (and I'm sure my understanding is a bit flawed), it uses docker-proxy to detect containers and passes that to swag, which then sets up subdomains via a tunnel to Cloudflare. I have access to my services outside my LAN without any port forwarding on my router. If I'm not mistaken, that access goes over the encrypted tunnel between swag & Cloudflare (please correct me if I'm wrong).

That little Beelink is running out of resources! It's running 20 containers, and when Immich has to make any changes, it quickly runs low on memory. What I would like to do is set up a second box that also runs the same "infrastructure" containers (swag, docker-proxy) and connects to the same Cloudflare account. I'm guessing I need to set up a second tunnel? I'm not sure how to proceed.
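From what I can tell so far, creating a second named tunnel is just a few cloudflared commands on the new box (tunnel name and hostname below are placeholders):

cloudflared tunnel login                                       # authorize the new box against the Cloudflare account
cloudflared tunnel create second-box                           # creates the tunnel and a credentials file
cloudflared tunnel route dns second-box service.example.com    # point a hostname at the new tunnel
cloudflared tunnel run second-box                              # or run it via the cloudflare/cloudflared container

As far as I understand, each box would then run its own cloudflared with its own tunnel, both pointing at the same Cloudflare zone.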

129
submitted 8 months ago* (last edited 8 months ago) by [email protected] to c/[email protected]
 

Do you drink the cereal-flavored milk straight from the bowl? I grew up doing this because my parents taught me how good that milk tastes. As I’ve gotten older, I feel a little self-conscious about doing it in public. It’s not something I notice other non-children doing.

Editing to add: I do drink the milk from the bowl. As to when I'm eating it "in public": hotels, mostly. Self-conscious was probably the wrong word. I'm more wondering if people silently judge a grown person drinking cereal milk from the bowl. Not losing sleep if they do, just curious.

 

I've seen a lot of recommendations for Immich on here, so I have an idea what the answer is going to be, but I'm looking for some comparisons between it and PhotoPrism. I'm currently using Synology Photos, and I think my biggest issue is its lack of metadata management. I've gotten around that with MetaImage and NeoFinder. I'm considering moving to something not tied to the Synology environment.

 

Like the title says, what are your favorite lyrics?

I just heard one of my favorite verses today:

"In a seedy karaoke bar by the banks of the mighty Bosporus
Is a Japanese man in a business suit singing 'Smoke Gets in Your Eyes'
And the muscular cyborg German dudes dance with sexy French Canadians
While the overweight Americans wear their patriotic jumpsuits"

 

Trying to back up a TV show I have on BD to my Plex server. The other discs in the season work fine with MakeMKV, but this one will only read the deleted scenes. So far, I've tried MakeMKV and dd (which returns an I/O error on this disc). Any other thoughts?
