Server Version#: 1.26.0.5715-8cf78dab3 (via plex deb repo on Ubuntu 22.04)
I have haproxy handling all https traffic and forwarding to plex on port 32400. This works if it forwards to 127.0.0.1:32400, but if I forward to the actual machine IP address, haproxy never marks the server up.
The haproxy log says “Server be-plex-32400/gollum is DOWN, reason: Layer7 invalid response, check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.”
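For context, the failing backend was along these lines (the backend and server names come from the log message; the Plex host address is an example and the health check is an assumption):

```
backend be-plex-32400
    # a plain GET /identity is a common unauthenticated Plex health check
    option httpchk GET /identity
    # plaintext connection to the Plex machine's LAN IP -- this is what
    # Plex rejects when "Secure connections" is set to Required
    server gollum 192.168.217.50:32400 check
```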
I suspect that I need to tell the web server in plexmediaserver to allow requests from the local LAN as well as localhost, but I have no idea how to do that. I am planning to move haproxy to its own hardware and I need to have it accessing plex via the machine IP address.
I had thought there wasn’t anything in the Plex logs, but I just tried the IP change again and I see this:
May 07, 2022 10:28:31.896 [0x7f56a78dbb38] DEBUG - Request: plaintext connection from 192.168.217.200:33926 rejected because secure connections are required
So I adjusted haproxy to use SSL for the backend server connection and the message in the plex log is the same. I would like to eliminate the SSL for the backend connection. Can I adjust the plex config to allow plaintext connections from certain LAN addresses?
Easy enough to set that to preferred instead of required. But what I’d really like to do is say “Required for everyone other than this list of subnets/addresses”.
If you’re asking whether I have exposed port 32400 to the Internet, no I haven’t. At least not directly. Only ports 80 and 443 can get in from the whole Internet, and those go to haproxy. Port 80 connections only do one thing – redirect to https. Port 443 connections go to whichever webserver handles the domain in question – I have gitlab, plex, and Apache backends on the same machine as haproxy, and a couple of backends that go to completely separate servers. https://raspi.elyograg.org
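Roughly, the routing looks like this (hostnames and backend names here are simplified examples, not my exact config):

```
frontend fe-http
    bind :80
    # port 80 does exactly one thing: redirect to https
    http-request redirect scheme https code 301

frontend fe-https
    bind :443 ssl crt /etc/haproxy/certs/
    # pick a backend based on the hostname the client asked for
    use_backend be-plex-32400 if { req.hdr(host) -i plex.example.org }
    use_backend be-gitlab     if { req.hdr(host) -i gitlab.example.org }
    default_backend be-apache
```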
You might want to require secure connections. If secure connections aren’t required, it’s trivial for an “attacker” to downgrade all connections to be insecure.
Is that a realistic threat? I dunno. Up to the individual. Requiring them for external connections makes sense to me. It would be nice if the server had that option.
I agree, but since he has an HA proxy setup, I assume he has a ton of users, some of whom will eventually have Plex running on an LG TV, but yeah, ideally you want secure connections required.
Defense in depth is good! But this doesn’t keep me up at night.
If you’re comfortable using non-encrypted connections internally, the combination of not exposing Plex externally, plus a firewall rule to only allow connections from the proxy server … seems good enough?
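Something like this on the Plex machine would do it (ufw syntax as an example; the proxy addresses below are placeholders, substitute the real ones):

```shell
# allow only the proxy hosts to reach Plex directly
sudo ufw allow from 192.0.2.10 to any port 32400 proto tcp
sudo ufw allow from 192.0.2.11 to any port 32400 proto tcp
# everything else gets refused
sudo ufw deny 32400/tcp
```

ufw evaluates rules in the order they were added, so the allows have to go in before the deny.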
I just assume he’s a tinkerer who wants to play with stuff.
But regardless, the Plex setting isn’t the critical thing; if devices are hitting a proxy, that’s where certificates are provided and security settings are enforced.
With secure connections required, nothing works unless I direct the proxy to port 32400 on 127.0.0.1. Which is fine when the proxy is running on the same machine, but I want to move it off the machine to separate hardware, so I am adjusting the proxy to use the actual IP address in advance of that move.
Nothing outside of my LAN can get to port 32400 directly. The only way it is reachable from the internet is on port 443, which haproxy is encrypting with a letsencrypt certificate.
I am noticing that when I access the website from the LAN where the server lives, the client connects directly to plex and doesn't go through the proxy. I see those connections in the plex log.
If I run the plex app on my phone and turn off its wifi, media still plays. I know in that situation it’s not going directly to plex, because it can’t. It HAS to be going through haproxy on port 443.
So at this point I believe I am still secure, because haproxy will not allow insecure connections from the Internet, even though I am allowing insecure connections on port 32400.
I would feel a lot better about it if I could configure plex to only allow insecure connections from specific addresses – in my case that would be the hosts where I set up my redundant haproxy pair.
I have shared my library with a couple of people, but I don’t think they’re actually using it. At this time I don’t have a TV with a plex client, but I might someday. A TV on the local lan/wlan is even more reason to have a list of specific addresses that can connect insecurely while requiring secure connections from everywhere else.
Have you also disabled Remote Access and Enable Relay in Plex? If those are enabled that’s another way clients can access a PMS.
I agree that would be a reasonable feature for Plex to add. The server is likely capable of doing so. It would be more secure for most users than a blanket Secure connections => Preferred.
But like … why not enable Secure connections => Required? The proxy can make secure connections too.
Or don’t allow any direct access to Plex, except via the proxy.
It’s weird to be comfortable with non-encrypted traffic on a shared LAN with other devices, but also want rules around it, and also not want to enable encryption …
I tried to have haproxy use ssl on that connection. But I still got the “insecure connection rejected” message in the plex log, and the backend did not come up. Looks like I had to specify “ssl” in more than one place in the backend config. It’s working now with secure connections required.
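For anyone who finds this later, the working backend ended up roughly like this (the health-check endpoint and exact server options are a reconstruction, so adjust for your own setup):

```
backend be-plex-32400
    option httpchk GET /identity
    # "ssl" encrypts normal traffic to Plex; "check-ssl" is ALSO needed so
    # the health check itself is encrypted -- otherwise the plaintext check
    # gets rejected and the server never comes up
    # "verify none" because Plex's *.plex.direct cert won't match a raw IP
    server gollum 192.168.217.50:32400 check check-ssl ssl verify none
```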
I have been working diligently over the last couple of years to completely eliminate double encryption on all my websites. I had achieved that goal, so I really hate having to enable it again for one website.
Encryption is a worthy goal for anything that faces the Internet. But if traffic has already been encrypted while traversing public networks, why should the system incur the additional cost of re-encrypting the traffic on the backend if the backend network is private and has good physical security? On the server I am using now, I have far more CPU capacity than I really need, and the CPUs have acceleration for encryption that common encryption libraries like OpenSSL DO use. But this is not always the case. Plenty of websites are being served by less-than-adequate hardware, and encrypting everything twice might be just enough to overload that hardware.
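If you want to check whether OpenSSL is actually using the CPU's AES acceleration, one quick test is to compare throughput with the acceleration masked off (the OPENSSL_ia32cap mask below is the commonly documented one for disabling AES-NI and PCLMULQDQ; -seconds 1 just shortens the run):

```shell
# with hardware acceleration (the default on CPUs that have AES-NI)
openssl speed -seconds 1 -evp aes-256-gcm

# same test with AES-NI and PCLMULQDQ masked off; a large drop in
# throughput confirms the acceleration was being used
OPENSSL_ia32cap="~0x200000200000000" openssl speed -seconds 1 -evp aes-256-gcm
```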