Hello all, I’m having an “All remote clients stream in SD, regardless of settings” problem, which seems to be not uncommon. I want to make sure I’m not doing anything wrong and would appreciate any help.
I have a Plex Windows server, directly connected to my network at home. It’s fully accessible outside my network. No relays, no VPN, no virtualization… just a Windows Plex server. My connection was recently upgraded from 50/5 Mbps to 200/10 Mbps. I changed the remote stream quality setting from 2 Mbps to 3 Mbps, in hopes of increasing everyone’s video quality. Just as before the settings change, all remote streams are still being forced to SD.
I read several threads about how the 2 Mbps setting - even though it’s labeled HD - really isn’t, so I was disappointed when changing it to 3 Mbps didn’t fix the problem.
Please see my screenshots below… is this a Plex server issue, or do I have something configured incorrectly?
Thanks for the reply. How can you tell he’s using an indirect connection? I’m now TOTALLY confused… because he’s watching a different movie and is now getting an HD stream (it shows 3 Mbps instead of 1). Same device, same IP.
different movies can have wildly different variable bitrates.
in your first example, it looks like you have a full-bitrate blu-ray rip with 5.1 audio that is being transcoded down to fit within your 3 meg limit.
in the second example, it looks like you have a low-bitrate 1080p rip with aac stereo that is direct playing (no conversion, because it already fits within the 3 meg limit).
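to make the difference concrete, here’s a rough python sketch of that decision (illustrative only, not plex’s actual code; the file bitrates are made-up example numbers):

```python
# rough sketch of the direct-play-vs-transcode decision (not plex's
# real logic; the bitrates below are made-up example numbers)

def playback_method(file_mbps: float, limit_mbps: float) -> str:
    """direct play if the whole file fits under the per-stream cap, else transcode."""
    if file_mbps <= limit_mbps:
        return "direct play (original file, no conversion)"
    return f"transcode down to {limit_mbps} Mbps"

print(playback_method(30.0, 3.0))  # full-bitrate blu-ray rip -> transcode down to 3.0 Mbps
print(playback_method(2.5, 3.0))   # low-bitrate 1080p rip    -> direct play
```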
if you have 10 meg upload, you should simply set your upload speed to 10 (as you have) and your remote bitrate to unlimited.
this will help avoid transcoding, and let plex divide up the total bandwidth amongst multiple clients as needed.
ideally the clients should have ‘automatically adjust’ quality enabled as well.
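here’s a quick python sketch of the arithmetic, assuming an even split across active streams (plex’s real balancing is more dynamic, but the idea is the same):

```python
# illustrative only: per-stream bandwidth under a 10 Mbps upload,
# comparing a hard 3 Mbps cap against 'unlimited' (even split assumed)

def per_stream_mbps(upload_mbps: float, streams: int, cap_mbps: float | None) -> float:
    """even share of the upload pipe, clamped by the per-stream cap if one is set."""
    share = upload_mbps / streams
    return min(share, cap_mbps) if cap_mbps is not None else share

for n in (1, 2, 4):
    capped = per_stream_mbps(10.0, n, 3.0)
    unlimited = per_stream_mbps(10.0, n, None)
    print(f"{n} stream(s): 3 Mbps cap -> {capped} Mbps each, unlimited -> {unlimited} Mbps each")

# 1 stream(s): 3 Mbps cap -> 3.0 Mbps each, unlimited -> 10.0 Mbps each
# 2 stream(s): 3 Mbps cap -> 3.0 Mbps each, unlimited -> 5.0 Mbps each
# 4 stream(s): 3 Mbps cap -> 2.5 Mbps each, unlimited -> 2.5 Mbps each
```

note what the cap does: with the limit at 3, a single viewer never gets more than 3 Mbps even though the other 7 Mbps of your pipe is sitting idle.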
Thanks for the reply. So it sounds like I’ve inadvertently choked my streams with that bitrate limit… I thought it meant the maximum each client gets, so if three people are watching at the same time, 3 + 3 + 3 = 9 Mbps, which is still 1 Mbps under my max. You’re saying the server will automatically divvy up the bandwidth appropriately if multiple people are streaming at the same time? Sounds like I’ve misunderstood that setting all along.
Thanks again. So one last time to make sure I understand… if I set the bitrate to unlimited and one person is streaming, that’s theoretically 10 Mbps (since that’s my upload max). If two people watch, each stream gets 5 Mbps. If four people watch, each stream gets 2.5 Mbps. The server decides. And my mistake was setting it to 3 Mbps, which basically limited ANY stream to a max of 3 Mbps. Correct?