I have some TV channels set up and everything works perfectly other than there is no way to tell Plex the actual bitrate. So it assumes it is 40 Mbps.
So what happens on the initial click is that Plex checks whether you have more than 40 Mbps available and, if so, plays the stream. The actual average bandwidth of a 1080p TV channel is ~10 Mbps, as seen via the Plex dashboard.
The problem is that if you don’t have 40 Mbps available (cellular?), it ends up transcoding 1080p to 1080p for no reason, which creates the dreaded waiting time before changing channels. It also wastes system resources (processing and upload bandwidth). Even if you try to select “original” quality after playback has started, it just closes the stream (bug?).
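The decision described above can be sketched as follows. This is only an illustration of the observed behavior, not Plex’s actual code; the function name and the 20 Mbps “cellular” figure are assumptions:

```python
def playback_decision(assumed_stream_mbps: float, client_limit_mbps: float) -> str:
    """Sketch: since Plex can't analyze a live stream, it compares an
    *assumed* bitrate (40 Mbps by default) against the client's bandwidth
    limit, rather than the stream's real bitrate."""
    return "direct play" if assumed_stream_mbps <= client_limit_mbps else "transcode"

# A ~10 Mbps 1080p channel watched over a 20 Mbps cellular connection:
print(playback_decision(40, 20))  # assumed 40 Mbps -> "transcode" (1080p to 1080p, wasted)
print(playback_decision(10, 20))  # real 10 Mbps    -> "direct play"
```

With the real bitrate the stream would fit comfortably, which is why the transcode is pointless.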
Does anyone know if it is possible to change the default bitrate value to something more reasonable? I think this setting belongs in the channel list configuration, or perhaps as a parameter in the channel feed itself.
Server Version#: 1.28.2.6151
Player Version#: Any
Tuner Make/Model: Any
Guide/Lineup name: Any
Using XMLTV?: No
Channel number/Name: Any
I don’t have an iPhone, but almost all players have options for Home Streaming (Maximum) and Remote Streaming (Maximum). Absolutely make sure the video setting “Adjust Automatically” is off; when it’s on, it makes everything transcode.
The 40 Mbps is coming from Server Settings > Remote Access > Limit remote stream bitrate. You most likely have it set to 40 Mbps.
Yes all the settings and options on server and client are set to “original” and no auto adjust.
The bitrate limit is, like you said, at “original” as well.
One idea you gave me: set the internet upload speed to some absurdly high value.
It seems to work now. The server side reports no transcoding and an original stream. Playback info still says 40 Mbps, but I think that’s because it just defaults to the highest possible value.
Anyway, the workaround has resolved this issue, but I still think it could be handled better.
On a related topic, it’s still unfortunate that the stream is not passed directly to the device but has to consume the Plex server’s upload bandwidth, even if the bitrate is correct now. Why is Plex a middleman for an “original stream” at all?
Plex can’t perform an “Analyze” on a live stream so it guesses.
Users wanted tuner sharing; for that to happen, all connections must originate/terminate on the server. How would the stream route to the remote device without passing through the server?
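The tuner-sharing point can be sketched in a few lines. This is purely illustrative (not Plex internals): the server owns the single connection to the tuner or IPTV source and fans the bytes out to each client, so clients never talk to the source directly:

```python
def relay(source_chunks, client_sinks):
    """Illustrative relay: the server reads each chunk from the one
    tuner/source connection and copies it to every connected client,
    which is why the stream must pass through the server."""
    for chunk in source_chunks:
        for send in client_sinks:
            send(chunk)

# Two clients sharing one tuner stream:
client_a, client_b = [], []
relay([b"frame1", b"frame2"], [client_a.append, client_b.append])
```

Each client receives the full stream, but every byte also crosses the server’s upload link, which is the cost being discussed above.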
In your last dashboard, the 39 Mbps is what the device reported to the server.
You have a good point for local tuners. I was thinking more about IPTV, which has a remote URL.
As for the 39 Mbps, I’m not sure what would report that, since every Live TV source reports the same thing, even the Plex TV channels (e.g. Hallmark). None of them are actually 40 Mbps, obviously.
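One way to sanity-check a channel’s real average bitrate is to capture the stream for a fixed interval (e.g. `curl --max-time 30 <stream-url> -o sample.ts`) and divide bits by seconds. A minimal sketch; the 37.5 MB / 30 s figures below are made-up example numbers, and you would plug in the size and duration of your own capture:

```python
def average_mbps(byte_count: int, seconds: float) -> float:
    """Average bitrate in megabits per second for a captured sample."""
    return byte_count * 8 / seconds / 1_000_000

# e.g. a 30-second capture that came out to 37.5 MB:
print(average_mbps(37_500_000, 30))  # -> 10.0 Mbps, nowhere near 40
```

That matches the ~10 Mbps figure the Plex dashboard shows for a typical 1080p channel.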