If the transcoding decision is made on peak bitrate (instead of average bitrate), why not list the peak bitrate?

As the subject asks, why doesn’t Plex show us the bitrate value it’s actually using to make transcoding decisions? Isn’t this common-sense stuff? The average bitrate is good to know (I guess), but if it isn’t used for any decision-making by Plex, it’s somewhat useless. Is this something that’s been requested before?

Thanks!

PMS uses the average bitrate because it allows PMS to determine the steady-state requirement for sustaining your video stream.
It then computes an artificial 1.5x peak for bandwidth calculations to handle the case where 90% of a video is at or below the average (typical for scenes without CGI augmentation) but a sudden run of high ‘peak bitrate’ spikes arrives in rapid succession.
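
To make that concrete, here is a rough Python sketch of how such a check might look. This is purely illustrative and not PMS’s actual code; the 1.5x headroom factor is taken from the explanation above, and the function and parameter names are made up for the example.

```python
# Illustrative sketch only -- not PMS's actual implementation.
# The 1.5x headroom factor comes from the explanation above; everything
# else (names, units) is an assumption made for this example.

HEADROOM = 1.5  # assumed multiplier applied to the average bitrate

def should_transcode(avg_bitrate_kbps: float, client_limit_kbps: float) -> bool:
    """Return True if the stream, with headroom for peaks above average,
    would exceed the client's bandwidth/quality limit."""
    required_kbps = avg_bitrate_kbps * HEADROOM
    return required_kbps > client_limit_kbps

# A file averaging 20 Mbps needs roughly 30 Mbps of headroom:
print(should_transcode(20_000, 25_000))  # True  -> transcode
print(should_transcode(20_000, 30_000))  # False -> direct stream is fine
```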

The alternative is to perform an extensive ‘profiling’ of the video frame by frame and then, if transcoding is involved, constantly change the output rate to ‘normalize’ the data stream. In a LAN environment this is possible; in a WAN environment it is not. The Internet (the routers and switches between point A and point B) has inherent latency, and WAN infrastructure provisioning is not instantly adaptive. PMS must live within these constraints.

Additionally, each Plex app has a built-in buffer. This serves two purposes: a) cope with latency and packet loss in transit, and b) keep sufficient local data ready for those bursts which are above ‘average’.
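
As a toy illustration of point b), here is a tiny simulation showing how a pre-filled client buffer can absorb a short burst that sits well above both the average bitrate and the link rate. The numbers (link speed, buffer size, bitrate trace) are invented for the example and have nothing to do with any real Plex client:

```python
# Toy simulation, not Plex code: a client buffer absorbing a bitrate burst.
# All numbers are invented for illustration.

link_mbps = 12.0                       # steady rate the connection can deliver
buffer_megabits = 30.0 * 8             # 30 MB pre-buffered before playback
# Per-second bitrate of the video: mostly 10 Mbps, with a 5-second 40 Mbps burst.
video_mbps = [10.0] * 20 + [40.0] * 5 + [10.0] * 20

buffered = buffer_megabits
stalled = False
for second, rate in enumerate(video_mbps):
    buffered += link_mbps - rate       # network refills, playback drains
    if buffered < 0:                   # (ignores any cap on buffer growth)
        print(f"Stall at t={second}s: buffer exhausted")
        stalled = True
        break

if not stalled:
    print("Burst absorbed: the buffer covered the peak above the link rate")
```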

The best analogy one can draw is from fluid dynamics: pouring water through a pipe. Done suddenly, there is significant resistance if the pipe wasn’t sized correctly. Plan ahead (a funnel on the input side and a catch basin on the output side) and the water flows quickly and smoothly to its ultimate destination.

You’re the man, Chuck! Thank you for the detailed response, which makes complete sense!

So the client quality setting needs to be at least 1.5x the bitrate listed in Plex’s media info to prevent transcoding based on bitrate?

You are correct; the safest setting is one that is at least 1.5x the average bitrate of the content you are playing.
Doing so will prevent any peaks from triggering a transcode.
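
For anyone who wants to turn that rule of thumb into numbers, here is a small sketch. The quality tiers in the list are illustrative examples, not a copy of Plex’s actual settings menu, and the 1.5x factor is simply the rule from this thread:

```python
# Sketch of the rule of thumb from this thread -- not an official formula.
# The tier list is illustrative, not Plex's exact settings menu.

QUALITY_TIERS_MBPS = [2, 3, 4, 8, 10, 12, 20, 25, 40]  # assumed example tiers

def pick_quality_tier(avg_bitrate_mbps, headroom=1.5):
    """Return the lowest tier that clears avg * headroom, or None if only
    'Original/Maximum' quality would avoid a bitrate-triggered transcode."""
    needed = avg_bitrate_mbps * headroom
    for tier in QUALITY_TIERS_MBPS:
        if tier >= needed:
            return tier
    return None

print(pick_quality_tier(6.0))   # needs 9 Mbps  -> 10
print(pick_quality_tier(24.0))  # needs 36 Mbps -> 40
```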

On the LAN, I strongly recommend you do not ‘micromanage’ bitrates. If your LAN can’t handle normal full-bitrate playback (with peaks of up to 100 Mbps), I respectfully suggest you update your equipment :slight_smile:

Thanks again, Chuck! This is something that’s bugged me for a while, and I’m glad I finally asked the question and got it figured out.

Best Regards!

Keep in mind that the 1.5x average is an estimate and may have no relation to the actual peak bitrate. I’ve seen files where the peak bitrate is 4 times the average, though this is quite rare.

Another reason why the peak bitrate is not shown by default is that there isn’t a single value. As mentioned previously, clients have a buffer, and this helps to smooth out peaks in the bitrate. If the client has a larger buffer, it can smooth out peaks even more. The end result is that there are several possible peak bitrates depending on the size of the client buffer. You can see more info here: https://forums.plex.tv/discussion/comment/1318493/#Comment_1318493
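
A quick way to see why “the” peak bitrate isn’t a single number is to measure the peak over different window lengths, which is effectively what a buffer of a given size does. This is just a toy illustration with a made-up bitrate trace, not how Plex measures anything:

```python
# Toy illustration with an invented per-second bitrate trace -- not a Plex
# measurement. The observed "peak" shrinks as the averaging window (buffer) grows.

per_second_mbps = [10] * 10 + [45, 50, 48] + [10] * 10

def peak_over_window(trace, window_seconds):
    """Highest average bitrate seen over any window of the given length."""
    return max(
        sum(trace[i:i + window_seconds]) / window_seconds
        for i in range(len(trace) - window_seconds + 1)
    )

for w in (1, 5, 10):
    print(f"{w:2d}s window -> peak {peak_over_window(per_second_mbps, w):.1f} Mbps")
```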