Let's say I have my upload speed set to 20 Mbps. Plex will use 80% of that, in other words 16 Mbps. I'll limit my remote stream bitrate to 8 Mbps, meaning I can stream 2 remote streams at 8 Mbps each.
This works great.
What I don't get, though, is that even if the source file's bitrate is only around 2 Mbps, Plex will stream it out at the maximum 8 Mbps. I am testing with 2 movies: one is 3.5 Mbps, the other is 3.33 Mbps. Plex is streaming both at 8 Mbps.
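To spell out the math I'm working from (the numbers are just my settings, and the 80% figure is what I've observed, not anything official):

```python
upload_mbps = 20           # my configured upload speed
plex_share = 0.80          # the ~80% Plex seems to use (my observation)
per_stream_mbps = 8        # my remote stream bitrate limit

budget_mbps = upload_mbps * plex_share              # 16 Mbps available
max_streams = int(budget_mbps // per_stream_mbps)   # 2 concurrent streams

print(f"{budget_mbps:.0f} Mbps budget -> {max_streams} remote streams")
```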
When Plex is sending data, it always sends in chunks to fill the player's playback buffer.
If your upload speed is 20 Mbps but the video is 4 Mbps, you'll see a gap between the 'send' blocks.
When you send a video with a lower bitrate, the gap is bigger.
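Here's a toy model of that gap. The chunk size (how many seconds of playback get sent per refill) is a made-up assumption; the point is just the relationship between bitrate and idle time:

```python
link_mbps = 20       # upload speed
video_mbps = 4       # actual video bitrate
chunk_seconds = 10   # seconds of playback per buffer refill (assumed)

send_time = video_mbps * chunk_seconds / link_mbps  # time spent bursting
gap_time = chunk_seconds - send_time                # time spent idle

print(f"burst for {send_time:.1f}s, then quiet for {gap_time:.1f}s")
# Lower video_mbps and the gap grows; at video_mbps == link_mbps
# the gap hits zero and sending becomes continuous.
```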
This also happens on my server.
My upload is 235 Mbps.
I limit my upload usage to 200.
When someone plays a video, I see it go out as a burst at the full 200 Mbps.
It will then sit there quietly until the player asks to be refilled.
When that happens, the next burst will get sent out.
This repeats for every video being played until done.
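Back-of-the-envelope, that means the link is mostly idle per stream (the per-stream bitrate here is just an example, not one of my actual files):

```python
link_mbps = 200     # my capped upload
video_mbps = 8      # an example per-stream bitrate (assumed)

duty_cycle = video_mbps / link_mbps  # fraction of time spent bursting
print(f"link is busy ~{duty_cycle:.0%} of the time for this stream")
```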
If you conduct a test of one playback, even on your LAN, you can see PMS send a burst of data at the beginning.
Then it sends in bursts, only as needed, to keep the player's buffer filled.
When you have a gigabit LAN and 20 Mbps videos, it's easy to see.
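If you want to watch the bursts yourself, here's a rough sketch that samples the OS network counters once a second while something plays. It uses the third-party psutil package and counts all traffic leaving the machine, so keep other transfers quiet during the test:

```python
import time

import psutil  # third-party: pip install psutil

prev = psutil.net_io_counters().bytes_sent
for _ in range(60):  # watch for a minute
    time.sleep(1)
    cur = psutil.net_io_counters().bytes_sent
    print(f"{(cur - prev) * 8 / 1e6:7.1f} Mbps out")
    prev = cur
```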
That's because most video files out there don't have a constant bitrate in their video stream.
What you see in the media info and in most players is just the average bitrate, and that can't tell you what bitrate you actually need to stream the file without buffering pauses.
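One way to see this for yourself, assuming you have ffprobe installed: sum up the video packet sizes for each second of playback and compare the peaks against the average ("movie.mkv" is a placeholder path):

```python
import subprocess
from collections import defaultdict

# Dump (timestamp, size) for every video packet in the file.
cmd = [
    "ffprobe", "-v", "error", "-select_streams", "v:0",
    "-show_entries", "packet=pts_time,size", "-of", "csv=p=0",
    "movie.mkv",
]
out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

per_second = defaultdict(int)  # bits of video per playback second
for line in out.splitlines():
    if not line:
        continue
    pts, size = line.split(",")[:2]
    if pts == "N/A":
        continue
    per_second[int(float(pts))] += int(size) * 8

avg = sum(per_second.values()) / len(per_second) / 1e6
peak = max(per_second.values()) / 1e6
print(f"average ~{avg:.1f} Mbps, but 1-second peak ~{peak:.1f} Mbps")
```

On a typical VBR file the 1-second peak can be a multiple of the average, which is why streaming at just the average bitrate can still stall.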