Server Version#: 1.19.2.2737
Player Version#: not relevant
Hi, I installed a bandwidth-monitoring app on my Windows 7 server. Everything else I do on the web with that machine shows up as expected.
When a user watches something, I can see the bandwidth used by that Plex user in real time. What I observe is that when a Plex user streams something, download and upload bandwidth are roughly equal for the entire duration of the stream. That seems odd, since the player shouldn't need to send that much data back to the server, so I'm wondering why that is.
I'd expect the server's upload bandwidth to be much higher than its download bandwidth.
I have an old Windows 7 app that monitors network activity, and it shows hardly any activity on the "down" side, whether the stream is local or remote. That app only shows activity for the computer it's installed on, not the entire local network.
If your app is monitoring the entire local network rather than just the server, and you are streaming locally, I would expect the rates to be the same: the server's upload is the player's download, so the LAN-wide totals match.
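A toy sketch (in Python, with made-up bitrate and duration numbers, nothing Plex-specific) of why a LAN-wide monitor shows symmetric up/down traffic during a local stream, while a server-only monitor shows the expected asymmetry:

```python
# Hypothetical example: one server streaming to one local player.
# All numbers are invented for illustration.

STREAM_BITRATE_MBPS = 10        # assumed video bitrate
DURATION_S = 60                 # assumed sample window in seconds

stream_bytes = STREAM_BITRATE_MBPS * 1_000_000 // 8 * DURATION_S

# Per-host byte counters on the LAN (sent vs. received):
hosts = {
    "server": {"up": stream_bytes, "down": 0},   # serves the video
    "player": {"up": 0, "down": stream_bytes},   # receives the video
}

# A monitor that sums every host on the LAN sees equal totals,
# because the server's upload IS the player's download.
lan_up = sum(h["up"] for h in hosts.values())
lan_down = sum(h["down"] for h in hosts.values())
print(lan_up == lan_down)                 # → True

# A monitor scoped to the server alone shows the asymmetry the
# original poster expected: lots of upload, almost no download.
print(hosts["server"]["up"], hosts["server"]["down"])  # → 75000000 0
```

This is why the two apps disagree: the old Windows 7 app counts only the server's own interface, while a whole-network monitor double-counts each stream once as upload and once as download.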