@SiscoPlex said:
Simple… test it yourself! None of my files require a transcode, so I can’t definitively answer that. But again, you could have already supplied yourself with an answer and an update in the last 3 hours instead of just looking for someone to answer this for you. My 16 year old son would also like me to give him all the answers to his homework, but I won’t.
Anyway, it also depends on your CPU; your results will relate directly to that. So, again, test it in your environment and see for yourself the exact percentage your CPU will require.
Since there are many CPUs on the market, this will be specific to you and a percentage of the users on this forum… not a cut-and-dried standard across the board.
TL;DR: I believe that Plex changes the quality/speed setting on the transcoder, which makes it difficult to compare CPU usage across different files.
It is still not as simple as you are claiming. I have, in fact, tried monitoring my CPU usage stats while various files were transcoding: a high bitrate 1080p file, a low bitrate 1080p file, a high bitrate 720p file, and a low bitrate 720p file. CPU usage seems to be about the same in every case: it sits at 10-20% for several seconds, spikes up to 70-80% for a few seconds, then drops back down to 10-20% for a few seconds, and the cycle repeats.
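For anyone who wants to reproduce this, here is a minimal sketch of how usage can be sampled, in Python with psutil. The sample count and interval are arbitrary choices on my part, nothing Plex-specific; start a transcode, then run this alongside it:

```python
# Rough sketch: sample overall CPU usage once per second while a
# transcode is running, to see the idle/spike cycle described above.
# Assumes psutil is installed (pip install psutil).
import time
import psutil

SAMPLES = 60  # one minute of readings

for _ in range(SAMPLES):
    # cpu_percent(interval=1) blocks for 1 second and returns the
    # average utilization across all cores over that second
    usage = psutil.cpu_percent(interval=1)
    print(f"{time.strftime('%H:%M:%S')}  {usage:5.1f}%")
```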
Now, obviously there is no way that a high bitrate 1080p file and a low bitrate 720p file require the same number of CPU cycles to transcode. What this says to me is that the Plex transcoder is monitoring its own CPU usage and throttling the encoding speed/quality so as not to completely take over the CPU, though I could be wrong about this. If that is the case, then it is very difficult to compare apples to apples here, because Plex may choose to encode one file on a “fast” setting and another on a “slow” setting based on the resources available.
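Plex’s transcoder is built on ffmpeg, so stock ffmpeg’s x264 presets illustrate the kind of speed/quality knob I suspect is being turned. This is an illustration only, not a claim about Plex’s actual settings; the file names are placeholders:

```python
# Time the same encode at two x264 preset levels with stock ffmpeg,
# holding bitrate constant, to show the speed/quality trade-off.
# Assumes ffmpeg with libx264 is on the PATH.
import subprocess
import time

SOURCE = "sample_1080p.mkv"  # placeholder input file

for preset in ("veryfast", "slow"):
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", "libx264", "-preset", preset,
         "-b:v", "4M",   # hold the video bitrate constant...
         "-an",          # ...and skip audio for a cleaner test
         f"out_{preset}.mkv"],
        check=True,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    print(f"preset={preset}: {time.time() - start:.1f}s")
```

The slower preset burns far more CPU time for the same bitrate but yields better quality, which is exactly why a throttling transcoder would make file-to-file CPU comparisons misleading.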
So while CPU utilization looks similar when transcoding ONE high bitrate 1080p file vs. one low bitrate 720p file, the theoretical limit on the number of simultaneous transcodes is still much lower with high bitrate 1080p files. That makes the question much more difficult to test. My goal is to increase the number of simultaneous transcoding sessions my CPU can handle while keeping a respectable level of quality.
I appreciate you taking the time to respond to these questions, but your condescending tone is not required. The reason I made my original question a ‘hypothetical’ is that a full description of a real-world setup is a lot to read when the discussion is really about the actual algorithm used when transcoding video files, which I am not an expert in, and most of the details above are irrelevant to that discussion. It is difficult to remove all of the variables involved in a real-world situation. Furthermore, it should not really matter which CPU is used, because if a certain CPU has a limited instruction set, it would be equally inefficient at the same calculations at both 720p and 1080p. What I am trying to discuss here is the DIFFERENCE between the computations required to transcode 720p vs. 1080p at similar bitrates.
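To put a rough number on that difference: at the same frame rate, 1080p carries 2.25× the pixels of 720p, and per-frame work like motion estimation and DCT scales at least roughly with pixel throughput. A back-of-the-envelope comparison (encoding cost is not strictly linear in pixel count, and this says nothing about Plex’s exact transcoder):

```python
# Back-of-the-envelope pixel-rate comparison between 720p and 1080p.
FPS = 24  # assume the same frame rate for both files

pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080

print(f"720p : {pixels_720p * FPS:,} pixels/s")
print(f"1080p: {pixels_1080p * FPS:,} pixels/s")
print(f"ratio: {pixels_1080p / pixels_720p:.2f}x")  # 2.25x
```

That 2.25× factor is also why the simultaneous-session ceiling drops so much faster for 1080p sources, even when a single transcode of either kind shows similar utilization.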