Transcoding resources required: Bitrate vs. Resolution

Hypothetically speaking, let’s say I have two video files with the same bitrate but different resolutions (i.e. the same movie encoded at 1080p and at 720p, both 4 GB). Which one would require more server resources (mainly CPU usage) to transcode? Or are transcoding resources based only on bitrate (meaning the load would be equivalent for both)?

My intention here is not to get into a discussion about which file would have better perceived quality; it’s about minimizing the server resources required.

Thank you!

1080p requires more horsepower to transcode.
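Rough intuition for why, assuming the encoder’s work scales with pixels rather than bits: a 1080p frame is 1920 × 1080 = 2,073,600 pixels, while a 720p frame is 1280 × 720 = 921,600, so at the same bitrate the 1080p file pushes about 2.25× as many pixels per frame through motion estimation and the transforms.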

I think you should move the mean end of that pistola away from your shoes - you don’t want to shoot yourself in the foot.

If you’re gleefully letting Plex transcode all your material it doesn’t matter if it’s 1080 or 720 because by the time Plex is done with it it’s gonna look like 480.

What you should be doing is finding out why it’s transcoding and correcting that problem. Direct Play (or Direct Stream, which copies the video untouched) looks exactly like it did when it left its storage location and is a lot easier on the furniture (CPU).

Not to hijack, but how do you find out if Plex is transcoding? If Direct Play is enabled, how can you be sure?

If you look at the Status page and click on the little “Info” button on one of the streams, it will tell you whether it is Direct Play, or the actual codecs being used for transcoding.

You can also see what’s happening in some Plex apps. If we knew which ones you use, we could point you at them, but Plex Web will let you know what’s going on.

Status on the Web App can show you what is playing. Then click the small “I” in the circle and it tells you details of the stream.

Or install PlexPy and you can get the same information, as well as track it as a long-term history, so you can go back to a movie PlexPy logged you as watching and see whether it was transcoded. PlexPy doesn’t track the REASON something transcodes, but it does track that it was transcoded.

In my experience, H.264 video with AAC audio in an MP4 container will typically Direct Play on nearly all devices… seeing as I don’t own every device out there, I can’t say there aren’t a few that won’t.

@SiscoPlex said:
In my experience, H.264 video with AAC audio in an MP4 container will typically Direct Play on nearly all devices… seeing as I don’t own every device out there, I can’t say there aren’t a few that won’t.

Thank you for your response, but this is not really relevant to my original question. My question is not about when files Direct Play… in the instances when a file DOES need transcoding, would 720p and 1080p files with the same bitrate require a similar amount of CPU power to transcode?

Yes, lower bitrate = less intensive… and worse quality. Although a correct encode requires no transcode and gives better quality.

@SiscoPlex said:
Yes, lower bitrate = less intensive… and worse quality. Although a correct encode requires no transcode and gives better quality.

Yes, I understand this… my question is about when 720p and 1080p files have equivalent bitrates and the server is forced to transcode. Would the 1080p file require more resources than the 720p file, even though they have the same bitrate? If so, does the CPU requirement go up by a small amount (i.e. 10%), or by several times (i.e. 4x the CPU)?

Simple… test it yourself! None of my files require a transcode, so I can’t definitively answer that. But again, you could have already supplied yourself with an answer and an update in the last 3 hours instead of just looking for someone to answer this for you. My 16-year-old son would also like me to give him all the answers to his homework, but I won’t.

Anyway, it also depends on your CPU; the result is directly tied to that as well. So, again, test it in your environment and see for yourself the exact CPU percentage it will require.

Since there are many CPUs on the market, this will be specific to you and a fraction of the users on this forum… not a cut-and-dried standard across the board.

@SiscoPlex said:
Simple… test it yourself! None of my files require a transcode, so I can’t definitively answer that. But again, you could have already supplied yourself with an answer and an update in the last 3 hours instead of just looking for someone to answer this for you. My 16-year-old son would also like me to give him all the answers to his homework, but I won’t.

Anyway, it also depends on your CPU; the result is directly tied to that as well. So, again, test it in your environment and see for yourself the exact CPU percentage it will require.

Since there are many CPUs on the market, this will be specific to you and a fraction of the users on this forum… not a cut-and-dried standard across the board.

TL;DR: I believe Plex changes the quality/speed setting on the transcoder, which makes it difficult to compare CPU usage across different files.

It is still not as simple as you are claiming. I have, in fact, tried monitoring my CPU usage while various files were transcoding: a high-bitrate 1080p file, a low-bitrate 1080p file, a high-bitrate 720p file, and a low-bitrate 720p file. CPU usage looks the same for all of them… it sits at 10-20% for several seconds, spikes to 70-80% for a few seconds, then drops back to 10-20%, and the cycle repeats.
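(For anyone who wants to repeat this, here is a minimal sketch of how I sampled it, assuming a Linux server with the sysstat tools installed; “Plex Transcoder” is the process name on my machine and may differ on yours:)

```bash
# Sample the Plex transcoder's CPU usage once per second (Ctrl-C to stop).
# Requires the sysstat package; adjust the process name if your server
# names the transcoder differently.
pidstat -u -p "$(pgrep -f 'Plex Transcoder' | head -n1)" 1
```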

Now, obviously there is no way that a high-bitrate 1080p file and a low-bitrate 720p file require the same number of CPU cycles to transcode. What this says to me is that the Plex transcoder monitors its CPU usage and throttles its encoding speed/quality so as not to completely take over the CPU, though I could be wrong about this. If so, it is very difficult to compare apples to apples here, because Plex may choose to encode one file on a “fast” setting and another on a “slow” setting depending on the resources available.

So while CPU utilization looks similar when transcoding ONE high-bitrate 1080p file vs. one low-bitrate 720p file, the theoretical limit on the number of simultaneous transcodes is still much lower with high-bitrate 1080p files, which makes the question much harder to test. My goal is to raise the number of simultaneous transcoding sessions my CPU can handle while keeping a respectable level of quality.

I appreciate you taking the time to respond to these questions, but your condescending tone is not required. I framed my original question as a ‘hypothetical’ because in a real-world situation it is difficult to remove all of the variables involved; the discussion is really about the algorithm used when transcoding video files, which I am not an expert in, and a lot of the details above are largely irrelevant to it. Furthermore, it should not really matter which CPU is used: if a certain CPU has a limited instruction set, it would be equally inefficient at the relevant calculations at both 720p and 1080p. What I am trying to pin down is the DIFFERENCE between the computations required to transcode 720p vs. 1080p at similar bitrates.

I don’t work for Plex. I try to help people on my own time, so I can say what I like.

According to your take on Plex’s transcoder and how it works, it sounds like no one will have to upgrade to a Kaby Lake or equivalent processor to transcode high-bitrate 4K video either… Plex will just handle it… lolol

This truly is great news for all of us.

@SiscoPlex said:
I don’t work for Plex. I try to help people on my own time, so I can say what I like.

According to your take on Plex’s transcoder and how it works, it sounds like no one will have to upgrade to a Kaby Lake or equivalent processor to transcode high-bitrate 4K video either… Plex will just handle it… lolol

This truly is great news for all of us.

I never implied that all processors can handle high-bitrate transcoding in real time. What I said is that the number of computations required for a transcoding session is variable, up to a point. FFmpeg has several encoding speed presets to choose from; for H.264 they literally range from “ultrafast” to “veryslow”. Official documentation here: Encode/H.264 – FFmpeg
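To illustrate with plain ffmpeg (a sketch, not what Plex runs internally; the file names and bitrate are made up): the same source, the same target bitrate, very different CPU cost:

```bash
# Same target bitrate, very different CPU cost per frame.
# "veryfast" trades compression efficiency for speed; "veryslow" the reverse.
ffmpeg -i movie.mkv -c:v libx264 -preset veryfast -b:v 4M -c:a copy out_fast.mkv
ffmpeg -i movie.mkv -c:v libx264 -preset veryslow -b:v 4M -c:a copy out_slow.mkv
```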

Kaby Lake processors have HARDWARE transcoding built in, which is a different beast altogether. These processors can handle 8 simultaneous 4K streams without using ANY CPU cycles. If such a CPU were given more than 8 simultaneous transcoding sessions, it would have to fall back to SOFTWARE transcoding (which is what we are talking about in this thread) and would then face the same constraints as “normal” CPUs. Here is another FFmpeg page on hardware acceleration: HWAccelIntro – FFmpeg
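For the curious, a minimal Quick Sync invocation looks something like this (assuming an Intel CPU with Quick Sync and an FFmpeg build compiled with QSV support; file names are placeholders):

```bash
# Decode and re-encode H.264 on the iGPU via Quick Sync (QSV),
# leaving the CPU cores nearly idle.
ffmpeg -hwaccel qsv -c:v h264_qsv -i input.mkv \
       -c:v h264_qsv -b:v 4M -c:a copy output.mkv
```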

I guess we will have to agree to disagree that it is not hardware-reliant… the simple equation is a CPU benchmark score of approximately 2000 per 1080p transcode; this is well documented. I will now bow out of your thread and leave you to mull that over. I am off to help others who actually need help.
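P.S. To put a number on that rule of thumb: if a single 1080p software transcode needs a benchmark score of roughly 2000, then a CPU scoring about 8000 would top out around 8000 / 2000 = 4 simultaneous 1080p transcodes, assuming the load scales linearly.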

@JuiceWSA said:
If you’re gleefully letting Plex transcode all your material it doesn’t matter if it’s 1080 or 720 because by the time Plex is done with it it’s gonna look like 480.

In my experience transcoding 1080p streams (forced by subtitles), this is not always true. Quality is much worse with the standard transcoder setting, but if I raise the transcoder quality (and not even to “make my CPU burn”), there’s no such huge drop in quality.

@zpaolo11x said:

@JuiceWSA said:
If you’re gleefully letting Plex transcode all your material it doesn’t matter if it’s 1080 or 720 because by the time Plex is done with it it’s gonna look like 480.

In my experience transcoding 1080p streams (forced by subtitles), this is not always true. Quality is much worse with the standard transcoder setting, but if I raise the transcoder quality (and not even to “make my CPU burn”), there’s no such huge drop in quality.

I’ll take your word for it. I make all my stuff Direct Play because I saw what Plex did in a transcode once and didn’t care for it. That was the end of that nonsense.

@djshawnee - To my knowledge, there is nothing magical about the Plex transcoder’s performance between 720p and 1080p where the bitrate is the same. Just fire up ffmpeg and run a couple of prepared videos through it to find out. It may take a while to create media at the same bitrate with differing resolutions, but if you take the same 10-minute clip, all other things being equal, convert the video streams, and time them, it should give you a close approximation of the difference.
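Something along these lines would do it (a sketch; file names, clip length, and bitrates are placeholders; the -benchmark flag prints ffmpeg’s own timing stats, and “-f null -” throws the output away so only the compute is measured):

```bash
# Prepare two 10-minute clips from the same source at the same video bitrate.
ffmpeg -i source.mkv -t 600 -vf scale=-2:1080 -c:v libx264 -b:v 4M -an clip_1080.mkv
ffmpeg -i source.mkv -t 600 -vf scale=-2:720  -c:v libx264 -b:v 4M -an clip_720.mkv

# Transcode each the same way and compare the reported times.
time ffmpeg -benchmark -i clip_1080.mkv -c:v libx264 -b:v 2M -f null -
time ffmpeg -benchmark -i clip_720.mkv  -c:v libx264 -b:v 2M -f null -
```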

I would also like to know the answer to the original question. I love how much nonsense one has to put up with on this forum just to get a discussion going :slight_smile:

Just so my own post isn’t completely useless: one would imagine that going from one resolution to another takes some processing power on its own, thanks to the scaling that has to be done, no? I’m just speculating, but if so, a 1080p file could be harder to transcode to 720p than a 720p file of the same bitrate would be, while transcoding to 1080p might take longer from the 720p file than from the 1080p one. See what I mean? I hope some more knowledgeable people chime in, or at least that some discussion can be had about this.
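If anyone wants to test just the scaling part of that speculation, ffmpeg can isolate it (a sketch; use any 1080p file, the name here is a placeholder). Neither run encodes anything, so the difference between the two timings approximates the scaler’s cost:

```bash
# Decode only, discard the frames: the baseline cost.
time ffmpeg -i clip_1080.mkv -f null -

# Decode plus downscale to 720p, discard the frames: baseline + scaler cost.
time ffmpeg -i clip_1080.mkv -vf scale=-2:720 -f null -
```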