The lower the transcoding bitrate, the higher my server’s CPU usage

Server Version: 1.17.0.1841 (Running on Rasplex)
Player Version: 1.3.1.916-1cb2c34d (Plex for Windows)

I have noticed that when transcoding from H.264 1080p to H.264 720p, the lower the bitrate of the output, the higher my server’s CPU usage gets.

For example, if I transcode to 720p 4 Mbps, this will use MUCH less CPU than if I transcode to 720p 2 Mbps.

Is this to be expected?

I suppose it might be relevant to mention that the source files have bitrates lower than 4 Mbps, usually around 2.5 Mbps.
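In case anyone wants to reproduce this: Plex’s transcoder is ffmpeg-based, so plain ffmpeg with libx264 should show the same pattern. A minimal sketch, assuming ffmpeg is installed on a Linux box (the file name is just a placeholder):

```python
import resource
import subprocess

# Hypothetical 1080p H.264 source file; any will do.
SOURCE = "sample_1080p.mkv"

def cpu_seconds(bitrate: str) -> float:
    """Transcode SOURCE to 720p at the given bitrate and return the
    CPU time (user + system) the child ffmpeg process consumed.
    Uses RUSAGE_CHILDREN, so this is Unix-only."""
    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-vf", "scale=-2:720",      # downscale to 720p, keep aspect ratio
         "-c:v", "libx264", "-b:v", bitrate,
         "-an", "-f", "null", "-"],  # video only, discard the output
        check=True, capture_output=True,
    )
    after = resource.getrusage(resource.RUSAGE_CHILDREN)
    return ((after.ru_utime - before.ru_utime)
            + (after.ru_stime - before.ru_stime))

for rate in ("4M", "2M"):
    print(f"{rate}: {cpu_seconds(rate):.1f} s of CPU time")
```

Writing to `-f null -` discards the output file, so the timing reflects only the decode/scale/encode work rather than disk I/O.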

And while I’m at it, I have a very newbie question.
If I have a 2.5 Mbps 1080p source and I select to convert to 720p 4 Mbps, is it actually a 4 Mbps stream? Or does it cap at 2.5 Mbps because of the source?
It’s obvious to me that converting 1080p 2.5 Mbps to 1080p 4 Mbps wouldn’t magically increase the quality, but since it’s downscaling to 720p, I don’t know if that somehow allows a higher bitrate.
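One way to answer this empirically is to ask ffprobe what the transcode actually came out at. A minimal sketch (both file names are hypothetical):

```python
import subprocess

def overall_bitrate(path: str) -> float:
    """Return the container-level bitrate of `path` in Mbps, via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format=bit_rate",
         "-of", "default=noprint_wrappers=1:nokey=1",
         path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return int(out) / 1_000_000

# Compare the ~2.5 Mbps source against a copy "transcoded to 4 Mbps".
for f in ("source_1080p.mkv", "converted_720p_4M.mkv"):
    print(f, f"{overall_bitrate(f):.2f} Mbps")
```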

Makes sense to me. Transcoding to a lower bitrate means it needs to do more compression, which means higher CPU usage.


That is expected.


Thanks, both of you.

And regarding the second part of my question about bitrates, I have a follow-on question too.
I’m assuming that when transcoding from 1080p to 720p, targeting a bitrate higher than the source’s makes no difference, regardless of resolution.

But I’ve also just realised I’m not sure it would even make sense to transcode to the same bitrate. How can I take a 1080p source, downscale it to 720p, and keep the same bitrate? Or is that possible?
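For what it’s worth, doing the bits-per-pixel arithmetic suggests it is possible; the frame rate below is an assumption (~24 fps, typical for film sources):

```python
# Rough bits-per-pixel comparison at the same 2.5 Mbps bitrate.
BITRATE = 2_500_000  # bits per second
FPS = 24             # assumed frame rate

for name, w, h in [("1080p", 1920, 1080), ("720p", 1280, 720)]:
    bpp = BITRATE / (w * h * FPS)
    print(f"{name}: {bpp:.3f} bits per pixel")

# 1080p: ~0.050 bpp, 720p: ~0.113 bpp -- so keeping the same bitrate
# while downscaling is possible; each pixel just gets more bits.
```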
