How do I know if I will have enough transcoding performance for high-bitrate sources?

Server Version#: 1.41.0.8992
Player Version#: 4.136.1

I’m a long-time Lifetime user and I’ve gone through several PC upgrades in the meantime, but I was always in the dark about whether the next CPU upgrade would bring me enough performance to handle any transcoding needs while I’m gaming on my PC at the same time.

I can finally say that thanks to the AMD Ryzen 9 7950X3D I can game without my FPS cratering, though I still resent that, for the first time, I had to buy a CPU with more than 8 cores! :smiley:

People have told me that I should switch my GPU from AMD (Radeon RX 7900 XT 20 GB) to an Nvidia card, but when I ask whether the roughly equivalent Nvidia RTX 4070 Ti Super could handle real-time transcoding of a high-bitrate Blu-ray rip (imagine HDR10 UHD at maybe 13,000 kbps, maybe 20,000 kbps), no one can answer me.
I know that the 7950X3D can handle it with acceptable FPS loss (I remember testing it on The Last Of Us Part 1 at 3440x1440 Ultra, Depth of Field and Motion Blur both off, with FSR 2.2 enabled to make the test more CPU-bound, losing on average 20 FPS, from 90 down to 70). But the fact of the matter is, there’s no way to benchmark this, is there?
We don’t know how much gaming performance I would lose if I had the 7950X non-3D.
We don’t know how much gaming performance I would lose if I had the 7800X3D.
We don’t know how much gaming performance I would lose if I had the 4070 Ti Super.

So in the end I got lucky that the 7950X3D is good enough for my needs, whereas my previous 5800X failed horribly whenever my wife started a movie that my CPU had to transcode while I was gaming.

So the question remains: how do I know if I will have enough transcoding performance for high-bitrate sources?
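
The closest thing I can think of is approximating the transcoding half myself with a script. Here’s a rough sketch, assuming ffmpeg is installed and on PATH (the sample filename and stream count are placeholders I made up): it spawns a few concurrent 4K-to-1080p software transcodes, roughly what Plex does per remote viewer, so I can watch what happens to my in-game FPS while it runs:

```python
# Rough sketch, not a definitive benchmark: run N concurrent software
# transcodes of a short UHD HDR10 clip while gaming and watch the FPS.
# Assumes ffmpeg is on PATH; "uhd_sample.mkv" is a placeholder for a
# short clip taken from one of your own rips.
import subprocess
import time

SAMPLE = "uhd_sample.mkv"   # placeholder: a ~60 s UHD HDR10 clip
CLIP_SECONDS = 60           # transcode only the first minute
STREAMS = 2                 # simulated concurrent viewers

def start_transcode() -> subprocess.Popen:
    # 4K -> 1080p H.264 software encode, output discarded (null muxer);
    # this approximates the CPU cost of one Plex software transcode.
    return subprocess.Popen(
        ["ffmpeg", "-nostats", "-loglevel", "error",
         "-i", SAMPLE, "-t", str(CLIP_SECONDS),
         "-vf", "scale=1920:-2",
         "-c:v", "libx264", "-preset", "veryfast",
         "-f", "null", "-"])

start = time.time()
procs = [start_transcode() for _ in range(STREAMS)]
for p in procs:
    p.wait()
elapsed = time.time() - start

# Each stream must transcode faster than realtime (>= 1.0x) or
# playback would buffer for the viewer.
print(f"{STREAMS} stream(s) took {elapsed:.1f}s for {CLIP_SECONDS}s of video")
print(f"~{CLIP_SECONDS / elapsed:.2f}x realtime per stream")
```

If each stream stays faster than realtime while the game is still playable, the CPU has headroom; I can keep raising the stream count until it doesn’t.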

I also ultimately don’t want to leave AMD graphics cards; I’m saving a LOT of money by not paying Nvidia’s ridiculous monopoly prices and their ridiculous ray-tracing marketing FOMO (that’s Fear of Missing Out). But if there is a way to benchmark the 4070 Ti Super, and to see whether I would be better off with it and a 7800X3D, then I want to know. Knowledge is power.

Thank you!

If you use HW transcoding with Plex, it barely loads the GPU at all, because the work runs on the dedicated decode/encode blocks rather than the 3D engine. Even the bottom-end cards will handle multiple 4K streams with ease.
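
You can see this for yourself with a sketch like the one below (assuming an nVidia card and an ffmpeg build with NVENC; the filenames are placeholders, and this is an approximation, not Plex’s internal pipeline). Run it while watching nvidia-smi or Task Manager’s Video Encode/Decode graphs: the 3D engine stays nearly idle because the dedicated NVDEC/NVENC blocks do the work.

```python
# Minimal sketch (not Plex's internal pipeline): one fully GPU-side
# 4K -> 1080p hardware transcode via ffmpeg/NVENC. Watch nvidia-smi or
# Task Manager alongside it: the 3D engine stays nearly idle while the
# dedicated NVDEC/NVENC blocks handle decode and encode.
# "uhd_sample.mkv" is a placeholder for one of your own rips.
import subprocess

subprocess.run(
    ["ffmpeg", "-y",
     "-hwaccel", "cuda", "-hwaccel_output_format", "cuda",  # decode on GPU
     "-i", "uhd_sample.mkv",
     "-vf", "scale_cuda=1920:1080",                         # scale on GPU
     "-c:v", "h264_nvenc", "-preset", "p4",                 # encode on NVENC
     "-c:a", "copy",                                        # pass audio through
     "out_1080p.mkv"],
    check=True)
# Note: NVENC preset names (p1..p7) vary with ffmpeg version; older
# builds use names like "fast"/"medium" instead.
```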

The recommendation to go to nVidia is primarily because their output quality is slightly better than AMD’s when HW transcoding, and because Plex supports HW tone mapping on nVidia cards but not on AMD GPUs.

You will still need a Plex Pass to utilise HW transcoding, regardless of which GPU you choose.

The new transcoding engine being developed by the Plex team will offer greater support for AMD GPUs, but it’s currently only in pre-alpha, so it’s still a little way from being released.


Indeed, I’ve read about the new transcoding engine under development; thank you anyway.

However, regarding the sentence “Even the bottom end ones will handle multiple 4K streams with ease.”, I’m afraid I’m going to need to ask for evidence of that, for two reasons: 1) generic 4K streams are not the same as a 4K HDR10 Blu-ray rip; there can be an absolutely HUGE difference in bitrate. 2) The main purpose of this topic is to measure performance - both of CPUs and GPUs - when transcoding content, especially my “nemesis”, UHD HDR10 Blu-rays.

I have an older Intel i7-7700K CPU and have tested 4K HDR10 Blu-ray to 1080p SDR HW transcoding on it. I stopped at 8 concurrent streams, and Quick Sync on the Intel iGPU handled them effortlessly with virtual CPU usage well below 20% (I was running Plex in Docker within an Ubuntu VM on Proxmox, with only 6 of the 8 hyper-threaded cores available). I’ve also run similar tests with an nVidia GTX 1660 Ti, and there VRAM becomes the limit, not GPU speed.
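
If you want to replicate that kind of stream yourself, here is a rough software-only sketch of one 4K HDR10 to 1080p SDR tone-mapped transcode (assuming an ffmpeg build with the zscale/libzimg filter; the filenames are placeholders, and this approximates rather than reproduces Plex’s tone-mapping path). Launch several copies of it to mimic the 8-concurrent-stream test above.

```python
# Sketch of one 4K HDR10 -> 1080p SDR software transcode with tone
# mapping, approximating the streams tested above. Not Plex's internal
# path; requires an ffmpeg build that includes the zscale (libzimg)
# filter. "uhd_hdr10_sample.mkv" is a placeholder.
import subprocess

# Classic zscale/tonemap filter chain: linearize the PQ signal,
# tone-map to SDR (hable curve), convert to BT.709, then downscale.
TONEMAP_TO_1080P = (
    "zscale=t=linear:npl=100,format=gbrpf32le,"
    "zscale=p=bt709,tonemap=tonemap=hable:desat=0,"
    "zscale=t=bt709:m=bt709:r=tv,format=yuv420p,"
    "scale=1920:-2"
)

subprocess.run(
    ["ffmpeg", "-y", "-i", "uhd_hdr10_sample.mkv",
     "-vf", TONEMAP_TO_1080P,
     "-c:v", "libx264", "-preset", "veryfast",
     "-c:a", "copy",
     "out_1080p_sdr.mkv"],
    check=True)
```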

Here is a link for nVidia cards: nVidia Hardware Transcoding Calculator for Plex Estimates

There are plenty of people on this forum and on Reddit using lower-powered Intel Celeron and i3 processors with Quick Sync for HW transcoding and HW tone mapping to convert high-bitrate 4K HDR10 streams.
