Which is better for transcoding: a powerful CPU or a powerful GPU?


I currently have a media server set up to run Plex. It uses an i7-7700 (non-K) with 16GB of RAM and 9 hard drives (3 are used for backup). When streaming 4K files I frequently get “Server is not fast enough for these settings.” The Plex article I read said that consumer video cards are limited to 2 streams, but that Quadro cards aren’t so limited. The same article said that a CPU without a video card needs a PassMark score above 17000 to handle everything. So I have ~$300-350 I can use to upgrade:

  1. Purchase a Quadro P2000 video card for hardware transcoding, or
  2. Purchase an i5-11400 or i5-11500 plus an ASUS motherboard I’ve sourced, and use the onboard hardware HEVC encoder.

Which is probably smarter?

Have you tried enabling hardware accelerated transcoding with your current CPU?

The i7-7700 has HD Graphics 630, which is capable of decoding 10-bit HEVC video.

Also, try turning off HDR tonemapping and see if that helps (Settings → Transcoder).

Tonemapping is still a work in progress and can hit the CPU. Currently, when running Plex on Windows, there is limited support for using Intel graphics and no support when using Nvidia graphics. The decode/encode still uses the GPU, but the tonemapping occurs on the CPU.

i7-7700 at ark.intel.com. Kaby Lake processor with Intel® HD Graphics 630.

Intel Quick Sync Video at Wikipedia. Table shows Kaby Lake supports 10-bit HEVC video.

Thanks. I’ve been using hardware decoding but still get buffering when using Plex to stream HEVC 4K to my Sony TV in the den. When I play directly off the network with an HTPC in my home theater, I have no difficulties with the 4K stream. It could be a fault with HDR, so I turned it off. I appreciate your help!

So, do you think a new i5-11500 or similar could better handle the CPU load of tonemapping, or is it not ready for prime time?

My personal approach is that I do not transcode 4K media. I keep my 4K movies & TV shows in their own libraries. I do not play 4K HDR material on devices where it will not direct play. I do not share my 4K HDR libraries with remote users as a) none of them stream to 4K TVs, and b) I do not have the upstream bandwidth to stream 4K without transcoding.

The i5-11500 will definitely be better at CPU tasks than the i7-7700: the i5 has a PassMark score of 18,039 versus 8,618 for the i7.
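To put those scores in context, here is a minimal sketch comparing them against the rough PassMark rules of thumb cited in this thread (~17000 for a single 4K software transcode) and Plex’s commonly quoted ~2000 per 1080p transcode. Treat both thresholds as rough guidelines, not hard requirements:

```python
# Rough CPU-headroom check using PassMark rules of thumb.
# Thresholds are the rough guidelines cited in this thread; actual
# transcoding load varies with bitrate, tonemapping, and subtitles.
REQUIRED_4K = 17000     # rule-of-thumb score for one 4K software transcode
REQUIRED_1080P = 2000   # rule-of-thumb score per 1080p software transcode

# PassMark scores quoted in the thread.
cpus = {"i7-7700": 8618, "i5-11500": 18039}

def can_transcode_4k(score: int) -> bool:
    """True if the CPU clears the rough 4K software-transcode bar."""
    return score >= REQUIRED_4K

for name, score in cpus.items():
    verdict = "OK" if can_transcode_4k(score) else "below the bar"
    streams = score // REQUIRED_1080P
    print(f"{name}: 4K software transcode {verdict}, "
          f"roughly {streams} simultaneous 1080p transcodes")
```

By this yardstick the i7-7700 falls well short of a 4K software transcode while the i5-11500 just clears it, which matches the “Server is not fast enough” errors described above.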

Will the i5 handle tonemapping? Others will have to weigh in. As mentioned, I don’t transcode 4K, so I’ve no experience with it. Also, other than “cpu intensive,” I have not seen information on system requirements for tonemapping (Info may be available, but I haven’t seen any).

Tonemapping is still a work in progress. Plex refers to it as a “Plex Pass preview” in the support document. They’re definitely working to make it better, but it will take some time to work out all the kinks.

Personal Opinion:

The Sony seems to be the TV with playback issues.

  • If the TV should direct play 4K HDR, but does not, find out why and fix that problem.
  • If it does not support direct playing of 4K HDR media, then don’t play 4K HDR media on that TV.
  • If the TV is not 4K HDR capable and you still want to play 4K HDR on it, consider a Plex client that handles tonemapping and 4K → 1080p downscaling. A 2019 Nvidia Shield Pro performs both of those tasks and costs ~$200 USD. An Apple TV with the Infuse app may also work (see this post). Plex for Windows/Mac or Plex HTPC may also be options; they perform scaling and tonemapping as well, without relying on the Plex Media Server.

Thanks for the reply. The Sony is the 900E and plays 4K HDR. Unfortunately, Sony saved the $0.07 it would have cost to put a gigabit Ethernet port into the TV, which means I have to use wireless to avoid a bottleneck at the TV. I have an access point about 6 feet from the Sony.

Netflix plays 4K HDR perfectly, but that’s not surprising, as it uses much less bandwidth than a 4K rip from my UHD Blu-ray. I use an Nvidia Shield in my theater room along with an HTPC, which is most likely overkill, except that my PC can use madVR, which throws a better picture in Rec. 709 situations than the Shield.
