GPU Transcoding vs CPU Transcoding

I have Plex Pass on my Plex Media Server, which runs on an unRAID server. I'm just moving into 4K video, and in a first test run I'm finding that my CPU, which can manage 4 or 5 simultaneous 1080p transcodes, is pretty well tapped out at two 4K transcodes. The CPU is an Intel E3-1241 v3 and scores about 10,000 on PassMark.
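For rough context (if I have the often-quoted figures from Plex's CPU requirements page right: roughly 2,000 PassMark per 1080p/10 Mbps transcode and around 17,000 for a single worst-case 4K HDR transcode), here's a quick back-of-the-envelope sketch using those rules of thumb rather than measurements from my server:

```python
# Back-of-the-envelope transcode headroom from PassMark scores.
# Per-stream figures are Plex's published rules of thumb, not measurements.
CPU_PASSMARK = 10_000          # Intel Xeon E3-1241 v3 (approximate score)
PER_1080P_TRANSCODE = 2_000    # ~1080p @ 10 Mbps, per Plex's guideline
PER_4K_TRANSCODE = 17_000      # ~4K HDR 50 Mbps HEVC -> 1080p, per the same guideline

print("estimated 1080p transcodes:", CPU_PASSMARK // PER_1080P_TRANSCODE)  # 5
print("estimated 4K transcodes:   ", CPU_PASSMARK // PER_4K_TRANSCODE)     # 0
```

The 1080p estimate lines up with what I'm seeing; the 4K figure assumes a worst-case 50 Mbps HDR source, so lighter files managing two streams isn't a contradiction.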

That said, I've seen how much the GPU in my PC speeds up H.264/H.265 encoding when ripping movies, and I'm wondering whether GPU hardware acceleration will do the same for Plex streaming transcodes. I've read the page on the Plex website with some pros and cons, and it suggests that while faster and smoother, GPU-accelerated encoding can reduce overall quality and create a blurry/blocky look.

I'm looking for some real-world feedback on GPU-accelerated encoding before going too far down this road with my server. I would have to purchase a GPU and replace my motherboard to make this work, and I don't want to go through that expense if it will ultimately degrade quality enough to be a non-starter for me. I figure this will cost me somewhere between $275-350, but going to an E5 processor would be far worse cost-wise. I also know implementing this in unRAID has an additional set of complexities; I'll work on those separately, but I understand it can be done.

  1. Does GPU encoding make a difference that anyone has noticed? Is this driven by the quality of the GPU, or does it apply to all GPUs in general?
  2. How high-end a GPU do you need to get a meaningful improvement without reducing quality? I was thinking an NVIDIA GTX 1050 or 1060; thoughts?
  3. Am I understanding correctly that Plex uses either the GPU or the CPU for encoding, not both?
  4. Does the use of GPU-accelerated transcoding affect how Plex DVR works?
  5. Any other thoughts based on your experience that I should know or consider before going down this road?

Thanks in advance.

  1. I haven't noticed a difference (and if you're transcoding on the fly for remote use, how much does quality really matter anyway?)

  2. For 4K transcoding, you need at minimum an Intel 600-series iGPU or an NVIDIA 10-series card (or equivalent). For NVIDIA at least, each generation uses the same encoder/decoder, so transcoding performance and quality are the same within a generation. The main thing that matters is the amount of video RAM: more VRAM = more simultaneous transcodes.

  3. No. Plex will use the GPU until it runs out of resources (generally VRAM), then fall back to the CPU. See the sketch after this list for one way to check which path a given stream actually took.

  4. Not that I have noticed. The GPU will be used for DVR transcoding if hardware acceleration is enabled in the DVR settings.

  5. Just understand that getting into 4K is more complicated (and can be more expensive) than most expect.
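If you want to verify whether a stream is hardware-transcoding or falling back to the CPU, here's a minimal sketch using the python-plexapi package. The URL and token are placeholders, and the hardware-related attribute name is my assumption about what the /status/sessions data exposes, so check what your server actually returns:

```python
# List active Plex sessions: direct play vs transcode, and whether the
# transcoder reports a hardware pipeline. Requires: pip install plexapi
from plexapi.server import PlexServer

BASEURL = "http://127.0.0.1:32400"   # placeholder: your server URL
TOKEN = "YOUR_PLEX_TOKEN"            # placeholder: your X-Plex-Token

plex = PlexServer(BASEURL, TOKEN)

for item in plex.sessions():                          # currently playing items
    transcodes = getattr(item, "transcodeSessions", None) or []
    if not transcodes:
        print(f"{item.title}: direct play / direct stream")
        continue
    for ts in transcodes:
        # videoDecision is 'transcode' or 'copy'; transcodeHwFullPipeline is
        # an assumption here and may be absent on CPU-only transcodes.
        hw = getattr(ts, "transcodeHwFullPipeline", None)
        print(f"{item.title}: video={ts.videoDecision}, hw_pipeline={hw}")
```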

This may or may not be of help: Plex, 4k, transcoding, and you - aka the rules of 4k - a FAQ

See also: nVidia Hardware Transcoding Calculator for Plex Estimates


Thanks for the details.

Thanks, and that looks like tonight’s reading (links) :+1:

OK, so the general theme of avoiding 4K transcoding by keeping two versions of each movie on Plex is an interesting idea… going to ponder that one.

My main listening room is completely 4K UHD compliant and can direct stream; why else would I bother? The trouble is NOTHING else is 4K compliant.

OK, thanks for those links; there are several things going on there to consider.


Yup, the bane of all Plex admins getting into 4K. :money_mouth_face:


So I have a couple of movies with both 4K and 1080p rips done (the 4K one has "4K" in the file name). When I put them both on Plex, they show up under the same movie banner. How do you pick which file to watch, or how do you force Plex to separate them?

Well, it appears Plex is smart enough to know which to use… the 4K version plays in the living room, and the 2K version transcodes to the phone.
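For what it's worth, you can confirm that both files were merged under one entry with a few lines of python-plexapi; the library section name and the movie title below are just examples, not anything from my actual library:

```python
# Show the media versions Plex has grouped under a single movie entry.
# Requires: pip install plexapi
from plexapi.server import PlexServer

plex = PlexServer("http://127.0.0.1:32400", "YOUR_PLEX_TOKEN")  # placeholders

movies = plex.library.section("Movies")                 # example section name
for movie in movies.search(title="Blade Runner 2049"):  # example title
    for media in movie.media:          # one Media entry per file/version
        print(movie.title, media.videoResolution, media.videoCodec, media.bitrate)
```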
