I have Plex Pass on my Plex server, which runs on an unRAID box. I'm just moving into 4K video, and after a first test run I'm finding that my CPU, which can manage 4 or 5 simultaneous 1080p transcodes, is pretty well tapped out at two 4K transcodes. The CPU is an Intel Xeon E3-1241 v3, which scores about 10,000 on PassMark.
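For context, those numbers roughly line up with Plex's published PassMark guidance (about 2,000 points per 1080p/10 Mbps H.264 transcode, and on the order of 12,000 for a single 4K SDR HEVC transcode; treat these as rules of thumb, not measurements). A quick back-of-the-envelope sketch:

```python
# Rough CPU transcode capacity using Plex's approximate PassMark guidance.
# These cost figures are rules of thumb, not benchmarks.
CPU_PASSMARK = 10_000   # Xeon E3-1241 v3, approximate score
COST_1080P = 2_000      # per simultaneous 1080p H.264 transcode
COST_4K = 12_000        # per simultaneous 4K SDR HEVC transcode

print(CPU_PASSMARK // COST_1080P)        # 5 -> matches the 4-5 streams observed
print(round(CPU_PASSMARK / COST_4K, 2))  # 0.83 -> why two 4K streams saturate it
```

By this guidance the chip is nominally under even one 4K software transcode, which is consistent with it choking at two.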
That said, I've seen how much the GPU in my desktop PC speeds up H.264/H.265 encoding when ripping movies, and I'm wondering whether GPU hardware acceleration would do the same for Plex streaming transcodes. I've read the page on the Plex website listing some pros and cons; it suggests that while GPU-accelerated encoding is faster and smoother, it can reduce overall quality and produce a blurry/blocky look.
I'm looking for some real-world feedback on GPU-accelerated encoding before going too far down this road with my server. I would have to buy a GPU and replace my motherboard to make this work, and I don't want to go through that expense if it will ultimately degrade quality that much; that would be a non-starter for me. I think this will cost me somewhere between $275 and $350 to do, but moving up to an E5 processor would cost far more. I also know implementing this in unRAID has an additional set of complexities; I'll work on those separately, but I understand it can be done.
- Has anyone noticed a real difference from GPU encoding? Is the result driven by the quality of the GPU, or does it apply to all GPUs in general?
- How high-end a GPU do you need to get a meaningful speedup without reducing quality? I was thinking an Nvidia GTX 1050 or 1060; thoughts?
- Am I understanding correctly that Plex uses either the GPU or the CPU for a given encode, never both at once?
- Does the use of GPU accelerated transcoding affect how Plex DVR works?
- Any other thoughts based on your experience that I should know or consider before going down this road?
Thanks in advance.

