Server Version#: 1.21.1.3766
Player Version#: 1.24.0.1483-714cba36
I’ve been having issues streaming 4K movies to 1080p televisions; the main symptom is persistent buffering, apparently caused by slow transcoding. Basically, all 4K movies are unwatchable on clients whose hardware specs should handle 4K playback without any issue, yet playing the same movies directly via VLC over a network share causes no problems whatsoever. To be clear, I am only talking about a single transcoding stream.
The server is running an Intel Xeon Silver 4208 (8 cores/16 threads), 128GB RAM, a Quadro P2000, and an NVMe transcode drive. The clients run an Intel Celeron J1900 with 16GB RAM and an AMD RX480, or various other CPU configurations, but all with RX480 cards. All computers are hardwired on 1GbE.
No matter what settings I configure on the Plex server, the transcode speed basically hovers around 0.4-0.6x. It also doesn’t matter whether transcoding is set to hardware or software only; the results are always the same. Even when I can force Direct Play to the clients, the exact same issue occurs. I can’t imagine it’s a client issue, since an RX480 should have no trouble playing 4K, even paired with the lower-performance CPUs.
And as I mentioned before, playing the file directly over a network share results in perfect playback. Yet with Plex, playback drives the clients to nearly 100% CPU and almost 100% GPU usage, while playing the same 4K movie with VLC sits at about 50% CPU and 25% GPU. There is no difference between the Plex app and Plex in Chrome. Additionally, I can Direct Play 4K movies to my Samsung TV over WiFi with perfect results.
Do I need a more powerful Quadro card? Are the clients at fault here, even with RX480s? Or is the issue simply that a Quadro P2000 isn’t powerful enough to transcode a single 4K stream?
I use the Plex Desktop App on Windows PCs, but have also tried the Plex Web App on Linux with the same results. Not sure if it makes a difference, but the movies are typically 4K HEVC Main 10 HDR.
I also understand the usual rule that you shouldn’t transcode 4K movies, but it still seems strange that my hardware can’t handle a single 4K transcode stream.
The Windows Plex app should be able to play the file directly, so transcoding won’t be necessary.
If you are explicitly testing transcoding, try disabling HDR-to-SDR tone mapping. It requires a lot of conventional CPU power, since only part of the process can be accelerated in hardware (this works better on Linux than on Windows).
Settings - Server - Transcoder - ‘Show Advanced’ - ‘Enable HDR tone mapping’
Your CPU is not as powerful as you might think it is: https://www.cpubenchmark.net/cpu.php?cpu=Intel+Xeon+Silver+4208+%40+2.10GHz&id=3507
You might be better off in general with a CPU that has QuickSync support. You could then do away with the discrete GPU (which will run short of video memory with 4K transcodes soon enough anyway).
I hear some Xeon models actually do have QuickSync.
Thanks for the response! I disabled HDR-to-SDR tone mapping, and there really doesn’t seem to be any difference. To be clear, the issue is never with the Plex server itself: even with 4K transcoding enabled (1080p, 20 Mbps stream to the client), the server CPU doesn’t exceed 15% and the Quadro never exceeds 25%. Direct Play basically results in under 5% CPU/GPU usage on the server. The issue is always with the clients; even when the server handles the transcoding, client CPU/GPU usage is pegged at nearly 100% and the videos are unwatchable.
To take Direct Play as an example: the client is running lower-end hardware (a Celeron J1900, which does have QuickSync) but with an RX480 video card, which should be more than capable of playing 4K video at 60 Hz (although the TV is 1080p). Yet playing a 4K video in the Desktop App results in nearly 100% GPU usage and very high CPU usage.
So is the issue here that the client hardware is simply not powerful enough to play 4K content on a 1080p TV? I assumed the client Plex Desktop App would receive the 4K stream directly from the server and then use the RX480 to convert it down to 1080p locally, but perhaps I’m mistaken? Or is the issue caused by the AMD graphics hardware on the clients? I hesitate to think so, since playing the video directly from a network share with VLC yields perfect results and low CPU/GPU usage on the client.
Sorry if I am not being exactly clear, but I hope my response makes sense.
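One way to double-check what the server actually decides for a session (Direct Play vs. transcode, and the transcode speed) would be to query the /status/sessions endpoint while a movie is playing. The sketch below is only an illustration: the server address and X-Plex-Token are placeholders, and the attribute names are based on my understanding of what that endpoint returns, so they may differ by server version.

```python
# Rough sketch: ask the Plex server what it is doing with the current session(s).
# Replace the placeholder address and token with your own values.
import urllib.request
import xml.etree.ElementTree as ET

PLEX_URL = "http://192.168.1.10:32400"   # placeholder server address
PLEX_TOKEN = "YOUR_X_PLEX_TOKEN"         # placeholder token

req = urllib.request.Request(
    f"{PLEX_URL}/status/sessions",
    headers={"X-Plex-Token": PLEX_TOKEN},
)
with urllib.request.urlopen(req) as resp:
    root = ET.fromstring(resp.read())

for video in root.iter("Video"):
    title = video.get("title", "unknown")
    # A TranscodeSession child means the server is converting the stream;
    # its absence usually indicates Direct Play / Direct Stream.
    ts = video.find("TranscodeSession")
    if ts is None:
        print(f"{title}: no transcode session (Direct Play / Direct Stream)")
    else:
        print(
            f"{title}: video={ts.get('videoDecision')} "
            f"audio={ts.get('audioDecision')} speed={ts.get('speed')}"
        )
```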
VLC is a wholly different beast. It cannot be compared with Plex, which uses a different media engine (mpv).
It might just be that mpv hasn’t been optimized that well for your hardware, leading to higher demand on CPU resources. I’ve also read on some forums that it matters a lot how up to date the drivers for this GPU are.
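If you want to rule the client GPU in or out, one rough test (outside of Plex entirely) would be to play the same file in a standalone mpv build with hardware decoding requested and check the verbose log for which decoder it picks. This is just a sketch: the file path is a placeholder, and the exact log wording can vary between mpv builds, so treat the string match as an approximation.

```python
# Rough test: play the 4K HEVC file in standalone mpv with hardware decoding
# requested, then look for mpv's "Using hardware decoding" line in the log.
# The path below is a placeholder; adjust it to your own setup.
import subprocess

MOVIE = r"\\nas\movies\sample_4k_hevc_main10.mkv"  # placeholder path

result = subprocess.run(
    [
        "mpv",
        "--hwdec=auto",      # ask mpv to use a hardware decoder if available
        "--frames=500",      # decode a few hundred frames, then quit
        "--msg-level=all=v", # verbose log so the decoder choice is printed
        MOVIE,
    ],
    capture_output=True,
    text=True,
)

log = result.stdout + result.stderr
if "Using hardware decoding" in log:
    print("mpv reports a hardware decoder is in use on this client.")
else:
    print("No hardware-decoding line found; mpv may be decoding in software.")
```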