I’ve got my server running on decent hardware:
i5 6600 (Skylake)
GTX 1060 3GB
32GB RAM
Network is fibre with 50 Mbps upload (limited in Plex to 4K @ 25 Mbps for remote streams)
Transcoder set to “Make my CPU hurt”
240s transcoder buffer writing to 8GB RAM disk
Hardware encoding enabled.
So, I have a friend who watches remotely. He's attempting to watch 4K (HEVC/x265) movies, which get transcoded for his player. Non-x265 content plays natively at 1080p around 10 Mbps.
The transcoded stream never seems to go higher than 720p @ 3 Mbps, and the transcoder hits maybe 1.7x speed at most.
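Quick back-of-envelope on the pipe itself (plain Python, figures taken from the setup above), mostly to convince myself the upload isn't the limit:

```python
# All figures in Mbps, taken from the setup above.
upload = 50          # fibre upload
remote_4k_cap = 25   # Plex limit for remote 4K streams
friend_stream = 3    # what he actually ends up with (720p @ 3 Mbps)

print(f"4K remote streams the upload could carry at the cap: {upload // remote_4k_cap}")
print(f"Headroom left while his 3 Mbps stream runs: {upload - friend_stream} Mbps")
```

So on paper there's plenty of headroom for that one stream.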
I was keeping an eye on Task Manager while it was transcoding. The CPU did peak every now and then at about 85%, but GPU usage never went higher than 15% during transcoding/playback.
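For a closer look than Task Manager gives me, a rough sketch that polls nvidia-smi (I'm assuming these query fields are supported by the driver - `nvidia-smi --help-query-gpu` lists the exact names):

```python
import subprocess
import time

# Query fields assumed to exist on recent NVIDIA drivers;
# run `nvidia-smi --help-query-gpu` to confirm the exact names.
FIELDS = "utilization.gpu,encoder.stats.sessionCount,encoder.stats.averageFps"

def watch_nvenc(interval=2.0):
    """Print overall GPU load plus NVENC session count / encode fps every few seconds."""
    while True:
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        gpu, sessions, fps = [v.strip() for v in out.splitlines()[0].split(",")]
        print(f"GPU util {gpu}%  |  NVENC sessions {sessions}  |  encode fps {fps}")
        time.sleep(interval)

if __name__ == "__main__":
    watch_nvenc()
```

If the session count stays at zero while a transcode is running, the GPU isn't being used for encoding at all.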
Is this how it's supposed to be, or is this unexpected behaviour? Surely it should be hitting the GPU much harder and be able to transcode faster at a better bitrate?
Remote 4K is likely to be quite a challenge - that's my wild guess. Remote access for a bunch of geriatric, non-techy friends and family is enough of a challenge already - and I've created all the material so that it will Direct Play! Lord help me if I add 4K to that mess… lol
I do know that when the transcoder kicks in (way more often than necessary), my GPU only helps the CPU with the parts it can handle. The CPU is still heavily relied upon.
How much transcoding your users trigger will matter to you - when they're doing a lot of it, the server can't do much else.
If it's 4K HDR, note that transcoding HDR material of any resolution usually has a poor outcome.
Plex transcodes all video to H.264, which does not support HDR. Furthermore, Plex does not tone map HDR to SDR. As a result, transcoded HDR generally looks quite poor, with washed-out colors.
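A rough numpy sketch of why: HDR is stored with the PQ transfer curve, and if nothing tone maps it, the SDR pipeline treats those PQ values as ordinary gamma-encoded video (gamma 2.4 assumed here as the SDR display curve), which crushes the contrast:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Encode absolute luminance (cd/m^2) into a 0-1 PQ signal value."""
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

# Shadow detail, diffuse white (~203 nits per BT.2408), and a bright highlight
for nits in (5, 203, 1000):
    pq = pq_encode(nits)
    # With no tone mapping, an SDR display just runs the PQ signal
    # through its own gamma and shows it on a ~100-nit panel.
    shown = 100 * pq ** 2.4
    print(f"{nits:5d} nits mastered -> PQ signal {pq:.2f} -> displayed at ~{shown:.0f} nits")
```

The 1000-nit highlight ends up only about twice as bright as diffuse white instead of roughly five times, and everything is squeezed into a dark, flat band; add the BT.2020 colours being shown as BT.709 and you get the grey, washed-out look.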