Thanks for the reply,
Yes, it is a 4K (HEVC Main 10 HDR) 4:2:0 5.1 video file. I have tried with "Enable HDR tone mapping" both checked and UNchecked, with nearly identical results.
This all started because a handful (so far) of my 4K movies were buffering during direct play and would error out/stop on my local 2.5G network, and it is driving me nuts. I understand the TV (a Hisense R8 w/Roku) does not have a 2.5G NIC; best I can tell, it is a 100Mbps NIC.
So I was just like, heck, I will flip the playback settings on the TV to the highest non-original settings and transcode. Simple, right? Boy, was I wrong.
Now, mind you, this is not anywhere near the largest 4K file I have, so the best I can gather is that there is a huge bitrate spike in that scene. I clipped out that one-minute segment and it does the same thing (spec: video + DTS-HD + DTS). Stripped down to the DTS core only, it plays fine; stripped to DTS-HD alone, it's also fine.

The original shows 73.9Mbps in Plex's play-version rating, but while playing, the dashboard shows 152Mbps. I am guessing Plex is grabbing either the bitrate at the start of the file or an overall average, not sure. The bandwidth meter in the dashboard hovers around 94Mbps. So in the end, none of these bitrates add up to anything useful, especially since a movie twice its size plays fine. Most likely it's just more than the TV's NIC can handle in that one spot. Oh well, life goes on…
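For anyone who wants to sanity-check their own file, here is a rough sketch of why an "average" bitrate number can hide a spike that chokes a 100Mbps NIC. The ffprobe line in the comment and the demo numbers are my own illustration, not Plex's actual math:

```python
# Sketch: find per-second bitrate spikes from packet sizes.
# You can dump (timestamp, size) pairs for your own file with something like:
#   ffprobe -v error -select_streams v:0 \
#           -show_entries packet=pts_time,size -of csv=p=0 movie.mkv
# (That invocation is an assumption about your setup; the bucketing math
#  below is the point.)
from collections import defaultdict

def per_second_mbps(packets):
    """Sum packet bytes into 1-second buckets, return Mbps per bucket."""
    buckets = defaultdict(int)
    for pts_time, size in packets:
        buckets[int(pts_time)] += size
    return {sec: byte_count * 8 / 1_000_000
            for sec, byte_count in sorted(buckets.items())}

# Synthetic demo data: a steady ~70 Mbps stream with one spike second.
demo = [(0.0, 4_375_000), (0.5, 4_375_000),   # second 0: 70 Mbps
        (1.0, 9_500_000), (1.5, 9_500_000)]   # second 1: 152 Mbps
print(per_second_mbps(demo))  # {0: 70.0, 1: 152.0}
```

A link that comfortably buffers the 70Mbps seconds can still stall hard on a single 152Mbps second, which would match exactly what I'm seeing.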
So, back to the transcoding issue: not sure why it's using mostly CPU and not primarily the GPU, and if it does need the CPU for some reason, why not use 100% of it to get the job done?
From scouring the web, NVIDIA NVENC can do H.265 10-bit through FFmpeg, which, if I understand correctly, is at the core of Plex's transcoder as well. So in theory we should be able to have a "Make My GPU Hurt" transcoder setting in the dashboard. From discussions on GitHub, a GTX 1080 is able to average 250+fps (I don't know the exact source file, except that it matches the specs above). That is more than 10x realtime on the 24fps movie in question. (Even allowing for lots of fluff or exaggeration, it should still manage far faster than realtime.)
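For the curious, this is roughly the kind of FFmpeg command used in those benchmark threads to see what NVENC alone can do. The exact flags are my guess for a recent FFmpeg build (the p1–p7 presets and CUDA options vary by version), and `clip.mkv` is a placeholder for your own test segment, so treat it as a sketch:

```shell
# Decode on the GPU, encode with NVENC HEVC, discard the output,
# and report timing. Watch the reported fps / speed= figure:
# anything well above 24fps means the GPU could keep up with realtime.
ffmpeg -benchmark -hwaccel cuda -hwaccel_output_format cuda \
       -i clip.mkv -map 0:v:0 \
       -c:v hevc_nvenc -preset p4 -b:v 20M \
       -f null -
```

If that runs much faster than realtime while Plex's transcode of the same file stutters, the bottleneck is pretty clearly in how Plex drives the hardware, not the hardware itself.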
I have been testing this all day looking for a fix. I would throw an i7 or i9 at it, but if it will only use half of the juice anyway, I'm not sure that would solve the issue. I could throw the RTX 2080 at it, but again, at less than 9% utilization on the 1650 Super, not sure that matters either. It appears to be limited for some reason in the Plex transcoding code?
And finally, before everyone jumps in to tell me to just compress a copy or store the Blu-ray version as well: I get it, I can do that, and I have started loading up the Blu-rays of all the 4Ks I have on the server. But that is not the point of all this. I bought the Plex Pass a year ago to switch from Kodi so the family could use it "easily," since remote access on Kodi was a nightmare. Kodi did, however, let you do a bit more back-end tinkering, like extending the buffer size on the client end… advancedsettings.xml, wink wink.
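For anyone landing here from search, the kind of Kodi-side tinkering I mean looked like this. This is the Kodi 17/18-era cache syntax from memory, so double-check the current Kodi wiki before pasting it in:

```xml
<advancedsettings>
  <cache>
    <buffermode>1</buffermode>          <!-- buffer all sources, incl. local network -->
    <memorysize>157286400</memorysize>  <!-- 150 MB RAM cache (Kodi uses roughly 3x this) -->
    <readfactor>8</readfactor>          <!-- read ahead at up to 8x the average bitrate -->
  </cache>
</advancedsettings>
```

A bigger client-side buffer like that is exactly what would paper over a one-second bitrate spike, which is why I miss having the knob.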
But thanks again for the reply; I am all ears to try anything here. And @Plex, I am also all ears, and I would be glad to test and help if we can resolve the resource allocation in the server program to either use the NVIDIA card to its potential or actually make the CPU hurt at 100%. I have already set the transcoder process priority to Realtime to give it balls-to-the-wall treatment from a Windows scheduling point of view.