Hardware Transcoding ON, But 50%+ CPU Usage and <10% GPU Usage?

Hello Everyone!

This is my first post, but I've been around for quite some time, so go easy on me…

I am having trouble utilizing my graphics card to the fullest. I have HW transcoding on (and a Lifetime Plex Pass), and in theory it is working: the dashboard reports (hw), and the CPU has no onboard graphics. But with a single transcode running, it is using 50%+ CPU and less than 10% GPU.

This seems more like it's still running the transcode in software, not hardware.
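
In case it helps anyone reproduce this, here is a quick sketch for watching the NVENC session count while a stream plays. It assumes nvidia-smi is on PATH and that the driver exposes the encoder.stats query fields; a session count of zero during a (hw) transcode would mean only the decode side is on the GPU.

```python
import subprocess
import time

# Poll NVENC encoder stats via nvidia-smi while a transcode is running.
# Assumes nvidia-smi is on PATH and the driver exposes these query fields.
QUERY = "encoder.stats.sessionCount,encoder.stats.averageFps,utilization.gpu"

def poll_encoder(interval_s: float = 2.0, samples: int = 10) -> None:
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        sessions, avg_fps, gpu_util = [v.strip() for v in out.split(",")]
        # sessionCount > 0 means NVENC really is encoding; if it stays 0
        # while Plex shows (hw), only the decode may be hardware-accelerated.
        print(f"NVENC sessions={sessions}  avg_fps={avg_fps}  GPU util={gpu_util}%")
        time.sleep(interval_s)

if __name__ == "__main__":
    poll_encoder()
```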

Specs:
CPU: Intel i5-9600K
GPU: Nvidia GTX 1650 Super (latest standard drivers, not Studio)
Storage: HDD
System: M.2 drive (neither drive seems to be a limiting factor)

This is a 4K 10-bit HEVC video. (Yes, I have seen the 4K bible.)
The GPU supports HEVC 10-bit encoding and decoding. (Yes, I have seen the GPU matrix.)

Everything else seems to be in order, but it is using nowhere near all of the available resources, especially the GPU. I know the 4K bible says don't transcode 4K, but this build can transcode/encode faster than real time with other programs.

Any pointers on why Plex does not seem to be pulling full steam on the GPU?

Tips, tricks, or settings that need to be dialed in???

Is it an HDR video?

Do you have HDR Tonemapping enabled?

HDR tonemapping is not supported on Nvidia GPUs with Windows. Tonemapping will be performed on the i5 CPU instead (video transcoding still happens on the GPU).

If enabled, try disabling HDR Tonemapping in Settings → Transcoder. The colors will be off, but you should see the CPU utilization drop.
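
If you want to see why the tone map is the expensive part, you can approximate the CPU-side chain outside Plex with stock ffmpeg. This is only an approximation (Plex ships its own transcoder build), the zscale filter needs an ffmpeg built with libzimg, and the filenames are placeholders:

```python
import subprocess

# Rough approximation of an HDR->SDR tone map on the CPU with stock ffmpeg.
# NVDEC decodes on the GPU, but the zscale/tonemap filters run in software,
# which is where the CPU load comes from.
SRC = "clip_4k_hdr.mkv"   # hypothetical HEVC Main 10 HDR sample
DST = "clip_1080_sdr.mp4"

cmd = [
    "ffmpeg", "-y",
    "-hwaccel", "cuda",             # GPU decode, frames copied back to RAM
    "-i", SRC,
    "-vf", (
        "zscale=t=linear:npl=100,format=gbrpf32le,"   # to linear light
        "zscale=p=bt709,tonemap=tonemap=hable:desat=0,"  # CPU tone map
        "zscale=t=bt709:m=bt709:r=tv,format=yuv420p,"
        "scale=1920:-2"
    ),
    "-c:v", "h264_nvenc",           # GPU encode
    "-c:a", "aac", "-b:a", "192k",
    DST,
]
subprocess.run(cmd, check=True)
```

Watching CPU usage while this runs (versus the same command without the filter chain) shows how much of the load is the tone map rather than the transcode itself.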

Thanks for the reply,

Yes, it is a 4K (HEVC Main 10 HDR) 4:2:0 video file with 5.1 audio. I have tried with "Enable HDR tone mapping" both checked and UNchecked, with nearly identical results.

This all started because a handful (so far) of my 4K movies were buffering during Direct Play and would error out/stop on my local 2.5G network, and it is driving me nuts. I understand the TV (a Hisense R8 with Roku) does not have a 2.5G NIC; it appears to be a 100Mbps NIC, best I can tell.

So I was just like, heck, I'll flip the playback settings on the TV to the highest non-original quality and transcode. Simple, right? Boy, was I wrong.

Now mind you, this is nowhere near the largest 4K file I have, so the best I can gather is that there is a huge bitrate spike in that scene. I clipped out that one-minute segment and it does the same thing (specs: video + DTS-HD + DTS). Stripped down to the DTS core only, it plays fine; stripped to DTS-HD alone, it's also fine. The original shows 73.9Mbps in Plex's media info, but during playback the dashboard shows 152Mbps, so I am guessing Plex is grabbing either the opening bitrate or an average; not sure. The bandwidth meter in the dashboard hovers around 94Mbps. In the end, none of these bitrates seem to add up to anything useful, especially since a movie twice its size plays fine. It's likely just more than the TV's NIC can handle. Oh well, life goes on…
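
Those numbers would actually fit together if 73.9Mbps is the container average and 152Mbps an instantaneous peak: any stretch where the stream runs above what the 100Mbps NIC can sustain (~94Mbps in practice) drains the client buffer, even though the average fits comfortably. For anyone who wants to hunt down the spike, here is a sketch that sums video packet sizes per second with ffprobe (assumes ffprobe on PATH; filename is a placeholder):

```python
import json
import subprocess
from collections import defaultdict

# Sum video packet sizes per second of presentation time to expose
# bitrate spikes that an average-bitrate readout hides.
def per_second_bitrate(path: str) -> dict[int, float]:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "packet=pts_time,size",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    buckets: dict[int, int] = defaultdict(int)
    for pkt in json.loads(out)["packets"]:
        second = int(float(pkt.get("pts_time") or 0))
        buckets[second] += int(pkt["size"])
    return {s: b * 8 / 1e6 for s, b in sorted(buckets.items())}  # bytes/s -> Mbps

if __name__ == "__main__":
    rates = per_second_bitrate("movie.mkv")  # placeholder filename
    worst = sorted(rates.items(), key=lambda kv: kv[1], reverse=True)[:10]
    for sec, mbps in worst:
        print(f"{sec // 60:02d}:{sec % 60:02d}  {mbps:6.1f} Mbps")
```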

So back to the transcoding issue: not sure why it's using mostly CPU and not primarily the GPU, and if it does need to use the CPU for some reason, why not use 100% of the CPU to get the job done?

From scouring the web, Nvidia NVENC can do H.265 10-bit through ffmpeg, which if I understand correctly is at the core of Plex's transcoder as well. So in theory we should be able to have a "Make My GPU Hurt" transcoder setting in the dashboard. From discussions on GitHub, a GTX 1080 can average 250+ fps (I don't know the exact source file, except that it matches the specs above). That is more than 10x real time on the 24fps movie in question. (Even with lots of fluff or exaggeration there, it should still pull off faster than real time by far.)
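
It's easy to benchmark NVENC directly and see what the 1650 Super can do outside Plex. A minimal sketch using stock ffmpeg's -benchmark flag, full GPU pipeline (hevc_nvenc 10-bit assumes a recent ffmpeg build and driver; the filename is a placeholder):

```python
import re
import subprocess

# Measure raw NVENC HEVC 10-bit throughput with stock ffmpeg:
# NVDEC decode -> NVENC encode, output discarded.
SRC = "clip_4k_hdr.mkv"  # hypothetical test clip

result = subprocess.run(
    ["ffmpeg", "-y", "-benchmark",
     "-hwaccel", "cuda", "-hwaccel_output_format", "cuda",
     "-i", SRC,
     "-c:v", "hevc_nvenc", "-profile:v", "main10", "-b:v", "40M",
     "-an", "-f", "null", "-"],
    capture_output=True, text=True,
)
# ffmpeg prints progress like "frame= 1440 fps=212 ..." on stderr;
# grab the last fps figure as the steady-state encode rate.
fps = re.findall(r"fps=\s*([\d.]+)", result.stderr)
print(f"steady-state encode rate: {fps[-1] if fps else 'n/a'} fps")
```

If that reports well above 24fps while Plex's transcode of the same clip crawls on the CPU, the bottleneck is in how Plex is driving the hardware, not the hardware itself.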

I have been testing this all day looking for a fix. I would throw an i7 or i9 at it, but if it will only use half of the juice anyway, I'm not sure that would solve the issue. I could throw the RTX 2080 at it, but again, at less than 9% utilization of the 1650 Super, I'm not sure that matters either. It appears to be limited for some reason in the Plex transcoding code???

And finally, before everyone jumps in to tell me to just compress a copy or store the Blu-ray version as well: I get it, I can do that, and I have started loading up the Blu-rays of all the 4Ks I have on the server. But that is not the point of all of this. I bought the Plex Pass a year ago to switch from Kodi so the family could use it "easily," since remote access on Kodi was a nightmare. But Kodi did let you do a bit more back-end tinkering, like extending the buffer size on the client end… advancedsettings.xml, wink wink.
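
For anyone who hasn't seen it, the Kodi tweak I'm referring to is the <cache> block in advancedsettings.xml. A sketch that writes one (the element semantics are from the Kodi wiki; the values and the Linux path are just examples):

```python
from pathlib import Path
from textwrap import dedent

# Example of the Kodi-style client buffer tweak: drop an advancedsettings.xml
# into the Kodi userdata folder. Per the Kodi wiki, buffermode 1 buffers all
# filesystems, memorysize is the RAM cache in bytes (Kodi uses roughly 3x
# this overall), and readfactor caps the cache fill rate.
USERDATA = Path.home() / ".kodi" / "userdata"  # Linux path; differs per platform

settings = dedent("""\
    <advancedsettings>
      <cache>
        <buffermode>1</buffermode>
        <memorysize>157286400</memorysize>  <!-- 150 MB -->
        <readfactor>20</readfactor>
      </cache>
    </advancedsettings>
""")

USERDATA.mkdir(parents=True, exist_ok=True)
(USERDATA / "advancedsettings.xml").write_text(settings)
print(f"wrote {USERDATA / 'advancedsettings.xml'}")
```

That kind of knob on the Plex client side is exactly what would have papered over the bitrate spike above.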

But thanks again for the reply; I am all ears to try anything here. And @Plex, I am also all ears, and I would be glad to test and help if we can resolve the resource allocation in the server program to either use the Nvidia card to its potential or actually make the CPU hurt at 100%. I already set the process priority to Realtime to give it balls-to-the-wall resources from a Windows allocation point of view.
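
Rather than re-pinning the priority by hand every time the transcoder respawns, a small psutil sketch can do it (the process name is an assumption; I use High rather than Realtime here, since Realtime can starve the rest of the system):

```python
import psutil  # pip install psutil; priority classes below are Windows-only

# Bump every running Plex transcoder process to High priority so Windows
# scheduling isn't a bottleneck.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Plex Transcoder.exe":  # assumed process name
        proc.nice(psutil.HIGH_PRIORITY_CLASS)
        print(f"raised priority of PID {proc.pid}")
```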

@Plex-Wizards, Dudes & Dudettes: after reading back everything I just wrote, a simple setting on the client end to increase the buffer cache would likely solve most if not all of these problems for instances like this. Or, at the very least, increase the buffering attempt time before timeout so the movie plays past the bitrate spike.

Thanks again! Just had to get that out there…
