I’ve been doing some testing on the small computer I just got, with the intention of V2P’ing my existing Plex server, and I’ve noticed something odd: if (web) clients target 1080p as the conversion resolution, at any bitrate, hardware acceleration doesn’t appear to be used. Is this a known issue?
I can’t say this with certainty, but when 720p is the target resolution I see activity on the GPU in Task Manager, whereas when 1080p is the target, CPU utilization jumps considerably (from roughly 30–40% to 70% for a single stream) and GPU utilization appears to drop to 0%. So the rough metrics certainly make it look like the GPU isn’t being used in that scenario.
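Beyond eyeballing Task Manager, I’ve also been skimming the server’s debug log for the transcoder command lines, since a hardware session should include an `-hwaccel` argument on the ffmpeg-style invocation. A rough sketch of that check (the log path, the `"Plex Transcoder"` marker, and the exact flag spelling are assumptions from my setup, so adjust for yours):

```python
# Hedged sketch: scan a Plex Media Server debug log for transcoder
# invocations and report whether each one requests hardware decode.
# The "-hwaccel" flag and the "Plex Transcoder" marker are assumptions
# about the log format -- verify against your own log before trusting it.
import re

def uses_hwaccel(cmdline: str) -> bool:
    """True if an ffmpeg-style command line asks for a hardware decoder."""
    return re.search(r"-hwaccel\b", cmdline) is not None

def scan_log(path: str) -> list[tuple[int, bool]]:
    """Return (line_number, hwaccel?) for each transcoder line found."""
    hits = []
    with open(path, encoding="utf-8", errors="replace") as fh:
        for n, line in enumerate(fh, 1):
            if "Plex Transcoder" in line:  # job lines name the binary
                hits.append((n, uses_hwaccel(line)))
    return hits

if __name__ == "__main__":
    hw = '"Plex Transcoder" -hwaccel:0 qsv -i input.mkv ...'
    sw = '"Plex Transcoder" -codec:0 h264 -i input.mkv ...'
    print(uses_hwaccel(hw), uses_hwaccel(sw))  # True False
```

If the 1080p sessions consistently come back without the flag while 720p ones have it, that would line up with what the utilization graphs suggest.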
Can anyone confirm this, or tell me if I’ve got something configured incorrectly?
Thank you!