Plex HTPC feedback

I will never look at log segments. There are numerous parts of the logs that I want to examine and these segments don’t include many of them (including one that I explicitly mentioned in my last comment).

This hotkey is Win11-only, and even if it weren’t, trying to force keystrokes to trigger a hotkey is a nasty hack.
Further, paraphrasing the statements made by an MPV dev: MS hasn’t provided any sort of API to facilitate the switching. They appear to have taken the stance that switching is only supposed to be done via the Windows display settings. While NVIDIA has a proprietary API to do the change, it is specific to a single vendor’s hardware and not a proper solution. Basically, MS has forced this situation.

Sure, here you go.
Plex HTPC.log (250.0 KB)
mpv-log-8.txt (74.0 KB)

So, scrolling through your logs, there is a substantial difference in one of the performance aspects on your GPU. You correctly pointed out that with MPV it logged “No advanced processing required. Enabling dumb mode.” This will be a large perf difference. Dumb mode is not enabled with HTPC because the scalers used in the default Normal Quality are higher quality than MPV’s defaults. If you want MPV to match the settings used by HTPC, you need to add --scale=spline36 --cscale=spline36 --dscale=spline36. Without those, it’ll default to bilinear (mpv.io). It’s very possible that this is the difference in performance, and that by changing HTPC’s quality setting to the lowest (which pretty much matches MPV’s defaults) you’ll get the desired result on this GPU.
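For reference, if you’d rather put those in the mpv.conf instead of on the command line, the equivalent lines (same options, just without the leading dashes) are:

scale=spline36
cscale=spline36
dscale=spline36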

If my guess is correct, when you run MPV with the three *scale=spline36 options, you’ll see performance equivalent to HTPC. If that’s the case, then you are seeing similar results to our testing on an Intel UHD Graphics 630, which performed better with spline36 in ANGLE than in D3D11 (see Plex HTPC feedback - #1199 by gbooker02)

Though, in your HTPC run it compiled 8 shaders, and then 11 shortly afterwards. That seems like a lot for just spline36 scaling, but maybe it’s not.

You shouldn’t need any of these lines apart from gpu-context=d3d11. The gpu-api option restricts the allowed contexts (but since the context is explicitly specified, it’s not needed), the vo should auto-detect to gpu, and HTPC passes in hwdec=auto (when you have hardware decoding on, as you do), which should auto-detect d3d11va here; hwdec-codecs defaults to a list which already includes hevc (mpv.io).
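In other words, the mpv.conf for this test could be trimmed down to just:

gpu-context=d3d11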

I am curious to see the equivalent logs from using the ANGLE context, which you said didn’t drop frames, though.

Spot on. If I add those *scale=spline36 options to MPV, then I get similar performance (i.e., lots of dropped frames).

Indeed. I was just being extra thorough. If I remove everything except gpu-context, I still get the same results.

I’ve attached the log from a 1080p ANGLE test to this post (it was 1080p ANGLE that didn’t drop frames).

Side note: if I change the quality to “Low Quality”, 1080p content on my 4k display no longer drops frames - nice and smooth. In addition, my 4K HDR test drops fewer frames with Low Quality than with High Quality, but it’s not there yet.

Edit: It turns out I forgot to re-enable HDR in Windows (running Win 11 Pro), so that quite possibly explains the dropped frames in my 4K content. I can see in the “Low Quality” log that it is using “dumb mode” like mpv does when it is working OK.

Thanks for your assistance!

Plex HTPCd3d11-1080p-low.zip (27.8 KB)
Plex HTPC-angle-1080p.zip (27.9 KB)
Plex HTPC-low-quality.zip (43.2 KB)

Nope, that wasn’t quite it. Here is a log with HDR on, playing 1080p content (no dropped frames) and 4K HDR content (dropping roughly 4-5 frames per second).

Plex HTPC-with-hdr-on.zip (41.9 KB)

Hi all, I have been running Plex HTPC on my Intel J4205 CPU with Integrated Intel® HD Graphics 505 so far with great results.
I am on Windows 10 LTSB 2016.

However, since you changed something with Qt, starting with version 1.11, there is something strange:
When playing a movie or series, I don’t get the media controls at the bottom when I press Enter. Well, actually they are somehow there, because the keys respond as if they are, they are just not visible. There is just the full video screen.

It does not matter whether I select HW acceleration or not, and I also tried mpv.conf for d3d11; it didn’t help.

On my desktop PC with a discrete AMD graphics card, there is no problem and the controls are shown.
Any ideas where to look for issues?

This one is using the Low Quality preset, so the GPU goes into dumb mode. I wanted to see ANGLE with the Normal Quality preset, to compare ANGLE to D3D11 with the same settings for everything else.

I take it that this is with D3D11? Sounds like you may be better off using ANGLE for now.

Perhaps your GPU (or drivers) has some sort of optimization that prevents anything from being displayed on top of the video content (because it’s cheaper on the GPU)? I’m not sure what can be done in this case.

On another note, I’ve been playing with the Fast Super Resolution Convolutional Neural Network (FSRCNN) shader as a pre-scaler in addition to other external scalers (as mentioned in the links provided in Plex HTPC feedback - #996 by supergregg). I tested this on an RTX 2060 and it appears to perform well.

I took the shaders from GitHub - classicjazz/mpv-config: MPV configuration files for high quality rendering of traditional live TV and video disc formats and put them in a shaders directory inside the directory where the mpv.conf belongs. And then in the mpv.conf I put the lines

glsl-shaders-clr
# luma upscaling
# note: any FSRCNNX above FSRCNNX_x2_8-0-4-1 is not worth the additional computational overhead
glsl-shaders="C:/Users/YourUsernameHere/Local Settings/Plex HTPC/shaders/FSRCNNX_x2_8-0-4-1.glsl"
scale=ewa_lanczos
# luma downscaling
# note: ssimdownscaler is tuned for mitchell and downscaling=no
glsl-shaders-append="C:/Users/YourUsernameHere/Local Settings/Plex HTPC/shaders/SSimDownscaler.glsl"
dscale=mitchell
linear-downscaling=no
# chroma upscaling and downscaling
glsl-shaders-append="C:/Users/YourUsernameHere/Local Settings/Plex HTPC/shaders/KrigBilateral.glsl"
cscale=mitchell
sigmoid-upscaling=yes

to engage the external shaders. Note: substitute your username for YourUsernameHere.

So those of you with high-end GPUs may want to start playing with these shaders. Perhaps we may be able to add them as higher quality options in the future.

No, it’s just an older Intel HD Graphics driver from 2019. It doesn’t have many features.
It’s really strange, because up until version 1.10 it was working.
And overlaying video content with media controls works without problems in other software (I am using LAV decoders with another program, “dvbviewer”).

The video and controls rendering was changed dramatically in 1.11.0 (details spread across multiple prior posts). Prior to this, the video would be rendered into an EGL texture, the controls would be rendered on top of that (also in EGL), and then that was blitted and rendered on the screen. Now, the video is displayed in one child window and the UI is displayed in another, relying on the OS/GPU drivers to composite them. This allowed us to free the video rendering from Qt’s render pipeline, which was causing stuttering and a host of other problems. It sounds like in your case the drivers are disallowing (or refusing to render) a child window placed above the child window containing the video. Likely this depends on the swapchain setup in the video window.

Looks like I need to update my HTPC to a more recent version of Windows 10, because I cannot install any Intel DCH graphics driver.

Vendor-approved drivers sometimes block “unapproved” drivers downloaded directly from the GPU manufacturer. You could try wiping the existing driver with DDU, then installing the most recent Intel driver. I’ve had to do this on some laptops. My most recent laptop refused to let me install drivers from Intel until I completely removed all traces of the version provided by the laptop manufacturer.

Oops, my bad. I’ll gather those logs tonight after the kids go to bed and the HTPC is free for testing.

Yes, that was with D3D11. On my HTPC box in my testing, with ANGLE, 1080p content plays fine in Low and Normal quality settings (haven’t tried others) and 4K HDR is practically unwatchable. For D3D11, in Low Quality, 1080p plays fine and 4K HDR drops ~5 frames/sec; in Normal quality, 1080p drops frames (haven’t quantified how much) and 4K HDR drops ~10 frames/sec.

I hear you and agree that Windows as an OS should really be providing a way to invoke this if they aren’t already. With that said, Nvidia’s API does work reliably in my experience with madVR through Plex for Kodi (it had issues with malformed HDR metadata at times, but long ago). I get this isn’t ideal, but for any users with an Nvidia card, this would be the best experience until MS cleans up its act. I don’t know how difficult it is to scan the active GPU and apply decisions based off that though, so if that opens up a can of worms…

On an RTX 2070 at least, I ran pretty much the same shader config as you all last weekend, save for bumping FSRCNNX to 16-0-4-1 (a remnant of my testing from a year or two ago). Don’t know if it applies to 4K content, as that likely isn’t scaled (haven’t looked into supersampling/forced doubling/etc.), but for upscaled full 1080p TV content, there have been zero issues.
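If anyone else wants to compare the two variants, it should just be a matter of swapping the shader filename in the config posted earlier, along these lines (the path follows that earlier example and the filename follows the repository’s naming, so treat both as placeholders for your own setup):

glsl-shaders="C:/Users/YourUsernameHere/Local Settings/Plex HTPC/shaders/FSRCNNX_x2_16-0-4-1.glsl"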

Note: on the external shaders, I corrected the paths in my prior post. It appears that relative paths aren’t working for some reason (guess that’s what I get for doing all my testing with MPV). Additionally, I forgot to mention that I hit an annoying bug in NVIDIA’s drivers causing the audio to drop out if the GPU is loaded (even at only 30%). If anyone else encounters it, I found the drivers linked in this blog post to resolve the situation: Fix Nvidia RTX 30 Series HDMI HD Audio Drop Out – techbloggingfool.com

What opens up a can of worms is linking to another library and using it (presumably; I’ve still yet to find any documentation on this supposed API, so if you know of it, I’d appreciate a link). Identifying the card is likely not too difficult in comparison.

Is there a noticeable difference? I saw people commenting that there isn’t and it’s not worth the extra load (as you can see from my config where I copied the comment from the mpv.conf in the linked repository).

The luma channel isn’t scaled and the shader is actually skipped in that case. This is more obvious in MPV if you use the stats overlay (maybe we can get that in HTPC at some point). However, you are likely playing 4:2:0 content, which means the chroma channel is half the resolution (e.g., 1920×1080 chroma planes for a 3840×2160 video), so it is scaled with the KrigBilateral shader (assuming your config matches mine).

This is really what FSRCNN is designed for.

It’s a guess, but I think madshi/madVR uses Nvidia’s NVAPI. The only reason I noticed is that the app distinguishes/reports when it’s using OS HDR (i.e., the Windows toggle) vs NV HDR. I’ve only ever had Nvidia GPUs when using madVR, so it’s been able to flip things for me automatically. HDR playback in Plex HTPC looks great now that the d3d11 context can be called; it’s more of a user-experience hiccup than anything.

Thanks for the info on the scaling details :+1: I haven’t done any tests on FSRCNNX 16 vs 8; in the past, 16 had been compared to madVR’s NGU Sharp High, which is what I run there. I could very easily see there being little returns for the extra compute cost, though. I can drop 8 in next time and see how it looks in comparison.

You have my vote if adding the native mpv stats view would be an option. As an info/troubleshooting step it can be helpful, and particularly if Lua scripts/hooks become an option in the future, it makes it easier to see whether certain shaders are being applied.

The “little returns for the extra compute cost” is what I’ve seen written by others. Though it’s worth noting that those others all linked to a single individual at one point, so it could be one person saying it and the rest just parroting it. So I’m curious how you judge it.

Lua is coming, and the stats in MPV are implemented entirely in Lua. Compiling LuaJIT on all the platforms has been a royal pain; in fact, it could very well be that LuaJIT was the worst package of them all to compile. It wasn’t too bad on some platforms but absolute hell on others. However, we finally got it compiled (but not yet tested), and that was too late for next week’s version of HTPC.

Hi gbooker02,

v1.12 with d3d11 enabled has been my best viewing experience with Plex HTPC so far. Thank you for your hard work!

However, I seem to get some inconsistent results with HEVC Main 10 content. One movie starts to play, but only shows a black picture (Plex HTPC D3D11 HEVC not working.log), another movie plays just fine (Plex HTPC D3D11 HEVC working.log). If I don’t use d3d11, then the first movie also plays fine (Plex HTPC DXVA2 HEVC working.log).

I hope you can identify the issue within my logs.

Plex HTPC D3D11 HEVC not working.log (324.4 KB)
Plex HTPC D3D11 HEVC working.log (340.6 KB)
Plex HTPC DXVA2 HEVC working.log (328.6 KB)

This is an odd one. The ANGLE gpu-context is unable to display anything when the hw-pixelformat is not nv12 (because the library is too old, and that’s because it’s the version that Qt uses). So the app detects this circumstance and changes to hwdec=dxva2 should it occur. That is what’s happening in your third log (dxva2 working). However, both your d3d11 working and not-working logs use a hw-pixelformat of p010 (10 bits per color), which I have seen d3d11 display properly. In fact, I cannot see any difference between the two playbacks in these logs. Do you have any idea what is different between these two files that could be the deciding factor in whether you get video or not?

The only real difference I see is that the one not working is HEVC Level 5.1 and the one working is Level 4. If you want, I can give you a download link for the one that’s not working for me, but apparently I can’t DM you.

Edit: While thinking about the problem, I noticed that I hadn’t updated my drivers in a pretty long time, so I did, and now the problematic movie also plays just fine with D3D11. I guess it was an issue with the old Intel Graphics drivers.
By the way: I noticed you are identifying the Windows build incorrectly in the logs; it should be 21H2, not 2009. You should use DisplayVersion in HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion to identify the build, since ReleaseId hasn’t been updated since 20H2.
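For anyone who wants to check their own machine, the value can be read from a command prompt with:

reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion" /v DisplayVersion

On a 21H2 install this reports 21H2, while ReleaseId is still stuck at 2009.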