4K HDR - will plex ever support it?

Generally no, but with exceptions.
If the TV doesn’t do eARC, there is no HD audio.
The Shield will most likely be much faster anyway.

Only if you need DV will the Shield not be the better option.

Thanks to the users who just replied. To clarify, I have been playing 4K rips through my Shield. Reading this thread made me start questioning whether I am seeing any kind of HDR via the Shield, whether it be Dolby Vision or HDR10. Honestly, I have no idea whether Dolby Vision or HDR10 is present in my MakeMKV rips, although I seem to remember seeing that in the metadata my Plex Media Server shows for some 4K rips.
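If you want to check a rip directly instead of relying on what Plex shows, ffprobe can report the transfer characteristics of the video stream. The sketch below is my own classification logic, not part of any Plex tooling, and it assumes ffprobe is installed: HDR10 streams report `color_transfer` of `smpte2084` (the PQ curve), HLG reports `arib-std-b67`, and Dolby Vision shows up as a "DOVI configuration record" in the stream's side data.

```python
import json
import subprocess

def classify_hdr(stream):
    """Classify the HDR type of one video stream from ffprobe JSON output.

    HDR10 uses the SMPTE ST 2084 (PQ) transfer function; HLG uses
    ARIB STD-B67; Dolby Vision carries a "DOVI configuration record"
    in the stream's side data list.
    """
    for side_data in stream.get("side_data_list", []):
        if side_data.get("side_data_type") == "DOVI configuration record":
            return "Dolby Vision"
    transfer = stream.get("color_transfer")
    if transfer == "smpte2084":
        return "HDR10 (PQ)"
    if transfer == "arib-std-b67":
        return "HLG"
    return "SDR"

def probe_first_video_stream(path):
    """Run ffprobe and return the first video stream's metadata as a dict."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

# Example with a hand-written stream dict (no ffprobe call needed):
sample = {"color_transfer": "smpte2084", "color_primaries": "bt2020"}
print(classify_hdr(sample))  # HDR10 (PQ)
```

On a real file you would call `classify_hdr(probe_first_video_stream("Movie.mkv"))`; the filename here is just a placeholder.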

most 4K/HDR TVs will have an indicator when the TV is in HDR or DV mode.

this indicator might be an option on your TV, so it may or may not be enabled depending on your settings.

my TV pops up a small HDR logo whenever HDR content starts playing. it doesn’t stay on; it only shows for a few seconds.

in plex, it will show something like this
VIDEO 4K (HEVC Main 10 HDR)
AUDIO English (DTS-HD MA 7.1)

I don’t think Plex will indicate Dolby Vision; it is difficult to even rip a Dolby Vision Blu-ray with DV intact.

even more, different TVs have different levels of Dolby Vision compatibility.

the simplest way is to play it directly from a DV Blu-ray disc.

or just don’t even worry about dv.


You’re certainly not missing a Dolby Vision logo when using a Shield.
It doesn’t support it.
For that you need an ATV4K, an LG TV, and maybe a few other brands of TV, but almost certainly not a Samsung TV, as Samsung backs HDR10+ instead.

So am I safe to assume that HDR10 will work since Plex passes it through and my Sony X950G supports HDR10? Also, I am hopeful that Sony will add HDR10+ capability to my TV. If they do, has anyone had success with HDR10+ using Plex and a Shield?

And thanks all for your replies, very helpful…

Thanks TecknoJunky, and I do recall seeing some of my rips showing as HEVC Main 10 HDR

I wouldn’t be too hopeful.

Not only do manufacturers fail to keep old hardware up to date (and if it has been sold, it is old), it’s unlikely they will go back to improve something old when they can just add the feature to the next version as a reason to get us to buy the next thing that comes along.

not only that, I’m pretty sure HDR10+ is not simply a software update; it also requires newer hardware that understands it.

the Shield does not support HDR10+, and as well as NVIDIA keeps it up to date, I don’t foresee it getting DV or HDR10+ without a new hardware release.

To answer the questions here

  1. Plex doesn’t indicate Dolby Vision. Plex isn’t licensed for it. If it happens to be in a Direct Play stream, so be it.
  2. As more pieces of hardware or software are added to the path from the source (PMS) to the end player (TV), the least capable device will ultimately control what happens. It’s for this reason I use PMS -> ATV 4K -> TV. My Onkyo (RZ series) is fully licensed; however, the best Apple allows is LPCM output. With the LPCM output and the Onkyo’s DSP, I can’t tell the difference. I would need a full qualification test suite (files) which tests each audio mode to know the difference.

@dalebeamusa_hotmail_com

I am a purist. I believe that if I’m going to expect the best performance I need to do the following:

  1. Rip it myself (Remux – no HandBrake to save space because saving space discards data)
  2. Have the storage to hold it, as-ripped
  3. Have the network to deliver it when everyone in the house wants to watch different things at the same time (OFTEN hahaha)

When I do the math:
Three or four 80+ Mbps movies, at full quality, are demanding on the typical LAN. The devices are all capable, but this does require a LAN capable of handling steady state plus the startup surge.
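As a rough sketch of that math (the per-stream bitrate and startup surge multiplier below are illustrative assumptions, not measurements):

```python
# Rough LAN budget for several simultaneous UHD remux streams.
# The bitrate and surge factor are illustrative assumptions.
STREAM_MBPS = 85     # typical UHD Blu-ray remux video+audio bitrate
STREAMS = 4          # concurrent viewers
SURGE_FACTOR = 2     # players often buffer aggressively at startup

steady_state = STREAM_MBPS * STREAMS                          # sustained load
worst_case = steady_state + STREAM_MBPS * (SURGE_FACTOR - 1)  # one stream surging

print(f"steady state: {steady_state} Mbps")          # 340 Mbps
print(f"with one startup surge: {worst_case} Mbps")  # 425 Mbps
```

A wired gigabit LAN absorbs this comfortably; a 100 Mbps switch segment or typical Wi-Fi link will not.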


MakeMKV may I ask?

Yes, followed by mkvtoolnix to take out those tracks I will never use.
The process finishes off with FileBot.
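For anyone curious what that mkvtoolnix step looks like, here is a sketch that builds (but does not run) an mkvmerge command keeping only the tracks you want. The track IDs and filenames are hypothetical; mkvmerge's `--audio-tracks`/`--subtitle-tracks` options select which tracks survive the remux, and `mkvmerge --identify input.mkv` lists the real IDs.

```python
def keep_tracks_cmd(src, dst, audio_ids, subtitle_ids):
    """Return the mkvmerge argv that keeps only the given audio/subtitle tracks.

    Find real track IDs first with: mkvmerge --identify input.mkv
    """
    return [
        "mkvmerge", "-o", dst,
        "--audio-tracks", ",".join(str(i) for i in audio_ids),
        "--subtitle-tracks", ",".join(str(i) for i in subtitle_ids),
        src,
    ]

# Hypothetical example: keep audio track 1 and subtitle track 3.
cmd = keep_tracks_cmd("Movie (2019).mkv", "Movie (2019).trimmed.mkv",
                      audio_ids=[1], subtitle_ids=[3])
print(" ".join(cmd))
```

Building the argument list separately from running it makes the command easy to review before touching a large remux file.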


Hello, I just bought a 4K TV about a week ago. I know next to nothing on the subject of 4K or HDR. I was planning on hooking up my Desktop PC to my TV (Sony XBR55A9F) via the receiver (Marantz NR1710). When I connected it, I went into the NVIDIA control panel and selected the appropriate resolution. But NVIDIA would only let me select 8bit color, no higher. All of the other options for setting the color format like RGB was also not available from the drop down menu.

Regardless, Windows 10 said my display was HDR capable and so booted up Plex Media Player and played a remuxed blu-ray file. While it was playing, the TV gave me the “HDR” logo in the picture settings menu, indicating it was getting HDR, and the audio played just fine as well.

On other forums, people said I need to set the color to 10-bit or higher, but that option, for whatever reason, isn’t available to me. I’m not sure if it’s the graphics card, or the HDMI cable itself, or if it’s because I’m going through the receiver. Does it even matter, since PMP only supports 8-bit anyway?

I guess I’m just asking if I’m fooling myself into thinking I’m watching a movie with HDR quality when, in point of fact, I am not. Thank you

I would suggest starting @ [INFO] Plex, 4k, transcoding, and you - aka the rules of 4k

if your PC is saying it is outputting 8-bit, then it is not HDR.

the most likely explanation is that your receiver is upconverting to HDR, so the TV only sees what the receiver has upconverted.

I would expect you can disable the upscaling in the receiver options.

you might also check your receiver if it has a “PC” video mode you can enable that might allow the pc to output hdr.

you should also check your video card model if it even supports hdr.

finally, if PMP doesn’t support hdr (I don’t use it so I can’t confirm) then none of that really matters.

I’d like it if 4K HDR actually worked properly when both ends support it. My Windows server supports it, my Xbox One X supports it, and I can Direct Play 4K HDR; for 5-20 minutes the stream looks and runs great, then out of the blue you get a buffering loop for no reason.

On the Plex server it looks like no network activity is going on, even though the server and the Xbox are both perfectly online with no issues.

the Xbox has its own issues with 4K HDR, and still doesn’t support bitstreaming TrueHD/Atmos or DTS.

Plex devs have stated the main issues are things MS has to fix or allow.

I have an NVIDIA GTX 1080. The Plex PMP FAQ says Plex supports HDR at up to 8-bit color. My receiver gives no options for upscaling anything, except to set the HDMI video to enhanced.

When I set all of this up, the “HDR” logo only comes on during movie playback inside PMP. If I’m just browsing my library in PMP, there is no HDR logo.

that looks like a nice receiver, one of the new ones with eARC support, nice!

sounds like a euphemism for upscaling.


HDMI to HDMI Scaling - up to 4K 30/25/24


HDR will only show during HDR content, not while browsing the library.

maybe it is working, if the HDR logo is coming from the TV itself; mine does the same thing and flashes an HDR logo for a couple of seconds when switching to HDR mode.

I tried hooking the computer up directly to the TV, hoping to use eARC to send HD audio down to the receiver (I suspect my cable isn’t right, because that’s currently not working). The HDMI cable I was using caused the picture to constantly flicker on the TV; it kept cutting in and out to the point that I couldn’t even try to change the color settings in the NVIDIA control panel. I think I’m going to use it in 8-bit mode anyway, since I don’t currently have any other viable way to watch remuxed Blu-ray content.

But whatever workaround they were using a few months ago actually worked. I was able to consistently watch 4K HDR movies, especially if the audio was AAC or AC3. Now those same movies buffer endlessly at random times.

ok gotcha.

all we can do is wait for the forthcoming ffmpeg update to arrive and go from there; it doesn’t make any sense to attempt to fix the old transcoder while a new one is being worked on.

Well, waiting and/or understanding WTF is going on would certainly be easier if they still had support people answering questions in these forums.