Excuse me if I'm asking something that has already been discussed and answered (I believe it has, since there are several threads, but they are fairly old and may not reflect the current status).
I have a Samsung Q70R TV, which supports 10-bit and even 12-bit color.
I also have an Nvidia Shield TV, which supports HDR10.
I am trying to play an HDR10 4K video in Plex on the Shield.
The TV shows HDR, not HDR10.
Per a recommendation from the Nvidia forums, I set:
on the Shield:
3840x2160, 59.940 Hz
YUV 420 10-bit Rec. 2020
on the receiver:
4K Signal Format: Enhanced (meaning: "TV, playback devices and cables support 4K 60p 4:2:0 10-bit video")
TV Format: PAL
on the TV: Input Signal Plus is enabled for the HDMI input.
And all my cables are HDMI 2.1.
So what might be wrong?
Is there an unsupported setting I'm trying to use?
My mistake.
The TV shows an "HDR10+" indicator only for HDR10+ video.
My file was HDR10, not HDR10+.
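For anyone else hitting this: before blaming the playback chain, it's worth checking what the file actually carries. Here's a minimal sketch of how I check, assuming ffprobe from a reasonably recent ffmpeg is on your PATH. HDR10+ dynamic metadata lives in per-frame side data rather than the stream header, and the exact label ffprobe prints can vary between ffmpeg builds, so the matching is deliberately loose:

import json
import subprocess
import sys

def hdr_kind(path):
    # One ffprobe call: color transfer from the stream header, plus the
    # side data of the first frame (HDR10+ metadata is per-frame, so it
    # never shows up in the stream header alone).
    cmd = [
        "ffprobe", "-v", "quiet", "-print_format", "json",
        "-select_streams", "v:0",
        "-read_intervals", "%+#1",
        "-show_entries", "stream=color_transfer:frame=side_data_list",
        path,
    ]
    info = json.loads(subprocess.run(cmd, capture_output=True, text=True).stdout)
    side = [
        sd.get("side_data_type", "")
        for frame in info.get("frames", [])
        for sd in frame.get("side_data_list", [])
    ]
    # Recent ffmpeg builds label this side data
    # "HDR Dynamic Metadata SMPTE2094-40 (HDR10+)".
    if any("SMPTE2094-40" in s or "HDR10+" in s for s in side):
        return "HDR10+ (dynamic metadata found)"
    if info.get("streams", [{}])[0].get("color_transfer") == "smpte2084":
        return "HDR10 (static PQ metadata only)"
    return "no PQ-based HDR detected"

if __name__ == "__main__":
    print(hdr_kind(sys.argv[1]))

mediainfo works too, if you prefer that tool; it reports HDR10+ as "SMPTE ST 2094 App 4" on its "HDR format" line.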
So I found an HDR10+ sample and tried that (playing it directly on the TV).
The TV shows HDR10+ with this file.
However, it shows only HDR when the file is plain HDR10.
So the indicator shows:
HDR for HDR and HDR10 content
HDR10+ for HDR10+ content
But when I play the HDR10+ file through Plex on the Shield, it only shows HDR.
Does the Nvidia Shield support HDR10+?
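In case it helps with debugging: you can also ask the Shield itself which HDR types Android advertises, over adb. A rough sketch, assuming adb is installed and network/USB debugging is enabled on the Shield; the dumpsys field name is an implementation detail and may differ between Android versions, so the parsing is approximate:

import re
import subprocess

# Android's Display.HdrCapabilities encodes HDR types as ints:
# 1 = Dolby Vision, 2 = HDR10, 3 = HLG, 4 = HDR10+ (the HDR10+ type
# was only added in Android 10, so older firmware cannot report it).
HDR_TYPES = {1: "Dolby Vision", 2: "HDR10", 3: "HLG", 4: "HDR10+"}

dump = subprocess.run(
    ["adb", "shell", "dumpsys", "display"],
    capture_output=True, text=True,
).stdout

# Look for something like "mSupportedHdrTypes=[1, 2, 3]" in the dump.
match = re.search(r"SupportedHdrTypes=\[([^\]]*)\]", dump)
if match:
    found = [int(t) for t in match.group(1).split(",") if t.strip()]
    print("Advertised HDR types:", [HDR_TYPES.get(t, t) for t in found])
else:
    print("No HdrCapabilities line found; inspect the dumpsys output manually.")

Note this only tells you what the OS reports for the current HDMI link, not whether the Plex app itself passes HDR10+ metadata through.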