[INFO] Plex, 4k, transcoding, and you - aka the rules of 4k

LOL, or you could just use a program that simply “just works” like Emby and not have to worry about this whole convoluted mess.

In my limited experience with putting 2160p and 1080p in the same library, the players I use all know the correct version to play by default. Yeah, “duplicates” will exist but it seems like a minor inconvenience.
I did discover that 4K players may need to be configured to avoid transcoding but otherwise it worked well.

3 Likes

Aww bless. Thanks for creating the account to add your invaluable input.

Thanks for your input @Xhaka and for your confirmation @dduke2104. It does indeed seem to work.

3 Likes

@Xhaka
Yes, if you kept duplicate 4K/1080p versions in the same library, you would probably want to split them in order to use the restriction labels.

Personally, I would not keep duplicates in the same library, I would delete the non-4k version, and remote users would just be out of luck. I don’t use the labels myself, I just wanted to let other folks know about them and decide what is best for themselves.

It's good to hear that Plex has become smarter about choosing the best version when you have multiple resolutions in the same library!

Again, for me, I do not want to manage duplicates within the same library, so I'm going to keep 4K separate until/unless Plex can do HDR > SDR conversion and I can keep one high-quality 4K master.

@Joe_Booysen sorry I have no idea, never heard of that device.

If you try it, please post a new thread here somewhere and let everyone know your experience with it and how well (or badly) it works with Plex.

1 Like

Yeah that would be the ideal scenario for sure.

This may be a dumb question, but why is the whole tone mapping problem so hard to solve in Plex when my computer can do it just fine?

The same files that look awful when transcoded by Plex look great when I play them directly from a network share without an HDR monitor. Is it just an ffmpeg limitation?

It's not a dumb question at all.

It's just difficult to understand and explain.

I don't fully understand it myself, and I don't feel qualified to fully explain it, but I believe the gist is:

HDR (as we consumers know it) means both more colors and more brightness.

While technically “HDR” refers strictly to the ratio between the maximum and minimum luminance, the term “HDR video” is commonly understood to imply wide color gamut as well.


SDR color: https://en.wikipedia.org/wiki/Rec._709
HDR color: https://en.wikipedia.org/wiki/Rec._2020

If you look at the two color spaces, you can see that the expansion does not happen equally across all colors; Rec. 2020 has much more green available than Rec. 709.
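
Just to put rough numbers on it (this is only my own illustration, nothing Plex or FFmpeg actually runs), here is a tiny Python sketch comparing the area of the two gamut triangles in the CIE 1931 xy chromaticity diagram, using the published Rec. 709 and Rec. 2020 primaries:

```python
# Illustration only: how much bigger is the Rec. 2020 gamut triangle than Rec. 709's?
# The primaries are the published (x, y) chromaticity coordinates from each standard.

REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]   # R, G, B

def triangle_area(points):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709, a2020 = triangle_area(REC_709), triangle_area(REC_2020)
print(f"Rec. 709 triangle area:  {a709:.4f}")
print(f"Rec. 2020 triangle area: {a2020:.4f}")
print(f"Rec. 2020 covers roughly {a2020 / a709:.1f}x the area, mostly extra greens.")
```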

FFmpeg, when transcoding, sees it as a math problem: it must convert a 10- or 12-bit value for each color channel (times three channels) plus a 10- or 12-bit brightness level into corresponding 8-bit SDR values.

So this is extra work above and beyond any resolution/bitrate/codec conversion.

And another unfortunate part: as far as I am aware, there is no standard that everyone must adhere to for how those 10-12 bit color/brightness values map (or round) to specific 8-bit values.

For each frame to look right, all the colors and brightness values have to be mapped to something close to what SDR would look like natively.

This is directly converting HDR color/brightness to SDR values.
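
To make that concrete, here is a rough Python sketch of the kind of math involved. It is only my illustration of the problem (not Plex's or FFmpeg's actual pipeline), and it only handles the brightness half; the color-gamut half needs its own conversion on top:

```python
# Hypothetical illustration only -- not Plex's or FFmpeg's actual code path.

def pq_to_nits(code_10bit: int) -> float:
    """Decode a 10-bit PQ code value to absolute luminance in nits (SMPTE ST 2084 EOTF)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code_10bit / 1023.0                     # normalised signal value
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def naive_8bit(code_10bit: int) -> int:
    """'Just drop two bits': the value stays PQ-encoded, so an SDR display
    interprets it with the wrong curve and the picture looks washed out."""
    return code_10bit >> 2

def tonemapped_8bit(code_10bit: int, sdr_white_nits: float = 100.0) -> int:
    """Decode to nits, compress highlights with a simple Reinhard curve,
    then re-encode with a crude 2.2 gamma into 8 bits."""
    rel = pq_to_nits(code_10bit) / sdr_white_nits   # 1.0 == SDR reference white
    mapped = rel / (1.0 + rel)                      # smooth highlight roll-off
    return round(255 * mapped ** (1 / 2.2))

for code in (64, 256, 512, 769, 1023):
    print(code, naive_8bit(code), tonemapped_8bit(code))
```

A real tone mapper has to apply a choice like this consistently to every pixel of every frame, which is the extra work mentioned above.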

On the client side, the client is not transcoding or converting any values.

From my limited understanding, it's more like dithering.

Windows/Mac applications can use operating-system/driver-level features to help with color mapping that are not necessarily available to smart TVs or other display devices.

see also

2 Likes

Yeah, but that said, now that I have scanned everything in and deleted the previously separate 4K HDR and 4K Dolby Vision libraries, that little blue "2" or "3" badge is still far less annoying than seeing two or three copies of the same movie in "Continue Watching."

Now that was annoying. :upside_down_face:

Truer words were never spoken…LOL

I do not know how your computer is doing it. But I do know it would take artificial intelligence to do it properly.

Unfortunately, TeknoJunky confuses a few things, and the Wikipedia articles cited are also not the best.

There are a few fundamental things to keep in mind:

  • Digital video is not encoded in RGB but in a different colour space in which a much more efficient data reduction is possible.

  • In that colour space the brightness information and the colour information are largely decoupled. Hence, one talks about luminance and chrominance.

  • The brightness perception of the human eye is not linear, but close to logarithmic.

  • The relation between the encoded brightness value and the brightness output on the screen is non-linear. This is the "gamma," or "gamma correction." More precisely, there is a mathematical function mapping one to the other which historically contained a parameter named gamma. (See the small sketch after this list.)
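
A quick way to see the last two points in numbers, as a tiny sketch of an idealised display (I assume a pure 2.4 power-law gamma here purely for illustration; real displays follow BT.1886 and differ in the details):

```python
GAMMA = 2.4  # assumed idealised display gamma (illustration only)

def relative_luminance(code_8bit: int) -> float:
    """Map an 8-bit SDR code value to relative light output (0.0 .. 1.0)."""
    return (code_8bit / 255.0) ** GAMMA

for code in (64, 128, 191, 255):
    print(f"code {code:3d} -> {relative_luminance(code):.3f} of peak white")

# Code 128 is half the *signal* but only ~0.19 of the *light*: the encoding spends
# more code values on the dark end, roughly matching our near-logarithmic perception.
```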

As far as I understand, the problem with the HDR-to-SDR conversion is more with the luminance than the chrominance. But I may be corrected.

In SDR video, 8 bits plus the gamma correction allow reproducing a brightness range in which the brightest spot is about 2^7 times brighter than the darkest one. That is, one can double the brightness about 7 times before running out of the dynamic range the format can encode. However, the human eye is able to resolve about 10 such doublings.

If one used 8 bits to describe a larger dynamic range, the eye would see "bands": a luminance gradient would not appear as a continuous change but as discrete steps. (The eye is really good at spotting these.)

In HDR, using 10 or 12 bits, a higher dynamic range can be encoded without running into banding. With that comes the need for a different gamma correction function.
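
To put a number on that, here is a small sketch (again assuming an idealised pure-power 2.4 gamma, which is only an approximation of real SDR) that counts how many 8-bit code values fall into each "stop", i.e. each doubling of brightness below peak white. More than about 7 stops down, only a handful of code values remain per stop, which is exactly where a smooth gradient turns into visible bands:

```python
import math
from collections import Counter

GAMMA = 2.4  # assumed idealised SDR display gamma (illustration only)

def stops_below_white(code_8bit: int) -> int:
    """How many doublings of brightness below peak white this code value sits."""
    luminance = (code_8bit / 255.0) ** GAMMA
    return int(-math.log2(luminance))

codes_per_stop = Counter(stops_below_white(c) for c in range(1, 256))
for stop in sorted(codes_per_stop):
    print(f"{stop:2d} stops below white: {codes_per_stop[stop]:3d} code values")
```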

The upshot is that you cannot just chop off the least significant bits of the 10- or 12-bit information to obtain a good 8-bit representation of the image.

Of course you can define a stupid math function doing that semi-properly, but implicitly you have to make an artistic decision: where do you want to discard detail, in the darker or the brighter parts of the image? In the professional world a human would make that decision scene by scene.
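
Just to make the "artistic decision" concrete with a deliberately dumb example (both curves below are invented for illustration; real colourists and tone mappers use far more sophisticated ones):

```python
# Two made-up ways to squeeze a 0..1000 nit HDR range into a 0..100 nit SDR range.
# Neither is "correct" -- each one sacrifices detail in a different place.

SDR_PEAK = 100.0    # nits, assumed SDR reference peak
HDR_PEAK = 1000.0   # nits, assumed HDR source peak

def protect_shadows(nits: float) -> float:
    """Pass everything below the SDR peak through unchanged and hard-clip the rest:
    shadows and midtones keep their detail, highlights are blown out."""
    return min(nits, SDR_PEAK)

def protect_highlights(nits: float) -> float:
    """Scale the whole range linearly: highlights survive,
    but shadows and midtones become ten times darker and lose detail."""
    return nits * (SDR_PEAK / HDR_PEAK)

for nits in (5, 50, 100, 400, 1000):
    print(f"{nits:6.1f} nits -> clip: {protect_shadows(nits):6.1f}   "
          f"scale: {protect_highlights(nits):6.1f}")
```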

Added later:

I should stress the following point: with motion pictures, you cannot obtain good results by working frame by frame, as the camera might follow the action and the lighting conditions or the focus of interest change. Hence, an algorithm has to analyse a complete scene, or at least a larger time window.

Oh, this became long, but I hope it helps.

2 Likes

Right. All I did was follow the very first line from OP and I’m extremely happy.

TL;DR: buy a Shield > 4K/Atmos receiver > 4K TV.

I used to try and make this work with a PS4. Stuff that. The Shield is an excellent device, it's so fast. My audio/video quality is insane.

1 Like

I own a Shield 2017. It is by far the best streamer I have ever owned so I agree with your sentiment.

Except one glaring omission: it is not capable of 3D-MVC, and according to NVIDIA it never will be. If they added that feature it would eliminate virtually all other competing devices, IMO. It already does virtually everything else.

Of course for those that don’t care about 3D-MVC that is not a concern so party away! For those of us that still want and enjoy 3D, it is the day to day streamer and gets pushed aside on 3D days to be replaced by a LibreELEC box or similar.

I covet 4K, but can see why so many are so confused by UHD, 10 bit color, HDR, etc.

I still use a 1080p Sharp in my bedroom and it’s infuriating to go watch a 1080p flick, only to see it’s an unlabeled 10-bit file, replete with blown out colors and essentially unwatchable on anything but an HDR display.

Like so many have mentioned, it’s difficult to understand why some can pull off HDR-to-SDR tone mapping, while so many others can’t. I can watch the 10-bit 1080p flick perfectly fine on my SDR Plex Server display, but the moment I want to stream it, it’s right back to overexposed video.

I'm struggling to understand how the Plex server can play it locally, yet streaming the same video reverts to an SDR mess. Why can't the stream be handled properly when Plex is using the very same hardware for both local playback and remote streaming?

I beg to differ. I have both 1080p and 4K sets in larger formats (90" HD & 70" 4K). I'm about 10 ft from my 90" HD panel and you can very much see those pixels. I've managed to keep it in check by reducing sharpness, which does improve the image, albeit leaving it a bit soft.

Regarding the 70” UHD set: I can’t see any pixels until I’m so close my breath begins to fog the screen. This is the promise of 4K and it delivers in spades.

Now, we can have a discussion about nearly all the streaming services struggling to break 300 nits in HDR, but that has nothing to do with resolution. Nearly all of Disney+ HDR offerings show woefully low dynamic range, despite being tagged as HDR titles. The exception appears to be Mulan, which peaks at an appreciable 900+ nits on non-OLED displays. There's improved detail in 4K titles like The Mandalorian, but dynamic range and overall contrast are identical.

Lastly, the difference between a Blu-ray & UHD disc (with legit HDR) is night and day. I question the value of 8K, simply because I’ve only seen sets in showrooms, but moving from 2K to 4K was a noticeable improvement.

If video is the only "must have," then an ATV, or Roku, or other device that can play 4K HDR natively will work. And there are some titles (limited, e.g. "Color Out Of Space" and season 1 of "Stranger Things") that are "just" 4K, no HDR. If you want a single copy of those, great: they work on everything, no tone mapping required. I think the issue is: "When does my playback suck?" It sucks whenever Plex has to tone map, and there are, at minimum, about 5 different scenarios where Plex "has" to tone map. Having a Shield does not IN AND OF ITSELF guarantee there's no tone mapping, especially since your scenario is "Hey, here's a one-client solution!" If we wanted one-client solutions, none of us would have a Plex, amiright?

1 Like

It does not help at all. E.g., in DaVinci Resolve on a PC, you can engage a Dolby Vision algorithm (this means something COMPLETELY different HERE than you think it does) to, more or less, beautifully tone map 10-bit BT.2020 HDR into the 8-bit BT.709 color space/dynamic range using any "simple" PC with no external hardware required. MadVR yields good results too. And the HCX chip inside the Panasonic UHD player does some great on-the-fly HDR->SDR tone mapping too.

I agree and have stated that many times on many threads.

If TrueHD/Atmos is not desired, then those other options could be completely suitable for someone's needs.

Otherwise, until another solution presents itself, the Shield remains the only mainstream device with HD audio/Atmos/Dolby Vision support.

As far as tone mapping, Plex will either solve it eventually, or we can choose to use something that can, live with it, or avoid the need by not using HDR content on SDR equipment.

1 Like

But even here, again, caveats. E.g., you've got an Atmos source file, but your client is playing 2.0, 2.1, or 5.1 audio with no Atmos capability. Whatever client you've got will transcode the Atmos into non-Atmos audio, which is acoustically a moot point more or less for a 2.0, 2.1, or 5.1 setup. And say you've got a Roku playing Atmos: it will output Dolby Digital Plus 7.1 from that Atmos source. If you have a 7.1 setup it sounds good, and if you have a 5.1 setup (or less) that 7.1 will be downmixed to 5.1 inside your (non-Atmos) receiver anyway and still sound great. So as I said: caveats. And "tone mapping" Atmos down to 5.1 or 7.1 at the client is FAR more reasonable than tone mapping ANY HDR video, which always looks like crap. Sounding pretty great instead of really great is far preferable to looking bad instead of good. The Atmos (and even needing a receiver) is a red herring, in other words. We didn't come here to debate audio nuances. We came to debate the unacceptable state of video.

Plex themselves say “Anywhere. Any time. Any device.” Your solution is very anti-Plex (and Plex itself is very anti-Plex for not supporting tone mapping)… and for anyone who loves Plex, it just feels restrictive and wrong.

I really don’t understand what you are arguing for, or against.

Many people do not care about Atmos or HD audio.

That is perfectly fine; then just about any 4K-capable device may work for them.

When they attempt to play such HD audio and/or Atmos, Plex will transcode the audio as necessary to meet the limitations of whatever their playback chain has.

Subtitles and audio transcoding can cause transcoding, which is one reason why the Shield is still a good choice for all users, since it can still direct play most subtitles and HD audio.

There are also many folks who WANT proper HD audio and Atmos.

For us, the only mainstream solution is the Shield, along with an HD audio/Atmos receiver or soundbar or whatever.

It sucks that TrueHD is not backwards compatible with Dolby Digital.

It sucks that transcoding audio with subtitles can cause video transcoding.

It sucks that smart TVs can't handle HD audio.

The majority of these issues are caused by using content in ways that were not/are not intended.

Most smart TVs and streaming devices are simply not designed or intended to stream high-bitrate 4K Blu-ray remuxes.

Don’t be obtuse, I’m arguing against your whole train of thought. I’m arguing against anyone at Plex thinking, “Hey this guy makes some good points; we are OK for now.”

Any movie you put on your server, I’m going to go out on a limb and say the “intent” is to watch it, with sound. Radical thought, I know. Plex can transcode any sound format to 2.0 or 5.1 or 7.1, as needed, and the results are listenable. E.g., if you’re “just” watching on a 4K TV, or an iPad, or a phone, or a PC, or remotely 100 miles from your house, you are not “aggravated” that Plex is transcoding audio. It’ll sound good. However, if it’s a 4K HDR source file, you will be aggravated by an essentially unwatchable picture any time Plex transcodes the video. It’s not that it’s NOT transcoding the HDR video… it’s that in contradistinction to audio transcoding, it’s transcoding video very poorly. If Plex were transcoding audio like it does HDR video, all the sound would be lowered by an octave and the music/orchestration would be muted.

The whole raison d’etre of Plex is: transcode. “Most smart tvs and streaming devices are simply not designed or intended to be used to stream high bitrate 4k bluray remuxes,” you say. And you’re right! But Plex WAS meant to stream them to ANY device. And right now it’s got a big weak spot.

Your arguments to avoid transcodes are kludges, by definition. Which is fine, but it's non-exhaustive kludging. E.g., why not recommend users "pre-transcode" their 4K HDR videos to 4K SDR at 80% of the original bit rate,* keeping the HD audio (see the sketch at the end of this post)? Then we'd have one file, playable across all devices, saving server space etc., vs. a two-file setup (the original 4K HDR plus some pre-transcoded 1080p version). That's not in the "rules." But it's the solution that would work best technically across the "anywhere, any device" ethos Plex professes.

  • You could pay $20/month for Adobe Premiere, and run your HDR videos through the “SDR conform” process… it looks fine, and is a heck of a lot better than Plex’s “SDR conform.”
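
For what it's worth, here is a rough sketch of that "pre-transcode once" idea, shelling out to ffmpeg with the commonly circulated zscale + tonemap filter chain. Treat all of it as assumptions to verify against your own setup: it needs an ffmpeg build that includes the zimg/zscale filter, the hable curve and the CRF value are just placeholder choices of mine (ffmpeg's -b:v would be the knob for the 80% bit-rate target instead), and the file names are hypothetical.

```python
import subprocess

# Commonly circulated HDR (PQ/BT.2020) -> SDR (BT.709) filter chain for ffmpeg.
# Requires an ffmpeg build with the zscale (zimg) filter; adjust to taste.
TONEMAP_CHAIN = (
    "zscale=t=linear:npl=100,format=gbrpf32le,"
    "zscale=p=bt709,tonemap=tonemap=hable:desat=0,"
    "zscale=t=bt709:m=bt709:r=tv,format=yuv420p"
)

def pre_transcode_to_sdr(src: str, dst: str, crf: int = 18) -> None:
    """Produce a 4K SDR copy of an HDR source while copying the original audio/subs."""
    cmd = [
        "ffmpeg", "-i", src,
        "-map", "0",                    # keep all streams from the source
        "-vf", TONEMAP_CHAIN,           # tone map the video to SDR BT.709
        "-c:v", "libx265", "-crf", str(crf),
        "-c:a", "copy",                 # keep TrueHD/Atmos etc. untouched
        "-c:s", "copy",                 # keep subtitles as-is
        dst,
    ]
    subprocess.run(cmd, check=True)

# Hypothetical usage:
# pre_transcode_to_sdr("Movie.2160p.HDR.mkv", "Movie.2160p.SDR.mkv")
```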