Is there a solution to play every codec natively?

I’m trying to have a way to play everything natively. It doesn’t seem to exist. But maybe someone knows something I missed? I want to be able to play 4k HDR/Dolby Vision/Dolby Atmos/Dolby True HD. No transcoding. All these freaking formats are driving me nuts, man…

With the LG Web OS, 4k HDR video is fine, but True HD and Atmos will not passthrough. This isn’t Plex’s fault. My 2017 LG B7 lacks eARC and apparently will never get it. Regular ARC lacks the necessary throughput. Even though I hate the quality loss from transcoding, I might’ve looked the other way, except True HD and Atmos can’t be transcoded w/o frequent buffering, even though I’d think my Windows 10 media server should be more than capable of transcoding audio only, since the video direct plays.

With my HTPC, audio passthrough is great. But HDR is messed up. It looks weird. Things that are meant to be white have blue artifacts. It’s bad. Tone mapping gone awry, or something. I’ve tried a multitude of settings. People on the internet say Microsoft has not implemented HDR properly.

Is there any device out there that I could plug directly into my AVR that would play or passthrough everything? The Nvidia Shield seems to be the closest but lacks Dolby Vision. Does anyone know of any device, current, future, or even just rumored? I know there’s a rumor of a new Shield. Can the new Raspberry Pi 4 be made to do it? ODROID?

Or maybe someone has suggestions for making HDR work right in Windows? I have an Nvidia 1060 GPU. Thank you

nope.

nvidia shield is the closest all around solution.

maybe someday they will come out with either an update or a new shield device with DV support.

In any case, you can’t currently get DV + hd audio other than via a real 4k bluray player.

DV is difficult to extract/rip, and can only be used in conjunction with mp4 and lossy audio.

Amazon just announced their new FireTV products. The FireTV Cube and the new FireStick 4K have nearly everything, including Dolby Vision and HDR10+; they just lack DTS support as far as I can tell.

Here’s the FireStick 4K and the FireTV Cube Codec Support:
Video: Dolby Vision, HDR 10, HDR10+, HLG, H.265, H.264, VP9
Audio: AAC-LC, AC3, eAC3 (Dolby Digital Plus), FLAC, MP3, PCM/Wave, Vorbis, Dolby Atmos (EC3_JOC)
Photo: JPEG, PNG, GIF, BMP

fire sticks/devices do not pass through any hd audio

streaming dolby vision, and streaming dolby atmos are not the same thing as DV/DA from a 4k bluray.

I was curious about the latter part of that, can you explain why streaming isn’t the same as it is from a UHD disc? I also did find it odd that they covered eAC3 and Atmos but not TrueHD… that was a bit of a head scratcher…

dolby digital/ac3/eac3 are all lossy. streaming atmos is simply one of those lossy codecs + atmos metadata.

https://developerkb.dolby.com/support/solutions/articles/16000067758-what-is-dolby-digital-plus-joc-joint-object-coding-

truehd is the lossless HD dolby audio (blurays/4kblurays)
4k bluray dolby atmos = truehd + atmos metadata
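
if you want to see which of these a given rip actually carries, ffprobe (part of ffmpeg) will list the audio streams. a minimal sketch, assuming ffmpeg is installed; movie.mkv is just a placeholder name:

```bash
# list each audio stream with codec, profile, and channel count
# ("truehd" = lossless; "ac3"/"eac3" = lossy; atmos rides on top as metadata)
ffprobe -v error -select_streams a \
  -show_entries stream=index,codec_name,profile,channels \
  -of compact movie.mkv
```

on recent builds, an atmos track’s profile typically shows up as something like “Dolby TrueHD + Dolby Atmos”.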

dolby vision comes in multiple types, and each type has multiple versions.

streaming dolby vision is (generally) a single video stream/layer with the DV metadata encoded directly into it, which the streaming player (smart tvs/streaming devices) can play back directly.

dolby vision from 4k blurays is 2 separate video layers, the base video and the DV enhancement layer; the bluray player will normally decode and combine both layers and send the video out via standard hdmi to compatible hdr/dv displays.

within those 2 major types, there are essentially differing versions that define the quality level and/or bitrate. Much like how you can use handbrake to convert to different bitrates and/or profile levels.
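
side note for the curious: recent ffmpeg builds expose the dolby vision configuration record in the video stream’s side data, so you can check which type/profile a file carries. a rough sketch, assuming a new enough ffmpeg (older builds simply won’t print the record; movie.mp4 is a placeholder):

```bash
# dump the video stream as json and pull out the dolby vision configuration record;
# dv_profile gives the type (e.g. 5, 7, 8) and el_present_flag=1 means a separate
# enhancement layer (the 4k bluray style)
ffprobe -v error -select_streams v:0 -show_streams -of json movie.mp4 | grep -i -A 8 dovi
```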

we will likely never see 4k bluray level video/audio content streamed over the internet from any streaming provider, simply due to the bitrates of 4k bluray media, which can already exceed 100 Mbit/s.
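
the back-of-envelope math, with illustrative numbers rather than any specific disc: a 66 GB feature spread over a 2 hour runtime averages out to:

```bash
# 66 GB * 8 bits/byte over 2 hours, in Mbit/s (decimal units for simplicity)
echo $(( 66 * 8 * 1000 / (2 * 3600) ))   # ~73 Mbit/s average; peaks run far higher
```

compare that to the 15-25 Mbit/s that streaming services typically budget for 4k.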

content streaming companies control both their player applications, and how their content is encoded.

they do not have to support every different video/audio codec under the sun, they only have to support whatever specific codecs they choose to.

I gave up on my amazon devices when I realized they simply do not want to create devices which support anything other than streaming level content.

they do not want to empower people that rip their own content to watch on their devices, they want us to stream from prime or other supported streaming apps, where they can control the content, monitor viewing habits, and provide ads.

we are extremely fortunate that a device like the nvidia shield exists at all.

smart tv and streaming device makers do not want us to control our own media. like amazon, they want to control the content and the experience, for their own profit.


I follow you on the control part. However, from what I understand, Dolby Vision/HDR10+ is basically doing the same thing as Atmos/DTS:X: they place a metadata stream/layer that tells the player how and where to dynamically place the specular and the deep black details on a frame-by-frame basis, effectively giving the overall movie a larger dynamic range. That’s how it works, right? So, if the metadata content was encoded directly into a single video stream rather than being an extra layer, the information would still be there? I guess I don’t understand why that’s such a big deal?

the main deal is dolby is very protective of their IP, everything must be licensed to be able to decode/playback, and that licensing is neither cheap, nor open source.

as far as the video/metadata relationship, think of it like this:

say netflix/amazon/whoever provides a streaming dolby vision video at a 10-15 Mbit/s rate; the single stream video has the DV metadata encoded within it, and the player is licensed and can decode that video stream and apply the DV metadata.

when a user does not have a DV compatible player/tv, then they send a non-DV video stream instead.

now compare a 4k bluray, the bluray does not know what the display will be, so it has to support SDR/HDR/DV/whatever else the device maker decides to support.

bluray uses multiple video layers, and the bluray player decodes/combines the appropriate layers into a video stream that is compatible with the display. HDMI carries the information between the disc player and the display about what capabilities are supported, and they negotiate the best compatible video/audio streams.

same with audio, the bluray player can decode whatever audio stream is selected and send it to the tv in a format that the tv understands (stereo PCM, DD, DTS, etc). If the tv has arc/earc and a receiver, then the receiver tells the bluray disc player which codecs it can receive and the player obliges.

Right, but I guess my point was, from a plex server standpoint, encoding the HDR10+ or Dolby Vision metadata layer directly into the video without sacrificing video quality by shoving it into a 15 Mbit/s pipe should really be no different than playing from the disc, provided your TV/device is capable of the playback… But I might be missing something somewhere…

Ok, when a 4k bluray is ripped, it becomes a single layer video.

That is why when you try to play an HDR video on an SDR tv, the colors are washed out, because (currently) plex doesn’t do any color conversion or remapping and the tv doesn’t understand HDR.

as far as I am aware, there are no hdr10+ rips yet, so for now we only have regular hdr rips.

DV rips, as mentioned previously are limited to lossy audio only.

so when direct playing 4k/hdr, plex is not combining or decoding or encoding the video, the device itself can read the provided HDR video and boom we have hdr video.

When transcoding, plex takes that native x265 10bit video stream and converts to x264 (which is what we have been using for years). Plex currently does not do any color mapping/conversion while transcoding, so again the colors get washed out.

Plex uses the open source FFMPEG program, to which I guess plex adds their own ‘magic sauce’ to customize it for real time conversion/streaming, so it is going to be limited by whatever capabilities ffmpeg provides natively, along with any tweaks the plex team makes.

so plex doesn’t change the video stream when direct playing, and when it converts/transcodes, the codec is changed (265 to 264), and hdr is lost.
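
if you’re willing to pre-convert files offline instead of leaning on plex’s real-time transcoder, plain ffmpeg can do the tone mapping itself. a rough sketch, not anything plex does internally, assuming an ffmpeg build with the zscale (zimg) filter; filenames are placeholders:

```bash
# decode the 10-bit HDR (bt2020/PQ) video, tone map to SDR bt709, re-encode as H.264
ffmpeg -i hdr_in.mkv \
  -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
  -c:v libx264 -crf 18 -c:a copy sdr_out.mkv
```

it is slow, but the colors come out right on an SDR display.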

as far as DV, plex does not/has not licensed the necessary technology to decode/convert it.

DV played directly, will work with compatible displays.

the problem with dv played directly goes back to what I said above: there are multiple types of DV, and different versions/levels, and not all DV devices can play back all types of DV.

DV is a mess, and there is no easy solution.

Hopefully, HDR10+ will become the new standard and make DV irrelevant.

Even then, it will no doubt take even more years for HDR10+ to mature (content, displays, and software wise) and become as ‘simple’ as HDR is (not that either x265 or HDR is simple anyway).

If you have not already seen it, there is a whole thread on the Plex forum, ‘Plex, 4k, transcoding, and you - aka the rules of 4k - a FAQ’, that might be of interest.

as far as DV rips, the only reason they exist at all is because there is an obscure dolby vision software decoder that can take 4k bluray dv streams and decode/remux them into a single video stream (similar to streaming DV); unfortunately it only supports mp4 and lossy audio, and only certain tvs support that.

at least, that was the state of things last time I saw any news about it.


Gotcha, thanks for the info :) I will delve more deeply into it when I get back from class

Yep that’s still the case.
Personally I have experimented a lot with that software, and tbh, although with some scenes in some movies the difference between DV and HDR10 is very noticeable, for me at least it’s not enough to suddenly make me want to rerip everything into single layer. Especially not for “some” scenes in “some” movies.


cool. I don’t have a DV tv, but even if I did, I would not go through the hassle either.

there is another thread on these forums, and there is the google which will provide lots more info for those who still have questions or want to research more.

https://www.google.com/search?&q=dolby+vision+4k+rip

unfortunately people get blinded by the irrelevant and angry when they can’t get what they want, and can’t get it yesterday.


For me the bottom line is, it’s one thing having equipment to do a side by side comparison.
I actually did a test with the opening scenes on Bumblebee yesterday.
Yes you can spot noticeable differences in the opening battle scene watching side by side. But nothing that made me think the HDR10 version was suddenly bad.
Oh and I know what you mean about people getting angry…well a combination of angry and needy.
It happens on these forums more and more it seems sadly.

I have a DV capable display and to be honest I think DV is overrated and more trouble than it’s worth. Hoping for HDR10+ to make DV irrelevant.

I’m curious to see how it pans out too.
I spend most of my time on the Plex Reddit channel nowadays and HDR10+ gets even less love than DV over there. (I have no personal experience of HDR10+)
I’m curious about the comments I read comparing DV and HDR10+, and the fact that HDR10+ uses algorithms to generate the metadata while DV is graded by the human eye.

It’s not like we humans are perfect. So…
Right now I’m happy enough with regular HDR10 on my ATV 4K or Shield.


Thanks guys. I’ll settle for regular HDR for now. As Teknojunkie pointed out, the use cases for DV are currently limited in terms of Plex and ripping your media. The LG Plex app works great for 4K video with regular HDR10. Just the audio sucks: constant buffering because of transcoding and/or ARC bandwidth limitations.

I wonder if anyone has an opinion about converting Dolby Atmos to PCM and sending that to the app (or another device). Because I really don’t need/want Atmos or True HD per se, just high fidelity, uncompressed audio for my 3.1 setup. I find rear/height surround distracting - if I hear a loud sound behind or above me, I want to look there instead of the screen. Plus setting up a surround system in my living room is a hassle and a step farther than I want to go in terms of my living space.

I’ll have to dig into Plex’s audio settings - I confess I’m raising the PCM question off the cuff without really looking into it deeply. Converting Atmos to PCM losslessly might still exceed the bandwidth that ARC is capable of anyway since stripping the metadata off might not be that much of a data reduction.
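
fwiw, the raw numbers back up that worry. plain ARC is generally quoted at roughly S/PDIF-class bandwidth (stereo LPCM or compressed 5.1), and even a small LPCM layout needs more than that. rough arithmetic with assumed figures (48 kHz / 24-bit, ignoring HDMI overhead):

```bash
# bits per second for a 3.1 (4-channel) LPCM stream
echo $(( 4 * 48000 * 24 ))   # 4608000, i.e. ~4.6 Mbit/s
# vs stereo, which is about all plain ARC carries uncompressed
echo $(( 2 * 48000 * 24 ))   # 2304000, i.e. ~2.3 Mbit/s
```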

May just have to get a Shield. Although will probably hold off for now given the rumors of an impending new iteration of the Shield. People on the Nvidia forum don’t seem to think Nvidia will ever pony up for Dolby Vision licensing, so if true the new device wouldn’t have DV.

You can avoid audio transcoding by simply choosing a DD/DTS 5.1 or stereo audio stream.

All 4k blurays include multiple audio streams.

If you are not using atmos/full surround, IMO you are not going to miss much by losing the HD audio.

Using DD fixes your transcoding problem, and avoids the need to buy a shield plus an atmos receiver and speakers.

Also, a shield alone won’t necessarily completely solve your problems without the associated atmos receiver.

Thanks, I have an Atmos receiver. The bottleneck right now is the ARC (the TV doesn’t have eARC). DD and DTS 5.1 are lossy, so that isn’t my preferred solution. You’re right that it’s not much different, but I have decent speakers and it is in fact noticeable.

As with all things Plex, that’s the key thing. No one can tell you if Atmos is noticeable.
No one can tell you if DV offers anything over regular HDR.
It’s your eyes and your ears after all, not theirs.
