Which is better: DTS transcoded to AAC, or AC-3?

I don’t believe there is a way to force a specific codec; it transcodes based on the standard audio codec for your playback device. It’s funny that it transcodes to AAC, though, since gaming consoles generally use AC3.

What is the source audio codec that is being transcoded to AAC?

Glad I read this. I was pretty puzzled earlier why my ATV 4K was reporting direct play of DCA-MA in Tautulli.
Oddly the PMS dashboard showed the same thing.


I’ve tried with EAC3 and DTS. It will transcode anything that isn’t vanilla AC3 to AAC, which is frustrating.

I was able to replicate the same issue on my PS4: I tried multiple audio codecs, and it always transcodes to AAC instead of AC3. In some instances it showed direct stream of AC3 5.1, but seconds later it transcoded to AAC 5.1.
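The pattern these posts describe can be sketched as a simple decision rule. This is a hypothetical illustration of the reported behavior (only vanilla AC3 direct-plays; every other surround codec falls back to AAC), not Plex’s actual profile logic — the names and codec sets here are assumptions:

```python
# Hypothetical sketch of the transcode decision described in the posts above.
# Assumption: the client profile lists only vanilla AC3 for surround passthrough,
# and the fallback transcode target is AAC.

SUPPORTED_PASSTHROUGH = {"ac3"}   # the only surround codec that direct-plays
TRANSCODE_TARGET = "aac"          # the fallback observed in the posts above

def pick_audio_decision(source_codec: str) -> str:
    """Return 'direct play' or the codec the server would transcode to."""
    if source_codec.lower() in SUPPORTED_PASSTHROUGH:
        return "direct play"
    return f"transcode to {TRANSCODE_TARGET}"

for codec in ("ac3", "eac3", "dts", "truehd"):
    print(codec, "->", pick_audio_decision(codec))
```

Under that assumption, EAC3, DTS, and TrueHD all land in the same AAC bucket, which matches what the PS4 and Apple TV posters are seeing.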

It does look like Sonos is capable of AAC, though only at 320 kbps via local files (essentially stereo). I don’t believe Sonos will do 5.1 without the appropriate equipment/setup; it generally does stereo, so anything above that it won’t play, i.e. you won’t hear anything.

I’m assuming you’ve done extensive digging on how to set up a worthy 5.1 system via Sonos; it looks like you’ll need the Sonos Amp from what I can tell.

Remember there are two parts to this - the server and the client.

The Dashboard and Tautulli report what is happening on the server, which is direct playing the audio.

The client can modify the audio stream, and that will never be reported to the server. That is why PMS can show direct play while only the core is being passed to the audio system.

Depending on the client and the codec, the audio could be transcoded to another format on the client and PMS will still show direct play.

Example:
I have a FireTV Cube. I can configure it to pass only EAC3 audio (a FireTV setting, not a Plex one). I play a movie with DTS/DTS-HD audio, which is not supported on FireTV devices. The server shows direct play. My Denon reports EAC3.
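The two-leg reporting split described above can be modeled directly: the server’s report only covers its own leg, so a client-side re-encode (like the FireTV’s EAC3 pass) never shows up in PMS or Tautulli. Function names and codec sets here are purely illustrative, not real Plex APIs:

```python
# Illustrative model of the two reporting legs. The server only sees and
# reports its own leg; anything the client does afterward is invisible to it.

def server_leg(source_codec, client_advertises):
    """What the server does, and what PMS/Tautulli will report."""
    if source_codec in client_advertises:
        return source_codec, "direct play"
    return "aac", "transcode"  # hypothetical server-side fallback

def client_leg(received_codec, hardware_outputs):
    """What the client actually hands to the receiver/TV."""
    if received_codec in hardware_outputs:
        return received_codec
    return "eac3"  # e.g. a FireTV configured to pass only EAC3

# The client advertises DTS support, so the server direct-plays...
codec, report = server_leg("dts", client_advertises={"dts"})
# ...but the device only outputs EAC3, so the receiver hears EAC3,
# and nothing about that conversion ever reaches the server's report.
print("server reports:", report)
print("receiver hears:", client_leg(codec, hardware_outputs={"eac3"}))
```

This is why the dashboard saying “direct play” and the Denon saying “EAC3” are both telling the truth: they are describing different legs of the pipeline.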


Fire TV Stick has the same behavior. I find it fascinating that the actual device does the audio conversion before passing the data to the receiver/TV. AFAIK, that’s the only device family that does that. I’ve seen a few hiccups with that method too.


@FordGuy61 Nailed it, dead right!


Not looking forward to when I get to 70+ and this statement will likely be true due to a deteriorating physical body.

By that time, H369 Level 90 may make a difference over Level 87, and you’ll remember those words, thinking how cruel they were…

as you roll around in Geriatric Acres Nursing Home with a blanket on your legs…

Gettin’ Old Ain’t For Sissies, but it is inevitable.

Not trying to be cruel. Just stating a fact from your perspective as I hear it from older engineers in this exact industry all the time.

“You just wait, 4K, 8K or 16K, none of it matters once your eyes diminish towards your late 40s and into your 50s.”

I don’t disagree, but this doesn’t mean advancement in evolving the protocols and eventual products in the industry needs to stop. It’s not for you now, just as some future form of technology will not be for me one day. I was a hardcore gamer in my 20s playing twitch shooters like Doom, Quake, and Unreal Tournament. Nowadays I don’t like VR, nor do I find much value in it, but that doesn’t mean others and upcoming generations shouldn’t have the freedom and enthusiasm to push things to evolve.

Personal insult aside (I’m old, remember? I have more important things to worry about) - here’s my parting shot and I’ll leave you guys to your delusions:

Say you have a 1080p MakeMKV Blu-ray rip. It’s H.264 at Level 4.1 because they KNOW a higher profile and level don’t do anything down here where we are. If you, in turn, think re-encoding that at Level 5.1 is going to make one bit of difference, you’re mistaken.

The stuff that could actually use Level 5.1 isn’t in that rip, and it won’t be in your encode just because you think it’s better. If you have an original 1080p rip at Level 5.1, prove it first, and then I’ll admit that ‘maintaining’ that 5.1 level is a good idea.
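The Level 4.1 point can be checked against the published H.264 limits (Rec. ITU-T H.264, Table A-1): a level only matters if the stream’s frame size or macroblock rate exceeds the lower level’s caps, and 1080p Blu-ray material sits comfortably inside Level 4.1. A quick sketch of the arithmetic:

```python
# Check the Level 4.1 claim against H.264 Table A-1 limits.
# A stream needs a higher level only when it exceeds the lower level's caps.

LEVELS = {
    "4.1": {"max_mbps": 245_760, "max_fs": 8_192},    # macroblocks/sec, macroblocks/frame
    "5.1": {"max_mbps": 983_040, "max_fs": 36_864},
}

def macroblocks(width, height):
    # Macroblocks are 16x16; dimensions round up to a multiple of 16.
    return ((width + 15) // 16) * ((height + 15) // 16)

def fits(level, width, height, fps):
    fs = macroblocks(width, height)
    lim = LEVELS[level]
    return fs <= lim["max_fs"] and fs * fps <= lim["max_mbps"]

# 1080p24 Blu-ray video fits comfortably inside Level 4.1:
print(fits("4.1", 1920, 1080, 24))   # True
# 4K at 30 fps is the kind of stream that actually needs Level 5.1:
print(fits("4.1", 3840, 2160, 30), fits("5.1", 3840, 2160, 30))  # False True
```

1080p is 8,160 macroblocks per frame, under the 8,192 cap, and even 1080p30 stays under the 245,760 MB/s rate limit, so tagging a 1080p encode as Level 5.1 buys nothing; it only reduces compatibility with decoders that stop at 4.1.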

Otherwise ya’llz wasting your time.

This is correct. But I was more or less addressing your overall perspective across many threads, which seems to convey a pessimistic view of pushing the limits of higher bitrates, greater dynamic range, and higher resolutions. Not that you are wrong in this particular situation; I agree with you that you can’t just create bits by changing the codec profile.


To be clear, my overall perspective on bit rates and the benefits I get, or don’t get out of them only works for my eyeballs. Resolution is wasted on the youth, but if you need more bit rate to make you happy, by all means pour it on.

Here’s something to look forward to though… if you live this long, you’ll be able to save a LOT of storage space. I sure do.

You mean it’s wasted on the elderly, since the energy required to produce it can only be partially appreciated by mature optics, nerves, and visual cortex. I think it’s a great idea that you don’t waste energy on a system that is overkill for the elder human receiver.


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.