Server Version#: 1.18.0.1913 (Also tested on 1.17.0.1709) [Ubuntu x64]
Player Version#: 7.21.0.12323 [Fire TV Gen 2 (AFTS)]
So I have a very low-power server that can’t handle video transcodes, which is fine since all my content is in a format that can be Direct Played (or Direct Streamed) to all my devices. However, recently I’ve noticed that Plex is attempting to transcode the video of certain files that have EAC3 audio. The expected behavior is that Plex would transcode only the audio and direct stream the video.
Going through the logs, it seems as though Plex is estimating that the file requires a bandwidth of 2,147,483,647 kbps even though the bitrate of the file is only 5,667 kbps. Though I could be totally off base. Here is a snippet of the server log, and here is the mediainfo xml of the file in question.
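One thing that stands out: 2,147,483,647 is exactly the maximum value of a signed 32-bit integer, so the “estimate” looks like an unknown-bitrate sentinel rather than a real calculation. A quick sanity check (the two numbers are from my log and MediaInfo; the interpretation is just my guess):

```python
# 2,147,483,647 kbps from the log is exactly INT32_MAX, hinting that Plex
# falls back to the largest representable value when the bitrate is unknown.
INT32_MAX = 2**31 - 1

logged_estimate_kbps = 2_147_483_647  # from the server log snippet
real_bitrate_kbps = 5_667             # from MediaInfo

assert logged_estimate_kbps == INT32_MAX
print(logged_estimate_kbps // real_bitrate_kbps)  # estimate is ~379,000x the real bitrate
```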
The server is a remote server with no set bandwidth limitations. The client app is also configured for remote streams to play at maximum quality. Audio passthrough is set to HDMI.
Also, I feel it’s important to note that I have the exact same device hooked up to a second TV. That one is able to Direct Play the same file fine with Audio Passthrough set to HDMI. However, when Passthrough is disabled, the same behavior is observed: both the video and audio are transcoded. Any help is greatly appreciated.
If Deep Analysis were run on the file, it would/should fix the problem:
Streaming Resource: Required bandwidth unknown (media requires deep analysis)
Changing decision parameters provided by client to fit bandwidth limit of 200000kbps
Thanks! Yeah, I noticed that too. However, I keep deep video analysis disabled because it causes issues when analyzing my mounted cloud storage, and I’d prefer to keep it off. I also really have no need for my files to be analyzed, as I have gigabit upload that never comes close to being saturated.
Even with deep analysis off, I was under the impression that Plex would make a reasonable guess at the bandwidth required. And since I don’t have any bandwidth restrictions imposed at the server or client level, I feel there is no reason for it to transcode, even if the estimate it comes up with is orders of magnitude off the mark.
Yasss, a fellow DIY Plex Clouder! My library is only about 2k files, and almost none of them have had a deep analysis. Yeah, my MDE decoding skills are noob tier, but to me it kinda looks like the “Maximum” quality setting on the Plex Fire TV app actually has a hidden value of 200,000 kbps.
That should still be more than enough for a 5,667 kbps file, but I think Plex is incorrectly estimating the required bandwidth as some astronomical amount. The weird thing is that I’ve been running this setup for at least three years and this is the first time I’ve come across this issue.
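If that hidden cap is real, the transcode would follow mechanically from the bogus estimate. Here’s my guess at the decision as a minimal sketch (the 200,000 kbps default and the comparison are assumptions read out of the log, not Plex’s actual code):

```python
def would_direct_stream_video(estimated_kbps: int, client_cap_kbps: int = 200_000) -> bool:
    """Guess at the MDE decision: the video is only direct streamed when the
    server's bandwidth estimate fits under the client's quality cap."""
    return estimated_kbps <= client_cap_kbps

print(would_direct_stream_video(5_667))      # True  -> would direct stream
print(would_direct_stream_video(2**31 - 1))  # False -> forces a video transcode
```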
Also, the Fire TV is the only client I have that seems to be affected. Even Chrome will transcode only the audio and direct stream the video properly for the exact same file. Here is a log snippet of the same file being played in Chrome.
PS: good to see a fellow Plex+GDrive enthusiast too. Can’t ever see me going back to buying hard drives.
Once you get into petabytes, local storage looks less and less attractive.
Your file is missing the bitrate for the audio stream, so Plex appears to be filling in the maximum value; something might be wrong there. It also looks like your file is muxed incorrectly. If you remux the file, I’ll bet that fixes the problem.
Oh, good catch MovieFan! Per your suggestion I tried remuxing the file via both MKVToolNix and FFmpeg and cannot get the audio bitrate to be detected in Plex. However, if I open the original file (or any subsequent remux) in MediaInfo, the audio bitrate is listed and reported properly as “Bit rate: 640 kb/s”. I tried remuxing to both MKV and MP4 containers with FFmpeg and the issue persisted.
This made me curious, and it seems the vast majority of my recent media does not have the audio bitrate listed in Plex even though MediaInfo is able to detect it properly. The IINA media player is also able to correctly determine the audio bitrate for these files.
Another thing I noticed is that most of the files that do not show the audio bitrate field properly in Plex were created with very new versions of MKVMerge (v31 or later). If I add files made with older versions of MKVMerge (v29 or earlier), the audio bitrate is detected by Plex properly. I may be on the wrong track here, but it seems as if newer versions of the muxing tools may have bugged out that field somehow. This affects files with EAC3 as well as those with AAC audio.
Please let me know if there’s anything else I can do to help troubleshoot. And thank you again for taking the time to engage with me on this issue!
I’m wondering if it is actually detecting that or just assuming the maximum bitrate. Can you create a sample file that’s only about 1 minute long, see if that also has the problem, and if so provide me that sample?
I was mistaken before: removing the subtitle tracks and remuxing the MKV to MP4 via FFmpeg does indeed get the audio bitrate to show up in Plex. However, MKV-to-MKV remuxes via either MKVToolNix or FFmpeg do not get the bitrate to show up. It may be an issue specific to the MKV container.
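For what it’s worth, the bitrate the MKV container “loses” is trivially recoverable from stream size and duration, which is presumably how MediaInfo and IINA still report it correctly. A quick check against the 640 kb/s figure (the ~4.8 MB payload is back-computed from that rate, not measured):

```python
def estimated_kbps(stream_bytes: int, duration_s: float) -> int:
    # generic fallback available in any container: payload size over duration
    return round(stream_bytes * 8 / duration_s / 1000)

# a one-minute EAC3 track at 640 kb/s carries ~4.8 MB of audio payload
print(estimated_kbps(4_800_000, 60))  # 640
```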
I had a buddy of mine check through his library as well. Almost all of his recently added content is also missing the audio bitrate information
Edit: Working on uploading the sample. I think maybe it was too big to attach via reply.
Remuxing from MP4 back to MKV via FFmpeg causes the audio bitrate info to disappear again. Here is a WeTransfer link to a one-minute snippet of the original file.
Yeah, I have a few files that I’ve added recently that also show the bitrate info fine. It seems to me that what matters more is the version of the tools used to make the file: any MKV made with MKVMerge/MKVToolNix v31 or later seems to have this issue for me (newer MKVs generally correlate with newer versions of that software being used).
Agreed.
I run a local server with around 110TB of local media, and the data is all mirrored to GDrive, which is mounted on a remote server.
I used the local server for myself and the remote one for my shared users. Around a week ago I got a BSOD on Windows and couldn’t be bothered to troubleshoot at the time.
Maybe I never will. I too switched to the remote server and realized what a waste of time all that local storage is.
I’m kind of torn between just shutting down the local server fully and selling off all the 8TB drives, or keeping it in case Google ever pulls the plug on their unlimited storage.
I am curious about this, though. I have no issues and even generate video preview thumbnails too.