Not sure why I haven't noticed this being such an issue until today, and seeing that this thread started in 2019 and here it is 2023 is RIDICULOUS.
You know what causes this? Lack of competition.
It's very common for me to see 3-4 users streaming at a time on my server at 20% CPU load, but I was very surprised to see 90% spikes when I checked the dashboard, with all 3 users on their living room TVs, one at 720p and the other two at 480p SD… like WTF?
I am trying all kinds of things and nothing is helping. I'm also having a codec issue that is causing 2 of my 4K files to play pink and green; I have only just started digging into that… for another thread if it comes to that, not trying to derail this one. Clearly Plex needs all the focus they can get on this issue.
EDIT: I may have been a bit too harsh… kinda upset. Anyway, as I was heading down to check my server I had an epiphany. I have used a few different routers over the last few years because firmware security updates kept causing me noticeable issues. I went through 3 different Netgear routers before moving to my new Asus GT-AX11000, and I STILL have to reboot this $350 beast every 4 days, it seems. What I am getting at is: could it be possible that much of this is due to a compromised router state? Maybe Plex is downscaling due to what it perceives as upload issues? Anyone who torrents overnight is going to need to reboot, whether they notice right away or not. Just a thought; heck, maybe it was already addressed, not reading through 3+ years lol.
I also forgot I was backing up my server… I assumed it was done, so that may be contributing to the issue as well.
Gee, maybe if the Plex leadership weren't so focused on becoming the next Netflix and had stuck with a smaller team made up of people like Dave, who were actually working on the things that the core user base that made Plex what it is actually wants, they wouldn't be in such a tough spot. It seems likely this highly requested and outright necessary change/feature will be put on the back burner in favor of more profitable and upsetting changes.
We have started rolling out a change to clients (see the recent Smart TV and HTPC release notes) to increase the default internet/remote streaming quality from 4Mbps 720p to 12Mbps 1080p. (Roku and Apple should see this in their next release or so.) While this is not “Max Quality” as many would like, it is a step in the right direction and should alleviate some transcoding.
I want to see it change to “Max Quality/Original” just as much as others do, but we need to ensure that raising this doesn’t end up having a negative effect with increased buffering. This is why we are doing this in stages, along with the continued work behind the scenes on bringing the automatic quality work we shipped on Android to more clients.
Do these changes you’re speaking about show up in the change log of the PMS or client apps when you post them? I don’t think the previous attempt showed up in any change log I saw. It would just be nice to know exactly which versions of which clients these changes apply to so we can better troubleshoot with our users.
So after 4 years of proclaiming that the only way to solve this problem is an elegant but convoluted and drawn-out automated solution, you just toggle the parameter everyone has been asking for?
Thank You!!
Long overdue, though, and the half measure still means 4K isn’t really an option with Plex, but yeah, it's definitely a step in the right direction and very welcome. It should eliminate most of the transcodes of non-4K content on my server, at least.
I am simultaneously happy and a little disappointed.
Can’t you just give us a server-side option to do this when playing from my server?
I understand you may be worried about your non-technical audience, but for those of us who know what we’re doing, having to tell everyone to manually set “original quality” on EVERY device anyone could stream from is tedious and results in more unnecessary transcodes than this proposed alternative would.
If a transcode is forced because of client-side limitations, that’s fine. It should “auto” to original quality and drop back to a transcode if client-side bandwidth or hardware doesn’t support it. The client app should ideally also have logic to switch between original quality and a “default” (transcode) setting when it detects the user is not on Wi-Fi (e.g., streaming from an iPhone on 4G/LTE). A rough sketch of what I mean follows below.
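To make the logic concrete, here’s a minimal sketch in TypeScript. Everything in it is hypothetical on my part — the names, the interfaces, and the 12 Mbps cellular ceiling are placeholders of mine, not anything from Plex’s actual code:

```typescript
// Hypothetical sketch of the client-side quality selection described above.
// All names and thresholds are placeholders, not real Plex APIs.

type NetworkType = "wifi" | "ethernet" | "cellular";

interface ClientCapabilities {
  maxBitrateKbps: number;    // what the device's decoder/player can handle
  supportedCodecs: string[]; // e.g. ["h264", "hevc"]
}

interface MediaInfo {
  bitrateKbps: number;
  codec: string;
}

interface QualityDecision {
  mode: "original" | "transcode";
  targetBitrateKbps: number;
}

// Conservative ceiling when off Wi-Fi (assumption: 12 Mbps, matching the
// new remote-streaming default mentioned earlier in the thread).
const CELLULAR_DEFAULT_KBPS = 12_000;

function chooseQuality(
  media: MediaInfo,
  client: ClientCapabilities,
  network: NetworkType,
  measuredBandwidthKbps: number
): QualityDecision {
  // Off Wi-Fi: fall back to the conservative default, not original quality.
  if (network === "cellular") {
    return {
      mode: "transcode",
      targetBitrateKbps: Math.min(CELLULAR_DEFAULT_KBPS, measuredBandwidthKbps),
    };
  }

  const codecOk = client.supportedCodecs.includes(media.codec);
  const bitrateOk =
    media.bitrateKbps <= client.maxBitrateKbps &&
    media.bitrateKbps <= measuredBandwidthKbps;

  // Default to original quality whenever the client and the pipe allow it...
  if (codecOk && bitrateOk) {
    return { mode: "original", targetBitrateKbps: media.bitrateKbps };
  }

  // ...and only transcode when something genuinely forces it.
  return {
    mode: "transcode",
    targetBitrateKbps: Math.min(client.maxBitrateKbps, measuredBandwidthKbps),
  };
}
```

The exact thresholds don’t matter; the point is that original quality should be the default whenever nothing actually prevents it, and a transcode should only happen when something genuinely forces it.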
I have an RTX 2080 Ti dedicated to my Plex container for transcodes and 2Gbps of upstream bandwidth. Having multiple unnecessary transcodes because of client-side settings, when the client's device and downstream bandwidth allow for (for example) a full 4K direct-play stream, is my main issue here. It makes their quality worse and increases the server-side load (and thus my energy bill, etc.).
There is a post on Reddit which echoes my comments, so I don’t think I’m alone in thinking this way.
This is what we are working on bringing to each client, but the method varies from platform to platform due to the system player and other factors, which takes time. We already have this on Android today, with Smart TVs and Roku coming soon.
@kevindd992002 Are you referring to Android TV where quality suggestions are the default?