I’ve been using Plex for at least half a decade now and I love it. In the last year or so, I’ve started sharing my library with friends and family and paying much closer attention to my media library to minimize transcoding, so that simultaneous users get the best possible experience. What I’ve come to dread about sharing my library with a new user, though, is walking them through finding the Remote Quality setting in every single one of their Plex clients so that it can be raised from the antiquated default of “4 Mbps 720p”. This is also how I learned that the Plex client on Samsung Smart TVs uses a unitless 1–12 scale for video quality; I had to dig up the manual to instruct the user to set Remote Quality to 9 or higher.
I have a lot of 1080p content, and all of my users have 1080p (or higher) resolution clients. However, if any of those clients don’t set Remote Quality to at least “8 Mbps 1080p”, they force a transcode down to 720p, only for the video to be immediately upscaled back to 1080p on their screen. Even a 1080p file encoded below 4 Mbps gets transcoded, because the resolution cap applies independently of the bandwidth cap. I understand the desire to have a default bandwidth limit for remote streams, but even if that limit stays at 4 Mbps, why should the default maximum resolution still be 720p?
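To make that concrete, here’s a rough sketch (purely illustrative Python, not Plex’s actual code) of the decision as I understand it: the quality setting caps both bitrate and resolution, and exceeding either cap forces a transcode.

```python
# Illustrative sketch only: not Plex's actual logic.
# Assumption: a client's Remote Quality setting caps both bitrate and
# resolution, and a source exceeding either cap gets transcoded.

def needs_transcode(src_mbps: float, src_height: int,
                    cap_mbps: float, cap_height: int) -> bool:
    """True if the source exceeds the bitrate cap or the resolution cap."""
    return src_mbps > cap_mbps or src_height > cap_height

# With the default "4 Mbps 720p": a 3.5 Mbps 1080p file fits the bandwidth
# cap but is still transcoded to 720p because of the resolution cap.
print(needs_transcode(3.5, 1080, cap_mbps=4.0, cap_height=720))  # True
```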
Maybe I’m unusual in being (slightly) more constrained by transcodes than by bandwidth, but more often than not this results in a death spiral: less tech-savvy users see buffering caused by multiple simultaneous transcodes and instinctively lower their requested video quality, which only guarantees their stream stays a transcode. In reality, none of my users would buffer if their clients had requested 1080p, but they have no way of knowing that.
Essentially, my question is this: what reason is there for the default Remote Quality setting to still cap resolution at 720p? Why should Plex’s default behavior be to lower video quality for reasons other than bandwidth or encoding?