Default All Clients to Max Internet Streaming

Will Plex have (internal) metrics to see if users are actually taking advantage of updating the quality, or just ignoring the prompt?

Does that mean this feature will only ever be available for TV clients? If so, that doesn't address the original issue, because the problem happens on web and tablets as well. I don't have many users on mobile phones, so I don't know what the status of the issue is there. I do use Plex on my phone, though very rarely, and I keep the streaming quality on maximum; I'm happy to let it buffer a bit if it means I get to watch in full quality.

So the client could also prompt to drop the quality?

The main thing I’m worried about with this solution is that once it gets into the real world there could be a bunch of conditions that don’t work well with it that weren’t exposed in testing. Since it is an intrusive UX, it could lead to problems for both server admins (getting “support requests” for the thing that keeps popping up) and for Plex (why does Plex keep interrupting my media with this popup).

Another thing that is stuck in my head is that this is seemingly a "scratch your ear over your head" kind of solution for most of the folks in this thread. I haven't run any statistics, but an eyeball test shows that most comments imply that all of their clients are able to play back at maximum. I know that is the case for all of my clients. Now there's a new system adding additional complexity to the situation, when for us the simplest fix would be an option to default clients to maximum.

I’m hoping all of my fears are unfounded and that this ends up working perfectly.

I would think that based off of this thread, Plex employees would refrain from making definitive statements like that :wink:

Hi folks, for those of you wishing for server-controlled bandwidth, we already have that, and it's been there a long time: the max bandwidth per stream and the max total bandwidth available.

So we still won't be able to directly control the client bandwidth or set the MINIMUM bandwidth, but since bandwidth is dynamic, differs among individual clients, and is shared between all connections at the server, this probably makes the most sense.

The main point of the original request is to help avoid transcoding, which it sounds like this will do, assuming sufficient available bandwidth at both the client and the server.
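For anyone who hasn't used those two server settings, here's a rough sketch (Python, purely illustrative, not Plex's actual logic) of how a per-stream cap and a total bandwidth cap interact when several remote streams share the server's connection; the parameter names are my own placeholders, not the real setting names.

```python
# Illustrative sketch only: how a per-stream cap and a total upload cap
# might be shared across concurrent remote streams. Not Plex's actual logic.

def allowed_bitrate_kbps(active_stream_bitrates, per_stream_limit_kbps, total_limit_kbps):
    """Return the highest bitrate a *new* remote stream could be offered."""
    used = sum(active_stream_bitrates)           # bandwidth already committed
    remaining = max(total_limit_kbps - used, 0)  # what's left of the total budget
    return min(per_stream_limit_kbps, remaining)

# Example: 20 Mbps total upload budget, 8 Mbps per-stream cap,
# with two streams already playing at 8 Mbps and 6 Mbps.
print(allowed_bitrate_kbps([8000, 6000], per_stream_limit_kbps=8000, total_limit_kbps=20000))
# -> 6000: the new stream is held to the leftover budget, not the per-stream cap.
```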

Hopefully, now that some progress is being pushed out, we can relax a bit and let them finish the work so it can roll out wider to all clients.

Then we'll have something new to complain about, or praise, depending on how it works for your users.

5 Likes

This thread was never about server-controlled bandwidth; it is about server-controlled defaults.

Having made the original request, I’m hopeful that this will fix all the issues that I’ve been experiencing for years now. However I think that the concerns I raised a few hours ago are valid.

As I said:

but without being able to test the feature in my real world scenario, and without answers to my questions above, I’m still skeptical. Edit: skeptical is the wrong word; maybe apprehensive is better.

2 Likes

I didn't say you or the thread were about server-controlled bandwidth, but many on this thread have pushed for such a solution, including myself at one point (Default All Clients to Max Internet Streaming - #22 by TeknoJunky).

I’m hoping that the wait will pay off and this solution will ‘just work’, as I’m sure most everyone else is.

All we can do is continue to be patient as it rolls out wider and provide useful feedback as it does.

1 Like

Actually, in theory you’d want it to have some logic to be able to say “The original file is 4Mbps, and the speed test says you can handle 100Mbps, so Plex will Direct Play this file.”

For example, if the file being played is a 4Mbps 1080p original-quality file, I don't want it transcoding to 20Mbps 1080p just because the speed test indicates the connection can handle it… I want it to automatically know that it can Direct Play the file.

1 Like

If, like me, you're sick and tired of Plex's broken promises, here's another alternative that I've been using.

Notifiarr, using the Plex plugin. Yes, you might have to go through the instructions, but it's pretty straightforward. Once you're done, your Plex plugin should look like this:

It might not answer all your problems, but for me it resolved most of them. The dev is also pretty active on Discord and replies to most queries pretty swiftly (unlike Plex)!

Have a good day y’all

1 Like

I use Notifiarr for trash sync, and I agree that Nitsua is a very responsive and great dev. I've never used the Plex plugin, though. What does it do specifically?

You create rules based on the conditions I've shown in the screenshots, and it enforces them on a per-user or global basis.

I’ve made a request to also add exclusions and device type rules :slight_smile:

@DaveBinM has this been rolled out to any other clients? Roku in particular…

We’re already tracking metrics to see how this improves quality for users.

The current iteration is primarily focused on TV clients, because that’s where the most benefit is for now, and where the majority of watching occurs. We hope to be able to also improve things on mobile clients in the future, though that is more complex, taking into account WiFi or Cellular connections, and data caps too. Local streaming on all devices already defaults to Maximum where possible (some hardware does recommend a maximum bitrate, so in that case, we follow the manufacturer’s guidelines), so this is only for remote playback.

Yes, users will see recommendations to decrease or increase quality depending on the connection. We first try to play at the original bitrate, and if the user cannot play at that, we will suggest a downgrade to a quality that will suit the connection. If the connection improves, then we will offer an upgrade to a suitable quality. We’ve done some fairly extensive testing on this, across a variety of scenarios, from people streaming in the same city, to same country, or overseas, with connections jumping up and down (both naturally, and forced from our side).

The prompts are fairly minimal, and users can choose to ignore them if they want: they can play at a reduced quality they're happy with, upgrade to a better quality, or, if they really want to for some reason, subject themselves to repeated buffering when their connection is not fast enough, whichever they prefer. From our own internal anecdotal evidence as well as metrics, simply changing the default to Maximum would not have been an overall improvement. While it may have improved things for some users, for others it would have created a different issue, where they were now buffering all the time rather than playing media at a bitrate they could handle.

As such, we wanted to approach this in a way that would improve the experience for everyone, and give everyone the best quality possible, while still allowing both the end-user and server owner control over their own bandwidth use. The user can set their own maximum quality if they desire, though it defaults to Maximum, and server owners can set a streaming bitrate limit for remote sessions.
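To make that flow concrete, here is a minimal sketch in Python (illustrative only, not the actual implementation; the function names, the two-buffering-event threshold, and the 20% headroom for upgrades are all assumptions): the effective ceiling is the lowest of the user's quality setting and the server's remote bitrate limit, playback starts at the original bitrate when it fits, and prompts are only suggestions the user can act on or ignore.

```python
# Minimal sketch of the described flow; names and thresholds are assumptions.

def start_quality(original_kbps, user_max_kbps, server_remote_limit_kbps):
    """Pick the bitrate a remote session starts at."""
    ceiling = min(user_max_kbps, server_remote_limit_kbps)
    return min(original_kbps, ceiling)   # prefer the original bitrate when allowed

def suggest_prompt(current_kbps, original_kbps, measured_kbps, buffering_events):
    """Return 'downgrade', 'upgrade' or None; the user may ignore either."""
    if buffering_events >= 2 and measured_kbps < current_kbps:
        return "downgrade"               # repeated buffering: offer a lower quality
    if current_kbps < original_kbps and measured_kbps > original_kbps * 1.2:
        return "upgrade"                 # connection recovered: offer original quality
    return None                          # leave the session alone

# Example: a 12 Mbps file, user left at Maximum (modelled as a huge cap),
# server remote limit of 20 Mbps.
q = start_quality(12000, user_max_kbps=999999, server_remote_limit_kbps=20000)
print(q)                                                                      # 12000
print(suggest_prompt(q, 12000, measured_kbps=6000, buffering_events=3))       # downgrade
print(suggest_prompt(6000, 12000, measured_kbps=18000, buffering_events=0))   # upgrade
```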

As stated before, we have no intention of changing how we handle logging.

6 Likes

That is already how it works. Should a user play a file with a lower bitrate than our detected connection speed, it’ll just play. It’ll only transcode it if the bitrate of the file is above the detected connection speed.
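In code terms, the stated rule boils down to something like this (a hedged sketch, not the player's actual logic; the 20% transcode headroom is an assumption):

```python
# Sketch of the stated rule: direct play when the file's bitrate fits the
# detected connection speed, otherwise transcode down to fit. Illustrative only.

def playback_decision(file_bitrate_kbps, detected_speed_kbps):
    if file_bitrate_kbps <= detected_speed_kbps:
        return ("direct play", file_bitrate_kbps)   # never transcode *up* to the speed test result
    # Otherwise transcode to a bitrate the connection can sustain (headroom is an assumption).
    target = int(detected_speed_kbps * 0.8)
    return ("transcode", target)

print(playback_decision(4000, 100000))   # ('direct play', 4000) - the 4 Mbps / 100 Mbps example
print(playback_decision(40000, 25000))   # ('transcode', 20000)
```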

1 Like

No, not at this point in time. Currently, this is only on Android, but it should be rolled out to other clients if all goes well. Should we discover that we need to make adjustments based on real-world usage, we'll do that first and then look at other clients. I can't provide an estimate of when this may arrive on other clients, though.

This is what scares me. Users are (in a non-derogatory way) stupid. I'm anticipating my users ignoring the prompt to increase quality the same way they ignore me when I tell them to set their quality to maximum. They're all more than capable of streaming at maximum quality, and they "just don't want to mess with the system."

So aside from the fact that most of my users do not have Android TVs, whenever this feature does roll out to them, I have no clue if it’ll actually work or not.

So basically there are cases where clients don't have enough bandwidth to stream at maximum quality, and instead of putting the onus on the client to lower their default, Plex is saying they're fine with wasting server resources.

Why not go with the suggestion bandied about multiple times to allow the server to specify the default? The server admin has the best knowledge of their clients' capabilities and, more importantly, owns the server and the decision about how the hardware should be used.

I understand there’s complexity there, like clients connecting to multiple servers, but surely those could have been dealt with faster and with less fragility than the current solution.

I think this is missing the point. I want a user who has crappy bandwidth to be able to stream at whatever bitrate they need in order to have a good experience. I just don't want to optimize for that user. I'd rather they have to manually set the quality lower than risk clients who could play at maximum quality transcoding instead, because they're afraid to change a setting or (in the future) afraid to click a button on a prompt.

1 Like

The default for this feature is maximum. It will only offer a downgrade if they cannot play the original quality. From the sounds of your situation, this should solve your problems, and not require any user interaction.

I’m not sure what you’re getting at here with wasting server resources. This work is about ensuring the best quality for users, and minimal server resource utilisation for server owners. This is the solution we’re moving forward with at this time, and don’t have any plans to approach this issue in a different manner.

At this point, I don’t think you’re properly understanding how this feature works. What you’ve said there is exactly what it does. You don’t need to optimise anything, or have them manually set things lower. It will try to play the original quality, and if it can’t, it’ll suggest downgrading to a quality where it can play smoothly, and from then on, it’ll play at what they’re capable of.

2 Likes

Just want to give some support and say I really like the solution and I’m quite excited it’s finally rolling out!

Is there any implementation or plan for an "automatic" mode that basically always accepts the prompt, so the quality shifts dynamically? I'm assuming metrics are already being collected that will give an idea of how useful that might be and how well it would work.

1 Like

Overall I'm happy with the solution for this problem, but I don't think it will fix everything. This solution aims to default to maximum (great), but we desperately need an option to limit what each user's maximum should be, with a little more finesse than just one toggle.

Some people just have really unreliable connectivity and are unwilling to change their settings, which usually leads to them not using Plex as much anymore. If I could limit just their remote stream quality, I would be able to fix their issue remotely and save myself some very limited UK upload bandwidth.

My grandma is out here chilling with a tiny 720p TV and doesn't even need 1080p, but I can't cap her at 720p without reducing the quality for everybody. Also, some older Fire Sticks have awful performance with 1080p even if they technically support it; I've had to do too much troubleshooting to figure that out, and it's a nightmare.

I, for one, always have good connectivity and want to keep access to 4K, but there's only one setting for the maximum remote stream quality.

Surely something as simple as that could be handled 100% server-side with minimal effort from the Plex team. It would only have to be an additional toggle on the restrictions page.
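For what it's worth, the kind of per-user cap being asked for could be expressed very simply. The sketch below is hypothetical: Plex has no such per-user remote limit today, and every name in it is made up; it just illustrates how little server-side state the request would need.

```python
# Hypothetical per-user remote quality caps; not an existing Plex feature.
# The goal is just to show how small the server-side logic would be.

GLOBAL_REMOTE_LIMIT_KBPS = 40000          # the existing single server-wide limit
per_user_limit_kbps = {                   # the requested extra finesse (invented)
    "grandma": 4000,                      # 720p-class cap for a 720p TV
    "old_firestick_user": 8000,
}

def remote_cap_for(user: str) -> int:
    return min(per_user_limit_kbps.get(user, GLOBAL_REMOTE_LIMIT_KBPS),
               GLOBAL_REMOTE_LIMIT_KBPS)

print(remote_cap_for("grandma"))   # 4000
print(remote_cap_for("me"))        # 40000: users with good connections keep 4K-class bitrates
```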

I understand it, I’m just worried about edge cases. I said a few times above that I’ll probably be much more comfortable once I can actually play around with it.

Specifically I’m worried about a case where the user has a blip in their bandwidth, and they agree to downgrade their quality, but then don’t upgrade it when their bandwidth recovers. From what I understand, their quality would now remain degraded, and they’d constantly see a prompt to upgrade their quality (which in my scenario they ignore).

Now you might say it’s unrealistic for a user to downgrade when prompted yet not upgrade when prompted, but I’ve seen users do much weirder things.

In fact, below is the only way I think of users. It helps me design products that they’re less likely to screw up.

[image: users]

1 Like

Is there a reason why it’s been decided for the upgrade/downgrade to be a manual decision, rather than automatically changing it on the fly (like many streaming services do)? You can still provide an option for users to force a quality profile, but surely it’d make far more sense from a user satisfaction/quality point of view to automate this process as far as possible.

1 Like

A single blip would not cause them to be prompted to downgrade. If they were in a situation where they were getting repeated buffering events, that is the sort of thing that would prompt a downgrade. If they ignore the prompt to upgrade, it fades away after a short duration and won't interrupt their session further. Should they start another session, they would be prompted to upgrade again.
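As an illustration of the "single blip vs. repeated buffering" distinction (my own sketch; the real thresholds and time window are not public):

```python
# Sketch: only repeated buffering within a window triggers a downgrade prompt.
# The threshold (3 events) and window (60 s) are invented for illustration.

def should_prompt_downgrade(buffer_event_times, now, window_s=60, threshold=3):
    recent = [t for t in buffer_event_times if now - t <= window_s]
    return len(recent) >= threshold

print(should_prompt_downgrade([10.0], now=15.0))              # False: a single blip
print(should_prompt_downgrade([10.0, 25.0, 40.0], now=45.0))  # True: repeated buffering
```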

1 Like

Doing it the way streaming services do would be much more complicated: they have pre-transcoded versions in multiple qualities that they swap between. It's significantly harder to do the same sort of thing with only one copy of the media item. We tried this previously with the "auto adjust quality" feature, but it had complexities around switching to Direct Play or Direct Stream, and it always required transcoding. There were a few things that made it less than ideal, which is why we went in this direction.
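To illustrate the difference being described: with a ladder of pre-transcoded renditions, "changing quality" is just picking a different entry per segment, whereas with a single source file it means starting or stopping a transcode. A rough sketch of ladder-style selection (the bitrates and headroom factor are made up):

```python
# Sketch of ladder-style ABR selection, the approach streaming services use.
# With pre-transcoded renditions, switching quality is just picking another
# rung per segment; with one source file it means (re)starting a transcode.

LADDER_KBPS = [1500, 3000, 6000, 12000, 20000]   # invented rendition bitrates

def pick_rendition(measured_kbps, ladder=LADDER_KBPS, headroom=0.8):
    affordable = [b for b in ladder if b <= measured_kbps * headroom]
    return affordable[-1] if affordable else ladder[0]

print(pick_rendition(25000))   # 20000: plenty of bandwidth
print(pick_rendition(5000))    # 3000: drops a rung without any transcoding
print(pick_rendition(1000))    # 1500: already at the lowest rung
```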

4 Likes