I’m not aware of any Fire device sold after 2016-17 that doesn’t support HEVC. Roku is a little behind them, but not by much. And that’s only looking at the cheapest devices; most devices, even from before then, support it.
Old devices I have (pre-2015) sitting in a box, which I don’t even use anymore, support it.
If every device being sold today supports it, and almost every device sold over the past several years does too, that means you either already are in the minority or are quickly becoming part of it.
Buckle up, AV1 is on the way
Most people here have sunk thousands of dollars into their setups, only to get undesirable results and hours of headaches and tweaking.
Other server owners and I are the ones sinking all the money into this and providing the content.
If someone linked to my server doesn’t want to pay 20 bucks for a new Fire TV Stick every seven years, I will gladly send them subscription links to Netflix, HBO Max, Disney+, Amazon Prime, and any other services they’re interested in paying for.
Back in 2016, when I purchased an Apple TV HD (4th gen), it was the only non-4K device I could find that supported HEVC.
I think that may actually still be the case.
Amazon states they’ve supported HEVC on everything since 2015, but I know that’s a lie (unless that was another Plex issue).
Roku has a general spec list, not a model-specific one, stating that they support it.
But I know the Roku Express my sister bought new last year does not support it.
I don’t conflate 4K with HEVC support, but I think it’s still a good rule of thumb.
All of the 4K devices support HEVC; most of the HD devices I’ve seen do not.
As for kicking people off because they won’t pay more money for better hardware, doesn’t that defeat the purpose of Plex?
I’ve got a lot of family I’m sharing with, I’m not about to start kicking them off.
You don’t have to kick anyone off, you are simply no longer supporting an antiquated client device. As stated above, they can choose to purchase a $20 FireTV stick or they can no longer access your server. All support has limits. Moving to HEVC is hardly unreasonable when you consider that the codec is a decade old. AVC is nearly two decades old. That’s ample time for mainstream adoption.
It would be nice for server owners to be able to set “recommended settings” that trigger a dialog upon playing a file if the client’s settings do not match.
This should also inform the user of the required settings, explaining that they will use more bandwidth but play at a higher quality.
They should also be informed if the server owner requires this for media to be streamed from their server.
The user would then click ‘yes’ or ‘no’ and their settings should be updated automatically.
The server owner should be able to determine what clicking ‘no’ does (kill stream, allow transcode etc).
This way the server owner doesn’t have to explain anything, and the user is aware of the bandwidth usage.
This could be built upon further by allowing different settings (overrides) for specific users, just like when you set what libraries you want to share with specific users.
If the user then experiences slow speeds and buffering during a stream, this should trigger the already existing prompt to switch quality for that stream only (again, this should be toggleable by the server owner).
With this in place, users can easily meet any server owner’s requirements with a simple click of yes or no.
The server owner regains control, and the user has accepted that it will use more data.
Transcoding should be used when it’s NEEDED, IMO, not as a default setting.
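To make the proposal above a little more concrete, here is a minimal sketch of what such a policy and prompt flow could look like. All of the names (RecommendedSettings, reconcileSettings, onDecline, and so on) are hypothetical; nothing like this exists in Plex today.

```typescript
// Hypothetical sketch only: a per-server “recommended settings” policy plus the
// client-side prompt flow described above. None of these names are real Plex APIs.

interface RecommendedSettings {
  maxVideoBitrateKbps: number;                    // e.g. 20000 for near-original quality
  resolution: "original" | "1080p" | "720p";
  required: boolean;                              // owner requires this to stream at all
  onDecline: "killStream" | "allowTranscode";     // what clicking “no” should do
  perUserOverrides?: Record<string, Partial<RecommendedSettings>>; // per-user overrides
}

// Compare the client’s current setting to the server’s recommendation and prompt
// the user once, explaining the bandwidth/quality trade-off.
function reconcileSettings(
  currentMaxBitrateKbps: number,
  recommended: RecommendedSettings,
  promptUser: (message: string) => boolean        // true if the user clicks “yes”
): "useRecommended" | "killStream" | "allowTranscode" {
  if (currentMaxBitrateKbps >= recommended.maxVideoBitrateKbps) {
    return "useRecommended";                      // already meets the recommendation
  }
  const accepted = promptUser(
    `This server recommends ${recommended.maxVideoBitrateKbps} kbps (${recommended.resolution}). ` +
    `This will use more bandwidth but play at higher quality. Apply these settings?`
  );
  return accepted ? "useRecommended" : recommended.onDecline;
}
```

The idea is simply that the server publishes the policy once and the client handles the one-time yes/no conversation, so the owner never has to explain anything manually.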
You guys are still thinking in terms of the lousy kinds of solutions Plex used to offer. The real solution is for the default to be a “Recommended Settings” option that is smart enough to detect which client it is running on and choose the best settings for that scenario on the fly. Then you can choose something else if you know better.
I set up a friend yesterday with a brand-new account to access my Plex server. I personally installed Plex on their TV for the very first time. It’s a 4K Sony Bravia TV with built-in Google TV (Android apps). Much to my surprise, it already defaulted to maximum. Not that I’m complaining; this is actually great news, as this user isn’t very technically inclined.
Can you share the logic for this decision/statement? The client doesn’t know what the server’s capabilities are in terms of both upstream bandwidth and transcoder strength.
Why is the client the chosen place to control this? Why wouldn’t this be a joint decision between the server and the client?
I’ve been using Plex for well over a decade, and I’m having trouble understanding why the best place for this decision is in the client alone.
By preserving the status quo, you are actually giving people a bad experience. Folks who connect to my server say, “yeah, I don’t like Plex because the quality is kinda crappy…”; and these are folks with brand-new TVs. Turning them off of Plex at this point means they will be far less likely to ever set up their own Plex servers, won’t subscribe to Plex Pass, etc. They are all shocked at how much better it gets when I coach them through the minutiae of simply setting Quality to “original”. ¯\_(ツ)_/¯
The server doesn’t know the client’s capabilities, in terms of downstream bandwidth, nor any data caps which may be in place (which are still very common in many parts of the world).
The client is where the end user can determine what quality they want, and how much data they want to use. If buffering is occurring, the client knows that. If it’s able to fill the buffer quickly, the client knows that too. Some clients are connected to multiple servers, and if it was controlled server side, which server wins? Why should a server owner control how much data I want to use from my monthly cap? What if it’s increased and makes everyone buffer and unable to watch content instead? There are lots of things to consider here, and we will be making changes to improve the user experience around this.
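For illustration only, here is a rough sketch (with made-up names, not any real Plex API) of the kind of decision a client can make precisely because it alone knows its measured throughput, its data cap, and whether it is currently buffering:

```typescript
// Sketch of client-side quality selection. Hypothetical names throughout.

interface ClientState {
  measuredThroughputKbps: number; // estimated from recent segment downloads
  monthlyCapGb?: number;          // undefined if no data cap applies
  usedThisMonthGb: number;
  isBuffering: boolean;
}

function chooseMaxBitrateKbps(state: ClientState, userPreferenceKbps: number): number {
  // Start from the user's preferred ceiling.
  let ceiling = userPreferenceKbps;

  // Never exceed what the connection can actually sustain, with some headroom.
  ceiling = Math.min(ceiling, state.measuredThroughputKbps * 0.8);

  // If a data cap is nearly exhausted, step the ceiling down to stretch it.
  if (state.monthlyCapGb !== undefined) {
    const remainingGb = state.monthlyCapGb - state.usedThisMonthGb;
    if (remainingGb < 10) ceiling = Math.min(ceiling, 3000); // ~3 Mbps fallback
  }

  // If playback is rebuffering right now, drop further for this stream only.
  if (state.isBuffering) ceiling = Math.min(ceiling, state.measuredThroughputKbps * 0.5);

  return Math.max(ceiling, 1000); // keep a sane floor
}
```

A server could still publish a recommended ceiling, but only the client can fold in cap usage and live buffering state like this.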
I didn’t say we were keeping the status quo. I have said that at this point in time, nothing has changed, and that if or when changes would happen, they would happen on the client side.
What do you mean, “if someone is connected to several servers, which one wins”?
Obviously the one being addressed, the one the file is actually being played from. How hard can it be to set a bandwidth limit per user on the respective server, so that the server applies it whenever it gets a request?
You keep forgetting that most users have absolutely no technical skills, let alone the desire to configure anything. It is simply inconceivable that in the year 2022 (soon 2023) you cannot manage to let us preset this. Every one of us knows it is technically feasible: if you cobble together scripts in Tautulli it works, but that is extremely cumbersome and error-prone, especially once Tautulli has been running for a few days, when it often starts throwing lots of error messages.
So the excuses from you at Plex are getting more and more idiotic. The only ones who know what their users want are the server operators, and nobody else.
The operator has the technical knowledge to set things up optimally for their users, but thanks to your limitations we have only two options: either hundreds of unnecessary transcodes (which cost a lot of electricity, and next year, when everything is 50+% more expensive, that could ruin some people or make them give up Plex altogether), or you turn transcoding off completely and risk that some players no longer work, because many of them cannot handle every codec.
As I said, this is once again a feature you have not thought through at all. If I have people who cannot manage with Original quality, then I want to set that for them rather than make the users fiddle around themselves; as I said, they rarely know what they are doing or what they should do better.
On top of that, a solution was promised two years ago and it still is not here today, which shows once again that you do not give a ■■■■ about your users. You prefer to tinker with features like Arcade or Discover that nobody wants or needs. Luckily I bought lifetime accounts, and that was years ago. If I were paying monthly or yearly, I would feel even more screwed.
That is slightly excessive. Not to defend the Plex team, but they need to legitimise their application, and Discover is that.
That said, defaulting to 720p in countries where data caps don’t exist is pointless.
It’s exhausting to call friends and walk them through raising the quality in their settings. What makes it worse is that some app updates clear this selection!
Some effort addressing this default would go a long way with the community, to the point that they would forget a lot of the other “cr@p” they don’t want but had forced on them by default.
I would rather deal with an “it’s buffering” phone call than see someone needlessly watching in potato quality. That much I can handle.
Thanks for the reply, @DaveBinM, and I fear we’re getting lost in the minutia here.
Instead, I simply recommend y’all take a breath and zoom out to look at where users’ competitive/comparative UX is coming from: Netflix, Disney+, Apple TV+, Hulu, etc.
Those apps presume that the user wants the best quality possible and is fine using all their bandwidth, and they certainly don’t require the user to make changes just to attempt experiencing full video quality…
Those apps all make this happen by having a conversation between the server and the client, and they generally start with the “send it all and let’s see if that works” paradigm. Fubo (on some devices) is the only one I’ve experienced which (obviously) starts with low quality and then ramps up 10 seconds in, and it’s jarring by comparison. But even then it’s presuming the user is fine using all their downstream bandwidth and attempts to do so in order to get the best video quality possible.
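Purely to illustrate the contrast (none of this reflects any service’s actual implementation), the two startup strategies being compared look roughly like this:

```typescript
// Illustrative only: two adaptive-startup strategies for picking the first rendition.

const renditionsKbps = [1500, 3000, 6000, 12000, 20000]; // available quality ladder

// Strategy A (“send it all and see if that works”): start at the highest rendition
// and step down only if the buffer cannot keep up.
function startHigh(): number {
  return renditionsKbps[renditionsKbps.length - 1];
}

// Strategy B (Fubo-style): start at the lowest rendition, then ramp up a few
// seconds in, once throughput has actually been measured.
function startLowThenRamp(measuredThroughputKbps: number): number {
  const affordable = renditionsKbps.filter(r => r <= measuredThroughputKbps * 0.8);
  return affordable.length ? affordable[affordable.length - 1] : renditionsKbps[0];
}
```

Strategy A is why those services feel sharp from the first frame; Strategy B is what produces the jarring low-quality opening seconds described above.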
So yeah, instead of defending the idea of just iterating on the current model, I recommend looking at what folks are getting elsewhere—with great success, I’ll add—and make that your North Star.
That said, I have come to understand that there are a lot of fiefdoms inside Plex nowadays, and so I grok that corporate culture might cause things to get hamstrung here.