[Implemented] Server-Side Speed Limits/Caps for Shared/Subscribed Users

@justinnjess said:
I’ve got the CPUs, not the bandwidth. I want forced transcoding for all users.

@MikeG6.5 said:

@justinnjess said:
Force transcoding for all remote addresses would be great

Only if you don’t want your server to handle more than a couple of streams at a time. If you share out to just a few people, and they only use a few streams in total, then this will work for you. If you share out to more than 10 or 20, and they all seem to hit at roughly the same times, this will never give them a good experience.

The only way to ensure everyone views everything without buffering is to provide alternate versions that can be selected on the server side, based on the connection each user has. The client negotiates the speed with the server; the server looks to see whether the right bitrate is available, and only transcodes if it can’t find a version that fits the requirements.
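A minimal sketch of the selection logic being described, with made-up names (this is an illustration of the idea, not Plex’s actual code):

```python
# Illustration of the negotiation described above: Direct Play the best stored
# version that fits the client's bandwidth; transcode only when nothing fits.
# Class and function names are invented, not Plex internals.
from dataclasses import dataclass

@dataclass
class MediaVersion:
    label: str           # e.g. "1080p"
    bitrate_kbps: int    # overall bitrate of the stored file

def pick_version(versions: list[MediaVersion], client_kbps: int):
    fitting = [v for v in versions if v.bitrate_kbps <= client_kbps]
    if fitting:
        # Direct Play the highest-bitrate version the connection can carry.
        return "direct_play", max(fitting, key=lambda v: v.bitrate_kbps)
    # Nothing fits: transcode down from the smallest stored version.
    return "transcode", min(versions, key=lambda v: v.bitrate_kbps)

versions = [MediaVersion("1080p", 12_000), MediaVersion("720p", 4_000)]
print(pick_version(versions, 5_000))   # direct plays the 720p version
print(pick_version(versions, 2_000))   # transcodes down from the 720p version
```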

Transcoding is really CPU intensive, and those with smaller CPUs aren’t able to handle even one transcode, let alone 2 or more.

Then you haven’t read any of the discussion after the post you replied to, but that’s OK…

I am curious… What CPU do you have that can handle your current transcodes? And what will the future hold for that CPU? Your library? The users you share with? Where do you see your money going when it comes to updating your PMS machine in a year? Two? Three?

Are you going to be buying a bigger CPU to handle the things this one can no longer support, as your users become more numerous and their demands on your CPU exceed what it’s able to provide?

Me? I’ll be buying additional HDDs and hanging them onto the system I already have, to handle the additional media versions I will be Direct Playing to my users. And when I need to upgrade CPUs, it’s not going to cost $3,000-$10,000 to get a system working as I need it to… In fact, in 3 years I expect I’ll just need to plug a somewhat faster CPU into the existing box I use now and gain immediate benefits… Maybe a total cost of $200?

Setting up your server the way you want means you are going to be spending more on the actual PMS machine than on the media storage.

So in the future, you are going to be back, wondering why your uber-powerful CPU isn’t able to do what you require of it. And hopefully the features I’ve been discussing here will all be working as discussed, so someone can just point you to that feature set and get your CPU-intensive system working the way much smaller systems will have been working for a while… (For that matter, the way some of them work NOW, within the existing features, if they are set up right.)

Really, it’s your call if you want to turn your PMS machine into a money sink… Who am I to tell you where to spend your money? All I can hope to do is suggest that spending it on the latest and greatest CPU, and all that this requires, might not always be the best value.

@MikeG6.5 said:
I am curious… What CPU do you have that can handle your current transcodes? […]

Not breaking a sweat on CPU ever.

All cores clocked at turbo on an i7-4790K.
6100 hours of streaming on this install.

This is a popular request for a reason. People are bandwidth limited but have spare CPU. I ask my (not-home) family to use the transcode option, but people don’t pay attention until it affects others.
I’m not sharing with the world and expecting to scale out to hundreds of people. I just want to be able to force the option, to give everyone an equal experience. I’m already doing source-address QoS rate limiting, so that people consuming too much bandwidth get the notification to switch to transcoding because of a slow connection.
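For the curious, that kind of per-client shaping can be sketched with Linux tc and HTB classes. This is a rough sketch only: the interface name, rates, and client addresses are placeholders, and since it shapes *upload* traffic, each remote client is matched as the destination:

```python
# Rough sketch of per-client egress shaping with Linux tc/HTB, run as root
# on the gateway. IFACE, rates, and addresses are placeholders.
import subprocess

IFACE = "eth0"                          # placeholder WAN interface
CLIENTS = {"203.0.113.10": "4mbit",     # example remote client addresses
           "203.0.113.20": "4mbit"}

def tc(*args: str) -> None:
    subprocess.run(["tc", *args], check=True)

# Root HTB qdisc; traffic matching no filter falls through unshaped
# (class 1:999 is deliberately never created).
tc("qdisc", "add", "dev", IFACE, "root", "handle", "1:", "htb", "default", "999")

for i, (ip, rate) in enumerate(CLIENTS.items(), start=10):
    classid = f"1:{i}"
    tc("class", "add", "dev", IFACE, "parent", "1:", "classid", classid,
       "htb", "rate", rate, "ceil", rate)
    tc("filter", "add", "dev", IFACE, "protocol", "ip", "parent", "1:",
       "prio", "1", "u32", "match", "ip", "dst", f"{ip}/32", "flowid", classid)
```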

Oh, I’m not arguing that this request doesn’t need to happen. I’m all for it; in fact, I’ve been very active in getting more support for it on various other threads. But for those who think this is going to solve their problems: it’s actually going to create a different set of problems for smaller-CPU installs.

And the answer is NOT going to be a faster or more powerful CPU. At least not long term. The complete answer is a whole package deal: bitrate limits, with adaptive bitrates, making use of the Optimized Media feature to ensure that as much as possible of the media streaming out is Direct Play and not transcode. Measure your CPU and memory usage during even one transcode session. Then measure it during 8 or 10. The results are going to shock you, I think.

For those who don’t have a 10K-15K PassMark CPU, this is the only viable option. As content bitrates increase, the demand on the CPU is only going to increase as well. And then where will the bitrate-limited, transcode-only operators go to get things fixed? 4K content transcoded down is going to make many $5,000+ systems crawl under too many transcodes. But the admins who have a 2160p/45Mbps version, a 1080p/10Mbps version, and a 480p/1.5Mbps version stored are going to be able to have everyone streaming all day long.

Many admins ditch the lower-quality version to retain only the highest on their systems. These are the guys who are going to be crying for something to help them later on. My whole argument is that there is something already in place RIGHT now that doesn’t take a lot to enforce. And if you had already read AND understood what I’ve been laying out above, you would already have your answer as to how to bitrate-limit within the existing feature set!

You want to FORCE transcodes. I want to force intelligent bitrates that follow a set of rules defined for each server and install, including limits. I want NO transcodes, PERIOD. Transcoding is not efficient, and it ultimately creates more reliance on a bottleneck that is going to bite everyone who relies on it. If a transcode HAS to happen, it should be a one-off, and I want to know why it happened so I can fix it.

Everyone thinks this is going to solve all of the bandwidth/buffering issues they have. But you are just moving the problem to the next weakest part of the whole system. And until there is a major breakthrough in CPU tech, that next weakest part is going to be the CPU of the transcoding server. Even the biggest CPUs out there are ultimately only going to be able to support a (relatively) small number of transcodes at a time.

Now, before we get too much further into this quagmire… I had a conversation with a few of the developers from Plex in the last few weeks, and this particular topic came up. Multiple times. Suffice it to say that the Team’s Road Map™ includes what I have outlined in this thread. I’ve been assured that it’s not a question of IF they are doing this feature set, but of WHEN it goes live. OM is already a reality, and using it you can already enforce any bitrate limit you want, by sharing out only a library made from the OM versions the system creates for you.

Next up will probably be Bitrate Limits, and then, we hope, Adaptive Bitrates. All tied together, each leveraging the others, and all working as intended on a big system or a little one.

It’s going to be up to each admin how he wants to implement any portion of the whole thing. You want bitrate limits. So do I. I also want adaptive limits, and to use the OM versions to keep things Direct Play as much as possible.

My QoS rate limiting based on source address forces things to be fair for everyone. If I could force downsampling as well as rate limiting at my gateway, it would resolve all issues for me. Re-encoding my entire library isn’t sane. If I had to, I could power up my other two 9K+ PassMark machines, mount the NFS shares from my primary Plex server, and gain performance. But as it is I have no use for them, so they’re unplugged, collecting dust to cut down on the cooling bill.

I’ve gone as far as considering altering the packets to ask for transcoding. The server itself isn’t open source, and I can’t contribute in the proper way.

I’ve even thought about making dummy mkfifo-type buffers, monitoring file-access triggers with incron, and using ffmpeg to create forced transcodes.

I figured I’d also mention that I had some success messing with ffmpeg and https://github.com/vbence/stream-m to deliver a transcoded, downsampled stream. It was just something to give me an idea of what I was working with, not something I can invest much time in polishing.
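A rough sketch of that kind of forced downsample with ffmpeg; the paths and the 720p/4 Mbps target here are illustrative choices, not anything Plex does internally:

```python
# Minimal forced-downsample sketch with ffmpeg (real flags, placeholder paths).
import subprocess

def force_transcode(src: str, dst: str, height: int = 720, vbitrate: str = "4M") -> None:
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",       # scale down, keep aspect ratio
        "-c:v", "libx264", "-preset", "veryfast",
        "-b:v", vbitrate, "-maxrate", vbitrate, "-bufsize", "8M",
        "-c:a", "aac", "-b:a", "128k",
        dst,
    ], check=True)

force_transcode("movie_1080p.mkv", "movie_720p_4mbps.mp4")
```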

This is definitely needed. Heck, at the very least, right now I’d like a way to pause all remote users from being able to stream content.

I have been looking around for a solution like this for a while. I would love it if this were implemented, so the people I share with don’t eat up my bandwidth when streaming high-quality movies. It is becoming an issue for me :confused:

I’m 100% for this, not from a CPU perspective but for upload speed control. It’s not always easy to contact a friend who is just into an episode or film and ask them to change the quality… Having a global or per-user setting would be very welcome.

I would really love to see this implemented. You don’t necessarily have to grey out the options on the client; instead, allow users to choose what they want but force the server side to transcode to that quality. Ideally a per-user option, but at least a global option, in my opinion.

@MikeG6.5 said:
For those who don’t have a 10K-15K PassMark CPU, this is the only viable option. As content bitrates increase, the demand on the CPU is only going to increase as well. And then where will the bitrate-limited, transcode-only operators go to get things fixed? 4K content transcoded down is going to make many $5,000+ systems crawl under too many transcodes. But the admins who have a 2160p/45Mbps version, a 1080p/10Mbps version, and a 480p/1.5Mbps version stored are going to be able to have everyone streaming all day long.

One point that could change the game is hardware transcoding. We now have a first try with the Nvidia Shield.

The day we get GPU hardware transcoding for h264 and h265, I guess that won’t be an issue anymore, unless you’re transcoding 20+ 4K streams. But even then, just adding a $200 GPU could double your transcoding capabilities.

@Delivereath said:

@MikeG6.5 said:
For those who don’t have a 10K-15K PassMark CPU, this is the only viable option. […]

One point that could change the game is hardware transcoding. […] The day we get GPU hardware transcoding for h264 and h265, I guess that won’t be an issue anymore.

Hi MikeG6.5
Thanks for your insights!

In general I agree with your opinion, but I think the main concern of most users here (including me) is upload bandwidth.
Yes, in the long term it’s smarter to re-encode the media to a more lightweight format, but the current situation (and, I think, for some time to come) is that the main bottleneck is upload bandwidth.

Server-side transcoding would fix this problem and additionally provide flexibility to adapt to changing bandwidth situations.

Also, I think 4K content will mostly be streamed locally, and a separate library could be used for it in the meantime.

I have a question that kinda fits into the discussion here.

Let’s keep things simple: I have 15 Mbps upload and 3 remote users. Two of them are selfish and won’t lower the quality down from Original, because they watch at different times and so can normally pull most things Direct at better quality.

Now let’s say I have a 1080p film at 12 Mbps. Obviously only one remote user can Direct Stream it at once, so I create an optimized 720p @ 4 Mbps version (three streams at 4 Mbps each is 12 Mbps, which fits in my 15 Mbps upload).

The question: if the selfish remote users don’t change their settings down from Original, which version will the server send out? If it still tries to send the 1080p rather than the 720p, I still need to get them to turn their quality settings down (which magically go back up again later).

@iforgot said:
I have a question that kinda fits into the discussion here.

The question: if the selfish remote users don’t change their settings down from Original, which version will the server send out? If it still tries to send the 1080p rather than the 720p, I still need to get them to turn their quality settings down (which magically go back up again later).

AFAIK the user will get whatever they ask for. That is the whole point of this thread.

@Delivereath said:
One point that could change the game is hardware transcoding. We now have a first try with the Nvidia Shield.

This is what I was about to post. With Tegra, Plex will be able to send 2-3 simultaneous 1080p transcoded streams. I can’t imagine what you could do with the GTX 10xx and 9xx series! It looks like the Nvidia team is involved with the development, and it’s not hard to assume this will come to x86 servers eventually.

I feel (and hope) that the Plex team will try to perfect this. Imagine a server that behaves fully deterministically based on its computing power and network bandwidth. For example: start transcoding on the CPU; if no CPU resources are left, or a certain limit is reached, move new transcoding requests to the GPU; balance all of them against the upload bandwidth limit; or assign more bandwidth to your VIP users; or even give high priority to Plex Home users and throttle the bandwidth of “friends”, etc…

Options are limitless, but time isn’t. :)
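A hypothetical sketch of the kind of deterministic dispatch being imagined here; every name and threshold below is made up for illustration:

```python
# Hypothetical dispatch: prefer the CPU up to a slot limit, spill new
# transcodes to the GPU, and refuse a stream once the upload budget is spent.
from dataclasses import dataclass

CPU_SLOTS = 2            # assumed max simultaneous CPU transcodes
GPU_SLOTS = 6            # assumed max simultaneous GPU transcodes
UPLOAD_CAP_KBPS = 20_000

@dataclass
class ServerState:
    cpu_transcodes: int = 0
    gpu_transcodes: int = 0
    upload_in_use_kbps: int = 0

def admit_stream(state: ServerState, needed_kbps: int, vip: bool = False) -> str:
    # Non-VIP users only get 80% of the pipe, holding headroom for VIPs.
    budget = UPLOAD_CAP_KBPS if vip else int(UPLOAD_CAP_KBPS * 0.8)
    if state.upload_in_use_kbps + needed_kbps > budget:
        return "reject: upload budget exhausted"
    if state.cpu_transcodes < CPU_SLOTS:
        state.cpu_transcodes += 1
        target = "cpu"
    elif state.gpu_transcodes < GPU_SLOTS:
        state.gpu_transcodes += 1
        target = "gpu"
    else:
        return "reject: no transcode slots free"
    state.upload_in_use_kbps += needed_kbps
    return f"transcode on {target}"

state = ServerState()
print(admit_stream(state, 4_000))   # -> "transcode on cpu"
```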

PS: As far as I heard, the Nvidia NVENC SDK was a mess and nobody liked it. But it looks like things are changing with NVENC SDK 6.0. The Tegra X1 should be based on Maxwell, which is supported by the SDK.

Handbrake hardware encoding on my Haswell i5’s integrated HD 4600 graphics can do 75-100 fps converting 1080p AVC to 720p x264 MP4 at 5 reference frames and other ‘high quality’ settings.
I can only dream of what more optimized, slightly lower-quality settings on a dedicated video card could do.

I look forward to our potential hardware-transcoding overlords.

Back more on topic: I’m still waiting for server-controlled limits.
Asking my non-tech family to set 2 or 3 Mbps is getting old.

I guess the point I have been trying to make with the Optimized Media feature is that you don’t HAVE to ask anyone to use a different setting. If all they can see is an optimized 4 or 6 Mbps stream, that’s ALL they will get, period.

Using this feature to implement an artificial bitrate limit isn’t hard. It takes your server’s hardware time to create the versions, and additional space to store the reduced-bitrate copies. Those are really the only expenses you face by setting something up with the OM feature to handle bitrate limits. It’s entirely possible to have the whole library available at a reduced bitrate while only adding a small percentage to the overall library size (perhaps as much as 20-30% more space than you are using now? If that…), and to keep users from hogging your upload bandwidth, which is the entire point of this discussion. If they can’t see the 1080p 20 Mbps media, only the 720p 4 Mbps version, that’s all they will ever get, and they can only use 4 Mbps of your overall upload speed.
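A quick back-of-the-envelope check of that 20-30% figure, assuming file size scales roughly linearly with overall bitrate:

```python
# Sanity check of the 20-30% storage estimate, assuming size scales roughly
# linearly with overall bitrate (example numbers from this thread).
original_kbps = 20_000                 # e.g. a 1080p/20 Mbps library
extra_versions_kbps = [4_000, 1_500]   # 720p/4 Mbps and 480p/1.5 Mbps copies

overhead = sum(extra_versions_kbps) / original_kbps
print(f"extra space vs. originals: {overhead:.0%}")   # -> 28%
```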

This is a workaround given the capabilities of the software now, and that’s all it is. It’s not a complete answer by any means! And even if the Team implements a per-user limitation down the road, using 2 or 3 different versions of the media is going to be a part of the complete solution to prevent excessive transcoding sessions. (I feel ANY transcoding is excessive, but that’s my opinion!)

My opinion is: do that transcode once, store it for future use, and then point a user at only that version if you want to force limits on them. Why run multiple transcodes, possibly of the same media at the same time, to support multiple streams of it? Just do one transcode when the media is added to the library and be done with most transcoding from that point on. Your users are going to get a higher-quality version than they would with on-demand transcoding, your CPU is going to run better, and your upload bandwidth isn’t going to be overtaxed trying to keep up with something larger than the pipe is designed for.
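A sketch of that “transcode once, store it” idea as a script; the library path and naming convention are placeholders, and Plex’s own Optimized Versions feature does the equivalent job without any scripting:

```python
# Sketch of "transcode once, keep it": walk a library and create a 720p/4 Mbps
# companion next to any file that doesn't have one yet.
from pathlib import Path
import subprocess

LIBRARY = Path("/media/movies")        # placeholder library root

for src in LIBRARY.rglob("*.mkv"):
    dst = src.with_name(src.stem + ".720p-4mbps.mp4")
    if dst.exists():
        continue                       # already optimized: only transcode once
    subprocess.run([
        "ffmpeg", "-n", "-i", str(src),    # -n: never overwrite an output file
        "-vf", "scale=-2:720",             # 720p, keep aspect ratio
        "-c:v", "libx264", "-b:v", "4M", "-maxrate", "4M", "-bufsize", "8M",
        "-c:a", "aac", "-b:a", "128k",
        str(dst),
    ], check=True)
```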

Rocket science… Or more like smoke and mirrors, since you’re kind of pulling a fast one on your users by only sharing out at the bitrate you want them to have.

Really guys, we already have a feature request in the system for storing transcoded copies for further use (cloud syncing, or mobile device syncing). This is just an extension of that request, applied to bitrate limiting… And until we get a working version of user-based limits, this is completely viable, if someone wants to set their libraries up for it.

@mike.g5 - you’ve been pounding your opinion in this thread for several pages.
Most of us get it. Summarized: you think the best method is to store multiple copies, or just store a lower version so a higher one can’t be used, and somehow limit users to that.

Unfortunately, that is not what many of us want.
What may be best for you is not best for others.
We appreciate your opinion - I personally just feel it’s been overstated.

@JamminR said:
@mike.g5 - you’ve been pounding your opinion in this thread for several pages. […] We appreciate your opinion - I personally just feel it’s been overstated.

I think he’s right if you have lots of shared users and upload speed. The thing is, it requires lots of storage space (which, even if $/GB is low, costs money). If you only have 2, 3, or 4 shared users, maybe on-the-spot transcoding makes more sense… but soon enough you’ll run into issues with your CPU speed being insufficient, especially if you use a codec that’s not efficient with your transcoding setup, like h265.

@JamminR is right, we know what you want

@MikeG6.5 is there a reason you feel the need to write an essay for every post, repeating your point of view over and over?

Were you just never educated on the idea of concise writing?

Is there a reason all options should not be available so that everyone can choose the best option for their particular setups/bottlenecks?

Not as far as I can see. I think Mike sees bitrate limitation as a band-aid fix, though. If you limit each user to 2 Mbps and have 5 users and a 20 Mbps upload, great: you’re only using about half your upload. But if all of them are transcoding at the same time, you’re going to be taxing the CPU a tad. His solution is probably best for someone who has the storage to support multiple copies.

But in reality, this isn’t the use case I am looking at. I am more concerned with User 1 playing one of my 20 Mbps files at Original quality, consuming all of my upload bandwidth in one go. That’s a problem for User 2, who has their client set up right. Mike’s solution would have me make copies of all my 20 Mbps files and force them on User 1. That’s a lot more storage space and work for me than just toggling User 1 down to a 2 Mbps limit from the get-go.

I get how Mike’s solution would work, but it’s hardly the simplest solution out there.

To your question: I think both ways would work. I just think bitrate limitation on a per-user basis is the simplest. You can set that limitation when you add the user and never worry about it again.

If you have lots of simultaneous users, Mike’s solution is probably the better option.