@kevinwyrick said:
so that everyone can choose the best option for their particular setups/bottlenecks?
This. Absolutely.
Plex’s version of “optimizing” by creating multiple copies is only one option, and it’s not the one I want, or have the controller capacity and maintenance/hardware upkeep time for.
Server-side control of >X< is what’s lacking, and what we’re voting “like” on the first post for.
Whether that be bitrate (as the first post indicates), a maximum number of users at a time, or whatever (even Mike.G5’s seeming belief that multiple copies trumps all), it should be server-controlled and decided by me, not by the users I share with. Even the current optimization doesn’t limit the users.
If I have a 10mbps file and a 3mbps file, and the user’s quality is set to 12, well, there goes all my upstream.
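To put numbers on that (the 15 Mbps upload figure below is just an assumption for illustration):

```python
# Hypothetical upstream budget; the 15 Mbps figure is an assumption for illustration.
UPSTREAM_MBPS = 15

# A client whose remote quality is set to 12 Mbps will direct-play any file at or
# below that bitrate, so the server ends up sending the file's full bitrate.
def delivered_bitrate(file_mbps, client_limit_mbps):
    return file_mbps if file_mbps <= client_limit_mbps else client_limit_mbps  # else: transcode down

streams = [delivered_bitrate(10, 12), delivered_bitrate(3, 12)]  # the two files from the example
used = sum(streams)
print(f"Used {used} of {UPSTREAM_MBPS} Mbps upstream -> {UPSTREAM_MBPS - used} Mbps left")
```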
@JamminR said: @mike.g5 - you’ve been pounding your opinion in this thread for several pages.
Most of us get it, summarized - you think the best method is to store multiple copies, or, just store a lower version so a higher one can’t be used, and somehow limit users to that.
Unfortunately, that is not what many of us want.
What may be best for you is not best for others.
We appreciate your opinion - I personally just feel it’s been overstated.
No, what I have been saying over and over again (since no one seems to get it) is that per-user bitrate limits can already be done with the tools we have now, by using separate libraries for the original and the OM (Optimized Media) versions. The best part is, once this request goes live, you can share the main library out to those users, kill the OM job, and reclaim all of the disk space it used… (likely less than 25% of the main library).
Of the solutions available to us RIGHT NOW, this is the simplest to implement. There aren’t any other commonly available options, so how else does someone limit their users? You don’t, and that is where this request came from.
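For what it’s worth, here’s a rough sketch of that sharing step scripted with the third-party python-plexapi package. The library names, the token, and the inviteFriend call shown are assumptions for illustration; in practice it’s the same per-library sharing you’d do from the web app.

```python
# Sketch of the two-library workaround: share ONLY the optimized library with
# remote users. Library names, token, and URL are placeholders; python-plexapi's
# MyPlexAccount.inviteFriend(user, server, sections=...) is assumed here.
from plexapi.myplex import MyPlexAccount
from plexapi.server import PlexServer

account = MyPlexAccount(token="PLEX_TOKEN")                    # placeholder token
plex = PlexServer("http://localhost:32400", "PLEX_TOKEN")      # placeholder server

# "Movies" holds the originals, "Movies (Optimized)" holds the OM output.
low_bitrate_only = ["Movies (Optimized)"]

# Remote friends only ever see the optimized copies, so they can't pull the
# 10+ Mbps originals no matter what their client quality setting says.
account.inviteFriend(user="friend@example.com", server=plex,
                     sections=low_bitrate_only)
```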
@JamminR said:
Even the current optimization doesn’t limit the users.
If I have a 10mbps file and a 3mbps file, and the user’s quality is set to 12, well, there goes all my upstream.
Then you really didn’t read anything I posted if you are sharing both versions out to any given user. And you think I’m “over-stating” it when statements like this prove someone hasn’t read them?
Whatever… Keep muddling along as you are, then, praying for a solution that we haven’t seen anything on since December…
So if you have a 12mbps file and you optimize it to include a 3mbps file, isn’t the 12mbps file still there for users to stream if they have their option set at a higher limit (let’s say 20), or does it know to only let them have the 3mbps?
I’m still not following. Do you think you could explain what it is we can do now and what you think the future should be? In at least 1000 words?
He advises you to use two libraries, one with your original content and one with the optimized versions.
But then, for the admin and local users, you will see two versions of each file. And when there are only 1 or 2 users on my server, direct play is just fine; I only want to force a transcode when there are too many users, not all the time. Also, I don’t want to transcode my whole library to create optimized versions, because that would take months with my CPU running at 100%. Not to mention the cost: my server at 100% would cost me $25 per month in electricity.
So in the end his solution is a partial workaround and doesn’t fit every user.
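For what it’s worth, the electricity figure above is plausible; here is a quick back-of-the-envelope check, assuming roughly a 250 W whole-system draw and $0.14/kWh (both assumptions, not numbers from the post):

```python
# Rough check of the "$25/month at 100% CPU" claim.
# The wattage and electricity price below are assumptions for illustration.
watts_at_full_load = 250          # assumed whole-system draw under sustained transcoding
price_per_kwh = 0.14              # assumed electricity rate in $/kWh
hours_per_month = 24 * 30

kwh = watts_at_full_load / 1000 * hours_per_month   # 180 kWh
cost = kwh * price_per_kwh                          # ~$25.20
print(f"{kwh:.0f} kWh/month -> ${cost:.2f}/month")
```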
If Plex is going to be considered to be in the “big leagues,” then any non-local stream should be transcoded using adaptive bitrate technology like Netflix, Hulu, Amazon, etc. are using… just saying. And limits on how high-quality a transcoded stream can be should be controlled on the server side, regardless of the kind of CPU you have or the bandwidth of your pipe.
@jerseydevil62 said:
If Plex is going to be considered to be in the “big leagues,” then any non-local stream should be transcoded using adaptive bitrate technology like Netflix, Hulu, Amazon, etc. are using… just saying. And limits on how high-quality a transcoded stream can be should be controlled on the server side, regardless of the kind of CPU you have or the bandwidth of your pipe.
Mostly agree, but keep in mind Netflix and Hulu and Amazon don’t transcode a stream in real time against available bandwidth… they pre-transcode into MULTIPLE formats and just switch the stream to the appropriate one on the fly. The result is the same for the watcher, but it’s much faster to switch between two existing files than to stop a transcoding job, start a new one, and buffer enough again…
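Put another way, the server keeps a ladder of pre-encoded renditions and the player simply jumps to whichever one fits its measured bandwidth. A minimal sketch of that selection logic (the ladder values are made-up examples, not anyone’s real encodes):

```python
# Minimal sketch of adaptive-bitrate selection over a pre-transcoded ladder.
# The rendition list is illustrative; real services publish it in an HLS/DASH manifest.
LADDER_KBPS = [235, 750, 1750, 3000, 5800]   # assumed pre-encoded renditions, low to high

def pick_rendition(measured_kbps, headroom=0.8):
    """Return the highest rendition that fits within the measured bandwidth."""
    budget = measured_kbps * headroom          # leave margin so playback doesn't stall
    fitting = [r for r in LADDER_KBPS if r <= budget]
    return fitting[-1] if fitting else LADDER_KBPS[0]

# The player re-measures throughput every few seconds and simply requests a
# different existing segment set; no transcoder restart, no long re-buffer.
for throughput in (8000, 2500, 900):
    print(throughput, "->", pick_rendition(throughput), "kbps rendition")
```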
Yes, what is needed, or being asked for, is the ability to throttle the transcode bandwidth from the server at a per-user level; that would be ideal, but I will take it at the server level (same for all users). I work in media broadcasting, and real-time transcoding is starting to surface, but it is still far away even for the likes of Netflix, etc…
KarlDag is right: they use ABR transcodes stored on an origin server that waits for the file request; the content is then packaged on the fly (a real-time packager for encryption, DRM (rights management), etc.), and if it’s popular it remains on the CDN for other users with the same device type, so they don’t re-package for each request, only per device type, etc…
This is a good first start. Hopefully the devs will make it more granular over time so I can specify different upload rates per user. At a minimum, I would want my Home users to be able to select any rate while the people I share with are capped.
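To illustrate the kind of granularity being asked for here (this is purely a hypothetical sketch of the requested feature; nothing like it exists in Plex today), a server-side policy could be as simple as a per-user cap that overrides whatever the client requests:

```python
# Purely hypothetical sketch of the requested server-side, per-user cap.
# Nothing here is a real Plex setting; names and numbers are made up for illustration.
PER_USER_CAP_KBPS = {
    "home_user": None,       # Home users: no cap, any quality they pick
    "shared_friend": 3000,   # shared users: hard-capped regardless of client setting
}

def effective_bitrate(username, client_requested_kbps):
    cap = PER_USER_CAP_KBPS.get(username)
    if cap is None:
        return client_requested_kbps             # uncapped
    return min(client_requested_kbps, cap)       # server-side limit wins

print(effective_bitrate("home_user", 12000))      # 12000
print(effective_bitrate("shared_friend", 12000))  # 3000
```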