Optimized versions: how exactly do clients choose to use them?

Since I didn’t find a reliable way to see which version of my media my remote users are playing from my PMS, I was wondering how exactly the clients choose which version to use.
Example with one media item:
- Original media: H.264 MKV, 15 Mbps
- Optimized version of this media: “Optimized for Mobile”, 4 Mbps

From what I understand, on the client side:
- If the video quality setting is set to 4 Mbps or less, it will use the optimized version of this media.
- If it’s set higher than 15 Mbps, it will use the original.

But what if it’s set to something in between, like 8 Mbps?
Will it still use the original, or will it use the optimized version (4 Mbps)?

Thanks in advance for a confirmation; this could change how I’m setting up my optimized versions…

I really thought this was an easy yes/no question…

In my setup I have a full Blu-ray rip of a movie, plus an 8 Mbps 1080p iOS-optimised version and a 3 Mbps 720p iOS-optimised version, and I have all my mobile devices set to 8 Mbps on WiFi and 3 Mbps on cellular.

When my Xbox One plays the file, it plays the full Blu-ray rip.

When I use an iOS device, or an Android device/Chromecast, it always plays the 8 Mbps version.

And when I’m away from home on cellular, it plays the 3 Mbps version.

So yes, as long as the quality setting on the device in question is the same as, or slightly higher than, the bitrate of the optimised version, it should play the optimised version.

Hope that helps. I cheated to find this out: as an example, I created a folder named “Movie A” and put in it Movie A at the original quality, Movie B at a lower quality, and Movie C at a lower quality again, then renamed them all to the same name and played “the movie” from different locations/devices. Since I knew which movie corresponded to which quality, whatever played told me which version was being used, and that proved the theory. Hope that makes sense.

@richarddc79 said:
Hope that helps. I cheated to find this out: as an example, I created a folder named “Movie A” and put in it Movie A at the original quality, Movie B at a lower quality, and Movie C at a lower quality again, then renamed them all to the same name and played “the movie” from different locations/devices. Since I knew which movie corresponded to which quality, whatever played told me which version was being used, and that proved the theory. Hope that makes sense.

Wait, what?

How did you find out which file was being streamed? PlexPy?

Because if you pick 3 different movies of different qualities, name them all the same, and put them in the same directory, making sure to note which movie is which, you would know yourself.

Oh yes, of course: different media with the same name. Smart.

This confused me more, lol…
Basically, my question is:
If there is an original at 15 Mbps and an optimized version at 4 Mbps (Generic), and the client (whatever the platform) has its video quality set to 8 Mbps, which one is it going to use?

It should play the 4 Mbps one: the quality setting is a maximum, and Plex will always play the most appropriate file that doesn’t need transcoding.

Exactly. It shouldn’t play the 15 Mbps one, because that goes over the set limit and would require transcoding.
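As a rough mental model (my own sketch of the rule described above, not Plex’s actual code), the selection logic would look something like this:

```python
# Illustrative sketch only, assuming the rule from this thread:
# pick the highest-bitrate version whose bitrate does not exceed the
# client's quality setting; if no version fits, fall back to
# transcoding the original (highest-bitrate) file down to the limit.

def pick_version(version_bitrates_mbps, quality_setting_mbps):
    """Return (bitrate, action) for the version a client would play."""
    fitting = [b for b in version_bitrates_mbps if b <= quality_setting_mbps]
    if fitting:
        # Direct play the best version that fits under the limit.
        return max(fitting), "direct play"
    # Nothing fits: transcode the original down to the set limit.
    return max(version_bitrates_mbps), "transcode"

# The example from this thread: original 15 Mbps, optimized 4 Mbps.
print(pick_version([15, 4], 8))   # setting 8 Mbps  -> (4, 'direct play')
print(pick_version([15, 4], 20))  # setting 20 Mbps -> (15, 'direct play')
print(pick_version([15, 4], 3))   # setting 3 Mbps  -> (15, 'transcode')
```

So at 8 Mbps the 4 Mbps optimized version direct-plays, and only when no version fits under the limit would the server transcode.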

@richarddc79 said:
It should play the 4 Mbps one: the quality setting is a maximum, and Plex will always play the most appropriate file that doesn’t need transcoding.

@KarlDag said:
Exactly. It shouldn’t play the 15 Mbps one, because that goes over the set limit and would require transcoding.

OK, because I was scared that it would skip the 4 Mbps version, take the 15 Mbps one instead, and transcode it down to respect the 8 Mbps limit…