[Request] Multi-segmented streaming (ideally to speed up remote streaming)

While I have a general understanding of how multi-segmented downloading works, I do not know the full range of specifications, so if anyone would like to chime in and add to this and make corrections, please feel free to do so.

 

How data is typically sent.
As many of you are aware, when you download something from a server, the file is broken down into small chunks known as packets, which are sent over one at a time, typically through TCP, which works via two-way communication between your local machine and the server.

 

To put it in a more real-world analogy: your machine sends a request to the server and says, "I would like this item."  The server replies, "Okay, but this is too heavy to fit on the truck in one piece, so I am going to ship it to you in multiple parts.  To ensure that we do not miss anything, we will send the components one at a time, in order, and once we verify delivery of each component, we'll send the next piece over."  The first package arrives, your local machine signs the delivery confirmation, and then the next package is sent.  If the packages are coming from a warehouse a few cities over, the odds of them arriving quickly are quite high, since they do not have to travel a great distance.  However, if your packages are coming from a few states, or even a few countries, away, and have to make several stops along the way, this can greatly reduce the speed at which they arrive.
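To make the latency cost concrete, here is a rough back-of-the-envelope sketch. It models a simplified stop-and-wait transfer where every chunk costs one round trip on top of the raw wire time; real TCP uses windowing and is more efficient than this, and all of the numbers below are purely illustrative:

```python
# Simplified model: each acknowledged chunk adds one round trip (RTT)
# on top of the raw transmission time. Numbers are illustrative only.

def transfer_time(file_bytes, chunk_bytes, bandwidth_bps, rtt_s):
    """Total time to move a file one acknowledged chunk at a time."""
    chunks = -(-file_bytes // chunk_bytes)          # ceiling division
    send_time = file_bytes * 8 / bandwidth_bps      # raw time on the wire
    return send_time + chunks * rtt_s               # plus one RTT per chunk

# 100 MB file, 64 KB chunks, 100 Mbit/s link
nearby  = transfer_time(100_000_000, 65_536, 100_000_000, 0.005)  # 5 ms RTT
faraway = transfer_time(100_000_000, 65_536, 100_000_000, 0.150)  # 150 ms RTT
print(f"nearby:  {nearby:.1f} s")    # dominated by bandwidth
print(f"faraway: {faraway:.1f} s")   # dominated by round trips
```

Even with identical bandwidth, the distant server takes many times longer in this model, because the round trips pile up; that is the effect multi-segmenting tries to hide.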

 

In this case, we have one single, linear stream like so:

 

[Image: a single, linear download stream (DAlmr0g.jpg)]

 

Multi-Segmentation, and why it's faster.

 

What multi-segmentation does is open multiple connections, allowing several parts of the file to be transferred simultaneously.

 

To use the analogy from earlier, everything works essentially the same, except that the shipper uses more trucks to get the job done quicker.
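Here is a hypothetical sketch of the idea: split the file into byte ranges, fetch the ranges in parallel, and reassemble them in order. The "server" here is just an in-memory bytes object so the example is self-contained; a real client would issue HTTP Range requests (e.g. `Range: bytes=0-24999`) against the same URL instead:

```python
# Sketch of multi-segmented downloading: split into byte ranges,
# fetch ranges concurrently, reassemble in order. The remote file
# and fetch_range() are stand-ins, not a real network client.

from concurrent.futures import ThreadPoolExecutor

REMOTE_FILE = bytes(range(256)) * 1000   # stand-in for the file on the server

def fetch_range(start, end):
    """Pretend HTTP GET with a Range header; returns bytes [start, end)."""
    return REMOTE_FILE[start:end]

def segmented_download(size, segments):
    step = -(-size // segments)                        # ceiling division
    ranges = [(i, min(i + step, size)) for i in range(0, size, step)]
    with ThreadPoolExecutor(max_workers=segments) as pool:
        parts = pool.map(lambda r: fetch_range(*r), ranges)
    return b"".join(parts)                             # reassemble in order

data = segmented_download(len(REMOTE_FILE), segments=8)
assert data == REMOTE_FILE
print(f"downloaded {len(data)} bytes in 8 segments")
```

Because each connection only waits on the acknowledgements for its own range, the round-trip delays overlap instead of stacking up end to end.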

 

There are several programs (many open source) that utilize this, such as BitKinex and lftp (both FTP clients; lftp is open source), and Free Download Manager, which runs over HTTP and is what I used to produce the screenshots. I know FDM sounds like a shady, malware-infested plugin you'd get from CNET or something, but it's actually open source and works amazingly well!

 

This is how the download looks with multi-segmentation.

 

[Image: the same download split across multiple segments (dSd6x5r.jpg)]

 

I would like to propose that the Plex engineers look into implementing this type of technology to help the growing number of users who now host their servers offsite on a VPS or dedicated server.  Many of us are unable to saturate our connections due to the sheer latency involved, and typically have to reduce our quality settings to a fraction of what our connections are capable of.  I am not sure how plausible this sort of request is, but if it is indeed possible, it would be a welcome addition to Plex Media Server.

This is an interesting one.

The problem with this approach is how Plex streams work when transcoding or direct streaming. The stream gets broken up into small segments on the server and then transmitted one at a time, which is why things slow down considerably compared to a stream that is direct playing. When a stream can direct play, it is basically just copying the entire file from the server, which allows for much higher throughput because the download doesn't have to start and stop the whole time.

People who have to move many small files over FTP, for example, will know how painful it is, especially when the latency increases.

I think the ideal solution for this would be to rather have the clients download as many segments as possible and keep them in a local buffer. This feature request is here: https://forums.plex.tv/topic/46169-very-popular-myplex-buffer-content-youtube-style/
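The buffering idea above could be sketched roughly like this: a background thread keeps fetching upcoming segments into a bounded local queue while the player consumes them, so playback doesn't stall between segments. The segment names and `fetch_segment` function here are made up for illustration and don't reflect any actual Plex client internals:

```python
# Hypothetical read-ahead buffer: a worker thread prefetches segments
# into a bounded queue; the player drains the queue. fetch_segment()
# stands in for a real network download of one transcoded segment.

import queue
import threading

def fetch_segment(n):
    """Stand-in for downloading one transcoded segment from the server."""
    return f"segment-{n}".encode()

def prefetch(total_segments, buffer_size=10):
    buf = queue.Queue(maxsize=buffer_size)   # bounded read-ahead buffer
    def worker():
        for n in range(total_segments):
            buf.put(fetch_segment(n))        # blocks when the buffer is full
        buf.put(None)                        # end-of-stream marker
    threading.Thread(target=worker, daemon=True).start()
    return buf

buf = prefetch(total_segments=5)
played = []
while (seg := buf.get()) is not None:        # the "player" consumes segments
    played.append(seg)
print(played)
```

The bounded queue is what makes this a buffer rather than a full pre-download: the worker races ahead only as far as the buffer allows, then blocks until the player catches up.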


Interesting.  I went over there and voted.  I'm up for anything that makes remote streaming HQ content easier.  :D  :D  :D

I'm honestly shocked that Plex does not already keep caching content ahead of playback.  This easily explains why things can get a little wonky even on extremely high-throughput networks. I'd advise everyone to go over there and vote on that issue as well, since it's an incredibly needed (and doable) feature that we need desperately.

2021 clean-up: duplicate