It may not be clear from some of the info, but the number of copies stored did matter. Cablevision intentionally did not share recordings. Several of the music upload services where you can register what you already own went through the same thing, and almost all of them have streaming rights now. If you store one copy for everyone, it will be considered a public performance.
IANAL, but I’ve worked with people involved in the network DVR work. I can pretty much guarantee that if you reach a large enough size to be visible on the MPAA’s radar, you will be sued and probably lose.
This kind of service only scales if you can store one copy for everyone.
Yea, I think the new breed of programmer needs a double major (programming and law).
Both Google’s and Amazon’s music services allow you to upload your own music. At a fundamental level they both do something similar to what is being discussed here, in that they don’t re-upload files already in their service. However, I’m sure they both have license deals in place, AND they both stream the content rather than acting as a backup service only. Not sure if that distinction matters, but it’s there.
I’d think a service like this really isn’t any different from a CrashPlan (or similar) backup system built specifically for large video backups. Again, not a lawyer, but it would seem safer to host than, say, OneBox, Google Drive, Bitcasa, Box, etc., which allow file sharing and easy pirating. I’d also think it safer than what Plex and Emby themselves are doing with cloud-based broadcasting/distribution of your media to people you have shared with.
As for the difference between one file vs. multiple files, this really doesn’t need to be an issue per se. With a “fingerprint” engine used during the local scanning process, it could identify files already “seeded” and make a copy of those files into your cloud backup library, so you would have your own copy. It would then also upload anything unique that isn’t already seeded. The host backup system would then use operating-system-level de-duping to save storage space at the SAN level. Maybe a bit of “smoke and mirrors”, but each person would technically have their own files stored.
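To make the fingerprinting idea concrete, here’s a rough sketch of what the local scanner side could look like. The hashing and the “seeded fingerprints” set are just stand-ins I made up for illustration, not anything an existing service exposes:

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file contents so identical rips produce identical fingerprints."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def plan_upload(library_root: str, seeded: set):
    """Split local media into files already 'seeded' server-side (just record a
    per-user copy) and unique files that actually need to be uploaded."""
    claim, upload = [], []
    for path in Path(library_root).rglob("*.mkv"):
        fp = fingerprint(path)
        (claim if fp in seeded else upload).append((str(path), fp))
    return claim, upload
```

The backend would still record a logical copy per user and let block-level de-dup on the SAN collapse the physical storage, which is the “smoke and mirrors” part.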
Agreed, it’s a fine line. What you describe about using dedup on the backend is something we assumed people like Cablevision were probably doing; it’s a gray area, but probably where I would start.
The other thing that may or may not be an issue is the DMCA as it applies to DVD/Blu-ray rips. That shouldn’t impact a service directly, but its users are definitely infringing. We all ignore that normally, but it is a subtle distinction between doing movie uploads vs. music uploads.
Ignoring the legal aspects for a moment, my biggest issue with this personally would be the quality of the files. It’s pretty easy to screw up and do poor-quality rips. The problem is that the easiest solution would be for the service to provide the source files, but then you are back to needing streaming rights.
This is all in a very gray area. The content companies want us all to be in UltraViolet-style locker systems, and I do wonder at what point we may lose the option to even buy copies of content if we want to…
As far as payments go, since I realize I didn’t answer that, I think some sort of tiered storage pricing would be ideal. Since we’re assuming we’ll de-dupe the files server-side, I would think it would be more than possible to sell by the movie or something. It might sound crazy at the outset, but I think it could work. If we’re going with an innovative new idea anyway, why not try a new sales model as well?
@vanstinator
I always like tiered pricing, where you have something like $1 per TB for the first 10 TB; $0.90 per TB for TB 11 to 25; $0.80 per TB for TB 26 to 50; $0.75 for each TB over 50,
or something like that. This way it’s based on the amount of storage instead of per movie. Someone with 1080p or 4K content shouldn’t pay the same as someone with SD content.
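Just to show the tier math stays simple, here’s a quick sketch using the break points above (the function name and the exact boundaries are only the example numbers from this post):

```python
def monthly_price(tb_stored: float) -> float:
    """Tiered per-TB pricing: $1.00 for TB 1-10, $0.90 for TB 11-25,
    $0.80 for TB 26-50, $0.75 for every TB over 50."""
    tiers = [(10, 1.00), (15, 0.90), (25, 0.80)]  # (TB in tier, price per TB)
    total, remaining = 0.0, tb_stored
    for size, rate in tiers:
        used = min(remaining, size)
        total += used * rate
        remaining -= used
    return total + remaining * 0.75  # everything above 50 TB

# e.g. 4 TB -> $4.00/month; 30 TB -> 10*1.00 + 15*0.90 + 5*0.80 = $27.50/month
```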
@Plexhilarated
Yeah, but unless we had a license deal with UV or similar (no good for Plex), you couldn’t get content you didn’t already own.
This thread has taken an interesting turn. I love this media backup service idea both as a business idea and as a potential consumer. I’ve only got ~4TB of media, and while I certainly could back it up myself, not having to deal with that would really be awesome.
Are you just thinking movie/tv/music? Because if you’re only storing “new” items the amount of storage you’d actually need isn’t insane. As soon as you include things like “home movies” the storage would balloon pretty quickly.
If only there were another 5 to 10 hours in the day I’d probably put something like this together.
In my mind you would not want to get involved with “home movies” and such. By “talking to” the Plex database you could easily check for things such as an IMDb ID or similar IDs for TV shows and music files. It wouldn’t be perfect, but it would solve a lot of issues.
I’d think you would want to steer clear of personal files such as home movies, porn, or other collections that will not “de-dupe” well, or back them up as a premium service, since you know each file will require its own storage.
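If someone wanted to prototype that filter, python-plexapi can already pull the match info out of an existing library. Everything here (server URL, token, the exact GUID formats) is an assumption on my part and varies by Plex version, so treat it as a sketch:

```python
from plexapi.server import PlexServer

PLEX_URL = "http://localhost:32400"   # assumed local server
PLEX_TOKEN = "YOUR_TOKEN_HERE"        # assumed X-Plex-Token

def backup_candidates(section_name: str = "Movies"):
    """Yield only items Plex has matched to a metadata agent (IMDb-style GUID),
    skipping unmatched personal files like home movies."""
    plex = PlexServer(PLEX_URL, PLEX_TOKEN)
    for item in plex.library.section(section_name).all():
        guid = item.guid or ""
        if not guid or guid.startswith("local://"):
            continue  # unmatched item -> leave it out of the backup set
        for media in item.media:
            for part in media.parts:
                yield item.title, guid, part.file
```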
Changing the subject, what are your thoughts on the new Media Optimizer? It’s basically your script built in, although you still keep a copy of the original floating around.
Funny that you posted this. I was away for the holidays and didn’t even realize Plex had updated the Plex Pass Server. My local servers showed they had the current Plex Pass version when they did not.
I’ve just started to play with it right now and I’m doing my first conversion. So I’ll hold back comments until I’ve played with it a bit.
But right off the bat I’ll say I find it interesting that Plex now gives us the ability to save files “alongside” the media. This was always taboo previously!!!
I’m interested to see how the web page handles hundreds to thousands of conversions.
The first file I “optimized” was down to the built-in 4 Mbps TV preset, and this appears to have worked. My second conversion was again to 4 Mbps TV, but I chose a file that didn’t need to be converted since it already met that target. It’s converting it anyway, so it looks like these optimizations could use some tuning.
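If I end up scripting around this, the obvious workaround is a quick ffprobe pre-check so files that already fit the target never get queued. The 4 Mbps / 1080p thresholds below are just the preset values being discussed, not anything Plex exposes:

```python
import json
import subprocess

def already_optimized(path: str, max_kbps: int = 4000, max_height: int = 1080) -> bool:
    """Return True if the file already fits the target profile, so a re-encode
    (and the quality loss that comes with it) can be skipped."""
    probe = json.loads(subprocess.check_output([
        "ffprobe", "-v", "quiet", "-print_format", "json",
        "-show_format", "-show_streams", path,
    ]))
    video = next(s for s in probe["streams"] if s["codec_type"] == "video")
    bitrate_kbps = int(probe["format"]["bit_rate"]) // 1000
    return (video["codec_name"] == "h264"
            and int(video["height"]) <= max_height
            and bitrate_kbps <= max_kbps)
```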
This is a first release, but I think they did a pretty good job (minus some tuning issues that will pop up) and I’m glad to see this added. I think this will be very popular, and I can see it being used by many people to “cache” the last 10/25/50/100 movies added to their system in multiple formats.
Now on the downside: we have yet another “draw” on CPU use: live transcoding, syncing, optimizing. We could really use Intel Quick Sync (GPU transcoding) or the ability to offload ffmpeg/transcoding to another machine, but that’s another conversation.
Overall, from my quick reading of the features and quick testing, I’d say this is going to be a hit. I’ll comment more later when I’ve had a chance to really put it through its paces.
I’ve done some rather extensive testing within my own environment. Here are some things I’ve noticed with the Optimizing feature…
It will down sample files, sometimes even if they don’t really need it.
You can queue up an entire library and it goes through each file, one at a time, until the entire library is done.
While it’s converting to the optimized settings you have selected, it reports to some clients that it’s transcoding in prep for a sync.
When syncing media that has been optimized, if the sync bit rate is set to one of the optimized versions on the server, it sends that optimized version to the client.
While syncing a file it stops optimizing until the sync is complete.
If you have multiple audio streams it strips out and makes only one, in AAC audio. (Not good if you have a HT system with surround sound, in other words. You lose the extra channels.)
For large libraries this can take some time. I’m about 35% through my main library of almost 1500 movies, and it’s been 3+ days of downsampling to 4 Mbps 720p.
You can have multiple bit rates and resolutions; they are stored in the media’s source folder. (4 Mbps, 1.5 Mbps, and 720 Kbps versions can co-exist on the same server.)
When it’s optimizing a file and a server side event that should take priority happens, the conversion pauses or goes to a lower operating state. (This is GOOD news!)
I’m going to play with this a lot more to see how it works. But this can actually solve a lot of the issues people have with bandwidth or transcoding sessions on low-upload connections or low-powered devices. Queue up a few you plan to watch and come back later to watch them, or set them all to optimize and just let it grind away, assuming you have the HDD space for it.
The one thing I have noticed that I’m not sure I like… It forces the library it is converting to rescan every few minutes, not just when the new media is done, but also mid-conversion. It finds nothing added during these extra scans, but it can sometimes mess with either the conversions or streaming on clients.
I like it so far… Just needs some tweaking and it should be a keeper! (I would love to say I would like to keep this a Pass only feature, but too many people on lower end machines could really make use of it!)
Damn, you’re quick, Mike. I was going to report some of these items, but you got me.
The first item on your list is a “killer”, IMHO. I don’t like this at all. I’ve tested a few 3 Mbps 1080p files with the default 1080p 4 Mbps optimized-for-TV setting, and the files grew without changing any of the settings my conversion script had already applied. So it wasted disk space, gave me a re-encode of something that didn’t need it, and thus downgraded the quality of the file. I haven’t YET tested to see which file (original vs. transcode) gets used if I fire up a movie with a 4 Mbps 1080p profile.
Other than some fine tuning to handle the above type thing I think they have a well thought out winner on their hands! I agree with MikeG6.5 and also hope they keep it Plex Pass as this is going to be a SUPER FEATURE that should help boost revenue and is truly worthy of shelling out a few bucks each month. @elan you listening?
I like the fact that it shows the number of versions of the file on the artwork (i.e. 2 or 3), yet if you set up a filter to look for duplicates, these don’t show up! Very nicely implemented; I was expecting this to be a bust and was pleasantly surprised.
Now, maybe Mike or someone else can tell me how to do something I hope is obvious and I just haven’t figured it out yet:
I went into my Movie Library and sorted by RECENTLY ADDED DATE. I set up a job to transcode the last 5 movies to 4 Mbps 1080p and also 4 Mbps 720p (2 jobs). I more or less like what it’s doing and want to expand this to 50 or 500 of the most recent. How do I edit a job?
I can’t find this anywhere. Any clues?
Carlo
EDIT: I think this feature will really be a great addition:
If/when they add GPU transcoding (obvious reasons)
Allow “dynamic” transcodes to “stick”. So I could set up a dedicated 500-movie job converting my most recently added to 4 Mbps 1080p, another job to convert to 4 Mbps 720p, AND another job which is DYNAMIC. By dynamic I mean a user goes to watch an older movie on my system, such as Top Gun, and the system converts it to the jobs/resolutions I want. I’d set either the max number of movies to convert or give it a dedicated SIZE it can use for “caching” these files. This way the X most used/viewed files will “stick”.
I do the above via scripting and monitoring the system but there is no reason why this couldn’t be built in!
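Conceptually the caching part of my script boils down to something like this: a size-capped folder of pre-converted files where the least recently watched ones get evicted first. The path and the 500 GB cap are just placeholders:

```python
from pathlib import Path

CACHE_DIR = Path("/mnt/cache/optimized")  # placeholder location for cached transcodes
CACHE_LIMIT_BYTES = 500 * 1024 ** 3       # e.g. dedicate 500 GB to "sticky" conversions

def enforce_cache_limit():
    """Delete the least recently accessed cached transcodes until the cap fits."""
    files = sorted(CACHE_DIR.glob("*.mp4"), key=lambda p: p.stat().st_atime)
    total = sum(p.stat().st_size for p in files)
    while files and total > CACHE_LIMIT_BYTES:
        oldest = files.pop(0)
        total -= oldest.stat().st_size
        oldest.unlink()  # evict the least recently watched conversion
```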
What I did was delete the job and make a new one. I only tagged a few for the first few tests. So deleting them wasn’t a major issue. Yes, this did delete the media it had created, which seemed a bad thing, IMO… I should be able to tag files in the job and delete them as wanted/needed.
Then I tagged the entire main library for optimizing… Talk about SLOW GOING… Even the i3 in my NAS is having some issues with it, and then stopping conversions to transcode or sync slows it down even further. Right now, I’m about 50% through the library after 3 and a half days… But 1500 files does take a while to convert, regardless of how it’s done or what CPU is doing it.
This has gotten me seriously thinking about upgrading the CPU from a i3 to an i7… What’s another $200 or so… It would almost double the passmarks of the NAS, but it’s something that’s not been tried with this model before… And might fry the Alaska Contraceptive… I might need to rename the NAS now… Kinda catchy… AKContra…
And I made a custom job for 4 Mbps 720p, as the client apps wouldn’t seem to play the 4 Mbps optimized version; they were looking for a 720p and not a 1080p…
@vanstinator said:
Go into your server settings and hit the Optimize tab. Find your job and hover over it to see the pencil icon.
Dang, could have sworn I did that previously. But yes, that is the ticket!
MikeG6.5, I did the same right from the get-go and created a 4 Mbps 720p job, as I know from experience on my system this was needed. So I’m doing two conversions, both at 4 Mbps (720p and 1080p).
@MikeG6.5 said:
I’ve done some rather extensive testing within my own environment. Here are some things I’ve noticed with the Optimizing feature…
It will down sample files, sometimes even if they don’t really need it.
Are you sure? It should be able to remux or just convert audio as needed. Can you provide the XML for the original and which optimized setting you used?
If you have multiple audio streams it strips out and makes only one, in AAC audio. (Not good if you have a HT system with surround sound, in other words. You lose the extra channels.)
If you have a HT system, you’d probably want to be playing the original file. It’d be odd to have poor video and great audio.
The one thing I have noticed that I’m not sure I like… It forces the library it is converting to rescan every few minutes. Not just when the new media is done, but also mid conversion. It finds nothing added during these extra scans, but it can sometimes mess with either the conversions, or streaming on clients.
Do you have the library set to automatically update? That is an OS level trigger, so I don’t think Plex can stop that. I have that off on my setup and there is no rescanning of the library that I can see.
Ugh, I’m not so sure I’m liking the results. I’ve been comparing the output of the files that Optimize is creating vs. what my scripts are creating, and the quality differences are pretty radical. I hadn’t even checked this before.
But I guess that is to be expected when comparing something that converts files at 0.25x speed vs. 10x+ speed.
I’d much prefer to be able to fine-tune the conversion settings used, like my scripts do. Unless your CPU is pegged or you can’t handle real-time conversion, this new Optimize feature really may not be the ticket yet.
Personally, if I’m going to pre-convert files before they’re needed, I would prefer to generate high-quality versions rather than basically the same thing I can already do in real time.
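For comparison, this is the sort of slower, quality-first encode a custom script can do that the Optimizer doesn’t expose. The preset and CRF values are just my example numbers, not what Plex uses internally:

```python
import subprocess

def quality_encode(src: str, dst: str, crf: int = 18, preset: str = "slow") -> None:
    """Re-encode video with quality-oriented x264 settings and copy the audio
    untouched, so surround tracks survive (an .mkv output is safest for that)."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-map", "0:v:0", "-map", "0:a",  # first video stream plus all audio streams
        "-c:v", "libx264", "-preset", preset, "-crf", str(crf),
        "-c:a", "copy",
        dst,
    ], check=True)
```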
What do you guys think? Have you checked the quality vs a conversion you would do yourself?
The optimizer basically uses the same settings as a transcode. I don’t know if the feature is meant to replace manual re-encoding of files or just to perform the transcode ahead of time. The latter seems more like it.