You probably should also add the caveat that using GSuite in that manner runs contrary to what Google has laid out in its policies/pricing, so while it may work NOW, there’s no guarantee they won’t just start enforcing those same policies at a later time. The major one is the pricing: if they were to enforce what they have listed on their pricing page, unlimited storage would cost $50 per month, since you would be paying $10 per user and a minimum of 5 users is needed for the unlimited plan. If you only want to pay the $10, you would be restricted to 1TB (it is 1TB per user below 5 users).
Yes, and that has been pointed out in multiple places already.
True, though not everyone checks those other places, so when mentioning one thing in a topic, it is best to also add the counterpoint in the same location.
Sometimes you have to live a little dangerously… anything can happen with any provider. It’s been mentioned several times in various threads, but just remember Bitcasa and the unlimited version of OneDrive. There is no telling whether Amazon, or Google for that matter, might not do the same. Seeing people boast about 300TB uploads on Amazon, I personally think it’s just a matter of time before they enforce restrictions.
All that said I too am deeply disappointed that Amazon is gone from the list, because to me it was the only service where the cost is low enough to consider using the cloud over my local setup. Anything more than $60/year is not worth paying for, in my case.
Definitely, and I’m actually considering using them, but the main thing that sticks out to me with Google’s Business plan is that the way people are paying for / using the service doesn’t match even what Google has laid out in their plans. So while things can change at a moment’s notice with any company, Google is the only one whose own policies say outright that the way people are using it is not correct: going purely by their published pricing, paying $10 per month should not give you unlimited storage.
@paullenhardt said:
If you don’t have 4 or more users G Suite unlimited isn’t unlimited. So it would actually cost $40 a month to get unlimited storage.
Plenty of people with single-user domains have posted that they still get unlimited.
For those with large libraries, consider the fuse/encfs/acd_cli solution. It has been working very well for me. I keep 3 terabytes local and offload to my Amazon drive once in a while, then clean out the local folder; since I have a union between Amazon and my local folder and Plex reads from the union, I can stream from both local and Amazon.
I know this is not quite the same thing as the Plex Cloud solution, but it can be a nice setup for those with large libraries. Otherwise, it is a pity that Amazon Cloud was dropped as a solution for Plex Cloud. Nevertheless, I still think that even with Amazon (or any other solution), if you uploaded terabytes of copyrighted material unencrypted you would be banned.
@jpsobral said:
For those with large libraries, consider the fuse/encfs/acd_cli solution. It has been working very well for me. I keep 3 terabytes local and offload to my Amazon drive once in a while, then clean out the local folder; since I have a union between Amazon and my local folder and Plex reads from the union, I can stream from both local and Amazon.
I know this is not quite the same thing as the Plex Cloud solution, but it can be a nice setup for those with large libraries. Otherwise, it is a pity that Amazon Cloud was dropped as a solution for Plex Cloud. Nevertheless, I still think that even with Amazon (or any other solution), if you uploaded terabytes of copyrighted material unencrypted you would be banned.
If you have the time, can you write a guide on how to do it?
If you have an EDU Google Apps account, it’s unlimited storage. Yes, it’s high risk, but I’m willing to take it and have been uploading my 3TB library over the past few days. No problems yet, and Plex Cloud is working 75% of the time. When it does work it works flawlessly (I can stream a 2-hour movie with zero buffering). It doesn’t seem to do well with anything over 10 Mb/s for me. Also, other apps such as the Xbox One app don’t work too well with Plex Cloud at the moment. It’s called a beta for a reason.
Follow the tutorial above, which explains how to create the encfs key. I use a lazy Automator script that runs at login for my plex user. You could do this more cleanly by running it at boot time, but since I want the user to auto-login and open the Plex client, I run the Automator script at login. You need to edit the file to change the paths and PASSWORD to your encfs password. You could also do this better by reading the password from the Keychain instead of using echo, but once again I did it the lazy way.
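For reference, here is a rough sketch of what such a login script could look like, assuming acd_cli, encfs (with --stdinpass) and unionfs-fuse are installed. Every path, folder name and the PASSWORD value below are placeholders to adapt, and both encfs stores are assumed to share the same key/config created per the tutorial:

```bash
#!/bin/bash
# Lazy login script -- every path and PASSWORD below is a placeholder.
PASSWORD="changeme"                # better: read this from the Keychain instead of hard-coding it

LOCAL_ENC="$HOME/local-enc"        # encfs-encrypted store for local media
LOCAL_PLAIN="$HOME/local-plain"    # decrypted view of LOCAL_ENC (new files get added here)
ACD="$HOME/acd"                    # raw acd_cli mount of Amazon Cloud Drive
ACD_PLAIN="$HOME/acd-plain"        # decrypted view of the encrypted folder on ACD
UNION="$HOME/union"                # merged view that Plex Media Server reads from

mkdir -p "$LOCAL_ENC" "$LOCAL_PLAIN" "$ACD" "$ACD_PLAIN" "$UNION"

# 1) Mount Amazon Cloud Drive.
acdcli mount "$ACD"

# 2) Mount the two encfs views. Both stores are assumed to share the same encfs
#    key/config, so one password unlocks both the local and the cloud copy.
echo "$PASSWORD" | encfs --stdinpass "$LOCAL_ENC" "$LOCAL_PLAIN"
echo "$PASSWORD" | encfs --stdinpass "$ACD/media-enc" "$ACD_PLAIN"

# 3) Union local (read-write) and cloud (read-only); point Plex at $UNION.
#    (The binary may be called unionfs-fuse depending on how it was installed.)
unionfs -o cow "$LOCAL_PLAIN"=RW:"$ACD_PLAIN"=RO "$UNION"
```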
Finally, I upload to Amazon manually when my hard disk is about 70% full (around 3 terabytes), using the following commands:
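Roughly something like this, with placeholder paths and the same hypothetical layout as the mount sketch above; the initial sync is an assumption on my part and just refreshes acd_cli’s view of what is already uploaded:

```bash
# 1) Refresh acd_cli's cache of what is already on Amazon Cloud Drive.
acdcli sync

# 2) Upload the encrypted local store into the matching encrypted folder on ACD.
#    Keeping the folder structure identical on both sides is what lets acd_cli
#    skip files that are already there instead of copying them twice.
acdcli upload "$HOME/local-enc/"* /media-enc/

# 3) After the upload finishes, delete local *files* older than 6 days while
#    keeping the folder structure (and the encfs config file) in place.
find "$HOME/local-enc" -type f ! -name '.encfs6.xml' -mtime +6 -delete
```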
It is important to maintain the same folder structure in the local and Amazon directories so that acd_cli does not copy the same files twice; that is why command 3) deletes all the files but leaves the folders in place. Adjust the mtime value to your preference: when I finish an upload I like to keep at least the previous 6 days of media on my local hard disk, but you can increase or decrease that.
One last piece of advice: have the Plex server read from the unionfs folder, but add new files to the Local folder, because unionfs can be buggy if you write to it. I also have an automated system with a temp folder where files are downloaded and decompressed, after which they are moved to Local (using sabnzbd, sonarr and couchpotato).
Hope this helps! The tutorial is more comprehensive.
@martinbowling said:
$120 a year will get you unlimited Google Drive, either on your own with G Suite unlimited or by joining one of the many groups around that have gotten together.
@jpsobral said:
For those with large libraries, consider the fuse/encfs/acd_cli solution. It has been working very well for me. I keep 3 terabytes local and offload to my Amazon drive once in a while, then clean out the local folder; since I have a union between Amazon and my local folder and Plex reads from the union, I can stream from both local and Amazon.
This is almost exactly what I’m migrating to. Mount the ACD via acdcli and point PMS at the library. The only part I am not considering is encrypting it. For the rest of it, I am pondering implementing local caching by using mergerfs on top of a local cache and ACD. I’ve thought about using lsyncd to trigger a pull-down to local storage (e.g. when I open an episode of a series, I want the whole season copied down to local storage), but that could get quite complex, with a lot of corner cases to account for. Instead, as a first pass, a manual invocation to pull down the content you expect to be consuming in the next weeks/months from ACD to the local cache is probably much easier and much less error-prone.
To make it fully equivalent to Plex Cloud you would also need an AWS micro instance to run PMS on, but it’s up to everyone to decide on their own whether it would be more economical to run their own server on the end of a fast broadband connection at home or pay $100 or so per year for a micro instance to keep it all in the cloud.
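Purely as a sketch, something like this is what I have in mind (all paths are made up, and it assumes acd_cli and mergerfs are installed):

```bash
# All paths are placeholders; Plex Media Server gets pointed at /mnt/media.
mkdir -p /mnt/acd /mnt/cache /mnt/media

# Raw Amazon Cloud Drive mount (the "slow" branch).
acdcli mount /mnt/acd

# mergerfs overlay: new files land in the local cache (first branch), and reads
# fall through to ACD for anything not cached locally.
mergerfs -o defaults,allow_other,use_ino,category.create=ff /mnt/cache:/mnt/acd /mnt/media

# Manual pull-down: copy a season you expect to watch soon into the cache,
# preserving the relative path so it shadows the cloud copy in the merged view.
mkdir -p "/mnt/cache/TV/Some Show/Season 01"
cp -a "/mnt/acd/TV/Some Show/Season 01/." "/mnt/cache/TV/Some Show/Season 01/"
```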
@jpsobral said:
I know this is not quite the same thing as the Plex Cloud solution, but it can be a nice setup for those with large libraries. Otherwise, it is a pity that Amazon Cloud was dropped as a solution for Plex Cloud. Nevertheless, I still think that even with Amazon (or any other solution), if you uploaded terabytes of copyrighted material unencrypted you would be banned.
This is perhaps an issue if your content is largely pirated. But for those of us for whom every last file came from legally purchased DVDs or BRs, or is self-made, I don’t see how that can be an issue. If you own the content and it is for personal use (and unless you publish your access credentials or export it via an unprotected share, it is), I don’t see any legal basis on which hiding it is beneficial. On the other hand, if the T&C breach is going to come from the excessive-use clause, encrypting it isn’t going to make any difference.
Obviously I encrypt all the confidential files that go to my ACD for backup purposes, but there is nothing confidential about what DVDs I own.
I suspect you’ll find the reason they don’t like encryption is because it completely eliminates their ability to use compression and deduplication to reduce storage requirements.
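A quick way to illustrate it (file names and passphrase are made up): identical plaintext files hash the same and can be stored once, but client-side encryption with a random salt turns them into unrelated blobs:

```bash
# Two uploads of the exact same file: same bytes, same hash, so a provider
# doing content-level deduplication only has to keep one copy.
cp movie.mkv movie-copy.mkv
sha256sum movie.mkv movie-copy.mkv          # identical digests

# Encrypt each copy client-side (random salt per run): the ciphertexts differ,
# so the provider sees two unrelated blobs and has to store both in full.
openssl enc -aes-256-cbc -salt -pass pass:secret -in movie.mkv      -out movie-1.enc
openssl enc -aes-256-cbc -salt -pass pass:secret -in movie-copy.mkv -out movie-2.enc
sha256sum movie-1.enc movie-2.enc           # different digests
```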
As for it being illegal to rip your own DVDs, by that logic Plex would also be illegal, because its main feature, automatically indexing media against public data sources, would be useful solely for making copyright infringement convenient on an industrial scale. There is no DRM-free source of digital downloads for this media. You cannot argue one and not the other.
The fact is that in most countries the law allows “fair use”, and it is difficult to argue that format shifting your own media doesn’t constitute fair use.
@sremick said:
A lot of very bright minds on these forums pointed out from the beginning that this was not a sustainable business model and was doomed as something would have to give. The only surprise was how quickly they were proven right.
It’d be nice if these smart people and their insight weren’t so frequently dismissed, ridiculed, insulted, and so on. They have a solid track record.
Well, speaking from experience, an unpopular opinion based on experience and knowledge in the field is often quickly dismissed by people who do not understand how things fit together in the bigger picture, or, to put it simply: they do not comprehend that they do not understand.
There is no hiding that most of the content users store is pirated, and I do understand that Amazon and virtually every other storage provider wants to reduce their storage costs with deduplication checks: why store the exact same movie 100 times? It quickly adds up when you are talking about hundreds of TB, even PBs.
@zan79 said:
I suspect you’ll find the reason they don’t like encryption is because it completely eliminates their ability to use compression and deduplication to reduce storage requirements.
True, I forgot about the deduplication possibilities, my bad.