G Suite mount on QNAP or Synology

ok, so what's the path to the share? It should probably be something like /volume1/share/

depends on what shares you have set up

I just uploaded it to /volume1/docker/

ok, so you should be able to move it with

mv /volume1/docker/nas-mountOnStartup.sh /root/scripts/

if nas-mountOnStartup.sh is what you called it
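You will also need to make sure the script is executable (assuming you kept that filename), so something like:

chmod a+x /root/scripts/nas-mountOnStartup.sh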

I went through and added it to the Task Scheduler and enabled it. I think I'm good to go. Thank you again!

Smashing, fingers crossed it works.

As for your question about automation: I don't really have that much automation going on, as I have a slot on a shared seedbox for downloads, but that does not have FUSE, so I cannot mount anything on it.

I run Sonarr and Radarr to grab stuff on there, and then just use rclone to transfer the files from there to my Google Drive.

How do you set up rclone to transfer the files over to Google Drive?

Sonarr and Radarr rename, organise into folders, and move the downloaded files to a directory on my seedbox, e.g.
/directory/path/media/

Then in Google Drive all my media is stored in a folder called plex.

Then with a command like this:

rclone copy -v --transfers=5 --checkers=5 -P /directory/path/media/ gDrive:plex

That copies everything inside /directory/path/media/ into the plex folder on my Google Drive.

-v and -P just display more info while transferring, and --transfers=5 is the number of files to transfer at the same time.
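In case it helps anyone reading later: the gDrive: remote itself is set up beforehand with rclone config. Roughly, assuming you name the remote gDrive like I did:

# Interactive setup: choose n (new remote), name it gDrive,
# pick the drive (Google Drive) backend, accept the defaults
# and authorise in the browser when prompted
rclone config

# Then check rclone can see your drive
rclone lsd gDrive: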

You have been a huge help. I went in and created this.

rclone copy -v --transfers=5 --checkers=5 -P /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies

I created a script based on the other script you sent to automate it, followed the steps to make it executable, and added it to the Task Scheduler, but the only way I could get it to work was by manually copying and pasting that command into the terminal.

Any thoughts on what I did wrong?

This is what my .sh file looks like

#!/bin/bash
#!/usr/bin/rclone
# Make script executable with: chmod a+x /root/scripts/nas-rclonecopyOnStartup.sh

rclone copy -v --transfers=5 --checkers=5 -P /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies

exit

I think you would need to add the full path to rclone, so put

/usr/bin/rclone instead of just rclone

and if you are running it from a script I would remove the -v and -P flags, as you will not be able to see their output from a script anyway.

I would also probably add these lines so you don't run more than one instance of the script at a time:

if pidof -o %PPID -x "nameofscript.sh"; then
    exit 1
fi

Put this bit of code on the lines before the copy command, and change nameofscript.sh to the name of your script.

This will mean the script will exit if there is already one running.
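If pidof ever gives you trouble on the nas, flock does the same job where it's available. Just a sketch, and the lock file path here is arbitrary:

# Take an exclusive lock on file descriptor 200;
# exit if another run already holds it
exec 200>/var/run/nas-rclonecopy.lock
flock -n 200 || exit 1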

On second thoughts, it might be useful to have a log file so you can see what happened.

So I would add this to your copy command
(change the path/location to somewhere on your nas that is accessible from your mac, so you can read the logfile without having to use the terminal):

--log-file= /volume1/sharename/rclone-upload.log

So by my reckoning the whole script would be something like this:
(I have added the -v flag back, but left out the -P flag)

#!/bin/bash
#!/usr/bin/rclone
# Make script executable with: chmod a+x /root/scripts/nas-rclonecopyOnStartup.sh

if pidof -o %PPID -x "nas-rclonecopyOnStartup.sh"; then
    exit 1
fi

/usr/bin/rclone copy -v --transfers=5 --checkers=5 --log-file= /volume1/sharename/rclone-upload.log /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies

exit

And in the Task Scheduler on the nas (if you have not already) I would set this one up as a scheduled task so it runs on a schedule: daily, weekly, etc.
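For the task itself, a user-defined script that just calls the file should do. Assuming you kept the path from the comment in the script:

bash /root/scripts/nas-rclonecopyOnStartup.sh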

Should I set it to run every hour in Task Scheduler? I want it to copy as soon as possible.

Also, is it better to use moveto instead of copy? I would prefer not to have to go in and manually delete the files from the Movies_GDrive folder after each movie is copied to my Google Drive.

Well I guess that depends on what your upload speeds are :wink:

We don’t have great upload speeds here, so I would never be able to get a 4K movie uploaded in an hour :wink:

(Also, I think there may be a Google Drive limit of around 25MB/s to 30MB/s per single file upload; that's why it's better to do a few at once. That's what the transfers setting is for.)

With that extra bit of code in the script, it should prevent the script from running if it already is.

And we don't want two instances of rclone trying to upload the same file: if one instance was part-way through uploading a big file and you started another, it would probably pick up the same file and get a bit confused.

I use move for some of my stuff; I'm not sure what the difference is between move and moveto.
So yes, you could use move or moveto, but I would wait until you are happy it's all working properly before changing to that, so at least for the time being you still have your local copies if something goes wrong.
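From the rclone docs, roughly: move transfers the contents of the source directory into the destination, whereas moveto can also rename a single file or directory as it goes. For example (film.mkv is just a made-up name):

# move: everything inside Movies_GDrive ends up inside 4kMovies
rclone move /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies

# moveto: move one file and give it a new name on the remote
rclone moveto /volume1/Downloads/completed/Movies_GDrive/film.mkv Gdrive:4kMovies/film-renamed.mkv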

Ha, and I don't know how big the collection of movies you are putting onto your gdrive is, but you know there is an upload limit of about 750GB a day :wink:
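If that cap ever becomes a problem, rclone has flags that should let a run stop cleanly before hitting it; 740G here is just a margin I picked:

# Stop once ~740GB has been transferred in this run, and treat
# Google's upload-limit errors as fatal rather than retrying
rclone move --max-transfer 740G --drive-stop-on-upload-limit /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies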

Haha, it's not that big YET. I did notice that the movies I used rclone to copy to Google Drive aren't showing up in Plex. Do I have to Scan Library Files in Plex each time? Plex adds my other movie folders automatically.

Yep, afaik you would need to rescan in Plex, as I don't think the inotify stuff that detects changes works with remote filesystems.

You could turn on ‘Scan my library periodically’, but just be aware that if the mount is not up, then it will mark all the content as deleted.
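Alternatively, you could tack a rescan onto the end of the upload script using Plex's web API; the IP, library section id and token below are placeholders you would need to fill in:

# Ask Plex to rescan just the movies library (section id 1 here)
curl "http://YOUR-PLEX-IP:32400/library/sections/1/refresh?X-Plex-Token=YOUR_TOKEN"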

Alright, one last thing: when I run the script I get the error message "Command move needs 2 arguments maximum".

Here is my code that I put in the script:

#!/bin/bash
#!/usr/bin/rclone
# Make script executable with: chmod a+x /root/scripts/nas-rclonecopyOnStartup.sh

if pidof -o %PPID -x "nas-rclonecopyOnStartup.sh"; then
    exit 1
fi

rclone move -v --transfers=5 --checkers=5 --log-file= /volume1/Downloads/rclone-upload.log /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies

exit

Ok, a couple of things to try:

1: try removing the space after --log-file=
2: escape the commands as we had to do before
3: move the log stuff to the end

so try this one:

/usr/bin/rclone move \-v \--transfers=5 \--checkers=5 /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies \--log-file=/volume1/Downloads/rclone-upload.log

If that does not work, try it without the log-file switch:

/usr/bin/rclone move \-v \--transfers=5 \--checkers=5 /volume1/Downloads/completed/Movies_GDrive Gdrive:4kMovies

That was it! Thank you so much, you have been a huge help! You should seriously think about creating a step-by-step guide for this, especially for the Synology, since with most of the stuff out there you have to piece things together yourself to make it work on a Synology. I know a lot of people would appreciate finding it all in one location.

I second that! I would love to mount my g-drive folders on my Synology, but after scrolling through this thread with so many posts on command lines that need to be entered, I dare not try it yet.

If you would be willing to write a step-by-step guide for the Synology, I would definitely give it a try!

Yep, sorry, the thread did get a bit longer than I had originally anticipated.

Mainly to do with the strange thing of having to escape the command line flags, which I never had to do when I set it up.

I will see if I can write up a more concise guide, though you may have to wait till later in the week.
