My two cents on your plan to offload the processing to other machines. On your Plex server, configure MCEBuddy to strip the commercials and convert to MP4 Unprocessed using a Post Processing script. You shouldn't have the issue of the ts file being locked, because during Post Processing the ts file is apparently locked by MCEBuddy. Recordings will show up in your library quickly after recording. Even when you get up to 6 recordings at the same time, the server should still be able to handle the post processing, I would think. Your remote machines can then have MCEBuddy configured to monitor your library and convert your videos using the 2-pass MP4 profile. You may have to do MKV processing on the Plex server so the remote machines have a different extension to look for than MP4. You still need to resolve the naming issue for the news items in your original post, but that is more than likely a Gracenote problem. I see the same issue here with filenames of news shows, but I solved it using MCEBuddy and a separate profile that looks at the title of the show and matches it to the news shows I record.
Something you should probably look at too is running more than one conversion at a time. With 24 threads you could easily have MCEBuddy run up to 6 jobs at once and complete many jobs in a good amount of time. Also, if you use Comskip and have the donator version, make sure you told it how many threads (24) your system can support.
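For reference, in the donator build that thread count is a comskip.ini setting; as far as I understand it, the entry looks like the following, but verify against the Comskip documentation:
thread_count=24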
@johnm_ColaSC said:
My two cents on your plan to offload the processing to other machines. On your Plex server, configure MCEBuddy to strip the commercials and convert to MP4 Unprocessed using a Post Processing script. You shouldn't have the issue of the ts file being locked, because during Post Processing the ts file is apparently locked by MCEBuddy. Recordings will show up in your library quickly after recording. Even when you get up to 6 recordings at the same time, the server should still be able to handle the post processing, I would think. Your remote machines can then have MCEBuddy configured to monitor your library and convert your videos using the 2-pass MP4 profile. You may have to do MKV processing on the Plex server so the remote machines have a different extension to look for than MP4. You still need to resolve the naming issue for the news items in your original post, but that is more than likely a Gracenote problem. I see the same issue here with filenames of news shows, but I solved it using MCEBuddy and a separate profile that looks at the title of the show and matches it to the news shows I record.
Ahh, even if the post processing script only properly names and places the new .ts into the shared folder for a given remote CPU, it would be great. This would let the work be divided up more easily, since renaming and moving are really no work, and the filename problem should be nailed. The flow would be (with a rough sketch after the list):
- Initial EPG listing data is used at .ts creation.
- Info is dumped to a unique filename by the post processing script and moved to the shared folder.
- MCEB does its thing, with the option to have MCEB add/update the show info or not.
- Plex indexes the final copy, likely looking up the show by filename.
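A bare-bones sketch of that rename-and-move step in VBS (the share path and the date-tag naming rule here are made-up placeholders, not a real config):
Option Explicit
' Plex hands the post processing script one argument: the path to the new .ts
Dim fso, srcPath, baseName, destFolder, stamp
Set fso = CreateObject("Scripting.FileSystemObject")
srcPath = WScript.Arguments(0)
baseName = fso.GetBaseName(srcPath)            ' e.g. "Show - S01E01 - Title"
destFolder = "\\ENCODER1\dvr-in\"              ' placeholder share for one remote machine
stamp = Year(Now) & Right("0" & Month(Now), 2) & Right("0" & Day(Now), 2)
' Tag the name with the date so repeat airings never collide, and copy
' rather than move so Plex keeps its original file.
fso.CopyFile srcPath, destFolder & baseName & " " & stamp & ".ts"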
I am thinking that if MCEB does a lookup, I would expect to see an occasional mismatch (specific shows) putting bad info into the media file. Right now I will stick with the same data all the way through from the start and deal with it later. When I get a 100% successful day's run of news, I will be able to see what I have as far as multi-airing episodes and how Plex handles everything. It's too bad that the multi-part playback isn't reliable. That might have worked out for some of the weekend news.
As for tonight, I have an entire DVR schedule to make. I wiped everything out and restructured my folders and libraries. I am aiming to be able to redirect up to 100% of the encoding load to other computers if ever needed. And from the evidence before me, I am going with 2-pass encoding. It may not be optimal to produce, but it enables better overall results in the end.
Are there any good sites with scripts to suggest? Windows scripts… no sh here! Maybe VBS? I really don't want to install any more interpreters if I can help it. Oh, sorry, and no Java.
@mavrrick said:
Something you should probably look at too is running more than one conversion at a time. With 24 threads you could easily have MCEBuddy run up to 6 jobs at once and complete many jobs in a good amount of time. Also, if you use Comskip and have the donator version, make sure you told it how many threads (24) your system can support.
I run between 1 and 4 workers. The thing is, beyond 3 or so you don't gain much advantage; after the CPU hits 100% it just adds to the time it takes to complete a given conversion, so the files are held up longer. I also tried running a full episode through MCEB with a RAM drive as the temp folder. It didn't seem to pay off.
I confess, I don't have the donator Comskip yet. I'm embarrassed…
lol
Non-donator versions of Comskip run single threaded, so that could help some. Comskip also doesn't chew up the CPU the way HandBrake does, so it can really benefit from the enabled multithreading. Just remember the bottleneck will then likely move to your disk, so try to keep the work directory on a fast disk.
Couple of thoughts.
With Post Processing, the only thing the script receives is the path to the ts file in the grab folder, so you will see something like: "G:\DVR.grab\f48ac79b356a4096ae90dafe45078201d7f61b5d\Bull (2016) - S01E21 - How to Dodge a Bullet.ts".
Your script will then need to figure out what to do:
- Determine which server to send the file to. This could be a random selection of a server, or you would need some mechanism to know which server is available and not currently processing files. Plex will also probably not like the fact that the ts file was moved, so you may want to copy the file instead, placing the copy in that server's shared folder as you mentioned (a rough sketch follows this list).
- Append the recorded show's filename to a file, or record it somewhere, to track all the files that still need conversion. You could then have a separate program running continuously in the background that tracks the files to convert and which server is free to do the conversion. That program could be a script, or written in a language such as VB since you mentioned VB script.
I don't know if a simple script will satisfy your needs for evenly distributing the workload between multiple servers doing the conversion.
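For what it's worth, the "which server is free" part can be roughly approximated by counting the files already sitting in each encoder's watch share. A minimal VBS sketch, with made-up share names (it ignores jobs MCEBuddy has already pulled into its queue, so it is only a starting point):
Option Explicit
Dim fso, shares, best, bestCount, i, n
Set fso = CreateObject("Scripting.FileSystemObject")
shares = Array("\\ENCODER1\dvr-in", "\\ENCODER2\dvr-in", "\\ENCODER3\dvr-in")
bestCount = -1
For i = 0 To UBound(shares)
    If fso.FolderExists(shares(i)) Then        ' skip machines that are offline
        n = fso.GetFolder(shares(i)).Files.Count
        If bestCount = -1 Or n < bestCount Then
            best = shares(i)
            bestCount = n
        End If
    End If
Next
If bestCount = -1 Then WScript.Quit 1          ' nothing reachable; bail out
WScript.Echo "Least busy encoder: " & best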
There is a script on the forum, being written in PowerShell, that I have a few comments on: http://forums.plex.tv/discussion/239816/windows-power-shell-script-for-postprocessing#latest. It uses separately installed programs to accomplish the same tasks as MCEBuddy.
You could also look at the script discussed in this Reddit article: https://www.reddit.com/r/PleX/comments/52v7vd/guide_commercialfree_experience_with_plex_dvr/. If you go this route, be careful when downloading the script referenced in the article; the site is now a possible malware site. I am actually using this script as part of my Post Processing.
@mavrrick said:
Non-donator versions of Comskip run single threaded, so that could help some. Comskip also doesn't chew up the CPU the way HandBrake does, so it can really benefit from the enabled multithreading. Just remember the bottleneck will then likely move to your disk, so try to keep the work directory on a fast disk.
That's one reason I have 72 GB of RAM: a 20 to 30 GB RAM drive. But it seems that an SSD is enough.
Thanks for the scripting tips. I wasn't expecting to find something out of the box that is exactly what I need, but one or a few scripts that are close is good enough; it will save me time as something to work from. It sounds like it will be from scratch, though, likely in VBS, since that is powerful enough to do anything I could want and I am rather familiar with the code. Did you know the Melissa virus was written in VBS? Lol.
@GroupMaster You may want to review the MCEBuddy website. I believe I saw that someone there did exactly what you are talking about, distributing the work to multiple machines. I am not sure how it will work if you try to do it as part of post processing, though.
@GroupMaster said:
@mavrrick said:
Non-donator versions of Comskip run single threaded, so that could help some. Comskip also doesn't chew up the CPU the way HandBrake does, so it can really benefit from the enabled multithreading. Just remember the bottleneck will then likely move to your disk, so try to keep the work directory on a fast disk.
That's one reason I have 72 GB of RAM: a 20 to 30 GB RAM drive. But it seems that an SSD is enough.
That may not be the case once you get the donator version and enable multithreading on all of your CPUs.
@mavrrick said:
@GroupMaster said:
@mavrrick said:
Non-donator versions of Comskip run single threaded, so that could help some. Comskip also doesn't chew up the CPU the way HandBrake does, so it can really benefit from the enabled multithreading. Just remember the bottleneck will then likely move to your disk, so try to keep the work directory on a fast disk.
That's one reason I have 72 GB of RAM: a 20 to 30 GB RAM drive. But it seems that an SSD is enough.
That may not be the case once you get the donator version and enable multithreading on all of your CPUs.
We will see; when it comes time to tune for speed I will be testing it again. Oh, before everyone makes fun of me: I just got the donator version last night… lol. I also plan on testing hardware acceleration with a ~$150-200 video card to see what happens. I wonder if it is able to utilize 2 GPUs… I have a feeling that GPU acceleration has a way to go before it is worth using.
The usefulness of GPU acceleration is heavily dependent on the rest of your system. In many cases a good GPU can be much faster than the CPU alone on a desktop PC, but that isn't what you are running. That system of yours has about a 25k PassMark score; the number you gave is for one CPU package, not a dual-CPU setup like you have. With that much CPU you probably won't find much of an improvement unless you run an extreme SLI setup. I am not saying it won't help, but it won't help as much as what a lot of people see with regular dual- or quad-core home PCs.
Also, if you are really hung up on wanting two-pass encoding, GPU acceleration will be a step down from that. Personally that wouldn't bother me, but you have been fairly insistent on 2-pass encoding. It is also possible for encoding to slow down when turning on GPU acceleration if the combination of hardware isn't right, so just be aware.
I'm not getting my hopes up for GPU support. It would be nice to be able to expand an existing system that easily, but I doubt it will be just that easy or worth it any time soon. And according to cpubenchmark dot net, the dual X5670 scores 12613 PassMark; the single is 8061. I wish it were 25k…! hooowaaa!
Bah, I see my mistake; cpubenchmark.com kind of shows that info weirdly. GPU acceleration can be great. I have used it for some time with my desktop, but I had a high-end GPU at the time with a decent CPU. HandBrake and MCEBuddy can both do it; the GPU just needs to be powerful enough to improve on what the CPU can already do. Simply put, don't use a cheap GPU and expect much. I also don't know how well it will run if you do several conversions at once.
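If you want a quick way to test GPU encoding outside of MCEBuddy, HandBrake's CLI can do it directly. On Intel QuickSync hardware a test run would look roughly like this (encoder names vary by build and hardware, so check HandBrakeCLI --help first):
HandBrakeCLI -i "input.ts" -o "output.mp4" -e qsv_h264 -q 20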
So, let's break out the old DevGuru reference and make us a script. OK, so what does a variable do again?
My notes
Overall process of V1 script:
Verify incoming filename is good. (done)
Process its new filename and rename it. (done, not renaming yet, waiting)
Check for encoder availability. (done, via MCEB queue length query)
Check encoder preference lists. (not coded yet)
Assign job to encoder. (when it’s ready, this will only be a file move)
Future:
Note: most of the cool ways of handling this require a continuously running script in a loop. I want to try to have it execute only once each time a new recording completes, then exit.
Encoder availability: only currently queued items will be reported, so I also need to look in each encoder's shared folder for a file count. I can't assume enough time has passed since the last file was dropped off for MCEB to have added it to its queue yet.
Assign job to encoder: use [--action=rescan -> Rescan monitor locations and logs] after assigning the job; this also helps with encoder availability.
Encoder preference list: simple s/m/l categories to keep the large recordings off slow computers. Maybe add a feature to lock specific shows to a given encoder.
In the event of an encoder going offline with items in its queue: on each execution of the script, the encoders are checked for operational status, and any offline encoder has its folder checked for content to be reassigned if needed (sketched after these notes).
General code: need to move some hard-coded strings into setup variables. Almost ready to take the training wheels off and stop doing dry runs on the files.
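A sketch of that offline-encoder sweep, assuming the server keeps its own staging copy of each encoder's pending files (all paths here are hypothetical):
Option Explicit
Dim fso, f
Set fso = CreateObject("Scripting.FileSystemObject")
' If an encoder's share has gone unreachable, hand its pending work to another machine.
If Not fso.FolderExists("\\ENCODER2\dvr-in") Then
    For Each f In fso.GetFolder("D:\DVR\staging\ENCODER2").Files   ' assumed server-side copy of its queue
        fso.CopyFile f.Path, "\\ENCODER1\dvr-in\"                  ' reassign the job to another encoder
    Next
    ' Per the note above, follow this with MCEB's --action=rescan so the
    ' reassigned files get picked up promptly.
End If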
I just found this thread or I would have commented sooner.
First, I'd like to ask why you think 2-pass is better. What do you think the advantage is, for Plex purposes, of using 2-pass over a proper CRF encode, which will guarantee you a certain quality for every recording?
How are you handling de-interlacing of the 1080 content? 1080 in the US is almost always interlaced, while 720 is progressive. MCEBuddy doesn't handle this too well.
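For reference, in HandBrakeCLI terms the two approaches look roughly like this (flags from memory, so double-check against --help):
Constant quality, with decombing for interlaced 1080i:
HandBrakeCLI -i "input.ts" -o "output.mp4" -e x264 -q 20 --decomb
Two-pass at a fixed average bitrate:
HandBrakeCLI -i "input.ts" -o "output.mp4" -e x264 -b 4000 --two-pass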
Carlo
I really don't know which is better. It's simple: I like its final product, and it is somewhere to start. If in the future (near or far) I find it to be a waste, or that something else is better, I will move to that. It's kind of funny how many people are dead set against 2-pass. I would rather leave everything as .TS, but many players have issues with it.
More about the script:
forums.plex.tv/discussion/271583/post-processing-script-for-news-using-mceb-on-windows#latest
The script will not operate if the most recent Plex update is installed; the new Plex DVR is apparently making mkv files as opposed to ts files. I will update the script tonight to correct for the change.
Plex has posted a new 1.7.1 update.