How do I make a flow compress only 5Gb of files at a time ?
Posted: Wed Jul 24, 2013 8:03 am
by MitchSharpe
I have a flow that takes job folders and their enclosed files/folders and moves them to an archive server at another site; at the final step it zips folders at a certain level.
This has been fine for one use, but I now need to "batch" compress a group of files in a job into a usable size (for example 5 GB) instead of whatever the folder actually holds, which can be 50 GB or higher.
What I'd like is the top-level folder kept, then a "CR2" folder inside that, and then inside that multiple 5 GB zips named name.1, name.2, name.3, name.4 or something like this.
I can then use StuffIt's Archive Manager to look inside the zips to find the required file when needed.
I have tried various places for the Hold job tool, but as the job folder is already made, it won't split it where I thought it would, which was before the Compress. I've also tried doing the hold at various other parts of the Job Dismantle/Job Assemble but have not found the right place, or possibly I'm using the wrong tool?
Any suggestions appreciated !
Mitch
How do I make a flow compress only 5Gb of files at a time ?
Posted: Wed Jul 24, 2013 3:49 pm
by dkelly
Use a script that runs when a new file arrives, enumerates all of the files in a directory, and retrieves their sizes into an array. The script could select the files in a couple of different sort orders: alphabetical, modification date, size, etc. Once a set of files whose combined size is less than the limit (say 50 MB) has been selected, the script moves them to a new directory and calls the compress() function to add them to a ZIP archive.
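The selection step described above can be sketched outside of Switch as a simple greedy loop. This is a minimal illustration in Python, not Switch scripting; the function name and the 50 MB default are assumptions for the example:

```python
import os

def select_batch(directory, max_bytes=50 * 1024 * 1024):
    """Greedily pick files (alphabetical order) whose combined size
    stays at or under max_bytes; returns the selected paths."""
    selected, total = [], 0
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        size = os.path.getsize(path)
        if total + size > max_bytes:
            break  # next file would push the batch over the limit
        selected.append(path)
        total += size
    return selected
```

The same idea carries over to a Switch script element: sort the waiting jobs, accumulate sizes, and stop before the cap is exceeded.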
How do I make a flow compress only 5Gb of files at a time ?
Posted: Wed Jul 24, 2013 4:00 pm
by freddyp
I have not worked it out, but I think it is useful to consider using zip as a command line tool in the "Execute command" element.
If all the subdirectories are smaller than 5GB you could do this:
find subdir_name -name "*.*" -print | zip name_of_zip -@
If that is not the case, zip also has support for taking a text file with the file names to be zipped as input. With a script you could create x number of text files with the paths of files that make up almost 5GB, and then run zip with each of those text files as input.
Creating the text files (and running the zip command line at the same time) is no problem with a script element in Switch using JavaScript. Perhaps the text files could also be created with a shell script, but I would not know how to do that. Once you have the text files it is easy:
zip name_of_zip -@ < name_of_text_file
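The list-file generation described above could be sketched like this. This is a hedged example in Python rather than a Switch script element; the function name, the chunk naming scheme, and the 5 GB cap are assumptions for illustration:

```python
import os

def write_zip_lists(src_dir, out_dir, max_bytes, prefix="chunk"):
    """Partition the files in src_dir (alphabetical order) into text files
    out_dir/chunk1.txt, chunk2.txt, ... each listing paths whose combined
    size is at most max_bytes. Each list can then be fed to zip via -@."""
    chunks, current, total = [], [], 0
    for name in sorted(os.listdir(src_dir)):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue
        size = os.path.getsize(path)
        if current and total + size > max_bytes:
            chunks.append(current)  # close the current chunk
            current, total = [], 0
        current.append(path)
        total += size
    if current:
        chunks.append(current)
    list_files = []
    for n, chunk in enumerate(chunks, 1):
        list_path = os.path.join(out_dir, "%s%d.txt" % (prefix, n))
        with open(list_path, "w") as f:
            f.write("\n".join(chunk) + "\n")
        list_files.append(list_path)
    return list_files
```

Each resulting list file can then be zipped with something like `for f in chunk*.txt; do zip "${f%.txt}" -@ < "$f"; done`.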
Freddy
How do I make a flow compress only 5Gb of files at a time ?
Posted: Thu Jul 25, 2013 12:11 am
by dkelly
// Archive files in 50MB max ZIP archives
// Written by Dwight Kelly <dkelly@apago.com>
// All rights reserved
function timerFired( s : Switch )
{
    if (s.getTimerInterval() != 1800)
        s.setTimerInterval(1800); // run every 30 mins

    var jobs = s.getJobs();
    var i, sizeTotal = 0;
    for (i = 0; i < jobs.length; i++) {
        var f = new File(jobs.getItem(i).getPath());
        // Once the next file would push the batch over 50MB,
        // archive the files collected so far
        if (sizeTotal + f.size > 52428800) { // 50MB
            s.log(1, "archiving " + (sizeTotal/1048576) + "MB in " + i + " files");
            var newJob = s.createNewJob();
            newJob.createPathWithName("archivedFiles", true);
            var j;
            for (j = 0; j < i; j++) { // copy only the files that fit under the cap
                var aJob = jobs.getItem(j);
                s.copy(aJob.getPath(), newJob.getPath());
                aJob.sendToNull(aJob.getPath()); // remove the original from the flow
            }
            var zipFN = newJob.createPathWithName("archivedFiles.zip", false);
            s.compress(newJob.getPath(), zipFN);
            newJob.sendToSingle(zipFN);
            return; // remaining files are picked up on the next timer tick
        }
        sizeTotal += f.size;
    }
}
How do I make a flow compress only 5Gb of files at a time ?
Posted: Thu Aug 29, 2013 8:14 am
by MitchSharpe
I finally worked it out and got it to do what I needed with a Hold job element and its "Max folder size" setting set to X MB.
It seemed to have a lot to do with folder levels and with splitting the original folder into several streams.
Thanks for the input !
How do I make a flow compress only 5Gb of files at a time ?
Posted: Mon Oct 14, 2013 8:00 pm
by neecerp
MitchSharpe: Is there any way you could send me your flow? I have been struggling with how to get one server to pull matching files from another. Is yours doing this or something similar? If so, any help would be greatly appreciated. I am trying to match folder names to pull from one server and match them up on another for archiving.
thank you.
How do I make a flow compress only 5Gb of files at a time ?
Posted: Tue Oct 15, 2013 9:27 am
by freddyp
Mapping input folder structures to output folder structures is something that can be done with Submit hierarchy and Archive hierarchy (and the correct folder level settings). I suggest you start a new post and describe what you want to achieve using an example, and then the members can see how your problem is best solved.
Freddy