@mbartosh, as I mentioned earlier in the thread, we run a daily script against our Upload Manager folder structure to do several things. First, it moves files out into their respective shares by data type, based on naming convention. Using a regex against the uploaded file naming convention is very helpful, since we have several different types of data uploaded daily, all co-mingled in the Upload Manager structure.
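As a minimal illustration (the applog_<computername>_<YYYYMMDD>.log convention and the pattern here are hypothetical; substitute your own), a relevance clause along these lines would pick out one data type from the upload folders:

files whose (exists matches (regex "^applog_.+_[0-9]{8}\.log$") of name of it) of folders of folders of folders "e:\BigFix Enterprise\BES Server\UploadManagerData\BufferDir\sha1"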
The second part of the script clears the structure so it is ready for tomorrow's data. We have it timed so that our uploads take place overnight and the scripts run first thing each morning. Once the respective data has been transferred to its alternative share, it is consumed by other sources.
Helpful hints: start with a very structured naming convention for all uploaded data. Then build a script on the fly using relevance substitution with the regex of your convention; this can be batch, PowerShell, etc. Have a daily policy action on your root server dynamically recreate that script and then execute it. Rebuilding it daily ensures that endpoints that have been added or removed, or whose upload requirements have changed, are all accounted for. Be mindful of embedded quotes, carriage returns, etc. when building your script; a rough skeleton of the policy action follows.
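Something like this, where the paths are hypothetical and the relevance-substituted appendfile lines (like the example further down) go in the middle:

// rebuild the batch file from scratch on each run
delete __appendfile
appendfile @echo off
// ...relevance-substituted xcopy/cleanup lines are appended here...
delete e:\scripts\process_uploads.bat
move __appendfile e:\scripts\process_uploads.bat
waithidden cmd.exe /C e:\scripts\process_uploads.bat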
We found it useful to include a file containing the endpoint's computer name in the upload archive. This makes it easy to parse out at the root which archive belongs to which endpoint. Our endpoints have a very structured naming convention as well, which we can also regex to lift out whichever endpoints we need by attributes.
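On the endpoint side, a sketch of writing that marker file might look like the following, where c:\upload is a stand-in for whatever folder your upload process actually collects (the file shows up on the server with a prefix, as in the Name_0_computername.txt used below):

// write the computer name into a marker file that rides along with the upload
delete __appendfile
appendfile {computer name}
delete c:\upload\computername.txt
move __appendfile c:\upload\computername.txt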
Here is an example line:
appendfile {("xcopy %22" & item 0 of it & "\*applog_*.log%22 %22e:\applog\" & item 1 of it & "%22 /E /C /I /Y%0d%0a") of ((pathnames of it, lines 1 of files "Name_0_computername.txt" of it as string) of folders of folders of it) of folders "e:\BigFix Enterprise\BES Server\UploadManagerData\BufferDir\sha1" whose (exists file "name_0_computername.txt" of folders of folders of it)}
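To unpack that a bit: the outer folders clause anchors at the sha1 directory (with the whose guard skipping the run when no marker files exist yet), folders of folders of it walks down to each endpoint's upload folder, the tuple pairs that folder's pathname with the computer name read from the marker file, and each pair emits one xcopy line, using %22 for the embedded quotes and %0d%0a for the carriage return/line feed the generated batch file needs.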
By repeating similar lines and changing out the source and destination criteria, you can cover each data type. Similar lines can also be used to clean up afterwards.
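For example, a hypothetical cleanup line in the same style (again, adjust the pattern and paths to your own convention) could append del commands for the files that were just copied out:

appendfile {("del /Q %22" & pathname of it & "\*applog_*.log%22%0d%0a") of folders of folders of folders "e:\BigFix Enterprise\BES Server\UploadManagerData\BufferDir\sha1"}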
Hopefully this gets you started. It's a bit tricky to get set up correctly, but once it is in place it works like a champ. We've been using this approach successfully for several years, handling up to several GB of data daily.