I found something really strange. I have many processes in my environment where I use 7-Zip to zip up files, then use upload manager to upload the zips. Early on, my zip commands had issues, so I added some logging by putting " >> log.txt" at the end of the line, which gave me logging on the zip command itself. Something else it appears to have done is put the log file itself into the zip, along with the custom site folder. If I remove the logging piece, I end up zipping only the file I want. With logging enabled, I am zipping two files and a folder, even when I am calling out a single XML file to be zipped. Can anyone explain this?
My examples are below. The FilesToZip parameter is just a single XML file, and its name is nowhere close to that of the log.txt file that is being zipped as well.
This one works as advertised.
wait "{parameter "7ZipPath"}\7za.exe" a -tzip "{parameter "OutputFileName"}" "{parameter "FilesToZip"}"
This one zips up the normal file, the log file, and the (empty) custom site folder.
wait "{parameter "7ZipPath"}\7za.exe" a -tzip "{parameter "OutputFileName"}" "{parameter "FilesToZip"}" >> {parameter "command_debug_log"}
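One candidate mechanism (a sketch only, not specific to how the automation tool's wait command actually works): if the launcher does not run the line through a shell, ">>" is not interpreted as redirection and instead arrives at 7za.exe as two extra arguments, ">>" and the log path, which 7-Zip would then treat as additional items to add to the archive. The script below simulates both cases with a stand-in program (argdump.sh, a hypothetical name) that just echoes its arguments:

```shell
#!/bin/sh
# Stand-in for 7za.exe: print each argument the child process receives.
cat > argdump.sh <<'EOF'
#!/bin/sh
for a in "$@"; do printf 'arg: %s\n' "$a"; done
EOF
chmod +x argdump.sh

# Case 1: a real shell interprets ">>" as append-redirection,
# so the child never sees it and the output lands in log.txt.
./argdump.sh a -tzip out.zip file.xml >> log.txt

# Case 2: the launcher passes the tokens through verbatim (no shell),
# so the child receives ">>" and "log.txt" as two extra arguments --
# to 7-Zip those would look like more files to add.
./argdump.sh a -tzip out.zip file.xml '>>' log.txt
```

Case 2 prints lines including "arg: >>" and "arg: log.txt", which would match the symptom of log.txt ending up inside the archive; it does not explain the extra folder, so treat this only as a reproduction idea.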
What this means to me is that I tried to make the system better by adding logging in case we had issues, but instead I may have added very large log files to the uploads, slowing things down. Has anyone else heard of this, or can you duplicate it? Thank you.
Update: This does not seem to behave the same way when it runs from a batch file created with the "creatfile until end" piece. It only appears to do this from actionscript, compared to other scripts that run in batch.