I would like to copy (or display the contents of) several specific log files from a known location on a client PC to my BES Console. The file names are always the same and the files are always in the same location.
Do you have Windows privileges on the computers? If so, you can potentially use the customized "right-click" menu to copy the files. If you don't have Windows privileges, you will want to have the BES Agent upload the files, but the files will end up on the server and you will have to copy them to the console yourself.
Here is an example action to upload the contents of the folder “C:\test” to the server:
// upload results example script
// Set the maximum archive size to 4 MB (4194304 bytes) to prevent too much data
setting "_BESClient_ArchiveManager_MaxArchiveSize"="4194304" on "{parameter "action issue date" of action}" for client
// Set the agent to send the archive only once (rather than resending periodically)
setting "_BESClient_ArchiveManager_OperatingMode"="2" on "{parameter "action issue date" of action}" for client
// Tell the agent which files to collect
setting "_BESClient_ArchiveManager_FileSet-test"="C:\test\*.*" on "{parameter "action issue date" of action}" for client
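One note on the example above: with OperatingMode set to 2, the agent only builds and sends the archive when explicitly told to, so if the upload does not fire on its own, appending an "archive now" command to the end of the same action should trigger it (this line is an addition to the example, not part of the original):

// Trigger the archive build and upload immediately (required in operating mode 2)
archive now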
We use the Upload Manager to collect system logs nightly from over 1,200 remote sites over 56k frame relay. We pull back around 2.5 GB and the network utilization goes unnoticed; the Upload Manager is definitely a great asset for remote collection. The MaxArchiveSize setting bit us once, though, so make sure you know the size of the files you're collecting!
You could also write an analysis that uses a "file contains" check to return true or false, for example if you're trying to parse an install log to verify whether something worked for a one-time task.
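As a sketch, a property in such an analysis could use relevance like the following to report True or False without uploading the file at all (the log path and search string here are hypothetical):

exists lines containing "Installation succeeded" of file "C:\logs\install.log"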
When you say "maxarchive bit us once", are you referring to the MaxArchiveSize limit set at the endpoint, or to some limit reached at your relay or server?
Is there a 2.5 GB limit at the endpoint, or a limit at the relay or at the server?
I was referring to the MaxArchiveSize setting that is applied at the client Endpoint when you are creating your Upload job.
setting "_BESClient_ArchiveManager_MaxArchiveSize"="45000000" on "{now}" for client
When we started our automated process, the logs that we were harvesting were small and stayed within the default limit. It turned out that over time the logging methodology changed, producing much more verbose logs and thus larger flat files to be uploaded. We noticed failures and discovered the default setting. The setting above is in bytes and controls the size of the files that you can pull back. This is important to us because we have constrained bandwidth to our endpoints and do not want to pull back grossly large files that would both suffer from and cause network latency.
First, I think that you need a full path, but also, I don’t think you will find many .jpg files in your actionsite (and if you do, they will disappear the next time your agent gathers).
There is a .jpg in the actionsite folder because an app that I am executing put it there. Here is what the index.txt says after the upload. The .jpg did not upload, though, and I verified it was in the __BESData\actionsite folder.
The actionsite folder will be cleared every time a new action is issued, and that can complicate things. I recommend you use a different path and provide the full path in your setting.
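For example, an action could stage the files in a stable location of its own before the upload runs, so later actions or gathers cannot clear them (the folder and file names here are hypothetical):

// Stage files outside __Download and the actionsite so they survive the next gather
folder create "C:\uploadstage"
copy "__Download\test\output.jpg" "C:\uploadstage\output.jpg"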
So then I shouldn't leave my working directory as __Download\test (the default location where the download was uncompressed), because it will go away?