Iterating through the files for download

(imported topic written by cm6591)

If I am not sure what files reside in a particular folder under Uploads on the BES Server, how can I iterate through the files and issue a download command for each one?

I believe iterating through the files and using wildcards are the only two solutions if I don't know the filenames. I'd really appreciate your guidance.

Thanks

(imported comment written by jessewk)

do you know the sha1?

(imported comment written by cm6591)

hi Jesse,

Can you please elaborate on this? I want to go through the contents of a folder one file at a time and issue a download command for each file.

thanks

(imported comment written by jessewk)

Please read this thread: http://forum.bigfix.com/viewtopic.php?id=4598

(imported comment written by cm6591)

Hi Jesse,

Thanks for the thread. I just want to do something simple right now, since I'm only in the evaluation phase and my evaluation license will expire soon. Also, I have all the files in a single folder on the server, as opposed to spread across different sites as described in the thread.

I was thinking about the following solutions…

Write a batch script that outputs the filenames in a folder to stdout and captures those names in a variable, then run download on that variable? Can I do this with action script?

Or

can I construct a relevance expression that will give me the filenames in a folder which I can use with download?

Right now, I just need something quick that I can use to demo the fact that I can download all the files in a folder on a periodic basis.

Thanks

(imported comment written by cm6591)

I now have a relevance expression which is giving me the list of files in a folder:

files whose (name of it contains "events") of folder "C:\program files\BigFix Enterprise\BES Server\wwwrootbes\Uploads\fa7fcef158a3dd393606677ddf8a84e2cff2a193"

My question is: how do I apply the download command to each of the files returned? The above relevance returns three files at once, for example.
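For reference, in the Fixlet Debugger a plural expression like this returns one answer per matching file, for example (the filenames here are made up):

```
q: names of files whose (name of it contains "events") of folder "C:\program files\BigFix Enterprise\BES Server\wwwrootbes\Uploads\fa7fcef158a3dd393606677ddf8a84e2cff2a193"
A: events-1.log
A: events-2.log
A: events-3.log
```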

Thanks

(imported comment written by BenKus)

Hi cm,

As mentioned in other threads, I don’t think you can get this type of system to work in BigFix. When you take an action, the server caches the files at that time. Even if the action reapplies, the cached files will still be used.

Ben

(imported comment written by cm6591)

Thanks Ben. I read in the forums that I can use "download now" for this. Can I?

(imported comment written by BenKus)

Yes… "download now" doesn't use the relays and downloads the files directly, so you can potentially use this method. However, since it doesn't use the relays, you have to be careful about bandwidth usage, server load, etc.
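For example, a sketch of an action that fetches one known file directly (the filename is a placeholder, and you'd want to add your own verification, such as a sha1 check, before using the file):

```
// sketch only: "download now" fetches from the URL at action run time, bypassing relays
download now http://10.6.67.68:52311/Uploads/fa7fcef158a3dd393606677ddf8a84e2cff2a193/events-1.log

// fail the action if the file didn't arrive in the action's __Download folder
continue if {exists file "events-1.log" of folder "__Download"}
```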

Ben

(imported comment written by SystemAdmin)

Hi cm,

I think it would be really helpful for you (and us) if you described exactly what you're trying to do (and why). For instance, why do you think you need to iterate through all of the downloads on the BES server?

-Paul

(imported comment written by cm6591)

Hi,

Basically, I want to achieve the following:

download now http://10.6.67.68:52311/Uploads/fa7fcef … f2a193/.

This is so that I can do a dynamic download, without user intervention, from a particular folder on the server where files will continuously be modified by another task.

Thanks

(imported comment written by SystemAdmin)

Hmm… This one has me thinking of alternate ways…

If the first task is modifying the files, maybe you could have it compress the files into an archive with a unique name, which a second fixlet/task would download to the clients and relays each day.

First action:

  1. Modifies the files, however you were doing that.

  2. Use makecab.exe to create a cab file with all of those files in it, call it “mydata”

  3. Use sha1.exe to calculate the sha1 of the file (cab), and write the sha1 to a text file called “sha1-mydata.txt”

  4. Use makecab.exe to create a second cab that compresses the two files above; call it "download-{concatenation of substrings separated by " " of following text of first ", " of (date (local time zone) of now as string)}"
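To show what that naming expression produces, here it is evaluated against a fixed sample date string (a literal is used so the result is reproducible):

```
q: "download-" & concatenation of substrings separated by " " of following text of first ", " of ("Tue, 24 Mar 2009")
A: download-24Mar2009
```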

Second action:

  1. Download a file called "download-{concatenation of substrings separated by " " of following text of first ", " of (date (local time zone) of now as string)}"

  2. Extract the file's contents ("mydata" and "sha1-mydata.txt")

  3. Compare the sha1 inside "sha1-mydata.txt" against "mydata" (this is an alternate way of validating, since we don't know whether the outer cab downloaded correctly)

  4. If the sha1 is OK, extract the contents of "mydata" and copy it to wherever it's supposed to be on the client.
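A rough action script sketch of the second action (the cab name, paths, and URL are placeholders, and expand.exe is the standard Windows cab extractor; verify the exact syntax in your environment):

```
// 1. fetch the date-stamped outer cab ("download-24Mar2009" is a placeholder;
//    the real name would be built when the action is created each day)
download http://yourserver:52311/Uploads/download-24Mar2009

// 2. extract the two inner files from the action's __Download folder
waithidden cmd.exe /c mkdir "c:\temp\mydata-work"
waithidden expand.exe -F:* "__Download\download-24Mar2009" "c:\temp\mydata-work"

// 3. continue only if the sha1 recorded in sha1-mydata.txt matches the inner cab
continue if {sha1 of file "mydata" of folder "c:\temp\mydata-work" = line 1 of file "sha1-mydata.txt" of folder "c:\temp\mydata-work"}

// 4. extract "mydata" to its final location
waithidden expand.exe -F:* "c:\temp\mydata-work\mydata" "c:\destination"
```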

Paul

(imported comment written by cm6591)

Thanks Paul. I will have to create the above cab file periodically. Should I use "download now" to download the latest cab file? If I use the download command, the cached file will be downloaded, correct? Also, is there a way for me to find out when the cab file was downloaded, so that I can delete it from the server? I want to make sure I always download the latest cab file.

(imported comment written by SystemAdmin)

Well, if the downloaded file (first action, step 4) has the date built into its name, then it will always be unique. Then you'd just use download, rather than download now, and you'd have the ability to use the relays.

Paul

(imported comment written by cm6591)

Paul,

How can I generate a new cab file every 15 minutes in the above scenario, and download it every 15 minutes? The assumption above is one download per day, which in my case might not work, since I want more frequent downloads of the latest cab file. Thanks

(imported comment written by SystemAdmin)

Don't you think every 15 minutes is a bit much? I'd think you'd kill your clients by constantly downloading files to them. I don't think the relays would even get the file fast enough.

If you wanted to do it, say, hourly… change the relevance so the filename contains the hour as well.

Like…

concatenation of substrings separated by " " of preceding text of first ":" of following text of first ", " of (now as string)
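Evaluated against a fixed sample of "now as string" (a literal is used so the result is reproducible), that gives a name stamped down to the hour:

```
q: concatenation of substrings separated by " " of preceding text of first ":" of following text of first ", " of ("Tue, 24 Mar 2009 13:05:12 -0700")
A: 24Mar200913
```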