I have been reading some posts and the wiki, but I couldn't find an exact answer.
I have some scripts to run on certain servers; that part I understand. But the scripts need some external files that are updated every day by an external team. The clients can't download those files directly, so from what I can see I need to use the prefetch command, which caches those files on the IEM servers and then sends them to the clients.
I found the example above and tested it, and it works: it downloads files from external sources to the clients. But my problem is:
Since the file changes every day and I have no control over that, how can I get the SHA1 hash? It seems the prefetch command requires the sha1 parameter, right? The same goes for the size parameter. Is there a way to make prefetch skip those checks and simply download the indicated package to the clients?
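As an illustration of where the sha1 and size values come from (this is not part of the thread, and the file name and URL below are placeholders): once you have a current copy of the file, both parameters can be computed with a short script and dropped into the prefetch statement.

```python
import hashlib
import os

def prefetch_line(path, url, name):
    """Build a BigFix-style prefetch statement from a local copy of the file.

    The sha1 and size fields are computed from the file on disk; the URL is
    whatever location the server will fetch the payload from.
    """
    sha1 = hashlib.sha1()
    with open(path, "rb") as f:
        # Hash in chunks so large payloads don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            sha1.update(chunk)
    size = os.path.getsize(path)
    return f"prefetch {name} sha1:{sha1.hexdigest()} size:{size} {url}"

# Hypothetical example file:
with open("payload.dat", "wb") as f:
    f.write(b"example contents\n")
print(prefetch_line("payload.dat", "http://example.com/payload.dat", "payload.dat"))
```

Because the hash must match the bytes the server downloads, a script like this has to run against each day's new file, which is what leads to the REST API suggestion later in this thread.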
I want all of this because I need to create a task that runs a shell script on a set of servers, but the script uses some files that are updated every day at a source link I have.
You can use the plain “download” command ( https://developer.bigfix.com/action-script/reference/download/download.html ), which does not require a hash but still uses the caching mechanism. Note that “download now” might be needed, because once the server has downloaded the file it may think it already has the download for the action.
Hi, many BES Servers do not allow you to download files from arbitrary external sites (for security reasons).
You could consider an alternative solution: for example, create a secure script that performs the downloads and then stores the downloaded files on your relay server using the SHA naming convention. Your endpoints can then prefetch them normally.
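A minimal sketch of that sha-naming step (the paths here are illustrative, and the exact relay cache layout should be checked against your own environment): after downloading, copy the file into the cache directory under its own SHA1 hex digest, which is the name a prefetch statement can then reference.

```python
import hashlib
import os
import shutil

def store_by_sha1(src_path, cache_dir):
    """Copy a downloaded file into cache_dir, named by its SHA1 hex digest.

    Returns the destination path, so the caller can log or publish the hash
    that the matching prefetch statement will use.
    """
    sha1 = hashlib.sha1()
    with open(src_path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha1.update(chunk)
    os.makedirs(cache_dir, exist_ok=True)
    dest = os.path.join(cache_dir, sha1.hexdigest())
    shutil.copyfile(src_path, dest)
    return dest
```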
Thanks.
Hi finsel,
Yes, I see that. Since I don’t manage every aspect of the IEM server, I asked our IEM team to look into granting access to the URL, but I’m also thinking about another solution, as I explain here:
One of our client servers has internet access through a proxy, so I’m thinking of setting up a script there to download the required files on a regular basis, then using IEM to collect those files and redistribute them to the other clients.
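The download-through-a-proxy part of that idea can be sketched with the Python standard library (the proxy address, source URL, and destination path below are all placeholders for whatever your environment uses):

```python
import urllib.request

def make_opener(proxy_url):
    """Build a urllib opener that routes http/https traffic through a proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

def fetch(url, dest_path, proxy_url):
    """Download url to dest_path through the given proxy."""
    opener = make_opener(proxy_url)
    with opener.open(url) as resp, open(dest_path, "wb") as out:
        out.write(resp.read())

# Example (hypothetical proxy and source):
# fetch("http://vendor.example.com/daily.dat", "/var/stage/daily.dat",
#       "http://corp-proxy.example.com:8080")
```

Run on a schedule (cron or a BigFix policy action), this would keep a fresh local copy for IEM to collect and redistribute.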
You could use the REST API to automatically generate a fixlet to do what you need whenever the files change. It would dynamically update the prefetch statement to be correct for the new files.
This would allow the files to flow through the relay infrastructure using the hash values for proper verification.
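The flow described above can be sketched as follows. Everything here is an assumption to be checked against your server: the XML is a simplified stand-in for the full BES Fixlet schema, and the endpoint path and port (`/api/fixlets/custom/{site}` on 52311) follow the usual BigFix REST API defaults but should be verified in your environment.

```python
import base64
import urllib.request

def build_fixlet_xml(title, prefetch_line, script_body):
    """Assemble a minimal BES-style Fixlet whose action starts with the
    freshly computed prefetch statement (simplified sketch, not full schema)."""
    action = f"{prefetch_line}\n{script_body}"
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<BES xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Fixlet>
    <Title>{title}</Title>
    <Relevance>true</Relevance>
    <DefaultAction ID="Action1">
      <ActionScript MIMEType="application/x-Fixlet-Windows-Shell">{action}</ActionScript>
    </DefaultAction>
  </Fixlet>
</BES>"""

def upload_fixlet(server, site, user, password, xml_body):
    """POST the Fixlet XML to the REST API using HTTP basic auth."""
    url = f"https://{server}:52311/api/fixlets/custom/{site}"
    req = urllib.request.Request(url, data=xml_body.encode(), method="POST")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return urllib.request.urlopen(req)
```

A scheduled job would recompute sha1/size for the day's file, call `build_fixlet_xml` with the new prefetch line, and upload the result, so the hash in the action always matches what the relays distribute.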
As a fun aside, with the “download now as” command, you can then do relevance substitution within the URL if desired. That opens up all kinds of possibilities with internal corporate web apps and ERP systems.
download now as triggers the client to perform the download directly. These download requests do not use the root server or relay hierarchy and are not cached.
This runs like a charm; unfortunately, the proxy user and password appear in cleartext in the logfile.
Is there any way to use the defined credentials, maybe as a parameter or something like that?
User and password will still appear in the action and fixlet .FXF files on the client.
If you want to remove those as well, check out the Secure Parameters technique described at https://bigfix.me/fixlet/details/3678 It’s trickier to implement, since you have to work with JavaScript inside the Fixlet description, but the security is worth it in the long run.