BES Download Cacher

For the “Updates for Windows Applications Extended” external site, I’m seeing frequent broken fixlets because of how quickly vendors provide updates. Prior to patching I will typically pre-cache using the wizard. If I forget, a broken fixlet will hold up a baseline: it just sits at “pending download” until the end of the action and then reports that the action was not relevant for that system. I want to automate pre-caching of all the fixlets in this site. Is this the correct approach…

  • relevance to return all download URLs for all fixlets in the site:
<?relevance
unique values of parenthesized parts 2 of matches
  (case insensitive regex "^\s*(download|prefetch|download now|download now as|add prefetch item)\s+.*((((ht|f)tp[s]?)://([[:alpha:]\-\.]+(:[0-9]+)?))[[:space:]/]([^[:space:]]+))")
  of scripts of actions whose (script type of it = "application/x-Fixlet-Windows-Shell")
  of fixlets of bes sites whose (name of it = "Updates for Windows Applications Extended")
?>
  • then run the BESDownloadCacher utility and have it loop through all of the URLs returned by the relevance query (see the sketch after this list):

BESDownloadCacher.exe -u https://www.vendor1.com/download/Setup_app1.exe -x c:\PATH-TO-BIGFIX-SHA1-FOLDER

BESDownloadCacher.exe -u https://www.vendor2.com/download/Setup_app2.exe -x c:\PATH-TO-BIGFIX-SHA1-FOLDER

BESDownloadCacher.exe -u https://www.vendor3.com/download/Setup_app3.exe -x c:\PATH-TO-BIGFIX-SHA1-FOLDER
etc…
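If you want to script both steps, a rough sketch along these lines could work: it pulls the URLs through the root server’s REST API (/api/query) using the session relevance above, then calls BESDownloadCacher.exe for each one. The server name, credentials, cacher path, and sha1 cache folder below are placeholders to adjust for your environment, and verify should point at your server’s certificate rather than being disabled:

# Rough sketch: pull download URLs via the BigFix REST API, then pre-cache
# each one with BESDownloadCacher.exe. Server, credentials, and paths are
# placeholders -- adjust for your environment.
import subprocess
import xml.etree.ElementTree as ET

import requests

SERVER = "https://bigfixroot.example.com:52311"   # hypothetical root server
CACHE_DIR = r"C:\Program Files (x86)\BigFix Enterprise\BES Server\wwwrootbes\bfmirror\downloads\sha1"
CACHER = r"C:\BESDownloadCacher\BESDownloadCacher.exe"

# Same session relevance as above, collapsed to one line for the query string.
RELEVANCE = (
    'unique values of parenthesized parts 2 of matches '
    '(case insensitive regex "^\\s*(download|prefetch|download now|download now as|add prefetch item)'
    '\\s+.*((((ht|f)tp[s]?)://([[:alpha:]\\-\\.]+(:[0-9]+)?))[[:space:]/]([^[:space:]]+))") '
    'of scripts of actions whose (script type of it = "application/x-Fixlet-Windows-Shell") '
    'of fixlets of bes sites whose (name of it = "Updates for Windows Applications Extended")'
)

def fetch_urls():
    # /api/query returns one <Answer> element per relevance result
    resp = requests.get(
        f"{SERVER}/api/query",
        params={"relevance": RELEVANCE},
        auth=("api_user", "api_password"),   # placeholder credentials
        verify=False,                        # better: path to the server CA cert
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    return [a.text for a in root.iter("Answer") if a.text]

def main():
    for url in fetch_urls():
        print(f"Caching {url}")
        subprocess.run([CACHER, "-u", url, "-x", CACHE_DIR], check=False)

if __name__ == "__main__":
    main()

If -x points at the server’s existing sha1 download folder, the files should already be in place when the baseline action runs.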

Or is there a better approach?

Hi @LouC, you can also schedule a task similar to the following example:

https://bigfix.me/fixlet/details/27014

Another idea is to run BESAirgapTool.exe to accomplish this. It gives you the ability to filter and download only the files that meet your criteria. Take a look at the documentation at this link for more details:

Thanks, Gus.

For an air-gapped environment, I’m using the following workflow:

  1. Frequently run the Airgap Tool to produce the AirgapResponse and binary files
  2. Frequently run a root server script that finds relevant content missing its binaries and outputs a text file with the URLs
  3. Move the URLs file to an internet-facing machine and run a script that downloads the URLs (a minimal sketch follows below)
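For step 3, a minimal sketch of the download script, assuming a plain text file with one URL per line and that the last path segment of each URL is a usable, unique file name:

# Minimal sketch for step 3: read one URL per line and download each file.
# File names are taken from the last path segment of each URL, which assumes
# they are unique; adjust the naming scheme if your URLs collide.
import os
import urllib.request
from urllib.parse import urlparse

URL_FILE = "urls.txt"            # text file produced in step 2
OUT_DIR = r"C:\AirgapDownloads"  # staging folder to carry back across the gap

os.makedirs(OUT_DIR, exist_ok=True)

with open(URL_FILE) as fh:
    for line in fh:
        url = line.strip()
        if not url:
            continue
        name = os.path.basename(urlparse(url).path) or "unnamed.bin"
        dest = os.path.join(OUT_DIR, name)
        if os.path.exists(dest):
            continue  # already downloaded on a previous run
        print(f"Downloading {url}")
        try:
            urllib.request.urlretrieve(url, dest)
        except Exception as exc:
            print(f"  failed: {exc}")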