For the “Updates for Windows Applications Extended” external site, I’m seeing frequent broken Fixlets because of how quickly vendors change their downloads. Before patching I will typically pre-cache using the wizard. If I forget, a broken Fixlet will hold up a baseline: it just sits at “pending download” until the end of the action, then reports back that the action was not relevant for that system. I want to automate pre-caching of all the Fixlets in this site. Is this the correct approach…
- session relevance to return all download URLs for all Fixlets in the site:
<?relevance
unique values of parenthesized parts 2 of matches
(case insensitive regex "^\s*(download|prefetch|download now|download now as|add prefetch item)\s+.*((((ht|f)tp[s]?)://([[:alpha:]\-\.]+(:[0-9]+)?))[[:space:]/]([^[:space:]]+))")
of scripts of actions whose (script type of it = "application/x-Fixlet-Windows-Shell")
of fixlets of bes sites whose (name of it = "Updates for Windows Applications Extended")
?>
- then run the BESDownloadCacher utility and have it loop through all of the URLs returned by the relevance query:
BESDownloadCacher.exe -u https://www.vendor1.com/download/Setup_app1.exe -x c:\PATH-TO-BIGFIX-SHA1-FOLDER
BESDownloadCacher.exe -u https://www.vendor2.com/download/Setup_app2.exe -x c:\PATH-TO-BIGFIX-SHA1-FOLDER
BESDownloadCacher.exe -u https://www.vendor3.com/download/Setup_app3.exe -x c:\PATH-TO-BIGFIX-SHA1-FOLDER
etc…
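For what it’s worth, here is a minimal sketch of how that loop might be scripted, assuming the relevance results have already been exported one URL per line to a text file (the `urls.txt` name, the cacher path, and the sha1 folder path below are placeholders for my environment, not anything BigFix-specific):

```python
import subprocess
from pathlib import Path

# Placeholder paths -- adjust for your deployment.
CACHER = r"C:\Program Files (x86)\BigFix Enterprise\BES Server\BESDownloadCacher.exe"
SHA1_DIR = r"C:\PATH-TO-BIGFIX-SHA1-FOLDER"
URL_FILE = "urls.txt"  # one URL per line, exported from the session relevance query


def build_commands(urls, cacher=CACHER, sha1_dir=SHA1_DIR):
    """Build one BESDownloadCacher invocation per unique, non-empty URL."""
    seen = set()
    cmds = []
    for url in urls:
        url = url.strip()
        if not url or url in seen:
            continue
        seen.add(url)
        cmds.append([cacher, "-u", url, "-x", sha1_dir])
    return cmds


if __name__ == "__main__":
    url_path = Path(URL_FILE)
    if url_path.exists():
        for cmd in build_commands(url_path.read_text().splitlines()):
            # check=False so one failed download doesn't abort the whole run
            subprocess.run(cmd, check=False)
```

Deduplicating first avoids hammering a vendor URL that appears in multiple Fixlets, and a failed download just moves on to the next one rather than stopping the batch.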
Or is there a better approach?