Dynamic Content Download Policy Action

This is a fresh attempt at the age-old issue of distributing large, dynamically generated content successfully in a large WAN environment. Performing a normal prefetch or download gets clients the correct version the first time, but subsequent runs of the policy action deliver the old cached content from the first run.

If I run ‘download now’ against the URL from the clients, it technically works, but it clogs the WAN connection with multiple large downloads competing for bandwidth. Sometimes it just times out and some clients never get the update.

So I devised a two-step process in an attempt to avoid the WAN clogging problem.

The first step is a policy action on the Root Server that runs ‘download now’ against the URL and places the latest file into a sub-folder under Uploads. The second part of that job creates a manifest file, dynamically inserting the hash and size values for the current dynamic file.

download now https://server/app/file.zip
// replace the previous copy of the payload under Uploads
delete "path\Uploads\SIT\file.zip"
copy __Download\file.zip "path\Uploads\SIT\file.zip"
// rebuild the manifest with the hash and size of the file just downloaded
delete "path\Uploads\SIT\manifest.txt"
delete __appendfile
appendfile name=file.zip sha1={sha1 of file "path\Uploads\SIT\file.zip"} size={size of file "path\Uploads\SIT\file.zip"} url=https://server/app/file.zip
copy __appendfile "path\Uploads\SIT\manifest.txt"
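
For reference, the resulting manifest.txt contains a single line in the prefetch-item format; with placeholder hash and size values it looks like this:

name=file.zip sha1=0123456789abcdef0123456789abcdef01234567 size=123456789 url=https://server/app/file.zip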

The clients run a policy action as follows:

begin prefetch block
add nohash prefetch item url=http://server:52311/Uploads/sit/manifest.txt
collect prefetch items
add prefetch item {line 1 of download file "manifest.txt"}
end prefetch block

The first time this runs, the client gets the expected file. On subsequent runs of the policy action, it appears to use what it first cached instead of the fresh file. This isn’t the expected result.

In the policy action on the root server, do I need to clear the prior file under ‘MirrorServer’ in addition to where I’m already clearing it under ‘Uploads’ in order for it to get the fresh file each time?

Open to any other ideas to improve this so it can be truly dynamic.

The first time the client runs the ‘add nohash prefetch item’, the file is cached all through the relay chain. On the next attempt, the Relays and Root Server will refer to that previous download rather than downloading a new copy of the file. Replacing it in the Uploads folder won’t change that state.

There are some serious implications (both technical and security-related) around dynamic downloads. (As an aside, ‘dynamic downloads’ are any downloads that use a relevance substitution. In your client action, ‘add nohash prefetch item’ is not a dynamic download, while ‘add prefetch item {line 1 of …}’ is.)

With the static download, the root server and Relay can perform the ‘add nohash prefetch item’ statement independently of the client. ‘manifest.txt’ ends up saved as downloads/[actionid]/0 on the Root and Relay; when the client actually makes the download request, it asks the Relay for ‘downloads/[actionid]/0’. The Relay, having previously cached the file, won’t go check for an update to it. Neither will the Root.

For the Dynamic Download, the client sends a DownloadRequest message to the Relay that contains the URL, size, sha1, and sha256 of the file. The client doesn’t just ask for ‘downloads/[actionid]/1’; instead it sends the Relay an instruction to go download a file with those specifics. What makes the download Dynamic is that the Client tells the Relay what to go collect.
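
Concretely, once the substitution in your client action expands, the dynamic line the client evaluates ends up looking something like this (hash and size are placeholders carried over from the manifest example above), and it’s these specifics the client hands to the Relay in the DownloadRequest:

add prefetch item name=file.zip sha1=0123456789abcdef0123456789abcdef01234567 size=123456789 url=https://server/app/file.zip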

So, the approach you’re taking is very close, but it’s going to take a couple more steps, I think.

One method would be for your clients to perform a ‘download now’ to get the manifest.txt file and then parse it to drive the download and verification. That’s fairly simple, but the disadvantage is that every client has to open a direct download to the root server, which could have a performance impact (and, for security reasons, I usually block clients from talking directly to the root server anyway).
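
A minimal sketch of that first method, realized here with plain ‘download now’ plus a ‘continue if’ verification rather than a prefetch (URLs and the manifest field order are carried over from the earlier posts; adjust to your environment):

// fetch the manifest at execution time so the relays never serve a stale cached copy
download now http://server:52311/Uploads/sit/manifest.txt
// pull the expected hash out of the manifest line (format: name=... sha1=... size=... url=...)
parameter "expectedSha1" = "{following text of first "sha1=" of preceding text of first " size=" of line 1 of download file "manifest.txt"}"
// fetch the payload directly and refuse to proceed unless it matches the manifest
download now https://server/app/file.zip
continue if {sha1 of download file "file.zip" = parameter "expectedSha1"}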

The other method, which I think is preferable, is to set up a scheduled task, run on one machine (maybe on the root server itself?), to collect the file from https://server/app/file.zip, generate the manifest.txt file from that, and then attach the manifest.txt as a Site File in a Custom Site on the root server, with the ‘Send to Clients’ option.

You can find a description of posting a Site File at https://developer.bigfix.com/rest-api/api/site.html. Your script will need to authenticate to the root server using an operator account.

From the client, instead of downloading the manifest file itself, you’d use the copy of the manifest that’s already been gathered as a Site File; do something like:

add prefetch item {line 1 of file "manifest.txt" of client folders of sites whose (name of it = "My-Custom-Site-Name")}
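
Since dynamic downloads have to live inside a prefetch block, the full client-side pattern would look something like this (site name carried over from above):

begin prefetch block
add prefetch item {line 1 of file "manifest.txt" of client folders of sites whose (name of it = "My-Custom-Site-Name")}
end prefetch block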


Thanks @JasonWalker! I think my issue is that the manifest is getting cached. I’ll try your suggestions.

I don’t know if the following approach fits your requirements, but I had a similar case where we wanted to push the most recent Windows Defender MPAM to endpoints that were over x days out of date and failing to update via the built-in updater processes. What I did was create a fixlet that runs on the main server: it downloads the MPAM, gets the file SHA, size, etc., and then updates a fixlet that deploys the file to relevant endpoints. This avoids the caching issues each time the file changes. I’m basically using the REST API (via IEM.exe running on the main server) to manage an existing fixlet, then deploying it once it’s been updated. I’d be happy to send you the BES file to see if you can adapt anything for your use case.

I’ve revised my approach and have been successfully testing the following setup. Thought I’d share for everyone’s benefit.

Instead of running the policy action on my root server, I’ve adapted it to run on each local site relay (we have more than 1k relays in our environment).

The idea was to leverage the relay without the content being cached on that relay. A bit of a paradox, I know. I found that creating a folder structure under the wwwrootbes folder of the relay works perfectly. The policy action targeting the client on the relay grabs the file using ‘download now’ (instead of prefetch) and drops the new dynamic file into a folder structure that looks like this: …path\wwwrootbes\app\env (see the sketch below).
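
A minimal sketch of that relay-side policy action ("…path" stands for the relay’s install folder, as above; the source URL is carried over from earlier in the thread):

// pull the fresh build at execution time
download now https://server/app/file.zip
// stage it under the relay's web root, outside the normal download cache
delete "…path\wwwrootbes\app\env\file.zip"
copy __Download\file.zip "…path\wwwrootbes\app\env\file.zip"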

The clients in that location have a policy action that runs a half hour later and reaches out to the special folder on the relay, expressly avoiding the normal cached content:

download now http://{substitute-location-specific-relay-name}:52311/app/env/file.zip

This is working well so far and meets the dynamic requirements from our business partners.

Next step is to reach out to the team that runs the app that generates file.zip to see if they can generate a manifest file on the system of origin. If so, I can download it alongside the dynamic file and check the hash and size on the clients before they use the contents (see the sketch below).
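
For that verification step, a hedged sketch of what the client action could look like, assuming the origin-generated manifest carries the hash in a sha256= field at the end of its first line (the field name and layout are assumptions until the app team confirms the format):

// fetch the payload and its manifest from the location-specific relay
download now http://{substitute-location-specific-relay-name}:52311/app/env/file.zip
download now http://{substitute-location-specific-relay-name}:52311/app/env/manifest.txt
// refuse to use the payload unless it matches the manifest hash
parameter "expectedSha256" = "{following text of first "sha256=" of line 1 of download file "manifest.txt"}"
continue if {sha256 of download file "file.zip" = parameter "expectedSha256"}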


Thought I’d share a status update on this project. The daily dynamic downloads to the location-specific relays have been working well.

I was able to get the owner of the source system to generate a manifest file with metadata about the current build, including the size and SHA2 of the code bundle. I download this manifest to the relays along with the code package, and the clients check the hash against the manifest before they execute the bundle. Best of both worlds: trusted and dynamic.

A side benefit is that the metadata in the manifest can also drive analysis results and reporting.
