This is a fresh attempt at the age-old problem of distributing large, dynamically generated content across a large WAN environment. With a normal prefetch or download, clients get the correct version the first time, but subsequent runs of the policy action serve up the stale content cached from that first run.
If I ‘download now’ against the URL from the clients, it technically works, but it clogs the WAN connection because multiple large downloads compete for the link. Sometimes it simply times out and some clients never get the update.
So I devised a two-step process in an attempt to avoid the WAN clogging problem.
First is a policy action on the Root Server that runs ‘download now’ against the URL and places the latest file into a sub-folder under Uploads. The second part of that job creates a manifest file, dynamically inserting the hash and size values for the current dynamic file.
download now https://server/app/file.zip
delete "path\Uploads\SIT\file.zip"
copy __Download\file.zip "path\Uploads\SIT\file.zip"
delete "path\Uploads\SIT\manifest.txt"
delete __appendfile
appendfile name=file.zip sha1={sha1 of file "path\Uploads\SIT\file.zip"} size={size of file "path\Uploads\SIT\file.zip"} url=https://server/app/file.zip
copy __appendfile "path\Uploads\SIT\manifest.txt"
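For clarity, here is a sketch in Python of what that appendfile step produces: a one-line manifest in the name=/sha1=/size=/url= form that the client-side prefetch block consumes. The paths and URL are placeholders matching the example above, not real values.

```python
import hashlib
import os
import tempfile

def manifest_line(local_path, url):
    """Build the one-line manifest entry (name=/sha1=/size=/url= form)
    that the client-side prefetch block reads as 'line 1'."""
    with open(local_path, "rb") as f:
        digest = hashlib.sha1(f.read()).hexdigest()
    size = os.path.getsize(local_path)
    return f"name={os.path.basename(local_path)} sha1={digest} size={size} url={url}"

# Demo with a throwaway file standing in for the downloaded file.zip
with tempfile.TemporaryDirectory() as d:
    zip_path = os.path.join(d, "file.zip")
    with open(zip_path, "wb") as f:
        f.write(b"dynamic payload")
    line = manifest_line(zip_path, "https://server/app/file.zip")
    print(line)
```

Because the sha1 and size are recomputed on every run, the manifest line changes whenever the dynamic file changes, which is what lets the client-side prefetch pick up a fresh payload.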
The clients run a policy action as follows:
begin prefetch block
add nohash prefetch item url=http://server:52311/Uploads/SIT/manifest.txt
collect prefetch items
add prefetch item {line 1 of download file "manifest.txt"}
end prefetch block
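The substitution in the last prefetch statement relies on line 1 of the manifest being a complete key=value item specification. A small Python sketch of that parsing (a hypothetical helper, just to show the field layout the client depends on; the sha1 shown is that of an empty file):

```python
def parse_prefetch_item(line):
    """Split a 'name=... sha1=... size=... url=...' manifest line into fields,
    mirroring what the prefetch statement substitutes from line 1."""
    fields = {}
    for token in line.split():
        key, _, value = token.partition("=")
        fields[key] = value
    return fields

item = parse_prefetch_item(
    "name=file.zip sha1=da39a3ee5e6b4b0d3255bfef95601890afd80709 "
    "size=0 url=https://server/app/file.zip"
)
```

If any of the four fields is missing or malformed, the substituted prefetch item is invalid and the action will fail, so the server-side job needs to emit exactly this shape.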
The first time this runs, the client gets the expected file. On subsequent runs of the policy action, the client appears to reuse what it cached on the first run instead of fetching the fresh file, which is not the expected result.
In the policy action on the root server, do I need to clear the prior file under ‘MirrorServer’ in addition to where I’m already clearing it under ‘Uploads’ in order for it to get the fresh file each time?
Open to any other ideas to improve this so it can be truly dynamic.